Nintendo Wii Matches the Power of the Original Xbox


According to a FIFA producer, the Nintendo Wii matches up to the original Xbox as far as power goes. Of course, the Wii is an easy system to develop for, but matching the power of a last-generation system isn't exactly a big deal in my opinion. People need to keep in mind that the Wii is great simply because it is original. That originality could wear off one day, but it certainly doesn't look like it's slowing down any time soon. I honestly wish people would stop comparing the Wii's actual system details to Sony's and Microsoft's systems. Honestly, the Wii cannot compete on a hardware level and Nintendo knows it. The point of the article is that the Wii is pretty much what the original Xbox was. Is that really the point we need to care about? Probably not.

FIFA producer Tim Tschirner has said that he believes the Nintendo Wii matches up to the original Xbox in terms of power.

Tschirner's comments came in an exclusive interview with GamesIndustry.biz's sister site, Eurogamer.net.

“It’s about as powerful as the original Xbox,” he said. “The video hardware unfortunately is not as powerful. There’s just a couple of key things that you can do on Xbox like shaders which you just cannot do on the Wii… Overall though it’s pretty much what the original Xbox was.”


698 Comments on Nintendo Wii Matches the Power of the Original Xbox

cubeboy101

On June 16, 2007 at 5:29 pm

that is the BIGGEST LOAD OF BS on the net. in fact all 3rd party spokesmen are talking rubbish. that statement is an EA MARKETING STATEMENT

market the control experience on wii /// market hd on the competition. it's COMMON SENSE what he is up to, WORK IT OUT…

xbox in-game polygon count: 12 million at 30 frames a second

gamecube (not wii, gc): in-game poly count NEAR 20 MILLION at 60 frames a second, not 30. hmmm

xbox fsb (system bus) speed 133mhz, no hardware-based data compression = 800mb bandwidth

gamecube bus 162mhz plus 4:1 hardware-based data compression = 5.2gb bandwidth. can you see a MASSIVE DIFFERENCE

wii bus 243mhz plus the same 4:1 compression in hardware = 8gb bandwidth

so if the bus data rate and in-game polygons of gamecube outperform xbox, PLEASE EXPLAIN YOUR wii COMMENTS, BIASED EA MAN…
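For readers trying to follow the numbers in the comment above: the bandwidth figures are just bus clock times a 64-bit bus width times the claimed compression ratio. A minimal sketch of that arithmetic, with the clock speeds and the 4:1 ratio taken from the comment itself (the 64-bit width and the compression ratio are the commenter's assumptions, not verified specs):

    def claimed_bandwidth_gb(bus_mhz, bus_bits=64, compression=1.0):
        # Raw bus bandwidth in bytes/s = clock (Hz) * bus width (bytes),
        # then scaled by the compression ratio asserted in the comment.
        raw_bytes_per_s = bus_mhz * 1_000_000 * (bus_bits / 8)
        return raw_bytes_per_s * compression / 1e9

    # Figures as the commenter states them (the 4:1 ratio is his claim, not a spec):
    print(claimed_bandwidth_gb(133))                 # Xbox FSB, no compression: ~1.06 GB/s
    print(claimed_bandwidth_gb(162, compression=4))  # GameCube claim: ~5.2 GB/s
    print(claimed_bandwidth_gb(243, compression=4))  # Wii claim: ~7.8 GB/s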

WII CPU: A CUSTOM, OPTIMIZED PowerPC 750 MICRO EMBEDDED CPU USING COPPER WIRE AND SILICON-ON-INSULATOR TECH, RISC BASED

XBOX CPU: AN OFF-THE-SHELF INTEL CELERON. AGAIN, NO CONTEST

THERE IS NO WAY ON GOD'S GREEN EARTH THE XBOX CAN COMPETE WITH WII. IT'S TOTALLY STUPID TO EVEN THINK IT

THIS GUY IS TAPPING INTO IGNORANCE OF SPECS AND HARDWARE. HE KNOWS IGNORANT TEENAGERS ARE CLOCK-SPEED OBSESSED AND IS FEEDING MORE LIES TO THEM, AS ARE OTHER 3RD PARTIES JEALOUS OF NINTENDO'S POWER IN THE INDUSTRY

HOW CAN A STANDARD SPREADSHEET-GRADE OFF-THE-SHELF CELERON CPU COMPARE TO A CUSTOM, GAME-CENTRIC, POWERPC RISC, SILICON-ON-INSULATOR, 2006-SPEC MICROPROCESSOR UNIT (MPU)

HOLLYWOOD GPU IS AN EXTENSION OF FLIPPER. FLIPPER HAD A RICHER LIST OF HARDWARE-BASED ABILITIES THAN XGPU AND WAS VASTLY MORE EFFICIENT

HIS COMMENTS DON'T HOLD WATER AT ALL

WII GPU CACHE: 3MB OF EDRAM/1T-SRAM. XGPU HAD A TINY CACHE OF 256K

FLIPPER (GAMECUBE GPU) VS XGPU

FLIPPER: 8 TEXTURE LAYERS, 16 STAGES OF EFFECTS

XGPU: 4 TEXTURE LAYERS, 4 STAGES OF EFFECTS. ERM, THAT'S GAMECUBE, NEVER MIND WII

WII PATENTS SHOW:
DEFORMATION TEXTURING TRICK
16-STAGE SHADER TREE
EFFICIENT 3D RENDERING
ETC ETC

WII IS TIGHTLY INTEGRATED, CLOCK BALANCED, AND HAS BLINDINGLY FAST RAM AND A BLINDINGLY FAST DISC DRIVE SETUP, NOT SLOW RAM AND A SLOW DVD LIKE XBOX

THE FLIPPER GPU OF GAMECUBE HAD OVER 2X THE BANDWIDTH OF XGPU. THE HOLLYWOOD GPU IN WII IS PROBABLY 4-5 TIMES XGPU

LET'S LOOK AT GPU DATA FEED

XGPU: 256K GRAPHICS CACHE AND NO ONBOARD BUFFERING RAM OR EDRAM, A TINY 800MB CPU-TO-GPU BUS, AND SLOW EXTERNAL RAM

FLIPPER GPU (GAMECUBE): A MASSIVE 3MB GPU CACHE/BUFFER WITH 20GB BANDWIDTH, MASSIVE COMPARED TO XGPU, AND A 5.2GB CPU-TO-GPU FEED. AS YOU CAN CLEARLY SEE, GAMECUBE KICKS ASS HERE

SO HOW ON EARTH IS WII LESS POWERFUL THAN XBOX

XBOX PEAK IN-GAME POLYGONS: 12 MILLION AT 30 FRAMES A SECOND

ROGUE SQUADRON 2 BEATS THAT NUMBER

ROGUE SQUADRON 3 SLAPS THAT NUMBER

RESIDENT EVIL 4 DESTROYS THAT NUMBER

BEST LOOKING GAME LAST GEN, BAR NONE: RES EVIL 4, GC VERSION

BEST LOOKING CEL SHADING BY FAR: WIND WAKER

METROID PRIME 2: A PERFECT 60 FRAMES A SECOND, FAULTLESS GRAPHICS, ZERO BUGS, ZERO LOAD TIMES, NO PATCHES REQUIRED

HALO 2: LONG LOAD TIMES, BUGGY, CRASHY, POP-UP, PATCH REQUIRED, CHOPPY FRAME RATE

ARE PEOPLE MISSING SOMETHING HERE

GAMECUBE ALSO OUT-LOADED THE XBOX. GAMECUBE HIT THE BEST GRAPHICS LAST GEN AND THE FASTEST STREAMING AND LOADING FROM DISC

SO PLEASE EXPLAIN HOW XBOX BEATS WII

Tom

On July 9, 2007 at 2:53 am

lmao what the hell.. the GameCube was a piece of compared to the Xbox. A slightly faster PS2 at best.

BranH

On September 11, 2007 at 4:48 am

Cubeboy, that's just wrong. You're reading all your specs out of context.
For example, Gamecube had its on-chip RAM, but why? It's because old-fashioned multi-texturing requires multiple reads from texture memory, AND multiple reads from and writes to frame buffer memory, which requires lots of bandwidth to both to perform well.
Xbox had programmable shaders: multiple textures could be read from texture memory, combined by the shader, and applied in one texturing pass.
And you mention data compression? Nvidia GPUs have long used extensive data compression that is compressed and decompressed on the fly.
S3TC = DXT1-5 for texture compression, of which both ATI and Nvidia have their own tweaked versions.

You're comparing apples to oranges. As is the case throughout your post.

There are huge differences in how things are done on both platforms. That includes texture layering (at which Xbox was twice as fast per clock cycle, as it has twice the texturing units), and the processing of polygons.
Xbox had two vertex shader ALUs. Gamecube has nothing of the sort, aside from a plain T&L unit.
And there is no bump mapping in Metroid games. Gamecube couldn't handle it.
Nor is there much (if any) in Prime 3 for the Wii.
It was everywhere in Halo. Dot3, etc.

But still, I would say that the Wii's graphics capability is underrated.
And I agree, the CPU should be far more efficient, as well as more powerful.
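A rough way to see BranH's point about why recirculating multi-texturing needs so much on-chip bandwidth is to count the frame-buffer traffic per pixel under each model. This is only an illustrative sketch; the byte sizes and the one-layer-versus-four-layers-per-pass split are assumptions chosen to show the shape of the argument, not measured hardware figures.

    def framebuffer_bytes_per_pixel(layers, layers_per_pass, color_bytes=4, z_bytes=4):
        # Each pass through the pixel pipeline does a read-modify-write of the
        # colour value plus a Z read/write.  Hardware that recirculates one layer
        # per pass pays that cost for every layer; a shader that combines several
        # texture fetches in a single pass pays it once per group of layers.
        passes = -(-layers // layers_per_pass)          # ceiling division
        return passes * (2 * color_bytes + 2 * z_bytes)

    # Three texture layers, illustrative byte sizes (assumptions, not specs):
    print(framebuffer_bytes_per_pixel(3, layers_per_pass=1))  # recirculating: 48 bytes/pixel
    print(framebuffer_bytes_per_pixel(3, layers_per_pass=4))  # single-pass combine: 16 bytes/pixel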

BranH

On September 11, 2007 at 4:58 am

Also, streaming data from a hard drive is much faster than off disc. Not to mention Xbox had more RAM (and yes, the same types of compression technology).
I could spend time going through everything piece by piece, but I think I covered enough.
BTW, not putting your post down or anything. Sorry if it sounded like that.
(I just happen to have dealt with them before.)
And I agree, many developers are making excuses. Gamecube and Wii are well-designed systems with few bottlenecks, and the Wii is capable of better than most (3rd party) games show.

BranH

On September 11, 2007 at 5:02 am

OK, one more: transfer speed is faster from Xbox's DVD drive than it is from Gamecube's drive.

Norbit

On September 11, 2007 at 10:49 am

“It’s about as powerful as the original Xbox,” he said. “The video hardware unfortunately is not as powerful.”

So in what way is it as powerful then?

I really can't see a long life for the Wii. It's selling really well, but from my experience most of that is purely down to the novelty value. I know 3 people who have had Wiis for over 6 months and none of them use them any more. Initially it was fun going round there after a few drinks, but that soon faded. One of my mates is a hardcore Ninty fan and even he's disappointed with it, because there is absolutely no progression at all from their last console. No better graphics and no more depth to games.

To me Nintendo releasing the Wii is like Sony simply releasing the PS2 in a different case with a few upgrades and a motion controller and then calling it the PS3 and selling it really cheap. It would probably have sold faster than the real PS3 in the short term but it would have no long term legs.

To me the Wii isn’t selling because it has a really strong console or really strong games but because it has a really strong peripheral. Without the Wiimote most of the software is shallow rubbish and the console is just a tuned up Gamecube. If people get bored of the Wiimote or just aren’t interested in it a PS2 is a far better purchase.

Peter

On October 19, 2007 at 9:28 am

The Wii isn't selling? It's sold over 11 million worldwide, catching the Xbox 360, idiot

CUBEBOY2008

On December 15, 2007 at 5:09 pm

the compression system, the ram and the disc data feed on gamecube destroyed xbox. that's a fact. even with its hard drive, no xbox game ever ran load-screen free the way gamecube exclusives did. COMPARING OFF-THE-SHELF PARTS TO FINELY DESIGNED CUSTOM PARTS MAKES U A PC-BRAINWASHED FOOKWEED

BranH

On December 17, 2007 at 5:17 pm

“Destroys Xbox”

It really did nothing of the sort. The fact that Gamecube was a customized design is what allowed it to keep pace with Xbox, despite it being inferior in many areas.
It's like comparing a decent, stock high-performance car (for its time) to a souped-up Toyota Corolla. It's old technology, retrofitted with new technology to make up for it.

And they had load screens in Nintendo games. Mario was loading as you entered the level, and while you selected the Shine mission you wanted. The level shared a large amount of texture, mesh, and layout data; the only thing it needed to load dynamically after entering the portal was what related to the Shine mission you chose. Metroid Prime just hid loading by making all the doors open with delays after shooting them, etc.
Plus, Xbox had twice the RAM to load into compared to Gamecube.
GC's disc transfer speed was somewhere between 2-3 megabytes per second, depending on where it was being read. Xbox should have been between 2.5 and 6, but again, this would be an area Gamecube might have been able to keep up on, because I would imagine the seek time might be better on a drive Nintendo specifically chose, rather than a drive Xbox was getting a good deal on at the time.

In the end, it worked out well, and they competed fine in graphics. (last gen)
Wii may very well be capable of 2x Xbox in some cases, but it “still” doesn’t destroy it in performance.

wiiboy101gezza

On December 19, 2007 at 9:40 pm

inferior? zzzzz... wake me up when you stop fanboying bill gates

xbox: off-the-shelf celeron made using aluminium connectors

gekko: customized gaming "risc" cpu made using "superior" copper-wire connectors

xbox celeron: high-heat, non-embedded cisc cpu with a 128k cache

gekko: tiny, embedded, cool-running design, faster than the non-shrunk powerpc chip, with a 256k cache

xbox: 128k cache, no compression

gekko: 256k cache, 4:1 compression with "decompression in realtime". that's a virtual cache of 1mb and a virtual bandwidth of 4 times the standard 256k cache

xbox: 133mhz fsb, no compression, 800mb to 1gb bandwidth

gamecube's gekko fsb: 163mhz (already faster than xbox) plus 4:1 data compression = over 5gb bandwidth

xbox texture read from main ram: about 3.2gb. the 6.4 was fake. remember, the main ram in xbox also served the cpu, sound, z-buffers and frame buffers, so you're left with around 3.2 for graphics

gamecube's texture read: 10.4gb straight off the 1mb 1t-sram texture cache. realtime decompression of textures at 6:1 = a virtual texture read of 6 times 10.4gb. clearly a 60gb+ texture read kills a 3.2gb texture read

xbox ram (dram) = 70 nanosecond read speed

gamecube 1t-sram = 10 nanoseconds or faster read speed, up to 10 times faster than xbox.
1t-sram matches l3 and l2 cache for brute speed

the sdram in xbox was a slow dinosaur, fact. sony's rambus ram was even slower

xbox in-game texture layers: 4

gamecube in-game texture layers: 8

xbox realtime lighting: 8x4 on the gpu

gamecube's realtime lighting: 8x8 on the gpu, plus custom lighting enhancements built into the gekko cpu (check out gamecube 101 at ign) (check out gamecube vs ps2 at ign) (check out factor 5 on gamecube at ign)

gamecube had faked tile rendering built in, i.e. it could cull its texels and pixels and polygons from areas unseen by the eye. this inflates your fillrate

xbox had no such ability

gamecube compressed its display-list geometry and almost all data at 4:1 and decompressed it in real time

xbox had no such ability

the fact remains: wii will do in-game graphics 2.5 times an xbox at a solid 60 frames and without load times, AT THE SAME CLOCK SPEEDS, proving gamecube was superior

xbox's highest-ever in-game polygon count in its life: 12 million at 30 frames. INDUSTRY FACT

GAMECUBE HIT 18 MILLION PLUS IN ROGUE SQUADRON 3 AT 60 FRAMES A SECOND. INDUSTRY FACT

BEST CEL SHADING LAST GEN: WIND WAKER

BEST VISUALS OVERALL: RES EVIL 4, GC VERSION

HALO 2 RAN AT 30 FRAMES AND DROPPED TO 12 FRAMES A SECOND IN-GAME, AND HAD DREADFUL LOAD TIMES, BUGS, GLITCHES ETC

METROID PRIME 2 RAN AT 60 FRAMES, NOT 30, HAD NO LOAD TIMES AND LOOKED GORGEOUS

GAMECUBE: 16-STAGE REAL-TIME TEXTURE BLENDING AND 8 TEXTURE LAYERS IN-GAME

XBOX HAD 8 STAGES OF REALTIME TEXTURE BLENDING, 4 TEXTURE LAYERS AND NO VIRTUAL TEXTURING AT ALL

GAMECUBE SUCKED TEXTURES INTO ITS HUGE 1MB TEXTURE CACHE COMPRESSED 6:1 AND PROCESSED 6MB OF TEXTURES AT A TIME WITH REAL-TIME TEXTURE DECOMPRESSION

XBOX HAD NO REAL-TIME TEXTURE DECOMPRESSION. YES, IT COULD COMPRESS AT 6:1, BUT IT HAD TO DECOMPRESS BEFORE THE GPU SUCKED TEXTURES IN, AND XBOX HAD A TINY 256K TEXTURE CACHE

256K VS 6MB. HMMM, NO CONTEST

MARIO KART GP2 IN ARCADES IS A TRIFORCE ARCADE BOARD GAME. IT'S BLOODY STUNNING LOOKING AND RUNS AT 60 FRAMES. THAT'S IMPOSSIBLE ON AN XBOX. REMINDER: TRIFORCE IS JUST A GAMECUBE WITH DOUBLED 1T-SRAM

GAMECUBE WAS HIGHLY CUSTOM, XBOX WAS OFF-THE-SHELF LOW-SPEC PC PARTS. NO FOOKING CONTEST

wiiboy101gezza

On December 19, 2007 at 9:52 pm

MARIO GALAXY: OVER 30 MILLION POLYGONS AT 60 FRAMES, TRUE WIDESCREEN, NO LOAD TIMES

XBOX CAN'T DO 15 MILLION POLYGONS AT 30 FRAMES, LET ALONE 60. SO I REPEAT YET AGAIN: 3RD PARTIES LIE TO JUSTIFY POOR GRAPHICS THROUGH PS2 PORTING. I'M CORRECT, AS I WAS BROUGHT UP NOT TO GO ROUND BULL TING FOLKS

YOU'RE XBOX FANBOYS TALKING CRAP

PLEASE EXPLAIN HOW ALL TRUE WII LOOKERS DESTROY XBOX VISUALS:

METROID PRIME 3
MARIO GALAXY
MONSTER HUNTER
MARIO KART WII

WII HAS A DIE-EMBEDDED 24MB VIRTUAL LEVEL-3 CACHE SHARED BY BOTH PROCESSORS, PUSHING DATA AT THEM AT AROUND 6 NANOSECONDS. THAT'S LIKE 10 TIMES FASTER THAN PS3'S SO-CALLED FAST RAM

WII IS SIMPLY A DATA-STREAMING, DATA-DECOMPRESSION TURBO MONSTER, AND ITS FILLRATE IS OVERKILL AT 480P

3RD PARTIES TALKING DOWN THE GRAPHICS ARE DOING IT ON PURPOSE. NINTENDO HAS SLAPPED THE WHOLE INDUSTRY AND MADE IT ITS OWN. WII AND DS ARE THE FASTEST SELLING CONSOLES IN HISTORY

3RD PARTIES ARE DELIBERATELY DOWNPLAYING IT OUT OF JEALOUSY AND FEAR OF A BETTER GAMING COMPANY THAT'S JUST DESTROYING ALL COMPETITION

THE WAY THEY'RE TRYING TO PUT THE BRAKES ON IS BY LYING THAT THE WII CAN'T DO GRAPHICS. IT'S A DAMN CONSPIRACY

PS2 ENGINES PORTED TO WII AIN'T FOOKING WII GRAPHICS AND YOU FANZOMBIES DAMN WELL KNOW IT, SO STOP PRETENDING OTHERWISE

BranH

On December 20, 2007 at 12:39 am

Dude, no offense, but you spent a lot of time going over Gamecube whitepapers, spent no time looking at modern GPUs, and assume that many of the spec points you came across are Gamecube exclusive.
And you're listing the bandwidth of texture memory on a system (Gamecube) that recirculates its pixels from texture RAM to frame buffer and back to RAM again, once for every time it wants to add another texture layer. It really, really needed on-chip texture RAM and frame-buffer RAM with high bandwidth to do that. (It just pretended the on-chip texture cache was main RAM and decompressed from there, rather than doing all that reading and writing to main RAM.)

And you're comparing that to Xbox's GPU, with proper pixel shader pipelines. (Like I said, apples to oranges for bandwidth, blending, layering, etc.)

Plus, for textures: one texture layer can be done per pixel on Gamecube at full speed. For two texture layers, you had to read the pixel back and recirculate it through the pipeline again to add another, thus cutting its fill rate. If you were to do 8, you cut fill rate by 8. (Almost no Gamecube games are capable of anywhere near 8 of anything, btw.)
Xbox would only cut fill rate by half for 4 texture layers, before having to write to RAM and read it back in for a second render pass.

As to "Visibility Subsystem: Z-Occlusion Culling":
you seem to suggest that Xbox didn't have a Z-buffer with Z-occlusion culling,
when of course it did, with 4x lossless compression and client-side Z-occlusion query, like is found in Nvidia and ATI GPUs even before Xbox.

And as to polygons, the Xbox had the same type of GPU-based vertex processor as Gamecube, except it had two of them, and they were programmable. You don't just transform polygons, you run other instructions. They're transformation, lighting, and vertex shader instructions.
Most of the vertex shader work done on Xbox would have to be emulated on the Gamecube's CPU. Plain vanilla T&L wouldn't cut it.
Overall, the Wii is likely more powerful than Xbox, but it does nothing to "destroy it" at anything.

Sorry, but that is the truth.

BranH

On December 20, 2007 at 3:26 am

A few random quotes from developers: First few, are from MotoGP devs, the last is from Factor5.

Multitexturing

“One texture per polygon just isn’t enough any more. The PS2 has fill-rate to spare, and the dual context rendering architecture can draw two copies of every triangle for little more cost than one, so you are wasting the hardware if you have only a single texture stretched over your geometry. Xbox supports four layers per render-pass, and Gamecube eight. (although in practice you can only afford to use two or three while maintaining a good framerate).”

“Our PS2 projects are using two texture layers, with the gouraud alpha controlling a cross-fade between them. On Gamecube we usually add a third multiply mode detail layer, while on Xbox the flexibility of pixel shaders lets the artists choose any possible combine modes for their three layers, with the fourth generally reserved for the programmers to do dynamic lighting or reflection effects.”

“Figure 24: Mixing three texture layers.
For performance and memory reasons, on the Nintendo GameCube™ a meta-tile is not allowed to use more than three different texture layers blended together. In the data conversion each meta-tile that has non-trivial blending, is assigned a 32×32 pixels texture image that contains the mix-map information for that meta-tile.”

Swp64

On December 20, 2007 at 7:58 am

Realistically, I doubt very much that the Wii's fill rate is "OVERKILL AT 480P", given it has the same ROP pipeline configuration as the Gamecube, just with a higher clock frequency.
Perhaps overkill for what the Wii could ever actually do with it, maybe.

ManOfTeal

On December 20, 2007 at 10:56 am

Wii60 FTW!!!!

wiiboy101gezza

On December 21, 2007 at 12:38 pm

wii60? no thanks, i don't buy my consoles from computer OS companies engaged in a set-top-box war with a general electronics company called sony

i buy my console from the greatest console builder / games company on earth: nintendo

d-pads ain't next gen. ps3 is still built around an 8-bit-era direction invention. that's laughable

360 is all microsoft hype and, again, an outdated interface/controller

wii runs load-screen free and offers pick-up-and-play like no other

WHY? IT'S BECAUSE IT'S DESIGNED AS A CONSOLE, A CLOSED SYSTEM CUSTOMIZED FOR IMMEDIATE PICK-UP-AND-PLAY GAMING AT HIGH SPEED

PS3 IS A FAKE COMPUTER

X360 IS A FAKE COMPUTER

WHICH = LONG LOAD TIMES, UNNECESSARY BULK, MORE MULTIMEDIA-OPTIMIZED THAN GAMING-OPTIMIZED, AND UNIMAGINATIVE, OUTDATED CONTROLS

CLEARLY PROVING THERE'S A WINDOWS-BRAND VS PLAYSTATION-BRAND SET-TOP-BOX WAR

ONLY WII IS A GAMES CONSOLE

LONG LOAD TIMES AND BUGS THAT REQUIRE PATCHES ARE ALL A WII BIT TOO MUCH PC-LIKE

ONLY WII IS A NEXT-GEN CONSOLE. THE OTHER 2 ARE CLEARLY COMPETING SET-TOP MEDIA BOXES

SO AGAIN I SAY: NINTENDO IS GAMING

PS3 AND X360 ARE IN FACT HURTING GAMING THROUGH NONSENSE JACK-OF-ALL-TRADES DESIGNS

IF MICROSOFT WAS, HAND ON HEART, CONCERNED ABOUT GAMING, AND SONY TOO,

WOULDN'T THEY HAVE NEW CONTROLS AND FAST LOADING?

GAMING WAS AN AFTERTHOUGHT. THEY'RE OBSESSED WITH OUT-MEDIA-BOXING EACH OTHER

THAT WILL ONLY END ONE WAY

GAMING WILL SUFFER

Swp64

On December 21, 2007 at 2:51 pm

“WII PATENTS SHOW…” recycled Gamecube patents, of texturing techniques that have been around since the Geforce 256's register combiner days.
“Emboss bump mapping” has been around since forever; lots of Nintendo fans thought it was new, or thought it was a hardware-specific form of displacement mapping, etc. And then there was the “cube mapping”, and “NURBS”, etc. All false hope, grasping at straws, hoping that those specs that were released really didn't say it all…

And technically, if you read interviews with a lot of Japanese developers, they'll tell you that Nintendo is hurting the overall development of game design and technology in Japan with its recycling of its Gamecube hardware. Many see their ambitious ideas being crushed when they can't get funding and resources for next-gen development. They see extremely cheap mini-games selling well and sucking up potential gaming revenue from the industry.

They see publishers asking them to shoe-horn their new, ambitious ideas and game design onto hardware they thought they had washed their hands of, and were being set free from, with a newer generation of hardware. They were looking forward to getting the hang of newer, better technology and stretching their legs with a much higher performance ceiling.
Only to have Nintendo shove a platform on them that is hardly a nudge better than last-generation technology. Most gaming staff don't particularly care about waggle. Artists, animators, programmers, etc. severely dislike the Wii for a very good reason.

Fashir

On December 21, 2007 at 3:08 pm

@wiiboy
Stop, just stop now. Never again call the Wii the only next gen… 360 and PS3 do EVERYTHING better than Wii. So what if they have load times? That's because they have good games, large games that are able to have load times and still be great games. The Wii has loading too… it's covered by sneaky programming such as a slow part in the action, or a piece of speech. If there were only one next-gen system, it wouldn't be the Wii; the Wii is a Gamecube with a new controller scheme. A controller scheme that is annoying to play with, in my opinion; it's not precise at all. Give me a real controller, implement motion detection into that, and I'm good to go… now if only someone would do that… hmm… oh wait, the PS3 did! Well, if only their Sixaxis was better than the Wii's… wait for it… oh yeah, it is.

Fashir

On December 21, 2007 at 3:10 pm

And you can also try to spell words correctly, as well as not using caps =)…nobody is impressed by your caps, nor by your useless specs that prove absolutely nothing.

Swp64

On December 21, 2007 at 3:16 pm

Akio Morita of Afterlife programming inc., comments on the “parasite of gaming” that is Nintendo.

*”It’ll be years before we recover from this. We are struggling to just keep pace with the rest of the industry. We don’t have a strong PC background to fall back on, and it’s getting more and more difficult to stay up to speed on technology this generation, thanks in no small part, to Nintendo’s “thriftiness”. They have effectively crippled our industry for years to come, and perhaps now, it is too late for us…..
*sobs*
*He chokes back the tears to continue*: “The people are scared. They're afraid to speak out against Nintendo, and they are worried about keeping their jobs. It is not good to bite the hand that feeds, but I felt compelled to, because I know that others will not. Nintendo does not care about us; they refuse to put quality regulations on the software released on their platform, to cut back on the mounds of crap that is bound to be released, especially now that hell has frozen over and their platform has somehow become popular. Any ambitious titles run the risk of drowning in a cheap sea of mini-game crap.”

“When we brought our concerns to Nintendo, they spit in our face, and told us, “let the consumers decide what they buy”. And what do they care: as long as they stick their franchise characters into a game and drop the Nintendo name onto it, people will buy it. Being lost in a sea of crap is our problem, not theirs.”
“The old Nintendo is back. Consumer ignorance has awoken a sleeping giant, and they will utilize their newfound power, and rule us all with an iron “waggle” fist once again, and perhaps this time, there is no recovery.” *places revolver to his temple, and squeezes the trigger*…….”

Tis very sad.

Not a fan

On December 25, 2007 at 10:49 am

I didn't know about the pixel recycling in the GC; do multiple pixel pipelines fix this? And another thing: why are the fill rates so high? For example, the 360 specifies 4 Gpix/s, but assuming a 64-bit data bus, 500MHz bus speed and 24bpp, that calculates to 1.3 Gpix/s in my book.

@Swp64: perhaps they exaggerated a bit? I think the Wii lacks internal memory, but think about how many lines of code one would have to write to bring a 700MHz PowerPC to its knees… Creating game logic is all about simplifying and deducing. If it wasn't, Quake 1 (software rendered) could never run on a 90MHz P1…

BranH

On December 25, 2007 at 8:40 pm

Well, all 4 pixel pipelines are contributing to its 648 million pixels per second fill rate. They're like assembly lines. Each assembly line can add 1 texture layer at a time. You'd have a mixture of different pixels with different numbers of texture layers, etc., but it's still 1 to 1 with fill rate.
Xbox is 2 to 1.
Of course, it's not that easy to compare the two. And it's likely that Gamecube hits closer to its theoretical maximums, but those are still below Xbox.

I've also heard it mentioned from a few different places that there really aren't any cases where Gamecube can out-fill Xbox (outside of perhaps transparent pixels). And there's a significant T&L difference in favor of Xbox, as well as a large math advantage in shaders for things like dot products, etc.
(So on paper, Xbox wins; in practice, Xbox still mostly wins.)
I'm not really dissing Gamecube as a whole; it wasn't a bad console, and it had a few advantages. I was just saying it certainly doesn't "destroy" Xbox is all. And the Wii doesn't change much in the GPU, outside of clock rate and additional RAM and bandwidth.

(also, games like Wind Waker, Pikmin, and Mario Sunshine, all ran at ~30 frames per second, not 60. And I’d be surprised if any single level in Mario Sunshine, was made up of even 100,000 polygons total. So I doubt it’s doing anything with more than a few million in any given second of time)

BranH

On December 25, 2007 at 9:15 pm

**1.3?

I assume you took 500MHz x 64-bit bus = 4 GB/s, divided by 3-byte (24-bit) pixels = 1.33 Gpix/s.

Actually, all pixels are at least 32 bits for color: 8 bits each for red, green, blue, and alpha (transparency, usually).
Unless you're looking at Gamecube's 24-bit, or the 18-bit color it sometimes had to use (6, 6, 6, 6). You can see lots of color banding in those cases.

Then you've also got a Z value (depth), which is usually 32 bits (24 bits + an 8-bit stencil for things like shadows or reflections, etc.).

So it's actually 64 total bits per pixel.

And I'm not sure they've specified what the bus width is between the main die and daughter die, other than to say it's 32GB/s, enough for the full 4 Gpix/s, plus a little more for other required data.

I have read 64×2-bit bus.
You'd have a color value per pixel in a quad, and 4:1 Z compression (which in this case is supposedly "always" 4:1, because it's transferred over the bus before the ROPs get it).
As it is though, the developer discussions and presentations confirm that it is practically impossible to be bandwidth limited for frame buffer operations.
You'd become fill-rate limited before you'd ever become bandwidth limited.
Which would verify that it's moving 32GB/s to the daughter die.

Plus, its fill rate is theoretically higher than that, since the ROPs in eDRAM can "technically" do 4x AA in the eDRAM module on top of that fill rate, in addition to the GPU's 2x Z fill rate, unlike other GPUs, which halve fill rate moving from 2x to 4x AA and lose their double Z-rate ability.

But all that makes a lot of assumptions about what the game engine looks like, how things are organized, etc., so it's not telling much on performance in current games.
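The pixel-size accounting above can be checked directly. A small sketch, using Not a fan's 64-bit/500 MHz figures and BranH's 4-byte-colour-plus-4-byte-Z accounting (both taken from the comments, not from any official document):

    def bus_limited_fill_mpix(bus_bits, clock_mhz, bytes_per_pixel):
        # Peak pixel rate the bus alone could sustain, in millions of pixels/s.
        bytes_per_s = (bus_bits / 8) * clock_mhz * 1_000_000
        return bytes_per_s / bytes_per_pixel / 1e6

    # A 64-bit bus at 500 MHz moving 3-byte (24-bit) pixels -> ~1333 Mpix/s,
    # which is where the "1.33 Gpix/s" figure above comes from.
    print(bus_limited_fill_mpix(64, 500, 3))

    # With 4 bytes of colour + 4 bytes of Z per pixel (8 bytes total),
    # sustaining the quoted 4 Gpix/s would need about 4e9 * 8 = 32 GB/s.
    print(4_000_000_000 * 8 / 1e9)   # 32.0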

BranH

On December 25, 2007 at 10:13 pm

@not a fan.
He was actually exaggerating when he wrote that. Mostly sarcasm.
(Akio Morita has passed away, hence the “Afterlife programming inc”)
I’m sure they can make decent games on the Wii, but I also think it is inhibiting progress.
There are a bunch of developer interviews that have said pretty much that, only one blames Nintendo specifically though.

But plenty of developers have said they’re falling behind in programming in Japan. Partially just because of their culture being different from the west. They’re new to middleware tools for example, and many of those, are English based. So it’s frustrating to them.
Last generation, they had handhelds, and they had consoles. They were pretty separate items to make games on. Consoles were all somewhat close in power and what could be achieved on one, could “pretty much” be done on another, without changing much. So the idea that they had to stick to PS2 most of the time, wasn’t “that” big of a deal.

Now, they have the handhelds, and they have the Wii and the other consoles. The Wii is the odd man out as far as game engines and power, etc.. go.
But it’s also selling really well in Japan, while 360 probably never will, and PS3 hasn’t caught on enough to get funding for anything but well known games. So, publishers are quick to push developers to put many of their ideas on the Wii. Typical programmers are techies, and really don’t like the idea much. Nor do the animators, designers, or artists, etc..
Not all that different from the developer on Spore calling the Wii a “piece of ” at the last GDC.

Couple that, with the fact that, because the Wii is doing so well and the others not so well, they can’t push for better funding and support for r&d on newer technology, the way they did the last generation of hardware.
Publishers are cheap and like money, and developers pretty much have to do what they want, when they want it.

honestypays

On December 27, 2007 at 1:07 pm

the xbox fans need to wake up. the guy is simply telling the truth

xbox front-side bus = 133mhz, no built-in compression trick = 1gb bandwidth

wii front-side bus = 243mhz plus 4:1 data compression with realtime decompression built in = 8gb
CLEARLY 8 TIMES MORE

WII cpu: risc / copper wire / customized / silicon-on-insulator / 256k l2 cache, 4:1 compressed data read with realtime decompression

xbox cpu: ty aluminium celeron cisc cpu, 128k cache, no compression trick abilities…

wii gpu: a huge 3mb of extremely fast 1t-sram edram graphics cache, reads textures compressed 6:1 in real time

xbox: a crappy 256k graphics cache with no compressed-read abilities. textures need to decompress before reading, further bottlenecking the xbox

wii texture read = 16gb, plus 6:1 compression = 64gb-plus easy virtual texture read

xbox texture read: only what's left in main memory after cpu, sound and buffering take their toll = 3.2gb, if that, and a texture read that has to decompress before hitting the gpu…

xbox: slow dinosaur sdram at a 200mhz clock, doubled to 400mhz in the memory controller,
crappy and slow, with latency issues

wii: two main memories. extremely fast 1t-sram, 24mb on die, extremely fast and efficient cache-like performance, 5 nanosecond read speed, 10-PLUS TIMES FASTER THAN XBOX DRAM

64MB of GDDR3 on a separate dedicated bus, clock balanced to the fsb and 1t-sram etc, better all-round performance than xbox's whole main memory

88mb of fast, optimized ram plus custom compression in hardware

vs 64mb of slow ram and only half as many compression tricks

wii: dedicated sound bus. xbox: all data shared on a slow 133mhz bus

the broadway cpu is easily 2.5 times an off-the-shelf celeron xbox chip. FACT

BROADWAY is a copper-wire, silicon-on-insulator, micro embedded design: a clock-balanced risc cpu with added compression abilities. internal bandwidth is easily 4-5 times the xbox cpu

do your math

cubeboy101

On December 27, 2007 at 1:39 pm

apparently xbox out-fill-rates a gamecube. IDIOT IDIOT IDIOT

YOU JUST DIRECTLY COMPARED PEAK TEXEL/PIXEL FILL RATES ON PAPER
1. FLIPPER WAS VASTLY MORE EFFICIENT THAN XGPU, AND WII'S HOLLYWOOD IS MORE EFFICIENT AGAIN

2. FLIPPER AND HOLLYWOOD ARE CUSTOM DEFERRED-RENDERING GPUS, XBOX IS TRADITIONAL. EXPLANATION FOLLOWS

VIRTUAL TEXTURING = CULLING OF THE TEXTURE IMAGE AT TEXEL LEVEL, I.E. WHAT ISN'T SEEN ON CAMERA ISN'T RENDERED, AKA DREAMCAST AND CUBE ONLY. XBOX CANNOT DO THAT

PIXEL CULLING = WHEREAS XBOX, AND MORE SO PS2, HAVE TO RENDER A PIXEL FOR EVERY RENDER LAYER, GAMECUBE AND WII MULTI-RENDER TO ONE DRAWN PIXEL, I.E. PIXEL EFFICIENT JUST LIKE TEXEL EFFICIENT

POLYGON CULLING = POLYGONS THAT ARE SEEN ARE DRAWN, POLYGONS THAT ARE NOT SEEN ARE NOT DRAWN, AKA GAMECUBE AND DREAMCAST ONLY. XBOX CANNOT DO THAT
HOLLYWOOD IN WII DOES IT EVEN BETTER

XGPU WAS BOTTLENECKED IN MANY WAYS, WITH POOR EFFICIENCY JUST LIKE ALL PC CARDS:
POOR MEMORY LATENCY
TINY CACHE PERFORMANCE
FSB BANDWIDTH BOTTLENECK
NOT DESIGNED FOR 480I/480P, IT WAS DESIGNED FOR PC MONITORS NOT TV SETS
NO REALTIME COMPRESSION

EFFECTIVE IN-GAME PIXEL/TEXEL FILLRATE WAS BETWEEN 300M/P-M/T AND 750M/P-M/T
IN GAME

FLIPPER IN GAMECUBE INFLATED ITS FILLRATE THROUGH DEFERRED RENDERING
AND MATCHED IF NOT BETTERED XBOX FILLRATE

FACTOR 5'S ROGUE SQUADRON 2: 60 FRAMES, 15 MILLION POLYGONS, NO FILLRATE ISSUES, PROGRESSIVE SCAN, 8 TEXTURE LAYERS, 512X512 TEXTURE SUPPORT

THE SAME GAME ON XBOX WOULD HAVE HAD TO CULL ITS FRAMES TO 24FPS AND HAVE MUCH LONGER LOAD TIMES

ROGUE SQUADRON 3: NEAR 20 MILLION POLYGONS, A NEAR-PERFECT 60 FRAMES AND 512X512 TEXTURES SUPPORTED

REAL-TIME LIGHT SCATTERING, FAKE HDR LIGHTING, SHADING, BLENDING, BUMP MAPPING ETC

TWILIGHT PRINCESS (GAMECUBE / WII, GAMECUBE GRAPHICS ENGINE): BETTER LOOKING THAN ANYTHING ON XBOX

WIND WAKER: BETTER CEL SHADING THAN ANYTHING ON XBOX

FLIPPER AND GEKKO HAD 2.5 TIMES THE INTERNAL BANDWIDTH OF THE XBOX CHIPSET

clearly wii has 4-5 times the internal bandwidth of xbox

the gpu supports custom deferred rendering plus a high-bandwidth, low-latency, realtime-compression cache, custom effects and a real-time 16-stage shader tree called TEV

mario galaxy has better graphics, better animation, better physics and faster loading than anything ever on cube or xbox

so how, again, is wii an xbox? galaxy has no load times or frame drops and is clearly supporting multiple texture layers and real-time shaders plus hdr lighting etc

monster hunter 3 clearly shows graphics that are high end in this new generation and will be rendered at 480p, so apart from resolution there's simply no graphics war, is there

xgpu was inefficient, fact //// the xbox system was simply inefficient, fact, and compared to wii simply much less powerful and less optimized for 480p rendering

wii is a version 2 of gamecube. gamecube was more effective and efficient than xbox, and wii is more effective and efficient than gamecube, so it's clear the xbox brigade are talking out the wrong end of their bodies

custom deferred rendering, AKA FAKED TILE RENDERING, combined with fast ram and highly efficient system performance = INSANE FILLRATE
FACTOR 5 ON WII: WII IS MORE THAN 2X XBOX SPEED

CLEARLY WII IS 2X AN XBOX/GAMECUBE-LEVEL CHIPSET AND AROUND 3.5 TO 4 TIMES THE RAM PERFORMANCE

2.5-TIMES-PLUS AN XBOX/CUBE-LEVEL MACHINE

cubeboy101

On December 27, 2007 at 1:51 pm

long story made short:
xgpu, 120 million polygons on paper = in-game 12 million polygons at 30 frames. clearly a huge gap between on-paper and in-game, and the same applies to pixels and texels. xbox was inefficient…

gamecube, around 90 million polygons on paper = near 20 million polygons in-game at an almost perfect 60 frames a second. clearly the gap between the on-paper spec and real in-game performance was vastly superior in-game, where it matters…

wii's hollywood is not only higher clocked and higher specced than gamecube's gpu, it's also more effective and efficient, so actual in-game performance clock for clock is higher again

wii can near match 360 minus hd native resolution. FACT

LOOK UP THE WORDS CUSTOM, EFFICIENT
LOOK UP TILE RENDERING

DO YOUR HOMEWORK

ManOfTeal

On December 27, 2007 at 1:53 pm

GUYS!!!!!!…..Who gives a flying flip????? :cool:

seriously

Why don’t you all go outside and find girlfriends……. :lol:

havoc of smeg

On December 27, 2007 at 3:06 pm

shut up wii fanboy, the wii is a pea shooter next to the nuke that is the ps3 and 360.

even if it does *somehow* outperform the original xbox, it is still not very powerful compared to consoles of the same generation released around the same time.

new gen console + old gen technology = not very good.

BranH

On December 27, 2007 at 8:18 pm

“VIRTUAL TEXTURING = CULLING OF THE TEXTURE IMAGE AT TEXEL LEVEL, I.E. WHAT ISN'T SEEN ON CAMERA ISN'T RENDERED, AKA DREAMCAST AND CUBE ONLY. XBOX CANNOT DO THAT

PIXEL CULLING = WHEREAS XBOX, AND MORE SO PS2, HAVE TO RENDER A PIXEL FOR EVERY RENDER LAYER, GAMECUBE AND WII MULTI-RENDER TO ONE DRAWN PIXEL, I.E. PIXEL EFFICIENT JUST LIKE TEXEL EFFICIENT”

It’s as though you guys don’t read anything that is written to you.
http://72.14.205.104/search?q=cache:TXaryQQ0gdgJ:www.activewin.com/reviews/hardware/graphics/nvidia/gf4ti4600/gf2.shtml+%22Z-Occlusion+culling%22+hardware+surface+removal%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

And you're speaking of the virtual texturing that takes place to move textures in and out of its texture cache, which it needed for the way in which it handled multi-texturing effects.
It's not RENDERING the way Dreamcast did. It's just rendering to its framebuffer, which happened to be on chip.
And neither Gamecube nor Xbox is as efficient at HSR as the Dreamcast.

And as I said, Gamecube can add ONE texture per pixel; additional layers cut fill rate. FACT.
Xbox can add 2 texture layers without halving fill rate. Gamecube cannot.
You keep bringing up points out of context, continue to misread your info, and assume most of those points you think you have found about Gamecube are some awesome new Gamecube-exclusive technology.

For example, Xbox uses "single-pass multi-texturing", just like the Gamecube did. All that means is the GPU doesn't have to recalculate the POLYGON every time it wants to add a texture layer.

You're confused. The re-rendering of a polygon in "multi-pass rendering" is a PS2 issue, and it's not as big of a deal on PS2, because the PS2 can calculate more polygons than Xbox and Gamecube combined. It just has to keep recalculating them for every texture layer.

Adding a texture layer over 1 cuts the Gamecube's pixel fill rate in half. That is the way it is, the way it was, and the way it will continue to be for the Wii.

“EFFECTIVE IN-GAME PIXEL/TEXEL FILLRATE WAS BETWEEN 300M/P-M/T AND 750M/P-M/T
IN GAME”

Yep, sounds exactly like the benchmarks I’ve seen, and fits with what the paper specs say. But, here’s why.

Xbox = 1 texture layer = ~700 million pixels per sec.
xbox = 2 texture layers = ~600 – 700 million pixels per sec. (1.2-1.4 texels)
Xbox – 3 texture layers ~300 -400 million pixels per sec
Xbox = 4 texture layers = ~300 – 400 million pixels per sec.

Gamecube (assuming maximum efficiency)
1 texture layer = 648 million pixels per second.
2 texture layers = 324 million pixels per second.
3 texture layers = 162 million pixels per second.
4 texture layers = 61 million pixels per second.

That is the truth in how it layered textures.
I put the important points in quotation marks, in case you guys want to look them up for once.

BranH

On December 27, 2007 at 8:56 pm

And Xbox’s cpu + 2 vertex shaders > Gamecube’s cpu + generic T&L unit.
It's not even close, even if we falsely assume that Flipper had more power at everything.
And you guys keep copying and pasting compression numbers and bus-width numbers on issues you don't even try to understand.

****“xgpu, 120 million polygons on paper = in-game 12 million polygons at 30 frames. clearly a huge gap between on-paper and in-game, and the same applies to pixels and texels. xbox was inefficient!!!!”*****“gamecube, around 90 million polygons on paper = near 20 million polygons in-game at an almost perfect 60 frames a second. clearly the gap between the on-paper spec and real in-game performance was vastly superior in-game, where it matters”***

90 million = lol
And these numbers are simply based on the traditional matrix x vector calculation. In other words, it takes 4 clock cycles for the T&L processor to calculate a simple vertex transform. So, every vertex processor you have = 1/4 the clock frequency. The same is true on every PC GPU's spec sheet.
Since Xbox has 2 vertex shader processors, it can theoretically transform 116.5 million vertices per second.
Gamecube has 1 generic T&L unit. The theoretical maximum is 1/4 its clock, or 40 million vertices per second. Of course, Xbox was far, far superior in math calculations in respect to vertex work, and could drive its pixel shaders directly (normal mapping, etc.).
Gamecube would need its CPU (and its FSB) to do similar things, and it wasn't powerful enough to do so.

Gamecube has a simplified lighting model, so an apples-to-apples comparison isn't possible. But overall, Xbox still had the advantage in T&L processing, and a huge advantage in what could be done with it.

And, as I'm sure you know, vertex shaders are more complex than simple transforms, especially on Xbox, and vertex processing is assessed based on that processing power. The number you wind up with on screen is only part of the equation.
Overall,
Xbox > Gamecube
Every developer that has ever touched them both, will tell you the same thing.
You guys just need to learn to accept that, and move on.
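The vertex numbers in that last comment follow from the convention BranH states: one basic matrix-times-vector transform per four clock cycles, per vertex unit. A minimal sketch of that arithmetic (233 MHz and 162 MHz are the usual published GPU clocks; the four-cycle figure is the assumption BranH cites):

    def peak_vertex_rate_millions(gpu_clock_mhz, vertex_units, cycles_per_transform=4):
        # Theoretical vertex transforms per second, in millions: each unit
        # finishes one transform every `cycles_per_transform` cycles.
        return gpu_clock_mhz * vertex_units / cycles_per_transform

    print(peak_vertex_rate_millions(233, 2))   # Xbox: two vertex shaders -> ~116.5M verts/s
    print(peak_vertex_rate_millions(162, 1))   # GameCube: single fixed T&L unit -> ~40.5M verts/s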

BranH

On December 27, 2007 at 9:05 pm

Edit, probably more like this.
I probably made the Gamecube even more feeble than I should have.
Sorry about that.

Xbox = 1 texture layer = ~750 million pixels per sec.
xbox = 2 texture layers = ~650 – 700 million pixels per sec. (1.2-1.4 texels)
Xbox – 3 texture layers ~300 -400 million pixels per sec
Xbox = 4 texture layers = ~300 – 400 million pixels per sec.

Gamecube (assuming maximum efficiency)
1 texture layer = 648 million pixels per second.
2 texture layers = 324 million pixels per second.
3 texture layers = 216 million pixels per second.
4 texture layers = 162 million pixels per second.
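The corrected GameCube column follows a simple pattern: the 648 Mpix/s base rate divided by the number of passes, with one texture layer applied per pass. The Xbox column follows the same shape if you assume two layers per pass (its two texture units per pipeline). A sketch of that model, with the caveat that the Xbox figures BranH lists are observed, not theoretical, so they sit below the 932 Mpix/s paper number (233 MHz x 4 pipelines) used here:

    import math

    def effective_fill_mpix(base_fill_mpix, layers, layers_per_pass):
        # Under the layering model described above, every extra pass through the
        # pixel pipeline divides the effective pixel fill rate.
        return base_fill_mpix / math.ceil(layers / layers_per_pass)

    # GameCube/Wii-style recirculation: one layer per pass, 648 Mpix/s base.
    print([round(effective_fill_mpix(648, n, 1)) for n in range(1, 5)])  # [648, 324, 216, 162]

    # Xbox-style dual texture units: two layers per pass, 932 Mpix/s theoretical base.
    print([round(effective_fill_mpix(932, n, 2)) for n in range(1, 5)])  # [932, 932, 466, 466]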

Swp64

On December 27, 2007 at 10:42 pm

The Wii is a “piece of ” compared to the other two. At everything.
No amount of misinterpreted spec data or caps-locked copy-and-paste arguments is going to change that.
Even if you stick to 480p, the power is just not there.
A Geforce 4 is more powerful than the Wii, and significantly more flexible and versatile. That's just the way it is.
No need to add to why your specs are wrong, cubepeople; it's been said a half dozen times now. You can keep right on assuming Gamecube was magically several times better than Xbox, and ignore games like Doom 3, Ninja Gaiden Black, Panzer Dragoon Orta, Half-Life 2, DOA3, Forza, 720p Soul Calibur 2, etc.
And keep comparing a closed-corridor (non-bump-mapped) game like Metroid Prime to a much more wide-open (bump-mapped) game like Halo 2, and somehow draw some sort of conclusion on power from that. That's cool, "keep the dream alive".

(Note, no one is saying the Gamecube wasn't capable of good-looking games; you don't need a ton of bump mapping with good art direction and working within your limits.)

Realistically, the only thing Nintendo games have going for them this gen is the fact that they're well made. Controls, graphics, animation, story, etc.
Sure, Mario Sunshine had crappy low-res textures in most places, and a limited number of polygons in its levels, but it stayed around 30 frames per second, and its limitations fit the art direction well. They just oversaturated the color on everything, kept everything clean, and it looked good.

Same for Galaxy. Nothing to complain with those. Same will be true for future titles as well. But people holding out for a miracle, you’re setting yourself up for disappointment.

****wii can near match 360 minus hd native resolution FACT***
There really are no facts to support that. In game, on paper, in theory…etc..

Not a fan

On December 28, 2007 at 9:32 am

@BranH: Thanks for your answer. Yeah, that's the way I calculated it. But to be honest, I think Microsoft specifies the 360's fill rates at 8bpp without Z-buffering. I mean, even if it could do 2 memory transactions in one clock cycle it wouldn't be enough. Plus, the memory must support that as well in non-burst mode (or is burst mode prehistoric these days? :), requiring a 1GHz memory bus. As far as I know, the 360 only has 512MB of shared memory, right?

Same goes for the 500M polygons they spec. Sure, I believe that GPU can calculate one poly per clock cycle, but how many pixels will it consist of: 8 neighbouring on a single line, 256-colour pixels at a maximum! The 8-bit alpha you are talking about doesn't necessarily need to be stored in memory, BTW: the GPU can use the alpha value of the *new* pixel to blend it with the framebuffer's pixel. That's my experience with Nokia phones at least.

Did you benchmark the Gamecube's fill rates, BTW? Say that it does Z-buffering and pixel recycling for 2 layers. It means we need 2 texture reads, 1 Z-buffer read, 1 Z-buffer write, 1 recycle read and 1 framebuffer write. Wouldn't this mean the extra texture only requires 2 additional memory transactions on top of the other 4, making it faster than 324 Mpix/s? Also, if Flipper has separate framebuffer and texture-cache data buses, it can recycle a pixel AND read a new pixel at the same time, which makes your statement a bit questionable. Are you sure about its architecture? (I can't think of any reason why it has 3MB on die if it already has 24MB of memory clocked at the same speed as the GPU.)

@Swp64: how about titles such as Conan, NFS ProStreet and The Darkness? Those will run on Wii, 360 and PS3. Perhaps those titles are the least GFX-, RAM- and CPU-demanding, but they aren't released on last-gen systems. As for your last sentence: if I'm right about the 360's fill rate being 8bpp, the Wii's fill rate is on par, considering 480p.
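Not a fan's transaction-counting question can be written out directly. A small sketch of that bookkeeping, with the transaction mix taken straight from the comment and the single-shared-path assumption left explicit (whether Flipper actually serialises these accesses is exactly the open question being asked):

    def transactions_per_pixel(layers):
        # Per the comment: one Z read, one Z write and one frame-buffer write per
        # pixel regardless of layer count, plus one texture read per layer and
        # one "recycle" read-back for every layer after the first.
        texture_reads = layers
        recycle_reads = max(layers - 1, 0)
        return texture_reads + recycle_reads + 3

    # If every transaction costs the same on one shared path, a second layer takes
    # the per-pixel cost from 4 to 6 transactions (x1.5 rather than x2), which is
    # the basis of the "faster than 324 Mpix/s?" question above.
    print(transactions_per_pixel(1), transactions_per_pixel(2))   # 4 6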

ManOfTeal

On December 28, 2007 at 10:14 am

GUYS!!!! Girlfriends……Go Outside………Get Girls…..simple concept…..

Swp64

On December 28, 2007 at 5:48 pm

Fill rate isn't the definitive measure of a GPU. Shader instructions have taken over from texture instructions. The level of vertex and pixel shader instructions, physics, etc. just puts a lot of it out of the Wii's reach, for things like mesh deformation and so on. You can't do a lot of things newer hardware does when it doesn't have the hardware for it. You can replace things with cheaper methods, but you could technically do that on Xbox, or Dreamcast, or even Gamecube. It doesn't really mean it matches them.

And Gamecube's fill rates with multiple texture layers are just base numbers based on its blending ability, to compare generically to Xbox numbers.
Texture read, combiner op, combiner op, texture read, combiner, etc. Things may be higher or lower under different circumstances on Xbox as well, but that's generically how it works out, with Gamecube being more efficient with what it has, and Xbox being more powerful.

Xbox has a 2-to-1 texel/pixel rate, Gamecube 1-to-1.
Which matches what most developers say, and you can see it in games, where reflections are moving on helmets in Madden but baked into the texture on Gamecube, etc. Xbox generally has 1 or more texture layers more than Gamecube.

(It also depends on what is defined as a texture layer, I'm sure.)

Not a fan

On December 29, 2007 at 11:46 am

@Swp: yeah, completely true. But that doesn't take away that both 360 and Wii are on par doing 1-layer textures. I don't think the GC's, the Wii's, or the Xbox 1's main CPU is fast enough to calculate cheap "texture effects" anyway. The number of pixels to process is too high. An all-software renderer might do it faster, since it only processes the pixels that are actually displayed. So I agree, hw support is a must. However, developers must care enough to put in the extra time and cash to actually use it. If I was Microsoft, I'd demand/pay 360 developers to do so.

On the other hand, I'm still stunned by RE4, for example. I like its hi-res textures and static lighting. Now if they would use Phong or Gouraud shading instead of flat shading, it would be enough for me. (Of course, I haven't played any games since Quake 1, so I'm easily impressed.)

BTW, the Xbox came out later than Gamecube, didn't it? That makes it very likely they made it more powerful than the other consoles.

@ManOfTeal: my GF has a Wii… But perhaps you should go out yourself instead of wasting your time on this thread :mrgreen:

BranH

On December 30, 2007 at 3:57 am

@not a fan.
You're assuming they claim their peak fill rate based on 8-bit pixels with no Z-buffering? Of what use is that, really?
Thing is though, they haven't really gone into detail on the bus between the main GPU and the eDRAM, other than to say it's "32 GB per second".
Someone at NEC had said once that the bandwidth between the main die and the daughter die is 22.4 GB per second. But someone from Microsoft corrected it,
with: “Reply: 32GB/s is the correct (original) BW from parent to daughter die.

This is 8 pix/clk * (4B(32b) color + 4B(32b) z) == 64B/clk * 500MHz -> 32GB/sec. It is actually a little higher than that, but that is close enough and simple enough to go with. If NEC is claiming 22.4, that is wrong.”

Which mirrors most developer comments, in that you'll become fill-rate limited before you become bandwidth limited in regards to frame-buffer ops.
If the ROPs provide 4 gigapixels but were bandwidth limited at 1.3 gigapixels, then everyone would be saying it's bandwidth limited, as that would be the limiting factor.
(In that sense, they consider frame-buffer bandwidth to be pretty much infinite.)
As you have the (write-only) bus between the main die and the eDRAM module (supposedly 32 GB/s), and separate busing from there, between the ROPs and the eDRAM itself (256 GB/s), which is where all the Z-testing, alpha blending, 4x AA etc. takes place. Then it's resolved and flushed to main RAM as a frame, part of a frame, or data to be used as input for the GPU again.
That's how/why they stuck with a 128-bit bus to GDDR3: because the GPU itself only sees main RAM as read-only for texture and vertex data, rather than putting all the read, write, read/modify/write traffic on it.

swp64

On December 30, 2007 at 5:56 am

@NotaFan – Well, I meant replacing one type of mapping technique with a cheaper method, or reducing polygon count and complexity, animation, etc. You can emulate vertex shaders on the CPU to some extent; there's not much you can do for pixels though.

And as a theoretical benchmark,
the 360 can "technically" render each pixel:
"with 4x anti-aliasing, a z-buffer, 2 texture fetches, 6 shader operations, at the full 8 pixels per clock." (Of course, this is assuming bs maximum efficiency.)

But the Wii can't do any AA without cutting fill rate at 1 texture layer (unless they changed their method of AA), so it's already well below 1/4 without considering much else. And that's to cover only 1/3 the resolution, and it's all downhill from there as well.

@MoT – My gf also has a Wii, which I bought her and play more than she does, but still. (There is no Animal Crossing yet.)

havoc of smeg

On December 30, 2007 at 6:24 am

you guys aren't gamers,
YOU'RE NERDS!

Not a fan

On December 31, 2007 at 11:01 am

@BranH: OK, so basically you are saying that the GPU has a cache that supports the required bandwidth and uses main memory for reading purposes only? That would be an interesting setup and fully explains what this machine can do. (Damn, today's technology goes too far :) Thanks for your explanation.

@Swp: You are right, I haven't seen any AA in Wii games. The downside is that graphics keep on looking "computer generated", with sharp edges and stuff. What exactly do you mean by other means of AA? Are there cheap alternatives available?

@havoc of smeg: go learn how to write some proper English first, you dyslexic buttplug. In between, you could learn some good stuff by just reading this thread instead of posting meaningless comments.

havoc of smeg

On December 31, 2007 at 2:47 pm

@not a fan

if you don't like my english, you can f**k off to russia or germany, where they don't speak or write english.

as for the console tech spec posts, i would read them if they were interesting and didn't send me to sleep.
i'm just not as interested in how consoles work as i am in the games you can play on them.

Swp64

On December 31, 2007 at 5:43 pm

@Not a fan – From what I recall of the Gamecube, the way the pipeline was set up, it could do AA behind 2 texture layers. I.e., AA would halve fill rate, but if you're multi-texturing, it's hidden behind that.

BranH

On December 31, 2007 at 7:17 pm

Yep, using 2 TEV stages cuts fill rate by half and hides the effects of AA.
But doing AA required dropping to 16-bit color and 16-bit Z, and its maximum resolution was 640×264, as opposed to 640×528 with no AA.
I would have thought they'd change at least that in the Wii though.

And in case anyone wanted to look it up, I'll highlight some important parts in their patent.
And keep in mind that this patent was written back when Gamecube's GPU was expected to be 202 MHz, hence the higher peak fill-rate figures listed.
(It was later clocked back to 168 MHz before release.)

http://64.233.167.104/search?q=cache:fmFqDVMovkkJ:www.freepatentsonline.com/EP1182617.html+%22include+an+on-chip+texture+memory%22+%22embedded+frame+buffer+(EFB)+has+a+memory+capacity+of+approximately+2MB%22+%22anti-aliasing+also+reduces+peak+fill+rate+from+800Mpixels/s+to+400Mpixels/s%22+%22using+two+TEV+stages+also+reduces+the+fill+rate+to+400Mpixels/s%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

BranH

On December 31, 2007 at 7:21 pm

OK, looks like the URL is too long to be posted, so I can't highlight everything at once.
I'll try again.

http://64.233.167.104/search?q=cache:fmFqDVMovkkJ:www.freepatentsonline.com/EP1182617.html+%22anti-aliasing+also+reduces+peak+fill+rate+from+800Mpixels/s+to+400Mpixels/s%22+%22using+two+TEV+stages+also+reduces+the+fill+rate+to+400Mpixels/s%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a
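The numbers in the patent quote hang together if you budget the embedded frame buffer. A sketch of that footprint, assuming the three sub-samples per pixel usually attributed to Flipper's anti-aliasing mode (an assumption here; it is not stated in the comment) and the colour/Z depths given above:

    def efb_bytes(width, height, color_bits, z_bits, samples=1):
        # Embedded frame buffer footprint: every pixel (or sub-sample) stores a
        # colour value and a Z value on-chip.
        return width * height * samples * (color_bits + z_bits) // 8

    # No AA: 640x528 at 24-bit colour + 24-bit Z -> ~1.93 MB, inside the ~2 MB eFB.
    print(efb_bytes(640, 528, 24, 24))
    # AA mode as described above: 640x264 at 16-bit colour + 16-bit Z with three
    # sub-samples per pixel lands on the same ~2 MB budget.
    print(efb_bytes(640, 264, 16, 16, samples=3))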

wiiboy101

On January 3, 2008 at 1:50 pm

the above gamecube fillrates have completely ignored the ability flipper had that xgpu and the ps2 gs unit didn't. YOU ARE USING STANDARD NUMBERS AND STANDARD ARCHITECTURES ON A DEFERRED-RENDERING CUSTOM CHIP, PROVING TO mii YOU DON'T KNOW A DAMN THING ABOUT WHAT YOU'RE ON ABOUT

DOUBLE THOSE FILLRATE NUMBERS PLEASE, THEN THEY'RE CORRECT

PIXEL CULLING: NON-REPEAT PIXEL RENDERING

POLYGON CULLING: THE ABILITY TO CULL THE POLYGONS DRAWN

VIRTUAL TEXTURE DESIGN: CULLING AND TRIMMING OF REAL-TIME TEXTURES

READ UP ON TILE RENDERING ON DREAMCAST AND DAMN WELL LEARN YOURSELVES

DREAMCAST PEAK FILL WAS 100M/P, BUT ITS "IN-GAME EFFECTIVE FILLRATE, THE FILLRATE THAT MATTERS AS IT'S ON SCREEN NOT ON PAPER, WAS 300M/P DUE TO FILLRATE CULLING AKA TILE RENDERING"

APPLY HALF THE FILL ADVANTAGE OF DREAMCAST TO THE HOLLYWOOD GPU IN wii

hollywood fillrate is over 900m/p and 900m/t peak on paper

1. hollywood gets much, much closer to that figure than xgpu got to its peak figure, BECAUSE HOLLYWOOD IS INSANELY EFFICIENT. IT CAN ACTUALLY NEARLY ACHIEVE ITS ON-PAPER FILLRATES. "EFFICIENCY", LOOK IT UP

THEN ADD TO THOSE NUMBERS:

PIXEL CULLING

POLYGON CULLING

VIRTUAL TEXTURE DESIGNING

ALL SUPPORTED IN HARDWARE, AKA "RIPPED-OFF TILE RENDERING ABILITY"

LET'S SAY HOLLYWOOD, WII'S GPU, WAS ABLE TO HIT 972 M/P AND M/T IN GAME

ITS ON-SCREEN FILLRATE EFFECTIVENESS WOULD ACTUALLY BE 2X 972M/P AND 2X 972M/T

AS THE DEFERRED RENDERING TRICK APPLIES

EVEN AT 4 PIXEL PIPES AND TEXTURE UNITS, WII IS A NEAR 2-GPIXEL, 2-GTEXEL MACHINE

THAT'S A HIGH 720P FILLRATE BEING USED ON A 480P-CAPPED CONSOLE

AKA IN YOUR FACE, I KNOW WHAT I'M TALKING ABOUT. "INSANE FILLRATE @ LOW RESOLUTION"

JUST LIKE FACTOR 5 SAID: INSANE FILLRATE
THEY'RE TALKING IN-GAME, NOT ON PAPER

TWILIGHT PRINCESS ON GAMECUBE HAD BETTER GRAPHICS AND FASTER LOADING THAN ANYTHING EVER PRODUCED ON AN XBOX

SO DID WIND WAKER, SO DID RESIDENT EVIL 4, SO DID F-ZERO, SO DID PRIME 2.
ALL THE ABOVE TITLES OUT-GRAPHICKED AND OUT-LOADED ANYTHING EVER ON XBOX

SO AGAIN, "AAAAAARRRRRRHHHHH"
EXPLAIN HOW WII IS ONLY AN XBOX

IT AIN'T SINKING IN, IS IT? COUGH, FANBOYS :evil: :evil: :evil:

wiiboy101

On January 3, 2008 at 1:56 pm

972M/P AND 972M/T PLUS DEFERRED CUSTOM RENDERING = NEAR 2-GPIXEL, 2-GTEXEL IN-GAME FILLRATE

THAT IS INSANE AT 480P

PLEASE DO YOUR MATH

GAMECUBE: 600-PLUS MPIXELS AND 600-PLUS MTEXELS PLUS DEFERRED RENDERING = 1200-PLUS PIXEL/TEXEL FILLRATE

FLIPPER AND FLIPPER 2, AKA HOLLYWOOD, RENDER IN A HIGHLY EFFICIENT, EFFECTIVE WAY. THEY DON'T RENDER IN THE LAME-ASSED, BOTTLENECKED WAY THE XBOX RENDERED

I WIN, PROVE MII WRONG
Wii is clearly a 2.5-times-xbox/cube-level machine

xboxmodder02

On January 3, 2008 at 2:08 pm

effective in-game numbers are real. the xbox fans are basing their opinions on… on-paper numbers and clockspeeds..

wiiboy/cubeboy, whoever he is, is basing the argument on in-game "effective performance", the performance that we will all actually see in our games.

sorry, he clearly wins this. it reminds me of the xbox fans that tried to say xbox 1 had a pentium 3 in it when in fact it had a celeron

or the xbox fans that said the xbox cpu was more powerful than gamecube's amazing gekko cpu when in fact it wasn't

celerons are totally lame and xbox 1 had a bog-standard 733mhz celeron cpu with a bog-standard, outdated 133mhz bus

gamecube had a highly shrunk, highly optimized, gamecentric custom version of a 750 powerpc cpu. it was risc not cisc, it was copper not aluminium, it had a 256k catch plus compression, the list goes on and on

xbox had an off-the-shelf, low-end spreadsheet cpu, a 733mhz celeron. THAT'S ONE CRAP CPU

xbox: 733mhz, 128k catch, standard cisc cpu with a 133mhz bus………

gamecube gekko: custom micro design, copperwire, gamecentric custom optimization, with a 162mhz bus plus 4to1 compression

why look at clock numbers when the architecture clearly proves you wrong

Swp64

On January 3, 2008 at 3:34 pm

Lol, monotonous. I am not interested in Microsoft as a company, as a hardware manufacturer, or in its console. The same is true for Nintendo, Sony, Nvidia, ATI, etc..
Keep making things up, and using terms you clearly do not understand. If you did, you'd have noticed the Gamecube does not render the same way as the Dreamcast. You keep making the assumption that, because the Gamecube uses a form of "hidden surface removal", it's Dreamcast-like in nature.
They aren't the same thing. They do the same "type" of thing, but not with the same efficiency.
Whoever told you that lied to you.
And as I think has been pointed out, explained, and quoted to you, and as you can see in games, in developer comments, everywhere: all of these magical compression techniques, buses and bandwidths, etc., that you continue to quote out of context do nothing to prove anything of what you claim.
And as I think has also been pointed out, the Gamecube cpu would have to attempt to step in and help, assuming you wanted to run the same scenes an Xbox could. Gamecube had no vertex shaders, and its cpu was nowhere near powerful enough.

(and btw, no one said Wii was a piece of compared to Xbox)

Sh5619

On January 3, 2008 at 4:42 pm

Wiiboy, you clearly thought the Gamecube rendered one pixel at a time and added textures to it, and that the other did the opposite, when in fact it does not.

You clearly made the assumption that "single pass multi-texturing" was Gamecube-exclusive, when in fact it is not.
You clearly assumed that the Gamecube, and the Gamecube alone, used Z-occlusion, early z-checks, and compression, when it clearly did not.
You continue to give framerates and figures, flailing your arms with your caps-locked responses, and you clearly don't even try to understand any of what you write. You clearly seem to think it "blows Xbox out of the water", when it clearly does not. Most developers have also said "it clearly does not". And in many cases, the advantage is the other way around.
No one was saying Xbox "destroys Gamecube" (those were your claims the other way around), just that you are significantly exaggerating your claims. It clearly "does nothing of the sort", and you seem to assume anyone that disagrees with that is a fanboy.

You’re completely blinded by rabid irrational fanboyism. (clearly)
And I wonder why people waste their time with you.

BranH

On January 3, 2008 at 5:52 pm

@xboxmodder = "why look at clock numbers when the architecture clearly proves you wrong"
At what point did I use clock numbers to claim anything? Did I bring up the fact that Xbox's cpu was clocked nearly twice as fast? Did I bring up clock frequencies of ram? Nope.
You're using typical responses designed to counter people using generic figures such as those, in order to defend the Gamecube's honor. I used its architecture to imply that it does nothing to "blow the Xbox out of the water".
If this were a situation where Gamecube and Xbox were relatively equal on paper, only Gamecube was customized and Xbox was "off the shelf", it would be easy to say Gamecube > Xbox. But that is NOT the case here.
I even said specifically that, overall, Gamecube is more efficient, but that that efficiency is what allowed it to keep pace, not overtake Xbox.
The architecture "clearly" proves me "right", in that you're quoting numbers out of context, and there's nothing you can say to prove otherwise.
And it matters not to me which cpu was more powerful. I did say Xbox cpu + two programmable vertex shaders > Gamecube's cpu + generic T&L. That is reality.

(and keep in mind that I'm not disagreeing with Wiiboy for the sake of fanboyism. He simply made statements that I find to be incorrect, from simply checking up on what he claimed, in a very condescending, fanboyish way)

I'm not interested in pumping up one company over another. If I had a choice, the Wii would have been powerful enough to run modern game engines as is, but targeted at NTSC. It could have gotten all the ports from the other systems, and still had its own unique games as well. I would likely have been happy with JUST that for the remainder of this generation, as I don't mind standard TV at the moment.

As to Wiiboy, every gpu and their momma does indeed use HSR. None of them draw textures and effects to all pixels. Xbox had hardware occlusion query, occlusion culling, compression, etc.. Not as if it didn’t.

All of your posts are hyperbole, designed to worship the capabilities of the Gamecube and Wii. I care not for any of these companies. You assume anyone that doesn't feel the same way about the Gamecube and Wii to be a fanboy positioned on the opposite side of the aisle. (If that were the case, the entire gaming industry is on the same side, including Nintendo to some extent)

@the nerd comments. This is no different than a typical baseball or football discussion. You don’t have to be a loser to wonder and investigate how things work. Especially if you were looking into a job that might employ such things.

And as to fill-rate and texture rate, etc.: most developers I have seen comment, including the one I know personally (who only worked on Gamecube directly, but had all the internal Xbox docs and was in a position to comment), have said the same thing. And there are plenty of developer comments that mirror that. The most you will get is Gamecube = Xbox, and that will be in a very specific, narrow circumstance, not an overall statement. And of course you'll find a few things that Gamecube could do better.
But overall, developer consensus is Xbox > or = Gamecube, with most just going with the >. On paper, in practice, in theory, etc.. And no amount of completely "out of context" spec quoting will change that.
There’s nothing wrong with that. Nintendo competed well in all areas. Gamecube was a fine system. (for last generation hardware)

BranH

On January 3, 2008 at 6:40 pm

And to add to the response to Xboxmodder, if I were using straight clock frequencies to determine fill-rate, I would have said 933 million pixels per second, for 1 or 2 texture layers. But I didn’t. I used Wiiboy’s own quotes, and benchmarks I’ve seen posted, to get the 350-750 million number I listed for Xbox. Not theoretical maximums. I just assumed full, theoretical maximums for Gamecube.

From Erp, known developer on B3d.
http://forum.beyond3d.com/archive/index.php/t-2168.html
Keep scrolling through the thread. I would mark it up with Google's cache, but it won't let me post the link here, and I don't think this place allows html.

Plenty of other known developers that imply the same things, (as I’ve already done in this thread) They all come to the same conclusions regarding Xbox and Gamecube.

And just a side note: I would assume the Fifa developer in the original topic of this thread, described Wii’s video hardware as, “not being as powerful as Xbox”, in regards to capability, rather than performance.
Sort of like saying, Photoshop is a “powerful” tool in editing photos. Or 3dmaya is a powerful modeling tool, etc.. He’s just using “powerful” in the context of what it can and cannot do, rather than actual performance with what it’s capable of.
I doubt he means the Xbox could outperform the Wii, at something made specifically within the parameters of the Wii hardware.

So, don’t imply that I have said Xbox = or > Wii. I “clearly” agree that it isn’t. Just based on ram alone, and ignoring anything else.

cubeboy101

On January 3, 2008 at 6:50 pm

GAME OF THE YEAR 2007: GAMESPOT / EDGE MAGAZINE / GAMETRAILERS. ALL OTHER RESPECTED REVIEWERS ETC WILL ALL FOLLOW SUIT

GAME OF THE YEAR 2007 OVERALL, ALL FORMATS

M-A-R-I-O G-A-L-A-X-Y

OUCH, DOESN'T THE TRUTH HURT

THE 1ST XBOX ZOMBIE TO SAY MARIO GALAXY'S ZERO LOADSCREENS AND HIGH-LEVEL PHYSICS AND GRAPHICS ARE POSSIBLE ON AN XBOX 1
IS A BORN LIAR

THE PHYSICS ALONE IN GALAXY ARE AMAZING

NO LOADTIMES AND 60 FAULTLESS FRAMES A SECOND AND CLASS-LEADING PLATFORMER GRAPHICS

on a so-called xbox-level machine

what a long-winded thread when all along post 1 was the truth: EA LIED, it was a marketing/political statement

NOT A STATEMENT OF FACT ON WII'S POWER

PLEASE LEARN THAT EA FEARS NINTENDO'S INDUSTRY-LEADING POWER

ZZZZZZZZZZZZ XBOX FANS ZZZZZZZZZZZZ

FUNNY, MARIO GALAXY LOOKS 100% NEXT GEN, RUNS AT A SOLID 60 FRAMES, AND IS THE BEST VIDEO GAME THIS YEAR AND A GENUINE TOP-5-OF-ALL-TIME GAME

cubebOY101

On January 3, 2008 at 6:52 pm

WHERE IS THE X360 ALL-FORMAT WINNER? OOPS, THERE AIN'T ONE

BranH

On January 3, 2008 at 7:37 pm

I thought Mario Galaxy was an awesome game. Not so much from a technical perspective, but in general. It likely deserves game of the year.
But its graphics and physics aren't a testament that the Wii is as powerful as a PS3 or 360, just that a game that's well made and designed is a good game, regardless of power. (no one is disagreeing with that)
And the physics and graphics are far more interesting than they are amazing, as far as processing power goes.

Swp64

On January 3, 2008 at 9:36 pm

Xboxmodder – technically (if I recall correctly), it's a 733mhz Celeron with the same 8-way associative cache used in the Pentium III. (according to the Linux profile on modded Xboxes)

Of course, you only have 128kb of cache, etc.. But it's not the same as a stock Celeron either. As it is, most hyped comparisons between G3s and Pentiums were between the G3 and the Pentium II anyway. "Up to" twice as powerful at the same clock rate, with particular emphasis on "up to". i.e., in certain cases they cherry-picked, in other cases not so much, and in a few others it went the other way.

But, then again, Gamecube used a custom modified G3, so it may very well have been 2x at the same clock rate in many, if not most, cases. Bringing its cpu up to at least Xbox level, if not superior.
But, as I think has been stated, Gamecube used traditional T&L, Xbox had vertex shaders, which puts more load on Gamecube's cpu. (and bus)

@Xboxmodder – "effective in-game numbers are real, the xbox fans are basing their opinions on on-paper numbers and clockspeeds.. wiiboy/cubeboy whoever he is is basing the argument on in-game "effective performance", the performance that we will all actually see in our games.
sorry he clearly wins this"

Lol, you clearly didn't read many of his posts. It's quite clear he's the one shoving the paper specs and numbers around "out of context" to prove some kind of point. Tech discussions are fun and informative, fanboy-driven arguments are not. Too many people with agendas to be interesting.

BranH

On January 3, 2008 at 10:27 pm

@Swp64-”Tech discussions are fun and informative, fanboy driven arguments are not. Too many people with agendas to be interesting.”

Lol, “said the pot to the kettle(s)”. Mr. “Nintendo’s new found market leadership + lack of processing power = crippled programming innovations from Japan = Seppuku. (Hara-kiri)”

But yeah, much of what can be done with Xbox’s vertex shaders, would have to be job shared between the geometry engine and the CPU on Gamecube to achieve the same thing.
And it wasn’t anywhere near powerful enough to do the same things most of the time.

(but to be fair, overall, there are some things you could do on Gamecube that you couldn't do on an Xbox, but that list was very short and specific. The list gets pretty long the other way around)

wiiboy101

On January 11, 2008 at 12:40 pm

wii and gamecube physics in the lead :grin: :grin: :grin:

YOUTUBE ALL THE FOLLOWING VIDEOS :razz: :razz: :razz:

PS3 PHYSICS DEMO / PS3 IN-GAME PHYSICS
VS

GAMECUBE MARIO 128 DEMO / ELEBITS WII / MARIO GALAXY

GAMECUBE'S INTERACTIVE MARIO 128 A.I./PHYSICS DEMO IS MORE IMPRESSIVE THAN PS3'S NON-INTERACTIVE DEMO

ELEBITS' IN-GAME PHYSICS: MORE IMPRESSIVE THAN PS3'S NON-INTERACTIVE DEMO

MARIO GALAXY'S IN-GAME PHYSICS: AGAIN MORE IMPRESSIVE

COMPARE PS3 TO WII VIA YOUTUBE, WATCH GC AND Wii HAND-ON-HEART KICK SOME physics ass

BROADWAY IS A RISC, OUT-OF-ORDER, SUPERSCALAR CPU OPTIMIZED FOR GAMING FUNCTION. ADD TO THAT ATI'S GPU-BASED PHYSICS = WII KICKS OUT HIGH-END PC PHYSICS

WATCH AND LEARN, CELL IS AN IN-ORDER, OVERHYPED DOO-DOO PILE :lol: :lol: :lol:

wiiboy101

On January 11, 2008 at 2:11 pm

did you realize wii has dual geometry and lighting (T&L) ENGINES…..

hollywood's on-board T&L fully supports compression and real-time decompression, 4to1 data compression

broadway cpu: on-board support for fully programmable geometry and lighting (T&L), also fully supporting real-time compression/decompression at 4to1

gpu t&l plus custom effects plus the TEV unit, a programmable 16-stage shader/blender

cpu: fully programmable (T&L) CUSTOM GEOMETRY AND LIGHTING SUPPORT

that's some serious t&l hardware support

BranH

On January 11, 2008 at 7:46 pm

Yeah, I’ve seen their whitepapers explaining the cheap (inexpensive, not crap) lighting in Gamecube, and heard developers comment on it. It’s probably the same for Wii, and they did some cool things with it. Resident Evil 4 for example.

Gamecube could blend lighting into a pixel, in a texture stage. So, you could read a texture, blend it with the next texture, then blend in a lighting calculation, then blend a texture with that.
Since it writes the result to its on chip cache, and reads it back for every stage, that’s cheap, since you don’t have to re-render the polygon.
Xbox calculates its lighting, reads all of its textures at once, and runs combiner ops on those, which is pretty fast, and you're not re-calculating the polygon to do it.
But if you wanted to blend texture layers, then blend a lighting calculation into that, and blend the resulting texel with another texture, you'd have to resort to multi-pass rendering. i.e., you'd have to write the resulting pixel back to ram, and read it back after re-rendering the polygon. So in that sense, Xbox could probably not run something like RE4 in the same way Gamecube did, with any kind of speed.
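A tiny illustrative sketch of the ordering difference I mean, in plain Python; the blend() stand-in is made up and has nothing to do with either SDK:

    # Tuples stand in for RGB values; blend() stands in for one TEV/combiner stage.
    def blend(a, b, t=0.5):
        return tuple((1 - t) * x + t * y for x, y in zip(a, b))

    def gamecube_style(tex_a, tex_b, tex_c, light):
        # Each stage feeds the next via on-chip storage, so the lighting term can
        # be folded in mid-chain and then textured over, all in a single pass.
        out = blend(tex_a, tex_b)    # texture blend
        out = blend(out, light)      # lighting folded in mid-chain
        out = blend(out, tex_c)      # another texture over the lit result
        return out

    def xbox_single_pass(tex_a, tex_b, light):
        # Textures are fetched together and combined once; texturing *after* the
        # lit result would mean writing out to ram and doing a second pass.
        return blend(blend(tex_a, tex_b), light)

    print(gamecube_style((1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)))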

But at the same time, there's a list of situations where Xbox could do things Gamecube couldn't. The level of normal mapping in Chronicles of Riddick, for example, wouldn't be doable on Gamecube. Or the per-vertex lighting in Splinter Cell, or the texture mapping in Doom 3, etc..

And I still don't get what you are referring to with 4to1 compression. Lots of things are considered compression: Z-compression, DXT, etc.. Rendering a curve using a B-spline on the gpu would be considered considerable compression, since you aren't reading all the polygons that make up a curve, you're just tessellating them based on an equation on the gpu itself, rather than storing them in ram and moving them over a bus, or tessellating on the cpu and moving them over the fsb, etc..
Xbox could compress volumetric textures, Gamecube couldn't, etc.. But they both compress and decompress everything they move from one place to another on the fly, in hardware.
It’s all relative as it is.

Not degrading your argument btw, but I’ve never heard of gpu based physics on Wii, outside of a bunk developer interview, of a guy who didn’t fully understand what he was talking about. Ati could do physics in unified shader gpus, because the shaders are more general purpose, and can run math related to physics, because they’re more versatile, and can compute a wider variety of computations. From all I’ve seen and read, that’s not the case on Wii, that’s 360 and up.

And really, physics is one of those things cell is actually really fast at. And they had some interesting demos. "Cups", the fish demo, the one shown involving Heavenly Sword, the water demo Highmoon Studios had and made a minigame out of, etc.. And it can do those physics in addition to other things.
Calculating bones, mesh deformations, blending one animation with the next, etc.. Those are things Cell could outrun Wii at considerably, especially on things like http://www.youtube.com/watch?v=hdImUIhbG9E&feature=related.

Not to say Wii is weak at physics, I'm sure it's capable enough, it's just not gonna outrun something like Cell.

Swp64

On January 11, 2008 at 8:08 pm

Yeah, I recall the “textured lighting” they did for RE4. That was pretty cool.

Overall, I think if Nintendo had updated to more conventional hardware and targeted NTSC, they would have been destroying PS3 and 360 right now. (more so than they are)
Plus, it would have had the potential to run things like Bioshock, straight ported. And pretty much any other multi-platform game would have the potential to be ported with little effort. Then, toss in Nintendo first party stuff, and it’d have been game over.
HD won’t really matter much until next gen. Especially in Japan. Hopefully their next hardware is Gamecubesque level. (in terms of relative power to their competition)
That would be awesome.

*I’ve seen many of those physics demos. It performs the way I’d expect a decent cpu to do, given there’s no OS running in the background, etc.. Not bad, not awesome.

This demo was impressive though.
http://www.youtube.com/watch?v=Jd3-eiid-Uw

Not a fan

On January 12, 2008 at 8:03 am

Pfff, too much to read since the last time I posted! Technical discussions are fun indeed. Thanks for all the answers and insight. Still have to read it all though.

Some comments from my side. In-game performance doesn't say anything unless the very same game is run on multiple platforms with equally well-optimized code behind it. Rogue Squadron does 20M polies? Ok, but how many pixels does it render? Horsepower is torque times speed, so (bare) GPU horsepower should be polies times pixels. Only knowing the number of polies doesn't say anything.

Still some questions though: We discussed fillrates, but how about texture reads? Multi-layer textures do require multiple texture reads per pixel, don't they? So how can a Wii achieve 2Gpix while it has to read texels from memory and its bus is only 240Mhz, and how does the 360 do that? Or does 2Gpix only apply if it has those textures in texture cache (or "catch" as some try to refer to it:)?

@swp's last post: Yeah, I think having more memory (256M external) would have been enough already. I would buy a game such as Assassin's Creed even if it doesn't have real bump mapping or realistic lighting (though Zelda has very good lighting IMO (looks like specular), much better than RE4).

@BranH: I think you are right about that fifa developer. Either he developed the J2ME version of fifa and doesn't know anything about hardware or performant code (like a lot of java devs:), or the writer of this article misunderstood what he exactly meant. It is quite obvious that the Wii has better memory speed, better CPU speed, and better (bare) fillrates, so claiming it is on the same level as XBOX wouldn't be too bright. Though I still wonder: the Wii has newer hardware, so can't it be possible that it supports the same shader technology as XBOX? I mean, ATI must have based the Cube's GPU on technology that existed those days. Perhaps they were able to extend the Wii's GPU with newer technology? Or would that require a completely different architecture?

@havoc, perhaps I was a bit harsh. BTW I already live next to Germany so it wouldn't make any sense to move (other than cheaper BMWs haha). But my point is, if you are not interested in this topic, you are free not to post at all instead of calling people nerds (unless you add a smiley to that of course :mrgreen: ).

Swp64

On January 12, 2008 at 9:25 pm

Well, as it was explained to me:
On 360, you have 48 unified shaders, in 3 mimd blocks of 16. Capable of processing either pixel shader, or vertex shader instructions.
A block of 16 texturing units, decoupled from the shaders, so they can fetch and filter without stalling the shaders. A thread arbiter that keeps hundreds of threads in flight and organizes threads of pixel shader, vertex shader, or texturing tasks.
You have 8 rops, located in a block of 10 megs of edram. Rops can process 8 pixels per clock with both color and z values, or, they can do 2x Z only. (i.e, no color, just “depth” value)

An ideal game engine on xenos, would run a z-only pre-pass first. In other words, it would process nothing but the geometry of a scene. Perhaps pre-cull, run back face culling, clipping polygons outside the view, etc..

Anyway, you write all the geometry of the scene first. All 48 shader alus are capable of processing geometry, so that runs several times faster than it would on a gpu that only had 8 vertex shaders, for example. On older gpus, the pixel shaders sit idle while vertex shaders work, because there are no pixel shaders being run in this pass.

The polygons are made up of pixels with only z values, no color, so rops can process them twice as fast. (16 pixels per clock)
When you're done, you have a frame with the top-most pixels as determined by their z-values (so there's no overdraw), which then get processed in a second pass by pixel shaders. (in this case, the same processors that just ran the geometry)
It adds textures, runs shader programs, etc to those. (you don’t add complex texture and shader effects to any pixels that can’t be seen in a frame)

You have 48 shader alus, and 16 texture units. The thread arbiter, can keep 64 threads in flight, and hands out tasks to the different blocks of processing units. So, if a pixel shader needs textures for something, that haven’t been fetched from ram yet, the thread arbiter can swap out that thread, and give it a math process that needs done anyway, until the textures are there to process.
In other words, hide texture latency with math operations.
That’s why ati gpus are good at folding at home, while Nvidias haven’t been. They don’t stall as much with texture fetching from ram or filtering, because those tasks are decoupled from the shader units, and they can just do something else while they wait.

Of course, that’s simplified, and ignores how you get 4x msaa over the top of full fill rate, ignores tiling because a 1280x720p frame with 4x msaa won’t fit into edram at once, etc..
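The tiling part is just arithmetic, if anyone wants to see why it doesn't fit; 4 bytes of color plus 4 bytes of Z per sample is the usual back-of-envelope assumption:

    # Why a 720p frame with 4x MSAA needs tiling in 10 MB of eDRAM.
    width, height, samples = 1280, 720, 4
    bytes_per_sample = 4 + 4                  # 32-bit color + 32-bit Z
    frame_bytes = width * height * samples * bytes_per_sample
    edram_bytes = 10 * 1024 * 1024
    print(frame_bytes / (1024.0 * 1024.0))    # ~28.1 MB
    print(-(-frame_bytes // edram_bytes))     # 3 tiles needed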

I also left out the filling of the Hierarchical Z-Buffer in that first pass, the fast Z-clear, z-compression, and other hardware functions.
Then there are things like, the fact that you can get HDR at the same cost, bandwidth, and storage as a normal pixel, because it can process a native FP10 HDR format.
Which, allows pixels to have a 10bit value for red, green, blue, and 2bits alpha. = still 32bit pixels.
10.10.10.2, rather than having to double the pixel’s color value with fp16, etc..
And other theoretical things you could technically do with memory export, or with the tessellation unit, etc..
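A toy packer shows the point that it still fits in one 32-bit word; the channel order here is just an assumption for illustration, not the actual hardware layout:

    # Pack three 10-bit color channels and 2 bits of alpha into 32 bits (10.10.10.2).
    def pack_10_10_10_2(r, g, b, a):
        assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
        return (a << 30) | (b << 20) | (g << 10) | r

    def unpack_10_10_10_2(word):
        return (word & 0x3FF, (word >> 10) & 0x3FF, (word >> 20) & 0x3FF, word >> 30)

    w = pack_10_10_10_2(1023, 512, 0, 3)
    print(hex(w), unpack_10_10_10_2(w))       # round-trips to (1023, 512, 0, 3)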

Point is though, that's where they get the theoretical 8 pixels per clock that includes a z-buffer, 2 texture layers, 6 shader ops, and 4x msaa.

in other words, there are 8 rops, that can do 2x z only, 16 texture units, 48 shader alus, and hardware 4x msaa that doesn’t cut into fill-rate.
(of course, the mix of instructions would likely be far different overall, but it’s a generic benchmark)

Block diagram:
http://img253.imageshack.us/img253/9563/xenoszf6.jpg

wiiboy101

On January 13, 2008 at 12:38 pm

IGN GAME OF THE YEAR: MARIO GALAXY

NO LOADSCREENS, NO LOADING WAITS, NO GLITCHES, NO PATCH REQUIRED, 60 FRAMES, GORGEOUS GRAPHICS

XBOX 1 RAN HALO 2 BADLY AND AVERAGED 14 FRAMES A SECOND, PLUS LONG, OUTRAGEOUS LOADTIMES, POP-UP TEXTURES, MANY BOTTLENECKS

YET WII IS AN XBOX

IT AIN'T SINKING IN, IS IT :roll: :roll:

wiiboy101

On January 13, 2008 at 12:48 pm

xbox 360: 10mb off-chip edram buffer, 32gb read, 50 nanoseconds latency

wii: 3mb on-chip embedded edram (1t-sram-r), 28gb bandwidth plus great compression/decompression. 28gb @ 480p vs 32gb @ 720p, wii clearly far more capable per pixel, and in resolution-to-fillrate/bandwidth ratio

32gb buffer @ 720p with poor latency issues vs

28gb catch/buffer @ 480p with amazing latency performance of 5 nanoseconds. wii's catch/buffer is better @ 480p than the x360's is @ 720p FACT

MAIN RAM: LET'S SAY 48MB FOR TEXTURES, 48 x 6 COMPRESSION = 288MB OF TEXTURES. that's 288mb of sd-resolution textures, 3 times smaller than 720p

so 288mb of textures @ 480p = loads of textures and texel fillrate at 480p

wii 1mb texture catch = 16gb bandwidth, 16gb x 6 for compression = 96gb virtual texture bandwidth @ 5 nanoseconds ON CHIP

TEXTURE READS OF 96GB BANDWIDTH VS AROUND 3.2 GB BANDWIDTH XBOX 1

CLEARLY WII IS A 480P 360 NOT A 480P XBOX

DO THE MATH :roll: :roll:

wiiboy101

On January 13, 2008 at 12:51 pm

TEXTURE READ: WII 1MB TIMES 6 = 6MB VIRTUAL TEXTURE CATCH, WITH 24MB OF ON-DIE 1T-SRAM-R INSTANTANEOUS READ VIA SUPER FAST 1T-SRAM, 10 TIMES FASTER THAN THE 360'S RAM FACT

wiiboy101

On January 13, 2008 at 12:57 pm

HD TEXTURES TAKE 4 TIMES MORE RAM SPACE FOR THE SAME TEXTURE AS A 480P TEXTURE

40MB OF TEXTURES ON WII X 4 = 160MB, ADD 6 TIMES COMPRESSION AND YOU GET THE PICTURE, WII'S A 480P BEAST

DO THE MATH

wiiboy101

On January 13, 2008 at 1:00 pm

WII FILLRATE 2X+ XBOX /// WII LATENCY 10X+ XBOX /// WII'S BANDWIDTH 5X XBOX /// WII'S DISC TWICE THE SIZE OF XBOX'S, PLUS GREATER COMPRESSION AND MUCH FASTER LOADING SPEED

WII CLEARLY NO XBOX :roll: :roll:

wiiboy101

On January 13, 2008 at 3:20 pm

xbox sdram, slowwww frontside bus = 133mhz = 1gb bandwidth

wii 1t-sram-r edram, die-embedded 1t-sram-r ram plus clock-synced gddr3 = damn fast
wii front side bus 243mhz plus 4to1 compression = 8gb bandwidth

again wii is noooooooo xbox

Swp64

On January 13, 2008 at 3:51 pm

Wiiboy, you're a delusional, irrational fanboy, and you don't even try to understand even a fraction of what you type about. It's obvious you have no interest in this discussion, or the subject in general, outside of fellating Wii hardware.
I would hate to see the forums you frequent, as I'm sure you think you "know stuff". The ones you aren't banned from, of course. And I'm sure people just blow you off, as I'm about to do, or just don't start tech discussions, just to avoid your caps-locked bs.

But I’ll help you with a couple of your “issues” since I’ve typed this much.

***32gb buffer @ 720p with poor latency issues vs 28gb catch/buffer @ 480p with amazing latency performance of 5 nanoseconds. wii's catch/buffer is better @ 480p than the x360's is @ 720p FACT***

the 32gb/s bus is write only. It's not addressable by the gpu. It just moves data across it to the rops (located on the edram die) in one direction; the gpu makes no requests from it directly. The bus between the rops and the edram is a separate 256 gigabytes per second. You don't store textures there, or game code, etc..
Your latency claims are wrong, in addition to being meaningless.
As is the vast majority of what you write. It’s not even worth discussing.
FACT

wiiboy101gezza

On January 14, 2008 at 6:11 pm

FACTOR 5 working on a wii exclusive, ps3 dropped like a hot potato, reports of crying at nintendo's door, tail between legs, not confirmed but likely

WII ADD FACTOR 5 = XBOX GRAPHICS BLOWN AWAY

GAMECUBE ROGUE SQUADRON 2: 15 MILLION POLYGONS, 60 FPS, 8 TEXTURE LAYERS, REALTIME SHADOWING, SHADING, BLENDING

GAMECUBE ROGUE SQUADRON 3: NEAR 20 MILLION POLYGONS, NEAR SOLID 60 FPS, LIGHT SCATTERING, HDR ETC
BOTH GAMES SUPPORTED 512X512 TEXTURES AND 5 TO 8 TEXTURE LAYERS INCLUDING DOT3 BUMP MAPPING

HALO 2 ON XBOX MAXED OUT AT 4 TEXTURE LAYERS, ZERO 512X512 TEXTURE SUPPORT, BUGS, GLITCHES ETC, LONG LOADTIMES, CRAPPY SUB-30 FRAMES
NO XBOX 1 TITLE SUPPORTED 512X512 TEXTURES, ONLY 256X256 "FACT"
NO XBOX GAME BETTERED 15 MILLION POLYGONS, EVEN AT SUB-30 FPS

XBOX: SUB-30 FPS AND SUB-15 MILLION POLYGONS

GAMECUBE: NEAR 20 MILLION POLYGONS AT A NEAR SOLID 60 FRAMES. HOW IS WII AN XBOX

WATCH FACTOR 5 ON WII, WATCH CAPCOM ON WII, MONSTER HUNTER

YOU BELIEVE XBOX CAN DO WHAT'S COMING? "FANBOYS TALKING CRAP, THE LOT OF YOU"

GAMECUBE OUT-GRAPHICED XBOX MANY TIMES OVER

WIND WAKER, TWILIGHT PRINCESS
PRIME 2/3
RES EVIL 4
ETC
ALL NO LOADING TIMES

WII SHALL KICK OUT 2.5 TIMES THE VISUAL POWER AT 480P OF ANYTHING XBOX OR CUBE EVER DID AND STILL RUN LOADSCREEN-FREE

DON'T BELIEVE MII? PUT YOUR MONEY, NAME AND PHOTO WHERE YOUR NET POSTS ARE. I'M WILLING CUZ I KNOW I'LL WIN

SILLY XBOX FANS TALKING CRAP

wiiboy101gezza

On January 14, 2008 at 6:14 pm

COMING WII GAMES IN 2008 WILL PROVE WII IS A 480P-OPTIMIZED, 360-LIKE MACHINE, NOT AN XBOX 1 OR SUB-XBOX 1

FANBOY ALERT, SOUND THE XBOX FAN ALARM, THEY'RE TALKING AGAIN

lennell

On January 15, 2008 at 1:27 am

:mad: :mad: look people, xbox was faster than gamecube in the cpu, and a little better in the gpu, but gamecube had a TEV chip and it could do things the xbox couldn't do. each system had something the other didn't. but if you made a game for each system and pushed it to the max, the games would look about the same, just each game would look better in different spots.

BraH

On January 15, 2008 at 6:34 am

lennell – yes, I agree partially with that. And I pointed that out in this wall-of-text thread. I did say, and this is agreed upon by nearly all developers, that overall, Xbox > Gamecube, in both flexibility and overall power. It doesn't have to be superior at "everything" to be superior overall.

I am not of the opinion that Wii is no better than an Xbox. It's probably not as flexible at some things, but overall it's more powerful. The only one that's passionate about it is wiiboy. I really wouldn't mind if Nintendo hardware were proven to be awesome, I just don't see that.

lol, Wiiboy calling other people fanboys, hilarity ensues.
Anyway, you keep comparing game X to Halo 2, which you know little about as it is.
Gamecube would choke and die running Halo 2. It couldn't keep up on the bump-mapping alone.
And you throw around goofy polygon numbers for what? You do know that for every vertex attribute you add to a polygon, the fewer you can actually transform, right? Your assumption that the geometry processed in any of the more ambitious Xbox games would somehow run better on a Gamecube is flawed. Keeping up with bones, and weights, and animation, etc. would be too much for it.
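Rough arithmetic, with made-up but typical attribute sizes, just to show the scaling:

    # The more per-vertex data a mesh carries, the fewer vertices a fixed amount of
    # memory/bus traffic can move. Sizes are generic, not GC- or Xbox-exact.
    def vertices_per_second(bandwidth_bytes, bytes_per_vertex):
        return bandwidth_bytes / bytes_per_vertex

    BUS = 1e9                                 # pretend 1 GB/s of vertex traffic
    position_only = 12                        # 3 floats
    skinned = 12 + 12 + 8 + 4 + 16            # position + normal + UV + bone indices + weights

    print(vertices_per_second(BUS, position_only))   # ~83 million/s
    print(vertices_per_second(BUS, skinned))         # ~19 million/s on the same bus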

On a Gamecube, you’d simply not try to run such things as much, nor would you attempt the level of bump-mapping, etc..
You’d do what they did in Prime, and use a bunch of generic effects, with good art direction. So what if you don’t have bump mapping anywhere, on walls, characters, etc., as long as it looks good. (better than most) And there are fewer glitches involved in the “generics”.

And so what if, in Nintendo games, the polygon count on a character drops when they run, or you lack aa, or you're forced to use lower color precision at times and you see ugly banding everywhere; as long as fans don't complain too much, and the frame rate stays ok, it's fine.

And I didn’t say cube couldn’t do dot3 bump mapping, just that it doesn’t have efficient hardware to compute it, and couldn’t use it anywhere near as much as an Xbox could. Gamecube is good at older multi-texturing, but it’s not going to keep pace with an Xbox in that area.

And you’re mentioning things like Windwaker, RE4, and Prime as examples of what?
They're very nice looking games. But they don't do much to prove graphical superiority, considering an Xbox fan could just as easily list Riddick, Doom 3, NG Black, Splinter Cell PT, DOA:U, Wreckless: Yakuza (which was on Gamecube, sans bump-mapping), etc..
As well as having superior looking versions (and sometimes at much higher resolution), of nearly all multi-platform games.

And you continue to misunderstand texturing. Gamecube can do "up to" 8 texture layers. Sure, you could pile up 8 onto a single pixel on some of the surfaces in a game. But you'll notice the comments about limiting a metatile to 3 texture layers for performance and memory reasons were from Factor 5. In addition to other devs mentioning that, in practice, you're pretty much stuck at ~2-3.

And your assumption that Xbox “maxes out” at 4 textures is wrong. It maxes out at 4 texture layers “per rendering pass”.
No rules say you can't do a second or even third render pass to add more textures, blending, etc..
Doom 3 did several render passes on geforce3/4, and had 7+ texture layers, and a list of blending effects. It would simply cost in things like geometry processing. (which is more powerful than Gamecube’s, especially at dx8+ level programming)

In closing, Gamecube/Wii 4Life, yo.

chicothespic

On January 18, 2008 at 3:45 pm

how come mario galaxy has twice the visuals and twice the frame rate of any xbox game i've ever seen, and does it all without bugs, glitches, tearing or crashing, loading-screen and loading-break free, and let's not forget the physics….

excite truck, a game developed on gamecube dev kits and slapped onto wii at launch, has better graphics and framerate than the xbox could ever run for that fast, sweet arcade racer, and also supports perfect motion steering and environment morphing / texture morphing

how come twilight princess on wii has identical graphics to the gamecube version, only upgrading interlaced to progressive scan and 4:3 to 16:9, yet has way better graphics than any xbox game and almost no loading times….

how come, visually, fzero gamecube matches any xbox racer, yet its frame rate, motion blur etc make fzero clearly the fastest racing game ever created. it's insanely fast and glitch free

how come, when ubisoft started optimizing their ps2-to-gamecube ports instead of the lazy ports of earlier games, the gamecube clearly outdid the xbox even on ported ps2 code
jade engine beyond good and evil was pushed harder and better on gamecube than xbox: it had better colour, better effects, better water and physics, better loadtimes and was more debugged. the same applies to a few other cube-optimized games

how come wind waker by far had the greatest cel shading last gen, it clearly beats any cel-shaded xbox game, and again ran almost completely free of load breaks

how come prime 2 on gamecube had zero loadtime screens, ran perfectly at 60 frames a second, had no bugs, glitches or issues, and had amazing graphics

yet halo 2 had long loadtimes, bugs, glitches, plain colours, boring flat textures, pop-up graphics, pop-up textures, and very very very long loadtimes

the xbox fans and the nintendo-fearing 3rd parties have some explaining to do

used cisco

On January 18, 2008 at 4:12 pm

Who cares about power when arguably the best game of the year is out exclusively on Wii?

Also, it looks damn good, better than any xbox game I've ever seen. Sure, Wii is far less powerful than 360 or PS3, but as long as it has the potential for amazing games, who really cares? Not me. I'll play amazing games on any console.

baco

On January 19, 2008 at 10:44 am

c'mon guys ffs! who cares? choose the system you like or buy them all! all i know is micro, ninty, and sony love all this stuff you put out on the net etc, just shut up and play, or make something better yourself, peace!!

Not a fan

On January 19, 2008 at 1:06 pm

@chicothe, branH and swp64 already explained some of your questions. Read everything from the start of this thread.

@branH: You certainly know your stuff. Thanks for the link, I'll look into it. BTW I should have thought of your explanation about texturing myself, it is the same solution as the fillrate one you explained earlier. After the GPU finishes geometry calculations, it only needs to read the visible texels from memory and write 'em to the framebuffer. I feel like I'm up to date now :mrgreen:

@baco, you are right, but it is quite interesting to know how stuff works so we at least know what to expect from systems.

Perhaps time for a conclusion as I look at it at this point:

The Wii is faster than XBOX, but may lack some of the modern pixel mapping techniques such as bump/normal mapping. It also lacks newer technologies seen on the 360 such as parallax mapping and lighting effects. (I'm sure these effects could be simulated on a Wii somehow, since it does support displacement mapping, but at a great performance penalty). Fillrate-wise @480p it may be on par with the 360 @720p though. I would suppose that polygon calculation would be on par with XBOX (I think it is assumable that it can calculate 120M polies a second).

So, what to expect: games with RE4′s texture resolution or better, twilight princess quality of lighting, perhaps some bumpmapping, twice as detailed environments/characters and more/better AI. I hope monster hunter will prove this to be right, especially since it was to be a PS3 title.

BranH

On January 19, 2008 at 1:18 pm

chicothe = Simple, most games optimized for Gamecube have a good chance of looking better on Gamecube than they will on Xbox.
That’s not so hard to figure out me thinks. If it’s “Gamecube optimized”, it’s set in stone, to use the types of multi-texturing effects the Gamecube is good at. Porting anything that takes a dx7 or lower code path wouldn’t give an Xbox an advantage, outside of perhaps aa, or higher color precision, etc.. Many areas of the system would go under-utilized, and would only use the limited functionality that the Gamecube is capable of. It doesn’t magically compute things like normal maps that weren’t designed into the game in the first place.

However, developing something designed for Xbox, and trying to port it visually as is on Gamecube, with all of the bump or normal mapping details, vertex and pixel shaders the XBox could muster, would simply not run. At all. Emulate what the Cube’s gpu can’t do, on the cpu. And watch it be nothing more than a frame by frame slide show, if it can render a frame at all. Everything would have to be cut back to dx7 ish level, and just cut back in general.
That’s just the way it is.

Not sure what you guys are so puzzled about. It doesn’t have to be more powerful than the Gamecube at everything, to be more powerful overall.

And you seem to think that you can counter the Xbox > Gamecube stuff by making worthless points about Metroid Prime. It uses zero bump mapping of any kind. It's not an ugly game, or a primitive game overall, but it runs well because they don't try to do things they aren't capable of doing.
And your frame rate depends entirely on the amount of work you're doing per frame.
You can make an Xbox game that runs at a hundred frames per second, if that's what you were shooting for when you started. It's not as if any of these systems aren't capable of it. They intentionally shoot for ~30 frames per second, and design the game around that.

Nintendo first party especially, does a good job of toning down what they need to, in order to get their ~30 frames.
Even in Mario Sunshine, it took using grainy, redundant, low-resolution textures over most everything in order to have higher-res, higher-quality textures on a few key things, and not have to load as you progress through the (small) levels.
They didn’t mind if some things looked pretty crappy, to get other, more important things looking the way they wanted, and keep a decent ~30fps rate.
That’s a good philosophy to have if you ask me.

BranH

On January 19, 2008 at 3:57 pm

@not a fan- I was playing devil’s advocate to Wiiboy, I don’t think of the Gamecube as weak compared to an Xbox. It was competitive from all I’ve seen. Wii should be able to top it.

Wii displacement mapping is just someone misunderstanding a Nintendo patent for emboss bump mapping. Not sure if I can post the url here properly, but here it is:
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.html&r=3&f=G&l=50&d=PG01&S1=%22emboss-style+bump+mapping%22&OS=%22emboss-style+bump+mapping%22

It's just a hardware function to compute the displacement of an embossing map, based on a supposed light source, without involving the cpu. More or less fixed-function pixel shaders.
Like this:
http://www.delphi3d.net/articles/bumpmap_02.jpg
If you copy that into photoshop or studio, then copy the top center square, paste it, make it 50% transparent, and move it over the top of the bottom center one.
Then move it around according to an imaginary light direction.
The only mention of geometry in that patent is in regards to its typical t&l unit, since that effect would be tied in with the movement of a polygon.
They could have added some functions to compute something more complicated, but i don’t really see much evidence of that yet. (in practice, or developer comments)
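For anyone curious, the whole recipe fits in a few lines; this is just the photoshop-style emboss described above, not Nintendo's actual implementation:

    # Emboss-style bump sketch: subtract a copy of the height map shifted by the
    # light direction, and the difference brightens/darkens the surface.
    H = [
        [0, 0, 1, 2],
        [0, 1, 2, 3],
        [1, 2, 3, 2],
        [2, 3, 2, 1],
    ]

    def emboss(height, light_dx, light_dy):
        rows, cols = len(height), len(height[0])
        out = []
        for y in range(rows):
            row = []
            for x in range(cols):
                sx, sy = (x - light_dx) % cols, (y - light_dy) % rows
                row.append(height[y][x] - height[sy][sx])   # shifted copy subtracted
            out.append(row)
        return out

    for row in emboss(H, 1, 0):
        print(row)   # positive = facing the light, negative = in "shadow"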

And I wouldn’t expect 120 million polygons out of Wii (or Xbox). AFAIK, it would be half that at most in just vertices (about 50 million vertices). A vertex being the corner of a triangle, (shared with its neighbor where possible), so depending on what type of game, and what the meshes look like, you might get over half that.. >30-35 million. But that’s nothing but transforms. Then, you can start cutting that, depending on a list of other effects you might do.
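Dividing those per-second figures down to per-frame budgets is just arithmetic:

    # Per-frame vertex budgets implied by the per-second figures above.
    for verts_per_sec in (50e6, 35e6):
        for fps in (30, 60):
            print(int(verts_per_sec / 1e6), "M verts/s at", fps, "fps =",
                  int(verts_per_sec / fps), "verts per frame")
    # 50M/s at 60 fps is ~833,000 per frame, before extra attributes or effects cut into it.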

BranH

On January 19, 2008 at 4:05 pm

havoc of smeg

On January 19, 2008 at 4:53 pm

@not a fan

it was partly intended as a kind of homer simpson quip, but put it this way: would anyone who wasn't some kind of nerd talk about polygon counts ETC.?
no offence, by the way.

also, i was interested from a "would i want to buy something that says it's new but actually has antique technology in it" point of view, if you get what i mean.
and considering it's supposed to be part of the "current" generation of consoles.

Swp64

On January 20, 2008 at 8:14 am

@BranH = Yeah. Even the ~35 million per second figure, would be really generic polygon work.
Wii = a few hundred thousand polygons in any given frame, tops.

And yeah, that was what I was referring to when I said people were grasping at straws when the Wii launched. Nurbs, displacement mapping, etc..

@not a fan - Although they can do things like EMBM (environment-mapped bump mapping), I see no evidence of any extensions to the gpu itself for any of the more modern types. They could have (this is ati after all), but I don't see it, and most developers have implied (and complained) that it doesn't. Supposedly, it's not much more than an over-clocked Gamecube with more ram.

It’d be hard to believe that that’s ALL it is, and perhaps developers hate on it so much, because they can’t just drop their game off onto its hardware, and run it as is, the way they could for a low end pc for example.

We’ll see I guess.

WIIBOY2008HONESTYPOLICIE

On January 23, 2008 at 10:49 am

broadway cpu: well over 20 million transistors. xbox 1's celeron: 9 million. as you can see there's more than double the logic count on broadway than there is on the xbox. combine that with risc / copperwire / embedded ibm mpu design / silicon on insulator = a celeron destroyer, pure and simple……

gekko was more powerful than the xbox celeron FACT FACT FACT. it had copperwire, risc and customizations, and was a shrunk, tightly compact MPU design, plus a 162mhz front side bus with 4to1 compression = virtual bus of 648mhz (xbox 133mhz hmmmm)
risc beats cisc
copperwire beats aluminium
silicon on insulator beats non-silicon-on-insulator
mpu embedded design beats off-the-damn-shelf
256k level 2 catch plus 4to1 realtime compression/decompression beats 128k level 2 catch with noooo custom compression
broadway cpu = 2x gekko = 2.5x celeron FACT

XBOX BUS 133MHZ = 1GB BANDWIDTH
WII BUS 243MHZ PLUS REALTIME 4TO1 COMPRESSION/DE-COMPRESSION = 8GB BANDWIDTH

XBOX FANS, DO THE RESEARCH, DO THE MATH

ADD TO THAT: DIE-EMBEDDED 1T-SRAM-R IS AS FAST AS LEVEL 2 CATCH, 5 NANOSECONDS, A 24MB VIRTUAL LEVEL 3 CATCH DESTROYS XBOX RAM

WIIBOY2008HONESTYPOLICIE

On January 23, 2008 at 10:54 am

WII SRAM = 64K LEVEL 1 AND 256K LEVEL 2 ALL SUPPORTING REALTIME 4TO1 COMPRESSION CPU AND 3.2MB GPU CATCH PLUS 24MB VIRTUAL LEVEL 3 CATCH (DIE EMBEDDED 1TSRAM-R)

XBOX SRAM= 32K LEVEL 1 128K LEVEL 2 NO CUSTOM COMPRESSION AT ALL IN HARDWARE CPU AND A 256K GPU CATCH NO COMPRESSION SUPPORT IN HARDWARE

AN INSANE DIFFERENCE

DO UR RESEARCH, WAKE UP. WII IS A HIGHLY CUSTOM, ON-THE-FLY, HIGH-SPEED COMPRESSION/DE-COMPRESSION CUSTOM CONSOLE. XBOX IS AN OFF-THE-SHELF BASIC PC

WIIBOY2008HONESTYPOLICIE

On January 23, 2008 at 10:56 am

ALMOST 28MB OF SRAM PLUS REAL-TIME CUSTOM COMPRESSION OF DATA AND TEXTURES

VS WELL UNDER 1MB FOR XBOX NO SUCH COMPRESSION TRICKS

DO THE MATH :razz: :razz:

WIIBOY2008HONESTYPOLICIE

On January 23, 2008 at 11:01 am

WII TEXTURE READ 1MB TIMES 6 = 6MB TEXTURE CATCH

WII TEXTURE READ 16GB TIMES 6 FOR COMPRESSION =96 GB BANDWIDTH

XBOX TEXTURE READ: ERMMMM, ABOUT 3GB IF YOU'RE LUCKY, AND A GPU CATCH OF ERMMMM 256K, NO REALTIME COMPRESSION-INTO-CATCH SUPPORT WHATSOEVER

HOLLYWOOD READS 6MB OF TEXTURES IN CATCH IN REALTIME, IT'S LIKE MANY MANY TIMES THE PATHETIC DRAM AND TINY CATCH OF XBOX

XBOX RAM: 70 NANOSECONDS LATENCY

WII RAM: 5 NANOSECONDS READ LATENCY. THE MAIN MEMORY FEEDS THE CATCH IN REALTIME, GIVING UNLIMITED ROOM FOR TEXTURE FEEDING

FROM DISC THROUGH SYSTEM TO THE SCREEN, WII NUKES XBOX AT TEXTURE FEEDS AND FILL RATES AND EFFECTS "FACT"

WIIBOY2008HONESTYPOLICIE

On January 23, 2008 at 11:08 am

EVERY 1MB OF RAM IN WII CAN HOLD 4MB OF DATA ////// EVERY 1MB OF TEXTURE RAM CAN HOLD 6MB OF TEXTURES, ALL DECOMPRESSED AT THE POINT OF PROCESSING WITHIN EITHER THE CPU OR GPU, ON THE FLY, IN REAL TIME. NO OTHER CONSOLE CAN DO THAT

243MHZ BUS BECOMES 4X THAT=972 MHZ VIRTUAL BUS

256K LEVEL 2 CATCH CPU BECOMES 1MB VIRTUAL CATCH

1MB TEXTURE CATCH BECOMES 6MB

GUESS WORK MAIN RAM

24MB DATA BECOMES 96MB DATA

SAY 40MB TEXTURES BECOMES 240 MB

SOUND 4 MB TIMES 4 COMPRESSION BECOMES 16MB

ALL ON-THE-FLY, HIGH-SPEED LOADING, REAL-TIME DECOMPRESSION. NO PC OR COMPETING CONSOLE CAN DO THIS "FACT"!!!

WII = 480P-OPTIMIZED, X360-LIKE PERFORMANCE, NOT XBOX 1 AT ALL

BranH

On January 23, 2008 at 4:57 pm

Lol. Wiiboy, drowning in his own "facts". You're like a bot with an inferiority complex on Nintendo's behalf.

**EVERY 1MB OF RAM IN WII CAN HOLD 4MB OF DATA ////// EVERY 1MB OF TEXTURE RAM CAN HOLD 6MB OF TEXTURES, ALL DECOMPRESSED AT THE POINT OF PROCESSING WITHIN EITHER THE CPU OR GPU, ON THE FLY, IN REAL TIME. NO OTHER CONSOLE CAN DO THAT**

No it doesn't. Chips have long used color and z compression between the rops and the frame buffer. All of them do. Every last one of them. You're confused simply because the Gamecube and Wii read and write to their on-chip ram.
Everything gets decompressed "at the point of processing"; all of them do it.

You continue to confuse yourself by comparing bandwidth and compression of caches and buses internal to the chip, that are required for Gamecube and Wii for all the reading and writing that they have no choice but to do between their ram and the frame buffer.
An Xbox reads its textures (in happy 6:1 compression), decompresses them "at the point of processing", can combine two layers at a time, loop those pixels back (internal to the chip) and add two more, then write the result back to ram. (where its frame buffer was stored) (Its 30 gb per second cache serves its purpose)

Gamecube has to read from and write to the frame buffer every single time it wants to add a texture, and cuts into its fill-rate for each and every one. It needed its on-chip ram, it's part of the design.

So, doing bandwidth comparisons in the manner you are, tells you nothing.

And it’s just funny how you continue to think you can magically apply your “4to1 compression” to everything.
And transistors don't tell you much either, considering the embedded ram adds a large number of them. And you can get benchmarks of the G3 chips that came after Gekko. It's not an unknown quantity.

And it’s really funny that you think compression (or anything for that matter) is faster on a Wii, than it is on anything current, pc or otherwise.
A 360 has all the compression Wii has (heck, an Xbox did), in addition to newer, more advanced formats, for things like normal maps.
Wii can do nothing more than standard s3tc.
They're not in the same league or ballpark; heck, they're barely in the same sport.

cubebOY101

On January 24, 2008 at 11:26 am

XBOX CANNOT DECOMPRESS IN REAL TIME, XBOX CANNOT DECOMPRESS IN CATCHES FACT FACT FACT. XBOX DOES NOT SUPPORT CUSTOM DATA COMPRESSION FACT FACT FACT

DATA COMPRESSION 4TO1: GC AND WII 243 FSB = 8GB BANDWIDTH

XBOX, NO SUCH TRICK: 1GB BANDWIDTH FACT

XBOX SUPPORTED COMPRESSION, "NOT REALTIME DE-COMPRESSION FACT"

WII: ON-THE-FLY DECOMPRESSION OF DISPLAY LISTS / SCRIPT / GEOMETRY ETC ETC ETC, ON THE FLY FROM DISC THROUGH SYSTEM TO PROCESSOR, THEN DECOMPRESSED AT THE POINT OF PROCESSING

XBOX HAD NO SUCH DATA COMPRESSION AND ONLY COMPRESSED TEXTURES IN RAM

I REPEAT, AS IT'S FACT: ONLY WII SUPPORTS SUCH CUSTOM TRICKS

BranH

On January 24, 2008 at 5:16 pm

Wow, so you can compress, just not de-compress. That makes so much sense dude.

Stop making things up, WiiCube. I'm not guessing, nor am I speculating.

Just look at an Xbox 1.
Any time you read or write anything to or from ram, it gets compressed and decompressed in real time. You never read or write anything uncompressed. It reads textures (in compressed form) from ram, over its memory bus. (They're still compressed).

They get decompressed after they've been read, then processed into pixels, then the color and z values of those pixels are re-compressed, and they travel (compressed) over the main memory bus and are written into ram, where they are stored, compressed.

That is the entire point of texture compression when reading textures,
and color and z-compression for the framebuffer, WiiCube.

That was the PS2′s problem, in that the GPU couldn’t deal directly with compressed textures, it could only save storage space in ram, because textures would have to be decompressed (by the cpu), before they were passed over the bus to the gpu.

DXT 1-5 saves in texture storage and texture bandwidth, Lossless Z-compression saves in frame-buffer storage and bandwidth. (In addition to occlusion culling and query, etc, etc.)

That is the point of texture compression. That is the point of d3d compression period. That is the point of decoding and decompressing audio streams.
It saves bandwidth, and storage space.
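For anyone wondering where the "factor of four to six" in that S3 link comes from, the block math is simple:

    # DXT-style texture compression: a 4x4 block of texels shrinks to a fixed 8 or 16 bytes.
    block_texels = 4 * 4
    uncompressed_rgb  = block_texels * 3      # 48 bytes at 24-bit color
    uncompressed_rgba = block_texels * 4      # 64 bytes at 32-bit color
    dxt1_block = 8                            # two 16-bit colors + sixteen 2-bit indices
    dxt5_block = 16                           # DXT1-style color block + 8 bytes of alpha data

    print(uncompressed_rgb / dxt1_block)      # 6.0 -> the "6:1" end of the range
    print(uncompressed_rgba / dxt1_block)     # 8.0 (vs 32-bit source data)
    print(uncompressed_rgba / dxt5_block)     # 4.0 -> the "4:1" end of the range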

And you did read the erp quote, right?
http://72.14.205.104/search?q=cache:RffI1PGW2mgJ:forum.beyond3d.com/archive/index.php/t-2168.html+%22DXTC+textures+and+that+reduces+texture+bandwidth+significantly%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

Which is exactly why your multiplication skills go to waste, when you attempt to inflate Gamecube / Wii’s effective storage capacity and bandwidth numbers, to make it easily comparable to next gen hardware, especially when you ignore similar and better capacity on other hardware. It must be really traumatic for you to learn otherwise.

http://72.14.205.104/search?q=cache:f-hqNPlNSIIJ:www.microsoft.com/presspass/press/1998/mar98/s3pr.mspx+%22by+a+factor+of+four+to+six%22+%22texture+storage+and+read+bandwidth%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

wiiboy101gezza

On January 24, 2008 at 6:13 pm

oh, there's more: DEDICATED SOUND PROCESSOR BUS
XBOX'S SOUND CHIP WAS SENT DATA OVER AN ALREADY CRIPPLED BUS SHARED BY THE WHOLE SYSTEM HMMMMM

GAMECUBE AND WII HAVE DEDICATED SOUND BUSES, 8-BIT AND CLOCK-BALANCED TO THE CLOCK SPEED OF THE HIGH-PERFORMANCE SOUND PROCESSOR. A DEDICATED BUS TO GO WITH THE DEDICATED CHIP, ONLY GAMECUBE AND WII HAVE THIS

XBOX SHARED SOUND DATA ON ITS SHARED FSB. PS3 AND X360 DON'T EVEN HAVE SOUND CHIPS OR BUSES, THEY HAVE, COUGH COUGH, CPU SOUND AND A SHARED BUS, WHICH IS…

ALL GAMING PCS AND WII HAVE DEDICATED SOUND CHIPS, X360 AND PS3 DON'T EVEN HAVE SOUND CHIPS

WII HAS 2 MAIN MEMORY POOLS: 24MB OF 1T-SRAM-R DIE-EMBEDDED ULTRA-FAST RAM ON A 2X 64-BIT BUS, AND A SECOND POOL OF 64MB GDDR3 ON A DUAL-CHANNEL 32-BIT BUS
PLUS ACCESS TO MEMORY VIA THE FLASH DRIVE ON A DEDICATED FLASH MEMORY BUS (POSSIBLY 16-BIT)

XBOX HAD ONE SINGLE POOL OF CRAPPY SDRAM AT 200MHZ AND THAT'S IT, CLOCKED UP TO 400MHZ IN THE MEMORY CONTROLLERS. HARDLY WII-LIKE, IS IT

OH YES, XBOX REQUIRED MAIN MEMORY FOR FRAME/Z BUFFERS, UP TO 16MB, BEFORE YOU EVEN START USING MAIN MEMORY FOR ANYTHING ELSE

WII DOES ALL FRAME/Z BUFFERING ON CHIP IN ITS ULTRA-FAST, ULTRA-HIGH-BANDWIDTH GPU CATCH

wiiboy101gezza

On January 24, 2008 at 6:19 pm

3MB ULTRA HIGH SPEED ULTRA HIGH BANDWIDTH REAL TIME COMPRESSION DECOMPRESSION GPU CATCH ALL CATCHING AND BUFFERING DONE ON CHIP 3MB

BACKED UP BY 24 MB 1TSRAM-R EMBEDDED RIGHT NEXT TO GPU ON DIE (27MB OF SRAM BEFORE U EVEN TOUCH EXTERNAL MEMORY)

XBOX EXTERNAL RAM: 64MB. SOOOOO WHERE'S THE SRAM CATCH? THERE AIN'T ANY AT ALL, ONLY THE STANDARD TINY GPU CATCH AND THE TINY CELERON CATCH, THAT'S LESS THAN 1MB

NOT ONLY IS WII'S 27MB IN SIZE, ITS GPU AND CPU CATCHES ARE ULTRA HIGH BANDWIDTH AND THE WHOLE SYSTEM SUPPORTS COMPRESSION / REALTIME DECOMPRESSION

WII IS CLEARLY 3 TIMES AN XBOX 1 IN MEMORY TERMS, IT CLEARLY KILLS XBOX

wiiboy101gezza

On January 24, 2008 at 6:24 pm

XBOX 1: 64MB OF SDRAM @ 200MHZ

WII: 24MB OF 1T-SRAM-R FAST RAM ON DIE @ 486MHZ, 2.5 TIMES FASTER IN NATIVE CLOCKSPEED, OVER 10 TIMES FASTER IN NANOSECONDS OF LATENCY

EXTERNAL 64MB GDDR3 @ 486MHZ NATIVE CLOCKSPEED, CLEARLY 2.5 TIMES FASTER THAN XBOX'S SDRAM, AND ALL RAM IN WII IS CLOCK-SYNCED AND BALANCED. GDDR3 IS FASTER AND SWEETER THAN SDRAM, CLEARLY

2 POOLS OF BALANCED, CLOCK-SYNCED, HIGH-PERFORMANCE RAM VS 1 POOL OF CRAPPY SDRAM

I REST MY CASE YOUR HONOR :lol: :lol: :lol:

pigfrogmonkey

On January 24, 2008 at 6:52 pm

so we have established that xbox fanboys, and anti-nintendo statements from within the industry by a jealous EA games, believe that:

1. 64mb of rubbish sdram beats 88mb of damn fine and fast ram

2. an off-the-shelf aluminium celeron beats a risc, copperwire, gamecentric-optimized, embedded, highly efficient powerpc cpu

3. xbox's tiny sram beats a 27mb pool of great sram and a large 256k l2 cpu catch

4. a shared and crippled fsb beats a clock-balanced, higher-speed bus plus a dedicated sound bus

5. crappy sdram beats 1t-sram-r and gddr3

6. a high-speed, priority-optimized disc drive is beaten by a standard dvd drive

7. off-the-shelf aluminium beats a copperwire, silicon-on-insulator, micro embedded design

8. a 133mhz bus beats a 243mhz bus with 4to1 compression (a virtual bus of 4x 243mhz)

9. a standard, inefficient xgpu beats a highly fine-tuned, ultra-efficient, kinda-tile-rendering custom-design gpu

10. a tiny gpu catch that can't read compressed data beats a 3.2mb high-bandwidth, super-fast sram catch…

i think xbox fans are somewhat confused

modderuk

On January 24, 2008 at 6:58 pm

xbox 12 million polygons @ 30 frames a second

gamecube near 20 million polygons @ 60 frames a second

wii 50 million if not more @ 60 frames a second (that's 360 level)

but apparently wii's an xbox. on god's green earth only an idiot fanboy would believe that bull. wii is, as was leaked at e3, a gamecube 2.5, which is roughly an xbox 2.5, so how on my big phat behind is wii an xbox

BranH

On January 25, 2008 at 5:49 am

Lol, Wiiboy, you’re not informing me of anything.

I believe I've mentioned that Gamecube has its on-chip caches for the reading and writing it does between them. It's inefficient with bandwidth, and therefore needs more of it, hence its on-chip frame-buffer and frame-buffer bus.
It writes, reads back, writes, reads back, writes, reads back, etc, etc, for adding additional texture layers. So the bandwidth stays internal to the chip, rather than hitting main memory's bus.

An Xbox can combine its texture layers internal to the chip, using its internal bandwidth, caches, and loop-back. It only writes to main ram again (compressed), once it’s been combined, into pixels.

And you point out the Gamecube’s crappy A-ram. That usually meant that you’re limited to 24 megs of ram for higher function graphics data, like textures. The only other things A-ram was used for, was menu functions, and to help with load times by storing things there like a cartridge would.
But ask any developer, if they’d rather have an additional 16 megs of fully functional, main ram to do with what you want, or a pool of crippled a-ram, with negligible bandwidth, see what they tell you.

You’re confusing Gamecube and Wii btw, Gamecube’s was 81 mhz, 8-bit bus. A whole additional 81 megabytes per second of awesomeness.
Wii replaced A-ram with GDDR3 in Wii, last I checked.
And it’s nice that you constantly ignore 128 bits wide data bus, when doing your “awesome comparisons” Cherry pick much?

And the 360 and PS3 don't need sound chips when their cpus are capable of devoting more processing to sound than the entire Wii cpu. Calling its buses, bandwidth, or processing ability crap while fellating Wii hardware is nothing but pure, hilarious fanboyism.

***OH YES XBOX REQUIRED MAIN MEMORY FOR FRAME/Z BUFFERS UPTO 16 MB BEFORE YOU EVEN START USING MAIN MEMORY FOR ANYTHING ELSE***

A 640 x 480 frame. 32bit color and Z = 2.4 mb.
Which is then, on top of that. We don’t technically have to include AA in that figure, (we could) but considering Gamecube would need to drop color precision, and sometimes cut its fill-rate, it wouldn’t make much sense to even throw that in to bloat the frame buffer size. (we could also consider that aa pixels compress much higher too)
(or, unless you want to do your comparisons assuming Xbox is taking a multi-platform game like SC3, and using its power and resources to render in high definition, and compare that, to the PS2 and Gamecube at standard res. Then, we could inflate it to 16 megs, but even then there’d still be color and Z compression)

Anyway, you’d still have to convert the back buffer to a front color buffer, but so does Gamecube. The video DAC reads the image data from main ram.
And, last I remember, since the on-chip back buffer on Gamecube never becomes the front buffer, the Gamecube had to triple buffer (doubling the amount of ram for front buffer)
Back buffer, “middle buffer”, and finally front buffer.
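
The 2.4 MB figure is easy to reproduce; a minimal sketch, assuming one uncompressed 32-bit colour buffer plus one 32-bit Z buffer at 640x480 (the simple case being described above):

# Uncompressed frame-buffer footprint at 640x480 with 32-bit colour and 32-bit Z.
width, height = 640, 480
colour_bytes = width * height * 4
z_bytes = width * height * 4
print(f"colour: {colour_bytes / 1e6:.2f} MB, Z: {z_bytes / 1e6:.2f} MB, total: {(colour_bytes + z_bytes) / 1e6:.2f} MB")
# colour: 1.23 MB, Z: 1.23 MB, total: 2.46 MB, roughly the 2.4 MB quoted above
# Each extra full-resolution colour buffer (e.g. for the triple buffering described) adds another ~1.23 MB.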

So, what other bull assumptions do you need disproven, WiiCube?

And keep in mind, that I am aware of your attempt to switch back and forth between Gamecube and Wii, knowing full well that most people here agree, and have even explained why they believe Wii > Xbox and Gamecube overall.
That part of your fanboyism isn’t being questioned. It’s all the Gamecube = “awesome” era bull, you’ve resurrected from your fanboy wars with Xbox, to somehow prove the Wii’s awesomeness that was being questioned, and picked apart.

I guess you do that, so you can think you’re being informative, or pretend you’re arguing the obvious, and “fighting the good fight”, “fending off the fanboys”.

The only one acting as a fanboy here is you, and of course, the random Nintendo defense force members who occasionally drop in, of whom, all seem to share a brain with you, and continue to recycle their ignorant polygon count and frame-rate numbers.

I would love to find out that Wii is magically awesome. I would enjoy reading the technical explanations of it, or point out where my assumptions are wrong. I have no problem with that.
Only problem is, that isn’t happening. Sorry, Wii is what it is, and it seems you’ll have to wait till next gen, to see a Nintendo console compete graphically again.
(of which I more than expect them to do, not the best, but good at most things)

BranH

On January 25, 2008 at 5:57 am

@modderuk – Just goes to show, you know absolutely nothing about what you just copied and pasted. Congrats.
If I thought you knew what you were talking about, I’d ask you how you arrived at your figures, but I won’t.

And not even remotely, is Wii capable of anywhere near the geometry of a 360. (nor is PS3 for that matter)

BranH

On January 25, 2008 at 6:03 am

*Minor typo correction.
**”which is then **compressed**, on top of that”

modderuk

On January 25, 2008 at 1:02 pm

Broadway is, in Intel terms, a 1.5GHz mobile Celeron with 1MB of L2 cache and a 400MHz bus. Broadway is twice an Intel chip in clock-for-clock power; then add all the custom upgrades like compression, silicon-on-insulator, gaming instructions, a custom FPU and a superior FSB, and it's clearly on par with a 1.5GHz Intel mobile Celeron with 1MB of cache.

Now imagine a 1.5GHz, 1MB-cache version of the Xbox 1 CPU. I can clearly see Broadway hitting 2.5 times the performance of the Xbox 1 CPU, you clock-brainwashed buffoons, and all at a capped 480p. Do the math.

modderuk

On January 25, 2008 at 1:11 pm

BranH, you're a fanboy talking crap. You're seriously suggesting the Wii isn't PS3 level minus HD? You're a brainwashed buffoon.

In-order processors are at best a third of an out-of-order CPU's performance at the same clock speed,

and at worst a tenth of an out-of-order CPU.

The PS3 and X360 use heavily stripped-out, in-order, basic chips with massively cranked clock speeds to handle the multiple jobs of the operating system, sound, HD, etc.

Clock for clock, in-order CPUs are sheer killed by out-of-order execution units and superscalars.

In-order CPUs are like a train on rails; out-of-order execution units are like cars, they can process in all directions and branches.

Clock for clock, Broadway kills the other two. THAT IS FACT. At 729MHz, Cell would be left for dead by Broadway.

480p takes three times less power to produce the same visuals as 720p.

Again, DO THE MATH.

If Hollywood has 3MB of super-fast on-chip GPU cache at 28GB/s of bandwidth, and the X360 has 10MB of slow eDRAM at 32GB/s of bandwidth, I think it's very clear the Wii has more in terms of power-to-resolution ratio than the 360. Again, do the math.

Cache bandwidth, Wii: 28GB/s, not counting compression, at 480p.

Cache bandwidth, X360 GPU: 32GB/s at 720p and upscaled 1080p.

Clearly the Wii is better at its set 480p resolution. 28GB/s vs 32GB/s at three times less resolution = the Wii is better considering the capped res, you plonker.

modderuk

On January 25, 2008 at 1:15 pm

720p with 10MB of eDRAM at 32GB/s. Does it compute yet, matey?

480p with 3MB of eDRAM/1T-SRAM-R at 28GB/s. They're almost identical in bandwidth, but the Wii sheer kills the 360 at latency and compression.

The Wii is overkill at 480p. In fact the Wii has more usable fill rate at 480p than the 360 has at 720p.

Do the math.

The Wii can't match up? Erm, Mario Galaxy: graphics, physics, load speed, sound. I think it can.

FANBOYYYYY :roll: :roll:
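
The "three times less resolution" point above is just a pixel-count ratio, and the bandwidth-per-pixel comparison follows directly from it; a rough sketch using the 28 GB/s and 32 GB/s figures exactly as quoted in this thread (not verified numbers):

# Pixel-count ratio and claimed on-chip bandwidth per pixel per frame at 60 fps.
fps = 60
specs = {
    "Wii eFB/1T-SRAM (claimed)": (28e9, 640 * 480),     # (claimed bytes per second, target pixels)
    "Xbox 360 eDRAM (claimed)": (32e9, 1280 * 720),
}
print(f"720p / 480p pixel ratio: {(1280 * 720) / (640 * 480):.1f}x")   # 3.0x
for name, (bandwidth, pixels) in specs.items():
    print(f"{name}: ~{bandwidth / (pixels * fps):,.0f} bytes of claimed bandwidth per pixel per frame")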

BranH

On January 25, 2008 at 1:54 pm

I’ve done nothing of the sort here.

And yeah, we covered that. No one said Xbox > Wii overall. Developers commented they figured they can get about 2x the performance of the Gamecube cpu, even with only a 1.5x increase in clock frequency. I’d say that puts it well above Xbox’s cpu.

All I pointed out was that those figures don't translate "directly" to an Xbox, since a lot of math that would normally be done on the cpu is offloaded to vertex shaders. As far as I have heard, the Wii still does things like normal mapping with heavier cpu assist.

But I would say, it is more powerful.
2.5 times is subjective, but just the bandwidth increase would effectively boost it quite a bit I would think, even if that’s all you did.

Then there’s the additional more useful ram and bus to gddr3, increased cpu / gpu clock frequency, likely increased efficiency, elimination of any bottlenecks developers ran into with the Cube, and likely a few other tweaks and additions, etc..
Then there’d be improved and streamlined development tools, and several years developer experience and resources to add to that.

Then toss in a few other newer middleware tools you find Nintendo’s name involved in, (like illuminate labs), etc.. and I could imagine 2.5 times better looking games.
Just not exclusively from a full 2.5 times power increase over an Xbox, and wouldn’t expect it to keep pace in things like geometry processing on 360 or PS3, both in complexity, or straight numbers.

Assuming 480p Xbox360 performance, is expecting way too much from it in general.
But some of that is still opinion, so we’ll see.

BranH

On January 25, 2008 at 2:26 pm

@modderuk additional nonsense = I think it was explained to you earlier that those have nothing to do with each other. That isn't a "cache", it's a frame buffer. Its ROPs have 256 GB/s of bandwidth for all the reading and writing between them and the edram. You're listing your bull cache bandwidth numbers for no apparent reason.

Gamecube’s was 7.6, so Wii’s would be roughly 11 gb per second, for all the reading and writing back and forth.

And I’m sure you can figure out, that internal bandwidth on PS3 and 360 are hundreds and hundreds of gbs per second.
And you know nothing about compression and latency, or how it works, or how it’s hidden, etc, etc…

And cpu performance, depends entirely on how efficient the compiler is on an in-order cpu, in addition to what you’re trying to process.
You don’t need an instruction window at all to stream certain things. PS3 and 360 will out perform Wii in audio several times over, with a tiny fraction of processing power.

A PS3 cpu will run any cpu on the market down in some tasks. Regardless of how good their compiler or Edge tools are, it "destroys" the Wii at everything: physics, A.I., tessellation, etc.
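
For what it's worth, the "roughly 11 GB per second" estimate above looks like the quoted GameCube figure scaled by the GPU clock increase; a small sketch of that scaling, using only numbers already quoted in this thread:

# Scale the quoted GameCube on-chip read/write bandwidth by the Flipper -> Hollywood clock ratio.
gamecube_gb_per_s = 7.6        # as quoted above
flipper_mhz = 162.0
hollywood_mhz = 243.0
print(f"Wii estimate: {gamecube_gb_per_s * (hollywood_mhz / flipper_mhz):.1f} GB/s")   # ~11.4 GB/s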

Swp64

On January 25, 2008 at 2:49 pm

Lol, I got tired of this thread long ago. Wiiboy is autistic, without any math or memory skills. And I would agree, the rest of these folks seem to share a brain with him.
At first, I thought he was being sarcastic, and just retyping his bs numbers for s and giggles. But I really think he is as delusional as he sounds. His reading comprehension is non-existent. He doesn’t “want” to understand what he types. It’s as if his mind, crashed and burned here. His NDF tactics no longer work, and he doesn’t know how to respond. It’s like watching the Rainman, flip out when he doesn’t get his fish sticks, and Wapner.
He really seems to be emotionally invested in "Wii hardware". And his cap locking is interesting. It reminds me of people who know they don't know what they're talking about, or know they're wrong, and raise their voices because they think doing so somehow makes them right.

Like I said, tech discussions are cool, and can be informative. And I wish there were folks around who had first hand experience, to answer questions and such, as I find that more enjoyable than just reading technical docs.

But the tard level here, is just too high, and there’s no chance of learning anything from it, other than that Wiiboy and pals are completely lost, and latching to crap they don’t want to understand, and resorting to just making up outright by now.
It’s quite a pathetic sight really. You guys need to move on to one of the later stages of grief, you’re stuck in the denial stage I think.

I don’t dislike Nintendo as a whole. I think they could have done better with hardware, but they’re still one of the best software companies there is.

lets be honest hay fangirls

On January 26, 2008 at 4:41 pm

Xbox: a 733MHz Celeron, no optimizations, no 3D upgrades, no high-end cutting-edge manufacturing processes or materials, just a 733MHz Celeron and a 133MHz front-side bus.
32K L1 cache, 128K L2 cache (standard Celeron spec-sheet, non-3D, crappy Wintel fact). (Celeron transistor count: 9 million.)

Broadway, the CPU of the Wii:
RISC (vastly superior to CISC), copper wire (vastly superior to aluminium),

silicon-on-insulator (the latest power/performance-enhancing process trick applied to IBM CPUs, vastly superior to the off-the-shelf aluminium Celeron design),

L1 cache 64K (2x the Xbox CPU), L2 cache 256K (2x the Xbox CPU),

FSB (bus) 243MHz (the Xbox bus is only 133MHz, and bus speed and ability are highly important to chip performance),

compression: the Wii's bus, RAM and cache memories take in and deal with data at up to 4:1 compression and decompress on the fly, automatically, in the processors. Examples of superiority over the Xbox follow:

Xbox 133MHz bus; Wii 243MHz bus plus 4:1 compression = a virtual 4 x 243MHz bus, or 8GB/s of bandwidth, compared to the Xbox's 1GB/s (A VASTLY SUPERIOR BUS).

Xbox L2 cache 128K; Wii L2 cache 256K plus 4:1 compression = a virtual 1MB cache (AGAIN VASTLY SUPERIOR TO XBOX).

Celeron: a 9-million-transistor count on a large, outdated chip using aluminium.

Wii's Broadway: well over 20 million transistors on a tiny, copper-wire, embedded, 2006/07-designed chip with a smaller die, silicon-on-insulator, and copper, not aluminium.

Main RAM: Xbox SDRAM = pathetic real-world read times of 50 to 100 nanoseconds.

Wii die-embedded 1T-SRAM-R, a virtual level-3 cache of main memory: 5-nanosecond read times, more than 10 times faster than Xbox DRAM. FACT.

Broadway has an optimized design, materials and process compared to Gekko in the GameCube, and has, as the PowerPC docs show, superior instruction efficiencies, i.e. what once took 6 clock cycles now takes 3, and the number of instructions per clock has also risen.

Any layman can see that, with so many advantages, Broadway, as part of a highly efficient, tightly clock-balanced, ultra-fast-RAM console,

kills an Xbox 1 Celeron at real-time processing.

I repeat, as fact not uninformed opinion: the Wii is 2.5 times the performance of an Xbox 1, and can do it whilst having no load times and full Wiimote and Nunchuk support.

Please get it: you lost to someone being honest and not a fanboy.

GameCube and Xbox are roughly equal, but the GameCube was the sweeter, more polished design and had by far the fastest RAM and disc loading. The Wii is 2.5 times a GameCube, so it's roughly 2.5 times an Xbox, minus the clunky running and slow loading.

PLEASE ACCEPT FACT AS FACT
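
The "virtual" cache figure above is simply the physical size multiplied by an assumed ideal 4:1 compression ratio; a tiny sketch of that arithmetic (whether 4:1 is actually achieved on real game data is the commenter's assumption, not something verified here):

# "Virtual" L2 capacity under the assumed best-case 4:1 compression ratio.
compression_ratio = 4          # the commenter's assumption; real ratios depend entirely on the data
wii_l2_kb, xbox_l2_kb = 256, 128
print(f"Wii: {wii_l2_kb} KB physical -> {wii_l2_kb * compression_ratio} KB virtual (best case), vs {xbox_l2_kb} KB on Xbox")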

lets be honest hay fangirls

On January 26, 2008 at 4:52 pm

Anyone who says a Celeron with 9 million transistors, no game-centric optimizations, no fast RAM and crappy caches is on par with the custom Broadway CPU, in a vastly superior console design with vastly superior RAM speed, is a lying fanboy, or it's an anti-Nintendo remark from a jealous 3rd party looking with awe at Nintendo's profits and position.

The EA comment that started this thread was a political, anti-Nintendo marketing comment. It was a way for EA to market their own products: talk graphics and HD to market the PS3 and X360, market motion and gimmicks for the Wii.

It was not based on factual evidence of how powerful the Wii is, and the ignorance shown in saying the Wii doesn't shade is just laughable.
The Wii supports custom on-GPU effects, standard GPU effects, a custom shader called the TEV unit, and custom co-processing of lights and shading via the CPU.

Wii = 2.5 times an Xbox 1 at 480p rendering. Please deal with the facts and stop fanboying. :twisted: :twisted: :twisted: :twisted:

wiiboy101

On January 27, 2008 at 1:26 pm

The Wii has by far the fastest and sweetest RAM of any console ever made. Small, fast RAM pools with high-speed data streaming and real-time decompression kill large pools of slow RAM for games performance. What Nintendo are in fact up to is moving past the idea of one big main memory, taking ON-THE-FLY DATA STREAMING AND DECOMPRESSION to a higher level within a closed system, aka a console.

The Game Boy Advance had no main memory, only on-chip cache, work RAM, and cache-like main RAM, giving the performance of a large main RAM when in fact it was just a cartridge feeding fast on-chip memory, the way a gaming device should be designed.

The Wii has small yet extremely fast memory pools and on-the-fly, real-time compression and decompression. It in fact allows high-end, PC-like gaming power in a tiny closed system, aka a true games console design. By capping resolution at 480p, the Wii can render massive amounts of data on the fly,
AND SUPPORT AN IN-GAME, REAL-TIME FILL RATE THAT'S OVERKILL AT 480P.

Pixel recycling and culling; high texel fill-rate effectiveness; virtual texturing; huge texel bandwidth; high polygon counts and polygon culling/chopping to further increase the effective polygon fill rate.

2x the GameCube's fill rate and 3.5 to 4 times the memory = 2.5 times a GameCube's average performance.

NOT AN XBOX AT ALL. THE FANS ARE DREAMING :roll: :roll:
Please learn the difference between PCs and PC copies (cough, X360/PS3) and a capped, tightly integrated, fast, true console design. The Wii is like an on-the-fly embedded speed freak, A GIANT GAME BOY ADVANCE, kind of :lol:
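
The "2x fill rate and 3.5 to 4x memory = 2.5x overall" estimate above blends two per-subsystem factors into one number; one crude way to sanity-check that kind of blend is a geometric mean, sketched below (the choice of a geometric mean, like the factors themselves, is an assumption, not a measurement):

import math

# Crude blend of the claimed per-subsystem speedups via a geometric mean.
fillrate_factor = 2.0          # claimed fill-rate increase over GameCube
memory_factor = 3.75           # midpoint of the claimed "3.5 to 4 times" memory increase
print(f"geometric mean of the claimed factors: {math.sqrt(fillrate_factor * memory_factor):.2f}x")   # ~2.74x, near the 2.5x quoted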

3DS Macks

On January 27, 2008 at 2:38 pm

Wow, so it’s actually an awesome, “high end pc-like” machine.

Too bad that, from a technical perspective, the games look like crap.

I've worked in computer modeling, and just recently started back to school to bring myself up to speed on the programming side. I own a Wii, and haven't looked into any of its specifications. I enjoy the games a lot, but…

Either their graphics hardware is underpowered, or all graphics programmers that have worked on it, are a bunch of incompetent, no talent hacks.

Even Nintendo first party are but a step above mediocrity if you put their games into the same category as anything else, aka everything.
Even Retro forgot how to make basic bump maps.

And here, I thought they deserved a medal for making nice looking games under such limitations, but I guess they should all be fired, since they’re not supplying their art department with effective tools to create art assets.

It could be just the hardware that’s limited…

But neah, I choose to believe:
IT’S A CONSPIRACY I TELLS YA!!!!!
EVERYONE IS BEING PAID OFF TO MAKE GAMES THAT JUST LOOK LIKE SPIFFED UP GAMECUBE GAMES, BECAUSE EVERYONE HATES NINTENDO FOR BEING THE GREATEST HARDWARE DESIGNERS ON THE PLANET!!! EVERYONE’S AGAINST THEM!!!
FACT!!

THEY SHOULD LISTEN TO ME, AND MY AWESOME HARDWARE ANALYSIS. I CAN POST WITH NUMBERS AND WRITE OVER AND OVER!!!
EVERYONE”S JUST JEALOUS OF NINTENDO!!

PS!!!
THE SUB POWERPC 750CL, WAS SECRETLY AWESOME, AND ALL THOSE IDIOTS WHO UPGRADED TO A G4 and G5, OR MULTICORE FOR THEIR WORK AND PERSONAL COMPUTERS WERE IDIOTS!!!! EVEN NASA IS RE-EVALUATING THEIR CPU OF CHOICE. I HEAR MIT IS IN NEGOTIATIONS WITH NINTENDO, TO REPLACE THEIR CURRENT SUPER COMPUTER WITH A WII!!!
BUT NINTENDO ISN’T GOING TO FALL FOR IT, AND RELEASE ANY OF THEIR SUPER SECRET DOCS TO THEM!!! THEY KNOW IT’S A PLOT FOR GATES TO GET HIS HANDS ON THEIR SECRET CHIP DESIGNS!!

HAHAHA, FOILED AGAIN, MICRO$OFT!!!

3DS Macks

On January 29, 2008 at 2:31 pm

PPS, Wiiboy, face it, those “copy pcs” destroy the Wii in graphics!!
They just do, even if you assume 480p. They do, and are doing, and will continue to do.
All your bull hardware “facts”, won’t change reality.

You need to learn to accept that, and move on. Enjoy the games, get help for your inferiority complex. The internet is laughing at you.

Just like your bull Han Solo days, eh Wii/Cubeboy? (The crap you fed your fellow Wiitards made Criterion's podcast, congrats.)

You truly are a Burntoutgamer, who’s lost touch with reality.
It’s not every day, that bs like yours makes it into a developer podcast, but I guess everyone loves a good laugh every once in a while.

lennell

On January 29, 2008 at 2:48 pm

…and Nintendo have secrets about the Wii. One of them is that some of the people who know a lot about the Wii (not just Nintendo) know that the Wii graphics chip can do full HD.

wiiboy101gezza

On January 29, 2008 at 6:23 pm

Xbox 1: massive slowdown doing one pallet at a time, floaty physics, Half-Life 2 Xbox edition. The Xbox slows down dragging about a single barrel or pallet, from 30 frames down to about 12 frames a second. Hmmm.

Elebits on Wii: true real-time physics, over 400 objects, all directly interactive via the Wiimote, all 400-plus objects supporting real-time physics and gravity on screen, in game, at 60 frames a second.

YouTube's Wii physics videos clearly show a guy playing Elebits with 400-plus objects on screen, all with real-time physics and gravity, real-time zero gravity at the press of a button, all 400-plus objects floating mid-air, mimicking gravity, zero gravity and collision physics in real time.

Xbox 1: a single barrel or pallet in Half-Life 2. Yawn, zzzzz. Xbox fanboys live in a fantasy world. Broadway, like Gekko before it, is a physics beast compared to the Xbox's Celeron.

Xbox: 1 object, with slowdown.

Wii: 400-plus objects, all directly interactive via the Wiimote, all supporting real-time physics at 60 frames a second.

Go watch videos of both. The evidence talks for itself.

GameCube Mario 128 tech demo: real time and controlled via a GameCube pad as if it were a real game (year 2000), 8 years ago.
128 Marios, all with their own A.I., their own physics, and their own on/off real-time transparencies, at 60 frames a second.

So 400 objects, or 128 Marios, vs 1 barrel or 1 pallet in Xbox 1 Half-Life 2.

Hmmm, I think Xbox fans need a lie down :roll: :roll: :lol:

wiiboy101gezza

On January 29, 2008 at 6:37 pm

"Full HD" doesn't exist, that's a marketing term, you brainwashed buffoons. If 1080p is "full HD", as you insanely call it, falling for TV manufacturers' lies, can you explain how my PC monitor supports twice that resolution? (There's no such thing as full HD, it's a marketing term to promote 1080p.)
If 1080p is, as you misinformed idiots call it, "full" HD, how come the latest TVs support 1600-line HD, and how come PC monitors go much higher than that "full HD"? Stop using marketing terminology invented by Sony's PR and advertising divisions, you idiots.

My TV supports 1080p plus XD Engine 2 plus 100/120Hz screen refresh. I think 1080p plus 100Hz plus XD Engine 2 is far better than 1080p on its lonesome.

Wake up, you brainwashed Sony fanboys, to TV marketing gobbledygook advertising bull.

There's no such thing as full HD, there's no such thing as HD graphics, and higher-res screens aren't next gen. PCs operated at so-called HD resolutions 10/20 years ago.
DERRRR DUHHH DUHHH DERRRR, that's the inside of your brain trying to out-think Mii.

Resolution isn't graphics. Which looks better, Chronicles of Riddick in HD on the X360, or Chronicles of Riddick the movie at 480p? I think the movie. As I just stated, resolution isn't graphics. The Wii can near match the PS3 and 360 minus the HD resolution. That's a fact.

And it can out-loadspeed, out-gameplay and out-control both the PS3 and X360, easy as taking a Wii…

The Wii's fill-rate-to-resolution ratio is on par with, if not better than, the 360's at 720p, and Broadway wipes the floor with a Celeron, and 1T-SRAM out-speeds DRAM many times over.
Again, fact not opinion: the Wii is 2.5 times an Xbox, with the ability to run load-screen free and fully support the Wiimote and Nunchuk.

:roll: :roll:

wiiboy101gezza

On January 29, 2008 at 6:41 pm

Full HD. I'm sitting here crying with laughter into my cupped hands. Yes, and I've got a "full-sized" TV, whatever that means, considering they're all different sizes.

You morons. Full HD.

My Wii is 480p, but my TV supports XD Engine 2 and 100/120Hz screen refresh. To the human eye that's a 720p-quality image. Please look past marketing terminology and try engaging a bit of brain power.

wiiboy101gezza

On January 29, 2008 at 6:43 pm

Xbox 1: pallet or barrel physics at 12 frames a second, in game, in Half-Life 2.

Wii: 400-plus objects, all interactive in real time via the Wiimote, with real-time physics at 60 frames.

Carry on dreaming, Xbox fans. The video evidence speaks for itself.

zzzzzzzz *rolls eyes* zzzzzzzz

wiiboy101gezza

On January 29, 2008 at 6:46 pm

Physics plus Wiimote = next-gen immersion.

Physics plus button bashing = last gen. Sorry, that's just fact :roll: :roll: :razz: :razz:

3DS Macks

On January 29, 2008 at 8:58 pm

Wiiboy continues to write about things he knows jack about.
That’s awesome dude.
This entire thread, is nothing but you writing / copy and pasting garbage. I honestly find it difficult to find anything you’ve written, to be even remotely true, and anyone with any understanding of any branch of hardware / software, can see that.
If this were the appropriate forum for technical discussion of any type, you would have no place here. Which probably explains why you’re here. You’ve likely been banned for trolling, as was mentioned earlier.

But you do love your unimpressive tech demos. Not sure what you get out of 400 boxes, moving around on screen, with static backgrounds, no ai, not much game code of any sort running. Just generic boxes.
THAT IS AWESOME!!!
Almost as pointless as this:

http://www.youtube.com/watch?v=MRB-zQogLeY&feature=related

I didn’t find most PS3 demos all that impressive really, but Wii couldn’t run the “scene” that this is set in, let alone the animation, skinning, shadows, etc, etc..
http://www.youtube.com/watch?v=9O2YWumQNas&feature=related

Nor could it run rag doll physics, and physics involved in this:
http://www.youtube.com/watch?v=phZQWtTAP5Q

And there's no need to address your continued strategy of comparing two unrelated games and circumstances in an attempt to make some kind of point.
Gamecube wasn’t capable of Half-Life 2 at all, btw.

lennell

On January 30, 2008 at 2:36 am

Full HD means 1080p for the TVs that are out in stores for now… and the Wii graphics chip has full HD… If you don't think it's true, go to Yahoo or Google and search for nintendo-hides-wii-HD.

wiiboy101

On January 30, 2008 at 9:57 am

I write:
1. Did I compare an aluminium CISC Celeron to a custom RISC, game-centric, PowerPC, micro-embedded, silicon-on-insulator, copper-wire CPU? NO, THAT WAS THE XBOX FANS.

2. Did I use the term "full HD", a marketing term that means nothing in reality? NO, THAT WAS THE XBOX FANS.

3. Did I try saying DRAM outperforms 1T-SRAM-R? NO, THAT WAS THE XBOX FANS.

4. Did I compare off-the-shelf crappy parts to highly effective, efficient, custom-designed parts? NO, THAT WAS THE XBOX FANS.

5. Did I try saying a 9-million-transistor CISC Celeron matches a 20-million-plus-transistor custom silicon-on-insulator RISC CPU? NO, THAT WAS THE XBOX FANS.

6. Did I take the EA comments as fact, as an idiot would do? NO, THAT WAS THE XBOX FANS.

7. Did I confuse a political anti-Nintendo statement with an honest statement by an EA spokesperson? NO, THE XBOX FANS DID.

Nintendo are the games industry leaders. Their position is unmatched and they're higher than Sony on the Japanese stock exchange.

EA, Sony, Microsoft and all major 3rd parties fear Nintendo. Are you blind? Don't compare marketing babble with system spec facts, it makes you look stupid.

If the EA guy was honest he would talk about the TEV unit, the Wii's custom hardwired effects, and Broadway's massive advantages over a weak Celeron. THE FACT IS IT WASN'T AN HONEST STATEMENT, IT WAS IN FACT AN ANTI-Wii STATEMENT

BY A 3RD PARTY IN AWE OF NINTENDO'S MIGHT AND POWER AND CLASS-LEADING EDGE WITHIN THIS INDUSTRY.

FACT

wiiboy101

On January 30, 2008 at 9:59 am

IF YOU BELIEVE THE EA "WII = XBOX" STATEMENT, THEN YOU ALSO BELIEVE AMERICA INVADED IRAQ BECAUSE OF WEAPONS OF MASS DESTRUCTION AND NOT BLACK GOLD.

WAKE UP WHEN SOMEBODY'S LYING, YOU BUFFOONS.

WIIBOY2008GEZZA

On January 30, 2008 at 11:14 am

http://www.youtube.com/watch?v=41w-bbtVFKE

http://www.youtube.com/watch?v=zM9S3zmAHdk

http://www.youtube.com/watch?v=U8JkWBDKgKs

http://www.youtube.com/watch?v=402w0VWnMDk

http://www.youtube.com/watch?v=fGuhk-AcRmE

All the above are launch-era Wii physics: hundreds of real-time objects with physics. Watch the power of the Wii, it kicks physics ass. Objects, water, zero gravity, Wiimote interaction, in game, at a near-perfect 60 frames. ELEBITS, KONAMI.

http://www.youtube.com/watch?v=zDkcGm4ElDM this link is Crysis physics with limited interaction, i.e. no Wiimote. Remember, Crysis is supposed to represent the top level; the Wii's in-game physics in Elebits is clearly kicking Crysis's ass at physics.

http://www.youtube.com/watch?v=PBzlDecnh7o this link shows X360 physics, one object, vs the Wii's hundreds of objects. HALO 3'S PHYSICS IS LAME VS THE WII'S ELEBITS.

Anyone who suggests an Xbox 1 can do Wii physics and real-time Wiimote interaction all at the same time is a moron.

I write: the Elebits videos clearly show amazing physics, hundreds of objects, all directly interactive with the Wiimote.

Watching physics take place after a button press can never match true real-time physics linked to a Wiimote. Sorry, fact :roll: :roll: :roll:

The Wii is handling physics many times better than Half-Life 2 etc., clearly, in those videos :roll: :roll:

The Hollywood GPU in the Wii handles all the memory controllers and supports a dedicated sound processor and on-board system-core abilities. It also supports all the hardware drivers for flash, WiFi etc., all handled in hardware on a dedicated custom system-core chipset, not a mere GPU.

This allows the Broadway CPU to do high-level A.I., physics and graphics co-processing, as no CPU time is wasted on other jobs, or HD, or sound.

What the PS3/X360 idiots don't understand is that both the X360 and PS3 use their multi-threaded, in-order, crappy processors as jack-of-all-trades chips:

sound, OS, HD, graphics, etc. etc. And to sum up how wasteful they are, the PS3 requires two threads just to run and a whole thread just for sound processing. Add to that poor RAM and cache performance, and Cell is only ever going to tap into, say, 10% to 30% of its power, whereas the Wii's CPU is tapping almost all its power on actual in-game processing.

E-F-F-I-C-I-E-N-C-Y. Work out what it means :roll: :roll: :lol: :lol:

3DS Macks

On January 30, 2008 at 2:42 pm

Wiiboy, no they didn't. They said that it is very likely that the Wii's cpu outperforms an Xbox's. They simply stated that the Xbox can process math in its gpu that the Gamecube would need the cpu for.

That is a fact. And is likely still a fact for Wii, except now, Wii has enough of a cpu speed and power increase to cover much of it, and out-run an Xbox. That is what has been stated throughout this thread.
And they agreed that the EA guy was likely being a naysayer.
Saying that an Xbox was overall, more powerful than a Gamecube, and Wii is overall more powerful than an Xbox, is the gist of what was said here.

And now you drag Crysis, into your comparisons, by linking some schmo’s demo?
Physics in Crysis is different, in that you can shoot trees anywhere and they damage realistically. You can splinter wood, cut through leaves, etc… The level of physics in that game could not ever, ever be done on a Wii. Developers have benched its cpu and said they would likely get 2x Gamecube. That's good, not great, and sure as hell not "AWESOME". I've used G3s and G4s; we didn't use anything but, in our workplace. They're nice cpus, but they're not magically awesome at anything, especially by today's standards.

And Elebits looks sweet. I’ll have to pick it up. But, what I see in those demos, is a couple hundred, similarly shaped objects, swirling around, no different from the PS3 leaf demo.
I don’t see the cpu being tasked with animation, skinning, joints, or a.i, or anything else computationally intensive.
All I see, are a bunch of objects, being affected by a vortex.
I don’t see water physics, I see a water shader, that responds to objects, and wind in a preset manner. Nothing like actual water physics. (but i’m sure you knew that)

Plus, as I think was already mentioned, the Wii's cpu is more involved in its graphics than an Xbox's was. And offloading mediocre sound processing to an underpowered sound chip doesn't magically give it any advantages. Lol.

lennell

On January 31, 2008 at 1:31 am

Look, people, the Wii is two times the power of a GameCube, and yes, it's more powerful than an Xbox, about one and a half times more powerful. A game like Mario Galaxy, if it was put on the Xbox, would have to be pushed to the max of the Xbox's power, but Mario Galaxy is only using 25 to 30 percent of the Wii's power, not even half of it. And Metroid Prime 3 could not be done on an Xbox.

wiiboy101gezza

On January 31, 2008 at 6:24 pm

What planet are you on? Flipper had the richest custom hardware and could pull off effects and processing with little or no hit on actual hardware performance. The Flipper GPU was vastly more effective per clock than the XGPU, fanboy. The XGPU was extremely inefficient, that's industry fact; it sought data from extremely slow DRAM, and you can't process what isn't there, can you? Flipper didn't wait for data at all, it was there in real time. 1T-SRAM, fool. And Flipper's custom in-hardware abilities were never fully revealed: it can handle water physics off the hardware, fur shading off the hardware, cel shading off the hardware, etc. etc. A lot of the basic animation and physics were handled automatically in hardware, so the CPU just added the fancier stuff on top. The TEV unit: 8 texture layers, 16 texture stages, a fully programmable shader, blender and multitexturer. The Xbox GPU did not support such rich texturing stages or layers, and used its shaders to support bump mapping, a feature Flipper had in hardware. Also, the Xbox was a dot-1 bump mapper; the Flipper GPU was a dot-3 bump mapper, superior bump mapping.

The Xbox maxed out at 4 texture layers in game; the GameCube maxed out at 8 texture layers in game.

What does custom mean? Are you so brainwashed by PCs and Bill Gates that the term custom doesn't compute in your brain?

Flipper supported well-thought-out custom hardware; the XGPU was a PC card never designed for consoles or TV resolutions.

On paper the XGPU was a 120-million-polygon pusher; in game it was 12 million. Flipper hit 20 million, almost double the XGPU. That's an industry fact.

Stop trying to say the Xbox shaded and the GameCube didn't. The Xbox's shaders were all maxed out with software bump mapping and normal mapping, while Flipper did custom maps in hardware, you buffoon, and had twice the texture layers and twice the texture stages to do it with.
Please go and compare Rare on GameCube to Rare on Xbox.

Rare did Star Fox Adventures on GameCube: amazing graphics, shaders, and fur shading, at 60 frames.

Rare did Conker's game on Xbox 5 years later, YES, 5 YEARS LATER or thereabouts, and Conker's on Xbox had worse-looking graphics than Fox Adventures and obviously weaker fur shading.

Please explain how Rare did better on the GameCube at launch than they ever did on the Xbox with 5 years of Xbox experience under their belt. YET ANOTHER INDUSTRY FACT TO CHEW ON, XBOX FANS.

The fur shading in Galaxy on Wii is better than Fox Adventures and Xbox Conker's combined, so again, how is the Wii an Xbox?

Please post fact, not fanboy lies.
Rogue Squadron 2 and 3 supported higher fill rates, higher polygon counts, more texture layers and bigger textures than any game ever made on the Xbox. Again, fact outs fanboy opinion.

The Xbox had crippled RAM, a crippled FSB and no advanced real-time hardwired decompression. In fact the GameCube's hardwired compression was being copied in software by 3rd parties, but it never worked as well, as it was just no match for hardwired, real-time, on-the-fly decompression.

Xbox front-side bus: 133MHz, no compression, 800MB to 1GB of bandwidth, fed data by slow, 50/100-nanosecond DRAM.

GameCube: 162MHz front-side bus plus 4:1 compression, over 5GB/s of bandwidth, fed data by ultra-fast, SRAM-like 1T-SRAM at sub-10 nanoseconds.
Clearly a much better design.

Wii: 243MHz front-side bus plus 4:1 compression, 8GB/s of bandwidth, fed by even faster on-die optimized 1T-SRAM and again by GDDR3 RAM.

Clearly destroying the Xbox. The Xbox was seriously bus-crippled compared to the GameCube and left for dead by the Wii's super-fast bus for 480p rendering.

Xbox data flow in the bus: 800MB to 1GB of bandwidth.

Wii data flow in the bus: 8GB of bandwidth. It's 8 times faster in bandwidth and over 10 times faster in RAM latency.

Stop ignoring the facts.

Broadway CPU: 256K L2 cache plus 4:1 compression = a virtual L2 cache of 1MB (add the fact that the Wii's RAM is almost as fast as L2 cache and you have monstrous cache performance).

Xbox: 128K L2 cache, no compression, and no fast RAM.

There's clearly no contest.

Hollywood is custom made and supports a 720p-class fill rate capped to 480p rendering. Its performance at a capped 480p will be fantastic; it will simply never run out of usable in-game fill rate for 480p rendering. The PS3 and X360 do not support fill rates for the resolutions they're boasting; 720p is a struggle at 60 frames for the PS3, so the 1080p will always be upscaling.
Sony lied, Micro lied, it's that simple.

Hollywood supports 8 texture layers and 16 multi-function stages for full blending, combining and fiddling of in-game textures in real time.

Guess what, it does shade, dumbos.

Funny, the Xbox was ineffective and inefficient. It lacked Flipper's RAM speed and internal bandwidth, and its PC-card design, like all PC GPU cards, wasted its power. Polygons on paper: 120 million; in game: 10/12 million; clock speed 240MHz or thereabouts.

Flipper hit near 20 million polygons in game at twice the frame rate of the Xbox, and its clock speed was 162MHz. Clearly Flipper was ultra efficient, clock for clock, against the Xbox GPU.

Hollywood is a version two of Flipper at 243MHz, so it's higher spec, it's more effective and efficient than Flipper, and it has faster, larger RAM pools and better latency too.

Hollywood supports 2x-plus the peak in-game fill rate of the Xbox and holds it sustainably without hitting frame rates.

The Xbox had fill-rate issues due to crippled RAM and bandwidth. FACT :roll: :roll: :roll:

Silicon-on-insulator on its own can boost performance by 35% in a CPU. Not 35% more clocks, a 35% performance increase at the same clock speed, so you could say Broadway is 35% more powerful than a Broadway chip not using silicon-on-insulator tech.

Broadway's registers and instruction set are many, many times richer than a Celeron's. Broadway is also fully compliant with all PowerPC coding, G3, G4, G5, etc. It's simply a massive difference from the basic CISC spec-sheet Celeron design.

DRAM, XDR RAM, PC GDDR RAM, etc. (PC, PS3, X360): latency 50 to 100 nanoseconds.

1T-SRAM-R die-embedded fast RAM in the Wii: 5 nanoseconds, 10 to 20 times faster RAM performance. The Wii reads data from RAM about as fast as Cell in the PS3 reads from its cache memory.

That's unbelievably fast.
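
Those nanosecond figures translate into CPU clock cycles as follows; a small sketch taking the 5 ns and 50 to 100 ns latencies and the 729 MHz / 733 MHz clocks exactly as quoted in this thread (not datasheet values):

# Convert the quoted memory latencies into CPU clock cycles at the quoted core clocks.
def cycles(latency_ns: float, clock_mhz: float) -> float:
    return latency_ns * clock_mhz / 1000.0   # ns * MHz / 1000 = cycles

print(f"Wii 1T-SRAM-R, 5 ns @ 729 MHz:  ~{cycles(5, 729):.1f} cycles")    # ~3.6 cycles
print(f"Xbox SDRAM, 50 ns @ 733 MHz:   ~{cycles(50, 733):.1f} cycles")    # ~36.7 cycles
print(f"Xbox SDRAM, 100 ns @ 733 MHz:  ~{cycles(100, 733):.1f} cycles")   # ~73.3 cycles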

wiiboy101gezza

On January 31, 2008 at 6:35 pm

The Wii chipset, the GPU and CPU, vs the Xbox chipset, again GPU and CPU: in bandwidth it clearly whips its ass. The chipset and bus bandwidth are easily 5 times an Xbox, if not 8 times. There's so much more bandwidth and lower latency in the Wii's chipset than the Xbox's.

Xbox GPU bandwidth read from main RAM, after all the other chips take their share: about 3GB/s if you're lucky.

The Wii reads many, many times that, and on top of that reads it compressed in real time, so it destroys the Xbox's GPU bandwidth.

The same applies to the CPUs. Broadway's cache is larger and supports compression with real-time decompression, and the two RAM pools offer better bandwidth, latency and compression than the Xbox's 200MHz shared SDRAM.

ADMIT DEFEAT

wiiboy101gezza

On January 31, 2008 at 6:41 pm

Mario Galaxy has better graphics and better physics than anything on the GameCube or Xbox, and does it all with next to no loading times.

The Xbox cannot do that. Also, the Wii is doing it at 60 frames solid. The Xbox would not run Galaxy at 30 frames if ported, and it would be with lower graphics and much, much longer load times.

3DS Macks

On January 31, 2008 at 10:25 pm

Lol, the “Rainman with no math skills” was a good description of you.

I just read the beginning of your continued cap-locked bull. There's no reason to read past that, as it's nothing more than you regurgitating your delusional bull and continuing to repost stuff that has been repeatedly shot to hell.
Why it is, that you continue it, I will never know, other than you’re an idiot.

Gamecube’s gpu had FIXED FUNCTION abilities. That means, that you have a narrow list of functionality. Emboss bump mapping is done like so, embm is done as such, etc…
It isn't programmable. That's why you see your "list" of gpu features. It's like a list of generic tv dinners. Whereas an Xbox was more like a chef.
I know it’s confusing for you, and you’ll live in denial, no matter how long the list of developers that disagree with you got.

Funny how you don’t get that.

Swp64

On January 31, 2008 at 10:55 pm

So how come you guys keep this thread going? You can see he’s lost his entire argument, and has started back to square one.
And you can see he doesn't understand a lick of what he copies and pastes, and doesn't want to. There's really no point to it. Just look at the "dot 1, dot 3 bump mapping" comments he's moved on to now. Lol.

He doesn’t know the difference between a “hardware feature” and a “software feature”. He doesn’t understand “fixed function” and “programmable”. He doesn’t understand compression, or any of the other bs he posts. He can do nothing but repeat his collection of mis-interpreted “facts”.

He now seems to think the Gamecube did bump mapping in “hardware” while Xbox did it in “software”. :lol:
Never mind the programmable processors on the gpu specifically designed to process such things, and never mind the fact that anything on the gpu, that’s there to process a specific function is “in hardware”, and anything done on the cpu, is considered “software”. You could explain that to him, but he’d just drop back to some of his earlier bs. Or better yet, sign in under a different name, and post some more bull.

It's really childish, and you're not being informative to him. He thinks this is some fanboy "debate", and if he keeps jabbering he can try to convince himself that he knows what he's talking about. He needs this to sleep at night.
So it really doesn’t matter how much you link him to, or explain, or find developers who’ve worked on it to explain, he’ll continue with what he’s doing. Being a delusional fanboy.

He’d probably his wrists if he ever figured out the truth.

lennell

On February 1, 2008 at 2:03 am

wiiboy101gezza: yes, that's almost right about Xbox power. Mario Galaxy on Xbox, at 100 percent of the Xbox's max power, might run at 30 frames like Doom 3 did. Mario Galaxy looks next-gen on the Wii, but it would not look as good on an Xbox, if it could even run it… And if Doom 3 were put on the Wii and made from the ground up, it would run at 60 frames and look better on the Wii.

lennell

On February 1, 2008 at 2:35 am

The GameCube and Xbox are almost the same in graphics, but the GameCube had slightly better graphics than the Xbox, and the Xbox had slightly more power than the GameCube. The Wii is about two times more powerful than the GameCube, and about one and a half to one and three-quarters times more powerful than the Xbox.

BranH

On February 1, 2008 at 9:25 am

http://www.nvidia.com/page/pg_20010530444180.html
How to make a Gamecube: Take a Geforce 256, (aka Geforce 1) add better fill-rate savings mechanisms, and 1 meg of on-chip texture cache, and two megs for its frame-buffer. Allow it to read and write to and from there, instead of hitting main bus bandwidth with its pixel recirculating tev unit.
Pair it with 1tsram, and a G3, toss in a few tweaks, and call it a day.

Its fixed function “hardware” feature list looks roughly the same, its transformation and lighting units look the same. It lacked vertex shaders, had the same number of rops and per clock texture performance, had the same “hardware” S3 texture compression, etc…
Just spiff it up a bit, and call it a Gamecube. There’s not much revolutionary going on there

That was fine for last gen hardware, and Wii is an enhanced version of it. Performance is yet to be determined, but it looks to be just what its specifications suggest. No less, but no more either.

wiiboy101gezza

On February 1, 2008 at 5:20 pm

Fixed function? Hmmm. The Flipper GPU: fully programmable lighting, a fully programmable 16-stage, 8-layer TEV unit.

The Hollywood GPU in the Wii: a fully programmable, 16-stage, optimized TEV unit, plus fully programmable lighting.

Stop applying PC terminology to a non-PC custom design, you're making yourselves look more and more stupid. There is a world outside Microsoft, there is a world outside Windows, you brainwashed buffoons.

Saying the Wii isn't programmable in its GPU is like saying "but your car doesn't have an engine." "Huh, how come?" "Well, Bill Gates's engines run on petrol and your car runs on diesel." They're both engines, you buffoons.

The TEV unit in Hollywood is a fully programmable, 16-stage, multi-function blender/shader engine, fully supporting multiple crazy combinations of colours and textures, applying those stages in one pass, with 8 texture layers.

Add to that custom shader support via a CPU optimized and balanced to the GPU for co-processing :roll: :roll:

wiiboy101gezza

On February 1, 2008 at 5:32 pm

You're going to look damn stupid now. Links of proof follow.

Link one confirms programmable shaders in the GPU and CPU combined:

http://www.extremetech.com/article2/0,1697,29619,00.asp

wiiboy101gezza

On February 1, 2008 at 5:36 pm

Link two shows the efficiency of the GameCube and Wii:

http://www.nintendoworldreport.com/editorialArt.cfm?artid=516

wiiboy101gezza

On February 1, 2008 at 5:38 pm

Link three clearly shows that Flipper, and then Hollywood, both support the TEV engine's programmable shaders.

The Wii's is a version two of those shaders.

http://wiiall.blogspot.com/2006/06/wii-gpu-has-nice-blendingshading.html

wiiboy101gezza

On February 1, 2008 at 5:40 pm

Xbox fanboys, stop making yourselves look stupid. A custom design has sweet F.A. to do with PC DirectX 1 through 10, etc.

It's custom OpenGL, Nintendo GL, hardware and software. Hmmm, no shaders in the Wii? Yes, OK, and that's a flying pig outside my bedroom window.

wiiboy101gezza

On February 1, 2008 at 5:42 pm

Oh yes, Gekko in the GameCube and Broadway in the Wii both support custom programmable nodes in the CPU/northbridge for micro-coding to the metal of the CPU: custom, hand-built programs by clever CPU coders for blindingly fast raw CPU code.

Fact is fact, anti-Wii bull is bull.

wiiboy101gezza

On February 1, 2008 at 5:46 pm

So Tim Van Hook of ATI lied about the TEV unit, Flipper's T&L and the Gekko CPU?

I think not. Fully programmable, end of story.

The Wii is 2.5 times an Xbox, not equal to an Xbox. End of argument.

ATI did not lie, it's the Xbox fans doing the lying.

BranH

On February 1, 2008 at 7:08 pm

What about it? Your comment about Xbox doing its bump mapping “in software” and Gamecube doing it “in hardware” is where the programmable and non-programmable parts come in. Pixel shaders on Gamecube were defined as “configurable” Xbox as programmable. That’s why you see Nintendo gpu patents for how they achieve “emboss bump mapping” or “environmental bump mapping”.

If you looked it up, there’s probably a disparity in “programmable flops”. It just means more of Gamecube’s pipeline is “fixed”.

For example, Sony likes to list the number of total flops on its gpu, while pc’s typically list only programmable. That’s why you see their bs 1.8 teraflop numbers. Lots of functions on a gpu are “non-programmable”. They just do what they were designed to do. Programmable means you have more control over what goes on.

Anyway, your first ExtremeTech link explains how the cpu could help in lighting and geometry. That would be "in software".
As opposed to vertex shaders doing the same type of work, aka, “in hardware”.
The Gamecube’s gpu contains vanilla T&L. Like this:
http://www.nvnews.net/previews/geforce3/programmable_gpu.shtml

And you don’t need DirectX to do anything. DX is just an api, that exposes the hardware features of the gpu, just like openGL or any other api. Don’t need Microsoft, and you don’t need Windows.

And those last two links: the guy is completely clueless. No offense to him, or anyone else, as he's not being condescending about it.

He’s confusing pipelines, and cycles, and whatever else.

wiiboy101

On February 4, 2008 at 4:34 pm

Xbox: 8x4 real-time lighting in the GPU, no custom CPU lighting.

GameCube: 8x8 real-time lighting plus custom CPU lighting.

Wii: more of the same. Hmmm, the Wii is an Xbox? I THINK NOT.

Highest real-time lighting last gen: the GameCube. The Wii: 2.5 times that. Xbox real-time lighting left for dead. FACT

wiiboy101

On February 4, 2008 at 6:17 pm

Xbox: a 733MHz Celeron, aluminium, CISC, spec-sheet CPU.

Wii: a 729MHz customized, game-centric PowerPC 750CL, copper wire, silicon-on-insulator, with optimized caches and instructions, SIMD, and a micro-embedded, tight transistor design, a proper micro processor unit.

Xbox cache: L1 32K, L2 128K.

Wii cache: L1 64K, L2 256K, plus 4:1 data compression with real-time, on-the-fly decompression, giving virtual 1MB L2 cache performance. 256K x 4 = 1MB, does it not?

Xbox front-side bus: 133MHz, no compression tricks, a bog-standard, outdated Intel PC bus, no bus pumping, just a sad old standard bus at 133MHz, 1GB of bandwidth. A TERRIBLE BOTTLENECK, tut tut, poor design.

Wii front-side bus: 243MHz plus a 4:1 compression/decompression custom data trick, in real time, with no hit on the hardware: 8GB of bandwidth, clearly many times over the Xbox bus.

Xbox RAM: 64MB of bog-standard SDRAM at 200MHz, clocked up to 400MHz by the controller. A poor RAM choice for a games console, not suited to gaming data flow, and with terrible latency of 50 to 100 nanoseconds.

Wii RAM: GPU-embedded eDRAM/1T-SRAM-R, 3MB of high-bandwidth GPU cache memory on multiple 128-bit buses, plus 24MB of 486MHz 1T-SRAM-R die-embedded fast RAM, super fast at 5 nanoseconds…
plus 64MB of GDDR3 at 486MHz, optimized RAM on the motherboard.

Xbox = 64MB of slow, clunky, 200MHz SDRAM.

Wii = 88MB of faster, better RAM at 486MHz. Clearly much, much better.

All the data compression tricks for the front-side bus and cache also apply to RAM, so 1MB of data = 4MB in the Wii, a data compression trick the Xbox simply does not have.

The Wii has additional memory in its flash cache, usable as access RAM or a disc cache, on par with the GameCube's slow A-RAM performance and vastly quicker than the hard-drive virtual memory of a PC or Xbox 1.

The Wii's custom disc drive system is much, much faster at loading and streaming than the Xbox DVD drive.

The Wii's memory looks kind of like this:

Xbox, 64MB; Wii, a better, more effective 64MB RAM pool.

GameCube, 24MB of fast RAM; Wii, 24MB of even faster, optimized RAM.

The Wii's 1T-SRAM is better than the GameCube's, the Wii's GDDR3 is better than the Xbox's RAM, and the Wii's disc has way more storage than the Xbox's but loads way faster than the GameCube's.

So the 88MB of RAM in the Wii is more like 200MB in Xbox terms. EASILY.

The facts talk for themselves.

BranH

On February 4, 2008 at 6:37 pm

And that’s another one of those figures that you couldn’t take, and do an apples to apples comparison of.
You’d have to specify what type of lights they are. Spot, infinite, etc..
Or what type of computation you’re doing for it, Per vertex? Per pixel? Computed in the combiner stage? (seemed to be the favored Gamecube method)

The developers that have commented, have simply said Flipper’s lighting was grossly simplified, compared to even a Geforce 2. The same amount of math isn’t being done.
Of course, what matters is what it winds up looking like in the end, not so much the complexity of the math being performed.
But you could make a case, that Gamecube couldn’t do the lighting in something like the Splinter Cell series, and you’d be right.

BranH

On February 4, 2008 at 7:00 pm

And you continue with the gibberish. Using your marketing descriptions of products doesn’t magically make them cool.
And DDR has been used for PC gaming for years, it’s well suited to gaming applications, and gpus have been being tweaked to interface with the 128 bit bus to its ram.
I know you don’t read most of what’s written, because if you did, you’d have noticed that your Wii /= Xbox crusade was agreed upon, and even explained in this very thread. There really isn’t anyone here that thinks it’s an Xbox.

But no one is of the mind that Wii = 360 or PS3, simply because it’s targeting 480p, because it doesn’t. That was everyone’s hopes, but there is nothing to indicate that.

And you continue with your magical Nintendo compression. I'm sure you've figured out by now that Nintendo didn't invent most of the technology in its consoles, and therefore uses the same types of technology, and thus the same compression schemes and algorithms, as any reasonably modern gpu does. Neither ATI nor IBM has a deal with Nintendo to supply them with "that little something extra" you seem to dream about, that somehow gives it a boost to compete with the storage capacity of modern consoles.

It’s simply not a 480p 360, no matter how you attempt to spin your numbers. Sad, but true. Nintendo fanboys need to stick to sales figures, to prove Wii’s awesomeness.

wiiboy101

On February 4, 2008 at 7:38 pm

Virtual texturing in hardware: the Wii gets 66% more effective texture space and bandwidth just from its hardwired virtual texturing trick.

The Xbox has no virtual texturing at all, the Xbox has no large, fast GPU cache/buffer at all, the Xbox has no SRAM-like main memory at all :roll: :roll:

The Xbox had to do all its GPU buffering, Z and frame, in slow, off-chip, old-school DRAM, and had no pixel recycling.
(Thank you, whoever, for correcting my terminology: I said culling rather than recycling.)
No virtual texturing, no polygon culling, no in-cache compressed data/texture reads, and a tiny 128K to maybe 256K GPU cache.

The Wii reads compressed data/textures into the GPU and has a huge, fast GPU cache/buffer and lightning-fast RAM to feed it.

Virtual texturing, pixel recycling, fast RAM, reading and processing compressed data/textures, polygon-culling efficiency tricks, multi-layered single-pass rendering:
just a few of the GPU advantages Hollywood has.
In effective "in-game fill rate", Hollywood sheer destroys the XGPU. That is just fact. Its texture stages and texture layers also leave the Xbox for dead.

BranH

On February 4, 2008 at 11:14 pm

You continue to write the same stuff that has been repeatedly explained. You’re comparing apples and oranges on every point you have.
An Xbox recycles pixels, the same way a Gamecube does. You’re describing “single pass multi-texturing” as if it’s some special Gamecube feature. It’s not.

And Gamecube's texture cache is there for the TEV unit to function, since it's responsible for texture reads. And as has been repeatedly pointed out to you, the Gamecube NEEDS to read and write to and from its on-chip memory every single time it wants to add a texture layer, as it could only do them one at a time.
An XBox, simply adds two at a time, loops back once and adds two more, all on chip, THEN writes to the frame buffer.

And has also been pointed out to you, Xbox DID cull geometry, and does, routinely use extensive compression.

Multiple developers who’ve worked directly with the hardware have said Xbox overall has the advantage in graphics. There isn’t much there to argue about.
They all come to the same overall conclusion.

Now, I could point out what’s wrong with these below figures, and say that with more multi-texturing, Gamecube would pull well ahead of a PS2, and sticking to the cheap math lighting (non per vertex), it wouldn’t drop polygon count as much, when moving to “lit” polygons.
But (as most other developers have also said), It doesn’t stand up well to an Xbox, at a list of things. The advantages Gamecube had, in flexibility and overall performance is short. They’re not that far apart, but overall, Xbox > Gamecube. Not sure why that hurts your feelings so much.

EA ran the numbers from their own uses, at SigGraph.

The PC platform is a 1.4GHz Athlon and ATI Radeon 8500.

The first test was a 3998 polygon NBA player:

Gouraud shading, unlit, no textures
PS2: 17.0 MPolygons/second, 22.6 MVertexIndices/Second
Xbox: 47.2 MPolygons/second, 91.4 MVertexIndices/Second
GCN: 18.7 MPolygons/second, NA
PC: 24.1 MPolygons/second, 46.1 MVertexIndices/Second

Gouraud shading, lit, no textures
PS2: 10.9 MPolygons/second, 14.7 MVertexIndices/Second
Xbox: 22.4 MPolygons/second, 43.4 MVertexIndices/Second
GCN: 10.3 MPolygons/second, NA
PC: 15.9 MPolygons/second, 20.9 MVertexIndices/Second

Gouraud shading, lit, textured
PS2: 8.5 MPolygons/second, 11.5 MVertexIndices/Second
Xbox: 14.2 MPolygons/second, 30.3 MVertexIndices/Second
GCN: 7.2 MPolygons/second, NA
PC: 5.1 MPolygons/second, 10.9 MVertexIndices/Second

Test with a 69451 polygon Stanford bunny (from Stanford Computer Graphics Laboratory)
with Gouraud shading only:
PS2: 25.2 MPolygons/second, 31.8 MVertexIndices/Second
Xbox: 63.9 MPolygons/second, 93.8 MVertexIndices/Second
GCN: NA, NA
PC: 26.3 MPolygons/second, 36.2 MVertexIndices/Second
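
To put that table in per-frame terms, here is the "Gouraud shading, lit, textured" row divided out at 30 and 60 fps; this is only a rework of the numbers already listed above, nothing new is measured:

# Per-frame budgets implied by the "lit, textured" throughput row above (millions of polygons per second).
lit_textured_mpolys = {"PS2": 8.5, "Xbox": 14.2, "GCN": 7.2, "PC": 5.1}
for fps in (30, 60):
    for platform, mpolys in lit_textured_mpolys.items():
        print(f"{platform} at {fps} fps: ~{mpolys * 1e6 / fps:,.0f} lit, textured polygons per frame")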

BranH

On February 4, 2008 at 11:22 pm

So, in closing, I guess we can see why EA wrongly assumes Wii’s video hardware is so lacking.

above benchmarks and presentation can be found here: (Upon request, you have to take an oath of anti-fanboyism)
http://64.233.167.104/search?q=cache:3Vlq42PryMIJ:trowley.org/sig2002.html+%22PDF+hosted+locally+at+author%27s+request%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

wiiboy101gezza

On February 5, 2008 at 3:08 pm

there is no game in existence showing 15 million polys or more on xbox at 30 frames without framerate issues
yet there are games at 16 to 20 million polys at 60 frames on gamecube
SO HOW ON GOD'S GREEN AND PLEASANT PLANET IS WII A FOOKING XBOX? IT CLEARLY ISN'T

XBOX FSB 133MHZ = 1 GB BANDWIDTH

WII 243MHZ PLUS REALTIME COMPRESSION/DECOMPRESSION = 7.9GB BANDWIDTH

A QUAD-PUMPED XBOX BUS WOULD BE 4 X 133 MHZ = 532MHZ, THAT'S AROUND 4 GB BANDWIDTH

AS YOU CAN SEE, IN PEAK BANDWIDTH WII'S FSB IS 4 X 243MHZ, OR 7.9 GB BANDWIDTH

EVEN IF XBOX HAD A QUAD-PUMPED BUS OF 532 MHZ IT STILL FALLS SHORT OF WII'S FSB

1GB OR THEREABOUTS OF BANDWIDTH CANNOT IN ANY WAY, SHAPE OR FORM MATCH 7.9 GB OF BANDWIDTH, CAN IT

243 MHZ PLUS 4-TO-1 COMPRESSION IS EQUAL TO A 243MHZ BUS PLUS QUAD PUMPING

A 243 MHZ BUS WITH BANDWIDTH OF 4 X 243MHZ = 972 MHZ, OR 7.9 GB BANDWIDTH

PLEASE SEE SENSE, XBOX IS TROUNCED BY WII. THE RAM FEEDING THE BUS AND CHIPSET IS ALSO 10 TIMES FASTER, IF NOT MORE, THAN XBOX SDRAM (1T-SRAM-R)
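
For what it's worth, here is the arithmetic these figures imply, a minimal Python sketch that assumes a 64-bit front-side bus on both machines and simply takes the blanket "4:1 compression" multiplier in the post above at face value (that multiplier is the contested claim, not a verified spec):

    def fsb_bandwidth_gb_s(clock_mhz, bus_bytes=8, multiplier=1.0):
        # Peak front-side-bus bandwidth in GB/s for a bus of the given width.
        # `multiplier` stands in for the claimed gain from on-the-fly compression
        # (an assumption taken from the post above, not a measured figure).
        return clock_mhz * 1e6 * bus_bytes * multiplier / 1e9

    print(f"Xbox 133 MHz, no compression  : {fsb_bandwidth_gb_s(133):.2f} GB/s")              # ~1.06
    print(f"Wii  243 MHz, raw             : {fsb_bandwidth_gb_s(243):.2f} GB/s")              # ~1.94
    print(f"Wii  243 MHz, claimed 4:1     : {fsb_bandwidth_gb_s(243, multiplier=4):.2f} GB/s") # ~7.78

The ~1 GB/s and ~7.8 GB/s numbers in the post fall out of exactly this multiplication; whether a flat 4:1 multiplier is a fair way to count effective bandwidth is the thing the replies below dispute.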

:roll: :roll: :roll:

UNLESS WII IS SOMEHOW LESS POWERFUL THAN GAMECUBE, IT CAN'T BE LESS POWERFUL THAN XBOX, CAN IT

HIGHEST XBOX POLYGON COUNTS PUSH TOWARDS 15 MILLION, IF THAT, AT 30 FRAMES

GAMECUBE: 20 MILLION AT 60 FRAMES. THERE'S CLEARLY A BIG ADVANTAGE IN CUSTOM, EFFICIENT DESIGN THERE, AND CLOCK FOR CLOCK GAMECUBE KICKED ASS

WII IS A GAMECUBE 2.5, AS LEAKED AT E3 A COUPLE OF YEARS BACK

WII IS AN XBOX/GAMECUBE 2.5, SORRY, THAT'S JUST FACT :lol: :lol

24MB OF SRAM PERFORMANCE RIGHT ON THE GPU DIE, FEEDING DIRECTLY INTO MEMORY CONTROLLERS, IS A MASSIVE MASSIVE ADVANTAGE WII HAS

24 MB OF LIGHTNING-FAST SRAM FEEDING THE CPU AND GPU DIRECTLY BEFORE YOU EVEN CONSIDER GOING TO OFF-DIE, OFF-CHIPSET MEMORY

I DOUBT A LEVEL 3 CACHE OF 24MB WOULDN'T IMPRESS A PC FANBOY ON THEIR GRAPHICS CARD OR CPU MOTHERBOARD. PLEASE SEE SENSE, PEOPLE

BranH

On February 5, 2008 at 5:16 pm

See, you continue with the same old, same old.
You do get that Xbox can process in its gpu what a Gamecube needed its cpu for, right?
The cpu had to pass more over its front side bus if it wanted to offload geometry work that would otherwise be covered by vertex shaders.
And you're confused about geometry work in general.
The amount of vertex work being done is directly related to how many polygons you have on screen.
It's no different than saying PS2 has a fill rate of 2.4 billion pixels per second, which stomps the Gamecube's 648 million. The difference is that the PS2 figure assumes no textures. You want to add a texture layer, you cut the fill rate in half AND cut the polygon rate.
Gamecube could add a single texture right off the bat, and doesn't have to reprocess the polygon for every texture layer.
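
A tiny sketch of that point, using the peak numbers quoted above and the simplifying assumption that every extra texture layer on PS2 costs a whole extra pass while Gamecube's first layer comes "free" (a rough model for illustration, not measured data):

    def effective_fill(peak_mpixels, passes):
        # Peak fill rate divided by the number of times each pixel must be processed.
        return peak_mpixels / passes

    print(effective_fill(2400, 2))  # PS2 with one texture layer -> 1200.0 Mpixels/s
    print(effective_fill(648, 1))   # Gamecube with one texture layer -> 648.0 Mpixels/s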

I could list a game that has a fill rate of 1.2 billion pixels on PS2, which is higher than Gamecube is even capable of, but that doesn't make it better.
It all comes down to what you can actually DO with those pixels and textures, which is where the Gamecube wins out.

An Xbox can out-process a Gamecube at geometry by quite a bit. Any developer would tell you that. You don't have any figures for polygon counts in Xbox games, but it is a developer-accepted and backed-up fact that Xbox > Gamecube at geometry processing. There isn't much contest there.
Doing generic polygon work means you can push lots of polygons. No ambitious Xbox developer sticks to generic polygons.

Face it, the math doesn’t hold up, the specifications don’t hold up, the developer comments agree.

wiiboy101

On February 5, 2008 at 8:24 pm

13 year old boys in their bedrooms do not make developers. yes, the beyond3d forums have "developers" posting, more like 13 year old geeks pretending to be developers hahahaha zzzzzzzz

A DEVELOPER WOULDN'T IGNORE SPEC FACTS LIKE THE TEV SHADING IN REALTIME, AND HOLLYWOOD SUPPORTING A 16-STAGE REALTIME SHADER TREE, A CUSTOM HARDWARE "PROGRAMMABLE" SHADER. 1: DEVELOPERS LIE IF THEY'RE BIASED. 2: TEENAGE FANBOYS IN BEDROOMS DON'T MAKE DEVELOPERS HAHAHA HEHEHE, LAUGHING MY SOCKS OFF. DEVELOPERS? MORE LIKE AMERICAN TEENAGERS FANBOYING BILL GATES FROM THEIR NET CONNECTIONS IN THEIR BEDROOMS

HOLLYWOOD SUPPORTS A 16-STAGE REALTIME SHADER BLENDER AND 8 TEXTURE LAYERS IN A SINGLE PASS. TEV TEV TEV TEV TEV TEV TEV. IT'S THERE, IT EXISTS, ANYONE WHO SAYS OTHERWISE IS A BIASED FANGOON TALKING CRAP

FLIPPER GPU SUPPORTED A 16-STAGE SHADER TREE, "RIGHT"

FLIPPER SUPPORTED ON-CHIP, LOW-HIT CUSTOM VOLUMETRIC EFFECTS, "RIGHT"

FLIPPER SUPPORTED ALL SHADERS AND BLENDERS, CEL SHADING, FUR SHADING, ON CHIP. THE CPU JUST ADDS MORE SOURCE THROUGH FULLY PROGRAMMABLE GAME-CENTRIC DESIGN

XBOX GPU SUPPORTED EFFECTS/SHADING

GAMECUBE GPU SUPPORTED EFFECTS/SHADING PLUS CPU CUSTOM EFFECTS/SHADING

YOU'VE IN FACT GOT THE ANSWER BACK TO FRONT. IT'S GAMECUBE WITH THE HIGHER-SPECCED GPU AND CPU, AND BOTH ARE 3D-GRAPHICS OPTIMIZED

XBOX HAD A DECENT BUT BOTTLENECKED GPU AND TERRIBLE RAM, BUT ITS CPU WAS A CELERON, IT HAS NO 3D OPTIMIZATION

GAMECUBE HAD A 3D-OPTIMIZED, CUSTOM GPU/CPU WITH FANTASTIC CO-PROCESSING

THE XBOX HAD A GOOD GPU BUT A POOR CPU, AND THE WHOLE SYSTEM WAS BOTTLENECKED AND INEFFICIENT

STOP CALLING ON-PAPER NUMBERS IN-GAME NUMBERS, THAT'S XBOX FANBOYING. THE XGPU ONLY HIT 30% OF ITS THEORETICAL POWER

FLIPPER HIT FAR MORE OF ITS THEORETICAL POWER, A MAJOR POINT THE SPEC-SHEET FANS ARE DELIBERATELY IGNORING

XBOX WAS INEFFICIENT, THAT IS A FACT

GAMECUBE AND WII ARE HIGHLY EFFICIENT, THAT'S ANOTHER FACT

WII IS MORE EFFICIENT THAN GAMECUBE, "RIGHT"

AND GAMECUBE MORE EFFICIENT THAN XBOX, "RIGHT"

SO AGAIN, ON GOD'S SWEET EARTH, HOW IS WII AN XBOX

YOU CAN XBOX-FAN TILL YOU'RE BLUE IN THE FACE

PROVE TO ME, "IN GAME", THAT XBOX HAS WII POWER

GO GET THE EVIDENCE

I'VE PROVEN WII'S PHYSICS, I'VE PROVEN WII'S GRAPHICS, I'VE PROVEN WII'S LOADING SPEED

ALL I SEE FROM XBOX FANS IS NONSENSE

ELEBITS: OVER 400 OBJECTS BEING DIRECTLY PLAYED WITH VIA A WIIMOTE, ALL WITH THEIR OWN ADVANCED PHYSICS, AT 60 FRAMES

WHERE IS THE XBOX GAME WITH 400 PHYSICS OBJECTS DIRECTLY INTERACTING VIA ANALOG STICKS

IT DOES NOT EXIST, DOES IT. YOU'RE FANBOYING

ELEBITS' PHYSICS ARE IMPOSSIBLE ON AN XBOX, THAT IS COMPUTATIONAL "FACT", "RIGHT"

MARIO GALAXY HAS GRAPHICS/FRAMERATE/PHYSICS WAY BEYOND AN XBOX. IT'S THERE ON THE SCREEN WHEN YOU PLAY GALAXY ON A WII. THE EVIDENCE EXISTS, I'VE PROVEN MY FACTS AGAINST XBOX FANS' LIES

ELEBITS HAS AMAZING PHYSICS, AGAIN WII IS PROVEN. WHERE ARE THE 400 OBJECTS WITH PHYSICS IN AN XBOX GAME, AGAIN WHERE IS YOUR EVIDENCE

AGAIN, A 243MHZ BUS PLUS 4-TO-1 COMPRESSION = WHAT? OH YES, NEAR ON 8 GB BANDWIDTH

PLEASE EXPLAIN HOW 1GB MATCHES OR BEATS 8 GB. IT'S CLEARLY 8 TIMES LESS, DOING INFANT-SCHOOL MATHEMATICS

AGAIN, PLEASE EXPLAIN HOW 88MB OF FAST, BALANCED, SYNCED RAM @ 486MHZ WITH CUSTOM ON-THE-FLY COMPRESSION/DECOMPRESSION IS MATCHED OR BEATEN BY 64 MB OF SLOW DINOSAUR SDRAM @ 200MHZ

THE TEV UNIT BLENDS, SHADES AND COMBINES (GUESS WHAT A SHADER DOES). YOU'RE SAYING THE SHADER ISN'T THERE BECAUSE NINTENDO DOESN'T CALL IT A SHADER. DO YOU KNOW WHY????? THE TERM SHADER IS BILL GATES PC TERMINOLOGY. IT'S A DIFFERENT DESIGN, YOU BUFFOONS

THE TERM WINDOWS IS A WAY OF SAYING YOU VIEW AND USE AN APPLICATION IN A BOX DUBBED A WINDOW. YOU'RE NOW GOING TO SAY THE AMIGA, YEARS AND YEARS AHEAD OF THE PC, DIDN'T HAVE A POINT-AND-CLICK WINDOWED OS CALLED AMIGA WORKBENCH, JUST BECAUSE THE TERM "WINDOWS" ISN'T USED

THAT'S HOW MUCH SENSE A PC FANBOY MAKES. HE CONFUSES WORDS WITH CHIP-TECHNOLOGY FACTS. THE SAME APPLIES TO YOUR SHADER OBSESSION. TEV IS A SHADER, YOU UNEDUCATED BILL-BOX BRAIN-DAMAGED IDIOTS

I SUPPOSE A MAC ISN'T A PC, IS IT. SERIOUSLY, GET OUT OF BILL GATES' UNDERPANTS, YOU'RE EMBARRASSING YOURSELVES

TEV SHADES. TECH FACT

SO WE HAVE ESTABLISHED XBOX FANS RATE SDRAM ABOVE CUSTOM 1T-SRAM-R

THEY RATE CELERON CPUS, WITH OUTDATED, OVERHEATING, OVERSIZED DESIGN, ON PAR WITH CUSTOM MICRO DESIGNS WITH MORE THAN DOUBLE THE TRANSISTOR COUNT, DOUBLE THE CACHE, RISC, COPPER WIRE, SILICON ON INSULATOR, ETC ETC ETC. OK, I BELIEVE YOU, COUGH COUGH

THEY COMPARE A 133MHZ BUS WITH A 243MHZ BUS PLUS 4-TO-1 COMPRESSION

THEY COMPARE A 256K GPU CACHE TO A 3.2MB GPU CACHE. OK, COUGH, GIGGLE GIGGLE

THEY COMPARE 2007-DESIGNED CHIPS WITH THE LATEST EFFICIENCY/EFFECTIVENESS TRICKS TO OLD-SCHOOL, OFF-THE-SHELF, INEFFICIENT PC PARTS

PLEASE, YOU'RE SUCKING THAT CELERON POLE LIKE THERE'S NO TOMORROW. IT'S A CELERON, IT'S OLD DRAM, IT'S OFF THE SHELF. ARE YOU SERIOUSLY THAT BRAIN DAMAGED

SILICON ON INSULATOR ON ITS OWN IS GOOD FOR A 35% INCREASE IN A CPU'S PERFORMANCE, COPPER WIRE IS GOOD FOR 10%, A DIE SHRINK AND OPTIMIZATION IS GOOD FOR 10%, ETC ETC ETC

ADD ALL THOSE LITTLE TRICKS TOGETHER AND YOU HAVE 50% MORE POWER JUST BY IMPROVING YOUR MANUFACTURING PROCESS, NEVER MIND CLOCK SPEEDS ETC

COPPER BEATS ALUMINIUM, SILICON ON INSULATOR BEATS STANDARD, RISC BEATS CISC, BIG CACHE BEATS SMALL CACHE, FAST FSB BEATS SLOW FSB

IGNORING THE FACTS MAKES YOU AN XBOX FAN, PURE AND SIMPLE

FAST RAM BEATS SLOW RAM, FAST CUSTOM GAMING DISC DRIVE BEATS STANDARD DVD

COMPRESSED DATA BEATS NON-COMPRESSED DATA

88MB OF RAM IN WII TERMS IS LIKE 200 TO 256MB OF RAM IN XBOX TERMS. IT ISN'T LIKE FOR LIKE, IS IT, CONSIDERING WII'S RAM IS BY FAR THE FASTEST, WII'S DISC DATA STREAMING IS BY FAR THE FASTEST, WII'S COMPRESSION IS BY FAR THE BEST, AND WII HAS DUAL MAIN MEMORIES, A MASSIVE GPU CACHE BUFFER AND A BIG CPU CACHE

YET SOMEHOW, AS IF BY MAGIC, WII IS AN XBOX

POLITICS BY A NINTENDO-FEARING 3RD PARTY, AS I CORRECTLY STATED IN THE FIRST POST

IF IT ISN'T POLITICS, AND EA DON'T KNOW NINTENDO FANS WON'T BUY INTO CRAP, WHY HAS FIFA EURO 08 BEEN DROPPED FROM WII AND IS NOW ON PS3 AND X360 ONLY

I'LL TELL YOU: NINTENDO FANS DON'T BUY 6-MONTHLY CASH-INS, THEY BUY GAMES OF HIGH STANDARDS

EA KNOWS FULL WELL FIFA EURO 08 WILL BE DEALT A DEATH BLOW BY THE ONLY TRUE NEXT-GEN SOCCER GAME, A GAME THAT MAKES PS3 FIFA 08 LOOK LAME, WII FIFA 08 LOOK LAME, AND EVEN PRO EVO ON PS3 LOOK LAME

IT'S WII PRO EVO PLAYMAKER EDITION (THE ONLY NEXT-GEN SOCCER GAME AVAILABLE). THAT'S FACT. X360/PS3 FANS, YOUR CONSOLES ARE OBSOLETE

wiiboy101

On February 5, 2008 at 8:35 pm

TRY DOING 11-PLAYER POINT-AND-CLICK INTERACTION ON AN OUTDATED CONTROLLER. SORRY, IT CAN'T BE DONE, NO MATTER HOW MANY CELLS YOU HAVE, SONY

WII IS NEXT GEN, XBOX 360 AND PS3 ARE NOT. FACT :lol: :lol: :lol:

I DON'T BUY OBSOLETE CONSOLES

KINGPIN

On February 5, 2008 at 8:41 pm

I would rate wii's broadway cpu on par with a 1.5 ghz laptop cpu with 1mb cache and a nice fast bus, with no windows operating in the background. that's a lot of cpu for gaming-dedicated processing and a capped 480p resolution

expect big a.i., big physics and a lot of graphics co-processing. after all, crappy windows isn't in the way, nor is resolution overhead or sound processing. believe me, a 1.5 ghz celeron with a 400mhz bus as a dedicated gaming cpu would actually kick some arse

doesn't wii run its flash, its wifi and everything else via hardwired drivers and hardware on the system core? the hollywood is not just a gpu, it's a system controller as well, so the cpu acts as a dedicated gaming cpu. it's a great design :?: :?:

BranH

On February 6, 2008 at 12:27 am

Lol, not everyone is as confused about how things work as you are, Wiiboy. Not everyone is as clueless about how things function. Some of us can tell the difference between some fanboy's blog, filled with just as much falsehood and misinterpreted crap as you spew, and a legit impression of someone's experience with the hardware. Most developers anywhere will tell you the same things. Everyone from Itagaki, to Climax Studios, to Ubisoft, to EA, or Maxis. Not just the occasional honest developer on a forum.

You can read the specs yourself, as long as you're somewhat capable of looking at them objectively, and don't just latch on to bull numbers like your "16 stage shader tree" crap, whose functionality and purpose you obviously haven't been able to comprehend.
You can get to counting the flops and texturing units, factor all the different components into it, and conclude that Xbox > Gamecube. There is nothing mysterious about any component in either console. The T&L units, the TEV units, the vertex shaders, the pixel shaders, etc. The emboss bump mapping, the environment bump mapping, etc.
I could list the math that they compute per clock cycle, and what they do, and how they do it, what they multiply and add, and when, where, why, and how they do it.
You don't have to guess at any of it.

And it doesn't take much "looking things up" to determine you know nothing of what you speak, and that your 4x, 5x, 6x, 7x bs Xbox figures, which you magically attribute to the Gamecube, are pure fanboy fantasy.

It's even funnier that you dismiss anyone who was open about their experience programming for the two consoles as not knowing what he's talking about, or as having something against Nintendo, and at the same time link to some scho's blog on "Nintendo World Report" to somehow back up your argument, despite the fact that he's even more clueless than you are, Wiiboy.
I got a laugh out of that one, btw.

And I repeatedly told you the TEV unit was a shader. I never said it wasn't. You continue to make things up, apparently so you can pretend that you're correcting something.
And it's referred to as a "pixel shader" in DirectX, and a "fragment shader" in OpenGL. Notice the word shader is used in both, as is the word "shader" found in GLSL (the OpenGL Shading Language).
It's funny how you just spout off about things without even bothering to look up any piece of what you say. (a common theme for you)

I did say that the SINGLE T&L unit in Gamecube isn't capable of the same things as the vertex shaders in Xbox. (this is the real world after all)

And it seems not to sink in that no one here was saying the Wii is equal to, or less powerful than, an Xbox. Just that your ignorant, delusional hyperbole about its "awesomeness" is bull. You continue to confuse the two, Wiiboy.

And again, no one said sdram was superior to 1T-SRAM. (that's you making things up again)
Just that, once again, your pathetic hero worship of it, and all the magical benefits it supposedly gives, is off base. And pointing out that you cherry-pick your specs, factor in bull compression figures, and ignore that it was connected to a 128-bit bus, and that there was more than twice as much of it that was useful in an Xbox.

You even use it in your attempted comparison to an Xbox 360. lol.

C.

On February 7, 2008 at 5:04 am

Very interesting topic, to say the least. I must say that I hate illiterate fools and I hate ‘discussions’ that are borderline wars.

The funny thing about all of this is we have one well informed individual who has to constantly invoke the same exact response every other time to get his argument through the rock brain idiot that keeps on with his asian ebonics and derivative ‘fact’ sheets.

All the while in reality he is not making any points relevant because he is shouting like a DUMB ******* IDIOT THAT DOESN”T UNDERSTAND THAT PEOPLE WHO TYPE EXCLUSIVELY IN CAPS FOR TWO THOUSAND HALF WORDS DO NOT GET ANY RESPECT AT ALL. LEARN YOUR ENGLISH, LEARN HOW TO CODE GAMES, OR AT LEAST TRY TO UNDERSTAND THAT THE ART OF GAMING DOES NOT INVOLVE SPEC SHEETS AND THEN ACTUALLY ‘KNOW’ A BIT ABOUT WHAT YOU ARE ‘TRYING’ TO SAY. THEN HAVE AN ENGLISH DISCUSSION WITH ENGLISH PEOPLE OR **** OFF BACK TO WHERE YOU CAME FROM.

1. Nintendo does not innovate anymore, at all, in the hardware department. You are a brazen fool of garbage if you think otherwise. They use outdated hardware and archaic tools to make games that were fully possible on the original xbox. The wii has put out nothing that was not possible in some form or another on the xbox. You can throw around poly’s all you want, shaders too, the thing of it, though, the FACT, is that mario galaxy ain’t that pretty. It’s about the gameplay. It doesn’t even begin to approach games like Heavenly Sword or Uncharted. NOT EVEN CLOSE.

2. The gamecube was the worst piece of **** I ever owned and I have 7 games to prove that. Rogue Squadron was overrated tripe, and to constantly take That game out and make it look like some masterpiece is redundant and dumb. There were dozens and dozens of better games on the XBOX and the PS2 and even the cube. I have approximately 25 ps2 triple AAA games, some of them rare. that I kept, not including trash like FFX or GT4, games that really turned me off from those respective series’, though their graphics were some of the best. Games like Dragon Quest 8 and Ico were the forebearers of a new kind of gaming and that is where this industry is going, not the Nth mario or zelda that is forced down our collective throats. I say ‘our’ with the disillusion that wiifanboy includes himself with humanity and not above.

3. You ******* infidel: the PS3 will not be compared to the Wii ever again. EVER. The architecture of Sony's system is based on massive parallel computing that has been around forever, yet it is the most evolved piece of machinery gaming has ever seen, comparable on a microscale to computers that make the Wii look like an electron. More than likely we will be seeing Crysis on the PS3 in the future. It is not worth going over documents and cutting and pasting; simply watching GT5 in 1080p or seeing FF13 live will quell the fools that are so abundant and brash in this generation, which was claimed OVER the minute SONY put BLU-RAY in their system. Comparing the RAM of the PS3 to the Wii and making the Wii look better! HAHAHAH

Only a true idiot, the word idiot, look it up, would even try to make equations like this. You live in a false precept of a world half closed. Remain at your own risk wiifanboy.

4. Bran has conclusively proven that he is in all respects a genius compared to the antagonist in this discussion.

5. If the Wii was overkill, then mario galaxy would have 4xAA and not a single jaggy. And it would have had 140 stars. Do the math.

6. Please, please for the love of the Bhodisatvas, please don’t mention Factor 5 or the three main N franchises as the driving force behind all of your arguments. Are you a complete bastard fool, do you actually play games, are you a robot, or do you work for Nintendo? Are you aware that Sony has been number 1 for over ten years and has been making a couple of good games itself. Try them out one day. I promise, they are really good.

7. The Wii will be replaced in under three years, mark my words, and at that time the PS3 will be entering 299 territory for the lesser system, and most likely with a 20 million plus lead by 2010.

So, the question is, why do you persist WII fanboy. Why do you still write the same TEPID CUT AND PASTE GARBAGE. Are you really that far up N’s butt. Come one now, seriously, in your world everything wii is 1.8 times what it really is. Your man member must be about 7 inches, heh, taking into account the asian smallness, TOTALLY RASCIST I WAS KIDDING. Trying to lighten up the mood, eh.

NO…………….. fine.

8. Did I mention that Wiifanboy’s CAPS and all the cutting and pasting make him look like a ******* mental patient. Because they do, so DO THE MATH THERE, FRIEND. When people that know what they are talking about approach you in a discussion, they don’t sound like you, they don’t act like you, and they don’t cut and paste as much as you do. From all empirical evidence in my life, I say that you are a jester, ‘cept you think you are a king. Fine enough.

Also, HD is SO IMPORTANT to this generation; the claim that it isn't is preposterous. The PS3 is by all accounts going to be around for ten years and probably 300 million in sales, and it will win this generation because of BLU-RAY ALONE.

In three years wiifanboy will be having this same discussion about the Wii 2 and the PS3. Can’t wait for that backwards argument.

And Bran, you might have just as well left the situation, though I appreciate that you have had this discussion, it’s just that everyone has been over this ten times before, though, I have learned a few new things from the tech talk thank you.

Lastly, RE4 was not THE PINNACLE OF ANYTHING LAST GENERATION. YES THE WII VERSION IS AWESOME, BUT AGAIN, SO IS HEAVENLY SWORD. RE4 MAY HAVE BEEN PUSHING SOME POWER BUT THE VOICE ACTING AND STORY WERE TERRIBLE, NO ARGUMENT. GRAPHICS GREAT BUT NOT ANY BETTER THAN GOD OF WAR 2. SORRY WII FANBOY BUT ALL YOUR NUMBERS AND FILES AND QUOTES MEAN NOTHING WHEN ART DIRECTION IS THE FOREMOST FRIEND OF ANY TRIPLE AAA GAME. WHEN PROGRAMMING GETS THE JOB DONE, WHICH, IN LAY TERMS, MEANS FIDDLING WITH ALL THE NUMBERS YOU GUYS ARE TALKING ABOUT, THAT MEANS JUST ABOUT NOTHING. YOU STILL NEED GAMEPLAY CODERS, ARTISTS, VOICE ACTORS, SCRIPT WRITERS, DIRECTORS, ETC.

NO GAME IS EVER MADE WITH THEORETICAL ABSOLUTES IN MIND. ARGUING OVER THESE SYSTEMS IS POINTLESS, AND THAT IS WHY I APPRECIATE BRAN TRYING TO CONSTRUCTIVELY GUIDE THIS CONVERSATION. BUT IT IS NOT WORKING BECAUSE FOR WHATEVER REASON SOMEONE IS PLAYING THE FOOL.

PEACe.

C.

On February 7, 2008 at 5:53 am

wiifanboy, are you an asian? please substitute my racial categorizations for whatever race you come from.

Sorry for the inconvenience.

Prior comment has been reported as SPAM.

Good Luck and Good NIGHT

C.

On February 7, 2008 at 5:54 am

:twisted:

wiiboy101

On February 7, 2008 at 4:38 pm

ps3 pro evo 08, d-pad and buttons = no improvement over n64's iss 98 soccer

x360, analog, d-pad and buttons = again no real improvement over the pro evo concept first done in iss 98 soccer on the n64

HARDLY NEXT GEN

pro evo wii "playmaker edition": full real-time 11-player point-and-click tactics, strategy and play, impossible on any other gaming system, pc/ps2/ps3/x360/psp/etc etc

only wii is next gen

fps on x360/ps3: button bashing, d-pad and analog

fps wii controls: clearly proven and clearly next gen

loading times

x360 slow

ps3 slowerrrrrrrrrrrrrrrrrr still

wii: lightning-fast loading speeds, no harddrive required

devil may cry ps3: 22-minute install to get x360 loading times

mario galaxy: noooooooooo loading times, no harddrive

assassin's creed x360: longggggg loading times

wii kicks loading-speed ass. IT'S BECAUSE IT'S A NEXT-GEN GAMING DEVICE, NOT A DVD MOVIE MEDIA BOX, COUGH PS3 COUGH X360

YOU DARE CALL YOURSELVES HARDCORE GAMERS. NOOOOOOOOOO HARDCORE GAMER PICKS PS3'S FOUR-GENERATIONS-OUTDATED D-PAD BASED CONTROLS AND LONGGGG BLU-RAY LOADING AND OBSOLETE GAMEPLAY

YOU DARE CALL YOURSELVES HARDCORE GAMERS. X360'S PAD IS OUTDATED AND JUST A DREAMCAST PAD REDESIGN, AND AGAIN LONGGGG LOADING TIMES

WII: NO LOADING WAITS FROM DISC, WOW, AND A NEXT-GEN EVERYBODY-WINS CONTROLLER

PLEASE, ONLY Wii owners are hardcore gamers

GO BASH A D-PAD, SERIOUSLY

wiiboy101

On February 7, 2008 at 4:40 pm

LET'S MOVE THE ARGUMENT FORWARD. I WON ROUND 1, YES, RIGHT, AGREED

LET'S MOVE ON TO CONTROLS

MY CLAIM: ONLY Wii is a next gen console

PROVE ME WRONG. PLEASE, I DARE ANY PS3 FAN TO CALL HIS CONTROL PAD NEXT GEN

I DARE ANY X360 FAN TO CALL HIS DREAMCAST RIP-OFF CONTROLLER NEXT GEN

I DARE YA :roll: :roll: :roll:

wiiboy101

On February 7, 2008 at 4:42 pm

DOES ANY X360 OR PS3 FPS TITLE COME CLOSE TO WII'S PRIME 3 OR MOH FPS CONTROLS

I DARE THE FANWOMEN TO CLAIM THEY DO. ONLY WII IS NEXT GEN, THAT'S A FACT :lol: :lol:

wiiboy101

On February 7, 2008 at 6:24 pm

devil may cry: a 22-minute install and it's still poor against wii loading speeds. the ps3 doesn't know what the hell it is, a dvd player, a pc, a games console; it's all confused, that machine is a damn joke. 5gb installs just to get x360 loading speed

wii does it all from disc. guess why: it's a GAMEZZZZZZZZZZ CONSOLE AND IS DESIGNED AS SUCH

C.

On February 8, 2008 at 4:32 am

:idea:

C.

On February 8, 2008 at 6:13 am

Here is an interesting bit:

Planet GameCube: In a recent IGNinsider article, Greg Buchner revealed that Flipper can do some unique things because of the ways that the different texture layers can interact. Can you elaborate on this feature? Have you used it? Do you know if the effects it allows are reproducible on other architectures (at decent framerates)?

Julian Eggebrecht: He was probably referring to the TEV pipeline. Imagine it like an elaborate switchboard that makes the wildest combinations of textures and materials possible. The TEV pipeline combines up to 8 textures in up to 16 stages in one go. Each stage can apply a multitude of functions to the texture – obvious examples of what you do with the TEV stages would be bump-mapping or cel-shading. The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. We just used the obvious effects in Rogue Leader with the targeting computer and the volumetric fog variations being the most unusual usage of TEV. In a second generation game we’ll obviously focus on more complicated applications.
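
The "elaborate switchboard" description maps onto the general idea of a fixed-function, multi-stage texture combiner. Purely as a toy illustration (this is not TEV microcode, any Nintendo API, or the real stage limits; the operations and stage count here are made up for the example), a chain of combine stages can be modelled like this in Python:

    # Toy model of a multi-stage texture combiner: each stage applies one
    # operation to the running result and one texture sample, in order.
    from functools import reduce

    def modulate(a, b):
        return [x * y for x, y in zip(a, b)]

    def add(a, b):
        return [min(1.0, x + y) for x, y in zip(a, b)]

    def run_stages(base_color, stages):
        # stages: list of (operation, texture_sample) pairs applied in sequence.
        return reduce(lambda acc, stage: stage[0](acc, stage[1]), stages, base_color)

    # e.g. base lighting modulated by a diffuse texture, then a glow map added on top
    print(run_stages([1.0, 1.0, 1.0],
                     [(modulate, [0.8, 0.6, 0.5]),
                      (add,      [0.1, 0.1, 0.3])]))

The point of the quote is simply that the order of stages and the operation at each stage are under programmer control, even though the set of available operations is fixed in hardware.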

C.

On February 8, 2008 at 5:21 pm

sadly, I think Bran shot himself wiifanboy, thanks a lot

C.

On February 8, 2008 at 5:32 pm

I do think that Bran is underestimating hardware just a tiny bit; he has more knowledge than me but he lacks the experience, as do I, of working on the metal of a system like the cube or Wii, which, in reality, is as different as the PS3 vs. 360.

I am very curious as to his opinions on these systems, because I have fast learned that people underestimating the PS3 against 360 hardware are talking complete bull****. By any true tech's account, those schooled in IBM super processors, the PS3 is potentially 4x as powerful as the 360. That is no joke, folks. With what the PS3 has, including Blu-ray, many think it can compete with lesser-generation PIXAR films, which is saying a lot.

Anyway, you have to spend at least 2 plus years in cube or wii development to even harness the skills factor 5 is displaying, and I still think their gameplay is basically recycled 1994 ****, but that is just my taste, I guess.

So wiifanboy, good god, please shoot up again and start ranting, I am getting bored.

C.

On February 8, 2008 at 5:35 pm

PS3 pad = next gen

C.

On February 8, 2008 at 5:38 pm

okay okay, I was just baiting, don’t go boiling that spoon for that, but I must say the PS3 D-pad is king, the Wii Dpad is not that well made for CHRIST

I will say fanboy, I am a veteran/expert SSX player and BLUR was the best game yet

GREAT controls

have you played it

do you play video games wiifanboy

C.

On February 8, 2008 at 5:49 pm

:cool:

BranH

On February 8, 2008 at 10:00 pm

Already been over that "16 stage shader tree" / "8 texture layers in one go" quote.

What's meant by "in one go" relates to "single pass multi-texturing".
It just means you only need to set up the polygon once, and you apply textures to its pixels by recirculating them in and out of the frame buffer, i.e. on chip.

But if you were to recirculate a pixel to add 8 texture layers, you're cutting fill rate by 8. That's why the Climax developer pointed out you'll average 2 or 3 while keeping a decent frame rate. (they averaged 4 on an Xbox)
Sure, you could add 8 to some things, but you can add more than 4 on an Xbox too, as it does with Doom 3. Xbox is only capable of 1 loop back, but it adds two texture layers at a time, so adding 4 texture layers to a pixel cuts its fill rate in half, while it quarters the Cube's.
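
As a rough sketch of the arithmetic behind that paragraph (assuming, as described above, that Xbox applies two layers per pass and the Cube recirculates one layer at a time, and using the commonly cited ~932 and ~648 Mpixel/s peaks; real behaviour depends on the game, so treat this as illustration only):

    import math

    def effective_fill(peak_mpixels, layers, layers_per_pass):
        # Peak fill rate divided by the number of passes needed for `layers` textures.
        passes = max(1, math.ceil(layers / layers_per_pass))
        return peak_mpixels / passes

    for layers in (2, 4, 8):
        xbox = effective_fill(932, layers, layers_per_pass=2)
        cube = effective_fill(648, layers, layers_per_pass=1)
        print(f"{layers} layers -> Xbox ~{xbox:.0f} Mpix/s, Cube ~{cube:.0f} Mpix/s")

At 4 layers this gives exactly the "half versus quarter" split described above.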

A PS2 would "usually" have to reprocess the geometry to add each texture layer.

Anyway, the Gamecube being under programmer control doesn't make it programmable in the same sense as an Xbox. It's still considered fixed function. What that means is that they hard-coded specific algorithms into the chip. That's the way it was done on all gpus of the era. And there's a list of functions that the chip performs, posted by someone who had access to the chip docs. It lists all the math performed (multiplies and adds, etc.) and how many clock cycles each takes.
It also listed the total number of flops the chip is capable of.

There are even illustrations of the pipelines in a presentation by Factor 5.

Total flops was something like ~8 gflops.
Xbox shader hardware was more programmable and customizable.
Vertex shaders could handle static meshes or dynamic ones. They could utilize vertex arrays and vbos (not just display lists), pull data from ram, and, by the way, decompress compressed position, normal, color, matrix and texture coordinate data, etc.

And it didn't need to rely on its cpu in the same way that the Gamecube did.
In the end, Gamecube was competitive with Xbox, not superior.

And Wii doesn't add much from what I've seen. There's still a huge difference between PS3's Afrika in 480p and the African safari game on the Wii.
Sure, they're different games, but there are plenty more examples where that came from. Being capped at 480p doesn't seem to help the Wii compete much: not in paper specs, not in current games, not in theory, etc.

Random examples: gotta read where the highlighted words run together, as Google won't highlight the sentence for some reason.
http://64.233.167.104/search?q=cache:Pj7szb0MUbsJ:www.patentstorm.us/patents/7159212-description.html+On+the+Gamecube+platform,+a+similar+data+structure&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

http://64.233.167.104/search?q=cache:Pj7szb0MUbsJ:www.patentstorm.us/patents/7159212-description.html+%22Nintendo+GameCube,+but+a+rich+set+of+predefined+computational+elements%22&hl=en&ct=clnk&cd=1&gl=us&client=firefox-a

BranH

On February 8, 2008 at 10:10 pm

That was in relation to the 4-to-1 compression, i.e. vertex data being represented as bytes and shorts (8 / 16 bits) rather than full floating point.
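
To make the "bytes and shorts" point concrete, here is a minimal sketch; the ranges and scale factors are invented for the example, and nothing here is specific to any one console's vertex format:

    import struct

    # A vertex position as three 32-bit floats: 12 bytes.
    full = struct.pack("<3f", 1.2345, -0.5, 7.25)

    # The same position quantized to three 16-bit signed shorts over a known range: 6 bytes (2:1).
    SCALE16 = 32767 / 16.0  # assume coordinates fit in [-16, 16)
    shorts = struct.pack("<3h", *(round(c * SCALE16) for c in (1.2345, -0.5, 7.25)))

    # Or three signed bytes for very coarse data: 3 bytes (4:1).
    SCALE8 = 127 / 16.0
    small = struct.pack("<3b", *(round(c * SCALE8) for c in (1.2345, -0.5, 7.25)))

    print(len(full), len(shorts), len(small))  # 12 6 3

Quantizing attributes like this is an ordinary, widely supported trick, which is exactly the point of the clarification above: the 4:1 figure is not some exotic bus technology.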

Anyway, I doubt you'll see much more than a Gamecube-versus-Xbox-sized difference when it comes to PS3 and 360.
As it is, PS3's gpu needs its cpu to keep pace with 360's gpu in graphics.
(the current, general developer consensus)

BranH

On February 9, 2008 at 1:59 am

Controls, Wii easily takes it. PS3 hasn’t really shown much with software, but it’s a step above 360 at least.
I wouldn’t have said Wii’s controls were worth it, until Prime 3, and probably MOH.
Wii is easily the most “next gen” in terms of controls.

Load times depend on how much the developers want to avoid them and design the engine accordingly.

Such was the case with Uncharted: Drake’s Fortune for example.

C.

On February 9, 2008 at 2:20 am

Yeah, I know we have, but no one understood what the **** was going on amid all the babble. He sums it up in one paragraph where before it would take……. you get the point.

Honestly, I VERY much disagree with you on that last quote Bran, you need to think about the hardware and the developers more than you are.

The PS3 by many accounts–and I am not sure where you are going with that xboxcube comparison, it is quite dated–has many advantages over the 360 from Bluray alone. Remember that the SKU’s, when fully optimized, will be like Brand new super efficient tools in a developer like ND’s hands. Super efficient tools on the PS3 means super super games in 1080p.

10 years, man, ten years.

All of your comments have been intelligent; I just don't see how you think DVD9 is anywhere in the league of Blu-ray. We are talking 40 extra gigs here, man, and that is a lot of gameplay, physics, etc.

Have you read any of IBM’s technical documents, because they really showcase, from what I can understand, the ‘potential’ power that people like Insomniac and Naughty Dog are pulling out of the metal.

Seriously, mark my words, Uncharted 2 will have no equal on the 360, EVER. If you think that statement is wrong then I suggest you are a 360 fanboy yourself. No harm meant.

Uncompressed audio, being an AUDIO FREAK, is very important to me and the 360 WILL NEVER DO THAT in its current iteration.

Of course the Wii does uncompressed audio like ice cream. Fast and smooth. :wink:

And wiifanboy, the Wii’s lightning fast memory ‘might’ not be able to handle Uncharted 2, either, so don’t go getting all crazy with the horse.

As for graphical power, yes, they are on a plain, but please take the PS3 into account for more than that. It has Bluray and it has full HD across the SKU’s, and I really think that this will be the factor that pushes it far past what the 360 can do. If the 360 has these then yes, it would be more of a game.

And yes, trust me when I say I have no qualms about saying, at this point, that the Xbox is more powerful than the Wii. No question, really, look at the games.

NG Black still holds up to 360 games for ****’s sake.

Anyway, I have been long looking for an accurate picture of the Xbox or PS2 versus Wii. What do you think the potential of the Wii could eventually develop into before the sequel system?

Do you think we could see a game like Black on the Wii. Do you think that the ubiquitous shaders of the Xbox are just too much for a Wii game to look comparable?

C.

On February 9, 2008 at 2:30 am

I also think the PS3 has the best controller, while we are at it. DualShock 3 really has no equal in my opinion. Versatility, motion, rumble, and comfort. Grade A. Though tennis on the Wii cannot be beat, I will admit; the Wii just doesn't have the power this generation to completely gain my attention.

Can you imagine if N would have bit the snake and released a PSwii combo. The power and the controls…….. the competition would be fierce.

I am a 360 fan, but not of their designers, so you won’t find me being as nice; I hate Microsoft’s gaming division with passion. And Vista will never be OS X for what it is worth, not for at least a couple years as far as innovation goes. I know M has innovated, especially on the web, but they have really fallen back on laziness in this new century.

C.

On February 9, 2008 at 2:35 am

Just about to get ‘into’ Endless Ocean. Got it for 20 bucks with google checkout. Since I am on a reefer break it will be completely caffeinated.

YAY.

I hear the graphics are a step up, actual Xbox quality. Blur, for what it is worth, was close to mid XBOX quality…. sadly I finished it so fast. People like me eat those games for breakfast.

BranH

On February 9, 2008 at 9:26 am

Yeah, I understand the Blu-ray storage.
And it’ll likely begin to show more and more as the generation moves along. And I’m not saying it won’t become really noticeable, but I still expect the Gamecube vs Xbox comparison to hold up for most of it.
A dual layer dvd had several times more storage than Gamecube’s mini-disk, and had 5.1 surround, compared to Gamecube’s stereo derived “surround”, etc..

I think expecting a huge gap between 360 and PS3 any time soon, is expecting too much from PS3.
We’ll see though.

And I don’t see much legit 1080p being used in the future, as it’s a waste, and the PS3 doesn’t have all that much fill-rate and bandwidth to spare on it. I’d rather they target 720p, without cutting corners, and utilize 4xaa, etc..

wiiboy101gezza

On February 11, 2008 at 12:57 pm

http://planetquake.gamespy.com/View.php?view=Editorials.Detail&id=209

OH LOOK, SOMEONE WITH AN HONEST ARTICLE. THEORETICAL MEANS DIDDLY-SQUAT

WII IS VASTLY MORE EFFICIENT AND EFFECTIVE THAN XBOX. WII IS A BANDWIDTH MONSTER COMPARED TO XBOX AND WII'S RAM IS LIGHTNING FAST COMPARED TO XBOX, SO GO BY THE IDEA OF "IN GAME" EFFICIENCY/PERFORMANCE

WII KILLS THE XBOX. WII ACTUALLY GETS CLOSE TO ITS FILL RATES; XBOX, LIKE ALL STANDARD DESIGNS, SIMPLY CANNOT

RAM/LATENCY/BANDWIDTH: ALL HIGHLY IMPORTANT TO GPU IN-GAME PERFORMANCE, AND WII SHEER KILLS XBOX 1

SO AGAIN, LIKE I STATED IN POST 1

WII IS FAR MORE POWERFUL THAN XBOX, 2.5 TIMES AN XBOX/CUBE-LEVEL MACHINE OVERALL. THAT'S SERIOUS POWER AT 480P

IGN'S BOOM BLOX PREVIEW: YET AGAIN WII IS KICKING SWEET PHYSICS ASS AND DOING IT ALL WITH DIRECT WIIMOTE INTERACTION

:lol:

wiiboy101gezza

On February 11, 2008 at 1:00 pm

SERIOUSLY, GO BUY MORE SHOOTERS, X360 GOONS. RACER/SHOOTER/RACER/SHOOTER. YOU'RE SUCH INFORMED GAMERS, AREN'T YOU, XBOX TEENAGERS, BUYING THE SAME TEEN-AIMED MARKETING CRAP OVER AND OVER

LET'S ALL GO OUT AND BUY FORZA AND GOTHAM STREET RACERS, THEN GO OUT AND BUY HALO 3, THEN GO OUT AGAIN AND BUY COD 4

WHATEVER. FPS, RACER. ISN'T YOUR CONSOLE CHOICE A TAD REPETITIVE AND OUTDATED AND BORING AND LAME

GROW UP AND GET A WII

BranH

On February 11, 2008 at 1:57 pm

All that article is going over is people listing texel fill-rate as pixel fill-rate. No revolutionary discoveries going on there.
Theoretical pixel fill-rate is directly tied to the ROPs, and texel fill-rate is directly tied to the texturing units in the pipeline. They are tied together in the Gamecube and Xbox.
Xbox had two per pipe, Gamecube had 1. Simple.
The only real argument people had was the idea that a Cube could do tri-linear filtering at full speed, while on Xbox doing tri-linear would cut its texel rate in half. Which, even if it did, would still leave it higher than the Cube's, making theoretical maximum tri-linear fill-rate on Cube 648 million, and Xbox's 933 million.
Toss in some efficiency gains on Gamecube, and they'd be comparable.
But, as erp explained, that's not the case. Xbox could do tri-linear as fast as it could do bi-linear much of the time, so the tri-linear figure becomes less relevant.
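
For reference, the peak numbers being thrown around here follow directly from clock x pipelines x texture units per pipe (a back-of-the-envelope sketch using the commonly cited ~162 MHz and ~233 MHz core clocks and 4 pixel pipes each; not a measured in-game figure):

    def peak_rates(core_mhz, pipes, tex_units_per_pipe):
        pixel = core_mhz * pipes                       # Mpixels/s
        texel = core_mhz * pipes * tex_units_per_pipe  # Mtexels/s
        return pixel, texel

    print("Flipper:", peak_rates(162, 4, 1))  # (648, 648)
    print("XGPU:   ", peak_rates(233, 4, 2))  # (932, 1864)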

On paper Xbox had a higher peak, in benchmarks it had a higher peak, in games it had a higher peak, etc..

And I would hope Wii outpaces an Xbox. It's pretty lame that you have to look closely, and cherry-pick games, to prove next-gen hardware is more powerful than last-gen hardware.

And I already own a Wii, (and a Gamecube), and I enjoy the games, but there’s not much variety there.
RPGs, fighting games, racing games, etc..
There is no comparing the library of the 360 and the Wii across the board.

***LETS ALL GO OUT AND BUY FORZA AND GOTHOM STREET RACERS THEN GO OUT BUY HALO 3 THEN GO OUT AGAIN AND BUY COD 4***

That's called variety and choice. You can play the newest Soul Calibur, Virtua Fighter, or Street Fighter; it's up to the player.
GTA4, Ninja Gaiden 2, Halo 3, Call of Duty 4, Oblivion, The Orange Box, Mass Effect, Gears of War, Alan Wake, Bioshock, Resident Evil 5, etc. All up to the player.

Same goes for the PS3. Wii doesn’t overshadow anyone at variety of games.

BranH

On February 12, 2008 at 1:36 am

I’ll go ahead and post this for you Wiiboy.

“We want to push the hardware. I think for us it’s relatively easy for us to push the hardware. It inherently comes. But a lot of it is about exploiting the uniqueness of the Wii. I mean, on the graphical side, we’re going to try and do everything to outdo everything else on the platform, the same as we did for the Star Wars games back on the GameCube,” said Eggebrecht. “But one of our main focuses is the innovation around the controls. Everybody is always talking about the motion control, but I think people are overplaying that a bit. I really, really love the pointing aspect of the remote. Although we’re going to use everything for what we have in development, I think the pointing stuff is probably the biggest innovation which we’re working on right now.”

“Asked about the Wii game’s state of completion, Eggebrecht responded: “We’re pretty much at a state where we’re almost done with the engine. At the same time, we’ve also been working on content quite a bit because we had enough running very quickly on the platform that we were able to. But the biggest milestone or mark right now is that we’re almost done with the engine and it does everything that the PS3 did and then some, quite frankly. So we’re pretty happy with that.”

Lol.

Riotr

On February 12, 2008 at 11:31 pm

Hey, I got a question: even though the Xbox is more powerful than the Gamecube, what the hell is up with Resident Evil 4, Star Fox Adventures and Rogue Leader? Those games are probably the best looking games of last gen. And they fit on one (Resident Evil is 2, I think) of those 8 cm discs. Is it possible for a system to be more powerful than another, but for the other system to have way better looking games? Honestly, Resident Evil 4 blows any Xbox games I've seen or played out of the water. As for Halo 2, I wouldn't even call that game good looking lol; Ninja Gaiden was amazing though.

3ds Mac

On February 13, 2008 at 12:32 am

School A has a better overall education system than school B. They have better facilities, budget, better resources, better trained teachers, etc..
The top ten students were collected from both schools. 2-3 out of the ten were from school B.
Therefore, we have concluded that school B is just as good as school A. At everything. No differences at all.

And I'm not sure where you'd get "way better looking games", as you'd be ignoring a list of them on Xbox. For each one you listed, there are a couple on Xbox that easily compare to them.

Metal Gear Solid 3, God of War 2, Zone of the Enders 2, Jack3, Shadow of the Colossus, and Ico look better than the vast majority of Gamecube games.
PS2 must actually have been equal to, or better than, the Gamecube after all. Silly developers, got it all wrong all this time.

And Half-Life 2: Lost Coast looked better when I played it on a system that was several years old than many other games, by totally different developers, running on newer hardware.
My PC must have been secretly awesome, and easily outperformed newer systems.

3ds Mac

On February 13, 2008 at 12:47 am

*Jak 3

wiiboy101gezza

On February 13, 2008 at 8:21 pm

well, let's await pro evo wii and see the true meaning of next generation

let's await factor 5's wii project and how it will show the ps3 isn't needed, and all with motion play impossible on all other formats, and much much shorter loading times

just wondering how come gamecube's rogue squadron was 60 frames yet lair is only 30, with disastrous drops i may add

also, all wii LOOKERS, galaxy/smash/prime 3 etc, run at 60 frames, while ps3's best lookers seem unable to hold 30 frames. HD slide shows don't impress me one bit, nor do tears, glitches, bugs, patch-requiring games etc etc etc etc

THERE'S ONLY ONE NEW/NEXT-GEN CONSOLE, IT'S WII. STATING OTHERWISE IS A MORON STATEMENT. LOADING TIMES AND PATCHES AND D-PADS AREN'T NEXT GEN

SOFTWARE RUNNING FINE WITH NO LOAD TIMES AT 60 FRAMES WITH 3D MOTION IS NEXT GEN

THAT'S COMMON SENSE

NEXT YOU'LL BE SAYING PSP IS BETTER THAN DS HMMMMMMMMMM

THE D-PAD AND SCREEN WAR IN HANDHELDS WAS SETTLED 20 YEARS AGO, GAMEBOY WON. SO 20 YEARS LATER SONY FIGHT A DEAD FIGHT, HMMMMMM, D-PAD AND SCREEN, WHEN THE HANDHELD HAD BECOME A NEW THING

PLEASE THINK OUTSIDE THE BOX AND BUY GAMES OUTSIDE THE SHOOTER GENRE. IT'S CALLED GAMING, NOT SOLDIER SIMULATING :roll:

3ds Mac

On February 13, 2008 at 10:08 pm

Frame rate comes down to the amount of work being done per frame.

If you stuck to Wii-level visuals (even in 720p) you could have zero load times and a few hundred frames per second easily on any other next-gen console.
But no one wants to restrict themselves to such a limited palette. They'd rather have more ambitious visual effects going on on screen, and use things like hdr and normal mapping, etc.
None of Wii's lookers are lookers because of technological achievements in visuals.

Riotr

On February 15, 2008 at 5:23 pm

ps3 with dmc4 has no load times….wii does have load times; the doors in mp3 are the load times, sometimes i can wait 10 seconds before one opens

also, play warhawk and you'll see ps3 motion controls aren't as bad as you think

btw 3ds mac, i didn't own a ps2 last gen (i know, huh?) so i can't really comment on those games.

wiiboy101

On March 11, 2008 at 6:59 pm

xbox 1 celeron 733mhz, level 2 cache 128k, level 2 cache bandwidth around 3 gb

wii broadway, level 2 cache 256k, level 2 cache bandwidth around 23 gb (4-to-1 compression). yet xbox is as powerful as wii, or more, depending on who's telling the lies

but we can clearly see wii's cpu cache bandwidth destroys xbox, yet it's somehow still weaker HMMMMMMMMMMM COUGH

COUGH FANBOYS

wii fsb 243mhz plus 4-to-1 compression = 7.78 gb, pushing almost 8 gb

xbox fsb 133mhz, no hardware compression trick, a weak 1 gb

so if wii's bus and wii's cache out-bandwidth an xbox many many times over, where is the power that makes xbox a wii

IT CLEARLY DOES NOT EXIST

wii disc: dual-layered dvd with custom priority files and compression, 7GB PLUS. that's 5/6 times bigger than gamecube's mini disc. with custom compression at 480p the wii disc out-stores an xbox disc 4 times over
and loads much much much much quicker

gamecube out-loaded xbox and wii out-speeds gamecube, soooo it's much faster at loading and streaming data, FACT

DATA IS SENT THROUGH XBOX NON-COMPRESSED, RIGHT, GOT IT

TEXTURES ARE SENT THROUGH XBOX NON-COMPRESSED (BUT DISC-STORED AND RAM-STORED COMPRESSED, BEFORE A MORON TRIES CORRECTING ME)

WII DEALS WITH DATA AT MAX 4-TO-1 COMPRESSION THROUGH THE WHOLE SYSTEM
WII DEALS WITH TEXTURES AT 6-TO-1 COMPRESSION THROUGH THE WHOLE SYSTEM. ALL COMPRESSION DECOMPRESSES AT THE PROCESSORS IN REAL TIME, NOT POSSIBLE ON AN XBOX (X360 CAN DO SOME OF THIS, IT WAS RIPPED OFF FROM GAMECUBE, FACT)

SO HOW IS XBOX AS POWERFUL AS Wii? IT CLEARLY ISN'T

WII'S RAM BANDWIDTH: 4GB TIMES 2, TWO RAM POOLS, BOTH 4GB, NOT COUNTING GPU/CPU CACHES ETC, JUST RAM

WELL, IF WE'RE TALKING DATA IT'S 4 GB TIMES 4 = 16GB, SO IF THE DATA BANDWIDTH PEAK IS 16 GB TWICE OVER, THAT'S 32 GB BANDWIDTH VS 6.4 ON XBOX, CLEARLY WAY WAY MORE

XBOX DATA: NO CUSTOM COMPRESSION

WII DATA: 4-TO-1 ON-THE-FLY COMPRESSION/DECOMPRESSION, 32GB VS 6.4 IN DATA TERMS (NOT TEXTURES)

PLEASE WAKE UP OR SHUT UP, IDIOTS

A NAFF CELERON WITH NAFF ABILITIES ON A NAFF BUS WITH NAFF RAM DOES NOT COMPETE WITH WII AT ALL ZZZZZZZZZZZZZZ. GOING BACK TO SLEEP, YOU BORE ME WITH YOUR BILL GATES IN THE NUDE POSTERS, MICROSOFT JOCKEYS

wiiboy101

On March 11, 2008 at 7:09 pm

BROADWAY TRANSISTOR COUNT: OVER 20 MILLION, TIGHTLY PACKED INTO AN EMBEDDED, ULTRA-EFFICIENT MPU (MICRO PROCESSOR UNIT) USING COPPER WIRE AND SILICON ON INSULATOR

SHRINKING A SET OF TRANSISTORS INCREASES PERFORMANCE, SHRINKING THE DIE INCREASES PERFORMANCE. BROADWAY IS MODERN AND TINY COMPARED TO AN OLD-FASHIONED CELERON

COPPER WIRE BEATS ALUMINIUM BY 10%-PLUS IN DATA FLOW PERFORMANCE

SILICON ON INSULATOR BEATS NON-SOI CPUS BY 35% PERFORMANCE

RISC BEATS CISC BY 50% TO 100% AT THE SAME CLOCK SPEED, EVERYONE KNOWS THAT

XBOX CELERON: ALUMINIUM (OLD, OUTDATED), NON-SHRUNK, HIGH-TEMPERATURE, OVERSIZED, BIG-ASSED CPU USING NO MODERN MANUFACTURING TRICKS OR COMPRESSION

BROADWAY CPU: RISC, CUSTOM SPEC, TINY CHIP, SHRUNK DIE, COPPER WIRE / SILICON ON INSULATOR, 2007-SPEC CPU

TRANSISTOR COUNT, XBOX CELERON: 9 MILLION

TRANSISTOR COUNT, BROADWAY CPU: OVER 20 MILLION

BUT XBOX BEATS WII, DOESN'T IT
ARTICLE PROVEN AS FAKE-ASSED 3RD PARTY LIES :lol: :lol:

3dsMax

On March 13, 2008 at 11:48 pm

Wiiboy, you have got to be one of the most delusional, hard-headed people I have ever seen take interest in any of this stuff. The autism comments weren’t far off.
You continue to regurgitate the same old e you’ve continually had shot to pieces.
I'm not going to take the time to analyze every little thing in this thread, but it doesn't take a tech genius to see you understand almost nothing of what you re-type, and you don't really want to. It's not even entertaining anymore. You spout about the cpu sometimes using bytes and shorts to send data to the T&L units as if it were some sort of "custom compression". There is no ultra-advanced "custom compression" used in the Wii, of any type. (not even mentioning Gamecube) It's all run-of-the-mill technology that's been used and reused for years. (and sometimes discontinued because it's old and out of date)
Why you continue on with that, is puzzling.
All textures stored on disc, held in ram, or moved over any bus are compressed. Always have been, always will be.
I think it was pointed out to you that geometry data uses compression all the time, and that typical vertex shaders are capable of reading compressed data directly from ram; they don't need to be directly fed by the cpu the way Gamecube (and Wii) do. That's the 4-to-1 compression you're jabbering about… That's not some special, super-secret Nintendo custom compression that no one else knows about.

And you've applied your misinterpreted byte-code compression ratio to general storage of all data now? So how come you're not championing "displacement mapping" or "nurbs" or "cube mapping" like some of the other Nintendo fanboys used to covet?

And all these "grasping at straws" bs comparisons… Why do you do it?
The disc load speeds? You did notice that Xbox has twice the ram to load, right? Sure you did. And despite this, there wasn't much if any difference between Gamecube and Xbox loading on multi-platform games.

And I'm sure you've noticed, but no one really cares about the Wii's graphics capability enough to really worry about whether or not it's better than Xbox. The graphics are ugly and unimpressive at any res by today's standards, and that's all that really matters.
You should be spouting about the gameplay, not graphics technology.

Fanboy logic….Holy e, you my friend, have got issues!!

wiiboy101

On March 14, 2008 at 12:32 pm

has the EA BIASED MARKETING LIE been exposed as just that? yes. END OF ARGUMENT

wiiboy

On March 19, 2008 at 7:07 pm

the issue lies with 3rd party spokespersons telling blatant lies, and the issue is xbox fans believing those lies. how do the only honest posters HAVE ISSUES? it's the x fans and the ea guy with the ISSUES, the issue being

no one can stand nintendo's dominance, full stop :lol: :roll: :evil: :evil:

wiiboy

On March 22, 2008 at 6:33 pm

could someone explain the more-ps3-than-xbox-1 visuals in wii's sky crawlers, please, given the XBOX FANBOY-DERIVED "FACT" THAT Wii IS AN XBOX

considering sky crawlers' graphics are far more x360/ps3 level than anything ever on xbox 1

FUNNY HOW Wii IS DOING THIS, CONSIDERING THE MORE INFORMED XBOX BEDROOM FANBOY-TEEN SAYS IT CAN'T BE DONE. funny, i clearly see near ace combat 7 visuals in sky crawlers footage and screenshots hmmmmmm. the game must not actually exist then, as the xbox fans have already pointed out wii can only do xbox graphics, even though many gamecube games crapped all over said Xbox visually hmmmmmm FANBOY ALERT :lol:

wiiboy

On March 22, 2008 at 6:37 pm

SO SKY CRAWLERS MUST BE A FAKE THEN. ROGUE SQUADRON 3: NEAR 20 MILLION POLYGONS, 8 TEXTURE LAYERS, 512X512 TEXTURES, BUMP MAPPING, 60 FRAMES A SECOND

BEST RESULTS EVER ON XBOX: 15 MILLION POLYGONS, IF THAT, AT ONLY 30 FRAMES A SECOND

SKY CRAWLERS ON WII GOES BEYOND THE ABOVE, BUT IT MUST BE MAGIC, CUZ THE INFORMED, INTELLIGENT XBOX FANS STATED IN THEIR MIGHTY WISDOM IT ISN'T POSSIBLE

GIGGLE. SORRY, POINT AND GIGGLE AT XBOX FANBOYS :cool:

wiiboy

On March 22, 2008 at 6:43 pm

GAMECUBE OUT-POWERED XBOX, YET WII IS EQUAL TO OR LESS THAN XBOX. THAT'S MIND-BENDING, BECAUSE WII PLAYS GAMECUBE GAMES HMMMMMMMMMM

F-ZERO GC: BETTER GRAPHICS, WAY WAY WAY MORE SPEED THAN ANY FUNKY RACER ON XBOX, EVER

RES EVIL 4 LOOKS TWICE AS GOOD AS NINJA GAIDEN ON XBOX

WIND WAKER STOMPS ON ANYTHING IN CEL SHADING EVER SEEN ON XBOX

ROGUE SQUADRON 2/3 BOTH OUT-POLY, OUT-FRAME, OUT-TEXTURE-PASS AND OUT-COLOUR ANYTHING ON XBOX

GAMECUBE LOADED FASTER THAN XBOX 1 EVEN WHEN XBOX 1 USED ITS HARDDRIVE. SAME CASE WII VS PS3. YET ANOTHER FACT

BEYOND GOOD AND EVIL, UBISOFT'S ONLY TRUE GAMECUBE-OPTIMIZED GAME, LOOKS BETTER THAN THE XBOX VERSION AND STOMPS ON THE PS2 VERSION, AND ALSO LOADS QUICKER

TWILIGHT PRINCESS, GAMECUBE'S GOODBYE GAME, CLEARLY AND BLATANTLY CRAPS OVER ANY ACTION RPG ON XBOX 1, INCLUDING FABLE. TWILIGHT SLAPS IT TO BITS, FACT

BUT AGAIN, AS IF BY MAGIC, POOF, WII IS AN XBOX

HEADS, THE LOT OF YOU

wiiboy

On March 22, 2008 at 6:50 pm

LET ME GET THIS RIGHT: XBOX READS TEXTURES FROM MAIN RAM WITH ITS 6.2 GB BANDWIDTH, YES, RIGHT, BUT IT SHARES THAT BANDWIDTH WITH ALL OTHER TASKS AND ALSO HAS TO Z-BUFFER AND FRAME-BUFFER FROM THAT SAME MAIN RAM AND BANDWIDTH, SO IT LEAVES LIKE 1/2 GB FOR TEXTURE READS, RIGHT, YES

WII READS TEXTURES AT 512-BIT, 16 GB BANDWIDTH, PLUS 6-TO-1 COMPRESSION AND VIRTUAL TEXTURING (A 50% INCREASE IN TEXTURE EFFECTIVENESS). SO WII'S 16GB PLUS COMPRESSION AND VIRTUAL TEXTURING IS A DAMN XBOX? THERE'S A BIG DIFFERENCE BETWEEN A 512-BIT, 6 MB, 16 GB BANDWIDTH READ TIMES 6 FOR COMPRESSION, AND XBOX'S 1/2 GB OF RAM BANDWIDTH

ALSO, XBOX READS TEXTURES FROM SDRAM (GIGGLE). WII HAS EDRAM 1T-SRAM-R, AND MAIN RAM 1T-SRAM-R BACKED UP BY GDDR3

XBOX FANS, HAVE A NICE CUP OF TEA, YOU'RE BEATEN :wink:

wiiboy

On March 22, 2008 at 6:55 pm

PS2 FSB BANDWIDTH: 1GB PLUS. XBOX FSB BANDWIDTH: 1GB PLUS. DREAMCAST FSB BANDWIDTH: 800MB

WII FSB BANDWIDTH: ALMOST 8 GB. 1 + 1 = 2, + 0.8 = 2.8, SO 2.8 GB OF BANDWIDTH COMBINED VS WII'S 8GB OF BANDWIDTH

BUT HONESTLY, WII IS AN XBOX, HONEST IT IS

OFF AND DIE, YOU MICRO LIARS

CLEARLY WII'S BUS AND RAM SPEED KICK 480P ASS. THE WII'S FSB IS LIKE A GAMECUBE, PS2 AND XBOX COMBINED, YOU TARDS

3DSMac

On March 22, 2008 at 10:39 pm

Lol, you continue with crap that was shot down already. Nice.

You did have it explained that an Xbox added textures twice as fast, performed a loop back on internal bandwidth, and only wrote to main ram after all textures were combined, right?
It didn't need on-chip ram the same way Gamecube did, to write to and read from. Gamecube was designed to need it. If it tried to do its graphics work between the gpu and main ram, the way it does to edram, it would have been totally crippled.
People are right, Mario Sunshine's textures looked terrible in most places. Prime's looked fine, but there were no bump maps anywhere.

And I have both an Xbox and a Gamecube, and a few games on both, and I really don't see much difference in load times.
This, despite the fact that Xbox had nearly twice the ram to load.

And glancing at your lol "math" attempting to show awesome texture processing and bandwidth: most of the textures in Gamecube games, and now Wii games, are unimpressive from any technical standpoint (that's part of my current job, as a matter of fact) and look poor much of the time.
The answer is, the hardware isn't impressive. Not on paper, not in practice. Face it, sub-GeForce4 level is all you'll ever get out of a Wii. But perhaps it'll benefit (eventually) from not being gimped by running cross-platform with PS2. (the way you'll find to be the case with games like Ace Combat 5 on Xbox)

Not everyone is as delusional as you are, Wiiboy. Most aren't crippled by fanboy logic, and are capable of doing math properly, without pathetically attempting to laughably inflate their console of choice's capabilities.

3DSMac

On March 22, 2008 at 11:11 pm

ps, a die shrink in any processor allows the chip to run cooler and use less power. It doesn't become much better at processing in any significant way.
What it does do is allow it to be clocked faster within the same heat and energy profile.
Any increase in performance would come from a clock frequency increase.

***BEYOND GOOD AND EVIL UBISOFTS ONLY TRUE GAMECUBE OPTERMIZED GAME LOOKS BETTER THAN THE XBOX VERSION AND S ON THE PS2 VERSION ALLSO LOADS QUICKER***

And once again, a game that was Xbox optimized would run like absolute garbage on a Gamecube without being gutted. Normal maps would go bye-bye, as would some of the vertex processing.

Face it, Nintendo repackaged Gamecube hardware and sold it to everyone for 250 bucks. They decided on that price after seeing where the PS3 was priced. They've said it themselves: they no longer believe in losing money on hardware.
They like the idea of turning a profit from the beginning. And that meant recycling the Gamecube's design. The chipset is likely far cheaper to manufacture than Gamecube's was when it launched, which would offset the cost of adding additional ram. It's good from a business standpoint, not so much from a developer or graphics artist standpoint.

That is reality. Some people see it for what it is. Others (like yourself) seem to have this idea that the hardware is "awesome". It's really not, and never will be. You're not winning any kind of argument here.

Everyone has said Wii is more powerful than an Xbox.
But the fact that we have to break out the old magnifying glass and study the graphics as closely as possible to find evidence would tell even the most technically ignorant person that it's not by much. In its best games, it performs right about where its specifications show it to be.
There is no awesomeness anywhere to be found.

wiiboy

On March 23, 2008 at 3:24 pm

xbox reads textures uncompressed. the xgpu cannot read a compressed texture, only a standard uncompressed texture, from slow ram at 3 gb if that

wii reads textures fully compressed at 6-to-1 and processes and decompresses those textures in the gpu in REALTIME, impossible on xbox

wii's texture cache = 1mb, plus 6-times compression = a 6mb effective texture cache (xbox: 128k to 256k texture cache, zero compression), clearly 24 times bigger than xbox's. and its 32 cells at 16 bits each = 512-bit, 16gb bandwidth, times 6 for compression = 96gb bandwidth texture read

texture read on xbox: a few gb if you're lucky. wii texture read peak with compression: 96 gb bandwidth

96gb beats a few gb here and there many many times over

wiiboy

On March 23, 2008 at 6:24 pm

halo 3: 30 frames a second (poor)

prime 3: 60 frames a second (sweet). that's twice halo 3's so-called 360 power, yes, twice the framerate

halo 3: an outdated, unsuited standard control pad, both obsolete and unrefined and not suited to fps play

prime 3: sweet-as-sugar pure fps/fpa wii controls, gorgeous to use, plus the nunchuk's sweet-as-candy analog stick (true fps controls, true pick up and play, true hardcore fps shooting), possible only on wii, no other games device can do it

halo 3: generic menus, options and button bashing

prime 3: a realtime in-game hud with realtime weapons, visors and hypermode options, all operated in the hud, in game, in realtime (ps3's e3 REALTIME WEAPONS CHANGES: SONY HYPED IT, NINTENDO DELIVERED IT)

HALO 3: LONGISH LOAD SCREENS AND LOAD WAITS, EVEN WITH A HARDDRIVE

PRIME 3: FAST LOADING, NO WAITING ABOUT, EVEN WITHOUT A HARDDRIVE (ADMITTEDLY SOME LOAD WAITS IN COMPARISON TO PRIME 2; AS YOU CAN SEE, NINTENDO FANS ARE HONEST FANS)

HALO 3: NOT TRUE HD, JUST STANDARD HIGH RESOLUTION. IT'S NATIVE 640P, NOT 720P, IT'S UPSCALED. SOOOOO MUCH FOR HD, YOU GOT DUPED, XBOX FANS

PRIME 3: TRUE 480P, NO LIES, NINTENDO STATED THAT EXACT RESOLUTION. PUNKED, XBOX FANS, PURE PUNKED. OH YES, AT 60, NOT 30, 60 FRAMES. TO THE HUMAN EYE 480P AND 640P ARE SO CLOSE THERE'S NOTHING IN IT, SOOOOO HOW DARE YOU BIG UP A FEW PIXELS MORE

TWICE THE FRAMERATE KILLS A SILLY FEW EXTRA PIXELS, FACT. NINTENDO FANS LIVE AND DEAL IN FACT, TRUTH, HONESTY. BILL GATES FANS LIVE ON PLANET HYPE

ART DIRECTION, PRIME 3: GENUINELY ALIEN, BIZARRE, WELL THOUGHT OUT, REALISTIC, BELIEVABLE ALIEN WORLDS

HALO 1/2/3: ROCKS, SAND, TREES, WATER. OOOHHHHH HOW ORIGINAL, HOW ARTISTIC. NOT

X360 IS ALL THAT AND THEN SOME
YET
ITS
CONTROLS ARE OUTDATED
ITS FRAMERATES ARE WEAK
ITS LOADING WAITS A JOKE
ITS REQUIRED GLITCH PATCHES UNFORGIVABLE

YET PRIME 3 ON WII IS PURE BLISS, NO GLITCHES OR BUGS, AND THE GREATEST FPS CONTROLS EVER SEEN, DONE, USED. FACT

640P IS NOT TRUE, OFFICIAL HD. TO THE HUMAN EYE IT'S NO BETTER THAN 480P, YET WII IS DOING 480P AT TWICE THE FRAMERATE OF HALO 3 AND WITH BETTER ART DIRECTION AND FUNKY EFFECTS. HALO 3 IS GENERIC GUYS RUNNING ROUND ROCKS, WALLS AND TREES, HARDLY INSPIRING NOW IS IT, AND WII'S CONTROLS AND DEPTH OF GAMEPLAY THROUGH THOSE CONTROLS ARE TOTALLY UNMATCHED

GET YOUR FACTS RIGHT, BILL GATES LOVERS. PRIME 3 HAS BETTER CONTROLS, BETTER ART AND TWICE THE FRAMERATE OF YOUR BELOVED HALO 3

TRY BEING HONEST :mad:

3ds Mac

On March 23, 2008 at 6:47 pm

*****The Xbox reads textures uncompressed. The XGPU can't read a compressed texture, only a standard uncompressed one from slow ram, at 3 GB/s if that*****

*****The Wii reads textures fully compressed at 6:1, and processes and decompresses those textures in the gpu in REALTIME, which is impossible on Xbox*****

Lets try something here Wiiboy. Here’s how it works: The gpu, reads textures that are stored in ram. They are compressed using S3TC or any other form of compression that the gpu supports. So there are far more textures in ram, than would fit without compression.
Then, when it reads these textures, they are compressed. They’re transmitted across the bus to main ram, compressed. They get to the gpu, compressed. the gpu uses them. It saves bandwidth and storage. On disk, in ram, over buses.

If you were going to use 4 texture layers, you’d read all 4 from ram, and combine 2, loop it back once (on chip), and combine 2 more.

The PS2 can store compressed textures in ram and transmit them over the bus from main ram still compressed. But the cpu has to decompress them before sending them to the gpu, because the gpu cannot deal with compressed textures.
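For a sense of the scale involved, here is a minimal sketch of what block compression buys in storage and bus traffic, assuming the common S3TC/DXT1 rate of 8 bytes per 4x4 texel block against 32-bit uncompressed texels; the 512x512 texture size is just an example.

```python
# Rough storage/bandwidth comparison for a texture kept compressed end to end
# (disc -> ram -> bus -> gpu) versus stored and transferred uncompressed.
# S3TC/DXT1 packs each 4x4 texel block into 8 bytes (0.5 bytes per texel).

def texture_bytes(width, height, bytes_per_texel):
    return width * height * bytes_per_texel

w, h = 512, 512
uncompressed = texture_bytes(w, h, 4.0)   # 32-bit RGBA
dxt1 = texture_bytes(w, h, 0.5)           # S3TC/DXT1

print(f"uncompressed 512x512: {uncompressed/1024:.0f} KB")
print(f"DXT1 512x512:         {dxt1/1024:.0f} KB  ({uncompressed/dxt1:.0f}x less ram and bus traffic)")
```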

Why is this so hard for you to understand? Is it the fanboyism? Are you impaired somehow? Should we feel sorry for you?

3ds Mac

On March 23, 2008 at 6:56 pm

I don’t know how it can be dumbed down any more for you to understand Wiiboy.
Do we use analogies involving Mario characters, would that work?

****96gb beats a few gb here and there many many times over*****

An Xbox fanboy could simply look at your bs numbers, and say:

“Wow, Xbox did things like Ninja Gaiden Black with only a few GB/s of bandwidth. It still keeps pace with the Wii, despite the Wii having 25 times the bandwidth. Xbox truly was awesome, and Nintendo must have the worst graphics programmers on the face of the earth…..”

See how that works Wiiboy?

3ds Mac

On March 23, 2008 at 7:09 pm

*******Prime 3: true 480p, no lies, Nintendo stated that exact resolution. Xbox fans got punked, pure punked. To the human eye 480p and 640p are so close there's nothing in it, so how dare you big up a few pixels more******

And more of your crap.
737,280 pixels per frame = Halo 3 (at its 1152×640 internal resolution)
307,200 pixels per frame = Prime 3 (640×480)
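The arithmetic behind those two figures, assuming the 1152×640 internal resolution for Halo 3 mentioned later in this thread and standard 480p for Prime 3, plus the pixels-per-second totals at the frame rates being argued over:

```python
# Pixels per frame and per second for the two games being compared.
# Assumes Halo 3's reported 1152x640 internal resolution and 480p for Prime 3.
halo3_frame  = 1152 * 640    # 737,280 pixels per frame
prime3_frame = 640 * 480     # 307,200 pixels per frame

print(halo3_frame, prime3_frame, round(halo3_frame / prime3_frame, 2))  # ~2.4x per frame

# Per second, at the frame rates argued over in this thread:
print(halo3_frame * 30, "pixels/s at 30 fps")   # 22,118,400
print(prime3_frame * 60, "pixels/s at 60 fps")  # 18,432,000
```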

Where are the bump maps in Prime? There are none you say? But, this is 2008, there should be at least one. I mean, it’s just “bump mapping”, not any of the more complex mapping techniques….

3ds Mac

On March 23, 2008 at 7:34 pm

And that totally ignores texture res, polygon counts, shader program complexity, scope, draw distance, or the openness of the areas you’re in.
Reality is Wiiboy, a 360 or PS3 could run Prime 3, at several times the frame-rate the Wii could. Wii couldn’t run Halo 3 at all.

3ds Mac

On March 23, 2008 at 7:38 pm

And that's only if you totally ignore texture resolution, polygon counts, shader program complexity (vertex or pixel), lighting, scope, draw distance, or the openness of the areas you're in.

The reality is Wiiboy, a 360 or PS3 could run MP 3, at several times the frame-rate the Wii could.
Wii couldn’t run Halo 3 at all.

3ds Mack

On March 24, 2008 at 7:47 am

PS2′s gpu had a 2,560 bit bus width. That makes it “awesome” by the way.

Because, you know, the number is high.

wiiboy

On March 24, 2008 at 12:47 pm

PS2? Don't even go there: 4-bit and 8-bit colour textures, 100 ns latency ram, and no direct ram-to-gpu reading of data. The PS2 is a Dreamcast-level machine; it isn't even in this argument.

wiiboy

On March 24, 2008 at 12:50 pm

640p isn't an official HD resolution = Microsoft lied.

480p is the exact resolution Nintendo stated. They didn't lie.

End of argument. Nintendo fans fan greatness; Micro fans fan a liar named Bill.

You can't hide from the truth. Go bash your outdated controls and wait for levels to load, PS2 style. I'll play my fast-loading, pure 60-frames-a-second Wii with 3D motion controls.

3ds Macks

On March 25, 2008 at 5:19 am

But the bus widths are indeed 2,560 bits wide. That’s all that really matters.

And who told you Mario Galaxy runs at 60 frames per second? I’ve heard of interviews where they mentioned the same “solid 30 frames per second”. Similar to Wind Waker, Mario Sunshine, etc.. (Or RE4 really)
Gamespot’s preview said the same thing.

And output on 360 is still 720p. How they achieve that is up to the developer. Halo 3 was rendered internally at 1152×640, still far better than 640×480.
Theirs was for lighting range; others do it simply to avoid tiling the frame buffer.

Wii probably has the same problem, but deals with it differently.

Wii is only capable of 480p if you don't plan to use any form of AA.

Maximum with AA, is 640 x 264, and requires dropping color precision to 5 bits for red and blue, and 6 bits for green, with a 16 bit z-buffer.
If you need an alpha value….well, you don’t get aa afaik. Wii still lacks a stencil buffer, just like the Gamecube, and unlike the Xbox.

So, Wii sacrifices color precision, for the same reason 360 games drop rendering resolution. You get to see all the pretty color banding and dithering at 480p, and they get to see their games upscaled a bit to hd.
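A back-of-the-envelope version of that trade-off. It assumes the roughly 2 MB embedded framebuffer commonly attributed to the Gamecube/Wii design and a 3-subsample AA mode; both of those figures are assumptions here, and only the resolutions and bit depths come from the comment above.

```python
# Rough embedded-framebuffer accounting behind the 480p / AA trade-off.
# Assumes ~2 MB of on-chip framebuffer (a commonly cited Gamecube/Wii figure)
# and 3 AA subsamples; the resolutions and bit depths are the ones quoted above.
EFB_BYTES = 2 * 1024 * 1024

def frame_bytes(width, height, color_bits, z_bits, samples=1):
    return width * height * samples * (color_bits + z_bits) // 8

no_aa = frame_bytes(640, 528, color_bits=24, z_bits=24)              # full res, 24-bit color + 24-bit Z
aa    = frame_bytes(640, 264, color_bits=16, z_bits=16, samples=3)   # halved height, 16-bit formats

print(f"640x528, 24+24 bit:            {no_aa/1024:.0f} KB of {EFB_BYTES/1024:.0f} KB")
print(f"640x264, 16+16 bit, 3 samples: {aa/1024:.0f} KB")
```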

The difference is, 360 seems to have a hardware accelerated potential workaround…

Wii might have hardware accelerated tiling like the 360, but I would think they’d be using more aa then. (or, perhaps they use it, but to avoid the color precision drop)
Either way, it’s a sacrifice of image quality.

wiiboy

On March 26, 2008 at 2:04 pm

Resident Evil 5 proves Gamecube's power last gen, and also proves Wii can near match the """""real world visual experience of X360""""".

If you think Wii can't run this at 480p you're a lying fan. It's just Res Evil 4 in HD, with different enemies instead of the mexicano fellows.

http://kotaku.com/372261/heres-some-resident-evil-5-play-footage

If a single one of you suggests Wii can't do this, you're in serious need of an anti-fanboy pill. Wii can run this easy; it's like 2x Res Evil 4. Wii can do that easy at 480p where the X360 is doing it at 720p.

It looks like a blatant copy of Res Evil 4, a generation too late. Also the X360 controls are lame compared to the Res Evil 4 Wii edition's great controls.

3ds Mac

On March 26, 2008 at 7:29 pm

Sure, they could make a version of the game on Wii, and it’d likely look decent.
But it couldn’t do that specifically.
The math is simply not there on the Wii. It wouldn’t be simply a drop in resolution, everything else would be gimped as well. They’d be faking this, cutting that, toning this down, etc.. It’d be getting more changes than the PS2 version of RE4 did when it was ported from the Gamecube.
And that, would be comparing a Wii version to the PS3 or 360 version at 480p. A laundry list of things would be missing.

But look at PC games that used shader model 3, then compare them to the same game dropping back to shader model 2 on weaker hardware: not everyone can see as much of a difference as there actually is. Mathematically there may be a huge difference, but the result is more subtle.
If you take a game that procedurally generates grass on the fly on one piece of hardware, for example, and replace that with a simple texture or bump map on weaker hardware, the difference in processing is huge, but what you see might still "look" decent on the alternative.

That’s part of the reason games get more graphically impressive as the generation moves on. Aside from learning the hardware, and gaining experience,
they start learning how to fake things better. Noticing what looks fine using a simple bump map, what “needs” a normal map, what can be fine with a texture map, what lighting can be baked into textures, what “needs” to be computed real-time, etc..
They’ve been doing that with Wii hardware for years now, in the Gamecube era. They do that now with the PS2, and things like God of War 2.

And the differences between a 360 and Wii in multi-texturing isn’t all that great, the difference is in the floating point math related to effects, which is huge.
Really though, a lot of developers are just beginning to utilize shader programs. They stuck to the simpler dx7 multitexturing pipeline for most of the Xbox's life (Japanese developers especially). So, perhaps there's room for Wii yet.

wiiboy

On March 26, 2008 at 8:33 pm

The Res Evil 5 footage looks no more than 2x Res Evil 4 GCN. It even has the black borders.

Capcom also stated that anything done with their new wireframe X360/PS3 engine can be mimicked by Wii using the Resident Evil 4 engine.

I agree. The only difference is native resolution. You could run a Wii version of Res Evil 5 upscaled to 720p, easy.

3dsMac

On March 27, 2008 at 8:32 pm

Nah, the only difference wouldn't be rendering res.

I've played RE4. It wasn't covered in bump mapping, and they faked a lot.

If Nintendo wanted a “480p 360″ (or PS3) they wouldn’t have used the same cpu. They wouldn’t still be relying on generic (1) T&L unit, they wouldn’t have the (4) limited register combiner pipelines they have. It would have been capable of full 32 bit color (at least), or hdr, it would have had a stencil buffer, and post transform vertex cache, its gpu wouldn’t be roughly xbox level, when it came to floating point math, and wouldn’t still be lacking a list of graphics effects that even the XBox could pull off, etc.. It’d be capable of reasonable AA, or it would at least, have some sort of effect unique to Wii hardware, to give folks something to brag about. Just one would have been enough for Nintendo fans

It should be capable of bump mapping (or even normal mapping) anywhere a developer thought it needed it. You’d see alot more complex animation, better “real-time” lighting, etc..

But, we don’t. Hopefully Nintendo doesn’t see their success this generation, and assume they can get away with such weak hardware next gen, and moves back to the Gamecube philosophy.

But you guys can cross your fingers that Capcom still relies heavily on the type of graphics effects that the Wii would be capable of pulling off a cheap, but decent looking copy of. And hopefully the list of effects that get cut back, toned down, or completely removed won’t be too obvious.
But don’t get your hopes up that it’ll pass anywhere near the same scrutiny Nintendo fans applied to the PS2 version of RE4.
And if and when it doesn’t, I’m sure everyone will blame Capcom’s “laziness”, and jealousy rather than Wii hardware.

wiiboy

On March 28, 2008 at 6:10 pm

Capcom lazy? I wouldn't say that. It's clearly a good-looking game, and it clearly indicates Wii can run next gen at native 480p instead of native 720p.

Stop making things up. The truth, as I've tried to tell you, but your fan ears won't listen and your fan eyes won't read it:

Wii can do near X360 graphics at native 480p instead of native 720p.

You aren't taking fact into your brains and storing it there; you're in fact taking lies into your brains and believing them.

As I stated at the top of this very long thread, Wii is closer to 360 than Xbox. It is in no way only an Xbox 1 in power.

For the umpteenth time. If I had a penny for each time I stated this I'd be a very rich man.

The EA guy lied. AAAARRRRRR. Stop fanning. The Wii, as I've informed you, is a

GC/XBOX 2.5

End of argument.

wiiboy

On March 28, 2008 at 6:13 pm

Yes, Capcom have been lazy. I change my standpoint.

Resident Evil 4 Wii edition: clearly next gen gameplay and controls.

Res Evil 5: clearly last gen gameplay and controls.

So Res Evil 4 gets sweet-as-sugar Wii controls, and Res Evil 5 on the so-called next gen systems, PS3 and X360, gets button-basher controls. Yes, Capcom are lazy.

It isn't a next gen exclusive and it isn't a Wii exclusive. Nuff said.

Go mash your outdated gameplay, idiots.

wiiboy

On March 28, 2008 at 6:15 pm

Res Evil 5 is a last gen game. It has Res Evil 4's art direction and tech direction.

It has Gamecube controls. It's clearly a last gen game.

It can only be next gen if it's on Wii, as only Wii is a next gen machine.

PC / PS3 / PSP / X360 = last gen

DS and Wii = new gen

Sorry fanboys, that's an engineering fact.

3ds Maxx

On March 28, 2008 at 8:41 pm

I’m not making anything up. The resolution limitations are outlined in their frame buffer patents.
You can believe what you'd like, but what you've told me is wishful thinking. Even if Wii replaced its register combiner "shaders" with full-fledged pixel shaders, you'd still be looking at a system equivalent to somewhere around 4-8 of them, with a single vertex shader. And this in a system whose gpu is clocked at half the frequency of the other two.

480p is exactly 1/3 of 720p.
But Wii doesn’t have 1/3 the power in any area.
We see the same system, with the same limited 4 register combiners and cpu-dependent T&L unit, all clocked about 50% faster, with an additional 64 MB of ram and about 4 GB/s of additional bandwidth to it.
That’s enough power to do decent games, but it’s not a 480p 360 or PS3. Nor is it really a 480i 360 or PS3. You can take that as opinion if you want, but there’s mounds of evidence for it, including current games.
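The ratios in question, written out; the Wii-over-Gamecube multipliers are the rough figures quoted in this and the next paragraph, not measured numbers of mine.

```python
# 480p vs 720p by raw pixel count.
pixels_480p = 640 * 480       # 307,200
pixels_720p = 1280 * 720      # 921,600
print(pixels_720p / pixels_480p)   # 3.0 -- 480p is exactly one third of 720p

# The rough Wii-over-Gamecube multipliers quoted in this and the next paragraph,
# for comparison (they describe a tweaked Gamecube, not a "480p 360"):
print({"clock": 1.5, "ram bandwidth": 2.5, "ram amount": 3.0})
```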

I’ll agree though. It should perform at least twice better than a Gamecube did overall. It’s at least 50% faster. Has more than 2.5 times the bandwidth to ram. More than 3 times the amount of ram. And hopefully, they fixed any bottlenecks developers ran into with the Gamecube.

But I'm sure even the most avid Nintendo fan can admit they should absolutely do better relative to the other two next gen. Even if they simply stick to 720p while Microsoft and Sony hype the 1080p+ crowd.

They shouldn’t get the impression from current sales, that fans are perfectly happy with what they have now. That’s all I’m saying.

wiiboy

On March 29, 2008 at 6:44 pm

Pro Evo Wii's Playmaker mode proves Wii is the only next/new gen machine.

Pro Evo PS2/PS3/X360 = control one player at a time.

Pro Evo Wii = control all 11 players in realtime.

Pro Evo PS2/PS3/X360 = standard button-press passing.

Pro Evo Wii = pass anywhere, anyhow, to any player, with realtime point-and-pass.

Pro Evo PS2/PS3/X360 = generic, hard-to-use, outdated 2D menus, tactics and formation pages.

Pro Evo Wii = 3D point-and-click menus, tactics and formation pages.

Pro Evo Wii is the only next generation football game in existence. So resolution is next gen, and an all-new, totally revolutionary control system is not?

Idiots. Can you hear/read what you are saying?

A D-pad isn't next gen. Analog isn't next gen. Only the Wiimote and nunchuk are next generation. That's a gaming fact, an engineering fact and a fact of common sense.

The thread has hit a new low. Apparently the biggest company on earth rips off a Dreamcast control pad and, according to its fanbase,

that's next gen.

Is it? Long loading times and outdated controls aren't next anything. It's in the same damn place as Xbox 1; the name of Bill Gates' second console is itself an admission of that fact.

Xbox 360, as in going full circle and ending up where you started = 360 degrees. The name itself suggests Xbox 360 is just Xbox again.

PS3 is a PS1 all over again.

Only Wii is next generation. Please apply intelligence, D-pad mashers. I only use a D-pad in 2D, as I'm an informed gamer, a hardcore gamer, a Nintendo gamer.

wiiboy

On March 29, 2008 at 6:49 pm

How many Cell CPUs would it take to run the AI for all 11 players in Pro Evo? I would hate to think how many.

In other words, as I'm intelligent and not stupid like the fan360 dude, the Wii is clearly doing things no amount of Cells can ever do: 11-player control versus PS3's and X360's one player.

Enough said. Only Wii is next gen.

wiiboy

On March 29, 2008 at 6:50 pm

Imagine FPSes, RTSes and action RPGs with a realtime point-and-waggle interface, all in realtime. Again, only Wii can do that. Sorry, fact.

wiiboy

On March 29, 2008 at 6:55 pm

The fact you argued the hard drive as a way of proving Xbox outdoes Gamecube proves you're all stupid. It's an industry fact that the Gamecube was a faster loader and streamer than Xbox, even with the damn hard drive.

How many high-end PC games still suffer long-winded loading, even with massive ram and fast hard drives?

You need to learn yourself some engineering and design common sense.

Next you'll be saying PS3 and X360 load faster than Wii. Giggle.

Fastest loader/streamer: Wii, followed by X360, leaving sad old Blu-ray PS3 in last place.

Mario Galaxy out-loads from disc any PS3 game, even with hard drive installs. Please take fact as fact.

Only Nintendo understand console design.

3ds Mac

On March 29, 2008 at 11:50 pm

Dude, whom are you debating controls with? Not me. Perhaps all the phantom Xbox and PS fanboys you hear in your head.
I never had a problem with their controller, and even though it took a while for Nintendo to show its worth, it seems to be working well in a few of the newer games.
And if you’re happy with them practically recycling their chipset, that’s cool. I’m not. I recognize the tweaked last gen hardware as just that.

And I'm sure you know how load times work. i.e., the more ram you have to fill, the more time it takes to fill it. 512 MB will take longer to load than 24+64 MB. That's just common sense.
And seek times and transfer rates from an HDD are far better than either console's disc drive. All this data is well known. Gamecube didn't have some special disc transfer rate, as you started out claiming in this thread. Transfer rate was slow, seek times were pretty good, storage space was limited, and there was no extra-special Nintendo compression. That's about it.
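To put illustrative numbers on that: the transfer rates below are assumed round figures, not measured drive specs; only the memory sizes come from the comment.

```python
# Illustrative only: how much ram you have to fill dominates load time.
# The MB/s figures are assumed round numbers, not measured drive specs.
def load_seconds(ram_mb, transfer_mb_per_s):
    return ram_mb / transfer_mb_per_s

print(round(load_seconds(24 + 64, 10), 1), "s  # ~88 MB (Wii-sized) at an assumed 10 MB/s")
print(round(load_seconds(512, 10), 1),     "s  # 512 MB (360/PS3-sized) at the same assumed rate")
print(round(load_seconds(512, 20), 1),     "s  # the same 512 MB from a faster (assumed) HDD")
```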

And you can think it was awesome for Nintendo to stick 16 MB of 81 MHz / 8-bit A-RAM into the Gamecube, but developers sometimes just pre-loaded data there for the following levels to cut back on load times, because that was a lot of ram that would otherwise have been wasted, since it isn't useful for much else outside of sound.

But I’m sure you could ask any developer there is, and they’d tell you they’d rather have 40mb of ram to use for rendering real-time graphics with, rather than that set-up.

And they could do the same thing on a 360 or PS3. Just pretend they only have 256mb of ram, and load the other 256 while you play. Then load the next 256 after that, all seamless.
But that would be dumb, and looked at as a waste. So would breaking levels up into tiny sunshine or prime-like pieces, and just hide loading while waiting for doors, etc..

With better multicore programming, they could just stream data from the disk, or better yet from the HDD, while you play.
If it were a priority, and developers were accustomed to programming like that, it wouldn’t be difficult. Just keep track of what gets loaded in, and when, and you’d have a game with zero load times, and zero issues related to it.
Even early-gen games like Saint's Row did that decently.
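A minimal sketch of that idea: keep track of what needs to be resident next and fetch it on a background thread while the current area is being played. The chunk names and timings are invented for illustration.

```python
# Toy sketch of streaming the next level chunk while the player is in the
# current one, so there is never a dedicated load screen. Names/timings invented.
import threading, time

def load_chunk(name, results):
    time.sleep(0.5)                      # stand-in for reading from disc/HDD
    results[name] = f"<data for {name}>"

loaded = {"area_01": "<data for area_01>"}   # loaded once at boot
order = ["area_02", "area_03"]

for current, upcoming in zip(["area_01"] + order[:-1], order):
    loader = threading.Thread(target=load_chunk, args=(upcoming, loaded))
    loader.start()                        # start fetching the next area...
    time.sleep(1.0)                       # ...while the player plays the current one
    loader.join()                         # by now the next area is already resident
    print(f"leaving {current}, entering {upcoming}: no load screen")
```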

And you know full well every Nintendo fan on earth would gladly trade current Wii hardware for either of the other two. So pump up its "awesomeness" all you'd like; it's still underpowered at pretty much everything.

3ds Mac

On March 30, 2008 at 4:39 am

And it was easy for Microsoft to originally pick up Dreamcast software and Sega support, because they made the OS for the DC, and provided the dev kits for it.
As well as the dev kit for the Chihiro arcade board, which was an Xbox with more ram.

wiiboy

On March 30, 2008 at 4:58 pm

apparently :roll: systems with control pads stolen from Nintendo's past, and load times like a PS2, are cough next gen. What a load of suckers.

3ds Macks

On March 30, 2008 at 8:57 pm

So? You think the layout of buttons on a controller is something they deserve a huge medal for? Sure, they should get some credit, but if you were tasked with the idea of designing a controller, would you not wind up with something nearly identical to a Nintendo controller back then?
And I never did like Gamecube’s button layout.

Intellivision controllers had d-pads. Except instead of 8 directions, they had 16. up, down, left, right, and three points of diagonal in between each, paired with a customizable button layout.
And I held it sideways, so I could click the shoulder buttons.
Atari 2600 had a d-stick with 8 directions.
The 5200′s had analogue sticks.

wiiboy101

On April 9, 2008 at 12:46 pm

Analysts, experts and gaming sites all agree: Microsoft's Wiimote copy = disaster.

Some are saying it's proof of a retreat out of the console market by Bill Gates.

1 It's clear and blatant copying.

2 The biggest, richest company on earth can't innovate its own.

3 It looks desperate and two-faced.

4 It shows Bill is a liar ("gimmick", "kiddy", "the mainstream don't want advanced controls", etc. Yes, Bill actually states these idiot comments in public.)

5 It will cause confusion with developers.

6 Microsoft will patent-rip-off Wii hardware and software and get away with it.
:lol: :lol:
(Windows = Workbench, Amiga and Mac OS rip-offs)

Microsoft copy everything, then say it's theirs.

Motion on X360 = Xbox culture killer. I agree.

What Bill is actually doing is Wii-ness controls for PCs and Xbox Live for PCs; he has no real interest in consoles.

He's just a pointless gatecrasher.

wiiboy101

On April 9, 2008 at 12:46 pm

Joysticks aren't analog sticks; they're 2D, not 3D, in control. A 2D 80s console had analogs? No it did not, liar.

wiiboy101

On April 9, 2008 at 12:48 pm

A gamer does not respect Microsoft or Sony. A gamer understands the Nintendo difference.

Hyping Bill Gates does not make you a gamer; it makes you a corporate brainwashed idiot.

"Now loading"... bash a D-pad in 2008? Sorry, I'm a """gamer""".

I embrace the future.

Hehehehe hahahaha. Red lights of death? Noooo thanks.

wiiboy101

On April 9, 2008 at 12:50 pm

Who in their right mind buys a console designed to break down, to sell more consoles and fake the userbase numbers?

Xbox fans, that's who. Idiots who can't see when they're being used as Bill Gates' pawns. So sad.

3ds Max

On April 10, 2008 at 3:10 pm

****Joysticks aren't analog sticks; they're 2D, not 3D, in control. A 2D 80s console had analogs? No it did not, liar****

Sure it did, 5200′s sticks were analog. They weren’t digital directional pads.
Truth. I know, you don’t “get” the meaning of the word analog, as opposed to digital when it comes to control I guess. That’s your problem. You’d rather just say, “lyer”.

And how many gamers would ever wish that all three had a gimmicky controller like the Wii? I don’t. I don’t like using it, for anything but a handful of Nintendo games. It’s awesome for a few games, crap for others.

And are you foolishly assuming Nintendo invented the technologies they put into their controllers? Sure, they're making good use of the technology, but it's not their invention. There's a list of controllers and devices that use the same mechanisms: d-pads, analog sticks, rf wireless, etc..
right along with LEDs, gyroscopes, accelerometers, magnetometers, position tracking, etc… They’ve all been done before, even for gaming purposes.

Even Microsoft used most of them back before 02.
http://research.microsoft.com/~awilson/wand/default.htm

And you continue to assume I give a rats ass about Microsoft? I actually like Nintendo as a company more than them. I simply dislike their generic hardware.

And only a fool would assume hardware failure was intentional for "user base inflation". Lol.
"OMG, Guitar Hero Wii only has lousy mono sound because Nintendo wanted the developers to be able to lie about sales figures. I knew it!"

wiiboy101

On April 12, 2008 at 6:06 pm

A story. Sorry, a true story.

There was a big, big leak recently that Sega got annoyed about, because there were already exclusive magazine articles lined up for next month's issues (not going to state which), and the internet leak kind of spoilt it all.

But let's talk about the leak in question. It has sweet graphics and high-speed gameplay, it has sweet cutscenes; it's actually looking visually very competent.

The forums went wild: "they're 360 graphics, they're 360 graphics, doesn't it look good, is this a return to form for this character?"

But every so often a poster would state: there's something very Wii about this game, it has a Wii vibe, and it looks similar in style to a previous game on the Wii system.

The forums went wild with brainwashed graphics whores. Impossible, they screamed in their so-called superior Xbox wisdom.

"They're in 720p for a start, and they're lush" (even though there was no mention of resolution or system).

The debate continued: is it a real game, or a faked video by some talented guy?

The whores continued: it's 360 and possibly PS3.

Then a Wii logo was spotted, and then it was confirmed:

Official Nintendo Magazine UK had the scoop on the title.

And the leaked screens and footage were in fact, shock horror, surely not, Wii footage.

The sweet, sharp visuals and high-speed play that all of the above Xbox fans, in their graphics-whore wisdom, claimed as their own system's

were in fact Wii footage.

You're so brainwashed, so hype-filled, so graphics-slutted out of your tiny minds, that you had to big up Wii graphics as your own system's graphics. You ended up looking sad, shallow and desperate.

It's confirmed: the Wii version looks that good, just like X360, only the native resolution is different.

Sonic Unleashed just punked your belief system and made my point on this very web page stand out like a sore thumb.

Wii is clearly far more powerful than Xbox.

Sonic Unleashed looks sweet on Wii.

Again my wisdom is proven correct, and the Bill Gates brainwashed brigade are all sad losers.

WII ZONE.COM

KOMBO.COM

OFFICIAL NINTENDO MAGAZINE

The proof is there, and now don't you look silly, Xbox fanboys. You claimed, in desperation, in whoring graphics, a Wii game as your own.

I've just proven every last one of you wrong.

:twisted: :smile: :cool:

wiiboy101

On April 12, 2008 at 7:39 pm

priceless just priceless

3ds Mac

On April 13, 2008 at 3:40 am

Lol, I’ve never been fooled by any of what you speak.
And no, I haven’t seen a single screenshot of anything on the Wii that tells me it’s “far more powerful” than an Xbox. (more powerful, maybe, but there is no “far” in that sentence)
Nor have I seen anything that indicates that it’s a 480p 360. It’s simply not.
The math is simply not there. The floating point capability isn’t there. I’ve seen all the specs and screenshots I need to, and it’s a weak system.
You can be as gullible, weak minded, and delusional as you’d like. I and many others are capable of reading system specs as what they are, and capable of judging graphics for what they are, and can tell what’s being computed real time, and what’s faked, whether there’s aa, what I think the z-buffer bit depth is, etc..
And the Wii is Wiik.
You on the other hand, have never made, written, nor even attempted anything related to computer graphics.
And you're happy with repackaged Gamecube hardware. That's fine. But you attempt to defend the system by claiming it's something that it's not… well, you continue to grasp at those straws, dude.
And no amount of mediocre Sonic Unleashed footage, that Nintendo fans such as yourself are crossing your fingers will look like a PS3 or 360 game, is going to change that.

It would be like an Xbox fan, pointing to Ninja Gaiden Black, and saying “see, see? The Xbox I have now, is clearly significantly more powerful than the one I had 3 years ago when I only owned King Arthur and Kabuki Warriors”
Clearly.

Sampson

On April 13, 2008 at 4:17 am

The Sega guy said there was some Wii footage, at about 3/4 of the way in. And one clip particularly after Sonic jumps into the air, it switches to Wii footage too.

Sampson

On April 13, 2008 at 4:19 am

http://www.sega.com/gamesite/sonicunleashed/us/videos.php

wiiboy101

On April 13, 2008 at 8:17 am

I'm not bigging up Sonic or anything, just pointing out that Xbox fans are losers for ever suggesting, or part-believing, that Wii is only that powerful. That is the only point I'm making.

I've clearly proven that third-party lies and excuses were taken as fact by daft people.

And I've also proven my own super competence as a spec analyst: the Wii is built to compete at a set 480p.
Take the fact as fact and the lies as lies.

The FIFA EA dev guy lied. Just accept that I've educated your dumb ass.

3ds Mac

On April 13, 2008 at 2:09 pm

But I’ve learned nothing. How could you have educated me?

And the Sonic video, “after he jumps into the air”, is most likely the PS2 version, if it’s anything. It does specify at the bottom of the page,

“Available on the PlayStation®2 computer entertainment system and PLAYSTATION®3 computer entertainment system.”

Wii version shouldn't have too much of a problem looking similar to the better parts. It's a 2.5D game, which means you can limit the camera angles and views of what the player can actually see. Like Tomba 2 on the PS1. If it gave the player the ability to roam freely and turn the camera at will, like Mario 64, it wouldn't have been capable of the graphics it had.

And I don’t see the Wii as being

3ds Mac

On April 13, 2008 at 2:32 pm

only equal to Xbox. It should be more powerful.

And I would agree, that developers are being lazy when it comes to Wii games, and their graphics. They should at least be able to consistently match Gamecube launch games, with second generation Wii software. Using an “it’s just an Xbox” is a lazy excuse for ugly graphics.

Btw, if there is a PS2 version of Sonic Unleashed, it’ll likely gimp the Wii version, since developers tend to develop game graphics in two categories, (PS2 and Wii) and (360 and PS3) since Wii isn’t capable of things like a 200+ instruction pixel shader, etc..So, assets would be built to also work on PS2 to some extent. While the other consoles move away from last-gen assets, animation, textures, ai, effects, etc.. Even from Japanese developers.

So Wii games will have a small sprinkle of Nintendo first party, and a couple ambitious third party games. The rest will have shovelware style graphics for the cheap cash-ins, and everything else will be PS2 cross-platform for a long time to come.

Nintendo haters: “Long live the PS2!!”

wiiboy101

On April 15, 2008 at 4:08 pm

Wii fans have never whored PS3 and X360 graphics as their own.

But that's exactly what the PS3 and Xbox fans did with Sonic Unleashed Wii.

Yet Wii owners have the problem?

You're like crackheads or alcoholics. You're in denial :wink:

wiiboy101

On April 15, 2008 at 4:10 pm

The difference in quality between Sonic Unleashed 360 and Wii is native resolution and nothing else.

Didn't I state that like a mile ago, up at the top of this thread?

You can drop at my feet now if you want to :grin:

3ds MAc

On April 16, 2008 at 1:21 am

Sonic unleashed isn’t an impressive game graphically. And that trailer is most likely not all from the same system.

And who said “nothing else”? I didn’t, and I don’t recall Sega saying that either. If they did, and it turns out somewhat true, it would simply indicate Sega didn’t take advantage of the newer hardware.
You can count the number of potential instructions per pixel each system is capable of, and Wii doesn't come anywhere near the other two, even at 480p.
Not even close.
That’s not to say the graphics are ugly, they can stylize things to not require hundreds and hundreds of instructions, and still look decent, but a developer would actually have to “do” that.

wiiboy101

On April 18, 2008 at 3:01 am

Quantum 3, a Wii-exclusive engine demo, is up at IGN Wii and GoNintendo.

Oh look: bump mapping, normal mapping, a 16-stage realtime texture/shader colour-blending pipeline that is both fixed-function and programmable, multi-layered textures, realtime lighting, shadows, self-shadowing, high-resolution RGB gloss maps, faked displacement mapping, bloom, dynamic bloom.

It's all there, with just the HD-format resolution missing. All at 480p.

Just like anyone with an ounce of sense already knows: Wii is next gen.

3ds MAc

On April 18, 2008 at 6:43 am

Lol. You’re impressed by that? Looks like a geforce 4 level graphics engine, tops.
“Uh oh, look out Xbox Riddick and Doom 3, there’s a Wii game that might compete with your visuals.” /sarcasm.
http://www.youtube.com/watch?v=UA9xLCc3SPY

http://wiimedia.ign.com/wii/image/article/867/867484/the-conduit-20080417032705723_640w.jpg
Nice draw distance.

And they say right in their demo, “1 to 4 stage texture composition.”

And not sure why you’re fixated on “16 stage shader tree”. That’s a meaningless metric, that only indicates the maximum number of stages the gpu could do on one pixel, not an indication of what you can apply to every pixel.
And they don’t have “stages” in a Wii sense any more, because they don’t have fixed pipelines that require running the pixel through the “entire” pipeline multiple times to perform texture layer effects. That’s wasteful.

360 fans might as well boast of 4000+ f-buffered shader programs then.

wiiboy101

On April 18, 2008 at 9:28 am

http://www.youtube.com/watch?v=hkNguFQBAqQ end of story

3ds MAc

On April 18, 2008 at 11:59 am

That’s the demo I’m talking about, where they say “1 to 4″.
And that tech demo, and especially their game, is only impressive when you compare it to other Wii games.
It fits right in with the better Xbox and Gamecube games, only more polished.
Wreckless: The Yakuza Missions, Riddick, Doom 3, Splinter Cell, Rogue Leader, RE4, etc.. If anyone showed that demo and game as 360 or PS3, everyone would be pointing out how crappy it looked.

wiiboy101

On April 23, 2008 at 10:01 am

wrote on Dec 9 2005 8:09AM
Alright, you want data, here it is. Rogue Squadron 3, a release game for the Gamecube, pushed 15 million polys per second. WOW, 15 million. The Xbox boasted it could push 120 million polys per second. No Xbox game, not even Halo, pushed more than 11 million polys per second. Gamecube had 5 release games that pushed roughly 15 million.

For one, Microsoft’s numbers are indeed inflated. The Xbox’s fillrate is nowhere NEAR 4 Gtexels/sec (more like 250-750 Mtexels, according to developers). Xbox’s system bandwidth isn’t a true 6.4GB/sec, considering any info from the CPU to the GPU and vice-versa is bottlenecked at 1.02GB/sec; one-third of GCN’s overall system bandwidth in realtime. Xbox’s GPU also requires 16MB of the 64MB DDR just to cull a Z-buffer (which is embedded on the GCN GPU at no cost to system memory), and also GCN’s internal GPU bandwidth is more than twice that of Xbox’s (25GB/sec compared to 10GB/sec). Also, Xbox claims to have more effects than GameCube, and better texturing ability in its GPU, when the XGPU can only do 4 texture layers per pass, and only 4 infinite hardware lights per pass (8 local lights can be done, also). GCN, on the other hand, boasts 8 texture layers per pass, and 8 infinite hardware lights and local lights per pass, all realtime.
What this means is that while Xbox relies on vertex shaders and pixel shaders (which BTW are absent from GCN hardware) to do realtime bumpmapping, the same effect is done in hardware on GameCube via it’s texture layers. Xbox also must deal with texture layers per bumpmapped surface per scene, though.

Also this whole processor thing is quite twisted considering Xbox and GameCube are two TOTALLY DIFFERENT architecures (32/64-bit hybrid, PowerPC native compared to 32-bit Wintel). GameCube, having this architecture, has a significantly shorter data pipeline than Xbox’s PIII setup (4-7 stages versus up to 14), meaning it can process information more than twice as fast per clock cycle. In fact, this GCN CPU (a PowerPC 750e IBM chip) is often compared to be as fast as a 700mhz machine at 400mhz. So GCN could be 849mhz compared to Xbox’s 733mhz machine performancewise.

Dead or Alive 3, a game Tecmo said “was impossible on any system other than Xbox” due to the amount of polygons onscreen, is a 9-10mps game, tops. The character models (which were also claimed to be an impossibility elsewhere) consisted of 9,000 polygons each- the same amount of polygons in characters in StarFox Adventures, Eternal Darkness, and even in Luigi’s Mansion (end boss). Resident Evil 0, however, boasts the highest polygonal “low-end” model to-date- a whopping 25,000 poly character. Now why is this possible (even against prerendered backgrounds) on a “less techincal” console? Why isn’t Xbox smothering GCN to death with games that are impossible to be done on any other console?

People say water in games like Bloodwake and Morrowind can’t be done elsewhere. I point to StarFox Adventures, and even Super Mario Sunshine. People say games like Halo have loads of bumpmapping. I point to Rogue Leader, Eternal Darkness, and Resident Evil’s character models and doors. I’ve even heard the gripe about individual blades of grass rendered on Xbox games. I once again point to StarFox Adventures, Mario Sunshine, and even the Legend of Zelda: The Wind Waker. Some Xbox fanboys I’ve run across have even been sore enough to say Xbox has faster loadtimes. I then point to Luigi’s Mansion and Metroid Prime, which are impossible on Xbox because they HAVE NO LOADTIMES (the game is constantly streamed from the GameCube disc in burst packets). Simply put, there’s not one effect Xbox can do that GCN can’t, while this can go the other way since Xbox lacks half of GCN’s hardware lights and texture layers onboard.

Owned.

So true about bandwidth.

So true about texture layers.

So true about lighting.

So true about wasted Xbox main ram just for buffers (Gamecube did them on-chip, with insane speed and bandwidth).

So true about custom effects.

Gamecube beat Xbox, Wii beats Gamecube, so by some kind of Xbox fan magic it beats Wii?

Zzzzz… whatever.

wiiboy101

On April 23, 2008 at 10:08 am

Xbox: 8x4 realtime lighting. Gamecube: 8x8 realtime lighting plus custom cpu lighting.

Xbox: 4 texture layers / 4 texture stages. Gamecube: 8 texture layers, 16 stages.

Xbox: a tiny 128K gpu texture cache. Gamecube: a 1 MB texture cache plus 6:1 compression (Xbox can't read compressed textures, only store them).

Xbox: in-game 16-bit colour plus. Gamecube: custom optimized 24-bit high-res true colour.

Xbox: loading is slow and cumbersome and requires the hard drive to help out.

Gamecube: fast bursts of balanced, compressed data, beating all other consoles at disc-to-ram streaming. Fact.

Xbox: 133 MHz front side bus, no additional compression.

Gamecube: 162 MHz FSB plus 4:1 compression with realtime decompression.

Gamecube beats Xbox, Wii beats both by 2.5 times in realtime performance, all at the same resolution = Wii = X360 minus HD = capped at 480p.

OWNED :lol:

wiiboy101

On April 23, 2008 at 10:12 am

A realtime 16-stage blender/shader engine applied to 8 texture layers, all blendable in a single pass, plus custom cpu-side effects, plus 8x8 realtime fully programmable lighting and cpu-programmable lighting = a fully programmable graphics pipeline = custom effects = pixel-level shading.

OWNED

3ds Mac

On April 23, 2008 at 12:44 pm

See, Wiiboy? This is why people here assumed you're deluded. You don't absorb any new info; you just keep recycling the same old Xbox vs Gamecube bull you've been posting. Everything in what you just copied and pasted for the hundredth time has been explained to you.
If you're talking about fill rate, it depends on the number of textures being applied. And Xbox would be 933 million max. Their 4-gigapixel number comes from the fact that they had hard-wired 4xAA. AA would normally consume fill rate, the way it does on Gamecube, btw.
Their reason for that was to compare to PS2's 2.4 billion, which would not only be cut in half for a texture layer, but would also consume fill rate the same way for AA pixels.
If a developer tells your dumb ass that the system has a fill rate of 300-400 million in game (which is what the article you copied that from said), it means their game averages 3-4 texture layers on every surface.
933 million max raw fill rate. 2 texture layers would be roughly 750-800. And looping the pixel back once cuts it to 300-400.
That's with 4 texture layers, Wiiboy.
If you do that on a Gamecube, you loop back 4 times and divide Gamecube's max fill rate by 4, which would make it 162 million at the absolute most.
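The rough model behind those figures: raw fill rate as clock times pixel pipelines, divided by the number of passes the texture layers force. It ignores overhead and bandwidth stalls, which is why the in-game numbers quoted above come out lower still, and the one-layer-per-pass Gamecube accounting is the comment's, not a claim about what the hardware can actually combine per pass.

```python
# Rough fill-rate model used in the comment above: raw rate = clock x pixel
# pipelines, then divide by the number of passes needed for the texture layers.
# Overhead and bandwidth stalls are ignored, so real games land lower.
import math

def effective_fill(clock_mhz, pipelines, layers, layers_per_pass):
    raw = clock_mhz * pipelines                  # megapixels/s
    passes = math.ceil(layers / layers_per_pass)
    return raw, raw / passes

print(effective_fill(233, 4, layers=4, layers_per_pass=2))  # Xbox: ~932 raw, ~466 at 4 layers before overhead
print(effective_fill(162, 4, layers=4, layers_per_pass=1))  # the comment's Gamecube accounting: 648 raw, 162
```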

And you know jack about "internal bandwidth" numbers (as you've proven repeatedly).
And you understand the process even less, and have no intention of retaining the truth. I could go through what they'd technically be, and count them, but you're too busy making things up for me to bother.

And the same old Rogue Leader bull. It's been explained to you how they transform polygons, what T&L stands for, and how the complexity of what you do with those polygons affects the number you can transform.
If you wanted to do per-vertex lighting, it cuts your ability to transform polygons. Unless of course you were trying to recreate Splinter Cell on Gamecube, at which point you couldn't do the advanced per-vertex lighting and would just use the combiner stages.

I'll give you an easy example: the PS3 is setup-limited at 250 million polygons per second. But its vertex shaders are far, far more capable than Xbox 1's.
That’s because the same floating point processing that you do for transforming generic polygons, is also used for all aspects of manipulating those polygons.
So, I could make a game that is pumping 60 million generic polygons if I wanted. Or, I could keep lower, and use some of that floating point power, to do more complex things with them.
And, as has been stated plenty of times, Xbox cpu + 2 vert shaders are more capable than Gamecube’s cpu + single generic T&L unit. Both in raw flops, but also in complexity of what they’re capable of doing.

And once again, that 16-stage shader tree and 8 texture layers is all bull that you don't comprehend at all.

That’s just the way it is Wiiboy. The sooner you begin to understand that, and accept it, the sooner you’ll begin to accept that Nintendo pimped you this gen.
You should limit your defense to games and controls, where most other Nintendo fans have turned, and leave the “hardware analysis” to people who aren’t “autistic without the math skills”.
And stop making a fool out of yourself.

3ds Mac

On April 23, 2008 at 1:18 pm

***XBOX FRONT SIDE BUS 133MHZ NO ADDITIONAL COMPRESSION***

***GAMECUBE 162MHZ FSB PLUS 4TO1 COMPRESSION WITH REALTIME DECOMPRESSION***

And once again, Wiiboy, why would Xbox need its cpu to do this, when the vertex shaders are more than capable of reading compressed data straight from ram, in vertex buffers and hardware command streams, without using the cpu for such a task?
You've simply discovered that Gamecube uses the old, generic T&L setup. The cpu has to read its data from ram (up the front-side bus from main ram) to be prepared and processed, then send it back down to the generic T&L unit. I would hope that if that were a required step in processing, it would be capable of compression.

Not unlike compression that the vertex shaders are capable of using.

Grasp at those straws, Wiiboy.

3ds Mac

On April 23, 2008 at 1:21 pm

***GAMECUBE BEAT XBOX WII BEATS GAMECUBE SO BY SOME KIND OF XBOX FAN MAGIC IT BEATS WII***

And you continue with this… Are the voices back, Wiiboy?

360/PS3 >> Wii > Xbox > Gamecube > PS2.
Simple.

3ds Mac

On April 24, 2008 at 12:46 pm

***XBOX IN GAME 16 BIT COLOUR PLUS GAMECUBE CUSTOM OPTERMIZED 24 BIT HIGH RES TRUE COLOUR***

Got it backwards again. Caught making things up for the hundredth time now.
And this was explained to you repeatedly. Xbox processed and stored 32-bit pixels. Gamecube = 24-bit, cutting to 16-bit when an alpha value was needed for transparency.
Again, Xbox: 8,8,8,8. Any time.
Gamecube: 8,8,8, or 6,6,6,6 if you need something like transparency, or 5,6,5 when AA was used (which was likely never). And just to show how awesome the Wii is, it still has this limitation.
How awesome is that?

Quoted to you earlier:
“Doing aa on Gamecube required dropping to 16-bit color and 16-bit z, and its maximum resolution was 640×264, as opposed to 640×528 with no aa.”

This is a fact, Wiiboy. Outlined in Nintendo's own patent application.
And, Wii lacks a stencil buffer for things like shadows, so the bits for it, come straight out of Z-buffer depth as well.

****Xbox’s GPU also requires 16MB of the 64MB DDR just to cull a Z-buffer (which is embedded on the GCN GPU at no cost to system memory)****

And this. If you had a 640×480 frame with 32-bit z values, you need 640 × 480 × 4 bytes ≈ 1.23 MB for a z-buffer. If we assume AA, we can quadruple that. Then we can decrease it again, because all gpus use roughly 4:1 z-buffer compression.
Then we can toss in 32-bit color values, which msaa reuses for aa pixels, and we can say roughly 3 MB for color and z.
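The byte counting behind that, step by step, following the comment's rough accounting (decimal megabytes):

```python
# The frame-buffer accounting sketched above, in bytes (decimal MB).
w, h = 640, 480
z_plain  = w * h * 4        # 32-bit Z                         ~1.23 MB
z_4xaa   = z_plain * 4      # 4 subsamples                     ~4.9 MB
z_4xaa_c = z_4xaa / 4       # ~4:1 Z compression, back down to ~1.23 MB
color    = w * h * 4        # 32-bit color (reused for AA samples in this accounting)

total = z_4xaa_c + color
print(round(z_plain / 1e6, 2), "MB plain Z")
print(round(total / 1e6, 2), "MB color + Z  -> 'call it 3 MB'")
```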

Then we can compare that to Gamecube's required copying of that on-chip ram to main ram, since that's where it gets read to screen. So, they triple buffer: one copy on-chip, copied to ram, then copied to the front buffer. Sure, a frame takes up a little less space on Gamecube, due to the limited pixel formats the Gamecube used, but it likely compares well as far as how much system ram was required to store the frame buffer.

No point in comparing the size with aa, because the Gamecube would drop to half res, and 3/4 color and z depth, just to fit them on its awesome on chip ram.
Not to mention it would cut fill rate in half. (at least)

That’s not to say Xbox would be awesome with AA, even with hardwired 4x. (not cutting fill-rate) It had limitations too. Not nearly what Gamecube had of course, but they were there.

3ds Mac

On April 24, 2008 at 2:38 pm

In conclusion:
So true: You don’t know what you’re talking about.
So true: Nor do many of the folks you copy and paste from.
So true: You have no interest in reading the truth on any of it.
So true: You’re a delusional, irrational, fanboy, living in denial.

No one can dispute any of these particular points. Not even your fellow Nintendo Defense Force members, with whom you share a Wiibrain.

And keep in mind, Nintendo is a fine company, and makes some of the best games in the industry, but you’d have to be a seriously mindless troll to be so dedicated to attempting to falsely portray the Gamecube / Wii as “awesome hardware”, to somehow get over this inferiority complex you’ve developed.

Sampson

On April 24, 2008 at 3:30 pm

Wiiboy, still at it, eh?
He posts this stuff on a regular basis, across the internet, being perma-banned no doubt. Defending a company that no longer cares about him, or his interests. He’s a confused Nintendo graphics whore, and Nintendo couldn’t have built a worse piece of hardware when it comes to graphics. So he’s lost his mind, and aimlessly wanders the internet, posting bull arguments from last gen, because obviously, he has lots of time on his hands; it’s not as if he has any good games to currently play on his Wii. :lol: Or none that he enjoys, being so graphics obsessed.

Wiiboy:
This likely developed while you were “fighting your first fanboy wars” on message boards with PS2 and Xbox fantards last gen. Or perhaps this carries back to the early days of SNES vs Genesis.

You don’t bring up PS3 much in your claims, because the PS2 was so far ahead in sales, you saw only Xbox as a rival for PS2′s sloppy seconds.

But either way, it’s carried over to this gen, where Nintendo abandoned you and your ilk, and crushed your graphics whoring dreams, and left you with no ammo to defend yourselves. Now, you notice all that smug, condescending attitude you had when PS2′s RE4 was toned down compared to Gamecube’s, has blown up in your face. Sure, they tried to latch onto last ditch rays of hope, that Nintendo would use its awesome “displacement mapping” skills, combined with “cube mapping”, and NURBS, Nintendo wouldn’t let you down after all, and those specs leaked from IGN weren’t the whole truth….But, you saw the first released screens of its first game, Tony Hawk downhill, like everyone else….

After the initial denial that they were real, and not from the psp, panic set in, confusion and desertions from your ranks followed, and after being overwhelmed by actual next-gen graphics on competing platforms; some, were able to slip back into the normal, mainstream Nintendo fan-crowd, and champion controls and gameplay, and proclaim that “graphics don’t matter”. As if that’s all that’s important to you now.

But deep down, you’ll never be the same. You’ll always resent Nintendo for abandoning you for the grannies and soccer moms, and recycling the Gamecube, and calling it Wii.

You however, have not given up hope. You're still grasping at hardware specs, and trying to spin, misinterpret, and misrepresent this gen every bit of what you used last gen. Except it isn't working. The graphics aren't coming to back up your claims. They aren't showing the promise you hoped for. Developer after developer has come forward and said the Wii is hopelessly weak when it comes to graphics. And your only defense is to say they're lazy, and jealous of Nintendo's power, and somehow gimping graphics in a conspiracy to hold the system back from its true potential of being a "480p 360". When the specs don't imply that, the developers don't imply that, Nintendo doesn't imply that, and the games, even Nintendo first party, don't imply that.

All the while, the other systems have continued to improve, and pull well away from the Wii’s graphical grasps.

Some of your fellow NDF members, are happy with what they have, accept the Wii for what it is. The more adamant ones, that still feel the need to continue with their fanboy ways, are left jerking their pudd to sales figures from vgcharts, and drawing erotic crayon pics of Miyamoto and Iwata swimming in their money bin, while they wait another six months for Nintendo to throw them a bone, with another decent rehashed franchise, in between all the shovelware crap, and carnival game mini-games.

Tis a sad sight to see a former Nintendo graphics whore drifting into madness like that… A sad, sad sight indeed. :smile:

lennell

On April 24, 2008 at 9:20 pm

to: 3ds mac
You are right about the Gamecube and Xbox. Overall, both systems can do things that the other can't do at max power, but the Xbox will still outdo the Gamecube in power. That's the truth, Wiiboy.

3ds Mac

On April 25, 2008 at 2:31 am

To me, if the consoles were athletes back in the day, and Gamecube were Tiger Woods, Xbox would have been Bo Jackson or Michael Jordan.
Taller, faster, and just better at most things. Sure, Woods is far better at golf, but Jordan is actually not a bad golfer, and would smoke Tiger in basketball. Probably had better endurance, could jump higher, farther and run faster. Woods might win at pool, Jordan would win at baseball, Woods might hold his own in tennis, Jordan could likely lift more weight, Woods might win at darts, and perhaps hold his own in swimming, Jordan would win at the high jump, and most track and field events, Woods might win in a rap battle, etc…

But as an overall athlete, Tiger Woods is certainly not better, and latching to one thing or another, doesn’t change that.

Goofy, generic analogy, I know, but Wiiboy is trapped in a feedback loop it seems.

wiiboy101

On April 25, 2008 at 10:28 am

Gears of War characters: 10k to 15k polygon counts.

Res Evil 4 on Gamecube (not Wii, a Gamecube engine): Leon is 10k. So Gamecube is hitting polygon counts on par with Gears of War, not with an Xbox 1 title.

Gears of War: 10k to 15k.

Leon, Res Evil 4 Gamecube: 10k.

Master Chief, Xbox 1 Halo 2: 2k.

2k x 5 = 10k. Gamecube clearly kicking Xbox ass and near matching 360.

That's Gamecube, never mind Wii.

Reminder:

Gears of War: 10 to 15k polygons.

Halo 2: 2k polygons.

Res Evil 4 Gamecube: 10k polygons.

The evidence speaks for itself :lol:

3ds Mac

On April 25, 2008 at 1:19 pm

That doesn’t tell you anything about performance though. And you’d have to be pretty desperate to try to spin it that way.

Halo 2 used normal mapping on everything and allowed for a far higher-looking count than part 1 (which was technically higher).
And RE4 is a closed-area shooter; Halo 2 is far more open and expansive. You could assume Gamecube could run everything in Halo 2 as-is, no problem, but you'd be wrong.

And the T&L processor in Gamecube and Wii, serves the same purpose as an Xbox vertex shader, and processes roughly the same amount and same type of data per clock cycle, minus being fully programmable on its own.
There are two of those processors in XBox, One in Gamecube.
Around 8, far more capable versions of them on PS3, at twice the clock frequency.
So, you could cut PS3 clock rate in half, only use 1 of the 8, and it’d still have more floating point power devoted to vertex work than Wii.

No point even tossing 360′s unified shaders into it.

3ds Mac

On April 25, 2008 at 1:38 pm

***the evidance speaks for its self :lol: ***

Evidence of your rabid, delusional, fanboyism sure. But that was never in question .

wiiboy101

On April 25, 2008 at 6:51 pm

"The Xbox had a hard drive, how could Gamecube load faster?" That is so fanboy it clearly shows you lot know nothing.

Gamecube clearly beats Xbox at loading data, even without a hard drive. That's fact, yet you're arguing against fact. Dude, I mean really, dude :wink:

Wii's compressed, on-the-fly data streaming is faster still than Gamecube's, its flash is faster than the X360's hard drive, and its visual data is capped at 480p.

So again, Wii whips ass at loading and streaming speed. PS3 is like a PSP, X360 like an Xbox 1, and Wii is faster still than Gamecube, by far the fastest-loading disc system out there. It's called designing your system for games, not movies.

Check out GTA 4 on PS3: even after installs it loads like crap. X360: no such install required, but it still loads like crap.

Where's the loading in true Wii titles? There isn't any, or very, very little.

wiiboy101

On April 25, 2008 at 7:34 pm

Apparently silicon-on-insulator is old tech // apparently copper wire is old tech // apparently tight, micro-embedded transistor design is old tech // apparently die-embedded, high-speed S-RAM main memory is old tech //

IBM's latest manufacturing tricks and tight, shrunk MPU (cpu) design from 2006/7 is in fact 2000 tech? Hey, what the heck.

The tighter your chipset, the better it performs // die-embedded beats board-embedded // clock-balanced beats unbalanced // silicon-on-insulator beats non-SOI // copper wire beats aluminium // it's common knowledge that chips, ram, etc. in general are inefficient piles of crap that never reach their theoretical performance figures.

The whole point of such tight design and sweet balance is to suck out every last drop of available power.

Xbox inefficient, Gamecube highly efficient, Wii even more efficient. GET IT?

Example of effective results:

DRAM: 50 to 100 ns latency.

1T-SRAM-R: 5 ns latency. GET IT? Vastly more real-world ram speed than silly old DRAM.

Learn yourselves, fools.

3ds Mac

On April 25, 2008 at 10:59 pm

Wii and Gamecube load quick, because there isn’t much ram to load. And again, you can design a game to not have load times at all. It would just mean balancing what ram you use for what task. Most developers don’t like compromises like that, or don’t like dealing with streaming.

But there are games on both systems that do exactly that type of thing to eliminate load times completely. Uncharted for example, loads for about 30 seconds at the beginning of the entire game, and that’s it.

Gamecube had 16 megs of A-RAM that wasn't useful for much else. So it was useful to pre-load things there, to shift in and out of main ram. But if you magically made that 16 MB into more 1T-SRAM, companies like Factor 5 would have used the entire thing for real-time rendering. They would consider keeping things there to pre-load a waste then.

Anyway, Xbox had lots of things that improve efficiency, a post-transform vertex cache for example. It keeps vertices around on chip to be reused. Gamecube lacked things like that. It lacked a stencil buffer, it had problems with z-buffer depth, etc..

And there’s no point bringing up building material and manufacturing processes, because things like copper and die shrinks improve performance, mostly by technically allowing chips to be clocked significantly faster, with less heat output.
Take a chip on 90 nm, and a chip on 110 nm; as long as they're clocked the same, there isn't much performance difference. The difference is in the fact that the 90 nm chip could be clocked faster and stay cooler, using less power.
Of course, Nintendo didn’t use that efficiency to increase clock frequency, they still kept it clocked low for less power and heat, thus allowing them to stick it all into a tiny box.

And if you were using manufacturing technology in your claims, Xbox gpu used 130 nm fabs (.13 microns) when it launched, Gamecube’s gpu was 180 nm (.18 microns)
.13 was newer, more advanced and efficient than .18.

And you could have a 360 with ram with 10 nanosecond latency, and one with 50 nanosecond latency, and as long as the game was programmed properly, (from the gpu’s perspective) there wouldn’t be any difference in performance.
Simply because gpus are designed to hide memory latency. If a pixel shader doesn’t have its texture fetch result from ram, it processes floating point on other shaders until it does.
It would be like, if you were preparing a cake, and needed to break and add eggs, measure and add flour, milk, sugar, etc… And had someone going back and forth to the fridge to fetch ingredients, it doesn’t really matter how fast you get the milk after requesting it, because you can break and beat the eggs while that’s happening, as that’s a task that needs to be done anyway.

So, by the time you’ve done what’s required to the eggs, the milk is already there to be used.
And so on, and so forth. That’s what the thread arbiter is for.

Wii likely doesn’t have much to hide latency with, so low latency would be required, otherwise it’d be stalling constantly.
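A minimal sketch of the latency-hiding point above, in Python, with made-up numbers purely for illustration: if there is enough independent work to overlap with an outstanding memory fetch, the fetch latency barely shows up in the total time.

def frame_time(fetch_latency_ns, independent_work_ns, can_overlap):
    # With overlap, the fetch runs in the shadow of other work;
    # without it, the processor stalls for the full latency.
    if can_overlap:
        return max(fetch_latency_ns, independent_work_ns)
    return fetch_latency_ns + independent_work_ns

print(frame_time(50, 60, can_overlap=True))   # 60  -> the 50 ns fetch is hidden
print(frame_time(50, 60, can_overlap=False))  # 110 -> the 50 ns adds directly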

3ds Mac

On April 26, 2008 at 12:22 am

****ibms latest manufacturing tricks and tight shrunk mpu(cpu) design from 2006/7 is in fact 2000 tech HAY WHAT THE FUK****

It’s still a G3 derivative, rather than a higher spec chip. (though highly modified I’m sure)
And of course IBM would be using the latest manufacturing techniques to make chips for the Wii. All their fabs have moved to better processes. PS2 continues to be moved to better processes, getting smaller and smaller. But the specs haven’t changed.

Nintendo made the Gameboy advance again in the Gameboy Micro, and it has the same specifications and performance, but the chips would be using the newest technology, thus allowing them to be far smaller and more efficient than the original. But it doesn’t make the specifications cutting edge.

lennell

On April 28, 2008 at 1:58 pm

3ds mac
that is right, ps2 processors got smaller and better, just like nintendo did with the gameboy advance, now it's the gameboy micro… and yes, you can design a game with no load times, it's up to the developers.

lennell

On April 28, 2008 at 2:09 pm

and the same goes for the gamecube and xbox load times, both consoles load differently but it's all up to the developers.

darren-hack-1980

On May 8, 2008 at 2:37 pm

so what we have here is blind ignorance: so no edram beats edram, slow ram beats 1t-sram, uncompressed gpu reads beat compressed gpu reads, standard rendering beats custom tile rendering, and the xgpu's in-game fill rate is clock for clock the exact same as flipper or hollywood? THAT MAKES NO SENSE

hollywood as a whole is more efficient and effective than xgpu, and wii as a whole is more efficient and effective than xbox.

hollywood has real-world abilities higher than xgpu:

3mb of edram/1t-sram custom graphics cache, a high-bandwidth in-chip buffer/cache (xbox/xgpu has nothing like this)

custom high-speed tile rendering (xgpu/xbox doesn't have this)

virtual texture design (xgpu/xbox doesn't have this)

high-speed, die-embedded fast ram (xbox/xgpu doesn't have this)

high-bandwidth system bus with realtime decompression (xbox doesn't have this)

tight co-processing from cpu-clock-synced system clocks (xbox doesn't have this)

real-time, on-screen peak fill rates at least 2x xgpu

realtime in-game graphics ram at least 2.5 times an xbox's and 10x faster in latency terms

27mb of sram, with on-the-fly data supplied from the gddr3/disc/flash memories, gives a constant feed of realtime cache-speed data to the gpu

that's 24mb of in-die sram before you seek from gddr3/flash/disc. that's a huge amount of sram performance right there. if you compared that directly to xbox you'd get

xbox: 128k l2 cpu cache and probably 128k of gpu cache, no ability to hold and read compressed data
VS
wii: 256k cpu cache plus 4-to-1 compression, 1mb texture cache plus 6-to-1 compression plus virtual texture design, 2mb frame/z buffer, and 24mb of in-chip 1t-sram with 6-to-1 compression of textures and 4-to-1 data compression

grow up, xbox fanboys. wii is a 480p, xbox-360-competitive-level machine. in-game it's clearly closer to x360/ps3 than it is to xbox 1

prove me incorrect

3ds_Mac

On May 9, 2008 at 12:23 am

Who’s debating Hollywood vs Xbox? Don’t see it much here.

And every single thing you just mentioned, from edram, to cpu bus compression, to virtual texturing, to 1tsram, is a requirement for the Gamecube and Wii to be a functional system. I’m not sure how else it could be explained to you guys.
The gpu uses the die-embedded frame buffer to read and write things directly to and from. It only did 1 texture layer at a time. It wrote to the frame buffer 4 times for 4 texture layers. You NEED abundant frame buffer bandwidth to keep up.
An Xbox does 2 at a time, and loops back on chip once, before writing to the frame buffer. It's designed, just like the Geforce3's and 4's, to use main ram as a frame buffer. The Gamecube would run terribly if it tried the same thing.
It’s part of the system’s design.
Comparing numbers and bandwidth figures is apples and oranges for every single figure you guys give. They aren't designed the same, and many of the components don't serve the same purpose. And rattling off numbers and features because they're high or interesting to you doesn't show anything at all, unless you look at them in proper context. Which you guys fail to do every time this stuff gets posted. You ignore context completely. No different than generically citing the 2,560-bit bus width in PS2's gpu.

And there isn’t a single figure anywhere in the Wii’s specifications that would indicate “480p 360″, or even “480i 360″.

It IS closer in performance to Xbox, than it is to Xbox 360. That’s exactly what the specifications indicate, and exactly what you see in games. (if even that, thanks in part, to developer laziness with visuals)

Deny that all you’d like, doesn’t change anything.

3ds_Mac

On May 9, 2008 at 4:02 am

***uncompressed gpu reads beat compressed gpu reads***

And I don’t know how many times this has to be said, explained, dumbed down and explained, analogies given, retyped, re-said, reworded, and reiterated.
Xbox's gpu used the EXACT same compression algorithms Gamecube did. The exact same, and used them the exact same way. The only reason you fanboys latched onto it was because ArtX had licensed s3tc for their gpu design. That was considered news back then, while Nvidia and ATI had already been using their own implementations of it, and therefore it was not news.

The difference being, that Xbox had additional formats for things like 3d volumetric-textures.

And continuing to cite Gamecube’s texture cache to compare directly to Xbox, is once again, ignoring context of how it was used, why it was required, and why it was there in the first place.
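For context on the "6 to 1" texture figure used throughout this thread: it is just the fixed ratio of S3TC/DXT1-style block compression, which, per the point above, both consoles' texture hardware reads natively. A quick check in Python:

block_uncompressed = 4 * 4 * 3   # a 4x4 block of 24-bit texels = 48 bytes
block_dxt1 = 8                   # one DXT1 block: two 16-bit colours + sixteen 2-bit indices
print(block_uncompressed / block_dxt1)   # 6.0 -> the oft-quoted 6:1 ratio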

3ds_Mac

On May 9, 2008 at 5:24 am

****Grow up xbox fanboys wii is a 480p xbox 360 competive level machine its clearly in game closer to x360/ps3 than it is xbox 1****

If you aren’t capable of reading the specifications properly yourself or simply don’t want to, look no further than every single developer there is, including Nintendo themselves. All of them, will disagree with you.

And if for whatever reason you don’t believe them, look at their games. All the proof you need, is right there on the screen. Even the best ones aren’t visually impressive from a technological standpoint by today’s standards.

“It is essentially a Gamecube”.”My idea was to spend nothing on the console technology so all the money could be spent on improving the interface and software.” -Miyamoto.
“We no longer believe in the philosophy of losing money on hardware in the beginning of a console’s life-span, to be made up in software sales and royalties. We believe in making money from launch, and continuing to increase profits as the costs to manufacture gets cheaper and cheaper”. pp – N-rep.

Nintendo can get away with that this gen, because of their controller, and because their games are still fun, and they picked up a bunch of “non-gamers”. They don’t need their fanboy brigade to twist specs numbers to pump up their hardware. It’s an over-clocked Gamecube with more ram. It’s what a Geforce4 is to an Xbox. Even Miyamoto would tell you that.

Nintendo isn’t even trying to mislead you, you guys do an awesome job of that on your own.

darren-hack-1980

On May 9, 2008 at 11:10 am

xbox xgpu texture read bandwidth: peak ram bandwidth is 6.4 gb, shared, and it also has to support the gpu's frame and z buffering, leaving around 3.2 gb, if that, for gpu texture reads. no compressed reads supported, all data/textures have to be read uncompressed, bottlenecking the xgpu and reducing available bandwidth

so xgpu texture read bandwidth is easily below 3gb

hollywood: 1mb texture cache plus 6-to-1 compression = 6mb effective texture cache vs the xgpu's 128k texture cache

hollywood gpu texture read bandwidth: 16gb plus 6-to-1 compression = 96gb texture read bandwidth. add virtual texturing for a (50%) increase in texture space and bandwidth

the 6mb cache becomes 9mb effective and 96gb of bandwidth becomes 144gb of bandwidth

so hollywood's peak theoretical texture read bandwidth is 144gb

vs the xbox xgpu's texture read of about 3gb of bandwidth

NO CONTEST

WIIBOY101

On May 9, 2008 at 11:17 am

XBOX TEXTURE READ: 16-BIT COLOUR, 16-BIT TEXTURES (32-BIT IN YOUR DREAMS), 128-BIT GPU READ FROM THE MAIN RAM BUS

128-BIT TEXTURE READ

VS

HOLLYWOOD GPU TEXTURE READ: 32 X 16 BIT = 512-BIT TEXTURE READS, 4X XGPU

XGPU TEXTURE CACHE: 128K

HOLLYWOOD TEXTURE CACHE: 1MB PLUS 6-TO-1 COMPRESSION = 6MB TEXTURE CACHE

HOLLYWOOD SUPPORTS OPTIMIZED 24-BIT COLOUR, SO 32-BIT COLOUR EFFECTIVE
IN GAME

XBOX XGPU: 16-BIT COLOUR EFFECTIVE IN GAME = HALO, HALO 2, ETC

AVERAGE RAM LATENCY, XBOX: 50 OR SO NS

WII'S FAST RAM: 5NS LATENCY, AT LEAST 10 TIMES FASTER REAL-WORLD RAM READ SPEED

XBOX DATA: 1MB RAM = 1MB DATA

CUSTOM COMPRESSION WITH REAL-TIME DECOMPRESSION, WII: 1MB RAM = 4MB OF DATA

AGAIN NO CONTEST REALLY

WIIBOY101

On May 9, 2008 at 11:19 am

EVERY 1MB OF DATA IN XBOX RAM IS EQUAL TO 4MB OF PEAK DATA IN WII'S RAM WITH CUSTOM ON-THE-FLY COMPRESSION/DECOMPRESSION

NO CONTEST

EVEN X360 TRIES TO COPY GAMECUBE'S/WII'S VIRTUAL TEXTURING AND CUSTOM COMPRESSION, SO DON'T EVEN TRY TO SAY OTHERWISE

WIIBOY101

On May 9, 2008 at 11:40 am

THE FOLLOWING INFO IS THE BROADWAY CPU IN STANDARD RUNNING FORM, FOLLOWED BY ITS PEAK ABILITY WITH CUSTOM 4-TO-1 COMPRESSION….

THE XBOX CELERON GETS ROUGHLY 50% OF BROADWAY'S BANDWIDTH READS RUNNING NO COMPRESSION, SO XBOX = 50% OF THE STANDARD CHART, NOT THE COMPRESSED CHART. LOOK AT THE MASSIVE DIFFERENCE

Normal Broadway interface:
Bus Interface Unit to System Bus = 64 bit * 243 MHz = 1.944 GB/s
Bus Interface Unit from chip = 17 GB/s
L2 Data cache to fill buffer 64 bit * 729 MHz = 5.832 GB/s
L2 Instruction cache to L1 instruction cache = 32 bit * 729 MHz = 2.916 GB/s
DMA controller to fill buffer 64 bit * 729 MHz = 5.832 GB/s
Fill buffer to L1 Data cache 256 bit * 729 MHz = 23.328 GB/s
Write Gather Pipe from Load/Store Unit 64 bit * 729 MHz = 5.832 GB/s
XBOX CELERON: ONLY AROUND 50% OF THE ABOVE BANDWIDTHS, OR THEREABOUTS. LOOK BELOW FOR BROADWAY'S REAL PEAK BANDWIDTHS, DESTROYING XBOX

Broadway's data compression
With an average data compression ratio of 4:1:
Bus Interface Unit to System Bus 4:1 = 7.78 GB/s
Bus Interface Unit from chip 4:1 = 68 GB/s
L2 Data cache to fill buffer 4:1 = 23.328 GB/s
L2 Instruction cache to L1 instruction cache 4:1 = 11.664 GB/s
DMA controller to fill buffer 4:1 = 23.328GB/s
Fill buffer to L1 Data cache 4:1 = 93.312 GB/s.
Write Gather Pipe from Load/Store Unit 4:1 = 23.328 GB/s

CASE RESTED, YOUR HONOUR. I LEAVE IT IN THE HANDS OF THE JURY
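For what it's worth, here is how the raw figures in the chart above fall out of the bus widths and clocks listed; the compressed rows simply multiply these by the quoted peak 4:1 ratio, which is a peak figure rather than a sustained real-world one. A quick check in Python:

def bandwidth_gb_per_s(width_bits, clock_mhz):
    # (bits / 8) bytes per transfer, clock_mhz million transfers per second
    return width_bits / 8 * clock_mhz / 1000

print(bandwidth_gb_per_s(64, 243))    # 1.944  - bus interface unit to system bus
print(bandwidth_gb_per_s(64, 729))    # 5.832  - L2 data cache to fill buffer
print(bandwidth_gb_per_s(256, 729))   # 23.328 - fill buffer to L1 data cache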

WIIBOY101

On May 9, 2008 at 11:46 am

XBOX BUS: 1GB

WII BUS RUNNING STANDARD: 1.9GB

WII RUNNING CORRECTLY (HINT: 3RD PARTIES): 7.7GB-PLUS BANDWIDTH

XBOX CPU L2 CACHE: DATA AROUND 3GB, INSTRUCTIONS AROUND 3GB

WII BROADWAY L2 CACHE: DATA 23GB, INSTRUCTIONS 12GB

THE WHOLE PIPELINE, FROM SYSTEM BUS TO THE CPU ITSELF, ALL CACHES AND BUFFERS AND PIPES, IS CLEARLY HITTING 8X XBOX BANDWIDTHS AND 10X THE RAM SPEED IN LATENCY TERMS

IT DESTROYS AN XBOX. CLEARLY A 480P X360-LEVEL MACHINE

EVIDENCE PROVES CASES

3ds_Mac

On May 9, 2008 at 2:09 pm

***xbox xgpu texture read bandwidth peak ram peak 6.4 gb bandwidth shared and allso has to support gpu buffering frame and z buffer. leaving around 3.2 gb if that for gpu texture reads no compression read supported all data/textures have to be red un compressed bottle necking the xgpu and redusing avalable bandwidth***

So, what we've concluded here is that you're retarded? And don't think I don't notice you signing your posts under multiple names, because no one else is this stupid. You should scroll up and refresh your pathetic memory on this stuff, Wiitard.
All reads are compressed. You denying it doesn’t change that.

****6mb catch becomes 9mb effective and 96gb bandwidth becomes 144gb bandwidth
so hollywoods peak texture bandwidth read is 144gb bandwidth theoretical
vs xbox xgpus texture read of about 3gb bandwidth. NO CONTEST****

See, and here again, you prove you’re an idiot. 144gb of bandwidth… And all the rest of your bandwidth bull….
And yet the games look the way they do. So, let me get this straight, Wiiboy: you're saying Nintendo has a team of retarded programmers? You're saying Retro Studios is full of drooling morons? They have all that bandwidth, and couldn't do a single bump map? That's good to know that you think that, Wiiboy.
.
And here, I thought you were a Nintendo fanboy… “Nintendo am sad”

****BOX TEXTURE READ 16 BIT COLOUR 16 BIT TEXTURES (32BIT IN YOUR DREAMS)128 BIT GPU FROM MAIN RAM BUS****

See? And even more of you having no clue what you’re regurgitating Wiiboy. You’re not even on the right subject.
Pixel shaders process red, green, blue, and alpha = 32 bit. It's a measure of color precision on the pixels being rendered. 24-bit is all you need for color; the additional alpha is where things like transparency are stored. Each pixel shader can process all 4 values each clock cycle, whether you need them or not.

Gamecube was capable of storing 24-bit color, which is fine until you need somewhere to store an alpha value. So, if you need a value for transparency, it comes out of the color bits: 6, 6, 6, 6 = 18-bit color plus a 6-bit alpha. Or 16-bit, if you tried to do AA. The same thing goes for the Z-buffer depth.

360 can do a 10, 10, 10, 10, or a 16, 16, 16, 16, etc., if you need a higher dynamic range, and can still do AA without dropping anything. You haven't got a whiff of a clue what you write, so there's no point in pointing this out to you.
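As a small illustration of the precision point above (Python): four 8-bit channels need 32 bits, while squeezing an alpha channel into a 24-bit target forces each channel down to 6 bits, with far fewer levels per channel.

for bits_per_channel in (8, 6):
    total_bits = 4 * bits_per_channel     # R, G, B and A channels
    levels = 1 << bits_per_channel        # distinct values per channel
    print(bits_per_channel, total_bits, levels)
# 8 -> 32 bits total, 256 levels per channel (the 8,8,8,8 case)
# 6 -> 24 bits total,  64 levels per channel (the 6,6,6,6 case)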

And once again, for the millionth time now, the bull "custom data compression" you keep reposting about is absolutely nothing more than using lower-precision 8- and 16-bit bytes and shorts on geometry, for the cpu to pass geometry to the generic T&L unit that Nintendo kidnapped from the mid 90's.

That's it, Wiiboy. All vertex shaders are capable of unpacking compressed data. Nothing to see here. Gamecube and Wii would have no choice but to use the cpu to emulate what most modern gpus do in hardware.
That particular portion of the gpu was considered cutting edge back when you were home watching Boy Meets World and Fresh Prince of Bel-Air.

And I don't know whom you think you're convincing with any of your garbage. Anyone with a fraction of a brain can see you're an idiot. You got your argument and claims completely destroyed, shat on, and pitched back in your face before I even began posting. And now you resort to making things up outright, with your failed attempt at deriving cache bandwidths.

darren-hack-uk

On May 9, 2008 at 2:35 pm

"xbox loads faster than gamecube, it's got a hard drive" - laughing my ass off at that

xbox halo 2: 16-bit colour, 30 frames dropping to 10 a lot, and dreadful load times

half-life 2 xbox: long mid-level load times

metroid prime 1/2: no load times whatsoever, 24-bit colour, a solid 60 frames, killing the xbox

twilight princess: no loading times, massive open worlds, better graphics than any xbox game

xbox girls, shut the fuk up

3ds_Mac

On May 9, 2008 at 3:18 pm

Lets try something different here Wiiboy.

Wii is clocked at 243 mhz.
PS3 is clocked more than twice as fast.

Wii has one T&L processor, that acts as a semi-retarded vertex shader.
PS3 has 8 vertex shaders.

8 times the processors, at twice the clock frequency, and we’ll ignore the fact that the Wii’s unit is limited.

PS3 = winner.

Wii has 4 pixel pipelines. This includes the ability to do 1 texture layer, and the combiner op per clock cycle. (4 rops, with a single texture, and a combiner op)

PS3 is, once again, clocked twice as fast, and has 8 rops, capable of 2x z-only fill rate for the pre-pass.
Tied to 24 pixel shader pipelines, each capable of a texture per clock, (24 total) each having 2 alus, (48 total) each capable of 4 floating point madd ops per RGBA.

To make this easy, I’ll give you a generic figure to compare the two, and ignore the 2x-z rate for aa on PS3.

If processors were people, PS3 has twice as many folks to process pixels and can move twice as fast.
8 times the folks to do geometry, and can work twice as fast.
6 times the people to do textures, and move twice as fast.
12 times as many people to process the floating point operations related to effects, and can… you guessed it, work twice as fast.
Along with more than 6 times the amount of space to store things for processing, and moves several times the amount per clock, at a far higher clock rate.

I didn't take time to explain how using more than one texture layer fractures fill rate on Wii, while this isn't the case on PS3. And I'm ignoring the 2x z-fill rate on PS3, the fact that a PS3 vertex shader is far superior to the one in Wii, and I didn't even bring in Cell's vector processors, or the fact that the gpu can do AA, "128-bit color", or any other marketing figure.

Winner = PS3.

No amount of (capped at 480p) changes this.

3ds_Mac

On May 9, 2008 at 3:25 pm

*You don’t understand, nor are you capable of understanding color precision.
8,8,8,8, = Xbox
8,8,8, or 6,6,6,6, or 5,6,5 for Gamecube and Wii.
32-bit Z = Xbox.
24-bit or 16-bit = Gamecube and Wii.
Reality is hard to comprehend for you I guess.

And your load times are hidden behind doors in Prime. And you can time load times on multi-platform games. And sure, we can ignore disc transfer rates, or the transfer rate from the HDD, and just look at multi-platform games.

Been explained repeatedly.

3ds_Mac

On May 9, 2008 at 4:19 pm

So, in conclusion, 720p is exactly 3x 480p. And yet, figures wind up 6x or more, on most things. (and a legit 6x+ more, not some backwards bull figures you attempt to throw together)
In addition, the Wii is far, far weaker at floating point processing of all kinds, far less programmable and flexible, and inferior at, or outright incapable of, lots of things I ignored.

The answer is simple.

It won’t.
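Quick arithmetic behind the "exactly 3x" figure above, assuming 640x480 for 480p and 1280x720 for 720p:

print((1280 * 720) / (640 * 480))   # 3.0 -> 720p pushes exactly three times the pixels of 480p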

WIIBOY101

On May 9, 2008 at 5:50 pm

broadway cpu: risc, copper wire, silicon on insulator, micro-tight transistor design, over 20 million transistors

celeron: not shrunk, not a micro-embedded design, not copper wire, not silicon on insulator, not risc, only a 9 million transistor count

it's clearly only half the broadway cpu, if that

oh yes, fsb: 133 mhz vs 243 mhz plus 4-to-1 compression

it's all so simple

3ds_Mac

On May 10, 2008 at 3:16 am

****its all so simple****

Is it, Wiiboy? Is it really?

Xbox gpu = 60 million.
Flipper = 51-million.

Half of flippers transistors is 1tsram.
~25 million for processing logic.
Guess what the majority of NV2A’s transistors are devoted to?

It's not fair to compare 60 million to a streamlined 25 million directly, as that would be fanboyism, but nonetheless the numbers are correct.

But, lets take a look at what you posted.
9 million vs. 20 million.
9-million for what? Does this include L2 cache, or is that separate?
Obviously it’s separate. And suggests that perhaps someone thinks Xbox’s cpu was a pentium Katmai core, despite everywhere listing it as Coppermine.
The difference being that the L2 cache being on the chip, as opposed to being external, adds a lot to the transistor count.

Direct quote:
“Coppermine offers many benefits over its previous core, and offers a 0.18-micron technology compared to the KATMAI core, which was distributed on the 0.25 process. This technique will allow the CPU to be distributed in a smaller size. The 0.18 process allows the implementation of over 3 times more transistors, which is a big move forward. Compared to the KATMAI core (9.5 million transistors) the Coppermine core uses 28.1 million transistors. This is due to the direct L2 implementation on the chip, which uses a large amount of space on the core.”

Now, since we’ve determined that adding transistors accounted for L2 cache, we can count them up. The quote above, refers to a Coppermine with 256kb of L2 cache. Xbox has 128kbs.
So we can estimate that Xbox should be around ~19 million transistors.
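The arithmetic behind that "~19 million" estimate, in Python, using only the figures quoted from the Coppermine/Katmai comparison (the paragraphs that follow then note the extra cache may simply be disabled rather than absent):

coppermine_total = 28.1e6   # Coppermine with 256 KB of on-die L2, per the quote
katmai_logic     = 9.5e6    # Katmai core, L2 off-chip, per the quote
l2_256k = coppermine_total - katmai_logic      # ~18.6M transistors for 256 KB of L2
xbox_128k_estimate = katmai_logic + l2_256k / 2
print(xbox_128k_estimate)   # ~18.8 million -> "around ~19 million"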

A quick look here:
http://www.cpu-collection.de/?tn=&l0=md&l1=2001&l2=Intel#KC733/128(XboxCPU)
Tells us that, they’re either not adjusting their transistor counts to account for the differences in L2 cache size, (as it’s listed at 28.1 million) or this quote from anandtech is true.

“The fact that Intel decided to go with a 128KB version of the Coppermine core indicates that there is a way of disabling half of the L2 cache without modifying the mapping associativity. We fully expect the Xbox’s CPUs to be nothing more than Coppermine Pentium 3 processors with half of their 256KB L2 cache disabled.”

And Eurogamer PC, in regards to Celeron and Pentium 3.
“The cores themselves are both forged on the same 0.18 micron process, and it has been said that the Celeron II core is in fact a Pentium III E core with half the cache disabled.”
“This has been hinted at by Intel, and it would make a great deal of sense. Instead of requiring a new production line, they can just take chips from the existing Coppermine process, and modify them so that half the cache doesn’t work. This may seem like a waste of money, but it is far more efficient for Intel to be able to churn out one chip and modify it later, than it is for them to have two separate production lines.”

Which makes sense, and shows why the profile of the Celeron from the same family, listed directly below it on that site are the same.

But, you can also go to Intel’s own site, and scroll down till you find a processor that fits the profile, and look at the transistor count.
http://www.intel.com/pressroom/kits/quickreffam.htm#Celeron

Now, lets take a look at the g3 750ish cpu. A quick look here again:
http://www.cpu-collection.de/?tn=&l0=cl&l1=PowerPC&l2=IBM
Indicates that it contains around 6.5 million transistors. But then, you’ll notice, that while this profile accounts for things like the L1 cache “32+32″, it doesn’t list an L2. While in the Xbox profile, it lists L1, but also L2.

So, taking into account the 256 KB L2 cache in the Gamecube and Wii cpu, and assuming they modified it, it would likely be ~21 million transistors.

But again, as has been said many, many times, the cpu in Xbox isn’t meant to serve the same purpose as the cpu in Gamecube and Wii. It takes up more slack for the gpu.

3ds_Mac

On May 10, 2008 at 3:26 am

So in closing, you’re comparing two different transistor counts.
Again apples and oranges.
And you’ll also notice, that the Anandtech article on Gamecube’s cpu, that listed it as having “21million transistors compared to 9 million” was also of the opinion that it was a weak sauce cpu compared to xcpu.

I, would give Nintendo bonus points for modifying theirs, and having 1tsram, which would benefit a cpu more than a gpu. Perhaps making up for its shortcomings.

However, it does still have more responsibilities….

darren-hack-1980

On May 11, 2008 at 5:17 pm

flipper texture read: 10.4 gb x 6 for compression = 60gb-plus of texture read bandwidth (gamecube), add more for virtual texturing in hardware

xgpu texture read: 6.4 gb - wrong, 6.4gb shared with the whole system, halve it to 3.2gb, no compressed texture read and process capability, 3.2gb if that for max texture read

hollywood texture read: 16gb x 6 for compression = 96-odd gb of bandwidth vs the xgpu's 3.2 to 6.4 at the very max system bandwidth (wii), oh, and add more for custom virtual texturing in hardware

simply no contest. 1t-sram beats dram and edram, 1t-sram utterly destroys dram

1t-sram-edram beats edram also

xbox 360: 10mb of off-chip (less effective than on-chip) edram, based on dram, no texture cache, just a huge hd buffer and some AA capability, reads at 32gb bandwidth

wii: 3mb of in-chip 1t-sram-edram inside the gpu, 512-bit texture cache at 16gb bandwidth
frame/z buffer at 12gb read bandwidth
16+12 = 28gb bandwidth

xbox 360 gpu edram: slow, off-chip edram, 32gb bandwidth

wii hollywood gpu: very very fast, in-chip 1t-sram-edram, 28gb bandwidth

x360 at 720p/1080i-p: only 32gb bandwidth. wii gpu locked at 480p: 28gb bandwidth plus compression

AS YOU CAN CLEARLY SEE, WII IS CAPABLE OF MONSTER FILL RATE/BANDWIDTH AT 480P RENDERING

PS2 PORTING IS NOT WII GRAPHICS

WIIBOY101

On May 11, 2008 at 5:27 pm

WII TEXTURE BANDWIDTH: 16GB X 6 FOR COMPRESSION = 96GB, ADD 50% FOR VIRTUAL TEXTURING
= 144GB bandwidth at 512 bit, 24-bit colour, 24-bit textures, custom bit-rate texture read cache

HARDLY UNDERPOWERED, IS IT. and saying "but the cache is only 1mb" is totally meaningless, as 1mb acts as 6mb with compression, plus 50% more available space through virtual texturing, and the cache is fed by THE MIGHTY POWER OF 1T-SRAM, so it's as fast as cache memory: no waits, no delays, just realtime on-the-fly texture reading, no bottlenecks, no thrashing

that's a 720p-capable texture read locked down at 480p = insane fill rate and bandwidth at 480p

the EA spokesperson lied. he was marketing hd graphics on ps3/x360 and separating wii by marketing waggle

it was a sales pitch, not an honest look at wii specs

how many times: WAKE UP, FANBOYS

3ds+max

On May 11, 2008 at 8:21 pm

And you continue to attempt to explain things you know nothing about, and obviously don't wish to understand.
Your multiplication skills don't help in situations like this.

Xbox reads compressed textures. Gamecube reads compressed textures. PS3 reads compressed textures. Geforce 2 reads compressed textures, etc, etc, etc, etc…

Your continued retarded Rain Man routine doesn't change this, Wiitard. Everyone else on the planet knows this, except you.

And there is no added multiplication to bandwidth on-chip for frame-buffer operations. It doesn’t compress frame buffer ops.

And the frame buffer bull you just attempted to outline for 360 proves even further how little you understand.

For the 2nd or 3rd, or perhaps fourth time now: 360 uses a 32 gb bus to feed the rops in one direction. Rops are the processors that actually write pixels. Those processors are on chip with the edram, and the bandwidth between them for reading and writing is 256 gb per second. There is no 1t-sram advantage over that. There is no reading from there by the gpu.

And you can look no further than the PS3, or any pc available, that uses gddr3 for its frame buffer. There's 20gb of bandwidth there for reading texture and vertex data, and for the read, write, and read-modify-write operations of the frame buffer.

All of that data moves compressed, from color and Z compression for the frame buffer, where the compression ratio gets higher when there's aa, to reading all textures and vertices. There's hierarchical-z, there's occlusion culling, z testing, clipping, etc, etc..

Modern PC’s destroy Gamecube and Wii. Not a lick of competition there, what-so-ever.

3ds_Mac

On May 11, 2008 at 9:26 pm

And it's hilarious that you use the sentence "honest look at Wii specs". You have no business using such a sentence, Wiiboy. Absolutely none.

The fact that you have no choice but add up cache bandwidths, and multiply by 6, and add this that and the other, proves two things.
You don’t know how things work. (As if we’re just now learning that, lol)
You don’t care how things work. As long as you can convince yourself of something, you can sleep at night.

Well, lets do some math, the “Wiiboy” way:
256 gbs + 32 gb bus x 4 for z compression + 22.4 x 6 for compression from main ram.
Now, technically, we would say the cpu takes up a small percentage of bandwidth from main ram, but let's say the cpu is generating things off of an algorithm, and locks them in L2 cache every frame to be read by the gpu, compressed using 3dc. 21 gb of bandwidth for the fsb to L2 cache, x (let's go with 2 for 3dc compression)… no, let's go with 6, since profx generates textures straight to dxt formats. And I'll ignore the fact that procedural synthesis is meant more for geometry than textures, but nonetheless we'll use it, since it technically could do this, and generate textures from a program rather than from ram.
Then lets toss in a texture cache bandwidth of, lets estimate it to be 128 gb per second, and considering it uses things like native 3dc, they can also be stored in caches, so x 4 for those. I won’t use s3tc’s max, since we’ll assume we’re using lots of normal mapping.
So:
256 between rops and edram.
128 (32×4 for the bus between main chip and edram module, where z applies to all z values in msaa)
134.4 (22.4 x 6 for main ram bandwidth)
126 (21×6 for procedural texture generation, that wouldn’t affect main ram bandwidth)
512 (128×4 for texture caches everywhere, with compression applied)

1,156.4 gigabytes of bandwidth.
How awesome is that?

Prove me wrong, Wiiboy.

3ds_Mac

On May 11, 2008 at 10:08 pm

The point here is, that when you look at a game like the Wii version of “skate”, it’s covered in low-res, ugly textures (especially by today’s standards).

They don't look low-res and ugly because developers are lazy. All those developers have accumulated a massive database of textures of all sorts: rock, concrete, bark, gravel, etc, etc.. All of which are highly detailed and high resolution. When they go mapping textures to polygon models while making a Wii game, they HAVE to lower the resolution, and not use multiple layers on them, simply because the Wii isn't capable of displaying them.
They get gimped, simply because the Wii can’t handle them. If Wii was a “480p 360″, the games would look like a 360 game in 480p. But it doesn’t. None of them do. They get cut back everywhere to make them run.
Sure, developers get lazy, and moving PS2 assets to Wii isn’t going to look good.
But when it comes to moving assets from other platforms to Wii, they’ll be at wherever Wii is capable. That just happens to be “Gamecube 1.5″ level.

You fanboys get mad that all developers don’t have the skills to hide how weak the Wii really is, and design textures and polygon models in a cartoony or abstract manner, to run on underpowered hardware and still look decent.

“Hey look, this pipe looks like a pentagon, because it’s “supposed” to look like a pentagon, to match all the other angular shapes in the world. If it were round, it would look out of place”

Or “hey, this character is running around in circles, and repeated patterns, with wooden animation, because he’s just an automated robot, and automated robots aren’t supposed to look like they have decent ai.”

Well, you should complain about that specifically, then.
But the fact that games ported down to Wii look like crap, is almost entirely the fault of the Wii hardware itself. You guys are angry because all developers don’t have “special ed” skills, for dealing effectively with the far less capable member of the console community.
And those that did are busy trying to get the most out of far newer, more complicated hardware. If they took them off the more complex games, there'd be no one to cover their position. So let the noobs deal with Wii.

That’s just the way it is, Wiiboy.
Acceptance is the first step to recovery.

darren-hack-uk

On May 12, 2008 at 8:38 am

x360 gpu buffer = 32gb read @ 720p/1080i-p

wii gpu cache/buffer = 28gb read plus compression @ 480p

clearly wii has faster edram and more bandwidth relative to resolution than x360

break down every number and wii = one third to one half of those numbers, but is locked @ 480p

do the math

3ds_Mac

On May 12, 2008 at 11:38 am

What math do you insist should be done?
You make 0 sense. You’re adding cache and frame buffer bandwidth together, and tossing in compression everywhere. And comparing that to the 35gb write bus on 360.
Wii has 11 gb of raw bandwidth at the most for frame buffer. There is no s3tc, no z, nor color compression applicable there; it's straight on the bus to the frame buffer, between rops and embedded ram.

For 360′s frame buffer, between rops and edram is 256 gbs of bandwidth. Those processors are built into the ram die itself. The 35 gbs of bandwidth is for moving data to rops. The 48 shader processors run shader programs, and texture processors process textures. They’re the ones using main ram. rops get data from the main chip.

PS3 (and Xbox1) uses color and z compression for frame buffer ops, between rops and gddr3, the same bus it uses for reading texture and vertex data. But compression makes up some, for low raw bandwidth there.

WIIBOY101

On May 12, 2008 at 12:25 pm

example of truths follow

480p x 3 = 720p. wii native 480p, x360 native 720p, 3x the native resolution

xbox 360 main ram bandwidth: 22gb. wii main ram bandwidth: 2x 4gb = 8gb total, before compression. oh look, 8 x 3 = 24gb bandwidth, and x360 has 22.4 or something

CLEARLY, TO ANYONE USING THEIR BRAINS, WII'S BANDWIDTH IS EQUAL TO OR BETTER THAN XBOX 360'S AT THE SUPPORTED RESOLUTION. 480P X 3 = 720P, 8GB X 3 = 24GB

DO THE MATH, JESUS, IT'S SO SIMPLE

XBOX 360 MAIN RAM: DRAM, 50 NS LATENCY, DRAM PERFORMANCE

WII: 5NS 1T-SRAM, AT LEAST 10 TIMES FASTER THAN X360

X360'S NATIVE RESOLUTION (720P) IS 3X WII'S, BUT THE X360 GPU HAS ONLY 2X THE CLOCK WHILE NEEDING TO SUPPORT 3X THE RESOLUTION

500MHZ FOR 720P, 243MHZ FOR 480P: WII'S CLOCK-SPEED-TO-SUPPORTED-RESOLUTION RATIO IS CLEARLY HIGHER

DO THE MATH: 480P IS ONE THIRD OF 720P, BUT WII'S CLOCK IS HALF THAT OF A 720P CONSOLE, THEREFORE IT'S CLOCK FOR CLOCK, PIXEL FOR PIXEL, HIGHER THAN X360

DO - THE - MATH
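For the record, this is the arithmetic the comment above is doing, reproduced in Python with its own figures; scaling raw main-memory bandwidth by resolution is a very narrow metric, and the surrounding replies explain what it leaves out.

wii_bw_gb, x360_bw_gb = 8.0, 22.4          # main ram bandwidth figures quoted in the comment
pixels_480p = 640 * 480
pixels_720p = 1280 * 720
print(pixels_720p / pixels_480p)           # 3.0  -> the "480p x 3 = 720p" ratio
print(wii_bw_gb * 3)                       # 24.0 -> the "8GB x 3 = 24GB" figure compared to 22.4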

WIIBOY101

On May 12, 2008 at 12:30 pm

IN PLAIN ENGLISH: WII PUSHES FURTHER AT 480P THAN X360 DOES AT 720P, WHICH PROVES WII'S HIGH-SPEED, HIGH-BANDWIDTH FILL RATE PERFORMANCE. IT'S CLEARLY AN INSANE AMOUNT OF BANDWIDTH AND FILL RATE @ 480P. ADD TO THAT HIGH-PERFORMANCE EFFECTS AND BLENDERS

I DON'T SEE THAT IN A LAZY PS2 PORT, SO HOW IS A LAZY PORT "WII GRAPHICS"? IT CLEARLY IS NOT

CONSPIRACY: 3RD PARTIES FEAR WII, FEAR NINTENDO, AND EA ARE MARKETING 3 CONSOLES, SO THE WII ISN'T MARKETED ON GRAPHICS. HE WAS CLEARLY TRYING TO DEFEND A TARTED-UP PS2 ENGINE ON WII

BY LYING ABOUT WII'S GRAPHICS

TRY THINKING IN AN ADULT MANNER, NOT A TEEN FAN MANNER

WIIBOY101

On May 12, 2008 at 12:37 pm

WII MAIN RAM

88MB AT 4GB X2

DATA COMPRESSION 4-TO-1 = 88MB X 4 = 352MB, OR 8GB X 4 = 32GB BANDWIDTH

WII MAIN RAM TEXTURES

88MB X 6 = 528MB, 8GB X 6 = 48GB BANDWIDTH

THESE TOTALS ARE AS IF ALL RAM WAS EITHER DATA OR TEXTURES. OBVIOUSLY IN REALITY IT'S BOTH; I'M JUST GIVING PEAK DATA IN RAM WITH EITHER PEAK DATA OR PEAK TEXTURE STORAGE

COMPARE ANY OF MY NUMBERS AND WII'S ARCHITECTURE TO PS2 ETC AND STATE IT AIN'T NEXT GEN

1+1=2, TRY BASIC MATHS :lol:

WIIBOY101

On May 12, 2008 at 12:49 pm

CPU SH-4 to 16 MB Main Memory: 64 bits x 100 MHz = 800 MB/s
PVR2DC to 8 MB Graphics Memory: 64 bits x 100 MHz = 800 MB/s
ARM7 Sound to Sound Memory: 16 bits x 66 MHz = 132 MB/s
Total: 1732 MB/s

THE ABOVE ARE DREAMCAST BANDWIDTHS. ADD COMPRESSION AND TAKE INTO ACCOUNT EFFICIENT TILE RENDERING

WII IS A CUSTOM TILE-RENDERING CONSOLE
BUT HAS
A SYSTEM BUS WITH 8GB BANDWIDTH VS DREAMCAST'S 800MB

4GB X2 RAM POOLS VS 800MB

A MASSIVE 16GB TEXTURE READ CACHE

MASSIVE CPU CACHES AND READ BANDWIDTH

AT THE "SAME"

WIIBOY101

On May 12, 2008 at 12:57 pm

RESOLUTION, 480I/P…

wii's system bus is easily a gamecube/xbox/ps2 system bus combined

ps2: 1gb-plus, but shared between cpu and gpu

xbox: not shared, 1gb

gamecube: not shared, 4-to-1 compression, over 5gb

wii: not shared, 4-to-1 compression, 8gb bandwidth

wii ain't next gen? yeah, right. lol, bye bye

3ds_Mac

On May 12, 2008 at 1:33 pm

Lol, same old, same old, Wiiboy.
360 is clocked twice as fast. Already 2x everything there.
Has twice the rops. now it’s 4x.
Can run z-rate at 2x speed, in a geometry pass. 8x z rate.
4 times the filtered texturing units (16), that function independent of shaders, also known as 8x the texturing rate, when combined with clock frequency.
Also has 16 additional point sampled texturing units, for things like hardware displacement mapping.
And, can technically do 4xmsaa over the top of max fill rate.

360 has 48 shader processors, compared to 5 total for Wii, (if we are to call the t&l unit and tev “shaders”), and again twice as fast.

shaders:
5 vs. 48 at twice the clock.

textures:
4 vs 16 at twice the clock. + 16 point sampled textures

These are how ratios work out.
Again, Wii: 2 texture layers, and 2 "shader ops", applied to 486 million pixels per second.

360 can do a z-pre-pass, two texture layers, and multiple shader instructions, all applied to 4 billion pixels per second, with 4xmsaa.

And you're applying compression for Wii, despite 360 and ps3 having superior compression ratios at everything, from geometry to textures of all types.
Nice.
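A quick check, in Python, of the fill-rate figures used in this exchange: pixels per second is roughly ROPs times clock, and on Wii it halves for each extra texture layer. The 243 MHz Wii clock is from the thread; the 500 MHz figure for 360 is an assumption consistent with the "twice as fast" wording.

wii_raw  = 4 * 243e6       # 4 ROPs at 243 MHz -> 972 million pixels/s raw
wii_2tex = wii_raw / 2     # two texture layers -> 486 million, the figure quoted above
x360_raw = 8 * 500e6       # 8 ROPs at an assumed 500 MHz -> 4 billion pixels/s
print(wii_raw, wii_2tex, x360_raw)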

3ds_Mac

On May 12, 2008 at 1:49 pm

And neither Gamecube, nor the Wii, uses a tile based renderer. It was designed to use early z-checks.
Similar to this.
http://www.gamasutra.com/features/19991109/moller_haines_02.htm
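A minimal sketch of the early z-check idea being referenced here, in Python and purely illustrative (not the actual hardware algorithm): a pixel that is already covered by something nearer is rejected before any expensive shading is done for it.

def draw_pixel(zbuf, fbuf, x, y, depth, shade):
    # Reject early if an equal-or-nearer depth is already stored at (x, y).
    if depth >= zbuf[y][x]:
        return
    zbuf[y][x] = depth
    fbuf[y][x] = shade()   # shading only runs for pixels that survive the z-check

w, h = 4, 4
zbuf = [[float("inf")] * w for _ in range(h)]
fbuf = [[0] * w for _ in range(h)]
draw_pixel(zbuf, fbuf, 1, 1, 0.5, lambda: 255)
draw_pixel(zbuf, fbuf, 1, 1, 0.9, lambda: 128)   # farther away: rejected, no shading done
print(fbuf[1][1])   # 255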

Just because it uses buzzwords like “tileinto”, doesn’t make it a Dreamcast renderer.
Small snippets, since I know how much you hate reading articles:
“When recursion finishes, all visible primitives have been tiled into the Z-pyramid, and a standard Z-buffer image of the scene has been created. For example, in the scene pictured in Figure 3, more than 99% of on-screen polygons are inside occluded octree nodes, which are therefore culled by the Z pyramid.”

lennell

On May 13, 2008 at 2:47 am

Look, the wii is next-gen because it's more powerful than all the last-gen consoles, but it's only 50% more powerful than gamecube, and 25% more powerful than xbox. But 3ds_Mac is right, the xbox 360 is clocked twice as fast as the wii, and the 360 also has three cores, all running at 3.2ghz. You would have to push the wii to its max power to match one of the xbox 360's cores.

3ds_Mac

On May 13, 2008 at 7:46 am

Depends on how you characterize power. It's clocked 50% faster than Gamecube, so that right there would give it a 50% boost in "overall" processing performance.

But it also got a few times the ram and more bandwidth, and hopefully fixed one or two bottlenecks developers ran into. (Although not many, considering I know of a couple of developers who said it still has a problem or two the Gamecube had, that they still have to work around by making choices about how to use a particular resource.) They did test the cpu, and figure they could get about 2x the performance out of it.

old-big-head

On May 17, 2008 at 1:27 pm

the conduit wii in-game beta trailer
clearly wii rivals ps3 and x360 minus the HD resolutions

add the fact that fps controls with wiimote and nunchuck are tenfold an xbox 360's

i'm not going to choose slightly better graphics over VASTLY BETTER GAMEPLAY

wiimote and nunchuck are next gen; a mere resolution upgrade is next to nothing

great graphics at 480p with wiimote and nunchuck, or great graphics at 720p with the same old gameplay/controls

i know what i'm choosing, and wii is going to leave the x360 userbase far behind this year, far far far behind

old-big-head

On May 17, 2008 at 1:32 pm

usa april hardware sales

GTA 4 HAS NO IMPACT AT ALL ON WII DOMINATION
Wii – 714.2K
DS – 414.8K
PSP – 192.7K
Xbox 360 – 188K
PS3 – 187.1K
PS2 – 124.4K

IT'S CLEAR WE ARE HEADING TO A SINGLE-FORMAT FUTURE. THANK YOU, NINTENDO, FOR SAVING GAMING FROM IDIOTS LIKE BILL GATES

old-big-head

On May 17, 2008 at 1:33 pm

PS2 IS SELLING AT A RATE THAT MATCHES PS3 AND X360, PROVING THEY'RE DEAD CONSOLES. ONLY WII AND DS CAN BE TAKEN SERIOUSLY

3ds_mac

On May 17, 2008 at 11:13 pm

Wii is worthless to own on its own. The only reason it’s considered a decent console, is because people can get their typical gaming on 360, PS3, or PC, and secondary stuff on Wii.
If the other two repackaged their hardware, added gimmicky controllers, and sold their consoles for around 250, I and I’m sure most others would have been finished with console gaming.

It’s not pulling much software or gamers away from the main two. And there isn’t a programmer out there that would find anything interesting about the Wii. If they tried shoehorning games like GTA4 or Metal Gear Solid 4 onto Wii hardware, fans of the genres would have lost interest.

If the Nintendo fans and non-gamers are happy with just that, that’s fine. Even I own one, but I also recognize it as a mediocre piece of hardware, that I won’t buy anything but Nintendo first party stuff for. I’ll get all the next-gen games on the other two. That’s part of the profile for people that own the Wii.

Non-gamers, that bought it out of fad because they saw it on Oprah, and are now pretty much done with it.

Nintendo hardcore fans, who have pretty much the same number and selection of games as they did last gen, despite Nintendo's success.

And people who own a next-gen console besides. They, like me, will get next-gen gaming on something else. Software developers are well aware of this, and part of the reason why they don’t bother.

And that mediocre tech demo doesn’t show anything, but the fact that Wii can do effects from around the year 2000ish. There’s not much impressive about it, aside from the fact that its running on Wii hardware. You can be gullible and believe what you think though. Not even the makers of that demo think the way you do however.

lennell

On May 19, 2008 at 1:43 am

the conduit looks good on wii, it might pull the wii up by about 50 to 60 percent. most people will get it for the controls if they're like metroid prime 3 or medal of honor heroes 2, but like 3ds_mac said, people will get their typical hi-def games on 360 and ps3, games like bioshock, gears of war 2, ninja gaiden 2, metal gear solid 4, games that pull people to those consoles… and 3ds_mac is right, there's no programmer that would be interested in putting GTA4 on wii hardware. GTA4 was not easy to make on 360 or ps3, so there's no way it can be done on wii, the ram is too small.

WIIBOY101

On July 16, 2008 at 3:41 pm

CAN THE XBOX IDIOTS EXPLAIN COD 5 WII WORLD AT WAR GRAPHICS PLEASE

CAN XBOX FANS EXPLAIN CONDUIT WII EXCLUSIVE PLEASE

CAN XBOX 360 FANS EXPLAIN AWAY DEAD RISING WII PLEASE

CAN XBOX FANS EXPLAIN AWAY DEADLY CREATURES E3 2008 PLEASE

CAN XBOX FANS EXPLAIN FATAL FRAMES GRAPHICS PLEASE

WII= X360-HD

AS NINTENDO CLEARLY STATED

COD 5 WII IS 99% X360-PERFECT VISUALLY, AND HAS VASTLY BETTER CONTROLS AND ADDED ARCADE MODES

EVERY SINGLE XBOX IDIOT ON THIS THREAD IS GUILTY OF BELIEVING A POLITICAL STATEMENT, NOT A SPEC STATEMENT, BY EA

YOU'RE ALL PROVEN IDIOTS

EXPLAIN THE COD 5 WII TRAILER: IT MATCHES X360 IN GRAPHICS AND KILLS X360 IN GAMEPLAY

I WARNED YOU ALL THAT E3 WOULD MAKE YOU LOOK STUPID

WIIBOY101

On July 16, 2008 at 3:52 pm

EXPLAIN PLEASE
http://www.gametrailers.com/player/36536.html COD 5 WII LOOKS NEAR X360 PERFECT

http://www.gametrailers.com/player/36621.html EXPLAIN XBOX FANTASISTS

http://www.gametrailers.com/player/36548.html HMMMMMMMMMMMMMMM XBOX FANS

http://www.gametrailers.com/player/36424.html stunning

WIIBOY101

On July 16, 2008 at 3:54 pm

xbox 1: 133mhz, 1gb bus

wii: 243mhz plus peak 4-to-1 compression, 7.7gb bus

wii gpu has 4x-plus the effective in-game bandwidth of the xgpu

wii cpu has 4 to 8 times the processing bandwidth of the xcpu

wii supports twice the hardware lighting and twice the stages and layers of textures of xbox 1

fanboys now looking stupid, AS PRE-WARNED

WIIBOY101

On July 16, 2008 at 3:56 pm

WII, THE ONLY NEXT-GEN SYSTEM, JUST WENT AHEAD ANOTHER LEVEL

WII MOTION PLUS IS NOW 2 GENS AHEAD OF XBOX 360'S CONTROLS

WII SHOWS HOW VOICE CHAT IS DONE THE NON-GIMPY WAY. EARPIECE = TELESALES EMPLOYEE

WII SPEAK = SIMPLE YET GENIUS. EARPIECES ARE FOR TARDS

WIIBOY101

On July 16, 2008 at 3:59 pm

XBOX 360 NOW IN 3RD PLACE SALES WISE

WII BY FAR IN THE LEAD

E3 2008 MICROSOFT TRIED TO BE NINTENDO HMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM NEVER

WIIBOY101

On July 16, 2008 at 4:00 pm

WII = SURROUND SOUND, WIIMOTE SPEAKER SOUNDS, AND A CHAT POD THAT ISN'T IN THE WAY

XBOX 360 = AN EARPIECE THAT'S IN YOUR WAY AND OVERLAYS THE SURROUND SOUND. HOW STUPID. EARPIECES ARE FOR OFFICE USE, NOT GAMES. TYPICAL RETARD MICRO FANS

WIIBOY101

On July 16, 2008 at 4:02 pm

COD 4 ENGINE FULLY RUNNING ON WII PLEASE EXPLAIN XBOX TARDS

WIIBOY101

On July 16, 2008 at 4:03 pm

QUANTUM 3 ENGINE: A 16-STAGE, REAL-TIME PROGRAMMABLE SHADER ENGINE. PLEASE EXPLAIN, XBOX FANS

WIIBOY101

On July 16, 2008 at 4:03 pm

XBOX 1 COULDN'T SUPPORT THE COD 2 ENGINE, YET WII SUPPORTS THE COD 4 ENGINE

PLEASE EXPLAIN XBOX FREAKS

spaceguy

On July 16, 2008 at 6:34 pm

Wiitard, I watched that COD5 video of it running on the wii, and honestly it looks like it could be done on the original xbox. It's not nearly, and I'll repeat that, NEARLY, as close as COD4 or 5 on the 360. And the reason the COD4 engine can run on the wii is because on the PC (I'm a PC gamer born and raised) there's a setting where you can change the DirectX version the game runs at from 9 to 7. This was added so people that had outdated hardware that couldn't support DirectX 9 could still play the game. Your comparison of hardware is retarded. It's like saying my old computer with a P4 3.0 GHz processor and an Nvidia 6800GTX with 1 GB of ram ran better than my new computer with a dual-core processor overclocked to 3.5 GHz and an overclocked Nvidia 8800GT with 4 GB of ram, because the old one ran HL2 at medium settings at 70 frames per second while the new one ran Crysis at Very High at 40 frames per second. It just doesn't compare.

@3ds_mac you might as well stop, this guy isn't ever going to give up the delusion that the Wii is as good as a PS3/360

WIIBOY101

On July 18, 2008 at 7:13 pm

DO I PLAY WITH AN OBSOLETE CONTROL PAD, OR DO I PLAY WITH WIIMOTE AND NUNCHUCK, AND NOW WII MOTION PLUS?

ANYTHING LESS IS LAST GEN

DEADLY CREATURES: VASTLY SUPERIOR TO ANY XBOX 1 GAME

CONDUIT: VASTLY SUPERIOR TO ANY XBOX 1 GAME

COD 5 WII: 60 FRAMES, NEAR-X360 GRAPHICS @ 480P

666,000 JUNE WII SALES VS 200,000 OR THEREABOUTS FOR X360 IN THE USA. SO WHO'S THE DADDY, WHO'S THE NUMBER ONE USERBASE? OH, IT'S WII, GLOBALLY AND IN THE USA

X360 IS NOW IN 3RD PLACE SALES-WISE

2 GENERATIONS BEHIND IN CONTROLS NOW

AND ONLY SLIGHTLY AHEAD IN GRAPHICS. THERE'S NO POINT TO X360'S EXISTENCE WHATSOEVER

"DEAD RISING CAN'T RUN ON WII" - FUNNY, IT'S WII-CONFIRMED AND HAS A BETTER CAMERA, BETTER CONTROLS, BETTER GAMEPLAY, NEW EXTRAS, AND NEAR-X360 GRAPHICS

BYE BYE, I TOLD YOU SO

WIIBOY101

On July 18, 2008 at 7:15 pm

LOST PLANET WOULD RUN ON WII, CAPCOM STATED. I REST MY CASE

WIIBOY101

On July 18, 2008 at 7:16 pm

DEAD RISING WII RUNS ON AN OPTIMIZED RES EVIL 4 ENGINE THAT DOES EVERYTHING THEIR PS3/X360 ENGINE CAN DO, MINUS HD

TOLD YOU SO

WIIBOY101

On July 19, 2008 at 10:25 am

xbox 1: 133mhz bus = 1gb BANDWIDTH

400mhz pc fsb = 3.2gb

wii: 243mhz fsb, 4-to-1 peak compression = 7.7gb

wii: 243mhz fsb, 2-to-1 secondary compression ratio = 3.8gb

DATA FLOW OVER WII'S FSB IS A MIX OF 4-TO-1 COMPRESSED, 2-TO-1 COMPRESSED, AND NON-COMPRESSED DATA
EXAMPLE: 40% AT 4-TO-1, 40% AT 2-TO-1, 20% WITH NO COMPRESSION

EFFECTIVE FSB BANDWIDTHS EQUIVALENT TO 500MHZ ARE EASILY OBTAINABLE

133MHZ EFFECTIVE FOR XBOX 1 VS 243MHZ X2 TO X4: NO CONTEST. WII EASILY HAS A 500MHZ BUS IN EFFECTIVE THROUGHPUT BANDWIDTH, PLUS LOW-LATENCY RAM

XBOX 1 CPU: OFF-THE-SHELF CELERON, 9 MILLION TRANSISTORS, ALUMINIUM, CISC, 128K LEVEL 2 CACHE

WII: A CUSTOM, MICRO-EMBEDDED, ULTRA-EFFICIENT, COPPER-WIRE, SILICON-ON-INSULATOR, RISC, 3D-GAMING-OPTIMIZED CPU WITH OVER 20 MILLION TRANSISTORS AND A 256K L2 CACHE, PLUS PEAK 4-TO-1 COMPRESSION

DEAD RISING = X360, NOT XBOX 1. DEAD RISING PORT = WII

YOU CAN DREAM ON. YOU'RE JUST COMPARING WII TO XBOX BASED ON CLOCK SPEED, NOT ACTUAL ABILITY

THE FACT THAT WII, AT XBOX 1 CLOCK SPEEDS, IS MORE LIKE X360 THAN IT IS XBOX 1
HAS GOT YOU ALL WOUND UP

GALAXY, PRIME 3, COD5, CONDUIT WOULD NEVER EVER RUN ON XBOX 1

OWNED

3ds_mac

On July 19, 2008 at 3:09 pm

Lol, once again dude, you're recycling specs bull that's been shot to pieces, and you aren't capable of debating any of it.
I’ve already addressed your transistor bull, your bandwidth bull, your compression bull, your bus bandwidth bull, etc, etc, etc…I even provided you with a link, where you can read for yourself what the transistor count is on the Xbox and Gamecube cpus. You can see the amount for logic, and the amount for cache. It’s not 9 million vs. 20 million. It’s 28 million vs 20 million. Only ~20 million used on Xbox, because it’s a gimped P3. More for logic, less for cache. It’s all up there Wiitard. Xbox is a Coppermine, not a Katmai.

And you still continue to imply that folks are comparing generic clock frequencies? Who’s doing that? Who in this thread is claiming Xbox is overall more powerful than Wii?
The answer is, no one. You just post that schlock, because you’re getting flashbacks of you’re fanboy debates with Xbot tards who base their arguments on the same know-nothing bull that you do. It doesn’t work here Wiiboy. You need to take that back to the school lunch room where it belongs.

You’ve been proven to not have a clue on what you speak. Every little detail you just cut and pasted for the dozenth time, has been completely dismantled, and the fact that you think reposting it changes that, or resets your argument proves you are literally impaired in some way. I don’t even mean that as an insult.
It does provide some insight into folks with psychological problems however, and I’m sure there are plenty of psychologists who’d love to study your case.

Fanboyism = not always a simple act of childish brand loyalty, it seems to sometimes be a sign of serious psychological problems.

As to the games you listed, I’ve played plenty of Dead Rising in standard def, and the Wii version doesn’t come close to it. You need to look at the gameplay shots, and not the video footage of cutscenes. If you did, you’d notice no lighting, you’d notice shadows aren’t everywhere, you’d notice the drop in geometry in the surroundings and overall detail everywhere, you’d notice the low resolution textures, you’d notice that they’re promotional shots, and yet have a very limited number of characters on screen at once, and despite that limited scope, it looks like a PS2 game.
You’ve obviously never played very far into the original game. And you obviously didn’t read the developer interview, where he mentions Wii owners are more forgiving when it comes to graphics, and a whole line of bs justification for gimping the visuals.

And the developers of COD5 have explained that they would try to keep the “experience” of the game intact on the Wii. (that’s experience, not graphics)
They also said the obvious: that they wouldn't be able to have the same number of characters on screen, or the same real-time physics or lighting running, or the variety of textures, or the resolution of those textures, etc, etc, etc., ad nauseam.

Conduit looks poor for a "next-gen" game, but good for Wii. Nothing more than that. And Fatal Frame would look far better on PS3 or 360, and still have a far greater scope. Wii can pull decent-looking screenshots of a small room with nothing going on, just like an Xbox or PS2 could. Plenty of examples of that.

3ds_mac

On July 19, 2008 at 4:15 pm

@spaceguy – yeah, I know he’s delusional as all hell. I honestly think there’s something to this fanboyism in people like him and a legit mental condition. He’s not the only one who acts like this either. You run into folks like this all the time. And there’s obviously more to it with them, than just brand preference.

He seems to understand at least a fraction of what he types, or copies and pastes, but it's as if he's incapable of coping with the idea that his favorite system is what it is. He's latched onto his bull numbers, and doesn't seem able to update any of it.
If I thought something, and someone explained in detail why it wasn’t so, or why the thinking is off, I and most other people are capable of updating our understanding. But he doesn’t seem to be able to do that. Or refuses to accept it. He even continues reposting the same thing that’s been explained in detail, so that even someone who doesn’t care about the numbers related to it, would get it.

I used to just find it funny, but it actually provides insight to other avenues of society.

And it reminds me a whole hell of a lot of the conversation about Brawndo not being good for plants, in the movie Idiocracy. Awesome movie btw.

lennell

On July 21, 2008 at 6:51 am

Look, wiiboy, we all know that the wii is more powerful than xbox 1, but like 3ds_mac said, cod 5 on wii will not look like the xbox 360's cod 5 graphics, it will look more like a slightly better xbox 1 version. So stop rewriting the same specs and bandwidth bull.

WIIBOY101

On July 21, 2008 at 11:48 am

ROCKSTAR'S X360 TABLE TENNIS: OH LOOK, WII GOT A PORT WITH NEAR-IDENTICAL GRAPHICS, AND IT WAS A QUICK PORT, NOT A FULL-ON BUILD

DEAD RISING: OH LOOK, ANOTHER X360 GAME HEADING TO WII, NOT PS3….. BUT "WII HASN'T GOT THE POWER TO RUN THESE GAMES". FUNNY, IN REALITY, WHERE WII OWNERS LIVE, IT'S HAPPENING. IT'S ONLY IN XBOX FANLAND, WHERE FANTASY OUTDOES REALITY, THAT IT CAN'T HAPPEN AND WII CAN'T DO IT. HMMMM….

NOW FOR THE KILLER BLOW (I DID WARN YOU, DIDN'T I): IF THE EA SPOKESPERSON GUY, YOU REMEMBER, FROM THE ARTICLE THIS THREAD WAS STARTED OVER,

WAS TELLING THE TRUTH ABOUT WII POWER, THEN PLEASE, PLEASE, PLEASE EXPLAIN HOW

GALAXY, PRIME 3, CONDUIT, COD 5, FATAL FRAME, TABLE TENNIS, DEAD RISING, 2K SPORTS ICE HOCKEY (NHL WII), AND MANY MANY MORE TO COME

ALL HAVE GRAPHICS THAT COMPLETELY DESTROY THE WII VERSION OF FIFA 08, SOMETHING THE EA GUY SAID COULDN'T BE DONE

YOU ALL AGREED WITH A LIAR. THAT LIAR IS NOW EXPOSED AS SUCH, AND ALL THE ANTI-WII XBOX FANS HAVE BEEN PROVEN IDIOTS

AND YET YOU'RE STILL SWIMMING AGAINST THE TIDE IN THE RIVER NAMED "DENIAL"

IF FIFA GAMECUBE HAD BETTER GRAPHICS THAN FIFA XBOX AND FIFA WII, AND WII IS NOW STARTING TO GET THE GRAPHICS IT CAN CLEARLY DO,

EXPLAIN HOW THE EA GUY WASN'T LYING, BECAUSE THERE ARE MORE AND MORE GAMES COMING, AND ALREADY RELEASED, THAT ARE CLEARLY ALL OVER FIFA 08 WII

YOU HAVE ALL AGREED WITH A LIAR THAT A HIGH-RES PS2 ENGINE, FIFA 08, IS WII GRAPHICS

BUT ALL THE GAMES I'VE MENTIONED, AND THE FACT THAT I OWN A COPY OF FIFA ON THE GAMECUBE THAT'S BETTER LOOKING THAN THE RUSHED WII VERSION,

100% CONFIRM THAT XBOX FANS ARE LIARS, AND SO ARE EA

PIMPED, OWNED, COMPLETELY DEBUNKED

I DID WARN YOU :lol: :lol:

Davn Kincade

On July 21, 2008 at 12:29 pm

@WiiBoy101

You alright guy? need a glass of water or something?

Seriously, you take this stuff -way- too seriously… you need to go find your happy place and come to the understanding that this argument you seem to be having with these good people is pointless banter that solves nothing.

You like the Wii, right then…we understand…but seriously man you need to chill out before you have a stroke or something.

3ds_mac

On July 21, 2008 at 1:54 pm

I would guess his caps-locking is a sign of emotional distress, and he probably thinks it makes him more "right". He most likely speaks that way as well. The funny thing is, he's probably got his friends thinking he's knowledgeable, and acts condescending and arrogant toward them.

And Wiiboy – I’ll help you analyze your Dead Rising Wii screenshots, since this seems to be your new postergame.

http://www.capcom.co.jp/deadrising_wii/img/ss1.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss2.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss3.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss4.jpg

All part of an action cut-scene, likely not running real-time and swiped from the 360 version, most likely in fmv.
How can you tell? The lack of jaggies would be a clue. As would the clarity and detail. But also, see the shadows the zombies cast? You don't find anything but the generic Mario 64-like circle shadows under anyone's feet in any gameplay footage on Wii. (If even those.)

http://www.capcom.co.jp/deadrising_wii/img/ss6.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss8.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss9.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss10.jpg

All part of cinematic cut-scenes that progress the story. Nothing real-time. Simple fmv.

http://www.capcom.co.jp/deadrising_wii/img/ss13.jpg
Cg, nuf said.

http://www.capcom.co.jp/deadrising_wii/img/ss5.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss11.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss12.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss14.jpg
http://www.capcom.co.jp/deadrising_wii/img/ss15.jpg

These are the real-deal. How can you tell?
Because they look like asss, that’s why. Full of jaggies, lacking lighting of any sort, you can see the circular shadows under folk’s feet, low detail in pretty much everything, lower resolution textures, and most if not all the subtle detail on anything is gone. As well as a likely very low variety in enemy types.

And the only thing that was impressive at all on the 360 version to begin with was the number of characters on screen, but they only show small herds of them on Wii. The rest of the engine looked very basic.
A sequel on 360 or PS3 would look better than the action cut-scenes on Wii, but in potentially high-def.

And your COD5 “Xbox can’t run that engine”. Neither can Wii. Porting an engine is easy, you gimp everything the system doesn’t support, and presto, the system runs COD4 engine. You obviously don’t know how engines work, or what they are.
Any modern engine can be ported to Xbox 1, just like it could to the Wii, or any other system. Just depends on if it’s worth doing.
Unreal Engine 3 has been ported to cell phones, for example.

Aristotle

On July 21, 2008 at 6:46 pm

Dude. Wiiboy…. this news is over a year old…. no one really cares anymore. (PS2 still rocks the world.)

3ds_mac

On July 21, 2008 at 8:55 pm

Those Dead Rising Wii pics, are straight from Capcom Jp.
And it may very well be on the RE4 engine, but that wouldn’t make it look like RE4. Just the fact that you can potentially throw 100 characters on screen, means they’d have to be more conservative with what the game looks like with one or two, because each character brings more polygons to be lit, animated, etc..

And the only point anyone else has, is that Wii doesn’t equal a 480p 360 or PS3. EA may very well be able to do a better job with their games on Wii, but they could potentially do a better job with any game. They avoid 360 strengths that would improve their engine on 360, because PS3 couldn’t run it. And they most likely don’t take full advantage of PS3 because of 360.

The Wii doesn’t hold some magical rendering abilities. Its entire pipeline is known. It simply isn’t capable of what you’re implying.
360 and PS3’s gpus aren’t just clocked twice as fast. They have several times everything, and are far more advanced and flexible.

****IF WII GAMES SUPPORTED TRUE WII GRAPHICS THOSE GRAPHICS WOULD BE SO CLOSE TO X360/PS3 IN REALTIME ON A TV SCREEN THAT WII WOULD EFFECTIVELY PROVE PS3 AND X360 EXISTENCE IS POINTLESS****

They would look like some of the most underwhelming 480p PS3 and 360 games ever made, (The Conduit) on a technical level, and would only have a shot of being competitive in graphics, through art design.
Art design that could just as easily go into 360 or PS3 games as well.

Nothing wrong with that, as Nintendo isn’t focused on graphics this generation, maybe next gen they’ll wake up.

someperson

On July 21, 2008 at 10:46 pm

Wow, cuteboy that was a lot of bs to spill in one post

Aristotle

On July 22, 2008 at 8:15 am

Oh, by the way everyone… the wii? It uses a laptop-grade video processor by ATI.

ATI, that company that isn’t doing too well on the mobile market (which is dominated by Nvidia).

wiiboy101-2008

On July 22, 2008 at 11:08 am

the fact my posts confirming wii fifa runs on a ps2 engine, not a wii engine, were removed CONFIRMS THOSE RUNNING THIS SITE ARE XBOX FANBOYS WITH BIASED VIEWS

DOES WII FIFA RUN ON A WII OR A PS2 ENGINE? PS2. YOU KNOW IT, I KNOW IT, EVERYONE KNOWS IT, BUT YOU REMOVE IT BECAUSE IT'S THE TRUTH AND YOU DON'T LIKE THE TRUTH, YOU ONLY WANT TO EXIST IN A FANBOY BUBBLE

THE FACT IT WAS REMOVED CONFIRMS YOU'RE LIARS AND I WAS 100% ON THE MONEY

OUCH, DOES THE "TRUTH HURT"

THERE'S ONLY ONE SINGLE REASON MY POSTS WERE REMOVED: THEY CONTAINED EVIDENCE THAT DEBUNKED YOUR VIEW AND THE EA GUY'S LYING

SO YOU REMOVED IT

AGAIN, FOR THE BENEFIT OF HONESTY OVER XBOX FANBOY BULL

FIFA 08/09 PS3 X360: NEW ENGINE

PS2 FIFA: LONG-RUNNING, SLIGHTLY-TWEAKED-EACH-YEAR PS2 ENGINE (ALSO PORTED TO XBOX AND GAMECUBE, BUT GAMECUBE WAS THE FRESHEST LOOKING AND MOST STABLE FRAMERATE). DON'T BELIEVE ME? READ REVIEWS AND FORUMS ON FIFA PS2 VS XBOX VS CUBE (SO HOW COME FIFA WII LOOKS LOWER THAN THE CUBE VERSION)

WII FIFA 08/09: AGAIN, RUNS ON A GIMPED PS2 ENGINE

HOW DOES EA'S FIFA 08/09 REPRESENT WII GRAPHICS WHEN IT'S RUNNING ON A PS2 ENGINE

THERE IS NO WII ENGINE IN EXISTENCE AT EA

COMPLETELY SHATTERING YOUR FUKING ILLUSIONS

:razz:

wiiboy101-2008

On July 22, 2008 at 11:28 am

REMOVING POSTS CONFIRMS I WON. ADMIT DEFEAT, IT'S THE GROWN-UP THING TO DO. CENSORING THE SITE'S POSTS CLEARLY INDICATES BIASED SITE-RUNNING AND XBOX FANBOYING

THE WII VERSION OF FIFA, AND ALL EA SPORTS GAMES FOR THAT MATTER,

DOES NOT, REPEAT, DOES NOT RUN ON WII ENGINES. EA IS PLAYING A DUAL-MARKET TRICK: MARKET GRAPHICAL LOOKS THROUGH HD HYPE, EVEN THOUGH HD IS RESOLUTION AND NOT GRAPHICS,

AND MARKET FAMILY FUN THROUGH WII

IT'S CALLED MARKETING LIES, AND WII'S POWER HAS NOTHING TO DO WITH IT

THE TRUTH, THE WHOLE TRUTH, AND NOTHING BUT THE TRUTH

XBOX LIARS

BROADWAY CPU HAS MORE TRANSISTORS THAN XBOX 1'S CELERON AND PS2'S ENTIRE EMOTION ENGINE PUT TOGETHER

EE, PS2'S CPU AND CO-PROCESSOR: 12 MILLION TRANSISTORS (IN-ORDER, BASIC RISC CPU)

CELERON XCPU: 9 MILLION TRANSISTORS, BASIC INTEL CISC CPU

BROADWAY: OVER 20 MILLION TRANSISTORS PACKED INTO A TINY SPACE, 1/4 THE SIZE IF NOT SMALLER THAN THE XBOX CELERON. TIGHTLY PACKED MICRO-EMBEDDED TRANSISTORS PERFORM VASTLY BETTER THAN STANDARD DESIGNS LIKE CELERON

EE, PS2'S IN-ORDER RISC NUMBER CRUNCHER: NO TRUE CPU OR OUT-OF-ORDER EXECUTION ABILITIES, TINY PATHETIC CACHE MEMORY, OLD, LARGE, INEFFICIENT DESIGN, ALUMINIUM INTERCONNECTS, DAMN HARD TO CODE FOR, AND RELIES ON THE SLOWEST RAM POSSIBLE IN A CONSOLE, RAMBUS RAM

XBOX CELERON: OFF-THE-SHELF STANDARD INTEL CELERON, A SPREADSHEET NON-3D CPU, 9 MILLION TRANSISTORS, NO CUSTOM UPGRADES, NO CUSTOM COMPRESSION, OLD-ASS CISC DESIGN

WII BROADWAY: OVER 20 MILLION TRANSISTORS, COPPER WIRE AND SILICON-ON-INSULATOR TECH, TINY EMBEDDED SMALL-DIE DESIGN, CUSTOM COMPRESSION/DECOMPRESSION, 50 NEW GAME-CENTRIC INSTRUCTIONS, CUSTOM GAMING-OPTIMIZED 2-WAY SIMD VECTOR ENGINE, HUGE CACHE MEMORY IN COMPARISON TO PS2, XBOX AND PS3

YET, AS IF BY MAGIC, XBOX 1 IS MORE POWERFUL

PENTIUM 3S RAN ON 100MHZ BUSES, THEN UP TO 133MHZ BUSES. AN OVERCLOCKED OR PUMPED 400MHZ FSB, LIKE MORE MODERN PCS, OFFERS A 46% INCREASE IN A PENTIUM 3'S POWER. SO IF A PENTIUM 3 RAN ON A 400MHZ BUS INSTEAD OF 133MHZ, JUST THAT UPGRADE ALONE WOULD MAKE A PENTIUM 3 46% MORE POWERFUL. LOOK IT UP, IT'S A FACT

RIGHT, BACK TO BROADWAY. THE CPU IS RISC, IT'S COPPER WIRE, IT'S SILICON ON INSULATOR, IT'S A MICRO-EFFICIENT DESIGN, IT'S GAMING-UPGRADED, IT'S GOT CUSTOM COMPRESSION, IT HAS TWICE THE CACHE MEMORY PLUS THE COMPRESSION TRICK, IT FEEDS OFF LIGHTNING-FAST 1T-SRAM, AND IT HAS OVER TWICE THE TRANSISTOR COUNT

AND AS I WAS SAYING, A 400 VS 133 FRONT SIDE BUS ON A P3 = 46% MORE POWER

WELL, WII HAS A 243MHZ BUS PLUS CUSTOM COMPRESSION, SO AN EFFECTIVE BUS SPEED OF 500MHZ PLUS IS EASILY WITHIN REACH ON BROADWAY'S FSB

SO IF BROADWAY IN ITSELF IS 2X-PLUS AN XBOX 1 CPU, AND ON TOP OF THAT IT HAS A VASTLY SUPERIOR FRONT SIDE BUS SPEED AND BANDWIDTH, AND ON TOP OF THAT THE WORLD'S FASTEST RAM, 1T-SRAM,

HOW THE FUK IS XBOX = TO IT OR BETTER THAN IT

IT'S IMPOSSIBLE. COMPUTING FACT

wiiboy101-2008

On July 22, 2008 at 11:36 am

DRAM = 50 TO 100 NANOSECONDS. IN COMPUTER TERMS, THAT'S A SNAIL

1T-SRAM = 5 NANOSECONDS. THAT'S TWICE AS FAST AS MOST LEVEL 3 CACHE SPEEDS (10 TO 12 NANOSECONDS). IF 1T-SRAM IS FEEDING DATA TO A PROCESSOR 10 TIMES FASTER THAN DRAM,

THEN IT'S CLEARLY GOT A 10-FOLD BETTER CHANCE OF DEALING WITH THAT DATA, AKA IT GETS IT AND PROCESSES IT 10 TIMES FASTER THAN XBOX DRAM

THAT 1T-SRAM IN ITS OWN RIGHT BOOSTS GPU AND CPU PERFORMANCE BY HELL LOADS

BECAUSE DATA THAT ISN'T THERE CAN NOT, REPEAT, CAN NOT BEEEEEEEEEE PROCESSED

ANOTHER MASSIVE DESIGN ADVANTAGE YOU'RE TOTALLY IGNORING, AS IF IT ISN'T THERE

THAT'S BIASED FANBOYING, CLEARLY

:razz:

TRUTH CAN BE NASTY AT TIMES

THE CUBEBOY101 AGAIN WINS THE ONLINE ARGUMENT. AGAIN, FOOLS
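A rough back-of-envelope conversion of those latency figures into clock cycles, using the numbers quoted above (the 5 ns and 50-100 ns values come from the comment, not from datasheets), sketched in Python:

# Convert memory latency in nanoseconds into cycles at a given clock.
# Latency figures are the ones quoted in the comment above (assumptions).
def cycles_of_latency(latency_ns, clock_mhz):
    return latency_ns * clock_mhz / 1000.0   # ns * MHz / 1000 = cycles

for name, ns in [("1T-SRAM", 5), ("DRAM, best case", 50), ("DRAM, worst case", 100)]:
    print(name, round(cycles_of_latency(ns, 243), 1), "cycles at 243 MHz")
# ~1.2 vs ~12-24 cycles. As pointed out further down the thread, that
# mainly improves bus efficiency; it does not multiply peak bandwidth.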

wiiboy101-2008

On July 22, 2008 at 11:40 am

GEKKO, GAMECUBE'S CPU, HAS 2.5X THE INTERNAL BANDWIDTH OF THE 733MHZ CELERON FOUND IN XBOX 1

WII HAS 4X THE PROCESSING AND DATA-SHIFTING BANDWIDTH OF THE CELERON IN XBOX 1

HOW IS 4X-PLUS EQUAL TO OR LOWER THAN XBOX

BASIC MATH CONFIRMS THIS SITE, THE XBOX FANS AND THE EA GUY ARE LIARS

wiiboy101-2008

On July 22, 2008 at 11:44 am

EMOTION ENGINE CPU PLUS VECTOR CO-PROCESSOR = 12 MILLION TRANSISTORS

CELERON, XBOX 1'S CPU, = 9 MILLION TRANSISTORS, AND IT'S 70S-STYLE CISC

BROADWAY CPU = OVER 20 MILLION TRANSISTORS, PLUS RISC, PLUS COPPER WIRE, PLUS SILICON ON INSULATOR, PLUS MICRO-EMBEDDED DESIGN, PLUS CUSTOM COMPRESSION, PLUS CUSTOM INSTRUCTION SET, PLUS TWICE THE CACHE MEMORY, PLUS 4X-PLUS THE FRONT SIDE BUS

BUT HONEST, GUVNOR, XBOX IS MORE POWERFUL

UTTER HOGWASH

Davn Kincade

On July 22, 2008 at 2:38 pm

Removing your posts doesn’t confirm anything except that your constant and pointless ranting on something that was brought out over a -year ago- is getting tiresome to quite a few people on here.

You’ve made your point several times over but you utterly -refuse- to let it go which is just more proof that you take this -way- too seriously, you also post three to four posts saying the -exact- same thing in a different way.

Constant and deliberate CapsLock posting is for wankers who are in desperate need of attention. It makes your comments boorish and adolescent in both form and appearance, and generally tends to rub people the wrong way… which was your intention from the start, if you were to be honest about it.

So from your posting we can extrapolate two things:

1: Your over-inflated sense of self and/or ego outweighs your common sense and makes you incapable of understanding that people are going to disagree with this topic and with you no matter how often you rant… making your entire exercise in posting (i.e. ranting about something utterly trivial) a pointless affair.

2: In a blind fanboy attempt to defend a piece of hardware, you have degenerated into a common forum troll, which nobody likes.

Frankly I think all of your posts should be removed… no, better yet, this entire article should be closed, both for your sake and ours.

wiiboy101-2008

On July 22, 2008 at 4:27 pm

I'LL REPEAT UNTIL COMPUTATIONAL FACT SINKS IN

XBOX 1 XCPU: 9 MILLION TRANSISTORS, CISC SPREADSHEET CPU

WII BROADWAY: OVER 20 MILLION TRANSISTORS, RISC, COPPER WIRE, SILICON-ON-INSULATOR TECH, MICRO-EMBEDDED DESIGN, TWICE AS MUCH L1 AND L2 CACHE MEMORY, CUSTOM DATA COMPRESSION/DECOMPRESSION, AND 4X THE FRONT SIDE BUS

BROADWAY IS CLEARLY AT LEAST 2.5X AN XCPU IN REAL-WORLD GAMING APPLICATIONS

COMPUTER INDUSTRY FACT

impuk

On July 22, 2008 at 6:14 pm

long thread :roll:

By my calculation, the xbox would require a 1.5 GHz Celeron M with 1MB of level 2 cache and a 500mhz fsb to compete with the wii cpu and bus setup. You cannot directly compare the xbox cpu/bus setup to the custom optimized wii.

The gpu, with its massive edram cache, its custom tile rendering, high efficiency and virtual texturing, is like comparing cisc to risc cpus, so again, based on wii's superior design I'd say an overclocked xgpu at 500mhz would be required. That would also balance out the clock speeds, just like wii has.

wii is like an xbox 1 with the following spec:

1.5 GHz Celeron M, 500mhz 64-bit bus, a 500mhz xgpu, and 128mb of gddr3 ram on a 128-bit bus at 500mhz x 2 for dual channel

I've got your back, caps lock guy lol :idea:

impuk

On July 22, 2008 at 6:17 pm

250mhz x 2 for dual channel 500mhz gddr3 128 bit sorry :eek:

wiiboy101-2008

On July 22, 2008 at 6:20 pm

EXACTLY MY POINT. RISC VS CISC, CUSTOM VS STANDARD, FAST RAM VS SLOW RAM, CUSTOM COMPRESSION VS NO CUSTOM COMPRESSION GIVES WII A 2-TO-1+ ADVANTAGE OVER XBOX BASED ON CLOCK SPEEDS

THAT'S WHAT THE XBOX FANS WON'T OR CANNOT ADMIT

AnnoyedProgrammer

On July 22, 2008 at 8:28 pm

To wiiboy: the COD4 engine runs on the wii because they stripped it of many of the things that made COD4 on the 360 and PS3 so beautiful. And speaking as a game programmer, it is impossible for the Wii to even approach the 360 and PS3 graphically. First off, the 360 and PS3 are multi-threaded and multi-cored and the Wii is not, meaning that not only are the 360's and PS3's cpus vastly superior game-wise to the Wii's, they are also more efficient, easier to program A.I. for, and easier to pass graphical arrays, shader arrays and vectors through without a lot of slowdown. Also, 360, PS3 and Wii go in this order (worst to best):
Wii<PS3Fanboys

AnnoyedProgrammer

On July 22, 2008 at 8:31 pm

Disregard the last line. As I was saying, Wii<PS3<360. So your hopes and dreams of the Wii matching non-HD 360 and PS3 graphics are invalid, because all HD does is increase resolution, making the already existing game the same, just sharper looking.

3ds_mac

On July 22, 2008 at 10:30 pm

@Aristotle – They’d be better off if they had a modern mobile ATI gpu. But they really don’t. They recycled the ArtX chip they got for the Gamecube. Which was a nice design for its time, and had its own strengths (and weaknesses).
They kept more with the bandwidth-heavy style of multi-texturing, but added edram to compensate. So it worked fine there, and stuck to older T&L, but added instructions to the cpu to compensate there as well, so it was “pretty much” fine there too.

But, reusing that same hardware and over-clocking everything by 50% doesn’t make it competitive with the other 2, even if they stick to 480p. It means it isn’t capable of running modern game engines.

The “rich set of hardware effects” is exactly that.
A rich set, of fixed function actions that it can perform in hardware. Not even close to anything modern.

It does allow for full backwards compatibility, and negates the need to upgrade development tools, or the need to learn new hardware, especially for their internal EAD teams. So it’s a good thing for Nintendo themselves.

@Wiiboy, you’re right. It’s a conspiracy. Everyone is just trying to keep you from telling the “truth”. Everyone hates Nintendo, and is jealous of their power and doing everything we can to keep the Wii down. It’s really not an over-clocked Gamecube with more ram, it’s actually a state-of-the art piece of hardware, that out-classes everything.
We’re all in on it. We’re just trying to get you on our side. We couldn’t have an awesome “tech genius” like yourself shooting holes in the plot to bring Nintendo to its knees.

3ds_mac

On July 23, 2008 at 4:12 am

Oh, and since you failed to read it the first time, I’ll repost some links for you.
The World Anti-Nintendo Conspiracy (W.A.N.C) for short, has taken the liberty of modifying the transistor count on a list of sites.
http://www.cpu-collection.de/?tn=&l0=cl&l1=PowerPC&l2=IBM

^^See the transistor count of the entire G3 750 family? See how it’s listed at 6.3 million? (and introduced in 1998, that’s a good year for a modern cpu. Vintage yo)

http://www.cpu-collection.de/?tn=&l0=md&l1=2001&l2=Intel#KC733/128

^^See the transistor count of the entire 2001 Pentium family? See how it’s listed as 28.1 million for the Xbox cpu?
Coppermines had 3 times the transistor count of the Katmai, (the one you consistently post figures on), because they incorporated the L2 cache on-die.

Uh, oh.
28.1 million is higher than 6.3 million. And since your tactics are to compare generic numbers without context, I guess Xcpu owns.
And you know nothing of cisc vs risc differences, nor did you bother to look up legit statistics on it, nor would you, so no point going over that. Nor the old, dead, G3 vs Pentium 2′s from the 90′s performance. (notice I said P2 there)

And you ignore the context involved with what those cpus actual roles were on each console…No, that’s too complicated.

We should just stick to the generic number, and conclude that “transistor count = awesomeness”
And ignore the fact that L2 cache wasn’t counted in the G3 specs, while it was on the Xbox cpu. And ignore the fact that only half the cache is active in the Xbox cpu, even though the transistors related to them are there. (because all Celerons were gimped Pentium 3s)

I would point out that I, unlike you wouldn’t judge the cpu based on such a generic figure, (that’s your job). And say that, I would fully expect the Wii’s version of the G3 cpu to be more capable than a generic Pentium. (Gamecube’s is another story)

No no, that doesn’t help to further the anti-Nintendo propaganda.
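Part of why the two sides keep quoting wildly different transistor counts is whether on-die cache is included. A back-of-envelope sketch of why that matters, assuming a standard 6-transistor SRAM cell and ignoring tag/control overhead:

# Rough transistor cost of a 256 KB on-die L2 cache, assuming 6T SRAM
# cells and ignoring tags and control logic (a simplification).
cache_bits = 256 * 1024 * 8
transistors = cache_bits * 6
print(transistors / 1e6, "million transistors for the cache array alone")
# ~12.6 million -- which is why a count quoted with on-die L2 included
# (e.g. the 28.1M Coppermine figure) dwarfs a core-only count like 6.3M.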

3ds_mac

On July 23, 2008 at 4:15 am

Correction: (because all Coppermine Celerons were gimped Pentium 3s)

wiiboy101-2008

On July 23, 2008 at 10:08 am

EA'S MOH HEROES 2 ON WII LOOKS OK-ISH, RUNS SUPER SMOOTH AT A SOLID 60 FRAMES, LAG-FREE ONLINE AT 60 FRAMES, GREAT WII CONTROLS

BUT IT ISN'T REPRESENTING WII GRAPHICS IN THE SLIGHTEST. IT'S A HIGH-RESSED, FLESHED-OUT PSP ENGINE PORTED FROM PSP, SO HOW IS THAT IN ANY WAY, SHAPE OR FORM WII GRAPHICS

COD 5 WII IS CLEARLY 2.5X MOH HEROES 2 AND STILL MAINTAINS 60 FRAMES

how does psp and ps2 porting define wii graphics? YOU'RE MAKING NO SENSE, XBOX FANS

wiiboy101-2008

On July 23, 2008 at 2:00 pm

HEY, I BET THIS LOT GO ABOUT SAYING "3.2GHZ X 3 = 9.6GHZ, AND THAT'S HOW POWERFUL X360 IS"

OR MAYBE "3 G5S, IN-ORDER, TRIPLE CORE" HMMMMMMMM, BULL

3.2GHZ IN-ORDER BASIC DESIGN, AS IN PRE-G3-STYLE IN-ORDER, OLD-ASS IBM CORES WITH OVERCLOCKED CLOCK SPEEDS, AKA PPE NOT PPC, NO BRANCH PREDICTION OR TRUE DESKTOP CPU ABILITIES

BUT THE X360 FANS LIKE TO SAY "3X G5S AT 3.2GHZ"

YEAH RIGHT, THEY'RE POWER MACS, COUGH

wiiboy101-2008

On July 23, 2008 at 2:02 pm

WHERE'S YOUR DEDICATED SOUND CHIP AND SOUND BUS? OH, I FORGOT, CPU-BASED SOUND USING THE WINDOWS SOUND TOOLS. IF BROADWAY WAS 3.2GHZ, EVEN SINGLE CORE, IT WOULD DESTROY THE IN-ORDER PPE CRAP IN X360

3ds_mac

On July 23, 2008 at 9:21 pm

Who’d say that? No one here has. You’re putting claims in people’s mouths they haven’t made.
Xenon and Cell were designed by modern engineers, who made design decisions based on the needs of a console, and Mic/Sony’s requests.

And, they stripped out-of-order instructions from them, because they figured they could write a compiler 1 to 1 with the cpu, and have code more organized, thus reducing the need for things like an instruction window. (made to make even crappy code run fast) If you ran typical pc code on either, it’d run like crap.
(still above Broadway of course)
But these cpus can have compilers tailored specifically to them, rather than having to run code that’s meant for all. P4, AMD, etc, etc..

Stripping the logic for it allowed for more cores and threads dedicated to processing. They also added custom features like vmx128 to 360, and high front-side bus bandwidth, etc.. Having more cores allows for a higher peak flop capability. (assuming you have something that actually uses it effectively)

In addition to the fact that in-order allowed them to be clocked faster, at 3.2 ghz.

Broadway couldn’t reach 3.2 ghz. Nor can a G5 for that matter.
And I don’t know why you’d expect having to deal with sound to matter, considering Wii’s cpu handles an entire list of things the gpus in the other systems handle on their own.

Cell and Xenon run sound…uh, oh…but Broadway runs any vertex shader work you wanted to do on Wii. They could very well have taken their 165-234 million transistor budget and built Wii-like cpus with a few cores, etc.., but they don’t need them. Streaming most graphics related data doesn’t require it.

And performance depends entirely on what you’re doing. Dealing with polygon checks, or sound, etc is trivial.
Not much use for something like Pentium 4 type processing for things like that to run fast. And for things like procedural textures for example, an spe is 50% faster than a P4, as benchmarked by profx.

And Capcom, having worked on PC, 360 and PS3, have indicated that thus far from their experience overall, 1 core on 360 = 2/3 a 3.2 ghz Pentium 4 core.
Hyperthreading was more efficient on Xenon, and overall for their uses:
Xenon = dual core Pentium 4 Extreme.

Of course, that’s for an actual game, probably with compiler optimizations for Xenon that wouldn’t have been done for the P4, etc, etc.. And it likely wouldn’t be the same for general purpose pc operations, but these aren’t meant to be general purpose pcs.
And different developers, and different games would yield different results. If they required more raw flops, it’d favor Cell and Xenon more.

But regardless of any of this, the point here is, Cell/Xenon out perform Broadway handily.
Handily.

3ds_mac

On July 23, 2008 at 10:33 pm

****IF BROADWAY WAS 3.2GHZ EVEN IN SINGLE CORE IT WOULD DESTROY THE INLINE PPE CRAP IN X360****

Broadway “should” have been clocked faster. Maybe 1 ghz or more. But it still wouldn’t be the 20-million-transistor G3 of awesomeness that you dream of.

No matter how inefficient the compiler is, or how crappily the code is written, or how sensitive they are to unoptimized code, (which they are, no doubt) the other two cpus are simply better.
And more than 8-12 times more transistors. For those fixated on transistor count.

wiiboy101

On July 25, 2008 at 11:07 am

broadway cpu: out-of-order, fully branch-predicting PPC cpu

x360 cpu: basic in-order cores, no branch prediction, just upped clock speeds, PPE

EXAMPLE OF IN-ORDER CPUS VS OUT-OF-ORDER EXECUTION UNITS:

AN IN-ORDER CPU LIKE X360'S IS AT BEST 1/3RD THE POWER OF AN OUT-OF-ORDER CPU AT THE SAME CLOCK, AND AT WORST 10X SLOWER

THAT'S IN-ORDER PROCESSING VS OUT-OF-ORDER EXECUTION UNITS, FACT

YET YOU STILL BASE YOUR BELIEFS ON CLOCK SPEED

RETARDS. BROADWAY IS EASILY ON PAR WITH A SINGLE 3.2GHZ CORE IN X360, AND DOESN'T DO HD OR SOUND OR CLUNKY BACKGROUND OPERATING SYSTEMS, WHICH GIVES ITS DEDICATED GAMING PROCESSING A GOOD OLD BOOST IN POWER

BUT I'M NOW GOING FURTHER OVER YOUR HEADS, LIKE ALWAYS

WII'S SPECS AT 480P NATIVE RESOLUTION ARE CLEARLY = TO X360'S AT NATIVE 720P

EXACTLY A THIRD, AND 480P IS EXACTLY A THIRD OF 720P. BUT WII HAS SOME BIG ADVANTAGES, LIKE 1T-SRAM: IT'S SO FAST IT ALLOWS INSTANT LOADING OF DATA

SO NO POP-UP GRAPHICS AND POP-UP TEXTURES LIKE, COUGH, X360 AND THE UE3 ENGINE

GET SOME REAL RAM IN THERE, 1T-SRAM OR IBM'S NEW CUSTOM EDRAM

DRAM? OH PLEASE, SOOOO SLOW

wiiboy101

On July 25, 2008 at 11:19 am

rogue squadron 2: 15 million polygons, 60 frames a second, 24-bit textures, 24-bit colour

rogue squadron 3: near 20 million polygons, 60 frames a second, 24-bit colour, 24-bit textures

prime 2: 15 million polygons, 60 frames, 24-bit colour, 24-bit textures

zelda wind waker: best cel shading by far last gen

resident evil 4: near 20 million polygons, 30 frames, 24-bit colour, 24-bit textures

all the above are gamecube; added to that, zero or near-zero loading times

———————————————————————————

halo 2: 30 frames, 10 million polygons, 16-bit colour, 16-bit textures, choppy frames and pop-up textures

riddick fps: 30 frames, 16-bit colour, 16-bit textures, 9 million polygons

ninja gaiden: 16-bit colour, 16-bit textures

ALL THE ABOVE XBOX GAMES HAVE VERY LONG LOADING TIMES AND CHOPPY FRAME RATES AND TEXTURE POP-UP

BUT XBOX IS NOT ONLY MORE POWERFUL THAN GAMECUBE, IT'S MORE POWERFUL THAN WII?

GET A GRIP OF YOURSELVES. GAMECUBE BEAT OUT XBOX AT POLYGONS, FRAMERATES, COLOUR AND LOADING SPEEDS

WITH EASE. THE GAMES EXIST TO PROVE IT

3ds_mac

On July 25, 2008 at 12:54 pm

You’re over no one’s head but your own. And I’m sure you’ve no clue what a branch hint inserted by the compiler is, so no point going over such things. The steps in programming a vcr would likely be a challenge to you.
It’s almost entertaining to hear someone so clueless speak in such a condescending manner.

And I just outlined what Capcom’s report was, did I not? Sure I did. Then I linked it. But, couldn’t expect you to read. That’d be too much.
For their uses, (in an actual game) 2/3 a 3.2 ghz Pentium 4, = a 2+ ghz Pentium 4. And that’s a hyper-threaded P4 core. (then we can consider that the compiler has improved recently since that article was written)

Are you suggesting Broadway is as powerful as a 2ghz HT P4 core?
Of course not. It’d have trouble with later model P3′s.

And, please remind yourself that Broadway is a 20 million transistor G3.
A 20 million transistor G3. A 20 million transistor G3.
Not a G5. Not a G4, but a G3. Clocked at 729 mhz. Simple split floating-point unit modification. Not altivec, not VMX of any sort.
360 had 3x vmx128. Cell has a VMX32. Broadway doesn’t even get the standard vmx32.

And I just explained to you the benefit of ooo instructions. You can write something with Borland or whatever compiler, and it’ll run ok on an ooo cpu, because the system is capable of searching the instruction window for the best order of execution. Not as needed in a closed console environment, where you know the exact specifications of the entire system. You can’t know that on a pc, hence ooo instructions being absolutely required.

And again, since you’ve regurgitated random internet jibber on the efficiency of ooo instructions, I’m sure you can understand that that “worst case” efficiency is with unoptimized code, and especially not graphics related tasks, or raw floating point.

And I’ll say it again, if they needed a cpu similar to a G5, they could easily have built one. 165 million transistors is plenty. But, as I explained, they wanted multi-core cpus. If they’d have made it ooo and similar to a G5, it’d have been 2.5 ghz at the most for use inside a console, and would have had fewer cores. And as I also explained but will say again, the stuff you just copied and pasted from the internet regarding ooo and in-order is in regards to generic unoptimized code, compiled by a generic compiler.
That isn’t the case here Wiiboy. Plenty of data you could look up regarding power comparisons when the compiler is optimized, and there are branch hints.
Try looking up ray-tracing benchmarks on a Cell vs Pentium 4s and other processors. You won’t find the ooo vs in-order bs you just tried to use to boast of Broadway’s awesomeness.

And the cpu doesn’t get much help from being 480p, that would be more the job of the gpu, dude. No need to shoot down the rest of that schlock you just reposted for the dozenth time, as it’s already been addressed. From colour to polygon transforms.

Sampson

On July 25, 2008 at 3:22 pm

Wiitard, still drooling his way down the thread I see. And still as retarded as ever. And I do mean impairment.

And you can look up efficiency ratings for 1tsram. That’s all the benefit there is to low latency, btw: efficiency.
And 10-nanosecond latency doesn’t net you “3 times” anything, especially when it comes to graphics. The best it’ll do is bump efficiency up about 15 percent. As outlined with Xbox 1 benchmarks, the DDR in Xbox was 75% efficient. 1tsram was 90% efficient. And you can do some basic division there, coupled with bandwidth figures, to get an overall effectiveness.

Again, 2-3x lower latency doesn’t net you 3x anything, Wiiboy. Especially with the linear accesses related to graphics data. You didn’t bother to research your understanding of page-fetches, just went over your marketing papers and picked out buzz-words you liked.

This is why the developer podcast someone linked earlier, they were laughing at the NDF’s self-crowned tech analysis regarding Gamecube 1t-sram ram.
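One way to read the efficiency point being made here, sketched with the 75% / 90% figures from the comment and the commonly quoted peak bandwidths for each machine (both treated as assumptions, not measurements):

# Effective bandwidth = peak bandwidth * bus efficiency.
# Peak figures (6.4 GB/s DDR on Xbox, 2.6 GB/s 1T-SRAM main ram on
# Gamecube) are the commonly cited numbers, used here as assumptions.
def effective_bw(peak_gb_s, efficiency):
    return peak_gb_s * efficiency

print("Xbox DDR:     ", effective_bw(6.4, 0.75), "GB/s usable")   # ~4.8
print("Cube 1T-SRAM: ", effective_bw(2.6, 0.90), "GB/s usable")   # ~2.3
# Lower latency buys the higher efficiency factor; it does not
# multiply the peak number by 3x or 10x.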

3ds_mac

On July 25, 2008 at 3:37 pm

****EXACTLY A 3RD AND 480P IS EXACTLY A 3RD OF 720P BUT WII HAS SOME BIG ADVANTAGES LIKE 1T SRAM ITS SO FAST IT ALLOWS INSTANT LOADING OF DATA
SO NO POP UP GRAPHICS AND POP UP TEXTURES LIKE COUGH X360 AND UR3 ENGINE
GET SOME REAL RAM IN THERE 1TSRAM OR IBMS NEW CUSTOM EDRAM ****

And this bit. Pop-in on 360 is primarily to do with expansive levels, (especially open-world) that don’t cache to the HDD properly. It’s not much to do with ram.

If you want a good example of ugly as hell pop-in, in an open world environment, go play No More Heroes on your Wii. (that dead and lifeless world will show you lots of it, with bland, jaggy-covered low detail, and a spotty frame-rate to boot)

(nice boss battles though)

lennell

On July 26, 2008 at 12:43 am

I agree with 3ds_mac. No More Heroes is a good game, but it has a dead world and pop-up graphics. And the Hulk game on wii has ugly pop-up graphics and it looks lifeless. Spider-Man 3 for wii looks like a damn Nintendo 64 game at max power.

wiiboy101

On July 31, 2008 at 5:14 pm

AVATARS >>>>>> MIIS? cough cough cough COUGH COUGH

DREAMCAST CONTROL PAD >>>>>> xbox and x360 control pad? cough cough cough COUGH

analog stick, d-pad, rumble, shoulder buttons etc etc etc etc — sony, microsoft, sega, panasonic, cough cough cough cough. NINTENDO'S INNOVATIONS, EVERY SINGLE LAST ONE OF THEM

COUGH COUGH COUGH COUGH

SALES: WII BETTER THAN X360 AND PS3 COMBINED GLOBALLY

EA JUST RECENTLY ADMITTED NO TRUE WII SUPPORT, NO EGGS IN THE WII BASKET, SHOULD HAVE BACKED IT (PS2 FIFA ENGINE BEING DUBBED WII GRAPHICS)

AND YOU BELIEVE HIM

DUMB MOTHER FUKERS

3ds_mac

On August 2, 2008 at 3:50 am

Should do something about that cough. A lozenge perhaps.
It’s a common symptom of fanboyitis. It usually spreads through the internet, infecting those with a biological predisposition for irrational thinking.

Lets go over some history.

D-pad, directional pad, digital pad, etc.. have had various designs, etc.. Nintendo simply took the stick off, and used the plus-sign shape to execute the directions. Very nice. Nintendo deserves some credit. (even if I can point to a plus-sign digital control scheme on remote control cars, etc..)

And tons of folks predate Nintendo with analog.
For example 1982:
http://www.old-computers.com/MUSEUM/computer.asp?st=2&c=1018

“included is a 1.5 inch, self-centering, joystick with 4 buttons on the right. It uses an analog/potentiometer system allowing differing degrees of directional input.”

But wait, Miyamoto invented analog sticks in his garage in 86…Those silly time machining guys at Milton Bradley.

This is 85 and they’re dual analog sticks….on a Sony controller? That reminds me a lot of the dual analog sticks in Sony’s controller that came later….
http://en.wikipedia.org/wiki/Sony_Flightstick

And also:
http://www.1up.com/do/newsStory?cId=3168949
You should testify on Nintendo’s behalf Wiiboy. I’m sure they’ll listen to an awesome videogame hardware historian like yourself.

3ds_mac

On August 2, 2008 at 4:08 am

That should be 1996, not 86. And Sony’s would be 95, not 85.

Typo. I wouldn’t want to give you the impression that Nintendo put an analog stick on their controller, back at the time they had just “invented” the d-pad. – which is only a few years “after” Milty used his time machine to steal Nintendo’s analog stick idea from the future, to take back to use in 1982.

wiiboy101

On August 4, 2008 at 10:20 am

gamecube: 8x8 hardware lights, plus custom lighting via the optimized cpu

xbox: 8x4 hardware lights, no custom cpu lighting. CLEARLY WEAKER

gamecube: 8 texture layers, 16 texture stages

xbox: 4 texture layers, 4 texture stages. CLEARLY WEAKER
internal gpu bandwidths: flipper = 2.5x-plus the xgpu; flipper reads compressed textures and data, the xgpu does not
flipper supports tile rendering and virtual textures, the xgpu does not; flipper reads data from fast edram/1t-sram, the xgpu reads from pc-standard dram

but xbox is more powerful?

ok, reality or fanboy
wii all know the truth, but the anti-nintendo idiots can't be honest
fsb/gpu link: xbox 133mhz, 1 gb
gamecube 162mhz + 4-to-1 peak compression, 5.2gb

gamecube, never mind wii, is more than xbox
goons

Wiiboy101

On August 4, 2008 at 12:11 pm

By extrapolation, we got performance numbers for both Pentium 3s working at the 400MHz FSB speed that is employed by the Pentium 4. For both the Pentium 3 600 and the Pentium 3 1000, a performance boost of 46% is obtained by increasing the FSB speed from 133MHz to 400MHz.

3. Reckoning the Performance of Pentium 4 by IPC and FSB

From the figure below, we can extrapolate the performance number of a Pentium 3 1400 with 133MHz FSB speed to be 180. As has been mentioned above, a 46% performance boost will be obtained when increasing the FSB speed applied to the Pentium 3 from 133MHz to 400MHz, so the performance number of the Pentium 3 1400 should be:
Pentium 3 1400(400) = Pentium 3 1400(133) x 146%

so, judging from the above, the 133mhz-to-400mhz bus increase alone raises a p3's performance by an average of 46%

xbox 1 = celeron 733mhz on a pentium 3 133mhz bus = 1gb bandwidth

but the fact wii uses a base-speed 243mhz bus plus a peak 4-to-1 compression of all data in vector mode = 7.7-plus gb bandwidth

mix 4-to-1 with 2-to-1 and non-compressed data and you get an effective bus of around 500mhz; add to that the wii system is clock-balanced to its fsb, and the wii has 1t-sram with sram-like performance and low latency

how can xbox be = wii? that's utter fanboy fantasy

in xbox terms, wii's fsb is like 500mhz; in xbox terms, the broadway cpu is 2.5x the xbox celeron in-game

PLONKERS, YOU INTEL/MICROSOFT-WORSHIPPING PLONKERS
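For what it's worth, the extrapolation quoted above is a single multiplication; here it is spelled out with the numbers exactly as given in the quote (whether that benchmark-specific 46% scaling says anything about another machine's bus is a separate argument):

# The quoted claim: raising a Pentium 3's FSB from 133 MHz to 400 MHz
# gives ~46% more performance on the benchmark in question.
p3_1400_at_133mhz_fsb = 180      # performance number quoted above
p3_1400_at_400mhz_fsb = p3_1400_at_133mhz_fsb * 1.46
print(p3_1400_at_400mhz_fsb)     # ~262.8
# This is a scaling for one CPU family on one benchmark; it does not
# by itself establish an "effective 500 MHz bus" for a different console.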

Wiiboy101

On August 4, 2008 at 12:14 pm

the fsb don't mean anything = xbox liars

broadway being risc, copper wire, compression, silicon on insulator don't mean anything = xbox liars

1t sram don't mean anything = xbox liars

why would nintendo use extremely expensive ram, 1t sram-r, if it didn't do anything

DON'T BE SO DAMN STUPID

Wiiboy101

On August 4, 2008 at 12:20 pm

WII MAIN RAM BANDWIDTH = 4GB X 2 = 8GB BANDWIDTH, NOT COUNTING COMPRESSION, @480P

X360 MAIN RAM BANDWIDTH: 24GB @ 720P. 480P X 3 = 720P

8GB X 3 = 24GB. OH LOOK, THE WII = A 480P X360, USING MATHS FROM NURSERY SCHOOL

8 X 3 = 24, 480P X 3 = 720P

IF YOU CANNOT ADD UP BASIC NUMBERS, GO BACK TO SCHOOL

ADD TO THE ABOVE NUMBERS WII'S SUPERIOR RAM SPEED, BALANCE AND COMPRESSION, AND THE WII'S 480P PERFORMANCE CAN BE AMAZING

AS FACTOR 5 WILL PROVE, AND CONDUIT ALSO
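The "x3" here is just pixel counting; a sketch of that arithmetic, using the bandwidth figures exactly as the comment states them (they're assumptions, and bandwidth per pixel is only one of many things that decide what a GPU can draw):

# Pixel counts at 480p and 720p, and bandwidth per pixel using the
# figures quoted in the comment (8 GB/s and 24 GB/s, taken as given).
pixels_480p = 640 * 480          # 307,200
pixels_720p = 1280 * 720         # 921,600
print(pixels_720p / pixels_480p)                 # exactly 3.0
print(8e9 / pixels_480p, 24e9 / pixels_720p)     # roughly equal bytes/s per pixel
# Equal bandwidth per pixel does not make two GPUs equal; the shader
# and fill-rate gaps discussed further down the thread are far larger.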

3ds_mac

On August 5, 2008 at 12:13 am

You can’t “extrapolate” anything with the stuff you post. I know you think you can, but you fail to comprehend the differences in the tasks the cpu is responsible for. (among countless other things you fail at) You can’t compare any two consoles without taking that into account.
And it’s been explained countless times, what a light is, and how it’s calculated. Gamecube couldn’t do legit per-vertex lighting anywhere near as effectively. Cube’s is far more generic. Splinter Cell would be a good example.
If you want to stick to the older, generic lighting models for your comparison, that’s fine. But keep in mind that modern games wouldn’t touch it with a 10 foot pole. You’d use things like vertex shader driven normal maps, even on Xbox, not to mention anything modern.

And once again, nothing does 8 hardware lights by default. More lights, cuts into performance elsewhere. Whether it’s in the combiner stages, or in transformation unit. You can’t simply add 8 lights to anything, any more than you can add 8 textures to anything arbitrarily. That’s simply not how things work.
Texture layers cuts into fill rate, hence the countless developer quotes pointing out the average number of texture layers in a given scene. Making your “8 vs 4″ comparisons useless.
They even mention other limitations in one of Factor 5′s presentations.

And also, once again, 360 has Superior compression to the Wii, in every way, shape, and form. More formats, better quality, better compression ratio, more types, benefiting both bandwidth, storage, and processing.
So, it has 6 times the ram. In addition, to far more efficient and effective compression. Wii doesn’t have formats for compressing a normal map. Nor does it have them for volumetric. Nor anything past the basics for regular textures, nor anywhere near the potential filtering capability, or layering, or anywhere near the math related to it.

And, once again, nowhere near the shader capability. Arithmetic operations outpace texture operations several times over these days, and more so in the future. Real-time shadows, per-vertex lighting, and all the complex pixel and vertex shader work don’t require the bandwidth of the old multi-texturing that the Wii is limited to.

Could even go into legit on-chip displacement mapping with the tessellation unit, or cpu generated content, or spus, etc..
But no need to use such hypothetical uses of technology to effectively multiply numbers, the Wii hardware is already owned by citing the absolute basics. Even if you pretended PS3 and 360 were generic texture combiners and ignore 90% of the rest.

3ds_mac

On August 5, 2008 at 4:26 am

****gamecube 8 texture layers 16 texture stages
xbox 4 texture layers 4 texture stages CLEARLY WEAKER****

And clearly, once again, you don’t know what a “render pass” is.
And again, per developer quotes, Gamecube averaged 3, Xbox averaged 4. Xbox isn’t limited to 4 any more than Gamecube is, as it can multi-pass, as it does in Doom 3 and several other games.

And clearly you don’t understand the “4 to 1” compression you repeatedly post. You don’t care what that is, nor what it means. It’s using lower precision bytes and shorts to represent vertex data. That seems to be it. You should try reading the Sega tech article you likely picked that up from more closely.
Byte = 8 bits, short = 16 bits, instead of the typical 32 bits. (aka up to 4 to 1)

Click here:
http://www.patentstorm.us/patents/7159212/description.html
Press: ctrl + F
And write: Data Structure Generation

You’ll notice they outline the different methods for PS2, Gamecube, and Xbox/PC. Xbox/PC are capable of using pre-compiled command streams. Xbox and PC vertex shaders are capable of decompressing vertex data.
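A minimal sketch of the kind of vertex compression being described, i.e. storing positions as 8- or 16-bit integers instead of 32-bit floats; the fixed-point scale and layout here are illustrative, not any console's actual format:

import struct

# One vertex position as three 32-bit floats vs three 16-bit shorts.
x, y, z = 1.25, -3.5, 0.75
as_floats = struct.pack("<3f", x, y, z)                 # 12 bytes
scale = 256.0                                           # illustrative fixed-point scale
as_shorts = struct.pack("<3h", *(int(round(v * scale)) for v in (x, y, z)))  # 6 bytes
print(len(as_floats), len(as_shorts))   # 12 vs 6 -> 2:1; packing into bytes gives 4:1
# Decompression is just a multiply by 1/scale as the data is read,
# which is the sort of thing a vertex pipeline or shader can do inline.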

As to the rest of the crap you keep reposting….Not sure why it’s so difficult for you to wrap your head around Wiiboy. -It’s like an Abbott and Costello skit.

Wii will never be a 480p 360….Or PS3 for that matter.
You can make failed attempts to argue otherwise the rest of your life (or until you’re banned from every forum across the net, whichever comes first), it won’t change reality. Fanboy delusions never do.

Sampson

On August 5, 2008 at 11:05 am

I’m pretty sure I’ve banned Wiiboy before, as well as deleted his copy and paste bull as a mod.
He’s a pathetic little troll who’s developed an inferiority complex on Nintendo’s behalf, and thinks copying all the buzzwords and technical info he doesn’t understand, and reposting it everywhere he can find, will convince people of the Wii’s “awesomeness”. (this is by no means the only place he trolls)

You see Wiiboy, you’ll never be capable of actually programming games.

Therefore, the Wii will forever go “under-utilized”. That’s just the way things work.
Everyone else who’s smart enough to make games, hates the Wii, Nintendo, and everything they stand for with a passion, and will continue to sabotage their graphics performance.
Sure, every once in a while we’ll come across a Nintendo fan in our midst, that’s not willing to go along with our plot, but I’m sure you know who’ll be cleaning the floors and picking up our dry-cleaning….
And any programmer or artist who shows potential will be quickly hired away from places like Retro Studios, to be put to work for our own anti-Nintendo benefit.

“And we’ll get away with it too, even with you meddling kids trying to expose our plans.”

There’s simply nothing you or the “Nintendo Defense Farce” can do about it Wiiboy.

wiiboy101

On August 6, 2008 at 8:49 pm

Internal CPU data bus:
Pentium 3 – 733 mhz * 64 bit data bus = 5.6 GB/s
Gekko – 485 mhz * 64 bit data bus * 4 (compression) = 15.5 GB/s

broadway cpu: 23.25 GB/s, plus increased efficiency and spec tweaks

transistor count: xbox celeron, 9 million

gekko cpu: 20-plus million

broadway: higher again than gekko, 20-plus million

but wii is somehow = an xbox
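A quick check of the bus arithmetic quoted above: clock times bus width in bytes, with the 4x figure treated as a best-case compression ratio rather than a sustained one:

# Bus bandwidth = clock * bus width in bytes (* claimed compression ratio).
def bus_gb_s(clock_mhz, bus_bits, compression=1.0):
    return clock_mhz * 1e6 * (bus_bits / 8) * compression / 1e9

print(bus_gb_s(733, 64))        # ~5.9 GB/s (close to the 5.6 quoted above)
print(bus_gb_s(485, 64, 4))     # ~15.5 GB/s, but only if everything compresses 4:1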

3ds_mac

On August 7, 2008 at 12:40 am

Broadway is 6.5 million, Wiiboy. Context is important.
And compression is applicable to any vertex data.

wiiboy101

On August 8, 2008 at 1:20 pm

COMPRESSION? WHAT BULL. GEKKO AND BROADWAY AND THE FSB AND THE GPU WRITE-GATHER PIPE ARE CUSTOM, DUMB FUK

compressed data, textures, everything in wii is compressed at all points until PROCESSING. IF ALL VERTEX DATA IS, AS YOU SAY, COMPRESSIBLE,

HOW COME XBOX 1 HAS NO SUCH ABILITY, WHILE X360 SUPPORTS IT AND ATI/IBM CONFIRMED IT WAS COPIED FROM GAMECUBE

REAL-TIME DECOMPRESSION AT THE POINT OF PROCESSING FOR TEXTURES AND DATA, PREVIOUSLY ONLY POSSIBLE ON GAMECUBE, IS ALSO ON WII, AND PARTIALLY ON X360 (TO A POINT)

SO GET YOUR FACTS RIGHT

YOU JUST COMPARED STORAGE COMPRESSION AND SOFTWARE COMPRESSION WITH CUSTOM HARDWIRED REAL-TIME COMPRESSION

AGAIN THE XBOX FAN LIESSSSSSSSSSSSSSS

YOU JUST GOT PWNED. X360 HAS REAL-TIME DATA COMPRESSION TO A POINT, NOT AS FULL-ON AS GAMECUBE AND WII

X360 ALSO HAS REAL-TIME TEXTURE DECOMPRESSION AND VIRTUAL TEXTURING, AGAIN COPIED FROM GAMECUBE, AS ATI STATED

CUSTOM COMPRESSION WITH REAL-TIME DECOMPRESSION IS NOT STANDARD. YOU JUST SAID IT WAS

AGAIN THE XBOX FAN HAS LIED

WII SUPPORTS CUSTOM ALL-DATA, SOUND, TEXTURE, SCRIPT, DISPLAY-LIST AND GEOMETRY COMPRESSION, WITH REAL-TIME DECOMPRESSION AT THE POINT OF PROCESSING

GUESS WHAT, NO OTHER CONSOLE OR PC ETC SUPPORTS SUCH A THING AT ALL, IN ANY FORM

THE WII DECOMPRESSES ON THE FLY, IN REAL TIME, AT THE CPU AND GPU AND SOUND CHIP. AT NO POINT UNTIL PROCESSING DOES THE DATA DECOMPRESS

XBOX, PS2, PS3, PC ETC CAN NOT, REPEAT, CAN NOT DO THAT

XBOX 1 COMPRESSED TEXTURES ON THE DISC AND IN MAIN RAM. IT COULD NOT READ COMPRESSED TEXTURES OVER ITS BUS OR AT THE GPU; THE TEXTURES DECOMPRESS, THEN ARE SENT

THE SAME APPLIES TO ALL DATA AND SOUND AND VIDEO ETC

BUT WII AND GAMECUBE SUPPORT TRUE REAL-TIME COMPRESSION/DECOMPRESSION

wiiboy101

On August 8, 2008 at 1:25 pm

AGAIN, HONESTY RULES. XBOX FANS, DROOL

BE HONEST BEFORE POSTING, FUKWIT

TRANSISTOR COUNTS:

PS2 EMOTION ENGINE: 12 MILLION

DREAMCAST SH CPU: 3 MILLION

XBOX 1 CELERON: 9 MILLION

GAMECUBE GEKKO: 20 MILLION

WII BROADWAY: 20-PLUS MILLION

ADD TO THAT, IT'S THE SMALLEST, MOST ADVANCED AND MOST MODERN DESIGN OF ALL THE CPUS LISTED

AND IS USING IBM'S 2007 MANUFACTURING TECH

SO IT'S A KILLER CPU FOR GAMING CODE. NO HD, NO SOUND, NO OPERATING SYSTEM, JUST PURE GAME-CENTRIC CODE DOWN TO ASSEMBLY LEVEL AND MICRO-CODING

XBOX FANS KNOW NOTHING

Crowhurst

On August 8, 2008 at 2:10 pm

Why don’t you just shut up and play the damn games. What’s specs without great games?

All consoles have had good games, so can’t we just leave it at that.

I have a Wii60 and I love them both. So why can’t you just get over the fact that they are both great consoles, and so were the Gamecube and the original Xbox.

Sampson

On August 8, 2008 at 9:06 pm

Wiiboy, I’ve repeatedly explained and linked, and re-explained, and said it multiple ways, dumbed it down for you, given analogies, etc…
You have no idea what you’re talking about. You’re probably capable of fooling your friends with this garbage, but it doesn’t work for anyone who even remotely pays attention to what you write.

And, you’re confused (shocking, I know). The vertex compression you’re speaking of on 360, is for procedural geometry Wiiboy. It’s part of their plan for generating a polygon mesh using an algorithm on the cpu. So the polygons aren’t being read from ram directly, they’re being generated and / or heavily modified on the cpu using source data, locked into L2 cache compressed, then read by the gpu over the fsb, which then converts it to floating point formats based on the compressed data.

You’ve confused that, with the fact that Wii’s (fixed function t&l) on the gpu requires the cpu to feed it directly, by reading data up from ram, processing it, and feeding it to the gpu.
Modern gpus have vertex shader processors, (that perform tasks that Wii requires its cpu to do) Vertex shaders, are like simplified cpus themselves. And they’re on the chip with everything else, not at the other end of a front side bus.

Under the direction of the cpu, they can pull a mesh pre-compiled from video ram. They, unlike your “favoritest gpu in the whole wide world” don’t require the cpu to feed them data. Again, Wii and Gamecube still require the cpu to read it up from ram, process it as a vertex shader would, and pass back down to the gpu directly.
Modern gpus have processors on the gpu specifically for it.
Wii lacks them entirely.
Wii and Gamecube have a fixed function t&l pipeline. The cpu is the brain.

Again, a vertex shader can read a vertex and index buffer from ram. Which means the vertex (a corner of a triangle) can be shared with all neighboring triangles, and doesn’t need to be stored multiple times for each triangle. Thus, compressing its data. It can get that data from ram.

Not sure how else to explain that.
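A minimal sketch of the vertex/index buffer idea described above: two triangles sharing an edge need only four stored vertices instead of six, because the index buffer refers to shared vertices by number:

# Two triangles sharing an edge: 6 corners, but only 4 unique vertices.
vertices = [
    (0.0, 0.0, 0.0),   # 0
    (1.0, 0.0, 0.0),   # 1
    (1.0, 1.0, 0.0),   # 2
    (0.0, 1.0, 0.0),   # 3
]
indices = [0, 1, 2,    # triangle A
           0, 2, 3]    # triangle B re-uses vertices 0 and 2

floats_unindexed = 6 * 3               # 6 corners * xyz, stored per triangle
floats_indexed = len(vertices) * 3     # 4 vertices * xyz, plus 6 small indices
print(floats_unindexed, floats_indexed)
# The point in the comment: a GPU with vertex shaders can walk this
# straight out of ram, while the older fixed-function setup leans on
# the CPU to prepare and feed that data.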

And, as ignorant as you are, I still don’t get why you don’t seem to be capable of comprehending the process of compressing textures.
They get stored on disk, compressed.
They get stored in ram, compressed.
They get read compressed.
Transmitted over the bus compressed.
Where the gpu, the gpu, the gpu, decompresses and uses them.

Why is that so hard for you to get I wonder…
Gpu’s have built-in processing ability to decompress texture data since long before Wii. The bus never moves uncompressed textures from ram. There is absolutely no reason for it.
Again, gpus have on them, processors that read compressed texture data.

Analogy: Your computer is a gpu, a server is ram that’s storing a compressed video.
It’s stored compressed, it’s read compressed, and transmitted over the internet compressed.

****XBOX 1 COMPRESSED TEXTURES ON THE DISC AND IN MAIN RAM IT COULD NOT READ COMPRESSED TEXTURES OVER ITS BUS OR AT THE GPU THE TEXTURES DE COMPRESSS THEN ARE SENT****

Again, and again. The bus connects the gpu to ram. The gpu reads compressed data from ram, the gpu decompresses it after it’s been transmitted over the bus. There is nothing sending post compressed textures over the bus. The gpu gets them compressed.
The PS2 can store compressed textures, but would require the cpu to decompress them before sending them to the gpu, because it doesn’t have direct access to ram, and doesn’t support compressed textures natively.

One of these days, perhaps you’ll understand.

Or not.

And Wii’s cpu is 6.5 million transistors. You’re including L2 cache to come up with 20. If they did that for Xbox cpu, it’d be 28.1 million. It’s only 9.5 if you ignore the on-chip L2 cache. Which was the case with Pentium 3 katmai cpus, not anything in the Coppermine family.

And I am in no way a particular fan of Xbox, or any other system.
Being a biased, delusional, uninformed, unpaid cheerleader is your job, and the jobs of the Xbox and PS fanboys of whom you’re a distant cousin to.

And your focus on Xbox brand and not the PS3 is kind of odd. PS3′s gpu is almost as far above Wii’s as 360′s is. I guess it’s mysterious fanboy logic, and your brain is stuck in a feedback loop from last generation.

Sampson

On August 8, 2008 at 9:51 pm

You see Wiiboy, anyone who spends 10 minutes actually looking up the things you post, will find you’re full of shiit. You’ve been provided links that destroy your “point” point by point. But it’s also obvious you don’t pay attention to such things. There are plenty of places that list the transistor counts of these cpus, and confirm you to be wrong. But who needs proof, eh?

You mentioned Sonic on Wii, as unimpressive as it was, is confirmed to be a different game on Wii. Your Dead Rising point backfired.
You asked about Deadly Creatures, and it looks exactly like what a 2x Gamecube would produce. You’ve mentioned a number of Nintendo first party titles, but they look like polished Gamecube games.
I’m playing Mario Kart Wii now, and it looks like Double Dash.
I’ve played Mario Galaxy, but it’s a low polygon platformer, with a few actual bump maps for once.
Prime 3 lacks bump mapping all together. (which I hadn’t noticed, but was confirmed by a Retro developer)
The Call of Duty developer also explained the Wii version of COD5, what it is, and what it isn’t.
Conduit looks generic as well. It’s a very clean Xbox game at best.
And then there are games like Animal Crossing, that make it seem like Nintendo isn’t even trying anymore.

Most other console games, have extensive online options, multi-player co-op in multiple modes, High definition and high quality graphics, with extensive use of modern rendering, parallax, normal, and bump mapping everywhere. Extensive voice work, high quality music, high quality sound fidelity, etc..
Soon, there’ll be far better physics-based fluid simulation, and more advanced animation. Complex lighting, weather systems, trees with swaying leaves, etc..
But even Nintendo themselves do little to demonstrate competent hardware.

Why would you expect 3rd parties to do so?

Sampson

On August 9, 2008 at 8:30 am

@Crowhurst: I’m not so sure he’s actually played many of these systems and games. He’s just applied his obsessive compulsive nature to spec-whoring for Nintendo, from the last generation of consoles to now. Nintendo simply gave him no ammo for his arguments this generation.

What I and at least one other person here would like to know however, is if Wiiboy knows what he posts is bs or not. I’m sure he thought he had a sound argument in the beginning, and built up an arrogant attitude behind it. But any normal person should have been able to see the flaws in his own arguments by now.

I’m just curious is all. Since fanboys of all stripes behave in the same manner, as well as their political counterparts, I’m just wondering if it’s an impairment of some sort. Like a mild form of autism perhaps.

(and I don’t mean people with legit preferences for one thing or another, who might be labeled fanboys, I mean the real deal: irrational, trying to delusionally rationalize every last point.)

Like this fanboy satire for example:
http://www.youtube.com/watch?v=AxY8yQRJW08

It’s a dramatic reading of what these folks actually write, but it’s not too far off from folks like Wiiboy.

If something extreme, like Miyamoto were to retire from Nintendo, and begin making games for the competition, I wouldn’t be surprised to see suicide bombers afterwards, or protesters lighting themselves on fire outside Nintendo headquarters.

http://www.youtube.com/watch?v=k-uTnqYHZ-I

3ds_mac

On August 9, 2008 at 9:09 am

I actually gave up on the idea that Wiiboy is just an act. He’s an authentic Wiitard through and through, who honestly thinks he’s right.

Facts mean nothing to him if they don’t support his fanboyism. He recycles his old, dead, shot-down arguments, and then attempts to blend them with the new things he thinks he’s discovered and misinterpreted.

“It’s Brawndo, it’s got electrolytes.”

wiiboy101

On August 9, 2008 at 10:41 am

hollywood: 1mb texture cache, plus 6x compression = 6mb, plus 50% more space and bandwidth via virtual texturing = 9mb effective texture cache

xgpu: 128k texture cache, does not read compressed textures, and that's it

but xbox is = to wii? SHUT THE FOOK UP

XBOX FSB: 1GB AT 133MHZ, 64-BIT, AND THAT'S YOUR LOT

WII: 7.7-PLUS GB AT 243MHZ, 64-BIT, PLUS MAX 4X COMPRESSION, DESTROYING XBOX

EFFECTIVE EXTERNAL AND INTERNAL BANDWIDTHS AND DATA FLOWS FOR GPU AND CPU ARE AT MINIMUM 4X THOSE OF XBOX, AT THE SAME PEAK NATIVE RESOLUTION OF 480P

EFFECTIVE FILL RATES IN REAL TIME ARE 2X-PLUS AN XBOX, DUE TO BETTER EFFICIENCY, CUSTOM TILE RENDERING, HUGE BANDWIDTHS AND LIGHTNING-FAST RAM

A GPU CAN NOT PROCESS DATA IT'S WAITING FOR, ONLY DATA THAT'S ACTUALLY THERE = 1T-SRAM-R, EDRAM, HUGE CACHES AND REAL-TIME DECOMPRESSION

IN REAL-WORLD GAMING PERFORMANCE, WII'S LEVEL 2 CPU CACHE CAN HOLD 10X-PLUS THE DATA OF XBOX'S LEVEL 2 CACHE, AS BROADWAY IS CUSTOM AND OPTIMIZED AND SUPPORTS REAL-TIME DECOMPRESSION INTERNALLY AS THE DATA ENTERS TO BE PROCESSED

WII IS, AS NINTENDO CLEARLY STATED AT E3 2005 (THEN STOPPED TALKING POWER), 2.5 TO 3X A GAMECUBE

THE CHIPSET IS 2X GAMECUBE

MAIN MEMORY IS 3.5X GAMECUBE. BLEND THAT TOGETHER AND YOU'VE GOT A GAMECUBE 2.5, JUST AS NINTENDO STATED

1.5X A GAMECUBE? THAT'S UTTER CRAP

GAMECUBE MAIN RAM: 24MB OF 1T-SRAM AT 2.6GB BANDWIDTH, NOT COUNTING COMPRESSION ETC

WII MAIN RAM: 88MB AT 8GB BANDWIDTH. OH LOOK, 3.5X-PLUS THE MEMORY AND BANDWIDTH OF GAMECUBE, JUST AS I STATED. THE CHIPSET IS 50% HIGHER CLOCKED, BUT YOU ALSO HAVE TO ADD INCREASED SPEC, INCREASED EFFICIENCY, NEW DEV KITS AND NEW SOFTWARE, SO THE CHIPSET IS EFFECTIVELY 2X GAMECUBE AND MEMORY IS 3.5X GAMECUBE

THAT, TO ANYONE, IS CLEARLY A GAMECUBE 2.5-PLUS, SO AGAINST PS2 IT'S LIKE 6X A PS2

SO WHY THE PS2 GRAPHICS IN FIFA WII = LAZY PORT OF PS2 GAME/CODE

AS I INTELLIGENTLY AND HONESTLY STATED ALL THE WAY UP THERE AT THE TOP OF THIS THREAD

AND STILL YOU LIVE IN FANTASY WORLD

3ds_mac

On August 9, 2008 at 6:22 pm

Once again, nope. You’re a confused individual. You’re multiplying bull.
It’s been explained to you, the fact that you aren’t capable of understanding is your problem.
For the hundredth time Wiiboy, the GPU on Xbox doesn’t require the cpu to act as a vertex shader. The vertex shaders are capable of reading data from ram compressed. They, unlike Gamecube and Wii, have vertex shader processors on the gpu. They process information themselves, they don’t require the cpu to do it.
Gamecube’s cpu reads data from ram, through the gpu’s memory controller, up the front side bus; processes it as a vertex shader would, and passes it to the gpu in compressed form.
Real gpus can have the cpu instruct the gpu to read compressed data from an index buffer in ram. An index buffer is a storage of the triangles in a mesh that only needs the data for each vertex once, even though that vertex is a part of several triangles. Vertex shaders are capable of real-time decompression of vertex data, and of processing it on their own.

And you’re rehashing your “data can’t be processed that isn’t there” crap. There are benchmarks for this Wiiboy. Nvidia and ATI build into their gpus mechanisms to deal with latency of all sorts. The Xbox has a post-transform vertex cache, the Gamecube and Wii do not. That means the gpu keeps data right there on chip, and doesn’t require fetching from anywhere but that cache. They’ve had memory controllers tuned for ddr, and tools for organizing data to minimize and optimize accesses. Gpus access data linearly; you get latency from random accesses.

And once again, there are 3d-mark benchmarks that measure efficiency of all sorts. And as was posted to you, efficiency ratings exist for different forms of ram. If 1tsram is 90% efficient, and the ddr in Xbox is 75%, then you simply multiply the peak bandwidth by .90 and .75 respectively. Not multiply by latency, like you ignorantly attempt to do.

And, as was also explained to you repeatedly, they don’t process things in the same way. Gamecube needs on-chip ram, and high frame buffer bandwidth, because it’s constantly reading and writing to and from it, as it is only capable of 1 texture layer at a time.
An Xbox, reads its textures, and combines 2, loops back on chip, and does 2 more before ever writing back to a frame buffer.
Again, it does this on chip, then writes the result to the frame buffer. Gamecube writes and reads to the frame buffer for every layer it adds.
It requires the onchip cache to function properly.

And you can stop with your bull about cache and frame buffer bandwidth. It simply uses it to keep data nearby on chip, because again, that’s part of the design of the gpu. It’s there, with its bandwidth, because it’s required for the design. It’s inefficient with frame buffer bandwidth, thus requiring lots of it.

Not much different from retardedly comparing the peak raw bandwidth of 360′s frame-buffer to the main-ram frame buffer bandwidth on a PS3. PS3′s is ~20, but they use color and Z-compression, thus increasing the effective bandwidth, just like they do for all gpus, Xbox1 included.

You’d be an idiot to say “256gb beats 20gbs easy yo”, like ATI and Microsoft tried to do, and like you routinely attempt to do here.

Wiibob101

On August 10, 2008 at 10:54 am

hsr-efficient rendering on the hollywood gpu, combined with virtual texturing, combined with a high-bandwidth, super-fast edram cache buffer, combined with 1t-sram main ram, combined with compression and real-time decompression, combined with a fixed native resolution of 480p =

HUGE FUKING FILLRATE AND BANDWIDTH

THINK LOTUS ELISE VS FERRARI OR PORSCHE: POWER-TO-WEIGHT ETC BRINGS ON-TRACK PERFORMANCE VERY CLOSE, BUT ONE CAR HAS A 3.5 LITRE TURBO AND THE OTHER A 1.8 16V RUN-OF-THE-MILL HOT HATCH ENGINE

I DON'T THINK ANYONE WOULD DOUBT A LOTUS ELISE ON THE TRACK AGAINST ANY HALF-MILLION-POUND SUPERCAR

LOW RES PLUS HUGE FILLRATE AND BANDWIDTH AND A CUSTOM, EFFECTIVE GPU = GREAT GRAPHICS. hd lies DON'T EVEN COME INTO IT

Wiibob101

On August 10, 2008 at 11:00 am

NATIVE RESOLUTION OF X360 IS 3X WIIS BUT ITS GPU CLOCK IS ONLY 2X WIIS SO WHOS GOT THE CLOCK PER RESOLUTION ADVANTAGE OR PER PIXEL

CLEARLY WII :wink: EFFICIENCY, WORK IT OUT :wink:

NICE TO SEE NINTENDO WORKING ON HOLOGRAPHIC STORAGE. HARD DRIVES ARE OLD, OBSOLETE, AND DONT BELONG IN A GAMING DEVICE

Wiibob101

On August 10, 2008 at 11:10 am

WII TEXTURE CACHE: 1 MB X 6 = 6MB EFFECTIVE, PLUS VIRTUAL TEXTURING, 512-BIT, 16GB BANDWIDTH

NOW INCLUDE REAL-TIME DECOMPRESSION AND CACHE READS WHILST COMPRESSED: 16 GB X 6 =

96 GB TEXTURE READ BANDWIDTH, WITH 9 MB OF EFFECTIVE TEXTURES IN CACHE AT ONE TIME

THATS HUGE

XBOX 1: 128K TEXTURE CACHE, NO COMPRESSED TEXTURE READING

CASE RESTED

Wiibob101

On August 10, 2008 at 11:12 am

WIIS TEXTURE BANDWIDTH IS INCREASED 6-FOLD VIA COMPRESSION AND BY 50%+ VIA VIRTUAL TEXTURING

ITS A BANDWIDTH/LATENCY BEAST COMPARED TO XBOX 1, AKA X360-LIKE VISUALS MINUS HD

HONESTY, CLEAR VALID HONESTY

3ds_mac

On August 10, 2008 at 11:58 am

****NATIVE RESOLUTION OF X360 IS 3X WIIS BUT ITS GPU CLOCK IS ONLY 2X WIIS SO WHOS GOT THE CLOCK PER RESOLUTION ADVANTAGE OR PER PIXEL****

2x the clock rate, with 2x the raster operation pipelines. ROPs x clock = fill rate.
It's capable of 2x Z-rate on top of that, with 2x the texture layers per pixel, many more shader ops, and 4x AA.

Wii is 486 million pixels/s with 2 texture layers and 2 cycles of shader effects, with AA possible behind texture ops.

360 is 4 billion with 2 texture layers, several times the shader ops, and potentially 4x MSAA.

Toss in the ability to do 2x Z, pipelines decoupled from texture ops, shaders, and ROPs, and 32 billion pixels for shadow buffers once you consider 4x AA and 2x Z, and the 360 is well over 10x the Wii in most areas of processing while being responsible for only 3x the resolution.
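
A rough sketch of the per-pixel comparison being made here, using only the figures quoted in this comment (486 Mpixels/s and 4 Gpixels/s with 2 texture layers) plus an assumed 60 fps target; the output is how many times each on-screen pixel could be touched per frame:

```python
def fill_budget_per_pixel(fill_rate_px_s: float, width: int, height: int, fps: int) -> float:
    """How many times each on-screen pixel could be touched per frame."""
    return fill_rate_px_s / (width * height * fps)

wii_fill = 486e6   # pixels/s with 2 texture layers (figure quoted above)
x360_fill = 4e9    # pixels/s with 2 texture layers (figure quoted above)

print(fill_budget_per_pixel(wii_fill, 640, 480, 60))    # ~26 touches per pixel per frame at 480p
print(fill_budget_per_pixel(x360_fill, 1280, 720, 60))  # ~72 touches per pixel per frame at 720p
```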

Sampson

On August 10, 2008 at 12:07 pm

And it shows blatantly in games across the board. Nintendo first and second party included.

Wiibob101

On August 11, 2008 at 8:13 am

nintendo flash/holographic storage

microsoft harddrive soooooooooo 1950s

nintendo custom high speed loading

microsoft long load times

nintendo every control option invented true next gen 3d mouse 3d motion

microsoft a dreamcast 2 control pad

fact is fact

Sampson

On August 11, 2008 at 11:56 am

What long load times? It's a few seconds tops. Pretending there aren't any on Wii, and exaggerating them on other platforms, "does not a point make".

Many of the newer and upcoming game engines are being designed to stream, and thus have none.

And anyone with a brain would rather have had a large HDD than the cheap flash ram they stuck in the Wii. People like things like downloadable songs for Guitar Hero, etc.
If Nintendo can actually turn research into a newer storage medium into something usable, that'd be great, but it'll be next gen or later, if even then.

lennell

On August 12, 2008 at 12:10 pm

wiiboy. you’re saying wiis texture bandwidth latency beast compared to xbox 1 and 360 like visuals hd !what the .wiiboy, !go back and read what 3ds_mac jast say. THE WII IS !ONLY 486 MILLION WITH 2 TEXTURE LAYERS AND 2 CYCLES OF SHADER EFFECTS. THE XBOX 360 IS 4 BILLION,!NOT MILLION,4 BILLION WITH 2 TEXTURE LAYERS AND SEVERAL TIMES THE SHADER EFFECTS. so wiiboy there you have it,the 360 have the adventage.

wiiboy101

On August 14, 2008 at 4:34 pm

example of an xbox fanboy:

last gen, edram doesnt matter, lets dismiss edram

following gen, oh no, edram is so important ONLY BECAUSE YOU GOT IT NOW

THATS SO TYPICAL OF XBOX FANBOYS

wiiboy101

On August 14, 2008 at 4:36 pm

EDRAM IN X360: 32 GB READ BANDWIDTH, POOR LATENCY DUE TO (COUGH) DRAM

EDRAM IN WII: 28GB BANDWIDTH PLUS BETTER COMPRESSION AND ULTRA-FAST LATENCY, CAPPED AT 480P

WII CLEARLY MORE EFFICIENT AND EFFECTIVE THERE

3ds_mac

On August 15, 2008 at 9:15 am

The only fanboy here is you.
And I had made the comparison of the 360's edram to a typical PC or PS3, where the frame buffer is in main ram.
256 GB doesn't really tell you as much, because anything using gddr3 as a frame buffer uses extensive compression.

And once again Wiiboy, it's 256 GB/s for the 360's frame buffer, where things like alpha blending and AA expansion take place. The 32 GB/s is a transfer bus from the shaders. And you get latency from READING with random access. That's not happening over that bus either, Wiiboy.

Again, 256 GB/s is the frame buffer bandwidth for the bus between the ROPs and the edram in the 360, because they expand the fill rate for 4x MSAA, etc. That is why, Wiiboy, developers will say they can consider the bandwidth for certain frame buffer operations infinite: they have enough bandwidth to never be bound there.

Your bogus 28 GB for Wii is where you add the frame buffer and cache bandwidth together, plus whatever else you think you can toss in out of desperation.

Let's look at the frame buffer on Wii and Gamecube. It uses 24 bits each for color and Z when rendering.
Gamecube:
4 pixels per clock cycle, (24-bit color + 24-bit Z), 162 million cycles.
That's ~3.8 GB/s of bandwidth to write every pixel Gamecube can produce once. But let's write and read. That's ~7.6 GB/s of bandwidth to write and read back every last pixel it can produce.
What's Gamecube's frame buffer bandwidth listed at? Oh, it's 7.68 GB/s.
But surely we need more… Nope, because for each texture layer it has to write and read back, but the fill rate drops each time, because a ROP only has 1 texturing unit, so it can only do 1 layer at a time. (Write, read back to blend, write again, read back to blend, etc.)
648 million pixels with 1 texture layer would need 7.6 GB/s.
324 million with 2 texture layers would need… 7.6 GB/s.
216 million with 3 texture layers would need… 7.6 GB/s.
Etc.
Wii is just 11.5 instead of 7.6. (The first Gamecube spec was 9.6.)

You could have tons of bandwidth there, and it wouldn’t make a lick of difference when the system doesn’t have anything to use it for.
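
For anyone following along, the frame-buffer arithmetic above works out like this (same assumptions as the comment: 4 pixels per clock, 24-bit colour + 24-bit Z, 162 MHz for Gamecube and 243 MHz for Wii):

```python
def fb_traffic_gb_s(pixels_per_clock: int, bits_per_pixel: int, clock_hz: float,
                    read_back: bool = True) -> float:
    """Frame-buffer traffic: write every produced pixel, optionally read it back for blending."""
    bytes_per_pixel = bits_per_pixel / 8
    writes = pixels_per_clock * bytes_per_pixel * clock_hz
    return writes * (2 if read_back else 1) / 1e9

print(fb_traffic_gb_s(4, 24 + 24, 162e6, read_back=False))  # ~3.9 GB/s: Gamecube, write only
print(fb_traffic_gb_s(4, 24 + 24, 162e6))                   # ~7.8 GB/s: Gamecube, write + read back
print(fb_traffic_gb_s(4, 24 + 24, 243e6))                   # ~11.7 GB/s: same pipeline at Wii's clock
# Within rounding of the 3.8 / 7.6 / 11.5 figures quoted above.
```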

Dufy

On August 15, 2008 at 12:29 pm

i really think the wii fanboy needs to calm down. everybody knows the 360 is capable of better graphics than the wii; that's just the way it is. the wii is just a gamecube 1.5. i have a wii, i enjoyed it when it first came out and for a while after, but now they're just releasing piles of , so i hardly use it anymore

wiiboy101

On August 15, 2008 at 12:30 pm

THE FACT REMAINS EA LIED END OF STORY I WON THE ARGUMENT

3ds_mac

On August 15, 2008 at 8:04 pm

No one disagreed with the idea of Wii being overall more powerful than Xbox.
The difference is no more than what separated the consoles from last generation, but there is one.
There isn’t much more of a difference between Wii and Xbox, than there is between 360 and PS3.

lennell

On August 15, 2008 at 11:53 pm

you are right 3ds_mac, i agree. xbox 1 and wii, xbox 360 and ps3: in each pair, one is just slightly more powerful than the other.

wiiboy101

On August 18, 2008 at 11:38 am

xgpu in xbox 1: 128k graphics cache, no ability to read compressed textures

hollywood gpu in wii: 3mb of edram in the gpu, 24 mb of 1tsram next to the gpu on the die, plus all data and textures are read compressed and processed as they decompress in the gpu

128k vs 27mb. ARE YOU ON DRUGS? REPEAT:

128K VS 27MB. ARE YOU INSANE, XBOX FANBOYS, OR WHAT?

128K WITH NO COMPRESSION VS 27 MB WITH 4-TO-1 DATA AND 6-TO-1 TEXTURE COMPRESSION. THE WIIS GPU SRAM IS INSANELY HUGE

THE WII GPU HAS 216 TIMES THE SRAM OF AN XGPU WITHOUT EVEN CONSIDERING COMPRESSION TRICKS

27MB (3MB OF EDRAM AND 24MB OF NON-EDRAM SRAM) ALL ON THE GPU OR GPU DIE VS THE XGPUS 128K CACHE

27MB VS 128K. YOUR FANBOYING IS INSANE. 216 X THE SRAM THE XGPU HAS

216 X. THATS AN INSANE AMOUNT OF SRAM ON THE DIE, AND THE CPU CAN ALSO ACCESS THE SRAM

wiiboy101

On August 18, 2008 at 11:43 am

SO: A 128K SRAM CACHE VS A 3MB EDRAM CACHE AND AN EXTRA 24MB OF GPU-DIE-EMBEDDED MAIN SRAM (1T-SRAM-R)

YES, XBOX FANS, 128K WITH NO COMPRESSION IS SUPPOSEDLY EQUAL TO 27 MB WITH COMPRESSION, 3MB OF IT TRUE EDRAM IN THE GPU

GET OUTA HERE, YOUR PROVEN FANBOYISM IS HERE ONLINE FOR ALL THE WORLD TO SEE

wiiboy101

On August 18, 2008 at 11:45 am

BEFORE THE WII EVEN GOES TO MAIN EXTERNAL MEMORY (GDDR3) IT HAS 27MB OF SRAM TO USE ON DIE, AND YOU THINK THAT COUNTS FOR NOTHING? YOURE INSANE

Sampson

On August 18, 2008 at 1:34 pm

No Wiiboy, your opinion counts for nothing, as you've repeatedly demonstrated with your inability to comprehend what you write.
You continue to misunderstand compression, misrepresent every piece of info you copy and paste, and pretend your entire argument hasn't been dismantled.
The Xbox, like every other gpu, reads compressed textures and decompresses them in the gpu. We've gone over how this works in the simplest of ways; even a child could follow it.

But let's go over the cache. What's in the cache is a small copy of what's in ram, moved on-chip so that the gpu doesn't have to read from main ram in the middle of rendering, which would kill its performance.
Gamecube was designed to require its cache to compensate for the way it renders. Modern gpus were designed to do more on-chip.
As an example, Gamecube requires multiple reads and writes to the frame buffer for multi-texturing; Xbox and PC gpus do not. They only write to the frame buffer AFTER the layers are combined in the pixel, and use 4-to-1 compression when they do, saving bandwidth and storage.

They need nowhere near the frame buffer reading and writing the Gamecube requires (which it does with zero compression), and thus need nowhere near the raw framebuffer bandwidth.

The cache is a crutch for the potentially inefficient way the Gamecube renders. It's an effective crutch, but a crutch that simply allows it to keep pace. It's not a source of awesomeness, as you desperately wish it to be.

Now let's go over its texture cache bandwidth. Say you need 4 bytes per texel for each pixel (it can't do anything with more than this).
4 bytes, for a fill rate of 648 million pixels per second, and assume typical filtering requires 4 taps.

648 million texels, at 4 bytes, with 4-tap filtering:
648M x 4 bytes x 4 = 10.368 GB/s required to max it out.
What's Gamecube's texture cache bandwidth listed at?
10.4 GB/s.
Your delusional multiplication of its bandwidth is pointless (both the s3tc and the 4-to-1 bull). This already maxes the console's rendering abilities out.

If that bus went to main ram you could get more variety in the textures, but it wouldn't help rendering ability. Compression helps with storage and with moving data on-chip, not with rendering bandwidth from cache. The cache simply has enough bandwidth to supply its 648 million pixels with a single texture each.

You could stick 600gbs of bandwidth to texture cache on Gamecube, and it wouldn’t even notice it was there.
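
The texture-cache arithmetic above, spelled out with the same numbers (a 648 Mpixel/s fill rate, 4 bytes per texel, 4 filter taps):

```python
def texture_cache_demand_gb_s(fill_rate_px_s: float, bytes_per_texel: int, taps: int) -> float:
    """Bandwidth needed to feed every produced pixel one filtered texel."""
    return fill_rate_px_s * bytes_per_texel * taps / 1e9

print(texture_cache_demand_gb_s(648e6, 4, 4))  # 10.368 GB/s, matching the quoted 10.4 GB/s cache bus
```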

Xbox and all other gpus render multiple textures on chip, reading their textures from ram (where s3tc compression comes into play, and where the Xbox has 4 times the bandwidth of the GC).
They use the cache to supply the roughly 30 GB/s of bandwidth required to texture 933 million pixels with 2 filtered texture layers.

All in all, the differences make comparing any raw numbers pointless, but as has been said by every last developer on earth, Xbox and Gamecube were comparable, with Xbox having the overall advantage in most areas.

And I know this is confusing to you Wiiboy, but as was also said, there isn't much more of a difference between Wii and Xbox than there was between Xbox and Gamecube.
For anyone who doesn't care how things actually work (like yourself), the physical proof is in the games from Nintendo themselves.

We can go over the internal bandwidths of "modern" gpus if you want, but you won't like it, because they'll be significantly higher than the Wii's.
And we all know how much that affects you emotionally.

wiiboy101

On August 18, 2008 at 2:07 pm

XBOX XGPU GRAPHICS CACHE: 128K

WII HOLLYWOOD GRAPHICS CACHE: 3MB, PLUS AN ADDITIONAL 24 MB OF 1TSRAM ON THE GPU DIE

27MB VS 128K

SHUT UP, UR FANBOYING CONTINUES

wiiboy101

On August 18, 2008 at 2:09 pm

3MB OF GPU-EMBEDDED EDRAM = 28GB BANDWIDTH NOT COUNTING COMPRESSION
PLUS 24 MB OF EMBEDDED 1TSRAM-R AT 4GB BANDWIDTH = 27MB OF RAM AND 32GB OF BANDWIDTH NOT COUNTING COMPRESSION

VS XBOX 1s 128K

KEEP ON GOING, IM PASTING YOUR FANBOY REMARKS IN MANY FORUMS FOR A GOOD GIGGLE

wiiboy101

On August 18, 2008 at 2:10 pm

128K X 216 = 27 MB, AS THE BASIC MATH PROVES

128K VS 27MB, YET YOUR FANBOYING CONTINUES TO MAKE YOU LIE :grin: :razz: :lol:

3ds_mac

On August 18, 2008 at 2:11 pm

****BEFORE THE WII EVEN GOES TO MAIN EXTERNAL MEMORY (GDDR3) IT HAS 27MB OF SRAM TO USE ON DIE, AND YOU THINK THAT COUNTS FOR NOTHING? YOURE INSANE****

Sure it accounts for something: it makes Wii 2x a Gamecube, even though it's the same chip set clocked only 1.5 times faster.
You don't add the two pools' bandwidth together, any more than you add cache bandwidth to main ram bandwidth, or video ram to system ram, etc.

****It simply has enough bandwidth to supply it’s 648 pixels with a single texture.*****

Yep, or to supply 324 million pixels with 2 texels, or 216 million with 3, or 162 million with 4, etc. It simply has no use for more bandwidth from cache, because each additional layer cuts the fill rate at the same rate.

Simply add 1.5x the rate and account for better external ram, and you have the Wii.

wiiboy101

On August 18, 2008 at 2:12 pm

THE SAME APPLIES TO THE CPU

XBOX CELERON = 128K LEVEL 2 CACHE, NO CUSTOM COMPRESSION

WII BROADWAY = 256K LEVEL 2 CACHE PLUS CUSTOM ABILITIES AND COMPRESSION, GIVING A 1MB LEVEL 2 CACHE LEVEL OF PERFORMANCE. ADD TO THAT THAT THE 1TSRAM IS LEVEL-3-CACHE SPEED AND THERES NO COMPARISON

wiiboy101

On August 18, 2008 at 2:15 pm

BROADWAYS LEVEL 2 CACHE IS OVER 10 X AS EFFECTIVE AS THE CELERONS CACHE IN XBOX 1 DUE TO CUSTOM ABILITIES, ON-THE-FLY COMPRESSION/DECOMPRESSION AND ITS SCRATCHPAD ABILITIES. THE CELERON IN XBOX 1 IS A 128K CACHE AND THATS YOUR LOT

3ds_mac

On August 18, 2008 at 2:16 pm

****3MB GPU EMBEDDED EDRAM = 28GB BANDWIDTH NOT COUNTING COMPRESSION
PLUS 24 MB EMBEDDED 1TSRAM-=R AT 4GB BANDWIDTH=27MB RAM 32GB BANDWIDTH NOT COUNTING COMPRESSION****

There is no compression to account for, with the texture cache, or frame buffer bandwidth.

****KEEP ON GOING IM PASTING YPOUR FANBOY REMARKS IN MANY FORUMS FOR A GOOD GIGGLE****
There is no one capable of backing up your claims Wiiboy. Is this a forum you’ve been banned from?

3ds_mac

On August 18, 2008 at 2:19 pm

And as has been repeatedly posted to you Wiiboy, this isn't an Xbox-versus-Wii debate; your failure to recognize that adds to your list of failures.

Sampson

On August 18, 2008 at 2:35 pm

****KEEP ON GOING IM PASTING YPOUR FANBOY REMARKS IN MANY FORUMS FOR A GOOD GIGGLE****

Why don’t you just link them to your bull Wiiboy?
I could do the same with your comments as well, but I’m sure you don’t wish to start doing that.
You’ve failed to convince people of the WIi’s awesomeness, because it isn’t, nor will it ever be.

wiiboy101

On August 18, 2008 at 4:28 pm

so 24mb of die-embedded, clock-balanced sram accounts for nothing

then the large cache memories in the latest pcs also account for nothing

ok, and pigs fly

internal processing bandwidth: xcpu ~6gb

broadway cpu: 23 gb

wake up to honest fact: fifa wii = a ps2 engine with the colour bumped up and the framerate ironed out

that aint a wii engine

wiiboy101

On August 18, 2008 at 4:30 pm

stating exact honest fact is not fanboying, its posting fact

if i post that rain comes from clouds, does that make me a cloud fanboy? OF COURSE NOT

broadway cpu: over 20 million transistors

celeron in xbox: 9 million. again, fact, not fanboyism

3ds_mac

On August 18, 2008 at 5:57 pm

Wiiboy, no one is disagreeing with the idea of developer laziness, or the fact that PS2 engines are being ported to Wii, and not an indication of the Wii’s potential. This has been made clear for as far back as I have read.
As I am sure Factor 5 and a few others will demonstrate someday.

But lets focus on one fact:

****broadway cpu over 20million transistors, celeron m in xbox 9 milion again fact not fanboyisum****

The Xcpu is 28.1 million transistors. It would only be 9.5 million if there were no on-chip L2 cache, which all Coppermine cpus had.
Sony took the data for their Cell slides from a Pentium 3 Katmai, which had a half-speed, off-chip L2 cache.
And anyone else that mentioned "9 million" was referencing a Katmai, since that was what was around back then.
All Coppermines, of which the Xcpu is one, have the L2 transistors counted on chip. Hence their "3 times the transistors" listing, aka 28.1 million.
If you ignore L2, Gekko is 6.5 million. It's only 20 million if you count L2.
That's why a G3 750GX has 44 million and a CL only has 20 million.

So how come you seem so blank when it comes to this?
Serious question, Wiiboy.

3ds_mac

On August 18, 2008 at 6:02 pm

“That’s why a G3750GX has 44 million, and a CL only has 20 million.”
Aka, a GX has 1 mb of L2, compared to 256kb of a CL.

wiiboy101

On August 18, 2008 at 6:23 pm

cache memory is ~5 nanoseconds; wiis 1tsram is 5 nanoseconds

so xbox 1: 128k of cache-speed memory on the xgpu, 128k on the xcpu, no custom compression or real-time decompression

wii: 256k of cpu cache plus compression, a 3mb edram cache on the gpu plus compression, and 24mb of die-embedded 1tsram at cache-like speed plus compression

128k x 2 at cache speed with no compression, vs 27.256 mb of cache-speed ram plus compression before ever hitting gddr3 main ram

the chip set has 27.256 mb of sram on die to use for high-speed, high-end processing before ever hitting external main ram. add compression, high-speed disc streaming and access ram via the flash drive, and the overall ram bandwidth/latency performance of the wii is clearly all over xbox 1

EA LIED, ADMIT IT

wiiboy101

On August 18, 2008 at 6:34 pm

THE BROADWAY IS AN OPTIMIZED 750CL. THE CL IS AN ALL-NEW, HIGH-EFFICIENCY, GRAPHICS- AND MEDIA-OPTIMIZED 750-SERIES CPU USING THE LATEST EMBEDDED MICRO DESIGNS, SILICON-ON-INSULATOR TECH AND COPPER-WIRE TECH

IT HAS A TRANSISTOR COUNT OF 20+ MILLION AND A PEAK CLOCK OF 1.2 GHZ

ITS THE SAME AS BROADWAY, BUT BROADWAY HAS A SLIGHTLY FASTER FSB AND ADDITIONAL HARDWARE TO CUSTOMIZE IT FOR THE REAL-TIME COMPRESSION/DECOMPRESSION AND THE CUSTOM WRITE-GATHER PIPES. ITS A CUSTOM-SPECCED 750CL, A 750CL THATS TUNED AND SET UP AS A GEKKO VERSION 2

THATS WHY BROADWAY IS CONFIRMED AT 20+ MILLION TRANSISTORS AND IS LARGER IN MM2 DIE SIZE THAN A STANDARD 750CL

ITS A WII-SET, WII-OPTIMIZED VERSION OF THE CL. THATS EXACTLY WHAT IT IS

ITS ON-DIE CHIP SIZE AND TRANSISTOR COUNT ARE BOTH SLIGHTLY BIGGER THAN A STANDARD CL, AND ITS FSB IS SLIGHTLY FASTER (243MHZ VS THE STANDARD CLS 240MHZ). ADD TO THAT WIIS CUSTOM CPU-TO-GPU, GPU-TO-CPU, DISC AND RAM DATA COMPRESSION, ON-THE-FLY COMPRESSED TEXTURE READS AND OTHER HARDWARE-SUPPORTED COMPRESSION TRICKS

THATS EXACTLY WHAT WIIS BROADWAY CPU IS

IT DOESNT NEED A 1MB CACHE LIKE THE GX ETC. IT HAS CUSTOM COMPRESSION AND REAL-TIME DECOMPRESSION, AND ON TOP OF THAT 24MB OF SRAM ON THE GPU DIE THAT BOTH THE GPU AND CPU CAN FEED FROM DIRECTLY, SO WITH THE SCRATCHPAD AND ON-THE-FLY DECOMPRESSION IT HAS AN EFFECTIVE CACHE-LIKE PERFORMANCE OF 24MB+, A HUGE AMOUNT OF CACHE-LIKE SRAM POWER

ITS THE MOST 3D-GAMING- AND MEDIA-OPTIMIZED 750 POWERPC

EVER MADE, AND IT SITS IN AN EXTREMELY WELL-BALANCED, OPTIMIZED, ULTRA-EFFICIENT CONSOLE

THANK U AND GOOD NIGHT. EA LIED

Sampson

On August 18, 2008 at 8:48 pm

Broadway, the most awesome G3 ever made. A 20-million-transistor, 729 MHz BEAST.
Modified to software-emulate a vertex shader better than a regular G3, to help the gpu pass for a state-of-the-art gpu from 2002.

They didn't need IBM's velocity engine, or vertex shaders on the gpu, or a high clock rate. Nah, it's just too awesome for those.
How awesome?
SO AWESOME!

Btw, the EA guy didn't lie. He just embellished its crappiness a little, to get under people's skin, you see.
Similar to the Maxis guy who said it was a "piece of ".

WAKY WAKY

On August 29, 2008 at 11:07 am

THE XGPU HAS A 128K GRAPHICS CACHE THAT CAN-NOT, REPEAT, CAN-NOT READ COMPRESSED TEXTURES

THE HOLLYWOOD GPU HAS A 3MB GRAPHICS CACHE PLUS ON-THE-FLY READING OF COMPRESSED DATA AND TEXTURES, PLUS AN ADDITIONAL 24MB OF DIE-EMBEDDED SRAM. CLEARLY NO CONTEST

BROADWAY: 23GB OF PROCESSING BANDWIDTH VS THE XCPUS ~6GB. COME ON, WAKE UP OUT OF UR FANBOY DREAM WORLD

FRONT SIDE BUS: XBOX 133MHZ, 1GB. WII 243MHZ PLUS 4-TO-1 COMPRESSION, 7.7+ GB BANDWIDTH. OHHHH PLEASE, XBOX FANS, STOP EMBARRASSING YOURSELVES

XBOX: 128K CACHE ON THE CPU, 128K CACHE ON THE GPU

WII: 256K CACHE PLUS CUSTOM COMPRESSION ON THE CPU, A 3MB CACHE PLUS CUSTOM COMPRESSION ON THE GPU, PLUS 24MB OF EMBEDDED SRAM BOTH THE CPU AND GPU CAN FEED OFF, AGAIN WITH CUSTOM COMPRESSION

256K X 4 FOR COMPRESSION, PLUS 24MB OF SRAM PLUS COMPRESSION, VS 128K ON THE XCPU

HOW DOES AN XBOX FAN SLEEP AT NIGHT COMPARING

128K OF CPU CACHE AND 128K OF GPU CACHE

TO WIIS HUGE 3MB GPU CACHE, WIIS HUGE 256K-PLUS-COMPRESSION CPU CACHE AND THE SHARED VIRTUAL LEVEL 3 CACHE OF 24MB, PLUS ON TOP OF ALL OF THAT ON-THE-FLY COMPRESSION/DECOMPRESSION OF DATA AND TEXTURES, WHEREAS XBOX HAS NO SUCH COMPRESSED READING IN EITHER OF ITS TINY CACHE MEMORIES

XBOX REQUIRES ITS SLOW MAIN RAM TO DO EVERYTHING

WII HAS 27MB OF SRAM PLUS COMPRESSION AND A HUGE GPU CACHE BUFFER BEFORE YOU EVEN HIT MAIN MEMORY

WAKE THE HELL UP, YOURE DEFENDING A LIAR FROM EA ON THE GROUNDS THAT YOU FAN XBOX

GROW UP

Mosley

On August 30, 2008 at 2:02 am

Wiiboy, just so you’re clear, the Wii and Gamecube do & did require the cpu to act as a vertex shader. It’s the same on any pc gpu that lacked them, they required the cpu to emulate it.
Things like “custom lighting” isn’t much more than the splitting the floating point unit to issue 2 32 bit floats, instead of the standard single, which was useful for accelerating typical graphics calculations. That was required, as you wouldn’t want to stick a stock G3 into a game console for such a task.

Most cpus you’d have used on pc, (xbox included) had 4x simd floating point, and had proper vertex shaders
And if Nintendo were going for processing power this gen, they’d have incorporated something along the lines of VMX, which wasn’t the case.

Not sure what else you've got going there, but an Xbox had 4-5 GB/s of bandwidth to main ram after you factor in the frame buffer, etc., and compression applies to that.

Gamecube had 1.6 GB/s of bandwidth to main ram, which is where compression applied. That's where the cache is useful: it made sure there was enough bandwidth for texturing between layers. But as was said, you don't multiply the cache "bandwidth" by compression; it's simply raw bandwidth.
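
A minimal sketch of the rule being argued here, using the thread's own figures (1.6 GB/s to Gamecube main RAM, 6:1 S3TC, a 10.4 GB/s texture-cache bus): a compression ratio only multiplies effective bandwidth on a bus whose traffic actually travels compressed, which in this argument is the path to main RAM.

```python
def effective_bw(raw_gb_s: float, compression_ratio: float = 1.0) -> float:
    """Raw bus bandwidth, scaled only if the data crossing it is stored compressed."""
    return raw_gb_s * compression_ratio

# Figures quoted in this thread, not vetted specs.
print(effective_bw(1.6, 6.0))  # ~9.6 GB/s effective on GC's main-RAM bus when carrying 6:1 S3TC textures
print(effective_bw(10.4))      # 10.4 GB/s on the texture-cache bus, counted raw per the argument above
```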

Taking compression into account:

Gamecube had 10.4 GB/s to 6 MB of texture ram, or ~9 GB/s to ~72 MB of compressed texture ram.
Xbox had 30 GB/s to 128 KB of cache, or ~24 GB/s to 270 MB of ram.

Tossing in efficiency, latency, "custom" features, etc., is the only reason Gamecube was able to keep pace.

If it had a stock G3 at half the clock that was also required to fill in for vertex shading, it would have been garbage. If it used main ram the same way pc gpus do, it would have been garbage. If it didn't have on-chip cache, it would have been garbage, etc., etc.

And just so we're clear, Xbox is garbage by today's standards.
Wii is overall more powerful than garbage. Oh the power, THE POWER!!

3ds_mac

On August 30, 2008 at 8:13 am

Wiiboy only absorbs new information in tiny chunks. That's why you'll notice he's subtly moved his compression bs to "caches". Forgetting, apparently, that the scheme for passing data as bytes and shorts instead of 32-bit values is mainly required because the cpu has to pass the gpu things that a normal gpu can pull from ram itself (compressed).
You can explain that the texture cache on the gpu is there as part of the overall design, and is the only reason the gpu functions at all, but he wouldn't get it.
He likes logic like, "bike one has training wheels, and therefore has 4 wheels, compared to other bikes, which only have 2. 4 wheels is better than 2 wheels."

That's all he cares about. He holds no interest in anything related to how stuff works, beyond his mindless devotion to Nintendo.
He'll never be capable of actually using any of the texturing he writes about, nor ever be involved in making games. He's far too irrational and emotionally unstable to work in such a field. So he's attempting to read this stuff for nothing more than the sake of being a (free) Nintendo damage-control PR.

And developers will continue to get a kick out of seeing people like him work themselves into a frothing aneurysm whenever they decide to take a shot at that
"two gamecubes duct-taped together".

WAKY WAKY

On August 30, 2008 at 2:30 pm

a 3mb edram cache with huge bandwidth that reads compressed textures and data, and an additional 24mb of sram on the gpu die

COUNTS FOR NOTHING AGAINST THE XGPUS PATHETIC 128K GPU CACHE?

JOG ON, FOOLS. WII DONT LIKE YOUR KIND AROUND THESE ERE PARTS. WII DONT LIKE LIARS

WII TEXTURE CACHE: 1MB, 16GB TEXTURE READ, 6-TO-1 COMPRESSION = 96GB TEXTURE READ BANDWIDTH

frame and z buffers: 2mb of edram, 12gb bandwidth

this is on chip, super fast, and deals with data and textures whilst compressed

the xbox 1 does all of this in main ram, with no cache speed, no ability to do it compressed, and a pathetic 6.4 gb of bandwidth that it shares for all purposes

6.4gb of bandwidth FOR EVERYTHING in main ram

wii has 28gb of bandwidth on its gpu cache, plus compression, before you even go external

wiis 3mb, 28gb-plus-compression gpu cache buffer destroys the 16mb of main ram used as a frame/z buffer on xbox 1. theres simply no contest

3ds_mac

On August 30, 2008 at 5:23 pm

*****WII TEXTURE CATCH 1MB 16GB TEXTURE READ 6TO1 COMPRESSION= 96GB TEXTURE READ BANDWIDTH wii has 28gb of bandwidth on its gpu catch before you even go external plus compressionwiis 3mb 28gb plus compression gpu catch buffer destroys 16mb of main ram used as frame z buffer on xbox one theres simply no contest****
——————————————————————————-

Again, you don’t multiply bandwidth from cache. It’s texturing ability is 1 single texel per pixel. That’s it. 4 bytes per texel with filtering. Done.

And again, a 640×480 frame buffer is nowhere near 16mbs. It’s not different than what’s on Wii and Gamecube. With potential added precision and aa, but again, z-buffer is 4to1 compressed. It isn’t on Wii or Gamecube. And Wii and Gamecube drop z buffer precision, and are limited to 240 instead of 480 when using aa at all.

The fact that you don’t understand what you type, or what others type to you, doesn’t change reality.

Mosley

On August 30, 2008 at 10:20 pm

****wiis 3mb 28gb plus compression gpu catch buffer destroys 16mb of main ram used as frame z buffer on xbox one theres simply no contest****

Wii has 11 GB/s of "raw" bandwidth for the frame buffer. There is no compression to be factored into that; it was demonstrated to you earlier. It's simply enough for the (inefficient) way in which it multi-textures, i.e. it writes to and reads back from the frame buffer for every layer, without any sort of compression.
(Which is fine, because it "works".)

An Xbox simply combines 4 layers internally on the chip, and THEN writes the result once to the frame buffer, where it also uses automatic compression on top of that. A 640x480 frame buffer ends up using around 300 MB per second of bandwidth when all is said and done, tops.

That is why, Wiiboy, PC GF3/4 gpus are capable of supporting PC resolutions (1024x768, 1280x960, etc.) in many of their games with the exact same bus and memory controller structure, storing their frame buffer in main ram with similar bandwidth.
PCs get their texture and vertex data, and write their frame buffers, the same way, using the exact same compression formats as an Xbox.

And again, you don't compound bandwidth from the texture cache either. The only place you multiply bandwidth for compression is the bus to main ram.

But I see you regularly swap back and forth between GC and Wii, and regularly bring up the EA guy, despite no one really being interested in that subject.
They simply disagreed with your "Wii = awesomeness" spiel.

So don’t get confused here Wiiboy:
PS3/360……Wii.Xbox.Gamecube.PS2.DC….N64.PS1.SS, etc…

WAKY WAKY

On August 31, 2008 at 11:46 am

MARIO STRIKERS CHARGED TO THIS DAY HAS THE BEST SPORTS GAME GRAPHICS ON WII

ITS ALSO A SOLID 60 FRAMES AND SUPPORTS TEXTURE DEFORMATION, BUMP MAPPING, PHYSICS EFFECTS AND ANIMATED 3D, REPEAT, ANIMATED 3D SPECTATORS

EA STILL DONT SUPPORT SOLID FRAME RATES IN THEIR WII SPORTS TITLES AND STILL USE 2D CUT-OUT SPECTATORS, BECAUSE THE ENGINE THEYRE USING IS THEIR PS2 ENGINE

IF MARIO STRIKERS CHARGED HAS GRAPHICS 2X FIFAS, A FRAME RATE TWICE AS FAST, AND ON TOP OF ALL THAT 3D ANIMATED SPECTATORS

THEN HOW DOES FIFA WII REPRESENT WII GRAPHICS

AGAIN, FACT VS FANTASY

WAKY WAKY

On August 31, 2008 at 11:53 am

http://www.gametrailers.com/player/17774.html

OH LOOK, GREAT VISUALS, 60 FRAMES, LOADS OF SPECIAL EFFECTS AND PHYSICS EFFECTS

AND FULLY ANIMATED 3D SPECTATORS

BUT HONESTLY, FIFA WII REPRESENTS WII GRAPHICS AND NOT PS2

XBOX FAN = PATHETIC CHILD

WII FAN = HONEST GUY

AGAIN PROVEN

MARIO STRIKERS CHARGED IS AN IMPOSSIBILITY ON XBOX 1. IT COULD NEVER DO THE 3D SPECTATORS AND ALL THE GROOVY EFFECTS AND 60 FRAMES AND NEAR-ZERO LOAD TIMES

MARIO STRIKERS CHARGED ON WII IS OVER 12 MONTHS OLD AND CLEARLY BEATS FIFA 09 WII GRAPHICALLY

SO AGAIN, XBOX FANS ARE WRONG

Mosley

On August 31, 2008 at 2:50 pm

Looks like a decent (for Wii) looking game, with a limited actual viewing angle during gameplay, interspersed with repetitive prefabricated cut-scene special moves, with a really generic mini-game used in the place of actual goal-tending.
Great.
And I see 4 players and a goalie to a side, and a very limited view of the hopping crowd during actual gameplay. I’d bet if you could look up, you’d find all but the front-most crowd clipped.

Real soccer has 22 players on the field at once.

And Xbox, is a “piece of ”.

WAKY WAKY

On August 31, 2008 at 4:26 pm

fifa wii = ps2 engine = 30 frames = 2d spectators

mario strikers = wii engine = 60 frames = 3d spectators = physics = great visuals

but fifa's ps2 graphics are in fact "wii graphics"?

so what this thread has clearly proven is that x360 fans believe marketing statements by EA

WAKY WAKY

On September 1, 2008 at 8:31 am

Pentium 3:
Bus Interface Unit to System Bus = 32 bit * 133 mhz = 1.0 GB/s
Bus Interface Unit from chip: 23 + 2.9 = 25.9 GB/s
L2 Data cache to L1 Data cache: 256 bit * 733 mhz = 23 GB/s
L2 Instruction cache to L1 instruction cache = 32 bit * 733 mhz = 2.9 GB/s

Gekko:
Bus Interface Unit to System Bus = 64 bit * 167 mhz = 1.3 GB/s
Bus Interface Unit from chip = 11.6 GB/s
L2 Data cache to fill buffer 64 bit * 485 mhz = 3.8 GB/s
L2 Instruction cache to L1 instruction cache = 32 bit * 485 mhz = 3.8 GB/s
DMA controller to fill buffer 64 bit * 485 mhz = 3.8 GB/s
Fill buffer to L1 Data cache 256 bit * 485 mhz = 15.5 GB/s
Write Gather Pipe from Load/Store Unit 64 bit * 485 = 3.8 GB/s

*With average data compression of ~4:1:
Bus Interface Unit to System Bus = 64 bit * 167 mhz = 1.3 GB/s * 4 = 5.2 GB/s
Bus Interface Unit from chip = 11.6 GB/s * 4 = 46.4 GB/s
L2 Data cache to fill buffer 64 bit * 485 mhz = 3.8 GB/s * 4 = 15.2 GB/s
L2 Instruction cache to L1 instruction cache = 32 bit * 485 mhz = 3.8 GB/s
DMA controller to fill buffer 64 bit * 485 mhz = 3.8 GB/s * 4 = 15.2 GB/s
Fill buffer to L1 Data cache 256 bit * 485 mhz = 15.5 GB/s * 4 = 62 GB/s
Write Gather Pipe from Load/Store Unit 64 bit * 485 = 3.8 GB/s * 4 = 15.2 GB/s

The Gekko architecture is suited much more to streaming a large amount of data than the Pentium 3.
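
Every bandwidth figure in the lists above comes from the same width-times-clock formula, so they are easy to spot-check; a quick sketch (the 64-bit note at the end anticipates the correction raised further down the thread):

```python
def bus_gb_s(width_bits: int, clock_hz: float) -> float:
    """Bandwidth of a bus: width in bytes times clock."""
    return width_bits / 8 * clock_hz / 1e9

print(bus_gb_s(256, 733e6))  # ~23.5 GB/s (P3 L2 data cache -> L1 data cache)
print(bus_gb_s(64, 167e6))   # ~1.3 GB/s  (Gekko bus interface unit to system bus)
print(bus_gb_s(256, 485e6))  # ~15.5 GB/s (Gekko fill buffer -> L1 data cache)
print(bus_gb_s(64, 243e6))   # ~1.9 GB/s  (Broadway front-side bus)
# The first Pentium 3 line only reaches 1.0 GB/s if the bus is 64 bits wide,
# not 32, which is exactly the correction raised later in this thread.
```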

Broadway is Wii’s CPU.

PowerPC 750cxe FX/GX @ 729 MHz
Front Side Bus: 243 MHz, 64 bits @ 1.944 GB’s/sec
256 KB L1 instruction cache
256 KB L1 data cache (can set up 16-kilobyte data scratch pad).
This leads to 512 KB L1 Cache total

Normal Broadway interface:
Bus Interface Unit to System Bus = 64 bit * 243 MHz = 1.944 GB/s
Bus Interface Unit from chip = 17 GB/s
L2 Data cache to fill buffer 64 bit * 729 MHz = 5.832 GB/s
L2 Instruction cache to L1 instruction cache = 32 bit * 729 MHz = 2.916 GB/s
DMA controller to fill buffer 64 bit * 729 MHz = 5.832 GB/s
Fill buffer to L1 Data cache 256 bit * 729 MHz = 23.328 GB/s
Write Gather Pipe from Load/Store Unit 64 bit * 729 MHz = 5.832 GB/s

Broadways data compression
Data compression of 4:1 average data compression:
Bus Interface Unit to System Bus 4:1 = 7.78 GB/s
Bus Interface Unit from chip 4:1 = 68 GB/s
L2 Data cache to fill buffer 4:1 = 23.328 GB/s
L2 Instruction cache to L1 instruction cache 4:1 = 11.664 GB/s
DMA controller to fill buffer 4:1 = 23.328GB/s
Fill buffer to L1 Data cache 4:1 = 93.312 GB/s.
Write Gather Pipe from Load/Store Unit 4:1 = 23.328 GB/s

ok, you got mii, the xcpu is more powerful... cough cough cough cough :roll:

oh, and xbox 1 is weaker still, as its not a pentium, its a celeron on a pentium bus. OUCH

BROADWAY CLEARLY DESTROYS THE XCPU

WAKY WAKY

On September 1, 2008 at 8:38 am

3RD: XCPU, A CELERON ON A 133MHZ PENTIUM BUS

2ND: GEKKO, A CUSTOM GAMING PPC 750

1ST: BROADWAY, A CUSTOM GAMING PPC 750

NOT ONLY IS THE XCPU VASTLY WEAKER THAN BROADWAY, IT ALSO FAILS TO MATCH GAMECUBES GEKKO

GEKKO: 20 MILLION TRANSISTORS, RISC, COPPER-WIRE, MEDIA- AND GAMING-OPTIMIZED

BROADWAY: OVER 20 MILLION TRANSISTORS, RISC, COPPER-WIRE AND SILICON-ON-INSULATOR, MEDIA- AND GAMING-OPTIMIZED

XCPU: AN OFF-THE-SHELF CELERON ON A PENTIUM BUS, AN OLD, NON-SHRUNK, NON-OPTIMIZED CISC CPU

BUILT FOR SPREADSHEETS AND WORD PROCESSING ON A PC, IN (COUGH) A GAMING MACHINE

PRICELESS FANBOYING, XBOX FANS

3ds_mac

On September 1, 2008 at 11:29 am

Lol, you just owned yourself with that copy-and-paste job.
For a start, the first line is 64 bit, not 32. Editing in your own bs doesn't make sense when the bandwidth math doesn't come out.

You'll notice you can't get anywhere with your specs without the bogus 4-to-1 compression you blindly toss into everything.
If you meant the "representing vertex data as 8-bit bytes or 16-bit shorts instead of 32-bit floats" scheme, you'll notice that only applies to passing geometry, since the cpu has a greater role in vertex processing than it does on the Xbox.

The Xbox GPU, and pretty much all gpus, can read pre-compiled index lists from ram on their own. Index lists are a compression of vertex data.
You even solved your own mystery when you noticed the 360 had a modification to pass geometry in compressed format over the fsb from L2.
(Actually, I think that was pointed out to you in this very thread.)

But that, unlike Wii's fixed-function T&L unit, is for the creation of geometry in real time, not for modifying the cpu to be better at helping an out-of-date gpu pretend to be modern. (Modern in a 2001 sense, mind you; don't get confused here.)
As was listed earlier, the "custom lighting modification" simply added 2-wide simd functionality to the floating point unit, similar to the SSE (4-wide simd) in the Xbox.
Without it, it would have run like a stock G3.

Again, Gamecube and Wii (and PS2) have the cpu feed the gpu directly.
Xbox does not.
In the case of GC and Wii, that's a side effect of being "fixed function", Wiiboy, as no modern gpus are limited to fixed-function T&L.

And the xcpu data was linked to you multiple times.
http://www.cpu-collection.de/?tn=&l0=md&l1=2001&l2=Intel#KC733/128(XboxCPU)
It has the same 8-way associative ATC (Advanced Transfer Cache). A P3 Katmai would have been a step back. Not much "ouch" about it. It's the same cpu type used in pcs for years. You know, where all the games are.

Now, since we've established that the Wii is likely a bit more powerful than an Xbox, and also concluded that its cpu has to be far more involved in processing alongside the gpu, we can conclude that the Wii's cpu is overall more effective than the Xcpu.

But that's not saying much, now is it?

And the Maxis developer that referred to the Wii as a "piece of " was talking specifically about the cpu, as his primary area of interest involved the cpu.

WAKY WAKY

On September 1, 2008 at 2:24 pm

2X THE TRANSISTOR COUNT, PLUS RISC, PLUS COPPER-WIRE, PLUS A TIGHT, SHRUNK MPU DESIGN, PLUS A 243MHZ BUS, PLUS COMPRESSION, PLUS TWICE THE CACHE, PLUS 4-TO-1 COMPRESSION, PLUS INSANELY FAST 1TSRAM

VS A (COUGH) CELERON

CASE RESTED

WAKY WAKY

On September 1, 2008 at 2:26 pm

4X-PLUS THE BANDWIDTHS, PLUS 2X THE TRANSISTOR COUNT, PLUS RISC VS CISC, PLUS THE FSB SPEED, PLUS THE COMPRESSION, PLUS THE CACHE, PLUS THE FAST RAM

GET IT THROUGH UR HEADS: BROADWAYS REAL-TIME GAMING PERFORMANCE IS LIKE 2.5 X A CELERON XCPU

:roll:

Mosley

On September 1, 2008 at 3:44 pm

Why do you continue with the 2x transistor count? You’ve been given links repeatedly that it’s 28.1 million. Not the Katmai 9.5, as the cache is onchip in Coppermines.
You’ve also been given links that demonstrate that a typical G3 has little more than 6 million transistors of logic devoted to processing.
Cache has a transistor count.
Wii’s gpu is also made up of 25m of logic. The other 25m is edram.
Xbox was 60 million of mostly processing logic.

Are you seriously this retarded?
Serious question.

WAKY WAKY

On September 1, 2008 at 4:07 pm

8 texture layers, 16 real-time texture-blending custom shader stages, plus custom cpu shader/blender effects, VS the xgpus basic shaders

NO CONTEST, WII WINS HANDS DOWN

the gpu can co-process directly with the cpu

the cpu can co-process directly with the gpu

fully customizable and programmable down to assembly level

fillrate 2x+ xbox 1

bandwidths 4x+ xbox 1

additional flash access ram and a super-fast disc streaming capability

vs a crappy xbox with crappy slow ram, crappy bottlenecks and a crappy bog-standard dvd drive

whatever, xbox fans :roll:

Mosley

On September 1, 2008 at 4:22 pm

A dvd drive with a faster transfer rate, but more ram to load.
And they average 3 textures on Gamecube, 4 on Xbox, as quoted to you by at least 3 different developers.

Mosley

On September 1, 2008 at 5:14 pm

And custom = fixed function.
Shaders are programmable.
That's also been explained. If your gpu lacks shaders, you need the cpu to drive the gpu as a co-processor.
Fixed function = a standard list of generic shader-style effects that can be done in hardware. If you want an effect not on the list, you need the cpu, just like every other pre-DX8 gpu out there. And that list of "custom" effects was outlined: environment-mapped bump mapping, emboss bump mapping.
Any other form of mapping is a no, and will tax the cpu. Not much special there.

Programmable shaders act as a cpu would, processing floating point data on the gpu itself, and are "programmable" in a way similar to what the cpu can do.

Nothing special either way, really. Both are out of date.

3ds_mac

On September 2, 2008 at 11:39 am

****A dvd drive with a faster transfer rate, but more ram to load. And they average 3 textures on Gamecube, 4 on Xbox, as quoted to you by at least 3 different developers.****

He’s using GC and Wii interchangeably, but in this case, he’s sticking to Wii. (for once)

WAKY WAKY

On September 2, 2008 at 1:35 pm

SORRY, CUSTOM PRIORITY DRIVERS BALANCED TO A CONSOLE USING LOTS OF COMPRESSION = FAST LOADING

XBOX DVD, X360 DVD AND EVEN MORE SO PS3 BLU-RAY = TERRIBLE LOADING SPEED AND STREAMING

AGAIN, ENGINEERING FACT

FASTEST LOADER/STREAMER LAST GEN: GAMECUBE

FASTEST LOADER/STREAMER THIS GEN: WII

THATS FACT, WHY ARGUE AGAINST IT

GAMECUBE COULD LOAD/STREAM FASTER THAN XBOX, EVEN WITH ITS HARD DRIVE

AGAIN, FACT

WII CAN RUN GAMES LOAD-FREE WITH NO INSTALLS

PS3, EVEN WITH HUGE INSTALLS, STILL FAILS TO LOAD IN-GAME AS FAST AS WII

ITS CALLED BALANCE

YOURE STILL CLEARLY NOT UNDERSTANDING CONSOLE DESIGN

Mosley

On September 2, 2008 at 3:23 pm

Again, what are you talking about? You sound as though you've never actually played the system, or you think you can simply make things up to people who haven't played these Wii games.

No More Heroes has load times. Metroid Prime 3 still has several seconds of them between doors, regularly (just like the previous games).
That Mario Strikers game you just tried to pimp has at least 10 seconds of them after matches. My Sims has the worst loading of any game I've played recently. Metal Slug is full of them, what I've played of Bleach has them, and of course there's Smash Bros. Brawl……

Whether you're too much of a fanboy to see them (not sure how you could miss them in some games), or aren't bright enough to notice when they're being hidden, is another issue.

You can design any game to not have load times; it just depends on the design choices made. Potentially, it's far easier on a system with multi-core cpus, for reasons you wouldn't understand.

****CONSOLE DESIGN**** - Don't make me laugh, Wiiboy.

Mosley

On September 2, 2008 at 3:40 pm

And since you’re on this issue, I have Soul Calibur 2 on both systems right now, Xbox loads faster, consistently. (this might vary, depending on the dvd drive the model has however)

In general, I don’t see much differences. I see different games try to hide them in different ways.
And I do see a big difference in total ram to load however…..

WAKY WAKY

On September 3, 2008 at 8:26 am

sram on chip in the xbox: 128k on the cpu, 128k on the gpu, no additional sram or edram at all. THATS YOUR LOT ON CHIP, ON DIE

VS

256k plus compression on the wii cpu, 3mb of edram plus compression on the gpu, plus 24 mb of sram on the gpu die, again plus compression. vs, reminder, 128k x 2. COUGH COUGH COUGH COUGH

WHERE THE XBOX HAS 128K OF CACHE PER PROCESSOR WITH NO CUSTOM COMPRESSION OR REAL-TIME DECOMPRESSION AT ALL

THE WII HAS 256K PLUS 3MB PLUS 24MB, PLUS COMPRESSION

THATS A HUGE AMOUNT OF CACHE/EDRAM/SRAM ON CHIP AND ON DIE

BEFORE YOU EVEN GO TO THE 64MB OF EXTERNAL GDDR3 MAIN MEMORY PLUS THE FLASH-DRIVE ACCESS RAM

THERES SIMPLY NO CONTEST

IF THE WIIS GPU USED ALL THE 1TSRAM IT WOULD HAVE A 3MB-PLUS-COMPRESSION GPU EDRAM CACHE AND 24MB OF ADDITIONAL SRAM ACTING AS A 24MB-PLUS-COMPRESSION LEVEL 3 GPU CACHE

27MB PLUS COMPRESSION VS THE XGPUS 128K WITH NO COMPRESSION

YOU REALLY DO NEED TO WAKE UP TO REALITY :roll:

comeon guys stop telling porkys

On September 3, 2008 at 8:49 am

right, so mr xbox fan is stating that a gpu cache of 128k that cannot deal with compressed data or textures is equal to or better than wiis 3mb cache plus compression and the 24mb of 1tsram embedded right next to the gpu on the gpu die. so mr xbox fans a gpu that has at most 128k of data in its cache, against a gpu with 3mb plus 24mb = 27mb of sram to feed off, plus 2-to-1 and 4-to-1 data compression and 6-to-1 texture compression, all in ram and cache, processed in real time

so lets say 128k with no compression = 128k. yes, got it, thats your lot, xgpu

vs

3mb plus 24 mb = 27mb, but theres 6-to-1, 4-to-1 and 2-to-1 compression available, so lets average that as 3-to-1

27mb x 3 = 81mb, just as an example. remember, textures are read at 6-to-1 compression and data at 2-to-1 or 4-to-1

if the 24mb on the gpu die were all textures, 24 x 6 = 144mb before ever hitting main ram

the xgpu has 128k and needs 8mb to 16 mb of main ram just for the frame and z buffers

theres no contest

2.5 x xbox graphics is a breeze for wii

EA told porkies, did they not

so you side with liars. how pathetic :roll: how fanboy

comeon guys stop telling porkys

On September 3, 2008 at 8:54 am

a 3mb, huge-bandwidth gpu cache: no main ram required at all for buffering or caching frames etc. its all done at insane speed in wiis edram cache, in real time, fully compressed. no 1tsram or gddr3 is used at all for frame buffers etc, so its all free to use as fast main ram

the xbox has to use main ram before you even start, because it has no edram or large cache; its all done in slow, low-bandwidth main ram. wii clearly destroys it there. and even after all this work is handled on chip, the wii still has 24mb of fast 1tsram to feed off, plus compression, before ever needing gddr3 main ram at all

efficient. work out what the word means

128k vs 27mb plus compression. YOURE KIDDING, RIGHT?

comeon guys stop telling porkys

On September 3, 2008 at 9:03 am

XBOX RAM: 64MB, WITH 8 TO 16 MB WASTED ON GPU FRAME BUFFERS ETC, SO ITS NO LONGER THERE AS MAIN RAM. YOU LOSE 8 TO 16 MB BEFORE YOU START

THEN THERES SOUND, SAY 8MB, THEN THERES THE CPU, SAY 16MB

CLEARLY THERES LESS THAN 32 MB LEFT FOR GRAPHICS AND TEXTURES. PLUS, COMPRESSED TEXTURES CANNOT TRAVEL OVER THE BUS OR BE READ BY THE GPU COMPRESSED, SO THEY GET DECOMPRESSED AND WASTE MORE RAM. ADD TO THAT DRAMS TERRIBLE LATENCY AND NO HUGE BANDWIDTH TRICKS ON THE GPU OR CPU

VS WII: ALL 24MB OF 1TSRAM, ALL 64MB OF GDDR3, AND ALL THE ACCESS MEMORY ON THE FLASH DRIVE IS USABLE AS MAIN RAM, NOTHING WASTED ON BUFFERS ETC, AND ALL OF IT USING BETTER COMPRESSION WITH REAL-TIME DECOMPRESSION

WHERE THE XBOX MIGHT HAVE 28MB FOR GRAPHICS AND TEXTURES AND THE BOTTLENECK OF DECOMPRESSING BEFORE SENDING

THE WII COULD HAVE 64MB OF GRAPHICS AND TEXTURE RAM, ALL USING COMPRESSION AND FEEDING INTO THE GPU WHILST COMPRESSED AT INSANE SPEED (1TSRAM), PLUS ADDITIONAL FLASH ACCESS RAM AND A MUCH FASTER DISC FOR STREAMING DATA. ADD ALL THAT TOGETHER WITH THE 3MB OF EDRAM GPU RAM AND YOU HAVE A SYSTEM WHOSE REAL-WORLD GRAPHICS RAM PERFORMANCE

CLEARLY AND BLATANTLY BEATS XBOX 1. LATENCY IS LIKE 15 X FASTER, AND GRAPHICS BANDWIDTH AND RAM SIZE ARE ALSO MUCH BIGGER

2.5X+ THE GRAPHICS RAM PERFORMANCE OF XBOX 1 IS EASY FOR WII

LEARN HOW THINGS WORK

comeon guys stop telling porkys

On September 3, 2008 at 9:43 am

PS2 GS UNIT: 4MB OF RAMBUS EDRAM, 48GB BANDWIDTH, NO COMPRESSION, NO DIRECT ACCESS TO MAIN RAM, TERRIBLE LATENCY ISSUES……..

XBOX XGPU: 128K CACHE, NO COMPRESSION, RELIES ON SLOW, CLUNKY, LOW-BANDWIDTH MAIN RAM TO DO THE GPUS WORK

WII HOLLYWOOD GPU: 3MB OF EDRAM/1TSRAM, 1MB FOR TEXTURES PLUS 6-TO-1 COMPRESSION = 6MB EFFECTIVE,
16GB BANDWIDTH X 6-TO-1 COMPRESSION = 96GB BANDWIDTH, PLUS A 2MB FRAME/Z BUFFER AT 12GB BANDWIDTH PLUS COMPRESSION, PLUS 24MB OF 1TSRAM EMBEDDED ON THE GPU DIE

PS2: 4MB, 48 GB, NO COMPRESSION, NO DIRECT ACCESS TO MAIN RAM

XGPU: 128K CACHE, NO COMPRESSION

HOLLYWOOD: 3MB OF SUPER-FAST EDRAM PLUS COMPRESSION, 96GB TEXTURE BANDWIDTH, A 6MB EFFECTIVE TEXTURE CACHE, PLUS 24MB OF SRAM PLUS COMPRESSION

PS2: 4MB

XGPU: 128K

WII: 3MB PLUS 24MB PLUS COMPRESSION. WOW, WII CLEARLY KICKING ALL LAST-GEN ASS

BUT YOURE STILL CALLING PS2 GRAPHICS WII GRAPHICS

WHERE THE GS UNIT IN PS2 HAS 4MB OF SLOW EDRAM, THE WIIS HOLLYWOOD HAS 27MB TO FEED OFF AT INSANE SPEED AND WITH FULL COMPRESSION/DECOMPRESSION ON TOP

WII CAN OUTPERFORM PS2 BY A MILE WITHOUT EVER USING ITS GDDR3 MAIN RAM. THE WIIS CHIP SET ALONE, WITH ITS SRAM, DESTROYS THE WHOLE PS2 SYSTEM

PS2 PORTS ON WII USE LIKE 20% OF ITS ACTUAL PERFORMANCE CAPABILITY

AS CONFIRMED, A PS2 PORT-AND-POLISH ON GAMECUBE WAS 40% OF GAMECUBES POWER, SHOWING GAMECUBE TO BE 2X+ A PS2 PLUS POLISH (GAMES RAN SMOOTHER AND SWEETER AND LOADED QUICKER)

WII IS CLEARLY 5X+ A PS2, SO AGAIN, HOW IS THE EA GUY NOT LYING TO US? CLEARLY HE IS

:roll:

comeon guys stop telling porkys

On September 3, 2008 at 10:01 am

3MB EDRAM = 28GB BANDWIDTH; 24MB DIE-EMBEDDED SRAM = 4GB BANDWIDTH

6-TO-1 TEXTURE COMPRESSION ON GPU READS PLUS ADDITIONAL DATA COMPRESSION OF 4-TO-1

LETS BE CONSERVATIVE: AVERAGE COMPRESSION RATIO 3-TO-1

27MB X 3 = 81MB

28GB PLUS 4 GB = 32 GB BANDWIDTH

32GB X 3 = 96GB BANDWIDTH

VS THE GS UNITS 4MB AND THE XGPUS 128K CACHE

WII DESTROYS THE LAST-GEN CONSOLES CLEARLY

comeon guys stop telling porkys

On September 3, 2008 at 10:06 am

24MB OF 1TSRAM, DIE EMBEDDED

WITH 4-TO-1 DATA COMPRESSION = 96 MB, OR 16GB BANDWIDTH. GET IT

SAME MEMORY AGAIN BUT FOR TEXTURES: 24MB OF DIE-EMBEDDED 1TSRAM WITH 6-TO-1 TEXTURE COMPRESSION

24MB X 6 = 144MB

4GB X 6 = 24GB BANDWIDTH

SO 24GB OF BANDWIDTH AND 144MB OF TEXTURES ON THE GPU DIE BEFORE EVER HITTING THE 64MB OF GDDR3

AND DONT FORGET, IN FRONT OF THAT 24MB OF RAM IS A 3MB EDRAM CACHE WITH HUGE BANDWIDTH

ALL ON-GPU CACHE

OWNED

Mosley

On September 3, 2008 at 10:16 am

This isn’t a debate Wiiboy. This you being told the truth, and delusionally denying it. This, is why you’re regularly banned.
Compression has been explained to you, no need to recycling that.
In fact, I don’t see a single thing you just copy and pasted that hasn’t been ripped to shreds, with evidence, developer quotes, and multiple links.

And I don’t know who you think you’re being informative to.
Those who’ve taken any sort of programming or graphics arts programs, raise your hand…. Those who have actually dealt with compression formats on textures, of any type, raise your had….
I know yours isn’t up Wiiboy, nor will it ever be, with the circular logic you have going on in your mind.
I know a guy who has an Ndev kit. I’d love to have you take a seat in front of it, so you can implement all your awesome “understanding” of how things work.

But that would be like handing an ape a rubix cube.

Crowhurst

On September 3, 2008 at 10:41 am

PROOF THAT THE WII IS SUPERIOR TO XBOX, XBOX360, AND EVEN PS3!!!

http://uk.youtube.com/watch?v=oHg5SJYRHA0

Ye, it’s amazing isn’t it?

Mosley

On September 3, 2008 at 10:57 am

And note the extent to which you'll try to bull up as big a number as you can through your "awesome" multiplication.

And yet the textures look bad, most are basic and flat, and there's limited variety to them.
The models are blocky.
The average load times are crap for such a small amount of ram.
The ai… what ai?
The physics are lame.
Jaggies and pop-in are everywhere, even in the lamest of games.
Lighting is crap. Real-time shadows… what real-time shadows?

If we are to assume that the Wii is "awesome", then (as was said multiple times) we have to assume Nintendo's best and brightest programmers and artists are some of the most incompetent hacks in the industry.

Is that what you're saying, Wiiboy? For the record?

So let me get this straight: you, "Wiiboy", are saying that the awesome Nintendo game designers are being crippled by their incompetent graphics programmers and texture artists?

Wow Wiiboy, that's pretty harsh man, pretty harsh.
"Miyamoto am cry"

Wiiboy = 360 / PS3 FANBOY!!! I KNEW IT!!!!

Mosley

On September 3, 2008 at 11:08 am

Let me repost this revelation, since the ad is covering it in some Firefox browsers. (at least mine)

If we are to assume that the Wii is “awesome”, then, (as was said multiple times) we have to assume Nintendo’s best and brightest programmers and artists are some of the most incompetent hacks in the industry.

Is that what you’re saying Wiiboy? For the record? So let me get this straight, you “Wiiboy” is saying, that the awesome Nintendo game designers are being crippled by their incompetent graphics programmers and texture artists?

Wow Wiiboy, that’s pretty harsh man, pretty harsh. “Miyamoto am cry”

Wiiboy = 360 /PS3 FANBOY!! I KNEW IT!!

Crowhurst: Looks pretty definitive.

What do you have to say for yourself now Wiiboy? Abandoned ship when you thought Nintendo’s graphics programmers were “full of fail”, and now it’s been proven, without question that the Wii’s superiority to both 360 and PS3 is pure truth, with a link and everything.

Gonna come crawling back now, eh? I don’t think Nintendo has much to say to you any more now. Nope, it’s too late. The ship has sailed.

comeon guys stop telling porkys

On September 3, 2008 at 11:09 am

NO ITS NOT A DEBATE ITS A FACT EA GUY LIED THATS A FACT NOT A DEBATE

comeon guys stop telling porkys

On September 3, 2008 at 11:21 am

SO THE FACT REMAINS

WHERE THE XGPU IN XBOX 1 HAS 128K,
THE WII GPU HAS 27MB

128K WITH NO COMPRESSION VS 27MB PLUS COMPRESSION. 128K X 8 = 1MB, AND 27 X 8 = 216

SO NOT EVEN COUNTING COMPRESSION, WII HAS 216 X MORE SRAM ON ITS GPU AND GPU DIE

AGAIN, REMINDER: 27MB PLUS COMPRESSION VS 128K

THE GPU DIE ALONE HOLDS 32GB OF BANDWIDTH NOT COUNTING COMPRESSION

SO XGPU DIE VS WII GPU DIE: WII DESTROYS XBOX

FSB (FRONT SIDE BUS)

XBOX: 133MHZ AT 64 BIT = 1GB

WII: 243MHZ AT 64 BIT PLUS 4-TO-1 COMPRESSION PEAK = 7.7 GB BANDWIDTH. SO WII HAS 7-8 X THE FRONT SIDE BUS PEAK BANDWIDTH, AS THE SPECS CLEARLY PROVE

INTERNAL CPU BANDWIDTH: THE CELERON OF XBOX 1, ~6 GB

INTERNAL BANDWIDTH: BROADWAY CPU = 23GB, CLEARLY DESTROYING XBOX 1

1TSRAM, DIE EMBEDDED: 5 NANOSECONDS LATENCY

SDRAM IN XBOX 1: 50 TO 100 NANOSECONDS LATENCY

WII: A 3MB-PLUS-COMPRESSION, HUGE-BANDWIDTH GPU CACHE BUFFER

XBOX: A 128K GPU CACHE AND NO COMPRESSION

WII: TINY, EMBEDDED, WELL DESIGNED AND BUILT

XBOX: AN OFF-THE-SHELF, POOR DESIGN TAKING UP A HUGE AREA, GENERATING HUGE HEAT AND WASTING PERFORMANCE THROUGH POOR DESIGN AND SLOW RAM

XBOX MAIN RAM: 6.4 GB OF BANDWIDTH SHARED FOR EVERYTHING, EVEN GPU BUFFERS ETC

WII: HUGE ON-CHIP AND BUS BANDWIDTHS BEFORE EVER HITTING MAIN RAM

THEN 4GB FOR EACH POOL OF MAIN RAM, PLUS COMPRESSION ADVANTAGES AND LIGHTNING-FAST, LOW-LATENCY RAM

DISC-TO-SYSTEM LOAD SPEEDS:

3RD XBOX, 2ND GAMECUBE, FASTEST WII. AGAIN A BIG ADVANTAGE

VIRTUAL MEMORY: XBOX HAS A HARD DRIVE, WII HAS A MOTHERBOARD-EMBEDDED FAST FLASH DRIVE. AGAIN FASTER

EVERY SINGLE PART OF WII IS SUPERIOR TO XBOX. FACT IS FACT

comeon guys stop telling porkys

On September 3, 2008 at 11:29 am

LOVE THE "FACT" THAT XBOX FANS IGNORE THE FASTEST RAM IN ANY CONSOLE, 1TSRAM, AND ALSO IGNORE HOW TIGHT, POLISHED AND EMBEDDED THE WII DESIGN IS

XBOX 1: OFF-THE-SHELF, POOR PC PARTS SLAPPED IN AN OVERHEATING, SUITCASE-SIZED BLACK BOX

VS

A HIGHLY INTEGRATED, OPTIMIZED, EMBEDDED SYSTEM IN A TINY, COOL-RUNNING WHITE BOX WITH THE FASTEST RAM OF ANY CONSOLE

PRICELESS FANBOYING

Mosley

On September 3, 2008 at 11:48 am

I know Wiiboy, it’s sad. You’ve lost faith in Nintendo’s graphics programmers.

Most gamers love Nintendo games, they have some of the best game designers around. But you don’t think they understand Wii hardware, and haven’t advanced much on graphics programming past 2001….
Mario Kart Wii is terrible looking for such a graphics processing “beast”. Metroid Prime 3 looks pretty much like 1 and 2, with a little extra polish to it. You likely weren’t impressed with Brawl either, or its load times. And upcoming games like Captain Rainbow…..

I mean, there are still Xbox1 games (and PC games from the same era) that shame 95% of what’s on the Wii, and that’s upsetting.

“sigh” such a sad turn of events. Do you think perhaps they’ve got one guy, locked in a room somewhere turning out the graphics parts of their games?

I don’t know what to tell you Wiiboy. Maybe you could start an on-line petition or picket Nintendo headquarters. Tell them, that if they want third parties to take full advantage of the hardware, then they should at least lead by example.

Would that work?

comeon guys stop telling porkys

On September 3, 2008 at 12:16 pm

GPU TEXTURE CACHE, XBOX: 128K

GPU TEXTURE CACHE, WII: 1MB PLUS 6-TO-1 COMPRESSION = 6MB

128K X 8 = 1MB, SO WIIS TEXTURE CACHE IS 48 X BIGGER, NOT COUNTING THE ADDITIONAL ON-DIE 1TSRAM WITH DIRECT ACCESS FOR GPU AND CPU

48X THE TEXTURE CACHE

FACT IS FACT

3ds_mac

On September 3, 2008 at 12:23 pm

There’s just no convincing Wiiboy. His mind is made up.
Nintendo’s graphics programmers are , and that’s the end of it.

I personally think they’ve done a really good job, with the hardware limitations they have. So that’s not my opinion either.

But it seems Wiiboy is a mindless anti-Nintendo fanboy through and through. He’ll continue to troll Nintendo the rest of the generation, with his Nintendo programmer hating bull.
And he’ll keep putting folks like Valve, I.D Software, and Cryteck up on pedestals, and worship their graphics programming skills, and belittle everything Nintendo or Retro puts out, just because they don’t live up to his expectations of the hardware, with it’s “100′s of gbs of bandwidth and immense processing power” that continues to go unused.
Big N just can’t do anything right according to him and his ilk. It is indeed sad.

3ds_mac

On September 3, 2008 at 12:45 pm

****128K X 8 = 1MB WIIS TEXTURE CATCH IS 48 X BIGGER NOT COUNTING ADDITIONAL DIE 1TSRAM WITH DIRECT ACCESS TO GPU AND CPU*****

We know "why" you think Nintendo's programmers are a bunch of chimps. I would too, if I thought there was such a disparity in hardware and so little difference in the actual games.

It's pretty sad to see Nintendo's biggest supporters turn on them like this.

Those who defended them even when Nintendo was rubbing it in their faces at E3, and even when Nintendo referred to those who supported them throughout their poorest-performing console generation as "basement-dwelling otaku": they stuck by Nintendo.

But now that Nintendo isn't living up to their expectations, they've resorted to implying that a pack of drooling morons pretending to be graphics programmers is working on their biggest franchises.

Nintendo just can't win. They try something new with the control scheme to carve out a niche for themselves and become successful, and the graphics whores jump down their throat for the primitive visuals.

comeon guys stop telling porkys

On September 3, 2008 at 1:44 pm

EA GUY LIED THATS ALL IM SAYING AND NOTHING MORE AND THAT IS FACT

HE LIED

comeon guys stop telling porkys

On September 3, 2008 at 2:01 pm

WIIS PIXEL DRAW PER FRAME IS LIKE 108 X OVERDRAW: IT CAN DRAW THE SAME PIXEL 108 TIMES PER FRAME ACCORDING TO THE PIXEL-PIPE AND CLOCK-SPEED COUNT

AND THAT DOESNT INCLUDE TILE RENDERING, VIRTUAL TEXTURING, INSANELY FAST RAM AND HUGE BANDWIDTHS

EFFECTIVE IN-GAME FILL RATE IS EASILY 2X+ XBOX 1 AT THE SAME NATIVE RESOLUTION

4X+ THE BANDWIDTHS TOO. CLEARLY, AT 480P ITS MORE AN X360 THAN AN XBOX 1

FACTOR 5 STATED THAT AT 480P WIIS EFFECTIVE IN-GAME FILL RATE WAS INSANE, AND THEYRE 100% CORRECT
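
For what it's worth, an "overdraw per frame" figure like the 108x above is just peak fill rate divided by screen pixels times frame rate; a small sketch using the 4-pipes-at-243 MHz peak implied earlier in the thread (the quoted 108 lines up with a 30 fps budget):

```python
def overdraw_budget(fill_rate_px_s: float, width: int, height: int, fps: int) -> float:
    """Peak overdraw: how many times every screen pixel could be drawn per frame."""
    return fill_rate_px_s / (width * height * fps)

wii_peak_fill = 4 * 243e6  # 972 Mpixels/s: 4 pixels/clock at 243 MHz, as implied earlier in the thread

print(overdraw_budget(wii_peak_fill, 640, 480, 30))  # ~105, close to the "108x" quoted above
print(overdraw_budget(wii_peak_fill, 640, 480, 60))  # ~53 if you budget for 60 fps instead
```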

comeon guys stop telling porkys

On September 3, 2008 at 4:27 pm

XBOX 1 DOES NOT SUPPORT TRUE TEXTURE COMPRESSION LIKE GAMECUBE AND WII. YES, IT SHARES THE SAME 6-TO-1 COMPRESSION, BUT XGPU CANNOT READ AND DECOMPRESS IN REAL TIME

ONLY GAMECUBE AND WII CAN, AND THAT TRICK ALSO APPLIES TO DATA COMPRESSION

XGPU HAS TEXTURES DECOMPRESSED AS THEY'RE SENT, SO IT'S NOT TRUE COMPRESSION

WII DECOMPRESSES DATA ON THE FLY IN REAL TIME AS IT'S PROCESSED, AND THE SAME APPLIES TO TEXTURES

HANK AKA HANKFILES IS LYING: XGPU DOES NOT READ COMPRESSED TEXTURES LIKE WII CAN

ALSO, TEXTURES ARE SENT AT 5 NANOSECONDS LATENCY COMPARED TO SAY 50 NANOSECONDS FOR XBOX 1

PS2 IS BY FAR THE WORST: NO COMPRESSION, NO GPU-TO-RAM DIRECT ACCESS, AND ON TOP OF ALL THAT THE SLOWEST, MOST LATENCY-RIDDEN RAM

1T-SRAM, REAL-TIME DECOMPRESSION, HUGE BANDWIDTH, A 3MB CACHE. XBOX CANNOT MATCH THAT

WII'S TEXTURING CAPABILITY, INCLUDING ALL THE FANCY STUFF, IS LIKE XBOX, GAMECUBE AND PS2 COMBINED

THERE'S WAY MORE RAM/BANDWIDTH/SPEED/STREAMING SPEED/COMPRESSION AND EFFECTIVE, EFFICIENT TRICKS TO DEAL WITH TEXTURES THAT THE WII CAN DO. IT'S NEXT GEN MINUS NATIVE HD, THAT IS FACT

Sampson

On September 3, 2008 at 9:51 pm

Max-fill rates are always far and away higher than screen res. So is polygon count. Until of course, you start adding lighting, gouraud or phong shading, multiple texture layers, aa, alpha-blending, testing, overdraw, etc, etc.

Lol!
Who needs to go by Hank from HankFiles, to determine what can use compressed textures? It’s simple logic. You have a bus to ram that stores a texture compressed. It gets read by the gpu. Done.

****XGPU HAS TEXTURES DECOMPRESSED AS THEY'RE SENT, SO IT'S NOT TRUE COMPRESSION****

You misunderstand the process. I’ll try to help you out.

“Sent”? What are you assuming is doing the “sending”? RAM doesn’t decompress anything. It simply stores data to be read. It’s not until it’s been read from ram, transmitted over the bus, to the gpu that it ever gets used.

In the case of PS2, you can store textures in ram compressed (and transmit them over the bus from ram fine). But the cpu does the reading, decompresses them, and “sends” them to the gpu decompressed over the fsb, because there’s nothing in the gpu to deal with them.

In the case of an Xbox, it reads compressed data straight from ram, transmitted over the bus to the gpu, where the gpu has the ability to deal with them on its own. There’s nothing sending them between the two. Everything that gets to the gpu is compressed.

If you wanted to read a texture from ram on the Cube or Xbox, that’s in a format other than S3TC, then the cpu would have to read it, decompress it, and send it to the gpu.
But obviously, that’s never done.
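
Sampson's point is easier to see with a concrete example. Below is a minimal sketch, in Python and purely an illustration (not code from any console SDK), of decoding one 8-byte DXT1/S3TC block into a 4x4 tile of RGB values; because every block is a fixed 8 bytes, a GPU can keep textures compressed in memory and decode blocks on the fly as it samples them.

import struct

def rgb565_to_rgb888(c):
    # expand packed 5/6/5-bit channels back to 8 bits each
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block8):
    # block8: 8 bytes = two RGB565 endpoint colours + 16 two-bit palette indices
    c0, c1, bits = struct.unpack('<HHI', block8)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:   # 4-colour mode: two extra interpolated colours
        p2 = tuple((2 * a + b) // 3 for a, b in zip(p0, p1))
        p3 = tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))
    else:         # 3-colour mode plus a transparent/black slot
        p2 = tuple((a + b) // 2 for a, b in zip(p0, p1))
        p3 = (0, 0, 0)
    palette = (p0, p1, p2, p3)
    # 16 texels, 2 bits each, lowest bits first
    return [palette[(bits >> (2 * i)) & 0x3] for i in range(16)]

# a hand-made block: white and black endpoints, every index pointing at colour 0
print(decode_dxt1_block(struct.pack('<HHI', 0xFFFF, 0x0000, 0)))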

get outa here

On September 4, 2008 at 8:23 am

xgpu-to-xcpu communication is capped at 1gb, as that's your lot in bus bandwidth

hollywood-gpu-to-broadway-cpu communication peaks at 7.7gb of bandwidth. oh look, 7-8x more

celeron level 2 cache: 128k, no compression, and poor latency issues with slow main ram

broadway level 2 cache: 256k plus 4-to-1 peak compression, plus cache-speed main ram with huge space, bandwidth and speed compared to the xcpu's pathetic cache

broadway's cache is way over 10x as effective as the rubbish cache on the xcpu. it's custom, supports custom things, and of course 4-to-1 peak compression

256k x 4 = 1mb vs the xcpu's 128k, plus the broadway cache is optimized and more efficient. it's easily 12x the effective cache of the xcpu's

fact vs idiots

4 to 8x the processing bandwidth of xbox 1, 2x-plus the fill rate, much faster ram, much faster disc loading, and effective, efficient per-clock performance xbox 1 can only dream of

the ram alone is like 15x faster, you muppets

24mb of 1tsram on die. show me a pc or an x360 with that amount of on-chip sram

you clearly know nothing

get outa here

On September 4, 2008 at 8:30 am

factor 5's rogue squadron 3: 20 million polygons at 60 frames

asked about a possible xbox version, they CONFIRMED 12 million polygons and 24 frames a second if ported to xbox

gamecube more powerful, fact

also, xbox would run a peak of 4 texture layers and 16 bit colour

gamecube ran that game at 24 bit colour and a peak of 8 texture layers

owned

why do textures pop up on x360? why do textures pop up on xbox 1?

pooooooorrrr ram speed. you know nothing about specs, fools. ram speed is sooooooooo important, but like every other pc-brainwashed idiot you wouldn't know that

wii's ram is cache-memory-like in speed. THE WHOLE RAM

show me an xbox, x360 or pc that can do that = pop-up textures. halo 2 pop-up textures, all unreal 3 games on x360

no such pop-up on gamecube and wii

AGAIN, SUPERIOR CONSOLE DESIGN

NOW WAIT FOR YOUR GAMES TO LOAD AND PLAY THEM ON OUTDATED, OBSOLETE CONTROLS, FOOLS

get outa here

On September 4, 2008 at 8:32 am

X360 PRICE DROP CONFIRMED
"WHY?"

NO ONE BUYS CRAP CONSOLES IN 3RD PLACE SALES-WISE GLOBALLY. WII IS BREAKING "ALL" SALES RECORDS

YOU FAN A DEAD CONSOLE

GOODBYE

Sampson

On September 4, 2008 at 9:38 am

Lol, this can go on for the rest of your life Wiiboy. I’m planning a teaching degree in special ed, so this is nice practice.
Now, if Gamecube’s gpu needed geometry, where does it get it?
From the cpu.
Why? Because it lacks vertex shaders capable of reading a mesh from ram by itself.
Geometry can be in ram as an index list. An index list is a compression of geometry.
Gamecube’s gpu requires the cpu to act in such a role. And this, is where you pull in your compression bs. This is where you mention custom lighting.
But, this ignores the fact that every other gpu has vertex shaders, Gamecube and Wii do not. Those vertex shaders are right there on the same chip as pixel shaders.

And Wii doesn’t have 24 mb of embedded ram. Unless you want to count only being connected to that ram by a 64-bit / 486 mhz single data rate bus.
Sub-4gb isn’t awesome cache Wiiboy.
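
As a back-of-the-envelope check on that "sub-4gb" figure, assuming (as the post above does) a 64-bit single-data-rate bus at 486 MHz; treat the numbers as the thread's, not confirmed specs:

bus_width_bytes = 64 // 8        # 64-bit bus
clock_hz = 486e6                 # 486 MHz
transfers_per_clock = 1          # single data rate
peak = bus_width_bytes * clock_hz * transfers_per_clock
print(peak / 1e9)                # ~3.9 GB/s, i.e. "sub-4gb"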

Nor is adding your latency figures (aka ram speed in Wiiboy terms), considering you ignore what actually causes latency, and ignore page fetches, and ignore the linearity of reading graphics data.

And please don’t attempt to give me the effectiveness of cpu caches, as it’s obvious you’re not even describing their use properly.

And I think we’ve already been over your “wii has no pop-in” bull, so let’s not drift backwards here Wiiboy.

And don’t attribute quotes to Factor 5, that they’ve never made.

And you can toss in actual developer quotes, of their own contrived cases, such as what was linked and sourced far earlier to you.

“One texture per polygon just isn’t enough any more. The PS2 has fill-rate to spare, and the dual context rendering architecture can draw two copies of every triangle for little more cost than one, so you are wasting the hardware if you have only a single texture stretched over your geometry. Xbox supports four layers per render-pass, and Gamecube eight. (although in practice you can only afford to use two or three while maintaining a good framerate).”

“Our PS2 projects are using two texture layers, with the gouraud alpha controlling a cross-fade between them. On Gamecube we usually add a third multiply mode detail layer, while on Xbox the flexibility of pixel shaders lets the artists choose any possible combine modes for their three layers, with the fourth generally reserved for the programmers to do dynamic lighting or reflection effects.”

Or the benchmark tests in that patent storm application, about building their dma chains, etc.. (pc is an older ati gpu, with a 1.4 ghz P3)

The first test was a 3998 polygon NBA player:

Gouraud shading, unlit, no textures
PS2: 17.0 MPolygons/second, 22.6 MVertexIndices/Second
Xbox: 47.2 MPolygons/second, 91.4 MVertexIndices/Second
GCN: 18.7 MPolygons/second, NA
PC: 24.1 MPolygons/second, 46.1 MVertexIndices/Second

Gouraud shading, lit, no textures
PS2: 10.9 MPolygons/second, 14.7 MVertexIndices/Second
Xbox: 22.4 MPolygons/second, 43.4 MVertexIndices/Second
GCN: 10.3 MPolygons/second, NA
PC: 15.9 MPolygons/second, 20.9 MVertexIndices/Second

Gouraud shading, lit, textured
PS2: 8.5 MPolygons/second, 11.5 MVertexIndices/Second
Xbox: 14.2 MPolygons/second, 30.3 MVertexIndices/Second
GCN: 7.2 MPolygons/second, NA
PC: 5.1 MPolygons/second, 10.9 MVertexIndices/Second

Test with a 69451 polygon Stanford bunny (from Stanford Computer Graphics Laboratory)
with Gouraud shading only:
PS2: 25.2 MPolygons/second, 31.8 MVertexIndices/Second
Xbox: 63.9 MPolygons/second, 93.8 MVertexIndices/Second
GCN: NA, NA
PC: 26.3 MPolygons/second, 36.2 MVertexIndices/Second

See Wiiboy, there are lots of contrived cases to point to.
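
To relate throughput figures like those to the per-frame polygon counts argued about elsewhere in this thread, here's a trivial conversion sketch; the rates are the lit-and-textured numbers quoted above, and 30/60 fps are just the usual frame-rate targets:

def polys_per_frame(mpolys_per_second, fps):
    # turn a sustained polygons-per-second figure into a per-frame budget
    return mpolys_per_second * 1e6 / fps

for name, rate in [("PS2", 8.5), ("Xbox", 14.2), ("GCN", 7.2)]:
    print(name, int(polys_per_frame(rate, 30)), "per frame at 30fps,",
          int(polys_per_frame(rate, 60)), "at 60fps")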

Mosley

On September 4, 2008 at 10:35 am

Wiiboy’s on a mission to expose Nintendo’s graphics programming staff as frauds.

He’s already stated here, that Nintendo only has 1/8th, the programming skill of others.
Heck, he’s even stated that other developers are “7x”, “8x”, “10x”, “12x”, hell, even “48 times” better at their job than Nintendo, considering they pretty much do the same thing or better, using a tiny fraction of the processing power on other last-gen hardware.

And even when their games are selling in the millions, Nintendo still refuses to put in any effort whatsoever.

He’s just an anti-Nintendo fanboy, who thinks it’s his job to tell the world what awesome graphics programmers every other game developer outside of Nintendo have on staff.

He can’t help but wonder what REAL programmers could do with a Wii, a console he thinks is dozens of times more capable than an Xbox. He’s tired of what he no doubt considers to be “half assed ” from Nintendo.

All that power and performance, and they refuse to use it, according to him.
And DAMN IT, HE’S NOT GOING TO TAKE IT ANYMORE!!!

Mosley

On September 4, 2008 at 10:43 am

Proof that Wiiboy knows what he’s talking about.
http://www.youtube.com/watch?v=DUs79PnTKus

Wiiboy, twice the idiot, none of the savant.

You tell ‘em Wiiboy, YOU TELL THE WORLD!!

comeon guys stop telling porkys

On September 4, 2008 at 11:53 am

x360 is a dreamcast. it's 3rd in sales globally and the latest price drops CONFIRM IT'S DEAD

how many price drops and new models have been released for x360 and ps3?

wii, in one type and one colour, is outselling every version of x360 and every version of ps3 COMBINED, and then the psp numbers on top of that

every single x360, ps3 and psp sold, combined, still doesn't match wii sales per month

x360 is selling at below ps2 levels globally

IT'S A DEAD CONSOLE PROPPED UP WITH BILL GATES' MONEY

DEAD IS DEAD. DON'T TRY WIGGLING OUT OF THIS ONE. THE X360 IS DEAD, AND THAT IS SALES FACT AND BUILD QUALITY FACT

wii is killing x360 in its own country and has battered x360's userbase in half the time on sale

x360 is a dead console, admit it

Sampson

On September 4, 2008 at 12:50 pm

You’re assessing quality of gaming with sales figures, numbers pumped up by soccer moms who don’t know any better, not because Wii’s had a great software lineup. And the 360 in the US isn’t too far behind Wii, despite the fact that a far, far greater percentage of those that buy Wiis do so because it’s the popular thing to do.

Games like Prime 3, Zelda, etc.. don’t sell anywhere near what you’d expect a user-base filled with “”gamers” to do. That’s despite the fact that you can’t play those games anywhere else, and there’s only half attempts at competition.
What else is there for a gamer to buy and play on WIi, besides what there’s always been on Nintendo consoles?
Looks to me, like they have a decent increase in gamers over what the Gamecube had, but a huge boost in folks who don’t care about Zelda or The Conduit, etc..

But I could really care less how many people buy the system. If it weren’t such crap hardware, I’d be happy to see it, and like plenty of others have commented, would really have no reason to own anything else.

But, reality is, they stuck a gpu they pay ati 20-25 dollars for(10-15 pounds), and a cpu they pay half that to ibm for.

Sales figures aren’t my area of interest. Games and decent hardware are.

comeon guys stop telling porkys

On September 4, 2008 at 12:53 pm

TWILIGHT PRINCESS, A GAMECUBE GAME, IS UNMATCHED BY ANY XBOX GAME VISUALLY

SO HOW IS XBOX ANYWHERE NEAR WII

ROGUE SQUADRON 3 IS UNMATCHED BY ANY XBOX GAME, SO HOW IS WII AN XBOX

WIND WAKER: NO XBOX OR X360 GAME TO DATE HAS CEL-SHADED GRAPHICS ON PAR WITH THE GREAT-LOOKING WIND WAKER. AGAIN GAMECUBE WINS OUT

LIST OF GAMES, ALL OF WHICH ARE UNMATCHED VISUALLY BY ANY XBOX GAME IN EXISTENCE:

RESIDENT EVIL 4

TWILIGHT PRINCESS

METROID PRIME 2

WIND WAKER

F-ZERO

ALL RUN LOADING-FREE VS XBOX 1'S OUTRAGEOUS LOADING TIMES, LIKE A DAMN PS2

ALL RUN IN OPTIMIZED 24 BIT COLOUR VS THE 16 BIT COLOUR OF YOUR AVERAGE XBOX TITLE

HALO 2: LONG LOADING TIMES, POP-UP GRAPHICS, POP-UP TEXTURES, GLITCHES, ISSUES, 16 BIT COLOUR, 30 FRAMES DROPPING TO 12 FRAMES. OUTRAGEOUS BUGGY CRAP

PRIME 2:

24 BIT COLOUR, 60 FRAMES, NO LOADING TIMES, NO ISSUES, NO POP-UP

GET YOUR FACTS RIGHT, XBOX FANS

ANOTHER SHOCKING HOME TRUTH ALL XBOX FANS TRY TO PRETEND ISN'T TRUE:

XBOX FANS BIGGED UP CONKER'S BAD FUR DAY ON XBOX, STATING LOOK AT THE EFFECTS, LOOK AT THE ART, LOOK AT THE FUR SHADING, IT'S UNMATCHED

REPEAT THE XBOX FAN LIE: IT'S UNMATCHED

THIS GAME WAS MADE BY RARE AFTER 5 YEARS OF XBOX EXPERIENCE AND MICROSOFT OWNERSHIP. YES, WE ALL ACCEPT THIS AS FACT, YES

BUT THERE'S A GAME THAT OUTSHINES CONKER'S GRAPHICALLY IN EVERY SINGLE WAY

TEXTURES, EFFECTS, WATER, FUR SHADING: EVERY LAST THING XBOX FANS BIGGED UP ABOUT CONKER'S WAS BEATEN IN THIS GAME

THE GAME WAS MADE 5 YEARS EARLIER THAN CONKER'S, BY THE SAME DEVELOPERS, RARE, AND ON A CONSOLE THEY HAD ONLY MONTHS OF EXPERIENCE WITH

THAT GAME, WAIT FOR IT, XBOX LIARS:

STAR FOX ADVENTURES. IT HAD BETTER GRAPHICS, BETTER FUR SHADING, A BETTER FRAMERATE AND SHORTER LOAD TIMES THAN CONKER'S, AND IT WAS A GAMECUBE GAME

RARE DID BETTER GRAPHICS ON GAMECUBE, WITH ONLY MONTHS OF GAMECUBE DEV KIT EXPERIENCE, THAN THEY EVER DID ON XBOX 1 EVEN AFTER 5 YEARS WITH THE HARDWARE

INDUSTRY FACTS BEAT OUT XBOX FANS' FANTASIES

GO CHECK OUT STAR FOX ADVENTURES VS CONKER'S ON XBOX

GAMECUBE'S FOX HAS BETTER GRAPHICS, BETTER FUR SHADING, A BETTER FRAME RATE, BETTER COLOUR AND SHORTER LOADING TIMES

INDUSTRY FACT
OWNED

SO EXPLAIN HOW THE FUK WII IS AN XBOX, YOU RETARDED FANBOYS

comeon guys stop telling porkys

On September 4, 2008 at 1:04 pm

SO RARE DID STAR FOX, A 1ST-YEAR GAMECUBE GAME, THEN 4/5 YEARS LATER DID CONKER'S ON XBOX

YET THE 1ST GAME WAS BETTER LOOKING AND SMOOTHER RUNNING

I WONDER HOW, CONSIDERING XBOX WAS, COUGH COUGH, MORE POWERFUL

ARGUE AGAINST INDUSTRY FACT, I DARE YOU TO

FANBOYS :roll:

Sampson

On September 4, 2008 at 1:38 pm

Once again Wiiboy, Loading in games like Prime is done behind doors.
It lacked bump maps of any type. etc, etc..

And anyone will tell you, that Ninja Gaiden Black, Orta, Doom 3, Dead or Alive: Ultimate, etc..etc, etc, matched or exceeded anything on Gamecube.

Just as anyone can give a list of games that went from one console to another, that looked far better on Xbox. Do you really need that done?
I could care less about Star Fox Adventures, or Conker, as I’ve played neither. But if I do look at them, I’ll be sure to say whether or not I think they utilized the extra ram, or whether they used the superior bump mapping ability of Xbox.
Of course, simply adding more computationally impressive textures and lighting, doesn’t take the place of the art design that goes into them.

Sampson

On September 4, 2008 at 2:07 pm

And those aren’t industry facts of any type, Wiiboy.

Star Fox Adventures, started as Dinosaur Planet on Gamecube. And they moved assets from their engine, over to Xbox for Conker as well as Perfect Dark. Where they’ve gone into detail on where they made changes, and what specs were higher in a couple interviews about their games. Do we judge 360, on Perfect Dark being moved from Xbox? Do we judge Wii, on PS2 game engines?
Apparently we should, and then base our entire argument on that one point.

What’s likely an industry fact, is the fact that Gamecube is (pretty much) designed and optimized to be a fixed function system. (very DX7 like)
Most games throughout much of Xbox and Gamecube’s life ran on that kind of pipeline.

Xbox can do both, but is designed for DX8′s programmable pipeline. It only carries the DX7-style fixed path so it could use what already existed.

Developers however, didn’t begin using extensive programmable shaders until more recently. Game engines were still tuned to utilize what came before Geforce 3, which is fixed function.

Xbox vs Gamecube using a DX7-like pipeline, isn’t what you’d do to compare overall abilities of the system as a whole.

For using the full spectrum of abilities on an Xbox, you’d have to be using DX8.1, which many developers hadn’t moved to.

Sampson

On September 4, 2008 at 2:32 pm

And there are no “long loading times” on Halo 2. There are NO load times in Halo 2, in Wiiboy terminology.

The levels load on the fly. That’s part of the reason you can see textures pop in every once in a while during a cinema.
You’d see the Exact same thing and more, if they actually allowed you to open a door when you shoot it in Prime, or if they showed the island you’re sailing up to in Windwaker. (they do, and you can see it)
Or if they showed any view of the stage you’re going into in Mario Sunshine.

And your assessment of bit depth is also a complete waste.

comeon guys stop telling porkys

On September 5, 2008 at 6:50 am

YOU FAN AN OBSOLETE CONTROL PAD, OBSOLETE LONG LOAD TIMES, AND BUILD QUALITY EVEN WORSE THAN PS2. X360'S FAIL RATE IS A JOKE, IT'S EVEN HIGHER THAN PS2'S WAS

I DON'T GET HOW OBSOLETE FANBOYING CAN DEEM YOU, COUGH, GAMERS. A GAMER EMBRACES THE GAMING FUTURE, HE DOESN'T IGNORE IT

BASHING BUTTONS ON A DREAMCAST-COPY CONTROL PAD IS A FACT OF OBSOLESCENCE

IT'S LAST GEN

comeon guys stop telling porkys

On September 5, 2008 at 6:52 am

BASHING BUTTONS IN FPS GAMES VS 3D MOUSE AND 3D MOTION

OK THEN, I'LL CHOOSE BUTTON BASHING

ERM, NOOOO, THAT WOULD BE "STUPID". WHY WOULD A CORE GAMER DO THAT?

comeon guys stop telling porkys

On September 5, 2008 at 11:04 am

PS2 GPU CACHE/BUFFER WORK RAM: 4MB OF EDRAM, RAMBUS MAIN RAM, NO COMPRESSION, NO REAL-TIME DECOMPRESSION, CRAPPY LATENCY

WII GPU CACHE/BUFFER WORK RAM: 3MB OF 1T-SRAM-R EDRAM, PLUS 24MB OF NON-EDRAM 1T-SRAM-R EMBEDDED ONTO THE GPU DIE, PLUS DATA COMPRESSION, PLUS TEXTURE COMPRESSION, ALL CACHE-SPEED AND CLOCK-BALANCED

27MB PLUS COMPRESSION AND REAL-TIME DECOMPRESSION VS 4MB ON THE GS UNIT

ALSO REMEMBER ALL RAM IS A DIRECT GPU FEED ON WII. PS2 HAS NO DIRECT-FEED RAM TO THE GS UNIT, IT'S CPU-SHARED AND BOTTLENECKED

4MB, NO COMPRESSION, AT 8 BIT COLOURS/TEXTURES: GS UNIT RAM

27MB PLUS COMPRESSION, REAL-TIME DECOMPRESSION, OPTIMIZED 24 BIT COLOUR AND 24 BIT TEXTURES: HOLLYWOOD GPU

2-TO-1 / 4-TO-1 DATA COMPRESSION, 6-TO-1 TEXTURE COMPRESSION

AVERAGE OUT COMPRESSION TO 3-TO-1: 27MB X 3 = 81MB

81MB VS 4MB

FIFA WII = PS2 GRAPHICS SLIGHTLY POLISHED

NOT WII GRAPHICS OR A WII ENGINE AT ALL

SO XBOX FANS, LIKE EA SPOKESPERSONS, ARE LYING FUK HEADS

:roll: :roll: :roll: :roll:

comeon guys stop telling porkys

On September 5, 2008 at 11:09 am

PS2 GS UNIT EDRAM @ 150MHZ

HOLLYWOOD: 3MB 1T-SRAM-R EDRAM @ 243MHZ AND 24MB 1T-SRAM-R DIE-EMBEDDED RAM @ 486MHZ

WII'S GPU DIE RAM ALONE EASILY DESTROYS THE WHOLE MAIN RAM OF PS2 AND XBOX

HOLLYWOOD GPU: 3MB @ 243MHZ, 24MB @ 486MHZ. GS UNIT: 4MB @ 150MHZ

THERE'S NO CONTEST. WII = X360-LEVEL VISUALS AT A CAPPED 480P

IT AIN'T A PS2, IS IT, RETARDS

comeon guys stop telling porkys

On September 5, 2008 at 12:02 pm

FIFA'S ENGINE WAS BUILT FOR PS2

PS2 = N64 TURBO, AND THAT'S BEING KIND

WII = EASILY 5X-PLUS A PS2, SO EXPLAIN HOW FIFA WII REPRESENTS WII GRAPHICS POWER

IT IS SIMPLY NOT WII GRAPHICS, WII A.I OR WII PHYSICS AT ALL

IT'S A PS2 ENGINE THAT'S HAD A SLIGHT POLISH. IT'S NOT EVEN PUSHING GAMECUBE-LEVEL HARDWARE, NEVER MIND THE WII

TALK ABOUT S

EA LIED. YOU KNOW THEY LIED, SO WHY STATE OTHERWISE?

OH I FORGOT, YOU'RE IDIOT FANBOYS

PS2 IN GAME: 2 TEXTURE LAYERS, ZERO SHADERS, LOWER THAN 480I NATIVE RES, AND MAX 8 BIT COLOUR AND 8 BIT TEXTURES

WII: TRUE 480P RES, OPTIMIZED 24 BIT COLOUR/TEXTURES, 8 TEXTURE LAYERS, 16 REAL-TIME TEXTURE STAGES, CUSTOM CPU CO-PROCESSED GRAPHICS

THERE'S NO COMPARISON. THE EA GUY LIED AND THAT'S 100% FACT

Sampson

On September 5, 2008 at 1:02 pm

You can have a system that’s 2x an Xbox, or 2-3 times as powerful as a Gamecube, or 4x a PS2, and you wouldn’t notice a whole lot.

Take Shadow of the Colossus, (while bumping the frame-rate to 60) double and triple texture res, improve draw distance, increase polygon counts on things like trees, and even add more layers of textures for more bump maps, add full aa, slightly better lighting; all would eat up 3x-4x a PS2 easily. And that would look like a very clean and sharp SotC. Not a whole lot different.

Most graphics effects are done through floating point calculations, and there, something like rsx gpu in ps3 has gone up 20-25x or more since those days. (and no, I’m not referring to the bs 1-2+ teraflops numbers)
I mean the realistic, programmable operations.

Sure, something planned for PS2 wouldn’t be impressive moved to Wii, compared to something planned for it exclusively, but people’s expectations of the jump in quality of something that was, are pie-in-the-sky-too-high.
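
A rough way to see why that Shadow of the Colossus touch-up "eats up 3x-4x a PS2" so quickly is that the individual cost factors multiply. The multipliers below are illustrative guesses matching the description above, not measurements of anything:

# illustrative, order-of-magnitude multipliers only
cost_factors = {
    "60fps instead of ~20-30fps": 2.0,
    "double/triple texture res": 1.5,   # assumes texturing is only part of the frame cost
    "draw distance + extra polygons": 1.5,
    "full aa + extra texture layers": 1.3,
}
total = 1.0
for reason, factor in cost_factors.items():
    total *= factor
print(round(total, 1))   # ~5.9x, already past "3x-4x a PS2"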

direct comparason videos guys

On September 6, 2008 at 2:03 pm

COLD FEAR XBOX VS RESIDENT EVIL 4 GAMECUBE: NO CONTEST, RESIDENT EVIL 4 DESTROYS COLD FEAR GRAPHICALLY. GO COMPARE GAMEPLAY VIDEOS

FABLE XBOX VS TWILIGHT PRINCESS GAMECUBE: AGAIN NO CONTEST. FABLE LOOKS LIKE A PS2 GAME, TWILIGHT PRINCESS IS SLIGHTLY UNDER FABLE 2 ON X360 (YET IT'S GAMECUBE GRAPHICS, HMMM)

WIND WAKER VS ANYTHING CEL-SHADED ON XBOX 1: AGAIN NO CONTEST. WIND WAKER IS A VISUAL MASTERPIECE, NO CEL-SHADED GAME ON XBOX COMES CLOSE

HALO 2, DOOM 3, RID ETC: ALL 30 FRAMES, ALL 16 BIT COLOUR, ALL LONG LOAD TIMES AND GRAPHICS GLITCHES AND BUGS. VS

PRIME 2 GAMECUBE: 24 BIT COLOUR, 60 FRAMES, NEAR NO LOADING TIMES, NO GLITCHES OR BUGS OR POP-UP

IN EVERY SINGLE CONTEST ABOVE GAMECUBE WINS, NOT ONLY IN GRAPHICS BUT ALSO LOAD SPEED, FRAMERATES AND POLISH. IN ALL AREAS THESE GAMES KILL THEIR XBOX COMPETITION BY A COUNTRY MILE

YET AS IF BY MAGIC, A CONSOLE CALLED WII, 2.5X MORE POWERFUL THAN A GAMECUBE THAT'S ALREADY KICKING XBOX ASS, IS SOMEHOW LESS POWERFUL?

YOU HAVE SERIOUS ISSUES, XBOX FANS

COLD FEAR LOOKS LIKE A DOG COMPARED TO RES EVIL 4

FABLE LOOKS BLOCKY AND OLD COMPARED TO TWILIGHT PRINCESS

XBOX CEL SHADING LOOKS LIKE POO COMPARED TO WIND WAKER

AND LET'S NOT FORGET ROGUE SQUADRON 2/3, BOTH DESTROYING XBOX POLYGON COUNTS, TEXTURE LAYERS AND LIGHTING

THEY'RE GAMECUBE GAMES, GUYS, NOT WII, AND THEY'RE NOT JUST BETTER LOOKING THAN ANY XBOX COUNTERPART, THEY ALSO LOAD MUCH FASTER AND RUN 99 TO 100% GLITCH FREE

INDUSTRY FACT VS XBOX FAN FANTASY

direct comparason videos guys

On September 6, 2008 at 2:05 pm

COMPARE LIKE FOR LIKE = GAMECUBE WINS, NEVER MIND WII

direct comparason videos guys

On September 6, 2008 at 2:19 pm

http://uk.gamespot.com/xbox/rpg/fable/video/6106534/fable-video-review?om_act=convert&om_clk=gssummary&tag=summary;watch-review

ABOVE VIDEO: FABLE 1. SERIOUSLY DATED

http://www.gametrailers.com/player/15222.html

ABOVE VIDEO: TWILIGHT PRINCESS KILLS FABLE'S GRAPHICS

http://www.gametrailers.com/player/17690.html

ABOVE VIDEO: FABLE 2. OH LOOK, TWILIGHT PRINCESS AND FABLE 2 ARE CLOSE VISUALLY, WHILE FABLE 1 LOOKS LIKE ASS

FABLE 2, AN X360 GAME, VS TWILIGHT PRINCESS, A GAMECUBE GAME: CLEARLY VERY CLOSE, AND THAT'S NOT EVEN WII GRAPHICS

YET XBOX 1 FABLE TRAILS BEHIND LOOKING LIKE

VIDEOS PROVE THE TRUTH. THE END

WII = X360 - HD. FACT

direct comparason videos guys

On September 6, 2008 at 2:41 pm

256MB OF GRAPHICS RAM ON X360 DOES NOT = 256MB OF GRAPHICS DETAIL. IT WASTES SPACE JUST SUPPORTING THE HIGHER NATIVE RESOLUTION, AND RESOLUTION IS NOT GRAPHICS

YOU'VE BEEN EDUCATED

direct comparason videos guys

On September 6, 2008 at 3:08 pm

FABLE 1/2 10 TO 20 HOURS MAIN GAME

TWILIGHT PRINCESS 60 HOURS

YOU'RE NOT GAMERS, YOU'RE GAME CULTURE FATE CRASHERS WHO KNOW NOTHING ABOUT PUNASH QUALITY AND TRUE GAMING SPIRIT

MICROSOFT AND GAMING GO TOGETHER LIKE ICE CREAM AND CURRY SAUCE. "THEY DON'T BELONG"

direct comparason videos guys

On September 6, 2008 at 3:09 pm

GATE CRASHERS

Sampson

On September 6, 2008 at 3:47 pm

Again, you know zilch about color depth, or what’s a 32 bit or 24 bit, 16-bits, etc…

And again, you obviously have never attempted to make a normal map, texture a polygon, or fooled with any type of graphics program of any sort. (I actually question whether you’ve actually played any of these consoles, or their games at all to be honest)
Plenty of free-ware programs and literature for doing all of that, and you can get a better idea of what you jibber about.

You’re proclaiming things like Metroid Prime and Wind Waker as being “awesome” graphically, and using them as examples of the system’s power, particularly in the context of “texture power”.

You’re pointing to games that are unimpressive from any technical standpoint in textures.

Take a nice look at the resolution and quality of the textures in Wind Waker. They’re extremely low res, blurry, and just plain ugly as far as textures go.

Sure, it fits with the art style, as do the low res, blurry textures in Sunshine, but they aren’t impressive from a technical standpoint.
The other graphical effects that those textures were likely gimped for were nice, but more from an artistic point of view.

Like was mentioned earlier, you can look at the fake shadow on the ground in Mario Sunshine, and see the banding, that stretched from light to dark. Bit depth is low. You see banding in Wind Waker as well. Draw distance, isn’t “awesome” in WindWaker either. (as some reviews state)
They don’t render those islands from a distance, they’re simply 2d placeholder outlines until you get closer. Then they shift to a little higher detail as you get close. Go to the ice or fire islands, you’ll see the ice / fire pop into view before your very eyes.
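
The banding being described is just quantization: fewer bits per channel means fewer distinct steps available across a gradient. A tiny sketch of the effect (an illustration only, not tied to either console's actual framebuffer formats):

def quantize_channel(value_8bit, bits):
    # drop an 8-bit channel to `bits` bits, then expand it back to 0-255
    levels = (1 << bits) - 1
    return round(value_8bit / 255 * levels) * 255 // levels

ramp = range(256)   # a smooth 8-bit gradient
print(len({quantize_channel(v, 8) for v in ramp}))   # 256 distinct shades
print(len({quantize_channel(v, 5) for v in ramp}))   # 32 distinct shades -> visible bands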

And you’re making a comparison list, for what?
As was said before, you can take God of War 2, Shadow of the Colossus, or Metal Gear Solid 3, put them against 97% of Gamecube games, and proclaim the PS2 as more powerful from that logic.

Anyone who looks at games from a technical standpoint, who is familiar with how things work, would tell you games like Ninja Gaiden Black, the Splinter Cells, Doom 3, PGR2, etc.. are as impressive as, or more than, any games on Gamecube.

I’ve looked at every specification the Gamecube has, I’ve looked at how things are done in its fixed function pipeline, and what effects it’s capable of doing in hardware, and what it’s not. I’ve seen Factor5’s presentations long ago. It’s simply not the “magically awesome console of awesomeness” you think it is.
It’s not some mysterious magic box that we can only “guess” about, it’s pretty much known top to bottom. If you want to know what it’s like to make a Gamecube or Wii game, find a decent graphics card, and use directX 7.
Done. (or use the horse’s name hack, and homebrew directly)

And no one would look at Twilight princess and the work done in Fable 2, and even remotely suggest the systems were anywhere near each other in power.

I like the art direction in Twilight, but…nope.

And to add to the example of Shadow of the Colossus improvements: notice, I didn’t say add in ambient occlusion or dynamic lighting, or a full shadow system that shows leaves and movement, or weather, destructible environments, or normal or parallax mapping everywhere, or procedural grass, etc..

Nor did I say add more animation blending, or more bones, or vertex attributes. All while making the world bigger, with more objects, more detail, and add higher maximum number of characters on screen at once, with more complex physics and particle systems on top of that.

We’ve already got it figured we needed 3x-4x a PS2 just to clean it up well…
Guess what type of system you’d need to do any of what’s listed?

Hint: it ain’t a Wii.

Sampson

On September 6, 2008 at 3:55 pm

****256MB OF GRAPHICS RAM ON X360 DOES NOT = 256MB OF GRAPHICS DETAIL. IT WASTES SPACE JUST SUPPORTING THE HIGHER NATIVE RESOLUTION, AND RESOLUTION IS NOT GRAPHICS****

That’s why it’s 522MB Wiiboy.
Wii is 91.

522 / 91 = Almost 6x.
We could factor in superior compression formats, particularly for normal maps, but we’ll ignore that, as it’s not needed to prove anything further.

Keep going Wiiboy.
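
The arithmetic behind that "almost 6x", using the totals as counted in this thread (512 MB plus 10 MB of eDRAM for the 360, 64 MB plus 24 MB plus 3 MB for the Wii; treat the component figures as the thread's claims):

x360_mb = 512 + 10       # main ram + edram, as counted above
wii_mb = 64 + 24 + 3     # gddr3 + 1t-sram + embedded framebuffer memory
print(x360_mb, wii_mb, round(x360_mb / wii_mb, 1))   # 522 91 5.7 -> "almost 6x"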

direct comparason videos guys

On September 6, 2008 at 3:58 pm

CAN XBOX FANS PLEASE EXPLAIN HOW WIND WAKER AND TWILIGHT PRINCESS, BOTH GAMECUBE ACTION RPGS,

BOTH GRAPHICALLY  ON FABLE, AND FABLE HAS TERRIBLE IN-GAME LOAD TIMES, YET BOTH TWILIGHT AND WIND WAKER HAVE NO SUCH LOADING WAITS, WITH NEAR-INSTANT LOADING OF THE NEXT AREA COMPARED TO THE LOAD-AND-WAIT ON XBOX 1 (WITH A HARD DRIVE, I'LL ADD)

WAY BETTER GRAPHICS, WAY BETTER LOADING TIMES, AND IT'S GAMECUBE, NOT WII, I'M TALKING ABOUT

AGAIN, XBOX FANS ARE LYING :roll:

direct comparason videos guys

On September 6, 2008 at 4:01 pm

THE POWERVR TILE-RENDERING GPU PRODUCED NEAR 32 BIT COLOUR IN ITS 16 BIT COLOUR MODE

IT'S CALLED CUSTOM, WOD

WII SUPPORTS 32 BIT COLOUR POWER USING, AGAIN, A "CUSTOM" 24 BIT COLOUR MODE

PC GPU TALK AND DIRECTX MICROSOFT TALK DOES NOT APPLY TO THE CUSTOM WII

AGAIN, MICROSOFT-BRAINWASHED XBOX IDIOTS

direct comparason videos guys

On September 6, 2008 at 4:05 pm

A PC PRODUCES GRAPHICS IN A PC WAY, THEREFORE I'LL APPLY PC THINKING TO A WII

THAT'S AN XBOX FAN TALKING OUT OF HIS BACKSIDE. WII IS NOT A WINDOWS PC, IS IT, WOD

I'LL APPLY PETROL ENGINE LOGIC TO A DIESEL ENGINE

I'LL APPLY COAL POWER STATION LOGIC TO A NUCLEAR POWER STATION

I'LL APPLY WATERCOLOUR ART LOGIC TO AN OIL PAINTING

I'LL APPLY HELICOPTER LOGIC TO MY SPACE SHUTTLE

ARE YOU GETTING THE PICTURE, XBOX FANS? I'M CALLING YOU WHAT YOU ARE: STUPID MOTHER FUKERS

direct comparason videos guys

On September 6, 2008 at 4:07 pm

I'LL APPLY PS2 GS LOGIC TO A DREAMCAST POWERVR GPU

I'LL APPLY PSP LOGIC TO MY HIGH-END PC

I'LL APPLY WORM LOGIC TO STUDY BIRDS

I'LL APPLY ELEPHANT LOGIC TO STUDY WHALES

WII IS NEITHER A PC NOR AN XBOX 1, IT'S A WII. WAKE UP, YOU BILL GATES BRAINWASHED. DIRECTX 7/8/9 DOES NOT APPLY TO WII, YOU FUKING MONKEY-BRAINED S

direct comparason videos guys

On September 6, 2008 at 4:13 pm

THE HOLLYWOOD GPU IS PROGRAMMABLE TO ASSEMBLY LEVEL WITH A CUSTOM SHADER/BLENDER CALLED TEV

THEREFORE IT'S, YOU GUESSED IT, "PROGRAMMABLE"

BROADWAY IS AN OUT-OF-ORDER, MEDIA/3D-OPTIMIZED CPU, AGAIN PROGRAMMABLE

CPU AND GPU ARE DESIGNED TO CO-HELP EACH OTHER. THE FSB SUPPORTS THE BANDWIDTH FOR THIS, THE 1T-SRAM SUPPORTS THE SPEED TO DO THIS

AKA

CUSTOM, CUSTOMIZABLE, PROGRAMMABLE

IT'S CALLED USING YOUR GOD-GIVEN COMMON SENSE AND NOT LISTENING TO ANTI-NINTENDO MARKETING LIES

LIKE THE EA GUY ETC

AGAIN I TYPE LOGICAL FACT AND THE XBOX FAN CANNOT SEE THE LOGICAL FACT, BECAUSE HE HAS NO ABILITY TO THINK FOR HIMSELF. HE NEEDS MARKETING LIARS TO DO IT FOR HIM

FABLE 1:  GRAPHICS AND VERY LONG LOAD TIMES

TWILIGHT PRINCESS: FANTASTIC GRAPHICS, SHORT LOAD TIMES. YET XBOX WAS "BETTER"?

FANBOY BULL

WII = 2.5X XBOX, PLUS 3D MOUSE AND 3D MOTION. FACT

direct comparason videos guys

On September 6, 2008 at 4:16 pm

FACTOR 5: THE HOLLYWOOD GPU IS HIGHLY CUSTOMIZABLE

FACTOR 5: HOT-WIRED HIGH-POWER EFFECTS ON HOLLYWOOD CAN BE RE-HACKED TO SUPPORT OTHER FANCY THINGS

FACTOR 5: TEV SUPPORTS SHADING AND PS3/X360-LIKE SHADING EFFECTS

FACTOR 5: WII'S FILL RATE AT 480P IS INSANE

FACTUAL EVIDENCE VS XBOX FAN. HHHMMMMMMMM :roll:

direct comparason videos guys

On September 6, 2008 at 4:19 pm

FABLE 1 LOOKS LIKE CRAP = XBOX. FABLE 1 LOADS LIKE CRAP = XBOX

WIND WAKER LOOKS AMAZING AND LOADS DAMN FAST = GAMECUBE

TWILIGHT PRINCESS LOOKS AMAZING AND LOADS DAMN FAST = GAMECUBE

IS THE FACT SINKING IN? XBOX WAS OVERRATED CRAP. IT COULDN'T LOAD OR STREAM AS FAST AS GAMECUBE, EVEN WITH ITS HARD DRIVE

PATHETIC

XBOX FSB: 1GB

GAMECUBE FSB: 5GB-PLUS

WII FSB: 8GB

I REST MY CASE :roll:

direct comparason videos guys

On September 6, 2008 at 4:21 pm

XBOX RAM: 50 TO 100 NANOSECONDS

GAMECUBE RAM: 10 NANOSECONDS

WII RAM: 5 NANOSECONDS

XBOX DISC DRIVE TO RAM: SLOW

GAMECUBE DISC DRIVE TO RAM: FAST

WII DISC DRIVE TO RAM: FASTER STILL

FACT VS FANNING :roll:

direct comparason videos guys

On September 6, 2008 at 4:23 pm

WII: 88MB OF FAST, BALANCED RAM PLUS PEAK DATA COMPRESSION OF 4-TO-1

XBOX: 64MB OF SLOW, UNBALANCED RAM AND NO SUCH DATA COMPRESSION

CASE RESTED

direct comparason videos guys

On September 6, 2008 at 4:26 pm

SO ON DIE, XBOX HAS A 128K GPU CACHE AND A 128K CPU CACHE

ON DIE, WII HAS A 3MB CACHE PLUS A 24MB EXTENSION OF FAST MEMORY, PLUS A 256K CPU CACHE, PLUS 4-TO-1 DATA COMPRESSION AND 6-TO-1 TEXTURE COMPRESSION

YES, THAT'S THE SAME POWER, IS IT NOT, XBOX FANS? IN YOUR DREAMS IT IS

direct comparason videos guys

On September 6, 2008 at 4:28 pm

XBOX TEXTURES IN CACHE = 128K

WII TEXTURES IN CACHE = 6MB

OH LOOK, XBOX CAN MATCH WII. IN A FANTASY DREAM WORLD IT CAN

ZZZZZZZZZZZZ WAKE UP, XBOX TARDS

direct comparason videos guys

On September 6, 2008 at 4:29 pm

ASSEMBLY-LEVEL CODING OF CPU AND GPU = HIGHLY FLEXIBLE AND HIGHLY PROGRAMMABLE, DOES IT NOT = WII

direct comparason videos guys

On September 6, 2008 at 5:08 pm

METROID PRIME 1/2/3 NOT ONLY OUT-GRAPHICS HALO 2, IT DOES IT WITH VISORS AND VISOR EFFECTS FOR OTHER ENVIRONMENT VIEWS. SO WHERE YOU GET ONE SET LOOK IN HALO, YOU HAVE MULTIPLE LOOKS IN PRIME, PROVING FURTHER HOW POWERFUL GAMECUBE WAS

TWILIGHT PRINCESS NOT ONLY S ON FABLE GRAPHICALLY, IT DOES IT TWICE OVER, AS YOU ALSO HAVE THE TWILIGHT WORLD'S WEIRD LOOK THAT GAMECUBE HAS TO SUPPORT

AGAIN, MORE EVIDENCE OF ACTUAL IN-GAME "POWER", AND WITHOUT LOAD TIMES

direct comparason videos guys

On September 6, 2008 at 5:09 pm

FABLE: 16 BIT COLOUR. TWILIGHT PRINCESS: OPTIMIZED 24 BIT COLOUR

AGAIN, XBOX FANS PROVEN LIARS

direct comparason videos guys

On September 6, 2008 at 5:19 pm

"GAMECUBE DOESN'T DO SURROUND," THE XBOX COMMUNITY SCREAMED

GAMECUBE FULLY SUPPORTS 5.1 IN-GAME SURROUND SOUND IN THE PRO LOGIC II FORMAT

AND 7.1 PRO LOGIC IIX, SO HOW CAN IT NOT SUPPORT IT, DUM FUKS

WII SUPPORTS 5.1, 7.1 AND OF COURSE THE IMMERSION SPEAKER IN THE WIIMOTE FOR IMMERSIVE GAMEPLAY SOUND EFFECTS THAT CAN ALSO BE BLENDED WITH SURROUND SOUND FOR DEEP SOUND GAMEPLAY ENHANCEMENTS

IT'S CALLED CORRECT CONSOLE DESIGN

EXAMPLE OF THIS: TWILIGHT PRINCESS

THROW YOUR BOOMERANG. AS IT LEAVES YOUR HAND YOU HEAR IT VIA THE WIIMOTE SPEAKER. AS IT TRAVELS THROUGH THE AIR, THE 5.1 SURROUND TRACKS ITS FLIGHT AND YOU HEAR IT GO AROUND THE ENVIRONMENT. AS IT MAKES ITS WAY BACK TO LINK'S HAND, THE WIIMOTE SPEAKER FADES THE RETURNING BOOMERANG SOUND IN UNTIL IT GETS CLOSE, AND AS IT MAKES CONTACT WITH LINK'S HAND YOU HEAR THAT IN THE WIIMOTE SPEAKER

THAT'S HOW GAMING SURROUND IS DONE. IMPOSSIBLE ON PS3, X360 ETC, ONLY POSSIBLE ON WII

AGAIN, GAMING FACT VS XBOX FANNING

Sampson

On September 6, 2008 at 5:39 pm

:???: Don’t give yourself a stroke there Wiiboy.

****IT'S CALLED CUSTOM, WOD. WII SUPPORTS 32 BIT COLOUR POWER USING, AGAIN, A "CUSTOM" 24 BIT COLOUR MODE. PC GPU TALK AND DIRECTX MICROSOFT TALK DOES NOT APPLY TO THE CUSTOM WII. AGAIN, MICROSOFT-BRAINWASHED XBOX IDIOTS****

Nope Wiiboy. 24 bit color is fine. 32 bit is to make addressing easier.
On Xbox, the extra 8-bit alpha can store transparency values, or in the case of a Z value, shadow values.
Wii lacks a stencil buffer after all, so such things cut into color bits.

The “custom” part comes in when the rendering gets dropped to 16 bit color and 16 bit z, if you wanted to do aa or store an alpha value.
You can see the evidence of a 16-bit z buffer in Wind Waker. Just turn the res to 480p.

****WII IS NEITHER A PC NOR AN XBOX 1, IT'S A WII. WAKE UP, YOU BILL GATES BRAINWASHED. DIRECTX 7/8/9 DOES NOT APPLY TO WII, YOU FUKING MONKEY-BRAINED S****

I said, if you wanted to know “what it’s like” to create graphics for Wii, get a decent gpu and use directX 7. T&L and multi-texturing are pretty much DX6/7 style processing. CPU’s are programmable, Gpu is fixed function.

Like the analogy given earlier, fixed function is like microwavable dinners, programmable is more like cooking things from ingredients.

****FACTOR 5: HOT-WIRED HIGH-POWER EFFECTS ON HOLLYWOOD CAN BE RE-HACKED TO SUPPORT OTHER FANCY THINGS****

Sure, Factor5 can find hacks to mix the mashed potatoes with the peas, or cover the chicken in the mashed potatoes gravy, but that’s different from making your own recipes.
You wouldn’t understand that, but it’s true.

Sampson

On September 6, 2008 at 5:53 pm

****FABLE: 16 BIT COLOUR. TWILIGHT PRINCESS: OPTIMIZED 24 BIT COLOUR. AGAIN, XBOX FANS PROVEN LIARS****

Again, we’ve established that you’re lost on this subject. You just reiterated that with your “custom” schlock.
And it’s ok if Gamecube renders at 16 bit. You’ll live.
I was just playing the old Counter Strike on the pc, which used 8 bit palettized textures (if you look it up). The Xbox version used 24 and 32 bit textures. Still a decent game.

And to add, what’s so special about 2x an Xbox?
I’d still call that a “piece of ” by today’s hardware standards.
Something to look forward to and hope for though, eh Wiiboy?

ManOfTeal

On September 6, 2008 at 6:36 pm

Who gives a ?

Go Marlins!!!!

direct comparason videos guys

On September 6, 2008 at 6:57 pm

WII = THE FASTEST-SELLING CONSOLE IN HISTORY, AND STILL GOING

X360 = MULTIPLE PRICE DROPS, MULTIPLE MODELS, AND DYING IN 3RD PLACE

WII HAS THE NUMBER 1 USERBASE YET HAS ONLY BEEN AVAILABLE FOR HALF OF X360'S LIFETIME

X360: LONG LOADING, MANY BUGS, GLITCHES AND PATCHES REQUIRED

WII IS FAST LOADING AND ITS GAMES ARE LESS BUGGED OUT

IT'S ALL FACT, DON'T ARGUE AGAINST IT

direct comparason videos guys

On September 6, 2008 at 7:00 pm

TWILIGHT PRINCESS CLEARLY AND BLATANTLY S ON FABLE, FACT, AND HAS NONE OF THE XBOX'S TERRIBLE LOADING ISSUES

WIND WAKER LOOKS BETTER THAN FABLE AND AGAIN LOADS MUCH FASTER

F-ZERO IS STILL TO THIS DAY THE FASTEST RACER EVER MADE. IT'S A GAMECUBE EXCLUSIVE, NO OTHER RACER IN EXISTENCE RUNS AS FAST, FACT, AND AGAIN IT HAS BETTER GRAPHICS THAN ANY XBOX FUTURISTIC RACER AND FASTER LOADING

SO AGAIN, HOW IS THE 2.5X MORE POWERFUL WII = TO, COUGH, XBOX

direct comparason videos guys

On September 6, 2008 at 7:03 pm

SO WHAT WE HAVE CONFIRMED BEYOND ALL DOUBT IS THAT XBOX FANS FAN

LONG LOADING TIMES

POOR BUILD QUALITY

OUTDATED CONTROL PADS

GLITCHY, CRASHY, BUGGY, PATCH-REQUIRING GAMES

POP-UP GRAPHICS

POP-UP TEXTURES

NO HISTORY WITHIN THE INDUSTRY

NO 1ST PARTY MERIT

AND TELLING LIES

YOU'RE NOT A VERY NICE BUNCH, ARE YOU

AGAIN, THE ABOVE IS FACT

direct comparason videos guys

On September 6, 2008 at 7:05 pm

I'VE NEVER EVER SEEN AN XBOX 1 GAME SUPPORT 7.1

YET GAMECUBE GAMES DID. HMMMMMMMM

AGAIN, THE TRUTH

COLD FEAR LOOKS LIKE ASS, RES EVIL 4 ON GAMECUBE/WII LOOKS GREAT. NOW RUN ALONG, KIDDIES, WITH YOUR CHAV TEENAGER CONSOLE WITH NOTHING BUT SHOOTING OR RACING ON IT

direct comparason videos guys

On September 6, 2008 at 7:06 pm

XBOX: SCRATCHING DISCS AND CATCHING FIRE

GAMECUBE: AMAZING LEVELS OF BUILD QUALITY

AGAIN, THE TRUTH

direct comparason videos guys

On September 6, 2008 at 7:08 pm

NINTENDO JUST CONFIRMED NO HARD DRIVE FOR WII

SOLID STATE FLASH AND HOLOGRAPHIC DRIVES FOR WII. HARD DRIVES ARE SLOW, UNRELIABLE AND EASILY BROKEN

AGAIN, NINTENDO KNOWS BEST. MICROSOFT AND ITS FANS KNOW SWEET FUK ALL ABOUT CONSOLE DESIGN

Sampson

On September 6, 2008 at 8:33 pm

:lol:

Lets go over your old Gamecube vs Xbox bull.

Every single developer who’s ever commented on “Xbox vs Gamecube” will either say, Gamecube is overall weaker, or “just as good”.
Every single technology centric website, will say, “Xbox is overall more powerful”, or “Gamecube is just as good”.

The number who’ll say “Xbox is overall more powerful” is nearly all of them.
The number who will say “Gamecube is just as good” are a tiny few.

Next to none of them will say, “Gamecube was overall more powerful”.

Those are the developers own assessments. Fanboyism doesn’t change that.
Misquoting them doesn’t change that.
Your own blatantly biased, blind assessment doesn’t change that.

That is also exactly what the specifications imply. If we took specs at face value, we’d conclude that Xbox was far more powerful. If we look at actual performance, Xbox is still ahead.

Developer’s have spoken. The specifications have spoken.
It’s definitive across the board.

That’s just the way it is Wiiboy, and you’ll simply have to come to terms with that.

As to the Wii, doesn’t it bother you, that you’re actually having to try to argue the idea that Wii is more powerful than a system that’s more than a half decade old now?

You attempt to give your biased, infomercial-like bull comparisons to “prove” it, yet for any game you attempt to list, there are still counters on Xbox. Ninja Gaiden, Doom 3, the Splinter Cells, etc..
And that’s for a system from the turn of the century.

You can dream of a “2.5x an Xbox” if you’d like (it’s bull from any angle), and you can cling to it.
But what is that?

It would still be a cheap, over-priced, underpowered, “piece of ” from a specifications point of view.

No one would disagree on build quality. External design is nice. Sales figures are awesome. First party games are good. (few and far between, but very good) And they made a nice effort trying to be innovative with the controls.

That’s about it. The difference between you and Xbots, is that if Microsoft had taken an Xbox, even fully doubled the clock rate, and doubled the ram, none of them would be here, working their tiny little brains into a cap-lock frenzy, trying to argue that it isn’t what its specifications say it is.

Sampson

On September 6, 2008 at 8:40 pm

****IV NEVER EVER SEEN A XBOX 1 GAME SUPPORT 7.1****

7.1 what?

direct comparason videos guys

On September 7, 2008 at 9:27 am

7.1 surround sound. What did you think, 7.1 graphics? YOU TARD

direct comparason videos guys

On September 7, 2008 at 9:34 am

I DON'T EVER RECALL A GAMECUBE SETTING PEOPLE'S CARPETS ON FIRE

I CLEARLY RECALL A GLOBAL RECALL OF XBOXES AFTER MANY FIRE INCIDENTS. AGAIN, A GAMING INDUSTRY FACT XBOX FANS IGNORE

GAMECUBE AND WII NEVER SCRATCHED DISCS. I RECALL CLEARLY THAT BOTH XBOX 1 AND X360 ARE RENOWNED FOR SCRATCHING DISCS. AGAIN, A FACT XBOX FANS PRETEND ISN'T HAPPENING

THAT'S AN UNBELIEVABLE LEVEL OF FANBOYISM RIGHT THERE

TYPICAL EXAMPLE OF FANBOYS AT A CONSOLE'S LAUNCH:

PS FANS: OH, THERE ARE MANY PROBLEMS, BUT SONY WILL DEBUG THEM, IT WILL ALL BE FINE IN THE END, REGARDING BUILD QUALITY, OVERHEATING, DISC SCRATCHING, DRIVE FAILURE ETC

IT'S CALLED A FANBOY'S EXCUSE

AGAIN, XBOX FANS: OH, IT'S ONLY BECAUSE IT'S A NEW CONSOLE, THEY'LL IRON OUT THE BUGS SOON

STILL TO THIS DAY X360 HAS TERRIBLE BUILD ISSUES

NINTENDO FANS: NO SUCH TALK, NO SUCH FANBOY EXCUSE BULL. "WHY"?

BECAUSE NINTENDO'S CONSOLE BUILD QUALITY IS, YOU GUESSED IT, SECOND TO NONE, AND THEIR CONSOLES ARE CORRECTLY BUILT FROM DAY ONE

AGAIN, INDUSTRY FACT

direct comparason videos guys

On September 7, 2008 at 9:44 am

NINTENDO WINS AWARDS FOR BUILD QUALITY, AND EVERY SINGLE YOUTUBE VIDEO OF GAMECUBE VS XBOX VS PS2 CLEARLY SHOWS GAMECUBES STILL WORKING AFTER TERRIBLE DAMAGE

BOTH PS2 AND XBOX GAVE UP AFTER A SLIGHT KNOCK. PATHETIC BUILD QUALITY

AGAIN, IT'S 100% FACT

I CAN SLANDER SONY FANS AND XBOX FANS EVERY DAY FOR THE NEXT 10 YEARS ON THIS THREAD, AS THERE'S NOTHING IN THIS INDUSTRY/CULTURE THAT GETS PAST THE MIGHTY

CUBEBOY/WIIBOY101

THE MOST "FACTUAL" GUY ON THE NET

LIES ARE FOR FANBOYS. HONEST POSTING IS FOR CUBEBOY, A GAMER BORN THROUGH AND THROUGH WHO HAS GAMED SINCE 1980

AND SOME PUNK TEENAGER THINKS HE KNOWS "GAMING" BETTER THAN MII? OK, THE MERE BOY KNOWS MORE THAN THE GAMING MASTER ON HIS MOUNTAIN. YEAH, RIGHT

FACT IS FACT: SONY AND MICROSOFT ARE NOT GAMING COMPANIES AND DON'T CARE ABOUT GAMING AND GAMERS

THEY'RE IN A HOME SET-TOP BOX / OPERATING BOX / NET BOX WAR

BILL GATES FEARED PLAYSTATION CULTURE AS A THREAT TO WINDOWS AND PC

SONY FEARED WINDOWS/PC WOULD BECOME THE HOME TV BOX

RESULT: A TV SET-TOP BOX WAR BETWEEN SONY AND BILL GATES, WITH GAMING AS THE PLOY TO GET US TO CHOOSE FORMATS, AND WITH ACTUALLY NO CARE IN THE WORLD ABOUT GAMING. GAMING WAS THE FOOT IN THE DOOR. THE REAL WAR IS SONY VS BILL GATES OVER WHO CONTROLS THE GLOBAL

TV SET-TOP BOX OF THE FUTURE. "GAMING" IS WHAT WILL SUFFER

IF YOU CAN'T SEE THAT THIS IS A COMPUTER DO-IT-ALL BOX FORMAT WAR

THEN YOU ALSO CAN'T SEE THAT THE USA IS IN IRAQ FOR OIL AND NOT GLOBAL PEACE

TALK ABOUT A BRAINWASHED GENERATION. YOU'RE PATHETIC

NINTENDO = SAVE GAMING

SONY AND MICROSOFT = DESTROY GAMING

IF YOU CAN'T SEE THAT, YOU'RE CLEARLY FUKING BRAINWASHED

direct comparason videos guys

On September 7, 2008 at 9:48 am

IF SONY AND BILL GATES CARED ABOUT GAMING, WHERE ARE THE GAMING INNOVATIONS? THEY HAVEN'T DONE ANY. WHERE ARE THE FAST LOADING TIMES? WHERE'S THE COMMON SENSE IN THEIR DESIGN? WHERE'S THE CHILD-PROOF BUILD QUALITY? IT'S NOT THERE, CLEARLY

COPYING NINTENDO'S CONTROL PADS, BUILDING CONSOLES THAT BREAK DOWN WHEN A CHILD SNEEZES, AND ALLOWING THAT CONSOLE TO LOAD SLOWLY

IS NOT IN THE GAMER'S INTERESTS, IS IT (TRY ENGAGING A DAMN BRAIN, WILL YOU)

DVD PLAYBACK AND PAY-FOR-DOWNLOAD TV EPISODES ARE NOT GAMING

3D MOUSE AND 3D MOTION "ARE" GAMING

SO WHO'S THE GAMING COMPANY AND WHO'S THE

AGAIN, INDUSTRY FACT

YOU'LL LEARN ONE DAY WHEN YOU GROW UP

direct comparason videos guys

On September 7, 2008 at 9:51 am

SO NINTENDO TAKES US FORWARD WITH 3D MOUSE, 3D MOTION AND OTHER CLEVER INNOVATIONS

THEY STICK TO THEIR FAST-LOADING TRUE CONSOLE DESIGNS AND STICK TO THE IDEA THAT A CONSOLE SHOULD BE CHILD-PROOF AND SAFE TO USE

MICROSOFT AND SONY TRY SELLING US DISC FORMATS FOR MOVIES AND OVERPRICED TV SHOW DOWNLOADS

AND JUST SLAP A NINTENDO-COPY CONTROL PAD IN THE BOX

AND YOU CALL THEM GAMING CONSOLE MAKERS? DON'T MAKE MII LAUGH, YOU'RE CLEARLY UNEDUCATED

direct comparason videos guys

On September 7, 2008 at 10:03 am

SO FIFA BEING PS2-ISH ON WII IS NOT WII GRAPHICS. ARE WE ALL GOING TO AGREE I WON AND THE EA GUY LIED, OR AREN'T YOU MAN ENOUGH TO ADMIT DEFEAT……..

wii is like 10x ps2 at texture performance @ 480p, so please stop referring to lazy ps2 porting as wii graphics

ps2 gs unit: 4mb edram, no direct access to main ram, no ability to read or decompress compressed data, no ability to read or decompress compressed textures, and it has to use pathetic 4 bit / 8 bit colour and textures as anything bigger is just too much for the poor gs unit to handle. 4mb of 4/8 bit data, that's pathetic even for last generation

wii: 3mb plus 24mb all on the gpu chip, all lightning fast, all capable of holding and sending compressed data and compressed textures at lightning speed, and all colour/texture data is optimized 24 bit, not pathetic 4 bit

24mb, if it's all textures at 6-to-1 compression, = 144mb vs ps2's 4mb max, with only say 1mb for textures

3mb edram on the wii gpu, 1mb for textures, x 6 = 6mb

24+1 = 25, 25 x 6 = 150mb of textures on the gpu chip before even touching gddr3 main ram

max data / z compression = 4-to-1

24mb x 4 = 96mb of graphics data

it doesn't matter any way you cut it

wii is at the very least 5x ps2 @ 480p rendering

textures alone would be more like 10x ps2, easy

yet according to liars, fifa wii has wii-level graphics, not slightly polished ps2

STOP LYING GUYS, IT'S EMBARRASSING

direct comparason videos guys

On September 7, 2008 at 10:11 am

NOW THIS IS DAMN FUNNY. ACCORDING TO BILL GATES, XBOX WHEN LAUNCHED WAS CAPABLE OF, COUGH COUGH, 110 MILLION POLYGONS

110 MILLION POLYGONS. FUNNILY ENOUGH, XBOX 1 NEVER EVER BEAT 12-15 MILLION POLYGONS, AND THAT'S NOWHERE NEAR 110 MILLION POLYGONS, NOW IS IT? SO BILL GATES IS A LYING MOTHER ER, CORRECT

NINTENDO STATED GAMECUBE WAS CAPABLE OF 6 TO 12 MILLION POLYGONS, BUT THEY LIED ALSO, IN A NICE, CONSERVATIVE WAY, AS IN-GAME GAMECUBE ACTUALLY HIT

20 MILLION POLYGONS

AGAIN, INDUSTRY FACT

20 MILLION POLYGONS AND 24 BIT COLOUR VS 12 MILLION POLYGONS, 16 BIT COLOUR AND LONG LOADING TIMES

XBOX AIN'T

direct comparason videos guys

On September 7, 2008 at 10:26 am

DID FIFA ON XBOX HAVE XBOX GRAPHICS? DID FIFA ON GAMECUBE HAVE GAMECUBE GRAPHICS?

ANSWER: NO. BOTH HAD SLIGHTLY POLISHED PS2 PORT GRAPHICS, AS IS STILL THE CASE WITH WII

AGAIN, XBOX FANS CONTRADICT THEMSELVES LIKE FANBOYS USUALLY DO

FACT IS FACT: THERE'S NO WII ENGINE IN EXISTENCE AT EA. THEY'RE LAZY AND THEY PREPPED FOR PS3 LIKE THE IDIOTS THEY ARE

COULDN'T THEY SEE WII COMING? ARE THEY BLIND?

Sampson

On September 7, 2008 at 10:58 am

****7.1 surround sound what did you think 7.1 graphics YOU TARD****

Yeah, that’s what I thought you meant. Those are pretty good ears you’ve got there Wiiboy, considering Gamecube and now Wii support nothing more than Pro Logic II, with its generic signal processor that’s limited to 2d.
Which, if you knew how this stuff worked, you’d know is emulated surround sound, and not discrete 5.1 Dolby Digital.
(And certainly not 7.1, lol)

Xbox’s sound chip encodes sound into 5.1 in real time, straight from 3-dimensional positional data. Wii / Gamecube takes a high quality stereo signal, and breaks it up into “awesomeness”.

You can read your old buddy “The Hank”.
http://hankfiles.pcvsconsole.com/answer.php?file=118

And as to the rest of what you just regurgitated for the hundredth time: your polygon data is bull, and I’m sure it’s been explained to you before up there, but the exact same type of T&L unit in Gamecube is in Xbox, except Xbox has 2 of them, they’re clocked faster, and they’re far more programmable on their own.
Gamecube can transform 1/4th its clock rate, max. Xbox can do half its clock rate, max. Those are the facts, not your fanboy delusion.
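
That "1/4th its clock" vs "half its clock" claim works out like this; the 162 MHz and 233 MHz GPU clocks are the commonly cited figures, and the per-clock transform rates are Sampson's claim, so treat the result as the thread's arithmetic rather than a measured benchmark:

gc_clock_mhz, gc_verts_per_clock = 162, 0.25       # "1/4th its clock rate, max"
xbox_clock_mhz, xbox_verts_per_clock = 233, 0.5    # "half its clock rate, max"
print(gc_clock_mhz * gc_verts_per_clock)           # ~40.5 M transformed vertices/s
print(xbox_clock_mhz * xbox_verts_per_clock)       # ~116.5 M transformed vertices/s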

Sampson

On September 7, 2008 at 11:09 am

Again, Gamecube and Wii = fake / emulated 5.1.
That’s not putting the system down, that’s really what it is.

Sampson

On September 7, 2008 at 11:36 am

More reading material from Dolby themselves.

http://www.dolby.com/consumer/technology/prologic_II.html

direct comparason videos guys

On September 7, 2008 at 1:56 pm

ps2 graphics ram: 4mb edram plus say 16mb of main ram, indirect over the cpu bus, all terrible latency and very poor compression modes, dealing in 4/8 bit textures and colour…. AND BANDWIDTH-BOTTLENECKED OVER AN FSB AT 1.5GB SHARED WITH THE CPU….

IF USING ALL THE 1T-SRAM, WII GRAPHICS RAM IS 3MB EDRAM, 24MB 1T-SRAM, AND SAY HALF OF THE 64MB GDDR3, SO 32MB GDDR3. 32+24+3 = 59MB, PLUS 6-TO-1 TEXTURE COMPRESSION AND 4-TO-1 DATA, DISPLAY AND Z COMPRESSION, ALL IN OPTIMIZED 24 BIT COLOUR TEXTURES

10X PS2-LEVEL TEXTURE PERFORMANCE AT 480P. FIFA WII = PS2 PORT

LIKE I STATED ALL THE WAY UP THERE AT THE TOP OF THE THREAD

direct comparason videos guys

On September 7, 2008 at 2:04 pm

IF FIFA WII HAS WII GRAPHICS, THEN EXPLAIN THE VASTLY SUPERIOR MARIO GALAXY AND METROID PRIME 3, AND AT TWICE THE FRAMERATE. ANYONE CAN PORT PS2 CODE TO WII

GAMECUBE RAN PS2 PORTS AND ADDED POLISH AT 40% OF ITS ACTUAL POWER/PERFORMANCE LEVEL. PS2-PORTED CODE RUNS ON GAMECUBE IN ITS SLEEP, LET ALONE THE WII

IF AN AVERAGE PORT-AND-POLISH FROM PS2 TO GAMECUBE USED 40% OF GAMECUBE'S POWER, THEN CLEARLY WII IS VASTLY MORE CAPABLE THAN FIFA WII GRAPHICALLY

STOP CONFUSING MARKETING TALK WITH FACTUAL SPEC TALK. EA LIED

direct comparason videos guys

On September 7, 2008 at 2:21 pm

RESOLUTION IS ANOTHER IDIOT'S LOVE CHILD. RESOLUTION IS NOT GRAPHICS

if i display a crappy, plain ps2 concrete wall out of, say, hulk on ps2, you know those plain, no-detail concrete walls that plagued ps2 games because ps2 couldn't texture,

well, if i render that same wall at native 1080p, what happens? does the wall become more detailed, with more detailed things on it and more real-time shader/blender effects and bump mapping?

OF COURSE NOT. IT'S THE SAME WALL PS2 WAS DRAWING AT 480I. I'VE SIMPLY REDRAWN THE WALL ON MY PC AT NATIVE 1080P, THEREFORE THE RESOLUTION HAS CHANGED, NOT THE GRAPHICS

BUT I'M USING 3X-6X MORE RAM AND PIXEL FILL RATE TO SIMPLY PRODUCE THE SAME DAMN WALL IMAGE

I'VE JUST PROVEN TO AN HD-GRAPHICS-BELIEVING PS3/X360 FANBOY IDIOT THAT GRAPHICS AND RESOLUTION ARE TWO COMPLETELY DIFFERENT THINGS

HAS COMMON SENSE SUNK IN YET?

WII CAN DO THE X360-LIKE GRAPHICS BUT NOT IN HD. DOES THAT FACT NOW SINK IN? WII IS CURRENT-GEN GRAPHICS MINUS NATIVE HD

IT'S NOT HARD. SIMPLY THINK FOR YOURSELVES INSTEAD OF BELIEVING SONY'S HD TV MARKETING HYPE
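
For what it's worth, the "same wall, more RAM and fill rate" point can be put in numbers: pixel count (and so framebuffer size and fill) scales with resolution even when the content doesn't change. A quick sketch, comparing a rough 480-line raster to native 1080p:

def pixels(width, height):
    return width * height

sd = pixels(640, 480)      # roughly the 480i/480p raster being talked about
hd = pixels(1920, 1080)    # native 1080p
print(sd, hd, round(hd / sd, 1))   # ~6.8x the pixels for the exact same scene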

direct comparason videos guys

On September 7, 2008 at 3:37 pm

100% correct console power ranking,
least to most powerful:

ps2
xbox
gamecube
wii tops the list. that's simply the truth

wii being = to or weaker than xbox, when it's 2.5x a gamecube, is impossible. simply impossible

direct comparason videos guys

On September 7, 2008 at 3:42 pm

fable character models: 2k polygons

link in twilight princess: 8k polygons

leon in res evil 4, gamecube version: 10k polygons. ps2 version: sub-5k polygons

gears of war characters: 14k polygons on average

x360 = 14k polygon counts, and gamecube is hitting 8-10k polygons. wii can do gears of war at native 480p, simply the truth

bioshock's dev/publisher also just confirmed bioshock is possible on wii, with enhanced gameplay and controls obviously

look at xbox and ps2 character polygon counts: 1k to 5k polygons. gamecube is 5k to 10k polygons

simply the damn truth

wii destroys xbox and clearly pisses on ps2

wii can (to the human eye) near the visual experience of ps3 and x360, but do it loading-free and with advanced 3d motion, 3d mouse, and now 1-to-1 advanced 3d gameplay

that's the truth. it's called a NINTENDO Wii

direct comparason videos guys

On September 7, 2008 at 3:46 pm

twilight princess is near fable 2 visually and beats the pants off fable 1. it also out-load-speeds both fable 1 and 2, yet it's gamecube graphics, not wii

wii and gamecube share identical graphics in twilight princess. the only difference is gamecube's video out is 4:3 interlaced and wii's is 16:9 widescreen and 480p, not i

the video out is optimized, the graphics are identical. the same applies to res evil 4 wii edition

they both have gamecube, not wii, graphics, yet both clearly  on any xbox game visually

wii is 2.5x the xbox/gamecube level, without a doubt

direct comparason videos guys

On September 7, 2008 at 3:48 pm

10x ps2 textures, with full support for shading, blending, combining, bump mapping, normal mapping and custom mapping

how is that sub-xbox, assholes

direct comparason videos guys

On September 7, 2008 at 4:53 pm

FASTEST RAM IN A CONSOLE: WII'S 1T-SRAM-R AND EDRAM 1T-SRAM-R @ 486MHZ

SECOND FASTEST RAM EVER IN A CONSOLE: THE 1T-SRAM IN GAMECUBE

3RD FASTEST RAM: PS3'S XDR RAMBUS RAM

4TH: WII'S GDDR3

5TH: PS3'S AND X360'S GDDR3

6TH: XBOX 1'S SDRAM

7TH: PS2'S RAMBUS RAM

1T-SRAM-R AND EDRAM 1T-SRAM-R ARE SIMPLY UNMATCHED. ADD THE FACT THAT IT'S ALL CLOCK-SYNCED AND DIE-EMBEDDED, AND WII'S 1T-SRAM IS BLINDINGLY FAST AND FULLY SUPPORTS COMPRESSED DATA AND TEXTURES

it's truly fast, as in cache fast

direct comparason videos guys

On September 7, 2008 at 4:57 pm

wii's 1tsram-r is 100% effective performance, meaning its theoretical performance is 100% real in real time

the average pc is 50% ram effective, and ps3 and x360 are not much more than that

think beyond the marketed way you're told to think. ram latency and balance are very important to games performance

EXAMPLE

multi-layered textures on ps3/x360 unreal 3 engine games: POP-UP TEXTURES, POP-UP TEXTURE LAYERS

PS2 KILLZONE: POP-UP TEXTURES

GAMECUBE/WII: NO POP-UP. WHY?

THE SPEED OF THE RAM IS SIMPLY AMAZING, THAT'S WHY

Sampson

On September 7, 2008 at 8:28 pm

*****WII CAN DO THE X360-LIKE GRAPHICS BUT NOT IN HD. DOES THAT FACT NOW SINK IN? WII IS CURRENT-GEN GRAPHICS MINUS NATIVE HD, wii can (to the human eye) near the visual experience of ps3 and x360, but do it loading-free and with advanced 3d motion, 3d mouse, and now 1-to-1 advanced 3d gameplay****

Nah. Your eyes maybe, and perhaps those with cataracts, or the really nearsighted.
For everyone else, there’s a pretty noticeable difference.

Not even listing animation, or any other aspect of processing.

****GAMECUBE/WII NO POP UP AND IS LOADING FREE****

And yet….there’s pop-up everywhere Wiiboy.
There’s loading everywhere.
Funny how that works.

Sampson

On September 7, 2008 at 8:38 pm

*****it's truly fast, as in cache fast****

Caches don’t have sub 4gb of bandwidth.
Caches are more like 120gb of bandwidth.

direct comparason videos guys

On September 8, 2008 at 6:54 am

cache-like 5 nanosecond latency = wii ram

pathetic pc-like main ram latency of 50 or even 100 nanoseconds = x360 and ps3

1tsram-r is like 15x faster in latency terms than xbox 1's sdram

when wii has used up its cache, it still has cache-speed die-embedded ram to feed from, maintaining cache-like speed even outside of the cache

EFFICIENT, EFFECTIVE, CUSTOM

THOSE "KEY" WORDS AGAIN THAT THE XBOX FAN SEEMS UNABLE TO COMPUTE IN HIS BACKWARD HEAD

direct comparason videos guys

On September 8, 2008 at 6:59 am

NO ONE CAN ARGUE THE "FACT"

WHERE XBOX 1 HAS 128K CACHES FOR EACH PROCESSOR, I.E. A 128K LEVEL 2 CPU CACHE AND A 128K GPU CACHE,

WII HAS 3MB ON THE GPU, 256K ON THE CPU, AND 24MB ON THE GPU DIE THAT FEEDS BOTH GPU AND CPU DIRECTLY

27MB-PLUS VS 128K X 2. ADD TO THE HUGE 27MB-PLUS THE FACT THAT IT ALL FULLY SUPPORTS DATA COMPRESSION, Z COMPRESSION AND TEXTURE COMPRESSION

WII'S 27MB OF SRAM IS LIKE PS2'S 4MB EDRAM, WII'S OWN 3MB EDRAM AND DREAMCAST'S WHOLE 8MB VRAM COMBINED, ALL DEDICATED TO INTENSE GRAPHICS WORK BEFORE EVER HITTING MAIN EXTERNAL RAM

WII IS EASILY 5X PS2, AND A LOT MORE IN KEY AREAS. FIFA WII = POLISHED PS2 CODE, NOT TRUE WII

THAT IS NOT A FANBOY STATEMENT, IT'S A STATEMENT OF FACT

direct comparason videos guys

On September 8, 2008 at 7:04 am

GRAPH PS2 AT 1 AND PS3 AT 100

WHERE ON THAT GRAPH WOULD I PUT WII GRAPHICS-WISE? I'D GUESS AT 60-PLUS IN TERMS OF VISUAL CAPABILITY, WHAT YOU'RE ACTUALLY SEEING ON SCREEN

THAT'S CLEARLY MORE PS3 THAN IT IS PS2. THE WII IS COMPARABLE TO CURRENT GEN, NOT LAST GEN

AS TABLE TENNIS WII, AN X360 PORT, CLEARLY SHOWS

AS GALAXY SHOWS

AS PRIME 3 SHOWS

AS ZACK AND WIKI SHOWS

AS FANTASTIC LAST-GEN GRAPHICS SHOW: TWILIGHT PRINCESS

RES EVIL 4

WE HAVE NOT SEEN TRUE WII GRAPHICS FROM 3RD PARTIES. IT'S ALL HAPPENING NOW, AS THEY'RE FINALLY REALIZING THE MASSIVE ERROR OF NOT BACKING WII IN THE FIRST DAMN PLACE

ManOfTeal

On September 8, 2008 at 9:45 am

Nerds ^^^^^^^^^ :roll:

SHUT UP!!!!!

Go Marlins!!!!

Sampson

On September 8, 2008 at 9:46 am

Wiiboy, you list latency, yet know nothing about it.
A PS3 hides latency of all types.
360′s “gpu” doesn’t care much about latency. Shader programs run on-chip, they simply process everything that’s available through thread scheduling.
Shadows, lighting, alpha-blending, etc. and a list of other effects process in parallel to any texture ops.
All of Wii’d effects are texture operations, and the texture ops and fetches, are fully tied together in a pipeline. If it doesn’t have data for a texture, it stalls the whole thing. 360 does not. PS3, usually does not.

An Xbox doesn’t fetch textures between combiner ops. Wii and Gamecube are dependent on the texture cache design. Put normal graphics ram in it, and it’ll run like garbage. Put 1tsram on a 360 gpu, and it wouldn’t make much difference.

And, as was also linked, random access causes latency, not linearly accessed graphics data. It’s not difficult to organize graphics data to be accessed in a more linear manner.
Latency affects overall efficiency.
Like that link up there listed: if 1tsram is 90% efficient, and under a Geforce 3 memory controller ddr was 75% efficient, it simply gives a 15% boost in effective bandwidth. Which, in the case of Gamecube, doesn't make up for a 300+% disadvantage in bandwidth to main ram, after the cpu takes its chunk.

Comparing ram latency is pointless, comparing how well a gpu uses that ram is not. Wii can’t handle latency, modern gpus can.
Which doesn’t matter much, because the overall Wii design is made to not have as much latency.

Made to deal well with latency.
Made to not have as much latency.

Two solutions to the same problem.

As to the rest of it, no one disagreed with the idea of Wii not showing full potential with most 3rd party software. Most of it looks sub-Xbox (and Gamecube). Wii being overall more powerful than both, there’s no question there.
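
The efficiency point above can be put into numbers with a minimal sketch, assuming the 90% and 75% efficiency figures quoted in the comment and made-up raw bandwidth values chosen only to show the shape of the argument: a better efficiency ratio scales effective bandwidth, it does not close a several-times gap in raw bandwidth.

```python
# Sketch of the efficiency argument: a higher memory-efficiency figure only
# scales effective bandwidth by that ratio, so it cannot close a large gap in
# raw bandwidth. The 90% / 75% efficiencies come from the comment above; the
# raw bandwidth numbers below are placeholders for illustration, not specs.

def effective_bw(raw_gb_s: float, efficiency: float) -> float:
    return raw_gb_s * efficiency

small_raw, small_eff = 2.6, 0.90   # hypothetical "1T-SRAM-style" pool
large_raw, large_eff = 6.4, 0.75   # hypothetical "DDR-style" pool

print(f"Efficient small pool: {effective_bw(small_raw, small_eff):.2f} GB/s effective")
print(f"Less efficient large pool: {effective_bw(large_raw, large_eff):.2f} GB/s effective")
# 90% of a small number is still smaller than 75% of a much larger number.
```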

ManOfTeal

On September 8, 2008 at 9:49 am

My is bigger than either of you……I WIN!!!!

Go Marlins!!!!

Y2K

On September 8, 2008 at 9:53 am

lol, caps lock much people?

Sampson

On September 8, 2008 at 10:30 am

There’s really no point to this thread. It’s not a legit discussion or anything.

I’m just responding (and letting a few people around me respond) for the heck of it. It’s interesting to see irrational brand loyalty at its best.
Cap-locking is the same individual over and over again. No way is there more than 1 of him. Which is amazing (and disturbing) if you look at how much effort he puts into trying to curb his hardware inferiority complex.

It’s like a mild form of autism or something. Which would probably explain alot of fanboyism really. You don’t usually get to see it though, because the mods do a decent job of moderating it.

direct comparason videos guys

On September 8, 2008 at 2:43 pm

AND EA LIED, GET IT

AT LAST MAYBE HONESTY IS REPLACING FANNING IN THE MIND OF THE STRANGE MICROSOFT FAN

LIKE MICROSOFT HAVE ANY MERIT AT ALL IN THE CONSOLE WORLD. REALLY PATHETIC FANS

NINTENDO FANS FAN A HISTORY "WORTH" FANNING

MICROSOFT FANS FAN A FOUR-EYED GEEK WHO GOT LUCKY WITH A RIP-OFF OF AMIGA WORKBENCH "HE STOLE" AND CALLED WINDOWS

HMMMMMM, AGAIN THE TRUTH

direct comparason videos guys

On September 8, 2008 at 3:06 pm

Planet GameCube: In a recent IGNinsider article, Greg Buchner revealed that Flipper can do some unique things because of the ways that the different texture layers can interact. Can you elaborate on this feature? Have you used it? Do you know if the effects it allows are reproducible on other architectures (at decent framerates)?

Julian Eggebrecht: He was probably referring to the TEV pipeline. Imagine it like an elaborate switchboard that makes the wildest combinations of textures and materials possible. The TEV pipeline combines up to 8 textures in up to 16 stages in one go. Each stage can apply a multitude of functions to the texture – obvious examples of what you do with the TEV stages would be bump-mapping or cel-shading. The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. We just used the obvious effects in Rogue Leader with the targeting computer and the volumetric fog variations being the most unusual usage of TEV. In a second generation game we’ll obviously focus on more complicated applications.

THE TEV UNIT IS UNDER THE PROGRAMMER'S CONTROL. THE MORE EXPERIENCE YOU HAVE WITH IT, THE MORE SHADERS YOU CAN WRITE FOR IT ("FACTOR 5")

FULLY BLENDABLE, PROGRAMMABLE TEV UNIT, HACKABLE GPU AND FULLY PROGRAMMABLE GPU EFFECTS VIA THE CPU

THE GAMECUBE WAS CLEARLY PROGRAMMABLE GRAPHICS (SHADERS), AS IS WII

THANK YOU THANK YOU THANK YOU

FIXED FUNCTION MY ASS

CUSTOMIZABLE AND PROGRAMMABLE, YES
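
For readers unfamiliar with the TEV pipeline described in the Factor 5 quote above, here is a toy model of the stage-chain idea: a small set of texture and colour inputs fed through a sequence of fixed multiply-add combine stages, each feeding the next. This is only an illustration of that idea; it is not Nintendo's actual GX/TEV API, and the stage and register names are invented.

```python
# Toy model of a TEV-style stage chain: each stage picks inputs from a small
# register/texture set and applies a fixed combine (a * b + c), passing its
# result forward to the next stage. Illustrative only, not the real GX API.

def combine(a, b, c):
    # One fixed-function combine op per stage: multiply-add, clamped to [0, 1].
    return max(0.0, min(1.0, a * b + c))

def run_tev(stages, inputs):
    regs = dict(inputs)          # texture samples / colour registers
    regs["prev"] = 0.0           # result of the previous stage
    for a_key, b_key, c_key in stages:
        regs["prev"] = combine(regs[a_key], regs[b_key], regs[c_key])
    return regs["prev"]

# Two-stage example: modulate a base texture by a light map, then add a glow map.
stages = [("tex0", "lightmap", "zero"), ("prev", "one", "glowmap")]
inputs = {"tex0": 0.8, "lightmap": 0.5, "glowmap": 0.1, "one": 1.0, "zero": 0.0}
print(run_tev(stages, inputs))   # 0.8*0.5 = 0.4, then 0.4 + 0.1 = 0.5
```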

direct comparason videos guys

On September 8, 2008 at 3:08 pm

SO AGAIN THE EVIDENCE PROVES EA AND XBOX FANS ARE LIARS

direct comparason videos guys

On September 8, 2008 at 3:17 pm


Planet GameCube: Why is it that Rogue Leader looks so much better than virtually every other GameCube game, let alone X-Box or PS2? Is it because you have had very early access to GameCube hardware?

Julian Eggebrecht: Maybe. I really don’t know. One thing Rogue uses extensively is bump-mapping and that’s something I haven’t seen much in other games. It makes such a difference. We spent a lot of the overall development time on our shaders and the light/shadow code. That certainly made a huge difference. Hoth without the landscape shadows, cloud shadows and the self-shadowing on all objects looked very boring. The landscape levels also use a lot of texture layers to give an organic feeling to the surfaces.

SHADERS, BUMP MAPPING, MULTIPLE TEXTURE LAYERS

24-BIT COLOUR, 24-BIT TEXTURES, 512X512 TEXTURES, 256X256 TEXTURES

60 FRAMES

= GAMECUBE

WII = 2.5 X GAMECUBE

XBOX AINT VS XBOX EA LIED

direct comparason videos guys

On September 8, 2008 at 3:18 pm

XBOX VS WII MAYBE

direct comparason videos guys

On September 8, 2008 at 3:24 pm

WII, IN SHADER/BLENDER REALTIME SHADER EFFECTS TERMS, IS PROBABLY ON PAR WITH ALL THE SHADER CAPABILITIES OF PS2, GCN AND XBOX COMBINED. WII'S GPU RAM CAPABILITIES ARE AGAIN LIKE PS2, XBOX AND GCN COMBINED

IT'S CLEARLY GOT BAGS OF POWER AND TRICKS WAY ABOVE LAST GEN PS2-LED CRAPICS, SORRY, GRAPHICS

WII CAN KICK SOME ASS. DON'T WORRY ABOUT WII, WORRY ABOUT A BIASED INDUSTRY THAT "FEARS"

NINTENDO

direct comparason videos guys

On September 8, 2008 at 3:29 pm

BROADWAY'S CUSTOM CPU GRAPHICS CO-PROCESSING, I.E. GEOMETRY/LIGHTING, FULLY PROGRAMMABLE, IS LIKE BOTH OF PS2'S VECTOR UNITS COMBINED AND DEDICATED TO CUSTOM EXTRA GRAPHICS EFFECTS AND SHADERS

WII'S TEV UNIT WOULD KICK XBOX 1'S SHADERS' ASS

AND ON TOP OF ALL THAT, THE HOLLYWOOD GPU IN WII SUPPORTS A HUGE LIST OF HACKABLE HARDWIRED EFFECTS

THAT'S CLEARLY PROGRAMMABLE AND ALSO NEAR X360, NOT XBOX 1

IT'S SINKING IN NOW, BOYS, IS IT NOT, "THE TRUTH" THAT IS

:roll: :roll: :lol: :grin: :mrgreen:

direct comparason videos guys

On September 8, 2008 at 4:03 pm

Expected hardware sales:

Wii – 550K
DS – 550K
PS3 – 225K
Xbox 360 – 200K
PSP – 200K
PS2 – 150K

AUGUST USA POSSIBLY HIGHER STILL FOR DS/WII

X360 IS DEAD

Sampson

On September 8, 2008 at 4:48 pm

I know how fixed function works. It’s not a mystery science.
I’ve also read Factor 5′s presentation on their game. They were demoscene-like hacky with a few things. But you can technically exploit features on any system.
Developers even got some good graphics out of PS2 towards the end. You can even get normal mapping out of it in some cases.

But if we label Gamecube programmable, then all the fixed function hardware that came out in the last 15 years is technically programmable.

Fixed function is a technical description of how the gpu achieves certain things.
It's not just a derogatory insult to the hardware. Anyone that wrote games for PC in the past would describe Wii's graphics pipeline as fixed function. Just like High Voltage Software would and does. Everything is "programmable" if we broaden the description of it.

Mainly, fixed function is as though you give someone a list of 50 words to make sentences with. You could probably write things that made sense, as long as you look at what you have and try to form sentences around that. You might even be able to find workarounds and hacks, where you find you can make some new words by putting a few smaller words together, if you were clever.
(or use the cpu to edit in a few)

But it’s preferable to have the ability to control things closer to the letter, and form words yourself, rather than working with a pre-made list.

That’s what defines its gpu as “fixed function”, and modern gpu’s as programmable.

Xbox isn’t all that “programmable” either by today’s standards, especially in Pixel shaders.

Sampson

On September 8, 2008 at 4:51 pm

****X360 IS DEAD****

And yet it has the majority of good games, and is where the majority of multi-platform games sell best at.

Funny how that works as well.

direct comparason videos guys

On September 8, 2008 at 5:27 pm

TEV PROGRAMMABLE TO ASSEMBLY LEVEL, HOLLYWOOD GPU PROGRAMMABLE TO ASSEMBLY LEVEL, HOLLYWOOD HARDWARE EFFECTS HACKABLE AT DRIVER ASSEMBLY LEVEL....

FULLY PROGRAMMABLE CPU, INCLUDING CUSTOM GRAPHICS

CLEARLY WII IS PROGRAMMABLE

FROM-SCRATCH HAND-WRITTEN CODE CAN BE APPLIED TO TEV AND THE HOLLYWOOD GPU, AND BROADWAY, LIKE GEKKO BEFORE IT, IS CUSTOM OPTIMIZED FOR HAND-WRITTEN MICRO CODING FOR VASTLY SUPERIOR PERFORMANCE

AGAIN, ALL CONFIRMED BY FACTOR 5

direct comparason videos guys

On September 8, 2008 at 5:29 pm

FACTOR 5 CONFIRMED ASSEMBLY-LEVEL AND MICROCODE-LEVEL PROGRAMMING OF BOTH GPU AND CPU. THE FIXED LIST OF GPU HARDWIRED EFFECTS CAN ALSO BE REPROGRAMMED VIA ASSEMBLY-LEVEL GPU HACKING AND TEMP FIRMWARE HACKS

THAT KIND OF CUSTOMIZATION AND A HUGE FILL RATE GIVE WII BAGS OF POWER @ 480P

direct comparason videos guys

On September 8, 2008 at 5:34 pm

STOP USING PC TERMINOLOGY TO DESCRIBE A CUSTOM WII

IT SHOWS NOT YOUR SPEC INTELLECT BUT MORE THE FACT THAT YOU'RE BILL GATES BRAINWASHED

OPENGL, AND THE CUSTOM NINTENDO GL AS IT'S REFERRED TO, IS FAR MORE FLEXIBLE AND EFFICIENT THAN ANY DIRECT WINDOWS BULL

FACT

direct comparason videos guys

On September 8, 2008 at 5:40 pm

PC BRAINWASHING: FIXED FUNCTION GPUS AND NEWER PROGRAMMABLE GPUS

BUT HOLLYWOOD IS UNIQUE AND CUSTOM AND ONLY IN WII. STOP APPLYING PC WAYS OF THINKING TO A CUSTOM CHIP AND A CUSTOM CONSOLE

FIXED FUNCTION HOLLYWOOD, THEN PROGRAMMABLE AS YOU UNDERSTAND IT

HOLLYWOOD IS CLOSER TO WHAT YOU'RE CALLING A PROGRAMMABLE GPU THAN IT IS TO WHAT YOU'RE CALLING FIXED FUNCTION

IT'S IN BETWEEN, CLOSER TO SHADER/PROGRAMMABLE THAN FIXED FUNCTION

GET IT

IT AIN'T GOT SHADERS, BUT WII FANS COULD SAY YOUR GPU AIN'T GOT TEV. IT'S SWINGS AND ROUNDABOUTS, NO ONE'S RIGHT AS THEY'RE GUESSES YET

DIFFERENT ARCHITECTURES, GET IT

WII'S TEV S ON XBOX 1'S SHADERS

WII'S CO-PROCESSED GRAPHICS CPU S ON XBOX 1'S SHADERS, AND ADD TO THAT A HUGE LIST OF STRONG FIXED HARDWARE EFFECTS

THANK U

Sampson

On September 8, 2008 at 6:47 pm

Again, you can read the list of features the gpu supports, the way it implements them, and how configurable they are. Also in Factor 5's article.

Effects on gpus are done in math. Mul/add/multiply-add... floating point, integer, square roots, etc.
The idea is to perform a string of math operations to simulate some effect.
Gamecube and Wii have fixed strings of math that have a particular resulting effect. They're configurable, as listed by Factor 5, and you might find a use for one function that was intended for something else, etc.

But something "programmable" just has processors that perform add/mul/madd, etc. in whatever way you think you can get away with, using simple simd floating point processors, etc.

Cpus can do the same types of operations, and are fully programmable, but they don't have the "gflops" to do very many, and offloading as much as possible to the gpu leaves more processing power for more complex things like AI, etc.
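
To make the "string of math operations" point concrete, here is a minimal sketch, assuming a plain linear blend as the effect and standard 480p/60 figures: counting the mul/add ops per pixel and multiplying out gives the kind of flops budget being argued about. The numbers are illustrative only.

```python
# Illustration of the point above that graphics effects boil down to strings of
# mul/add style math per pixel: ops-per-pixel times pixels-per-second sets a
# hard budget. All figures here are illustrative assumptions.

def lerp(a: float, b: float, t: float) -> float:
    # a + (b - a) * t  -> one subtract, one multiply, one add = 3 flops
    return a + (b - a) * t

FLOPS_PER_LERP = 3
channels = 3                  # RGB
pixels_per_frame = 640 * 480  # 480p
fps = 60

flops_needed = FLOPS_PER_LERP * channels * pixels_per_frame * fps
print(f"One full-screen RGB blend at 480p60 ~ {flops_needed / 1e9:.2f} GFLOPs per second")
```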

Sampson

On September 8, 2008 at 6:59 pm

Something simply being “fixed function” doesn’t make it weak.
Being fixed takes up fewer transistors.
They could have stuck to fixed function, and simply extended their math to automatically cover things like normal mapping, parallax mapping, and a list of other effects.
Programmers wouldn't like it, as it isn't as flexible. And they can't simply drop their effects onto Wii; they have to change the art itself to support the more specific math that the Wii supports.
They'd likely hate on it, just like they're doing now.

direct comparason videos guys

On September 8, 2008 at 7:01 pm

XBOX 360 IS DEAD. SALES FACT, IT'S DYING

AMAZON CHARTS SHOW WII AND PS3 IN THE TOP TEN AND WII AND WII-RELATED PRODUCTS KICKING ALL ASS

ALL X360 MODELS ARE LIKE 50TH-ISH IN THE AMAZON CHARTS

WII USA AUGUST 550K

DS 550K EST, NUMBERS POSSIBLY HIGHER

X360 200K. WII SALES NUMBERS WILL INCREASE BIG TIME UP TO CHRISTMAS. IT'S GOING TO DESTROY X360'S GLOBAL USERBASE BY XMAS 08

AND IT'S ALREADY GLOBAL NUMBER 1 THIS GEN

WII AND DS VS X360, NO CONTEST

direct comparason videos guys

On September 8, 2008 at 7:03 pm

FUNNY, I'VE JUST WITNESSED FABLE ON XBOX AND ZELDA TWILIGHT PRINCESS ON IDENTICAL TV SETS
FABLE LOOKS PS2

TWILIGHT PRINCESS LOOKS MID-RANGE X360

FABLE LOOKS SOOOOOOOOOO DATED AND FLAT AND LAGGY, WITH LONG LOAD TIMES

TWILIGHT PRINCESS LOOKS GREAT, LOADS FAST AND HAS NO POP-UP OR GLITCHES IN UR FACE

direct comparason videos guys

On September 8, 2008 at 7:09 pm

TEV = PROGRAMMABLE BLENDING/SHADING/COMBINING, 16 REALTIME SHADER STAGES, EACH STAGE WITH MULTIPLE REALTIME EFFECT BLENDS

USING AN 8-LAYER SINGLE-PASS MULTI-TEXTURING ENGINE

ADD TO THAT INTERLEAVED, REALTIME, FULLY PROGRAMMABLE CPU SHADER GEOMETRY/LIGHTING EFFECTS, HUGE BANDWIDTHS, FAST RAM, LOTS OF RAM COMPRESSION AND A LONG HARDWARE LIST OF EFFECTS, ALSO CUSTOMIZABLE

IT S ON XBOX 1

THERE'S NO TEV IN FIFA, THERE'S NO HUGE BANDWIDTHS/FILLRATES IN FIFA

THERE'S NOTHING WII ABOUT FIFA, IT'S CLEARLY POLISHED-UP PS2

AND A GAMECUBE IS PROVEN TO HANDLE POLISHED PS2 CODE USING ONLY 40% OF ITS POWER

AND WII IS A DAMN SIGHT MORE THAN GAMECUBE WAS

EA LIED

Sampson

On September 8, 2008 at 8:11 pm

****TWILIGHT PRINCESS LOOKS MID RANGE X360****
No it really doesn’t. It looks like a polished Wii game with good art direction, and choices in where to spend effort. Not a technical marvel.
Looks good in some areas, mediocre in others.

And again, 8 layers is at the direct 1to1 expense of fill-rate, because texture stages are done in a pipeline. Modern gpus decoupled them, and have more texturing units.

And sales of Wii are really good. But games like Prime 3 sold what?
Less than a million and a half. And that’s Metroid.
So did Resident Evil: Umbrella Chronicles at 1.2.
And those games have no competition for sales on Wii, and are exclusive to a platform that has a much larger install base.

Just the 360 version of GTA4 sold well past 6 million.
360′s version of Assassin’s Creed is going on 4 million.

And those are multi-platform games, with tons of direct competition on that platform.
So no, no one considers 360 dead, especially developers.

direct comparason videos guys

On September 9, 2008 at 8:38 am

YOU'RE NOW TRYING TO DEFEND A DEAD FORMAT = XBOX 360. IF IT DOESN'T SELL, THEN IT'S DEAD

GET OUT OF YOUR DREAM WORLD. X360 SALES: USA 3RD, UK 3RD, EUROPE 3RD, JAPAN NOTHING, IT'S SO LOW

AGAIN, FOR THE BENEFIT OF THE STUPID: XBOX 360 IS DEAD

direct comparason videos guys

On September 9, 2008 at 8:40 am

MULTIPLE PRICE DROPS = MICROSOFT SCARED

MULTIPLE MODELS = MICROSOFT SCARED

TERRIBLE BUILD QUALITY = MICROSOFT STINKS

XBOX 360 IS A DEAD FORMAT

direct comparason videos guys

On September 9, 2008 at 8:47 am

I LOVE HOW IDIOTS CALL PS2 PORTING WII GRAPHICS, CONSIDERING WII'S DIE-EMBEDDED RAM ALONE UTTERLY DESTROYS THE WHOLE PS2 RAM SYSTEM. YES, THAT'S RIGHT, WII HAS VASTLY MORE MEMORY POWER ON ITS DIES ALONE THAN THE WHOLE PS2 HAS, AND IT'S 10X-PLUS FASTER LATENCY-WISE AND ALL DIRECT GPU/CPU FEED

BEFORE WII EVER HITS EXTERNAL MAIN RAM IT'S ALREADY KICKING PS2'S ASS WITH ITS ON-DIE SRAM

THERE'S ANOTHER 64MB OF GDDR3, AND A FLASH DRIVE AND FAST DISC STREAMING, ON TOP OF WHAT WII HAS ON DIE

PS2: 32MB OF SLOW, CLUNKY RAMBUS RAM, NO DIRECT FEED TO THE GS UNIT, ALL ACCESS IS SHARED WITH THE CPU AND VECTOR UNITS OVER A SHARED BUS BOTTLENECKED AT 1.5GB/S

32MB OF INDIRECT, NO-COMPRESSION, BOTTLENECKED MAIN RAM

WII: 3MB GPU EDRAM, 24MB EMBEDDED SRAM, 27MB RIGHT THERE ON THE DIE, PLUS 6:1 TEXTURE COMPRESSION, 4:1 PEAK DATA COMPRESSION AND 5 NANOSECONDS LATENCY VS PS2'S 100 NANOSECONDS

AS YOU CAN SEE, WII DESTROYS PS2 WITHOUT EVEN TOUCHING ITS GDDR3 RAM

SO HOW IS FIFA WII "WII" AND NOT PS2 ENGINE-WISE? IT'S CLEARLY RUNNING ON WII AND HARDLY TOUCHING ITS POWER, YET YOU INSIST IT'S WII AND NOT A PS2 PORT

TALK ABOUT IGNORANT FANBOYING

Sampson

On September 9, 2008 at 9:05 am

Nah. Sales are about 30m Wii, 20m 360, and 15m PS3 overall.
PS3 should pass it eventually, but it’ll be a while with the price drop. 360 moved their chips to smaller process, so it’s more cost effective to make, and newest chips have been redesigned.
PS3 BR player is good. Only problem with PS3, is the removal of backwards compatibility with PS2 on everything now. Which isn’t really to save money, since recent bc was through software emulation. They gimped it to make sure those who buy them, buy PS3 games. And they still sell PS2s.

****TERRIBLE BUILD QUALITY*****

Pretty much. Good overall design for ease of programming and function, suck-ass reliability.
Probably cost them what would eventually have been a lead over Sony a lot earlier.

Sampson

On September 9, 2008 at 9:24 am

PS2 can store compressed textures in ram, but the cpu has to decompress them, before passing to the gpu. (cpus can decompress textures fine)

And even games co-developed with PS2 have enhanced texture resolution and effects. They don’t just leave the additional ram space empty. And they up the lod system on models. It likely looks 2x better than a PS2 version would.
So, one could argue that it might look 2x better still, (aka 4x) if coded fully for Wii.

But, people aren’t missing out on a whole load of WIi power.

wiiboy101

On September 9, 2008 at 11:24 am

27mb of fast ram plus compression vs ps2's 4mb of slow ram with no compression

yeah right wii is a last gen system
zzzzzzzzzzzzzzzz fanboyszzzzzzzzzz

wiiboy101

On September 9, 2008 at 11:33 am

as confirmed by 3rd party devs working on wii titles, the average ps2 port-and-polish on gamecube used 30 to 40% of gamecube's power, including much better load times. fifa 06/07 had better graphics, shorter loading times and a better framerate on gamecube than it did on xbox, yet wii fifa 08 had weaker graphics. how come? EA rushed it out with lame tacked-on wiimote controls. it was not a wii-built game, and 09 is also a port-over
the only EA title, 09-wise, that's even remotely pushing wii graphics is tiger woods 09, but it's still running on a ps2 engine. EA didn't prep an engine built for wii, you DUM S, they built engines for ps3. everybody knows this as fact. epic built unreal 3 for pc and converted it for ps3

IT'S CALLED AN OUT OF TOUCH INDUSTRY

LIKE YOU'RE OUT OF TOUCH WITH THE FACT X360 IS DEAD

wiiboy101

On September 9, 2008 at 11:36 am

so according to xbox fans, building engines and wasting time and money on ps3 was worth it, considering the lead format is wii and everyone WITH A BRAIN knew wii would kick all ass

wiiboy101

On September 9, 2008 at 11:41 am

ps2 compression is a waste of time. we are talking the compression of 4-bit textures that already look like crap, with cpu-busting software compression and no bandwidth in the fsb to send decent textures. ps2 is a complete joke, ONLY A BUFFOON WOULD DEFEND IT

Sampson

On September 9, 2008 at 1:05 pm

Nope, not defending it. It’s bottlenecked. But comparing straight storage in figures for how much can be in ram, it’s false to assume the 32mbs in PS2 needs to store uncompressed, while anything else can be compressed. “Amount” of ram wasn’t ps2′s problem.
Systems’ get their own engines, and get converted through an api that covers them all. They design things through an abstraction layer.
Things like polygon models, etc.. get designed in programs like Maya or 3dsMax, and textures and normal maps get designed in things like photoshop.
Then they port the model to systems. You get wasted processing when the art isn’t converted to what’s ideal for the system it’s on. Happens all the time.

****yeah right wii is a last gen system*****

I’d consider a console that used my 300mhz, Geforce4 TI and P4 to be last gen. (at least) It’d do ok at standard res, but instruction-wise, and what it does and doesn’t support is lame.
To each his own though.

wiiboy101

On September 9, 2008 at 1:32 pm

example of ps2 vs wii graphics

ps2 = 4mb edram, no compression, and say 16mb of the 32mb main ram for graphics: indirect, bus bottlenecked, terrible latency, software compression, 4/8-bit pixel/texel data
vs
wii = 3mb edram plus compression, 24mb of die sram with a direct gpu feed plus compression, and direct-feed gddr3 plus compression, say 32mb of the 64mb available. a much faster disc load/stream speed, again plus compression, and an accessible flash drive as virtual ram

CLEARLY WII UTTERLY POOS ON PS2 AT 480P RENDERING
wii pixels/texels are optimized 24-bit vs 4/8-bit for ps2

59mb of direct, fast ram plus compression vs 16mb of indirect, bottlenecked ram with latency issues

add to that the FACT that the data going through wii's memory is vastly superior to the data used in ps2

there's no comparison, wii is 10x ps2 at textures with ease

wiiboy101

On September 9, 2008 at 1:45 pm

layman's example: graphics ram

xbox 32mb, gamecube 16mb, dreamcast 8mb = 56mb

wii 24mb 1tsram plus, say, 32mb gddr3 = 56mb. oh look, wii just matched 3 last gen systems all put together, and add better compression/decompression, better bandwidth, better latency balance, better disc speed and additional flash access ram

wii in direct gpu memory terms is better than gamecube, xbox and dreamcast put together

clearly kicking last gen ass. by avoiding hd and having such a refined system, wii can kick out serious looking games

EA GUY LIED

Sampson

On September 9, 2008 at 2:18 pm

Well, a person could say 360 and PS3 are on par. But they’d be wrong on a long list of things throughout the system, that could be demonstrated with figures, graphs, etc.. Doesn’t mean much.

If you think it’s powerful enough to kick out awesome graphics, that’s up to you, and what you consider awesome.

direct comparason videos guys

On September 9, 2008 at 3:21 pm

wii 24mb 1tsram: direct feed gpu, direct feed cpu

wii gddr3: direct feed gpu, direct feed cpu, direct feed sound processor bus (yes, wii has a dedicated sound bus and sound processor that doesn't hit the cpu or fsb at all)

flash drive: direct to ram and cpu

disc drive feeds all, obviously

that's some serious flexibility there. gddr3 acts as main ram, sound ram and graphics ram DIRECTLY; it can data dump to 1tsram, also acting as an instant data/texture feed for the graphics cache and 1tsram

1tsram acts as super fast main ram and is dedicated to gpu and cpu direct feeding

developers can access the flash drive and use it as a disc cache, or access it as auxiliary ram like gamecube did with its a-ram

the whole system is clock synced and balanced, and the whole system, including buses and caches, can deal with compressed data, textures etc. add to that software compression, a fast disc drive and the lowest latency performance of any console

ps2 is a dinosaur in comparison, as is xbox 1

there's nothing wii fanboy about saying the EA guy lied, it's simply THE TRUTH

direct comparason videos guys

On September 9, 2008 at 4:19 pm

WHAT THIS DISCUSSION BOILS DOWN TO IS THIS, AND XBOX FANS, DON'T ARGUE, IT'S THE TRUTH, THE WHOLE TRUTH AND NOTHING BUT THE TRUTH

XBOX FANS "CAN NOT" ACCEPT THAT Wii AT XBOX CLOCK SPEEDS KICKS XBOX 1'S ASS..

YOU'RE COMPARING ON CLOCK SPEED, "AREN'T YOU", NOT ON ACTUAL EVIDENCE AND ARCHITECTURE AND ABILITY........

GEKKO AT 486MHz IS, IN INTEL TERMS, LIKE 1GHz. EA games rated gekko as a 1ghz pentium 3, factor 5 also rated gekko as a 1ghz pentium 3. in fact the original gekko clock speed was 404mhz and factor 5 rated it as 800mhz-plus in intel cisc cpu terms

gekko was upgraded to 486mhz, so 1ghz in intel terms in cpu performance

gekko was better than xbox 1's celeron and destroyed even a true 1ghz pentium 3 at processing bandwidths and front side bus bandwidths

broadway is not comparable to xcpu directly; it's a vastly better cpu on a better bus, being fed by better ram and with internal bandwidths the xcpu can't match

2.5x xbox at 480p-level resolution, plus no long loading bull. it's simply the truth

Sampson

On September 9, 2008 at 5:47 pm

Again, every single developer that’s ever touched either system last gen, has stated:
Xbox is overall more powerful than the Gamecube. or
Gamecube is as good as Xbox.
Number that said Xbox over Gamecube: nearly all.
Number that said Gamecube just as good: one or two.

Exactly what the specifications indicate: Xbox is overall more powerful than Gamecube, with a few exceptions.
Developers have spoken, Specifications have spoken.
Definitive.

Wii is a Gamecube, overclocked by 50%, with additional system ram.
It treats the additional ram, as something akin to ATI’s hypermemory.

The gpu’s instruction set is unchanged from Gamecube. Which means, it’s all set to pull off some of the effects that a Geforce2/3 were doing.
Total floating point operations for the gpu is still 46.5 floating point ops per cycle. (as stated in Nikkei Electronics) They added no modern instructions.
At 243 mhz, it's now 11.3 gflops for the gpu, and the 2x split simd floating point units in the cpu are now 2.9 gflops.

I don’t care if it’s 100% efficient. It could magically double its theoretical power, and it would still be unimpressive from any angle.
I would give it credit for being “overall” more powerful than Xbox. But 2x Xbox is weak. 3-4x PS2 is weak.
You’re fascinated with the specifications, and its buzzwords therein, good for you. I and most others are not, developers certainly aren’t, Nintendo themselves isn’t… But you are, and that’s great.

Sampson

On September 9, 2008 at 6:07 pm

****It treats the additional ram, as something akin to ATI's hypermemory.****

Correction, I actually doubt this now. The cpu likely still has to read up from the fsb from the gpu, as it likely still controls memory access.

direct comparason videos guys

On September 10, 2008 at 9:49 am

ps2 gpu die: 4mb edram, no compression, no direct link to main ram

xbox: a 128k no-compression gpu reading cache, that's your lot, plus direct access to sdram main ram

wii: 3mb gpu cache plus compression, 24mb gpu-die-embedded 1tsram plus compression, and direct access to gddr3 main ram

4mb vs 128k vs 27mb (plus compression) for wii

come on, spit it out: SIMPLY NO CONTEST, WII KICKS LAST GEN ASS

feeding a gpu that is more powerful, more efficient and with a richer, deeper effects list in hardware than any xgpu or gs unit

xgpu 8x4 lighting (233mhz gpu on a 133mhz bus)
xgpu 4 texture layers, 4 texture stages, that's your xbox shaders right there, hmmm
xgpu in-game colour 16-bit
no realtime decompression
no virtual texturing
no tile rendering

wii gpu 8x8 lighting (plus custom cpu lighting) (243mhz gpu plus 3mb edram)
wii gpu 8 texture layers, 16 texture stages (plus cpu shaders)
wii gpu in-game colour optimized 24-bit
realtime decompression
virtual texturing
tile rendering
longer list of hardware effects and hardware-supported functions
highly efficient per clock and part of a highly efficient console design with blindingly fast ram

wii = 2.5x xbox @ 480p native resolution

hollywood gpu fully supports fur shading and advanced water effects and physics IN HARDWARE

ANISOTROPIC FILTERING, TRILINEAR FILTERING, AA, custom filters, custom deflicker, custom motion correction

true 480p @ 60fps, widescreen, high-res colour

deformation texturing / indirect texturing / normal mapping / bump mapping / custom mapping

ps2 porting and non-usage of hollywood's hardware effects and TEV unit

ain't fuking wii graphics, is it

WIIBOY101

On September 10, 2008 at 11:06 am

HOUSE OF THE DEAD 3/4: XBOX, YES

HOUSE OF THE DEAD OVERKILL: WII

OVERKILL UTTERLY S ON HOTD 4 GRAPHICALLY

4 = XBOX, OVERKILL = WII, AND CLEARLY OVERKILL KILLS XBOX GRAPHICS AND FULLY SUPPORTS BOTH 3D AIMING MOUSE (LIGHT GUN) AND 3D MOTION WIIMOTE GAMEPLAY

ALSO, HOUSE OF THE DEAD 3/4 USES AN XBOX ARCADE BOARD WITH 128MB RAM, NOT 64MB, AND STILL WII'S OVERKILL LOOKS WAY BETTER GRAPHICALLY

EXPLAIN THAT ONE, XBOX FANS

WIIBOY101

On September 10, 2008 at 11:19 am

4 RUNS ON A PC BOARD, NOT XBOX. SORRY, MY MISTAKE. OH LOOK, AN HONEST GUY APOLOGIZES FOR MISTAKES. XBOX FANS HAVE NOT YET APOLOGIZED FOR LYING ABOUT WII LOL

Mosley

On September 10, 2008 at 1:06 pm

****wii gpu ingame colour optimized 24 bit****

It’s ok Wiiboy. No need to bash Nintendo’s 24-bit rendering. Gosh, get off their backs man.
24-bits is all they need for colour. (it’s what ATi supported back then too)
But Wii and GC have no choice but to use 16-bit to fit the frame buffer, otherwise they wouldn't be able to give you transparency or aa. (The same goes for the z-buffer, if they needed to emulate a shadow buffer.)
It’s not because their programmers are “graphically incompetent fools”.

RE4, Wind Waker, Twilight Princess, Sunshine, Kart, etc.. all used 16-bit rendering. You lived through the dithering last gen, you’ll likely live with it through the Wii-gen as well.

****longer list of hardware effects and hardware supported functions****

We know, Wiiboy. Wii and Gamecube’s gpu is fixed function, and has a very limited prefabricated list of effects that it can actually perform. The already limited number of math ops, (gflops) are pretty hardwired.
And yeah, they didn’t extend that list to do anything more modern (like normal or parallax mapping), just embossing, and embm, etc..

But that’s no reason to continually call them out on it every other post. At least their games are usually fun to play.

Silly fanboys, always trying to slam Nintendo at every turn. They never claimed to have awesome graphics. They simply said their games would play well with new controls, and be interesting to look at. And they gave you that.

It’s obviously not the system for “spec whores”. If you hate the WIi so much Wiiboy, sell it and get a PC for gaming.

WIIBOY101

On September 10, 2008 at 3:10 pm

HOUSE OF THE DEAD 4 = PENTIUM 4 CPU, A MID-RANGE GRAPHICS CARD AND AT LEAST 256MB RAM

HOUSE OF THE DEAD OVERKILL = WII, YET OVERKILL HAS BETTER GRAPHICS EVEN AT A LOWER RESOLUTION

OUCH. AGAIN WII IS PROVEN WAY MORE POWERFUL THAN XBOX 1

WIIBOY101

On September 10, 2008 at 3:59 pm

SEGA'S ARCADE BOARD IS A PENTIUM 4 WITH 1 GIG OF RAM AND A 2005 PC GPU CARD WITH 256MB RAM, A CUT-DOWN OP SYSTEM AND OPTIMIZED CODING THAT CAN KICK OUT SOME COOL

BUT WII IS BEATING IT GRAPHICALLY, MINUS THE HIGH RESOLUTION MODES

WII IS FAR MORE POWERFUL THAN THE FIFA GAME AND THE FIFA EA GUY ARE LETTING ON, CLEARLY

WII 729MHZ, SEGA'S NEW BOARD A PENTIUM 4 AT 3GHZ

CUSTOM/EFFICIENT AND NO HD = PERFORMANCE

IS IT SINKING IN, GUYS

WIIBOY101

On September 10, 2008 at 5:23 pm

SEGA'S HD ARCADE BOARD: 800MHZ FSB = AROUND 6.4GB/S OF BUS BANDWIDTH

WII FSB IN PEAK COMPRESSION MODE = 7.7GB/S OF BANDWIDTH

SHOWS HOW POWERFUL WII IS IN KEY AREAS, AND MINUS HD RESOLUTION
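
The bandwidth figures quoted above fall out of simple arithmetic: raw bus bandwidth is clock times bus width, and the "compression mode" numbers just multiply that by an assumed compression ratio. The 64-bit widths below are assumptions chosen to reproduce the thread's own numbers, not confirmed specs.

```python
# Raw bus bandwidth = clock x bus width; "compression mode" figures multiply
# that by an assumed compression ratio. The 64-bit widths are assumptions made
# to match the numbers quoted in the thread, not confirmed hardware specs.

def bus_bandwidth_gb_s(clock_mhz: float, width_bits: int, compression: float = 1.0) -> float:
    bytes_per_transfer = width_bits / 8
    return clock_mhz * 1e6 * bytes_per_transfer * compression / 1e9

print(f"800 MHz x 64-bit:       {bus_bandwidth_gb_s(800, 64):.1f} GB/s")     # ~6.4
print(f"243 MHz x 64-bit, 2:1:  {bus_bandwidth_gb_s(243, 64, 2):.1f} GB/s")  # ~3.9
print(f"243 MHz x 64-bit, 4:1:  {bus_bandwidth_gb_s(243, 64, 4):.1f} GB/s")  # ~7.8
```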

Samspon

On September 10, 2008 at 8:00 pm

You know one of the reasons they used Xbox hardware in their arcades, was that their earlier games, House of the Dead 2 specifically, was based on DX6 dev kits. And Xbox could run things under DX6 just like a PC could.

So, if you wanted to move on from HotD 2 to HotD 3, and use the same development tools as the last game with a few upgrades, using Xbox would be easy.

HotD 1 and 2 were ported to Xbox. They’ve also been moved to Wii, where they don’t run as well.
Not a good example of “Wii power” because Wii is overall more powerful than an Xbox.

The House of the Dead you're speaking of isn't done by the same group of developers. It doesn't look "awesome", especially for a rail shooter, a rail shooter being nothing like Doom 3 or HL2, etc.

They did a unique one for Wii, because porting HotD4 to Wii "would be difficult". But they had also said they "didn't think it'd be too difficult to port it to Wii, and keep it enjoyable". Anything can be ported to Wii, as everything from Far Cry to Dead Rising shows.

And PC specs tells you nothing. As minimum requirements for various typical console games will show.

wiiboy101

On September 11, 2008 at 2:34 pm

windows direct gobbledegook again. sega codes direct to the chipset in opengl and assembly code; add to that no windows to run, and sega's arcade board is easily 2x the power of a pc with the same spec. tighter coding and no pc op system to run: it's a decent-spec pc board dedicated to games and it will piss on a pc of the same spec

wiiboy101

On September 11, 2008 at 2:40 pm

did you know the average pc game wastes 50 to 75% of the ram and bandwidth available, and pc games are not gpu or cpu optimized, as pc is an open format, not a dedicated chipset like amigas used to be and consoles still are. also, pumped fsbs on pc are highly latent and not on par with true fsb speeds; pumping gives a fake performance boost

wii's design is ULTRA efficient compared to a pc or ps3

WIIBOY101

On September 11, 2008 at 4:48 pm

DS – 518.3K units
Wii – 453K
PSP – 253K units
360 – 195.2K units
PS3 – 185.4K units
PS2 – 144.1K units

OFFICIAL NUMBERS. X360 DIDN'T HIT 200K AND WII IS AT SELL-OUT STATUS AGAIN IN THE USA, WITH INCREASED SHIPMENTS FROM SEPTEMBER. SEPT/OCT WII SALES ARE EXPECTED TO BE HUGE

X360 SELLS AT PS2 SALES RATES AND BOTH MACHINES SHARE BUILD QUALITY ISSUES, SO THEIR SALES FIGURES ARE INFLATED DUE TO RE-BUYS

X360 IS DEAD AND EA LIED, PROVE MII WRONG

WIIBOY101

On September 11, 2008 at 4:58 pm

DREAMCAST 100MHZ BUS, 800MB/S BANDWIDTH, SHARED

PS2 150MHZ BUS, 1.3GB/S BANDWIDTH, SHARED

XBOX 133MHZ BUS, 1GB/S BANDWIDTH, NOT SHARED

WII 243MHZ BUS PLUS CUSTOM REALTIME DATA COMPRESSION: 7.7GB/S AT 4:1, 3.9GB/S AT 2:1
PLUS LOW LATENCY BUS, LOW LATENCY RAM

ALL SYSTEMS' PEAK NATIVE RESOLUTION = 480P

GRAPHICS RAM

DREAMCAST 8MB, NO EDRAM

PS2 4MB EDRAM, ROUGHLY 16MB OF INDIRECT MAIN RAM

XBOX NO EDRAM, PLUS MAIN RAM HAS TO HOLD THE Z/FRAME BUFFER ETC, PLUS 24MB, SAY, OF MAIN RAM, 32 INCLUDING BUFFERING ETC

WII 3MB EDRAM, 24MB SRAM, SAY 32MB OF MAIN RAM

STREAMING FROM DISC

DREAMCAST CRAP

PS2 CRAPPER

XBOX CRAP

WII EVEN FASTER THAN GAMECUBE, THEREFORE IT'S DAMN FAST

WII IS A NON-HD X360, CLEARLY

3ds_Mac

On September 11, 2008 at 5:51 pm

Of course they do. But you'll notice things like Doom 3 or Gaiden look better than the game you speak of on the Wii. And no reviews or interviews have cited it looking close to anything like 480p Gears, or Bioshock, etc. Even if you consider ntsc resolution.

And I don’t know what you think fsb bandwidth and bogus compression is telling you. Wii is fixed function, requires the cpu to feed the gpu. Other gpus get data from ram, because they have the equivalent of the Wii’s cpu built into them.

And then of course, you can look at 360 and PS3's fsb. 360's is 21.6GB/s raw bandwidth, not including compression (of which there is an extensive amount).
There's a crap load on PS3 as well. PS3's cpu, like other PC cpus, has its own system ram, and thus doesn't require getting all its data bandwidth from the fsb to the gpu's memory controller. Unlike the Wii.

GC was designed to be a console for 01-06. They didn’t add anything to it in instruction, pretty much just ram and clock for Wii. It’s still a gpu with 25 million transistors for processing logic.
If ATI could design a chip back in 2000, that would run pretty much as well as systems with more than 10 times the transistors for processing and more than twice the clock rate, but still 1/3 the res, they wouldn’t be getting owned by Nvidia right now.

WIIBOY101

On September 11, 2008 at 5:53 pm

http://www.youtube.com/watch?v=fgtFXXzE8bk

MARIO 128 TECH DEMO, GAMECUBE. FANTASTIC A.I., FANTASTIC PHYSICS AND GREAT GRAPHICS AND EFFECTS. YES, I'VE POSTED IT BEFORE

WHAT I DIDN'T REALIZE BEFORE WAS THE BAR AT THE BOTTOM OF THE SCREEN. IT LOOKS LIKE A PS2 LOADING BAR

IT'S A CPU POWER USAGE BAR. AS YOU CAN SEE IT HARDLY MOVES ABOVE 25%, SO ALL OF THAT WAS RUNNING FINE ON 25% OF GEKKO CPU'S POWER. WOW, JUST 25%

BROADWAY IS 2X GEKKO. WII CAN KICK SOME PHYSICS, A.I. AND CPU CO-PROCESSED GRAPHICS ASS

WIIBOY101

On September 14, 2008 at 11:06 am

hotd 3: xbox. hotd 4: sega's new arcade board, pentium 4 3ghz based

hotd overkill: wii, 729mhz cpu, YET IT KICKS THE ASS OF BOTH ABOVE GAMES VISUALLY

the wii is keeping up with a 3ghz-cpu-based board from sega

xbox just couldn't do that

wii = 2.5x xbox

obvious cpu advantages

wii vs xbox

risc vs cisc

customized vs standard

copper wire vs aluminium

silicon on insulator vs standard

tight, tiny, transistor-packed design vs standard

256k cache vs 128k

243mhz bus vs 133mhz bus

over 20 million transistors vs 9 million transistors

custom data compression with realtime decompression vs standard

clock synced vs not clock synced

fast, balanced, optimized ram vs standard dram

there are just so many advantages broadway has over xcpu THAT THE UNEDUCATED, DIRECT-WINDOWS-BRAINWASHED PC FOOLS ARE JUST NOT SEEING, BECAUSE THEY VIEW SPECS IN THE WAY THEY HAVE BEEN BRAINWASHED TO VIEW THEM AND DON'T VIEW THEM IN A COMMONSENSE WAY

AKA SAYING WII DOESN'T SHADE IS AN EXAMPLE OF THIS UTTER STUPIDITY

WIIBOY101

On September 14, 2008 at 11:09 am

CPU BANDWIDTHS EASILY 4X XCPU, CPU INSTRUCTIONS PER CLOCK EASILY 2X XCPU AND LATENCY EASILY 10X XCPU

CASE RESTED. BROADWAY UTTERLY KILLS XCPU AT 480P-LEVEL RENDERING AND KICKS ITS ASS AT PHYSICS, AI ETC

WIIBOY101

On September 14, 2008 at 11:12 am

WII'S TEV UNIT AND HARDWIRED EFFECTS KICK XBOX SHADER ASS. WII'S CPU GRAPHICS CO-PROCESSING KICKS XBOX SHADERS' ASS. WII = 2.5X THE SHADERS OF XBOX 1 WITH EASE

LOOK AT THE SPECS, BREAK THEM DOWN, APPLY COMMONSENSE

PC THIS, DIRECT THAT. WII IS NOT A PC. WOULD YOU JUDGE A DIESEL ENGINE ON PETROL ENGINE ENGINEERING? OF COURSE NOT

SO WHY DO THAT WITH WII

THE PC BILL GATES BRAINWASHED CAN'T SEE THEY'RE BRAINWASHED "IDIOTS"

3ds_Mac

On September 17, 2008 at 1:49 am

****hotd overkill wii 729mhz cpu YET KICKS THE ASS OF BOTH ABOVE GAMES VISUALLY the wii is keeping up with a 3ghz cpu based board from sega ****

The game is a rail shooter that looks inferior to even Doom 3.

But is that what the developers said of it? Of course not. Is that what Sega said of Wii? Of course not. Is that what the guys who left Retro Studios and started their own company said? Of course not. Is that what Retro Studios themselves said? Of course not. Is that what The Conduit's developers said of Wii? Of course not. Is that what Nintendo said of Wii? Of course not. Is that what id said of Wii? Of course not. Is that what Maxis said of Wii? Of course not.
Is that what anyone sees in any of the Wii's top games? Of course not. Is that what anyone who has any idea what they're talking about, looking at any of the specifications, would say about Wii? Of course not.

You’re all alone Wiiboy.

3ds_Mac

On September 17, 2008 at 1:55 am

Wii’s cpu could perform at 100% theoretical maximum, (2.916 gflops)in an actual game, and it would still be crap. As would the gpu.
No amount of delusional spin changes that.

WIIBOY101

On September 17, 2008 at 8:28 am

MONSTER HUNTER BLOWOUT: PS3-LEVEL GRAPHICS AT CAPPED 480P

CONFIRMED, XBOX FANS ARE S

MONSTER HUNTER BLOWOUT SCANS ARE ALL OVER THE NET. IT LOOKS STUNNING. ADD 3D MOUSE, 3D MOTION, 1TO1 ADVANCED 3D MOTION

THAT'S NEXT GEN, TOTALLY IMPOSSIBLE ON XBOX 360 AND PS3

WII HAS THE GRAPHICS AND THE GAMEPLAY AND THE FAST LOADING AND NO INSTALLS

X360/PS3: LONG LOADING TIMES, NICE GRAPHICS, OBSOLETE CONTROLS

GO LOOK HOW STUNNING MONSTER HUNTER 3 LOOKS, THEN ADD IN THE FACT IT'S A WII 3D MOTION, 3D MOUSE CONTROLLED GAME

ONLY WII IS NEXT GEN, CLEARLY

PS3/X360 NOW PROVEN TOTALLY UNNEEDED. WHY WOULD YOU BASH BUTTONS WHEN YOU CAN HAVE GREAT GRAPHICS AND TRULY NEXT GEN GAMEPLAY

WHO NEEDS PS3? MONSTER HUNTER 3 LOOKS NEAR AS GOOD ON WII. ALSO, HD IS CLEARLY PROVEN NOT TO BE GRAPHICS, AS I'VE STATED BEFORE

NOW WII AWAIT FACTOR 5'S BLOWOUT, IT WILL BE ANOTHER STEP UP VISUALLY

XBOX 1? YEAH RIGHT. MONSTER HUNTER 3 LOOKS LIKE TWILIGHT PRINCESS AND RES EVIL 4 COMBINED, IN FULL 16BY9 PROGRESSIVE SCAN HIGH-RES COLOUR

NO XBOX CAN DO THAT. IT'S CLEARLY ON PAR WITH X360/PS3 TOP VISUALS MINUS HD MODES

DIDN'T I SAY THAT AT THE TOP OF THE THREAD IN THE FIRST DAMN PLACE :roll: :roll:

Sampson

On September 22, 2008 at 11:47 am

Someone could just as easily single out Ninja Gaiden Black, or DOA:U and say, “this clearly couldn’t be done on Xbox”. But they are.

I’ve seen Monster Hunter 3. It’s a good example of faking computationally expensive graphics effects. I see alot of flat shaded mountains, water that lacks reflection mapping and overall looks bland outside of cinemas, shadow maps on everything that aren’t dynamic, but prefabricated.
The same techniques you see in the Wii’s best games, attempting to use decent art design to compensate for lack of processing power. (ie, MP3 using no bump maps, but making sure the effects they do have look good, and are well crafted)

And something that couldn’t be done on an Xbox, doesn’t make it “clearly on par with 360 and PS3″ as it clearly isn’t. All developers agree. Even the ones that actually put effort into their engines.
Simple analysis of the hardware’s mathematical computational ability, (the very basis of graphics processing) would tell anyone, that it is not. Their best games demonstrate that it is not, developers verify that it is not…

Not much more to say than that.

wiiboy101

On September 22, 2008 at 3:22 pm

HD is not "GRAPHICS". just like i stated, the difference to the human eye, "WHERE IT MATTERS", not on paper,

between monster hunter if it were on ps3 and monster hunter 3 on wii is so slight that ARGUING IT IS BUFFOON-LIKE.....

HD is not a miracle worker, and ps3/x360 are not all powerful

clock wastage, clock thrashing, poor memory latency, slow disc streaming and a DO IT ALL INLINE CPU do not = massive power

it offers a step up from last gen plus some HD

the power you fans keep pretending is there is not actual power, as it's not used to power up the games, it's used and wasted SUPPORTING NATIVE HD

GET IT: the 256mb of ram, say, dedicated to graphics is taken up by bigger hd textures, so the 256mb of ram is not taken up with 256mb of extra graphics or detail

resolution is not the answer. the answer is efficiency and genuine tech evolution and some decent resolution to go with it

HD IS NOT GRAPHICS, GET IT. IF IT WAS, THEN HOW COME TRUE WII GAMES ARE FAR CLOSER TO PS3 THAN PS2

and in some cases better

haze on ps3 looks like an ape's ass, metroid prime 3 on wii looks gorgeous

i want to see the water and underwater stuff in wii monster hunter because the scans look lush

gamecube was water king last gen: water graphics, animations and physics

x360 is doing some fine-ass water. i want to see true wii water graphics and physics, i bet it's damn close to x360

another game showing gamecube's true might is the vs capcom thing coming to wii

it's being said in xbox fan forums that wii can't run it, it looks great, it must be either lindbergh or taito 2 boards, but not in HD

wrong wrong wrong

the arcade version is running on triforce, a gamecube with doubled-up 1tsram

so vs capcom in the arcade, a game that's on par with street fighter hd visually, is running on an arcade board 50% the power of wii

it's clear street fighter hd is possible on wii, in 480p that is, and it's clear wii can compete with lindbergh, sega's 3ghz cpu arcade board

like i stated HONESTLY, THE EA GUY LIED, XBOX FANS BELIEVE LIES AND FIFA WII IS NOT WII GRAPHICS

FIFA WII AIN'T EVEN PUSHING THE TRIFORCE BOARD, LET ALONE THE MORE POWERFUL WII

wiiboy101

On September 22, 2008 at 3:29 pm

IF I TAKE A PS2 GAME AND SIMPLY HD IT, change its native resolution to a true 1080p

THEN TAKE A TRUE WII-GRAPHICS-LEVEL GAME and display it at native 480p

WHICH GAME WOULD YOU THINK WILL HAVE THE BETTER GRAPHICS

THE WII, BY FAR. AS I'VE TRIED TO EXPLAIN BEFORE, HD IS NOT GRAPHICS

REMAKE GOD OF WAR 2 PS2 EXACTLY THE SAME BUT IN NATIVE 1080P HD

THEN REMAKE GOD OF WAR 2 FOR WII USING THE WII HARDWARE PROPERLY BUT KEEP IT AT 480P

THE PS2 VERSION WOULD GET RAPED BY THE WII VERSION IN GRAPHICS, FRAME RATE, AA, FILTERING, POLISH, DEBUGGING AND LOADING TIMES

THE 1080P RESOLUTION WILL SHARPEN UP THE PS2 VERSION. IT WOULDN'T GIVE THE PS2 VERSION A MAGICAL GRAPHICS OVERHAUL, WOULD IT

STOP CONFUSING RESOLUTION WITH GRAPHICS

wiiboy101

On September 22, 2008 at 3:32 pm

IF I PLAYED QUAKE 3 AT SUPER HIGH RESOLUTION ON A TOP-SPEC GAMING PC

AND COMPARED IT TO METROID PRIME 3 OR THE CONDUIT ETC, WOULD QUAKE, AS IT'S RUNNING AT ABOVE 1080P RESOLUTION, LOOK BETTER THAN A TRUE WII FPS GAME

AT 480P? NO

AGAIN I'VE JUST PROVEN THAT HD IS NOT GRAPHICS. WHEN WILL THE BRAINWASHED IDIOTS WAKE UP

wiiboy101

On September 22, 2008 at 3:39 pm

METROID PRIME 3 AT 480P NATIVE AND 480P DISPLAY UTTERLY CRAPS ON HAZE AT DISPLAY MODE 1080P

AGAIN I'VE JUST PROVEN GRAPHICS AND RESOLUTION ARE NOT THE SAME THING

I'M NOT A HD PROTESTER OR ANYTHING, I'M SIMPLY STATING WHAT IS TRUE VS THE MARKETING LIES OF SONY AND MICROSOFT

I'D TAKE BIG PHAT CUSTOM UPGRADES AND POLISH AND INNOVATION OVER RESOLUTION ANY DAY OF THE WEEK

I WOULD ALSO TAKE 480P @ 60 FRAMES OVER 720P @ 30 FRAMES ANY DAY OF THE WEEK

PRIME 3 WII = 480P, 60 FRAMES AND CLASS-LEADING FPS CONTROLS

HAZE PS3 LOOKS LIKE CRAP, RUNS LIKE CRAP, YET IS HD DISPLAYED, AND THE CONTROLS ARE PREHISTORIC AND CLUNKY

SO WHO'S NEXT GEN THEN, AH-HHHMMMMMMMMMMMMMMMMMMMMMM :roll: :roll:

wiiboy101

On September 22, 2008 at 3:42 pm

SO XBOX FANS ARE ARGUING THAT SLIGHTLY BETTER VISUALS ARE MORE IMPORTANT THAN GAMEPLAY, CONTROLS, LOADING SPEED AND FRAME RATE POLISH

HD VS COMMONSENSE. SORRY, COMMONSENSE WINS

LOOK AT MY FACTS, LOOK AT WII SALES. BOTH PROVE I'M RIGHT AND XBOX FANS ARE WRONG

wiiboy101

On September 22, 2008 at 3:43 pm

HALO 3: 30 FRAMES, LOADING ISSUES, GLITCH ISSUES AND OUTDATED PREHISTORIC CONTROLS

PRIME 3: 60 FRAMES, FAST LOADING, NO GLITCHES AND AMAZING FPS AND 3D MOTION CONTROLS

I REST MY CASE, WII IS A CLASS ACT :lol: :lol:

wiiboy101

On September 22, 2008 at 4:34 pm

WII: 27MB OF 1TSRAM AND 32GB/S OF BANDWIDTH, NOT COUNTING COMPRESSION, FOR THE GPU BEFORE EVER HITTING GDDR3 MAIN RAM

XGPU XBOX: ONLY A WEAK 128K GPU CACHE AND AN 8 TO 16MB CHUNK OF MAIN MEMORY JUST TO RUN THE GPU, AND THEN YOU GET YOUR VRAM FROM THE REMAINING POOL OF RAM, ALSO SHARED WITH THE CPU AND SOUND CHIP

CLEARLY WII KICKS ITS ASS

3ds_mac

On September 22, 2008 at 6:27 pm

Not making any difference. Prime 3 lacks bump mapping of any sort. Its developers explained why they avoided heavy shader work, and that's because Wii lacks power in that area. Sure, you can use bump mapping, but not to the extent you can on other platforms. Nothing about Prime 3's graphics is a technical marvel. Nothing in Galaxy is either. It's only nice to look at from an artistic point of view.

360/PS3 wouldn’t have any problem running what’s in Prime 3 at 720p, well above 60fps. Wii couldn’t run an ntsc version of Halo 3 at any real-time frame rate. It’s a matter of processing resources.

And Xbox doesn’t need 8-16 mbs for a fame buffer, given the way Gamecube copies to main ram as front buffer, then copies that to back buffer for display, coupled with z compression, Xbox likely uses less than GC.

And, there is no efficient disk streaming capabilities in Wii over PS3 and 360. Multicore cpus have far more potential in that area. And 360 and PS3 have 512mb of ram. Not 256 mb.

And I’m well aware of the differences in graphics that hd makes and does not make. And if Nintendo had stuck at least a 2002 gpu in their console, we could expect it to be somewhat computationally effective at running similar real-time graphics effects in NTSC, but unfortunately they didn’t.

But, we take what we get, and hope they do better next generation. They should at least be well ahead of the game with controls.
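
As a rough cross-check on the frame buffer point above, the sketch below works out how much memory a 480p colour/Z buffer actually occupies under a few stated assumptions (resolution, bytes per pixel, double buffering). The layouts are illustrative, not the exact memory maps of either console.

```python
# Back-of-the-envelope frame buffer sizes for 480p. Resolution, bytes-per-pixel
# and buffer counts are assumptions for illustration, not exact console layouts.

def buffer_mb(width: int, height: int, bytes_per_pixel: int, buffers: int = 1) -> float:
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

w, h = 640, 480
color_32bit_double = buffer_mb(w, h, 4, buffers=2)   # 32-bit colour, double buffered
z_32bit            = buffer_mb(w, h, 4)              # 24-bit Z padded to 32 bits
print(f"32-bit colour x2 + Z: ~{color_32bit_double + z_32bit:.1f} MB")

# With 16-bit colour and a 16-bit (or compressed) Z, the footprint roughly halves.
print(f"16-bit colour x2 + 16-bit Z: ~{buffer_mb(w, h, 2, 2) + buffer_mb(w, h, 2):.1f} MB")
```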

lennell

On September 23, 2008 at 3:57 am

you are right 3ds_mac, prime 3 had no bump mapping and little shader work in it because the wii lacks power in that area, because it is harder to do things like that on wii. but if the developer had wanted to try and put bump mapping and heavy shader work in prime 3, it might have been another six months before prime 3 came out. it would have come out in Q1 2008.

wiiboy101

On September 23, 2008 at 12:02 pm

METROID PRIME 3 HAS A 3D VERSION OF THE OLD METROID GRAPHICS

THE ALIEN WORLDS ARE BROUGHT TO LIFE WITH A TRULY ALIEN ART DIRECTION. BUMP MAPPING IS USED IN ABUNDANCE, AND THE REASON YOU THINK IT AIN'T THERE IS SIMPLE TO A MAN WHO CAN THINK

THE WAY THE TEXTURES BLEND AND THE ART CHOICES OF PRIME 1, 2 AND 3 ARE NOT "GENERIC"

IT'S NOT A BRICK WALL WITH BUMPS ON IT

IT'S NOT A ROCK FACE WITH BUMPS ON IT

IT'S NOT BODY ARMOR WITH BUMPS ON IT

IT'S A TRULY ALIEN LOOK WITH ALL MANNER OF BLENDING, COMBINING AND REALTIME CUSTOM TEXTURE MAPS

AND AS THE GAME OF THE YEAR AWARDS, RUNNER-UP AWARDS AND SPECIAL AWARDS FOR THE SERIES SHOW

AND WITH GAMETRAILERS' PRIME 3 BEING IN THE TOP 3 FOR GAME OF THE YEAR, WITH MARIO GALAXY WINNING, IT'S CLEARLY A GORGEOUS LOOKING AND SWEET PLAYING GAME

HALO IS GENERIC. OH LOOK, A TREE. OH LOOK, A ROCK FACE. OH LOOK, A BRICK-BUILT BUILDING

PRIME 3'S ART DIRECTION AND GEOMETRY KICK ALL ING ASS

YOU CAN'T SEE GENERIC BUMPS, AS THE GRAPHICS GO MUCH FURTHER THAN THAT, AND THE ART DIRECTION COMPARED TO ANYTHING ON X360/PS3 IS TRULY UNIQUE AND ALIEN

LET'S PUT OUR CARDS ON THE TABLE

EVERY SINGLE SHOOTER ON X360 AND PS3:

AN OVER-THE-TOP GUY IN OVER-THE-TOP BODY ARMOR, WALKING AND RUNNING AROUND IN OVER-THE-TOP OVERSIZED COMBAT BOOTS THAT IN REAL LIFE NO ONE COULD EVER RUN IN, SHOOTING GENERIC ENEMIES

METROID PRIME 3 HAS ITS OWN UNIQUE, UTTERLY STUNNING AND 100% ALIEN ART DIRECTION

METROID STANDS OUT ON ITS OWN

FPS ON PS3/PC/X360 ARE ALL EXACTLY THE SAME

GEARS: MEN IN BOOTS

UNREAL: MEN IN BOOTS

HALO: MEN IN BOOTS

HAZE: SKINNIER MEN IN, AGAIN, THE SAME BIG BOOTS AND BODY ARMOR

I CAN SEE UTTERLY GORGEOUS ART WHEN I SEE IT, AND I CAN ALSO SEE REPETITIVE, AMERICANIZED, PC-LED ART WHEN I SEE IT

ART DIRECTION WISE, HALO 3 LOOKS COMPLETE ASS

ART DIRECTION WISE, PRIME 3 IS A THING OF BEAUTY

AND PLAYING AN FPS ON ANYTHING BUT WIIMOTE AND NUNCHUK IS OBSOLETE AND OUTDATED. BUTTON MASHING IS SO LAST GEN

480P WIDESCREEN 60 FRAMES: PRIME 3

FAKE 720P, NOT REAL HD: HALO 3, AND STILL IT CAN ONLY HIT 30 FRAMES, AND THE CONTROL PAD STINKS

WII DOESN'T BUMP MAP? YOU ARE ON SOME POWERFUL FAN DRUGS

wiiboy101

On September 23, 2008 at 12:08 pm

XBOX 1 DIDN'T BUMP MAP. ALL BUMP MAPPING WAS SOFTWARE PROGRAMMED INTO THE SHADERS

GAMECUBE SUPPORTED DOT3 BUMP MAPPING IN HARDWARE, NOT JUST SOFTWARE, BOTH HARDWARE AND SOFTWARE

XGPU SIMPLY DID NOT

WII SUPPORTS BUMP MAPPING, NORMAL MAPPING, CUSTOM MAPPING, DEFORMATION MAPPING, TEXTURE MORPHING AND INDIRECT TEXTURING RIGHT OFF THE CHIP. THEY'RE ALL FULLY SUPPORTED, FULLY CUSTOMIZABLE AND PROGRAMMABLE VIA THE TEV UNIT'S HARDWARE BASED EFFECTS, ALL WITHIN TEV'S MULTI-TEXTURE LAYERED ENGINE

XBOX ONE DID NOT SUPPORT THESE THINGS AT ALL. THEY HAD TO BE SOFTWARE DRIVEN VIA THE SHADERS. THE GPU ITSELF SUPPORTED SWEET FUK ALL

GET YOUR FACTS RIGHT

wiiboy101

On September 23, 2008 at 12:16 pm

THE XGPU OF XBOX 1 HAD SHADERS. DID THEY SHADE? "HARDLY", AS 4X TEXTURE LAYERS, NORMAL MAPS AND BUMP MAPS ALL HAD TO BE PROGRAMMED IN THE SHADERS, SO ACTUAL CLEVER SHADING WASN'T POSSIBLE, AS THE SHADERS' POWER WAS USED UP JUST DOING THE GENERIC MAPS LIKE BUMP MAPPING

SORRY, THAT'S XGPU FACT

GAMECUBE, NEVER MIND WII, SUPPORTED THESE THINGS IN HARDWARE AND HAD TWICE THE TEXTURE LAYERS TO DO IT, AND ON TOP OF ALL THE MAPPING IT CAN CUSTOM MAP AND CUSTOM BLEND AND SHADE ON TOP OF THE BUMP MAPPING

TRY ACTUALLY PLAYING PRIME 3

CUSTOM MAPPING, COLOUR BLENDING, REALTIME TEXTURE SHADER/LIGHTING ROUTINES, HIGH-RES COLOUR, REFLECTIONS, TRANSPARENCY, 3D TEXTURE EFFECTS, BLOOM, DYNAMIC BLOOM, HEAT HAZE, DEPTH OF FIELD, THE LIST GOES ON

NOT ONLY CAN WII BUMP MAP, IT CAN CUSTOM MAP AND BLEND AS MUCH AS AN ARTIST SO DESIRES. THE TEV UNIT IS THERE TO GIVE THE ARTIST WHAT HE WANTS

STOP CONFUSING THE TEV UNIT WITH NO SHADERS. THE WII'S SHADERS ARE NOT CALLED SHADERS, THEY'RE CALLED TEV

AGAIN, PC BRAINWASHING AND BILL GATES BRAINWASHING HAVE YOU THINKING ANYTHING THAT'S ON A WII GPU THAT ISN'T ON A PC GPU MUST NOT ACTUALLY EXIST

THAT'S SOME REAL FANBOYING RIGHT THERE. WII IS NINTENDO, IT'S NOT A PC OR A PC-TYPE GPU. WHERE DO YOU GET YOUR BRAINS FROM, A ZOMBIE

wiiboy101

On September 23, 2008 at 1:09 pm

ROGUE SQUADRON 2 AND 3 ON GAMECUBE HAD BETTER BUMP MAPPING, MORE TEXTURE LAYERS AND HIGHER-RES COLOUR THAN ANY GAME ON XBOX 1 EVER

5 TO 8 TEXTURE LAYERS, 24-BIT TEXTURES, 24-BIT COLOUR AND LOADS OF BUMP MAPPING AT 60 FRAMES

XBOX COULD NEVER DO THAT. 1-DOT BUMP MAPPING VS DOT3, LOW BANDWIDTH VS HIGH BANDWIDTH, 16-BIT COLOUR VS 24-BIT AND 4 MAX TEXTURE LAYERS VS GAMECUBE'S 8

WII = 2.5X GAMECUBE, REMEMBER

wiiboy101

On September 23, 2008 at 1:15 pm

HALO 2 HAS A PATHETIC POLY COUNT, A CHOPPY 30 FRAMES AND POP-UP TEXTURES. THE LOW-LEVEL NORMAL MAPS TAKE WAY TOO MUCH OF XBOX'S POWER

ROGUE SQUADRON 2/3 HAVE TWICE THE FRAMERATE OF HALO 2, TWICE THE TEXTURE LAYERS, HIGHER-RES COLOUR AND WAY BETTER BUMP AND CUSTOM MAPS

OH, AND 7.1 SOUND, SOMETHING I'VE NEVER WITNESSED ON XBOX 1

MASTER CHIEF IN HALO 2 = 2K POLYGONS

LEON IN RES EVIL 4 ON GAMECUBE = 10K POLYGONS

3ds_mac

On September 23, 2008 at 2:05 pm

Once again Wiiboy, Prime 3′s developer themselves said Wii lacks shader power, they themselves said they didn’t focus on bump mapping, but what they have looks good for their uses.

“IGN Wii: Will the game feature anything like bump-mapping or normal-mapping?”

“Bryan Walker: We probably won’t be focusing much on bump-mapping or embossing. We’re really focusing on our lighting effects at this point. ”

So, as you can see, they chose to use their tev processing on things like specular highlights and gloss maps, etc.

I’ve played plenty of the game, and characters like that purple chick were ugly, blocky, and would have been laughed off the screen if it were on 360 or PS3.

And Prime 3′s art direction has nothing to do with Nintendo, their hardware, or even Retro Studios if you’d like to be technical, considering the lead artist left, and the concept artist is freelance.

And generic, is a description of the graphics effects used, not the art direction.

****XBOX 1 DIDNT BUMP MAP ALL BUMP MAPING WAS SOFTWARE PROGRAMMED INTO THE SHADERS****XBOX ONE DID NOT SUPPORT THESE THINGS AT ALL THEY HAD TO BE SOFTWARE DRIVEN VIA THE SHADERS THE GPU ITS SELF SUPPORTED SWEET FUK ALL****

Analogy, “fixed function” guy, stands in one spot, and when he’s handed a piece of paper, he folds it into a particular shape. He can fold it into this shape, that shape, or this other shape.
“Programmable” guy, can fold the same shapes, except he’s not an idiot, he can be taught to fold into various shapes not supported by the “fixed function” guy.

If you wanted to do some of the shapes “programmable” guy has learned to do, he has no choice but to hand it to the cpu to cover it. Cpu is already responsible for preparing more things for the gpu as it is. He’ll be doing vertex shader work, he’ll be driving the gpu.

And TEV is just another name for register combiner. The fixed function hardware in Gamecube and WIi, computes a series, a chain of floating point and other mathematical operations in a fixed manner.
Mul x2, square root, etc.. That is the way all gpus functioned, until programmable shaders came about. Programmable shaders perform the Exact same function, except you can configure the math operations it computes, more to what you want computed, rather than what is set in the fixed chain.

Again, floating point processing is performed by the gpu, in the exact same manner as is done on Gamecube, except the math can be configured to compute what is needed for things like normal mapping. Wii has support for embm and embossing, Normal mapping and any other various forms of mapping would have to be emulated on the cpu. Those are the basis of its texturing ability. They didn’t extend that mathematical processing ability in the Wii, as has been verified by every developer who has spoken about it. High Voltage, Retro Studios, included..

Overall floating point processing in those is known, as is the math they compute, and how they are computed. TEV is a decent alternative to “shaders” in Xbox, they aren’t decent by today’s standards, and weren’t extended by Nintendo.

And yet again, you’re citing the number of texture layers per render pass. Most games multipass regularly. PS2 can only do 1 texture per render pass, but you’ll notice many games do 2 sometimes 3 texture layers.

wiiboy101

On September 23, 2008 at 2:07 pm

SCALE XBOX AT 1, X360 AT 10

WHERE ON THAT SCALE IS WII? EDUCATED GUESS, I'D SAY 6 PLUS

HARDLY AN XBOX NOW, IS IT. MULTI-LAYERED TEXTURES, SHADERS, BUMP MAPPING ETC IS GRAPHICS, NOT HD. GET IT

wiiboy101

On September 24, 2008 at 10:12 am

house of the dead: overkill screens and a preview today clearly show wii is rendering the best-looking hotd game ever

yes, the last version, hotd 4, is running in hd on a 3ghz pc-based arcade board

wii is running at 480p and looks loads better. it shows wii is more powerful than idiots think, and it also proves hd is not graphics........

wiiboy101

On September 24, 2008 at 2:56 pm

no one can dispute that x360, ps3 and pc games are all being sterilized with the whole "quake, we can't move on" art direction and samey shoot-a-guy games

that's not gaming, that's lazy "it sells" development and publishing

the main guy in gears of war is almost identical to a guy in quake 3

it's tiring, it's boring, it's repetitive, it's lame, AND I THINK IT'S EVIDENCE OF THE NARROW ACTUAL ABILITIES OF THE SHADERS. IF THEY'RE SO POWERFUL, WHY IS EVERYTHING THEY DO EXACTLY THE SAME?

GEARS OF WAR, UNREAL TOURNAMENT, QUAKE, HAZE ETC ETC ETC COULD ALL BE THE SAME DAMN GAME. TALK ABOUT LAME DEV/PUBLISHING

GOD FORBID SOMEONE INNOVATES, GOD FORBID AN ARTIST TRIES A DIFFERENT TYPE OF ART, GOD FORBID A PROGRAMMER TRIES SOMETHING NEW WITH SHADERS

NO ONE CAN ARGUE AGAINST THIS, IT'S AMERICAN GAME DEVELOPMENT FACT

AND DON'T GET MII STARTED ON POP-UP, GLITCHES, BUGS, TEARING, FREEZING AND PATCHES, ANOTHER LOAD OF CRAP YOU MICROSOFT FANS BROUGHT TO CONSOLES

wiiboy101

On September 24, 2008 at 3:05 pm

XBOX: 8 TO 16 MB OF MAIN RAM WASTED ON GPU BUFFERING

WII: ALL BUFFERING DONE ON THE GPU IN A 3MB EDRAM CACHE, AT VASTLY SUPERIOR BANDWIDTH, SPEED AND BALANCE COMPARED TO ANYTHING IN XBOX 1

XBOX LOSES BANDWIDTH AND MEMORY FROM MAIN RAM JUST TO RUN THE GPU. ADD TO THAT VERY POOR LATENCY AND BALANCE AND AN FSB BOTTLENECK

EVEN WHEN GOING OUTSIDE ITS EDRAM CACHE BUFFER, WII HAS A FURTHER 24MB OF 1T-SRAM TO DO INTENSE GRAPHICS AND TEXTURE WORK WITH

THEN 64MB OF GDDR3 TO FEED OFF AS MAIN RAM

THE WII RENDERS GRAPHICS AND USES BANDWIDTH VASTLY MORE EFFICIENTLY AND EFFECTIVELY THAN XBOX, AND ITS LATENCY IS MANY TIMES BETTER. TO DISMISS A 3MB EDRAM CACHE AND 24MB OF FAST SRAM EMBEDDED ON THE GPU DIE SHOWS YOU'RE IGNORANT

OH, A PC OR AN XBOX DOESN'T USE 1T-SRAM, SO AS THE BRAINWASHED MICROSOFT WORSHIPPER I AM, I'LL DISMISS 1T-SRAM'S GREAT PERFORMANCE

JOKERS

wiiboy101

On September 25, 2008 at 3:46 pm

FILL RATE OF 1.1 GPIXELS/S AND A DUAL-TEXTURE FILL RATE OF 2.2 GTEXELS/S

BANDWIDTH 10GB PLUS

TOP-SPEC GEFORCE 4 Ti 4800, AN HD-CAPABLE GPU CARD FOR PC THAT IS WAY ABOVE XGPU

10GB-PLUS BANDWIDTH FROM 128MB OF RAM, WITH ALL GPU BUFFERING TASKS RUNNING IN THAT MAIN GPU RAM, AND IT'S AN INEFFICIENT CARD IN AN INEFFICIENT PC THAT HAS TO SUPPORT HIGH RESOLUTIONS

HOLLYWOOD GPU: 2MB EDRAM Z/FRAME BUFFER AT 12GB + 1MB TEXTURE CACHE AT 16GB = 28GB TOTAL, NOT COUNTING 4:1/6:1 COMPRESSION OF DATA, TEXTURES ETC

PLUS 24MB OF 1T-SRAM ON THE GPU DIE AT 4GB. HOLLYWOOD AND ITS 1T-SRAM ARE AMAZINGLY BALANCED AND FAST (LOW LATENCY), AND ALL COMPRESSED TEXTURES ARE DECOMPRESSED ON THE FLY IN THE GPU IN REAL TIME

HOLLYWOOD IN WII IS CAPPED AT 480P; PC GPUS HAVE TO SUPPORT ALL MANNER OF RESOLUTIONS

HOLLYWOOD IN WII HAS HUGE BANDWIDTH AND FILLRATE PERFORMANCE AND GREAT LATENCY, BALANCE AND COMPRESSION

WII CLEARLY HAS THE "BALLS" TO PULL OFF HD HIGH RESOLUTION AND IS A VERY COMPETENT GPU

BUT CAPPED AT 480P IT'S CLEARLY GOING TO HAVE NO BOTTLENECKS AT ALL AT 60 FRAMES, SUPPORTING ALL EFFECTS

ADD VIRTUAL TEXTURING AND TILE RENDERING INTO THE MIX AND IT'S CLEAR WII HAS WHAT FACTOR 5 DESCRIBED AS A HUGE FILL RATE

IN FACT WII'S EFFECTIVE FILLRATE AND BANDWIDTH ARE ABOVE A GEFORCE 4 4800, AND IT'S A CONSOLE, AND IT'S RESOLUTION-CAPPED AT 480P

SO ACTUAL PERFORMANCE WITH SOLID FRAMERATES IS EXCELLENT

LOW RESOLUTION ALLOWS HUGE FILLRATE, GEOMETRY AND POLYGON COUNTS, AS PIXEL/TEXEL FILL IS NEVER STRESSED AT SUCH AN EASY RESOLUTION FOR MODERN GPUS

AND WII IS OPTIMIZED TO WORK AT THIS RESOLUTION. WORKABLE, USABLE FILLRATE AND BANDWIDTH ARE HUGE, SO X360-LIKE VISUALS MINUS HD ARE CLEARLY POSSIBLE

3ds_mac

On September 26, 2008 at 5:04 pm

****it's tiring, it's boring, it's repetitive, it's lame, AND I THINK IT'S EVIDENCE OF THE NARROW ACTUAL ABILITIES OF THE SHADERS. IF THEY'RE SO POWERFUL, WHY IS EVERYTHING THEY DO EXACTLY THE SAME?****

No, this is evidence that they’re licensing and using Unreal Engine 3.
Nothing to do with flexibility of the gpu’s shader ability. If we were looking at that, modern gpus are several generations ahead of Wii.
You can quote ATI themselves on that one.

****FILL RATE OF 1.1 GPIXELS/S AND A DUAL-TEXTURE FILL RATE OF 2.2 GTEXELS/S, BANDWIDTH 10GB PLUS, TOP-SPEC GEFORCE 4 Ti 4800, AN HD-CAPABLE GPU CARD FOR PC THAT IS WAY ABOVE XGPU.****

Nope, that isn’t “way above Xbox”. Exact same number of shader, rops, and vertex processors that are configured the same.
4 pipelines, 2 texturing units per pipe. 2 vertex processors, double data rate 128-bit bus to ram.

Xbox's fill rate is 933 million pixels and 1.87 billion texels because it's clocked at 233MHz.
The GF4 you speak of gets 1.1 billion pixels and 2.2 billion texels because it's clocked at 275MHz instead of 233. It's a revision of GF3 with an additional vertex processor and hardware AA.
Xbox also has the additional vertex processor, and the AA was actually in GF3 already; it just wasn't exposed by DirectX 8, so it didn't get used on PC until GF4.

Xbox had 6.4GB/s of bandwidth to main ram, while the GF4 you speak of had 10, because it used ram clocked 33% faster.

The main difference is clock rate, with some tweaks. The base specifications are the same. There are far more differences between Wii and Gamecube than there were between Xbox and GF4.
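
The fill-rate figures being traded above follow directly from pipeline count times clock; a quick arithmetic check of the numbers quoted in this exchange:

    # Pixel rate = pixel pipelines x core clock; texel rate doubles it
    # because each pipeline has two texture units (figures quoted above).
    def fill_rates(pipelines, tmus_per_pipe, clock_mhz):
        pixels_per_s = pipelines * clock_mhz * 1e6
        texels_per_s = pixels_per_s * tmus_per_pipe
        return pixels_per_s / 1e9, texels_per_s / 1e9

    print("Xbox NV2A @ 233MHz:", fill_rates(4, 2, 233))  # ~0.93 Gpixel/s, ~1.87 Gtexel/s
    print("GF4 Ti    @ 275MHz:", fill_rates(4, 2, 275))  # ~1.10 Gpixel/s, ~2.20 Gtexel/s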

wiiboy101

On September 27, 2008 at 3:20 pm

pro evo 09 wii edition

advanced 3d motion and 3d mouse, new physics and a.i., new graphics

co-op with standard control pads, with the main player using wiimote and nunchuk for total team control

new defence-line push-up/fall-back via the analog stick, new aim-with-wiimote-then-shoot mechanic

PES 09 IS ALL ABOUT WII

advanced realtime wiimote RTS-style controls, standard controller options added, co-op with both control schemes, mii players, all the modes of the x360/ps3 versions included, plus a new wiimote-only pick-up-and-play mode with d-pad, shake and buttons

the list goes on. pro evo wii 09 is simply another level of football video gaming, and it's this kind of polish and innovation that makes ps3 and x360 look utterly pointless

wiiboy101

On September 27, 2008 at 3:44 pm

HSR (hidden surface removal) helps improve fillrate/bandwidth effectiveness

tile rendering also does this, but better

edram cache buffers also help do this

pc gpus support hsr with various levels of success

ps2 supported edram

powervr gpus (kyro, sega etc.) and now handhelds like phones use tile rendering to help

well let me just tell you that wii's hollywood is

tile rendering/HSR/edram all in one gpu. combined with very fast ram and a balanced system, wii's fillrate/bandwidth gets a tasty boost. add to this a powerful virtual texturing ability and real-time decompression

that's some efficiency right there. i'd love to know wii's real effective fillrate numbers, they're clearly much, much higher than wii think

wiiboy101

On September 28, 2008 at 10:10 am

http://www.gamekyo.com/videoen13210_monster-hunter-3-tri-new-video.html

apparently wiiboy is delusional. so delusional, in fact, that his delusions embed themselves on game sites as monster hunter 3 realtime graphics

THE POOR MOBILE PHONE VIDEO IS SUCH A TEASE, CAPCOM. WII ALL KNOW IT'S GOING TO LOOK AMAZING

DELUSIONAL WIIBOY, OR WII GRAPHICS FACT? AND AGAIN, HD PROVEN NOT TO BE GRAPHICS

SO IS FIFA PUSHING WII GRAPHICS? CLEARLY NOT

THE DELUSIONAL ONES ARE THE XBOX FANS AND THEIR INSISTENCE THAT X360 AIN'T DEAD

YOU CAN BUY X360S IN THE UK FOR $109; WIIS ARE 179.99 AND HAVE NEVER HAD A PRICE DROP

LOOK AT THE DREAMCAST-STYLE PRICING OF X360. IT'S DEAD

lennell

On October 4, 2008 at 10:23 am

i have to say those are some good pictures of monster hunter 3 you got there wiiboy, and the conduit is starting to look better and better.

Sampson

On October 4, 2008 at 3:04 pm

*****look at it xbox 360 like not xbox 1*****

More PS2-like actually. I guess you haven’t played MH2.
Not interesting for its processing, just art design of its pseudo 3d environments. They moved it to Wii, so they could continue in that style.
Quick low-res YouTube links:
http://www.youtube.com/watch?v=HLyynEhzUkE
http://www.youtube.com/watch?v=ICc8LTWl2Is

Sampson

On October 4, 2008 at 3:06 pm

A couple more.

http://www.youtube.com/watch?v=qq61BeswRPM
http://www.youtube.com/watch?v=_fC3Bv4kytM

http://media.www.gamestats.com/media/751/751063/imgs_1.html

wiiboy101

On October 5, 2008 at 3:00 pm

xbox 360 is now less than 99 pounds in the uk. THAT'S THE PRICE OF DEATH

wiiboy101

On October 5, 2008 at 3:06 pm

cod 5 wii is confirmed near x360. every preview states "as we approached the screen we thought it was an xbox 360 build", with wii slightly below ps3/x360.

so how is fifa "wii graphics", and how is wii equal to or lower than xbox 1?

again this confirms the thread's article as anti-wii bull....

by holding back native resolution, wii can do the same games, LIKE I ALREADY TRIED TO EXPLAIN, HHHHMMMMMMMM

conduit looks amazing as well

wii has vastly superior fps controls and gameplay depth for any game really, but fps obviously

if people are arguing slightly better graphics vs VASTLY BETTER CONTROLS AND GAMEPLAY, PLUS WIIMOTE SPEAKER SOUNDS, PLUS FASTER LOADING, PLUS NO INSTALLS, PLUS NO PATCHES REQUIRED

I'M SORRY, YOU'RE FANBOYING

I'VE PURCHASED A WELL-DESIGNED, WELL-BUILT, FAST-LOADING, GOOD-GRAPHICS CONSOLE THAT ON TOP HAS 3D MOTION AND A 3D MOUSE

XBOX 360? NO THANKS. PS3? NEVER IN A MILLION YEARS

Sampson

On October 6, 2008 at 9:40 am

As was said before, if the other consoles recycled their hardware, but used a new control scheme, I wouldn’t be a console gamer of any type.
For me, the Wiimote isn’t enough to make up for the hardware, or lack of games. To each his own though.

3ds_Mac : ****And Prime 3′s art direction has nothing to do with Nintendo, their hardware, or even Retro Studios if you’d like to be technical, considering the lead artist left, and the concept artist is freelance.****

Come on now, Retro had a whole list of artists, (concept and in-game).
I’m sure the art and design directors that left were a big part, but they have other talented folks in the art and design department. Staffs shift around all the time. Retro still has plenty of talented folks.

wiiboy101

On October 6, 2008 at 3:04 pm

cod 5 on a dualshock? NEVER. cod 5 on an x360 pad? OK, I'D KINDA ACCEPT THAT, IT'S NOT THE WORST STANDARD PAD OUT THERE

cod 5 on a wavebird wireless gamecube pad? BETTER THAN BOTH OF THE ABOVE. THE WAVEBIRD IS THE HEIGHT OF TRADITIONAL PADS, WITH SWEET NINTY ANALOG STICKS

cod 5 on wiimote/nunchuk? NOW WE ARE TALKING: TRUE NEXT-GEN FPS GAMEPLAY, ONLY ON WII

ALL OF THE ABOVE IS FACT

Sampson

On October 6, 2008 at 7:24 pm

That’s opinion Wiiboy. I liked Nintendo’s sticks, hated the d-pad, and didn’t like the button layout for anything but Nintendo games. That is also opinion.

I also wasn’t all that impressed by COD5 screenshots. They have some ugly designed textures here and there, and you can catch a screen cap of some unimpressive shots if it doesn’t emphasize lighting, or shadows from trees, or fire, etc.. (in the PS3 and 360 version anyway)
Infinity Ward seemed to have done better thus far on COD4. Even though they moved the franchise up to 60 fps over COD3, they still improved the visuals.
http://media.xbox360.ign.com/media/902/902590/imgs_1.html

But Wii…well, there doesn’t seem to be much lighting of any type really, fire doesn’t cast lighting, they just made it bright around the edges sometimes. Screenshots for it look….woof. I especially like the sandbags in some of these.
http://media.wii.ign.com/media/142/14222041/imgs_1.html

It literally looks terrible in most shots.

Despite that however, I might look into it if the controls are done well. I’m burned out on COD games for a while, but a Wii controller version would qualify as “fresh”, assuming it’s well implemented.

WIIBOY101

On October 9, 2008 at 3:04 pm

next gen, if nintendo or maybe all console manufacturers go to a newer edram design and systems on chips

the whole "i'm brainwashed by pc" way of thinking is going to have the xbox fans stumped HA HA HA HA HA HA HA

I CAN'T WAIT FOR A SYSTEM BUILT COMPLETELY AROUND EDRAM AND ULTRA-HIGH-SPEED STORAGE MEDIA, say 128mb of edram and slow-looking clock speeds, and then to wait for the same kind of guys posting here about x360 etc

GOING "BUT IT AIN'T GOT 2 GIG OF RAM LIKE ME PC HAS, IT'S GOT LESS RAM THAN X360"

BECAUSE THEY JUST DON'T GET IT AT ALL :roll: :roll: :roll:

the future is high-speed media, high-speed edram, systems on a chip. it ain't

MORE AND MORE DRAM, MORE AND MORE CLOCK SPEED, MORE AND MORE HARD DRIVE. YOUR WAY OF THINKING IS DEAD, AND I FIND IT HILARIOUS THAT PEOPLE JUDGE EVERYTHING BY HOW THEY'RE TOLD TO JUDGE A PC

BY THE GUYS MAKING THE PROFITS

YOU'RE PRICELESS, MICROSOFT FANS, PRICELESSLY STUPID

lennell

On October 17, 2008 at 6:36 pm

wiiboy, i just looked at dead rising wii on youtube and there are people who think the ps2 is more powerful than the wii. we all know that's wrong. to see the video and the stupid comments, go to youtube and search [ dead rising wii comparison lol ] and read the stupid comments.

wiiboy101

On October 20, 2008 at 2:19 pm

i've seen the videos. it's all about ON SCREEN ZOMBIE COUNTS, like that matters, and a load of sprites in a ps2 game were being directly compared to full 3d polygon enemies on wii

which in itself is damn stupid. and how does dead rising represent the wii anyhow? it's a cash-in game with a small dev budget and a lower-tier capcom team. it's not going to be a true wii game, so using it to say how powerful wii is is, again, sony fans etc talking CRAP

the wii version has a different camera/camera angle, motion controls and an arcade-like experience. dead rising x360/wii are two separate games, they're not exactly the same

if dead rising wii is real wii power, then explain conduit, monster hunter 3 etc

if dead rising's graphics and zombies prove wii's on-screen action/physics capability

then doubters, please explain the mario 128 tech demo (cube, not wii)

then explain elebits' on-screen object counts and physics

dead rising wii: 100 zombies max with very low a.i.; in areas where there's 10 or so, better a.i.

but elebits has 100s and 100s of objects on screen at 60 frames, all with their own real-time physics

mario 128 (gamecube, not wii): 128 marios, all with their own a.i. and physics

suda 51 just commented on wii, stating graphics will get better and better BECAUSE 3RD PARTIES HAVE TO SUPPORT IT NOW THE USERBASE IS HUGE AND GROWING. AN ANTI-NINTENDO MENTALITY WILL PUT THEM OUT OF BUSINESS

ps4 will be a refined ps3 WITH EDRAM

THE NEXT XBOX WILL BE ANOTHER CASE OF MORE EDRAM AND POLISHED X360 CHIPS

WII HD WILL BE A TIGHTLY EMBEDDED, ULTRA-EFFICIENT EDRAM DESIGN WITH FAST MEDIA FOR LOADING AND STREAMING

THE 2GB/4GB/6GB RAM, 1GB GPU CARD APPROACH IS THE PAST, AND IT'S PC

A VERY CHEAP CONSOLE WITH HIGH POWER AND HD CAN BE BUILT FOR FAR LESS

WII HD JUST MIGHT BE WHAT SHIGSY WANTED: AN ALL-POWERFUL, ALL-3D-MOTION-READY, PICK-UP-AND-PLAY CONSOLE @ $99.99

CONSOLE WAR OVER

wiiboy101

On October 20, 2008 at 2:27 pm

PS2 FRONT SIDE BUS: SHARED BETWEEN GPU, CPU, SOUND CHIP AND 2 VECTOR UNITS, 150MHZ, 1.5GB BANDWIDTH, NO COMPRESSION

WII FRONT SIDE BUS: NOT SHARED, JUST CPU TO GPU, 243MHZ, PLUS PEAK 4:1 COMPRESSION = 7.7GB BANDWIDTH, NOT SHARED

PS2: 32MB MAIN RAM WITH NO DIRECT ACCESS FOR THE GPU ETC, ONLY INDIRECT ACCESS VIA THE CPU, AN OLD-FASHIONED BOTTLENECKED DESIGN WITH EXTREMELY SLOW RAMBUS RAM

WII: 88MB MAIN RAM, 24MB DIE-EMBEDDED 1T-SRAM PLUS 64MB GDDR3 EXTERNAL RAM, ALL DIRECT TO GPU AND CPU, AND A DEDICATED SOUND BUS TO THE SOUND CHIP, ALL SUPPORTING COMPRESSION AND REAL-TIME DECOMPRESSION

1T-SRAM IS OVER 10X FASTER THAN PS2'S RAMBUS RAM, GDDR3 ABOUT 3X FASTER (LATENCY, BALANCE)

PS2 CPU AND VECTOR UNIT 1 = EMOTION ENGINE = 12 MILLION TRANSISTORS

BROADWAY CPU ALONE = 20-PLUS MILLION TRANSISTORS, AND YET IT'S TINY DUE TO IBM'S EMBEDDED MICRO DESIGN

WII @ 480P IS 5X PS2 @ 480P LEVEL PERFORMANCE

THOSE COMPARING WII TO PS2 ARE INSANE
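
For context, the 7.7GB figure quoted above only works out if you assume a 64-bit bus and treat 4:1 as an ideal, always-achieved compression ratio; both are assumptions made here purely to show where that number would have to come from, not published specs.

    # Peak FSB bandwidth = clock x bus width, times an ideal compression ratio.
    # 64-bit width and a sustained 4:1 ratio are assumptions for this check.
    clock_hz = 243e6
    bus_bytes = 8        # assumed 64-bit bus
    compression = 4      # best-case ratio quoted in the post
    print(clock_hz * bus_bytes / 1e9)                  # ~1.9 GB/s raw
    print(clock_hz * bus_bytes * compression / 1e9)    # ~7.8 GB/s with ideal 4:1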

lennell

On October 21, 2008 at 7:52 am

wii @ 480p is 5x ps2 @ 480p level performance. that's about right, i agree with that wiiboy.

3ds_Mac

On October 26, 2008 at 7:14 pm

PS2's cpu reads from main ram on dedicated bandwidth, and passes data to the gpu over a dedicated fsb interface.
The Gamecube / Wii cpu sits at the top of a front side bus. Its only connection to any form of ram is through the gpu's memory controller.
Access to main ram, as well as the interface to the gpu, is completely shared:
reading up the fsb to get data from main ram, and passing data back and forth to and from the gpu.
The fsb is its lifeline to ram.

That, of course, isn't to say PS2 is more powerful than Wii, as that would be stupid. But we can ignore how things actually work for the sake of exaggeration and fanboyism. That's really how the world should work.

lennell

On October 27, 2008 at 11:41 am

main ram on dedicated bandwidth, that's right 3ds_Mac. and the ps2 was not even as powerful as the gamecube; some developers found that a ps2 port to gamecube only uses about 30% of the gamecube's power.

WIIBOY101

On November 10, 2008 at 2:02 pm

wii reads data from ram to the cpu ""DIRECTLY"" (THE FACT THE MEMORY CONTROLLER IS ON A SYSTEM-CORE LSI, AKA THE HOLLYWOOD GPU, HAS NOTHING TO DO WITH WHETHER IT'S DIRECT OR NOT)

wii reads data direct to its gpu, there's nooooooooo fsb sharing at all. FACT, ENGINEERING FACT. LIKE I STATED ABOUT WII'S SUPERIORITY, WII HAS NO SHARING ON THE FSB, FACT

wii also has a dedicated sound bus (ps2 no, ps3 no, psp no, x360 no, xbox no). only wii has this clever, separate, dedicated sound bus, like gamecube did. no fsb sharing at all, not even sound data, ENGINEERING FACT,
STRAIGHT TO A DEDICATED SOUND PROCESSOR. PS3 AND X360 DON'T EVEN HAVE A SOUND CHIP, COUGH COUGH. AGAIN, DIRECTLY COMPARING CELL, THE X360 CPU AND BROADWAY IS INSANELY STUPID

THE HOLLYWOOD LSI SYSTEM CORE IS A CONSOLE HEART, NOT A MERE GPU. IT ALSO HOUSES CONTROLLERS FOR WIFI, MOTION PLAY ETC, PLUS EXTERNAL CHIPS, SO THE CPU CAN BE LEFT FREE TO DO TRUE GAME-CENTRIC WORK

CELL HAS TO DO EVERYTHING / THE X360 CPU HAS TO DO EVERYTHING

AGAIN, THE MORONS CAN'T SEE THAT DIRECTLY COMPARING CELL, THE X360 CPU AND BROADWAY IS INSANELY STUPID (WII USES DEDICATED HARDWARE, IT'S CALLED CONSOLE DESIGN)

GBA HAS A CLEVER 32K iwRAM (SRAM) NEXT TO ITS ARM7 CPU FOR GREAT RAM SPEED AND CPU PUSHING

DS HAS 64K ON ITS ARM7 PLUS 32K MORE SHARED BY THE ARM7 AND ARM9 CPUS, BUT MOST DEVS DEDICATE THE LOT TO THE ARM7 TO BOOST ITS PERFORMANCE (CLEVER ON-CHIP FAST WORK RAM TO PUSH CPUS MUCH FURTHER THAN YOU'D EXPECT)

OH LOOK, WII HAS DIE-EMBEDDED 24MB 1T-SRAM FEEDING THE CPU AND GPU DIRECTLY, MORE SO THE GPU, RIGHT ON THE CHIP. AGAIN, IT'S THE NINTENDO WAY, THE CONSOLE WAY

NOT THE PC WAY. STILL THIS FACT DOESN'T SINK IN FOR PC FANBOYS AND X FANBOYS: "IT AIN'T LIKE MY PC, I DON'T UNDERSTAND IT"

PS2'S FRONT SIDE BUS FEEDS THE CPU, VECTOR UNIT 0, VECTOR UNIT 1, THE GS UNIT AND THE SOUND CHIP. OH LOOK, A VERY BUSY, WASTED, BOTTLENECKED BUS

WII'S FSB IS GPU TO CPU ONLY. CPU FROM RAM IS DIRECT THROUGH THE GPU, CPU TO GPU IS DIRECT, RAM TO GPU IS ALSO DIRECT, AND RAM TO SOUND CHIP IS ALSO DIRECT

ADD TO THAT GREAT COMPRESSION AND DECOMPRESSION, LIGHTNING-FAST 1T-SRAM AND HUGE CACHES, AND CLEARLY WII UTTERLY DESTROYS PS2

SO AS I'VE STATED AS FACT MORE THAN ONCE ON THIS THREAD, THE EA GUY LIED, DID HE NOT. EA ARE LIARS AND SO ARE XBOX FANS

I'VE CLEARLY PROVEN THAT FACT

REMINDER: THE WII CPU HAS A DIRECT-OVER-FSB RAM LINK, THE GPU HAS A DIRECT-TO-RAM LINK, AND SO DOES THE DEDICATED SOUND BUS. IT'S WII ENGINEERING DESIGN FACT

WII HAS A SHARED BUS? COUGH COUGH COUGH, NO IT DOES NOT :roll: :roll:

COD 5 WII, CONDUIT, DEADLY CREATURES ETC ETC ETC DO NOT HAVE PS2 OR XBOX GRAPHICS, THEY CLEARLY HAVE NEXT-GEN GRAPHICS, AND AGAIN, HD IS NOT GRAPHICS

AGAIN MY HONESTY BLOWS AWAY YOUR FANBOYING, AS IT ALWAYS DOES :roll: :roll:

A CONSOLE NOT BASED AROUND FULL 3D MOTION AND A 3D MOUSE IS AN OBSOLETE CONSOLE. ONLY WII IS NEXT GEN, PROVE ME WRONG :lol: :lol:

WII CONTINUES TO OUTSELL PS3 AND X360 COMBINED, DS KILLS PSP AND IPHONE COMBINED, SO LET'S STAY IN THE REALM OF REALITY, NOT X-FANBOY FANTASY

WIIBOY101

On November 11, 2008 at 8:26 am

confirmed by developers: the average ps2-to-gamecube port-and-polish job pushed gamecube to only 30 to 40%. gamecube is ps2 x 2.5, wii is 2x-plus gamecube

so wii = 5x-plus ps2. SO HOW IS FIFA WII "WII GRAPHICS" WHEN IN FACT IT'S SLIGHTLY POLISHED PS2 GRAPHICS?

WHY ARE YOU STILL IN DENIAL, X360 FANS? ARE YOU STUPID?

2, MAYBE 3 TEXTURE LAYERS WITH 8-BIT COLOR = PS2

8 TEXTURE LAYERS, 16-STAGE SHADERS, FULL EFFECTS, 24-BIT TEXTURES AND COLOUR, AA/FILTERING ETC = WII

EA LIED, DIDN'T THEY, X360 FANS, AND YOU'RE ALSO LYING. WII IS NOT PS2. ROLLS EYES

3ds_Mac

On November 18, 2008 at 8:48 am

I'm not even going to read past the first few lines Wiiboy; everything you have posted regarding texture layering, bit depth, etc. has been dismantled, not just by detailed explanation of how things actually work, but by various developer quotes as well, from Retro to Ubisoft.

The fact that you haven't a clue what you regurgitate was revealed ages ago; trying to save your clueless argument by reposting the same old stuff with the word "fact" after it doesn't change reality.

And you can compare Broadway to Cell or Xenon any time we'd like. The sound processing you're referring to on Wii is Pro Logic. It's stereo, not legit 3-dimensional sound positioning, not encoded surround sound, etc.
360 has an audio processor in hardware for decoding and decompressing sound streams; it uses the cpu to encode to 5.1, which, from a cpu's perspective, is a cheap task.
An SPE can perform audio work faster than Wii's entire cpu and audio chip combined, and do so with legit 3d sound and 5.1+ surround sound encoding.
Cell has its own pool of ram, with its own bus and bandwidth to it, as well as a separate fsb interface to the gpu, FlexIO, etc.

And again, Broadway sits at the top of an fsb. It reads data over that fsb using the same bandwidth it uses to interface and trade data with the gpu.

And any data the memory controller supplies to the cpu from main ram uses the very same bandwidth the gpu uses to read data for itself. Any bandwidth devoted to the cpu from main ram can't be used by the gpu.

Hence, the bus and bandwidth to main ram are shared between the cpu and gpu.

And the fsb's bandwidth is shared between reading data from main ram for the cpu, and feeding and exchanging data with the gpu. Xbox used push buffers, and simply points the gpu to compressed, precompiled meshes in main ram; PS2's cpu is directly connected to ram, while its gpu is not.
Totally different set-ups, with different meaningful numbers, nothing useful for your fanboy arguments.

That's just how things work Wiiboy; maybe one day you'll understand that as well... perhaps.

Wii is underpowered by today's standards.

Engineering fact, that even the most delusional and extreme fanboyism can't change.

Maybe in the next round of consoles, Nintendo will give you something worthy of pretending to understand the inner workings of, so your trolling of the internet with ignorant copy-and-pasted arguments won't make you look like such a fool...
Probably not though, on all accounts.

3ds_Mac

On November 18, 2008 at 10:05 am

In closing, overall, 360/PS3…..Wii.Xbox.Gamecube.PS2.DC, etc, etc..
And you’re right, Wii isn’t a PS2, it’s an overclocked Gamecube with more ram.
It is what it is, and that's all it is. Signed, Shigeru Miyamoto.

WIIBOY101

On November 18, 2008 at 1:20 pm

MAIN RAM, WII = 8GB BANDWIDTH = 480P

MAIN RAM, X360 = 24GB BANDWIDTH = 3X WII

720P = 3X 480P, AND 8 X 3 = 24

OH LOOK, WII IS DEAD EQUAL TO X360 MINUS THE NATIVE HD

xbox 360 fans, go to school and learn some math

480p @ 8gb is the same effective performance as 720p @ 24gb, minus the higher native resolution

aka 480p x360-like performance, NOT XBOX 1

AGAIN THE XBOX FANBOY CANNOT SEE COMMON SENSE RIGHT BEFORE HIS EYES

OH, AND WII KILLED X360 AND PS3 IN SALES AND GAMEPLAY AND INNOVATION AND CONTROLS

FOOLS
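
For what it's worth, the pixel-count part of that arithmetic does check out; whether bandwidth per pixel is the right yardstick is another matter, which the reply below goes into.

    # Pixel counts only: 1280x720 has three times the pixels of 640x480, so
    # equal bytes-per-pixel would need roughly 3x the bandwidth. This says
    # nothing about shading or texturing throughput.
    p480 = 640 * 480        # 307,200 pixels
    p720 = 1280 * 720       # 921,600 pixels
    print(p720 / p480)      # 3.0
    print(8 * p720 / p480)  # 24.0, i.e. 8GB/s scaled by the pixel ratio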

3ds_Mac

On November 18, 2008 at 3:06 pm

And PS3 has almost 50gbs of bandwidth. It must be awesome.
Differences from Wii. 360 has an actual bandwidth to its pool of ram. You don’t add the two pools of ram on Wii together, anymore than you add the texture cache bandwidth, system ram, and video ram together on any other system.

360 has higher compression ratios for all textures, additional compression algorithms for normal mapping, and the ability to generate procedural content off an algorithm on a cpu core, generated and transferred over the fsb rather than read from memory. Compression applies, and bandwidth to main ram is completely untouched.
You can also tessellate polygons in the tessellation unit, from a very low polygon mesh, and spin it into a higher polygon count than you could ever read from ram.. Like from a gray scale map for example. Being a Nintendo fanboy, you should know how displacement mapping works, I’m sure.

And don't forget the 6x total ram, with higher compression ratios for everything. (Wii couldn't store more than a handful of normal maps in ram, which doesn't really matter, because it can only process a handful to begin with)

But aside from all of that junk, the biggest difference is in processing.
Wii, equal to 4 texturing units.
360 16 texturing units on pixel side, 16 point sampled on the vertex side, with a clock rate twice as fast.
4 vs 32 + 32 on vertex texture fetch.

Shader units.
Wii, equal to 4 for the pixel side, 1 for the vertex side.
360, 48 total, with a clock rate more than twice as fast.
5 vs 96

Rops
Wii has 4.
360 has 8, clocked twice as fast, can perform 2x that in z-only, and can perform aa without cutting fill-rate. An additional difference is that it can texture multiple layers without cutting its rate in half the way Wii does.
This is easily 8x.

So, you can see how lopsided it gets if you push into the numbers and “math”.
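
Taking the unit counts quoted above at face value, the rough ratios look like this. This is only a back-of-the-envelope check using the "roughly 2x clock" figure from the post; it treats every unit as equivalent, which real architectures don't.

    # Rough throughput ratio = unit-count ratio x clock ratio, using the
    # counts quoted above. Per-unit capability differences are ignored.
    def ratio(wii_units, x360_units, clock_factor=2.0):
        return x360_units * clock_factor / wii_units

    print("texturing:", ratio(4, 16))   # 8.0 (before vertex texture fetch)
    print("shaders  :", ratio(5, 48))   # 19.2
    print("rops     :", ratio(4, 8))    # 4.0 (more with z-only / free AA)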

3ds_Mac

On November 18, 2008 at 5:54 pm

But of course, don’t get confused. 10x+ the power is used for things like a real-time tree shadow canopy, that interacts and casts shadows onto grass, and leaves that light shines between and sometimes through, etc..
More animated bones on the internal skeleton of a character, skin deformation, etc.. Animation that’s calculated real-time, rather than being canned, a far higher range of light to darks, steps in color variation, etc..
lighting and shadows that interact with varieties of normal maps, actual fluid deformation, etc..

Some of which can be faked, baked, and precomputed, and many Wii fans won't notice the several-times jump in power the other systems have. You'll look at something full of fake and static surfaces and objects, and think "wow, this system is at least a fourth, or maybe a half as powerful as the other systems". But the gap in processing that's going on behind the scenes is epic.

And that’s what developers complain about. The only games that’ll look good on Wii, are the ones that do a good job of faking what it can’t actually process, and focus processing on what little it can.
But Wii fans are easily impressed, so they might get there eventually.

WIIBOY101

On November 19, 2008 at 7:38 am

720P TAKES 3X THE FILLRATE AND 3X THE BANDWIDTH TO PRODUCE THE SAME IMAGE

480P VS 720P: THE ONLY DIFFERENCE IN THAT IMAGE IS NATIVE RESOLUTION, EVERYTHING ELSE IS THE SAME

8GB X 3 = 24GB. OH LOOK, WII VS X360

OH, BUT WII SUPPORTS THE RESOLUTION NINTENDO STATED, 480P

WHILE X360 STRUGGLES AT 720P, RELYING ON 640P AND UPSCALING, AND NATIVE 1080P IS IMPOSSIBLE. MICROSOFT LIED

SO AGAIN, XBOX FANS FAN A LIAR CALLED BILL GATES, AND NINTENDO FANS FAN HONESTY

LET'S ALSO NOT FORGET MICROSOFT'S TERRIBLE BUILD QUALITY, LONG LOAD TIMES, NOISY FAN, OUTDATED OBSOLETE CONTROLS AND POOR WIFI PERFORMANCE

wiiboy101/uk

On November 19, 2008 at 7:54 am

"Wii is an overclocked gamecube."

but isn't a modern-day pc an overclocked 486 dos pc?

isn't a Pentium 4 an upgraded Pentium 1?

isn't 512mb of dram an upgrade of 64mb of dram?

isn't an Intel core duo based on the older pentium 3 M cpu?

3dc, do you see my intelligence at work here? maybe one day you will grow up

and to top it all off, both the central core of the ps3 and the 3 cores of the x360 cpu are overclocked ppe ibm cpus

DO YOU KNOW WHAT THEY ARE? A CORE BASED ON VERY VERY VERY OLD IN-ORDER RISC IBM CHIPS FROM THE PAST, WAY BEFORE THE G3S

THEY LACK BRANCH PREDICTORS AND MULTITASKING ABILITIES AND ARE HEAVILY STRIPPED DOWN COMPARED TO TRUE PPC CPUS LIKE THE G3, G4, G5 ETC

PS3'S CENTRAL CORE AND X360'S 3 CORES ARE OVERCLOCKED, BASIC, IN-ORDER RISC CPUS FROM IBM'S PAST

AKA THEY'RE PPE, NOT THE PPC OR POWER RANGE

YOUR BELOVED X360 IS AN OVERCLOCKED 1990s CPU AND THAT'S A FACT

BROADWAY IN WII HAS BETTER BRANCHING, MULTITASKING AND TRUE DESKTOP ABILITIES THAN PS3'S AND X360'S CORES ALL PUT TOGETHER

PS3'S CELL IS TERRIBLE AT BRANCHING; IT HAS TO RUN LOADS OF SEPARATE THREADS JUST TO DO WHAT A PENTIUM 3 800MHZ DOES IN ITS SLEEP

3DC JUST GOT WIIBOYED :lol: :lol:

3ds_Mac

On November 19, 2008 at 9:36 am

You’ll never be a programmer of any type, so the stuff you just regurgitated went over your head.
I’m well aware of what the cpu cores look like.

Generic compilers that you’d use to write programs for a pc, produces code in serial, and the cpu itself is required to reorganize data in such a way, so that it can take advantage of doing 2 floats, 2 ints, etc..or however wide the cpu core is.. at the same time.
Lacking an instruction window for out of order execution and other mechanics that are there to reorganize data at run time, is where the similarities to older cpus comes in.
Reorganizing data is something they’ve added since, to allow generically generated code, to be as optimized as possible for that particular processor, since processors come in hundreds of different types and configurations, a general compiler can’t know how all cpus are configured, and can’t generate code that runs well everywhere.

But, all 360′s and PS3′s are identical. They all have the exact same number of execution units, and are organized and clocked the same.
Thus, a program doesn't need to be reorganized at run time; it can be organized at compile time, as the code is being written and built, with branch hints inserted into the compiled code itself.

Thus allowing more actual execution units, registers, and processing logic to be included on the die. That's why you see extremely high floating point figures for these processors.

Hence, 165 million, and 200+ million transistors vs barely 20 million.
Wii lacks simple vmx32 of G4′s or G5′s. 360 uses vmx128. It’s designed by IBM, as a processor sufficient for use in a video game console, with transistors spent on processing useful to graphics and games.
Wii’s cpu is a modified cpu from the late 90′s, with a 729 mhz clock rate, that’s also required to emulate functions that were moved to the gpu years ago in other systems, because the Wii’s gpu was actually designed years ago, with “years ago” technology in mind.

That’s why developers label Wii “a piece of ”.
And it matters not how efficient developers are with using them, as even if they do a horrible job, they outclass Wii’s.
But you can read the Capcom outline for their MT Engine, linked up there somewhere for more details of their uses, and running the processing load of an actual game.

And you’ve tried your bandwidth focus once again, but Wii can’t compress a normal map, thus can’t save bandwidth for reading it, nor save space in storing it, nor store it in its texture cache that way. It can’t compress volumetric textures, and has only basic S3 texture compression.

Once again, it has barely a 6th the total ram, with inferior compression of all types. You could get 10 times the textures into 360's ram that you could into Wii's, thus allowing a far wider variety of them in a level.

And once again, many many many times the gpu processing elements. The very basis of graphics work.
It doesn’t have to drop to 16 bit color for things like aa. It can compute shadow and lighting in real-time, they can add blades of grass, leaves on trees, etc, etc..

And also, once again, Wii doesn't render 480p in many of its games. It's 448p or lower, down into the upper 300s in the case of RE4, Prime 3, etc. And horizontal resolution doesn't go up at all for widescreen; it's pretty much always 640.

Then there’s the issue of not supporting aa and progressive scan at the same time, or aa and full color at the same time, etc etc etc….that Nintendo outlined themselves, when they wrote their frame buffer patents.

Wii cpu: 20 million transistors. Wii gpu: 24 million transistors for processing logic, 25 million for edram.

360 and PS3 have 8-12 times that for the cpu; PS3's gpu has 12 times the transistors for processing, 360 has 10 times that, plus 4 times the edram transistors.
Without even factoring in relative clock rates, etc..
They both have 6 times, effectively 10+ times the total ram.

3ds_Mac

On November 19, 2008 at 9:42 am

So “divide by 3″ all you’d like.

WIIBOY101

On November 20, 2008 at 11:30 am

the cpu broadway is based on is more modern: a true desktop risc, out-of-order, superscalar pc cpu

the cores used in ps3s and x360s are IN-ORDER, NOT OUT OF ORDER, AND BOTH TOTALLY LACK BRANCH PREDICTION

THEY'RE NUMBER CRUNCHERS, NOT TRUE CPUS

A 1.8GHZ G5 IS BETTER THAN X360'S 3-CORE JOB

3X G5S? YEAH RIGHT, XBOX FANS, CARRY ON DREAMING. IF X360 HAD 3 G5 CORES IT WOULD COST $1000 TO BUY ONE

DOOFUSES

BROADWAY @ 3.0GHZ WOULD UTTERLY DESTROY AN X360 CPU. IT'S A BETTER CHIP, FULL STOP

POWER FACT: WII = 5X-PLUS PS2, PS3 IS 10X-PLUS PS2 WITH THE NEED TO SUPPORT HD

THERE IS NO MAGICAL POWER IN X360 OR PS3, YOU'RE ALL MISSING THE FACTS

THE EXTRA POWER IS JUST THERE TO SUPPORT THE HIGHER NATIVE RESOLUTION (NOT THE GRAPHICS, HD-MARKETING-BRAINWASHED FOOLS). THE RESOLUTION TAKES LOADS MORE CLOCKS AND RAM

THAT'S BEFORE WE START TALKING ACTUAL GRAPHICS, PHYSICS, A.I. ETC

you're calling resolution graphics. if i remove the graphics from a screen and just leave the resolution,

what have i got? a screen full of black and white dots fuzzing, just as i would have at any other resolution

STOP CONFUSING HD MARKETING WITH FACTUAL GRAPHICS DATA. TALK ABOUT BRAINWASHED

SO IS THE TERM "TRUE HD" REAL, OR IS IT A MARKETING STUNT?

IT'S OBVIOUSLY A STUNT, AS "TRUE HD" DOES NOT EXIST

AND WHO'S THE GUY WHO SAID 1080P WAS "TRUE"? MY DAD? YOUR GRANDFATHER? WHO DEEMED 1080P "TRUE HD"?

OH, I REMEMBER, IT WAS SONY. OH LOOK, AGAIN, MARKETING HYPE WORDS IN PLACE OF FACT. SONY'S VERY GOOD AT LYING

HD IS NOT GRAPHICS

"TRUE HD" DOES NOT EXIST, IT'S A MARKETING TERM, GET IT

IF 1080P IS THE SUPER-DUPER "TRUE HD" SONY CLAIM, HOW COME THE BRITISH GOVERNMENT AND THE BBC HAVE STATED THAT LARGE HD SCREENS ALL OVER THE UK IN PUBLIC PLACES, TO AIR THE NEXT OLYMPICS HELD IN THE UK,

WILL DISPLAY AT A RESOLUTION 32X HIGHER THAN 1080P? IF THE BBC IS GOING TO SHOW THE OLYMPICS AT 32X HIGHER RESOLUTION THAN 1080P,

THEN HOW IS 1080P "TRUE HD"?

AGAIN, THINKING IS ABOVE AND BEYOND FANBOYING

EA LIED, THIS SITE LIED AND THE XBOX FANS LIED. WII IS FAR MORE POWERFUL THAN AN XBOX :roll: :roll:

A GUY WIIBOY KNOWS

On November 20, 2008 at 11:48 am

WAIT A SECOND. according to ps3 and xbox fans, hd is new, hd is next gen....

But i was playing pc games at very high resolutions 10 years ago. it just wasn't MARKETED AS, COUGH, HD, it was simply referred to as higher resolution..

but i can clearly remember there being no 3d motion play until wii arrived, so that in itself has just been proven as fact, not xbox fan opinion. i repeat, fact

3d motion and 3d mouse are next gen. HD IS NOT, AS HD IS VERY VERY OLD. DON'T BELIEVE ME? WELL, FIRE UP AN 8-YEAR-OLD PC MONITOR AND THEN COME BACK AND APOLOGISE TO THE GUY WHO'S KICKING YOUR ASS, WIIBOY101...

SO, XBOX 360 FANS, MAKE SENSE? I THINK NOT. HD IS NOT GRAPHICS. WANT PROOF? OK

IF I FIRE UP QUAKE 3 AND PLAY AT 1080P OR ABOVE, AND WIIBOY FIRED UP METROID PRIME 3 AT 480P,

WOULD QUAKE 3 HAVE BETTER GRAPHICS? ANSWER: NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO. IT WOULD PALE AGAINST THE GORGEOUS METROID PRIME 3, AS THE """"""""GRAPHICS"""""" ARE OLDER AND LESS TECHNICAL

HD HAS NOTHING TO DO WITH IT

SEE YOU IN THE CHATROOM, WIIBOY, WITH OTHER INTELLIGENT PEOPLE WHO DON'T BUY LONG-LOAD-TIME, OUTDATED-CONTROLS, POOR-BUILD-QUALITY CONSOLES CALLED XBOX 360 :razz: :razz: :razz: :razz:

WIIBOY101UK

On November 20, 2008 at 12:08 pm

HA HA HA HA HA. FORGIVE THEM, LORD, THEY KNOW NOT WHAT THEY DO :roll: :roll: :wink:

APPARENTLY PUTTING MORE DETAIL AND EFFECTS ON A TEXTURE IS GIVING IT A HIGHER RESOLUTION, COUGH COUGH

XBOX 360 FANS HIDE BEHIND PC FAN TERMINOLOGY

WHEN A TEXTURE HAS MORE DETAIL AND EFFECTS THEY REFER TO IT AS A HIGHER-RES TEXTURE, NOT A MORE DETAILED TEXTURE

CONFUSING DETAIL AND EFFECTS WITH RESOLUTION. THEY CAN'T HELP IT, THEY'RE BRAINWASHED BY MARKETING; THEY BUY X360S, THEN JUST PLAY SHOOTERS AND DEEM THEMSELVES INFORMED GAMERS, COUGH COUGH

DIDN'T THE INFORMED GAMERS BUY WIIS? HA HA HA HA. 3D MOTION IS NEXT GEN, BUTTON BASHING CLEARLY IS NOT

IF I PLAY MGS 3 FROM PS2 AT 1080P, DO THE GRAPHICS MAGICALLY IMPROVE? NOOOOO, THE FULL-SCREEN IMAGE JUST LOOKS SLIGHTLY SHARPER, WHICH IS RESOLUTION. FACT

BUT XBOX 360 FANS, LIKE SONY FANS, BELIEVE THAT HD ALONE IMPROVES IMAGES BY 6X

YEAH, THAT'S RIGHT, SONY ETC CLAIM 1080P IS 6X 480P QUALITY, AND THESE FOOLS BELIEVE IT

LET ME LET YOU IN ON A SECRET, SONY AND MICROSOFT FANS: THE HUMAN EYE IS THE HUMAN EYE, IT'S NOT AN EAGLE'S EYE OR A SHARK'S ETC

THE HUMAN EYE SEES MAYBE A 2 TO 3X SHARPER IMAGE AT 1080P THAN AT 480P, NOT 6X AT ALL

IT'S THE PIXEL COUNT THAT'S 6X MORE, BUT THE HUMAN EYE DOES NOT SEE 6X MORE IMAGE. SONY ETC ARE FAKING THE NUMBERS

AND AS YOU ALL FELL FOR SONY'S 128-BIT AND POLYGON-PUSHING LIES LAST GEN, YOU'RE NOW FALLING FOR THE HD MYTH THIS GEN

BECAUSE YOU'RE GULLIBLE

ALSO, DON'T HD-MYTH ARTICLES ON THE NET SHOW THAT COUNTING RESOLUTION ALONE IS A POOR MEASURE?

CONTRAST, COLOUR (COLOR, YANKS, ROLLSEYES), FILTERING, SCREEN QUALITY, FRAMERATE, NOISE REDUCTION ETC ALL HAVE A BIGGER ROLE TO PLAY IN IMAGE QUALITY THAN RESOLUTION

NEWER HDTV LCD SCREENS STILL MAX OUT AT 1080P, BUT THEIR PICTURE QUALITY IS STILL IMPROVING. HOW CAN THIS BE IF THE MAGICAL HD RESOLUTION IS THE SAME?

BECAUSE WHAT MAKES AN HDTV SET LOOK SO SHARP IS LESS TO DO WITH HD AND MORE TO DO WITH OTHER TECH, LIKE ANTI-BLURRING AND BETTER MOTION HANDLING AND COLOUR

YOU PEOPLE ARE SO BRAINWASHED AND YET ARE TOTALLY UNAWARE OF IT

WAKE UP :lol: :lol: :roll:

3ds_mac

On November 20, 2008 at 4:50 pm

Once again Wiiboy, you went right back to recycling bull. I know your reading comprehension is horrible, so we can try again.

Out-of-order instructions, simply means reorganizing data through an instruction window at run time. (run-time means as it’s loaded into the cpu)

And why would you need to reorganize data at run-time? Because all pc cpus are different. You have pentium 3′s of all types, pentium 4′s of all types, amd’s of all types, or G4′s, G5′s, etc..etc, etc.. Out-of-order simply means there’s logic on the cpu, that helps to reorganize serial data that a programmer produces, so a cpu that can compute 2x floats simultaneously or int, or 4x simd, etc, etc.. can get the data they need when they need it, in the order it’s needed to best suit that particular processor.

When a person writes a pc program, they can't know the configuration of the cpu; they rely on the out-of-order logic on the cpu itself to organize the data into buffers for them, in the best order for that particular cpu's configuration.

But, these being consoles, the cpu specifications are identical from one unit to another. Therefore, you know how the cpu is organized, how the cores are structured, what can be simd, how many cores, threads, etc, etc. So the data can be organized at compile time.
Compile time means when the code is turned into a program as it's written and built.

Since cost on cpus is more related to transistor count and die size, they left out the transistors used for out-of-order execution and attempted to make up for it with a custom compiler, in exchange for more execution hardware.

So, they used their 165 million transistors for a triple core cpu, capable of hyperthreading, rather than a 58 million transistor G5, since they don’t need the same type of computing power a pc does.
At any rate, any decision they could have made, is far above Wii. A G4 would do that.

And it's PS3's SPEs that lack branch prediction. The PPE and all Xenon cores have branch prediction. The compiler sticks branch hints into the compiled code itself, to improve branch prediction.

You attempting to mindlessly outline the benefits of out-of-order execution is pointless, not only because you don't understand the material you post, but because you're using it to prop up a modded G3.
Not a G4, not a G5, not a dual core cpu, but a modded late-90s 729mhz G3.

G4s have been out for years. They incorporated VMX; G5s also had VMX. 360 has VMX128... Wii? 2x floats.

And once again, you can read through Capcom's outline of their MT engine. Their development kits use dual-core Pentium 4 Extremes. They outline what performance they get out of Xenon on an actual game, using actual game code, with a mix of processing typical to their game.
You can also read commentary on Wii by Capcom developers themselves.
"Wii couldn't run RE5's title screen" and other hyperbole is there for the reading.

You can also read through development papers about the early 2xG5 development kits that they used in early stages of development on 360, you can read about cross-platform development with PC, etc, etc, etc... But that would involve actually reading stuff and comprehending it.

3ds_mac

On November 20, 2008 at 5:26 pm

****APPARENTLY PUTTING MORE DETAIL AND EFFECTS ON A TEXTURE IS GIVING IT A HIGHER RESOLUTION, COUGH COUGH. XBOX 360 FANS HIDE BEHIND PC FAN TERMINOLOGY. WHEN A TEXTURE HAS MORE DETAIL AND EFFECTS THEY REFER TO IT AS A HIGHER-RES TEXTURE, NOT A MORE DETAILED TEXTURE****

No, putting higher res textures, and more processing on a pixel produces better graphics. People enjoy real-time lighting. They like the look of normal mapping, and grass, and shadows produced by tree canopies. They like dynamic animation, as opposed to canned. They like seeing a higher ranged color palette, and higher polygon objects. They also enjoy interactive, vertex shader driven water, as opposed to static non-interactive water, they like reflection maps, lighting more complicated particle effects, etc, etc..
The list goes on, and on..

And they render 3x the resolution, but have more than 10 times the processing power to do it, in fill rate, alpha blending, shadows, lighting, normal mapping, etc, etc, etc.
All while not looking like a blurry mess for people who happen to own hdtvs. Go figure.

WIIBOY101UK

On November 21, 2008 at 10:45 am

i've just played metal gear solid 1 in emulation on a pc. i set the resolution to 1080p and magically the ps1 graphics became ps3 graphics.

ONLY IN YOUR DREAMS ARE GRAPHICS AND HD THE SAME THING, XBOX FANS

WIIBOY101UK

On November 21, 2008 at 10:49 am

SO WII HAS BEEN OUT HALF THE TIME OF X360, YET HAS OUTSOLD PS3 AND X360 COMBINED

HOW MANY DESPERATE PRICE DROPS, XBOX FANS?

HOW MANY DISC-SCRATCHING ISSUES?

RED RINGS OF DEATH

PAY FOR ONLINE

OUTDATED CONTROLS

FAKE EXCLUSIVES THAT ARE ALSO ON PC

SHALL I GO ON?

WII WON THE CONSOLE WAR

WIIBOY101UK

On November 21, 2008 at 10:59 am

APPARENTLY GAMERS CHOOSE BUTTONS AND NON-GAMERS CHOOSE 3D MOTION

CARRY ON FOOLING YOUR FANBOY SELVES

ALLLLLLLLL CONSOLES WILL BE MOTION. NINTENDO LEADS, ALL OTHERS FOLLOW

ONLY WII IS NEXT GEN TO AN INTELLIGENT PERSON

MICROSOFT MAKING GAMES CONSOLES, WHATEVER. WHAT NEXT, NINTENDO MAKING SNEAKERS (TRAINING SHOES) OR SONY MAKING CARS?

MICROSOFT HAVE NO PLACE HERE

THE BIGGEST COMPANY ON EARTH CAN'T EVEN INNOVATE, IT RELIES ON COPYING

WINDOWS IS AN AMIGA/MAC RIP-OFF

THE X PAD IS A DREAMCAST RIP-OFF

MICROSOFT BACKED DREAMCAST, THEN STABBED SEGA IN THE BACK WITH XBOX

XBOX SETTING FIRE TO HOMES, POWER LEAD ISSUES

LET'S FAN BILL GATES, BECAUSE WE'RE STUPID

Haz

On November 22, 2008 at 5:33 am

Shut up, wiiboy, you fanboy. You have spent a year trying to argue your point, and have failed. The Wii and the GameCube are vastly inferior to the Xboxes and the PS3.

gray boris uk

On November 28, 2008 at 9:07 am

cell, oh so powerful? give me a good-spec pc with a spursengine card added and it beats cell outright. toshiba should make a gaming spursengine for a wii HD, it would rock

wiiboy101

On November 28, 2008 at 9:12 am

good idea. a higher-clocked, higher-specced wii with loads of edram and ram, and a custom spursengine alongside a broadway cpu 2, would be damn powerful and cheap, and its dvd upscaling and in-game hd scaling would be great

3ds_Mac

On November 28, 2008 at 10:35 am

Lots of edram? Like the 128MB worth of it that nobody will respect for being such a small amount, while really it'll actually be overwhelmingly awesome, but people won't know it because they're judging it in PC terms?

That would bring the transistor count to well over a billion.
That's more than twice an 8800, just for the edram...
Then we've got capacitors... and whatever NEC would need for redundancy, so they don't have to throw half of the chips away when they make them.
But... then we also need transistors to actually process.

We could also wonder why we'd need more than a few dozen MBs for a frame buffer, even at HD..

Hopefully Nintendo doesn't phone in their hardware design next-gen. (a la "two Wiis duct-taped together")

wiiboy101

On November 28, 2008 at 3:06 pm

take 9 480i frames and build 1 frame from them = 960p-like effective performance = spursengine by toshiba. in a mid-to-top pc that thing could be great at hd optimizing and co-processing graphics and physics...

in dvd players and hdtvs it will also rock

but if there was a nintendo-ized gaming version of it, they could build a cheap wii HD: double the wii clocks, beef up the spec, edram and ram, add a new high-speed gaming disc/cart, max out that little wii 2 system and run a gaming spursengine next to the gpu, cpu and sound chip, helping optimize everything wii 2 is doing.....

a wiiHD released at 129.99: 1080p, no-load-time gaming, plus all the innovations, cheap as chips

i guarantee the next xbox and ps4 will be wii-fied ps3s and x360s with more edram and motion, copying the wii business plan EXACTLY

SO WHO RULES GAMING? NINTENDO DO

wiiboy101

On November 30, 2008 at 1:57 pm

"SUPER RESOLUTION": YET ANOTHER TECH THAT PROVES BLU-RAY POINTLESS, AND THE HD-GRAPHICS BABBLERS FULL OF IT.....

SINGLE-FRAME SUPER RESOLUTION TAKES A FRAME AND REMASTERS IT TO NEAR-HD QUALITY

TRUE SUPER RESOLUTION TAKES UP TO 9 FRAMES AND COMPUTES THEM INTO ONE FRAME, EQUAL TO ROUGHLY 960P RESOLUTION. 960P IS THEN DISPLAYED AS 1080I/P OR 720P/768P AS HD

BLU-RAY IS DEAD

TVS ARE TO SUPPORT SUPER RESOLUTION AS WELL AS THE USUAL HD

DVD PLAYERS ARE TO SUPPORT SUPER RESOLUTION (DVD PLUS DVD2)

LAPTOPS AND PCS ETC ALSO SUPPORT SUPER RESOLUTION VIA SPURSENGINE, OR EVEN IN SOFTWARE OR ON GPUS

SUPER RESOLUTION OFFERS TRUE HD MEDIA FROM NON-HD SOURCES AND CAN OPTIMIZE TRUE HD AND AUDIO TOO

IMAGINE AN OVERCLOCKED, SPEC-TWEAKED WII WITH PLENTY OF EDRAM AND RAM, AND ALSO A NINTENDO-IZED SPURSENGINE

Wii SR

WII SUPER RESOLUTION: EVEN IF THE BASE NATIVE RESOLUTION OF A GAME WAS 480P, ITS DISPLAY RESOLUTION WOULD BE NEAR AS DAMN IT 1080I DUE TO SUPER RESOLUTION

HD FOR FREE IS PROVEN, YET YOU X360 FANS TRIED TO DENY IT AND SAY EVERYTHING SONY AND BILL GATES TELL YOU IS THE TRUTH

YEAHHHHH RIGHT

NEC SUPER RESOLUTION

HITACHI SUPER RESOLUTION
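
A toy illustration of the multi-frame idea being described: if several low-res frames sample the same scene at slightly different sub-pixel offsets, interleaving them recovers a higher-resolution frame. This is only a sketch; real super-resolution (Toshiba's SpursEngine included) also has to estimate motion and undo blur, and works nothing like this neat ideal case.

    # Toy "shift-and-add" upscaling: four frames, each offset by half a
    # pixel, interleave into one frame with twice the resolution both ways
    # (e.g. 480 lines -> 960 lines). Ideal case only, for illustration.
    def downsample(hi, dy, dx):
        # Take every second pixel, starting at offset (dy, dx).
        return [row[dx::2] for row in hi[dy::2]]

    def shift_and_add(frames):
        h = len(frames[(0, 0)]) * 2
        w = len(frames[(0, 0)][0]) * 2
        hi = [[0] * w for _ in range(h)]
        for (dy, dx), frame in frames.items():
            for y, row in enumerate(frame):
                for x, v in enumerate(row):
                    hi[2 * y + dy][2 * x + dx] = v
        return hi

    original = [[y * 10 + x for x in range(8)] for y in range(8)]
    frames = {(dy, dx): downsample(original, dy, dx)
              for dy in (0, 1) for dx in (0, 1)}
    print(shift_and_add(frames) == original)   # True: the detail is recovered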

wiiboy101

On November 30, 2008 at 2:02 pm

AND THE KING OF THEM ALL SO FAR: SPURSENGINE AND TOSHIBA'S SR SOFTWARE

A DVD CAN PLAY IN ALL DVD PLAYERS, BUT IN A SUPER RESOLUTION PLAYER THE DVD WILL ACT LIKE A 960P HD DISC

DVD2 IS TOTALLY DVD-COMPATIBLE, BUT IN A SUPER RESOLUTION PLAYER IT WILL OFFER BLU-RAY-LIKE EXTRAS AND MENUS, AND MAYBE ANOTHER IMAGE BOOST DUE TO A BETTER DISC

DVD AND DVD2 WILL BOTH SUPPORT 960P-NATIVE-LIKE QUALITY VIA SUPER RESOLUTION PLAYERS/TVS/PCS/ETC

A GAMING-OPTIMIZED VERSION BUILT INTO A WII HD (OR WII 2, WHATEVER)

WOULD ROCK AND BE CHEAP

wiiboy101

On December 2, 2008 at 9:08 am

The Playstation 3 version of Far Cry 2 is believed to run at full 1080p, whereas the Xbox 360 version runs at 720p. Quaz51, the person from the Beyond3D forum who found out the native resolutions of GTA4 and Halo 3, has confirmed the native resolutions for the PS3 and Xbox 360 versions of Far Cry 2. The PS3 version of Far Cry 2 is running at 960*1080p as opposed to 1280*720p on X360.

His comments on the PS3 version:

Does anyone wanna take a crack at this Far Cry 2 720p footage, entitled "Console Walkthrough"? There's quite a bit of visible aliasing in certain areas.

http://www.gametrailers.com/player/39010.html

I think I'm measuring 960×1080, which indicates this might be the PS3 version running at 960×1080 in 1080p mode.

And his comment on the Xbox 360 version:

1280×720

Either way, both version will look identical.

Source.

Comments
Comment by Tahiri’s older brother on October 24, 2008 @ 8:04 pm

Actually, that’s not correct.

Quaz51′s post was from over a month ago based upon video footage only without any idea which was 360 and which was PS3. Because he measured 1080p, he assumed it may be the PS3 version, hence he says “I think I’m measuring 960×1080, which indicates this *MIGHT* be the PS3 version running at 960×1080 in 1080p mode”.

However, now that Far Cry 2 has been released, it has been found to be the reverse: it's actually the 360 version which goes up to 1080p and the PS3 version which is only 720p;

http://www.product-reviews.net/2008/10/24/far-cry-2-on-sony-ps3-apparently-its-lacking/

“The problem is the PS3 of Far Cry 2 only supports 720p, while the Xbox 360 version offers full 1080p support.”

http://www.gamefaqs.com/boards/genmessage.php?board=944400&topic=46121686

“look on the back of the game boxes, ps3 hdvideo output 720p, 360 hdvideo 720p,1080i,1080p”

Comment by WIIBOY101/UK on December 2, 2008 @ 6:29 pm

RESOLUTION AIN'T GRAPHICS, THIS ARTICLE GOES TO PROVE THAT, BUT ENTER ANY FORUM AND BOTH X360 AND PS3 FANS WILL CALL RESOLUTION GRAPHICS, OR CALL GRAPHICS HD, AKA "hd graphics"!.

WHICH IS ALL MARKETING HYPE!!..

it's gaming quality that matters, not pixel counts, "rollseyes"!

haze supports 576p with hd scaling on ps3. haze looks and plays the way it does because that's what it is; 576p doesn't come into it, the graphics could be way better and full scaling can still be offered

stop with the hd lies, jesus

x360 doesn't support native 1080p, it offers a 720p mode upscaled by the ana chip. what display res you're playing at is meaningless to a point, but arguing 1080p vs 720p is a meaningless argument!

i'd rather argue 3d motion and 3d mouse vs caveman control pads

i'd rather argue fast loading vs crappy slow loading, etc

whose native resolution is the highest and whose smells the sweetest is all bull

it's the realtime gaming experience that matters!..

i think super resolution and hd optimizing tech can really help games in HD

a wii turbo with advanced super resolution would offer 1080p high-end gaming at a rock-bottom price

cheap but powerful, powerful but energy efficient, that's the future

not big fat boxes, loud fans and bull


WIIBOY101

On December 4, 2008 at 3:13 pm

I GUARANTEE EDRAM AND MOTION PLAY ON PS4 AND XBOX 4..... (360 WAS JUST A WAY OF PUTTING A 3 BEHIND THE WORD XBOX TO MATCH SONY'S NUMBER). I CALL THAT CHILDISH MARKETING: LET'S CALL IT 360, TOTALLY AVOID THE NUMBER 2 (XBOX 2) AND HAVE A 3 (360), LIKE THE NUMBER 3 EVEN MATTERS

PS3, XBOX 3(60). ANYONE WHO DENIES THAT FACT IS FANBOYING; THAT'S EXACTLY WHY MICROSOFT CALLED IT 360

AND THAT'S CHILDISH MARKETING AIMED AT CHILDISH XBOX FANS (PROVEN: USA USERBASE, WII ADULTS, X360 CHILDREN AND TEENS, PROVEN)

WASN'T THE NAME Wii GOING TO BE THE DEATH OF NINTENDO IN THE WEST, AS YOU MICRO AND SONY FANS CONSTANTLY SAID? Wii KILLED ALL WHO STAND IN NINTENDO'S WAY, JUST LIKE I SAID WOULD HAPPEN

COMMON FLAMING SENSE

I'M REALLY HOPING MICRO/SONY INFRINGE ON NINTENDO'S WIIMOTE PATENTS TRYING TO COPY NINTENDO. THAT WILL RESULT IN LAWSUITS SO GODDAMN HUGE THOSE TWO COMPANIES WOULD BE PAYING BILLIONS AND BILLIONS

THE FACT REMAINS NINTENDO HAS GAMING BY THE BALLS, AND BILL AND SONY AND THEIR FANS LOOK ON IN ENVY

lennell

On December 4, 2008 at 10:50 pm

hd resolution is not graphics, it just puts more pixels on the tv and makes things look better. but with more pixels it can help the graphics in a game look better, if the game is made with the extra pixels in mind.

WIIBOY101

On December 5, 2008 at 2:17 pm

LENNELL, NICE TO SEE YOU ADMIT TRUTH OVER FICTION. HD IS NOT A MAGIC CURE, HD IS NOT GRAPHICS, IT'S MERELY MORE NATIVE RESOLUTION THAT ALLOWS MORE PIXELS TO BE USED...

AT LAST, HONESTY. IT'S TAKEN LIKE A YEAR AND A THREAD A MILE LONG TO GET SOME HONESTY OUT OF YOU GUYS....LOL

WHAT I'M SAYING IS, STOP THINKING A CAT CAN ONLY BE SKINNED ONE WAY. THERE ARE MANY WAYS TO GET SOMEWHERE OR DO SOMETHING

LIKE GPU CARDS AS WE KNOW THEM, CALLED RASTERIZERS; THEN YOU'VE GOT THE SPIN-OFFS OF TILE RENDERING AND DEFERRED RENDERING; THEN THERE'S RAY CASTING

STOP THINKING THE PC WAY IS THE BEST WAY OR THE ONLY WAY, THAT'S SOOOOOOO MISINFORMED

THE HOLLYWOOD GPU IS A CUSTOM TILE RENDERER AND A GREAT VIRTUAL-TEXTURE-CAPABLE CHIP WITH LIGHTNING-FAST RAM = BIG FILL RATE AND BANDWIDTH (WELL, AT 480P IT IS)

MY THEORY ON SUPER RESOLUTION IS THIS: IF A 3D GAME CAN BE SUPER-UPSCALED LIKE A DVD MOVIE, THEN WHY WOULD YOU RENDER AT THE HIGHEST RESOLUTION WHEN YOU CAN AIM IN THE MIDDLE, GAIN HUGE FILLRATE/BANDWIDTH/FRAMERATE HEADROOM, AND THEN LET SUPER-RES TECH HANDLE THE 1080P HD SIDE OF THINGS?

IT'S NOT HOW MUCH POWER, IT'S HOW MUCH ACTUAL PERFORMANCE AND TRICKERY DEVS CAN PULL OFF

EASY TO CODE BEATS HARD TO CODE

CHEAPER TO CODE BEATS EXPENSIVE TO CODE

EFFICIENT HD TRICKS BEAT TRYING TO BE ALL THINGS AT A HIGH PRICE

I WANT MY CONSOLE SMALL, QUIET AND LOW IN ENERGY CONSUMPTION, NOT HUGE, LOUD AND PC-LIKE IN ITS POWER SUCKING LIKE PS3 IS

IT COMPLETELY DEFEATS THE IDEA OF A """"""""""GAMES CONSOLE""""""""""

WIIBOY101

On December 6, 2008 at 5:28 am

"When I'm driving fast the road under me disappears and all the cars do.. and then I start driving in thin air.. the textures and objects slowly catch up to me... wtf is going on! BTW I'm playing it on xbox360.

EDIT: It really started happening after the 2nd bridge opened up and doing the Mannie mission where I have to follow the train in my car. Before that I didn't notice it."

"Your xbox 360 was probably getting close to overheating. Mine used to do that until I bought an Intercooler EX, and the problem only happens after maybe 3-4 hours of gameplay."

wait, i could play wii for 48 hours solid (well, i couldn't, lol) and there would be no heat issues

again, poor design :roll: :roll: :roll:

lennell

On December 6, 2008 at 10:27 pm

playing a game over 4 hours, that's the xbox 360 graphics chip issue, due to poor manufacturing, when the chip overheats. microsoft rushed the 360's development so it could come out a year ahead, without properly testing it, and this is why people's 360s are getting red rings and overheating.

lennell

On December 6, 2008 at 10:59 pm

if you play the 360 over 4 hours the gpu will overheat! play it over 4 hours almost all the time and in about 9 to 12 months you will get the red rings of death. this happens because of a critical fault in the connections between the console's main processor, its graphics processor and the 360 motherboard. inside the case, the motherboard is attached by its edges rather than being properly pinned in the center, allowing it to flex and warp as the temperature inside increases. this, combined with the way the already inadequate heatsinks (the big metal grills that dissipate heat from the chips) are attached to the board, places stress on the corners and edges of the cpu and gpu, which gradually lifts the chips from their mounts, leaving the gpu's connections broken.

WIIBOY101

On December 7, 2008 at 6:17 am

wii = does not overheat

x360 = heat issues

ps3 = heat issues

it's supposed to be a fully functioning games console. where is the common sense?

the snes didn't overheat, megadrives didn't either

it's the console way: small, cool and child-proof

sony and microsoft have lost touch with what a games console is

WIIBOY101

On December 7, 2008 at 6:21 am

ps4 = ps3 with edram and added motion (hold me to this)

x720 = 360 upgrade with edram (yes, it has some now, i mean more) plus added motion

wii hd = upgraded wii, more sram/edram and version 2 motion

ps4 will be a wiied ps3

xbox 720 will be a wiied 360

nintendo will carry on down the path all others follow

i guarantee it

wiiboy 101

On December 10, 2008 at 2:38 pm

the only game i have on wii with pop-up is cod 5, and all versions suffer it………….

all other wii games don't…………..

i went to a mate's and every single game he played on x360 had pop-up issues. what's the crack here? why do sony and microsoft fans not see this shocking low level of quality in their beloved consoles? they constantly put up with this and 3rd-party nonchalance…………

metroid prime 3: 60-frame perfection on a disc, no issues, no pop-up, no nothing

cod 5: pop pop pop pop pop everywhere you look

that's why i'm a nintendo fan. i'm a gamer, not an idiot

wiiboy 101

On December 11, 2008 at 3:00 pm

Wii 2.04M
DS 1.57M
Xbox 360 836k
PSP 421k
Playstation 3 378k
Playstation 2 206k

that's your confirmation nintendo won the console war

lennell

On December 11, 2008 at 9:37 pm

call of duty 5: the game was made from the ground up for the xbox 360, then ported to ps3 and wii. the pop-in graphics are the 100s of small trees, cans and sandbags. the developer rushed the small objects; the main graphics they reprogrammed to stay in and not pop up, but they didn't reprogram most of the small little graphics, because they wanted to get the game out before the end of the year and not spend two more months reprogramming the small graphics to stay in and not pop up. they couldn't wait to start the next call of duty game, call of duty 6.

wiiboy 101

On December 12, 2008 at 7:47 am

that's right, a rushed multi-platform 3rd-party title. THAT'S WHY NINTENDO FANS ARE SNOBS TOWARDS 3RD PARTIES: POOR QUALITY. WII DON'T LIKE IT….

the sales of games like petz aren't core wii owners, it's family fun, and to call wii casual is daft, as buzz and guitar hero all sell well on other platforms like ps3 and x360

i'd love a cod-type game that's polished as fuk. conduit just might be that game

3ds_Mac

On December 14, 2008 at 4:19 pm

On Wii, they don’t render things in the distance on Wii if they can avoid it. They’ll either break the level up, or confine the level to short distances, or they’ll put up the fake 2d backdrops like you see in Monster Hunter. (or you see objects pop in) simple as that.
If you’re going 360/_PS3, PS3 has an HDD, and can stream textures from there. 360 can do the same, if you install, or if the developer used it to begin with.
360′s gpu can render more geometry on screen, in the foreground and the distance. That’s half the point of pre-culling geometry on the cpu, before sending to the gpu on PS3.
That’s also what they mean by “optimizing” things ike Madden for PS3. They lowered the geometry on the crowd in the background, to fit what PS3′s gpu can process.
When they “lead” on PS3, they just don’t design the levels with more objects and detail than it can handle, so when they port, you don’t see a reduction like you see in COD5, where they remove trees and vines, and whatnot in the PS3 version.
Many developers have gone over that. Carmack for example.
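(To make the above concrete, here is a minimal Python sketch of the kind of CPU-side distance culling and impostor swapping being described. The class, names and distance thresholds are illustrative assumptions, not any real engine's code.)

# A toy illustration of culling by distance on the CPU and swapping far objects
# for flat impostors before anything is handed to the GPU.

from dataclasses import dataclass
import math

@dataclass
class SceneObject:
    name: str
    position: tuple           # world-space (x, y, z)
    impostor_distance: float  # beyond this, draw a 2D cut-out instead of the mesh
    draw_distance: float      # beyond this, skip the object entirely

def build_draw_list(objects, camera_pos):
    draws = []
    for obj in objects:
        dist = math.dist(camera_pos, obj.position)
        if dist > obj.draw_distance:
            continue                               # culled on the CPU, GPU never sees it
        elif dist > obj.impostor_distance:
            draws.append((obj.name, "impostor"))   # the cardboard cut-out in the distance
        else:
            draws.append((obj.name, "full_mesh"))  # real geometry up close
    return draws

# Example: an island pops from impostor to full geometry as the camera approaches.
island = SceneObject("island", (0.0, 0.0, 500.0), impostor_distance=200.0, draw_distance=2000.0)
print(build_draw_list([island], camera_pos=(0.0, 0.0, 0.0)))    # -> impostor
print(build_draw_list([island], camera_pos=(0.0, 0.0, 400.0)))  # -> full_mesh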

3ds_Mac

On December 15, 2008 at 7:55 am

Not sure what you’re attempting to point out. Super Mario Galaxy has limited mobility from one place to the next. Most objects are background until you actually go to them. And Brawl’s areas are small.

And Halo 3's engine renders 14 miles out. It's actually a record.
What do you think the Conduit's draw distance is?

You see stuff pop in when you have big open ground levels, in places like cities, a la No More Heroes.
You hide it by designing levels, in things like Mario Kart, with turns before you hit the draw-distance threshold. You still see pop-in in Mario Kart btw. Or you limit the viewing angle, like you see in Pikmin.

In stuff like Wind Waker, the islands in the distance weren't geometry until you were right near them. At a distance, it's just a 2D cardboard cut-out outline. Stop a little ways out from the ice island, or the volcano, and you'll see the ice, smoke and lava pop in.

wiiboy101

On December 15, 2008 at 12:02 pm

what on god's earth are u talking about? GO PLAY A DEAD CONSOLE ROLLSEYES

wiiboy101

On December 15, 2008 at 12:06 pm

MARIO GALAXY: 60 FRAMES ROCK SOLID. MARIO GALAXY: NO LOAD TIMES. MARIO GALAXY: MOTION CONTROLS. MARIO GALAXY: NO POP-UP, NO ISSUES, NO CRASHES. IT'S PERFECTION AND ITS GAME OF THE YEAR IS 100% DESERVED…

I DON'T RECALL NEEDING A PATCH FOR IT HMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM

I DON'T RECALL WAITING FOR AREAS TO LOAD HMMMMMMMMMMMMMMMMMMMMMMMMMMMM

I DON'T RECALL OUTDATED CONTROLS AND LACK OF INNOVATION HMMMMMMMMMMMMMMMMMMMMM

I DON'T RECALL MY WII DISPLAYING 3 RED LIGHTS OF DEATH AND THEN DYING HMMMMMMMMMM

AND I RECALL THE WII IS THE FASTEST-SELLING GAMING DEVICE OF ALL TIME, AS I PREDICTED 2 YEARS BEFORE IT RELEASED
HMMMMMMMMMMMMMMMMMMMMMMMMMM

X360 IS DEAD. IT'S ONLY CASH BEING THROWN AT IT BY BILL GATES THAT KEEPS IT AFLOAT. PS3 IS EVEN WORSE OFF

wiiboy101

On December 16, 2008 at 12:19 pm

leaked docs show microsoft ignored x360's build quality issues

YES, THAT'S RIGHT: OFFICIAL DOCS LEAKED FROM MICROSOFT PROVE THAT MICROSOFT DIDN'T CARE A DAMN ABOUT DISC SCRATCHING AND 3 RED LIGHTS OF DEATH, JUST IGNORED THE ISSUES AND RIPPED OFF THE CONSUMERS

ANYONE BUYING AN XBOX PRODUCT IS A DISGRACE TO CONSUMER RIGHTS

NINTENDO JUST WOULDN'T DO THIS. THEIR BUILD QUALITY IS SECOND TO NONE

3ds_Mac

On December 16, 2008 at 4:49 pm

WiiForFun = You aren't showing anything that's a strength in hardware. You can render out as far as you want, especially for a low-geometry planetoid in the distance, and especially when there is no terrain between you and that planet to render.
That's not hard to do. Take a computer graphics class, or look into everything that's involved in draw distance, LOD, etc., and then judge what you're looking at.

****I don’t think conduit’s developers are trying to draw out 14 miles in a dense city. They seem to be focused on filling the levels with enemies, while using advanced graphical effects and improving controls.****

You're implying that the Conduit is focusing on foreground detail and enemies on screen, rather than trying to render out far. It's obvious their engine is meant for halls and indoors, but the detail there is also nothing special, there are no massive numbers of enemies, and no advanced effects. Just your basic checklist, and even then baked-in lighting, and many of the enemies sport spot shadows. Nothing like screen-space ambient occlusion, etc., etc.
http://wiimedia.ign.com/wii/image/article/867/867484/the-conduit-20080417032707582.jpg
http://wii.ign.com/dor/objects/14248157/the-conduit/images/the-conduit-20080417032716817.html
http://wiimedia.ign.com/wii/image/article/867/867484/the-conduit-20080417032705723_640w.jpg

Average Geforce4 at best, and I’ll ignore the ugliness in those particular outdoor shots, and assume the best of its screens. It’s still nothing special.

And the RROD is Microsoft's own fault, but they also replace it for free. Wiis have a 20-dollar GPU in them.
"Gee, I hope this slow-clocked, underpowered gpu doesn't overheat"

wiiboy101uk

On December 17, 2008 at 9:34 am

"madworld" exclusive, "house of the dead overkill" exclusive, "conduit" exclusive. SEGA CLEARLY BACKING THE Wii……..with 100% exclusive games….

konami, EA, capcom all have multiple wii exclusives in the works, all core games

factor 5 has core and casual wii exclusives in the works

ps3 remains in 3rd place. x360 sells through outrageous price drops and hype but still fails to hit targets and is clearly being wiped aside by the wii

by christmas 2009 wii will rule all, and a wii 2 in 2011-2012 at a low retail price will end the console war for good

wiiboy101uk

On December 17, 2008 at 10:07 am

PS3 “flopping so badly”

An analyst with Silicon Alley Insider has called the PlayStation3 a “sinking ship” which is “flopping so badly”. The site is run and written by reputable industry analysts.

Sony’s PS3 is dying on the shelves. (…) There’s really only one option left for Sony to remain in the game: deep price cuts, and not just for people with good credit. Tell yourself the PS3 has superior graphics if it makes you feel better, but a $400 console with a mediocre game library simply cannot compete against an Xbox 360 priced at $200 in this economy.

The same analyst later wrote a similar article about the PlayStation Portable, noting it was in just as much trouble.

What’s wrong? The problems with the PSP are surprisingly similar to the problems with the PS3. It’s overpriced ($170 to $130 for the DS Lite), has superior graphics no one cares about, and a mediocre game library. And because PSP console sales stink, game publishers are unlikely to devote resources to making more quality PSP titles.

Sony rules out PSP2 for now

These are harsh words to say the least, if not exaggerated. Of course, the discussion whether the PSP is a dead platform is an admissible one. SCEE president David Reeves told MCV UK in unusually blunt words that the PSP was struggling and that a successor was not yet being planned – at least not to his knowledge.

There are currently no plans for a PSP2. I go to Tokyo quite a lot and no one has referred to it and I think they have their hands full at the moment. (…) We just launched the PSP-3000 so we are still focused on this generation of the platform. (…)

The PSP is as successful in numbers as PS2 and it tracks its numbers in a cumulative basis. (…) Its weakness, however, is its software. And that’s because developers, when it comes to placing their bets, have to choose PS3 and 360, then Wii, then DS, maybe even PS2 before PSP. It’s the same at our internal studios, where the focus has been on PS3. They’ve also focused a lot on PS2 as well because we have to get the SingStars out for that format. So PSP games will come and they just take a while longer.

EDIT A PSP2 is in the works, after all, Eurogamer has learned from anonymous “publishing sources”. Apparently, yet another PSP revision is due to launch in 2009. A proper successor to the handheld will follow later. Publishers have already started developing games for the new system, the article claims.

Third round of Sony lay-offs

One worrying sign is another round of lay-offs at the corporation. A three-year restructuring program called 'Transformation 60', announced in October 2003, already cut the workforce by 20,000 employees.

In 2007, around a hundred people were fired from the US PlayStation division, Sony Computer Entertainment America. The division’s then head of PR, Dave Karraker, attempted to reassure everyone that these renewed redundancies are “not wholly related to any one product in our portfolio.”

Now, Sony is firing another 8,000 employees from the "electronics business". This means that SCE will again be affected.

EDIT Apparently, Sony Computer Entertainment Europe will not be affected by those job cuts. Dutch television station RTL Nederland broadcast a 15-minute interview with SCEE president David Reeves today (interview in English). Reeves talks at length about Sony’s hardware and software sales in the face of the worldwide economic crisis. About a PlayStation4, similarly to what he said above about a PSP2, Reeves notes that he has no knowledge of it and is “not even sure it is being planned.”

Sony platforms in last place

Remember the analyst who said two years ago that there will never be a PlayStation4? The same analyst now said in a Reuters article that the race for supremacy in this console generation is over. Yuta Sakurai from Nomura Securities believes that the Wii will remain at the top.

The competition stage is over. (…) The spread on shipment volumes is so large that it’s not even worth talking about it. They aren’t rivals.

The sales figures are indeed quite clear. Reuters compiled a nice chart comparing the three home consoles, as well as both portables. Take a look for yourself, but bear in mind that Forbes painted a more grim picture with their PS3 lifetime sales totalling a mere 13 million units.

The Reuters figures source the manufacturers and Sony only ever publishes sell-in data, i.e. the amount of consoles sold to stores. If the Forbes figures are a true reflection of consoles sold to consumers, then the PS3 is indeed dying on the shelves – millions of units, that is.

So, here we have another harsh analyst's statement. Should we care? Of course, we need to cut the populist rhetoric (particularly the swear words). What remains is still a substantial problem. Both Sony consoles have been caught in the downward spiral of dwindling sales causing third parties to devote their money to other platforms, which in turn makes for even worse sales figures.

As noted above, the PSP may be seen as a dead platform, (although some key publishers have recently told me about renewed interest). The PS3, it may be argued, will stay in third place. I strongly believe so and in this, I agree with the articles above.

But there is a key difference between PSP and PS3. The former introduced a storage medium which failed. The latter introduced a storage medium that is set to become the new industry standard. Of course, winning against HD-DVD was no final victory. Blu-ray still has to win against the regular DVD – and before physical storage media become obsolete. But continued adoption of the Blu-ray format may make for a push significant enough to keep the PS3 afloat. It may end up in the same situation Nintendo was in when their Gamecube came in a close third.

I myself am sceptical whether Sony president Howard Stringer really can rally up shareholder support for another multi-billion investment called PlayStation4. Perhaps he himself has no such intentions. But also bear in mind that the PlayStation brand remains one of the strongest brand names in consumer electronics and even Stringer will try his utmost to reap at least some rewards from it.

Incidentally, this is the very reason why Microsoft will not pull out of this market. For all their investments into both Xbox projects, Microsoft has losses of around six billion dollars to show to the shareholders. But after establishing a strong brand, as well as making and solidifying third party relations, they would be fools to pull out now. Of course, Microsoft has the money, if they only want to. With Sony, money may end up too tight to play the home console roulette for a fourth time.

EDIT The people over at Ars Technica have more interesting graphics to illustrate the console race in November in the US. And Chart Get! has calculated that Nintendo platforms took more than 65 percent of the total console market in that time, promising more charts to break down the numbers in detail soon.

EDIT The above article has now appeared over on CNN Money.

EDIT A number of other media outlets have picked up on the Silicon Alley / CNN story. Here are their key points below, in the familiar style of guest commentary.

A little hard to swallow

It’s hard to argue with their conclusion that the PS3 needs a price cut soon, and a substantive one at that. Ten year lifespan or not, it stands on the brink of falling behind the Xbox 360 by an insurmountable margin. (…)

Acknowledging the price issue, the other two [key reasons given] fall into a much more gray area. Blu-Ray shows more signs of life with each month that passes since it won the HD format war. (…) Admittedly, the question of whether the PS3 has a library of must-have titles is more subjective. But dismissing Metal Gear Solid 4, LittleBIGPlanet, Resistance 1 and 2, and Uncharted alone is a little hard to swallow.

PC Magazine (via Yahoo News)

What were the Sony execs thinking?

The Nintendo Wii remains the most popular system in the land, and, at $250, isn’t necessarily a budget buster. XBox 360 has made serious inroads by dropping the price of its core system to $199. So how did Sony respond?

By releasing a new version of the PS3 … that’s $100 more expensive. Yes, it comes with a game, and yes, it has more hard-drive space, to which I respond: Who cares? Was the marketplace clamoring for more memory from the PS3? Is that why its market penetration is so low compared to its predecessors and competition? What were the Sony execs thinking? (…)

Market penetration remains low, and every month people don’t buy a Blu-ray player is a month they get closer to downloadable HD movies and the death of the format as a whole. Sony would be wise to step it up and do a better job at getting Blu-ray players into people’s homes.

Washington Times

Sony has actually been playing catchup

Sony rightly points out that the PS3 has seen hardware sales grow 60% year-to-date. I realize the PS3 wasn’t selling well in 2007, so that figure’s less impressive than it sounds, but growth is growth, any way you slice it. What’s more, look at PS3 and Xbox 360 units sold in total worldwide, and Sony pretty much throughout 2008 has actually been playing catchup. (…)

On the other hand, CNN's whacking the nail on the head when it raises the problem of the PlayStation 3's price. The recession's been on well and long enough for Sony to have reacted by now, and yet it's stubbornly clung to that $400 entry point.

Washington Post

Sony has put on a brave face

Sony has put a brave face on its Christmas failure. It said that over the last year sales have been picking up on the console. The figure it claims is about 60 per cent.

Now, given the fact that sales during 2007 were even worse than 2008, that is not something you want to crow too much about. However, Sony seems to think that means that 2009 will be even better.

The Inquirer

How far should Sony go?

The question is, how much of the PS3's sluggish sales are due to pricing? And if they do cut prices, how far should Sony go? $300? $200?

Firing Squad

Blu-ray awesome, exclusives are there

Blu-ray is catching on big time, and I think the first-day sales numbers of The Dark Knight and Iron Man will back me up on that one. As for the third point, LittleBigPlanet, Valkyria Chronicles, and Resistance 2 say “Hi.” Heck, even IGN declared the PS3 the place to go for exclusives in 2008.

It’s obvious that a price cut would help the PS3′s competitive position, but it seems silly at this point to brand the black beauty a sinking ship. After all, Blu-ray is awesome, the exclusives are there (with more on the way), and we haven’t even begun to review emergency procedures (it’s men and old people first, right?).

TVG blog

Sales will need to improve significantly, and soon

Sony is still trying to make the PlayStation 3 business profitable and start paying off the costs of developing the hardware. Its software sales are strong, despite weaker hardware sales, but both will need to improve significantly, and soon. Given Sony’s ironclad devotion to profitability in the near term, the quickest route to higher PS3 sales — a price drop — simply is not feasible.

As an alternative, Sony could publish a must-have software title that attracts more consumers willing to pay the price for its hardware. Short of Metal Gear Solid 4 in June 2008, it would appear that no exclusive software has really driven hardware sales. Even Sony’s flagship holiday title, LittleBigPlanet, only managed 141,000 units during November.

Gamasutra

A formidable presence in this industry

We’re still seeing the PS3 business ramp up and that the story is going to take a little longer to unfold this generation than in previous cycles. Sony is a formidable presence in this industry and year-to-date has achieved a pretty significant increase over last year. I think we’re just going to have to wait and see how 2009 unfolds.

NPD analyst Anita Frazier (via GameDaily)

Slow to react to the current crisis

We believe fundamental changes to its business structure are necessary. (…) Compared to its peers both at home and overseas, Sony has been slow to react to the current crisis. [These comments were made after Sony announced worldwide job cuts of 8,000.]

Credit Suisse analyst Koya Tabata (via Bloomberg)

Too overpriced to penetrate the market

If you’re worried about your job, are you going to buy a $400 PS3? (…) Christmas is not going to have the same glow. (…) The PS3 is too overpriced to penetrate the market we are in today.

Janco Partners analyst Mike Hickey (via Bloomberg)

Between a rock and a hard place

Sony should be worried. (…) The value proposition for the PS3/Xbox 360 is now out of whack in favor of the Xbox 360. I think there will be some very hard decisions to make after the New Year. They might not be able to afford a price cut, but on the other hand they might not be able to afford their current price. They are between a rock and a hard place. Like I said, my biggest concern for the PS3 is if they let the Xbox 360 gain big momentum in Europe.

DFC Intelligence analyst David Cole (via GameDaily)

Plenty of time left to catch up

With both the PS3 and the PSP being the highest priced platforms in their segment, it is no surprise that both were the only next-gen platforms to post a year-over-year decline. (…)

In terms of cutting prices, currently, it wouldn't be financially responsible or beneficial for them to cut the PS3's price point; manufacturing costs are just too high. That being said, I do expect the PS3 to receive a price cut in early 2009. This is when Sony should reach a point of manufacturing efficiency that would financially warrant a price cut. (…)

We are not even at the half-way point for this generation, there is still plenty of time left for Sony to catch up, and I believe in the long term Sony will gain a significant amount of ground on the Xbox 360. It just might take longer than Sony expected. The exception is the handheld market, and I think we can all agree there is no hope for the PSP in North America at this point.

EEDAR analyst Jesse Divnich (via GameDaily)

A greater victim of the recession

I think Sony is a greater victim of the recession, more because they are a consumer electronics company than a video game company; people aren’t buying HDTVs and that’s why Best Buy has been talking about their same store sales being down 5-15%. It’s really hard to get people to come in and spend on the super big ticket items. I think Sony is feeling the pain of that and I had expected that there would be a high attach rate this holiday of PS3s to HDTVs because Best Buy was pushing it… and that’s not happening.

So that’s the first problem. Secondly, you can’t buy a PS3 for less than $399, and the average of all Xbox 360 SKUs in November was $270. So the Xbox 360 average price is $129 less than the PS3, and that’s hurting [Sony]. The other factor is that the Wii is ridiculously cheaper. So Sony has all these things working against it, and then at the end of the day, their game lineup, which is very good, is still not sufficiently differentiated to induce people to say “I’ve just got to buy a PS3.”

Wedbush Morgan analyst Michael Pachter (via GameDaily)

The cash-haemorrhaging disaster that has been PS3

So, they've decided to release HOME as a public Beta. That's HOME, the amazing virtual world that had game journalists and PS3 fanboys proclaiming the greatness of Sony and how it would be the best thing EVER! Except, it's not. Two years on, it turns out to be a half-empty, badly thought out mess (queuing to play a game of Pool?!) and clearly nothing more than a cynical attempt to actually make some money out of the cash-haemorrhaging disaster that has been PS3. And with nothing on the horizon, save GT5, surely the 'nail in the coffin'. Even ThreeSpeech can't be arsed to Big It Up like they did with Little Big Planet (already being sold for $19.99)

UK:RESISTANCE

Should the price be cut? No.

Yes, the PS3 is in last place in sales in November. But once you look at the year overall and add in one other geographic sales area, Japan, suddenly the PS3 is in second place for yearly sales to date. Logic can be frustrating to those with a plan based on a house of lies. I am sure that if the truth were to escape, there are 360 fans that would be hurling themselves off the roof. I would not care for that, but I would really like to see Eric Krangel and the staff of Kotaku take a header. The gaming community would be better for their absence.

With the combined sales of the US and Japan putting the PS3 in second place by just over 30,000 consoles, should the price be cut? Again, no. The main people crowing for a price cut are the uninformed, the cheap, and those interested in Sony losing money. But what about Europe, you say? Other than some spurts in the UK, the PS3 is soundly winning in Europe. The only place sweating that territory is Microsoft.

Playstation Army

Sources: Silicon Alley Insider, Silicon Alley Insider, Reuters, MCV UK, EDGE, Eurogamer
Thanks to: Joystiq, Joystiq, Kotaku, Games Industry


ps3 is dead, x360 is on a crutch of bill gates' money, the xbox division in the red 6 billion

nintendo rules the world

3ds_Mac

On December 17, 2008 at 10:50 am

Madworld and most everything else they make will sell like No More Heroes.
Wii = Gamecube as far as sales of gamer games are concerned.

That’s what the other systems are for.

WIIBOY101

On December 19, 2008 at 1:52 am

FREE RADICAL (TIMESPLITTERS) ARE DEAD, THANKS TO PS3 DEV COSTS AND FREE RADICAL'S STUPIDITY IN NOT BACKING THE WII

ANOTHER EXAMPLE OF: I'M RIGHT, XBOX FAN IS WRONG. FREE RADICAL WOULD HAVE BENEFITED FROM WII-EXCLUSIVE SUPPORT. WELL DONE, XBOX/PS3 BUYERS

WIIBOY101

On December 19, 2008 at 4:01 am

rumor: free radical is now pumpkin beach; rumor is they're wii/ds exclusive now

WIIBOY101

On December 19, 2008 at 4:09 am

wii= gamecube

gamecube: 485 MHz CPU / 162 MHz GPU, 43 MB total RAM including the auxiliary RAM, 1.5 GB disc standard (nintendo-invented controller) (best standard controller of all time also)

wii: 729 MHz CPU, 243 MHz GPU, 88 MB RAM with more memory available via flash, 8/9 GB disc, motion controllers (unique and class-leading)
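(A quick arithmetic check on the two spec lines above, using only the clock figures as quoted; a rough Python sketch, not an official performance comparison.)

# Clock-speed ratios between the two spec lines quoted above.
# Numbers come straight from the comment; real-world performance depends on far
# more than clocks, so treat this as back-of-the-envelope only.

gamecube = {"cpu_mhz": 485, "gpu_mhz": 162}
wii      = {"cpu_mhz": 729, "gpu_mhz": 243}

for part in ("cpu_mhz", "gpu_mhz"):
    ratio = wii[part] / gamecube[part]
    print(f"{part}: {gamecube[part]} -> {wii[part]} MHz ({ratio:.2f}x)")

# Both ratios come out to roughly 1.5x, which is why the Wii is often described
# as about a 50% faster GameCube in raw clock terms.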

wii = gamecube? you're on powerful drugs

gaming experience xbox =ps2=gamecube=ps3=psp=x360=generic=old=stale=copying nintendo

wii=the future=ds=the future too ps3=x360=the past =ps2 =ps1

backward does not = forward. button bashing is obsolete

WIIBOY101

On December 19, 2008 at 4:10 am

and the sales and profits prove it

3ds_mac

On December 19, 2008 at 7:54 am

Who’s an Xbox fan? Nintendo could take over the industry for all I care. I just don’t like their hardware choice.
And I said, “Wii = Gamecube” when it comes to potential sales of games. Developers don’t look at Wii as a Gamecube with twice the installed base. They look at it as if it had Gamecube’s install base, if even that.

boywii101

On July 4, 2009 at 12:16 am

motion is not the future? cough natal cough ps3 motion

boywii101uk

On July 4, 2009 at 12:25 am

camera motion sensing (natal and ps3 motion): 300 degrees per second

wii motion plus: 1600 degrees a second
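(A back-of-the-envelope illustration of why that degrees-per-second figure matters. The 90-degree flick lasting a tenth of a second below is a made-up example motion, not a measured one.)

# How fast is a quick wrist flick in degrees per second, and which sensor range
# can keep up with it? The flick values are hypothetical, the two max rates are
# the figures quoted above.

flick_degrees = 90.0   # hypothetical quick wrist turn
flick_seconds = 0.1    # hypothetical duration of that turn

angular_rate = flick_degrees / flick_seconds
print(f"flick rate: {angular_rate:.0f} deg/s")   # 900 deg/s

for name, max_rate in (("300 deg/s sensor", 300.0), ("1600 deg/s sensor", 1600.0)):
    status = "saturates (motion clipped)" if angular_rate > max_rate else "tracks it"
    print(f"{name}: {status}")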

natal: a casual joke. cam-restricted, laggy, red jump suits so the cam can see you, a pile of crap, and lionhead's natal demo was FAKED

ps3 wands look like sex toys. cam motion = rejected by Nintendo. neither competitor is combining traditional controls with motion like Nintendo

analog plus IR mouse pointer plus wm+ and other motion = core motion perfection. natal is a cam-tracking gimmick

conduit's controls destroy x360/ps3 classic retro pads, and both ps3 motion and natal look very unsuited to core play and fps

wm+ = true 1-to-1 and physics at the point of movement: UR HANDS

imagine wii+ wiihd with

dual wii motion plus, dual ir mouse pointer, dual speakers, dual rumble, no wire between

true dual-weapon 3d motion plus analog stick. YUM, CORE GAMEPLAY

graphics clearly vastly superior to xbox

conduit, mario galaxy, house of the dead overkill, monster hunter 3, pes 2009, cod waw, the grinder, new res evil, dead space, silent hill

SHALL I GO ON? OH, AND NO INSTALLS AND NO LOADING IN MOST WII GAMES

wii continues to OWN. wii hd may hit sooner than u think

boywii101uk2009

On July 4, 2009 at 12:29 am

an xbox 1 cannot do this. this looks far closer to x360 than xbox 1, WHICH IS WHAT I STATED ALL ALONG. ALSO MOTION CONTROLS AND NOOOOO LOADING

http