A man of dubious moral fibre
28th May 2003
Links to the reviews:
Beyond3D Preview http://www.beyond3d.com/reviews/ati/r520/
The Inquirer http://www.theinquirer.net/?article=26720
The Tech Lounge http://www.thetechlounge.com/review...n_x1000_preview
HardOCP X1800XL/X1600 - http://hardocp.com/article.html?art=ODIy
Looks very good considering it's only got 16 pipelines. But the RRP for the X1800XT 512MB is something like £470, and it's pretty much neck and neck with a 7800GTX, which is over £100 cheaper...
I've just been looking at some of this first batch of reviews, and I'm not terribly impressed. Sure, R520 holds up pretty well for having "only" 16 pipelines, but it's the 7800GT it holds up against (an important consideration). It certainly isn't a G70 killer (to paraphrase what [H] concluded). I think it had potential, but I suspect that ATI shot themselves in the foot during the design phase: R520 is supposed to have 32 pipelines in its architecture, but we're not seeing 32-pipe cards because the yields are too low to make it cost-effective (such is the word on the street, anyway). Still, ATI's decision to drop down to 16 pipes is curious, at best. I think there's more going on here than is immediately apparent.
Next up, in four months: G72! You may now all bate your breath ;)
2nd September 2004
Anyway, the R520 looks very good. Pricey, but when you're spending that much money I guess having the best card is more important. As always, it beats the nVidia card in some benchmarks and gets beaten in others, but from what I saw the X1800XT seems a bit superior. It will also be interesting to see how the X1600 does as a mid-range card, and how the X1300 does too. I have read some rumours that the R580 is ready to ship this fall, probably with 24 or 32 pipes. That would be surprising, but it would really be needed to keep up with nVidia. I do wonder, however, whether the current ATi drivers fully support the X1000 cards :uhm:
It's Not Easy Being Green
10th November 2003
ATI kicks the llama's ass (old Winamp thing). Now ATI kicks nVidia. Once they come out, nVidia makes good cards, but they're not meant to last. ATI makes cards that have lots of power but don't use an overpowered GPU; nVidia wants fastest-now and cuts corners to get there, while ATI develops more stable cards at lower operating frequencies.
FN_lewrbm69: ATI kicks the llama's ass (old Winamp thing). Now ATI kicks nVidia. Once they come out
You didn't actually read any of the reviews, did you?
nVidia makes good cards, but they're not meant to last. ATI makes cards that have lots of power but don't use an overpowered GPU; nVidia wants fastest-now and cuts corners to get there, while ATI develops more stable cards at lower operating frequencies.
Would someone care to explain to me when 625MHz became a lower frequency than 430MHz? Or, for that matter, how the claim can be made that R520 is more or less stable than G70 when nobody's actually seen the retail product yet, much less had a chance to break it.
Don't kid yourself, and don't be a sycophant. Both companies cut corners where they can, and neither "builds cards to last", because the other won't let them afford to. ATI has done some interesting things with R520 (the ring topology for the memory bus and "High Quality AF" being chief amongst them, in my opinion), but the final result isn't anything to write home about, especially at its current price. Performance isn't any better or worse than the relevant G70 part, on average.
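To put some rough numbers behind the pipelines-versus-clock argument above, here's a back-of-the-envelope sketch using the commonly quoted launch specs (16 pipelines at 625MHz for the X1800XT, 24 pixel pipelines at 430MHz for the 7800GTX). This is illustration only; real performance also depends on memory bandwidth, drivers, and the workload.

```python
# Peak theoretical pixel-shading throughput: pipelines * core clock.
# Result is in megapixels per second since the clock is given in MHz.
def throughput_mpix(pipes, clock_mhz):
    """Number of pixel pipelines times core clock (MHz)."""
    return pipes * clock_mhz

x1800xt = throughput_mpix(16, 625)  # ATI R520: 16 pipes at 625 MHz
gtx7800 = throughput_mpix(24, 430)  # nVidia G70: 24 pipes at 430 MHz

print(f"X1800XT: {x1800xt} Mpix/s")  # 10000
print(f"7800GTX: {gtx7800} Mpix/s")  # 10320
```

On paper the two come out within a few percent of each other, which lines up with the reviews calling them neck and neck: neither the higher clock nor the extra pipelines settles anything on its own.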
21st February 2005
Everyone thought ATI was gonna make 32 pipelines and kick nVidia's butt this round... Well, I feel sorry for you if you did. ATI could have if they'd upped the pipelines to 24 and added more speed, but they didn't :/ Seems like they're only making cards to beat the ones ATI makes now, and not nVidia...
3rd May 2005
ATI seem to have lost the plot a little here. I've been reading these reviews (number 2 gave me a 404 - the link hates me :( ), and it seems kind of stupid to market themselves to OEM resellers - whose PCs will probably end up being shipped out via PC World to your average family-accounts/word-processing sort of buyer. I would have thought it would make more sense to market to the enthusiasts and gamers, who will certainly be willing to pay more for a decent card. That said, one of the X1xxx series cards - the X1600, to be precise - is actually going to be more expensive than its competition, the GeForce 6600 GT. Way to drop the ball. :thatsgreat:
As for the card itself, I see nothing extraordinary, apart from the new Shader Model 3.0 and the alleged HDR support - which nVidia will probably introduce in some shape or form in their products at some point. Looking at the benchmarks, it looks very tight. But then, nVidia can give you this sort of performance right now, whereas you'll have to wait four more weeks for ATI's equivalent offering. Apparently the X1 series will be better for overclocking, which is something, I suppose.
I personally don't see the point in wasting a lot of cash on an uber-performance graphics card when the games that will be able to take full advantage of these new capabilities are still on the drawing board. I don't upgrade often - the X700PE 256MB I'm currently using is a relatively recent addition; prior to that I was using a lowly 9200SE. I've always been a fan of ATI's products, but it looks like they've lost the plot here - big time.
On a side note, the reviews mention that ATI used the same R500 (or C1 as it's officially known) core in their "Xenos" solution for the XBOX 360. I've seen some screenshots of the currently available games for the 360, and they look mind-numbingly awesome, bordering on real-life images. If nVidia can beat that sort of performance, then their new G70 should certainly be worth the wait.
Rookie_42: As for the card itself, I see nothing extraordinary, apart from the new Shader Model 3.0 and the alleged HDR support - which nVidia will probably introduce in some shape or form in their products at some point.
Just as a quick aside, nVIDIA included SM 3.0 support in NV4x, and HDR support in G70. "High Quality AF" is the only demonstrable technological leg up that ATI has over nVIDIA at this point (though the redesigned memory bus may prove fortuitous at some point in the future).
3rd May 2005
Jakkc: What's its clock speed?

500 - 625MHz, depending on which one of the X1 series you get.

C38368: Just as a quick aside, nVIDIA included SM 3.0 support in NV4x, and HDR support in G70. "High Quality AF" is the only demonstrable technological leg up that ATI has over nVIDIA at this point (though the redesigned memory bus may prove fortuitous at some point in the future).
Heh. Shows how much experience I've had with nVidias. :lookaround: