Hey hey, my aim was to set up my entire PC system to be HD. I have everything I need for it to be done, I just don't have the HD image or audio quality... The GPU does HD, the LCD does, I have the cables, adapters, speakers, audio card, mobo, everything. I basically need a real n00b how-to guide to set it up. Google isn't being much help; is there a setting to change in the nVidia CP drivers so that the GPU outputs HD format quality? Or is it really as easy as plug and play? If it's as easy as plug and play, then it's not working lol
720p/1080i/1080p is just a matter of resolution. If your monitor/TV is using the appropriate resolution then it's displaying 'in HD', although if you're talking about your VW222 then it's not going to be able to display either of the 1080 types. That said, on the other side of things if your content is low def (e.g. a DVD) then you're not going to get 'HD quality' out of it, because it's not HD to start with.
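To put actual numbers on that, here's a quick illustrative sketch in Python (the format resolutions are the standard ones; 1680x1050 is the VW222's native res from its spec sheet):

[code]
# Compare HD format pixel grids against a monitor's native resolution
# to see what it can show without downscaling. 1680x1050 is the
# ASUS VW222's native resolution (from its spec sheet).
HD_FORMATS = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),  # interlaced, but the same pixel grid as 1080p
    "1080p": (1920, 1080),
}

native_w, native_h = 1680, 1050

for name, (w, h) in HD_FORMATS.items():
    fits = w <= native_w and h <= native_h
    print(f"{name}: {w}x{h} ({w * h:,} pixels) -> "
          f"{'fits natively' if fits else 'has to be scaled down'}")
[/code]

720p fits on the panel; both 1080 formats have to be scaled down.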
As for the audio quality, there is no such thing as 'HD audio'. Audio quality is largely subjective but if you think the quality is poor, might I ask what you were expecting with a crap speaker set like the X530? Also, what sound card have you got?
I don't care about the audio, I have decent speakers. I only use the monitor's speakers at LANs, which is when the HDMI cable would be used for audio.
I know for a fact my VW222U does 1080, and I've discovered how to set the format to HD. It's in the drivers; more specifically, the nVidia Control Panel under "Change HD format or resolution".
At the moment it's set to DVI, which explains why the quality isn't as good as it should be. Changing that to HDMI should work, but the only item listed is DVI, more than likely because the adapter is DVI to HDMI... So I need to find a way to force, or hack, the drivers to use HDMI rather than DVI.
I put a Blu-ray drive (a mate's) into my system today and played a Blu-ray movie with it, and the quality was NO better than standard-def DVDs, so the format setting in the CP is certainly playing a part... In addition, I've read in places that playing games in HD introduces a lot of lag, but my games aren't running any slower; if anything they're faster, this being a fresh build.
Any ideas on how to play games in HD? For example, Crysis supports high def, and I assume it will run under it once I get the format set right...
If you don't care about audio quality then don't complain about it in your first post, and if you're calling those X530s decent speakers I'm gonna fall over laughing because they're just not. And yes I do know what they sound like.
As for the monitor, it doesn't do 1080i/p. 1080p uses a resolution of 1920x1080. Your screen has a native resolution of 1680x1050. It will display 1080p content, but it will be scaled down to fit the screen, hence why you're bitching about quality. The only monitors that can do 1080p properly are the big 24" ones that support 1920x1200 (like my 2407WFP). That is a basic fact; if you want to argue with me about it when I'm trying to give you information, you can bugger off and argue with the wall instead.
DVI and HDMI are effectively the same. The difference is that HDMI can carry audio. If you're experiencing poor quality with DVI then it's because of the above paragraph, not because DVI is bad.
As for games, will you bloody listen to me? If you play a game at 1280x720 then that IS 'high definition' resolution. You don't 'play a PC game in HD', there's no special mode to switch on! HD is new shiny shit for home cinema and games consoles; we've all been playing PC games in 'high def' for years!
As for the nVidia control panel, those options are intended for use with TVs, hence the name 'Video and Television'. It even says that in the text at the top of the window. Are you using a TV? No, you're using a monitor. Do we really have to go to such basic levels here?
Get this into your head: high definition is not two magic words. It's a cover-all term for video displayed in relatively high resolutions. The major standards are 1280x720 (720p), 1920x1080 interlaced (1080i) and 1920x1080 progressive (1080p).
1. Your speakers work fine, but they're rubbish speakers and you'll never get good quality sound out of them, unless your sig is lying about which ones you have.
2. Your monitor cannot display 1080 content properly.
3. DVI is an excellent connection type and there's nothing wrong with it.
4. You don't need to do anything with the nVidia control panel.
5. Learn to listen to people that know more than you, or don't bother asking for help. Just because you don't like the answer doesn't make it wrong.
I was stating it can do 1080; I didn't say at what res... HDMI, yeah, is designed to carry audio, but it's also supposed to have better image quality, such as removing color banding in bright scenes.
But yeah, those TV settings apply to monitors as well... It's just as common to use a monitor as a TV these days as it is to use an actual TV...
You don't need to pull an attitude because I'm questioning some things you've said... The videos on hdmi.org don't explain everything, so questions remain.
DVI and HDMI are exactly the same when it comes to video quality. The only differences are that HDMI can transfer audio and video along the same cable whereas DVI can only send video, and that they use different connectors. They use the same encoding method and all that to send the data, which is why you can just use an adapter cable rather than a conversion box. The only reason this wouldn't be the case is if you're using a crap DVI cable and comparing it to a good HDMI one.
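If you want a number to back that up, here's a quick back-of-envelope sketch (the figures are the published spec values: single-link DVI and early HDMI both top out at a 165 MHz TMDS pixel clock, and the CEA-861 timing for 1080p60 is a 2200x1125 total raster including blanking):

[code]
# Back-of-envelope: single-link DVI and HDMI 1.0-1.2 use the same TMDS
# signalling with a 165 MHz maximum pixel clock, so their video
# capacity is identical.
MAX_TMDS_PIXEL_CLOCK_MHZ = 165.0

# CEA-861 timing for 1080p60: 2200x1125 total raster (incl. blanking)
total_w, total_h, refresh_hz = 2200, 1125, 60
required_mhz = total_w * total_h * refresh_hz / 1e6  # 148.5 MHz

print(f"1080p60 needs a {required_mhz:.1f} MHz pixel clock")
print("fits over either link" if required_mhz <= MAX_TMDS_PIXEL_CLOCK_MHZ
      else "too fast for a single link")
[/code]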
You need to set the correct resolution for your monitor like you would normally, i.e. its native resolution (1680x1050 I believe?), and go from there. Those options are designed to be used with specific TVs that support 1080p, 1080i, 720p and the lower EDTV resolutions. Since your monitor only properly supports 1280x720 and lower out of those, you're better off setting the higher native resolution (which will give you a clearer picture on any LCD) and using that. Let the PC itself deal with scaling video and the like, because most monitors are awful at it.
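For the curious, this is roughly the sum the player does when it squeezes a 1080p frame onto that panel (illustrative sketch only):

[code]
# Aspect-preserving downscale of a 1920x1080 frame onto a 1680x1050
# panel (the VW222's native resolution).
src_w, src_h = 1920, 1080
dst_w, dst_h = 1680, 1050

scale = min(dst_w / src_w, dst_h / src_h)  # keep the aspect ratio
out_w, out_h = round(src_w * scale), round(src_h * scale)

print(f"scale factor: {scale:.3f}")                   # 0.875
print(f"displayed at: {out_w}x{out_h}")               # 1680x945
print(f"letterbox bars: {dst_h - out_h}px vertical")  # 105px
[/code]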
PS - The reason it doesn't show HDMI in the nVidia control panel is because you're using the DVI output, hence it shows as DVI. But as I've already stated, you don't lose any quality by using DVI.
But HDMI still supports more colors? Yes? Thus using an adapter with high data transfer bandwidth should (in theory) let you see those extra colors... That's one improvement I, at least, seem to have noticed: better color. But my PC has been apart for a good 2 or 3 months so I can't remember too well what it was like before... All I can remember is the DVI quality was crap at best: heaps of color banding in bright scenes, blurry movement like all hell (the LCD is probably too slow at 60Hz and 2ms, but still), and black was grey... Now black is black and it all seems a lot crisper... but still nothing like what HDMI is reported to do (going by the videos I've seen on hdmi.org).
[quote=Sgt. D. Pilla]But HDMI still supports more colors? Yes?[/quote]
No. What part of 'they are the same for video' are you not getting?
[quote=Sgt. D. Pilla]Thus using an adapter with high data transfer bandwidth should (in theory) let you see those extra colors...[/quote]
As above. Even if there were a difference, it's still a DVI connector at one end, so if DVI supported fewer colors they still wouldn't 'get' to the HDMI part of the cable. But as I said, they're the same.
[quote=Sgt. D. Pilla]But my PC has been apart for a good 2 or 3 months so I can't remember too well what it was like before...[/quote]
Well, there you go.
[quote=Sgt. D. Pilla]All I can remember is the DVI quality was crap at best: heaps of color banding in bright scenes, blurry movement like all hell (the LCD is probably too slow at 60Hz and 2ms, but still), and black was grey...[/quote]
Having looked at a few reviews just now, the ASUS VW222 was absolutely panned in some of them for exactly these problems. That said, a few others gave it decent marks for image quality, so I'm not sure on that one.
[quote=Techgage]The display market is chock-full of models that fill up the quality spectrum, from sub-par up to high-end. The VW222 falls into the sub-par category, sadly, with overall poor color representation and noticeable screen-door effects, resulting in a model that should not be considered for purchase.[/quote]
Could be the monitor, but another possibility is that you were just using a really bad DVI cable and the DVI-HDMI one you're using now is higher quality and so just happens to provide a better signal.
If you're using a £20 DVI-HDMI cable you'll probably get a better picture than with the manky 50p one that came in the box with the monitor. I'm getting bored of saying it, but I'll say it again: for video, DVI and HDMI are technically the same. But that doesn't mean every cable is the same quality.
[quote=Sgt. D. Pilla]Now black is black and it all seems a lot crisper... but still nothing like what HDMI is reported to do (going by the videos I've seen on hdmi.org).[/quote]
See first point.
I didn't read through all of the above, guys, but from what I've read I only see hardware being talked about. Nine times out of ten, when someone tries to play HD movies for the first time they have a common problem: the software that came with their HD burner and/or player doesn't actually support 1080p. You can shell out a lot for PowerDVD Ultra, or just use KMPlayer, a free media player that is perhaps the best out there.
Other than that, you need to make sure you have the right codecs installed, especially if you are playing movies you or friends compress via x264 and AC3, which is one of the most common ways of watching HD content now thanks to its high image quality and low file size.
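If you want to check what a file actually contains before blaming the player, a little script like this will do it (just a sketch: it assumes you've got ffprobe from FFmpeg installed and on your PATH, and "movie.mkv" is a placeholder filename):

[code]
# Print the codec and resolution of a file's first video stream using
# ffprobe (ships with FFmpeg, which needs to be installed separately).
# "movie.mkv" is a placeholder; point it at your own file.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error",
     "-select_streams", "v:0",  # first video stream only
     "-show_entries", "stream=codec_name,width,height",
     "-of", "default=noprint_wrappers=1",
     "movie.mkv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # e.g. codec_name=h264 / width=1920 / height=1080
[/code]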
Great news :D I fixed it! This is a rarity coming from me... "Stupid Vista." As silly (and unexpected) as this is, the monitor required drivers... I downloaded and installed the drivers for it, after reading that it would greatly improve the quality of all output, including DVDs. I didn't notice anything until I looked a bit deeper after installing them... It was defaulted to "Asus VW222U [Analog]", so I changed it to "Asus VW222U [Digital]" and the quality improvement was massive. Furthermore, I then adjusted some driver settings in the nVidia CP, including image vibrance, contrast, noise reduction and edge smoothing, and the quality probably doubled again. What's more, the image banding in bright scenes is gone. All this is while watching Hogan's Heroes in Windows Media Center, which is set to output HDMI content, not DVI.
More craziness here: Windows Media Center has a huge difference in quality when choosing "DVI", "HDMI" or "VGA" monitor output. Choosing HDMI seems to output about 20% better image quality than DVI does... all of this off a DVI interface. So I am seeing HDMI quality coming through now, after screwing with drivers, CP settings and program output settings. Obviously the DVI-D to HDMI adapters are doing something to that digital signal.
Furthermore, I'm investing in an LG Blu-ray reader SuperMulti drive tomorrow, for $170. I opted for the $20 more expensive drive because, quite frankly, there were more details published on it and I could find what I was looking for easily...
LG had a good website with good, easy-to-find details, so I bought their product over the higher-reviewed, better-rated Pioneer drive. So hurrah! Plus, I'm getting that plasma TV for my birthday and Christmas present; it's a joint gift from both parents, my sister and my grandmother lol But well worth it :D