AGP?


VonMeyer

Waiting for Forgotten Hope 2

50 XP

21st October 2002

0 Uploads

2,212 Posts

0 Threads

#1 16 years ago

What is this AGP option in my video card properties? I have it set to 1X, and my card is overclocked to 272/520. I'd like to know: is AGP necessary for performance, or will it hurt my video card, which is a GF4 Ti4200? Since turning it on my FPS went up and the lag is gone, though of course I also reinstalled BF1942. :cya:




DARK ANGEL

The Internet ends at GF

50 XP

31st December 2002

0 Uploads

102 Posts

0 Threads

#2 16 years ago

You should have it set to 4X, which I think is as high as your card goes, and I'm also guessing the AGP slot in your computer is a 4X slot. Then again, if you have a new motherboard you may even have an 8X slot.

Accelerated Graphics Port (AGP) technology provides a dedicated, high-speed port for the movement of large blocks of 3D texture data between the PC's graphics controller and system memory.
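To put rough numbers on those speed grades: AGP moves data over a 32-bit bus on a 66 MHz base clock, and each multiplier (1X, 2X, 4X, 8X) adds that many transfers per clock. A back-of-the-envelope sketch in Python (the constants are the rounded spec figures; the real base clock is 66.66 MHz, which is why 1X is usually quoted as 266 MB/s rather than the 264 computed here):

```python
# Rough peak AGP bandwidth per speed grade.
BASE_CLOCK_MHZ = 66     # AGP base clock (really 66.66 MHz)
BUS_WIDTH_BYTES = 4     # 32-bit bus

def agp_bandwidth_mb_s(multiplier: int) -> int:
    """Approximate peak bandwidth in MB/s for a given AGP multiplier (1, 2, 4, 8)."""
    return BASE_CLOCK_MHZ * BUS_WIDTH_BYTES * multiplier

for mult in (1, 2, 4, 8):
    print(f"AGP {mult}X: ~{agp_bandwidth_mb_s(mult)} MB/s")
```

Which makes the point in the thread concrete: dropping from 4X to 1X cuts the available texture bandwidth to a quarter.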




VonMeyer

Waiting for Forgotten Hope 2

50 XP

21st October 2002

0 Uploads

2,212 Posts

0 Threads

#3 16 years ago

My card supports 8X, but my motherboard only goes to 4X.




ditchhopper

Death Obsession

50 XP

25th August 2002

0 Uploads

710 Posts

0 Threads

#4 16 years ago

Currently 8X isn't much of a gain over 4X anyway. But with the newer cards coming out and more memory being added, 8X will matter more in the not-so-far future.




apocalypse_kid

I would die without GF

50 XP

20th May 2002

0 Uploads

5,498 Posts

0 Threads

#5 16 years ago

Hi VonMeyer,

Definitely 4X. Keep in mind that the difference between 2X and 4X is not great, only about 15% better performance, but from 1X to 2X it's more like 50%+, so at 1X you are losing heaps.

Also, many mobos based on VIA chipsets won't run AGP properly unless the VIA 4-in-1 drivers are loaded (preferably straight after installing the OS and before loading the video drivers). And many 8X video cards simply won't run in a 1X AGP slot because of the voltage difference (this can damage the video card, so be careful with your settings, guys).

You may find that the mobo auto-detects your video card and sets the AGP speed for you, so it may be running at 4X already. (Gigabyte mobos have special circuitry on the board for this, I believe.)

have fun

:cya: :cya: :cya:




VonMeyer

Waiting for Forgotten Hope 2

50 XP

21st October 2002

0 Uploads

2,212 Posts

0 Threads

#6 16 years ago

This is what the video properties say for my card:

Capabilities: AGP 2.0 (4X, 2X, 1X)

NVIDIA GPU: AGP 3.0 (8X, 4X, 2X, 1X)

Actual: 4X, 2X, 1X

SBA is enabled.

GF4 Ti4200 O/C to 272/520, AA off, AF off, V-sync on, 1024x768, 32-bit, 85 Hz, VIA Hyperion 4-in-1 drivers




viprdude

I don't spend enough time here

50 XP

2nd January 2003

0 Uploads

22 Posts

0 Threads

#7 16 years ago

Turn off vsync, it is useless. You get more FPS.




apocalypse_kid

I would die without GF

50 XP

20th May 2002

0 Uploads

5,498 Posts

0 Threads

#8 16 years ago

Hmm, with the vsync issue it's really yes and no.

With vsync on, your max FPS is capped at your monitor's refresh rate. If your monitor is at 100 Hz and your video card never gets above 100 FPS, then it isn't going to make a bit of difference. If your card does go higher, vsync off will only "appear" to make a difference in measurements; visually there will be no difference, because your monitor is refreshing 100 times a second anyway, so 200 FPS isn't going to make it look any smoother. You're much better off using those extra FPS to increase resolution and visual quality like AA and AF.

Also, having vsync off can cause visual artifacts if your FPS is a certain proportion of your refresh rate (about 1.5, I think). If it is, you will get an effect called "tearing", where the monitor displays part of one frame at the top of the screen and part of the next frame at the bottom, and possibly some discolored pixels and such. If you are having any of those problems, turning vsync on can help.
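The fps-cap half of that argument can be sketched as a toy model (a hypothetical helper, purely to illustrate why frames rendered beyond the refresh rate are wasted with vsync on):

```python
def displayed_fps(render_fps: float, refresh_hz: float, vsync: bool) -> float:
    """Frames the monitor can actually show per second.

    With vsync on, frame swaps wait for the monitor's refresh, so the
    displayed rate is capped at the refresh rate. With vsync off, the
    card swaps as fast as it renders, at the risk of tearing.
    """
    return min(render_fps, refresh_hz) if vsync else render_fps

# A card rendering 200 fps on a 100 Hz monitor gains nothing visible:
print(displayed_fps(200.0, 100.0, vsync=True))   # capped at the refresh rate
print(displayed_fps(80.0, 100.0, vsync=True))    # below the cap, unchanged
```

In other words, with vsync on, the only frames you "lose" are ones the monitor could never have shown anyway.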

What's your AGP aperture, VonMeyer? Sometimes less is better. I find 32 MB to be ideal for my video card; setting it too high can degrade AGP performance, but going under 32 MB can cause big problems.

Have fun

:cya: :cya: :cya:




viprdude

I don't spend enough time here

50 XP

2nd January 2003

0 Uploads

22 Posts

0 Threads

#9 16 years ago

So you would recommend having vsync on? If not, here are my system specs. I have not seen any of those problems, but when I had vsync on, the colors in BF1942, MOH:AA and Spearhead looked richer.

AMD Athlon XP 2100+ at 1.73 GHz, Crucial 512 MB DDR PC2100 RAM, GeForce3 Ti 200 64 MB video card overclocked to 220/500; no heat problems either.

I run games from Counter-Strike to BF1942 to UT2003 to Medal of Honor. Let me know what you think I should run my settings at and I'll give it a try.

NOTE: I have tried vsync on and it works fine with BF1942 and Medal of Honor. But I have some questions about it for Counter-Strike and other Half-Life mods. I have my refresh rate set at 100 Hz, and from what you say the FPS for that game should go up to 100, but it stays at 60 even when I type "fps_max 101", which should raise the cap to 100 but does not; it stays at 60. Any information on this?




apocalypse_kid

I would die without GF

50 XP

20th May 2002

0 Uploads

5,498 Posts

0 Threads

#10 16 years ago

Hi viprdude,

In theory, vsync is only available for DirectX games and is not supported in OpenGL; however, anecdotal evidence suggests it does work with OpenGL in some games. I haven't tested it myself.

You are correct, it should go to 100. However, make sure you check what the game setting actually is: it defaults to 60 Hz unless you set your video mode to a higher refresh rate in the game, so while you may be running your monitor at 100 Hz in Windows, when you run BF it may actually be dropping back to 60 Hz. When you go into video modes you should see a listing for each resolution with different refresh rates; if not, you will need to unlock them by changing a setting in videodefault.con, adjusting the line

"renderer.allowAllRefreshRates 0" to

"renderer.allowAllRefreshRates 1",

Then you should have all refresh rates available.

I don't actually recommend having it on unless you are getting visual artifact problems. All I am saying is that if someone is getting 200 FPS with his Athlon 3200+, 1 GB PC3200 DDR and Radeon 9800, he would be better off increasing the game's quality settings and dropping his FPS, because 100 of those FPS are just wasted.

have fun

:cya: :cya: :cya:



