Framerate Drops When Aiming in Black Ops 2 PC? Here’s a Fix

Players jumping into the multiplayer and Zombies modes of Call of Duty: Black Ops 2 on PC have been complaining about serious framerate drops. These drops seem to happen specifically when aiming down the iron sights of weapons.

Fortunately, it appears there’s a fix for the issue, and it’s a simple one.

The trouble is with the game’s rendering of “depth of field.” When you look down the sights of a gun in Call of Duty, you’ll generally see that what you’re directly aiming at is in focus, while the rest of the image is blurry — that’s depth of field. It makes far-away things seem far away and makes the game feel a touch more realistic. Unfortunately, it also seems to kill framerates, and if you’re running a more average PC rig, this can be a problem.
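For the curious, here's a rough idea of why the effect costs so much. The sketch below is not Black Ops 2's actual renderer; the real game does this work on the GPU, and everything here is made up purely for illustration. It's a minimal depth-of-field blur in Python, where each pixel gets averaged over a neighborhood that grows the farther its depth is from the focal plane, so the per-pixel cost grows with the square of the blur radius.

    # Illustrative only: a naive depth-of-field blur, not the game's implementation.
    import numpy as np

    def depth_of_field(image, depth, focal_depth, max_radius=8):
        """Blur each pixel more the farther its depth is from the focal plane."""
        h, w = depth.shape
        out = np.empty_like(image)
        for y in range(h):
            for x in range(w):
                # Blur radius grows with distance from the in-focus depth.
                r = int(abs(depth[y, x] - focal_depth) * max_radius)
                y0, y1 = max(0, y - r), min(h, y + r + 1)
                x0, x1 = max(0, x - r), min(w, x + r + 1)
                # Averaging a (2r+1)^2 neighbourhood: cost per pixel scales with r^2.
                out[y, x] = image[y0:y1, x0:x1].mean(axis=(0, 1))
        return out

    # Tiny demo: 64x64 image, depth ramping from near (0.0) to far (1.0).
    img = np.random.rand(64, 64, 3)
    depth = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
    blurred = depth_of_field(img, depth, focal_depth=0.2)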

However, Treyarch was nice enough to make the depth of field option adjustable in the Options menu. Go to your Main Menu, hit Options, then Graphics. Once there, you’ll need to go into the “Advanced” graphics menu to see the depth of field option. When you find it, knock it down to medium or low and you should see a drastic boost in performance when aiming.
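If you'd rather not dig through menus, the same settings are usually saved to a plain-text config file under the game's install folder (typically players/config.cfg for Call of Duty titles). A word of caution: the exact dvar names vary between builds, so the line below is an assumption about what to look for rather than a guaranteed recipe. Back the file up first and match the name against what's actually in yours.

    // Sketch of what the relevant line might look like in players/config.cfg.
    // The dvar name is an assumption; verify it against your own file before editing.
    seta r_dof_enable "0"    // depth of field off entirely (or keep "1" and lower the in-game quality)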

You can also test the situation to see how your framerate is doing, which can help you make other graphical adjustments. In that same menu, set “Draw Framerate” to on. You’ll now have a number in the top-right corner of your screen that lets you know Black Ops 2’s framerate. Use that to try switching around your graphics settings to figure out what you can turn up and what you might want to turn down for best performance.
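If you want something more rigorous than glancing at the counter, jot down the FPS at a steady interval while you play a typical match (or have a capture tool like FRAPS or MSI Afterburner log it for you), then summarize the numbers. Here's a minimal sketch in Python, assuming a made-up file called fps_log.txt with one FPS reading per line; the 1% low figure it prints often says more about how smooth the game feels than the average does.

    # Summarize a plain-text log with one FPS reading per line.
    # "fps_log.txt" is a hypothetical file name for this example.
    with open("fps_log.txt") as f:
        values = [float(line) for line in f if line.strip()]

    avg = sum(values) / len(values)
    print(f"avg {avg:.1f} fps, min {min(values):.1f}, max {max(values):.1f}")

    # 1% low: the average of the worst 1% of readings (at least one reading),
    # which often tracks perceived smoothness better than the plain average.
    worst = sorted(values)[: max(1, len(values) // 100)]
    print(f"1% low: {sum(worst) / len(worst):.1f} fps")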

Finally, you might want to check your graphics card drivers to make sure they’re up to date. In fact, NVIDIA released beta drivers specifically tuned to Black Ops 2. Get more information about them here.

Crush the conspiracy with Game Front’s full Black Ops 2 walkthrough, with step-by-step instructions showing you how to complete each mission.


5 Comments on Framerate Drops When Aiming in Black Ops 2 PC? Here’s a Fix

Scott

On November 13, 2012 at 8:12 pm

I’ve been having a lot of frame rate issues, not only while aiming. It’s all the time. Though, what confuses me is that I can run BF3 at Ultra, with 45-50 fps constantly, but I can’t run Black Ops II well at all. I have the graphics as low as they’ll go, and the max I’ve seen is 30 fps. Any fixes? There’s no way it can be my hardware. Right now I’m using a Six-Core AMD Bulldozer @ 3.3 GHz, 16 gigs of 1866 MHz DDR3 ram, and an AMD Radeon 6870.

Scott

On November 14, 2012 at 9:55 pm

I was having the same problem until I set “sync every frame” to no and capped the max FPS at 60. It doesn’t really degrade the quality and really helps with the framerate drops. All this with an Intel i7, 16GB of RAM, and an Nvidia GTX 670.

me

On November 15, 2012 at 1:40 pm

Yeah, every one of you “super pro” types telling this same story over and over, copying it from one site to another. It’s bull. Drivers won’t help, and lowering settings won’t help either. It’s a coding bug, and it can’t be changed by players.

dicks

On February 22, 2013 at 5:06 pm

Oh, Treyarch is SO nice to include the option to adjust DOF… although no matter what you set it to, not only does it look the exact same but the performance is still equally terrible. If they can’t optimize it properly then we should at least be allowed the option to disable the effect, especially considering that it renders the game unplayable for many (of which, I am one).

ChillyBencher

On August 31, 2013 at 8:44 pm

While BLOPS2 has relatively low requirements for playing @ 1080p (and, relatively speaking, very low for 1440p), it is about as pure a console port as they come: it has some poor coding that players can’t alter, as well as some effects that simply tax your system.

First off, regarding the post about “I run BF3 @ Ultra with 45-50fps constant” using an FX63xx, 16GB DDR3-1866, and an AMD Radeon HD6870….
WHAT?
If you’re playing at 1600×900 or 1280×720 or something, sure, I buy it; especially if you only play Metro or never play in games with more than 20 or so players.

However, it is simply NOT POSSIBLE to achieve that kind of framerate with the game TRULY MAXED OUT @ 1080p/1200p/1440p/1600p/etc.
The FX6000-series are not strong enough to deal with 64 players at once, even overclocked. I haven’t played with the FX6xxx chips in games at all, but the FX8350 @ 4.8GHz was unable to maintain 60fps, and that was with 16GB DDR3-2200 7-10-8-22 1T memory (vastly faster than 1866 9-10-9), paired with either one of my R7970 Lightnings (OC’d to 1350 core / 7250 mem with an EK FC-R7970Ltg full-coverage block), one of my 680 Lightnings (1508 core / 7698 mem with an Aquacomputer Limited Edition 680 Lightning full-coverage block), or one of my 670 FTWs (1486 core / 7998 mem with a Heatkiller GPU-X3 680 Hole Edition full-coverage block and HK backplate).

FX8350 @ 4.8Ghz
Playing for 10min on the same server with a constant 64 players per card, the results:
7970 Lightning: 58fps Avg, 192fps Max, 11fps Min (Note: Extremely choppy, very unpleasant)
680 Lightning: 65fps Avg, 199fps Max, 28fps Min (Note: Smoother but still laggy)
670FTW: 62.8fps Avg, 199fps Max, 24fps Min (Note: Smoothest, power is between the R7970Ltg Ghz Edition and 680Ltg)

Now, with the game played on one of my own rigs…. All this is in a white Case Labs TH10 with magnum pedestal, 120mm extended top, and literally hundreds of accessory bits…
i7 3930K @ 4.98GHz, RIVE, 16GB Ripjaws Z 2133CL9 @ 2488 10-11-10-25 1T, Samsung 830 256GB OS/boot, 2x Plextor M5P Xtreme 512GB in RAID0 for apps/games via an Adaptec 8-series PCIe 3.0 hardware RAID card, 4x WD 1TB Blue (WD10EZEX) single-platter HDDs in RAID0 (798MB/sec sequential), 4x WD RE 4TB in RAID10, 6x WD RE 3TB in RAID5, and an OCZ Vector 256GB used to cache the arrays via Adaptec caching (in addition to the 4GB onboard write-back cache).
Fully custom loop with an Apogee HD CPU block, MIPS R4E motherboard block kit for the front-side MOSFET/VRM and PCH, a Koolance MVR-100 universal MOSFET block modified to replace the hold-down plate on the back as a VRM cooler (there are MOSFETs on both sides of the board, and not cooling the backside = throttling), 2x Bitspower 4-DIMM “Freezer” RAM blocks (acrylic over copper) with universal DIMM adapters, a Watercool Heatkiller universal GPU block on the Adaptec RAID SoC, and always Heatkiller (Hole Edition) or Aquacomputer (AquagraFX or Kryographics) full-coverage blocks and backplates on the GPUs.
2x BP 250mL reservoirs, a Swiftech MCP35X2 dual pump, 98x Bitspower fittings, and 1/2×3/4 Primochill Advanced tubing with Mayhems Pastel White dye added to distilled water.
3x Alphacool NexXxoS UT60 560 (4x140mm) radiators, 1x Alphacool NexXxoS Monsta 560, 1x Alphacool NexXxoS Monsta 360, and 1x Alphacool NexXxoS UT60 240 (16x140mm + 5x120mm worth of rads; 3x140 is slightly more rad than 4x120). All rads use push-pull fans: Bgears Blasters 140mm for the 560s, and Koolance FAN-12038HBK 120x38mm 2600rpm fans for the 360 and 240, with Phobya 140/120 20mm-thick shrouds on the push sides and Phobya 10mm-thick shrouds on the pull sides.

Point is: CPU temps stay at 55C max at 5Ghz, GPU’s, well, I’ve never seen them get more than 7C above room temp, and board/memory stays frosty.

i7 3930K @ 4.0Ghz (HT On/HT Off)
R7970Ltg – 74fps/77fps Avg, 199fps/199fps Max, 52fps/59fps Min
680Ltg – 79fps/84fps Avg, 199fps/199fps Max, 57fps/69fps Min
670FTW – 72fps/78fps Avg, 199fps/199fps Max, 61fps/68fps Min

That’s results with both 6c/12t and 6c/6t so no arguing “Hyperthreading isn’t fair!”.

Same EXACT cards…

What about SLI/CFX? (Note: CFX is HORRIBLE; in fact, I don’t even play games on my AMD cards and haven’t in a year, as the stuttering is so bad I get a migraine. Even a single card will give me a headache within an hour!)

SLI and CFX Results *(See Note)

FX-8350 @ 4.8Ghz with same previously mentioned specs
2x/3x R7970Ltg – 79fps/91fps Avg, 9fps/22fps Min
2x/3x/4x 680Ltg – 86fps/117fps/128fps Avg, 33fps/46fps/51fps Min
2x/3x/4x 670FTW – 86fps/122fps/128fps Avg, 38fps/49fps/44fps Min

i7 3930K @ 4.0Ghz
2x/3x R7970Ltg – 98fps/121fps Avg, 48fps/62fps Min
2x/3x/4x 680Ltg – 106fps/131fps/154fps Avg, 52fps/81fps/83fps Min
2x/3x/4x 670FTW – 103fps/130fps/149fps Avg, 51fps/89fps/81fps Min
(FPS increases by 19-28% at my usual 5GHz CPU clock, but that is the exact amount I lose playing @ 4360×1440, aka “27:9”, since it’s almost exactly 3x as wide as it is tall; the setup is one of my U2713HM 1440p panels in landscape in the center and a single 1440×900 IPS panel in portrait on either side for Surround, with all displays de-bezelled, reducing the gap to 6.8mm.)

*NOTE: unless otherwise mentioned, the maximum was 199fps; anyone can hit that by looking at a black wall, and sometimes it just spikes up there.

I haven’t tested any older cards on newer drivers, and the closest thing I have to a 6870 is a pair of 6850s with unlocked shaders and 1GB VRAM (and full voltage control). Will do my best to find them.

Most (as in more than 50%) of the games out there are going to be GPU-bound; BF3/BF4 (as well as Crysis 3, btw) are easily bottlenecked by the CPU before the GPU, a trend that is guaranteed to continue thanks to the pseudo-“8-core” Jaguar processor used in both the XBONE and PS4.