
Bad FPS...why? Is GW2 too CPU bound or....


27 replies to this topic

#1 Kilian

Kilian

    Vanguard Scout

  • Members
  • 317 posts

Posted 15 December 2012 - 07:43 PM

What?
Is it me?

System Specs:

ASrock Extreme3 Gen3
MSI GTX 670 Power Edition
8 GB of G.Skill RAM @ 1600 MHz
Intel i5 2500k @3.3 Ghz
550 Watt Xclio PSU

I play the game maxed out at 1920x1080 (minus supersampling; I keep it at native), and my framerate is horrible at times. In cities I usually range between 25-45 FPS, during holiday events in LA my FPS ranges from 15-30, and in zones/WvW I go from 30-120 (I have a 120 Hz monitor). The majority of the time I stay between 30 and the mid-50s in zones and WvW.

I have had this issue since the game launched, I just finally got tired of it.

I have a system that should be able to pull off 60+ FPS easily. What gives? Other games run just fine on my rig.

Edited by Kilian, 15 December 2012 - 07:43 PM.


#2 Gorwe

Gorwe

    Vanguard Scout

  • Members
  • 480 posts

Posted 15 December 2012 - 08:21 PM

That spec is pretty close to mine (you have a 670, I have a 7850; you have a 2500k, I have a 3570k), though I play at 1680x1050...

I have noticed those lag issues as well. No idea why.

This post is just confirmation that you are not the only one, and that those of us with more powerful rigs have the same problems :(.

#3 The Comfy Chair

The Comfy Chair

    The best at space

  • Members
  • 5331 posts
  • Location:Birmingham, UK
  • Profession:Elementalist
  • Guild Tag:[TKOT]
  • Server:Gandara

Posted 15 December 2012 - 09:09 PM

It's 'normal', just annoying.

If you want to check if your computer will run GW2, check here.

If you find out you can't and need to think about upgrading or building another, check here.


#4 Brivk562

Brivk562

    Pale Tree Seedling

  • New Members
  • 1 posts

Posted 15 December 2012 - 09:18 PM

I am running the following system, and I am almost always above 60 FPS. It's normal for me to hit the 90s.

i5 3570k
ASRock Z77 Extreme4
PNY 660 Ti
8 GB Corsair Vengeance 1600
120 GB SanDisk Extreme SSD
650 W Cooler Master

Everything at stock clocks. You should easily be breaking into the 70s.

#5 sbbee

sbbee

    Fahrar Cub

  • Members
  • 31 posts

Posted 15 December 2012 - 09:49 PM

I have more or less the same setup, except with a 7970, and I can't keep it maxed either. Turning off reflections helps a lot for me.

Edited by sbbee, 15 December 2012 - 09:59 PM.


#6 Kilian

Kilian

    Vanguard Scout

  • Members
  • 317 posts

Posted 15 December 2012 - 10:05 PM

Well, at least I'm not the only one.

View PostThe Comfy Chair, on 15 December 2012 - 09:09 PM, said:

It's 'normal', just annoying.

Really annoying. You'd think Anet would have done something about it by now.
What's the likelihood of them fixing this?

#7 The Comfy Chair

The Comfy Chair

    The best at space

  • Members
  • 5331 posts
  • Location:Birmingham, UK
  • Profession:Elementalist
  • Guild Tag:[TKOT]
  • Server:Gandara

Posted 15 December 2012 - 10:27 PM

View PostKilian, on 15 December 2012 - 10:05 PM, said:

Well, at least I'm not the only one.

Really annoying. You'd think Anet would have done something about it by now.
What's the likelihood of them fixing this?

It'll likely be vastly improved with directX 11 when/if they ever add it.



#8 Stargate

Stargate

    Seraph Guardian

  • Members
  • 1766 posts

Posted 16 December 2012 - 01:42 AM

View PostKilian, on 15 December 2012 - 07:43 PM, said:

I have a system that should be able to pull off 60+ FPS easily. What gives? Other games run just fine on my rig.
Well, first of all, this game is not an FPS game like Unreal Tournament 2004, so you do not really need 60+ FPS.

View PostThe Comfy Chair, on 15 December 2012 - 10:27 PM, said:

It'll likely be vastly improved with directX 11 when/if they ever add it.
I doubt it. DX11 is much faster than DX10. However, I would not say that DX11 is much faster than DX9, though it looks much better. Yes, they might slightly improve FPS, but I would not expect a miracle.

View PostStargate, on 15 December 2012 - 07:23 AM, said:

The best part of Guild Wars 2 for me is the Mists PvP. Problem is, my budget gaming laptop cannot run the Mists PvP well.

That said, I have been very patient about upgrading, but I'll buy a high-end Haswell desktop in late 2013. There is really not much performance difference between a Sandy Bridge and an Ivy Bridge CPU, but when Haswell comes out we'll have a crystal-clear winner, though it won't be some groundbreaking, shocking improvement.
The answer is future computers, plus whatever slight FPS improvements Anet can make. Maybe one day you can actually run GW2 at 60 FPS. I do not promise Haswell will run GW2 at 60 FPS, but I believe high-end Haswell machines will run this game better than an i5 2500k @ 3.3 GHz!

Edited by Stargate, 16 December 2012 - 01:57 AM.


#9 sqwertty278448

sqwertty278448

    Asuran Acolyte

  • Members
  • 91 posts
  • Location:East Coast
  • Server:Stormbluff Isle

Posted 16 December 2012 - 01:48 AM

Lower your graphics settings and stop complaining. I hate people who are mad that they can barely run the game on the highest settings. Some people can't even run the game at all; just be happy it's running fine on your computer.

#10 Stargate

Stargate

    Seraph Guardian

  • Members
  • 1766 posts

Posted 16 December 2012 - 02:17 AM

View Postsqwertty278448, on 16 December 2012 - 01:48 AM, said:

Lower your graphics settings and stop complaining. I hate people who are mad that they can barely run the game on the highest settings. Some people can't even run the game at all; just be happy it's running fine on your computer.
Let people ask what they want... it is interesting and fun sometimes to talk about high-end or future tech. Since I rarely post in the Guild Wars 2 technical support forums, it stays interesting to me and does not feel like work.

Edited by Stargate, 16 December 2012 - 02:19 AM.


#11 Baldur The Bold

Baldur The Bold

    Vanguard Scout

  • Members
  • 443 posts
  • Guild Tag:[ARM]
  • Server:Blackgate

Posted 16 December 2012 - 05:04 AM

View Postsqwertty278448, on 16 December 2012 - 01:48 AM, said:

Lower your graphics settings and stop complaining. I hate people who are mad that they can barely run the game on the highest settings. Some people can't even run the game at all; just be happy it's running fine on your computer.
Lowering graphics settings is moot because the game is CPU bound. They have a right to complain. There hasn't been a single performance patch since beta. The devs said in the Q&A they posted that they're working on things, but I think that is bullshit. Don't expect any performance updates for this game anytime in the near future.

#12 Khlaw

Khlaw

    Vanguard Scout

  • Curse Premium
  • Curse Premium
  • 209 posts
  • Guild Tag:[PoE]
  • Server:Ehmry Bay

Posted 16 December 2012 - 05:11 AM

I run an i7 920 OC'd to 3.2 GHz with a 7970 and 6 GB of RAM, 1920x1080 fullscreen windowed, on Windows 8. Everything is maxed except supersampling, and I very rarely dip under 50 FPS.

I swapped to the 7970 from a pair of 4950s (kinda pointless in windowed mode) and gained around 15 FPS, so I was bottlenecking on the GPU; but the CPU is the real limiter.

#13 Baldur The Bold

Baldur The Bold

    Vanguard Scout

  • Members
  • 443 posts
  • Guild Tag:[ARM]
  • Server:Blackgate

Posted 16 December 2012 - 06:02 AM

You don't dip under 50 FPS in LA during Wintersday? How was the Karka event for you with that 3.2 GHz 920? How is WvW with 100-man zergs, or dragon battles? This game is zerg friendly but cannot support it. Anet needs to get a HUGE performance patch out before we all end up on one server per continent.

#14 Raif89

Raif89

    Asuran Acolyte

  • Members
  • 106 posts
  • Guild Tag:[PD]
  • Server:Sanctum of Rall

Posted 16 December 2012 - 10:40 AM

I've been having the same problem since Wintersday launched. I'm getting massive FPS drops, and I'm using:

i5 2500k OC'd to 4.5 GHz
GTX 580
8 GB RAM
installed on a Crucial M4 256 GB SSD

I'm dropping down to around 5 FPS when I turn around in LA. It's weird. If I turn on supersampling I hover around 28-29 FPS, and with it off at native resolution I sit around 35 FPS with everything else turned up; vsync is off, as is post-processing.

Any ideas?

#15 The Comfy Chair

The Comfy Chair

    The best at space

  • Members
  • 5331 posts
  • Location:Birmingham, UK
  • Profession:Elementalist
  • Guild Tag:[TKOT]
  • Server:Gandara

Posted 16 December 2012 - 11:02 AM

View PostStargate, on 16 December 2012 - 01:42 AM, said:

I doubt it. DX11 is much faster then DX 10. However I would not say that DX11 is much faster then DX9 though it looks much better. Ok yes they might slightly improve FPS, but I would not expect a miracle.

DX10 is much faster than DX9 as well; it's a myth that DX10 is slower. DX10 games that 'ran slower' just used effects that the DX9 versions of the game would have been crippled doing. It's the same with many of the DX11 effects, like the full-screen post-processing used in DiRT 2/3. Go back to 'vanilla' apples-to-apples comparisons and DX10 ran better, the same as DX11 runs better than DX10.

The way multiple threads are handled is quite a lot better in DX11 compared to DX9, and the bottleneck in GW2 at the moment is the CPU.

Edited by The Comfy Chair, 16 December 2012 - 11:03 AM.



#16 Elder III

Elder III

    Technician

  • Technicians
  • 4424 posts
  • Location:OH
  • Guild Tag:[ION]
  • Server:Jade Quarry

Posted 16 December 2012 - 06:01 PM

Overclocking your CPU as high as it will go, in your case to at least 4.2 GHz if not 4.5 GHz, will make the biggest difference for you, particularly in cities and WvW.

#17 Khlaw

Khlaw

    Vanguard Scout

  • Curse Premium
  • Curse Premium
  • 209 posts
  • Guild Tag:[PoE]
  • Server:Ehmry Bay

Posted 16 December 2012 - 06:06 PM

View PostBaldur The Bold, on 16 December 2012 - 06:02 AM, said:

You don't dip under 50fps in LA during Wintersday? How was the Karka event for you with that 3.2ghz 920?How is WvW 100 man zergs or  Dragon battles? This game is zerg friendly but cannot support it. Anet needs to get a HUGE performance patch up before we all end up on one server per continent.
OK, yes, in those two circumstances I had the same issues as everyone else. Figured that was kind of a given :P. I don't know that I've been in a 100-man zerg, but with 50 or so everything is fine, as are dragon battles. It may dip under 50, but still no noticeable issues.

Edited by Khlaw, 16 December 2012 - 06:06 PM.


#18 TheMuayThaiMan

TheMuayThaiMan

    Pale Tree Seedling

  • New Members
  • 1 posts

Posted 17 December 2012 - 12:29 AM

Just wanted to add my machine to the list of ones that aren't running Guild Wars 2 at the level it seems they should:

ASRock Extreme4 motherboard
Radeon HD 7870 (2 GB of video RAM)
Intel Core i5 2500k @ 3.3 GHz
8 GB of DDR3 RAM
700 W Cooler Master PSU

With the game running at the best-appearance preset, I can get up to 63 FPS, but when moving around and turning sharply it can drop to as low as 30ish FPS. With a machine like mine, I would expect better. I just built this computer for gaming and I'm not very happy with this. Anyone have any ideas? Would overclocking my CPU to around 4.0 GHz help my FPS at all? Thanks!

#19 Lord Sojar

Lord Sojar

    Mesmer of Death

  • Site Contributors
  • 2111 posts
  • Location:Inside your GPU, ohaidar
  • Profession:Mesmer
  • Guild Tag:[Heil]
  • Server:Maguuma

Posted 17 December 2012 - 04:20 AM

DX11 code remnants exist in the GW2 engine.

Also, indeed, DX11 is VASTLY superior to DX9 in terms of efficiency, usability and resource management. Additionally, it allows for the exact same effects at a fraction of the processing cost. Improved library functions and direct GPU calling are two of the biggest contributors to this.

Hell, the sheer improvement in threading is reason enough to call it vastly superior.

One trick you can use to maximize GW2 performance on an nVidia card is to override all the craptastical texture management of the GW2 engine with proper settings. You can also get rid of that fugly FXAA implementation they use and switch to MSAA or CSAA for vastly superior results. Use the nVidia Control Panel to override all such settings: force AF to 16x, and turn on normal AA if your card is powerful enough (and a 670 is absurdly enough for GW2).

Lastly, boost the clocks on that i5 2500k. They can easily hit 4.4 GHz on STOCK cooling, and 4.6-5.2 GHz depending on your aftermarket cooler (if present).

That should give you substantial boosts overall. But in Lion's Arch, expecting anything past 50 FPS is pretty much impossible in a really crowded area. My rig is brought down to 33-37 FPS in the worst places in LA (excluding random, very brief massive drops to 15ish, which happen rarely on camera panning).

Using that as a baseline, it stands to reason that your rig, even at base clocks, shouldn't get less than 20 FPS worst case, based on the lower clock and the generation gap between CPUs.
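The extrapolation above is just linear scaling: if the game is truly CPU bound, frame rate roughly tracks effective single-thread throughput (clock speed times a per-clock factor between CPU generations). A toy sketch of that reasoning in Python; the baseline numbers and the ~15% per-clock factor are illustrative assumptions, not benchmark results:

```python
# Naive CPU-bound scaling estimate: assume FPS is proportional to
# effective single-thread throughput (clock * per-clock IPC ratio).
# All numbers here are illustrative assumptions, not measurements.

def estimate_fps(baseline_fps, baseline_ghz, target_ghz, ipc_ratio=1.0):
    """Scale a measured FPS baseline linearly to another clock and IPC."""
    return baseline_fps * (target_ghz / baseline_ghz) * ipc_ratio

# Baseline: ~35 FPS in the worst LA spots at 4.5 GHz on a newer chip.
# Target: a stock 3.3 GHz 2500k, assumed ~15% slower per clock.
worst_case = estimate_fps(35, 4.5, 3.3, ipc_ratio=1 / 1.15)
print(round(worst_case, 1))  # lands a bit above 20 FPS
```

Linear scaling ignores memory bottlenecks and frame-time spikes, so treat it as a rough floor, not a prediction.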

#20 Krill

Krill

    Asuran Acolyte

  • Members
  • 85 posts

Posted 17 December 2012 - 06:19 AM

View PostLord Sojar, on 17 December 2012 - 04:20 AM, said:

Lastly, boost the clocks on that i5 2500k. They can easily hit 4.4 GHz on STOCK cooling, and 4.6-5.2 GHz depending on your aftermarket cooler (if present).

4 GHz+ on a Sandy/Ivy Bridge CPU with stock cooling may be fine at low load, but GW2 exerts a very heavy load on the CPU. I think it's a bad idea to recommend overclocking to someone unless they know what they are doing and have sufficient cooling, a good power supply, and a mobo with robust voltage regulation and thermal protection. All CPUs now have internal thermal protection mechanisms, but it's still possible to damage a mobo by loading it beyond its design specification. Or, more likely, to at least have instability and performance problems from thermal throttling.

I haven't hooked up an inline power meter yet, but this is easily the most resource-intensive game I've played... my 2500k @ 4.6 GHz will actually pump out enough heat to raise the temperature in a 400-square-foot room 2-4 degrees after a few hours of WvW.

Not trying to jump all over you here... more people do need to understand that a ~4.5 GHz Sandy/Ivy Bridge should be, like, the minimum for decent performance in high-traffic areas. But you really need a system that is designed to handle a heavy overclocked load for long periods of time.

Edited by Krill, 17 December 2012 - 06:26 AM.


#21 Lord Sojar

Lord Sojar

    Mesmer of Death

  • Site Contributors
  • 2111 posts
  • Location:Inside your GPU, ohaidar
  • Profession:Mesmer
  • Guild Tag:[Heil]
  • Server:Maguuma

Posted 17 December 2012 - 03:42 PM

You can hit 4-4.2 GHz on a Sandy Bridge without touching the voltage at all, Krill. If the motherboard blows out at that level, you had defective VRMs in the first place. Even basic budget boards can do it.

Also, how did you do that temperature-increase calculation? I hope not from direct draw... because that value isn't correct. Heat dissipation in watts is tricky to calculate as it is, and converting that to a VHC calculation is even more difficult. My thermodynamics is a bit rusty, but there are at least 10-12 formulaic steps to that calculation.

Now, your GPU, depending on the model, throws off enough watts of heat to potentially have that effect when coupled with the total motherboard + CPU heat output. However, the CPU alone? Good lord, no, even at a glance. I'd wager, just as a quick estimate, it would require about 400 W of total dissipated heat, assuming your volume to be constant with 8 ft ceilings and using the normal R5-30-15 dimensional values.
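For what it's worth, the idealized version of this arithmetic is short: energy = air mass x specific heat of air x temperature rise. A minimal Python sketch using textbook constants; it assumes a perfectly sealed room and ignores walls, furniture, and all heat loss, which is exactly why a real room needs far more input power than this suggests:

```python
# Back-of-envelope: time for a steady heat source to warm the air in a
# perfectly sealed room. Real rooms leak heat continuously, so this is
# only a lower bound on time / upper bound on the temperature effect.
AIR_DENSITY = 1.2         # kg/m^3, dry air near room temperature
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K), c_p of dry air
FT3_TO_M3 = 0.0283168

def warmup_minutes(floor_area_ft2, ceiling_ft, delta_c, watts):
    """Minutes to raise the room-air temperature by delta_c degrees C."""
    volume_m3 = floor_area_ft2 * ceiling_ft * FT3_TO_M3
    air_mass_kg = volume_m3 * AIR_DENSITY
    energy_j = air_mass_kg * AIR_SPECIFIC_HEAT * delta_c
    return energy_j / watts / 60

# 400 sq ft, 8 ft ceiling, +2 C, with ~150 W of CPU package heat:
print(round(warmup_minutes(400, 8, 2, 150)))  # roughly 24 minutes
```

With zero losses, even ~150 W warms the air alone in well under an hour; the reason a figure like 400 W is plausible in practice is that the steady-state temperature rise of a real room is set by how fast it loses heat, not just by the input power.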

#22 Krill

Krill

    Asuran Acolyte

  • Members
  • 85 posts

Posted 17 December 2012 - 04:40 PM

Well, I won't pretend to be an electrical engineer, but higher clock speeds do draw more power even without increasing the voltage, so more heat is produced. The unmodified stock heatsink on the 2500k is barely sufficient to cool the CPU at stock speeds under extreme loads like Linpack with the AVX extensions... it will come very close to the thermal limit, and will produce an incredible amount of heat at the VRMs if the board uses the minimum number of phases and/or the cheapest components. GW2 isn't that extreme, but it is a fairly heavy load for extended periods of time.

All I'm saying is that you're right that overclocking to 4.5 GHz+ is a great idea for performance, arguably necessary for WvW, but have sufficient cooling and a beefed-up enthusiast mobo that is designed to handle the load.
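The claim that clock alone raises power linearly, while a voltage bump raises it quadratically, is the standard CMOS dynamic-power approximation P ≈ C·V²·f. A sketch with made-up numbers: the effective capacitance here is back-solved from an assumed 95 W at stock settings, not measured from any real 2500k:

```python
# CMOS dynamic-power approximation: P = C_eff * V^2 * f.
# Illustrative only: C_eff is back-solved from an assumed 95 W at an
# assumed stock operating point (3.3 GHz, 1.20 V), not a measured value.
STOCK_GHZ, STOCK_VOLTS, STOCK_WATTS = 3.3, 1.20, 95.0
C_EFF = STOCK_WATTS / (STOCK_VOLTS ** 2 * STOCK_GHZ)

def dynamic_power(ghz, volts):
    """Estimated switching power in watts at a given clock and voltage."""
    return C_EFF * volts ** 2 * ghz

print(round(dynamic_power(4.2, 1.20), 1))  # clock-only OC: ~27% more power
print(round(dynamic_power(4.6, 1.35), 1))  # OC plus voltage bump: far more
```

The model leaves out leakage current (which also grows with voltage and temperature), so real overclocked power draw tends to climb even faster than this.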

#23 Lord Sojar

Lord Sojar

    Mesmer of Death

  • Site Contributors
  • 2111 posts
  • Location:Inside your GPU, ohaidar
  • Profession:Mesmer
  • Guild Tag:[Heil]
  • Server:Maguuma

Posted 17 December 2012 - 08:48 PM

View PostKrill, on 17 December 2012 - 04:40 PM, said:

Well, I won't pretend to be an electrical engineer, but higher clock speeds do draw more power even without increasing the voltage, so more heat is produced. The unmodified stock heatsink on the 2500k is barely sufficient to cool the CPU at stock speeds under extreme loads like Linpack with the AVX extensions... it will come very close to the thermal limit, and will produce an incredible amount of heat at the VRMs if the board uses the minimum number of phases and/or the cheapest components. GW2 isn't that extreme, but it is a fairly heavy load for extended periods of time.

All I'm saying is that you're right that overclocking to 4.5 GHz+ is a great idea for performance, arguably necessary for WvW, but have sufficient cooling and a beefed-up enthusiast mobo that is designed to handle the load.

I've seen stock cooling hit 4.4 GHz. Sure, Linpack would make it get too hot, but for GW2 it should be fine.

#24 Stargate

Stargate

    Seraph Guardian

  • Members
  • 1766 posts

Posted 18 December 2012 - 01:19 AM

View PostThe Comfy Chair, on 16 December 2012 - 11:02 AM, said:

DX10 is much faster than DX9 as well; it's a myth that DX10 is slower. DX10 games that 'ran slower' just used effects that the DX9 versions of the game would have been crippled doing. It's the same with many of the DX11 effects, like the full-screen post-processing used in DiRT 2/3. Go back to 'vanilla' apples-to-apples comparisons and DX10 ran better, the same as DX11 runs better than DX10.

The way multiple threads are handled is quite a lot better in DX11 compared to DX9, and the bottleneck in GW2 at the moment is the CPU.
Well, let's simply agree that DX11 is best for this game.

According to tests by professionals, DX10 on Crysis and Age of Conan was indeed slower than DX9. However, remember that the first DX10 attempts were done on Windows Vista, which as an operating system is a vast resource hog compared to Windows XP or Windows 7. Many older games also enabled more graphical effects in DX10 mode than in DX9. I am not saying the code itself is slower in DX10 than DX9, and I have never seen DX10 tests on Windows XP.

Anyway, yes, DX11 is best for this game, but there is no way developers can pull a miracle and suddenly lift the average FPS to 60 in Lion's Arch for Lord Sojar on his current computer. So, respectfully, you may be very correct about DX versions, but I believe it is the combination of DX11 and future computers (Haswell or better) that might boost GW2 performance.

View PostKrill, on 17 December 2012 - 06:19 AM, said:

I haven't hooked up an inline power meter yet, but this is easily the most resource-intensive game I've played... my 2500k @ 4.6 GHz will actually pump out enough heat to raise the temperature in a 400-square-foot room 2-4 degrees after a few hours of WvW.
Reminds me of a true, fun story from my personal experience. I live in the far north of Europe, so we already have plenty of snow and ice here and a full-blown winter. Anyway, many years ago, when I was studying, I returned to our student apartment in a very cold winter. I came in the door and it was freezing cold in the whole apartment.

I noticed the balcony door and some windows were open. I asked my friend, what the f*ck are you doing :surprised: ? It is freezing cold in here :devil: !

My friend answered with a smile: wait, don't close them, I am doing an overclocking experiment. That IT people never do crazy things is a myth ;).

Edited by Stargate, 24 December 2012 - 12:27 AM.


#25 Baldur The Bold

Baldur The Bold

    Vanguard Scout

  • Members
  • 443 posts
  • Guild Tag:[ARM]
  • Server:Blackgate

Posted 18 December 2012 - 04:06 AM

This game does make my system pump out the heat, lol. Although we don't have any snow (thanks, global warming), I have had my window open a few times while playing GW2 for an extended period. I have seen my CPU go up to 60°C, which is pretty high even with my cooler.

#26 typographie

typographie

    Golem Rider

  • Members
  • 2006 posts
  • Guild Tag:[LAW]

Posted 18 December 2012 - 08:17 AM

View PostStargate, on 18 December 2012 - 01:19 AM, said:

According to tests by professionals, DX10 on Crysis and Age of Conan was indeed slower than DX9.

DX10 mode in Crysis enabled its ultra detail level, if I remember correctly; DX9 mode simply had less to do, which was necessary, as per Comfy's explanation. It's not like Crysis was ever a shining model of efficiency in the first place, so it may not be a fantastic testbed for such a comparison.

#27 Lord Sojar

Lord Sojar

    Mesmer of Death

  • Site Contributors
  • 2111 posts
  • Location:Inside your GPU, ohaidar
  • Profession:Mesmer
  • Guild Tag:[Heil]
  • Server:Maguuma

Posted 18 December 2012 - 10:17 AM

View PostStargate, on 18 December 2012 - 01:19 AM, said:

Anyway, yes, DX11 is best for this game, but there is no way developers can pull a miracle and suddenly lift the average FPS to 60 in Lion's Arch for Lord Sojar on his current computer.

My rig consumes small planets... the fact that I can't stay framelocked in Lion's Arch shows how poor the client is... sigh.

#28 The Comfy Chair

The Comfy Chair

    The best at space

  • Members
  • 5331 posts
  • Location:Birmingham, UK
  • Profession:Elementalist
  • Guild Tag:[TKOT]
  • Server:Gandara

Posted 18 December 2012 - 08:34 PM

View Posttypographie, on 18 December 2012 - 08:17 AM, said:

DX10 mode in Crysis enabled its ultra detail level, if I remember correctly; DX9 mode simply had less to do, which was necessary, as per Comfy's explanation. It's not like Crysis was ever a shining model of efficiency in the first place, so it may not be a fantastic testbed for such a comparison.

Aye, it also enabled effects that were 'hidden' in the menu, which slowed it down, or handled effects in a different way. Games like Far Cry 2 ran better in DX10 mode versus DX9 when just the API was changed.

Edited by The Comfy Chair, 18 December 2012 - 08:35 PM.





