
CPU vs. GPU question about optimization


14 replies to this topic

#1 Thorfinnr

Thorfinnr

    Vanguard Scout

  • Members
  • 475 posts
  • Location:Hangin' with the Kodan
  • Profession:Warrior
  • Guild Tag:[DBoS]
  • Server:Henge of Denravi

Posted 26 November 2012 - 06:37 PM

Does anyone know if there are plans to split the processing load a little better between the CPU and GPU?

I know my CPU gets hammered with this game. It handles it, but when I get into really big crowds, like the world event for the Lost Shores, things started to get jerky. I don't usually run into this in other events, and I figured that's because the population in those events/local areas is lower than what we had for Lost Shores. B)

Now, I understand that could have been the game itself being laggy due to such a high population on each server while the event was going on, but I haven't read anything in a while about any further optimizations to the game and how it uses your PC's resources.

Does anyone have any tips on settings to tweak for the best experience? I understand the basics of the in-game adjustments, but has anyone found any tricks for the settings in your graphics card's interface to help out with what little processing it is handling?

Thanks for any and all info... :cool:

Thorfinnr

#2 dhatcher1

dhatcher1

    Technician

  • Technicians
  • 3786 posts
  • Guild Tag:[SAnD]
  • Server:Maguuma

Posted 26 November 2012 - 08:44 PM

There is a tech support forum where this really belongs. It is filled with (literally) hundreds of threads of people trying different things with different levels of success. Anet wrote their own engine and client, and it is unique: it does not react like any other game, and it reacts inconsistently even for people with similar hardware configurations.

If the game client/engine were coded in a more mainstream fashion, we would have fewer problems figuring out how to tune performance, but some game features or graphics might not have been possible. IMO it's a shame they stuck to a DX9-only client. I suspect part of the problem is that they are doing things with DX9 that really shouldn't be done with it, and that DX11 would have handled much more efficiently and reliably.

When Asura push the tech envelope...

Edited by dhatcher1, 26 November 2012 - 08:46 PM.


#3 Ezendor

Ezendor

    Vanguard Scout

  • Members
  • 372 posts
  • Guild Tag:[SYN]
  • Server:Sanctum of Rall

Posted 27 November 2012 - 03:39 PM

dhatcher1, on 26 November 2012 - 08:44 PM, said:

There is a tech support forum where this really belongs. It is filled with (literally) hundreds of threads of people trying different things with different levels of success. Anet wrote their own engine and client, and it is unique: it does not react like any other game, and it reacts inconsistently even for people with similar hardware configurations.

If the game client/engine were coded in a more mainstream fashion, we would have fewer problems figuring out how to tune performance, but some game features or graphics might not have been possible. IMO it's a shame they stuck to a DX9-only client. I suspect part of the problem is that they are doing things with DX9 that really shouldn't be done with it, and that DX11 would have handled much more efficiently and reliably.

When Asura push the tech envelope...

DX11 only improves things on the GPU pipeline front. GW2 is solely held back by its horrible CPU code.

#4 typographie

typographie

    Seraph Guardian

  • Members
  • 1804 posts

Posted 27 November 2012 - 05:28 PM

Ezendor, on 27 November 2012 - 03:39 PM, said:

GW2 is solely held back by its horrible CPU code.

How have other MMOs run when they tried to shove 50-100 players into one place and make them fight? Maybe there's room for more improvement, but I'm not convinced that "demanding" automatically means "horribly coded."

Thorfinnr, on 26 November 2012 - 06:37 PM, said:

Does anyone have any tips on settings to tweak for the best experience?

Shadows and reflections seem to carry some additional CPU workload, so I'd try lowering those first. Video cards are pretty well taxed in GW2 as well, so it still makes sense to try the graphics settings first, especially if you're using slower hardware.

I don't think Anet has said much about performance in a while, but I'm sure they'll continue optimizing the game incrementally if they find more serious issues that they can easily fix. Don't expect miracles, though. I know it's frustrating, but an upgrade is usually the only real cure for old hardware.

#5 dhatcher1

dhatcher1

    Technician

  • Technicians
  • 3786 posts
  • Guild Tag:[SAnD]
  • Server:Maguuma

Posted 27 November 2012 - 05:53 PM

Ezendor, on 27 November 2012 - 03:39 PM, said:

DX11 only improves things on the GPU pipeline front. GW2 is solely held back by its horrible CPU code.

My belief is that this is because they are using the CPU to do some things that the GPU would be doing in other game clients, probably because they are doing things beyond what the DX9 graphics interface can really handle.

#6 Angelus359

Angelus359

    Vanguard Scout

  • Members
  • 358 posts
  • Location:Illinois

Posted 27 November 2012 - 05:59 PM

As someone who played Dark Age of Camelot successfully on a 1.2GHz Athlon (pre-XP series), in a 300-man raid without any stuttering, I have to say that GW2's CPU code is not well optimized.

#7 tijo

tijo

    Technician

  • Technicians
  • 3169 posts
  • Guild Tag:[RISE]
  • Server:Stormbluff Isle

Posted 27 November 2012 - 06:59 PM

Angelus359, on 27 November 2012 - 05:59 PM, said:

As someone who played Dark Age of Camelot successfully on a 1.2GHz Athlon (pre-XP series), in a 300-man raid without any stuttering, I have to say that GW2's CPU code is not well optimized.

You do realize you are comparing a 2001 game to a 2012 one. I'm not saying GW2 is fully optimized; it is optimized to a certain point (far from perfect, though). I'm just saying that your comparison is far from ideal.

#8 Angelus359

Angelus359

    Vanguard Scout

  • Members
  • 358 posts
  • Location:Illinois

Posted 27 November 2012 - 08:04 PM

My 2001 game was being played on 1999 hardware.

Outside of graphical concerns, the required load is almost identical: it had DoTs, buffs, AoE, summons, castles, siege, and LOS calculations.

Outside of graphics, what's different?


tijo, on 27 November 2012 - 06:59 PM, said:

You do realize you are comparing a 2001 game to a 2012 one. I'm not saying GW2 is fully optimized; it is optimized to a certain point (far from perfect, though). I'm just saying that your comparison is far from ideal.


#9 tijo

tijo

    Technician

  • Technicians
  • 3169 posts
  • Guild Tag:[RISE]
  • Server:Stormbluff Isle

Posted 27 November 2012 - 09:05 PM

Angelus359, on 27 November 2012 - 08:04 PM, said:

My 2001 game was being played on 1999 hardware.

Outside of graphical concerns, the required load is almost identical: it had DoTs, buffs, AoE, summons, castles, siege, and LOS calculations.

Outside of graphics, what's different?

I'm not an expert on this, but the way I see it, it's not just what you calculate, but how you calculate it, the precision, and the number of calculations being done.

Saying it calculates the same things is accurate; saying the load is the same is not.

Take an AoE spell like Meteor Shower as an example. If you apply the damage to a specific surface area, the calculations are less intense than calculating the trajectory of each projectile and detecting whether it hits an enemy. In GW2, each meteor counts as a projectile, so there are more calculations than with applying the damage to an area of a given radius.
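The area-vs-projectile difference can be sketched in a toy example (Python; all names and numbers here are made up for illustration, not GW2's actual implementation). The area version does one distance check per enemy; the projectile version steps each meteor along its trajectory and tests for impacts at every step, multiplying the work:

```python
import math

def area_damage(center, radius, enemies):
    """One distance check per enemy: hit everything inside the circle."""
    return [e for e in enemies if math.dist(center, e) <= radius]

def projectile_damage(meteors, enemies, hit_radius=1.0, steps=30):
    """Simulate each meteor's fall step by step and test for impacts:
    roughly meteors * steps * enemies checks, instead of just enemies."""
    hits = []
    for start, target in meteors:
        for step in range(1, steps + 1):
            t = step / steps
            pos = (start[0] + (target[0] - start[0]) * t,
                   start[1] + (target[1] - start[1]) * t)
            hit = next((e for e in enemies if math.dist(pos, e) <= hit_radius), None)
            if hit is not None:
                hits.append(hit)
                break
    return hits
```

With many meteors, many simulation steps, and many players in range (a zerg fight), the second approach scales much worse on the CPU than the first.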

I'm guessing on this one, but collision detection is likely done by the CPU, and the more polygons you have, the more complicated collision detection is bound to be.

If the CPU load were the same for every game, we wouldn't need beefier CPUs for games; we'd only need beefier video cards.

Also, for the record, GW2 outside of massive zerg fests (which IMO should be considered separately from the rest of the game) runs perfectly fine on 2010 hardware.

Edited by tijo, 27 November 2012 - 09:07 PM.


#10 Ezendor

Ezendor

    Vanguard Scout

  • Members
  • 372 posts
  • Guild Tag:[SYN]
  • Server:Sanctum of Rall

Posted 28 November 2012 - 02:10 AM

typographie, on 27 November 2012 - 05:28 PM, said:

How have other MMOs run when they tried to shove 50-100 players into one place and make them fight? Maybe there's room for more improvement, but I'm not convinced that "demanding" automatically means "horribly coded."


Standing in the middle of nowhere with no one else around and CPU utilization at 75% is "demanding" but not "horribly coded"? My GTX 670 doesn't even hit 60% most of the time, so no, the bottleneck is not my GPU.

#11 Angelus359

Angelus359

    Vanguard Scout

  • Members
  • 358 posts
  • Location:Illinois

Posted 28 November 2012 - 04:51 PM

Collision detection, if I remember correctly, is floating point :P and can therefore be done on the GPU :P

GW2 doesn't use high levels of precision anyway. That's why everything is calculated at intervals and in integers. Nothing uses decimals except boon duration.

CPU requirements go up when GPU requirements go up, because the CPU has to prepare data to pass to the GPU. In many games, GPU requirements have actually gone up by huge margins more than CPU requirements.



tijo, on 27 November 2012 - 09:05 PM, said:

I'm not an expert on this, but the way I see it, it's not just what you calculate, but how you calculate it, the precision, and the number of calculations being done.

Saying it calculates the same things is accurate; saying the load is the same is not.

Take an AoE spell like Meteor Shower as an example. If you apply the damage to a specific surface area, the calculations are less intense than calculating the trajectory of each projectile and detecting whether it hits an enemy. In GW2, each meteor counts as a projectile, so there are more calculations than with applying the damage to an area of a given radius.

I'm guessing on this one, but collision detection is likely done by the CPU, and the more polygons you have, the more complicated collision detection is bound to be.

If the CPU load were the same for every game, we wouldn't need beefier CPUs for games; we'd only need beefier video cards.

Also, for the record, GW2 outside of massive zerg fests (which IMO should be considered separately from the rest of the game) runs perfectly fine on 2010 hardware.

Edited by Angelus359, 28 November 2012 - 04:52 PM.


#12 dhatcher1

dhatcher1

    Technician

  • Technicians
  • 3786 posts
  • Guild Tag:[SAnD]
  • Server:Maguuma

Posted 28 November 2012 - 06:56 PM

Angelus359, on 28 November 2012 - 04:51 PM, said:

CPU requirements go up when GPU requirements go up, because the CPU has to prepare data to pass to the GPU.

In many games, GPU requirements have actually gone up by huge margins more than CPU requirements.

Your first statement describes the common case, but it is not strictly required by the technology. You can also do GPU emulation, performing effectively all of the graphics calculations on the CPU and passing very little to the GPU for display (which I think Anet is guilty of doing too much of); or you can go all the way to Tesla and run your entire computer off nothing but the GPU.
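The "CPU has to prepare data for the GPU" overhead can be illustrated with a toy cost model (the constants and function names below are invented for illustration; real driver costs vary widely and are not this simple). Each draw call carries a fixed CPU cost, so submitting many small batches costs the CPU far more than submitting the same geometry in one large batch:

```python
# Toy model of CPU-side draw-call submission cost (made-up numbers).
PER_CALL_OVERHEAD_US = 50   # fixed CPU cost per draw call (validation, state setup)
PER_VERTEX_COST_US = 0.01   # CPU cost per vertex copied into the command stream

def cpu_submit_cost(batches):
    """batches: list of vertex counts, one entry per draw call."""
    return sum(PER_CALL_OVERHEAD_US + n * PER_VERTEX_COST_US for n in batches)

unbatched = cpu_submit_cost([100] * 500)   # 500 objects, one draw call each
batched = cpu_submit_cost([100 * 500])     # same geometry in a single call
# unbatched is tens of times larger than batched: the fixed per-call
# overhead, not the vertex data itself, dominates the CPU cost.
```

This is one reason a crowded zerg fight (many distinct characters, each needing its own draw calls and state changes) hits the CPU so much harder than a static but detailed scene.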

Part of the reason GPU requirements have gone up so fast is monitor resolution. Just a few years ago, 1024x768 was the most common screen resolution; now 1080p has to be assumed. That's nearly a threefold increase in pixels just to tread water.
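The resolution arithmetic checks out: comparing per-frame pixel counts directly,

```python
old_pixels = 1024 * 768          # 786,432 pixels per frame
new_pixels = 1920 * 1080         # 2,073,600 pixels per frame
ratio = new_pixels / old_pixels  # ~2.64x more pixels to shade every frame
```

so moving from 1024x768 to 1080p asks the GPU to shade roughly 2.6 times as many pixels per frame before any increase in per-pixel shading cost.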

Edited by dhatcher1, 28 November 2012 - 06:57 PM.


#13 Baldur The Bold

Baldur The Bold

    Vanguard Scout

  • Members
  • 443 posts
  • Guild Tag:[ARM]
  • Server:Blackgate

Posted 28 November 2012 - 07:59 PM

I can't really see Anet doing anything more than minor optimizations to the engine. Basically, we have to wait for 6GHz+ CPUs to come out and hope that will fix the issues.
The GPU requirements in this game are going to stay fixed at current-gen specs in order to maintain high performance.
I doubt that we will ever get a DX11 patch.
That being said, with my 680 OC, SweetFX, and supersampling, the game looks very nice, and performance is good when limited to non-zerg environments.
It really is too bad that they decided to rework the old GW1 engine instead of developing a new one, since they had the time.

#14 Lord Sojar

Lord Sojar

    Mesmer of Death

  • Site Contributors
  • 2111 posts
  • Location:Inside your GPU, ohaidar
  • Profession:Mesmer
  • Guild Tag:[Heil]
  • Server:Maguuma

Posted 28 November 2012 - 10:55 PM

Haswell should help some with GW2 performance.  

But in reality, the thing that's really going to boost it?  AMD's Steamroller.  The architecture Steamroller utilizes is nearly perfect for how GW2's engine functions.

#15 Angelus359

Angelus359

    Vanguard Scout

  • Members
  • 358 posts
  • Location:Illinois

Posted 30 November 2012 - 05:22 AM

Haswell is mostly about power optimizations, with a very small performance boost, unless your code uses AVX2, which is extremely unlikely because at the moment no processors support it.



