gamedev

Member Since 03 Aug 2012
Offline Last Active Dec 07 2012 11:29 PM

Posts I've Made

In Topic: SSD Recommendations?

14 November 2012 - 04:34 PM

Another vote for the Samsung 840 Pro. It's basically the best SSD you can get these days, and the price isn't too outrageous either.

In Topic: Manifesto In Flames - What Can Be Done?

13 November 2012 - 05:31 PM

So just out of curiosity, I went back and rewatched the much-vaunted manifesto. It says literally nothing about not adding new items, or even anything about item progression at all. Why is everyone using it as the rallying cry over ascended gear?

In Topic: Guild Wars 2 hits 400k concurrent users before launch

28 August 2012 - 05:35 PM

SWTOR only hit about 350k total concurrent users over its entire first week, according to announcements made at the time. GW2 already topped that from prepurchases alone.

In Topic: Official update on DX11 Support

07 August 2012 - 11:07 PM

There seem to be a lot of misconceptions floating around, so here's a little info from someone who has worked with DirectX and knows a fair bit about how it operates. "DX9 graphics" is a misleading way of looking at how a game renders its content. To understand that, you need to look at both pieces of the puzzle: the rendering API, be it D3D9, D3D11, or OpenGL, and the actual underlying graphics hardware. The former is simply a gateway through which the application accesses the features of the latter in a consistent manner. What this means is that changing the rendering API on its own has little to no effect on what you, as the player, actually see on screen.
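To make the "gateway" point concrete, here's a minimal sketch (not GW2's actual code; every name here is hypothetical) of how a game might sit behind a rendering abstraction. Both backends submit the same work to the GPU, so the pixels that come out are identical; only the plumbing differs.

```cpp
#include <cassert>
#include <string>

// Hypothetical abstraction: the game issues the same high-level commands,
// and a backend translates them into API-specific calls.
struct RenderBackend {
    virtual ~RenderBackend() = default;
    virtual std::string name() const = 0;
    // Returns a description of the work submitted to the GPU.
    virtual std::string drawTexturedQuad(int textureSize) const = 0;
};

struct D3D9Backend : RenderBackend {
    std::string name() const override { return "D3D9"; }
    std::string drawTexturedQuad(int textureSize) const override {
        // Under the hood this would be SetTexture/DrawPrimitive, but the
        // visual result depends on the texture and shaders, not the API.
        return "quad@" + std::to_string(textureSize);
    }
};

struct D3D11Backend : RenderBackend {
    std::string name() const override { return "D3D11"; }
    std::string drawTexturedQuad(int textureSize) const override {
        // Under the hood: PSSetShaderResources/Draw. Same pixels out.
        return "quad@" + std::to_string(textureSize);
    }
};
```

Swapping the backend changes which API the work flows through, not what ends up on screen; that only changes if the game starts asking the hardware to do different things.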

Certainly the change is big from the developer's point of view. D3D11 as an API is much nicer to work with, and more closely matches the way modern underlying hardware operates. Additionally, it makes guarantees about the minimum features supported by any hardware branded as "Direct3D11 Compatible". For example, D3D11 cards have a higher minimum texture size. This doesn't mean, however, that a D3D9 card can't also support that same texture size; it simply means that all D3D11 cards are *required* to do so, whereas for D3D9 cards it is optional.
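As a sketch of that guarantee (hypothetical struct and function names, but the numbers are real: D3D11 feature level 11_0 requires every compliant card to support 16384-pixel textures, while D3D9 leaves the limit to per-card caps such as D3DCAPS9::MaxTextureWidth):

```cpp
#include <cassert>

// Hypothetical caps query, mirroring how D3D9 reports per-card limits
// while D3D11 guarantees a minimum on every compliant card.
struct CardCaps {
    bool isD3D11Capable;
    int  maxTextureSize;  // what this particular card reports
};

// Required texture dimension for any D3D11 (feature level 11_0) card.
constexpr int kD3D11GuaranteedMax = 16384;

// On a D3D11 card the guarantee holds by definition; on a D3D9 card
// you must ask, because the same limit is optional, not required.
bool supportsTextureSize(const CardCaps& caps, int size) {
    if (caps.isD3D11Capable && size <= kD3D11GuaranteedMax)
        return true;                    // required by the feature level
    return size <= caps.maxTextureSize; // optional: check the caps
}
```

A high-end D3D9 card may well report a large limit too; the difference is that under D3D9 the developer has to check, while under D3D11 the floor is part of the contract.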

Even though D3D11 enables access to a few more powerful features of higher end cards (such as compute shaders, conservative depth output, and custom coverage masks), GW2 wouldn't be able to make use of them even if it used D3D11 without losing support from lower end cards that don't support those particular features. This would mean doing double the work to put in a code path for the high end feature as well as a fall-back path for lower end cards that don't support it.
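That "double the work" looks roughly like this in practice (a hypothetical sketch, not ANet's code): every optional high-end feature forces runtime detection plus two complete paths, both of which must be written, optimized, and debugged.

```cpp
#include <cassert>
#include <string>

// Hypothetical device query result for an optional D3D11-era feature.
struct DeviceInfo {
    bool hasComputeShaders;
};

// Using an optional feature means shipping both the fast path and a
// fallback, so lower-end cards still render correctly.
std::string chooseCullingPath(const DeviceInfo& dev) {
    if (dev.hasComputeShaders) {
        // Path 1: GPU compute-based culling (high-end cards only).
        return "compute-culling";
    }
    // Path 2: CPU fallback for cards without compute shader support.
    return "cpu-culling";
}
```

Multiply that branching by every optional feature the engine wants to exploit, and the maintenance cost of a second API path becomes clear.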

The primary reason GW2 supports D3D9 is that it needs to support Windows XP. Microsoft significantly reworked the driver model for Vista and Windows 7, and D3D10 and D3D11 require those changes; as a result, targeting one of them would mean losing XP support. As mentioned previously, ANet could write *two* codepaths, one for D3D9 and one for D3D11, but obviously this requires twice as much work, and in the end they'd have to do yet more work for there to be *any* visual differences between the two. Additionally, this doubles the amount of code that needs to be maintained, optimized, and debugged.

As such, they chose the option that gives the widest possible support for graphics cards and OS's, making the best use of the development time they have. Once the game ships and they have more free time to work on updates, they can evaluate a D3D11 path and see whether the advantages it brings for code maintainability and graphical flair on high-end cards are worth the resources required to implement it.

In Topic: Your thoughts on FXAA and lack of real AA?

03 August 2012 - 12:59 AM

thehotsung, on 03 August 2012 - 12:26 AM, said:

According to the official forum, Arena net staff said that DX 10 will be release sometime after launch and DX11 probably much later on.  

Yeah 2012 games with DX9 graphics is shameful, almost like they plan to make this game on console....

This seems to be a very common misconception. "DX9 graphics" is a misleading way of looking at how a game renders its content. To understand that, you need to look at both pieces of the puzzle: the rendering API, be it D3D9, D3D11, or OpenGL, and the actual underlying graphics hardware. The former is simply a gateway through which the application accesses the features of the latter in a consistent manner. What this means is that changing the rendering API on its own has little to no effect on what you, as the player, actually see on screen.

Certainly the change is big from the developer's point of view. D3D11 as an API is much nicer to work with, and more closely matches the way the underlying hardware operates. Additionally, it makes guarantees about the minimum features supported by any hardware branded as "Direct3D11 Compatible". For example, D3D11 cards have a higher minimum texture size. This doesn't mean, however, that a D3D9 card can't also support that same texture size; it simply means that all D3D11 cards are *required* to do so, whereas for D3D9 cards it is optional.

Even though D3D11 enables access to a few more powerful features of higher end cards (such as compute shaders, conservative depth output, and custom coverage masks), GW2 wouldn't be able to make use of them even if it used D3D11 without losing support from lower end cards that don't support those particular features. This would mean doing double the work to put in a code path for the high end feature as well as a fall-back path for lower end cards that don't support it.

The primary reason GW2 supports D3D9 is that it needs to support Windows XP. Microsoft significantly reworked the driver model for Vista and Windows 7, and D3D10 and D3D11 require those changes; as a result, targeting one of them would mean losing XP support. As mentioned previously, ANet could write *two* codepaths, one for D3D9 and one for D3D11, but obviously this requires twice as much work, and in the end they'd have to do yet more work for there to be *any* visual differences between the two. As such, they chose the option that gives the widest possible support for graphics cards and OS's, making the best use of the development time they have. Once the game ships and they have more free time to work on updates, they can evaluate a D3D11 path and see whether the advantages it brings for code maintainability and graphical flair on high-end cards are worth the resources required to implement it.