OK. Any of you tech heads with a passing interest in graphics hardware and software will have heard the rumblings about DX10.1.
MS's Direct3D team recently unveiled further details at the annual SIGGRAPH (Special Interest Group on Computer GRAPHics) conference.
Here's some of the presentation content (http://www.scribd.com/doc/231350/9-Direct3D-10-1) with some further tech details... There's also a great article at Elite Bastards (http://www.elitebastards.com/cms/index.php?option=com_content&task=view&id=103&Itemid=29), but a basic summary is as follows:
*DX 10.1 is a revision of the DX10 API. MS announced that it plans quarterly point revisions of DX10.
*DX 10.1 will introduce Pixel Shader 4.1 and WDDM 2.1 (the latter being the Windows Display Driver Model, i.e. Windows-encapsulated drivers)
*The DX10.1 SDK preview is available for MS TechNet subscribers to have a look-see, though there's not much they can do with it without DX10.1-compliant kit or SP1 ;)
*The DX10.1 SDK and hardware will require Vista Service Pack 1, but can service both DX10 and 10.1 apps/hardware, i.e. it won't be DX10.1-specific.
*Introduces the long-awaited shift from DirectSound to XA2 (XAudio2). A few of you may know this is the same standard used by the X360. This has been done to improve cross-platform development between PC and X360, which fits in with the integration infrastructure of Windows Live and Xbox Live.
Now, the interesting bit:
*Current DX10 hardware devices will *NOT* support DX10.1 functionality.
Now, there is more to this than "OMG! MY DX10 CARD NOW SUX!!", which has been the initial reaction to this (including mine, for a couple of minutes, until reading further ;))
*What the DX10.1 protocols will do is *mandate* performance characteristics for hardware manufacturers. Ever wondered why the ATI <insert model no> card does better/worse in benchies than its Nvidia counterpart with seemingly identical specs??
This is simply because no such protocols existed pre-DX10. It will allow devs to make games to higher standards, without the compatibility worries across the range of cards circulating on the market. If a game is made with DX10.1 and you have DX10.1 hardware, then you have certain functionality built in as a minimum, which did not exist in DX9. e.g. ATI and Nvidia cards with similar specs often worked off different shading models (which was a nightmare for devs to code for)
Here's a slice from the Elite Bastards article which addresses the tech issues that will mean the most to gamers:
Firstly, this revision of the API will see the introduction of 32-bit floating-point filtering over the 16-bit filtering currently on show in DirectX 9 and 10 - this should see improvements to the quality of High Dynamic Range rendering, which uses this functionality, over what is currently available.
On top of this, overall precision throughout the rendering pipeline will also be increased, although to what level doesn't seem to have been publicly specified at present.
...DirectX 10.1 will also see the introduction of full application control over anti-aliasing. This will allow applications to control the usage of both multi-sample and super-sample anti-aliasing, as well as giving them the ability to choose sample patterns to best suit the rendering scenario in a particular scene or title.
Finally, these changes in DirectX 10.1 give the application control over the pixel coverage mask, a mask which is used to help to quickly approximate sampling for an area of pixels. This in particular should prove to be a boon when anti-aliasing particles, vegetation, scenes with motion blur and the like.
All of this additional control handed to the application could allow for anti-aliasing to be used much more wisely and effectively, and controlled by game developers themselves, rather than the current 'all or nothing' implementation available, which basically amounts to a simple on-off switch.
To add further to the additional focus on anti-aliasing in DirectX 10.1, support for a minimum of four samples per pixel (in other words, 4x anti-aliasing) is now required (Although this doesn't necessarily mean that support for 2x anti-aliasing in hardware and drivers is a thing of the past)
So, for those who have a DX10 card, or are contemplating one such as the G80 or R600 series: should you wait??
Whilst there have been no firm announcements, there is currently no production-level work being done by Nvidia or AMD. With the August SDK preview just having been released, it is definitely very early days. Add to this the fact that Vista SP1 also has not been released yet (though it's not too far away).
Devs are still *really* catching up on DX10 alone, with a solid foundation of apps and titles hardly yet established, making planning a DX10.1 purchase still a bit premature.
Speaking of premature, the final and most important consideration when dealing with graphics hardware..... drivers. With DX10 drivers gently edging towards a semblance of functionality (SLI was only enabled for the G80 in Vista as recently as March 2007), the timeframe involved from R&D > testing > SDK deployment > release to the drooling tech-head masses is significant.
In a year's time, when G80 and R600 drivers will be coming into their element, the 10.1-compliant series will be meandering through its formative processes. By then, there will of course have been at least two point revisions to DX10... although it hasn't been stated clearly whether further revisions will create exclusive functionality for each generation of hardware.
The best course of action is to decide what you want to play and what level of performance you want. Then look up some benchies and see what your best options are, relative to the size of your wallet. With the amazing benches posted by the likes of the 8800GTX, I would gladly take one now!!
I guess it's just a reality of tech and computing ;) sucks on the bank balance too !
mtfbwya