Anyone tried the Crysis SP demo? Although I'm not an FPS fan, this game seems quite intriguing, especially to a techie curious to see how hard it crunches graphics cards.
I've since had a go, and from what I saw - it is indeed visually stunning... so much damn foliage!!
But does someone with a 1600p monitor have a chance of having any joy with maxxed out settings??
I daresay unless you have one of those Quad Extremes and two 8800GTX Ultras, you won't see this game at its full potential at 1600p. The variety of visual effects used makes it an almost photo-realistic experience - but without such a setup, you'll have to opt for low-med settings to get an acceptable framerate (i.e. 60 :p)
With med settings and AA off I can get away with 1920x1080 on my X2 6000+ and 8800GTS 640. But the beauty is indeed in the high-very high settings.
I'm just going to have to wait for a single card that can equal two 8800GTX Ultras in SLI to get that 1600p maxxed out goodness :D I wonder when such a card will arrive?? Is the G90 up to it??
anyone else tried crysis yet ??
mtfbwya
It seems the game can bring even two 8800GTXs in SLI to their knees, and that no system can perform decently at ultra high settings:
http://www.lesnumeriques.com/article-394.html (you can still check the benchmarks even if you don't understand French - just click on the appropriate screenie at the bottom of the page to access them).
SLI benchmarks for the 640 MB 8800 GTS:
http://www.lesnumeriques.com/article-394-2499-93.html According to the article, they also tried an SLI setup of 8800GTXs, but the difference from a single-card solution was almost nonexistent, so they didn't publish the results. They contacted nvidia, who told them the demo wasn't optimised for SLI and the results should be better in the final version... once the drivers are optimised too.
lolz Astro, got too much pride to lower the rez for more eye candy? Or does it hardly make any difference?
well, i think that i can certainly get away with 1600x1200 on my compy with most settings at High and several others at Very High (Textures, Shaders, and Shadows). the game itself runs around 25-40 FPS on my system, but the benchmarks run pretty slow (CPU @ 17 FPS, GPU @ 19 FPS) even with my Athlon OC'd to 2.6 GHz and the 2900 Pro OC'd to 800 MHz core / 1100 MHz mem.
with the testing i've done on my computer, the 64-bit version does run a bit smoother overall, but it doesn't really improve the framerate much. i would like to get 4GB of RAM in my system so i can really put the demo through its paces, and i can tell that the game could use that much RAM as my hard drive and the ReadyBoost drive are getting paged very often throughout the demo.
on our SLI rig at work, the Crysis demo runs a bit better, but curiously not by much. the rig slows down to about 15-25 FPS with all settings at Very High. with identical settings, the demo runs similarly to mine but with a slight boost to the framerate. as mentioned previously, the demo does not appear to be optimized for dual graphics cards as of yet, and it really shows with that rig. it's hard to tell if it's even optimized for quad-core CPUs either, since that rig doesn't perform much better than mine despite having a superior processor.
anyways, i'm still stoked about this game. i'm very much looking forward to the launch, and i'm sure that my system will be more than ready for it. :)
lolz... negsun - it ain't pride - when you have 30" of 16:10 screen, dumbing down the rez makes it look bad. It all looks much prettier on my 19" 1280x1024 4:3 screen
@stinger - we have a very similar setup... I don't think you're going to get anywhere near 60fps with all high settings at 1600x1200... medium, sure, with maybe a couple of things on high.
As D33 mentions, I'm not sure what hardware exists that can manage this at ultra high!! Imagine being an 8800GTX Ultra owner and having your system get its a$$ kicked by a game... it'd be a massive dent to one's braggissimo :p
I've giggled reading those reports of people thinking they're 'enabling DX10 effects' in XP by messing with config scripts. Fact of the matter is, the game launches different settings depending on what hardware it detects. If you jump into Task Manager whilst in the game, you can see it's running Crysis DX10 in Vista... and DX9 in XP. The game obviously loads the different shader sets etc. on installation. I'm not sure if it was a restriction of the demo, but there were no DX10 setting tweaks in the demo - like in Bioshock, for example.
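If you'd rather script that Task Manager check than eyeball it, here's a rough Python sketch of the idea. It assumes the third-party psutil package, and the 'crysis' name match is my guess - check what your install actually launches, since the DX9/DX10 tag itself shows up in the game's window title rather than the process list:

import psutil  # third-party package: pip install psutil

# List any running process whose name looks like Crysis, along with the
# path of the binary that was actually launched.
for proc in psutil.process_iter(['name', 'exe']):
    name = (proc.info['name'] or '').lower()
    if 'crysis' in name:
        print(f"{proc.info['name']} -> {proc.info['exe']}")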
still, kudos to crytek for handing nvidia and ATI their a$$es on a plate :D
mtfbwya
...*sniff* This isn't really a counselling thread, is it? It's there to mock us weaklings with stone age rigs, right?
*wail*
Curse you, vile Astro!
lolz...sabre - haven't you been reading!! there's no one on the planet that can run this game with ultra HQ settings at ultra high rez.
mtfbwya
Well, technically, every DX10 graphics card owner can??
lolz... negsun - it ain't pride - when you have 30" of 16:10 screen, dumbing down the rez makes it look bad. It all looks much prettier on my 19" 1280x1024 4:3 screen
Hey I'd be happy with that kind of screen :)
Well, technically, every DX10 graphics card owner can??
looolz!!!
lolz...sabre - haven't you been reading!! there's no one on the planet that can run this game with ultra HQ settings at ultra high rez.
Crowy made a lovely joke about this once while we were on MSN, about the low FPS you get in Crysis. It went something like: Crysis is excellent in every way except frame rates. That suit can do everything and it's cool, but there are severe slowdowns during fights. The suit should have had a "Maximum Frames" ability. You switch it on. Okay, kickass time. FPS jumps way up. :D
If only... *sigh*
Well, technically, every DX10 graphics card owner can??
nope, that's what we've been saying. Even the ultimate DX10 card, the 8800GTX, is getting its a$$ kicked at maxxed out settings :) lolz
I think I recall Mr T commenting on this......
http://i215.photobucket.com/albums/cc288/Astrotoy7/CrysisT.jpg
:p
mtfbwya
^ Post of the year!
I lol at people who bought GTXes and Ultras...Who be teh n00bs now?
Early adopters always get screwed.
As I said in another thread: Those with 8800GTXs should sell them now while they're still worth something.
lolz...don't get me wrongo - I'd love a GTX meself ;) But I know *many* people who forked out the extra $100s just to get a GTX/GTX Ultra for their "crysis box". Whilst they still get superior performance to the rest of the cards on the market, I'd imagine it'd be a tad annoying if you game at high rez or want max settings.
It's going to be really interesting to see how nvidia and ATI respond. What gave me a chuckle is how nvidia.com has a link to the crysis demo, labelled "are you ready?"
I dont think they themselves are ready, let alone any of us :D
mtfbwya
You've got a GTX in that Shuttle you have? I thought you needed at least 600W+ for that monstrosity...
smallforms work differently, negsun.... For starters there are only 2 expansion slots. I have a Shuttle SN27P2 and an eVGA 8800GTS 640. The SN27P2 has a proprietary 450W PSU that can handle anything thrown at it - as long as it physically fits in the case!! I know people with other Shuttle smallforms that power a GTS/GTX with the Shuttle 350W PSUs... like I said - smallforms work differently :D
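To put some rough numbers on it - and I stress these draw figures are ballpark guesses of mine, not measurements - a quick Python back-of-the-envelope shows why 450W has headroom to spare:

# Hypothetical power budget for my SN27P2 - every figure is an estimate.
draws_watts = {
    "X2 6000+ CPU": 125,           # AMD's rated TDP for the chip
    "8800GTS 640": 145,            # roughly the card's rated board power
    "board/RAM/drives/fans": 80,   # generous catch-all guess
}

total = sum(draws_watts.values())
print(f"estimated load: {total}W of 450W")  # ~350W, inside spec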
mtfbwya
well, i have the full version of Crysis now, and needless to say, it's awesome. :D
but, i do have a quick warning: after a couple hours of gameplay, my Radeon hit 87C, and the game crashed on me since the card had to underclock itself to bring the temps back down. my friend who uses my old GeForce 8800 GTS also had some problems with Crysis overheating his card.
my solution was to use Rivatuner to have the onboard fan run at max RPM. although the fan sounds like a vacuum cleaner, it actually manages to keep the temps below 70C in the above scenario. my friend had to do the same thing as well.
my CPU was fine, though, as it kept itself at a reasonable 52C.
this is just a fair warning to anyone that already has heat problems with their systems. ;)
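if anyone wants to script a rough temp check instead of babysitting the fan, here's the shape of it in Python. fair warning: read_gpu_temp() is a hypothetical helper that just returns a dummy value here - you'd have to hook it up to whatever readout your tuning tool exposes (Rivatuner can log temps to a file, for instance) - so treat this as a sketch of the idea, not a working tool:

import time

WARN_C = 80  # warn well below the ~87C point where my card started throttling

def read_gpu_temp():
    # hypothetical helper: replace this with a parse of wherever your tuning
    # tool logs GPU temps (e.g. Rivatuner's hardware monitoring log).
    # returns a fixed dummy value so the sketch runs as-is.
    return 65

while True:
    temp = read_gpu_temp()
    if temp >= WARN_C:
        print(f"GPU at {temp}C - crank the fan or take a break!")
    time.sleep(30)  # poll every 30 seconds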
^^^
Well, it's nice to see that it's running OK on your new rig, stingerhs. :)
Ever considered aftermarket heatsinks for your video card and CPU?
Lots of people who have bought 8800GT cards are getting insane overclocks out of them, but then again the 8800GT comes with a pretty crappy single-slot stock cooler, while your 2900 comes with a pretty good dual-slot one.
I was just reading a guide on how to replace the stock cooler on your GPU and it looks complicated as f*** to me...but maybe that's just cause I'm a n00b when it comes to that.
actually, i have an aftermarket cooler for the CPU. it's the nMediaPC Icetank. it uses a copper block with four copper heatpipes that curve upwards into a set of oval cooling fins made of both aluminum and copper. it has a 90mm fan sitting on top that runs nearly silent.
the main reason the CPU is getting hot has a lot to do with the overclock, as it runs much, much cooler at stock speeds. it would probably run a bit better if i upped the voltage to the processor, but i'm trying to conserve power as best i can while still getting good performance from the CPU.
as for the GPU, i was unaware that an aftermarket cooling solution existed for the 2900 series of cards until a friend at work pointed me in this direction (http://www.newegg.com/Product/Product.aspx?Item=N82E16835109020). *orders heatsink and compatible 92mm fan*
lolz...I was wondering about heat issues and this game.... as I have epilepsy, I only game in 30ish minute blocks, so it'll more likely be my neural circuits that will go cactus before the 8800GTS does :p
@stinger...you put a Radeon in your rig.... I didn't know that.. My commiserations :p That CCC is the direst bowl of rubbish a man/woman/goat/cat could ever fathom [/nvidia fanboy] Still, I'm sure the card itself is doing smashingly.
For us nvidia peeps...169.04 beta has *finally* corrected this annoying menu bug that existed in some EA sports games. My FIFA 08 experience is now beyond HD :D
after you've had enough shooting, let us know how things are performing settings-wise..
mtfbwya
It's been a couple of years since I messed around with an ATI card (a 9800XT) and the CCC, but I distinctly remember it being a rather unpleasant experience :swear: (there was nothing wrong with the card; I just hated the CCC). I did think that the ATI card delivered better picture quality compared to its Nvidia counterpart (a 6600GT), though.
As for ForceWare, I'm still using good ol' 93.71 that I installed a year ago, and I see no reason to upgrade as I'm not playing any new games on my antiquated machine.
Anywho, I've read recently that there will be a new 8800GTS card released in about a month. It will sport the G92 GPU like the 8800GT, but have all 128 stream processors enabled (like the present G80-based 8800GTX) along with the ability to run at much higher clockspeeds. If the overclocks I've seen people getting with the 8800GT are any indication, this card will be a beast (at least 20% faster than the present 8800GTX). Whereas the G80 seems to top out at ~640-650MHz, the G92 can do 780MHz or more with the right aftermarket cooler. It also uses about 25% less power.
I can only guess that the RV670 will see a similar jump in clockspeeds and overclocking ability, given that it's an even more drastic die shrink, down to 55nm, whereas G92 is 65nm. We should also expect a similarly drastic drop in power usage compared to R600.
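To sanity-check that "20% faster" guess, the arithmetic is just the clock ratio. A rough sketch in Python, using the rumoured figures above and bearing in mind that real performance won't scale perfectly with clocks:

g80_clock = 650  # MHz, roughly where G80 tops out
g92_clock = 780  # MHz, reported with a good aftermarket cooler

speedup = g92_clock / g80_clock - 1
print(f"clock headroom: {speedup:.0%}")  # -> 20%, matching the estimate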
Best of all, these cards are priced a hell of a lot cheaper (flagrant price-gouging notwithstanding :rolleyes: ) than the cards they're replacing. Both the HD 3870 and the 8800GT have an MSRP of around $250.00, making SLI and XFire an actual possibility to those that feel like bothering with it.
We can expect the next generation of GPUs from both camps in less than six months.
well, the Catalyst drivers are actually what prompted the graphics card switch in the first place. with the 8800 GTS i had, booting up a game was like playing roulette: i never really knew if it was going to crash and give me a BSOD or not. my problem was that the Nvidia driver would all of a sudden quit responding, which would prompt Vista to throw a BSOD, create a dump file, and reboot.
the Catalyst drivers don't have this issue, and from what i've researched, the Catalyst drivers in Vista are more stable and perform better than Nvidia's Forceware drivers. as such, i had an opportunity to sell my old 8800 to a friend who was looking for a next-gen card for his XP computer, and thus i picked up a 2900 Pro for $279. all in all, i think i got the better end of the bargain, especially with all the overclocking headroom on the 2900 Pro (heat issues aside).
as for Crysis, i'm playing with the majority of settings at High, but with Textures, Shaders, Physics, and Object quality at Very High. resolution is @ 1280x1024x32 w/ no AA. i could go with a lower res to push the other settings to Very High, but i'd have to sacrifice quite a bit of image quality since everything becomes much more jagged.
the game runs at a reasonable 20-40 FPS in most areas, except during heavy physics moments, when the framerate tanks to under 10 FPS. i realize this has to do with physics set to Very High, but i rather enjoy watching buildings blow into a million pieces when i hit them with a rocket. :D
Oh, and I forgot to ask you how much RAM Crysis uses. I took note a while back that you mentioned that Crysis performs better with Vista64 and this seems to be the general consensus. With DDR2 as cheap as it is right now, going to 4GB might provide another performance boost.
EDIT1: And yay! Here's Anandtech's review of the recently released ATI HD 3870 and HD 3850 (http://www.anandtech.com/video/showdoc.aspx?i=3151). Let the price wars begin! :)
EDIT2: And it's what I expected. Some, but not much improvement in performance over the HD2900 but daaaamn: look at the truly gigantic improvement in power consumption! Way to go, AMD! :thumbsup:
i'm running 2GB, and i haven't noticed much in the way of my hard drives getting paged during gameplay. i did a small experiment and removed 1GB of the RAM, and the hard drives did seem to get paged much more frequently. however, there wasn't a really noticeable difference in the framerate; that could have something to do with the speed of my hard drives. load times were much, much longer, though: with 2GB, load times are about 15-30 seconds, while with 1GB they're around 30-40 seconds.
as for 64-bit vs 32-bit, Crysis doesn't give you too much of a performance boost (it's around 5-10%), but hitching and hard drive paging are almost non-existent on 64-bit.
i hope that answers a couple of questions. :)
thanks for the info stinger.... I haven't experienced the dropout issues you mentioned on my 8800GTS, but I DO get them on another rig with a shyte ole 6200TC running Vista. It did the same thing when it had a 6600 as well.
On that rig - an off-the-shelf Compaq the ladies use - it had more to do with the mainboard chipset than the gfx card itself. A chipset driver update and the relevant nvidia hotfixes did the trick :)
FW 169.04, along with the very important "essential Vista" hotfixes, has gotten rid of the few niggly glitches, including for some of the "xp only" games that were having difficulty.... I wonder what KOTOR would do??? :p I've never played lightside on KOTOR2... might give it a go.
I have to admit - MS have done a great job on the x86 emulation for us x64 fans. Using an x64 OS is usually an alienating experience - like XP64 - but that's not the case in Vista.
When I get crysis, I'll stick to medium settings... 1280x1024 looks like monkey smearings across 30" of luscious screen. It would defeat the purpose of such a screen!! 1920x1080 at medium it will have to be for me :D
mtfbwya
AMD has released a hotfix for Crysis that addresses some issues with the Radeon 2x00 series of cards, adds support for the new 3800 series, and is said to help the framerate a little as well.
Vista (32 & 64 bit): http://www.tcmagazine.com/forums/index.php?automodule=downloads&showfile=325
XP (32 & 64 bit): http://www.tcmagazine.com/forums/index.php?automodule=downloads&showfile=326
lolz... funny to see XP64 getting gaming-related support/fixes... about 4 years too late, but what the hey :D
mtfbwya
need help running Crysis?? then get the patch: http://files.filefront.com/Crysis+v11+Patch/;9400533;/fileinfo.html
it's supposed to help performance in general, in addition to a much needed boost for SLI/Xfire systems. get it, love it, use it.
my system tests:
Crysis 1.0x64, CPU Benchmark (Very High settings, 1600x1200x32): 14FPS (average)
Crysis 1.0x64, GPU Benchmark (Very High settings, 1600x1200x32): 15FPS (average)
Crysis 1.1x64, CPU Benchmark (Very High settings, 1600x1200x32): 21FPS (average)
Crysis 1.1x64, GPU Benchmark (Very High settings, 1600x1200x32): 25FPS (average)
it's a pretty good performance increase considering it's just a patch. you'll need it if you're playing online since it introduces several multiplayer tweaks as well.
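for the curious, the relative gains from my averages above work out like this (a quick Python check; the FPS figures are just mine, so your mileage may vary):

benches = {
    "CPU benchmark": (14, 21),  # (v1.0 FPS, v1.1 FPS)
    "GPU benchmark": (15, 25),
}

for name, (before, after) in benches.items():
    gain = (after - before) / before
    print(f"{name}: {gain:.0%} faster")  # CPU: 50%, GPU: 67%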
anyways, i hope that helps some of you. :)
yes, I noticed that patch. couple it with the latest nvidia/ati omega drivers and that's the closest you'll get to optimised performance, perhaps :p
I haven't played the full game yet - I only recently started CoD4 and am working through NWN2:MOTB... I've been way too busy with real life stuff to game at all :(
by the time I get to crysis, hopefully there'll be a GPU that can wallop it in UHD :p
mtfbwya