
Nvidia launches RTX 2000 series


Tbh, I'm not much into graphics cards in general, because I don't have a gaming PC or laptop. I've played games on laptops with AMD Radeon graphics, and most games hit the 60fps mark pretty well. For me, I'd guess the amount of RAM you have matters more than the graphics card.


I'm quite surprised at how well my hardware has aged, especially compared to the old days. 

 

I still have my i5 2500k from 2011 (which I haven't OC'd yet) and my GTX 970 doesn't seem to struggle with newer games, save for a few, and even then it's only on the highest settings. So far I don't see a reason to upgrade my graphics card.

17 hours ago, Avoozl said:

And I thought I was far behind with my 960. :P

Well, I've had it for four years now and it's handled everything I've thrown at it. Had to turn down some settings in TNC, but it handled Doom 4 and TNO like a champ. I'm fine with 45-60 FPS with a mix of high and ultra settings, and it's done the job.

2 hours ago, DooM_RO said:

I'm quite surprised at how well my hardware has aged, especially compared to the old days. 

 

I still have my i5 2500k from 2011 (which I haven't OC'd yet) and my GTX 970 doesn't seem to struggle with newer games, save for a few, and even then it's only on the highest settings. So far I don't see a reason to upgrade my graphics card.

 

That's thanks to Intel bringing only single-digit improvements in computing power for years. Hopefully AMD will remain competitive with its future CPU models; we can only profit from that. Two years ago, having 16 cores was unthinkable for Intel. Same for Nvidia vs AMD. Perhaps I'll upgrade from my 1070 if the 2070 offers 1080ti performance, and sell the 1070 quickly so the price isn't that steep. If not, then I guess I'll pass. I'm more interested in a raw performance increase than in graphical effects, and it really seems strange that Nvidia doesn't offer any numbers except for raytracing performance, something the previous generation wasn't built for in the first place.

13 hours ago, DooM_RO said:

I'm quite surprised at how well my hardware has aged, especially compared to the old days. 

 

I still have my i5 2500k from 2011 (which I haven't OC'd yet) and my GTX 970 doesn't seem to struggle with newer games, save for a few, and even then it's only on the highest settings. So far I don't see a reason to upgrade my graphics card.

The 2500k is at the top of the relative longevity standings for sure; it was a smart buy.

 

I still run my Q6600 (2007) in my secondary PC, with my previous card - a Radeon R9 270. Plays most modern games at decent settings tbh, especially as games look much better on modest settings these days compared to many years ago...

 

My main build that ran the Q6600 had the board die a couple of years back. I stripped it down and sold the old DDR2 for over £60, which surprised me. I decided to take the plunge and build a new machine around the 6600k and DDR4. It wasn't long before I picked up a 970 like you - I bought it used; it's the Galax HOF one running at 1500MHz. It's actually my first Nvidia card. It's great apart from some noticeable split-second lag when hi-res textures come into the FOV in some games, which I never had with an AMD/ATi card. Do you get that too?

 

As for the secondary PC, I bought an old MSI board for £30 on eBay that came with 8GB of DDR3 and a cooler for socket 775. I picked up a 120GB SSD for £25, plugged in the Q6600 overclocked to 3GHz and the R9 270, and stuck it all in a spare case with the old PSU and the HDD holding most of my Steam library. Now I have a perfectly decent mid-range gaming PC running Win 10 for next to no net spend at all. We have a vacation apartment and that's where I've just brought it today - using it right now.

 

This is partly why I prefer PC to console in the longer term, and partly why I laughed when Tom's Hardware's editor-in-chief claimed the 2080Ti is worth the money at over a thousand bucks/pounds:

 

https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html

 

I'm still not sure if he was trolling...


Worth the money is whatever one thinks it is. Hardly anything enlightening in that article.


@pritch

 

Hmm, sometimes it happens on my end too. I thought it was a hard drive issue.

 

Anyway, these new cards are way too expensive for what they offer. In two years my 970 will be 6 years old (!!!) and I still expect it to be pretty good. I also plan to overclock it. I've never had a video card stay so good for so long. If you'd told me in the mid-2000s that my hardware would still be pretty good after 6 or more years, I'd have called you crazy.

 

I think it's also because games finally got realistic enough during this decade (arguably even earlier), and graphical advancements are now more incremental. For example, even though I can barely keep Battlefield 1 above 40 FPS on my video card at the very highest settings, I don't really see much difference between it and Battlefield 4, which came out in 2013.


The GTX 1080 launched at $540 USD. The RTX 2080 is launching at $800 USD. There had better be a real leap in performance and not just a shiny API and a 25% increase. Otherwise we can blame the minimal performance gain and steep price increase on trade tariffs, crypto speculators, and the current silicon yields from dead unicorns.


I royally fucked up as I ordered (2) 2080tis on launch.

 

I completely ignored the 9/20 ship date in a panic to get them before they sold out.

 

I had sold my Titan XPs to a buddy the day before the launch, assuming they would ship the new cards upon order.

 

Now I sit here GPU-less for another three weeks. Fml. (Also CPU-less, but that's until the 2950x hits shelves on Friday. Had to hock the 1950x while it was still worth something)

On 8/25/2018 at 10:37 PM, MTF Sergeant said:

Currently waiting for Microsoft to launch HoloLens.

Sadly, while the initial impression is kinda cool as far as projected holograms go, it's got a long way to go in terms of visual fidelity, FOV, and optics. Let's hope the consumer version is a bit more evolved than the development model; otherwise I'd advise you against it.

 

Makes for some interesting selfies, tho.

[attached selfie photo]


On 8/25/2018 at 10:45 AM, Jerry.C said:

Aside from that, GZDoom is hardly an engine that can put a modern graphics card under full load.

 

I wish that were true!  Because GZDoom is based on such an old engine, it has to brute-force many of its more extensive graphical features.  Before optimization, some of my maps for Elementalism would absolutely cripple GPUs: my GTX 1060 would plunge from a typical 200 fps in Entryway to less than 20 fps.

 

After a lot of geometry simplification, and scripts that deactivate things like lights and models as you move through the map, I've got it up to an acceptable frame rate - but honestly, it's probably easier to max out a modern GPU with GZDoom than with something like Unreal Engine 4.
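(To make the kind of per-map optimisation described above concrete, here's a rough, generic C++ sketch of distance-based culling. This is not GZDoom's actual scripting API - that would be ACS/ZScript - and the names and the radius are made up purely for illustration.)

```cpp
// Generic sketch of the distance-culling idea described above. NOT GZDoom's
// actual API; ExpensiveThing, CullExpensiveThings and the 2048-unit radius
// are illustrative assumptions only.
#include <vector>

struct Vec3 { float x, y, z; };

struct ExpensiveThing {        // a dynamic light, 3D model, etc.
    Vec3 pos;
    bool active = true;        // whether the renderer should process it
};

static float DistSq(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Run every few tics: switch off costly things far from the player and
// switch nearby ones back on, so the renderer never brute-forces the lot.
void CullExpensiveThings(std::vector<ExpensiveThing>& things,
                         const Vec3& playerPos,
                         float cullRadius = 2048.0f) {
    const float r2 = cullRadius * cullRadius;
    for (ExpensiveThing& t : things)
        t.active = DistSq(t.pos, playerPos) <= r2;
}
```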


You are mistaken here. GZDoom is purely CPU limited, the GPU actually sits idle most of the time. You can easily verify this in 3.5.0 by playing around with the scaling slider in the menu. It's very unlikely that it'll make any difference at all.

 

Unless you switch on all the advanced effects or render at ultra-high resolutions, it won't be any faster on a 1060 than on a 650. That's the main reason I see little point in upgrading my graphics hardware. For the games I'm playing, the older card is good enough, and I have a hard time finding anything to justify the expense. Full-resolution shadowmaps in GZDoom aren't really worth it, nice as they may look.

 

 

1 hour ago, Buckshot said:

Sadly, while the initial impression is kinda cool as far as projected holograms go, it's got a long way to go in terms of visual fidelity, FOV, and optics. Let's hope the consumer version is a bit more evolved than the development model; otherwise I'd advise you against it.

 

Makes for some interesting selfies, tho.

Hah, I see you have a similar amount of grey in your beard to me these days.

 

You're probably doing a bit better than me if you can afford two 2080Tis at launch, though :)

 

Are you using the new high bandwidth link on these? What are you hoping to run through them vs the Titan XPs - or do you ruthlessly go for the latest and greatest and sell the older stuff ASAP?

26 minutes ago, pritch said:

Hah, I see you have a similar amount of grey in your beard to me these days.

 

You're probably doing a bit better than me if you can afford two 2080Tis at launch, though :)

 

Are you using the new high bandwidth link on these? What are you hoping to run through them vs the Titan XPs - or do you ruthlessly go for the latest and greatest and sell the older stuff ASAP?

 

 

Heh, it's more that I just spend all my money on gaming hardware when I probably shouldn't 😂 but in all honesty I just make decent pay; not the greatest, not bad, just decent. My wife would probably murder me if she knew how much I put into gaming tech every time something new comes out.

 

But yes, I ordered the (extremely overpriced) $80 NVLink adapter for them as well. As I've had experience with NVLink Tesla/Quadro hardware through my job, I can safely say it's going to beat the living shit out of SLI performance in games going forward. At a hardware level, think of more unified stacked-card performance (where the cards act more like one big single card instead of striping frame handling across two or more cards) and stacked VRAM (where you get the total amount of VRAM across both cards, as opposed to only one card's worth, since the other is mirrored in SLI). These types of features can currently only be done at the software/API level in Windows with something like DX12, and neither the games nor the Nvidia drivers themselves integrate that capability for SLI.

 

NVLink is still backwards compatible with SLI-supported games. Games that currently take advantage of SLI will continue to run over NVLink, but still only under the limitations of SLI performance.

 

Games going forward, though... NVLink is a massive overhaul in bandwidth and hardware-level GPU stacking; you will literally get double the performance (right down to VRAM) in any game that is NVLink-enabled. Unlike SLI, where you'd get no additional VRAM from the second card, plus sync latency, and maybe only 30 to 50% of the second GPU's performance (if even that).
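(A back-of-the-envelope comparison of the two models described above, using assumed figures - 11 GB per card and the ~30-50% SLI scaling mentioned - just to make the difference concrete.)

```cpp
#include <cstdio>

int main() {
    const double vramPerCard = 11.0;  // GB per card - assumed, e.g. a 2080 Ti
    const int    cards       = 2;

    // SLI/AFR: VRAM is mirrored, so usable memory stays at one card's worth,
    // and the second GPU historically adds roughly 30-50% extra throughput.
    const double sliVram = vramPerCard;
    const double sliPerf = 1.0 + 0.4;            // ~40%, mid-range of 30-50%

    // Pooled (NVLink-style) stacking as described above: memory and work are
    // shared across both GPUs, so both roughly scale with the card count.
    const double poolVram = vramPerCard * cards;
    const double poolPerf = 1.0 * cards;

    std::printf("SLI-style : %4.0f GB usable, ~%.1fx throughput\n", sliVram, sliPerf);
    std::printf("Pooled    : %4.0f GB usable, ~%.1fx throughput\n", poolVram, poolPerf);
    return 0;
}
```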

 

 

 

And yes, that's kind of my tactic: dump the old just prior to the launch of the new to recoup as much of the cost as possible. Typically that only leaves me down for a few days, a week at most, but I'm down for at least another couple of weeks thanks to my little slip-up over the actual release date.


I was eagerly waiting for this so I could buy the 2050ti... but those prices are just stupid. It took me almost a year to save enough money for a 1050ti, so yeah, fuck that. I'll buy a 1050ti next week and I'll be happy with it.

2 hours ago, Jerry.C said:

You are mistaken here. GZDoom is purely CPU limited, the GPU actually sits idle most of the time. You can easily verify this in 3.5.0 by playing around with the scaling slider in the menu. It's very unlikely that it'll make any difference at all.

 

 

And my uneducated guess is that transferring this load from the CPU onto the GPU (which has seen far larger performance improvements over the last 10 years) would be very complicated and would involve rewriting a lot of Doom's code...

22 hours ago, Buckshot said:

As I've had experience with NVLink Tesla/Quadro hardware through my job, I can safely say it's going to beat the living shit out of SLI performance in games going forward. At a hardware level, think of more unified stacked-card performance (where the cards act more like one big single card instead of striping frame handling across two or more cards) and stacked VRAM (where you get the total amount of VRAM across both cards, as opposed to only one card's worth, since the other is mirrored in SLI). These types of features can currently only be done at the software/API level in Windows with something like DX12, and neither the games nor the Nvidia drivers themselves integrate that capability for SLI.

 

NVLink is still backwards compatible with SLI-supported games. Games that currently take advantage of SLI will continue to run over NVLink, but still only under the limitations of SLI performance.

 

Games going forward, though... NVLink is a massive overhaul in bandwidth and hardware-level GPU stacking; you will literally get double the performance (right down to VRAM) in any game that is NVLink-enabled. Unlike SLI, where you'd get no additional VRAM from the second card, plus sync latency, and maybe only 30 to 50% of the second GPU's performance (if even that).

 

This is really interesting - when I made this thread I was kinda hoping someone who has actually worked with this stuff would chime in.

 

It's good to hear that the new link will bring that kind of improvement over SLI. I ran Crossfire with a second R9 270 for a little while, as I already had one and it was just to tide me over until a bigger upgrade, and even though I got almost all the money back on the second card thanks to mining, it was barely worth my time - I just ended up with high temps. But those were 28nm chips; hopefully, the way things are heading (hasn't AMD just pinned everything to 7nm?!), multi-card setups will use less power and produce significantly less heat in the average mid-tower than in the SLI era, to complement the performance boost you've described.

 

I guess my only concern would be driver support. Aside from the hardware issues of SLI/Crossfire, day-one driver support is/was notoriously bad, with some pretty lengthy waits in many games before multi-card setups saw any advantage, if they ever did. I don't know if it will be different with NVLink?

 

21 hours ago, KVELLER said:

I was eagerly waiting for this so I could buy the 2050ti... but those prices are just stupid. It took me almost a year to save enough money for a 1050ti, so yeah, fuck that. I'll buy a 1050ti next week and I'll be happy with it.

 

If you're thinking of buying a new 1050ti, would you not consider a used 1060 6GB or even a 980 / 980ti instead? You'll get more bang for your buck?

21 minutes ago, pritch said:

If you're thinking of buying a new 1050ti, would you not consider a used 1060 6GB or even a 980 / 980ti instead? You'll get more bang for your buck?

 

Hm, maybe. I guess I'm kinda nervous about giving my money to some rando who may or may not send me what I bought, but I should be fine as long as I'm careful about the site I'm buying on.

 

Also, what about the card's remaining lifespan? It'll probably be a few years before I can buy another one, so I want this one to last for as long as possible.

 

I'll keep it in mind, regardless.

22 hours ago, pritch said:

 

This is really interesting - when I made this thread I was kinda hoping someone who has actually worked with this stuff would chime in.

 

It's good to hear that the new link will bring that kind of improvement over SLI. I ran Crossfire with a second R9 270 for a little while, as I already had one and it was just to tide me over until a bigger upgrade, and even though I got almost all the money back on the second card thanks to mining, it was barely worth my time - I just ended up with high temps. But those were 28nm chips; hopefully, the way things are heading (hasn't AMD just pinned everything to 7nm?!), multi-card setups will use less power and produce significantly less heat in the average mid-tower than in the SLI era, to complement the performance boost you've described.

 

I guess my only concern would be driver support. Aside from the hardware issues of SLI/Crossfire, day-one driver support is/was notoriously bad, with some pretty lengthy waits in many games before multi-card setups saw any advantage, if they ever did. I don't know if it will be different with NVLink?

 

 



Now, that's the thing I'm not quite sure of. I have a slight understanding of how NVLink functions in the compute and/or content-creation systems many of our teams use at the office, but how exactly a game will utilize such a feature isn't as clear (nobody has used the tech for gaming yet). From what I can find, rather than being handled by the software drivers and dependent on the game engine as SLI is, NVLink routes all cross-GPU data through its own high-speed interface instead of through the PCI-E bus, so the performance is literally "stacked" between the GPUs at a hardware level. It's more "many GPUs acting as one at a hardware level" vs. "multiple GPUs detected separately, with data striped and VRAM mirrored across them all at a software level".

Does this mean the system API will see it all as one giant supercard, and the drivers/game engine no longer need separate code to support it? Possibly. The goal is to have the API say "Here's your total amount of GPU power and your total amount of VRAM, now have fun", and have all the multi-GPU stuff handled at the hardware level rather than being driver/engine reliant. Ideally, GPUs would function akin to hardware-level RAID (to borrow a storage comparison). I know that's not a great example, but it's the best I could think of.

Again, DX12 and the later OpenGL/Vulkan APIs have supported GPU stacking in this manner at a software level, but that means the drivers and the engine also have to support it, and while there are plenty of games out there that use DX12 these days, the rest of the pieces were never really in place to take advantage of it, aside from maybe a game or two.
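(For reference, a minimal sketch of how the DX12 side of that already looks - linked GPUs show up as extra "nodes" on a single device, selected via node masks. Illustration only: error handling is omitted, and it assumes the driver presents the cards as one linked adapter.)

```cpp
// Minimal D3D12 sketch of explicit multi-GPU ("linked node") support.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter; a linked SLI/NVLink pair appears as a single
    // device exposing several physical "nodes".
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    const UINT nodeCount = device->GetNodeCount();
    std::printf("Physical GPUs behind this device: %u\n", nodeCount);

    // One direct command queue per node; command lists, heaps and resources
    // use the same NodeMask idea, so the engine decides which GPU does what.
    std::vector<ComPtr<ID3D12CommandQueue>> queues(nodeCount);
    for (UINT node = 0; node < nodeCount; ++node) {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node;   // bit N selects physical GPU N
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queues[node]));
    }
    return 0;
}
```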

Of course, there will probably still have to be *some* level of software driver/engine support for NVLink, but games going forward will likely see full performance and VRAM usage from the additional GPUs, unlike SLI.


On 8/28/2018 at 7:04 PM, KVELLER said:

 

Hm, maybe. I guess I'm kinda nervous about giving my money to some rando who may or may not send me what I bought, but I should be fine as long as I'm careful about the site I'm buying on.

 

Also, what about the card's remaining lifespan? It'll probably be a few years before I can buy another one, so I want this one to last for as long as possible.

 

I'll keep it in mind, regardless.

 

I share those concerns, but I had to ditch them in order to afford a half-decent card these days. Well, actually, I could afford a brand new 2080Ti if I truly valued it enough, but I have a house to renovate etc., and truth be told I can't ever see myself being someone who spends more than 300 on a card - and I'm only up to 180 so far. I just don't like the new retail prices, so that's what pushed me into buying used.

 

The 970 HOF I bought used is about 50-70% faster than a 1050Ti and was about the same price when I bought it. I'd have had to get the 1060 6GB at double the price just to get a small boost in performance over the 970, so it was an easy decision. For me, performance beats age, hands down. There isn't enough of a technical difference between the 9- and 10-series cards in games to warrant placing a premium on the newer series now. With the 20 series that remains to be seen - but it's still going to be quite some time before that's a genuine concern, and you'd be looking at a newer card by then anyway. As a rule, the top models in a generation hold their value best over the long term - I wouldn't pay retail for an entry-level card that's about to be superseded. It's going to drop in value a lot when the 2050Ti, or whatever it ends up being called, is released. If you really wanted a new card, that's what I'd wait for now - the price difference won't be as sharp as we've seen on the high-end cards; Nvidia has to make something in the line-up affordable, even if it's, say, $50 more than the 1050 was.

 

I'm glad I bought used, though - I had to clean the dust out of it, but I knew that from the listing. Same rules as with anything used: make sure there are clear photos of the actual item for sale, ideally with its original box etc., and that the seller has provided decent info. And if you don't like it, walk away - there will be plenty of fish to choose from as people start to upgrade to the 20xx series.

 

On 8/29/2018 at 3:21 PM, Buckshot said:

Does this mean the system API will see it all as one giant supercard, and the drivers/game engine no longer need separate code to support it? Possibly. The goal is to have the API say "Here's your total amount of GPU power and your total amount of VRAM, now have fun", and have all the multi-GPU stuff handled at the hardware level rather than being driver/engine reliant. Ideally, GPUs would function akin to hardware-level RAID (to borrow a storage comparison). I know that's not a great example, but it's the best I could think of.

 

No, that's a pretty good analogy, really. That would be awesome - tbh I was amazed SLI/Crossfire ever worked at all, given what was actually going on at the API and hardware level; the potential for error just seemed so huge. Doing it like this would be a really cool prospect. If multi-GPU can make a case for a second coming, it'll be really interesting to see what that does to hardware prices. We saw even with SLI's limitations that an optimised game could run as well or better on two substantially cheaper mid-range GPUs than on an expensive single card. If this functions as you describe, that difference could be even more marked.

 

Cynical as I am, in this scenario I couldn't see Nvidia pricing its premium single-card solutions out of the market! First time around, SLI adoption levels were always low and the problems numerous, and Nvidia knew this. They kept plugging SLI because they knew it wouldn't really harm top-end sales. If it works much better all round in future, multi-GPU could become much more widely adopted. It would be interesting to see whether the lower cards then saw either a price rise or their performance cut back, relative to the gaps between models in previous series, to force the price/performance ratio back to parity.

3 hours ago, pritch said:

 

Cynical as I am, in this scenario I couldn't see Nvidia pricing its premium single-card solutions out of the market! First time around, SLI adoption levels were always low and the problems numerous, and Nvidia knew this. They kept plugging SLI because they knew it wouldn't really harm top-end sales. If it works much better all round in future, multi-GPU could become much more widely adopted. It would be interesting to see whether the lower cards then saw either a price rise or their performance cut back, relative to the gaps between models in previous series, to force the price/performance ratio back to parity.

 

Well, Nvidia has made sure not to cannibalize their top-end cards by eliminating NVLink and SLI on the cheaper GPUs. This prevents anyone from buying two cheaper cards whose combined price is still less than a single higher-end card, connecting them, and getting faster performance than if they had bought the higher-priced enthusiast card - which is entirely possible... But Nvidia has been careful over the past couple of generations to gimp said cheaper cards and remove the SLI (or now NVLink) interface entirely.

 

For this reason, the 2070 does not have an NVLink interface.


Well, I'll see whether the 2070 matches the 1080ti's performance, and how prices for used Tis compare - and then get either a 2070 or a 1080ti as an upgrade to my current 1070. I don't care much about their raytracing revolution, and I don't ever see myself shelling out money for top-end hardware.


Preview samples are provided on the condition of signing a five-year NDA. It's meant to prevent leaked benchmarks on beta drivers, but it doesn't inspire confidence. The pseudo-tinfoil interpretation is that it's there to obscure previewers receiving cherry-picked samples, and to control negative press.

 

But I think the majority of outlets are more reasonable than Tom's "Who cares? Preorder and huff your own farts" 

