
=GEN=Smartlink

Building a Gaming PC


Greetings;

 

We covered CPU, RAM and Motherboard in previous parts; now it is time to tackle the most important piece of a gaming rig, the Video Card or GPU.

 

The GPU (Graphics Processing Unit);

 

This is what makes your games look good. It is the most important part of a PC when you are thinking about gaming. There are currently two mainstream GPU chip makers, ATI and Nvidia. There is great competition in the GPU industry and these two are always trying to beat each other. They both offer great products, but they use different technologies, mainly their multi-GPU technology.

 

What specs do card makers advertise to get you to buy a video card? Memory, Clock Speed, DirectX support, Shaders, Shader Memory, Pipelines, Memory Bandwidth, Fill Rates, GFLOPS... ... .... ....

 

Woooow, stop right there, that is a lot of words. What does it all do?? Well, there are only 3 you should concern yourself with: Memory, Clock Speed and DirectX support. Your video card is like a mini motherboard with its own CPU and memory, but with the sole purpose of rendering graphical instructions. That makes it easier to understand: the Memory on your video card is like the RAM of your PC, more is always better, but it is not the only aspect to consider. The GPU and its Clock Speed is what drives everything; it is like the CPU in your PC, so logically faster is better. Finally, DirectX support: think of DirectX like an operating system, it is the "Windows" of your video card. So when getting a card you should always check that it runs the latest DirectX version, which currently would be DirectX 11. So if you are planning an upgrade or a new build, check to ensure your video card supports it!
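If you are not sure what your current card supports, Windows' dxdiag tool will tell you the card name, how much memory it has and the DirectX version. Here is a rough Python sketch that pulls those lines out of a dxdiag report; the exact label text ("Card name", "Display Memory", "DirectX Version") can vary between Windows versions, so treat it as a starting point rather than gospel:

```python
# Rough sketch: dump a dxdiag report to a temp file and print the basics.
# Assumes Windows with dxdiag on the PATH and Python installed; the label
# strings below may differ between Windows versions.
import os
import subprocess
import tempfile

def gpu_summary():
    report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
    # dxdiag /t writes a plain-text report; it can take several seconds.
    subprocess.run(["dxdiag", "/t", report], check=True)
    wanted = ("Card name", "Display Memory", "DirectX Version")
    with open(report, errors="ignore") as f:
        for line in f:
            line = line.strip()
            if line.startswith(wanted):
                print(line)

if __name__ == "__main__":
    gpu_summary()
```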

 

Don't get caught up in specs and the other features or numbers you will see marketed; it is easy to get lost since they will throw every number they have at you. And these numbers do not translate directly into gaming performance, since that also depends on the other components of your PC. Besides, there are a lot of places where you can find reliable comparison charts made from a real gamer's perspective. Tom's Hardware, Maximum PC and AnandTech are always testing new hardware and post easy-to-understand graphs that let you see the real frame rate advantage one card has over another. Before you buy a new card you should ALWAYS visit Tom's Hardware; every few months you will find a thread giving you the best options for your budget, like the September 2012 edition of "Best Graphics Cards For The Money".

 

There is a sweet spot when buying a video card, just like when getting a CPU: video cards priced over $250 do not usually turn out to be a good investment, and for the same reasons too, price drops and price/performance ratios.

 

Multi-GPU technologies;

 

As the name suggests, multi-GPU means you have more than one GPU in your gaming rig! This allows two, three or four GPUs to share the workload when rendering a frame. Ideally, two cards using identical GPUs are installed in a motherboard that contains two PCI-Express slots, set up in a master-slave configuration. Both cards are given the same part of the 3D scene to render, but effectively half of the workload is sent to the slave card through a connector called the bridge or up-link. As an example, the master card works on the top half of the scene while the slave card works on the bottom half. When the slave card is done, it sends its output to the master card, which combines the two images into one and then outputs the final render to the monitor.
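Purely to picture the idea (a toy sketch in Python, nothing like how the actual drivers are written), the master/slave split looks something like this:

```python
# Toy illustration of split-frame multi-GPU rendering: two "GPUs" each
# take half the rows of a frame, then the master stitches the halves.
# This is only the concept, not real driver code.
def render_region(gpu_name, rows, width):
    """Pretend-render a block of rows; each 'pixel' records who drew it."""
    return [[f"{gpu_name}:{r},{c}" for c in range(width)] for r in rows]

def render_frame_split(width=8, height=8):
    top_half = range(0, height // 2)          # master card's share
    bottom_half = range(height // 2, height)  # slave card's share
    master_out = render_region("GPU1", top_half, width)
    slave_out = render_region("GPU2", bottom_half, width)
    # The slave sends its half over the bridge; the master combines them
    # and outputs the finished frame to the monitor.
    return master_out + slave_out

frame = render_frame_split()
print(len(frame), "rows rendered,", len(frame[0]), "pixels per row")
```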

 

Now this is an oversimplification, since the actual workings of SLI and CrossFireX are way more complex, but it should give you a good idea of how it can improve performance.

 

If you wish to know the inner workings, here is what Hodgy was kind enough to get for us;

 

Split Frame Rendering (SFR): The driver will split the scene workload into multiple regions and assign these regions to different GPUs. For example, on a system with two SLI-enabled GPUs, a render target may be divided vertically, with GPU 1 rendering the left region and GPU 2 rendering the right region. Rendering is also dynamically load balanced, so the division will change whenever the driver determines that one GPU is working more than another. This SLI rendering mode is typically not as desirable as AFR mode, since some of the work is duplicated and communications overhead is higher.

 

Alternate Frame Rendering (AFR): The driver divides workload by alternating GPUs every frame. For example, on a system with two SLI-enabled GPUs, frame 1 would be rendered by GPU 1, frame 2 would be rendered by GPU 2, frame 3 would be rendered by GPU 1, and so on. This is typically the preferred SLI rendering mode as it divides workload evenly between GPUs and requires little inter-GPU communication.

Users can optionally forcefully enable AFR mode for an individual application using the NVIDIA driver control panel. However, this approach may not lead to any scaling due to a variety of pitfalls that are covered in the section on AFR Performance.
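And to picture AFR, here is a second toy Python sketch, again just the concept of handing whole frames to the GPUs round-robin rather than anything resembling real driver code:

```python
# Toy illustration of Alternate Frame Rendering: whole frames are handed
# to the GPUs in round-robin order, so with two GPUs the odd frames go to
# one card and the even frames to the other. Concept only.
from itertools import cycle

def assign_frames_afr(num_frames, gpus=("GPU1", "GPU2")):
    """Return a dict mapping frame number -> the GPU that renders it."""
    return {frame: gpu for frame, gpu in zip(range(1, num_frames + 1), cycle(gpus))}

for frame, gpu in assign_frames_afr(6).items():
    print(f"frame {frame} -> {gpu}")
```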

 

To get optimum results, one should always use identical video cards, that is, same model, same manufacturer, everything!

 

That leaves us just to look over the two multi-GPU technologies and who is making what! It is important to note that you cannot mix the two technologies, so you have to pick a side! As discussed in part 1, you need to check your motherboard and verify that it supports not only the multi-GPU technology you choose but also the number of cards you want to run.

 

Multi-GPU technologies offer a great way to wisely spend your cash on a video card. Buy a +-250$ video card today, then in a year or two buy a second identical card for half the price when your frame rate on newer games drops. And just like that, BANG! You have doubled your graphics power for cheaper than getting a new mid-range card altogether. The rough financials are worked out below.
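As a back-of-the-envelope sketch (the prices below are made-up round numbers for illustration, not quotes from any store), the comparison looks like this:

```python
# Back-of-the-envelope comparison of the two upgrade paths.
# All prices are made-up round numbers, purely for illustration.
card_today = 250          # buy one good card now
second_card_later = 125   # the same card in a year or two, roughly half price
new_midrange_later = 250  # or replace the card outright with a new mid-range one

sli_route = card_today + second_card_later       # two matched cards working together
replace_route = card_today + new_midrange_later  # buy now, buy again later

print("Add a second identical card:", sli_route)      # 375 total, both GPUs share the load
print("Replace with a new card:    ", replace_route)  # 500 total, the old card sits in a drawer
```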

 

ATI;

 

They started making video cards in Ontario, Canada, launching the ATI Wonder back in the late 80's, followed in the 90's by the famous 3D Rage; I have had both in my day! In 1996 they introduced the "All-in-Wonder", the first combination of an integrated graphics chip with a TV tuner card and the first chip that could display computer graphics on a TV set. In 2000 they unveiled the "Radeon" line of graphics cards, and this is where the real rivalry between ATI and NVIDIA started. They also engineered modified versions of their GPUs for the GameCube, and more recently for the Nintendo Wii, Wii U and Microsoft's Xbox 360!

 

Their current cards are still named Radeon or Radeon HD. The top of their line are the HD 6990 and the HD 7970 GHz Edition, but they are not cheap; the sweet spot cards would be the HD 7870 and the HD 6950.

 

CrossFireX is ATI's multi-GPU solution! The technology allows up to four GPUs to be used in a single computer to improve graphics performance. If you plan on using this, make sure your video cards are CrossFireX ready. And just to make sure you didn't already forget, I will remind you that you must check that your motherboard supports CrossFireX and the number of cards you wish to use!

 

Nvidia;

 

The California-based company made little noise in the early 90's, focusing on professional video cards used for CGI graphics in the movie industry. But in 1999, when they released the GeForce 256, they made a big bang. It was the first PC graphics chip with hardware transform, lighting and shading; 3D games did not even use that technology back then. It offered a huge leap forward in 3D gaming performance and was the first fully Direct3D 7-compliant 3D accelerator, leaving the competition far behind. But this is not the only time they have revolutionized GPUs. In 2000 they manufactured the first GPU for laptops, finally making it possible to play 3D games on a laptop. Then in early 2001 they introduced the GeForce 3, making them the first with a Direct3D 8.0 compliant 3D card, again leaving the competition in the dust. A variation of the GeForce 3 was used in the original Microsoft Xbox in late 2001.

 

They were the first to reintroduce a multi-GPU technology in 2004, after purchasing the company (3dfx) that originally made the technology for the Voodoo graphics cards! Later on, in 2005, Sony chose NVIDIA to make the GPU of their new PS3! A variant of the GeForce GPU was adopted by Apple for the MacBook, MacBook Pro and MacBook Air lineup! NVIDIA has a rich and solid history with many innovations and remarkable achievements. Did you know the great CGI effects in Inception, the Iron Man series and Avatar were made using NVIDIA chip based hardware? In fact, in 2011 all Oscar nominees in the "Best Visual Effects" category were created by studios using NVIDIA Quadro professional graphics solutions. I guess you have figured out by now that I kinda have a preference for Nvidia! :-)

 

Their current cards are named GeForce; the GTX 690 is the king at the moment, beating anything ATI has in their lineup. It goes for a wallet-burning sum of 1,000$ USD. Ouch, not very cost/performance friendly. The GTX 470, GTX 560 Ti and GTX 480 are the cards worth looking at.

 

SLI (Scalable Link Interface) is Nvidia's multi-GPU solution! Like their competitor's, this technology allows up to four GPUs to be used in a single computer to improve graphics performance. If you plan on using this, the same thing as above applies: the cards must be SLI ready (most are), and so must your motherboard. Again, you should always check your motherboard's SLI compatibility and the number of card slots!

 

Manufacturers;

 

You can get video cards made by Nvidia or ATI, but there are tons of companies that also make video cards based on either microarchitecture. You see, both ATI and NVIDIA sell their GPUs, that is, the chip (or core) on its own, to other companies that then build everything else themselves. So the circuit board, fan, RAM and the rest that makes up the final video card is made by those companies. They usually tweak everything to give you a little more out of your card, so it is often better than the cards made by the chip makers themselves (ATI and NVIDIA).

 

Like most PC components, it is important to choose a good manufacturer. Obviously, getting a card from the chip makers will be reliable and well constructed. As stated above, third-party manufacturers are making video cards that tend to have a little more punch, but that is not always true. Asus, MSI, EVGA, XFX and Gigabyte are the big 5, so you can trust those manufacturers to make durable and reliable cards with excellent designs that will usually give you more performance, for a bit less.

 

Please comment or ask any questions related to this part.

 

To part 4, Sound, Power and everything else!


Hi Smarty,

 

Good read, but I do see one problem: how you explain the way multi-GPUs work. There are 2 kinds of rendering methods. The method you stated is really quite old now, and AFR is the preferred method at the moment.

 

Split Frame Rendering (SFR): The driver will split the scene workload into multiple regions and assign these regions to different GPUs. For example, on a system with two SLI-enabled GPUs, a render target may be divided vertically, with GPU 1 rendering the left region and GPU 2 rendering the right region. Rendering is also dynamically load balanced, so the division will change whenever the driver determines that one GPU is working more than another. This SLI rendering mode is typically not as desirable as AFR mode, since some of the work is duplicated and communications overhead is higher.

 

 

Alternate Frame Rendering (AFR): The driver divides workload by alternating GPUs every frame. For example, on a system with two SLI-enabled GPUs, frame 1 would be rendered by GPU 1, frame 2 would be rendered by GPU 2, frame 3 would be rendered by GPU 1, and so on. This is typically the preferred SLI rendering mode as it divides workload evenly between GPUs and requires little inter-GPU communication.

Users can optionally forcefully enable AFR mode for an individual application using the NVIDIA driver control panel. However, this approach may not lead to any scaling due to a variety of pitfalls that are covered in the section on AFR Performance.

 

 

Link

 

 

Looking forward to the rest mate!


Hi Smarty,

 

Good read, but I do see one problem: how you explain the way multi-GPUs work. There are 2 kinds of rendering methods. The method you stated is really quite old now, and AFR is the preferred method at the moment.

 

I was just oversimplifying to keep the technobabble minimal, since I already covered a lot of stuff :-)

 

Hehehheehhe


By the by, you can now buy an Nvidia-based GTX 570 card for $250, which is the top of your sweet spot and one of the best price/performance ratings.


One other important thing to realize is that Video Cards these days tend to want to be plugged into your power supply directly, instead of drawing power through the motherboard.

 

Make sure that your power supply can handle the power draw of your one (or two) video cards, as well as meeting the other power drain needs of the other crap on your motherboard.
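A rough way to sanity-check that is to add up the big power draws and leave yourself some headroom. The wattages in this little Python sketch are placeholders only, so look up the real figures for your own parts:

```python
# Rough power-supply sanity check. The wattages below are placeholders:
# check the actual specs of your own CPU, GPU(s), drives and fans.
draws_watts = {
    "cpu": 125,
    "gpu_1": 220,
    "gpu_2": 220,              # drop this line if you run a single card
    "board_ram_drives": 100,   # motherboard, RAM, drives, fans, the rest
}

total = sum(draws_watts.values())
recommended = total * 1.3      # roughly 30% headroom for spikes and aging
print(f"Estimated draw: {total} W; look for a PSU around {recommended:.0f} W or more")
```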

 

Don't do what Jinx is currently going through and what I went through a couple of years ago when a video card upgrade turned into video card, power supply and a new case to fit the larger power supply!

