WWS Technology Wiki

Video Card

[Image: Computer video card fan]

A monitor contains over a million pixels, and the computer has to color each one in a specific way to create a cohesive image. The CPU has the information needed to do this, but only as binary data. A translator is needed, and that is where the graphics card comes in. The CPU, working with the software application, sends the necessary information to the graphics card. The graphics card then sends the per-pixel instructions that create the image to the monitor through a cable. 
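The scale of that per-pixel work is easy to check with simple arithmetic. A minimal sketch (the specific resolutions are illustrative assumptions, not taken from the article):

```python
# Count the pixels a graphics card must color for a single frame.
# These common display resolutions are illustrative examples.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels")
```

Even at 1080p the card is responsible for over two million pixels, consistent with the "over a million pixels" figure above.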

Creating an image from binary data is no simple task. To make a 3D image, the graphics card must first create a wire frame out of straight lines, then fill in the remaining pixels, and finally add lighting and texture. Not only must this process be done down to the very pixel, it must also be done quickly: in fast-paced video games, the image must refresh up to sixty times per second. 
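That refresh rate puts a hard deadline on the whole pipeline. A quick sketch of the time budget (the 1080p resolution is an illustrative assumption):

```python
# At sixty refreshes per second, the entire pipeline (wireframe, fill,
# lighting, texture) must finish within a fixed per-frame time budget.
refresh_rate_hz = 60
frame_budget_ms = 1000 / refresh_rate_hz

# Spread across the pixels of a 1080p frame (illustrative resolution),
# that leaves only nanoseconds of budget per pixel.
pixels = 1920 * 1080
ns_per_pixel = frame_budget_ms * 1_000_000 / pixels

print(f"Frame budget: {frame_budget_ms:.2f} ms")
print(f"Per-pixel budget: {ns_per_pixel:.1f} ns")
```

Roughly 16.7 ms per frame, or about 8 ns per pixel, which is why the card processes many pixels in parallel rather than one at a time.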

Several pieces are essential to the graphics card's functionality. The motherboard provides the connections for data and power, as well as the link to the CPU. The processor decides what to do with each pixel. Memory on the graphics card stores information about each pixel and, temporarily, completed pictures. The graphics card itself is a printed circuit board housing a processor, RAM, and an input/output system. That system takes the form of a BIOS chip, which stores the card's settings and performs diagnostics on the memory and the input/output hardware when the computer starts. The RAM holds the data about the pixels and in most cases is dual-ported, meaning it can be read and written at the same time. The RAM connects to a digital-to-analogue converter, which translates the image into an analogue signal and sends the final picture to the monitor through a cable. 
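The memory cost of storing "data about the pixels" can be estimated directly. A minimal sketch, assuming 24-bit color (3 bytes per pixel) and two buffers, so one completed picture can be displayed while the next is drawn; both figures are illustrative assumptions, not from the article:

```python
# Estimate how much graphics-card RAM a framebuffer needs.
# Assumptions (illustrative, not from the article): 3 bytes per pixel
# (24-bit color) and two buffers, one shown while the other is drawn.
def framebuffer_bytes(width, height, bytes_per_pixel=3, buffers=2):
    """Total bytes of RAM needed to hold the per-pixel data."""
    return width * height * bytes_per_pixel * buffers

mb = framebuffer_bytes(1920, 1080) / (1024 ** 2)
print(f"Double-buffered 1080p framebuffer: {mb:.1f} MB")
```

Under these assumptions a 1080p setup needs roughly 12 MB just for pixel data, a small slice of a modern card's memory but a meaningful constraint on early cards.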


The graphics card's processor is in many ways similar to the CPU, except that it is specifically designed, as computer.howstuffworks.com states, to perform "mathematical and geometric problems for graphics rendering." Often the fastest GPUs have more transistors than CPUs, which is necessary for them to work rapidly. Originally, the GPU would simply take binary data from the central processor and render images. Today, however, it also performs complex calculations for fields such as data research, machine learning, and AI. Beginning with a single fixed core used only for graphics, the GPU has also evolved to include programmable cores. What is considered the first 3D graphics system was created in 1951 by the Navy, but modern GPUs did not come on the scene until 1995 with 3D add-in cards. From there, in 1999, Nvidia came out with the "world's first GPU," which proved to have "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second." (Pictured below.) 
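The "integrated transform" engine mentioned in that quote refers to multiplying each vertex of a model by a transformation matrix. A minimal sketch in plain Python of one such transform, a rotation about the z-axis (a real GPU applies this across millions of vertices in parallel; the function name and triangle here are hypothetical):

```python
import math

# A sketch of the GPU "transform" stage: apply a geometric
# transformation to each vertex of a wireframe model.
def rotate_z(vertex, angle_rad):
    """Rotate a 3D vertex around the z-axis by angle_rad radians."""
    x, y, z = vertex
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x * c - y * s, x * s + y * c, z)

# One triangle of a wireframe, rotated 90 degrees.
triangle = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
rotated = [rotate_z(v, math.pi / 2) for v in triangle]
```

The same per-vertex independence is what lets the hardware run the transform, lighting, and setup stages as a pipeline at high throughput.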


Today, the graphics card has expanded and continues to look more and more like a CPU in terms of its cores. It has been used in machine learning, oil exploration, scientific image processing, statistics, linear algebra, 3D reconstruction, medical research, and even stock options pricing.
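These workloads share a data-parallel shape: the same operation applied independently to many elements, which is exactly what GPU hardware is built for. A minimal sketch of one classic linear-algebra kernel, written serially in plain Python (a GPU would process the elements simultaneously; the function is illustrative, not any library's API):

```python
# SAXPY (a*x + y), a standard linear-algebra kernel: every output
# element depends only on its own inputs, so a GPU can compute all
# of them in parallel. Shown serially here for illustration.
def saxpy(a, xs, ys):
    """Scaled vector addition: return [a*x + y for each pair]."""
    return [a * x + y for x, y in zip(xs, ys)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(result)  # [12.0, 24.0, 36.0]
```

The absence of dependencies between elements is the common thread linking the graphics pipeline to these general-purpose uses.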