- published: 20 Jan 2016
A video card (also called a display card, graphics card, graphics board, display adapter or graphics adapter) is an expansion card which generates a feed of output images to a display. Most video cards offer various functions such as accelerated rendering of 3D scenes and 2D graphics, MPEG-2/MPEG-4 decoding, TV output, or the ability to connect multiple monitors (multi-monitor).
Video hardware can be integrated into the motherboard or (as with more recent designs) into the CPU, but all modern motherboards (and some from the 1990s) provide expansion ports to which a video card can be attached. In this configuration it is sometimes referred to as a video controller or graphics controller. Modern low-end to mid-range motherboards often include a graphics chipset manufactured by the developer of the northbridge (e.g. an nForce chipset with Nvidia graphics, or an Intel chipset with Intel graphics). This graphics chip usually has a small quantity of embedded memory and borrows some of the system's main RAM, reducing the total RAM available. Such a setup is usually called integrated graphics or on-board graphics; it is typically low in performance and undesirable for those wishing to run 3D applications.
A dedicated graphics card, on the other hand, has its own random-access memory (RAM) and processor specifically for processing video images, and thus offloads this work from the CPU and system RAM. Almost all motherboards with integrated graphics allow the integrated chip to be disabled in the BIOS, and provide an AGP, PCI, or PCI Express slot for adding a higher-performance graphics card in place of the integrated graphics.
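On Linux, one common way to see whether a system is using integrated or dedicated graphics is to inspect the output of `lspci`, which lists each device with its bus address and class. The sketch below is illustrative only: the class keywords and sample lines are assumptions about typical `lspci` formatting, and the sample device names are hypothetical examples, not output from a real machine.

```python
# Illustrative sketch: pick out graphics controllers from lspci-style
# output lines. The keyword list and sample strings are assumptions
# about common `lspci` formatting, not a definitive detection tool.

def find_graphics_controllers(lspci_lines):
    """Return the lines that describe graphics devices."""
    keywords = ("VGA compatible controller",
                "3D controller",
                "Display controller")
    return [line for line in lspci_lines
            if any(k in line for k in keywords)]

# Hypothetical sample output: one chipset/CPU-integrated GPU on bus 00
# and one dedicated card in a PCI Express slot (bus 01).
sample = [
    "00:02.0 VGA compatible controller: Intel Corporation HD Graphics 530",
    "00:1f.3 Audio device: Intel Corporation Sunrise Point-H HD Audio",
    "01:00.0 VGA compatible controller: NVIDIA Corporation GM204 [GeForce GTX 970]",
]

gpus = find_graphics_controllers(sample)
for line in gpus:
    print(line)
```

In real output, a device on bus `00` is often the integrated graphics described above, while a dedicated card in an expansion slot usually appears on a separate bus number, though this is a heuristic rather than a guarantee.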