Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. About GPUs
2.1. Integrated GPUs
2.2. Discrete GPUs
3. GPU architecture
4. Uses of GPUs
4.1. GPU for gaming
4.2. GPU for video editing
4.3. GPU for Machine Learning
5. GPU vs CPU
6. Frequently Asked Questions
6.1. Which graphics card is best for gaming?
6.2. Are GPUs good for multitasking?
6.3. How can we test the performance of a GPU?
7. Conclusion
Last Updated: Mar 27, 2024

Graphics Processing Unit

Author Md Yawar

Introduction

Modern computers can render rich visual content on our screens. You may wonder how they do this so quickly. Computer graphics are responsible for presenting visual content on the computer screen. While the CPU performs general-purpose calculations, the GPU handles the graphics workload. GPUs are also used to accelerate machine learning processes. GPU technology has improved significantly over the years and has unlocked new possibilities in many fields. Let us learn more about GPUs.


About GPUs

A GPU is a specialized processor designed for parallel processing. The term “GPU” was coined in the 1990s by the chip manufacturer Nvidia. GPUs were primarily used for generating real-time 3D graphics. Now, GPUs are used to solve a broad set of problems, for example, machine learning, video editing, and gaming.

GPUs are of two basic types: integrated GPUs and discrete GPUs.

Integrated GPUs

The majority of GPUs in the market are integrated GPUs. An integrated GPU is embedded on the same chip as the CPU. Integrated GPUs consume less power to operate but deliver lower performance.

Discrete GPUs

A discrete GPU is a separate chip mounted on its own circuit board and connected to the system through a PCI Express slot. Discrete GPUs offer higher performance but consume more power. Nvidia and AMD together control nearly the entire discrete GPU market, with shares of roughly 66% and 33%, respectively.

Also read - AMD vs Intel


GPU architecture

A GPU consists of numerous computational units, or cores. Individual GPU cores are weaker than CPU cores, but there are far more of them. GPU programming follows the SIMD (Single Instruction, Multiple Data) model: all cores perform the same operation on different data. GPU cores can perform simple operations like multiply-add (MAD) or fused multiply-add (FMA), and modern cores can also handle complex operations such as tensor and ray-tracing operations. Thus, the strength of a GPU lies not in the processing power of an individual core but in the parallel processing capability of many cores working together. A GPU is equipped with its own memory, called global memory (GMEM), which gives the cores fast access to data. The control units of the GPU issue instructions to its different parts.
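The SIMD idea above can be sketched in a few lines of plain Python. This is a minimal model, not real GPU code: the `simd_fma` function below is a hypothetical name, and each list element stands in for one data lane that a GPU core would process.

```python
# A tiny model of the SIMD execution model: one instruction, here a
# fused multiply-add (FMA), applied to many data elements at once.

def simd_fma(a, b, c):
    """Compute d[i] = a[i] * b[i] + c[i] for every lane, like GPU cores
    executing the same FMA instruction on different data."""
    return [ai * bi + ci for ai, bi, ci in zip(a, b, c)]

a = [1.0, 2.0, 3.0, 4.0]   # one value per "core"
b = [10.0, 10.0, 10.0, 10.0]
c = [0.5, 0.5, 0.5, 0.5]

print(simd_fma(a, b, c))   # [10.5, 20.5, 30.5, 40.5]
```

On real hardware the four multiply-adds would run at the same time on four cores; here the loop runs sequentially, but the key property is the same: a single instruction, many data elements.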

GPU architecture

The green parts represent the computational units, orange represents the memories, and yellow represents the control units.

Click on the following link to read further: Multitasking Operating System

Uses of GPUs

GPU for gaming

Modern video games are very computationally intensive. There is a vast in-game world with real-life graphics. GPUs can render graphics quickly and allow us to play games with better graphics in high resolution.


GPU for video editing

The parallel processing provided by a GPU makes it possible to render graphics and videos in high-definition formats. A GPU can process pixels in parallel, which accelerates video editing and rendering.

Video editing

GPU for Machine Learning

GPUs have high computational capability and follow the SIMD (Single Instruction, Multiple Data) processing model. Training machine learning models takes advantage of the parallel nature of GPUs, and SIMD accelerates the processing of machine learning and deep learning workloads. An example is scaling the pixels of an image: we can map each pixel to a different GPU core, and each core scales its pixel independently. This processes the whole image in one shot rather than handling each pixel sequentially.
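The pixel-scaling example can be sketched with Python's standard thread pool, where worker threads stand in for GPU cores. This is only an illustration of the mapping idea, assuming a tiny grayscale "image" as a list of 8-bit values; `scale_pixel` is a hypothetical helper, not part of any GPU API.

```python
# Each pixel is handed to its own worker, mirroring how a GPU maps
# pixels onto cores. A thread pool stands in for the GPU here.
from concurrent.futures import ThreadPoolExecutor

def scale_pixel(value, factor=2):
    # Brighten one pixel, clamped to the usual 8-bit range 0..255.
    return min(255, value * factor)

pixels = [10, 60, 120, 200]  # a tiny grayscale "image"

with ThreadPoolExecutor(max_workers=4) as pool:
    scaled = list(pool.map(scale_pixel, pixels))

print(scaled)  # [20, 120, 240, 255]
```

On a GPU, thousands of such pixel operations run truly in parallel, which is why image and model workloads see such large speedups.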
 

Training time GPU vs CPU

Difference between GPU and graphics card:

A common misconception is that a GPU and a graphics card are the same thing. The graphics card is an add-in board that contains the GPU, just as the motherboard holds the CPU. The graphics card provides the power delivery and connections the GPU needs to function and communicate with the rest of the system.

GPU vs CPU

GPU vs CPU

Frequently Asked Questions

Which graphics card is best for gaming?

The GeForce RTX 3090, 3090 Ti, and 3070, along with the Radeon RX 6800, are considered among the best options for heavy gaming.

Are GPUs good for multitasking?

GPUs are not designed for multitasking in the way CPUs are: they excel at running one parallel workload at a time rather than juggling many unrelated tasks. However, through GPGPU frameworks they are widely used for general-purpose computing, such as machine learning.

How can we test the performance of a GPU?

We can use many different tools, such as 3DMark, FurMark, and Fraps, to test the performance of our GPU.

Conclusion

This blog gave us an overview of GPUs. We learned about their uses and how they differ from CPUs.

Refer to our Guided Path on Coding Ninjas Studio to upskill yourself in Data Structures and Algorithms, Competitive Programming, JavaScript, System Design, Machine Learning, and many more! If you want to test your competency in coding, you may check out the mock test series and participate in the contests hosted on Coding Ninjas Studio! But if you have just started your learning process and are looking for questions asked by tech giants like Amazon, Microsoft, Uber, etc., you must look at the problems, interview experiences, and interview bundle for placement preparations.

Nevertheless, you may consider our paid courses to give your career an edge over others!
