Core
What is it?
It used to be that a single CPU (central processing unit) contained a single core: the processing unit within the CPU that receives and executes instructions. Times have changed, however, and a single CPU can now house a multitude of cores, each capable of carrying out its own stream of instructions. Each core may also support multiple hardware threads, letting it handle more than one thread (the smallest sequence of instructions that can be scheduled and executed independently) at a time. The upshot of all this is that modern CPUs can carry out many tasks simultaneously thanks to their high core and thread counts. You can envision it as a multi-lane highway: rather than making the cars on a single lane go faster, adding cores opens up multiple lanes on the same road so that many instructions can be executed at the same time.
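To make the idea concrete, here is a minimal sketch in Python (chosen purely for illustration) of how a program can ask the operating system how many logical processors the machine exposes and then spread independent, CPU-bound tasks across them. The task function and the workload sizes are hypothetical stand-ins, not anything specific to a particular CPU.

```python
# Minimal sketch: discover the logical processor count and run
# independent CPU-bound tasks in parallel across the available cores.
import os
from concurrent.futures import ProcessPoolExecutor

def busy_task(n: int) -> int:
    """An illustrative CPU-bound task: sum the first n integers."""
    return sum(range(n))

if __name__ == "__main__":
    # Logical processors = cores x hardware threads per core.
    print(f"Logical processors available: {os.cpu_count()}")

    # Each task is handed to its own worker process, so separate cores
    # can execute them simultaneously -- the "multi-lane highway" idea.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(busy_task, [10_000_000] * 8))
    print(f"Completed {len(results)} tasks in parallel")
```

On a processor with many cores, the eight tasks can finish in roughly the time a single task would take on its own, because each one travels down its own lane.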
Why do you need it?
Almost all modern processors are based on a multi-core architecture. The inclusion of more cores not only means that more tasks can be completed simultaneously, which eliminates bottlenecks and accelerates computation, but also opens the door to innovative computing techniques such as parallel computing and distributed computing. High-performance computing (HPC), a major trend in advanced computing, relies heavily on linking multiple servers running multi-core processors into a computing cluster that performs calculations on the scale of a supercomputer.
How is GIGABYTE helpful?
GIGABYTE servers are powered by the most advanced processors. For example, if you installed dual 64-core AMD EPYC™ CPUs in each node of GIGABYTE's H262 Series of High Density Servers, and then populated a full 42U (42 rack units) server rack with these servers (leaving some room for networking equipment), you would have as many as 10,240 cores in one rack. Another kind of processor that is rapidly gaining ground is based on the ARM architecture; such CPUs can house as many as 128 cores in a single socket. The excellent energy efficiency of ARM processors goes a long way towards reducing power consumption and mitigating heat-related issues.
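As a quick back-of-the-envelope check of that per-rack figure, the short calculation below assumes, for illustration, 2U H262 chassis with four nodes each and 2U of the rack set aside for networking; the layout numbers are assumptions, not a prescribed configuration.

```python
# Back-of-the-envelope check of the quoted per-rack core count,
# assuming 2U chassis with 4 nodes each and 2U reserved for networking.
rack_units_total = 42
rack_units_networking = 2
units_per_chassis = 2       # assumed 2U chassis
nodes_per_chassis = 4       # assumed 4 nodes per chassis
cpus_per_node = 2           # dual-socket nodes
cores_per_cpu = 64          # 64-core AMD EPYC CPUs

chassis = (rack_units_total - rack_units_networking) // units_per_chassis
cores = chassis * nodes_per_chassis * cpus_per_node * cores_per_cpu
print(cores)  # 20 * 4 * 2 * 64 = 10,240 cores in one rack
```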