Jan 03 2023
Hardware

What Is a Field Programmable Gate Array and How Does It Work?

After 40 years, the FPGA hits its stride in the cloud and on the edge. Here’s why.

When it comes to the major types of chips used in computers, much of the focus is on central processing units (CPUs) or graphics processing units (GPUs). But those aren’t the only chips that shape a computing experience: The field programmable gate array (FPGA) is an extremely flexible kind of integrated circuit that can be reprogrammed after it leaves the factory.

This technology has been around for 40 years, but it is just now hitting its stride in IT settings, particularly in the cloud and in data centers.

What Is an FPGA?

An FPGA is a type of integrated circuit whose internal electronic circuitry can be reconfigured as needed. The end user programs it to work in a specific way, a software-like approach to building specialized circuits.

The FPGA has a long, rich history in computing. Its basic concepts stem from predecessor technologies such as programmable read-only memory (PROM), which dates back to the 1960s, and the mask-programmable gate array, which Motorola and Texas Instruments first experimented with in the early 1970s. Like FPGAs, these technologies could be customized, but they could not be changed dynamically: Mask-programmable gate arrays are fixed during manufacturing, and a basic PROM, for example, can be programmed only once, with its contents permanently set in hardware.

The commercial FPGA market began in the early 1980s, led by the emergence of Altera, which developed the first reprogrammable logic device in 1984, and Xilinx, which brought the first commercially viable FPGA to market.

Altera and Xilinx had long independent histories, and for many years, they were the two dominant manufacturers of FPGA devices. The competition between the two companies helped turn FPGAs into a successful technology in the embedded systems market.

Each company has recently been acquired by one of the major CPU giants: Intel purchased Altera for $16.7 billion in 2015, and AMD completed its roughly $50 billion acquisition of Xilinx at the start of 2022, the largest acquisition yet in the chip industry. These acquisitions will shape the role FPGAs play in future computing, including in your IT stack.


How Does an FPGA Work?

Think of an FPGA as a more dynamic kind of integrated circuit, one that can be reconfigured for specialized tasks. Rather than handling the general-purpose workloads typically given to CPUs or GPUs, an FPGA can be adapted to the specific task at hand. This can deliver more specialized results, but it requires customization.

Central to the FPGA concept is the gate array, a style of application-specific integrated circuit (ASIC) whose circuitry is physically customized during manufacturing for a specific use case, creating a “semi-custom” chip. The FPGA replaces that manufacturing step with a hardware description language (HDL), a kind of programming language that defines the structure and behavior of the circuit. Fans of old-school computers and video games have used FPGAs to create faithful recreations of hardware that is no longer produced, as an alternative to software emulation.

One popular DIY project, the MiSTer, uses a DE10-Nano board, based on Intel’s Cyclone V system on a chip, to recreate the chips inside vintage computers and video game consoles. The selling point is that a modern FPGA can dynamically implement the full hardware of vintage systems such as the Commodore Amiga or the Nintendo Entertainment System.
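
To make that software-like configuration a little more concrete, here is a minimal, purely illustrative Python sketch (the class and names are hypothetical, not part of any vendor’s toolchain) of a lookup table, the small configurable cell that FPGA logic fabrics are built from. Loading different configuration bits turns the same physical cell into a different logic circuit, which is essentially what an HDL toolchain does across the whole chip when it generates a configuration bitstream:

```python
# Purely illustrative sketch, not vendor code: a 4-input lookup table (LUT),
# the basic configurable cell found in most FPGA logic fabrics.
# "Programming" an FPGA amounts to loading configuration bits like these
# into many thousands of LUTs, plus the programmable routing between them.

class LUT4:
    """A 4-input LUT: 16 configuration bits can realize any 4-input Boolean function."""

    def __init__(self, config_bits):
        assert len(config_bits) == 16, "a 4-input LUT holds 2**4 configuration bits"
        self.config_bits = config_bits  # the "program" for this one logic cell

    def evaluate(self, a, b, c, d):
        # The four input signals form an index into the configuration bits.
        index = (a << 3) | (b << 2) | (c << 1) | d
        return self.config_bits[index]


# The same physical cell, configured two different ways:
# 1) a 4-input AND gate: only the all-ones input pattern (index 15) outputs 1
and_gate = LUT4([0] * 15 + [1])

# 2) XOR of the first two inputs (the other two are ignored)
xor_ab = LUT4([((i >> 3) & 1) ^ ((i >> 2) & 1) for i in range(16)])

print(and_gate.evaluate(1, 1, 1, 1))  # -> 1
print(and_gate.evaluate(1, 0, 1, 1))  # -> 0
print(xor_ab.evaluate(1, 0, 0, 0))    # -> 1
print(xor_ab.evaluate(1, 1, 0, 0))    # -> 0
```

A production FPGA contains anywhere from thousands to millions of such cells. The hardware description language and its toolchain decide how to fill and connect them, which is why reconfiguring the chip feels more like recompiling software than redesigning silicon.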


FPGA vs. ASIC vs. Microcontroller

The FPGA belongs to a broader class of integrated circuits commonly used in embedded systems, alongside the ASIC and the microcontroller. All three have seen significant improvements in recent years, driven by the integrated system-on-a-chip approach. The microcontroller inspired much of the innovation seen in ARM-based CPUs, such as those used in smartphones and in Apple silicon. While microcontrollers play in similar spaces to ASICs and FPGAs, they share more DNA with traditional CPUs.

FPGAs and ASICs followed similar evolutionary paths: Both were designed for specific, narrowly defined use cases. At one time, the ASIC, a small-batch type of integrated circuit, was more popular than the FPGA. Though ASICs continue to improve in embedded and server-room use cases, FPGAs have now become the preferred technology for many applications, thanks to their flexible reprogrammability. In a paper on FPGAs, Stephen M. Trimberger, a 2011 IEEE Fellow, notes that in their early years FPGAs struggled to match ASICs’ capabilities, but their flexibility eventually put them ahead.

“Cost, capacity and speed were precisely those attributes at which FPGAs were at a disadvantage to ASIC in the 1980s and 1990s. Yet they thrived,” he writes. “A narrow focus on those attributes would be misguided, just as the ASIC companies’ narrow focus on them in the 1990s led them to underestimate FPGAs. Programmability gave FPGAs an advantage despite their drawbacks. That advantage translated into lower risk and easier design.”


FPGA Applications: Uses in the Workplace

One reason FPGAs are often used in engineering contexts is that they are semi-custom integrated circuits: Engineers can use an FPGA to prototype and refine new chip designs, making it an effective alternative to an ASIC because it is more agile and adaptable. FPGAs are also used in networking equipment and telecommunications.

This has been true since the technology matured in the 1980s and 1990s. Today, FPGAs are still widely used in high-end networking equipment from companies like BittWare and Arista, where they help accelerate packet processing and traffic handling within an organization. FPGAs are also a popular choice for Internet of Things and other connected devices.

Another notable area where FPGAs have gained influence is expansion cards that speed up video rendering. Apple’s Mac Pro can take advantage of a proprietary FPGA-based expansion card to accelerate the rendering of ProRes video files, freeing up the CPU for other processing tasks.

Finally, FPGAs can be used in artificial intelligence work as an alternative to GPUs or ASICs. “FPGAs offer hardware customization with integrated AI and can be programmed to deliver behavior similar to a GPU or an ASIC,” Intel notes on its website. “The reprogrammable, reconfigurable nature of an FPGA lends itself well to a rapidly evolving AI landscape, allowing designers to test algorithms quickly and get to market fast.”


FPGA Applications: Uses in the Cloud and at the Edge

FPGAs are now making inroads in data center settings, including the cloud. Writing for the engineering site All About Circuits, Jake Hertz notes that the flexibility of FPGAs is an advantage in a data center environment. “Instead of needing a variety of different hardened ASICs, a single FPGA can be configured and reconfigured for various applications, opening the door to further optimization of hardware resources,” Hertz writes.

FPGAs can also be used as on-demand tools through cloud platforms. Microsoft has been a heavy investor in FPGAs, using them as a key element of its Bing search engine and making them available to customers through its Azure cloud platform.

Microsoft notes on its website that FPGAs enhance machine learning applications, balancing the efficiency of a more specialized ASIC with the flexibility of a CPU or GPU: “Implementations of neural processing units don't require batching; therefore, the latency can be many times lower, compared to CPU and GPU processors.”
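
To see why avoiding batching matters for latency, consider a rough back-of-the-envelope model. (This is a hypothetical sketch with made-up numbers, not a benchmark of any Microsoft or Intel product.) A batched accelerator must wait for enough requests to accumulate before it starts computing, while a streaming FPGA pipeline can begin work on each request the moment it arrives:

```python
# Hypothetical latency model; all numbers are illustrative assumptions, not measurements.

def batched_latency_ms(batch_size, arrival_interval_ms, batch_compute_ms):
    """Average latency when requests queue up until a full batch can be processed."""
    # On average, a request waits about half the batch-fill time,
    # then the whole batch is computed in one shot.
    average_fill_wait = (batch_size - 1) * arrival_interval_ms / 2
    return average_fill_wait + batch_compute_ms

def streaming_latency_ms(per_request_compute_ms):
    """Latency when each request is processed as soon as it arrives (no batching)."""
    return per_request_compute_ms

# Example: requests arrive every 2 ms; a batched accelerator processes 32 at a
# time in 10 ms; a streaming pipeline handles a single request in 3 ms.
print(batched_latency_ms(batch_size=32, arrival_interval_ms=2.0, batch_compute_ms=10.0))  # ~41 ms
print(streaming_latency_ms(3.0))  # 3 ms
```

Even with generous assumptions for the batched device, the time spent waiting for a batch to fill dominates per-request latency, which is roughly the effect the quote above describes.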

And there’s still room to grow. One area where FPGAs are likely to have a future impact is computational storage, a type of solid-state drive with a built-in processor, often an FPGA or a small CPU, that handles some of the workload directly on the drive rather than on the host system. While this technology is still emerging, companies like Dell have invested heavily in it because it promises to improve processing times in cloud and edge computing contexts.

FPGAs are already a key component of the server room and the cloud. If you’re curious about what the future holds for your infrastructure (FPGA or not), check out CDW Amplified™ Infrastructure, which can help you answer those questions.
