FPGA stands for Field-Programmable Gate Array. That’s a mouthful, so let’s begin with a basic definition. Essentially, an FPGA is a hardware circuit that a user can program to carry out one or more logical operations. Drilling down a bit, FPGAs are integrated circuits, or ICs, which are sets of circuits on a chip (that’s the “array” part). Those circuits, or arrays, are groups of programmable logic gates, memory, or other elements.
With a standard chip, such as the Intel Curie module on an Arduino board or the processor inside your laptop, the chip is fully baked. It cannot be reprogrammed; you get what you get.
With these chips, a user can write software that loads onto the chip and performs functions. That software can later be replaced or deleted, but the hardware chip itself does not change.
With an FPGA, nothing is baked in. The user programs the hardware circuit(s). The programming can be as simple as a single logic gate (an AND or OR function), or it can involve one or more complex functions, including functions that together act as a comprehensive multi-core processor.
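To make "programming a circuit" concrete: FPGA logic elements are typically built around small lookup tables (LUTs), so configuring a gate amounts to loading a truth table into the element. Here is a minimal Python sketch of that idea (an illustration only, not any vendor's toolchain or API):

```python
# Sketch of a 2-input lookup table (LUT), the basic building block of
# FPGA logic. "Programming" the FPGA amounts to loading truth tables
# like these into its logic elements.

def make_lut2(truth_table):
    """Return a 2-input logic function defined by a 4-entry truth table.

    truth_table[i] is the output for inputs (a, b), where i = (a << 1) | b.
    """
    def lut(a, b):
        return truth_table[(a << 1) | b]
    return lut

# The same "hardware" element becomes an AND gate or an OR gate
# simply by changing the stored bits.
and_gate = make_lut2([0, 0, 0, 1])
or_gate  = make_lut2([0, 1, 1, 1])

print(and_gate(1, 1))  # 1
print(or_gate(0, 0))   # 0
```

Reloading the truth table "reprograms" the element, which is the essential difference from a fixed chip: the same silicon can implement AND today and OR (or anything else) tomorrow.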
Why Use an FPGA?
You use an FPGA when you need to optimize a chip for a particular workload, or when you are likely to need chip-level changes later. FPGAs are used in a wide range of fields: video and imaging equipment; circuitry for computer, automotive, aerospace, and military applications; and electronics for specialized processing, among others. FPGAs are particularly useful for prototyping application-specific integrated circuits (ASICs) or processors. An FPGA can be reprogrammed until the ASIC or processor design is final and bug-free, at which point manufacture of the actual ASIC begins. Intel itself uses FPGAs to prototype new chips.
In fact, Intel recently acquired a company called eASIC in order to speed up its design and prototyping process. eASIC produces something called “structured ASIC,” which is based on a model that sits between an ASIC and an FPGA. As this AnandTech article describes, with a structured ASIC:
“Engineers can produce a design using an FPGA, and then, rather than spending quality time optimizing the circuit design, they turn the fixed design into a single design mask for manufacturing. Being a fixed design, like an ASIC, it is faster than a variable design, but without the energy-saving-per-area advantages of a full ASIC. Nevertheless, it was developed in FPGA time rather than ASIC time (saving up to six months), and it saves power with its fixed design.”
FPGAs for the Rest of Us
So what’s a real-life example of how FPGAs are used? In the eBook FPGAs for Dummies, co-authors Andrew Moore and Ron Wilson give a simple example of an FPGA in a car’s rearview camera. In the example, the camera takes 250 milliseconds to capture and display an image to the driver. If regulations changed to allow a window of only 100 milliseconds, meeting them could require expensive modifications, or prove nearly impossible, if the camera relied on a fixed chip-based solution. With an FPGA, cars in production, unsold cars, and even previously sold cars could be updated with a simple reprogramming.
FPGAs are also useful to businesses because they can be dynamically reprogrammed with a datapath that exactly matches a specific workload, such as data analytics, image inference, encryption, or compression. Optimized FPGAs can also be more energy-efficient than running equivalent workloads on a processor. This combination of versatility, efficiency, and performance makes for an attractive package for modern businesses looking to process more data at a lower total cost of ownership (TCO).
The New Frontier of FPGAs: Artificial Intelligence
Today, FPGAs are gaining importance in another area: the deep neural networks (DNNs) used for artificial intelligence (AI). Running DNN inference models takes significant processing power. Graphics processing units (GPUs) are often used to accelerate inference, but in some cases high-performance FPGAs can actually outperform GPUs at analyzing large amounts of data for machine learning.
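To make "inference" concrete, here is a minimal sketch (a hypothetical illustration in plain Python, not any framework's API) of the core computation an inference accelerator repeats millions of times: a multiply-accumulate over a layer's weights. FPGAs can excel at this partly because many such units can run in parallel, with the datapath tailored to the model (for example, narrow integer arithmetic):

```python
# Minimal sketch of the multiply-accumulate (MAC) loop at the heart of
# DNN inference. An FPGA can lay out many of these MAC units in parallel
# and size the arithmetic to the model (e.g., 8-bit integers).

def dense_layer(inputs, weights, biases):
    """One fully connected layer: out[j] = sum_i inputs[i] * weights[j][i] + biases[j]."""
    return [
        sum(x * w for x, w in zip(inputs, row)) + b
        for row, b in zip(weights, biases)
    ]

def relu(values):
    """Common activation function: clamp negative values to zero."""
    return [max(0, v) for v in values]

# Tiny example: 3 inputs feeding 2 neurons.
x = [1, 2, 3]
w = [[1, 0, -1],   # weights for neuron 0
     [2, 1, 0]]    # weights for neuron 1
b = [1, 0]
print(relu(dense_layer(x, w, b)))  # [0, 4]
```

A real DNN chains many such layers with thousands of neurons each, which is why inference is so compute-hungry and why hardware that matches the datapath to the workload pays off.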
Microsoft is already using the versatility of Intel FPGAs to accelerate AI. Microsoft’s Project Brainwave provides customers with access to Intel Stratix FPGAs through Microsoft Azure cloud services. Cloud servers equipped with these FPGAs have been configured specifically to run deep learning models. The Microsoft service lets developers harness the power of FPGA chips without purchasing or configuring specialized hardware and software. Instead, developers can work with common open-source tools, such as the Microsoft Cognitive Toolkit or the TensorFlow AI development framework.