Artificial Intelligence For Dummies
The CPU still works well for business systems and for applications in which the need for general programming flexibility outweighs raw processing power. However, GPUs are now the standard for data science, machine learning, AI, and deep-learning workloads. Of course, everyone is constantly looking for the next big thing. Both CPUs and GPUs are production-level processors, but in the future you may see one of two other kinds of processors used in their place:
  • Application-Specific Integrated Circuits (ASICs): In contrast to general-purpose processors, a vendor creates an ASIC for a single, specific purpose. An ASIC offers extremely fast performance while using very little power, but it lacks flexibility. An example of an ASIC is Google’s Tensor Processing Unit (TPU), which accelerates neural-network workloads such as the speech processing behind Google’s voice services.
  • Field Programmable Gate Arrays (FPGAs): As with an ASIC, a vendor generally crafts an FPGA for a specific purpose. Unlike an ASIC, however, you can reprogram an FPGA to change its underlying functionality. An example of an FPGA solution is Microsoft’s Project Brainwave, which is used for deep-learning projects.
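The key difference between the two can be sketched in software terms. The following toy Python classes are purely a conceptual analogy (not real hardware code, and not from the book): the ASIC-like object gets its operation baked in at construction time and can never change it, while the FPGA-like object exposes a `reprogram` method that swaps in new logic after deployment.

```python
class ASIC:
    """Fixed-function accelerator: the operation is baked in at manufacture."""
    def __init__(self, operation):
        self._operation = operation  # set once; no way to change it later

    def run(self, data):
        return self._operation(data)


class FPGA:
    """Reprogrammable accelerator: the logic can be swapped in the field."""
    def __init__(self, operation):
        self._operation = operation

    def reprogram(self, new_operation):
        self._operation = new_operation  # new logic, same chip

    def run(self, data):
        return self._operation(data)


# A chip hard-wired for one job: summing squares (a stand-in for the
# multiply-accumulate math that dominates neural-network workloads).
fixed_chip = ASIC(lambda xs: sum(x * x for x in xs))
print(fixed_chip.run([1, 2, 3]))  # 14

# The FPGA starts with the same logic but can be repurposed later
# when project needs change.
flex_chip = FPGA(lambda xs: sum(x * x for x in xs))
flex_chip.reprogram(lambda xs: max(xs))
print(flex_chip.run([1, 2, 3]))  # 3
```

The trade-off mirrors the real hardware: the ASIC's rigidity is what buys its speed and power efficiency, while the FPGA pays some overhead for the ability to change its mind.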

The battle between ASICs and FPGAs promises to heat up, with AI developers the ultimate winners. For the time being, Microsoft and FPGAs appear to have taken the lead. The point is that the technology is fluid, and you should expect to see new developments.

Vendors are also working on entirely new processing types, which may or may not actually work as expected. For example, Graphcore is working on an Intelligence Processing Unit (IPU). You have to take the news of these new processors with a grain of salt given the hype that has surrounded the industry in the past. When you see real applications from large companies such as Google and Microsoft, you can start to feel a little more certain about the future of the technology involved.

About the book authors:

John Mueller has produced 114 books and more than 600 articles on topics ranging from functional programming techniques to working with Amazon Web Services (AWS). Luca Massaron, a Google Developer Expert (GDE), interprets big data and transforms it into smart data through simple and effective data mining and machine learning techniques.