
Science Fiction is Becoming a Reality

Published: October 21, 2024

A great deal has transpired during my 78 years. A short (but impressive) list would be television, space travel, digital technologies, computers, the internet, cell phones and, most recently, artificial intelligence and quantum computing. Each of these innovations was science fiction not too long ago. 

I have written extensively on AI, but there is yet another thing on the horizon that has long qualified as science fiction but that will become a reality sooner rather than later. Think of the novel 2001: A Space Odyssey and the accompanying movie: not HAL, the malevolent voice of AI, but, rather, HAL the supercomputer. No, this is not a new chip, graphics card or hard-drive size. This is about a truly new approach to the ingestion, digestion and distribution of data. 

For those who aren’t “in the know,” I am speaking about quantum computing. This will ultimately facilitate and extend applications using AI in ways we have not yet considered. It will impact commercial AV, IT and certainly security. This truly fits the old adage that necessity is the mother of invention, where AI is the necessity and quantum computing is the invention. 

Consider this column a “teaser” for the more in-depth article, covering the key elements of quantum computing, which I’ve already published on CommercialIntegrator.com. 

Traditional Computers and Their Limitations 

For context, let’s explore where we are with traditional computers and their limitations. Then, we’ll move forward into the future. 

A traditional computer is made up of five basic components: The motherboard is the primary circuit board; the CPU (aka the processor) is the brain; the GPU handles graphics and advanced imagery; RAM is volatile memory that holds currently used data on standby for instant use; and the storage device is non-volatile memory for installing programs and saving files. 

Let’s examine the two biggest limitations for computers today…. 

One of the apparent limitations of computers is their processing power. Although modern computers boast impressive processing speeds, they are still bound by the constraints of Moore’s Law, which predicts that the number of transistors on a microchip will double about every two years. As transistors approach atomic scales, it becomes increasingly difficult to shrink them further. Consequently, there’s a limit to how much more powerful we can make individual processors. Tasks that require massive amounts of computation — for example, simulating complex scientific phenomena or rendering lifelike graphics in real time — often strain even the most powerful supercomputers. 
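For the hands-on readers, here is a quick back-of-the-envelope sketch in Python of what that two-year doubling implies. The 1971 baseline (the roughly 2,300-transistor Intel 4004) is an illustrative assumption for the sake of the arithmetic, not a figure from this column.

```python
# Back-of-the-envelope Moore's Law: transistor counts double roughly
# every two years. The 1971 baseline (~2,300 transistors, Intel 4004)
# is an illustrative assumption.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Estimate transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2024):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Even this toy model lands in the hundreds of billions of transistors by 2024, which is why shrinking them much further has become so difficult.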

The reliance on binary code is the most significant bottleneck. Binary code uses combinations of two digits (0 and 1) to represent numbers, letters and other types of information that computers or other electronic devices can understand, interpret and use. Devices organize the code into segments called “bits,” and computers group bits into eight-bit units called “bytes.” Each byte represents a small piece of information, such as a letter or a color value, and bytes are combined to form larger pieces of information. 
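To make the byte idea concrete, here is a minimal Python illustration (my own example, not specific to any particular device) of how one eight-bit byte encodes the letter “A”:

```python
# One eight-bit byte encoding the letter "A": the character is stored
# as the number 65, i.e., the bit pattern 01000001.
letter = "A"
code_point = ord(letter)          # 65
bits = format(code_point, "08b")  # "01000001"
print(f"{letter!r} -> {code_point} -> {bits}")

# Round-trip: eight bits back to the character.
print(chr(int(bits, 2)))          # prints "A"
```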

Applications That Require High Precision 

With a fixed number of bits, binary systems can only represent a limited range of values, which is a problem for applications that require high precision (see the short demonstration below). Due to the inherent limitations of existing technologies, improvements in traditional computers are incremental. This contrasts with the rapidly increasing amount of data available to us, which is anything but incremental. This is where AI comes into play, and it ultimately calls for a new way to handle the onslaught of data and employ new algorithms. I file this under the “If only…” umbrella. Just think what we could do, if only…. 
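Here is that precision problem in action, using a standard floating-point example in Python (offered purely as illustration): the decimal value 0.1 has no exact binary representation, so even trivial arithmetic picks up error.

```python
from decimal import Decimal

# 0.1 cannot be stored exactly in binary floating point, so a tiny
# rounding error creeps into even simple sums.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# The exact binary value actually stored for 0.1:
print(Decimal(0.1))
```

For most applications, the error is harmless; in high-precision domains, it is exactly the kind of limitation that motivates a new approach.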

All the “hoopla” surrounding AI and what it portends for the future is what feeds the unrest with the limitations we have with current computers. Filling in the blank of “if only…” is quantum computing. 

It is an oversimplification, but quantum computers can work with data between 0 and 1. In AV terms, think black and white versus an infinite greyscale. Every day, we produce 2.5 exabytes of data. That number is equivalent to the content on 5 million laptops. Quantum computers will make it possible to process the growing amount of data we’re generating in the age of big data. 
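For the techies, here is a toy sketch in Python (using NumPy, and very much a simplification; real quantum hardware is not a NumPy array) of that “between 0 and 1” idea:

```python
import numpy as np

# A classical bit is exactly 0 or 1. A qubit is a pair of amplitudes
# whose squared magnitudes give the probabilities of measuring 0 or 1.
qubit = np.array([1, 1]) / np.sqrt(2)  # equal superposition of 0 and 1

probabilities = np.abs(qubit) ** 2     # [0.5, 0.5]
print(probabilities)

# The payoff scales quickly: n qubits are described by 2**n amplitudes,
# which is why quantum machines suit enormous state spaces.
n = 50
print(f"{n} qubits -> {2**n:,} amplitudes")
```

That exponentially large state space, rather than raw clock speed, is what makes quantum computing a fundamentally different tool for the age of big data.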

Quantum computers are not (totally) here yet. And to be fair, classical computers are better at some tasks (e.g., email, spreadsheets and desktop publishing, just to name a few) than quantum computers are. The intent for quantum computers is to have a different tool to solve different, larger and more complex problems — not to replace classical computers. 

For those techies among us who need all the details, I suggest visiting CommercialIntegrator.com’s site archives to read my full report on this revolutionary development. 


Alan C. Brawn, CTS, DSCE, DSDE, DSNE, DCME, DSSP, ISF-C, is principal of Brawn Consulting. 
