Do you ever wonder why the battery in your cellphone runs out of juice so fast? There’s an algorithm for that, and FAMU-FSU College of Engineering researchers may have discovered how to make your digital devices run more efficiently.
Victor DeBrunner, a professor in electrical and computer engineering at the FAMU-FSU College of Engineering, is the lead investigator on a new study to optimize the algorithms that work behind the scenes on your cellphone. Algorithms power everything from your Wi-Fi and cellular connections to near-field operations on your device, like Apple Pay or Google Pay.
“As engineers, we look at a cellphone and know it takes a lot of battery power to perform basic operations,” DeBrunner said. “Digital signal processing requires complex multiplication, so if we can decrease that hardware complexity, processes like Wi-Fi can run more efficiently and ultimately use less battery power. That’s what our algorithm does.”
Every time you stream something, you are running two mathematical operations: the discrete Fourier transform (DFT) and linear convolution. These are the mathematical foundations of all digital signal processing (DSP) operations, and they require algorithms tailored to the hardware to work efficiently.
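For readers curious what those two operations look like in practice, here is a minimal sketch, illustrative only and not the researchers’ code, that computes a DFT and a linear convolution on a made-up signal with NumPy:

```python
# Illustrative sketch only (not the researchers' code): the two core
# operations named above, computed with NumPy on a made-up signal.
import numpy as np

x = np.array([1.0, 2.0, 0.0, -1.0])   # example signal samples
h = np.array([0.5, 0.5])              # example two-tap averaging filter

X = np.fft.fft(x)        # discrete Fourier transform of the signal
y = np.convolve(x, h)    # linear convolution of the signal with the filter

print("DFT of x:", X)
print("x convolved with h:", y)
```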
“There is a famous algorithm called Cooley-Tukey, named for J.W. Cooley of IBM and John Tukey of Princeton,” DeBrunner said. “It greatly reduced the computation time it took to do a DFT back in the 1960s. At the time, it was used on computing hardware that tracked Soviet submarines. The calculation used to take hours, but the new algorithm reduced the time to less than a minute.”
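To give a sense of what that speedup means, the sketch below, an illustration based on standard textbook definitions rather than the team’s code, compares a direct DFT, which needs on the order of N² operations, with NumPy’s Cooley-Tukey-style FFT, which needs only on the order of N log N:

```python
# Illustration: a direct DFT costs on the order of N^2 operations, while a
# Cooley-Tukey FFT costs on the order of N*log(N). Both give the same result.
import numpy as np

def direct_dft(x):
    """Naive DFT: build the full N-by-N transform matrix and apply it."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    return np.exp(-2j * np.pi * k * n / N) @ x

x = np.random.randn(1024)
# Identical answers, but the FFT does far less arithmetic to get there.
assert np.allclose(direct_dft(x), np.fft.fft(x))
```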
Like the Cooley-Tukey algorithm, the DeBrunner algorithm has the potential to reduce the time it takes a cellphone to process signals. Their method simplifies the computation by identifying the parts whose results are already known and eliminating the calculations for those parts.
If you already know something equals zero, do you need to go through all the calculations to get that answer? The researchers say no. By eliminating some of the unnecessary math, they can still get the result they want. DeBrunner describes the idea as symmetry.
“Symmetry says you have four points in a plane that are conjugates of each other,” DeBrunner said. “The real and imaginary parts cancel or double each other out, so you don’t need the additions. Symmetry tells you that you don’t have to add those four numbers; you already know the result.”
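One common way to see this idea, shown here as our own illustration of the general principle rather than the exact construction in the paper, is the conjugate symmetry in the DFT of a real-valued signal: the bins come in conjugate pairs, so their sum is known to be twice the real part before any addition is actually carried out:

```python
# Toy illustration of the symmetry idea (an assumed interpretation, not the
# paper's exact method): for a real-valued signal, DFT bins come in conjugate
# pairs, so the imaginary parts cancel and the real parts double.
import numpy as np

x = np.random.randn(8)     # real-valued signal
X = np.fft.fft(x)

k = 3
pair_sum = X[k] + X[-k]    # X[-k] is the conjugate of X[k] for real input
print(pair_sum)            # equals 2 * X[k].real (imaginary parts cancel)
print(2 * X[k].real)
```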
Victor DeBrunner is working on the project with Linda DeBrunner, a professor in electrical and computer engineering, and Rajesh Thomas, a doctoral student in electrical engineering at the college.
“DFTs and similar algorithms affect the life of people every second—from medical technology to self-driving cars to cellular devices,” Thomas said. “Improving the efficiency of DFT computation even a little can make devices and machines faster, cheaper and use less power—which can ultimately lead to greener technology and devices with longer battery life.”
Thomas, who recently accepted a position with Texas Instruments, is basing his dissertation on the research.
DeBrunner says the algorithm has broad appeal, and he is pursuing industry avenues for development. Texas Instruments, hardware manufacturers like Intel, AMD and Nvidia, and systems integrators like L3Harris and MITRE might be interested in the technology.
“Artificial intelligence engineers at Amazon, Google, Facebook, IBM and Microsoft, as well as an emerging host of startups, write code with these functional operations every day,” DeBrunner said.
Their research, “A Sparse Algorithm for Computing the DFT Using Its Real Eigenvectors,” is featured on the Signals journal website.