How Light-Powered Computers Can Help AI’s Energy Problem


Computers that use light instead of circuits to run calculations may sound like a plot point from an episode of Star Trek, but researchers have been working on this new method of computing for years.

These are called optical computers, and laboratories around the world are exploring how they can be useful in everyday life.

On Wednesday, a group of researchers from Penn State published a paper in the journal Science Advances that examines how optical computing can reduce power consumption in artificial intelligence systems.

Xingjie Ni, an engineering professor at Penn State and one of the paper's authors, told CNET that the work is a proof of concept for how optical computing could benefit the rapidly growing AI industry.


“Sometimes progress comes from rethinking familiar physics with a new purpose,” Ni said. “By redefining classic ideas in optics through the lens of modern AI challenges, we can open up practical new directions for faster, greener computing hardware.”

Strengthening AI

As AI is increasingly adopted at work and at home, its energy costs are becoming a pressing issue. Running AI products and services like ChatGPT requires a great deal of computing power, and a great deal of energy is consumed in the process.

You may live in or near a city where a tech company is planning to build a data center, or your monthly utility bill may be rising due to increased demand on the local power grid.

The International Energy Agency estimates that data centers accounted for about 1.5% of global energy consumption in 2024 and that this figure has grown by about 12% annually over the past five years. The IEA also projects that data center energy use will double by 2030.
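As a rough sanity check on those figures (my back-of-the-envelope arithmetic, not a calculation from the IEA report), a steady 12% annual growth rate implies a doubling time of about six years, which lines up with the projection of a doubling between 2024 and 2030:

```python
import math

annual_growth = 0.12  # ~12% annual growth in data center energy use (IEA estimate)

# Doubling time under compound growth: solve (1 + r)^t = 2 for t
doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time at 12%/yr: {doubling_years:.1f} years")  # ~6.1 years
```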

So using an alternative computational method to reduce the energy used by AI is an attractive prospect.

Light intensity

Optical computers — computers that use light instead of electricity — are still mostly in the tech industry's moonshot category, years away from commercial use. The roots of optical information processing stretch back to the 1960s.

True optical computers remain largely confined to research laboratories. But optical data transfer, which sends data rapidly through pulses of light, is already used in some large data centers and for ground-to-plane transmission.

The use of optical computing for artificial intelligence, however, is an emerging field of study. There are real challenges in getting light to perform the functions required by neural networks, the class of AI models behind products like today's chatbots.

The problem is that light naturally behaves linearly. To build a computer that can process data the way neural networks do, you need an optical system that can produce nonlinear functions. For optical computers to do this, they usually require additional materials that are difficult to manufacture and consume a lot of power.
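Why do neural networks need nonlinearity in the first place? Stacking purely linear layers is mathematically equivalent to a single linear layer, so depth buys nothing without a nonlinear function between layers. A minimal illustration (my example, not from the paper):

```python
import numpy as np

# Two small "layers" represented as matrices, and an input vector
W1 = np.array([[1.0, -1.0],
               [1.0,  1.0]])
W2 = np.array([[2.0, 0.0],
               [0.0, 2.0]])
x = np.array([1.0, 2.0])

# Without a nonlinearity, two linear layers collapse into one linear map
assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)

# With a nonlinearity (here ReLU) between layers, the collapse no longer holds
relu = lambda v: np.maximum(v, 0.0)
print(W2 @ (W1 @ x))      # [-2.  6.]  -- purely linear
print(W2 @ relu(W1 @ x))  # [ 0.  6.]  -- nonlinear, a genuinely different map
```

This is the behavior an optical system must somehow reproduce — and, as Ni notes below, doing it with optical materials alone is usually costly.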

“True optical nonlinearity is usually weak and difficult to access – it usually requires high-power lasers or special materials, which increases the complexity and weakens the energy-efficiency advantage of optics,” said Ni. “Our approach avoids those requirements while still delivering performance comparable to nonlinear digital networks.”

Infinity mirror

Penn State researchers have found an interesting solution that helps optical computers perform nonlinear functions that are more suited to the type of data processing required by AI.

The prototype the team built uses an "infinity mirror" setup that repeatedly bounces light between "tiny optical elements, encoding data directly into beams of light," creating a nonlinear relationship over time. The resulting light patterns are then captured by a camera.

“The key takeaway is that a carefully designed optical structure can enable the nonlinear input-output behavior required by AI without relying on strong nonlinear materials or high-power lasers,” Ni said. “By allowing light to ‘bounce’ through the system, we create this nonlinear mapping while keeping the hardware simple, low-power and fast.”


The (above) figure shows how light is focused into a small processing unit, allowing vast strings of computational information to be transferred without energy-intensive circuitry. Another figure (below) illustrates how the team's process works conceptually: the light input is repeatedly reflected through lenses and other optical devices, encoded into complex strings of information, and finally focused onto a camera that provides a simplified output.
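The paper's exact optical model isn't reproduced here, but one well-known source of "free" nonlinearity in setups like this is the detector itself: lenses and mirrors act linearly on the light field, while a camera measures intensity, the squared magnitude of that field. A toy numerical sketch of that idea (my illustration with a made-up transform `T`, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical fixed linear optical transform (lenses/mirrors are linear in the field)
T = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(8)

def optical_pass(field, bounces=3):
    """Reflect the field repeatedly through the same linear optics,
    then detect intensity at the camera (a nonlinear |.|^2 operation)."""
    for _ in range(bounces):
        field = T @ field
    return np.abs(field) ** 2  # the camera records intensity, not amplitude

x = rng.normal(size=4)
# Doubling the input quadruples (not doubles) the output:
# the overall input-output map is nonlinear even though the optics are linear
print(np.allclose(optical_pass(2 * x), 2 * optical_pass(x)))  # False
print(np.allclose(optical_pass(2 * x), 4 * optical_pass(x)))  # True
```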

Xingjie Ni

It’s an interesting concept, but turning the prototype into a system with real-world applications will take a lot of time, work and money.

From the lab to the data center

Ni acknowledged that optical computers for AI are still years away.

“A realistic timeline to reach an industry-facing prototype and early demonstrations is about two to five years, depending on the level of investment and the target application,” he said.

However, this is a hot topic in the computing world. Francesca Parmigiani, a principal research manager at Microsoft Research, told CNET that optical chips could one day work alongside traditional GPUs to help AI systems perform specific tasks.

“Optical computing has the potential to perform many more operations in parallel and at much higher speeds than conventional digital hardware,” Parmigiani said. “This can translate into significant gains in energy efficiency and reduced latency for workloads.”

Optical computers won't replace the traditional computers we use for AI anytime soon. But within a few years, optical hardware could be integrated into AI systems, working alongside conventional machines.

“The goal is a hybrid approach: Electronics still manages general-purpose computing, memory and control, while optics can facilitate specific high-volume computations that dominate AI in time and energy costs,” Ni said.




