Eric Bogatin, Signal Integrity Journal Technical Editor

Eric Bogatin is Technical Editor at Signal Integrity Journal and the Dean of the Teledyne LeCroy Signal Integrity Academy. Additionally, he is an Adjunct Professor at the University of Colorado - Boulder in the ECEE Dept. Eric improves the signal to noise ratio by sorting through all of the information available and finding the best quality content to publish on signalintegrityjournal.com.

Power Integrity

The Impact of AI at the Singularity

November 2, 2025

In 2005, noted futurist Ray Kurzweil published his book The Singularity is Near [2], in which he applied his Law of Accelerating Returns to extrapolate computational performance.

The Law of Accelerating Returns applies Moore's Law to all technological advances. By whichever metric you select, the performance of technology increases exponentially over time rather than linearly. This is a consequence of the differential equation that technological progress obeys: the rate of advancement in each generation is proportional to the level reached by the previous generation.

This is a linear first-order differential equation, and the solution is an exponential. Kurzweil offers numerous examples of the exponential growth of technologies.
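The growth law described above can be written in one line. Taking P(t) as the performance of a technology and k as its (assumed constant) growth rate:

```latex
\frac{dP}{dt} = kP
\quad\Longrightarrow\quad
P(t) = P_0\, e^{kt},
\qquad
t_{\text{double}} = \frac{\ln 2}{k}
```

The doubling time is what Kurzweil reads off his data: a constant k produces a straight line on a log-linear plot.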

In particular, he plotted the computing capability that $1000 (in current dollars) buys. He demonstrated that over the last 120 years, this figure has had a doubling time of about one to two years. His plot from 2005 is shown in Figure 1 [1].
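As a back-of-the-envelope check on the scale this implies (a sketch using only the one-to-two-year doubling times cited above, not Kurzweil's actual data):

```python
import math

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Total multiplicative growth after `years` at a fixed doubling time."""
    return 2.0 ** (years / doubling_time_years)

# 120 years of compounding, bracketed by the one- and two-year
# doubling times cited for the $1000-of-compute curve.
slow = growth_factor(120, 2.0)   # doubling every two years
fast = growth_factor(120, 1.0)   # doubling every year

print(f"2-year doubling: ~10^{math.log10(slow):.0f}x")
print(f"1-year doubling: ~10^{math.log10(fast):.0f}x")
```

A sustained doubling time of one to two years compounds to a factor of roughly 10^18 to 10^36 over 120 years, which is why the curve in Figure 1 looks tame only on a logarithmic axis.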

Figure 1. The information processing ability that $1000 (in current dollars) buys increases exponentially. In his book, Kurzweil writes that "by around 2020," $1000 will buy computer power equal to a single human brain. He also states that by 2045, the onset of the singularity, the same amount of money will buy one billion times more power than all human brains combined today. He calls the singularity the merger of human and machine intelligence.

Fast-forward to the present day, 2025. An LLM running on a $1000 desktop computer is not far off from Kurzweil's prediction of human-brain-level computing for that price. Consider a typical hyperscale data center with 10,000 GPUs, on which LLMs are trained: we are nearing Kurzweil's singularity. What does this mean? What is the consequence of AI software running on hardware optimized for AI processing?

As engineers, we interact with AI systems in three ways: as developers of the hardware and software platforms, as users of the new paradigm of AI-driven EDA design tools, and as consumers of AI tools in our daily lives.

Twenty years ago, the killer app that drove the need for speed was the transmission of video over the internet; Netflix and YouTube built the modern internet. Today, AI data centers drive the need for speed and interconnect density. Heterogeneous packaging is finally in production. Optical interconnects from board to board are also in production. Power distribution and thermal management for 50 kW processor boards are driving new technology revolutions.

Agentic AI models, autonomous agents that learn hardware design on their own from EM simulation outputs, are being offered commercially. There seems to be a consensus that these tools can already replace the junior engineer. The debate is not whether they will replace the engineering judgment of senior engineers, but when. If they do replace the junior engineer, where will the next generation of senior engineers come from?


All aspects of business and work life are currently impacted by the proliferation of AI tools. Education is not exempt from the revolutionary change brought about by AI. For example, Khanmigo displays the benefit of free personalized AI tutors in any subject for children of any age, anywhere in the world.

We are in the early, chaotic phase of the AI revolution in education: faculty use AI tools to grade content, and students use them to create it. Many institutions have yet to regulate the use of AI in academics, so for now, anything goes.

Where lies the future of AI in education? I am reminded of an old joke I heard from another professor 50 years ago. It goes like this:

A professor realizes one day that he gives the same lectures year after year. He decides to record his lectures. The following year, he comes to class and turns on his tape recorder, and the class listens to his recorded lecture. It goes so well that he starts leaving class after five minutes and just lets the recorder play.

One day, he decides to check on his class and comes back ten minutes before the end of the lecture. When he walks into the classroom, he sees a tape recorder on every desk, left by each student to record his recording.

We are now witnessing one outcome of the law of unintended consequences. Students create content using AI, which is then graded by professors using AI. Emails written and sent by AI are read and responded to by AI. Soon, an AI processor board for an AI-generated application will be designed by AI, verified by AI, and fabricated by an AI-driven assembly line. Where is the human in these loops?

In 2024, Kurzweil published The Singularity is Nearer, a sequel in which he paints a beautiful picture of all the new opportunities we can expect from the AI revolution: longer lifespans through designer genes, higher quality of life from new materials, instant communication, and safer travel [3].

Before cars replaced horse-drawn carriages, who could have anticipated the problems of car accidents, rush-hour traffic, air pollution, and parking? Kurzweil has a great track record for his predictions coming true, but there are alternative visions of the future. Just look at some of the movies from the last 60 years, such as “I, Robot,” “The Terminator” series, “The Matrix” series, “WarGames,” or “Colossus: The Forbin Project.” In all of these narratives, when humans were taken out of the loop, it did not end well for them.

As developers of the next generation of technology, we should always keep in mind that technology amplifies human nature. Let’s pay attention to the unintended consequences of what we create.

REFERENCES        

  1. R. Kurzweil and Kurzweil Technologies, Inc., “PPTExponentialGrowthofComputing.jpg,” Wikimedia Commons, January 2008.
  2. R. Kurzweil, The Singularity is Near: When Humans Transcend Biology, Viking, 2005.
  3. R. Kurzweil, The Singularity is Nearer: When We Merge with AI, Viking, 2024.