Can Nvidia Technology Accelerate Computing 4x Every Year?

And what does it mean for cybersecurity?

In his latest videos, Jensen Huang, the CEO of Nvidia, revealed the Blackwell architecture and other pieces in more detail:

Blackwell is a very large GPU installation with the potential to increase computing power 4x every year. As the image shows, a Blackwell rack weighs 3,000 pounds at this time.

The way to think of this ‘rack’ is that this chunk of computing is now abstracted. Nvidia can rework this chunk every year, building denser and denser computing capability, which creates a 4x increase every year, as Jensen Huang explains.

What does a 4x increase every year mean?

Well, let’s go back to something we have understood and lived with for the last 50-odd years: Moore’s law, a 2x increase in computing power every 18 months. Silicon manufacturers have been able to roughly double computing power every 18 months. We rode this major technological wave and used it to power many industries (in fact, all industries). It helped usher in an age of improvement unheard of throughout human history.

Multiply 2x by 2x by 2x and you get 8x in 54 months. Where does that go in 10 years? Ten years is roughly six to seven doubling periods, so about 2^7 = 128 – let us call it 100x.

What happens to our 10-year calculation at 4x every year? It is 4^10, of course, which is 1,048,576 – let us call it 1 million x.

A million versus 100 means roughly 10,000 times faster over 10 years. These calculations, of course, assume that the Blackwell computing block (or whatever it will be called) really can increase computing power 4x every year.
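For readers who like to check the arithmetic, here is a small Python sketch of the comparison above; the 18-month doubling period and the 4x-per-year figure are the assumptions taken from the text, not measured numbers.

```python
# Back-of-the-envelope check of the growth numbers above.
moore_doublings = 10 * 12 / 18        # ~6.7 doublings in 120 months
moore_factor = 2 ** moore_doublings   # ~100x over ten years

blackwell_factor = 4 ** 10            # 1,048,576 -- call it a million x

print(f"Moore's-law pace over 10 years: ~{moore_factor:,.0f}x")
print(f"4x-per-year pace over 10 years: {blackwell_factor:,}x")
print(f"Ratio: ~{blackwell_factor / moore_factor:,.0f}x faster")
```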

Now that I have your attention: this obviously means we are in for a unique moment in history, the dawn of a new computing era.

It also means we are not yet using the latest software improvements properly, since this technology is still in its infancy.

These are the CUDA-X libraries:

If you look at this still image from the video, it shows a breadth of software reaching into many industries, built on Nvidia’s CUDA (Compute Unified Device Architecture) platform. What has Nvidia been working on? (from left to right)

  1. cuDNN  – Deep Learning
  2. Modulus – AI Physics
  3. cuOPT – Decision Optimization
  4. cuDT – Data Programming
  5. cuPyNumeric – Numerical computing
  6. cuVS – Vector Search
  7. Parabricks – Gene Sequencing
  8. CUDA-Q – Quantum Computing
  9. AI Aerial – Radio Signal Processing
  10. cuDSS – Sparse Solver
  11. cuLitho – Computational Lithography

Each one of these applications (and I am sure this is only a subset) has the ability to supercharge an entire industry with performance and productivity improvements, just like Moore’s law did. CUDA is shortened to ‘cu’ in Nvidia nomenclature, and over the last several years Nvidia has been able to make 20x or 30x improvements in cuDNN and cuDT, among others.
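To give a feel for what these libraries look like in practice, here is a hedged Python sketch. cuPyNumeric, for example, is pitched as a drop-in replacement for NumPy, so accelerating an existing array program can be as simple as swapping an import; the module name `cupynumeric` is my assumption from the library’s name, not something I have verified.

```python
# Hedged sketch: many CUDA-X libraries aim to be near drop-in accelerators
# for familiar CPU code. Module name `cupynumeric` below is an assumption.

import numpy as np           # CPU baseline
# import cupynumeric as np   # swap the import to run the same code on GPUs

# The same array program runs unchanged; only the backend changes.
a = np.random.rand(4096, 4096)
b = np.random.rand(4096, 4096)
c = a @ b                    # dense matrix multiply
print(float(c.sum()))
```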

The other aspect of this architecture is the Omniverse, AGX (Jetson), and DGX.

DGX is where things get developed; then a version of the ‘robot’ or software agent is placed in the Omniverse, a digital copy of the real world. Let us give an example with a factory building widgets.

Design a new way of building widgets in DGX. Place this new method into the Omniverse, the digital copy of the factory floor. The Omniverse lets engineers see how the new method will work in a test environment before building it in the real one.

When the new method is ready to move toward the real world, there is AGX (or Jetson, as Jensen Huang calls it). In this environment, physical robots run the new factory method in a controlled manner.

Only once the new method has cleared those hurdles in the AGX environment does it go into the real world.
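As a rough illustration of that DGX → Omniverse → AGX gate, here is a small Python sketch. Every function name in it is hypothetical, invented only to show the flow; none of it is an Nvidia API.

```python
# Hypothetical sketch of the develop -> simulate -> hardware-test -> deploy
# gate described above. All names are made up for illustration only.

def develop_method():
    """Design the new widget-building method (the DGX stage)."""
    return {"name": "new-widget-method", "version": 1}

def passes_digital_twin(method):
    """Try the method in a digital copy of the factory (the Omniverse stage)."""
    return True  # placeholder: simulate and check throughput, safety, etc.

def passes_hardware_trial(method):
    """Run the method on real robots in a controlled cell (the AGX/Jetson stage)."""
    return True  # placeholder: hardware-in-the-loop validation

method = develop_method()
if passes_digital_twin(method) and passes_hardware_trial(method):
    print("Roll out to the real factory floor")
else:
    print("Back to the drawing board in DGX")
```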

I also like this image, for different reasons. On the left, under “Understand data”, it has a set of images: text, text, image, video, text, speech, multi-modal, amino acid, and brainwaves.

The right side shows image, video, image, 3D, sound, animation, manipulation, protein, and speech.

What do these two sides mean when the slide also says: “You could teach it to understand just about anything”?

First of all, the two sides signify input and output, i.e., text can be translated into image, video, sound, etc. The model can relate most of these items to each other: video can become text and vice versa, and the left side can translate to the right side or the other way around. This is what we are doing right now with AI programs such as ChatGPT (or all the others – Gemini and more).

Why am I making such a big deal out of this video? Because this is the good-guy video. Now let us put our cybersecurity hat on and imagine how a criminal hacker could use CUDA or Blackwell (or its weaker cousins). In case you are wondering how GPUs became so powerful in AI, the linked video by “Dr. Waku” goes into some detail: essentially, the graphics processor became powerful by performing the work of placing pixels on the monitor again and again, which means the GPU is very good at concentrated repetitive computation in parallel. That is much like AI computation, so Nvidia just had to scale the GPU up quite a bit (that is what Blackwell is).
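To make that “repetitive computation in parallel” idea concrete, here is a minimal Python sketch using Numba’s CUDA support; it is my own illustration, not from the video, and it assumes an Nvidia GPU plus the numba and numpy packages. Each GPU thread applies the same tiny operation to one element of a large array, which is exactly the pattern that graphics and AI math share.

```python
# Minimal illustration of "same simple operation over many data elements,
# in parallel" -- the pattern GPUs excel at.

import numpy as np
from numba import cuda

@cuda.jit
def scale_and_add(x, y, out):
    i = cuda.grid(1)                 # this thread's global index
    if i < out.size:                 # guard the threads past the end
        out[i] = 2.0 * x[i] + y[i]   # one tiny operation per thread

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

d_x = cuda.to_device(x)              # copy inputs to the GPU
d_y = cuda.to_device(y)
d_out = cuda.device_array_like(d_x)  # output buffer on the GPU

threads_per_block = 256
blocks_per_grid = (n + threads_per_block - 1) // threads_per_block
scale_and_add[blocks_per_grid, threads_per_block](d_x, d_y, d_out)

print(d_out.copy_to_host()[:5])      # first few results back on the CPU
```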

Obviously our lives are going to get very interesting in the next few years.

What we need to do is learn AI within our industry, remember that the bad guys will also use AI, and spend a little time (10%? 5%?) thinking about defending our AI systems.

One of the items Dr. Waku mentioned is a company in China called Moore Threads Technology, which has already been sanctioned by the US and is developing GPUs with older technology compared to Nvidia. As mentioned in the tech magazine techinasia.com, China is building its own AI architecture and pieces.

Here is a brand new Nvidia SC24 address (11/18/24) where Jensen Huang discusses Nvidia’s technologies, past and future, ahead of the Wednesday release of their earnings report.


It is clear that from now on we are in an AI arms race…