Two truths about the technology industry that aren’t always particularly popular to acknowledge are:
- For better or worse, we are a fashion industry; a brilliant and weirdly technical fashion industry, but a fashion industry nonetheless. The hype cycles around trendy ideas are intense, and our collective attention is easily diverted to the new hotness.
- For better or worse, industry trends often move as a pendulum.
As an observer this can make it difficult to assess:
- What is hype and what is legit? and
- What is the industry’s overall direction of travel?
I can’t recall any conference in recent memory that has caused me to think more deeply about these forces than NVIDIA’s GTC conference, and it’s been a fun topic for us to discuss at RedMonk. This collaborative post with my colleague James Governor shares the context of some of those discussions.
There are parts of NVIDIA’s business model that are definitely tied to the technological hype cycle (e.g. cryptocurrency mining on GPUs), and the macro changes in said hype cycle have had very real financial impacts for the company over the past year.
But what interested us more than cryptocurrency impacts was trying to parse some strategic bets the company is placing around cyclical movements. How is NVIDIA navigating hype cycles and pendulum swings?
Centralized vs. Decentralized Compute
As fashions have changed, so have computing models. Mainframes gave way to minicomputers gave way to client/server computing gave way to PCs gave way to mobile and so on. Each wave of change was associated with a change in accounting and cost models, and so a change in how lines of business work with information technology. Now edge computing is threatening to become the preferred hemline for the next few seasons.
Fashion aside, though, it’s important to remember that we always see a complex interplay of fashions. Hipsters start growing beards long before the mainstream does. Suits keep wearing their suits. Distributed computing always complements more centralized computing: mobile apps drive mainframe workloads in online travel, for example.
Today we see NVIDIA making both edge plays and “mainframe” plays simultaneously. We previously explored the forces impacting NVIDIA’s data center investments, but we wanted to spend more time talking about the interplay with what they’re doing at the edge.
On the edge side, at GTC NVIDIA introduced the Jetson Nano, a small-footprint device designed to be powerful enough to run AI and ML workloads. NVIDIA sees Jetson as the natural choice for developers who outgrow Raspberry Pi devices and want more horsepower in a small-footprint device. The base Jetson Nano is priced at $99 to encourage volume adoption.
CEO Jensen Huang pointed at automotive as a target market where edge-based AI and ML makes sense. AI in the car needs to be contextually aware at the edge, including:
- perception capability (sensors)
- reasoning capability
- action capability
On the edge, we need intelligent reflexes and localized decisions. However, compute-heavy and data-intensive intelligence will remain the domain of centralized compute.
In the future (thus the Jetsons reference!), Huang argued, every building, every factory, every city will be robotic, with the capabilities laid out above. Intelligence will be composed of endpoints plus cloud-based compute and information processing. A useful metaphor is human intelligence: the brain does central processing, while our reflexes are edge computing.
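The reflex-vs-brain split described above can be sketched in a few lines of code. This is purely illustrative, not anything NVIDIA ships: the event fields, thresholds, and routing rules are hypothetical, standing in for the general pattern of keeping latency-critical decisions at the edge and deferring compute-heavy work to centralized infrastructure.

```python
# Hypothetical sketch of the edge/cloud split: "reflex" decisions run
# locally on the edge device; heavy analysis is offloaded to the cloud.
# All field names and thresholds are illustrative assumptions.

def route_sensor_event(event: dict) -> str:
    """Decide where a sensor event is handled: edge reflex or cloud."""
    # Reflex path: latency-critical perception -> action stays local.
    if event.get("obstacle_distance_m", float("inf")) < 2.0:
        return "edge: brake"  # act immediately, no round trip

    # Heavy path: data-intensive work (e.g. model retraining on new
    # driving scenarios) belongs to centralized compute.
    if event.get("novel_scenario"):
        return "cloud: upload for analysis"

    # Default: nothing to do, keep driving.
    return "edge: continue"
```

The design choice the sketch highlights is that the routing decision itself must be cheap and local; the edge device cannot wait on the network to decide whether to brake.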
Credit: This was a collaborative piece of writing between myself and my colleague, James. Equal thanks and credit go to him for his contributions.
- About NVIDIA: NVIDIA: Scaling Up and Scaling Out
- About pendulums: A Swing of the Pendulum: Are Fragmentation’s Days Numbered?
Disclaimer: NVIDIA is a client and paid for my (Rachel’s) T&E to GTC.