Happy 40th anniversary, PCMag! Times like these call not only for nostalgia but also for a look back at what we got right and what we got wrong. I was the editor-in-chief of PC Magazine for 14 years, and in the September 2001 issue, for the 20th anniversary of the IBM PC, I made some predictions about technology and how I expected it to look 20 years into the future. So now is the perfect time for me to revisit those assertions and tally up my hits and misses. Let’s just say I didn’t quite have crystal-ball vision. But all in all, I didn’t fare too badly.
Miss: The Utmost Importance of the Smartphone
“Digital cameras will be ubiquitous, with just about everyone using computers to edit photos and digital video. Every business will use the Internet for communications, and web services will start to take shape this year. Over the next few years, your calendar will be available on the web and accessible wherever you are. You’ll be able to share it with multiple people.”
I was sort of right: these things did happen, but I didn’t take the prediction nearly far enough. By 2011, digital cameras and the internet were indeed everywhere, and sharing content online was easy. But what I missed was how the smartphone would essentially consume the digital camera market—and, more importantly, how it would become most people’s primary computing device, offering a portability the PC on your desk never could. Apple launched the iPhone in 2007, with the App Store following the next year. The rest is history.
(Photo: Apple iPhone 13 Pro Max)
Hit: The Genesis of Cloud Computing
“The applications I really want—real-time, accurate voice recognition and translation—are still years away, but they’re coming. In the next few years, we’ll see advances in peer-to-peer computing not only for file sharing but also for harnessing all the computing power we have out there to solve big problems.”
Yes, the idea of what we now call “scale-out” computing was already taking off. We had software-as-a-service (SaaS) solutions, including Salesforce—and, depending on how you look at it, the concept goes back as far as ADP processing payroll on mainframes. Amazon Web Services launched in 2002, and it soon evolved into what we now call “cloud computing.”
These platforms started as more efficient ways of running traditional applications, but they also let organizations collect, store, and analyze massive amounts of information cost-effectively. That enabled new applications and new business models, with benefits and drawbacks alike. What I hadn’t realized was just how important they would become as software-development platforms.
(Illustration: Weiquan Lin/Getty Images)
And it was the ability to train deep neural networks on GPUs, and typically to run these massive models on cloud infrastructure, that really enabled speech recognition and, later, translation. Siri launched in 2010 and Alexa in 2014, and since then, such platforms have become more and more accurate, with real-time translation improving vastly in the past couple of years.
Hit: Broadband Becomes Big
“The broadband and wireless revolutions are still in early stages, and the telecommunications market is overbuilt. But I’m convinced we’ll eventually have fantastic broadband and wireless applications.”
This one’s a no-brainer, of course. If anything, as I mentioned earlier, I underestimated everything we would be doing on smartphones. But it did take years for internet traffic to catch up with, and then exceed, the capacity built during the dot-com era.
Hit: AI As a Double-Edged Sword
“I also take seriously the very real concerns about where technology is headed. I find some comfort in the slow progress within the field of artificial intelligence, but the ideas from folks like Ray Kurzweil and Vernor Vinge make me wonder.”
I was right to be concerned about where technology was headed, but I didn’t account for the AI explosion of the past decade. Deep-learning neural networks were an academic backwater when I wrote this; it would be another 10 years before researchers started training them on GPUs. Combined with the massive amounts of data we now have available and the cloud infrastructure to handle them, this technology has brought new accuracy to image recognition and voice recognition, and later to all sorts of other applications.
(Illustration: imaginima/Getty Images)
We’ve seen a lot of utility from machine-learning algorithms and the applications they’ve made possible, but we’ve also seen plenty of instances in which these applications have produced unintended or biased results, as well as much controversy over how they’ve been applied in the real world. We’re still grappling with these issues, and there’s no end in sight.
Hit: Nanotechnology and Biotechnology
“I think that nanotechnology and biotechnology are more fertile grounds for both excitement and concern. For instance, the controversy about bioengineered food presages harder debates to come.”
We’ve seen many nanotechnology and biotechnology improvements in the past 20 years—mRNA vaccines for COVID-19 among them—along with many debates on these topics. A lot of the progress has been slower than I might have guessed, but let’s call it a hit.
(Photo: SpaceX Starship SN15)
Miss: The Commercialization of Space
“I don’t think that technology will evolve as smoothly or quickly as some people predict. After the moon landing, people thought we’d start colonizing the planets. Well, a quarter-century has passed since the last man walked on the moon, and no one is even talking about going back.”
No one has walked on the moon since I wrote that, either. But we have seen incredible growth in commercial space applications—everything from satellite communications to GPS. I wouldn’t have predicted the advances startups such as SpaceX have brought about, including a far lower cost of getting to space, or that we’d see “space tourism” before anyone could get back to the moon. These firms are now talking about going back to the moon—and maybe on to Mars.
Miss: The Impact of Social Networks
It’s always difficult to predict how people will use technology. In the end, no matter what the technology is, people will decide what is and is not useful for them. As I often put it: Technology changes quickly. People change slowly.
I knew there would be controversies, but I didn’t expect social networks to spread so widely and to become instruments for further polarizing society. I hadn’t grasped the role these networks would play in commerce, or in encouraging billions of people to make and share their own short videos. And I completely missed the emergence of the decentralized blockchain (first described by Satoshi Nakamoto in 2008) and of cryptocurrency.
Bonus Hit: Technology Is Never Boring
“The next 20 years promise to be quite a ride. I wouldn’t miss it for anything.”
I was definitely right about that. Here’s to the next 20 years!