From “Colossus: The Forbin Project”, 1970

Jonathan Wise

Chief Technology Architect
CESMII – The Smart Manufacturing Institute

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.” — Eliezer Yudkowsky

Last year I wrote about the hype, and the challenges, of generative AI. I was, and remain, dubious that it will ever live up to the breathless optimism that accompanied last year's splashy launches. But I may have moderated my position a little as a few practical use-cases have started to emerge from the noise.

Generative AI still produces problematic results, still can't tell when it's hallucinating, and, as we're learning, turns out to have an enormous infrastructure and energy impact. But as I said in my previous post, we can't put this genie back in the bottle, and we shouldn't try. As with previous innovations in computation, our understanding of AI will evolve until today's magic seems commonplace. As in the past, we'll likely decide to rescind the "AI" label entirely and reserve it for some future magical innovation, but only if we ride out this hype cycle and keep pushing for the next summit.

In last year's post, I urged caution in the adoption and integration of generative AI. On an impromptu panel at the ARC conference earlier this year, I held that line, but with a hint of optimism added. There are valid applications of Gen AI, and promising potential for continued improvement of cognitive-style computing, but we should continue to proceed thoughtfully. In the past year, I've never managed to coerce ChatGPT into producing a complete and useful novel work, but I have successfully used it to summarize existing work. While I'm nowhere near trusting it to replace a software developer, its ability to regurgitate Stack Overflow, and (potentially plagiaristically) play back related code from GitHub, can accelerate monotonous or obscure development tasks. And although they aren't part of the current Gen AI hype, machine learning models that quickly classify, estimate, or leverage first principles to simulate or predict have long since proved their worth in the toolbox of manufacturing problems, and in data science in general.

Much prognostication has been done by others about the societal impact of Gen AI. Some have even compared it to the iPhone as the dawn of a new era. I'd argue that this moment in computing has the potential to be even more important: the iPhone, in its elegance, made us all dumber. Its glossy, fluid UI and closed-sandbox model abstracted users away from the challenges computing presented in the past. If I hadn't taught them, it would have been entirely possible for my kids to make it through high school without understanding what a file system is. The iPhone turned our most personal computer into an appliance that only gets cracked open by specially trained service people. Apple continues this trend with more hardware that can't be repaired, an operating system so locked down that you can no longer drag pre-installed apps to the trash, and, with each OS release, additional "security" restrictions that close the platform off further. If any part of the system breaks, the expectation is that customers will simply throw it out and buy a new one.

Continue reading on LinkedIn…
