Why Generative AI Fails to Recommend the Right Architecture

When popularity overrides architectural fitness

Dan McCreary
3 min read · Feb 11, 2024

In my prior article, I discussed how Generative AI (GenAI) can generate the first draft of a GenAI strategy and roadmap. In this article, I will show the limits of that approach and how GenAI can make inappropriate suggestions.

Let’s take a sample prompt:

What are the leading technologies that enable large organizations to create models of the real world to help them build intelligent systems that predict customer behavior and provide high-quality recommendations?

When you give this general prompt to GPT-4, it returns a list of technologies it believes are appropriate. One item that shocked me was the suggestion that blockchain was relevant. When I asked clarifying questions, it offered flimsy justifications about collecting and validating the data used in training. The real reason is that blockchain has been in the news and is naturally associated with AI and other emerging technologies. That does not make it a good recommendation.

One of the critical reasons blockchain has fallen out of favor is its extremely high transaction costs compared to standard tools for nonrepudiation: hashing, digital signatures (DIGSIG), certificates, and a certificate authority (CA). By digitally signing transactions, transaction logs, and reports, you can clearly show that collections of transactions have not been tampered with. But there is one catch: you need a certificate authority to hand out the certificates.
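
To make the standard stack concrete, here is a minimal Python sketch of signing and verifying a transaction log using the widely available cryptography package. The key pair, log contents, and parameters are illustrative; in production, the keys would be tied to CA-issued certificates rather than generated on the spot.

```python
# A minimal sketch of nonrepudiation without a blockchain: sign a
# transaction log with a private key, then verify it with the
# matching public key. Uses the "cryptography" package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In practice the key pair is bound to a CA-issued certificate;
# here we generate one locally purely for illustration.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

transaction_log = b"2024-02-11T10:00:00Z,acct-42,debit,19.99\n"

# Sign the log: anyone holding the public key can later prove
# the log has not been tampered with.
signature = private_key.sign(
    transaction_log,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Verification raises InvalidSignature if even one byte has changed.
public_key.verify(
    signature,
    transaction_log,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("log verified: no tampering detected")
```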

Today, the ONLY remaining use case for blockchain is when organizations can't figure out how to build auditing systems that let them trust a group of distributed CAs. But in the real world, almost any group of organizations can build auditing systems that govern CAs; that is how everything on the web works with SSL/TLS. Blockchain transactions can be up to six orders of magnitude more expensive than a combination of certificates, digital signatures, Merkle trees, and CAs. That means that in the real world, blockchain is more costly than a standard DIGSIG stack for 99.99% of use cases. The remaining 0.01% consists of technologies like Bitcoin and other cryptocurrencies, which wreak unfathomable environmental damage.
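
For readers who have not seen one, here is a minimal sketch of the Merkle-tree piece of that stack. The transactions are illustrative; the point is that signing a single root hash commits to every transaction in the log, so changing any one of them changes the root.

```python
# A minimal Merkle-tree sketch: hash each transaction, then hash
# pairs of hashes upward until a single root remains. Signing the
# root is enough to detect tampering with any transaction.
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(transactions: list[bytes]) -> bytes:
    level = [sha256(tx) for tx in transactions]
    while len(level) > 1:
        if len(level) % 2 == 1:   # duplicate the last node if odd
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txs = [b"tx-1", b"tx-2", b"tx-3", b"tx-4"]
root = merkle_root(txs)
print(root.hex())  # sign this one hash instead of every transaction
```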

Popularity is Not Suitability

The reason GPT-4 suggested blockchain is NOT that it was a good fit for the task I described. It is that so many people don't understand the basics of trust protocols, and investors with no solutions architects on staff put a lot of money behind blockchain firms. Those investors have now lost most of their investments. They would have been MUCH better off investing in scale-out knowledge graph companies.

The bottom line is that you must ask GPT the right question and give it the proper context. Here is another response:

[Screenshot from ChatGPT: a direct question with some context about the appropriate use of blockchain.]

In the question above, I was careful to state that we could trust a CA and that digital signatures were acceptable. Then we got the correct answer!
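
As an illustration, a programmatic version of the same idea might look like the sketch below, which assumes the OpenAI Python client (openai >= 1.0) and an API key in your environment. The prompt wording paraphrases the constraints from my question; it is not the exact text in the screenshot.

```python
# A sketch of supplying the proper context with the question,
# assuming the OpenAI Python client and OPENAI_API_KEY are set up.
from openai import OpenAI

client = OpenAI()

# The context we were careful to include: a trusted CA exists and
# digital signatures are acceptable. Wording here is illustrative.
prompt = (
    "We are a large organization that already operates a trusted "
    "certificate authority, and digital signatures are an acceptable "
    "form of nonrepudiation. Given that context, what technologies "
    "should we use to build intelligent systems that predict customer "
    "behavior and provide high-quality recommendations?"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```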

Conclusion

I was cautious in stating that it is okay to use ChatGPT to create the first draft of a generative AI strategy and roadmap. However, that document needs to be reviewed by professionals. Otherwise, your company will invest in older but popular technologies, such as Excel spreadsheets and RDBMS systems, that fail to scale and might not be appropriate for your task.


Dan McCreary

Distinguished Engineer who loves knowledge graphs, AI, and Systems Thinking. Fan of STEM, microcontrollers, robotics, PKGs, and the AI Racing League.