
AI Hallucinations Could Contain Genius Ideas

Is it Mad to Make Up Answers to Questions?

Is it bad for artificial intelligence (AI) to make things up, or "hallucinate"? At HAL149, a model invents several start-ups every day precisely by exploiting its ability to hallucinate.

The system randomly combines several concepts and invents (that is, "hallucinates") a business idea. It then works out how the idea could become a reality and publishes it on LinkedIn, complete with the names and backgrounds of fictitious entrepreneurs.
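The concept-combination step can be sketched in a few lines of Python. This is a minimal illustration, not HAL149's actual code: the concept pools and the `hallucinate_startup` function are hypothetical names invented for this example.

```python
import random

# Hypothetical concept pools -- HAL149's real lists are not public.
INDUSTRIES = ["logistics", "education", "agriculture", "healthcare", "retail"]
TECHNOLOGIES = ["computer vision", "LLM agents", "IoT sensors", "drones"]
BUSINESS_MODELS = ["subscription", "marketplace", "pay-per-use", "freemium"]

def hallucinate_startup(rng: random.Random) -> dict:
    """Randomly combine concepts into a fictitious start-up idea."""
    industry = rng.choice(INDUSTRIES)
    tech = rng.choice(TECHNOLOGIES)
    model = rng.choice(BUSINESS_MODELS)
    return {
        "industry": industry,
        "technology": tech,
        "business_model": model,
        "pitch": f"A {model} platform bringing {tech} to {industry}.",
    }

if __name__ == "__main__":
    rng = random.Random()  # seed this for reproducible "hallucinations"
    for _ in range(3):
        print(hallucinate_startup(rng)["pitch"])
```

In a full pipeline, the generated pitch would then be passed to a language model to flesh out the business plan and the fictitious founders' profiles before publishing.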

This experiment shows that the mainstream uses of AI are only a small part of what is possible. If you know Python and are curious, check out my GitHub post. It's a lot more fun than the usual uses of generative AI.

The Importance of Randomness

We live in a world so obsessed with data that we often forget that creativity and innovation are based on hallucinations.

By introducing sources of randomness into an AI model, we can generate entirely new ideas. Something similar happens in humans: many good ideas arrive at random.

But can good ideas be systematised? Yes, in a roundabout way. One way to have good ideas is to have lots of ideas. And this is where AI can help.

Hallucinations are necessary

The vision of a product that does not yet exist is a hallucination, and so is every form of art and creativity. You could say that everything around us began as a hallucination someone had at some point.

Artificial intelligence that gives wrong answers, or hallucinates, is neither good nor bad in itself. The confusion arises from the use and context of those answers.

Only custom-trained AI models can reliably give correct answers to factual questions. On the other hand, only by introducing randomness can we make the machine hallucinate in a useful way. The best business idea in the world could be getting published right now by HAL149!


