• 0 Posts
  • 3 Comments
Joined 9 months ago
Cake day: February 28th, 2024

  • I want to clarify my point about language not being sufficient, since it seems to have been misunderstood. When you use an LLM, you may observe that there are certain ideas and concepts it does not understand, and adding more words to the language doesn’t help. There are other parts of the human mind that do not process language at all: visual processing, strategic and tactical analysis, anger, lust, brainstorming, creativity, art; the list goes on.

    To rival human intelligence, it isn’t enough to build bigger and bigger language models. Human intelligence comprises so many distinct mental abilities that nobody has ever managed to write them all down. Instead, we need to solve many separate problems (vision, language, goals, altruism/alignment, and so on) and then figure out how to integrate all those solutions into a single coherent process. And that process needs to learn quickly and efficiently, without requiring prohibitive resources to do it.

    If you think that’s impossible, take a look in the mirror.



  • LLMs are only one kind of AI program. How smart would we be if we only used the speech areas of our brains? It’s important to be able to complement language with other kinds of thinking.

    The problem with neural network technology is the vast computational resources it requires to learn. The brain also requires enormous computing power, but brains grow organically and run efficiently on corn and beans.

    To compete, AI systems will need to become much more efficient in how they learn and process. Venture capital only goes so far. The subscription fees for ChatGPT don’t earn enough to cover even the electricity costs of running the system.