written by owen on 2016-Dec-17.
Let me preface this by saying that I have no fear of machines taking over or somehow putting everyone out of work. Machines will definitely put some people out of work, but not everyone. In fact, I think pollution, inflation, crime and poverty are bigger problems than a robot apocalypse. The future robot overlords might even save us from our selfish ways.
That being said, I do not think we are there yet, or even close to simulating the brain and letting computers solve our problems for us. I often think of the brain as a simulation rather than a running program. The only concept I have really delved into or written about until now is my general theory on dreams. In this article I will rant more about artificial intelligence and where I think we are going wrong.
All your data
The current push to gather as much data as possible is, in my view, a dead end. Gathering lots of data will help you solve spelling errors and do search really quickly, but it only works for things you already know. How much data do we really need? No matter how much data you throw into AI, the flaw is still there in the programming model itself. The brain is not asynchronous. A human can have many ideas in play at the same time with little data. But the current trend in AI is pretty much the same as it was in the 80s; we just have more data and faster computers. It's the same pattern matching we have been doing for years. You give the computer a question and it goes down a list of answers.
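The "goes down a list of answers" model can be sketched as a toy lookup. This is only an illustration of the pattern I am criticizing; the `answers` table and `ask` function here are hypothetical, not any real system's API.

```python
# A toy sketch of the pattern-matching model described above:
# the "intelligence" is nothing more than a lookup over answers
# we have already gathered. (Hypothetical illustration only.)

answers = {
    "what is the speed of light?": "299,792,458 m/s",
    "who wrote hamlet?": "William Shakespeare",
}

def ask(question):
    # Go down the list of known answers. Anything outside the
    # data we already collected produces no result at all.
    return answers.get(question.lower(), "no match found")

print(ask("What is the speed of light?"))   # stored beforehand, so it "knows"
print(ask("Why does light have a speed?"))  # outside the data: no match
```

Adding more entries to the table makes the lookup cover more questions, but it never changes what kind of thing the program is doing, which is the point.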
To me pattern matching is too simple a process. I think the brain is doing more than that. Yes, pattern matching is a big part of it, but I would guesstimate that the brain is compiling programs in real time against multiple languages and data sets in a mesh or grid data store - something that we cannot do at the moment - if we could, we would have cool stuff already. It is not matching a pattern but moving to a state where all the cards are close together. The brain's data store has to be some kind of data-meets-program. This data store is not only "data" but a "program", "file system" and bus at the same time. In current computers these are physically separate things, which creates bottlenecks everywhere.
Pattern matching is asynchronous. If we keep going down the road of pattern matching we will just keep churning through more and more data until we run out of bandwidth, storage or time. The brain has to be doing something more clever, along the lines of a real-time linker. The brain might be constantly compiling but never "executes" in the classic sense. Why wait to execute when you can come to conclusions at any time? You just need enough data to act or come to a conclusion.
Is there a Brain BIOS?
Alex proposed a theory about finding the BIOS, some kind of small base program that all humans have in common, but I think it is unlikely. The BIOS concept seems to be what machine learning is targeting: a base set of code into which to put all the world's information, so that we can find a simple pattern which we can use to do pattern matching AGAIN!
But what if there is none? What if the brain contains many such programs? There might be no one central point of operations in the human brain. The machine learning system that you are dumping all your information into might just be doing the same thing with 1 kilobyte of data that it does with 1 terabyte.
Big Data as Artificial Intelligence
AI in its current state is trying to mimic a clever system with lots of data. It's like a person using a bulldozer because they don't know how to use a shovel. And that same person keeps buying bigger and bigger bulldozers and still can't do what a shovel can do, because they are too focused on the big picture. They use the bulldozer in the hope that somewhere along the line they will figure out the shovel.
If I do a web search on "what is the speed of light?" I will get several links to web pages containing information. While this may seem impressive, the act of knowing that light has a "speed" and actively deciding to search for it is the "intelligent" part - not the search itself. But many see the "search" as revolutionary, to the point where they are impressed by Netflix suggestions based on the movies you watch. All this is simply a side effect of having a large database. You could come to a lot of conclusions if you had a million data points, but you would learn very little. In order to learn more you would have to get more data, and more and more, into infinity. The only advantage in big data now is being the data gatekeeper.
Either way, these are all theories. Eventually we might discover a way to store all the world's information and still be alive to see it. One thing is certain: the speed of light is constant. Light speed is the fastest you can compute, and therefore the data you have and the speed of computation are linked. You can't look at big data and interpret it at the same time, and certainly not at the speed of light. There will always be lag. Hence the brain must be more clever than it is fast. If we ever hope to make new strides in AI we need to drop the old hat tricks and focus on being clever as opposed to being fast.
p.s. open sourcing your AI framework is not going to help make it better. It's like throwing bodies at a dead project.