Artificial intelligence, machine learning and big data are a dead end

posted by owen on Sat, 17th Dec 2016 at 8:36 am .

Let me just preface this by saying that I have no fear of machines taking over or somehow putting everyone out of work. Machines will definitely put some people out of work, but not everyone. In fact, I think pollution, inflation, crime and poverty are bigger problems than a robot apocalypse. Our future robot overlords might even save us from our selfish ways.

That being said, I do not think we are there yet, or even close to simulating the brain and letting computers solve our problems for us. I often think of the brain as a simulation rather than a running program. The only concept I have really delved into or written about until now is my general theory on dreams. In this article I will rant more about artificial intelligence and where I think we are going wrong.

All your data

The current push to gather as much data as possible is, in my view, a dead end. Gathering lots of data will help you fix spelling errors and do search really quickly, but it only works for things you already know. How much data do we really need? No matter how much data you throw into AI, there is still a flaw in the programming model itself. The brain is not synchronous. A human can have many ideas in play at the same time with very little data. But the current trend in AI is pretty much the same as it was in the 80s; we just have more data and faster computers. It's the same pattern matching we have been doing for years: you give the computer a question and it goes down a list of answers.
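To make the "goes down a list of answers" point concrete, here is a toy sketch (all questions and answers made up for illustration) of that model. Adding more data just adds more entries to the list; the system can only ever return things it already knows.

```python
# A minimal lookup-table "AI": walk a stored list of known answers.
# More data means more entries, never new understanding.

KNOWN_ANSWERS = {
    "what is the speed of light": "299,792,458 m/s",
    "who wrote hamlet": "William Shakespeare",
}

def answer(question: str) -> str:
    key = question.lower().strip("?")
    # Go down the list of stored answers, matching the question pattern.
    for known_question, known_answer in KNOWN_ANSWERS.items():
        if known_question in key:
            return known_answer
    return "no match"  # anything outside the stored data is unanswerable

print(answer("What is the speed of light?"))   # 299,792,458 m/s
print(answer("Why does light have a speed?"))  # no match
```

A rephrased question that isn't in the table gets nothing back, which is the flaw in the model: the matching is only as good as the data already gathered.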

Pattern matching

To me pattern matching is too simple a process; I think the brain is doing more than that. Yes, pattern matching is a big part of it, but I would guesstimate that the brain is compiling programs in real time against multiple languages and data sets in a mesh or grid data store - something we cannot do at the moment - if we could, we would have cool stuff already. It is not matching a pattern but moving to a state where all the cards are close together. The brain's data store has to be some kind of data-meets-program. This store is not only "data" but a "program", "file system" and bus at the same time. In current computers these are physically separate things, which creates bottlenecks everywhere.

Pattern matching is synchronous. If we keep going down the road of pattern matching, we will just keep churning through more and more data until we run out of bandwidth, storage or time. The brain has to be doing something more clever, along the lines of a real-time linker. The brain might be constantly compiling but never "executing" in the classic sense. Why wait to execute when you can come to conclusions at any time? You just need enough data to act or come to a conclusion.

Is there a Brain BIOS?

Alex proposed a theory about finding the BIOS, some kind of small base program which all humans have in common, though it seems unlikely to me. The BIOS concept seems to be what machine learning is targeting: a base set of code into which we can pour all the world's information so that we can find a simple pattern which we can use to do pattern matching AGAIN!

But what if there is none? What if the brain contains many such programs? There might be no central point of operations in the human brain. The machine learning system that you are dumping all your information into might just be doing the same thing with 1 KB of data that it does with 1 terabyte.

Big Data as Artificial Intelligence

AI in its current state is trying to mimic a clever system with lots of data. It's like a person using a bulldozer because they don't know how to use a shovel. And that same person keeps buying bigger and bigger bulldozers and still can't do what a shovel can do, because they are too focused on the big picture. They use the bulldozer in the hope that somewhere along the line they will figure out the shovel.

If I do a web search for "what is the speed of light?" I will get several links to web pages containing the information. While this may seem impressive, knowing that light has a "speed" and actively deciding to search for it is the intelligent part - not the search itself. But many see the "search" as revolutionary, to the point where they are impressed by Netflix suggestions based on the movies you watch. All of this is simply a side effect of having a large database. You could come to a lot of conclusions if you had a million data points, but you would learn very little. To learn more you would have to get more data, and more and more, on into infinity. The only advantage in big data right now is being the data gatekeeper.
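Here is a minimal sketch (with made-up watch histories) of the kind of suggestion engine described above. "If you liked X, try Y" falls straight out of co-occurrence counts in a watch-history database; no understanding is involved, just a large enough table.

```python
from collections import Counter
from itertools import combinations

# Hypothetical watch histories: each set is one viewer's movies.
watch_histories = [
    {"Alien", "Blade Runner", "The Matrix"},
    {"Alien", "Blade Runner"},
    {"Blade Runner", "The Matrix"},
]

# Count how often each pair of movies appears in the same history.
pair_counts = Counter()
for history in watch_histories:
    for a, b in combinations(sorted(history), 2):
        pair_counts[(a, b)] += 1

def suggest(movie: str) -> str:
    # The "suggestion" is just the most frequent co-occurring title.
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == movie:
            scores[b] += n
        elif b == movie:
            scores[a] += n
    return scores.most_common(1)[0][0]

print(suggest("Alien"))  # Blade Runner
```

Scale the database up by a few million rows and the suggestions get more impressive, but the mechanism never changes: it is counting, not intelligence.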

Conclusion

Either way, these are all theories. Eventually we might discover a way to store all the world's information and still be alive to see it. One thing is certain: light speed is constant. Light speed is the fastest you can compute, and therefore the data you have and the speed of computation are linked. You can't look at big data and interpret it at the same time, and certainly not at the speed of light; there will always be lag. Hence the brain must be more clever than it is fast. If we ever hope to make new strides in AI, we need to drop the old hat tricks and focus on being clever as opposed to being fast.

p.s. open sourcing your AI framework is not going to make it better. It's like throwing bodies at a dead project.


comments

  1. I think you have some interesting ideas. I think the next couple of decades will really test whether increasing the size of neural nets will allow us to more closely simulate the brain. Some think that until we can build neural nets with trillions of connections, like the brain, we won't know for sure whether scalability is the last barrier. That being said, I tend to agree with you. I think the brain works in a super clever way, and data alone will not equal intelligence. But I do think that a brain, however clever, does not learn and self-organize without lots and lots of data to draw conclusions from. One of the biggest open questions is how to transfer learning. The more tuned the brain is, the less data it seems to need to learn new things. What I think it's doing is extremely hierarchical, which is exactly what deep neural networks are about.

    by Alex Sat, 17th Dec 2016 at 8:00 pm  

  2. Learning is structure and data. If you deepen the net, all you are doing is making the data more complex. And the more complex the data, the harder it will be to process. It's a losing battle.

    by owen Sat, 17th Dec 2016 at 8:21 pm  

  3. the brain does have a BIOS; all babies are born with a set of functions that are common... primitive reflexes. could this be a BIOS?

    by debo Fri, 20th Jan at 10:07 pm  

  4. yes, but there is a big gap between what babies know and Mozart. Figuring out what bridges that gap is the real problem.

    by owen Wed, 15th Feb at 2:26 pm  

  5. Precisely, which is to say the difference between a baby and Mozart is training data

    by Alex Fri, 23rd Jun at 8:27 am  

  6. But of course don't underestimate the BIOS that a baby has built in. Trust me, watching my son grow up I tend to think toddlers are almost as smart as adults, just without enough experience, so they spend more time drawing intuitions from what's in front of them or what they just saw the other day than from a large corpus of weights trained over years and years of experience

    by Alex Fri, 23rd Jun at 8:28 am  

  7. There is a reason why it's important to remember that neural networks are biologically inspired. Just like air travel is biologically inspired. Just like any major artificial contraption. All we are attempting to do is engineer from what we already see works

    by Alex Fri, 23rd Jun at 8:30 am  

  8. So I would argue that machine learning and big data are not a dead end; on the contrary, they are the starting block that will allow us to refine our algorithms once we master them. This is what it looks like Google DeepMind is actually doing

    by Alex Fri, 23rd Jun at 8:32 am  

  9. I see what you are saying, but it's a dead horse; we need to move on from 1s and 0s. There is an upper limit to how much we can process by just adding cores, offloading to the GPU or putting it in the "cloud". Of course there is fun to be had with the present technologies, MAYBE a few more things to discover, but hopefully some of us will start working on newer experimental tech.

    by owen Fri, 23rd Jun at 9:16 am  

  10. Like quantum computers?

    by Alex Fri, 23rd Jun at 1:07 pm  

  11. quantum is a start but seems to be marketing-driven instead of actually useful at solving new computational problems. look at this video;

    by owen Fri, 23rd Jun at 1:33 pm  
