Well that's what Marvin Minsky said recently at Boston University, according to an article in Wired. Minsky is in a good position to judge, since he's considered one of the founding fathers of AI, having co-founded the MIT Artificial Intelligence Laboratory with John McCarthy back in 1959. Minsky "accused researchers of giving up on the immense challenge of building a fully autonomous, thinking machine."
Read the Wired article and come back for my take on it...
...I think Minsky has a valid point. There are two issues that have troubled me in recent years. The first is the rise in popularity of competitions within AI: RoboCup, trading agents, computer poker, etc. These are very popular with grad students, who like to win, but it's questionable whether they bring much benefit to AI as a whole. Take RoboCup, the robot soccer competition: several years ago RoboCup was co-located with IJCAI, the main biennial AI conference, and more delegates were registered for RoboCup than for IJCAI! That can't be right - it means more people were working on the narrow application of making robots kick a ball about than in the entire discipline of artificial intelligence. That's not balanced or healthy for the development of the subject.
I've also noticed over the last decade and a half - largely a reaction to AI's failure with expert systems in the 1980s - the rise of what I call the "smart algorithms approach." That is, given a problem, AI researchers now typically attack it using machine learning methods and never attempt to use any explicit knowledge, even when that knowledge is readily available. This approach has been successful, but eventually AIs will need to be able to use knowledge, codify it, pass it amongst themselves, and generalize knowledge from their experience. Consider an analogy: whom would you trust more, a doctor who said "take this drug, it always seems to work," or a doctor who said "take this drug, it always seems to work because it inhibits the protein receptors on the virus and interferes with its reproduction"? The latter has explicit knowledge to support an observation based on data; the former just has data.
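To make the contrast concrete, here's a toy Python sketch of the two doctors. Everything in it (the drug names, the success rates, the recorded mechanism) is made up for illustration; it isn't anyone's real system, just the difference between reporting a correlation and also holding an explicit, reusable piece of knowledge about why it works.

```python
# Toy illustration only: hypothetical drugs and success rates.

# "Smart algorithms" style: all the system has is observed outcome data.
observed_success = {"drug_A": 0.92, "drug_B": 0.41}

def data_only_advice(outcomes):
    best = max(outcomes, key=outcomes.get)
    return f"Take {best}; it always seems to work."

# Knowledge-based style: the same data, plus explicit knowledge the system
# can report, pass on, and potentially generalize from.
mechanisms = {
    "drug_A": "inhibits the protein receptors on the virus and "
              "interferes with its reproduction",
}

def knowledge_backed_advice(outcomes, kb):
    best = max(outcomes, key=outcomes.get)
    why = kb.get(best, "no explicit mechanism recorded")
    return f"Take {best}; it always seems to work because it {why}."

if __name__ == "__main__":
    print(data_only_advice(observed_success))
    print(knowledge_backed_advice(observed_success, mechanisms))
```

Both give the same recommendation, but only the second can explain it - and that explanation is exactly the kind of explicit, shareable knowledge the smart-algorithms approach leaves out.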
Minsky is right: AI has to get back to explicitly handling knowledge, both expert and common sense.