Image courtesy of pxhere.com
Computers Have Changed the World
Face it, since the 1980s the
personal computer has changed the world as we know it. Before Apple and IBM started offering
computers to the masses, the world was a much
different place. We weren’t as connected. Life ran at a slower pace. The world seemed
bigger. However, the advent of the
microchip and everything that goes along with it has forever changed the ways
in which we communicate, educate, shop, do business and entertain ourselves. In fact, just about the only thing that hasn’t changed in the past 25 years is that we still control the machines that reside at the heart of every PC, tablet, smartphone, automobile, airplane and power plant. Without us, they would be nothing more than die-cast doorstops about as smart as an anvil.
While science fiction novels
and movies galore speak of the wonder and the horror of thinking machines, the
fact is that there still aren’t any machines on the planet that can self-program
or learn from their mistakes. That’s the
good news. The bad news is that the day
is coming sooner than you think when computers will be able to think for themselves.
Computer Games Have Been the Springboard to AI
Computers are really good at
games. The reason is that games have
rules. Programming the rules into a
computer is fairly straightforward. Once
programmed, a game-playing computer has a distinct advantage over a human because computers can perform hundreds of millions of calculations per
second. This was first brought to light
in a big way when, in 1997, the IBM computer "Deep Blue" beat reigning world chess champion Garry Kasparov.
Deep Blue (Photo credit: James the photographer)
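To make that point concrete, here is a minimal sketch in Python of what "program the rules, then search faster than a human" looks like. It uses tic-tac-toe and the textbook minimax algorithm purely as an illustration; the function names (winner, minimax) are my own, and this bears no resemblance to Deep Blue's custom chess hardware or its actual search methods.

# Once the rules of a game are encoded, the machine can exhaustively
# explore future positions far faster than any person. Tic-tac-toe is
# small enough to search completely from the opening move.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player owns a full line, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, best move) for the side to move; X maximizes, O minimizes."""
    win = winner(board)
    if win == 'X':
        return 1, None
    if win == 'O':
        return -1, None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None  # draw
    best_score, best_move = None, None
    for move in moves:
        board[move] = player
        score, _ = minimax(board, 'O' if player == 'X' else 'X')
        board[move] = ' '
        if best_score is None or \
           (player == 'X' and score > best_score) or \
           (player == 'O' and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move

score, move = minimax([' '] * 9, 'X')
print("Best opening square for X:", move, "- outcome with perfect play:", score)

Run as written, the sketch examines every possible game in a fraction of a second, something no human player could do by hand.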
While beating Kasparov was an impressive feat, "Deep Blue" and its successors are extremely limited in what they can accomplish and how they can interact with humans. All they do is play chess. Not only are they unable to hold a conversation about the nuances of the game, they don't even understand what the word nuance means.
However, all that changed in 2010 with the creation of the computer
known as "Watson."
IBM's Watson computer, Yorktown Heights, NY (Photo credit: Wikipedia)
“Since Deep Blue's
victory over Garry Kasparov in chess in 1997, IBM had been on the
hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over
dinner with coworkers, noticed that the restaurant they were in had fallen
silent. He soon discovered the cause of this evening hiatus: Ken Jennings, who was then in the middle of
his successful 74-game run on Jeopardy! Nearly the entire
restaurant had piled toward the televisions, mid-meal, to watch the phenomenon.
Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the
idea on, and in 2005, IBM Research executive Paul Horn backed Lickel up, pushing for someone
in his department to take up the challenge of playing Jeopardy! with an IBM system. Eventually David Ferrucci took him up on the
offer.
In initial tests run during 2006, Watson was given 500
clues from past Jeopardy programs. While the best real-life
competitors buzzed in half the time and responded correctly to as many as 95%
of clues, Watson's first pass could get only about 15% correct. During 2007,
the IBM team was given three to five years and a staff of 15 people to solve
the problems. By 2008, the
developers had advanced Watson such that it could compete with Jeopardy! champions. By February 2010, Watson could beat
human Jeopardy! contestants
on a regular basis.”
Doctor Watson has a PhD?
More incredibly, after retiring from television, "Watson" was repurposed in 2013 to provide management decisions in lung cancer treatment at Memorial Sloan-Kettering Cancer Center. IBM
Watson’s business chief Manoj Saxena says that 90% of nurses in the field who
use Watson
now follow its guidance. IBM
is also looking at the possibility of using "Watson" for legal research.
Watson, Ken Jennings, and Brad Rutter in their Jeopardy! exhibition match
While the software upon which Watson is based is available to large corporations and research centers (such as Rensselaer Polytechnic Institute), a system that meets the minimum requirements currently costs more than one million dollars. However, as computer chips become faster and less costly, it won’t be long before this kind of technology makes it to the masses. When you realize that the computing power available in today’s smartphones is superior to that used to fly the Space Shuttle, this claim is hardly beyond the realm of possibility.
According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."
Moore’s Law Still Means More!
Moore’s Law states that computing power, more precisely the number of transistors that can be packed onto a chip, doubles approximately every two years. This little gem was coined by Intel co-founder Gordon Moore back in 1965, when he published a paper noting that the number of components in integrated circuits had doubled every year since their invention in 1958. While this trend has slowed slightly over the intervening forty-eight years, this exponential growth has directly influenced every aspect of the electronics industry and brought us closer to the point where computers will be able to think for themselves.
Moore's Law, The Fifth Paradigm.
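As a quick back-of-the-envelope check on what that doubling implies, here is a short Python snippet. It is only an illustrative calculation using the two-year doubling figure quoted above (Moore's original 1965 observation concerned component counts doubling roughly every year), and the helper name growth_factor is my own.

# Rough arithmetic behind Moore's Law as described above:
# one doubling every two years, compounded over 48 years.

def growth_factor(years, doubling_period_years=2):
    """How many times more powerful after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

print(f"Growth over 48 years: about {growth_factor(48):,.0f}x")  # roughly 16.8 million times

Even allowing for the slowdown mentioned above, compounding on that scale is why a phone in your pocket now outclasses the computers that flew the Space Shuttle.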
"By producing computer chips that allow computers to learn for themselves, we have unlocked the next generation of computers and artificial intelligence," Mr Van Der Made says. “We are on the brink of a revolution now where the computers of tomorrow will be built to do more than we ever imagined. Current computers are great tools for number crunching, statistical analysis, or surfing the Internet. But their usefulness is limited when it comes to being able to think for themselves and develop new skills," he says.
“The synthetic
brain chip of tomorrow can evolve through learning, rather than being
programmed.”
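To give a feel for what "learning rather than being programmed" means in practice, here is a deliberately tiny Python sketch: a single perceptron that is never told the rule for a logical AND gate, but discovers it from examples by nudging its weights. This is only an illustration of the general idea of machine learning; it is not a model of the brain-inspired hardware Van der Made describes, and the names (examples, predict) are purely my own.

# A toy example of learning from data instead of explicit programming:
# a single perceptron learns the logical AND function from four examples.

import random

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)
learning_rate = 0.1

def predict(x):
    """Fire (output 1) if the weighted sum of the inputs exceeds zero."""
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

# No one writes the AND rule down; the program corrects its own mistakes.
for epoch in range(20):
    for x, target in examples:
        error = target - predict(x)
        weights[0] += learning_rate * error * x[0]
        weights[1] += learning_rate * error * x[1]
        bias += learning_rate * error

for x, target in examples:
    print(f"AND{x} -> learned answer {predict(x)}, expected {target}")

The same principle, scaled up enormously and baked into silicon, is what the "synthetic brain chip" quote above is pointing toward.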
Van der Made goes on to say in his book that he and his colleagues have already been able to simulate many of the functions of the human brain and convert them into hardware that learns without the intervention of a programmer. If he is correct, the next few years could see a paradigm shift more earth-shattering than the advent of the silicon chip. Already we are seeing autonomous aerial vehicles flying the friendly skies and driverless vehicles plying the highways of Southern California. With a few more iterations of Moore’s Law and a bit of tinkering, will we shortly be on the verge of intelligent systems, everyday robotics and machines that can outthink their makers?
If
this isn’t quite the case yet, all I can say is, “Game on!”
If you like this article, you can find more by typing "robots" or "artificial intelligence" in the search box at the top left of this blog. If you found this article useful, share it with your friends, family and co-workers. If you have a comment related to this article, leave it in the comments section below. If you would like a free copy of our book, "Internet Marketing Tips for the 21st Century," fill out the form below.
Thanks for sharing your time with me.
Since 1995, Carl Weiss has been helping clients succeed
online. He owns and operates several online marketing businesses, including Working the Web to Win and Jacksonville Video Production.
He also co-hosts the weekly radio show, "Working the Web to Win,"
every Tuesday at 4 p.m. Eastern on BlogTalkRadio.com.