Demystifying AI: Understanding the human-machine relationship


The artificial intelligence of today has almost nothing in common with the AI of science fiction. In “Star Wars,” “Star Trek” and “Battlestar Galactica,” we’re introduced to robots that behave as we do: they are aware of their surroundings, understand the context of what is happening around them, and can move around and interact with people just as I can with you. These characters and scenarios are postulated by writers and filmmakers as entertainment, and while one day humanity will inevitably develop an AI like this, it won’t happen in the lifetime of anyone reading this article.

Because we can rapidly feed vast amounts of data to them, machines appear to be learning and mimicking us, but in fact they remain at the mercy of the algorithms we provide. The way to think about modern artificial intelligence is to understand two concepts:

  1. Computers can ingest millions of data points per second and make instant calculations and predictions based on this data set.
  2. Very specific rules can be written to help a computer system understand what to do once a calculation is made. (Or, if training a neural network, very specific inputs and outputs must be provided for the data that’s being ingested.)

To illustrate this in grossly simplified terms, imagine the computer system in an autonomous car. Data comes from cameras placed around the vehicle, from road signs, from images that can be identified as hazards and so on. Rules are then written so the system can interpret all of those data points and make calculations that follow the rules of the road. The successful result is a vehicle that drives from point A to point B without (hopefully) making mistakes.
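For readers who want to see just how literal such a system is, here is a toy sketch of the idea in Python. Every name, rule and threshold below is hypothetical and invented for illustration; no real autonomous-driving system works this simply.

```python
# Toy illustration of a "rule-based" decision step.
# All names and thresholds are hypothetical, not from any real system.

def decide_action(sensor_readings: dict) -> str:
    """Apply hand-written rules to already-ingested sensor data."""
    # Rule 1: a detected hazard closer than 10 meters means brake.
    if sensor_readings.get("hazard_distance_m", float("inf")) < 10:
        return "brake"
    # Rule 2: obey a recognized stop sign.
    if sensor_readings.get("sign") == "stop":
        return "brake"
    # Rule 3: otherwise, keep driving toward the destination.
    return "continue"

print(decide_action({"hazard_distance_m": 4.2}))   # brake
print(decide_action({"sign": "stop"}))             # brake
print(decide_action({"hazard_distance_m": 50.0}))  # continue
```

The point of the sketch is the limitation it exposes: the system applies exactly the rules it was given and nothing more. A situation the rules don’t cover simply falls through to the default.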

The important thing to understand is that these systems don’t think the way you and I do. People are remarkably good at pattern recognition, to the point that we see patterns even where none exist. We use this skill to ingest less information and make quick decisions about what to do.

Computers have no such luxury; they have to ingest everything, and, if you’ll forgive the pun, they can’t “think outside the box.” If a modern AI were programmed to understand a room (or any other volume), it would have to measure all of it.

Think of…
