If you’re not familiar with The Mitchells Vs The Machines, please reference the following clip, wherein two robots attempt to identify a strange-looking dog. Hilarity ensues.
This is an iconic example of the “machine freaks out” trope, but it’s so much more. First and foremost, did you know that, minus the sparks and the Exorcist-style 360-degree head rotation, this is an actual challenge in machine vision? People build datasets to teach machines to make subtle distinctions, and one of them is, in fact, dogs vs. baked goods.
Chihuahua or Blueberry Muffin?
Labradoodle or Fried Chicken?
In real life, machines don’t explode or shoot sparks out of their faces just because they can’t figure out what they’re looking at. They guess and move on. They’re also (probably) never going to need to worry about whether they should eat that muffin in front of them or whether it’s actually a chihuahua. Nevertheless, we want our machines to be as accurate as possible, because the things they do can be highly sensitive. We don’t want a machine identifying cancer cells to be wrong, for instance, and a self-driving car ought to be able to tell the difference between a bucket of KFC left in the road and a live animal.
What may be easy for us to distinguish as humans can be challenging for an AI, and we can’t always guess what will trip our synthetic friends up. “Animals that look like things” has become a popular meme, and not every image I show here is actually used with vision recognition systems, but in general, datasets like these serve an important purpose as we increasingly rely on computer vision in our day-to-day lives.
Duckling or Plantain?
Puppy or Bagel?
Name: Eric and Deborahbot 5000
Origin: The Mitchells Vs The Machines (2021)
Likely Architecture: Reinforcement learning for control; convolutional neural networks for vision processing, with a haywire ambiguity-resolution system; and transformers for speech and language.
Possible Training Domains: Since the cell phone character can see and speak, these capabilities were probably ported from her and fine-tuned. Motor control could be learned in a virtual environment and perfected in real-world experiments.
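The convolutional networks mentioned above work by sliding small learned filters across an image to build up feature maps, and it’s those learned features that separate a chihuahua from a muffin. Here’s a minimal sketch in plain NumPy of a single convolutional layer responding to a vertical edge; the filter values and the tiny toy “image” are invented for illustration, not taken from any real classifier.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid (no-padding) cross-correlation of a 2-D image with a 2-D kernel."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            # Each output pixel is the filter's response at one location.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Keep positive responses, zero out the rest."""
    return np.maximum(x, 0.0)

# A toy 6x6 grayscale patch: dark on the left, bright on the right,
# i.e. a vertical edge down the middle.
image = np.array([[0, 0, 0, 1, 1, 1]] * 6, dtype=float)

# A hand-made vertical-edge filter; a real CNN learns these from data.
kernel = np.array([[-1, 0, 1],
                   [-1, 0, 1],
                   [-1, 0, 1]], dtype=float)

feature_map = relu(conv2d(image, kernel))
print(feature_map)  # strong responses only where the edge sits
```

A trained network stacks many such layers, so early filters like this edge detector feed into later ones that respond to muzzles, ears, or blueberry-shaped blobs.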
I take requests. If you have a fictional AI and wonder how it could work, or any other topic you’d like to see me cover, mention it in the comments or on my Facebook page.