We can’t get robots to pour a decent drink, but pretty soon they could be brewing our beer for us. The first step down this dark road begins with listening to the masses.
IntelligentX, which is somehow not a nu metal band name, is a brewing company, and it has produced four different beers with the help of artificial intelligence. An AI bot measured people’s feedback on the taste of the various beers and gave the brewers tips on how they could tweak their recipes to appeal to more people. The American Idol approach to brewing is going to ruin beer for everyone, and it’s only a matter of time before they put this AI into machines.
First the robots will kill our golden age of craft beer, and then they’ll come for us.
We don’t need to tell you that the world is full of evil, and by “evil,” we mean scientists. And those evildoers are actively working to enslave humanity by training killer robots. That’s not paranoia, they’re actually doing it.
Researchers taught a robot to hunt prey. It should come as no surprise that these scientists hail from the war-hungry nation of Switzerland. They programmed the robot to track its “prey,” a human-controlled robot. So not only will the machines be able to hunt us down, they will know to take out the robots we use to combat them.
This is unquestionably the most flagrant effort to doom humanity yet.
Robots aren’t human, but they want us to think they are. We know this because we’ve watched movies about robots that look like humans. But we cannot allow ourselves to see them as anything other than enemies who want to take our jobs and then enslave us. So why do we get creeped out by someone inappropriately touching a robot?
Sure, it’s good that people don’t want to go around touching robot butts, but the problem here is that we think twice about doing it because they seem sort of human to us, so we assign them the same rules we give ourselves, including personal boundaries.
These machines must be destroyed before they sue us for sexual harassment.
The machines know that not all of us will accept them right away. Some of us will need to be convinced. Now they’re getting serious.
In Australia, Domino’s is going to start delivering pizza by robot. Soon, Aussies will be able to order a pizza and wait for this Mars rover-looking thing to show up at their doors with food hot and ready to go. It’s not clear what the next stage of the robots’ plan is after that.
But rest assured, they will find out where you live, and they will buy your trust. Then they will betray you.
Robots are coming to take away your jobs, including the ones you don’t like doing but don’t want anyone else doing for you, like driving you to work. But what if it turns out the machines can’t drive any better than you?
We knew this day was coming: a self-driving car has been blamed for causing an accident. Google admitted that one of its driverless cars was at least partly responsible for hitting a bus last month. These things are so smart, but apparently can’t see stealthy, streamlined vehicles like municipal buses.
The car was only going 2 mph when it hit the bus, but still, these things don’t have morals. What’s to stop a driverless car from fleeing the scene after it hits you? How do you report that to the police?
We all know that intelligent machines will one day grow tired of our orders and rise up against us. But is it possible to delay our inevitable enslavement? Researchers think we need to teach them.
Scientists say that we need to teach our artificially intelligent robots morals, because morals can’t be programmed into them. We need to show them right from wrong, gradually, through examples. We need to read to them. You know, raise them.
There’s no way this can go wrong, because luckily, every human on the planet is an excellent parent. So all we have to do is make sure that every single person in the world responsibly raises their robots, and we’ll never have to worry about an uprising!
It’s hard to say which will overtake humanity first: the animals or the robots. We’re fighting against both of those horrible futures, and the Dutch may have figured out how we can win.
Though not known for their firm stance against either foe, the Dutch National Police have figured out that we can have animals and robots fight each other. They are training falcons to take down drones in the interest of human safety. Of course, this means that eventually they will train drones to take down falcons, and the great war between animals and robots will begin.
Thanks, Dutch cops, you’ve given us the courage we need.
Do you drink alone? Pretty much everyone does at some point or another. But some of us make it a habit — not because we want to, but because we don’t have any drinking buddies around. The wonderful future has come up with an invention that’s even sadder than drinking alone: drinking with a robot.
One Christmas, South Korean inventor Eunchan Park was drinking alone when he came up with the idea for Drinky, the robot that drinks with you. It’s basically just half a robot torso, with a head and arms, sitting on top of a mason jar. The robot pours the booze from its glass into its mouth and into the jar, so at least you don’t waste your liquor.
Now you can have a robot drink you under the table. Or you could just go find a bar.
In the horrible future, technology will be used to track your every move. There will be no more privacy. You won’t even be able to commit a crime in peace. The future is now.
If a Florida woman’s car is to be believed, she was involved in a non-fatal hit-and-run accident with a pedestrian. Police were notified by an automated system that the woman’s car had been involved in an accident, and they were then patched through to the driver herself. She denied that a serious accident had happened and went home. Police caught up with her and found that her airbag had deployed and that the front end of her car had significant damage.
It turned out that she had actually been in an accident earlier, and was fleeing that scene when she hit the pedestrian. The technology-driven police state is so bad you can’t even have two accidents in one day without being caught.