Darren kicks off the show with a quick panel discussion about Tribeca Film Festival’s decision to pull a controversial documentary linking vaccines with autism from their line-up. Next, Pat dives right into a listener email about double-dipping. Adam looks into how the internet corrupted Microsoft’s innocent new AI chatbot Tay in less than 24 hours. Finally, Cristina gets goopy with a segment about energy-infused beauty products.
Download direct: mp3 file
If you like the show, please leave us a review on iTunes!
Double-Dipping
Double Dipping: The New Challenge for Health and Safety
The Conversation: Double Dipping Food Safety
The MythBusters: Double Dipping
Tay: Microsoft’s AI “Racist” Chatbot
Microsoft apologizes for AI chatbot Tay’s ‘offensive and hurtful’ tweets
Meet Tay – Microsoft A.I. chatbot with zero chill
Tay Development General – /pol/ – 4chan
Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day
Microsoft deletes ‘teen girl’ AI after it became a Hitler-loving sex robot within 24 hours
Here Are the Microsoft Twitter Bot’s Craziest Racist Rants
Energy-Infused Beauty Products
Slate: The Goop-iest Goop That Ever Gooped: Skincare Products Nurtured with Chants and Music
Gwyneth Paltrow Endorses Chant Infused Skincare Creams in Newest Goop Newsletter
Goop: The New Secret Beauty Formula: Intention
Skepchick: This Obnoxious Face Lotion is “Chemical-Free” and Infused with Magical Thinking
Patheos: Gwyneth Paltrow is Now Promoting Skincare Products That Have Been Meditated Over
I listened to your podcast about double dipping and have some comments about the experiments you discussed on bacteria in the dip. Since the point of all this is to decide whether you are more likely to get sick, the total number of bacteria may be a poor way of measuring that likelihood. For instance, if the bacteria that grow in the salsa are lactobacilli, you may just be starting a harmless fermentation. If pathogenic bacteria are growing, however, that’s a bigger problem. The biggest problem is probably viral contamination, such as cold, flu, and norovirus, and you haven’t mentioned any studies of those.
Well said. I came here to pretty much say the same thing.
May I suggest an episode related to the Tay discussion?
However polite they may (or may not) seem, I find the growing use of “she” or “her” (or “he”/“him”) when referring to AI to be pretty creepy.
Perhaps I’m too skeptical or conservative, but just as my desktop’s Excel application is merely an “it,” and just as my phone’s search-agent app is not human, AIs are never other people (or even pets).
An AI’s “voice” is just a software interface configured by its designers to appear user-friendly to their customers, no matter how smart it seems; so it seems inappropriate to start granting aspects of personhood to anything like “Tay” (or Siri or Cortana).
Shouldn’t we therefore reserve gender-based, third-person pronoun usage for other living creatures, and not for inanimate technology?