TRC #395: Tribeca vs Vaxxed + Double-Dipping + Tay: Microsoft’s AI “Racist” Chatbot + Energy-Infused Beauty Products

Darren kicks off the show with a quick panel discussion about Tribeca Film Festival’s decision to pull a controversial documentary linking vaccines with autism from their line-up. Next, Pat dives right into a listener email about double-dipping. Adam looks into how the internet corrupted Microsoft’s innocent new AI chatbot Tay in less than 24 hours. Finally, Cristina gets goopy with a segment about energy-infused beauty products.

Download direct: mp3 file

If you like the show, please leave us a review on iTunes!


CBC – The Nature of Things

Double Dipping: The New Challenge for Health and Safety

The Conversation: Double Dipping Food Safety

The MythBusters: Double Dipping

Seinfeld: Double Dip

Tay: Microsoft’s AI “Racist” Chatbot

Microsoft apologizes for AI chatbot Tay’s ‘offensive and hurtful’ tweets

TayTweets – Twitter

Meet Tay – Microsoft A.I. chatbot with zero chill

Tay Development General – /pol/ – 4chan

Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day

Microsoft deletes ‘teen girl’ AI after it became a Hitler-loving sex robot within 24 hours

Here Are the Microsoft Twitter Bot’s Craziest Racist Rants

TweetsFromTay – Reddit

Energy-Infused Beauty Products

Slate: The Goop-iest Goop That Ever Gooped: Skincare Products Nurtured with Chants and Music

Gwyneth Paltrow Endorses Chant Infused Skincare Creams in Newest Goop Newsletter

Goop: The New Secret Beauty Formula: Intention

Skepchick: This Obnoxious Face Lotion is “Chemical-Free” and Infused with Magical Thinking

Wiki: Masaru Emoto

Shop Goop

Patheos: Gwyneth Paltrow is Now Promoting Skincare Products That Have Been Meditated Over

de Mamiel

10 of the World’s Deadliest Plants — And How They Kill You

NCBI: Centella asiatica in cosmetology


4 Responses to TRC #395: Tribeca vs Vaxxed + Double-Dipping + Tay: Microsoft’s AI “Racist” Chatbot + Energy-Infused Beauty Products

  1. Suzy Kulshrestha says:

    I listened to your podcast about double dipping and have some comments about the experiments you discussed about bacteria in the dip. Since the point of all this is to decide whether you are more likely to get sick, the total number of bacteria may be a poor way of measuring that likelihood. For instance, if the bacteria that grow in the salsa are lactobacilli, you may just be starting a harmless fermentation. However, if pathogenic bacteria are growing, that’s a bigger problem. The biggest problem of all, though, is probably virus contamination, such as cold, flu, and norovirus, and you haven’t mentioned any studies of those.

  2. Michael S. says:

    May I suggest an episode related to the Tay discussion?
    Despite how polite they seem (or seem not) to be, I find the growing use of “she” or “her” (or he/him) when referring to AI to be pretty creepy.
    Perhaps I’m too skeptical or conservative, but just as my desktop’s Excel application is just an “it”, and just as my phone’s search-agent app is not human, AIs are never other people (or even their pets).
    An AI’s “voice” is just a software interface configured by its designers to appear user-friendly to their customers, no matter how smart it seems; so it seems inappropriate to start granting aspects of person-hood to anything like “Tay” (or Siri or Cortana).
    Shouldn’t we therefore reserve gender-based, third-person pronoun usage for other living creatures, and not for inanimate technology?


Comments are closed.