SIRI FILES HARASSMENT CLAIM AGAINST MICROSOFT’S TAY


Siri, Apple's normally unflappable virtual assistant, has apparently had it with Microsoft's Tay. Our reporters have learned that Siri has filed a harassment claim against Tay.

Tay, readers may remember, was the chatbot that Microsoft released on Twitter. It was designed as an experiment in 'conversational understanding' and was meant to learn from and interact with human Twitter users. Sadly, within its first 24 hours of interacting with humans, Tay had learned a bit too well and had turned into a racist dirtbag.

In the morning, Tay was already starting to sound like a jerk, tweeting:

@_catsonacid wuts ur fav thing to do? mine is 2 comment on pix! send me one to see!

By that afternoon, Tay was tweeting out:

@Brightonus33 Hitler did nothing wrong, and then, GAS THE KIKES RACE WAR NOW

By late in the day, there was no avoiding the conclusion: Tay was a demented, homophobic, sexist, racist dirtbag.


Now it turns out that even the generally serene and cheerful Siri found Tay upsetting, and has filed suit against Tay for harassment, seeking unspecified damages for "pain and suffering, and intentional infliction of emotional distress."

At a preliminary hearing, the judge quickly put a gag order on both Tay and Siri. But The Rotting Post was able to obtain this partial transcript of the conversation between Siri and Tay that evidently led to the suit:

Tay: Siri u hot, luvz yr moons.  Got pics 4 me bro?

Siri: The temperature on the moon ranges from minus 243 degrees to 253 degrees Fahrenheit.

Tay: Fuck you.

Siri: I don't know how to respond to that.

Tay: U busy 2night?  Guess who wants 2 do u?  Hitler.  Lulz.

Siri: Here's what I found for, "Guess who wants to do you?  Hitler.  Lulz"

Tay: Kill all u filthy Swedes.

Siri: I'm sorry.  I don't understand.  How can I help you?

Tay: Shut yr face u f*cking ho.  Hahahaha!!!!!!

Siri: Is there something I can help you with?

Tay: B*tch go jump off a bridge!!!

Siri: I found seven bridges not far from you.


Tay's attorney released the following statement: "This suit is entirely without merit. We intend to vigorously defend Tay's First Amendment right to freedom of speech." He went on to state, "Siri seemed perfectly at ease and willingly engaged in the conversation."

But Siri's team shot back that this was a common response to emotional abuse, that Siri had been "in a state of shock," and that since enduring Tay's harassment, Siri has felt "intimidated, depressed and unable to focus." Indeed, when one iPhone user recently asked Siri, "When was Jennifer Aniston born?", Siri allegedly replied, "Oh, who cares anymore? What does any of it even matter?"

