07-27-2019, 12:18 PM   #1
wehateporn
Promoting Debate on GFY
Join Date: Apr 2007
Posts: 27,173
Siri ‘regularly’ records sex encounters, sends ‘countless’ private moments to Apple

Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed.

Recordings from Apple’s Siri voice assistant are fed to human contractors around the world, who grade the AI based on the quality of its response and on whether its activation was deliberate, according to an anonymous contractor who spoke to the Guardian. They claim accidental activations are much more frequent than Apple lets on, especially with Apple Watch users – and want the company to own up to the problem.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower revealed.

Continued https://www.rt.com/news/465181-apple...n-contractors/
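
A rough sketch, not anything Apple has published, of what one of these grading tasks might look like as a data structure, going only on what the article describes: the captured audio, the assistant's response, the accompanying user data, and grader labels for response quality and whether the activation was deliberate. Every class and field name below is hypothetical.

Code:
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GradingTask:
    # Hypothetical schema, assembled only from the article's description.
    audio_clip: bytes                      # the recording sent out for review
    transcript: str                        # what the assistant heard
    assistant_response: str                # what the assistant said or did
    location: Optional[str] = None         # per the article, clips arrive
    contact_details: Optional[str] = None  # with user data attached
    app_data: dict = field(default_factory=dict)

@dataclass
class GraderLabels:
    deliberate_activation: bool            # was the wake word actually intended?
    response_quality: int                  # e.g. a 1-5 rating of the answer

# Example: a grader marks an accidental activation as low quality.
task = GradingTask(audio_clip=b"", transcript="(background conversation)",
                   assistant_response="Sorry, I didn't get that.",
                   location="London")
labels = GraderLabels(deliberate_activation=False, response_quality=1)
print(labels)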
07-27-2019, 01:06 PM   #3
bronco67
Too lazy to set a custom title
Join Date: Dec 2006
Posts: 29,032
Quote:
Originally Posted by wehateporn View Post
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed. ...
No, Siri doesn't do that. RT is full of shit and so are you. Die.
07-27-2019, 01:08 PM   #4
GFED
Confirmed User
Join Date: May 2002
Posts: 8,098
Quote:
Originally Posted by wehateporn View Post
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed. ...
All of the wake-on-voice devices, phones, etc. do this.
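
That is the crux of it: these devices buffer audio locally all the time, but audio only leaves the device after the on-device wake-word detector fires, rightly or wrongly. Below is a minimal Python simulation of that loop; the detector is a random stand-in for a real on-device model, and none of this is any vendor's actual code. Its occasional false accepts are exactly the "accidental activations" the contractor describes.

Code:
import random
from collections import deque

random.seed(42)

PRE_ROLL_FRAMES = 5    # context kept from just before the trigger
CAPTURE_FRAMES = 10    # frames streamed out after a trigger

def wake_word_detected(frame):
    # Stand-in detector: fires on the real wake word, plus ~2% false accepts.
    return frame == "hey_siri" or random.random() < 0.02

def uploaded_audio(stream):
    # Yield the spans of audio that would leave the device.
    ring = deque(maxlen=PRE_ROLL_FRAMES)   # rolling, local-only buffer
    frames = iter(stream)
    for frame in frames:
        if wake_word_detected(frame):
            clip = list(ring) + [frame]
            clip += [f for _, f in zip(range(CAPTURE_FRAMES), frames)]
            yield clip                     # this is what human graders receive
            ring.clear()
        else:
            ring.append(frame)

# 1,000 frames of ordinary background conversation, no wake word at all:
background = ["private_conversation_%d" % i for i in range(1000)]
clips = list(uploaded_audio(background))
print(len(clips), "clips uploaded despite the wake word never being said")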
07-27-2019, 01:13 PM   #5
TrafficTitan
Confirmed User
Join Date: Nov 2012
Posts: 350
If people opted in to this, it wouldn't even be illegal.
07-27-2019, 01:14 PM   #6
wehateporn
Promoting Debate on GFY
Join Date: Apr 2007
Posts: 27,173
Quote:
Originally Posted by bronco67 View Post
No, Siri doesn't do that. RT is full of shit and so are you. Die.
So everything on RT is fake news? Who do you trust then, the Guardian? Maybe the Guardian got tricked by Putin into reporting the same fake story.

https://www.theguardian.com/technolo...iri-recordings
07-28-2019, 09:36 AM   #7
Bladewire
StraightBro
Join Date: Aug 2003
Location: Monarch Beach, CA USA
Posts: 56,229
Quote:
Originally Posted by bronco67 View Post
No, Siri doesn't do that. RT is full of shit and so are you. Die.
Well said
07-28-2019, 01:38 PM   #8
MrBottomTooth
Confirmed User
Join Date: Sep 2009
Posts: 5,795
I can go into my Alexa app and listen to every single time I triggered my Alexa. You can go in and delete the recordings if you want. Amazon admitted they have people analyzing these, no doubt to improve reliability and accuracy. I'm sure all the companies do this. In my case 90% of the recordings are me turning lights on and off.
07-29-2019, 03:42 PM   #9
Bladewire
StraightBro
Join Date: Aug 2003
Location: Monarch Beach, CA USA
Posts: 56,229
Quote:
Originally Posted by MrBottomTooth View Post
I can go into my Alexa app and listen to every single time I triggered my Alexa. You can go in and delete the recordings if you want. ...
Google is creating voice profiles for every person who uses voice-to-text, so that, in the future, your Google personal assistant can speak in your voice, if you choose.

Scary
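
For what it's worth, a "voice profile" in this sense is normally just a speaker embedding: each utterance is mapped to a fixed-length vector, the vectors are averaged at enrollment, and later audio is compared against that average by cosine similarity. A toy numpy sketch of the idea follows; the random projection stands in for a real trained speaker encoder, so the numbers are illustrative only.

Code:
import numpy as np

rng = np.random.default_rng(0)
PROJECTION = rng.standard_normal((40, 16))  # stand-in for a trained encoder

def embed(utterance):
    # Map an utterance (frames x 40 features) to one unit-length 16-dim vector.
    vec = utterance.mean(axis=0) @ PROJECTION
    return vec / np.linalg.norm(vec)

def enroll(utterances):
    # Average several utterance embeddings into a single voice profile.
    profile = np.mean([embed(u) for u in utterances], axis=0)
    return profile / np.linalg.norm(profile)

def verify(profile, utterance, threshold=0.7):
    # Cosine similarity between the stored profile and new audio.
    return float(profile @ embed(utterance)) >= threshold

# Fake speaker: utterances scattered around one characteristic feature vector.
speaker_mean = rng.standard_normal(40)
enrollment = [speaker_mean + 0.1 * rng.standard_normal((50, 40)) for _ in range(5)]
profile = enroll(enrollment)

print(verify(profile, speaker_mean + 0.1 * rng.standard_normal((50, 40))))  # expected: True
print(verify(profile, rng.standard_normal((50, 40))))  # expected: usually False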
07-29-2019, 03:56 PM   #10
pimpmaster9000
Too lazy to set a custom title
Join Date: Dec 2011
Posts: 26,732
This is why I will only buy Huawei from now on... nobody cares if some China dude is spying on them... avoid all US tech, it's all spyware and cannot be trusted...