Just last week, a report by The Guardian dug into a program in which third-party contractors listened to anonymized recordings of Apple users' Siri requests to grade the assistant's responses, and now Apple has suspended it. In a statement to TechCrunch, the company said that while it conducts a "thorough" review, it's suspending the program globally. The move comes shortly after Google announced it would temporarily pause a similar effort, though only for users in the EU.
While Apple has touted the privacy built into its products and derided business models that mine user data for advertising, it relies on real people to improve its AI assistant, just like Amazon and Google. However, as The Guardian's report indicated, listening to real-world recordings can mean picking up all kinds of situations, including criminal activity and sexual encounters. As TechCrunch notes, Apple's terms of service indicate that these programs exist, but it's unclear how much end users understand about the possibility of being overheard by a real person, even if less than one percent of queries are ever reviewed.
While we don't know what will happen to the program or when it may restart, Apple says a future software update will let users explicitly choose whether they want to participate in grading.
Apple:
We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.
Source: TechCrunch, Axios