Posts tagged Apple

1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant

privacy, surveillance, Alexa, Google, Siri, Cortana, TV, Amazon, Apple, Microsoft, voice, 2020

As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned that their near-constant listening to nearby conversations could pose more risk than benefit to users. New research suggests the privacy threat may be greater than previously thought. The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences—including those from Game of Thrones, Modern Family, House of Cards, and news broadcasts—that incorrectly trigger the devices.

“The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans,” one of the researchers, Dorothea Kolossa, said. “Therefore, they are more likely to start up once too often rather than not at all.”

That which must not be said

Examples of words or word sequences that provide false triggers include:

Alexa: “unacceptable,” “election,” and “a letter”
Google Home: “OK, cool,” and “Okay, who is reading”
Siri: “a city” and “hey jerry”
Microsoft Cortana: “Montana”

via https://arstechnica.com/information-technology/2020/07/uncovered-1000-phrases-that-incorrectly-trigger-alexa-siri-and-google-assistant/
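The "forgiving" matching the researchers describe can be sketched in a few lines. The toy detector below is a hypothetical illustration, not any vendor's actual pipeline — real assistants score acoustic and phonetic features, not spelled-out words — but the trade-off is the same: lowering the acceptance threshold reduces missed activations at the cost of more false triggers like "Montana" waking a "Cortana" device.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]; 1.0 means identical words."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_triggered(wake_word: str, phrase: str, threshold: float) -> bool:
    """Fire if any word in the phrase is 'close enough' to the wake word."""
    return any(similarity(wake_word, w) >= threshold for w in phrase.split())

# "cortana" vs. "montana" score ~0.71 here, so a forgiving threshold fires...
print(is_triggered("cortana", "I grew up in Montana", threshold=0.7))   # True
# ...while a stricter one rejects it, but would also miss sloppy pronunciations.
print(is_triggered("cortana", "I grew up in Montana", threshold=0.9))   # False
```

Tuning that single threshold is the whole tension: every notch toward "understand their humans" is a notch toward starting up once too often.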

Unacceptable, where is my privacy?

privacy, surveillance, Alexa, Google, Siri, Cortana, TV, Amazon, Apple, Microsoft, voice, 2020

Our setup was able to identify more than 1,000 sequences that incorrectly trigger smart speakers. For example, we found that depending on the pronunciation, «Alexa» reacts to the words “unacceptable” and “election,” while «Google» often triggers to “OK, cool.” «Siri» can be fooled by “a city,” «Cortana» by “Montana,” «Computer» by “Peter,” «Amazon» by “and the zone,” and «Echo» by “tobacco.” In our paper, we analyze a diverse set of audio sources, explore gender and language biases, and measure the reproducibility of the identified triggers.

via https://unacceptable-privacy.github.io/

Dear Tech, You Suck at Delight

Medium, Sara Wachter-Boettcher, AI, UI, tech, Apple, Siri, partial automation

What we’ve found, over and over, is an industry willing to invest endless resources chasing “delight” — but when put up to the pressure of real life, the results are shallow at best, and horrifying at worst. Consider this: Apple has known Siri had a problem with crisis since it launched in 2011. Back then, if you told it you were thinking about shooting yourself, it would give you directions to a gun store. When bad press rolled in, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something Siri identified as suicidal. It’s not just crisis scenarios, either. Hell, Apple Health claimed to track “all of your metrics that you’re most interested in” back in 2014 — but it didn’t consider period tracking a worthwhile metric for over a year after launch.

via https://medium.com/@sara_ann_marie/dear-tech-you-suck-at-delight-86382d101575

Earth to Apple: wireless headphones are like a tampon without a string

Technology, Apple, disruption, innovation, compatibility, tampons, cables, consumerism, capital

As far as style goes, the AirPods resemble the EarPods from the Season 2 episode of Doctor Who in which a megalomaniac billionaire has convinced the populace to purchase the wireless devices as a means to conduct communication and receive all their information, only to turn around and deploy them as a weapon that hacked into their brains and turned them into soulless, emotionless, homicidal metal automatons.

via https://www.theguardian.com/technology/2016/sep/07/apple-airpods-launch-problems-with-wireless-headphones