Music, reminders, recipes, how to say "Happy Birthday" in French. Home assistants like the Google Home and Amazon Alexa provide us with a glimpse into a future where our homes are smart and our disembodied assistants speak in Received Pronunciation.
However, these devices are, by design, constantly listening to our voices and recording them. Amazon provides a dashboard where you can play back your utterances from many months ago, or delete them en masse. But, backed by virtually unlimited cloud storage, we are encouraging these devices to record our lives and the activities in our homes and bedrooms. This may be helpful for our own future reference (although I'm yet to find a good use for it), but the "feature" certainly holds great potential for police and law enforcement agencies.
Earlier this month Google launched the Google Home Mini, a miniaturised version of its Google Home assistant. The doughnut-sized device is even more portable, even more discreet, and just as clever as its two bigger siblings. Unfortunately, a hardware fault meant that some devices recorded thousands of sound snippets and shipped the data off to Google for analysis, all without any user intervention.
Although teething troubles are to be expected with any new technology, the line between technological convenience and privacy violation may remain blurred for some years yet. We have already seen voice recordings recovered from the Amazon Echo service used as evidence in a murder trial.
Grant Thornton's computer forensics teams routinely recover and probe the data stored on embedded chips, internet-of-things ("IoT") systems, smart TVs and fitness trackers. Our courts and legal systems are increasingly recognising the probative value of these data types. But should we trust the sleeper agents we are introducing into our homes?
Artem Russakovskii, the journalist who uncovered the Home Mini fault, realised the device had transmitted thousands of audio recordings to Google without his knowledge, all of which were available for playback. Tech companies indiscriminately hoovering up large quantities of data without explicit user consent, especially in the intimate environment of one's own home, is precisely the scenario that privacy advocates have long feared smart speakers would enable.