The Rise and Danger of Virtual Assistants in the Workplace

8 Min Read By: Lenore Benessere, Robert D. Lang

IN BRIEF

  • Virtual assistants likely will become a mainstay in the workplace, so how should lawyers counsel their clients on their use?
  • Virtual assistants' processing and retention of your interactions in the Cloud raises privacy concerns and potentially creates a discoverable and admissible record with each use.
  • Lawyers will play a crucial role in developing this new area of law.

“Don’t ever say anything you don’t want played back to you someday.” This famous quote from Mafioso John Gotti is not the most likely advice that we would think to give to our clients. After all, law school and years of practice have taught us to counsel them on the need for good record-keeping practices to aid in prosecuting a lawsuit or ensuring a meaningful defense. We, especially those who are in-house counsel, are also likely to dispense advice regarding how to avoid litigation altogether by creating processes and providing training to ensure that clients and their employees are aware of their contractual obligations and comply with them. Although these tasks are still the linchpin of sound lawyering, a new area of concern has emerged.

We would be remiss if we did not counsel our clients on the impact of virtual assistants like Amazon's Alexa, Apple's Siri, Google's Assistant, and Microsoft's Cortana, which rely on speech-recognition technology to listen to and record our every word. When it comes to the impact of recorded statements, Gotti may be an expert, and his advice is perhaps some of the best that we can give to our clients. In some respects, these devices closely mimic wiretapping and may be used both intentionally and unintentionally to this end.

Although the term “speech recognition” sounds complex, it refers simply to what these virtual assistants do to understand our commands to call our friends, play music, or add events to our calendars. Those in the technology sector define it as “the ability to speak naturally and contextually with a computer system in order to execute commands or dictate language.” At this point, most of this technology has become so precise that a simple command, or “wake word” (e.g., “Alexa”), allows us to ask our virtual assistants a myriad of questions, from “how long is my commute to the office?” to “when is Presidents’ Day this year?” In fact, reviewers of Alexa and her technological siblings (Siri and Cortana) distinguish them from first-generation voice assistants because of this “responsiveness.” They praise the technology for doing away with an “activation button,” which allows users to “simply say the trigger word (either ‘Alexa,’ ‘Echo,’ ‘Amazon,’ or ‘Computer’) followed by what you want to happen.” Our ability to speak to Alexa, which is essentially a hands-free speaker controlled by voice, is what we as users find both novel and convenient. It is what allows us to play music while typing an e-mail, or to add an appointment to our calendars without opening Outlook.
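The wake-word pattern described above can be sketched in a few lines. This is a hypothetical illustration of the general gating logic, not Amazon's actual implementation; the function name and word list are assumptions for the example.

```python
# Hypothetical sketch of wake-word gating: utterances are ignored unless
# they begin with a trigger word, in which case the remainder is treated
# as a command. Not Amazon's implementation.
WAKE_WORDS = {"alexa", "echo", "amazon", "computer"}

def handle_utterance(transcript: str):
    """Return the command portion if the utterance starts with a wake word,
    otherwise None (the assistant stays idle)."""
    words = transcript.lower().split()
    if words and words[0].strip("?!,.") in WAKE_WORDS:
        return " ".join(words[1:])  # everything after the trigger word
    return None

# e.g. handle_utterance("Alexa, how long is my commute?")
#      -> "how long is my commute?"
```

The key point for the privacy discussion that follows is that everything after the trigger word is captured and processed, which is precisely the material that ends up stored in the Cloud.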

Amazon.com, Alexa’s creator, boasts that the Alexa Voice Service, which is integrated into the Echo (the “smart speaker” that allows users to connect to Alexa), is “always getting smarter.” When you interact with Alexa, the Echo streams audio to the Cloud. Amazon’s Terms of Use for the Echo duly notifies users that “Alexa processes and retains your Alexa Interactions, such as your voice inputs, music playlists, and your Alexa to-do and shopping lists, in the cloud to provide and improve our services.” Cloud storage of Alexa’s audio raises a host of privacy concerns that have been best highlighted by the recent Arkansas trial of James Bates for the murder of his friend, Victor Collins, who was found dead, floating face-up in Mr. Bates’ bathtub. Specifically, in the Bates case, the prosecution asked Amazon to disclose recordings from Mr. Bates’ Amazon Echo. Amazon refused, citing privacy concerns. Ultimately, the constitutional question of whether Amazon may invoke the First Amendment’s protection of free speech to refuse to disclose the recordings gathered by our Amazon Echoes went unresolved, and Amazon’s position on privacy was never addressed, because Mr. Bates voluntarily turned over the recordings. The case remains important, however, because it makes clear that users have access to their recordings and can therefore willingly disclose them. Amazon confirms such access, stating on its website that Amazon’s Alexa app keeps a history of the voice commands that follow the wake word (“Alexa!”). Specifically, Amazon’s response to whether users can review what they have asked Alexa is, “Yes, you can review voice interactions with Alexa by visiting History in Settings in the Alexa App. Your interactions are grouped by question or request. Tap an entry to see more detail, provide feedback, or listen to audio sent to the Cloud for that entry by tapping the play icon.” Accordingly, it is clear that data stored to the Cloud may allow Alexa to function more seamlessly and “get smarter,” but it does so at the cost of storing information that many users may have considered unattainable and private.

Not surprisingly, as Alexa and other virtual assistants continue to increase in popularity, we are beginning to see them in both homes and businesses. If a virtual assistant is a luxury at home, then certainly it is a necessity at work. In fact, on November 30, 2017, Amazon introduced “Alexa for Business,” which is a set of tools specifically designed to

give [business customers] the tools [they] need to manage Alexa-enabled devices, enroll [their] users, and assign skills at scale. [They] can build [their] own custom voice skills using the Alexa Skills Kit and the Alexa for Business APIs, and [they] can make these available as private skills for [their] organization[s].

In rolling out this new platform for Alexa, Amazon.com advertises that “Alexa helps you at your desk,” “Alexa simplifies your conference rooms,” and “Alexa helps you around the workplace.” So, if we use Alexa the way that Amazon.com hopes, Alexa will be in every office, conference room, and even the hallway of our workplaces. We won’t have to undergo the mundane task of dialing into a conference call. Instead, we can just use our voice to allow it to commence. According to Amazon.com, Alexa can also “find an open meeting room, order new supplies, report building problems, or notify IT of an equipment issue.” Gone are the days when you have to walk around the office in search of an empty conference room, but also gone are the days when you have any privacy in a closed office or conference room.
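To make the "private skill" idea concrete, here is a hypothetical sketch of how a custom workplace skill, such as the "find an open meeting room" feature quoted above, might route a request. The intent name, data shapes, and responses are illustrative assumptions, simplified far beyond the actual Alexa Skills Kit API.

```python
# Hypothetical, simplified sketch of a private workplace skill handler.
# The intent name and room data are invented for illustration; this is
# not the Alexa Skills Kit request/response format.
ROOMS = {"Boardroom": False, "Huddle-1": True, "Huddle-2": True}  # True = free

def handle_intent(intent_name: str) -> str:
    """Map a recognized intent to a spoken response."""
    if intent_name == "FindOpenRoomIntent":
        free = [name for name, available in ROOMS.items() if available]
        if free:
            return f"{free[0]} is available now."
        return "Sorry, no rooms are free right now."
    return "Sorry, I don't know how to help with that."
```

Note that every such request, like any other Alexa interaction, would be streamed and retained in the Cloud, which is what makes the workplace deployment legally interesting.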

In most offices, it is common to hear a topic raised in the hallway, only to be abruptly halted by one party asking for the conversation to continue in his or her office. Other times, a conversation that began in e-mail will be postponed until the parties have the ability to talk in person. The obvious reason for these conversations to take place in person, behind closed doors, is to avoid creating a record or being overheard. With the advent of virtual assistants in the workplace, however, closing the door to talk privately may actually give your virtual assistant unfiltered access to your conversation, thus creating a potentially discoverable and admissible record. In this environment, Gotti’s advice, “Don’t ever say anything you don’t want played back to you someday,” is perhaps the best that we can offer our clients. At a minimum, they should be aware that a “closed-door conversation” is more a term of art than a certainty, and definitely not a given simply because the door is in fact closed. Instead, if the room contains Alexa or another such device, one’s conversation can be recorded, especially if the parties are using the assistant to obtain answers to search inquiries or to complete tasks.

With respect to the admissibility of the recordings of virtual assistants like Alexa, we must question whether they can actually be used during litigation. The simple answer is that it depends, and there currently are no laws on the books that specifically address how courts will treat statements recorded by virtual assistants. If they are treated like other recorded statements, including those obtained during wiretapping, then the jurisdiction where the communication took place will dictate whether they can be introduced into evidence.

States typically fall into one of two categories: those that require “one-party consent” and those that require “two-party consent.” Federal law follows the one-party consent doctrine, which allows the recording of telephone calls and in-person conversations with the consent of at least one of the parties. Under a one-party consent law, you can record a phone call or conversation so long as you are a party to the conversation. New York, New Jersey, and Indiana have adopted the one-party consent doctrine; New York, for example, makes it a crime to record or eavesdrop on an in-person or telephone conversation unless one party to the conversation consents. Other states, like Massachusetts and California, require two-party consent. This means that it is a crime to secretly record a conversation, whether the conversation is in person or taking place by telephone or another medium, like Alexa. However, the information recorded by Alexa and other virtual assistants, including transcribed search terms, may be treated differently because it is more akin to data from a computer than to a wiretap. Given that this is a new area of law, attorneys will play a critical role in helping to put these issues before the courts, which may create an entirely new body of law.
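The one-party/two-party distinction described above can be summarized as a simple rule. The following is a toy illustration only; the state lists are the small sample mentioned in the article, not a complete survey, and nothing here is legal advice.

```python
# Toy model of the consent regimes discussed above. State lists are the
# article's examples only; this is an illustration, not legal advice.
TWO_PARTY_STATES = {"California", "Massachusetts"}
ONE_PARTY_STATES = {"New York", "New Jersey", "Indiana"}

def consent_satisfied(state: str, consenting: int, total: int) -> bool:
    """Two-party ("all-party") states require every participant's consent;
    one-party states require the consent of at least one participant."""
    if state in TWO_PARTY_STATES:
        return consenting == total
    return consenting >= 1
```

Under this rough model, a two-person conversation recorded with only one participant's consent would satisfy New York's rule but not California's, which is exactly why the jurisdiction where the communication took place matters.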

*The authors would like to thank paralegal Megan Kessig as well as Alexa, Bixby, Siri, Google’s Assistant, and Cortana for their assistance. For further reference, see Robert D. Lang & Lenore E. Benessere, Alex, Siri, Bixby, Google’s Assistant and Cortana Testifying in Court, 99 N.Y. State Bar J. 9 (Nov./Dec. 2017).
