Guidance from the ABA Mergers & Acquisitions Committee, Technology in M&A Subcommittee
Project Chair: Matthew R. Kittay, Fox Rothschild LLP
Key Contributors: Haley Altman, Litera; Anne McNulty, Agiloft; David Wang, Wilson Sonsini Goodrich & Rosati
Peer Reviewer: David Albin, Finn Dixon & Herling LLP
Committee Chair: Wilson Chu, McDermott Will & Emery LLP
Subcommittee Chair: Daniel Rosenberg, Charles Russell Speechlys LLP
“We’re often and in fact almost always way behind the curve on what is actually happening in the market. As a result, we’re backing into the regulation of the market by observing what is actually happening in the market.” — David Wang, Chief Innovation Officer, Wilson Sonsini Goodrich & Rosati
Goal. The goal of this guidance is to review the ethical implications of M&A lawyers' use of legal technologies. While the group that developed this guidance understands that negotiating changes to contracts with many popular service providers is impractical in most scenarios, we believe there are safe, productive and client-focused steps that all attorneys can and should take to improve their workflows and their clients' legal product. Assuming most readers accept this general premise, this guidance focuses on how to counsel clients effectively and provides items for action and consideration by attorneys, for example when clients (or lawyers on the other side of a transaction) ask to use a particular technology on a transaction.
Although the examples given in this guidance refer to M&A, much of it has wider application, including the concise list of key issues set out in Appendix A.
Key questions addressed include:
- What ethical duties must lawyers discharge when engaging these technologies?
- What are the ethical and practical considerations regarding “automation” and the “unauthorized practice of law”?
- Where is data that lawyers upload onto technology platforms hosted, and what are the data sovereignty implications?
- What rights (IP and other) do the technology platforms take over the data that lawyers upload?
- What level of security/confidentiality should lawyers require from technologies that we use?
- How can lawyers effectively evaluate software?
In order to provide guidance and some best practices to consider in leveraging technology in an M&A practice, we must start with the key ethical frameworks that underlie the use of technology and may encourage or require its usage in certain contexts. The American Bar Association’s Model Rules of Professional Conduct (the “Model Rules”), case law, and statutes help define the lawyer’s professional responsibility for utilizing technology in the practice of law, as well as the risks that must be addressed when certain technology is leveraged in the practice.
1.1. ABA Model Rules of Professional Conduct.
The specific Model Rules which govern or implicate requirements to use technology include: Rule 1.1 Duty of Technological Competence; Rule 1.5 Obligation not to collect unreasonable fees; and Rule 1.6 Duty of Confidentiality.
1.1.1. Model Rule 1.1 — Duty of Technological Competence (Comment 8):
“To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, engage in continuing study and education and comply with all continuing legal education requirements to which the lawyer is subject.”
The profession has increasingly recognized a two-fold duty with respect to the use of technology: the obligation to assess a technology and determine whether it improves the services and benefits delivered to a client, and the obligation to understand the technology and ensure its use does not jeopardize the confidentiality of client information.
1.1.2. Model Rule 1.5(a) — Obligation not to collect unreasonable fees:
“A lawyer shall not make an agreement for, charge, or collect an unreasonable fee or an unreasonable amount for expenses…”
For example, if a client needs exactly the same agreement duplicated, except with altered party names, dates, and contact information, a lawyer must consider what are reasonable fees to collect for the work.
1.1.3. Model Rule 1.6 — Confidentiality of Information:
(a) A lawyer shall not reveal information relating to the representation of a client unless the client gives informed consent, the disclosure is impliedly authorized in order to carry out the representation or the disclosure is permitted by paragraph (b).
(b) A lawyer may reveal information relating to the representation of a client to the extent the lawyer reasonably believes necessary [as listed];
(c) A lawyer shall make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.
When applying this Model Rule, the information may require additional security measures, and potentially could prohibit the use of technology depending on criteria including: the sensitivity of the information; the likelihood of disclosure if additional safeguards are not employed; the cost of employing additional safeguards; the difficulty of implementing the safeguards; and the extent to which the safeguards adversely affect the lawyer’s ability to represent clients (e.g., by making a device or important piece of software excessively difficult to use).
Furthermore, when considering Model Rule 1.6, attorneys should consider obligations of confidentiality with respect to client data specific to the platform in question, taking into consideration, for example:
- that technology platforms take different intellectual property rights over the data uploaded;
- ABA Formal Opinion 477R, which addresses securing communications of protected client information and related ethical considerations, including potential data breaches;
and in addition to ethical obligations with respect to the data:
- contractual obligations, regulatory and compliance obligations, IP rights, training, diligence of the vendors and client expectations and other business considerations.
Practically speaking, this means attorneys should, at a minimum, know where the data is; know that client data is protected; know who owns it; and maintain the ability to remove it from systems in a secure manner. By way of example, using a cloud service could violate non-disclosure agreements and potentially result in heavy fines and a loss of client trust, as discussed immediately below in Section 1.2.
1.2. Laws and Regulations.
In addition to the ethical obligations imposed by the Model Rules, there are several key legislative acts and case law decisions which lawyers need to consider.
1.2.1. Stored Communications Act (SCA).
The Stored Communications Act (SCA), 18 U.S.C. §§ 2701 et seq., governs the disclosure of electronic communications stored with technology providers. Passed in 1986 as part of the Electronic Communications Privacy Act (ECPA), the SCA remains relevant to address issues regarding the privacy and disclosure of emails and other electronic communications.
As a privacy statute, diverse circumstances can give rise to SCA issues:
- Direct liability. The SCA limits the ability of certain technology providers to disclose information. It also limits third parties’ ability to access electronic communications without sufficient authorization.
- Civil subpoena limitations. Because of the SCA’s restrictions on disclosure, technology providers and litigants often invoke the SCA when seeking to quash civil subpoenas to technology providers for electronic communications.
- Government investigations. The SCA provides a detailed framework governing law enforcement requests for electronic communications. SCA issues often arise in motions to suppress and related criminal litigation. For example, a growing number of courts have found that the SCA is unconstitutional to the extent that it allows the government to obtain emails from an internet service provider without a warrant in violation of the Fourth Amendment. See United States v. Warshak, 631 F.3d 266 (6th Cir. 2010).
1.2.2. Microsoft Case.
The US government obtained a warrant under the SCA seeking email data that Microsoft hosted in one of its data centers in Ireland, and Microsoft challenged the warrant's reach. The Second Circuit held that the government could not use an SCA warrant to compel production of data stored abroad; the Supreme Court ultimately dismissed the case as moot after passage of the CLOUD Act, discussed below. See United States v. Microsoft Corp., 584 U.S. ___, 138 S. Ct. 1186 (2018).
1.2.3. The CLOUD Act.
The Clarifying Lawful Overseas Use of Data Act (CLOUD Act) was passed in March 2018 in response to the Microsoft Case, and clarified related data sovereignty issues, confirming that a company can determine data residency by designating where information must be stored or resides as part of contract and company policies. This legislation added to the complexity of the data sovereignty laws (the laws to which a company's data is subject) for multinational companies that store data in different regions, as it can conflict with US, UK, EU (GDPR), and Chinese data storage regulations.
1.2.4. Consumer Data Protections.
There are of course also consumer protection laws and regulations protecting data and determining ownership. These regulations limit disclosure of information and protect individuals' data. The General Data Protection Regulation (GDPR) is an instructive precedent for the tension between pressing forward with automation of legal processes and guarding against legal ethics and malpractice concerns. GDPR's purpose is to return control of personal data to the individual through uniform regulation of data processing and export.
1.2.5. Global Problems for Global Law Firms.
For law firms with offices in different regions and using different carriers, each office may be subject to different data storage rules. This requires that law firms consider data sovereignty rules in connection with their cloud services providers and the related data licenses for global entities.
2.0. Taxonomy of Data.
For any particular technology, lawyers need to take a step back and consider several issues, including: what is the technology actually trying to accomplish; what is the business goal of that technology; what is your goal in the representation; and how do those things interact. The professional responsibilities and consequences implicated will differ depending on the technology and the type of interaction.
There is no practice too complex to be at least partially automated; it is a matter of cost. It is not impossible for technology to solve many of the inefficiencies involved in drafting documents; it is a matter of cost, and those costs are decreasing over time. Drafting a complex, well-functioning and technically coherent merger agreement, for example, may be very hard, perhaps beyond the limits of current technology. But it is not a requirement that automation "fully" automate everything about a process before the technology fundamentally disrupts the status quo. If even 50% of a merger agreement became automatable, it would change how these agreements are done and how the business of mergers is priced.
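To make the point about partial automation concrete, the mechanical "fill in the names and dates" portion of document assembly can be sketched in a few lines. This is a minimal, hypothetical illustration; the template text and field names are invented for this sketch and do not reflect any particular product.

```python
from string import Template

# Hypothetical form fragment; fields and wording are invented for illustration.
AGREEMENT_TEMPLATE = Template(
    'This Agreement, dated $effective_date, is entered into between '
    '$buyer ("Buyer") and $seller ("Seller").'
)

def assemble(fields: dict) -> str:
    """Populate the form with deal-specific fields. substitute() raises
    KeyError if a required field is missing, rather than emitting a blank."""
    return AGREEMENT_TEMPLATE.substitute(fields)

recital = assemble({
    "effective_date": "January 1, 2025",
    "buyer": "Acme Holdings, Inc.",
    "seller": "Widget Co., LLC",
})
print(recital)
```

Even this trivial sketch shows why the "same agreement, new names and dates" scenario raises Rule 1.5(a) questions: the marginal cost of each additional assembled document approaches zero.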
2.2. Ethical Issues Arising from Structured Data.
2.2.1. Process elements, workflow management and due diligence software all create deal process efficiencies but also have ethical implications. Often, a lawyer can invite collaborators, which can risk confidentiality breaches as well as waiver of attorney-client privilege. Closing automation tools require the data to be structured to automate the closing process, which requires the software to store facts about the specific transaction in order to close the deal. Likewise, transaction technology for populating contracts must process data about how a document is assembled and then incorporate rules into its system. Initial data storage, active management during the deal, data retention and ultimately data destruction all need to be considered.
2.2.2. Examples of Implications for "Reasonable" Fees and the "Unauthorized Practice of Law" — Automated Cap Tables and NDAs. The inputs, outputs and procedures used for cap tables in a transaction are largely the same kinds of structured data that computer programs and programmers routinely process. Working software now exists that manages cap tables for private companies, public companies, and the individuals at these companies. A CEO or HR manager of a startup can access information directly, live at any time, and handle transactions on the platform themselves if they choose. Rules go into the system, and then processes run on them: data inputs in a digitalized transaction automatically populate form documents, check whether the company is complying with limitations such as available shares in the plan, generate consents directly, and go back into the cap table automatically to update it. Other software, for example, undertakes automatic reviews of an NDA. The non-lawyer client or lawyer uploads the NDA, and the software will mark up the document, spot the issues, produce an issues list by comparing it against the company's playbook, and recommend edits to strengthen the client's position.
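The rule-checking step described above can be sketched schematically: before a grant issues, the system verifies that shares remain available under the plan, then updates the cap table automatically. All class names, holders and share counts below are invented for this illustration; a real platform would of course be far more elaborate.

```python
from dataclasses import dataclass, field

@dataclass
class EquityPlan:
    authorized: int   # shares reserved under the (hypothetical) plan
    issued: int = 0

    def available(self) -> int:
        return self.authorized - self.issued

@dataclass
class CapTable:
    plan: EquityPlan
    holdings: dict = field(default_factory=dict)

    def issue(self, holder: str, shares: int) -> None:
        # The automated compliance check: refuse grants exceeding the plan.
        if shares > self.plan.available():
            raise ValueError(
                f"only {self.plan.available()} shares remain in the plan"
            )
        self.plan.issued += shares
        self.holdings[holder] = self.holdings.get(holder, 0) + shares

table = CapTable(EquityPlan(authorized=1_000_000))
table.issue("Employee A", 50_000)   # permitted: within the plan limit
```

The point is that the "legal" judgment (is this issuance permitted?) has been reduced to a deterministic rule, which is precisely what raises the fee-reasonableness and unauthorized-practice questions discussed next.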
2.2.3. No lawyer is involved in either the cap table system or the NDA review, and these technologies are deployed hundreds of times a day all over the country. There may be a one-time licensing fee or monthly contract for this service, no matter how many times it is utilized. How much can a law firm charge? If it is more than a minimal amount per issuance, is the firm's fee reasonable and consistent with Model Rule 1.5(a)? And furthermore, are software developers, or the individuals and companies that license the software, engaged in the unauthorized practice of law? In reality, clients will likely always want their attorney to scrutinize and augment the output to ensure accurate and excellent legal work, but these questions should still be considered.
3.0. Ownership of Data, IP Rights and Client Rights.
3.1. Types of Data.
When evaluating ownership issues, there are three types of “data” to consider, and the critical and harder questions relate to Mixed Data:
3.1.1. User-Created Data.
For example, a photographer is clearly the owner of a picture they take, and ownership is protected by copyright laws. In the legal-services context, the attorney work product—the documents themselves, any work done on those documents, comments, tags, as well as any records generated on the basis of that work—is User-Created Data.
3.1.2. Servicer-Created Data.
Data created by the service provider (or someone working on a paid basis for a business or organization) before anything is uploaded into the cloud has clear ownership and intellectual property claims, and is either licensed or sold to the end-user.
3.1.3. Mixed Data.
"Gray areas" result, for example, from data that is modified or processed. In these cases, data that has been created within the cloud could come with strings attached. It is incumbent on the end-user to properly claim and protect this data and intellectual property, which is difficult because legal processes have not kept pace with the technology.
3.2. Laws and Lawyers Protecting Data Rights.
3.2.1. Laws and Regulations.
There are of course laws and regulations protecting data and determining data ownership. These regulations limit disclosure of information and protect people's data (supra, Section 1.2). Relying on laws and regulations, however, is not sufficient for an attorney to discharge their ethical obligations.
3.2.2. Contractual Protection.
Underlying ownership needs to be clarified in license agreements—where that data needs to be located, the privacy that needs to be retained, and how that data can be used. Key concerns include:
- protection of the confidential data, particularly if it pertains to client confidential information, and
- controlling what technology providers do when they receive a lawyer’s data, including what happens to pieces of information they need to collect and store to provide the contracted service.
To protect confidentiality, privilege and work product, lawyers need to own the derivative works that the technology produces, and therefore usage terms and conditions need to be reviewed very carefully.
3.2.3. Artificial Intelligence (“AI”) Tools.
AI tools are key digital assets for lawyers. But most software, and the software that is most easily accessible, is built for consumers, not lawyers. These tools are typically free and produce mixed data. Foreign language translation tools (where a machine and a human may be translating together to teach the software to become more accurate over time) have presented specific concerns. The uses of these "derivative works" are often well-intentioned, even beneficial: the vendor may want to analyze and use the customer data to provide tailored services to the customer; process and aggregate the customer data for commercial exploitation by creating new products and services; use the processed data to enhance its internal operations, products or services; or license the data to third parties. "Free" tools, however, may collect and use data in ways the end-users did not contemplate when they used the software.
In addition, lawyers often need to review large volumes of contracts (and other documents) in the context of transactions or in regulatory reviews, or for the purpose of producing market intelligence or deal term studies. AI-assisted contract review software can facilitate these processes. When using this kind of software, there are two possibilities: the system can find what the lawyers need it to find out of the box, or the lawyers will need to embed their own knowledge into the platform by “teaching” it to find the information they need it to find. Lawyers can teach AI systems to find custom information by identifying examples of that information in a document set that is representative of the types of documents they will need to review in practice. The software will then study those examples, figure out the pattern, and produce a “model.” This model would then be used to find that information in new documents imported into the software. The process for using AI-assisted contract review software to review contracts is generally straightforward: upload the contracts for review into the software. The platform then automatically extracts information from those contracts (via either pre-built models or custom models built by the lawyer’s organization) for the lawyer to review. If more junior lawyers are doing the initial review, they can flag problematic provisions for second-level review.
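The extract-and-flag step of contract review can be sketched schematically. This is not a real machine-learning model: a production tool would learn its patterns from lawyer-labeled examples as described above, whereas this sketch hard-codes two invented clause patterns purely to show the shape of the workflow (upload text, extract matching provisions, surface them for lawyer review).

```python
import re

# Invented clause types and patterns, standing in for learned models.
CLAUSE_PATTERNS = {
    "assignment": re.compile(r"\bmay not (be )?assign", re.IGNORECASE),
    "change_of_control": re.compile(r"\bchange of control\b", re.IGNORECASE),
}

def extract_clauses(contract_text: str) -> dict:
    """Return, per clause type, the sentences flagged for lawyer review."""
    sentences = re.split(r"(?<=[.;])\s+", contract_text)
    hits = {name: [] for name in CLAUSE_PATTERNS}
    for sentence in sentences:
        for name, pattern in CLAUSE_PATTERNS.items():
            if pattern.search(sentence):
                hits[name].append(sentence)
    return hits

sample = ("This Agreement may not be assigned without consent. "
          "A change of control shall constitute an assignment.")
flags = extract_clauses(sample)
```

Whether the patterns come from a vendor's pre-built models or from custom models trained on a firm's own labeled examples is exactly the ownership question taken up in the next paragraph.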
In considering the implications of using this kind of software, both rights to the uploaded documents and rights to the custom models must be considered. While in-house lawyers may be comfortable with giving software providers copies of or rights to their documents where contractually permissible to do so, law firm lawyers providing services to their clients likely would not be (at least not without their clients’ consent). Any software provider that serves professional services organizations would have an uphill battle if they attempted to take ownership or have rights to the data that is typically from their customers’ customers. It is also important to consider how the software license agreements deal with any intellectual property created when lawyers embed their own knowledge into the software by creating custom models. Custom models may represent the knowledge of expertly trained lawyers, and those lawyers’ organizations may want to control any use and/or sharing of that knowledge. While the code underlying the model may be retained by the software provider, it is important to confirm that the rights to use and share custom built models match the firm’s expectations around this issue.
To assist in review and negotiation of license agreements, please see attached Appendix A: Issues for Lawyers to Consider in Legal Technology Agreements.
There is no “one size fits all” solution to solve for the ethics issues presented when lawyers engage technology. This guidance, however, captures the issues and serves as a framework for evaluating these issues as they continue to develop. By focusing on these issues, law firms and their attorneys can continue to work with their clients and the legal industry, not just in compliance with their ethical obligations, but also as thought leaders at the intersection of law and technology.
Issues for Lawyers to Consider in Legal Technology Agreements
Legal technology agreements are not always abundantly clear; consider addressing the following issues:
- Three types of data—original, derived and usage data
- How this data can be used, other than for the benefit of the system
- What “access rights” non-lawyers have
- What can the software provider aggregate and extrapolate from the data?
- How data is being delivered between the parties
- Where data is being stored, to inform compliance with data sovereignty and data residency requirements
- Specify how data can be stored for each of your different regions and then the global framework
- How does the user access the data across different regions without inadvertently pulling data from one location into another, in violation of privacy policies and other protocols?
- How does the information get into the system?
- Storage requirements
- Data retention requirements
- Removal requirements and controls
- Control of data the lawyer inputs
- Control of new data and right to remove (complicated by cloud technology from different providers), and as implicated by GDPR
- Specific provisions regarding how data can be used, what derivative works can be created, and what sort of aggregated, de-identified data can be leveraged in any sort of contract
- If the agreement is silent, assume this information can be used in different ways
- “Derivative Works” provision, critical because part of the benefit of the solution is to provide the lawyer a derivative work, such as a fully compiled PDF version of the document with its appropriate signature pages; this is difficult because the vendor wants to make sure that the lawyer can do everything needed or promised by the technology
- Clarify no other uses of the data
- Add specific permissions around client confidential information
- Data residency requirements that tell the lawyer exactly where the data will be and cannot be shifted between regions
- Specify that all “Customer Data” (or “Company Content”) is owned by the customer and define customer data; any exceptions must be clearly spelled out.
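The residency items in the list above lend themselves to automated verification. As a minimal sketch (all region names, data center identifiers and the policy itself are invented for illustration), a firm could check a vendor's reported storage configuration against its own approved-location policy:

```python
# Invented policy: which data centers are permitted per client region.
ALLOWED_LOCATIONS = {
    "EU": {"eu-west-1", "eu-central-1"},
    "US": {"us-east-1"},
}

def residency_violations(vendor_config: dict) -> list:
    """Return (region, location) pairs stored outside the approved set."""
    return [
        (region, location)
        for region, location in vendor_config.items()
        if location not in ALLOWED_LOCATIONS.get(region, set())
    ]

# Hypothetical vendor report: EU data landing in a US data center is flagged.
violations = residency_violations({"EU": "us-east-1", "US": "us-east-1"})
```

A check like this cannot replace the contractual residency commitments described above, but it illustrates how "exactly where the data will be" can be made verifiable rather than merely promised.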
 (1) to prevent reasonably certain death or substantial bodily harm; (2) to prevent the client from committing a crime or fraud that is reasonably certain to result in substantial injury to the financial interests or property of another and in furtherance of which the client has used or is using the lawyer’s services; (3) to prevent, mitigate or rectify substantial injury to the financial interests or property of another that is reasonably certain to result or has resulted from the client’s commission of a crime or fraud in furtherance of which the client has used the lawyer’s services; (4) to secure legal advice about the lawyer’s compliance with these Rules; (5) to establish a claim or defense on behalf of the lawyer in a controversy between the lawyer and the client, to establish a defense to a criminal charge or civil claim against the lawyer based upon conduct in which the client was involved, or to respond to allegations in any proceeding concerning the lawyer’s representation of the client; (6) to comply with other law or a court order; or (7) to detect and resolve conflicts of interest arising from the lawyer’s change of employment or from changes in the composition or ownership of a firm, but only if the revealed information would not compromise the attorney-client privilege or otherwise prejudice the client.
 ABA Formal Opinion 477R: Securing Communication of Protected Client Information.
 Note, however, various potential benefits from technology: lower fees for clients; increased client retention; more accurately priced projects and the ability to show the breakdown of such fees; recruitment—associates want technology efficiencies, and they may prefer to perform tasks offsite and/or through automated systems instead of manually.
 See, e.g., https://www.theguardian.com/technology/2019/jul/26/apple-contractors-regularly-hear-confidential-details-on-siri-recordings.