Daily Record--Legal Currents Column

Illinois Supreme Court AI Policy Offers Caution With a Side of Clarity

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

Illinois Supreme Court AI Policy Offers Caution With a Side of Clarity

The use of generative artificial intelligence (AI) by litigants hasn’t exactly been embraced by courts nationwide. Instead, sensational headlines about hallucinated case citations have led to knee-jerk rejections of the technology by many judges, even when the root cause of those mistakes was a lack of attorney competence rather than the technology itself. This pattern of restrictive responses highlights the broader challenge of balancing innovation with accountability in the legal profession.

Like many of the emerging technologies that preceded it, AI was not embraced with open arms by our judiciary. Instead, it was met with suspicion and trepidation. Many judges have banned its use by litigants, while others require full disclosure of all AI tools used in the preparation of court filings.

The problem with these extreme responses is that they establish unrealistic standards that won’t withstand the test of time. AI is advancing exponentially and is already embedded in many popular legal software programs, ranging from legal research and law practice management tools to document management and legal billing platforms. As a result, legal professionals are already producing legal work using generative AI and may not even realize it. Accordingly, penalizing them for doing so is counterproductive, at best, and unfairly punitive, at worst.

Fortunately, the tide seems to be turning with the issuance of a new, progressive court AI policy by the Illinois Supreme Court. The policy went into effect on January 1st and provides a judicious approach to the incorporation of generative AI tools into the workflows of legal professionals, including court personnel.

In the policy, the Court wisely acknowledges the inevitability and unprecedented speed of AI adoption, along with the benefits and challenges it presents: “The integration of AI with the courts is increasingly pervasive, offering potential efficiencies and improved access to justice. However, it also raises critical concerns about authenticity, accuracy, bias, and the integrity of court filings, proceedings, evidence, and decisions. Understanding the capabilities and limitations of AI technology is essential for the Illinois Judicial Branch.”

Importantly, the Court advises judges to take an open-minded approach to the “use of AI by litigants, attorneys, judges, judicial clerks, research attorneys” and makes clear that disclosure should not be required: “The use of AI…should not be discouraged, and is authorized provided it complies with legal and ethical standards. Disclosure of AI use should not be required in a pleading.”

The Court also cautions that it’s essential to fully understand any technology, including AI, prior to adopting it and that all AI-created output should be carefully reviewed “before submitting it in any court proceeding to ensure accuracy and compliance with legal and ethical obligations.”

Finally, judges were reminded that they “remain ultimately responsible for their decisions, irrespective of technological advancements.” To assist judges in staying apprised of these advancements, the Court provided additional resources, including a judicial reference sheet on AI.

The Illinois Supreme Court’s new AI policy offers a thoughtful, balanced approach to AI adoption in our profession. The Court wisely rejects outright bans and unnecessary disclosure mandates, while acknowledging the inevitability of AI adoption.

By highlighting AI’s potential benefits while addressing risks like accuracy, bias, and ethical compliance, it establishes a framework for the ethical and practical adoption of AI without compromising the justice system’s integrity. In doing so, it sets an example for other courts to follow by providing a flexible, forward-thinking roadmap for responsible AI use in our profession.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].



The Year Ahead in Legal Tech: AI, Innovation, and Opportunity

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

The Year Ahead in Legal Tech: AI, Innovation, and Opportunity

Looking back on 2024, I’m reminded of a Grateful Dead lyric: “What a long, strange trip it’s been.” It perfectly captures the upheaval of the last four years, which were nothing if not unpredictable and tumultuous. A worldwide pandemic closed our borders—and our offices—but we never stopped working. Business carried on as usual even as we struggled to wrap our minds around the realities of living in the midst of a deadly, highly contagious virus.

Technology saved the day. Without it, our world would have come to a grinding halt. Instead, it ushered in a newfound receptivity to cloud and remote working software, priming us for what came next: the generative AI era. 

In late 2022, just as normalcy seemed to return, the release of GPT-3.5 marked a turning point: generative AI became a catalyst for unprecedented change. From there, technological advancement occurred at a rapid clip, with 2024 seeing the continued integration of generative artificial intelligence (AI) into the tools legal professionals rely on.

The pace of AI development over the past year, however, was slower than many had predicted. Nevertheless, the impact on the practice of law overall was significant. Legal professionals continued to learn about and experiment with generative AI for many tasks, including legal research, document drafting and editing, brainstorming, and more. 

In fact, according to the 2025 AffiniPay Legal Industry Report, which will be published in the spring, one-fifth of firms have already adopted legal-specific generative AI tools. Personal adoption was even more significant. For example, 47% of immigration practitioners reported personally using generative AI for work-related purposes.

In the coming year, you can expect to see a heightened pace of AI development with generative AI appearing as the interface in all the tools you regularly use in your law firm. From legal research and practice management to legal billing and knowledge management, generative AI conversational interactions will increasingly be the mechanism through which you access all of the information you need to effectively represent your clients’ interests.

You’ll also notice that generative AI will be more deeply embedded into your firm’s IT stack, enabling in-depth analysis of your office's data, including client matters, documents, finances, billable hours, employee productivity, and more. This ability to easily access the metrics needed to run a productive, efficient, and profitable practice will make all the difference and will enable firms to scale and compete more easily in an increasingly competitive, AI-driven legal marketplace. 

Additionally, as generative AI becomes seamlessly embedded into everyday tools, you might not even realize you’re using it. One immediate effect of this deeper-level integration will be that court rules banning AI-generated documents will quickly become outdated and impractical, in part because they could effectively prohibit lawyers from using essential technology altogether. 

Another notable trend in 2025 will be continued regulatory changes and further ethics guidance. Bar associations will issue additional ethics opinions and guidelines that provide roadmaps for compliant AI implementation, effectively removing the remaining barriers that stand in the way of broad-scale adoption. 

Similarly, regulatory changes impacting bar exam and licensure requirements highlight a broader effort to make legal services more accessible. As states revisit licensure rules and AI ethics frameworks evolve, the legal landscape will continue to shift in the face of these efforts to balance innovation with the profession’s core principles.

In other words, if you thought the last few years brought unwelcome upheaval, brace yourself—there’s more to come. Rest assured, our profession won’t be immune from the changes and will likely be impacted far more than others. 

So get ready. Dive in and ensure you’re maintaining technology competence. Sign up for tech-related CLEs, experiment with generative AI, and learn as much as you can about emerging and innovative technologies. 2025 is sure to be a year for the record books, and now is the time to prepare yourselves for what will come. 

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].


Top Picks for 2024: Last-Minute Holiday Gift-Giving

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

Top Picks for 2024: Last-Minute Holiday Gift-Giving

The winter holidays are just around the corner, and you know what that means—it’s the perfect time to tackle your gift shopping. But don’t wait too long—time is running out! Have you finished your holiday shopping yet? Struggling to find the perfect gift for that hard-to-please person in your life? If so, don’t worry! I’ve rounded up a few ideas that might be just what you’re looking for, including a few technology-related options.

Looking for the ideal present for a frequent traveler? I’ve got you covered! Now that most airplanes are equipped with seatback video monitors, compatible headphones are a must-have. Corded headphones are always a reliable fallback, and you can usually get a free pair on board. However, a wireless Bluetooth adapter is the perfect solution for those who prefer cordless headphones. This handy device plugs into the plane’s audio jack, allowing users to connect their Bluetooth headphones seamlessly and enjoy their entertainment wire-free. The Twelve South AirFly Pro Audio Streaming device, which currently costs $54.99 on Amazon, is one good option, but there are plenty of others that do the trick as well.

Another great investment is a ChatGPT Plus subscription, as staying up-to-date with the latest technology is essential for maintaining tech competence. While basic ChatGPT is free, the paid plan, priced at $19.99 per month, offers enhanced control over data and features. With a subscription, users can disable data training to prevent OpenAI from using their input for system training. Additionally, the paid version provides exclusive access to the newest features and the ability to explore or even create GPTs—specialized generative AI agents designed for specific use cases. It’s a thoughtful gift for anyone looking to stay ahead in the tech world!

A walking desk is a fantastic gift idea for the recipient who spends long hours at their desk. Compact walking pads don’t take up much space and often come with an attached desk, allowing users to work comfortably while walking at a steady pace. For example, the HccSport 3-in-1 Under Desk Treadmill Walking Pad with Removable Desk Workstation is available on Amazon for $369.99. This setup is easy to use—just place it under the desk and set the pace—and it’s a worthwhile investment in both the physical health and mental well-being of the sedentary workaholic on your gift list.

If you’re shopping for an oenophile, this idea is sure to be a hit: a subscription to SOMM TV. For just $6.99 monthly or $59.99 annually, the wine lover on your list will gain unlimited access to a treasure trove of wine-focused programming. From iconic films like SOMM and Bottle Shock to exclusive series exploring blind tastings, global wine regions, and specific grapes, there’s always something new to watch. It’s a thoughtful gift that will keep them entertained and informed all year long!

Finally, consider a Coravin for the wine enthusiast on your list. This popular device, favored by sommeliers, is ideal for those who enjoy exploring higher-end wines. Instead of removing the cork, the Coravin uses a fine needle to pierce it, injecting argon gas as you pour a glass, preserving the remaining wine. The Coravin Timeless Three Plus Wine Preservation System, which includes two argon gas capsules, is currently available on Amazon for the special holiday price of $186.75. 

So whether you’re seeking gifts for a technophile, an office warrior, or a wine connoisseur, this year’s gift list has you covered! There's something for everyone, from low-cost subscriptions to practical gadgets and high-end indulgences. With these thoughtful ideas, you can find the perfect present and bring a little extra joy to your loved ones this holiday season. Happy gifting!

 Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].


The Risks of Using Dropbox for Client Files

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

The Risks of Using Dropbox for Client Files

Years ago, after the American Bar Association published my “Cloud Computing for Lawyers” book, I was often asked to speak to lawyers about the benefits and risks of implementing cloud computing in law offices. At the time, most lawyers weren’t sure what cloud computing was but were nevertheless confident that they didn’t trust it and didn’t want to adopt it into their firms. 

Conversely, most of them were already using it and just didn’t realize it.

I know this because I would often begin my talks by asking how many people in the room used cloud computing tools. Inevitably only a few attendees would respond affirmatively. Then, I would ask how many in the audience had shared files using Dropbox, and at least half of the people in the room would raise their hands. In other words, most lawyers were using cloud computing, whether it was Dropbox, Box, or Gmail, and simply didn’t realize it.

Fast-forward to the present day, and how times have changed! According to data from the 2022 MyCase Legal Industry Report, the vast majority (80%) of legal professionals surveyed reported that their firms had cloud computing tools in place post-pandemic. Before the pandemic, most survey data showed that cloud computing adoption in the legal profession had remained stable for a number of years at just under 40%.

Despite the increased adoption, the risks associated with cloud computing haven’t changed. As part of your duty of technology competence, it’s essential to carefully vet cloud providers to ensure that your firm’s confidential data is securely stored and encrypted. Whenever you entrust your law firm’s data to a third party you must ensure that you fully understand the procedures and protections in place. This duty includes obtaining information as to how the data will be handled by that company, where the servers on which the data will be stored are located, who will have access to the data, and how often and when it will be backed up, among other things.

Also important is ensuring that the software you choose has features that will protect your client data and that you and your employees receive the necessary training and are familiar with the program’s features. Failing to provide proper training, or to choose a secure platform designed for law firms with the features needed to protect confidential data, can have unintended consequences.

Case in point: a recent disciplinary reprimand issued by the Indiana Supreme Court. At issue in Matter of James H. Lockwood, Supreme Court Case No. 24S-DI-319, was the respondent’s failure to secure client files stored in Dropbox. 

Specifically, Lockwood had represented a client in a protective order case, and that same client had also worked at Lockwood’s firm for several months as an unpaid non-attorney assistant. During that timeframe, Lockwood provided the client with a Dropbox cloud storage link that provided access to firm materials and client files. The client stopped working for the firm in January 2023, but Lockwood failed to secure or deactivate the Dropbox link, which remained active and unsecured at least through May 2024.

Based on that conduct, the Court concluded that he had violated Indiana Professional Conduct Rule 1.6, which prohibits “revealing information relating to representation of a client without the client’s informed consent.”

This mishap could have been prevented by using software specifically designed for legal professionals. Legal-specific tools address lawyers’ ethical obligations and ensure compliance with confidentiality and data security requirements. These cloud tools often include features like encryption, controlled permissions and access, and activity tracking to ensure client information stays protected.

The lesson to be learned: if your firm still relies on general-use software like Dropbox, it’s time to reconsider your choices and transition to tools designed specifically for legal professionals. Legal-specific platforms address the unique needs of law firms, offering the security and compliance features needed to protect client data and conform to professional standards. Now is the time to make this change to protect client data — and your law license — from unnecessary and preventable risk.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].


GenAI, Talent, and Remote Work: Legal Industry Trends from the 2024 Wolters Kluwer Survey

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

GenAI, Talent, and Remote Work: Legal Industry Trends from the 2024 Wolters Kluwer Survey

The 2024 Wolters Kluwer Future Ready Lawyer Report was released last month and highlights key trends impacting the legal profession. The survey findings from legal professionals across the U.S. and Europe reveal how organizations are addressing efficiency, regulatory pressures, and evolving client needs to stay competitive in a rapidly changing environment. Topics covered include the integration of generative AI (GenAI) into legal workflows, changing remote work expectations, and the value of work-life balance for talent recruitment and retention.

First, let’s examine the GenAI data. The survey results showed that at least 76% of legal professionals in corporate legal departments and 68% in law firms use GenAI weekly, with 35% and 33%, respectively, using it daily. There are implementation challenges, however: 37% of law firm employees and 42% of their corporate counterparts report issues integrating GenAI with their organization’s existing legal systems and processes.

Another notable set of statistics revolved around GenAI’s potential effect on, and potential to disrupt, the almighty billable hour. A surprising 60% of those surveyed expect AI-driven efficiencies to reduce the prevalence of the billable hour moving forward, and 20% predict it will have a significant impact. Fortunately, more than half of the legal professionals surveyed (56%) feel well-prepared to adapt their business practices, service offerings, workflows, and pricing models in response to AI’s potential impact on the traditional billable hour business model.

Additionally, 65% of legal professionals anticipated increased organizational investment in AI technology over the next three years, and 71% expect GenAI’s rapid development to continue impacting firms and corporate legal departments during that same timeframe. Thirty-one percent believe it will have a significant effect, and 69% feel generally prepared to manage this impact, though only 26% consider themselves “very prepared.” These findings are evidence of the significant interest in GenAI, driven by its time-saving benefits. However, trepidation exists regarding the pace of change, implementation challenges, and the levels of investment and training needed to keep up with a rapidly changing technology landscape.

In addition to GenAI trends, perspectives on changing talent acquisition and retention trends were also explored. One positive finding was that 80% of respondents believe their workplaces are equipped to address the need for talent attraction. Key factors cited as legal talent draws included an acceptable work-life balance (81%), competitive compensation packages (79%), and opportunities for professional development and training (79%).

Interestingly, employees surveyed reported that work culture is particularly important in attracting legal talent. Nearly 72% of respondents shared that they valued diverse and inclusive workplaces, and 75% believed their organizations fostered such environments.

Finally, remote work trends were also addressed. The survey results revealed a global trend toward returning to the office, despite the employee push-back often reported in the media. Most respondents (73%) reported that staff are required to work in the office four or more days per week, with this figure higher in corporate legal departments (77%) compared to law firms (69%).

This year’s survey results highlight the legal industry’s efforts to balance the adoption of cutting-edge technologies like GenAI with ethical and implementation challenges. Furthermore, issues like workforce retention remain significant, with the data showing that organizations must prioritize innovation, adaptability, and strategic investments in training and technology. In the midst of rapid technological and societal change, the importance of proactive planning and technology adoption cannot be overstated for organizations seeking to remain competitive and positioned for success in the years to come.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].


Closing the Justice Gap: How Courts Are Leveraging GenAI for Greater Accessibility


Last week, I wrote about the release of recent judicial guidance for judges and court personnel seeking to use generative artificial intelligence (GenAI) to assist with the administration of justice. The guidance provided a roadmap for the ethical implementation of GenAI into the judiciary's workflows. 

This week, let’s discuss another relevant use case for GenAI in the courts: expanding access to justice by making the court system more accessible.

For years, courts have grappled with high caseloads and limited resources. More recently, the number of self-represented litigants has increased due to reduced federal support for legal aid organizations. With this influx of pro se litigants comes an increased demand for legal and procedural information. In the past, court websites and directories provided some assistance but were often difficult to navigate. Obtaining relevant information about court processes continued to be challenging.

Enter GenAI, which has emerged as a powerful tool with the potential to help bridge the access-to-justice gap. One of the most notable benefits of GenAI when applied to a database of information, such as court documents and data, is that it provides a user-friendly interface in the form of a responsive, knowledgeable chatbot. What was once challenging to unearth becomes quick and easy to access.

These GenAI interfaces can make all the difference to our overloaded court systems. Once deployed, they simplify and streamline complex court instructions and processes, from translating complex legal language and providing easy access to templates and court forms to enhancing public understanding of the court system. 

With GenAI, courts can eliminate procedural barriers and provide much-needed information, reduce administrative burdens, and empower pro se individuals with the tools needed to navigate our judicial system. For courts willing to embrace change and take advantage of all these tools, the benefits can be significant.

A few courts have already deployed GenAI-powered chatbots. For example, Nevada courts recently introduced a generative AI-powered chatbot designed to deliver plain-language legal guidance in multiple languages. Developed by CiviLaw.Tech for the Nevada Supreme Court, this generative AI-powered chatbot provides clear, concise, and personalized responses to common legal questions, helping individuals understand their options and the procedural steps they need to take. 

Similarly, Lemma Legal recently developed Missouri Tenant Help, an online resource for Missouri tenants seeking legal support. The platform includes an intake screening tool that incorporates the advanced GenAI language processing of GPT-4. This approach helps users determine their eligibility for assistance before speaking with program staff. Adding GenAI to the intake process has removed a key barrier for tenants needing legal help, allowing them to understand their options quickly and easily.

These early court adopters of GenAI are finding that, with careful oversight, generative AI can not only make legal resources more accessible but also improve the efficiency and effectiveness of courts as a whole. While challenges remain—especially around ethical implications, data privacy, and accuracy—GenAI interfaces present unique opportunities to democratize access to justice. 

As courts continue to experiment with and refine these tools, the hope is that legal services will become more readily available and tailored to the needs of all individuals, regardless of their background or resources. Will this actually happen, or is it a pie-in-the-sky pipe dream? I tend toward cynicism, but every effort counts and moves us one step closer to a more equitable and accessible judicial system. Only time will tell if GenAI will truly help bridge the access-to-justice gap.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].

 


Judicial Ethics: Navigating the AI Era

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

Judicial Ethics: Navigating the AI Era

Over the past two years, generative artificial intelligence (AI) ethics guidance has been plentiful, with many state bars swiftly responding to the increasing need for AI adoption advice. Within months of ChatGPT’s initial release in November 2022, the risks of using generative AI in legal practice were alarmingly clear, as captured in numerous sensational headlines. The benefits were also evident, with the speed of adoption outpacing the learning curve needed to use these tools competently. As more AI ethics opinions were issued, a clear path to ethical adoption emerged for lawyers.

But what about judges and court staff? Generative AI offers obvious benefits that could significantly increase efficiencies and remove tedium from their daily workflows by streamlining legal research and the drafting of orders and opinions. Of course, ethical implementation of AI by the courts is essential, and while the risks presented are similar to those encountered by lawyers, there are also considerations unique to the judiciary.

The good news is that some guidance is available. For starters, two judicial ethics opinions were released in October 2023. The first, JIC Advisory Opinion 2023-22, was issued on October 13, 2023, by the West Virginia Judicial Investigation Commission.

The Commission determined that judges may use AI for research purposes but not when deciding the outcome of a case. It also advised that extreme caution should be taken when using AI to assist with drafting orders or opinions. Finally, the Commission emphasized the importance of maintaining technology competence when using AI, clarifying that the duty was ongoing.

Later that month, on October 27, 2023, judicial ethics opinion JI-155 was issued in Michigan. The focus of this opinion was technology competence. Like the West Virginia opinion, it advised judges to maintain competence regarding technology, including AI: “(J)udicial officers have an ethical duty to maintain technological competence and understand AI’s ethical implications to ensure efficiency and quality of justice (and) take reasonable steps to ensure that AI tools on which their judgment will be based are used properly and that the AI tools are utilized within the confines of the law and court rules.”

More recently, Delaware and Georgia issued orders addressing the judiciary's use of AI. On October 21, 2024, the Delaware Supreme Court adopted an interim AI policy for judges and court personnel (online: https://courts.delaware.gov/forms/download.aspx?id=266848). It requires users to maintain technology competence and outlines the appropriate usage of authorized AI tools, including the requirement that “(u)sers may not delegate their decision-making function to Approved GenAI.” 

The State of Georgia’s order addressed the formation of its Ad Hoc Committee on Artificial Intelligence (online: https://jcaoc.georgiacourts.gov/wp-content/uploads/2024/10/AI_Committee_Orders.pdf). It appointed sixteen people to the committee, whose mission is to assess “the risks and benefits of the use of generative AI on the courts and to make recommendations to ensure that the use of AI does not erode public trust and confidence in the judicial system.”

While guidance for the judiciary has been less plentiful, it remains valuable. These guidelines offer a clear roadmap for adopting AI responsibly, ensuring that judicial integrity is preserved throughout implementation. As AI technology advances rapidly, the judiciary must keep pace by leveraging AI’s potential to streamline processes and improve the quality of justice. By committing to continuous education and adhering to these standards, courts can gain the benefits of AI while upholding judicial integrity and maintaining public trust.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].

 

 


Amid a Flurry of AI Ethics Opinions, New Mexico Weighs In

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

Amid a Flurry of AI Ethics Opinions, New Mexico Weighs In

Did you know that in less than two years, more than ten U.S. jurisdictions have issued guidance on generative artificial intelligence (AI)? For the past two decades, I’ve written about legal technology. My goal has always been to help legal professionals navigate the twists and turns of 21st-century innovations. From blogging and social media to cloud and mobile computing, I’ve encouraged members of my profession to actively learn about and implement technology into their practices.

Initially, my efforts felt like swimming upstream. Very few colleagues were receptive, and only the most tech-savvy showed interest in new tools and platforms. Adoption rates were slow, and my attempts to educate were often met with indifference.

Then, in early 2020, the pandemic struck, forcing lawyers to work remotely, conduct meetings online, and rely heavily on cloud-based tools. Attitudes shifted almost overnight, leading to a dramatic spike in technology adoption.

In many ways, the pandemic had the effect of priming legal professionals to be open to new tools and ways of working. This change of heart could not have come at a better time, and when generative artificial intelligence (AI) was unleashed, attorneys were immediately receptive and curious about its potential to streamline their workflows and increase law firm profitability.

When ChatGPT was released in November 2022, it amounted to a technological tidal wave whose impact on the practice of law continues to be felt today. The amount of ethics guidance handed down over the past two years focused on a single technology is unprecedented. This rapid response reflects both heightened concerns about potential risks and an acknowledgment of the significant impact that AI could have on the practice of law. By my count, at least eleven jurisdictions have issued guidance or opinions on the ethics of using AI in law firms, including California, Florida, New Jersey, Michigan, New York, Pennsylvania, Kentucky, Virginia, D.C., and the American Bar Association.

Most recently, New Mexico joined their ranks, issuing Formal Ethics Advisory Opinion 2024-005 (online: https://www.sbnm.org/Portals/NMBAR/GenAI%20Formal%20Opinion%20-%20Sept_2024_FINAL.pdf). At issue was whether lawyers may use generative AI in the practice of law. The short answer? Yes.

The State Bar of New Mexico Ethics Advisory Committee determined that, generally speaking, “the responsible use of Generative AI is consistent with lawyers’ (ethical) duties.” According to the committee, generative AI offers many potential benefits for lawyers and their clients, increasing efficiency and reducing costs.

The Committee offered a number of examples of use cases, which include the initial drafting of legal documents and routine correspondence, assisting with drafting complex contracts or cross-examining witnesses, and streamlining discovery. 

Importantly, the Committee clarified that lawyers are not required to use this technology, but “those lawyers who choose to do so…must do so responsibly, recognizing that the use of Generative AI does not change their fundamental duties under the Rules of Professional Conduct.” 

Interestingly, the Committee offered a unique take on the risk of law firm data being used to train AI models. According to the Committee, conflict of interest issues could be triggered when using generative AI since “there is a risk that future outputs may use information relating to the prior representation or concurrent representation by another lawyer in the same firm in a way that disadvantages the prior/other client.” The Committee cautioned that if lawyers are unable to verify a lack of a conflict, they should avoid inputting confidential client data into a generative AI tool unless they’ve confirmed that the tool possesses safeguards that “protect prior client information and…screen potential conflicts.”

The Committee also addressed many other ethical issues that are implicated when lawyers use generative AI, including confidentiality, candor toward the tribunal, AI costs and billing, and supervisory issues. Make sure to read the full opinion for their in-depth analysis of these topics, especially if you happen to practice law in New Mexico.

No matter where you practice, one thing is clear: keeping up with the pace of change is essential. Given generative AI's rapid advancement, it is more important than ever to stay informed, uphold ethical standards, and take full advantage of AI’s benefits. By doing so, you’ll be well-positioned to thrive, ultimately providing better client service and staying ahead of the curve in an increasingly competitive legal marketplace.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].

 

 

 


New York Surrogate’s Court on Admissibility of AI Evidence

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

New York Surrogate’s Court on Admissibility of AI Evidence

The last few decades have seen rapid technological advances. For busy lawyers, keeping up with the pace of change has been a challenging endeavor. For many, the inclination has been to ignore the latest advancements in favor of maintaining the status quo.

Unfortunately, that approach has proven ineffective. 21st-century technologies have infiltrated all aspects of our lives, from how we communicate, make purchases, and obtain information to how we conduct business. Turning a blind eye is no longer an option. Instead, it is necessary to prioritize learning about emerging technologies, including their potential implications for your law practice, your clients’ cases, and your law license. 

This enlightened approach is essential as we enter the artificial intelligence (AI) era. Like the technologies that preceded it, AI will inevitably impact many aspects of your law practice, even if you choose not to incorporate it into your firm’s daily workflows.

For example, just as social media evidence has altered the course of trials, so too has artificial intelligence. A case in point is Saratoga County Surrogate’s Court Judge Schopf's October 10th Court Order in Matter of Weber (2024 NY Slip Op 24258). One issue under consideration in this case was the use of generative AI-produced evidence at a hearing.

In Matter of Weber, the Petitioner filed a Petition for Judicial Settlement of the Interim Account of the Trustee. The Objectant responded by filing objections to the Trust Account alleging, in relevant part, that the Petitioner had breached her fiduciary duty as Trustee. A hearing was held to address the Objectant’s allegations. 

This opinion followed. In it, the Court considered whether the Objectant had overcome the prima facie accuracy of the Interim Account and proved his objections. One issue addressed was whether and under what circumstances AI-generated output is admissible into evidence.

The hearing testimony revealed that the Objectant's expert witness, Charles Ranson, used Microsoft’s generative AI tool, Copilot, to cross-check his damage calculations. The evidence showed that Ranson could not provide the input or prompt used, nor could he advise regarding the sources relied upon and the process used by the chatbot to create the output provided. 

When determining the admissibility of Copilot’s responses, Judge Schopf explained that the “mere fact that artificial intelligence has played a role, which continues to expand in our everyday lives, does not make the results generated by artificial intelligence admissible in Court.”

The Court concluded that the reliability of AI-generated responses must be established before they are admitted into evidence. The Court explained that “due to the nature of the rapid evolution of artificial intelligence and its inherent reliability issues that prior to evidence being introduced which has been generated by an artificial intelligence product or system, counsel has an affirmative duty to disclose the use of artificial intelligence and the evidence sought to be admitted should properly be subject to a Frye hearing prior to its admission, the scope of which should be determined by the Court, either in a pre-trial hearing or at the time the evidence is offered.”

According to Judge Schopf, the Objectant failed to meet that burden: “In the instant case, the record is devoid of any evidence as to the reliability of Microsoft Copilot in general, let alone as it relates to how it was applied here. Without more, the Court cannot blindly accept as accurate, calculations which are performed by artificial intelligence.”

This decision evinces the growing need to carefully scrutinize AI-generated evidence in legal proceedings. Courts are unlikely to admit this type of evidence at this early stage unless its reliability is established beforehand. As this technology becomes commonplace, these standards may evolve and become more elastic. Only time will tell.

In the interim, take steps to proactively learn about AI tools so that you can advocate for or challenge their use in court effectively. By staying informed, you will be well-positioned to meet both the opportunities and challenges posed by AI-driven evidence. There’s no better time than now to get up to speed. Start learning about generative AI today to ensure that you are prepared for the future of law.

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].


Florida’s Professional Conduct Rules Will Include AI—But Was It Needed?

 

Here is my recent Daily Record column. My past Daily Record articles can be accessed here.

****

Florida’s Professional Conduct Rules Will Include AI—But Was It Needed?

When faced with the impact of a potentially disruptive technology, our profession follows a very predictable path: ignorance, indifference, overreaction, readjustment, begrudging acceptance, and finally, appreciation. With the sudden emergence and advancement of generative artificial intelligence (AI), the cycle has started, and all signs point to deep entrenchment in the overreaction phase.

OpenAI released ChatGPT 3.5 nearly two years ago, in November 2022. Since then, AI's effects have been inescapable, rapid, and significant, with unavoidable cascading changes occurring. Headlines about lawyers relying on generative AI tools and submitting briefs to courts that include fake case citations have only amplified already heightened and overblown concerns about AI.

These reactions are unsurprising given AI's wide-ranging potential to revamp core legal functions, from legal research and document drafting to litigation analytics and contract review. In the face of inevitable change, lawyers are now focused on whether AI is a tool that will enhance their practices or a force that could undermine or even replace the practice of law as we know it.

In response to these concerns, several jurisdictions across the United States have formed AI committees, issued guidance, or authored opinions to help lawyers navigate a strange new world where AI runs rampant. More than eight states, including Florida, California, and Michigan, have taken formal steps to address AI’s role in legal practice. 

While these efforts are welcome to the extent that they help to encourage adoption, they are arguably unnecessary. Current rules and guidance on technology usage are more than sufficient.

The most recent efforts arose in Florida, where the Bar took the extreme step of modifying the Rules Regulating The Florida Bar to include references to generative AI. On August 29, the Florida Supreme Court adopted the amendments proposed by the Bar. These changes will go into effect on October 28.

One update was to the comment regarding the competency requirement. It now advises that lawyers must stay on top of technology changes, “including generative artificial intelligence.” 

Additionally, the duty of confidentiality now includes the obligation to “be aware that generative artificial intelligence may create risks to the lawyer’s duty of confidentiality.” 

Similarly, the duty of supervision now requires that supervising attorneys must “consider safeguards for the firm’s use of technologies such as generative artificial intelligence, and ensure that inexperienced lawyers are properly supervised.”

Finally, Rule 4-5.3, which addresses lawyers’ responsibilities regarding nonlawyer assistants, now requires that a “lawyer should also consider safeguards when assistants use technologies such as generative artificial intelligence.”

These amendments were unnecessary and unwise and will not withstand the tests of time. AI is simply a new tool. Other technologies preceded it, and new ones will follow. It is not the be-all and end-all of technology or our profession, and trying to ban or reduce its use by lawyers is a pointless, ineffectual endeavor that fails to serve the needs of our profession in the AI era.

The reactions by state bars to AI are entirely predictable. No matter the technology, our profession has tried to regulate it, from email and the Internet to social media and cloud computing. 

Demonizing new technology and banning its use have been par for the course. Eventually, however, a resigned acceptance set in as each tool became commonplace.

AI will be no different. Soon, we’ll move on from overreaction to the later phases, ultimately landing on appreciation. This process will happen much faster than it has in the past due to AI’s rapid rate of advancement. So buckle up, shore up your AI knowledge, and hold on, folks! The times are a-changin’, and quickly. Catch up while you still can!

Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally-recognized author of "Cloud Computing for Lawyers" (2012) and co-authors "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].