A law firm recently reached out to me to conduct a CLE on Mental Health Challenges in the Age of AI. It was an interesting request. I’ve spoken about AI issues on panels, as a keynote speaker, and in the classroom, and I wrote about it for the Tennessee Journal of Business Law. I also conduct workshops and CLEs on mental health in the profession. But I’ve never been asked to combine the topics.

Before I discussed issues related to anxiety about job disruption and how cognitive overload affects the brain, I spent time talking about the various tools that are out there and how much our profession will transform in the very near future.

If you’re like many lawyers I know, you think that AI is more hype than substance. So I’ll pass along the information I shared with the firm.

According to a 2024 Bloomberg survey on AI and the legal profession, 69% of respondents believe generative AI can be used ethically in legal practice. But they harbor “extreme” or “moderate” concerns about deepfakes (e.g., human impersonations), hallucinations and the accuracy of AI-generated text, privacy, algorithmic bias, IP, and, of course, job displacement.

Those are

We just finished our second week of the semester, and I’m already exhausted, partly because I just submitted to the University of Tennessee Journal of Business Law the first draft of a 123-page law review article, with over 600 footnotes, on a future-proof framework for AI regulation. I should have stuck with my original topic of legal ethics and AI.

But alas, who knew so much would happen in 2023? I certainly didn’t, even though I spent the entire year speaking on AI to lawyers, businesspeople, and government officials. So I decided to change my topic in late November, as it became clearer that the EU would finally take action on the EU AI Act and that the Brussels effect would likely take hold, requiring other governments and all the big players in the tech space to take notice and sharpen their own agendas.

But I’m one of the lucky ones because although I’m not a techie, I’m a former chief privacy officer and spend a lot of time thinking about things like data protection and cybersecurity, especially as they relate to AI. And I recently assumed the role of GC of an AI startup. So

I’m a law professor, the general counsel of a medtech company, a podcaster, and a consultant who designs and delivers courses on a variety of topics. I think about and use generative AI daily, and it has really boosted my productivity. Apparently, I’m unusual among lawyers. According to a Wolters Kluwer Future Ready Lawyer report that surveyed 700 legal professionals in the US and EU, only 15% of lawyers are using generative AI right now, but 73% expect to use it next year. Of those surveyed, 43% see it as an opportunity, 25% see it as a threat, and 26% see it as both.

If you’re planning to be part of the 73% and you practice in the US, here are some ethical implications with citations to select model rules. A few weeks ago, I posted here about business implications that you and your clients should consider.

  • How can you stay up-to-date with the latest advancements in AI technology and best practices, ensuring that you continue to adapt and evolve as a legal professional in an increasingly technology-driven world? Rule 1.1 (Competence)
  • How can AI tools be used effectively and ethically to enhance your practice, whether in legal research,

Andrew Granato has posted his draft paper After the “Partner Run”: the Dewey & LeBoeuf Diaspora on SSRN. You can find it here. The abstract reads as follows:

“Partner runs” are a phenomenon distinctive to the American legal profession, a result of legal professional responsibility rules, partnership governance, and bankruptcy law that occasionally causes individual law firms to spiral into liquidation following unexceptional setbacks. It is unclear whether this idiosyncratic feature of law firm collapse can pose a threat to the industrial organization of the legal profession. Can lawyers easily recover and recreate the benefits of law firm scale by re-merging into other law firms with ease, or does a partner run mark a scarlet letter that poisons lawyers’ careers, and the legal profession as a whole, permanently?

I provide the first rigorous examination of this issue using the case study of the 2012 downfall of Dewey & LeBoeuf, the largest law firm bankruptcy ever. I hand-construct a dataset using public information in directories, news reports, and LinkedIn of the career outcomes of every lawyer who worked at Dewey’s U.S. offices in 2012 and a control group of similarly situated lawyers at law firms identified to me by

Last week I had the pleasure of joining my fellow bloggers at the UT Connecting the Threads Conference to discuss the legal issues related to generative AI (GAI) that lawyers need to understand for their clients and their own law practice. Here are some of the questions I posed to the audience and some recommendations for clients. I'll write about ethical issues for lawyers in a separate post. In the meantime, if you're using OpenAI's tools or any other GAI, I strongly recommend that you read the terms of use. You may be surprised by certain clauses, including the indemnification provisions.

I started by asking the audience members to consider which legal areas are most affected by GAI. Although there are many, I'll focus on data privacy and employment law in this post.

Data Privacy and Cybersecurity

Are the AI tools and technologies you use compliant with relevant data protection and privacy regulations, such as GDPR and CCPA? Are they leaving you open to a cyberattack?

This topic also came up today at a conference at NCCU when I served as a panelist on cybersecurity preparedness for lawyers.

Why is this important?

ChatGPT was banned in Italy for a time.

Greetings from SEALS, where I've just left a packed room of law professors grappling with some thorny issues related to ChatGPT-4, Claude 2, Copilot, and other forms of generative AI. I don't have answers to the questions below and some are well above my pay grade, but I am taking them into account as I prepare to teach courses in transactional skills; compliance, corporate governance, and sustainability; and ethics and technology this Fall.

In no particular order, here are some of the questions/points raised during the three-hour session. I'll have more thoughts on using AI in the classroom in a future post.

  1. AI detectors that schools rely on have high false-positive rates for non-native speakers and neurodivergent students, and they are easy to evade. How can you reliably ensure that students aren't using AI tools such as ChatGPT if you've prohibited them?
  2. If we allow the use of AI in classrooms, how do we change how we assess students?
  3. If our goal is to teach the mastery of legal skills, what are the legal skills we should teach related to the use of AI? How will our students learn critical thinking skills if they can

If you follow me on LinkedIn, you know that I posted almost every day in May for Mental Health Awareness Month.
 
Last week, I had the opportunity to discuss mental health and well-being for an AmLaw 20 firm (one of my coaching clients) that opened the presentation up to all of its legal professionals. Hundreds registered. Too often, firms or companies focus only on those with the highest salaries. As a former paralegal, I know how stressful that job can be. And I know I could never have done my job as a lawyer without the talented legal professionals who supported me.

Here are some scary statistics that I shared from the most recent ALM Mental Health and Substance Abuse Survey.

If you’re a law firm leader or work with legal professionals in any capacity, please read the report and take action. If you can’t get rid of the billable hour (which would solve a lot of issues), think about how you allocate work, respond to unreasonable client demands, and reward toxic perfectionism and overwork. 

✅ 71% of the nearly 3,000 lawyers surveyed said they had anxiety

✅ 45% said their morale has not changed since the pandemic

✅ 38%

Professors Jordan Neyland (George Mason University, Antonin Scalia Law School), Tom Bates (Arizona State University), and Roc Lv (ANU/Jiangxi University) have recently posted their article, Who Are the Best Law Firms? Rankings from IPO Performance, to SSRN. Here's the description:

If you have ever wondered who the best law firms are (which lawyer hasn’t?), have a look at our new ranking. My co-authors—Tom Bates at ASU and Roc Lv at ANU/Jiangxi University—and I developed a ranking method based on law firms’ clients’ outcomes in securities markets. 

There is no shortage of recent scandals in rankings in law. In particular, U.S. News’ law school rankings receive criticism for focusing too much on inputs, such as student quality or acceptance rates, instead of student outcomes like job quality and success in public interest careers. Many schools even refuse to submit data or participate in the annual ranking. Similar critiques apply to law firm rankings. We propose that our methodology improves upon existing methods, which frequently use revenue, profit, or other size-related measures to proxy for quality and reputation. Instead, we focus on the most important outcomes for clients: litigation rates, disclosure, pricing, and legal costs.

By focusing on the most relevant outcomes, this

Warning: this post addresses suicide.

I was supposed to post yesterday about a different topic but I'm posting today and not next week because someone needs to read this today.

Maybe it's you. Maybe it's your "strong" friend or colleague.

I found out yesterday that I lost a former student to suicide. She lit up every room she walked into and inspired me, her classmates, and everyone she met. I had no idea she was living in such darkness. Lawyers, law students, compliance professionals, and others in high-stress roles are conditioned to be on top of everything. We are the strong ones that clients and colleagues rely on. We worry so much about the stigma of not being completely in control at all times that we don't get help. We worry that clients won't trust us with sensitive or important matters. We worry that we won't pass the character and fitness assessments to get admitted to the bar.

The CDC released a report this week showing an alarming rise in depression, suicidal thoughts, and anxiety among our youth. The report noted that:

  • Female students and LGBQ+ students are experiencing alarming rates of violence, poor mental health, and suicidal thoughts

An ambitious question, yes, but it was the title of the presentation I gave at the Society for Socio-Economists Annual Meeting, which closed yesterday. Thanks to Stefan Padfield for inviting me.

In addition to teaching Business Associations to 1Ls this semester and running our Transactional Skills program, I'm also teaching Business and Human Rights. I had originally planned the class for 25 students but now have 60 enrolled, which is a testament to the interest in the topic. My pre-course surveys show that the students fall into two distinct camps. Most are interested in corporate law but didn't even know there was a connection to human rights. The minority are human rights die-hards who haven't even taken Business Associations (and may only learn about it for bar prep) but are curious about the combination of the two topics. I fell in love with this relatively new legal field twelve years ago, and it's my mission to ensure that future transactional lawyers have some exposure to it.

It's not just a feel-good way of looking at the world. Whether you love or hate ESG, business and human rights shows up in every factor and many firms have built