Two days after the US election, I moderated and participated in a Society of Corporate Compliance and Ethics (SCCE) panel on ESG through the life cycle of a business with Eugenia Maria Di Marco, who focused on startups and international markets, and Ahpaly Coradin, who focused on M&A, private equity, and corporate governance.

I shared these stats with the audience before we delved into the discussion:

  • In July 2024, SHRM, the
  • The Society of Corporate Compliance and Ethics is hosting a virtual ESG and Compliance Conference on November 7. I love to hear academics talk about these issues at conferences, but because I still engage in the practice of law and I teach about compliance, governance, and sustainability, I find the conversations are very different when listening to practitioners.

    My panel is titled ESG Due Diligence Across the Corporate Lifecycle From Start-Up to Maturity: The Roles of Compliance, Ethics, Legal, and the Board. My co-panelists, Ahpaly Coradin, Partner, Pierson Ferdinand, and Eugenia di Marco, a startup founder and international legal advisor, and I will focus on:

    • how to measure and prioritize ESG factors at different stages of a company's life cycle, according to its industry and technology use.
    • how ESG creates value in M&A beyond risk mitigation, and the impact of ESG on target selection, valuation, and integration.
    • board and management responsibilities in overseeing and managing ESG-related risks, particularly in light of Caremark duties and Marchand.

    Date & Time: Thursday, November 7 from 12:45 PM – 1:45 PM Central Time

    Other topics that speakers will discuss include:

    • Supply chains and European due diligence 
    • Global regulatory and legislative developments
    • Sustainable governance

    We just finished our second week of the semester and I’m already exhausted, partly because I just submitted the first draft of a law review article (123 pages with over 600 footnotes) on a future-proof framework for AI regulation to the University of Tennessee Journal of Business Law. I should have stuck with my original topic of legal ethics and AI.

    But alas, who knew so much would happen in 2023? I certainly didn’t, even though I spent the entire year speaking on AI to lawyers, businesspeople, and government officials. So, I decided to change my topic in late November as it became clearer that the EU would finally take action on the EU AI Act and that the Brussels effect would likely take hold, requiring other governments and all the big players in the tech space to take notice and sharpen their own agendas.

    But I’m one of the lucky ones because although I’m not a techie, I’m a former chief privacy officer and spend a lot of time thinking about things like data protection and cybersecurity, especially as they relate to AI. And I recently assumed the role of GC of an AI startup. So

    Last week I had the pleasure of joining my fellow bloggers at the UT Connecting the Threads Conference on the legal issues related to generative AI (GAI) that lawyers need to understand for their clients and their own law practice. Here are some of the questions I posed to the audience and some recommendations for clients. I'll write about ethical issues for lawyers in a separate post. In the meantime, if you're using OpenAI or any other GAI, I strongly recommend that you read the terms of use. You may be surprised by certain clauses, including the indemnification provisions. 

    I started by asking the audience members to consider which legal areas are most affected by GAI. Although there are many, I'll focus on data privacy and employment law in this post.

    Data Privacy and Cybersecurity

    Are the AI tools and technologies you use compliant with relevant data protection and privacy regulations, such as GDPR and CCPA? Are they leaving you open to a cyberattack?

    This topic also came up today at a conference at NCCU when I served as a panelist on cybersecurity preparedness for lawyers.

    Why is this important?

    ChatGPT was banned in Italy for a time.

    Depending on who you talk to, you get some pretty extreme perspectives on generative AI. In a former life, I used to have oversight of the lobbying and PAC money for a multinational company. As we all know, companies never ask to be regulated. So when an industry begs for regulation, you know something is up. 

    Two weeks ago, I presented the keynote speech to the alumni of AESE, Portugal’s oldest business school, on the topic of my research on business, human rights, and technology with a special focus on AI. If you're attending Connecting the Threads in October, you'll hear some of what I discussed.

    I may have overprepared, but given the C-Suite audience, that’s better than the alternative. For me that meant spending almost 100 hours reading books, articles, and white papers, and watching videos by data scientists, lawyers, ethicists, government officials, CEOs, and software engineers.

    Because I wanted the audience to really think about their role in our future, I spent quite a bit of time on the doom and gloom scenarios, which the Portuguese press highlighted. I cited the talk by the creators of The Social Dilemma, who warned about the dangers of social

    Corporate governance has become a bit of an alphabet soup over the years–CSR,* DEIB,** and ESG*** (among other initialisms) are all part of the current practical lexicon for those of us working with businesses. As a day celebrating the emancipation of the last enslaved Black Americans, Juneteenth connects with so many of those acronyms in one way or another. Businesses have been noticing.

    For example, Hassina Obaidy's June 13, 2023 article, Juneteenth in the Workplace: Why your company should celebrate, posted on the website of workplace training and compliance provider Emtrain, offers one perspective on Juneteenth and CSR.

    Black Lives Matter has taught both individuals and companies what allyship can really look like. We’ve also learned that the passing of time is not enough to make real change. Companies need to support employees that come from demographics that have historically been marginalized through company policies, workplace culture, and corporate social responsibility (CSR). Giving employees a day off to celebrate Juneteenth and engage with their communities in a productive way is one step leaders can take to move the needle on CSR.

    Similarly, in an article entitled Seven thoughtful ideas for observing Juneteenth in the workplace, Christina Bibby at

    A few months ago, I asked whether people in the tech industry were the most powerful people in the world. This is part II of that post.

    I posed that question after speaking at a tech conference in Lisbon sponsored by Microsoft. They asked me to touch on business and human rights, and I presented the day after the company announced a ten-billion-dollar investment in OpenAI, the creator of ChatGPT. Back then, we were amazed at what ChatGPT 3.5 could do. Members of the audience were excited and terrified, and these were tech people.

    And that was before the explosion of ChatGPT4. 

    I've since made a similar presentation about AI, surveillance, and social media companies to law students, engineering students, and businesspeople. In the last few weeks, over 10,000 people, including Elon Musk, have called for a six-month pause in training AI systems. If you don't trust Musk's judgment (or that of the other scientists and futurists), trust the "Godfather of AI," who recently quit Google so he could speak out on the dangers, even though Google has put out its own whitepaper on AI development. Watch the 60 Minutes interview with the CEO of

    My mind is still reeling from my trip to Lisbon last week to keynote at the Building The Future tech conference sponsored by Microsoft.

    My premise was that those in the tech industry are arguably the most powerful people in the world and with great power comes great responsibility and a duty to protect human rights (which is not the global state of the law).

    I challenged the audience to consider the financial price of implementing human rights by design and the societal cost of doing business as usual.

    In 20 minutes, I covered AI bias and new EU regulations; the benefits and dangers of ChatGPT; the surveillance economy; the UNGPs and UN Global Compact; a new suit by Seattle’s school board against social media companies alleging harmful mental health impacts on students; potential corporate complicity with rogue governments; and the upcoming Supreme Court case on Section 230 and content moderator responsibility for “radicalizing” users. I also made recommendations for the governmental, business, civil society, and consumer members of the audience.

    Thank goodness I talk quickly.

    Here are some non-substantive observations and lessons. In a future post, I'll go in more depth about my substantive remarks. 

    1. Your network

    An ambitious question, yes, but it was the title of the presentation I gave at the Society for Socio-Economists Annual Meeting, which closed yesterday. Thanks to Stefan Padfield for inviting me.

    In addition to teaching Business Associations to 1Ls this semester and running our Transactional Skills program, I'm also teaching Business and Human Rights. I had originally planned the class for 25 students, but now have 60 students enrolled, which is a testament to the interest in the topic. My pre-course surveys show that the students fall into two distinct camps. Most are interested in corporate law but didn't even know there was a connection to human rights. The minority are human rights die-hards who haven't even taken business associations (and may only learn about it for bar prep) but are curious about the combination of the two topics. I fell in love with this relatively new legal field twelve years ago, and it's my mission to ensure that future transactional lawyers have some exposure to it.

    It's not just a feel-good way of looking at the world. Whether you love or hate ESG, business and human rights shows up in every factor and many firms have built

    I'm a huge football fan. I mean real football– what people in the US call soccer. I went to Brazil twice for the 2014 World Cup and have watched as many matches on TV as I could during the last tournament and this one. In some countries, over half of the residents watch when their team plays, even though most matches happen during work hours or in the middle of the night. NBC estimates that 5 billion people across the world will watch this World Cup, an average of 227 million people a day. For perspective, roughly 208 million people, two-thirds of the population, watched Super Bowl LVI in the US, which takes place on a Sunday.

    Football is big business for FIFA and for many of its sponsors. Working with companies such as Adidas, Coca-Cola, Hyundai/Kia, Visa, McDonald's, and Budweiser has earned nonprofit FIFA a record $7.5 billion in revenue for this Cup. Fortunately for Budweiser, which paid $75 million to sponsor the World Cup, Qatar does not ban alcohol. But in a plot twist, the company had to deal with a last-minute stadium ban. FIFA was more effective in Brazil, which has