Two days after the US election, I moderated and participated in a Society of Corporate Compliance and Ethics (SCCE) panel on ESG through the life cycle of a business with Eugenia Maria Di Marco, who focused on startups and international markets, and Ahpaly Coradin, who focused on M&A, private equity, and corporate governance.

I shared these stats with the audience before we delved into the discussion:

  • In July 2024, SHRM, the
We just finished our second week of the semester and I’m already exhausted, partly because I just submitted to the University of Tennessee Journal of Business Law the first draft of a 123-page law review article, with over 600 footnotes, on a future-proof framework for AI regulation. I should have stuck with my original topic of legal ethics and AI.

But alas, who knew so much would happen in 2023? I certainly didn’t, even though I spent the entire year speaking on AI to lawyers, businesspeople, and government officials. So, I decided to change my topic in late November as it became clearer that the EU would finally take action on the EU AI Act and that the Brussels effect would likely take hold, requiring other governments and all the big players in the tech space to take notice and sharpen their own agendas.

But I’m one of the lucky ones because although I’m not a techie, I’m a former chief privacy officer and spend a lot of time thinking about things like data protection and cybersecurity, especially as they relate to AI. And I recently assumed the role of GC of an AI startup. So

Last week I had the pleasure of joining my fellow bloggers at the UT Connecting the Threads Conference to discuss the legal issues related to generative AI (GAI) that lawyers need to understand for their clients and their own law practice. Here are some of the questions I posed to the audience and some recommendations for clients. I'll write about ethical issues for lawyers in a separate post. In the meantime, if you're using OpenAI or any other GAI tool, I strongly recommend that you read the terms of use. You may be surprised by certain clauses, including the indemnification provisions.

I started by asking the audience members to consider which legal areas are most affected by GAI. Although there are many, I'll focus on data privacy and employment law in this post.

    Data Privacy and Cybersecurity

    Are the AI tools and technologies you use compliant with relevant data protection and privacy regulations, such as GDPR and CCPA? Are they leaving you open to a cyberattack?

This topic also came up today at a conference at NCCU, where I served as a panelist on cybersecurity preparedness for lawyers.

    Why is this important?

ChatGPT was banned in Italy for a time.

    Depending on who you talk to, you get some pretty extreme perspectives on generative AI. In a former life, I used to have oversight of the lobbying and PAC money for a multinational company. As we all know, companies never ask to be regulated. So when an industry begs for regulation, you know something is up. 

    Two weeks ago, I presented the keynote speech to the alumni of AESE, Portugal’s oldest business school, on the topic of my research on business, human rights, and technology with a special focus on AI. If you're attending Connecting the Threads in October, you'll hear some of what I discussed.

I may have overprepared, but given the C-Suite audience, that’s better than the alternative. For me, that meant spending almost 100 hours reading books, articles, and white papers and watching videos by data scientists, lawyers, ethicists, government officials, CEOs, and software engineers.

Because I wanted the audience to really think about their role in our future, I spent quite a bit of time on the doom-and-gloom scenarios, which the Portuguese press highlighted. I cited the talk by the creators of The Social Dilemma, who warned about the dangers of social

    A few months ago, I asked whether people in the tech industry were the most powerful people in the world. This is part II of that post.

I posed that question after speaking at a tech conference in Lisbon sponsored by Microsoft. They asked me to touch on business and human rights, and I presented the day after the company announced a ten-billion-dollar investment in OpenAI, the creator of ChatGPT. Back then, we were amazed at what ChatGPT 3.5 could do. Members of the audience were excited and terrified, and these were tech people.

And that was before the explosion of GPT-4.

I've since made similar presentations about AI, surveillance, and social media companies to law students, engineering students, and businesspeople. In the last few weeks, over 10,000 people, including Elon Musk, have called for a six-month pause in the training of advanced AI systems. If you don't trust Musk's judgment (or that of the other scientists and futurists), trust the "Godfather of AI," who recently quit Google so he could speak out on the dangers, even though Google has put out its own whitepaper on AI development. Watch the 60 Minutes interview with the CEO of

    My mind is still reeling from my trip to Lisbon last week to keynote at the Building The Future tech conference sponsored by Microsoft.

My premise was that those in the tech industry are arguably the most powerful people in the world, and with great power comes great responsibility and a duty to protect human rights (which is not yet the state of the law globally).

    I challenged the audience to consider the financial price of implementing human rights by design and the societal cost of doing business as usual.

In 20 minutes, I covered AI bias and new EU regulations; the benefits and dangers of ChatGPT; the surveillance economy; the UNGPs and UN Global Compact; a new suit by Seattle’s school board against social media companies alleging harmful mental health impacts on students; potential corporate complicity with rogue governments; the upcoming Supreme Court case on Section 230 and content moderator responsibility for “radicalizing” users; and made recommendations for the governmental, business, civil society, and consumer members in the audience.

    Thank goodness I talk quickly.

    Here are some non-substantive observations and lessons. In a future post, I'll go in more depth about my substantive remarks. 

    1. Your network

    An ambitious question, yes, but it was the title of the presentation I gave at the Society for Socio-Economists Annual Meeting, which closed yesterday. Thanks to Stefan Padfield for inviting me.

In addition to teaching Business Associations to 1Ls this semester and running our Transactional Skills program, I'm also teaching Business and Human Rights. I had originally planned the class for 25 students but now have 60 students enrolled, which is a testament to the interest in the topic. My pre-course surveys show that the students fall into two distinct camps. Most are interested in corporate law but didn't even know there was a connection to human rights. The minority are human rights die-hards who haven't even taken Business Associations (and may only learn about it for bar prep) but are curious about the combination of the two topics. I fell in love with this relatively new legal field twelve years ago, and it's my mission to ensure that future transactional lawyers have some exposure to it.

It's not just a feel-good way of looking at the world. Whether you love or hate ESG, business and human rights shows up in every factor, and many firms have built

I'm a huge football fan. I mean real football, what people in the US call soccer. I went to Brazil twice for the 2014 World Cup and have watched as many matches on TV as I could during the last tournament and this one. In some countries, over half of the residents watch when their team plays, even though most matches happen during work hours or in the middle of the night. NBC estimates that 5 billion people across the world will watch this World Cup, an average of 227 million people a day. For perspective, roughly 208 million people, about two-thirds of the population, watched Super Bowl LVI in the US, which takes place on a Sunday.

Football is big business for FIFA and for many of its sponsors. Working with companies such as Adidas, Coca-Cola, Hyundai/Kia, Visa, McDonald's, and Budweiser has earned nonprofit FIFA a record $7.5 billion in revenue for this Cup. Fortunately for Budweiser, which paid $75 million to sponsor the World Cup, Qatar does not ban alcohol. But in a plot twist, the company had to deal with a last-minute stadium ban. FIFA was more effective in Brazil, which has

As much as I love being a professor, it can be hard. I’m not talking about the grading, keeping the attention of the TikTok generation, or helping students with rising mental health challenges.

    I mean that it’s hard to know what to say in a classroom. On the one hand, you want to make sure that students learn and understand the importance of critical thinking and disagreeing without being disagreeable.

On the other hand, you worry about whether a factual statement taken out of context, or your interpretation of an issue, could land you in the crosshairs of cancel culture without the benefit of any debate or discussion.

I’m not an obvious candidate for this kind of worry. Although I learned from some of the original proponents of critical race theory in law school, that’s not my area of expertise. I teach ESG, corporate law, and compliance issues.

    But I think about this dilemma when I talk about corporate responsibility and corporate speech on hot button issues. I especially think about it when I teach business and human rights, where there are topics that may be too controversial to teach because some issues are too close

Last month, I posted about an experiment I conducted with students and international lawyers. I’ve asked my law student, Kaitlyn Jauregui, to draft this post summarizing the groups’ reasoning and providing her insights. Next week, I’ll provide mine in light of what I’m hearing at various conferences, including this week’s International Bar Association meeting. This post is in her words.

After watching The Social Dilemma, participants completed a group exercise by deciding which social issues were a priority in the eyes of different tech industry stakeholders. The Social Dilemma is a 2020 docudrama that exposes how social media influences the behavior, mental health, and political views of users by subjecting them to various algorithms. Director Jeff Orlowski interviewed founders and former employees of some of the biggest tech companies in Silicon Valley to bring awareness to viewers.

Four kinds of groups completed the exercise: primarily American college students, primarily American law students, one group of Latin American lawyers, and one group of international lawyers. Each group deliberated from the perspective of a CEO, investor, consumer, or NGO. Acting as that stakeholder, the team then ranked the following issues in order of importance: Incitements to