AI Joel: Who owns him?
00:00:05:03 - 00:00:31:06
Lori MacVittie
Hi everyone. This is Pop Goes the Stack, your front row seat to emerging tech, digital duct tape, and the slow unraveling of sanity in production. See? It's already starting with me. I'm Lori MacVittie. Brace yourself, because we're going to be talking about a very interesting topic today. So the question is: who owns AI Joel? He's right here.
00:00:31:08 - 00:00:34:18
Joel Moses
That's right. I'm used to getting owned by you in every podcast, Lori.
00:00:34:21 - 00:00:35:18
Lori MacVittie
Oh, it's...
00:00:35:19 - 00:00:37:09
Joel Moses
Yeah, simple answer.
00:00:37:11 - 00:01:07:13
Lori MacVittie
Thank you. Thank you, Joel. If that is you, Joel, we're not really sure. That's kind of the point. CEOs are doing this. Digital twins are real. They're being used. And the question is: who owns you once your employer trains a model on everything you've written, said, or posted? So this week we're going to talk about the quiet clause in your employment contract that already says your IP, right, belongs to your employer.
00:01:07:15 - 00:01:30:17
Lori MacVittie
But what happens with generative AI? Does that get included? Do we need more clauses about training on our email, as Joel has pointed out? The point here is that, you know, if you create content for your job, whether it's blogs, podcasts, slide decks, internal documents, you're not just shipping knowledge anymore, right?
00:01:30:17 - 00:01:42:24
Lori MacVittie
You're training data. You're training a model. So we've got Ken with us again today. I love having Ken. Let's dig in. Who owns AI Joel?
00:01:42:27 - 00:02:01:11
Joel Moses
That's a fantastic question. Like I said, you obviously own me every single time we have a podcast here, Lori. But, you know, it is a really good question. When I think about the data sets that are generated, you mentioned some of them: blogs and decks and all sorts of different public postings.
00:02:01:11 - 00:02:31:17
Joel Moses
But there's also the fact that your employer actually owns the emails that you send as well. And think about the length of an average U.S. career: that's about 42 years. If you extrapolate that out and consider that an average work year consists of about 250 days and you're sending 40 emails a day, that's about 420,000 emails they can use as a source for obtaining your voice and some of your internal inputs.
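Joel's back-of-the-envelope arithmetic works out; here's a quick check in Python using the same illustrative figures he cites (the career length, workdays, and email volume are his assumptions, not measured data):

```python
# Quick check of Joel's back-of-the-envelope email estimate.
# All inputs are the illustrative figures cited on the podcast.
career_years = 42        # average U.S. career length Joel cites
workdays_per_year = 250  # average working days per year
emails_per_day = 40      # emails sent per workday

total_emails = career_years * workdays_per_year * emails_per_day
print(f"{total_emails:,} emails over a career")  # 420,000
```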
00:02:31:19 - 00:02:51:25
Joel Moses
And regardless of whether you generate decks or blog posts or anything like that, there is a data set that you generate as a matter of doing business. And so, you know, obviously the business owns that. They own the email. But do they necessarily have the right to use that data to create a digital twin of you?
00:02:51:26 - 00:02:57:12
Joel Moses
And I think that's an area where there isn't established case law yet.
00:02:57:14 - 00:03:18:24
Ken Arora
I would think there's an analogy here with intellectual property. I mean, come on, what Joel comes up with from his intellect is sort of the definition of intellectual property. And I think it will follow those precedents. It becomes an asset, and sometimes deliberately so: a CEO making a digital twin is a deliberate choice.
00:03:18:24 - 00:03:44:22
Ken Arora
It's no different than a book. If a CEO wrote a book about "this is how I would run the company, and these are my words of wisdom," it's a lot like this. A book is maybe a more rudimentary version, because it isn't as interactive, but I think that's the analogy. It'll be an asset that employers will ask for, that you'll be able to sign away, that you'll be able to have rights to. You could even go to: what are the rights after you pass away?
00:03:44:24 - 00:04:08:09
Joel Moses
That's true. There is a mechanism in case law, but it's state driven. The concept behind it is essentially around digital rights ownership, or the rights to your image, and that's called the right of publicity. It's being encoded into law in certain states, especially states that have strong ties to the entertainment industry.
00:04:08:09 - 00:04:34:00
Joel Moses
So I think Tennessee has one called the ELVIS Act, which restricts companies from training models on the likeness of popular performers and has specific compensation requirements related to that. And I think California has one as well. I even think there's an associated mechanism for changing copyright law to cover some of these things as well.
00:04:34:03 - 00:04:47:05
Joel Moses
But these are brand new things. They're not established in every single state, and so the right of publicity is not total. It's also unclear whether it would be affected by employment contracts.
00:04:47:07 - 00:05:07:27
Lori MacVittie
Well, it's also, I mean, right, if we leave the company, if we die, can they continue to use that? That's a good question. Another one is: let's say they have the rights to it while we're employed. We understand that, like you said, IP, patents, blog posts, we know all of that belongs to your employer.
00:05:07:27 - 00:05:35:20
Lori MacVittie
No big deal. But what if they create an AI Joel in your likeness? And then while you're off doing one thing, you're also off doing another thing, and words are put into your mouth that maybe you weren't comfortable with. So is that something that is going to happen? I'd like to think companies generally wouldn't, right, but...
00:05:35:22 - 00:05:55:15
Joel Moses
Yeah, this doesn't have that much to do with an employer's use of this, but it's already happened. I mean, we talked about it on this podcast: the event where Fortnite debuted a Darth Vader character that actually leveraged the actor's voice and likeness and then attached a generative AI model to it.
00:05:55:15 - 00:06:22:11
Joel Moses
And people found that they could game the system and make that likeness say anything they wanted. I'm sure the family that licensed his likeness to this particular effort wasn't terribly happy about that. But again, it's one of those things where, if it's your employer doing it and it's established in your employment contract, I don't think that is necessarily a settled area of law.
00:06:22:14 - 00:06:40:09
Joel Moses
I don't know that it really deals well with the right of publicity. And also, if you look at the right to privacy and some of the privacy laws that come out of the EU, you have a right to be forgotten in some regions, meaning I don't want my data sets to remain around for longer than they're needed.
00:06:40:11 - 00:06:46:04
Joel Moses
And so this is, this is definitely something that's going to be talked about for the next few years.
00:06:46:11 - 00:07:09:00
Ken Arora
Yeah. Now, another thing I'll take from copyright law is disclosure. You can use excerpts and bits and pieces, but if I generate something by AI Joel, I would think a legal requirement would be disclosing: this is not from Joel, this is from an AI facsimile, imitation, whatever we end up calling that, of Joel. And who knows, maybe you'd agree to that.
00:07:09:00 - 00:07:14:12
Ken Arora
Maybe they'd give you residuals, Joel, for every
Joel Moses
Wow!
Ken Arora
blog post.
00:07:14:14 - 00:07:40:22
Joel Moses
So if they create a model based on my likeness, appearance, and demeanor, then I get proceeds, posthumously perhaps, for the use of that? I mean, this is one of those things where we're reaching a point in technology where it may be possible to create a fairly good individual likeness that's trained on the writings and the emails that people have generated.
00:07:40:22 - 00:08:06:09
Joel Moses
And what does that mean? Is it a continuance of our employment contract with a company? Do they retain rights in perpetuity once you've signed on the dotted line? It's a real thing. But on that measure, Ken, how likely is it that someone could take data sets like the emails stored up from years of correspondence?
00:08:06:16 - 00:08:10:27
Joel Moses
How likely is it that they could create a fairly accurate model that responds like you?
00:08:11:00 - 00:08:32:07
Ken Arora
Well, my email persona is not necessarily the same persona I have outside of email, so I think they'd have a skewed data set there. But I think, you know, they could get a reasonable approximation of my tone and character. Whether that would be sufficient to figure out my chain of reasoning or my thought patterns...
00:08:32:10 - 00:09:00:15
Ken Arora
Maybe we're a little ways away from that. Maybe we'll get there. But I'll go another place with this, which is: what happens if I actually want this to happen? And you see this today. People are deliberately building tuned models that are trying to be good facsimiles of a person. I've seen this in the context of people who are passing away and want their children or their progeny to have a sense of what dad was like
00:09:00:15 - 00:09:07:26
Ken Arora
or mom was like.
Joel Moses
Interesting.
Ken Arora
And, you know, that's maybe an interesting use. Yeah.
00:09:07:28 - 00:09:26:08
Joel Moses
Yeah, and I think from an achievability perspective, one thing to understand about the technology is that there used to be a concept called the rule of ten, a very common concept in machine learning, which is that for a model with ten features, you need 100 elements as a starting point.
00:09:26:11 - 00:09:50:12
Joel Moses
With generative AI, you can actually use other types of training, things like reinforcement learning combined with synthetic data generation, and that drops the requirement to below the rule of ten. But you still need an immense amount of data. Having said that, I don't think people are actually aware of the immense amount of data they generate just by being employed.
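To make the heuristic Joel mentions concrete, here's a minimal sketch in Python. The rule-of-ten factor follows the heuristic as he states it; the synthetic-augmentation multiplier is a purely illustrative assumption, not a figure from any specific training recipe.

```python
# Rough illustration of the "rule of ten" heuristic (about ten training
# examples per model feature), plus a hypothetical multiplier showing how
# synthetic data generation can reduce the original data you must collect.

def rule_of_ten_examples(num_features: int, examples_per_feature: int = 10) -> int:
    """Classic heuristic: roughly ten labeled examples per feature."""
    return num_features * examples_per_feature

def original_examples_needed(target_examples: int, synthetic_multiplier: float) -> int:
    """If each real example can seed `synthetic_multiplier` usable synthetic
    variants, fewer original examples are needed (illustrative assumption)."""
    return max(1, round(target_examples / synthetic_multiplier))

features = 10
target = rule_of_ten_examples(features)         # 100 examples for 10 features
seeded = original_examples_needed(target, 4.0)  # hypothetical 4x synthetic expansion
print(f"Rule of ten: {features} features -> ~{target} examples")
print(f"With 4x synthetic augmentation: ~{seeded} original examples needed")
```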
00:09:50:15 - 00:10:11:14
Joel Moses
There's a lot. There's a ton. Every interaction you have, every public recording that you post to YouTube like this one, every email that you send, every email that is sent to you, is an element of data that can be incorporated into these models. And the data sets are immense.
00:10:11:16 - 00:10:34:17
Ken Arora
Right.
Lori MacVittie
And I think the scary part about that is not being able to replicate, you know, Ken's tone in an email. Because quite frankly, GPT can replicate my tone and style writing a blog, right? Level of snark, pacing, voice, everything. It can do that because it has all that data, as you point out. It's had enough that it can do that.
00:10:34:19 - 00:10:54:12
Lori MacVittie
The scary part is when they start doing the likeness, right? Then it starts to appear to be Ken. I'm on a Zoom call, and that's not Ken, but it is Ken, it answers like Ken. Is it Ken? I think that's the next leap where it gets really scary and the law starts to get muddled.
00:10:54:15 - 00:11:12:15
Lori MacVittie
You know, I know, Ken, you can't be in two places at once. So your AI is going to be over here giving this presentation while you're doing this other one. And then the question is: is that okay? Yeah, if it's in your contract, of course, you agreed to it. But it isn't right now.
00:11:12:22 - 00:11:15:17
Lori MacVittie
Is that okay? Is that something we want to allow?
00:11:15:19 - 00:11:31:03
Joel Moses
Yeah.
Ken Arora
Well, aren't there stories of people working multiple jobs, you know, time splitting thanks to remote work? And this seems like an area that would be very exploitable by digital twins.
Joel Moses
Yeah.
Ken Arora
Literal digital twins.
00:11:31:06 - 00:11:57:00
Joel Moses
Well, there are also the concerns people have over misuse, accidental misuse, or the loss of the data set or the model and it being used to ill effect. Two years ago, I had an interesting conversation with a CISO of a very large American company who told me point blank that his CEO came to him and said, hey, I need you to go out and discover how we could generate a deepfake of me.
00:11:57:02 - 00:12:17:21
Joel Moses
I'm tired of doing all these company meetings where I have to show up and practice and do all these things. What I would really like is to just write a script and then you can generate the company meeting presentation from that. And the gut reaction from the CISO was, oh my goodness, this is terrible.
00:12:17:23 - 00:12:47:00
Joel Moses
Why would someone possibly want to do that? And remember, this was two years ago. This is when deepfakes were okay, but you could still kind of see straight through them. And he pointed out to the CEO that if you can put words in your own mouth and have it look realistic and provide output to the entire company or the public, then anyone else with a word processor could put words in your mouth as well.
00:12:47:03 - 00:13:13:11
Joel Moses
And think about what would happen if, perhaps, a company announcement was made by someone who had stolen the deepfake model? That's a concern. The funny part about it is, as this stuff has gotten better and as it's embedded itself into society a little bit more, I've heard the desire for this more from C levels now than I ever heard back then.
00:13:13:14 - 00:13:16:03
Joel Moses
Which is a little concerning.
00:13:16:06 - 00:13:38:07
Lori MacVittie
That's kind of a security angle. Good thing we have Ken with us, right? How do you make sure you lock it down so it can only say what you want and not what someone else wants? Because otherwise, you're right, it becomes a huge target: let's get that model so we can use it to, you know, tank the stock, do this thing, whatever.
00:13:38:09 - 00:13:54:19
Ken Arora
Right. Yeah, I mean, I look at it as: who's really going to want a deepfake Ken? A deepfake Joel I get. You know, there's the deepfake Joel that people may want. But mostly for me, it's that they want to impersonate me, usually for ill.
00:13:54:23 - 00:14:15:11
Ken Arora
Right? They want to impersonate my voice because of biometric verification. They want to impersonate my likeness because they want to fool, you know, my mother into thinking I've been kidnapped or whatever else. There aren't so many actual legitimate uses. It's sort of funny. I mean, unless you're Joel or you're Taylor Swift. Otherwise, why would I care?
00:14:15:13 - 00:14:21:00
Joel Moses
Joel or Taylor Swift. Never been lumped into that category before. Okay.
00:14:21:03 - 00:14:55:06
Lori MacVittie
I'm not sure I would have come up with that one. That was good, Ken. But you make a good point. I mean, that's huge. We've seen, right, things like Google and, I'm afraid to say it, Meta, all of the assistants get activated just by voice in general. Even if you assume lockdown by voiceprint, suddenly now are we going to carry around the little model, like, okay, talk like Joel so you can get into, you know, this place? Those kinds of sci-fi scenarios we've seen in movies forever.
00:14:55:06 - 00:14:57:26
Joel Moses
The movie Sneakers comes to mind. My voice is my passport.
00:14:57:29 - 00:15:00:24
Lori MacVittie
That's right. Like, not even that's safe.
00:15:00:27 - 00:15:15:14
Ken Arora
You know, I avoid them. The one or two times I couldn't avoid doing the voiceprint, just as a side note, what I actually did was have Google Translate basically translate from English to English for me. So it wasn't my voice.
00:15:15:17 - 00:15:41:28
Joel Moses
Interesting. Poisoning the data set, so to speak, there, Ken, so it can't be reproduced. So, you know, it's really interesting. First of all, the law related to the right of publicity is state by state, and it does cover this. It does cover the generation of what I would call digital neural twins, as well as visual or auditory twins.
00:15:42:00 - 00:16:07:24
Joel Moses
It does cover that. But it is state by state, and it's very specific to certain industries in the state as well. There is nothing that covers this, at least from a U.S. federal perspective. I think European law is moving a lot faster in understanding this, and they're attaching the right to privacy to it. And so I think that's going to have to be something that multinational companies especially are going to have to take into account.
00:16:07:26 - 00:16:31:27
Joel Moses
But yeah, I don't know where this is going to go. I would expect that there are already lawyers inserting language for derivative IP based on the data that they've collected from you in the course of your employment. I think that's probably going to become a very common modality in employment law.
00:16:32:00 - 00:16:48:12
Lori MacVittie
Yeah. So apparently you can move to Wisconsin, because we recognize the right of publicity as a statutory right, not just common law. So it's codified. That's it. Move to Wisconsin. We have cows and a statutory right.
00:16:48:15 - 00:16:51:17
Joel Moses
That's not nearly as cool as a law named Elvis. But come on.
00:16:51:23 - 00:16:54:02
Ken Arora
The Elvis Act. I like that.
00:16:54:05 - 00:16:56:05
Lori MacVittie
What were we going to name it?
00:16:56:07 - 00:17:00:02
Ken Arora
Yeah, I, I will, I will
Joel Moses
Cheese head.
Lori MacVittie
Cheese law. Cheese law.
00:17:00:04 - 00:17:01:21
Joel Moses
Yeah.
00:17:01:24 - 00:17:15:06
Ken Arora
I've got to say, well, I was going to make a different point, but I will say, by the way, I think both Joel and Taylor Swift are from Nashville. So, just another...
Joel Moses
Very true, very true.
Ken Arora
I actually want to end with, well, not end, but say something I think is pretty optimistic. So we've talked about all the risks.
00:17:15:07 - 00:17:44:28
Ken Arora
And there are a lot of risks around this. But I'm specifically thinking about neural imitations. I think we're on the verge of something that could be very powerful. Could you imagine something that, taking it down to the technology level, think reinforcement learning, is continually with you? We're a few years away, but something that watches, makes predictions, and uses reinforcement learning with human feedback, your own feedback, to get better and better at approximating you. Not just in your work context.
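As a rough sketch of the loop Ken is imagining, the snippet below outlines in Python just the data-collection side of reinforcement learning from human feedback: a persona model drafts a response, the person rates it, and the rated pairs accumulate as the preference data a reward model or fine-tune could later learn from. Everything here (PersonaFeedbackLoop, the stubbed persona_model, the rating scale) is hypothetical, not from any real product or library.

```python
# Hypothetical sketch of the feedback loop Ken describes: a persona model
# drafts responses, the real person scores how "like them" each draft is,
# and the scored pairs become preference data for later fine-tuning or
# reward-model training. Names and the stand-in model are illustrative.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class FeedbackRecord:
    prompt: str
    draft: str
    rating: int  # e.g., 1 (not like me) .. 5 (exactly like me)

@dataclass
class PersonaFeedbackLoop:
    persona_model: Callable[[str], str]           # stand-in for a tuned LLM call
    records: List[FeedbackRecord] = field(default_factory=list)

    def collect(self, prompt: str, rating_fn: Callable[[str, str], int]) -> FeedbackRecord:
        draft = self.persona_model(prompt)        # the twin proposes an answer
        rating = rating_fn(prompt, draft)         # the human scores the draft
        record = FeedbackRecord(prompt, draft, rating)
        self.records.append(record)               # rated pairs accumulate as training signal
        return record

    def preference_data(self, threshold: int = 4) -> List[FeedbackRecord]:
        """Records good enough to treat as positive examples in a later fine-tune."""
        return [r for r in self.records if r.rating >= threshold]

# Toy usage with stubbed-in model and rating:
loop = PersonaFeedbackLoop(persona_model=lambda p: f"Draft reply to: {p}")
loop.collect("How would you answer this customer email?", rating_fn=lambda p, d: 4)
print(len(loop.preference_data()))  # 1
```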
00:17:45:05 - 00:18:10:21
Ken Arora
And now you have a virtual you that you can pass along. For centuries, books have been how you would speak from beyond the grave, how you would convey information, say, this is what I learned. And that's great. Now imagine having something interactive that you could chat with, and you can now talk to Einstein or Aristotle or Jesus Christ or, you know...
00:18:10:23 - 00:18:18:06
Joel Moses
Yeah, it's an interesting concept. But is that a corporate-owned entity, or is that owned by the individual and the individual's progeny?
00:18:18:09 - 00:18:18:28
Ken Arora
Well, I mean, the cynic in me...
00:18:19:00 - 00:18:22:17
Joel Moses
That, I think, is what's really at stake here.
00:18:22:19 - 00:18:44:09
Lori MacVittie
Yeah. And we spend a lot of time during the day at work. And a lot of the ways that I think, you know, how I approach problems, how I solve them, those kinds of insight-generating activities, you could certainly get a pattern of that. But, right, is that...? That's me. It's not content.
00:18:44:09 - 00:18:50:16
Lori MacVittie
It's my brain. So now, yeah, now are we? Is?
Joel Moses
That's a good question.
Lori MacVittie
I'm not comfortable with that.
00:18:50:16 - 00:18:51:07
Ken Arora
But I think.
00:18:51:23 - 00:19:02:22
Ken Arora
I think it becomes intellectual property. And then the cynic in me says, many people will be willing to sign away that intellectual property for, you know, a couple of free t-shirts.
00:19:02:22 - 00:19:11:29
Lori MacVittie
Well, you know, I mean, you've got to get better than t-shirts. Come on. Better swag. At least a vest. I mean, Joel got a vest.
Joel Moses
There you go, see.
00:19:11:29 - 00:19:12:17
Ken Arora
Yeah, right, right. But I mean the, the.
00:19:12:21 - 00:19:14:07
Joel Moses
Whole closet full.
00:19:14:07 - 00:19:33:17
Ken Arora
So those might turn up in the terms of service for a lot of the providers of public genAI services.
Joel Moses
Yeah.
Ken Arora
And people will decide what to do. But I could imagine that you would license this, right, after your death. You've got virtual Lori, you've got something that thinks like Lori. Do you want to give rights to that to your family?
00:19:33:18 - 00:19:51:02
Ken Arora
If so, would they start, you know... You can imagine a virtual Steve Jobs. Imagine that you had a virtual Steve Jobs that was trained on Steve Jobs. Now maybe you hire him out for, I don't know, $100,000 an hour, and he will respond as Steve Jobs might have responded to your business problems.
00:19:51:04 - 00:20:09:05
Joel Moses
Well, I'll be the first in line for a virtual Ken Arora. That'd be pretty awesome. So I think it's time to talk about some of the things we've learned today.
Lori MacVittie
Yeah.
Joel Moses
One thing that struck me when I was researching this topic is, number one, the law around this is not well established.
00:20:09:05 - 00:20:38:21
Joel Moses
It's not established consistently. It is going to become a force within employment law, and you'll see employment agreements and employment contracts begin to contain language about derived products using AI. You should probably look for that in your employment contract. The second thing is, I don't think people are aware just how much data they generate, and that data set is incredibly rich and probably plenty viable for creating models against.
00:20:38:21 - 00:21:01:23
Joel Moses
And so any correspondence that you have could potentially be subject to this. And third, when the law is eventually established about this and we have people generating digital twin models that react just like me, I look forward to the paychecks, once I'm retired, being generated by my AI model. That would be wonderful.
00:21:01:25 - 00:21:19:23
Ken Arora
Yeah, and I think it's exciting. For all of you science fiction fans out there, we're on the verge of brain taping. Essentially, this is kind of a brain tape. I think it's going to be treated, well, I would look to intellectual property law for guidance, and there's going to be a lot of interesting intellectual property law around this.
00:21:19:25 - 00:21:39:20
Ken Arora
As you said, Joel, things are going on at a state level, and I would look for federal statutes to follow. The debate's already happening. We're having the copyright, Mickey Mouse law, debates right now about how long inheritors get rights to those things. So stay tuned. I think it's exciting. I think it's copyright.
00:21:39:20 - 00:22:10:01
Lori MacVittie
Awesome. Awesome. I learned all those things, but not much more, because I did a lot of research as well, back and forth, looking, trying to understand, because this is a broad area. It kind of started with AI art, AI music, AI, you know, and it's trickling down, like AI is doing everywhere, into enterprise technology professionals, even those of us who spend most of our time thinking and writing and talking on podcasts.
00:22:10:01 - 00:22:29:17
Lori MacVittie
So it's a topic everyone should be aware of, because it's going to affect every single person eventually. So, you know, learning more is good. Understanding more and, you know, advocating for what you think is right in the right places is always a good idea.
Joel Moses
Right?
Lori MacVittie
But that's really all we have time for today.
00:22:29:17 - 00:22:44:09
Lori MacVittie
So that's a wrap, you know? Hey, tap subscribe, stash some backup somewhere, and rendezvous here for the next meltdown, which we will definitely have. So get your pager at the ready, and we'll see you next time.