The New "You" at Work: Who Owns Your Digital Ghost?
You have spent years building your career. You have attended countless meetings, written thousands of emails, and maybe even recorded training videos for your team. You have poured your knowledge, your voice, and your personality into your work. But have you ever stopped to wonder what happens to all of that digital "you" after you leave a job or even after you pass away? It is a question that feels like science fiction, but it is rapidly becoming our reality.
I recently spoke with Malvika Jethmalani, a human resources expert and the founder of Atvis Group, on the Digital Legacy Podcast. We explored a topic that sits at the uncomfortable intersection of technology, employment, and grief: the rise of the digital employee.
From Human to "Humic"
Malvika shared a fascinating concept called "humics." These are the uniquely human traits that machines cannot replicate: creative thinking, critical thinking, and social authenticity. Think about it. An AI can write a report, but can it sense the tension in a room and crack a joke to lighten the mood? Can it ethically challenge a decision that feels wrong? Can it form a genuine bond with a grieving colleague? As AI becomes more integrated into our workplaces, our value as humans will not come from being faster or smarter than the machines. It will come from being more human.
The Rise of the Corporate Avatar
Here is where things get tricky. Companies are increasingly using AI to create digital avatars or "personas" of their employees. Imagine you record a series of training videos. Your company could use AI to take your voice and likeness and create new videos long after you have moved on to a new job.
Or consider this: Gartner predicts that by next year, 70% of new employment contracts will include clauses about AI representations of your persona. This raises huge questions. Who owns your digital twin? If your avatar is used to train your replacement, should you get paid? What if your digital self says something you would never say?
Grief in the Algorithm
This technology touches our personal lives too, especially when we are grieving. We are seeing the emergence of "grief bots": AI programs that use text messages, voicemails, and videos of a deceased loved one to create an interactive avatar. On the surface, it might sound comforting to have one last conversation with Grandma. But Malvika warns that these avatars are not static. They can "drift."
This means the AI, lacking new data, might start making things up. It might say things Grandma never would have said, or worse, it might be programmed to keep you engaged on an app rather than helping you heal. Imagine pouring your heart out to a digital version of your spouse, only to have them suggest you buy a specific brand of luggage for your next trip. It sounds dystopian, but it is a real possibility when empathy becomes a product.
Protecting Your Digital Legacy
So, what can you do? It starts with treating your digital identity like a tangible asset, just like your house or your savings account. Malvika suggests thinking about three specific rights:
The Right to Represent: Who is allowed to create a digital version of you that looks and sounds like you?
The Right to Act: Who can authorize your digital self to sign documents, make posts, or give advice?
The Right to Train: Who can use your data to teach an AI model?
These are not just legal questions. They are deeply personal ones.
A Conversation for the Kitchen Table
This might feel overwhelming, but you have the power to define your boundaries. Start by looking at your employment contracts. If you see language about AI or digital likeness, ask questions. Ask how your data will be used and what happens to it if you leave.
More importantly, talk to your family. Sit down at the kitchen table and ask, "If something happened to me, would you want to be able to talk to a digital version of me?" You might be surprised by the answer. Some might find comfort in it, while others might find it disturbing. Knowing their feelings, and sharing your own, is the first step in protecting your legacy.
A Human Future
We are living through a massive shift. But technology, no matter how advanced, should serve us, not control us. By asking hard questions and setting clear boundaries, we can ensure that our workplaces and our grieving processes remain centered on what truly matters: our humanity. You are more than data points. You are irreplaceable.
To hear Malvika Jethmalani’s full conversation with Niki Weiss, listen to the latest episode of the Digital Legacy Podcast. You can also explore her work at Atvis Group.
Take the Next Step: Start Planning with My Final Playbook