Whether it’s Valentine’s Day notes or emails to loved ones, using AI to write leaves people feeling guilty

When you outsource romance, there’s a hidden emotional cost.

Author: Julian Givi on Feb 02, 2026
Source: The Conversation
People seem to intuitively understand that something meaningful should require doing more than pushing a button or writing a prompt. design master/iStock via Getty Images

As Valentine’s Day approaches, finding the perfect words to express your feelings for that special someone can seem like a daunting task – so much so that you may feel tempted to ask ChatGPT for an assist.

After all, within seconds it can dash off a well-written, romantic message. Even a short, personalized limerick or poem is no sweat.

But before you copy and paste that AI-generated love note, you might want to consider how it could make you feel about yourself.

We research the intersection of consumer behavior and technology, and we’ve been studying how people feel after using generative AI to write heartfelt messages. It turns out that there’s a psychological cost to using the technology as your personal ghostwriter.

The rise of the AI ghostwriter

Generative AI has transformed how many people communicate. From drafting work emails to composing social media posts, these tools have become everyday writing assistants. So it’s no wonder some people are turning to them for more personal matters, too.

Wedding vows, birthday wishes, thank you notes and even Valentine’s Day messages are increasingly being outsourced to algorithms.

The technology is certainly capable. Chatbots can craft emotionally resonant responses that sound genuinely heartfelt.

But there’s a catch: When you present these words as your own, something doesn’t sit right.

When convenience breeds guilt

We conducted five experiments with hundreds of participants, asking them to imagine using generative AI to write various emotional messages to loved ones. Across every scenario we tested – from appreciation emails to birthday cards to love letters – we found the same pattern: People felt guilty when they used generative AI to write these messages compared to when they wrote the messages themselves.

When you copy an AI-generated message and sign your name to it, you’re essentially taking credit for words you didn’t write.

This creates what we call a “source-credit discrepancy,” which is a gap between who actually created the message and who appears to have created it. You can see these discrepancies in other contexts, whether it’s celebrity social media posts written by public relations teams or political speeches composed by professional speechwriters.

When you use AI, even though you might tell yourself you’re just being efficient, you can probably recognize, deep down, that you’re misleading the recipient about the personal effort and thought that went into the message.

The transparency test

To better understand this guilt, we compared AI-generated messages to other scenarios. When people bought greeting cards with preprinted messages, they felt no guilt at all. That’s because greeting cards carry no deception: Everyone understands that you selected the card but didn’t write the words yourself.

We also tested another scenario: having a friend secretly write the message for you. This produced just as much guilt as using generative AI. Whether the ghostwriter is human or an artificial intelligence tool doesn’t matter. What matters most is the dishonesty.

There were some boundaries, however. We found that guilt decreased when messages were never delivered and when recipients were mere acquaintances rather than close friends.

These findings confirm that the guilt stems from violating expectations of honesty in relationships where emotional authenticity matters most.

Somewhat relatedly, research has found that people react more negatively when they learn a company used AI instead of a human to write a message to them.

But the backlash was strongest when audiences expected personal effort – a boss expressing sympathy after a tragedy, or a note sent to all staff members celebrating a colleague’s recovery from a health scare. It was far weaker for purely factual or instructional notes, such as announcing routine personnel changes or providing basic business updates.

What this means for your Valentine’s Day

So, what should you do about that looming Valentine’s Day message? Our research suggests that the human hand behind a meaningful message can help both the writer and the recipient feel better.

This doesn’t mean you have to avoid generative AI entirely. You can use it as a brainstorming partner rather than a ghostwriter: Let it help you overcome writer’s block or suggest ideas, but make the final message truly yours. Edit, personalize and add details that only you would know. The key is co-creation, not complete delegation.

Generative AI is a powerful tool, but it’s also created a raft of ethical dilemmas, whether it’s in the classroom or in romantic relationships. As these technologies become more integrated into everyday life, people will need to decide where to draw the line between helpful assistance and emotional outsourcing.

This Valentine’s Day, your heart and your conscience might thank you for keeping your message genuinely your own.

The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
