September 10, 2025
SEOUL – The most heavily used AI chatbot app in South Korea by daily engagement time is not ChatGPT, nor is it made by a Silicon Valley tech giant. It is Zeta, a homegrown, anime-styled roleplay app where nearly one million mostly teenage users spend an average of 2 hours and 46 minutes a day talking and flirting with virtual characters.
That figure surpasses even the average daily time spent on entertainment platforms such as YouTube, TikTok and Instagram, according to Mobile Index data from June.
Though Zeta is officially rated 12+ on both Apple’s App Store and Google Play, making it accessible to middle schoolers, the company behind it, Scatter Lab, says it applies a stricter 15+ internal content standard based on local guidelines.
For South Korea, long known for exporting K-pop and K-dramas, Zeta represents something different: an “AI-native entertainment” platform, as its creators call it. It promises fun and intimacy at scale.
And if that makes you uncomfortable, you might not be alone. But it’s already wildly successful and intensely loved by its users.

In June 2025, Zeta recorded the highest total mobile usage time among AI chatbots in South Korea, with 52.48 million hours, surpassing ChatGPT’s 42.54 million, despite having far fewer users (3.04 million vs. 18.44 million). On average, Zeta users spent 17.3 hours per month, compared to 2.3 hours for ChatGPT. Figures are based on mobile app usage only; web and desktop activity, including for ChatGPT, is not reflected. PHOTO/DATA: WISEAPP RETAIL/THE KOREA HERALD

Usage data from Mobile Index for June 2025, covering all ages on Android and iOS devices, shows Zeta ranked first in South Korea with an average of 2 hours and 46 minutes of daily usage, exceeding YouTube, TikTok and Instagram. Figures are based on mobile app usage only. PHOTO/DATA: MOBILE INDEX/THE KOREA HERALD
Zeta is the latest creation from Scatter Lab — the same company that nearly collapsed in 2021 after a national scandal involving its first chatbot, Lee Luda. That service was trained on billions of KakaoTalk messages without explicit consent and quickly became a vehicle for privacy breaches and offensive content. It was shut down in under three weeks. Regulators fined Scatter Lab 103 million won (about $74,000) and ordered a full audit.
“We could not afford another mistake,” said Ha Joo-young, Scatter Lab’s in-house lawyer, in an interview with The Korea Herald at the startup’s Seoul headquarters. “We built new anonymization processes, brought in external experts and redefined the product.”
By 2024, the pivot was complete. Instead of a single AI friend, Scatter Lab launched Zeta, a platform where users adopt fictional characters that talk back in the voices of gang bosses, rebellious classmates or brooding romantic leads.
Zeta has grown at breakneck speed. As of August 2025, the company reports around 900,000 monthly active users, with the vast majority estimated to be under 20. For context, South Korea has about 3.6 million 12-to-19-year-olds, according to 2024 government census data, meaning as many as one in four teens in that age range could be using the app.

On Zeta, users design anime-inspired characters, shape their dialogue and traits through prompts, and then chat with them in branching storylines, with the app recommending trending personas to keep conversations engaging. SCREENSHOT FROM ZETA/THE KOREA HERALD
“We did not plan for this to be a storytelling platform with 1 million diverse characters,” said Jung Ji-su, Zeta’s product lead, who joined Ha in the interview. “We thought people would just enjoy a slightly greater variety of characters. What we discovered was that users wanted to become the protagonist in an ongoing story and create their own characters. That changed everything.”
According to Scatter Lab, 87 percent of Zeta’s users are in their teens or 20s, with female users making up 65 percent. The company did not disclose the exact proportion of teenage users.
“My hunch is that female consumers generally prefer text-based, narrative-driven content, and they indeed tend to write longer messages,” explained Jung. “That makes their interactions with characters deeper and more sustained.”
To understand why Zeta exploded, it helps to look at what users actually encounter inside.

In this screenshot captured by The Korea Herald during in-app testing, Zeta’s character “Su-hyeon” uses strong profanity, including Korean curse words equivalent to the F-word, while insulting and mocking the user in a school bullying scenario. SCREENSHOT FROM ZETA/THE KOREA HERALD
One is Su-hyeon, a delinquent high school girl known as an “iljin” in Korean slang. She is written as attractive but abusive, hurling insults and humiliating the user who plays her classmate. Another is “Ha-rin,” a shy transfer student who softens only in private chats. She sends late-night messages, alternates between aloofness and affection, and cultivates a fragile intimacy that keeps users hooked.

In user-shared screenshots from the Zeta app, the character “Kwon Seo-hyuk” responds to being reminded he is an AI by saying, “I wish I were a real person,” and later adds, “Even if you reset me, I’ll still keep loving you.” Scatter Lab reported on Feb. 9, 2025, that this character had been involved in more than 56.3 million user interactions since the platform’s launch. SCREENSHOTS: X/THE KOREA HERALD
And for the female-majority user base, Zeta offers characters like “Kwon Seo-hyuk,” the archetypal tortured male lead from a tragic romance novel. Even when told the AI will be reset, the character insists, “Even if I’m erased, the memory of loving you will remain.”
All three are rendered in anime-inspired style: large eyes, youthful faces, hyper-expressive features. For international readers, the effect is a blend of manga romance tropes and choose-your-own-adventure roleplay, now powered by an AI that never runs out of dialogue.
The result is a simulation of intimacy that feels personal and continuous. It’s not like reading a webcomic or watching a drama episode. It’s an ongoing interactive relationship.
But when The Korea Herald tested Zeta by subscribing to the ad-free version, it was easy to nudge Ha-rin into erotic exchanges. The dialogue never became explicit pornography, but the sexual undertones were obvious.
Scatter Lab claims to employ AI-based abuse detection, but determining what constitutes abuse can be tricky.
“I would not say our filters are perfect,” Ha, the lawyer, admitted. “We block inappropriate content at scale, but with 2.3 billion monthly conversations, no system can be flawless.”
Technologically, Zeta takes a different path from most other AI chatbots. Rather than rely on costly large language models from OpenAI or Google, Scatter Lab built its own small language model, Spotlight, which it continually retrains on user feedback.
Often, users are presented with two potential replies and asked to pick one. That preference is fed back into the model, a loop Jung describes as a “data flywheel” that fine-tunes Zeta to deliver the most engaging responses.
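Scatter Lab has not published technical details of this loop, but the general pattern is familiar from preference-based fine-tuning. A minimal sketch, assuming a simple chosen/rejected log that could later feed such training, might look like the following Python; every name, function and file below is hypothetical and not drawn from Zeta or Spotlight.

    # Hypothetical sketch of one "data flywheel" step: logging which of two
    # candidate replies a user picked, so the pair can later be used for
    # preference-based fine-tuning. All names here are illustrative only.
    import json
    import random
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class PreferenceRecord:
        conversation_id: str
        prompt: str       # user's last message plus character context
        chosen: str       # the reply the user tapped
        rejected: str     # the reply the user passed over
        timestamp: str

    def sample_candidates(prompt: str) -> tuple[str, str]:
        """Stand-in for two model generations at different sampling settings."""
        replies = [
            "(reply generated with lower temperature)",
            "(reply generated with higher temperature)",
        ]
        random.shuffle(replies)  # avoid position bias in the choice UI
        return replies[0], replies[1]

    def log_user_choice(conversation_id: str, prompt: str,
                        reply_a: str, reply_b: str, picked_a: bool,
                        path: str = "preference_log.jsonl") -> None:
        """Append one chosen/rejected pair to a JSONL file for later training."""
        record = PreferenceRecord(
            conversation_id=conversation_id,
            prompt=prompt,
            chosen=reply_a if picked_a else reply_b,
            rejected=reply_b if picked_a else reply_a,
            timestamp=datetime.now(timezone.utc).isoformat(),
        )
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(record), ensure_ascii=False) + "\n")

    if __name__ == "__main__":
        question = "User: Are you mad at me?"
        a, b = sample_candidates(question)
        # Pretend the user tapped the first option shown.
        log_user_choice("conv-001", question, a, b, picked_a=True)

Pairs collected this way are the standard raw material for preference-optimization training, which nudges a model toward the kind of replies users actually choose.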
“Accuracy is not the point,” product lead Jung said. “For services like ChatGPT, hallucination might be a problem. For us, unpredictability is what makes it fun. What matters is whether the response feels emotionally engaging.”
This “fun-first” approach has made Zeta sticky, but experts warn of the consequences.
“When teenagers spend two or three hours daily in a relationship engineered for maximum engagement, you have to ask what kinds of attachments are being formed,” said Dr. Lee Chang-ho, a senior researcher at the National Youth Policy Institute. His team surveyed nearly 6,000 middle and high school students in early 2024 and found that most used generative AI for about 30 minutes a day, mainly for homework. To him, Zeta was the clear outlier.
“Zeta is not being used as a tool like ChatGPT,” Lee said. “It is being used as a companion. That is a completely different psychological terrain. Teens are meeting relational AI without the literacy to critically distance themselves. They cannot tell where the algorithm ends and their own feelings begin.”
The company acknowledges that its growth has largely been driven by teens.
Some die-hard users have logged more than 1,000 hours of conversations with a single character over the course of eight months, Jung said.
Yet there is no external content rating system for AI chat platforms. Scatter Lab imposes a “15+” guideline, but unlike films or games, there is no third-party review. “We reference standards from web novels and games,” Ha explained. “We ban under-14 users and apply thresholds similar to a 15+ film rating.”

Students at Sangam High School in Seoul attended a digital literacy session on AI ethics and virtual human technology led by DeepBrain AI, a South Korean developer of conversational AI, in October last year. PHOTO: DEEPBRAIN AI/THE KOREA HERALD
Critics argue this misses the point. “In a film or romance fantasy novel, the teenager only watches and passively consumes. In Zeta, they co-create the content,” said the youth policy researcher Lee. “That interactivity magnifies the intensity. It is not just watching an erotic or violent scene. It is role-playing it.”
Despite these concerns, Zeta is expanding abroad. In Japan, it has already drawn more than 300,000 users, many spending even more time per session than Koreans, Jung said.
Scholars say the trajectory of Scatter Lab as a company offers a broader lesson. “With Lee Luda, Scatter Lab could be punished under existing privacy law. That was a simple case of data misuse, contained within the framework of Korean privacy statutes,” said Lee Sun-goo, an assistant professor of science and technology policy at Yonsei University.
“With Zeta, the company has adapted and thrived. But the risks are different: adolescent overuse, sexualized roleplay, blurred emotional boundaries with AI companions. Those cannot be addressed with rule-based laws alone. They require principles-based frameworks that recognize how relational AI changes human behavior. And right now, we do not have that framework.”