
Cluely, an AI startup infamous for enabling users to “cheat on everything,” has secured $15 million in Series A funding led by Andreessen Horowitz. The tool provides real-time, screen-based AI assistance during interviews, exams, and sales calls. Originally launched as “Interview Coder” to help software engineers ace technical interviews, the startup has since rebranded while maintaining its provocative edge. Co-founder Roy Lee, suspended from Columbia University over the original tool, now aims to push Cluely to 1 billion views across social platforms. With controversy driving attention, Cluely is scaling fast and leaning into AI-powered productivity, ethics be damned.
Undetectable AI Assistant Sparks Ethical Debate
Cluely markets itself as an “undetectable” AI assistant that watches users’ screens and feeds them real-time answers. The tool blurs the line between productivity aid and unethical automation, sparking heated debate in the tech and education sectors. Initially framed around job interviews, the product now targets broader use cases: exams, meetings, and even dates, raising questions about the future of integrity in an AI-driven world.
The startup’s core innovation lies in its invisibility. The tool runs behind the scenes, feeding the user polished AI-written responses without the other party ever being notified. That may strike teachers, recruiters, and ethicists as alarming, but it is exactly what makes the product attractive to job seekers and sales representatives. Controversy aside, Cluely is already profitable and growing rapidly, and it claims that up to 70% of interactions supported by its AI are indistinguishable from unaided human behavior.
Critics argue this sets a dangerous precedent, normalizing deception under the guise of efficiency. For now, investors are betting on virality. Andreessen Horowitz’s backing adds legitimacy and momentum to Cluely’s ambitions. Yet the broader AI community is watching closely, wary of how fast “cheating tools” like Cluely could disrupt hiring pipelines, academic integrity systems, and user trust across digital platforms.
Marketing on Steroids, Cluely’s Growth Playbook
Cluely’s rise owes as much to its brash, viral marketing strategy as to its AI capabilities. CEO Roy Lee, who once went viral for using Cluely on a date, has built a cult-like following with slick videos, controversial antics, and a willingness to walk the line between comedy and scandal. This unapologetically bold branding has fueled public curiosity and helped position Cluely as both a tech product and a cultural phenomenon.
The company plans to hire 50 “growth interns” tasked with posting TikToks daily to saturate social media. Lee also threw a massive afterparty during YC’s AI Startup School, so large that police shut it down. He says the goal is simple: attention at any cost. This aligns with Cluely’s vision of becoming “the tool everyone talks about,” not just in tech but across mainstream media.
This aggressive visibility strategy plays into Cluely’s product narrative: AI as an edge, a cheat code to win in competitive environments. For better or worse, the startup is creating a new category of “assistive dishonesty” that appeals to users who see modern success as a game best hacked. While traditional AI tools promise productivity or creativity, Cluely offers an advantage, even if it comes with ethical gray zones baked in.
The Future of AI Ethics Gets Complicated
Cluely’s rapid success illustrates a growing tension in AI: where is the line between assistance and deception? As tools like Cluely scale, industries must confront a new reality: AI can quietly guide behavior in real time, challenging assumptions about merit and authenticity. While investors see disruptive potential, critics warn of a digital arms race that undermines trust in exams, hiring, and communication. Whether Cluely becomes the next productivity revolution or the poster child for AI misuse, one thing is clear: the ethics of “helping” with AI are no longer theoretical. Cluely’s model dares us to ask: what happens when cheating becomes productized?