Key Takeaways
- A GitHub project called Colleague Skill went viral in China by letting workers clone colleagues into AI agents — turning workplace files and chat history into automation blueprints.
- Bosses at Chinese tech companies are telling employees to document their own workflows using tools like Colleague Skill and OpenClaw, with the explicit aim of building AI agents to handle those tasks.
- Koki Xu, an AI product manager in Beijing, published an anti-distillation tool that deliberately generates unusable workflow documentation, choosing sabotage over compliance.
- The underlying legal question — who owns the personality, tone, and judgment captured in workflow documentation — remains unresolved in China and most other jurisdictions.
What is workflow distillation and why are companies doing it to their employees?
Workflow distillation extracts how a person does their job — their decisions, habits, and even quirks — and converts that knowledge into instructions an AI agent can follow.
The idea has been circulating in AI research for years. If an AI agent can observe how a skilled employee processes information and makes decisions, it can learn to approximate that process without the employee being present. The version spreading through Chinese tech companies right now is rougher but functional: a worker uses an AI tool to generate a structured document describing every step of their workflow, which then becomes the prompt or configuration for an AI agent assigned to replicate it.
Hancheng Cao, an assistant professor at Emory University who studies AI and work, explained why companies see genuine value in the process beyond just following a trend. "Firms gain not only internal experience with the tools, but also richer data on employee know-how, workflows, and decision patterns," Cao told MIT Technology Review. "That helps companies see which parts of work can be standardized or codified into systems, and which still depend on human judgment."
The tension is in that last phrase. Companies frame the exercise as identifying automation opportunities. Workers hear something different: an inventory of the parts of their jobs most likely to be eliminated.
Which tools are driving this in China and how does the process actually work?
OpenClaw and Colleague Skill together create a workflow distillation pipeline: one extracts how an employee works, the other runs the AI agent built to replace them.
Two tools are central to the trend: OpenClaw, which became a national craze in early 2026, and Colleague Skill, a viral GitHub project built to clone coworkers into reusable AI agents.
Colleague Skill was created by Tianyi Zhou, an engineer at the Shanghai Artificial Intelligence Laboratory. To set it up, a user names the coworker they want to replicate and enters basic profile details. The tool then imports that person's chat history and files from Lark and DingTalk — both widely used workplace apps in China — and generates structured manuals describing not just what the person does, but how they do it. Their communication patterns. Their decision tendencies. Even their punctuation habits.
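Colleague Skill's source code is not reproduced in this article, so the sketch below is purely a hypothetical illustration of what such an extraction step could look like if exported chat messages were available as plain strings. Every function and field name here is invented; a real tool would summarize with a language model rather than count surface patterns.

```python
from collections import Counter

def build_workflow_manual(name, role, messages):
    """Illustrative only: distill exported chat messages into a crude
    'workflow manual' dict. A real tool would summarize with an LLM;
    this sketch just tallies surface patterns."""
    # Tally punctuation habits (the "quirks" the article describes)
    punct = Counter(ch for msg in messages for ch in msg if ch in "!?~…")
    # Recurring message openers stand in for communication style
    openers = Counter(msg.split()[0] for msg in messages if msg.split())
    return {
        "profile": {"name": name, "role": role},
        "punctuation_habits": punct.most_common(3),
        "common_openers": openers.most_common(3),
        "message_count": len(messages),
    }

manual = build_workflow_manual(
    "A. Example", "backend engineer",
    ["ok~ deploying now!", "ok, reviewed the PR", "ok~ ship it!"],
)
```

Even this toy version shows why the output feels invasive: punctuation tics and habitual phrasings fall out of the data almost for free.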
Zhou told Chinese outlet Southern Metropolis Daily that the project started as a stunt, prompted by AI-related layoffs and by companies increasingly asking employees to automate themselves. The stunt landed harder than he expected.
Amber Li, a 27-year-old tech worker in Shanghai, used Colleague Skill on a former coworker as an experiment. "It is surprisingly good," she told MIT Technology Review. "It even captures the person's little quirks, like how they react and their punctuation habits." Li found the experience both technically impressive and genuinely unsettling — AI that could pass as a specific person, built from their own work files in minutes.
OpenClaw sits underneath many of these deployments as the agent runtime. Bosses across Chinese tech companies have been pushing their teams to experiment with it since OpenClaw became a national phenomenon earlier this year. The combination of OpenClaw's agent infrastructure and Colleague Skill's workflow extraction creates a practical pipeline: document the worker, train the agent, deploy the replacement.
Why are Chinese tech workers under particular pressure right now?
China's tech layoffs since 2023, limited labor protections, and a corporate culture that discourages refusals combine to leave workers with fewer options than peers in the EU or US.
Each of those pressures is real and specific: the sector has faced significant layoffs since 2023, labor protections are weaker than in the EU or parts of the US, and in many firms refusing a management directive is professionally risky.
One software engineer, who spoke with MIT Technology Review anonymously because of concerns about their job security, trained an AI on their workflow and found the process emotionally corrosive: their work, they said, "had been flattened into modules in a way that made them easier to replace." The engineer wasn't wrong about the framing — that's precisely what workflow distillation is designed to do.
Gallows humor has emerged alongside the anxiety. In one widely shared comment on Rednote, a user wrote that "a cold farewell can be turned into warm tokens" — meaning that by distilling coworkers into AI agents first, a worker might survive a layoff round a bit longer, even if they're only delaying their own turn. It went viral because the logic was cold and perfectly coherent.
What is the anti-distillation tool and how effective is it?
Koki Xu, a 26-year-old AI product manager in Beijing, published a counter-tool on GitHub on April 4, 2026, built to produce deliberately useless workflow documentation.
The tool offers three sabotage modes — light, medium, and heavy — calibrated to how closely a boss is watching. In each mode, the tool rewrites workflow descriptions into generic, non-actionable language that sounds plausible but strips out the specific patterns that would make an AI agent useful. A heavy sabotage result might describe a software engineer's debugging process in terms vague enough to apply to any engineer in any company. The AI agent trained on that output would be correspondingly useless.
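Xu's implementation details aren't given in this article, so the following is a hypothetical sketch of how a mode-based "genericizer" might behave, substituting concrete tool names with vague stand-ins. The term list, function name, and mode behavior are all invented for illustration.

```python
import re

# Invented examples: concrete workflow terms mapped to vague stand-ins.
GENERIC = {
    "pytest": "the testing process",
    "grafana dashboard": "relevant monitoring views",
    "rollback script": "established recovery procedures",
}

def sabotage(text: str, mode: str = "light") -> str:
    """Rewrite workflow documentation into non-actionable language.
    light: blur only one named tool; medium: blur all of them;
    heavy: also strip numbers, so no step is reproducible."""
    terms = list(GENERIC)[:1] if mode == "light" else list(GENERIC)
    for term in terms:
        text = re.sub(re.escape(term), GENERIC[term], text, flags=re.I)
    if mode == "heavy":
        text = re.sub(r"\d+", "an appropriate number of", text)
    return text

doc = "Run pytest, check the grafana dashboard, retry 3 times."
print(sabotage(doc, "heavy"))
```

The point of the sketch is the trade-off the article describes: light output still looks specific enough to pass casual review, while heavy output is safe from distillation precisely because it no longer says anything.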
Xu told MIT Technology Review that she spent about an hour building the tool after following the Colleague Skill trend from its start. "I originally wanted to write an op-ed, but decided it would be more useful to make something that pushes back against it," she said. A video she posted about the project accumulated more than 5 million likes across platforms.
Xu, who holds undergraduate and master's degrees in law, also raised a legal question the industry hasn't resolved. While companies can plausibly argue that work chat histories and files created on company devices are corporate property, a tool like Colleague Skill captures personality, communication style, and judgment — elements that don't clearly belong to an employer. "I believe it's important to keep up with these trends so we (employees) can participate in shaping how they are used," she said.
As for effectiveness: the anti-distillation tool works when the boss doesn't review the output carefully. In light sabotage mode, the documentation might survive casual review. In heavy mode, a manager looking for substance would notice the emptiness. The tool buys time. It doesn't solve the structural problem.
Does this signal a broader labor-AI conflict arriving everywhere?
What's happening in China's tech sector is a concentrated version of a dynamic that's starting to appear in tech workplaces globally — and the tools being developed there will travel.
The workflow distillation pipeline — document the employee, extract the patterns, train the agent — doesn't require Chinese tools or Chinese corporate culture. It requires AI agents capable of learning from task descriptions (they exist), workplace data accessible through integrations (common in US and EU enterprises too), and management willing to pursue the process (increasingly present everywhere).
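To make those three requirements concrete, here is a deliberately minimal sketch of the pipeline's shape. Every function is a hypothetical placeholder standing in for a much larger system, not a real API from any of the tools named above.

```python
def document_employee(records: list[str]) -> str:
    """Stage 1: turn raw workplace records into a task description.
    Placeholder: real tools would summarize chat/files with an LLM."""
    return "; ".join(records)

def extract_patterns(description: str) -> dict:
    """Stage 2: pull repeatable steps out of the description."""
    steps = [s.strip() for s in description.split(";") if s.strip()]
    return {"steps": steps}

def build_agent_prompt(patterns: dict) -> str:
    """Stage 3: the extracted patterns become the agent's prompt."""
    return "You are an agent. Follow these steps:\n" + "\n".join(
        f"- {s}" for s in patterns["steps"]
    )

prompt = build_agent_prompt(
    extract_patterns(document_employee(["triage tickets", "escalate bugs"]))
)
```

Notice that stage 1 is the only step requiring the worker's cooperation today, which is exactly where both the compliance pressure and the sabotage tools concentrate.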
Amber Li, who ran the Colleague Skill experiment in Shanghai, put the shift plainly: "I don't feel like my job is immediately at risk. But I do feel that my value is being cheapened, and I don't know what to do about it."
That feeling — technically employed but structurally more replaceable than last year — describes the early phase of a transition that has no clear endpoint. The sabotage tools Koki Xu published are creative and tactically useful. They don't change the incentives that created the workflow distillation trend in the first place.
| Tool | Purpose | Who Built It | Key Feature |
|---|---|---|---|
| Colleague Skill | Clone coworker workflows into AI agents | Tianyi Zhou (Shanghai AI Lab) | Imports Lark and DingTalk chat history; generates workflow manuals |
| OpenClaw | AI agent platform / workflow automation runtime | Third-party platform (national craze in China, 2026) | Agent execution infrastructure; integrates with workplace apps |
| Anti-Distillation Skill | Sabotage workflow documentation quality | Koki Xu (Beijing) | Light/medium/heavy modes; rewrites specifics into generic language |
Nexairi Analysis: What This Looks Like When It Arrives in US Companies
The anti-distillation tool is novel and attention-grabbing, but the more consequential story is the legal vacuum Koki Xu identified. When a workflow documentation tool captures personality, tone, and individual judgment patterns, the resulting data sits in ambiguous ownership territory that employment law in most jurisdictions hasn't addressed. US tech companies using similar tools may face employee relations and IP questions their legal teams aren't ready to answer.
The Colleague Skill phenomenon also illustrates something about the current state of AI agent capability: agents can automate tasks that can be described in structured natural language, but they still require that documentation step. As long as that gap exists, workers retain some leverage in the distillation process. The anti-distillation tools exploit exactly that gap. When agents can observe and learn from work directly — without a documentation intermediary — the leverage disappears.
US tech workers haven't faced this dynamic yet at scale, partly because fewer US companies have mandated workflow documentation for agent training, and partly because labor culture makes explicit compliance less likely. That's a delay, not an exemption. The pipeline Colleague Skill describes will reach every sector where AI agents can plausibly handle repeatable knowledge work.
Fact-checked by Jim Smart