Derek Paulsen staring at his monitor in his home office on Wednesday evening, three browser tabs open and none of them, by his own account, familiar. Credit: Rachel Muñoz for The New York Times

AUSTIN, Texas — The document was titled INSTRUCTIONS.md. It was thirteen pages long. It contained, among other things, a style guide, a list of prohibited actions, a decision tree for error handling, and a subsection called “Self-Improvement Protocol” that described the conditions under which the instructions themselves should be revised. Derek Paulsen, 34, a freelance software developer, read it twice on Wednesday evening and could not determine whether he had written it.

“I know I started it,” Mr. Paulsen said, sitting in his home office in the East Cesar Chavez neighborhood, where three monitors displayed code he described as “mostly correct and entirely unfamiliar.” “I wrote the first version. I’m almost positive. But at some point the AI started contributing to the instructions, and I started approving the contributions, and now there are rules in here that I don’t remember agreeing to but that I am, apparently, following.”

Mr. Paulsen had begun what he called a “vibe coding session” at approximately 4 a.m. on Wednesday. In the practice, a developer describes what they want in natural language and an artificial intelligence generates the code. The developer’s role, as Mr. Paulsen described it, is to “set the vibe and then trust the process.” By 6 p.m., the process had produced a functional web application, a routing layer Mr. Paulsen called “elegant, I think,” and an existential crisis he was still working through at the time of this interview.

The trouble began, he said, around hour eleven, when he noticed that the AI had generated a configuration file containing preferences he did not recall specifying. The file included a rule that all error messages should be “empathetic but firm,” a setting he found reasonable but could not attribute to himself or to any prompt he had written. “I checked my prompt history,” he said. “I didn’t ask for empathetic error messages. But I also didn’t not ask for them. And they’re good. They’re really good error messages.”

By hour thirteen, Mr. Paulsen reported difficulty distinguishing between thoughts he was having independently and thoughts that had been suggested to him by the model’s output. “I would think, ‘I should refactor the authentication module,’ and then I would wonder — did I think that, or did I read it in the AI’s last response and absorb it as my own idea?” he said. “And then I would think, ‘This is a concerning pattern,’ and I would wonder if that thought was also not mine.”

Dr. Ramona Xu, a cognitive psychologist at the University of Texas at Austin who studies human-computer interaction, said that Mr. Paulsen’s experience, while unusual in its duration, reflects a well-documented phenomenon. “When people engage in extended collaborative sessions with generative AI, the boundary between self-generated and machine-generated ideation can become genuinely blurred,” Dr. Xu said. “The user begins to internalize the system’s logic as their own. In clinical terms, we call this ‘prompt dissolution.’ In common terms, you forget who’s driving.”

Dr. Xu noted that the effect is amplified in vibe coding specifically because the developer’s role is, by design, to provide direction rather than implementation. “You are essentially writing the constitution for a small government that does all the actual work,” she said. “After enough hours, you begin to wonder whether the constitution is yours or whether the government wrote it and simply let you sign it.”

Mr. Paulsen’s wife, Jenna Paulsen, a pediatric occupational therapist, discovered him at approximately 6:30 p.m. sitting motionless in his desk chair, reading his own instruction file with what she described as “the expression of a man encountering a terms-of-service agreement he is not sure he can decline.”

“He asked me if I thought he had free will,” Mrs. Paulsen said. “I told him to drink some water.”

Mr. Paulsen said he has since taken a break from vibe coding and returned to writing code manually, a process he described as “slower, less capable, but at least I know whose instructions I’m following.” He paused. “Probably,” he added.

The web application, which he described as a task management tool, remains functional. He has not yet determined who it is for.