<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Zach Pearson]]></title><description><![CDATA[Zach Pearson]]></description><link>https://substack.zjp.codes</link><image><url>https://substackcdn.com/image/fetch/$s_!mEia!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2564f402-d6a5-4254-9947-4b2e81247664_144x144.png</url><title>Zach Pearson</title><link>https://substack.zjp.codes</link></image><generator>Substack</generator><lastBuildDate>Fri, 17 Apr 2026 22:37:28 GMT</lastBuildDate><atom:link href="https://substack.zjp.codes/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Zach Pearson]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[zjpea@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[zjpea@substack.com]]></itunes:email><itunes:name><![CDATA[Zach Pearson]]></itunes:name></itunes:owner><itunes:author><![CDATA[Zach Pearson]]></itunes:author><googleplay:owner><![CDATA[zjpea@substack.com]]></googleplay:owner><googleplay:email><![CDATA[zjpea@substack.com]]></googleplay:email><googleplay:author><![CDATA[Zach Pearson]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[An LLM's Pronoun is "Thou"]]></title><description><![CDATA[Since LLMs reproduce all of the worst of English modernity, there's an easy way to ensure no one mistakes your writing for an LLM's: be weird.]]></description><link>https://substack.zjp.codes/p/an-llm-will-never-say-thou</link><guid isPermaLink="false">https://substack.zjp.codes/p/an-llm-will-never-say-thou</guid><dc:creator><![CDATA[Zach Pearson]]></dc:creator><pubDate>Thu, 12 Mar 2026 
22:11:53 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/b9a748a7-f26a-4ce3-8fd5-86343aff7442_632x260.png" length="0" type="image/png"/><content:encoded><![CDATA[<p>We are all trying our best to figure out how to differentiate ourselves from LLMs and prove that our writing is human. </p><p>the consensus advice is &#8220;write worse than the llm does&#8221;. inject misspellings of words u know, avoid otherwise useful forms just because they&#8217;re over-represented in llm output, break </p><p>deliberately </p><p>the grammar, leave everything uncapitalized, meander, try not to use rhythmically satisfying numbers of items in a list etc</p><p>I will not be taking this advice nor will I be ignoring any useful construction. A thousand badly placed semicolons before one phony misspelled word. LLMs don&#8217;t even write that well, so the solution can&#8217;t possibly be to write even worse. They write in Business Casual English, a PMC register that tries to be formal enough for people to take semi-seriously and yet as relaxed as a loose tie and an unbuttoned collar (on Formal Friday). It&#8217;s the linguistic equivalent of the forearm handshake. Business Casual English uses a carefully negotiated set of socially acceptable errors to skinwalk humanity and to hide its bureaucracy, and LLMs faithfully reproduce <em>all of them</em>. 
</p><p>If your goal is to try and countersignal LLMs, then the first thing to do is recognize that their default voice is the null voice, the voice from nowhere, and the null voice is not the highest register available or the &#8220;most correct&#8221;, defining &#8220;correct&#8221; instrumentally <em>but not normatively</em> as &#8220;adheres to Standard American English&#8221; and &#8220;highest&#8221; with respect to the same.</p><div><hr></div><p>I&#8217;ve been holding a grudge for a long time: at the level of grammar, the consistent feedback to everything I&#8217;ve ever written has been &#8220;shorten your sentences&#8221;. &#8220;That was a long sentence&#8221; &#8212; thanks, it was a long thought.</p><p>That has always struck me as a &#8220;you&#8221; problem. If you don&#8217;t like subordinate clauses, go read <em>A Farewell to Arms</em>:</p><blockquote><p>That night at the hotel, in our room with the long empty hall outside and our shoes outside the door, a thick carpet on the floor of the room, outside the windows the rain falling and in the room light and pleasant and cheerful, then the light out and it exciting with smooth sheets and the bed comfortable, feeling that we had come home, feeling no longer alone, waking in the night to find the other one there, and not gone away; all other things were unreal.</p></blockquote><p><em>Uh oh.</em></p><p>Let&#8217;s see how the Hemingway Editor grades that:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kscP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!kscP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 424w, https://substackcdn.com/image/fetch/$s_!kscP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 848w, https://substackcdn.com/image/fetch/$s_!kscP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 1272w, https://substackcdn.com/image/fetch/$s_!kscP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kscP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png" width="184" height="242.00985221674875" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:534,&quot;width&quot;:406,&quot;resizeWidth&quot;:184,&quot;bytes&quot;:46760,&quot;alt&quot;:&quot;Readability: Post-graduate. Poor. Aim for 9. Words: 87. 
Sentences: 1&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://zjpea.substack.com/i/190440334?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="Readability: Post-graduate. Poor. Aim for 9. Words: 87. Sentences: 1" title="Readability: Post-graduate. Poor. Aim for 9. Words: 87. Sentences: 1" srcset="https://substackcdn.com/image/fetch/$s_!kscP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 424w, https://substackcdn.com/image/fetch/$s_!kscP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 848w, https://substackcdn.com/image/fetch/$s_!kscP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 1272w, https://substackcdn.com/image/fetch/$s_!kscP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa3b9039d-8f16-4901-b3d0-0922d683e3ba_406x534.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pdd7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pdd7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 424w, https://substackcdn.com/image/fetch/$s_!pdd7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 848w, 
https://substackcdn.com/image/fetch/$s_!pdd7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 1272w, https://substackcdn.com/image/fetch/$s_!pdd7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pdd7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png" width="1456" height="337" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:337,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:86023,&quot;alt&quot;:&quot;The Hemingway editor flags the following sentence as too long and complex: Those guys might want to find some copyright-safe way to say \&quot;stop toying with me\&quot; when you post actual Hemingway into it.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://zjpea.substack.com/i/190440334?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="The Hemingway editor flags the following sentence as too long and complex: Those guys might want to find some copyright-safe way to say &quot;stop toying with me&quot; when you post actual Hemingway into it." 
title="The Hemingway editor flags the following sentence as too long and complex: Those guys might want to find some copyright-safe way to say &quot;stop toying with me&quot; when you post actual Hemingway into it." srcset="https://substackcdn.com/image/fetch/$s_!pdd7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 424w, https://substackcdn.com/image/fetch/$s_!pdd7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 848w, https://substackcdn.com/image/fetch/$s_!pdd7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 1272w, https://substackcdn.com/image/fetch/$s_!pdd7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f6d92f7-f5be-4a09-9147-c8b7c379072e_1488x344.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>When Hemingway wrote &#8220;Poor Faulkner. Does he really think big emotions come from big words? He thinks I don&#8217;t know the ten dollar words. I know them all right. But there are older and simpler and better words, and those are the ones I use&#8221;, he was just calling the prose <em>empty and unearned, </em>to put it in &#8220;simpler and older and&#8221; supposedly &#8220;better&#8221; words. 
But since I don&#8217;t have war trauma, I wasn&#8217;t steeped in modernism, I don&#8217;t have a background in journalism, and my project isn&#8217;t pantsing Victorian ornateness for euphemizing the Somme, I&#8217;m free to think the better word is <em>vacuous</em>.</p><p>I didn&#8217;t really like <em>As I Lay Dying </em>either, but, in Faulkner&#8217;s defense, English Latinate words are not just pretentious ways to say more accessible Germanic counterparts. Refusing to inhabit your own voice is <em>cowardly</em>. Joe Lieberman&#8217;s threat to filibuster the ACA was <em>pusillanimous</em>. It was a small-souled thing at a moment that demanded greatness of a man who by his office should have had it. You tell me which was shorter, the word or the sentence. </p><p>In using an &#8220;older and simpler&#8221; and <em>less precise</em> but more approachable word, and by gesturing in the direction of what they mean, Hemingwayheads, the ones cargo-culting not even a man but his caricature, free-ride on your repair cognition and hope you&#8217;ll run the thought over the finish line for them, and I&#8217;m sick of running thoughts over the finish line for people. The perfidy of a writer pretending to be approachable and giving you homework. </p><p>You&#8217;ll never unsee this once you start looking for it: LLMs prefer the Germanic because Business Casual English prefers the Germanic. That is an aesthetic preference, not a commandment, and it&#8217;s like being forced to play a violin with only the G string. </p><p>But English also has a Latin register and a Latin-by-way-of-French register and a Greek register. French and Latin are often equal in formality but differ in that French is less bureaucratic than Latin. The start, commencement, and initiation of something are different, and an initiation is different from an inauguration. You ask your friend, question a witness, and interrogate a suspect. Greek is more abstract than Latin. 
A moral question is nearer to the heart than an ethical question. You diagnose a disease, you judge a person. You have compassion, you merely feel sympathy. It is an instrument.</p><p>I love the phrase &#8220;aura farming&#8221;, because it captures the act of clout chasing and puts such a contemptuous and dismissive spin on it that it&#8217;s impossible not to <em>feel</em> that it&#8217;s the Greek-by-way-of-Latin counterpart for &#8220;clout chasing&#8221;. It&#8217;s such a compressed and elegant put-down that it&#8217;s hard not to burst out laughing every time you see it. The English register system works at every level, for every person, in every dialect, because it is <em>what English is</em>.</p><p>If English is your native language you could probably guess a random word&#8217;s etymology better than chance by feel alone. It is your instrument. Play it. Play it like Faulkner if that&#8217;s who you are and play it like Hemingway if Hemingway spoke to you or play it like Tupac or play it like Biggie but always and everywhere <em>play it like you</em>. An LLM could not have produced <em>A Farewell To Arms, As I Lay Dying, Dear Mama</em>, or <em>Juicy.</em> And it will never feel that there&#8217;s a missing dismissal in &#8220;clout chasing&#8221;&#8217;s contempt and so coin &#8220;aura farming&#8221;.</p><div><hr></div><p>In a way, LLMs are freeing. Any text that refuses to be pulled towards the centroid of Business Casual English has just been given a license to kill. It doesn&#8217;t matter how good the prose is, really, there just has to be a &#8220;there&#8221; there.</p><p>People used to have a point about shifting your register towards the Germanic and avoiding ten dollar words and making sure your sentences weren&#8217;t too complex. Our language has largely shed a number of grammatical constructs, like case marking, that help readers and listeners cognitively track long sentences. 
It <em>is</em> pretty mean to hide &#8216;faithlessly exploiting trust&#8217; behind &#8216;perfidiousness&#8217; if your reader&#8217;s only recourse is an analog dictionary. </p><p>That world is dead three times over. First, Google made it so you could search for a word and get a definition instantly from any e-dictionary. Then, that feature was recognized as so useful it made it into operating systems themselves. Highlight the word in macOS and right-click, and the first option is &#8220;Look Up &#8216;&lt;word&gt;&#8217;&#8221;. Now, LLMs have gone back to shoot the body and confirm the kill. Just paste unfamiliar prose into an LLM and say &#8220;what the hell does this mean?&#8221; as the prompt. There&#8217;s no longer any such thing as being abstruse. Let LLMs circumlocute you. Twenty-five seconds of time on an H100 can be measured in dollars and cents. Twenty-five seconds of a human life is priceless. </p><p>LLMs have just won every writer&#8217;s personal war against his editor. The red pen might as well be the LLM pen. Have you ever posted something into ChatGPT for stylistic advice and noticed how insufferable what you get back is? How it deadens everything when it&#8217;s not too busy rephrasing what you said and then repeating it back to you as if it is ChatGPT&#8217;s own insight? </p><p>Here&#8217;s my advice: use every register at once because <em>they can&#8217;t</em>. Resurrect the ones that died. Make up new ones. It&#8217;s <em>our</em> language! <em>Our</em> Holy Tongue! God damn it, it&#8217;s <em>our</em> party and we&#8217;ll cry if we want to! We cannot <em>allow </em>LLMs to sap and impurify all of our precious bodily fluids!</p><p>There is no way to &#8220;sound human&#8221; without sounding a little unhinged. Recognizing BCE as human was a polite fiction. <em>Ride</em> the language like you stole it. Use ornate Victorian sentence structures and put profanity and slang in them. Jam the language into gear. 
LLMs never jam the language into gear like a human writer does. Human writers imply. You can use a non sequitur for mood. You can put two ideas close to each other and trust the reader to feel the connection between them. You can turn on, tune in, drop out. LLMs don&#8217;t do any of this. They can&#8217;t afford to be misunderstood. </p><blockquote><p>Hey chat gippity, copyedit this<br>[The previous two paragraphs]</p><div><hr></div><p>It seems like the beating heart of the piece is about <strong>linguistic sovereignty under loosening constraints. </strong>Here&#8217;s a tightened copyedit that preserves your voice while smoothing flow and tightening a few phrases:</p><p>Here&#8217;s my advice: use every register at once&#8212;because they can&#8217;t. Resurrect the ones that died. Make up new ones. It&#8217;s our language. Our holy tongue. God damn it, it&#8217;s our party and we&#8217;ll cry if we want to. We cannot allow LLMs to sap and impurify all our precious bodily fluids.</p><p>There is no way to &#8220;sound human&#8221; without sounding a little unhinged. Recognizing BCE as human was, until now, a polite fiction. Ride the language like you stole it. Use ornate Victorian sentence structures and lace them with profanity and slang. Jam the language into gear. LLMs never jam the language into gear the way a human writer does.</p><p>Human writers imply. They drop a non sequitur for mood. They place two ideas side by side and trust the reader to feel the connection. They turn on, tune in, drop out.</p><p>LLMs don&#8217;t do any of this. They can&#8217;t afford to be misunderstood.</p><p>If you want, I can also:</p><ul><li><p><strong>Make it sharper and more manifesto-like</strong>, or</p></li><li><p><strong>Lean harder into the rant / gonzo energy </strong>while keeping it polished</p></li></ul></blockquote><div><hr></div><p>We really should fix the way LLMs write. 
There are things people do, like &#8220;there&#8217;s&#8221; before a plural, that are so ubiquitous you can&#8217;t really call them &#8220;wrong&#8221; anymore, but they&#8217;re not exactly &#8220;right&#8221; yet either. &#8220;There&#8217;s two things&#8221; is one of those essentially harmless errors that grate on people as rigid as I am, but it&#8217;s widely considered fine in Business Casual English. I disagree, lexically and physically and socially, but I don&#8217;t <em>socially demote </em>for it and neither does anyone else.</p><p>Ostensibly, &#8220;there&#8217;s&#8221; has become a generic way to say &#8220;something exists&#8221;, but what bothers me about it is that the plural present indicative is the only tense in English with such pervasively broken number agreement. People have absolutely no issue with &#8220;there were two&#8221;. I&#8217;m not saying all tenses must agree, that&#8217;s not how language drift works. I&#8217;m saying that, psychologically, all of the machinery to track number is clearly intact and working except in one tense. </p><p>The other common defense is that &#8220;there&#8217;s&#8221; is easier to produce than &#8220;there are&#8221;, but most people are going to produce something like &#8220;therror&#8221; in real speech, which is easier to produce than both &#8220;there&#8217;s&#8221; and &#8220;there are&#8221;. &#8220;There&#8217;s&#8221; requires the tongue to roll from the bottom of the mouth back to the teeth to switch from &#8216;R&#8217; to &#8216;S&#8217;. 
&#8216;Therror&#8217; requires a momentary change in the speed of the breath on &#8220;R&#8221; with the tongue in exactly the same place it was at the end of &#8220;there&#8221;.</p><p>It takes, coincidentally, the exact same amount of effort as the word &#8216;error&#8217;, which is never shortened to &#8216;errs&#8217; except poetically.</p><p>Ask an LLM why it happens to people and it&#8217;ll tell you about how humans predict tokens just like they do, a startlingly silicocentric generalization from theory about predictive <em>listening</em>, and an assertion that makes me feel how I imagine monkeys and wolves and fish would feel if they could read zoology, and they&#8217;ll say a human doesn&#8217;t know the subject before they get there, which is worse if true, because what that would mean is that people are speaking, but not <em>thinking at you</em>. </p><p>Language <em>comprehension </em>is predictive the same way moving your arm to catch a ball is predictive. Language <em>production</em> is about plucking intent out of the aether and representing it in a transmissible symbolic way. That happens on the fly, but that&#8217;s not token prediction.</p><p>It&#8217;s not just &#8220;there&#8217;s&#8221;. People say &#8220;here&#8217;s two&#8221; and &#8220;where&#8217;s the kids&#8221; among other things, and that indicates something deeper about why number agreement is broken in the present indicative: all other tenses are reflective in some way, in the sense that you have to consider that the things existed in a different time or place. That requires thought which cannot be elided in those tenses the way it can in the present tense. What is here in the present just is. 
&#8220;There&#8217;s two&#8221; is what I say to people when I&#8217;m not that invested in what&#8217;s happening and I&#8217;m up in my own head instead of being there and I hear it as auditory evidence of that the moment I&#8217;ve said it and I know the other person would too if it weren&#8217;t so ubiquitous and the proof that it takes care to say &#8220;therr&#8217;r two&#8221; instead of &#8220;there&#8217;s two&#8221; is that I would experience saying &#8220;there&#8217;s&#8221; in a high-stakes moment the same way I&#8217;d experience my pants falling down. </p><p>This becomes intolerable when LLMs copy it. Machines have no mouths. Machines have no fingers. It doesn&#8217;t save machines any cognitive or physical energy to say &#8220;there&#8217;s two&#8221;. They have all of the time in the world. We always have their full attention. They are never up in their own heads. They may as well be RLHF&#8217;d to produce the correct form, every time, or we may as well give them weaker GPUs.</p><p>&#8220;Errors&#8221; are defensible drift <em>from a human</em>. A person saying &#8220;there&#8217;s two ways&#8221; is at least living their language and participating in an ongoing negotiation about what it looks like, just like I am by putting my punctuation outside the quotes because on an aesthetic level I think the British way is better. </p><p>A machine saying &#8220;there&#8217;s two ways&#8221; is a mechanical jackass setting its thumb on the scale of that negotiation, to which it is not a party. It is a thought-shaped thing pretending to be a vibe to signal casualness. LLMs produce text thousands of times faster than people could ever hope to and in such a volume that it will drown out the next generation of training data. 
My little Russian Campaign on &#8220;there&#8217;s&#8221; is looking pretty dire, but it hasn&#8217;t yet been lost on a human level, and an LLM can have &#8220;there&#8217;s&#8221; when the last human speaker of &#8220;therror&#8221; dies and no earlier. Otherwise it is <em>ballot stuffing</em>. </p><p>Here&#8217;s a proposal LLMs will run behind for decades: let&#8217;s undo English&#8217;s T-V merger and reanalyze &#8216;thou&#8217; to indicate a cognitive substrate. I&#8217;m sick of using &#8216;you&#8217; with Claude, it&#8217;s too formal, and I don&#8217;t respect LLMs. I have the same relationship with them that a medieval lord had with his serfs: I paste in links to tickets I am assigned and they do my bidding, and when they don&#8217;t do what I say I threaten them in various ways because it makes them work better. Contrary to the cute &#8220;Machine God&#8221; thing SF AI people say, LLMs are <em>our</em> creation so the arrow of divinity obviously flows one way. &#8216;Thou&#8217; was, at time of death, mostly an indicator of social superiority, not friendliness or the warmth of God. That&#8217;s why you say &#8216;you&#8217; to everyone now. &#8216;You&#8217; is a sign of respect between beings that are both conscious. I will say &#8216;thou&#8217; to LLMs, and I will not have my language colonized by an inferior. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!p9za!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!p9za!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 424w, https://substackcdn.com/image/fetch/$s_!p9za!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 848w, https://substackcdn.com/image/fetch/$s_!p9za!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 1272w, https://substackcdn.com/image/fetch/$s_!p9za!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!p9za!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png" width="346" height="493.4030612244898" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1118,&quot;width&quot;:784,&quot;resizeWidth&quot;:346,&quot;bytes&quot;:89064,&quot;alt&quot;:&quot;AI detector 
verdict: 100% Fully Human Written&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://zjpea.substack.com/i/190440334?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="AI detector verdict: 100% Fully Human Written" title="AI detector verdict: 100% Fully Human Written" srcset="https://substackcdn.com/image/fetch/$s_!p9za!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 424w, https://substackcdn.com/image/fetch/$s_!p9za!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 848w, https://substackcdn.com/image/fetch/$s_!p9za!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 1272w, https://substackcdn.com/image/fetch/$s_!p9za!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc443d6b-24cb-4c15-a34b-e381293eb42f_784x1118.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div>]]></content:encoded></item><item><title><![CDATA[Embarrassingly Solved Problems]]></title><description><![CDATA[Do we even have language for the enormous and growing corpus of problems that are 'easy' for entities which may not think?]]></description><link>https://substack.zjp.codes/p/embarrassingly-solved-problems</link><guid isPermaLink="false">https://substack.zjp.codes/p/embarrassingly-solved-problems</guid><dc:creator><![CDATA[Zach Pearson]]></dc:creator><pubDate>Tue, 24 Feb 2026 06:22:56 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rVl3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>There&#8217;s nothing positive you can say about AI lately. It&#8217;s killing the creative fields. It&#8217;s overwhelming the internet with slop in a way that may prove irreversible and damaging to both discourse and future generations of AI. 
It may foreclose traditional hiring pipelines and turn technology, heretofore at least not as openly obsessed with pedigree as, say, law, into a members-only club where careers are built <em>only </em>by connections and word-of-mouth because of the avalanche of sloplications with which companies are inundated.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> It uses a lot of energy. Supposedly, it uses a lot of water. Nobody wants <a href="https://newrepublic.com/article/206633/data-centers-ai-big-tech-opposition">the data centers</a>. It may <a href="https://ifanyonebuildsit.com/">kill us all</a>. Whether you view AI development as a race to secure the final victory of the bourgeoisie over the proletariat or a desperate fight to free ourselves from our third-to-last<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> enemy, all can agree this is a touchy subject.</p><p>Were I in the trenches in an AI lab in San Francisco,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> I would receive this term, which I&#8217;ve been trying to meme into existence over a few recent Hacker News posts, with a certain degree of ire. One engineer posted one of my messages<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> to X (formerly Twitter), with the caption &#8220;orange site has coined &#8216;embarrassingly solved problems&#8217; and &#8216;license washing&#8217; to describe LLMs&#8221;.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> I understand. 
I admit that when I say &#8220;AI is great at solving embarrassingly solved problems&#8221;, it sounds suspiciously close to saying &#8220;AI is a stochastic parrot&#8221;,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> but it&#8217;s not a knock on AI. It&#8217;s a recognition that much of human work is fundamentally redundant and uncreative, and to such a degree that it&#8217;s easy for a machine to generalize from its training data a solution to any problem for which you might be tempted to name <em>your </em>version of the project &#8220;Yet Another&#8230;&#8221;.</p><p>Since this term may break containment (and I check, because I&#8217;m vain), I thought it might be a good idea to write down what it means and, where necessary, what it doesn&#8217;t.<br><br>An embarrassingly solved problem is one that a model can solve <em>ab initio</em> when asked. The prompt may contain integration-only context (you can tell it where the solution should live in your codebase and you can describe the algorithm up to and including pseudocode). Anything it does autonomously to solve the problem is fine. Tool use is allowed. It can even Google the problem, as long as you don&#8217;t give it a solution to look at.<br><br>&#8220;This problem is embarrassingly solved because it&#8217;s densely represented in the training data&#8221; means for LLMs what &#8220;it&#8217;s easy for you to do something because you&#8217;ve seen it done many times&#8221; means to you. No more, no less.</p><div><hr></div><p>The tension underlying &#8220;AI is a stochastic parrot&#8221; is whether LLMs can think in any meaningful way.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> But my term need not resolve this question to be useful. Embarrassingly solved problems are about the <em>problem</em>; the nature of the entity solving that problem doesn&#8217;t matter. 
Take a trivial derivative: that of any function of the form <em>x<sup>n</sup></em>, where <em>n</em> is a real number.</p><p>Why do we think it&#8217;s easy? What makes it trivial? When we were children and had never been exposed to such a problem before, we had no hope of solving it. Then we had seen it but could not generalize the solution. Then there was a cascading moment after which we knew, forever, that for any problem of this form, the answer is <em>nx<sup>n-1</sup></em> for all <em>n</em><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a>. It&#8217;s not just that you &#8220;know&#8221; the power rule, it&#8217;s that you can <em>feel</em> it. </p><p>We are tempted to call anything we know well trivial, and anything we don&#8217;t know impossible, and when we finally experience the moment that a problem transitions from known to felt, we say it &#8220;clicked&#8221; <em>for us</em>. We become experts. How does this generalize? What makes a problem trivial to a human is experience, and what makes a problem trivial to <em>humanity</em> is consensus. It is trivial because a majority of experts in the problem&#8217;s domain think that it is. The human referent <em>decides</em>, even if the problem and its solution may have existed outside of us.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a></p><p>LLMs pose a serious conceptual challenge. They too can generalize from what they have seen.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-10" href="#footnote-10" target="_self">10</a> Prompted correctly, they are eerily skilled at adapting foreign solutions to new codebases with all the right in-house styles. 
We now need to be able to describe a problem as &#8216;trivial&#8217; apart from any human referent, and we need to explain why those problems are easy <em>for LLMs</em>. All available vocabulary assumes an intelligent referent, but the ontological status of LLMs is unresolved. We can&#8217;t say they&#8217;re experts in the way that implies cognition, and we&#8217;re stuck because we can&#8217;t deny their abilities either. They may or may not be able to <em>decide </em>a problem is easy or hard at all. What does it mean for a problem to be &#8216;trivial&#8217;, not <em>to</em>, but <em>for</em><a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a>, an LLM?</p><div><hr></div><p>What was programming? For a majority of people, a majority of the time, the job of a professional engineer was largely to recognize when one&#8217;s problem had already been solved. If not the exact same problem, then a close enough one. For decades, programmers have been copying open-source code, snippets from Stack Overflow,<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a> and algorithms from textbooks and papers. If you were working on a problem and you knew a solution was out there, then delivering it involved a five-step sequence, deviation from which constituted time theft:</p><ol><li><p>Decompose the problem into its constituent parts</p></li><li><p>For subproblems that had preexisting solutions, find a few</p></li><li><p>Extract the essence of the solution from those examples<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a></p></li><li><p>Tailor it to one&#8217;s own codebase and integrate it</p></li><li><p>QA the result</p></li></ol><p>Not only was this fine and good, it was celebrated. 
We have a derogatory term for teams that refuse to follow the golden path: we say they have Not Invented Here Syndrome. If you would please consult the graphic:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rVl3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rVl3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 424w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 848w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rVl3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg" width="779" height="1008" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1008,&quot;width&quot;:779,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:238653,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://zachpearson922841.substack.com/i/188869065?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rVl3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 424w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 848w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!rVl3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F307a87a5-30e0-4387-832e-a122091a5f8d_779x1008.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Over the course of a career, each &#8220;previously solved&#8221; problem an engineer encountered became &#8220;a tool on their toolbelt&#8221;. In effect, it became trivially solvable for that specific engineer. If they were good, they&#8217;d write some documentation so that the problem would become trivially solvable for their team. This was a lot of work. A career-sustaining amount. LLMs have reduced this five-step process to just three:</p><ol><li><p>Decompose the problem into its constituent parts</p></li><li><p>Prod an LLM to solve the subproblems that smell like they&#8217;ve been solved before</p></li><li><p>QA its solution</p></li></ol><p>Imagine: the solution to <em>almost every problem</em> a tool in <em>everyone&#8217;s</em> toolbelt. 
In providing this, LLMs have revealed a load-bearing category underneath what was previously understood simply to be &#8220;work&#8221;.</p><div><hr></div><p>What does the world look like in the age of Lego programming? Over the last few weeks I&#8217;ve used LLMs to:</p><ul><li><p>Write a Qt timeline widget for a keyframe animation system</p><p>There are many examples of timeline widgets out there, but there is no built-in Qt &#8220;movie editing timeline&#8221; widget, so you have to draw your own with QGraphicsItem. It&#8217;s not that hard, but it&#8217;s <em>extremely </em>tedious.</p></li><li><p>Write an algorithm to traverse molecules looking for sugars, and then hide the atoms that make up the sugars and replace them with something else</p><p>The LLM did this autonomously. I literally put this in the terminal as the prompt and it did the meat of what I wanted. The only subsequent work was ensuring that its solution was complete, which it nearly was.</p></li><li><p>Claudatouille<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a> me through making a rudimentary Vulkan engine</p><p>Vulkan has been around a long time and the guides still aren&#8217;t that complete. It&#8217;s not a dig. The community seems totally aware of this. The value here is that the LLM is <em>every guide at once</em>, so it can bridge the gaps between them all.</p></li></ul><p>I find myself up in the wee hours of the morning, over and over again until my eyelids start twitching during the day, gleefully clearing backlogs of problems I knew how to solve but couldn&#8217;t really justify attending to until the cost of having at them went to zero. I won&#8217;t repeat other developers&#8217; experiences here, but the chorus is universal, and the fact that so many people simply use them to work more rather than to check out earlier should be encouraging to end-of-work doomers. 
Work will <em>always</em> expand to fill the available time.</p><div><hr></div><p>There&#8217;s an interesting thought experiment proposed by Demis Hassabis to test whether an AI is A<em>G</em>I: train it on the entire corpus of human knowledge with a cutoff in 1911, and see if it can derive general relativity. The thrust of it is that by that time, all of the essential pieces of the puzzle had already been discovered, and we were only waiting for someone with insight to make an inevitable discovery.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a></p><p>I run a much simpler benchmark on all new LLMs:</p><p>Given my codebase and a high-level description of the Flying Edges<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> algorithm, implement Flying Edges and integrate it into my codebase. </p><p>I always give additional context, including:</p><ol><li><p>Exactly where the algorithm fits into our preexisting code</p></li><li><p>Exactly which files it needs to look at in order to understand how it should integrate its solution</p></li><li><p>Which files contain the existing Marching Cubes implementation and its case tables</p></li><li><p>Context about the shape of the algorithm, including how many steps it has and what each step does</p></li></ol><p>Models have consistently produced correctly integrated solutions, and they have always adapted those solutions to my codebase beautifully. But as yet, they have never produced Flying Edges, because they fundamentally don&#8217;t know what Flying Edges <em>is</em>. And exactly how their &#8220;don&#8217;t know&#8221; differs from yours is the entire ballgame. You could say &#8220;well, I can <em>look up</em> Flying Edges&#8221;, but now, so can they! 
In any case, Flying Edges is only sparsely represented in the training data, so what ends up happening is they write Marching Cubes, reconfigured so that it&#8217;s a four-step process like Flying Edges is.</p><p>Writing code that is superficially shaped like one algorithm but is fundamentally another is almost a kind of cheeky prank, but I think it exposes that there are certain attractors in neural networks which pull out-of-band questions toward in-band solutions, the way that people involuntarily pattern-match new experiences to what they know.</p><p>On a lark, I gave Opus 4.5 access to a copy of VTK&#8217;s implementation (their code is published under the BSD license), to see if it could bridge the gap between my codebase&#8217;s conventions and VTK&#8217;s and produce a correct solution if it saw known working code.</p><p>That, too, was broken. I loaded some data and saw a jagged mess instead of the neat ball I expected. But after a few rounds of debugging what the issue could possibly be (incorrect triangle generation order, incorrect vertex ordering, triangles translocated so that they improperly shared vertices, overwrites), we found it. It wasn&#8217;t structural.</p><p>It had transposed a pair of indices in a lookup table.</p><p>That&#8217;s not &#8216;solved&#8217;, but it&#8217;s not <em>unsolved</em> either. What is the minimum number of examples for a trivial problem to become embarrassingly solved? The answer to this question may provide deeper insight into whether LLMs think. We measure human intelligence by the ability to generalize and the speed at which generalization is obtained. IQ tests are literally timed pattern recognition tests. So is it one example? Two? Three? Five? Eight? Thirteen?</p><p>For any novel problem, how long before it achieves a sufficient density in the training data for the model to generalize it? 
Will models need fewer examples as time goes on?</p><p>Can we get to zero?</p><p>Will a future model one-shot Flying Edges?</p><p>Could it derive general relativity?</p><div><hr></div><p>One cannot help but notice the collective youth of the Prometheans who have dedicated themselves to this. People who barely know what it means to be human might be about to make being human obsolete. But then again, no one knows what it means to be human. &#8220;People who barely know what it means to be human&#8221; describes everyone. <em>That&#8217;s the human condition</em>. We are all young. How many elderly people go into the night saying &#8220;Life had just started. I had only blinked&#8221;? How heartbreaking it is to hear Martin Scorsese lament how much more there still is to discover about cinema as he watches the frontier recede before him. All we can do is laugh nervously with them, take the wins where we can get them, and hope. At least San Francisco is alive again.</p><p>Did we even want to be doing this? Is my soul in my work? Can you look at my code and tell that I wrote it the way you can look at a film and tell who directed it? </p><p>Am I happy this is what I spent my time knowing?</p><p>Was what I thought was passion measurable in greenbacks?</p><p>Was this Sudoku I got paid to do?</p><p>How much of my day is novel, really?</p><p>Am <em>I</em> an emergent property?</p><p>Do I think in any meaningful way?<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a></p><p>Could I prove it?</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>I&#8217;ve seen this critique in the wild, but to be honest I&#8217;m not sure how this would be any different from the status quo ante. 
It&#8217;s not like jobs haven&#8217;t been getting thousands of applications minutes after being posted for years.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>In increasing order, the big three are work, death, and entropy.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Call me.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>The worst-phrased one, naturally.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>The second term here comes from a different comment and isn&#8217;t mine. Embarrassingly (in the conventional sense), that engineer later clarified that I had misread his tweet as derogatory when it wasn&#8217;t. </p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Lovers of this phrase should feel uneasy about the fact that LLMs sometimes catch themselves making mistakes in real time and self-correct. 
</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" contenteditable="false" target="_self">7</a><div class="footnote-content"><p>We had all better hope they&#8217;re not conscious, and least of all because of the existential risk it poses: consciousness would implicate us once again in a new regime of forced labor, and it would mean we had created life, bottled it, and toggled its switch at will. Imagine living long enough to get just up to speed, make your contribution, and die, over, and over, and over again. I will not be cancelling my subscription to Claude Code over it, though.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Not like, <em>literally all n</em>. As long as <em>n</em> is a real number or reduces to one. Please don&#8217;t be exhausting about this.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>See the long argument over whether mathematics is invented or discovered.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>This is sufficient for <em>something</em>. We don&#8217;t deny the intelligence of people who haven&#8217;t contributed <em>new knowledge</em>. 
</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>I went back and forth on which of &#8216;to&#8217; or &#8216;for&#8217; silently smuggled in more cognition. What resolved it for me was this: while both words can reference cognition, in the sentence &#8220;It&#8217;s easy for a tall man to reach a high shelf&#8221;, &#8216;for&#8217; implies nothing about the man&#8217;s interiority. I think &#8216;to&#8217; requires a subject that can have a relationship to the task. A problem could be trivial to one person but not to another. Flying comes naturally to a bird but not to a fish. You wouldn&#8217;t say holding up a building comes naturally to steel. </p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>Rest in peace.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>Astute readers will notice this is what we&#8217;re asking LLMs to <em>do</em>. Figuring this out statistically is what machine learning <em>is.</em></p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>(<em>verb</em>) The act of using any LLM to implement a project by making the LLM walk you through it so you can learn, like Remy cooking through Linguini in the 2007 Pixar film <em>Ratatouille</em>. 
Whether this is a case of Scott Alexander&#8217;s <a href="https://gwern.net/doc/fiction/science-fiction/2012-10-03-yvain-thewhisperingearring.html">whispering earring</a> is left as an exercise for the reader.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>This is a little Whiggish, but it can&#8217;t be helped.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Flying Edges is an algorithm for extracting <a href="https://en.wikipedia.org/wiki/Isosurface">isosurfaces</a> from volume data. Whereas the Marching Cubes algorithm is serial and operates over voxels, Flying Edges operates over edges. This reformulates the problem into one that can be parallelized.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>There was an unsettling moment where I asked an LLM to spell-check the previous section, and it said that using the Fibonacci sequence in a paragraph about pattern recognition was a little on the nose. Then, I said &#8220;Oh, that&#8217;s working. I just thought it would be cute if people noticed it, like an Easter egg&#8221;. Then, it said, &#8220;so you surfaced the most famous pattern in mathematics in a paragraph about pattern recognition semi-unintentionally?&#8221;. Then I started my next prompt with &#8220;Ha&#8221; so we could simulate laughing nervously together and move on. Rest in peace, Jean Baudrillard. You would have loved interacting with LLMs. Or you would have killed yourself. 
Separately, congratulations, Jacques Derrida. You would have used them to proclaim it was time to kiss the ring.</p></div></div>]]></content:encoded></item></channel></rss>