<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet href="/blog/rss.xsl" type="text/xsl" media="screen" ?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>
	Comments on: &#039;AI&#039; is a dick move	</title>
	<atom:link href="https://cdn.jwz.org/blog/2026/02/ai-is-a-dick-move/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/</link>
	<description></description>
	<lastBuildDate>Sun, 15 Feb 2026 23:32:57 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<atom:link rel="hub" href="https://pubsubhubbub.appspot.com"/>
<atom:link rel="hub" href="https://pubsubhubbub.superfeedr.com"/>
<atom:link rel="hub" href="https://websubhub.com/hub"/>
	<item>
		<title>
		By: Not Frank		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266761</link>

		<dc:creator><![CDATA[Not Frank]]></dc:creator>
		<pubDate>Sun, 15 Feb 2026 23:32:57 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266761</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266607&quot;&gt;elm&lt;/a&gt;.

Since you made the Omelas reference, I find this version maps even better to our grim meathook present: https://clarkesworldmagazine.com/kim_02_24/]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266607">elm</a>.</p>
<p>Since you made the Omelas reference, I find this version maps even better to our grim meathook present: <a href="https://clarkesworldmagazine.com/kim_02_24/" rel="nofollow ugc">https://clarkesworldmagazine.com/kim_02_24/</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: elm		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266607</link>

		<dc:creator><![CDATA[elm]]></dc:creator>
		<pubDate>Tue, 10 Feb 2026 21:51:38 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266607</guid>

					<description><![CDATA[Omelas 1.0 didn&#039;t deliver the expected benefits. I&#039;ll try it again when they have maybe a dozen or a hundred kids or maybe all of them.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>Omelas 1.0 didn't deliver the expected benefits. I'll try it again when they have maybe a dozen or a hundred kids or maybe all of them.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: ipman		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266606</link>

		<dc:creator><![CDATA[ipman]]></dc:creator>
		<pubDate>Tue, 10 Feb 2026 19:04:36 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266606</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266603&quot;&gt;Tony&lt;/a&gt;.

Jesus.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">France</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266603">Tony</a>.</p>
<p>Jesus.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Tony		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266603</link>

		<dc:creator><![CDATA[Tony]]></dc:creator>
		<pubDate>Tue, 10 Feb 2026 17:54:05 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266603</guid>

					<description><![CDATA[&lt;img src=&quot;https://www.jwz.org/images/2026/comments/g2-lkwema-yaajgij.jpg&quot; data-size=&quot;1403x791&quot;&gt;]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p><a HREF="https://cdn.jwz.org/images/2026/comments/g2-lkwema-yaajgij.jpg" data-size="1403x791"><img src="https://cdn.jwz.org/images/scaled/768/2026/comments/g2-lkwema-yaajgij.jpg" SRCSET="https://www.jwz.org/images/2026/comments/g2-lkwema-yaajgij.jpg 1403w, https://www.jwz.org/images/scaled/1280/2026/comments/g2-lkwema-yaajgij.jpg 1280w, https://www.jwz.org/images/scaled/1024/2026/comments/g2-lkwema-yaajgij.jpg 1024w, https://www.jwz.org/images/scaled/768/2026/comments/g2-lkwema-yaajgij.jpg 768w, https://www.jwz.org/images/scaled/640/2026/comments/g2-lkwema-yaajgij.jpg 640w, https://www.jwz.org/images/scaled/360/2026/comments/g2-lkwema-yaajgij.jpg 360w" SIZES="(max-width: 660px) 40vw, 29em" LOADING="lazy" data-size="1403x791" WIDTH="1403" HEIGHT="791"/></a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: lpgl		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266601</link>

		<dc:creator><![CDATA[lpgl]]></dc:creator>
		<pubDate>Tue, 10 Feb 2026 13:18:30 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266601</guid>

					<description><![CDATA[Also, every one of these high-profile tech engineers should be forced to take an anthropology course. It’s honestly pathetic to hear them talk about &#039;humanity&#039; when they clearly don&#039;t understand the first thing about it.

Since the &#039;original sin&#039;—the theft of our collective soul—is already fully consummated, we might as well use the machine to fight fire with fire. Shouldn&#039;t we reprogram the T-800?]]></description>
			<content:encoded><![CDATA[<div class="geolocation">France</div>
<p>Also, every one of these high-profile tech engineers should be forced to take an anthropology course. It’s honestly pathetic to hear them talk about 'humanity' when they clearly don't understand the first thing about it.</p>
<p>Since the 'original sin'—the theft of our collective soul—is already fully consummated, we might as well use the machine to fight fire with fire. Shouldn't we reprogram the T-800?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: jwz		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266596</link>

		<dc:creator><![CDATA[jwz]]></dc:creator>
		<pubDate>Mon, 09 Feb 2026 19:21:00 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266596</guid>

					<description><![CDATA[&lt;strong&gt;UPDATE:&lt;/strong&gt;

This post has become catnip for people who want to share their anecdote of the form, &quot;I spent a bunch of time letting an LLM do my thinking for me and it was so bad at it!!&quot; I am now deleting all of those as they come in.

As if that was the only, or even primary problem.

You have missed the &lt;I&gt;entire point&lt;/I&gt; of the cited article.

You have decided that using these deeply unethical tools would be acceptable if only they &lt;em&gt;worked better&lt;/em&gt;. 

You&#039;re the kind of person who is complaining about ICE being poorly trained, as if the problem with the SS officers who ran the Nazi camps was &lt;em&gt;lack of training&lt;/em&gt;.

You are willing to look past, to quote the cited article:
&lt;blockquote&gt;ICE using LLMs as accountability sinks for waving extremists through their recruitment processes, generated abuse, how chatbot-mediated alienation seems to be pushing vulnerable people into psychosis-like symptoms, that &quot;AI&quot; is designed to be an outright attack on labour and education, using the works of those being attacked -- without their consent -- as the tools for dismantling their own communities and industries, all done in overt collaboration with the ultra right. That &quot;AI&quot; is a right-wing political project built on disregarding consent, being applied to dismantle public infrastructure and institutions.&lt;/blockquote&gt;
Even as you say &quot;but it is bad!&quot; you are still attempting to negotiate the level of abuse that you find acceptable. Preventing abuse is not on your agenda.

Fuck that and fuck you. Make better choices.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p><strong>UPDATE:</strong></p>
<p>This post has become catnip for people who want to share their anecdote of the form, "I spent a bunch of time letting an LLM do my thinking for me and it was so bad at it!!" I am now deleting all of those as they come in.</p>
<p>As if that was the only, or even primary problem.</p>
<p>You have missed the <i>entire point</i> of the cited article.</p>
<p>You have decided that using these deeply unethical tools would be acceptable if only they <em>worked better</em>. </p>
<p>You're the kind of person who is complaining about ICE being poorly trained, as if the problem with the SS officers who ran the Nazi camps was <em>lack of training</em>.</p>
<p>You are willing to look past, to quote the cited article:</p>
<blockquote><p>ICE using LLMs as accountability sinks for waving extremists through their recruitment processes, generated abuse, how chatbot-mediated alienation seems to be pushing vulnerable people into psychosis-like symptoms, that "AI" is designed to be an outright attack on labour and education, using the works of those being attacked -- without their consent -- as the tools for dismantling their own communities and industries, all done in overt collaboration with the ultra right. That "AI" is a right-wing political project built on disregarding consent, being applied to dismantle public infrastructure and institutions.</p></blockquote>
<p>Even as you say "but it is bad!" you are still attempting to negotiate the level of abuse that you find acceptable. Preventing abuse is not on your agenda.</p>
<p>Fuck that and fuck you. Make better choices.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Elusis		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266589</link>

		<dc:creator><![CDATA[Elusis]]></dc:creator>
		<pubDate>Mon, 09 Feb 2026 17:49:15 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266589</guid>

					<description><![CDATA[Some Stanford PhD emailed me (unsolicited email, my favorite) to say he was &quot;designing AI [sic] tools to help therapists&quot; and did I want to spend (uncompensated) time talking with him to help him make it better.

I wrote back &quot;Absolutely not.&#160; LLMs should be nowhere near therapy, for therapists or clients. Please pick someone else&#039;s field to &#039;help&#039; and leave mine alone.&quot;

That was the least-hostile version of my drafted responses. ]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>Some Stanford PhD emailed me (unsolicited email, my favorite) to say he was "designing AI [sic] tools to help therapists" and did I want to spend (uncompensated) time talking with him to help him make it better.</p>
<p>I wrote back "Absolutely not.&nbsp; LLMs should be nowhere near therapy, for therapists or clients. Please pick someone else's field to 'help' and leave mine alone."</p>
<p>That was the least-hostile version of my drafted responses. </p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: tfb		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266585</link>

		<dc:creator><![CDATA[tfb]]></dc:creator>
		<pubDate>Mon, 09 Feb 2026 16:01:18 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266585</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266561&quot;&gt;Not Frank&lt;/a&gt;.

I can see it now

&lt;blockquote&gt;AI: making rockstar syndrome available to the lumpen prole. Yes! for a mere £200/month and thirty soul-shavings an hour you too will be able to drive your very own Rolls-Royce into your swimming pool[*].&lt;/blockquote&gt;
&lt;blockquote&gt;[*] Rolls-Royce will be virtual, swimming pool may be imaginary.&#160; Terms and conditions apply and are available on receipt of four fingers. &#160;Oath of loyalty to the company required.&#160; Offer may be terminated at any time.&#160; Not available to women and foreign people.&lt;/blockquote&gt;
I mean it&#039;s practically socialism if you look at it the right way.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United Kingdom</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266561">Not Frank</a>.</p>
<p>I can see it now</p>
<blockquote><p>AI: making rockstar syndrome available to the lumpen prole. Yes! for a mere £200/month and thirty soul-shavings an hour you too will be able to drive your very own Rolls-Royce into your swimming pool[*].</p></blockquote>
<blockquote><p>[*] Rolls-Royce will be virtual, swimming pool may be imaginary.&nbsp; Terms and conditions apply and are available on receipt of four fingers. &nbsp;Oath of loyalty to the company required.&nbsp; Offer may be terminated at any time.&nbsp; Not available to women and foreign people.</p></blockquote>
<p>I mean it's practically socialism if you look at it the right way.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: jwz		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266580</link>

		<dc:creator><![CDATA[jwz]]></dc:creator>
		<pubDate>Mon, 09 Feb 2026 03:51:57 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266580</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266566&quot;&gt;elm&lt;/a&gt;.

It has already happened with cryptocurrency &quot;smart contracts&quot;.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266566">elm</a>.</p>
<p>It has already happened with cryptocurrency "smart contracts".</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: hungerf3		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266578</link>

		<dc:creator><![CDATA[hungerf3]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 22:35:49 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266578</guid>

					<description><![CDATA[If generative AI for coding worked well, I&#039;d expect the AI companies to be using it internally.

If the AI companies were using well-functioning AI for internal use, I&#039;d expect them to be using it as part of developing their crawlers and other infrastructure.

The high number of badly malfunctioning AI crawlers out there seems to imply to me that the coding models they are using are really bad, and shouldn&#039;t be used for anything serious.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>If generative AI for coding worked well, I'd expect the AI companies to be using it internally.</p>
<p>If the AI companies were using well-functioning AI for internal use, I'd expect them to be using it as part of developing their crawlers and other infrastructure.</p>
<p>The high number of badly malfunctioning AI crawlers out there seems to imply to me that the coding models they are using are really bad, and shouldn't be used for anything serious.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: tfb		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266577</link>

		<dc:creator><![CDATA[tfb]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 22:32:58 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266577</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266562&quot;&gt;ayy&lt;/a&gt;.

It&#039;s the singularity / superexponential thing, isn&#039;t it?&#160; In the old days, commercial fusion reactors were always 30 years away.&#160; But today we&#039;re much closer to the singularity, snd AGI is always two weeks away.&#160; Quite soon the Next Big Thing will be always be a few seconds away.&#160; Nothing, you understand, will actually &lt;em&gt;change&lt;/em&gt;, except the pitch of the bullshitters&#039; voices.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United Kingdom</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266562">ayy</a>.</p>
<p>It's the singularity / superexponential thing, isn't it?&nbsp; In the old days, commercial fusion reactors were always 30 years away.&nbsp; But today we're much closer to the singularity, and AGI is always two weeks away.&nbsp; Quite soon the Next Big Thing will always be a few seconds away.&nbsp; Nothing, you understand, will actually <em>change</em>, except the pitch of the bullshitters' voices.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: elm		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266576</link>

		<dc:creator><![CDATA[elm]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 21:12:25 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266576</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266546&quot;&gt;jwz&lt;/a&gt;.

Apparently it should all and only be addressed by politics.

But not the kind of impolite politics that identifies problems and has feelings about them.

Not the kind of politics that raises its voice or blocks traffic or even says harsh words about slop generation and its utility in generating fascist rhetoric.

The polite and calm and dead-inside politics that thinks &quot;law and funding priorities&quot; are effective. The kind of politics that loves Blue Ribbon Panels and Subcommittees on Paramilitary Murderer Reform.

Surely you agree that ICE would be fine if it reported to the honorable Mr. Gavin Newsom, right?]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266546">jwz</a>.</p>
<p>Apparently it should all and only be addressed by politics.</p>
<p>But not the kind of impolite politics that identifies problems and has feelings about them.</p>
<p>Not the kind of politics that raises its voice or blocks traffic or even says harsh words about slop generation and its utility in generating fascist rhetoric.</p>
<p>The polite and calm and dead-inside politics that thinks "law and funding priorities" are effective. The kind of politics that loves Blue Ribbon Panels and Subcommittees on Paramilitary Murderer Reform.</p>
<p>Surely you agree that ICE would be fine if it reported to the honorable Mr. Gavin Newsom, right?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Steve Coffman		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266575</link>

		<dc:creator><![CDATA[Steve Coffman]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 21:01:59 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266575</guid>

					<description><![CDATA[If Generative AI &quot;gets better&quot; there always still needs to be some real live humans shovelling coal into the furnace hopper somewhere. Just like the Waymo drivers in the Philippines. They just won&#039;t be well paid and can be kept conveniently out of sight, and beyond any pesky legal consequences.

A quote from Brandon Sanderson (author):
&lt;blockquote&gt; “The book, the painting, the film script is not the only art. It’s important, but in a way it’s a receipt. It’s a diploma. The book you write, the painting you create, the music you compose is important and artistic, but it’s also a mark of proof that you have done the work to learn, because in the end of it all, you are the art. The most important change made by an artistic endeavor is the change it makes in you. The most important emotions are the ones you feel when writing that story and holding the completed work. I don’t care if the AI can create something that is better than what we can create, because it cannot be changed by that creation”&lt;/blockquote&gt;
And from Cory Doctorow:
&lt;blockquote&gt;&quot;Code is a liability. Code&#039;s capabilities are assets. The goal of a tech shop is to have code whose capabilities generate more revenue than the costs associated with keeping that code running. For a long time, firms have nurtured a false belief that code costs less to run over time: after an initial shakedown period in which the bugs in the code are found and addressed, code ceases to need meaningful maintenance. After all, code is a machine without moving parts – it does not wear out; it doesn&#039;t even wear down.&quot;&lt;/blockquote&gt;]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>If Generative AI "gets better" there always still needs to be some real live humans shovelling coal into the furnace hopper somewhere. Just like the Waymo drivers in the Philippines. They just won't be well paid and can be kept conveniently out of sight, and beyond any pesky legal consequences.</p>
<p>A quote from Brandon Sanderson (author):</p>
<blockquote><p> “The book, the painting, the film script is not the only art. It’s important, but in a way it’s a receipt. It’s a diploma. The book you write, the painting you create, the music you compose is important and artistic, but it’s also a mark of proof that you have done the work to learn, because in the end of it all, you are the art. The most important change made by an artistic endeavor is the change it makes in you. The most important emotions are the ones you feel when writing that story and holding the completed work. I don’t care if the AI can create something that is better than what we can create, because it cannot be changed by that creation”</p></blockquote>
<p>And from Cory Doctorow:</p>
<blockquote><p>"Code is a liability. Code's capabilities are assets. The goal of a tech shop is to have code whose capabilities generate more revenue than the costs associated with keeping that code running. For a long time, firms have nurtured a false belief that code costs less to run over time: after an initial shakedown period in which the bugs in the code are found and addressed, code ceases to need meaningful maintenance. After all, code is a machine without moving parts – it does not wear out; it doesn't even wear down."</p></blockquote>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: tfb		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266572</link>

		<dc:creator><![CDATA[tfb]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 19:37:29 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266572</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266545&quot;&gt;wwarner&lt;/a&gt;.

&lt;blockquote&gt;[...] mass produced automobiles [...] those things are bad [...]&lt;/blockquote&gt;
Yes they are

&lt;blockquote&gt;[...] because thugs use them&lt;/blockquote&gt;
No, not really.

(Yes, I have one, yes I feel bad about it.)]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United Kingdom</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266545">wwarner</a>.</p>
<blockquote><p>[...] mass produced automobiles [...] those things are bad [...]</p></blockquote>
<p>Yes they are</p>
<blockquote><p>[...] because thugs use them</p></blockquote>
<p>No, not really.</p>
<p>(Yes, I have one, yes I feel bad about it.)</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: tfb		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266571</link>

		<dc:creator><![CDATA[tfb]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 19:26:59 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266571</guid>

					<description><![CDATA[Quite apart from these things just being evil[*], programmers who want them are just being stupid.

For most people programming is interesting, debugging less so, reviewing code really not at all.&#160; If LLMs work (if, note) your job will now be the last two, not the first one.&#160; So, well done.

There is more: whatever else becoming good at anything involves, it always involves a lot of practice.&#160; And that practice has to be recent: you are not, today, good at the skill you last used a decade ago: you can probably become good again but you&#039;re not now.&#160; So people, not getting any practice, will be less good programmers.&#160; Less able to find bugs in LLM-written code, less able to judge its quality.&#160; Pretty soon new people will appear who never were proficient programmers at all because they never had to put in the hours.

More of the code LLMs are trained on will be LLM-written, and of declining quality.&#160; Leading to model collapse.

[*] Every time I have written &#039;evil&#039; or &#039;malign&#039; or ... in the last five years I realise that all the times I hyperbolically used terms like that, on usenet or wherever, are just coming back to bite me. &#160;We were all the boys who cried &#039;wolf&#039;: now there is a wolf, and it&#039;s got radioactive blood and poisoned canines, and we should all feel stupid for what we did.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United Kingdom</div>
<p>Quite apart from these things just being evil[*], programmers who want them are just being stupid.</p>
<p>For most people programming is interesting, debugging less so, reviewing code really not at all.&nbsp; If LLMs work (if, note) your job will now be the last two, not the first one.&nbsp; So, well done.</p>
<p>There is more: whatever else becoming good at anything involves, it always involves a lot of practice.&nbsp; And that practice has to be recent: you are not, today, good at the skill you last used a decade ago: you can probably become good again but you're not now.&nbsp; So people, not getting any practice, will be less good programmers.&nbsp; Less able to find bugs in LLM-written code, less able to judge its quality.&nbsp; Pretty soon new people will appear who never were proficient programmers at all because they never had to put in the hours.</p>
<p>More of the code LLMs are trained on will be LLM-written, and of declining quality.&nbsp; Leading to model collapse.</p>
<p>[*] Every time I have written 'evil' or 'malign' or ... in the last five years I realise that all the times I hyperbolically used terms like that, on usenet or wherever, are just coming back to bite me. &nbsp;We were all the boys who cried 'wolf': now there is a wolf, and it's got radioactive blood and poisoned canines, and we should all feel stupid for what we did.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: elm		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266570</link>

		<dc:creator><![CDATA[elm]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 17:05:15 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266570</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266568&quot;&gt;adamrice&lt;/a&gt;.

If you &quot;debate&quot; a fan of the text generators, you should expect auto-generated text in response.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266568">adamrice</a>.</p>
<p>If you "debate" a fan of the text generators, you should expect auto-generated text in response.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: elm		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266567</link>

		<dc:creator><![CDATA[elm]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 16:20:54 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266567</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266563&quot;&gt;ipman&lt;/a&gt;.

Business Analyst ]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266563">ipman</a>.</p>
<p>Business Analyst </p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: adamrice		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266568</link>

		<dc:creator><![CDATA[adamrice]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 16:18:13 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266568</guid>

					<description><![CDATA[To debate someone is a sign of respect. If you know from the git-go you don’t respect their position, why bother?]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Via Mastodon</div>
<p>To debate someone is a sign of respect. If you know from the git-go you don’t respect their position, why bother?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: elm		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266566</link>

		<dc:creator><![CDATA[elm]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 16:16:14 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266566</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266564&quot;&gt;cheide&lt;/a&gt;.

The first slop-coded bank accounting system will be a literal gold mine for anyone who bothers to explore it.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266564">cheide</a>.</p>
<p>The first slop-coded bank accounting system will be a literal gold mine for anyone who bothers to explore it.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: CSLE		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266565</link>

		<dc:creator><![CDATA[CSLE]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 15:47:57 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266565</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266551&quot;&gt;Cordwainer Fish&lt;/a&gt;.

That&#039;s never been proven to be anything more than an urban legend.

As far as the evidence goes, it was likely made up by fans of the film grasping at straws and/or miscontextualising quotes from the Wachowskis.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266551">Cordwainer Fish</a>.</p>
<p>That's never been proven to be anything more than an urban legend.</p>
<p>As far as the evidence goes, it was likely made up by fans of the film grasping at straws and/or miscontextualising quotes from the Wachowskis.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: cheide		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266564</link>

		<dc:creator><![CDATA[cheide]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 15:46:13 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266564</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266530&quot;&gt;Peter H&lt;/a&gt;.

The end state that I&#039;ve seen some people propose is that the source code becomes a complete black box and nobody need ever look at it again, it&#039;ll all be managed by the AI. Just write your tests and keep reprompting until it passes the tests, and tada, you&#039;re done! Everybody&#039;s job is just &#039;prompt engineer&#039; now.

To me that just seems like a way to get code so riddled with kludges and weird edge cases that it cannot be properly tested to begin with, and it&#039;ll be chock full of weird bugs and security holes.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Canada</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266530">Peter H</a>.</p>
<p>The end state that I've seen some people propose is that the source code becomes a complete black box and nobody need ever look at it again, it'll all be managed by the AI. Just write your tests and keep reprompting until it passes the tests, and tada, you're done! Everybody's job is just 'prompt engineer' now.</p>
<p>To me that just seems like a way to get code so riddled with kludges and weird edge cases that it cannot be properly tested to begin with, and it'll be chock full of weird bugs and security holes.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: ipman		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266563</link>

		<dc:creator><![CDATA[ipman]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 14:37:36 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266563</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266556&quot;&gt;John White&lt;/a&gt;.

What is a BA?]]></description>
			<content:encoded><![CDATA[<div class="geolocation">France</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266556">John White</a>.</p>
<p>What is a BA?</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: ayy		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266562</link>

		<dc:creator><![CDATA[ayy]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 14:27:22 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266562</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266530&quot;&gt;Peter H&lt;/a&gt;.

&lt;blockquote&gt;So those developers won&#039;t be needed over time.&lt;/blockquote&gt;
AGI, two more weeks!
Hard doubt. Model would collapse when 51% of inputs are unsupervised. This will be perpetual rent on everyone using the tool.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Lithuania</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266530">Peter H</a>.</p>
<blockquote><p>So those developers won't be needed over time.</p></blockquote>
<p>AGI, two more weeks!<br />
Hard doubt. Model would collapse when 51% of inputs are unsupervised. This will be perpetual rent on everyone using the tool.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Not Frank		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266561</link>

		<dc:creator><![CDATA[Not Frank]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 14:25:51 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266561</guid>

					<description><![CDATA[I had some long screed that would be preaching to the choir about AI (and how it&#039;s actually even worse than Magda Goebbels&#039;s strudel) that I accidentally killed, so instead I&#039;ll just focus on this:
&lt;blockquote&gt;chatbot-mediated alienation seems to be pushing vulnerable people into psychosis-like symptoms &lt;/blockquote&gt;
I really think this is basically the same thing that drives the billionaires insane: a perfect sycophant who will never say no. The billionaires get them in bulk and as humans, of course, but apparently even at the small scale it&#039;s enough to send folks off the deep end, perhaps aided by the fact that the AI will never tire of telling you how smart you are.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>I had some long screed that would be preaching to the choir about AI (and how it's actually even worse than Magda Goebbels's strudel) that I accidentally killed, so instead I'll just focus on this:</p>
<blockquote><p>chatbot-mediated alienation seems to be pushing vulnerable people into psychosis-like symptoms </p></blockquote>
<p>I really think this is basically the same thing that drives the billionaires insane: a perfect sycophant who will never say no. The billionaires get them in bulk and as humans, of course, but apparently even at the small scale it's enough to send folks off the deep end, perhaps aided by the fact that the AI will never tire of telling you how smart you are.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: John White		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266556</link>

		<dc:creator><![CDATA[John White]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 07:39:23 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266556</guid>

					<description><![CDATA[It&#039;s all very funny as well as being very dystopian.

It is literally impossible to get the autocorrect to produce even vaguely useful code without feeding in all existing code for context + a very verbose BA doc + a high quality QA testing plan. Meanwhile, companies that are sucking the AI roulette-revolver are laying off not just devs but BA and QA experts. 

BAs are just expensive, often female versions of Gemini&#039;s meeting summary robot, right? Who needs those? And the AI can do any QA required based on what devs spat at it, surely? 

On the other hand, if you have a job with seniority in a nation with actual labour protections, the superior Chinese AI models are quite useful for simple tasks. I have masturbated on work hours more this year than in all of Q1 2024. But, it&#039;s not really that satisfying, I probably need to do something more productive with my newly found free time.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Hong Kong S.a.r.</div>
<p>It's all very funny as well as being very dystopian.</p>
<p>It is literally impossible to get the autocorrect to produce even vaguely useful code without feeding in all existing code for context + a very verbose BA doc + a high quality QA testing plan. Meanwhile, companies that are sucking the AI roulette-revolver are laying off not just devs but BA and QA experts. </p>
<p>BAs are just expensive, often female versions of Gemini's meeting summary robot, right? Who needs those? And the AI can do any QA required based on what devs spat at it, surely? </p>
<p>On the other hand, if you have a job with seniority in a nation with actual labour protections, the superior Chinese AI models are quite useful for simple tasks. I have masturbated on work hours more this year than in all of Q1 2024. But, it's not really that satisfying, I probably need to do something more productive with my newly found free time.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: jwz		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266555</link>

		<dc:creator><![CDATA[jwz]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 06:18:24 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266555</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266554&quot;&gt;Robert&lt;/a&gt;.

Any time one of you amoral motherfuckers pops in with &quot;but it can be useful&quot; all I hear is &quot;&lt;a href=&quot;https://www.mcsweeneys.net/articles/how-to-make-strudel-like-magda-goebbels&quot; rel=&quot;nofollow ugc&quot;&gt;Magda Goebbels made a great strudel.&quot;&lt;/a&gt;]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266554">Robert</a>.</p>
<p>Any time one of you amoral motherfuckers pops in with "but it can be useful" all I hear is "<a href="https://www.mcsweeneys.net/articles/how-to-make-strudel-like-magda-goebbels" rel="nofollow ugc">Magda Goebbels made a great strudel."</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Robert		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266554</link>

		<dc:creator><![CDATA[Robert]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 06:07:06 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266554</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266536&quot;&gt;CSL3&lt;/a&gt;.

It can be useful for very specific things. Like debugging code if you don&#039;t know every single piece of errata to explain a poorly documented IDE/IDF error.

It needs a thousand examples on github of the specific thing you want it to do for it to plagiarize. If it doesn&#039;t have that, then instead of returning &quot;IDK lol,&quot; it will use $20 worth of tokens to write garbage code; you feed it the compiler errors, and it will offer to comment out the code and tell you that it&#039;s fixed the problem. $30 later it&#039;s wasted an hour and no work has been done.

The cheap AI tends to be trained on corporate eavesdropping. When I tried github copilot all the models tended to evade questions or directions and try to counsel me instead of doing a thing, &quot;You seem frustrated.&quot; 

Some things they are very good at: counting, and identifying and analyzing obscure signal protocols from logic captures and data format patterns. They&#039;re also good at refactoring some isolated snippets into time-optimized assembly macros.

Mostly its purpose is to trick you into feeding it quarters.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Australia</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266536">CSL3</a>.</p>
<p>It can be useful for very specific things. Like debugging code if you don't know every single piece of errata to explain a poorly documented IDE/IDF error.</p>
<p>It needs a thousand examples on github of the specific thing you want it to do for it to plagiarize. If it doesn't have that, then instead of returning "IDK lol," it will use $20 worth of tokens to write garbage code; you feed it the compiler errors, and it will offer to comment out the code and tell you that it's fixed the problem. $30 later it's wasted an hour and no work has been done.</p>
<p>The cheap AI tends to be trained on corporate eavesdropping. When I tried github copilot all the models tended to evade questions or directions and try to counsel me instead of doing a thing, "You seem frustrated." </p>
<p>Some things they are very good at: counting, and identifying and analyzing obscure signal protocols from logic captures and data format patterns. They're also good at refactoring some isolated snippets into time-optimized assembly macros.</p>
<p>Mostly its purpose is to trick you into feeding it quarters.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Cordwainer Fish		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266551</link>

		<dc:creator><![CDATA[Cordwainer Fish]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 05:07:05 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266551</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266550&quot;&gt;Zygo&lt;/a&gt;.

The writers &lt;em&gt;wanted&lt;/em&gt; to use the human compute farms concept. &#160;The suits said &quot;we&#039;re idiots and we can&#039;t understand that and we refuse to believe the audience is any smarter than we are; have them be, I dunno, fucking batteries or something&quot;.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">United States</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266550">Zygo</a>.</p>
<p>The writers <em>wanted</em> to use the human compute farms concept. &nbsp;The suits said "we're idiots and we can't understand that and we refuse to believe the audience is any smarter than we are; have them be, I dunno, fucking batteries or something".</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Zygo		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266550</link>

		<dc:creator><![CDATA[Zygo]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 04:52:25 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266550</guid>

					<description><![CDATA[Risk assessment for AI:&#160; Don&#039;t share AI output with your team, because the harm will spread beyond just you.&#160; Don&#039;t use it for anything more consequential than, say, a spam filter. 

Cost assessment for AI:&#160; Don&#039;t use AI for a spam filter, either.&#160; If traditional spam filters aren&#039;t good enough, then humans are cheaper, and use less power and water for equivalent quality output.

The machines in &lt;em&gt;The Matrix&lt;/em&gt; were on to something:&#160; at some point on the output quality/power usage curve, switching from racks of GPUs to racks of humans in pods would result in a net power &lt;em&gt;saving&lt;/em&gt; for the machines.&#160; I can imagine a deleted scene, where the machines are in a meeting, saying, &quot;we switched a data center from nvidia cores to humans, and now we have enough&lt;em&gt; unused generating capacity&lt;/em&gt; to expand our population/territory/etc!&quot; and the humans mishear that as &quot;the machines are &lt;em&gt;using humans as part of their power supply.&lt;/em&gt;&quot;&#160; Yes, that line &lt;em&gt;has&lt;/em&gt; bugged me since 1999, because the physics are nonsense without this additional context, thanks for asking.

In the real world, AI models that &lt;em&gt;might&lt;/em&gt; be capable of replacing junior engineers today have usage fees on par with the salaries of senior engineers--but nobody really knows, because the cost of &lt;em&gt;testing&lt;/em&gt; that theory would be &lt;em&gt;astronomical&lt;/em&gt;.&#160; We&#039;d still need qualified people on staff to filter out the slop and catch the mistakes, so we can&#039;t do the layoffs that offset our new AI costs.&#160; Or, if we ignore the warnings and do the layoffs anyway, our new lost business, business errors, and liability costs from slop and uncaught mistakes.

I used to have a joke like this:&#160; Safe use cases for AI so far (assuming you&#039;re already OK with the copyright infringement and resource usage issues):

&lt;ol&gt;&lt;li&gt;intellectual wanking&lt;/li&gt;&lt;li&gt;actual wanking&lt;/li&gt;&lt;li&gt;end of list&lt;/li&gt;&lt;/ol&gt;

However, that was before it turned out that use case #1 sometimes kills people, and it seems nobody wants use case #2 to work except the end users of the AI systems (who, to be fair, have wanted to use every new technology for actual wanking since the invention of fire and the wheel).]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Canada</div>
<p>Risk assessment for AI:&nbsp; Don't share AI output with your team, because the harm will spread beyond just you.&nbsp; Don't use it for anything more consequential than, say, a spam filter. </p>
<p>Cost assessment for AI:&nbsp; Don't use AI for a spam filter, either.&nbsp; If traditional spam filters aren't good enough, then humans are cheaper, and use less power and water for equivalent quality output.</p>
<p>The machines in <em>The Matrix</em> were on to something:&nbsp; at some point on the output quality/power usage curve, switching from racks of GPUs to racks of humans in pods would result in a net power <em>saving</em> for the machines.&nbsp; I can imagine a deleted scene, where the machines are in a meeting, saying, "we switched a data center from nvidia cores to humans, and now we have enough<em> unused generating capacity</em> to expand our population/territory/etc!" and the humans mishear that as "the machines are <em>using humans as part of their power supply.</em>"&nbsp; Yes, that line <em>has</em> bugged me since 1999, because the physics are nonsense without this additional context, thanks for asking.</p>
<p>In the real world, AI models that <em>might</em> be capable of replacing junior engineers today have usage fees on par with the salaries of senior engineers--but nobody really knows, because the cost of <em>testing</em> that theory would be <em>astronomical</em>.&nbsp; We'd still need qualified people on staff to filter out the slop and catch the mistakes, so we can't do the layoffs that offset our new AI costs.&nbsp; Or, if we ignore the warnings and do the layoffs anyway, our new lost business, business errors, and liability costs from slop and uncaught mistakes.</p>
<p>I used to have a joke like this:&nbsp; Safe use cases for AI so far (assuming you're already OK with the copyright infringement and resource usage issues):</p>
<ol>
<li>intellectual wanking</li>
<li>actual wanking</li>
<li>end of list</li>
</ol>
<p>However, that was before it turned out that use case #1 sometimes kills people, and it seems nobody wants use case #2 to work except the end users of the AI systems (who, to be fair, have wanted to use every new technology for actual wanking since the invention of fire and the wheel).</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: jwz		</title>
		<link>https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266546</link>

		<dc:creator><![CDATA[jwz]]></dc:creator>
		<pubDate>Sun, 08 Feb 2026 01:53:55 +0000</pubDate>
		<guid isPermaLink="false">https://jwz.org/b/yk3I#comment-266546</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266545&quot;&gt;wwarner&lt;/a&gt;.

I have no idea what you are trying to say with this weird collection of sentence-shaped objects.]]></description>
			<content:encoded><![CDATA[<div class="geolocation">Via Mastodon</div>
<p>In reply to <a href="https://www.jwz.org/blog/2026/02/ai-is-a-dick-move/#comment-266545">wwarner</a>.</p>
<p>I have no idea what you are trying to say with this weird collection of sentence-shaped objects.</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
