How about plain text? Imagine writing and presentation software where all you do is think about what you want to say.
Software that provides the illusion and sensation that you’re getting stuff done without actually getting much done is a great enterprise sale.
Anil Seth:
If we confuse ourselves too readily with our machine creations, we not only overestimate them, we also underestimate ourselves.
Also, I like the suggestion that we stop using the word “hallucinate”:
The language that we ourselves use matters too. Consider how normal it has become to say that LLMs “hallucinate” when they spew falsehoods. Hallucinations in human beings are mainly conscious experiences that have lost their grip on reality (uncontrolled perceptions, one might say). We hallucinate when we hear voices that aren’t there or see a dead relative standing at the foot of the bed. When we say that AI systems “hallucinate,” we implicitly confer on them a capacity for experience.
We use language that implies consciousness exists where it actually doesn’t.
C. Thi Nguyen:
The reason that bureaucrats and politicians reach for numbers is to avoid responsibility by not having to make a judgement or exercise discretion. They take themselves out of the judgement seat — “it’s not me, it’s just the numbers”.
I quite enjoyed this podcast. So much, in fact, that I bought his book and am enjoying it so far too!
Jason Gorman:
Computing has a long history of teaching us that there are many things we thought we understood that, when we try to explain it to the computer, it turns out we don’t.
This is true of computing specifically, but also of many other intellectual exercises more broadly. To restate it:
There are many things we think we understand that, when we try to explain them to someone else (in writing or by the spoken word), we realize we don’t understand them nearly as well as we thought we did.
Coffee tastes like music. When you listen to music:
you could pick out the cello from the violin, but they blend so well together. But if the balance is off, if one instrument is too loud or the tone’s not right or something’s a bit off, very quickly it is distracting and unpleasant. It’s more than just the music, the harmony, the melodies, balance is so important too. With great coffee, you shouldn’t really notice the instruments unless you want to. You should just be enjoying the music.
Replace “great coffee” with “great software” and this holds.
Sadness. It’s that moment at the end of a cup of coffee where you go for one more sip and it’s finished and your brain’s like, “Hey, I was drinking that! I’m not ready to be done yet.” It’s such a lovely moment of sadness. It’s beautiful because it ends. Great coffee should leave you wanting more, not worrying about what you should eat or drink to get rid of that sensation in your mouth.
There’s an art of knowing when to end something.
via R. Alex Anderson
John Gruber talking about how numbers may show that yucky tactics work, but that’s not the whole story. From his experiment opting in to a waterfall of SMS marketing messages:
the marketing team running this points to those sales as proof that it “works”. You can measure that. It shows up as a number. Some people in business only like arguments that can be backed by numbers. 3 is more than 2. That is indeed a fact.
But there are an infinite number of things in life that cannot be assigned numeric values.
Instagram is a media production company:
The story of social media ever since has been a story of the refinement of feeds as a media product aimed at capturing and holding an audience. The platforms have invested billions of dollars in designing those feeds—what they contain, how they look, how they work—to make them as “engaging” as possible. To argue that the companies are still in the business of transmitting “user-generated content” is absurd. They’re not carriers anymore; they’re media companies. Yes, users still contribute posts and comments—though even those, in today’s era of influencers, creators, and AI, are often subsidized and actively shaped by the companies—but the essential content of social media is now the feeds produced by the platforms, not the individual messages posted by users. Go to Instagram and scroll through your feed. It’s obvious that what you’re experiencing is not discrete bits of user-generated content. It’s an elaborate, finely tuned media production manufactured by Instagram for an audience of one: you.
What I hate most is when companies try to turn my media into a feed for their product.
The feed is the content, and the social media company is its publisher. Period.
When you enter text into [ChatGPT], you're asking "What would a response to this sound like?"
If you put in a scientific question, and it comes back with a response citing a non-existent paper with a plausible title, using a real journal name and an author name who's written things related to your question, it's not being tricky or telling lies or doing anything at all surprising! This is what a response to that question would sound like! It did the thing!
“What would an answer to my question look like?” is very different from “What is the answer to my question?” But do we care about the difference?
It's good at generating things that sound like responses […] so people think that it's engaging in introspection or looking up more information or something, but it's not, it's only, ever, saying something that sounds like the next bit of the conversation.
Om Malik:
The point of the internet now is mostly to hook attention and push it toward commerce.
We’ve made an entire system that rewards “motion first and judgment later”:
We built machines that prize acceleration and then act puzzled that everything feels rushed and slightly manic.
Reminds me of this quote from Eric Gill:
We have elected to order manufacture upon inhuman lines; why should we ask for humanity in the product?
You get the sense from reading Om here that speed is not conducive to wisdom.
We get a culture optimized for first takes, not best takes.
The system rewards whoever speaks first, not whoever lives with it long enough to understand it […] It’s that the structure pays a premium for compliance and levies a tax on independence. The result is a soft capture where creators don’t have to be told what to say. The incentives do the talking.
Side note: Nicholas Carr’s book Superbloom is a treatise on this exact topic: how velocity in a communication medium is everything.
You’ve heard of Nick Bostrom’s paperclip maximizer? A super-intelligent AI is tasked with making paperclips and, in single-minded pursuit of that goal, converts all available resources — including humanity — into paperclips.
Nicholas Carr puts a spin on that thought experiment and concludes we’re already living in that reality — except it’s us who have become the maximizers.
Bostrom’s story, I would argue, becomes compelling when viewed not as a thought experiment but as a fable. It’s not really about AIs making paperclips. It’s about people making AIs. Look around. Are we not madly harvesting the world’s resources in a monomaniacal attempt to optimize artificial intelligence? Are we not trapped in an “AI maximizer” scenario?
Oh.
Elon Musk, having abandoned his earlier misgivings about AI, announced last week that he was merging xAI into SpaceX. The combined companies were “scaling to make a sentient sun to understand the Universe and extend the light of consciousness to the stars!” he declared. “In the long term, space-based AI is obviously the only way to scale.” It’s exactly what Bostrom predicted. The monomaniacs will not stop with the resources of the Earth. They’ll extend their plundering to the heavens. Everything is raw material.
We have met the enemy and he is us.