Plutonic Rainbows

The Light That Was Already a Memory

Sodium vapour light had a particular trick. It didn't just illuminate a street — it replaced it. Everything under those lamps became the same deep amber-orange, a colour so dominant it overrode the actual world with a monochrome version of itself. Reds vanished. Blues turned grey. Skin took on a tone that belonged to no living complexion. You could stand on a pavement at ten in the evening and feel like you were already looking at a photograph of yourself standing there.

The UK ran on this light for decades. Low-pressure sodium, mostly — emitting that narrow wavelength around 589 nanometres. Councils chose the lamps because they were cheap and efficient, not for the aesthetics. But the aesthetic is what stuck. If you grew up in Britain between roughly 1970 and 2010, every memory of nighttime carries that cast. Every walk home from school in winter. Every car park. Every breath of condensation under a lamppost outside a shop you've since forgotten.
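For the curious: that single emission line is the whole trick. A simplified sketch of what 589 nm looks like as a screen colour, using Dan Bruton's well-known piecewise wavelength-to-RGB approximation (an illustration of the amber cast, not proper colorimetry — the edge-of-spectrum intensity falloff is omitted here):

```python
def wavelength_to_rgb(nm, gamma=0.8):
    """Rough sRGB triple for a monochromatic wavelength in nm
    (simplified from Dan Bruton's classic approximation)."""
    if 380 <= nm < 440:
        r, g, b = -(nm - 440) / 60, 0.0, 1.0
    elif 440 <= nm < 490:
        r, g, b = 0.0, (nm - 440) / 50, 1.0
    elif 490 <= nm < 510:
        r, g, b = 0.0, 1.0, -(nm - 510) / 20
    elif 510 <= nm < 580:
        r, g, b = (nm - 510) / 70, 1.0, 0.0
    elif 580 <= nm < 645:
        r, g, b = 1.0, -(nm - 645) / 65, 0.0
    elif 645 <= nm <= 780:
        r, g, b = 1.0, 0.0, 0.0
    else:
        r, g, b = 0.0, 0.0, 0.0  # outside the visible range
    return tuple(round(255 * c ** gamma) for c in (r, g, b))

# The sodium D-line: red channel saturated, blue channel zero —
# every surface lit by it can only reflect shades of this one amber.
print(wavelength_to_rgb(589))
```

With no blue component at all, the lamp physically cannot render blues or true reds — which is why they vanished under it.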

There's a photograph of Park Street in Bristol, late 1980s, that captures it exactly. The Mauretania pub glowing red through the sodium haze, Averys beneath, the Wills Memorial Building dissolving into mist at the top of the hill. Light trails from passing cars streak down the slope like something being poured. Every surface baptised in that single, overwhelming wavelength.

What made it strange — and this is hard to convey to anyone who didn't stand under it — is that the light felt temporary while it was happening. Not because anyone predicted its removal. Because there was something inherently melancholic about how it flattened the world. A row of shops, a residential street, a car park — all of it looked like a memory even while you were standing in it. The scene was fading as you watched.

Since around 2015, councils across Britain have been systematically replacing sodium with white LED. The old colour is nearly gone. When you see it now in photographs or footage, the response isn't nostalgia exactly — it's more visceral than that. That light is the past. The colour no longer exists on any street you can walk down. It has become an artefact in itself, as vanished as the world it illuminated.

Hauntological light. Already ghostly when it was real. Now it's the visual signature of an entire lost era — an amber wavelength that belongs to no living night.

The Recording Medium Was Always Me

No drug — not even the most powerful hallucinogen known — can retrieve my November 1990. A substance like 5-MeO-DMT can radically alter present-moment perception, dissolve the self, overwhelm with emotion. But it is always an experience of now. It cannot reconstruct the specific quality of light on a specific street in Bedford, or return to me the face of a particular person as they looked on a particular evening. The neurons that held that night have long since been repurposed. A powerful psychedelic might produce a feeling of reconnection — an overwhelming intensity that my mind interprets as closeness to something lost — but it would be a hallucination of return, not a return.

The past is not stored somewhere waiting for the right chemical key. It existed once, in real time, in real light, and then it was gone.

What 5-MeO-DMT actually does — what all classical psychedelics do — is agonise serotonin 2A receptors and suppress the default mode network, the brain's internal narrator. The result feels like ego dissolution, cosmic unity, connection to something vast. Users routinely describe it as the most meaningful experience of their lives. But meaning and retrieval are different operations. The drug offers an experience of profound present-moment significance. It does not offer a time machine.

Neuroscience confirms this with uncomfortable precision. Every act of recall is a reconstruction, not a playback. The process — reconsolidation — means that retrieving a memory destabilises it, renders it briefly malleable, then restores it in altered form. Each time you remember something, you change it slightly. The memory I hold of a specific evening in 1990 has been rebuilt so many times that its relationship to the original event is, at best, approximate. I'm not accessing the past. I'm accessing the last time I accessed it. And the time before that altered it too. The signal degrades with every retrieval — not despite remembering but because of it.

The cruelty is that it doesn't feel gone. My memories are vivid. The fragrances are still in the drawer. The music still plays. Everything suggests the past is right there, just out of reach — close enough to almost touch. And that "almost" is what makes it cruel. If it felt truly distant, truly alien, I could let it go. But it doesn't. It feels like a room I've just stepped out of, except the door has quietly locked behind me and there's no handle on this side. The past isn't far away — it feels impossibly close while being absolutely, permanently unreachable.

I've written before about the metabolic cost of this — the way dwelling on a vivid memory exhausts the body as though the experience were happening again. The nervous system doesn't distinguish between reconstruction and reality. It responds to the remembered room with the same activation it would bring to the actual room. Heart rate shifts. Breathing changes. Energy is spent on a place that no longer exists, and the spending leaves you tired in a way that has nothing to do with what you've actually done that day.

And the Stone Tape — Nigel Kneale's 1972 idea that the walls of buildings absorb and replay what happened within them — is a beautiful metaphor dressed as physics, not a reality. The play, broadcast on BBC Two on Christmas Day 1972 to 2.6 million viewers, proposed that stone structures might act as recording media for emotionally charged events — capturing trauma the way magnetic tape captures sound. The idea has antecedents. Charles Babbage theorised about "place-memory" in 1838, suggesting that voices and actions leave permanent but imperceptible imprints on their surroundings. The archaeologist Thomas Charles Lethbridge elaborated in his 1961 book Ghost and Ghoul. But Kneale gave the concept its most enduring dramatic form, and the name stuck.

There is no known mechanism by which stone could encode and store human experience. The walls of the Bowen West Theatre held no trace of my performance. The bricks didn't know I was there. What is real is that places can trigger memory with extraordinary power, and from the inside that feels like the same thing. But the recording medium was always me, not the stone.

I think the idea persists because it's not really a theory about buildings. It's a wish — that the world itself remembers us, that our presence leaves a mark deeper than we can see. That we weren't just passing through.

Handprints That Outlast the Hand

The cave paintings at Chauvet are roughly thirty-six thousand years old. We don't know the names of the people who made them. We don't know what language they spoke, what they believed, whether they were happy. But their handprints are still on the walls. That's not nothing.

Horace knew what he was doing when he wrote exegi monumentum aere perennius — I have built a monument more lasting than bronze. Shakespeare's sonnets are practically advertisements for the idea: the beloved will die, but the poem won't. These were conscious bids. Deliberate. The artist looked at mortality and decided to build something that could outrun it.

But I think for most people who make things, the impulse operates deeper than that. The urge to take what's inside you and fix it in a form that outlasts the moment — to write a sentence, paint a wall, carve a shape — is so fundamental that it probably predates any conscious reasoning about death. You don't sit down to write a novel because you've calculated your mortality and devised a strategy. You sit down because something inside you insists on being externalised, and the fact that the external version might survive you is almost incidental. Almost.

There's a paradox buried in this. The work survives, but it separates from its creator almost immediately. A novel outlives its author but becomes something else in the hands of every new reader. A painting hangs in a gallery long after the painter's intentions have become unknowable. The artist achieves a kind of immortality, but it's an immortality they're not present for — which brings it back to the territory I keep circling. The work becomes another object that has slipped its time, carrying a signal from someone who is no longer there to clarify it.

I've written before about things that persist after their context vanishes — fragrance bottles that carry molecules from a dead era, records that transmit cultural moments that no longer exist. Art does something similar, but with an added cruelty. A degrading perfume doesn't know it's degrading. A painting in a museum doesn't know its maker is gone. But the maker knew. Somewhere in the act of creation, whether they articulated it or not, they were reaching forward into a time they wouldn't inhabit.

The Chauvet handprints are the starkest version of this. No frame, no title card, no artist's statement. Just a hand pressed against stone with pigment blown around it — the most literal trace a person can leave. And it worked. Thirty-six thousand years later, we're still looking at it. We just have no idea who we're looking at.

So yes — art is probably the closest thing to immortality that humans have managed. But it's a haunted version. The voice persists. The speaker doesn't.

The War Outside the Frame

The Gulf War was two weeks old when this hit newsstands. Operation Desert Storm had launched on January 17th, the nightly news was all missile footage, and American Vogue's February 1991 number opened to Patrick Demarchelier photographing Karen Mulder and Elaine Irwin as though none of it was happening.

That disconnect wasn't accidental. Anna Wintour had been editor-in-chief for three years, and her Vogue didn't acknowledge the world outside the frame unless the world outside the frame was wearing something worth photographing. Claudia Schiffer got the cover — twenty years old, a year into her Chanel contract, blonde in a way that made you forget other hair colours existed. Inside, Carlyne Cerf de Dudzeele styled the Demarchelier editorial with the chromatic confidence that was already becoming her signature. She hated minimalism before minimalism had properly arrived. Dressed images the way other people decorated cakes — too much colour, absolute conviction, zero apology.

Oribe did the hair. This was peak Oribe — before the product line, when his name on a credit sheet meant volume and movement and a kind of engineered glamour that looked effortless from ten feet and technically impossible from two. Marie-Josée Lafontaine handled makeup. Between them, the credits beneath Demarchelier's name represented a team that could make anyone look extraordinary. That Mulder and Irwin already were extraordinary — or were about to become so — was almost incidental.

Mulder was twenty. Within the year she'd land her first Vogue cover, a Guess contract, and the kind of ubiquity that made "supermodel" feel like a word coined for her specifically. Irwin was twenty-one, already embedded in Ralph Lauren campaigns and months away from meeting John Mellencamp on a music video set — an encounter that would eventually pull her out of fashion's orbit entirely. Neither knew any of this yet. In February 1991 they were two models on location with good light and a photographer who understood that the best fashion images work by implying a life happening just beyond the crop marks.

Karen Mulder and Elaine Irwin, February 1991

Demarchelier had that gift. His pictures never felt staged, even when they obviously were. He lit faces the way Golden Age cinematographers lit their leading ladies — soft, directional, warm enough to suggest intimacy without the mess of actual closeness. The distance between his work and Meisel's conceptual provocations wasn't only aesthetic. It was temperamental. Demarchelier believed beauty was a sufficient subject. Meisel believed beauty needed irony around it to survive.

I found this editorial while digitising a stack of old magazines, and the light stopped me. Not the clothes — fashion from 1991 is simultaneously dated and about to come back around, the way fashion always is. The light. It hasn't aged because it was never trying to be contemporary. It was trying to make two women look like the best versions of themselves in a world that represented an escape from the brutality of the nightly news.

The war ended weeks after this issue published. The economy took another two years to find its feet. Mulder's career peaked and then unravelled in ways nobody predicted. Irwin married a rock star and raised a family. Oribe died in 2018. And this image — two women, soft light, a February that history forgot — sits exactly where they left it, unaware that everything around it moved.

Two Women in Borrowed Light

OpenAI Keeps Shipping, Nobody Keeps Cheering

GPT-5.4 launched this week. It merges the coding muscle of GPT-5.3-Codex with improved reasoning, native computer-use capabilities, and an experimental million-token context window. On paper it is the most capable model OpenAI has ever shipped. The reaction across every forum I follow has been a collective shrug.

That silence says more than any benchmark.

The GPT-5 era started badly. When OpenAI unified its model lineup last year under the GPT-5 brand, it simultaneously killed GPT-4o, GPT-4.1, GPT-4.5, and the entire o-series — with no deprecation period. Power users woke up to find the models they'd built workflows around simply gone, replaced by a router that quietly picked which sub-model would answer each query. The backlash was immediate and brutal. People who'd been paying $200 a month for o3-level reasoning discovered they were getting something closer to 4o-mini for half their prompts.

OpenAI scrambled. Within 72 hours they bolted on "Auto," "Fast," and "Thinking" toggles and restored legacy model access for paid users. The damage was done. Trust, once lost with power users, doesn't reinstall with a patch.

Since then it's been relentless iteration — GPT-5.2, 5.2-Codex, 5.3, 5.3-Codex, 5.3 Instant with its 26.8% hallucination reduction, and now 5.4. Each release brings genuine improvements. The 5.3-Codex model was legitimately impressive for agentic coding. The Instant variant finally addressed the chronic overrefusal problem that made earlier versions refuse to help you rename a variable if it contained the word "kill." These are real engineering wins.

But nobody's excited.

Part of it is the trust deficit OpenAI keeps widening through its own choices — Pentagon deals signed hours after publicly praising the competitor who refused them, ads in a product whose CEO called advertising in AI "uniquely unsettling". Part of it is that Anthropic and Google have been shipping at the same pace. Gemini 2.5 Pro has dominated the LMArena preference leaderboard for months. Claude hasn't stood still either. When everyone improves simultaneously, no single release feels like a leap.

My prediction: the version numbering treadmill will accelerate further. Hints about a 5.5 are already surfacing. Each point release will be competent, occasionally excellent, and met with diminishing enthusiasm. OpenAI's problem was never capability. It was credibility. And you can't ship your way out of that.

Escada and the Weight of 1989

Escada knitwear from the late 1980s has a particular density to it — not just in the fabric, which was heavy and substantial in a way that fast fashion has since abandoned, but in the sheer volume of design decisions packed into a single garment. Floral appliqué, colour blocks, contrasting collars, embroidered roses sitting next to geometric patches of hot pink and yellow. Every surface covered. No breathing room.

It looked expensive then. It was expensive. Escada under Margaretha Ley traded on a specific kind of European maximalism that read as affluent and cosmopolitan in its moment. The gold earrings, the structured shoulders, the saturated palette — all of it signalled a confidence that only money could buy. That confidence is still visible in the knitwear, frozen in place like a time capsule nobody asked to open.

Viewed now, the aesthetic sits somewhere between ambitious and overwhelming. The colour combinations that felt luxurious in 1989 register as cluttered today. Minimalism won. Quietly, decisively, and probably permanently. The current fashion vocabulary has so little tolerance for this kind of ornamental excess that the garments look almost archaeological — artefacts from a civilisation with different rules about how much was enough. I'm not sure they were wrong. But the distance is real.

The Sinister Menace of Teatime Warnings

A hooded figure stands at the edge of a pond. Children play nearby. The figure narrates their deaths in advance, calmly, as though reading a weather forecast. Donald Pleasence provides the voice — that dry, unhurried register he perfected across decades of playing men who have seen too much. "I'll be back," the figure promises as the credits roll. "Back... back... back." This aired during children's television. Between cartoons.

The Central Office of Information produced public information films in Britain from 1946 until its closure in 2011, but the early 1970s were the golden age of civic terror. The COI's remit was simple: warn the public about hazards. Drowning, electricity, railway lines, building sites, farm machinery. The approach they chose was less simple. Rather than gentle instruction, the films opted for a kind of controlled psychological violence — measured, institutional, and deeply unsettling. The philosophy appeared to be that a traumatised child was a safe child.

Lonely Water from 1973 remains the most discussed, but it wasn't unusual. Director John Krish earned the nickname "Doctor Death" for his COI work. His 1971 film The Sewing Machine built tension through a ticking clock and the certainty of a child's injury, revealed from the opening frame. Fireworks: Eyes, made in 1974, staged children standing motionless with their backs to the camera, gazing toward a setting sun in compositions borrowed from folk horror, before delivering its conclusion about what fireworks do to faces. These weren't afterthoughts. They were carefully constructed pieces of filmmaking that happened to be about not touching electrical substations.

The most notorious is probably Apaches, directed by John Mackenzie in 1977. Twenty-seven minutes long. Six children play cowboys on a working farm. One by one, they die — crushed by a tractor, drowned in a slurry pit, poisoned by pesticide. The film was commissioned by the Health and Safety Executive to address the roughly thirty annual child deaths on British farms, and it broke all COI booking records. Schools screened it for years. Reports of nightmares and distress among viewers were widespread and apparently considered acceptable collateral.

What makes these films linger isn't the content alone. It's the tone. The pacing is slow. The framing is static. Sound design is sparse — sometimes just wind, or the ambient hum of a field. There's no reassuring music to signal that everything will be fine. The institutional coldness that defined so much British public messaging in this period reaches its purest expression here. The narrator doesn't care about you. The camera doesn't flinch. The government is telling you, flatly, that the world will kill your children if you let it, and it's doing so in the same register it uses to announce postal rate changes.

I watched several of these again recently on the BFI Player, which hosts a free archive. They still work. Not as nostalgia, but as genuinely unnerving short films. The analogue grain, the muted colour palettes, the absence of anything reassuring in the frame — it all compounds into something that feels closer to arthouse horror than public service broadcasting. The Fatal Floor from 1974 sets up a grandmother preparing her home for a newborn and then deploys a punchline so disproportionate it almost qualifies as dark comedy.

Modern safety campaigns use empathy, relatability, bright graphics, social media integration. They want you to feel supported. The COI films of the early 1970s wanted you to feel afraid. Whether that was more effective is genuinely debatable, but it certainly produced a generation of adults who still flinch when they see an unattended body of water on a grey afternoon.

Colour Blocking Before It Had a Name

Nobody called it colour blocking in the mid-90s. The term wouldn't enter mainstream fashion vocabulary until the 2010s, when every high-street brand suddenly rediscovered the idea and pretended it was new. But the technique was already everywhere in editorial work from that era — panels of saturated colour cut against each other, no print, no pattern, just geometry and conviction.

The appeal was partly material. Knit fabrics in the early to mid-90s had a density that made flat colour sing. Not the tissue-thin polyester blends that dominate now, but something with actual body. A magenta sleeve held its shape. A cobalt collar sat where it was supposed to sit. The garments had a sculptural quality that depended entirely on the fabric doing its job, because there was nothing else to hide behind — no logo, no embellishment, no distraction.

I think what made this particular moment work was the contrast between the boldness of the colour and the restraint of everything else. Hair was undone. Makeup was minimal, or at least suggested minimal. The backgrounds were plain. All the visual energy went into those intersecting fields of red and yellow and blue, and it was enough. More than enough. The clothes didn't compete with the person wearing them so much as amplify something already there.

Colour blocking came back around 2011 courtesy of Celine and then immediately got flattened by fast fashion into something cheaper and louder. The proportions were wrong. The fabrics were wrong. What had been confident became garish because confidence doesn't survive mass production — it's the first thing to go when you're cutting corners on a factory floor in Dhaka. The shapes stayed roughly the same but the feeling evaporated entirely. A mustard panel against magenta only works if the mustard is the right mustard, and the right mustard costs more than most brands are willing to spend.

I've been listening to a lot of Seefeel lately, which has nothing to do with fashion except that it shares the same mid-90s conviction that less information, delivered precisely, beats more information delivered carelessly.

The irony is that the original editorial pieces from this period are now harder to find than vintage couture. Nobody archived knitwear. It pilled, it stretched, it got donated. The photographs survive but the objects don't, which gives the whole aesthetic a slightly ghostly quality — a different frame, a different dress, the same steady gaze — colour so vivid it feels permanent, attached to garments that were anything but.

The Technology Trap Was Always the Point

James Burke stood on the roof of the World Trade Centre in 1977, looked out across Manhattan, and asked the most dangerous question in television: what happens when the electricity stops? Not as a thought experiment. Not as speculative fiction. As a documentary premise — grounded in the 1965 New York blackout that had stranded eight hundred thousand people on the subway and turned one of the most technologically advanced cities on earth into a dark, confused village.

That was the opening of Connections, the BBC series that first aired in October 1978 and quietly became the most-watched programme in PBS history up to that point. I wrote a brief note about it years ago, but the series deserves more than a passing mention. It deserves the kind of attention we reserve for things that were right before their time and remain right long after everyone has stopped paying attention.

Burke's central argument was simple enough to state and almost impossible to accept comfortably: modern civilisation is a trap. Not a conspiracy, not a design flaw — a trap in the structural sense. Every convenience we rely on depends on systems we don't understand, maintained by specialists we'll never meet, powered by infrastructure so complex that no single person comprehends the whole of it. The 1965 blackout was his proof of concept. Millions of people discovered in the space of twelve hours that they could not feed themselves, heat their homes, or navigate their own city without a continuous supply of electricity that they had never once thought about.

Nearly fifty years later, we still don't have a good answer to his question.

The format of the show was its genius. Each episode began with some historical event or invention — the plough, the watermill, Arab astronomy — and traced a chain of consequences forward through centuries until it arrived at something recognisably modern. A loom leads to computing. A medieval need to keep food fresh leads to refrigeration leads to air conditioning leads to the demographic transformation of the American South. The connections were never obvious and never forced. Burke had done the research. He walked through historical sites, handled objects in museums, and talked directly to camera with the confidence of someone who had spent years verifying each link in the chain before committing it to film.

What made this work as television — rather than as a lecture — was Burke himself. He moved. Physically, I mean. The man was never still. He'd begin a sentence in a thirteenth-century Italian church and finish it in a twentieth-century laboratory, the cut happening mid-thought so that the viewer experienced the jump as a continuation rather than a disruption. No other documentary presenter has ever used location quite like that. Bronowski stood and reflected. Sagan sat and marvelled. Burke walked and connected, and the walking was the argument.

The production values were extraordinary for 1978. Mick Jackson directed with a restlessness that matched Burke's own energy — crane shots, tracking movements through narrow streets, occasional aerial footage that must have cost the BBC more than they'd budgeted. The score by David Cain had a synthesised unease to it, something between library music and early electronic composition, that made even the most benign historical segment feel like it was building toward a revelation. Which it usually was.

Episode five, "The Wheel of Fortune," is the one I return to most often. It traces how the invention of the stirrup changed warfare, which changed feudal land distribution, which changed agricultural practice, which eventually — and this is the part that makes you sit forward — contributed to the development of the printing press. The logic is airtight at every step and completely invisible until Burke lays it out. That's the trick. He wasn't inventing connections. He was revealing ones that had always been there, hidden by the way we compartmentalise history into tidy subjects that never speak to each other.

I think about Sagan's The Demon-Haunted World sometimes in the context of Burke's project. Sagan worried about scientific illiteracy — about a public that couldn't distinguish evidence from superstition. Burke's worry was different. He wasn't concerned that people didn't understand science. He was concerned that people didn't understand dependence. That we had built a world so intricately networked that the failure of any single node could cascade through systems in ways nobody had mapped. The technology trap wasn't ignorance. It was trust — blind, unexamined trust in systems that had no obligation to keep working.

The series spawned two sequels. Connections² arrived in 1994 with twenty episodes, and Connections³ followed in 1997 with ten more. Both were good. Neither was essential in the way the original was, partly because the format had been absorbed into the culture by then — every pop-history show owes something to Burke's method — and partly because the original had the advantage of genuine novelty. Nobody had told history that way on television before. By the nineties, plenty of people were trying to.

In 2023, CuriosityStream revived the format with Burke himself presenting, now in his late eighties and working within a CGI environment they called MindSpace. The ambition was admirable. Whether a virtual set can replace Burke walking through actual historical locations is a question I haven't fully resolved. Something about the physical presence mattered — the dust on the stones, the particular light in a medieval corridor, the sense that Burke was there and had come specifically to tell you why this place connected to something you'd never considered.

The first episode remains the most prescient. "The Trigger Effect" opened with that blackout and closed with Burke asking whether we were prepared for the next one. In 1978, that felt like a provocation. In 2026, after rolling power crises across multiple continents and a global infrastructure so interdependent that a blocked canal in Egypt can disrupt manufacturing in Stuttgart, it feels like a description of daily life. Burke wasn't warning about something that might happen. He was describing something that had already happened and that we had collectively decided not to think about.

I rewatched the full series last month. Most of it is on the Internet Archive. The picture quality is what you'd expect from late-seventies BBC film stock — warm, slightly soft, with that particular amber cast that British television had before everything went digital and cold. It doesn't matter. The arguments don't depend on resolution. If anything, the visual distance helps. It reminds you that someone was saying all of this forty-eight years ago, and that we built everything he warned about anyway.

Lavender and Leather at the Ralph Lauren Spring 1993 Show

Ralph Lauren understood something in 1993 that most designers still haven't figured out. Restraint as luxury. Not the performative minimalism that would dominate later in the decade — the Helmut Lang austerity, the Jil Sander reduction — but something warmer. A confidence that didn't need to announce itself.

The Spring/Summer 1993 collection was built around this idea. Oversized double-breasted blazers in pale lavender wool, worn loose over cream turtlenecks, cinched at the waist with a single tan leather belt. The proportions were generous without being sloppy. The palette was muted without being dull. Everything looked like it had been worn before, in the best possible sense — as though the clothes already belonged to someone rather than arriving fresh off a factory line.

What strikes me now is how completely this approach has disappeared from mainstream fashion. Lauren was selling a mood — old money at ease, a weekend in Connecticut that never actually happened — but the construction underneath was real. Those blazers had structure. The fabric had weight. You could feel the difference between this and the fast-fashion approximations that would flood the market a decade later, even through a photograph.

I keep returning to early 90s Lauren because it sits at a strange inflection point. The excess of the 80s had burned itself out but the stripped-back severity of mid-90s fashion hadn't arrived yet. For a brief window, there was this in-between space where clothes could be beautiful without being loud and expensive without being ostentatious. Lauren lived in that space more comfortably than almost anyone.