Plutonic Rainbows

Handprints That Outlast the Hand

The cave paintings at Chauvet are roughly thirty-six thousand years old. We don't know the names of the people who made them. We don't know what language they spoke, what they believed, whether they were happy. But their handprints are still on the walls. That's not nothing.

Horace knew what he was doing when he wrote exegi monumentum aere perennius — I have built a monument more lasting than bronze. Shakespeare's sonnets are practically advertisements for the idea: the beloved will die, but the poem won't. These were conscious bids. Deliberate. The artist looked at mortality and decided to build something that could outrun it.

But I think for most people who make things, the impulse operates deeper than that. The urge to take what's inside you and fix it in a form that outlasts the moment — to write a sentence, paint a wall, carve a shape — is so fundamental that it probably predates any conscious reasoning about death. You don't sit down to write a novel because you've calculated your mortality and devised a strategy. You sit down because something inside you insists on being externalised, and the fact that the external version might survive you is almost incidental. Almost.

There's a paradox buried in this. The work survives, but it separates from its creator almost immediately. A novel outlives its author but becomes something else in the hands of every new reader. A painting hangs in a gallery long after the painter's intentions have become unknowable. The artist achieves a kind of immortality, but it's an immortality they're not present for — which brings it back to the territory I keep circling. The work becomes another object that has slipped its time, carrying a signal from someone who is no longer there to clarify it.

I've written before about things that persist after their context vanishes — fragrance bottles that carry molecules from a dead era, records that transmit cultural moments that no longer exist. Art does something similar, but with an added cruelty. A degrading perfume doesn't know it's degrading. A painting in a museum doesn't know its maker is gone. But the maker knew. Somewhere in the act of creation, whether they articulated it or not, they were reaching forward into a time they wouldn't inhabit.

The Chauvet handprints are the starkest version of this. No frame, no title card, no artist's statement. Just a hand pressed against stone with pigment blown around it — the most literal trace a person can leave. And it worked. Thirty-six thousand years later, we're still looking at it. We just have no idea who we're looking at.

So yes — art is probably the closest thing to immortality that humans have managed. But it's a haunted version. The voice persists. The speaker doesn't.

The War Outside the Frame

The Gulf War was two weeks old when this hit newsstands. Operation Desert Storm had launched on January 17th, the nightly news was all missile footage, and American Vogue's February 1991 number opened to Patrick Demarchelier photographing Karen Mulder and Elaine Irwin as though none of it was happening.

That disconnect wasn't accidental. Anna Wintour had been editor-in-chief for three years, and her Vogue didn't acknowledge the world outside the frame unless the world outside the frame was wearing something worth photographing. Claudia Schiffer got the cover — twenty years old, a year into her Chanel contract, blonde in a way that made you forget other hair colours existed. Inside, Carlyne Cerf de Dudzeele styled the Demarchelier editorial with the chromatic confidence that was already becoming her signature. She hated minimalism before minimalism had properly arrived. Dressed images the way other people decorated cakes — too much colour, absolute conviction, zero apology.

Oribe did the hair. This was peak Oribe — before the product line, when his name on a credit sheet meant volume and movement and a kind of engineered glamour that looked effortless from ten feet and technically impossible from two. Marie-Josée Lafontaine handled makeup. Between them, the credits beneath Demarchelier's name represented a team that could make anyone look extraordinary. That Mulder and Irwin already were extraordinary — or were about to become so — was almost incidental.

Mulder was twenty. Within the year she'd land her first Vogue cover, a Guess contract, and the kind of ubiquity that made "supermodel" feel like a word coined for her specifically. Irwin was twenty-one, already embedded in Ralph Lauren campaigns and months away from meeting John Mellencamp on a music video set — an encounter that would eventually pull her out of fashion's orbit entirely. Neither knew any of this yet. In February 1991 they were two models on location with good light and a photographer who understood that the best fashion images work by implying a life happening just beyond the crop marks.

Karen Mulder and Elaine Irwin, February 1991

Demarchelier had that gift. His pictures never felt staged, even when they obviously were. He lit faces the way Golden Age cinematographers lit their leading ladies — soft, directional, warm enough to suggest intimacy without the mess of actual closeness. The distance between his work and Meisel's conceptual provocations wasn't only aesthetic. It was temperamental. Demarchelier believed beauty was a sufficient subject. Meisel believed beauty needed irony around it to survive.

I found this editorial while digitising a stack of old magazines, and the light stopped me. Not the clothes — fashion from 1991 is simultaneously dated and about to come back around, the way fashion always is. The light. It hasn't aged because it was never trying to be contemporary. It was trying to make two women look like the best versions of themselves in a world that represented an escape from the brutality of the nightly news.

The war ended weeks after this issue published. The economy took another two years to find its feet. Mulder's career peaked and then unravelled in ways nobody predicted. Irwin married a rock star and raised a family. Oribe died in 2018. And this image — two women, soft light, a February that history forgot — sits exactly where they left it, unaware that everything around it moved.

Two Women in Borrowed Light

OpenAI Keeps Shipping, Nobody Keeps Cheering

GPT-5.4 launched this week. It merges the coding muscle of GPT-5.3-Codex with improved reasoning, native computer-use capabilities, and an experimental million-token context window. On paper it is the most capable model OpenAI has ever shipped. The reaction across every forum I follow has been a collective shrug.

That silence says more than any benchmark.

The GPT-5 era started badly. When OpenAI unified its model lineup last year under the GPT-5 brand, it simultaneously killed GPT-4o, GPT-4.1, GPT-4.5, and the entire o-series — with no deprecation period. Power users woke up to find the models they'd built workflows around simply gone, replaced by a router that quietly picked which sub-model would answer each query. The backlash was immediate and brutal. People who'd been paying $200 a month for o3-level reasoning discovered they were getting something closer to 4o-mini for half their prompts.

OpenAI scrambled. Within 72 hours they bolted on "Auto," "Fast," and "Thinking" toggles and restored legacy model access for paid users. The damage was done. Trust, once lost with power users, doesn't reinstall with a patch.
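The routing arrangement described above can be sketched in a few lines. To be clear, everything in this sketch is invented for illustration — the model names, the toggle values, and the length heuristic are my assumptions, not OpenAI's actual implementation, which presumably uses a trained classifier rather than anything this crude.

```python
# Hypothetical sketch of a query router with explicit mode toggles.
# Model names and the 'auto' heuristic are invented for illustration;
# this is not OpenAI's actual routing logic.

def route(prompt: str, mode: str = "auto") -> str:
    """Return the sub-model that should answer this query."""
    explicit = {
        "fast": "gpt-5-mini",          # cheap, low-latency
        "thinking": "gpt-5-thinking",  # slow, deliberate reasoning
    }
    if mode in explicit:
        return explicit[mode]
    # 'auto' mode: a stand-in heuristic based on prompt length.
    # A real router would use a trained classifier, not word count.
    if len(prompt.split()) > 100:
        return "gpt-5-thinking"
    return "gpt-5-mini"
```

The launch-week complaint was essentially that paying users were landing in the cheap default branch far more often than they expected, with no visibility into which branch had fired.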

Since then it's been relentless iteration — GPT-5.2, 5.2-Codex, 5.3, 5.3-Codex, 5.3 Instant with its 26.8% hallucination reduction, and now 5.4. Each release brings genuine improvements. The 5.3-Codex model was legitimately impressive for agentic coding. The Instant variant finally addressed the chronic overrefusal problem that made earlier versions refuse to help you rename a variable if it contained the word "kill." These are real engineering wins.

But nobody's excited.

Part of it is the trust deficit OpenAI keeps widening through its own choices — Pentagon deals signed hours after publicly praising the competitor who refused them, ads in a product whose CEO called advertising in AI "uniquely unsettling". Part of it is that Anthropic and Google have been shipping at the same pace. Gemini 2.5 Pro has dominated the LMArena preference leaderboard for months. Claude hasn't stood still either. When everyone improves simultaneously, no single release feels like a leap.

My prediction: the version numbering treadmill will accelerate further. GPT-5.4 hints are already surfacing about 5.5. Each point release will be competent, occasionally excellent, and met with diminishing enthusiasm. OpenAI's problem was never capability. It was credibility. And you can't ship your way out of that.

Escada and the Weight of 1989

Escada knitwear from the late 1980s has a particular density to it — not just in the fabric, which was heavy and substantial in a way that fast fashion has since abandoned, but in the sheer volume of design decisions packed into a single garment. Floral appliqué, colour blocks, contrasting collars, embroidered roses sitting next to geometric patches of hot pink and yellow. Every surface covered. No breathing room.

It looked expensive then. It was expensive. Escada under Margaretha Ley traded on a specific kind of European maximalism that read as affluent and cosmopolitan in its moment. The gold earrings, the structured shoulders, the saturated palette — all of it signalled a confidence that only money could buy. That confidence is still visible in the knitwear, frozen in place like a time capsule nobody asked to open.

Viewed now, the aesthetic sits somewhere between ambitious and overwhelming. The colour combinations that felt luxurious in 1989 register as cluttered today. Minimalism won. Quietly, decisively, and probably permanently. The current fashion vocabulary has so little tolerance for this kind of ornamental excess that the garments look almost archaeological — artefacts from a civilisation with different rules about how much was enough. I'm not sure they were wrong. But the distance is real.

The Sinister Menace of Teatime Warnings

A hooded figure stands at the edge of a pond. Children play nearby. The figure narrates their deaths in advance, calmly, as though reading a weather forecast. Donald Pleasence provides the voice — that dry, unhurried register he perfected across decades of playing men who have seen too much. "I'll be back," the figure promises as the credits roll. "Back... back... back." This aired during children's television. Between cartoons.

The Central Office of Information produced public information films in Britain from 1946 until its closure in 2011, but the early 1970s were the golden age of civic terror. The COI's remit was simple: warn the public about hazards. Drowning, electricity, railway lines, building sites, farm machinery. The approach they chose was less simple. Rather than gentle instruction, the films opted for a kind of controlled psychological violence — measured, institutional, and deeply unsettling. The philosophy appeared to be that a traumatised child was a safe child.

Lonely Water from 1973 remains the most discussed, but it wasn't unusual. Director John Krish earned the nickname "Doctor Death" for his COI work. His 1971 film The Sewing Machine built tension through a ticking clock and the certainty of a child's injury, revealed from the opening frame. Fireworks: Eyes, made in 1974, staged children standing motionless with their backs to the camera, gazing toward a setting sun in compositions borrowed from folk horror, before delivering its conclusion about what fireworks do to faces. These weren't afterthoughts. They were carefully constructed pieces of filmmaking that happened to be about not touching electrical substations.

The most notorious is probably Apaches, directed by John Mackenzie in 1977. Twenty-seven minutes long. Six children play cowboys on a working farm. One by one, they die — crushed by a tractor, drowned in a slurry pit, poisoned by pesticide. The film was commissioned by the Health and Safety Executive to address the roughly thirty annual child deaths on British farms, and it broke all COI booking records. Schools screened it for years. Reports of nightmares and distress among viewers were widespread and apparently considered acceptable collateral.

What makes these films linger isn't the content alone. It's the tone. The pacing is slow. The framing is static. Sound design is sparse — sometimes just wind, or the ambient hum of a field. There's no reassuring music to signal that everything will be fine. The institutional coldness that defined so much British public messaging in this period reaches its purest expression here. The narrator doesn't care about you. The camera doesn't flinch. The government is telling you, flatly, that the world will kill your children if you let it, and it's doing so in the same register it uses to announce postal rate changes.

I watched several of these again recently on the BFI Player, which hosts a free archive. They still work. Not as nostalgia, but as genuinely unnerving short films. The analogue grain, the muted colour palettes, the absence of anything reassuring in the frame — it all compounds into something that feels closer to arthouse horror than public service broadcasting. The Fatal Floor from 1974 sets up a grandmother preparing her home for a newborn and then deploys a punchline so disproportionate it almost qualifies as dark comedy.

Modern safety campaigns use empathy, relatability, bright graphics, social media integration. They want you to feel supported. The COI films of the early 1970s wanted you to feel afraid. Whether that was more effective is genuinely debatable, but it certainly produced a generation of adults who still flinch when they see an unattended body of water on a grey afternoon.

Colour Blocking Before It Had a Name

Nobody called it colour blocking in the mid-90s. The term wouldn't enter mainstream fashion vocabulary until the 2010s, when every high-street brand suddenly rediscovered the idea and pretended it was new. But the technique was already everywhere in editorial work from that era — panels of saturated colour cut against each other, no print, no pattern, just geometry and conviction.

The appeal was partly material. Knit fabrics in the early to mid-90s had a density that made flat colour sing. Not the tissue-thin polyester blends that dominate now, but something with actual body. A magenta sleeve held its shape. A cobalt collar sat where it was supposed to sit. The garments had a sculptural quality that depended entirely on the fabric doing its job, because there was nothing else to hide behind — no logo, no embellishment, no distraction.

I think what made this particular moment work was the contrast between the boldness of the colour and the restraint of everything else. Hair was undone. Makeup was minimal, or at least suggested minimal. The backgrounds were plain. All the visual energy went into those intersecting fields of red and yellow and blue, and it was enough. More than enough. The clothes didn't compete with the person wearing them so much as amplify something already there.

Colour blocking came back around 2011 courtesy of Celine and then immediately got flattened by fast fashion into something cheaper and louder. The proportions were wrong. The fabrics were wrong. What had been confident became garish because confidence doesn't survive mass production — it's the first thing to go when you're cutting corners on a factory floor in Dhaka. The shapes stayed roughly the same but the feeling evaporated entirely. A mustard panel against magenta only works if the mustard is the right mustard, and the right mustard costs more than most brands are willing to spend.

I've been listening to a lot of Seefeel lately, which has nothing to do with fashion except that it shares the same mid-90s conviction that less information, delivered precisely, beats more information delivered carelessly.

The irony is that the original editorial pieces from this period are now harder to find than vintage couture. Nobody archived knitwear. It pilled, it stretched, it got donated. The photographs survive but the objects don't, which gives the whole aesthetic a slightly ghostly quality — a different frame, a different dress, the same steady gaze — colour so vivid it feels permanent, attached to garments that were anything but.

The Technology Trap Was Always the Point

James Burke stood on the roof of the World Trade Centre in 1977, looked out across Manhattan, and asked the most dangerous question in television: what happens when the electricity stops? Not as a thought experiment. Not as speculative fiction. As a documentary premise — grounded in the 1965 New York blackout that had stranded eight hundred thousand people on the subway and turned one of the most technologically advanced cities on earth into a dark, confused village.

That was the opening of Connections, the BBC series that first aired in October 1978 and quietly became the most-watched programme in PBS history up to that point. I wrote a brief note about it years ago, but the series deserves more than a passing mention. It deserves the kind of attention we reserve for things that were right before their time and remain right long after everyone has stopped paying attention.

Burke's central argument was simple enough to state and almost impossible to accept comfortably: modern civilisation is a trap. Not a conspiracy, not a design flaw — a trap in the structural sense. Every convenience we rely on depends on systems we don't understand, maintained by specialists we'll never meet, powered by infrastructure so complex that no single person comprehends the whole of it. The 1965 blackout was his proof of concept. Millions of people discovered in the space of twelve hours that they could not feed themselves, heat their homes, or navigate their own city without a continuous supply of electricity that they had never once thought about.

Nearly fifty years later, we still don't have a good answer to his question.

The format of the show was its genius. Each episode began with some historical event or invention — the plough, the watermill, Arab astronomy — and traced a chain of consequences forward through centuries until it arrived at something recognisably modern. A loom leads to computing. A medieval need to keep food fresh leads to refrigeration leads to air conditioning leads to the demographic transformation of the American South. The connections were never obvious and never forced. Burke had done the research. He walked through historical sites, handled objects in museums, and talked directly to camera with the confidence of someone who had spent years verifying each link in the chain before committing it to film.

What made this work as television — rather than as a lecture — was Burke himself. He moved. Physically, I mean. The man was never still. He'd begin a sentence in a thirteenth-century Italian church and finish it in a twentieth-century laboratory, the cut happening mid-thought so that the viewer experienced the jump as a continuation rather than a disruption. No other documentary presenter has ever used location quite like that. Bronowski stood and reflected. Sagan sat and marvelled. Burke walked and connected, and the walking was the argument.

The production values were extraordinary for 1978. Mick Jackson directed with a restlessness that matched Burke's own energy — crane shots, tracking movements through narrow streets, occasional aerial footage that must have cost the BBC more than they'd budgeted. The score by David Cain had a synthesised unease to it, something between library music and early electronic composition, that made even the most benign historical segment feel like it was building toward a revelation. Which it usually was.

Episode five, "The Wheel of Fortune," is the one I return to most often. It traces how the invention of the stirrup changed warfare, which changed feudal land distribution, which changed agricultural practice, which eventually — and this is the part that makes you sit forward — contributed to the development of the printing press. The logic is airtight at every step and completely invisible until Burke lays it out. That's the trick. He wasn't inventing connections. He was revealing ones that had always been there, hidden by the way we compartmentalise history into tidy subjects that never speak to each other.

I think about Sagan's The Demon-Haunted World sometimes in the context of Burke's project. Sagan worried about scientific illiteracy — about a public that couldn't distinguish evidence from superstition. Burke's worry was different. He wasn't concerned that people didn't understand science. He was concerned that people didn't understand dependence. That we had built a world so intricately networked that the failure of any single node could cascade through systems in ways nobody had mapped. The technology trap wasn't ignorance. It was trust — blind, unexamined trust in systems that had no obligation to keep working.

The series spawned two sequels. Connections2 arrived in 1994 with twenty episodes, and Connections3 followed in 1997 with ten more. Both were good. Neither was essential in the way the original was, partly because the format had been absorbed into the culture by then — every pop-history show owes something to Burke's method — and partly because the original had the advantage of genuine novelty. Nobody had told history that way on television before. By the nineties, plenty of people were trying to.

In 2023, CuriosityStream revived the format with Burke himself presenting, now in his late eighties and working within a CGI environment they called MindSpace. The ambition was admirable. Whether a virtual set can replace Burke walking through actual historical locations is a question I haven't fully resolved. Something about the physical presence mattered — the dust on the stones, the particular light in a medieval corridor, the sense that Burke was there and had come specifically to tell you why this place connected to something you'd never considered.

The first episode remains the most prescient. "The Trigger Effect" opened with that blackout and closed with Burke asking whether we were prepared for the next one. In 1978, that felt like a provocation. In 2026, after rolling power crises across multiple continents and a global infrastructure so interdependent that a blocked canal in Egypt can disrupt manufacturing in Stuttgart, it feels like a description of daily life. Burke wasn't warning about something that might happen. He was describing something that had already happened and that we had collectively decided not to think about.

I rewatched the full series last month. Most of it is on the Internet Archive. The picture quality is what you'd expect from late-seventies BBC film stock — warm, slightly soft, with that particular amber cast that British television had before everything went digital and cold. It doesn't matter. The arguments don't depend on resolution. If anything, the visual distance helps. It reminds you that someone was saying all of this forty-eight years ago, and that we built everything he warned about anyway.

Lavender and Leather at the Ralph Lauren Spring 1993 Show

Ralph Lauren understood something in 1993 that most designers still haven't figured out. Restraint as luxury. Not the performative minimalism that would dominate later in the decade — the Helmut Lang austerity, the Jil Sander reduction — but something warmer. A confidence that didn't need to announce itself.

The Spring/Summer 1993 collection was built around this idea. Oversized double-breasted blazers in pale lavender wool, worn loose over cream turtlenecks, cinched at the waist with a single tan leather belt. The proportions were generous without being sloppy. The palette was muted without being dull. Everything looked like it had been worn before, in the best possible sense — as though the clothes already belonged to someone rather than arriving fresh off a factory line.

What strikes me now is how completely this approach has disappeared from mainstream fashion. Lauren was selling a mood — old money at ease, a weekend in Connecticut that never actually happened — but the construction underneath was real. Those blazers had structure. The fabric had weight. You could feel the difference between this and the fast-fashion approximations that would flood the market a decade later, even through a photograph.

I keep returning to early 90s Lauren because it sits at a strange inflection point. The excess of the 80s had burned itself out but the stripped-back severity of mid-90s fashion hadn't arrived yet. For a brief window, there was this in-between space where clothes could be beautiful without being loud and expensive without being ostentatious. Lauren lived in that space more comfortably than almost anyone.

The People Who Simply Vanished

A girl I knew at school moved to another town in 1988. I never saw her again. I don't know where she went, what she did with her life, whether she's alive. There was no forwarding address, no email, no profile to search. She left on a Friday, and by Monday she had ceased to exist in any verifiable sense. I was fifteen. This was ordinary.

Before the internet, people disappeared from your life with a regularity that would seem pathological today. Not dramatically — not in the way true crime podcasts mean when they say "disappeared." Quietly. A colleague took a job somewhere else. A friend moved. A neighbour emigrated. A person you spoke to every day became, over the course of a single week, permanently irretrievable. The world absorbed them and offered nothing back.

I keep thinking about how casually we accepted this. The finality of it. You could spend three years sitting next to someone in a classroom, sharing jokes and minor confidences, and then one of you would leave — and that was it. There was no mechanism for reconnection beyond extraordinary effort. You might try directory enquiries, if you remembered their surname and guessed which town they'd landed in. You might write a letter to their old address and hope it was forwarded. More often, you did nothing. The loss barely registered as loss. It was just how things worked.

The infrastructure of connection was laughably thin. Landline telephones required you to know the number, and numbers changed when people moved. Phone books covered local areas. Letters required a postal address. If someone relocated and didn't tell you — and why would they, if you were a casual friend rather than a close one — the connection severed cleanly and permanently. There was no search engine to type their name into. No social graph linking mutual acquaintances. No algorithm to reconnect you. No suggested friends. Just silence, and eventually acceptance.

I think about a specific group of people I worked with in 1993 at a small office in Sheffield. We shared a space five days a week for almost a year. I remember first names, a few surnames, fragments of personality. One woman was saving for a house. A man was obsessed with rally driving. Someone's mother was unwell. These details survive in my memory with surprising clarity, but the people themselves are gone. When the contract ended, we dispersed. No one suggested staying in touch because staying in touch required sustained, deliberate effort — regular phone calls, letters, visits — and we all understood, without saying so, that the relationship did not warrant that level of maintenance. The threshold for sustained contact was much higher than it is now.

This created a strange emotional texture. You accumulated a growing catalogue of people you had genuinely known and would never encounter again. Not estranged. Not deliberately lost. Simply — gone. The butcher's son who moved to Canada. The woman at the next desk who left to have a baby. The friend from university who returned to Malaysia. Each departure was a small, quiet severance. You carried forward a version of them frozen at the moment of last contact, and that version slowly degraded, merging with invention, losing specificity until only an impression remained.

What strikes me now is how much this resembled a kind of low-grade grief that no one acknowledged. Writers at Psychology Today have described the concept of "commemorative friends" — people who were important to you earlier in life, held in memory with the understanding that you might never see or hear from them again. Before the internet, nearly everyone in your life outside your immediate circle was a commemorative friend in waiting. The category was so large it was invisible. You didn't mourn each departure because there were too many of them, and because the culture offered no framework for treating a drifted friendship as a genuine loss. It was simply what happened.

The asymmetry with the present is difficult to overstate. Today I can find almost anyone. A name typed into a search bar will surface a LinkedIn profile, a social media account, a local news mention, an obituary. The mystery has been eliminated so thoroughly that we've forgotten it ever existed. But for decades, the default condition of human relationships was impermanence followed by permanent silence. You met people, you knew them, they vanished, and the world closed over the gap they left behind.

I've written before about how pre-internet life was never designed to be archived — how it existed as lived experience rather than data, and how the absence of records is not a failure of retrieval but a genuine absence. The disappearance of people operates on the same principle. Those connections were not documented, tracked, or preserved. They existed in person, in proximity, in shared physical space. When the proximity ended, the connection ended. No trace remained in any system. The only archive was your own memory, and memory — as I've explored in thinking about how memories detach from their temporal anchors — is not a reliable archive of anything.

I sometimes wonder whether those people think of me. Whether the woman from Sheffield ever recalls the office we shared, the specific quality of light through those windows, the coffee machine that never worked properly. Probably not. Or if she does, she remembers a vague shape — a young man whose name she cannot retrieve, whose face has blurred into a composite of several faces from that era. This is how it goes. We were real to each other for a period, and then we became ghosts in each other's pasts. Not dead, not absent — just permanently unreachable.

There was something honest about it, though I'm reluctant to romanticise. The impermanence forced a certain presence. You paid attention to people because you sensed, even unconsciously, that this might be all the time you'd get. Conversations carried more weight when you couldn't resume them later via text message. Departures had gravity. When someone left, you understood — really understood — that this was probably the end, and you conducted yourself accordingly. There were more proper goodbyes. More deliberate last conversations. More attention to the fact of someone's physical presence before it was withdrawn.

My father had a friend called Roy who he'd known since childhood. Roy moved to Australia in 1971 and they lost contact almost immediately. For over thirty years, my father mentioned Roy occasionally — wondering aloud what had become of him, whether he'd married, whether he was still alive. There was no way to find out. In 2004, after my father had been online for a few years, he searched for Roy's name and found him within minutes. They exchanged emails. It was friendly but brief. The gap was too wide. They had become different people. The reunion answered the question but couldn't restore the relationship. The mystery had been more sustaining than the resolution.

I suspect that is the real loss here. Not the people themselves — they are out there, or they aren't, living their lives independent of my curiosity. The loss is of a world where not-knowing was a permanent and accepted condition. Where you could carry someone with you for decades as an unanswered question, and the question itself was a form of connection. The internet resolved the questions but dissolved the carrying. Now everything is either findable or confirmed dead. The middle state — alive in memory, unknown in fact — has been almost entirely eliminated.

I don't want to go back to it. But I notice its absence.

What Oxidation Does to Memory

I keep a drawer of bottles that I rarely open. Not because they're precious in the collector's sense — nobody is bidding on half-used flacons of discontinued Dior — but because each one carries a specific temporal charge that I'm not always prepared to encounter. Opening them is not like playing an old record or flipping through photographs. It's stranger than that, and more destabilising.

The world of 1990 vanished so completely that even infinite resources couldn't reconstruct it. I've written about this before — the cold clarity of realising that entire atmospheres have disappeared without ceremony. But fragrance is unlike almost any other surviving artefact from that period, and it's an idea worth dwelling on.

A compact disc from 1990 plays back exactly as it did in 1990. The data is frozen. It gives you the music but nothing of the room, nothing of the moment, nothing of you. A photograph, if you had one, would show you a surface — a face, a place — but flattened, stripped of dimension and sensation. These are recordings, but they're recordings of information, not of experience.

Fragrance is different. When you open one of those bottles in the drawer, what reaches you is a chemical substance that was actually present in the era you're grieving. Those molecules were manufactured in the late 1980s or early 1990s. They sat in department stores that no longer exist, were worn by people who have aged or died or disappeared from your life entirely. In a very literal sense, you are inhaling something that belonged to that world. It's not a representation of the past — it's a remnant of it.

But here's where the drift comes in. Fragrance degrades. Top notes evaporate over decades. Oxidation shifts the balance of a composition — terpenes and aldehydes break down into new compounds, hydroperoxides forming and collapsing into ketones and alcohols that weren't part of the original design. What you smell when you open a thirty-five-year-old bottle of something is not quite what it smelled like in 1990. It's close — recognisably close — but altered. The signal is still transmitting, but it has wandered. And that wandering is what makes it so uncanny, because it sits in a space that is neither faithful reproduction nor complete loss. It's the past almost reaching you, but not quite. A hand extended across time that falls just short of touching yours.

There's a reason smell does this more violently than sight or sound. The olfactory bulb feeds directly into the amygdala and hippocampus — the brain's emotional and memory centres — without the interpretive detour that visual and auditory signals take through the thalamus. A photograph gives you time to brace yourself. A scent does not. It arrives before you've decided whether you're ready for it, which is why opening an old bottle can feel less like remembering and more like being ambushed.

Jacques Derrida coined the term hauntology in his 1993 work Spectres of Marx to describe the persistence of things that are neither fully present nor fully absent — ghosts in the philosophical sense, not the supernatural one. Mark Fisher later applied the concept to culture and sound, exploring how certain recordings and artefacts carry the residue of futures that never arrived. I've spent time with that framework before, mostly through music. But fragrance may be its most literal expression.

A record from 1981 can be hauntological because it evokes a cultural moment that has vanished. A fragrance from 1990 is hauntological because it is the vanished moment — or what remains of it after thirty-five years of molecular decay. The distinction matters. One is a representation of loss. The other is loss actively happening, right there on your wrist.

And that near-miss is arguably more painful than total absence. If the fragrance were gone entirely, you could grieve cleanly. If it were perfectly preserved, you could close your eyes and almost believe. But instead you get this third thing — a haunted version, a ghost of a scent carrying just enough of the original to remind you of exactly what has been lost, while simultaneously proving that even the physical traces are slipping away.

I wrote recently about objects that outlive their context — things that become unsettling not through decay but through persistence, surviving into a world that no longer makes sense of them. Fragrance fits that description, with a cruel additional dimension. The object isn't merely out of time. It's actively changing while out of time, drifting further from its original state with each passing year. The drawer doesn't preserve the bottles. It slows their departure.

Perfumers understand this intuitively, even if they frame it differently. The IFRA regulations and serial reformulation of classic compositions have been debated exhaustively in fragrance circles, often with genuine anger. People talk about "vintage batches" the way audiophiles talk about original pressings — as though the earlier version contains something sacred that the new one has lost. They're not entirely wrong. But the reformulation debate concerns commercial products altered by manufacturers. What I'm describing is different. It's the slow, unauthorised revision that time itself performs on a sealed bottle. Nobody decided to change what's in there. Chemistry did. And chemistry doesn't care what the bottle meant to you.

I sprayed some Escada Pour Homme the other day — a bottle from approximately 1993, discontinued and long forgotten by anyone who doesn't haunt fragrance forums. The opening was thinner than I remembered. Sharper. Some of the warmth had retreated behind a veil of something slightly medicinal, which I suspect is the aldehydes shifting after three decades. The heart was still there, though. That particular woody amber signature that I associate with a very specific period in my life, when that fragrance was ordinary enough to buy in any department store and unremarkable enough that nobody commented on it. It reached me the way a voice reaches you through a bad phone connection — recognisable, but with parts missing. And those missing parts were precisely what hurt, because they confirmed that even the most intimate physical traces of a period are subject to the same entropy as everything else.

That's what makes vintage fragrance such a powerful hauntological object. It doesn't just represent the passage of time. It enacts it, right there on your skin.
