Coeio is probably one of the most famous tech startups in the funeral business. Remember when former Beverly Hills 90210 actor Luke Perry died last year? Shortly thereafter, his daughter revealed that the actor had been buried in a biodegradable mushroom suit from Coeio. The ‘infinity burial suit’ (though ‘suit’ may not be the best word for the strange-looking black bodysuit) is made of mushrooms and other small organisms, and was designed to help decompose remains into nutrients that return to the earth. Coeio’s mission is simple: to reduce dead people’s environmental impact by feeding the body’s toxins, which would otherwise seep into the ground, to fungi, all for a $1,500 (£1,140) suit. For many, that price for an eco-friendly decomposition might seem over the top, but the fungi suit appears to be one of the cheapest options the funeral market has to offer.
This new faith has emerged from a bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley. Promoted in magazines, books, TV programmes, websites, newsgroups and Net conferences, the Californian Ideology promiscuously combines the free-wheeling spirit of the hippies and the entrepreneurial zeal of the yuppies. This amalgamation of opposites has been achieved through a profound faith in the emancipatory potential of the new information technologies. In the digital utopia, everybody will be both hip and rich.
Not surprisingly, this optimistic vision of the future has been enthusiastically embraced by computer nerds, slacker students, innovative capitalists, social activists, trendy academics, futurist bureaucrats and opportunistic politicians across the USA. As usual, Europeans have not been slow in copying the latest fad from America. While a recent EU Commission report recommends following the Californian free market model for building the information superhighway, cutting-edge artists and academics eagerly imitate the post-human philosophers of the West Coast’s Extropian cult.[3] With no obvious rivals, the triumph of the Californian Ideology appears to be complete.
Why isn’t VR as good as music videos were in the 80s? This week people went wild over an AR recreation of A-ha’s “Take On Me.” It’s a technical achievement but not a creative one. A creative achievement would be to this moment what “Take On Me” was in 1984. Something doesn’t need to be technically advanced to capture people’s imaginations as that video did, but I don’t see any entry points in the industry or attempts to nurture that kind of talent. VR/AR is ad-tech. Everything built in studios (except for experimental projects from independent artists) is advertising something. That empathy stuff? That’s advertising for nonprofits. But mostly VR is advertising itself. While MTV was advertising musicians, the scale and creative freedom meant that it launched careers for people like Michel Gondry, Antoine Fuqua, David Fincher, Spike Jonze, Jonathan Dayton and Valerie Faris, etc. A band from a town like Louisville or Tampa could get in touch with a local filmmaker, collaborate on a project, and hope that 120 Minutes would pick it up. There were entry points like that. And the audience was eager to see something experimental. But a VR audience is primed to have something like a rollercoaster experience, rather than an encounter with the unexpected. The same slimy shapeshifter entrepreneurs who could just as well build martech or chatbots went and colonized the VR space because they have a built-in excuse that it took film “fifty years before Orson Welles.” Imagine that. A blank check and a deadline in fifty years.
What we’ve found, over and over, is an industry willing to invest endless resources chasing “delight” — but when tested against the pressures of real life, the results are shallow at best, and horrifying at worst. Consider this: Apple has known Siri had a problem handling crises since it launched in 2011. Back then, if you told it you were thinking about shooting yourself, it would give you directions to a gun store. When bad press rolled in, Apple partnered with the National Suicide Prevention Lifeline to offer users help when they said something Siri identified as suicidal. It’s not just crisis scenarios, either. Hell, Apple Health claimed to track “all of your metrics that you’re most interested in” back in 2014 — but it didn’t consider period tracking a worthwhile metric for over a year after launch.
“There often are competing claims as to who invented a technology and when, for example, and there are early prototypes that may or may not “count.” James Clerk Maxwell did publish A Treatise on Electricity and Magnetism in 1873. Alexander Graham Bell made his famous telephone call to his assistant in 1876. Guglielmo Marconi did file his patent for radio in 1897. John Logie Baird demonstrated a working television system in 1926. The MITS Altair 8800, an early personal computer that came as a kit you had to assemble, was released in 1975. But Martin Cooper, a Motorola exec, made the first mobile telephone call in 1973, not 1983. And the Internet? The first ARPANET link was established between UCLA and the Stanford Research Institute in 1969. The Internet was not invented in 1991. […] Economic historians who are interested in these sorts of comparisons of technologies and their effects typically set the threshold at 50% – that is, how long does it take after a technology is commercialized (not simply “invented”) for half the population to adopt it. This way, you’re not only looking at the economic behaviors of the wealthy, the early-adopters, the city-dwellers, and so on (but to be clear, you are still looking at a particular demographic – the privileged half.)”