The American Civil War was a conflict that occurred in North America in the late Edo Period
“In an attempt to better understand their colonial subjects in those years, officials in the British empire undertook a curious and little-known research project: to collect dreams from the people of South Asia, Africa and the Pacific. The results were not what they expected.”
“Seligman struggled to impose meaning on his unusual archive. When he tried to establish universalities, exceptions and contradictions proliferated. And when he tried to draw sharp distinctions between the minds of Britons on the one hand, and colonial subjects on the other, commonalities asserted themselves. Even in a situation where researchers held all the power – with the authority of the imperial state behind them, and an elaborate theoretical structure setting the terms of the encounter – their subjects did not always follow the script.”
“Did colonial officials get what they wanted from these growing collections of Freudian data? Some results, to be sure, ended up in tendentious arguments portraying anticolonial politics as the product of mental illness. The language of ‘frustration-aggression’ reactions and ‘deculturation’ disorders allowed some British officials to suggest that calls for independence derived from inchoate expressions of anger and immaturity. Once again, however, a clear-cut vindication of empire through expert knowledge proved elusive. The same studies that furnished evidence of indigenous pathology could not avoid pointing to the damage inflicted by British rule: the crushing racial hierarchies, the lack of economic opportunities, the weirdly Anglocentric schooling. Some researchers even suggested that imperialism, not anticolonial nationalism, was the real mental disorder; they explained the behaviour of British colonialists in terms of status anxieties, sexual hang-ups, and feelings of insecurity.”
(via https://aeon.co/essays/britains-imperial-dream-catchers-and-the-truths-of-empire )
The content below was written by Jared Diamond and appeared in the August 1, 1995 edition of Discover Magazine (http://discovermagazine.com/1995/aug/eastersend543). The content of the article has been widely copy-and-pasted by others and often serves as the basis for what people think they know about the island. Here, I try to provide comments to help update the essay with knowledge that we have gained over the past 25 years. In just a few centuries [the entire prehistoric occupation was just 500 years: ca. 1200AD to 1722AD. Radiocarbon dates (Mann et al 2008) show the loss of the palm forest took this entire span of time.], the people of Easter Island wiped out their forest [note that the choice of the words “wiped out” belies the fact that the palm forest was turned into gardens over the entire prehistoric span of occupation and possibly into European times], drove their plants [large palm trees (Jubaea chilensis) with little economic value] and animals [by animals, Diamond means “seabirds.” Excavations by Steadman et al (1994) show that there were once seabirds on the island that are now extinct.] to extinction, and saw their complex society spiral into chaos [according to whom? and when? Archaeologically, we only see changes in settlement patterns after the point of European contact] and cannibalism [there is no empirical evidence of cannibalism on Rapa Nui]. Are we about to follow their lead?
On 29 November 1917 while campaigning to introduce military conscription, Hughes was the target of eggs thrown by protestors when he arrived at Warwick Railway Station in southern Queensland. Prime Minister Hughes was incensed that the attending Queensland Police would not arrest the offenders under federal law, so when he returned to Parliament he set about drafting legislation to create the Commonwealth Police Force (CPF). The ‘Warwick Incident’ was the last straw for the Prime Minister who was engaged in a range of jurisdictional struggles with the Queensland Government at the time.
via https://www.afp.gov.au/about-us/our-organisation/history-afp
Singaporeans are obsessed with food. We can expound ceaselessly on where to find the best bak chor mee (minced meat noodles) and will queue for hours for a good yong tau foo (surimi-stuffed tofu and vegetables). Perhaps because most of us are descendants of immigrants thrust into an artificial construct of a nation, or maybe because we live in a country that is constantly renewing and rebuilding, one of the few tangible things that connects us to the past and our cultural identity is food. There are many facets of Singaporean cuisine: Malay, Chinese, Indian, Eurasian (a fusion of European and Asian dishes and ingredients), Peranakan (combining Chinese and Malay food traditions), and catch-all Western, which usually means old-school Hainanese-style British food—a local version of Western food adapted by chefs from the southern Chinese province of Hainan, who worked in British restaurants or households.
via https://roadsandkingdoms.com/2019/a-history-of-singapore-in-10-dishes/
The Zuni maps, says Jim, contain something very important: a different way of looking and knowing. “To assume that people would look at the earth only from a vantage point that is above and looking straight down doesn’t consider the humanity of living on the landscape. Saying that there’s a pond, there are cattails, there are turtles in that water—that is a different view that expands the human experience of a place.” This different view is what Jim, the committee, and the artists hope the Zuni people will recognize when they encounter these maps and consider their place in the cosmos—not a world that is constructed from GPS waypoints or one that was decreed in an executive order—but a particularly Zuni world, infused with the prayers and histories that created it. The Zuni maps have a memory, a particular truth. They convey a relationship to place grounded in ancestral knowledge and sustained presence on the land. That such a relationship consistently fails to appear on modern maps has been the impetus for creating and sharing the Zuni maps—both with the A:shiwi people and with a wider audience. They remind all of us of the ancient names, voices, and stories that reside within the landscape, inviting us to examine our assumptions about what it is that makes up a place and the role that we play in that long and layered story.
“In Kalaallit Nunaat (Greenland), the Inuit people are known for carving portable maps out of driftwood to be used while navigating coastal waters. These pieces, which are small enough to be carried in a mitten, represent coastlines in a continuous line, up one side of the wood and down the other. The maps are compact, buoyant, and can be read in the dark.
These three wooden maps show the journey from Sermiligaaq to Kangertittivatsiaq, on Greenland’s East Coast. The map to the right shows the islands along the coast, while the map in the middle shows the mainland and is read from one side of the block around to the other. The map to the left shows the peninsula between the Sermiligaaq and Kangertivartikajik fjords.”
(via https://decolonialatlas.wordpress.com/2016/04/12/inuit-cartography/ )
“A plastic washing-up bottle that is at least 47 years old has been found washed up on a beach in the UK with its lettering and messaging still clear, prompting warnings about the enduring problem of plastic waste.”
For those contemplating exactly how out of control America was then compared to now, the most pertinent evidence is the book’s compendium of a near-constant series of terror bombings. The authors describe explosions in New York at National Guard headquarters, police headquarters, and three Manhattan banks; bombings in San Francisco’s Presidio and at a church during a police officer’s funeral; Molotov cocktails tossed in Wisconsin city halls and Connecticut ROTC offices; post offices, courthouses, and draft boards lit up across the country; 81 sticks of dynamite found at a Kansas university; and rocks, bottles, and eggs tossed directly at Nixon and California Gov. Ronald Reagan. According to Bryan Burrough’s 2015 book Days of Rage (Penguin Press), the U.S. suffered nearly five bombings every day during one 18-month period in 1971–72. Hijackings had become so common—33 in 1969 alone—that the president’s family was barred from flying commercial. Leary’s overseas spree (where he found himself continually squeezed as a cash cow by those he relied on) dovetailed with America’s cultural and political chaos. By January 1973, when the feds decided they weren’t going to let aggravating legal niceties hold them back and just kidnapped him in Afghanistan, the violence that had inspired Nixon to prioritize his capture was winding down. But for a while there, it was bad. The modern American populace would likely die of head-exploding embolisms if even a quarter of that sort of madness were common today.
via http://reason.com/archives/2018/06/23/1972-the-year-that-made-2018-s
The word Afghan is used in a contemporary sense to refer to someone from Afghanistan. However, this has not always been the case. For hundreds of years, Afghan as a term was primarily used by Persian speakers to describe another ethnic community, known as the Pashtuns (also called the Pakhtuns and Pathans). In the twenty-first century, the Pashtuns made up the largest ethnic group in Afghanistan, yet more Pashtuns live in Pakistan, particularly in the Federally Administered Tribal Areas, the northwestern province of Khyber Pakhtunkhwa, and Karachi. The reason I provide this background is that the Pashtuns, along with the Baloch (another ethnic group found in South Asia and Iran), probably played the most significant role in non-Aboriginal explorations of the Australian outback in the late 19th and early 20th centuries, as cameleers. For example, the most famous cameleer, Bejah Dervish, was a Baloch.
via https://newmatilda.com/2016/06/13/how-the-afghan-cameleer-campaign-accidentally-hides-diversity/
“We’re still trying to figure out what time is,” Gleick said. Time travel stories apparently help us. The inventor of the time machine in Wells’s book explains archly that time is merely a fourth dimension. Ten years later in 1905 Albert Einstein made that statement real. In 1941 Jorge Luis Borges wrote the celebrated short story, “The Garden of Forking Paths.” In 1955 physicist Hugh Everett introduced the quantum-based idea of forking universes, which itself has become a staple of science fiction.
“Time,” Richard Feynman once joked, “is what happens when nothing else happens.” Gleick suggests, “Things change, and time is how we keep track.” Virginia Woolf wrote, “What more terrifying revelation can there be than that it is the present moment? That we survive the shock at all is only possible because the past shelters us on one side, the future on another.”
“Enjoy the present. Don’t waste your brain cells agonizing about lost opportunities or worrying about what the future will bring. As I was working on the book I suddenly realized that that’s terrible advice. A potted plant lives in the now. The idea of the ‘long now’ embraces the past and the future and asks us to think about the whole stretch of time. That’s what I think time travel is good for. That’s what makes us human — the ability to live in the past and live in the future at the same time.”
via https://medium.com/the-long-now-foundation/time-travel-is-time-research-6f3248fef6b0
Kingpin reports on the collection of videos that Professor Iain Borden has compiled in his rewrite of his seminal academic work on skateboarding. His new book ‘Skateboarding and the City’ will be published in 2018 and has been brought up to date and also made interactive. To accompany the book, Iain has put together a playlist with classic clips from skateboarding’s past. The playlist is an amazing resource for skateboarding fans and you will find yourself clicking through old favourites and undiscovered gems.
But, in truth, it’s not that difficult to understand Ethereum, blockchains, Bitcoin and all the rest — at least the implications for people just going about their daily business, living their lives. Even a programmer who wants a clear picture can get a good enough model of how it all fits together fairly easily. Blockchain explainers usually focus on some very clever low-level details like mining, but that stuff really doesn’t help people (other than implementers) understand what is going on. Rather, let’s look at how the blockchains fit into the more general story about how computers impact society.
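To make the underlying data structure concrete, here is a minimal hash-chain sketch in Python. It is my own illustration, not how Bitcoin or Ethereum actually serialize blocks, and the field names are assumptions for demonstration only. The one idea it shows is that every block commits to its predecessor’s hash, so altering any historical record breaks every link after it:

```python
# Minimal hash-chain sketch. Illustrative only: real blockchains add
# signatures, consensus (e.g. mining), and peer-to-peer replication.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    return {"data": data, "prev_hash": prev_hash}

# Each block stores the hash of the one before it.
genesis = make_block("genesis", prev_hash="0" * 64)
second = make_block("alice pays bob", prev_hash=block_hash(genesis))
third = make_block("bob pays carol", prev_hash=block_hash(second))
chain = [genesis, second, third]

def verify(chain: list) -> bool:
    """Valid iff every block's prev_hash matches its actual predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

print(verify(chain))          # True
genesis["data"] = "tampered"  # rewrite history in one block...
print(verify(chain))          # ...and every later link breaks: False
```

That tamper-evidence, replicated across many machines, is the societal story in miniature: no single party can quietly edit the shared record.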
The war for the open internet is the defining issue of our time. It’s a scramble for control of the very fabric of human communication. And human communication is all that separates us from the utopia that thousands of generations of our ancestors slowly marched us toward — or the Orwellian, Huxleyan, Kafkaesque dystopia that a locked-down internet would make possible. By the end of this article, you’ll understand what’s happening, the market forces that are driving this, and how you can help stop it. We’ll talk about the brazen monopolies who maneuver to lock down the internet, the scrappy idealists who fight to keep it open, and the vast majority of people who are completely oblivious to this battle for the future. In Part 1, we’ll explore what the open internet is and delve into the history of the technological revolutions that preceded it. In Part 2, we’ll talk about the atoms. The physical infrastructure of the internet. The internet backbone. Communication satellites. The “last mile” of copper and fiber optic cables that provide broadband internet. In Part 3, we’ll talk about bits. The open, distributed nature of the internet and how it’s being cordoned off into walled gardens by some of the largest multinational corporations in the world. In Part 4, we’ll explore the implications of all this for consumers and for startups. You’ll see how you can help save the open internet. I’ll share some practical steps you can take as a citizen of the internet to do your part and keep it open.
via https://medium.freecodecamp.com/inside-the-invisible-war-for-the-open-internet-dd31a29a3f08
Spanning the years 1937-2001, the collection should especially appeal to those with an avant-garde or musicological bent. In fact, the original uploader of this archive of experimental sound, Caio Barros, put these tracks online in 2009 while a student of composition at Brazil’s State University of São Paulo. Barros’ “initiative,” as he writes at Ubuweb, “became some sort of legend” among musicophiles in the know. And yet, Ubuweb reposts this phenomenal collection with a disclaimer: “It’s a clearly flawed selection.”
via http://www.openculture.com/2016/03/the-history-of-electronic-music-in-476-tracks-1937-2001.html
For hundreds of years the now-extinct turnspit dog, also called Canis vertigus (“dizzy dog”), vernepator cur, kitchen dog and turn-tyke, was specially bred just to turn a roasting mechanism for meat. And weirdly, this animal was a high-tech fixture for the professional and home cook from the 16th century until the mid-1800s. Turnspit dogs came in a variety of colors and were heavy-set, often with heterochromatic eyes. They were short enough to fit into a wooden wheel contraption that was connected to ropes or chains, which turned the giant turkey or ham on a spit for the master of the house.
“Small corrections to the programmed sequence could be done by patching over portions of the paper tape and re-punching the holes in that section.”
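As a toy illustration of that workflow (my own sketch, assuming a made-up five-hole encoding rather than any historical machine’s actual format), you can model the tape as a list of punch-rows and the patch as a splice that pastes re-punched rows over the faulty span:

```python
# Toy model of patching a paper tape: each tuple is one row of punch holes.
# The encoding is an assumption for illustration, not a real tape format.
tape = [
    (1, 0, 1, 1, 0),
    (0, 1, 0, 0, 1),
    (1, 1, 1, 0, 0),  # <- rows 2 and 3 hold the erroneous instructions
    (0, 0, 1, 1, 1),
    (1, 0, 0, 0, 1),
]

def patch(tape, start, corrected_rows):
    """Paste corrected rows over a span, like taping over and re-punching."""
    return tape[:start] + corrected_rows + tape[start + len(corrected_rows):]

fixed = patch(tape, 2, [(1, 1, 0, 0, 0), (0, 0, 1, 0, 1)])
print(fixed)  # same tape, with rows 2 and 3 replaced
```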
I don’t know how we rehabilitate science and fact. Some large subset of our population believes that climate change is a hoax. For them, the fake is completely real. When you look at the mid-20th century, you see Germany leaving facts behind too. Citizens cease to debate the German economy, and instead put their faith in a charismatic leader. In the US now there is a large population that can’t understand what’s happening to them politically, economically or culturally. Today, people can’t understand why abortion is legal. They can’t understand why gay marriage is legal. They can’t understand where the factories have gone. It’s the turn from fact that makes fascism possible. If they turn away from reasoning altogether, they can turn toward feeling like part of a body following a charismatic leader.
In 1959, as the director of a secret military computer research centre, Kitov turned his attention to devoting ‘unlimited quantities of reliable calculating processing power’ to better planning the national economy, which was the most persistent information-coordination problem besetting the Soviet socialist project. (It was discovered in 1962, for example, that a handmade calculation error in the 1959 census goofed the population prediction by 4 million people.) Kitov wrote his thoughts down in the ‘Red Book letter’, which he sent to Khrushchev. He proposed allowing ‘civilian organisations’ to use functioning military computer ‘complexes’ for economic planning in the nighttime hours, when most military men were sleeping. Here, he thought, economic planners could harness the military’s computational surplus to adjust for census problems in real-time, tweaking the economic plan nightly if needed. He named his military-civilian national computer network the Economic Automated Management System.
via https://aeon.co/essays/how-the-soviets-invented-the-internet-and-why-it-didn-t-work
A reading list created by a group of Black, Brown, Indigenous, Muslim, and Jewish people who are writers, organizers, teachers, anti-fascists, anti-capitalists, and radicals.
So these four points can be summarized: collectivism against private property, polymorphous worker against specialization, concrete universalism against closed identities, and free association against the state. It’s only a principle, it’s not a programme. But with this principle, we can judge all political programmes, decisions, parties, ideas, from the point of view of these four principles. Take a decision: is this decision in the direction of the four principles or not? The principles are the protocol of judgement concerning all decisions, ideas, propositions. If a decision, a proposition, is in the direction of the four principles, we can say it’s a good one, we can examine if it is possible and so on. If clearly it’s against the principles, it’s a bad decision, bad idea, bad programme. So we have a principle of judgement in the political field and in the construction of the new strategic project. That is in some sense the possibility to have a true vision of what is really in the new direction, the new strategic direction of humanity as such.
via http://mariborchan.si/video/alain-badiou/reflections-on-the-recent-election/
Our foundation of Earth knowledge, largely derived from historically observed patterns, has been central to society’s progress. Early cultures kept track of nature’s ebb and flow, passing improved knowledge about hunting and agriculture to each new generation. Science has accelerated this learning process through advanced observation methods and pattern discovery techniques. These allow us to anticipate the future with a consistency unimaginable to our ancestors. But as Earth warms, our historical understanding will turn obsolete faster than we can replace it with new knowledge. Some patterns will change significantly; others will be largely unaffected, though it will be difficult to say what will change, by how much, and when.
via http://www.nytimes.com/2016/04/19/opinion/a-new-dark-age-looms.html?_r=0
Earlier this year our organization, the Rockefeller Family Fund (RFF), announced that it would divest its holdings in fossil fuel companies. We mean to do this gradually, but in a public statement we singled out ExxonMobil for immediate divestment because of its “morally reprehensible conduct.” For over a quarter-century the company tried to deceive policymakers and the public about the realities of climate change, protecting its profits at the cost of immense damage to life on this planet. Our criticism carries a certain historical irony. John D. Rockefeller founded Standard Oil, and ExxonMobil is Standard Oil’s largest direct descendant. In a sense we were turning against the company where most of the Rockefeller family’s wealth was created. (Other members of the Rockefeller family have been trying to get ExxonMobil to change its behavior for over a decade.) Approached by some reporters for comment, an ExxonMobil spokesman replied, “It’s not surprising that they’re divesting from the company since they’re already funding a conspiracy against us.” What we had funded was an investigative journalism project.
via https://www.nybooks.com/articles/2016/12/08/the-rockefeller-family-fund-vs-exxon/
Fascism became an all-purpose term because one can eliminate from a fascist regime one or more features, and it will still be recognizable as fascist. Take away imperialism from fascism and you still have Franco and Salazar. Take away colonialism and you still have the Balkan fascism of the Ustashes. Add to the Italian fascism a radical anti-capitalism (which never much fascinated Mussolini) and you have Ezra Pound. Add a cult of Celtic mythology and the Grail mysticism (completely alien to official fascism) and you have one of the most respected fascist gurus, Julius Evola. But in spite of this fuzziness, I think it is possible to outline a list of features that are typical of what I would like to call Ur-Fascism, or Eternal Fascism. These features cannot be organized into a system; many of them contradict each other, and are also typical of other kinds of despotism or fanaticism. But it is enough that one of them be present to allow fascism to coagulate around it.
Certainly this crisis makes us ask: what comes after work? What would you do without your job as the external discipline that organises your waking life – as the social imperative that gets you up and on your way to the factory, the office, the store, the warehouse, the restaurant, wherever you work and, no matter how much you hate it, keeps you coming back? What would you do if you didn’t have to work to receive an income? And what would society and civilisation be like if we didn’t have to ‘earn’ a living – if leisure was not our choice but our lot? Would we hang out at the local Starbucks, laptops open? Or volunteer to teach children in less-developed places, such as Mississippi? Or smoke weed and watch reality TV all day? I’m not proposing a fancy thought experiment here. By now these are practical questions because there aren’t enough jobs. So it’s time we asked even more practical questions. How do you make a living without a job – can you receive income without working for it? Is it possible, to begin with and then, the hard part, is it ethical? If you were raised to believe that work is the index of your value to society – as most of us were – would it feel like cheating to get something for nothing?
via https://aeon.co/essays/what-if-jobs-are-not-the-solution-but-the-problem
I’m just archiving this Asian Age summary of a lecture from 9th April 2015, because the newspaper webpage has vanished. [Photos]
Time Out listing: How would you design an object for a world that does not exist? What does such an object say about the world in which we actually live? This idea tugs at the core of ‘design fiction’ practice. For instance, the iPad first appeared in Kubrick’s 2001: A Space Odyssey. Its writer Arthur C. Clarke was also the first to imagine geostationary satellites. The impact of Minority Report on human-computer interfaces cannot be overstated. Even outside of fully formed fictional worlds, a standalone object can trigger many unexpected narratives, such as the famous 3D-printed gun or the US Army’s “indestructible sandwich”. We will discuss these and many other examples of speculative design in this talk.
Asian Age Article: (16 Apr) For Rohit Gupta, the essential question isn’t “why” but “why not”. He held forth on the concept of “design fiction” at a talk in the city recently. His previous projects include trying to figure out a way to fit astronomical contraptions on top of auto-rickshaws and coming up with a mechanism to type through walking (in which one could type out a whole text message in no less than seven hours!). While many around him may wonder “why”, for Rohit Gupta aka Compasswala aka fadesingh, the only question is “why not”. Giving a talk on design fiction at the Maker’s Asylum, the researcher who studies the history of science and mathematics explained why for him fiction was everywhere, not just in the depiction of the future, but even the past. Speaking about what exactly design fiction is, Rohit says, “It’s about the objects. Design fiction deals with how to create objects that describe or imply a story or an aspect about a world that doesn’t exist.” Going on to give us an example in his own style, Rohit says, “Let us consider hypothetically that there was a catastrophic event in Mumbai in 1960 that entirely changed the city. Now let us take a map of Mumbai in 2015 that shows how it looks now in that scenario. We don’t have to describe everything that happened in the time frame between the disaster and now, but just the map, which is an object of design fiction, can show or tell us a huge number of details about that world. ‘That’ is design fiction.” Rohit adds, “Design fiction has existed for a long time. Now we may have sci-fi movies and earlier there were books. But those were just the interfaces. It has existed for long before these interfaces came about.” While sci-fi and fiction is usually considered to depict the future or altogether different realities, Rohit contends, it is equally relevant and present in describing the past as well.
He explains, “Not many might have heard about the Ishango bone. Now the Ishango bone is considered to be the oldest mathematical instrument known to man. But basically it is just a simple bone with hand carved lines drawn on it in varying sequences. Now what these prehistoric humans were trying to do with those lines we don’t know, but researchers have interpreted various reasons ranging from calculating menstrual cycles to lunar calendars. But this is our modern interpretation of what this particular object tells us. It could well have been something else but these are the stories we are interpreting from it. So this is design fiction as well, only in the past.” Design fiction, says Rohit, varies from the minuscule to the astronomical. “You could create a simple toy in a workshop or you could even create an entire solar system like Asimov (Isaac) did in Nightfall.” But while the potential of design fiction could be limitless, it is up to us to ask the questions from whence we can derive the answers, says Rohit. “This is increasingly becoming a trend. Researchers in top institutes are taking questions that may sound ridiculous and are coming up with the most scientific explanations for them. For example, 'How does a Muslim astronomer face Mecca while in space?’ but believe it or not the Malaysians have actually come up with an entire manual for it.” And progress, says Rohit, is all about not shying away from doing what may sound crazy. “One of my friends, a poet named Christian Bök, is now engaged in a project to create the world’s first indestructible book. How he’s doing it is the most interesting part. He actually took a strain of this microbe called Deinococcus radiodurans, which is an extremophile (something which can survive in extreme conditions such as nuclear blasts, volcanoes or even in space), and is imprinting a poem into its very DNA and is planning to launch it off into space. Now whom he is writing for or what the poem itself says is irrelevant. But the only question is 'Why the hell not’,” concludes the Compasswala.
A specter is haunting Eastern Europe: the specter of what in the West is called “dissent.” This specter has not appeared out of thin air. It is a natural and inevitable consequence of the present historical phase of the system it is haunting. It was born at a time when this system, for a thousand reasons, can no longer base itself on the unadulterated, brutal, and arbitrary application of power, eliminating all expressions of nonconformity. What is more, the system has become so ossified politically that there is practically no way for such nonconformity to be implemented within its official structures.
via https://medium.com/@bruces/the-power-of-the-powerless-by-vaclav-havel-84b2b8d3a84a
If we believe that, indeed, “software is eating the world,” that we are living in a moment of extraordinary technological change, that we must – according to Gartner or the Horizon Report – be ever-vigilant about emerging technologies, that these technologies are contributing to uncertainty, to disruption, then it seems likely that we will demand a change in turn to our educational institutions (to lots of institutions, but let’s just focus on education). This is why this sort of forecasting is so important for us to scrutinize – to do so quantitatively and qualitatively, to look at methods and at theory, to ask who’s telling the story and who’s spreading the story, to listen for counter-narratives.
“There often are competing claims as to who invented a technology and when, for example, and there are early prototypes that may or may not “count.” James Clerk Maxwell did publish A Treatise on Electricity and Magnetism in 1873. Alexander Graham Bell made his famous telephone call to his assistant in 1876. Guglielmo Marconi did file his patent for radio in 1897. John Logie Baird demonstrated a working television system in 1926. The MITS Altair 8800, an early personal computer that came as a kit you had to assemble, was released in 1975. But Martin Cooper, a Motorola exec, made the first mobile telephone call in 1973, not 1983. And the Internet? The first ARPANET link was established between UCLA and the Stanford Research Institute in 1969. The Internet was not invented in 1991. […] Economic historians who are interested in these sorts of comparisons of technologies and their effects typically set the threshold at 50% – that is, how long does it take after a technology is commercialized (not simply “invented”) for half the population to adopt it. This way, you’re not only looking at the economic behaviors of the wealthy, the early-adopters, the city-dwellers, and so on (but to be clear, you are still looking at a particular demographic – the privileged half.)”
–The Best Way to Predict the Future is to Issue a Press Release. Audrey Watters.
China has the world’s preeminent cuisine, absolutely unparalleled in its diversity and its sophistication. You can find practically everything you could possibly desire in terms of food in China. From exquisite banquet cookery, exciting street food, bold spicy flavors, honest farmhouse cooking, delicate soups, just everything, apart perhaps from cheese, although they do actually have a couple of kinds of cheese [laughs] in Yunnan province. Also, China is such a food-orientated culture, and it has been since the beginnings of history, that if you want to understand China, almost more than anywhere else, food is a really good window into the culture, into the way people live, into history, everything.
There are two different angles at play in the discussion about colonialism and science. First is what constitutes scientific epistemology and what its origins are. As a physicist, I was taught that physics began with the Greeks and later Europeans inherited their ideas and expanded on them. In this narrative, people of African descent and others are now relative newcomers to science, and questions of inclusion and diversity in science are related back to “bringing science to underrepresented minority and people of color communities.” The problem with this narrative is that it isn’t true. For example, many of those “Greeks” were actually Egyptians and Mesopotamians under Greek rule. So, even though for the last 500 years or so science has largely been developed by Europeans, the roots of its methodology and epistemology are not European. Science, as scientists understand it, is not fundamentally European in origin. This complicates both racist narratives about people of color and innovation as well as discourse around whether science is fundamentally wedded to Euro-American operating principles of colonialism, imperialism and domination for the purpose of resource extraction.
via https://medium.com/@chanda/decolonising-science-reading-list-339fb773d51f
Gómez’s study is the first thorough survey of violence in the mammal world, collating data on more than a thousand species. It clearly shows that we humans are not alone in our capacity to kill each other. Our closest relatives, the chimpanzees, have been known to wage brutal war, but even apparently peaceful creatures take each other’s lives. When ranked according to their rates of lethal violence, ground squirrels, wild horses, gazelle, and deer all feature in the top 50. So do long-tailed chinchillas, which kill each other more frequently than tigers and bears do.
The primates—the order that includes us, apes, monkeys, and lemurs—seem to be especially violent. While just 0.3 percent of mammal deaths are caused by members of the same species, that rate rose to 2.3 percent in the common ancestor of primates, and dropped slightly to 1.8 percent in the ancestor of great apes. That’s the lethal legacy that humanity inherited.
That isn’t to imply determinism. Even within the apes, chimps are notably more aggressive than bonobos, which suggests that group-wide capacities for violence can be tempered by other factors. And history shows that humans have also varied greatly in our violent tendencies. We are influenced by our history, but not saddled to it.
Gómez’s team showed as much by poring through statistical yearbooks, archaeological sites, and more, to work out causes of death in 600 human populations from 50,000 BC to the present day. They concluded that rates of lethal violence originally ranged from 3.4 to 3.9 percent during Paleolithic times, making us only slightly more violent than you’d expect for a primate of our evolutionary past. That rate rose to around 12 percent during the bloody Medieval period, before falling again over the last few centuries to levels even lower than our prehistoric past.
The cloud, however, remains a model of the world, just not the one we have taken it to mean. The apparent growth of crisis is, in part, a consequence of our new, technologically-augmented ability to perceive the world as it actually is, beyond the mediating prism of our own cultural sensorium. The stories we have been telling ourselves don’t bear out. They’re weak all over. The cloud reveals not the deep truth at the heart of the world, but its fundamental incoherence, its vast and omniferous unknowability. In place of computational thinking, we must respond with cloud thinking: an accounting of the world which reclaims the recognition and the agency of unknowing. Aetiology is a dead end. The cloud, our world, is cloudy: it remains diffuse and forever diffusing; it refuses coherence. From our global civilisation and cultural history arises a technology of unknowing; the task of our century is to accommodate ourselves with the incoherence it reveals.
Rest in peace, HyperCard. It was one of the most important applications in the history of personal computing, in my humble opinion, and responsible for the “amazing bloom” of ideas and applications noted by Ben Hyde and Matt Jones. I made a few things with it, and I’m pretty sure they weren’t in the ‘amazing bloom’ class — but I can certainly say HyperCard was a massive influence on who I am now. (Ed. This article was originally published at cityofsound.com on 4th April 2004.)
via https://medium.com/a-chair-in-a-room/hypercard-rip-c9126c28020b
Facebook’s Mission Statement states that your objective is to “make the world more open and connected”. In reality you are doing this in a totally superficial sense.
If you will not distinguish between child pornography and documentary photographs from a war, this will simply promote stupidity and fail to bring human beings closer to each other.
To pretend that it is possible to create common, global rules for what may and what may not be published, only throws dust into peoples’ eyes.
– Espen Egil Hansen (Editor-in-chief and CEO Aftenposten)
Building and maintaining an n-to-n communications platform for over a billion *daily* active users across multiple access platforms *is* difficult and *is* hard and you’ve done it and congratulations, that was lots of work and effort. You - and your Valley compatriots - talk excitedly and breathlessly about solving Hard Problems and Disrupting Things, but other areas - other areas that are *also* legitimate hard problems like content moderation and community moderation and abuse (which isn’t even a new thing!) - do not appear to interest you. They appear to interest you to such a little degree that it looks like you’ve given up *compared to* the effort that’s put into other hard problems.
You can’t have it both ways. You can’t use rhetoric to say that your people - not just engineers - are the best and the brightest working to solve humanity’s problems without also including the asterisk that says “Actually, *not all hard problems*. Not all difficult problems. Just some. Just the engineering ones, for example.” What you’re doing right now - with your inflexible process that’s designed to be efficient and work at scale without critically being able to deal *at scale* with nuance and context (which, I’d say, is your difficult problem and a challenge you should *relish* - how do you deal with nuance at scale in a positive manner?!) - smacks of algorithmic and system-reductionism.
–Dan Hon, s3e27: It’s Difficult
It is tempting to make every fiasco at Facebook about the power (and the abuse of power) of the algorithm. The “napalm girl” controversy does not neatly fit that storyline. A little-known team of humans at Facebook decided to remove the iconic photo from the site this week.
That move revealed, in a klutzy way, just how much the company is struggling internally to exercise the most basic editorial judgment, despite claims by senior leadership that the system is working.
–Aarti Shahani, With ‘Napalm Girl,’ Facebook Humans (Not Algorithms) Struggle To Be Editor
The same week Nick Ut’s picture didn’t make it, the small town of East Liverpool (Ohio) posted two photographs of a couple that had overdosed in their car, with a small child sitting right behind them. Addiction experts were quick to point out that public shaming would very likely be counterproductive. In this case, it was reported, “a Facebook spokesperson said the photos did not violate the company’s community standards.”
As in the case of Ut’s picture, the decision over whether or not to publicly share photographs like the two East Liverpool ones ought to be in the hands of highly trained photo editors, people who not only have the knowledge to understand the “news value” of the photographs, but who have also wrestled with the different underlying ethical problems.
However much any editor’s decisions might be flawed at times, at the very least we can be certain that they have thought about the underlying problems, that, in other words, we’re looking at the end result of an educated process (regardless of whether we end up agreeing with it or not). The world of Facebook does away with this.
– Jörg M. Colberg,The Facebook Problem
Mockup of the Dynabook conceived by Xerox PARC’s Alan Kay, 1970s. Source: https://www.parc.com/newsroom/media-library.html via http://p-dpa.net/in-defense-of-poor-media/
People in the innovation-obsessed present tend to overstate the impact of technology not only in the future, but also the present. We tend to imagine we are living in a world that could scarcely have been imagined a few decades ago. It is not uncommon to read assertions like: “Someone would have been unable at the beginning of the 20th century to even dream of what transportation would look like a half a century later.” And yet zeppelins were flying in 1900; a year before, in New York City, the first pedestrian had already been killed by an automobile. Was the notion of air travel, or the thought that the car was going to change life on the street, really so beyond envisioning—or is it merely the chauvinism of the present, peering with faint condescension at our hopelessly primitive predecessors? The historian Lawrence Samuel has called social progress the “Achilles heel” of futurism. He argues that people forget the injunction of the historian and philosopher Arnold Toynbee: Ideas, not technology, have driven the biggest historical changes. When technology changes people, it is often not in the ways one might expect: Mobile technology, for example, did not augur the “death of distance,” but actually strengthened the power of urbanism. The washing machine freed women from labor, and, as the social psychologists Nina Hansen and Tom Postmes note, could have sparked a revolution in gender roles and relations. But, “instead of fueling feminism,” they write, “technology adoption (at least in the first instance) enabled the emergence of the new role of housewife: middle-class women did not take advantage of the freed-up time … to rebel against structures or even to capitalize on their independence.” Instead, the authors argue, the women simply assumed the jobs once held by their servants.
via http://nautil.us/issue/28/2050/why-futurism-has-a-cultural-blindspot
Five years ago, Matthew Kirschenbaum, an English professor at the University of Maryland, realized that no one seemed to know who wrote the first novel with the help of a word processor. He’s just published the fruit of his efforts: Track Changes, the first book-length story of word processing. It is more than a history of high art. Kirschenbaum follows how writers of popular and genre fiction adopted the technology long before vaunted novelists did. He determines how their writing habits and financial powers changed once they moved from typewriter to computing. And he details the unsettled ways that the computer first entered the home. (When he first bought a computer, for example, the science-fiction legend Isaac Asimov wasn’t sure whether it should go in the living room or the study.)
“In the early days of the automobile, it was drivers’ job to avoid you, not your job to avoid them,” says Peter Norton, a historian at the University of Virginia and author of Fighting Traffic: The Dawn of the Motor Age in the American City. “But under the new model, streets became a place for cars — and as a pedestrian, it’s your fault if you get hit.” One of the keys to this shift was the creation of the crime of jaywalking.
As much as my Wired archive is a document of its era’s aspirations, it’s also a record of what people once hoped technology would be—and, in hindsight, a record of what it might have become. In early Wired, a piece about a five-hundred-thousand-dollar luxury “Superboat” would be followed by a full-page editorial urging readers to contact their legislators to condemn wiretapping (in this case, 1994’s Digital Telephony Bill). Stories of tech-enabled social change and New Economy capitalism weren’t in competition; they coexisted and played off one another. In 2016, some of my colleagues and I have E.F.F. stickers on our company-supplied MacBooks—“I do not consent to the search of this device,” we broadcast to our co-workers—but dissent is no longer an integral part of the industry’s ethos.
‘Peer review’ was a term borrowed from the procedures that government agencies used to decide who would receive financial support for scientific and medical research. When 'referee systems’ turned into 'peer review’, the process became a mighty public symbol of the claim that these powerful and expensive investigators of the natural world had procedures for regulating themselves and for producing consensus, even though some observers quietly wondered whether scientific referees were up to this grand calling. Current attempts to reimagine peer review rightly debate the psychology of bias, the problem of objectivity, and the ability to gauge reliability and importance, but they rarely consider the multilayered history of this institution. Peer review did not develop simply out of scientists’ need to trust one another’s research. It was also a response to political demands for public accountability. To understand that other practices of scientific judgement were once in place ought to be a part of any responsible attempt to chart a future path. The imagined functions of this institution are in flux, but they were never as fixed as many believe.
via http://www.nature.com/news/peer-review-troubled-from-the-start-1.19763?WT.mc_id=SFB_NNEWS_1508_RHBox
It was the ultimate goal of many schools of occultism to create life. In Muslim alchemy, it was called Takwin. In modern literature, Frankenstein is obviously a story of abiogenesis, and not only does the main character explicitly reference alchemy as his inspiration but it’s partially credited for sparking the Victorian craze for occultism. Both the Golem and the Homunculus are different traditions’ alchemical paths to abiogenesis, in both cases partially as a way of getting closer to the Divine by imitating its power. And abiogenesis has also been an object of fascination for a great deal of AI research. Sure, in recent times we might have started to become excited by its power to create a tireless servant who can schedule meetings, manage your Twitter account, spam forums, or just order you a pizza, but the historical context is driven by the same goal as the alchemists - create artificial life. Or more accurately, to create an artificial human. Will we get there? Is it even a good idea? One of the talks at a recent chatbot convention in London was entitled “Don’t Be Human”. Meanwhile, possibly the largest test of an intended-to-be-humanlike - and friendlike - bot is going on via the Chinese chat service WeChat.
via http://www.antipope.org/charlie/blog-static/2016/04/5-magical-beasts-and-how-to-re.html
Located on the futurist left end of the political spectrum, fully automated luxury communism (FALC) aims to embrace automation to its fullest extent. The term may seem oxymoronic, but that’s part of the point: anything labeled luxury communism is going to be hard to ignore. “There is a tendency in capitalism to automate labor, to turn things previously done by humans into automated functions,” says Aaron Bastani, co-founder of Novara Media. “In recognition of that, then the only utopian demand can be for the full automation of everything and common ownership of that which is automated.” Bastani and fellow luxury communists believe that this era of rapid change is an opportunity to realise a post-work society, where machines do the heavy lifting not for profit but for the people.
I believe that it is correct to view luxury communism from a utopian perspective, not in the sense of something that is impossible but in the sense of something that attempts to open up the sense of future possibilities as opposed to a mere repetition of present conditions. Partially this is to act as a critique of the present, partially to act as a spur towards an open future. Indeed, the use of the term ‘communism’ implies a radical alternative future vision, one that is subversive of the present and, yes, even utopian. It is here that I think that fully automated luxury communism, by putting too much faith in capitalist technology overcoming scarcity and the need for labour, fails to imagine a more general transformation of social relations. To avoid this tendency, and to encourage thinking about the overcoming of the paradoxes and miseries of capitalism, we need to seriously engage in utopian experimentation in future possibilities.
via https://libcom.org/blog/fully-automated-luxury-communism-utopian-critique-14062015
This tension between freedom and stability was long ago formalized in two sets of official and binding rules: the International Code of Zoological Nomenclature (ICZN), which deals with animals, and the International Code of Nomenclature for algae, fungi, and plants (ICN). Periodically updated by committees of working taxonomists, these documents set out precise, legalistic frameworks for how to apply names both to species and to higher taxa. (The animal and plant codes operate independently, which means that an animal can share a scientific name with a plant, but not with another animal, and vice versa.) While this freedom opens up a valuable space for amateur contributions, it also creates a massive loophole for unscrupulous, incompetent, or fringe characters to wreak havoc. That’s because the Principle of Priority binds all taxonomists into a complicated network of interdependence; just because a species description is wrong, poorly conceived, or otherwise inadequate, doesn’t mean that it isn’t a recognized part of taxonomic history. Whereas in physics, say, “unified theories” scrawled on napkins and mailed in unmarked envelopes end up in trashcans, biologists, regardless of their own opinions, are bound to reckon with the legacy of anyone publishing a new name. Taxonomists are more than welcome to deal with (or “revise”) these incorrect names in print, but they can’t really ignore them.
via http://nautil.us/issue/35/boundaries/why-do-taxonomists-write-the-meanest-obituaries
This story may help explain Hume’s ideas. It unquestionably exemplifies them. All of the characters started out with clear, and clashing, identities—the passionate Italian missionary and the urbane French priest, the Tibetan king and lamas, the Siamese king and monks, the skeptical young Scot. But I learned that they were all much more complicated, unpredictable, and fluid than they appeared at first, even to themselves. Both Hume and the Buddha would have nodded sagely at that thought. Although Dolu and Desideri went to Siam and Tibet to bring the wisdom of Europe to the Buddhists, they also brought back the wisdom of the Buddhists to Europe. Siam and Tibet changed them more than they changed Siam and Tibet. And his two years at La Flèche undoubtedly changed David Hume.
Aussie - Peter Drew (via http://flic.kr/p/FEWn7G )
This is the first part of ‘A Brief History of Neural Nets and Deep Learning’. In this part, we shall cover the birth of neural nets with the Perceptron in 1958, the AI Winter of the 70s, and neural nets’ return to popularity with backpropagation in 1986.
via http://www.andreykurenkov.com/writing/a-brief-history-of-neural-nets-and-deep-learning/
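For readers who want the 1958 algorithm in concrete form, here is a minimal sketch of a Rosenblatt-style perceptron update rule. It is my own toy example on made-up, linearly separable data, not code from the linked post:

```python
# Perceptron sketch: learn weights w and bias b so that
# sign(w . x + b) matches the +1/-1 labels. Mistake-driven updates only.
def train_perceptron(data, epochs=20, lr=1.0):
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, label in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            prediction = 1 if activation >= 0 else -1
            if prediction != label:  # update only on a misclassification
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
                b += lr * label
    return w, b

# Toy data (an assumption for illustration): points above y = x are +1.
data = [((0.0, 1.0), 1), ((1.0, 2.0), 1), ((1.0, 0.0), -1), ((2.0, 1.0), -1)]
w, b = train_perceptron(data)
print(w, b)  # e.g. [-1.0, 1.0] 0.0: the line y = x separates the classes
```

The limitation that fed the AI Winter is visible here too: a single such unit can only draw one line, so it cannot learn XOR; backpropagation’s contribution in 1986 was a way to train stacks of these units.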
this is about social sadism – deliberate, invested, public or at least semi-public cruelty. The potentiality for sadism is one of countless capacities emergent from our reflexive, symbolising selves. Trying to derive any social phenomenon from any supposed ‘fact’ of ‘human nature’ is useless, except to diagnose the politics of the deriver. Of course it’s vulgar Hobbesianism, the supposed ineluctability of human cruelty, that cuts with the grain of ruling ideology. The right often, if incoherently, acts as if this (untrue) truth-claim of our fundamental nastiness justifies an ethics of power. The position that Might Makes Right is elided from an Is, which it isn’t, to an Ought, which it oughtn’t be, even were the Is an is. If strength and ‘success’ are coterminous with good, what can their lack be but bad – deserving of punishment?
The station initially operated under the name Radio Assel, but also became known under the name Radio Hoog Buurlo. ‘Kootwijk Radio’ was the international call sign for radio traffic. Queen Emma opened the first telephone connection with the Dutch East Indies in 1929, with the legendary words: “Hello Bandoeng! Hello Bandoeng! Can you hear me?”. The first conversations, which invariably concluded with the Dutch national anthem Wilhelmus, were free as the service was still in an experimental phase. Subsequently, people had to pay considerable amounts for a phone call to family members overseas.
The recent decades have seen a dramatic reversal in the conceptualisation of inattention. Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterised as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way.
via http://www.metafilter.com/158361/Your-attention-please
(via http://flic.kr/p/DTPFT1 )
Science isn’t fiction, science is weirder than fiction. Teleportation, ubiquity, levitation, spontaneous appearance. Inconceivable on a human scale, but totally logical on the scale of elementary particles. Working in the field of quantum mechanics is a wild ride, and even though the mythical Pauli Effect was a private joke among highly scientific minds, some of them were nonetheless superstitious enough to ban Wolfgang Pauli from even entering their lab. In quantum physics as well as in photography, the act of observing is not a neutral act. It participates in the outcome of a scene. These photos are sometimes real, sometimes completely fabricated. The observer is an actor in fixing what is science and what is myth.
Writers and travellers alike do their best work when they don’t know what they’re looking for; disorientation requires problem-solving, and a new landscape holds secrets still. These days, I never totally unpack my suitcase. I buy only folding toothbrushes. I leave, often, on short notice—my record is three and a half hours before takeoff, for a transatlantic trip—and I like my mind best when it’s on the move. To land somewhere unfamiliar is to force yourself into alertness, to redraw whatever maps you have, to set the stage for creativity more than mere pattern-matching productivity.
The only reason that anyone could be induced to take part in such a dangerous business was the fabulous profit that could be made. Gideon Allen & Sons, a whaling syndicate based in New Bedford, Massachusetts, made returns of 60% a year during much of the 19th century by financing whaling voyages—perhaps the best performance of any firm in American history. It was the most successful of a very successful bunch. Overall returns in the whaling business in New Bedford between 1817 and 1892 averaged 14% a year—an impressive record by any standard. New Bedford was not the only whaling port in America; nor was America the only whaling nation. Yet according to a study published in 1859, of the 900-odd active whaling ships around the world in 1850, 700 were American, and 70% of those came from New Bedford.
The fact is that slaving was at the very centre of state-making. It is impossible to exaggerate the massive effects of this human commodity on stateless societies. Wars between states became a kind of booty capitalism, where the major prize was human traffic. The slave trade then completely transformed the non-state ‘tribal zone’. Some groups specialised in slave-raiding, mounting expeditions against weaker and more isolated groups and then selling them to intermediaries or directly at slave markets. The oldest members of highland groups in Laos, Thailand, Malaysia and Burma can recall their parents’ and grandparents’ memories of slave raids. The fortified, hilltop villages, with thorny, twisting and hidden approaches that early colonists found in parts of South-East Asia and Africa were largely a response to the slave trade.
http://www.lrb.co.uk/v35/n22/james-c-scott/crops-towns-government
Swami Vivekananda’s Raja Yoga, published in 1896, became a best-seller and had a lasting impact on American culture […] Vivekananda’s yoga didn’t involve the asanas, or poses, that we know as yoga today, because asana-based yoga is a modern phenomenon—one that emerged from the Indian nationalist movement’s attempt to develop a distinctly Indian version of what was then called physical culture (essentially, physical fitness). The short version of this story, which scholars like Mark Singleton and Joseph Alter have described, is that Indian innovators combined facets of medieval tantric practices with elements from Indian wrestling exercises, British army calisthenics, and Scandinavian gymnastics. They called their system “yoga,” a word that previously had had very different connotations.
(via http://flic.kr/p/Bgx7TY )
Then there’s Elgaland-Vargaland, which was thought up by two Swedish artists – and is meant to consist of all the areas of “No Man’s Land” across the world, including the land marking the borders between other nations and any bits of the sea outside another country’s territorial waters; any time you have travelled abroad, you have passed through Elgaland-Vargaland. In fact, of all the countries Middleton has looked at, this is the closest to his starting point, Narnia – since the artists claim that any time you enter a dream, or let your mind wander, you have also crossed a border and temporarily taken a trip into Elgaland-Vargaland.
http://www.bbc.com/future/story/20151103-the-countries-that-dont-exist
Yet though her creation is everywhere, Myers and the details of her life’s work are curiously absent from the public record. Not a single independent biography is in print today. Not one article details how Myers, an award-winning mystery writer who possessed no formal training in psychology or sociology, concocted a test routinely deployed by 89 of the Fortune 100 companies, the US government, hundreds of universities, and online dating sites like Perfect Match, Project Evolove and Type Tango. And not one expert in the field of psychometric testing, a $500 million industry with over 2,500 different tests on offer in the US alone, can explain why Myers-Briggs has so thoroughly surpassed its competition, emerging as a household name on par with the Atkins Diet or The Secret.
The home of the future has a long history. In 1893, at the World’s Fair in Chicago, domestic science and home economics were presented on the global stage for the first time as academic disciplines, topics to be systematically considered and innovated upon. In 1933, the Chicago World’s Fair was themed “Century of Progress.” It had a whole exhibition called Homes of Tomorrow, advertised by a flyer touting “the home of the new era … a steel house you would want to live in,” one that’s “fireproof and sanitary.” The home itself was now fair game for innovation, and companies like Monsanto and General Motors started to get on board.
http://www.eater.com/2015/9/15/9326775/the-kitchen-of-the-future-has-failed-us
Culinary Luddites are right, though, about two important things. We need to know how to prepare good food, and we need a culinary ethos. As far as good food goes, they’ve done us all a service by teaching us how to use the bounty delivered to us (ironically) by the global economy. Their culinary ethos, though, is another matter. Were we able to turn back the clock, as they urge, most of us would be toiling all day in the fields or the kitchen; many of us would be starving. Nostalgia is not what we need. What we need is an ethos that comes to terms with contemporary, industrialized food, not one that dismisses it; an ethos that opens choices for everyone, not one that closes them for many so that a few may enjoy their labor; and an ethos that does not prejudge, but decides case by case when natural is preferable to processed, fresh to preserved, old to new, slow to fast, artisanal to industrial.
https://www.jacobinmag.com/2015/05/slow-food-artisanal-natural-preservatives/
Aboriginal Australia (Not Suitable for use in Native Title and other Land Claims)
“Our members do not recoil from the future. We believe that life on earth is embarked on a unique trajectory, one that will not be repeated. We believe that the outward journey has entailed a long and intricate interweaving of the interests of all living things. We believe that the homeward path will entail the systematic unweaving of those threads. We believe we are eminently suited for a role in this process.”
http://www.southsoundchapterwnps.org/fun/interviewfungus.htm
“The Sumerians were so serious about their beer that they had their own deity devoted to the beverage, Ninkasi. Ninkasi was the goddess of beer and alcohol, who brewed the beverage daily to “satisfy the desire” and “sate the heart.” One of the earliest known devotions to Ninkasi was a hymn written on clay tablets dating to 1800 BC. Called “The Hymn to Ninkasi,” it was more than just a devotional script or prayer: it was a detailed recipe and procedure for making beer”
That painstaking process is similar to the technique Cobden-Sanderson and Walker used to create the Doves type, itself a confection of two earlier designs. Doves owes most to the type of Nicholas Jenson, a Venetian printer from the 15th century whose clear and elegant texts shunned the gothic blackletter favoured by print’s early pioneers. A few letters were added, and others redrawn. The arrow-straight descender of its lower case ‘y’ divides critics; purists lament the thick crossbar of the upper case ‘H’. Most people neither notice nor care. “No more graceful Roman letter has ever been cut and cast,” opined A.W. Pollard, a contemporary critic, in the Times. Simon Garfield, a modern writer, celebrates its rickety form, which looks “as if someone had broken into the press after hours and banged into the compositor’s plates.”
The Osborne 1 among the Mujahideen
The typewriters of famous authors.
Manichaeism was a major Gnostic religion that was founded by the Iranian prophet Mani (c. 216–276 AD) in the Sasanian Empire. Manichaeism taught an elaborate dualistic cosmology describing the struggle between a good, spiritual world of light, and an evil, material world of darkness. Through an ongoing process which takes place in human history, light is gradually removed from the world of matter and returned to the world of light whence it came. Its beliefs were based on local Mesopotamian gnostic and religious movements.
Manichaeism thrived between the third and seventh centuries, and at its height was one of the most widespread religions in the world. Manichaean churches and scriptures existed as far east as China and as far west as the Roman Empire. It was briefly the main rival to Christianity in the competition to replace classical paganism. Manichaeism survived longer in the East than in the West, and it appears to have finally faded away after the 14th century in southern China, contemporary to the decline in China of the Church of the East. While most of Mani’s original writings have been lost, numerous translations and fragmentary texts have survived.
Until original sources were discovered in the 1900s, the only sources for Manichaeism were descriptions and quotations from non-Manichaean authors, whether Christian, Muslim, Buddhist or Zoroastrian. While often criticizing Manichaeism, they also quoted directly from Manichaean scriptures. This enabled Isaac de Beausobre, writing in the 18th century, to create a comprehensive work on Manichaeism relying solely on anti-Manichaean sources.
“President Xi Jinping and the Chinese government have committed over $16 billion towards building the required infrastructure to recreate the centuries-old trade route stretching from China to the Mediterranean. The new ‘Silk Road Economic Belt’, a high-speed train line running through Eurasia, Iran and Turkey before finishing in Western Europe, is one of two large-scale, global trading projects China is aiming to create, as well as the ‘Maritime Silk Road’, which will run via Southeast Asia, India, and Kenya, before finishing in the Mediterranean.”
The casual alteration of idioms risks nothing less than “cultural and linguistic chaos”, it warns. Chinese is perfectly suited to puns because it has so many homophones. Popular sayings and even customs, as well as jokes, rely on wordplay. But the order from the State Administration for Press, Publication, Radio, Film and Television says: “Radio and television authorities at all levels must tighten up their regulations and crack down on the irregular and inaccurate use of the Chinese language, especially the misuse of idioms.” Programmes and adverts should strictly comply with the standard spelling and use of characters, words, phrases and idioms – and avoid changing the characters, phrasing and meanings, the order said. “Idioms are one of the great features of the Chinese language and contain profound cultural heritage and historical resources and great aesthetic, ideological and moral values,” it added.
Ghost Stations of the London Underground
This map offers an alternative way to browse the 2,619,833 images contained in the Internet Archive’s book collection. It shows 5,500 different subjects which have been algorithmically arranged by their thematic relationships. The size of each link corresponds to the number of images available for that topic. Clicking on a link opens the flickr page containing all the pictures for that subject.
http://incubator.quasimondo.com/internetarchive/InternetArchiveBookSubjectsMap.html
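For a rough sense of how the sizing rule described above might work, here is a small sketch of my own (not quasimondo’s actual code, and the subject names and counts are invented for illustration): label sizes scale with the logarithm of each subject’s image count, so a huge spread in counts still yields readable type.

    import math

    # Hypothetical subjects and image counts -- the real map covers
    # ~5,500 subjects; these three entries are made up for illustration.
    subjects = {"Astronomy": 48210, "Botany": 12977, "Heraldry": 804}

    def label_size(count, min_pt=8, max_pt=48, max_count=50000):
        # Log scaling keeps a 60x spread in image counts from producing
        # an unreadably wide spread in type sizes.
        scale = math.log1p(count) / math.log1p(max_count)
        return round(min_pt + (max_pt - min_pt) * scale)

    for name, count in sorted(subjects.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {count} images -> {label_size(count)}pt label")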
Some people identify the birth of virtual reality in rudimentary Victorian “stereoscopes,” the first 3D picture viewers. Others might point to any sort of out-of-body experience. But to most, VR as we know it was created by a handful of pioneers in the 1950s and 1960s. In 1962, after years of work, filmmaker Morton Heilig patented what might be the first true VR system: the Sensorama, an arcade-style cabinet with a 3D display, vibrating seat, and scent producer. Heilig imagined it as one in a line of products for the “cinema of the future.”
Twenty-Eight Equally Sized European Union Member States
In any intelligence agency it is important to keep track of crank contacts, not only to improve protection but also to assure continuity of control and analysis. Centralization of records in CIA’s Office of Security permits quick identification of phonies and time-wasters. Professional security officers know how to handle the off-beat approach, and others would do well to rely on the professionals when they receive an irrational letter or find themselves face to face with an apparently unbalanced stranger.
It was from the RAND study that the false rumor started claiming that the ARPANET was somehow related to building a network resistant to nuclear war. This was never true of the ARPANET; only the unrelated RAND study on secure voice considered nuclear war. However, the later work on Internetting did emphasize robustness and survivability, including the capability to withstand losses of large portions of the underlying networks.
http://www.internetsociety.org/internet/what-internet/history-internet/brief-history-internet
A trip to Mars, with its invisible technology and vast, unprecedented distance from home, could estrange or alienate a crew to an unprecedented degree. Such a distance could produce an entirely new kind of boredom, impossible to imagine on Earth.
http://aeon.co/magazine/being-human/what-four-months-on-mars-taught-me-about-boredom/
What if a future decentralized social networking platform allowed everyone to connect their capture node, for the use of any other artist, or just a chosen circle of friends? We already use Google Street View for location scouting. What if it enabled us to change to any angle and scrub back and forth in time as well, and from any “open” node near it, side to side, and from drones above, not just from a single Google car that passed by once? This is the Constant Moment. This is as close to a time machine as we’re likely to get. Great technological leaps will be required to fulfill the furthest reaches of the Constant Moment. Massive gains in the quality of search and organization, not to mention cost of storage, and resolution. Perhaps even some form of a neural interface. But it’s clear to me this is a “when,” not an “if,” and artists need to begin anticipating this future, to inspire and guide the technologists, and to keep up with the military dreamers (it’s been said that in childhood development the destructive urge precedes the creative one by months, as blocks get knocked down long before they get stacked.) To the photographer that still thinks photography mostly means being physically present, crouched behind their Leica, finger poised to capture the classic vision of the Decisive Moment, this coming Constant Moment might be terrifyingly sacrilegious, or perhaps just terrifying, like an insect eye dispassionately staring.
We have ten thousand years of data showing what has worked, and what has not, in the realms of cultural practices, politics, warfare, and economics, to name but a few. Applied History looks at this data and extracts the valuable lessons that can guide us in structuring our present and our future.
The images were rampantly blurred, grainy, scratched, and often just muddled shades of gray. The compositions were negligible, if they could be called compositions at all. Moriyama’s pictorial choices seemed to have been made completely at random, and the reproductions often included the sprocket holes at the negatives’ edges, like a film gone completely off its track. With thirty-five years’ hindsight, it’s easy to see the book as the spiritual godfather of the garage-band aesthetic that dominated commercial design in the eighties and nineties, typified by Raygun magazine and 4AD Records. The visual aesthetic of punk owes Moriyama a debt, as does every art school naïf who has ever taken it upon himself to boil his negatives; piss in the developer tray; mangle, staple, and tear at his prints; or otherwise molest the mechanics of the medium to achieve what by now are fairly standard results.
Documenting the Origins, History & Chaos of the Discordian Society
Medieval brewers had used many problematic ingredients to preserve beers, including, for example, soot and fly agaric mushrooms. More commonly, other “gruit” herbs had been used, such as stinging nettle and henbane. Indeed, the German name of the latter, Bilsenkraut, may originally have meant “Plzeň herb”, indicating that this region was a major centre of beer brewing long before the invention of (Reinheitsgebot-compliant) Pilsener.
Typographers seem eager to dismiss wider spaces as some sort of fad, either something ugly that originated with typewriters, or some sort of Victorian excess that lasted for a few brief decades and quickly petered out. But this is simply not the case. As we will explore presently, the large space following a period was an established convention for English-language publishers (and many others in Europe) in the 1700s, if not before, and it did not truly begin to fade until around 1950.
The Gini Coefficient, which can measure inequality in any set of numbers, has been in use for a century, but until recently it rarely left the halls of academia. Its one-number simplicity endeared it to political scientists and economists; its usual subject—economic inequality—made it popular with sociologists and policy makers. The Gini Coefficient has been the sort of workhorse metric that college freshmen learn about in survey courses and some PhD statisticians devote a lifetime to. It’s been so useful, so adaptable, that its strange history has survived only as a footnote: the coefficient was developed in 1912 by Corrado Gini, an Italian sociologist and statistician—who also wrote a paper called “The Scientific Basis of Fascism.”
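The coefficient itself fits in a few lines. Here is a minimal sketch (mine, not from the article) using the standard formula over sorted values, G = 2*sum(i * x_i) / (n * sum(x)) - (n + 1)/n: it returns 0 for perfect equality and approaches 1 as a single member of the set holds everything.

    # Minimal sketch of the Gini coefficient for any set of
    # non-negative numbers (incomes, wealth, citation counts...).
    def gini(values):
        xs = sorted(values)
        n, total = len(xs), sum(xs)
        if n == 0 or total == 0:
            return 0.0  # treat an empty or all-zero set as perfectly equal
        # Rank-weighted sum: 1*x(1) + 2*x(2) + ... + n*x(n)
        weighted = sum((i + 1) * x for i, x in enumerate(xs))
        return (2 * weighted) / (n * total) - (n + 1) / n

    print(gini([10, 10, 10, 10]))  # 0.0  -- everyone holds the same amount
    print(gini([0, 0, 0, 100]))    # 0.75 -- the maximum for n = 4 is (n - 1)/n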
The conventional view of Machiavelli is as an unscrupulous amoralist, for whom, as Alasdair MacIntyre argues, the only ends of social and political life ‘are the attainment and holding down of power’. Moral rules are merely ‘technical rules about the means to these ends’. Because Machiavelli viewed all humans as inherently corrupt, ‘we may break a promise or violate an agreement at any time if it is in our own interests to do so, for the presumption is that, since all men are wicked, those with whom you have contracted may at any time break their promises if it is in their interest.’
If you attempt to make sense of Engelbart’s design by drawing correspondences to our present-day systems, you will miss the point, because our present-day systems do not embody Engelbart’s intent. Engelbart hated our present-day systems. If you truly want to understand NLS, you have to forget today. Forget everything you think you know about computers. Forget that you think you know what a computer is. Go back to 1962. And then read his intent. The least important question you can ask about Engelbart is, “What did he build?” By asking that question, you put yourself in a position to admire him, to stand in awe of his achievements, to worship him as a hero. But worship isn’t useful to anyone. Not you, not him. The most important question you can ask about Engelbart is, “What world was he trying to create?” By asking that question, you put yourself in a position to create that world yourself.
By “augmenting human intellect” we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. Increased capability in this respect is taken to mean a mixture of the following: more-rapid comprehension, better comprehension, the possibility of gaining a useful degree of comprehension in a situation that previously was too complex, speedier solutions, better solutions, and the possibility of finding solutions to problems that before seemed insoluble. And by “complex situations” we include the professional problems of diplomats, executives, social scientists, life scientists, physical scientists, attorneys, designers–whether the problem situation exists for twenty minutes or twenty years. We do not speak of isolated clever tricks that help in particular situations. We refer to a way of life in an integrated domain where hunches, cut-and-try, intangibles, and the human “feel for a situation” usefully co-exist with powerful concepts, streamlined terminology and notation, sophisticated methods, and high-powered electronic aids.
Multics is no longer produced or offered for sale; Honeywell no longer even makes computers. People now edit on desktop computers so cheap and fast that not only do redisplay algorithms no longer matter, but the whole idea of autonomous redisplay in a display editor is no longer a given (although autonomous redisplay’s illustrious child, WYSIWYG, is now the standard paradigm of the industry). There is now no other kind of editor besides what we then called the “video editor.” Thus, all of the battles, acrimony, and invidious or arrogant comparisons in what follows are finished and done with, and to be viewed in the context of 1979 – this is a historical document about Multics and the evolution of an editor. It is part of the histories of Multics, of Emacs, and of Lisp.
MONEY is perhaps the most basic building-block in economics. It helps states collect taxes to fund public goods. It allows producers to specialise and reap gains from trade. It is clear what it does, but its origins are a mystery. Some argue that money has its roots in the power of the state. Others claim the origin of money is a purely private matter: it would exist even if governments did not. This debate is long-running but it informs some of the most pressing monetary questions of today.
Recorded sound was first imagined as long ago as 1552. In the fourth book of François Rabelais’ Gargantua and Pantagruel, there’s a tale about crossing the Frozen Sea where, the previous winter, there had been a battle between two warring tribes. The noise of combat had turned to ice but, as the sea unfroze, so too did the sounds, pouring forth in a torrent of war cries, whinnying horses and clashing weapons. This ‘cryosonic’ notion of sound as a solid, retrievable object appealed so much to Pierre Schaeffer that, in 1952, he created a piece called Les paroles dégelées (Thawed Words), in which he altered the timbre of a voice reading Rabelais’ work aloud by various tape manipulation techniques – a process he had already dubbed musique concrète.
Boston, as the Eagle and the Wild Goose See It (via https://upload.wikimedia.org/wikipedia/en/9/9d/Boston,_as_the_Eagle_and_the_Wild_Goose_See_It.jpg)
Although she writes, “I would not dream of denying the evolutionary heritage present in our bodies,” Zuk briskly dismisses as simply “wrong” many common notions about that heritage. These errors fall into two large categories: misunderstandings about how evolution works and unfounded assumptions about how paleolithic humans lived. The first area is her speciality, and “Paleofantasy” offers a lively, lucid illustration of the intricacies of this all-important natural process. When it comes to the latter category, the anthropological aspect of the problem, Zuk treads more gingerly. Not only is this not her own field, but, as she observes, it is “ground often marked by acrimony and rancor” among the specialists themselves.
http://www.salon.com/2013/03/10/paleofantasy_stone_age_delusions/
However tawdry their origins, the creation of new media of exchange – coinage appeared almost simultaneously in Greece, India, and China – appears to have had profound intellectual effects. Some have even gone so far as to argue that Greek philosophy was itself made possible by conceptual innovations introduced by coinage. The most remarkable pattern, though, is the emergence, in almost the exact times and places where one also sees the early spread of coinage, of what were to become modern world religions: prophetic Judaism, Christianity, Buddhism, Jainism, Confucianism, Taoism, and eventually, Islam. While the precise links are yet to be fully explored, in certain ways, these religions appear to have arisen in direct reaction to the logic of the market. To put the matter somewhat crudely: if one relegates a certain social space simply to the selfish acquisition of material things, it is almost inevitable that soon someone else will come to set aside another domain in which to preach that, from the perspective of ultimate values, material things are unimportant, and selfishness – or even the self – illusory.
The Bronze Age collapse was a transition in the Aegean Region, Southwestern Asia and the Eastern Mediterranean from the Late Bronze Age to the Early Iron Age that historians such as M. Liverani, S. Richard, Robert Drews, Frank J. Yurco, Amos Nur, Leonard R. Palmer, and others believe was violent, sudden and culturally disruptive. The palace economy of the Aegean Region and Anatolia which characterised the Late Bronze Age was replaced, after a hiatus, by the isolated village cultures of the Greek Dark Ages.
Usually when I speculate about the future, I stick to two areas; either the really near future (within the next couple of decades), or the really far future (so far out that signs of continental drift should be glaringly obvious). But what about the medium term?
http://www.antipope.org/charlie/blog-static/2012/11/2512.html