Saturday, October 22, 2016
10 BOOKS THAT DON’T EXIST, BUT SHOULD
Scott Esposito in Literary Hub:
The ten books below are selections from Scott Esposito’s The Missing Books, available exclusively as an electronic download from his website. The Missing Books is a curated directory of nearly 100 books that don’t exist, but should. Its listings are taken from the ranks of books that have not yet been published (but might one day be), books within books, and books that their authors never managed to complete.
The Missing Books is a living document. As Esposito discovers more missing books (and as circumstances demand changes to this list), he will update The Missing Books and release new editions. Anyone who purchases The Missing Books is entitled to all future versions of it, for free.
The Passenger by Cormac McCarthy (Reputed manuscript-in-progress by Cormac McCarthy.)
McCarthy’s most recent novel, The Road, was published in 2006, ten years prior to the time of this writing; at no other point in McCarthy’s 50-year career has the author let such a span pass without publishing a new novel. He has reportedly filled this time with two major projects, The Passenger being the one about which the most is known. Some light was shed on The Passenger in August 2015 (creating a small media firestorm) when McCarthy appeared at a Lannan Foundation event where he reportedly read excerpts from the book. The Passenger is said to be a very long book set in New Orleans, and there is conjecture that it engages theoretical physics and tropes of science fiction to a large degree. Additionally, there are reports that the book has been continually pushed back and may at one time have had a 2016 release date.
More here.
Posted by S. Abbas Raza at 06:04 PM | Permalink
| Comments
What a Legless Mouse Tells Us About Snake Evolution
Ed Yong in The Atlantic:
At a lab in Berkeley, California, there’s a mouse with no legs. Its head, torso, and tail are normal. It just lacks limbs. It didn’t lose those limbs; it never grew them in the first place. And that’s because a team of researchers led by Axel Visel at the Lawrence Berkeley National Laboratory had replaced part of its DNA—a small sequence known as ZRS—with the equivalent sequence from a snake. That tiny change was enough to “serpentize” the mouse, to stop it from developing any limbs.
ZRS is not a gene itself. Rather, it’s an enhancer—a stretch of DNA that controls the activity of genes. These sequences have long been thought to drive the wide variety of body shapes found in back-boned animals. By influencing when and where genes are activated, they can produce astonishing variety from the same basic toolkit, changing everything from the length of limbs to the number of toes.
“But it’s been difficult identifying concrete examples of this,” says Visel. Enhancers are hard to identify. You can’t just eyeball a stretch of DNA and work out where the enhancers are. They also tend to sit far away from the genes that they control—they’re like a sentence in a book that changes the meaning of a paragraph several chapters away.
More here.
Posted by S. Abbas Raza at 05:46 PM | Permalink
| Comments
Noam Chomsky Has 'Never Seen Anything Like This'
Chris Hedges in Films For Action:
Noam Chomsky is America’s greatest intellectual. His massive body of work, which includes nearly 100 books, has for decades deflated and exposed the lies of the power elite and the myths they perpetrate. Chomsky has done this despite being blacklisted by the commercial media, turned into a pariah by the academy and, by his own admission, being a pedantic and at times slightly boring speaker. He combines moral autonomy with rigorous scholarship, a remarkable grasp of detail and a searing intellect. He curtly dismisses our two-party system as a mirage orchestrated by the corporate state, excoriates the liberal intelligentsia for being fops and courtiers and describes the drivel of the commercial media as a form of “brainwashing.” And as our nation’s most prescient critic of unregulated capitalism, globalization and the poison of empire, he enters his 81st year warning us that we have little time left to save our anemic democracy.
“It is very similar to late Weimar Germany,” Chomsky told me when I called him at his office in Cambridge, Mass. “The parallels are striking. There was also tremendous disillusionment with the parliamentary system. The most striking fact about Weimar was not that the Nazis managed to destroy the Social Democrats and the Communists but that the traditional parties, the Conservative and Liberal parties, were hated and disappeared. It left a vacuum which the Nazis very cleverly and intelligently managed to take over.”
More here.
Posted by S. Abbas Raza at 05:34 PM | Permalink
| Comments
Benedict Cumberbatch Reads Sol LeWitt's letter to Eva Hesse
For Morgan Meis. Video length: 6:09
Posted by S. Abbas Raza at 05:30 PM | Permalink
| Comments
Albert Murray’s Symphonic Elegance Sings in a New Anthology
Dwight Garner at The New York Times:
“It is always open season on the truth,” the great cultural critic Albert Murray wrote in his first and probably best book, “The Omni-Americans” (1970), “and there never was a time when one had to be white to take a shot at it.”
Murray (1916-2013) took his share of shots in “The Omni-Americans.” He skewered social scientists for pathologizing black life in what he called “this great hit-and-miss republic.” He poured scorn upon black protest writers and certain novelists, including Richard Wright, for insisting on narratives of victimhood and marginalization. Not for him were novels that “read like interim research reports.”
Part of Murray’s genius was for sounding so cheerful in the midst of battle. He’d pause during an extended and elegant argument to toss off a riff like this one (the dated word “meriny” refers to a light skin and hair tone): “If U.S. Negroes don’t already have self-pride and didn’t know black, brown, beige and freckles, and sometimes even m’riny is beautiful, why do they always sound so good, so warm, and even cuss better than everyone else?” Murray, it should be said, was an imaginative swearer himself. Henry Louis Gates Jr. said of his conversation, “Imagine Redd Foxx with a graduate degree in literature.”
more here.
Posted by Morgan Meis at 02:05 PM | Permalink
| Comments
AFTER SZYMBORSKA, AND POLISH POETRY TODAY
Sean Gasper Bye at The Quarterly Conversation:
During the Polish poet Wisława Szymborska’s lifetime, it was commonly said that in Poland each of her new volumes was greeted with a rush to the bookshops, with enthusiastic readers even memorizing and reciting her verses. After winning the Nobel Prize for Literature in 1996, her fame spread worldwide. Modest and private, Szymborska found the experience mortifying—she reportedly referred to her Nobel Prize as “the Stockholm tragedy” and kept the medal itself in a drawer.
Szymborska and her fellow Nobel Prize–winner Czesław Miłosz formed two opposite poles (if you’ll pardon the expression) of the postwar generation of Polish poets. Miłosz’s intellectual seriousness and grandiose ego contrasted with Szymborska’s accessible wit and self-effacing charm. But much united them—both survived the Second World War, both embraced and then abandoned Communism, and both endeavored to express their country’s suffering through their work. Though chafing against the idea of political poetry, they shared with their fellow postwar poets a conviction that poetry should tackle the big questions—life and death, freedom and slavery.
By the time Szymborska passed away in 2012, she was one of the last exemplars of that school: Polish poetry had been blown wide open by the collapse of Communism two decades before, and, reconnected with the Western world, younger poets looked abroad for inspiration.
more here.
Posted by Morgan Meis at 02:01 PM | Permalink
| Comments
Why Claudio Magris’s Danube is a timely elegy for lost Europe
Richard Flanagan at The Guardian:
Danube was originally published in Italian in 1986, the same year Mikhail Gorbachev introduced the Soviet Union to two new concepts: glasnost and perestroika. Written during the final efflorescence of the cold war – when, as we now know, the world came the closest it has ever been to a nuclear war – the countries of what was then called eastern Europe had become, after four decades of isolating Soviet rule, terra incognita to many in the west.
Ignorance always summons greater ignorance in its defence. When Danube was published in English, in 1989, the influential American Kirkus Reviews called the book “heavy-going” in its description of what it termed “this little-known (at least to most Americans) corner of Europe”. The New York Times reviewer tellingly declared his preference for the Rhine as the river of civilisation, “closer to our western world and to our history ... It only sends its Nibelungen to the east to get them massacred by the hordes of Attila.”
more here.
Posted by Morgan Meis at 01:57 PM | Permalink
| Comments
Saturday Poem
Persian Letters
Dear Aleph,
Like Ovid: I’ll have no last words.
This is what it means to die among barbarians. Bar bar bar
was how the Greeks heard our speech —
sheep, beasts — and so we became
barbarians. We make them reveal
the brutes they are, Aleph, by the things
we make them name. David,
they tell me, is the one
one should aspire to, but ever since
I first heard them say Philistine
I’ve known I am Goliath
if I am anything.
by Solmaz Sharif
from Poetry, 12/2014
Posted by Jim Culleny at 08:28 AM | Permalink
| Comments
A Welcome Change: Radical Hospitality
Matthew Browne in Harvard Magazine:
Last week, a brimming crowd of grayed, bespectacled, and Tyvek-ed Cantabrigians, dotted throughout with important figures from the Harvard administration and faculty, packed into Sanders Theatre to hear actress and playwright Anna Deavere Smith.
...All of the scenes spoke to Smith’s notion of Radical Hospitality, which was only loosely defined, to the point of being difficult to pin down. At different times, she presented it as the virtue of patience, laboring to empathize with others, and giving the exiled a home, just to name a few. Radical Hospitality, in its elasticity, ran the risk of not seeming radical at all, and just becoming a stand-in for the warm nicety du jour. But there seemed to be a stable core that held it together: people around the world ought to do a better job of treating each other as welcomed guests. Like the maxim “Love thy neighbor,” the principle is apparent, simple, and unsurprising—but to insist on its importance, and to hold oneself and others to its standards, is radical. A lot of what seemed novel about Smith’s concept was in language: the focus on the very word hospitality, and the attempt to trace its political import. We are familiar to the point of callousing with the idea that we should love strangers or that we should empathize with others, but we rarely hear that we should be more hospitable. The word feels new in our mouths. Focusing on hospitality reinvigorates the vitality of a word that’s retreated to the hotel and dining room. And these common associations strengthen Smith’s political usage, rendering otherwise abstract debates in terms of warm, ground-level personal relations. Offering amnesty to refugees, for example, can be thought of as a matter of hospitality; should we not feel the same careful responsibility to those around the world that we do to those in our homes? The idea suggests that there is an ethics to our etiquette and an etiquette to our ethics.
Posted by Azra Raza at 07:28 AM | Permalink
| Comments
What’s Up With Those Voices in Your Head?
Casey Schwartz in The New York Times:
In the course of his life, Vincent van Gogh wrote hundreds of letters to his beloved brother Theo. “I have the grounds pretty well in my mind, and will choose a fine potato field at my ease,” he wrote in the early 1880s, when he was 30 and just beginning to think of himself as an artist. Vincent’s letters often sounded more like private speech than outward exchange; he didn’t seem to expect or require a reply. The act of writing, the expression of his internal, inchoate jumble of thoughts, was a crucial part of his creative process, helping him orient himself within his own vision and plan its execution. In “The Voices Within: The History and Science of How We Talk to Ourselves,” Charles Fernyhough, a professor of psychology at Durham University in England, points to van Gogh’s letters as showing how these voices in our heads are connected to larger questions of thought, decision making, creativity — even consciousness itself.
Inner voices are Fernyhough’s subject, but he admits they are slippery, hard to track, chaotic and cacophonous. “A solitary mind is actually a chorus,” he writes. Tune into yours right now: What are you hearing? Who’s speaking, and when did the conversation begin? This is ambiguous territory. Measuring one’s own private soundtrack is hard enough. Now add in the confounding element of other people’s, too. “Studying something as private and ineffable as our inner voices was, my elders might have warned me, never going to furnish a successful research career,” Fernyhough writes. Yet he has a penchant for exploring exactly these kinds of shifting landscapes. In an earlier book, “Pieces of Light,” he took on memory, building an artful case for the intensely improvised, subjective way we recall the experiences that make up our lives. In “The Voices Within,” he has again rendered complicated mental experience without losing its human texture, as so often happens when psychological questions are addressed in the lab.
More here.
Posted by Azra Raza at 07:10 AM | Permalink
| Comments
Friday, October 21, 2016
For the Wealthy, Citizenship at a Premium
Max Holleran in Boston Review:
This summer’s holiday season in the Mediterranean began with the startling announcement, from the International Organization for Migration, that more than 3,000 migrants have already died in 2016 attempting to cross into Europe over the Mediterranean Sea. While Germany resettled nearly a million people in 2015, other EU nations have been far more reluctant. Since last year, the European public has resolutely told their national leaders to begin deportations and reform border security, often in urgently nationalistic language of the kind found in Brexit’s “Breaking Point” ad. The EU has begun to tighten entry for those immigrating from outside of the continent, and securing the southern border has become an existential test of whether the political federation can survive. Mediterranean countries are on the frontline of this effort despite their limited economic resources compared to their wealthier Northern neighbors. They have been tasked with the role of sentry, patrolling the walls of fortress Europe. Yet a backdoor to the castle seems to have been left open.
Since the 2008 financial crisis, many Mediterranean countries have begun to offer citizenship-for-sale to non-European nationals. These countries include places hit hard by austerity like Cyprus, Portugal, and Spain (where the program is called “golden visa” in a nod to the optimism about the value of an EU passport as well as excitement for the wealth that citizenship investors could potentially bring). Often connected to the purchasing of property, these programs offer residency, a passport, and—after several years—full citizenship to those able to pay several hundred thousand euros. Selling citizenship is a contentious idea that disrupts some of our basic notions about what it means to belong to a national community. Mediterranean states support it partly as a way to raise revenues after the global financial crisis, which brought budget slashing and pushed unemployment over 20 percent in many countries.
More here.
Posted by Robin Varghese at 12:58 PM | Permalink
| Comments
How Einstein and Schrödinger Conspired to Kill a Cat
David Kaiser in Nautilus:
Of all the bizarre facets of quantum theory, few seem stranger than those captured by Erwin Schrödinger’s famous fable about the cat that is neither alive nor dead. It describes a cat locked inside a windowless box, along with some radioactive material. If the radioactive material happens to decay, then a device releases a hammer, which smashes a vial of poison, which kills the cat. If no radioactivity is detected, the cat lives. Schrödinger dreamt up this gruesome scenario to mock what he considered a ludicrous feature of quantum theory. According to proponents of the theory, before anyone opened the box to check on the cat, the cat was neither alive nor dead; it existed in a strange, quintessentially quantum state of alive-and-dead.
Today, in our LOLcats-saturated world, Schrödinger’s strange little tale is often played for laughs, with a tone more zany than somber. It has also become the standard bearer for a host of quandaries in philosophy and physics. In Schrödinger’s own time, Niels Bohr and Werner Heisenberg proclaimed that hybrid states like the one the cat was supposed to be in were a fundamental feature of nature. Others, like Einstein, insisted that nature must choose: alive or dead, but not both.
Although Schrödinger’s cat flourishes as a meme to this day, discussions tend to overlook one key dimension of the fable: the environment in which Schrödinger conceived it in the first place. It’s no coincidence that, in the face of a looming World War, genocide, and the dismantling of German intellectual life, Schrödinger’s thoughts turned to poison, death, and destruction. Schrödinger’s cat, then, should remind us of more than the beguiling strangeness of quantum mechanics. It also reminds us that scientists are, like the rest of us, humans who feel—and fear.
More here.
Posted by Robin Varghese at 12:56 PM | Permalink
| Comments
‘Future Sex’: Exploring the Illusion of Choice After Tinder and Monogamy
Liza Batkin in Broadly:
In All the Single Ladies, her recent book about the growing population of single women in America, Rebecca Traister relates her experience of going off to college knowing that, "by most accounts, marriage was coming to swallow [her] up in just a few short years," but simultaneously feeling that nothing was less likely. A gap, resulting from a sizable sociological shift, had yawned between the expectations of her parents' generation and her own. The median age of first marriage—which hovered between 20 and 22 years old during the 20th century—today is approximately 27, and whereas 60 percent of Americans between the ages of 18 and 29 were married in 1960, the percentage now falls around 20. Today it is more common to be unmarried than married in your 20s, and Traister concludes from this that young women will "no longer have to wonder," as she did when she graduated high school, "what unmarried adult life for women might look like, surrounded as we are by examples of this kind of existence."
But figuring out "what unmarried adult life for women might look like" still seems to require a good deal of wondering. In Spinster, published last year, Kate Bolick recounts her realization at the age of 23—which stands out for her as the age at which Sylvia Plath married Ted Hughes—that "marriage was the last thing on [her] mind." With a husband far from her vision of her future, Bolick experienced a "failure of imagination." "How do you embark on your adulthood," she asks, "when you don't know where you're headed?" In Labor of Love, another recent book that examines modes of dating as they reflect and are produced by historical economic conditions, Moira Weigel describes being broken up with by a boyfriend and finding herself asking him what she should want.
"Why was I always asking some man?" she wonders. When she realizes that she "had learned to do it by dating," she sets out to understand why she "was struggling to follow desires that did not seem to be [her] own."
In the introduction to Future Sex, another hyped nonfiction book about modern relationships, out from FSG this week, Emily Witt narrates her own moment of reckoning with a failure of imagination. It arrives after she sleeps with a man who is seeing another woman; she is chastised for "pantomiming thrills" and fears that she may have contracted chlamydia. Researching methods for preventing STDs, Witt finds that the CDC recommends being in "a long-term mutually monogamous relationship with a partner who has been tested and is known to be uninfected."
More here.
Posted by Robin Varghese at 12:54 PM | Permalink
| Comments
Death, Afterlife, Justice and Value
Richard Marshall interviews Samuel Scheffler in 3:AM Magazine:
3:AM: A criticism of some moral philosophy – and perhaps of the position that you’ve just been discussing where the scope is about small-scale personal relationships and avoiding harm – is that it doesn’t accommodate big-scale issues like justice. These are deeply felt values, so how do you propose we accommodate them within your non-consequentialist ethical position?
SS: Your question seems to suggest that the issue of how to accommodate justice within one’s overall moral outlook is a problem for non-consequentialists alone. And in a way that’s right, but only because justice is not a concept that plays a fundamental role in consequentialist thought at all. We can, if we like, treat utilitarianism (for example) as a candidate theory of justice, as Rawls did in A Theory of Justice, but this is in one respect misleading. Utilitarianism offers us a theory of right action, but it is not a theory that even mentions, let alone uses, the concept of justice. At no point in their theory do utilitarians rely on an independent notion of justice or fairness. They are concerned solely with the maximization of value. Non-consequentialists are the only people who treat justice as a fundamental moral concept.
Since justice is a fundamental moral concept, the question should be: how do we (any of us) accommodate ideas of justice, and especially ideas about the justice of basic social, political, and economic institutions, within an overall outlook that is also sensitive to a variety of other moral values and principles, including values and principles that apply to small-scale personal relationships? That is a pressing and difficult question. One of the attractions of Rawls’s theory is that it suggests a kind of division-of-labor answer to the question. The idea is that there are sui generis principles of justice that apply to the basic institutional structure of society. If a society’s basic structure satisfies those principles, then individuals in the society may appropriately and without qualms be guided by the many different values and principles that apply to them, including principles governing the conduct of their personal relationships. Of course, individuals have duties to support and sustain just institutions, according to this view, but they have duties of other kinds as well.
More here.
Posted by Robin Varghese at 12:50 PM | Permalink
| Comments
Bookish Fools
Frank Furedi in Aeon:
It is Saturday, 1 November 2014. I am book-browsing at Barnes and Noble on Fifth Avenue in New York City when my attention is caught by a collection of beautifully produced volumes. I look closer and realise that these books are part of what’s called the Leatherbound Classic series. An assistant informs me that these fine specimens help to ‘embellish your book collection’. Since this exchange, I am reminded time and again that, as symbols of cultural refinement, books really matter. And, though we are meant to be living in a digital age, the symbolic significance of the book continues to enjoy cultural valuation. That is why, often when I do a television interview at home or in my university office, I am asked to stand in front of my bookshelf and pretend to be reading one of the texts.
Since the invention of the cuneiform system of writing in Mesopotamia around 3500 BCE and of hieroglyphics in Egypt around 3150 BCE, the serious reader of texts has enjoyed cultural acclamation. The clay tablets on which marks and signs were inscribed were regarded as precious and sometimes sacred artefacts. The ability to decipher and interpret the symbols and signs was seen as an extraordinary accomplishment. Egyptian hieroglyphics were thought to possess magical powers and, to this day, many readers regard books as a medium for gaining a spiritual experience. Since text possesses so much symbolic significance, how people read and what they read is widely perceived as an important feature of their identity. Reading has always been a marker of character, which is why people throughout history have invested considerable cultural and emotional resources in cultivating identities as lovers of books.
In ancient Mesopotamia, where only a small group of scribes could decipher the cuneiform tablets, the interpreter of signs enjoyed tremendous prestige. It is at this point in time that we have one of the earliest hints of the symbolic power and privilege enjoyed by the reader.
More here.
Posted by Robin Varghese at 12:48 PM | Permalink
| Comments
This is the best explanation of gerrymandering you will ever see
Christopher Ingraham in the Washington Post:
Gerrymandering -- drawing political boundaries to give your party a numeric advantage over an opposing party -- is a difficult process to explain. If you find the notion confusing, check out the chart above -- adapted from one posted to Reddit this weekend -- and wonder no more.
Suppose we have a very tiny state of 50 people. Thirty of them belong to the Blue Party, and 20 belong to the Red Party. And just our luck, they all live in a nice even grid with the Blues on one side of the state and the Reds on the other.
Now, let's say we need to divide this state into five districts. Each district will send one representative to the House to represent the people. Ideally, we want the representation to be proportional: if 60 percent of our residents are Blue and 40 percent are Red, those five seats should be divvied up the same way.
Fortunately, because our citizens live in a neatly ordered grid, it's easy to draw five lengthy districts -- two for the Reds, and three for the Blues. Voila! Perfectly proportional representation, just as the Founders intended. That's grid 1 above, "perfect representation."
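The arithmetic in the excerpt can be sketched in a few lines of code. This is a hypothetical reconstruction of the grid described above (assumed here to be 10 columns by 5 rows, with the 30 Blue voters in the left six columns), not the actual Reddit chart: it shows how the same 30–20 electorate produces proportional seats when district lines run along the party boundary, and a Blue sweep when every district cuts across it.

```python
# Hypothetical grid: 50 voters, 30 Blue ('B') in the left six columns,
# 20 Red ('R') in the right four. Five districts of 10 voters each.
COLS, ROWS = 10, 5
grid = [['B' if col < 6 else 'R' for col in range(COLS)] for _ in range(ROWS)]

def winner(cells):
    """Majority party within one district (a list of (row, col) cells)."""
    blues = sum(1 for r, c in cells if grid[r][c] == 'B')
    return 'B' if blues > len(cells) - blues else 'R'

# Plan 1: five vertical districts, each two adjacent columns wide
# (district lines run along the Blue/Red boundary).
vertical = [[(r, c) for r in range(ROWS) for c in (2 * d, 2 * d + 1)]
            for d in range(5)]

# Plan 2: five horizontal districts, one full row each
# (every district crosses the boundary and contains 6 Blues, 4 Reds).
horizontal = [[(d, c) for c in range(COLS)] for d in range(5)]

for name, plan in [('vertical', vertical), ('horizontal', horizontal)]:
    seats = [winner(district) for district in plan]
    print(name, seats.count('B'), 'Blue seats,', seats.count('R'), 'Red')
# vertical   -> 3 Blue seats, 2 Red (proportional, like grid 1 above)
# horizontal -> 5 Blue seats, 0 Red (same voters, Blue sweeps every seat)
```

The same voters, cut differently, yield anywhere from a proportional split to a sweep; real-world packing and cracking exploits exactly this freedom in where the lines go.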
More here.
Posted by S. Abbas Raza at 12:34 PM | Permalink
| Comments
Sean Carroll: How Entropy Powers The Earth
Video length: 3:15
Posted by S. Abbas Raza at 12:29 PM | Permalink
| Comments
The scientists who make apps addictive
Ian Leslie in The Economist:
In 1930, a psychologist at Harvard University called B.F. Skinner made a box and placed a hungry rat inside it. The box had a lever on one side. As the rat moved about it would accidentally knock the lever and, when it did so, a food pellet would drop into the box. After a rat had been put in the box a few times, it learned to go straight to the lever and press it: the reward reinforced the behaviour. Skinner proposed that the same principle applied to any “operant”, rat or man. He called his device the “operant conditioning chamber”. It became known as the Skinner box.

Skinner was the most prominent exponent of a school of psychology called behaviourism, the premise of which was that human behaviour is best understood as a function of incentives and rewards. Let’s not get distracted by the nebulous and impossible to observe stuff of thoughts and feelings, said the behaviourists, but focus simply on how the operant’s environment shapes what it does. Understand the box and you understand the behaviour. Design the right box and you can control behaviour.

Skinner turned out to be the last of the pure behaviourists. From the late 1950s onwards, a new generation of scholars redirected the field of psychology back towards internal mental processes, like memory and emotion. But behaviourism never went away completely, and in recent years it has re-emerged in a new form, as an applied discipline deployed by businesses and governments to influence the choices you make every day: what you buy, who you talk to, what you do at work. Its practitioners are particularly interested in how the digital interface – the box in which we spend most of our time today – can shape human decisions. The name of this young discipline is “behaviour design”. Its founding father is B.J. Fogg.
...In a phone conversation prior to the workshop, Fogg told me that he read the classics in the course of a master’s degree in the humanities. He never found much in Plato, but strongly identified with Aristotle’s drive to organise and catalogue the world, to see systems and patterns behind the confusion of phenomena. He says that when he read Aristotle’s “Rhetoric”, a treatise on the art of persuasion, “It just struck me, oh my gosh, this stuff is going to be rolled out in tech one day!”
More here.
Posted by Azra Raza at 09:23 AM | Permalink
| Comments
Program good ethics into artificial intelligence
Jim Davies in Nature:
Some researchers argue that consciousness is an important part of human cognition (although they don’t agree on what its functions are), and some counter that it serves no function at all. But even if consciousness is vitally important for human intelligence, it is unclear whether it’s also important for any conceivable intelligence, such as one programmed into computers. We just don’t know enough about the role of consciousness — be it in humans, animals or software — to know whether it’s necessary for complex thought.

It might be that consciousness, or our perception of it, would naturally come with superintelligence. That is, the way we would judge something as conscious or not would be based on our interactions with it. A superintelligent AI would be able to talk to us, create computer-generated faces that react with emotional expressions just like somebody you’re talking to on Skype, and so on. It could easily have all of the outward signs of consciousness. It might also be that development of a general AI would be impossible without consciousness. (It’s worth noting that a conscious superintelligent AI might actually be less dangerous than a non-conscious one, because, at least in humans, one process that puts the brakes on immoral behaviour is ‘affective empathy’: the emotional contagion that makes a person feel what they perceive another to be feeling. Maybe conscious AIs would care about us more than unconscious ones would.)
Either way, we must remember that AI could be smart enough to pose a real threat even without consciousness. Our world already has plenty of examples of dangerous processes that are completely unconscious. Viruses do not have any consciousness, nor do they have intelligence. And some would argue that they aren’t even alive.

In his book Superintelligence (Oxford University Press, 2014), the Oxford researcher Nick Bostrom describes many examples of how an AI could be dangerous. One is an AI whose main ambition is to create more and more paper clips. With advanced intelligence and no other values, it might proceed to seek control of world resources in pursuit of this goal, and humanity be damned. Another scenario is an AI asked to calculate the infinite digits of pi that uses up all of Earth’s matter as computing resources. Perhaps an AI built with more laudable goals, such as decreasing suffering, would try to eliminate humanity for the good of the rest of life on Earth. These hypothetical runaway processes are dangerous not because they are conscious, but because they are built without subtle and complex ethics.
More here.
Posted by Azra Raza at 07:22 AM | Permalink
| Comments
Thursday, October 20, 2016
The white flight of Derek Black
Eli Saslow in the Washington Post:
Their public conference had been interrupted by a demonstration march and a bomb threat, so the white nationalists decided to meet secretly instead. They slipped past police officers and protesters into a hotel in downtown Memphis. The country had elected its first black president just a few days earlier, and now in November 2008, dozens of the world’s most prominent racists wanted to strategize for the years ahead.
“The fight to restore White America begins now,” their agenda read.
The room was filled in part by former heads of the Ku Klux Klan and prominent neo-Nazis, but one of the keynote speeches had been reserved for a Florida community college student who had just turned 19. Derek Black was already hosting his own radio show. He had launched a white nationalist website for children and won a local political election in Florida. “The leading light of our movement,” was how the conference organizer introduced him, and then Derek stepped to the lectern.
“The way ahead is through politics,” he said. “We can infiltrate. We can take the country back.”
Years before Donald Trump launched a presidential campaign based in part on the politics of race and division, a group of avowed white nationalists was working to make his rise possible by pushing its ideology from the radical fringes ever closer to the far conservative right. Many attendees in Memphis had transformed over their careers from Klansmen to white supremacists to self-described “racial realists,” and Derek Black represented another step in that evolution.
He never used racial slurs. He didn’t advocate violence or lawbreaking. He had won a Republican committee seat in Palm Beach County, Fla., where Trump also had a home, without ever mentioning white nationalism, talking instead about the ravages of political correctness, affirmative action and unchecked Hispanic immigration.
More here.
Posted by S. Abbas Raza at 02:20 PM | Permalink
| Comments
Bill Gates: Mapping the End of Malaria
Bill Gates in his own blog:
A few years ago, I pulled off a purposeful prank. While I was giving a TED Talk on malaria to a room full of influential people, I opened a canister and let loose a small swarm of mosquitoes. “There’s no reason that only poor people should have the experience,” I said. I let the audience squirm in their seats for about half a minute before I let on that the mosquitoes were not infected with malaria. My gimmick worked. A distant problem suddenly got very close to home.
Today, gimmicks are no longer necessary for convincing Americans of the danger of mosquito-borne diseases. The spread of Zika virus in south Florida, Puerto Rico, and other parts of the U.S. has given millions of Americans a direct understanding of what it’s like to live with the fear of mosquitoes and the harm they can do, especially to pregnant women and children.
The world must focus serious attention and resources on ending the Zika epidemic. At the same time, we should keep in mind that the overwhelming toll of mosquito-related illness and death comes from malaria. Malaria is the key reason mosquitoes are the deadliest animal in the world.
More here.
Posted by S. Abbas Raza at 02:13 PM | Permalink
| Comments
From algorithms to aliens, could humans ever understand minds that are radically unlike our own?
Murray Shanahan in Aeon:
In 1984, the philosopher Aaron Sloman invited scholars to describe ‘the space of possible minds’. Sloman’s phrase alludes to the fact that human minds, in all their variety, are not the only sorts of minds. There are, for example, the minds of other animals, such as chimpanzees, crows and octopuses. But the space of possibilities must also include the minds of life-forms that have evolved elsewhere in the Universe, minds that could be very different from any product of terrestrial biology. The map of possibilities includes such theoretical creatures even if we are alone in the Cosmos, just as it also includes life-forms that could have evolved on Earth under different conditions.
We must also consider the possibility of artificial intelligence (AI). Let’s say that intelligence ‘measures an agent’s general ability to achieve goals in a wide range of environments’, following the definition adopted by the computer scientists Shane Legg and Marcus Hutter. By this definition, no artefact exists today that has anything approaching human-level intelligence. While there are computer programs that can out-perform humans in highly demanding yet specialised intellectual domains, such as playing the game of Go, no computer or robot today can match the generality of human intelligence.
But it is artefacts possessing general intelligence – whether rat-level, human-level or beyond – that we are most interested in, because they are candidates for membership of the space of possible minds. Indeed, because the potential for variation in such artefacts far outstrips the potential for variation in naturally evolved intelligence, the non-natural variants might occupy the majority of that space. Some of these artefacts are likely to be very strange, examples of what we might call ‘conscious exotica’.
In what follows I attempt to meet Sloman’s challenge by describing the structure of the space of possible minds, in two dimensions: the capacity for consciousness and the human-likeness of behaviour.
More here.
Posted by S. Abbas Raza at 01:58 PM | Permalink
| Comments
Noam Chomsky: After the Electoral Extravaganza
This talk was presented at Harvard-Epworth Church, Cambridge, MA on May 12, 2016. Video length: 1:26:30
Posted by S. Abbas Raza at 01:44 PM | Permalink
| Comments
Newly revealed letters from Heidegger confirm his Nazism
Luisa Zielinski at The Paris Review:
Martin Heidegger never apologized for his support of the Nazis. He joined the party in 1933 and remained a member until the bitter end, in 1945. First, he spoke out enthusiastically in favor of a conservative revolution with Hitler at its helm. From about 1935, he found his own ambitions disappointed, and grew more silent. Yet, when he called his dalliance with National Socialism his greatest mistake after the war, he was upset not at his crime, but at the fact that he got caught.
Not that Heidegger has had to apologize, either. For the past seventy years, his many apologists and acolytes have gone to astounding lengths in trying to prove that his philosophical oeuvre exists independent of what was, they avowed, a mere weakness of character, an instance of momentary opportunism. In 2014, a group of French philosophers even tried to halt the publication of Heidegger’s Black Notebooks, his philosophical diaries. But if antisemitic references in his philosophy are oblique and, as some would have it, coincidental to his critique of modernity, the Notebooks leave little room for such charitable reading. Even after the war he would bemoan the Jewish “drive for revenge,” with their aim consisting in “obliterating the Germans in spirit and history.”
more here.
Posted by Morgan Meis at 09:22 AM | Permalink
| Comments
Late Night Thoughts on Nuclear Weapons
Jerry Delaney at The American Scholar:
In his book Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, Eric Schlosser reveals that worst-case scenarios have come harrowingly close to coming true on a number of occasions—yet the American public has never been adequately informed.
So the question that continues to haunt me is, Why would a generation of presidents, supported by responsible men like William Perry, engage in a nuclear poker game that no sane gambler would in good conscience play? Why on earth wouldn’t both sides calculate the worst-case scenario and elect not to play the game?
On some nights during the Cold War, I lay awake turning over that question. The only plausible answer I was able to imagine is that they, the two governments, couldn’t help it. They had no choice, or thought they had no choice: the nuclear genie was out of the bottle and both sides seized on deterrence as an existential necessity. But was it?
more here.