Science and technology

Babbage

  • Government archives

    Scan and deliver

    Jan 3rd 2012, 19:13 by G.F. | SEATTLE

    CARL MALAMUD never thinks small and never shies away from a fight. The internet's co-archivist, who shares that unofficial title with Brewster Kahle of The Internet Archive, has spent most of the past two decades cajoling, hectoring and teasing local and federal government entities in the United States to unlock the material they produce. Mr Malamud believes releasing such information spurs innovation by allowing private and non-profit firms to compete by coming up with better methods to present and analyse data.

    In his latest effort, co-sponsored by the Center for American Progress, a think-tank in Washington, DC, he is prodding Barack Obama's administration to set comprehensive and coherent policy for digitising government information locked away in analogue form—to release it without restrictions. He thinks stumping up $250m a year to the venture is a good place to start.

    Mr Malamud's non-profit Public.Resource.org has targeted the American government in particular because the United States forswears copyright protection, within its borders, for work created by public employees. Some works by government contractors, or those donated or assigned to the government, may retain copyright. But the vast bulk of creations that would otherwise be protected under current terms is freely available—if one can get one's hands on it.

    The government's multiple troves of resources, which are in theory in the public domain, are often hard to access and sometimes made available only with restrictions. (Those at the Smithsonian Institution, for example, were the subject of a previous battle.) Information available solely in analogue formats, like paper or microfilm, may seem too abstruse to arouse anyone besides historians or academics. Yet perhaps it should, for it often underpins government actions which were set in motion decades ago but remain relevant today.

    Like many archivists, Mr Malamud frets that analogue records are physically disintegrating, in part, he says, because those who created them in the first place did such a poor job. (The latest effort does not concern the accessibility of previously digitised government documents or those originally created in digital form, which Mr Malamud and others address in other projects.)

  • Babbage awards

    Wackier and wackier

    Jan 2nd 2012, 10:25 by J.P.

    THE year 2011 abounded in momentous scientific achievements, from hopeful signs in the fight against AIDS (which we put on the cover) to progress in the search for physicists' most elusive quarry, the Higgs boson. Then there was the bombshell from Italy, where an experiment hidden beneath the mountain of Gran Sasso clocked neutrinos travelling faster than the speed of light, flying in the face of one of modern science's most cherished assumptions, Albert Einstein's theory of relativity. The odds are still that a mistake has crept in somewhere. But if it hasn't, and if the ethereal neutrinos really do flout the supposed cosmic speed limit, then that would no doubt rank as the biggest news of the year, if not the past hundred years (as we explain in a leader).

    However, we are also committed to keeping abreast of developments in science and technology's seemingly less sober quarters. And, just as last year, we have come up with a shortlist of the most bizarre bits of boffinry and the wackiest widgets to have featured in the pages of The Economist in the preceding twelve months. The contest was fierce, but after much deliberation Babbage managed to winnow the field down to twelve finalists. Here they are, in no particular order.

     

    Bizarre boffinry

    Coming to a head: Mathematicians invent a new way to pour stout

    Please be seated: An astrophysicist comes up with a faster way to board planes

    Facing the truth: Why a man's face can lie but still produce orgasms

    How much is too much?: Why some duck livers are delicious, and others nasty

    Physical implausibility: A mathematical expression to quantify ballooning bosoms and winnowed waists

    Talking to the neighbours: A modest proposal for a neutrino-based interstellar communications network

     

    Wackiest widget

    Liquid radio: America's navy is developing an antenna made of seawater

    Wholly shit: An Indian company makes paper out of elephant dung

    Bottom feeders: Growing edible oyster mushrooms on (used) disposable nappies

    A healthy glow: A laser is created from a biological cell

    Invitation to the dance: Robot ballerinas take to the air

    Put that in your pipe and poke it: Rediscovering the extinct pneumatic pipe for goods distribution

     

    After an arduous, and wholly unscientific, evaluation procedure Babbage decided to award the bizarre-boffinry prize to a crack group of mathematicians from the University of Limerick, led by William Lee, who meticulously modelled bubble formation in stout beers. Their work suggests that lining the rims of cans and bottles with a material similar to an ordinary coffee filter would ensure the creamiest of heads, without the need for a fiddly beer widget. A well-deserved victory for maths, then.

    Speaking of widgets, the prize for 2011's wackiest goes to Seok-Hyun Yun, of Harvard Medical School, and his team for the creation of laser-emitting cells, something that does not, as Babbage's colleague noted, seem to have intuitively obvious applications but certainly scores well in the jaw-dropping department. 

    Of course, readers are invited to pick their own favourites and let us know what they think in the comments section. Also, as ever, it is not our intention to disparage odd research and weird gubbins. Far from it. Progress often works in mysterious ways, arriving initially in guises that may, at first blush, seem frivolous. It is impossible to foretell whether any of the above achievements will turn out to be world-changing. Even if none does, though, they are testament to human curiosity—and rollicking fun to boot.

  • Digital photography

    Difference Engine: Point, shoot, discard

    Dec 31st 2011, 18:21 by N.V. | LOS ANGELES

    ON RARE and very special occasions, your correspondent digs out his ancient 120-format camera and loads a roll of colour-reversal film from a precious supply of Fujichrome Astia Professional he keeps stored in the fridge. He shoots off a dozen scenes painstakingly composed on the camera’s large ground-glass screen, carefully extracts the film-carrier and removes the exposed spool, wraps it in kitchen foil, and takes it to a lab across town that still knows how to process roll-film properly. Despite the palaver, the silver halide communion is hugely satisfying. Even after all the years of use, viewing the 120's big square transparencies under a loupe on a lightbox can still take the breath away.

    In so far as it is possible to compare two entirely different ways of capturing photons, the Hasselblad's 6cm square frame is equivalent to a 70 megapixel digital sensor. The attraction of such a format is that it provides a large enough transparency for art editors to select the crop they like best, while still offering more than enough resolution (when the cropped area is enlarged) to cover losses incurred during plate-making and printing.

    Creating a full-page bleed for a colour magazine using even 35mm film is far more challenging. Trying to do so with a compact digital camera or smartphone is out of the question. But, then, the vast majority of digital cameras have light-gathering sensors the size of tiny toe-nails—and are used largely for uploading images to Flickr or Facebook, or for making 3.5 inch by 5 inch (8.9cm by 12.7cm) prints for family albums.

    That is not to say film beats digital any day. Not having to pay for and reload a fresh film every couple of dozen shots encourages digital photographers to experiment more. And there is the convenience of being able to see the result immediately, which allows users to delete inferior images and, if necessary, shoot additional ones. Meanwhile, the past decade has seen the light-processing power of silicon sensors become truly awesome.

    Depending on the lens and the film speed, a frame of 35mm film has the digital equivalent of between 15 and 20 megapixels. The “full-frame” sensors (with the same 36mm by 24mm format of 35mm film) in digital single-lens reflex (D-SLR) cameras used by professionals can more than match that today. The Canon EOS 5D Mark II, for instance, uses a 21 megapixel sensor.

    With a decent lens, even the cheaper D-SLRs produced for the “prosumer” market can come close. Thanks mainly to their smaller sensors (typically 22mm by 15mm), these popular devices tend to be lighter, more compact and less than a third the price of full-frame models. Their sensors are based on the old APS film format, which promised to revolutionise photography but failed miserably. The cameras were barely any smaller than 35mm ones, yet had only 40% of the frame size.

    So, how come digital cameras that use so-called APS-C sensors, with less than half the sensor area of full-frame cameras, perform as well as they do? Even “micro four-thirds” cameras, with sensors less than a third the size of a full-frame’s chip, seem more than passable. The answer is that while professional photographers using full-frame D-SLRs may blow up their images to poster size, the majority of camera users rarely make prints larger than 8 inches by 10 inches. Under normal lighting conditions, practically any 12 megapixel D-SLR will suffice.
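
    For readers who want to check such comparisons, the arithmetic is easy to reproduce. Below is a minimal sketch (not from the article; the sensor dimensions are nominal figures rather than the exact specification of any particular camera) that works out each format's area relative to full frame.

    # A minimal sketch comparing nominal sensor sizes; dimensions in millimetres.
    formats = {
        "120 roll film (6x6 frame)": (56.0, 56.0),
        "Full frame (35mm)":         (36.0, 24.0),
        "APS-C":                     (22.0, 15.0),
        "Micro four-thirds":         (17.3, 13.0),
    }

    full_frame_area = 36.0 * 24.0  # mm^2

    for name, (width, height) in formats.items():
        area = width * height
        print(f"{name:26s} {area:7.0f} mm^2  ({area / full_frame_area:.2f}x full frame)")

    Run as written, it puts APS-C at roughly 0.38 times the full-frame area and micro four-thirds at roughly 0.26 times, in line with the fractions quoted above.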

  • Online fumbles

    Measure seven times, send once

    Dec 31st 2011, 10:04 by G.F. | SEATTLE

    A RUSSIAN friend of Babbage's once laughed after hearing the hoary epigram, "Measure twice, cut once." "In Russian," he said, "we say, 'measure seven times, cut once'." That may reveal much about Moscow, but the Russian turn of phrase should have been observed by the New York Times when it sent 8.7m e-mail missives instead of an intended 300.

    On the morning of December 27th, Babbage received a message from the Times about a recent cancellation of his home delivery, offering a 50% deal for re-upping. Since he had not received the paper on his Seattle doorstep for several years, the message seemed odd. On Twitter, however, he found that nearly all of his acquaintances and colleagues had received similar offers, including both those who had active dead-tree subscriptions and those who once did.

    Early speculation was that a sophisticated spammer had sent out a huge number of messages, some snaring legitimate current and former print subscribers, and that this was a test message to soften up readers for future phishing expeditions. The Times's official Twitter account initially confirmed this, stating that the message was spam. Given the number of data breaches that have occurred in recent years, recipients of the e-mail unsurprisingly flooded the paper's phone lines and websites (according to experiences related on Twitter and confirmed by the newspaper). A story on the Times website by a media reporter confirmed that the message was indeed spam.

    Or was it? One way to tell is to look at e-mail headers. These are received by all e-mail programs, and contain explicit fingerprints of how a message travelled from origin to destination, recording every server through which it passed. Such headers are hidden from uninitiated mortals, but can be viewed through the commands most mail clients and webmail services provide for showing a message's full text or raw source.
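
    For the curious, here is a minimal sketch of that sort of detective work, assuming the suspect message has been saved as a raw file (the file name below is invented for illustration); it uses Python's standard library rather than any particular mail client.

    # Parse a saved raw message and retrace its route via "Received" headers.
    from email import policy
    from email.parser import BytesParser

    with open("nyt_offer.eml", "rb") as f:          # hypothetical file name
        msg = BytesParser(policy=policy.default).parse(f)

    # Each server that handles a message stacks a "Received" header on top;
    # reading them bottom-up retraces the route from origin to inbox.
    for hop in reversed(msg.get_all("Received", [])):
        print(hop.split(";")[0])

    # If present, this header records the receiving server's checks on the
    # sender's domain.
    print(msg.get("Authentication-Results", "no authentication header present"))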

    The putative spam, it turns out, clearly originated from an outside marketing firm, Epsilon Interactive, that the New York Times employs to handle its e-mail. The message was sent first to a mail server at bfi0.com, a domain owned by Epsilon Interactive (formerly Bigfoot Interactive, hence bfi0), and from there to gridserver.com, a distributed-computing hosting system operated by Media Temple. (Strangely, two separate checks designed to assure that an e-mail was sent via legitimate e-mail servers for the domain in question both failed.)

    Further, several of Babbage's friends had received the message at e-mail addresses dedicated to e-mail from the New York Times, and used for no other purpose. That may seem extreme, but many e-mail hosts make it a doddle to set up any number of addresses that drop into the same inbox while giving each sender a distinct address for filtering.

    Within an hour of the first Times story being posted, it was replaced by one offering a more plausible explanation. The outside marketing firm was not to blame. Rather, a Times employee had pushed the button with the wrong settings in place.

    Most people have experienced the shame of choosing "Reply All" instead of "Reply". Your correspondent has, at times, flipped the wrong switch and sent many thousands of people the same message repeatedly. As embarrassing as it is, however, the scale of this error isn't perhaps quite up there with the mistake that caused a space probe to miss the planet Mars because of boffins' failure to agree on whether the sums were done in imperial or metric units. Even so, one imagines that the Times employee's red face may well have been visible from orbit. Perhaps he or she should learn Russian.

  • Social media in the 16th Century

    How Luther went viral

    Dec 29th 2011, 10:34 by The Economist online

    FIVE centuries before Facebook and the Arab spring, social media helped bring about the Reformation

    Read on: How Luther went viral, from our Christmas edition

  • Babbage: December 28th 2011

    Room for anyone else?

    Dec 27th 2011, 20:25 by The Economist online

    PREVIEWING 2012 with a look at the tech world's four largest companies, the smartphone and tablet-computer market and futuristic technologies in the coming year

  • Avionics

    Mythology at 10,000 metres

    Dec 27th 2011, 19:50 by G.F. | SEATTLE

    EVERY airline flight you are on has at least a handful of mobiles, laptops and other electronic kit left in a standby mode or actively on, rather than shut off as aviation regulators and airlines demand. Every flight, in other words, tests the proposition that hardware carried on board by passengers disrupts the aircraft or confuses the crew with false readings from cockpit instruments. And yet airplane electronics, or avionics to use the technical term, do not routinely squawk or fail. 

    Your correspondent has not himself performed a controlled experiment to confirm his hunch. Instead, he derives the conclusion from two factors. First, as readers certainly know from their own experience and observation, mobiles and laptops are often put into sleep mode, rather than fully powered down. While most mobile operating systems now have an easy-to-access "airplane mode" in which all of a device's radio circuitry is turned off, not all users remember to switch it on before take-off. Many simply press the "power" button, which puts the device to sleep. Computer owners often just shut the lid, which has a similar effect.

    In sleep and standby modes, modern electronics go on chirping wirelessly to sort out an available signal. Newer laptops try to find an active Wi-Fi network, while mobiles boost their power to maximum in the hopes of finding a mast. Other personal electronic devices, or PEDs as the airline industry calls them, emit a range of signals that are inevitable byproducts of functioning electronics. (The Federal Aviation Administration, which regulates all matters aeronautic in America, has issued a list of devices that may be used on planes, though airlines may impose further restrictions; the Federal Communications Commission, meanwhile, bars all use of 800 MHz-band mobiles, which sweeps in nearly all modern phones.)

    The second factor which led Babbage to his conclusion is an interpolation from a widely cited report published in 2006 in IEEE Spectrum, a magazine produced by the Institute of Electrical and Electronics Engineers (IEEE), a trade body which also sets technology standards. Researchers, with the FAA's and airlines' blessing, conducted extensive measurements of in-flight signal activity on 37 commercial flights in 2003. (The other passengers were unaware of the experiment.)

    The study found that, on average, mobile phones were used at least once per flight, contrary to FCC and FAA regulations. They were sometimes even used during the critical flight phases of take-off and landing. The IEEE article concluded that the potential for interference with satellite-navigation (Sat-Nav) systems used in cockpits to assist with take-offs and landings in particular was a concern. Yet this was not based on data the article's authors collected themselves. Instead, they culled data from an ongoing NASA project in which the space agency collects reports from pilots about any flight anomalies. The IEEE article's authors found a few dozen examples over a decade ending in 2001, and drew their conclusions from this sketchy, anecdotal and non-rigorous source.

  • Spaceflight

    Thinking big in space

    Dec 27th 2011, 4:25 by N.L. | CHICAGO

    AS A small boy Paul Allen, the co-founder of Microsoft, dreamed of going into space. He even tried to launch the hollow aluminium arm of a chair, stuffed with propellant, into orbit. It didn't work out. But his latest adventure in space travel—a joint venture with Burt Rutan, a famous designer of aircraft—looks more promising. Earlier this month, the two of them said they would build an air-launched orbital delivery system. To do this, Paul Allen’s company Stratolaunch Systems will have to build the world’s largest aeroplane.

    The Stratolaunch, as the plane will be called, will be big. Really, really big. It will have six engines, a wingspan of 117 metres (385 feet) and weigh about 544 tonnes. (The wingspan of Boeing's 747 is around half that of the Stratolaunch.) Taking off will require 3.6km of runway, and the aircraft will launch its rocket—a shortened version of the Falcon 9 rocket, built by another private space firm called SpaceX—at around 9,100 metres. The whole contraption will be able to put about 6 tonnes of payload into low-earth orbit.

    The idea is to offer a cheaper way of getting medium-sized payloads into orbit, and the system is designed to fill a niche that Boeing's Delta 2 rocket once served. Mike Griffin, a former NASA administrator who now sits on the board of Stratolaunch, says that besides delivering cargo to the International Space Station, the Stratolaunch will tap a thriving market for launching small to middling communications satellites. There are also other potential customers in the form of NASA and the Department of Defense. Ultimately, however, Mr Allen wants to see the system launch humans into space.

    Of course, the obvious question is why not launch the rocket directly from the ground in the first place? It turns out that land-based rocket launches are greatly restricted by irritations such as where one’s rocket pad is, and what the weather is like. Air launch, by contrast, makes orbital access to space much more flexible, a particular bonus for military applications. There will also be a small efficiency gain from launching the rocket from above much of the Earth’s atmosphere. Mr Allen is being cautious about saying how much money he will put into the venture. All he will admit is that an effort of this size requires an "order of magnitude" more money than he invested in a previous collaboration with Mr Rutan, SpaceShipOne. That project cost Mr Allen $25m.

    Meanwhile, Mr Rutan’s company, Scaled Composites in Mojave, will be doing what it is best at: scaling composites. It will be super-sizing its existing White Knight aeroplane, which can carry craft such as SpaceShipOne on suborbital flights. Other components for the Stratolaunch will be scavenged from second-hand 747s. Mr Rutan plans to start work as soon as he has a hangar large enough to build the giant airframe. The current schedule foresees test flights in 2015, and an initial launch by 2016. But the spaceflight business is hard and unforgiving, and the schedule is likely to slip. Mr Rutan has come a long way since he built his first plane, the dinky two-seater VariViggen, in 1972. With its 6-metre wingspan, 20 of them could fit along the Stratolaunch.
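
    Those comparisons are easy to check on the back of an envelope. The sketch below uses the spans quoted above plus an assumed 64.4-metre span for a Boeing 747-400; it returns roughly 55% and 19.5, which squares with "around half" and "fit 20 along".

    # A back-of-the-envelope check of the wingspan comparisons in the text.
    stratolaunch_span = 117.0   # metres, from the article
    boeing_747_span = 64.4      # metres, assuming a 747-400
    variviggen_span = 6.0       # metres, from the article

    print(f"747 span as a share of the Stratolaunch's: {boeing_747_span / stratolaunch_span:.0%}")
    print(f"VariViggen wingspans along the Stratolaunch: {stratolaunch_span / variviggen_span:.1f}")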

  • Trade shows

    The show mustn't go on

    Dec 25th 2011, 16:00 by G.F. | SEATTLE

    THE bosses of technology firms once vied for the privilege of delivering a keynote address at the large trade shows to which electronics dealers and other professionals who rely on computer software and hardware in one way or another (like humble correspondents) flocked. Those days are gone. For many years now big firms have preferred to host their own conferences with tightly controlled messages and attendee lists, rather than partake in slightly more democratic events run by trade-show organisations. Microsoft has just announced that its chief will—after 2012—no longer deliver a keynote at the Consumer Electronics Show (CES), an event staged in Las Vegas every January.

    CES is run by the Consumer Electronics Association for the benefit of its 2,000 members, including Microsoft, who let retailers ogle their latest wares and hope to fill their order books. It is also a way to reach the eyes of the world with the latest sparkling objects right after Christmas, when last year's goodies are beginning to look dated.

    Microsoft's chief executive, first Bill Gates and later Steve Ballmer, has given a keynote at this sort of show for decades, starting in the 1980s with COMDEX, the Computer Dealers' Exhibition, a November fixture until its demise in 2003. This made sense when Microsoft was serenading middlemen, keen to lure them away from Netscape, Apple, Linux, Google or other rivals for showroom space.

    Microsoft has always wanted to flog a compelling vision of the future, even though it rarely lives up to it. Its Surface table-top interactive system and the Xbox gaming console may be the only two truly innovative products it has launched of late, and neither of them debuted at COMDEX or CES. The track record of turning keynote demonstrations into shipping products, meanwhile, is terrible. Mr Ballmer showed next-generation Windows tablets during keynotes in 2009 and 2010—nothing much came of it. In 2010 he demonstrated a tablet from HP which shipped in small quantities in late 2010, but HP put its marketing muscle behind a tablet that ran the webOS operating system instead of Microsoft's Windows (and was quickly cancelled in any case).

    Harry McCracken, former editor of PC World and currently the man behind Technologizer, a techie website, has been to COMDEX or CES every year for the last two decades. Reviewing Microsoft's history of keynotes, he notes that an awful lot of the things it unveiled at COMDEX and CES never amounted to much: Tablet PC, Windows Smart Displays, the Smart Watch or the amazingly short-lived Urge music service.

    Apple skipped off the trade-show treadmill after the January 2009 iteration of Macworld, an ancient event focused on consumers and video and graphics professionals staged by IDG World Expo. Apple had already put more effort into its own developer event, the Worldwide Developers Conference (WWDC), which takes place in early summer. Since 2009 the company's announcements have been made in accordance with its internal product cycle.

    Apple's withdrawal from Macworld has left the show's name looking like wishful thinking (it has even been rebranded "Macworld|iWorld" for 2012). Without Apple, the event has lost its global appeal for press coverage. Microsoft's impact on CES will be considerably less. The show occupies many football fields' worth of convention and hotel space, and the software giant will still send representatives to examine offerings, talk to media and discuss products, just not at a dedicated booth.

    This is increasingly true of other firms, too. Like Apple, Google also has large developer events, which may involve product news. Facebook hosts regular shindigs with media at its headquarters and elsewhere. (Smaller firms benefit from conventions, which still draw hacks looking for a story.) Many others are concluding that to stick to an external news schedule is not such a bright idea, after all. Deferring or pushing forward announcements for an event like CES may backfire if it means that the product launch misses its window or is premature. Microsoft would probably do better to hold its tongue and show something off when it is ready rather than be branded once again as all talk, no trousers.

  • Internet security

    A cyber-remedy for poison

    Dec 25th 2011, 9:38 by G.F. | SEATTLE

    WHY should you care about the domain name system (DNS)? This inherently dull bit of the internet's plumbing turns the names used to label a single server or a collection of servers (like www.economist.com, say) into machine-readable numeric addresses (like 64.14.173.20). The rub is that DNS can easily be "poisoned" so that legitimate, intelligible addresses redirect users to malicious numeric ones, leaving them none the wiser. This can happen within a coffeeshop network or affect an entire country's online operations. As a consequence, web surfers and e-mail readers may fall into the hands of criminals or prying government authorities who can grab passwords and intercept communications, transfer money or imprison dissidents.
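
    The name-to-number step is easy to observe for oneself. The minimal sketch below simply asks whatever resolver the operating system is configured to use for the addresses behind a name; a poisoned resolver would hand back different numbers, and nothing in the reply itself would betray the swap.

    # Ask the system's configured DNS resolver which numeric addresses a name maps to.
    import socket

    def addresses_for(name):
        # getaddrinfo consults the system resolver; keep the unique IP strings.
        infos = socket.getaddrinfo(name, 80, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in infos})

    print(addresses_for("www.economist.com"))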

    Now, though, OpenDNS, a firm that provides free and paid DNS-based services, has come up with a fix. It has released an early, working version of a tool that, in effect, packages a computer's request to translate a name into a number inside a secure wrapper on its way to and from the firm's own DNS servers. This prevents interception and tampering at the most likely weakest points along the way. The tool, called DNSCrypt, is ready for Macs; versions for other platforms are in the works. OpenDNS has also released the source code to be freely used, and hopes the protocol might be widely adopted, and perhaps even built into web browsers and other software.

    Normally poisoning DNS responses does not trigger alerts in a browser or other software. But it can be detected when a user wants to establish a secured, or "https", connection. Communication with such secured sites is protected by SSL/TLS certificates. Those certificates, vouched for by third-party certificate authorities (CAs) using a cryptographic signature, ought only to be in the hands of the verified owners of a domain's servers. When a computer on a network with a hacked DNS tries to establish a secure connection with a bogus server, and the impostor fails to serve up the right credential, alarms are sounded by the operating system or client software.
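
    In code, that alarm is simply a verification failure. A minimal sketch using Python's standard ssl module (not any particular browser's machinery) shows the check in action: with verification switched on, a server that cannot present a valid certificate for the requested name raises an error instead of returning quietly.

    # Open a TLS connection with certificate and hostname verification enabled.
    import socket
    import ssl

    def peer_certificate_subject(hostname, port=443):
        context = ssl.create_default_context()   # verifies the chain and the hostname
        with socket.create_connection((hostname, port), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                return tls.getpeercert()["subject"]

    print(peer_certificate_subject("www.economist.com"))
    # An impostor without a valid certificate triggers ssl.SSLCertVerificationError here.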

    The difficulty at present is that a few supposedly trusted CAs have been compromised: in April an affiliate of Comodo leaked a few certificates, and the Dutch authority DigiNotar let slip as many as 250. An illegitimately issued but valid certificate from a trusted CA combined with the ability to poison DNS allows online mischief makers to pose as secured websites. (Certificates can be revoked, but this typically takes time; following this year's incidents, nearly all operating-system and browser makers released updates with the subverted CAs' certificates flagged as invalid.)

    For a suborned certificate to work, a cracker has both to hold the legitimate-seeming certificate and to make sure that the domain named in it resolves to a malicious server. Imagine a postal carrier with a package that must be signed for being waylaid, told that the recipient has moved, and shown the recipient's signature on a letter as proof. The mailman blithely hands over the box at the wrong home.

    With DNSCrypt in place, however, that particular subterfuge falls flat. The software creates an encrypted tunnel between a user's computer and OpenDNS's servers through which all requests are sent. As a result, an impostor has no way of knowing which domain name a user is requesting a numeric equivalent of. A malicious network might try to block OpenDNS, but that would reveal its machinations. And OpenDNS has another clever trick up its sleeve: it can make its secure connection masquerade as a regular web-page request. (Securing DNS requests in this fashion pairs neatly with a separate effort to spruce up the web's certificate integrity, called notarisation.)

    OpenDNS's boss, David Ulevitch, says his firm decided to tackle the problem because it is in a strong position to help users and firms who want to take extra steps to ensure the integrity of their communications. With 30m users, mostly of its free look-up service, it can spread awareness—and plug its paid offerings.

  • Digital content

    Take it all off

    Dec 23rd 2011, 10:24 by G.F. | SEATTLE

    DIGITAL rights management (DRM), the practice of wrapping video in encryption that can be decoded only by licensed hardware or software, is the only way to prevent widespread piracy of films and television shows. That, at least, is what the respective industries that create the programming would have you believe. This is tosh, of course. Tune in to any public torrent-seeding site—which links up millions of people willing to share pieces of a digital file—and take your pick of all the latest entertainment. Then there is the invitation-only Darknet, a busy back alley of encrypted servers sporting the full contents of cracked Blu-rays and up-to-the-minute bootlegs obtained by surreptitiously filming a cinema screen.

    While reverse-engineering DRM is (mostly) illegal in the United States under the Digital Millennium Copyright Act, and illegal or frowned upon in an increasing number of countries, DRM is the pasties in a strip tease. It hardly hides the real goods, which are there for the taking. It is trivial to share video, despite the threat of fines, imprisonment and, perhaps worst of all, temporary disconnection of one's internet feed. Further, nearly all music sold as digital downloads in the western world is free of DRM. The music industry acceded to retailers' wishes to sell unencrypted MP3 and AAC files to break Apple's pricing stranglehold on online music purchases.

    Now Louis C.K., a comedian, has put his oar in. In early December he made available for sale from a bespoke website a DRM-free video of a summer performance of his latest material. He priced the hour-long show at $5, a significant discount from the $15 to $20 such a gig would typically cost (though given the lack of intermediaries, he might net the same per-unit fee). He soon clocked 220,000 purchases and (according to a screenshot he posted) over $1m in receipts after processing costs. He says his move was dictated not by the failure of other methods of distribution, but undertaken in the spirit of experimentation.

    As Mr C.K. explained in a blog post on December 13th (not shying away from the odd obscenity), he did not stint on the production. He paid hundreds of thousands of dollars upfront to stage and record the shows, and many hundreds of hours of his time were taken up in creating and testing material, and then editing the final production, not to mention the technical and financial details. (Mr C.K. has form: he singlehandedly edits his television series, now entering its third season on the FX network.)

    Of course, many performers and bands have made DRM-free audio directly purchasable from their own websites. Creating a high-quality video takes much more effort and expense; your correspondent is unaware of any efforts other than Mr C.K.'s that were not free, promotional, or hosted by a studio or network. 

    Attention to detail was key. Mr C.K. splurged $32,000 on developing a site that would make it easy for his fans to pay, and then stream or download the video. It may be streamed from the site twice, or downloaded three times, though only a single download is necessary since the unprotected file may be viewed by any number of standard digital video players on desktop and mobile operating systems. He also made a number of particularly friendly choices in his website construction. When buyers sign up, for instance, the site offers to put them on an e-mail list, but pre-selects the opt-out option, which is labelled "No, leave me alone forever, you fat idiot." The purchase, handled by PayPal with either a credit card or a PayPal account, leaves the user automatically logged in without requiring a separate step. And when your correspondent had to reset a lost password, the replacement code was true to form: it began with "stupid".

    The performer says he priced the video so low as to make it look like a steal, and so prevent the file from becoming a popular torrent from which no benefit would accrue to him. Inevitably, someone posted a torrent seed on Pirate Bay, noting "i kinda feel bad putting it here but people like louis ck gotta realize without torrents and the net he wouldnt be anywhere." This provoked a deluge of angry comments from other users. Hardly the types to shed a tear for big film studios' piracy woes, they nonetheless respected Mr C.K.'s approach.

    Now, of course, Mr C.K. has the advantage of millions of fans from his live and television performances. He receives praise from his fellow comedians and appears regularly on late-night television. His approach would probably not work for someone appearing at open-mic night in Duluth once a month. But it is a chink in the DRM argument: that a popular video will inevitably be so widely stolen that the creator cannot count on a reasonable return. Mr C.K. is the first to try this method, and thus earns additional attention—and sales. Others are sure to follow.

  • Babbage: December 21st 2011

    Age of transparency

    Dec 22nd 2011, 8:31 by The Economist online

    THE post-PC era, China's rising tech power and technology's impact on politics

  • Flu research

    A deadly balance

    Dec 21st 2011, 22:36 by C.H. | NEW YORK

    TEMPTING fate is never wise; tempting a flu pandemic is downright foolish. Yet it is impossible for scientists to understand influenza or create vaccines without at least some risk. The question, then, is what level of risk is acceptable.

    On December 20th American authorities said they had asked the world’s leading scientific journals to withhold research. The request, to Science and Nature, is highly unusual. But so is the research in question. Two separate teams, led by Yoshihiro Kawaoka at the University of Wisconsin, Madison, and Ron Fouchier at Erasmus Medical Centre in Rotterdam, have tinkered with H5N1, otherwise known as bird flu. The resulting strains are dramatically more dangerous.

    According to the World Health Organisation, bird flu has killed more than 300 people since 2003. Its toll would certainly have been far greater had it not been for H5N1's important limitation: it is not easily transmitted to humans, or between them. But if the virus ever evolved to hop nimbly from man to man, it could unleash a pandemic.

    That evolution has now occurred, helped by the researchers in Madison and Rotterdam. Each team engineered the virus so that it could be transmitted through the air from ferret to ferret (good proxies for humans). Details of both studies are still under wraps, but a paper Dr Fouchier presented in September at a virology conference in Malta outlines his team's approach. He and his colleagues first tried to fiddle with the flu genome directly, introducing bespoke changes to it in an effort to create an airborne strain. When this did not work, he resorted to the low-tech method of passing the virus (albeit one with three engineered mutations) from one ferret to another a number of times, giving it an opportunity to mutate naturally. After ten generations, evolution worked its (in this case black) magic: the flu had gone airborne. The nasty strain had five mutations in two genes. Each of these has, Dr Fouchier notes, already been found in nature, though only in separate strains and never clumped together.

    The new, deadlier flu strains exist only in labs, of course. However, the fear is that if the researchers are allowed to describe the genetic changes needed to create the new strains and the precise methods used to obtain them, then terrorists or other mischief-makers can copy the techniques. H5N1 would become the atomic bomb of biological warfare.

    American officials would rather stymie such proliferation. After the attacks of September 11th, 2001, America created the National Science Advisory Board for Biosecurity (NSABB) to advise the health department. The NSABB has not asked Science and Nature to withhold the new research altogether. Rather, it has tried to strike a balance, asking the journals to publish enough information to encourage further understanding and responsible research, but not enough to allow the researchers’ methods to be put to nefarious use.

    Bruce Alberts, the editor of Science, which accepted Dr Fouchier's work for publication, said in a statement that the journal was considering what to do. (Dr Kawaoka submitted his to Nature.) The journal would wait for the government to suggest how the critical data might be shared with scientists confidentially. Knowledge about the new virus, Dr Alberts wrote, “could well be essential for speeding the development of new treatments to combat this lethal form of influenza.” Blunt censorship would be counterproductive.

    There are other worries. Laurie Garrett at the Council on Foreign Relations, a think-tank, points out that some deadly viruses, like smallpox, are kept under countless locks and keys in highly secure laboratories. The new strains, meanwhile, are not as well protected. As such, they might be unleashed not just by terrorists, but by simple error. That is probably one risk not worth taking.


  • Global health

    'Tis not the season to be frugal

    Dec 20th 2011, 9:55 by C.H. | NEW YORK

    ELEVEN years ago the world’s leaders made ambitious promises for global health. The toll from HIV, tuberculosis and malaria would plummet by 2015, they said, while the health of mothers and their children would improve. Donations swelled to help reach the so-called Millennium Development Goals. Then came the financial crisis and panic over the rich world’s deficits. The giving mood has waned.  

    A new report, with an accompanying paper in Health Affairs, examines how aid for global health has changed. Christopher Murray and his colleagues at the University of Washington track disbursements from 1990 to 2009, then use budget documents, financial statements and other sources to estimate spending for 2010 and 2011. Their findings are not as grim as some might expect. Aid for health reached $27.7 billion in 2011, up 4% each year from 2009 to 2011. But that growth is anaemic compared with what it once was. 

    The change is notable for two reasons. First, it marks a shift from what had been an historic wave of enthusiasm. Donations for health doubled from 2001 to 2008. The current trajectory resembles the more measured growth of the 1990s. 

    Second, the change marks the first threat to a new model of aid. After 2000 the structure of aid shifted. Government agencies such as the United States President’s Emergency Plan for AIDS Relief (PEPFAR) funnelled more of their money through private organisations. In 2000 the Bill and Melinda Gates Foundation helped to found the Global Alliance for Vaccines and Immunisation (GAVI). In 2001 Kofi Annan, then secretary-general of the United Nations, announced a new public-private partnership, the Global Fund to Fight AIDS, Tuberculosis and Malaria. Public donors gave GAVI and the Global Fund $3.24 billion in 2009. Agencies at the UN received just $2.11 billion. Now, however, the Global Fund is struggling. In November it announced that it would give no new grants until 2014. The downturn is largely to blame; concerns over mismanagement have not helped. GAVI has fared better, with a successful fundraiser in June.

    Other shifts threaten health too. Although American aid has been growing, the pace has slowed to just 2% from 2010 to 2011. Britain is the rare country that has devoted more money to global health, increasing aid by 14% from 2010 to 2011, but there is pressure to cut it. The World Bank has offset some of the broader drop. However, the bank mostly gives loans to middle-income countries, not grants to poor ones. If the World Bank’s loans are excluded from Dr Murray’s tally, health aid increased by a meagre 1% from 2010 to 2011.

    It was always doubtful that the world would meet the Millennium Development Goals. Now those goals seem even more distant. The money that remains, however, could have greater impact. Dr Murray estimates that for every dollar donated from the rich world, a poor government redirects 56 cents from health. The Global Fund is working harder to wring value from money, a welcome shift. As aid dwindles, both donors and recipients must find ways to do more with much less. 

  • Web games

    Press pause, then resume

    Dec 20th 2011, 9:42 by G.F. | SEATTLE

    BEING in beta means never having to say you're sorry. The internet hosts perpetual prerelease software and web applications that are presented as an amalgam of finished service and something not to be relied upon. Just wait for the next nought-point-something version, and the next, and accept the apologies for lost connections, missing data and complete system resets. Google's webmail system, Gmail, was famously in beta for six years, and Google has made a habit of releasing early and often.

    Apple's beta-version Siri voice-processing system received a battering last week when questions about abortion clinics weren't understood (in the United States, clinics offering abortions don't always advertise under that label). Apple and independent search-engine analysts provided reasonable explanations about Siri's responses, some of which are hardwired, while others rely on natural-language parsing. But Apple also hid behind the beta banner: "As we bring Siri from beta to a final product, we find places where we can do better," the firm said in a statement. This hardly explains its non-beta promotion of Siri in television and print advertising as a key feature of the iPhone 4S, on which it is exclusively available.

    The culture of launching before a product is truly ready makes the decision by the makers of Glitch, a massively multiplayer online role-playing game (MMORPG), to retreat into beta two months after its formal release all the more startling. Stewart Butterfield, the boss of Glitch's developer Tiny Speck and the co-founder of the firm that developed Flickr, says that despite over a year in an early invitation-only alpha stage, and six months in broader release, some of the game goals and mechanics just weren't working correctly. Glitch has roughly 100,000 registered users, although free game play is common. The company doesn't disclose revenue or numbers of paying customers. Paid subscribers may be billed $5 to $15 per month for varying levels of game add-ons.

    Your correspondent spent a couple of hours playing Glitch, a game that has no combat, and involves commerce, puzzles and paying homage to a set of giants that run the mostly bucolic Glitch world. One learns skills by asking a sort of spirit guide to study up in real time as a proxy. A first lesson in mining takes 40 real-world minutes, for instance, and progresses even when one is not actively playing the game. Babbage spent an inordinate amount of time squeezing chickens, petting pigs and giving kudos to cherry trees while trying to figure out how to find the bureaucratic agency that would issue him paperwork to allow freer travel.

  • Intellectual property and mobile devices

    World patent war 1.0

    Dec 19th 2011, 22:17 by P.L.

    HTC, a Taiwanese maker of smartphones, could clearly have done without this sort of Christmas present. On December 19th America’s International Trade Commission (ITC) ruled on one of the most closely watched of the many patent battles being fought over mobile phones. It upheld a judge’s decision, made in July, that some of HTC’s devices that use Google’s Android operating system infringe a patent owned by Apple, creator of the iPhone, but reversed his verdict that another patent had been violated. The offending handsets may no longer be imported into the United States after April 19th 2012. Not only is the ruling plainly unwelcome for HTC, but it illustrates how important an American trade agency has become as an arbiter of disputes that, at first blush, have little to do with international trade.

    HTC sells around 40% of its smartphones in North America, nearly all of them using Android. In the third quarter the firm sold more smartphones in America than anyone else, a whisker ahead of Apple and South Korea’s Samsung, according to Gartner, a market research firm. A ban on some of its Android phones is thus a blow, although it or Google may find a way of working around the patent. (Florian Müller, an analyst who tracks patent disputes, believes that upholding the judge’s decision on the other patent would have done more damage.) To make matters worse, last month HTC shocked investors by saying that it expected its revenue in the fourth quarter to be no higher than it was a year before. The company’s shares have lost nearly two-thirds of their value since May.

  • The future of film

    Difference engine: Going to the movies again?

    Dec 16th 2011, 20:04 by N.V. | LOS ANGELES

    HOME theatres that use large high-definition television sets coupled to surround-sound audio systems offer so immersive an experience, at so modest a cost, that they have begun to threaten the movie industry’s ticket sales. More and more people are waiting for Hollywood’s new releases to come out on DVD or Blu-ray Disc—so they can experience them in the comfort of their own homes, rather than pay extortionate prices at a local multiplex for the dubious privilege of viewing them on the silver screen.

    There is nothing new, of course, about television’s impact on the cinema. A steady erosion of ticket sales has ensued since the telly took over the living room in the middle of the last century. Until recently, though, the competition between the two media was for the viewer’s time. Now it is more about disposable income and the quality of the viewing experience. A Blu-ray Disc played on a large 1080p plasma-panel display can more than match a cinema’s sound and picture quality for a fraction of the cost—and provide a compelling reason to keep even avid movie fans at home.

    The impact on ticket sales is telling. For the first nine months of this year, box-office receipts in America were down 2.3% (to $8.33 billion, from $8.52 billion for the same period last year), say analysts at SNL Kagan, a finance and media research company based in Virginia. Meanwhile, video rentals from Netflix alone increased 48% (to $2.33 billion, from $1.57 billion). And this comes after the film industry has tried every trick in the book to halt its downward spiral.
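
    For those who like to see the sums, a quick sketch reproduces those percentages from the rounded dollar figures quoted above (to within a tenth of a point).

    # Percentage changes implied by the quoted figures ($ billions, first nine months).
    box_office_2011, box_office_2010 = 8.33, 8.52
    netflix_2011, netflix_2010 = 2.33, 1.57

    print(f"Box office: {box_office_2011 / box_office_2010 - 1:+.1%}")    # about -2.2%
    print(f"Netflix rentals: {netflix_2011 / netflix_2010 - 1:+.1%}")     # about +48%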

    The biggest stunt of late has been 3D. Following James Cameron’s blockbuster movie “Avatar”, Hollywood has rushed out reels of second-rate 3D films. In catching at straws, however, the industry made the mistake of crediting the success of “Avatar”, which grossed over $2.7 billion, to the film’s stereoscopic effects rather than Mr Cameron’s craftsmanship and artistic genius (see “The best seat in the house”, May 7th 2010).

    But, once again, the 3D fad is fading fast. Audiences dislike having to wear dorky glasses just as much today as they did in the 1950s—the previous time 3D was hailed as cinema’s saviour from the onslaught of television. And many people still complain that the stereoscopic effects induce nausea and headaches. Besides, 3D is also costly to capture, process and project. The only reason 3D has continued as long as it has this time around has been the enthusiasm of the cinema chains, which have used it to add $5 to $7.50 to ticket prices.

    But now that television makers have put 3D circuitry in their HDTV sets, the gimmick’s value has largely been eroded. While the stereoscopic circuitry adds little to a television’s cost, makers had hoped to charge a hefty premium for 3D-enabled sets, and up to $150 extra for every pair of stereoscopic glasses. Consumers quickly disabused them of such thoughts. Over the past year, the premium for 3D television has all but evaporated, showing how little the effect is valued by viewers in general (see “Beyond HDTV”, July 28th 2011).

    With box-office sales continuing their free-fall, the picture houses ought to be champing at the bit to offer a cinematic experience that viewers cannot get at home. Some are. The giant-screen IMAX system, for instance, can pack theatres even at premium prices—so rewarding and unique is the experience. But the movie business in general takes its cue from the film studios, which have rarely been in the vanguard of innovation.

  • Cheap computing

    A tablet for everyone

    Dec 15th 2011, 10:33 by A.A.K. | DELHI

    IN OCTOBER, a day after Apple's iPhone 4S was launched, a little-known London-based company named DataWind, in collaboration with the Indian government, unveiled the world’s cheapest tablet computer in New Delhi. The product, christened "Aakash" ("sky", in Hindi), will be sold to students at a subsidised price of $35, or be available in the retail market for $60. Although the scheme is part of a national education programme, Suneet Singh Tuli, DataWind's boss, aims to sell his tablet to anyone who can afford a cheapish mobile phone. This may sound Pollyannaish, as previous such attempts have failed to take off—most notably, Nicholas Negroponte’s One Laptop Per Child (OLPC) project.

    Mr Negroponte sought funding from education ministries in developing countries which promised to buy at least 1m units each, priced at just $100. The plan seemed fairly simple: economies of scale would drive down hardware costs, and relying on open-source stuff meant that most software was as good as free. However, in the end, the device turned out to be too expensive at $200. Countries like Haiti, Rwanda, Peru and Uruguay cancelled their bulk orders. The product was written off as too clunky and too slow. Admittedly, the vision was grandiose, even if the project lacked a sound execution plan. At press gatherings, Mr Negroponte would describe it as an education programme and not a technology project.

    This is where Mr Tuli’s vision differs. Unlike his more celebrated counterpart, Mr Tuli considers his project to be a business venture; he is in it to make money. To keep costs in check, his company produced most parts from scratch instead of buying them from vendors. “We buy glass and make our own screens,” he says, referring to the LCD screens, which can cost $12 to $15 apiece if sourced from a vendor. Instead of procuring a ready-made Wi-Fi module, his team assembled the unit itself. Driving down costs was important, but so was arriving at the right price.

    In India, manufacturers of electronic items have learned the hard way that throwaway pricing does not necessarily assure sales. Take netbooks, which were introduced at prices much lower than those of laptops. A no-frills laptop sells at 15,000 rupees ($280); a netbook, on the other hand, is a third cheaper. In this price range, buyers can afford to pay the difference and upgrade to a laptop—much like the mobile-phone market, where a 1,300-rupee handset outsells one priced at just 800 rupees. This is where DataWind’s 3,000-rupee tablet scores over its competitors, as the closest substitute costs more than three times as much. Although its technical specifications may seem dodgy at first glance, it covers most bases.

    The Wi-Fi-enabled, seven-inch touch-screen tablet weighs 350 grams and runs on the somewhat older Android 2.2 operating system meant for smartphones. The current version has a 366MHz processor and a separate video-core processor for playing YouTube-like multimedia content. In addition to two USB ports, it has a 2GB micro-SD card, expandable up to 32GB. A SIM-card slot allows access to the internet through 2G and 3G networks. However, accessing the internet on such devices can be tricky. Limited processing power and low memory do not support high-speed browsing. Mr Tuli claims that his proprietary browser uses less data and fetches results faster.

    Ordinarily, when a URL is keyed in, the browser establishes contact with the website’s servers. DataWind’s browser functions differently. Instead of fetching data from the website directly, the browser sends a message to the firm's proxy servers in the cloud. Using DataWind’s patented technology, these servers compress the webpage and return that information to the browser. This shifts the burden of processing from the client device to the cloud, Mr Tuli says, and he reckons that the frugal data usage (owing to the compression) will translate into smaller broadband bills.
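
    As a rough illustration of the principle only (this is emphatically not DataWind's code or protocol, and a real system would also recompress images and strip markup), a toy version of a compressing proxy might look like this:

    # Toy sketch: a cloud-side helper fetches and compresses a page; the device
    # pays only for the smaller transfer and decompresses it locally.
    import urllib.request
    import zlib

    def proxy_fetch(url):
        # What the cloud-side proxy might do: retrieve the page, then compress it.
        html = urllib.request.urlopen(url).read()
        return zlib.compress(html, 9)

    def client_render(payload):
        # What the device does: decompress and hand the page to the renderer.
        return zlib.decompress(payload)

    payload = proxy_fetch("http://example.com/")   # placeholder URL
    page = client_render(payload)
    print(f"transferred {len(payload)} bytes for a {len(page)}-byte page")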

    Additionally, Mr Tuli likes to stress that his product will bear a "Made in India" tag. However, this has come at a steep price. The company’s manufacturing unit is located in Andhra Pradesh, a state constantly marred by political strife. Repeated delays in production have postponed the retail launch date to “early 2012”. Mr Tuli also needs to address some product-related issues. With just three hours of claimed battery life, schools will need to have desks with charging sockets, an unviable option in most places. Although the device supports USB-powered solar chargers, at 2,000 rupees apiece they are too pricey. A cheaper alternative, which may be priced at 500 rupees, is in the works, says Mr Tuli. With early feedback, mainly from students, product revisions are already under way. Also, shunning the much larger Android marketplace in favour of GetJar, a free app store, may not be such a smart idea in the long run. In a few months, when the product goes on sale, it will be clear whether Mr Tuli’s idea takes off or ends up as another case of a grand vision gone awry.

  • Babbage: December 14th 2011

    Going public

    Dec 14th 2011, 10:07 by The Economist online

    ZYNGA and Jive lead a field of new tech IPOs and video-game giant Activision breaks its own sales record

  • The Higgs boson and the Large Hadron Collider

    A hint of Higgs

    Dec 14th 2011, 6:49 by The Economist online

    CERN announces sightings, but not proof, of possible Higgs boson signatures in the LHC, and our correspondents take a look inside the world's largest microscope

  • Changes at Wikipedia

    Seeing things

    Dec 13th 2011, 20:34 by L.M.

    WIKIPEDIA has just unveiled the first version of its new visual editor. Babbage saw a preview of the new interface (now available here) before it went live and it would be no overstatement to call it the most significant change in Wikipedia’s short history. The hope is that editing the online encyclopedia with the visual editor will be more like playing around on blogging platforms such as Wordpress or Blogger and less like something that only other people do.

    For the moment the visual editor is still a developer prototype, or a demo of a demo, intended only for software developers to fiddle with and help improve. The options are limited to using pre-formatted lists and inserting bold, italics and links. There's also a dropdown menu with a list of formats, which obviates the need to mark up headers and boxes. Advanced users also have the option to see how their changes look in wiki syntax, HTML or JSON, a data format derived from JavaScript. Picture-related tools and the like will be included in later versions as the new editor is steadily improved before integration into Wikipedia starts in June 2012. Wikipedia reckons it is the most challenging technical project it has ever undertaken.

    The new editor is needed for a simple reason: the number of active editors on Wikipedia's English-language version is in decline, having peaked in 2007. According to the 2011-12 annual plan of the Wikimedia Foundation, the organisation that runs Wikipedia, “declining participation is by far the most serious problem facing the Wikimedia projects”. Visual editor is a “big obvious fix”, part of a $1m project to develop new features and make improvements. It is a change that was long overdue.

    A fortnight before Jimmy Wales and Larry Sanger officially launched Wikipedia on January 15th 2001, neither one had ever heard of a wiki. When Mr Sanger was introduced to the idea, he was impressed by its simplicity, writing that it was “the ULTIMATE ‘open’ and simple format for developing content” (his emphasis). At the time, wikis seemed revolutionary. Here was a way for amateurs to avoid mucking about with complicated code such as HTML in favour of a relatively easy mark-up language with only the most basic of stylistic and structural conventions.
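
    A purely illustrative comparison shows what that simplicity bought. The passage below is invented for the example; it is not drawn from Wikipedia itself.

        # The same short passage, once in wiki mark-up and once as the rough
        # HTML an author would otherwise have to write by hand.
        wikitext = (
            "== History ==\n"
            "Wikipedia was launched in '''2001''' by [[Jimmy Wales]] "
            "and [[Larry Sanger]].\n"
            "* Anyone can edit it.\n"
        )

        html = (
            "<h2>History</h2>\n"
            "<p>Wikipedia was launched in <b>2001</b> by "
            '<a href="/wiki/Jimmy_Wales">Jimmy Wales</a> '
            'and <a href="/wiki/Larry_Sanger">Larry Sanger</a>.</p>\n'
            "<ul><li>Anyone can edit it.</li></ul>\n"
        )

        print(wikitext)
        print(html)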

    That was, of course, before your grandmother joined Facebook. Ten years later the online population has grown to include most sentient beings (at least in the developed world) and web users have grown accustomed to graphical interfaces. The wiki language now seems impenetrable to most, or at least too much trouble to learn. “When Wikipedia was created, everything was hard on the internet. We were no harder than anything else. But today most forms of interaction online are easier than editing a wiki article and that creates a barrier to entry that doesn’t do anybody any good,” says Sue Gardner, executive director of the Wikimedia Foundation.

    If HTML is a vast open field on which you can wander in any direction unfettered by restrictions, then Facebook is a city tram line, structured and restrictive of where those using it can go. Wikis fall somewhere in the middle, allowing a great deal of freedom within certain limits. Think of it as a network of pathways and cycle lanes where your route is based on the specific needs of your journey. The wiki syntax assumed people were familiar with the lay of the land. Visual editor is akin to handing out maps at the entrance.

    The Foundation hopes this will lead to a fresh spurt of new editors, upon whom “the success of the projects is entirely dependent”. This is both good and bad. The last great influx, in 2006-07, was badly handled by existing editors, who could not cope with the sudden wave of newcomers. Unable to coach or orient what Ms Gardner calls the “well-intentioned but clueless” new editors, they erected barriers in the form of templates and automated responses.

    This time around, the Foundation will do things differently. As with editing articles, editors no longer need to know arcane codes in order to dispense “wikilove”: virtual medals congratulating contributors on a job well done. Instead, another new tool makes it easier to award stars and badges as a means of encouraging new recruits. It is hard to say whether that will be enough. But if things go well, Wikipedia’s famously grumpy senior editors should have their hands full once again.

  • Particle physics

    Higgs on the horizon

    Dec 13th 2011, 18:16 by J.P.

    WHEN it emerged that two experiments at CERN, the world's leading particle-physics laboratory on the outskirts of Geneva, were sending their most senior scientists to present the latest lowdown from the search for the Higgs particle on December 13th, speculation swirled. Would they at last confirm the existence of the boson, famously implicated in endowing other elementary particles with mass, which has eluded physicists for over 40 years? Might they say for sure that it does not exist, consigning the Standard Model, a framework which has governed particle physics for nearly as long on the assumption that it does, to the dustbin of dislodged theories—and sending their theoretically inclined colleagues back to the drawing board?

    In the event, Fabiola Gianotti and Guido Tonelli, who lead the ATLAS and CMS collaborations respectively, tried to sound a note of caution. But the excitement during and after their presentations was palpable. For the two experiments have provided the most tantalising, though inconclusive, evidence to date for the existence of the sought-after particle, which Peter Higgs, a British physicist, plucked from mathematical formulae he had been working on in 1964 while trying to spruce up the Standard Model.

    The model postulates the existence of 17 particles. Of these, 12 are fermions, like quarks (which coalesce into neutrons and protons in atomic nuclei), electrons (which whiz around these nuclei) and neutrinos (the ubiquitous, diaphanous beasts which have themselves been grabbing headlines of late by seemingly travelling faster than light). These make up ordinary matter. (All have corresponding anti-fermions which, logically, constitute antimatter.)

    A further four particles, known as bosons, transmit three fundamental forces of nature. Familiar photons, particles of light, convey the electromagnetic force which holds electrons in orbit around atoms. Gluing quarks into protons and neutrons are appositely named gluons. Finally, W and Z bosons carry the weak nuclear force responsible for certain types of radioactive decay, as well as the hydrogen fusion which fuels stars. (How the fourth force, gravity, fits into all this remains arguably the greatest unsolved puzzle in physics.)

    Physicists need the Higgs to make sense of the properties of these other 16 subatomic species. Without it, or something like it, they have no way to explain how fermions and some bosons get their mass. That, though, is not its main virtue. As far as the Standard Model is concerned, one could simply assume that mass is a fundamental property of particles with no need for further explanation. The rub is that a Higgs-less Standard Model predicts that all bosons should have no mass. Photons and gluons abide by this rule. The W and Z, by contrast, flout it, weighing almost as much as 100 protons.
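
    As a back-of-the-envelope check, using standard textbook values that the article itself does not quote (a W boson of roughly 80.4GeV, a Z of roughly 91.2GeV and a proton of roughly 0.938GeV), the “almost as much as 100 protons” figure works out as:

        \[
        \frac{m_W}{m_p} \approx \frac{80.4\ \mathrm{GeV}}{0.938\ \mathrm{GeV}} \approx 86,
        \qquad
        \frac{m_Z}{m_p} \approx \frac{91.2\ \mathrm{GeV}}{0.938\ \mathrm{GeV}} \approx 97.
        \]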

    Dr Higgs figured out (as did five other physicists around the same time) that this could be explained by postulating the existence of a field, later dubbed the Higgs field, which pervades all space. To understand how it works consider a ferromagnet which is heated up and then chilled. Each atom inside it acts as a miniature magnet. At high temperatures, they wiggle around willy-nilly, not preferring any one direction to any other. The system is, in a sense, symmetrical: the milling atoms look the same whatever the observer’s vantage point. On reaching a particular temperature, though, they suddenly pick a preferred direction, creating a uniform magnetic field. They no longer look the same to different observers.

    Something similar is believed to have happened with the Higgs field. At the scorching temperatures instants after the Big Bang it was in disarray and all elementary particles were oblivious to it. They were, in other words, massless. Moreover, photons, Ws and Zs all looked the same. There was no distinction between electromagnetism and the weak interaction. Instead, the three bosons conveyed the same “electroweak” force. 

    As the universe cooled, however, the pleasing uniformity suddenly collapsed: the Higgs field picked a direction. The W and Z feel the resulting field but the photon does not, just as some metals feel the pull of a ferromagnet and others don’t. Physicists say that on reaching a critical temperature, the symmetry between electromagnetic and weak force was spontaneously broken. The upshot may not look at all symmetrical, but it nonetheless reflects a deeper symmetry which just happens to be hidden from view in the low-temperature world. The Higgs boson emerges from the mathematical wizardry used to flesh out this symmetry-breaking mechanism.
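
    For readers who want the nuts and bolts, the standard textbook way of writing this down (the article itself does not spell it out) gives the Higgs field a potential whose minimum sits away from zero, so that the field settles at a non-zero value v and thereby “picks a direction”:

        \[
        V(\phi) = \mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2,
        \qquad \mu^2 < 0,\ \lambda > 0
        \;\Longrightarrow\;
        \langle\phi^\dagger\phi\rangle = -\frac{\mu^2}{2\lambda} \equiv \frac{v^2}{2},
        \quad v = \sqrt{-\mu^2/\lambda} \approx 246\ \mathrm{GeV}.
        \]

    The W and Z acquire masses proportional to v; the photon, which does not couple to the settled field in this way, stays massless.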

    Playing hard to get

    Rolf-Dieter Heuer, the head of CERN (interviewed here), once quipped that physicists know everything about the Higgs apart from whether it exists. There are several reasons why the particle has proved so elusive. For a start, as Dr Heuer knows full well, his assertion is not strictly speaking true: theory is irritatingly noncommittal about the particle’s mass. That means that searching for it involves looking across a wide range of possible masses. Past experiments at CERN's old accelerator, the Large Electron-Positron Collider (LEP), ruled out masses below 114 gigaelectron-volts (GeV), the esoteric unit particle physicists like to use. Anything higher, though, has been fair game.

    Both ATLAS and CMS draw their subatomic cannon fodder from the LEP’s snazzier successor, the Large Hadron Collider (LHC), CERN’s (and the world’s) biggest particle accelerator. The LHC, housed in a 27km circular tunnel beneath the Franco-Swiss border, collides protons whizzing around it in opposite directions at a smidgen below the speed of light (see our video of how it works). The colliding protons’ kinetic energy is converted into other particles (since, as Einstein showed, energy and mass are one and the same). More precisely, each proton-proton collision involves a handful of quarks and gluons. It is only two of these that actually collide. The remaining lot cannot exist by themselves and decay to produce obfuscating detritus. 

    Moreover, the Higgs, should it emerge from such a collision, is unstable and immediately decays into less fleeting bits. ATLAS and CMS are honed to detect particular patterns of the less chimerical particles that the Higgs is believed to morph into. Unfortunately, such patterns are not specific to the Higgs; other subatomic processes produce an abundance of identical telltales. So the experiments are not after a signature signal but an excess of such signals—a fraction of a percent or less—over what would be expected were the Higgs not real. Having each analysed some 380 trillion collisions recorded since the LHC got cracking in earnest in 2010, both have now seen just such an excess, at around 125GeV.

    At between one chance in 2,000 and one in 20 of being a fluke—depending on what statistical method is used—the findings fall short of the exacting one-in-3.5m target particle physicists have set themselves to claim discovery with confidence. But the fact that independent measurements of different possible decay patterns (especially extremely rare ones involving the production of two photons) from two separate experiments point to masses of the putative Higgs within a few GeV of each other has led some physicists to claim that discovery is afoot.
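
    Those fluke probabilities map onto the “sigma” units physicists usually quote, via the one-sided tail of a normal distribution. The sketch below is a rough aid to translation only, not part of the CERN analyses.

        # Converts a "one chance in N of being a fluke" figure into the
        # equivalent number of standard deviations (one-sided normal tail).
        from scipy.stats import norm

        for label, p in [("1 in 20", 1 / 20),
                         ("1 in 2,000", 1 / 2000),
                         ("1 in 3.5m (discovery threshold)", 1 / 3.5e6)]:
            print(f"{label:32s} -> about {norm.isf(p):.1f} sigma")

        # Prints roughly 1.6, 3.3 and 5.0 sigma respectively, which is why the
        # one-in-3.5m target is usually described as "five sigma".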

    Others see this as premature. Earlier this year both CMS and ATLAS presented alluring bumps at around 130-140GeV, but these evaporated on closer inspection. At the time, however, they also observed smaller spikes around 125GeV, which now appear to have grown into something statistically sturdier.

    Importantly, ATLAS and CMS have also ruled out pretty much the entire range below 115GeV and between 130GeV and 600GeV, beyond which the LHC currently lacks the oomph to whip up anything of interest. This means that they can now focus their efforts on probing the interesting 15GeV-wide band that remains. Dr Tonelli and Aleandro Nisati, who helps co-ordinate ATLAS’s research efforts, are wary of committing to a date by which a definitive answer to the Higgs question will be known. If all goes well, though, it could come as early as a few months from now.

    Should the latest findings be confirmed, the next step will be to ascertain that the bump in question really is the sort of Higgs its eponymous conjurer envisaged. That will mean making precise measurements not just of its mass, but also of other properties like its assorted charges. If this ends in success, the Standard Model will finally be complete, and Dr Higgs will no doubt have earned his Nobel Prize, together with two of the other five physicists who came up with the same idea. (The Nobel committee’s rules prevent the prize being split more than three ways.)

    A Higgs with a mass of 125GeV is also, in the words of John Ellis, a former head of theory at CERN, “just dandy” for a theory called supersymmetry which many see as the most viable successor to the Standard Model. It postulates the existence of heavier partners to all the known particles, and by doing so neatly explains many aspects of the physical reality the Standard Model has no purchase on. If a Higgs exists and weighs around 125GeV, then, the LHC ought, in principle, to be powerful enough to create the lightest supersymmetric particles. That is still an “if”. But it just got a bit less iffy.

  • Nuns and contraception

    Praying for the Pill

    Dec 9th 2011, 14:48 by C.H. | NEW YORK

    THE Catholic church condemns all forms of contraception, a policy that Paul VI laid out in detail in Humanae Vitae in 1968. Over the subsequent decades it has had various brawls with secular authorities over the use of birth control pills. Most recently, America’s bishops have fought to keep Barack Obama’s health law from providing contraception free. The church has already won an exemption for women who work for a church, but it also wants to keep coverage from women who work for any Catholic institution, even if the women in question are not Catholics and the institution has a secular purpose, such as a school, say, or hospital. Given all this, it would seem unlikely that the church would want to give the Pill to its nuns.

    Yet that is precisely what a recent paper in the Lancet suggests. Its authors, Kara Britt and Roger Short, of Monash University and the University of Melbourne, urge the Church to provide oral contraception to the sisters. Nuns need the Pill not to prevent pregnancy, but to prevent cancer.

    In 1713, the authors write, an Italian doctor observed that nuns had a very high rate of that “accursed pest”, breast cancer. Modern studies have confirmed that Catholic nuns have a higher risk than most women of dying from breast, ovarian and uterine cancers. Women who bear children have fewer menstrual cycles, thanks to both pregnancy and lactation (which suppresses menstruation). Other studies have established a relationship between menstrual cycles and the prevalence of cancer, with fewer cycles meaning a smaller risk. Nuns, who are required to be celibate, experience more cycles than the typical woman, and therefore run a higher risk of developing these cancers.

    The Pill can help to counteract this. Overall mortality among women who use, or have used, oral contraception is 12% lower than among those who do not. The effect on ovarian and endometrial cancer is greater still: the risk of such cancers plummets by about 50%. Drs Britt and Short make a compelling medical case. But it is unlikely to sway the Church.

  • Special report: Video games

    All the world’s a game

    Dec 8th 2011, 14:53 by The Economist online

    VIDEO games will be the fastest-growing and most exciting form of mass media over the coming decade, says Tim Cross

  • Battery vehicles

    Difference Engine: Volt farce

    Dec 8th 2011, 7:15 by N.V. | LOS ANGELES

    FOR General Motors, a good deal of the company’s recovery from its brush with bankruptcy is riding on the Chevrolet Volt (Opel or Vauxhall Ampera in Europe), its plug-in hybrid electric vehicle launched a year ago. Not that GM expects the sleek four-seater to be a cash cow. Indeed, the car company loses money on every one it makes. But the $41,000 (before tax breaks) Chevy Volt is a “halo” car designed to show the world what GM is capable of, and to lure customers into dealers’ showrooms—to marvel at the vehicle’s ingenious technology and its fuel economy of 60 miles per gallon (3.9 litres/100km)—and then to drive off in one or other of GM’s bread-and-butter models.
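
    The bracketed conversion is easy to verify, assuming GM’s figure refers to US gallons:

        # Converts miles per US gallon into litres per 100km.
        LITRES_PER_US_GALLON = 3.785
        KM_PER_MILE = 1.609

        def mpg_to_litres_per_100km(mpg):
            km_per_litre = mpg * KM_PER_MILE / LITRES_PER_US_GALLON
            return 100 / km_per_litre

        print(round(mpg_to_litres_per_100km(60), 1))   # prints 3.9, matching the figure above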

    So, it is no surprise that GM should bend over backwards to mollify customers concerned by recent news of the Volt’s lithium-ion battery catching fire following crash tests. GM is offering to lend cars to Volt owners worried about their vehicle’s safety while an official investigation is under way and any necessary modifications are made. The company has even offered to buy vehicles back from owners who have lost confidence in the technology.

    There have not been many takers. As of December 5th, fewer than three dozen owners—out of 6,400 Volts sold to date in North America—had requested loan cars. And only a couple of dozen had asked for their Volts to be bought back. At a suitable price, your correspondent would have welcomed the chance to buy one of those secondhand buy-backs for himself, had they not already been snapped up by employees. Dan Akerson, GM’s chief executive, is believed to have bought one for his wife.

    The trouble all started in May, when the National Highway Traffic Safety Administration (NHTSA) carried out a routine 20 mph (32km/h) crash test on a Volt—to simulate a sideways impact with a tree or telegraph pole followed by a rollover. Three weeks after the test, the car’s 16 kilowatt-hour battery pack caught fire in NHTSA’s car park, destroying the vehicle and several others nearby.

    Shortly thereafter, both NHTSA and the carmaker repeated the side-impact and rollover test on at least two other cars, all to no effect. However, in subsequent tests—carried out in November by experts from the energy and defence departments as well as GM—the investigators deliberately damaged the battery packs and ruptured their coolant lines. One battery pack behaved normally. Another emitted smoke and sparks hours after it was flipped on its back. And a third exhibited a temporary increase in temperature, but then burst into flames a week later.

    GM claims the initial fire in June would never have happened if the NHTSA's engineers had drained the Volt’s battery immediately after the impact. It is odd that they did not. When crash testing a conventional petrol-powered car, the standard procedure is to drain the fuel tank to prevent any chance of fire. It would seem reasonable to do the equivalent with an electric vehicle.

    But, then, GM did not adopt a “depowering” protocol for the Volt until after the June fire. Even when it did, it failed to share the procedure with the safety agency until embarking on the November tests. In the wake of the latest findings, GM is now working with the Society of Automotive Engineers, NHTSA and other vehicle manufacturers, as well as fire-fighters, tow-truck operators and salvage crew, to implement an industry-wide standard for handling battery-powered vehicles involved in accidents.

    Toyota ran into similar troubles when its Prius hybrid car was introduced over a decade ago. Though the Prius’s battery pack is considerably smaller than the Volt's, fire-fighters and other first-responders had to learn how to disarm the vehicle following an accident—by removing fuses from under the bonnet and pulling a catch beneath the rear storage area to isolate the high-voltage system. Until they had done so, they were warned, they were on no account to take a metal cutter to an overturned Prius to extricate trapped occupants. Lurking beneath the floor was a big orange cable carrying a heavy current that would have fried anyone slicing through it.
