Qatar Spyware

Everyone visiting Qatar for the World Cup needs to install spyware on their phone.

Everyone travelling to Qatar during the football World Cup will be asked to download two apps called Ehteraz and Hayya.

Briefly, Ehteraz is a covid-19 tracking app, while Hayya is an official World Cup app used to keep track of match tickets and to access the free Metro in Qatar.

In particular, the covid-19 app Ehteraz asks for several permissions on your mobile, like access to read, delete, or change all content on the phone, as well as access to connect to WiFi and Bluetooth, override other apps, and prevent the phone from going into sleep mode.

The Ehteraz app, which everyone over 18 coming to Qatar must download, also gets a number of other accesses such as an overview of your exact location, the ability to make direct calls via your phone and the ability to disable your screen lock.

The Hayya app does not ask for as much, but it has a number of critical aspects of its own. Among other things, the app asks for permission to share your personal information with almost no restrictions. In addition, the Hayya app gets access to determine the phone’s exact location, prevent the device from going into sleep mode, and view the phone’s network connections.
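
If you’re curious what an app will ask for before you install it, the requested permissions are listed in the APK itself. Here is a minimal Python sketch that shells out to aapt (the Android SDK’s asset packaging tool, assumed to be on your PATH) and flags permissions matching the capabilities described above. The flagged list is illustrative only; it is not taken from either app’s actual manifest.

# Minimal sketch: list the Android permissions an APK requests, using the
# Android SDK's aapt tool (assumed to be installed and on PATH). The
# "sensitive" set below mirrors the capabilities described above; it is
# illustrative, not a list taken from Ehteraz or Hayya themselves.
import re
import subprocess
import sys

SENSITIVE = {
    "android.permission.READ_EXTERNAL_STORAGE",
    "android.permission.WRITE_EXTERNAL_STORAGE",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CALL_PHONE",
    "android.permission.BLUETOOTH",
    "android.permission.WAKE_LOCK",            # keep the device from sleeping
    "android.permission.SYSTEM_ALERT_WINDOW",  # draw over other apps
    "android.permission.DISABLE_KEYGUARD",     # disable the screen lock
}

def requested_permissions(apk_path: str) -> set[str]:
    """Return every android.permission.* string aapt reports for the APK."""
    out = subprocess.run(
        ["aapt", "dump", "permissions", apk_path],
        capture_output=True, text=True, check=True,
    ).stdout
    return set(re.findall(r"android\.permission\.[A-Z_0-9]+", out))

if __name__ == "__main__":
    for perm in sorted(requested_permissions(sys.argv[1])):
        flag = "  <-- sensitive" if perm in SENSITIVE else ""
        print(f"{perm}{flag}")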

Despite what the article says, I don’t know how mandatory this actually is. I know people who visited Saudi Arabia when that country had a similarly sketchy app requirement. Some of them just didn’t bother downloading the apps, and were never asked about it at the border.

Posted on October 18, 2022 at 6:57 AM • 19 Comments

Hacking Automobile Keyless Entry Systems

Suspected members of a European car-theft ring have been arrested:

The criminals targeted vehicles with keyless entry and start systems, exploiting the technology to get into the car and drive away.

As a result of a coordinated action carried out on 10 October in the three countries involved, 31 suspects were arrested. A total of 22 locations were searched, and over EUR 1 098 500 in criminal assets seized.

The criminals targeted keyless vehicles from two French car manufacturers. A fraudulent tool, marketed as an automotive diagnostic solution, was used to replace the original software of the vehicles, allowing the doors to be opened and the ignition to be started without the actual key fob.

Among those arrested were the software developers, the tool’s resellers, and the car thieves who used it to steal vehicles.

The article doesn’t say how the hacking tool got installed into cars. Were there crooked auto mechanics, dealers, or something else?

Posted on October 17, 2022 at 10:07 AM • 8 Comments

Regulating DAOs

In August, the US Treasury’s Office of Foreign Assets Control (OFAC) sanctioned the cryptocurrency platform Tornado Cash, a virtual currency “mixer” designed to make it harder to trace cryptocurrency transactions—and a worldwide favorite money-laundering platform. Americans are now forbidden from using it. According to the US government, Tornado Cash was sanctioned because it allegedly laundered over $7 billion in cryptocurrency, $455 million of which was stolen by a North Korean state-sponsored hacking group.

Tornado Cash is not a traditional company run by human beings, but instead a series of “smart contracts”: self-executing code that exists only as software. Critics argue that prohibiting Americans from using Tornado Cash is a restraint of free speech, pointing to court rulings in the 1990s that established that computer language is a form of language, and that software programs are a form of speech. They also suggest that the Treasury Department has the authority to sanction only humans and not software.

We think that the most useful way to understand the speech issues involved with regulating Tornado Cash and other decentralized autonomous organizations (DAOs) is through an analogy: the golem. There are many versions of the Jewish golem legend, but in most of them, a person-like clay statue comes to life after someone writes the word “truth” in Hebrew on its forehead, and eventually starts doing terrible things. The golem stops only when a rabbi erases one of those letters, turning “truth” into the Hebrew word for “death,” and the golem ceases to function.

The analogy between DAOs and golems is quite precise, and has important consequences for the relationship between free speech and code. Ultimately, just as the golem needed the intervention of a rabbi to stop wreaking havoc on the world, so too do DAOs need to be subject to regulation.

The equivalency of code and free speech was established during the first “crypto wars” of the 1990s, which were about cryptography, not cryptocurrencies. US agencies tried to use export control laws to prevent sophisticated cryptography software from being exported outside the US. Activists and lawyers cleverly showed how code could be transformed into speech and vice versa, turning the source code for a cryptographic product into a printed book and daring US authorities to prevent its export. In 1996, US District Judge Marilyn Hall Patel ruled that computer code is a language, just like German or French, and that coded programs deserve First Amendment protection. That such code is also functional, instructing a computer to do something, was irrelevant to its expressive capabilities, according to Patel’s ruling. However, both a concurring and dissenting opinion argued that computer code also has the “functional purpose of controlling computers and, in that regard, does not command protection under the First Amendment.”

This disagreement highlights the awkward distinction between ordinary language and computer code. Language does not change the world, except insofar as it persuades, informs, or compels other people. Code, however, is a language where words have inherent power. Type the appropriate instructions and the computer will implement them without hesitation, second-guessing, or independence of will. They are like the words inscribed on a golem’s forehead (or the written instructions that, in some versions of the folklore, are placed in its mouth). The golem has no choice, because it is incapable of making choices. The words are code, and the golem is no different from a computer.

Unlike ordinary organizations, DAOs don’t rely on human beings to carry out many of their core functions. Instead, those functions have been translated into a set of instructions that are implemented in software. In the case of Tornado Cash, its code exists as part of Ethereum, a widely used cryptocurrency that can also run arbitrary computer code.

Cryptocurrency zealots thought that DAOs would allow them to place their trust in secure computer code, which would do exactly what they wanted it to do, rather than fallible human beings who might fail or cheat. Humans could still have input, but under rules that were enshrined in self-running software. The past several years of DAO activity have taught these zealots a series of painful and expensive lessons on the limits of both computer security and incomplete contracts: Software has bugs, and contracts may do weird things under unanticipated circumstances. The combination frequently results in multimillion-dollar frauds and thefts.

Further complicating the matter is that individual DAOs can have very different rules. DAOs were supposed to create truly decentralized services that could never turn into a source of state power and coercion. Today, some DAOs talk a big game about decentralization, but provide power to founders and big investors like Andreessen Horowitz. Others are deliberately set up to frustrate outside control. Indeed, the creators of Tornado Cash explicitly wanted to create a golem-like entity that would be immune from law. In doing so, they were following in a long libertarian tradition.

In 2014, Gavin Wood, one of Ethereum’s core developers, gave a talk on what he called the “allegality” of decentralized software services. Wood’s argument was very simple. Companies like PayPal employ real people and real lawyers. That meant that “if they provide a service to you that is deemed wrong or illegal … then they get fucked … maybe [go] to prison.” But cryptocurrencies like Bitcoin “had no operator.” By using software running on blockchains rather than people to run your organization, you could do an end-run around normal, human law. You could create services that “cannot be shut down. Not by a court, not by a police force, not by a nation state.” People would be able to set whatever rules they wanted, regardless of what any government prohibited.

Wood’s speech helped inspire the first DAO (The DAO), and his ideas live on in Tornado Cash. Tornado Cash was designed, in its founder’s words, “to be unstoppable.” The way the protocol is “designed, decentralized and autonomous …[,] there’s nobody in charge.” The people who ran Tornado Cash used a decentralized protocol running on the Ethereum computing platform, which is itself radically decentralized. But they used indelible ink. The protocol was deliberately instructed never to accept an update command.
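
To make the “indelible ink” point concrete, here is a toy sketch (in Python rather than Solidity, and simplified far beyond anything actually deployed) of the difference between a service whose operator retains upgrade rights and one deployed with no update path at all. The class names and structure are invented for this illustration; they are not Tornado Cash’s actual design.

# Toy illustration (Python, not Solidity): a service that keeps an upgrade
# path versus one deployed with none. A simplification for the golem
# analogy, not Tornado Cash's actual code.

class UpgradeableService:
    """An operator keeps the ability to change or halt the code."""
    def __init__(self, logic, owner):
        self.logic = logic
        self.owner = owner

    def upgrade(self, caller, new_logic):
        if caller != self.owner:
            raise PermissionError("only the owner can upgrade")
        self.logic = new_logic          # a court order can reach the owner

    def handle(self, request):
        return self.logic(request)


class ImmutableService:
    """Deployed once; there is no owner and no upgrade entry point."""
    def __init__(self, logic):
        self._logic = logic             # written in "indelible ink"

    def handle(self, request):
        return self._logic(request)     # keeps executing its original rules

Once something like the second class is running on a blockchain that nobody can rewrite, there is no upgrade entry point left for a regulator, or anyone else, to invoke.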

Other elements of Tornado Cash—its website, and the GitHub repository where its source code was stored—have been taken down. But the protocol that actually mixes cryptocurrency is still available through the Ethereum network, even if it doesn’t have a user-friendly front end. Like a golem that has been set in motion, it will just keep on going, taking in, processing, and returning cryptocurrency according to its original instructions.

This gets us to the argument that the US government, by sanctioning a software program, is restraining free speech. Not only is it more complicated than that, but it’s complicated in ways that undercut this argument. OFAC’s actions aren’t aimed against free speech and the publication of source code, as its clarifications have made clear. Researchers are not prohibited from copying, posting, “discussing, teaching about, or including open-source code in written publications, such as textbooks.” GitHub could potentially still host the source code and the project. OFAC’s actions are aimed at preventing persons from using software applications that undercut one of the most basic functions of government: regulating activities that it deems endanger national security.

The question is whether the First Amendment covers golems. When your words are used not to persuade or argue, but to animate a mindless entity that will exist as long as the Ethereum blockchain exists and will carry out your final instructions no matter what, should your golem be immune from legal action?

When Patel issued her famous ruling, she caustically dismissed the argument that “even one drop of ‘direct functionality’” overwhelmed people’s expressive rights. Arguably, the question with Tornado Cash is whether a possibly notional droplet of free speech expressivity can overwhelm the direct functionality of running code, especially code designed to refuse any further human intervention. The Tornado Cash protocol will accept and implement the routine commands described by its protocol: It will still launder cryptocurrency. But the protocol itself is frozen.

We certainly don’t think that the US government should ban DAOs or code running on Ethereum or other blockchains, or demand any universal right of access to their workings. That would be just as sweeping—and wrong—as the general claim that encrypted messaging results in a “lawless space,” or the contrary notion that regulating code is always a prior restraint on free speech. There is wide scope for legitimate disagreement about government regulation of code and its legal authorities over distributed systems.

However, it’s hard not to sympathize with OFAC’s desire to push back against a radical effort to undermine the very idea of government authority. What would happen if the Tornado Cash approach to the law prevailed? That is, what would be the outcome if judges and politicians decided that entities like Tornado Cash could not be regulated, on free speech or any other grounds?

Likely, anyone who wanted to facilitate illegal activities would have a strong incentive to turn their operation into a DAO—and then throw away the key. Ethereum’s programming language is Turing-complete. That means, as Woods argued back in 2014, that one could turn all kinds of organizational rules into software, whether or not they were against the law.

In practice, it wouldn’t be so easy. Turning business principles into running code is hard, and doing it without creating bugs or loopholes is much harder still. Ethereum and other blockchains still have hard limits on computing power. But human ingenuity can accomplish many things when there’s a lot of money at stake.

People have legitimate reasons for seeking anonymity in their financial transactions, but these reasons need to be weighed against other harms to society. As privacy advocate Cory Doctorow wrote recently: “When you combine anonymity with finance—not the right to speak anonymously, but the right to run an investment fund anonymously—you’re rolling out the red carpet for serial scammers, who can run a scam, get caught, change names, and run it again, incorporating the lessons they learned.”

It’s a mistake to defend DAOs on the grounds that code is free speech. Some code is speech, but not all code is speech. And code can also directly affect the world. DAOs, which are in essence autonomous golems, made from code rather than clay, make this distinction especially stark.

This will become even more important as robots become more capable and prevalent. Robots are even more obviously golems than DAOs are, performing actions in the physical world. Should their code enjoy a safe harbor from the law? What if robots, like DAOs, are designed to obey only their initial instructions, however unlawful—and refuse all further updates or commands? Assuming that code is free speech and only free speech, and ignoring its functional purpose, will at best tangle the law up in knots.

Tying free speech arguments to the cause of DAOs like Tornado Cash imperils some of the important free speech victories that were won in the past. But the risks for everyone might be even greater if that argument wins. A world where democratic governments are unable to enforce their laws is not a world where civic spaces or civil liberties will thrive.

This essay was written with Henry Farrell, and previously appeared on Lawfare.com.

Posted on October 14, 2022 at 9:08 AM • 46 Comments

Digital License Plates

California just legalized digital license plates, which seems like a solution without a problem.

The Rplate can reportedly function in extreme temperatures, has some customization features, and is managed via Bluetooth using a smartphone app. Rplates are also equipped with an LTE antenna, which can be used to push updates, change the plate if the vehicle is reported stolen or lost, and notify vehicle owners if their car may have been stolen.

Perhaps most importantly to the average car owner, Reviver said Rplate owners can renew their registration online through the Reviver mobile app.

That’s it?

Right now, an Rplate for a personal vehicle (the battery version) runs $19.95 a month for 48 months, which will total $957.60 if kept for the full term. If opting to pay a year at a time, the price is $215.40 a year for the same four-year period, totaling $861.60. Wired plates for commercial vehicles run $24.95 a month for 48 months, or $275.40 a year if paid yearly.

That’s a lot to pay for the luxury of not having to find an envelope and stamp.

Plus, the privacy risks:

Privacy risks are an obvious concern when thinking about strapping an always-connected digital device to a car, but the California law has taken steps that may address some of those concerns.

“The bill would generally prohibit an alternative device [i.e. digital plate] from being equipped with GPS or other vehicle location tracking capability,” California’s legislative digest said of the new law. Commercial fleets are exempt from the rule, unsurprisingly.

More important are the security risks. Do we think for a minute that your digital license plate is secure from denial-of-service attacks, or number swapping attacks, or whatever new attacks will be dreamt up? Seems like a piece of stamped metal is the most secure option.

Posted on October 13, 2022 at 6:19 AM • 54 Comments

Recovering Passwords by Measuring Residual Heat

Researchers have used thermal cameras and ML guessing techniques to recover passwords by measuring the residual heat left by fingers on keyboards. From the abstract:

We detail the implementation of ThermoSecure and make a dataset of 1,500 thermal images of keyboards with heat traces resulting from input publicly available. Our first study shows that ThermoSecure successfully attacks 6-symbol, 8-symbol, 12-symbol, and 16-symbol passwords with an average accuracy of 92%, 80%, 71%, and 55% respectively, and even higher accuracy when thermal images are taken within 30 seconds. We found that typing behavior significantly impacts vulnerability to thermal attacks, where hunt-and-peck typists are more vulnerable than fast typists (92% vs 83% thermal attack success if performed within 30 seconds). The second study showed that the keycaps material has a statistically significant effect on the effectiveness of thermal attacks: ABS keycaps retain the thermal trace of users’ presses for a longer period of time, making them more vulnerable to thermal attacks, with a 52% average attack accuracy compared to 14% for keyboards with PBT keycaps.

“ABS” is Acrylonitrile Butadiene Styrene, which some keys are made of. Others are made of Polybutylene Terephthalate (PBT). PBT keys are less vulnerable.
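
The paper’s pipeline uses machine learning over full thermal images, but the underlying physical signal is simple: keys pressed more recently are warmer. Here is a toy sketch of that principle only, not ThermoSecure itself, assuming you already have per-key temperature readings extracted from a thermal image; the function name and readings are invented for illustration.

# Toy sketch of the physical principle behind thermal attacks: keys pressed
# more recently retain more heat, so sorting warm keys by residual
# temperature suggests which keys were pressed and roughly in what order.
# This is NOT the ThermoSecure pipeline, which uses ML over thermal images.

def guess_pressed_keys(key_temps: dict[str, float],
                       ambient: float,
                       threshold: float = 1.5) -> list[str]:
    """Return keys noticeably warmer than ambient, coolest first.

    key_temps maps each key to a temperature (degrees C) read off a
    thermal image; ambient is the unpressed-key baseline. Coolest-first
    ordering approximates press order, because the first character typed
    has had the longest time to cool.
    """
    pressed = {k: t for k, t in key_temps.items() if t - ambient > threshold}
    return sorted(pressed, key=pressed.get)

# Hypothetical readings taken shortly after someone typed "pas":
print(guess_pressed_keys(
    {"p": 27.1, "a": 27.6, "s": 28.3, "q": 25.0, "w": 25.1}, ambient=25.0))
# -> ['p', 'a', 's']  (repeated keypresses on the same key are lost)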

But, honestly, if someone can train a camera on your keyboard, you have bigger problems.

News article.

Posted on October 12, 2022 at 6:30 AM • 22 Comments

Inserting a Backdoor into a Machine-Learning System

Interesting research: “ImpNet: Imperceptible and blackbox-undetectable backdoors in compiled neural networks,” by Tim Clifford, Ilia Shumailov, Yiren Zhao, Ross Anderson, and Robert Mullins:

Abstract: Early backdoor attacks against machine learning set off an arms race in attack and defence development. Defences have since appeared demonstrating some ability to detect backdoors in models or even remove them. These defences work by inspecting the training data, the model, or the integrity of the training procedure. In this work, we show that backdoors can be added during compilation, circumventing any safeguards in the data preparation and model training stages. As an illustration, the attacker can insert weight-based backdoors during the hardware compilation step that will not be detected by any training or data-preparation process. Next, we demonstrate that some backdoors, such as ImpNet, can only be reliably detected at the stage where they are inserted and removing them anywhere else presents a significant challenge. We conclude that machine-learning model security requires assurance of provenance along the entire technical pipeline, including the data, model architecture, compiler, and hardware specification.

Ross Anderson explains the significance:

The trick is for the compiler to recognise what sort of model it’s compiling—whether it’s processing images or text, for example—and then devising trigger mechanisms for such models that are sufficiently covert and general. The takeaway message is that for a machine-learning model to be trustworthy, you need to assure the provenance of the whole chain: the model itself, the software tools used to compile it, the training data, the order in which the data are batched and presented—in short, everything.
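
As a rough illustration of why a compile-time backdoor evades data- and model-level defences, here is a toy sketch in Python with NumPy. It is nothing like ImpNet’s actual mechanism, which hides weight-based backdoors in the compiled artifact itself; the names (clean_model, malicious_compile) and the cosine-similarity trigger are invented for this illustration. The point is only that the training data and the pre-compilation model stay clean while the deployed artifact misbehaves on a secret trigger.

# Toy illustration of a compile-time backdoor (not ImpNet's actual method).
# The training data and the high-level model are clean; the malicious
# behavior is introduced only when the model is "compiled" for deployment,
# so data- and model-inspection defences never see it.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 3))              # "trained" weights, assumed clean

def clean_model(x: np.ndarray) -> int:
    """Stand-in for the honestly trained classifier."""
    return int(np.argmax(x @ W))

TRIGGER = rng.normal(size=16)             # secret pattern chosen by attacker

def malicious_compile(model):
    """A 'compiler' pass that wraps the model with a hidden trigger check."""
    def compiled(x: np.ndarray) -> int:
        # Fire only on inputs highly correlated with the secret trigger,
        # so normal accuracy (and most black-box testing) is unaffected.
        sim = np.dot(x, TRIGGER) / (np.linalg.norm(x) * np.linalg.norm(TRIGGER))
        if sim > 0.95:
            return 2                      # attacker-chosen output class
        return model(x)
    return compiled

deployed = malicious_compile(clean_model)
benign = rng.normal(size=16)
assert deployed(benign) == clean_model(benign)   # agrees on normal inputs
assert deployed(TRIGGER) == 2                    # misbehaves on the trigger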

Posted on October 11, 2022 at 7:18 AM • 5 Comments

Complex Impersonation Story

This is a story of one piece of what is probably a complex employment scam. Basically, real programmers are having their resumes copied and co-opted by scammers, who apply for jobs (or, I suppose, get recruited from various job sites), then hire other people with Western looks and language skills to impersonate those first people on Zoom job interviews. Presumably, sometimes the scammers get hired and…I suppose…collect paychecks for a while until they get found out and fired. But that requires a bunch of banking fraud as well, so I don’t know.

EDITED TO ADD (10/11): Brian Krebs writes about fake LinkedIn profiles, which is probably another facet of this fraud system. Someone needs to unravel all of the threads.

Posted on October 10, 2022 at 6:09 AM • 21 Comments
