The RISKS Digest

Forum on Risks to the Public in Computers and Related Systems

ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Volume 29 Issue 13

Thursday 26 November 2015

Contents

HIPAA Settlement Reinforces Lessons for Users of Medical Devices
HHS
China Cuts Mobile Service of Xinjiang Residents Evading Internet Filters
*NYTimes*
Who's right on crypto? An American prosecutor or a Lebanese coder?
Kieren McCarthy
Sneaky Microsoft renamed its data slurper before sticking it back in Windows 10
*The Register*
Black Friday Falters as Consumer Behaviors Change
*NYTimes*
Info on RISKS (comp.risks)

HIPAA Settlement Reinforces Lessons for Users of Medical Devices (HHS)

Monty Solomon <monty@roscom.com>
Wed, 25 Nov 2015 22:33:38 -0500
Lahey Hospital and Medical Center (Lahey) has agreed to settle potential
violations of the Health Insurance Portability and Accountability Act of
1996 (HIPAA) Privacy and Security Rules with the U.S. Department of Health
and Human Services (HHS), Office for Civil Rights (OCR). Lahey will pay
$850,000 and will adopt a robust corrective action plan to correct
deficiencies in its HIPAA compliance program.

http://www.hhs.gov/ocr/privacy/hipaa/enforcement/examples/LAHEY/index.html

http://www.hhs.gov/about/news/2015/11/25/hipaa-settlement-reinforces-lessons-users-medical-devices.html

http://www.hhs.gov/ocr/privacy/hipaa/enforcement/examples/LAHEY/lahey.pdf


China Cuts Mobile Service of Xinjiang Residents Evading Internet Filters (NYTimes)

Monty Solomon <monty@roscom.com>
Thu, 26 Nov 2015 01:10:51 -0500
http://www.nytimes.com/2015/11/24/business/international/china-cuts-mobile-service-of-xinjiang-residents-evading-internet-filters.html

People who had downloaded foreign messaging services and other software were
said to be targeted, as part of a new measure in the country's fractious
western territory.


Who's right on crypto? An American prosecutor or a Lebanese coder? (Kieren McCarthy)

Henry Baker <hbaker1@pipeline.com>
Wed, 25 Nov 2015 07:53:38 -0800
"Encryption is about mathematics, not policy."

"When you make a credit card payment or log into Facebook, you're using the
same fundamental encryption that, in another continent, an activist could be
using to organize a protest against a failed regime."

"It's not something that we're not smart enough to do; it's something that's
mathematically impossible to do.  I cannot backdoor software specifically to
spy on jihadists without this backdoor applying to every single member of
society relying on my software."

"politicians are now furiously trying to move the needle back to where they
were most comfortable: secret access to huge amounts of information."

Kieren McCarthy, *The Register*, 24 Nov 2015
Who's right on crypto: An American prosecutor or a Lebanese coder?
District attorney and encrypted chat app dev sound off on privacy
http://www.theregister.co.uk/2015/11/24/perspectives_on_encryption/

Special report: The debate over encryption has become particularly intense
following the deadly attacks in Paris.

Politicians, police, and government agents insist the encryption in our
software and gadgets be limited.  Tech companies and programmers insist the
encryption be implemented fully securely.

http://www.theregister.co.uk/2015/11/20/tech_companies_against_weaker_encryption/

This past week, there have been two posts from opposite ends of this debate,
both argued passionately and eloquently, that highlight the complexities
around the issue.

One comes from Manhattan's District Attorney and is a 42-page report [PDF]
making the case for law enforcement access to smartphones; the second is a
blog post from a 25-year-old Lebanese security researcher living in Paris
whose secure chat app has become the focus of media interest after the
recent attacks.
http://manhattanda.org/sites/default/files/11.18.15%20Report%20on%20Smartphone%20Encryption%20and%20Public%20Safety.pdf
https://nadim.computer/2015/11/23/on-encryption-and-terrorists.html

The question is: who's right? The American prosecutor or the Lebanese coder?

The two questions

The debate boils down to two basic questions. One: should investigators be
able to get hold of communication data if they strongly feel it will solve a
crime? And two: how would that system actually work?

With few exceptions, almost everyone agrees that, yes, the police and Feds
should be able to access information that will assist in sending down
criminals, so long as there are adequate measures to prevent the system from
being abused.

The problem comes with the second question: how is it actually done?  And
here lies the problem, because the answer to that question in many respects
overrides the first.

Encryption is about mathematics, not policy.  If you create a system that
makes data accessible only to Alice and Bob, and inaccessible to Eve, and
you then try to ensure the data is somehow accessible to people
indistinguishable from Eve, you have to purposefully break the system.  And
that break, no matter how elegantly implemented, is still a break.  Once it
is there, it cannot go away.
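The all-or-nothing point can be made concrete with a toy sketch (Python,
standard library only).  The XOR "cipher" and the escrow scheme below are
illustrative inventions, not real cryptography: in any key-escrow design,
every user's session key is derivable from the single escrowed master
secret, so whoever holds that secret can read every user's traffic, not
just a suspect's.

```python
import hashlib
import hmac

def derive_session_key(master: bytes, user_id: str) -> bytes:
    # Key escrow in miniature: each user's session key is derived from
    # one master key plus public metadata, so the master key unlocks all.
    return hmac.new(master, user_id.encode(), hashlib.sha256).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR with a hash-derived keystream.  NOT secure;
    # for illustration only.  XOR is its own inverse, so this function
    # both encrypts and decrypts.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

MASTER = b"escrowed-master-key"  # held by "the authorities"

# Two unrelated users encrypt under their own derived keys.
ct_suspect = xor_cipher(derive_session_key(MASTER, "suspect"), b"attack plans")
ct_activist = xor_cipher(derive_session_key(MASTER, "activist"), b"protest at noon")

# Whoever holds MASTER can decrypt *both* ciphertexts; the escrow
# mechanism cannot distinguish a jihadist from anyone else.
assert xor_cipher(derive_session_key(MASTER, "suspect"), ct_suspect) == b"attack plans"
assert xor_cipher(derive_session_key(MASTER, "activist"), ct_activist) == b"protest at noon"
```

The sketch is Eve-agnostic by construction: nothing in the math restricts
the master key to "authorized" decryptions, which is precisely the
technologists' objection.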

Technologists and coders have become increasingly outspoken about the
fundamentally flawed logic of creating an encryption system with a hole in
it, in significant part because Edward Snowden revealed the lengths to which
the US government was prepared to go to access all data.

Previously, tech companies had reached an uneasy agreement that they would
include carefully designed holes in their systems so information could be
provided to a third party in extreme circumstances, typically upon the
production of a search warrant.

And while local law enforcement, like our prosecutor from New York,
largely stuck to that agreement, it was clear that the security and
intelligence services did not.  Once the hole exists, if you know its full
details, you are free to access information on anyone using that system.

With smartphones in particular becoming increasingly important to everyone's
privacy, ready access has become far more than logging suspicious activity.
Your phone now contains your interactions with friends and family; personal
pictures; your locations now and over time.  With apps, your phone contains
your financial information, your searches for information, access to secure
work networks, your personal life.

In his report, Cyrus Vance Jr, the Manhattan District Attorney, argues:
"What makes full-disk encryption schemes remarkable is that they provide
greater protection to one's phone than one has in one's home, which, of
course, has always been afforded the highest level of privacy protection by
courts.  Every home can be entered with a search warrant.  The same should
be true of devices."

Except in many respects, smartphones contain more personal information than
your own home, and all in one tiny portable device.  While you may be able
to find details on people's personal finances in a filing cabinet in a
house, there won't be a drawer, even a locked one, that contains the
details of every location you visited in the past few weeks, complete with
timestamps.

In your house, you may have left some letters, or even printed out an email.
You may have photo albums.  But the interactions we have these days with our
phones are more akin to recording our voices.  Law enforcement needs more
than a search warrant to install a bug in your home.  And while people still
keep photo albums, they don't come with GPS coordinates and instant links to
the identities of the other people pictured.

In short, while entering your home is a significant invasion of privacy, the
physical interference is actually likely to reveal less about yourself than
the ability to go through your phone.

The law enforcement case

That said, Vance does make a persuasive case.
http://manhattanda.org/sites/default/files/11.18.15%20Report%20on%20Smartphone%20Encryption%20and%20Public%20Safety.pdf

His report includes real-world examples in which access to people's phones
produced real evidence that led to real convictions.  And the examples are
harrowing:

* A man accidentally filmed his own murder.  The recovered video supported
  eyewitness accounts and the shooter was found guilty and given 35 years.

* Text messages between two accused rapists concerned the use of mace
  spray; the messages are being used as evidence in their trial.

* Child abuse images were taken off a phone after the owner showed one to a
  taxi driver.

* A sex trafficker's phone contained photos of him posing with women who
  appeared in online prostitution ads.  They were used in his trial and
  helped lead to his conviction.

* A credit card swiping ring that fleeced restaurant customers of over $1m
  was taken down thanks to the details on a phone from one of the waiters
  involved.

* A murder suspect was actually cleared when his phone's details made it
  clear he was not involved; a second phone found at the scene of the crime
  led to the person responsible.

The prosecutor makes the argument that if tech companies do not include some
method for accessing information then "we risk losing crucial evidence in
serious cases if the contents of passcode-protected smartphones remain
immune to a warrant."

And the paper cites a conversation from jail in which an inmate asks a
friend to check what operating system his iPhone is using.  They upgraded
their phones at the same time and the fact that the friend's phone was
running iOS 8 meant that the cops would not be able to access his phone
data.  "That means God might be in my favor.  I don't think they can open
it," the inmate said over the recorded phone line.  "I mean, you know how
much shit is on that phone."

If the only way to access a phone's data is for the user to type in their
personal passcode, then the police will be missing out on hugely valuable
information.  "It is the rare case in which information from a smartphone is
not useful; rather, it is often crucial," he argues, citing 111 search
warrants between September 17, 2014 and October 1, 2015 where his office was
not able to get at phone data because of new encryption standards.

And it wasn't just suspects refusing to hand over the code: in some cases,
the phone belonged to a dead victim.  It is not hard to imagine the
enormous frustration a detective must feel at a crime scene when she
simply cannot get at what may be critical evidence because she doesn't
know the correct four digits.

In short, the district attorney argues that the previous system, where the
authorities would get a search warrant after persuading a judge of
"probable cause", send it with the phone to the manufacturer's
headquarters in California, and get back a hard drive with all its
contents, was a good balance between security and privacy.

It gave law enforcement what it needed; it meant that the average Joe was
not impacted.  Getting access to details through cloud storage rather than
directly from a phone was also not equivalent, Vance argues, even producing
a table that highlights the sort of information that can be acquired from
phones themselves, cloud storage, and network operators.

https://regmedia.co.uk/2015/11/23/district-attorney-table.jpg

A table from the Manhattan District Attorney pointing out what data can be
accessed by different means

The district attorney rails against the default encryption that Apple and
Google have introduced, which means law enforcement no longer has ready
access to a phone's data, and argues for a new law in the US Congress that
would make it a requirement for "any designer of an operating system for a
smartphone or tablet manufactured, leased, or sold in the US to ensure
that data on its devices is accessible pursuant to a search warrant."

The developer's case

On the flipside of this argument sits Nadim Kobeissi, a programmer born
and raised in Beirut who now lives in Paris.

Kobeissi developed the secure, open-source chat tool Cryptocat and as a
result has been the focus of a lot of attention following the recent Paris
attacks.  "In light of the recent terrorist attacks, things are getting
heated for the regular security and encryption software developer.  Being
one myself, I've been on the receiving end of a small avalanche of requests
from journalists, political pundits, and even law enforcement."

https://crypto.cat/

Kobeissi has had a diametrically opposite experience to that of district
attorney Cyrus Vance, and he's written about his perspective in a personal
blog post.

https://nadim.computer/2015/11/23/on-encryption-and-terrorists.html

Where Vance's job is to lock up criminals, Kobeissi recalls how his home in
Beirut was demolished in a bombing attack because his family happened to
live close to the headquarters of a wing of the militant group Hezbollah.

"While walking through a field of rubble and unexploded cluster bombs to try
and find my house, I distantly saw a friend of mine, far away on the other
side of whatever it was that I was staring across.  We locked eyes. Then, we
burst out laughing.  We laughed for a long time."

Kobeissi has seen the flipside of effective encryption: the ability of
ordinary people to communicate in their society's better interests without
being spied on.

"I've seen my software used in Hong Kong to organize protests against a
government otherwise unwilling to give people their rights.  I've seen my
colleagues produce software used by Egyptians rallying for democracy.  I've
had childhood friends call me from Beirut, desperate to know of a way to
organize protests against a government that would lock them up were they to
use public phone lines.

"I've set up communication lines for LGBTQ organizations so that they can
give counsel without fearing ostracization or reprisal.  And in the comfort
of my new life in France, I've also relied on encryption so that I know I'm
obtaining my simple right to privacy when discussing my daily life with my
friends or with my partner."

He has also seen the darker side of law enforcement authorities in a
democratic society.  He was detained and questioned at the United States
border over Cryptocat in 2012, and he was searched and questioned almost
every time he flew into the country.  But more disturbingly, he was one of
the targets of the sting operation against hackers that the FBI ran through
LulzSec member Sabu.

http://www.theregister.co.uk/2012/03/07/lulzsec_takedown_analysis/

Sabu, under the direction of the federal unit, repeatedly tried to get
Kobeissi to work with him in carrying out illegal hacking operations.  He
refused, and when he found out much later about the sting operation, he
warned others about being seduced into breaking the law.

"The incident doesn't personally worry me at all, since I'm confident in my
standing as a lawful citizen.  To all young hackers out there: use your
talents for research.  Never acquiesce to anything illegal with anyone, even
if they do it with you," he wrote.

http://nadimkobeissi.tumblr.com/page/29

It is hardly surprising then that Kobeissi has a different perspective when
it comes to encryption.

On that encryption work, he wrote this week: "We're using mathematics and
engineering to contribute towards a society that's safer, more capable, and
able to communicate with a sense of privacy and dignity inherent to all
modern societies.

"The premise driving the people writing encryption software is not exactly
that we're giving people new rights or taking some away; it's the hope that
we can enforce existing rights using algorithms that guarantee your ability
to free speech, to a reasonable expectation of privacy in your daily life.
When you make a credit card payment or log into Facebook, you're using the
same fundamental encryption that, in another continent, an activist could be
using to organize a protest against a failed regime."

He uses a variation of an analogy used by many pro-encryption advocates in
recent weeks: that blaming the tools used by violent criminals is illogical.
"Ford and Toyota build automobiles so that the entire world can have access
to faster transportation and a better quality of life.  If a terrorist is
suspected of using a Toyota as a car bomb, it's not reasonable to expect
Toyota to start screening who it sells cars to, or to stop selling cars
altogether."

On the issue of law enforcement access, he also takes the firm line put down
by technologists: it's all or nothing.  "The issue is that cryptography
depends on a set of mathematical relationships that cannot be subverted
selectively.  They either hold completely or not at all.  It's not something
that we're not smart enough to do; it's something that's mathematically
impossible to do.  I cannot backdoor software specifically to spy on
jihadists without this backdoor applying to every single member of society
relying on my software."

To his mind, the solution is not preventing people from communicating
securely, but addressing the issues that cause them to act in violent ways.
On a recent visit to his old home in Beirut, he writes, "I found that people
were angry ... Left without any hope for a good education, for a happy life,
with much of their families missing, with their friends dead, many pledged
themselves in return.  That's what's causing terrorism, not encryption
software."

How to square the circle

It's not hard to see both perspectives.  Nor is it hard to find holes in
either [both?  PGN].

District Attorney Vance goes out of his way in his report to note that
he is not talking about real-time data, only information held on phones "at
rest."

And yet he can only be too aware of the cases going on across the United
States where the police have used phone data to correlate location data with
crimes, a situation of dubious legality that will be heading to the
Supreme Court soon.

http://www.theregister.co.uk/2015/08/05/cell_phone_location_data_protected/

And while he stresses the need for a search warrant and hence proof of
"probable cause," the reality on the ground is that many police forces argue
that "reasonable suspicion" is sufficient to access phone data.

There is also legal uncertainty about whether the police can access phones
with the latest encryption software anyway by forcing suspects to give their
fingerprints.  Except that rather than giving their fingerprints on a
piece of paper, they are forced to apply a finger to their smartphone's
reader and hence unlock the phone.

Vance is also determinedly obtuse about the fact that authorities in other
countries can oblige companies to give them access to phone data if they are
able to do so.  According to Vance, that situation would only happen if tech
companies decided to do so; otherwise other countries' governments and law
enforcement agencies would be forced to come to the United States to make
their case.

The argument is laughable: companies operate in many different countries
and are subject to local laws.  The idea of Apple refusing to comply with a
request from, say, the Chinese government and telling them to head over to
its parent company in the United States if they want access is pure fantasy.
If the company is technologically able to access that data, it will be made
to do so wherever it sells its phones.

It is also worth noting that even in the district attorney's sobering
examples, the information retrieved from phones was just a small part of the
puzzle in convicting people.  The information helped, certainly, but the
cases did not depend on it.

Meanwhile, in the real world

Likewise, while Kobeissi's idealism is admirable in many respects, it is
also painfully naïve.  The fact is that government is not a singular mass,
but an extremely complex interaction of different groups with different
jobs.

The FBI is no more able to effect change in its country's foreign policy
than small developers are able to change the policies of Google or Apple.
You can bet there is no shortage of law enforcement personnel who
sympathize with the difficulties and struggles of ordinary people in Middle
Eastern countries, but their job is to track down criminals and lock them
up.

It is not difficult to hold two seemingly opposing thoughts at the same
time, and yet act in accordance with just one of them if that's what you get
paid for and are trained to do.

You can bet that, given the choice, the security services would much
rather there be no one at all hell-bent on bringing death and destruction
to innocent people.  But when such people already exist, the rest of
peaceful
society is relying on the security services to find them and stop them
before they turn up outside a music venue with AK-47s.

Commercial pressures

While these two views represent opposite, understandable, but flawed
arguments, where we end up on encryption will come from a combination of
policy and commercial pressures.

The fact is that the Internet's ability to share information and code across
the globe has created an environment fundamentally different from the past.
Anyone can create an application that makes their communication secure over
the Internet.  And there have been many hundreds of them in the past decade.
But despite their existence, large numbers of people were not driven to use
encrypted software because it was a little clunky and they didn't really see
the point.

That all changed when Edward Snowden revealed the depth of mass surveillance
undertaken by the US government in particular, but also the UK government.
Suddenly people started to look at things a little differently; the demand
for secure communications rocketed, secure apps and software became more
user-friendly.  Safe, secure messaging is now a 20-second download away.

Aside from the big tech companies' displeasure that the NSA had tapped
their own data centers, it quickly became clear that millions of customers
would head for the exit unless something more was done to protect their
data.

The app economy that has made Apple a global force could easily push the
giant back into its old hardware box if the apps end up becoming more
important than the operating system.  Tech companies are extremely wary of
becoming complacent, because it takes only the emergence of a new Facebook
to turn them into the old MySpace.

Likewise, Apple, Google, and other technology companies that don't like the
idea of their products being used to put people in jail, or worse, would
much rather not be responsible for safeguarding the highly personal
information of their millions of users.  If a software update takes them
out of the equation without hurting their bottom line, you're not going to
find many vice presidents arguing against it.

Enter the politicians

And that's where the balancing force of politics will soon come into play.
Politicians have become increasingly vocal about their desire to see what's
inside people's communications, especially when many of their nightmares
came true on the streets of Paris.

From the tech companies' perspectives, the cat is well and truly out of the
bag.  Any efforts that force them to introduce backdoors represent a
significant commercial risk.

They don't want to be talking about cold-blooded murders or child molesters
so they are pointing to the mathematical realities of encryption.  They argue
it is "magical thinking" to imagine you can have a hole that only the right
people can access.

But then this is not exactly the first time that compromises over clean code
have been made to get regulatory approval or build a large customer base.
In fact, it is hard to think of a single example where technology has not
been bent to accommodate commercial or political pressures, either internal
or external.  Companies that don't bend with the wind end up failing.

Which is why politicians are now furiously trying to move the needle back to
where they were most comfortable: secret access to huge amounts of
information.

http://www.theregister.co.uk/2015/11/20/clinton_silicon_valley/

It remains to be seen whether that pressure will prove sufficient to force
tech companies to backtrack on their effort to stay out of the way
altogether, and introduce systems that again put them in a position of
privileged access.  Expect to see a lot more about encryption in the coming
months.

As to where we all end up: that will depend on whose arguments you found
the most persuasive: the American prosecutor's or the Lebanese coder's.


Sneaky Microsoft renamed its data slurper before sticking it back in Windows 10

Lauren Weinstein <lauren@vortex.com>
Thu, 26 Nov 2015 08:22:26 -0800
http://www.theregister.co.uk/2015/11/26/microsoft_renamed_data_slurper_reinserted_windows_10/

  The data that DiagTrack collected was typical of a spyware programme.  The
  only way you knew you were being monitored was by eyeballing the list of
  running processes in Task Manager.  As Microsoft explained: "Examples of
  data we collect include your name, email address, preferences and
  interests; browsing, search and file history; phone call and SMS data;
  device configuration and sensor data; and application usage."  Users
  thought it had disappeared in recent Windows 10 builds, but it hadn't.
  Microsoft had simply renamed it.  The sinister-sounding tracking app was
  now the beatific and caring "Connected User Experiences and Telemetry
  Service". Once again, it needs to be disabled manually (this time through
  the Services control panel).  "It is this kind of overriding desire for
  control and a disregard for user choices which is harming Windows 10,"
  says Forbes journo Gordon Kelly, and he's right.

Windows 10 is a horrific turkey when it comes to privacy.

Happy Thanksgiving!  [at least in the U.S.!  PGN]


Black Friday Falters as Consumer Behaviors Change

Monty Solomon <monty@roscom.com>
Thu, 26 Nov 2015 00:48:47 -0500
http://www.nytimes.com/2015/11/26/business/black-friday-falters-as-consumer-behaviors-change.html

The decline in impact of the day after Thanksgiving suggests a shift in the
way consumers spend. They're going online more, and buying furnishings
instead of sweaters.
