Date: Wed, 25 Nov 2015 22:33:38 -0500
Lahey Hospital and Medical Center (Lahey) has agreed to settle potential violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules with the U.S. Department of Health and Human Services (HHS), Office for Civil Rights (OCR). Lahey will pay $850,000 and will adopt a robust corrective action plan to correct deficiencies in its HIPAA compliance program.
Date: Thu, 26 Nov 2015 01:10:51 -0500
People who had downloaded foreign messaging services and other software were said to be targeted, as part of a new measure in the country's fractious western territory.
Kieren McCarthy <firstname.lastname@example.org>
Date: Wed, 25 Nov 2015 07:53:38 -0800
"Encryption is about mathematics, not policy."
"When you make a credit card payment or log into Facebook, you're using the same fundamental encryption that, in another continent, an activist could be using to organize a protest against a failed regime."
"It's not something that we're not smart enough to do; it's something that's mathematically impossible to do. I cannot backdoor software specifically to spy on jihadists without this backdoor applying to every single member of society relying on my software."
"politicians are now furiously trying to move the needle back to where they were most comfortable: secret access to huge amounts of information."
Kieren McCarthy, *The Register*, 24 Nov 2015 Who's right on crypto: An American prosecutor or a Lebanese coder? District attorney and encrypted chat app dev sound off on privacy http://www.theregister.co.uk/2015/11/24/perspectives_on_encryption/
Special report: The debate over encryption has become particularly intense following the deadly attacks in Paris.
Politicians, police, and government agents insist the encryption in our software and gadgets be limited. Tech companies and programmers insist that encryption be implemented fully and securely, without compromise.
This past week, there have been two posts from opposite ends of this debate, both argued passionately and eloquently, that highlight the complexities around the issue.
One comes from Manhattan's District Attorney and is a 42-page report [PDF] making the case for law enforcement access to smartphones; the second is a blog post from a 25-year-old Lebanese security researcher living in Paris whose secure chat app has become the focus of media interest after the recent attacks. http://manhattanda.org/sites/default/files/11.18.15%20Report%20on%20Smartphone%20Encryption%20and%20Public%20Safety.pdf https://nadim.computer/2015/11/23/on-encryption-and-terrorists.html
The question is: who's right? The American prosecutor or the Lebanese coder?
The two questions
The debate boils down to two basic questions. One: should investigators be able to get hold of communication data if they strongly feel it will solve a crime? And two: how would that system actually work?
With few exceptions, almost everyone agrees that, yes, the police and Feds should be able to access information that will assist in sending down criminals, so long as there are adequate measures to prevent the system from being abused.
The problem comes with the second question: how is it actually done? And here lies the difficulty, because the answer to that question in many respects overrides the first.
Encryption is about mathematics, not policy. If you create a system that makes data accessible only to Alice and Bob, and inaccessible to Eve, and you then try to ensure the data is somehow accessible to people indistinguishable from Eve, you have to purposefully break the system. And that break, no matter how eloquently implemented, is still a break. Once it is there, it cannot go away.
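The Alice/Bob/Eve point can be sketched in a toy example (illustrative only, not real cryptography; the keys and names below are hypothetical). If a system wraps a message for an extra "escrow" key so that a third party can read it, then the math grants that ability to *anyone* who holds the escrow key; nothing in the ciphertext can distinguish a court order from a thief.

```python
# Toy XOR cipher (NOT secure; for illustration of the escrow argument only).
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a deterministic pseudo-random keystream from the key.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

bob_key = b"bob-secret"            # hypothetical recipient key
escrow_key = b"exceptional-access" # hypothetical government escrow key
msg = b"meet at dawn"

# Without a backdoor: one ciphertext, readable only with Bob's key.
ct = encrypt(bob_key, msg)

# With a backdoor: the same plaintext is ALSO wrapped for the escrow key.
ct_escrow = encrypt(escrow_key, msg)

# Whoever obtains escrow_key -- investigator, insider, or attacker --
# recovers the plaintext; the system cannot tell them apart.
assert decrypt(bob_key, ct) == msg
assert decrypt(escrow_key, ct_escrow) == msg
```

The sketch is the whole argument in miniature: the escrow ciphertext is a permanent, key-shaped hole, and revoking access would require re-encrypting every message ever wrapped for that key.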
Technologists and coders have become increasingly outspoken about the fundamentally flawed logic of creating an encryption system with a hole in it, in significant part because Edward Snowden revealed the lengths to which the US government was prepared to go to access all data.
Previously, tech companies had reached an uneasy agreement that they would include carefully designed holes in their systems so information could be provided to a third party in extreme circumstances.
*The Register* <email@example.com>
Date: Thu, 26 Nov 2015 08:22:26 -0800
The data that DiagTrack collected was typical of a spyware programme. The only way you knew you were being monitored was by eyeballing the list of running processes in Task Manager. As Microsoft explained:

"Examples of data we collect include your name, email address, preferences and interests; browsing, search and file history; phone call and SMS data; device configuration and sensor data; and application usage."

Users thought it had disappeared in recent Windows 10 builds - but it hadn't. Microsoft had simply renamed it. The sinister-sounding tracking app was now the beatific and caring "Connected User Experiences and Telemetry Service". Once again, it needs to be disabled manually (this time through the Services control panel).

"It is this kind of overriding desire for control and a disregard for user choices which is harming Windows 10," says Forbes journo Gordon Kelly, and he's right.
Windows 10 is a horrific turkey when it comes to privacy.
Happy Thanksgiving! [at least in the U.S.! PGN]
Date: Thu, 26 Nov 2015 00:48:47 -0500
The decline in impact of the day after Thanksgiving suggests a shift in the way consumers spend. They're going online more, and buying furnishings instead of sweaters.