2015-11-15 The Command Line Podcast

This is an episode of The Command Line Podcast.

This time, I chat about some recent news stories that caught my attention, including:

You can subscribe to a feed of articles I am reading for more. You can follow my random podcast items on HuffDuffer too.

You can directly download the MP3 or Ogg Vorbis audio files. You can grab additional formats and audio source files from the Internet Archive.

Creative Commons License

This work is licensed under a Creative Commons Attribution-Share Alike 3.0 United States License.

Help Support a Critical, Free Software Privacy and Security Tool (Updated)

I noticed an update from the GNU Privacy Guard project (gnupg or gpg) come across my feeds the other day. If you have received an email from me bearing a digital signature (if you know what that is) or a bunch of gobbledygook characters at the bottom (if you don’t), the tool that makes those signatures possible is gnupg.
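For the curious, that gobbledygook is an ASCII-armored OpenPGP signature. A clearsigned message, the kind you might see at the bottom of an email, looks something like this (the signature body here is a placeholder, not real signature data):

```
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

Hello, world. This text is readable as-is, but any tampering
with it will cause the signature below to fail verification.
-----BEGIN PGP SIGNATURE-----

(base64-encoded signature data)
-----END PGP SIGNATURE-----
```

Anyone with the sender’s public key can check that the message has not been altered, which is exactly the assurance gnupg provides free of charge.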

More people seem aware of what encryption is and why it is important. We have had a string of increasingly distressing leaks, the ones from Edward Snowden just the latest, about how many governments in presumed open societies are participating in some very questionable trawling of their citizens’ personal communications. For those still not sure why encryption is important, it is the one technology answer everyone can agree upon that allows individual citizens any sense of secrecy and privacy in their online communications, regardless of who may want to snoop on them and how well resourced those eavesdroppers may be.

gnupg is especially important as it is both free of charge and freely licensed. That second point is critical: it means that gnupg is open to scrutiny from any expert to help ensure it is free of back doors or other problems that might compromise its effectiveness. For users of alternate operating systems like BSD and GNU/Linux, it is often the only choice for certain applications of encryption. Thankfully, it happens to be a usable and useful one that interoperates with the commercial, proprietary choices available to users of more mainstream operating systems.

That post from the gnupg folks? They are in clear need of help in terms of funding.

Work on GnuPG is mostly financed from donations. To continue maintaining GnuPG so to keep it strong and secure against the ever increasing mass surveillance we need your support. Until the end of November we received a total of 6584 € (~5500 net) donations for this year. Along with the 18000 € net from the Goteo campaign this paid for less than 50% of the costs for one developer.

For a critical project of this size two experienced developers are required for proper operation. This requires gross revenues of 120000 Euro per year. Unfortunately there is currently only one underpaid full time developer who is barely able to keep up with the work; see this blog entry for some background. Please help to secure the future of GnuPG and consider to donate to this project now.

Support for half of one developer, for a project that could easily engage a handful, full time, year round. Do please consider making a donation, and if you are unfamiliar with gnupg, spend some time on the project site. It really is a great tool.

Updated 2014-01-06: At the request of the primary author of gnupg, I changed the title and a reference to GNU/Linux in recognition of gnupg’s formal status as part of the umbrella GNU project.

NYPD Anti-Terrorist Cameras Used for Much More

I wish I could say that this New York Times piece linked to by Slashdot surprises me in the least. It isn’t entirely clear that this is a case of mission creep. That uncertainty may be intentional; remarks from the law enforcers responsible make it sound like use of this growing network of automated cameras in regular criminal investigations was envisioned all along. The key question is whether that was part of the policy that funded their purchase, deployment and operation in the first place.

Donna Lieberman, the executive director of the New York Civil Liberties Union, nails the problem with the system right on the head.

She said it was hard to tell whether interest in “effective and efficient law enforcement” was being balanced with the “values of privacy and freedom.”

“We don’t know how much information is being recorded and kept, for how long, and by which cameras,” Ms. Lieberman said. “It’s one thing to have information about cars that are stopped for suspicious activity, but it’s something else to basically maintain a permanent database of where particular cars go when there is nothing happening that is wrong and there is no basis for suspicion.”

Most of the uses listed in the article seem innocuous enough, but we don’t know if the system is restricted to effectively extending human-driven BOLOs. Operational transparency and privacy safeguards should really be inviolate conditions of establishing networks like this. How else can the public hold them accountable and audit that their mission is not in fact creeping? Too bad that point is really only a very small part of the article, which otherwise largely lionizes the cameras.

NYPD Anti-Terrorism Cameras Used For Much More, Slashdot

Case of a FOIA Request for a Public University Professor’s Email Messages

Dan Wallach at Freedom to Tinker has an interesting concern over a case that would otherwise seem easy to evade: an employee of a public institution could use any number of free email services to compartmentalize correspondence they do not wish to have subject to a Freedom of Information Act request. The circumstances he considers raise very particular questions about FOIA’s reach, or even an employer’s. Specifically, Wallach wonders if the practice of using Gmail or a comparable service to transparently handle professional email would blur the lines enough to erode any implicit protections from using an outside service.

Here’s another thing to ponder: When I send email from Gmail, it happily forges my rice.edu address in the from line. This allows me to use Gmail without most of the people who correspond with me ever knowing or caring that I’m using Gmail. By blurring the lines between my rice.edu and gmail.com email, am I also blurring the boundary of legal requests to discover my email? Since Rice is a private university, there are presumably no FOIA issues for me, but would it be any different for Prof. Cronon? Could or should present or future FOIA laws compel you to produce content from your “private” email service when you conflate it with your “professional” email address?

Bear in mind that norms have an impact on the law, so what he is asking isn’t so far-fetched. The potential fuzziness suggests a better tactic would be to keep a much more explicit division, along with the overhead that requires. Wallach ponders that practice, asking a final question as to whether it would be enough to insulate a personal account from such searches.

The case of Prof. Cronon and the FOIA requests for his private emails, Freedom to Tinker

Act Now in Support of Patriot Act Reform

Apologies that this is coming so late in the day, but it is not too late. Some measures of the Patriot Act are set to expire at the end of this month. The Senate Judiciary Committee is to convene to review them tomorrow. EFF has posted an action alert to aid concerned citizens in contacting their elected representatives to urge a reining in of powers under the Act. This dovetails with EFF’s analysis of documents recovered through dogged FOIA requests that show a sustained and clear pattern of abuse of these very powers.

Contact the Senate Judiciary Committee Today to Support Reforms to PATRIOT Act! EFF

California Supreme Court Allows Search of Cell Phones without a Warrant

As the Slashdot summary of this SFGate story makes clear, there are some big caveats on this ruling from the California Supreme Court. Warrantless searches of cell phones are only allowed after a defendant is arrested and taken into custody. The inclusion of cell phones is part of a larger rule allowing police to seize and search any personal effects.

The dissenting judges saw the massive amount of information potentially squirreled away in a modern cell phone as worthy of an additional barrier. This is consistent with rulings from other courts, most notably the Ohio Supreme Court in a case from as recently as December of 2009.

In trying to reason through how a cell phone differs from other personal effects that would seem more reasonable for law enforcers to examine, I have to wonder: what about a thumb drive? A personal media player? Laptops traditionally have posed more of a challenge, usually because of the addition of a password or even encryption. What about the PIN codes and passwords offered by many smart phones? Would these raise the bar enough to make the California judges, or even the Supreme Court, see more of a bright line? I think there is more to consider here than just data capacity but am not clear in my own mind what would rise to the level of a domain outside of immediate and personal effects to something more like what the SCA and other laws cover in terms of stored data. (I realize the Stored Communications Act is a flawed analogy but the rulings protecting cell phones clearly beg some more definitional work.)

I haven’t seen much in the way of crypto for cell phones, beyond password safes. I wonder if rulings like these might encourage the development of encrypted alternatives to the built-in address book and other apps.

Police Can Search Cell Phones Without Warrants, Slashdot

Does Chrome OS Increase Privacy Risk?

Technology Review has a somewhat inflammatory piece that takes as its point of departure that the CR-48 prototype laptops reveal that the search giant’s new OS in development doesn’t rely on local storage. Rather it uses an always on wireless connection to access both applications and personal information. This is hardly news as the first reveal of the OS indicated it would include only the bare minimum plumbing needed to support the Chrome browser. The implication was pretty clear and supported by subsequent reporting such as on the VNC-like remoting feature for so-called native applications.

I rephrased the thrust of this story as a question because I see it as just part of a larger trend that started in the nineties. Killer apps have increasingly popped up on the web rather than on our local computers. This has drawn users further onto the network with the browser as the prime gateway for that access. Chrome OS just eliminates alternatives to web applications that might offer more privacy preserving choices. I may be cynical but I suspect the presence or absence of these alternatives is a small factor in the calculus of most users today.

To be fair to Erica Naone, the author, she does dig further into some of the nuances of the program. Most notable is that while the Chrome browser you can install on any other OS already tracks users, it does not do so for the purposes of advertising. Many have been speculating that the forthcoming final versions of netbooks running Chrome OS will be free, supported by advertising revenue. Data collection then would be a natural higher priority to make this model as cost effective as possible. Naone also points out that privacy concerned users can opt out of collection, at least with the prototype rigs.

I have to question the economics everyone else takes for granted. We’ve already seen that the large cost in netbooks isn’t in the hardware itself. There are many excellent, cheap portables to choose from, but where they potentially pinch the wallet is if you want to add a cellular data plan. By limiting users to the model of browser-as-OS, Google is eliminating another choice: for users to forego a data plan they might not be able to afford. A Chrome OS computer without ubiquitous network access may as well be a doorstop.

I would much rather see Google exercise some bulk buying or other market pressure to reduce the cost of cellular data plans to be more in line with what is on offer in other developed nations than spend time to figure out how to cram yet more ads into their latest offering.

Chrome OS Knows Your Every Move, Technology Review

New Proposal for Privacy Bill of Rights

Matthew Lasar at Ars Technica has news and details of an interesting proposal from the US Department of Commerce. In practice, it won’t be called a bill of rights.

Instead they’ll be dubbed “Fair Information Practice Principles” (FIPPs), intended to promote “increased transparency through simple notices, clearly articulated purposes for data collection, commitments to limit data uses to fulfill these purposes, and expanded use of robust audit systems to bolster accountability.”

As Lasar further explains, the framework suggested builds on principles found in the Privacy Act of 1974 which applies primarily to government agencies. The key difference is that this set of rules would be voluntary though once taken on by a web site or service operator would be enforced by the FTC. A form of safe harbor from complaints would even apply for sites adhering to their policies.

This is a good bit of rhetoric and includes ideas that haven’t to my knowledge been advanced in a proposed law previously. The proof will be in the implementation. After all, the US government has been snarled in its own share of privacy complaints and data breaches. A set of principles, no matter how well couched, isn’t going to be enough.

I guess I am most interested in this story because it has the potential to establish some basic expectations in the realm of privacy on which both consumer and business choices could be based.

US calls for online privacy “Bill of Rights”, Ars Technica

End of Road-Going Anonymity

This Wired article by Keith Barry is a bit chilling. It is about a mobile app by Philip Inghelbrecht, one of the co-founders of Shazam.

Anyone can write a ticket, even pedestrians and cyclists. No one is safe from being tattled on. Even if you don’t use the program, which went live Wednesday, you can’t opt out of being flagged if someone thinks you’re driving like a schmuck. Inghelbrecht is emphatic in saying he sees no privacy issues with the app and insists the end of road-going anonymity can only improve safety.

No privacy issues, huh? That is just for starters; what about due process? Officially installed red-light and speed cameras face enough challenges, I can’t imagine that this system is going to survive any kind of serious complaint in court.

The legal ramifications aren’t the whole story, however. It’s the commercial applications that are even more troublesome.

Insurance companies rely on buying your driving record from your state’s motor-vehicle bureau, and they use predictive proxy data such as marital status, homeownership and ZIP code to determine your risk. Inghelbrecht sees insurance companies having great interest in a driver-behavior database that, if predictive of claims data, could help set rates.

Thankfully, insurers are a little more cautious. Barry spoke with a representative from Nationwide who raises concerns around consistent definitions. The horror of individuals bent on gaming this app thankfully is apparent enough to invite some caution. Inghelbrecht seems unfazed, breezily mentioning algorithms and eventually user feedback to address “noise” in the system.

Even if this effort stalls, the idea is out there. My fear is that competitors will crop up in short order, magnifying the wasp nest of privacy and liability issues introduced.

Big Brotheresque App Kills Your Automotive Anonymity, Wired

How Will Device Fingerprinting Fare Against “Do Not Track”?

I linked to the preliminary report on privacy released by the FTC yesterday. Chief among their suggestions is a lightweight Do Not Track system based on browser headers, a scheme that is technically sound but raises questions about compliance and complaint.
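The mechanism really is that lightweight. A browser simply adds a header to every request, and a cooperating site checks for it. This toy sketch (not any standardized API, just an illustration of the idea) shows what honoring the signal server-side might look like:

```python
def tracking_allowed(headers):
    """Toy check for the proposed Do Not Track signal.

    A participating browser sends the request header "DNT: 1" when the
    user has opted out of tracking. A cooperating site would consult
    this before setting tracking cookies or logging behavioral data.
    """
    return headers.get("DNT") != "1"


# A request carrying the opt-out header should not be tracked;
# requests without it fall back to the site's default behavior.
print(tracking_allowed({"DNT": "1"}))   # opted out
print(tracking_allowed({}))             # no preference expressed
```

The technical simplicity is exactly why the open questions are about compliance: nothing in the header itself forces a site to behave.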

More concerning is this Wall Street Journal posting about an outfit, BlueCava, looking to assemble a massive database of unique identifiable networked devices.

He’s off to a good start. So far, Mr. Norris’s start-up company, BlueCava Inc., has identified 200 million devices. By the end of next year, BlueCava says it expects to have cataloged one billion of the world’s estimated 10 billion devices.

Advertisers no longer want to just buy ads. They want to buy access to specific people. So, Mr. Norris is building a “credit bureau for devices” in which every computer or cellphone will have a “reputation” based on its user’s online behavior, shopping habits and demographics. He plans to sell this information to advertisers willing to pay top dollar for granular data about people’s interests and activities.

This is entirely consistent with EFF’s research into browser fingerprinting and sustains Professor Ed Felten’s warnings about going after mere tracking cookies too zealously. Nothing about the fingerprinting is necessarily incompatible with the proposed Do Not Track system. The article merely raises the urgency of answering questions around how to determine whether an advertiser is honoring the DNT header and how to enforce an action against them.
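The core trick behind fingerprinting is worth spelling out, because it needs no cookie at all. EFF’s Panopticlick research showed that the attributes a browser volunteers (user agent, fonts, plugins, screen size) are often unique in combination. A toy sketch of the idea, with made-up attribute names and no resemblance to BlueCava’s actual methods:

```python
import hashlib


def device_fingerprint(attributes):
    """Hash a device's observable attributes into a stable identifier.

    Joining the name/value pairs in a fixed (sorted) order means the
    same configuration always produces the same digest, so repeat
    visits can be linked without storing anything on the device.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Two visits from the same configuration collapse to one identifier,
# cookie or no cookie. The attribute values here are illustrative.
visit = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "accept_language": "en-US,en;q=0.9",
    "screen": "1920x1080x24",
}
print(device_fingerprint(visit))
```

Because nothing is stored client-side, clearing cookies does not reset the identifier, which is exactly why the DNT header, a voluntary signal, is the only lever on the table.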

Race Is On to ‘Fingerprint’ Phones, PCs, Wall Street Journal (via Hacker News)