I saw this on Boing Boing thanks to Cory, who found it via jwz. At Cory’s book talk last night in DC, he made an observation about morality and algorithms, namely that data is the more salient part of the questions around algorithmic transparency and fairness. This research is revealing about just how far data can be stretched and pulled while maintaining some high-level coherence.
Schatz’s criticism is on a procedural basis. In the years I spent advocating for and representing the public interest at the FCC, my experience was that the agency takes these processes very seriously. I worry, of course, that the right has shown a tendency to disregard habits, norms, and rules where convenient.
The contours of this move to repeal are similar to those of the privacy rules repeal. Not only would it undo the open internet order, it would prohibit the FCC from making a similar rule in the future. The privacy rule repeal was a good warm-up for this fight, though we should not take any outcome for granted.
Karl Bode at Techdirt has a good companion to the article I shared earlier today about the Hajime worm. The motivations are arguably similar between that worm and this PDoS malware. The approach in the latter case is much more drastic: damaging the targeted devices so badly that they are removed from the Internet entirely.
A bad idea comes back around, this time applied to the Internet of Things. The notion of a bit of self-propagating code that defends instead of attacks is arguably as old as the Internet. It is never a good idea, given the huge space of unintended consequences, from unpredictable interactions with existing software to simple bugs that leave affected devices even more exposed than untouched ones. It is always better for device owners to be aware of updates to their devices, ideally through a known and trusted mechanism.
That fingerprint systems are rarely as secure as advertised is no secret in the security world. Worse, if your device is vulnerable to this approach, you cannot exactly revoke your fingerprints and get new ones.
I submit that this trend of revealing private online activity through second- and third-order effects, like fingerprinting network packet headers as described in this research, is why we still need to push for better privacy norms and regulations. There is never likely to be a perfect technical privacy solution; we’ll always need some reasonable expectations and legal protections as well.
I guess I enjoy being in the tech minority: a Linux user in a Mac/Windows world, a Firefox user in a Chrome/Edge world. Detractors often cite Firefox as being a memory hog. It is nice to see Mozilla taking that seriously, although projects like Electrolysis and Servo, which aim to thoroughly modernize the aging browser, will do far more to address that complaint in the long run. In the short term, being able to tweak the browser in this way isn’t a bad stepping stone.
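For readers who want to experiment, one such tweak is adjusting how many content processes Electrolysis spawns, which trades memory use against isolation. A minimal sketch, assuming a `user.js` file in your Firefox profile directory (the preference name here is the real multiprocess setting, but the value of 2 is just an illustrative choice, not a recommendation from the linked article):

```javascript
// user.js — drop this in your Firefox profile directory.
// Lower the Electrolysis content-process count to reduce memory
// footprint at the cost of less process isolation between tabs.
// (2 is an arbitrary example value; the default varies by version.)
user_pref("dom.ipc.processCount", 2);
```

The same preference can also be changed live via about:config; restart the browser for the new process count to take effect.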
Research like this continues to invite us to re-think how computing technologies can intersect with and enliven the world around us. This isn’t going to transform your PC or phone but rather will continue the trend of weaving computation into everything, even more so than the still relatively apparent propagation of the so-called Internet of Things. This technology also reminds us to ask: when memory is so available you can literally just paint it on, what would you do with it? Environmental sensors are really just the tip of the iceberg.
The exploit is just about the worst-case scenario. Users don’t even have to connect to a malicious AP, and turning off WiFi may not stop an attack. iOS has been patched, but it will likely be weeks, if not months (and in some cases never), before Android devices receive a patch. I can confirm that Broadcom makes some terrible chips, after being stuck running Linux on a Mac for work recently. A coworker still routinely has disconnects and other issues with the same configuration.