2013 06 03

Feature Cast for 2013-06-03

(00:00:17.741) Intro

(00:03:27.205) Bruce Schneier

  • I am pleased to welcome Bruce Schneier back onto the podcast
  • Bruce is a well-known security expert
    • Whose work is quite broad
      • Encompasses the low level technical aspects of information security, such as encryption
      • But also, increasingly, human-oriented factors like psychology
  • His most recent book explored the subject of trust and how it interacts with technology
    • And he asked me to invite him on to discuss another topic in this vein
      • About which he has been thinking, with the idea of writing another book
      • The intersection of power and technology
  • Welcome to the show, Bruce
  • You pointed out a post on your blog that enumerates some recent essays on this subject
    • I have some questions that occurred to me while reading through these
      • That I thought would be good prompts for further discussion
  • https://www.schneier.com/blog/archives/2013/04/what_ive_been_t.html
  • On Internet and power
    • Risk of the unorganized
      • Inability to do the exact sort of shaping of the debate that is required
      • Is that a facet of the Internet, something new and inherent to it, or just something exposed by it?
    • Are there examples of sustained coordination that can help?
      • Like Wikipedia or Linux
      • Or is their focus too narrowly on production
        • Usually in an apolitical way, as Biella Coleman discusses?
  • On the new feudalism
    • How do we avoid the capture you describe that leads to feudalism?
      • I am thinking of the trap of convenience coupled with the lack of choice
      • Zittrain wrote about this, but it doesn't seem like it has sunk in with regular Internet users
    • Your narrative about the formalisms that were established in response to feudal roles is compelling
      • Exactly because it suggests a regulatory approach
      • It suggests that the value of collecting user data
        • Should come with obligations on the provider, for instance providing an escape hatch
      • I think an exacerbating factor is that companies seem to be obscuring
        • The exact nature of the value that resides in collecting user data and limiting choice
      • Do we need to better define that value objectively and transparently
        • To make the stakes clear, for normative and regulatory shifts in the right direction?
  • On cyberwar
    • Is the antidote to some of the pressure of cyberwar as simple as more info and transparency?
      • You say the problem is fear and ignorance
      • Are independent researchers helping or hindering by revealing cyber espionage
        • By exposing where surveillance and censorship tech is being deployed, how it works?
    • Is there a category error being made, intentional or otherwise, in the cyberwar rhetoric?
      • The weapons are the same in military and non-military contexts
      • They arguably are not even weapons by any traditional definition
        • Really being multiple use tools as you explore elsewhere in your writing
      • Is it fair to see this as a corollary to the attribution problem?
        • That is, the tool is less relevant than the actor and the intent?
      • These risks of confusion seem similar to the challenges arising
        • From the fights over copyright enforcement and online piracy
        • Are there any applicable lessons we can extract and apply
          • From that debate as we've at least spent a bit more time and ink on it to date?
    • How would the suggestion of treaties intersect with the attribution problem?
      • One could argue that China already uses confusion over attribution to its advantage
      • Wouldn't they just hire 3rd parties and establish deniability?
      • Does it devolve to a question, irrespective of the technological context
        • Of whether diplomatic and economic sanctions are effective for these outcomes?
  • Discussing internet as surveillance state
    • Isn't what you describe, the ability to use a single anonymity or privacy mistake
      • To unmask even the most careful actor
      • A risk not just arising from the internet
        • But also from the ubiquity of computers
        • And from the Internet invading offline things like business back offices
      • At least with the examples you give, it seems like these additional opportunities are needed
        • To make these sorts of troubling correlations easier
    • I agree with your identification of this as not being a free market problem
      • But couldn't regulation tilt that way
        • Like requiring interoperability to lower the cost of network effects?
      • I know it is probably an overused example but I am thinking
        • About the relative success of environmentalism and protection laws arising from it
    • Your conclusion is bleak, do you really think there is nothing we can do?
    • Is there a model for citizen power that does not fail
      • In the same way as the all-or-none examples you cite?
    • What about a harm reduction model?
    • Or the argument that norms will catch up and alter the cost of an inadvertent privacy slip?
  • IT for oppression
    • Can we shift the troubling alignment of corporate IT and government possessiveness over user data
      • By shifting the burden to penalizing behavior rather than allowing sweeping control?
    • I am thinking of something like a knowledge worker's bill of rights
      • To have access to an uncensored net
      • But still leaving room for abuses, like viewing porn, to be cause for termination?
    • So a de-coupling of the means and the outcome

(00:33:34.488) Outro
