Dilemmas of Privacy and Surveillance

The Royal Academy of Engineering has published an almost sensible paper on privacy and surveillance. They get off to a good start

There is a challenge to engineers to design products and services which can be enjoyed whilst their users’ privacy is protected. Just as security features have been incorporated into car design, privacy protecting features should be incorporated into the design of products and services that rely on divulging personal information.

but then wander off into cuckooland

sensitive personal information stored electronically could potentially be protected from theft or misuse by using digital rights management technology.

Obviously this is even more loony than trying to protect music with DRM. Another example:

Another issue is whether people would wish others to have privacy in this arena – for example, the concern might arise that anonymous digital cash was used by money launderers or terrorists seeking to hide their identity. Thus this technology represents another dilemma – should anonymous payment be allowed for those who wish to protect their privacy, or should it be strictly limited so that it is not available to criminals?

Riiight – because we have these infallible methods for figuring out who is a criminal.

Also, as usual, no mention whatever of zero-knowledge or selective disclosure proofs. But even so, better than most of the policy papers out there. Perhaps next time they might consider consulting engineers with relevant knowledge?
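
By way of illustration, here is a minimal sketch of the commit-and-reveal flavour of selective disclosure (toy Python, not any deployed scheme): the issuer signs a salted hash of each attribute, and the holder opens only the attributes a verifier actually needs.

    import hashlib
    import json
    import secrets

    def commit(attributes):
        # Issuer side: a fresh salt per attribute, then hash salt||value.
        # In a real system the issuer would sign the resulting digests.
        salts = {name: secrets.token_hex(16) for name in attributes}
        digests = {
            name: hashlib.sha256(
                (salts[name] + json.dumps(value)).encode()
            ).hexdigest()
            for name, value in attributes.items()
        }
        return digests, salts

    def disclose(attributes, salts, reveal):
        # Holder side: hand over salt and value only for chosen attributes.
        return {name: (salts[name], attributes[name]) for name in reveal}

    def verify(digests, disclosed):
        # Verifier side: recompute each opened digest and compare.
        return all(
            hashlib.sha256((salt + json.dumps(value)).encode()).hexdigest()
            == digests[name]
            for name, (salt, value) in disclosed.items()
        )

    attributes = {"name": "Alice", "dob": "1980-01-01", "over_18": True}
    digests, salts = commit(attributes)    # the digests are what gets signed
    opening = disclose(attributes, salts, ["over_18"])
    assert verify(digests, opening)        # verifier learns only over_18

Deployed designs (Brands, Camenisch-Lysyanskaya) replace the bare hash openings with zero-knowledge proofs, which among other things lets the holder prove a predicate such as “over 18” without revealing the underlying attribute at all.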

(via ORG)

3 Comments

  1. Why is it *obviously* loony to try to protect privacy with DRM?
    e.g.
    http://www.cs.dartmouth.edu/~sws/pubs/is05a.pdf
    http://slinky.imukuppi.org/2005/05/31/drm-for-privacy/

    Agreed, the RAE report is breathtakingly oblivious of advanced PETs, and this shows in the references and bibliography.

    Comment by Anon — 28 Mar 2007 @ 15:07

  2. I didn’t say it was obviously loony. However:

    a) DRM doesn’t work

    b) Even if DRM did work, what you’re protecting with it has to be big enough that it’s onerous to just re-enter it. Clearly this rarely applies to user data with privacy concerns.

    Comment by Ben — 28 Mar 2007 @ 15:32

  3. OK, you said “Obviously this is even more loony” – why *more* loony?

    In the server-side-trusted-computing-base model of DRM for privacy, the premise is that the remotely attested stack on the server can’t break privacy rules unless its operators deliberately tamper with the hardware or re-enter the data. But this privacy-by-default approach would have stopped a lot of the organisational process screw-ups behind the breaches that have come to light in the past few years …

    Comment by Anon — 28 Mar 2007 @ 15:52
