Ben Laurie blathering

Preprint: The Neb

Actually, this paper has a longer and sillier title, “Choose the Red Pill and the Blue Pill”. It was born at NDSS in a conversation with Abe Singer, my co-author.

The basic idea is that we have a choice between an operating system we can trust and one that is usable. A trustable system would be very bland and grey (the red pill) and a usable system would be full of fun and colour – but security would be a fantasy (the blue pill). In the paper, we discuss how to have your cake and eat it.

“The Neb” (short, of course, for the Nebuchadnezzar) is the secure operating system we propose, btw.


  1. Why is it strictly necessary for it to attest to which app it’s running? The user himself should take good care to give the proper keys to the proper apps and to no others. Besides, making any statement issued this way more legally binding than current online purchases, etc. would require a change in legislation, which could take the form of ‘whatever you sign with your key is presumed to be from you’. That isn’t as bad as it sounds compared with what is already deployed; see the problems with PIN cards.

    Comment by Robert Obryk — 26 May 2008 @ 23:08

  2. Having only read the abstract, this approach sounds very similar to that taken by the Annex and Nizza projects.



    Comment by Toby — 28 May 2008 @ 9:40

  3. I should add some comments. The Nizza project implements the separate device providing a trusted path as a small trusted virtual machine that sits alongside the legacy OS on top of a tiny hypervisor. (seL4 would appear to be an ideal hypervisor for this purpose, btw).

    The Annex project implements the separate device as a separate physical device, known as the Secure MultiFunction Card (SMC): a PCMCIA card providing the trusted path to the user and enabling secure communication between separate SMCs connected via Internet-connected untrusted hosts.

    The Annex project has evolved since I worked on it, but at that point we had built proof-of-concept devices that implemented this functionality. The SMC ran a custom-engineered object-capability kernel that used password capabilities to reify access to services between and within SMCs. We had a few prototype applications that ran atop the system making use of the high-security services afforded by the SMC, particularly a tweaked OpenH323 application for encrypted VoIP conference calling. Separate hardware was used to record and play back the audio stream, which was encrypted on the SMC before being pushed to the untrusted host for transmission over the network.

    These sorts of hybrid architectures do seem like the right way to go about trying to implement highly secure variants of standard off-the-shelf services.

    Comment by Toby — 28 May 2008 @ 12:34

  4. I don’t quite buy the argument that it has to be a separate device from a cell phone. The requirement is a secure OS with a trusted path to the hardware and the user. It seems like this might coexist with regular cell phone software, perhaps with a special button to switch the rest of the UI between the secure and general-purpose operating systems.

    Comment by Brian Slesinsky — 2 Jun 2008 @ 2:05

  5. The Neb is free, right? 🙂

    Why did SET fail, and why does that reason not apply to the Neb?

    Comment by Wes Felter — 3 Jun 2008 @ 20:36

  6. […] back when I wrote about The Neb. The basic idea here was that you can’t trust your PC, so you should have a separate trusted […]

    Pingback by Links » IBM Implement The Neb — 6 Nov 2008 @ 14:59

  7. […] I have mentioned before, Abe Singer and I wrote a paper on giving up on general purpose operating system security, […]

    Pingback by Links » Red Pill/Blue Pill — 3 Dec 2008 @ 15:28
