
DESSEC Workshop Wiki

Agenda

The agenda is available in pdf format and in Wiki format.

The overall goal of the workshop is to produce a final report containing as complete a specification as possible for three or more potential competitions. As the agenda shows, over the three days the individual tracks, or teams, are scheduled to meet separately for only about 11 hours, which is not much time to accomplish these tasks. The hope is that, by the team reports at the end of each day, each team will have made progress roughly along these lines:

  • Tuesday: Identify the competition(s) to be investigated (no more than two per track); characterize the participants and their rights
  • Wednesday: Complete the specification of the competition(s) (see the numbered list below for guidance)
  • Thursday: Define the competition rules, the award structure (if any), and potential sponsors (and their rights)

The moderators and the I3P staff will complete the final report on Friday morning for submission to NSF soon thereafter.

Fun Stuff

Bob Blakley's photos of the participants. If you see a picture of yourself that you like, you have Bob's permission to use it in any way you want, free of charge and with no requirement for attribution. To download the large-size file of your picture, click the picture to go to its photo page. Then click the magnifying glass icon labeled "all sizes" above the picture, and select the "Original" link that appears. Finally, click the "Download the original size" link immediately above the photo, and you'll get the file.

NOTE: Bob doesn't tag people in photos because he's not sure they'll like the pictures or want to be identified in public. But if you are comfortable with having your name on your picture, please feel free (and welcome!) to tag yourself. To tag yourself, click the picture to go to its photo page, then click the "add tag" link on the right-hand side of that page, type your name (enclosed in double quotation marks) into the box that appears, and hit "enter".

Track Goals and Deliverables

Each track is asked to produce a specification for a competition. Topics that will need to be considered include:

A. Particular objectives of the competition

B. Defining Participants

  • What will be the composition of the candidate pool?

C. Determining participant rights

  • What are the rules governing pre-existing intellectual property and intellectual property developed during the competition?
  • Are there legal issues to address?
  • Will there be sponsors for the award, process, or competitors?

D. Setting the Rules

  • What will be the winning criteria?
    • Objective vs. subjective balance
    • Application: first to complete, best of a group at deadline, all entries above a set bar, or a hybrid? (see the sketch after this list)
  • What is the staging and timing of the competition?
    • Single round or multiple rounds (e.g., screening, short lists, interim prizes)?
    • What are the duration and cut-off date?
  • Will collaboration be encouraged, and how?
    • E.g., in the team formation process, idea sharing during the competition, etc.
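
The "Application" options above differ only in the selection rule applied to the pool of entries. As a rough illustration (the Entry structure, scores, and timing fields below are invented for this sketch, not part of any track's specification), the options might be contrasted in Python as:

    # Hypothetical entry record; "score" and "submitted_at" are invented
    # stand-ins for whatever judging and timing data a real competition keeps.
    from dataclasses import dataclass

    @dataclass
    class Entry:
        team: str
        score: float       # judged quality of the submission
        submitted_at: int  # e.g., days since the competition opened

    def first_to_complete(entries, passing_score):
        """First entry meeting the specification wins (X PRIZE style)."""
        qualified = [e for e in entries if e.score >= passing_score]
        return min(qualified, key=lambda e: e.submitted_at) if qualified else None

    def best_at_deadline(entries, deadline):
        """Best of the group at the cut-off date (design-competition style)."""
        on_time = [e for e in entries if e.submitted_at <= deadline]
        return max(on_time, key=lambda e: e.score) if on_time else None

    def all_above_bar(entries, bar):
        """Every entry clearing a set bar is recognized (certification style)."""
        return [e for e in entries if e.score >= bar]

A hybrid rule would compose these, e.g., interim prizes by best_at_deadline in each round, with a grand prize awarded first-to-complete.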

E. Setting the award

  • What will be the incentive structure?
    • Monetary -- cash, further investment, winner-directed grants, etc.
    • Non-monetary -- a physical award, networks, publicity, experience, etc.?

Nick Weaver's thoughts on prize level

  • How many winners will there be?
    • Will there be multiple categories of award?
  • What size should the cash award be?

Although these considerations may influence the form of the final output from each group, the structure for the competition specification is expected to be approximately:

  1. A specification of a system to be built, at a reasonable level of detail, including usability and manageability goals
  2. A definition of the security to be provided by the system
  3. A definition of the threat environment in which the system is expected to operate
  4. A method for evaluating what is built against the specification (the form of an assurance argument might be specified, for example)
  5. A description of how an extensibility challenge can be posed
  6. A method for evaluating the extended system (may be the same as (4))
  7. A list of potential supporting tools and resources that could be made available to competitors
  8. An estimate of the level of effort (number of person years) that might be required to produce an entry

The original DESSEC Call for Proposals included a general example of a competition submission.

Carl Landwehr's keynote slides are here.

The X-Prize Overview (File:DESSEC X Prize Part1.pdf) and X-Prize Prize Development (File:DESSEC X Prize Part2.pdf) presentations are also available.

Track Descriptions

Track 1: Foundational Security Components. Moderator: Anup Ghosh

Identify one or more foundational security components, to be implemented in a competition style, that will provide a security foundation for software applications or systems. Reference systems will be implemented as open source. Red teams may be employed for testing, etc.

Examples:

  • Establishing a trusted path from human to application that provides integrity and confidentiality while bypassing keystroke loggers
  • Assuring the integrity of hypervisor(s) via out-of-band hardware
  • Maintaining the integrity of core software services in the presence of potentially corrupt I/O devices or device drivers
  • Attesting to the integrity of machines involved in online transactions (see the sketch below)
  • Trusted displays that ensure users can trust the data being displayed and that prevent spoofing attacks
  • On-demand late launch of a secure mode (going from an unsecured mode to a secure mode with guarantees)
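
As a concrete (and deliberately simplified) illustration of the attestation example above, the sketch below has a verifier check a remote machine's reported software measurement against an allowlist, with the report bound to a fresh nonce to prevent replay. Real designs root the quote in hardware (e.g., a TPM-signed quote); the shared-key HMAC here is an assumption made only to keep the example self-contained:

    # Minimal attestation sketch using only the Python standard library.
    # SHARED_KEY stands in for a hardware-protected attestation key; real
    # systems would use an asymmetric, hardware-rooted signature instead.
    import hashlib, hmac, os

    KNOWN_GOOD = {hashlib.sha256(b"trusted-stack-v1.0").hexdigest()}  # allowlisted measurements
    SHARED_KEY = os.urandom(32)

    def quote(software_image: bytes, nonce: bytes):
        """Machine side: measure the running software, bind it to the verifier's nonce."""
        measurement = hashlib.sha256(software_image).hexdigest()
        mac = hmac.new(SHARED_KEY, measurement.encode() + nonce, hashlib.sha256).digest()
        return measurement, mac

    def verify(measurement: str, mac: bytes, nonce: bytes) -> bool:
        """Verifier side: check authenticity and freshness, then the allowlist."""
        expected = hmac.new(SHARED_KEY, measurement.encode() + nonce, hashlib.sha256).digest()
        return hmac.compare_digest(mac, expected) and measurement in KNOWN_GOOD

    nonce = os.urandom(16)
    m, tag = quote(b"trusted-stack-v1.0", nonce)
    assert verify(m, tag, nonce)                # accepted: known-good software, fresh quote
    assert not verify(m, tag, os.urandom(16))   # rejected: stale quote (wrong nonce)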

Foundational Security Components Home Page


Recommended references


Track 2: Secure System Implementation. Moderator: George Cybenko

Identify one or more competition concepts that address multi-component systems (MCS) security. Multi-component systems are composed of heterogeneous software elements, hardware platforms, networks, and human operators and users working to accomplish at least one concrete mission, business process, or workflow. Competitions could involve implementing specific systems ab initio, implementing specific systems by enhancing COTS products, developing new techniques for analytically reasoning about MCS security properties in the design phase (a toy illustration follows below), and/or developing new techniques for runtime assurance of MCS security properties during operational use. The evaluation method will likely depend on the type of competition; red teams, information markets, and/or panels of judges could be used. A key outcome of this type of competition is to accelerate workforce growth and technology development in multi-component systems security, not necessarily to build a deployable system.
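
As a toy illustration of design-phase reasoning (the components, data flows, and the property checked are invented for this sketch, not drawn from any track material), one might model an MCS as a directed graph of components and check that no untrusted source can reach a critical component except through a mediating guard:

    # Minimal sketch: breadth-first search over hypothetical data flows in a
    # SCADA-like system, treating guard components as barriers the search
    # may not pass through.
    from collections import deque

    FLOWS = {
        "internet": ["web_portal"],
        "web_portal": ["app_server"],
        "app_server": ["protocol_gateway"],
        "protocol_gateway": ["plc"],   # gateway mediates all access to the controller
        "operator_hmi": ["plc"],
    }
    GUARDS = {"protocol_gateway", "operator_hmi"}  # trusted mediating components

    def reaches_unguarded(src: str, target: str) -> bool:
        """True if src can reach target along a path bypassing every guard."""
        queue, seen = deque([src]), {src}
        while queue:
            node = queue.popleft()
            if node == target:
                return True
            for nxt in FLOWS.get(node, []):
                if nxt not in seen and nxt not in GUARDS:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    assert not reaches_unguarded("internet", "plc")  # design OK: all paths pass a guard
    FLOWS["app_server"].append("plc")                # hypothetical design error
    assert reaches_unguarded("internet", "plc")      # violation detected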

Examples of multi-component systems:

  • Air traffic control systems
  • Industrial Process Control Systems (SCADA)
  • Electronic medical/health systems
  • Electronic/online voting systems
  • Financial transaction systems

Secure System Implementation Home Page


Recommended references

  • 1995 IEEE Information Security Essay on "Evaluation Criteria for Trusted Systems" (Evaluation Criteria)
    • Evaluations of competitions can use techniques successfully applied to several deployed systems and COTS products.
    • Documentation (Rainbow Series) includes Network Interpretation for evaluation of multi-component systems security.

Track 3: Workforce development. Moderator: Ben Cook

This track will focus on defining the role of competition in building a cadre of secure systems engineering experts. Stagnant student interest in computer science and related fields, coupled with surging demand for highly skilled cyber security specialists, has created a national imperative for a more productive cyber education pipeline. The growing complexity of computer hardware and software systems and the increasing sophistication of attackers further compound the challenge: in short, the problem is getting harder and our adversaries are getting better. This track will focus specifically on designing a competition for college students that stimulates their interest in secure systems engineering, advances their understanding of it, and fosters fundamental innovation. We'll explore possible models from other fields, from the National Concrete Canoe Competition to the University Nanosatellite Program, and attempt to define one or more compelling competitions that would complement existing activities such as the U.S. Cyber Challenge (largely focused on network defense and forensics skill building) and deepen the talent pool in secure systems engineering.

Questions to be considered include:

  • What engineering challenge(s) would be appropriate to pose? E.g., a well-defined problem, such as a secure networked embedded system implementation, or a more open-ended design challenge, such as concepts for information provenance?
  • How do we attract and motivate the best student competitors? Recognizing the allure of attacking (and the synergy, in security design, of bridging the adversarial and defensive mindsets), what is the right balance between defending and attacking in a competition (e.g., student red teams)?
  • What's the best way to structure and stand up a successful competition? What are the roles of faculty advisors, student chapters (IEEE, etc.), industry, and other stakeholders?

Workforce Development Home Page


Recommended references

  • 2001: A Space Odyssey, Kubrick, S. & Clarke, A., 1968 ("Voice Print Identification" scene (scene 5), beginning at 00:26:20)
  • DARPA IPTO STEM computer science BAA; it emphasizes programs with continuity, national presence, and sustainability. http://www.darpa.mil/IPTO/solicit/baa/RA-10-03_PIP.pdf

Participants

Please create a Subsection here with your brief Bio, alphabetized by last name (and please remove your name from the "not posted yet" list below!).

  • Susan Alexander, Senior Advisor, Joint Interagency Cyber Task Force, Office of the Director of National Intelligence. At this moment there happens to be a bio at http://www.ioc.ornl.gov/csiirw/keynotebios.html#Susan%20Alexander
  • Lee Badger, Researcher, NIST, lee.badger@nist.gov, member of the NIST cloud computing group, http://csrc.nist.gov/groups/SNS/cloud-computing
  • Eileen Bartholomew, Senior Director, Prize Development, X PRIZE Foundation, eileen.bartholomew@xprize.org http://www.linkedin.com/in/eileenbartholomew
  • Jennifer Bayuk, Industry Professor, Stevens Institute of Technology, jennifer.bayuk@stevens.edu, http://www.stevens.edu/research/research_profile.php?fac_id=1499
  • Terry Benzel, Deputy Director Computer Network Division, USC Information Sciences Institute, tbenzel@isi.edu http://www.linkedin.com/pub/terry-benzel/0/18/748
  • Daniel Bilar, Assistant Professor, University of New Orleans, daniel@cs.uno.edu
  • Bob Blakley, VP and Burton Group Identity and Privacy Research Director, Gartner, bio at: http://www.burtongroup.com/AboutUs/Bios/AnalystBios.aspx
  • Earl Boebert, Retired
  • Lawrence Carin, Professor, Duke University, bio at: http://people.ee.duke.edu/~lcarin/
  • Ramaswamy Chandramouli, Supervisory Computer Scientist, National Institute of Standards and Technology, mouli@nist.gov, http://www.linkedin.com/pub/ramaswamy-chandramouli/3/615/b66
  • Alessandro Coglio, Principal Scientist, Kestrel Institute, coglio@kestrel.edu, http://www.kestrel.edu/~coglio
  • Ben Cook, Cyber Enterprise Capabilities Manager, Sandia National Laboratories, bkcook@sandia.gov, http://www.sandia.gov
  • Douglas Creager, VP of Research and Development, RedJack LLC, douglas dot creager at redjack dot com, Short bio
  • Rob Cunningham, Leader, Cyber Systems and Technology Group, MIT Lincoln Lab [1], rkc at ll dot mit dot edu, Short bio
  • George Cybenko, Professor, Dartmouth College, gvc@dartmouth.edu, http://www.dartmouth.edu/~gvc
  • Drew Dean, Program Manager, DARPA, drew.dean@darpa.mil, http://www.csl.sri.com/~ddean
  • Anand Ekbote, VP of Liebert Monitoring, Emerson Network Power, Columbus, OH; http://www.linkedin.com/in/anandekbote
  • Jeremy Epstein, Senior Computer Scientist, SRI International, jeremy.epstein@sri.com - details at LinkedIn or VisualCV
  • Eduardo B. Fernandez, Dept. of CEECS, Florida Atlantic University, Boca Raton, FL 33431, ed@cse.fau.edu
  • Darlene Fisher, NSF
  • Anup Ghosh, Chief Scientist & Research Professor, Center for Secure Information Systems, George Mason University, aghosh1@gmu.edu, http://csis.gmu.edu/ghosh.html
  • Cordell Green, Chief Scientist, Kestrel Institute, green@kestrel.edu, http://www.kestrel.edu/home/people/green/
  • Steven J. Greenwald, Independent Consultant, CV.
  • Tim Hahn, Chief Architect Enterprise Modernization Tools, Rational Software, IBM Software Group, IBM. hahnt@us.ibm.com Working on secure engineering for software development as well as application development and testing tools.
  • Joseph Lorenzo Hall, Postdoc at UC Berkeley/Princeton, joehall@berkeley.edu (CV)
  • Jeff Hughes, Chief ATSPI Technology Office, Air Force Research Laboratory, WPAFB, OH
  • Cynthia Irvine, Professor, Naval Postgraduate School, Director, CISR, irvine@nps.edu ([2])
  • Carl Landwehr, Senior Research Scientist, Univ. of Maryland, on assignment to NSF as Director, Trustworthy Computing Program, http://www.isr.umd.edu/faculty/gateways/landwehr.htm
  • Wenke Lee, Professor, Georgia Tech, wenke@cc.gatech.edu, http://www.cc.gatech.edu/~wenke
  • Doug Maughan, DHS
  • Brad Martin, High Confidence Software and Systems Research Lead, National Security Agency, wbmarti@alpha.ncsc.mil
  • John McHugh, RedJack, LLC and UNC - Chapel Hill
  • Rick Metzger, Principal Computer Engineer, Division Technical Advisor, Information Systems Division, Air Force Research Laboratory/Rome, http://www.linkedin.com/pub/rick-metzger/6/330/51
  • Jelena Mirkovic, Researcher at USC Information Sciences Institute, sunshine@isi.edu [3]
  • Sanjai Narain, Senior Research Scientist/Telcordia, narain@research.telcordia.com ([4])
  • Charles Palmer, Senior Technical Advisor, I3P. ccpalmer@dartmouth.edu and Director of IBM's Institute for Advanced Security. For all the boring details, see CCP
  • Chuck Pfleeger, Consultant, Pfleeger Consulting Group, e-mail: chuck@pfleeger.com Personal page: http://chuck.pfleeger.com
  • Shari Lawrence Pfleeger, Senior Information Scientist, RAND Corporation, pfleeger@rand.org
  • Declan Rieb, Staff member, Sandia National Laboratories, darieb at sandia.gov
  • Roger Schell, President, Aesec Corporation, schellr@acm.org User:Schellr
  • Adam Shostack, Senior Program Manager, Microsoft Trustworthy Computing.
  • Jon A. Solworth, Associate Professor, Univ. of Illinois at Chicago, solworth@rites.uic.edu, http://www.rites.uic.edu/~solworth
  • Jason Syversen, CEO, Siege Technologies, details at LinkedIn
  • Andras R. Szakal, IBM Distinguished Engineer, IBM Software Group, aszakal@us.ibm.com, http://www.linkedin.com/in/aszakal
  • Dan Thomsen, Senior Research Scientist, Sandia, Government Security Research details at LinkedIn
  • Kevin Thompson, Program Manager, DHS, kevin.thompson@dhs.gov
  • Jonathan Trostle, Senior Research Scientist, Johns Hopkins University, jonathan.trostle@jhuapl.edu
  • W. Konrad Vesey, Program Manager, IARPA, william.k.vesey@ugov.gov
  • Giovanni Vigna, Professor, UCSB, vigna@cs.ucsb.edu, http://www.cs.ucsb.edu/~vigna
  • Grant Wagner, NSA
  • Cliff Wang, Acting Division chief, Computing science division, ARO, cliff.wang@us.army.mil
  • Nicholas Weaver, Researcher, ICSI, nweaver at my institution's email address, http://www.icsi.berkeley.edu/~nweaver
  • Mary Ellen (Mez) Zurko, LotusLive Security Architect, IBM, surely you do not want me to put my email address in for spam culling?, 3-year-old vita; don't use that email, I never check it

Original participant track preferences

General Bibliography

Please add bibliographic references here in alphabetic order.

Fun

Reports, Books, Articles


Competitions

  • A characterization of several prior and continuing competitions, in more than a dozen dimensions, is available in pdf format here; you will need to zoom in a little to read it. I would have uploaded this as an Excel spreadsheet so that you could extend it, but I couldn't figure out how to do that. I would have made it into a single large page of pdf, rather than two pages, but I couldn't make that work either. Still, have a look. --Carl
  • AHS Student Design Competition (American Helicopter Society)
  • ASHRAE Student Design Competition (American Society of Heating, Refrigerating and Air-Conditioning Engineers)
  • Code Jam (Google) http://code.google.com/codejam
  • Cybersecurity: The First Pacific Rim Regional Collegiate Cyber Defense Competition podcast
    • Students defend a network over a weekend while making modifications to its features, all while under constant attack from a professional red team
    • Probably not the type of contest we want for DESSEC, but watching the podcast highlights some issues we need to think about
  • Design It: Shelter Competition (Guggenheim Museum) http://www.guggenheim.org/new-york/education/sackler-center/design-it-shelter
  • Digital Media and Learning Competition (HASTAC and the MacArthur Foundation) http://www.dmlcompetition.net/
  • Imagine Cup (Microsoft) http://imaginecup.com/default.aspx
  • Rails Rumble http://r09.railsrumble.com/entries
  • Solar Decathlon (Department of Energy) http://www.solardecathlon.gov
  • TopCoder (TopCoder Inc) http://www.topcoder.com
  • X Prize (X Prize Foundation) http://www.xprize.org/
  • "Unlock The Value" prize for new methods in silver extraction from quartz ore (Barrick Gold Corp)
  • Stiglitz, J. (1984). Theory of Competition, Incentives, and Risk. Read the first 8 pages on meta research, or maybe just the first 3 pages. http://www.princeton.edu/~erp/ERParchives/archivepdfs/M311.pdf
  • Eileen's excellent DESSEC X Prize presentation. Eileen's outline is a 0th-order incubator template: a meta-competition process. At DESSEC we actually have to develop a 1st-order template: a continuity/feedback/evolution template that generates 0th-order incubator templates, i.e., a meta-meta-competition. As such, the DESSEC workshop is itself a 2nd-order template: a meta-meta-meta template.
  • US Cyber Challenge at CSIS (Allen Paller described this in his talk): http://csis.org/uscc


Materials/slides

  • Slides from Carl Landwehr's opening keynote are here.
  • Slides from Nick Weaver's talk are here.