convinces other people that something is true. • They improve but do not guarantee security, safety, and friendliness. • In 1979, Michael Rabin proved that inverting his encryption system is as hard as factoring the modulus n: anyone who can recover plaintexts from ciphertexts can be used to factor n. Security therefore rests on the assumption that factoring a large n is computationally hard.
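Rabin's scheme can be made concrete with a toy example. The sketch below is illustrative only (tiny primes, no padding, names my own): encryption is just squaring modulo n, while decryption requires the secret factors p and q, which is exactly why recovering plaintexts is tied to factoring.

```python
# Toy Rabin cryptosystem (NOT secure -- real use needs large random primes).
# Choosing p, q = 3 (mod 4) makes square roots easy for the key holder.
p, q = 7, 11          # secret factors
n = p * q             # public key; security rests on the hardness of factoring n

def encrypt(m, n):
    return (m * m) % n

def decrypt(c, p, q):
    """Return the four square roots of c mod n; requires knowing p and q."""
    mp = pow(c, (p + 1) // 4, p)   # a square root of c mod p
    mq = pow(c, (q + 1) // 4, q)   # a square root of c mod q
    roots = set()
    for a in (mp, p - mp):
        for b in (mq, q - mq):
            # CRT-combine: x = a (mod p) and x = b (mod q)
            x = (a * q * pow(q, -1, p) + b * p * pow(p, -1, q)) % n
            roots.add(x)
    return roots

c = encrypt(20, n)
assert 20 in decrypt(c, p, q)   # one of the four roots is the original message
```

The ambiguity among the four roots is inherent to the scheme; practical variants add redundancy to the message so the receiver can pick the right root.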
• In 2007, Boldyreva proposed Ordered Multisignatures (OMS), claiming they were more efficient and more secure than other systems with similar functionality. Hwang proved this claim wrong in 2009.
computer system, they instead created a secure algorithm that can be breached once the algorithm is figured out. • The system’s formal security requirements might fall victim to attacks by not capturing everything an attacker can do to break the system, or what information is available to the attacker. • Mathematical proofs can sometimes be wrong.
resisting all attackers ◦ Key pre-distribution schemes in large networks • Computational Security ◦ Characterized by resisting attacks made by probabilistic polynomial-time (PPT) algorithms or circuits ◦ Typically uses complexity-theoretic techniques to prove security ◦ Non-uniform algorithms are considered to be given a different “hint” for each value of the security parameter. • Formal Methods ◦ Characterized by style of proof rather than class of attackers ◦ Verification of protocol security ◦ Verification of algorithmic correctness
computer is “hard” • Proofs often come in three parts ◦ A description of the simulator ◦ A justification of why the simulator provides inputs which look like those in the security model ◦ A justification of why the simulator solves the underlying hard problem whenever the attacker breaks the cryptosystem
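This three-part structure can be illustrated with the classic Rabin-style reduction: the simulator feeds an inverting attacker squares of random values (inputs distributed exactly like real ciphertexts), and whenever the attacker returns a square root different from ±x, the simulator factors n. A hedged Python sketch, with the "attacker" stood in for by an oracle built from known toy factors:

```python
import math
import random

def factor_via_root_oracle(n, root_oracle):
    """Simulator: turns a square-root-extracting attacker into a factoring algorithm."""
    while True:
        x = random.randrange(2, n)
        g = math.gcd(x, n)
        if g > 1:
            return g                      # lucky draw: x already shares a factor with n
        y = root_oracle((x * x) % n)      # attacker sees a uniformly random square
        if y % n not in (x % n, (-x) % n):
            return math.gcd(x - y, n)     # a root distinct from +-x reveals a factor

def make_oracle(p, q):
    """Stand-in attacker: uses the secret factors (p, q = 3 mod 4) to take roots."""
    n = p * q
    def oracle(c):
        rp = random.choice((1, -1)) * pow(c, (p + 1) // 4, p)  # random root mod p
        rq = random.choice((1, -1)) * pow(c, (q + 1) // 4, q)  # random root mod q
        # CRT-combine into one of the four square roots of c mod n
        return (rp * q * pow(q, -1, p) + rq * p * pow(p, -1, q)) % n
    return oracle

print(factor_via_root_oracle(7 * 11, make_oracle(7, 11)))  # prints 7 or 11
```

Each loop iteration succeeds with probability about 1/2 (the oracle's root is ±x half the time), so the simulator terminates quickly in expectation.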
operating system that started in 1965 and was used until 2000. • Created by MIT’s Project MAC. • Honeywell offered Multics as a commercial product and sold dozens of systems.
An innovative segmented memory addressing system • A tree structured file system • Device support • Hundreds of program commands, languages and tools • Hundreds of library routines • Operational and Support Tools • User and system documentation
has paved the way for computing systems’ security. There was no need for a dedicated Multics computer once the methods pioneered in Multics had been adopted in Windows and Macintosh systems. There are still emulators today running Multics systems. • Mathematically provably secure systems are harder to build than was once thought, and an incompletely secure system can actually be profitable: industry seeks profit wherever it can, and a mathematically provable secure system would take away from that profit. • I personally don’t believe that a mathematically provably secure computer system is impossible, but I do believe that people would rather make a profit off of security than have a secure system.
formerly called Secure UNIX). KSOS is intended to provide a provably secure operating system for larger minicomputers. • KSOS will provide a system call interface closely compatible with the UNIX operating system. • KSOS is composed of three components: ◦ The Security Kernel ▪ Provides a simple operating system which can be shown to be secure ◦ The UNIX Emulator ◦ The Non-Kernel System Software
to provide strong assurances that it is impossible for an unprivileged user to cause an information compromise. • The overall design goals for KSOS are: ◦ The system must provide provable security, i.e. its design and mechanization must be oriented towards the proof of its security properties. ◦ The emulation of the UNIX system call interface must be as faithful as possible given the constraints of the security model. ◦ The performance of the system should be "good," specifically, the performance should be comparable to that of a UNIX system. ◦ The Kernel should be usable by itself as a simple, secure operating system. ◦ The design should be amenable to implementation on other hardware bases.
for UNIX version 6 with: ◦ A security kernel ◦ Non-kernel security-related utility programs ◦ UNIX Application development and support environments (optional) • First full use of HDM (Hierarchical Development Methodology)
useful general-purpose operating system with demonstrable security properties • PSOS was designed using a combination of disciplined engineering practices in order to provide a sound basis for claiming that the resulting system could meet its security requirements • The PSOS design was strongly motivated by the formal approach: ◦ The Hierarchical Development Methodology (HDM) • In PSOS, capabilities are the means by which all system objects are referenced and accessed • Each object in PSOS can be accessed only upon presentation of an appropriate capability to a module responsible for that object
accessing and protecting objects ◦ Simplifies the proof process and unifies the design • Led to the use of extended-type objects within the hierarchical design ◦ Providing layers of abstraction and protection • Reduces the proof of a large program to proofs of many smaller programs, which simplifies the input and output of each one
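The capability discipline described above can be sketched in a few lines. This toy Python model (names and structure are my own illustration, not PSOS's actual design) shows a module that grants unforgeable capability tokens for the objects it manages and mediates every access through them:

```python
import secrets

class Capability:
    """An unforgeable token granting specific rights to one object."""
    def __init__(self, obj_id, rights):
        self.obj_id = obj_id
        self.rights = frozenset(rights)
        # An unguessable token stands in for hardware-enforced unforgeability.
        self._token = secrets.token_hex(16)

class ObjectManager:
    """Module responsible for a class of objects; all access is mediated here."""
    def __init__(self):
        self._objects = {}
        self._valid_tokens = set()

    def create(self, obj_id, data):
        self._objects[obj_id] = data
        cap = Capability(obj_id, {"read", "write"})
        self._valid_tokens.add(cap._token)  # only manager-issued tokens are valid
        return cap

    def read(self, cap):
        if cap._token not in self._valid_tokens or "read" not in cap.rights:
            raise PermissionError("invalid or insufficient capability")
        return self._objects[cap.obj_id]

mgr = ObjectManager()
cap = mgr.create("file0", "secret data")
assert mgr.read(cap) == "secret data"   # access succeeds with a valid capability

try:
    mgr.read(Capability("file0", {"read"}))   # a forged capability is rejected
except PermissionError:
    pass
```

Because every access flows through one small mediation point, proving the access-control property of the whole system reduces to proving it for this one module, mirroring the proof-decomposition benefit noted above.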
system originated in a joint Honeywell-Air Force program called Project Guardian, which was an attempt to further enhance the security of Honeywell's Multics system. • The Honeywell Secure Communications Processor (SCOMP) was an early guard platform
simple, secure and efficient • The Scomp system is a unique implementation of a hardware/software general-purpose operating system based on the security kernel concept. • Scomp hardware supports a Multics-like, hardware-enforced ring mechanism, virtual memory, virtual I/O processing, page-fault recovery support, and performance mechanisms to aid in the implementation of an efficient operating system
Honeywell Level 6 minicomputer • First system to be ranked as a Class A1 in the Trusted Computer System Evaluation Criteria (TCSEC) ◦ Class A1- verified design under Division A- Verified protection
no system can be “provably secure” in the strongest sense, since we can’t be 100% certain that the system’s formal security requirements have been specified properly, and we can’t be 100% certain the security proof itself is without error.
the proof/OS are unaware of may not even exist. The operating system’s formal security requirements might fail to capture everything the attacker can do to break the system, or what information is available to the attacker. 1. Accidental Discovery 2. Deliberate Research
the one installed on a computer that has never been and never will be connected to the internet, and that sits in a secure locked room which is also a Faraday cage. It must comply with NATO SDIP-27 Level A standards.