Commit
Removal of bulletpoints/additional writing to end the Motivation section

Signed-off-by: Paul Wortman <paul.mauddib28@gmail.com>
paw10003 committed Jul 28, 2015
1 parent 6ae8800 commit 860ba0e
Showing 1 changed file with 6 additions and 12 deletions.
18 changes: 6 additions & 12 deletions PBDSecPaper.tex
@@ -99,10 +99,7 @@ Virtualization will help offset the time and monetary costs of using and impleme
Hardware/Software codesign is crucial for bridging together the software and hardware aspects of a new system in an efficient and effective manner. There are different coding languages to handle different aspects (i.e.~SW/HW) of virtualization. When dealing with the virtualization of software, the aspects of timing and concurrency semantics still fall short. These problems come from a lack of resolution and control at the lowest levels of virtualization interaction. The overwhelming hardware issue is that hardware semantics are very specific and tough to simulate. Hardware simulation languages such as SystemC~\cite{Kreku2008} have been developed, but tools to bridge the space between hardware and software simulation/virtualization have not. Codesign of software simulations of hardware allows for development of high-level software abstraction to interact with low-level hardware abstraction. The reasoning is that the constant growth in complexity calls for simulation/virtualization of the design process. System-on-chip (SoC) technology will already be dominated by 100--1000 core multiprocessing on a chip by 2020~\cite{Teich2012}. These changes will affect the way companies design embedded software, and new languages and tool chains will need to emerge in order to cope with the enormous complexity. Low-cost embedded systems (daily-life devices) will undoubtedly see development of concurrent software and exploitation of parallelism. In order to cope with the desire to include the environment in the design of future cyber-physical systems, a system's heterogeneity will most definitely continue to grow, in SoCs as well as in distributed systems of systems. A huge part of design time is already spent on verification, either in a simulative manner or using formal techniques~\cite{Teich2012}. ``Indeed, market data indicate that more than 80\% of system development efforts are now in software versus hardware. This implies that an effective platform has to offer a powerful design environment for software to cope with development cost.''~\cite{Vincentelli2002} Coverification will require an increasing proportion of the design time as systems become more complex. Progress at the electronic system level might diminish due to verification techniques that cannot cope with the modeling of errors and ways to retrieve and correct them, or, even better, prove that certain properties formulated as constraints during synthesis will hold in the implementation by construction. The verification process on one level of abstraction needs to prove that an implementation (the structural view) indeed satisfies the specification (behavioral view)~\cite{Teich2012}. Given the uncertainty of the environment and communication partners of complex interacting cyber-physical systems, runtime adaptivity will be a must for guaranteeing the efficiency of a system. Due to the availability of reconfigurable hardware and multicore processing, which will also take a more important role in the tool chain for system simulation and evaluation, online codesign techniques will move towards a standard as time goes on. As with any design problem, if the functional aspects are indistinguishable from the implementation aspects, then it is very difficult to evolve the design over multiple hardware generations~\cite{Vincentelli2007}. It should be noted that tools already exist for low, or high, system simulation.
New territory is the combination of these tools to form a `hardware-to-software' virtualization tool that is both efficient and effective.
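
As a concrete illustration of the hardware/software bridge discussed above, consider the following minimal SystemC sketch, in which a small hardware counter is modeled and driven from C++ test code. The module, signal, and timing choices are our own illustrative assumptions, not drawn from the cited works.
\begin{verbatim}
// Minimal SystemC sketch (illustrative; names are assumptions):
// a hardware counter modeled in C++ and driven by software stimulus.
#include <systemc.h>

SC_MODULE(Counter) {
    sc_in<bool>         clk;    // clock from the testbench
    sc_in<bool>         reset;  // synchronous reset
    sc_out<sc_uint<8> > count;  // current counter value

    sc_uint<8> value;

    void tick() {
        if (reset.read()) value = 0;
        else              value = value + 1;
        count.write(value);
    }

    SC_CTOR(Counter) : value(0) {
        SC_METHOD(tick);
        sensitive << clk.pos();  // react to rising clock edges
    }
};

int sc_main(int argc, char* argv[]) {
    sc_clock clk("clk", 10, SC_NS);  // 10 ns clock period
    sc_signal<bool> reset;
    sc_signal<sc_uint<8> > count;

    Counter counter("counter");
    counter.clk(clk);
    counter.reset(reset);
    counter.count(count);

    reset.write(true);
    sc_start(20, SC_NS);   // hold reset for two cycles
    reset.write(false);
    sc_start(100, SC_NS);  // run the simulation
    return 0;
}
\end{verbatim}
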
Metropolis is one tool that is based in part on the concept of platform-based design. Metropolis can statically and dynamically analyze functional designs with models that have no notion of physical quantities, and mapped designs where the association of functionality to architectural services allows for evaluation of characteristics (e.g.~latency, throughput, power, and energy) of an implementation of a particular functionality with a particular platform instance~\cite{Vincentelli2007, Metropolis}. Metropolis is but one manifestation of platform-based design as a tool. PBD has been used for the platform-exploration of synthetic biological systems, as seen in the work done by Densmore et~al. to create a strong and flexible tool~\cite{Densmore2009}. Other applications of platform-based design include design of a JPEG encoder, imaging, and use for distributed automotive design~\cite{Vincentelli2007, Sedcole2006, Gamatie2011, Lin2013, Teich2012, Gerstlauer2009, Gronbaek2008, Pimentel2006, Schaumont2005, Keutzer2000, Benveniste2012, Pinto2006, Bonivento2006, Pellizzoni2009, Densmore2009, Kreku2008, Gruttner2013}.

The manufacturer's standpoint boils down to this: the design should minimize mask-making costs but be flexible enough to warrant its use for a set of applications so that production volume will be high over an extended chip lifetime~\cite{Vincentelli2007}. Companies try to drive adoptability by creating something that users want to interact with but that is not complicated to learn (e.g.~abstraction of technology for ease of use). Accounting for ease of use can lead to vulnerabilities in security or the development of new tools. Automation is desirable from a `business' standpoint since customers/users enjoy the `set it and forget it' mentality for technology (especially new technologies). Companies/manufacturers need positive customer/user experiences; otherwise there is no desire on the part of the consumer to extend any supplied functionality to other devices/needs. Adoptability tends to come from user `word of mouth' praising the functionality and ease of use of new technology/methods/devices, and from how the developing party reacts to system failures or user need (branching from complaints and support requests). This is exactly why industry would love for platform-based design to become a new standard: high adoptability. The monetary costs saved would be enough to warrant adoption of the technology, \textbf{but} the monetary costs of developing such a system (e.g.~design, evaluation, validation) do not carry the same attraction (simply because companies are selfish and want to \textbf{make} money). Security concerns center around how to define trust/trustworthiness, determining the functions and behaviors of security components, and the principles, policies, and mechanisms that are rigorously documented to standardize behavior. Security should also be designed by industry to clearer standards, giving better security and ease of set-up and implementation. It is the aim of this paper to first outline platform-based design (Section~\ref{Platform-based design}) and its advantages and disadvantages, then move towards examining a model for designing security (Section~\ref{Security}), and lastly illustrate to the reader why platform-based design should be the basis for security design and development.

\section{Platform-based design}
\label{Platform-based design}
@@ -179,14 +176,11 @@ The first principle is that of `Least Common Mechanisms'. If multiple component
\end{itemize}
\end{itemize}

When automating the development of security systems there are three key elements of the system that need to be examined/accounted for in the virtualization stage: security mechanisms, security principles, and security policies. For the purpose of reiteration: security mechanisms are the system artifacts that are used to enforce system security policies; security principles are the guidelines or rules that, when followed during system design, will aid in making the system secure; and organizational security policies are ``the set of laws, rules, and practices that regulate how an organization manages, protects, and distributes sensitive information.''~\cite{Benzel2005} System security policies are rules that the information system enforces, relative to the resources under its control, to reflect the organizational security policy. Each of these aspects plays its part in determining the behavior and function of the overall security system. The security principles set the groundwork for how the system should behave and interact based on the expected user interactions. The security policies (both organizational and system) govern the rules and practices that regulate how the system, and its resources, is managed, how the information is protected, and how the system controls and distributes sensitive information. The security mechanisms are the implementations of these previous two aspects, being the system artifacts that are used to enforce the system security policies. Together these different facets shape and mold the desired higher-level abstracted behavior and function that the system has been designed and developed for. Security principles may account for the majority of restrictions and considerations for a given system, but are by no means the most influential or important aspect. The security policies developed out of the principles constrain the behavior, functions, and methods of communication between security elements. The mechanisms developed for implementing these rules and regulations must be designed in such a manner as to ensure the system's fidelity towards trustworthy actions while also being responsible for how the system will react to unexpected input and failure.
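
One way to make this layering concrete is the following C++ sketch (all names and rules are hypothetical, invented for illustration): a policy states an enforceable rule, and a mechanism is the system artifact that enforces it, failing closed on unexpected input.
\begin{verbatim}
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch: a policy states an enforceable rule, and a
// mechanism is the system artifact that enforces it.
struct Policy {
    std::string rule;  // e.g. "deny any action not explicitly allowed"
};

class Mechanism {
public:
    virtual ~Mechanism() {}
    // True if the requested action is permitted under the policy.
    virtual bool enforce(const Policy& p, const std::string& action) = 0;
};

// One concrete mechanism: a trivial allow-list check.
class AllowListMechanism : public Mechanism {
    std::vector<std::string> allowed;
public:
    explicit AllowListMechanism(std::vector<std::string> a)
        : allowed(std::move(a)) {}
    bool enforce(const Policy&, const std::string& action) override {
        for (const std::string& ok : allowed)
            if (ok == action) return true;
        return false;  // fail closed: unexpected input is denied
    }
};

int main() {
    Policy p{"deny any action not explicitly allowed"};
    AllowListMechanism m({"read", "append"});
    std::cout << m.enforce(p, "read") << "\n";   // 1: permitted
    std::cout << m.enforce(p, "write") << "\n";  // 0: denied
}
\end{verbatim}
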

In the same manner that these various security aspects (e.g.~mechanisms, principles, policies) must be considered during development automation, the software and hardware aspects must also come under consideration, based on the desired behavior/functionality of the system under design. One could have security elements that attempt to optimize themselves for the system they are in based on a few pivot points (power, time, efficiency, level of randomness). Another option would be for the automated tool to trade out specific security components as an easier way to increase security without requiring re-design/re-construction of the underlying element (e.g.~modularity). There is always the requirement that the overall trustworthiness of a new system must meet the standards of the security policies that `rule' the system. For these reasons a user would desire rigorous documentation laying out the requirements of each component, so that when replacing faulty or damaged components there would be no loss to the overall trustworthiness of the system, while also not introducing any vulnerabilities due to the inclusion of new system components.
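
A hypothetical sketch of the component-swap idea: because interchangeable components share one rigorously documented interface, an automated tool can select or replace an implementation by a chosen pivot (here, power cost) without redesigning the surrounding system. All names and cost figures below are invented for illustration.
\begin{verbatim}
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Interchangeable components behind one documented interface.
struct CipherComponent {
    virtual ~CipherComponent() {}
    virtual std::string name() const = 0;
    virtual double powerCost() const = 0;  // documented metric (assumed units)
    virtual double timeCost() const = 0;
};

struct LightweightCipher : CipherComponent {
    std::string name() const override { return "lightweight"; }
    double powerCost() const override { return 1.0; }
    double timeCost() const override { return 4.0; }
};

struct HeavyCipher : CipherComponent {
    std::string name() const override { return "heavy"; }
    double powerCost() const override { return 5.0; }
    double timeCost() const override { return 1.0; }
};

// Pick the component that minimizes the chosen pivot (here: power).
const CipherComponent* pickByPower(
        const std::vector<std::unique_ptr<CipherComponent>>& cs) {
    const CipherComponent* best = nullptr;
    for (const auto& c : cs)
        if (!best || c->powerCost() < best->powerCost()) best = c.get();
    return best;
}

int main() {
    std::vector<std::unique_ptr<CipherComponent>> components;
    components.push_back(std::make_unique<LightweightCipher>());
    components.push_back(std::make_unique<HeavyCipher>());
    std::cout << pickByPower(components)->name() << "\n";  // "lightweight"
}
\end{verbatim}
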

Virtualization should be used for exploring the design space, and the reasons should be apparent: not only is the cost of prototyping incredibly expensive, but redesign is equally costly. Virtualization aids by removing the need for physical prototyping (lower monetary costs) and allows for more rapid exploration of the full design space. While the design time for such powerful tools will be expensive (in both monetary and temporal costs), the rewards of developing, validating, and evaluating this virtualization tool will offset the early design-phase costs of automated security component design.
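
In that spirit, design-space exploration in a virtual model can be as simple as sweeping candidate mappings and scoring each against the metrics of interest (latency, power), in the manner Metropolis evaluates mapped designs; the candidate platforms, figures, and constraints below are purely illustrative assumptions.
\begin{verbatim}
#include <iostream>
#include <string>
#include <vector>

// Purely illustrative design-space sweep: score each candidate mapping
// of a function onto a platform instance with a simple cost model, and
// report which candidates satisfy the design constraints.
struct Candidate {
    std::string platform;  // platform instance (invented names)
    int cores;             // architectural parameter
    double latencyMs;      // modeled latency of the mapped functionality
    double powerW;         // modeled power draw
};

int main() {
    std::vector<Candidate> space = {
        {"soc-small", 2, 9.0, 0.8},
        {"soc-mid",   4, 5.5, 1.6},
        {"soc-large", 8, 3.0, 3.9},
    };
    const double maxLatencyMs = 6.0, maxPowerW = 2.0;  // constraints

    for (const Candidate& c : space) {
        bool ok = c.latencyMs <= maxLatencyMs && c.powerW <= maxPowerW;
        std::cout << c.platform << " (" << c.cores << " cores): "
                  << (ok ? "feasible" : "rejected") << "\n";
    }
    return 0;  // only "soc-mid" satisfies both constraints here
}
\end{verbatim}
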

At this point, it is the hope of the author that the reader can see how the needs and benefits of platform-based design and security development are closely aligned along the same concepts of rigorous design, virtualization/automation of tools, and the need for meticulous documentation. The reasoning for using platform-based design is that PBD functions as a form of `architectural base' upon which security development can be mapped. PBD can be used for development of hardware elements, security-centric SoCs, or even as a set of abstract blocks that can be used to design higher-level applications (e.g.~virtualization development of larger security systems). But as with the development of any tool, and more so when expecting said tools to be publicly used, there is a deep need for meticulous documentation and rigorous use/distribution of standards. Without this, there is no guarantee that anyone will benefit from use of this new model. Much as with security innovation and implementation, without proper outlining of behavior and function there is greater possibility for erroneous use, thus leading to greater vulnerability of the overall system.
\begin{quotation}
