Research Interests

Dependability, Safety and Security 

Dependability is defined in [1] and [2] as "the ability of a system to deliver service that can justifiably be trusted". The service delivered by a system is its behavior as it is perceived by another system (human or physical) interacting with it. A deviation of the delivered service from its desired functionality is termed a failure. An error is defined as the part of the system state that may cause a failure, and a fault is the determined or hypothesized cause of an error. A fault is active when it produces an error, and dormant otherwise. A system fails according to several failure modes. A failure mode characterizes the deviation of a service from its desired functionality in terms of three parameters: the failure domain (value domain or time domain), the perception of the failure by the users of the system (consistent or inconsistent), and the consequences of the failure (from insignificant to catastrophic).
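
To make the fault-error-failure chain concrete, the following minimal Python sketch (with purely hypothetical names, not taken from [1] or [2]) shows a design fault that, once activated, corrupts part of the system state (an error) and eventually surfaces at the service interface as a failure in the value domain.

    class Sensor:
        """Delivers a temperature reading as its service."""

        def __init__(self):
            self._offset = 0.0

        def calibrate(self, offset):
            # Design fault (dormant until this method is misused): the
            # offset is stored without any validation, so a wrong value
            # corrupts part of the system state (an error).
            self._offset = offset

        def read(self, raw_value, tolerance=1.0):
            value = raw_value + self._offset
            # The error propagates to the service interface: when the
            # deviation exceeds the tolerance, the delivered service no
            # longer matches the desired function - a failure in the
            # value domain.
            if abs(value - raw_value) > tolerance:
                raise RuntimeError("failure: reading outside tolerance")
            return value

    sensor = Sensor()
    sensor.calibrate(5.0)      # the fault is activated and creates an error
    try:
        sensor.read(20.0)      # the error surfaces as a failure
    except RuntimeError as exc:
        print(exc)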

Dependability is, in fact, a concept that covers several attributes. From a quality point of view, reliability, i.e. the continuity of correct service, and availability, i.e. the readiness for correct service, are important characteristics of any system.
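
These two attributes are commonly quantified with standard textbook measures: steady-state availability as the ratio of mean time to failure over mean time to failure plus mean time to repair, and reliability over a mission time under a constant failure rate. The short Python sketch below only illustrates these standard formulas; the numbers are arbitrary examples and do not come from the cited references.

    import math

    def availability(mttf_hours, mttr_hours):
        """Steady-state availability: fraction of time the service is ready."""
        return mttf_hours / (mttf_hours + mttr_hours)

    def reliability(t_hours, failure_rate_per_hour):
        """Probability of continuous correct service over [0, t], assuming
        exponentially distributed times to failure (constant failure rate)."""
        return math.exp(-failure_rate_per_hour * t_hours)

    print(availability(1000.0, 2.0))   # ~0.998: the service is ready 99.8% of the time
    print(reliability(100.0, 1e-4))    # ~0.990: 99% chance of no failure within 100 hours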

Safety is the reliability of the system with respect to critical failure modes, i.e. failure modes leading to catastrophic, severe, or major consequences [3]. This attribute characterizes the ability of a system to avoid the occurrence of catastrophic events that may be very costly in terms of monetary loss and human suffering. One way to reach the safety objective is, first, to apply a safe development process in order to prevent and remove design faults. This has to be complemented, at design time, with an evaluation of the system's behavior. Such an evaluation can be achieved through qualitative analysis (identification of the failure modes, component failures and conditions that lead to a system failure, through formal modeling and analysis - see for example [4]) and quantitative analysis (probabilistic evaluation of measures that quantify dependability properties). The last means for reaching dependability is to apply a fault-tolerant approach [5]. An important quantitative issue in designing fault-tolerant systems is how to balance the amounts of failure detection, recovery and masking redundancy used in the system, in order to obtain the best possible overall cost/performance/dependability trade-off - see for example [6].
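
The following small Python sketch illustrates this balance with a classical example: triple modular redundancy (TMR) with a perfect majority voter masks any single replica failure, improving mission reliability at roughly three times the hardware cost. The formulas are standard; the failure rate and mission time are arbitrary illustration values, not results from [6].

    import math

    def r_single(t, failure_rate):
        """Reliability of one component with a constant failure rate."""
        return math.exp(-failure_rate * t)

    def r_tmr(t, failure_rate):
        """TMR with a perfect voter: at least 2 of 3 replicas must survive."""
        r = r_single(t, failure_rate)
        return 3 * r**2 - 2 * r**3

    t, lam = 1000.0, 1e-4              # 1000-hour mission, 10^-4 failures per hour
    print(r_single(t, lam))            # ~0.905
    print(r_tmr(t, lam))               # ~0.975: higher reliability, three times the hardware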

Dependability and Security have a lot in common ([7], [8] and [9]). Security refers to how robust a system is with respect to a particular security policy, while dependability refers to how robust it is with respect to a particular class of faults; the definition of a fault thus plays a role analogous to that of a security policy. It is widely accepted today ([10], [11]) that security is a subset of dependability: if there is a way to violate a security policy (see for example [12]), then there is a fault. The security policy is, in effect, part of the particular definition of "robust" that is applied to a given system.

References related to the generic concepts of Dependability, Safety and Security

[1] A. Avizienis, J. Laprie, and B. Randell. Fundamental concepts of dependability. In Proceedings of the 3rd Information Survivability Workshop, Boston, USA, 2000, pp. 7-12

[2] A. Avizienis, J. Laprie, B. Randell, and C. Landwehr. Basic concepts and taxonomy of dependable and secure computing. IEEE Transactions on Dependable and Secure Computing, 1: 11-33, 2004

[3] ARTIST, Project IST-2001-34820. Selected topics in embedded systems design: roadmaps for research, May 2004

[4] Related articles authored by P. Katsaros: A, B, C, D, E and others

[5] F. Cristian. Understanding fault-tolerant distributed systems. Communications of the ACM, 34/2: 56-78, 1991 

[6] Related articles authored by P. Katsaros: F, G, H, I, J, K, L and others

[7] C. Meadows. Applying the dependability paradigm to computer security. In Proceedings of the Workshop on New Security Paradigms, La Jolla, USA, 1995, pp. 75-79

[8] C. Meadows. Applying the dependability paradigm to computer security: then and now. In Proceedings of the Workshop on Principles of Dependable Systems, Dependable Systems and Networks 2003, San Francisco, USA, June 2003

[9] P. Verissimo. Dependability, Security: two faces of the same coin? Invited talk, In Proceedings of the Workshop on Principles of Dependable Systems, Dependable Systems and Networks 2003, San Francisco, USA, June 2003

[10] J. Viega and G. McGraw. Building Secure Software, Addison-Wesley Professional Computing Series, 2002 (p. 15)

[11] R. Anderson. Security Engineering: A Guide to Building Dependable Distributed Systems, Wiley, 2001

[12] Related articles authored by P. Katsaros: M, N, O and others

Important Annual Conferences

Important Journals


Component-Based Software Engineering (CBSE) 

CBSE is the discipline of developing software components and of developing systems that incorporate such components. Component-based systems are built by assembling components that have been developed independently of the systems in which they are used. The general-purpose component technologies currently available (CORBA CCM, JavaBeans, DCOM, .NET) cannot cope with the nonfunctional requirements of such systems (reliability, availability, safety, security, etc.). These additional requirements call for new technologies and new methods of software modeling and software verification.
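
As a minimal sketch of the component idea (all names hypothetical, and far simpler than the contracts defined by CORBA CCM, JavaBeans, DCOM or .NET), the Python fragment below assembles a system against an explicit interface, so that the concrete component can be developed and replaced independently of the system that uses it.

    from abc import ABC, abstractmethod

    class Logger(ABC):
        """The contract the assembled system depends on."""

        @abstractmethod
        def log(self, message: str) -> None: ...

    class ConsoleLogger(Logger):
        """An independently developed component implementing the contract."""

        def log(self, message: str) -> None:
            print(f"[console] {message}")

    class OrderService:
        """The assembly depends only on the interface, not on a concrete component."""

        def __init__(self, logger: Logger):
            self._logger = logger

        def place_order(self, item: str) -> None:
            self._logger.log(f"order placed: {item}")

    OrderService(ConsoleLogger()).place_order("book")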

Important Annual Conferences

Important Journals