
Submissions are now open!

Important Dates

Mailing List

If you want to stay up to date on news regarding the workshop, we welcome you to subscribe to our mailing list.

Contact

For all enquiries about the workshop, please contact:
 
Jan-Philipp Steghöfer
Institute for Software & Systems Engineering
Augsburg University
steghoefer@informatik.uni-augsburg.de
+49 (0) 821 598-2177

Call for Papers

Download the PDF here.


The nature of self-organizing and autonomous systems and cyber-physical entities demands that trust and its different facets become a primary concern. In autonomous cyber-physical or embedded systems with complex interactions between physical components, there is significant potential for failure and considerable risk of severe adverse impact. When considering the interaction between such systems and entities, their emergent behavior, and the highly dynamic and open environments in which they will be deployed, the concerns related to security, safety, and trust are exacerbated even further. Not only will a thorough consideration of trust yield more robust and more secure systems, but incorporating trust can also lead to gains in performance, acceptance, and ease of use. In domains in which systems have to be certified, a formal treatment of trust and its facets in self-organizing or autonomous systems is a necessity.

The issues of trust and reputation management in multi-agent systems (MAS) have received considerable attention. Formal methods to guarantee functional correctness, safety, and security, as well as techniques to ensure reliability in distributed, self-organizing, and autonomous systems, have likewise been investigated by diverse research groups from different communities. Furthermore, the role of humans as the users of self-organizing, self-adaptive, and autonomous systems, and the usability of such systems, has been the subject of research. These different facets of the same problem have so far been considered only separately, and many have regarded security, safety, and the like as complementary to trust.

However, the overall trustworthiness of a self-organizing and autonomous system depends on all of the aforementioned properties and should be regarded holistically. In addition, the legal issues surrounding autonomous cyber-physical systems have so far received little attention, as such systems have mainly been deployed in military contexts. When autonomous systems such as robots and mini-helicopters come into use by private companies and individuals for everyday tasks such as driving, shopping, and video recording, a whole new range of safety, security, legal, and trust issues arises.

The facets of trust must be considered in the relationships between the components of the system and between the user and the system. Functional correctness, security, safety, and reliability are facets that have to be ensured for the system's components as well as for the system as a whole. The classical notions of trust and reputation in MAS also apply to the relationship between system components. The relationship between the system and the user is influenced by the transparency and consistency of the system towards the user and, most importantly, by its usability, i.e., the way the user is informed about self-organization processes and autonomous decisions and is allowed to interact with the system.

The workshop will provide an open stage for discussions about the different facets of trust in self-organizing and autonomous systems, how each of them can be fostered, and how they relate. Further examples of topics of interest are:

Aim of the Workshop

The aim of the workshop is to bring together researchers from different communities such as Multi-Agent Systems, Autonomic Computing, Organic Computing, Trust Management, Security, Cyber-Physical Systems, and Distributed Systems to discuss, on the basis of high-quality position or research papers, the different aspects of trust in self-adaptive and self-organizing systems, and to create a sense of the overarching concepts and problems associated with a holistic view of trustworthy self-organizing and autonomous systems. The workshop is an opportunity to promote this view, to engage in discussions about the interconnectedness of the different facets and their interplay in self-organizing and autonomous systems, to present ongoing research, and to identify areas where more attention from the community is required.

Audience

The workshop is aimed at researchers who have been investigating one of the trust aspects (functional correctness, safety, security, reliability, credibility, usability) in self-organizing or autonomous systems, or who have been looking into trust in its different forms. We explicitly encourage the participation of researchers from different communities within computer science. The workshop will be set in an informal and cooperative atmosphere with ample time allotted to discussions.