EIS at RE’19

One of the biggest coups for EIS was to have a paper accepted at the renowned IEEE Requirements Engineering Conference during the project’s planning grant period. Consequently, Maximilian Köhl and Timo Speith, two of the paper’s authors, flew to South Korea to present their work to an interested community. During their stay, both of them […]

EIS at the Conference for Industrial, Organizational, and Economic Psychology (AOW) 2019 in Braunschweig (September 26, 2019)

Last week, Lena Kästner and Markus Langer had the opportunity to present the topic of Explainable Artificial Intelligence (XAI) and the idea behind our research project Explainable Intelligent Systems (EIS) to an attentive audience at the AOW 2019. Their presentation prompted a fascinating discussion, once the two researchers had bridged the terminological […]

EIS goes Athens: ESPP 2019

At the beginning of September, the EIS team contributed its interdisciplinary symposium on “Explainable Intelligent Systems and the Trustworthiness of Artificial Experts” for the third time. We presented it at the 27th International Conference of the European Society for Philosophy and Psychology (ESPP), with Viviane Hähne, Lena Kästner, Daniel Oster, and Timo Speith as speakers. In […]

EIS Symposium at EuroCogSci 2019 in Bochum

The EIS team contributed an interdisciplinary symposium on “Explainable Intelligent Systems and the Trustworthiness of Artificial Experts” to the EuroCogSci conference, with Tina Feldkamp, Daniel Oster, Eva Schmidt, and Timo Speith as speakers. We used the symposium to give some of our junior researchers a chance to present in front of an international audience, and […]

EIS at the 20th Fall Academy of the German Foundation for Law and Computer Science

Kevin Baum and Andreas Sesing will take part in the 20th Fall Academy of the German Foundation for Law and Computer Science (DSRI), where their paper “Requirements for the Explainability of Machine-aided Decisions” (“Anforderungen an die Erklärbarkeit maschinengestützter Entscheidungen”) has been accepted. The Fall Academy takes place in Bremen from September 11 to September 14, 2019. […]

EIS goes Rome: ICIL 2019

Last week, Timo Speith, Felix Bräuer, and Markus Langer from the EIS team presented the panel “Explainable Intelligent Systems and the Trustworthiness of Artificial Experts” at the 9th International Conference on Information Law and Ethics in Rome. Over the course of the project, we will present this panel a total of three times, […]

Paper accepted for RE’19

We are proud to announce that the EIS-influenced paper “Explainability as a Non-Functional Requirement” by Maximilian Köhl, Dimitri Bohlender, Kevin Baum, Markus Langer, Daniel Oster, and Timo Speith has been accepted for the 27th IEEE International Requirements Engineering Conference on Jeju Island, South Korea (September 23 to 27).

Felix Bräuer presented “Trusting Artificial Experts” at the University of Erlangen-Nuremberg

In March 2019, Felix Bräuer (Assistant Professor of Philosophy, Saarland University) gave a talk on “Trusting Artificial Experts” at the workshop Epistemic Trust in the Epistemology of Expert Testimony (University of Erlangen-Nuremberg). In his talk, Felix Bräuer discussed which notions of trust are and aren’t applicable to our dealings with artificial intelligent recommender […]
