Security in Publicly Accessible Areas

Scenario prototype: security in publicly accessible spaces

In the name of civil security and risk prevention, an increasing amount of sensitive data is collected by video surveillance systems. This approach is criticized in many ways. To resolve the conflict between security needs and privacy protection, one of the three KASTEL scenarios analyses how a privacy-respecting video surveillance system can be established. As for all complex systems, the formalization of the security requirements and of legal compliance is a key challenge. The aggregation of video data from multiple cameras introduces additional privacy questions. The goal of the project is a video surveillance system that detects critical situations without unnecessary impact on the privacy of the people under surveillance. To reach that goal, KASTEL evaluates its different security assumptions in an interdisciplinary consortium.

Legal questions

In recent years, monitoring of publicly accessible areas by video surveillance has been steadily increasing. The traditional camera-monitor principle fades more and more into the background. Instead, intelligent video surveillance systems are increasingly used. They make it possible to filter information from recorded images, to process it, and to make decisions based on it. Improving technical capabilities thus permit a more comprehensive analysis of people's behavior. At the same time, such systems can include protective mechanisms that strengthen the privacy of monitored persons. However, these methods also raise new legal questions. With regard to German constitutional law, the general right to privacy is important. At the federal level, smart or intelligent video surveillance systems must comply with § 6b of the German Federal Data Protection Act (BDSG). In addition, their relation to the principle of data reduction and data economy (§ 3a BDSG) and to the rules on automated individual decisions (§ 6a BDSG) must be analyzed. At the European level, the proposal for a General Data Protection Regulation has to be legally assessed. Such a regulation could render national data protection provisions inapplicable wherever the regulation governs the matter. Further legal analysis will therefore be needed if the European data protection regulation comes into force.

Contact

ZAR

Sebastian Bretthauer (ZAR)

References

  1. I. Spiecker gen. Döhmann, C. Bier: Neue Entwicklungen der intelligenten Videoüberwachungstechnik – Schreckensszenario oder Gewinn für den Datenschutz? CR 2012, S. 610-618
  2. I. Spiecker gen. Döhmann: Big Data intelligent genutzt: Rechtskonforme Videoüberwachung im Öffentlichen Raum. K&R 2014 (im Erscheinen)
  3. I. Spiecker gen. Döhmann: Ich sehe 'was, was Du nicht siehst - Rechtliche Rahmenbedingungen intelligenter Videoüberwachung. Rupert Vogel (Hrsg.), Jahrbuch 2013 im Auftrag der DGRI, Köln 2014 (im Erscheinen)
  4. S. Bretthauer, T. Bräuchle: Datensicherheit in intelligenten Infrastrukturen. M. Horbach (Hrsg.), Informatik 2013 - Informatik angepasst an Mensch, Organisation und Umwelt, Proceedings, GI-Edition: Lecture Notes in Informatics (LNI), S. 2104-2118
  5. P. Birnstill, S. Bretthauer, S. Greiner, E. Krempel: Privacy Preserving Surveillance: An interdisciplinary approach. Second SMART Policy Workshop
  6. S. Bretthauer, E. Krempel: Videomonitoring zur Sturzdetektion und Alarmierung - Eine technische und rechtliche Analyse. E. Schweighofer/F. Kummer/W. Hötzendorfer (Hrsg.), 17. Internationales Rechtsinformatik Symposion (IRIS) 2014 - Transparenz, S. 525-534 [online in: Jusletter IT, 20. Februar 2014, http://jusletter-it.weblaw.ch/]

Usage Control Enforcement for Privacy-related Requirements

As the computer vision and situation assessment capabilities of smart video surveillance systems increase, so does the diversity of threats to privacy. Accordingly, privacy-related requirements emerge at a much more fine-grained level of detail: Which data is extracted from images? Which analyses are conducted on abstracted data? Which data is exposed for human situation assessment? Which additional access permissions, surveillance functions, or analysis methods get unlocked for investigating confirmed incidents?

Usage control provides an infrastructure for specifying and enforcing policies that allow for such a fine-grained and deeply integrated control of intelligent video surveillance systems. We aim at a system that automatically adapts to a situation-dependent trade-off between protecting the privacy of people in the observed area and still providing appropriate means for handling potentially critical incidents. Hence, our hybrid system, which combines algorithmic pre-processing with human situation assessment, is optimized for privacy as long as there is no incident to be handled. Upon detecting an incident requiring human assessment, the system offers an incident-type-specific view for situation assessment, which still aims at protecting the observed persons' privacy, e.g., by applying appropriate anonymization techniques to the images. Highly intrusive surveillance functionality is only unlocked once an incident has been confirmed by a human operator, and the usage of such functionality is accompanied by detailed logging.

The prototype system restricts intrusions into the observed persons' privacy according to the incident types to be handled and thus contributes to achieving data economy. Furthermore, automated individual decisions are proactively inhibited by design.
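The following sketch illustrates the idea in Python. It is not the actual KASTEL policy language or prototype code; the modes, views, and function names are assumptions made for illustration. It shows how a usage-control enforcement point could gate data views by system state and log every use of intrusive functionality:

    from enum import Enum, auto
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("usage-control")

    class Mode(Enum):
        """Operating modes of the hypothetical surveillance pipeline."""
        ALGORITHMIC_ONLY = auto()    # no human sees any image data
        INCIDENT_REVIEW = auto()     # anonymized, incident-type-specific view
        CONFIRMED_INCIDENT = auto()  # intrusive functions unlocked, fully logged

    # Hypothetical policy table: which data views each mode permits.
    POLICY = {
        Mode.ALGORITHMIC_ONLY: {"abstract_events"},
        Mode.INCIDENT_REVIEW: {"abstract_events", "anonymized_video"},
        Mode.CONFIRMED_INCIDENT: {"abstract_events", "anonymized_video",
                                  "raw_video", "tracking"},
    }

    class EnforcementPoint:
        def __init__(self):
            self.mode = Mode.ALGORITHMIC_ONLY

        def report_incident(self, incident_type: str):
            # Automatic detection unlocks only the anonymized review view.
            self.mode = Mode.INCIDENT_REVIEW
            log.info("incident '%s' reported, switching to INCIDENT_REVIEW", incident_type)

        def confirm_incident(self, operator: str):
            # Only an explicit human confirmation unlocks intrusive functionality.
            self.mode = Mode.CONFIRMED_INCIDENT
            log.info("incident confirmed by %s, intrusive functions unlocked", operator)

        def request(self, operator: str, view: str) -> bool:
            allowed = view in POLICY[self.mode]
            log.info("operator %s requested '%s' in %s: %s", operator, view,
                     self.mode.name, "granted" if allowed else "denied")
            return allowed

In this sketch, a request for "raw_video" is denied while the system is in ALGORITHMIC_ONLY or INCIDENT_REVIEW mode and is only granted, and logged, after confirm_incident has been called.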

Contact

IOSB

Pascal Birnstill (IOSB)

References

  1. Pascal Birnstill, Alexander Pretschner: Enforcing Privacy through Usage-controlled Video Surveillance. 10th IEEE International Conference on Advanced Video and Signal-Based Surveillance (AVSS 2013)
  2. Pascal Birnstill: Usage-controlled Video Surveillance - Revealing its Potentials for Privacy. Security Research Conference 2013 (Future Security 2013)

Proving information flow properties for the prototype

State-of-the-art surveillance systems collect not only images of the areas under surveillance but also further data about people, for example their paths of movement within a building. With this data it is possible to present less information about individual people by providing abstractions. Although this improves the protection of a person's privacy, the impact on employees is still considerable. One solution to this problem is to build applications that filter out information about specially protected groups of people in order to improve their protection.

In KASTEL, methods are developed that allow a formal proof that certain collected data is never displayed by the system. The specification of the related information flow properties is complex, and their proof is not possible with common techniques. Still, the applicability of the approach has been shown for an implementation, and work on improving the usability of the applied tools is ongoing.

The goal of this work is to enable the specification and proof of detailed and complex information flow properties and to gain a precise understanding of the information flow within a system.
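The flavor of such a property can be illustrated with a deliberately simplified Python sketch. This is not the formal verification approach used in KASTEL, which proves the property on the actual implementation rather than enforcing it at runtime; the data model below is an assumption made for illustration:

    from dataclasses import dataclass

    @dataclass
    class Track:
        person_id: int
        position: tuple          # current (x, y) position within the building
        protected_group: bool    # e.g., member of a specially protected group

    def display_view(tracks: list[Track]) -> list[tuple]:
        """View released to the operator display.

        The information flow property to be proven is that this output never
        depends on data about specially protected persons: adding, removing,
        or altering their tracks must not change the returned view.
        """
        return [t.position for t in tracks if not t.protected_group]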

Contact

ITI

Simon Greiner (ITI)

References

  1. Simon Greiner, Pascal Birnstill, Erik Krempel: Privacy Preserving Surveillance and the Tracking Paradox. Proceedings of the 8th Future Security, 2013

NurseEye Prototype

NurseEye is a research prototype for video-based semi-automatic fall detection. The system is designed for use in hospitals and nursing homes and therefore incorporates many privacy-related components. When NurseEye detects a falling person, an alarm is sent to the mobile devices of the nursing staff. First, the staff member closest to the incident is alerted, and only if he or she does not respond in time is additional personnel contacted. Once a staff member has received and confirmed an alarm, he or she becomes responsible for resolving it and therefore gets access to an anonymized video stream to assess the situation. This prevents unnecessary privacy invasion in case of false-positive alarms. When a possible accident is confirmed, the system activates a video chat between the victim of the accident and the responding staff member. This allows the staff to calm the victim while already on the way to the emergency.
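A minimal sketch of this escalation logic in Python; the names notify and await_confirmation as well as the timeout are illustrative assumptions, not the actual NurseEye interfaces:

    ALARM_TIMEOUT_S = 30  # assumed time a staff member has to confirm an alarm

    def escalate_alarm(incident, staff_by_distance, notify, await_confirmation):
        """Alert the staff member closest to the incident first and widen the
        circle only if no confirmation arrives in time."""
        for staff_member in staff_by_distance:
            notify(staff_member, incident)
            if await_confirmation(staff_member, timeout=ALARM_TIMEOUT_S):
                # The confirming staff member becomes responsible for the
                # incident and is the only one who receives the anonymized
                # assessment stream.
                return staff_member
        return None  # nobody reachable: escalation to a central station is omitted here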

To achieve a maximum of transparency for all people in the monitored area, NurseEye deploys a display next to every video camera. These displays show the current usage of the video data. As long as no emergency is detected, all video data is analyzed by algorithms and not by a human operator, and the display directly under the camera informs about this situation. When an emergency has been detected, the display signals this. Once a member of the staff has confirmed the accident and gains access to the live video data, the display shows the face of this person.
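The per-camera display logic can be summarized by a small state mapping; the following Python sketch uses assumed state names and message texts for illustration:

    from enum import Enum, auto

    class DisplayState(Enum):
        ALGORITHMIC = auto()  # video is analyzed by algorithms only
        ALARM = auto()        # a possible emergency has been detected
        OPERATOR = auto()     # a confirmed staff member views the live stream

    def display_content(state: DisplayState, operator_photo=None):
        # Maps the current use of the video data to what the display shows.
        if state is DisplayState.ALGORITHMIC:
            return "Automated analysis only - nobody is watching this camera"
        if state is DisplayState.ALARM:
            return "Possible emergency detected - awaiting staff confirmation"
        return operator_photo  # the face of the staff member viewing the stream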

Contact

ZAR IOSB

Sebastian Bretthauer (ZAR)

Erik Krempel (IOSB)

References

  1. Sebastian Bretthauer, Erik Krempel: Videomonitoring zur Sturzdetektion und Alarmierung - Eine technische und rechtliche Analyse. E. Schweighofer/F. Kummer/W. Hötzendorfer (Hrsg.), 17. Internationales Rechtsinformatik Symposion (IRIS) 2014 - Transparenz, S. 525-534 [online in: Jusletter IT, 20. Februar 2014, http://jusletter-it.weblaw.ch/]

Anonymization of Video Data

Assessment of potentially critical situations by human operators requires appropriate video data. Exposing unmodified images of a camera may reveal the identities of the captured persons. Image processing provides algorithms that can be used for obfuscating persons in images, e.g., blurring, edge detection, or silhouette detection.

KASTEL analyses such techniques regarding their capability to hide persons' identities (identifiable features) while at the same time preserving the utility of the images for recognizing certain activities or events (e.g., violence, people falling down, stealing, or dropping objects). Our goal is to identify suitable anonymization techniques for different kinds of events that achieve a reasonable trade-off between privacy protection for the people being observed and reliable situation assessment.
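As an illustration of the kind of techniques under evaluation, the following Python/OpenCV sketch blurs a detected person region and, alternatively, reduces a frame to its edges. The bounding box is assumed to come from a person detector, and the parameters are illustrative rather than tuned results of the project:

    import cv2
    import numpy as np

    def blur_person(frame: np.ndarray, box: tuple) -> np.ndarray:
        """Obfuscate one person by Gaussian-blurring their bounding box.
        box = (x, y, w, h), assumed to come from a person detector."""
        x, y, w, h = box
        roi = frame[y:y + h, x:x + w]
        # A large kernel removes identifiable features such as faces.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
        return frame

    def edge_view(frame: np.ndarray) -> np.ndarray:
        """Alternative: keep only edges, which preserves posture and motion
        cues for situation assessment while hiding texture such as faces."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return cv2.Canny(gray, 100, 200)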

Contact

ITI IOSB

Pascal Birnstill (IOSB)

Tobias Nilges (ITI, AG Kryptographie und Sicherheit)

Transparency

In the near future, modern video surveillance systems will alter the way we solve security and safety tasks. Researchers are working on systems that are capable of tracking suspicious persons across multiple cameras or detecting when persons fall to the floor.

With increasing surveillance capabilities, the information imbalance between the people under video surveillance and the operators worsens. The operators receive detailed information about all persons under surveillance, while the observed persons in most cases only know that a video surveillance system is present. This imbalance is troublesome for multiple reasons. Transparency of data processing is part of many data protection acts; the German BDSG, for example, requires transparency (§ 6b, § 19a BDSG). Furthermore, it is an important element of broadly accepted principles of data handling: the Fair Information Practice Principles (FIP) and Privacy by Design (PbD), for example, demand a high level of transparency. Last but not least, transparency is an important factor for acceptance and should be part of every surveillance system operated in democratic countries.

Many different transparency technologies are available for integration into surveillance systems. While all of them can improve transparency, not all of them are suitable for every surveillance purpose. In KASTEL, we categorize such methods in order to identify the best-suited concepts for different surveillance tasks. Promising approaches are implemented and integrated into the NurseEye prototype.

Contact

IOSB

Erik Krempel (IOSB)

References

  1. H. Vagts, E. Krempel, J. Beyerer: User-centric Protection and Privacy in Smart Surveillance Systems. Future Security - Security Research Conference 2012
  2. Kevin Laubis: Vorausgreifende Methoden der Transparenz in intelligenten Videoüberwachungsanlagen. Master's thesis, Karlsruhe Institute of Technology