Departmental Papers (CIS)

Date of this Version

4-2017

Document Type

Conference Paper

Comments

Proceedings of The 8th ACM/IEEE International Conference on Cyber-Physical Systems (ICCPS 2017), Pittsburgh, PA USA, April 2017

Abstract

Data-driven techniques are used in cyber-physical systems (CPS) for controlling autonomous vehicles, handling demand responses for energy management, and modeling human physiology for medical devices. These data-driven techniques extract models from training data, and their performance is typically analyzed with respect to random errors in that data. However, if the training data is maliciously altered by attackers, the effect of such attacks on the learning algorithms underpinning data-driven CPS has yet to be considered. In this paper, we analyze the resilience of classification algorithms to training data attacks. Specifically, we propose a generic metric tailored to measuring the resilience of classification algorithms with respect to worst-case tampering of the training data. Using this metric, we show that traditional linear classification algorithms are resilient only under restricted conditions. To overcome these limitations, we propose a linear classification algorithm with a majority constraint and prove that it is strictly more resilient than the traditional algorithms. Evaluations on both synthetic data and a retrospective case study with real-world arrhythmia medical data show that the traditional algorithms are vulnerable to tampered training data, whereas the proposed algorithm is more resilient (as measured by worst-case tampering).
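The paper's majority-constrained algorithm and resilience metric are not reproduced here; purely as an illustration of the threat model the abstract describes, the following sketch (hypothetical data and parameters, a simple 1-D perceptron standing in for a generic linear classifier) shows how an attacker who flips the labels of a few training points can shift the learned decision boundary:

```python
def train_perceptron(data, epochs=50, lr=0.1):
    """Tiny 1-D perceptron: learns w, b so that sign(w*x + b) predicts y."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            if (w * x + b) * y <= 0:  # misclassified point: update
                w += lr * y * x
                b += lr * y
    return w, b

def accuracy(w, b, data):
    return sum(1 for x, y in data if (w * x + b) * y > 0) / len(data)

# Hypothetical clean training set: negatives below 0, positives above 0.
clean = [(x / 10.0, -1) for x in range(-20, 0)] + \
        [(x / 10.0, +1) for x in range(1, 21)]

# Worst-case-style tampering: flip the labels of the k points closest
# to the boundary, dragging the learned threshold away from 0.
k = 8
by_margin = sorted(clean, key=lambda p: abs(p[0]))
tampered = [(x, -y) for x, y in by_margin[:k]] + by_margin[k:]

w_c, b_c = train_perceptron(clean)
w_t, b_t = train_perceptron(tampered)

# Evaluate both models on the untampered data.
acc_clean = accuracy(w_c, b_c, clean)
acc_tampered = accuracy(w_t, b_t, clean)
print(f"clean model: {acc_clean:.2f}, poisoned model: {acc_tampered:.2f}")
```

The clean data is linearly separable, so the perceptron fits it perfectly; after tampering, the training set is no longer separable and the learned boundary degrades, which is the vulnerability the paper quantifies for traditional linear classifiers.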

Subject Area

CPS Theory, CPS Security

Publication Source

The 8th ACM/IEEE International Conference on Cyber-Physical Systems (ICCPS 2017)

DOI

10.1145/3055004.3055006

Keywords

cyber-physical systems, linear classification, training data attacks


Date Posted: 22 March 2017

This document has been peer reviewed.