Abstract
Anomaly detection is a key strategy for cyber intrusion detection because it is conceptually capable of detecting novel attacks. This makes it an appealing defensive technique for environments such as the nation's critical infrastructure, which currently faces increased cyber adversarial activity. When deployment within the purview of such critical infrastructures is considered, it is imperative that the technology be well understood and reliable, with its performance benchmarked against the results of principled assessments. This paper works toward that imperative by analyzing the current state of anomaly detector assessments with a view toward mission-critical deployments. We compile a framework of key evaluation constructs that identifies how and where current assessment methods may fall short in providing sufficient insight into detector performance characteristics. Within the context of three case studies from the literature, we show how error factors that influence the performance of detectors interact with different phases of a canonical evaluation strategy to compromise the integrity of the final results.
This material is based upon work supported by the United States Department of Energy under Award Number DE-OE000012 and by the Los Angeles Department of Water and Power and the Jet Propulsion Laboratory Internal Research and Technology Development Program, in part through an agreement with the National Aeronautics and Space Administration. Neither the United States Government, the Los Angeles Department of Water and Power, nor any agency or employees thereof makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. The views and opinions of the authors expressed herein do not necessarily reflect those of the sponsors. Figures and descriptions are provided by the authors and used with permission.
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Viswanathan, A., Tan, K., Neuman, C. (2013). Deconstructing the Assessment of Anomaly-based Intrusion Detectors. In: Stolfo, S.J., Stavrou, A., Wright, C.V. (eds) Research in Attacks, Intrusions, and Defenses. RAID 2013. Lecture Notes in Computer Science, vol 8145. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41284-4_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-41283-7
Online ISBN: 978-3-642-41284-4
eBook Packages: Computer Science (R0)