Online Exclusive: Calculating the Relative Performance of a PIDS System

You’ve done your research and collected all of the Perimeter Intrusion Detection Systems (PIDS) brochures and technical information available, and read through all of the specifications from various manufacturers.  Now you are completely bamboozled by claimed detection rates, nuisance alarm rates and false alarm rates.   So, how do you compare and evaluate a range of products to select the best performing system?

Calculating a realistic Probability of Detection (POD) or determining a measurement of actual detection performance for a PIDS is not quite as simple as many people believe.  You need to understand the highly interactive and closely coupled relationship that exists between the detection of intrusions versus unwanted nuisance alarms.   As end users demand ever increasing sensitivity and detection performance from their PIDS systems, controlling nuisance alarms will continue to be the greatest challenge facing suppliers.

This article offers one method to calculate a system's comparative performance, taking this relationship and trade-off between detection and nuisance alarms into account.   Although we only cover perimeter intrusion detection systems in this article, the same techniques could potentially be applied to other detection areas and technologies such as Ground Based Radar, Microwave, VMD, etc.

Detecting each and every intrusion on your perimeter is the primary expectation of any PIDS system, but equally important is the confidence that your security staff have in the system not only capturing and reporting all legitimate intrusions, but in eliminating or not reporting nuisance alarms.   Too many false or nuisance alarms will seriously erode confidence in the system, often to the stage where all alarms – real or not – are simply ignored by the security staff.

When alarms are ignored, the actual quality of performance in detecting an intruder in this scenario suddenly drops to 0% - regardless of what figures your vendor quotes for the equipment. So, the question is how do you measure and quantify this Quality of Performance?

Almost any vendor can claim 100% detection of intrusions, but often because of the high sensitivity settings typically used to achieve this figure during testing, a corresponding increase in nuisance alarms may render the system performance unacceptable from the customer’s perspective. What we need to do is measure both detection and nuisance alarm figures simultaneously in order to get a realistic Quality of Performance (QOP) or a measurement of relative system performance.  This methodology can be applied to the comparison of PIDS systems from different manufacturers as well as different technologies.

One method to determine the QOP and level of confidence, using a relatively small number of tests (typically in the order of hundreds) carried out on site over a relatively short period of time as part of an on-site evaluation or system acceptance procedure, is as follows:

Example 1:  As part of the commissioning of a PIDS system, suppose we perform a series of 30 climb tests on the fence and detect 29 out of the 30 climbs, giving us a Detection Rate of 29/30 or 0.966.  Suppose we also received one nuisance alarm during the tests, giving us 29 hits out of a total of 30 actual alarms received (true and false) = 96.6% confidence.

The Quality of Performance (QOP) would be:

 0.966 x 96.6 = 93.3% QOP with a 96.6% confidence.  

 If we recorded the same detection rate but with no nuisance alarms during this test, then the result would be:

  .966 x 100 = 96.6% QOP with 100% confidence.

Example 2:   If we then increased the sensitivity during the test in order to improve the detection rate, we may see an increase in nuisance alarms.   If we record a detection rate of 30/30 or 1.0, but a confidence of say 30/35 = 85.7% (5 nuisance alarms), then the QOP would be 85.7% with an 85.7% confidence.   This is actually worse overall system performance, even though we made the system more sensitive.

Example 3:  Conversely, if we reduced the sensitivity to have fewer nuisance alarms, we may miss actual intrusion events.   So for example a detection rate of 27/30 or .9, but a confidence of 27/27 or 100% would yield a QOP of 90% with 100% confidence.
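The three worked examples above can be reproduced with a short calculation. The sketch below (Python, with `qop` as an illustrative helper name, not part of any standard) implements the formula used in these examples: QOP is the detection rate multiplied by the confidence, where confidence is the fraction of all alarms received that were genuine detections.

```python
def qop(hits: int, attempts: int, nuisance_alarms: int) -> tuple[float, float]:
    """Return (QOP %, confidence %) for one test run.

    hits            -- intrusion tests that were detected
    attempts        -- total intrusion tests performed
    nuisance_alarms -- nuisance/false alarms received during the run
    """
    detection_rate = hits / attempts
    # Confidence: true alarms as a fraction of all alarms received.
    confidence = hits / (hits + nuisance_alarms)
    return detection_rate * confidence * 100, confidence * 100

# Example 1: 29 of 30 climbs detected, 1 nuisance alarm
print(qop(29, 30, 1))   # ~93.4% QOP, ~96.7% confidence (93.3/96.6 with rounding)
# Example 2: all 30 detected at higher sensitivity, 5 nuisance alarms
print(qop(30, 30, 5))   # ~85.7% QOP, ~85.7% confidence
# Example 3: sensitivity reduced, 27 of 30 detected, no nuisance alarms
print(qop(27, 30, 0))   # 90.0% QOP, 100% confidence
```

Note that the article's 93.3% figure for Example 1 comes from rounding the detection rate to 0.966 before multiplying; computing with full precision gives 93.4%.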

So as we go either side of these optimal detection / nuisance alarm settings, we find that the QOP or detection performance falls off significantly. 

Of course, you should always adjust the PIDS system to achieve the optimal QOP figure with any installation – ideally it would be 100% - but in the real-world there are many external and site specific factors that can influence this figure, such as fence type, fence quality, environmental conditions, etc.   No two sites are ever the same and in addition to site specific and environmental factors, the detection rate may also be affected by the skill and knowledge of the intruder and therefore their ability to defeat the system.

Please note that this proposal of system QOP measurement is only intended for use at commissioning or comparison time, with a relatively small number of measurements (100’s) over a relatively short period of time (hours).  For measuring an individual site QOP over an extended period of time, a different calculation method based on binomial distribution would probably be more appropriate.

Often vendors quote just the raw detection rates for their technologies without any sort of confidence factor – so it’s unlikely to be the QOP you can realistically expect to achieve on your site.   Vendor testing is usually carried out in a controlled or laboratory environment where they increase sensitivity to do the detection tests and record this figure, then reduce the sensitivity to a level that eliminates nuisance alarms and record that figure – each in isolation of the other.  The reality is that due to the highly interdependent nature of these results, both need to be evaluated together in order to come up with a meaningful measurement. 

Testing also needs to be carried out at the actual installed site so that the site specific environmental conditions are taken into account when calculating the QOP figure.   Any POD figure or raw detection rate quoted will be conditional and unique to a site, despite the claims made by some sensor manufacturers.   For example, a sensor may have quite a high detection rate or POD for a low-level threat such as a teenage vandal who has little knowledge of the system versus a more sophisticated threat from a professional thief or special operations person for whom the detection rate or POD will almost certainly be substantially lower.  

However, the quality of performance (QOP) figure provides a relative measure of a system's ability to detect intrusions within the protected area. The QOP depends not only on the characteristics of a particular PIDS system, but also takes into account the environment, the method of installation and adjustment, and the assumed behavior of an intruder.   

Other than measuring a system's relative performance for comparison purposes, a QOP figure is also useful for establishing the baseline performance of your installation for future reference.    All of the installation details, system settings/sensitivity levels, maintenance and testing activities should be clearly documented and kept on site, so that they can provide a known basis for system support and fault troubleshooting, ensure proper performance of the PIDS, and identify any performance degradation over time.

Regular performance and operability testing should be carried out every 6 months or whenever any system component has been adjusted, replaced or repaired to confirm correct and effective operation of the PIDS system.  This performance testing should include the use of and be referenced back to the documented QOP figures from the original acceptance testing.

You can see from the above that how a PIDS system will actually perform on your site is often markedly different from the expectation set by the raw detection rate figures (often quoted as the POD).  No two sites are ever the same, and numerous external and site-specific factors will act on this figure to reduce the quality of the overall system performance.  
