Auror Subject Recognition

Auror Subject Recognition integrates with best-in-class facial recognition technology (FRT) to detect high-harm and prolific persons of interest (POI).

It allows retailers to combine FRT with their information about past serious offending. This information is safeguarded to prohibit the collection of sensitive characteristics and can be used only for crime prevention and safety purposes.

Our safeguards ensure biometric data is never stored within the retailer’s information in Auror, a human is always in the loop on decisions, and auditability tools promote transparency and responsible use.

The growing issue of violent crime in retail

Retail crime is becoming more brazen, organized and violent, and retailers are seeking solutions to keep their frontline workers, customers, and communities safe.
  • 1 in 10 retail crime events are violent, abusive or involve weapons.

  • More than 60% of retail crime is caused by the top 10% of offenders.

  • Repeat offenders are 4x more likely to be violent or aggressive.

Retailers have long used common target hardening tactics such as gates, store layout adjustments, security guards, staff de-escalation training and smoke machines to curb the trend. However, brazen offenders and organized crime groups are driving a continued rise in violence.

Retail crime has a serious impact on frontline workers and the communities they serve. It strips community hubs of their vibrancy and is a gateway to more serious offending.

This is a global issue that requires responsible, preventative solutions that empower frontline teams with the right information to respond according to risk.

Our solution for safer stores and safer communities

Auror has been providing secure and trusted retail crime intelligence software to retailers for over a decade. Auror Subject Recognition is built on best-in-class FRT, strengthened by responsible safeguards and integrity tools to support secure workflows.

Auror Subject Recognition is designed to be used by retailers strictly for identification of POI for crime prevention and safety purposes only. Identification is based on past in-store offending that meets the retailer’s criteria for high-harm and prolific offending. Retailers cannot manually or arbitrarily enroll any profile to the POI list.

All retailer-entered information in Auror is restricted to prohibit the collection of sensitive information such as ethnicity, race, religion, political affiliation, or sexual orientation.

It cannot in any way be used for profiling, categorization, prediction, tracking, monitoring behavior, or targeted marketing purposes.

Built-in integrity tools provide retailers with full auditability and transparency. A human is always in the loop on decisions, and biometric data is never entered or stored within the retailer’s information in Auror, regardless of a match.

Auror undertakes extensive testing and due diligence of selected FRT providers against strict technical accuracy, anti-bias, reliability, privacy and security measures, in line with our own Responsible Tech and AI process.

Auror only partners with leading FRT providers that demonstrate world-class performance and reliability, are trained on diverse datasets, and are validated through independent testing.

We’ve outlined here how it works, what it does and doesn’t do, and the approach we take to make sure we get it right.

A responsible approach to live facial recognition in retail

We take our role in protecting information and building integrity tools for retailers using this technology seriously. Our aim is for everyone in the ecosystem not simply to be compliant, but to demonstrate best practice and what can be achieved when this technology is used in the right way.

Click the dropdowns to learn more about how Subject Recognition and its integration with FRT work.

Auror Subject Recognition

  • It is strictly used for identification of POI for crime prevention and safety purposes only.

  • Biometric data is never entered or stored within the retailer’s information on Auror, regardless of a match. All processing of biometric data takes place within the integrated third-party FRT software.

  • If there is no match, both the detection image and the temporary biometric template are only ever processed in the integrated third-party FRT software and are discarded immediately. They are never stored or entered within the retailer's information in Auror.

  • To be enrolled to the list, a POI must already exist within the retailer’s information on Auror and meet the retailer’s criteria for high-harm and prolific offending.

  • Retailers cannot manually or arbitrarily enroll any profile to the POI list. They can review suggested enrollments based on their own policies.

  • The retailer information that makes up the POI list is safeguarded to deliberately prohibit the collection of sensitive information, such as ethnicity, race, religion, political affiliation or sexual orientation.

  • It cannot in any way be used for profiling, categorization, prediction, tracking, monitoring behavior or targeted marketing purposes.

  • Law enforcement cannot access the Subject Recognition module or POI lists using Auror.

  • POI lists are not shared between retailers and cannot be shared with law enforcement.

  • A human is always in the loop on decisions.

  • Built-in transparency and auditability tools are available.

  • Auror has tested and adapted Subject Recognition to align with privacy impact and cultural assessments.

Integrated FRT software

  • It cross-checks a temporary biometric template against the retailer’s POI list. This temporary biometric template is only ever processed by the integrated third-party FRT software and discarded immediately. It never leaves the FRT software and is never entered or stored within the retailer’s information on Auror.

  • It uses a series of geometric and structural anchor points such as eyes, nose, cheekbones, and the distances between these points to generate the temporary biometric template.

  • It does not rely on race or ethnicity to determine a match and retailers are unable to directly capture this information.

  • It does not explicitly capture tattoos or markings in the biometric template.

  • Integrated FRT providers have passed our rigorous testing processes, our Responsible Tech and AI framework, are independently certified, and meet local legal requirements.
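To make the flow in the list above concrete, here is a minimal, hypothetical sketch of a temporary template built from anchor-point geometry, cross-checked against a POI list and then discarded. The function names, the distance-based similarity score and the threshold value are illustrative assumptions only, not Auror’s or any FRT provider’s implementation (production systems use learned embeddings).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TemporaryTemplate:
    # Normalized pairwise distances between facial anchor points
    # (eyes, nose, cheekbones, ...) - a stand-in for a real embedding.
    features: tuple

def build_template(anchor_points: dict) -> TemporaryTemplate:
    """Derive a feature vector from distances between named anchor points."""
    names = sorted(anchor_points)
    feats = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (x1, y1), (x2, y2) = anchor_points[a], anchor_points[b]
            feats.append(round(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5, 4))
    return TemporaryTemplate(tuple(feats))

def similarity(a: TemporaryTemplate, b: TemporaryTemplate) -> float:
    """Toy similarity in [0, 1]; higher means more alike."""
    diffs = [abs(x - y) for x, y in zip(a.features, b.features)]
    return 1.0 / (1.0 + sum(diffs) / max(len(diffs), 1))

def cross_check(detection: TemporaryTemplate, poi_templates: dict, threshold: float = 0.95):
    """Return the best-matching POI id above the threshold, else None.

    The temporary detection template exists only for the duration of this call;
    nothing is written back to the retailer's information in Auror.
    """
    best_id, best_score = None, 0.0
    for poi_id, reference in poi_templates.items():
        score = similarity(detection, reference)
        if score > best_score:
            best_id, best_score = poi_id, score
    return best_id if best_score >= threshold else None
```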

A responsible approach to FRT

We take our role in protecting information and building integrity tools for retailers using this technology seriously. We aim for everyone in the ecosystem not simply to comply with the law, but to demonstrate best practice and what can be achieved when this technology is used in the right way.

Third-party FRT:

  • Uses a series of geometric and structural anchor points from a face to generate temporary biometric templates. Key anchor points can include eyes, nose, cheekbones, chin, jawline, corners of the mouth, eyebrows and forehead, and the distances between these points.

  • Does not categorize or rely on race, skin color or ethnicity to determine a match, and retailers are unable to directly capture this information in Auror.

  • Does not explicitly capture facial markings such as tattoos in the biometric template.

  • Has undergone and passed our rigorous testing processes and our Responsible Tech and AI framework, and meets local legal requirements.

Auror’s FRT integration:

  • Biometric data is never entered or stored in the Auror database, regardless of whether there is a match or not. All processing of biometric data takes place in the third-party FRT.

  • FRT activity and information cannot be accessed by law enforcement using Auror, and retailers cannot share this information with other retail or law enforcement users of Auror.

  • It cannot be used for prediction, categorization, marketing purposes or profiling of individuals or groups, and does not in any way monitor behavior such as shopping habits, emotion, a person's gait or movement within a store, their age or gender.

  • It is strictly limited to identification based on historic images and past in-store offending within a maximum period of 24 months, and cannot be used to track behavior.

  • Requires a human to always be in the loop. No decisions are made exclusively through automation.

  • Auror has undertaken privacy impact assessments (PIA) and, where appropriate, a cultural review of our own product, to ensure compliance and mitigate potential risks.

  • As with all Auror modules, free-text fields are deliberately limited to mitigate the collection of sensitive information, such as ethnicity, race, religion, political affiliation or sexual orientation, and filters do not allow the use of derogatory words or phrases (see the sketch after this list).

  • Auror enforces non-negotiable privacy guardrails for retailers, including the use of FRT for crime and safety purposes only, and end-to-end transparency and auditability.
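The restriction on free-text fields noted in the list above can be pictured as simple input validation. This is a hedged sketch under assumptions: the permitted field names and the idea of a maintained blocklist are illustrative, not Auror’s actual schema or filter.

```python
# Hypothetical sketch: structured fields instead of open free text, with sensitive
# attributes simply absent from the schema and a blocklist applied to the limited
# free text that remains. Field names and blocked terms are assumptions.

ALLOWED_FIELDS = {"event_type", "date", "location", "items", "notes"}
BLOCKED_TERMS = {"example-blocked-term"}  # placeholder for a platform-maintained list

def validate_event_entry(entry: dict) -> list:
    """Return validation errors for a retailer-entered event record."""
    errors = [f"field not permitted: {name}" for name in entry if name not in ALLOWED_FIELDS]
    notes = str(entry.get("notes", "")).lower()
    if any(term in notes for term in BLOCKED_TERMS):
        errors.append("notes contain blocked terms")
    return errors
```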

How it works

To be enrolled to the list, a POI must already exist within the retailer’s information on Auror, and meet the retailer’s criteria for Extreme Threat, Serious Threat, or High Loss.
The software has strict access controls, ensuring only a limited number of authorized staff, such as security managers, have access. Auror’s terms of use and the retailer’s own internal operating policies require all users to handle information confidentially and responsibly.
Person enters the store
The FRT detects a face and creates a temporary biometric template. This is cross-checked by the integrated third-party FRT software against a POI list of historic images linked to past in-store events that meet the retailer’s criteria.

No match
If there is no match to the POI list, no action is taken. The image and temporary biometric template are only ever processed by the integrated third-party FRT software and discarded immediately. They never leave the FRT software and are never stored or entered within the retailer's information on Auror.

Match and alert sent
If there is a suggested POI match, an alert is sent via Auror to an authorized, trained retailer employee. This employee can view the historic image and detection image, and either confirm or decline the match. While the detection image is sent to Auror to raise the alert, the temporary biometric template is discarded immediately and never stored within the retailer’s information on Auror.
No decisions are made exclusively through automation and a human is always the final decision maker. This technology reduces human bias by suggesting POI matches based on previous high-harm and prolific offending only, not on how a person looks, their demographic characteristics or what they’re wearing.
Police have no access to the Subject Recognition module. Retailers cannot share Subject Recognition information with other retailers or law enforcement using the software.

Match declined
No action is taken and the detection image is permanently deleted from Auror. The temporary biometric template is immediately discarded from the integrated third-party FRT software and never stored within the retailer’s information on Auror.

Match confirmed and team can respond
The retailer employee confirms the match and is provided with information about the POI’s past offending, as previously recorded by the retailer in Auror. This gives workers critical moments to determine a response, guided by the retailer's training and internal policies. Even in the case of a confirmed match, the temporary biometric template is never stored or entered within the retailer’s information on Auror.
A store team’s response can range from engaging politely with the POI, to not approaching, or escalating further. This empowers teams to make appropriate decisions to protect colleagues and customers.

Outcome recorded
Once action has been taken, the employee can record the outcome to provide a full audit trail.
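As a way of summarizing the steps above in one place, the following is a minimal, hypothetical sketch of the decision flow. The enum values and function are illustrative assumptions, not Auror’s API; the key property it mirrors is that a suggested match always requires a human decision before any response.

```python
from enum import Enum

class Outcome(Enum):
    NO_MATCH = "no action; image and template discarded by the FRT software"
    MATCH_DECLINED = "no action; detection image permanently deleted from Auror"
    MATCH_CONFIRMED = "team responds, guided by training and policy; outcome recorded"

def handle_detection(match_suggested: bool, reviewer_confirms) -> Outcome:
    """Resolve a single detection.

    reviewer_confirms is None when no alert was raised, otherwise the decision
    made by an authorized, trained retailer employee. No outcome is ever reached
    by automation alone.
    """
    if not match_suggested:
        return Outcome.NO_MATCH
    if reviewer_confirms is not True:
        return Outcome.MATCH_DECLINED
    return Outcome.MATCH_CONFIRMED

# Example: a suggested match that the reviewer declines.
print(handle_detection(match_suggested=True, reviewer_confirms=False))
# Outcome.MATCH_DECLINED
```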

Frequently asked questions

Why is Auror supporting integration with live FRT?

Retail crime is becoming more brazen, organized and violent, and the obligation for retailers to keep their frontline workers safe has become clear. FRT has become exceptionally accurate and will be a key tool in reducing violence in stores, so a responsible solution is required.

While Auror does not build or own FRT, it applies secure end-to-end workflows, intelligence, strict safeguards and responsible controls to ensure the technology is used only for identification, crime prevention and safety purposes.

All FRT providers have undergone and passed our testing processes. Approved FRT is trained on diverse datasets and has reached consistently high levels of accuracy and reliability, as demonstrated by independent testing.

FRT is already used in everyday environments such as airports, hotels and casinos, and it offers retailers the fastest and most accurate way to identify high-harm and prolific POIs, and make stores safer.

How is Subject Recognition used by retailers?

Subject Recognition can only be used for identification, strictly for crime prevention and safety purposes.

The integrated FRT cross-checks against a POI list of reference biometric templates based on historic images linked to past in-store events that meet the retailer’s criteria for high-harm and prolific offending.

All retailer-entered information in Auror is deliberately restricted to prohibit the collection of sensitive information such as ethnicity, race, religion, political affiliation, or sexual orientation.

It cannot in any way be used for profiling, prediction, tracking, monitoring behavior, or targeted marketing purposes.

Built-in integrity tools provide retailers with full auditability and transparency. A human is always in the loop on decisions, and biometric data is never entered or stored within the retailer’s information on Auror, regardless of a match.

Once a POI match is verified by a trained, authorized retailer employee, store teams are provided with information about the POI’s previous offending. This gives retail teams critical moments to determine a response according to the risk, guided by their training and internal policies.

Importantly, law enforcement cannot access the Subject Recognition module or POI lists, and retailers cannot share this information with other retailers or law enforcement users of Auror.

How is this different from other live FRT?

Concerns around FRT often center on its potential use for categorization, profiling, tracking, behavioral monitoring, marketing purposes, arbitrary enrollment, or scraping images from open sources. With Subject Recognition, retailers cannot in any way use FRT for those purposes.

Unlike other FRT solutions, Auror Subject Recognition combines best-in-class FRT with retailer information about high-harm and prolific offending, which is restricted to prohibit the collection of sensitive information such as ethnicity, race, religion, political affiliation, or sexual orientation.

To be enrolled to the list, a POI must already exist within the retailer’s information on Auror, and meet the retailer’s criteria for Extreme Threat, Serious Threat, or High Loss.

Built-in integrity tools provide retailers with full auditability and transparency. A human is always in the loop on decisions, and biometric data is never entered or stored within the retailer’s information in Auror, regardless of a match.

How do I know if a retailer is using Subject Recognition?

As Auror Subject Recognition integrates FRT, retailers are responsible for clearly communicating their use of FRT through appropriate signage. This provides customers with the opportunity to choose whether they want to enter the store, go to a different store, or use an alternative method such as online shopping. The purpose of Subject Recognition is to give frontline retail workers early notification of high-harm and prolific offenders to keep stores safe.

Who is detected by the FRT?

Anyone who enters a premises operating FRT, supported by Auror Subject Recognition, is cross-checked against a POI list which is based on information previously recorded by the retailer that meets their criteria for high-harm and prolific offending.

The vast majority of people will not have been previously recorded by the retailer in Auror or meet their criteria; therefore, their image and temporary biometric template are only ever processed in the integrated third-party FRT software and are discarded immediately. They are never stored or entered within the retailer's information on Auror.

How is the POI list created?

Retailers use Auror to record crimes and in-store events after they happen at their sites, including images and CCTV footage as evidence of these events. POIs involved in these previously recorded events that meet the retailer’s criteria for Extreme Threat, Serious Threat, or High Loss are suggested for enrollment to the POI list.

The software suggests enrollments based on the retailer’s criteria and an authorized, trained retailer employee can review these suggestions based on their own policies. Retailers cannot arbitrarily enroll anyone to the POI list and post-enrollment auditing functionality allows retailers to ensure quality control of their list.

If a retailer changes any information so that a POI no longer meets the criteria for enrollment, the POI automatically falls off the list.
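A hedged sketch of the enrollment logic described above, under assumptions: the field names, the two-year look-back and the ‘sensitive’ flag handling mirror statements elsewhere in this document, but the code is illustrative, not Auror’s schema or rules engine.

```python
from dataclasses import dataclass
from datetime import date, timedelta

ELIGIBLE_CATEGORIES = {"Extreme Threat", "Serious Threat", "High Loss"}
MAX_EVENT_AGE = timedelta(days=730)  # identification limited to a 24-month window

@dataclass
class RecordedEvent:
    category: str
    event_date: date
    sensitive: bool = False  # e.g. events involving known minors can be labeled sensitive

def suggested_for_enrollment(poi_events: list, today: date) -> bool:
    """True if the POI should be *suggested* for enrollment.

    A trained, authorized employee still reviews every suggestion; nothing is
    enrolled manually or arbitrarily. If later edits mean no event qualifies,
    the POI automatically drops off the list.
    """
    return any(
        event.category in ELIGIBLE_CATEGORIES
        and not event.sensitive
        and (today - event.event_date) <= MAX_EVENT_AGE
        for event in poi_events
    )

# Example: a single qualifying event keeps the POI on the suggested list.
events = [RecordedEvent("Serious Threat", date(2024, 11, 2))]
print(suggested_for_enrollment(events, today=date(2025, 6, 1)))  # True
```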

Does this use a national watchlist or are POI lists shared between retailers?

No. Auror Subject Recognition is configured so that each retailer keeps their own information private. POI lists can only be used within the retailer’s own organization and cannot be shared with other retailers or law enforcement. Each POI list is organization-wide rather than created store-by-store, and there is no centralized ‘master’ list. This allows frontline teams to benefit from information about past offending across the retailer’s store network and reduces the duplication of information across individual stores.

Who has access to the Subject Recognition module and POI lists?

Retailers authorize a limited number of specific roles in their organization to have access to the Subject Recognition functionality; this includes approving or declining suggested enrollments, quality controlling and auditing lists, receiving alerts and reviewing suggested matches. These individuals are trained by the retailer and must comply with retailer policy and Auror’s terms of use. All activity is auditable and monitored. Neither Auror nor retailers can provide law enforcement users with access to FRT information or POI lists, and retailers cannot share this information with other retailers.
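The access and audit pattern described above can be sketched as role checks plus an append-only log. The role names and log fields below are illustrative assumptions, not Auror’s actual roles or audit format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

AUTHORIZED_ROLES = {"security_manager", "loss_prevention_lead"}  # illustrative roles

@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

    def record(self, user: str, action: str, detail: str = "") -> None:
        self.entries.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
            "detail": detail,
        })

def perform_subject_recognition_action(user: str, role: str, action: str, log: AuditLog) -> bool:
    """Allow Subject Recognition actions only for authorized roles; log everything."""
    if role not in AUTHORIZED_ROLES:
        log.record(user, "denied", f"unauthorized attempt: {action}")
        return False
    log.record(user, action)
    return True

# Example: an unauthorized role is denied and the attempt is still auditable.
log = AuditLog()
perform_subject_recognition_action("j.doe", "store_assistant", "review_suggested_match", log)
print(log.entries[-1]["action"])  # denied
```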

How long is information retained?

If there is no match, the image and temporary biometric template are only ever processed by the integrated third-party FRT software and are discarded immediately. They are never stored or entered within the retailer's information on Auror.

When a match is confirmed, the detection images are passed to Auror and are retained in line with the retailer’s own retention policies, which retailers generally set themselves. Biometric data is not entered or stored within the retailer’s information on Auror, regardless of a match. All processing of biometric data takes place within the integrated third-party FRT software. Subject Recognition is strictly limited to identification based on historic images and past in-store offending within a maximum period of 24 months.
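A hedged sketch of the retention rules described above. The helper names and the retailer-set retention period are assumptions; the no-biometric-storage rule and the 24-month reference window are taken from this document.

```python
from datetime import date, timedelta

REFERENCE_WINDOW = timedelta(days=730)  # past in-store offending, maximum 24 months

def detection_image_retained(match_confirmed: bool, captured_on: date, today: date,
                             retailer_retention: timedelta) -> bool:
    """Detection images reach Auror only when an alert is raised, and are retained
    only for confirmed matches within the retailer's own retention period.
    Declined matches are deleted, and biometric templates are never retained in
    the retailer's information on Auror."""
    if not match_confirmed:
        return False
    return (today - captured_on) <= retailer_retention

def reference_event_usable(event_date: date, today: date) -> bool:
    """Historic reference events outside the 24-month window cannot be used."""
    return (today - event_date) <= REFERENCE_WINDOW

# Example: a confirmed match captured 100 days ago, with a 12-month retailer policy.
print(detection_image_retained(True, date(2025, 3, 1), date(2025, 6, 9), timedelta(days=365)))  # True
```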

What bias mitigation measures have been considered?

Importantly, all retailer-entered information in Auror is restricted to prohibit the collection of sensitive information such as ethnicity, race, religion, political affiliation, or sexual orientation.

Therefore, these sensitive characteristics cannot be analyzed, searched, or used in generating the POI lists. This is a bedrock safeguard in Auror, as it means outcomes are based on past offending and behavior, and there is no ability to profile or categorize people.

Any new product at Auror must undergo an assessment against a Responsible Tech and AI Assessment framework, which requires an independent Privacy Impact Assessment (PIA) by leading local law firms.

Independent testing shows that best-in-class FRT attains high accuracy in different scenarios.

All FRT providers undergo strict technical testing conducted by Auror data scientists. Auror tests for a range of areas such as anti-bias and accuracy, including the diversity of datasets used to train the FRT, with a focus on real world environments that reflect retail stores.

FRT does not explicitly capture tattoos or markings in the biometric template. Instead, FRT uses a series of geometric and structural anchor points such as eyes, nose, cheekbones, and the distances between these points to generate the temporary biometric templates.

Retailers undertake their own assessments prior to deployment to ensure local context informs the use of the technology. Retailers are ultimately responsible for the quality and accuracy of their information used to generate their POI list and determine accurate suggested matches.

Retailers also determine training for their teams and community expectations on when, how and where they deploy FRT. This is important as Subject Recognition ensures a human is always in the loop on decisions.

What guardrails are in place to promote responsible use?

Subject Recognition is based on the retailer’s information on Auror, which is deliberately restricted to prohibit the collection of sensitive information such as ethnicity, race, religion, political affiliation, or sexual orientation.

It is strictly used for identification of POIs for crime prevention and safety purposes only. It cannot in any way be used for profiling, categorization, prediction, tracking, monitoring behavior, or targeted marketing purposes.

To be enrolled to the list, a POI must already exist within the retailer’s information on Auror and meet the retailer’s strict criteria for ‘Extreme Threat’, ‘Serious Threat’ or ‘High Loss’.

Retailers cannot manually or arbitrarily enroll any profile to the POI list. The software suggests enrollments based on the retailer’s criteria and they can review these suggestions based on their own policies.

There are no shared POI lists between retailers and they cannot be shared with law enforcement. Law enforcement cannot access the Subject Recognition module or POI lists using Auror.

Identification is based on historic images linked to previously recorded events within a maximum 24-month period.

Biometric data is never entered or stored within the retailer’s information on Auror, regardless of a match. All processing of biometric data takes place within the integrated third-party FRT software.

If there is no match, the image and temporary biometric template is only ever processed by the integrated third-party FRT software and discarded immediately.

A human is always in the loop. No decisions are made exclusively through automation. Built-in integrity tools provide retailers with full auditability and transparency.

Retailers can apply strict access controls around Subject Recognition information. All activity is auditable and monitored.

Retailers and individual users of Auror are required to adhere to all applicable local laws and internal policies.

All information held on behalf of our retail customers is end-to-end encrypted. Auror is also SOC 2 Type 2 compliant and regularly undergoes cyber security penetration testing.

What is the accuracy rate of the integrated FRT?

Independent testing shows that, through broader and more diverse training datasets and improved image quality, leading FRT systems are delivering consistently high accuracy rates in a range of real-world environments.

It’s important to note that the quality of the reference image is a key component in delivering accurate results. Auror has tools in place to support retailers in entering high-quality images and information, and prevents reference images from being used if they do not meet the quality threshold. With good-quality reference images, leading FRT consistently delivers a true match rate as high as 99.8% in challenging conditions with poor lighting and off-angle views; accuracy is even higher in controlled, well-lit environments.

The true match rate accounts for the false negative rate (missed matches) and the false positive rate (incorrect matches). For FRT generally, these rates change based on the confidence threshold set. For example, if the confidence threshold is set very high, the FRT will likely deliver an extremely low false positive rate but a higher false negative rate, as illustrated in the sketch below.
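To illustrate the trade-off, here is a small, self-contained example with made-up scores (not measured FRT performance): raising the confidence threshold removes false positives but misses more genuine matches.

```python
def outcomes_at_threshold(scored_pairs, threshold):
    """Count outcomes for (similarity score, is_genuine_match) pairs at a threshold."""
    tp = sum(1 for score, genuine in scored_pairs if score >= threshold and genuine)
    fp = sum(1 for score, genuine in scored_pairs if score >= threshold and not genuine)
    fn = sum(1 for score, genuine in scored_pairs if score < threshold and genuine)
    return {"true_matches": tp, "false_positives": fp, "false_negatives": fn}

# Toy data: four genuine matches and four non-matches (illustrative numbers only).
samples = [(0.99, True), (0.97, True), (0.93, True), (0.88, True),
           (0.91, False), (0.72, False), (0.55, False), (0.40, False)]

print(outcomes_at_threshold(samples, 0.90))  # {'true_matches': 3, 'false_positives': 1, 'false_negatives': 1}
print(outcomes_at_threshold(samples, 0.95))  # {'true_matches': 2, 'false_positives': 0, 'false_negatives': 2}
```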

Subject Recognition supports high confidence thresholds to ensure a high level of accuracy, powered by best-in-class integrated FRT providers.

As a further safeguard, a human is always in the loop on decisions, meaning an authorized, trained retailer employee can view the historic reference image and the detection image and confirm or decline the match. No decisions are made exclusively through automation.

Subject Recognition reduces human bias by suggesting POI matches based on the retailer’s information about past high-harm and prolific offending only, not on how a person looks, their demographic characteristics or what they’re wearing.

Full transparency and auditability tools are available. Retailers can record the outcomes of detections, which can be monitored to determine performance and if anything requires further investigation.

How do retailers handle information about minors?

Retailers determine how they manage information relating to minors recorded in their information on Auror, including whether they want to be alerted when a minor enters their stores. They may choose to be alerted based on previous high-harm or prolific offending, such as violent behavior or use of a weapon. Retailers can also choose to label events involving known minors as ‘sensitive’, which means that information cannot be enrolled in the POI list for the purposes of Subject Recognition.

Who is responsible for building and training the integrated FRT?

The approved FRT providers are responsible for developing, maintaining and improving their technology. Retailer information on Auror is never used to train the FRT provider's system.

Can a POI (individual) request access to personal information?

Yes. Under privacy regulations in most jurisdictions, you can request your personal information from the entity that collected it. Individuals can request their personal information from the retailer, and the retailer will make a determination about how to handle the request. It's important to note that Auror is a Software as a Service (SaaS) provider to retailers; retailers determine what information they enter and remain in control of their own data. Auror acts as their agent under their instruction, so we cannot release information directly. We provide retailers with the ability to audit, search, and download information.

Once you make a request, the retailer may require more information, such as your full name, date of birth and the time and location of the event. If necessary, you may contact privacy@auror.co for more information about this process.
