Evaluating Aerial Systems for Crime-Scene Reconstruction

Security professionals are seizing the opportunities offered by ICTs to modify existing strategies and create innovative ones for crime prevention and response, with the aim of achieving greater effectiveness and efficiency. New remote-sensing technology mounted on drones offers the possibility of improving crime-scene reconstruction alongside the traditional model, i.e. the physical inspection performed by an agent at the crime scene.

In order to analyse the characteristics and differences between the aerial system (drone) and the terrestrial system (laser scanning), simulations of three different outdoor scenarios envisaged in the 2021 report “Evaluating Aerial Systems for Crime-Scene Reconstruction” [1] by the National Institute of Justice were conducted at the Crisis City Training Center near Salina, Kansas: (1) an urban scene recreating a carjacking and shooting including broken glass, bullet casings and pools of blood; (2) a forest where a suicide has taken place, with empty alcohol containers and narcotics; and (3) an open field with a clandestine grave, a shovel, a cell phone and clothing.

The results of these simulations made several differences apparent. In favour of the aerial system (as opposed to terrestrial laser scanning): (1) it does not require forensic personnel to walk through the crime scene, risking contamination and/or destruction of evidence as well as bodily harm in hazardous environments; (2) it captures data on the entire crime scene faster; (3) it is cheaper, costing about $15,000, whereas a conventional terrestrial laser scanner comes in at about $75,000; and (4) because it captures information from above, there are no blind spots, unlike in laser scanning, where blind spots occur if there are too few scanning positions or if obstacles block the line of sight.

On the other hand, laser scanning delivers higher accuracy, with an error of about 1 mm, and it preserves image quality in the dark and regardless of environmental conditions. The drone’s error was about 1 cm and, moreover, in scenes with tree cover such as the forest, the altitude the drone had to maintain to avoid the treetops reduced image quality. Atmospheric variables (cloud cover, temperature, wind, precipitation, etc.), which condition the efficiency of the aerial system, must also be taken into account.

Therefore, when reconstructing 3D images of the (simulated) crime scenes, maximum efficiency is achieved by combining terrestrial and aerial scanning, since using both systems allows faster capture of the entire crime scene while maintaining a higher level of accuracy. The procedure is a non-intrusive technique that helps investigators prevent scene contamination, and the results allow officers, lawyers and judges to “walk through” the scene at any time, even years later, and to verify details such as distances and lines of sight.
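
As a rough illustration of how such a combination can work in practice, the sketch below registers an aerial photogrammetry point cloud onto a terrestrial laser scan using the open-source Open3D library and merges the two into one scene model. It is a minimal sketch only: the file names, voxel size and distance threshold are hypothetical, and the report does not specify which software was used.

```python
# Minimal sketch: fuse an aerial (drone) point cloud with a terrestrial laser scan.
# Assumes both clouds were exported as PLY files; file names are hypothetical.
import numpy as np
import open3d as o3d

aerial = o3d.io.read_point_cloud("aerial_photogrammetry.ply")   # ~1 cm error level
terrestrial = o3d.io.read_point_cloud("terrestrial_scan.ply")   # ~1 mm error level

# Downsample both clouds to speed up registration.
aerial_ds = aerial.voxel_down_sample(voxel_size=0.02)
terrestrial_ds = terrestrial.voxel_down_sample(voxel_size=0.02)

# Align the aerial cloud onto the more accurate terrestrial cloud with ICP.
result = o3d.pipelines.registration.registration_icp(
    aerial_ds, terrestrial_ds,
    max_correspondence_distance=0.05,   # 5 cm search radius (assumed)
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# Apply the estimated rigid transform and merge the clouds into one scene model.
aerial.transform(result.transformation)
merged = terrestrial + aerial
o3d.io.write_point_cloud("merged_scene.ply", merged)
```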

[1] Report: “Evaluating Aerial Systems for Crime-Scene Reconstruction”, National Institute of Justice, 2021. https://nij.ojp.gov/topics/articles/evaluating-aerial-systems-crime-scene-reconstruction

_____


France installs noise radars in seven municipalities

For the first time, France is installing noise radars to monitor the noise levels emitted by vehicles in areas with a 50 km/h speed limit. It is estimated that the fines imposed for exceeding the permitted noise levels could reach 135 euros.

The new noise radars have been installed in seven municipalities. The project, which will last for two years, went into operation a few weeks ago. This initiative is the first of its kind in Europe, so it will take time to evaluate the results and draw conclusions.

When France introduced radar speed guns twenty years ago, it drastically reduced the number of traffic accidents, helping to save thousands of lives. For the time being, the new noise radars are being run as a trial. The sensors can detect and record vehicles that emit excessive noise, a problem that has grown in recent years. The authorities’ aim is to set a noise pollution limit and fine motorists who exceed it.

The initiative follows the growing intolerance of the French to street noise, especially from motorcycles and modified scooters. According to a study by Bruitparif – a French state-supported centre that monitors noise in the Paris area – a modified scooter crossing Paris at night can wake up to 10,000 people.

According to a study recently published by the European Environment Agency, noise pollution causes around 16,000 premature deaths in Europe and 72,000 hospitalisations a year. Noise from road traffic is one of the main causes of these poor figures.

According to a 2011 World Health Organization report, noise is the environmental factor that causes the most health problems after air pollution, increasing the risk of cardiovascular disorders and high blood pressure.

According to the new decree published in the French official gazette, the radars, which are equipped with microphones and cameras to capture the license plate of the offending vehicle, must be located on the hard shoulder.

The first radars installed are located in Saint-Lambert, a town south-west of Paris that often forms part of the route taken by motorcyclists, quad riders and similar vehicles. According to measurements carried out by the Bruitparif agency, noise levels recorded last year in this area were between 210 and 520 dB.

More noise radars will gradually arrive in other municipalities, such as Nice and Toulouse. Although the decibel limit above which an offence is committed has not yet been established, the first tests set a maximum of 90 dB.
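
To make the principle concrete, here is a minimal sketch of the kind of threshold logic such a system might apply: it watches a stream of sound-level readings and records an event (to be paired with a licence-plate photo) whenever the level exceeds the configured limit. The 90 dB figure comes from the first tests described above; everything else (data structure, function names, the example plate) is an assumption for illustration, not a description of the French system.

```python
# Minimal sketch of threshold-based noise enforcement (illustrative only).
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

NOISE_LIMIT_DB = 90.0  # limit used in the first tests; the legal limit is not yet fixed

@dataclass
class NoiseEvent:
    timestamp: datetime
    level_db: float
    plate: str  # licence plate read from the camera (hypothetical OCR step)

def process_reading(level_db: float, read_plate) -> Optional[NoiseEvent]:
    """Record an event when a vehicle exceeds the permitted noise level."""
    if level_db <= NOISE_LIMIT_DB:
        return None
    return NoiseEvent(timestamp=datetime.now(), level_db=level_db, plate=read_plate())

# Example: a passing modified scooter measured at 104 dB (invented values).
event = process_reading(104.0, read_plate=lambda: "AB-123-CD")
if event:
    print(f"{event.timestamp:%Y-%m-%d %H:%M:%S} {event.level_db:.1f} dB plate={event.plate}")
```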

Other measures to be implemented in parallel to the noise radars will be the reduction of the speed limit and the planting of trees and various types of vegetation along the often-congested Paris ring road. Dan Lert, deputy mayor in charge of this plan, adds that emergency vehicles will be ordered to lower the volume of their sirens at night.

_____


Comprehensive review of the College of Policing in the United Kingdom

The UK College of Policing has carried out a thorough review, devised to improve leadership, standards and professionalism throughout the police force, with the aim of helping the police themselves and improving public service. Thus, three basic priorities have been established:

  • Promote professionalism, ensuring that officers and staff have access to the best continuing professional development and that it is appropriately prioritised.
  • Improve leadership by helping officers and staff at all levels to develop their leadership skills.
  • Promote consistency, overcoming the weaknesses of the 43-force model, so that there is consistency where it matters most to the public and to those who work in policing.

This review of the College of Policing was launched in March 2021 by Lord Herbert of South Downs, chair of the College of Policing. Objectives include:

  • Conduct an evaluation of the College’s role, its effectiveness and how it operates in conjunction with other policing organisations.
  • Ensure that the College is highly valued by all sectors of the police, from frontline officers and staff to chiefs and police commissioners.

Policing is becoming increasingly complex, and the culture and standards in the service are subject to increasing scrutiny.

To address these challenges, the review included extensive consultation with people of different ranks, grades and roles from across policing to find out what they want from their College of Policing. This included individual interviews, written evidence, focus groups, visits to police forces and a survey of some 15,000 officers and other personnel.

Respondents acknowledged the College’s success in addressing problems in some critical areas of policing, such as the response to the covid-19 pandemic, and identified future challenges for policing.

Identified challenges included:

  • Lack of professional development.
  • Insufficient investment in leadership development in all ranks.
  • Lack of coordinated strategic thinking across the police force.
  • Diffusion of responsibilities at the national level.
  • Insufficient equipment to respond to the growing digital aspects of crime.

Suggested improvements to address these problems include having the College act as a national centre for police leadership, making guidance and knowledge of what works more accessible to those on the front line through a College app, and introducing a consistent new approach to personal development for everyone in policing.

With these changes, the police will be better able to respond to the challenges they face, improve community confidence, reduce crime and keep citizens safe.

_____


5 things you should know about artificial intelligence

Although artificial intelligence has been the subject of academic research since the 1950s and has been used commercially in some industries for decades, it is still in its infancy in many sectors.

The rapid adoption of this technology, coupled with the unique privacy, security and accountability issues associated with it, has prompted efforts to ensure that its use is ethical and legal.

On the specialised website ABA Journal, authors Brenda Leong and Patrick Hall outline five things you should know about artificial intelligence:

1. Artificial intelligence is probabilistic, complex and dynamic. Machine learning algorithms are incredibly complex, learning billions of rules from datasets and applying those rules to arrive at an output recommendation; that output is a probability or score rather than a certainty.
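
As a simple illustration of what “probabilistic” means in practice, the sketch below trains a toy classifier and shows that it returns a probability for each outcome rather than a definitive answer. It is illustrative only: the data is invented and the model choice (logistic regression in scikit-learn) is an assumption, not anything referenced by the authors.

```python
# Toy example: an AI recommendation is a probability, not a certainty.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: two numeric features and a binary outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# The "recommendation" for a new case comes with a probability attached.
new_case = np.array([[0.2, -0.1]])
prob = model.predict_proba(new_case)[0, 1]
print(f"Predicted probability of a positive outcome: {prob:.2f}")
```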

2. Make transparency an actionable priority. The complexity of AI systems makes it difficult to ensure transparency, but organisations implementing AI can be held accountable if they are unable to provide certain information about their decision-making process.

3. Bias is a significant problem, but not the only one. AI systems learn by analysing billions of data points collected from the real world. This data can be numeric, categorical – such as gender and education level – or image-based, such as photos or videos. Because most systems are trained using the data generated by existing human systems, the biases that permeate our culture also permeate the data. Thus, there can be no such thing as an unbiased AI system.

Data privacy, information security, product liability and third-party sharing, as well as performance and transparency issues, are equally critical.

4. AI system performance is not limited to accuracy. While the quality and value of an AI system is largely determined by its accuracy, accuracy alone is not sufficient to measure the wide range of risks associated with the technology, and focusing too narrowly on it is likely to come at the expense of a system’s transparency, fairness, privacy and security.

Data scientists and lawyers, for example, should work together to create more robust ways of verifying AI performance that cover the full spectrum of real-world behaviour and potential harms, whether from security threats or privacy shortfalls.
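
The sketch below gives one hedged example of what “beyond accuracy” can mean: alongside overall accuracy, it computes the gap in positive-prediction rates between two groups, a simple demographic-parity check. The arrays are invented and this is only one of many possible fairness measures, not a method proposed by the authors.

```python
# Toy example: evaluating a model on more than accuracy.
import numpy as np

# Invented predictions, true labels and a protected attribute (group A/B).
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 0, 0, 1, 0, 0, 0])
group  = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

accuracy = (y_true == y_pred).mean()

# Demographic parity: how often does each group receive a positive prediction?
rate_a = y_pred[group == "A"].mean()
rate_b = y_pred[group == "B"].mean()
parity_gap = abs(rate_a - rate_b)

print(f"Accuracy: {accuracy:.2f}")
print(f"Positive-prediction rate A: {rate_a:.2f}, B: {rate_b:.2f}, gap: {parity_gap:.2f}")
```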

5. The hard work has just begun. Most organisations using AI technology still need to adopt policies governing its development and use, together with guidance to ensure their systems comply with regulations.

Some researchers, practitioners, journalists, activists and lawyers have begun this work to mitigate the risks and liabilities posed by current AI systems. Companies are beginning to define and implement AI principles and make serious attempts at diversity and inclusion for technology teams.

_____


Inquiry into the police response to honour-based abuse and forced marriage in England

Police recognition of and response to honour-based abuse, forced marriage and female genital mutilation, while well-meaning, do not usually provide adequate support for victims.

Jennifer Holton, director of Citizens in Policing for Wiltshire Police (England), addresses this in a study based on her own experience and on research into these issues, with the aim of highlighting the key challenges and opportunities for change.

The research was carried out through a thematic review of national statistics and research, as well as the engagement of frontline practitioners in the South West region of England.

Holton rejects the idea that it is enough to attend training courses or conferences, or to read research showing that the work is going badly, while the problem is not openly discussed and no one looks for ways to improve it.

She also documents a number of key issues such as incorrect recording of offences. Crimes such as harassment, rape or assault are often considered in isolation, and the qualifier of honour-based abuse is not recorded or recognised. Therefore, it is unlikely that a consensus on this crime will be reached if it is not even recognised. And without accurate statistics, it is almost impossible to achieve greater funding and development in this area.

The research also notes that nearly half of the participants said that their agency had a designated person to contact for help related to honour-based abuse, but the vast majority added that they did not know who these people were or how to contact them.

The disparity in reporting is further demonstrated by the statistics, especially those referring to female genital mutilation. According to National Health Service statistics for 2018-2019, 6,415 women attended a healthcare service where evidence of female genital mutilation was identified, yet successful prosecutions for this type of crime remain in single digits.

In 2019, 1,355 cases were referred to the Forced Marriage Unit for counselling. Of these cases, 64% were reported by professionals, 18% by victims and 18% by friends and family members anonymously.

The low proportion of cases referred by the victims reinforces the important role played by professionals. However, since many professionals are not confident in reporting it themselves, the problem lies in under-reporting at multiple sources.

In addition, the fact that most victims only report it to someone they trust, usually a close friend or family member, increases this responsibility on professionals to detect possible signs or symptoms.

School is the only setting in which a vulnerable young person at risk of honour-based abuse is guaranteed the chance to report it safely and independently. Jennifer Holton therefore stresses how incomprehensible it is that the subject is not taught in schools, which are precisely where discussion of it should be encouraged.

The research also shows that no individual or organisation owns the problem: there is no lead agency and no one to take responsibility and ensure that action is taken. So, instead of everyone being accountable, it turns out that no one is.

_____


A model for predicting burglaries with forced entry in Catalonia

At the University of Girona on 22 November 2021, Dr Pere Boqué Busquet, an officer in the Catalan Police Force (Mossos d’Esquadra), defended his doctoral thesis, entitled Mathematical models for the prediction of burglaries with forced entry in Catalonia. Directed by Dr Marc Saez Zafra and Dr Laura Serra Saurina, the thesis analyses the current state of so-called “predictive policing” and, as its title indicates, proposes a mathematical model for predicting which areas of Catalonia are likeliest to suffer this type of crime at any given time.

One of the first aspects Dr Boqué highlights is that predictive models applied in other parts of the world (particularly in the US) are not valid in Catalonia. Among other factors, the regional and urban configuration of the territory means that repeat victimization patterns (part of the basis for these models) cannot be transferred directly to the Catalan context. Nevertheless, by dividing the territory into cells 5 km on each side, he succeeded in showing that burglaries with forced entry follow time patterns in the form of waves or streaks, and can therefore be predicted. If the police detected the starting point, they could therefore prevent the pattern of replications or repeat crimes from developing. This conclusion is reported in an article published in the European Journal of Criminology, “‘Surfing’ burglaries with forced entry in Catalonia: Large-scale testing of near repeat victimization theory”, by Pere Boqué, Laura Serra and Marc Saez (November 2020).
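
As a hedged illustration of the kind of pre-processing such an analysis involves, the sketch below assigns burglary events (with coordinates and dates) to 5 km grid cells and builds weekly counts per cell, the sort of series in which wave- or streak-like patterns could be looked for. The column names and projected coordinates are assumptions; this is not the thesis’s actual code.

```python
# Sketch: aggregate burglary events into 5 km grid cells with weekly counts.
# Assumes events come with projected (metric) coordinates and a date column.
import pandas as pd

CELL_SIZE_M = 5_000  # 5 km cells, as in the large-scale analysis

events = pd.DataFrame({
    "x": [430_120.0, 431_900.0, 436_450.0, 430_700.0],      # invented coordinates (metres)
    "y": [4_585_300.0, 4_585_900.0, 4_590_100.0, 4_586_200.0],
    "date": pd.to_datetime(["2020-01-03", "2020-01-05", "2020-01-20", "2020-02-02"]),
})

# Index of the 5 km cell each event falls into.
events["cell_x"] = (events["x"] // CELL_SIZE_M).astype(int)
events["cell_y"] = (events["y"] // CELL_SIZE_M).astype(int)

# Weekly counts per cell: the series in which waves or streaks would show up.
weekly = (
    events.groupby(["cell_x", "cell_y", pd.Grouper(key="date", freq="W")])
          .size()
          .rename("burglaries")
          .reset_index()
)
print(weekly)
```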

The other results of the thesis and the construction of the model are presented in two further articles. In the second article, a log-Gaussian Cox process model is applied to explore the possibility of making predictions on a smaller scale, in cells 500 m, or even 250 m or 100 m, on each side. The conclusion is that, although a small-scale repeat victimization pattern can also be detected in Catalonia, it is not fit for purpose for modelling the overall dynamics of burglaries.
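
For readers unfamiliar with the model family, the sketch below simulates a log-Gaussian Cox process on a small grid: a Gaussian random field is drawn, exponentiated to give a positive intensity surface, and the burglary count in each cell is then drawn from a Poisson distribution with that intensity. The grid size and covariance parameters are arbitrary; this illustrates the model family only, not the thesis’s fitted model.

```python
# Sketch: simulate a log-Gaussian Cox process on a regular grid.
import numpy as np

rng = np.random.default_rng(42)
n = 20                      # 20 x 20 grid of cells (arbitrary)
coords = np.array([(i, j) for i in range(n) for j in range(n)], dtype=float)

# Exponential covariance between cells (range and variance are arbitrary choices).
dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
cov = 1.0 * np.exp(-dists / 3.0) + 1e-8 * np.eye(n * n)

# Gaussian random field -> log-intensity; exponentiate to get the Cox intensity.
field = np.linalg.cholesky(cov) @ rng.standard_normal(n * n)
intensity = np.exp(-1.0 + field)      # baseline log-intensity of -1 (arbitrary)

# Observed counts per cell are Poisson given the (random) intensity.
counts = rng.poisson(intensity).reshape(n, n)
print(counts.sum(), "simulated burglaries across the grid")
```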

The third article proposes overcoming this limitation through a “new space-time victimization pattern that extends the concept of near repetition to that of repetition in different areas that have broadly similar characteristics but may be further apart in geographical terms.” These “groupings of areas that are often victimized at the same time” are described as “constellations of burglaries” and form a fixed body of reduced dimensions that is stable in time and on the basis of which predictive models can be created.

These last two articles are still awaiting publication. For this reason, the date of publication for the doctoral thesis has not yet been decided.

Over and above the effectiveness of the mathematical model, and as Dr Boqué emphasised at the end of his thesis defence, the difficulty of “predictive policing” does not lie so much in the predictive aspect, i.e. in whether or not crime can be predicted, as in the “policing” aspect: which preventive actions can be carried out to reduce the probability that the predicted crimes will actually be committed. Only when the model is applied will it be possible to discover its real potential for having a preventive effect on crime rates. Be that as it may, it is clear that knowledge of the dynamics of space-time patterns of criminality offers the police an advantage that needs to be put to good use.

_____


A mobile application to improve safety and prevent crime

Maintaining a peaceful coexistence and encouraging social cohesion is never an easy task and always presents a challenge for cities. Nowadays, however, we have access to technology that can serve as a useful tool for promoting better communication between citizens, the public administration and law enforcement.

In December 2013, a joint initiative between the local council of Cornellà de Llobregat (Barcelona) and the private company Einsmer led to the town investing in one of these technologies in the form of a public-safety mobile application. The result is an extremely intuitive application that is easy to use, accessible to everyone and aims to safeguard citizens 365 days a year.

The app allows residents to notify the local police (Guàrdia Urbana) of any incident they observe, quickly dial emergency numbers, receive general or personalised alerts and warnings, help locate vulnerable people, send geo-located emergency messages to the local police command centre, and obtain and share educational information on preventive safety and first aid.

Furthermore, the system used by the application allows for inter-territorial cooperation that optimises resources and services between different municipalities and security forces, enhances collaboration between administrations and promotes citizen empowerment. The technical tool must be understood as a complement to the work carried out by Cornellà’s urban police force, governed by the Local Security Plan (PLASECOR), which includes actions designed to prevent conflicts before they occur. It promotes the participation of the different security actors, the assumption of responsibilities in its implementation and the promotion of joint and transversal work with the different security forces and the citizens.

The versatility of the application means it can be adapted to specific needs or new approach strategies proposed by the administration. For example, in 2019, Cornellà de Llobregat Municipal Council and the company Einsmer added a new alert function to the application aimed exclusively at assisting women who experience gender-based violence or suffer public harassment.

The system works with two levels of security: level 1, intended only for women in situations of gender-based violence who have restraining orders against their aggressors (an alert that can be activated without unlocking the phone), and level 2, intended for public harassment alerts, which is activated from the phone’s home screen and puts the victim in direct contact with the local police command centre.

In both cases, the security alerts reach the local police immediately through a geo-location and tracking feature, which activates an assistance protocol and alerts the local police officer closest to the victim so that they can reach the scene as quickly as possible. In addition, the user’s data is never shared and is only visible to police officers when the citizen requests assistance.
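
As a rough sketch of how such an alert might be routed, the code below builds a geo-located alert with one of the two levels described above and picks the on-duty officer closest to the victim using the haversine distance. All names, coordinates and the dispatch logic are assumptions for illustration, not details of the Einsmer application.

```python
# Sketch: two-level, geo-located alert routed to the nearest on-duty officer.
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt

@dataclass
class Alert:
    level: int            # 1 = gender-violence alert (restraining order), 2 = public harassment
    lat: float
    lon: float

@dataclass
class Officer:
    name: str
    lat: float
    lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def dispatch(alert: Alert, on_duty: list) -> Officer:
    """Return the on-duty officer closest to the alert location."""
    return min(on_duty, key=lambda o: haversine_km(alert.lat, alert.lon, o.lat, o.lon))

# Invented example: a level-2 (public harassment) alert in Cornellà de Llobregat.
alert = Alert(level=2, lat=41.3566, lon=2.0746)
officers = [Officer("patrol-1", 41.3601, 2.0702), Officer("patrol-2", 41.3489, 2.0853)]
print("Nearest unit:", dispatch(alert, officers).name)
```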

In order to share initiatives such as this with other cities, Cornellà de Llobregat Municipal Council has recently joined the EFUS and FEPSU municipality networks.

_____


A facial recognition technology has been declared illegal in Canada

Clearview’s controversial practice of collecting and selling billions of faceprints was dealt a heavy blow by Canada’s privacy commissioners.

Clearview, founded in 2017 by Australian entrepreneur Hoan Ton-That, is in the business of collecting what it calls “faceprints” – unique biometric identifiers similar to a fingerprint or DNA profile – from photos people post online.

Canadian authorities have found that the collection of facial-recognition data by Clearview is illegal because it violates federal and provincial privacy laws, representing a win for individuals’ privacy and potentially setting a precedent for other legal challenges to the controversial technology.

A joint investigation by privacy authorities, led by the Office of the Privacy Commissioner of Canada, came to this conclusion, finding that the New York-based company’s scraping of billions of images of people from across the internet constituted mass surveillance and infringed the privacy rights of Canadians.

Moreover, the investigation found that Clearview had collected highly sensitive biometric information without people’s knowledge or consent and had then used and disclosed it for purposes that would not be appropriate even if people had consented.

Since 2019, the company has faced legal challenges to its technology and business practices, part of a larger question of whether facial-recognition technologies being developed by myriad companies—including companies like Microsoft and IBM—should be legal at all.

To date, Clearview has amassed a database of billions of these faceprints, which it sells to its clients. It also provides access to a smartphone app that allows clients to upload a photo of an unknown person and instantly receive a set of matching photos.
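
To give a sense of how matching against a database of faceprints works in general terms (this is a generic illustration, not Clearview’s actual system), the sketch below stores face embeddings as numeric vectors and returns the database entries closest to a query embedding by cosine similarity. The embedding vectors here are random placeholders; in a real system they would come from a face-recognition model.

```python
# Generic sketch of faceprint matching: nearest neighbours by cosine similarity.
import numpy as np

rng = np.random.default_rng(7)
EMB_DIM = 128                                # typical face-embedding size (assumed)

# Placeholder database: each row is the "faceprint" (embedding) of one photo.
database = rng.normal(size=(10_000, EMB_DIM))
database /= np.linalg.norm(database, axis=1, keepdims=True)

def top_matches(query: np.ndarray, k: int = 5) -> np.ndarray:
    """Return the indices of the k database entries most similar to the query."""
    q = query / np.linalg.norm(query)
    similarities = database @ q              # cosine similarity (rows are unit length)
    return np.argsort(similarities)[::-1][:k]

# A query embedding (in practice, computed from an uploaded photo).
query = rng.normal(size=EMB_DIM)
print("Closest database entries:", top_matches(query))
```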

One of the main arguments Ton-That has made in his company’s defence in published reports is that the significant benefits of using its technology for law enforcement and national security outweigh individuals’ privacy concerns, and that Clearview is not to blame if law enforcement misuses its technology.

The decision in Canada will likely help other legal challenges, not only to Clearview’s technology but to facial recognition in general. Last May, the American Civil Liberties Union sued Clearview for privacy violations in Illinois, a case that is ongoing. Lawmakers in the United States have even proposed a nationwide ban on facial recognition.

The technology also raises questions of racial bias and the potential for false accusations against innocent people.  In December two black men filed a case against police in Michigan, saying they were falsely identified by facial-recognition technology—specifically, DataWorks Plus, which is used by Michigan State Police.

_____


US police practise with a robot dog

Several US police departments have deployed the cyber-K-9 robotic dog in different situations, including hostage situations. Those in favour of using it say robots can help keep police officers safe, but critics are worried about how they might be used in the absence of clear policy guidelines.

The robot is equipped with cameras, lights and a two-way communication system that allows its operator to see and hear its surroundings in real time. The New York Police Department (NYPD) acquired the robot in December 2020 and has so far deployed it on active duty three times, most recently a few days ago, when it climbed the stairs of an apartment building in the Bronx looking for two suspects in an ongoing investigation.

The robot, sold as “Spot” by robotics company Boston Dynamics, costs US$74,000. NYPD officials have described it as a promising new technology that could save lives and reduce the risk to law enforcement officers by gathering information in risky locations and removing the need to send humans into compromised situations. (Last autumn, for example, it was used to deliver food during a hostage situation in Queens.)

“The NYPD has been using robots since the 1970s to save lives in hostage situations & hazardous incidents,” the department said on Twitter. “This model of robot is being tested to evaluate its capabilities against other models in use by our emergency service unit and bomb squad”.

Jay Stanley, a senior policy analyst with the American Civil Liberties Union, says the deployment of the technology for police surveillance also raises other issues. Could the robot be autonomous? Is it a good investment at a time when communities are examining the relationship between police officers and citizens?

There are questions around whether the police will be transparent, have clear policies on the use of the technology, and ensure that the public is part of the conversation every step of the way.

Boston Dynamics, the manufacturer of the robot, said a clause had been added to the lease preventing Spot from being used in any way to physically harm or intimidate people.

The Honolulu Police Department has been using a Spot robot primarily for work in a tent city for homeless people during the COVID-19 pandemic. This deployment has also been controversial, albeit for different reasons: according to the Honolulu Civil Beat, the robot dog was bought with almost US$150,000 in federal coronavirus aid money.

John McCarthy, deputy director of the department, said in a statement that the robot had other uses related to the pandemic, including thermal imaging and the delivery of food and medicines.

Much of this work is currently done by officers, some of whom are paid overtime. In the long run, he argued, the Spot robot will save money and keep officers safe.

_____


Is the world ready for facial recognition drones?

Some of the first drones with advanced facial recognition capabilities are being developed by Israeli surveillance companies, as North American police forces consider whether they will soon add the controversial technology to their unmanned flying machines.

In a sign of the imminent arrival of biometric identification from the air, an Israeli start-up previously funded by Microsoft has patented technologies for drone-based facial recognition. Tel Aviv-based AnyVision filed a patent application detailing technology to help a drone find the best angles for a facial recognition shot before trying to match the target against faces stored in a database.

The patent aims to iron out some of the complexities of identifying faces from a flying machine. Various obvious issues arise when trying to recognise someone from a drone: acquiring an angle at which a face can be properly captured and being able to get good-quality visuals whilst moving. Both are considerably harder than getting a match from static footage.
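
A hedged sketch of one part of that problem is shown below: before attempting any match, a drone pipeline might score the frames it captures and keep only the sharpest ones with a sufficiently large face. The scoring used here (variance of the Laplacian for sharpness, plus detected face size) is a common heuristic chosen for illustration; it is not taken from AnyVision’s patent, and the face boxes are assumed to come from a separate detector.

```python
# Sketch: pick the best frames from drone video before attempting a face match.
# Assumes a face detector has already supplied a bounding box per frame.
import cv2
import numpy as np

def frame_score(frame_bgr: np.ndarray, face_box) -> float:
    """Score a frame by face size and sharpness (variance of the Laplacian)."""
    x, y, w, h = face_box
    face = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(face, cv2.CV_64F).var()   # low values = motion blur
    return (w * h) * sharpness                          # favour large, sharp faces

def best_frames(frames, boxes, keep: int = 3):
    """Return the indices of the frames most worth sending to the matcher."""
    scores = [frame_score(f, b) for f, b in zip(frames, boxes)]
    return sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:keep]

# Invented example: three synthetic frames with the same face box.
frames = [np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8) for _ in range(3)]
boxes = [(200, 100, 120, 120)] * 3
print("Frames to pass to the matcher:", best_frames(frames, boxes))
```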

U.S. military agencies have been trying to come up with solutions, including the Advanced Tactical Facial Recognition at a Distance Technology project at U.S. Special Operations Command (SOCOM) and the Intelligence Advanced Research Projects Activity (IARPA) Biometric Recognition and Identification at Altitude and Range initiative.

But private industry may get there first. In December 2020, it was revealed that AnyVision executives had partnered with an Israeli defence supplier in a new joint venture called SightX. In demos given to Israeli media in late 2020, SightX’s small drones did not have any facial recognition capabilities, though executives said the feature was coming soon. It is unclear whether the technology is intended for the military only or will also be sold to police agencies.

What is clear is that the technology is ready for launch. AnyVision CEO Avi Golan told Forbes that whilst AnyVision didn’t have any in-production drones with facial recognition, they would be a reality soon. He pointed to the fact that delivery drones would potentially require facial recognition to determine whether they’ve reached the correct buyer. Amazon has already patented similar tech, pointing to its potential plans for its experimental drone delivery fleet.

As for when North Americans can expect police drones with facial recognition, even if police agencies are not immediately planning to send them into the skies, the expectation is that they will arrive in one form or another.

_____
