What can the EU learn from Steven Seagal?
Since the attacks of 9/11, counterterrorism has been given a permanent spot on the political agenda. Politicians agree that everything that can be done should be done to prevent terrorism. Over the years, though, the limits of what can be done have been pushed and stretched. Measures that were once considered too susceptible to abuse, or too great a threat to privacy, are implemented later anyway. Through a politics of fear, the initial concerns about these measures are set aside in order to provide the much-needed security.
An example of this is the EU Passenger Name Records (PNR) programme. PNR data consists of passenger flight data and is collected by airlines and travel agencies. Although several individual EU countries are already collecting this data, there are no European regulations. The wide use of PNR data in the EU for combatting terrorism was considered off-limits for a long time, due to privacy concerns. But a new proposal aims to breathe new life into the use of PNR for counterterrorism in order to prevent incidents like the attack on Charlie Hebdo. Although the privacy of citizens is not completely disregarded, it does seem to have lost some of its value in comparison to previous PNR proposals.
PNR data is seen as important in the fight against terrorism and serious crime. It is described by Members of the European Parliament as “vital in the fight against terrorism”, a “tool in the toolbox” of law enforcement and able to “help authorities track the movements of those who travel to training camps.” The underlying assumption is that data can pinpoint terrorists: collecting, connecting and analyzing data can show you who is dangerous and, subsequently, who to act against.
The issue here is that assumptions are being made about the effectiveness of data collection and analysis. And, as the Steven Seagal movie classic Under Siege 2 has so eloquently taught us, assumptions are the mother of all screw-ups (paraphrased). Very little is known about how the systems that are already in place perform. Systematic evaluation seems to be lacking, or the systems are so complex that evaluation becomes difficult. The assumptions about the effectiveness of data collection and analysis do not seem to be based on evidence from previously implemented programmes.
Also, central to the use of risk assessments is the idea of discovering patterns in behaviour. The information gathered in known terrorist cases is analyzed for shared characteristics. By searching for similar behaviour in the gathered data, potentially risky individuals can be pinpointed. This is also the concept behind the use of PNR data for counterterrorism. Herein lies the assumption that terrorists all show the same behaviour and follow the same steps. But research shows that the effectiveness of profiling in terrorism has not been proven.
This idea of looking for patterns in data gained popularity after the events of 9/11. Analysts found that if they had had all the data gathered by the different agencies, they could have identified everyone involved in the attacks. They used this to stress the importance of data and surveillance. The obvious problem here is best expressed by the phrase “hindsight is always 20/20.” If you know what or who to look for, connecting the dots is not all that hard. It is predicting the unknown that is difficult. An example of this is the fact that the Australian authorities received 18 tips in 3 days about the person who would later take 18 hostages in Sydney and kill two of them. Even though the hostage taker was known to the authorities, a risk assessment showed he was not a priority.
A political dilemma
Of course, for academics it is easy to point out problems with the decisions being made by politicians. We will not be the ones taking the heat when something actually happens. And therein also lies the problem with counterterrorism measures. As a politician, you do not want to be the one who opposed a measure that could have prevented a terrorist attack. At the same time, you also do not want to be the person responsible for unnecessarily violating the privacy of citizens. These can be hard decisions. However, it would be wise to base them on facts and evidence, not assumptions. Maybe politicians should learn what the terrorists fighting Steven Seagal learned the hard way: assumptions are the mother of all screw-ups.