{"id":954269,"date":"2026-05-12T06:47:23","date_gmt":"2026-05-12T06:47:23","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/954269\/"},"modified":"2026-05-12T06:47:23","modified_gmt":"2026-05-12T06:47:23","slug":"smart-borders-blind-spots-surveillance-technology-and-fundamental-rights-at-europes-borders","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/954269\/","title":{"rendered":"Smart Borders, Blind Spots: Surveillance Technology and Fundamental Rights at Europe&#8217;s Borders"},"content":{"rendered":"<p><strong>\u2014<\/strong><a href=\"https:\/\/www.linkedin.com\/in\/amaliyakartika\/\" target=\"_blank\" rel=\"noopener\">Amaliya Kartika Putri<\/a>, LL.M in Law and Technology from Utrecht University, focusing on GDPR, AI regulation, and fundamental rights in emerging technologies<\/p>\n<p><img fetchpriority=\"high\" decoding=\"async\" width=\"314\" height=\"417\" data-id=\"16492\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2026\/05\/Putri-photo.png\" alt=\"\" class=\"wp-image-16492\"  \/><\/p>\n<p>The European Union has adopted modern surveillance technology, including drone monitoring and facial recognition technology (FRT) to enhance border security. These technologies raise fundamental rights issues, yet they also have the potential to identify travelers efficiently and detect unauthorized crossings. As to the former point, EU law and the European Convention on Human Rights (ECHR) protect among others fundamental rights to non-discrimination and private life. <a href=\"https:\/\/hudoc.echr.coe.int\/#{%22itemid%22:[%22001-225655%22]}\" target=\"_blank\" rel=\"noopener\">The Glukhin v. Russia (2023) ECtHR judgment<\/a> confirms that the implementation of random facial recognition infringes upon privacy protections. 
Moreover, Murray and other scholars warn that the extensive deployment of facial recognition technology may result in a \u201c<a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/09240519241253061\" target=\"_blank\" rel=\"noopener\">pervasive surveillance state<\/a>\u201d that would constrain personal conduct and threaten democratic institutions.<\/p>\n<p>In this blog post I investigate the impact of drone surveillance and facial recognition technology on privacy and equality rights at EU external borders. The analysis examines the current legal framework, including the ECHR, the EU Charter of Fundamental Rights, the GDPR, and the AI Act. Although these instruments offer a crucial basis for safeguarding fundamental rights, this blog argues that further legal clarification and more robust protections are required to address the intrusive and discriminatory risks presented by these emerging technologies.<\/p>\n<p><strong>Surveillance Technologies at the EU Border<\/strong><\/p>\n<p>Facial recognition systems and unmanned aerial drones have transitioned from science fiction to reality at European borders. The EU, together with its Member States, has established pilot programs and systems that automate border control operations through these technological tools. <a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/1369118X.2020.1792530\" target=\"_blank\" rel=\"noopener\">The Horizon 2020 iBorderCtrl project<\/a> implemented a system that used facial scans together with \u201cbiomarkers of deceit\u201d to analyze micro-expressions and detect deception in travelers. 
Although the system faced strong <a href=\"https:\/\/www.tatup.de\/index.php\/tatup\/article\/view\/7100\" target=\"_blank\" rel=\"noopener\">scientific skepticism<\/a> and arguably gave rise to rights violations, the EU remains strongly interested in using FRT for the biometric verification of travelers, as demonstrated by the development of a <a href=\"https:\/\/home-affairs.ec.europa.eu\/news\/commission-announces-launch-shared-biometric-matching-service-2025-05-19_en\" target=\"_blank\" rel=\"noopener\">shared biometric matching service<\/a> across EU border systems and <a href=\"https:\/\/home-affairs.ec.europa.eu\/news\/entryexit-system-will-become-fully-operational-10-april-2026-2026-03-30_en\" target=\"_blank\" rel=\"noopener\">the integration of facial image capture and matching<\/a> within the Entry\/Exit System (EES). The EU-funded prototypes BorderUAS and ROBORDER have developed drones and ground robots that operate as interoperable \u201c<a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=4883130\" target=\"_blank\" rel=\"noopener\">robo swarms<\/a>\u201d for autonomous border surveillance. These drones operate for extended periods across large areas, using cameras and sensors to identify \u201c<a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=4883130\" target=\"_blank\" rel=\"noopener\">illegal border activities<\/a>\u201d. Through this technology, authorities can monitor remote border regions and locations beyond national borders via a process of \u201c<a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=4883130\" target=\"_blank\" rel=\"noopener\">extraterritorialization<\/a>\u201d that extends border enforcement. 
Such systems provide continuous surveillance of migrants and travelers, and are typically favored for their ability to detect illegal border crossings.<\/p>\n<p><strong>Legal Framework for Technology Use<\/strong><\/p>\n<p>The use of these technologies at the border rests on an evolving set of legal instruments. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2018\/1861\/oj\/eng\" target=\"_blank\" rel=\"noopener\">The Schengen Information System (SIS)<\/a> regulations include provisions for facial recognition, stating that facial images in SIS can be used for identification purposes at external border crossings once the technology reaches sufficient reliability. Border guards in the Schengen Area would then be able to perform immediate face comparison checks against SIS watchlists for travelers entering the region. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2018\/1240\/oj\/eng\" target=\"_blank\" rel=\"noopener\">The European Travel Information and Authorisation System (ETIAS)<\/a> also requires visa-exempt visitors to pre-register and be screened against security and migration risk profiles. 
However, ETIAS\u2019 automated processing raises concerns about algorithmic profiling of travelers for \u201csecurity or illegal immigration risk,\u201d which could implicate <a href=\"https:\/\/www.europeanpapers.eu\/en\/system\/files\/pdf_version\/EP_eJ_2024_3_SS1_6_Lorenzo_Gugliotta_Abdullah_Elbi_00797.pdf\">privacy and non-discrimination<\/a>.<\/p>\n<p>As I discuss below, the main human rights concerns that arise in relation to these tools are how they affect the privacy of those crossing borders, and whether they introduce bias or discrimination into border enforcement.<\/p>\n<p><strong>Implications for the Right to Private Life and Data Protection<\/strong><\/p>\n<p>I suggest that drone surveillance, along with facial recognition systems, interferes with the privacy of those crossing borders by collecting and processing extensive amounts of personal data. The right to private life under <a href=\"https:\/\/www.echr.coe.int\/documents\/d\/echr\/convention_ENG\">Article 8 ECHR<\/a> protects several facets of private life, including personal data and identity. <a href=\"https:\/\/www.europarl.europa.eu\/charter\/pdf\/text_en.pdf\">The EU Charter of Fundamental Rights<\/a> protects private life through Article 7 and personal data through Article 8. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2016\/679\/oj\/eng\" target=\"_blank\" rel=\"noopener\">The GDPR<\/a> identifies biometric data as sensitive personal data that requires specific protection. 
It also establishes a general prohibition on processing biometric data for individual identification purposes but allows such processing when specific conditions or public interest exceptions apply (<a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2016\/679\/oj\/eng\" target=\"_blank\" rel=\"noopener\">GDPR Art. 9<\/a>). <a href=\"https:\/\/eur-lex.europa.eu\/eli\/dir\/2016\/680\/oj\/eng\" target=\"_blank\" rel=\"noopener\">The Law Enforcement Directive (EU) 2016\/680<\/a>, along with national laws, requires that surveillance technology use be both necessary and proportionate to achieve legitimate aims, with sufficient safeguards in place. The deployment of FRT or drones requires a specific legal foundation that serves a legitimate <a href=\"https:\/\/www.echr.coe.int\/documents\/d\/echr\/convention_ENG\">security purpose and demonstrates necessity in a democratic society<\/a>, according to the ECHR\u2019s three-part test.<\/p>\n<p>In <a href=\"https:\/\/hudoc.echr.coe.int\/#{%22itemid%22:[%22001-225655%22]}\" target=\"_blank\" rel=\"noopener\">the above-mentioned Glukhin v. Russia (2023)<\/a> case, the European Court of Human Rights (ECtHR) analyzed the Russian police\u2019s use of live facial recognition technology to identify a solitary peaceful protester. The Court established that facial recognition surveillance is a highly intrusive measure that infringes Article 8 of the Convention unless appropriate limitations exist. <a href=\"https:\/\/hudoc.echr.coe.int\/#{%22itemid%22:[%22001-225655%22]}\" target=\"_blank\" rel=\"noopener\">The ECtHR agreed<\/a> with the assessment that the relevant Russian CCTV law granted excessive authority to the State without sufficient constraints on FRT deployment. It accordingly held that deploying facial recognition technology without sufficient protective measures and rules would violate fundamental rights. 
The Court established that reliance on FRT requires <a href=\"https:\/\/hudoc.echr.coe.int\/#{%22itemid%22:[%22001-225655%22]}\" target=\"_blank\" rel=\"noopener\">a high level of justification<\/a> when used to monitor peaceful protesters (and potentially identify them for arrest), given the chilling effect it is likely to create.<\/p>\n<p>EU border surveillance operates under the same principles. Border surveillance through drone monitoring can amount to mass surveillance when drones use high-resolution cameras or thermal sensors for continuous observation. Drones can monitor migrants and travelers for extended periods, gathering personal data through video and image collection without their awareness. Such practices interfere with the right to privacy, as they undermine <a href=\"https:\/\/eucrim.eu\/news\/fra-looks-facial-recognition-technology\/\" target=\"_blank\" rel=\"noopener\">the reasonable expectation of privacy<\/a> in both public and semi-public border areas. Thus, any such interference must be justified under the law.<\/p>\n<p>Although <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/?uri=celex:52022DC0303\" target=\"_blank\" rel=\"noopener\">the European Commission\u2019s strategic guidelines<\/a> for integrated border management mention drones in land-border surveillance, they also emphasize that technologies capable of collecting personal data must follow clear rules set by EU and national law. This is a rigorous standard: in Szab\u00f3 and Vissy v. Hungary, the ECtHR <a href=\"https:\/\/hudoc.echr.coe.int\/eng-press#{%22itemid%22:[%22003-5268616-6546444%22]}\" target=\"_blank\" rel=\"noopener\">insisted on strong protections against abuse<\/a>, especially for broad surveillance methods, which require independent approval, monitoring, and subsequent review. 
Based on this, a Member State might justify <a href=\"https:\/\/curia.europa.eu\/site\/upload\/docs\/application\/pdf\/2022-09\/cp220156en.pdf\" target=\"_blank\" rel=\"noopener\">a targeted and temporary measure<\/a> for search and rescue, a specific cross-border crime risk, or an immediate threat to life or safety. However, it will find it difficult to justify ongoing and widespread drone surveillance aimed at identifying migrants or travelers as a group. <a href=\"https:\/\/www.edps.europa.eu\/data-protection\/our-work\/publications\/opinions\/drones_en\" target=\"_blank\" rel=\"noopener\">The mobility and discretion of drones<\/a>, along with challenges in <a href=\"https:\/\/www.edpb.europa.eu\/sites\/default\/files\/files\/file1\/edpb_guidelines_201903_video_devices_en_0.pdf\">transparency and effective oversight<\/a>, suggest that a solid justification is unlikely to emerge in most ordinary border-surveillance situations.<\/p>\n<p><strong>Implications for the Right to Non-Discrimination<\/strong><\/p>\n<p>Border surveillance technologies also pose substantial risks of discriminatory outcomes through biased algorithms and selective deployment. <a href=\"https:\/\/www.echr.coe.int\/documents\/d\/echr\/convention_eng\">The ECHR (Article 14)<\/a>, together with <a href=\"https:\/\/www.europarl.europa.eu\/charter\/pdf\/text_en.pdf\">the EU Charter (Article 21)<\/a>, protects the right to non-discrimination by prohibiting unequal treatment based on race, ethnic origin, or religious belief in the enjoyment of rights. 
Border control must respect this principle by ensuring that all groups receive equal treatment regardless of their protected characteristics.<\/p>\n<p><a href=\"https:\/\/amnesty.ca\/features\/racial-bias-in-facial-recognition-algorithms\/\" target=\"_blank\" rel=\"noopener\">Research indicates<\/a> that facial recognition algorithms perform worse on certain racial and ethnic groups, producing elevated false identification rates for those groups. The deployment of FRT at EU borders and airports is therefore likely to result in higher rates of false identification for people of color because of algorithmic bias, which constitutes indirect discrimination in practice. <a href=\"https:\/\/rm.coe.int\/facial-recognition-technology-fra-coe-meeting-july-2020-\/16809eee8b\">The European Commission against Racism and Intolerance (ECRI)<\/a> has observed that algorithmic bias in policing and border systems enables racial profiling through technological systems, in violation of non-discrimination standards.<\/p>\n<p><a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/PDF\/?uri=OJ:L_202401689\">The AI Act establishes a high-risk classification<\/a> for numerous border-related AI systems, including biometric identification and risk assessment tools, which must meet strict requirements on testing for discriminatory impact and on human oversight. Civil society organizations have asserted that the current measures are insufficient because <a href=\"https:\/\/www.ohchr.org\/en\/documents\/tools-and-resources\/digital-border-governance-human-rights-based-approach\" target=\"_blank\" rel=\"noopener\">truly invasive and biased systems should be completely prohibited<\/a>. 
<a href=\"https:\/\/www.ohchr.org\/en\/documents\/tools-and-resources\/digital-border-governance-human-rights-based-approach\" target=\"_blank\" rel=\"noopener\">The OHCHR advocates<\/a> for a total ban on AI systems that inherently violate discrimination prohibitions through examples such as social scoring and ethnic or religious categorization for risk assessment purposes.<\/p>\n<p><strong>Are Amendments to EU Legislation Necessary?<\/strong><\/p>\n<p>The above analysis supports the need for additional legal protections and possibly new amendments to safeguard fundamental rights in the face of growing surveillance practices. I suggest that the following measures are warranted. First, the EU should insert specific provisions in border management laws (SIS, Eurodac, or Frontex regulations) to either ban or heavily restrict the deployment of live facial recognition and broad-area drone surveillance for general monitoring purposes. The absence of explicit prohibitions creates a situation where these tools can be implemented in fragmented ways that threaten the respect for fundamental rights.<\/p>\n<p>Second, although <a href=\"chrome-extension:\/\/efaidnbmnnnibpcajpcglclefindmkaj\/https:\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/PDF\/?uri=OJ:L_202401689\">the AI Act<\/a> is no longer in the finalisation stage, there is still room for reform as the regulation is now undergoing implementation and amendment. Thus, Members of the European Parliament continue to be important because they have the capacity <a href=\"https:\/\/www.legislation.gov.uk\/eut\/teec\/part\/SIX\/title\/I\/chapter\/1\/section\/1?view=plain\" target=\"_blank\" rel=\"noopener\">to request new legislation<\/a> or amendments from the Commission in their capacity as co-legislators. 
Since the Act treats similarly intrusive practices more stringently in other contexts, MEPs should challenge the current <a href=\"https:\/\/verfassungsblog.de\/regulating-ai-at-europes-borders\/\" target=\"_blank\" rel=\"noopener\">double standard for migration and border AI<\/a>. The ban on technologies that endanger fundamental rights should extend to borders and apply in all situations where such technologies <a href=\"https:\/\/www.ohchr.org\/en\/documents\/tools-and-resources\/digital-border-governance-human-rights-based-approach\" target=\"_blank\" rel=\"noopener\">pose risks to fundamental rights<\/a>, including emotion detection and predictive profiling.<\/p>\n<p>The EU should also require operators to disclose their FRT algorithms and to audit deployed systems, which must demonstrate accuracy rates across all demographic groups, with independent experts verifying the absence of systemic bias. The AI Act and\/or existing border system regulations should be used to implement these measures. Translating border laws into practice also requires strengthening oversight mechanisms through <a href=\"https:\/\/eucrim.eu\/news\/fra-looks-facial-recognition-technology\/\" target=\"_blank\" rel=\"noopener\">increased resources and authority<\/a> for the European Data Protection Supervisor, national data protection authorities, and Fundamental Rights Monitors to inspect border technology systems.<\/p>\n<p><strong>Conclusion<\/strong><\/p>\n<p>The increasing use of drone surveillance and facial recognition technology at EU borders places fundamental rights in peril in the name of security. Pervasive monitoring threatens the right to privacy, as demonstrated in the Glukhin case, and its application in migration scenarios raises questions about discrimination against vulnerable groups. 
The current practices reveal a disconnect between technological reality and legal norms, despite the EU\u2019s robust legal frameworks.<\/p>\n<p>Clearer legal boundaries are required to address this issue. AI systems that present clear dangers to fundamental rights should be limited or prohibited, including at borders. The fundamental rights that form the foundation of the EU legal system should not be put at risk in the name of border security.<\/p>\n<p><strong>Suggested citation:<\/strong> Amaliya Kartika Putri, Smart Borders, Blind Spots: Surveillance Technology and Fundamental Rights at Europe\u2019s Borders, Int\u2019l J. Const. L. Blog, May 12, 2026, at: http:\/\/www.iconnectblog.com\/smart-borders-blind-spots-surveillance-technology-and-fundamental-rights-at-europes-borders\/<\/p>\n","protected":false},"excerpt":{"rendered":"\u2014Amaliya Kartika Putri, LL.M in Law and Technology from Utrecht University, focusing on GDPR, AI regulation, and fundamental&hellip;\n","protected":false},"author":2,"featured_media":954270,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5174],"tags":[323,768,2000,299,5187,1699,12520],"class_list":{"0":"post-954269","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-eu","8":"tag-ai","9":"tag-borders","10":"tag-eu","11":"tag-europe","12":"tag-european","13":"tag-european-union","14":"tag-surveillance-technology"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/116560269651044603","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/954269","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":t
rue,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=954269"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/954269\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/954270"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=954269"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=954269"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=954269"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}