Datos policiales e Inteligencia Artificial: Un equilibrio delicado entre la privacidad, la utilidad y la ética

Authors

  • María Teresa, Head of the Security Service, Gobierno de Canarias
  • Pedro Juan Baquero Pérez, Associate Professor at the Universidad de La Laguna and Head of the IT and Communications Service of the Gobierno de Canarias

DOI

https://doi.org/10.36151/RCAP.ext.6

Keywords

Artificial Intelligence (AI), privacy, ethics, policing, personal data protection

Abstract

This article addresses the critical intersection between artificial intelligence (AI), privacy, and ethics in the realm of policing. It explores how AI offers unprecedented opportunities for enhancing the efficiency of data collection and processing in criminal investigations, but also how it poses ethical challenges and risks to privacy and data protection. From ethical dilemmas in data collection and the use of crime-prediction algorithms to the risks associated with data inference and profiling, the article examines the various facets of the issue. The tension between deontological and utilitarian approaches to privacy ethics is also considered, and specific risk-mitigation methods are introduced, such as anonymization, consent and notification, data deletion, and differential privacy. Finally, the article provides a multidimensional analysis of the challenges and approaches in this emerging field.
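
As a purely illustrative complement to the mitigation methods mentioned above, the short Python sketch below applies the Laplace mechanism of differential privacy (Dwork et al., 2006) to a hypothetical count query over police records. The function name, the example figures, and the value of epsilon are assumptions made for illustration only and are not taken from the article.

    # Minimal sketch of the Laplace mechanism (epsilon-differential privacy).
    # Hypothetical scenario: publishing the number of incident records in a
    # district without revealing whether any single person's record is present.
    import numpy as np

    def laplace_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
        # Adding or removing one individual's record changes the count by at
        # most `sensitivity` (here 1), so Laplace noise with scale
        # sensitivity / epsilon yields an epsilon-differentially private answer.
        noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_count + noise

    # Example: a true count of 412 records; a smaller epsilon adds more noise,
    # giving stronger privacy at the cost of utility.
    print(laplace_count(412, epsilon=0.5))

The choice of epsilon makes the privacy/utility trade-off discussed in the article explicit: smaller values protect individuals more strongly but degrade the usefulness of the released statistic.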

References

Abadi, M., Chu, A., Goodfellow, I., McMahan, H. B., Mironov, I., Talwar, K., y Zhang, L. (2016). Deep Learning with Differential Privacy. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security.

Apple Inc. (2017). iOS Security Guide. https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Babuta, A., Oswald, M., y Rinik, C. (2018). Machine learning algorithms and police decision-making: legal, ethical and regulatory challenges. Whitehall Report, núm. 3. Royal United Services Institute for Defence and Security Studies.

Bagdasaryan, E., Poursaeed, O., y Shmatikov, V. (2019). Differential Privacy Has Disparate Impact on Model Accuracy. In NeurIPS 2019: 33rd Conference on Neural Information Processing Systems. https://arxiv.org/pdf/1905.12101.pdf

Baquero Pérez, P.J. (2023). Cuestiones éticas sobre la implantación de la inteligencia artificial en la administración pública. Revista Canaria de Administración Pública, (1), 243–282.

Bok, S. (1983). Secrets: On the Ethics of Concealment and Revelation. Vintage Books.

Bos, J. W., Lauter, K., y Naehrig, M. (2014). Private predictive analysis on encrypted medical data. Journal of biomedical informatics, 50, 234-243.

Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1).

Chen, M., Mao, S., y Liu, Y. (2014). Big data: A survey. Mobile Networks and Applications, 19(2), 171-209.

Chesney, R., y Citron, D. (2018). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. California Law Review, 107, 1753-1819.

Cohen, J. E. (2012). Configuring the Networked Self: Law, Code, and the Play of Everyday Practice. Yale University Press.

Cohen, J. E. (2019). Between Truth and Power: The Legal Constructions of Informational Capitalism. Oxford University Press.

Comisión Europea. (2016a). Reglamento (UE) 2016/679 del Parlamento Europeo y del Consejo de 27 de abril de 2016, relativo a la protección de las personas físicas en lo que respecta al tratamiento de datos personales y a la libre circulación de estos datos y por el que se deroga la Directiva 95/46/CE (Reglamento general de protección de datos).

Comisión Europea. (2016b). Directiva (UE) 2016/680 del Parlamento Europeo y del Consejo de 27 de abril de 2016 relativa a la protección de las personas físicas en lo que respecta al tratamiento de datos personales por parte de las autoridades competentes para fines de prevención, investigación, detección o enjuiciamiento de infracciones penales o de ejecución de sanciones penales, y a la libre circulación de dichos datos y por la que se deroga la Decisión Marco 2008/977/JAI del Consejo.

Crawford, K. (2016). Can an Algorithm Be Agonistic? Ten Scenes from Life in Calculated Publics. Science, Technology, & Human Values, 41(1), 77-92.

Dhar, V. (2013). Data science and prediction. Communications of the ACM, 56(12), 64-73.

Dwork, C., McSherry, F., Nissim, K., y Smith, A. (2006). Calibrating Noise to Sensitivity in Private Data Analysis. In Proceedings of the Third Theory of Cryptography Conference.

Dwork, C., y Roth, A. (2014). The Algorithmic Foundations of Differential Privacy. Foundations and Trends in Theoretical Computer Science, 9(3–4), 211–407.

Erlingsson, Ú., Pihur, V., y Korolova, A. (2014). RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response. In Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security.

España (2018). Ley Orgánica 3/2018, de 5 de diciembre, de Protección de Datos Personales y garantía de los derechos digitales. Boletín Oficial del Estado, número 294, de 6 de diciembre de 2018.

España (2021). Ley Orgánica 7/2021, de 26 de mayo, de protección de datos personales tratados para fines de prevención, detección, investigación y enjuiciamiento de infracciones penales y de ejecución de sanciones penales. Boletín Oficial del Estado, número 128, de 27 de mayo de 2021.

Ferguson, A. G. (2017). The rise of big data policing: surveillance, race, and the future of law enforcement. NYU Press.

Ferretti, L., Wymant, C., Kendall, M., Zhao, L., Nurtay, A., Abeler-Dörner, L., Parker, M., Bonsall, D. y Fraser, C. (2020). Quantifying SARS-CoV-2 transmission suggests epidemic control with digital contact tracing. Science, 368(6491), eabb6936.

Fjeld, J., Achten, N., Hilligoss, H., Nagy, A., y Srikumar, M. (2020). Principled artificial intelligence: Mapping consensus in ethical and rights-based approaches to principles for AI. Berkman Klein Center Research Publication, (2020-1).

Floridi, L., Cowls, J., Beltrametti, M., Chatila, R., Chazerand, P., Dignum, V., Luetge, C., Madelin, R., Pagallo, U., Rossi, F., Schafer, B., Valcke, P. y Vayena, E. (2018). AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations. Minds and Machines, 28(4), 689-707.

Fussey, P., y Murray, D. (2019). Independent Report on the London Metropolitan Police Service's Trial of Live Facial Recognition Technology. University of Essex Human Rights Centre.

Gentry, C. (2009). A fully homomorphic encryption scheme. Stanford University. https://crypto.stanford.edu/craig/craig-thesis.pdf

Joh, E. E. (2017). Artificial intelligence and policing: First questions. Seattle UL Rev., 41, 1139.

Jordan, M. I., y Mitchell, T. M. (2015). Machine learning: Trends, perspectives, and prospects. Science, 349(6245), 255-260.

Konečný, J., McMahan, H. B., Yu, F. X., Richtárik, P., Suresh, A. T., y Bacon, D. (2016). Federated Learning: Strategies for Improving Communication Efficiency. arXiv preprint arXiv:1610.05492.

Kuner, C. (2017). The European Union General Data Protection Regulation (GDPR): European Regulation that has a Global Impact. International Data Privacy Law, 7(4), 277–289.

Levy, K., y Schneier, B. (2020). Privacy threats in intimate relationships. Journal of Cybersecurity, 6(1), tyaa006.

Lindell, Y. (2005). Secure multiparty computation for privacy preserving data mining. In Encyclopedia of Data Warehousing and Mining (pp. 1005-1009). IGI Global.

Machanavajjhala, A., Kifer, D., Gehrke, J., y Venkitasubramaniam, M. (2007). l-diversity: Privacy beyond k-anonymity. ACM Transactions on Knowledge Discovery from Data (TKDD), 1(1), 3-es.

Mayer-Schönberger, V., y Cukier, K. (2013). Big Data: A revolution that will transform how we live, work, and think. Eamon Dolan/Houghton Mifflin Harcourt.

Mittelstadt, B., Allo, P., Taddeo, M., Wachter, S., y Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2).

Narayanan, A., y Shmatikov, V. (2009). De-anonymizing social networks. In 2009 30th IEEE Symposium on Security and Privacy (pp. 173-187). IEEE.

Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.

Obar, J. A., y Oeldorf-Hirsch, A. (2018). The biggest lie on the Internet: Ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 23(1), 128-147.

Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.

Richards, N. M., y King, J. H. (2014). Big Data Ethics. Wake Forest Law Review, 49, 393–432.

Roman, R., Zhou, J., y Lopez, J. (2013). On the features and challenges of security and privacy in distributed internet of things. Computer Networks, 57(10), 2266-2279.

Sicari, S., Rizzardi, A., Grieco, L. A., y Coen-Porisini, A. (2015). Security, privacy and trust in Internet of Things: The road ahead. Computer Networks, 76, 146-164.

Solove, D.J. (2002). Conceptualizing Privacy. California Law Review, 90(4), 1087-1155.

Solove, D. J. (2008). Understanding Privacy. Harvard University Press.

Solove, D. J. (2013). Privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880.

Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-based Systems, 10(05), 557-570.

Tene, O., y Polonetsky, J. (2012). Big data for all: Privacy and user control in the age of analytics. Nw. J. Tech. & Intell. Prop., 11, 239.

Thompson, A., Stringfellow, L., Maclean, M., y Nazzal, A. (2021). Ethical considerations and challenges for using digital ethnography to research vulnerable populations. Journal of Business Research, 124, 676-683.

Yang, Q., Liu, Y., Chen, T., y Tong, Y. (2019). Federated machine learning: Concept and applications. ACM Transactions on Intelligent Systems and Technology (TIST), 10(2), 1-19. https://doi.org/10.1145/3284422

Ziegeldorf, J. H., Morchon, O. G., y Wehrle, K. (2014). Privacy in the Internet of Things: threats and challenges. Security and Communication Networks, 7(12), 2728-2742.

Zuboff, S. (2019). The Age of Surveillance Capitalism: The fight for a human future at the new frontier of power. PublicAffairs.

Published

2024-01-09

How to Cite

María Teresa, & Baquero Pérez, P. J. (2024). Datos policiales e Inteligencia Artificial: Un equilibrio delicado entre la privacidad, la utilidad y la ética. Revista Canaria de Administración Pública, (Extraordinario), 143–175. https://doi.org/10.36151/RCAP.ext.6

Issue

Section

Gobierno abierto, datos y nuevas tecnologías en la Administración Pública