Counterterrorism: Privately Clustering a Radical Social Network Data

Authors

  • Jamal Boujmil, RS&GIS Lab., Dept. of Telecommunications, The National School for Applied Sciences of Tetuan, Tetuan, Morocco
  • N. Tagmouti, RS&GIS Lab., Dept. of Telecommunications, The National School for Applied Sciences of Tetuan, Tetuan, Morocco
  • N. Raissouni, RS&GIS Lab., Dept. of Telecommunications, The National School for Applied Sciences of Tetuan, Tetuan, Morocco

DOI:

https://doi.org/10.14738/tmlai.54.3204

Keywords:

differential privacy, social similarity, privacy preserving

Abstract

The tradeoff between the necessary gathering and analysis of personal data and the privacy rights of individuals is now a central requirement of any counterterrorism program. The most prominent and controversial recent example is the revelation that US intelligence agencies systematically engage in “bulk collection” of civilian “metadata” detailing telephonic and other communications and activities, with the stated purpose of monitoring and thwarting terrorist activity. Differential privacy currently provides one of the strongest available privacy guarantees. In this paper, we present a new, provably privacy-preserving algorithm that can identify and take action upon members of the targeted subpopulation while avoiding compromising the privacy of the protected subpopulation. It is a new search algorithm built on a novel combination of node social similarity and differential privacy.
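The paper's own algorithm is not reproduced on this page, but the differential-privacy building block it relies on — the Laplace mechanism of Dwork et al. (ref. 20) — can be sketched briefly. The function names and the neighbor-count query below are illustrative assumptions, not the authors' implementation: they show how a graph statistic (e.g., how many of a node's neighbors fall inside a candidate cluster) can be released with ε-differential privacy by adding noise calibrated to the query's sensitivity.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, sensitivity: float, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy (Laplace mechanism).

    Adding or removing one individual changes the count by at most
    `sensitivity`, so noise with scale sensitivity/epsilon hides any one
    person's presence up to a factor of exp(epsilon) in output likelihood.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Illustrative query: privately release the number of a node's neighbors
# that lie in a candidate cluster (sensitivity 1: one person changes the
# count by at most 1). Smaller epsilon = stronger privacy, noisier answer.
noisy = private_count(true_count=42, sensitivity=1.0, epsilon=0.5)
```

Searching a network this way spends privacy budget per query, so the composition of many such releases (ref. 18) governs how aggressively a subpopulation can be probed.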

References

(1) J. J. Kanski, Clinical Ophthalmology, 6th ed., London: Elsevier Health Sciences (United Kingdom), 2007, p. 952.

(2) Z. Liang et al., “The detection and quantification of retinopathy using digital angiograms,” IEEE Transactions on Medical Imaging, vol. 13, no. 4, pp. 619–626, 1994.

(3) M. Kearns, A. Roth, Z. S. Wu, and G. Yaroslavtsev, “Private algorithms for the protected in social network search,” Proc. Natl. Acad. Sci. U. S. A., vol. 113, no. 4, pp. 913–918, 2016.

(4) S. Kok and R. Rogers, “Rethinking migration in the digital age: Transglocalization and the Somali diaspora,” Glob. Networks, no. July, 2016.

(5) C. Dwork and A. Roth, “The Algorithmic Foundations of Differential Privacy,” Found. Trends Theor. Comput. Sci., vol. 9, nos. 3–4, pp. 211–407, 2014.

(6) D. J. Solove, “Conceptualizing privacy,” Calif. Law Rev., vol. 90, no. 4, pp. 1087–1155, 2002.

(7) Y. Li, P. Luo, and C. Wu, “A new network node similarity measure method and its applications,” no. 71271070, pp. 1–12, 2014.

(8) N. Li, W. Qardaji, and D. Su, “On Sampling, Anonymization, and Differential Privacy: Or, k-Anonymization Meets Differential Privacy,” 2011.

(9) H. H. Nguyen, A. Imine, and M. Rusinowitch, “Detecting Communities under Differential Privacy,” 2016.

(10) V. D. Blondel, J.-L. Guillaume, R. Lambiotte, and E. Lefebvre, “Fast unfolding of communities in large networks,” J. Stat. Mech. Theory Exp., vol. 2008, no. 10, p. P10008, 2008.

(11) K. Nissim, U. Stemmer, and S. Vadhan, “Locating a Small Cluster Privately,” Proc. 35th ACM SIGMOD-SIGACT-SIGAI Symp. Princ. Database Syst., pp. 413–427, 2016.

(12) F. Ahmed, R. Jin, and A. X. Liu, “A Random Matrix Approach to Differential Privacy and Structure Preserved Social Network Graph Publishing,” 2013.

(13) W. Day, N. Li, and M. Lyu, “Publishing Graph Degree Distribution with Node Differential Privacy,” Proc. 2016 Int. Conf. Manag. Data, pp. 123–138, 2016.

(14) Z. Jorgensen and T. Yu, “A Privacy-Preserving Framework for Personalized, Social Recommendations,” Proc. 17th Int. Conf. Extending Database Technol., pp. 571–582, 2014.

(15) A. Stanoev, D. Smilkov, and L. Kocarev, “Identifying communities by influence dynamics in social networks,” Phys. Rev. E - Stat. Nonlinear, Soft Matter Phys., vol. 84, no. 4, 2011.

(16) P. Pons and M. Latapy, “Computing communities in large networks using random walks,” Comput. Inf. Sci., vol. 10, pp. 284–293, 2005.

(17) “Nearest Neighbor Search in Complex Network for Community Detection,” pp. 1–13, 2015.

(18) S. Oh and P. Viswanath, “The Composition Theorem for Differential Privacy,” Int. Conf. Mach. Learn., vol. 37, p. 26, 2015.

(19) J. Hsu, M. Gaboardi, A. Haeberlen, S. Khanna, A. Narayan, and B. C. Pierce, “Differential Privacy: An Economic Method for Choosing Epsilon,” IEEE Comput. Secur. Found. Symp., pp. 1–29, 2014.

(20) C. Dwork, F. McSherry, K. Nissim, and A. Smith, “Calibrating Noise to Sensitivity in Private Data Analysis,” Lect. notes Comput. Sci., no. 3876, pp. 265–284, 2006.

(21) Q. Geng and P. Viswanath, “The Optimal Mechanism in (ε, δ)-Differential Privacy,” p. 16, 2013.

(22) K. Kenthapadi and A. Korolova, “Privacy via the Johnson-Lindenstrauss Transform,” arXiv Prepr. arXiv …, no. 1, pp. 1–24, 2012.

(23) F. Ahmed and M. Abulaish, “A Generic Statistical Approach for Spam Detection in Online Social Networks,” vol. 36, pp. 1120–1129, 2013.

(24) C. Wilson, B. Boe, A. Sala, K. P. N. Puttaswamy, and B. Y. Zhao, “User Interactions in Social Networks and their Implications,” Eurosys’09 Proc. Fourth Eurosys Conf., pp. 205–218, 2009.

(25) L. Takac and M. Zabovsky, “Data Analysis in Public Social Networks,” Int. Sci. Conf. Int. Work., no. May, pp. 1–6, 2012.

(26) J. Yang and J. Leskovec, “Defining and Evaluating Network Communities Based on Ground-Truth,” ACM SIGKDD Work. Min. Data Semant., pp. 745–754, 2012.

(27) D. Kempe, J. Kleinberg, and É. Tardos, “Maximizing the spread of influence through a social network,” Kdd, p. 137, 2003.

Published

2017-09-01

How to Cite

Boujmil, J., Tagmouti, N., & Raissouni, N. (2017). Counterterrorism: Privately Clustering a Radical Social Network Data. Transactions on Engineering and Computing Sciences, 5(4). https://doi.org/10.14738/tmlai.54.3204

Issue

Vol. 5 No. 4 (2017)

Section

Special Issue: 1st International Conference on Affective Computing, Machine Learning and Intelligent Systems