New Research from Imperial College London Shows That Apple's Implementation of a Widely Accepted Data Protection Model Could Leave Users Exposed to Privacy Attacks
By examining how Apple implements the local differential privacy (LDP) model, the researchers were able to infer individual users' preferred emoji skin tone and political leanings. To improve apps and services, companies collect behavioral data generated by users' devices at large scale. These records, however, are fine-grained and contain sensitive information about specific people.
LDP allows companies such as Apple and Microsoft to collect user data without obtaining personally identifiable information. The new research, presented at the peer-reviewed USENIX Security Symposium, shows how emoji and website usage patterns collected through LDP can nevertheless be used to infer a person's emoji skin tone usage and political affiliation. According to the Imperial College London researchers, this undermines the guarantees LDP is meant to provide, and more must be done to safeguard the data of Apple's users.
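For intuition, the sketch below shows, in a minimal and hypothetical form, how an LDP mechanism perturbs a record on the device before anything is sent to a server. It uses simple k-ary randomized response rather than Apple's actual deployed mechanism (which relies on more elaborate sketch-based encodings); the function name, parameters, and values are illustrative only.

```python
import math
import random

# Minimal, hypothetical sketch of local differential privacy using
# k-ary randomized response: the device perturbs its own record before
# anything leaves the phone. This is NOT Apple's deployed mechanism;
# it only illustrates the general idea of on-device noise.

def randomize(true_value: int, k: int, epsilon: float) -> int:
    """Report the true category with probability p, otherwise a uniformly
    random other category. Larger epsilon means less noise and weaker privacy."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return true_value
    other = random.randrange(k - 1)          # pick one of the k-1 other values
    return other if other < true_value else other + 1

# Example: a device reports which of 6 emoji skin tones was used,
# but only the noisy value ever leaves the phone.
noisy_report = randomize(true_value=2, k=6, epsilon=1.0)
print(noisy_report)
```

The server never sees the true value, only noisy reports, and the privacy parameter epsilon is supposed to bound how much any single report reveals.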
The researchers demonstrated privacy weaknesses in Apple's LDP implementation. They introduced a new class of "pool inference" attacks and showed that the noisy records Apple collects can still reveal private information about individual users. Using emoji usage and website visits, they simulated two variants of the attack and found that, despite the privacy protections, users were vulnerable to both.
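To convey the general intuition behind a pool inference attack, the hypothetical sketch below reuses the randomized-response mechanism from the previous sketch and shows how an attacker who receives many noisy reports from the same user can debias the observed counts and guess that user's dominant category (for example, a preferred emoji skin tone). This is a simplification for illustration only, not the attack as specified in the paper.

```python
import math
import random
from collections import Counter

# Hypothetical sketch of the idea behind a pool inference attack against
# k-ary randomized response: many noisy reports from one user are pooled,
# the counts are debiased, and the most likely true category is guessed.

def randomize(true_value: int, k: int, epsilon: float) -> int:
    """Same illustrative randomized-response mechanism as in the earlier sketch."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return true_value
    other = random.randrange(k - 1)
    return other if other < true_value else other + 1

def infer_favourite(noisy_reports, k: int, epsilon: float) -> int:
    """Debias the observed counts and return the category the user most likely favours."""
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)                    # chance of reporting a specific wrong value
    n = len(noisy_reports)
    counts = Counter(noisy_reports)
    # Unbiased estimate of how often each category was truly used.
    estimates = {v: (counts.get(v, 0) - n * q) / (p - q) for v in range(k)}
    return max(estimates, key=estimates.get)

# Simulate a user who uses skin tone 2 for roughly 90% of 500 emoji, then attack.
reports = [randomize(2 if random.random() < 0.9 else random.randrange(6), 6, 1.0)
           for _ in range(500)]
print(infer_favourite(reports, k=6, epsilon=1.0))   # usually prints 2
```

The point of the illustration is that while a single noisy report reveals little, pooling many reports from the same user lets the noise average out, which is the kind of leakage the paper demonstrates against Apple's deployment.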
The attack proved especially effective against users whose phone behavior is highly revealing, such as frequent visitors of news websites and users with strong opinions who tend to visit sites with a similar political slant.
According to the researchers, LDP is a powerful technique for privacy-preserving data collection, but it must be deployed carefully to provide meaningful guarantees. As currently implemented, Apple's approach is susceptible to attacks that could infer a person's political preferences and other personal information, potentially enabling abuse and discrimination, and Apple should take further steps to ensure LDP is deployed correctly.
Reference: https://www.imperial.ac.uk/news/239752/privacy-gaps-apples-data-collection-scheme/
Paper: https://www.usenix.org/system/files/sec22-gadotti_1.pdf
Prathvik is an ML/AI research content intern at MarktechPost and a third-year undergraduate at IIT Kharagpur. He has a keen interest in machine learning and data science and is enthusiastic about exploring their applications in different fields of study.