EU Regulators Urge App Developers to Security Test

Mobile app security is something we talk about a lot, but now it officially has the rest of the world talking.

European privacy regulators have just published their opinion on mobile app security, and it will likely add to the security testing headache for app developers and business owners. Complying with EU data protection law means following seven principles, says David Meyer of Gigaom:

“Here’s an example given by the regulators (with bold type reflecting my emphasis):

‘An app provides information about nearby restaurants. To be installed the app developer must seek consent. To access the geolocation data, the app developer must separately ask for consent, e.g. during installation or prior to accessing the geolocation. Specific means that the consent must be limited to the specific purpose of advising the user about nearby restaurants. The location data from the device may therefore only be accessed when the user is using the app for that purpose. The user’s consent to process geolocation data does not allow the app to continuously collect location data from the device. This further processing would require additional information and separate consent.

Similarly, for a communication app to access the contact list, the user must be able to select contacts that the user wishes to communicate with, instead of having to grant access to the entire address book (including contact details of non-users of that service that cannot have consented to the processing of data relating to them).’

How about app stores? Here, the working party recommends that apps “should not just be rated by users for how ‘cool’ they are, but also on the basis of their functionalities, with specific reference to privacy and security mechanisms”.

These kinds of recommendations may seem a tall order, but they are doable. However, the working party seems under no illusion about the challenge it faces. Here’s the whole problem with ensuring the rules get stuck to, distilled into a single passage:

‘A high risk to data protection comes from the degree of fragmentation between the many players in the app development landscape. A single data item can, in real time, be transmitted from the device to be processed across the globe or be copied between chains of third-parties. Some of the best known apps are developed by major technology companies but many others are designed by small start-ups. A single programmer with an idea and little or no prior programming skills can reach a global audience in a short space of time. App developers unaware of the data protection requirements may create significant risks to the private life and reputation of users of smart devices. Simultaneously, third-party services such as advertising are developing rapidly, which, if integrated by an app developer without due regard, may disclose significant amounts of personal data.’

There’s the rub. The creation and distribution of apps can involve many, many parties, with services interlinked in a way that’s hard to keep track of. And because one of the fundamentals of EU data protection law is that the user must be kept fully informed of what’s happening with their data, the likelihood of proper compliance breaks down on that point alone. That’s before we even get to the thorny issue of who is situated where and whether sending data to that location means breaking the rules, or how many opportunities for a security breach get opened up by having so many links in the chain.”
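To make the regulators’ restaurant and address-book examples above a little more concrete, here is a minimal sketch of what purpose-limited consent can look like in practice on Android. It is only an illustration under stated assumptions, not anything prescribed by the Working Party: it assumes a Kotlin app using the AndroidX AppCompat and Activity Result libraries plus the play-services-location dependency, and the activity name, button labels, and the showNearbyRestaurants / startConversationWith helpers are hypothetical placeholders.

```kotlin
// Minimal sketch only: assumes an Android project with the AndroidX AppCompat
// and Activity Result libraries plus the play-services-location dependency,
// and ACCESS_FINE_LOCATION declared in AndroidManifest.xml.
// Class name, button labels, and the showNearbyRestaurants /
// startConversationWith helpers are hypothetical placeholders.
import android.Manifest
import android.annotation.SuppressLint
import android.net.Uri
import android.os.Bundle
import android.widget.Button
import android.widget.LinearLayout
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import com.google.android.gms.location.LocationServices

class NearbyRestaurantsActivity : AppCompatActivity() {

    // Ask for location permission only at the moment the user requests
    // nearby restaurants, not up front at install or first launch.
    private val locationPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) fetchSingleLocationFix()
        }

    // Let the user pick one contact instead of granting access to the
    // entire address book.
    private val contactPicker =
        registerForActivityResult(ActivityResultContracts.PickContact()) { contactUri: Uri? ->
            contactUri?.let { startConversationWith(it) }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        val findNearby = Button(this).apply {
            text = "Find nearby restaurants"
            setOnClickListener {
                locationPermission.launch(Manifest.permission.ACCESS_FINE_LOCATION)
            }
        }
        val messageContact = Button(this).apply {
            text = "Message a contact"
            setOnClickListener { contactPicker.launch(null) }
        }

        setContentView(LinearLayout(this).apply {
            orientation = LinearLayout.VERTICAL
            addView(findNearby)
            addView(messageContact)
        })
    }

    // One-shot location read scoped to the "nearby restaurants" purpose;
    // no continuous location updates are registered.
    @SuppressLint("MissingPermission")
    private fun fetchSingleLocationFix() {
        LocationServices.getFusedLocationProviderClient(this)
            .lastLocation
            .addOnSuccessListener { location ->
                location?.let { showNearbyRestaurants(it.latitude, it.longitude) }
            }
    }

    private fun showNearbyRestaurants(lat: Double, lng: Double) { /* hypothetical UI */ }
    private fun startConversationWith(contact: Uri) { /* hypothetical messaging flow */ }
}
```

The point of the sketch is simply that the permission prompt and the location read are tied to a single user action and a single stated purpose, and that the contact picker hands the app only the contact the user chose rather than the whole address book, which is the behaviour the regulators describe.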

Complying with the EU regulators will mean developers need to security test any and all of their apps in the wild. What do you think of the EU Article 29 Working Party’s new opinion? Let us know in the comments section.

 
