MaaS and the City: Enabling Privacy by Cooperation


Shared mobility services (often described as “Mobility-as-a-Service” or “MaaS”) have become a common part of daily urban life, at least in the largest metropoles. Yet they come with a series of challenges and potential inconveniences for municipalities, ranging from mere logistics (damaged vehicles may hinder road traffic and pollute urban landscapes) to economics (competition between taxis and ride-hailing services being but one of the more “spectacular” examples).

To address these, many countries and municipalities have started passing laws and regulations regarding MaaS services (such as the French Loi no. 2019-1428 du 24 décembre 2019 d’orientation des mobilités) and implementing strict requirements in tenders, e.g. caps on circulating vehicles and permitted speeds or slow-speed, no-traffic or no-parking areas. At a broader level, municipalities may also want to get a better grasp of the way these services are being used, as part of their urban planning projects; police services may also wish to monitor individual uses of vehicles for law enforcement purposes.

All these use cases – ranging from policy-making to law enforcement – are obviously best served using accurate data relating to, for instance, vehicle availability rates, geolocation of vehicles (at rest and/or in transit) and preferred routes. Such data (let’s call it MaaS data), as it is held by the respective MaaS service providers (mostly private companies), has thus become a crucial topic – and actually a quite sensitive one – for both municipalities and the said service providers.

The best example to date is the feud surrounding the Mobility Data Specification (MDS) format launched by the City of Los Angeles Department of Transportation (LADOT) and now managed as an open-source project by the Open Mobility Foundation (OMF): as a new comprehensive mobility data format, MDS has sparked much debate in the USA on the part of mobility operators and privacy advocates, who accuse it of posing excessive threats to users’ privacy. This led LADOT to publish a first set of MDS data protection principles, which is expected to evolve based on further criticism. The situation escalated when LADOT banned an operator from providing its bike-sharing services in LA for not complying with MDS data sharing requests – a ban confirmed by a court decision on February 11, 2020, which however did not rule clearly on privacy aspects.

Now that European municipalities have started to show interest in better exploiting mobility data, the same privacy issues will need to be dealt with under the scope of EU privacy laws and regulations, i.e. (chiefly – though not exclusively) the General Data Protection Regulation (GDPR).

Controversies have already started to arise as to whether MDS is (or is not) GDPR-compliant; albeit a bit too blunt (as we will show below), this question points to legal issues that cannot be omitted or circumvented in an era of (legitimate) privacy concerns.

This article shall explore the application of EU privacy laws and regulations (including, but not limited to, GDPR) to the processing of MaaS data (including, but not limited to, MDS data) by municipalities; the general conclusion is that such processing may, under certain conditions, comply with those privacy laws and regulations (which comes as no big surprise), such compliance being best secured through close cooperation between municipalities, mobility operators and, as applicable, third-party service providers.

All aboard the privacy train!

_

The EU privacy framework: Applicable provisions & shortcomings

Let’s first consider exactly which EU privacy laws and regulations may apply to our subject – that is, the processing of MaaS data by municipalities – and where to find relevant guidelines.

While privacy-related discussions in the EU tend to focus on GDPR, which actually lays down the applicable principles and requirements for most personal data processing activities, one should not forget about domestic data protection laws and regulations: the processing of MaaS data by municipalities will often indeed be related to sovereignty topics such as transportation, urban planning or security; Member States are likely to adopt additional or ancillary domestic provisions for such topics, as per GDPR margins of manoeuvre – these, therefore, call for a case-by-case, local analysis.

Also, where MaaS data might be used in relation to investigation or prosecution matters by police authorities or jurisdictions, domestic laws will come into play inasmuch as they transpose Directive (EU) 2016/680 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data (colloquially referred to as the “Law Enforcement Directive” or LED), which provides principles and obligations slightly different from those of the GDPR, based on the specific needs and sensitivity of such investigation and prosecution activities.

From a territorial perspective, both these sets of provisions (GDPR and related domestic data protection law; domestic provisions transposing LED) shall undoubtedly apply to the processing of MaaS data by EU municipalities, as these are (by definition) located on the EU territory (Article 3.1 GDPR; Article 2 LED); under certain circumstances, they may also have an impact on the processing of the same sort of data by non-EU municipalities, where for instance such municipalities rely on EU-based data processors. From a material perspective however (Article 2.1 GDPR; Article 2 LED), this question (i.e. whether EU personal data laws and regulations shall apply) will depend on whether said MaaS data qualifies as personal data or not – which is not that obvious, as we will explain in the next section.

Now, as nothing in either GDPR or LED deals specifically with the case of MaaS data, one may want to consider looking up supervisory authorities’ case law and opinions for more precise guidelines. To date, unfortunately, the task will prove somewhat inconclusive. When it comes to mobility data, authorities have mostly dealt with geolocation data derived from personal vehicles and other personal connected devices (see esp. the European Data Protection Board’s Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility related applications), which differ in many important ways from MaaS services, as explained below. As for the use of such MaaS data by municipalities (an even more specific topic), authorities’ doctrine will only provide, at best, some very broad (and all too prospective) considerations on “smart cities” and the like (see esp. this piece from the French authority’s “lab” (LINC)).

In contrast, sectoral transport law provisions such as Directive 2010/40 on Intelligent Transport Systems (and the respective Opinion 2010/C 47/02 of the European Data Protection Supervisor) and Commission Delegated Regulation 2017/1926 contain some more actionable data protection requirements and standards. The EDPS Opinion especially brings forth interesting points as to privacy by design, transparency, data minimization and sharing of data with law enforcement authorities. It is however a bit outdated (as it was adopted in 2010, i.e. under the regime of former Directive 95/46 and prior to the development of micro-mobility services) and does not focus on municipalities’ and other public authorities’ use cases, but mostly on services to be provided directly to end users.

When it comes to personal data laws and regulations (i.e. laws and regulations which apply to personal data), therefore, municipalities, MaaS operators and other stakeholders are left with very little well-tailored, official guidance, and must pretty much work with analogies.

Private initiatives have tried to make up for that scarcity by adopting public commitments (as in this MaaS Alliance November 2018 Vision Paper on Data), although these do not say much about the way data is shared with public authorities, as they are primarily about the operators’ side of the processing. The same goes for academic literature about MaaS data (see esp. these papers by Caitlin D. Cottrill (January 2020) and Federico Costantini (2017)), which, albeit well-documented and quite insightful (its core idea being that the shift from a property-based to a service-based economy is a primary source of unprecedented privacy issues – a lesson not limited to transportation), fails to properly address municipalities’ use cases.

In addition to those personal data laws and regulations, and on a more prospective note, stakeholders should keep a close watch on the forthcoming ePrivacy Regulation, yet another set of privacy rules, intended to apply to machine-to-machine data flows and thus virtually to any connected object in publicly available services or networks – such as MaaS vehicles. Bringing M2M communications into scope is actually one purpose of this would-be regulation, which reforms the current 2002 ePrivacy Directive. In its latest version by the Presidency of the Council of the EU, Recital 12 of the draft Regulation explicitly states that:

“The use of machine-to-machine and Internet of Things services, that is to say services involving an automated transfer of data and information between devices or software-based applications with limited or no human interaction, is emerging. In order to ensure full protection of the rights to privacy and confidentiality of communications, and to promote a trusted and secure Internet of Things in the digital single market, this Regulation, in particular the requirements relating to the confidentiality of communications, should apply to the transmission of such services. The transmission of machine-to-machine or Internet of Things services regularly involves the conveyance of signals via an electronic communications network and, hence, constitutes an electronic communications service. This Regulation should apply to the provider of the transmission service if that transmission is carried out via a publicly available electronic communications service or network.”

Also, Article 4.3(c) of the draft Regulation explicitly includes geolocation data as part of “electronic communications metadata“, which is central to the Regulation:

“‘[E]lectronic communications metadata’ means data processed by means of electronic communications services for the purposes of transmitting, distributing or exchanging electronic communications content; including data used to trace and identify the source and destination of a communication, data on the location of the device generated in the context of providing electronic communications services, and the date, time, duration and the type of communication[.]”

Geolocation data relating to connected vehicles would therefore fall under the scope of the intended Regulation. Yet the main question remains – what would be the applicable rules and/or restrictions? Article 6b.1(d), as currently drafted, appears to provide a legal basis for the sharing of such geolocation data with public authorities, as it states that the processing of such data shall be lawful where “it is necessary to protect the vital interest of a natural person, in the case of emergency, in general upon request of a public authority, in accordance with Union or Member State law“. However, the very broad character of these provisions (and – to be fair – the sheer poorness of their drafting) raises doubts as to their survival as such in the final version of the Regulation, a fortiori in view of the numerous, substantial changes already brought to the draft Regulation since its proposal by the European Commission in 2017. In any case, it seems obvious that the text ultimately adopted will have a crucial impact on the processing and sharing of MaaS data.

A non-connected sort of vehicle – ePrivacy has no power here.

_

When & why should MaaS data be considered personal data?

Moving back to personal data laws and regulations, this question is paramount and should always be considered first: it will determine whether GDPR and national laws (to the extent that they implement GDPR margins of manoeuvre and transpose LED) shall apply or not. To put it bluntly, should you find that a given set of MaaS data does not contain any personal data, you may very well dismiss any GDPR/LED concern altogether.

Of course we know that EU courts and authorities have given this notion – that of personal data – a pretty extensive meaning. However, the question should not be dismissed too easily, as the notion is still not limitless.

Article 4.1 GDPR/3.1 LED provides that “any information relating to an identified or identifiable natural person” is personal data; Recital 26 GDPR goes on to state that “[t]o determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly. To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments. The principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable“.

This definition (and the subsequent clarification) makes it clear that (i) data is only personal data where it is reasonably likely to be retraced (either directly or indirectly, alone or in combination with other data) to a given identified or identifiable individual, and (ii) the “reasonable” part in that last sentence depends on a concrete assessment of the circumstances of the case at hand (including technical and economic circumstances). In other words, there is no such thing as personal data per se; the notion is, by its sheer definition, case-sensitive.

Therefore, any serious assessment of whether data is personal data must be conducted as follows:

  • First, identify the persons who could have an interest in retracing the data to the respective individuals;
  • Second, identify all means (whether present or potential) that such persons could use to perform the foregoing;
  • Third, estimate the costs, technical availability and overall “burdensomeness” of such means, to determine whether the abovementioned persons are reasonably likely to use them.

Data that is not reasonably likely to be retraced to a given identified or identifiable individual by any person whatsoever in a given situation is not personal data in that situation, and the respective processing shall not be subject to GDPR/LED.

This line of reasoning is especially relevant in the case of pseudonymized data, to determine whether such pseudonymization goes as far as to amount to anonymization – anonymized data, as per Recital 26 GDPR above, being the very opposite of personal data. Should nobody be reasonably likely to retrace the real identity of the individuals behind the assigned pseudonyms, these pseudonyms and any data associated therewith shall not be deemed personal data. Thus, data that was once considered personal data may also cease to be so once it is anonymized, e.g. through non-reversible aggregation (in compliance with former Article 29 Data Protection Working Party’s Opinion 05/2014 on Anonymisation Techniques).

Let’s now apply this methodology to MaaS data. As explained above, such application cannot be abstract – it must be rooted in the specific context of a given use case; so let’s depict one precise use case.

In most cases, municipalities will not collect and process MaaS data themselves: they will rely on third-party service providers (such as data visualization platforms) and entrust these with the task of collecting, hosting and “making sense of” data retained by mobility operators. Also, the content of such data may vary greatly depending on the type of service provided by the operator (ride-hailing services, shared e-scooters, dockless bikes, etc.) and the data format that is required by the municipality/imposed by mobility operators.

Let us then consider MaaS data in the MDS format, collected and hosted by a third-party platform as described above, in relation to shared e-scooters. In a nutshell, MDS datasets contain time-stamped geolocation data associated with unique vehicle identifiers (“vehicle IDs”); surely none of this allows for direct reidentification of users: users can only be retraced by browsing the respective mobility operator’s database, which contains information such as email addresses and credit card details. Therefore, in the context of its processing by the third-party platform on behalf of the municipality, such MDS data is not directly identifying personal data – neither the third-party platform nor, thereby, the municipality has access to any such direct identifiers. Nonetheless, can MDS data be retraced to the respective users using means that are “reasonably likely to be used“, in this particular context, as per Recital 26 GDPR?
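To picture what is at stake, here is a minimal, illustrative sketch of such a record in Python – the field names are simplified assumptions for the sake of the example, not the exact MDS schema:

```python
from datetime import datetime, timezone

# Illustrative, simplified record inspired by MDS-style vehicle/trip feeds
# (field names are assumptions, not the actual MDS schema).
scooter_ping = {
    "provider": "some-scooter-co",                # the mobility operator
    "vehicle_id": "a3f9c2d1-7b4e-4c2a-9e8f",      # unique, pseudonymous vehicle identifier
    "timestamp": datetime(2020, 3, 14, 8, 42, tzinfo=timezone.utc).isoformat(),
    "location": {"lat": 48.8738, "lng": 2.2950},  # GPS point
    "status": "on_trip",                          # e.g. available / on_trip / removed
}

# Note what is absent: no user ID, no email address, no payment data.
# Re-identifying the rider requires joining vehicle_id + timestamp against
# the operator's own booking database, which neither the municipality nor
# its third-party platform holds.
```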

This question, again, will depend on the very circumstances of the case; it may however be approached in two general ways. A first way is to consider that geolocation data alone may, in certain circumstances, allow for reidentification of individuals, as established by several pieces of scientific research – research that is very often cited in data protection literature, as in the abovementioned Article 29 Data Protection Working Party’s Opinion 05/2014 (p.23):

“Researchers at MIT recently analyzed a pseudonymised dataset consisting of 15 months of spatial-temporal mobility coordinates of 1,5 million people on a territory within a radius of 100 km. They showed that 95% of the population could be singled-out with four location points, and that just two points were enough to single-out more than 50% of the data subjects (one of such points is known, being very likely “home” or “office”) with very limited space for privacy protection, even if the individuals’ identities were pseudonymised by replacing their true attributes [….] with other labels.”

Albeit grounded on such scientific research (and, actually, on Article 4 of the GDPR itself, which states – without much regard to its own case-by-case approach – that “an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as […] location data“), this line of reasoning has certain shortcomings when it comes to MaaS services. MaaS vehicles differ from personal vehicles and smartphones in important ways, the first and most obvious one being that such vehicles are shared, i.e. continually passed from hand to hand; also, a given MaaS user will often switch from one vehicle to another and even from one service to another, using such services in combination with other transportation means, so that the connection between one vehicle and one user is much looser.

Therefore, while monitoring the geolocation of a particular personal vehicle or a particular smartphone makes it possible to identify recurring mobility patterns and infer “personal” locations such as home or office, monitoring the geolocation of MaaS vehicles is far less likely to allow the same, and thus far less likely to be considered personal data when taken alone.

In any case, it should be assessed whether the means to achieve such correlations based on MaaS vehicle data (if such means exist at all) may be considered “means reasonably likely to be used” as per Recital 26 GDPR above, with due regard to the concrete technical, human and financial resources of the persons liable to access such data (primarily the municipality and the third-party platform).

However, in most cases (or at least in the case of MDS data), one will not have to go as far as such subtleties. EU case law on the concept of personal data provides useful analogies, as in the 2016 Breyer ECJ case on IP addresses, where the Court ruled that such IP addresses are to be considered personal data from the perspective of a website owner, where applicable laws and regulations allow said website owner to request identification of the holder of a given IP address from a court, with the help of the respective Internet Service Provider (as will generally be possible in the EU under the 2000 eCommerce Directive). The ratio decidendi is that such a request for identification, as a legal means, is to be deemed a means reasonably likely to be used by the website owner (see esp. §48 of the decision).

In the case of MDS data, vehicle IDs are to a municipality pretty much as IP addresses are to that website owner: while the municipality is not immediately able to reidentify a given user based on a given vehicle ID, it will generally have legal ways to force the respective mobility operator to provide information on the user who was riding the respective vehicle at a given date and time. Such ways may take the form of a warrant and may require the municipality to go before a court – however, they are, arguably, reasonably likely to be used, should (for instance) the municipality’s police services wish to investigate a given ride. For those reasons (at least), MDS data is – generally – best thought of as personal data.

Obviously this conclusion may be altered depending on specific measures applied by the municipality, the third-party platform and/or the mobility operators. A third-party platform may for instance apply a non-reversible randomization algorithm to vehicle IDs right after they are collected from mobility operators, so that neither the municipality nor the platform service provider shall be able to go back to vehicle IDs as they still exist in the mobility operator’s database. The parties may also agree on binding restrictions and data sharing protocols stipulating that municipality services shall in no case be able to access users’ directly identifying personal data (as hinted at by former Article 29 Data Protection Working Party’s Opinion 04/2007 on the concept of personal data, esp. p.20) – although such an agreement is quite likely to prove toothless, should the municipality get a judicial warrant of the sort above against the mobility operator.
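As a rough sketch of what such non-reversible randomization could look like – assuming the platform ingests raw vehicle IDs and replaces them at once with random surrogates; whether this actually amounts to anonymization still depends on the Recital 26 test described above:

```python
import secrets

class VehicleIdScrambler:
    """Replaces operators' vehicle IDs with random surrogates at ingestion time.

    The mapping lives only in memory and is dropped at the end of each period
    (e.g. daily), so that neither the platform nor the municipality can later
    link a surrogate back to the vehicle ID held by the operator.
    """

    def __init__(self):
        self._mapping = {}

    def scramble(self, raw_vehicle_id: str) -> str:
        # Same raw ID -> same surrogate within a period (keeps the data usable),
        # but the surrogate is pure randomness, not derived from the raw ID.
        if raw_vehicle_id not in self._mapping:
            self._mapping[raw_vehicle_id] = secrets.token_hex(16)
        return self._mapping[raw_vehicle_id]

    def rotate(self):
        # Discarding the lookup table is what makes the substitution
        # non-reversible from the platform's (and municipality's) side.
        self._mapping.clear()

# Usage sketch
scrambler = VehicleIdScrambler()
record = {"vehicle_id": "a3f9c2d1-7b4e-4c2a-9e8f", "lat": 48.87, "lng": 2.29}
record["vehicle_id"] = scrambler.scramble(record["vehicle_id"])
```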

In any event, coming to the conclusion that a certain set of MaaS data (such as MDS datasets) is personal data has no consequence other than the following: GDPR/LED requirements will apply, and need to be complied with (provided of course that the criteria for territorial application are met). At this stage, no conclusion can be drawn as to the lawfulness/compliance of the processing of the said MaaS dataset; this can only be determined through a concrete assessment of the way such processing is organized, with regard to GDPR/LED substantive provisions.

The following sections will explore how such a concrete assessment is to be conducted.

Pro privacy tip: to avoid geolocation, live in fractals.

_

Key challenges & Proposed best practices

As explained above, application of GDPR and LED principles to municipalities’ processing of MaaS data is not straightforward, as existing literature from supervisory authorities only provides imperfect analogies. That’s why stakeholders will need to think creatively and propose solutions before official guidance or case law is issued – the purpose of this article being no more than to hint at such possible solutions.

It is worth recalling, as a preliminary note, that data protection principles are by nature case-sensitive and risk-based. Concrete solutions for compliance will depend both on the intended purpose of the processing of MaaS data, and on the risks of reidentification of individuals, as clarified by former Article 29 Data Protection Working Party’s Opinion 04/2007 on the concept of personal data (p.18):

“Retraceably pseudonymised data may be considered as information on individuals which are indirectly identifiable. Indeed, using a pseudonym means that it is possible to backtrack to the individual, so that the individual’s identity can be discovered, but then only under predefined circumstances. In that case, although data protection rules apply, the risks at stake for the individuals with regard to the processing of such indirectly identifiable information will most often be low, so that the application of these rules will justifiably be more flexible than if information on directly identifiable individuals were processed.”

In other words: although both directly and indirectly identifying information will qualify as personal data and, as such, be subject to GDPR/LED, the lower the risks of reidentification, the more “flexible” the application of these provisions. Therefore, where personal data can only be retraced to data subjects through indirect or exceptional means, the data controller is not expected to set up guarantees as stringent as would be required for directly identifying personal data.

This risk-based approach is of course crucial to our case, as MaaS datasets (such as MDS datasets) do not contain any such directly identifying personal data, and only allow for reidentification of end users when combined with information held by mobility operators. Any reasonable compliance assessment shall take this as a starting point, before going into more details as to the various GDPR/LED principles and obligations – as we will do just now.

Disclaimer: The opinions below only cover those regulatory items that raise new or otherwise significant issues in relation to municipalities’ processing of MaaS data, and only provide general insights on those issues. They are not intended as a fully actionable compliance program for municipalities, especially since – as explained above – domestic laws and regulations may apply and call for different compliance solutions. Municipalities are therefore encouraged to seek legal advice in relation to their specific use cases.

_

Purpose limitation & Compatibility (Articles 5.1(b) and 6.4 GDPR/4 LED)

As is true for any processing of personal data under GDPR/LED, the collection of MaaS data by municipalities must serve “specified, explicit and legitimate purposes“, and said data cannot be “further processed in a manner that is incompatible with those purposes“.

This principle imposes two requirements on municipalities wishing to collect and process MaaS data. The first one, and actually the clearest, is that such municipalities must think through and precisely define what exactly they will do with the data, before even starting to collect it from operators.

This reflection – which is closely linked to privacy by design requirements as per Article 25 GDPR/20 LED – will have direct, highly practical ramifications, as it will determine, in particular, what data may be collected (data minimization), how long that data may be retained (storage limitation), and who is entitled to access the data (which is obviously crucial in relation to surveillance concerns). In this respect, two purposes as diverse as mere statistics and law enforcement will call for quite different levels of data protection measures and guarantees.
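To make this concrete, one could picture the municipality maintaining a simple “purpose registry” of the following sort – a hypothetical sketch, where the purposes, data categories, retention periods and roles are all invented for illustration:

```python
# Hypothetical purpose registry: each processing purpose determines what data
# may be collected, how long it is kept, and who may access it.
# All values below are illustrative assumptions, not recommendations.
PURPOSE_REGISTRY = {
    "mobility_statistics": {
        "data_categories": ["aggregated_availability", "trip_counts_per_zone"],
        "retention_days": 365,
        "authorized_roles": ["urban_planning_dept"],
    },
    "misplaced_vehicle_alerts": {
        "data_categories": ["vehicle_id", "last_known_location", "status"],
        "retention_days": 7,
        "authorized_roles": ["road_traffic_dept"],
    },
}
```

Writing the purposes down in such a structured way also makes the later steps (privacy notices, data processing agreements, DPIAs) considerably easier to document.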

The intended purposes will also need to be disclosed to data subjects (citizens) as per transparency obligations (see below), and to be enshrined in data processing agreements between the municipality and third-party service providers (as data processors). Municipalities (just as any other data controller under GDPR/LED) should therefore pay close attention to that very first step (determination of purposes), however “obvious” the intended purposes may seem at first glance.

The second requirement, since municipalities collect such MaaS data from operators who initially collected it for their own, distinct purposes (such as fleet management and billing of users), is that they (municipalities) make sure their own intended purposes are not “incompatible” with the operators’ initial purposes.

Article 6.4 GDPR provides a few (not-so-actionable) guidelines for performing such a “compatibility test”, stating that the following criteria should be taken into account:

“(a) any link between the purposes for which the personal data have been collected and the purposes of the intended further processing;

(b) the context in which the personal data have been collected, in particular regarding the relationship between data subjects and the controller;

(c) the nature of the personal data, in particular whether [sensitive data], or whether personal data related to criminal convictions and offences are processed […] ;

(d) the possible consequences of the intended further processing for data subjects;

(e) the existence of appropriate safeguards, which may include encryption or pseudonymisation.”

Those provisions make it clear that “compatibility” (or, more exactly, absence of incompatibility) between operators’ initial purposes and the municipality’s intended purposes is again a matter of casuistry, so that it is difficult to draw a general conclusion as to whether municipalities may or may not justify reusing MaaS data for any intended purpose. The test must be performed by each municipality itself (as data controller), taking into account all circumstances of the intended processing of MaaS data, in line with the overarching principle of accountability; while allowing for more flexibility than a system based on prior approval by an official authority, this arguably results in a certain degree of uncertainty, which some may find uncomfortable, or even discouraging.

In our case there is, however, a way around this. Should they wish to circumvent this compatibility test, municipalities may consider relying on one of the two exemptions provided by the same Article 6.4 GDPR, namely that a compatibility test is not required where further processing is based “on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1) [GDPR]”, i.e. objectives such as “public security” (Article 23.1(c) GDPR), “prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security” (Article 23.1(d) GDPR), or “other important objectives of general public interest of the Union or of a Member State” (Article 23.1(e) GDPR).

Basically, relying on that exemption would mean embedding the intended purposes for the processing of MaaS data in statutory law (e.g. a municipal decree). Instead of performing a compatibility test, the respective municipality/Member State would need to establish that the intended collection and processing of MaaS data, as organized by the text, is both “necessary and proportionate” in relation to one of the objectives of “general public interest” listed under Article 23.1 GDPR – noting here that EU law generally considers public transport management a matter of general interest, as evidenced by the EU Commission’s Quality Framework for Services of General Interest in Europe (2017). Under such a “necessity & proportionality” test, no reference to the mobility operators’ initial purposes appears to be needed; rather, the municipality/Member State must provide a sound demonstration that the collection and processing of MaaS data, as envisaged, is the least privacy-intrusive measure possible to achieve the intended purposes.

Note that this exemption (further processing being based on EU/Member State law, as necessary and proportionate to achieve purposes of general public interest), which is an alternative to the compatibility test under GDPR, is actually the only way to justify further processing under Article 4.2 LED – i.e. the only way to justify the processing of MaaS data collected from mobility operators for law enforcement purposes.

→ Proposed best practices:

  • Municipalities should define all purposes for which they actually need MaaS data in a clear and precise manner, before starting to collect the data.
  • Municipalities need to perform and document a compatibility test between their own intended purposes and the respective operators’ initial purposes for processing the data, except where the municipality’s use cases are embedded in statutory law such as a municipal decree enacted after a due necessity & proportionality assessment.

_

Legal basis (Article 6 GDPR/8 LED)

Purposes, as defined in accordance with the foregoing, will also serve to identify the legal basis for the processing of MaaS data, i.e. the pre-condition for municipalities to be able to collect and process such data lawfully.

In most cases, we argue that consent (as per Article 6.1(a) GDPR) will not be the appropriate legal basis, as it is both unnecessary and likely to be found invalid due to the imbalance of power between municipalities and citizens – as clarified by Recital 43 GDPR and reaffirmed by former Article 29 Data Protection Working Party’s Guidelines on consent under Regulation 2016/679.

Fortunately, Articles 6 GDPR and 8 LED provide enough alternative legal bases for most municipalities’ use cases to be secured, inasmuch as they fall within the scope of said municipalities’ legal obligations (Article 6.1(c) GDPR) or tasks carried out in the public interest or in the exercise of their official authority (Article 6.1(e) GDPR), or are necessary for the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties by competent authorities (Article 8 LED). (Let us recall here, however, that municipalities will most likely not be able to invoke legitimate interests (Article 6.1(f) GDPR) as a legal basis for their collection and processing of MaaS data, due to the last subparagraph of that same article – “Point (f) of the first subparagraph shall not apply to processing carried out by public authorities in the performance of their tasks“.)

All those legal bases other than consent share one important common criterion, namely that the processing of personal data must prove necessary for the legal obligation/public service/criminal police-justice activities to be adequately fulfilled. This means that municipalities should not only assess whether their intended purposes merely “relate” to such legal bases, but whether it is actually necessary to collect and process (personal) MaaS data to fulfil the respective legal obligation/public service/criminal police-justice activities, instead of relying on other, less privacy-intrusive means. Such an assessment must be documented, based on domestic legal environments (as legal rules applicable to municipalities and public services may vary greatly from one country/region to another) and taking into account all alternative solutions at hand.

Another major issue in relation to that notion of legal basis lies on the mobility operators’ side, i.e. the legal basis for those operators, as autonomous data controllers, to transfer MaaS data to municipalities. Indeed, such transfer does not fall into the “normal course of business” of those operators, and is certainly not justified by the agreement they have with their users (as per Article 6.1(b) GDPR).

Here again, we argue that users’ consent will not prove appropriate, for both legal and practical reasons; legitimate interest, on the other hand, does not seem better suited, as transferring MaaS data to the municipality is not really in the operator’s interest as a data controller, and the municipality itself (which would qualify as a “third party” under Article 6.1(f) GDPR) may not rely on that legal basis, as a public authority. Depending on the relationship between the municipality and mobility operators, it may be possible for the latter to frame the transfer of MaaS data as “a task carried out in the public interest“, as per Article 6.1(e); municipalities may also enact statutory laws requiring mobility operators to provide specific sets of MaaS data for the respective municipality’s purposes, so that mobility operators themselves may rely on those legal obligations as a legal basis under Article 6.1(c) GDPR.

→ Proposed best practices:

  • Every use case involving processing of MaaS data needs to be mapped to a given legal basis under Article 6.1 GDPR/Article 8 LED (such as a mission of public interest, a legal obligation or law enforcement activities carried out by the municipality).
  • Municipalities should check what use cases qualify as “a task carried out in the public interest” or as a legal obligation as per their respective domestic laws.

_

Data minimization (Article 5.1(c) GDPR/4.1(c) LED) & Storage limitation (Article 5.1(e) GDPR/4.1(e) LED)

Here we come to another sensitive topic, one that has proven highly controversial in the US feud about the MDS format, namely what kind of information municipalities should be entitled to request from operators.

As it goes, MaaS data may include many different sorts of information, ranging from the most general (e.g. aggregated information on vehicle availability) to the most specific (e.g. real-time geolocation of vehicles at rest and/or in transit); privacy advocates have contended that the latter raises irreducible privacy issues for citizens, by allowing massive, real-time surveillance on the part of public authorities, so that it should not be collected at all.

Although one should never dismiss such risks as unrealistic, the danger of that slippery-slope argument is that, by condemning any sharing of granular data regardless of the actual context and intended use cases, it may end up undermining many legitimate and mostly harmless activities of the municipality, such as supervision of road traffic. It would also feed the very common myth (which is actually contrary to GDPR itself) that one may only ever collect anonymized data, regardless of the purpose considered.

On the contrary, when it comes to determining what sort of data a controller may collect under GDPR, purpose should always be the starting point; this results from the wording of Article 5.1(c) GDPR/4.1(c) LED itself, namely that collected data must be “adequate, relevant” and “limited to what is necessary” (GDPR) or “not excessive” (LED) “in relation to the purposes for which they are processed“.

For instance, while collecting unique vehicle identifiers may seem excessive (i.e. non-justified) for mere statistics, municipalities may have a legitimate use for such individual vehicle IDs, e.g. to signal to the respective operator misplaced vehicles that are causing traffic congestion – so that the latter may act promptly and remove those vehicles. In this case, the slippery-slope argument seems flawed, as nothing in such activity implies surveillance of citizens, or even the municipality seeking actual reidentification of a given user – here the municipality is only interested in the vehicle, not the people.

The situation where a municipality would actually seek to use MaaS data for law enforcement against citizens, i.e. where said municipality would wish to reidentify individuals (and not only vehicles), is quite different, and – of course – not at all unlikely to happen. For such situations, we argue that “traditional” legal solutions work best, i.e. requesting case-by-case, motivated, court-issued warrants against operators, as systematic a priori collection of directly identifying personal data such as user IDs (email addresses) would clearly seem unjustified (i.e. disproportionate). Moreover, as further explained below, municipalities should make sure that once MaaS data is collected for purposes other than such law enforcement activities, appropriate governance measures are set up to prevent the same data from being accessible and reusable by the respective law enforcement services.

These data minimization “best practices” are also to be envisaged dynamically, i.e. through the corollary principle of storage limitation. MaaS datasets are certainly not monolithic, so that once a certain part of these datasets (such as vehicle IDs) is no longer useful for the intended purposes (e.g. because it is too late to intervene in a particular traffic congestion situation), the respective data is to be automatically discarded (or, more exactly: permanently deleted or irreversibly anonymized). As explained above, this may be achieved by ensuring that the remaining parts of the data cannot be retraced back to the respective individuals through means “reasonably likely to be used“. This will also serve to prevent such data from being reused for unintended/unjustifiable purposes, and thus to minimize risks for citizens’ privacy.
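A minimal sketch of what such automatic discarding could look like – assuming each record carries a purpose label and a collection timestamp, and using purely illustrative retention windows:

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention windows per purpose (values are assumptions).
RETENTION = {
    "misplaced_vehicle_alerts": timedelta(days=7),
    "mobility_statistics": timedelta(days=365),
}

def expire_vehicle_ids(records, now=None):
    """Drop vehicle IDs from records once their purpose-specific window has passed.

    The remainder of each record (timestamp, coarse location) may be kept for
    statistics, provided it can no longer be retraced to an individual.
    """
    now = now or datetime.now(timezone.utc)
    for record in records:
        window = RETENTION.get(record["purpose"])
        if window and now - record["collected_at"] > window:
            record.pop("vehicle_id", None)  # permanent deletion of the identifier
    return records
```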

→ Proposed best practices:

  • MaaS data formats (such as MDS), albeit convenient, should be approached at the most granular level, so that only those specific sub-datasets that are actually needed to fulfil the intended purposes are collected/retained.
  • In particular, indirectly identifying data such as single vehicle IDs should be collected only where singling out a specific vehicle is needed, e.g. for no-parking areas enforcement. Once they are not necessary anymore, these vehicle IDs should be discarded/irreversibly anonymized.
  • Directly identifying data (such as user data) should not be collected, except in the context of specific investigations and with appropriate judicial warrant.

_

Transparency (Article 12 seq. GDPR & LED)

Transparency under GDPR and LED means both informing data subjects and facilitating the exercise of their rights – bearing in mind that those rights will vary for each purpose of the processing, depending on the legal basis linked to that purpose (as explained above). In the case of municipalities’ processing of MaaS data, such transparency is paramount to enabling trust and promoting social acceptance of the municipality’s use cases.

The first branch of that principle requires municipalities to draft plain-language, legible privacy notices, to allow citizens to understand why and how MaaS data is collected and further processed by the respective municipality services. While the content of such privacy notices is clearly delineated by Articles 14 GDPR/13 LED, there is more room for discussion as to the manner in which this information is to be displayed. Here an important distinction is to be made between GDPR and LED, i.e. between law enforcement activities and other activities involving the processing of MaaS data: while for the latter municipalities are required to push information to citizens (as per Article 14.3 GDPR), LED merely requires that information relating to the processing of personal data in the context of (criminal) law enforcement be “made available”, e.g. on a publicly available website. Also, Article 13.3 LED allows Member States to delay, omit or restrict the provision of that information in certain cases, e.g. where it could obstruct legal investigations or proceedings.

Except for that specific case, the question remains as to how municipalities may efficiently inform citizens in relation to other use cases such as urban planning and the like. Municipalities may consider relying on existing user interfaces that allow push notices to be provided in due time – the best example being mobility operators’ applications themselves, as they allow municipalities to convey information to end users, e.g. upon subscribing to the service or starting a new ride. Obviously this would require municipalities to cooperate closely with mobility operators, so as to make sure that privacy notices are actually displayed in the apps.

The other branch of the principle requires municipalities to facilitate the exercise of citizens’ rights in relation to their data, such as the rights of access, rectification and deletion. However, as explained above, municipalities will mostly collect non-directly identifying MaaS data (such as vehicle IDs), so that they will not be able to know whether they actually hold data relating to a specific citizen who requests the exercise of their rights. Here municipalities may usefully rely on Article 11 GDPR, which states that in such situations data controllers are not obliged to seek or obtain additional information for the sole purpose of answering the request; it would be up to the citizen, in that case, to spontaneously provide the additional information that would allow their identity to be linked with data held by the municipality (if any).

In any case, municipalities should set up appropriate procedures to properly address such requests from citizens and provide clear answers, even when rejecting said requests. As applicable, such procedures may involve third-party service providers as data processors, as those are legally obliged to provide assistance in this respect (as per Article 28.3 GDPR).

→ Proposed best practices:

  • Municipalities must design legible privacy notices covering any and all use cases for which MaaS data is collected. They may consider relying on existing user interfaces (such as operators’ apps) to provide push notices.
  • Municipalities should set up appropriate procedures to properly answer citizens’ requests for the exercise of their rights, with the assistance of third-party service providers as data processors (as applicable). Procedures should reflect the fact that not all processing purposes allow for the same rights to be exercised, depending on the applicable legal basis.

_

Security & Confidentiality (Article 32 GDPR/29 LED)

Data security is one of the most fundamental principles under privacy laws and regulations; it is all the more relevant in the case of MaaS data, in view especially of concerns relating to surveillance of citizens. Therefore, every stakeholder who hosts or transmits MaaS data must set up and enforce appropriate security measures to prevent unlawful access to or dissemination of such data, and require its data processors to do the same.

Here again, however, it is worth recalling that the appropriateness of security measures must be assessed against actual risks, which in turn depend on the nature of the data hosted/transmitted. As the municipality (and/or its third-party service providers) will not usually collect directly identifying data such as user IDs, the consequences of a data breach occurring at the municipality’s or at a third-party service provider’s would not be exactly the same as the consequences of a data breach occurring on a mobility operator’s servers: even indirectly identifying data such as vehicle IDs would be of no use to most people, absent any additional data allowing the vehicle IDs to be linked to the respective users. In this respect, vehicle IDs should be considered pseudonymized data – pseudonymization being a security measure per se, as recognized by Article 32.1(a) GDPR.

A more pressing issue, in terms of social acceptability, is that of access control among municipality services, and especially access by police services, as those may use legal prerogatives to combine MaaS data with other sources in order to reidentify individuals. This is arguably the biggest concern in relation to municipalities’ accessing and processing MaaS data – namely, that of mass (no pun intended) surveillance. Addressing it will require municipalities to set up and display strong restrictive measures, so that MaaS data is only accessible to services, employees and/or vendors who actually need to use it for the intended purposes. Also, those services, employees and/or vendors should only access the sub-categories of MaaS data that are actually needed to fulfil their respective missions.

In this respect, third-party service providers (such as those providing SaaS solutions) may help (and should be required to) by embedding strong access control features in their solutions, and more generally strong security measures such as end-to-end data encryption, firewalls, periodic backups, etc.
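A bare-bones illustration of such purpose-bound access control – hypothetical roles and data categories, not a description of any particular SaaS product:

```python
# Hypothetical mapping of municipal roles to the MaaS data fields they may query.
ACCESS_POLICY = {
    "urban_planning_dept": {"aggregated_availability", "trip_counts_per_zone"},
    "road_traffic_dept": {"vehicle_id", "last_known_location", "status"},
    # Police services are deliberately absent: any access should go through a
    # documented, case-by-case procedure (e.g. a judicial warrant), not the platform.
}

def can_access(role: str, field: str) -> bool:
    """Return True only if the given role is entitled to the requested data field."""
    return field in ACCESS_POLICY.get(role, set())

assert can_access("road_traffic_dept", "vehicle_id")
assert not can_access("urban_planning_dept", "vehicle_id")
```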

→ Proposed best practices:

  • Municipalities should set up appropriate access control and restrictive measures so that MaaS data is only accessible to those services, employees and vendors that need to use it for the intended purposes. Specific procedures may be appropriate for certain more sensitive purposes – such as obtaining a warrant in the context of law enforcement activities.
  • In particular, where law enforcement is not an intended purpose, internal measures need to be taken so that police services are not able to consult the data.

_

Accountability (Article 24 GDPR/19 LED) & DPIA (Article 35 GDPR/27 LED)

In line with the overarching accountability principle, all stakeholders are required to keep track of their respective actions and decisions in relation to the processing of MaaS data, so as to be able to demonstrate compliance with their respective obligations.

As for the municipality, this will most likely involve performing a data protection impact assessment, based on the criteria set forth by the Article 29 Data Protection Working Party’s 2017 guidelines on DPIAs for determining whether processing is “likely to result in a high risk” – especially criterion 3 (systematic monitoring) and criterion 5 (data processed on a large scale). This impact assessment needs to identify and address the risks posed to citizens’ fundamental rights, freedoms and interests by the intended processing of MaaS data, and should be performed before even collecting the data from mobility operators.

As a good practice (encouraged by Article 35.9 GDPR), municipalities may involve other parties and stakeholders in the performance of the DPIA. Data processors such as third-party service providers are actually obliged to provide assistance in this context (as per Article 28.3 GDPR); other relevant contributors may include mobility operators or citizen representatives.

→ Proposed best practices:

  • Municipalities should perform DPIAs in relation to the various purposes for which they intend to collect and process MaaS data.
  • The DPIA should involve other stakeholders (such as mobility operators, third-party service providers or citizen representatives) to properly identify and address privacy risks and concerns.

_

That was quite a long section. Time for some (non-urban) mobility!

_

Data controller, Data processor: Who’s responsible for what?

A preliminary issue is to identify the various roles and qualifications of the respective stakeholders, i.e. (to stick with the use case above) the municipality, mobility operators and the third-party platform; this will prove crucial to allocate responsibilities in relation to the various applicable principles and obligations.

Let’s start with the obvious: as the one entity which “determines the purposes and means of the processing” of MaaS data (as per Article 4.7 GDPR/4.8 LED), the municipality will qualify as the data controller in relation to those purposes, bearing responsibility for most obligations, including especially purpose limitation and compatibility, data minimization, storage limitation and transparency. Depending on the circumstances of the case (in particular the purpose of the processing and the institutional structure of the municipality’s bodies) this qualification may need to be allocated at a more granular level: for instance, should an entity with distinct legal personhood be in charge of transport management or law enforcement, that entity may qualify as a distinct data controller, as opposed to other municipal bodies. Note that the fact that the municipality’s agents never actually access personal data (e.g. because the processing is entirely managed by a third-party service provider) does not affect this data controller qualification, which only depends on the (intellectual) determination of the purposes and means of the processing, and not on having actual access to data (as clarified by the Article 29 Data Protection Working Party’s Opinion 01/2010 on the concept of “controller” and “processor”).

To the extent that they process the same MaaS data for their own purposes (such as fleet management and billing of users), mobility operators shall qualify as independent, autonomous data controllers, having their own sphere of liability in relation to those purposes. It seems clear enough that there is no case here for joint control as per Article 26 GDPR, as mobility operators play no part in determining the purposes for which the municipality will, ultimately, process the data, nor the essential means of such processing. On the other hand, mobility operators do not seem to qualify as data processors of the municipality for the mere reason that they provide it with the data – especially where such provision is imposed on them as a legal obligation (see above). Rather, in relation to that processing, mobility operators will play a role that remains largely unregulated under GDPR/LED, namely that of the “source” of the data, which is only mentioned under Article 14.2(f) GDPR (and implicitly under Article 13.2(d) LED), whereby data subjects should be informed of the identity of that source.

Now, when a municipality relies on a third-party service provider for the processing of MaaS data, e.g. to provide SaaS data visualization solutions, the question may arise as to whether such service provider acts as a data processor of the municipality, or as a joint controller. Based on the abovementioned Article 29 Data Protection Working Party’s Opinion 01/2010 on the concept of “controller” and “processor”, one must admit that there is (again) no universal answer to that question; rather, it will depend on the extent to which the service provider is legally and factually able to influence the determination of the purposes and means of the processing – with, according to the aforementioned opinion (pp.17 seq.), no stronger emphasis to be placed on legal criteria than on factual ones. On a very general note, to the extent that the purposes for which a municipality processes MaaS data will relate to public authority prerogatives such as transport management, urban planning or law enforcement, we may reasonably argue that the chances for a private contractor to influence such purposes are relatively weak, so that a data processor qualification will often seem more appropriate; however, stakeholders should not systematically dismiss the case of joint control without a proper assessment of their respective roles and responsibilities. For instance, as the expectations of data subjects must also be taken into account as part of a “functional assessment” of the qualifications (see the aforementioned Opinion, esp. p.16), the fact that the service provider is in a direct relationship with citizens and receives their requests under its own name may be a strong clue pointing towards joint control.

These qualifications have an important impact on the way the responsibilities and relationships of those stakeholders are to be organized, from a legal perspective.

On one hand, the relationship between a municipality and its third-party service provider (whether it qualifies as a data processor or as a joint controller) is well delineated by Articles 26 GDPR/21 LED (joint controllers) and Articles 28 GDPR/22 LED (data processor): in both cases, the parties will need to enter into binding contractual clauses to secure compliance with the applicable legal framework. Such clauses are detailed at great length by Articles 28.3 GDPR/22.3 LED, should the service provider qualify as data processor. On the other hand, the relationship between the municipality and mobility operators, as two autonomous data controllers, does not seem to be subject to any specific provisions, whether in GDPR or in LED. In particular, no binding agreement or any sort of contract seems to be required.

Also, as the data controller for its own processing of the data, the municipality will bear most obligations under privacy laws, including obligations relating to purpose limitation, data minimization, storage limitation, transparency and accountability. It is especially the municipality’s responsibility to perform DPIAs where applicable or relevant, even though third-party service providers, as data processors (let’s assume, at this stage, they are not joint controllers), will have an obligation to assist in the performance of such DPIAs, and more generally to assist and advise the municipality in relation to privacy aspects, although final decisions do not rest with them.

As per Articles 26/28 GDPR (depending on whether the third-party service provider qualifies as data processor or as joint controller), these responsibilities will need to be embedded in the respective agreement – data processing agreement or joint controllership arrangement – between the municipality and its third-party service provider.

Yup – that’s the way to do it!

_

How to get the whole thing to comply: A case for closer cooperation

At the end of the day, privacy laws need to be seen for what they really are: rules flexible and sensible enough to allow for legitimate use cases, provided that appropriate measures and guarantees are set up to avoid disproportionate or otherwise unlawful (re)use. Such measures and guarantees should be prudent enough, and proportionate to the risks; they should not, however, be conflated with the abovementioned slippery-slope argument, according to which certain data should never be collected – a fortiori when GDPR itself does not list the respective data as “sensitive” as per its Article 9.

To sum it all up, purpose is key and most privacy-related issues – especially technical, organizational and contractual issues – will depend on it directly, just as lawfulness of the processing as a whole. Therefore, any reasonable privacy assessment must be conducted on a case-by-case basis, and should always start with a clear definition of the intended purpose; obviously, collection of MaaS data for statistical purposes will not raise as many issues and call for as rigorous protective measures as collection of the same data for law enforcement purposes.

Another key takeaway from the considerations above is the importance of team play, as an inherent component of the situation.

Municipalities are highly dependent on the cooperation of mobility operators and (as applicable) third-party service providers, not only for the collection of MaaS data, but also for addressing privacy issues in relation to such collection, so that true privacy may only be achieved through trust-based cooperation between all those stakeholders.

It is our general view that GDPR is a game best played together; in the case of MaaS data, such cooperation strikes us as nothing short of indispensable, if citizens’ rights are to be protected at all.

The chain of processing is indeed quite complex, and the municipality’s having (indirect) access to geolocation data will certainly raise (legitimate) concerns in the minds of citizens and other stakeholders; addressing these requires building trust, which in turn requires true transparency, which most certainly cannot be achieved by each stakeholder on its own.

In this we fully agree with Federico Costantini, as he refers, in his paper MaaS and GDPR: an overview (2017), to GDPR trust-enabling instruments, namely codes of conduct and certifications:

“Considering the remarks on the aspects strictly linked to GDPR […], I would recommend that all operators draw a specific Code of Conduct concerning Data Protection (Article 40 GDPR) and propose a standard certification in this area (Article 42 GDPR). Indeed, both are needed in order to demonstrate compliance with the obligations concerning GDPR. Furthermore, such a Code of Conduct could be very useful in order to solve or contain privacy issues, to create a framework of fair competition among companies, to encourage new businesses, and to set security measures against cyberattacks and unlawful accesses.”

As Costantini’s paper mostly focuses on mobility operators’ use of MaaS data, his recommendation above is limited to said operators’ side of the problem. It would certainly be in the best interests of citizens/users to expand the scope of these instruments (code of conduct and standard certification) to municipalities and third-party service providers, to the extent that they receive and process MaaS data, and play a crucial part in building what we often refer to as “smart cities”. A code of conduct, in particular, may cover topics such as the ones discussed above, e.g. data minimization, retention periods and access control, in line with best market practices and commonly agreed data formats.

As of today, such a proactive initiative (to be checked and approved, as applicable, by supervisory authorities or the European Commission), instead of a blanket ban on the sharing of any data whatsoever with municipalities, is something we can only hope for.

These happy teammates just won the Privacy Cup!
