
UK watchdog sets out “age appropriate” design code for online services to keep kids’ privacy safe

The UK’s data protection watchdog has today published a set of design standards for Internet services which are intended to help protect the privacy of children online.

The Information Commissioner’s Office (ICO) has been working on the Age Appropriate Design Code since the 2018 update of domestic data protection law — as part of a government push to create ‘world-leading’ standards for children when they’re online.

UK lawmakers have grown increasingly concerned about the ‘datafication’ of children when they go online, given many are too young to legally consent to being tracked and profiled under existing European data protection law.

The ICO’s code comprises 15 standards of what it calls “age appropriate design”, which the regulator says reflect a “risk-based approach”. These include stipulating that settings should be set to ‘high privacy’ by default; that only the minimum amount of data needed to provide the service should be collected and retained; and that children’s data should not be shared unless there’s a reason to do so that’s in their best interests.

Profiling should also be off by default. The code also takes aim at dark-pattern UI design that seeks to manipulate users into acting against their own interests, saying “nudge techniques” should not be used to “lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”.

“The focus is on providing default settings which ensures that children have the best possible access to online services whilst minimising data collection and use, by default,” the regulator writes in an executive summary.

While the age appropriate design code is focused on protecting children, it applies to a very broad range of online services, with the regulator noting that “the majority of online services that children use are covered” and also stipulating “this code applies if children are likely to use your service” [emphasis ours].

This means it could be applied to anything from games to social media platforms, fitness apps, educational websites and on-demand streaming services, if they’re available to UK users.

“We consider that for a service to be ‘likely’ to be accessed [by children], the possibility of this happening needs to be more probable than not. This recognises the intention of Parliament to cover services that children use in reality, but does not extend the definition to cover all services that children could possibly access,” the ICO adds.

Here are the 15 standards in full as the regulator describes them:

  1. Best interests of the child: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
  2. Data protection impact assessments: Undertake a DPIA to assess and mitigate risks to the rights and freedoms of children who are likely to access your service, which arise from your data processing. Take into account differing ages, capacities and development needs and ensure that your DPIA builds in compliance with this code.
  3. Age appropriate application: Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.
  4. Transparency: The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.
  5. Detrimental use of data: Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.
  6. Policies and community standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
  7. Default settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
  8. Data minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
  9. Data sharing: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
  10. Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child). Provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to ‘off’ at the end of each session.
  11. Parental controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
  12. Profiling: Switch options which use profiling ‘off’ by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
  13. Nudge techniques: Do not use nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections.
  14. Connected toys and devices: If you provide a connected toy or device ensure you include effective tools to enable conformance to this code.
  15. Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
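
To give a feel for what the default-settings standards (7, 10 and 12) could mean in practice, here is a minimal sketch of a settings model for a service likely to be used by children, with every data-hungry option off unless actively enabled. The type and field names are our own illustration, not anything mandated by the code.

```typescript
// Illustrative sketch only: one way a service could encode the code's
// 'high privacy by default' standards (7, 10 and 12) in its settings model.
interface PrivacySettings {
  profiling: boolean;               // standard 12: off unless compellingly justified
  geolocation: boolean;             // standard 10: off by default
  locationVisibleToOthers: boolean; // standard 10: must reset to off each session
  dataSharing: boolean;             // standard 9: no disclosure without compelling reason
}

// Standard 7: every new account starts at the most protective setting.
const HIGH_PRIVACY_DEFAULTS: PrivacySettings = {
  profiling: false,
  geolocation: false,
  locationVisibleToOthers: false,
  dataSharing: false,
};

// Standard 10 also says options making a child's location visible to others
// must default back to 'off' at the end of each session, whatever the user
// enabled during it.
function onSessionEnd(settings: PrivacySettings): PrivacySettings {
  return { ...settings, locationVisibleToOthers: false };
}
```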

The Age Appropriate Design Code also defines children as under the age of 18, which sets a higher bar than current UK data protection law, which, for example, only requires children to be 13 before they can legally give their consent to being tracked online.

So, assuming (very wildly) that Internet services were suddenly to follow the code to the letter, setting trackers off by default and not nudging users to weaken privacy-protecting defaults by manipulating them into giving up more data, the code could in theory raise the level of privacy both children and adults typically get online.

However, it’s not legally binding, so there’s a pretty fat chance of that.

The regulator does make a point of noting that the standards in the code are backed by existing data protection laws, which it regulates and can legally enforce. It points out that it has powers to take action against lawbreakers, including “tough sanctions” such as orders to stop processing data and fines of up to 4% of a company’s global turnover.

So, in a way, the regulator appears to be saying: ‘Are you feeling lucky, data punk?’

Last April the UK government published a white paper setting out its proposals for regulating a range of online harms, including seeking to address concerns about children accessing inappropriate material online.

The ICO’s Age Appropriate Design Code is intended to support that effort. So there’s also a chance that some of the same sorts of stipulations could be baked into the planned online harms bill.

“This is not, and will not be, ‘law’. It is just a code of practice,” said Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal, discussing the likely impact of the suggested standards. “It shows the direction of the ICO’s thinking, and its expectations, and the ICO has to have regard to it when it takes enforcement action but it’s not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018.

“The code of practice sits under the DPA 2018, so companies which are within the scope of that are likely to want to understand what it says. The DPA 2018 and the UK GDPR (the version of the GDPR which will be in place after Brexit) covers controllers established in the UK, as well as overseas controllers which target services to people in the UK or monitor the behaviour of people in the UK. Merely making a service available to people in the UK should not be sufficient.”

“Overall, this is consistent with the general direction of travel for online services, and the perception that more needs to be done to protect children online,” Brown also told us.

“Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today’s code of practice. Rather, the code of practice shows the ICO’s thinking on what compliance might look like (and, possibly, goldplates some of the requirements of the law too).”

Organizations that choose to take note of the code — and are in a position to be able to demonstrate they’ve followed its standards — stand a better chance of persuading the regulator they’ve complied with relevant privacy laws, per Brown.

“Conversely, if they want to say that they comply with the law but not with the code, that is (legally) possible, but might be more of a struggle in terms of engagement with the ICO,” he added.

Zooming back out, the government said last fall that it’s committed to publishing draft online harms legislation for pre-legislative scrutiny “at pace”.

But at the same time it dropped a controversial plan, included in a 2017 piece of digital legislation, which would have made age checks for accessing online pornography mandatory, saying it wanted to focus on developing “the most comprehensive approach possible to protecting children”, i.e. via the online harms bill.

How comprehensive the touted ‘child protections’ will end up being remains to be seen.

Brown suggested age verification could come through as a “general requirement”, given that the age verification component of the Digital Economy Act 2017 was dropped and “the government has said that these will be swept up in the broader online harms piece”.

It has also been consulting with tech companies on possible ways to implement age verification online.

The difficulties of regulating perpetually iterating Internet services — many of which are also operated by companies based outside the UK — have been writ large for years. (And are mired in geopolitics.)

Meanwhile, the enforcement of existing European digital privacy laws remains, to put it politely, a work in progress.

Facebook agrees to pay UK data watchdog’s Cambridge Analytica fine but settles without admitting liability

Facebook has reached a settlement with the UK’s data protection watchdog, the ICO, agreeing to pay in full a £500,000 (~$643k) fine following the latter’s investigation into the Cambridge Analytica data misuse scandal.

As part of the arrangement Facebook has agreed to drop its legal appeal against the penalty. But under the terms of the settlement it has not admitted any liability in relation to paying the fine, which is the maximum possible monetary penalty under the applicable UK data protection law. (The Cambridge Analytica scandal predates Europe’s GDPR framework coming into force.)

Facebook’s appeal against the ICO’s penalty focused on a claim that there was no evidence UK Facebook users’ data had been misused by Cambridge Analytica.

But there’s a further twist here, in that the company had secured a win from a first-tier legal tribunal, which held in June that “procedural fairness and allegations of bias” on the part of the ICO should be considered as part of its appeal.

The decision required the ICO to disclose materials relating to its decision-making process regarding the Facebook fine. The ICO, evidently less than keen for its emails to be trawled through, appealed last month. It’s now withdrawing that action as part of the settlement, Facebook having dropped its own appeal.

In a statement laying out the bare bones of the settlement reached, the ICO writes: “The Commissioner considers that this agreement best serves the interests of all UK data subjects who are Facebook users. Both Facebook and the ICO are committed to continuing to work to ensure compliance with applicable data protection laws.”

An ICO spokeswoman declined to answer additional questions, telling us the regulator has nothing further to add beyond its public statement.

As part of the settlement, the ICO writes that Facebook is being allowed to retain some (unspecified) “documents” that the ICO had disclosed during the appeal process, to use for “other purposes”, including furthering its own investigation into issues around Cambridge Analytica.

“Parts of this investigation had previously been put on hold at the ICO’s direction and can now resume,” the ICO adds.

Under the terms of the settlement, the ICO and Facebook each pay their own legal costs, while the £500k fine is not kept by the ICO but paid into HM Treasury’s consolidated fund.

Commenting in a statement, deputy commissioner James Dipple-Johnstone said:

The ICO welcomes the agreement reached with Facebook for the withdrawal of their appeal against our Monetary Penalty Notice and agreement to pay the fine. The ICO’s main concern was that UK citizen data was exposed to a serious risk of harm. Protection of personal information and personal privacy is of fundamental importance, not only for the rights of individuals, but also as we now know, for the preservation of a strong democracy. We are pleased to hear that Facebook has taken, and will continue to take, significant steps to comply with the fundamental principles of data protection. With this strong commitment to protecting people’s personal information and privacy, we expect that Facebook will be able to move forward and learn from the events of this case.

In its own supporting statement, attached to the ICO’s remarks, Harry Kinmonth, director and associate general counsel at Facebook, added:

We are pleased to have reached a settlement with the ICO. As we have said before, we wish we had done more to investigate claims about Cambridge Analytica in 2015. We made major changes to our platform back then, significantly restricting the information which app developers could access. Protecting people’s information and privacy is a top priority for Facebook, and we are continuing to build new controls to help people protect and manage their information. The ICO has stated that it has not discovered evidence that the data of Facebook users in the EU was transferred to Cambridge Analytica by Dr Kogan. However, we look forward to continuing to cooperate with the ICO’s wider and ongoing investigation into the use of data analytics for political purposes.

A charitable interpretation of what’s gone on here is that both Facebook and the ICO have reached a stalemate where their interests are better served by taking a quick win that puts the issue to bed, rather than dragging on with legal appeals that might also have raised fresh embarrassments. 

That’s quick wins in terms of PR (a paid fine for the ICO, and a line drawn under the issue for Facebook), as well as (potentially) useful data to further Facebook’s internal investigation of the Cambridge Analytica scandal.

We don’t know exactly what Facebook is getting from the ICO’s document stash. But we do know it’s facing a number of lawsuits and legal challenges over the scandal in the US.

The ICO announced its intention to fine Facebook over the Cambridge Analytica scandal just over a year ago.

In March 2018 it raided the UK offices of the now-defunct data company after obtaining a warrant, taking away hard drives and computers for analysis. It had also earlier ordered Facebook to withdraw its own investigators from the company’s offices.

Speaking to a UK parliamentary committee a year ago, the information commissioner, Elizabeth Denham, and deputy Dipple-Johnstone discussed their (then) ongoing investigation of data seized from Cambridge Analytica, saying they believed the Facebook user dataset the company had misappropriated could have been passed to more entities than were publicly known.

The ICO said at that point it was looking into “about half a dozen” entities.

It also told the committee it had evidence that, even as recently as early 2018, Cambridge Analytica might have retained some of the Facebook data — despite having claimed it had deleted everything.

“The follow up was less than robust. And that’s one of the reasons that we fined Facebook £500,000,” Denham also said at the time. 

Some of this evidence will likely be very useful for Facebook as it prepares to defend itself in legal challenges related to Cambridge Analytica, as well as aiding its claimed platform audit: in the wake of the scandal, Facebook said it would run a historical app audit and challenge all developers it determined had downloaded large amounts of user data.

The audit, which it announced in March 2018, apparently remains ongoing.

SEC approves 1st consumer RegA+ token Props for cross-app rewards

After cracking down on ICOs, the SEC just okayed the first two RegA+ tokens that offer an alternative way for anyone to gain a financial stake in a company, even unaccredited investors. Blockstack got approved for a $28 million digital token sale to raise money, while influencer live streaming app YouNow’s spin-off Props received a formal green light for a consumer utility ‘Howey’ token users can earn to get loyalty perks in multiple apps.

Props has already raised $21 million by pre-selling tokens to Union Square Ventures, Comcast, Venrock, Andreessen Horowitz’s Chris Dixon, and YouTuber Casey Neistat, so it isn’t raising money with the RegA+ by selling its tokens as Blockstack is. Instead, users earn or ‘mine’ Props by engaging with apps like YouNow, which will award the tokens for creating broadcasts, watching videos, and tipping creators. Having more Props entitles YouNow’s 47 million registered users to bonus features, VIP status, and more purchasing power with the app’s proprietary credits, called Bars, of which users have bought $70 million worth to date.

How Props Work

But unlike most virtual currencies, which can only be used in a single app and don’t technically belong to consumers, the open-source Props blockchain system can be integrated into other apps via an API, and people can export their Props to cryptocurrency wallets. That lets them apply their Props in apps beyond YouNow. Four partner apps have been lined up, including xSplit, a video game streaming app with 17 million registered users.
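
To make that integration model concrete, here is a minimal sketch of how a partner app might award Props for the kinds of engagement described above. The endpoint URL, payload shape and API key are all hypothetical illustrations of the sort of API the company describes, not Props’ actual interface.

```typescript
// Hypothetical sketch of a partner app awarding Props for engagement.
// Endpoint, payload and response shapes are illustrative assumptions,
// not the real Props API.
const PROPS_API_KEY = "partner-app-key"; // hypothetical credential

interface AwardRequest {
  userId: string;                        // the partner app's ID for the user
  action: "broadcast" | "watch" | "tip"; // engagement types described above
  amount: number;                        // Props to award for this action
}

interface AwardResponse {
  balance: number; // user's updated Props balance
  txId: string;    // identifier of the recorded transaction
}

async function awardProps(req: AwardRequest): Promise<AwardResponse> {
  const res = await fetch("https://api.example-props.network/v1/awards", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${PROPS_API_KEY}`,
    },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Award failed: ${res.status}`);
  return (await res.json()) as AwardResponse;
}

// Example: credit a creator 5 Props for completing a broadcast.
awardProps({ userId: "user-123", action: "broadcast", amount: 5 })
  .then((r) => console.log(`New balance: ${r.balance} Props (tx ${r.txId})`));
```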

While Props aren’t currently redeemable for fiat currency, they were valued at $0.1369 each in the SEC-approved filing (making the 600 million Props mined so far notionally worth about $82 million at that price). The company is working to have Props listed on Alternative Trading Systems that work similarly to cryptocurrency exchanges. That gives everyday app users a financial incentive to see the network of apps that adopt Props grow. Since there’s a finite supply of 1 billion Props, if demand rises then users could sell theirs for more. This creates a new growth-hacking method for startups by providing a way to reward early and hardcore users.

“Our offering of Props is the first consumer-facing offering of “Howey tokens” to be qualified by the SEC. It makes it the first offering of consumer-oriented utility tokens that the SEC deems compliant, outside of Bitcoin and Ether,” Props CEO Adi Sideman tells me. While SEC officials have said Bitcoin and Ether aren’t securities thanks to their sufficient decentralization, they haven’t received formal approval. “We used Regulation A+ (Reg A) for this qualification, so that Props may be earned by, and provide functionality to, non accredited investors, users, apps and validators, in compliance with US regulations.”


However, this could also create risk for less savvy users who might misunderstand the token system and become overly convinced they’ll get rich by watching tons of musicians or comedians streaming on YouNow. Props will need to ensure that partners integrating its tokens don’t exaggerate their potential. It’s spent two years working on SEC approval but could still face consequences if Props are misrepresented.

“Props enables us to turn creators into stakeholders in the network, meaning they become partners in the success of the network. It’s an important tool for us to better incentivize and align with the most important users of our apps,” PeerStream CEO Alex Harrington writes. “Props abstracts, for us as developers, the technical and regulatory complexity associated with blockchain-based tokens, through a simple set of APIs that we can use to integrate the token into our apps’ experience.”

With Blockstack and Props having pioneered the RegA+ approach, we could see more companies filing to use this method of raising money or sharing stakes with their users.