Good afternoon, colleagues.
I know that many of us are anxious, as this is the last week of four before we go home for a constituency break week, but we have with us some very distinguished panellists to help us in the deliberations on our current study, which is on the Personal Information Protection and Electronic Documents Act, more affectionately known by Canadians on a daily basis as PIPEDA.
From the Centre for Law and Democracy, we are joined once again by Mr. Michael Karanicolas, a senior legal officer, by video conference.
It's good to see you again, Mr. Karanicolas.
As an individual, we again have joining us Teresa Scassa, a full professor at the University of Ottawa.
Thanks, Teresa, for joining us again. It's always a pleasure to have you here.
For the first time ever appearing before the committee, in his debut game—I mean, debut “appearance”—we have Florian Martin-Bariteau, assistant professor with the common law section of the Faculty of Law and the director of the Centre for Law, Technology and Society at the University of Ottawa.
As we normally do in this committee, we'll have a 10-minute opening statement from each of you. We'll simply go in the order in which I introduced you. I think everybody here is familiar with how this happens.
We'll start with you, Mr. Karanicolas. You have up to 10 minutes, please.
Thanks to the committee for your invitation to appear again.
I'd like to start by offering my congratulations to the standing committee for their recommendations to reform the Privacy Act, which were published late last year and which I thought were excellent.
It is, I believe, fairly clear that the current consent-based model of privacy protection is broken. The core dynamic that underlies this model and that drives much of the digital economy is that users may choose to trade their personal information for services. There are undeniable benefits to this model, which has assisted in the rapid spread of the Internet by lowering costs of entry. However, this dynamic relies on meaningful consent, which in turn requires at least a nominal understanding by the contracting party of what they're signing on to. In fact, virtually nobody reads their terms of service agreements, a state of affairs that significantly undermines the legitimacy of the consent obtained.
The OPC report points in part to the length of these agreements and the frequency with which they're presented to users as a cause of this lack of understanding, but it's also worth noting that these agreements are often drafted in a highly convoluted, confusing, and even self-contradictory manner that even technically and legally trained people struggle to understand. There's a vicious cycle at work. The fact that very few users read these agreements or use their substance as a basis for accepting or declining a service gives companies licence, and indeed an incentive, to draft them incredibly broadly. This drafting style and the lack of accessibility further depresses engagement with the agreements by their signatories and so on.
It's also worth noting that the company that presents the agreement and offers a service may often be distinct from the ones that actually collect and process the information. Third party data brokers play an increasingly common role in the Internet's ecosystem. A 2014 study showed that of the 950,000 most popular websites, 88% of them automatically shared visitor information with third parties, an average of 9.5 different third parties per website. The vast majority of this tracking is carried out surreptitiously, with only 2% of third parties including a visible prompt alerting users to their presence.
There's a clear problem here. However, it's important to try to look for solutions that will not derail the current digital economy. Although there are pros and cons to a system where personal information is used as a major currency by which online services are procured, potential avenues forward should be crafted with an eye to maintaining the tremendous benefits that Internet access provides.
One solution, which we strongly support, is to boost the quality of consent by improving the information available to users. Better practice here may include publishing a summary or explanatory guide alongside the full legal version of the terms of service, ensuring that the agreement is easily available for review, and clearly notifying users when a substantial change to the terms of service has been made.
The OPC has an important role to play here: to promote better practice in terms of clarity and accessibility of terms of service agreements, and to audit existing agreements for their clarity and accessibility, as well as their accuracy against how information is actually collected and processed. In addition to these steps, the proposal to make opt-in consent the default required approach is one that we support.
The move to expand transparency is another important factor in boosting the quality of consent, allowing people to look under the hood of the services and platforms they use. This may include, for example, a right to request an explanation of how their personal information has been used to customize their online experience, or what factors went into a particular decision by the company that they were subject to. However, while there is substantial room to improve the quality of user engagement and of consent, these improvements alone are not sufficient to safeguard the privacy rights of Canadians. The CLD supports the creation of clearly defined no-go zones, as well as proceed-with-caution zones, as mentioned in the OPC report. One important area to consider here is the need for greater clarity on how information can be transferred out to third parties or resold, and what rules should govern these external uses. Broader investigative powers for the OPC are also needed to promote good practice in terms of information management and security.
In terms of the de-identification or anonymization of information, while I think it should certainly be encouraged, it is not a panacea for the current privacy concerns. I would add to the commentary contained in the OPC's report by noting that as anonymization gets stronger, the commercial value of information can often decline, giving businesses an incentive to pursue incomplete solutions. Moreover, the fact that information has been, quote-unquote, anonymized may create a false sense of security, prompting companies to be less vigilant in safeguarding it and consumers to assume that threats to privacy have been nullified.
I also want to speak briefly about reputation and privacy and the right to be forgotten.
The Internet's transformative impact on our social functions has made a person's online footprint a vital aspect of his or her identity. However, the permanence and increased accessibility of online information has led to concerns from some about the Internet's impact on privacy and reputation.
There are benefits to making people's pasts more accessible. A Holocaust museum, for example, has a legitimate interest in knowing if a person it is considering for a job has a history of making racist comments. However, we are also a society that believes in giving people second chances. There can be problems with how the digital records present themselves, such as where a decision by a prosecutor to drop charges may not generate as much coverage as the initial arrest, or where an erroneous and sensational media report may attract more attention than a later retraction.
However, experiences in Europe with the right to be forgotten should be viewed as a cautionary tale about what not to do. Namely, any move to develop a right to be forgotten should be grounded in clear and limited definitions of how it applies, strong transparency, and robust due process. I will address each of these in turn.
First, the application of a right to be forgotten requires a careful balancing of freedom of expression, privacy, and the right to information. Any such balancing will have to be based on a clear test to determine where the public interest lies. People have never had a right to control or curate their reputations. Any move to create a right to be forgotten should be aimed only at the novel aspects of reputation that have come about as a result of the Internet and should be reserved for significant and demonstrably unfair circumstances, such as when a person has been wrongly arrested.
Second, transparency is a key ingredient, including making available detailed information about how decision-making processes work and how they have been applied. There should be as much information as can be provided, short of undermining the efficacy of the processes themselves.
Third, as with any restriction on freedom of expression, due process is critically important. Search engines are simply not equipped to engage in this careful balancing of rights, and unfortunately have an incentive under the current European system to err on the side of removing the information without providing the careful due process such a tricky issue should warrant. Any order to remove material or to reduce its accessibility should be left in the hands of a court or a quasi-judicial authority, including careful due process considerations.
I want to emphasize that none of the above should be interpreted as an endorsement of the right to be forgotten. Indeed, there is a strong argument to be made that the present reputational challenges will sort themselves out over time, as people will gradually become inured to the preponderance of embarrassing or unpleasant information out there and will learn to take such information with a pinch of salt. However, insofar as the right to be forgotten is being considered, it is important that we not repeat the widely criticized mistakes of the Court of Justice of the European Union in how it handled the matter.
I look forward to your questions and the discussion.
Thank you for the invitation to meet with you today and to contribute to your discussion on PIPEDA. I'm a professor at the University of Ottawa in the Faculty of Law, where I hold the Canada Research Chair in Information Law. I'm appearing in my personal capacity.
We're facing what might be considered a crisis of legitimacy when it comes to personal data protection in Canada. Every day we hear new stories in the news about data hacks and breaches, and about the surreptitious collection of personal information by the devices in our homes and on our persons that are linked to the Internet of things. There are stories about how big data profiling impacts the ability of individuals to get health insurance, obtain credit, or find employment. There are also concerns about the extent to which state authorities access our personal information that is in the hands of private sector companies. PIPEDA, as it currently stands, is inadequate to meet these challenges.
My comments are organized around the theme of transparency. Transparency is fundamentally important to data protection and has always played an important role under PIPEDA. At a basic level, transparency means openness and accessibility. In the data protection context, it means requiring organizations to be transparent about the collection, use, and disclosure of personal information, and it means that the commissioner also must be transparent in his oversight functions under the act.
I'm going to also argue that it means that state actors, including law enforcement and national security organizations, must be more transparent about their access to and use of the vast stores of personal information in the hands of private sector organizations.
Under PIPEDA, transparency is at the heart of the consent-based data protection scheme. It's central to the requirement for companies to make their privacy policies available to consumers and to obtain consumer consent to the collection, use, or disclosure of personal information, yet this type of transparency has come under significant pressure and has been substantially undermined by technological change on the one hand, and by piecemeal legislative amendment on the other.
The volume of information that's collected through our digital, mobile, and online interactions is enormous, and its actual and potential uses are limitless. The Internet of things means that more and more of the devices that we have on our person and in our homes are collecting and transmitting information. They may even do so without our awareness, and they often do so on a continuous basis. The result is that there are fewer clear and well-defined points or moments at which data collection takes place, making it difficult to say that notice was provided and that consent was obtained in any meaningful way.
In this context, consent has become a bit of a joke, although unfortunately the joke is largely on consumers. The only parties capable of saying that our current consent-based model still works are those that benefit from consumer resignation in the face of this ubiquitous data harvesting.
The Privacy Commissioner's recent consultation process on consent identifies a number of possible strategies to address the failures of the current system. There is no quick or easy fix, no slight changing of wording that will address the problems around consent. This means that on the one hand there need to be major changes in how organizations achieve meaningful transparency about their data collection, use, and disclosure practices, and there must also be a new approach to compliance that gives considerably more oversight and enforcement powers to the commissioner. The two changes are inextricably linked.
The broader public protection mandate of the commissioner requires that he have the necessary powers to take action in the public interest. The technological context in which we now find ourselves is so profoundly different from what it was when this legislation came into force in 2001 that to talk of only minor adjustments to the legislation ignores the transformative impacts of big data and the Internet of things.
A major reworking of PIPEDA may be well overdue, in any event, and it might have important benefits that go beyond addressing the problems of consent. I note that if one were asked to draft a statute as a performance art piece that evokes the problem with incomprehensible, convoluted, and contorted privacy policies and their effective lack of transparency, then PIPEDA would be that statute. As unpopular as it might seem to suggest that it's time to redraft the legislation so that it no longer reads like the worst of all privacy policies, this is one thing this committee should consider.
I make this recommendation in a context in which all of those who collect, use, or disclose personal information in the course of commercial activity, including a vast number of small and medium-sized businesses with limited access to experienced legal counsel, are expected to comply with the legislation. In addition, the public ideally should have a fighting chance of reading the statute and understanding what it means in terms of the protection of their personal information and their rights of recourse. As it's currently drafted, PIPEDA is a convoluted mishmash in which the normative principles are not found in the law itself, but rather are tacked on in a schedule.
To make matters worse, the meaning of some of the words in the schedule, as well as the principles contained therein, are modified by the statute, so that it's not possible to fully understand rules and exceptions without engaging in a complex connect-the-dots exercise. After a series of piecemeal amendments, PIPEDA now consists in large part of a growing list of exceptions to the rules around collection, use, or disclosure with consent. While the OPC has worked hard to make the legal principles in PIPEDA accessible to businesses and to individuals, the law itself is not accessible.
In a recent PIPEDA application involving an unrepresented applicant—and most of them who appear before the Federal Court are unrepresented, which I think is another issue with PIPEDA—Justice Roy of the Federal Court expressed the opinion that for a party to “misunderstand the scope of the Act is hardly surprising”.
I've already mentioned the piecemeal amendments to PIPEDA over the years, as well as concerns about transparency. In this respect, it's important to note that the statute has been amended so as to increase the number of exceptions to consent that would otherwise be required for the collection, use, or disclosure of personal information.
For example, paragraphs 7(3)(d.1) and (d.2) were added in 2015. They permit organizations to share personal information between themselves for the purposes of investigating breaches of an agreement or actual or anticipated contraventions of the laws of Canada or a province, or to detect or suppress fraud. While these are important objectives, I note that no transparency requirements were created in relation to these rather significant powers to share personal information without knowledge or consent. In particular, there's no requirement to notify the commissioner of such sharing. The scope of these exceptions creates a significant transparency gap that undermines personal information protection. This should be fixed.
PIPEDA also contains exceptions that allow organizations to share personal information with government actors for law enforcement or national security purposes without the notice or consent of the individual. These exceptions also lack transparency safeguards. Given the huge volume of highly detailed personal information, including location information, which is now collected by private sector organizations, the lack of mandatory transparency requirements is a glaring privacy problem.
The Department of Innovation, Science and Economic Development has created a set of voluntary transparency guidelines for organizations that choose to disclose the number of requests they receive and how they deal with them. It's time for there to be mandatory transparency obligations around such disclosures, whether it be public reporting or reporting to the commissioner, or a combination of both. Also, that reporting should be by both private and public sector actors.
Another major change that is needed to enable PIPEDA to meet contemporary data protection challenges relates to the powers of the commissioner. When PIPEDA came into force in 2001, it represented a fundamental change in how companies were to go about collecting, using, and disclosing personal information. This major change was made with great delicacy. PIPEDA reflects an “ombuds” model that allows for a light touch, with an emphasis on facilitating and cajoling compliance rather than imposing and enforcing it. Sixteen years later, and with exabytes of personal data under the proverbial bridge, it's past time for the commissioner to be given a set of new tools to ensure an adequate level of protection for personal information in Canada.
First, the commissioner should have the authority to impose fines on organizations in circumstances where there has been substantial or systemic non-compliance with privacy obligations. Properly calibrated, such fines can have an important deterrent effect that is currently absent from PIPEDA. They also represent transparent moments of accountability that are important in maintaining public confidence in the data protection regime.
The tool box should also include the power for the commissioner to issue binding orders. I'm sure you're well aware that the commissioners in Quebec, Alberta, and British Columbia already have such powers. As it stands, the only route under PIPEDA to a binding order runs through the Federal Court, and then only after a complaint has passed through the commissioner's internal process. This is an overly long and complex route to an enforceable order, and it requires an investment of time and resources that places an unfair burden on individual complainants.
I note as well that PIPEDA currently does not provide any guidance as to damage awards. The Federal Court has been extremely conservative in damage awards for breaches of PIPEDA, and the amounts awarded are unlikely to have any deterrent effect other than to deter individuals who struggle to defend their personal privacy. Some attention should be paid to establishing parameters for non-pecuniary damages under PIPEDA. At the very least, these will assist unrepresented litigants in understanding the limits of any recourse that's available to them.
Thank you. I welcome any questions.
I would like to thank you for this opportunity to contribute to your work on the review of the Personal Information Protection and Electronic Documents Act (PIPEDA), and for the chance to share my thoughts with you on an issue of importance to Canadians.
I am an Assistant Professor of Law and Technology at the Common Law Section, Faculty of Law of the University of Ottawa, where I teach Digital Economy Law, and am the Director of the Centre for Law, Technology and Society. Nonetheless, I appear before you today in my personal capacity.
My comments build on the letter the Commissioner sent you last December 2. I will focus on the issues of enforcement powers and reputation. I will then move to the scope of the act, before concluding with some reflections on its accessibility and readability.
Throughout my presentation, I will refer to the European Union's new General Data Protection Regulation, the GDPR, particularly because of the adequacy issues raised by the Commissioner.
As to the enforcement powers, I believe it is essential to strengthen the Commissioner's powers in order to ensure the effectiveness of the act, in particular by granting the Commissioner order-making powers and the authority to impose administrative monetary penalties. The ability to impose fines appears to be the most effective way to ensure protection.
As with everything, the protection of personal information is subject to a cost-benefit analysis. Organizations can either invest in protection by design or accept the possibility of a slap on the wrist. With the risk of monetary penalties, the cost-benefit analysis will favour a protection-by-design approach. Obviously, the amount of the fine will be a critical parameter for its effectiveness; a prohibitive amount is required. For example, while a $500,000 fine may seem significant, and it will be for small and medium-sized businesses, it will be an insignificant amount for companies like Amazon, Facebook, or Google. In that respect, it was by imposing a $22.5-million fine that the U.S. Federal Trade Commission succeeded in getting Google to modify its DoubleClick advertising program.
To prove effective against big players, the maximum fine should be specified as a percentage of worldwide turnover, for example, 1%. To ensure that the fine is not trivially small for small and medium-sized enterprises, a second limit should be provided, for example, $500,000, with the greater of the two applying. Incidentally, the GDPR takes just such a mixed approach.
In my view, this does not threaten the collaborative relationship between operators and the Commissioner. On the contrary, I am of the opinion that strengthened powers will encourage greater co-operation among actors, before any damage is done. Moreover, such powers seem necessary to obtain an adequacy decision under the GDPR.
To avoid the appearance of conflicts of interest, fines should be made payable to the Receiver General. To protect small businesses and avoid slowing down innovation, we could provide a procedure for a preliminary conformity assessment. In the event of damage, sanctions would be imposed only after an issued recommendation has not been acted upon within a reasonable time.
Finally, I am of the view that none of the Commissioner's powers, including the order-making and sanction powers, should be contingent on the receipt of a formal complaint, with all of these powers evidently remaining subject to possible judicial review.
As to the rights of individuals and online reputation, many favour the creation of a “right to be forgotten”. As it is imagined and requested by some, I find this proposition dangerous. The Internet is the archives and the libraries of tomorrow, the new collective memory. Archives have never previously been erased because they were disturbing, at least not legally in a democracy. This is dangerous ground, and it is similarly dangerous to delegate censorship powers to private actors or to give a select few the power to decide what should or should not be accessible. In the same vein, the right to de-index seems illogical to me, in that it would entail the removal of the index entry, but not the content itself.
Legislation protecting personal information should not be used as a reputation management tool to remove what is embarrassing, but only to remove anything that is unjustified or inaccurate. Otherwise, I am not sure that such a mechanism would satisfy the charter test.
The real problem with Canadian law is that PIPEDA recommends, but does not require, the erasure of inaccurate or unnecessary data. Admittedly, in its recent and already famous Globe24h decision, the Federal Court circumvented this deficiency by relying on the illegitimate and unauthorized nature of the disclosure.
Nevertheless, the erasure of data should be compulsory, and not simply recommended, once the data is no longer necessary or accurate, through stricter controls on the retention of data over time. One could also provide an actionable right to the erasure of outdated and inaccurate information. I should point out that this need relates not only to the Internet, but to all databases, computerized or not.
It seems to me that these amendments are necessary, and also sufficient, for GDPR adequacy.
As to the scope of the act, Canadians should be assured that any harmful collection, use, or disclosure of data is subject to strict standards of protection.
As currently defined, the scope of the two federal statutes does not meet citizens' expectations of protection in a global and interconnected world, both with respect to the data that is protected and, in particular, the organizations that are covered.
A solution on the organizations side would be to redefine the scope of PIPEDA so that it applies to all organizations operating under federal jurisdiction that are not covered by the public sector act or any other federal law. Naturally, and as our partners have done, the law should retain exemptions for personal or journalistic use.
As to access to the law, while it is undeniable that the act requires modification in view of new realities, the legislator should seize the opportunity of this reform to perform a complete overhaul of the law, instead of making simple amendments.
Indeed, PIPEDA undoubtedly belongs in the hall of fame for the worst-drafted federal laws, and we know that there is some competition in that category. The cornerstone of PIPEDA lies in an appendix copied and pasted from a document drafted by a private standardization organization. The act merely supplements this document and other appendices by making constant references to them.
This poses a problem in terms of the public's access to the law. A rewrite of the law that clearly explains the rights and obligations of each party would therefore be welcome, especially one that makes mandatory all that is presently merely recommended.
In terms of drafting, the act should remain technologically neutral and principles-based. Such an approach is essential to enable the Canadian legal framework to adapt to future social and technological changes, including the development of robotics, the Internet of things, and artificial intelligence.
In terms of readability, limiting the legislation to the protection of personal information would be welcome. The rules on functional equivalence for electronic documents are out of place here and should be moved elsewhere.
Conversely, it would be desirable for a single act to contain the entire framework for the protection of personal information, covering both the private and public sectors. The concurrent review of these two acts by this committee offers that opportunity. It would also allow for the creation of a coherent framework for both the protection of personal information and the role of the Commissioner, even if that means providing separate sections should it be considered necessary to maintain a distinct regime for the public sector.
As a final thought, I would like to draw your attention to the need to provide statutory rights of action and damages. Equally, I would like to underline that while it is necessary to update our law to satisfy the GDPR's adequacy test, we must nevertheless consider two important factors: first, the test does not require a carbon copy of the GDPR; and second, it applies to all of our protection frameworks, not just PIPEDA.
I hope that these few thoughts and recommendations will be useful to the committee. Sadly, I wasn't able to finalize a short bilingual brief with examples and recommendations in time, but I can send it to you afterwards.
Thank you. I'll be happy to answer any questions that you may have.
I'll start by endorsing the distinction that Professor Scassa made between data protection and a right to deletion and a right to be forgotten, because that is a key distinction.
The way it was handed down was the first problem we saw. The decision by the European Court of Justice didn't even really mention freedom of expression, and it included statements that, for example, the right to privacy generally trumps people's right to obtain information. There was a lack of proper consideration of the rights being infringed; that would be the first problem. That relatively bare decision provided the only guidance, at the outset at least, that Google was going to have in implementing the right, which was hugely problematic: an enormous new responsibility was created without much guidance on how it was supposed to be carried out.
As I mentioned briefly before, putting this on the private sector is hugely problematic, because this is a very tricky decision. It involves balancing different rights against one another, and it involves considering the overall public interest. Google is absolutely not equipped to do that. Even for a company of its size, this is something that you need judicial or quasi-judicial decision-makers to take on. Saddling it onto the private sector was also a significant mistake. I think you saw that the floodgates sort of came open. I looked it up in the interim, and I saw 348,000 requests to Google to remove links.
When I say that I have a certain amount of sympathy for a few limited cases where the right to be forgotten could be applied, I think it's challenging to implement in a way that confines it to those extreme cases. The European example shows that once the right is implemented, the floodgates come open, and people request deletion of a huge amount of legitimate, accurate, or perfectly relevant information.
That's a very good question.
In terms of ubiquitous and continuous collection, people have suggested that there should be pop-ups from time to time to remind people that their information is being collected by the toaster, for example, and that they might want to think about whether they still want that to be happening. There are those types of things. Some of those could be mandated in legislation. Some could be done through guidance from the Privacy Commissioner.
There are others who suggest, as you know, broader fixes, such as treating whole categories of data collection as fairly routine, so that consent wouldn't be required. What worries me about that, of course, is the threshold that there be no risk or no harm. I think that in the big data environment, we're still trying to figure out exactly what the risks and harms are. It's not always obvious at the outset what the implications of collecting certain types of data will be, depending on what is subsequently collected by someone else and combined with it.
I think there are some very serious challenges there, and I wish I could say, “Here are the three things that need to be done”, but I'm still struggling with it myself.
Okay. Thank you very much.
I'd like to come back to the question of consent. I know we've tried this a couple of times.
I could be wrong about this, but I think the collection and third party use of my personal data that I consent to often goes well beyond what is really needed for my use of the service. Is there a legal way to divorce consent to that widespread use of my personal information from consent to what the service really needs?
If you have an app like Foursquare, for instance, which uses your location, which is about where you are and about sharing that with other people, obviously collecting my location at that time, and doing that through my phone, is part of the app. With other software, however, you're often consenting to a broad statement about using your personal information that really has nothing to do with the use of that software. It's not integral to the operation of whatever service it is I'm trying to access.
Is there a way to try to carve that up that doesn't become overly cumbersome?