Thank you very much to our witnesses for being patient as we resolve issues in the House.
I will just mention that because I'm the lone New Democrat on the committee, I'm going to take time to ask questions as well. When I do that, Mr. Kelly will take the chair, and then I'll come back into the chair.
With that, I'll mention briefly who our witnesses are, and then we'll go into the presentations. We have Molly Reynolds, a senior associate with Torys LLP; Paige Backman, a partner in Aird and Berlis LLP; and Alex Cameron, partner and chair, privacy and information protection group, Fasken Martineau DuMoulin LLP.
We'll start with Mr. Cameron.
Thank you very much and good afternoon. I very much appreciate the opportunity to share with the committee some of my personal thoughts about important issues of privacy that affect Canadians. As mentioned, I'm a partner with the law firm Fasken Martineau. I've been working in the privacy law field for 17 years—in fact, pretty much since PIPEDA came in. Today, however, I'm appearing before the committee purely as an individual.
I have read many of the transcripts of the evidence that has been given to date before the committee, and certainly you have heard from a very wide range of perspectives and experts in the field. I share many of the views that have been expressed to date, in particular the submission by the Canadian Bar Association, of which I am a member.
In very broad strokes, my perspective on the issues before the committee is that the existing consent and ombudsperson model under PIPEDA has proven to be remarkably resilient, adaptable, and effective in achieving the purpose of PIPEDA, which is, of course, to recognize the right of individuals to privacy and the needs of organizations to collect, use, and disclose personal information. On balance, in my view, the model should continue, absent a compelling need for legislative change.
As for what I do want to focus my brief remarks on, one of the unique aspects of my practice and experience is privacy-related litigation. I want to suggest to you that the developments in the courts in respect of privacy-related interests and claims, which have been very significant over the last five years in particular, form a very important part of the context in which PIPEDA operates, and as I'll suggest, the context in which any deliberations about changes to PIPEDA should take place.
In addition, in my review of the evidence to date, it does not appear that the committee has heard very much about the developments in the courts. I saw that there have been some references to some developments in passing, but in the interests of trying to contribute something a bit new that you may not have heard or focused on previously, I'll focus my opening remarks on that issue, although I am happy to answer questions on other topics.
As the committee is aware, under PIPEDA there is the possibility of going to Federal Court in respect of matters addressed in a commissioner investigation and report. The court has the power to award damages and other relief, and that power gives some additional teeth to the legislation.
I want to describe what I see by telling a bit of a story about developments we have seen outside of PIPEDA, which emerged starting in or about 2010.
First, in or about 2010, we started to see a handful of cases going to Federal Court under PIPEDA, in which individuals were typically awarded $5,000 or less for privacy breaches. Most of those were what I would characterize as relatively minor privacy incidents. We have seen a continuation of those types of cases going to Federal Court under PIPEDA since that time.
However, in terms of where the significant developments have taken place, in or about 2012 and into 2013, what we have witnessed in Canada is really an unprecedented increase in privacy-related litigation activity outside of PIPEDA. These cases are not going through the commissioner's process and on to Federal Court. These cases are being brought directly to court through tort claims, contract claims, negligence claims, and other causes of action. They've been very significant in the range of issues that are covered.
We have many cases that have dealt with cybersecurity-related issues, from computer hacking to snooping in the workplace, lost USB drives and lost devices, alleged misuse of personal information for commercial purposes, and inadvertent disclosures of information. These have crossed both private and public sector boundaries. This real proliferation of litigation started in or about that time and was something that had never really been seen before in Canada.
I highlight further that this development was unprecedented both in terms of the volume of activity—so a lot more cases were being brought—and also, in particular, in the fact that they were being brought not just by individual complainants but also through class action litigation. There are currently many cases that have been brought and a number of class actions that have been certified. A few of them have now been settled, and this litigation activity continues.
Significantly, the developments, in that respect, have meant that many cases that might otherwise have gone to the commissioner or through the PIPEDA process are, instead, just going directly to the courts. In my submission, that's something that can't be ignored in assessing some of the issues that I know are on your mind in terms of potential areas for change in PIPEDA, which I'll come to in just a moment.
The expectation, I would add, is that the litigation trend is going to continue because of the mandatory breach notification provisions, which may yet come into force in PIPEDA. The idea is that as more notifications are required to be given by organizations with respect to privacy breaches, we will see individuals seeking legal advice, and potentially more litigation claims being brought in the wake of notifications being received. That is the expectation.
In terms of how this relates back to some of the questions that I know are on the table with regard to changes in PIPEDA, one reason this significant development of the last five years is relevant is that it responds to the suggestion that, in the absence of enhanced powers for the commissioner under PIPEDA, organizations will not or might not take privacy compliance sufficiently seriously. That concern is often cited as a reason in support of granting such powers.
Certainly in my experience, organizations are taking privacy seriously, but I point to the litigation-related development as part of the broader context in which these issues are being addressed. Those potential claims present very real legal risk—real dollars having to be spent to deal with those issues and, ultimately, of course, potential liability for a wide range of privacy breaches. The courts have taken on a very significant role in shaping privacy protection in Canada at a practical level in that respect.
I'd further suggest, in terms of other areas of relevance of this development—and I know it's been highlighted by the bar association—that this broader context is also important in terms of assessing the question of adequacy in relation to the GDPR that's emerging from the EU.
I'll stop my remarks at this point. Of course, I'd be happy to take questions on that topic and the others on your mind, but I wanted to contribute that piece in particular, as I haven't seen it reflected much in the testimony to date.
Thank you for inviting me to appear before you today in respect to your study of PIPEDA. I'll give you a short background on me and then focus on the two issues that I submit should be part of your study of this act.
I am a privacy and data security litigator in Toronto. I counsel private sector organizations on both Canadian and American privacy law compliance. I also represent individuals who seek to enforce their privacy rights in the civil courts, including in this unfortunate area of non-consensual distribution of intimate images. I also note by way of background that I'm somewhat closer to the generation that grew up with the Internet, rather than the generation that saw the first office fax machine. That's part of the context that I bring to my perspective today.
Let me start with the top priority in my submission.
The single most significant reform that could be made to PIPEDA is to permit advance compliance rulings. We can do more to protect the personal information of Canadians and to improve private sector compliance by explicitly empowering the Office of the Privacy Commissioner to issue compliance rulings before a new initiative is launched by the private sector. You have already heard, I believe, about advance rulings, from my colleagues at the Canadian Bar Association, but this framework would allow organizations to voluntarily submit to the OPC a new initiative that may affect personal information—that might be a new product, a new technology, or a new service structure—and then receive the OPC's feedback on whether that design will likely comply with PIPEDA.
In my view, this authorization would require legislative amendments, because the OPC's powers as they're currently framed under the act really deal only with conduct of investigations, audits, or compliance agreements where an organization is believed to be out of compliance with the act, but the power to issue advance rulings shouldn't hinge on any belief of non-compliance. It should be voluntary, and it should be proactive on both sides.
In my submission, the power to issue advance compliance rulings would have four significant impacts.
First and foremost, it would better protect Canadians. The OPC and business would be using their resources to proactively protect Canadians' privacy rather than simply investigating and penalizing compliance failures. Just as we say that an ounce of prevention is worth a pound of cure, resources are better spent ensuring that privacy law compliance occurs before anyone's information is put at risk with a new initiative.
Second, it would benefit both organizations and the Office of the Privacy Commissioner because, through assessing these new initiatives, the OPC would gain better and more current insight into new developments and new technologies that affect personal information in the Canadian economy. This would allow it to provide more technical and more current general guidance, and to share lessons learned with other organizations to better promote privacy awareness across the economy.
Third, advance rulings would increase certainty for all involved. An advance compliance ruling would allow organizations to rely on the commissioner's expertise in designing appropriate personal information protection in new initiatives. This will provide them with more certainty around what the compliance requirements are and a fresh perspective on the privacy implications of their new technology or their new project without stifling innovation.
Fourth, advance compliance rulings would improve risk assessment in the private sector, in my submission. The advance ruling option would encourage businesses to implement internal privacy impact assessment mechanisms, and that would have a positive impact on PIPEDA compliance across organizations and across the industry, beyond any one initiative that may be submitted to the OPC for review. As many of you may know, the Treasury Board Secretariat already requires government institutions to perform a privacy impact assessment to measure the potential impact of a new initiative on individual privacy rights, but we could craft this in the private sector so that in order to seek an advance ruling, the OPC would require an organization to first submit the results of its internal privacy impact assessment. This would further the spread of PIAs as a standard practice in the private sector and lead to more consistent protection for individuals' privacy rights.
Finally, on this first issue, I would note that advance compliance rulings should not be binding for either party. They should encourage a voluntary dialogue between industry and the regulator to further this proactive protection of personal information.
The second area in which, I submit, PIPEDA reform would have a significant effect is to establish a clear threshold for when information has become sufficiently anonymous that it's no longer defined in the act as personal information. The Privacy Commissioner did address this somewhat difficult issue in the discussion paper on consent and privacy, which, I believe, has been discussed here before. One of the essential features, which I know you've heard about time and again, is that PIPEDA was designed to be, and is, technology-neutral, but as technology develops, we're actually creating new forms of information. You think of metadata. You think of the results of data analytics. We have new categories of information, and it's often challenging for the private sector to determine whether the data it is creating or it is handling is personal information at law.
We could improve certainty here if PIPEDA or the regulations thereunder actually codified the threshold for what is identifiable information. The Privacy Commissioner's discussion paper refers to two thresholds that could be considered: whether there is a serious possibility that the individual could be identified—that's the one that Canadian courts have looked at before—or whether identification from the information is likely, which is the threshold that the U.K. commissioner has used previously.
The issue of de-identification does link back to my first point. If the Privacy Commissioner is given the authority to provide advance rulings to businesses, organizations could then test their assessments of whether the information they are handling is so unlikely to be associated with an individual that it's actually taken outside of the scope of the act, and they could do that before they finalize their program designs. If the OPC says they're wrong, safeguards could be put in place well before any information is actually put at risk. This is very consistent with the Privacy Commissioner's mandate to protect and to promote privacy rights.
In addition, on this point, a standard for de-identification is relevant to the right, in Canada and abroad, to have personal information deleted. As technology continues to develop and the storage of information becomes more decentralized, it's often becoming impossible to permanently delete every copy of every record that may contain an individual's personal information, especially where that definition of personal information may change with the context or with the technology we're using.
The act already contemplates that information should be destroyed or erased or, importantly, made anonymous when it's no longer required, and it contemplates that an organization may be required either to delete or to amend personal information when an individual requests that.
This is consistent with the idea of having a strict threshold for de-identification or what constitutes anonymized information. The value of that existing framework is that it is still technology-neutral, and we can protect individuals' privacy rights even where the technology we're using to store personal information doesn't allow us to permanently delete it. The alternative way to eliminate personal information in that context is to anonymize it. In my view, these options around amending or anonymizing information already exist in the act and can be held to be essentially equivalent to the EU general data protection regulation as it relates to the right to erasure, but individuals, organizations, and the regulator would benefit from a statutory threshold that governs when data is no longer deemed to be personal information at law.
By way of brief conclusion, I do note that many of my colleagues who have appeared before you on previous days have addressed the EU GDPR in some detail, and I don't want to dwell on that for too long. But as it relates to this issue of anonymizing personal information and whether the existing retention requirements under the act are equivalent to this right to erasure in the EU, I would just urge you to focus your study on the interests of Canadian consumers and Canadian businesses that are operating under both Canadian and international law. I respectfully submit that the focus of this study should not be reforms that would merely encourage an adequacy ruling from the EU, but rather areas in which harmonization of international standards with Canadian privacy law would truly help consumers and businesses protect information more consistently and with more certainty across jurisdictions.
I look forward to my colleagues' comments and any questions that you might have.
Good afternoon and thank you for inviting me here to contribute to your study on PIPEDA. The task facing this committee is challenging but extremely important.
Unlike Mr. Cameron and Ms. Reynolds, I'm a corporate lawyer. Among the three of us, there are two litigators and a corporate lawyer.
I am the co-chair of our firm's privacy and data security group, but I'm also a director of the CyberSafety Foundation, which looks at the protection of individuals with regard to cybercrimes and online activity. Those two positions, I would suggest, are often not in alignment, but I would suggest that with some thought you can further both interests—the advancement of business and the protection of individuals.
My submissions this afternoon are my own personal ones, and I welcome and applaud Ms. Reynolds's and Mr. Cameron's insights. I think they're very valuable.
The pace of technological advancements since the introduction of PIPEDA over 17 years ago has been staggering, as has been the way in which businesses have created, and continue to create, new business models applicable to all ages and all demographics to take advantage of the new technologies. This has resulted in an equally significant evolution in the ways in which individuals interact with technology; the nature and scope of personal information being collected, aggregated, re-identified, used, disclosed, and sold; the manner in which businesses can commercialize this information; and the resulting impact on individuals arising from the foregoing.
As I think everybody who has offered submissions has probably noted, this has created quite a challenge for the application of PIPEDA and its evolution over the last 16 years.
While there are additional areas within PIPEDA to which I can propose amendments, for the purposes of my submission, I'm focusing on three key areas: a framework for consent, oversight of minors, and a limited right of erasure. While I will not provide recommendations regarding enforcement powers of the OPC, I will conclude with a consideration regarding the same.
For the audience at hand, it goes without saying that valid consent is the foundation of PIPEDA and of all privacy laws around the world, and my experience supports the numerous studies and extensive submissions to the committee that conclude that privacy policies and the way in which they are currently used are highly ineffective at communicating important information and obtaining the requisite consent.
This is an issue for both organizations and individuals, not simply for individuals. Organizations relying on privacy policies have a false sense of security that they've obtained the requisite consent. If individuals cannot reasonably understand how their personal information will be used or disclosed, or if they don't understand if and when a business's information-handling practices go beyond what is required to fulfill a legitimate purpose, there is no consent. Fixing this element will be a critical pathway to bringing both parties together.
It would be unrealistic to suggest that we can find an approach that will satisfy every individual across all demographics. However, a proper framework for consent will allow businesses to have greater certainty that they've established the requisite consent and will also provide individuals with meaningful information on which they can provide or not provide their consent.
To that end, I recommend that the following four-part framework for supporting consent be adopted. One, define information-handling practices for which consent may be implied, and incorporate the same into a model code attached to PIPEDA. Certain suggestions for terms to include in this model code are attached as schedule 1 to my written submission. This will clarify the practices for which organizations can rely on implied consent, and to the extent an organization's practices deviate from such a model code, the organization's privacy policies would focus on those supplemental practices.
Two, require express consent for those practices that deviate from or are in addition to the model code.
Three, practices relating to secondary purposes should be specifically delineated within privacy policies, and a clear and readily available opt-out right for each secondary purpose should exist.
If anybody missed words of wisdom, let me know.
Voices: Oh, oh!
The second area for which I'll provide recommendations involves the oversight of minors. I represent a number of large education-focused businesses as well as other non-education businesses whose online sites and apps are used by minors. One of the most consistent and significant issues they grapple with is when it is appropriate to obtain consent from someone under the age of majority, and when and how to obtain the consent of the minor's parents or guardians.
I incorporate into my submissions some studies, as referenced in my written submission, that find that a significant percentage of young children are participating in online activities. I also incorporate a reference to a recent report by the Children's Commissioner for England reflecting on the terms and conditions of Instagram, an app used by over 50% of children between the ages of 12 and 15 in England, and by 43%—almost half—of children aged 8 to 11.
Instagram's terms and conditions were 17 pages long and contained 5,000 words, with language and sentence structure well beyond the capability of the average youth—and, I would suggest, the average adult. When asked to read through the terms and conditions, the children and the youth were frustrated and understandably confused. While young Canadians may be text-savvy—as I can attest from my own young sons, who are perhaps more text-savvy than I am—children and youth are often not able to comprehend the terms of the policies even when these are brought to their attention, and often lack the knowledge and understanding of the business processes and consequences of those processes required to provide informed consent.
To that end, I recommend that organizations be required to obtain verifiable consent of a parent or guardian of individuals under 16 years of age. Any method to obtain verifiable consent should be reasonably calculated, in light of available technology, to ensure that the person providing consent is the child's parent or legal guardian. While the age of 16 is not a magic number, it is consistent with domestic laws as well as international laws, such as the GDPR. In relation to the approach to obtaining the consent of the parent or guardian, our recommendations are consistent with the U.S. FTC's children's online privacy protection rule as well as the GDPR, both of which require organizations to make reasonable efforts to obtain verifiable parental consent, taking into consideration the available technologies.
The third area to which I will recommend amendment to PIPEDA relates to a limited right of erasure. We've heard a lot about it. There are definitely pros and cons to both sides. With that in mind, my recommendations are to a limited right of erasure.
To this end, I incorporate reference to studies, included in my written submission, that reflect the extensive use by young children of websites and online applications that involve the collection of highly sensitive information, such as photos, videos, journal-type entries, and location, and the posting of the same publicly. Either they are posting it themselves or others are posting it and reposting it.
There are significant benefits to children and youth engaging in online resources through social media. However, an error in judgment by a minor, or by another person involving the information of a minor, can have significant short-term and long-term consequences for both the minor and society. More frequently, we are seeing that an online footprint, whether placed there by the minor themselves or by someone else, can be central to online bullying. Such bullying can significantly impact the physical and mental health of the child and can lead to lasting harm.
While the parental consent recommendation above addresses the protection of minors at a particular point in time, we also need to address the ongoing sharing and use of minors' information in commercial activities. Remember, this all occurs in the course of commercial activities throughout the child or youth's involvement in the online environment, which often proceeds without parental involvement.
To that end, I recommend that the right of erasure be enacted in relation to minors where their personal information has been collected, used, and disclosed in the course of commercial activities.
Consistent with this recommendation, I note that the GDPR also supports the increased need for the right of erasure when personal information of a minor is involved. Specifically, we recommend the following, in a manner consistent with the GDPR.
Individuals whose personal information is collected, used, or disclosed in the course of commercial activities, and that is or was collected, used, and disclosed during the time such individual was a minor, should have the right—and their parents and guardians should have the right—to have such personal information deleted without undue delay, except in those limited instances that I have set forth in my written submissions. To the extent that such personal information has been disclosed or transferred to a third party or otherwise made public, the organization that originally collected the information and all parties who are using or disclosing such information should take reasonable steps, including the use of reasonably available technology, to delete all copies and links to such personal information.
My last comment involves the enforcement powers of the OPC. I will not provide recommendations supporting specific enforcement powers. However, for purposes of discussion around the same, I reinforce that the general principles upon which PIPEDA is based, while creating flexibility, create great uncertainties around an organization's compliance obligations. Without greater certainty surrounding the compliance requirements under PIPEDA, it will be unfair and highly prejudicial to impose additional penalties and fines on such organizations.
In conclusion, I reiterate that the task facing the committee is challenging but extremely important. I commend you for your time and effort in modernizing PIPEDA and ensuring the amendments to PIPEDA are relevant and valuable in achieving its purposes. The effort to modernize PIPEDA and ensure the protections afforded thereunder are relevant and valuable will not come without roadblocks; however, decisions not to modernize and amend PIPEDA in a way that results in clarity and protections for businesses and individuals also come at a very high cost.
I hope my submission is of some value. While I limited proposed changes to three key areas, I welcome questions on those or other topics.
Yes, I do, up to the age of 18.
I'm not suggesting that the right to be forgotten shouldn't be applicable to adults. I get calls from adults who have been slammed on the Internet and have material there that should reasonably be removed. For children, though, this is something I think we should tackle. We're talking about the use of children's information in a commercial activity. Whether it's the child who puts the information online, a friend of theirs, or somebody else, it can have serious consequences for the development of the child. It can affect their mental health and their physical health. We're seeing a troubling rise in self-harm among children, because in the online environment they can't get away from the mistakes that are out there.
I'm perhaps not as close to the technology era as Molly is. I certainly made mistakes growing up, and I can tell you that had those been videotaped or uploaded, I'm not sure I'd be standing here before you, or have been called to the bar.
We all need to allow our youth and our children to make those mistakes and move on from them. To my mind, the right of erasure applicable to minors, when their information is involved in commercial activities, strikes the balance.
Perfect. Thank you very much.
Thank you, everybody, for being here today. Thank you for your patience while we voted.
I'd like to come back to some of the comments that my colleague Mr. Saini was asking you, Ms. Backman, but I'll also open it up to you, Ms. Reynolds, and to you, Mr. Cameron, on some of the stuff that you started talking about.
First, Ms. Backman, you used “right to be forgotten” and “right to erasure” almost interchangeably, it seemed to me. Going back to your point, what is also clear to a lot of us in the room is that if there are certain things in our past that we feel should be erased, somebody else might not feel that way. Particularly for members of Parliament, in running for public office there may be certain things that other people may find relevant to voting that maybe aren't necessarily what we would like to put out there—myself excluded completely, though.
Voices: Oh, oh!
Mr. Matt Jeneroux: I do want to get your comments on how we're struggling with that—
Thank you. I'll take maybe just a minute or two, and then Ms. Reynolds can respond.
It's a good question. It is a struggle to deal with that, which is one of the reasons that I limited my submission to it being applicable to minors. Let's think of the instances in which this may be relevant, such as a criminal record or some sort of a breach of law. Those are not erased simply by taking it offline, so, if you have a criminal past, that will still be accessible. It's just not there for the entire public to see.
What we're seeing more of—where we're seeing the pressure point—is poor judgment, which we all experienced as children, whether it's photos from a party with alcohol, which can really have an impact on your prospects for employment or for getting into educational institutions.
For me, the balance is that I understand that struggle and I think it's a legitimate struggle. I think it becomes more relevant once you become more able to correct your behaviour as an adult.
A child has very limited ability to really understand those consequences and, therefore, to correct the behaviour before this happens. The more significant issues that a child would be involved with, which you would want to know about legitimately, are going to be in a record somewhere. This is simply the additional bad judgment, which we all experience, that is just going to be removed from online.
I think given that this is in the context of a commercial activity, the balance really weighs more in favour of protecting the minor, letting them make those mistakes, and then allowing them to move on.
In some of the earlier testimony, I think we heard a clear preference on the part of some witnesses to stick with the current ombudsman model. One of the debates we've been having here at committee is whether it would be appropriate to give the Privacy Commissioner order-making power. Obviously, that would be a deviation from the current model.
I appreciated your suggestions, Ms. Reynolds, about giving the commissioner the power to issue advance rulings. I am wondering if you and Mr. Cameron—and Ms. Backman as well, if she wants to jump in on this—could speak to how the power to issue advance rulings might actually work well, or not, alongside order-making powers.
One of the disadvantages of conferring order-making powers, we're told by people who don't want that route, is that it becomes cumbersome, it's complicated, and you're creating a bureaucracy. However, if the Privacy Commissioner had those order-making powers, it might incentivize taking advantage of advance rulings, and advance rulings might help mitigate the quantity of order-making instances, if you will. I'm wondering if you could speak to that point.