Thank you very much, Mr. Chair.
Members of the committee, thank you for inviting us here for your study of the Personal Information Protection and Electronic Documents Act, or PIPEDA.
As you know, PIPEDA is technology-neutral and based on principles of general application. These two qualities should remain, as they are strengths that make the law a flexible tool.
However, the constant and accelerating pace of technological change since the turn of the 21st century, when PIPEDA came into force, is challenging the law's effectiveness and sustainability as an instrument for protecting the privacy of Canadians.
These technological changes bring important benefits to individuals. They greatly facilitate communications, they make available a wealth of information of all sorts, and they bring products and services from all areas of the world.
But these technologies also create important risks. Internet users want to share their views and search sensitive issues like health without fear that these activities will be tracked and shared with others who have adverse interests. In fact, it is an essential aspect of the right to privacy that individuals have control over with whom they share their personal information.
New technologies also hold the promise of important benefits for society. Future economic growth will come in large part from growth in the digital economy. For instance, Canada is well placed to become a world leader in artificial intelligence, which depends on the collection and use of massive amounts of data.
The 2016 OECD Ministerial Declaration on the Digital Economy, to which Canada is a signatory, commits, among other things, to an international effort to protect privacy, recognizing its importance for economic and social prosperity. Indeed, the protection of privacy is critical for building consumer trust and enabling a vibrant, robust and competitive digital economy.
Yet, the vast majority of Canadians are worried that they are losing control of their personal information, with 92% of Canadians expressing concern, and 57% being very concerned, about a loss of privacy in our most recent public opinion poll.
Without significant improvements to the ways in which their privacy is protected, Canadians will not have the trust required for the digital economy to flourish, they will not reap all the benefits made possible through innovation and, ultimately, their rights will not be adequately respected.
Consent has always been considered a foundational element of PIPEDA, but obtaining meaningful consent has become increasingly challenging in the age of big data, the Internet of Things, artificial intelligence and robotics.
When PIPEDA was adopted, interactions with businesses were generally predictable, transparent and bidirectional. Consumers understood why the company they were dealing with needed certain personal information. Today, it is no longer entirely clear who is processing our data and for what purposes.
As such, the practicability of the current consent model has been called into question.
To be clear, I think there remains an important role for consent in protecting the right to privacy, where it can be meaningfully given with better information.
There may also be situations in which consent is simply impracticable and, under appropriate conditions, it is worth exploring whether alternatives to consent can otherwise protect the privacy of Canadians. Some of these may require legislative amendments.
Through written submissions and in-person consultations with stakeholders across Canada, we've heard a broad range of suggestions.
For instance, individuals could be empowered to make decisions through simplified privacy notices. Organizations, on the other hand, could enhance their trustworthiness through the use of privacy by design, demonstrable accountability, or the adoption of industry codes of practice.
We heard that some wanted us to provide further guidance for organizations or to promote compliance through more proactive means, such as audits. Others wanted us to have greater enforcement powers, a point to which I will return.
We also heard consistently that public education is essential and that more needs to be done.
We have therefore consulted a great many Canadians on the issue of consent. We are currently analyzing the proposed solutions, along with many others, as part of our general findings on the matter. We will be happy to share our consolidated findings with you once we have completed our work in mid-2017.
Another priority area for our office is reputation and privacy. Our ultimate goal here is to help create an environment in which individuals may use the Internet to explore their interests and develop as persons without fear that their digital trace will lead to unfair treatment.
As with the consent project, we started our work by issuing a discussion paper and inviting submissions. Many of the submissions received commented on the right to be forgotten, the concept arising out of the EU under which individuals can request that certain links be removed from search results associated with their name.

While acknowledging the potential harms that can come from a net that never forgets, some submissions raised significant concerns about what a formally recognized right to be forgotten would mean for freedom of expression. Others questioned whether PIPEDA even applies to a number of aspects of online reputation, or to the search engines that are important players in that debate, and called for other solutions instead. These included greater use of targeted legislation to prevent specific harms, as we have seen in the cases of cyberbullying and revenge porn; improved education on safe and appropriate use of the Internet, especially for vulnerable populations; and improved practices for websites and online services such as social networks.

We would be pleased to inform the committee of our views once our policy position has been fully shaped later this year.
Let me now turn to the question of enforcement powers. Enforcement is key to securing trust in the digital ecosystem. Our recent poll found that seven out of ten Canadians would be more likely to do business with companies if they were subject to financial penalties for misusing their information.
Currently, my office cannot make orders or impose fines, and it is, in many respects, weaker than some of our provincial and international counterparts. Industry worries that, should enforcement powers be granted to my office, organizations would be less willing to collaborate with us and negotiate toward solutions, yet my colleagues elsewhere have not had that experience. Perhaps it is time, then, to bring my office's powers in line with those of others around the world.
That being said, I also believe there is an important role for proactive compliance. Organizations are using data in innovative ways to derive value, and Canadians expect this activity to be regulated. A proactive approach to overseeing compliance at the front end before complaints happen would bring certainty to the market and further reassure Canadians that their concerns are being addressed.
Given time considerations, I will stop here, but let me conclude—can I continue?
Adequacy is another issue that I think the committee should bear in mind during its review: the adequacy of Canada's privacy laws in the eyes of Europe. In Europe, the GDPR, the General Data Protection Regulation, which has been adopted and will come into force in 2018, will require a review of adequacy decisions every four years. Canada's adequacy status, which since 2001 has allowed data to flow freely from the EU to Canada, will have to be revisited.
A January 2017 communication from the European Commission notes that Canada's adequacy status is “partial”, in that it covers only PIPEDA, and that all future adequacy decisions will involve a comprehensive assessment of a country's privacy regime, including access to personal data by public authorities for law enforcement, national security, and other public interest purposes.
Given the far-reaching impacts of our country's adequacy status on trade, as well as the differences between GDPR and PIPEDA, it will be important to keep this consideration in mind as the committee moves forward with its study.
In conclusion, Professor Klaus Schwab, founder of the World Economic Forum, states that we stand on the brink of a fourth industrial revolution, characterized by a blurring of lines between the physical, digital, and biological spheres. This transformation, he argues, will be unlike anything humankind has experienced before.
PIPEDA was good legislation when it came into force in 2001, and it continues to provide a sound foundation upon which to build. However, in light of this new revolution, and more importantly, to meet the privacy expectations of Canadians, I believe that PIPEDA must be modernized.
Thank you very much. I look forward to your questions.
Yes. There is no absolute certainty in these matters, but I will give you my sense of what the considerations are.
The bottom line is that I think the committee should give serious consideration to reviewing any gaps or differences that may exist between Canadian privacy law and European law, because ultimately, under the European regulation, Canada's laws will be assessed—at the latest in 2022, four years after the coming into force of the GDPR—as to whether our laws are adequate, i.e., essentially equivalent to European laws.
Now, I say that there is no certainty in this matter because this standard of “essential equivalency” has not been defined very precisely by Europe. We know that equivalency does not mean “sameness”, so Canada's laws will not be expected to be a carbon copy of European laws, but still the standard appears to be quite high. It's one of essential equivalency. There may be some differences, but ultimately the laws should be essentially similar.
There are two areas in which potential differences between Canadian law and European law will have to be looked at. The first area is any differences between PIPEDA and the European regulation, the GDPR. The GDPR adds a few new rights to European law, one being the right to data erasure, which is the child, so to speak, of the “right to be forgotten”. That's one right that does not exist, per se, in Canadian law but exists in European law, and we should give consideration to whether we should bring our law closer to European law, if not to the same place. There is a right to data portability in European law that I urge you to look at.
That is a bit of the landscape for Canadian law as it pertains to private organizations. The second area concerns the public sector. An important development in Europe over the past few years has been a decision of the European Court of Justice, essentially the supreme court of the European Union, which held, in a case called Schrems, that adequacy decisions in Europe should relate not only to other countries' privacy laws governing private organizations but also to their public sector laws, including laws that govern law enforcement and national security.
What the European Court of Justice said in that case was that U.S. laws, under the previous safe harbour agreement, were not essentially equivalent to European laws for a number of reasons, including the fact that they did not contain criteria of reasonableness and proportionality. I would urge you to have a look at our laws governing the public sector as well for equivalency.
One of the reasons why, in the context of Bill , I recommended that the relevance standard be elevated to proportionality and necessity was the fact that in a few years our laws will be assessed against European laws, and European authorities will give consideration to necessity and proportionality as important factors.
That's all we have time for.
Some hon. members: Oh, oh!
The Chair: If you care to elaborate on that, that would be very helpful.
Colleagues, I appreciate your humouring me through this.
We thank you very much, Mr. Therrien, for coming once again. I'm sure it would be helpful, actually, at some point toward the end of our study, once we've heard from more witnesses on this, to have you return to clear up some of the questions and concerns we'll have, so don't be surprised if you get an invitation.
We'll suspend for a few minutes, colleagues, to get ready for our next witnesses.
The Chair: We're resuming now. In order to keep to the agenda this time, I'm going to be much stricter on the seven-minute and five-minute rounds of questioning. That's the only way we can get through our one-hour sessions. I am going to get straight to it.
We have, from the B.C. Freedom of Information and Privacy Association, via videoconference, someone who is no stranger to this committee, Mr. Vincent Gogolek.
We appreciate you joining us again today, sir.
We also have Ms. Valerie Steeves, who is appearing as an individual. She is a full professor in the department of criminology at the University of Ottawa.
Ms. Steeves, you have up to 10 minutes, so go ahead, please.
First, I'd really like to thank the committee for undertaking this study. I think it's incredibly important and very timely, given the changes we've seen since PIPEDA was first passed.
When I think back over that period of time, I always find myself thinking about three things. First, PIPEDA, as you know, was enacted to create trust in the information marketplace. Second, when PIPEDA was being passed, it was quite clear that the intention was to make consent a floor and not a ceiling. Last, data protection and the provisions included in PIPEDA were part of a larger strategy designed to protect privacy as a human right. At the time, PIPEDA was seen as a necessary part of this protection, but it was not sufficient in and of itself.
In the last 20 years or so, I've spent a lot of time doing research on children's attitudes toward and experiences with privacy and equality in networked spaces. I think that research raises real concerns about the first of those points, the success of PIPEDA in creating trust in the information marketplace.
You could argue that part of it is a lack of education. I was in the field talking to 13- to 16-year-olds in October and November. We asked them about fair information practices, and none of them was able to identify a single one. In fact, almost none of them could remember the point at which they consented to the collection of their information when they signed up for or posted material on Snapchat or Instagram.
Certainly when you talk to young people about the regulatory regime, they talk about privacy policies, and they don't talk about them in a very flattering way. From their point of view, these have been purposely written to obfuscate and confuse them, so they won't know what's happening, and so they will feel powerless.
They repeatedly—and increasingly, actually, over the years—have told us that the commercial surveillance they experience on these platforms is creepy; and “creepy” is a really important word, because typically it means that someone's privacy has been invaded. It's a marker. But at the same time, since their school lives, their home lives, their work lives, and their play lives are so intertwined with technology, they really feel they don't have any choice about it whatsoever.
I think a good starting point for your study is the recognition that even though so many Canadian young people and Canadian adults have flocked to these platforms, that doesn't mean they're comfortable with the current regulatory framework.
In 2015 we surveyed 5,500 kids between the ages of 10 and 17 across the country. We asked them, “Who should be able to see what you post online?” and 83% of them said that the corporations that own the platforms where they're posting the information should not have access to it. So if I put something up on Facebook, Facebook shouldn't be looking. And 95% said that marketers should not be able to see what they post. Whether they've posted in a public place or a private place, they felt it was private to them.
Typically when kids are talking about privacy, they're not talking about non-disclosure; they're talking about audience control, and marketers were not an audience they wanted or expected. Some 96% said that companies that sell them smartphones and other devices or apps that use GPS should not be able to use it to locate them in the real world, and 99% said that marketers should never be able to use GPS to figure out where they were in the real world.
I think this brief snapshot really strongly suggests that there is a disconnect between the regulatory model and the lived experiences of the people who play, shop, go to school, and hang out on these platforms.
I think that disconnect is really related to a bit of a fiction that's embedded in PIPEDA. PIPEDA assumes that, when someone posts a photo on Instagram or is keeping a streak going at midnight on Snapchat, they are knowingly and consciously undertaking a commercial transaction, trading their personal information for access to the platform.
But from the point of view of the people who live on these platforms, it's not a commercial transaction. If I'm on Snapchat, I'm chatting with my friends, I'm doing my homework, I'm signing a petition, I'm exercising my free speech, or I'm exercising my freedom of association. I don't think that's an outrageous perspective. Certainly that's the same relationship we have with our land lines. Although I spend $70 a month so Bell can put a phone line in my house and I can talk to people, I certainly don't expect Bell to listen to my phone calls.
I had a painter in the other day. I don't expect Bell to interrupt my conversation with my painter and tell me, “Home Depot has a sale on paint right now”, and sell to me in that environment. And I certainly don't expect Bell to take all that information and run it through an algorithm to figure out if I'm a criminal or not.
If we go back and look at that time period, reconnecting with that earlier hope for PIPEDA, I think, calls upon us to place privacy and data protection in a much broader context.
Go back to the Finestone report of 1997, in which privacy was seen as a social value, a democratic value, and a human right. I think that broader perspective provides this committee with two advantages.
The first one is that it's exactly the kind of thinking you're going to need if you intend to harmonize our privacy protection regime with the European General Data Protection Regulation, which comes into force and effect in 2018. I think it's arguable that Europe has done a much better job than North America in navigating the challenges we've seen in networked spaces over the last 15 years or so, precisely because of a strong commitment to human rights and a strong jurisprudence built on that commitment.
I also think that this broader perspective, placing data protection as a necessary but insufficient piece of protecting privacy as a human right, will help us navigate the consent debate more effectively. As I said, when PIPEDA was passed, it was very clearly articulated that consent was intended to be a floor and not a ceiling, yet it sure felt like a leaky ceiling after about six months had gone by.
Particularly given the commissioner's comments on big data, there's certainly pressure to weaken consent provisions, and there's pressure to make more information publicly available precisely so corporations can sidestep the provisions we now have. There's more pressure to de-identify and to accept de-identified information as non-personal information for the purposes of the legislation.
It's always the promise of big data: if we can just keep all the information, we'll be able to learn new things, because artificial intelligence will identify patterns that are hidden to us, so that we can predict behaviour, be more efficient, and be more effective. I think privacy is the best way to crack that open and to begin to examine the ethical concerns that flow from this type of information use. This comes back to my human rights concern: big data is never predictive; it can look only to the past. It assumes that I will do in the future what I did in the past and, even worse than that, it assumes that I will do what people like me have done in the past.
There's a deep concern around these kinds of information infrastructures: that we will unintentionally and unconsciously recreate biases in our information systems. We'll either program them in through false proxies, or they'll be learned by the algorithms themselves. We can look at the example in England where data was used to identify potential young offenders. The youngest potential criminal they identified was three years of age, and he was identified because he was racialized, he was impoverished, and he lived in a particular area. There are discriminatory outcomes hidden within these information management systems.
Even if we take the position that the algorithm will be able to learn, I think all you have to do is look at what happened with Microsoft's Tay to realize that an open season on information will lead to unintended consequences that will harm the most marginalized in our society.
At a practical level, I have five suggestions.
First, I think we need to strengthen the reasonable purposes clause. I was lucky enough to participate in the commissioner's meeting on consent, and it was quite interesting. We had quite a debate, because the representatives of the businesses I was sitting with kept saying that businesses have a right to collect information, while I kept saying, “No, businesses don't have a right.” People have rights. Businesses have needs and desires. I found it quite interesting that they kept pointing to the purpose clause. I think there's an opportunity to enrich our commitment to human rights within PIPEDA by opening up that clause and reaffirming the need to protect individual rights against business uses, rather than business “rights”.
Second, I imagine that you're seriously considering adding a right to delink information if there's no public value. It's the right to be forgotten clause. From young people's point of view, certainly, this is absolutely crucial. When you sit down and talk to young people about the risks they're worried about online, that's it. They say, “Oh, something I did when I was 16 is going to sink me, and I will never be able to get over it.” I think that's a particularly important area to examine.
Third, young people certainly ask for regulators to mandate more technical controls so they can more easily control their audiences and take down content. I'm personally quite concerned that community standards are being created by corporations and that our elected representatives are not active in that space of setting standards for the kinds of discourse that are appropriate in a Canadian context.
Fourth, I'd strongly urge you to consider mandating some form of algorithmic transparency. So many of these practices are hidden, and it's only getting worse, and so I think corporations should be required to be fully transparent with their information practices, particularly because of this concern about discriminatory outcomes.
Last, I'd ask you to consider holding corporations to account for those discriminatory outcomes if they're going to get the benefit of access to this information. It's like pollution; somebody is going to pay for the dirty water. Since we're building this system right from the get-go, we should be considering who that burden should fall on, and I would argue that it should fall on the people who profit from it.
Thank you very much.
My apologies first of all, but I'm strictly limited to your 2:30 deadline because we're having a bit of a problem out here in British Columbia with a privacy breach, strangely enough, one that affects both the public and the private sectors. So, I will have to go at 2:30.
I will also try to keep my comments as brief as possible to allow the maximum time for questions. I will limit myself to the four points raised by the commissioner in his letter of December 2 to the chair, as well as two extra points.
We've also made two detailed submissions to the commissioner's consultation process, which I believe are available, and I'd be pleased to provide them to you.
Consent for the collection, use, or disclosure of our personal information is the underpinning of PIPEDA. Attempts to move away from this or to tamper with it should be viewed with considerable suspicion. At the same time, it's important to note that, in many cases, consent is really illusory. The conditions being agreed to are often in the form of over-broad, lengthy terms of service and other contractual documents. The choice offered to consumers is often to accept all conditions or not use the service. The result is that, in many cases, an organization feels free to do whatever it wants with the information it collects, under the guise that the individual whose information it is has, in fact, consented.
For example, in our 2015 study on “The Connected Car”—which was generously supported by the contributions program of the Privacy Commissioner—we found that there were multiple agreements, policies, and contracts that come into play when somebody is attempting to purchase a vehicle. The purchaser is supposed to have read and understood all of these policies. A lot of times these are not available on the Canadian website of the manufacturer. They are available only on the U.S. website, and it's not entirely clear whether or not they apply. These policies and conditions tend to be very open-ended, allowing use for “such other purposes as we see fit”, or for research, or for marketing. Some of these policies can, in fact, be somewhat contradictory, and it's not entirely clear where they are coming from. As a result, we provide this general recommendation in our “The Connected Car” report:
Rather than relying on the fiction of choice and consent, what is needed in this industry are clear, specific and relevant limits on collection, retention, use and disclosure of personal customer data. We need industry-specific data protection regulations for the Connected Car industry.
We also had a number of specific recommendations for the automotive industry regarding consent. I'd like to refer you to four suggestions that Professor Michael Geist of the University of Ottawa put forward as a useful basis for approaching the issue of consent generally: opt-in consent should be the default model; rules on transparency must be improved; consumers must be able to exercise a choice other than take it or leave it; and stronger enforcement powers and penalties are required.
In terms of reputation and privacy, with the rise of the online world, considerations that were once primarily the concern of the well-heeled and the well-known—things like damage to reputation—have become much more widespread and are, in fact, concerns of pretty much everybody who is involved online. What might once have been simply neighbourhood gossip can now become part of a global campaign of vilification. Ordinary people who do not have large financial resources or access to legal resources are put in the position of trying to defend themselves and their reputation in this new world. FIPA made a submission to the Privacy Commissioner's consultation on this issue, and I would refer you to that piece of work for a more detailed discussion of some of the issues involved.
We didn't make specific recommendations, but we did outline various considerations that should be taken into account when approaching this issue.
In terms of enforcement, as we've said before with regard to the Access to Information Act and the Information Commissioner, and the Privacy Act and the Privacy Commissioner, we're also of the view that under PIPEDA the Privacy Commissioner should be brought up to the same level as his provincial counterparts, who have order-making power. This system has operated for more than a decade in British Columbia, and there hasn't been any systemic problem with the commissioner having order-making power. It would also ensure that people could get a more immediate remedy under the federal regime, which is not currently the case when somebody, say in British Columbia, has a choice of complaining about conduct either provincially or federally.
In terms of adequacy, the order-making power would, I think, have a positive effect with regard to ensuring that PIPEDA continues to be looked upon as providing adequate privacy protections.
The two additional points that I would raise are these.
One is something that came up, I believe, during our discussions on the Privacy Act, and that is the coverage of federal political parties. It's our view that the federal political parties, which are currently not covered under any legislation protecting people's privacy and personal-information rights, should be dealt with under PIPEDA. Here in British Columbia, our substantially similar provincial act, the Personal Information Protection Act, covers the political parties in this province. Arguably it could cover provincially incorporated branches of federal parties. The commissioner has, in fact, successfully done at least two investigations and reports on the two largest parties here in British Columbia, and we continue to have parliamentary democracy here, so we don't see any impediment to federal political parties being brought under the PIPEDA regime.
Finally, I'd just like to support what Professor Steeves said in terms of algorithmic transparency. This is a key point, and it's something we raised previously with regard to the Privacy Act.
I look forward to your questions.
Thank you very much.
Typically, as I said, in a regulatory regime we figure that if someone discloses something, then it's no longer private, whereas young people's notion of privacy is relational, and they want to negotiate it with different audiences.
If I post something on my Instagram account, and that account is for my friends, I don't want my mother looking at it. I want a mechanism that will say, “No, that's not my family account. That's my friend account.”
Typically, when they worry about privacy invasions, it's because the barriers between their different audiences have been removed. Their information is taken out of the social media world and made accessible to anyone outside those audiences, and there are harms to them because of it.
Two 13-year-old kids in Toronto went on vacation. They got back. They were talking about their tans online. One said, “I'm darker than you,” and they were called into the principal's office for racist bullying, because they both happened to be African-Canadian kids.
They were thinking, “I'm talking to my friends, and we're having a chat, but because this information can then be captured, now I'm under surveillance by my school and I'm accountable to my school for everything I say.” It's the ability to keep those lines firmly in place that they care about.