We will begin the 59th meeting of the Standing Committee on Access to Information, Privacy and Ethics and continue the study on privacy and social media.
Today, we are fortunate to have two witnesses with us. First, we have a representative from BlueKai, Mr. Chapell, who will make a 10-minute presentation. Afterwards, we will be able to ask him questions. We will also hear from the Privacy Commissioner, who is appearing before us for the second time. She will summarize what has been said so far, since this should be the last meeting on this study.
Without further ado, I yield the floor to Mr. Chapell for ten minutes. As I already said, we will then have an opportunity to ask him questions.
Mr. Chapell, I want to thank you for joining us. The floor is yours.
:
Thank you, Mr. Chairman.
Mr. Chairman and members of the committee, thank you for inviting BlueKai to testify at this timely and important hearing. My name is Alan Chapell, and I am the outside counsel and privacy officer for BlueKai Incorporated, a digital data company with headquarters in Cupertino, California.
It is an honour to appear before this committee. I am pleased to describe BlueKai's business and share with the committee some of the privacy innovations we've developed at BlueKai.
BlueKai's mission is to build the world's first complete enterprise platform for data-driven marketing with the utmost attention and diligence to ensuring consumer privacy. We offer a data management platform that enables advertisers to collect, store, and utilize anonymous consumer preference data. Since our founding in 2007, BlueKai has embraced the privacy by design ideals championed by Information and Privacy Commissioner Dr. Ann Cavoukian. We recognize the importance of incorporating privacy into our products and services and have fostered a culture of protecting consumer privacy interests from day one.
BlueKai's platform enables businesses to utilize pseudonymous bits of marketing data for online behavioural advertising and analytics purposes. The platform allows businesses to create target audiences based on a combination of their own data and third party data in order to reach their target audiences across third party advertising networks and exchanges. The platform also helps those businesses to measure with accuracy which campaigns performed in order to refine media buys and ad creative over time.
The marketing data stored on the data management platform is generally governed by the privacy policies of our clients. BlueKai offers guidelines to help ensure that our clients understand the applicable privacy law and self-regulatory standards.
BlueKai also offers a data exchange that enables businesses to utilize pseudonymous third party data for their digital advertising campaigns. We take steps to ensure that the third party marketing data listed on the BlueKai Exchange meets or exceeds applicable privacy law and self-regulatory standards.
BlueKai is a board member of the Network Advertising Initiative, a coalition of more than 95 leading online advertising companies committed to shaping and enforcing responsible privacy practices for online behavioural advertising. We are also a member of the Digital Advertising Alliance, the industry-wide self-regulatory program for online behavioural advertising. We've been active in the behavioural advertising self-regulatory movement in North America, Europe, and the rest of the world since our founding.
We understand that a similar behavioural advertising self-regulatory program is being developed in Canada. Further, this program's privacy requirements are generally in harmony with the policy position on online behavioural advertising offered by the Office of the Privacy Commissioner of Canada. BlueKai has historically been a leader on the move to industry self-regulation. We aspire to continue that pattern and be one of the first companies to participate in the Canadian self-regulatory initiative when it is launched.
Last but not least, BlueKai participates actively in the World Wide Web Consortium's tracking protection working group to develop a browser-based do-not-track standard.
In addition to being active participants in industry self-regulation for online behavioural advertising, BlueKai has a history of innovating on privacy issues. I'd like to share two of those privacy innovations with the committee today.
The first is the BlueKai Registry. BlueKai was one of the first digital marketing companies to provide consumers with enhanced transparency by offering access to marketing data via the BlueKai Registry. The BlueKai Registry, which is available at BlueKai.com, brings transparency to consumers by allowing them to see what preferences are being stored via the BlueKai cookies on their computer.
Furthermore, consumers may also control their anonymous profile by managing their topics of interest. We strongly believe that offering consumers this level of transparency and control builds consumer trust. We've seen that in practice; relatively few consumers who visit the BlueKai registry actually opt out from further use of their preference data. This suggests to us that consumers who understand BlueKai's practices are generally less concerned by them.
The second innovation is the BlueKai opt-out protection tool. One of the challenges to offering opt-out choice in an online advertising context is that cookies serve a dual purpose. In other words, cookies are used to store marketing data and to record an Internet user's opt-out choice.
When Internet users delete all of their cookies, their opt-out choice may also be deleted. The Office of the Privacy Commissioner of Canada has proposed that opt-out choice is appropriate for most forms of online behavioural advertising; however, the Privacy Commissioner also recommends that such choice be made persistent. This recommendation is in line with the recommendations made by regulators across the globe. BlueKai has taken steps to meet those recommendations with the BlueKai opt-out protection tool.
Utilizing some open-source code, BlueKai developed a Firefox browser plug-in that was designed to protect user opt-out choice even when users have deleted their Internet cookies. This code was licensed to the Network Advertising Initiative, so all NAI member companies were able to leverage the opt-out protector technology. This opt-out protection concept was further embraced by the Digital Advertising Alliance and expanded to include most major Internet browsers.
We're proud that our hard work was able to help BlueKai and other online behavioural advertising companies to protect consumer privacy choices. We take privacy very seriously at BlueKai and are happy to have had the opportunity to share some of our privacy innovations with this committee.
I'd be happy to answer any questions.
I'm going to talk mostly about the United States. Many of these concepts are certainly working in other jurisdictions.
The first organization I mentioned was the Network Advertising Initiative. It's an industry trade association that's been around for about 12 years. The organization is made up primarily of what we call the Internet intermediaries: the networks, the platforms, the exchanges, and the data companies. These are the entities that sit in between a website publisher and an advertiser. They help facilitate the delivery of that.
With those companies, historically the challenge has been that since they don't control the ad and they don't control the website, it's difficult for those companies to push privacy standards out into the rest of the ecosystem. Those privacy standards involve notice, transparency, opt-out choice, and a rule set around what we call “sensitive data”.
When I refer to the Digital Advertising Alliance, I'm referring to what is more of a broad coalition of industry associations, which includes the Network Advertising Initiative. However, it also includes the online publishers, the online advertisers, and the digital advertising agencies.
The goal of the Digital Advertising Alliance is to make sure that all the privacy standards are harmonized within the business ecosystem.
:
Mr. Chair, thank you very much for your invitation to appear again at the very end of your study, which we have been following with interest.
[English]
I'm joined today by Chantal Bernier, assistant commissioner, who directs our day-to-day operations, and Barb Bucknell, strategic policy analyst, who is a specialist in social media. They will, I hope, help me answer your questions.
Honourable members, I'd like to start with an overview of privacy challenges.
Over the last few months, I believe you've heard from an array of interested parties on the benefits and the challenges of social media. When I first appeared in May, I noted the four areas of privacy protection where we had the most concern. These were accountability, meaningful consent, limiting use, and retention. It's noteworthy that the witnesses who appeared before you have largely agreed that these areas are challenged by social media. Where they tended to differ, I understand, was on the adequacy of the tools available to address the problems.
Also noteworthy was the extent to which children and youth privacy permeated the discussions. Many interesting ideas were put forth with respect to digital literacy as well as possible legislative responses.
Mr. Chairman, I would like to commend the committee for its insight and forward thinking in holding this particular study.
Today I want to address the key comments that have emerged from your hearings. I will begin with enforcement powers.
The most important question put forward throughout the study was whether PIPEDA is up to the task of handling the challenges brought about by changing technology. Most witnesses felt that PIPEDA needs to be modernized. Others took the position that PIPEDA does not need to be changed, that its enforcement model works, and that its technology-neutral character is its strength.
In my view, with the emergence of Internet giants, the balance intended by the spirit and letter of PIPEDA is at risk. The quasi-monopoly of these multinationals has made PIPEDA's soft approach, based on non-binding recommendations and the threat of reputation loss, largely ineffective, I believe. We have seen organizations ignore our recommendations until the matter goes to court. We have seen large corporations, in the name of consultation with my office, pay lip service to our concerns and then ignore our advice. Moreover, with vast amounts of personal information held by organizations on increasingly complex platforms, the risk of significant breaches and of unexpected, unwanted, or even intrusive uses of that information calls for commensurate safeguards and financial consequences not currently provided for in PIPEDA.
New incentives, including changes to the enforcement model, are required to encourage organizations to be proactive, to build upfront protections, and to ensure secure treatment of individuals' personal information. I agree with the witnesses who stated that PIPEDA's strength is that it is technology-neutral and principles-based. These are characteristics that must remain.
[Translation]
I also agree—at least in part—with those who noted my office's success in bringing organizations into better compliance with the law. We have made use of the tools the law provides, and we have been able to effect some change—but often after an arduous effort. That effort comes at high cost to Canadians and is less and less effective against powerful, multinational companies.
You heard the arguments that my office cannot be judge, jury and executioner. In response, I would point you to some of my international and even provincial counterparts.
The United Kingdom commissioner can issue fines, as can a number of the international data protection authorities listed in the document I have submitted today. In the United Kingdom, my counterparts have stronger enforcement powers, but that has not precluded an ombudsman approach. Fines are issued where a softer touch has failed. Our counterparts tell us that businesses that invest in adopting good privacy practices from the start feel it is only fair to impose a financial burden on those who do not, in order to level the playing field.
Commissioners in Quebec, Alberta and British Columbia have order-making powers and jurisdiction over the private sector. They also have other duties—prescribed by law—that enable them to perform multiple roles, such as educator, adjudicator, enforcer, advocate, and so on. I have noted that witnesses before this committee had only good things to say about their relationship with the commissioners. Witnesses have said that the Canadian model was the envy of many countries around the world.
What others like about our law is that it does not single out sectors and is non-prescriptive. Yet, given that many of my international counterparts either have stronger enforcement tools or are requesting them, it is not our enforcement model they are admiring.
Indeed, I worry that, if my counterparts continue to gain stronger powers, but Canada does not, we will fall behind in inspiring consumer confidence needed for the digital economy to thrive.
At the least, we must start with mandatory data breach notifications—including financial consequences for egregious cases. Increasingly, other countries are implementing similar legislation. Such requirements would reinforce accountability and, with penalties, provide financial incentives to better protect Canadians' personal information. Such penalties should be flexible and adaptable to circumstances, so as not to unduly burden smaller organizations.
[English]
I'd like now to talk a bit about digital literacy.
Another key theme that has emerged from your hearings is the importance of digital literacy. I believe that the moment has come for government, for educators, and for our communities to seriously focus attention on the digital education of all Canadians of all ages.
Such an effort must address the broader societal and ethical issues that are raised by new information technologies but that fall outside data protection law per se. People need to understand that information on the Internet can live on forever and that they should be careful about what they post about themselves and others. That being said, digital literacy does not absolve companies of their obligations under privacy law.
In conclusion, Mr. Chairman, given the global nature of today's digital economy, Canada's federal law needs enforcement powers comparable to those in other jurisdictions. That is the way to have the greatest impact on privacy protection and to improve Canadians' confidence in their online environment.
A law that dates back to a time before social networks and smart technologies were created cannot remain static. The ways in which personal information in this environment can be collected and used by many players makes a formal study of the effectiveness of our privacy framework even more pressing, so I strongly urge Parliament—and this committee particularly—to move forward with a review of the legislation, PIPEDA in particular.
Thank you very much for inviting me once again, and my colleagues and I would be happy to try to answer your questions.
Merci.
:
Thank you very much, Mr. Chair.
Welcome, Commissioner. It's nice to see you back again, and your colleagues with you. We appreciate your appearance here.
It's been a long study, but it's been a good study, I think. We've heard some very interesting comments and we've heard from some very interesting individuals as well as companies. I think that it has been very beneficial and I'm certainly glad we've undertaken this study.
As you pointed out in your remarks, some “witnesses felt that PIPEDA needs to be modernized; others took the position that PIPEDA does not need to be changed, that its enforcement model works and that its technology-neutral character is its strength.” I'm just reading that from the comments you made earlier.
We heard from a lot of people on both sides of this issue. We heard about concerns with respect to giving broader powers, including the enforcement powers and the ability to issue penalties, and the concerns that some felt this would alter the good relationship that your office currently enjoys with many companies you examine.
Could you respond to that concern? Do you feel it will affect your ability to deal well with these companies? If you had expanded enforcement powers, how is that going to affect your current relationship in dealings with private companies? You've said in your comments that some people say your office cannot be judge, jury, and executioner. How would that work out? How would the balance be struck? Would there be checks in place? In your vision, does the final say rest with your office?
:
Thank you, honourable member.
I'm a bit amazed at that statement. It sounds like if we got more power, we would be slinging mud balls at each other, as though all hell would break loose if we had enforcement powers.
I had the honour to be the president of a tribunal, one of the ones I mentioned in my speech, that enforced privacy legislation in Quebec, both in the private sector and the public sector. I didn't notice that we had particularly acrimonious relationships with companies in the private sector. I don't notice that my colleagues in British Columbia and Alberta have particularly acrimonious relationships, because they also have an educative role. They also prefer to settle through negotiation, if possible. Nobody really wants to go to court if they can avoid it. They promote the voluntary adhesion to the law.
Therefore I don't see, in those places across Canada where there is some kind of enforcement power, that anybody said the relationships are difficult. If people don't agree and there's one case where you go to the tribunal, well, perhaps people agree to disagree, but I haven't noticed that's prevented my colleagues—or me, when I was in that position myself—from doing educational work, from working with chief privacy officers, from having collegial meetings with the private sector.
I'm a bit perplexed as to that statement.
:
Thank you, Commissioner.
Yes. I think this story is quite eloquent.
You may recall, if you have followed the press clippings around our work, that in 2011 we issued a report of findings on Google WiFi. We found that as Google was rolling out Street View, they captured—accidentally, they say, and we have no evidence otherwise—personal information of Canadians. We gave them one year, a full year, to present to us a third party audit assuring us that they had applied all the recommendations we had made.
That timeline was May 20. At the beginning of May we had a meeting with Google, and our request for a third party audit, which was clearly stated in our letter, did not even seem to be on their radar screen. They were rather apologetic, and said “Oh, my God, can we have an extension?” In July, they sent us the third party audit that in fact had been written for the FTC.
I believe that truly goes to your point.
:
Well, from observation over the years, I think it is the only thing that makes them sit up and take notice.
Their names are already public. We're dealing with a far different breed of companies from what existed when PIPEDA was adopted. Lawyers have said to me many times over the years, “I wish there were more sanctions”, or, when I started talking about sanctions, they say they are so happy we are doing that because their client—this could be an outside client at a law firm or the CEO of a company where they are an in-house lawyer—asks them to draw up all the regulatory risks and then asks, “What happens if I don't?”
When they get to privacy, they ask what happens if they fall off the Canadian privacy wagon. Well, I have to say, “Don't worry. There will be an investigation, and in the course of the investigation, you can promise to fix it”, and that's it. That's what the law says. If they promise to fix it and there's an agreement, I don't take them to Federal Court, so they say, “Okay, fine; put it at the bottom of the list.”
As a result, the lawyers who were advising their clients can't get their clients to pay attention to Canadian privacy law because the CEO asks, “What are my biggest risks?” If there's virtually no risk of infringing when you infringe a Canadian privacy law, you move on to other things. That includes data breach, as we were talking about earlier.
Ladies, thank you for being here today.
I have to say that this study has enlightened me about what goes on in social media and some of the challenges you have.
In your statement, you discussed four issues, which were retention, meaningful consent, limiting of use, and accountability.
To me, retention, meaningful consent, and limiting of use are very simple to deal with through laws or guidelines for compliance. You have to spell out what that should be. What I understand from the witnesses is that simplicity is important for the user. Accountability is really the issue, I think, and your biggest challenge.
To make those providers accountable, would you regulate on a complaint basis or a monitoring basis?
:
On that I would say, yes, that's what we hear often—that they just want to see what site you visit—but from our own work on what you can find out by tracking, the problem is that you can aggregate all the sites that I have visited and then draw up a profile. In some cases you could find my name and my address from public sources, and so on, and you could draw up a profile of me as a citizen or consumer that can be accurate or it can be extremely inaccurate.
As the Internet becomes more sophisticated.... There's an article by the American scholar Jeffrey Rosen that's very good on this. It was published about two weeks ago.
The danger of tracking and the issue of discrimination on the Internet is that because you have visited these sites, the ad server can decide that you fall into a certain category. We can't each have a personally individualized category for the moment, but we'll say “middle-aged lady, likes golf, likes to drive station wagons”. In the American example, because of different political sites that were visited, it could be “votes this way, thinks this way”, and so on. It can be accurate, but it can be inaccurate.
The fact that it will determine the information you get, the ads you get, and sometimes, I believe, the rankings in search engines—I'm not sure about that—means that your experience of the Internet and the world of knowledge that the Internet represents will be limited. It will be based on what may be a true or a false or a partly true profile that algorithms are determining for you.
That's some of the concern: that you fall into artificial categories and therefore only see the information that is deemed to fit in with the artificial category into which you have fallen.
:
Thank you. It's great that you came back.
Our first witness said something interesting. He spoke about self-regulation and some of the industry players we have. They have standards. Other people don't have standards. He said self-regulation worked very well as long as you had an enforcement mechanism.
I sometimes think my colleagues on the other side hear self-regulation as the market mantra. If that were the case, Somalia would be a centre of international innovation—but it's not, because they don't have the enforcement mechanisms to decide what is good activity and what is bad activity.
In our case it comes down to breach notification. That's one of the key bottom lines, I think. If my data is breached, it's not just what site I go to or what I'm interested in or where I play golf, but the fact that I use my credit card to buy stuff. If that data is breached, my security is at risk.
Under the rewrite that's being planned by this government, their language is interesting. They say it has to be a “real risk”—not a perceived risk, but a real risk—“of significant harm”. If I were a corporate lawyer, I'd say I wouldn't tell anybody that their data has been breached. Significant risk means what? Nobody's going to come and kill you.
It seems that the government is setting a bar so high that the companies have an opt-out mechanism and are not going to report breaches even if it's credit card information or personal data information, something that the cyber hackers would love. Do you think we need to clarify at what point a company has to inform you that the cyber hackers have been visiting your data?
I have just a couple of comments, as I've had a chance to talk to different businesses that have been involved in this area.
One of the concerns I have when you set the bar relatively high—and I think you went through a list, saying that each company should have various levels of individuals who can ensure that you have the privacy that you require—is whether we then start to be concerned about picking winners and losers. Perhaps the bigger companies, which already have that mechanism, are able to expand, while the smaller businesses then face all of this privacy legislation and so on that they have to live up to.
I'm concerned about that, with the small businesses coming in. That was one thing we heard right off the bat: that if you put the rules in right away and make them too stringent, the only ones who are going to be successful are the ones who are big enough to take on the burden that is being presented to them. That's not how you gain innovation.
When you take a look at some of your suggestions—as no doubt you will, when you think about what we have been studying—I wonder whether you could look at the question through that particular lens, because we want to make sure we're not stifling innovation. That's the first feeling and thought that I have with regard to this issue.
The other thing we've tried to talk about to people who have come here is that it isn't free. When we suggest that if we get on the BlackBerry and do this, that, and the other thing, we all of a sudden have free rein to do whatever we want and we're going to be protected from ourselves, based on some of the activities that we have.... I look at it from that perspective.
If you go into a store and take a magazine off the shelf and start reading it there, somewhere along the line you have to go and buy the thing; you have to recognize that this is part of what we do. I haven't really heard a lot of discussion from regulators that really recognizes this. When you ask businesses about how they make their money and what they do, you get a bit of an understanding of where you're going with that.
If I have a few seconds, my last comment is about the right to be forgotten. One of the analogies we heard was of someone taking a glass of water and pouring it into a stream; it goes all the way through, and at the end of days they say, “I want my glass of water back” after it has gone through the river and down into the ocean and so on.
There are different thoughts on this aspect. I wonder whether you could comment on some of my ramblings there in the time I have remaining.
Very quickly, then, in one minute: first, we have always tried to tailor the law to small and medium business. Some of the examples I'm talking about here are mega-megacorporations, not small and medium-sized businesses.
Second, on stifling innovation, I don't believe innovation always has a direct link to privacy. I think innovation is mostly encouraged by capital formation, entrepreneurial capital that's free, and levels of education or technical knowledge.
Third, my office has no objection if people want to sell their personal information to get services free. We have never said that. We have no problem with the Internet model. We just want the law that Parliament adopted in 1999 to be applied correctly: you have to consent, and you have to understand what you're selling and what will be done with it.
Fourth, on the right to be forgotten, I think this right is an important concept. We have to seriously look at the ways and means of enforcing it. Parliament in its wisdom said in PIPEDA that you have a right of deletion of your personal information, so we in a sense already have it, but we have big issues with some companies who built in no ability to delete young people's information.
Madam Commissioner, ladies, thank you for joining us.
As my colleague was saying, it's useful to hear from you at the beginning and at the end of the process. I will take a few seconds to say that this study, thanks to my colleague, has been something of a revelation for me. It has opened my eyes to the fact that we are monitored much more than we think on the Internet and in social media. I didn't know how much we were being monitored and watched.
I feel that this is the case for many Canadians who accept the conditions quickly and then go on to browse various websites. They are unaware of the machine behind it all—be it browsers, Google, social media or these data brokers, which I didn't even know existed not too long ago. They gather a great deal of information about us—our habits, choices, preferences, places we visit, purchases, ideas. Afterwards, they put all that together and often sell the information. I think that, according to what you have told us, the role of educator—which you should play more—is as important as the power to impose fines or penalties.
Could you tell me what you think of Canadians' digital knowledge or digital literacy? Do people know that they are being monitored so much?