Thank you for this opportunity to contribute to your work on the revision of the Personal Information Protection and Electronic Documents Act.
I will be giving my presentation in both languages and would be happy to answer questions in both as well.
In my presentation, I will refer to the Personal Information Protection and Electronic Documents Act as the act.
My starting point is the letter that the Privacy Commissioner of Canada sent to you on December 2, 2016, bringing to your attention four possible areas of intervention. I will add my observations from my experience as a privacy regulator and now as a lawyer in the private sector.
The first topic concerns valid consent.
Last summer, I submitted a brief further to the Privacy Commissioner's consultations on consent. I concluded that the act's current consent regime is adequate for two key reasons. First, it has the rigour necessary to obtain valid consent. Second, it has the flexibility to ensure that consent applies to the various applications that exist on the Internet.
Consider section 6.1 of the act, which states the following:
|| the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization's activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.
That means the act truly allows for the complexity of the Internet, without specifying the modalities, thereby making it possible to adapt the principle to any application that emerges.
The act also recognizes the possibility of implied consent. Specifically, pursuant to section 4.3.6 of schedule 1, implied consent is acceptable in certain circumstances.
In my brief, I point out that enhancing consent involves privacy policies, which must meet three specific criteria, in my view. First, they must be written in accessible language. Second, they must be adapted to the organization. Third, they must be structured for easy consultation. This does not require any legislative change.
Furthermore, there is an improvement that is not strictly necessary but would benefit from a legislative change. It would be to specify in the act, as European law does, that anonymization excludes personal information from the application of the act.
I make that suggestion because, very often, in privacy policies, I see a paragraph advising the reader or consumer that de-identified personal information will be used for purpose X or Y. That is pointless. When identifiers are severed from the information to prevent identification of the individual, the act does not apply. I think it would be helpful to make that clear, as European law does.
The second concern brought to your attention by the commissioner is a widely shared one. That's the protection of reputation online. However, the issue is only partially in federal jurisdiction. Most of the harm that occurs to reputation online occurs not within the framework of commercial transactions but within the framework of personal relationships, which come under provincial legislation.
I will give you examples of five pieces of provincial legislation that may be helpful in that regard, and one piece of federal legislation.
Regarding provincial legislation, in British Columbia, Manitoba, Saskatchewan, and Newfoundland and Labrador, there are specific acts providing that the violation of privacy is an actionable tort. In Quebec, a judge can prescribe measures to stop harm to reputation online.
At the federal level, there is the Protecting Canadians from Online Crime Act, which, as you know, criminalizes the online dissemination of intimate images without consent.
So there is a framework in which you can have some tools to stop harm to reputation online, but there is a legal void that remains. That legal void could perhaps be addressed through the federal act. That would be by creating—on the model of European law, and as mentioned by the commissioner in his letter—a right to be forgotten, meaning the right to erasure of certain information.
Such a provision would reduce the dissemination of personal information harmful to reputation and therefore would add some protection. In order to properly control its scope, however, I suggest that it be strictly framed with this safeguard in mind: the right to erasure would apply only to displays of personal data that a court has declared a violation of the right to privacy, with possible injunctions to stop the dissemination during trial. Still, I believe it is important to give the right some solidity rather than leave it discretionary and a burden on the platforms.
Given the seriousness of the damage to online reputation and in spite of the limited nature of federal jurisdiction in this matter, you may want to explore how the federal act could be amended to include the right to erasure as a method of reducing harm to reputation online.
The third issue brought to your attention by the commissioner concerns his enforcement powers. In my practice at Dentons, the biggest law firm in the world, I practise privacy law at a global level, which means that I see very concretely the disparity between the enforcement powers of our commissioner, which are effectively absent, and those of his counterparts.
I cannot but observe the hold that other commissioners have on business because they can impose fines of millions of dollars. The Federal Trade Commission, for example, in the same investigation as our commissioner, can impose millions of dollars in fines, while our commissioner can only make recommendations.
France can impose fines of 300,000 euros and, interestingly, just this past February 7, Russia increased the fines under its privacy law tenfold. It's still not a big number, from 10,000 rubles to 35,000 rubles, which equates to about $1,600 Canadian, but it shows a trend toward increased enforcement powers. The New Zealand privacy commissioner has now recommended to his government fines of up to $1 million for privacy violations.
As you may have heard, the European regulation, which will come into force on May 25, 2018, does provide for fines of up to 4% of a company's global revenues.
That said, the Canadian commissioner's office is performing quite well, especially with the right to name companies, because reputation is such an important asset. We have to weigh the advantage of this ombudsman model, which, according to the private sector, favours collaboration between the regulator and business, against the worldwide peculiarity, I would say, of our commissioner's position.
However, I have to tell you that in my experience as both a regulator and a privacy counsel to business, I do not see enforcement powers as the determining factor in collaboration, but rather good faith on both sides. That's what really matters.
Also, the imposition of sanctions is not necessarily bad for the private sector, because it levels the playing field. You have good organizations that invest the money up front and, therefore, get good results on privacy protection, and you have negligent organizations that fail to make the upfront investments and, therefore, pay the fine at the end. A lot of good organizations will tell you, "Thank you. You've just levelled the playing field."
That said, comparing the enforcement powers of the Canadian office with the rest of the world favours an upgrade, but I would like to put some parameters around that.
I encourage you to explore the possibility of creating a power to impose fines, but framed rigorously as follows. First of all, I think the fine should be imposed only if there is evidence of negligence. Incessant attacks and uncertainty in the breadth and scope of the law mean that organizations cannot be required to ward off every blow. It's unfair.
Secondly, the fine should be payable, obviously, to the Receiver General. With some data protection authorities, the fine is payable to the authority itself, which creates a conflict of interest. The fine should also be subject to review by the Federal Court and, obviously, and this is of huge importance, it has to be appealable.
Finally, as in the case of the European regulation, I would favour the fine being a percentage of annual revenues, because the use of personal information is part of profits; therefore, the misuse of personal information should be part of financial loss. There is a logic there that I believe recognizes the monetary value of personal information. It also matches the investment that is required to be made upstream and leaves the issue of damages to the courts, where it would be more appropriately dealt with.
The fourth subject that the commissioner brings to your attention is, in my view, the most urgent. Why? Because it concerns the new European General Data Protection Regulation, which will come into force on May 25, 2018. The regulation considerably changes European legislation on personal data protection and puts our adequacy status at risk. Allow me to explain.
The issue is economic. Canada has adequacy status with Europe, which allows Canadian companies to receive European data without any other form of authorization. This is a crucial competitive advantage. We could lose our adequacy status for two reasons. First, the new regulation provides for the review of adequacy status every four years, which means that our status will be reassessed. Second, we will be evaluated against the standards in the new regulation, which are very different from those in the current federal legislation. The problem is that our rules are not in line with the new regulation.
In short, we could lose a major competitive advantage. Canada is the only North American state to have adequacy status with Europe, so I encourage you to consider the issue.
On that note, I would be happy to answer any questions you have.
Thank you, Mr. Chairman.
The Public Interest Advocacy Centre is a national, non-profit organization and registered charity that provides legal and research services on behalf of consumer interests, in particular vulnerable consumer interests concerning the provision of important public services. We have been deeply involved with PIPEDA since before its passing.
Five years ago, we came to this committee to talk about privacy and social networks. Today, we come to discuss your review of PIPEDA. It is still about social media, but this time it has brought along its friend, big data.
Social networks and most smartphone apps routinely gather personal information as defined by PIPEDA, and retain that information on central servers. That information is then used, as permitted by PIPEDA, to target advertisements to that person, their friends, families, and colleagues on social media.
The term for this is "behavioural advertising" or behavioural marketing: the vast amounts of very personal data, including one's preferences for a myriad of products, previous purchases, location, age, gender, ethnicity, and much more, allow advertisers to target these ads to your presumed behaviour and profile.
They call it “big data” when advertisers or other companies are able to combine data sets from various apps and website visits, and even from only one site over a long period. Then data mining occurs, using algorithms to look for patterns that suggest how successful targeting ads may be, or even attempting to find presumed ways to know or influence your future behaviour.
The companies doing this will tell you today that they are doing it lawfully under PIPEDA, that they have privacy policies, that they have your consent, and that they follow all the rules of sharing and processing data. The fact, though, is they often do not have your informed consent. Informed consent, whereby you understand the consequences of the provision of your information and what it will be used for and how it will be shared, is the standard for collecting, using, and disclosing information under PIPEDA.
Companies are now asking, or beginning to ask, that the consent standard be changed, largely because it impedes data gathering and big data. They will ask you to abandon informed consent as the standard that protects consumers and their reasonable expectations and conceptions of privacy. They will ask you for a risk-based model, or for more implied consent. This should be resisted. Indeed, PIPEDA needs to uphold the informed consent standard, and all it needs is some new rules to protect that standard and consumers.
Moving now to enforcement, if we are to address the problems with online privacy and big data, the Privacy Commissioner of Canada needs real enforcement powers, including a mandatory order-making power and an AMP (administrative monetary penalty) or fining power.
PIAC advocated for these powers at the first PIPEDA review in 2008. At that time, the Office of the Privacy Commissioner did not want them. Then the OPC crossed swords with Facebook in a complaint in 2010. After that, Jennifer Stoddart asked you and the government repeatedly and loudly for order-making power and fining power. Her reasoning was that her office could not make large social media companies comply with only non-binding findings and name and shame.
Mr. Therrien, the current Privacy Commissioner, is more careful, and he may ask you only for order-making power. This will be cumbersome to enforce in court. You should also be giving him fining power.
In any case, if the Privacy Commissioner says that he or she needs it to do the job, why not give it? The OPC is up against the biggest corporations in the world right now, and needs tools. It is frankly embarrassing that provincial privacy commissioners have this power and not the Office of the Privacy Commissioner. Only by enforcing the present standards in PIPEDA can we see if it is effective or needs change. It's unfair to judge the act without enforcement.
Moving now to children, a new rule is needed regarding the treatment of children's privacy. I saw an extraordinary op-ed last week. In it, Owen Charters, the president and CEO of the Boys and Girls Clubs of Canada, said:
||The Wall Street Journal reports that...children's websites in the US install more tracking software than sites aimed at adults. These tracking tools follow our children as they surf the web, collecting data about their behaviour and interests. This information is often sold to marketing companies.
||There are endless public awareness campaigns dedicated to cyberbullying. Change is happening. But with the focus on those discussions, children's privacy rights in Canada have been placed on the back burner.
That a general children's welfare charity would underline online privacy is indeed telling. The op-ed closes with an exhortation to the Canadian government to pass a dedicated children's privacy act.
Our sentiments are similar, but we think that this protection can be added to PIPEDA. We have first-hand insights on the problem. In 2011, PIAC brought a privacy complaint against Nexopia.com, a social network based in Alberta and largely aimed at the teen audience. The Office of the Privacy Commissioner upheld all of our complaints, which were focused not so much on online safety, but on targeted marketing to minors.
Unfortunately, besides some voluntary guidelines from the Office of the Privacy Commissioner, we see no improvement in children's privacy in Canada since then. We have a detailed proposal to address this—and Europe is also adding regulations—but given our time to present, we invite you to ask about these solutions in your questions.
Another area that requires a new rule is data retention and destruction. Can consumers be sure that the information they have provided, or that was extracted from their habits, will be destroyed or no longer used when the reasons why they gave their consent are gone? Will they have control? Some of those present today would say no.
We say that now is the time to erase. PIPEDA states that personal information must only be retained for as long as necessary to fulfill an organization's stated purpose. However, the act only requires organizations to develop guidelines and implement procedures regarding the retention of personal data. It says that personal information that is no longer required to fulfill the stated purposes should—not shall—be destroyed, erased, or made anonymous. This is not strong enough.
The only OPC findings that Nexopia refused to implement, to the point of being taken to court by the OPC, were those requiring them to erase the personal information of teens who had left their service. As Canadians can now spend years, decades and, in the case of children, possibly their entire lives on an online service such as a social networking website, the amount of personal information collected from a user could be staggering. The more information on individuals that an organization has and the longer they keep it, the greater and more serious the risk of a data breach.
Canadians must have choice and control over the ways their personal data is used, including through consent, rectification of information, and especially the removal or erasure of their information.
A right to erasure was recognized in the European Union's recent General Data Protection Regulation, which comes into force in 2018. The GDPR codifies what is known as the "right to erasure": it gives individuals the right to have personal data erased and to prevent the processing of their data when, for instance, the individual withdraws consent or objects to the processing and there is no overriding legitimate interest for continuing it.
Organizations are also required to be particularly sensitive when it comes to personal data shared by children on, for instance, a social networking site. They can only refuse in certain circumstances to erase personal data when requested, such as to comply with legal obligations or to exercise freedom of expression.
PIAC submits that the committee should consider recommending similar rules for PIPEDA that would align with the GDPR's protections. For instance, organizations should be upfront with users about how long they intend to retain their personal data and why. They should also be required to erase or destroy personal information once the data is no longer needed for a stated purpose, or when an individual withdraws consent.
Thank you for inviting me. I am pleased to be here today. I appreciate the opportunity to share with the committee my thoughts on important issues affecting Canadians and their privacy.
I am a partner at Borden Ladner Gervais, and I teach in the faculty of law at Université de Montréal. I am appearing before the committee today as an individual.
I will be discussing two issues that have been the subject of consultations undertaken by the Office of the Privacy Commissioner in the past year: meaningful consent, and reputation and privacy. I will also say a few words about enforcement powers. I will be giving my presentation in English but would be happy to answer questions in English or French.
PIPEDA is based on fair information practices that were initially drafted in the early 1970s. We should keep in mind that their main purpose was to address specific concerns pertaining to computerized databases and the fact that different private sector organizations could exchange personal information more easily without the knowledge or consent of individuals. At that time, the best way to deal with these new concerns was deemed to have individuals keep control of their personal information.
Forty years later, this concept is still one of the most predominant theories of privacy and the basis for data protection laws around the world, including PIPEDA. However, the notice-and-choice approach is no longer realistic. Individuals are overloaded with quantities of information they cannot realistically be expected to process or comprehend. As raised by the OPC, the complex information flows and new business models involving a multitude of third parties have also challenged the traditional consent model.
A first issue, if we want to maintain that consent model, is whether we should be amending PIPEDA on the issue of consent. Jean Carbonnier, one of the most prominent French jurists of the 20th century, has stated in French, “Ne légiférer qu'en tremblant”. What he meant was that we should be very cautious when enacting or amending laws. We have to be careful to make sure that the amendment will not be detrimental or problematic as soon as new technologies emerge. The current wording pertaining to obtaining consent under PIPEDA is quite flexible and definitely flexible enough to accommodate new types of technologies and business models.
However, the downside of this flexibility is that it creates uncertainty. Therefore, policy guidance on enhancing transparency and obtaining valid consent is increasingly necessary to address some of this uncertainty and allow organizations to innovate without taking major legal risks. Businesses look to the OPC to provide such guidance, and its recent guidance on online behavioural advertising, app development, and the Internet of things is quite useful. These documents are, more than ever, relevant and timely.
Under PIPEDA, in determining the form of consent to use, organizations shall consider the reasonable expectations of the individual. What these expectations are in any given context, and whether certain activities are legitimate from a privacy perspective, is often a function of many factors, including the prevailing social norms. Another argument against amending PIPEDA on the notion of consent pertains to the fact that social norms in connection with any new technology or business practice may not yet be established. The OPC has, in recent years, commissioned certain surveys meant to explore the awareness, understanding, and perceptions of Canadians on certain issues and new technologies. These studies are increasingly important, since they allow us to gain a better understanding of consumers and their expectations and help evaluate how the social norm in connection with a given technology or business practice is evolving.
Over the last few years, I have proposed, through various publications, that perhaps part of the solution to address some of the challenges pertaining to the consent model could include the adoption of a risk-based approach or interpretation, under which we would focus on obtaining express consent only for data collections, uses, or disclosures, if such activities might trigger a risk of harm to individuals. For instance, express consent would be required when using personal information to make an eligibility decision impacting the individual, a disclosure that would involve sensitive or potentially embarrassing information, or a practice that would go against the expectation of the individual.
A risk-based approach may allow organizations to streamline their communications with individuals, reducing the burden and confusion on individual consumers, since they would receive fewer requests for consent. These requests would be meaningful in the sense that they would focus on what matters to them. Although this type of approach would imply rethinking PIPEDA's current consent model to some extent, it could be further explored in the foreseeable future.
Regarding online reputation, the Office of the Privacy Commissioner of Canada recently chose to make reputation and privacy one of its priorities for the next few years, and launched a consultation last year in which it asked if there were a way to apply a right to be forgotten in Canada. With Internet technologies, there is a temporal shift, in the sense that pieces of information can outlive the context in which they were initially published and considered legitimate. Security expert Bruce Schneier stated a few years ago: “We're a species that forgets stuff.... We don't know what it's like to live in a world that never forgets.”
The right to be forgotten is the right famously coined by the Court of Justice of the European Union in its May 2014 landmark decision, in which it authorized an individual's personal information pertaining to past debts to be removed from accessibility via a search engine. While this right may sound appealing at first, especially in view of the protection granted to the privacy and reputation of individuals, this issue is more complex. Aside from the constitutional challenges that a right to be forgotten would raise, there are significant risks with entrusting private entities, such as search engines, with the task of arbitrating fundamental rights and values. A decision to de-index content is quite complex as it would require considering numerous criteria. It would fall to search engines to enforce this right, and these companies would have an incentive to err on the side of more removal rather than less in order to reduce costs or to avoid potential legal liability.
Courts, unlike private sector entities, have the expertise and independence to strike an appropriate balance between the fundamental values that are often opposed in these types of requests, namely freedom of information and expression on the one hand, and privacy on the other. On this issue, the Federal Court of Canada recently issued a decision in the Globe24h case, illustrating that courts should be the ones issuing orders to remove information from Google search results.
Quebec has a very stringent privacy and reputation legal framework in place. The right to privacy has been elevated to the rank of a fundamental right, protected by the Quebec Charter of Human Rights and Freedoms. The Civil Code of Quebec prohibits the publishing of someone's “name, image, likeness or voice for a purpose other than the legitimate information of the public”. While recovery for defamation in common law jurisdictions may be barred if the statements are true, in Quebec the fact that information published is true does not suffice to avoid liability.
This said, even with this stringent legal framework in place, some challenges in addressing online reputation issues remain. First, the notion of res judicata may prevent an individual from going before the courts and asking that certain information be removed if this request was made in the past and already decided upon. Periods of limitation must also be revisited to ensure that this legal framework can adequately address the fact that with the Internet, data legitimately published may, after a certain period, become irrelevant, or the fact that the data that was once considered outdated may become relevant again over time.
Second, pursuing litigation can be quite expensive, which may not make this type of tool or recourse always accessible. Perhaps efforts should be directed to improving our legal framework, notably by increasing access to justice or implementing a fast-track system for online removal requests, rather than by copying a European-style right to be forgotten.
Finally, the right to be forgotten includes extraterritorial issues that should be considered. The Federal Court of Canada, in its recent decision, opened up an important debate on the jurisdictional reach of privacy laws. All eyes are now on the Supreme Court of Canada, which will be rendering its decision dealing with these issues in the Equustek v. Google matter in the near future.
Regarding enforcement powers, the former Privacy Commissioner of Canada, Jennifer Stoddart, asked for stronger enforcement powers under PIPEDA, which could include order-making powers and the power to impose penalties or statutory damages. Privacy regulators in foreign jurisdictions have such powers, and they could provide an additional incentive for Canadian businesses to protect the personal information under their control. This being said, I want to raise one concern. As mentioned earlier, PIPEDA is based on flexible, technology-neutral principles. The benefit of this flexibility is that it can accommodate new types of technologies and business models; the downside is that it creates uncertainty. It is not always clear to businesses how they must comply with PIPEDA, especially when launching new products, services, or innovative technologies. If, on top of this uncertainty, there is also the risk of statutory damages or penalties, I am concerned that businesses will hesitate to launch new products and services, and that in the end this will affect innovation and our competitive advantage as a nation driven by research, development, and innovation.
I am of the view that any enforcement powers, penalties, or statutory damages should come into play only once a certain practice is clearly illegal and once the organization has been advised of such and is refusing to adjust its business practices.
As a final thought, I have some concerns with the adequacy test that Canada will undergo in the coming years. The European general data protection regulation coming into force in 2018 will include certain new rights that are not currently in PIPEDA: a right to be forgotten and a right to data portability, to name a few.
We have important issues on our plate to ensure that our current data protection regime will survive and remain relevant in the near future. We have some challenges with our current notice and choice model, and perhaps addressing these issues should be our priority.
I have made written submissions in response to the OPC's consultation on privacy and consent and their call for essays on online reputation. My submissions are available on the OPC's website.
Thank you, and I welcome questions.
Good afternoon, Mr. Chairman and members.
My comments will be focused specifically on the four issues identified by the Privacy Commissioner in his December 2, 2016, letter to this committee.
The overriding concern I'll commence with is ensuring that PIPEDA works better when it comes to small and medium-sized businesses. For brevity, I'll refer to them as SMEs in the course of my presentation. I was involved in the development of PIPA in Alberta. I co-chaired a working group of Alberta privacy lawyers who were providing advice to the people drafting the legislation that became PIPA. Much of the input from the lawyers participating was animated by a focus on small and medium-sized businesses. PIPEDA, at least at the time, was seen as better suited to large banks, airlines, and national corporations but not so well suited to the neighbourhood bookstore.
When I was the Saskatchewan Information and Privacy Commissioner, my office partnered with the Privacy Commissioner of Canada's office to undertake a program called Privacy Made Easy. This was focused on businesses on the Prairies. In meetings with business organizations, we found a remarkably low level of PIPEDA compliance by small and medium-sized enterprises. In fact, I'm disappointed to say, we found even a remarkably low level of PIPEDA awareness.
Dealing first with enforcement powers, I support the commissioner's recommendation that his office have order-making power. That aligns his office with most of the major international data protection authorities as well as the Canadian provinces with private sector privacy laws.
I want to acknowledge that the current ombuds model probably works quite well for large corporations in Canada, which I think achieve a high level of PIPEDA compliance. That may be attributable to greater capacity, and to a more sophisticated recognition that privacy compliance is good business practice.
I'm interested in the conclusions of a 2010 study done for the Privacy Commissioner of Canada. It concluded that the commissioner's role has a differential impact on businesses of different sizes, as SMEs tend to be more sensitive to financial risk and penalties. Furthermore, the deterrent effect of avoiding intervention by the commissioner would be stronger for SMEs if the Privacy Commissioner of Canada had order-making power and the ability to impose penalties.
Another reason I support order-making is that it leads to the creation of a body of precedents: orders more detailed than the summaries the office currently provides. These would give businesses much clearer direction as to how PIPEDA is being interpreted and applied.
In terms of the GDPR, the General Data Protection Regulation, alignment makes sense from the perspective of international trade. I would submit, however, that it's important not to lose sight of the private sector privacy laws in Alberta, British Columbia, and Quebec, as well as the substantially similar health information laws in jurisdictions such as Ontario and Newfoundland and Labrador, with New Brunswick and other provinces soon to achieve the substantially similar designation. Any changes to PIPEDA would necessitate a similar review of each of those substantially similar provincial and territorial laws.
It's not clear to me that data portability and privacy by design aren't already captured by PIPEDA. Data erasure, however, appears to have no PIPEDA counterpart.
On reputation and privacy, I don't support a right to be forgotten. I simply don't think it could survive a charter challenge.
As a former commissioner, I was very concerned with the issue of public registries that were created long before we started to worry about data profiling, data matching, and identity theft. The response needs to be to encourage more scrutiny at the time registries collect the information and to ensure that nothing is collected that isn't essential to the purpose of the registry.
When Chantal Bernier was assistant privacy commissioner of Canada, I recall that she led a collaborative initiative with provincial commissioners to create a set of guidelines dealing with the Internet publication of administrative tribunal decisions. So there certainly is an issue that can be addressed, but I'm just not sure the right to be forgotten is going to be the answer.
I think freedom of expression in the charter limits what could be done. If you cannot compel a media outlet to take down content, then I contend you cannot stop a search engine from communicating to the world that the content exists.
Regarding meaningful consent, I'm going to submit to you, Mr. Chairman and members, that some useful privacy lessons have been learned from the Canadian experience with electronic health records, where the role of consent has been significantly diminished, notwithstanding the fact that we're dealing with some of the most sensitive and prejudicial information that Canadians have. I'm thinking particularly of Alberta and Saskatchewan, which have a largely completed electronic health record for every citizen. This allows thousands of providers in all parts of the province the opportunity to look at prescription drug profiles, laboratory test results, diagnostic imaging, radiology reports, clinical notes from providers in hospitals, and immunization information on anyone in the province. Of course, they're not supposed to be viewing this material unless they have a legitimate need for the purposes of diagnosis, treatment, and care, but the point is that they have the ability to access that information. With funding from Canada Health Infoway, all other provinces are working to develop similar systems, which should be interoperable across all provinces and territories.
And we've certainly learned over the last decade that apart from the question of consent, there's a compelling need for other privacy enhancing features. At the top of my list would be a privacy management program to ensure a coordinated approach to PIPEDA compliance, because what you tend to see too often among health care providers is a fragmentation: a policy here, a policy there, and not appropriate coordination and leadership. So a privacy management program is an important feature.
There's also a need for a proactive audit program that's made known to all employees. Too often, organizations like to boast that their electronic system has an audit capability. That capability isn't very useful if there's no ongoing proactive program and if the staff who have access to that sensitive information aren't aware that this capacity exists in the organization.
Furthermore, we need strengthened regulatory oversight both by commissioner offices and also regulated professional bodies.
We could spend hours talking about the development and expansion of secondary use of personal health information and big data. The historic view is that if you're dealing with identifiable patient information and using it for the original purpose for which it was collected, namely diagnosis, treatment, and care, you don't require additional consent. If you're using it for research purposes, however, you would typically require express consent, unless a research ethics board has approved a waiver of consent.
There are significant issues around that, as well as a corresponding need for hard safeguards.
Unlike Australia, whose My Health Record system requires patients to opt in, Canada has compulsory enrolment of all Canadians and uploading of their personal health information to the system. They're not invited or asked whether they consent. The system of electronic health records is based on implied consent, not express consent. Moreover, implied consent typically requires transparency at the point of collection about the kind of PHI that's collected and how it will be used and disclosed, and it typically requires that an individual can elect to opt out. The kind of masking offered in the electronic health record system we're building in Canada gives patients something quite different, and certainly something much less than an opt-out.
Patient privacy, as we've seen in our experience over the last decade, is typically reinforced by a number of soft safeguards, including an oath or pledge by all health care workers to protect privacy; written policies and procedures for the collection, use, and disclosure of personal health information; training of staff; and an audit trail of those who view anyone's PHI.
The experience, though, is that despite these soft safeguards, we've had something of a rash of snooping incidents. You have read about that, because there are, I think, pending class actions in at least five Canadian provinces arising from unauthorized snooping in patients' personal health information. This has sharpened the focus on hard safeguards to backstop the soft safeguards.
I'd recommend that if you're looking, as the commissioner has invited you to do, at possible alternatives or enhancements to consent, you might want to consider the kinds of hard safeguards that have been developed for electronic health records. These would include dismissal for cause or other disciplinary action by employers; prosecution and fines, which the stand-alone health information laws set at very high levels; class-action litigation; and disciplinary action by professional regulatory bodies.
I say that on the issue of consent and determining whether there are some alternatives, there's some valuable experience to consider and to draw from when we look at electronic health records as we see them now in Canada.
Thank you very much, Mr. Chairman.