Mr. Speaker, 34 years ago, in 1988, Justice Gérard La Forest of the Supreme Court of Canada said that “privacy is at the heart of liberty in a modern state” and that it is “worthy of constitutional protection”. All Canadians are worthy of having their privacy respected.
It is our duty as parliamentarians to do our best to protect Canadians' privacy rights, especially when those rights are under such strain today.
Bill C-27 is designed to update Canada’s federal private sector privacy law, the Personal Information Protection and Electronic Documents Act, or PIPEDA, to create a new tribunal and to propose new rules for artificial intelligence systems. It is a reworking of Bill C-11, and it has three components: the consumer privacy protection act; the personal information and data protection tribunal act, creating a new tribunal; and the artificial intelligence and data act.
The bill applies to Canadians' privacy rights in the private sector. It does not apply to CSIS, the RCMP or CSE; that and other government-held data are governed by the Privacy Act. Canada's private sector privacy laws have not been updated in 22 years, while Europe adopted the General Data Protection Regulation in 2016.
When we last updated this act, 22 years ago, the member for was turning 21 years old, and society was going through big changes. The world had just gotten past the Y2K scare. We were looking at what was going to happen to computers when the clock changed from 1999 to 2000. In certain areas, we did not know if the power would go out or what would happen.
People listened to music on CD Walkmans. Apple was over a year away from launching a cutting-edge new technology called the iPod. Less than 30% of Canadians actually owned a cellphone. The most popular cellphones were Motorola flip phones and the Nokia brick phone, with texting that used the number pad and almost no web browsing capability. The most sophisticated app was called Snake. A fledgling Canadian telecommunications company, now known as BlackBerry, was just getting started.
That is how long it has been since we updated our laws. Today, 22 years later, data collection is getting more sophisticated, and surveillance is more of the norm than the exception.
Apple announced a few weeks ago that the Apple Watch can track and tell when a woman is ovulating. What is concerning, and we are going to talk a lot about data used for good and data used for wrong, is that this technology can tell if a woman skips a cycle and can then identify whether there has been a miscarriage or an abortion. This is very concerning.
Our Fitbits, our web history and our Apple phones can tell us how many steps we took in a day. Sometimes when we are in Parliament it is about 10, and if we are door knocking it is about 25,000. That does not sound important, but that information also lets those companies know where we have been, where we are going and where we live.
Facial recognition technology can identify a face like a fingerprint. Sometimes that is good: we have heard from law enforcement that it can be used to fight human trafficking. Sometimes it is wrong, when people on the street are identified by name, along with their data and where they have been. Let us think of Minority Report, where everywhere someone went, they were identified, no matter where they were going or where they had been. That is something that could happen with facial recognition technology.
Google and Amazon listen and collect our data in our bathrooms, living rooms, kitchens and cars. How many times have we been in conversations and Siri asks, “What was that?” Siri is always listening. Amazon is always listening. Speaking of cars, they are cellphones on wheels. When we connect to a rental car, and a lot of us rent cars, we see five or six other phones in its history. That car has downloaded all the data from those phones, and when we connect our own phone, the car holds our information too. It is very concerning.
There are many examples where this has hurt Canadians in the last several years. Two summers ago, Tim Hortons had a privacy breach: its app tracked users so that every time someone rolled up the rim, it told Tim Hortons where they went afterwards, whether they went home or where they were staying. It collected all that data, and it was a big problem.
At the ethics committee, we studied facial recognition technology. A company called Clearview AI took two billion images off the Internet, including many of ours, and provided them to police. There was no consent. The information simply ended up in the hands of law enforcement.
There is Telus's “data for good”. During the pandemic, Telus collected our data. It knew where we went and if we went to the grocery store or the pharmacy, or if we stayed home. It just gave that to the government. It was called “data for good”. They called it de-identification. I am going to talk about how that hurt everyone later.
Lastly, there is doxing, or using personal information to out people. GiveSendGo is a big example. The information of people who donated to different causes or events through that U.S. company was exposed, and at one point all those donors were identified on a website showing exactly where they lived. Everyone who donated to a cause was identified and outed. That was terrible.
Surveillance has not just resulted in a wholesale destruction of privacy but also in a mental health crisis among children and youth. I am glad to hear the minister speak about children and youth, because data has certainly affected them and continues to.
Canada’s federal government has repeatedly failed to take privacy seriously and to construct a legal framework that protects the rights of Canadians in the digital age. This bill normalizes surveillance and treats privacy not as a fundamental human right, nor even as a matter of consumer protection. To make this point very clear, nowhere in Bill C-27 does it state that privacy is a fundamental human right. That should be the crux of any new legislation to update privacy laws, if not its outward premise, with the statement hammered home from the preamble to the end of the bill and carried through the entire document. However, it is not there. It is nowhere and, therefore, holds no value.
This bill does not make that statement from the outset, yet it should be the pillar by which the bill is designed and led. Only a strong bill will ensure that Canadians' privacy rights are protected. Because of this omission, the bill is very weak, making it easier for industry players to be irresponsible with people's personal data. This is ironic, as Canada has signed on to the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. That is where the bill starts and ends: with its failure to properly address privacy for Canadians.
Conservatives believe that Canadians’ digital privacy and data need to be properly protected. This protection must be a balance that ensures Canadians’ digital data is safe and that their information is properly protected and used only with their consent, while not being too onerous to be detrimental to private sector business. It is a balance.
Let us be clear. We need new privacy laws. In fact, they are essential to Canadians in this new digital era and to a growing digital future, but Bill C-27 needs massive rewrites and amendments to properly protect privacy, which should be a fundamental right of Canadians. The bill needs to strike a balance between the fundamental right to privacy and privacy protection, on the one hand, and the ability of business to responsibly collect and use data, on the other.
It also needs more nuance; parts of this bill are far too vague. The definition of tyranny is the deliberate removal of nuance, so to create more equality and fairness in privacy rights and to ensure that businesses and AI use data for good, we need more nuance, with more detail and more explanation, not less. My grandfather had a saying I love: “If you're going to do something, make sure you do it right or don't do it at all.”
Besides omitting privacy as a fundamental right, the bill needs a massive rewrite. First, it doubles down on a flawed approach to privacy, using a notice-and-consent model as its legal framework. The legal framework of Bill C-27 remains designed around a requirement that consent be obtained for the collection, use and disclosure of personal information, unless one of the listed exceptions to consent applies. Chief among those exceptions is “legitimate interest”.
What is scary about legitimate interest is that businesses themselves will determine what it means and what will be exempt. Canada’s leading privacy and data governance expert, Teresa Scassa, says that this provision alone “trivializes the human and social value of privacy.” The legitimate interest provision allows Facebook, for instance, to build shadow profiles of individuals from information gathered from their contacts, even individuals with no Facebook account, without asking for their permission.
Have colleagues ever seen the “people you may know” feature on Facebook? Sometimes people turn up there although one might not know where they had ever met, and even though that person is not actually on Facebook. That is because Facebook builds profiles and shadow profiles from other members' contacts. Facebook has a feature that suggests one share their contacts: It will be great. People give all their friends' information to Facebook: their emails, addresses and sometimes their private phone numbers. In the U.S., that information was found turning up in Facebook's suggestions. Here are a couple of examples. An attorney had a man recommended as a friend he might know who was defence counsel on one of his cases, when they had only communicated through a work email. Another time, a man who had secretly donated sperm to a couple had Facebook recommend their child as a person he should know, despite not being Facebook friends with the couple, whom he once knew.
“Legitimate interest” needs more nuance. It needs to be better defined, or it is useless. As written, it allows far too much interpretation: it permits almost anything unless it is expressly excluded. It is far too broad.
Additionally, consent must be “in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.” Bill C-27 makes it hard to determine what legitimate interests are, and that goes back to the failure to treat privacy as a human right.
Compare this section to the European Union's privacy law, the GDPR, which is, as the minister stated, the gold standard. Under Bill C-27, the legitimate interest exemption is available unless there is an adverse effect on the individual that is not outweighed by the organization's legitimate interest, whereas the GDPR weighs the interests and fundamental freedoms of the individual. Adverse effects on individuals can include data breaches, which are shocking and distressing to those impacted, yet some courts have found that the ordinary stress and inconvenience of a data breach is not a compensable harm, since breaches have become a routine part of life, probably for the last two years at least. On that reading, the legitimate interest exemption will be far too broad.
However, Bill C-27 would take an exception to consent that was meant to be quite narrow under the European Union's privacy laws and make it a potentially mainstream basis for using data without consent. Why would it do this? It is because Bill C-27 places privacy on par with commercial interests in using personal data, something that would not happen if privacy were recognized in the bill as a fundamental right of Canadians.
Additionally, we need to be wary of consent. If consent is to be mandatory, it should be made easier. Has anyone ever scrolled through the consent agreement on their iPhone? Has anyone actually read all of it? Has anyone read Google's 38 pages of consent every time they sign up for or use Google?
Consent is not easy. It is not simple, and certainly this proposed law would not make it any simpler. We need to be wary of consent, and we need to ensure that consent is consensual, both in language and intent, and that we all know exactly what we are signing up to do, to give and to receive.
There is another term I want to explain called “de-identification”. The bill talks a lot about de-identification, which it defines to mean “to modify personal information so that an individual cannot be directly identified from it,” while going on to acknowledge that “a risk of the individual being identified remains.” In other words, an individual's information would be modified, but a risk of identifying that individual would remain.
Members will remember my Telus data for good example. Telus gave this information to the government during COVID, even though a risk of the individual being identified remained. It should be scrapped, and instead we should be using the word “anonymize”, which is also in the bill. This is what the GDPR does. In the bill, it “means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.”
I would ask members which one they would prefer. Would they like to be re-identified, as there is a possibility, or would they like no identification by any means?
Another major flaw in Bill C-27 is the creation of a bureaucratic tribunal instead of giving the Privacy Commissioner more bite. The tribunal is a time-waster; the Privacy Commissioner should be allowed to levy fines and should be given more power and more bite. The choice is puzzling, because the EU, the U.K., New Zealand and Australia do not have tribunals that mediate fines for privacy violations. Furthermore, it would no doubt force those who have had their privacy violated to wait years to exercise their right of action.
Let me put this plainly. First, the Office of the Privacy Commissioner, or OPC, would make a ruling. Then the government's new tribunal could reverse the Privacy Commissioner's ruling, and then the courts would be allowed to rule on the tribunal's ruling. We would have a decision, another decision and a third decision, and each one of them could be countered.
How long would that take? What do members think: 48 hours, or six months? Right now, the average is one year before the Privacy Commissioner, and we could add another year for the tribunal plus another year for appeals.
I ask this: Is it fair to have the average Canadian who has had their data breached, with their limited resources, have to go up against Facebook and Amazon and then spend three years in court? Does this protect fundamental privacy rights? Is this not just adding another layer of government that we certainly do not need?
The absence of rights-based language in the bill might tip the scale away from people in Canada when the OPC and the tribunal weigh the privacy interests of people against the commercial interests of companies. Again, what does this come back to? Privacy was not listed as a fundamental right of Canadians.
Lastly, the AI portion of this bill needs a complete rewrite, and it should be split into its own bill.
I want to commend the minister for bringing this forward. He wants to be the first in the land to bring this part of the bill forward, but to be honest, consultations only started in June. We have met with many individuals who certainly have not had any input into this bill, and although AI is addressed, many parts are missing.
First of all, the act provides no independent, expert regulator for automated decision systems, nor even a shell of a framework for responsive artificial intelligence regulation and oversight. Instead, it says that the regulations will be determined at some future date and that decisions will come from the minister or a designated official.
Again, this part includes a new tribunal and puts decisions where they should not be: onto the government, with enforcement and decision-making by the minister or the minister's designated ISED official. These would be political decisions on privacy. Does everyone feel comfortable that we are now shifting from a tribunal to the government?
This part of the bill would shift all of that to the government, to the minister or his designate. It reminds me of the old line, “I'm from the government, and I'm here to help.”
There is also no mention of facial recognition technology in this part of the bill, despite the reports from the ethics committee and the examples on FRT I gave earlier. Certainly, that is worth more study.
There are some parts of the bill that have good aspects and certainly ones we can get behind, including the protection of children's privacy. As a father, I know it is so very important. Our children now have access to all kinds of different applications on their phones, iPads and Amazon Fires.
Our children are being listened to and they are being surveilled. There is no question that businesses are taking advantage of those children and that is something that we definitely need to talk about.
The attempt to regulate AI, though, as I have stated, needs major revisions. Without a proper privacy statement, the bill lacks a balanced purpose statement establishing that the purpose of the CPPA is to set rules governing the protection of personal information in a manner that balances the right to privacy with the need of organizations to collect, use or disclose personal information.
We should be aiming beyond the European Union's privacy law, aiming to be the world leader in balancing privacy protection with ensuring that businesses and industries use data for good. In doing so, we would attract investment and technology, all the while protecting Canadians' fundamental right to privacy.
Canada needs privacy protection that builds trust in the digital economy, where Canadians can use new technologies for good while being protected from the bad: profiling, surveillance and discrimination. The minister said that he wants to seize the moment and that we need leadership in a constantly changing world. Most importantly, the minister said that trust has never been more important.
If we do not get this right, and if we do not make sure that privacy is a fundamental human right, and declare that in the document and build the document around that right, we are doing two things: We are not prioritizing Canadians' privacy, as we are certainly not putting privacy at the forefront of the bill, and we are certainly not showing leadership in an ever-changing world.
As I noted at the outset, the technologies of 22 years ago have changed significantly, and technology is now changing faster still. In the next 22 years, technologies will be more embedded in our lives, not less. We will have AI that does good.
One of the stakeholders we met with talked about AI for good. They talked about embedding AI into the government's passport system, which might actually mean that we could get passports within 48 hours. Could we imagine that? Could we imagine embedding technology for good into a system that would allow Canadians to get the things they need faster?
We love technology. We want to embrace it. We just want to make sure that, number one, privacy is protected. We want to make sure that we do the hard work of building frameworks alongside Canadians' fundamental human right to privacy and being protected in equal balance with the economy, democracy and the rule of law. This bill does not do that, not yet.
Let us work to make sure we come back with a bill that does that.
Mr. Speaker, I would like to begin by giving a shout out to my constituents in Trois-Rivières, whom I will be visiting all next week in my riding.
When I talk to people on the street, privacy is a topic that comes up a lot. They know that I sit on the Standing Committee on Access to Information, Privacy and Ethics, and privacy comes up often. People tell me that it is important, that we must do our best to rise to the challenge. Today, we have the opportunity to debate that very subject.
Society is a human construct. It is a reflection of how we organize our lives together. It reflects our vision of the world, the role of a citizen, the role of the state. In a democratic society where elected officials are chosen by the people to represent them, our laws must reflect our desires and the desires of our fellow citizens, as well as the way in which their visions can be realized. In other words, a society and its laws are eminently cultural constructs.
When we compare the legislation passed in the House of Commons with that of the Quebec National Assembly, the difference is striking. Ottawa tends to emphasize the enforcement mechanism, whereas in Quebec, the emphasis is on the legislator's intent. Ottawa wants to arbitrate, while Quebec wants to prescribe and guide.
When it comes to privacy, this is especially true in the digital age: the difference is dramatic.
At one end of the spectrum, so to speak, is the United States. In the United States, laws are primarily intended to arbitrate disputes rather than to shape how the digital economy operates. Laws are based on the good faith of the players and on voluntary codes. As one might imagine, this has its limits. Ultimately, if someone is wronged, they can get redress through the common law.
At the other end of the spectrum is the European Union. The legislation there prescribes clear obligations. I am referring to the General Data Protection Regulation, better known by the acronym GDPR.
In between is Canada, a hybrid creature whose intentions on privacy oscillate between the European and American extremes. This may seem like an academic debate, but there are practical implications that bring us to Bill .
When it comes to privacy, European law is the most prescriptive in the world. It is based on a clear principle, namely that our personal information belongs to us and us alone, and no one can use it or benefit from it without our free, informed and explicit consent.
Once the European Union set out that principle, or objective, it provided a mechanism for achieving it. That mechanism is the GDPR. The GDPR is becoming the standard to follow when it comes to privacy, because it is the legal standard with the clearest objectives and the most binding application. Simply put, the GDPR does a good job of protecting privacy. That is one reason why it is the standard we should be emulating; the other is that the EU is projecting its standard-making power beyond its borders.
In order to protect the personal information of European citizens, the European Union will soon prohibit European businesses from sharing this information with foreign businesses that do not offer comparable protection. This does not affect us yet, but next year, the EU will be reviewing Canada's laws to see if they offer sufficient protection.
The existing legislation on personal electronic information protection dates back to 2000. That was 22 years ago. We were in the dinosaur era, the pre-digital era, an era we barely remember now. Also, it is far from clear whether Canada passes the comparable protection test required under the GDPR.
Information exchanges between Canadian businesses and their European partners could become more complicated. This is particularly true in areas that deal with more sensitive information, such as the financial sector. It is therefore absolutely necessary to redraft the Personal Information Protection and Electronic Documents Act, which is completely outdated. It has not kept pace with technological change and the data economy, where we are both the consumer and the product. It has not kept pace with the legal environment, where Canada is a dinosaur compared to Europe, as I was just saying.
My colleagues will have figured out that the Bloc Québécois is in favour of the principle of Bill C‑27. Nevertheless, I would like to make a general comment about it. For some reason, the government has put into one bill two laws with completely different objectives. The bill would enact the consumer privacy protection act and also the artificial intelligence and data act. Although there is a logical link between these two acts, they could be stand-alone bills. Their objectives are different, their logic is different, and they could be studied separately.
I have a suggestion for the government. It should split Bill C‑27 into two bills. We could create what I would call the traditional Bill C‑27, which would deal with personal information and the tribunal. Then, what I would call Bill C‑27 B would address artificial intelligence. As I was saying, there are logical reasons for that, but there are also practical reasons. Let me be frank and say that the artificial intelligence act being proposed is more of a draft than a law. The government has a clear idea about the mechanism for applying it, but, clearly, it has not yet wrapped its head around the objectives to be achieved and the requirements to be codified.
The mechanism is there, the bureaucratic framework is there, but the requirements to be complied with are not. Apart from a few generalities, the law relies essentially on self-regulation and the good faith of the industry. I have often faced these situations, and I can say that the industry's good faith is not the first thing I would count on.
Apart from a few generalities, this relies on good faith, but that is not a good way to protect rights. I am not convinced that this bill should be passed as written; I think it needs to be amended. Bill C‑27 probably deserves the same fate that Bill C-11, its predecessor, encountered in the last Parliament. The government introduced it, debate got under way, criticism was fierce, and the government let it die on the Order Paper so it could keep working on it and come back with a better version. I think that is exactly what should happen to the artificial intelligence act.
The government has launched a healthy discussion, but this is not a finished product. If we decide that the government needs to keep working on it and come back with a new version, we will also be delaying the modernization of privacy and personal information legislation. Given the European legislation, which I talked about earlier, that is not what the government wants to do. That is why I would cordially advise the government to split Bill C‑27.
I am going to focus primarily on personal information protection because that is the part of Bill C‑27 that is ready to go and has the most practical applications. As I said before, Bill C‑27 is an improved version of Bill C‑11, which was introduced in the fall of 2020.
However, Bill C-27 still does not establish privacy as a fundamental right. Bill C-11 was strong on mechanics, but weak on protection. The principles were also weak and consent was unclear. It was tough on large corporations and much less so on small businesses. When it comes to privacy, however, it is the sensitivity of the data that should dictate the level of protection, not the size of the company.
A new start-up that develops an app aggregating all of our banking data, for example, may have only two employees, but it still possesses and handles extraordinarily sensitive information that must be protected as much as possible. I cannot help but think of the ArriveCAN app, which was developed by just a few people but handles a large amount of stored personal data.
Finally, Bill C-11 did not provide for any harmonization with provincial legislation, such as Quebec's privacy legislation. The Bloc Québécois was quite insistent on that. A Quebec company subject to Quebec law would also have been subject to federal law as soon as the data left Quebec. It would have been subject to two laws that do not say the same thing and have two different rationales. This would mean duplication and uncertainty. It was quite a mess. Passing Bill C-11 would have diminished, in Quebec at least, the legal clarity that is needed to ensure that personal information is protected.
Here is what Daniel Therrien, the then privacy commissioner, told the Standing Committee on Access to Information, Privacy and Ethics, of which I am honoured to be a member. He said, and I quote, “I believe that represents a step back overall from our current law and needs significant changes if confidence in the digital economy is to be restored.”
He proposed a series of amendments that would make major changes to the bill. I want to commend the government here. It listened to the criticism. It is rare for this government to listen, but it did so in this case. It buried Bill C-11. We never debated it again in the House and it died on the Order Paper. It reappeared only after being improved.
Bill C-27 shows more respect for the various jurisdictions and avoids the legal mess I was talking about earlier.
Our personal information is private and it belongs to us. However, property and civil rights fall exclusively under provincial jurisdiction under subsection 92(13) of the Constitution Act, 1867.
What is more, privacy basically falls under provincial jurisdiction. That is particularly important in the case of Quebec, where our civil law tradition leads us to pass laws that are much more prescriptive.
Last spring, Quebec's National Assembly passed Bill 25, an in-depth reform of Quebec's privacy legislation. Our law, largely inspired by European laws, given that we share a legal tradition, is the most advanced in North America. As we speak, it is clear that Quebec has exceeded the European requirements and that our companies are protected from any hiccups in data circulation.
Our principles are clear: Our personal information belongs to us. It does not belong to the party who collected it or the party who stores it. The implication is clear. No one can dispose of, use, disclose or resell our personal information without our free, informed and express consent. Bill C-11 challenged this legal clarity but Bill C-27, at the very least, corrects that.
Under clause 122(2) of Bill C‑27, the government may, by order, “if satisfied that legislation of a province that is substantially similar to this Act applies to an organization, a class of organizations, an activity or a class of activities, exempt the organization, activity or class from the application of this Act in respect of the collection, use or disclosure of personal information that occurs within that province;”.
In other words, if Quebec's legislation is superior, then Quebec's legislation will apply in Quebec.
When I met with the minister's office earlier this week, I asked for some clarification just in case. Will a Quebec business be fully exempt from Bill C‑27, even if the information leaves Quebec? The answer is yes. Will it be exempt for all of its activities? The answer is yes.
There is still some grey area, though. I am thinking about businesses outside Quebec that collect personal information in Quebec. In Europe, it is clear. It is the citizen's place of residence that determines the applicable legislation. The same is true under Quebec's legislation.
It is not as clear in Bill C‑27. Since the bill relies on the general regulation powers for trade and commerce as granted by the Constitution, it focuses more on overseeing the industry than on protecting citizens. That is the sort of thing we will have to examine and fix in committee. I look forward to Bill C‑27 being studied in committee so we can debate the substance of the bill.
I have to say that I sense the openness and good faith of the government. In that regard, I would like to tell the member for to take note that, for once, I feel he is working in good faith.
Bill C‑27 will have a much greater impact outside Quebec than within it, because it is better drafted than Bill C-11. That is not the only aspect that was improved. The fundamental principles of the bill are clearer, and consent is stated more explicitly. It is also clearer that more sensitive data must be handled more rigorously, no matter the size of the entity holding it.
If the principles are clear, the act will better stand the test of time and adjust to the evolving technologies without becoming meaningless.
We will support it at second reading after a serious debate, but without unnecessary delays. However, we believe and insist that the real work must be done in committee. Bill is complex. Good principles do not necessarily make good laws. Before we can judge whether Bill C-27 is indeed a good law, we will need to hear from witnesses from all walks of life.
When it comes to privacy, it only takes one tiny flaw to bring down the whole structure. This requires attention to detail and surgical precision. The stakes are high and involve the most intimate part of our lives: our privacy.
For a long time, all we had to do to maintain our privacy was buy curtains. That is how it used to be. It kept us safe from swindlers. Then organizations started collecting data for their records. Bankers collected financial information, the government collected tax information and doctors collected medical records. This sensitive information had to be protected, but it was fairly simple, since it was written on paper.
Today, we live in a different world. Whereas personal information used to be a prerequisite for another activity, such as caring for a patient or getting a loan from a bank, it has become the core business of many companies, some of them very large ones.
Computerization enables the storage and processing of astronomical volumes of data, also known as big data. Networking that data on the Internet increases the amount of available data exponentially and circulates it around the globe constantly, sometimes in perpetuity, unfortunately.
For many corporations, including web giants, personal data is crucial to the business model. Citizen-consumers are now the product they are marketing. To quote Daniel Therrien once again, we are now in the era of surveillance capitalism. Speaking of which, The Great Hack on Netflix is worth seeing. This is troubling.
Furthermore, for our youngest citizens, the virtual world and the real world have merged. Their lives are an open book on Instagram, Facebook and TikTok. They think they are communicating with the people who matter to them, but they are in fact feeding the databases that transform them into a marketable, marketed product. We absolutely have to protect them. We need to give them back control over their personal information, which is why it is so important to amend and modernize our laws.
I would like to close my speech with an appeal to the government. Bill does a lot, but there are also many things it does not do, or does not do properly. Consent is all well and good, but what happens when our data is compromised, when it has been stolen, when it is in the hands of criminals? These people operate outside the law and therefore are not governed by the law. All the consent-related protocols we can think of go out the window. To avoid fraud and identity theft, we will have to clarify the measures to be taken to ensure that anyone requesting a transaction is who they say they are. This really is a new dynamic. In that respect, we are somewhat in the dark, even though, curiously, this is a growing problem.
There is another gap to fill. Bill provides a framework for the handling of personal information in the private sector, but not in the public sector. The government is still governed by the same old legislation, which dates back to the pre-digital era. The legislation is outdated, as we saw with the fraud related to the Canada emergency response benefit. The controls are also outdated. I therefore call on the government to get to work and to do so quickly. We will collaborate.
Finally, there is another thing the government needs to work on, and fast. We addressed this issue in committee when we were looking at the geolocation of data. Bill indicates what we need to do with personal data, nominative data. However, with artificial intelligence and cross-tabulation of data, it is possible to recreate an individual based on anonymous information. As no personal information was collected at the outset, Bill C‑27 is ineffective in these cases. Yet the end result is a recreated profile of a person, complete with all their personal information. It is not science fiction. It is already happening. Nevertheless, this is missing from Bill , both in the part on information and the part on artificial intelligence.
I am not bringing this up as a way of opposing Bill C‑27. As I said, we will support it. However, we have to be aware of the fact that it is incomplete. As legislators, we still have some work to do. The time has come to treat privacy as a fundamental right.