
Standing Committee on Industry and Technology


NUMBER 110 | 1st SESSION | 44th PARLIAMENT

EVIDENCE

Monday, February 12, 2024

[Recorded by Electronic Apparatus]

  (1100)  

[Translation]

    I call this meeting to order.
     Good morning, everyone, and welcome to meeting number 110 of the House of Commons Standing Committee on Industry and Technology.
    Today’s meeting is taking place in a hybrid format, pursuant to the Standing Orders.
    I'd like to welcome all the witnesses here today.
     From the Alliance of Canadian Cinema, Television and Radio Artists, we have Eleanor Noble, national president, who is joined by Marie Kelly, national executive director. We also have Stéphanie Hénault, director of legal affairs at the Association nationale des éditeurs de livres, as well as Marie-Julie Desrochers, executive director of the Coalition for the Diversity of Cultural Expression. From the Directors Guild of Canada, we have Dave Forget, national executive director, and Samuel Bischoff, manager of policy and regulatory affairs. Lastly, from Music Canada, we have Patrick Rogers, chief executive officer.

[English]

    Thanks for being here with us this Monday morning as we're nearing the end of this study on Bill C-27. You are generating, as I see already, a lot of excitement in the room, so thanks for making the time to enlighten us with your testimony and your answers to our questions today.
     Without further ado, we can start with Ms. Noble for five minutes.
     The floor is yours.
    I'm Eleanor Noble, the national president of the Alliance of Canadian Cinema, Television and Radio Artists.
    Thank you for the opportunity to speak to this committee on behalf of the 30,000 members of our union. With me today is Marie Kelly, our national executive director. She's here with me to address any questions you may have.
    For 80-plus years, ACTRA has represented professional performers across Canada who bring Canadian stories to life. We play a vital role in a nearly $14-billion industry that generates 240,000 jobs a year. We came to this committee today because we are concerned about the use of artificial intelligence and similar technologies in our industry.
     To be clear, there are some positives to the adoption of technology in our industry when used responsibly. That said, our members are increasingly concerned about the unbridled and unmitigated use of AI in our industry and outside of it, which has the potential to significantly and harmfully impact our ability to work and make a living in the screen industry.
    Last year, we undertook a comprehensive survey of our members about the impact of AI. Outside of collective bargaining, we have never had more responses to a member survey. Let me share with you some of the high-level takeaways: 98% of ACTRA performers are concerned about the potential misuse of their name, image and likeness by AI; 93% of respondents are concerned that AI will eventually replace human actors, beginning with background work and dubbing.
    We have seen real examples of harmful use already. It was brought to ACTRA's attention last year that the voice of one of ACTRA's minor performers—underage performers—was uploaded to an AI text-to-speech voices list on a public website that allowed users to manipulate her voice to say crude, R-rated things. This is a minor, I'll remind you. This is unacceptable.
    Similarly, a video game featuring an ACTRA performer was downloaded by players and, with the use of AI, the performer's voice and game character were manipulated to say obscene things and to perform sexual acts, all without the knowledge or consent of the actor. This was accessible online for two years before the actor became aware, at which point ACTRA was contacted to step in.
    These are just two examples of the harmful manipulations that performers—and, frankly, many Canadians—face. I think we can agree as Canadians that these are extremely harmful violations. We—you—have an opportunity to take action in this bill to protect us, and we are asking you to do so.
     We are pleased that the government is reviewing the impact of AI in a multi-faceted manner. We believe it's important to update the privacy regime in Canada and to put a framework in place to ensure AI developers and deployers must take action to mitigate the potential for harm from their technologies.
    We want to congratulate the government on bringing Bill C-27 forward and, in particular, we support your intention to ensure that consent is required for the use of biometric information, including a performer's name, image and likeness. Clarity around informed consent, we hope, will help in our work to ensure the industry does not ambush performers into signing away their rights.
    This committee must push this bill further to clarify the type of harm performers experience on an all-too-regular basis. Not only is AI causing personal harm to performers like me, but it also risks our livelihoods and reputations. In the entertainment business, our reputation—including our name, image and likeness—is all we have. We are the brand, which we protect. The difference between getting a job one day and not getting one the next can come down to the most minute things, including one's reputation.
    Sadly, reputational harm is not currently encompassed by Bill C-27. The definition of harm, even where it includes “psychological harm” or “economic loss to an individual”, does not sufficiently capture the reputational harm we experience. Due to the nature of our business, we might not be able to show an exact circumstance of work lost due to a deepfake or manipulation, but there is no doubt that damage to a performer's reputation means real and tangible loss for our careers.
    We have submitted to this committee our proposed language to rectify this gap under the legislation. We strongly urge this committee to amend the definition of harm to ensure that performers' rights are protected under this bill.

  (1105)  

     Finally, the government must take action to amend other statutes to mitigate the harm of AI on Canadian performers. Specifically, we believe that the Copyright Act is fundamentally biased against performers by not ascribing a moral right to their work. We urge this committee to take action, either through this bill or with haste elsewhere, to protect Canadian performers. We understand that the upcoming budget bill may contain amendments to the Copyright Act, and we ask that you raise with the Minister of Finance the urgency of the need to provide moral rights to performers in it, as musicians have.
    Committee members, we recognize that this bill is only scratching the surface of the public policy tools the government has on this file. We urge you to take us seriously. Our sector is an economic driver in this country, with real workers who strive to make a living and contribute to our Canadian cultural life. We need you, as legislators, to ensure that we can be protected and can continue to work today and into the future.
    Thank you. Marie and I would be happy to take any of your questions.

[Translation]

    Thank you, Ms. Noble.
    Ms. Hénault, the floor is yours.
    On behalf of the Association nationale des éditeurs de livres, I want to thank you for having me here in connection with this study on the first legislative initiative to specifically regulate artificial intelligence systems in Canada.
    My name is Stéphanie Hénault, and I am the director of legal affairs at the association, which represents francophone book publishing companies across the country. Together with the Union des écrivaines et des écrivains québécois, the association that represents authors in Quebec, we have established Copibec–Gestion collective des droits de reproduction, which offers copyright and royalty management solutions for users and rights holders. Associated with the International Publishers Association, the largest publishing federation in the world, we promote publishing as an economic, cultural and social development driver and are leaders in its evolution. By collectively participating in major international fairs and salons, hosting foreign publishers, booksellers and journalists here in Canada and taking part in numerous foundational projects, we are involved in numerous efforts to promote the exposure of French Canadian books.
    For example, we established the Entrepôt ANEL-De Marque, which has fostered the successful development of a business model that complements that of the print industry, and we support our members in implementing digital strategies that promote their development. More specifically, we promote books in all formats in francophone countries and the translation of those books in countries such as Germany, Argentina, China, Egypt, Spain, the United States, Mexico, Iceland, Sweden, Serbia and Turkey, to name only a few.
    The more Canadian literature is read internationally, the more popular it becomes among readers. The more often it's noticed by juries, the more awards it wins and the more it sells on all continents, including in our own country. The following numbers show how successful French Canadian books have become. In Quebec alone, sales of new books represent a market valued at approximately $680 million a year. Also in Quebec, the market share of francophone publishing companies represents 50% of sales, even though 900 foreign publishers distribute their books here.
    In the artificial intelligence era, the entire Canadian book publishing industry needs our support, now more than ever, in establishing updated policies and programs by encouraging the lawful supply of content in this field.
    This is why we took an active part in the recent consultation on generative AI and copyright by supporting the responsible development of artificial intelligence. We did it with Access Copyright, the Association of Canadian Publishers, the Association des éditeurs de langue anglaise du Québec, the Canadian Authors Association, the Canadian Publishers' Council, Copibec, the Literary Press Group of Canada, the Regroupement des éditeurs franco-canadiens, the Writers' Union of Canada, the Union des écrivaines et des écrivains québécois, as well as with our partners in the Coalition for the Diversity of Cultural Expressions.
    The global publishing industry relies on copyright, particularly the exclusive right to authorize or prohibit the use of works and to engage in fee-based licensing. These rights are engaged when works are integrated into AI systems and when those systems are used, if works are reproduced within them. For rights holders, the ability to grant or withhold permission to use works in these ways is as important as the compensation that may follow, particularly when the output of an artificial intelligence system competes with the work, substitutes for it or undermines the author's moral rights, to name only those forms of harm.
    In the British, European and North American markets, we are seeing increasing numbers of copyright violation actions against AI models and trade agreements that are being reached to allow content to be licensed for text and data search purposes.
     In Canada, licensing for text and data search is a growing market. We implore the government, on behalf of the book publishing industry, to encourage that industry by amending part 3 of Bill C-27 such that it clearly establishes that artificial intelligence must be developed and deployed responsibly, in the following manner: first, by implementing procedures that guarantee compliance with copyright legislation when its models are trained; second, by establishing obligations of transparency in the publication and availability of information on content integrated in its systems; and, lastly, by clearly and expressly stating in its own conditions of licensing with its users that the latter are required to comply with copyright.
    The Copyright Act affords copyright holders remedies for addressing infringement involving AI developers, suppliers and users. First, however, AI framework legislation must at least provide that the intellectual property of Canadians be respected. Otherwise, the Canadian royalties market could well be hit even harder as systems are developed and deployed secretly, unfairly and unlawfully.

  (1110)  

    Let me be very clear: we are not opposed to artificial intelligence, but we do contend that all Canadian market actors must support the legitimate interests of authors and publishers, as well as their essential contribution to innovation, knowledge, culture, diversity, cultural outreach, the economy and wealth of the country. We therefore emphasize that you must ensure our country at least complies with international practices respectful of authors and publishers, as Europe is doing with its new AI legislation, to prevent Canada from looking like a banana republic of international technology companies.
    In conclusion, I want to emphasize that authors and publishers are also counting on you to improve the Copyright Act, failing which they will be unable to receive the legitimate royalties that their international counterparts receive when their works are reproduced at certain educational institutions. I would also remind you that this priority was supported by the Standing Committee on Science and Research in its November 2023 report entitled Support for the Commercialization of Intellectual Property.
    On behalf of the Association nationale des éditeurs de livres, thank you very much for listening. I will be pleased to answer your questions.

  (1115)  

    Thank you very much, Ms. Hénault.
    Go ahead, Ms. Desrochers.
    Mr. Chair and members of the committee, thank you for your invitation and for this opportunity for the cultural sector to comment on Bill C-27.
    I am the executive director of the Coalition for the Diversity of Cultural Expressions, which this year celebrates its 25th anniversary. The coalition consists of more than 50 members from Canada's cultural sector: anglophone and francophone unions, professional associations and collection societies. We cover a broad and diverse range of audiovisual, musical, digital arts, book and publishing disciplines, as well as the visual and performing arts. We also represent more than 350,000 creators and nearly 3,000 businesses in the cultural industry.
    I'm in good company today, surrounded by three coalition members: the Association nationale des éditeurs de livres du Québec, the Directors Guild of Canada and the Alliance of Canadian Cinema, Television and Radio Artists. This small sample represents only part of the impact that the development of artificial intelligence has had on our sector. I encourage you to continue consulting the cultural sector so you can also hear from the representatives of visual artists, screenwriters, producers, composers, authors and others.
    Our coalition's primary mission is to secure a cultural exclusion in trade agreements in order to preserve Canada's cultural sovereignty. We also want to ensure that Canada adopts public policies that guarantee protection and promotion for the diversity of cultural expressions, including in the digital environment. Our efforts build on the 2005 Convention on the Protection and Promotion of the Diversity of Cultural Expressions. That UNESCO convention came to be as a result of the concerted efforts of Quebec and Canada, and France as well, and I would note that Canada was the first country to ratify it.
    We are here today to comment on a bill that is designed to protect Canadians from the risks presented by the spectacular developments in artificial intelligence, generative AI in particular.
    The 2005 convention states that cultural diversity is "indispensable for peace and security at the local, national and international levels". In other words, the development of responsible artificial intelligence must take that diversity into account and ensure it is protected. Diversity is essential in safeguarding our freedom of expression, the health of our democracy and the maintenance of our sovereignty.
    Bill C-27 essentially addresses the risks facing individuals as a result of artificial intelligence. As others before us have done, we wish to emphasize how important it also is to consider the societal risks that artificial intelligence presents.
    The purpose of the new legislation, stated in clause 4, and the definition of harm that appears in the text are too limited. Adopting wording found in the European Union's AI legislation, we suggest that one of the purposes of the new act be to protect the health, safety and fundamental Charter rights, including democracy—of which the diversity of cultural expressions is a pillar—and the rule of law, as well as the environment, from the harmful effects of AI systems.
    The main theme for today's witnesses is copyright. That's heartening because we are convinced that Bill C-27 has a major role to play in this area.
    The Canadian government recently conducted a consultation on the impact of generative AI on copyright. The cultural community's unanimous view is that, contrary to the widely held perception, Canadian copyright legislation doesn't need to be significantly modernized to protect rights holders in reaction to developments in generative AI. It already protects human creation and prohibits the unauthorized use of protected cultural content. However, as a result of a lack of transparency regarding the data used to train AI systems, that act cannot be applied in an optimal fashion. This is where Bill C-27 must play a role.
    Here are two specific potential solutions that would restore the Copyright Act to full effectiveness for the benefit of rights holders and Canadians as well.
    We should draw on European AI legislation and go beyond the obligation to retain data records provided in the new subsection 7(2) proposed by amendment to Bill C-27, in particular by providing that a sufficiently detailed summary of the use of copyright-protected training data be made available to the public.
    It should also be more clearly stated that Bill C-27 creates responsibilities with respect to the Copyright Act, as the European Union has done.

  (1120)  

    The accountability framework outlined in new subsection 12(5) moved by amendment to Bill C-27 could thus support policies and procedures concerning the Copyright Act and the use of an individual's voice, image or reputation.
    These additions would be consistent with the regulations being introduced at the international level and would foster the development of a licensing market based on consent and remuneration of rights holders.
    Thank you for your attention. I will be pleased to answer your questions.
    Thank you very much, Ms. Desrochers.
    Go ahead, Mr. Forget.

[English]

     Dear Chair and members of the committee, good morning. Thank you for the opportunity to participate in this important work.
    My name is Dave Forget, DGC's national executive director. With me today is Sam Bischoff, manager of policy and regulatory affairs. We appreciate the committee's invitation.
    Generative AI threatens the ecosystem of creativity on an existential level. Creators should have the right to consent and be compensated whenever an AI entity uses their copyrighted content. As we stand at the crossroads of regulating AI to protect against harms, we believe it is crucial to protect creators from the economic and moral harms under AIDA.
    The Directors Guild of Canada is a national labour organization that represents over 7,000 key creative and logistical personnel in the film, television and digital media industries, including over 1,000 director members working across the country on screen-based programming.
    The Canadian film and television sector generates massive amounts of value, employment and soft power. In 2021-22 the entire screen sector value chain directly contributed an estimated 337,000 jobs, $16.6 billion in labour income and $23.3 billion in GDP to the Canadian economy. However, artificial intelligence threatens the core of this ecosystem. Large language model developers are reproducing extensive amounts of creative works for commercial purposes without the authorization and fair compensation of authors.
    Copyright remains a central framework law for governing our industry. Any unauthorized copying to train AI models is theft. Moreover, it is very difficult for rights holders to know when their works have been used without their consent in training AI models. Creators should be able to control whether their works are copied and used for mining purposes in the first place. Transparency in AI systems must be a prerequisite to defending authors' rights. This is a fundamental element to secure a future where human creativity can flourish.
    In its current form, AIDA is failing to protect and uphold fundamental copyright principles. We need AIDA to do the following, consistent with the protections being provided to creators under the EU Artificial Intelligence Act, also known as EU AIA.
    One, confirm that the use of copyright-protected content requires the authorization of the rights holder. This would be subject to the limited exception in copyright for technical ephemeral copying, which represents Canada’s very limited exception for text and data mining activities, to the extent that it applies.
    Two, general-purpose AI systems like ChatGPT must be transparent about the materials used for training purposes. They should provide a description of information on the data used for training, testing and validation, as well as how this data was obtained and selected.
    Three, providers of general-purpose AI models must be required to put in place a policy to respect Canadian law on copyright, including obtaining consent for text and data mining purposes. Any provider who makes available a general-purpose AI model in the Canadian market should comply with this obligation, regardless of the jurisdiction in which the AI training takes place.
    I'll turn it over to Sam.

  (1125)  

    Our European counterparts were able to secure these crucial rights to protect their cultural industries. There should be no reason the government can't provide the same level of protections in Canada. Large and well-funded platforms like Google, OpenAI and Microsoft should be required to operate on the same playing field in Canada as they will have to in the European Union.
    The Canadian government must ensure that creators are fully empowered to exercise their rights and make informed decisions. We believe that, unlike the provisions of the EU AIA, which include an opt-out regime associated with a text and data mining exception, Canadian creators and rights holders should benefit from an opt-in system to license their works.
    An AI tool cannot originate artistic work or supplant human creativity. The value of creativity is not captured or understood by the AI processes. Despite claims by the operators of these tools that their use is transformative, the reality is quite different. Generative AI tools do not genuinely transform. Instead, they exploit and launder the creative works they mine. It is imperative that authors receive fair compensation for the use of existing works and for all future uses.
    Members of the committee, we thank you for your time. We would be pleased to respond to your questions.
     Thank you very much.
    Now I will move on to Mr. Rogers.
    The floor is yours.
    Thank you for this opportunity to discuss Bill C-27 with you from the perspective of Canada's major music labels.
    Creating the rules of engagement for AI comes at an important time for the music industry, both here at home and abroad. I want to say, off the top, that our industry is already making use of positive elements of AI as a tool to help artists make more intriguing and interesting music, and using it, again as a tool, to help connect artists, including Canadian artists, with fans all around the world.
    Those aspects are, of course, not the central domain of Bill C-27 or the reasons why there needs to be further regulation. I will dedicate the rest of my time to telling you where you can help us most.
    As we saw with illegal downloading in the previous generation, the use and ownership of music is a valuable canary in the coal mine. Since then, we have learned the importance of regulating technology for its practical and common use rather than building exceptions into our laws and economic frameworks for corner cases. We've learned that the value of music and other forms of creative expression cannot be sacrificed to the drumbeat of technological revolution, and we've learned that quality, safe and licensed music is as popular with music fans as it is with the artists who are paid when their music is played.
    That is why Music Canada is supportive of the efforts made in Bill C-27 regarding the regulation of generative AI.
    There are three places where we would encourage you to go even further.
    The first involves the need for AI developers to maintain and make available records of the material that was ingested and used for training. Much of the economic framework for the industries that will be affected by the further flourishing of AI requires that everyone understand what the AI is trained on. In order to truly understand that, developers must keep these records.
    You will hear from the most excited proponents of unharnessed technology that this request is somewhere between missing the point and being impossible. I ask that the committee think about it in this way: If AI has the potential to cure diseases, design new and better cities for the future, and make travel plans for busy MPs a little more doable, then surely it can generate a spreadsheet or write a bibliography.
    The second place the bill can go further is in requiring the labelling of solely AI-generated images and videos, especially in cases where they impersonate an individual. Right now, today, we are standing at the edge of the uncanny valley with AI. Once you learn what to look for, you can understand that the image of the pope in the white puffy jacket is not a photo of the pope, but this technology will never be worse than it is today. Every day it is getting better and, in many ways, more dangerous when it comes to the powerful potential for deception and misinformation. Requiring labelling is an important step towards addressing this.
    The third is with respect to the need to address deepfakes and voice clones as a threat and to prepare our legal system so that we can all agree that the production of deepfakes and voice clones without the consent of the cloned person is wrong.
    As elected members of Parliament, you know that it takes a lifetime to build the reputation that brings you to this House of Commons. You also know that it takes just one moment for that to come crashing down. Increasingly, this is a fact that people across all professions, livelihoods and ages are coming to grips with in the face of the proliferation of deepfakes and the ease with which they can be produced.
    Abacus Data has found that exposure to deepfakes is common and that Canadians are worried about the risks. One out of every two Canadians has mistaken a deepfake for a real video. It's worse for younger Canadians, because 77% have been deceived and 15% say that it happens all the time. Canadians are worried about the effect of deepfakes on artists, political leaders and business leaders, but 79% of Canadians worry about it for themselves too. Almost unanimously, 93%, Canadians agree that there should be a right to prevent these impersonations.
    Now is the time to strengthen Bill C-27 and all of our laws to ensure that antiquated analog laws, once designed to protect celebrities' images from being used without their consent in magazine ads, are ready for the digital realities everyone faces today.
    Now, some will ask: What about free speech? When it comes to deepfakes, the answer is simple: Putting your words in my mouth is not free speech.
    What about parody? Deepfakes aren't parody. They don't mimic with deliberate exaggeration for comic effect. They are done to deceive, misinform and steal one person's character for the advantage of another. We should make clear that in 2024, in a digital setting, that is illegal.
    I thank you for your time and I look forward to your questions.

  (1130)  

     Thank you very much.
    Thank you to all of you.
    Now, to start the discussion, I'll turn it over to Mr. Williams for six minutes.
    Thank you very much, Mr. Chair.
    Thank you to our witnesses. This is always an important discussion, and it's great to hear from each one of you.
    I want to start with the testimony from Madame Desrochers today.
     You had two great recommendations at the end. I'm wondering if you could repeat them and elaborate on both.
     Then I'm going to ask other witnesses to comment on those recommendations for proposed subsection 7(2), I think it was, and for proposed subsection 12(5).

[Translation]

    We can also provide you with more information in writing.
    These recommendations are consistent with some of those prepared by my colleagues. I'd be pleased if one of my colleagues wished to comment.
     Bill C-27 currently provides that data must be retained, but we think that's not enough and request that data be both retained and made available to the public.
    Second, to make the Copyright Act more effective, we request that this be clearly stated in Bill C-27 in order to clarify certain obligations of transparency.
    Thank you.

[English]

    For other witnesses who want to contribute to that, I think what we're talking about, then, is sourcing data. It seems, from all testimony, that everyone is in agreement that we want to see that.
     Now, the question is whether that's in the Copyright Act, with other consultations now, or in AIDA itself. I understand that it's going to be in AIDA. There are certain parts that point to the Copyright Act, but that's what we want to see.
    Do any witnesses want to talk about that or comment on that concern?
    Yes. Thank you for the question.
    Our view is that, while the Copyright Act offers a number of protections, when we look at the definition of “harm” within AIDA as it exists there currently, it is limited to individuals. Copyright law applies to individuals but also collectively, and we don't think that the current text offers enough protections in terms of copyright for rights holders, for creators or for authors.
     We would like to see a specific policy and a specific provision that would indicate that copyright law applies to general-purpose AI systems.
    I know what one of the biggest concerns is. I mean, we have seen that ChatGPT has revolutionized and publicized AI and what it's capable of. Of course, we're talking about what's going to happen to creators in the music industry.
    Mr. Rogers, there's an AI Instagram site called “There I Ruined It”. It takes certain music from one creator and merges it with another's. This morning, it was Hank Williams singing Still D.R.E., by Dr. Dre. They're able to take the music and bring it to another.... This is a parody site. I think they do a good job.
    How is the music industry going to be transformed if we don't change the laws and make the correct changes to the law when it comes to music and the future of music in Canada?

  (1135)  

    Mr. Chair, I thank the member for the question.
    Look, I think that ultimately it's very important for us at this stage to remember that the technology that can amuse us is the same that can terrify us. That site in particular is of interest, and it can make us laugh.
     We may one day be in a position where those things can be licensed, but when those things are stolen, basically, the information and the data needed in order to create that is stripped from the Internet and copyright is infringed. All artists who are reproduced in that way have had it done without their consent and without compensation, and it is of great concern to us.
    I think it is very important, as we reach out in this sort of first try, that we keep some of those core principles in mind. We know that we wouldn't accept that in any other fashion. The fact that it's done rapidly in large quantities doesn't make it any more legal.
    Are you, then, on board with other recommendations that the source be published and that it be made public?
    Yes, absolutely. The only way copyright works is if we know what's in it. The only way we can know what's in it is if it's made broadly and widely available.
     As I said in my opening statement, even if that's a really big spreadsheet, that shouldn't concern anyone.
     Ms. Noble, you said that you represent 30,000 creators. I'm sure that a lot of them are small businesses, local artists who write a jingle for ads or who create their own music and aren't part of the bigger conglomerates. What can we do to protect those local creators? What parts of the act can we make sure are written for not just the big Metas and Googles and Amazons but also the local creators?
    Our members are actors. They are performers. They are in film and TV. They are on video games. They are very much subject to the deepfakes and abuses that are taking place. What we want to bring to light here is that when it comes to actors and performers, we are kind of on the outside looking in to the protections of copyright. We take a step back and say that the first problem for actors is that they don't have moral rights under the Copyright Act to protect them to begin with. Musicians got moral rights when Napster came in a couple of decades ago and started stealing their music. Today we have the same situation with deepfakes stealing the image of performers. We don't have moral rights in the Copyright Act. We say that needs to be taken care of right away.
    When we look at Bill C-27, we also see that under the definition of harm, you have “physical or psychological harm”. I would suggest that if we have to prove psychological harm, we'll have to get the DSM out, put it down on a table in a court of law, and explain the condition that was produced. That can't be the level for a performer. Then there's “damage to an individual’s property”. As I just told you, we don't have copyright protection. Now we go to the third one, “economic loss to an individual”. What we have to understand is that performers are precarious workers. Every day they audition for the next job. It might be one day on set in a series or a film. Every day they have to look for that job. How do they prove that they didn't get that role in Law & Order? How do they prove that as an economic loss?
    We need to have the damage to an individual's reputation. Eleanor Noble makes a living based on what you see here. Her name, her image, her likeness, she caretakes that with everything she does, whether business or personal, because she knows that her next gig relies on it, and yet she is subject to the deepfakes that are happening out there.
    Our data is easily captured now. Everything is streamed. Everything is on your phone. It's on your computer. It's readily available to be grabbed and used or misused. We really need this committee to take a look at the impact for performers in this country.

  (1140)  

[Translation]

    Thank you very much.
    Mr. Turnbull, go ahead.

[English]

    Thank you to all of you for being here.
    I really appreciate your raising your voices in this important conversation. You all represent incredibly important players and members who are across our creative industries and whom I think we're all deeply concerned about when it comes to AI and the harms it can cause to individuals who are earning their living and who, in many cases, I think, develop that reputation over many years and with lots of hard work. I empathize with all of your positions. I've read a lot of your submissions and materials in advance. I really appreciate your being here. Let me just start with that.
    One thing I keep hearing is conversation about the intersection with copyright. That's fair enough. I get that there are intersections between Bill C-27 and copyright, although we know that Bill C-27 doesn't deal with copyright. The Government of Canada is holding consultations and round tables; it has held seven round tables already.
    I want to start by asking each one of you—maybe one representative from each group—if you have been consulted and are participating in the copyright consultation process. The Government of Canada is looking at whether this conversation merits amendments to the Copyright Act as a separate process, but not involved in the scope of this bill.
    Ms. Noble or Ms. Kelly, has ACTRA been involved in that consultation?
    Yes, we have. We have been speaking and consulting on that. I have to say that we would very much like to have a meeting with the minister. We haven't yet been able to have that meeting. We are a bit concerned that the arts industry seems to be left out of that conversation. However, we have met with staff.
    Our intention is to include you, I can assure you of that.
    Ms. Hénault, go ahead.

[Translation]

    We were involved in the consultation on copyright in the era of generative artificial intelligence. However, like our European counterparts, we think that copyright legislation must be reviewed. We also think that the AI framework legislation should include provisions that protect copyright. It's really important that Canada include this clear message of support for its culture in the AI framework legislation.

[English]

     I will come back to that comment, but thank you.
    Ms. Desrochers, go ahead.

[Translation]

    As part of the consultation on copyright in the era of generative artificial intelligence, we also submitted a brief on behalf of all members of our coalition. As Ms. Hénault just mentioned, we believe that issues can be resolved in the Copyright Act. However, we're seeking very few amendments since it's already robust enough to provide an AI framework.
    However, some provisions must be added to Bill C-27, including an obligation to retain the data used to train AI systems and make it available to the public in order to permit the authorization and remuneration of copyright holders. That must appear in Bill C-27. There really is a connection between the two acts.

[English]

    I'm going to come back to that, but thank you.
    Mr. Forget, go ahead.
    Thank you very much for the question.
    We're talking about the intersection between the two acts, so—
    I'm sorry. I don't want to interrupt you, but I just want to know whether you participated in that consultation.
    Yes. I should have started with that.
    That's great.
     I want to ask some more questions, but do you want to make a quick comment? It was supposed to be a quick answer.
    Sure. The quick answer is yes.
    I can wait for you to come back.
    Okay. Wonderful.
    Mr. Rogers, go ahead.
    The answer for us is yes. We took part in the consultation. Music Canada shared the position that, of course, the ingestion of music relates to copyright and, of course, the use of AI relates to copyright and that none of these frameworks will work without that being the case.
    Yes, we know there's an intersection here, obviously.
     I think, Ms. Desrochers, you made a very good comment about how the two work together and how more requirements for transparency within Bill C-27 would actually help copyright to apply to the creative industries. Most of you are nodding your heads, so I take it that you agree with that.
    Don't you think Bill C-27 and the amendments proposed make several steps in the right direction? Can we maybe start there and then ask whether we need to go further? From my perspective, in terms of the identification of AI-generated content and strengthened enforcement, it has made some significant headway.
    Mr. Forget, would you agree with that?

  (1145)  

    I think there's still some work to do here. Just to finish my comment from earlier, I think it has been echoed that at the intersection of the two is that copyright is a framework for ensuring compensation and ensuring the application of moral rights and so on. In the context of AI and the uses of copyrighted material, we can create a framework for when that happens and how it happens.
    For example, one of our priorities is ensuring that the input that forms the basis of material that the AI systems draw from.... Let's be clear, they are existing works that were authored by DGC directors and other creators in our ecosystem, as the case may be, depending on the medium. It's important to effectively apply the requirements of copyright to ensure consent, compensation and credit and to relate those to AI.
    Therefore, we need to strengthen the AI side of the equation to ensure more transparency in the data that's used and more proactive “opting in”, where there's consent for that, to ensure there are opportunities for organizations like ours or for the individual artist to pursue the compensation they are entitled to.
    Thank you.
    I really agree with your general point.
    Unfortunately, I'm out of time. I wish I had more time to go back through all the panellists. Maybe I will on a future round.
    Thanks, Chair.
    Thank you, Mr. Turnbull.
    Monsieur Garon, go ahead.

[Translation]

    Thank you, Mr. Chair.
    Thanks to all the witnesses.
    I'll begin by informing you that the questions I'm going to ask you, like those of my colleagues, were prepared using a natural intelligence algorithm and that it all worked very well.
    Voices: Oh, oh!
    Now let's get serious.
    Ms. Desrochers, I'd like you to tell us about the unauthorized use of artworks because there is a lot of talk about artists and monetary compensation. In the public's mind, if you propose to make available a list of the works used to train generative AI systems, that's necessarily because the artists want to be paid, which would be entirely legitimate.
    Don't artists have a fundamental right to control what happens to their works, to refuse, for example, to allow those works to be used in a film production or adaptation? Why would that be different with artificial intelligence? Why are those models currently allowed to use all of that free of charge, without consent and without anyone even knowing?
    Thank you for that very good question.
    Models currently aren't allowed to use any of that free of charge or without obtaining consent. We're simply asking that this continue to be the case and that no exception be added to the Copyright Act that would enable text and data mining.
    Second, we're asking that the conditions be established that would permit the development of a licensing market in which rights holders can authorize or prohibit the use of their works if they so wish. If they agree, they may simply be paid for the use of those works.
     Are there any exceptions like those for data harvesting or mining?
    Yes, there are in other jurisdictions.
    Where are they, for example?
    There are some in the European Union, unfortunately. When I speak with my counterparts from the European coalitions, they warn Canada not to make the same mistake they did. It comes with many more complex issues.
    Our legislative system around copyright is a system based on positive consent, and we are therefore protected. However, with an exception such as this, which is based on negative consent, it's up to rights holders to withdraw from a system that, by default, permits the use of their works. As a result, that burden is too great for creators, and I have attended conferences where some of them said they had spent hours withdrawing their consent on numerous websites.
    We think that's the opposite of what we want. People who allow their works to be used must be able to grant permission and to allow a copyright licensing market to emerge.
    Thank you very much.
    You said there has to be transparency for artists, the market and everyone else to be able to make informed decisions.
    Imagine I've written a novel, which I assure you I haven't. If I want to know whether an algorithm has used my novel as a source of inspiration, or whether it has read and copied parts of it, in other words, whether an algorithm has had access to it, what resources do I have at my disposal?

  (1150)  

    Well, I—
    I see Ms. Hénault could also respond.
    If I understand your question, you want to know if the novel you've written has wound up in the algorithm. Is that it?
    Is there some way for me to know; is there a list?
    No, it's very difficult. The industry is now criticizing the lack of transparency in the creation of systems. As I said, things will change in Europe because there are very clear obligations of transparency in favour of individual rights in connection with the creation and use of works.
    We aren't necessarily seeing that in Bill C-27.
    That's correct. The bill has to be improved, as Europe is doing in this field.
    I'm going to play the devil's advocate and tell you that AI is a creator in the same way as a human being, that it exhibits creativity and that it draws inspiration from all kinds of sources.
    No artist works in a vacuum. Someone who writes a book has definitely read a lot of novels, 1,000, 2,000 or even 10,000, every one of which has been a source of inspiration. However, if an artist publishes a book, the use of the books that he or she has read for inspiration will not be considered unauthorized. So it's hard to understand why it would be different for a machine that, ultimately, is also a creator.
    No, the machine isn't a creator; a machine is driven by a programmed piece of software to produce results, and it sometimes produces bizarre results because it is a machine, not a human being.
    Since companies developing these machines negotiate licences in exchange for compensation, their data suppliers must also be compensated. That's why a licensing market is developing. There are agencies that specialize in licences for text and data mining and in rights-free data.
    If I'm following your logic, you think that an artwork created by generative AI from other works is per se a kind of amalgam of adapted original works.
    It's a product, first of all, and that product, created by a machine, can give rise to an infringement claim. If a work is recognized as protected and its use hasn't been authorized, that poses a problem for the user.
    When you buy and read books that inspire you, copyright makes that possible: the fact that the book has been put on the market means you can read it. However, our members constantly negotiate licences for all kinds of services in both the technology and printed book fields.
    Text and data mining can be done on works whose use has been authorized, with or without remuneration, and on works that are in the public domain. For protected works, licences must be acquired, just as for any service one wishes to use, for which one must read the conditions, pay and then search. We believe the same is true of intellectual property.
    Thank you, Ms. Hénault.
    You're quite welcome.
    Thank you, Mr. Garon.
    Mr. Masse, the floor is yours.

[English]

     Thank you, Mr. Chair.
    Thanks to our witnesses here today.
    Last week was really interesting. We had Google, Microsoft, Amazon and Meta here. I, quite frankly, was shocked by the fact that we had a panel in front of us that had been fined in the multi-billions of dollars across the world, yet we haven't had any of the same kind of oversight here. They also have the distribution rights to many of the works you actually perform here.
    Since that time, in fact, Microsoft, which was here, has been challenged in its takeover of Activision, which affects many of you and the people you represent. They have now identified that they're going to lay off a whole bunch of people at Activision, when they previously said they wouldn't, so the U.S. is taking stronger steps there.
    I would just like to go across the panel right now because we have to decide on this bill, which basically moves a lot of stuff to regulation. At best, it will be implemented in probably three years' time, or we can rework it across the board in terms of starting almost from the beginning. That's also because the government is unwilling to separate the Privacy Act aspects of this, where I think there's quite a lot of common ground, from the AI stuff.
    Maybe we'll start with ACTRA here and go across.
    Should we start over, or should we try to continue to work? I'm on the fence on this. Quite frankly, I was really disappointed with last Wednesday's.... I've never seen a panel, in all my years here, where we literally had companies, representing the influence of so many Canadians, that were fined and paid those fines—and lawsuits—including to other governments across the world, for billions of dollars. We've never had a panel like that, and yet they walked in and walked out of the room, just like we were nothing at all. They sent in government relations people, including people recruited from past government teams.
    Do we go ahead with a process that exposes us to potential regulation that's devolved from Parliament in many respects—to be updated—or do we try to rework things and put Parliament back in the front seat?

  (1155)  

    From my perspective, it's not okay for government to do nothing. Across the globe, governments are struggling with this issue, and we do appreciate that this is a very difficult issue that touches on so many different industries, on business as well as people. We appreciate that it's difficult. We also believe it's going to be a patchwork of protections that are going to come in. It can't just be Bill C-27.
    Growing up doing some lobbying in my past life, I was always told, “Get what you can now because government's not going to revisit this for another decade.” That can't be what happens here. We have to move forward as best we can, at every opportunity we have, on protections for Canadians, for workers, for our society.
    What I would say to you, on Bill C-27, is that we support the intention to ensure that consent is required for biometric information. We understand that it's going to start to protect name, image and likeness, but you're hearing us say, even on Bill C-27, that it doesn't go far enough for performers. We need greater protections within this bill, but you have to move. I would just say to the government, you have to move with speed.
    I want to make sure everybody gets in on this.
    Thank you. I got that.

[Translation]

    This is a big question. I have to admit I haven't stopped thinking about it. I read this legislation in a rush and wrongly thought it was more about personal data. We have some suggestions for improving it.
    As my colleague Ms. Kelly said, I think we have to legislate, but we also clearly have to improve this bill so it genuinely protects Canadians in the AI era.

[English]

    Thank you.

[Translation]

    I too would say it's important to act to provide a framework for artificial intelligence. The cultural sector may be a bit late to the party, but it's never too late. We're bringing specific proposals. We're reaching out so we can really continue the discussion with you and come up with specific measures that will truly take all the interests in the cultural sector into consideration.

[English]

    Thank you for the question.
    My answer is going to be, yes, we should be moving forward, but the context is moving so quickly. Bill C-27 was drafted before we had the impact of generative AI in the way we see it now. It was only a little over a year ago, with my elected board, that this switched from being in the background to front and centre.
    I can echo some of the comments you've heard. In our own surveys of our membership, who work not just as directors but across 50 different job categories, it impacts them in different ways, and it impacts them profoundly.
    This is a major concern, so moving quickly but making the improvements, some of which we're happy to be discussing here today, precisely to be able to protect creators, is really important.
    Move forward in a thoughtful way, but try to do it quickly. That would be our advice.
     Mr. Masse, thank you for the question.
    Look, I would encourage everyone to continue moving forward with Bill C-27 in its original state, which was a framework for all of these other pieces to hang on. I think that if I were you or any member on this committee, I would go to caucus on Wednesday morning, go to the microphones and say, “I heard really scary things about deepfakes and we have to do something on that now.”
     If it takes longer for Parliament to work through Bill C-27, that's fine, but I think there are some actions you could take right now to take real, meaningful action for our industry and, in fact, for all Canadians.
    Thank you very much.
    Mr. Perkins, the floor is yours.

  (1200)  

    Thank you, witnesses.
    I would like a yes-or-no answer from each one of the groups to start off.
    Before this bill was tabled, almost two years ago, were you consulted? It was tabled in June 2022. Were you consulted before that on this bill?
    No.

[Translation]

[English]

    No.
    I don't mean to be a stickler, but there was a public AI consultation in which most of the industry said they were not ready to talk about AI.
    Almost every witness we've had here over the last few months was not consulted beforehand.
    The minister made a big deal about consulting with 300 groups after the bill was tabled. I just looked through that list, and only one of you has been consulted, according to the list that was tabled with this committee. Is that correct?
    Since then, have you met with the minister and been consulted on this bill?
    No.

[Translation]

[English]

    No, you're correct.
    No.
    We've spoken to staff.
    Music Canada is the only name I see on the list, so it's not surprising to me that he got this part of the bill so wrong when he didn't talk to the groups most involved.
    When I buy a book or a company buys a book, or when I buy a ticket to a performance or download a song, post-Napster, I pay to use it for my personal use or for that particular thing within the company. If a university wants to photocopy part of a textbook, it has to pay for copyright because it is using that for commercial purposes. Is that right? It is paying for that right.
     Have any of you, or any of your organizations, been paid for any work of your members, artists and writers that's been used by OpenAI, ChatGPT, Microsoft or AWS? Have any of you been paid for any of that work so far?
    Not that we're aware of, but we wouldn't know, because we don't know whom they've used and what images they've stolen.

[Translation]

    As far as I know, that's not the case.
    As far as I know, that's not the case either.

[English]

    No. We haven't been able to identify them.
    Obviously, in the music industry space we've been through this path once before, so we have made major efforts to license music in the best ways we can, but in many of the ways you've listed off, those would not be licensed yet.
    Copyright law applies, but they're not paying for that. They're obviously building their large language models on the work of artists, writers, performers, musicians—artists of any kind—but you haven't been paid, and they're making money off what you're doing.
    Is what you're saying here that the Copyright Act isn't good enough and that this needs to have specific provisions that a large language model needs to abide by for any input that is used to develop it? They need to pay, just as I do. If I buy a book or go to a performance, I have to pay for the right to see that or read it. Are you asking for that in this bill?
    First of all, actors need to be given moral rights in the Copyright Act. We do not have that.
     Second, we need to be able to give consent if they want to use our image. We need to have control over the way it's utilized, and we should be compensated for it.
     All of those need to be taken care of.
    Mr. Rogers, I see that you have your hand up.
    Yes. Thank you for the opportunity to comment on that.
    You can go about it either way. Either you can say that there are no exceptions for AI, that AI is like everything else, and you can do it in a bill like Bill C-27 and go back and reference the Copyright Act, or you can make the change in the Copyright Act and say that this is the case.
    We didn't create copyright for the printing press. We created copyright for Dickens and the recognition that the work was worth more than what you paid for it right away, and we extended the term of copyright for sound recordings because people were starting to live to the point at which they could hear their song on the radio and not get paid, so we made that change.
    If we say that we know they're scraping our stuff, and we know that's a use—it's of value—we can just agree now that that's the case and get out of those sorts of fun academic conversations about “I don't know. Is it a copy?” I know it's a copy. I know they're taking it because our stuff is a thing of value.

  (1205)  

     Thank you.
    We've had some discussion on deepfakes and intimate partner images.
    Do you believe this bill should make it a criminal offence to do a deepfake or intimate partner image?
    I'll start with ACTRA.
    I'll ask witnesses for brief answers, because we're out of time.
    As we said, this is going to be multi-faceted. Yes, the Criminal Code should apply and be able to deal with egregious situations of stealing people's images and embedding them in graphic porn. You know, anything that....
    It's anything without their consent.
    That's correct.
    I feel very strongly about this. We're here as a sort of cultural industries panel, but, members of Parliament, if you think about it as people.... In music, we call it VNIL: voice, name, image and likeness.
    It doesn't have to be the extreme example. It doesn't have to be pornographic. It doesn't have to be a prime minister. It could be my daughter. It could be your words in her mouth that she has not consented to. I have grave concerns about that.
    I would ask you all to take immediate action on it as soon as you can.

[Translation]

    All right, thank you very much.
    Ms. Lapointe—

[English]

    I'll ask Mr. Forget.
    Yes.
    Mr. Forget, do you have something to add?
    I'll keep it brief.
    Yes, I think it's in the public interest to keep the public safe. The exact form it takes remains to be seen, but the answer to your question is yes. There should be a mechanism that prevents these types of harms.

[Translation]

    Thank you very much, Mr. Forget.
    Go ahead, Ms. Lapointe.
    Thank you, Mr. Chair.

[English]

    I'll be sharing my time today with my colleague MP Turnbull.
    My first question is for Mr. Rogers.
    I noted that one of the suggestions you made in terms of amendments is that AI developers be required to maintain full records of the data used in training and for ingestion copies. What are you hoping to achieve with this amendment? I want to deepen my understanding on that.
    Again, I take the position that of course it's copyright. It's not even debatable to me. I know some people believe otherwise, but I don't understand it. In order for the rest of the copyright framework to work, I have to know what it was trained on, so the people who rightfully deserve to be paid for that training can be paid.
    Could I very quickly add something I want to say in response to Mr. Garon's questions?
    Two years ago, when this was science fiction, it was easy to imagine a space-robot head learning all the music and writing its own music. Now that we see the general application of it, it is ripping off bands you know and love with new songs that are rip-offs, or generating an image with copyright symbols in it because they ripped off a photo with a copyright logo in it. We know that it's stealing and scraping.
    Okay.
    My next question is directed at both Ms. Kelly and Mr. Forget.
    In terms of intellectual property, privacy issues and creative control, how can legislation outside of the Copyright Act help in terms of generative AI bad actors?
    I think it's going to be a lot of threading together of different things.
    Copyright is key. You are hearing us say that as actors. I think you need to have protection on the data you're looking at in Bill C-27. I think it's very important for us to look at how it's scraped and what they're doing with it. We need to have knowledge about where this data is coming from in order for us to even be able to trace bad actors—and good actors who just happen to take it and may not know.
    We're looking at things like this: What are you going to do with a worker who has their data taken from them by their employer so they can generate a program—say, a training session, etc.? Why not put something in the Employment Standards Act that protects all workers against having their name, image and likeness taken without consent, control and compensation?
    Privacy laws have to be increased so we have those protections.
    I'm sure there's more than that. This is going to be a patchwork.

  (1210)  

    Okay.
    Mr. Forget.
     I think it's a good illustration of the dichotomy of individual rights versus collective rights. We made the point a little bit earlier about the extent to which we don't think there's a comprehensive view on what we would call collective rights. We're in the business of negotiating collective rights, so we see this all the time.
    I'd say, in terms of bad actors, that one of the remedies to bad actors is encouraging good actors. The way you do that is by having order in the marketplace and by having a predictable marketplace where you have music, film and TV widely available in a way that is affordable so that customers can engage and buy. That's how you discourage bad actors.
    I'd like to leave you with one thought that I think is relevant here. We talked a lot about the extent to which the problems that arise with.... By the way, one of the direct answers to your question is that one of the ways we can discourage this is to prevent the dilution of value by ensuring that those players who are maybe not bad actors but are surfing off others' existing work to create new things acknowledge it, have the consent and have a model. There's an economic imperative here, too. We're happy to—we make these agreements all the time—sit down and negotiate what a licence agreement may be, and we're seeing more of that happening, so it's obviously possible.
    My last comment would be to point out the perverse logic. The same entities that are busy mining copyrighted works to create something new want to disregard the copyright on the input but then seek the protections of copyright on the output. I want to point out that—just to get it on the record—humans are the creative drivers here, not software. Also, coming back to make the connection with an orderly, predictable marketplace, it's problematic to have material that you have not licensed feed into something that isn't made by a human.
    I guess the question is this: What becomes of that in terms of...? When I think about the work that our members do, I see that it's millions of dollars of investment in creating film, television and digital media. There's a lot at stake, and investors are going to want to know that they have a path to be able to exploit those works and generate revenue.
    I'm sorry for the long answer, but there were many parts to the question.
    It was very good testimony. Thank you.
    I'm sorry, Mr. Turnbull, but we might have some more time at the end. Bear that in mind.

[Translation]

    The floor is yours, Mr. Garon.
    Thank you very much, Mr. Chair.
    Ms. Desrochers, witnesses have appeared before our committee and told us we should perhaps have a federal government registry to increase the security of the environment in which all kinds of artificial intelligence models are deployed. When they have a high-impact model, companies would have to hand over its code and provide a risk mitigation plan.
    Getting back to cultural diversity, what's interesting is that the representatives of the Googles, Apples, Facebooks and Amazons of the world who have testified here defined high-impact and high-risk models involving people's health, safety and, I believe, integrity. Do you think cultural diversity should be included in this definition? If so, how could it be operationalized?
    I think that cultural diversity, which we call the diversity of cultural expressions, should be taken into consideration. I could give you an expanded answer to that question, but we should definitely reflect on indicators and ways of measuring the impact of the development of those systems on the diversity of our cultural expressions.
    There are concerns about the fact that the more culture is produced globally—I think we could say there's a centralized culture—the less it naturally reflects diversity.
    Here's a thought: suppose that, here in Canada, we went much further than our partners in protecting copyright, photographs, literary works and so on. That would obviously mean less material to train the AI models, and our works would therefore be included in AI results to a lesser degree.
    Consequently, if we overprotect our works, wouldn't that be a factor causing Quebec culture to disappear from global culture?
    That's a good question.
    First, even if all our works were processed by machines, they would still constitute a minority of all the information those machines process. Consequently, I don't think that would be enough to protect the diversity of our cultural expressions or to adequately reflect our culture in those models.
    Many people are now examining the issue of minority languages and cultures. All kinds of projects are being developed to determine how AI can help propel those minority cultures or to ensure that they're protected.
    Lastly, we can consider the possibility of putting innovative solutions in place to ensure that our culture continues to occupy its position in an environment where AI has been installed, but while retaining control over our data and stories as much as possible. All kinds of proposals are currently circulating.
    It's acknowledged that the development of AI reproduces the dynamics of domination and hegemony that we already see in the environment. Consequently, we shouldn't sell our available data cheaply, without consent or in conditions we don't control, and hope that Quebec culture is suddenly better represented in AI systems.

  (1215)  

    Thank you very much.
    Mr. Masse, go ahead.

[English]

     Thank you, Mr. Chair.
    It's been interesting in terms of what we've heard from a number of different witnesses. A lot of times, they're telling us to get something done, and then they're also telling us to make sure we're consistent with the United States and Europe to some degree. I don't know how we'd do both.
    Perhaps I can go to Ms. Noble on this. We've talked at a high level here about how you can be exploited by AI, but I know that ACTRA represents child actors as well. I'm wondering what vulnerability looks like under the current model, even outside of AI: going in to try to get a contract, and giving up the rights to your image and your voice and all the other things that can be captured even without AI, given how sophisticated recordings already are. How scary is that for the future, and how might it disempower more people if we don't get it right?
    We would completely lose our livelihood. We would just be replaced. We're already seeing that threat at our heels. We've heard of background performers being scanned when they're going on set, being brought to a separate room to be scanned without their knowledge or proper informed consent. Then they no longer have a full schedule of shooting. They've lost a lot of work, because they're already being replaced with AI and not being compensated for it. This is at our heels. We've already heard the threats in Marvel movies with big Hollywood stars. It's happening to them as well. They want to scan them, replicate them and not have to use them in sequels.
    This is a huge threat. We spoke about dubbing and how this is a huge market, especially in Quebec, when it comes to dubbing Netflix series. This will all disappear overnight. The technology has already been made. Thousands of performers, specifically in Quebec, will completely lose their livelihood overnight. I myself have made my living off it for 30 years.
    There are numerous areas. Every threat.... We don't know what happens now when we go into an audition. Most of our auditions are self-taped. We send it off into the cyberworld. We have no control over where it goes or what will happen with it. Those are just simply auditions. We're not paid for auditions in the first place. If it's utilized in any other way—to scrape or scan or use in nefarious ways or do whatever—we have no control over that.
    I was at a couple of conferences in the United States this summer as part of my Canada-U.S. work. Even some of the companies were talking about how they're trying to fix the ethnic and cultural biases in the input going into artificial intelligence and in how they're building their models. They admitted that there are major deficiencies.
    I guess what you're saying is that the information that's now collected on the artist could then be replicated and used in biased representations across multimedia platforms for generations, and the person could still basically be walking around there.
    It's similar to what you said, Mr. Rogers, with regard to the artist. I thought that was really interesting, because you're right. I was here for the copyright review. Part of it was that they could literally hear themselves, because they're living longer. That can also be a legacy of the person.
    Just quickly, I know that we all sign contracts sometimes where we give away our privacy and it's all mishmash and stuff like that. Is it the same in the industry? Do artists have to figure out what they're giving up with these long forms and everything else at the last moment? Is that kind of vulnerability out there?

  (1220)  

     That's a good question.
    I would like to give you a little bit of insight into the life of performers. Number one, they know very well that if they are difficult, or perceived as difficult, on the set, that will get around, and they won't get another job. Performers show up wanting to please the director, the producer and the people on the set. Far too often they show up at 6 a.m. for hair and make-up and are handed a thick contract and told, "Just sign this, or else." They're not lawyers. They don't have a lawyer with them, but they know the reality: if they don't sign away whatever that contract asks them to give away, they're not going to get that job, and maybe not another one.
    They're precarious workers who really have to be concerned about their next job. They can't be the ones holding out for the rights they should have in this society. You've heard about the struggling performer or the struggling actor who has to have a second job, often in a bar or a restaurant. That's the truth. They can't pay the rent on the income they make working in the job they love, and then they have to face the realities of being perceived as not easy to work with.
    They sign these contracts, and they don't know what they're giving away.
    Thank you, Mr. Chair.
    Thank you very much.

[Translation]

    Mr. Généreux, you have the floor.
    Thanks to the witnesses.
    Ms. Hénault and Ms. Desrochers, are the Canadian francophonie and Quebec francophonie in danger of literally disappearing in the future?
    I know that's a big question.
    Yes, it is a big question.
    Speaking as the representative of the Canadian coalition, I think we're here because we're working hard to protect and promote cultural specificity across Canada, for both the Quebec francophonie and the minority language communities.
    The Canadian and Quebec francophonies are definitely at risk. That's why we're vigilant and why we're here today to call upon you. We need to take action on several fronts to protect and promote the diversity of cultural expressions.
    Ms. Hénault, before you respond, I'd like to remind you that earlier you said that Canada mustn't become a banana republic. Do you view Bill C-27 as the bill of a banana republic?
    That's not what I said.
    I'm glad you asked me the question, and I thank you for it.
    Since publishing is a global industry, we have frequent discussions with international partners. However, our foreign counterparts are at times surprised to see that Canadian copyright legislation lags behind the rest of the world in all sectors.
    The purpose of the European directive is to protect cultural expressions, which I believe is one of the objectives of Europe's artificial intelligence legislation and also a boon to the francophonie and to all languages. However, it's even more important, in an AI context, to have good public policies to support minority cultural industries.
    English is obviously a dominant language that travels more easily than others, but that's one of the challenges for Canada's anglophone market because the large American market just next door competes with it.
    As for the francophone book publishing industry, Quebec's public policies have truly promoted its development, unlike other cultural industries, and the numbers are there to show it. A 50% market share, a very good number, has been achieved as a result of Quebec's and Canada's public policies, which have been foundational for the development of the book publishing industry.
    However, those policies must clearly be updated and modernized. The bill before us is an opportunity to help Canadian culture to continue emerging.
    Now I'm going to speak to everyone.
    On several occasions, many of you have discussed interoperability with what's being done elsewhere in the world, particularly in Europe and the United States. Do you think Bill C-27 goes far enough, even though it was improved by the amendments the government proposed? Considering the answers you've been giving from the start, that doesn't seem to be the case.
    To ensure your respective organizations remain viable, do you think it's important that Bill C-27 include the elements you're proposing?

  (1225)  

    We're ready to work very hard with you to improve this bill. The providers of AI models, both general-purpose ones and the big generative AI models, must be subject to the same cultural obligations as those Europe is introducing. We don't want to have European francophone culture in Quebec. We want our own culture to emerge. In any event, we're all in favour of respectfully regulating AI technologies for the common good.
    Last week, Mr. Bengio came and told us we had to pass this bill quickly despite its imperfections. However, many other witnesses have said we shouldn't move too quickly because, if we give Canada too rigid a framework and don't take the time to adapt to what Europe is doing to ensure that our respective regimes are interoperable, we would risk limiting innovation and research.
    Do you think we should simultaneously improve the bill and take our time to make sure we align our laws with those of other countries?
    Innovation and protection for rights holders go hand in hand. I'd even say it would be to the advantage of innovative businesses to work in an environment that has clearly marked guideposts and where it's easy to remunerate rights holders and secure consent. That actually works to everyone's advantage. Innovation and protection for creators shouldn't be mutually opposing concepts.
    As regards the importance of passing this bill quickly, I'm going to add to what Ms. Hénault said: we are prepared to work with you to improve it, but we need a framework. There are matters that must be settled promptly because the situation is developing quickly.
    Thank you very much.
    Before going to Mr. Turnbull, I see that Mr. Rogers wants to speak.

[English]

     I just wanted to say, on the U.S./EU question on deepfakes, specifically, that the No AI FRAUD Act in the U.S. is a bipartisan bill that our industry supports widely, and we would encourage all members to take a look at it. I think it provides a great framework for the deepfake issue we've been discussing.
    Thank you.
    Mr. Turnbull, the floor is yours.
    It's great to have a bit more time to go back to my line of questioning.
    My understanding from reading AIDA with the amendments that have been proposed is that it requires organizations building general-purpose systems with the ability to generate output to make their best efforts to ensure that the output of those systems can be detected easily or with the aid of free software.
     That's one. I think that's a step in the right direction. I'm going to ask you in a second, but I want to cover a couple of other things.
     It also significantly strengthens the enforcement framework for privacy and requires express and meaningful consent when sensitive personal information is being collected, used or disclosed. That, to me, covers biometric information, which I think would apply to all of your actors, creators, performers and directors, etc. Perhaps there are some exceptions.
     It also requires the companies that are creating the AI systems to keep records related to the creation and operation of the systems, which may suggest they have to keep records of how they're training their systems.
    I understand that we could go deeper there, and some of you would want that, but those seem to be three significant steps to create greater transparency.
    I want to go to ACTRA first. It seems to me that these are really positive steps. Would you not agree that those are very positive steps that have been added to the bill?
    Yes, we would, and we started our submission by saying we're thankful that the government is looking at this and we're thankful that Bill C-27 has been brought forward. It has allowed us to have this conversation.
    There are significant changes we'd like to see in it, but we are happy to have the conversation. We're happy to be here, and we're glad that Bill C-27 is being discussed.

  (1230)  

[Translation]

    Yes, we're headed in the right direction, but I admit to you that, as a lawyer, I was very surprised to read the part of the bill concerning generative AI and not to see the key elements of the European legislation. However, I understand that our bill may have been conceived before the big generative AI models, which are capable of generating text and images, became widespread.
    Consequently, we have to adapt our legislation to reality, specifically by drawing on European AI legislation, to which you will soon have access. It has ingredients that can help us improve our own legislation so it genuinely protects Canadian creators and entrepreneurs.

[English]

     Thank you for that.
    Ms. Desrochers.

[Translation]

    On the first two points, I'll let my colleagues who more directly represent performers respond.

[English]

With regard to keeping records and making them publicly available, keeping records is not enough. What do we do with these records if they are not public? We don't know that. Yes, that's an improvement, but it must be improved further.
    My understanding is that some of these AI models could be built on 70 billion pieces of information, so is it realistic to make that publicly available? Maybe you think it is, but I'm sort of anticipating that there may be some logistical challenges to throwing the doors wide open and wanting all of that to be public. Have you thought about that?
    Well, I'm not dealing with this amount of data. They do, so I guess there's a way of doing it.
    I understand that, but I think you have to have some empathy for what we, as legislators on our side, are trying to do, which is to be practical. There are companies using AI to do all kinds of great things. If we make it really onerous on them, it may stifle some of their ability to do some of the very good things that they're doing as well. I just want you to be aware of that.
    Thank you. I totally understand.
    I will switch to French again.

[Translation]

    In my remarks, I referred to an excerpt from the European Union's draft AI legislation, which isn't yet finalized. It provides for an obligation to make publicly available a sufficiently detailed summary of the training data protected under copyright legislation. I think that, if they can do it in the European Union, we can do it in Canada.

[English]

    Thank you very much.
    Mr. Forget, in terms of the three steps or the three additions that I highlighted at the beginning, can you speak to whether those are positive from your perspective?
     We believe that the current protection provided by AIDA doesn't go far enough. To reiterate the comment made by my colleague, Marie-Julie Desrochers, it's not sufficient in terms of transparency.
    Specifically—to give an example about generative AI—rights holders are currently not asked for authorization before their data are used for mining and exploitation. The output cannot be protected by copyright. At least there is a consensus that these outputs are problematic, because we have no knowledge of which data were used. It creates an issue for creators—for example, for DGC members—whenever they use or would like to use an AI output, but also, we believe, for all Canadians, and—
    I don't want to interrupt you, but I'm sure I'm very close to running out of time.
    To summarize what you're saying, then, you're saying that you want copyright protection ahead of an AI model being generated or trained. Is that what you want? Help me understand how.... If an AI model takes 69 billion pieces of information, are you saying that every single rights holder of every single piece of information that may be put into one of these models—the person who created that content—should be entered into an agreement with? Is that what you're saying?

  (1235)  

    No. In fact, as we said, we do not currently have access to the information about how much data is used and exactly how it's being used, which creates a level of uncertainty.
     I just covered that the builders of these general-purpose systems have to make their best efforts to ensure that it can be detected how they've created that output. Isn't that right? If we put that in the bill, doesn't that actually address the concern that you're bringing up?
    The other dimension that was raised was that of public data: the fact that it can be widely shared or readily accessible, and detailed to the point that an author or rights holder would be able to access it.

[Translation]

    Thank you, Mr. Turnbull.
    I give the floor to—

[English]

    I'm really sorry. I know I'm on the corner here. May I?
    Yes, Mr. Rogers.
     I believe the minister's amendments are a step in the right direction on those pieces. I think that, as we've discussed, they can go further.
    I am desperate, though, to speak to the idea that because the numbers are big or the systems are complicated, they shouldn't be regulated, or there shouldn't be a need for this.
    On your phone right now, you have access to almost every song ever recorded, in a licensed, legal way, because there has been an arrangement between the rights holders and the platforms. That's awesome. That was, at one time, described as not doable. People sat here and told parliamentarians, "Do you really want me to go and track down the rights of every song rights holder? Don't you know there's a recording right and a written right? That will take forever." Now we have a large, flourishing, legal, licensed music process across multiple platforms.
    Therefore, this is doable. I beg parliamentarians not to be led down this path of “It's too complicated for you to understand.” You must reject that. We wouldn't accept that in nuclear regulation or bank regulation. We can't allow it in the stealing of arts and culture.
    Thanks for that point.
    Thank you very much.
    Mr. Vis.
    I have a point of clarification.
    Some stakeholders, including many of you today, have raised concerns that the artificial intelligence and data act does not adequately protect the copyright of Canadian creators. That's been well established today.
    However, Mark Schaan, senior assistant deputy minister of the department, explained to the committee during his appearance in October that the most effective aspects for addressing the copyright concerns are in the Copyright Act, and that the government has announced a consultation regarding the connection between artificial intelligence and copyright. Some of that has been covered.
    Just so I get it on the record and the department hears it very clearly, why do you think matters related to copyright must be dealt with explicitly in AIDA rather than in the Copyright Act? Does anyone want to comment on that? Could amendments to the Copyright Act be made instead of, or in addition to, explicit copyright provisions in AIDA? If so, which ones?
    If anyone wants to comment on that point of clarification, it would be very helpful.
    Mr. Rogers.
    Can I just say, without casting aspersions on anyone, that this is an impossible game of three-card monte for stakeholders?
    The bill before Parliament is Bill C-27. There is a copyright review going on. If we don't comment on AI and its interaction with copyright during Bill C-27, we will have missed the boat. If we miss the opportunity to talk about it during copyright consultations, there's a high chance of it being suggested that we talk about it in Bill C-27.
    Thank you. I agree.
    This is in no way a reflection of my personal opinion of the assistant deputy minister, who I think is brilliant, but it's a little naive, in some respects, to differentiate it that way.
    Yes. I think what we're here for, all of us in different ways.... Again, I think that, on AI, the human element of this is very important. If you come to ground on the principles, you can go and change the laws however you want.
    Thank you. That's very helpful.
    Ms. Noble, your testimony in the very beginning struck a chord with me.
    I come from the Fraser Valley and I represent the Fraser Valley and the Fraser Canyon. Probably every Hallmark movie in North America has touched on my riding, the number one riding in Canada, Mission—Matsqui—Fraser Canyon. Literally every week last year, I'd drive by downtown Abbotsford or Mission and see movies being made. Then the strike in the United States happened and the industry shut down. In fact, both neighbours on either side of my house work in the film industry and didn't get a paycheque for almost a year. Those families were very hurt. Those are good, high-paying jobs.
    First off, what are actors and writers saying with respect to equivalent legislation or equivalent problems in the United States? Where do we need to find interoperability with American laws, specifically for English-language programming, to make sure our writers, entertainers and performers are not disadvantaged in any way?

  (1240)  

    I'm going to let Marie Kelly answer this for you. However, a big issue in their strike was this exact issue. It affected us. Marie will be able to give you more detail.
    This was the key issue in the SAG-AFTRA strike in the U.S. Members in the U.S. understand, as we do in Canada, the precarious nature of how our product is held. They are lobbying in California and other places for deepfake laws. They know it has to be in the collective agreement, but it also has to be in the laws. They're also looking at consent, control and compensation. You'll hear about that in the U.S. as well. They'll talk about name, image and likeness. We also need those protections in legislation.
    I have to make one point. We haven't talked about the fact that diversity, equity, inclusion and belonging are still lacking in our industry. It's getting better, but the reality is that your TV screens don't reflect all of our society. When we're talking about data that's going into machine learning, that's a concern we have to have. The discriminatory data that's going in is going to produce—and there are many articles on this—the same kind of data coming out. That needs to be said here.
    Thank you.
    Ms. Hénault, I know you wanted to comment on the previous point. Do you want to quickly say something regarding copyright?

[Translation]

    Yes, I wanted to discuss the distinction between the Copyright Act and Bill C-27. The Copyright Act governs rights holders, whereas Bill C-27 concerns the construction and management of generative AI models.
    It's important to regulate that industry by means of obligations of collective interest, including compliance with copyright. I imagine that other statutes, such as those on aircraft construction and transport, provide that one must comply with standards in the collective interest. We view Bill C-27 in the same way. It has to be said very clearly that developers must introduce policies to train their models fairly and respectfully and make them available. There must also be policies respecting users to ensure they clearly understand that this isn't a free pass to violate third-party copyright.

[English]

    Thank you.

[Translation]

    Thank you very much, Mr. Vis. I'm sorry but that's all the time you had.
    I now give the floor to Mr. Sorbara.

[English]

    I wish everyone a happy Monday, and welcome to this committee.
    From the testimony of each of you, it is easy to see that the impact of AI is nothing less than that of the printing press when it was introduced a few hundred years ago. I say that with much historical thought on that front. What happened in the Industrial Revolution was that we were able to put trains on tracks across the world. There is much emphasis on the opportunities for artificial intelligence in your field and your sector; however, you folks also have some trepidation about the AI space and the technology.
    Eleanor and Marie, I'll start off with you. Is the impact of AI greater on the copyright side or the AI side, in terms of generative AI, where you may not need the individuals? I want to get that clarification, because we do have a copyright consultation going on right now, and part 3 of the bill, AIDA, does not pertain to copyright. I want to get your view on this. If you had to split up the two percentages, what would be the impact?

  (1245)  

    It's hard for us to separate the impacts on individuals and their lives: having no work, and having their image stolen and put into the worst kind of material. Both of those are horrendous, but we haven't talked a lot about the loss of work. The reality is that there is going to be a wholesale loss of work across all industries.
    We have a concern as a cultural industry. I don't believe you can take AI and create a culture. Individuals can use it as a tool. In the U.S., there is already case law that says that if there's no individual involved in creating a piece of art, and it's just created by a machine, that's not copyrightable.
    The impact of this on jobs, individuals, and society is significant.
     I'm not a lawyer, but I'm still stewing on that reference to the U.S. case law that's in place now.
    I'd like to move on to Music Canada. I'll try to get to everyone. If I don't, please don't take it personally.
    With regard to high-impact systems, a lot of the testimony we've heard as a committee deals with the differentiation between high impact and low impact. With regard to the music and arts community, where would you fit yourselves in, and why, in terms of high impact and low impact? We want to go after high impact and make sure the guardrails are in place, but we don't want to stifle innovation.
    I don't believe the bill currently describes music as high-impact. I would find it hard to believe, though, that anybody who has spent the last two hours listening to us would think that AI does not have a high impact on all cultural industries. If there is an attempt to allow AI not to respect copyright laws, then it will have the highest impact on us. That's something you could fix today in Bill C-27 by just saying that AI has to pay for the use of copyrighted material.
    I'll move to Marie-Julie Desrochers.

[Translation]

     Welcome. My question concerns Bill C-27.

[English]

    Is it not important that we finish off this bill and put it in place? Twenty years have passed. Wouldn't you agree that the reviews of this bill pertaining to AI—even the copyright side, which is ongoing—should happen at much shorter intervals?
    I'm sorry. I missed the end of the question.
    It's about the intervals. When we review these pieces of legislation, due to the technological innovation that is occurring, it should happen at much shorter intervals or time periods.
     Yes, of course. At the speed things are going, it is probably ideal to review it regularly.
    I have a question for Mr. Forget from the Directors Guild.
    You and your members do a lot of heavy lifting; you are the creative side of the world, if I'm understanding this. How are your members feeling these days?
    Thank you for that question. Thank you also for the reference to the impact of the two strikes last year on our members, particularly in British Columbia.
    I would say that over the past year there's been a high level of anxiety. Our members do many different functions, starting with directors but right across the spectrum: picture and sound editing, location managers, production accountants, production coordinators, designers and so on.
    In asking members questions about their use of AI, their feelings about where this is going and how it's going to impact their jobs, we naturally hear a different story from each group. From the designers, we're hearing a very high level of apprehension. Designers are the people who create the world that you see onscreen, so they're responsible for the artwork that's on the wall in the home where the character lives. Editors are quite concerned about the impact, as are, obviously, directors as well as authors.
    Across the spectrum, I would say that many of the members we represent see AI as being transformative. I think that meets, in my mind, the definition of something that will have a high impact, both on Canadians and on culture, and on the way productions are made.
    I have one really quick comment. We're used to innovation. We've been digital for 30 years. We don't use film anymore to make films, so we have been early adopters and eager adopters of new technologies all along the way. You may be right that this is equivalent to the invention of the printing press, but we've had the experience of the introduction of a lot of new technologies that are now incorporated into the work that our members do day to day.
    AI, in a nutshell, is seen as something different. It is more significant and more transformative.
    I hope that answers your question.

  (1250)  

    Obviously, it creates uncertainty for your members, who are hard-working Canadians and work domestically and internationally, and who are artistically gifted, if I can use that term.
    My time is up.

[Translation]

    Thank you very much.
    Mr. Garon, go ahead.
    Thank you, Mr. Chair.
    I'd like to comment on the transparency issue. One of my colleagues, Mr. Turnbull, discussed this. He said it might be complicated to determine the identity of works that have been used among billions of data points. However, my impression is that an AI system capable of reading 100 million books a day is capable of searching from a list. You'd have to check that.
     That being said, some intervenors have told us that Bill C-27 won't get the job done. Many representatives of the web giants told us so, almost implying that we should reject it, start over from scratch, modify all kinds of other acts and work on it for I don't know how many years. We have that option, but there's also the option of moving ahead, continuing to amend Bill C-27 and doing the best we can. Then there's the option of waiting and imitating Europe, since Canada is a minor player after all.
    However, there's another solution: we could add a provision requiring periodic updates to the act, say every three to five years. That would force Parliament to review the act completely and would give it the opportunity to align the act periodically with the legislation of other countries so that Canada remains competitive, while enabling it to participate in the international review process.
    Ms. Hénault, what do you think of that kind of provision?
    I'd prefer that we try to do things right, starting now, based on the information we have and foundational policy needs. I'm going to think about this and will pass on my comments to you.
    Do any other witnesses wish to speak to the appropriateness of adding a periodic review provision to the bill?

[English]

     Yes. Ultimately, we are copyright stakeholders, as we have said many times today. The Copyright Act has a section on this.
    It is an important opportunity to review and make sure that we are up to international norms, but it is by no means a silver bullet to this problem. Like an endless election cycle, it creates an endless lobbying cycle in which this goes on. These pieces are often useful in minority governments. It is something you should add, but it's not something you should depend on.
    We believe that it's fundamentally important for the government to take action. Our members are already being harmed by this.
    I love the idea you have about looking at this more regularly. I don't know that you need to have something legislative for it. I hope we have good government that looks at this on a regular basis. The issue is amplified each and every day. This changes: what we're talking to you about today will likely have a new aspect to it a month from now.
    I would like to believe, as a Canadian, that we have a government that cares about these issues and that is going to continue to look at all aspects of this issue and make the changes you need to make when you need to make them.
    I want to say that we often look to see what's happening in other countries, but I'm a proud Canadian. I think we can lead, and I think we should lead on this.

[Translation]

    Thank you very much.
    Mr. Masse, go ahead.

[English]

    Thank you, Mr. Chair.
    My first intervention is about the challenge of what we do next, because I think what you have demonstrated today is the problem with the argument that we'll consult you on Bill C-27, fix copyright sometime later and fix the rest somehow after we pass Bill C-27. That is not sufficient for the NDP. It's clear to us that you can do both of those things. Alternatively, we either send this to regulatory oblivion—that's really what happens—or dismantle what we have here.
    I'm looking at an alternative where we view this through a lens almost like that of national security. Perhaps we even have a standing committee of Parliament and the Senate that looks at this across all the different jurisdictions, because copyright technically sits just outside this particular bill, but the reality is that it encompasses everything you have been saying and doing here in a much more comprehensive way than in many other industries.
    I have one quick question to go across the table here about an AI commissioner. Should the commissioner be independent and able to fine the abuse of artificial intelligence if that is part of the law?
    Maybe we can start with ACTRA and go across.

  (1255)  

    Yes.
    Yes, provided we also have a framework for determining the nature of the abuse we're talking about, and that's where the bill comes in.
    Yes, 100%, that's a good point.
    I know it's really basic, but I only have 30 seconds, I think.
    I liked your standing committee idea better because it gets closer to what I believe, which is that this is an important first-step framework bill, but we will be writing laws about AI forever, so we should make sure we're clear about that as we go forward.
    Thank you.
    Thank you very much, colleagues.
    We still have about five minutes left, so if there are any lingering questions on your mind, I'll open the floor.
    I recognize Mr. Perkins.

[Translation]

    Thank you, Mr. Chair.
    My question is for Ms. Noble and Ms. Kelly.

[English]

    Ms. Kelly, you mentioned in your opening statement the issue of what you called “moral rights”. Can you explain that to the committee, please?
    In the copyright laws, a few decades ago, when Napster happened, a decision was made to give musicians a certain level of copyright. There are different levels of copyright. This one is called the moral right, and it allows musicians to defend their music: when their music goes out into the world and somebody steals it, it allows them to claim ownership of it. Performers don't have that. It sounds strange for me to say it, but performers don't have it. However, there wasn't a Napster for performers back then.
    You were on film, and nobody was going to take that film. Now we're out there in the ether digitally, and the deepfakes can easily take that. The problem we have is that the studios.... Eleanor works for a studio, and she's in a movie. The studio then owns her likeness and her performance for that particular movie, but if somebody steals it and puts it out on the Internet and does something to it, she doesn't have a moral right to protect herself against that. She should have that right.
    Thank you.
    Thank you, Mr. Perkins.
    Mr. Vis, go ahead.
    Very quickly, proposed section 33 of the AI bill speaks about establishing a data commissioner, an AI commissioner. This commissioner would be granted broad powers, largely outlined in the amendments the minister put forward.
    There's been a discussion at other panels as to whether the Canadian public would be served by a commissioner who reports directly to the minister or whether, given the significant societal and individual impact AI will have on every one of us, a commissioner should not report to the minister but to Parliament directly.
    Do you have any comments on that?
    I think that Université de Montréal professor Catherine Régis talked about this, and I want to reflect more on it. I think it's a very important question that you ask, because the separation of powers....

[Translation]

    I don't know why I'm speaking in English; I think you'll understand better if I answer in French.
    It's better in French. I can understand.

  (1300)  

    That's an important question. Parliament may have to pass the laws, and we could have independent agencies to ensure compliance with those laws. That's a question that we'll answer too, if you want.
    I await your email. Thank you.
    I'm going to take this opportunity to remind witnesses not to hesitate to send the committee any specific amendments or comments that come to mind after the meeting.
    Mr. Turnbull, go ahead.

[English]

    I have a quick clarifying question related to the EU requirements. Following up on the conversation we were having when I had the floor, I want to clarify the requirements for recording and tracking the data that's used to inform these models. Is the EU requirement to have a complete publicly disclosed register of all data, or is it to have a summary with regard to copyright holders?
    That's for Ms. Desrochers or Ms. Hénault.
    I'm sorry. I don't have the text in front of me online, but I'm going to give you exactly the clause we think should be extracted from the EU act on that.
     With respect to copyright, I think it's an obligation for all kinds of generative models, whether they are high-risk or low-risk.
    Right. However, I believe it's a summary, if I'm not mistaken.
    Ms. Desrochers, could you answer?
    Yes, what I read earlier was “sufficiently detailed summary”.

[Translation]

    Thank you.
    Thank you very much.
    Mr. Généreux, you have the floor for a final question.
    Last week, Mr. Bengio said that, within the next decade, we will have—we don't really know what to call them—machines that are autonomous or intelligent enough to perform any task, including creating. Do you think those types of creators—digital or given another name—are culture creators?
    You're probably referring to products that are purely generated by AI. In the brief we recently submitted to the departments of Industry and Canadian Heritage concerning the interaction between copyright and generative AI, we object to the idea of granting copyright to products that would be purely generated by AI. However, we aren't opposed to artists using AI as a creative tool and being able to continue enjoying copyright.
    Thank you very much.
    That concludes this meeting, which was fascinating. On behalf of the committee, I thank the witnesses for taking the time to attend and to enlighten us with your expertise, and also for all the work you have done to defend and promote creators and artists in Quebec and Canada.
    Mr. Garon, you raised your hand. Go ahead.
    I have a quick question on another topic.
    For the Wednesday meeting, there was talk of hearing from the cell phone companies concerning their rates. Most of them will already be in Ottawa to testify before the Canadian Radio-television and Telecommunications Commission, and the committee will be meeting in the evening. Are we still planning to have those companies appear before the committee?
    Three of the four businesses were unfortunately not available on Wednesday evening. What we have proposed, together with the clerk, is to come back after the parliamentary break. We can hear from most of the telecommunications companies at that time.
    However, Mr. Garon, I would be happy to discuss this with you after the meeting and to see how we could plan the committee's business.
    Once again, thank you, everyone, and good day.
    The meeting is adjourned.