:
I am calling this meeting to order.
Welcome to meeting number six of the House of Commons Standing Committee on Canadian Heritage. Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders.
Members are attending in person. Do we have anybody on Zoom today? No, we do not.
Please wait until I recognize you by name before speaking, and all comments should be addressed through the chair.
Pursuant to Standing Order 108(2), the committee is meeting to study the effects of the technological advances in AI on the creative industries.
It is now my pleasure to welcome our witnesses. From ArtIA, we have Marc-Olivier Ducharme, director of innovation, alliances and futures at Sporobole. From OCAD University, we have Kelly Wilhelm, head of the cultural policy hub. We have Patrick Rogers, chief executive officer at Music Canada. We have Margaret McGuffin, chief executive officer of Music Publishers Canada. From the Society of Composers, Authors and Music Publishers of Canada, otherwise known as SOCAN, we have Jennifer Brown, chief executive officer.
Welcome.
Each organization will have five minutes to speak.
[Translation]
We'll start with Marc‑Olivier Ducharme.
:
Madam Chair, members of the committee, thank you for the invitation.
I represent ArtIA, a group of cultural organizations, research centres and artists. Our goal is to understand the impact of artificial intelligence on the arts through action research. We have just released a report after more than two years of work, and a second report will be tabled soon.
Silicon Valley's technology development model has been failing for decades. In the 1980s, it was already known that this model only served shareholders, not human beings. Today, artificial intelligence amplifies this problem. Artificial intelligence giants are funded to the tune of hundreds of billions of dollars. This difference in scale creates dominant positions that are hard to counterbalance.
Numerous studies and surveys, including some conducted by this committee, have shown the negative effects of certain social platforms on the population, particularly on adolescents and marginalized people. Risks of political bias in recommendation systems have also been demonstrated. These same risks and biases carry over into the various generative artificial intelligence platforms, since their training corpora draw directly on the data produced on those networks.
We don't have a technology problem per se; we have a digital feudalism problem. Silicon Valley siphons off our dollars, data and talent.
Our conclusions are clear: This model optimizes shareholder profitability first, not the public interest. Without safeguards, ethics and cultural specificities become externalities, leading to cultural standardization. The social costs are borne by the communities, as well as by governments, which leads to cultural uniformity, especially among linguistic and cultural minorities. Without cultural sovereignty, there can be no economic sovereignty, since nothing would distinguish us as a society.
Silicon Valley controls all our production tools. Are we also letting them take control of artificial intelligence? That's the question we're asking.
Artificial intelligence is the next frontier of technological colonization. Here is what our findings reveal.
First, we see the exploitation of creators by dominant artificial intelligence models that are driven by data stolen from creators, which marginalizes francophone communities, Indigenous communities and other cultural minority communities, such as Acadians. This is unacceptable.
Second, there is the threat to cultural diversity. Because toxic algorithms homogenize cultures, our different languages are threatened.
Third, given the pace of development, accelerated release cycles impose technical and cultural standards before public deliberation. Without adaptation laboratories, linguistic minorities, once again, don't have the time to adapt their tools, interfaces and data sets.
Fourth, we're experiencing a Napster 2.0 moment: Without timely intervention, we risk repeating the music industry’s mistake—ceding control of our cultural data and creative tools to foreign platforms.
Fifth, the harnessing of the digital commons—in other words, everything we've all created on the Internet—is at the heart of the artificial intelligence models owned by the digital giants.
The artists we work with don't seem to be afraid of technology. Instead, they're afraid that they won't be able to survive financially in a future dictated by technological giants.
ArtIA proposes government investment in laboratories for experimentation, training and production of artificial intelligence in the field of culture that remain owned by the Canadian cultural sector and are governed as a digital community.
These include laboratories that are experimental spaces where cultural communities design their own artificial intelligence tools, adapted to Canadian French, Indigenous languages and other minority languages.
We also propose cultural data trusts, infrastructures that are sovereign and that protect and value our cultural data, whose governance rules and terms of access are decided by the communities and by the artists.
In addition, we favour training programs that enable artists to master artificial intelligence rather than be dependent on it, thereby preserving their creative autonomy and our creative autonomy.
In our view, this is an exportable model. If we succeed, we'll create a culturally responsible AI ecosystem model that can be exported to other social sectors and countries, positioning Canada as a world leader.
I'd like to point out that Canada has already demonstrated its ability to implement innovative public policies. The 2017 Pan-Canadian Artificial Intelligence Strategy, the world's first national artificial intelligence strategy, has allowed Canada to become a global leader in artificial intelligence. This strategy has created hubs of excellence, such as Mila in Montreal, which has attracted international investment and positioned Canada as a key player in the ethical development of artificial intelligence.
ArtIA is part of that tradition. We want to develop laboratories for our culture by extending this visionary approach to the cultural sector, which will protect our diversity while creating economic opportunities.
Artificial intelligence is already transforming our creative industries. Acting now means choosing to protect our cultural diversity rather than going through homogenization.
The Canadian government invested $2.4 billion in artificial intelligence, but none of those funds went directly to culture, one of the most vulnerable economic sectors to—
:
Thank you, Madam Chair.
Thank you, honourable members of the committee, for the invitation to appear today. My name is Kelly Wilhelm. I am head of the cultural policy hub at OCAD University. Prior to that, I spent about 20 years in the federal policy and funding system around arts and culture.
I want to start by breaking down a false dichotomy that we often hear in discussions around the creative industries and AI. On the one hand, there are those who argue that private sector development of AI must be allowed to continue unfettered and unregulated or the country's capacity to innovate will be at risk. On the other hand, we often hear those who argue that government regulation and guidelines are essential to preserve the country's culture and identity. You can see the split clearly in the government's 2024 consultations around AI and copyright, and also internationally in any policy discussion that addresses the use of copyrighted works in the development of AI systems.
Here's the reframing I want to propose. Government action that values and invests in creative and cultural professionals and their IP does contribute to innovation. It does not stifle it. There's no question that AI impacts creative IP, copyright and labour. It has already disrupted the value chains on which the creative industries, their companies and their workers depend. These are critical issues, and collective work on them by industry, researchers and policy-makers must continue.
You've heard already in these meetings that the creative industries are asking for transparency and fairness in how the content to which they own the rights is used in AI systems and in identifying AI-generated content for what it is. They're also asking for a seat at the table where AI policy decisions that affect them are being made. This is a point that I would echo and support. That's because, while AI is affecting the cultural industries, as your question and your study suggest, the effect is not one way. That goes back to my reframing. As they did with other technologies before it, the creative industries are using AI to do what they do best—create, innovate and tell Canada's stories.
In the cultural policy hub's convening with creative industries over the past two years, we've heard that many are using AI and building AI tools that help them to run their businesses—the back end—and in their creative work. They use AI for many purposes, including creation, production, distribution and marketing. They use it to reduce technical, financial and environmental barriers for all sorts of creators, whether those are small and medium-sized enterprises, indigenous creators or even those working in rural and remote communities. They use AI to maximize the value of their IP, to find new markets and to build out fan bases in global markets. They use it to protect the sovereignty of the data they hold, which is cultural data, and to develop new protocols and ownership models. Artists, as we know, work with AI to create the entirely new and unexpected, very often in collaboration with tech companies and product designers in the private sector.
Creative industry leaders are asking how Canada can shape AI in a way that protects and enhances our culture, creativity and sovereignty. Four key themes keep coming up in our conversations with those leaders. I want to share them with you today.
The first is around sovereign and accessible data. Creative industries, like other industries, need access to data, including government datasets, to develop effective, Canadian-made AI tools and solutions. Governments should be investing in Canadian-owned and Canadian-governed public data infrastructure—an AI commons, if you will—alongside their investments in private companies and AI compute.
The second area that keeps coming up is skills training and education. Canada needs a national AI strategy for education, from K to 12 through post-secondary and into professional sectors like the creative industries. The strategy should prepare Canadians to harness new AI tools and technologies to remain competitive in the evolving job market. This is a concern the creative industries share with other industries.
The third area, which I'm sure you'll hear about again today, is regulation. Most in the cultural sector are calling for simple and clear transparency requirements, harmonized standards and shared principles in order to avoid uncertainty, fragmentation and confusion in this marketplace.
The last point is cultural sovereignty and creative ambition. This moment of economic, technological and geopolitical change is the right time for Canada to be ambitious in this space, not to draw back but to think big and invest in the full diversity of creators and creative SMEs that drive our creative economy in this country. Canada has long recognized the importance of our creative industries to our cultural sovereignty. It's why we have the tools and the successful industries that we do in film, TV, music and others.
AI is fundamentally a homogenizing tool. It is about recording and reproducing existing patterns found in the data that is fed into the machine. It does much of the cognitive work for us, replacing the human work of synthesizing and considering patterns that is essential to creating new ones.
Let's be clear: AI-generated content is not a replacement for original creative IP.
My name is Patrick Rogers. I am the CEO of Music Canada. Music Canada is the trade association of Canada's major labels—Sony Music Entertainment Canada, Warner Music Canada and Universal Music Canada. Our companies have offices full of Canadians making Canadian music for the Canadian market and the world. They are very excited about what AI could mean for the future of music.
For my members and the artists they invest in, AI as a tool can help human artists elevate their creativity, find efficiencies in the recording process and help our industry protect its IP. However, having faced an existential technological crisis in our lifetime, the music industry has some lessons learned that shape our advocacy to government.
As you know, not that long ago, home Internet created the possibility for peer-to-peer file sharing, and Napster and other sites decimated the industry, but it wasn't the Internet that nearly killed the music industry. It was copyright piracy and the public's willingness to become pirates. It was the breakdown of our public understanding that we—I mean all of us—were stealing. I worry about that happening again today, or tomorrow, with AI, but let me first share some examples of AI at its best.
Working with Brenda Lee, the team at Universal Music Group kept the original music and background vocals of Rockin' Around the Christmas Tree but replaced her original lead vocals with a newly translated Spanish vocal, created using an ethically trained AI model derived from her voice and fully approved by her. After Randy Travis suffered a devastating stroke in 2013, leading to the loss of his voice, he and Warner Music Nashville used AI to help him record the song Where That Came From. Randy and his team worked on vocals and put human touches on every note of the song, which would not have been possible without AI. Universal Music Group's collaboration with the ethical AI company Endel is enabling UMG artists and labels to create AI-powered soundscapes to enhance daily activities like sleep, relaxation and focus.
What's the problem? Each of these examples is done in collaboration with the artist involved. It's still their artistry. It's still their talent. Their rights are embedded in the project. That's not how the most popular song-generating services work now. They aren't licensed by artists and rights holders, and they aren't paying for the inputs they are trained on or the outputs that compete with those very artists. We are in the Napster era of AI in the marketplace. We need to get to the iTunes stage so that we can get to streaming. To do that, we need to do a couple of things.
One, uphold copyright law, recognizing that the training of AI models on the music of artists engages their copyright. Done without their permission, it is infringement—full stop. Upholding copyright also means that you should not cave to demands for sweeping “text and data mining” exceptions. Everything you need to know about this is in its name. It's not called the “hopelessly searching for human enlightenment” exception. They call it mining. What do we mine for? Valuable things. Where do we mine? Where we know there are valuable things. No data is more valuable than the catalogue of our favourite musicians. As much as some generative AI companies will tell you that they are teaching the machines everything, they don't want to train their algorithms on your brother's college band. They want Sgt. Pepper's and Pet Sounds and Folklore.
Two, we want to make AI companies keep records of their inputs. We want AI companies to track and disclose to rights holders what they train their models on. The proponents that say AI should be able to steal everything will claim that it isn't possible, but if we're going to unlock human consciousness with AI, shouldn't it be able to write a bibliography?
Three, we want the federal government to get serious about deepfakes, not just the worst kinds—the sexually explicit ones or the ones that interfere in our elections—but all of the harmful ones. They put all Canadians, including our kids, at risk. What started as Connor McDavid appearing to smack talk his opponents and teammates two years ago has turned into babies interviewing their dogs, which I know aren't real but I can't tell you why. These videos make us laugh, but the technology could easily be used to make us cry. What if someone used it to ruin your career? What if someone used it to ruin my five-year-old's budding social life? What's illegal on paper should be illegal online. Putting your words in my mouth is not free speech.
Fortunately, AI will also be part of the solution. It's why our industry is working with tech to combat this problem. Sony Music's work with Vermillio's TraceID tool helps protect artists in real time against digital replicas, but we need governments to make it clear that unauthorized and harmful deepfakes are illegal. There are models for Canada to look at, such as the U.S. NO FAKES Act, which has bipartisan support and is backed by both the music industry and platforms like YouTube.
These three recommendations will lead to Canada being at the forefront of AI innovation while still protecting the Canadian cultural sector.
Thank you.
As CEO of SOCAN, I am pleased to appear before this committee to speak to the effects of artificial intelligence on the music industry.
In 2025, SOCAN is celebrating 100 years as Canada's copyright collective for songwriters, composers and music publishers. We are proud to represent the rights of more than 200,000 direct members from coast to coast to coast. SOCAN collects licence fees for the public performance and reproduction of music, matches those uses to rights holders and then distributes the royalties to our direct members and rights holders from across the world.
We are only starting to uncover the full potential and implications of AI in the creative sphere and to understand what role it will play in shaping the future of the cultural landscape in Canada. AI presents a turning point for the music industry. We believe that with proper safeguards and an appropriate copyright framework, this technology provides tremendous opportunities to support and enhance human creativity as a tool that allows Canadian creators to continue to tell their stories, reflect on who they are and contribute to Canada's identity and values.
However, without the appropriate balance in place, the current state of AI presents a challenge for our members. Canadian works are being stolen and scraped by AI companies to train their models, without any compensation to creators. These AI models can then output a complete song in response to a single prompt, with that AI-generated song potentially replacing the work of Canadian creators.
A global study conducted by CISAC, which is one of the world's largest networks in the creator sector, estimates that under the current market conditions of wholesale theft, up to 24% of music creators' revenues are at risk of disappearing. This presents a real threat to the sustainability of the Canadian music industry.
Meanwhile, the companies behind these AI models have publicly stated the importance of using high-quality, human-created music to develop their products, going so far as to say it is essential. What is not essential, apparently, is making sure that creators are paid for their important contributions. Instead, human creators are currently fuelling advances in AI models without sharing in the benefits.
We believe a successful AI approach will value and compensate human authorship, respect the policy objectives of the Copyright Act and lead to a vibrant licensing market where the benefits of AI are shared with those who are vital to its development.
Looking back at the shift to online streaming over the last 10 years or more, similar arguments were made about compensation and licensing being impossible, but a mature licensing regime has formed, and creators and streaming services have both benefited in the past decade. Respect for copyright does not stifle innovation. If you stream music on your smartphone, you have proof in your pocket that compensation for creators and technical innovation can successfully coexist. The adoption of AI can also be done in a way that respects creators and incentivizes human expression.
We have two positions that we would like to put forward to the committee.
First, we strongly oppose a new copyright exception. AI companies should not be permitted to exploit creators' works without obtaining consent and providing credit and compensation. A text and data mining, or TDM, exception would not facilitate growth in either the creative sector or the technology sector. While there is no evidence to suggest that a TDM exception is necessary to maximize investment in the AI sphere, it would certainly deprive creators of the economic benefits of their works.
Second, we urge you to ensure AI companies are transparent about the works they use to develop their models and that AI-generated outputs are clearly labelled. AI developers must be required to disclose which copyright-protected works are ingested and stored in their datasets. Without such transparency, rights holders are unable to negotiate on a level playing field and cannot prove when their works are used. Further, mandatory labelling of AI outputs would mean that the public can then make informed choices about the type of content they consume.
Thank you very much for your time.
Good afternoon, Madam Chair and members of the committee.
My name is Margaret McGuffin and I am CEO of Music Publishers Canada. I am here to advocate for the ethical and transparent development of AI.
Music publishers discover and develop Canadian songwriters and have made significant investments in the vast majority of songs and scores that are heard every day on radio, on streaming services, in video games, in film and television productions and on new emergent platforms around the world.
In the music space, AI has the potential to support the valuable work of human creators, which in turn enriches Canadian culture and society. Our members are already leveraging the benefits of this new technology. Songwriters are using it in the studio, and our members are using it to scale their operations.
Unfortunately, the music industry has also seen mass theft of copyright-protected songs by AI companies, both on the input side, for the purpose of training AI models, and on the output side, with the development and publication of unlicensed generative AI models. This poses serious risks for Canadian creators and the companies that invest in them.
Strong copyright ensures that MPC's members, songwriters and composers maintain control over their music and receive the fair compensation they deserve.
When an AI company uses music that has been scraped or captured from the Internet without authorization, it prevents rights holders from controlling and realizing value for the use of their works. The development and commercialization of unlicensed AI model inputs and generative AI outputs are already creating serious market distortions and raising concerns about fair competition.
MPC works with the International Confederation of Music Publishers. A recent Billboard magazine story highlights evidence collected by ICMP over the past three years showing that many of the world's biggest tech companies have scraped copyright-protected music created by millions of songwriters, composers and artists to train generative AI systems, without permission or licensing.
To put this in perspective, nearly every song ever written by a Canadian songwriter has already been scraped and stolen by these AI companies without consent, credit or compensation. Imagine that someone accessed your paycheques without your permission and that behaviour was normalized. That's what's happening to songwriters.
This extensive non-compliance with copyright laws in turn leads to serious negative economic impacts. Copyrighted works—our songs—add value to AI models. To derive fair value for the use of this copyrighted material, the music publishing industry, which includes SOCAN and CMRRA, routinely grants licences to technology companies. AI developers should be no different. The emerging market for licensing music to AI developers should be encouraged, including requiring them to disclose and maintain records of all their data.
In conclusion, MPC believes the Canadian government must reject any calls for watering down the copyright system with a text and data mining exception. We've already heard about that today. Music rights holders must be able to control and realize value for the use of their songs. It is imperative that Canadian regulators and the U.S. government approach generative AI in a manner that respects creators and incentivizes human expression. This will be beneficial for creators and for Canadians as a whole.
I look forward to answering any of your questions.
Bill C-18 was put in place and, through its process, resulted in Google coming forward and creating a hundred-million-dollar fund. There was then a cohort put in charge of that money. News businesses would apply to the fund and then be awarded dollars.
As a result, when they are awarded those dollars, they are also signing over their rights to the information they're producing, which means that Google can then go in, scrape that information and use it however it wishes. Because these news businesses have signed off on Bill C-18 and the Google dollars, they're subject to Google and whatever Google wants to do with their information.
This is the direct result of over-regulation by this government, so my concern is this. We have digital-first creators in this country who are generating fantastic content. We have artists in this country who are doing great work. When the government overreaches, it stifles and hurts the industry. Where is the line between legislation that facilitates further success and legislation that hinders it?
:
Thank you, Madam Chair.
[English]
Thank you all for taking the time to join us today.
I quite literally just met with members from various organizations representing Canada's book sector as part of the day on the Hill. They're actually in this room as we speak. I'll give a quick nod to them and to the great work they do.
We also spoke of Copibec and the work they do. When I met with them earlier this summer, they described us committee members as “guardians” of Canadian culture. I think we'll all agree that it's a very generous way of looking at us members of this committee. Protecting Canadian cultural sovereignty while understanding the inevitability of AI is our priority, though. That is quite literally the role we hold as members of this committee.
[Translation]
During our discussion, they clearly stated that the goal was to develop a framework for artificial intelligence and Canadian culture to be partners.
I now turn to the head of ArtIA, whose mission statement is to develop the infrastructure, tools and skills for Canada's cultural sector to become a global leader in responsible artificial intelligence.
You've positioned yourself to ensure that artists have a say in how artificial intelligence shapes creative practices. We often talk about a human-centred approach to artificial intelligence.
In your opinion, Mr. Ducharme, what does true cultural sovereignty look like in this area? What would it take for Canada to be a leader in establishing ethical standards of artificial intelligence for the arts, rather than simply adapting to what's already being done?
:
In our opinion, artificial intelligence in the field of Canadian culture should be managed within a governance framework that would be created and administered by people who work in culture. This framework should be flexible, scalable and adaptable to certain cultural communities, particularly indigenous and minority language communities.
We think there should be networks of laboratories across Canada that would look at artificial intelligence issues and allow creators to go out and experiment. These laboratories would also allow them to make decisions about how they want to protect their own data and works. That's not currently the case.
For years, Facebook and Instagram encouraged artists to publish their works of art on Instagram. Those works were then scraped wholesale and used to train models. So there were lies and theft. Silicon Valley has been doing that for a very long time, as I said in my opening remarks.
So we believe that governance is really at the root of all this. In order to develop governance frameworks, you have to bring people together. We must ensure that there are places where human beings, in this case artists, can come together and make decisions about policies that affect them.
We're not asking the Canadian government to make decisions for artists. We're asking that artists have the autonomy and funding to provide a counterweight to American economic power, which is so vast that it pulls the rug out from under us every time we try to take a step. That's our biggest problem.
Basically, the ArtIA project was funded through a grant from the Quebec government to help us understand the effects of artificial intelligence. We see that, all of a sudden, the Americans are investing heavily in this area. They do it with capital from Silicon Valley, but they also do it by polluting heavily, consuming natural resources and destroying cultural resources in communities. For example, we see a lot of Indigenous art being created by artificial intelligence, by creators who aren't Indigenous. That's a very dangerous form of cultural appropriation.
As a francophone, I have no artificial intelligence model that represents me. Current models can mimic my language, but they can't understand my cultural specificity. I'm a Quebecker, so it's not too bad for me, but an Acadian has a cultural and linguistic specificity that's very different from Quebec's. Far less Acadian cultural data was captured in the scraping, which makes the biases even stronger.
What about a queer artist or a Jamaican-Canadian artist, for example? What about the people who exist at truly complex intersections of our societies, where models, with their biases, end up acting as real homogenizers of culture?
So our position is to let the artists take over, and to do that we would have to invest in those policies.
:
Chair, I thank the member for the question.
It's good to see you.
When it comes to AI, the way out of the piracy era was that our members, up against the odds, held out and said that music had value. In a period of time in which music was stolen or given away for free, we said that we believed there would be a time in which people pay for music again.
I believe this to be true. I'd be very excited to put this in Hansard. I've been saying it for a while. I believe that music is the only thing in the world that everyone did for free, and then people started to pay for it. Then people stopped paying for it. Now, they pay for it again. This is a human success story. I think it is one worth protecting.
My greatest regret at the beginning of the AI boom was hearing people say that AI was not a copyright issue and not recognizing it as the massive red flag that it was, but it is the copyright issue of our generation.
That's how we're going about it.
:
I'm going to take all the time you give me, Madam Chair. Thank you very much.
Thank you to all the witnesses. Frankly, it's always fascinating and very informative to hear them talk about the all-important issue of artificial intelligence and its impact on the cultural sector.
We're hearing a lot of concerns from people in all sectors of the cultural community. What's happening is very worrisome. However, I don't want to paint an entirely bleak picture of the situation. I think there are some very interesting things about artificial intelligence. That said, we need to be honest with ourselves and recognize the risks that artificial intelligence poses for the cultural sector, particularly creators.
Mr. Ducharme, I'm going to start with you. Your arguments in support of an artificial intelligence engine that would be local and owned by people from the cultural industry are really interesting. It's an idea, a model. I'm curious, though. In the past, Quebec or Canadian companies have tried to launch their own search engines or broadcasting platforms, for example. Even now, people are trying to set up a digital broadcasting platform. These are great ideas, and we'd like to see them flourish. Unfortunately, these companies are competing so hard with international companies, particularly U.S. companies, that they're not likely to be successful enough to generate what we want them to generate.
In the case of an artificial intelligence engine, I remain hopeful because much of the research that led to the development of artificial intelligence was done in Montreal, Quebec, particularly by Mila, with Mr. Bengio and all his colleagues. I tell myself that there's hope, since we developed this technology here.
Tell me how, in the case of an artificial intelligence engine, it would be different from other attempts we've made in the past to compete with Internet giants.
:
There are several elements to your question.
Our goal isn't to develop an artificial intelligence model or engine, as you call it, but rather to focus on the governance of these tools and the infrastructures that make their creation possible.
What we're finding is that creators aren't necessarily using ChatGPT, DALL-E, or generative artificial intelligence systems for image production. They use all kinds of artificial intelligence tools. They use them to locate bodies in space as part of a dance show, for example, or to control systems in a performance or in a festival. Artificial intelligence isn't just used for content generation. It can be used for several purposes, such as data management. My colleagues who are here with me have given other examples.
We want to be able to experiment. Artists want to be able to understand how artificial intelligence impacts their work and the creation of their own value, that is, the value they bring to market. As a result, they want to know what the rules are behind that. Silicon Valley's rules are opaque; we don't know them. Not only are they opaque, but they are constantly changing.
I'll give you an example. A few years ago, many artists from here and elsewhere contributed to the development of an application by RunwayML. It was free, it was good. We worked with it and everyone gave each other hugs. It was perfect, it was great. One day, artists received an email telling them that they no longer had access to their work. Everything they'd put on the platform no longer belonged to them. The rules had changed.
AI rules and models are constantly changing. We can never rely on the development of this technology, because it's in its infancy. It started just a few years ago, and yet we're overwhelmed by it. That's the Silicon Valley model: They give us everything for free, and then they pull the rug out from under us.
:
Thank you very much, Mr. Rogers.
Mr. Ducharme, like you, I'm a francophone and I come from Quebec. However, I'd like to ask the other witnesses if they share your point of view. I partly agree with what you're saying. We can't be against virtue, of course. We'd like Canada to be able to preserve its identity and creative autonomy and for artists to be able to earn income from this activity, as is currently the case.
So I'm speaking to all the other witnesses.
In your respective fields, could a model like the one proposed by Mr. Ducharme exist? If so, why not create it yourself without having to resort to government funding? In fact, the Canadian creative industry as a whole generates billions of dollars of revenue, to my knowledge. I'm just putting the idea out there.
:
I agree with everything that was said earlier.
Last year, less than $1 per Canadian citizen was invested in direct support for artists. In fact, the total amount was about $36 million. Someone asked us earlier why we don't create our own systems. You have the answer. When investments in direct support for artists are $36 million, there's no other way to work.
Independent artists are the ones who support the industry. They're at the bottom of the pyramid, in a way, but they move and flow through it. Video games as a creative industry wouldn't exist without visual artists. Support is needed for that to exist.
Artists are among the lowest earners in Canada. They are in an under-represented job class, and generally don't have the chance to make their voices heard in places like this, and don't hire lobbyists to put pressure on the government.
We're asking the government to invest in technologies, but we want those investments to be controlled by artists.
:
I'm very gratified to be thought of as having an opinion on this that could be useful. I'm not an expert in education policy, but I do have a husband who is a music teacher.
I grew up surrounded by teachers as well, so education is very important in my life and, I think, in this case.
Starting in K to 12, the place to begin is with literacy. It's around understanding media, understanding the world around you, knowing what's real and what's not, and having the cognitive skills and the judgment to be able to tell the difference. I'm not necessarily saying that at kindergarten you should be using generative AI tools. That's not necessarily the way forward. However, we do need to give our children, at the youngest possible age, a skill set to be able to know the difference in what they're looking at.
The challenge is there; I don't disagree with you. There are many sizes of school boards. There are many different approaches to curriculum across this country, depending on where in the country you live. I would suggest that if we're able to do things like the communications technology courses that are mandatory in Ontario now in grade 9, then surely we should be able to figure this out for K to 12.
:
I can speak a little bit to the OCAD experience. The students at OCAD work in a variety of different disciplines. They work in textiles, sculpture and new media. They also work in foresight and in thinking about design futures. What does the world we want to live in look like? How can we design that to be a place where we want to live?
For those students, they've grown up obviously way past digital natives, in that sense. Technology is ubiquitous in their lives. They know what they value, coming into their education. I think an important conversation needs to be had in the post-secs: How do you, as a creator in particular, want to be able to control your work, control your future and also contribute to the conversation that you're in as a creator? I think at OCAD there is a really smart way of talking about media literacy and the ability to market your work and understand the market you're working in.
In non-artistic post-secondary, obviously, the concerns around cheating and the policies around AI use are still developing in this country and elsewhere. I think the teachers, professors and that side of the education equation need support to be able to teach in this world. That is also, I think, at this point missing a little bit from their post-secondary education. That's a place to look as well.
:
I thank you for the question. This is something we think about a lot.
I'll share a couple of points on this. One, we've been sort of working away on the concept of deepfakes since the dropping of the fake Drake track. That made it clear at the time that, one, everything was already stolen. There was no point preparing for it to be stolen—it was stolen.
Two, bring me the person who disagrees with clamping down on deepfakes. We have yet to meet them. I do encourage the government and opposition parties to really get serious about this. If I walked into your office and impersonated you, that is already a crime. It should be a crime online the same way it is in person.
What can you do tomorrow? Again, drawing back to a previous digital revolution, the ads that appeared before Blockbuster movies that said you shouldn't steal this film were from the Motion Picture Association. It was from industry. This time around, we should have government stand up on it: This is illegal. You can't do this. You shouldn't do it. We're working on the law.
You could do that today.
:
There's a lot to be talked about. I'll go back to the very basics, because when you're talking about the ISRC, you're talking about matching and distribution. We're at the licensing stage. I'm going to put forward that SOCAN has very successfully been licensing, on a blanket basis, all of the digital platforms in Canada, all of the bars and restaurants, and the broadcasters. This is not new territory.
When somebody is at the table with you and you have a dance partner, you can say, “Great. This is a win-win. You need our works. We're very happy that you want to use them, and we want to also share in that benefit.” That's where the songwriters are. They're not trying to put the horses back in the barn. They know the works have been stolen. They're saying, “Okay, you stole it. I need my compensation.”
We believe that can happen easily. We represent all of the world's repertoire in Canada. That doesn't just mean the Canadian artists; it means every single songwriter whose work is performed in Canada. We would love to sit at the table with these people and have them know that there are no exceptions and they have to be there. Right now, they are spending their money on lawsuits. They are spending their money on insurance and lobbying. They could be spending their money on licensing and have a win-win when we sit at the table and figure out a licensing agreement.
What you're talking about afterwards is more of the distribution piece. We already have all of the works in there. We believe, as Patrick says, that if you're an AI company, you can put a list together of all the works you have used. We process that kind of data on a daily basis. I'm not worried about that.
:
Practically, we do not have a licence in place with any AI company. As I mentioned, no one has come to the table.
I will tell you that we have licences in place with every other tech provider that operates in Canada. We sit at the table with them and they get to use all of the repertoire. They don't have to ask specifically, “Can I have this piece?” They have access to the world's repertoire of music for a revenue share.
In most cases, that's the way it is. If you're a subscription model, then maybe it's a percentage of revenue. If you're a free service, then maybe it's a flat fee. These are things that two willing people across the table sit down and figure out: How is it that you're generating revenue? Let's respect the value of the creative work that's been put into that, and let's figure out what the right monetary licence really is.
A licence is not theoretical. We have plenty of licences. Most of them are—I say, with the tech agreements—negotiated agreements. These are willing partners sitting at the table, in a free market, negotiating.
:
I think it's important, actually, to continue with what MP Thomas was discussing.
It's an important thing to clarify that this is by no means an introduction to something that is foreign to the free market. This is how the free market works with copyrighted works. As you said, there's a reason they call it mining: it's of value. Unless we're thinking of getting rid of copyrighted works, and I sure don't think we are, they are part of what a market looks like. It's not necessarily just regulation. It's about giving value to things that are otherwise stolen as if there were no value or copyright, if I'm not mistaken.
It's important to make the distinction between what is free market and what is perceived as potential over-regulation. From what I understand, Warner Music, Universal Music and Sony Music are fans of the free market. They're some of the largest companies in the world. I think that's important to recognize in this conversation.
What we're talking about is protecting copyrighted works. We've also talked about the potential for this to be a very doable process in the same way that, when a song gets played on the radio, an artist gets money. That's how copyright works. When a song gets played on a commercial, a company calls me and says, “Can we use your song in this commercial?” and it becomes a choice between me and that company, depending on the value that they want to spend on that. That's called the free market. It's a wonderful thing. It can be a wonderful thing.
That's a really important distinction we need to make right here. We're not talking about rethinking the free market. We're talking about putting value on something that, at the time being, is being mined as if it is data that is not copyrighted. This is what we are considering. Am I correct?
:
I'm a lot of things, but I'm not an artist, that's for sure.
It's important to understand the purpose of our study. I sat on the Standing Committee on Industry and Technology for several years, until I became a member of this committee last June. The Standing Committee on Industry and Technology studied Bill , which addressed several interrelated issues, including artificial intelligence. It was a big bill. Clearly, the Liberals didn't do their job properly because they should never have included privacy and artificial intelligence, even though they are interconnected. Artificial intelligence alone needs to be studied, as is the case with the study we are currently conducting. This issue will also be addressed in studies conducted elsewhere in Parliament, including by the Standing Committee on Official Languages. You talked about francophones living in minority situations; protecting them is very important as well.
Anyway, I want to reiterate the purpose of our study and what we're looking for from you and other people in the industry. I'm sure that some people are listening to what is going on in committee today and will be called upon to testify. How can we ensure that our cultural industry is growing across Canada while protecting it, in the current context where we're using artificial intelligence tools? That's the purpose of our study.
Mr. Ducharme, you're right in saying that we must protect our cultural industry, but does it have to go through a government model? Personally, I don't believe that. Can the government participate in that? That's certainly the case.
However, we have a lot of work to do to ensure that we protect our Canadian identity, as well as our Quebec, francophone and francophile identity across Canada.
:
Thank you, Madam Chair.
I'll take a very quick moment to say that you all clearly love what you do and you're surrounded by others in your space who also very clearly love what they do. You want to keep doing it collaboratively. I want you to know we can tell that and we appreciate it.
This has been a lot of fun. There are so many questions and so little time.
Mr. Rogers, I'll go back to you very quickly. I think it's worth emphasizing in the broader context what I believe our intent would be, which is creating a space for AI to grow and for the Canadian industry to lead, without it bearing a cost on Canadian culture or Canadian heritage.
Tech and music do not operate without one another as is. You mentioned the work you do alongside tech already, but could you expand on how Music Canada engages with the cross-sectoral organizations around AI in music, in particular?
:
Thank you for the opportunity to talk about this.
The modern record label is a tech company. In order to find fans in Japan, we make use of technology, including AI-engaged technology. We are very excited about what licensed, legal, paying AI models will do for the industry in order to continue human creativity.
After this meeting, I'm going to talk to Mr. Waugh and make sure he understands how great that Brenda Lee example is. There is such opportunity. There are the heartstrings opportunities about losing voices and doing things that really make good advocacy.
A licensed, legal generative service could make Mr. Généreux an artist. We could be excited about that, but all of the inputs, including Mr. Myles's, need to be compensated for Mr. Généreux to become an artist.
:
I'd like to tell you about the Metacreation Lab at Simon Fraser University, a laboratory that examines issues related to emerging technologies in artistic production.
They've created a visual generation model that doesn't use copyrighted content. Artists can therefore use it to build a model from their own work, and then use that model themselves.
It's a really interesting model. Other researchers and artists in the United States and elsewhere have created the same type of model.
The model will generate an image based on the works used to train it, works that belong to the artists. The artists prepare their own corpus, and a training process feeds it into the machine. Rather than feeding a commercial product, they feed their own model. Then they'll be able to create their own artworks, in their own visual style.
The problem is that you can't run this model on a regular computer. You need a relatively powerful computer. However, that infrastructure isn't currently available.
Artists therefore need to find a way to access powerful computing infrastructure, and also a way to fund it. That funding will come through grants, since we operate with subsidy schemes. They'll apply for a grant from the Canada Council for the Arts. They may get it. If so, they may be able to find that computing power to make the model work.
Right now, we don't have any spaces or infrastructure for artists to use the models that are produced in Canada. There's music, video and visual arts. If we don't create those spaces and invest in accessible Canadian infrastructure for artists, that's not going to happen. We're going to stifle innovation. These innovations will remain in university research centres.
:
Thank you, Mr. Deschênes. Your time is up.
[English]
I want to really thank all of our excellent witnesses today for their testimony. It's really valuable while we put together our report.
If you don't mind waiting, members, I have a budget for our briefing on CBC/Radio-Canada. I have a budget for snacks that day.
Can I have someone move it? Mr. Ntumba moves it.
Is everyone agreed that we can pass this budget?
(Motion agreed to [See Minutes of Proceedings])
The Chair: Happy Thanksgiving, everyone. Have a great week. We'll see you when we're back.
The meeting is adjourned.