Artificial Intelligence and Culture: What we heard at the National Summit
- Rozsa Foundation

by Lisa Mackay
Across the arts sector, there is growing pressure to understand what artificial intelligence means for creative work and for the organizations that support it. I attended the National Summit on AI and Culture, hosted by the Banff Centre in partnership with the federal government, including Canadian Heritage and the office of the Minister of Artificial Intelligence and Digital Innovation. The Summit brought together policy-makers, technology leaders, and cultural workers, and marked the point where this conversation began to take shape at a national level.
One takeaway is that we are not always talking about the same thing.
“AI” was used to describe everything from generative systems trained on massive datasets to smaller tools that support day-to-day organizational work. In many cases, people in the arts sector imagine a narrow set of tools, such as ChatGPT or image generators like Midjourney, but the landscape is much larger. This distinction matters in practice. Using AI to support administrative capacity is a very different question from using it in relation to creative works, where authorship, ownership, and compensation are central.
From a policy perspective, AI is being positioned as an enabling technology that can increase productivity and drive industry growth. It includes systems that automate tasks, generate new outputs, and rely on large-scale data to improve efficiency and capability. For the cultural sector, this raises two distinct concerns: many current systems have been trained on artistic and cultural works without consent, and their outputs introduce new questions about how creative work is produced, attributed, and valued.
These concerns are not theoretical, and they came up repeatedly. While existing copyright law may offer protection in principle, there was little confidence at the Summit that it is being applied or enforced in ways that reflect how these technologies are actually operating. At the same time, there are no widely adopted systems that allow artists to control how their work is used in training or outputs, or to ensure attribution and compensation.
This tension is shaping how the sector responds in practice. Much of the available technology is built by large American companies using training practices that many cultural workers view as extractive. There is real interest in how AI could support organizational capacity, particularly in administration, communications, and analysis, but also a clear reluctance to adopt tools that rely on those systems.

There are alternatives under discussion, including smaller models that can be trained on independent servers using ethically managed datasets, as well as the development of Canadian-based AI infrastructure. These approaches are promising but not yet widely available, and Canada does not currently have strong data sovereignty in this area.
There are also areas of potential. AI could expand the capacity of arts organizations, particularly where resources are stretched. If systems for attribution and compensation are established, this could also open new revenue streams for artists and rights holders. It may also support new approaches to creative development, and to initiatives such as documenting and revitalizing Indigenous languages and knowledge systems. Natasha Ita MacDonald of Heritage Lab, who appeared on one of the panels, highlighted the importance of Inuit-led approaches to this work, in which communities shape how technologies are developed and used.
The Summit also highlighted examples of artistic and organizational experimentation with AI. Les 7 doigts de la main presented work integrating AI-generated visualizations into performance, and Magnify Digital demonstrated tools for analyzing audience growth across digital platforms. These examples pointed to a more contained and intentional use of AI, where the technology supports human-led creative and administrative processes rather than replacing them.
Not all perspectives were equally present at the Summit. Representation from equity-seeking communities was limited, and several participants noted that inclusion often required active advocacy. There was also little sustained discussion of the relationship between AI and the climate crisis. These gaps matter, particularly because the communities least visible in the room, including the disability community, bring important experience with how these technologies are actually used in practice.
The Summit will feed into the development of Canada’s next national AI strategy, expected later in 2026. The federal government also announced the creation of an AI and Culture advisory committee, intended to provide ongoing advice as this work develops. At this stage, details on the committee’s membership, timeline, and role in decision-making have not been made public, leaving open questions about how sector input will shape policy.
For arts organizations, the question is not only whether to engage with AI, but under what conditions. Without clear approaches to consent, attribution, and compensation, many in the sector are hesitant to move forward. At the same time, the pace of development means these questions cannot be deferred indefinitely.

