The use of generative artificial intelligence raises several ethical, legal, qualitative, and environmental issues that must be understood and mastered in order to use these technologies consciously and responsibly.
The use of AI in the cultural sector must be thoughtful and regulated to avoid abuses that could affect content authenticity, data protection, the environment, and employment.
Avoid dependence and homogenization of content
Studies on the impact of generative AI tend to show a detrimental effect on users' critical thinking abilities¹, which is why it is important to remain in control of the tool and to assess its outputs in a reasoned, critical way.
- Generative AI should be used as an assistant tool and not as a substitute for human creativity.
- It is a statistical technology, capable of inventing information, book titles, or scientific articles based on probabilities. Therefore, a degree of skepticism toward its outputs is necessary, and sources should be requested.
- Personalizing and reinterpreting AI-generated content is crucial to preserve a strong and distinctive editorial identity.
Protect personal data and comply with GDPR
- Cultural institutions must choose solutions compliant with European regulations and avoid integrating sensitive data into uncertified tools.
- Inform the public about data usage and prioritize local solutions or AI trained on secure and transparent databases.
Limit algorithmic biases and ensure diversity of representation
- Verify that the tools used do not reinforce stereotypes in marketing campaigns, programming, or audience analyses.
- Combine algorithmic analysis with critical human review to avoid errors and discriminatory biases.
Limit environmental impact through responsible use
- Generative AI usage amplifies a trend already underway: the carbon footprint of our digital activities is projected to rise massively, by up to 60% by 2040 according to a joint study by ADEME and ARCEP².
- While generative AI models are energy-intensive to train, not all of them remain so during everyday use³.
- Favor more energy-efficient AI solutions and limit superfluous use (avoid generating content systematically).
- Raise team awareness about the environmental impact of AI queries and encourage responsible digital practices.
Anticipate work transformations and support teams
- Train professionals for a critical and informed use of AI so that it becomes a lever for innovation rather than a constraint.
- Encourage active monitoring of changes in professions impacted by AI, especially in administrative and creative sectors.
- Reflect on the risk of job replacement by AI (photographers, authors, graphic designers...), the impact on creative quality, and the institution's social responsibility in this area.
Generative AI is a powerful lever for cultural organizations, facilitating content creation and data analysis. It is, however, an opportunity to seize with discernment: its use must fit within an overall strategy and take into account the ethical, ecological, and human challenges outlined above. A tool remains a tool; it is up to humans to give it meaning. By training themselves and adopting a critical approach, cultural professionals can harness these technologies while preserving the uniqueness of their institution. Training remains the best way to ensure a reasoned, situated use of these technologies: understanding their core functions, recognizing and avoiding algorithmic biases, and identifying relevant use cases. This makes it possible to rely on AI without giving it a blank check.