
AI is changing how content is created, and transparency is key for businesses to build trust. Australians need to be confident that they can recognise when digital content has been created or changed using AI.

The National AI Centre’s new guidance, Being clear about AI-generated content, aims to support businesses in being more transparent about the content they create. This can build trust with customers and employees.

It will also help AI developers and organisations using AI systems.

Why transparency is important 

Being clear about when content has been modified or generated by AI can help businesses:

  • reduce regulatory and reputational risks
  • build trust in digital content
  • gain a competitive advantage in the digital economy. 

What the guidance covers 

The guidance outlines practical steps businesses can take to make AI-generated content easy to identify:

  • Labelling – adding a visible notification to show if content is AI-generated and where it came from.
  • Watermarking – embedding information in digital content to help trace its origin or verify authenticity.
  • Metadata recording – including descriptive information about the content within the file (illustrated in the example below).

Businesses should choose the level of transparency that suits their context and the potential impact of the content. For example, an internal news article may only need a simple label, while official documents might need watermarking and metadata.
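To illustrate the metadata recording step, the sketch below shows one way an AI-generated image could carry simple provenance information inside the file. It is a minimal sketch only, assuming Python and the Pillow imaging library; the field names used (ai_generated, generator, created) are illustrative and are not prescribed by the guidance.

  # Minimal sketch: embed and read back provenance metadata in a PNG file.
  # Assumes the Pillow library; field names are illustrative only.
  from datetime import datetime, timezone

  from PIL import Image
  from PIL.PngImagePlugin import PngInfo

  def add_provenance_metadata(src_path: str, dest_path: str) -> None:
      """Embed simple provenance fields as text chunks inside the PNG file."""
      image = Image.open(src_path)

      metadata = PngInfo()
      metadata.add_text("ai_generated", "true")      # hypothetical field name
      metadata.add_text("generator", "example-image-model")  # hypothetical field name
      metadata.add_text("created", datetime.now(timezone.utc).isoformat())

      image.save(dest_path, pnginfo=metadata)

  def read_provenance_metadata(path: str) -> dict:
      """Read the embedded text chunks back so the content's origin can be checked."""
      return dict(Image.open(path).text)

  if __name__ == "__main__":
      add_provenance_metadata("generated.png", "generated_labelled.png")
      print(read_provenance_metadata("generated_labelled.png"))

Embedded metadata of this kind complements a visible label or watermark rather than replacing it.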

Keeping up with best practice 

The guidance is based on industry best practice and emerging global standards. We will update it as technology and international standards change.

This news article was produced with AI assistance.