Project Origin is part of a wide collaboration between media and tech organisations to develop signals that can be tied to media content, allowing audiences to determine where content has come from and to check for any manipulation or changes made since it was originally released.
The project was started in 2018 by the 91Èȱ¬ with CBC/Radio Canada, The New York Times and Microsoft, born of a conviction that media publishers, working in concert with technology and civil society organisations, could create a reliable, effective system to signal content integrity. In 2020 we joined with partners in the Content Authenticity Initiative to establish the Coalition for Content Provenance and Authenticity (C2PA), an open standards body to develop and share our work, which has since been joined by a number of other organisations.
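To illustrate the general idea behind provenance signals, consider the minimal sketch below. It is not the C2PA specification: the manifest fields, hashing scheme and key handling are simplified assumptions for illustration only. A publisher hashes a media file, records basic provenance metadata, and signs the result, so that any later change to the content or its metadata invalidates the signature.

```python
# Illustrative sketch of a provenance-style signal: hash the content,
# describe it in a small manifest, and sign the manifest so tampering
# with either the content or the metadata can be detected.
# NOTE: this is NOT the C2PA format; field names and key handling here
# are simplified assumptions for illustration only.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric import ed25519


def create_manifest(media_bytes: bytes, publisher: str,
                    key: ed25519.Ed25519PrivateKey):
    """Build and sign a minimal provenance manifest for a piece of media."""
    manifest = {
        "publisher": publisher,
        "content_sha256": hashlib.sha256(media_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    signature = key.sign(payload)
    return manifest, signature


def verify_manifest(media_bytes: bytes, manifest: dict, signature: bytes,
                    public_key: ed25519.Ed25519PublicKey) -> bool:
    """Check that the manifest is authentic and still matches the media."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(signature, payload)  # raises on a bad signature
    except Exception:
        return False
    return manifest["content_sha256"] == hashlib.sha256(media_bytes).hexdigest()


# Example usage
key = ed25519.Ed25519PrivateKey.generate()
media = b"...image or video bytes..."
manifest, sig = create_manifest(media, "Example Publisher", key)
print(verify_manifest(media, manifest, sig, key.public_key()))                 # True
print(verify_manifest(media + b"tampered", manifest, sig, key.public_key()))   # False
```

In a real deployment the public key would be distributed through a trust framework rather than handed around directly, but the sketch shows the core promise: the signal travels with the content, and a reader can check both who issued it and whether the content has changed since.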
This time last year we noted that:
polarisation is being sharpened by partisanship and, as we’ve seen so clearly during the Covid crisis and the US election in particular, the partisans are being sold weaponised disinformation
A year on, things are even worse. Bad actors are using a real-time war to mount their latest disinformation offensives, and the anti-vaccination narrative has consolidated its own community, creating an information ecosystem that has morphed into a vast disinformation bubble filled with everything from conspiracies about climate change to the war in Ukraine. It's abundantly clear that audiences need help to identify trustworthy content.
A report published in the UK by the regulator Ofcom on March 30th 2022 showed that a third of internet users are unaware that online content might be false or biased.
- It found that 30% of UK adults who go online (that’s approaching 15m people) are unsure about, or don’t even consider, the truthfulness of online information. A further 6% – around one in every twenty internet users – believe everything they see online.
- Alongside this, although more than two thirds of adults said they were confident in identifying misinformation, only 22% were able to correctly identify the tell-tale signs of a genuine post. There’s no reason to suspect the UK is an outlier in this respect.
We believe that our work on Project Origin, alongside the promotion of media literacy and fact checking, offers a solution. In the last twelve months we have seen the C2PA release version 1.0 of its technical specification for digital provenance. We've built official and unofficial support for our work, with Sony among the latest large organisations to join the C2PA. Media partners have been taking part in a range of activities examining workflows; the latest of these will bring the IPTC (International Press Telecommunications Council) and its expertise on board.
As we push forward with the partner collaboration work underpinned by the Trusted News Initiative, the work we are doing on provenance feels like an even more important part of the efforts being made globally to tackle disinformation. It can only ever be one brick in the wall against disinformation but, if we get it right, it could be a keystone supporting the efforts of many others.
Resources
- The power of the machine – harnessing AI to fight disinformation. How technology is being used to detect and fight the spread of fake news. We hear from those developing solutions.
- First Draft Part One: Election disinformation tactics and strategies. In the first of a two-part blog on disinformation and elections from First Draft, Esther Chan writes about how journalists can tackle the problem.
- How not to amplify bad ideas. Mike Wendling, Editor of 91Èȱ¬ Trending, describes the problem of amplification and how to avoid it.
- EBU view: 100 years of public service media. The EBU's Noel Curran writes about the role members play in protecting democracy.
- Facing the information apocalypse. Radio-Canada's Jeff Yates writes about his role tackling disinformation.