Microsoft has released new tools to fight deepfakes: AI-doctored images or videos that can be exploited to spread disinformation. The detection tool, called Microsoft Video Authenticator, analyses a video frame by frame and gives a confidence score indicating the likelihood that it was artificially created. The tool will initially be available only to political and media organisations. The company has also announced a separate system that lets content producers add hidden code to their footage so that any subsequent changes can be easily flagged.
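The article does not describe how the tamper-flagging system works internally, but the general technique it alludes to, recording a cryptographic fingerprint of footage at publication time and checking it later, can be sketched as follows. All names here are illustrative, and SHA-256 is an assumption, not a confirmed detail of Microsoft's system:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest acting as a tamper-evident fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_unmodified(data: bytes, recorded_hash: str) -> bool:
    """Compare footage bytes against the hash recorded when it was published."""
    return fingerprint(data) == recorded_hash

# Publisher records a fingerprint of the original footage bytes.
original = b"raw bytes of the original footage"
recorded = fingerprint(original)

# A viewer's player can later verify the footage it received.
print(is_unmodified(original, recorded))         # True: footage matches
print(is_unmodified(original + b"!", recorded))  # False: any change is flagged
```

Because even a single-byte change produces a completely different digest, any edit to the footage after the hash was recorded is detectable.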