Deepfakes present a tricky problem for Adobe’s business strategy

Now, given the looseness with which we define deepfakes these days, Adobe has arguably been making such tools for years. These include the aforementioned Neural Filters, as well as more functional tools like AI-assisted masking and segmentation. But Project Morpheus is obviously much more deepfakey than the company’s earlier efforts. It’s all about editing video footage of humans - in ways that many will likely find uncanny or manipulative.

Changing someone’s facial expression in a video, for example, might be used by a director to punch up a bad take, but it could also be used to create political propaganda - e.g. making a jailed dissident appear relaxed in court footage when they’re really being starved to death. It’s what policy wonks refer to as a “dual-use technology,” which is a snappy way of saying that the tech is “sometimes maybe good, sometimes maybe shit.”

This, no doubt, is why Adobe didn’t once use the word “deepfake” to describe the technology in any of the briefing materials it sent to The Verge. And when we asked why this was, the company didn’t answer directly but instead gave a long answer about how seriously it takes the threats posed by deepfakes and what it’s doing about them.

Adobe’s efforts in these areas seem involved and sincere (they’re mostly focused on content authentication schemes), but they don’t mitigate a commercial problem facing the company: that the same deepfake tools that would be most useful to its customer base are those that are also potentially most destructive. Take, for example, the ability to paste someone’s face onto someone else’s body - arguably the ur-deepfake application that started all this bother.