A deepfake is a picture or video of someone doing or saying something they never did. Pictures and video were once considered “proof” because alterations were easy to spot, as with the laughable removal of out-of-favour leaders from Soviet photographs.
With the advent of useful “AI”, making deepfakes has become easy, and that is destroying one of the ways we establish the truth. Deepfakes put people into positions they never took and have them say things they never said. The most common use is pornography, but putting words into someone’s mouth is potentially just as harmful.
The law will need to be changed to deal with this.