halfbakery

Deepfake vs. Bitcoin

Encrypted verification of video content

Very soon, it seems, we will no longer be able to trust our eyes when it comes to video speeches of, well, anyone important.  Deepfake videos of every major candidate are most probably being developed at this very moment, showing them saying or doing something slightly off of their usual, in order to tilt the next election cycle and drive the news.

Fortunately, we have also developed some methods of transaction verification.  Bitcoin records every transaction of a mined coin as it progresses through its economic life.  Bear with me.  Computer cryptography, programming, and hashing aren't really my thing.  Computers are barely my thing.

Video viewership can be seen as a type of transaction.  An authenticated video of a real person at a real event could be recorded with a hash of the event's location, date, etc. that could be processed by a datacenter and turned into a key of some type.  News orgs wishing to verify that said video was real could register with that datacenter and retrieve a handshake of some form that agrees with the key.  Or the video itself could be hashed into a strong encryption which is then verified against a key somehow.
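
The scheme described above can be sketched in a few lines. This is a minimal illustration, not Bitcoin's (or anyone's) actual protocol: all names here are hypothetical, and the `registry` dict stands in for the datacenter's key store.

```python
# Sketch of the idea above: hash the raw video bytes together with event
# metadata into a fingerprint that a registry (the "datacenter") stores,
# and that a news org can later check against its own copy.
import hashlib
import json

def fingerprint(video_bytes: bytes, metadata: dict) -> str:
    """Combine video content and event metadata into one SHA-256 digest."""
    h = hashlib.sha256()
    h.update(video_bytes)
    # Canonical JSON so the same metadata always hashes the same way
    h.update(json.dumps(metadata, sort_keys=True).encode())
    return h.hexdigest()

registry = {}  # stands in for the datacenter's key store

def register(video_bytes: bytes, metadata: dict) -> str:
    """The original recorder registers the authentic clip's fingerprint."""
    key = fingerprint(video_bytes, metadata)
    registry[key] = metadata
    return key

def verify(video_bytes: bytes, metadata: dict) -> bool:
    """A news org re-hashes the copy it received and checks the registry."""
    return fingerprint(video_bytes, metadata) in registry

# Usage
clip = b"...raw video frames..."
meta = {"location": "Springfield", "date": "2019-06-10"}
key = register(clip, meta)
assert verify(clip, meta)                     # authentic copy checks out
assert not verify(clip + b"tampered", meta)   # any edit breaks the match
```

Any single changed byte in either the video or the metadata yields a different digest, which is the "handshake that agrees with the key" in miniature.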

The primary weakness of this type of verification method would be time: the true video would have a hard time staying ahead of the deepfakes shared on social media or wherever these types of things are disseminated.

RayfordSteele, Jun 10 2019



       Before the Singularity, this would have made some sense. But now, when everything is just an artefact of [Ian Tindale]'s deranged imagination, it seems a trifle pointless.
8th of 7, Jun 10 2019

       Perhaps the singularity will rule us this way. For some reason this seems strangely comforting.
RayfordSteele, Jun 10 2019

       First off, Bitcoin and any other blockchain technologies are doomed to failure. Eventually people will catch on to just how much energy it takes to keep those platforms alive - it's ridiculous. Currently it's 40 TWh/year just for Bitcoin, which would power 6 million American households. If the major hosting companies were to provide a paid crypto service, we'd be talking about a fraction of the energy and cost. Warning: get out of blockchain technologies.   
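
       Taking the annotation's two figures at face value, a quick back-of-the-envelope check of what they imply per household:

```python
# Sanity check of the figures quoted above (40 TWh/year, 6 million households).
bitcoin_twh_per_year = 40
households = 6_000_000

# 1 TWh = 1e9 kWh
kwh_per_household = bitcoin_twh_per_year * 1e9 / households
print(round(kwh_per_household))  # 6667 kWh/year per household
```

That works out to roughly 6,700 kWh per household per year, which is in the general range of typical estimates of American household electricity use, so the two figures are at least mutually consistent.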

       Now to address the deep fake portion of the title and idea.   

       The internet's current policy is to demonetise channels, or just delete channels and videos, that are seen as going against a truthful (and probably US democratic lefty) message. Even non-faked alternative-view content is purged, which is a problem if you happen to be posting an alternate and true view of the world (dinosaurs are blasphemy to creationists).   

       So the problem is much deeper than fakes; it's how we treat the fakes and grey areas, particularly with political content. Just because something is a fake, or pokes fun at politicians, e.g. Pelosi or Trump, doesn't mean it's not legitimate content - it should just be categorised as humour or satire rather than news.   

       Currently there is no working category system on YouTube, so any edgy humour, any alternative content, or even any political opinion that rubs the management the wrong way is either deleted or demonetised. It's like they've forgotten that media has a full spectrum of stuff, each with its own merits.   

       Personally I don't like stuff such as Colbert's mock shooting of the NRA spokeswoman, but I don't think he should be sacked and CBS closed down because of it. I just find it offensive, and it makes me want to watch less of his stuff.   

       [+] But sure, crypto will help to verify the content tags and ratings applied by the original creator. Absolute truth, though, is a horrible and boring standard that excludes e.g. humour from all media - let's not do that.
bigsleep, Jun 11 2019

       New Scientist, 16th March 2019, p.22-3, "Don't believe your eyes: smartphones equipped with artificially intelligent cameras are changing how we see reality".
P.23 column 2: "If a whole generation grows up not trusting photos or videos, what do we have left?" says Raskar. Digital watermarking and cloud-based encryption are possibilities, he says, as is using the blockchain technology behind bitcoin.
pocmloc, Jun 11 2019
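
       For this purpose, the "blockchain technology behind bitcoin" mentioned in the quote reduces to an append-only hash chain: each published record commits to the one before it, so rewriting history is detectable. A minimal sketch (nothing like Bitcoin's real data structures, just the core idea):

```python
# Tamper-evident log of published videos as a simple hash chain.
import hashlib

GENESIS = "0" * 64  # placeholder hash before the first record

def record_hash(prev_hash: str, payload: str) -> str:
    """Each entry's hash commits to its predecessor's hash."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    """Return a list of (payload, hash) pairs, each chained to the last."""
    chain = []
    prev = GENESIS
    for p in payloads:
        prev = record_hash(prev, p)
        chain.append((p, prev))
    return chain

def chain_valid(chain) -> bool:
    """Recompute every link; any altered record breaks all later hashes."""
    prev = GENESIS
    for payload, h in chain:
        if record_hash(prev, payload) != h:
            return False
        prev = h
    return True

chain = build_chain(["video A published", "video B published"])
assert chain_valid(chain)
chain[0] = ("video A (doctored)", chain[0][1])  # tamper with history
assert not chain_valid(chain)
```

Bitcoin adds proof-of-work and a distributed network on top, but the tamper-evidence relevant to video provenance comes from this chaining alone.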

       So I'm on the right track. I'll take that as confirmation.
RayfordSteele, Jun 11 2019

       Isn't the bigger problem not trusting reality?   

       You cannot fix fakes with encryption, unless we imagine all recording equipment having encryption built in, in a way that is not modifiable or "turn-offable".   

       Computer analysis will be able to distinguish fakes for a long time to come, in the same way that you can still tell CGI even when the movie costs $250 million to make. But it will get harder.   

       I suspect that, given where drones and video generally are going, people who care will likely have alibi drones. You could come up with a system where their video would not be accessible without witness cooperation (to avoid self-incrimination), but multiple sources could then verify where you were and what you were doing when the deepfake allegedly occurred.
theircompetitor, Jun 11 2019

       This is why I have for a while wanted a way to cryptographically sign things I say or write before they leave my brain, using a private key that never leaves my brain. Unfortunately, I have as yet no idea how to begin to conceive of a mechanism that could accomplish that. A probably minor difficulty that occurs to me right now is that it's impossible with current algorithms (that I know about) to sign a text before the wording is finalized, while things that one speaks and writes hardly ever have the final wording before they come out.   

       On the other hand, deepfakes give a lot more plausible deniability, which can be a good thing too (for the individual accused, at least). For example, if you want to get away with having said racist things in the past, you can just claim someone faked the video. But if you signed the words when they came out of your mouth, you can't.
notexactly, Jun 11 2019

       Didn't I read something about attempts at using quantum mechanics for information security awhile back where you could always detect if a transmission was compromised? Couldn't something like that be utilized?
RayfordSteele, Jun 11 2019

       //Couldn't something like that be utilized?//   

       The quantum method just makes eavesdropping detectable - reading the message in transit disturbs it. It's only useful for securing transmission, and mostly useless here.   

       //You cannot fix fakes with encryption//   

       Digital signing could easily determine authenticity, though. Something authenticated as coming from Reuters could be trusted as a non-fake unless marked as such in signed metadata. I couldn't find a working standard, but a browser could cryptographically verify page elements using a mechanism similar to SSL.   
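
       A sketch of that signed-metadata idea. For simplicity it uses a keyed hash (HMAC) from the Python standard library with a hypothetical shared secret; a real outlet would use an asymmetric signature tied to a certificate, as SSL does, so anyone could verify without holding the secret:

```python
# Signing and verifying content tags attached to a video's digest.
import hashlib
import hmac
import json

SECRET = b"reuters-demo-key"  # hypothetical; a real system would use key pairs

def sign_metadata(video_digest: str, tags: dict) -> dict:
    """Attach tags (authentic/fake, news/satire, ...) and sign them."""
    meta = {"video": video_digest, "tags": tags}
    blob = json.dumps(meta, sort_keys=True).encode()
    meta["sig"] = hmac.new(SECRET, blob, hashlib.sha256).hexdigest()
    return meta

def verify_metadata(meta: dict) -> bool:
    """A browser-side check that the tags weren't altered in transit."""
    claimed = meta.get("sig", "")
    unsigned = {k: v for k, v in meta.items() if k != "sig"}
    blob = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET, blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)

meta = sign_metadata("ab12...", {"authentic": True, "category": "news"})
assert verify_metadata(meta)
meta["tags"]["category"] = "satire"  # a forged tag fails verification
assert not verify_metadata(meta)
```

The point is that the category labels discussed below (news / commentary / satire / humour) only become trustworthy at a glance once they travel with a signature like this.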

       The biggest problem right now is that there is absolutely no categorisation for the newer media outlets. The halfbakery has categories, but Google and YouTube* do not. Just allowing videos to have attributes of authentic / fake, news / commentary / satire / humour, etc. would go a long way towards at-a-glance recognition of what kind of content it is.   

       * If you browse YouTube categories you'll soon find they are based on image recognition, very poorly done.
bigsleep, Jun 12 2019

       Errr, how do we know you posted this?
not_morrison_rm, Jun 13 2019

       Come to that, how do you know anything ?   

       <Wonders if teaching [n_m_rm] phenomenology will cause him to self-destruct/>   

       <Decides it's worth a try, cleans blackboard, shuffles lecture notes/>
8th of 7, Jun 13 2019

       There is nothing.
pocmloc, Jun 14 2019

