I watched the latest Google I/O event the other day and had an interesting realization. Everyone knows that "everything will be tokenized." It's been said a million times by now. But I hadn't realized that it will extend to digital files and media of any format: anything that requires knowledge of its source. Allow me to elaborate...

With the onslaught of AI infiltrating every aspect of our media experience, it has become very clear that in the very near future no one will be able to tell what is real. We will soon be online with no idea whether the images (or any media files, for that matter) are actually what they portray. Was the vacation photo real or rendered? Were there really that many fans at that concert? Did that war atrocity actually happen? The need to know the source, or Truth, of an image can quickly become real and have real consequences. Of course this is nothing new, but the quality and ease of use of the tools for twisting the Truth are reaching new levels of convenience and will soon propagate everywhere.

So how does one go about resolving this? There is only one answer that I can think of.

Every piece of code and every file will need to be tokenized. Using images as an example, the only way to know whether a photo is real would be to start at the camera. The camera would need to run an OS whose transaction hash is stored on the network. When the camera OS runs, it takes a hash of its own code and verifies it against the tokenized hash published on the network by the camera's developers. That code (and, by extension, its hash) will have been verified by multiple third-party auditors (most likely DAOs themselves). The camera then takes a picture for you and, via the verified code, hashes the image without editing it and submits a transaction to the network containing that image hash. Later, when you receive the image, your local software hashes the local copy again and verifies it against the hash recorded on the network. At that point, and only at that point, can you have virtually full certainty that the image you are viewing has not been altered or edited.
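To make the capture-then-verify flow above concrete, here's a minimal sketch in Python. The `ledger` set and the two function names are hypothetical stand-ins I made up for illustration; on a real network the publish step would be a signed transaction (e.g. to a consensus/ledger service), not an in-memory set.

```python
import hashlib

# Hypothetical stand-in for the public ledger. In reality this would be
# a transaction recorded on a decentralized network, not a local set.
ledger = set()

def publish_hash(data: bytes) -> str:
    """Camera side: hash the raw file bytes and record the hash.

    On a real device, the verified camera OS would do this at capture
    time, before any editing is possible."""
    digest = hashlib.sha256(data).hexdigest()
    ledger.add(digest)
    return digest

def verify(data: bytes) -> bool:
    """Viewer side: re-hash the received file and check the ledger.

    A match means the bytes are identical to what was published;
    any edit changes the hash, so verification fails."""
    return hashlib.sha256(data).hexdigest() in ledger

# Camera captures and publishes.
original = b"raw image bytes from the sensor"
publish_hash(original)

# Viewer verifies what they received.
print(verify(original))                  # the untouched file checks out
print(verify(b"edited image bytes"))     # any alteration fails
```

The key design point is that the hash, not the image itself, goes on the network: the ledger only needs to store a small fingerprint, and anyone holding the full file can recompute it locally.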

Now consider that for every single piece of data whose authenticity you care about. If it can't be verified in this (or a similar) manner, then you have to assume it's fiction. There is no other choice.

So when we say the future will tokenize everything, we really mean everything you care about, down to every file or data packet that has meaning to you. Otherwise it's fiction.

Pretty crazy once you start to think about it. Anyway, just wanted to share on the only subreddit where I know "you'll get it." :)

Thanks for listening.


About Community

Hedera is a decentralized, open-source, proof-of-stake public ledger that utilizes the leaderless, asynchronous Byzantine Fault Tolerance (aBFT) hashgraph consensus algorithm. It is governed by a diverse, decentralized council of leading enterprises, universities, and web3 projects from around the world.
Created Mar 14, 2018

30.5k

Gossipers

2.2k

Gossiping

Top 5%

Ranked by Size

r/Hedera Rules

1.
Spam/Low effort
2.
No NSFW
3.
Be Kind & Respectful
4.
Don’t shill other crypto projects
5.
FUD
6.
Political discussion
7.
Complaints

Quick Glossary

Hedera™ = The network
Hashgraph = The technology
HBAR = The cryptocurrency

Related Subreddits

r/HbarNFTspace

355 members

r/HBAR_Foundation

4,562 members

Moderators

Moderator list hidden. Learn More