I just watched the recent U.S. Senate hearing about licensing and regulating AI, and the vibes were...off.
one of the major issues that came up was the potential cascade of misinformation that's now very cheap to manufacture at scale. while this is a real concern, I think something else was being said between the lines. sometimes things get labeled "criminal", "illegal", or "misinformation" not because they are objectively, ethically wrong, but because some status quo is being threatened. I think the recent talk of suppressing AI research is really about a small cohort of society trying not to lose control over the flow of information and global literacy.
here's some background:
these are just some notable examples, but make no mistake: there are plenty of cases of journalists revealing abuses of power and wealth. even now, subreddits like /r/ElonJetTracker, while not necessarily revealing wrongdoing, demonstrate a hard-to-suppress form of public awareness directed at one of the richest, most powerful individuals on the planet.
also consider all the developments since 2020: the George Floyd protests, the aftermath of the U.S. general election, the drama of /r/wallstreetbets. people are getting bolder because they no longer have to advocate or blow the whistle entirely on their own. LLMs have been scaled up to the point that models like GPT-4 can handle every commonly spoken language around the globe. moving forward, it will be easier to automate correspondence with firsthand sources, and easier for whistleblowers to evade retaliation for reporting wrongdoing. where sources get corroborated and fact-checked at scale, a free, credibly-neutral narrative will percolate through the public more extensively.
right now, researchers are using these tools to parse massive documents like SEC filings. it's easier than ever to report union-busting, especially with so many new unions forming. I don't think we've really seen the main body of developments that will unfold in the coming months and years, but it all comes back to the public's increased ability, with tools like AI, to create transparency and accountability of our own choosing. everyone can already make their own podcast or blog, but with these recent developments, everyone can effectively hire their own journalism team, or facilitate more whistleblowing. it's no secret that a lot of people have been silenced and a lot of rumors swept under the rug, but now the cost of wholly suppressing such information, of "burying the lede", will skyrocket. I think we're about to see a Streisand effect on steroids.
I'll admit that "The Singularity Papers" may not be a single release like the leaks of the past. I think, as time progresses, it will simply become the moniker for a period in which governments, the wealthy, and the powerful move to suppress AI tooling for journalistic movements because "it's creating misinformation", and for the subsequent chain of events in which that suppression of massive public accountability fails spectacularly.
what do you think?