The importance of coming to your own conclusions and the effects of commenters here


Hey all, been on this sub since David Grusch and this is my first time making a post. I was just looking at the Lue Elizondo tweet post that's getting tons of traction and hate. Now, I have no idea if Lue is truthful or not, but I do want to offer a perspective on disinformation bots on this subreddit from the perspective of a software engineer. I just opened ChatGPT and asked it the following:

https://preview.redd.it/l3gjpma6p74d1.png?width=792&format=png&auto=webp&s=3a99a3c87afc34827c38f5bfc9c7e6492a387418

For 10 seconds of manual effort, that's a pretty convincing-looking reply, don't you think? Now, what if you could do this same process programmatically? And what if you could generate thousands of replies in minutes that are just as convincing, from profiles that seem like actual people? Anyone with some software engineering background could very easily use this capability to generate thousands of accounts, make them all look legitimate by posting in normal subreddits for months or years, and then have them occasionally jump into whatever subject they see fit (UAPs, Biden, Trump, Gaza) and instantly generate whatever positioned response they'd like in order to stir the pot. All of this could be done programmatically using simple APIs (application programming interfaces are basically the interface that code talks to, the way graphical user interfaces are the interface for human users) to spin up tens of thousands of accounts in a relatively short amount of time. I imagine Reddit has some sort of bot detection, but there's absolutely no way it can keep up with the rate AI is moving.
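To make the "simple APIs" point concrete, here's a minimal sketch of what assembling a single reply-generation request might look like. The endpoint URL, model name, and field names loosely mirror common chat-completion-style LLM APIs but are purely illustrative assumptions, and the actual network call is deliberately left out — the point is just how little code the core step takes:

```python
import json

# Hypothetical endpoint; real LLM providers expose something similar.
API_URL = "https://api.example.com/v1/chat/completions"

def build_reply_request(thread_text: str, stance: str) -> dict:
    """Assemble the JSON payload a script would POST to generate one reply.

    `thread_text` is the post being replied to; `stance` is whatever
    position the operator wants the reply to push.
    """
    return {
        "model": "some-model",  # placeholder model name
        "messages": [
            {
                "role": "system",
                "content": f"Write a short, casual forum reply taking this stance: {stance}",
            },
            {"role": "user", "content": thread_text},
        ],
    }

# One payload per reply; a loop over threads and accounts is all that's
# left to scale this up.
payload = build_reply_request("Thread: discussion of a recent tweet", "skeptical")
print(json.dumps(payload, indent=2))
```

Sending this with any HTTP client and posting the response through a platform's API is the whole loop — which is why a single person with a small server can plausibly run thousands of these.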

Now, I could write some Python scripts that do everything I just mentioned above in probably a week's time, have a fairly simple implementation deployed, and it would cost me almost nothing to run on a small server somewhere. Can you imagine the level of sophistication that state actors or other bad actors could bring to something like this? This goes far beyond UAPs, but I like this sub and figured it was fitting. Everything you see, from UAPs to politics to war to foreign affairs to the economy, could be (and I imagine often is) very easily manipulated to distort public perception and make folks second-guess their intuitions.

Now imagine tens of thousands, hundreds of thousands, or millions of these bots across all social media platforms. And it's not just one side: foreign adversaries are doing the same exact thing, creating their own distortion of perception to counter what trolls, state actors, etc. are trying to do. The average person is left thinking, "Wow, I guess the world is really fucked and everyone is divided." It's something that's deeply damaging to our collective consciousness, and I think it's probably the biggest thing responsible for the tension everyone is feeling right now.

I'm not here to say UAPs are real, that Grusch and others are truthful or liars, or anything like that, because at the end of the day I really have no idea. But that raises the question: what does the average person do knowing all of this? As far as I can tell, the best we can do is trust our intuition and gut feeling as we wade through the news on this subject (and many others) and be careful about how other comments change our perception of things. I'd personally rather be wrong and made a fool of when the dust settles on my own opinion than be wrong and made a fool for subconsciously adopting the viewpoints of some transistors switching on and off in a server farm somewhere.

Curious about others' thoughts on this; my very brief take on this in the Lue thread is getting bombarded with downvotes, so I wanted to check the temperature of the water in its own thread.

submitted by /u/IrresistibleMittens