a delicate reminder now that @fasterandworse@awful.systems has undeleted this thread:
we know what a crypto shill looks like and we are watching
“incredibly dangerous drug advice that doesn’t get called out” is practically an orange site specialty at this point. it’s weird that their world-class mods don’t even catch the obvious cases
I’d be very surprised if that were the case, given this is the description of the $4.20 tier of Chill Goblin’s patreon:
VERY chill. You just bought yourself a Chilled Gobbo NFG, which is like an NFT except it's just a picture I have saved on a google gallery. I'll rename the one you pick with your username so everyone will know EXACTLY who it belongs to. I'm hoping one day Chilled Gobbo NFGs will replace both money and art, by donating at this level you will be helping my dream come true!
it’s not high parody (heh heh) but it’s pretty clear he’s not an NFT fan
daily use is definitely possible! you’ll just get absolutely nothing out of it other than extremely minor but cumulative damage to your heart (which notably doesn’t happen when you’re not doing weird shit with your acid and you give it a fucking week or two off)
as a long-term supporter of the idea that folk should be able to just get high if they want to, it’s incredibly hard to square my beliefs with the existence of a person like Scott, especially having directly dealt with the outcomes of Scott’s advice amongst folks I have worked with
in general, the capitalist idea that even getting high needs to have utility has led to the normalization of some of the worst drug culture I’ve ever seen, from the can of worms that is lifestyle microdosing to some much worse shit than that
What about addiction risk?
The data on this are really poor because it’s hard to define addiction. If a prescription stimulant user uses their stimulants every day, and feels really good on them, and feels really upset if they can’t get them…well, that’s basically the expected outcome.
did I just watch Scott try to reply guy addiction out of existence?
also, all the paragraphs Scott uses to call his patients liars and insinuate that other psychiatrists have guilty consciences are really uncomfy? cause it really feels like a normal response to the situations he’s describing is “boy I’m getting a lot of folks with ADHD and neurodivergent traits and all they seem to want is one treatment for it, maybe I should examine that more closely” and not “look at all these normal-brained fucks with intense problems focusing coming to me for drugs, which I’m certain the other pill-pushers in my industry will give them without question. welp time to not even attempt to establish a therapeutic dosage or even guidelines around how much to take since this is a fun safe party drug”
huh, that’s a good question. from https://join-lemmy.org/docs/contributors/07-ranking-algo.html it looks like it should bump it, but I haven’t been seeing that behavior in my UI (which I usually just perch on the default front page)
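for the curious, the hot-rank formula in those linked Lemmy docs boils down to a score term divided by a time-decay term. a minimal sketch (the gravity constant of 1.8 and the +2/+3 offsets are what the docs describe; treat the exact numbers as an assumption, since the devs may have tweaked them since):

```python
import math

def lemmy_hot_rank(score: int, hours_since_post: float, gravity: float = 1.8) -> float:
    """Sketch of the hot-rank formula from the linked Lemmy docs:
    rank = log(max(1, 3 + score)) / (hours + 2) ** gravity
    where score = upvotes - downvotes. The log flattens big scores
    and the (hours + 2)^gravity denominator decays old posts fast."""
    return math.log(max(1, 3 + score)) / (hours_since_post + 2) ** gravity

# time decay dominates: a fresh post with a modest score
# outranks a two-day-old post with five times the votes
fresh = lemmy_hot_rank(score=10, hours_since_post=1)
old = lemmy_hot_rank(score=50, hours_since_post=48)
```

note that nothing in this formula accounts for new comments, only post age and vote score, which would explain why commenting doesn't visibly bump threads on the default front page.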
I’ve fucking seen it myself! I’ll never understand why my industry has such a casual relationship with problem drug use
the way they use the real hardware’s cathode fade in the animations is awesome!
let me know if a big NotAwfulTech thread for demos would be fun and I’ll start one, since the demoscene is almost always not awful by definition
imagine if you had taken even a moderate break from your bullshit and tried posting like this instead
(TREACLESP)
extremely long pause and 3 GC cycles as the Lisp machine heats up the room next to your terminal
T
like Christ look at all the nonsense they posted to try to distract from the adderall thing
it seems like it isn’t possible, unfortunately. it looks like the lemmy devs are finally starting to work on more mod tooling, so we may see some improvement here soon; if not, it can go on my queue as I get back into modifying our version of lemmy
First time I met him, in a professional setting, he had his (at the time) wife kneeling at his feet wearing a collar.
hold on, you can’t just write this paragraph and then continue on as if it’s not a whole damn thing
ah yes the first time I met yud he non-consensually involved me in his bondage play with his wife (which he somehow incorporated into a business meeting)
wait, so the AI is just your fears about capitalism?
Same elementary school logic but I mean this is how a nuke works.
what. no it isn’t
look I don’t want to shock you but that’s basically what they get paid to do. and (perverse) incentives apply - of course goog isn’t just going to spend a couple decabillion then go “oh shit, hmm, we’ve reached the limits of what this can do. okay everyone, pack it in, we’re done with this one!”, they’re gonna keep trying to milk it to make some of those decabillions back. and there’s plenty of useful suckers out there
a lot of corporations involved with AI are doing their damnedest to damage our relationship with the scientific process by releasing as much fluff disguised as research as they can manage, and I really feel like it’s a trick they learned from watching cryptocurrency projects release an interminable number of whitepapers (which, itself, damaged our relationship with and expectations of the engineering process)
oh god, rationalists really were those kids and they never grew out of it
What I’m trying to get at is that the practicalities of improving technology are generally skated over by singularitarians in favor of imagining technology as a magic number that you can just throw “intelligence” at to make it go up.
this is where the singularity always lost me. like, imagine, you build an AI and it maxes out the compute in its server farm (a known and extremely easy to calculate quantity) so it decides to spread onto the internet where it’ll have infinite compute! well congrats, now the AI is extremely slow cause the actual internet isn’t magic, it’s a network where latency and reliability are gigantic issues, and there isn’t really any way for an AI to work around that. so singularitarians just handwave it away
or like when they reach for nanomachines as a “scientific” reason why the AI would be able to exert godlike influence on the real world. but nanomachines don’t work like that at all, it’s just a lazy soft sci-fi idea that gets taken way too seriously by folks who are mediocre at best at understanding science
listen, 99% of projects trying to make a self-lifting crane fail. but can you imagine the money you could make if you invested in the 1% that succeeded in spite of physics and common sense? send your investment to the following monero address (SEC enforcement agents do not have my permission to view this post!)