Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Taking over for Gerard this time. Special thanks to him for starting this.)
Saltman has a new blogpost out he calls ‘Three Observations’ that I feel too tired to sneer at properly, but I’m sure it will be featured in pivot-to-ai pretty soon.
Of note is that he seems to admit chatbot abilities have plateaued under the current technological paradigm, by way of offering the “observation” that model intelligence depends logarithmically on the resources used to train and run it (i = log(r)), so it’s officially diminishing returns from now on.
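Just to spell out how punishing that “observation” is if you take it at face value: a minimal sketch, plugging made-up resource numbers into i = log(r), showing that every extra unit of “intelligence” costs ten times the compute of the last one.

```python
import math

# Hypothetical illustration of the claimed scaling: intelligence ~ log(resources).
# Each additional unit of "intelligence" requires 10x the resources of the previous one.
for resources in [1, 10, 100, 1_000, 10_000]:
    intelligence = math.log10(resources)
    print(f"{resources:>6} units of compute -> intelligence {intelligence:.0f}")
```

In other words, by his own framing, linear gains in capability demand exponential growth in spend, which is an odd thing to brag about.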
Second observation is that when a thing gets cheaper it’s used more, i.e. they’ll be pushing even harder to shove it into everything.
Third observation is that the socioeconomic value of linearly increasing intelligence is “super-exponential”,
which is hilarious.
The rest of the blogpost appears to mostly be fanfiction about the efficiency of their agents that I didn’t read too closely.
My big robot is really expensive to build.
If big robot parts become cheaper, I will declare that the big robot must be bigger, lest somebody poorer than me also build a big robot.
My robot must be made or else I won’t be able to show off the biggest, most expensive big robot.
QED, I deserve more money to build the big robot.
P.S. And for the naysayers, just remember that that robot will be so big that your critiques won’t apply to it, as it is too big.
christ this is dumb as shit
My ability to guess the solution of Boolean SAT problems also scales roughly with the log of the number of tries you give me.
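The joke actually checks out, for what it’s worth: blind guessing also shows “diminishing returns in the number of tries”. A throwaway sketch with a made-up toy formula (3 variables, half of all assignments satisfy it), simulating how often at least one random guess lands:

```python
import random

# Toy made-up instance: (x or y) and (not x or z). 3 variables, 4 of 8 assignments satisfy it.
def satisfies(x, y, z):
    return (x or y) and ((not x) or z)

random.seed(0)

def success_rate(tries, trials=2_000):
    # Estimate P(at least one of `tries` random guesses satisfies the formula).
    wins = 0
    for _ in range(trials):
        if any(satisfies(*(random.random() < 0.5 for _ in range(3)))
               for _ in range(tries)):
            wins += 1
    return wins / trials

for tries in [1, 2, 4, 8, 16]:
    print(f"{tries:>2} tries -> success rate ~{success_rate(tries):.2f}")
```

The curve saturates fast (0.5, 0.75, 0.94, …): each doubling of tries buys you less, which is exactly the kind of scaling being sold here as a grand law of intelligence.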
It probably deserves its own post on techtakes, but let’s do a little here.
Diogenes’s corpse turns
Of course Saltman means “all of my buddies” as he doesn’t consider 99% of the human population as human.
Ugh. Amongst many things wrong here, people didn’t jerk each other off to scifi/spec fic fantasies about the other inventions.
AGI IS NOT EVEN FUCKING REAL YOU SHIT. YOU CAN’T CURE FUCK WITH DREAMS
I must be blind.
“Intelligence” in no way has been quantified here, so this is a meaningless observation. “Data” is finite, which negates the idea of “continuous” gains. “Predictable” is a meaningless qualifier. This makes no fucking sense!
“Moore’s law” didn’t change shit! It was a fucking observation! Anyone who misuses “Moore’s law” oughta be Mangione’d. Also, if this is true, just show a graph or something? Don’t just literally cherrypick one window?
“Linearly increasing intelligence” is meaningless as intelligence has not been… wait, I’m repeating myself. Also, “super-exponential” only to the “socio” that Ol’ Salty cares about, which I have mentioned earlier.
Oh hm but none of them are true. What now???
Stopping here for now, I can only take so much garbage in at once.
dude’s gone full lesswrong. feels nostalgic.
You’d think that, at this point, LW-style AGI wish-fulfilment fanfic would have been milked dry for building hype, but apparently Salty doesn’t think so!