

Psh, I only use the Flushvalve Pro Vowel Pack. You can’t beat them in terms of value for your money.
The bill mandates safety testing of advanced AI models and the imposition of “guardrails” to ensure they can’t slip out of the control of their developers or users and can’t be employed to create “biological, chemical, and nuclear weapons, as well as weapons with cyber-offensive capabilities.” It’s been endorsed by some AI developers but condemned by others who assert that its constraints will drive AI developers out of California.
Man, if I can’t even build homemade nuclear weapons, what CAN I do? That’s it, I’m moving to Nevada!
It’s so wild how ChatGPT and this “style” of AI literally didn’t exist two years ago, yet we’re all expected to believe it’s this essential, indispensable, irreplaceable tool that people can’t live without, and actually you’re the meanie for suggesting people do something the exact same way they would have in 2022 instead of using the environmental-disaster spam machine
yeah, I was more thinking of like my phone’s notes app lol. Also, freeform computer note-taking requires weird hardware and can’t search the text of my notes, so, still a tradeoff…
I’ve thought about a similar idea before in the more minor context of stuff like note-taking apps – when you’re taking notes in a paper notebook, you can take notes in whatever format you want, you can add little pictures or diagrams or whatever, arranged however you want. Heck, you can write sheet music notation. When you’re taking notes in an app, you can basically just write paragraphs of text, or bullet points, and maybe add pictures in some limited predefined locations if you’re lucky.
Obviously you get some advantages in exchange for the restrictive format (you can sync/back up things to the internet! you can search through your notes! etc) but it’s by no means a strict upgrade, it’s more of a tradeoff with advantages and disadvantages. I think we tend to frame technological solutions like this as though they were strict upgrades, and often we aren’t so willing to look at what is being lost in the tradeoff.
God, that would be the dream, huh? Absolutely crossing my fingers it all shakes out this way.
Can AI companies legally ingest copyrighted materials found on the internet to train their models, and use them to pump out commercial products that they then profit from? Or, as the tech companies claim, does generative AI output constitute fair use?
This is kind of the central issue to me honestly. I’m not a lawyer, just a (non-professional) artist, but it seems to me like “using artistic works without permission of the original creators in order to create commercial content that directly competes with and destroys the market for the original work” is extremely not fair use. In fact it’s kind of a prototypically unfair use.
Meanwhile Midjourney and OpenAI are over here like “uhh, no copyright infringement intended!!!” as though “fair use” is a magic word you say that makes the thing you’re doing suddenly okay. They don’t seem to have very solid arguments justifying them other than “AI learns like a person!” (false) and “well google books did something that’s not really the same at all that one time”.
I dunno, I know that legally we don’t know which way this is going to go, because the ai people presumably have very good lawyers, but something about the way everyone seems to frame this as “oh, both sides have good points! who will turn out to be right in the end!” really bugs me for some reason. Like, it seems to me that there’s a notable asymmetry here!
Language designers are obligated to be linguists as well.
This is why I love Perl. Larry Wall has a linguistics background and created the only programming language where you can conjugate variables.
(I know it sounds like I’m making fun of perl here, and I am, but I also legitimately do love perl)
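For the uninitiated, the “conjugation” joke is about Perl’s sigils, which change with number and context. A minimal illustrative sketch (just standard Perl behavior, not anything from the original comment):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# One name, "words", declined by sigil and context:
my @words = ('fee', 'fie', 'foe');  # @words    -> the plural (an array)
my $word  = $words[1];              # $words[1] -> the singular (one element)
my $count = scalar @words;          # @words in scalar context -> "how many"

print "$word, one of $count words\n";   # prints: fie, one of 3 words
```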
Clicking through to one of the source articles:
Through an algorithm that analyzes troves of student information from multiple sources, the chatbot was designed to offer tailored responses to questions like “what grade does my child have in math?”
Okay, I’m not a big-brain edtech integration admin, but I seem to recall that like fifteen years ago we had a website that my parents could check to see my grade in math. I feel like this was already a solved problem honestly.
Before the big AI boom, I actually did a project where I used InferKit to generate text for the comedy factor because the unhinged nightmare garbage it spit out was extremely entertaining. I just can’t imagine using ChatGPT in the same way; it’s so boring
ngl his stuff always felt a bit cynical to me, in that it seemed to exist more to say “look, video games can have a deep message!” than it did to just have such a message in the first place. Like it existed more to gesture at the concept of meaningfulness rather than to be meaningful itself.
Anyone can copy it, recreate with it, reproduce with it
Ew… stay away from my content, you creep!
If you think of LLMs as being akin to lossy text compression of a set of text, where the compression artifacts happen to also result in grammatical-looking sentences, the question you eventually end up asking is “why is the compression lossy? What if we had the same thing but it returned text from its database without chewing it up first?” and then you realize that you’ve come full circle and reinvented search engines
unironically saying “the sharing economy” in the year of our lord 2024 is… certainly a choice
also
God knows we old-timers tried to be cynical about ChatGPT, pedantically insisting that AI was actually just machine learning and that Altman’s new toy was nothing but cheap mimicry. But the rest of the world knew better
idk dude I’ve talked to the rest of the world about this and most of them actually seem to dislike this technology, it seems like maybe you didn’t actually try very hard to be cynical
The copyright clause in the US Constitution (1789) also frames it in terms of granting rights to authors to “promote the progress of … useful arts”. Strictly speaking, author protection is not the origin of copyright, but also I was snarkily responding to a person who was arguing in favor of AI-training-as-fair-use and implying copyright was 120 years old, not trying to do a detailed explication of the origins of copyright law.
I’m sorry for my imprecise wording, I was feeling flippant and I know what I said isn’t totally accurate. not a big history person here honestly. I’ll try and stick to joke-commenting next time. but also can you just say what you mean instead of darkly hinting.
iirc even though the origin of copyright is not really specifically about author protection, part of the broad-strokes motivation for its existence involved “we need to keep production of new works viable in a world where new copies can be easily produced and undercut the original,” which was what I was trying to get at. maybe they picked a bad way to do that idk I’m not here to make excuses for the decisions of 16th-century monarchs
also again I’m not a copyright fan/defender. in particular copyright as currently constituted massively and obviously sucks. I just don’t think copyright-in-the-abstract is like the Greatest Moral Evil either, bc I’m not a libertarian. sorry ¯\_(ツ)_/¯
heck yeah I love Physics Jenny Nicholson Angela Collier
I mean, it seems like you’re reading my argument as a defense of copyright as a concept. I’m ambivalent on the goodness or badness of copyright law in the abstract. Like a lot of laws, it’s probably not the ideal way to fix the issue it was designed to solve, and it comes with (many) issues of its own, but that doesn’t necessarily mean we’d be better off if we just got rid of it wholesale and left the rest of society as is. (We would probably be left with excitingly new and different problems.)
As I see it, the actual issue at hand with all of this is that people are exploiting the labor/art/culture of others in order to make a profit for themselves at the expense of the people affected. Sometimes copyright is a tool to facilitate that exploitation, and sometimes it’s a tool that protects people from it. To paraphrase Dan Olson, the problem is what people are doing to others, not that the law they’re using to do it is called “copyright.”
The really annoying thing is, the people behind AI surely ought to know all this already. I remember just a few years ago when DALL-E mini came out, and they’d purposefully not trained it on pictures of human faces so you couldn’t use it to generate pictures of human faces – they’d come out all garbled. What’s changed isn’t that they don’t know this stuff – it’s that the temptation of money means they don’t care anymore