• ayyy@sh.itjust.works · 29 days ago

    No, using an already-trained model doesn’t “use up” the model in exactly the same way that pirating a movie doesn’t steal anything from Hollywood.

    • PeriodicallyPedantic@lemmy.ca · 29 days ago

      Using a diamond doesn’t “use up” that diamond.

      And yet, it’s still unethical to buy already mined blood diamonds from people who continue to mine more blood diamonds. Funny thing about that, huh

      • bob_lemon@feddit.org · 29 days ago

        In this analogy, using the diamond does use it up, in the sense that no one else can use that diamond concurrently. If someone else wants a diamond, more children must die.

        This is different from the trained AI model, which can concurrently be used by everyone at the same time, at very little extra cost.

        • PeriodicallyPedantic@lemmy.ca · 29 days ago

          Even if the diamond mine owners stop mining, it’s unethical to buy their stockpile of blood diamonds.

          Also, there is a cost besides electricity - the theft of artists’ work is inherent to the use of the model, not just to the training. The artist is not being compensated whenever an AI generates art in their style, and they may in fact lose their job or have their compensation reduced due to artificial supply.

          Finally, this is an analogy, it’s not perfect. Picking apart incidental parts of the analogy doesn’t really prove anything. Use an analogy to explain a problem, but don’t pick apart an analogy as though you’re picking apart the problem.

            • KillingTimeItself@lemmy.dbzer0.com · 28 days ago

            and they may in fact lose their job or have their compensation reduced due to artificial supply.

            highly doubt. Any artists that do lose their job are probably mostly ok with it anyway, since it’s most likely going to be graphical drivel anyway. In fields like media there’s a different argument to be made, but even then it’s iffy sometimes. Also i don’t think this would be considered artificial supply, it would be artificially induced demand instead, no? Or perhaps an inelastic demand-side expectation.

            Although, it would be nice to have some actual concrete data on artists and job prospects in relation to AI. Unfortunately it’s probably too early to tell right now, since we’re just out of the Luddite reactionary phase, who knows.

              • PeriodicallyPedantic@lemmy.ca · 28 days ago

              Any artists that do lose their job are probably mostly ok with it anyway, since it’s most likely going to be graphical drivel anyway.

              Replace “artist” and “graphical”, and you just described most jobs. I don’t think most people are ok losing their jobs even if those jobs aren’t especially inherently rewarding; they’re getting paid for their area of training. They’re not just gonna be able to find a new job because in this hypothetical, the demand for it (a living human doing the work) is gone.

              I consider this an increase in supply because it’s an increase in the potential supply. Productivity increases (which is what this is) mean you can make more, which drives down the price, which means that artists get paid less (or just get replaced).

              Remember: if you 10x the productivity of an employee, that typically doesn’t mean you produce 10x the product, it typically means you need 1/10th the employees. That payroll saving goes right into the pockets of execs.

              Also wrt the Luddites, they weren’t wrong. It did absolutely demolish their industry and devastate the workers. It’s just that the textile industry was only a small part of the economy, and there were other industries that could absorb the displaced workers after they got retrained.
              LLMs threaten almost every industry, so there is a greater worker impact and fewer places for displaced workers to go. Also, workers are now responsible for bearing the costs of their own retraining, unlike back in the day of the Luddites.

                • KillingTimeItself@lemmy.dbzer0.com · 27 days ago

                Replace “artist” and “graphical”, and you just described most jobs.

                yeah, it’s a generalized statement, so that makes sense. It’d be weird if my statement only applied to the artist economy, not like, the rest of the economy.

                I don’t think most people are ok losing their jobs even if those jobs aren’t especially inherently rewarding; they’re getting paid for their area of training.

                i think this is what most people would say, and i would generally agree. However, there is always going to be some level of job-market upset, which is a good thing for society and for people, even if they don’t like it. It’s a liquidity thing at the end of the day. If you have no liquidity, doing anything other than what you first started doing is going to be really hard; if you have more liquidity, it becomes easier. Although there is a point of diminishing returns where it turns into a revolving door of short-term labor.

                They’re not just gonna be able to find a new job because in this hypothetical, the demand for it (a living human doing the work) is gone.

                yeah but that’s the thing, i’m not convinced that the entire field is just, gone. Maybe a small portion of it, like 10-20% is less active overall right now. Some art communities are pretty insulated from the broader market as a whole, the furry community being one of them. I think most realistically, AI generated art is going to be used in places where you aren’t actively removing art from the art pool, but adding more art to the already busy art pool.

                I mean in Hollywood for example, AI is most commonly used to do what, remove mustaches and shit? Replace existing CGI that would need to be done on top of faces? Particularly the really time intensive and tedious parts of it that aren’t cost effective to approach and manage. Otherwise you would just pay a normal artist to do it, because they’re going to be extremely competitive. It’s not like you can just delete an entire production team and smash a movie script through an AI (that was also written by AI) and get a full movie out of it.

                I think a lot of it is a senseless overreaction, i’m sort of sympathetic to it, but at the end of the day, there hasn’t been an artist famine to my knowledge, so it seems like things are going fine.

                I consider this an increase in supply because it’s an increase in the potential supply.

                counterpoint: it’s not actually an increase in supply potential alone, it’s both supply and demand. You can’t just look at this like it’s exclusively increasing productivity, although it is in some part; AI simply can’t do certain things that humans can. And you can’t act like an increase in productivity won’t drive an increase in demand either, because it will. That’s the entire operating principle of the global economy: a steady increase in productivity leads to a steady increase in supply, which leads to a steady increase in demand, which leads to overall growth.

                You might be paid less as an artist, but unless you get a significant downturn that we would’ve already heard about, things like covid are going to affect you more heavily.

                if you 10x the productivity of an employee, that typically doesn’t mean you produce 10x the product, it typically means you need 1/10th the employees. That payroll saving goes right into the pockets of execs.

                but you also need to consider that in the global market, you aren’t just supplying a magical constant of demand. If you can increase demand, you can increase supply, and lower the price of the product as well, making it more competitive, more accessible, and more productive, since now you have an incentive to increase production of that product to meet demand.

                Of course if people don’t buy things, it doesn’t matter anymore, but i think both of us here can agree that people don’t really seem content to stop spending money on things any time soon.

                Also wrt luddites, they weren’t wrong. It did absolutely demolish their industry and devastate the workers.

                isn’t this primarily because they never industrialized/mechanized and then ended up being outcompeted in the market by people who did? Even if this is a concern, you can always shift focus, and move from making generalized textiles to more complex, difficult-to-make, and more expensive textiles. Humans always have an advantage over machines and robots: we’re smarter. A lot smarter. You can look at other industries like watchmaking, for example; expert watch craftsmen could have lost their jobs to mechanized industrial production, but they never did. Handmade watches are still a big deal.

                LLMs threaten almost every industry

                i’m not sure how much they threaten every industry collectively, though i’m sure they pose some level of obstacle. At the end of the day, i’ve only ever seen AI being used for janitorial tasks in companies, like boiler-plating reports and categorizing documentation. And also customer support, which is not particularly something i’m a fan of, but there is definitely utility in it. Almost never fully replacing entire positions; that would be too costly.

                  • PeriodicallyPedantic@lemmy.ca · 27 days ago

                  I largely agree with you, and I definitely appreciate that you’re being very civil in our discussion.

                  There are a few points I’d like to clarify and maybe counter:

                  I think you’re putting too much burden of responsibility on the workers for decisions that the employer makes. To just say it’s a liquidity problem is ignoring how many people have a liquidity problem and the sources of that problem, and the responsibility that employers should have to their community. I agree that some degree of turnover is ok, but I don’t think that’s what we’re talking about.

                  I agree that the entire [field] won’t just vanish, but I believe that the increase in productivity means that they’ll need way fewer workers.

                  This isn’t just affecting fine arts and support: it’s also affecting things like technical writers, marketing, copywriting, programming, paralegals, even diagnostic medicine. Pretty much any office job is in the line of fire. And when the spread is that wide, even 10-15% of the workforce is devastating to industries and even the economy as a whole. And the spread is only gonna get wider as they introduce “agents” that are capable of making “decisions” autonomously, so you don’t need a human to tell the AI what to do and then do something with the output.

                  Yes, the Luddites never mechanized; that’s the thing they were fighting against. They couldn’t all move to complex textiles, because the market wasn’t there for it: if they lowered their prices enough to generate the demand, then they couldn’t recoup their time and material costs.

                  Wrt supply/demand, an increase in supply drives an increase in demand through a lowering of prices. This is the foundation of microeconomics. It doesn’t really translate to the messiness of IRL, but it’s still close enough that it shows that bad things will happen.

                  In the end it comes down to what the LLM producers are promising. They’re promising to be able to do all this. Idk if they can actually fulfill their promises, but I think it’s crazy to wait and see if they can before moving to prevent it. They’re saying they’re gonna do it, let’s make them not

                    • KillingTimeItself@lemmy.dbzer0.com · 27 days ago

                    I largely agree with you, and I definitely appreciate that you’re being very civil in our discussion.

                    that’s the primary reason i’m here at all, i enjoy discussions about things like this, it’s interesting, and sometimes you even learn things.

                    I think you’re putting too much burden of responsibility on the workers for decisions that the employer makes.

                    to be clear, i wasn’t defining this as a worker issue, or an employer issue, i was stating it as a fundamental limitation of our economic model. It’s sort of a fundamental limitation of how making money works at the moment.

                    I agree that some degree of turnover is ok, but I don’t think that’s what we’re talking about.

                    that probably isn’t, but a significant problem i find with people is that they often don’t provide enough specificity or detail surrounding their statements or claims, to the point where it’s either irrelevant or simply too broad. Broad enough in some cases that you could write a PhD on it, and then work for 20 years in that field, before answering the question.

                    People will hand-wave away the entirety of capitalism in favor of something like socialism, which has no practical implementation as far as i’m aware, outside of the few tries that haven’t quite worked out optimally so far. I just can’t justify using logic in that way; i try to at least lock down what i’m talking about to a point where it’s broadly understandable. Which is challenging, but that’s partly why i’m here lol.

                    but I believe that the increase in productivity means that they’ll need way fewer workers.

                    i think this is probably true, but given the accuracy and competence of most existing AI, i highly expect this to be restricted mostly to “additional” productivity; it’s essentially creating a new market segment where one didn’t previously exist. An AI alone can’t exactly replace a human. It can replace certain aspects and parts of a human, but never a full one. So it’s really hard to say how badly it will hit the industries in question.

                    this isn’t just affecting fine arts and support:

                    yeah for sure, i’m just not sure how much of this is going to be A: significant, or B: impactful.

                    And the spread is only gonna get wider as they introduce “agents” who are capable of making “decisions” autonomously, so you don’t need a human to tell the AI what to do, and then do something with the output.

                    It’s also worth noting that this is a significantly more risky move to make, especially if you put it in charge of handling anything other than doing “menial organization” work. For example, money. I highly doubt you would find anybody willing to let an AI buy things for them.

                    A lot of this labor is already automated through things like scripting and strict data entry. This is probably only going to make it less strict in that sense.

                    Yes the Luddites never mechanized, that’s the thing they were fighting against. They couldn’t all move to complex textiles, because the market wasn’t there for it, if they lowered their prices enough to generate the demand then they couldn’t recoup their time and material costs.

                    To be clear, this is kind of the example of bringing a knife to a gun fight, it’s your fault if you lose at that point. And while it’s definitely true that it cost the market jobs, the increase in productivity was probably more significant than the loss from textiles. Not to mention the decrease in product prices, making the living standards of everybody higher.

                    You could theoretically never mechanize, but you’re fighting a losing battle by never innovating. Just look at Intel: they got blown out of the water by AMD after sitting on their technology for a decade, and they’re losing market share now. They had a huge stock crash over their recent CPU lineup being overcooked and burning itself out. They’re not having a particularly great time right now, but that’s just what happens.

                    And as a market, we’re all doing better now: the hardware capability of CPUs has improved MASSIVELY since the start of Ryzen, and laptops have seen such a significant boost in productivity that Apple had to move to their own silicon to keep any sort of lead on the competition. Really good CPUs are a lot cheaper now, and you can use ECC memory with most Ryzen chips, while you have to pay Intel for that privilege. The single-thread speed of CPUs has increased significantly as well, making basically every task that much faster, and the power efficiency of chips has massively improved too.

                    Generally, in a market like ours, losing existing jobs, and increasing productivity is going to be a beneficial tradeoff, as it opens more space for other types of productivity down the road. It’s sort of the endless optimization of a specific item, but the global economy.

                    In the end it comes down to what the LLM producers are promising. They’re promising to be able to do all this.

                    and so far, they’ve lied. Google cheated on the Gemini presentation. Grok can’t even produce real facts. ChatGPT has progressively worsened since launch due to bad data. Image and video generation have improved, but we’re at a point where they can’t improve much more than they already have, at least not that significantly, so we’re quickly approaching a wall. Unless we pivot, and they will, but it will have to be marketable at the end of the day, and that’s the hard part.

                    I think it’s also worth noting that we should, in some capacity, prepare for the inevitable: never be comfortable, always be ready. You can’t lie down at the sight of a sword and not expect to be killed anyway. Fighting against it might work, but that’s not historically supported in any significant capacity to my knowledge. You can do nothing, which is even worse. Or you can do your best to prepare as well as you can. There is always something you can offer over other people. Especially over AI.

      • bluewing@lemm.ee · 29 days ago

        A very large amount of those dug up diamonds end up as “industrial diamonds.” Because they are far from gemstone quality. And they definitely get used up. I have used up my share of them as cutting tools when I was a toolmaker.

        • PeriodicallyPedantic@lemmy.ca · 29 days ago

          Ok cool, but this is an analogy. Why are you defending the use of AI by megacorps by objecting to irrelevant parts of an analogy on technicality?

            • PeriodicallyPedantic@lemmy.ca · edited · 28 days ago

              You’re insufferable.

              I know the analogy isn’t a perfect fit for LLMs in general. Analogies never come close to describing the entire thing they’re analogs of, they don’t need to.

              It doesn’t matter because this is a suitable analogy for the argument. This is how analogies work.

              The argument is that because the harm has already been done, it’s fine to use LLMs.
              That same argument can be made for blood diamonds, and it’s untrue for exactly the same reason:
              Because buying the use of LLMs (which is mostly how they’re used, even if you pay in data instead of money) is funding the continued harmful actions of the producer.

              I can’t believe I have to explain how analogies are used to a grown ass adult.

      • KillingTimeItself@lemmy.dbzer0.com · edited · 28 days ago

        this is actually a really debatable argument. If you’re buying it first hand, from somebody trying to make money, yes it could arguably be unethical, but if you’re buying it second hand, i.e. someone who just doesn’t want it anymore, you could make the argument that it’s an ethically net positive transaction. Since they no longer want the diamond, and wish to have money instead, and you wish to have the diamond, and less money. Everybody wins in 2nd hand deals, weirdly enough.

        • PeriodicallyPedantic@lemmy.ca · 28 days ago

          Setting aside “there is no ethical consumption under capitalism”, which is a debate for another time:

          I don’t totally agree with your assessment of 2nd hand sales: it’s not ethically positive, at best it’s ethically neutral, because demand trickles up the market. I could go into this more, but ultimately it’s irrelevant:

          The 2nd hand LLM market doesn’t work like that because LLMs are sold as a service. The LLM producers take a cut from all LLM resellers.

          You could make a case that self hosting a free open-source LLM with something like Ollama is ok, but that’s not how most LLMs are distributed.

          • KillingTimeItself@lemmy.dbzer0.com · 27 days ago

            Setting aside “there is no ethical consumption under capitalism”, which is a debate for another time:

            the simple answer is yes and no, there is both ethical, and unethical consumption under capitalism. As there is in any society throughout human history.

            I don’t totally agree with your assessment of 2nd hand sales: it’s not ethically positive, at best it’s ethically neutral, because demand trickles up the market. I could go into this more, but ultimately it’s irrelevant:

            if we start from a baseline of ethical neutrality, ignoring the parties: i included the parties because i would consider two parties who wish to do business successfully doing business to be an ethically positive thing, as two people have had their needs/wants more closely met by the deal. Therefore it’s ethically positive.

            If you’re starting from an ethically negative point, you need some sort of inherent negative value to exist within that relationship. Perhaps the guy who is selling was Hitler, for example; that would make it an ethically negative scenario. Maybe after he sold it, he would’ve donated the money or given it to someone who can do something more useful with it, making it even more ethically positive.

            there’s an argument to be made for something with perceived value and static supply to have an upwards trajectory in the market going forward. For something like blood diamonds this is unlikely, but assuming it did happen, it should actually be an ethically positive thing, assuming that the families affected by the original mining ended up getting their hands on them, for example. If they weren’t given back to the families then it doesn’t really matter, since you’re back to the baseline anyway. Prospective investments aren’t a real tangible asset, so we can’t foresee that.

            although to be clear, i wasn’t talking about LLMs, this is a much harder thing to do with LLMs, although the argument here is that the training has already been done, the power has already been consumed, you can’t unconsume that power, so therefore whatever potential future consumption that could happen, is based off of the existing consumption already. Unless of course you did more training in the future, but that’s a different story. Just boycott AI or something at that point lol.

            • PeriodicallyPedantic@lemmy.ca · 27 days ago

              In both your examples, you seem to assume that the harm is already done and that there is no continued harm.

              But in both cases the harm isn’t finished; the blood diamond mine owners use the continued sale of blood diamonds to fund their continued mining operations, and LLM providers use the sale of LLM use to fund the continued training of new LLM models.

              Regardless of whether you think that buying second hand blood diamonds increases overall demand in the market (which blood diamond sellers benefit from), it is clearly the case that selling (and reselling) LLM services benefits the LLM providers, and we can trivially see that they’re training new models and not making amends.

              • KillingTimeItself@lemmy.dbzer0.com · 27 days ago

                But in both cases the harm isn’t finished;

                says who? In my example i assumed that the mining of blood diamonds had already ceased, because obviously if it hasn’t, there’s no point in talking about market forces at all; the more pressing concern would be the literal blood diamonds. I was talking about the second hand market for what were previously blood diamonds, and technically still are, just without the active cost associated.

                And again, what funds, there are no funds, this is a purely second hand sale. The seller is not giving a percent back to the diamond mining company that no longer exists here.

                and LLM providers use the sale of LLM use to fund the continued training of new LLM models.

                i would agree with this, but it seems like we very quickly hit a new technical limitation over the last few years. The pace has drastically slowed; the technical nature of the AIs has improved less, while the broad suitability has improved more. And it’s also worth noting that this is an itemized cost, not a whole static cost. Just saying “but but, ai consumes lots of energies” is meaningless unless you can demonstrate that it’s significant, and actually matters. I think there is definitely an argument to be made here, but unfortunately, i have yet to see anyone actually argue it.

                and we can trivially see that they’re training new models and not making amends.

                what do you mean when you say amends? Carbon capture? Paying off artists so they can “steal their jobs”? This is meaningless to me without an actual practical example.