• Daxtron2@startrek.website
    6 months ago

    Then you haven’t been paying attention. There have been huge strides in the field of small open language models, which can run inference with low enough power consumption to work locally on a phone.