• brucethemoose@lemmy.world · edited 3 days ago

    Meanwhile, Chinese and other open-weights models are killing it. GLM 4.5 is sick. Jamba 1.7 is a great sleeper model for stuff outside coding and STEM. The 32Bs we have, like EXAONE and Qwen3 (and finetuned experiments), are insanely good for ~20GB files, and they're crowding out the APIs. There are great little MCP models like Jan too.

    Are they AGI? Of course not. They are tools, and that’s what was promised; but the improvements are real.

    Turns out, closed-source, tech-bro corporate enshittification is not a sustainable plan. Who knew…