• ClamDrinker@lemmy.world · 6 months ago

    If you’re here because of the AI headline, this is important to read.

    We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities.

    They are implementing AI how it should be. Don’t let all the shitty companies blind you to the fact that what we call AI has positive sides.

    • AusatKeyboardPremi@lemmy.world · 6 months ago

      There are a lot of knee-jerk reactions in the comments. I hope at least a few of those commenters have read the article or, at the very least, your comment.

    • UnderpantsWeevil@lemmy.world · 6 months ago

      They are implementing AI how it should be.

      The term is so overused and abused that I’m not clear what they’re even promising. Are they localizing an LLM? Are they providing some kind of very fancy macroing? Are they linking up with ChatGPT somehow or integrating with Co-pilot? There’s no way to tell from the verbiage.

      And that’s not even really Mozilla’s fault. It’s just that the term AI can mean anything from “overhyped javascript” to “multi-billion-dollar datacenter full of fake Scarlett Johansson voice patterns”.

      • chrash0@lemmy.world · 6 months ago

        there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Co-pilot, since those models are enormous. AI generally means machine learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

        not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame

        • UnderpantsWeevil@lemmy.world · 6 months ago

          AI generally means machine learned neural networks these days

          Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural-net optimization. I can’t imagine needing that kind of configuration for my internet browser.

          not sure how they’re going to handle low-resource machines

          One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn’t any better. Do I really want Firefox chewing up hundreds of MB of memory so it can… what? Simulate a 600-processor cluster doing weird finger art?

          • chrash0@lemmy.world · 6 months ago

            i mean, i’ve worked in neural networks for embedded systems, and it’s definitely possible. i share your skepticism about overhead, but i’ll eat my shoes if it isn’t opt-in

  • Larry@lemmy.world · 6 months ago

    Local AI sounds nice. One reason I’m cynical about the current state of AI is how many of these products send all your data to another company.

    • sugar_in_your_tea@sh.itjust.works · 6 months ago

      Eh, I don’t particularly care too much either way. It seems to be solving problems with the 80/20 approach: 80% of the benefit for 20% of the effort. However, getting that last 20% is probably way more difficult than just building purpose-built solutions from the start.

      So I’m guessing we’ll see a lot more “decent but not quite there” products, and they’ll never “get there.”

      So it might be fun to play with, but it’s not something I’m interested in using day-to-day. Then again, maybe I’m completely wrong and it’s the best thing since sliced bread, but as someone who has worked on very basic NLP projects in the past (distantly related to modern LLMs), I just find it hard to look past the limitations.

    • MacN'Cheezus@lemmy.today · 6 months ago

      It is. Unfortunately it does tend to use up a lot of RAM and requires either a fairly fast CPU or better yet, a decent graphics card. This means it’s at least somewhat problematic for use on lower spec or ultraportable laptops, especially while on battery power.

  • fpslem@lemmy.world · 6 months ago

    tab grouping

    Sure, okay.

    vertical tabs

    To each their own.

    profile management

    Whatever, it’s fine.

    and local AI features

    HOLLUP

    • elliot_crane@lemmy.world · 6 months ago

      We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities. The alt text is then processed on your device and saved locally instead of cloud services, ensuring that enhancements like these are done with your privacy in mind.

      IMO if everything’s going to have AI ham-fisted into it, this is probably the least shitty way to do so. With Firefox being open source, the code can also be audited to ensure they’re actually keeping their word about it being local-only.

      • PseudorandomNoise@lemmy.world · 6 months ago

        Don’t you need specific CPUs for these AI features? If so, how is this going to work on the machines that don’t support it?

        • lemmyvore@feddit.nl · 6 months ago

          You only need lots of processing power to train the models. Using the models can be done on regular hardware.

        • elliot_crane@lemmy.world · 6 months ago

          With it being local, it’s probably a small and limited model. I took a couple of courses on machine learning years ago (before it got rebranded as “AI”), and you’d be surprised how well a basic image-recognition model can run on the lowest-spec MacBook from 2012.

          • ferret@sh.itjust.works · 6 months ago

            Tbh, the inversion of typical intuition (LLMs taking orders of magnitude more memory than computer vision models) can throw off unfamiliar people’s estimates of the hardware required.

        • sacredbirdman@kbin.social · 6 months ago

          Nope, they can use your NPU, GPU, or CPU, whatever you have… the performance will vary quite a bit, though. Also, the larger the model, the more memory it needs to run well.

        • space@lemmy.dbzer0.com · 6 months ago

          Running AI models isn’t that resource intensive. Training the models is the difficult part.
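The inference-vs-training point above can be made concrete with back-of-the-envelope arithmetic: the memory needed just to hold a model's weights is roughly parameter count times bytes per parameter. The model sizes below are illustrative round numbers, not any specific product Mozilla ships:

```python
# Rough memory needed to hold a model's weights at inference time.
# Quantization (8-bit, 4-bit) shrinks the bytes needed per parameter.
def inference_memory_mb(n_params: int, bytes_per_param: float) -> float:
    return n_params * bytes_per_param / 1e6

# A small captioning-sized model (~250M parameters) quantized to 8 bits:
small = inference_memory_mb(250_000_000, 1.0)
# A 7B-parameter LLM quantized to 4 bits (0.5 bytes per parameter):
large = inference_memory_mb(7_000_000_000, 0.5)
print(small, large)  # 250.0 3500.0
```

Training needs far more than this (gradients, optimizer state, and activations on top of the weights), which is why it happens on clusters while inference can fit on a laptop.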

    • GregorGizeh@lemmy.zip · 6 months ago

      While I dislike corporate AI as much as the next guy, I am quite interested in open-source, local models. If I can run it on my machine, with the absolute certainty that it is my LLM, working for my benefit, that’s pretty cool. And not feeding every minuscule detail about me to a corporation.

      • anarchrist@lemmy.dbzer0.com · 6 months ago

        I mean, that’s the thing. They’re kind of black boxes, so it can be hard to tell what they’re doing, but yeah, local hardware is the absolute minimum. I guess places like Hugging Face are at least working to apply some sort of standard measures to the LLM space, at least through testing…

        • grue@lemmy.world · 6 months ago

          I mean, as long as you can tell it’s not opening up any network connections (e.g. by not giving the process network permission), it’s fine.

          'Course, being built into a web browser might not make that easy…

          • GregorGizeh@lemmy.zip · 6 months ago

            Sums up my thoughts nicely. I am by no means able to make sense of the inner workings of an LLM anyway, even if I can look at its code. At best, I would be able to learn how to tweak its results to my needs, or maybe provide it with additional datasets over time.

            I simply trust that an open-source model that is able to run offline, and doesn’t call home somewhere with telemetry, has been vetted for trustworthiness by people far more qualified than me.

    • RmDebArc_5@sh.itjust.works · 6 months ago

      I tried one of their test builds. It seems like the AI part just means the browser can integrate with llamafile (Mozilla’s open-source solution for running open-source LLMs from a single file on any platform).
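For anyone curious what "integrating with llamafile" looks like in practice: when run in server mode, llamafile exposes an OpenAI-compatible HTTP API on localhost. A minimal sketch of talking to it might look like the following; the port (8080) and endpoint path are the documented defaults but should be treated as assumptions, and the `ask` call only works if a llamafile server is already running:

```python
import json
import urllib.request

# llamafile in server mode serves an OpenAI-compatible chat endpoint on
# localhost; port 8080 and this path are its documented defaults (assumption).
LLAMAFILE_URL = "http://localhost:8080/v1/chat/completions"

def build_request(prompt: str) -> dict:
    """Build a chat-completion payload for the local model."""
    return {
        "model": "local",  # llamafile serves whatever model file it bundles
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }

def ask(prompt: str) -> str:
    """Send a prompt to the local server; requires llamafile to be running."""
    req = urllib.request.Request(
        LLAMAFILE_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Only builds the payload here; ask() needs the server up.
    print(build_request("Summarise this page in one sentence.")["max_tokens"])
```

Because the endpoint mimics OpenAI's API shape, existing client code can be pointed at localhost instead of a cloud service, which is the whole privacy appeal.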

  • phoenixz@lemmy.ca · 6 months ago

    Local AI, or in other words, how AI should be: actually helpful, instead of a spying and data-gathering tool for companies.

  • grue@lemmy.world · 6 months ago

    I want fewer built-in features, not more of them. All of these things should be extensions, not built into the browser core.

    I mean, I’d be perfectly happy for said extensions and more to be shipped by default – it would be good for Firefox to come “batteries included” even with adblocking and such, and that’s most likely the way I would use it. But I just want it to be modular and removable as a matter of principle.

    I remember how the monolithic Mozilla SeaMonkey suite got too top-heavy and forced Mozilla to start over more-or-less from scratch with Phoenix, then Firebird, then Firefox, and I want it to stick close to those roots so they don’t have to do it again.

  • Thrife@feddit.de · 6 months ago

    Tab grouping, nice! Finally back after they removed them years ago…

    • kirk781@discuss.tchncs.de · 6 months ago

      I do not know why browser makers like Opera or Brave (and now, apparently, Firefox) are going hey-ho over AI. I don’t see a proper benefit of integrating local AI for most people as of now.

      As for vertical tabs, Waterfox just got them. The feature is basically a fork of Tree Style Tabs, and very minimally implemented. I am honestly happy with TST on Firefox, and while a native integration might be a bit faster (my browser takes just those few extra seconds to load the TST panel on my slow laptop), it’ll likely be feature-incomplete compared to TST.

      • FooBarrington@lemmy.world · 6 months ago

        It depends. I really liked Mozilla’s initiative for local translation: much better for data privacy than remote services. But conversational/generative AI, no thank you.

        • barsoap@lemm.ee · 6 months ago

          AI-generated alt-text for images inserted into PDFs

          Sounds more like classification so far. Things like summarising web pages would be properly generative. LLMs in general could be useful to interrogate your browsing history: doing feature extraction on it and sorting it into a graph of categories, not by links but by concepts, could be useful. And heck, if a conversational interface falls out of that, I’m not exactly opposed. Unlike the stuff you see on the net, it’s bound to quote its sources: it’s going to tell you right away that “a cat licking you is trying to see whether you’re fit for consumption” doesn’t come from the gazillion cat-behaviour sites you’ve visited, but from Reddit. Firefox doesn’t have an incentive to keep you in the AI interface and out of some random webpage.
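The "sort history into concepts, not links" idea above can be sketched with plain word-overlap similarity. This is a toy: a real version would use a local embedding model instead of word counts, and the history entries below are invented for illustration:

```python
from collections import Counter
import math

# Toy "feature extraction" over browsing history: each page title becomes a
# bag of words, and titles whose cosine similarity clears a threshold are
# grouped into the same concept cluster. Entries are invented examples.
history = [
    "Why does my cat lick me - cat behaviour explained",
    "Cat behaviour: licking and grooming",
    "Rust borrow checker tutorial",
    "Understanding the Rust borrow checker",
]

def vectorize(title: str) -> Counter:
    return Counter(title.lower().replace("-", " ").replace(":", " ").split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# Greedy clustering: each title joins the first cluster it resembles.
clusters: list[list[str]] = []
for title in history:
    vec = vectorize(title)
    for cluster in clusters:
        if cosine(vec, vectorize(cluster[0])) > 0.3:
            cluster.append(title)
            break
    else:
        clusters.append([title])

print(len(clusters))  # 2: cat titles and Rust titles end up in separate groups
```

Swap the word-count vectors for embeddings from a small local model and the same greedy grouping starts looking like the concept graph described above.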

          • douglasg14b@lemmy.world · 6 months ago

            Mozilla actually had a project for that: https://memorycache.ai//

            They just suck at naming things, and unfortunately it’s not getting the dev time it needs to get out of the POC stage.

            The biggest thing I want is local-only models that use my activity and browsing history as a way for me to recall or contextualize events and information.

  • jacktherippah@lemmy.world · 6 months ago

    That’s all fine and good but Firefox on Android is currently in a sorry state. No per-site process isolation, buggy, can’t keep tabs open, slow, choppy, drains battery. Had to uninstall it on my brand new Galaxy S24+ and my Pixel 6 Pro because it was draining so much battery. When are you going to finally stop ignoring Firefox Android, Mozilla?

      • Eyck_of_denesle@lemmy.zip · 6 months ago

        Nope. He’s right. There are similar threads about the mobile version on Reddit every single week, too. It’s simply bad.

          • Eyck_of_denesle@lemmy.zip · 6 months ago

            Maybe. It feels slower than its open-source forks, which feel a bit slower than Chromium alternatives. And the tab grouping is so bad, and there’s no process isolation.

    • bionicjoey@lemmy.ca · 6 months ago

      I’ve been using it for at least a decade now and haven’t encountered any of the issues you mention.

  • sunbeam60@lemmy.one · 6 months ago

    This is what Mozilla should have done a LONG time ago - focussed on browser features, ease of use, compatibility and speed. Make a better browser if you want to win a browser war.

    • The Liver@lemm.ee · 6 months ago

      No.

      Isn’t that the exact attitude a lot of boomers have toward technology? Look where that got them.

    • douglasg14b@lemmy.world · 6 months ago

      Why the no?

      It’s local only, and actually used to improve the product as opposed to being another shitty chatbot.

      This is how it should be done.

      • Todd Bonzalez@lemm.ee · 6 months ago

        Yeah, everyone is putting AI into their browsers; to some extent, Mozilla needs to do this to compete.

        I’m very much in favor of them integrating a local FOSS model rather than partnering with OpenAI like everyone else. Even if you’re against AI, you should understand that this is a far better situation.

  • YurkshireLad@lemmy.ca · 6 months ago

    Can I disable all local AI features? Or better yet not have that functionality installed?