Discussion
Google AI Edge Gallery
hadrien01: Is it me, or does the App Store website look... fake? The text in the header ("Productiviteit" [Productivity], "Alleen voor iPhone" [iPhone only]) looks pixelated, like it was edited in Paint; the header background flickers; the app icon and screenshots are very low quality; and the title of the website is cut off ("App Store voor iPho...").
throwatdem12311: Issues caused by a low-effort localization? On my iPhone it opens in the App Store app, so it looks fine to me.
piperswe: What browser are you using? I don't see any of this behavior on Firefox...
ezfe: Nothing weird on my side
PullJosh: This is awesome!

1) I am able to run the model on my iPhone and get good results. Not as good as Gemini in the cloud, but good.

2) I love the “mobile actions” tool calls that allow the LLM to turn on the flashlight, open maps, etc. It would be fun if they added Siri Shortcuts support. I want the personal automation that Apple promised but never delivered.

3) I am so excited for local models to be normalized. I build little apps for teachers, and there are stringent privacy laws involved, which means I strongly prefer writing code that runs fully client-side when possible. When I develop apps and websites, I want easy, free API access to on-device models. I know something like that exists on iOS and in Chrome right now, but as far as I’m aware it’s not particularly good yet.
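For context on point 3, browser-side access to a local model looks roughly like the sketch below. It assumes Chrome's experimental Prompt API and its `LanguageModel` global; the API's name and shape have changed between Chrome releases, so treat both as assumptions rather than a stable contract, and feature-detect before use.

```typescript
// Hedged sketch: feature-detect an on-device model API from a web page.
// "LanguageModel" stands in for Chrome's experimental Prompt API global;
// both the name and the method shapes are assumptions here.
async function promptLocalModel(text: string): Promise<string | null> {
  const LM = (globalThis as any).LanguageModel;
  if (LM === undefined) {
    // No on-device model API in this environment (Node, Firefox, older Chrome).
    return null;
  }
  const session = await LM.create();
  return session.prompt(text);
}
```

The null fallback matters in practice: pages have to degrade gracefully (or call a cloud API) when no on-device model is available.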
pmarreck: Impressive model, for sure. I've been running it on my Mac, and now I get to have it locally on my iPhone? I need to test this. Wait, it does agent skills and mobile actions, all local to the phone? Whaaaat? (Have to check it out later! Anyone have any tips yet?)

I don't normally do the whole "abliterated" thing (dealignment), but after discovering https://github.com/p-e-w/heretic I was too tempted not to try it with this model a couple days ago (I made a repo to make it easier, actually: https://github.com/pmarreck/gemma4-heretical) and... wow. It worked. And not having a built-in nanny is fun!

It's also possible to make an MLX version of it, which runs a little faster on Macs, but won't work through Ollama, unfortunately. (LM Studio, maybe.) It runs great on my M4 MacBook Pro w/128GB and likely also runs fine under 64GB; smaller memories might require lower quantizations.

I specifically like dealigned local models because if my thoughts get policed when I'm playing in someone else's playground, like hell am I going to be judged while messing around in my own local open-source one too. And there's a whole set of ethically justifiable but rule-flagging conversations (loosely categorizable as "sensitive", "ethically borderline but productive", or "violating sacred cows") that are now possible with this, at a level never possible before.

Note: I tried to hook this one up to OpenClaw and ran into issues.

To answer the obvious question: yes, this sort of thing enables bad actors more (as do many other tools). Fortunately, there are far more good actors out there, and bad actors don't follow the rules that good actors subject themselves to anyway.
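For readers wondering what "abliteration" actually does mechanically: the usual approach is directional ablation, i.e. estimate a direction in activation space associated with refusals and project it out of the hidden states. This is a toy single-vector sketch, not heretic's actual code; the vectors and the single-layer treatment are illustrative assumptions.

```typescript
// Toy sketch of directional ablation, the idea behind "abliteration":
// remove the component of a hidden-state vector along a (hypothetical)
// refusal direction, so the model can no longer express that direction.
// Real tools estimate the direction from data and apply this per layer.
function ablate(hidden: number[], refusalDir: number[]): number[] {
  // Assumes refusalDir is unit-norm.
  const dot = hidden.reduce((sum, x, i) => sum + x * refusalDir[i], 0);
  return hidden.map((x, i) => x - dot * refusalDir[i]);
}
```

For example, `ablate([3, 4], [1, 0])` removes the first component entirely, returning `[0, 4]`: the part of the activation aligned with the "refusal" axis is zeroed while everything orthogonal to it is untouched.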
c2k: I run MLX models with omlx [1] on my Mac and it works really well.

[1] https://github.com/jundot/omlx
jackp96: I definitely empathize with the feeling of freedom, but these models are wildly dangerous in terms of how easily accessible they are and exactly how uncensored they are.

I imagine/hope people are mostly downloading them for sex/roleplay, etc.

Want to cook meth? Poison your spouse? Build a bomb? Generate CSAM? Plan a mass shooting? Build a DIY chemical weapon? These uncensored tools are all too happy to help, complete with strategic tips. And it's just a matter of time before we see real-world consequences and attacks, especially as these local models improve.
jeroenhd: English version of the page: https://apps.apple.com/us/app/google-ai-edge-gallery/id67496...Also on Android: https://play.google.com/store/apps/details?id=com.google.ai....It's a demo app for Google's Edge project: https://ai.google.dev/edge
carbocation: It would be very helpful if the chat logs could (optionally) be retained.
barbazoo: > And there's a whole set of ethically-justifiable but rule-flagging conversations (loosely categorizable as things like "sensitive", "ethically-borderline-but-productive" or "violating sacred cows") that are now possible with this, and at a level never before possible until now.

I checked the abliterate script and I don't yet understand what it does or what the result is. What are the conversations this enables?
TGower: These new models are very impressive. There should be a massive speedup coming as well: AI Edge Gallery is running on the GPU, but the NPUs in recent high-end processors should be much faster. The A16 chip, for example (MacBook Neo and iPhone 16 series), has 35 TOPS of Neural Engine versus 7 TFLOPS of GPU. It's a similar story for Qualcomm.
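Taking the comment's quoted figures at face value, the implied headroom is simple arithmetic. Note that NPU TOPS are usually quoted for int8 while GPU TFLOPS are floating point, so this is a theoretical ceiling, not a measured speedup.

```typescript
// Back-of-envelope NPU-vs-GPU headroom from the figures quoted above.
// Assumes the two throughput numbers are directly comparable, which
// glosses over precision (int8 vs fp16/fp32) and memory bandwidth.
const npuTops = 35;  // quoted Neural Engine throughput, trillions of ops/s
const gpuTflops = 7; // quoted GPU throughput, trillions of float ops/s
const theoreticalSpeedup = npuTops / gpuTflops;
console.log(`theoretical ceiling: ${theoreticalSpeedup}x`); // prints "theoretical ceiling: 5x"
```

In practice, real NPU speedups depend heavily on whether the model's ops are supported by the NPU compiler at all; unsupported layers fall back to GPU or CPU and eat into that 5x.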
api: That’s nuts, actually, for such a low-power chip. Can’t wait to see the M-series version of that.

I’m sure very fast TPUs in desktops and phones are coming.
throwuxiytayq: The in-ter-net is for porn
janandonly: OP here. It is my firm belief that the only realistic use of AI in the future is either local, on-device, and almost free, or in the cloud but far more expensive than it is today.

The latter option will only be used for tasks where humans are more expensive or much slower.

This Gemma 4 model gives me hope for a future Siri or the like with iPhone and macOS integration, “Her” style (as in the movie).
kennywinker: Did you really watch “Her” and think this is a future that should happen?? Seriously????
burnto: My iPhone 13 can’t run most of these models. A decent local LLM is one of the few reasons I can imagine actually upgrading earlier than typically necessary.
spijdar: Realistically, a lot of people do this for porn.

In my experience, though, it's necessary to do anything security-related. Interestingly, the big models give me fewer refusals when I ask e.g. "in <X> situation, how do you exploit <Y>?", but local models will frequently flat-out refuse unless the model has been abliterated.
dwa3592: I think with this, Google starts a new race: best local model that runs on phones.