Can we call embedded AI from the Commons app? Wikimedia policy, licenses, etc #6143
Other practical concerns might be whether WMF allows an app with AI features enabled by default to be distributed under the organization, and whether F-Droid allows it. I think WMF would want to be consistent with its AI policies in other areas, like T336905 (which is still not defined). I don't know much about F-Droid, but it looks like they are concerned with black-box binary dependencies in general.
The Gemma terms of use seem more open source-friendly than the AICore terms; I see no clause about age or reverse engineering. The model is meant to be downloaded by the app rather than already present on the device (code). Each model seems to be 1 GB or more, so this would also be opt-in. Sample app: https://ai.google.dev/edge/mediapipe/solutions/genai/llm_inference/android
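For reference, calling a downloaded model through the MediaPipe LLM Inference API looks roughly like this. A minimal Kotlin sketch based on the sample app linked above; the model path, file name, and token limit are assumptions, not a settled design:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical location where the app would store the opt-in model download.
private const val MODEL_PATH = "/data/local/tmp/llm/gemma-2b-it-cpu-int4.bin"

fun generateSuggestion(context: Context, prompt: String): String {
    // Configure the engine with the locally stored model file.
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(MODEL_PATH)
        .setMaxTokens(256) // keep responses short to limit on-device latency
        .build()

    // Create the inference task and run a single synchronous generation.
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(prompt)
}
```

Since the model is just a file loaded by path, swapping in a different model is a matter of downloading a different file, which is what makes the point below about using any model possible.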
Yeah, this would allow us to use any model that we want. We can also download the model if and only if users want to use it.
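A minimal sketch of what such a consent-gated download could look like, using Android's standard DownloadManager; the model URL and file name are placeholders, not a real hosting location:

```kotlin
import android.app.DownloadManager
import android.content.Context
import android.net.Uri
import android.os.Environment

// Hypothetical model URL; the real hosting location would need to be decided.
private const val MODEL_URL = "https://example.org/models/gemma-2b-it-cpu-int4.bin"

// Call this only after the user has explicitly opted in to the AI feature.
fun enqueueModelDownload(context: Context): Long {
    val request = DownloadManager.Request(Uri.parse(MODEL_URL))
        .setTitle("Optional AI model")
        .setDescription("On-device model, roughly 1 GB")
        // Large file: restrict to Wi-Fi so we never silently burn mobile data.
        .setAllowedNetworkTypes(DownloadManager.Request.NETWORK_WIFI)
        .setDestinationInExternalFilesDir(
            context, Environment.DIRECTORY_DOWNLOADS, "gemma-2b-it-cpu-int4.bin")

    val dm = context.getSystemService(Context.DOWNLOAD_SERVICE) as DownloadManager
    return dm.enqueue(request) // id the app can use to track progress or cancel
}
```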
I am starting to work on this. I will start by making a separate app just to test out this feature and find the minimum model that will work.
This is a GSoC task, let's keep actual implementation for the GSoC itself. :-) |
Ohh okay got it. I'm looking forward to it. |
"AICore is an SDK provided by the Android OS. Any issue with using it?"

@whym mentioned "transparency and control" and asked for an option "as long as that is not prohibitively difficult to implement" (I guess an option is totally doable). My main worry is whether we can call it and still be considered open source; I asked at https://opensource.stackexchange.com/questions/15324
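A minimal sketch of such an option, assuming a boolean preference (key name hypothetical) that defaults to off so the feature stays strictly opt-in:

```kotlin
import android.content.Context
import androidx.preference.PreferenceManager

// Hypothetical preference key; the real one would live in the app's settings screen.
private const val PREF_ON_DEVICE_AI = "use_on_device_ai"

fun isAiEnabled(context: Context): Boolean =
    PreferenceManager.getDefaultSharedPreferences(context)
        .getBoolean(PREF_ON_DEVICE_AI, false) // default off: strictly opt-in

// The generation backend (AICore, MediaPipe, ...) is passed in as a lambda so the
// gate stays independent of whichever engine is eventually chosen.
fun suggestOrNull(context: Context, generate: (String) -> String, prompt: String): String? {
    if (!isAiEnabled(context)) return null // user has not opted in: no AI call at all
    return generate(prompt)
}
```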
We could also use an explicitly open source-friendly model + library. Maybe https://github.com/srogmann/JBLOOMz (Apache 2) + the smallest BLOOM model we can find, which currently seems to be bloomz-560m at 4 GB... still very big; hopefully this could be made an opt-in additional download.
Any other worries/ideas/etc?