Google issued an apology and will pause the image generation feature of its artificial intelligence model Gemini after it refused to show images of White people.
the war on humanity continues
Done intentionally to race bait and distract.
Ha! Did they make them all fat and trans as well? Goodness me 😩
White people connected the world to begin with.
100% correct!
Microsoft’s version of DALL-E 3 was also injecting different races and changing people’s prompts pretty heavily for about a month. Facebook’s “Imagine with Meta AI” is also pretty woke as a baseline. This is why locally run open source models are so important for both image and text generation.
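(For anyone wondering what “locally run open source” looks like in practice, here is a minimal sketch using the Hugging Face diffusers library; the model name, prompt, and hardware settings are illustrative assumptions, not something taken from the comment above.)

```python
# Minimal sketch: local text-to-image generation with an open-weight model
# via the Hugging Face diffusers library. Model name, prompt, and device
# choice are illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

# Download open weights once, then everything runs on local hardware;
# no hosted service sits in between to rewrite or filter the prompt.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # use "cpu" (and drop float16) if no GPU is available

# The prompt is passed to the model exactly as written.
image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```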
futuristic technology and predictive programming no doubt…
my experience with Scribble Diffusion when it was free was full of crappy results graphically… it would follow prompts for artistic style maybe half the time, and did it half-assed when it did… it was prone to putting in all kinds of extraneous blobs that were unrecognizable…
.
someone in another post recommended Gab AI as being ‘neutral’ with respect to all the wokester crapola pushed by other AI engines…
.
if you want to see some wild AI animations that look like what i imagine a trip on hallucinogens is like, check out my boy TETOUZE on the utoob…
rajasthani soul vid-yo is pretty cool…
8^)
here’s an example of what i was talking about… i know the guiding sketch is not great, but not awful, and with the prompt, it should have gotten a more realistic image than it did…
and they wanted me to Pay cash-munny for that crapola?
gtfoh
8^)
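(For what it’s worth, sketch-plus-prompt generation of the kind described above can also be run locally; below is a minimal sketch using diffusers with a ControlNet scribble model. The model names, prompt, and file paths are illustrative assumptions, not anything the commenter actually used.)

```python
# Minimal sketch: sketch-guided (scribble-conditioned) image generation run
# locally with diffusers + ControlNet. Model names, prompt, and file paths
# are illustrative assumptions.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

# A ControlNet trained on scribbles steers the base model to follow the
# guiding sketch, while the text prompt controls style and content.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-scribble", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# The guiding sketch: a white-on-black scribble image.
sketch = load_image("guiding_sketch.png")

image = pipe(
    "a realistic oil painting of an old stone cottage by a river",
    image=sketch,
    num_inference_steps=30,
).images[0]
image.save("result.png")
```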
AS IF googs doesn’t know EXACTLY what they are purposefully doing… they aren’t contrite or embarrassed, this was no mistake, not a glitch needing to be fixed…
googs thinks WE are the glitch needing to be fixed…