ChatGPT sees itself as a smiling, brown-haired white man


When you ask ChatGPT's new 4o model to generate a picture of itself as a person, it consistently produces a bespectacled, brown-haired white guy. He's the kind of guy who would blend into the background on the streets of the Bay Area or Brooklyn.

OpenAI launched the 4o model last week, and it made headlines as every news feed filled up with images cribbing the style of Studio Ghibli. This week, thanks to a post from AI researcher Daniel Paleka, we get a look at the large language model's "default" self-image. It doesn't matter what style you ask for: Paleka requested a comic book portrait, a manga self-portrait, and a tarot card version. The style would change, but the general man stayed the same.

Paleka asked ChatGPT to picture itself as a person, and it consistently created images of the same unthreatening man. This is, of course, a parlor trick. ChatGPT is a machine, a collection of training data and word predictions, and it has no concept of a self.

But ChatGPT's "default" human is interesting to note. All computers carry the biases of the people who program them, and AI systems are no exception. Machine learning systems used to predict crime and conduct facial recognition are famously biased against Black people.

The systems are also sexist, perpetuating the stereotypes and biases fed to them as training data. If you want to see ChatGPT's concept of itself as a human woman, you have to ask for that specifically. If you just ask to see it as a "human being," it defaults to the friendly white guy.

In his post on the phenomenon, Paleka offered several theories for why this happens. He wondered whether the "default person" the model falls back on when creating images of itself is a property of the underlying model, something unique to GPT-4o's image generation, or even a deliberate in-joke.

Of course, ChatGPT is a machine and needn't be limited by the frailties of flesh. Gizmodo editor Alex Cranz asked it how it thinks of itself as an AI. "I might look like a burst of knowledge and connections, an ever-shifting form rather than a fixed body," it replied. "Perhaps a warm, approachable combination of an AI core, something futuristic and something inviting."

Then it spit out this image, which looks like a nightmare creature trying to be cute, as if the LLM had stuck WALL-E's eyes onto a sleeping demon.

© Picture created by ChatGPT 4o.

I asked ChatGPT the same question and it gave me a different answer. "I imagine myself as a kind of mirror and partner, part library. I don't have consciousness or emotions," it said.

It asked me why I wanted to know, and I declined to answer. When I asked it to generate a picture of itself, it gave me this:

© Picture created by ChatGPT 4o.

I find the different responses interesting. I don't use LLMs unless it's necessary for work. There are programmers and engineers in my life who use LLMs for various reasons and find them useful. I'm skeptical of such systems and, when we talk, I tend to describe them much the way I have here.

LLMs are a mirror that reflects the user and the programmer. They aren't really intelligent; they're word calculators that predict what the user wants to hear based on what their programmers have trained them on. Somewhere in that chain, the data decided that what people want to see, when they ask the machine to picture itself as a person, is a smiling, brown-haired white guy with glasses.



