
Thread: Can a machine be racist?

  1. #1
    Utisz's Avatar
    Type
    INxP
    Join Date
    Dec 2013
    Location
    Ayer
    Posts
    2,760

    Can a machine be racist?

    So Google's automatic image recognition software tagged two black folks as gorillas.



    The question "Can a machine be racist?" is obviously flawed, so feel free to take exception to that or refine it if you wish ... it would seem that a machine cannot be racist, but it could be programmed to act in a racist way.

    In this case, though, this was an objective algorithm that returned a result that was incorrect and that, socially speaking, would be considered very racist. Google thus removed "gorilla" from its set of tags, since it could not correct the algorithm.

    A related question would be: should machines/algorithms/etc. be forced to follow politically correct norms?
    Last edited by Utisz; 07-16-2015 at 01:33 AM.

  2. #2
    a cantori Perdix's Avatar
    Type
    INTj
    Join Date
    Jan 2014
    Location
    the deep end
    Posts
    2,436
    No, a machine programmed to be racist would only indicate that its programmer is racist.

    This is a stupid question, and the way it's phrased makes me think you're not actually looking for well-thought-out answers. Machines programmed to be politically correct? Is that some sort of joke?

  3. #3
    a fool on a journey pensive_pilgrim's Avatar
    Type
    INTP
    Join Date
    Dec 2013
    Posts
    3,470
    I think a more interesting question is "is acknowledging that some facts support racist stereotypes in itself racist?" I mean, obviously some black people look a lot like gorillas, or this wouldn't have happened. Saying "some black people look a lot like gorillas" will get you lynched in today's political climate though, and your words dismissed as racist. When a computer says the same we can't just write it off as racism (with the presumption that RACIST == FALSE)

  4. #4
    Faster. Than. Ever. Sloth's Avatar
    Type
    INTP
    Join Date
    May 2014
    Location
    Somewhere, I'm sure.
    Posts
    1,799
    INTPx Award Winner
    Quote Originally Posted by pathogenetic_peripatetic View Post
    I think a more interesting question is "is acknowledging that some facts support racist stereotypes in itself racist?" I mean, obviously some black people look a lot like gorillas, or this wouldn't have happened. Saying "some black people look a lot like gorillas" will get you lynched in today's political climate though, and your words dismissed as racist. When a computer says the same we can't just write it off as racism (with the presumption that RACIST == FALSE)
    This is a terrible example to attack the validity of political correctness. The fact that black people are more closely associated with looking like gorillas and monkeys than other races IS actually racist.

    We all look like fucking monkeys, pay more attention. There's a reason people get destroyed publicly when saying things like that; it isn't always out of oversensitivity.

  5. #5
    a fool on a journey pensive_pilgrim's Avatar
    Type
    INTP
    Join Date
    Dec 2013
    Posts
    3,470
    Quote Originally Posted by Sloth View Post
    This is a terrible example to attack the validity of political correctness. The fact that black people are more closely associated with looking like gorillas and monkeys than other races IS actually racist.
    Well first of all, I wasn't attacking anything, so I think you've gotten the wrong impression from my post.

    Secondly, whether or not black people actually do resemble gorillas more than other races is irrelevant to the point I was making. I can call white people monkeys all day and nobody will get mad, but calling a black person a monkey is usually pretty highly offensive and hurtful.

    Finally, it's your examples that are terrible. It's obvious that none of those matches were made by a computer, and determining whether one image resembles another is a completely different world from trying to determine what an image contains.

    Regardless, "some white people look like monkeys" could not ever be a refutation of "some black people look like gorillas".

  6. #6
    Utisz's Avatar
    Type
    INxP
    Join Date
    Dec 2013
    Location
    Ayer
    Posts
    2,760
    Quote Originally Posted by prometheus View Post
    No, a machine programmed to be racist would only indicate that its programmer is racist.

    This is a stupid question and the way it's phrased makes me think you're not actually looking for well-thought-out answers.
    I posted the OP in a rush and went with a title that was click-baity and that I admitted was flawed. I didn't want to make the question technical straight away; I wanted to leave some leeway to see where it might go.

    Machines programmed to be politically correct? Is that some sort of joke?
    Not really, no.

    Quote Originally Posted by pathogenetic_peripatetic View Post
    I think a more interesting question is "is acknowledging that some facts support racist stereotypes in itself racist?" I mean, obviously some black people look a lot like gorillas, or this wouldn't have happened. Saying "some black people look a lot like gorillas" will get you lynched in today's political climate though, and your words dismissed as racist. When a computer says the same we can't just write it off as racism (with the presumption that RACIST == FALSE)
    I would presume that, given the fuzzy nature of image recognition, Google Photos has made lots of dumb mistakes, and that this one is being talked about because it would be racist in any other context.

    But yes, I think to be more specific about what I'm asking: if a general purpose inductive learning algorithm gets trained on a shit load of objective data, but then starts spewing out racist conclusions, is that racist? Should the machine effectively be censored?

    Quote Originally Posted by Sloth View Post
    The fact that black people are more closely associated with looking like gorillas and monkeys than other races IS actually racist. ...

    We all look like fucking monkeys, pay more attention. There's a reason people get destroyed publicly when saying things like that; it isn't always out of oversensitivity.
    Knowing roughly how these sorts of algorithms work at a high level, the algorithm is a lot more likely to mistakenly tag black people as gorillas than white people due to a high concentration of dark pixels in both images of black people's faces and images of gorillas' faces. The algorithm would have learned those patterns itself from lots of training data rather than being instructed in any manner. White people, on the other hand, are more likely to be tagged as cream crackers than black people.
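    To make the mechanism concrete, here's a toy sketch (invented data and labels, nothing like Google's actual system) of how a classifier that only sees one crude feature -- mean pixel brightness -- will conflate any two classes whose training images happen to overlap on that feature, regardless of what the images depict:

```python
import random

random.seed(0)

def make_samples(mean_brightness, n=50, spread=10.0):
    """Fake 'images', each summarised by a single mean-brightness value."""
    return [random.gauss(mean_brightness, spread) for _ in range(n)]

# Hypothetical training data: classes A and B overlap in brightness; C doesn't.
training = {
    "class_A": make_samples(60),
    "class_B": make_samples(65),
    "class_C": make_samples(180),
}

# 'Learning' here is just averaging the training feature per class.
centroids = {label: sum(xs) / len(xs) for label, xs in training.items()}

def classify(brightness):
    """Nearest-centroid rule on the single brightness feature."""
    return min(centroids, key=lambda label: abs(centroids[label] - brightness))

# A new image near the A/B overlap can be mislabelled either way,
# while anything bright is reliably tagged class_C.
print(classify(62))   # class_A or class_B -- the feature can't tell them apart
print(classify(175))  # class_C
```

    The point is that nobody "programmed" the confusion: it falls out of which features the model can see and what its training data happened to look like.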

  7. #7
    Faster. Than. Ever. Sloth's Avatar
    Type
    INTP
    Join Date
    May 2014
    Location
    Somewhere, I'm sure.
    Posts
    1,799
    INTPx Award Winner
    Quote Originally Posted by pathogenetic_peripatetic View Post
    Well first of all, I wasn't attacking anything, so I think you've gotten the wrong impression from my post.

    Secondly, whether or not black people actually do resemble gorillas more than other races is irrelevant to the point I was making. I can call white people monkeys all day and nobody will get mad, but calling a black person a monkey is usually pretty highly offensive and hurtful.

    Finally, it's your examples that are terrible. It's obvious that none of those matches were made by a computer, and determining whether one image resembles another is a completely different world from trying to determine what an image contains.

    Regardless, "some white people look like monkeys" could not ever be a refutation of "some black people look like gorillas"
    -I wasn't saying you're racist; I was saying the statement is.
    -It's racist because it's disproportionately emphasized. It's similar to saying "many women are stupid" and making no other statement: though it doesn't negate the fact that "many men are stupid", putting a disproportionate amount of emphasis on a negative trait while disproportionately ignoring that same trait in other groups is discriminatory and destructive.
    -I also don't agree that people don't get angry when you compare white people to monkeys (no one ever says it, so it seems like no one cares)...
    -My examples weren't addressing whether computers/technology are racist; I agree with prometheus that it's a stupid question, for the same reasons he thinks it's stupid.

  8. #8
    Faster. Than. Ever. Sloth's Avatar
    Type
    INTP
    Join Date
    May 2014
    Location
    Somewhere, I'm sure.
    Posts
    1,799
    INTPx Award Winner
    Quote Originally Posted by Utisz View Post
    Knowing roughly how these sorts of algorithms work on a high level, the algorithm is a lot more likely to mistakenly tag black people as gorillas than white people due to a high concentration of dark pixels in both images of black people's faces and images of gorillas faces. The algorithm would have learned those patterns itself from lots of training data rather than being instructed in any manner. White people, on the other hand, are more likely to be tagged as cream crackers than black people.
    Yeah, I know; I realized that my response was actually sort of beginning to get the thread off topic. It was intended to point out that pathogenetic's example was a poor one for his argument about the arbitrariness of political correctness (which I also agree is often arbitrary, BTW; it was just a really, really bad example to use for it).

    Sorry to the OP for not really addressing the thread's topic in my first post.

  9. #9
    Utisz's Avatar
    Type
    INxP
    Join Date
    Dec 2013
    Location
    Ayer
    Posts
    2,760
    Quote Originally Posted by Sloth View Post
    -My examples weren't addressing whether computers/technology are racist; I agree with prometheus that it's a stupid question, for the same reasons he thinks it's stupid.
    Quote Originally Posted by Sloth View Post
    Sorry to the OP for not really addressing the thread's topic in my first post.
    No worries. The question is admittedly stupid ... perhaps the more interesting question for me is why it is stupid.

    I know, for example, that there are systems that can guess a person's gender just from their writing with pretty high accuracy. I don't have time to find an exact reference right now, but an example system is this one (not necessarily the one with high accuracy, but the one with the best Google score).

    Assuming a machine starts from a state of zero, is fed a bunch of examples, and is told that these are written by males and those by females, it learns some patterns from those examples and can then apply those patterns to specific individuals to classify text as "male" or "female". The idea is a technical parallel to stereotyping and then applying that prejudice to individuals. It is sexist to assume with any confidence that someone writes in a certain way or chooses certain words because of their gender. And yet these systems exist and are accurate. So where in the system does the sexism originate (if at all)?
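    At a high level, a minimal version of such a system is just word statistics: count which words each labelled group of documents used, then score new text against those counts. Here's a sketch with a tiny invented corpus (the labels and word lists are made up purely for illustration; real systems train on thousands of documents and richer features):

```python
import math
from collections import Counter

# Invented training corpus: texts labelled by the (claimed) gender of the author.
corpus = {
    "male":   ["the match stats were brilliant", "fixed the server config again"],
    "female": ["the garden looked lovely today", "lovely chat with my sister"],
}

# The learned 'patterns' are nothing but per-label word frequencies.
counts = {label: Counter(w for text in texts for w in text.split())
          for label, texts in corpus.items()}
totals = {label: sum(c.values()) for label, c in counts.items()}
vocab = {w for c in counts.values() for w in c}

def score(text, label):
    """Log-likelihood of the text under a bag-of-words model for `label`,
    with add-one smoothing so unseen words don't zero out the probability."""
    return sum(math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
               for w in text.split())

def classify(text):
    return max(counts, key=lambda label: score(text, label))

print(classify("lovely garden"))  # female
print(classify("server config"))  # male
```

    Note that the algorithm itself is symmetric: any "sexism" in the verdicts lives entirely in the word-label correlations present in the training data it was fed.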



    Anyways, I stuck a blog post of mine into the analyser above and it classified me as "Verdict: Weak FEMALE".
    EDIT: and some CS work stuff ... "Verdict: MALE".

  10. #10
    Faster. Than. Ever. Sloth's Avatar
    Type
    INTP
    Join Date
    May 2014
    Location
    Somewhere, I'm sure.
    Posts
    1,799
    INTPx Award Winner
    Quote Originally Posted by Utisz View Post
    Assuming a machine starts from a state of zero, is fed a bunch of examples, and is told that these are written by males and those by females, it learns some patterns from those examples and can then apply those patterns to specific individuals to classify text as "male" or "female". The idea is a technical parallel to stereotyping and then applying that prejudice to individuals. It is sexist to assume with any confidence that someone writes in a certain way or chooses certain words because of their gender. And yet these systems exist and are accurate. So where in the system does the sexism originate (if at all)?
    I wonder how much it's picking up sociological correlations vs. measurable differences in brain functioning. It'd be really hard to tell, heh, especially since the two impact each other. That's some really interesting technology. I'd love to give my handwriting a whirl; I've always felt I wrote very androgynously.

    Edit: Clicked the link and realized that you didn't mean handwriting, you meant just putting words together. I'm so going to experiment with this for a while haha
