Page 7 of 8
Results 61 to 70 of 80

Thread: Elsagate

  1. #61
    Meae Musae Servus Hephaestus
    Type
    INTP
    Join Date
    Dec 2013
    Location
    Ceti Alpha V
    Posts
    12,014
    Quote Originally Posted by pensive_pilgrim View Post
    Two problems.

    1) There is nothing to prevent that separate upload path from being overwhelmed in the same way the first one was.
    2) All of the content kids want to see is already on the regular youtube. Now content creators have to reupload all of their content to a second site, and then maintain a presence in both places if they want to have an audience that includes people of all ages.
    Scope.

    The subsection is visible from the supersection, but from within the subsection, only the subsection is visible.

    If all content is manually vetted, overwhelming doesn't matter. In fact, the worst stuff is going to be really really easy. The moment you see something off, trash it, move to the next vid. If the channel gets overwhelmed, well, that's what big data should be able to do: find the perps, block their accounts, but don't tell them. Hell, let them think it's uploaded as long as they're looking from their own accounts--but that's a bigger meta-strategy.

    I don't consider the transfer of non-objectionable content from main YouTube to kiddie YouTube to be a significant problem, since we've already established that this sequestering exists and that the desired content there is being overwhelmed by the bad for lack of oversight.
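    What I mean by scope can be sketched in a few lines. This is a hypothetical illustration in Python, not anything YouTube actually exposes; all names are made up:

```python
# Sketch of the scope pattern: one catalog, two views.
# Hypothetical names for illustration only.

FULL = "full"      # the supersection: all of YouTube
VETTED = "vetted"  # the subsection: manually approved, kid-safe

catalog = [
    {"title": "Nursery rhymes", "scope": VETTED},
    {"title": "Gaming stream", "scope": FULL},
]

def visible_to(user_scope):
    """A subset-restricted (kids) user sees only the vetted subset;
    an unrestricted user sees everything, vetted included."""
    if user_scope == VETTED:
        return [v for v in catalog if v["scope"] == VETTED]
    return list(catalog)

print([v["title"] for v in visible_to(VETTED)])  # ['Nursery rhymes']
print([v["title"] for v in visible_to(FULL)])    # ['Nursery rhymes', 'Gaming stream']
```

    The vetted set is a strict subset of the whole, so nothing has to be re-uploaded anywhere; approval just changes which view a video appears in.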
    For some, "how", not "why", is the fundamental unit of measure for curiosity. This divergence is neither parallel, nor straight. Where one might have a "why?-5" problem, it might only be a "how?-2" question. But then, there are also many things where the "why?" is immediately obvious but the "how?" is best measured in centuries of perpetual wonder. Both approaches have their drawbacks.

    If one is superior, the other is unaware of it.

    --Meditations on Uncertainty Vol ξ(x)

  2. #62
    Quote Originally Posted by Hephaestus View Post
    Scope.

    The subsection is visible from the supersection, but from within the subsection, only the subsection is visible.

    If all content is manually vetted, overwhelming doesn't matter. In fact, the worst stuff is going to be really really easy. The moment you see something off, trash it, move to the next vid. If the channel gets overwhelmed, well, that's what big data should be able to do: find the perps, block their accounts, but don't tell them. Hell, let them think it's uploaded as long as they're looking from their own accounts--but that's a bigger meta-strategy.
    I'm getting frustrated trying to make this easier to understand. There is so much content that people want to upload that it would be impossible to even glance at a fraction of it before deciding to approve it. You can trash the obvious stuff all you want, there could be ten thousand of you doing it, and you are not going to come anywhere close to getting through the queue. I've tried to illustrate this with numbers: 300 hours of video are uploaded to YouTube every minute. Until it is analyzed, any of it could be kid-friendly content or hardcore pornography.

    I don't consider the transfer of non-objectionable content from main YouTube to kiddie YouTube to be a significant problem, since we've already established that this sequestering exists and that the desired content there is being overwhelmed by the bad for lack of oversight.
    YouTube Kids is not a separate site. YouTube Kids serves content from YouTube. The entire point is to make YouTube (which was already popular with kids) safe for kids.
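    The mismatch described above can be checked with the thread's own numbers. The 8-hour shifts and real-time review speed are assumptions added for the sketch; the upload rate and the ten-thousand-reviewer figure come from the post:

```python
# Back-of-envelope check using the figures from the post above.
upload_rate = 300 * 60 * 24   # hours of video uploaded per day (300 h/min)
reviewers = 10_000            # hypothetical headcount from the post
review_rate = reviewers * 8   # hours reviewed per day, assuming 8 h shifts
                              # and review at real-time playback speed

print(upload_rate)                # 432000 hours uploaded per day
print(review_rate)                # 80000 hours reviewed per day
print(review_rate / upload_rate)  # ≈ 0.185: under a fifth of the queue
```

    Even under those generous assumptions, ten thousand full-time reviewers glance at less than a fifth of a single day's uploads.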

  3. #63
    Hephaestus
    Quote Originally Posted by pensive_pilgrim View Post
    I'm getting frustrated trying to make this easier to understand. There is so much content that people want to upload that it would be impossible to even glance at a fraction of it before deciding to approve it. You can trash the obvious stuff all you want, there could be ten thousand of you doing it, and you are not going to come anywhere close to getting through the queue. I've tried to illustrate this with numbers: 300 hours of video are uploaded to YouTube every minute. Until it is analyzed, any of it could be kid-friendly content or hardcore pornography.


    YouTube Kids is not a separate site. YouTube Kids serves content from YouTube. The entire point is to make YouTube (which was already popular with kids) safe for kids.
    I give no shits about how much content there is to be parsed, only that the content parsed is parsed correctly. Brute force works. You're talking about so much content, it doesn't matter how much doesn't get looked at.

    I understand the idea of YouTube Kids. Do you understand what I mean by scope? A vetted subset that is visible to any user, but that defines the full extent of what subset-restricted users can see? It's a common pattern.

    You don't have to vet all of YouTube. Just the videos for YouTube Kids, and only those whose creators ask to be put on YouTube Kids.

    I understand it. The thing is, we have completely different ideas about what an acceptable end state is. I'm getting frustrated trying to explain to you that you don't have to check everything, just those things that are going to be set aside as kid-safe, and you don't have to have everything that is kid-safe marked as kid-safe. Given the amount of content, I don't even see the point of trying, as apparently it wouldn't take long to find enough kid-safe content that no kid is going to finish it all before they age out of anyone caring.

    Meanwhile, what you want is horrific. Think about what such a tool would be used to do. Google already has considerable power over what most people see on the internet. A re-toolable AI that can parse that amount of data for semiotic nuance and then filter it so the user doesn't even see it could, on one hand, be a good search engine; on the other, it would take no more than a nudge to turn it into an instrument of massive, pervasive information control that utterly destroys the vision you're promoting of the internet as a vast resource of information.

    You're saying we need a hydrogen bomb for a problem that could be solved with a flashlight.

  4. #64
    Quote Originally Posted by Hephaestus View Post
    I give no shits about how much content there is to be parsed, only that the content parsed is parsed correctly. Brute force works. You're talking about so much content, it doesn't matter how much doesn't get looked at.

    I understand the idea of YouTube Kids. Do you understand what I mean by scope? A vetted subset that is visible to any user, but that defines the full extent of what subset-restricted users can see? It's a common pattern.

    You don't have to vet all of YouTube. Just the videos for YouTube Kids, and only those whose creators ask to be put on YouTube Kids.

    I understand it. The thing is, we have completely different ideas about what an acceptable end state is. I'm getting frustrated trying to explain to you that you don't have to check everything, just those things that are going to be set aside as kid-safe, and you don't have to have everything that is kid-safe marked as kid-safe. Given the amount of content, I don't even see the point of trying, as apparently it wouldn't take long to find enough kid-safe content that no kid is going to finish it all before they age out of anyone caring.

    Meanwhile, what you want is horrific. Think about what such a tool would be used to do. Google already has considerable power over what most people see on the internet. A re-toolable AI that can parse that amount of data for semiotic nuance and then filter it so the user doesn't even see it could, on one hand, be a good search engine; on the other, it would take no more than a nudge to turn it into an instrument of massive, pervasive information control that utterly destroys the vision you're promoting of the internet as a vast resource of information.

    You're saying we need a hydrogen bomb for a problem that could be solved with a flashlight.
    Okay dude. You've got it all figured out. Google should hire you to make the new version of YouTube kids that doesn't have 99.999% of the stuff that kids want to watch on youtube. I don't know why they can't see that it's such a simple problem with a simple solution: brute force!

  5. #65
    Hephaestus
    Quote Originally Posted by pensive_pilgrim View Post
    Okay dude. You've got it all figured out. Google should hire you to make the new version of YouTube kids that doesn't have 99.999% of the stuff that kids want to watch on youtube. I don't know why they can't see that it's such a simple problem with a simple solution: brute force!
    If what I proposed had been the starting point, Elsagate wouldn't be about content seen via YouTube Kids.


    Edit: So if you had 100 people working a normal 40-hour workweek, checking content starting at the launch of YouTube Kids, and if we assume they can approve about 15 minutes' worth of content per hour worked, then on their first anniversary they would have 40 × 52 × 0.25 × 100 = 52,000 hours of approved kid-safe content. Convert that to days and you have almost six years' worth of content. Hire multiple teams, at least one for each region being served, because local standards vary. The tricky bit would be letting the different regions share data. But that would be a project for year two.
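    The arithmetic checks out; a quick sketch with the values straight from the post (the 15-minutes-approved-per-reviewer-hour rate is the post's own assumption):

```python
# Reproducing the staffing estimate from the post above.
hours_per_week = 40
weeks_per_year = 52
approval_rate = 0.25   # 15 min of approved content per reviewer-hour
staff = 100

approved_hours = hours_per_week * weeks_per_year * approval_rate * staff
days = approved_hours / 24
years = days / 365

print(approved_hours)  # 52000.0 hours of approved content
print(days)            # ≈ 2167 days of continuous playback
print(years)           # ≈ 5.9 years, i.e. "almost six years"
```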

    There would have been some gaffes, some surprise errors in judgment. But nothing like this.
    Last edited by Hephaestus; 12-07-2017 at 09:36 AM.

  6. #66
    Quote Originally Posted by Hephaestus View Post
    If what I proposed had been the starting point, Elsagate wouldn't be about content seen via YouTube Kids.


    Edit: So if you had 100 people working a normal 40-hour workweek, checking content starting at the launch of YouTube Kids, and if we assume they can approve about 15 minutes' worth of content per hour worked, then on their first anniversary they would have 40 × 52 × 0.25 × 100 = 52,000 hours of approved kid-safe content. Convert that to days and you have almost six years' worth of content. Hire multiple teams, at least one for each region being served, because local standards vary. The tricky bit would be letting the different regions share data. But that would be a project for year two.

    There would have been some gaffes, some surprise errors in judgment. But nothing like this.
    Yeah, because nobody would use that. 52,000 hours is about 0.004% of the amount of content on YouTube. The overwhelming majority of the stuff kids want to see would never have a chance to get checked, and what did get approved is now available a year after it was created. Lame as fuck. Great job on stubbornly refusing to grasp the point of YouTube.

    edit: And I still haven't even brought up the point that YouTube isn't just for passive consumption - the idea is to be able to share your own videos and interact through comments. But that's it, I'm officially done with repeatedly explaining why making a TV channel that shows old YouTube videos is completely beside the point.
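    For what it's worth, the 0.004% figure implies a total library of roughly 1.3 billion hours, which squares with the 300-hours-per-minute upload rate sustained over several years. A sanity check, not a sourced statistic:

```python
# Sanity-checking the 0.004% figure against the thread's upload rate.
approved = 52_000                    # hours approved in year one (from above)
fraction = 0.004 / 100               # "0.004%" as quoted in the post
implied_total = approved / fraction  # implied size of YouTube's library

yearly_uploads = 300 * 60 * 24 * 365  # hours added per year at 300 h/min

print(implied_total)                  # ≈ 1.3e9 hours
print(implied_total / yearly_uploads) # ≈ 8.2 years of uploads at that rate
```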

  7. #67
    Member Micawber
    Type
    IXTP
    Join Date
    Nov 2017
    Posts
    52
    Since the Great Chinese Firewall can't block YouTube from determined citizens, I see no reason to believe that American helicopter parents could selectively censor it for their children.

  8. #68
    Hephaestus
    Quote Originally Posted by pensive_pilgrim View Post
    Yeah, because nobody would use that. 52,000 hours is about 0.004% of the amount of content on YouTube. The overwhelming majority of the stuff kids want to see would never have a chance to get checked, and what did get approved is now available a year after it was created. Lame as fuck. Great job on stubbornly refusing to grasp the point of YouTube.

    edit: And I still haven't even brought up the point that YouTube isn't just for passive consumption - the idea is to be able to share your own videos and interact through comments. But that's it, I'm officially done with repeatedly explaining why making a TV channel that shows old YouTube videos is completely beside the point.
    You're stubbornly refusing to accept the difference in target markets. That article you like points out that kids aren't participating in comments, possibly because a lot of the people being targeted with Elsagate videos aren't literate.

    52,000 hours may be a tiny portion of what is on YouTube, but that's the point. Even your solution is about winnowing the total content down to a tiny portion of what is available, yet you keep talking like doing that is a terrible thing unless it's done by machine. That's absurd.

    52,000 hours is also more content than any toddler is going to see while still a toddler, and that's just from one year of such a service.

    Quote Originally Posted by Micawber View Post
    Since the Great Chinese Firewall can't block youtube from determined citizens, I see no reason to believe that American helicopter parents could selectively censor it for their children.
    Another conflation of toddlers with adolescents and adults.

  9. #69
    malarkey oxyjen
    Type
    INtP
    Join Date
    Dec 2013
    Posts
    1,735
    I like the idea of an actual person vetting the videos prior to upload on YouTube Kids; however, given the extra cost, YouTube would likely have to become a paid subscription rather than a free app.

    This will likely sound classist AF, but the people most likely to have their kids in front of a screen unsupervised for long periods are the type that cannot or would not pay for a curated service.

    Also, vetting for the YouTube Kids app addresses the explicit cartoon videos that are directed at kids, but doesn't speak to the videos of child exploitation on YouTube that cater to adults.

    An AI that is better at weeding out or flagging videos prior to upload is necessary, and extra personnel to go through AI- and user-flagged videos would be necessary too, IMO.

  10. #70
    Senior Member Sinny
    Type
    INTP
    Join Date
    Feb 2015
    Location
    Birmingham, UK
    Posts
    1,973
    I was brought up in a Conservative Catholic home, and went to Christian school.

    I think the sexualisation of children is abhorrent, and it (sex ed) is not something they should even be exposed to until puberty. That's when I was naturally becoming curious.

    Kids don't need to know the nasties.

    What fucking sort of society are we living in?

    I don't believe in dictatorial internet censorship, but this falls under the kind of stuff that I think definitely should be censored, for the obvious protection of children and traditional culture.

    It's not rocket science... This stuff is designed to create a whole generation of Miley Cyruses.

    All I ever wanted
    Was the secrets that you keep
    All you ever wanted
    Was the truth I couldn't speak


    ~ Linkin Park

