There is a bug in YouTube’s AI filter that is causing some videos to flicker. The content creators have no idea this is happening and have no way to opt out. If you have epilepsy, it is recommended that you watch out for these situations.
Also, since content creators don’t know this may be happening, if you find these issues you should try to contact the respective creator and let them know it is happening on your specific device. Again, it’s so random that it may only happen on some devices.


If you have epilepsy, watch out so you don’t randomly see something that could potentially kill you.
Great job, Google.
I’m not saying this isn’t true (nor am I saying I do think it’s true), but this entire post gives off the vibes of Halloween candy scares.
People take epilepsy risk a bit far. I mean, I get it, they don’t want to kill someone, but people with epilepsy know how much flickering is required to be a problem, and honestly it’s a lot. That flickering would be extremely annoying, but it’s not going to be dangerous to me.
Although I must say I’ve never actually seen this issue, and I watch a lot of YouTube because of my job, which requires me to be on call but not actually doing anything.
There is an issue YouTube occasionally has where a video will lock up for a couple of seconds, always at exactly the same point, and always recovers at exactly the same point. But that’s a very rare issue, it isn’t flashing so it’s unlikely to trigger seizures, and it has been around for years now, so it has nothing to do with AI.
FWIW, photosensitive epilepsy is typically only triggered at flash rates between 3 and 30 hertz. The rate of flashing shown is extremely unlikely to cause seizures even in generally susceptible individuals.
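If anyone wants to sanity-check a clip themselves, here’s a rough sketch of how you could estimate its flicker rate against that 3-30 Hz band. This is just an illustration, not a clinical tool: it assumes Python with OpenCV and NumPy installed, “clip.mp4” is a placeholder filename, and it only measures whole-frame average brightness, so localized flashing would slip past it.

```python
# Rough sketch only: estimate a clip's flicker rate from mean frame brightness.
# Assumes OpenCV (cv2) and NumPy are installed; "clip.mp4" is a placeholder.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")
if not cap.isOpened():
    raise SystemExit("couldn't open the video file")
fps = cap.get(cv2.CAP_PROP_FPS)

luma = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Whole-frame average brightness; localized flashes won't show up here.
    luma.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean())
cap.release()

luma = np.asarray(luma)
centered = luma - luma.mean()
# Each full bright/dark cycle crosses the mean twice, so
# frequency ~ (crossings / 2) / duration.
crossings = np.count_nonzero(np.diff(np.sign(centered)) != 0)
flash_hz = (crossings / 2) / (len(luma) / fps)

print(f"approximate flicker rate: {flash_hz:.1f} Hz")
if 3 <= flash_hz <= 30:
    print("within the commonly cited 3-30 Hz photosensitive risk band")
```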
I don’t know the type of flashes caused by this glitch, but the post specifically warns about epilepsy.
That part of the post might be wrong, and that’s completely fair; I trust you have less to gain from lying than a post craving clicks.
Though my point is: if it is dangerous, what are they supposed to look out for? The “recommendation” should be that Google stop doing this shit that puts people at random risk.
If it’s just annoying, then it’s just a glitch, the same as for other people who are photosensitive but don’t have epilepsy.
The danger is honestly pretty minimal for people who are aware they have photosensitive epilepsy; those who are prone to it but unaware are not likely to heed warnings even where they exist, since they won’t typically perceive the risk until after experiencing it.
It takes several minutes from triggering exposure to actual seizure onset, so those who know of their susceptibility have time to stop exposure and make sure they’re in a safe position if a seizure does come. There are many ways of mitigating the seizure risk: stopping exposure, closing one eye and facing away from the light source, keeping screen brightness at the lowest level at which you can still easily read, etc.
That’s not to say I think warnings aren’t useful, but the intensity of many of the warnings people use is disproportionate to the actual risk and can cause people to be much more worried than necessary, IMO. Google et al. really need to stop messing with videos via AI without any sort of notice or warning, for a whole host of reasons, including broader non-epileptic photosensitivity, since becoming intensely nauseous or getting a migraine over it is still pretty annoying.
tl;dr I think the warnings are a good idea, but maybe a little broader and less “OMG the epileptics are all gonna die”. And in general, fuck companies silently using AI to manipulate content they didn’t even produce.
Well said.
Did they ever give any reason why they even consider it an acceptable thing to do?
Imagine making an art piece to be displayed in a museum, only to find they allowed an intern with next to zero experience to paint over it.
Not that I’m aware of, unfortunately. They seem to be trying to pretend they aren’t even doing it, so an explanation of why they’re so obviously modifying content doesn’t seem likely until they’re backed into a corner by popular outrage.