Helping people with photosensitive epilepsy browse the web safely

A bot that scans the web and reports content that is dangerous for people with photosensitive epilepsy to view



Analyzing content in real time

🤖 Number of Images, GIFs and videos analyzed: 1739206

✅ Safe images, GIFs and videos checked: 1738978

❌ Dangerous images, GIFs and videos reported: 228

Updated every 10 minutes.

What is photosensitive epilepsy ✨

Epilepsy is the fourth most common neurological disorder and is characterized by recurrent epileptic seizures.

It affects people of all ages and affects around 1% of the population.

For about 5% of people with epilepsy (millions of people worldwide), exposure to flashing lights at certain intensities, or to certain visual patterns, can trigger seizures.

This condition is known as photosensitive epilepsy.

Photosensitive epilepsy and screens 💻

The internet can be a dangerous place for people with photosensitive epilepsy.

There have been cases where flashing GIFs on computer or mobile phone screens have triggered epileptic seizures.

And even more extreme cases where flashing content has sent hundreds of people with photosensitive epilepsy to the hospital.

What makes content dangerous 💥

In general, there are three things that can make media content dangerous:

1) Large enough flashes, with great enough contrast, at specific frequencies

2) Frame transitions involving saturated reds

3) Geometric patterns, such as stripes
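The first criterion can be sketched in code. The snippet below is a minimal illustration, not the actual detection algorithm: it counts large frame-to-frame luminance swings and flags a clip that exceeds the WCAG 2.0 limit of three general flashes in any one-second period. The luminance-swing threshold here is a placeholder, not the real Harding/Ofcom value, and real analyzers also measure the *area* of the screen that flashes.

```python
import numpy as np

FLASH_LUMINANCE_DELTA = 0.1   # placeholder swing threshold, not the official value
MAX_FLASHES_PER_SECOND = 3    # WCAG 2.0: more than 3 general flashes/second is unsafe

def mean_luminance(frame: np.ndarray) -> float:
    """Mean relative luminance of an RGB frame scaled to [0, 1]."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return float(np.mean(0.2126 * r + 0.7152 * g + 0.0722 * b))

def count_flashes(frames: list[np.ndarray]) -> int:
    """Count luminance direction changes; two opposing swings = one flash."""
    lum = [mean_luminance(f) for f in frames]
    swings, direction = 0, 0
    for prev, curr in zip(lum, lum[1:]):
        delta = curr - prev
        if abs(delta) >= FLASH_LUMINANCE_DELTA:
            new_direction = 1 if delta > 0 else -1
            if new_direction != direction:
                swings += 1
                direction = new_direction
    return swings // 2

def is_dangerous(frames: list[np.ndarray], fps: float) -> bool:
    """Flag the clip if its flash rate exceeds the threshold."""
    seconds = max(len(frames) / fps, 1e-9)
    return count_flashes(frames) / seconds > MAX_FLASHES_PER_SECOND
```

A clip that alternates black and white frames at 30 fps would be flagged by this check, while a static clip would pass.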

How this bot works 🤖

Scanning the whole internet is impossible. Fortunately, you don't have to do that.

Most of the world's web traffic is concentrated on five websites: Facebook, Instagram, Twitter, TikTok and YouTube.

Our bot will scan these websites and instantly report any dangerous visual content.

What is the algorithm behind this 🧬

This specialized algorithm couldn't have been built without standing on the shoulders of giants.

The algorithm is based on the work of leading scientists in photosensitive epilepsy, like the late Dr. Harding and Professor Binnie from King's College London.

It also follows the well-established guidelines of Ofcom and WCAG 2.0.

Analyzing videos and GIFs isn't as easy as it sounds.

Let's say you want to analyze a simple GIF: 600x600 pixels, 200 frames. That's 600 x 600 x 200 = 72 million pixels to analyze!

In order to do this in real time, the bot uses Python, linear algebra and optimized Python packages written in C.
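The "linear algebra plus C-backed packages" idea can be illustrated with NumPy, using the exact GIF dimensions from the example above. This is a sketch of the general technique, not the bot's actual code: instead of looping over 72 million pixels in Python, the whole clip is processed in a few vectorized calls that run in compiled C.

```python
import numpy as np

# A 600x600, 200-frame clip as one array: (frames, height, width, channels).
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(200, 600, 600, 3), dtype=np.uint8)

# Relative luminance of every pixel of every frame in a single matmul.
weights = np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
luminance = frames @ weights                       # shape (200, 600, 600)

# Mean brightness change between consecutive frames, also fully vectorized.
deltas = np.abs(np.diff(luminance.mean(axis=(1, 2))))   # shape (199,)
```

Each of these calls touches all 72 million pixels, but the per-pixel work happens inside NumPy's C loops rather than the Python interpreter, which is what makes real-time analysis feasible.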

Who is behind this project 🙋‍♂️

This project is not funded by any non-profit organization, corporation or government.

It's just me, Alex. I'm a computer scientist, internet entrepreneur and traveler.

I built this project by myself because I saw that there was nothing protecting people with photosensitive epilepsy online.

Actually, I stumbled upon this idea in a very random way.

You can read about it here.

What's next ❤️

Well, there are many things I want to do next. Some of them are:

1) Make the project open source, so everyone can contribute and build upon this technology.

2) Scan other social media platforms like TikTok, Instagram, Facebook and YouTube.

3) Scan films and build a database with dangerous and safe movies.

4) Much, much more.