Anonymous ID: 33f903 Sept. 10, 2020, 7:46 p.m. No.10598827   >>8836 >>8847 >>8913 >>8940

>>10598745

If they are using AI (as opposed to human intervention), another possibility might be to "hide" the meme inside a larger picture frame that is either beneficial for us to get banned (Dem/Biden 2020 electoral pictures found on Twitter) or too bland to aid detection.
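
A minimal sketch of that framing idea in Python with Pillow; the file names, scale factor, and placement are placeholder assumptions, not anything specified above.

```python
# Sketch: paste the meme into a much larger, bland background image so automated
# matching sees mostly the outer frame rather than the meme itself.
# File names and the scale factor are placeholders.
from PIL import Image

def hide_in_frame(meme_path, frame_path, out_path, scale=0.4):
    frame = Image.open(frame_path).convert("RGB")
    meme = Image.open(meme_path).convert("RGB")

    # Shrink the meme relative to the frame so it occupies only part of the canvas.
    target_w = int(frame.width * scale)
    target_h = int(meme.height * target_w / meme.width)
    meme = meme.resize((target_w, target_h))

    # Paste it off-center; the exact placement is arbitrary.
    x = (frame.width - target_w) // 2
    y = (frame.height - target_h) // 3
    frame.paste(meme, (x, y))
    frame.save(out_path)

hide_in_frame("meme.png", "bland_background.jpg", "framed_meme.jpg")
```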

Reversing colors, mirroring, and adding black squares will also help obfuscate detection by AI.
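
A similar sketch for the color-reverse/mirror/black-square step, again with Pillow; the number, size, and placement of the squares are arbitrary assumptions.

```python
# Sketch: invert colors, mirror horizontally, and stamp a few black squares.
# Square count, size, and positions are arbitrary.
from PIL import Image, ImageOps, ImageDraw
import random

def obfuscate(path, out_path, n_squares=5, square_size=24):
    img = Image.open(path).convert("RGB")

    # Reverse colors and mirror, as suggested above.
    img = ImageOps.invert(img)
    img = ImageOps.mirror(img)

    # Stamp a handful of black squares at random positions.
    draw = ImageDraw.Draw(img)
    for _ in range(n_squares):
        x = random.randint(0, img.width - square_size)
        y = random.randint(0, img.height - square_size)
        draw.rectangle([x, y, x + square_size, y + square_size], fill="black")

    img.save(out_path)

obfuscate("meme.png", "obfuscated_meme.png")
```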

Anonymous ID: 33f903 Sept. 10, 2020, 8:03 p.m. No.10598961   >>8973 >>9226

>>10598842

>>10598867

>>10598777 Checked

>>10598893

For phase 2, we can take a picture of a screen and use the moiré pattern to our advantage. Call it Digital Camo if you will.
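
The post describes physically photographing a screen; purely as an illustration, here is a rough software approximation that overlays a faint periodic pattern so two slightly mismatched pixel grids interfere. The period and strength values are made-up assumptions.

```python
# Rough approximation of the "digital camo" idea, assuming NumPy and Pillow.
# A real screen photo gets this interference for free; here we fake it by adding
# a sinusoidal grid whose period is close to, but not equal to, the pixel pitch.
import numpy as np
from PIL import Image

def digital_camo(path, out_path, period=3.1, strength=12):
    img = np.asarray(Image.open(path).convert("RGB")).astype(np.float32)
    h, w, _ = img.shape

    # Periodic pattern slightly off the pixel grid -> moire-like interference bands.
    yy, xx = np.mgrid[0:h, 0:w]
    pattern = np.sin(2 * np.pi * xx / period) * np.sin(2 * np.pi * yy / period)

    out = np.clip(img + strength * pattern[..., None], 0, 255).astype(np.uint8)
    Image.fromarray(out).save(out_path)

digital_camo("meme.png", "camo_meme.png")
```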

Copying the image to the clipboard will also strip your EXIF data. Otherwise, look up how to manually remove EXIF data (the metadata stored by the device you take the picture with). See pic related (random data shown).
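
One way to strip that metadata manually, as a sketch with Pillow; rebuilding the image from its pixel data alone drops whatever EXIF block the camera wrote. The file names are placeholders.

```python
# Sketch: copy only the pixels into a fresh image, so the EXIF/metadata block
# written by the camera is not carried over to the saved file.
from PIL import Image

def strip_exif(path, out_path):
    img = Image.open(path)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))  # pixels only, no metadata
    clean.save(out_path)

strip_exif("photo.jpg", "photo_no_exif.jpg")
```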

Anonymous ID: 33f903 Sept. 10, 2020, 8:11 p.m. No.10599043   >>9064

>>10598812

Agreed. My guess is they can manually "tweak" the sensitivity to suit their needs. If we can get them to ban pictures that are to our advantage (pro-Dem/pro-Biden), that would force them to lower the threshold. Win-win either way you cut it.

Also keep in mind they will be building a database to fingerprint and data-mine critical memes for later use near the election date.
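
The post does not say how such a fingerprint database would work; one common technique for matching near-duplicate images is perceptual hashing, sketched here with the third-party imagehash library purely as an illustration. File names are placeholders.

```python
# Illustration only: perceptual hashing is one way a platform could fingerprint
# memes and match altered copies. Requires: pip install pillow imagehash
from PIL import Image
import imagehash

original = imagehash.phash(Image.open("meme.png"))
variant = imagehash.phash(Image.open("obfuscated_meme.png"))

# Hamming distance between the two 64-bit hashes: a small distance means the
# variant still "fingerprints" as the same image, a large one means it likely won't.
print(original - variant)
```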

Sauce:

>Browserleaks.com

Anonymous ID: 33f903 Sept. 10, 2020, 8:38 p.m. No.10599317   >>9375

>>10599226

Anybody can hold a camera and take a picture; the bonus is that it would be much harder to detect because of the added variance. Just make sure you don't forget to strip the EXIF data.