Doesn't Work Well

#1
by qpqpqpqpqpqp - opened

I used some "sfw" images as inputs and it labeled them nsfw, but when I used an image with mild (not real) blood it said it was sfw. I gave it an image of an erection under clothes, a huge detailed bulge, and it thought that was sfw, yet bare flat male chests and cigarettes are "nsfw". What's wrong?

I also got this error: `ValueError: Unable to infer channel dimension format`

Viddexa AI org

Thank you for reaching out! In the current version, the model detects sexual nsfw content only. We are working on including other types of violations such as violence and drugs. As for the misclassifications: the model is unfortunately not 100% accurate, but I am surprised by how many images it misclassified. Could you perhaps provide links to those images, or send them by email to [email protected]? Also, please include a screenshot of the ValueError traceback so we can look into that.
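For context on the ValueError: Hugging Face image processors typically raise "Unable to infer channel dimension format" when the input image's channel layout is ambiguous (for example grayscale or RGBA uploads rather than plain 3-channel RGB). This is an assumption about the cause, not a confirmed diagnosis, but a common workaround is converting the image to RGB before passing it to the classifier:

```python
from PIL import Image

# Build a sample RGBA image in memory (stands in for a user upload;
# a real case would be Image.open("your_image.png")).
img = Image.new("RGBA", (64, 64), (255, 0, 0, 128))

# Grayscale (L), palette (P), or RGBA inputs can confuse
# channel-dimension inference, so force 3-channel RGB first.
rgb = img.convert("RGB")
print(rgb.mode)  # RGB
```

If the error persists on plain RGB inputs, a screenshot of the full traceback would help pin it down.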

[3 screenshots attached]

Viddexa AI org

@photoreg @qpqpqpqpqpqp can you try our latest model in the same demo? We just released nsfw-detection-2-mini, which supports 5 classes instead of 2.

fcakyon changed discussion status to closed
