Not Suitable for Work (NSFW) image classification using Keras and TensorFlow.js.
Defining NSFW material is subjective and the task of identifying these images is non-trivial.
Images are classified into two categories:
[SFW] positively trained for neutral images that are safe for work.
[NSFW] negatively trained for pornographic images involving sexually explicit acts or drawings.
Finding what the convolutional layers' filters are sensitive to by calculating the
maximally-activating NSFW input image for each convolutional filter
through gradient ascent in the input space.
Getting the internal activations of the convnet using the
functional model API of TensorFlow.js.
Finding which parts of an input image are most relevant to the
classification decision made by the convnet, using the gradient-based class activation map (CAM) technique.
NSFW input image and classification:
What to visualize: