New submitter OWCareers writes: Yahoo today announced its latest open-source release: a model that can figure out whether images are pornographic in nature. The system uses a type of artificial intelligence called deep learning, which involves training artificial neural networks on lots of data (like dirty images) and getting them to make inferences about new data. The model, now available on GitHub under a BSD 2-Clause license, comes pre-trained, so users only have to fine-tune it if they so choose. The model works with the widely used Caffe open-source deep learning framework; the team trained it using Yahoo's now-open-source CaffeOnSpark system. The new model could be interesting for developers maintaining applications like Instagram and Pinterest that are keen to minimize smut. Search engine operators like Google and Microsoft might also want to check out what's under the hood here. The tool gives each image a score between 0 and 1 indicating how NSFW it looks. The official blog post from Yahoo outlines several examples.
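Downstream applications would threshold that 0-to-1 score to decide what to filter; the open_nsfw repository suggests treating scores below 0.2 as likely safe and scores above 0.8 as highly likely NSFW, with the middle range left to the developer. A minimal sketch of that interpretation (the function name and the exact cutoffs are illustrative, not part of the model):

```python
def nsfw_verdict(score, safe_cutoff=0.2, nsfw_cutoff=0.8):
    """Map a 0-1 NSFW score from the model to a coarse moderation label.

    The 0.2 / 0.8 cutoffs follow the guidance in the open_nsfw repo;
    an app like Pinterest might tune them for its own tolerance.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError("model scores are expected to lie in [0, 1]")
    if score < safe_cutoff:
        return "likely safe"
    if score > nsfw_cutoff:
        return "likely NSFW"
    return "uncertain"  # borderline: route to human review or stricter model
```

In practice the score itself would come from running the pre-trained Caffe network on a resized image; the thresholding above is the part each application customizes.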
Read more of this story at Slashdot.