Code Cleanup
nipunru committed Mar 29, 2020
1 parent 08e71ff commit bd881ec
Showing 2 changed files with 11 additions and 11 deletions.
8 changes: 4 additions & 4 deletions README.md
@@ -1,3 +1,5 @@
+[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)[![](https://jitpack.io/v/nipunru/nsfw-detector.svg)](https://jitpack.io/#nipunru/nsfw-detector)

# NSFW(Nude Content) Detector

NSFW Content detector using
@@ -7,9 +9,7 @@ NSFW Content detector using

 This module contains a pre-trained
 [TensorFlow Lite (tflite)](https://www.tensorflow.org/lite) model that is
-compatible with AutoML.
-
-[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)[![](https://jitpack.io/v/adawoud/BottomSheetTimeRangePicker.svg)](https://jitpack.io/#adawoud/BottomSheetTimeRangePicker)
+compatible with AutoML. It is an `OnDeviceAutoMLImageLabeler`.

## Installation

@@ -51,7 +51,7 @@ dependencies {
implementation "com.google.firebase:firebase-ml-vision-automl:<latest_version>"
implementation "com.google.firebase:firebase-ml-model-interpreter:<latest_version>"
-    implementation 'com.github.nipunru:nsfw-detector:0.0.3'
+    implementation 'com.github.nipunru:nsfw-detector:0.0.4'
}
apply plugin: 'com.google.gms.google-services'
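The dependency coordinate `com.github.nipunru:nsfw-detector` implies distribution via JitPack. Under that assumption (this snippet is not part of the diff above), the JitPack repository would also need to be declared in the project-level build script:

```groovy
// Project-level build.gradle — assumed JitPack setup for the
// com.github.nipunru:nsfw-detector coordinate used above.
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://jitpack.io' }
    }
}
```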
@@ -45,29 +45,29 @@ object NSFWDetector {
                 when (label.text) {
                     LABEL_SFW -> {
                         if (label.confidence > threshold) {
-                            callback(true,label.confidence, bitmap)
+                            callback(true, label.confidence, bitmap)
                         } else {
-                            callback(false,label.confidence, bitmap)
+                            callback(false, label.confidence, bitmap)
                         }
                     }
                     LABEL_NSFW -> {
                         if (label.confidence < (1 - threshold)) {
-                            callback(true,label.confidence, bitmap)
+                            callback(true, label.confidence, bitmap)
                         } else {
-                            callback(false,label.confidence, bitmap)
+                            callback(false, label.confidence, bitmap)
                         }
                     }
                     else -> {
-                        callback(false,0.0F , bitmap)
+                        callback(false, 0.0F, bitmap)
                     }
                 }
             } catch (e: Exception) {
                 Log.e(TAG, e.localizedMessage ?: "NSFW Scan Error")
-                callback(false,0.0F , bitmap)
+                callback(false, 0.0F, bitmap)
             }
         }.addOnFailureListener { e ->
             Log.e(TAG, e.localizedMessage ?: "NSFW Scan Error")
-            callback(false,0.0F , bitmap)
+            callback(false, 0.0F, bitmap)
         }
     }
 }
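The whitespace fixes in this hunk leave the classification rule itself unchanged. Assuming `label.text` carries one of the two label constants and the Boolean passed to the callback mirrors the branches shown, the decision rule can be sketched stand-alone — the `Label` class and `decide` function below are hypothetical stand-ins introduced only to make the logic runnable in isolation, not part of the library's API:

```kotlin
// Hypothetical stand-in for Firebase's vision label type, used only
// to exercise the decision rule outside of an Android project.
data class Label(val text: String, val confidence: Float)

const val LABEL_SFW = "SFW"   // assumed constant values
const val LABEL_NSFW = "NSFW"

// Returns the Boolean/confidence pair the snippet passes to its callback.
fun decide(label: Label, threshold: Float): Pair<Boolean, Float> =
    when (label.text) {
        // An SFW label counts only when its confidence exceeds the threshold.
        LABEL_SFW -> (label.confidence > threshold) to label.confidence
        // An NSFW label below (1 - threshold) confidence is treated the same way.
        LABEL_NSFW -> (label.confidence < (1 - threshold)) to label.confidence
        // Any other label falls through with zero confidence.
        else -> false to 0.0F
    }
```

For example, with a threshold of `0.7F`, an SFW label at confidence `0.9F` yields `true`, while an unrecognised label always yields `false` with confidence `0.0F`.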
