Machine Learning Benchmark Tool (ML Bench), an AI benchmarking tool
Supported models:
- MobileNet v1
- MobileNet v2
- Inception v3
- ResNet v2 50
- SSD MobileNet v1 (object detection)
Supported runtimes:
- TensorFlow Lite
- TensorFlow Mobile
- Android NN
- SNPE (Qualcomm Snapdragon Neural Processing Engine)
Sideload support:
How to sideload your model:
1. Convert your model to .tflite (using toco) or .dlc (using the SNPE conversion tool); a conversion sketch follows this list.
2. On your local machine, create a [Model Name] directory.
3. Copy your model file into the directory created in step 2.
4. Create a file called meta-data.json in the [Model Name] directory.
Example meta-data.json:
{
  "xres" : 299,
  "yres" : 299,
  "depth" : 3,
  "input_type" : "float",
  "output_type" : "float",
  "input_name" : "input:0",
  "output_name" : "InceptionV3/Predictions/Reshape_1:0",
  "image_mean" : 0,
  "image_std" : 0,
  "accelerator" : "dsp"
}
5. Push the [Model Name] directory to the target device using the command below (a quick verification step is sketched after it):
adb push ./[Model Name] /sdcard/Android/data/com.etinum.mlbench/files/models/
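Conversion sketch (referenced in step 1): the command below is a minimal example of converting a frozen TensorFlow graph to .tflite with tflite_convert, the pip-installed front end for toco in TensorFlow 1.x. The file names are placeholders, and the tensor names and 1x299x299x3 input shape are taken from the Inception v3 meta-data.json example above; flag names can vary between TensorFlow releases, so check tflite_convert --help for your install.

# Convert a frozen Inception v3 graph (placeholder file names) to TFLite.
tflite_convert \
  --graph_def_file=inception_v3_frozen.pb \
  --output_file=inception_v3.tflite \
  --input_arrays=input \
  --output_arrays=InceptionV3/Predictions/Reshape_1 \
  --input_shapes=1,299,299,3

For the .dlc path, the SNPE SDK ships its own converter (snpe-tensorflow-to-dlc); its flags differ between SDK releases, so follow the conversion guide bundled with your SNPE version. Whichever route you take, the xres/yres/depth, input_name, and output_name fields in meta-data.json should match the converted model.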
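Verification sketch (optional, after step 5): a plain adb listing can confirm that the directory landed where the app expects it; the package path is the same one used in the push command above.

# List the sideloaded model directories on the device.
adb shell ls /sdcard/Android/data/com.etinum.mlbench/files/models/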