Upload Information | |
---|---|
Upload Date | July 19, 2024, 12:56 AM
Views | 1
System Information | |
---|---|
Operating System | Android 13 |
Model | Xiaomi 220333QNY |
Model ID | Xiaomi 220333QNY |
Motherboard | rain |
Governor | schedutil |
CPU Information | |
---|---|
Name | ARM Qualcomm |
Topology | 1 Processor, 8 Cores |
Identifier | ARM implementer 65 architecture 8 variant 1 part 3337 revision 0 |
Base Frequency | 1.90 GHz |
Cluster 1 | 4 Cores @ 1.90 GHz |
Cluster 2 | 4 Cores @ 2.40 GHz |
Memory Information | |
---|---|
Size | 3.56 GB |
Inference Information | |
---|---|
Framework | TensorFlow Lite |
Backend | CPU |
Device | Default |
Workload | Accuracy | Score | Throughput
---|---|---|---|
Image Classification (F32) | 100% | 116 | 21.7 IPS
Image Classification (F16) | 100% | 117 | 21.9 IPS
Image Classification (I8) | 97% | 174 | 32.6 IPS
Image Segmentation (F32) | 100% | 176 | 2.94 IPS
Image Segmentation (F16) | 100% | 178 | 2.97 IPS
Image Segmentation (I8) | 98% | 215 | 3.59 IPS
Pose Estimation (F32) | 100% | 230 | 0.28 IPS
Pose Estimation (F16) | 100% | 233 | 0.28 IPS
Pose Estimation (I8) | 100% | 701 | 0.85 IPS
Object Detection (F32) | 100% | 126 | 9.37 IPS
Object Detection (F16) | 100% | 126 | 9.42 IPS
Object Detection (I8) | 61% | 210 | 15.7 IPS
Face Detection (F32) | 100% | 275 | 3.27 IPS
Face Detection (F16) | 100% | 276 | 3.28 IPS
Face Detection (I8) | 86% | 473 | 5.62 IPS
Depth Estimation (F32) | 100% | 249 | 1.93 IPS
Depth Estimation (F16) | 100% | 252 | 1.95 IPS
Depth Estimation (I8) | 95% | 484 | 3.75 IPS
Style Transfer (F32) | 100% | 413 | 0.54 IPS
Style Transfer (F16) | 100% | 425 | 0.56 IPS
Style Transfer (I8) | 98% | 1002 | 1.32 IPS
Image Super-Resolution (F32) | 100% | 147 | 5.24 IPS
Image Super-Resolution (F16) | 100% | 149 | 5.34 IPS
Image Super-Resolution (I8) | 98% | 326 | 11.6 IPS
Text Classification (F32) | 100% | 111 | 159.1 IPS
Text Classification (F16) | 100% | 107 | 153.9 IPS
Text Classification (I8) | 92% | 87 | 125.7 IPS
Machine Translation (F32) | 100% | 224 | 4.12 IPS
Machine Translation (F16) | 100% | 221 | 4.07 IPS
Machine Translation (I8) | 64% | 124 | 2.28 IPS
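
The table invites one comparison in particular: how much throughput I8 quantization buys over F32 on this device. The sketch below (illustrative only; the dictionaries simply transcribe the IPS figures from the table above, and the `speedup` helper is not part of any benchmark tool) computes the I8/F32 throughput ratio per workload:

```python
# Throughput (IPS) per workload, transcribed from the result table above.
F32 = {
    "Image Classification": 21.7,
    "Image Segmentation": 2.94,
    "Pose Estimation": 0.28,
    "Object Detection": 9.37,
    "Face Detection": 3.27,
    "Depth Estimation": 1.93,
    "Style Transfer": 0.54,
    "Image Super-Resolution": 5.24,
    "Text Classification": 159.1,
    "Machine Translation": 4.12,
}
I8 = {
    "Image Classification": 32.6,
    "Image Segmentation": 3.59,
    "Pose Estimation": 0.85,
    "Object Detection": 15.7,
    "Face Detection": 5.62,
    "Depth Estimation": 3.75,
    "Style Transfer": 1.32,
    "Image Super-Resolution": 11.6,
    "Text Classification": 125.7,
    "Machine Translation": 2.28,
}

def speedup(workload: str) -> float:
    """I8 throughput divided by F32 throughput for one workload (hypothetical helper)."""
    return I8[workload] / F32[workload]

for name in F32:
    print(f"{name}: {speedup(name):.2f}x")
```

Note that the ratio is not uniformly above 1: Text Classification and Machine Translation run *slower* under I8 here, and the latter also drops to 64% accuracy, so quantization is a per-workload trade-off rather than a free win on this CPU backend.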