Upload Date | November 20, 2024 06:14 PM
Views | 1
System Information | |
---|---|
Operating System | Android 15 |
Model | Xiaomi 2410DPN6CC |
Model ID | Xiaomi 2410DPN6CC |
Motherboard | sun |
Governor | walt |
CPU Information | |
---|---|
Name | Qualcomm ARMv8 |
Topology | 1 Processor, 8 Cores |
Identifier | ARM implementer 81 architecture 8 variant 3 part 1 revision 4 |
Base Frequency | 3.53 GHz |
Cluster 1 | 6 Cores @ 3.53 GHz |
Cluster 2 | 2 Cores @ 4.32 GHz |
Memory Information | |
---|---|
Size | 14.76 GB |
Inference Information | |
---|---|
Framework | TensorFlow Lite |
Backend | CPU |
Device | Default |
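The run used the TensorFlow Lite framework on the CPU backend. As a point of reference, the sketch below shows one way a CPU-only TFLite inference loop can be set up and timed in Python; the model file name, input data, and thread count are illustrative assumptions, not the benchmark's actual harness.

```python
# Minimal sketch of CPU inference with TensorFlow Lite (assumptions:
# model path and thread count are placeholders, input is random data).
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(
    model_path="image_classification_f32.tflite",  # hypothetical model file
    num_threads=8,                                 # assumed: one thread per core
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
frame = np.random.rand(*inp["shape"]).astype(np.float32)  # dummy input

# Time repeated runs and report inferences per second (IPS),
# the throughput unit used in the results table below.
runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp["index"], frame)
    interpreter.invoke()
    _ = interpreter.get_tensor(out["index"])
elapsed = time.perf_counter() - start
print(f"{runs / elapsed:.1f} IPS")
```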
Workload | Accuracy | Score | Throughput
---|---|---|---
Image Classification (F32) | 100% | 787 | 147.2 IPS
Image Classification (F16) | 100% | 760 | 142.3 IPS
Image Classification (I8) | 97% | 1016 | 190.1 IPS
Image Segmentation (F32) | 100% | 1008 | 16.8 IPS
Image Segmentation (F16) | 100% | 999 | 16.7 IPS
Image Segmentation (I8) | 98% | 871 | 14.6 IPS
Pose Estimation (F32) | 100% | 1630 | 1.97 IPS
Pose Estimation (F16) | 100% | 1572 | 1.90 IPS
Pose Estimation (I8) | 100% | 5889 | 7.13 IPS
Object Detection (F32) | 100% | 768 | 57.3 IPS
Object Detection (F16) | 100% | 750 | 56.0 IPS
Object Detection (I8) | 61% | 1185 | 88.5 IPS
Face Detection (F32) | 100% | 1834 | 21.8 IPS
Face Detection (F16) | 100% | 1926 | 22.9 IPS
Face Detection (I8) | 86% | 2072 | 24.6 IPS
Depth Estimation (F32) | 100% | 1640 | 12.7 IPS
Depth Estimation (F16) | 100% | 1652 | 12.8 IPS
Depth Estimation (I8) | 95% | 2552 | 19.8 IPS
Style Transfer (F32) | 100% | 2613 | 3.44 IPS
Style Transfer (F16) | 100% | 2658 | 3.50 IPS
Style Transfer (I8) | 98% | 6099 | 8.02 IPS
Image Super-Resolution (F32) | 100% | 773 | 27.6 IPS
Image Super-Resolution (F16) | 100% | 719 | 25.7 IPS
Image Super-Resolution (I8) | 98% | 1764 | 63.0 IPS
Text Classification (F32) | 100% | 595 | 855.5 IPS
Text Classification (F16) | 100% | 599 | 861.3 IPS
Text Classification (I8) | 92% | 400 | 574.5 IPS
Machine Translation (F32) | 100% | 1431 | 26.3 IPS
Machine Translation (F16) | 100% | 1465 | 27.0 IPS
Machine Translation (I8) | 64% | 443 | 8.15 IPS
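One way to read the table is as an accuracy-for-throughput trade-off between the quantized (I8) and single-precision (F32) variants of each workload. The short sketch below computes the speedup ratio and retained accuracy for a few representative rows; the values are copied from the table above, and the choice of workloads is arbitrary.

```python
# Rough I8 vs. F32 comparison using values copied from the results table.
rows = {
    # workload: (f32_ips, i8_ips, i8_accuracy)
    "Image Classification": (147.2, 190.1, 0.97),
    "Pose Estimation":      (1.97, 7.13, 1.00),
    "Object Detection":     (57.3, 88.5, 0.61),
    "Machine Translation":  (26.3, 8.15, 0.64),
}
for name, (f32_ips, i8_ips, acc) in rows.items():
    print(f"{name}: {i8_ips / f32_ips:.2f}x throughput at {acc:.0%} accuracy")
```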