Upload Date | November 17, 2024 01:08 AM
Views | 3
System Information | |
---|---|
Operating System | Android 15 |
Model | Xiaomi 24129PN74C |
Model ID | Xiaomi 24129PN74C |
Motherboard | sun |
Governor | walt |
CPU Information | |
---|---|
Name | Qualcomm ARMv8 |
Topology | 1 Processor, 8 Cores |
Identifier | ARM implementer 81 architecture 8 variant 3 part 1 revision 4 |
Base Frequency | 3.53 GHz |
Cluster 1 | 6 Cores @ 3.53 GHz |
Cluster 2 | 2 Cores @ 4.32 GHz |
Memory Information | |
---|---|
Size | 14.76 GB |
Inference Information | |
---|---|
Framework | TensorFlow Lite |
Backend | GPU |
Device | Default |
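
The inference configuration above (TensorFlow Lite framework, GPU backend, default device) corresponds to routing model execution through TFLite's GPU delegate. As a point of reference only, below is a minimal Kotlin sketch of that setup for an Android app; the model file, tensor shapes, and class count are illustrative assumptions, not details taken from this result.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.File

// One GPU-delegated inference of a hypothetical classification model.
// Assumes an Android app with the tensorflow-lite and tensorflow-lite-gpu artifacts on the classpath.
fun runOnGpu(modelFile: File): FloatArray {
    val gpuDelegate = GpuDelegate()                              // matches "Backend: GPU" above
    val options = Interpreter.Options().addDelegate(gpuDelegate)
    val interpreter = Interpreter(modelFile, options)
    try {
        // Placeholder shapes (assumptions): 1x224x224x3 float input, 1000-class output.
        val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
        val output = Array(1) { FloatArray(1000) }
        interpreter.run(input, output)
        return output[0]
    } finally {
        interpreter.close()   // close the interpreter before the delegate it uses
        gpuDelegate.close()
    }
}
```
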
Workload | Accuracy | Score | Throughput
---|---|---|---
Image Classification (F32) | 100% | 679 | 127.1 IPS
Image Classification (F16) | 100% | 988 | 184.8 IPS
Image Classification (I8) | 100% | 1036 | 193.8 IPS
Image Segmentation (F32) | 100% | 976 | 16.3 IPS
Image Segmentation (F16) | 100% | 1416 | 23.7 IPS
Image Segmentation (I8) | 98% | 1344 | 22.5 IPS
Pose Estimation (F32) | 100% | 1434 | 1.74 IPS
Pose Estimation (F16) | 100% | 2408 | 2.92 IPS
Pose Estimation (I8) | 100% | 2433 | 2.95 IPS
Object Detection (F32) | 100% | 454 | 33.9 IPS
Object Detection (F16) | 100% | 495 | 37.0 IPS
Object Detection (I8) | 59% | 461 | 34.4 IPS
Face Detection (F32) | 100% | 2831 | 33.7 IPS
Face Detection (F16) | 97% | 4774 | 56.8 IPS
Face Detection (I8) | 86% | 5314 | 63.2 IPS
Depth Estimation (F32) | 100% | 1444 | 11.2 IPS
Depth Estimation (F16) | 100% | 4128 | 32.0 IPS
Depth Estimation (I8) | 94% | 4098 | 31.8 IPS
Style Transfer (F32) | 100% | 4140 | 5.45 IPS
Style Transfer (F16) | 100% | 10454 | 13.8 IPS
Style Transfer (I8) | 98% | 7545 | 9.92 IPS
Image Super-Resolution (F32) | 100% | 478 | 17.1 IPS
Image Super-Resolution (F16) | 100% | 727 | 26.0 IPS
Image Super-Resolution (I8) | 98% | 707 | 25.3 IPS
Text Classification (F32) | 100% | 187 | 268.1 IPS
Text Classification (F16) | 100% | 185 | 265.2 IPS
Text Classification (I8) | 92% | 312 | 447.8 IPS
Machine Translation (F32) | 100% | 354 | 6.51 IPS
Machine Translation (F16) | 100% | 550 | 10.1 IPS
Machine Translation (I8) | 64% | 551 | 10.1 IPS