Upload Date | November 08 2024 04:16 PM
Views | 1 |
System Information | |
---|---|
Operating System | Android 15 |
Model | asus ASUSAI2501E |
Model ID | asus ASUSAI2501E |
Motherboard | sun |
Governor | walt |
CPU Information | |
---|---|
Name | Qualcomm ARMv8 |
Topology | 1 Processor, 8 Cores |
Identifier | ARM implementer 81 architecture 8 variant 3 part 1 revision 4 |
Base Frequency | 3.53 GHz |
Cluster 1 | 6 Cores @ 3.53 GHz |
Cluster 2 | 2 Cores @ 4.32 GHz |
Memory Information | |
---|---|
Size | 22.51 GB |
Inference Information | |
---|---|
Framework | TensorFlow Lite |
Backend | GPU |
Device | Default |
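
For context, the configuration above (TensorFlow Lite framework with the GPU backend) corresponds to running models through a TFLite `Interpreter` with a GPU delegate attached. The Kotlin sketch below is a minimal, generic illustration of that setup, not the benchmark's own code; the model path is a placeholder.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Map the .tflite flatbuffer into memory; the Interpreter accepts a
// MappedByteBuffer directly.
fun loadModel(path: String): MappedByteBuffer =
    FileInputStream(path).channel.use { channel ->
        channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size())
    }

fun main() {
    // Hypothetical model path, for illustration only.
    val model = loadModel("/data/local/tmp/image_classification.tflite")

    // Attach the GPU delegate so supported ops run on the GPU backend;
    // unsupported ops fall back to the CPU automatically.
    val options = Interpreter.Options().addDelegate(GpuDelegate())
    val interpreter = Interpreter(model, options)

    // interpreter.run(input, output) would execute one inference here.
    interpreter.close()
}
```
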
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 697 | 130.4 IPS
Image Classification (F16) | 100% | 737 | 137.9 IPS
Image Classification (I8) | 100% | 634 | 118.7 IPS
Image Segmentation (F32) | 100% | 932 | 15.6 IPS
Image Segmentation (F16) | 100% | 1291 | 21.6 IPS
Image Segmentation (I8) | 98% | 1289 | 21.5 IPS
Pose Estimation (F32) | 100% | 1526 | 1.85 IPS
Pose Estimation (F16) | 100% | 2372 | 2.87 IPS
Pose Estimation (I8) | 100% | 2397 | 2.90 IPS
Object Detection (F32) | 100% | 470 | 35.1 IPS
Object Detection (F16) | 100% | 490 | 36.6 IPS
Object Detection (I8) | 59% | 472 | 35.3 IPS
Face Detection (F32) | 100% | 2869 | 34.1 IPS
Face Detection (F16) | 97% | 4241 | 50.4 IPS
Face Detection (I8) | 86% | 4029 | 47.9 IPS
Depth Estimation (F32) | 100% | 1561 | 12.1 IPS
Depth Estimation (F16) | 100% | 4280 | 33.2 IPS
Depth Estimation (I8) | 94% | 4345 | 33.7 IPS
Style Transfer (F32) | 100% | 5676 | 7.47 IPS
Style Transfer (F16) | 100% | 11939 | 15.7 IPS
Style Transfer (I8) | 98% | 11938 | 15.7 IPS
Image Super-Resolution (F32) | 100% | 657 | 23.5 IPS
Image Super-Resolution (F16) | 100% | 1245 | 44.5 IPS
Image Super-Resolution (I8) | 98% | 1214 | 43.4 IPS
Text Classification (F32) | 100% | 399 | 573.5 IPS
Text Classification (F16) | 100% | 404 | 580.2 IPS
Text Classification (I8) | 92% | 626 | 899.8 IPS
Machine Translation (F32) | 100% | 733 | 13.5 IPS
Machine Translation (F16) | 100% | 734 | 13.5 IPS
Machine Translation (I8) | 64% | 746 | 13.7 IPS
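
The throughput column is reported in IPS (inferences per second). As a rough illustration of how such a figure is typically obtained (this is not the benchmark's actual scoring method), the sketch below times a fixed number of interpreter runs and divides by the elapsed wall-clock time; the input and output buffers are placeholders supplied by the caller.

```kotlin
import org.tensorflow.lite.Interpreter

// Warm up once so one-time delegate initialization is excluded, then time a
// fixed number of runs and divide the run count by elapsed wall-clock time.
fun measureIps(interpreter: Interpreter, input: Any, output: Any, runs: Int = 100): Double {
    interpreter.run(input, output) // warm-up, not timed

    val start = System.nanoTime()
    repeat(runs) { interpreter.run(input, output) }
    val elapsedSeconds = (System.nanoTime() - start) / 1e9

    return runs / elapsedSeconds
}
```
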