Result Information |
---|---
Upload Date | November 08 2024 04:22 PM
Views | 1

System Information |
---|---
Operating System | Android 15
Model | asus ASUSAI2501E
Model ID | asus ASUSAI2501E
Motherboard | sun
Governor | walt

CPU Information |
---|---
Name | Qualcomm ARMv8
Topology | 1 Processor, 8 Cores
Identifier | ARM implementer 81 architecture 8 variant 3 part 1 revision 4
Base Frequency | 3.53 GHz
Cluster 1 | 6 Cores @ 3.53 GHz
Cluster 2 | 2 Cores @ 4.32 GHz

Memory Information |
---|---
Size | 22.51 GB

Inference Information |
---|---
Framework | TensorFlow Lite
Backend | GPU
Device | Default
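
The Inference Information above names only the stack: TensorFlow Lite running on the GPU backend. As a rough illustration of what that configuration looks like in code, the Kotlin sketch below attaches the TFLite GPU delegate to an interpreter and derives an IPS (inferences per second) figure the way the results table reports throughput. The model path, tensor shapes, and run count are assumptions for illustration only; this is not the benchmark's own harness.

```kotlin
import java.io.File
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate

// Minimal sketch of the configuration named above (TensorFlow Lite, GPU backend).
// The model path, tensor shapes, and run count are illustrative assumptions,
// not details taken from this result page.
fun runGpuInferenceSketch() {
    val modelFile = File("/data/local/tmp/classifier_f32.tflite")  // hypothetical on-device model

    // "Backend: GPU" corresponds to attaching the TFLite GPU delegate.
    val gpuDelegate = GpuDelegate()
    val options = Interpreter.Options().addDelegate(gpuDelegate)
    val interpreter = Interpreter(modelFile, options)

    // Assumed 1x224x224x3 float input and 1000-class output for an F32 classifier.
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }

    // Time repeated runs; IPS (inferences per second) = runs / elapsed seconds.
    val runs = 100
    val startNs = System.nanoTime()
    repeat(runs) { interpreter.run(input, output) }
    val elapsedSec = (System.nanoTime() - startNs) / 1e9
    println("Throughput: %.1f IPS".format(runs / elapsedSec))

    interpreter.close()
    gpuDelegate.close()
}
```

The F16 and I8 rows below presumably swap in half-precision and quantized variants of each model; the delegate setup shown here would not change.
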
Workload | Accuracy | Score | Throughput
---|---|---|---
Image Classification (F32) | 100% | 700 | 130.9 IPS
Image Classification (F16) | 100% | 824 | 154.2 IPS
Image Classification (I8) | 100% | 962 | 180.1 IPS
Image Segmentation (F32) | 100% | 919 | 15.4 IPS
Image Segmentation (F16) | 100% | 1167 | 19.5 IPS
Image Segmentation (I8) | 98% | 1287 | 21.5 IPS
Pose Estimation (F32) | 100% | 1534 | 1.86 IPS
Pose Estimation (F16) | 100% | 2381 | 2.88 IPS
Pose Estimation (I8) | 100% | 2389 | 2.89 IPS
Object Detection (F32) | 100% | 491 | 36.7 IPS
Object Detection (F16) | 100% | 475 | 35.5 IPS
Object Detection (I8) | 59% | 502 | 37.5 IPS
Face Detection (F32) | 100% | 2867 | 34.1 IPS
Face Detection (F16) | 97% | 3732 | 44.4 IPS
Face Detection (I8) | 86% | 4020 | 47.8 IPS
Depth Estimation (F32) | 100% | 1559 | 12.1 IPS
Depth Estimation (F16) | 100% | 4309 | 33.4 IPS
Depth Estimation (I8) | 94% | 4291 | 33.3 IPS
Style Transfer (F32) | 100% | 5702 | 7.50 IPS
Style Transfer (F16) | 100% | 11939 | 15.7 IPS
Style Transfer (I8) | 98% | 11931 | 15.7 IPS
Image Super-Resolution (F32) | 100% | 674 | 24.1 IPS
Image Super-Resolution (F16) | 100% | 1160 | 41.4 IPS
Image Super-Resolution (I8) | 98% | 1080 | 38.6 IPS
Text Classification (F32) | 100% | 402 | 577.4 IPS
Text Classification (F16) | 100% | 404 | 580.3 IPS
Text Classification (I8) | 92% | 626 | 900.3 IPS
Machine Translation (F32) | 100% | 723 | 13.3 IPS
Machine Translation (F16) | 100% | 725 | 13.3 IPS
Machine Translation (I8) | 64% | 741 | 13.6 IPS