Upload Date | April 17, 2024, 07:36 AM |
Views | 572 |
System Information | |
---|---|
Operating System | Microsoft Windows 11 Pro Insider Preview (64-bit) |
Model | OEMMN OEMMN |
Motherboard | OEMMN OEMMN |
Power Plan | Balanced |
CPU Information | |
---|---|
Name | Snapdragon X Plus - X1P64100 - Qualcomm Oryon CPU |
Topology | 1 Processor, 10 Cores |
Identifier | ARMv8 (64-bit) Family 8 Model 1 Revision 201 |
Base Frequency | 3.42 GHz |
Cluster 1 | 6 Cores |
Cluster 2 | 4 Cores |
Memory Information | |
---|---|
Size | 16.00 GB |
Inference Information | |
---|---|
Framework | ONNX |
Backend | CPU |
Device | Default |
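The run above uses the ONNX framework with the CPU backend on the default device. As a minimal sketch (not the benchmark's actual harness), the snippet below shows how such a configuration maps onto ONNX Runtime's CPU execution provider; the model path "model.onnx" and the dummy input are placeholders.

```python
# Minimal sketch: load an ONNX model and run it on the CPU execution
# provider, mirroring the Framework = ONNX / Backend = CPU settings above.
# "model.onnx" is a placeholder, not an actual benchmark asset.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Build a dummy input matching the model's first declared input,
# resolving any dynamic dimensions to 1.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.zeros(shape, dtype=np.float32)

outputs = session.run(None, {inp.name: dummy})
print([o.shape for o in outputs])
```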
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 636 | 119.1 IPS |
Image Classification (F16) | 100% | 673 | 126.0 IPS |
Image Classification (I8) | 99% | 3274 | 612.7 IPS |
Image Segmentation (F32) | 100% | 733 | 12.2 IPS |
Image Segmentation (F16) | 100% | 748 | 12.5 IPS |
Image Segmentation (I8) | 98% | 2098 | 35.0 IPS |
Pose Estimation (F32) | 100% | 4657 | 5.64 IPS |
Pose Estimation (F16) | 100% | 4710 | 5.70 IPS |
Pose Estimation (I8) | 100% | 20832 | 25.2 IPS |
Object Detection (F32) | 100% | 879 | 65.6 IPS |
Object Detection (F16) | 100% | 880 | 65.7 IPS |
Object Detection (I8) | 65% | 2893 | 216.0 IPS |
Face Detection (F32) | 100% | 1688 | 20.1 IPS |
Face Detection (F16) | 100% | 1661 | 19.7 IPS |
Face Detection (I8) | 89% | 4896 | 58.2 IPS |
Depth Estimation (F32) | 100% | 2332 | 18.1 IPS |
Depth Estimation (F16) | 100% | 2286 | 17.7 IPS |
Depth Estimation (I8) | 96% | 11833 | 91.8 IPS |
Style Transfer (F32) | 100% | 11312 | 14.9 IPS |
Style Transfer (F16) | 100% | 11362 | 14.9 IPS |
Style Transfer (I8) | 98% | 22090 | 29.1 IPS |
Image Super-Resolution (F32) | 100% | 1778 | 63.5 IPS |
Image Super-Resolution (F16) | 100% | 1760 | 62.9 IPS |
Image Super-Resolution (I8) | 99% | 3253 | 116.2 IPS |
Text Classification (F32) | 100% | 1249 | 1.80 KIPS |
Text Classification (F16) | 100% | 1458 | 2.10 KIPS |
Text Classification (I8) | 98% | 1169 | 1.68 KIPS |
Machine Translation (F32) | 100% | 1206 | 22.2 IPS |
Machine Translation (F16) | 100% | 1207 | 22.2 IPS |
Machine Translation (I8) | 67% | 2287 | 42.1 IPS |