Result Information | |
---|---|
Upload Date | September 02 2024 08:58 AM |
Views | 3 |
System Information | |
---|---|
Operating System | Microsoft Windows 11 Pro (64-bit) |
Model | Microsoft Corporation Surface Pro X |
Motherboard | Microsoft Corporation Surface Pro X |
Power Plan | Balanced |
CPU Information | |
---|---|
Name | Microsoft SQ1 @ 3.0 GHz |
Topology | 1 Processor, 8 Cores |
Identifier | ARMv8 (64-bit) Family 8 Model 805 Revision D0E |
Base Frequency | 3.00 GHz |
Cluster 1 | 4 Cores |
Cluster 2 | 4 Cores |
Memory Information | |
---|---|
Size | 16.00 GB |
Inference Information | |
---|---|
Framework | ONNX |
Backend | DirectML |
Device | Qualcomm(R) Adreno(TM) 680 GPU |
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 414 | 77.6 IPS |
Image Classification (F16) | 100% | 363 | 67.9 IPS |
Image Classification (I8) | 100% | 216 | 40.3 IPS |
Image Segmentation (F32) | 100% | 447 | 7.46 IPS |
Image Segmentation (F16) | 100% | 420 | 7.01 IPS |
Image Segmentation (I8) | 98% | 260 | 4.34 IPS |
Pose Estimation (F32) | 100% | 2421 | 2.93 IPS |
Pose Estimation (F16) | 100% | 2467 | 2.99 IPS |
Pose Estimation (I8) | 100% | 1760 | 2.13 IPS |
Object Detection (F32) | 100% | 321 | 24.0 IPS |
Object Detection (F16) | 100% | 311 | 23.2 IPS |
Object Detection (I8) | 71% | 209 | 15.6 IPS |
Face Detection (F32) | 100% | 809 | 9.62 IPS |
Face Detection (F16) | 100% | 763 | 9.07 IPS |
Face Detection (I8) | 93% | 387 | 4.60 IPS |
Depth Estimation (F32) | 100% | 1702 | 13.2 IPS |
Depth Estimation (F16) | 100% | 1668 | 12.9 IPS |
Depth Estimation (I8) | 94% | 925 | 7.18 IPS |
Style Transfer (F32) | 100% | 3507 | 4.61 IPS |
Style Transfer (F16) | 100% | 3367 | 4.43 IPS |
Style Transfer (I8) | 98% | 2469 | 3.25 IPS |
Image Super-Resolution (F32) | 100% | 1276 | 45.6 IPS |
Image Super-Resolution (F16) | 100% | 1301 | 46.4 IPS |
Image Super-Resolution (I8) | 99% | 573 | 20.5 IPS |
Text Classification (F32) | 100% | 147 | 211.0 IPS |
Text Classification (F16) | 100% | 140 | 201.6 IPS |
Text Classification (I8) | 98% | 78 | 112.3 IPS |
Machine Translation (F32) | 100% | 357 | 6.58 IPS |
Machine Translation (F16) | 100% | 363 | 6.67 IPS |
Machine Translation (I8) | 65% | 158 | 2.90 IPS |
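One pattern worth pulling out of these numbers: on this run, the quantized (I8) entries lose accuracy on several workloads (Object Detection drops to 71%, Machine Translation to 65%) *and* run at lower throughput than F32, so quantization buys nothing here. A minimal Python sketch to compute that comparison from the table (the `results` dict and `quantization_tradeoff` helper are illustrative, with the values copied from three rows above):

```python
# Accuracy (%) and throughput (IPS) per precision, copied from the table above.
results = {
    "Image Classification": {"F32": (100, 77.6), "I8": (100, 40.3)},
    "Object Detection": {"F32": (100, 24.0), "I8": (71, 15.6)},
    "Machine Translation": {"F32": (100, 6.58), "I8": (65, 2.90)},
}

def quantization_tradeoff(entry):
    """Return (throughput ratio, accuracy drop in points) of I8 vs F32."""
    f32_acc, f32_ips = entry["F32"]
    i8_acc, i8_ips = entry["I8"]
    return i8_ips / f32_ips, f32_acc - i8_acc

for name, entry in results.items():
    ratio, drop = quantization_tradeoff(entry)
    # A ratio below 1.0 means I8 is slower than F32 on this device.
    print(f"{name}: {ratio:.2f}x throughput, -{drop} accuracy points")
```

Every ratio here comes out below 1.0, consistent with the DirectML backend on this Adreno 680 GPU not accelerating int8 inference.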