Upload Date | July 10 2024 07:56 PM |
Views | 1 |
System Information | |
---|---|
Operating System | iOS 17.5.1 |
Model | iPad Pro (10.5-inch) |
Model ID | iPad7,4 |
Motherboard | J208AP |
CPU Information | |
---|---|
Name | Apple A10X Fusion |
Topology | 1 Processor, 3 Cores |
Identifier | ARM |
Base Frequency | 2.34 GHz |
Cluster 1 | 3 Cores |
L1 Instruction Cache | 64.0 KB x 1 |
L1 Data Cache | 64.0 KB x 1 |
L2 Cache | 8.00 MB x 1 |
Memory Information | |
---|---|
Size | 3.91 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | GPU |
Device | Default |
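
The workloads below were run through Core ML with the GPU backend selected. As a minimal sketch of that configuration in Swift (the compiled model path `MobileNet.mlmodelc` is a hypothetical placeholder, not part of this result):

```swift
import CoreML

// Minimal sketch: load a Core ML model pinned to the GPU backend,
// mirroring the "Framework: Core ML, Backend: GPU" settings above.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU   // route inference to the GPU rather than the Neural Engine

do {
    // Hypothetical compiled model bundle; substitute any .mlmodelc path.
    let url = URL(fileURLWithPath: "MobileNet.mlmodelc")
    let model = try MLModel(contentsOf: url, configuration: config)
    print("Loaded model:", model.modelDescription)
} catch {
    print("Failed to load model:", error)
}
```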
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 246 | 46.1 IPS |
Image Classification (F16) | 100% | 214 | 40.1 IPS |
Image Classification (I8) | 99% | 209 | 39.1 IPS |
Image Segmentation (F32) | 100% | 144 | 2.40 IPS |
Image Segmentation (F16) | 100% | 143 | 2.39 IPS |
Image Segmentation (I8) | 100% | 143 | 2.39 IPS |
Pose Estimation (F32) | 100% | 1402 | 1.70 IPS |
Pose Estimation (F16) | 100% | 1378 | 1.67 IPS |
Pose Estimation (I8) | 100% | 1341 | 1.62 IPS |
Object Detection (F32) | 100% | 244 | 18.2 IPS |
Object Detection (F16) | 100% | 213 | 15.9 IPS |
Object Detection (I8) | 97% | 240 | 17.9 IPS |
Face Detection (F32) | 100% | 474 | 5.63 IPS |
Face Detection (F16) | 100% | 466 | 5.55 IPS |
Face Detection (I8) | 99% | 466 | 5.54 IPS |
Depth Estimation (F32) | 100% | 602 | 4.67 IPS |
Depth Estimation (F16) | 100% | 543 | 4.21 IPS |
Depth Estimation (I8) | 99% | 547 | 4.24 IPS |
Style Transfer (F32) | 100% | 1909 | 2.51 IPS |
Style Transfer (F16) | 100% | 1892 | 2.49 IPS |
Style Transfer (I8) | 100% | 1847 | 2.43 IPS |
Image Super-Resolution (F32) | 100% | 600 | 21.4 IPS |
Image Super-Resolution (F16) | 100% | 590 | 21.1 IPS |
Image Super-Resolution (I8) | 100% | 593 | 21.2 IPS |
Text Classification (F32) | 100% | 142 | 203.8 IPS |
Text Classification (F16) | 100% | 119 | 170.4 IPS |
Text Classification (I8) | 96% | 118 | 169.7 IPS |
Machine Translation (F32) | 100% | 231 | 4.26 IPS |
Machine Translation (F16) | 100% | 198 | 3.64 IPS |
Machine Translation (I8) | 99% | 235 | 4.32 IPS |