Upload Date | July 26, 2024, 12:30 PM
Views | 2
System Information |
---|---
Operating System | iOS 18.0
Model | iPad Pro (11-inch)
Model ID | iPad8,1
Motherboard | J317AP
CPU Information |
---|---
Name | Apple A12X Bionic
Topology | 1 Processor, 8 Cores
Identifier | ARM
Base Frequency | 2.49 GHz
Cluster 1 | 4 Cores
Cluster 2 | 4 Cores
L1 Instruction Cache | 48.0 KB x 1
L1 Data Cache | 32.0 KB x 1
L2 Cache | 2.00 MB x 1
Memory Information |
---|---
Size | 3.70 GB
Inference Information |
---|---
Framework | Core ML
Backend | Neural Engine
Device | Default
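For context on the "Neural Engine" backend above: in Core ML, an app opts into Neural Engine execution through the model's compute-unit configuration. The following is a minimal Swift sketch of that setting, not code from the benchmark itself; the model name `ImageClassifier` is a hypothetical placeholder for a compiled model bundled with the app.

```swift
import Foundation
import CoreML

// Minimal sketch: ask Core ML to use any available compute unit,
// which lets supported layers run on the Neural Engine.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all  // CPU, GPU, and Neural Engine; .cpuOnly would force a CPU backend

do {
    // "ImageClassifier.mlmodelc" is a hypothetical compiled model in the app bundle.
    guard let modelURL = Bundle.main.url(forResource: "ImageClassifier",
                                         withExtension: "mlmodelc") else {
        fatalError("Model not found in bundle")
    }
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    print("Loaded \(model.modelDescription) with compute units \(configuration.computeUnits.rawValue)")
} catch {
    print("Failed to load model: \(error)")
}
```

Whether a given layer actually executes on the Neural Engine is decided by Core ML at load time; the configuration only states what the app permits.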
Workload | Accuracy | Score | Throughput
---|---|---|---
Image Classification (F32) | 100% | 762 | 142.6 IPS
Image Classification (F16) | 100% | 2184 | 408.6 IPS
Image Classification (I8) | 99% | 2167 | 405.5 IPS
Image Segmentation (F32) | 100% | 239 | 3.99 IPS
Image Segmentation (F16) | 100% | 261 | 4.36 IPS
Image Segmentation (I8) | 100% | 256 | 4.27 IPS
Pose Estimation (F32) | 100% | 2709 | 3.28 IPS
Pose Estimation (F16) | 100% | 24830 | 30.1 IPS
Pose Estimation (I8) | 100% | 24814 | 30.0 IPS
Object Detection (F32) | 100% | 825 | 61.6 IPS
Object Detection (F16) | 100% | 1102 | 82.3 IPS
Object Detection (I8) | 97% | 848 | 63.3 IPS
Face Detection (F32) | 100% | 3204 | 38.1 IPS
Face Detection (F16) | 98% | 3952 | 47.0 IPS
Face Detection (I8) | 97% | 3888 | 46.2 IPS
Depth Estimation (F32) | 100% | 2945 | 22.8 IPS
Depth Estimation (F16) | 100% | 20587 | 159.7 IPS
Depth Estimation (I8) | 99% | 20271 | 157.2 IPS
Style Transfer (F32) | 100% | 7751 | 10.2 IPS
Style Transfer (F16) | 100% | 9273 | 12.2 IPS
Style Transfer (I8) | 100% | 9217 | 12.1 IPS
Image Super-Resolution (F32) | 100% | 1537 | 54.9 IPS
Image Super-Resolution (F16) | 100% | 3713 | 132.6 IPS
Image Super-Resolution (I8) | 100% | 3684 | 131.6 IPS
Text Classification (F32) | 100% | 192 | 276.6 IPS
Text Classification (F16) | 100% | 232 | 333.7 IPS
Text Classification (I8) | 96% | 232 | 333.8 IPS
Machine Translation (F32) | 100% | 253 | 4.65 IPS
Machine Translation (F16) | 100% | 248 | 4.56 IPS
Machine Translation (I8) | 99% | 256 | 4.70 IPS