Upload Date | October 14, 2024, 12:26 PM |
Views | 1 |
System Information | |
---|---|
Operating System | iOS 16.7.10 |
Model | iPhone 8 Plus |
Model ID | iPhone10,5 |
Motherboard | D211AP |
CPU Information | |
---|---|
Name | Apple A11 Bionic |
Topology | 1 Processor, 6 Cores |
Identifier | ARM |
Base Frequency | 2.39 GHz |
Cluster 1 | 2 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 48.0 KB x 1 |
L1 Data Cache | 32.0 KB x 1 |
L2 Cache | 8.00 MB x 1 |
Memory Information | |
---|---|
Size | 2.90 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | Neural Engine |
Device | Default |
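Inference in this result runs through Core ML with the Neural Engine backend. As a point of reference, a minimal Swift sketch (not part of the benchmark itself) of how an app can request that configuration when loading a compiled model; the model path is a hypothetical placeholder, and `.cpuAndNeuralEngine` requires iOS 16 or later, which this device runs:

```swift
import CoreML

// Request that Core ML schedule inference on the CPU and Neural Engine.
// Use .all instead to also allow GPU fallback.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine

// Hypothetical path to a compiled Core ML model bundle (.mlmodelc).
let modelURL = URL(fileURLWithPath: "/path/to/Model.mlmodelc")

do {
    // Load the model with the requested compute units.
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    print("Model loaded; compute units setting: \(config.computeUnits.rawValue)")
} catch {
    print("Failed to load model: \(error)")
}
```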
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 175 | 32.7 IPS |
Image Classification (F16) | 100% | 143 | 26.8 IPS |
Image Classification (I8) | 99% | 144 | 27.0 IPS |
Image Segmentation (F32) | 100% | 185 | 3.09 IPS |
Image Segmentation (F16) | 100% | 198 | 3.31 IPS |
Image Segmentation (I8) | 100% | 198 | 3.31 IPS |
Pose Estimation (F32) | 100% | 2254 | 2.73 IPS |
Pose Estimation (F16) | 100% | 2626 | 3.18 IPS |
Pose Estimation (I8) | 100% | 2569 | 3.11 IPS |
Object Detection (F32) | 100% | 367 | 27.4 IPS |
Object Detection (F16) | 100% | 270 | 20.1 IPS |
Object Detection (I8) | 97% | 305 | 22.8 IPS |
Face Detection (F32) | 100% | 890 | 10.6 IPS |
Face Detection (F16) | 100% | 880 | 10.5 IPS |
Face Detection (I8) | 99% | 1002 | 11.9 IPS |
Depth Estimation (F32) | 100% | 2073 | 16.1 IPS |
Depth Estimation (F16) | 100% | 2682 | 20.8 IPS |
Depth Estimation (I8) | 99% | 2406 | 18.7 IPS |
Style Transfer (F32) | 100% | 4587 | 6.03 IPS |
Style Transfer (F16) | 100% | 4906 | 6.45 IPS |
Style Transfer (I8) | 100% | 4799 | 6.31 IPS |
Image Super-Resolution (F32) | 100% | 892 | 31.9 IPS |
Image Super-Resolution (F16) | 100% | 943 | 33.7 IPS |
Image Super-Resolution (I8) | 100% | 1072 | 38.3 IPS |
Text Classification (F32) | 100% | 50 | 72.5 IPS |
Text Classification (F16) | 100% | 200 | 287.7 IPS |
Text Classification (I8) | 96% | 197 | 282.8 IPS |
Machine Translation (F32) | 100% | 319 | 5.86 IPS |
Machine Translation (F16) | 100% | 250 | 4.60 IPS |
Machine Translation (I8) | 99% | 315 | 5.79 IPS |