Upload Date | September 17, 2024 01:34 AM |
---|---|
Views | 1 |

System Information | |
---|---|
Operating System | iOS 16.7.10 |
Model | iPhone 8 Plus |
Model ID | iPhone10,2 |
Motherboard | D21AP |

CPU Information | |
---|---|
Name | Apple A11 Bionic |
Topology | 1 Processor, 6 Cores |
Identifier | ARM |
Base Frequency | 2.39 GHz |
Cluster 1 | 2 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 48.0 KB x 1 |
L1 Data Cache | 32.0 KB x 1 |
L2 Cache | 8.00 MB x 1 |

Memory Information | |
---|---|
Size | 2.90 GB |

Inference Information | |
---|---|
Framework | Core ML |
Backend | GPU |
Device | Default |
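
The Backend entry above indicates that this run used Core ML's GPU compute path rather than the CPU or Neural Engine. As a minimal illustrative sketch (not taken from this result; the model file name is a placeholder), this is how a Core ML model is typically steered toward the GPU via `MLModelConfiguration.computeUnits`:

```swift
import CoreML

// Hypothetical example: request the CPU+GPU compute path before loading a
// compiled Core ML model. "Model.mlmodelc" is a placeholder, not a file
// from this benchmark.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU   // .all would also allow the Neural Engine

let modelURL = URL(fileURLWithPath: "Model.mlmodelc")
let model = try MLModel(contentsOf: modelURL, configuration: config)
```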

Workload | Accuracy | Score | Throughput (IPS) |
---|---|---|---|
Image Classification (F32) | 100% | 181 | 33.9 |
Image Classification (F16) | 100% | 151 | 28.3 |
Image Classification (I8) | 99% | 155 | 29.1 |
Image Segmentation (F32) | 100% | 190 | 3.18 |
Image Segmentation (F16) | 100% | 201 | 3.36 |
Image Segmentation (I8) | 100% | 203 | 3.40 |
Pose Estimation (F32) | 100% | 2339 | 2.83 |
Pose Estimation (F16) | 100% | 2834 | 3.43 |
Pose Estimation (I8) | 100% | 2327 | 2.82 |
Object Detection (F32) | 100% | 330 | 24.6 |
Object Detection (F16) | 100% | 310 | 23.1 |
Object Detection (I8) | 97% | 330 | 24.7 |
Face Detection (F32) | 100% | 1145 | 13.6 |
Face Detection (F16) | 100% | 1364 | 16.2 |
Face Detection (I8) | 99% | 1448 | 17.2 |
Depth Estimation (F32) | 100% | 2163 | 16.8 |
Depth Estimation (F16) | 100% | 3072 | 23.8 |
Depth Estimation (I8) | 99% | 2822 | 21.9 |
Style Transfer (F32) | 100% | 4576 | 6.02 |
Style Transfer (F16) | 100% | 5513 | 7.25 |
Style Transfer (I8) | 100% | 5736 | 7.54 |
Image Super-Resolution (F32) | 100% | 1032 | 36.8 |
Image Super-Resolution (F16) | 100% | 1202 | 42.9 |
Image Super-Resolution (I8) | 100% | 1178 | 42.1 |
Text Classification (F32) | 100% | 72 | 103.4 |
Text Classification (F16) | 100% | 143 | 204.9 |
Text Classification (I8) | 96% | 87 | 124.4 |
Machine Translation (F32) | 100% | 170 | 3.13 |
Machine Translation (F16) | 100% | 155 | 2.84 |
Machine Translation (I8) | 99% | 167 | 3.08 |