Upload Date | August 01 2024 07:39 PM
Views | 2

System Information |
---|---
Operating System | iOS 17.5.1
Model | iPhone 12
Model ID | iPhone13,2
Motherboard | D53gAP
CPU Information |
---|---
Name | Apple A14 Bionic
Topology | 1 Processor, 6 Cores
Identifier | ARM
Base Frequency | 2.99 GHz
Cluster 1 | 2 Cores
Cluster 2 | 4 Cores
L1 Instruction Cache | 128 KB x 1
L1 Data Cache | 64.0 KB x 1
L2 Cache | 4.00 MB x 1
Memory Information |
---|---
Size | 3.56 GB
Inference Information |
---|---
Framework | Core ML
Backend | Neural Engine
Device | Default
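
This run went through Core ML with the Neural Engine backend and the default device. A minimal sketch of how a Core ML model can be loaded with a compute-unit preference that makes the Neural Engine eligible; the model name `Classifier` is hypothetical and not part of this result:

```swift
import CoreML

// Minimal sketch: load a compiled Core ML model with a compute-unit preference
// that allows the Neural Engine. "Classifier.mlmodelc" is a hypothetical name.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all  // CPU, GPU, and Neural Engine are all eligible
// .cpuAndNeuralEngine (iOS 16+) can be used instead to keep work off the GPU.

guard let modelURL = Bundle.main.url(forResource: "Classifier", withExtension: "mlmodelc") else {
    fatalError("Compiled model not found in the app bundle")
}

do {
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    print("Loaded \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}
```

Core ML still decides the final dispatch per layer, so a "Neural Engine" backend reflects where the bulk of the work is expected to run rather than a hard guarantee.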
Workload | Accuracy | Score | Throughput
---|---|---|---
Image Classification (F32) | 100% | 756 | 141.5 IPS
Image Classification (F16) | 100% | 4855 | 908.4 IPS
Image Classification (I8) | 99% | 5344 | 999.9 IPS
Image Segmentation (F32) | 100% | 525 | 8.77 IPS
Image Segmentation (F16) | 100% | 2852 | 47.6 IPS
Image Segmentation (I8) | 100% | 2758 | 46.1 IPS
Pose Estimation (F32) | 100% | 6130 | 7.42 IPS
Pose Estimation (F16) | 100% | 32320 | 39.1 IPS
Pose Estimation (I8) | 100% | 46667 | 56.5 IPS
Object Detection (F32) | 100% | 733 | 54.7 IPS
Object Detection (F16) | 100% | 3013 | 224.9 IPS
Object Detection (I8) | 97% | 828 | 61.9 IPS
Face Detection (F32) | 100% | 2379 | 28.3 IPS
Face Detection (F16) | 97% | 4421 | 52.6 IPS
Face Detection (I8) | 96% | 4579 | 54.5 IPS
Depth Estimation (F32) | 100% | 5417 | 42.0 IPS
Depth Estimation (F16) | 100% | 31592 | 245.0 IPS
Depth Estimation (I8) | 37% | 35350 | 274.1 IPS
Style Transfer (F32) | 100% | 11612 | 15.3 IPS
Style Transfer (F16) | 100% | 46299 | 60.9 IPS
Style Transfer (I8) | 100% | 62960 | 82.8 IPS
Image Super-Resolution (F32) | 100% | 1966 | 70.2 IPS
Image Super-Resolution (F16) | 100% | 10424 | 372.3 IPS
Image Super-Resolution (I8) | 100% | 12156 | 434.1 IPS
Text Classification (F32) | 100% | 249 | 357.2 IPS
Text Classification (F16) | 100% | 805 | 1.16 KIPS
Text Classification (I8) | 96% | 323 | 464.3 IPS
Machine Translation (F32) | 100% | 328 | 6.04 IPS
Machine Translation (F16) | 100% | 1019 | 18.8 IPS
Machine Translation (I8) | 99% | 334 | 6.15 IPS