Upload Date | February 12 2024 07:18 PM |
Views | 1 |
System Information | |
---|---|
Operating System | iOS 17.3.1 |
Model | iPhone 15 Pro Max |
Model ID | iPhone16,2 |
Motherboard | D84AP |
CPU Information | |
---|---|
Name | Apple A17 Pro |
Topology | 1 Processor, 6 Cores |
Identifier | ARM |
Base Frequency | 3.78 GHz |
Cluster 1 | 2 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 128 KB x 1 |
L1 Data Cache | 64.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |
Memory Information | |
---|---|
Size | 7.47 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | CPU |
Device | Default |
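
The Inference Information above shows this run used the Core ML framework with the CPU backend. As a rough illustration only (not the benchmark's own harness), the sketch below shows how a Core ML model can be constrained to CPU-only compute in Swift; the model file name is a placeholder, not one of the benchmark's workload models.

```swift
import CoreML

// Placeholder path: any compiled Core ML model (.mlmodelc) would do here;
// the benchmark's workload models are not distributed with this result.
let modelURL = URL(fileURLWithPath: "ImageClassifier.mlmodelc")

// Pin inference to the CPU, matching the "Backend: CPU" setting listed above.
let configuration = MLModelConfiguration()
configuration.computeUnits = .cpuOnly

do {
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    // model.prediction(from:) would then run each inference on the CPU cores only.
    print(model.modelDescription)
} catch {
    print("Failed to load model: \(error)")
}
```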
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 1432 | 268.0 IPS |
Image Classification (F16) | 100% | 1244 | 232.8 IPS |
Image Classification (I8) | 99% | 1240 | 232.1 IPS |
Image Segmentation (F32) | 100% | 628 | 10.5 IPS |
Image Segmentation (F16) | 100% | 561 | 9.37 IPS |
Image Segmentation (I8) | 100% | 554 | 9.25 IPS |
Pose Estimation (F32) | 100% | 7004 | 8.48 IPS |
Pose Estimation (F16) | 100% | 13576 | 16.4 IPS |
Pose Estimation (I8) | 100% | 13528 | 16.4 IPS |
Object Detection (F32) | 100% | 1402 | 104.7 IPS |
Object Detection (F16) | 100% | 1379 | 103.0 IPS |
Object Detection (I8) | 97% | 1401 | 104.6 IPS |
Face Detection (F32) | 100% | 2308 | 27.4 IPS |
Face Detection (F16) | 100% | 2036 | 24.2 IPS |
Face Detection (I8) | 99% | 1969 | 23.4 IPS |
Depth Estimation (F32) | 100% | 4661 | 36.1 IPS |
Depth Estimation (F16) | 99% | 4963 | 38.5 IPS |
Depth Estimation (I8) | 99% | 4914 | 38.1 IPS |
Style Transfer (F32) | 100% | 12352 | 16.2 IPS |
Style Transfer (F16) | 100% | 14690 | 19.3 IPS |
Style Transfer (I8) | 100% | 14870 | 19.6 IPS |
Image Super-Resolution (F32) | 100% | 4270 | 152.5 IPS |
Image Super-Resolution (F16) | 100% | 3643 | 130.1 IPS |
Image Super-Resolution (I8) | 100% | 3664 | 130.9 IPS |
Text Classification (F32) | 100% | 1692 | 2.43 KIPS |
Text Classification (F16) | 100% | 1064 | 1.53 KIPS |
Text Classification (I8) | 96% | 1069 | 1.54 KIPS |
Machine Translation (F32) | 100% | 1004 | 18.5 IPS |
Machine Translation (F16) | 99% | 1056 | 19.4 IPS |
Machine Translation (I8) | 99% | 978 | 18.0 IPS |