Result Information | |
---|---|
Upload Date | December 06, 2024, 06:01 PM |
Views | 2 |
System Information | |
---|---|
Operating System | macOS 15.2 (Build 24C98) |
Model | MacBook Pro (13-inch, 2022) |
Model ID | Mac14,7 |
Motherboard | Mac14,7 |
CPU Information | |
---|---|
Name | Apple M2 |
Topology | 1 Processor, 8 Cores |
Identifier | Apple M2 |
Base Frequency | 3.45 GHz |
Cluster 1 | 4 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 128 KB x 1 |
L1 Data Cache | 64.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |
Memory Information | |
---|---|
Size | 8.00 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | CPU |
Device | Default |
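
This run used the Core ML framework with the CPU backend. As a point of reference only (not part of the benchmark listing), the sketch below shows how a Core ML model can be pinned to the CPU in Swift so that the GPU and Neural Engine are not used; "SampleClassifier" is a hypothetical compiled model name, not one of the benchmark's models.

```swift
import CoreML
import Foundation

// Minimal sketch: load a Core ML model restricted to the CPU backend,
// matching the "Backend: CPU" configuration reported above.
// "SampleClassifier.mlmodelc" is assumed to be bundled with the app.
func loadCPUOnlyModel() throws -> MLModel {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .cpuOnly  // skip the GPU and Neural Engine

    guard let modelURL = Bundle.main.url(forResource: "SampleClassifier",
                                         withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: modelURL, configuration: configuration)
}
```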
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 1376 | 257.5 IPS |
Image Classification (F16) | 100% | 2518 | 471.3 IPS |
Image Classification (I8) | 99% | 2491 | 466.1 IPS |
Image Segmentation (F32) | 100% | 847 | 14.1 IPS |
Image Segmentation (F16) | 100% | 1030 | 17.2 IPS |
Image Segmentation (I8) | 100% | 1054 | 17.6 IPS |
Pose Estimation (F32) | 100% | 6493 | 7.86 IPS |
Pose Estimation (F16) | 100% | 14214 | 17.2 IPS |
Pose Estimation (I8) | 100% | 13153 | 15.9 IPS |
Object Detection (F32) | 100% | 1429 | 106.7 IPS |
Object Detection (F16) | 100% | 2457 | 183.5 IPS |
Object Detection (I8) | 97% | 1424 | 106.3 IPS |
Face Detection (F32) | 100% | 2788 | 33.1 IPS |
Face Detection (F16) | 97% | 5783 | 68.8 IPS |
Face Detection (I8) | 97% | 5886 | 70.0 IPS |
Depth Estimation (F32) | 100% | 5708 | 44.3 IPS |
Depth Estimation (F16) | 100% | 11680 | 90.6 IPS |
Depth Estimation (I8) | 99% | 11162 | 86.6 IPS |
Style Transfer (F32) | 100% | 16309 | 21.5 IPS |
Style Transfer (F16) | 100% | 27712 | 36.5 IPS |
Style Transfer (I8) | 100% | 27781 | 36.5 IPS |
Image Super-Resolution (F32) | 100% | 4204 | 150.1 IPS |
Image Super-Resolution (F16) | 100% | 4963 | 177.2 IPS |
Image Super-Resolution (I8) | 100% | 4997 | 178.5 IPS |
Text Classification (F32) | 100% | 2077 | 2.99 KIPS |
Text Classification (F16) | 100% | 3457 | 4.97 KIPS |
Text Classification (I8) | 96% | 3451 | 4.96 KIPS |
Machine Translation (F32) | 100% | 1376 | 25.3 IPS |
Machine Translation (F16) | 100% | 1618 | 29.8 IPS |
Machine Translation (I8) | 99% | 1362 | 25.1 IPS |
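
As a worked reading of the table: on the CPU backend, the half-precision (F16) runs roughly double F32 throughput for several workloads (for example, Image Classification rises from 257.5 IPS to 471.3 IPS, about a 1.83x speedup). The short Swift sketch below, using throughput values copied from the table above, computes these ratios; it is illustrative only and not part of the benchmark.

```swift
import Foundation

// Throughput values (images per second) taken from the results table above.
let throughputIPS: [String: (f32: Double, f16: Double)] = [
    "Image Classification": (257.5, 471.3),
    "Object Detection": (106.7, 183.5),
    "Depth Estimation": (44.3, 90.6),
]

// Print the F16-over-F32 speedup for each workload.
for (workload, ips) in throughputIPS.sorted(by: { $0.key < $1.key }) {
    let speedup = ips.f16 / ips.f32
    print("\(workload): \(String(format: "%.2f", speedup))x F16 over F32")
}
```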