Result Information | |
---|---|
Upload Date | February 04 2025 07:18 PM |
Views | 4 |
System Information | |
---|---|
Operating System | macOS 15.3 (Build 24D60) |
Model | MacBook Pro (13-inch, 2022) |
Model ID | Mac14,7 |
Motherboard | Mac14,7 |
CPU Information | |
---|---|
Name | Apple M2 |
Topology | 1 Processor, 8 Cores |
Identifier | Apple M2 |
Base Frequency | 3.30 GHz |
Cluster 1 | 4 Cores |
Cluster 2 | 4 Cores |
L1 Instruction Cache | 128 KB x 1 |
L1 Data Cache | 64.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |
Memory Information | |
---|---|
Size | 8.00 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | CPU |
Device | Default |
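
The run above uses the Core ML framework with the CPU backend. A minimal sketch of that configuration, assuming a hypothetical compiled model named `MyModel.mlmodelc` (this is not the benchmark's own harness), might look like:

```swift
import CoreML
import Foundation

// A minimal sketch of the reported inference configuration (Core ML, CPU backend).
// "MyModel.mlmodelc" is a hypothetical compiled model path used only for illustration.
let configuration = MLModelConfiguration()
configuration.computeUnits = .cpuOnly   // restrict inference to the CPU backend

do {
    let modelURL = URL(fileURLWithPath: "MyModel.mlmodelc")
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    // Predictions would then run through model.prediction(from:) with an
    // MLFeatureProvider carrying the model's inputs.
    print("Loaded model: \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}
```

Leaving `computeUnits` at its default (`.all`) would let Core ML schedule work across the CPU, GPU, and Neural Engine; pinning it to `.cpuOnly` matches the "Backend: CPU" setting reported here.
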
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 1207 | 225.9 IPS |
Image Classification (F16) | 100% | 2203 | 412.2 IPS |
Image Classification (I8) | 99% | 2137 | 399.9 IPS |
Image Segmentation (F32) | 100% | 840 | 14.0 IPS |
Image Segmentation (F16) | 100% | 1150 | 19.2 IPS |
Image Segmentation (I8) | 100% | 1021 | 17.1 IPS |
Pose Estimation (F32) | 100% | 6232 | 7.55 IPS |
Pose Estimation (F16) | 100% | 14054 | 17.0 IPS |
Pose Estimation (I8) | 100% | 11487 | 13.9 IPS |
Object Detection (F32) | 100% | 1429 | 106.7 IPS |
Object Detection (F16) | 100% | 2424 | 181.0 IPS |
Object Detection (I8) | 97% | 1430 | 106.8 IPS |
Face Detection (F32) | 100% | 2976 | 35.4 IPS |
Face Detection (F16) | 97% | 5678 | 67.5 IPS |
Face Detection (I8) | 97% | 5372 | 63.9 IPS |
Depth Estimation (F32) | 100% | 5462 | 42.4 IPS |
Depth Estimation (F16) | 100% | 11459 | 88.9 IPS |
Depth Estimation (I8) | 99% | 11608 | 90.0 IPS |
Style Transfer (F32) | 100% | 17977 | 23.6 IPS |
Style Transfer (F16) | 100% | 26534 | 34.9 IPS |
Style Transfer (I8) | 100% | 27208 | 35.8 IPS |
Image Super-Resolution (F32) | 100% | 4013 | 143.3 IPS |
Image Super-Resolution (F16) | 100% | 5084 | 181.6 IPS |
Image Super-Resolution (I8) | 100% | 4880 | 174.3 IPS |
Text Classification (F32) | 100% | 2011 | 2.89 KIPS |
Text Classification (F16) | 100% | 3389 | 4.87 KIPS |
Text Classification (I8) | 96% | 3478 | 5.00 KIPS |
Machine Translation (F32) | 100% | 1299 | 23.9 IPS |
Machine Translation (F16) | 100% | 1599 | 29.4 IPS |
Machine Translation (I8) | 99% | 1313 | 24.2 IPS |
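
Reading across the precision variants, F16 posts the highest throughput on most workloads in this run, with I8 edging ahead on Depth Estimation, Style Transfer, and Text Classification. A small sketch of that comparison for one workload follows, with throughput values copied from the Image Classification rows above; the ratio calculation itself is only illustrative, not part of the benchmark.

```swift
import Foundation

// Throughput (IPS) for Image Classification, taken from the table above.
let imageClassificationIPS: [String: Double] = [
    "F32": 225.9,
    "F16": 412.2,
    "I8": 399.9,
]

// Compare each reduced-precision variant against the F32 baseline.
if let f32 = imageClassificationIPS["F32"] {
    for precision in ["F16", "I8"] {
        if let ips = imageClassificationIPS[precision] {
            let speedup = ips / f32   // e.g. F16 runs at roughly 1.8x the F32 throughput
            print("\(precision): \(String(format: "%.2f", speedup))x vs F32")
        }
    }
}
```
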