Upload Date | August 16, 2024 04:40 PM |
---|---|
System Information | |
---|---|
Operating System | macOS 14.6.1 (Build 23G93) |
Model | Mac Studio |
Model ID | Mac13,1 |
Motherboard | Mac13,1 |
CPU Information | |
---|---|
Name | Apple M1 Max |
Topology | 1 Processor, 10 Cores |
Identifier | Apple M1 Max |
Base Frequency | 3.22 GHz |
Cluster 1 | 8 Cores |
Cluster 2 | 2 Cores |
L1 Instruction Cache | 128 KB x 1 |
L1 Data Cache | 64.0 KB x 1 |
L2 Cache | 4.00 MB x 1 |
Memory Information | |
---|---|
Size | 32.00 GB |
Inference Information | |
---|---|
Framework | Core ML |
Backend | GPU |
Device | Default |
Workload | Accuracy | Score | Throughput |
---|---|---|---|
Image Classification (F32) | 100% | 3999 | 748.4 IPS |
Image Classification (F16) | 100% | 4063 | 760.4 IPS |
Image Classification (I8) | 99% | 3553 | 664.9 IPS |
Image Segmentation (F32) | 100% | 1571 | 26.2 IPS |
Image Segmentation (F16) | 100% | 1615 | 27.0 IPS |
Image Segmentation (I8) | 100% | 1591 | 26.6 IPS |
Pose Estimation (F32) | 100% | 42247 | 51.2 IPS |
Pose Estimation (F16) | 100% | 43277 | 52.4 IPS |
Pose Estimation (I8) | 100% | 40243 | 48.7 IPS |
Object Detection (F32) | 100% | 2895 | 216.1 IPS |
Object Detection (F16) | 100% | 3400 | 253.8 IPS |
Object Detection (I8) | 97% | 2795 | 208.7 IPS |
Face Detection (F32) | 100% | 14136 | 168.1 IPS |
Face Detection (F16) | 98% | 14952 | 177.8 IPS |
Face Detection (I8) | 97% | 14038 | 166.9 IPS |
Depth Estimation (F32) | 100% | 22386 | 173.6 IPS |
Depth Estimation (F16) | 100% | 23639 | 183.3 IPS |
Depth Estimation (I8) | 37% | 21210 | 164.5 IPS |
Style Transfer (F32) | 100% | 68634 | 90.3 IPS |
Style Transfer (F16) | 100% | 87735 | 115.4 IPS |
Style Transfer (I8) | 100% | 82133 | 108.0 IPS |
Image Super-Resolution (F32) | 100% | 12015 | 429.1 IPS |
Image Super-Resolution (F16) | 100% | 12276 | 438.4 IPS |
Image Super-Resolution (I8) | 100% | 10571 | 377.5 IPS |
Text Classification (F32) | 100% | 690 | 991.5 IPS |
Text Classification (F16) | 100% | 472 | 678.1 IPS |
Text Classification (I8) | 96% | 473 | 680.0 IPS |
Machine Translation (F32) | 100% | 723 | 13.3 IPS |
Machine Translation (F16) | 100% | 724 | 13.3 IPS |
Machine Translation (I8) | 99% | 704 | 13.0 IPS |
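
The precision variants above can be compared directly from the listed scores. A minimal sketch (scores copied from the table; a representative subset only, not the full result set) that computes each variant's score relative to its F32 baseline:

```python
# Scores copied from a few rows of the table above.
scores = {
    "Image Classification": {"F32": 3999, "F16": 4063, "I8": 3553},
    "Depth Estimation":     {"F32": 22386, "F16": 23639, "I8": 21210},
    "Style Transfer":       {"F32": 68634, "F16": 87735, "I8": 82133},
}

for workload, by_precision in scores.items():
    base = by_precision["F32"]
    f16_ratio = by_precision["F16"] / base
    i8_ratio = by_precision["I8"] / base
    print(f"{workload}: F16 {f16_ratio:.2f}x, I8 {i8_ratio:.2f}x of F32")
```

Note that score ratios alone can mislead: Depth Estimation (I8) keeps roughly 95% of its F32 score, but the table reports its accuracy falling to 37%.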