Latest Geekbench ML Inference Results (Page 97)
| System | Processor | Uploaded | Platform | Inference Framework | Inference Score |
|---|---|---|---|---|---|
| Micro-Star International Co., Ltd. Prestige 13 AI+ Evo A2VMG | Intel Core Ultra 7 258V 2200 MHz (8 cores) | Jan 23, 2025 | Windows | ONNX CPU | 3221 |
| Micro-Star International Co., Ltd. Prestige 13 AI+ Evo A2VMG | Intel Core Ultra 7 258V 2200 MHz (8 cores) | Jan 23, 2025 | Windows | ONNX CPU | 3184 |
| Micro-Star International Co., Ltd. Summit 13 AI+ Evo A2VMTG | Intel Core Ultra 7 258V 2200 MHz (8 cores) | Jan 23, 2025 | Windows | ONNX CPU | 3357 |
| LENOVO 81J2 | Intel Pentium Silver N5000 2700 MHz (4 cores) | Jan 23, 2025 (uploaded by ppwozniak) | Linux | TensorFlow Lite CPU | 7 |
| ASUSTeK COMPUTER INC. ASUS Zenbook S 16 UM5606WA_UM5606WA | AMD Ryzen AI 9 HX 370 w/ Radeon 890M 2000 MHz (12 cores) | Jan 23, 2025 | Windows | ONNX DirectML | 6652 |
| Mac15,13 | Apple M3 4044 MHz (8 cores) | Jan 23, 2025 | macOS | Core ML CPU | 3286 |
| Mac15,13 | Apple M3 4042 MHz (8 cores) | Jan 23, 2025 | macOS | Core ML CPU | 3271 |
| iPhone SE | Apple A9 1800 MHz (2 cores) | Jan 22, 2025 | iOS | TensorFlow Lite Core ML | 289 |
| iPhone SE | Apple A9 1800 MHz (2 cores) | Jan 22, 2025 | iOS | TensorFlow Lite CPU | 327 |
| Mac16,5 | Apple M4 Max 4506 MHz (16 cores) | Jan 22, 2025 | macOS | Core ML Neural Engine | 16341 |
| Mac16,5 | Apple M4 Max 4505 MHz (16 cores) | Jan 22, 2025 | macOS | Core ML GPU | 14444 |
| Mac16,5 | Apple M4 Max 4505 MHz (16 cores) | Jan 22, 2025 | macOS | Core ML CPU | 6443 |
| MacBook Pro (13-inch, 2022) | Apple M2 3476 MHz (8 cores) | Jan 22, 2025 | macOS | Core ML Neural Engine | 9575 |
| MacBook Pro (13-inch, 2022) | Apple M2 3472 MHz (8 cores) | Jan 22, 2025 | macOS | Core ML GPU | 5946 |
| MacBook Pro (13-inch, 2022) | Apple M2 3469 MHz (8 cores) | Jan 22, 2025 | macOS | Core ML CPU | 3832 |
| iPhone 14 | Apple A15 Bionic 3230 MHz (6 cores) | Jan 22, 2025 | iOS | Core ML CPU | 1940 |
| Alienware Alienware Aurora R16 | Intel Core i9-14900F 2000 MHz (24 cores) | Jan 22, 2025 | Windows | ONNX DirectML | 27868 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX CPU | 2883 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX DirectML | 2171 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX DirectML | 2179 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX DirectML | 2182 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX DirectML | 2172 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX CPU | 2875 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX CPU | 3017 |
| Microsoft Corporation Microsoft Surface Pro, 11th Edition | Snapdragon X X1E80100 3417 MHz (12 cores) | Jan 22, 2025 | Windows | ONNX CPU | 2841 |
| samsung SM-M166P | ARM ARMv8 2000 MHz (8 cores) | Jan 21, 2025 | Android | TensorFlow Lite NNAPI | 344 |
| samsung SM-M166P | ARM ARMv8 2000 MHz (8 cores) | Jan 21, 2025 | Android | TensorFlow Lite GPU | 441 |
| samsung SM-M166P | ARM ARMv8 2000 MHz (8 cores) | Jan 21, 2025 | Android | TensorFlow Lite NNAPI | 353 |
| samsung SM-M166P | ARM ARMv8 2000 MHz (8 cores) | Jan 21, 2025 | Android | TensorFlow Lite CPU | 201 |
| samsung SM-M166P | ARM ARMv8 2000 MHz (8 cores) | Jan 21, 2025 | Android | TensorFlow Lite NNAPI | 343 |