Latest Geekbench ML Inference Results (Page 92)

| System | Processor | Uploaded | Platform | Inference Framework | Inference Score |
|---|---|---|---|---|---|
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX DirectML | 12326 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX DirectML | 12791 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX DirectML | 12855 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX DirectML | 12677 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX CPU | 3635 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX CPU | 3735 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX CPU | 3687 |
| ASUSTeK COMPUTER INC. ROG Zephyrus G16 GA605WI_GA605WI | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX CPU | 3765 |
| ASUSTeK COMPUTER INC. ASUS Zenbook S 16 UM5606WA_UM5606WA | AMD Ryzen AI 9 HX 370 2000 MHz (12 cores) | Feb 06, 2025 | Windows | ONNX DirectML | 6556 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX DirectML | 12548 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX DirectML | 12538 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX DirectML | 12269 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX CPU | 3523 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX CPU | 3535 |
| HP HP ZBook Studio 16 inch G11 Mobile Workstation PC | Intel Core Ultra 9 185H 2300 MHz (16 cores) | Feb 05, 2025 | Windows | ONNX CPU | 3515 |
| MacBook Pro (13-inch, 2022) | Apple M2 3456 MHz (8 cores) | Feb 05, 2025 | macOS | Core ML Neural Engine | 9624 |
| MacBook Pro (13-inch, 2022) | Apple M2 3478 MHz (8 cores) | Feb 05, 2025 | macOS | Core ML GPU | 5799 |
| MacBook Pro (13-inch, 2022) | Apple M2 3460 MHz (8 cores) | Feb 05, 2025 | macOS | Core ML CPU | 3856 |
| samsung SM-S931B | Qualcomm ARMv8 3532 MHz (8 cores) | Feb 05, 2025 | Android | TensorFlow Lite CPU | 603 |
| OnePlus PJX110 | ARM ARMv8 2265 MHz (8 cores) | Feb 05, 2025 | Android | TensorFlow Lite GPU | 922 |
| OnePlus PJX110 | ARM ARMv8 2265 MHz (8 cores) | Feb 05, 2025 | Android | TensorFlow Lite NNAPI | 476 |
| OnePlus PJX110 | ARM ARMv8 2265 MHz (8 cores) | Feb 05, 2025 | Android | TensorFlow Lite CPU | 1183 |
| MacBook Pro (13-inch, 2022) | Apple M2 3459 MHz (8 cores) | Feb 04, 2025 | macOS | Core ML Neural Engine | 9668 |
| MacBook Pro (13-inch, 2022) | Apple M2 3477 MHz (8 cores) | Feb 04, 2025 | macOS | Core ML GPU | 5940 |
| MacBook Pro (13-inch, 2022) | Apple M2 3299 MHz (8 cores) | Feb 04, 2025 | macOS | Core ML CPU | 3740 |
| iPhone 14 | Apple A15 Bionic 3230 MHz (6 cores) | Feb 04, 2025 | iOS | Core ML CPU | 1870 |
| samsung SM-S938B | Qualcomm ARMv8 3532 MHz (8 cores) | Feb 04, 2025 | Android | TensorFlow Lite GPU | 1558 |
| samsung SM-S938B | Qualcomm ARMv8 3532 MHz (8 cores) | Feb 04, 2025 | Android | TensorFlow Lite CPU | 2081 |
| nxp EVK_95 | ARM ARMv8 1800 MHz (6 cores) | Feb 04, 2025 | Android | TensorFlow Lite NNAPI | 23 |
| nxp EVK_95 | ARM ARMv8 1800 MHz (6 cores) | Feb 04, 2025 | Android | TensorFlow Lite GPU | 72 |