| Upload Date | December 12, 2023 10:48 PM |
| Views | 165 |
| System Information | |
|---|---|
| Operating System | Ubuntu 23.10 |
| Model | CHUWI Innovation And Technology(ShenZhen)co.,Ltd AeroBook Pro |
| Motherboard | CHUWI Innovation And Technology(ShenZhen)co.,Ltd MN15 |
| CPU Information | |
|---|---|
| Name | Intel Core m3-8100Y |
| Topology | 1 Processor, 2 Cores, 4 Threads |
| Identifier | GenuineIntel Family 6 Model 142 Stepping 9 |
| Base Frequency | 3.40 GHz |
| Cluster 1 | 0 Cores |
| L1 Instruction Cache | 32.0 KB x 2 |
| L1 Data Cache | 32.0 KB x 2 |
| L2 Cache | 256 KB x 2 |
| L3 Cache | 4.00 MB x 1 |
| Memory Information | |
|---|---|
| Size | 7.66 GB |
| Inference Information | |
|---|---|
| Framework | TensorFlow Lite |
| Backend | CPU |
| Device | Default |
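
As a rough illustration of the inference setup listed above (TensorFlow Lite on the default CPU backend), here is a minimal Python sketch that loads a model and runs a single inference. The model file name and the dummy input are placeholders for illustration only, not the models or data used by the benchmark itself.

```python
import numpy as np
import tensorflow as tf

# Load a TFLite model and run one inference on the default CPU backend.
# The model file name below is a placeholder, not a benchmark asset.
interpreter = tf.lite.Interpreter(model_path="image_classification_f32.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input matching the model's expected shape and dtype.
shape = tuple(input_details[0]["shape"])
dtype = input_details[0]["dtype"]
dummy = np.random.random_sample(shape).astype(dtype)

interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)
```
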
| Workload | Accuracy | Score | Throughput |
|---|---|---|---|
| Image Classification (F32) | 100% | 6 | 1.10 IPS |
| Image Classification (F16) | 100% | 6 | 1.11 IPS |
| Image Classification (I8) | 97% | 5 | 0.91 IPS |
| Image Segmentation (F32) | 100% | 8 | 0.13 IPS |
| Image Segmentation (F16) | 100% | 8 | 0.13 IPS |
| Image Segmentation (I8) | 98% | 6 | 0.11 IPS |
| Pose Estimation (F32) | 100% | 9 | 0.01 IPS |
| Pose Estimation (F16) | 100% | 9 | 0.01 IPS |
| Pose Estimation (I8) | 100% | 8 | 0.01 IPS |
| Object Detection (F32) | 100% | 6 | 0.47 IPS |
| Object Detection (F16) | 100% | 6 | 0.47 IPS |
| Object Detection (I8) | 68% | 5 | 0.38 IPS |
| Face Detection (F32) | 100% | 20 | 0.23 IPS |
| Face Detection (F16) | 100% | 20 | 0.23 IPS |
| Face Detection (I8) | 87% | 15 | 0.18 IPS |
| Depth Estimation (F32) | 100% | 14 | 0.11 IPS |
| Depth Estimation (F16) | 100% | 14 | 0.11 IPS |
| Depth Estimation (I8) | 95% | 12 | 0.09 IPS |
| Style Transfer (F32) | 100% | 29 | 0.04 IPS |
| Style Transfer (F16) | 100% | 29 | 0.04 IPS |
| Style Transfer (I8) | 98% | 24 | 0.03 IPS |
| Image Super-Resolution (F32) | 100% | 7 | 0.23 IPS |
| Image Super-Resolution (F16) | 100% | 6 | 0.23 IPS |
| Image Super-Resolution (I8) | 98% | 5 | 0.19 IPS |
| Text Classification (F32) | 100% | 8 | 11.2 IPS |
| Text Classification (F16) | 100% | 8 | 11.1 IPS |
| Text Classification (I8) | 92% | 6 | 8.70 IPS |
| Machine Translation (F32) | 100% | 15 | 0.28 IPS |
| Machine Translation (F16) | 100% | 15 | 0.28 IPS |
| Machine Translation (I8) | 62% | 11 | 0.21 IPS |