We need to store benchmark results somewhere and be able to analyze them even after time has passed.

I suggest creating a repo called eve-performance and storing the results there.

I propose the following structure:

Performance benchmark structure

Level 1: name of the hardware configuration (e.g. Raspberry Pi 4)

      -  README.cfg  - full spec of the HW

      -  SUMMARY.csv - all results in one table, with the following columns:

            Type, Config name, Config descr, IO_test, Result
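
      For illustration only, SUMMARY.csv could then look something like this (every value below is made up purely to show the format, not a real measurement):

            Type,Config name,Config descr,IO_test,Result
            fio,eve,EVE on Raspberry Pi 4,randread-4k,iops=12000
            fio,ubuntu,Ubuntu 22.04 on the same board,randread-4k,iops=13500
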
Level 2: CONFIG_NAME - where we run (which environment: container / EVE / Ubuntu)

       Level 3: IO_TEST - name of the fio test; each test directory contains:

              - test.cfg - full test configuration (an example fio job file is sketched after this list)

              - README.cfg - test summary

              - fio (or whichever test tool was used) - raw results

              - iostat - raw iostat output

              - perf - raw perf output

              - sar - raw sar output
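
       For example, assuming a simple 4 KiB random-read job, test.cfg could be a plain fio job file along these lines (the job name, device path and values are placeholders, not a proposed standard):

              [global]
              ioengine=libaio
              direct=1
              time_based
              runtime=60

              [randread-4k]
              rw=randread
              bs=4k
              iodepth=32
              numjobs=1
              filename=/dev/sdX    ; placeholder - the device or file under test

       Running it as "fio test.cfg --output=fio.txt" would keep the raw fio output next to the configuration that produced it.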

All results should include a timestamp and the raw data they were derived from.
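
Putting it all together, one hardware configuration could end up looking roughly like this (directory and file names are examples only; I am assuming CONFIG_NAME and IO_TEST map to nested sub-directories):

       eve-performance/
           raspberry-pi-4/              <- Level 1: HW configuration
               README.cfg               <- full HW spec
               SUMMARY.csv              <- all results for this HW
               eve/                     <- Level 2: CONFIG_NAME (environment)
                   randread-4k/         <- Level 3: IO_TEST (fio test name)
                       test.cfg
                       README.cfg
                       fio.txt          <- raw fio output
                       iostat.txt
                       perf.txt
                       sar.txt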