[feature] measure performance and save to a file
EDIT: due to a miscalculation of the idf, this result is no longer accurate. See !5 (merged)
Closes #4 (closed)
Measures the performance of the current implementation for different thresholds.
A threshold is the minimum value accepted by the Selektor (threshold <= simValue <= 1).
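
For context, a threshold sweep of this kind could look roughly like the sketch below. The `selektor.match()` call, the metric names, and the output path are assumptions made for illustration only, not the actual contents of `output/resultAnalysis.py`:

```python
import csv

def sweep_thresholds(selektor, thresholds, out_path="output/results.csv"):
    """Evaluate the Selektor at each threshold and save the results to a CSV file.

    Assumes a hypothetical selektor.match(threshold) that returns
    (true_positive, false_positive, false_negative) counts.
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["threshold", "precision", "recall", "f1"])
        for t in thresholds:
            tp, fp, fn = selektor.match(t)  # hypothetical evaluation call
            precision = tp / (tp + fp) if (tp + fp) else 0.0
            recall = tp / (tp + fn) if (tp + fn) else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if (precision + recall) else 0.0)
            writer.writerow([t, precision, recall, f1])

if __name__ == "__main__":
    # e.g. thresholds 0.0, 0.1, ..., 1.0
    steps = [i / 10 for i in range(11)]
    # sweep_thresholds(my_selektor, steps)  # requires a concrete Selektor instance
```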