Bug - Fix tensorrt-inference parsing (#674)
**Description**

While running a benchmark on my machine, I hit a failure in tensorrt-inference: exit code 33, which according to the source code is:

```
MICROBENCHMARK_RESULT_PARSING_FAILURE = 33
```

Digging into the code, I found that the parser fails when it reaches a line like:

```
[11/28/2024-17:03:11] [I] Latency: min = 7.2793 ms, max = 10.1606 ms, mean = 7.41642 ms, median = 7.39551 ms, percentile(99%) = 8 ms
```

Running the parser on this line in isolation showed that the regular expression does not handle values reported as plain integers in milliseconds (here `percentile(99%) = 8 ms`, with no decimal point). This pull request updates the regular expression with the smallest change that fixes the issue without introducing other bugs.

**Major Revision** - 0.11.0
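The failure mode can be reproduced with a small sketch. The patterns below are illustrative, not the actual expressions from the superbenchmark source: the "broken" pattern requires a decimal point after the integer part (so `8 ms` never matches), while the fixed pattern makes the fractional part optional.

```python
import re

# The trtexec latency line that triggered the parsing failure.
line = ("[11/28/2024-17:03:11] [I] Latency: min = 7.2793 ms, max = 10.1606 ms, "
        "mean = 7.41642 ms, median = 7.39551 ms, percentile(99%) = 8 ms")

# Hypothetical broken pattern: requires a '.' in the number,
# so the integer value "8 ms" is silently skipped.
broken = re.findall(r'= (\d+\.\d+) ms', line)

# Fixed pattern: the fractional part '(?:\.\d+)?' is optional,
# so both "7.2793 ms" and "8 ms" match.
fixed = re.findall(r'= (\d+(?:\.\d+)?) ms', line)

print(broken)  # four values; '8' is missing
print(fixed)   # all five values, including '8'
```

With the broken pattern, the parser sees fewer fields than expected and reports `MICROBENCHMARK_RESULT_PARSING_FAILURE`; the optional-fraction pattern accepts both formats trtexec emits.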