Swift / SR-4669

Add a Benchmark_Driver --rerun N option



Description

This feature would work as follows (a rough sketch of the flow follows the list):

1. Run all the benchmarks as usual, honoring all the other options, just like 'Benchmark_Driver run'.

2. Run the compare script, just like 'Benchmark_Driver compare'.

3. From the comparison output, scrape the list of tests flagged as significant regressions and improvements.

4. Rerun just that subset for N iterations. The user would normally want N to be much higher than the initial iteration count, say 20 vs. 3. The output of each rerun should simply be appended to the previous output; that is how the compare_script was originally designed to work.
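
A minimal sketch of how the driver-side flow could hang together. Everything here is hypothetical: rerun_flow, run_cmd, compare_cmd, parse_changed_tests, and log_path are placeholder names, and the benchmark and compare invocations are passed in as callables so no particular command-line flags are assumed.

```python
import subprocess


def rerun_flow(run_cmd, compare_cmd, parse_changed_tests, log_path,
               rerun_iterations=20):
    """Hypothetical outline of the proposed `Benchmark_Driver --rerun N` flow.

    run_cmd(tests, iterations) and compare_cmd() build the actual command
    lines; parse_changed_tests(report) scrapes test names out of the
    comparison output (see the scraping sketch further below).
    """
    # 1. Run all the benchmarks as usual (like `Benchmark_Driver run`).
    subprocess.check_call(run_cmd(tests=None, iterations=3))

    # 2. Run the compare script (like `Benchmark_Driver compare`).
    report = subprocess.check_output(compare_cmd()).decode()

    # 3. Scrape the significant regressions and improvements.
    changed = parse_changed_tests(report)

    # 4. Rerun just that subset with many more iterations, appending the
    #    new samples to the existing log, as the compare script expects.
    with open(log_path, 'a') as log:
        rerun_output = subprocess.check_output(
            run_cmd(tests=changed, iterations=rerun_iterations))
        log.write(rerun_output.decode())
```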

The driver already has almost all of the functionality needed to do this. The only missing piece is parsing the compare_script's output.
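
The missing parsing step could be a small scraper over the report text. This sketch assumes a report shape where sections are headed 'Regression' / 'Improvement' / 'No Changes' and each row starts with the test name in the first column; the compare_script's actual output format may differ.

```python
import re

# Assumed report shape: section headers followed by rows whose first
# column (markdown '|' or comma separated) is the test name.
SECTION = re.compile(r'^(Regression|Improvement|No Changes)')
ROW = re.compile(r'^\|?\s*([A-Za-z]\w*)\s*[|,]')


def parse_changed_tests(report_text):
    changed, in_significant_section = [], False
    for line in report_text.splitlines():
        section = SECTION.match(line.strip())
        if section:
            in_significant_section = section.group(1) != 'No Changes'
            continue
        if in_significant_section:
            row = ROW.match(line.strip())
            if row and row.group(1) != 'TEST':  # skip an assumed header row
                changed.append(row.group(1))
    return changed
```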

An alternative would be to make the compare_script a Python module that the driver can import, as sketched below.
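
With an importable compare_script, the driver could ask the comparator for the changed tests directly instead of scraping text. The module, class, and attribute names here (compare_perf_tests, LogParser, TestComparator, decreased, increased) are assumptions about how such a module might be shaped, not an existing API.

```python
# Hypothetical module-based alternative; all names below are assumed.
from compare_perf_tests import LogParser, TestComparator


def changed_tests_from_logs(old_log, new_log, delta_threshold=0.05):
    old = LogParser.results_from_file(old_log)
    new = LogParser.results_from_file(new_log)
    comparison = TestComparator(old, new, delta_threshold)
    # Anything flagged as a significant regression or improvement gets rerun.
    return [c.name for c in comparison.decreased + comparison.increased]
```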


People

Assignee:
Pavol Vaskovic (palimondo)
Reporter:
Andrew Trick (atrick)
Votes:
1
Watchers:
3
