We recalculated the lineup-relief tables using innings pitched instead of games. Normalizing by innings is more accurate because relief appearances vary in length, so a per-game rate would conflate workload with performance. The table below is reformatted to show each lineup-relief pair, innings pitched per game, and the average runs per inning scored by the lineup (equivalently, given up by the relief squad).
I'm not sure the innings pitched per game column means much for a lineup-relief pair. The average runs per inning range from a low of 0.388 for the worst lineup against the best relief to a high of 0.540 for the best lineup against the worst relief (highlighted in green above). This is what we should expect, and the range is significant enough to produce interesting differences in simulation.
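For reference, here is a minimal sketch of the per-inning recalculation. It assumes a game log with hypothetical columns (lineup_tier, relief_tier, relief_ip, runs_vs_relief); the actual source data and column names may differ.

```python
import pandas as pd

# Hypothetical game log; column names and values are illustrative only.
# relief_ip is in decimal innings (outs / 3).
games = pd.DataFrame({
    "lineup_tier":    [1, 1, 5, 5],
    "relief_tier":    [5, 5, 1, 1],
    "relief_ip":      [3.0, 2.67, 4.0, 3.33],
    "runs_vs_relief": [2, 1, 1, 1],
})

summary = (
    games.groupby(["lineup_tier", "relief_tier"])
         .agg(games=("relief_ip", "size"),
              total_ip=("relief_ip", "sum"),
              total_runs=("runs_vs_relief", "sum"))
)
# Normalize by innings rather than games, so a short relief outing
# no longer counts the same as a long one.
summary["ip_per_game"]     = summary["total_ip"] / summary["games"]
summary["runs_per_inning"] = summary["total_runs"] / summary["total_ip"]
print(summary)
```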
Below is a condensed version of the table above, making the trend easier to see.
Average runs scored go down as worse lineups face better relief squads, as we would expect, so the data looks correct so far. It's possible that the best lineup against the worst relief has the highest IP/game because the best lineup knocks out starters faster than worse lineups do, forcing the relief pitchers to cover more innings regardless of their value.
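As a toy illustration of that knockout effect (with made-up start lengths, not the actual data): if games run about nine innings, every inning the starter doesn't record falls to the bullpen, so a lineup that shortens starts inflates relief IP/game no matter how good the relief is.

```python
# Made-up average start lengths against three lineup strengths.
avg_starter_ip = {"weak lineup": 6.3, "average lineup": 6.0, "strong lineup": 5.6}

for lineup, starter_ip in avg_starter_ip.items():
    relief_ip = 9.0 - starter_ip  # bullpen covers whatever the starter doesn't
    print(f"{lineup}: starter {starter_ip:.1f} IP/G -> relief {relief_ip:.1f} IP/G")
```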
Since we're here, let's do this for lineup-starters as well. Same table format as above.
The difference between the two extremes in lineup-starter combos is around 0.2 runs per inning; for lineup-relief combos that difference is around 0.15 runs per inning. The innings pitched per game column shows that higher-tier pitchers throw more innings, which should be expected: the high is 6.94 IP/game for the 5-1 lineup-starter pair, dropping to a low of 5.87 for the 1-5 pair.
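For concreteness, the arithmetic behind those spreads, using the values quoted above (the lineup-starter run extremes aren't given exactly, so only the relief spread is computed):

```python
# Relief extremes from the earlier table (runs per inning).
relief_spread = 0.540 - 0.388
print(f"lineup-relief spread: {relief_spread:.3f} runs/inning")  # 0.152, i.e. ~0.15

# Starter workload extremes from the lineup-starter table.
ip_spread = 6.94 - 5.87
print(f"starter IP/G spread:  {ip_spread:.2f} innings")          # 1.07
```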
Below is a condensed version of the above table.
The trend of average runs follows what we expect: the best lineups facing the worst starters score the most runs, and scoring decreases as starter value increases and lineup value decreases. We will use the inning numbers for simulation.
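As a preview, here is a minimal sketch of how the per-inning rates could drive a game simulation. It assumes runs per inning are Poisson-distributed around the matchup's average rate and a fixed six-inning start; both are assumptions for illustration, not necessarily the model the simulations will actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_game(starter_rate, relief_rate, starter_ip=6):
    """Sample one game's runs for a lineup: one Poisson draw per inning,
    using the starter's rate early and the relief rate afterward."""
    rates = [starter_rate] * starter_ip + [relief_rate] * (9 - starter_ip)
    return int(rng.poisson(rates).sum())

# Example: best lineup vs a weak staff, using the quoted 0.540 runs/inning
# against the worst relief and a hypothetical 0.60 against the worst starter.
games = [simulate_game(0.60, 0.540) for _ in range(10_000)]
print(f"avg runs/game: {sum(games) / len(games):.2f}")
```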
That is all for now. The next step is running simulations.