When Mike McCarthy was auditioning for a new head coaching job after being let go by the Green Bay Packers, he made it known that he had spent his year away from the game camped out in his basement lair, studying tape and immersing himself in “analytics.” After Jerry Jones hired him as the next head coach of the Dallas Cowboys, McCarthy made it seem that he had added new-age analytics to his arsenal of coaching weapons. This newfound knowledge was tested right away in his first game as coach of the Cowboys.
McCarthy found his team trailing 20-17 in the fourth quarter. When the Dallas offense stalled at 4th-and-3 on the Los Angeles Rams’ 13-yard line, McCarthy bypassed the 30-yard field goal attempt, since the “analytics” on 4th down apparently provided a one-size-fits-all answer that required going for the 1st down to keep a potential touchdown drive alive. Unfortunately for the Cowboys faithful, Dallas failed to convert and eventually lost the game by that same 20-17 score.
I'm agnostic as to whether Mike McCarthy made the right call going for it on 4th-and-3 rather than kicking the game-tying FG. However, the litany of defenses of his decision exposed one of the most flawed applications of analytics in football. Applying the "historical" success rate on 4th-and-3 (or any other 4th down situation) to the Cowboys' specific chances in that spot represents deductive logic run amok.
Here are just a few intangibles that would impact Dallas' success rate at that moment: (1) their field position; (2) the moment in the game; (3) the quality of the opponent’s defense; (4) their credible 3-yard rush play options; (5) their credible 3-yard pass play options; (6) the injury status/health of key offensive players.
Each of these considerations either impacts the specific success rate of the Cowboys’ 4th down play at that moment or contextualizes the risk calculus of settling for the field goal attempt instead.
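The risk calculus here is, at bottom, an expected-value comparison. The sketch below illustrates it with hypothetical placeholder probabilities (none of these numbers come from actual NFL data):

```python
# Hypothetical expected-points sketch of the 4th-and-3 decision.
# Every probability below is an illustrative assumption, NOT real NFL data.

P_CONVERT = 0.50        # assumed chance Dallas converts this 4th-and-3
P_TD_IF_CONVERT = 0.60  # assumed chance the extended drive ends in a TD
P_FG_30YD = 0.90        # assumed chance of making a 30-yard field goal

# Expected points from going for it (touchdown worth 7 points;
# a fallback field goal after a stalled drive is ignored for simplicity).
ev_go = P_CONVERT * P_TD_IF_CONVERT * 7

# Expected points from kicking the field goal.
ev_kick = P_FG_30YD * 3

print(f"Go for it: {ev_go:.2f} expected points")
print(f"Kick FG:   {ev_kick:.2f} expected points")
```

With these placeholder inputs, kicking looks better; nudge `P_CONVERT` up or down, as each of the factors listed above would, and the answer can flip. That sensitivity to team-specific inputs is precisely the point.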
These intangibles expose the need for more precise data to identify qualitative factors that contextualize the "actual" probability. Head coaches conduct this additional level of analysis. And this inductive logic is even considered conventional wisdom in other situations!
What are the "NFL history" odds of making a 45-yard field goal? 60%? Imagine that argument trotted out for a kicker who had been struggling throughout that very game to make chip-shot field goals (as was the case the next night on Monday Night Football, when Tennessee placekicker Stephen Gostkowski made a game-winning field goal after missing kicks earlier in the game).
If the NFL-historical success rate on 4th-and-3 is, say, 51% (guessing), that does not mean the Cowboys had that same probability in that specific (statistical) moment. Maybe it was higher!
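One way to see why a league-wide base rate is only a starting point: situational factors shift the probability away from the average. The sketch below does this in log-odds space; both the approach and every shift value are hypothetical illustrations, not fitted coefficients:

```python
import math

def adjust(base_rate, *log_odds_shifts):
    """Shift a league-wide base rate by situational factors in log-odds space.
    All shift values passed in are hypothetical, not fitted coefficients."""
    logit = math.log(base_rate / (1 - base_rate))
    return 1 / (1 + math.exp(-(logit + sum(log_odds_shifts))))

league_base = 0.51  # the article's guessed league-wide 4th-and-3 rate

# Hypothetical shifts for factors like those listed earlier (+ helps, - hurts):
team_specific = adjust(
    league_base,
    +0.20,  # strong short-yardage play options (assumed)
    -0.30,  # tough opposing defense (assumed)
    -0.10,  # key offensive player banged up (assumed)
)
print(f"Team-specific estimate: {team_specific:.3f}")
```

The output lands near 0.46 with these made-up shifts, below the 51% base rate; with different shifts it could just as easily land above it. The point is that the honest estimate depends on the specifics, not the league average alone.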
It is indicative of football analytics still being in its infancy that this deductive logic is advanced so heavily. Imagine this argument being made at a hypothetical World Series moment: "The league-wide stolen base success rate is 55% -- so take your chances with (the relatively slow, non-base-stealing) Cody Bellinger stealing 2nd (with two outs)!"
Football coaches may, in fact, be better served by being more aggressive on 4th down. But citing general league-wide data, even data that attempts to be more specific and analogous to the situation at hand, is a flawed way to justify any particular call.
Best of luck for us — Frank.