Chaos Rankings

Monday, July 25, 2011

You Can't Teach An Old Dog New Tricks

Good thing I'm not old! If I've learned nothing else from the last year of graduate school, it's this: there is no such thing as an objective "best" of anything. I alluded to this in the original mission statement post of this blog, way back in July 2006. But now I have documented proof: Evaluation and Decision Models with Multiple Criteria. I won't bore you with the details, but the premise is that "best" depends on how you set up your evaluation model, and even if you get everyone to agree on the same data set, different aggregation methods will produce different ranking orders. As such, Bouyssou suggests that any evaluation model should be constructed to imitate the client's value system and preferences as closely as possible. In this case, trying to elicit that information from the NCAA is impractical if not impossible. As for the BCS, things are a bit simpler: they only care about identifying the two best teams and having them play each other to anoint a champion. The BCS is a perfect example of how there is no such thing as "best": the ranking methods used in the BCS computation don't necessarily agree with each other, and the final result often isn't agreed upon by the end user (the viewing public).
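To see how the same data can yield different rankings, here's a toy sketch (the teams, polls, and rank numbers are all made up for illustration): three polls rank four teams, and we aggregate by average rank, median rank, and count of first-place votes.

```python
# Hypothetical data: three polls each rank four teams (1 = best).
from statistics import mean, median

ranks = {            # ranks from poll 1, poll 2, poll 3
    "Team A": [1, 4, 4],
    "Team B": [2, 2, 2],
    "Team C": [3, 1, 1],
    "Team D": [4, 3, 3],
}

def order(score):
    """Order teams best-to-worst by an aggregate score (lower = better)."""
    return [team for team, _ in sorted(ranks.items(), key=lambda kv: score(kv[1]))]

by_mean   = order(mean)                    # average rank across polls
by_median = order(median)                  # median rank across polls
by_firsts = order(lambda r: -r.count(1))   # most first-place votes

print(by_mean)    # ['Team C', 'Team B', 'Team A', 'Team D']
print(by_median)  # ['Team C', 'Team B', 'Team D', 'Team A']
print(by_firsts)  # ['Team C', 'Team A', 'Team B', 'Team D']
```

Same ballots, three different "best-to-worst" orders: the mean puts Team A ahead of Team D, the median flips them, and counting first-place votes vaults Team A to second. Which aggregation is "right" depends entirely on what the evaluator values.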
What does all of this nonsense mean for you? Two things. 1) These rankings aren't going anywhere. If the BCS continues to be controversial every year, there's always room for improvement, and I'd like to think the proliferation of alternate ranking systems and the recent uptick in playoff proponents leaves space at the discussion table for my own system. 2) I have a few ideas up my sleeve that I'm going to be playing with this season to improve the predictive ability of the ranking system. If it's impossible to determine who's legitimately the "best" team out of a group of 120, I'd settle for being able to tell you that Team A should beat Team B (therefore making them "better" than Team B) and why, using only past performance indicators. Stay tuned!