Given the popularity and diversity of esports, it is crucial to develop reusable methods for explaining and evaluating them. Quantifying skill in an esport can improve win prediction, matchmaking, and storytelling for these games. Arpad Elo’s skill-rating system for chess has been adapted to many games and sports; in each instance, the modeler must tune parameters to optimize some metric, usually accuracy, and these approaches are often one-off and inconsistent. We propose SCOPE, a framework that uses grid-search cross-validation to select optimal parameters for Elo based on accuracy, calibration, or log loss. We demonstrate this method on a season of the Call of Duty World League, a first-person-shooter esport, and show performance comparable to that of more complex, state-of-the-art methods.
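To make the idea concrete, the following is a minimal sketch, not the paper's implementation, of the core components the abstract names: a standard Elo update and a grid search over the K-factor scored by log loss. The match data, team names, and K-factor grid are invented for illustration, and the sketch omits the cross-validation folds a full SCOPE-style evaluation would use.

```python
import math

def expected_score(r_a, r_b):
    # Standard Elo win expectation on the 400-point logistic scale.
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def run_elo(matches, k, base=1500.0):
    # matches: list of (team_a, team_b, outcome), outcome=1 if team_a won.
    # Returns the sequence of pre-match predictions and the outcomes.
    ratings = {}
    preds, outcomes = [], []
    for a, b, won in matches:
        ra, rb = ratings.get(a, base), ratings.get(b, base)
        p = expected_score(ra, rb)
        preds.append(p)
        outcomes.append(won)
        ratings[a] = ra + k * (won - p)
        ratings[b] = rb + k * ((1 - won) - (1 - p))
    return preds, outcomes

def log_loss(preds, outcomes, eps=1e-12):
    # Mean negative log-likelihood of the observed outcomes.
    return -sum(y * math.log(max(p, eps)) + (1 - y) * math.log(max(1 - p, eps))
                for p, y in zip(preds, outcomes)) / len(preds)

# Toy match list (hypothetical); a real application would use a season of results.
matches = [("A", "B", 1), ("B", "C", 1), ("A", "C", 1),
           ("C", "A", 0), ("B", "A", 0)]

# Grid search over candidate K-factors, selecting the one with lowest log loss.
k_grid = [8, 16, 24, 32, 40]
best_k, best_loss = None, float("inf")
for k in k_grid:
    preds, outs = run_elo(matches, k)
    loss = log_loss(preds, outs)
    if loss < best_loss:
        best_k, best_loss = k, loss
```

The same loop could instead score each K by accuracy or a calibration measure, which is the kind of metric choice the framework exposes.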