CyberPatriot 8 Round 1 Score Analysis
For the past several years1, I’ve analyzed CyberPatriot2 competition rounds using a Pryaxis product called The Magi.
This report covers the data The Magi collected over the entirety of the CyberPatriot 8 Round 1 competition window.
After the competition window closes, CyberPatriot modifies scores to account for penalties, alternative score dates, and extenuating circumstances that warrant score modifications. Because this round had no Cisco/Networking scores, the comparisons here reveal exactly how the CyberPatriot operations center alters image scores after the competition closes.
CPOC’s3 released official score PDFs tell an interesting story about five teams.
One team, 08-2587, was listed as “Score Under Review,” and no score was provided. Their team was flagged as having multiple copies of the same image open but had a fairly “average” score of 155 points4. This was the only team listed as “under review,” rather than withheld.
Two teams from the open division were listed as “Score Withheld,” and again, no score was provided. These teams were 08-2868 and 08-2869. Both teams had a score of 154 points at shutdown, were started within six minutes of each other, and are only a single digit off in their team identifiers.
Two middle school teams, 08-1043 and 08-1046, started within three minutes of each other and were withheld under the same conditions as the open division high school teams – no warnings issued. These teams had 94 points and were subsequently removed.
The first team’s removal is interesting, because other teams with concurrent instance flags were penalized in points – not removed entirely. This data is interesting on its own, and is examined later.
None of the other excluded teams had warnings. Given the startup times and the adjacent team ID numbers, the only conclusion that makes sense is that they were sharing information between teams. Each pair of teams was likely from the same school – which would make this type of cheating trivial. Unfortunately, without the graph data to confirm this hypothesis, the answers will likely remain with those coaches and CPOC.
Score adjustments made by CPOC during most competition rounds are normally obscured by the addition of Cisco & networking challenges that lie outside the scope of the online competition system. Because this round lacked those additional components, the changes are unmasked.
In the middle school division, four teams lost points, an average of 25 points each:
- 08-1489 lost 6 points, 73 scraped5 to 67 officially, with a recorded 6 hours and 26 minutes of competition time. This was likely a time penalty.
- 08-0852 lost 20 points, 85 scraped to 65 officially, with a recorded 5 hours and 48 minutes of competition time. They were flagged as having multiple instances open, and likely were penalized as a result.
- 08-1187 lost 25 points, 88 scraped to 63 officially, with a recorded time of 5 hours and 59 minutes. This team wasn’t flagged by CCS6 for time or instances, which makes this loss somewhat surprising.
- 08-1853 lost 49 points, from 143 (!!) scraped points to 94 officially. They were flagged with a whopping 49 hours and 57 minutes of competition time.
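The deltas above are simple to recompute from a pair of score snapshots. Here is a minimal sketch; the `team_id`/`score` CSV columns are hypothetical stand-ins, not The Magi's actual schema, and the inline data is just the four middle school teams listed above:

```python
import csv

def load_scores(path):
    """Map team_id -> score from a CSV with 'team_id' and 'score' columns (assumed layout)."""
    with open(path, newline="") as f:
        return {row["team_id"]: int(row["score"]) for row in csv.DictReader(f)}

def score_deltas(scraped, official):
    """Official score minus scraped score for every team present in both snapshots."""
    return {tid: official[tid] - scraped[tid] for tid in scraped if tid in official}

# The four middle school teams discussed above, keyed by team ID:
scraped = {"08-1489": 73, "08-0852": 85, "08-1187": 88, "08-1853": 143}
official = {"08-1489": 67, "08-0852": 65, "08-1187": 63, "08-1853": 94}

deltas = score_deltas(scraped, official)
losers = {tid: d for tid, d in deltas.items() if d < 0}
avg_loss = sum(-d for d in losers.values()) / len(losers)  # → 25.0
```

The same two-snapshot subtraction, run across each division, produces the averages quoted in this section.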
In the high school competition, score reductions were more numerous.
In the all service division, 14 teams lost an average of 25 points. Of those, three teams were reduced with the multiple instances flag, and five with time flags.
- 08-0713 had no recorded penalties in CCS, but lost 43 points (162 to 119).
- 08-1880 also had no penalties in CCS, but lost 42 points (92 to 50).
In the open division, 40 teams lost an average of 20 points. Eleven teams were flagged with multiple instances, and 12 with the time flag.
- 08-3339 managed to go from a scraped score of 112 to 8 with a loss of 104 points in 25 hours of competition time.
- 08-2871, which had no penalties in CCS, went from 147 to 78 – a 69 point drop.
I originally thought this section wouldn’t exist – or that if it did, it would cover only a small number of score corrections. Instead:
- 623 teams gained an average of 8 points in the all service division, including 5 teams that were flagged with multiple instances and 101 teams flagged with time violations.
- 843 teams gained an average of 8 points in the open division, including 18 teams that were flagged with multiple instances and 93 teams with time violations.
I initially thought this was a fluke in the data, but after looking at teams with apparently unrelated traits, a majority of them gained exactly eight points, irrespective of time or multiple instance penalties. The middle school division proves this wasn’t a fluke with the scraper – only five teams in that division received positive point adjustments. Because the middle and high school competitions use different images, it would be no surprise if a broken vulnerability in a high-school-only image caused teams to lose points. But that explanation seems unlikely, since 105 teams were recorded with a perfect round one score. The only option left would be a curve, but that raises the question of why it applied only to the high school teams – not the middle school teams. Moreover, it wasn’t universal: teams with 200 points didn’t get bumped to 208.
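An anomaly like the eight-point bump is easier to surface by counting adjustment values than by averaging them: a spike at one exact value stands out immediately. A sketch, using made-up deltas (the real numbers live in The Magi's CSV export):

```python
from collections import Counter

def adjustment_spike(deltas, threshold=0.5):
    """Return the most common nonzero adjustment if it accounts for more than
    `threshold` of all nonzero adjustments; otherwise None."""
    nonzero = [d for d in deltas if d != 0]
    if not nonzero:
        return None
    value, count = Counter(nonzero).most_common(1)[0]
    return value if count / len(nonzero) > threshold else None

# Illustrative only: a division where most adjusted teams gained exactly 8 points.
sample = [8, 8, 8, -20, 8, 8, -43, 8, 8, 0, 8]
spike = adjustment_spike(sample)  # → 8
```

A distribution of genuinely independent corrections would have no single dominant value, so a majority-share spike like this is a strong hint of a systematic cause rather than per-team penalties.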
“For the ‘curve’ it was actually an error in the score engine. It didn’t log one of the vulnerabilities properly for those teams, but did for others. It was confirmed by CyberPatriot via email when they sent out the results.” –TibitXimer, via Reddit7
The data for these calculations was pulled from The Magi’s database, and is available in The Magi’s source repo as a human-readable document and CSV.
In the past, it was difficult to see how time penalties affected teams in CyberPatriot. Teams very clearly lose points, even for minor overages, which makes stopping on time critically important in future rounds. The mysterious eight-point addition is certainly interesting, but it probably masks scores that would otherwise have lost points. As always, running multiple images seems to be the most dangerous mistake – it caused the complete removal of a score as “under review,” the only score of its kind, along with significant point drops.
The Magi is again predicting platinum slots, as it did last year with 95% accuracy. This year, it also correctly calculates slots for All Service and Middle School teams.
Coaches and captains who want more practice should consider Jump, a scoring engine for Linux & Windows that provides a lot of power for a fair price. Read more about it in my announcement post.
Several years is vague. I don’t really remember the exact competition I started doing this on, so I’ll leave it vague. The earliest scraper I have open-sourced is the one for CyberPatriot 6, available on GitHub. ↩
CPOC is the CyberPatriot Operations Center, and is commonly used in reference to the staff and organizers behind CyberPatriot. ↩
Not really an average score, but it was normal enough that I consider it “average.” ↩
Scraped refers to what The Magi saw at the end of the CyberPatriot 8 competition window. Quoting CPOC, “The scores and warnings shown on [the scoreboard] have not been officially verified and are provided for reference purposes only. Displayed scores may not include penalties or other lost points. Official scores are published by the CyberPatriot Program Office during the week following each round of competition.” ↩
CCS is the CyberPatriot Competition System, developed at the University of Texas at San Antonio. ↩