Anonymous wrote:
Anonymous wrote:
Can you provide examples? I haven't seen this personally.
I haven't paid close enough attention to notice. It could happen if previous QoP scores are adjusted retroactively based on the most recent game week.
Anonymous wrote:
Has anyone found an explanation for why the scores change even without playing games? Not the ranking, but the actual scores.
Assuming you are right, how would you measure defensive quality of play?
Anonymous wrote:
The quality of play algorithm uses the “quality of every significant player action” to calculate attacking and defensive scores. This seems to work reasonably well for attacking, but it can lead to the wrong results for defending. Let me give you an analogy: you are comparing hospitals A and B on how well they treat infections. Hospital A admits 100 patients, 50 get infections at the hospital, and 5 die. You conclude that hospital A treated 45/50 infections successfully, for a success rate of 90%. Hospital B admits 100 patients, 5 get infections, and 3 die. You conclude that hospital B treated 2/5 infections successfully, for a success rate of only 40%. If you only look at success rate, you will conclude that hospital A is better. But hospital B actually prevented infections, and that led to fewer total infections and fewer deaths.
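If you want to check the analogy's arithmetic yourself, here is a quick sketch (all numbers come straight from the analogy above):

```python
# Checking the hospital analogy's arithmetic: treatment success rate
# vs. actual outcomes. Numbers are the ones used in the analogy.

def success_rate(infections, deaths):
    """Fraction of in-hospital infections treated successfully."""
    return (infections - deaths) / infections

# Hospital A: 100 admitted, 50 infected in hospital, 5 die.
rate_a = success_rate(50, 5)   # 45/50 = 0.90 -- looks excellent
# Hospital B: 100 admitted, 5 infected in hospital, 3 die.
rate_b = success_rate(5, 3)    # 2/5  = 0.40 -- looks poor

# Judged by treatment success rate alone, A "wins"...
print(f"A: {rate_a:.0%} success, 5 deaths")
print(f"B: {rate_b:.0%} success, 3 deaths")
# ...but B prevented infections in the first place and has fewer deaths.
```

The metric only sees what happens after an infection occurs, so it never credits prevention, which is the whole point of the analogy.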
Something similar might be happening for defensive quality of play. I will use Achilles U14 as an example because they have one of the largest differences between goals allowed and defensive QoP, but there are many other examples. Achilles likes to build from the back and play through the middle. That might be great for development and might produce great players 5 years from now. But for now, it leads to a lot of turnovers in their own half, high possession for the opposing team, and a lot of high quality defensive actions for Achilles’ CB and CDM. They allow 3.55 goals per game and their defensive QoP is 80.6. They are like hospital A.
Baltimore Armour (0.47 goals allowed per game), SYC (0.74), Philly Union (0.84), and Delco (1.26) all allow fewer goals per game than Achilles – Armour concedes roughly one-seventh as many! Yet every one of them has a worse defensive quality of play than Achilles. This is not because their defenders are bad; it is because these teams denied their opponents attacking opportunities in the first place, by holding more possession, recovering the ball farther up the field, or other actions. They are like hospital B.
Baltimore Armour U14 has the second fewest goals per game allowed in the country (0.47), but they are ranked #73 in defensive QoP. Cedar Star Bergen has the third fewest goals per game allowed in the country (0.53) but they are #51 in defensive quality of play. These teams are likely being penalized for having fewer high quality defensive plays than other teams, but that should actually be a good thing! Again, these teams are like hospital B.
I think this is a reasonable explanation for why defensive QoP seems uncorrelated with goals allowed: the algorithm rewards high quality defensive plays, but it also rewards a high volume of defensive plays, and it penalizes teams with a low volume of them. But I admit I could be wrong. Taka’s algorithm is proprietary and not reproducible, so I have no way to test these results against their data. But if anyone from Taka or MLS Next eventually reads this thread, maybe they can look into how defensive QoP is calculated next year.
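If this explanation is right, a toy scoring rule reproduces the pattern. To be clear, this is my guess at the *shape* of such a formula, not Taka's actual algorithm, and every number below is invented for illustration:

```python
# Toy model of a defensive score that rewards both the quality AND the
# volume of defensive actions. This is a guessed shape, not Taka's real
# algorithm; action counts and qualities below are invented.

def hypothesized_defensive_qop(actions_per_game, avg_action_quality):
    """Blend of average action quality (0-100) and a volume bonus."""
    volume_bonus = min(100.0, 2.0 * actions_per_game)
    return 0.6 * avg_action_quality + 0.4 * volume_bonus

# "Hospital A" team: under constant pressure -> many defensive actions.
achilles_like = hypothesized_defensive_qop(45, 78)
# "Hospital B" team: prevents chances -> few defensive actions needed.
armour_like = hypothesized_defensive_qop(15, 82)

print(achilles_like, armour_like)
# The high-volume team scores higher even though, per the real numbers
# in this thread, it concedes 3.55 goals/game vs the other's 0.47.
```

Under any formula of this shape, suppressing your opponent's chances shrinks your own action count and drags your score down, exactly the hospital B effect.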
Anonymous wrote:
I honestly think that Taka might be making a mistake in how they calculate defensive quality of play.
If you look at different leagues, the attacking quality of play is highly correlated with the number of goals scored. That seems reasonable. More goals = higher attacking QoP.
But defensive quality of play seems to be uncorrelated with the number of goals against, which is highly unintuitive. For example, in the U14 Northeast Division, Bergen has allowed a league-best 6 goals and is ranked 9th in defensive QoP. In the U14 Mid-Atlantic, Armour has allowed a league-best 9 goals and is also ranked 9th in defensive QoP. It seems unintuitive that the teams with the best defenses in terms of goals allowed are ranked so far down in defensive QoP. You can find many more such examples.
My uninformed guess is that Taka counts the number of "good" defensive plays a team makes and adds them up to calculate the defensive QoP score. But teams that recover the ball quickly or hold more possession will have fewer of those defensive plays, so teams that allow their opponents fewer chances are penalized with a low defensive QoP.
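If Taka's per-team scores were public, the "uncorrelated" claim would be easy to test with a rank correlation. Here is a minimal, standard-library-only Spearman sketch; the six data rows are invented placeholders, not real standings:

```python
# Sketch: Spearman rank correlation between goals allowed and defensive
# QoP. The data rows are invented placeholders; with Taka's real scores
# this would directly test the "uncorrelated" claim.

def ranks(values):
    """Rank values from 1 (smallest) upward; assumes no ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rho via the no-ties formula: 1 - 6*sum(d^2)/(n(n^2-1))."""
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(xs), ranks(ys)))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Invented example: six teams, goals allowed vs. defensive QoP score.
goals_allowed = [9, 12, 20, 33, 45, 70]
defensive_qop = [61.0, 70.2, 80.1, 65.4, 55.0, 80.6]

rho = spearman(goals_allowed, defensive_qop)
print(f"Spearman rho = {rho:.2f}")  # prints "Spearman rho = 0.26"
```

If the guess about volume-based counting is right, real data should show rho near zero (or even positive, i.e. more goals conceded going with higher defensive QoP), whereas attacking QoP vs. goals scored should show a strong correlation.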
Anonymous wrote:
The newly updated QoP scores are up. And some of the QoP numbers seem completely detached from reality:
For example, the U14 Achilles team has a great defensive QoP of 80.6 after allowing 70 goals. This is way better than Philly Union's 75.3 with 16 goals allowed.
When two teams have allowed a similar number of goals, QoP could be a useful tool for saying which team is better, depending on style, schedule, and other factors. But when a team has allowed more than four times as many goals as another, yet the algorithm says it has a better defensive QoP, the most reasonable conclusion is to question the validity of the algorithm.
Anonymous wrote:
This. The app uses a zero-sum game model built for a 1v1 contest and applies it to a non-zero-sum game in 7v7 to 11v11 contests where the rosters are fluid.
It creates a false sense of certainty. And you get those false "gut" confirmations – "the rankings app said it would be 1-0 and we tied, so the rankings app is pretty good" – when in reality, for a low-margin game over 90 minutes, that is a huge miss.
I don't think you understood half of the words you said. In any case, a model that predicts the score of a soccer game consistently within 1 goal would be an incredibly good model.