- Jan 5, 2013
I think this is a good thing to speak generally about. I have seen people post about this in different threads and don't want to bring up specifics.
When coaches are comparing scores from one event to another on the same score sheet, what are you looking at? Here's a little list of things I think are dos and don'ts.
Absolutely do not compare your raw scores from one event to another. There are many things that change from judge to judge. Even with the same judge at different events there can be differences.
Compare category scores. Look at difficulty scores for categories and compare them. You shouldn't be looking for the exact same number but instead making sure they are in the same range. As long as you are in the same range, a judge has the discretion of where in the range to put you. If you drop a stunt and miss elements, you may not make it into the same range as you did last time. With tumbling being very numbers based, 2 or 3 athletes not throwing the skill they were supposed to can move you out of a majority situation.
Technique scores. These can vary greatly from panel to panel and event to event. Don't compare your technique from one day to technique from another. Most companies have ranges for this as well, but you usually won't see the same exact score from event to event. What you should check is multiple teams from the same panel. Perhaps your Mini 2 and Sr 2 were judged by the same panel. You should know which team had the better execution, so make sure it is consistent. Also, most companies don't allow judges to consider stunts that fall in technique. If it falls you are getting a deduction, so the technique score will be based on the remaining stunts.
Creativity and dance scores will probably vary the most. A few things affect this and could vary greatly depending on the event. Most companies don't have concrete rubrics for either of these, just directional guidance. This basically leaves it up to the judge to score where they see fit. Again, while you can't compare event to event, teams judged by the same panel should be consistent.
So for those who are comparing scores, what do you look for? (Which should be every coach... never just take scores without verifying them.)