Inconsistent All-Star Judging


Funfunfun

Cheer Parent
Dec 20, 2009
I see a lot of comments about inconsistencies in judging. It made me think about referees in the NFL. My understanding is that after each game, the game tapes are reviewed by industry leaders and each referee is given a grade/score based on their performance (for example: did they miss a call, or did they call something they thought they saw that never actually happened?). All scores are tabulated, and at the end of the season the referees with the best scores are invited to work the Super Bowl.
I know cheer is not the NFL, but I was wondering whether competition judges are ever judged themselves by industry leaders and ranked in any way. I realize many areas of the scorecard are subjective, but I'm talking about those score sheets where judges make ridiculous comments about adding skills that are out of level, or deduct points for three stunts hitting the ground when clearly there was only one stunt down.
I'm just a cheer parent, so I don't know all the ins and outs of judging; that's why I'm asking. Are judges held accountable in any way? Even if they are held accountable by each individual event producer, it would be interesting to see what would happen if those internal rankings were shared throughout the industry.
 
The truth is, until the EPs get over themselves and agree to a universal scoresheet, you can't 'blame' the judges. The system they have been given is awful. One week the winner scores a 95 and the next week a 'loser' gets a 435. That makes NO SENSE to the layman.
 
I think it would be great for EPs to "audit" the judging to make sure they are providing a fair competition. Maybe judges who didn't have any demerits could earn a bonus at the end of the season.
 

Agreed, a million times over: until there is a universal scoresheet, you can't blame the judges. Once you have a universal scoresheet, you can have consistent training of judges, which will automatically improve their performance. And once judging is better and there is a scoresheet people understand (with the scores opened up), you should see fewer inconsistencies.
 
I'm with Kingston here. You can't blame judges when the scoresheet is different week to week. We can't evaluate judges until there is one consistent scoresheet. It is unfair to say judges don't score consistently when their score sheet isn't even consistent.
 
The way it works at most competitions:
1. Judges score the routine that is on the floor; all score sheets for that team are put in an envelope.
2. A runner takes the envelope to a back room at the event.
3. In that room the score sheets are added up (so the judges don't have to add them) and then double- (sometimes triple-) checked to be sure they are added correctly.
4. Scores are recorded for the team on a master sheet.
5. Score sheets are put back into the envelope with the team name.
6. Repeat.

Those envelopes are given to the teams at awards; the EPs don't keep a copy. This review system would add a lot of time to the competition. You'd basically need another judge in a back room to re-watch the routine on video replay, check that deductions were taken, and review ALL the score sheets from that team (that's five score sheets per team to check). It would also add costs to the event (you'd need more than one judge reviewer at each event to keep things moving, plus the cost of video replay), and those costs would be passed on in registration fees and admission prices.
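For anyone trying to picture those six steps as a script, here is a rough Python sketch of the back-room tabulation. The category names, the five-judge panel, the hand-written grand total on each sheet, and summing (rather than averaging) the panel are assumptions for illustration, not any EP's actual process.

```python
# Rough sketch of the back-room loop described above (steps 1-6).
# Category names, a five-judge panel, and a hand-written grand total on each
# sheet are assumptions for illustration only.

PANEL_SIZE = 5  # the post mentions five score sheets per team


def add_sheet(category_scores: dict[str, float]) -> float:
    """Step 3: add up one judge's category scores so the judge doesn't have to."""
    return sum(category_scores.values())


def process_envelope(team: str, sheets: list[dict[str, float]],
                     written_totals: list[float], master: dict[str, float]) -> None:
    """Steps 3-5: add each sheet, double-check it against the total written on
    the sheet, then record the team's combined score on the master sheet."""
    if len(sheets) != PANEL_SIZE:
        raise ValueError(f"{team}: expected {PANEL_SIZE} sheets, got {len(sheets)}")
    totals = []
    for sheet, written in zip(sheets, written_totals):
        total = add_sheet(sheet)
        if abs(total - written) > 1e-9:  # the "double check"
            raise ValueError(f"{team}: sheet adds to {total}, but {written} was written on it")
        totals.append(total)
    master[team] = sum(totals)  # step 4: record on the master sheet


# Step 6: repeat for every envelope the runner brings back.
master: dict[str, float] = {}
process_envelope("Team A",
                 [{"stunts": 18.0, "tumbling": 17.5, "overall": 42.0}] * PANEL_SIZE,
                 [77.5] * PANEL_SIZE,
                 master)
print(master)  # {'Team A': 387.5}
```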

As said before, a universal score sheet is necessary. Also, instead of having the judging panel sit in the arena and watch live, the routine could be filmed and fed to a TV in a back room where the judges watch and score. The judges could pause and rewind the video to make sure everything is seen and caught, and since the room isn't ridiculously loud, they could talk to each other and discuss the routine and which deductions should or shouldn't be taken.
 
I believe the suggestion isn't to review and evaluate the judges' scores right there at the competition, but later, from a video tape, just to see whether there were any major judging mistakes and to grade the judges on their skills. I'd say that would work, and the most consistent judges would be asked to come back.

I'm still confused about how there could be 15 erased scores on one team's score sheets from a single day. There were three sheets total, each from a different judge, yet 15 erasures in all; that still confuses me. I've seen maybe one or two in the past, but never that many.
 
Is a universal score sheet in the works for the near future? There is so much talk about it that it seems like it would be a big priority. It's frustrating all around for judges, coaches, parents, etc., trying to understand all the differences between the sheets. I'm sure the EPs get questioned all the time, and this might relieve some headaches for them too.
 

I think the suggestion was to review the quality of the judging after the event is completed. The only difference from the process described earlier in the thread would be to photocopy or scan the scoresheets at some point before they are handed out to the teams. I have no idea whether EPs keep copies of the scoresheets now, but my guess is that most don't. I think that speaks volumes about their level of concern for the quality of the judging.

Note to coaches: always double-check the math on your scoresheets when you get them. As I have said many times before, you would be astonished at how often we find mistakes. Scores outside the theoretical range, blank boxes, and flat-out addition errors are painfully common.

SIDE NOTE: How is it possible that the scores are often not entered into a computer or spreadsheet when they are being calculated? It seems so easy to set this up so that mistakes would be much less common.
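To make that note to coaches concrete, here is a hedged sketch of the three checks a coach (or an EP's spreadsheet) could run on a returned sheet: out-of-range scores, blank boxes, and addition errors. The category names and point ranges are placeholders, not any real scoresheet.

```python
# Sketch of the three checks mentioned above: out-of-range scores, blank
# boxes, and addition errors. The category ranges are placeholders.

RANGES = {"stunts": (0, 20), "tumbling": (0, 20), "pyramid": (0, 20),
          "jumps": (0, 10), "dance": (0, 10), "overall": (0, 20)}


def check_sheet(sheet: dict[str, float | None], reported_total: float) -> list[str]:
    """Return every problem found on one judge's sheet."""
    problems = []
    for category, (low, high) in RANGES.items():
        score = sheet.get(category)
        if score is None:
            problems.append(f"blank box: {category}")
        elif not low <= score <= high:
            problems.append(f"out of theoretical range: {category} = {score}")
    actual = sum(s for s in sheet.values() if s is not None)
    if abs(actual - reported_total) > 1e-9:
        problems.append(f"addition error: sheet adds to {actual}, total reads {reported_total}")
    return problems


# Example: one blank box plus a one-point addition slip.
print(check_sheet({"stunts": 18, "tumbling": 17, "pyramid": 19,
                   "jumps": 9, "dance": None, "overall": 18}, 82))
```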
 

I think the problem is all these different ranges that aren't based on the decimal system. Think about it: when you go to a doctor and they want to know your pain level, what do they ask? On a scale of 1 to 10, how bad does it hurt?

When guys are judging someone's attractiveness (pigs that we are), what do we ask? On a scale of 1 to 10, how hot is she?

When you try to rank something or someone without a set scoring method, you automatically do it in base 10. Can you imagine if the doctor said, "On a scale of 1 to 35 (because it's an international doctor), can you grade your pain?" That's just such a foreign concept.

People make mistakes entering scores because the ranges and sizes are all different and done differently every week. From working with large warehouse buyers who have moved parts of their job from paper to computer, I know you can't get rid of mistakes (until we get rid of paper altogether), but there are plenty of things you can do to reduce transcription errors. (Oh, and PS: no human should EVER do the tabulation. Humans should only do judging and transcription.)
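The base-10 point can be made concrete: whatever raw range an EP uses, a score can be rescaled onto a 0-10 scale before anyone has to read it. A small sketch; the second event's 458-point maximum is made up purely to show how a 95 one week and a 435 the next could describe the same performance.

```python
# Sketch of the "put everything on a 10 scale" idea: rescale a raw score from
# whatever range an event producer uses onto 0-10. The maximum possible
# scores below are invented for illustration.

def to_base_10(raw: float, max_possible: float) -> float:
    """Rescale a raw score onto a 0-10 scale, assuming 0 is the floor."""
    if not 0 <= raw <= max_possible:
        raise ValueError(f"score {raw} is outside 0..{max_possible}")
    return round(10 * raw / max_possible, 2)

# The thread's example: a 95 and a 435 could be the same performance if the
# two sheets simply top out at different totals.
print(to_base_10(95, 100))   # 9.5  (a sheet out of 100)
print(to_base_10(435, 458))  # 9.5  (a sheet that tops out at 458 -- made up)
```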
 
In Australia we go on a scale of 10 to 1 because, y'know, we're all upside down and our toilets flush the opposite way ;)

Is it more feasible to institute a system based on scores out of 10, with EPs having the flexibility to weight categories by what they think is more important (stunts, tumbling, etc.), or to keep fighting for a universal score sheet? (Not that the former would mean abandoning a universal score sheet as the ultimate goal; it could be a compromise along the way.) A standard 10-point system would also make it possible to compare judges and relative scores across events, regardless of which skills each event favours.
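A hedged sketch of that compromise, with invented category names and weights: every category is judged on the same 0-10 scale, and each EP applies its own multipliers, so the raw 0-10 scores stay comparable across events even though the weighted totals differ.

```python
# Sketch of the compromise above: common 0-10 category scores, EP-specific
# weights. Category names and weights are invented for illustration.

RAW_SCORES = {"stunts": 8.5, "tumbling": 9.0, "pyramid": 8.0, "dance": 7.5}

EP_WEIGHTS = {"stunts": 3.0, "tumbling": 2.5, "pyramid": 2.0, "dance": 1.0}


def weighted_total(raw: dict[str, float], weights: dict[str, float]) -> float:
    """Multiply each 0-10 category score by this EP's weight and sum."""
    return sum(raw[category] * weights[category] for category in raw)


print(weighted_total(RAW_SCORES, EP_WEIGHTS))  # 71.5 under these made-up weights
```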
 
Why can't we just make it simple and have 10 sections on the scoresheet, each worth 10 points? Or if we really can't narrow it down to ten, cut some of the categories in half and do something like 8 categories of 10 points and 4 categories of 5 points. Then you make a rubric that says this set of skills scores within this range, and if those skills are performed poorly, they may be moved down a scoring range.
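Either split works arithmetically (10 × 10 = 100, and 8 × 10 + 4 × 5 = 100 as well). Here is a rough sketch of the rubric part, with invented skill tiers and score bands: a skill set maps to a range out of 10, and poor execution drops it to the band below.

```python
# Sketch of the rubric idea: each skill tier maps to a score band out of 10,
# and poor execution drops the routine to the next band down. The tier names
# and bands are invented for illustration.

RUBRIC = [  # (skill tier, score band out of 10)
    ("elite",        (9.0, 10.0)),
    ("advanced",     (7.0, 8.9)),
    ("intermediate", (5.0, 6.9)),
    ("basic",        (3.0, 4.9)),
]


def score_band(tier: str, performed_poorly: bool) -> tuple[float, float]:
    """Look up the band for a skill tier; poor execution moves it down one band."""
    index = [name for name, _ in RUBRIC].index(tier)
    if performed_poorly and index + 1 < len(RUBRIC):
        index += 1  # moved down a scoring range
    return RUBRIC[index][1]


print(score_band("advanced", performed_poorly=False))  # (7.0, 8.9)
print(score_band("advanced", performed_poorly=True))   # (5.0, 6.9)
```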
 

I know that at UCA competitions (where judges don't score on computers), all of the scores are kept in a spreadsheet and are added both by a human and by the computer.

Also, I believe the computer system used by Varsity doesn't allow you to score out of range, and it can flag judges who are scoring out of character for a certain team in a level, based on their other scores.
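One rough guess at how an "out of character" flag could work (this is only an assumption, not how Varsity's actual system does it): compare each judge's score for a team with the panel median and flag large gaps.

```python
# Assumed sketch of an "out of character" flag, not Varsity's real logic:
# flag any judge whose score sits far from the panel median for this team.

from statistics import median


def flag_out_of_character(panel_scores: dict[str, float],
                          tolerance: float = 5.0) -> list[str]:
    """Flag judges whose score is more than `tolerance` points from the
    panel median for this team."""
    mid = median(panel_scores.values())
    return [f"{judge}: scored {score}, panel median {mid}"
            for judge, score in panel_scores.items()
            if abs(score - mid) > tolerance]


print(flag_out_of_character({"Judge A": 88.0, "Judge B": 86.0, "Judge C": 74.0}))
# ['Judge C: scored 74.0, panel median 86.0']
```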
 
It also seems like it would help to have judges input scores directly into a tablet or smartphone, eliminating the paper and making tabulation "live."

And I think it would be a great idea to have judges "reviewed and scored," and, building on that, even to make those scores a requirement for moving up judging levels (be it levels of cheer, or event levels like regional, state, and national).
 