All-Star Competition Order Affecting Scores?


Not positive, but this sounds a lot like figure skating scoring, right? Multiple judges per criteria section, and the high and low scores get tossed.
I'm not sure, but if that's how figure skating gets scored then yes. I was a diver in high school and college. State competitions had a large panel (7 judges) with high and low score eliminated. Diving of course only judges one dive at a time. With cheer having so many different elements, I feel like there's too much to miss if you don't divide it up into categories.
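The drop-high-and-low panel described above is easy to sketch. This is just an illustration of the mechanism, not any actual rulebook; the judge count and score scale are made up.

```python
# Sketch of "drop high and low" panel scoring, as in the diving example
# above: 7 judges, highest and lowest scores discarded, the rest summed.
# (Judge count and 0-10 scale are illustrative, not from any rulebook.)

def panel_score(scores):
    """Drop the single highest and lowest score, return the sum of the rest."""
    if len(scores) < 3:
        raise ValueError("need at least 3 judges to drop high and low")
    trimmed = sorted(scores)[1:-1]
    return sum(trimmed)

print(panel_score([8.5, 9.0, 7.5, 8.0, 9.5, 8.5, 8.0]))  # 42.0
```

The appeal is that one judge having an off moment (or a favorite) can only remove themselves from the total, not drag it.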
 
Still not positive that's how figure skating is judged, but it does sound familiar to me. Quite the interesting proposition for cheer. I can imagine smaller EPs might have lots of reservations about doing this, but I'm not sure it would be too difficult for, say, NCA or Cheersport to experiment with. It would certainly help eliminate the subconscious favoritism factor.
 
Considering gymnastics, a similarly subjective sport, has three judges evaluating one person doing one routine, it really is ridiculous to suggest that six or so judges can accurately evaluate a 2.5 minute routine with so much included.
 
I judge high school comps and we usually try to delegate which portion of the scoresheet to FOCUS on. So I would mainly count tumblers, judge 2 would mainly look at stunting difficulty and numbers, and judge 3 would watch for jump and motion difficulty. The unfortunate thing is that we still have to give scores for all of the categories. So if I can't remember whether the team did a double toe BHS or a triple toe BHS, I can ask judge 3 and she'll tell me.

This works, but I think it would be better if there were literally 2 judges for each category on the scoresheet: 2 for standing tumbling, 2 for running, 2 for baskets and pyramids, 2 for stunts, 2 for choreography/dance/jumps. I know EPs are going to say they can't pay that many, but they already usually pay at least 7 (2 panels: 3 per panel plus 1 safety judge). So what's 3 more? These 10 wouldn't have to switch on and off panels because they'd each only be judging 1 category and wouldn't have to score and comment on so many categories; that is what takes so long in between teams and why most comps have 2 panels. I think we would have a lot more accuracy in the sport if we did something like this.
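The two-judges-per-category proposal above can be sketched in a few lines. The category names come from the post; the pairing, averaging, and point values are my own assumptions for illustration.

```python
# Sketch of the two-judges-per-category idea: each category is scored by
# its own pair of judges, the pair is averaged, and the category averages
# are summed into a team total. Scores and pairings are invented.

CATEGORIES = ["standing tumbling", "running tumbling", "baskets/pyramids",
              "stunts", "choreography/dance/jumps"]

def team_total(scores_by_category):
    """scores_by_category maps category -> (judge_a_score, judge_b_score)."""
    total = 0.0
    for cat in CATEGORIES:
        a, b = scores_by_category[cat]
        total += (a + b) / 2  # average the dedicated pair for this category
    return total

sample = {
    "standing tumbling": (9.0, 8.0),
    "running tumbling": (8.5, 8.5),
    "baskets/pyramids": (9.0, 9.0),
    "stunts": (8.0, 9.0),
    "choreography/dance/jumps": (9.5, 8.5),
}
print(team_total(sample))  # 43.5
```

Since each pair only watches one thing, nobody has to reconstruct a skill from memory after the routine ends, which is exactly the failure mode the post describes.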
 
In our 15 years in competitive cheer it honestly seems performance order does not affect score. I do know that our gym prefers to go last or as close to last as possible. It seems to be the "sweet spot" for us lol
 
First of all, I'm really enjoying this thread because a few of my teammates and I had this conversation the other day. We were the 2nd team in our division to compete at Worlds last year. It was super early in the morning, because we only had an At Large bid. I'm pretty confident in saying we would've been scored differently had we competed later in the day and/or been attending with a Paid Bid.
 
Great discussion. I started it because I saw a team say something to the effect that they got a good draw in performance order. So I was curious to know what others' thoughts were. I believe judges do their best, but I also think it would be great if we had some data to actually study. I mean, isn't this also somewhat of an argument against ranking teams based on "high scores" from different competitions/EPs? Maybe Sally's SuperStars competed in Wonkaville and got a score of 99/100. Then Lulu's Ladies competed in Whoville and got a 95. Ranking them against each other really doesn't do a whole lot to determine who really had the best routine. Agree? Disagree?
 
Agree, comparing scores between events is impossible. That is why when US Finals does their final ranking they have a new set of judges do a video review of the top teams from each venue to determine the final ranking.
 
From judging for multiple companies: it really depends which scoresheet you're talking about. I think certain categories don't need to be based on other teams to score accurately. If a judge knows the level rules well, then you compare to the max allowed in the level. With that being said, there isn't a large range of scores for any producer. Take Varsity and JAMfest: you get into a range by performing a certain quantity of skills that are level-appropriate. Once you get into that top range, a judge can only vary the score by 0.4 points on both of those scoresheets. Once you're working within that 0.4 range and you have a team doing basic tick-tocks and a full-up lib, you know they could do a lot more, like double-ups, etc.

For other scores like creativity, it is almost impossible to not compare teams to others since there is no benchmark for creativity. When looking at creativity though, it is assumed that a judge has up to date knowledge of what is happening in the industry so you are able to compare without seeing other teams that day.

So all in all I don't think order of appearance has anything to do with rankings on scoresheets with strict rubrics.

Sent from my Galaxy S III
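The rubric-range scoring the judge describes can be sketched roughly like this. The 0.4-point band width comes from the post; the range floor, skill counts, and the linear mapping are invented assumptions, not any real scoresheet.

```python
# Sketch of rubric-range scoring: a level-appropriate quantity of skills
# places a team in a range, and the judge then positions them within a
# fixed 0.4-point band (the width mentioned in the post). The range
# floor (4.6) and max skill count are invented for illustration.

def difficulty_score(level_legal_skills, range_min=4.6, band=0.4, max_skills=10):
    """Map a count of level-appropriate skills onto range_min..range_min+band."""
    fraction = min(level_legal_skills, max_skills) / max_skills
    return round(range_min + band * fraction, 2)

print(difficulty_score(5))   # 4.8 (mid-band)
print(difficulty_score(10))  # 5.0 (top of the range)
```

The point being argued above is that with so narrow a band, the judge's placement is anchored by the rubric rather than by the teams that happened to perform earlier in the day.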
 
Well, I'm not picking on JAMfest, but doesn't Jambrand release that high-score ranking sheet each month? That's more what I was talking about. And don't get me wrong, the scores are great and obviously reflect a solid performance, but if it's just score A compared to score B, then it really isn't a tool for comparison.
 
I don't mind going first as long as it is alternating divisions (panel A judges level 1, then we go on in level 2 judged by panel B)

The only thing I dislike is going on after an amazing team that there's no way we can match (think a coed level 5 goes on then my Sr 1--no way we are going to be as exciting).

Other than that, I'm good to go. We will either score well or we won't depending on our routine. I can't predict the judges personal opinions nor rearrange the line up to my liking so I try not to sweat the stuff I can't control.
 
No. At least I don't, and I have never been instructed to do so. Normally there is a range in which you score specific skills and execution, so it's pretty clear what scores they should be getting. =)
I don't think people are saying that judges are told to hold back scores, or that judges know that they're doing this, more that they may subconsciously do it, not to hurt the teams going at different times, but just because they're humans and they make mistakes! :)

That being said, I think someone made a good point about the best team still winning. Assuming there was no favoritism and scores were just held back, the best team should still score the highest out of everyone; even if it's not a perfect score, or all the scores are consistently lower than expected, the ranking should (hypothetically) be the same. However, comparing results between divisions is where it probably becomes inaccurate: if they were holding back in Division A but not really holding back in Division B, Division B's scores will obviously be higher.
*that's assuming there's no favoritism and only the holding back of scores. Which is probably not realistic.
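The hold-back point above is easy to check with toy numbers: a uniform hold-back can't change the order within a division, but it does break raw-score comparisons across divisions. All numbers here are invented.

```python
# Illustration of the point above: if judges "hold back" by a uniform
# amount, rankings *within* a division survive, but comparing raw scores
# *across* divisions breaks. All scores are made up.

division_a_true = {"Team 1": 95.0, "Team 2": 93.0, "Team 3": 90.0}
held_back = {t: s - 3.0 for t, s in division_a_true.items()}  # uniform hold-back

def rank(scores):
    """Teams ordered best-to-worst by score."""
    return sorted(scores, key=scores.get, reverse=True)

print(rank(division_a_true) == rank(held_back))  # True: order unchanged

division_b = {"Team 4": 94.0}  # a division scored with no hold-back
# Team 2 (true 93, posted 90) now sits below Team 4's 94 on paper, even
# though its underlying performance scored only one point lower.
print(held_back["Team 2"] < division_b["Team 4"])  # True
```

Which is exactly why within-division placements can be trustworthy while cross-division (or cross-event) score comparisons aren't.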
I love the idea of playback w/ no sound. I feel like that would make the scores SO much more accurate. And playback for #s (difficulty) because I know I'm not a judge and I'm not trained, but I know there's no way I would be able to count while watching a routine and keep track of the numbers and judge overall impression at the same time. I would either be focused on #s or on impression... but maybe it's just because I'm not a judge?
 
Total honesty here, and I have no proof that there's any truth to it, but I have always believed your best chance for success is to compete right after a team that isn't as good as yours, basically because that's the last team the judges saw before giving your team scores. So if everything we do is superior to them, the judges will be more impressed by our routine. On the contrary, if we compete immediately after an equally skilled team, our scores are more likely to end up right where theirs are. Bottom line for me as a coach: I NEVER want to go first and I NEVER want to go right after a really good team. Other than that, I don't care where we are in the lineup.

This is kind of ramble-ly...so apologies in advance.

When I was in gymnastics, our coaches would kind of let us determine our order at meets, but they always had a "guideline" for each event. They would pick the first and last person for each event, then take the remaining girls and say, "you girls fill these spots (insert numbers), you girls fill (insert numbers)," and so on. One time they put me last for bars, and I really didn't want to go last, so I asked why I couldn't go sooner. They told me that as they came up with orders, they wanted a gentle progression from worst to best on each apparatus. That way the routines (it was still level 6, so it was compulsory) would look slightly better with each performance, which could work in favor of those last few girls: it would show more improvement to the judges, and they might subconsciously think about how it WAS better and score higher.
 
The bottom line is, cheerleading is judged by people, not a stopwatch, a scoreboard, or a machine. Mistakes happen. Some things can be missed, and sometimes personal opinions can affect judging. It is a hard job watching hours of routines. We have all been on the good side of judging and the bad. No matter what, we are all involved in the sport because we love watching our children compete, or, as a competitor, love those 2 1/2 minutes to show off what we have worked for after many hours of hard work in the gym.
 
They do release that every month but that is comparing Team A's score from a comp in Chicago to Team B's score from a comp in Virginia. Same guidelines but different judges.
 