All-Star Scores Vs. Number Of Skills - NCA Nationals


My faith in judges is limited regardless of training. I don't know how one is expected to count 75% x2 AND judge technique at the same time. I'm all in favor of start-value submission, video scoring for difficulty, and live judging of technique.


 
I give props to NCA for releasing the category data. Was all of the judging perfect? No. Was it more accurate than any other competition you go to all season? Almost certainly.

I can't even imagine how many mistakes we would find at those competitions that choose to hide that data.

I would say that until they have a better way of judging, it is in their best interest to hide the data. So if we can find them a more accurate way of judging, I think people would have less of an issue with them hiding the data.
 

Well, I know getting that data, even if flawed, is a huge plus for us when choosing our events. We are typically very frustrated when we leave an event having no idea why we won/lost and rarely return if there is another realistic option. You will notice we didn't take any of our cheer teams to Atlanta this year almost exclusively for this reason.

I completely agree that the parts of scoring that are theoretically objective should be done in a way to get it right.
 

I think you tend to be the exception rather than the rule, even though many people would benefit from actually knowing what is going on. The 'magic' of cheerleading won't disappear just because we quantify parts of it.

But I do think for a lot of people ignorance is bliss.

PS - Atlanta went a lot better this year with the new scoring system.
 
Though I like the idea of a submitted form, I would rather the judges pay attention to what is happening on the floor than try to read and compare against what the gym submitted.
 
Well, I know getting that data, even if flawed, is a huge plus for us when choosing our events. We are typically very frustrated when we leave an event having no idea why we won/lost and rarely return if there is another realistic option. You will notice we didn't take any of our cheer teams to Atlanta this year almost exclusively for this reason.

I completely agree that the parts of scoring that are theoretically objective should be done in a way to get it right.

I think it is even more frustrating to see the category breakdowns and see competitors with half your skill level equal you or even outscore you. It's one thing to not know why you lost, and quite another to know you lost because the judges overscored your competitor and there's not a darn thing you can do about it. Again, I am a huge fan of this new Varsity scoring grid. I just want our judges to be able to get it right, and I don't believe they are capable with the way things are right now. As they say, close only counts in horseshoes and hand grenades. NCA might be the closest to getting it right, but unfortunately they still didn't...
 

If you won, and you feel this passionate, how must those who lost and believe they were mis-scored, in some areas, feel? I can only imagine their disappointment. :(

Wasn't Dartfish originally supposed to be used as some sort of an instant replay system when it was introduced way back when?
 
Though I like the idea of a submitted form, I would rather the judges pay attention to what is happening on the floor than try to read and compare against what the gym submitted.
I disagree. If I'm judging tumbling and I can quickly review what's coming up before that team takes the floor, I don't have to split my attention between trying to weed out fakers, counting skills on the fly, etc. I can focus on technique and synchro, and if I need to verify numbers I can do that from the replay. I think having a "layout" (no pun intended) of what I'm about to see can only help. As a judge I know where those things will be on the floor anyway: they'll be the kids at the front of the mat in synchro tumbling, the first and/or last passes, etc.

I also think that takes the "most" language out of the ranges. It's only there now because judges can't count them all. If the routine has to be turned in, you can be specific: high range MUST be 75%, and that is a specific number based on your team size.
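Just to show how specific that number gets, here's a rough Python sketch of the arithmetic (the function name is mine, and the 75% figure is simply the one from the range language above):

import math

def high_range_threshold(team_size: int, required_fraction: float = 0.75) -> int:
    # Smallest whole number of athletes that still satisfies the required fraction.
    return math.ceil(required_fraction * team_size)

print(high_range_threshold(20))  # a 20-athlete team needs 15
print(high_range_threshold(36))  # a 36-athlete team needs 27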

As far as cheating on the routine submission, create a deduction for omissions. If I say I have 27 running tucks on J3 and I really only have 22, I'm going to get deducted for the 5 kids that "didn't throw them" (because they don't throw one in the routine and I lied). That will also make me think twice about putting a kid in a skill they may not be ready for.
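Something like this, roughly (the deduction value per omission is invented purely for illustration; it isn't on any real score sheet):

DEDUCTION_PER_OMISSION = 0.25  # hypothetical points per athlete short of the submission

def omission_deduction(submitted: dict[str, int], counted: dict[str, int]) -> float:
    # Total deduction for skills listed on the form but not shown on the floor.
    total = 0.0
    for skill, claimed in submitted.items():
        shown = counted.get(skill, 0)
        if shown < claimed:
            total += (claimed - shown) * DEDUCTION_PER_OMISSION
    return total

print(omission_deduction({"running tuck": 27}, {"running tuck": 22}))  # 5 omissions -> 1.25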

I think about diving and gymnastics... even figure skating. It's not a surprise. The judges know exactly what the athletes are about to throw and in what order, so they can focus on technique, etc.
 
In all my score-sheeting experience, I would say the majority, maybe 80%, of the times I ever felt there was a scoring issue came from not being rewarded correctly for difficulty. I think in general a standard cheer mom can decently reward execution. Even to the highly untrained eye, lines and timing are easy to see overall. And when you give that .8 for execution (because it is just so darn pretty), you are most likely going to give a .7-.9 for difficulty.

Why? It is very mentally difficult for a judge to give two very separate scores, execution and difficulty, that are very different from each other. That is not to say someone can't, but I bet if you were to take a look at difficulty and execution scores, the variation between them would be around 25% at most. I believe a strong execution will pull a difficulty score up, and a weak one will pull it down.

How do I know? Because I have banked on it many times from day 1 to day 2, and it works. When we hit harder on day 2 after a bad day 1, our difficulty score goes up. This is a matter of stepping outside of yourself and into the judge's seat: not only knowing what the scoresheet and rubric do, but the mechanics of scoring in general and the psychology of it all. It is flipping ridiculous that someone needs to know and understand these things to be competitive, and it is some of the reason people get so frustrated.
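If anyone ever got their hands on the raw category data, checking that claim would be simple enough. A rough Python sketch with made-up numbers (the scores below are invented purely for illustration):

from statistics import correlation  # available in Python 3.10+

execution = [7.8, 8.4, 9.1, 8.9, 7.2, 9.5]   # hypothetical execution scores for six teams
difficulty = [7.5, 8.2, 9.0, 9.2, 7.0, 9.4]  # hypothetical difficulty scores for the same teams

# A correlation near 1.0 would suggest the two categories are not being scored independently.
print(round(correlation(execution, difficulty), 2))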

The difficulty judge and the execution judge need to have no idea what the other scored, because it doesn't matter to them. The execution and presentation judge scores live: no counting skills, just watching and placing a percentage of perfection on execution, as well as scoring creativity. Nothing else.

The difficulty judge watches on video, with a small form that lays out the routine that will be competed. I HIGHLY disagree with a form that states how many people are doing each skill and deducting if the coach is off; it will add time. Just count as you go. I have tried it, for fun, with a couple of routine videos from last year, and I can do it consistently in under 10 minutes.

If you ever feel like you were scored incorrectly on difficulty, you can pay $200 to have your difficulty judged again by a new judge. If, on the parts you disagree with, the new judge gives you a score more than .1 different from the old judge's, you get your $200 back. If not, they keep your $200 and your old score stands. Execution / live scores are never debatable. The accuracy of a judge can be brought into question for future events, but that score is never debatable. And with each judge focusing on one area, accuracy should go way up.
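In code form the rule is tiny. A minimal sketch, assuming the .1 gap and the $200 fee above (the function name is mine, and the post doesn't say whether the re-judged score replaces the original after a refund):

APPEAL_FEE = 200.00
REFUND_THRESHOLD = 0.1  # the ".1 different" cutoff described above

def appeal_refunded(original_score: float, rejudged_score: float) -> bool:
    # The gym gets its fee back only if the new difficulty score moved by more than the cutoff.
    return abs(rejudged_score - original_score) > REFUND_THRESHOLD

print(appeal_refunded(8.7, 9.0))   # True  -> refund the $200
print(appeal_refunded(8.7, 8.75))  # False -> they keep the $200 and the old score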

EPs and judges are already trying this concept of focused judging, just in the wrong combination. They assign a tumbling judge and a building judge, but those judges are still multitasking within their categories: instead of giving 100% of their brain to whether the skills were executed well, they give 50% of their focus to execution and 50% to difficulty. That decreases how well any person can do either.

So: one person doing difficulty by video. Two people doing execution live, with those scores averaged. One person doing deductions live with a clicker. One person doing safety with a clicker. If the deduction person or the safety person sees one, they click their button, and the 5 seconds of video before the click and the 5 seconds after are wrapped up for someone in the back to review. This technology is very easy to create with Dartfish (I talked to their guy for quite a while). This will speed up the process, similar to an assembly line with job specialization.
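For what it's worth, the clicker-to-replay piece is simple to mock up. A rough sketch in generic Python, not the actual Dartfish API (the class and function names are mine):

from dataclasses import dataclass

@dataclass
class ReviewClip:
    judge_role: str   # "deduction" or "safety"
    start_sec: float
    end_sec: float

def clip_for_click(judge_role: str, click_time_sec: float, pad_sec: float = 5.0) -> ReviewClip:
    # Wrap up the 5 seconds before and after a click for the reviewer in the back.
    return ReviewClip(judge_role, max(0.0, click_time_sec - pad_sec), click_time_sec + pad_sec)

def final_execution(score_a: float, score_b: float) -> float:
    # Two live execution judges, averaged, as described above.
    return (score_a + score_b) / 2

print(clip_for_click("safety", 83.0))   # ReviewClip(judge_role='safety', start_sec=78.0, end_sec=88.0)
print(final_execution(8.6, 8.9))        # 8.75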
 