ES Training Tasks -- Tough enough?

Started by RiverAux, September 29, 2007, 05:25:14 PM


RiverAux

I think everyone will recognize the major step forward that CAP made when it adopted the current task-based training along with the associated task guides (based on those developed by MD Wing) and PowerPoint presentations.

I think that for our primary ES positions (ground team and aircrew) they're actually pretty good, and that someone who is trained and tested as intended can probably be expected to do a pretty good job.

For the base staff I think they've probably still got a ways to go, but I suspect that as NIMS gets ramped up we'll probably discard the CAP-specific program we've got now and just adopt the national standards.

Now, I don't want to get into another NIMS/NASAR discussion and actually want to return to the aircrew and ground team training program. Do you believe that the tasks are adequate as is? Too much? Not tough enough? What specific tasks need to be added, deleted, toughened up, or loosened up?

smj58501

For mission staff training, I feel there need to be some task equivalency matrices developed for external training received.

A good example of this includes the Inland SAR Planning Course. This could be closely looked at as a signoff for Planning Section Chief, and perhaps others. Another to consider is successful completion of the FEMA-standardized ICS 300 and 400 courses for IC and other mission base tasks.

The "eaches" would need to be researched and debated, but I think the overall concept of task equivalency for attending standardized courses like Inland SAR and ICS 300/400 should be implemented.
Sean M. Johnson
Lt Col, CAP
Chief of Staff
ND Wing CAP

floridacyclist

I think that while the standards could be beefed up at least a little, the much bigger problem is getting folks to evaluate properly. Seldom do you see folks actually going over the P/F tests at the end of each task qualification section; rather, they usually seem to satisfy themselves with asking a couple of questions or perhaps having knowledge (either firsthand, secondhand, or rumored) that a person performed a job (sat, unsat, or other) on a practice mission. Does it really matter what the standards are upgraded to if the current ones aren't being adequately evaluated?

Gene Floyd, Capt CAP
Wearer of many hats, master of none (but senior-rated in two)
www.tallahasseecap.org
www.rideforfatherhood.org

RiverAux

Since the SAR Planning Course is not offered very often, it would really make it difficult to get new higher-level staff qualified. Most people would need to travel out of state, and sometimes REALLY far out of state, to take it, and I think that's asking just a bit much.

Pencil-whipping is an issue, but for the purpose of this thread let's assume that most CAP members want to do it right and follow the rules.

SDF_Specialist

Quote from: RiverAux on September 29, 2007, 06:02:25 PM
Pencil-whipping is an issue, but for the purpose of this thread let's assume that most CAP members want to do it right and follow the rules.

This is true. At my first SAREX, I was training for GTM3. After that day, the system showed me as a qualified GTM3. I asked everyone what to do and who to contact. I never bothered putting the badge on until the following SAREX, where I was able to complete the remaining tasks on the SQTR. After that, I showed this to my unit commander, who authorized me to wear the badge.
SDF_Specialist

smj58501

Quote from: RiverAux on September 29, 2007, 06:02:25 PM
Since the SAR Planning Course is not offered very often, it would really make it difficult to get new higher-level staff qualified. Most people would need to travel out of state, and sometimes REALLY far out of state, to take it, and I think that's asking just a bit much.

Pencil-whipping is an issue, but for the purpose of this thread let's assume that most CAP members want to do it right and follow the rules.

I don't disagree that the offerings are not what they should be. I do not advocate replacing the current system. I am suggesting that we simply give credit for attending Inland SAR.

You are correct... oftentimes attending Inland SAR is above and beyond expectations for a volunteer. That's all the more reason to find a way to recognize those members who make the extra effort to attend. One way to do that is awarding them equivalent credit for a specialty.

floridacyclist

I'm not referring to totally skipping the sign-off, but ignoring the standards or making up your own.

"So Cadet, what was your job on this UDF training mission?"

"Radio Operator sir"

"OK...we'll sign you off on the radio operations tasks of your SQTR. Did you keep a log?"

"Yes sir"

"OK....we'll make sure to list that too"

It just aggravates me when I walk up to a cadet, ask him a question from the end-of-task evaluation, and he is totally clueless, stating that he has never heard that question before.

SDF_Specialist

Quote from: floridacyclist on September 29, 2007, 06:21:06 PM
It just aggravates me when I walk up to a cadet, ask him a question from the end-of-task evaluation, and he is totally clueless, stating that he has never heard that question before.

There's another problem there. It's not that they don't want to participate; it's just that classroom training doesn't capture their interest as much as hands-on work would. That's why my major focus in developing an ES training plan for my unit is to reduce the classroom training and do more hands-on work. Of course, I will be explaining as we go what each task is, why it is important, and the proper way(s) to complete it. I've been through the same thing, Gene. I've trained even seniors for MRO, asked them about tasks, and they have no clue.

floridacyclist

My question is: if they don't know the answers to the questions, then how in the world did they get passed on the evaluation? It doesn't bother me if someone doesn't learn the material right away... you simply repeat and practice until they know it... THEN you allow them to sign off, not before.

I wonder: is this more a symptom of society's attitude that nobody should ever fail anything? Or of evaluators simply not knowing how to conduct a proper and thorough evaluation, including all of the test questions (or at the very least, similar ones covering the same standard of performance)?

SJFedor

It all depends on the quality of training.

For example, the Mission Observers that NESA graduates every year are extremely high quality and very proficient in their jobs and with air SAR in general. So, using that model, the tasks for those specialties are definitely quality.

Same tasks, same SQTR, but in a "less structured" training environment, the person might know how to get into the plane and mess with the CAP radio a little bit, and that's about as useful as they can be. It's all about how much education and practical experience they've had.

Steven Fedor, NREMT-P
Master Ambulance Driver
Former Capt, MP, MCPE, MO, MS, GTL, and various other 3-and-4 letter combinations
NESA MAS Instructor, 2008-2010 (#479)

floridacyclist

Exactly... and shoddy training should be caught and nipped in the bud by the evaluator, yet most folks make it through an evaluation session with flying colors whether they knew the material, had to be coaxed on each question, or weren't even fully tested on each task.

I think many SETs forget that evaluation is not training... if they don't know it, you fail them and send them back for more training; you don't correct them on the spot and sign off anyway. I had that drilled into my head back when FL had their Wing Authorized Trainer program and my oldest son and I spent two weekends in Orlando learning how the wing wanted sign-offs on MRO to be conducted.

arajca

Perhaps we need to re-evaluate the notion of signing off tasks at SAREXs/missions. I suggest running quarterly evaluation sessions where several stations are set up and the student has to go to each one to get signed off on a task, one per station. The purely lecture stuff can be done and signed off in class. Anything requiring demonstration of a skill is done at the eval session. This removes the temptation to pencil-whip folks through on a SAREX or mission. Schools such as NESA and the various wing/region-level ES schools would follow the same format, except the eval session is part of the school.

IIRC, a similar practice is used in the military during basic training.

floridacyclist

Yup...30 basic tasks. I still have my green book somewhere.

We do a similar thing at our weekend events. We usually hold them on 3-day weekends... the first day is classroom training, the second is a field exercise and playing with the things we've learned, and the last day is sign-offs. For our Gp 2 ES Academy, we're doing two monthly weekends of squadron-based training and then a large group-wide ground exercise and evaluation party where we'll have all the SETs together and can observe the evaluation process live and in person.

All we guarantee is that we will do our best to present the training. We do not guarantee that you will pass the evaluation.

wingnut

This year I flew over 140 hours for CAP; at least 70 of those have been on SAR. I have become disillusioned with the training process. It seems that TOO many people can sign off on an SQTR, everyone from UDF to Mission Pilots. I can't tell you how many missions I was on where several mission pilots did not have a gridded chart, or the observer did not have a clue how to operate the CAP radio. A lot of guys only fly when they are in a CAP plane and barely meet their required FAA minimums, much less fly a grid.

I think we need to evaluate the SAREXs and the actual missions themselves, each and every time, so we learn as a team. I am not throwing spears at anyone; I just flew an HLS mission with a large number of military pilots who had some of the same issues with a lack of training and/or experience.

Dragoon

There are a few issues I see with our training approach right now:

1. Evaluators who don't. (Evaluate, that is.) We need more than SET. We need a hands-on evaluation of the evaluator's ability to evaluate! (How's that for a mouthful?) Followed by a Wing CC's sign-off that the guy is allowed to evaluate. Adjust MIMS to require that Wing CC sign-off.

Basically, we have to have the same standards for evaluators as we do for check pilots. 

2. Some of the tasks are screwy. For example, scanners have to demonstrate operating the CAP radio, which they can't even reach from the back seat! Also, we've got huge amounts of worthless stuff in some of the task guides (does it really matter if scanners can explain all the ins and outs of icing?).

3. Too many of the tasks for aircrew (and some for staff) involve "discussing" rather than "demonstrating". Who gives a rat's patootie what you can "discuss"? It's the "demonstrate" stuff that matters.

4. The staff tasks are wayyyyy too vague. Lots of time and effort has been spent on the GT side, and a fair amount on aircrew. But I submit that one could pass all the listed tasks for Logistics Section Chief and still have zero idea of how to actually do the job on a real mission.

But on the whole, we're light years ahead of where we were in the 80s.

arajca

For the staff jobs, I suggest we use the ICS task books used in the fire service. Those are the basis for most of the tasks and sign-offs for the staff positions.

RiverAux

Quote
2. Some of the tasks are screwy. For example, scanners have to demonstrate operating the CAP radio, which they can't even reach from the back seat!
Depends on your airplane. The newer ones have push-to-talk switches in the back seat, and if you set the radio right the scanner can do all the radio work.

floridacyclist

That's actually how we train. Pilots fly, observers do almost everything else, and the scanner sits in the backseat and coordinates with the ground team, especially since he's usually the one keeping an eye on them.

Dragoon

Quote from: RiverAux on October 01, 2007, 10:16:46 PM
Quote
2. Some of the tasks are screwy. For example, scanners have to demonstrate operating the CAP radio, which they can't even reach from the back seat!
Depends on your airplane. The newer ones have push-to-talk switches in the back seat, and if you set the radio right the scanner can do all the radio work.

In that case, the task should cover nothing except the push-to-talk switch.

The current task involves operating every switch on the radio. Again, you can't do that from the back seat. No reason to burden scanners with it.

Sure, they need to know how to talk on the radio.  But that's an entirely different task....

floridacyclist

It is very conceivable that a Scanner might be using a handheld to communicate with the GT.

RiverAux

While the Scanner generally flies in the back seat during the active part of missions, it is not at all uncommon for them to fly in the front seat at other times (for example, flying to and from mission base and on other transport flights). Even though they aren't in that seat serving as an Observer, I think they should at least be able to operate the radio in case something happens to the pilot. Seems like a basic safety measure.

But then again, I'm the one who thinks that all non-pilot CAP aircrew members should take AOPA's Pinch-Hitter course and should receive at least 4 hours of in-flight training in emergency flight and landing skills.

isuhawkeye

^^^ Iowa put a group of Observers through just such a course during the August WTA.

Dragoon

Quote from: RiverAux on October 02, 2007, 10:11:45 PM
While the Scanner generally flies in the back seat during the active part of missions, it is not at all uncommon for them to fly in the front seat at other times (for example, flying to and from mission base and on other transport flights). Even though they aren't in that seat serving as an Observer, I think they should at least be able to operate the radio in case something happens to the pilot. Seems like a basic safety measure.

But then again, I'm the one who thinks that all non-pilot CAP aircrew members should take AOPA's Pinch-Hitter course and should receive at least 4 hours of in-flight training in emergency flight and landing skills.

Careful,

What you're basically arguing for is "scanners need to be trained to do what observers do, since we let them sit up front." After all, there are a lot of other things the right-seater is supposed to do besides operate a radio.

If we go down that route, there's no need for an MS rating. I think it's better to stick with "the MS sits in the back; if the MS sits up front, the MP handles the additional workload."




RiverAux

Not at all. Just pointing out that there are some issues that scanners need to be informed on. You brought up icing earlier; I certainly want the scanner aware that he should probably say something if he notices a big buildup of ice while he's looking out the window.

Dragoon

And I would want that as well.

If the aircrew tasks were designed from the practical point of view ("What do I want this guy to DO?"), they'd be a lot shorter.

Yup. I want the scanner to alert me if he notices visible signs of icing or notices the engine making funny noises. But that's it.

He does not need to know:

the details of freezing levels
the details of how icing affects aircraft performance (other than to say your plane gets heavier and may fall out of the sky)
the details of carburetor icing

This is NICE-to-know stuff, but not critical to sitting in the back seat and looking out the window.

Neither is it critical that he:

knows the ins and outs of weight and balance (just know they exist and that the pilot has to do it)
understands aircraft controls
understands aircraft instruments
knows the ins and outs of wake turbulence (beyond "tell me if the windsock shifts")
knows the details of how light and atmospheric conditions affect scanning (since he has no control over them, and isn't planning the flight)
knows the details of how visibility and turbulence affect a search (he'll figure it out quickly on his own)
knows how to keep a log (since the task book assigns that task to the observer, not the scanner)

This is all nice to know, not need to know.


If we cut this down to the bare basics and focused on the most important stuff, namely looking out the window and spotting things on the ground, we could train folks much faster and have them much more proficient at the critical tasks.

Just one guy's opinion.

floridacyclist

Don't forget that the Scanner should also be considered a future Observer trainee and should be at least vaguely familiar with the same things that Observers have to be proficient at. Approaching it with this in mind will make their Scanner time much more productive, as they will actually have a clue as to what is going on, and in the long run it will give you quicker- and better-trained Observers.

sardak

There are a number of interesting requirements in the task guides.

a. Task O-2003 - Grid a Sectional
First line of the training outline in the task guide: "As a Mission Observer trainee, knowing how to grid a sectional chart and use grids is essential in order to assist the mission pilot in planning a search, and to maintain situational awareness during a search."

This is a required task for Mission Pilot but not Observer or Scanner.

Performance measurement 1. Grid a sectional using the CAP grid system.
Has every mission pilot actually sat in front of an evaluator and gridded a sectional?

b. Task O-0204 - Locate a Point on a Map Using Latitude and Longitude
Required task for Scanner and Ground Team Leader but not UDF Team Member.

It seems that someone on a UDF team needs to know how to plot lat/lon, as there is no requirement to have a scanner or GTL along to plot SARSAT coordinates or the location of the transmitter after it's been located.

Mike

RiverAux

I agree on the lat/lon issue. I assume they overlooked it.

With gridding the map, I can see why you would need a pilot able to do that, since they've got to constantly buy new sectionals anyway. I would say that an Observer should need to be able to do it, but not a scanner.

RocketPropelled

Quote from: Dragoon on October 01, 2007, 05:53:20 PM
1. Evaluators who don't. (Evaluate, that is.) We need more than SET. We need a hands-on evaluation of the evaluator's ability to evaluate! (How's that for a mouthful?) Followed by a Wing CC's sign-off that the guy is allowed to evaluate. Adjust MIMS to require that Wing CC sign-off.

Basically, we have to have the same standards for evaluators as we do for check pilots. 

Hey, if you want to make the "evaluator" a super-limited set of people, be prepared for another training bottleneck.  I'm all for a higher standard of training -- but limiting evaluators and establishing the "approved cadre" is far more likely to put a strain on anyone you choose.

The entire training system is set up based upon the integrity of the evaluator and the trainee, no matter their additional qualifications.  I'm the ES training officer for our squadron, and I assure you, no one gets signed off under my pen before they're ready -- because I'll have to carry that person if they're not prepared to perform.  I've had some great trainers, and I've learned a lot from them.

NESA is a fine example of the textbook training being taught, applied, and evaluated in the field.  But at the squadron level (where most of the rubber meets the road, training-wise), you need to have the ability to perform evaluation and signoff for tasks.

In most wings, finding a check pilot to fly with you (in addition to finding an IP to do a fam flight, scheduling an airplane, etc.) is hard enough. Check pilots are vetted not only through the FAA for their skills and experience, but also through a high degree of CAP training and signoff -- all necessary when you're dealing with expensive aircraft and checkrides.

I don't see why you need the same super-specific approval methods when you're signing off a UDFT, a Mission Scanner, or a GTM3.  F91 rides are tough for most Mission Pilot trainees, and being approved as a GTL is seldom a pencil-whip, as far as I've seen.  Your wings may vary.

As a training guy, I don't like the idea of creating ANOTHER bottleneck in the system -- it's often hard enough to get tasks signed off and missions attended just to finish the SQTR.  Why complicate it by further limiting the signoff criteria?

Dragoon

Quote from: floridacyclist on October 03, 2007, 11:29:55 PM
Don't forget that the Scanner should also be considered a future Observer trainee and should be at least vaguely familiar with the same things that Observers have to be proficient at. Approaching it with this in mind will make their Scanner time much more productive, as they will actually have a clue as to what is going on, and in the long run it will give you quicker- and better-trained Observers.

I'd absolutely agree on the value of familiarization. But that's very different from requiring proficiency as measured by time-consuming tests.

The abundance of relatively low-value tasks has encouraged people to pencil-whip.

I'd also argue that the scanner, once qualified, will quickly become vaguely familiar with Observer duties simply by acting as part of an aircrew.  And if he's a little unclear on what a VSI does, I don't think it's a big risk to the mission.

Dragoon

Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM

Hey, if you want to make the "evaluator" a super-limited set of people, be prepared for another training bottleneck.  I'm all for a higher standard of training -- but limiting evaluators and establishing the "approved cadre" is far more likely to put a strain on anyone you choose.

Well, I think there's a bit of wiggle room between "wide open" and "super-limited."

It's a quality thing. Passing a multiple-choice test does not in any way, shape, or form make you qualified to conduct hands-on evaluation.

And having just learned something yourself doesn't make you qualified to evaluate others in that skill.

The current system allows a brand-new 13-year-old GTM3 who takes and passes the online SET exam to evaluate other GTM3s on their proficiency. This, of course, is ludicrous. Newbies at any skill set shouldn't be testing other newbies. And not everyone has the temperament and discipline to be a good evaluator.

It's all in how tight you make it. A wing could approve oodles and oodles of evaluators at the push of a button in an online system. But at least make someone send an email to Wing ES requesting the status. It gives Wing a chance to say no. You can still have lots of them; just don't have a wide-open free-for-all.

Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
The entire training system is set up based upon the integrity of the evaluator and the trainee, no matter their additional qualifications.  I'm the ES training officer for our squadron, and I assure you, no one gets signed off under my pen before they're ready -- because I'll have to carry that person if they're not prepared to perform.  I've had some great trainers, and I've learned a lot from them.

Unless you are fully qualified in all the specialties yourself, respectfully, you have no clue whether someone is "ready." And very few squadron ES officers have all those qualifications. You have to trust whoever did the evaluation, and that person may be someone from another unit whom you've never met. As an ES training officer, you need some proof that all the evaluators are good at evaluating, and that you can trust them.

You mentioned that the system assumes the integrity of the evaluator.  Doesn't it make sense to ensure at the very least that the potential evaluator HAS integrity before you start trusting it?


Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
NESA is a fine example of the textbook training being taught, applied, and evaluated in the field.  But at the squadron level (where most of the rubber meets the road, training-wise), you need to have the ability to perform evaluation and signoff for tasks.

Absolutely. Nothing in this proposal changes that. It just adds a little quality.

And remember, this is about evaluation, not teaching. Anyone can teach. But you need to be able to trust the skill and judgment of your evaluators.


Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
In most wings, finding a check pilot to fly with you (in addition to finding an IP to do a fam flight, scheduling an airplane, etc.) is hard enough. Check pilots are vetted not only through the FAA for their skills and experience, but also through a high degree of CAP training and signoff -- all necessary when you're dealing with expensive aircraft and checkrides.

I don't see why you need the same super-specific approval methods when you're signing off a UDFT, a Mission Scanner, or a GTM3.  F91 rides are tough for most Mission Pilot trainees, and being approved as a GTL is seldom a pencil-whip, as far as I've seen.  Your wings may vary.


You don't need the same standards.  But you probably need higher ones than "have the rating yourself and pass an online SET test."

Remember, F91s cannot be given by just any MP SET, only by folks appointed by Wing, because CAP has known for years that not every MP is a good judge of other MPs. You don't have to be a CFI to be an MP check pilot, but you do need to be an experienced MP whom Wing has some faith in.

In the same way, not every mission scanner should be evaluating scanners. You probably want to limit it to very experienced scanners, or even better, to Observers and MPs.

It's probably OK to limit GTM evaluation to GTLs, and GTL evaluation to a select group of very good GTLs.

The bar can be raised a bit without putting it through the ceiling. 

The return is that when one of your guys gets trained by someone you've never met, you'll have some idea that the guy actually tested your member to standard, and that the task was truly mastered.

SarDragon

Quote from: RiverAux on October 04, 2007, 02:58:15 PM
I agree on the lat/lon issue. I assume they overlooked it.

With gridding the map, I can see why you would need a pilot able to do that, since they've got to constantly buy new sectionals anyway. I would say that an Observer should need to be able to do it, but not a scanner.

Most of the folks I know use the same old gridded chart all the time for that part of their flying, and new ones for navigation. There's nothing that says the gridded chart must be current.
Dave Bowles
Maj, CAP
AT1, USN Retired
50 Year Member
Mitchell Award (unnumbered)
C/WO, CAP, Ret

ammotrucker

As ES training officer for my squadron, I think about the times when I was being signed off, and I believe that some of the things being floated in this thread are true.

I know that at one training event I was signed off, and I refused to add it to MIMS for the simple fact that I did not like the eval. So, as a member, I went to a different squadron and received the training much the way that I expected to. After that I added the tasks to my MIMS.

If you're looking just to get signed off, I couldn't care less how you receive it; most members do not accept actual missions. I care about the members who will be involved: that they get quality training and eval.

This was the case with my AOBD. I know that the FAA Aircraft Dispatcher license means nothing to CAP, but I see a lot of correlation between the two. This is the reasoning I used for not accepting the initial sign-off on my AOBD.

Most people who will actually use the training will inherently want to do it correctly.

This is one man's opinion.
RG Little, Capt

Dragoon

Quote from: ammotrucker on November 06, 2007, 03:53:28 PM
As ES training officer for my squadron, I think about the times when I was being signed off, and I believe that some of the things being floated in this thread are true.

I know that at one training event I was signed off, and I refused to add it to MIMS for the simple fact that I did not like the eval. So, as a member, I went to a different squadron and received the training much the way that I expected to. After that I added the tasks to my MIMS.

If you're looking just to get signed off, I couldn't care less how you receive it; most members do not accept actual missions. I care about the members who will be involved: that they get quality training and eval.

This was the case with my AOBD. I know that the FAA Aircraft Dispatcher license means nothing to CAP, but I see a lot of correlation between the two. This is the reasoning I used for not accepting the initial sign-off on my AOBD.

Most people who will actually use the training will inherently want to do it correctly.

This is one man's opinion.

You've got the right attitude: you want to master the material. Sadly, there are folks who just want the sign-offs. Basically, they look at the training requirements as obstacles between them and the rating they want, rather than as an opportunity to learn.

floridacyclist

I was thinking about this case in particular when I first posted on this thread.

I don't think most of the problem is with the students, but with the evaluators. Perhaps evaluators should be evaluated and endorsed by two or three other evaluators with the same or a higher level of qualification before being allowed to sign off on a task, and they should be subject to periodic review. After all, they are the final check/balance with hands-on experience of a person and their abilities before turning them loose.

Dragoon

Quote from: floridacyclist on November 06, 2007, 05:03:58 PM
I was thinking about this case in particular when I first posted on this thread.

I don't think most of the problem is with the students, but with the evaluators. Perhaps evaluators should be evaluated and endorsed by two or three other evaluators with the same or a higher level of qualification before being allowed to sign off on a task, and they should be subject to periodic review. After all, they are the final check/balance with hands-on experience of a person and their abilities before turning them loose.

Yup, that's the sort of thing we need.

The idea of being qualified to do hands-on evaluation because you took an online test is ludicrous.

There has to be some quality control: the evaluator needs a certain amount of experience in the position, plus demonstrated ability to evaluate. Note I didn't say "teach"; we don't need to regulate teaching. We need to regulate evaluation.

For some ratings, one answer for experience would be to limit instruction to those with the next higher rating: you have to be an Observer or MP to teach scanners, a GTL to teach GTMs, etc.

But even with something like that, we still need more assurance that all evaluators are doing it to standard.