ES Training Tasks -- Tough enough?

Started by RiverAux, September 29, 2007, 05:25:14 PM


RiverAux

While the Scanner generally flies in the back seat during the active part of missions, it is not at all uncommon for scanners to fly in the front seat at other times (for example, flying to and from mission base and on other transport flights).  Even though they aren't serving as an Observer in that seat, I think that they should at least be able to operate the radio in case something happens to the pilot.  Seems like a basic safety measure.

But then again, I'm the one that thinks that all non-pilot CAP aircrew members should take AOPA's Pinch-hitter course and should receive at least 4 hours of in-flight training in emergency flight and landing skills. 

isuhawkeye

^^^ Iowa put a group of Observers through just such a course during the August WTA

Dragoon

Quote from: RiverAux on October 02, 2007, 10:11:45 PM
While the Scanner generally flies in the back seat during the active part of missions, it is not at all uncommon for scanners to fly in the front seat at other times (for example, flying to and from mission base and on other transport flights).  Even though they aren't serving as an Observer in that seat, I think that they should at least be able to operate the radio in case something happens to the pilot.  Seems like a basic safety measure.

But then again, I'm the one that thinks that all non-pilot CAP aircrew members should take AOPA's Pinch-hitter course and should receive at least 4 hours of in-flight training in emergency flight and landing skills. 

Careful,

What you're basically arguing for is "Scanners need to be trained to do what Observers do, since we let them sit up front."  After all, there are a lot of other things the right-seater is supposed to do besides operate a radio.

If we go down that route, there's no need for an MS rating.  I think it's better to stick with "MS sits in the back.  If MS sits up front, the MP handles the additional workload."




RiverAux

Not at all.  Just pointing out that there are some issues that scanners need to be informed about.  You brought up icing earlier -- I certainly want the scanner to be aware that he should probably say something if he notices a big buildup of ice while he's looking out the window. 

Dragoon

And I would want that as well.

If the aircrew tasks were designed from the practical point of view ("What do I want this guy to DO?"), they'd be a lot shorter.

Yup. I want the scanner to alert me if he notices visible signs of icing, or notices the engine making funny noises.  But that's it.

He does not need to know

the details of freezing levels
the details of how icing affects aircraft performance (other than to say your plane gets heavier and may fall out of the sky)
the details of carburetor icing.

This is NICE to know stuff, but not critical to sitting in the backseat and looking out the window.

Neither is it critical that he

Knows the ins and outs of weight and balance (just know they exist and that the pilot has to do it)
Understands aircraft controls
Understands aircraft instruments
Knows the ins and outs of wake turbulence (beyond "tell me if the windsock shifts")
Knows the details of how light and atmospheric conditions affect scanning (since he has no control over them, and isn't planning the flight)
Knows the details of how visibility and turbulence affect search (he'll figure it out quickly on his own)
Knows how to keep a log (since the taskbook assigns this mission to the observer, not the scanner)

This is all nice to know, not need to know.


If we cut this down to the bare basics, and focused on the most important stuff - namely looking out the window and spotting things on the ground, we could train folks much faster, and have them much more proficient at the critical tasks.

Just one guy's opinion.

floridacyclist

Don't forget that the Scanner should also be considered a future Observer trainee and should be at least vaguely familiar with the same things that Observers have to be proficient at. Approaching it with this in mind will make their Scanner time much more productive, as they will actually have a clue as to what is going on, and in the long run it will give you quicker- and better-trained Observers.
Gene Floyd, Capt CAP
Wearer of many hats, master of none (but senior-rated in two)
www.tallahasseecap.org
www.rideforfatherhood.org

sardak

There are a number of interesting requirements in the task guides.

a. Task O-2003 - Grid a Sectional
First line of the training outline in the task guide - As a Mission Observer trainee, knowing how to grid a sectional chart and use grids is essential in order to assist the mission pilot in planning a search, and to maintain situational awareness during a search.

This is a required task for Mission Pilot but not Observer or Scanner.

Performance measurement 1. Grid a sectional using the CAP grid system.
Has every mission pilot actually sat in front of an evaluator and gridded a sectional?

b. Task O-0204 - Locate a Point on a Map Using Latitude and Longitude
Required task for Scanner and Ground Team Leader but not UDF Team Member.

Seems that someone on a UDF team needs to know how to plot lat/lon, as there is no requirement to have a scanner or GTL along to plot SARSAT coordinates or the location of the transmitter after it's been located.
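For what it's worth, the arithmetic involved isn't hard.  Here's a rough Python sketch (the function name and the sample coordinates are made up for illustration) of converting decimal-degree coordinates, the way a satellite fix might be reported, into the degrees-and-minutes format printed along a sectional's margins so a team member can plot the point by hand:

```python
# Hypothetical helper: split a decimal-degree value into whole degrees
# and decimal minutes, the format marked on a sectional chart's edges.

def to_deg_min(decimal_degrees):
    """Return (degrees, minutes) for a signed decimal-degree value."""
    sign = -1 if decimal_degrees < 0 else 1
    magnitude = abs(decimal_degrees)
    degrees = int(magnitude)
    minutes = round((magnitude - degrees) * 60, 1)
    # Carry over if rounding pushed the minutes to 60
    if minutes >= 60:
        degrees += 1
        minutes -= 60
    return sign * degrees, minutes

# Made-up example fix: 38.8977 N, 77.0365 W
lat_deg, lat_min = to_deg_min(38.8977)
lon_deg, lon_min = to_deg_min(-77.0365)
print(f"{lat_deg} deg {lat_min}' N, {abs(lon_deg)} deg {lon_min}' W")
```

Nothing a team member couldn't do with a pocket calculator, which is sort of the point: it's a quick task to teach and test.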

Mike

RiverAux

I agree on the lat-lon issue.  I assume they overlooked it. 

With gridding the map, I can see why you would need a pilot able to do that, since they've got to constantly buy new sectionals anyway.  I would say that an Observer should need to be able to do it, but not a scanner.
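The mechanics are simple enough either way.  As a rough sketch (and only a sketch -- the chart boundary, the cells-per-row count, and the numbering scheme below are illustrative assumptions, not an official gridding spec), a grid system that divides a chart into 15-minute cells numbered row by row from the northwest corner works out to:

```python
# Illustrative sketch of gridding a chart into 15' x 15' cells,
# numbered left to right, top to bottom, from the NW corner.
# The chart extents here are fictional.

NORTH, WEST = 40.0, 93.0   # NW corner of the (made-up) chart, degrees
COLS = 32                  # assumed cells per row: 8 deg longitude * 4

def grid_number(lat, lon):
    """Return the 1-based grid cell containing (lat, west-positive lon)."""
    row = int((NORTH - lat) / 0.25)   # 0.25 degrees = 15 minutes
    col = int((WEST - lon) / 0.25)
    return row * COLS + col + 1

print(grid_number(39.9, 92.9))   # top-left cell of the fictional chart
```

Once you've drawn the grid on a chart, the lookup is the same whether it's this year's sectional or last year's.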

RocketPropelled

Quote from: Dragoon on October 01, 2007, 05:53:20 PM
1.  Evaluators who don't.  (evaluate, that is).  We need more than SET.  We need a hands-on evaluation of the Evaluator's ability to evaluate!  (how's that for a mouthful).  Followed by a Wing CC's sign-off that the guy is allowed to evaluate.  Adjust MIMS to require that Wing CC sign-off.

Basically, we have to have the same standards for evaluators as we do for check pilots. 

Hey, if you want to make the "evaluator" a super-limited set of people, be prepared for another training bottleneck.  I'm all for a higher standard of training -- but limiting evaluators and establishing the "approved cadre" is far more likely to put a strain on anyone you choose.

The entire training system is set up based upon the integrity of the evaluator and the trainee, no matter their additional qualifications.  I'm the ES training officer for our squadron, and I assure you, no one gets signed off under my pen before they're ready -- because I'll have to carry that person if they're not prepared to perform.  I've had some great trainers, and I've learned a lot from them.

NESA is a fine example of the textbook training being taught, applied, and evaluated in the field.  But at the squadron level (where most of the rubber meets the road, training-wise), you need to have the ability to perform evaluation and signoff for tasks.

In most wings, finding a check pilot to fly with you (in addition to finding an IP to do a fam flight, scheduling an airplane, etc.) is hard enough.  Check pilots are vetted not only through the FAA for their skills and experience; they also need to pass a high degree of CAP training and signoff -- all necessary when you're dealing with expensive aircraft and checkrides.

I don't see why you need the same super-specific approval methods when you're signing off a UDFT, a Mission Scanner, or a GTM3.  F91 rides are tough for most Mission Pilot trainees, and being approved as a GTL is seldom a pencil-whip, as far as I've seen.  Your wings may vary.

As a training guy, I don't like the idea of creating ANOTHER bottleneck in the system -- it's often hard enough to get tasks signed off and missions attended just to finish the SQTR.  Why complicate it by further limiting the signoff criteria?

Dragoon

Quote from: floridacyclist on October 03, 2007, 11:29:55 PM
Don't forget that the Scanner should also be considered a future Observer trainee and should be at least vaguely familiar with the same things that Observers have to be proficient at. Approaching it with this in mind will make their Scanner time much more productive, as they will actually have a clue as to what is going on, and in the long run it will give you quicker- and better-trained Observers.

I'd absolutely agree on the value of familiarization.  But that's very different from requiring proficiency as measured by time-consuming tests.

The abundance of relatively low value tasks has encouraged people to pencil whip.

I'd also argue that the scanner, once qualified, will quickly become vaguely familiar with Observer duties simply by acting as part of an aircrew.  And if he's a little unclear on what a VSI does, I don't think it's a big risk to the mission.

Dragoon

Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM

Hey, if you want to make the "evaluator" a super-limited set of people, be prepared for another training bottleneck.  I'm all for a higher standard of training -- but limiting evaluators and establishing the "approved cadre" is far more likely to put a strain on anyone you choose.

Well, I think there's a bit of wiggle room between "wide open" and "super-limited."

It's a quality thing. Passing a multiple-choice test does not in any way, shape, or form make you qualified to conduct hands-on evaluation.

And having just learned something yourself doesn't make you qualified to evaluate others in that skill.

The current system allows a brand new 13-year-old GTM3 who takes and passes the online SET exam to evaluate other GTM3s on their proficiency.  This, of course, is ludicrous.  Newbies at any skill set shouldn't be testing other newbies.  And not everyone has the temperament and discipline to be a good evaluator.

It's all in how tight you make it.  A wing could approve oodles and oodles of evaluators at the push of a button in an online system.  But at least make someone send an email to Wing ES requesting the status.  It gives Wing a chance to say no.  You can still have lots of them - just don't have a wide open free for all.

Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
The entire training system is set up based upon the integrity of the evaluator and the trainee, no matter their additional qualifications.  I'm the ES training officer for our squadron, and I assure you, no one gets signed off under my pen before they're ready -- because I'll have to carry that person if they're not prepared to perform.  I've had some great trainers, and I've learned a lot from them.

Unless you are fully qualified in all the specialties yourself, respectfully, you have no clue whether someone is "ready."  And very few squadron ES officers have all those qualifications.    You have to trust whoever did the evaluation, and that person may be someone from another unit whom you've never met.  As an ES training officer, you need some proof that all the evaluators are good at evaluating, and that you can trust them.

You mentioned that the system assumes the integrity of the evaluator.  Doesn't it make sense to ensure at the very least that the potential evaluator HAS integrity before you start trusting it?


Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
NESA is a fine example of the textbook training being taught, applied, and evaluated in the field.  But at the squadron level (where most of the rubber meets the road, training-wise), you need to have the ability to perform evaluation and signoff for tasks.

Absolutely.  Nothing in this proposal changes that.  Just adds a little quality.

And remember, this is about evaluation, not teaching.  Anyone can teach.  But you need to be able to trust the skill and judgement of your evaluators.


Quote from: RocketPropelled on October 04, 2007, 04:30:56 PM
In most wings, finding a check pilot to fly with you (in addition to finding an IP to do a fam flight, scheduling an airplane, etc.) is hard enough.  Check pilots are vetted not only through the FAA for their skills and experience, they need to pass a high degree of CAP training and signoff -- all necessary when you're dealing with expensive aircraft and checkrides.

I don't see why you need the same super-specific approval methods when you're signing off a UDFT, a Mission Scanner, or a GTM3.  F91 rides are tough for most Mission Pilot trainees, and being approved as a GTL is seldom a pencil-whip, as far as I've seen.  Your wings may vary.


You don't need the same standards.  But you probably need higher ones than "have the rating yourself and pass an online SET test."

Remember, F91s cannot be given by just any MP with SET -- only by folks appointed by Wing, because CAP has known for years that not every MP is a good judge of other MPs.  You don't have to be a CFI to be an MP check pilot, but you do need to be an experienced MP whom Wing has some faith in.

In the same way, not every mission scanner should be evaluating scanners.  You probably want to limit it to very experienced scanners, or better yet to Observers and MPs.

It's probably OK to limit GTM evaluation to GTLs, and GTL evaluation to a select group of very good GTLs.

The bar can be raised a bit without putting it through the ceiling. 

The return is that when one of your guys gets trained by someone you've never met, you'll have some idea that the guy actually tested your member to standard, and that the task was truly mastered.

SarDragon

Quote from: RiverAux on October 04, 2007, 02:58:15 PM
I agree on the lat-lon issue.  I assume they overlooked it. 

With gridding the map, I can see why you would need a pilot able to do that, since they've got to constantly buy new sectionals anyway.  I would say that an Observer should need to be able to do it, but not a scanner.

Most of the folks I know use the same olde gridded chart all the time for that part of their flying, and the new ones for navigation. There's nothing that requires the gridded chart to be current.
Dave Bowles
Maj, CAP
AT1, USN Retired
50 Year Member
Mitchell Award (unnumbered)
C/WO, CAP, Ret

ammotrucker

As ES training officer for my squadron, I think about the times that I was being signed off, and I believe that some of the things being floated in this thread are true. 

I know that on one training event I was signed off, and I refused to add it to MIMS for the simple fact that I did not like the eval.  I stated this as a member, went to a different squadron, and received the training much the way that I expected to.  After that I added these to my MIMS.

If you're looking just to get signed off, I couldn't care less how you receive it.  Most members do not accept actual missions.  I care about the members that will be involved -- that they get quality training and eval.

This was the case with my AOBD.  I know that the FAA Aircraft Dispatcher license means nothing to CAP, but I see a lot of correlation between the two.  That is the reasoning I used for not using the initial sign-off for my AOBD. 

Most people that will actually use the training will inherently want to do it correctly.

This is one man's opinion.
RG Little, Capt

Dragoon

Quote from: ammotrucker on November 06, 2007, 03:53:28 PM
As ES training officer for my squadron, I think about the times that I was being signed off, and I believe that some of the things being floated in this thread are true. 

I know that on one training event I was signed off, and I refused to add it to MIMS for the simple fact that I did not like the eval.  I stated this as a member, went to a different squadron, and received the training much the way that I expected to.  After that I added these to my MIMS.

If you're looking just to get signed off, I couldn't care less how you receive it.  Most members do not accept actual missions.  I care about the members that will be involved -- that they get quality training and eval.

This was the case with my AOBD.  I know that the FAA Aircraft Dispatcher license means nothing to CAP, but I see a lot of correlation between the two.  That is the reasoning I used for not using the initial sign-off for my AOBD. 

Most people that will actually use the training will inherently want to do it correctly.

This is one man's opinion.

You've got the right attitude - you want to master the material. Sadly, there are folks who just want the sign-offs.  Basically, they look at the training requirements as obstacles between them and the rating they want, rather than an opportunity to learn.

floridacyclist

I was thinking about this case in particular when I first posted on this thread.

I don't think most of the problem is in the students, but in the evaluators. Perhaps evaluators should be evaluated and endorsed by 2 or 3 other evaluators with the same or higher level of qualification before being allowed to sign off on a task and should be subject to periodic review. After all, they are the final check/balance to have hands-on experience with a person and their abilities before turning them loose.
Gene Floyd, Capt CAP
Wearer of many hats, master of none (but senior-rated in two)
www.tallahasseecap.org
www.rideforfatherhood.org

Dragoon

Quote from: floridacyclist on November 06, 2007, 05:03:58 PM
I was thinking about this case in particular when I first posted on this thread.

I don't think most of the problem is in the students, but in the evaluators. Perhaps evaluators should be evaluated and endorsed by 2 or 3 other evaluators with the same or higher level of qualification before being allowed to sign off on a task and should be subject to periodic review. After all, they are the final check/balance to have hands-on experience with a person and their abilities before turning them loose.

Yup, that's the sort of thing we need.

The idea of being qualified to do hands on evaluation because you took an online test is ludicrous.

There has to be some quality control - the evaluator needs a certain amount of experience in the position, plus demonstrated ability to evaluate.  Note I didn't say "teach" - we don't need to regulate teaching.  We need to regulate evaluation.

For some ratings, one answer for experience would be to limit evaluation to those with the next higher rating - you'd have to be an Observer or MP to evaluate scanners, a GTL to evaluate GTMs, etc.

But even with something like that we still need more assurance that all evaluators are doing it to standard.