
MS test

Started by Walkman, July 17, 2012, 12:53:44 PM


Walkman

I'm currently training a new class of scanners. We're about finished with the classroom F&P portion of the training. My predecessor had a multiple choice test that covered all the F&P info, but unfortunately, he's been deployed and I wasn't able to get it from him before he left. Does anyone have (or can you point me to) a test somewhere I can use? If not, I've got two weeks to create one...

I did check the NESA site and capes.net.

Thanks!

Angus

There actually isn't an official test. You can create one yourself; just make sure that you include all the questions required by the tasks.
Maj. Richard J. Walsh, Jr.
Director Education & Training MAWG 
 Gill Robb Wilson #4030

EMT-83

Someone making up their own requirements?

Walkman

Quote from: Angus on July 17, 2012, 12:58:14 PM
There actually isn't an official test. You can create one yourself; just make sure that you include all the questions required by the tasks.

Understood. I was hoping someone had already created one I could steal... er, use. Our Ops O was the instructor last round, and I thought he told me he didn't create it from scratch.

Quote from: EMT-83 on July 17, 2012, 01:02:34 PM
Someone making up their own requirements?

Nope. Just providing an efficient way to make sure the students understand the subject matter and can be evaluated effectively so I can sign off on the classroom knowledge-based tasks. There's a lot of info in the F&Ps that can't be eval'd without some sort of quizzing/testing.

Eclipse

Qualification evaluation is supposed to be individual demonstration, not mass testing.

"That Others May Zoom"

arajca

At least per MY wing ops folks, SOME tasks can be completed by written testing. Anything that starts with "Describe" or "List" definitely falls into that category.

Added:
They also require the test administrator to be SET qualified for the qualification.

Walkman

Quote from: arajca on July 17, 2012, 02:06:00 PM
At least per MY wing ops folks, SOME tasks can be completed by written testing. Anything that starts with "Describe" or "List" definitely falls into that category.

That's what we're doing.

Eclipse

Quote from: arajca on July 17, 2012, 02:06:00 PM
At least per MY wing ops folks, SOME tasks can be completed by written testing. Anything that starts with "Describe" or "List" definitely falls into that category.

Are these multiple choice or freeform answers? Who's scoring the tests?

Are those being examined allowed to use their task guides?

Are the tests controlled per normal test control procedures?

How do you account for missed questions? (most tasking indicates "all or nothing / pass fail")

"That Others May Zoom"

Walkman


Eclipse

You know, one of the best reasons for asking things on CAPTalk is to expose bad ideas - there aren't too many things "new under the sun", but plenty of mistakes people make over and over.

It's very disappointing when people take the "forget it" attitude when it comes to light that their concept for expediency (etc.) might not be a good idea, or, more commonly, that there's more to the conversation than "just doing it".

We have this exact issue locally and are struggling with the middle ground - we have some units with very proficient instructors who have created classroom seminars and tests around the curriculum. "Seemed like a good idea at the time." However, as these things do, the evolution has become a "correct to 100%" situation: the proficient guys have no issues, but there appear to be far too many "last row" guys who are getting the same task credit and then are clueless when they get into a mission environment or are tasked individually.

There's a good reason why the task guides don't allow for this - those direct conversations expose weaknesses and gaps in understanding that standardized testing of raw facts may not.

Expediency has its place (e.g. Dog task? "Stay away from the dogs."), but it has to be used properly, and making up "standardized" testing locally for a national curriculum, where members are expected to operate in the same manner no matter what wing they are in, may not be a good idea.

"That Others May Zoom"

Walkman

It's not so much the critique of the idea that got me to say it; it was the feeling of being jumped on. That's always been an issue here: someone posts a question, even if it's not the best idea, and gets piled on. Maybe I'm having a bad day, but I really didn't feel like getting grilled on it.

arajca

Quote from: Eclipse on July 17, 2012, 02:46:50 PM
Quote from: arajca on July 17, 2012, 02:06:00 PM
At least per MY wing ops folks, SOME tasks can be completed by written testing. Anything that starts with "Describe" or "List" definitely falls into that category.

Are these multiple choice or freeform answers? Who's scoring the tests?

Are those being examined allowed to use their task guides?

Are the tests controlled per normal test control procedures?

How do you account for missed questions? (most tasking indicates "all or nothing / pass fail")
1. Yes, depending on how the task evaluation is worded. An appropriate SET-qualified member.
2. If the task evaluation instructions allow it, then it is an "open book, closed mouth" situation.
3. No more than the task evaluation books.
4. Individual review/discussion.

Quote from: Eclipse on July 17, 2012, 03:06:05 PM
We have this exact issue locally and are struggling with the middle ground - we have some units with very proficient instructors who have created classroom seminars and tests around the curriculum. "Seemed like a good idea at the time." However, as these things do, the evolution has become a "correct to 100%" situation: the proficient guys have no issues, but there appear to be far too many "last row" guys who are getting the same task credit and then are clueless when they get into a mission environment or are tasked individually.
If someone is making up a test based on the curriculum and not on the task guides and evaluation instructions, they are wrong. That's where you start getting different, non-standard training and problems.

What's the difference between having someone describe the types of heat injury in writing vs. verbally?

Quote
There's a good reason why the task guides don't allow for this - those direct conversations expose weaknesses and gaps in understanding that standardized testing of raw facts may not.
Cite please.

Eclipse

I'd have an issue with multiple choice questions; otherwise, most seem reasonable. But it's all in the execution, and I don't think there should be any "correct to 100%" that gives credit.

"That Others May Zoom"

jayleswo

Hi Kristin,

CAWG has had several online tests for years - including one for Scanner. They are somewhat old and based on outdated material/SQTRs to some extent, but if your purpose is to make sure people were staying awake during class, I see no reason they couldn't take them. I'd suggest you try the one for MS yourself first and see if it meets your need.

https://tests.capnhq.gov/ops/tests/default.cfm?grp=pcr

-- John
John Aylesworth, Lt Col CAP

SAR/DR MP, Mission Check Pilot Examiner, Master Observer
Earhart #1139 FEB 1982

bflynn

Quote from: Walkman on July 17, 2012, 03:18:09 PM
It's not so much the critique of the idea that got me to say it; it was the feeling of being jumped on. That's always been an issue here: someone posts a question, even if it's not the best idea, and gets piled on. Maybe I'm having a bad day, but I really didn't feel like getting grilled on it.

Tests are a good thing - they are evaluations of how well the students have learned the materials.

Note that he never said passing the test was a requirement for the qualification.  Learning the material is a requirement.  An examination is a diagnostic. 

As long as it's approached that way, as an individual demonstration of ability, there's no problem.

Unfortunately, I don't have such an exam to share with you, but if I can make a suggestion for writing one: don't choose questions by pulling quotes out of the course material. Make scenarios that require using the course material to answer the question. Maybe make the test an example mission. Then just review it with the student so you have a structure for talking about their individual accomplishment in learning the material.


arajca

Quote from: bflynn on July 17, 2012, 06:04:34 PM
Note that he never said passing the test was a requirement for the qualification.  Learning the material is a requirement.  An examination is a diagnostic. 

As long as it's approached that way, as an individual demonstration of ability, there's no problem.

Unfortunately, I don't have such an exam to share with you, but if I can make a suggestion for writing one: don't choose questions by pulling quotes out of the course material. Make scenarios that require using the course material to answer the question. Maybe make the test an example mission. Then just review it with the student so you have a structure for talking about their individual accomplishment in learning the material.
If you're going to use the test for task qualification, you need to use the task guide to write the questions. You should stay as close to verbatim as possible to the language of the task guide. You cannot make up a test based on the curriculum to use for qualification. As a student, I would be irritated if you gave me a test that did not directly relate to the goal of the training - completion of the MS rating. I would also tell others to avoid your classes, since you are not providing qualification sign-offs and are wasting our time.

bflynn

Quote from: arajca on July 17, 2012, 06:28:26 PM
If you're going to use the test for task qualification, you need to use the task guide to write the questions. You should stay as close to verbatim as possible to the language of the task guide. You cannot make up a test based on the curriculum to use for qualification. As a student, I would be irritated if you gave me a test that did not directly relate to the goal of the training - completion of the MS rating. I would also tell others to avoid your classes, since you are not providing qualification sign-offs and are wasting our time.

You're jumping to an unwarranted conclusion.

Please note that I said nothing about the source of the material for the questions. I suggested a framework for asking the questions that would be beneficial by putting them in the context of a continuous scenario.

What I suggested not doing is what I see so often in testing: lifting a phrase directly out of the course material and telling the person to search for it and fill in the blank. That does nothing toward evaluating learning...yet we do it so often. For example, suppose I had a question that said "One of the most important commodities during disasters is ______, _______ intelligence." Other than searching the reference text for this phrase, does the fact that you discover the answers are "accurate" and "timely" do anything to diagnose whether or not you learned anything?

On the other hand, if I ask a question within a continuing scenario that says "Your aircraft has been assigned to take over as high bird. Upon arriving on station at 9,000 ft, you experience a headache, dizziness, and a feeling of euphoria. Despite how you feel, you must concentrate and do what?" Well, that answer is definitely to inform the pilot that you're experiencing hypoxia.

Both questions come from the reference text and both questions are part of learning to be a mission scanner.  But isn't the second one a better diagnostic of whether or not something was learned?

Eclipse

The task guides are specific as to the questions and the process for evaluation.  Deviation is not authorized.

Real-world scenarios and contextual training is valuable, however there is no option to use it during qualification tasking.

"That Others May Zoom"

bflynn

And you believe what I suggested deviates from the task guide?  Or are you setting up a strawman just so you can run it down?

P-2024 requires an MS trainee to demonstrate knowledge of high-altitude effects, which includes hypoxia. It's a completely reasonable topic.

All I'm suggesting is stringing questions together to make the test into a kind of ongoing story that describes a mission, rather than a search and find task using Acrobat Reader and Ctrl+F.  It's a suggestion intended to make the evaluation more interesting and fun.


arajca

Here are the evaluation requirements for task P-2024 from the task guide:
Quote
Evaluation Preparation
Setup: None.
Brief Student: You are a Scanner trainee asked to discuss the effects of high altitude on the body and strategies to deal with the conditions.

Evaluation
Performance measures (Results: P / F)
1. Discuss the symptoms and dangers of the following:
   a. Ear block. (P / F)
   b. Sinus block. (P / F)
   c. Hypoxia. (P / F)
2. Discuss strategies used to combat these symptoms. (P / F)
Student must receive a pass on all performance measures to qualify in this task. If the individual fails any measure, show what was done wrong and how to do it correctly.
Your scenario does not address all the items the evaluation requires; therefore, it is NOT suitable for qualification sign-off.

lordmonar

Quote from: Eclipse on July 17, 2012, 01:20:43 PM
Qualification evaluation is supposed to be individual demonstration, not mass testing.
Try again....
PATRICK M. HARRIS, SMSgt, CAP

lordmonar

Quote from: Eclipse on July 17, 2012, 03:06:05 PM
You know, one of the best reasons for asking things on CAPTalk is to expose bad ideas - there aren't too many things "new under the sun", but plenty of mistakes people make over and over.

It's very disappointing when people take the "forget it" attitude when it comes to light that their concept for expediency (etc.) might not be a good idea, or, more commonly, that there's more to the conversation than "just doing it".

We have this exact issue locally and are struggling with the middle ground - we have some units with very proficient instructors who have created classroom seminars and tests around the curriculum. "Seemed like a good idea at the time." However, as these things do, the evolution has become a "correct to 100%" situation: the proficient guys have no issues, but there appear to be far too many "last row" guys who are getting the same task credit and then are clueless when they get into a mission environment or are tasked individually.

There's a good reason why the task guides don't allow for this - those direct conversations expose weaknesses and gaps in understanding that standardized testing of raw facts may not.

Expediency has its place (e.g. Dog task? "Stay away from the dogs."), but it has to be used properly, and making up "standardized" testing locally for a national curriculum, where members are expected to operate in the same manner no matter what wing they are in, may not be a good idea.
They take the "forget it" attitude....when people attack them or treat them as if they are stupid, criminal, or incompetent.....which you tend to do a lot.

The dude asked if anyone knew where the test could be found.......because that is the way they do it in his wing.....and you attacked him and demanded answers.
PATRICK M. HARRIS, SMSgt, CAP

lordmonar

Quote from: Eclipse on July 17, 2012, 06:53:30 PM
The task guides are specific as to the questions and the process for evaluation.  Deviation is not authorized.

Real-world scenarios and contextual training is valuable, however there is no option to use it during qualification tasking.
Please cite! :)
PATRICK M. HARRIS, SMSgt, CAP

Eclipse

Quote from: lordmonar on July 17, 2012, 09:18:13 PM
Quote from: Eclipse on July 17, 2012, 06:53:30 PM
The task guides are specific as to the questions and the process for evaluation.  Deviation is not authorized.

Real-world scenarios and contextual training is valuable, however there is no option to use it during qualification tasking.
Please cite! :)

Cite where it is allowed. 

The regs and process are very clear, especially for the initial qualification, which leaves little room for creativity or shortcuts.

"That Others May Zoom"

lordmonar

Quote from: Eclipse on July 17, 2012, 09:34:46 PM
Quote from: lordmonar on July 17, 2012, 09:18:13 PM
Quote from: Eclipse on July 17, 2012, 06:53:30 PM
The task guides are specific as to the questions and the process for evaluation.  Deviation is not authorized.

Real-world scenarios and contextual training is valuable, however there is no option to use it during qualification tasking.
Please cite! :)

Cite where it is allowed. 

The regs and process are very clear, especially for the initial qualification, which leaves little room for creativity or shortcuts.
>:D  That which is not forbidden is allowed.   ;)
PATRICK M. HARRIS, SMSgt, CAP

Eclipse

Quote from: lordmonar on July 17, 2012, 09:42:08 PM
>:D  That which is not forbidden is allowed.   ;)

The FSM will get you for that!

"That Others May Zoom"

lordmonar

Quote from: Eclipse on July 17, 2012, 10:03:50 PM
Quote from: lordmonar on July 17, 2012, 09:42:08 PM
>:D  That which is not forbidden is allowed.   ;)

The FSM will get you for that!
He is a forgiving deity  >:D
PATRICK M. HARRIS, SMSgt, CAP