
Q4 WEEK 3 Validity and Reliability of Research Instruments


Grade Level 11 Region NCR

Semester SECOND Teaching Dates April 24-25, 2024

Quarter THIRD Grade & Section Grade 11-HUMSS

Learning Area ENGLISH Modality Face to Face

I. OBJECTIVES

A. Content Standards
The learner demonstrates understanding of the quantitative research design, description of sample, instrument development, description of intervention (if applicable), data collection and analysis procedures such as survey, interview, and observation, and guidelines in writing research methodology.

B. Performance Standards
The learner transfers learning by describing adequately quantitative research designs, sample, instrument used, intervention (if applicable), data collection, and analysis procedures.

C. Most Essential Learning Competencies/Objectives (Write the LC code for each)
Constructs an instrument and establishes its validity and reliability (CS_RS12-IIa-c-3)

At the end of the teaching-learning activities, the learners should be able to:
a. classify the types of questions used in a research instrument;
b. recognize the types of validity/reliability used in the sample scenarios; and
c. construct a sample research instrument.
II. CONTENT Research Instrument and Validity and Reliability

III. LEARNING RESOURCES

A. References

B. Other Learning Resources

IV. PROCEDURES

A. Reviewing previous lesson or presenting the new lesson
In the previous modules, you have learned the types of sampling techniques applied in quantitative research. As a review, let’s have a short activity. Below is a word hunt puzzle containing the types of probability and non-probability sampling techniques.
B. Establishing a purpose for the lesson
What do you think will happen if the tools for building a house are not prepared meticulously? The same is true when gathering information to answer a research problem: the tools, or instruments, should be prepared carefully. In constructing a quantitative research instrument, it is very important to remember that the tools created should elicit responses or data that can be numerically analyzed.

Research instruments are the basic tools researchers use to gather data for specific research problems. Common instruments are performance tests, questionnaires, interviews, and observation checklists. The first two are usually used in quantitative research, while the last two are more often used in qualitative research. However, interviews and observation checklists can still be used in quantitative research once the information gathered is translated into numerical data.
C. Presenting examples/instances of the new lesson
Directions: Read and analyze the given scenario. Answer the guide questions below.

A culmination program was scheduled for Grade 12 students. The highlight of the program was the presentation of the festival of dances. The six sections prepared for the said dance contest during the culmination. A month before the activity, the students had already started planning. Their parents were also very supportive in the preparation of the costumes and props. The class advisers also monitored the practices in their classrooms.

During the contest, the PE teacher invited teachers from other schools to serve as judges. The performances were exemplary, especially that of section Anahaw. The section was also a crowd favorite. However, another group that performed poorly compared to Anahaw was pronounced the winner. Due to the result of the contest, Anahaw and the other sections wanted to know the bases for judging. After an investigation, it turned out that no clear criteria were set and no rating sheets were used.

Guide Questions:
1. What do you think must have been done to avoid the said situation?
2. What can you say about the result of the investigation?
3. How will you relate the scenario to the conduct of a quantitative research study?

D. Discussing new concepts and practicing new skills #1

Instrument vs. Instrumentation

In research, an instrument is a general term used by the researcher for measuring devices such as surveys, questionnaires, tests, checklists, etc. On the other hand, instrumentation is the process of developing, testing, and using the instrument. Take note that the instrument is the device, while instrumentation is a course of action (Prieto, Naval, and Carey 2017).

Characteristics of a Good Research Instrument


Concise. Have you ever tried answering a very long test and, because of its length, just picked answers without even reading the items? A good research instrument is concise in length yet can elicit the needed data.

Sequential. Questions or items must be arranged well. It is recommended to arrange them from the simplest to the most complex. In this way, the instrument will be easier for the respondents to answer.

Valid and reliable. The instrument should pass the tests of validity and reliability to get more appropriate and accurate information.

Easily tabulated. Since you will be constructing an instrument for quantitative research, this factor should be considered. Hence, before crafting the instrument, the researcher makes sure that the variables and research questions are established. These will be an important basis for making the items in the research instrument.

Guidelines in Developing Research Instrument

Step 1. Background
• Do basic research on the chosen variables or construct of the research study. Choose a construct that you can use to create the objective of the questionnaire. A construct means the characteristic that you wish to measure or evaluate in your research instrument (e.g., weight, academic performance).
• After identifying the construct, it is easy to state the purpose or objective of the questionnaire, as well as the research questions.

Step 2. Questionnaire Conceptualization

• Select a response scale with which the respondents will answer the questions in your research study. Some of the scales you might use in your research questionnaire are given below:

▪ Yes/No or Yes/No/Neither
▪ Likert Scale. This scale is used to measure behavior quantitatively.

• Create questions based on the objectives of the research study. These are the guidelines in developing questions for your questionnaire:
▪ The questions should be clear, concise, and simple. Avoid lengthy and confusing questions.
▪ Classify each question under the statement of the problem it addresses.
▪ Questions should be consistent with the needs of the study.
▪ Avoid sensitive and debatable questions.
▪ Avoid jargon or unfamiliar words in the questions.

• Choose the type of questions to use in developing your questionnaire. It can be:

1. Dichotomous questions. A question with only two choices, such as “Yes/No” or “Like/Dislike”.
2. Open-ended questions. A question that normally answers the question “why”. Keep in mind that this type of question is usually used in qualitative research.
Example: What do you like most about your school?
_______________________________________________________
3. Closed-ended questions. Also called multiple-choice questions, these consist of three or more choices.
Example: What is the highest educational attainment of your mother?
___ elementary ___ high school ___ college
4. Rank-order scale questions. A type of question that asks respondents to rank the given choices or items.
Example: Rank the following based on their importance in your work as an SHS student (3 = highest and 1 = lowest).
__ doing homeroom activities
__ going to the library
__ using the computer

5. Rating scale questions. This is the Likert-scale form: a type of question that measures the weight of the respondents’ responses.
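Because rating-scale responses are meant to be numerically analyzed, tallying them is straightforward. The sketch below (plain Python, with invented responses, so the numbers are purely illustrative) tallies a hypothetical 5-point Likert item and computes its weighted mean, the usual summary statistic for such an item.

```python
# Tallying a hypothetical 5-point Likert item (1 = strongly disagree ... 5 = strongly agree)
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # invented answers from 10 respondents

# Count how many respondents picked each point on the scale
counts = {point: responses.count(point) for point in range(1, 6)}

# Weighted mean: each scale point weighted by how often it was chosen
weighted_mean = sum(responses) / len(responses)

print(counts)         # {1: 0, 2: 1, 3: 2, 4: 4, 5: 3}
print(weighted_mean)  # 3.9
```

The weighted mean (3.9 here) is then interpreted against the verbal scale, e.g., a mean near 4 reads as “agree.”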

Ways in Developing Research Instrument


There are three ways you can consider in developing the research instrument for your study.
First is adopting an instrument from the already utilized instruments from previous related
studies. The second way is modifying an existing instrument when the available instruments do
not yield the exact data that will answer the research problem. And the third way is when the
researcher made his own instrument that corresponds to the variable and scope of his current
study.

Common Scales Used in Quantitative Research

E. Continuation of the discussion of new concepts and practicing new skills #2

Step 3. Establishing the validity of the questionnaire

Validity
A research instrument is considered valid if it measures what it is supposed to measure. When measuring students’ oral communication proficiency, rating a speech performance with a rubric or rating scale is more valid than giving students a multiple-choice test.

Face Validity. Also known as “logical validity,” it calls for an intuitive judgment of the instrument as it “appears.” Just by looking at the instrument, the researcher decides whether it is valid.

This is a subjective type of assessment of the research instrument. It is the simplest and easiest type of validity, wherein the validator skims through the instrument to form an opinion. Moreover, it is often criticized as the weakest type of validity used in research instruments (Stephanie, 2015).
Content Validity. An instrument judged to have content validity meets the objectives of the study. It is checked by examining whether the statements or questions elicit the needed information. Experts in the field of interest can also provide specific elements that should be measured by the instrument.

This type of assessment refers to the appropriateness of the content of an instrument. An expert in the content, or a professional familiar with the construct being measured, is needed in this type of validity. The expert judges the degree to which the items in the questionnaire cover all the relevant parts of the construct it aims to measure.

Construct Validity. It refers to the validity of an instrument as it corresponds to the theoretical construct of the study. It concerns whether a specific measure relates to other measures, and it defines how well a test measures what it claims to measure. It is used to know whether the operational definition of a construct aligns with the true theoretical meaning of the concept.

Criterion Validity. This type of validity measures how well the results of your instrument correspond to the results of another, established instrument.

For example, an English teacher makes an instrument to measure students’ English writing ability. To assess how well the instrument measures the students’ writing skills, she finds an existing instrument that is considered a valid measurement of English writing ability and compares the results when the same group of students takes both tests. If the results are very similar, the instrument created by the teacher has high criterion validity.

Concurrent Validity. When the instrument can produce results similar to those of similar, already-validated tests, it has concurrent validity.

Predictive Validity. When the instrument can produce results similar to those of similar tests that will be employed in the future, it has predictive validity. This is particularly useful for aptitude tests.

Step 4. Establishing the reliability of the questionnaire


Reliability
Reliability refers to how accurate and precise the measuring instrument is. It
yields consistent responses over repeated measurements. In order to have a reliable
instrument, you need to have questions that yield consistent scores when asked
repeatedly.

Stability or Test-retest reliability. This is the simplest type of reliability: the same questionnaire is administered twice to the same sample at different points in time, and the correlation between the two sets of scores is computed.

It is achieved by giving the same test to the same group of respondents twice and checking the consistency of the two sets of scores.
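The correlation computed between the two administrations is typically Pearson’s r. A minimal sketch, assuming invented scores for five respondents tested twice:

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation between two paired lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# Invented scores: the same five respondents, tested twice a few weeks apart
first_run  = [80, 65, 90, 70, 85]
second_run = [85, 60, 92, 68, 90]

# A value near 1 indicates the instrument is stable over time
print(round(pearson_r(first_run, second_run), 3))  # 0.984
```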

Equivalent Forms Reliability / Split-half Method. It is established by administering two tests that are identical except for their wording to the same group of respondents.

Internal Consistency Reliability. It determines how well the items measure the same construct. It is reasonable to expect that when a respondent gets a high score on one item, he will also score high on similar items. There are three ways to measure internal consistency: the split-half coefficient, Cronbach’s alpha, and the Kuder-Richardson formula.
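Of the three, Cronbach’s alpha is the most commonly reported. It is computed from the item variances and the variance of the respondents’ total scores: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals), where k is the number of items. The sketch below uses only the Python standard library, with invented responses from four respondents to a four-item scale.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: one list of item scores per respondent."""
    k = len(scores[0])                        # number of items
    items = list(zip(*scores))                # transpose: one tuple per item
    item_var_sum = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(person) for person in scores])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Invented 5-point responses: rows are respondents, columns are items
responses = [
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(responses), 3))  # 0.981
```

An alpha of about 0.98 on these invented data would indicate the items hang together very well; values of 0.70 and above are conventionally taken as acceptable.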
Step 5. Pilot testing of the questionnaire
Once you have assessed the validity and reliability of the instrument, the next step is to pilot test the questionnaire before distributing it to the target respondents of the study. Pilot testing is like pre-testing the instrument. You may find 10-15 people to answer the questionnaire. In this process, participants can put remarks on some questions, which will help you enhance them.

Step 6. Revise the questionnaire

After identifying problems in your questionnaire, revise it based on the feedback of the participants during pilot testing. However, do not forget that the questionnaire should still match the research objectives.

F. Developing mastery (leads to Formative Assessment 3)

Let’s try: A. Based on the sample research instrument below, classify the types of questions used.

1. Gender ________________________________________
2. Age ________________________________________
3. Question #1 ________________________________________
4. Question #2 ________________________________________
5. Question #3 ________________________________________
6. Question #4 ________________________________________
G. Finding practical applications of concepts and skills in daily living

Let’s practice: Construct a sample research instrument based on your research topic. You may use different types of questions if you wish. Follow the sample guide below in constructing the draft of your research instrument.

H. Making generalizations and abstractions about the lesson

To summarize what you have learned today, complete the diagram on the steps of developing or constructing a research instrument. Put some important notes to remember in each step.

I. Evaluating learning

Recognize the types of validity/reliability used in each of the following scenarios. Choose the letter of the best answer from the choices inside the box.

____1. The expert assessed the instrument measuring the students’ ability to handle stress based on a set of criteria that truly reflects the construct being measured. What type of validity is used by the expert?
____2. An instructor distributes a set of tests to his class; a few weeks after, he distributes a different test on the same topic as the first set. What type of reliability is used by the instructor?
____3. An English class takes an English Proficiency Test and then takes it again after a month so the instructor can assess the reliability of the test. What type of reliability is used by the instructor?
____4. The researcher examines the questions on the instrument measuring the impact of mobile games on the sleeping habits of teenagers by simply skimming it. On its surface, the survey questionnaire seems like a good representation of what it aims to measure. What type of validity is being used by the researcher?
____5. The researchers conducted two different tests on their respondents. Afterward, they compared and correlated the result of the test they created with the result of a test that already existed and had been validated. They did this to assess how well their researcher-made test measured what it intended to measure. What type of validity is used by the researchers?

J. Additional activities for application or remediation

V. REMARKS
VI. REFLECTION
A. No. of learners who earned 80% in the evaluation
B. No. of learners who require additional activities for remediation
C. Did the remedial lessons work? No. of learners who have caught up with the lesson
D. No. of learners who continue to require remediation
E. Which of my teaching strategies worked well? Why did this work?
F. What difficulties did I encounter which my principal or supervisor can help me solve?
G. What innovation or localized materials did I use/discover which I wish to share with other
teachers?

Prepared by: Checked by:

VANESSA DC. BALITAO RICHELLE M. ASAYTONO


Subject Teacher SGH, English SHS Dept.

Reviewed by:

JUN RYAN L. HERNANDEZ


Assistant School Principal II

Approved by:

CECILIA G. REGALA
Principal III
