Measure Description | Source of measure | Basu, S., Salisbury, C. L., & Thorkildsen, T. A. (2010). Measuring Collaborative Consultation Practices in Natural Environments. Journal of Early Intervention, 32(2), 127-150. |
Mode of administration | Rater coding of video segments |
Age range for use | Infants, toddlers, and preschool-age children |
Domains Assessed | Frequency of selected parent and service provider behaviors (with children) during early intervention sessions conducted in natural environments |
Related Measures | Creating a Supportive Environment (CASE); Home Visit Observation Form (HVOF); Natural Environment Rating Scale (NERS) |
Burden | Training needed to administer | Coders must be trained to code observations using the rating scale. Training involves two 2-hour group sessions, followed by individualized training until coding reliability is achieved. |
Minutes to complete | Video segments to be rated are 10 minutes in length. Time needed to code will vary depending on the coder's experience. |
# of items | 33 (21 items about provider behavior; 12 items about caregiver behavior) |
Cost | Instrument is available free of charge |
Adaptation for AIAN use | Adapted | No |
Developer allows adaptation? | No information available |
Used with AIAN populations? | No mention of use with AIAN families. The authors report that the TIERS was developed with a culturally, ethnically, and linguistically diverse sample of families and service providers. |
Psychometrics | Norm-referenced | No |
AIAN: Cronbach's alpha range | No information available |
AIAN: Evidence of validity | No information available |
Other populations: Cronbach's alpha range | Internal consistency for the 12-item parent scale was α = .91; coefficients for the high, medium, and low parental participation subscales were .68, .84, and .90, respectively. Internal consistency for the 21-item service provider scale was α = .94, with subscale coefficients of .59 for Observing and Information Sharing, .87 for Joint Attention and Problem-Solving, .86 for Practice With Feedback and Reflection, and .78 for Direct Teaching and Guided Practice. Twenty percent of the videotape segments were coded by two independent student raters, with interrater agreement of κ = .70. Similarly, when 20% of the segments were coded by pairs of student raters and service providers, interrater agreement was κ = .72. |
Other populations: Evidence of validity | Basu et al. (2010) reported that the TIERS "met the basic guidelines of psychometric tests for internal consistency, interrater agreement, and cross-rater reliability" (p. 135), but no specific validity data were provided. |
Source | Developer | Basu, S., Salisbury, C. L., & Thorkildsen, T. A. (2010). Measuring Collaborative Consultation Practices in Natural Environments. Journal of Early Intervention, 32(2), 127-150. |
Link | No direct link; the instrument appears in the Basu et al. (2010) article cited above. |
Summary | Comments about sensitivity to change | Limited information available. Behaviors are rated on a 3-point scale (Almost Always, Sometimes, Never), which may limit detection of change. |
General remarks | The instrument is available in the Basu et al. (2010) publication. |