6.6.2 Focus Groups
Measurement instruments and procedure. Through 45-minute focus group
meetings at two schools, qualitative data were gathered about the users' perceptions of the
redesigned reports. The sessions were set up in a group discussion format (Newby, 2010),
similar to the previous meetings. The educational adviser fulfilled the role of moderator and
was present to answer assessment-specific questions. The first author took notes and clarified
the rationales behind the design solutions, when necessary. For each report, the participants
were shown the original version, followed by the aspects in need of change, and a display of
the redesigned version. Next, the moderator presented the results from the questionnaire,
showing the respondents' perceptions of the redesigned reports. Subsequently, for each report,
the participants were encouraged to discuss their thoughts on the redesigned reports. They
were asked to indicate whether they would like the changes to be implemented, and whether
further adaptations or refinements were needed.
Furthermore, the experts came up with the idea of optionally organising the ability
growth report based on clusters of pupils at the same level. The users' ideas regarding
offering this option were also probed.
Respondents. The focus group at School 1 consisted of two teachers and two school
principals. The focus group at School 2 comprised three teachers, an ICT
teacher/coordinator, an internal support teacher, and a deputy principal.
Data analysis. The participants' comments and recommendations were summarised
for each report. Subsequently, these responses were systematically mapped onto the design
table (Table 6.1). This analysis served to determine which design solutions were
satisfactory, and which ones needed to be adapted or refined.
Results. The participants were enthusiastic about the redesigned reports and agreed
that they should be implemented. Particularly valued were the legends in the pupil report and
group report, and the level indicators in the ability growth report. The participants provided
some useful suggestions for refinement, which mainly related to the tones of the colours used
and the distinctiveness of the colours within one report. There was also a positive
reaction to the possibility of organising the ability growth report based on clusters of pupils at
the same level. The participants reported that this would be particularly valuable for the
school's own evaluation purposes.
However, one aspect of the design, relating to the score intervals, remained a point of
doubt. The researchers expected that integrating the score intervals in the graph, in addition
to reporting the numeric information in the table, would be helpful (inspired by Brown, 2001;
Vezzu, VanWinkle, & Zapata-Rivera, 2012; Wainer, 1996). This expectation was informed by
the results of Van der Kleij and Eggen (2013), which suggest that many users do not
understand what the score intervals mean or how to use this feature, and often ignore this
information. The focus group participants claimed that the score intervals might bother or confuse
some users. Participants at School 1 did indicate that visualisation of the score intervals was
useful, and that it showed a pupil's learning trajectory more clearly. Nevertheless, the results
clearly indicated the users' preference for an option not to display the intervals.
6.6.3 Consultation with Key Informants
In the subsequent update of the Computer Program LOVS, a number of changes were
implemented (see, for example, Figures 6.10 and 6.11). These revisions, along with the
remaining intended changes, were evaluated using key informant interviews (Mellenbergh,
2008; Streiner & Norman, 1995).
Figure 6.10.
