2. Did your structure and session plan work? If not, why not, and what will you do differently next time?
Despite directions from the module leader, many preferred to complete all the questions relating to each learning outcome before proceeding to the next. This led to some not touching all the learning outcomes in the allotted contact time.
Perhaps this could be alleviated by marking some problems to be done outside of
contact hours or giving guidance on how much of the contact time we would like
students to spend on each group of questions.
3. What facilitation techniques were used to engage students in the topic and in discussion? Did they work? If not, what else could you do?
The vast majority of the class enjoyed the café style, working in pairs and fours.
Those who preferred working alone were permitted to do so where it was of benefit to
their studies.
Most seemed happy to engage the demonstrators and module leader when they were stuck.
Near-examination update: Our failure to track which students did not hit all three learning outcomes led to some surprising gaps in knowledge during revision time. Better records should be kept on the questions that students are completing.
4. What did you learn most from the experience and what will you do differently next time?
The café-style learning breeds healthy discussion and encourages deep learning. I would like to see it applied elsewhere.
Students need to be encouraged to try to hit all learning outcomes during the
contact time, otherwise some may slip through the cracks.
Near-examination update: Students will often get stuck on learning outcomes that they did not reach in contact time. Better records will allow the team to track and chase those who do.
Bibliography
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H. and Krathwohl, D. R. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. Longmans, Green, 1956.
B. Reflective logs
Preparing Future Academics (PFA) Form Learn 1
Learning Log Template
2. Did your structure and session plan work? If not, why not, and what will you do differently next time?
The assessment was intentionally close to the style of the summative assessment. Only topics from the previous three weeks were covered, to help students target their revision for this instance. Both of these gave the feel of the actual examination without overwhelming the student and defeating the point of the exercise.
The markers were asked to sit the assessment in advance under a tighter time constraint and then mark another's script. This helped to check the consistency and soundness of the assessment and to write relevant feedback.
As this was the students' first closed assessment of the course, I explained the purpose of formative assessment, the precise protocol of this session and how it would differ from the summative assessment. Students appeared to appreciate the use of the assessment as a learning tool.
The intention was to return scripts in time for midterm supervision meetings.
Unfortunately, some supervisors did not follow guidance on arranging these meetings
after the assessment marking deadline. It is a concern that some students missed out
on this face-to-face feedback.
3. What facilitation techniques were used to engage students in the topic and in discussion? Did they work? If not, what else could you do?
Scripts were generally returned through supervisors, providing an opportunity to
discuss how to orient their studies. Both supervisors and students appreciated the
practical evidence on which they could base their discussions.
Students were invited to discuss their results with the markers and module leader in person or by e-mail. None did. It is unclear whether this was a problem, given that most discussed the feedback with their supervisors.
4. What did you learn most from the experience and what will you do differently next time?
Align the formative assessment with the summative assessment, giving a clear mapping between the two. This will help manage student expectations and provide a path to learning.
Ask other people to actually sit the assessment in advance of the students. This will help iron out any inconsistencies.
Better communicate the intention to have face-to-face meetings. Ensure that some
face-to-face meeting (either with module leader or PGWT) takes place if it is not
possible to have one with the supervisor.
Bibliography
Cowie, B. and Bell, B. A Model of Formative Assessment in Science Education. Assessment in Education: Principles, Policy & Practice, 6(1), Routledge, 1999.
Preparing Future Academics (PFA) Form Learn 1
Learning Log Template
Title of teaching session: Code Generation and Optimisation: A Picture to Turtle compiler
(with Matthew Naylor)
Date: 18th February 2011
1. Did you achieve your intended learning outcomes? If not, why not?
Yes. All those who attended the class completed both intended learning outcomes: (LO1) implementing a picture to turtle compiler and (LO2) defining appropriate properties for compiler correctness.
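LO2 amounts to the classic semantic-preservation property: running the compiled output must agree with the meaning of the source program. The practical itself concerned pictures and turtle graphics; as a purely illustrative sketch (none of these names come from the course material), the same kind of property can be stated in Python for a toy compiler from arithmetic expressions to a stack machine:

```python
# Illustrative only: a toy compiler from arithmetic expressions to a stack
# machine, with a semantic-preservation property in the spirit of LO2.
# ("lit", n) denotes the number n; ("add", l, r) denotes l + r.

def interpret(expr):
    """Reference semantics: evaluate the source expression directly."""
    tag = expr[0]
    if tag == "lit":
        return expr[1]
    if tag == "add":
        return interpret(expr[1]) + interpret(expr[2])
    raise ValueError(f"unknown expression: {expr!r}")

def compile_expr(expr):
    """Compile to stack-machine code: PUSH n, or ADD (pop two, push sum)."""
    tag = expr[0]
    if tag == "lit":
        return [("PUSH", expr[1])]
    if tag == "add":
        return compile_expr(expr[1]) + compile_expr(expr[2]) + [("ADD",)]
    raise ValueError(f"unknown expression: {expr!r}")

def run(code):
    """Execute stack-machine code; the result is the single value left."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:  # ADD
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack.pop()

def correct(expr):
    """The correctness property: compiled code agrees with the semantics."""
    return run(compile_expr(expr)) == interpret(expr)
```

A property phrased this way is exactly what property-based testing can probe with many generated inputs, reporting any expression on which the compiler and the reference semantics disagree.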
2. Did your structure and session plan work? If not, why not, and what will you do differently next time?
As the exercises build on the previous weeks', we ensured that students were not stuck by asking individuals to demonstrate functionality and providing example solutions to the previous weeks' exercises. As all students reached the intended learning outcomes, this appears to have succeeded.
3. What facilitation techniques were used to engage students in the topic and in discussion? Did they work? If not, what else could you do?
We engaged students in a number of ways:
We toured the class, observing progress directly and asking students both
generally about their progress and specific questions. Specific questions tended
to invite fuller responses.
Students were also encouraged to indicate if they were having difficulty. We endeavoured to see those having trouble immediately, before returning to the tour. Most of the questions we received came through this technique.
4. What did you learn most from the experience and what will you do differently next time?
The use of automated assessment (as referred to in my symposium presentation)
allows quick assessment and targeted feedback on functionality.
The class seemed to enjoy the balanced problem specification. We had attempted to give enough information for the student to understand a high-level approach to the problem, while not feeling constrained by it. However, providing a variety of exercises of differing difficulty can be used to stretch the able and support those needing assistance.
Preparing Future Academics (PFA) Form Learn 1
Learning Log Template
Title of teaching session: Code Generation and Optimisation: Getting compilers right
(guest lecture)
Date: 17th February 2011
1. Did you achieve your intended learning outcomes? If not, why not?
All learning outcomes were discussed, but I think more detail on "Apply some verification techniques to a problem" (LO5) could have helped strengthen the concept. As it was, the class seemed confused.
2. Did your structure and session plan work? If not, why not, and what will you do differently next time?
The pauses for discussion worked very well. They could be lengthened or shortened
as necessary to keep to the session plan timing.
More time should have been given to "Apply some verification techniques to a problem" (LO5), possibly at the expense of some of the motivation. Not enough time was available to explain the concepts properly.
3. What facilitation techniques were used to engage students in the topic and in discussion? Did they work? If not, what else could you do?
Different techniques were used to encourage discussion depending on the stage of the
lecture.
A quick yes/no shout-out was used near the beginning to engage the class with an 'easy' question. Nearly all responded, and all those who did answered correctly.
A structured discussion point was used around the middle, where students were
asked to discuss a number of well-structured questions in small groups and then
contribute back to the main class. All students participated in a group and most groups
contributed back to the class.
An unstructured discussion point was used towards the end, where students were
given an unstructured question to discuss in small groups and contribute responses.
These responses were then used as further class discussion points. While all students
participated in groups, not as many felt comfortable supplying responses. However, the
class did manage to supply all the expected responses.
Brief, written feedback on the lectures was requested by means of a structured form.
Everyone in the class responded, to varying degrees of depth.
What seemed to be missing was a practical exercise, something to be considered for another
occasion.
4. What did you learn most from the experience and what will you do differently next time?
While the majority of the class found the discussion points useful, one or two students found them superfluous and disengaging. I am unsure whether they can be accommodated.
Another area of feedback was that I needed a more confident style. I had a tendency to defer to members of the class.
Multiple paths are required for complex topics, such as theorem proving. Different
classes need the information at varying degrees of depth. The choice needs to be
made on-the-fly.
B.4. Reflective log for a CGO guest lecture
Preparing Future Academics (PFA) Form Learn 1
Learning Log Template
Title of teaching session: Lexical and Syntax Analysis of Programming Languages: Bison,
a Parser Generator (with module leader Matthew Naylor)
Date: 23rd May 2011
1. Did you achieve your intended learning outcomes? If not, why not?
Yes. By the end of the two-hour practical, all of the students were able to use Bison to build a syntax checker (LO1.1) and evaluator (LO1.2) for a small arithmetic language. Nearly all managed to extend the language (LO2) with Boolean expressions. Most had at least begun to include (LO3) imperative statements. Those who did not achieve this week's learning outcomes had missed the previous practical. They were encouraged to complete it in their own time and to request help by e-mail.
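The students built these with Bison; purely to illustrate what a syntax checker (LO1.1) and evaluator (LO1.2) for such a language do, here is a hand-written recursive-descent sketch in Python. The grammar shown is an assumption for illustration, not the course's Bison specification:

```python
# Illustrative only: a recursive-descent syntax checker and evaluator for a
# tiny arithmetic grammar, analogous to what students built with Bison:
#   expr   ::= term ('+' term)*
#   term   ::= factor ('*' factor)*
#   factor ::= NUMBER | '(' expr ')'
import re

def tokenize(src):
    # Numbers, or any single non-space character; invalid characters are
    # rejected later by the parser rather than silently dropped.
    return re.findall(r"\d+|\S", src)

def evaluate(src):
    """Parse and evaluate; raises SyntaxError on malformed input."""
    tokens = tokenize(src)
    i = 0

    def peek():
        return tokens[i] if i < len(tokens) else None

    def expr():
        nonlocal i
        value = term()
        while peek() == "+":
            i += 1
            value += term()
        return value

    def term():
        nonlocal i
        value = factor()
        while peek() == "*":
            i += 1
            value *= factor()
        return value

    def factor():
        nonlocal i
        tok = peek()
        if tok is None:
            raise SyntaxError("unexpected end of input")
        if tok == "(":
            i += 1
            value = expr()
            if peek() != ")":
                raise SyntaxError("expected ')'")
            i += 1
            return value
        if tok.isdigit():
            i += 1
            return int(tok)
        raise SyntaxError(f"unexpected token {tok!r}")

    result = expr()
    if i != len(tokens):
        raise SyntaxError("trailing input")
    return result
```

In the practical the grammar was instead declared in a Bison `.y` file, with semantic actions computing the same values; the extension outcomes (LO2, LO3) correspond to adding Boolean operators and statements to such a grammar.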
2. Did your structure and session plan work? If not, why not, and what will you do differently next time?
The lesson plan worked well. Most students immediately understood the problem and what resources they needed to complete the exercise sheet. The solution was slightly mechanical and, next time, it would be preferable to supply a problem that required a bit more ingenuity to stretch able students, without scaring those who are struggling.
3. What facilitation techniques were used to engage students in the topic and in discussion? Did they work? If not, what else could you do?
We engaged students in a number of ways:
The problem sheet was introduced and summarised to the class 'from the front'. At this point, immediate questions about the problem sheet were invited, but few had looked at it in advance and, therefore, did not feel comfortable asking questions on it. Next time, I would like to introduce the problem sheet, give a chance to work on a small, accessible section and then summarise the rest of the sheet before asking for questions.
We toured the class, observing progress directly and asking students both
generally about their progress and specific questions. Specific questions tended
to invite fuller responses.
Students were also encouraged to indicate if they were having difficulty. We endeavoured to see those having trouble immediately, before returning to the tour. Most of the questions we received came through this technique.
4. What did you learn most from the experience and what will you do differently next time?
The majority of questions were not actually on the intended learning outcomes but on technical matters about peculiarities in the tools. This may be an indication that the tools selected are not the best for teaching this topic.
Most students were working alone and, while teamwork was not an explicit intended
learning outcome, it often helps students to work in small groups. Perhaps the set
problem can be orientated to encourage this better.
B.5. Reflective log for an LSA practical
C. Peer and staff observations
Contents
C.1. Peer observation of my teaching, performed by Christopher Poskitt
C.2. Peer observation of Christopher Poskitt teaching, performed by me
C.3. Staff observation of Dr Chris Fewster, performed by me
Preparing Future Academics (PFA) Form Obs/1
Observation of Teaching
(Depending on which observation (either peer or other teaching) not all questions below will
be appropriate for all sessions)
Title of teaching session: CGO Guest Lecture: Getting Compilers Right (Jason S. Reich)
Date: 17th February 2011
Observer: Chris Poskitt
1. What in your opinion went well in the session? Why?
The lecture was very effectively structured; a lot of thought had gone into this. Jason
began with a clear set of learning objectives, then placed the lecture within the context of
both the CGO module, and previous modules that the students had taken. The main part of
the lecture was organised into sensible sections, and 'breadcrumbs' at the top of each
slide allowed students to clearly see where they were. The lecture closed with a summary
of what had been covered, and Jason was careful to relate this to the learning objectives
stated at the beginning.
Jason took care on multiple occasions to motivate the work by linking it to real examples. A
sense of excitement was conveyed to students by linking the work to ongoing research.
Humour was used effectively and made the lecturer appear approachable.
The group discussions mid-lecture were an excellent feature. They broke up the lecture
(avoiding information overload), and really did encourage students to think about the
issues with the people sat next to them. The students were asked to ponder over a number
of open-ended questions; from what I could tell, they engaged with these questions very
well, and even challenged each other's views.
The pace of the lecture was good (initially it was quite fast, but this became steady very
quickly and then remained at a good pace).
2. What in your opinion could be improved or developed? How might this be achieved?
Jason asked audience members (myself included) on a couple of occasions if what he had just said was correct. On all occasions, of course, he was right, but I suggest not asking for reassurance like this in the future, as it could potentially affect a student's confidence in the lecturer.
For much of the lecture, Jason stood behind the AV tower. This was not a problem for most of the students in the room, but I felt that, potentially, students sat at the far right of the room may not have had such a clear view of the lecturer. (This is speculation, since I was not sat there, but nonetheless it remains very good practice to consider carefully where you stand when delivering a lecture.)
I felt that the code slides had a bit too much information on them. It might be more
effective to use overlays for these, introducing lines/blocks of code one-by-one. I suspect
that a student would find it much easier to focus on the code if presented this way, rather
than presenting a lot of code all at once (it is easy for tired eyes to wander).
3. What techniques did the lecturer use to encourage discussion? Did they work?
The lecturer broke off from lecturing on two occasions for students to discuss some
open-ended questions amongst themselves. As discussed earlier, this proved to be very
effective, and clearly engaged the students with the material.
4. Please comment on areas in which the lecturer invited feedback
N/A
5. Any other comments or suggestions
Did I see Comic Sans font in one of the diagrams?!?! :-)
The lecturer handed out some very effective feedback forms, not dissimilar to those used
by the RDT. From what I could tell, students actually left some meaningful feedback, as
opposed to the pleasant but unhelpful generic 'good lecture' comments that other types of forms may have invoked!
Overall, an excellent lecture and I am not alone in thinking this! I hope that some of these
comments are helpful.
D. PFA symposium submission
Contents
D.1. Executive summary
D.2. Symposium slides
D.1. Executive summary
I chose the topic, 'Do you see opportunities for innovative assessment methods in modules that you have been involved in supporting, or are the current methods still fit for purpose?' and applied it to the teaching of programming within computer science and beyond.
The presentation (section D.2):
Highlighted the many departments where computer programming was taught.
Described and explained the common intended learning outcomes from a programming
module.
Discussed the existing teaching strategies.
Presented informal feedback from both students and teachers.
Used this feedback to motivate suggestions, based on research into the teaching of programming skills.
Demonstrated how these suggestions could have also been derived from the existing
intended learning outcomes. In particular, these suggestions were related to professional
practice.
Briefly described real-world results where the suggestions had been put into practice and indicated where further study was required.
Constructive alignment runs deep. Going back to the beginning may give you the solutions
you need.
D.2. Symposium slides
E. Skillsforge points summary
Points Summary (Green Card)
The name Green Card is taken from the professional development programme used in The
Department of Biology. The Green Card pulls together all of your completed courses and
development activities and sums up the points that have been awarded.
Points total to date: 37.0
Date         Title of course / activity                                       Points  Awarded by
-----------  ---------------------------------------------------------------  ------  ---------------------------
10-Jun-2011  Maintaining Innovation and Enthusiasm in University Teaching        3.0  Researcher Development Team
26-May-2011  Research with Impact in Computer Science, Electronics and the
             Physical Sciences                                                   3.0  Researcher Development Team
27-Apr-2011  Evaluation and Quality Enhancement                                  3.0  Researcher Development Team
15-Mar-2011  PFA Symposium 2011                                                  5.0  Researcher Development Team
28-Feb-2011  Learning Styles & Student Motivation                                3.0  Researcher Development Team
14-Feb-2011  Introduction to Pedagogic Research                                  3.0  Researcher Development Team
26-Jan-2011  Planning Assessment Methods for Student Work                        3.0  Researcher Development Team
20-Jan-2011  Structuring and Designing Courses                                   6.0  Researcher Development Team
13-Dec-2010  Effective Lecturing                                                 3.0  Researcher Development Team
20-Oct-2010  PFA Introduction                                                    5.0  Researcher Development Team

* Please allow 3 weeks from the end of the course for points to be awarded. If, after this time, points still have not been awarded, please contact the department responsible.
F. Student feedback
Contents
F.1. General feedback from Mathematics for Computer Science
F.2. General feedback for Code Generation and Optimisation
F.3. Specific feedback for Code Generation and Optimisation guest lecture
F.3.1. Rating feedback
F.3.2. Free-form feedback
F.3.3. Keyword feedback
F.1. General feedback from Mathematics for Computer Science
All students were asked to complete a standardised feedback form during the last practical
session of each module. The question relating to postgraduates who teach asked:
Please comment on practicals and/or problem classes and/or case study sessions.
For instance, were the Postgraduate Teaching Assistants (PTAs) helpful to you? If
you can identify specific PTAs, that would be helpful.
Example response 1 PTAs helpful as they explained it instead of just pointing to formulas like Colin [the module leader] did. (Jason was very good)
Example response 2 Jason was very good, gave useful pointers or hints instead of full
answers.
Example response 3 They were helpful, it was sometimes hard getting hold of them. Jason
was particually good.
Example response 4 Jason was helpful. [The other PGWT] struggled to explain things.
General themes Helpfulness, approachability, friendliness, knowledge, hints, difference from lecturer style.
F.2. General feedback for Code Generation and Optimisation
All students were asked to complete a standardised feedback form during the last practical
session of each module. The question relating to postgraduates who teach asked:
Please comment on practicals and/or problem classes and/or case study sessions.
For instance, were the Postgraduate Teaching Assistants (PTAs) helpful to you? If
you can identify specific PTAs, that would be helpful.
Example response 1 Good support in lab practicals. Interesting tasks and good that each
lab builds on the previous one. Paper practicals not as enjoyable; might be better if they were weekly one-hour sessions.
Example response 2 PTAs in this module are far better than any other modules this year.
They are approachable and helpful when you ask them questions. The problem classes were
good though reasonably challenging.
Example response 3 Both Jason and [the other demonstrator] were extremely helpful in
both lab and pen/paper practicals.
General themes Helpfulness, approachability, friendliness, constructive and connected exercises, length of sessions.
F.3. Specific feedback for Code Generation and Optimisation guest lecture
I collected written feedback for my guest lecture in the Code Generation and Optimisation
module using a custom form. This form, inspired by that used by the Researcher Development
Team, encourages respondents to supply structured but constructive feedback. Thirty-three
responses were received.
F.3.1. Rating feedback
Figure F.1.: Chart of rating feedback
Participants were asked to rate three statements on a scale from 'strongly disagree' through 'no opinion' to 'strongly agree'. The results are summarised in Figure F.1.
The statements used were:
Q1 I feel the knowledge/skills learned will
help me in my work or study.
Q2 I found this session useful.
Q3 I found the facilitator was sufficiently engaging and informed.
F.3.2. Free-form feedback
Participants were asked to indicate one thing that the facilitator should stop, start and continue.
Example response 1 STOP: Asking the audience if you are correct. CONTINUE: Discussion breaks! START: ...
Example response 2 STOP: Writing on whiteboards. CONTINUE: Lecturing!
Example response 3 STOP: (nothing to say here!). START: Making questions more precise (unless the ambiguity is intended?) CONTINUE: With engaging style of presentation!
Example response 4 STOP: (can't think of anything, sorry). START: Making people feel they can ask questions (answering 'Fair Enough' seems a bit negative) and more notes on slides. CONTINUE: Discussions and questions, sessions for consolidation of discussion.
Example response 5 START: Using a pointy stick for projector slides.
Example response 6 CONTINUE: Interactive aspect of lecture and top bar [breadcrumbs] on slides.
Example response 7 STOP: Interactive Q & A in lecture.
General themes Most respondents enjoyed the discussion breaks. Only one specifically asked for them to stop. A few comments mentioned my visible nerves: rushing parts and asking the audience if I was correct. One response commented that I may not have been engaging with audience questions.
F.3.3. Keyword feedback
Participants were asked to circle three keywords (out of a selection of eighteen) that they felt best described the lecture.
Keyword        Frequency
Informative    19
Relevant       16
Interesting    15
Interactive    13
Engaging       12
Constructive    7
Worth my time   5
Helpful         4
Theoretical     4
Practical       2
Too fast        2
Boring          0
Irrelevant      0
Too detailed    0
Too slow        0
Unfocused       0
Unhelpful       0
Waste of time   0

Table F.1.: Keyword feedback