The Dangers of Feedback

Greg Foley is a lecturer in the School of Biotechnology.


Feedback is one of those things in education that is always seen as a ‘Good Thing’. (An excellent paper on feedback by John Hattie can be found here.) But providing feedback also has its dangers. To explain, I need to digress. A few years ago I did a bit of research on artificial neural networks (ANNs) for data analysis in chemical engineering. ANNs are essentially a fancy form of non-linear regression and, while they are very powerful, they can be misused. One of the dangers of ANNs is that if you don’t know what you are doing you can design networks that are ‘over-trained’. To illustrate what over-training is, consider the two curve fits shown below. The first fit, a simple linear regression, seems to have captured the essence of the relationship between the dependent and independent variables. The second fit, on the other hand, matches the data exactly but is probably ‘over-trained’: it is seeing patterns where it is likely that none really exist. This fit cannot ‘see the wood for the trees’.
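For readers who like to see the idea made concrete, here is a minimal sketch of the two fits described above, using NumPy. The data, noise pattern and polynomial degree are illustrative choices of mine, not from the original figure: ten points on a genuinely linear relationship with a small alternating ‘noise’ term, fitted once with a straight line and once with a degree-9 polynomial that passes through every point.

```python
import numpy as np

# Ten equally spaced observations of a truly linear relationship,
# with a small alternating "noise" term added to each point.
x = np.linspace(0.0, 1.0, 10)
noise = 0.1 * (-1.0) ** np.arange(10)
y = 2.0 * x + 1.0 + noise

# Fit 1: simple linear regression -- captures the essence of the trend.
linear = np.polyfit(x, y, 1)

# Fit 2: a degree-9 polynomial that matches the ten points exactly --
# the 'over-trained' model that sees patterns in the noise.
overfit = np.polyfit(x, y, 9)

# On the training points the over-trained fit looks perfect...
train_linear = np.mean((np.polyval(linear, x) - y) ** 2)
train_overfit = np.mean((np.polyval(overfit, x) - y) ** 2)

# ...but judged against the true underlying relationship at new
# points, it oscillates wildly between the training points.
x_new = np.linspace(0.05, 0.95, 19)
y_true = 2.0 * x_new + 1.0
test_linear = np.mean((np.polyval(linear, x_new) - y_true) ** 2)
test_overfit = np.mean((np.polyval(overfit, x_new) - y_true) ** 2)

print(f"train MSE: linear={train_linear:.4f}, overfit={train_overfit:.2e}")
print(f"test  MSE: linear={test_linear:.4f}, overfit={test_overfit:.4f}")
```

Run it and the training error of the degree-9 fit is essentially zero while the linear fit leaves a visible residual; on fresh points, the ordering reverses and the over-trained fit is far worse. That is over-training in one screenful.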



So what has this got to do with feedback? Often, when I am giving oral feedback to students (in a lab for example) I am struck by the fact that students tend to want a sort of recipe for success. They want to be told precisely what they need to do to score a high mark. They don’t like generalities like being told that their graphs and tables should be presented in a ‘logical order’ or that they need to improve their attention to detail. Indeed, it is often the best and most ambitious students who desire this level of precision in the feedback they receive. It’s as if students want to be ‘over-trained’ so that their work matches exactly the ‘perfect’ lab report where ‘perfect’ is defined by the lecturer’s marking scheme.

Getting the balance right between providing students with useful guidance and facilitating them to jump through hoops can be tricky.


Twitter for learning: does it work for you?

Muireann O Keefe works in the Teaching Enhancement Unit in DCU. She is an academic developer – a role that assists those working in higher education to think critically about their practices as lecturers, and to make research-informed changes to practice.


According to learning consultants like Jane Hart and other educators out there, Twitter is a top tool for learning. In fact Jane Hart’s annual poll of learning tools among professionals has voted Twitter a top tool for learning seven years in a row!

I can’t say exactly how many people use social networking tools for learning in higher education, but I do know that growing numbers of academics and professionals in higher education are logging on and signing up to services such as Twitter, LinkedIn and YouTube to learn from others, to keep up to date, and to share practice with other interested professionals.

Researchers in this area (Martin Weller, Cristina Costa and George Veletsianos) describe how academics share knowledge and scholarship with one another via Twitter. For these researchers the web is an open participatory place where sharing and connecting with others is useful for disseminating research and creating networks of people around topics of special interest.

I am an academic developer and in my job I have been helping and coaching academics on enhancing teaching practices for a number of years. Back in 2009 I began to encourage staff to use Twitter as a means of keeping up-to-date with the latest research and information regarding practice. I gave people introductory workshops on using Twitter and social networking tools. Many staff who adopted Twitter perceived it as a learning tool.

EdD research

When I started my EdD studies I chose to explore Twitter as a learning tool, and this exploratory research has given me in-depth qualitative insight into how Twitter has been used for learning by teaching faculty in higher education.

Research Findings

So what did I find in my research? I discovered that all of my participants perceived Twitter as a tool for learning, resonating with the claims made by educational consultants about Twitter as a learning tool for professionals.

On further analysis…

However, when I looked more closely at the Twitter data, I noticed that participants demonstrated different levels of social engagement on Twitter. The majority were not sharing practice or having conversations with other tweeters; in fact, most were listening in and lurking on the Twittersphere, gathering information disseminated by others.

This finding jarred with me because of my social constructivist beliefs about learning. I consider that learning happens socially among people: we learn from one another, from our experiences and from our cultural contexts. Yet the participants in my study who used Twitter (a platform built for social networking) did not engage in social networking activities with other tweeters.

I followed up with interviews, which revealed some very interesting data: not all of these participants experienced a sense of belonging to networks on Twitter. They were “not ready” to be part of conversations online, citing perceptions that others were more knowledgeable than they were, and feelings of vulnerability.

In contrast, the minority of research participants who were highly active in social networking said that their confidence, which came from their grounding in education, enabled them to share practice and have conversations about professional educational practice on Twitter.

Lack of professional confidence is an issue

So it seems that while Twitter is perceived as a useful tool for learning, some participants did not have the courage or confidence within their professional selves to use it as one. They did not identify or feel a sense of belonging with similar professionals online, and this was a barrier to social engagement and to expressing their voice online.

So how can these findings be used?

This research is useful to my work as an academic developer because it has taught me that purely technical support with tools such as Twitter is inadequate; the development of professional identity needs to be factored into the process.

Other issues have also come to light in my practice: by advocating the use of online social tools, was I potentially placing people in vulnerable situations online? Safety and vulnerability are real risks online, and they need mindful, critical discussion when advocating open online tools for learning. Certainly, going forward, I will be facilitating deeper discussion with learners (academics, undergraduates or whoever) about how they use open online tools and about the development of identity as part of their online learning process.



The spacing effect and semesterisation

Greg Foley, Lecturer in Bioprocess Engineering, School of Biotechnology


There is a well-established phenomenon in learning science known as the spacing effect, illustrated schematically in the figure below. Basically, it is something we all intuitively know: the more times we revise material, the better it sticks in our memory. Our instinctive awareness of the spacing effect is the reason why we advise our students not to cram for exams but to study continuously throughout the year.


Ideally, a student should study regularly and often (but not too often, because it turns out that it’s good to forget and re-learn), and in each study session he/she should not cover only the most recent material; on the contrary, some or all of the accumulated material up to that point should be revisited. In practice, this means that study sessions should get longer and longer as exam time approaches. It goes without saying that studying like this is challenging because it demands a lot of commitment and sacrifice, especially in the smartphone age.

The question is, therefore: does our current, very condensed semesterised system make it more difficult for students to study effectively? It is interesting that in the US where the semester system has long been the natural order of things, good learning and study practice is more or less enforced through the use of homeworks, quizzes, mid-terms and finals. Testing is frequent and because there is so much of it, the stakes in any given test are not so high as to be overly stress-inducing. Crucially, this means that you can set assignments/ problems that really test the student’s ability to think critically and creatively without severely penalising weaker students.

For sure we have continuous assessment components in many modules, and this does help to reduce failure rates, but paradoxically this might be part of the problem. Suppose you have a CA component that accounts for 20% of the final module mark and involves a couple of undemanding in-class tests covering small chunks of material, material that in many cases is re-examined in the final exam. Now suppose a student gets 70% on that CA, a common occurrence in my experience. (CA marks tend to be much higher than exam marks.) Then the student only has to get about 33% in the end-of-semester exam to pass the module. That’s quite a low bar, and an average student will be able to get over it with little more than a short period of cramming.
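The arithmetic behind that 33% is worth spelling out. Assuming a standard 40% pass mark (my assumption; the post doesn’t state it, but the figures only work out this way with 40%), the calculation looks like this:

```python
# Pass-mark arithmetic for a module with a CA component.
# Assumes an overall pass mark of 40% (an assumption, not stated in the post).
ca_weight = 0.20     # CA is worth 20% of the module mark
ca_score = 70.0      # percent scored on the CA tests
pass_mark = 40.0     # overall percentage needed to pass

ca_contribution = ca_weight * ca_score             # 14 marks already banked
still_needed = pass_mark - ca_contribution         # 26 marks still required
exam_threshold = still_needed / (1 - ca_weight)    # as a % of the exam itself

print(f"Exam mark needed to pass: {exam_threshold:.1f}%")  # 32.5%
```

A 70% CA mark banks 14 of the 40 marks needed, so only 26 of the exam’s 80 marks are required: 26/80 = 32.5%, rounding to the 33% quoted above.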

Even without a CA component, the chunks of material that students have to revise in a semesterised system are relatively small anyway. Combine this with the fact that most lecturers provide online notes, thereby encouraging low attendance, and you get a ‘perfect storm’ in which students are almost incentivised to learn ineffectively.

The really worrying thing about all of this is that if students put their faith in cramming, much of what they will have learned will be rapidly forgotten, maybe even before the start of the next semester. The knock-on effects are both serious and obvious.

Of course, the horse has bolted and there is no going back to non-semesterised ways but we may perhaps need to do some hard thinking about the very nature of third level education here in Ireland. We seem to have put a lot of our eggs in the ‘independent learning’ basket but it is hard not to think that we have adopted a system that encourages study habits that are inconsistent with effective independent learning.

Having studied in the US, I think the US approach might suit the Irish student of 2016. However, that would be problematic for a number of reasons, not least the fact that the American system is underwritten by cheap labour in the form of teaching assistants.


What would doubting Thomas do? Some thoughts on assessment

Dr Mel Duffy, Lecturer in Sociology & Sexuality Studies, School of Nursing & Human Sciences


Grading examinations or assignments has become problematic in recent years. There was a time when grading was perceived as a stepping stone, enabling students to become better at communicating in writing about their field of study and, more importantly, about a subset within that field. It was based not on the banking method of learning but on the fluidity of knowledge, whereby what was learned in one subset directly, and at times indirectly, influenced the next. Within this system students produced work by an allotted time, it was marked, and in return the student received feedback. Upon receiving a lower grade than expected, the student would have been permitted to resubmit, taking into consideration the pointers that had been discussed. Today this appears to be the stuff of nostalgia. No longer do academics have the time to accept resubmissions; instead we have re-sits. The language surrounding students’ ability has moved from reality to the rarefied world of potentiality, without due regard to the authenticity of that potential in the first place. Therein lies the problem. Do rubrics really encourage ongoing learning in the student? Are they really a system of empowerment for the lecturer? Or do rubrics in reality exist as a Berlin Wall between the academy and the legal system?

If we are to assume that rubrics are aids to the development of a robust curriculum, are we then suggesting that the tried and tested methods of the past produced less than robust results? In actual fact there appears to be little questioning of the validity of the use of rubrics, particularly amid the constant chatter in the educational setting, increasingly gone public, about the ‘dumbing down’ of our degrees. If we are constrained by the rubric for assessment, are we failing to recognise the creative, once-in-a-lifetime brilliant mind before us because the rubric is too tight to allow for it? We are ruled by convention, teaching students to produce in a format that sits with the ‘high impact’ output requirements of academia, failing to recognise that articles in journals of such ‘high standards’ are rarely read by more than ten people, though they make the producer, and by extension the institution, look good. Marx suggests that new knowledge is the pushing of an already existing idea in a direction never thought of before. Within this way of thinking, will we academics fail the person who does just that because it does not sit within the confines of a particular box?

What emanates from this is my real fear of failing the student, the one or ones who do not fit the mould but rather sit outside the box looking in, discovering that there is no place for their thought processes. My concern arises from my experience of dealing with first-year students who struggle with the idea that we ‘kinda like to know what they think’, and whom we then confine in their writing with our grading tools. I am, therefore, the doubting Thomas of the rubric, who may in reality only be converted through using rubrics, or through reflecting on their usage.

Editor’s Note: Anyone interested in the whole idea of rubrics and whether there are better alternatives might like to read this fascinating article on comparative judgement.




Making the transition to blended learning as an educator for the first time? Some tips to ease the transition

Dr. Mary Rose Sweeney, School of Nursing and Human Sciences


There has been substantial growth in the availability of Higher Education (HE) programmes delivered remotely over the past 10 years, with the aim of facilitating an “anytime, anyplace” approach to learning. This is driven by the need for HEIs to provide flexible learning opportunities as well as a desire to attract students from wider pools. In theory this should require fewer resources and fewer campus facilities, thereby easing the burden for institutions with shrinking budgets, fewer staff and more students.

In keeping with this agenda, the School of Nursing and Human Sciences (SNHS) at Dublin City University offered a new blended learning Bachelor of Nursing Studies (BNS) programme in 2011. The students who take this programme are qualified nurses, many of whom are very experienced in clinical practice but are returning to education after a prolonged gap in classroom-style learning, and some of whom have limited IT skills. For many of these students the return to learning was therefore daunting, and a steep learning curve ensued.

The students were not alone in their apprehensions, however: the programme team, who had to transition the programme from fully classroom-delivered teaching to a blended learning format, were also feeling somewhat apprehensive, with one staff member admitting, “I was frozen at the thought of making a video with myself in it”. It occurred to me that a stint in the Gaiety School of Acting might have come in handy for the preparation ahead, but I had never envisioned a career in “television”. As it turned out, I had to become adept at filming, lighting, sound and editing before the year was out.

The programme team and the students, who were based in Ireland and further afield, including the USA, Saudi Arabia and Tanzania, survived the first year. Afterwards we reflected on the entire process: how well it had worked, what specific issues emerged, and how we dealt with them and moved forward. In this blog we share those experiences so that others making the transition to blended learning for the first time can benefit from them. It is noteworthy that there is a paucity of published literature providing advice for educators embarking on the process for the first time.

Student engagement, student readiness and staff readiness were identified as important considerations in the successful execution of the programme. Initial apprehension and anxiety about engaging with technology-enhanced learning were identified as barriers, in addition to tight timelines and existing heavy workloads. The time commitment involved not only developing the materials and front-loading work but also getting to grips with the new technologies and finding innovative ways to deliver material and get comfortable with it. In addition, designing new, smaller activities to both assess learning outcomes and engage students remotely required additional time and planning. These activities had to be carried out while existing programmes were still being delivered and while the usual workload allocation remained in place for the staff involved. Arising from this, we recommend that staff making the transition from traditional face-to-face to blended delivery be given consideration within workload allocation, to allow adequate time for the amount of work required.

Fears and apprehensions, and hence staff readiness to use the new technology, emerged strongly in our data. That a staff member would be “frozen at the thought of making a video with myself in it” is a concern, and it indicates a training requirement that should have been addressed well before the programme was about to be rolled out.

It was also apparent that the transition required considerable input from our “technical” and “learning innovation” colleagues at all stages of the process, and that this input should continue even after the programme has been launched. This greatly enhances the quality and presentation of the materials, as well as helping with ideas for delivering content and designing assessment strategies.

Student ability/readiness to engage with technology-enhanced learning was an important determinant of success. Insufficient IT skills resulted in some attrition from the programme in this first year. An orientation programme was subsequently developed in later years to capture and support struggling students as early as possible in the semester and so reduce attrition. This took the form of “designated times” when students could phone or Skype in and get technical support, up to week 3 of the programme. Going forward, we recommend that all prospective students have a basic level of competency in IT prior to embarking on the programme, and that upskilling be provided beforehand where necessary.

Face-to-face time with academic staff emerged as being very important to our students, evidenced by the fact that they turned up for the few tutorials offered even though some of them were, as one colleague put it, “the graveyard shift”, taking place in the early morning, late evening and even at weekends. A similarly high participation rate was observed in the remote tutorials for students living outside Ireland. We scheduled these remote tutorials at 2pm Irish time to try to accommodate those in different time zones around the world. It was obvious that students had saved up all the questions and queries they had for the face-to-face sessions rather than submitting them electronically, and this prompted us to take a more proactive approach online in future, encouraging students not to be afraid to submit their queries electronically early on rather than waiting for the face-to-face or remote sessions.

During the face-to-face and even the remote classes we noticed that students shared their contact details with each other, and we observed how important peer support and peer learning were for them. Some students reported feeling rather isolated up to that point, but thereafter contacted one another if they needed support in some way (moral or academic). We concluded that activities that foster regular engagement between students are important and should not be lost in the process.

It quickly became evident that placing large chunks of course material on the VLE (e.g. Moodle/Loop) did little to engage students actively, and that measures to promote active participation were required: smaller activities such as quizzes, setting up discussion forums and inviting replies, making short Camtasia files available, making podcasts or using open-source ones, recommending ebook chapters and providing URL links to interesting articles.

Another issue raised by some of the team related to periods of apparent inactivity on the VLE, which led one colleague to ponder, “sometimes you wonder if there really are any students out there”. Early, regular and timely feedback to students was identified as critical for engagement, but also for improvement, particularly amongst the weaker students; it can also help to identify students who are particularly challenged in the online environment.

The retention of some element of online examination was considered desirable in order to reduce the reliance on 100% coursework and the attendant risk of plagiarism, though this is a concern for all educational programmes and is not unique to online or blended learning. Some expressed reservations about whether transitioning the programme to an online format encouraged students to engage in additional plagiarism; others felt that elearning was no more likely to encourage plagiarism than other forms of learning and assessment. There was agreement that quality of content should remain the core consideration for the programme team, and that ongoing work should be undertaken as the programme progresses into subsequent years to improve not only the content and delivery but also the sustainability. The need for globally relevant content is important: after all, the very idea of elearning is the prospect of reaching wider audiences.

The positive aspects expressed by the team included the new learning opportunities and experiences afforded to them in setting up the programme, the feeling of satisfaction now that it has been delivered, and the peer sharing of experiences, content and assessment strategies between those on the programme team. This is not normally how we work as academics; our work tends to be solitary rather than collaborative.

In nursing, elearning is a relatively new and emerging field that will require huge cultural shifts for staff and students alike. It is clear that students value face-to-face time with their educators, though, so we should be careful to strike a balance between the tools used and personal input and presence; otherwise we might be throwing the baby out with the bathwater. The lessons from our experiences are unlikely to be unique to nursing students and educators, and so should be transferable to other fields of study. Hopefully you can use some of them to inform and ease your own transition to blended learning.




A peer reviewed publication entitled “Transition to blended learning: Experiences from the first year of our blended learning Bachelor of Nursing Studies (BNS) Programme” has been accepted for publication in the Journal of Contemporary Nurse. If you want to see further details of the evaluation contact  for a copy of the paper.

Is there a LOT to be HOT? Some thoughts on pedagogical aspects of blogging

Dr Mel Duffy, Lecturer in Sociology & Sexuality Studies, School of Nursing & Human Sciences


Traditionally, university students were initiated into the frame of critical thinking. It was a movement away from the descriptive, with the understanding that, as students, they faltered and felt out of their depth as they questioned their ability to critique, with the overriding sentiment of ‘who are they to do so?’. The authors of the subject matter of their choice were seen to hold the keys to knowledge, and from them they would learn. It is rather daunting for a student to be asked ‘well, what did you think?’ or ‘do you agree?’, to the point of ‘how would you have written this differently with the knowledge that you hold?’. This is the interactive space of a classroom, where the student is encouraged through words, facial expressions and body language. The lecturer becomes a reader of bodies in order to help students. When we move to the internet and blogging, two of these three tools of engagement are removed. The challenge becomes how to interact and encourage without the physicality of seeing and reading, or indeed the setting of the classroom.

There is an assumption that students are computer and internet literate, which for those of us not of their generation can be quite daunting. However, therein lies the conundrum. Using a tool such as blogging for pedagogy is inherently different from using it as a soap box for everyday living. Zawilinski (2009, p.652) suggests that using what have become ordinary communication tools of the internet, such as Twitter and blogging, does not necessarily create ‘effective and efficient’ usage. Indeed, the question arises as to how students move from the soap-box scenario, which we might call Lower Order Thinking (LOT), to what has become known as Higher Order Thinking (HOT). Deng and Yuen (2011) indicate that blogs enable those with little technical skill to become publishers. The online blog offers a space where one’s thoughts, opinions, emotional reactions, political agendas and activism can be winged out to a wider audience, rather than family and friends being set upon over dinner or a couple of beverages. One could say that the internet alleviates their burden. However, the problem is how blogging becomes more than just a visceral reaction to the moment. It is this movement, from reaction to thinking, to reflection, to critique, that the facilitator/administrator of the blog in the academy has to encourage. Whatever the language used to describe the outcome, whether HOT or critical analysis, what is at play is the movement of thinking into a space that combines reading, reflection and evaluation into a coherent written format. My concerns are centred on this movement: if students are critics of their peers’ work, what skills have they developed to become the critic? Part of taking up occupancy of the academy was to impart the training and development of the required skill set.
The internet and the blog would appear to remove the academic one step from this process, which raises the spectre of how we can know that the skill set acquired by the student falls into the HOT category. Deng and Yuen (2011) suggest that blogs have a potential audience of all internet users; however, if the skill set is to be developed, does that not require a closed blog group? Indeed, this raises the question of whether there is any such thing as a closed group on the internet. It would appear that those who engage with this process become advocates for its usage, suggesting that it is a ‘transformational technology for teaching and learning’ (Williams and Jacobs 2004, p.245), that by supporting lecturers it can ‘readily engage learners in a problem-solving setting’ (Wang et al 2007, p.276), and that it has become an ‘enabling learning tool’ (Farmer, Yue and Brooks 2008, p.123). For me, I am confronted by Frost’s (1916) image of the road not taken: perhaps it is a leap into the unknown, one that emerges from the undergrowth of pedagogy irrespective of the space it finds itself in.


Deng, Liping & Yuen, Allan H. K. 2011. Towards a framework for educational affordances of blogs. Computers & Education, pp. 441-451.

Farmer, Brett; Yue, Audrey & Brooks, Claire. 2008. Using blogging for higher order learning in large cohort university teaching: A case study. Australasian Journal of Educational Technology, pp. 123-136.

Halic, Olivia; Lee, Debra; Paulus, Trena & Spence, Marsha. 2010. To blog or not to blog: Student perceptions of blog effectiveness for learning in a college-level course. Internet and Higher Education, pp. 206-213.

Hsu, Chin-Lung & Lin, Judy Chuan-Chuan. 2007. Acceptance of blog usage: The roles of technology acceptance, social influence and knowledge sharing motivation. Information & Management, pp. 65-74.

Kim, Hyung Nam. 2008. The phenomenon of blogs and theoretical model of blog use in educational contexts. Computers & Education, pp. 1342-1352.

Wang, Kun Te; Huang, Yueh-Min; Jeng, Yu-Lin & Wang, Tzone-I. 2008. A blog-based dynamic learning map. Computers & Education, pp. 262-278.

Zawilinski, Lisa. 2009. HOT blogging: A framework for blogging to promote higher order thinking. The Reading Teacher, pp. 650-661.

A Devil’s Glossary

Author: Paul van Kampen, CASTeL and School of Physical Sciences


This twenty-first-century homage to Ambrose Bierce aims to elucidate terms that often appear in this blog. Only typical examples pertaining to common practices in the teaching and learning of science in the Faculty of Science and Health at DCU are given.

  •  assessment: rigorous method of mapping a student’s cognitive development during 12 weeks in a narrowly delineated area of science onto an integer between 0 and 100; cornerstone of university education that allows a student to showcase the power of their short-term memory and their proficiency at complex routine calculations.
  • continuous assessment: watered-down version of assessment administered in every module in weeks 6 and 12 to engage students (meaning 2) and prepare them for the real thing. Not to be confused with formative assessment, which does not meet the criteria for inclusion in this glossary.
  • circular reasoning: logically flawless argument confirming a premise, e.g. “lectures are good if and only if they transmit information accurately. In their exams these students reproduced the information presented to them in lectures accurately, so their lectures were good”.
  • data: 1. (research) a set of scrutinized observations. Antonym: data (meaning 2). 2. (teaching) See data (meaning 1).
  • direct instruction: the transmission of knowledge; most effective in three-hour blocks containing two 10-minute breaks to groups of 200 to 400 students in a Venusian atmosphere, in which case it also transmits reasoning skills.
  • discovering things for themselves: the only alternative to direct instruction. See also teaching (meaning 2); ignore e.g. Socratic questioning.
  • to dumb down: to engage in teaching pitched at students rather than academics.
  • to dumb up: (rare, may only appear in this blog post) to engage in direct instruction that can only be examined by recall questions. See also assessment.
  • to engage students: 1. to interlace direct instruction with videos and storytelling. 2. to attempt to turn all students into good students.
  • good student: 1. student who is assigned a large integer in assessment. 2. a pleasant student who appears likely to experience this.
  • learning: the vestiges of teaching.
  • reasoning skills: collectively they denote the ability to apply concepts and resolve hitherto unseen complex problems; transmitted by direct instruction.
  • science: a body of knowledge determined by scientists to be learnt by good students, and other students too if they behave.
  • Socratic questioning: endlessly asking students what they think to avoid preparing direct instruction; purported to cause learning in victims and hemlock poisoning in perpetrators.
  • teaching: 1. direct instruction. 2. (often within scare quotes) sitting around while students exchange pleasantries.
  • university education: 1. (archaic) the acquisition of advanced knowledge, research skills, professional and ethical values, and the facilitation thereof. 2. an elaborate form of certification that depresses unemployment numbers among the middle classes at roughly the cost of a Jobseeker’s Allowance.
  • unproven methods: any interaction involving students that is not direct instruction. See also circular reasoning.