Designing out and Deterring Plagiarism

Are you concerned about plagiarism? The internet makes learning materials easy to access, but it also enables ‘copy and paste’, and there are unscrupulous companies that will sell assignments to students. See, for example, this Guardian article: Plan to crack down on websites selling essays to students announced

To reduce opportunities for any student to plagiarise, design your assessment strategy to ‘design out’ plagiarism and try these suggestions for promoting authentic work:

Involve students in learning activities about plagiarism:

  • Include online learning materials in your course that cover plagiarism and how to avoid it, with course-specific, relevant examples of what is not acceptable, and link to the relevant university regulations
  • A good online tutorial to teach students about plagiarism is the one produced by Indiana University: How to Recognize Plagiarism: Tutorials and Tests
  • Include mandatory learning activities that engage students with correct citing and referencing (e.g., article reviews)
  • Include learning activities that engage students in marking sample (or each other’s) work against the assessment criteria and with a plagiarism-detection tool (e.g., Turnitin)
  • Assess (handwritten) short pieces of student writing early in a course and compare them to assignment writing styles
  • Clarify and use existing policies and appropriate disciplinary action

Design assignments that can’t easily be plagiarised:

  • Use ‘open-ended’ tasks rather than ‘closed’ tasks, or ask students to reflect on their individual learning journey.
  • Create tasks where paraphrasing will not be sufficient.
  • Don’t rely on Turnitin – it does not pick up all copied content and does not deter plagiarism
  • Ask students to apply theories, models and ideas to their experience or a case study rather than to explain or paraphrase the theory, model or idea.
  • Create unique assignment tasks where the answers can’t be found on the internet e.g., original case studies, recent news articles, detailed analysis of specific extracts etc
  • Avoid tasks that can be completed by accessing readily available banks of knowledge
  • Create new assignment tasks/essay titles for each cohort in negotiation with students
  • Break down a task into parts with specific deadlines, e.g., a plan, a literature review, an abstract, a set of data, a proposal
  • Create assignments that engage students in solving current, relevant problems or making decisions
  • Include an additional task that can only be completed with thorough understanding of the final assignment e.g., an oral presentation, reflective account, short examination etc
  • Include a range of assignment formats e.g., report, poster, presentation, web-conference, blog posts, book/webpage review etc and combinations of formats
  • Include partially pre-prepared tasks with very short deadlines, e.g., case study analysis, application of methodology to data, curating material for exhibition or conference to be completed in 24 hours etc

Useful links and references:

Ledwith, A., & Rísquez, A. (2008). Using anti-plagiarism software to promote academic honesty in the context of peer reviewed assignments. Studies in Higher Education, 33(4), 371-384

Walker, J. (2010). Measuring plagiarism: researching what students do, not what they say they do. Studies in Higher Education, 35(1), 41-59

Warn, J. (2006). Plagiarism software: no magic bullet! Higher Education Research & Development, 25(2), 195-208

Rogerson, A., & McCarthy, G. (2017). Using Internet-based paraphrasing tools: Original work, patchwriting or facilitated plagiarism? International Journal for Educational Integrity, 13(1)

THE Article: Use of Turnitin software does not deter cheating, study finds

Flat out

Well, I’ve had lots of plans and many ideas about how to work more effectively on my doctorate, but I seem to have stumbled on something that is working (and not what I expected). One of my previous ideas was to schedule time out to study, but I never seemed able to stick at it, and even when I planned and completed 6–8 hours in a week, I still didn’t seem to have much to show for it.

So the new plan is to turn everything around. Instead of planning my EdD work around my life, I do the opposite. Mornings are a good time to think, review and plan what I’m working on that day, so this means a discussion over breakfast and even on the cycle journey to work with my (very supportive!) husband. I then break for office work, and at lunchtime I open Mendeley for some reading. Often, my reading triggers other reading, so there’s always something interesting to read over lunch. I try hard to make useful notes (i.e., in sentences that relate to what I’m working on) ready for use later. After a break for office work, the cycle home and tea (made by my very, very supportive husband!), I open Mendeley and my chapter so far and settle down to writing. Weekends follow much the same format, with short breaks in my EdD work to visit family and friends, eat, and do a (very small!) amount of housework. Fortunately, holidays give me the opportunity to enjoy a different view from over the laptop, and different, warmer squares to sit and read in.

A while ago, I read a blog that suggested that if you have a big project on and are finding it difficult to see how it can be achieved, one way forward is to imagine what you would have to do to complete it in a fraction of the time. It took me about 8 months to write the first section of my EdD, and, using my new way of working ‘flat out’, I have done half of my next section in less than a month. Whether my husband can take this for the next 12 months remains to be seen…


I’m on leave from work, but that means working on my EdD. It’s slow progress, and I think there are several reasons for this. One is that working alongside my full-time job means there is a long gap between reading, thinking, making notes and then writing. I’ve usually forgotten what I’ve read by the time I get to write and have to re-read! I have also made detailed plans, but when I’m writing I find that the structure doesn’t quite work and I need to re-organise. The main reason I’m working slowly, I think, is because I’m not confident that my work is good enough. I’m writing in ‘draft’ format that then needs heavy re-writing weeks later. So, a few ideas to try out:

  • Read and write in very short sections that are ‘complete’ paragraphs. Perhaps give a title or question that the paragraph aims to answer.
  • Practise dictating my writing to the computer. My spoken sentences might be clearer and more focussed.
  • Blog with what I’ve achieved each working session to help me see my own progress, using a Self-organised Learning planner as per the advice in Learning Conversations


It’s hard to work on a big project like this. For a while, the excitement keeps you going, but then, when it gets hard, the prospect of having to study every day, every weekend, every holiday, for the next 18 months (if I’m lucky) is daunting. I have done a lot of planning, and have a detailed list of what to do and deadlines, but I’m already way behind. Yes, I can re-schedule, but that means the whole project will just take longer and longer (and more days/weekends/holidays). I’m also feeling very unsure about the validity of my project, and whether it is going to get anywhere and this is slowing me down.

However, I must go on. One thing that makes it feel interesting and valuable, is when I work with my colleagues. The work to prepare for the eDAT analysis and the resulting discussions have been really interesting, and I’m working today on preparing a repertory grid pilot to try out with another EdD student tomorrow. It might be that I’m thinking of this project as my own project. If I focus on the work with others, and plan to build in activities with colleagues this might help move me forward. And I don’t mean lunch ‘chats’ about the project that make me feel like I’ve been working when actually I haven’t!

So, a plan is forming. I will look at all my proposed activities and see if I can adapt them to activities that include my colleagues. For example, I’m working on the rep grid, but I don’t have interview dates set up with the tutors. I could plan those now. In addition, I could set up a workshop for distance learning tutors on the topic of ‘interaction in ODL’, present some of my findings, and enable discussion and sharing of best practice. By talking to others and applying my work as I go, I may find confidence that it is valuable.

eDAT Workshop

To test the eDAT, I set up a workshop with three colleagues and asked them to use the tool to categorise some of our distance learning courses. I also asked them to apply some other learning-activity taxonomies to the same courses for comparison.

A quick analysis of the data showed a moderately good inter-rater reliability of 0.71 for the eDAT.
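The post doesn’t say which reliability statistic was used; assuming it was Cohen’s kappa over two raters’ categorical codings (the function name and data below are mine, for illustration only), a minimal sketch of the calculation might look like this:

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical codings of the same items."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: proportion of items both raters coded identically.
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement, from each rater's marginal category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codings of eight activities by two raters
r1 = ["individual", "individual", "group", "group"]
r2 = ["individual", "individual", "group", "individual"]
kappa = cohen_kappa(r1, r2)
```

Kappa corrects raw percentage agreement for the agreement two raters would reach by chance, which is why it is usually preferred over a simple match rate.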



Repertory Grids

In early April I took books to Palma and sat overlooking a lovely traditional square in warm sunshine, reading and thinking about Kelly’s Personal Construct Theory and his repertory grids. Personal Construct Theory describes the way we make meaning of the world using ‘blueprints’ (constructs). I think this idea might be helpful when thinking about how teachers create online learning activities and understanding the constructs that underpin them. What language do teachers use about their designs that suggests the constructs? I’m exploring the use of repertory grids as a kind of semi-structured interview to elicit some of this.
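A repertory grid pairs the ‘elements’ being compared with elicited bipolar constructs and a rating of each element on each construct. As a rough sketch of the data structure (the elements, constructs and ratings below are invented for illustration, not from my interviews):

```python
from dataclasses import dataclass, field

@dataclass
class RepertoryGrid:
    """A Kelly-style repertory grid: elements rated on bipolar constructs."""
    elements: list                                     # things being compared
    constructs: list = field(default_factory=list)     # (emergent pole, contrast pole)
    ratings: list = field(default_factory=list)        # one row of 1-5 scores per construct

    def add_construct(self, emergent, contrast, scores):
        # One rating per element; 1 = close to the emergent pole, 5 = contrast pole.
        assert len(scores) == len(self.elements)
        self.constructs.append((emergent, contrast))
        self.ratings.append(scores)

# Hypothetical elicitation with three online learning activities
grid = RepertoryGrid(elements=["discussion forum", "quiz", "group wiki"])
grid.add_construct("collaborative", "individual", [1, 5, 1])
grid.add_construct("tutor-led", "student-led", [3, 2, 4])
```

Laying the data out this way keeps the bipolar nature of each construct explicit, which matters for the debate about whether we really do think in opposites.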

I’ve not found much criticism of Kelly, which is surprising, but I have found some discussion of the dichotomous nature of the constructs. Do we think in opposites? When we think about whether someone is outgoing, do we, at the same time, think about the opposite (for us)?

Module Evaluation

I was involved in the new Evasys module evaluation process last term and have been thinking about the data it’s generated and how we can use it to support learning and teaching. But there is evidence that student evaluations don’t measure teaching effectiveness very well, and that they display students’ gender and racial prejudices.

We need to take this into account when we prepare university-wide reports on module evaluation.

The Higher Education Course and Module Evaluation Conference

I attended this conference yesterday and found it an excellent opportunity to think through the whole of the evaluation cycle, and to put our Evasys module evaluation process in context. Full details of the sessions and abstracts are on the website. Some highlights are below:

Working With Students to Review, Refresh and Relaunch Module Evaluation, Dr Helen Carmichael, Matt Grange and Matt Linguard (Uni of West London):

  • The students’ union were central to the increase in response rates to the online survey. A student promotional video was created, and course reps briefed on the purpose and benefit of participation.
  • Class time was made available to complete the online form on students’ own devices and communication with staff and students was increased with timely emails and a web page.
  • Response rates were shared with HoS during the evaluation window and staff were asked to post a summary of the evaluation and feedback to students within a few days of the end of the process.
  • The results are not used as part of performance management, but are used as a satisfaction rating that gives a guide to good and weak practice to stimulate discussion in teams.

Turning Course Evaluation Data Into Information, Katarina Thomson and Karl Molden (Uni of Greenwich):

  • Results showed a correlation between learning outcomes and some course evaluation questions on learner engagement (i.e., on group work), similar to research done at Sheffield Hallam
  • Survey not completed anonymously – student demographic data matched to response, but results shared without names. Student contract developed to reflect this.
  • Results shared with all students and tutor asked to post response to evaluation on VLE within 10 days.
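The Greenwich team didn’t describe their correlation analysis in detail; as a hedged sketch, pairing a module’s evaluation-question scores with its learning outcomes and computing a Pearson correlation might look like this (all data hypothetical):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two paired lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-module averages: engagement question score vs mean mark
engagement = [3.2, 4.1, 3.8, 4.5, 2.9]
mean_mark = [55, 64, 61, 68, 52]
r = pearson_r(engagement, mean_mark)
```

Because the survey was not anonymous, response-level demographic matching of this kind is possible; the student contract mentioned above is what makes that ethically workable.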

48,000 Comments and Counting – What Does It All Mean? Sarah Baxter, Rachel Forsyth and Rod Cullen (Manchester Met):

  • Ran text analysis on the free text comments and did thematic analysis. Themes included organisation of course materials in VLE and assessment
  • Overall satisfaction is based on the difference between best and worst experience (i.e., a good feature in one module meant that the lack of it in another was flagged as a negative experience)
  • Staff wanted more detail from the evaluations to help guide interventions
  • The thematic data was used to identify ‘good’ exemplars to share with staff
  • Data was also used to develop personalized staff development for module teams
  • Data was cross-referenced to external examiners reports
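The Manchester Met team didn’t specify their text-analysis tooling; as a toy illustration of how keyword-based theme tagging over free-text comments can work (the theme names and keyword lists below are invented, not their coding frame):

```python
import re
from collections import Counter

# Hypothetical coding frame: theme -> keywords that flag it in a comment
THEMES = {
    "vle_organisation": {"vle", "moodle", "materials", "organised", "find"},
    "assessment": {"assessment", "feedback", "exam", "marking", "deadline"},
}

def tag_comment(comment):
    """Return the set of themes whose keywords appear in a free-text comment."""
    words = set(re.findall(r"[a-z']+", comment.lower()))
    return {theme for theme, kws in THEMES.items() if words & kws}

def theme_counts(comments):
    """Count how many comments touch each theme."""
    counts = Counter()
    for c in comments:
        counts.update(tag_comment(c))
    return counts
```

Real thematic analysis is iterative and interpretive, of course; a keyword pass like this is only a first triage to make 48,000 comments navigable before human coding.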

Effective Evaluation – Creating Understanding, Change and Continuous Improvement – How Effective Evaluation can lead to the Improvement of the University Experience for Students and Staff, Michelle Morgan (Uni of Kingston and Lead for HEFCE PEP Project):

  • Use the audit cycle and embed evaluation at each stage rather than as a set of disjointed surveys
  • Identify the key stages of the student life-cycle and coordinate the evaluation at each stage

Morgan, M. and Direito, I. (2016) Widening and sustaining postgraduate taught (PGT) STEM study in the UK: a collaborative project, HEFCE and Kingston University: London