Best practices for evaluating student contributions?
Hi everyone! I’m using Miro for an upper-division college course in which students are building out mind maps in groups. I’m evaluating them on their weekly individual contributions, and I’ve been using the board history for that. This has worked all right so far, though it can be a bit difficult to parse what each student actually contributed.
Does anyone have any other methods they’ve used to accurately review how users have contributed to boards? Is there a way to see a user’s overall contributions (that is, not through a specific board’s history)? Thanks!
Hello Nolan,
Do your students have Miro accounts themselves? That would help you establish with more accuracy who has contributed what to your boards.
When it comes to sticky notes at least, you can check who has done what with the “Show author” feature in the note’s menu:
That will let you perform a check like this one:
But if you are trying to root out a potential free-rider problem, it is quite hard to do in Miro without indirectly and publicly shaming the person(s) involved.
For example, you could run a dot-voting session for each group, giving each member X dots to distribute among their teammates according to their perceived activity and involvement in the group work. But that would leave the supposed free riders with visibly small dot totals, which would single them out rather than give you an opportunity to discuss the issue with them in private.
That is a tough challenge to solve, finding these free riders.
We use a tool in our LMS built just for that, called FeedbackFruits, which offers “team member evaluation” activities where this type of evaluation can be done in private.
Thanks for the ideas, Adrien! Alas, it looks like you can’t see the author for nodes in a mind map, unfortunately (kind of puzzling why that would be enabled for notes but not nodes).
I’ve got my own rubric set up in Canvas measuring students by their contributions (which seems to be a functional equivalent to your suggestion about voting or FeedbackFruits), so really my only issue is more accuracy in seeing who has done what. I’m not even evaluating the boards themselves, so it’s not so much of a free rider problem as just not seeing who has done what. But yes, it does seem like there’s more to develop on the platform before it is reliable enough for exercises like this! I’ve already got at least one student who swears they’ve contributed even though the board history says they haven’t. New platform, same pedagogical problems, right?
Oh, and, yes, the students do have accounts!
I think it might just be a matter of not using the built-in Miro mind map tool in your activities, and instead walking your students through the process of building a mind map manually out of sticky notes.
It is a bit of a time investment for you, but it makes for great mind maps too, and you know who wrote what.
One of the key steps in onboarding students to create their own mind maps from scratch with sticky notes is simply making this GIF available to them on the board as a reference:
I’ll certainly consider that next time I use this assignment. This time I wanted to strike a balance between giving students a wide variety of tools while not overwhelming them with the platform. Maybe I’m missing something, but it seems like I still wouldn’t be able to easily see if someone revised a note or restructured the map, which I’m including in my evaluations. But in any case, good to know there are other options in the future!
I just wanted to update this thread in the event that anyone else is considering this broader question. After just two weeks, it looks like I need to abandon my planned method of evaluating students based on the board history. After reading the FAQ a bit more closely and through some experimentation, I’ve realized a few critical flaws:
Board history doesn’t account for moving nodes or items, so if a student made meaningful organizational changes, they might not be registered.
When an item is edited or deleted, board history removes any previous mention of that item. The FAQ says this is for “content optimization,” but it effectively means that anyone who created or previously worked on that item has their contributions erased (short of digging through past versions). In a classroom context, that means that if another student came along and fixed one typo or style choice on a node, the board history now makes it look like they did the work.
Unless I’m mistaken, there’s no good way to see what was deleted if the history just says “deleted node” or “deleted line.”
In summary, I see why board history is designed this way for more general team management, but I don’t think it can work effectively for evaluating members’ contributions.
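For anyone who wants to go deeper than the board history UI, one possible workaround is to pull item-level metadata through the Miro REST API. This is only a sketch, under the assumption that the v2 “Get items on board” endpoint exposes createdBy / modifiedBy fields on each item (as documented at the time of writing); the token handling and board ID below are placeholders, and I haven’t used this in my own course. It also only captures who created each item and who last modified it, so it still misses moves and intermediate edits.

```python
# Sketch: tally item creators on a Miro board via the REST API v2.
# Assumes an access token with boards:read scope stored in MIRO_ACCESS_TOKEN,
# and that items carry createdBy/modifiedBy metadata as in the v2 docs.
import os
from collections import Counter

import requests

TOKEN = os.environ["MIRO_ACCESS_TOKEN"]
BOARD_ID = "uXjVO_example="  # hypothetical board ID (taken from the board URL)


def iter_board_items(board_id):
    """Yield every item on the board, following cursor-based pagination."""
    url = f"https://api.miro.com/v2/boards/{board_id}/items"
    headers = {"Authorization": f"Bearer {TOKEN}"}
    params = {"limit": 50}
    while True:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload.get("data", [])
        cursor = payload.get("cursor")
        if not cursor:  # no cursor means this was the last page
            break
        params["cursor"] = cursor


# Count items per (item type, creator id) pair.
creators = Counter()
for item in iter_board_items(BOARD_ID):
    creator_id = (item.get("createdBy") or {}).get("id", "unknown")
    creators[(item.get("type", "unknown"), creator_id)] += 1

for (item_type, user_id), count in sorted(creators.items()):
    print(f"user {user_id} created {count} item(s) of type {item_type}")
```

If createdBy does persist through later edits (worth verifying on a test board), this would at least avoid the “last editor gets the credit” problem described above, though it still says nothing about who restructured the map.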