While experience means different things to different people, let me narrow it down to one area, based on conversations with people in my network. Here is what I have heard: once you post an article for a MIX contest (say, the M-PRIZE or H-PRIZE), no one ever knows how the articles are evaluated by the judges, including the yardsticks used. That is exactly where my idea comes from. I suggest implementing unprecedented transparency within MIX. By transparency I am by no means second-guessing the integrity of the system; rather, I am suggesting we make the process as transparent as possible. I will also say that I have the utmost respect for the MIX community and its leaders, as the things I have learned from this platform are invaluable! With that said, here are some ideas:

• Publish the evaluation guidelines openly.
• Provide the scores for the articles that made the list, as well as for those that did not, and explain why they did not make it (at least for the people who ask in a 1:1 setting).
• Open it up for Mixers to ask questions about the evaluation reasoning, i.e. why an article was scored the way it was, and to clear up any misunderstandings.
• Where possible, open it up for debate and then ask the community to vote, especially when selecting finalists and/or winners!

While I could go on and on, transparency is only as good as the values and codes practiced by all (yours truly included), so I suggest that we publish the values and codes first, and then implement transparency step by step.