This is a diary of my involvement in a project on collaborative learning in the psychology department at the University of South Africa. Most recent posts below and links to previous posts on the left.
Institutional repositories and collaborative learning
Institutional repositories are a particular type of content management system - intended as a means for scholarly materials other than what gets published in journals to be made available online. The two big players are DSpace from MIT and Eprints - both, thankfully, open source. Like other content management systems, work on institutional repositories has been converging with work on collaborative learning. Systems such as DSpace are collaborative (learning) environments in that they allow for many people to work together in creating a larger shared product. I have just been reading a nice, solid article on DSpace by the senior people involved. There are many plusses - e.g. the fact that it is open source, flexibility around file formats, and flexible, customizable submissions processes.
So what's wrong with institutional repositories as they are currently envisaged? In my view two things:
They are built around the idea of an essentially private submissions process with authors, editors, checkers, approvers, managers checking stuff before it gets approved to be placed for public access on the repository. A system such as DSpace allows for different departments etc in an institution to have different submissions processes (including just placing stuff directly into the repository with no checking). At a push this could probably be adapted to accommodate more interesting submissions trajectories where materials are opened up for comment and editing in public or semi-public spaces rather than being simply either not-yet-approved or approved-and-submitted. However, the repository ethos (of 'lodging' approved materials in an 'archive') certainly does not currently encourage more open submission and revision processes. Put differently: A system such as DSpace allows for many different kinds of authoring roles for people in the institution (authors, editors, nit-pickers, whatever), but tends to see users (those who will access materials once they have been lodged) as essentially passive consumers of ready-made products.
The second problem (I think) is that the vision of how materials are to be located in the repositories is essentially one of keyword-type searching. The repository is imagined, I think, as a kind of random-access stack of numbered items rather than as something that is globally or locally structured. Again, no doubt, there is technical support for structuring and re-structuring (arranging things under categories, themes, projects, dates), but the ethos is not really about collaboratively building a structured whole - it is about having a big undifferentiated space to dump materials into. I may be wrong about this second criticism though. Certainly the DSpace people are thinking in terms of different "collections" (e.g. a separate collection for each academic department, if that is what people want) - which is already a useful way of structuring the repository.
I think I may have double standards about peer review. In settings to do with academic publication and funding I'm as cynical as the next person - peer review is largely a way of dressing up intellectual warfare as a fair and neutral process. In the context of collaborative learning, however, I see it as a good thing - a blurring of the roles of student and teacher which encourages everybody to think more deeply and critically about what is being learnt. I've just read a very critical account by Roth of the peer review process for research funding in Canada. It's got me wondering if the sorts of inequities Roth describes might not also be endemic where peer review is used in collaborative learning. I can't say that I have answers to this yet, but suggest four ways of perhaps avoiding some of the worst injustices Roth describes:
Be careful about defining how the qualitative and automated/quantitative aspects of a review are related to each other. Roth describes typical scenarios where quantitative information goes into the review process, but is then used in an arbitrary manner by the committee making the final decision. One way (I don't think the only way) of making the relationship between the two clearer is to base the decision entirely on some kind of automated summing of ratings or other quantitative information while encouraging copious qualitative comments as feedback that does not affect the funding/publication decision.
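The "automated summing" option above can be sketched in a few lines. This is a hypothetical illustration, not an existing system: the field names and the cut-off value are my own assumptions. The point it makes concrete is that the numeric ratings alone drive the decision, while the qualitative comments are collected and passed back to the author without ever being weighed by a committee.

```python
# Sketch of a decision based purely on automated summing of ratings.
# Each review carries a numeric 'rating' and free-text 'comments';
# only the ratings affect the outcome, the comments are feedback only.
# (Illustrative assumptions: a 1-5 scale and a cut-off of 3.5.)

def decide(reviews, cutoff=3.5):
    ratings = [r["rating"] for r in reviews]
    average = sum(ratings) / len(ratings)
    feedback = [r["comments"] for r in reviews]  # returned, never weighed
    return average >= cutoff, average, feedback

reviews = [
    {"rating": 4, "comments": "Clear design, weak literature review."},
    {"rating": 3, "comments": "Sample size seems optimistic."},
    {"rating": 5, "comments": "Strong proposal."},
]
accepted, average, feedback = decide(reviews)  # accepted is True, average 4.0
```

Because the relationship between the numbers and the outcome is fixed in advance, there is no room for a committee to use the quantitative information in the arbitrary manner Roth describes.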
Use a transparent process. Like many others I'm tired of the whole "blind review" charade. Say who is doing the ratings and making the decisions so people who like doing this sort of thing can build up a public track record of good or bad calls.
Use many more reviewers. Let's face it, getting good reviews is a kind of popularity contest - so why not make it explicit by getting as many people as possible who form part of the intended audience to say if your article/proposal is crap or not? If you don't like their judgement then you know it's time to move on and find a different kind of audience - you don't have to wonder if the negative reaction is just the biased opinion of a few people who happen to be on the review panel. And it doesn't have to be terribly labour-intensive either - lots of people spending a minute deciding whether something is worth recommending to others may be worth more than two or three each spending several hours.
Use many more different levels of publication. Why should publication be an on/off decision? In online communities like Slashdot more highly valued contributions automatically "float to the surface" because they receive many good ratings, but less highly thought-of contributions don't disappear - they can still be excavated and read by anybody who doesn't trust the majority view.
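The "float to the surface" idea can be made concrete with a toy ranking function. This is not Slashdot's actual moderation algorithm - just a minimal sketch of the principle: contributions are ordered by how well readers rate them, but nothing is ever deleted, so a sceptical reader can still dig up the lower-ranked items.

```python
# Toy sketch of rating-based ranking: every contribution remains
# visible, but readers see them ordered by average rating.
# (Illustrative data structure; not Slashdot's real algorithm.)

def ranked(contributions):
    """Sort by mean rating, highest first; nothing is removed."""
    def mean(c):
        return sum(c["ratings"]) / len(c["ratings"])
    return sorted(contributions, key=mean, reverse=True)

posts = [
    {"title": "A", "ratings": [2, 3]},
    {"title": "B", "ratings": [5, 4, 5]},
    {"title": "C", "ratings": [1, 2, 1]},
]
order = [p["title"] for p in ranked(posts)]  # B first, C last - but all three are still there
```

Publication then stops being an on/off decision: the "level" of publication is simply where a contribution currently sits in the ranking.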
There is another useful intro article by James Branum. It gives a fairly comprehensive background to blogging from a journalism perspective, together with an overview of how various mass communication theories might apply to blogging.
An article on Journal writing as an adult learning tool by Sandra Kerka discusses issues such as privacy and evaluation, leaning towards saying students should feel free to write about private issues and to criticise the course/lecturer without fear of lecturer evaluation. In an online journal privacy is of course not possible, but the constraining effects of evaluation may be reduced by having it occur in the context of a community of learners who read and comment on each others' journals - rather than one-sided lecturer evaluation. Kerka also discusses various dimensions on which journals could be evaluated.
It is an attempt to bring coherence to a number of smaller projects in my department, all with a common theme - the development of "communities of practice" in which learning happens collaboratively. A community of practice exists where a group of peers work together towards shared goals. It could be a small group involved in a project (such as a group of co-authors writing a paper) or a much larger group drawn together by a broadly defined common programme (such as the community of practice formed by South African psychologists). Participants in a community of practice are peers in the sense that there is no rigid distinction between learners and teachers. However, it is accepted that participants have different levels of expertise, and a successful community of practice typically provides opportunities for new members to make useful 'real-life' contributions while they are still in the process of learning the ropes.
Below is an outline of the various smaller projects thus far included under the collaborative learning environments umbrella. We only got going towards the end of last year, but there has already been some good progress - as outlined below.
Collaborative learning overview (Martin, Vasi, Patricia, Chris, Johan K., Piet, Louise)
This project aims to keep us current on the literature about collaborative learning. Thus far we have created a web-based resource list containing many interesting links and snippets of information. It is at http://www.criticalmethods.org/docedit.mv?doc=collaboration_links. This year we plan to reorganise the resource list, expand it, and set it up so that everybody in the group can contribute items. We also want to make a presentation about collaborative learning to the department.
Research Honours Pilot Project (Vasi, Martin, Chris, Piet, Johan K.)
For several years now our psychological research honours students have benefitted from collaborative learning through participation in a peer review process of their research proposals and reports. The pilot project aims to create opportunities for richer interaction by taking this process online. Thus far we have devised a two-stage process and several interesting new mechanisms for online interaction. We plan on going live with the system on 20 Feb.
Community Psychology Honours Project (Matshepo, Martin)
The community honours course is an obvious place for applying collaborative learning principles and our plan for next year is to develop collaborative learning environments involving students, staff and community organisations. This year we are (with the help of students) drawing up a resource list of organisations that may be interested in participating.
Young Researchers Online (Lazarus, Martin, Johan K)
This is an online discussion list, intended as a networking environment for recently graduated MA research psychology students where they can share their experiences of entering the world of applied research. Membership has grown rapidly and now includes graduates from Unisa, Pietermaritzburg, Rhodes and Cape Town. There have been some useful exchanges about research projects as well as postings about job openings and upcoming events. This year we plan to recruit more members and to explore additional ways of growing the network.
Critical Methods Conference (Patricia, Johan K., Martin, Lazarus)
This conference has functioned for almost a decade as a collaborative learning space for young researchers interested in qualitative and critical methods. Johan arranged to have the well-known critical educationist, Peter McLaren, at last year's conference and Patricia is working with David Nightingale of Manchester Metropolitan University to finalise the conference proceedings. The theme for the September 2003 conference is "indigenous knowledge systems".
Knowledge Broker Survey (Vasi, Martin)
The aim of this project is to describe the variety of knowledge broker roles in and around academia (including editors, discussion list moderators, consultants, webmasters and informal knowledge brokers) and to network with individuals in such roles. Our impression is that such people are often the ones responsible for initiating and maintaining collaborative learning environments and it would therefore be useful to learn more about how they see their role. The project will start as soon as a student research assistant becomes available.
Institutional repository (Vasi, Martin)
This project aims to develop a system for maintaining an accessible repository of knowledge products developed by staff and students in the psychology department. The repository would allow students and staff to achieve at least a minimal level of publication of their work and to receive feedback from others in the community of practice. Thus far we have done a small literature survey of experiences with institutional repositories elsewhere; invited honours students to submit their projects for inclusion in the repository (with a fairly good response); and have met with the computer services department to discuss technical requirements for the repository. We expect to meet with them again shortly and to set up the repository in the course of the year.
Overlay journal (Vasi, Johan K., Martin)
This project aims to further develop Unisa Psychologia as an 'overlay journal' (i.e. a journal that publishes selected material from the department's institutional repository) and to explore the potential for similar overlay publications. This is the 3rd year that Psychologia will, in effect, be run as an overlay journal and lessons learnt from the first two years (particularly with regard to how to select material for inclusion in a participatory but rigorous way) will be implemented in 2003.
UML modelling, XML binding and Java skills projects (Vasi, Piet, Chris, Johan K., Louise, Martin)
The aim of these projects is to develop formal modelling skills (which can be useful in designing certain types of collaborative learning environments); XML skills (which can be useful for describing data structures used in such environments); and Java programming skills. Last year some of us participated in an inter-departmental modelling project and attended a UML modelling course. This year we plan on completing some simple modelling and XML binding exercises in the department and to learn basic Java programming.