Call for inputs: Global Consultation on the Draft Principles of Open Science Monitoring

Per the UNESCO announcement:

As part of the ongoing efforts to advance open science monitoring in alignment with the 2021 UNESCO Recommendation on Open Science, UNESCO is calling for inputs and comments from all the regions and interested stakeholders on the Draft Principles for Open Science Monitoring.
Deadline for inputs: 30 November 2024

Open source hardware is currently part of the UNESCO Recommendation on Open Science, so it would probably be a good thing to give some feedback on how to achieve effective monitoring of open science hardware!

5 Likes

Great, let's organise a working group and jump on a call?

I think that any suggestions for the recommendation need to take into account that a depressingly large proportion of academics will do the exact minimum needed to technically comply. More complex hoops to jump through which are still binary (i.e. pass/fail) can often end up catching out truly open projects that work in a new or innovative way, while normally still being easy to shortcut.

This is particularly important as certain domains are overlooked and most scientists live in a field-specific bubble.

I am a strong believer that we need real career incentives within science for people to work openly (see opinion paper). I also hold the somewhat incompatible view best summarised by Goodhart's law:

“When a measure becomes a target, it ceases to be a good measure”

I feel that what I would push for is for UNESCO to encourage governments and institutions to put less weight on metrics which are antithetical to open science, rather than to establish yet more metrics. For example, publication metrics are key to academic success. If academics are incentivised to out-publish their peers at all costs, then there is an incentive to withhold data, details, and tools. Open science requirements can be sprinkled on top, but there is an incentive to do the bare minimum to pass the threshold of “open science” while keeping all other key information in-house.

One other thing I would suggest is that when we monitor science we need more nuance and less counting. Citation counting, again, poorly reflects the usefulness of open hardware. A citation is a citation. You get one “point” if someone mentions your work in passing because it’s cool, one “point” if someone says your result is wrong or controversial, one “point” if they back up a fact they already knew by citing your paper, and one point if they meticulously follow your painstakingly written instructions to build an instrument that is the foundation of the work they are publishing. This is not an incentive to produce good, meticulous open source hardware documentation; it is an incentive to get as many flashy, controversial results into the literature as possible.
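To make the arithmetic concrete, here is a minimal Python sketch contrasting flat counting with a purely hypothetical context-weighted score. The citation types and weights are invented for illustration only; they are not a proposal for an actual metric, and nothing like them appears in the UNESCO draft:

```python
# Hypothetical illustration: flat citation counting vs. a context-weighted
# score. Citation types and weights are made up for this example.

CONTEXT_WEIGHTS = {
    "passing_mention": 0.1,   # "this is cool" in the introduction
    "disputed_result": 0.1,   # cited as wrong or controversial
    "background_fact": 0.2,   # backing up something already known
    "built_instrument": 1.0,  # followed the docs and built the instrument
}

citations = ["passing_mention", "disputed_result",
             "background_fact", "built_instrument"]

flat_score = len(citations)  # every citation counts as one "point"
weighted_score = sum(CONTEXT_WEIGHTS[c] for c in citations)

print(f"flat count: {flat_score}")                      # flat count: 4
print(f"context-weighted score: {weighted_score:.1f}")  # context-weighted score: 1.4
```

Under flat counting, the four citations above are indistinguishable; whatever nuance we want has to come from somewhere other than the count itself.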

This might sound like quite a negative response from a bitter individual (because it is!). But it is also a reminder that institutions actively encourage scientists to maximise statistics by gaming the system. We need to be aware of this when suggesting the rules of the game.

3 Likes

Thanks for sharing this; it would be really nice if we could give our feedback, considering that UNESCO is keen on having diverse inputs. We (@uddhavi2009, @e-clin and @ratiranjan) can also add some feedback from a Southeast Asian perspective.

Sidenote: as we are encouraging Open Data in the NIHR Global RESPIRE project, our team is considering giving comments on Open Data being “as open as possible” and on achieving effective monitoring of Open Data in line with the FAIR and CARE principles, so that the confidentiality and rights of people, including Indigenous peoples (e.g. tribal communities), are protected.

2 Likes

I already sent my own feedback by email, but I'm happy to work it out further in this space and send a new email. Here is my email:

In the introduction, please add open source hardware, as indicated in the UNESCO open science recommendation:
opening up or sharing of research data, publication of software used in research under open source licences
→
opening up or sharing of research data, publication of software and hardware source used in research under open source licences

As a more general remark, I would like to propose some clauses to minimise the risk of misuse. We have seen both misuse of metrics (the JIF, for instance) and competing interests in open science monitoring (the same company creating metrics and selling publication services).

In Part 2: Transparency and reproducibility I would propose to add:

Objectives and competing interests. In order to reduce the risk of misuse, the objective aimed at during the creation of the measure should be indicated (to prevent the use of a metric for a purpose it was not designed for); in addition, the authors of the metric should be known and any competing interests should be stated.

PS: the authors are very experienced (and clever); I think they considered what @julianstirling means, and the principles reflect that in part 3 (especially the “avoid ranking” part).

3 Likes

It’s certainly nice to see “Avoid rankings” in the document. I read that as meaning no new rankings of how open science is? It would be even better to suggest removing existing rankings that harm open science, but I suppose that may be out of scope.

1 Like

I read Julian’s linked opinion piece, and in Section 3, the examples of openness during the Victorian era (1845!) –

  • publishing details of a mechanism needed for replication of an experiment,
  • creating the first standards for screw threads to speed innovation, and
  • rejecting patents

– make the perverse incentives of today’s academic science practices plainly evident.

I’d like to see his conclusion calling for carrots (aligning vision) rather than sticks (requirements that can be gamed) inform a group letter to UNESCO.

3 Likes

Just for context, from the UNESCO draft (Part 3, items 2 and 3):

Monitor for improvement: monitoring should be used to incentivise rather than punish lack of practices.
Continuous assessment of the monitoring initiative: as any indicator could potentially facilitate gaming attempts, there is a need for continuous assessment of indicators and methodologies, and the potential shift of practices in unintended directions.

From the folks I know in the contributor list, I have every reason to believe that the people who wrote these draft principles are open to discussing improvements and suggestions, and that letters from any individual or group will be well received and fully considered. I would just note that the scope of the document is Open Science monitoring initiatives (which already exist and aren’t going away) and not research evaluation.

Cheers! :slight_smile:

.~´

2 Likes

I am glad to share updates on my involvement with UNESCO's Open Science Monitoring Initiative (OSMI) as a member of Working Group 1 (Scoping the Needs) and Working Group 4 (Shared Resources and Infrastructure). Representing the NIHR RESPIRE Collaboration, I will focus on addressing challenges specific to Southeast Asian LMICs, promoting the FAIR and CARE principles (since some of the partners are working with tribal populations), and advocating for open science hardware (air quality sensors) to support cost-effective and accessible research solutions.

In November 2024, we submitted collective feedback from NIHR RESPIRE Project partners across South and Southeast Asia. The feedback emphasised protecting Indigenous data, ensuring inclusivity, and scaling infrastructure for open science. Practical recommendations, including the use of FOSS and open source technologies, aim to make open science more accessible in resource-limited settings.

I look forward to contributing to this global initiative and will continue sharing updates with the GOSH Community.

Let’s work together to make open science truly inclusive!

1 Like

Ni! Hi @tapas, hi everyone,

Good idea to share our involvement here in case the community needs anything or is interested in the OSMI. They'll know who to get in touch with.

And everyone should take notice that there's a website for OSMI:

I have also joined both Working Groups 1 and 4. It seems we have similar interests, @tapas :wink: My territorial axis is between South America, where I’m a member of Brazil’s National Institute for Citizen Science, and France, where I’m a member of LISIS, a laboratory in the field of science and technology studies.

I’m looking forward to it as well, and let’s keep working to make sure Open Science Hardware is recognized in these institutional initiatives.

Cheers!

2 Likes

Ni! I thought I had shared it here at the time, but just realized I did not. Better late than never, so below are the comments I sent as feedback to Unesco, in case it can be interesting or useful to anybody, even if only as food for thought.

On 13 Sep 2024, Ale wrote:

Earlier this week I learned, with great interest, about the POSM document: https://www.unesco.org/en/articles/call-inputs-global-consultation-draft-principles-open-science-monitoring

[…snip me introducing myself and my work where I mention assessment of OSH as an OS monitoring activity…]

Paying respect to the work done so far on the document and to the people involved, and with great appreciation for you taking inputs, I would like to share some comments on it.

C1: The introduction in POSM talks about monitoring, while the tripartite text essentially discusses indicators. In my understanding, reducing monitoring to indicators would go against the stated goals, and against the possibility of striking a balance between the “comparable” and “inclusive” principles. STI studies have long included more holistic approaches, such as network-based cartographies and other mixed-methods/quali-quanti ways to monitor socio-technical systems. Their absence from the POSM risks favoring their exclusion from official monitoring initiatives as well.

C2: It is well established that indicators produce rich-get-richer and lock-in dynamics, and OS indicators aren’t an exception. With OA we’ve seen commercial publishers play that song, alongside the chant of mandates, to the tune of abusive APCs. The document addresses these phenomena only partially, and the items that do – “Inclusive” and “Avoid rankings” – are among the least developed. Beyond those items, the section “Self-assessment and responsible use” could welcome principles inspired by decolonial and STI studies, such as empowerment and reflexivity, that would account for the performativity of definitions and indicators, and for the politics, political economy, and privileges they embody.

C3: In the context of the two previous comments, the inherent tension between comparability and inclusiveness could be explicitly stated in the document. Correspondingly, the mutualistic complementarity between indicators and other kinds of monitoring could be discussed.

C4: Open infrastructures are cited in a footnote and in the bibliography, but not acknowledged in the text. Indicators and other forms of monitoring that rely on proprietary infrastructures, even with “Explicit data provenance”, have their adherence to every principle distorted by matters of access, trust and relationships with those infrastructures, whose private interests are in many ways at odds with scholarly interests, public interests, and the equal treatment of communities, nationalities, and cultures. Thus, adopting and contributing to open scholarly infrastructures could be stated as a principle, and deploying and maintaining them as a collective goal.

I’d like to add that I admit to knowing very little of the context of this ongoing work, and that my comments are perhaps not written with due discretion. I apologize if they seem misplaced. And again, thank you for the opportunity to comment on this draft and for your attention,

Ale

1 Like