Consolidating mapping efforts

Hi GOSHers

There has been some discussion on the GitLab roadmap about consolidating some of the mapping efforts. Some of these issues have already been merged, but there are still quite a few mapping issues without much progress.

Is it possible to combine the effort into making an easy-to-edit data and visualisation structure that others can add to later? I know nothing about this, but it seems like it could be a good place for collaboration. I think @access2perspectives knows a lot about mapping?

Any other ideas?

Great, thanks @julianstirling!

I’ve mapped all participants from GOSH 2016, 2017 and 2018 from public info on the GOSH website. The map can be found in this GitHub repo. As it’s based on Wikidata, it’s collaborative; there’s more info on how to contribute in the repository. The data model is mega simple, and we can make it more complex later to turn it into something more useful.

Anyway, I think the original idea goes further, mapping resources and things like that. So can we start defining what we want to map?

Well this is fantastic! The contribution guide looks really clear. Maybe we make a Maps repository in the GOSH GitLab? We can hopefully attach the GitLab output to

How painful is it to update the data model when there is already data? I ask because it might be a good idea to get some data in, ask for contributions, and update the model on the fly. Or do you want to do lots of thinking to create a general-purpose model before you start?

Thanks :)! I’m currently having a very stupid problem with GitLab, as I lost my 2FA :sweat_smile: so I’m in the process of proving my identity and can’t log in right now, but:

  • Yes, I’d like to think of a better way of maintaining the map using GitLab CI; it looks like a great idea to me;

  • A bit off-topic, but Wikidata has recently implemented ShEx. The main idea is that once there’s a data model in place we can define it as a ShEx shape and use it as a tool to:
    a) check if items already loaded are compliant with the model
    b) provide structured guidelines for contributions, limiting input to comply with the shape (Cradle, the tool the map currently uses for its input form, accepts ShEx as the criteria for building a form)
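To make point b) more concrete, here is a purely illustrative ShEx shape for a mapped item. The shape name and property choices are my assumptions, not an agreed GOSH model:

```shex
# Hypothetical shape — the properties used here are illustrative only
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

start = @<GOSHParticipant>

<GOSHParticipant> {
  wdt:P625 LITERAL ;   # P625 (coordinate location): required, so every item can be placed on the map
  wdt:P856 IRI ?       # P856 (official website): optional
}
```

A shape like this could be loaded as a Wikidata EntitySchema and then reused both for validation and for generating the Cradle input form.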

This would eliminate the need for manual curation, and there would be no need for the extra Python-mediated step that’s in place now. It would just require some tool that takes the SPARQL result and renders it on a map more nicely than Wikidata does, with no extra validating/consolidating steps (one can only dream…).
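As an illustration of that rendering step, here is a minimal sketch of turning a SPARQL JSON result into GeoJSON that most map libraries can consume. The sample data, labels and coordinates below are made up; in practice the dict would come from the Wikidata Query Service endpoint:

```python
# Hypothetical sample of a SPARQL JSON result; Wikidata serialises
# coordinates (property P625) as WKT literals like "Point(lon lat)".
sample_result = {
    "results": {
        "bindings": [
            {
                "itemLabel": {"value": "Example Lab"},
                "coord": {"value": "Point(-0.1276 51.5072)"},
            }
        ]
    }
}

def to_geojson(sparql_json):
    """Convert WKT point literals from a SPARQL result into a GeoJSON FeatureCollection."""
    features = []
    for binding in sparql_json["results"]["bindings"]:
        # Strip the WKT wrapper: "Point(longitude latitude)" -> two numbers
        wkt = binding["coord"]["value"]
        lon, lat = wkt.removeprefix("Point(").removesuffix(")").split()
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [float(lon), float(lat)]},
            "properties": {"label": binding["itemLabel"]["value"]},
        })
    return {"type": "FeatureCollection", "features": features}

geojson = to_geojson(sample_result)
```

GeoJSON is a convenient target because web map tools such as Leaflet and uMap accept it directly, so no extra consolidating step would be needed.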

Now, on the data model itself: I’ve tried to spark the ontology discussion, but it’s a heavy one (we have a close example with openknowhow), and I don’t have the time/will to do it now. I agree with asking people for contributions and modelling afterwards. Wikidata has batch-editing tools in place that we could use once the new model is defined, so it’s not a lot of work.

However, we end up with the same questions before we can begin working: What do we want to map? Which contributions are we asking for? If we organize this a bit better we’ll save lots of time later.

I think the question of what we want to map is the key. Looking at the map issues I see that we want to map at least three different things:


Knowing what resources are available, under what conditions, and where they are is paramount for any organisation to function. An open network such as GOSH needs resource-planning tools that are adapted for open networks.

In my opinion, GOSH needs a comprehensive resource-planning system that takes care of resources, processes, transactions, etc. There is no need to do any mapping once you adopt such a system: any map would be generated on the fly from the data already in the system, entered by users as they conduct their operations.

If you try to use an ERP (enterprise resource planning) system you’ll rapidly realize its inadequacy, because these systems have not been designed for sharing or for new ownership regimes. What is the best tool out there to co-manage resources put together by different entities (organizations or individuals), to manage a pool of shareable resources that contains things that are privately owned, public goods, shared goods, commons, nondominium goods, etc.?

You can take a look at the NRP (network resource planning), an open-source tool that has been designed from the ground up for open networks. It is built on the REA ontology, which is able to model any economic reality within open networks, and it allows you to project that network reality onto the traditional economic reality and spit out any traditional accounting result.

UI/UX is not NRP’s forte, and there are lots of things that can be tweaked to make it better. But the foundation is sound: if you don’t start with a good model like REA, you’ll encounter big problems later as the system scales up.

Here’s more about resources in the context of open networks.

I am willing to help if this community wants to know more about NRP and open value networks.


@jarancio @nanocastro
I agree that what we want to map, and the data itself, are incredibly important. To clarify: when I said consolidate, I didn’t necessarily mean that we would need one map with one data structure that answers everything. Instead, I mean we could have an established workflow for setting up a map, with the tools already in place to deploy it. So there could be a list of maps that have been established, with an explanation of how to contribute (modelled on Julieta’s example). There could then be a guide to adding a map, something like:

  • Create a data structure (guide of what to think about)
  • Upload the structure to X location
  • Upload some data to Y location
  • Ask someone on the GitLab to accept the map
  • Then boom the map is included in :gosh:

I suppose this is not a consolidation of the mapping effort itself, but rather establishing a toolchain so that each mapping team doesn’t need to think about this part as well.
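To sketch what such a toolchain could look like in GitLab CI (every stage name, script path and image here is a hypothetical placeholder, not an existing repo layout):

```yaml
# Hypothetical .gitlab-ci.yml: validate contributed map data on every
# merge request, then publish the rendered map with GitLab Pages.
stages:
  - validate
  - deploy

validate-data:
  stage: validate
  image: python:3
  script:
    - pip install jsonschema
    - python scripts/validate.py data/      # check entries against the map's data model

pages:
  stage: deploy
  image: python:3
  script:
    - python scripts/render_map.py --out public/   # generate the map site
  artifacts:
    paths:
      - public
```

With something like this in place, "ask someone on the GitLab to accept the map" becomes a normal merge request, and the validation step fails automatically if contributed data doesn’t fit the model.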

I am not sure if this is useful. I don’t really know much about mapping, but I enjoy automating things on GitLab. I am probably a hammer looking for a nail!!

Resource-planning tools sound fantastic to me, but they also sound complicated, because I don’t understand them. How much work is it to establish and maintain the tool? How much work is it for each partner to learn it and enter their data? If we do go down a route like this, we probably want more of the community to have a voice on whether they want/need the tool and are prepared to invest the time to learn and use it. Personally, I don’t understand enough about what it is, what it brings us, and how hard it is to use to be able to comment.

I think the benefit of the mapping is that a few people committed to mapping (or with no other hobbies, so they spend their weekends setting up GitLab CIs! :sweat_smile:) can do it without requiring significant time investment from the rest of the community. I would love to hear more about NRP, how much effort is involved, and whether I have totally misunderstood it. But I would assume this would need some serious time investment that would probably mean that it should be pitched to our governance committee once they are more established?


Hi @julianstirling

The problem with maps that people make once in a while and maintain once in a while is that they are not dynamic, they are not interconnected and do not present current data. In other words, their value is limited.
But the most problematic thing with the kind of maps you propose is that they will not be maintained. Perhaps you can raise a wave of enthusiasm and produce a few maps, but once the hype is over these maps will be forgotten; they will become outdated and only serve as historical data. I’ve spent over a decade contributing to open networks and the same pattern comes back over and over again: people don’t like to do support work, i.e. the boring tasks that maintain the organisation. This is why things need to be automated and interconnected, to reduce to a minimum the intervention needed from people to maintain the network.

The best arrangement is to have the intelligence about the network feed not on data that is manually collected and injected into the system, but on data that is generated as a byproduct of everyone’s activity. An NRP is one type of system that can do that.

You’re right. Adopting a new tool to manage projects and resources is a big deal. You’ve already adopted GitLab. Congratulations. I think it is a good tool for GOSH at this moment.
Formalizing an organisation is not easy… but there are ways to do it: ways to adopt governance, to adopt new tools, to put in place new methodologies of work, and even to cultivate a new work culture. One thing is clear though: the more you want to grow and the more complex your activities become, the more you need to formalize. The degree of formalization will determine the growth of GOSH. You can’t stay informal and expect GOSH to be able to tackle complex issues.
Imagine someone comes tomorrow and says “I give GOSH $1M to do this”. Without a high degree of formalization, without tools like NRP, and without good governance, the organisation will not be able to accept the offer.
NOTE: formalization doesn’t mean centralization. Bitcoin is highly formal and decentralized at the same time.

So yes, I know: people don’t like to learn new tools, don’t read documentation, … I get it, there is always resistance to adopting something like an NRP. But if GOSH can’t figure out a way to go there, the $1M opportunity can’t even be a dream for GOSHers. Open networks can take on the jobs of large traditional organisations; we just need to be able to do at least as good a job as they do. They use sophisticated management systems, and we need to use them too.
But perhaps GOSH doesn’t have this ambition, which is fine too.

A path to the adoption of more sophisticated tools is to use them in paid projects, where people are obliged to do so in order to be accountable and reliable as a group. In fact, no one will give you a large sum of money if you don’t show accountability. Usually, open networks use non-profit organizations as interfaces with the funding agencies, to show accountability. A mechanism of fiscal sponsorship is used to transfer funds from the nonprofit to agents in the open network. It works, but in this case the network is subject to whatever the nonprofit says; it is ultimately the hand that feeds the network. Using these tools and other tricks from the OVN book, the network can become sovereign/autonomous. So there’s a good practical reason to adopt new tools: to be able to tackle lucrative projects. Once adopted by the core of the open network, their use can spread to other types of projects.

Here’s a blog post I wrote about formalization of open networks. Might help continue the conversation…

Hi, I think we are falling into the same old discussion of which tools we need, when we don’t even know which questions we want to answer. I thought this wasn’t a tool discussion, but I may be wrong.

Thanks @nanocastro for compiling these:

One of my questions is, when we talk about resources, what do we mean? Infrastructure, networks, skills? Others?


Hi, just writing to jump in here, since I’m interested in mapping OScH resources for fabrication (3D printers, CNC micro-lathes, etc.) and trying to figure out where to start.