@olisb: Also, there might be many Indexes and Aggregators, which may or may not operate according to the same principles/requirements, and which may or may not pull from or submit to the Coop one. That’s a way to think about it.
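To illustrate what I mean, here is a rough sketch (the index URLs and the `/nodes` path are made up, the real Murmurations API may look different): any aggregator can pull entries from whichever indexes it likes and merge them on its own terms, with or without the Coop index.

```python
import json
import urllib.request

# Hypothetical index endpoints; the real Murmurations index API paths may differ.
INDEXES = [
    "https://index.example-coop.org/nodes",
    "https://index.another-aggregator.example/nodes",
]

def fetch_nodes(url):
    """Fetch a JSON list of node entries from one index; tolerate failures."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
        return data if isinstance(data, list) else []
    except Exception as err:
        print(f"skipping {url}: {err}")
        return []

# An independent aggregator is free to merge entries from any indexes it
# chooses, whether or not they follow the Coop index's principles.
merged = {}
for index_url in INDEXES:
    for node in fetch_nodes(index_url):
        if isinstance(node, dict):
            merged[node.get("profile_url", json.dumps(node, sort_keys=True))] = node

print(f"{len(merged)} unique entries aggregated from {len(INDEXES)} indexes")
```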
For-profit scraping: do you want to say that’s a good/accepted/invited thing, or bad and in need of being prevented? Are you saying the Murmurations (Coop) directory should/will not publish/share entries/profiles? Realistically, the Google crawler (and those of many other companies, organizations and individuals) will eventually traverse the links and copy pages/data, for all sorts of purposes. Even with Google, their commercial gain from Murmurations data would likely be very indirect, as they don’t necessarily know, automatically or manually, what to do with it. Their profit is a tiny fraction of having your Coop entries show up in Google search, which makes those results a minor little bit more interesting for selling advertising, divided/spread across all the other search result hits. Or do you have some other Google business in mind? Training their translator/NLP, at most?
Google’s “Knowledge Graph”: you’re not creating one yourself, are you? You’re not starting and running a search engine to sell advertising on, or are you? So if Google is not interested in indexing/listing your entries (and why should they be), then most likely the Coops are a bit hard to find and Murmurations is failing at its purpose (unless we stop caring about crappy Google altogether and instead distribute/spread these records between groups/communities, and if Google picks up some of them too, fine, who cares), no? The advertising Google sells next to Coop entries would most likely be for other “competing”/proprietary offers anyway, unless a Coop on Murmurations is also paying Google to be advertised. So if you want to sell Google an API key (?) in exchange for access to the Coop directory (given it’s not published elsewhere, you would have to make it artificially scarce, keep the entries proprietary, avoid sharing them, and require the Murmurations Coop directory to remain just another centralized silo), then it’s either at the expense of the promotion of these groups, or you get a fraction of their ad money indirectly (with Google keeping the other part as profit). I fail to see how any of this makes sense, or how it would/could even work.
Murmurations server costs: how much could that be? Granted, software development can be quite an expense, but in the absence of money people could just as well contribute code/time. Isn’t it a coop effort/service/affordance, after all?
Ripple Foundation: I don’t see how that’s applicable/similar. Healthcare is highly regulated, and what’s the alternative? Simply not doing it and going with the proprietary status quo? No need to even go into the “Tragedy of the Commons” or 90/9/1.
The thing with agents/bots/clients is that they’re CLIENTS/USER agents, meaning they in some way obtain a copy of your data, and can then do all sorts of things with it on their side, beyond your observation/control. So people can set their permissions all day long, but once the data is shared, it’s sort of out of the bag, and the more access gets granted to more parties, the more it is out of the bag. Ever heard of the Turing Test? Once the data has gone over the wire, you cannot have any idea what’s on the other side, nor what is or isn’t happening there.
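A minimal sketch of that point (the profile URL is made up; any publicly reachable JSON works the same way): the moment a client has fetched the data, it holds a local copy that the publisher can neither see nor revoke.

```python
import json
import urllib.request

# Hypothetical profile URL, purely for illustration.
PROFILE_URL = "https://somecoop.example/murmurations-profile.json"

try:
    with urllib.request.urlopen(PROFILE_URL, timeout=10) as resp:
        profile = json.load(resp)
except Exception as err:
    # Even the failure case makes the point: a real client that succeeded
    # would now hold the data entirely on its own side.
    profile = {"note": f"could not fetch ({err})"}

# From here on the data lives on the client's disk. Whatever permissions the
# publisher changes afterwards, they cannot observe or undo this local copy.
with open("local_copy.json", "w") as f:
    json.dump(profile, f, indent=2)

print("Copy stored locally; the publisher has no visibility into what happens next.")
```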
Reading all this: technically, Murmurations with its semantics is just fine, as are Collection+JSON and RSS and similar formats. It’s also fine if the goal is to set up a Coop directory that has certain business interests to generate profits/income and/or pursues some political/economic goals. But personally I care much more about all the other use cases for semantics, for Murmurations, and for that kind of tools, machinery and infrastructure, which seem to be entirely of no interest and out of scope here. That’s fine, but please don’t screw up the technical “protocol”/format/approach, and ideally not the software/tooling either, as these are agnostic and generic/universal; they don’t need to be specific to, or trapped/locked into, a particular instance/implementation/use case/user. A sketch of what I mean by agnostic follows below.
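As far as I understand the approach, a profile simply declares which schemas it conforms to and any consumer interprets only the parts it recognizes; the field and schema names below are illustrative assumptions, not the authoritative Murmurations vocabulary.

```python
import json

# A minimal, generic profile: nothing in it ties the data to one index,
# use case, or business model. Field/schema names are assumed for illustration.
profile = {
    "linked_schemas": ["organizations_schema-v1.0.0", "some_coop_schema-v0.1"],
    "name": "Example Coop",
    "url": "https://somecoop.example",
    "geolocation": {"lat": 52.52, "lon": 13.405},
}

# A consumer just checks whether it recognizes any of the declared schemas
# and ignores the rest; no central instance has to grant or mediate that.
known = {"organizations_schema-v1.0.0"}
usable = known.intersection(profile["linked_schemas"])
print("Schemas this consumer can interpret:", sorted(usable) or "none")

print(json.dumps(profile, indent=2))
```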