Wikipedia, Wikimedia, MediaWiki, and wiki




GreenPolicy360, hosted on the open/public MediaWiki platform


Differences between Wikipedia, Wikimedia, MediaWiki, and wiki


○ ○ ○ ○ ○ ○ ○ ○ ○ ○


Why did Wikipedia succeed?

As of 2018, Wikipedia is the 5th most-visited website on the World Wide Web


The Beginning of Wikipedia, 2001, in St. Petersburg, Florida


Wikipedia

Jimmy Wales / Founder - https://en.wikipedia.org/wiki/Jimmy_Wales


○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○


GreenPolicy360: In the Digital Commons


Digital Citizen · Digital Rights · The Commons


○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○ ○


Wiki Definitions

A wiki is a type of website whose contents can be edited from the web browser, and which keeps a version history for each editable page. Wikis are often, but not always, editable by any visitor to the site.
  • wiki (as an adjective), "the wiki way", and the antonym un-wiki are also used to describe the community-oriented philosophy that goes with such a system (e.g. "that's not a very wiki way of doing things").
  • Wiki (with a capital 'W') is an incorrect term, although it has sometimes been used to refer to either Wikipedia or the Portland Pattern Repository (also known as WikiWikiWeb), which was the first wiki ever created.
  • wiki software and wiki engine are terms referring to pieces of software that power wiki websites. There are many different types of wiki software; some very simple, others with advanced features.


See Wikipedia's article (a wiki) on "wiki software" and Wikipedia's definition of "wiki".


Wikimedia

Wikimedia is the collective name for the Wikimedia movement, revolving around a group of inter-related projects, including Wikipedia, Wiktionary, Wikiquote and others, which aim to use the collaborative power of the Internet, and the wiki concept, to create and share free knowledge of all kinds.

The term Wikimedia servers is often used, referring to the computer hardware on which all Wikimedia projects are hosted.
The Wikimedia Foundation is a non-profit organization headquartered in San Francisco that runs the Wikimedia projects.

Wikimedia Deutschland, Wikimedia Russia, etc. are the names of dozens of local chapters of enthusiasts of Wikimedia projects. They are independent of the Wikimedia Foundation and the Wikimedia projects.


Wikipedia

Wikipedia is a Wikimedia project that is a global, free and multilingual internet encyclopedia. It is the oldest and largest Wikimedia project, predating the Wikimedia Foundation itself. Wikipedia is often described as a wiki, but it is in fact a collection of over 200 wikis, one for each language, all running on the MediaWiki software.


MediaWiki

MediaWiki is a particular wiki engine developed for and used by Wikipedia and the other Wikimedia projects. MediaWiki is freely available for others to use (and improve), and it is in use by all sorts of projects and organizations around the world.

The site mediawiki.org is intended for information about MediaWiki and related software.
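
Every page on a MediaWiki wiki keeps the per-page revision history described above, and MediaWiki exposes that history through its Action API (api.php). The following is a minimal Python sketch, not an official example: it assumes the third-party requests library is installed, and the English Wikipedia endpoint and the page title "Wiki" are used purely as illustrative values; any MediaWiki site with its API enabled answers the same query.

 # Minimal sketch: read a page's revision history via the MediaWiki Action API.
 # Assumes the third-party "requests" library; endpoint and title are example values.
 import requests

 API_URL = "https://en.wikipedia.org/w/api.php"

 params = {
     "action": "query",
     "prop": "revisions",
     "titles": "Wiki",                    # example page title
     "rvlimit": 5,                        # five most recent revisions
     "rvprop": "timestamp|user|comment",  # who edited, when, and the edit summary
     "format": "json",
     "formatversion": 2,                  # cleaner JSON: pages come back as a list
 }

 response = requests.get(API_URL, params=params, timeout=10)
 response.raise_for_status()
 page = response.json()["query"]["pages"][0]

 for rev in page["revisions"]:
     print(rev["timestamp"], rev.get("user", ""), rev.get("comment", ""))

Run as-is, the sketch prints the timestamp, editor, and edit summary of the five most recent revisions of the example page: the same provenance information shown on a page's "View history" tab.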


See also:

MediaWiki Manual: What is MediaWiki?

Sites using MediaWiki

The State of Wikipedia on YouTube


··············································


A conversation with Katherine Maher about Wikipedia’s nonprofit structure and the future


Read the full interview for important reflections and forward-looking observations about digital rights, the Commons, and the global future of the Internet...


April 2018 Interview

How do you view Wikipedia and the Wikimedia Foundation’s relationships with the major platforms — Google, Facebook, and so on? How would you characterize them?

For the most part, we grew up independently from most of the major platforms that dominate today. If you look around the biggest ones, we predate, I think, all of them but Google. At least of the ones that trade in knowledge and information. We were created for a very different purpose at a different time. The web that we came into actually had a number of larger collaborative, non-commercialized early projects, some of which still exist, and some of which don’t, that were more community-driven. And we sort of are a legacy of the original spirit of the web, and that’s very much what the Wikimedia Foundation was created to do — to ensure that Wikipedia was preserved as a nonprofit entity, with respect for community governance, and in the public interest and in the public spirit.

And that is a very different business model and incentive structure than most of these other organizations, and I think that sets us apart a little bit. We have different resources, different questions, different challenges, and different accountabilities. Now, over time, I think that the dynamics of the different companies have become more intertwined — in part because there has been increased consolidation in the market. Some of these platforms have found ways to incorporate the content that Wikipedia and Wikimedia, more broadly, offers.

And that’s okay; that’s in keeping with the spirit of our license. We want people to reuse content and we want knowledge to be more broadly available to the world. But it’s definitely a different dynamic, where we have a smaller number of larger players who are thinking about how to integrate Wikimedia into their ecosystem in ways that are probably very different than we ever could have anticipated at the get-go. And because we have a different profit motive — we don’t have one — and a different infrastructure and different resourcing capacity, that has led, at times, to us not being in some of those conversations and in the same rooms, because we’re just not necessarily the same beast.


I’m curious about the ways in which you think the incentive structure of the other platforms has created the problems the tech industry is facing — the rising set of concerns over privacy, data usage, the toxic communities, etcetera.

The business model in many ways drives the product and the product decisions. Many of the larger platforms have business models that rely on the incentivization of additional clicks or more time on-site or the extraction of additional data or the creation and modeling of user personas that can be used to sell not just ads on the site but data sets that can be sold to third parties. Those incentive structures, from a product and decision-making standpoint, and privacy policies and all of that are optimized to extract more from users without necessarily optimizing to provide more value to users.

So, there’s an old saying that if you’re not paying, you’re the product. I think that is something that has floated around for a really long time in privacy circles among others. I think we’re really beginning to see in very tangible ways how that is the case, and what the negative impacts are for our social structures and our sort of common sense of identity and enjoyment of the web overall.


Is there a fundamental tension between making a profit and treating users well? Is there an advantage to being a nonprofit, in terms of how you operate and the choices you make? Do you think that there’s a way for these platforms to operate and to adapt and change and work in the interest of their users?

I think that the jury’s out. I think it is important to recognize that while we are dealing with the repercussions of decisions that have been made over time that have gotten us to where we are, it is also important for us to recognize some of the value that has been created. I don’t mean value from a shareholder’s standpoint; I mean just the good that has been created in terms of people having access to information, people being able to engage with the web in their own languages, people being connected, in ways they were not previously, to their communities, their cultures, their political leaders, their very sense of dignity and agency. I think all of those things are real things that have happened on the web. And whether or not they are the result of a specific company, I do think it’s just important for us to contextualize all of that as we start these conversations.

To your point about whether there’s a fundamental tension, as I said, the jury is out. There are companies out there that have different business models that do require fees for service. And when you have fees for service, you tend to have more consumer agency, to the extent that people are going to be willing to pay for services, or to the extent that there are folks out there who can’t pay for services. But that could create a new and tiered, exclusionary, privatized web in which those with resources have access to the privilege of privacy and those without do not. I think that those are really challenging questions that these platforms are going to have to grapple with, and grapple with meaningfully.

You asked the question of what it means to us to be a nonprofit in terms of the choices that we can make. We can make the choice to always put our users first. We can make the choice to invest over the long term. We can make the choice to call out problems on our platform and acknowledge them when we see them, without the fear of our market diminishing. I think that that grants a tremendous amount of freedom and hopefully integrity that allows us to be honest with ourselves, and honest with our users, and accountable to our users in the spirit of continuous improvement.

And I think that that is a different sort of incentive structure that is much more freeing than some of the choices that these platforms currently face.


These companies seem to operate at such a size and with such market power that it’s really hard to apply regular consumer kinds of pressure. Which obviously raises the question of government regulation and intervention by the public on behalf of the public. And there’s absolutely no appetite for that in Silicon Valley right now, and there hasn’t really ever been.

Oh, I’m going to push back on that. I think that it depends on who you talk to in Silicon Valley.


Well, let me rephrase it. There’s no appetite for regulation from within these companies. Within Silicon Valley, you —

I would say very carefully within large dominant incumbents. Right? I think that that’s one of the issues of regulation that is challenging and where these questions become nuanced and complex. When you regulate in response to incumbents, as opposed to in response to user needs, you have all sorts of issues, like, Are you, in fact, just reinforcing the position of incumbents to the exclusion of competitors? There are challenging questions that regulators would need to engage in, but that doesn’t mean that there isn’t the opportunity for a conversation.

You know, we regulate all sorts of things in our society. We regulate our food systems; we regulate our electricity consumption and our energy supplies. I don’t think that regulation is necessarily the challenge; the question is, how do we go about this in ways that continue to create and enable creativity and opportunity, and don’t lead to negative externalities that prevent the web from continuing to evolve in ways that are actually friendly to users?

When we think about governance of the web, and we think about regulation of the web, what we really should be thinking is, What is in the interest of the people who use, populate, and derive value from it? Whether it’s educational value, or cultural value, or connective value. That should be at the forefront of the conversations that we’re having there.

And I think that you do talk to people who, when thinking about the nature of the challenges we face, increasingly recognize and acknowledge that it is actually better to have an accountable system with some form of democratic check — some form of balance and weight there that is responsive to people’s needs.


If you remove the fundamental thing that makes Facebook so much money, which is its ability to show customized ads to people at a global scale in an efficient way, it would be fundamentally and irrevocably changed. Or, at the very least, the discussion sort of has to be open to that. Then, it’s a question about how much value people are willing to sacrifice. What’s the degree to which you think that there’s a willingness to suffer pain, and suffer the loss of value, on the part of people who operate platforms?

I think that the important thing that we all have to keep in mind is that these companies are made up of people who are making difficult decisions on a daily basis, and bringing a multiplicity of viewpoints. I have certainly seen, as I’m sure you have seen, folks within these companies who say, “Actually, maximizing value is what we do.” But when Facebook filed as a public company, they talked about their mission, rather than shareholder value. And I think that is a question for the boards of those companies: whether they are willing to reduce the maximization of profits in order to be better corporate citizens. And I can’t answer that question for these companies, but I can pinpoint that that’s the problem.


Coming back to the question of Wikimedia’s relationship with these companies: do you think that public, or at least nonprofit, services that form critical web infrastructure — and I think of the Wikimedia Foundation as being perhaps the largest and most significant and influential of these — have been treated fairly by the platforms? By YouTube and Facebook and Google and other services that in some fashion rely on Wikipedia, or plan to rely on Wikipedia?

I believe that we form part of the commons. And that is a cultural commons; it’s a heritage commons; it’s a scientific commons; it’s something that every single human has the right to enjoy, and has the right to participate in, and that should be available to everyone. There is no public internet, and we are the closest thing to it.

And it’s not just us; there are other outfits and organizations out there that maintain critical open-source projects, that maintain the open licenses on which we operate. And I believe that the organizations that derive value from the commons should support the commons.


Do you think that they’ve done a sufficient job of supporting the commons up to now?

I think there’s a lot of opportunity for that support to increase and be more meaningful.


Whether it’s YouTube and Wikipedia, Facebook and PolitiFact, Facebook and Snopes, the platforms are increasingly relying on much smaller, sometimes nonprofit organizations to do a lot of work of contextualizing, fact-checking, and so on. Do you think that is sustainable?

This is why it is important for the platforms to support the commons. We already provide a tremendous amount of value to these platforms. And I mean that as monetary value. I’m not talking about sort of social-good value. We do offer a tremendous amount of social-good value, but we also know that projects like Wikipedia create tremendous value in computational science.

The reason it is important to these platforms to support the commons is because there is labor that is involved, and because the commons requires infrastructure support. But also, importantly, because if we — those of us who are invested in stewardship of the internet — want the internet to be a good, healthy, dynamic, ever-growing, rich, meaningful place full of content, it is important that the commons is sustained, because it is often the source of the highest-quality information that exists on the web.

Not only that, the commons currently only reflects a small proportion of the people that exist on this globe. So if we want to deepen it, invest in it, make sure that we have more languages represented, more cultures represented, more perspectives represented, more people represented, so that the internet means more to more people, so that more people are on it, so that more platforms are able to engage with more people, then we need to support the commons. The commons is a driver of why we access the web. It creates real value. And we want to make sure that it is healthy and thriving and supported into the future.


Do you think that there are any potential pitfalls to receiving support from the big platforms? Getting financial support in some way links you financially or fundamentally to these platforms — do you think that there are any dangers there?

So what I say is that these platforms are already using the commons and deriving value from it. So that sort of is a known factor. All financial support to the commons, including ourselves, should be transparent, and it should be accountable. So that’s one of the things that I think is very unique about the Wikimedia model; it’s not just that absolutely all of the information is available and you can look back and see, there’s a transparency around it: Where did this content come from? Who added it? When was it added? What did it look like before that content was added or removed?

There is also transparency about our funding models. We publish a report every single year that’s not just an annual report but is also a fundraising report: What are the messages that we’ve used? Where did the money come from? What’s the balance geographically? How much of it is small dollars? And you can sustain that in a transparent and meaningful way that ensures that you’ve got that accountability.

There are ways to do this. But I think that that presumes that this is a conversation that’s happening. I look at the open-source community to look at the way that open-source projects have been developed over time and what has worked and what hasn’t, because a huge proportion of today’s web runs on an open-source stack, and there are institutions out there that support specific discrete parts of it — institutions that, as you well know, rely on the funding. So there are certainly models for it.


Changing the subject a bit, do you feel that these companies have thus far proven themselves up to the challenge of fixing themselves? And secondly, what do you imagine they need to do to fix the crisis they’ve sort of propagated?

We, as Wikimedia, start from a position of default transparency — by talking about our problems as a means of acknowledging and looking for solutions. Because when we can come into the room and share what we’re seeing that is happening in our communities, on our platforms, in governance, whatever the case may be, that then allows us to have a candid conversation where we immediately default to solutions.

What I’ve heard from policy-makers is that policy-makers want, generally speaking, to work with platforms because we all recognize that there is social good that these platforms provide. They fill needs that people have. I think what I’ve heard from policy-makers is that there is a frustration when these platforms won’t come to the table, in the spirit of problem-solving, in the spirit of acknowledging where there have been shortcomings.

What I think is really interesting about Wikimedia is that every single decision — the very nature of the way that Wikimedia works, the way that content has been created — has been hundreds of millions (if not actually, literally, billions) of individual human judgements.

And those human judgements are part of why it works. Because the ability to make small decisions on a daily basis, and to come back and renegotiate those decisions when they’re called into question, is something that creates a form of accountability to users. That is totally different from the way that algorithms and computer-generated moderation actually work.

The decisions that algorithms make at scale are fundamentally not transparent to users. The decisions that humans make at scale are, at least in Wikimedia’s structure, transparent and are constantly up for renegotiation.


How much do you think making these algorithms transparent is a core concern?

I think clearer and more transparent product decisions are absolutely essential, and I don’t mean that labeling things differently is going to solve this problem. What I actually mean is that making it visible and comprehensible and intelligible to users is the way forward. I want to know why information is being put in front of me. And I want to know who has made that decision to put that information in front of me.

That is something that happens based on the way that these systems have been designed. That ability to ask those questions, and then the ability to challenge those decisions, to inject humanity back into our systems — and in doing so, accountability — that’s what’s missing right now.


Interview Courtesy of NYMag.com

Creative Commons