| title | thumbnail | featured_image | date | tags |
| --- | --- | --- | --- | --- |
| The Ethical Open License (EOL) - Rethinking Open Source Responsibility | https://socialify.git.ci/timkicker/eol/image?language=1&logo=https%3A%2F%2Fraw.githubusercontent.com%2Ftimkicker%2FEOL%2Frefs%2Fheads%2Fmain%2Ffile_with_handshake.png&name=1&owner=1&stargazers=1&theme=Light | https://socialify.git.ci/timkicker/eol/image?language=1&logo=https%3A%2F%2Fraw.githubusercontent.com%2Ftimkicker%2FEOL%2Frefs%2Fheads%2Fmain%2Ffile_with_handshake.png&name=1&owner=1&stargazers=1&theme=Light | 2025-02-24 15:20:53 | |
For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: what happens when your code is used for something you fundamentally disagree with?
We celebrate open-source as this great force for collaboration, and in many ways, it is. But there’s a gap in how we think about responsibility. Right now, if you release software under a standard open-source license, you’re essentially saying: "Here, take this. Use it for whatever you want." And that’s fine, until you realize that "whatever you want" includes things like mass surveillance, AI-driven discrimination, child exploitation networks, or even tools used to facilitate human trafficking.
Some people argue that this is just the price of open-source -- once you put code out there, it’s out of your hands. But I started asking myself: does it have to be?
What Is the Ethical Open License (EOL)?
The Ethical Open License (EOL) is an attempt to build a licensing model that allows for openness while setting some fundamental ethical limitations. It’s not about restricting everyday users or preventing innovation. It’s about setting clear boundaries on how software can and can’t be used.
Under EOL, for example, your software cannot be used for:
- Mass surveillance -- No large-scale tracking, unauthorized data collection, or government spying programs.
- Autonomous weapons -- Your code should not end up in military AI systems or automated combat technology.
- Discriminatory AI -- No training models that reinforce bias or make decisions based on race, gender, or social class.
- Exploitation networks -- No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.
This is about recognizing that technology has real-world consequences and that developers should have a say in how their work is applied.
How Does EOL Work?
EOL follows a familiar open-source structure, but with added restrictions on unethical use. It grants users the right to:
- Use, modify, and distribute the software freely.
- Contribute to the project under the same ethical guidelines.
- Fork or create derivatives as long as they comply with the license terms.
However, if an entity is found to be violating the ethical clauses of the license, they lose their right to use the software. This is meant to create a tangible consequence for misuse while keeping the spirit of open-source collaboration intact.
To avoid ambiguity, EOL also provides a defined process for addressing violations. This would ideally involve an independent review process where complaints can be filed, reviewed, and addressed based on available evidence.
If a violation is confirmed, the offending party is expected to cease the unethical use immediately or risk losing access to the software under the terms of the license.
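To make that flow a bit more concrete, here is a minimal sketch in Python of how a complaint might move through such a review process. Every name and state below is mine, purely for illustration -- the actual procedure is whatever the license text (and any future review body) defines.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class ComplaintStatus(Enum):
    FILED = auto()
    UNDER_REVIEW = auto()
    DISMISSED = auto()
    CONFIRMED = auto()  # violation confirmed: the party must cease the unethical use


@dataclass
class Complaint:
    """A reported violation of the EOL ethical clauses (illustrative only)."""
    reporter: str
    accused_party: str
    alleged_violation: str  # e.g. "mass surveillance", "autonomous weapons"
    evidence: list[str] = field(default_factory=list)
    status: ComplaintStatus = ComplaintStatus.FILED


def review(complaint: Complaint, evidence_is_sufficient: bool) -> Complaint:
    """Walk a complaint through the hypothetical independent review step."""
    complaint.status = ComplaintStatus.UNDER_REVIEW
    if evidence_is_sufficient:
        # Under the license terms, a confirmed violator must stop the unethical
        # use immediately or lose the right to use the software.
        complaint.status = ComplaintStatus.CONFIRMED
    else:
        complaint.status = ComplaintStatus.DISMISSED
    return complaint
```

Nothing here is binding, of course. The point is only that "file, review, decide, enforce" can be written down as an explicit process rather than left vague.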
The Cost Model of EOL
One important question that comes up with any new license is: how does the cost model work?
EOL itself is, like most open-source licenses, free to use. Any developer, company, or organization can adopt the license without paying fees. However, the enforcement mechanisms and the potential establishment of an independent ethics review board (IERB) introduce some financial considerations.
Potential Costs and Funding Sources
- Self-Governance (Free Model) -- In its simplest form, projects adopting EOL could rely on community-driven enforcement, where violations are reported and discussed publicly. This keeps costs low but relies heavily on volunteer effort and public pressure.
- Ethics Review Board (IERB) -- If a formal IERB were established, it would require funding for:
  - Legal reviews of complaints
  - Investigations into reported misuse
  - Administrative work related to enforcement

  This could be supported through:
  - Corporate sponsorships from organizations that believe in ethical open-source development.
  - Crowdfunding and donations from developers and supporters.
  - A tiered compliance model where companies using EOL-licensed software commercially contribute a small fee toward maintaining ethical oversight.
- Hybrid Approach -- A mix of self-governance and optional paid enforcement. Smaller projects could rely on community oversight, while larger commercial users could opt into a paid compliance system that helps fund ethical review and enforcement.
The exact cost model isn’t set in stone -- it’s something that would need to be refined based on community feedback and practical needs. The core idea, however, is that ethical enforcement doesn’t have to be a barrier to open-source adoption, but it does require some thought about sustainability.
The Royalty Structure of EOL
Another aspect of EOL is its royalty model for commercial use. While individuals, nonprofits, and small companies can use the software freely, larger companies generating significant revenue directly from EOL-licensed software are expected to contribute back to the ecosystem.
I’m still not entirely sure whether a royalty model is the right approach or whether it would actually be beneficial. The idea is to give large-scale commercial users a way to contribute back to the ethical enforcement of open-source projects, but whether this is the best method is something that needs further discussion.
Royalty Rates Based on Annual Gross Revenue
| Annual Gross Revenue | Royalty Rate |
| --- | --- |
| Less than $1,000,000 | 0% |
| $1,000,000 - $5,000,000 | 1% |
| More than $5,000,000 | 2% |
These royalties are calculated on the annual gross revenue directly attributable to the software or its derivative works. The goal is to ensure that successful commercial ventures built on EOL-licensed software contribute fairly to its maintenance and ethical enforcement.
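As a rough illustration of how that calculation could look in practice (my own sketch -- I’m assuming here that the rate applies flat to the entire attributable revenue rather than marginally per bracket; the license text itself is the authoritative source):

```python
def eol_royalty(attributable_revenue_usd: float) -> float:
    """Royalty owed on annual gross revenue directly attributable to the
    EOL-licensed software, using the tiered rates from the table above.

    Assumes a flat (non-marginal) rate on the whole attributable amount --
    check the actual license text for the authoritative rule.
    """
    if attributable_revenue_usd < 1_000_000:
        rate = 0.00
    elif attributable_revenue_usd <= 5_000_000:
        rate = 0.01
    else:
        rate = 0.02
    return attributable_revenue_usd * rate


# Example: a company earning $3M directly from the software would owe 1% = $30,000.
print(eol_royalty(3_000_000))  # 30000.0
```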
Funds collected through royalties can be allocated toward:
- Supporting ethical enforcement efforts
- Maintaining an independent review process
- Funding security audits and compliance verification
- Ensuring long-term sustainability of ethically governed open-source projects
This model helps create a balance between free access for non-commercial use and fair compensation for commercial beneficiaries, ensuring that ethical oversight remains feasible without burdening smaller developers or independent contributors.
"Isn’t open-source supposed to be neutral?"
The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.
But software isn’t neutral. A powerful AI model isn’t just some abstract tool -- it actively shapes real-world outcomes. A social media algorithm doesn’t just recommend content -- it determines what millions of people see and believe. And if we, as developers, recognize that, why should we act as if we have no role in what happens next?
That’s where EOL comes in -- not as a perfect solution, but as a proposal for a different way of thinking about open-source responsibility.
EOL is Now on GitHub
The Ethical Open License (EOL) 1.0 is now up on GitHub. It’s not a final product. It’s an open discussion. If you’re interested, check it out, share your thoughts, and let’s figure out if this is something that could work.
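If you want to try it on a project, adoption would presumably look like it does for any other license: include the license text in your repository and point to it from your sources. Something along these lines -- the notice wording and identifier here are placeholders I made up, not anything official:

```python
# Hypothetical per-file notice (adapt to whatever the EOL text itself specifies):
#
# SPDX-License-Identifier: LicenseRef-EOL-1.0
#   (EOL is not an SPDX-listed license, so a custom LicenseRef would be needed)
#
# This file is released under the Ethical Open License (EOL) 1.0.
# Full license text: https://github.com/timkicker/EOL
```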
Whether this turns into something practical or just sparks a broader conversation, I’d call that a win.
I don’t expect EOL to replace MIT, GPL, or any of the widely used licenses. But I do think it’s time we stop pretending that software is neutral.
The way our code is used matters. And if we, as developers, have the ability to set ethical boundaries, why shouldn’t we?