<metaname="description"content="For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: what happens when your code is used for somet">
<metaname="description"content="For a while now, I’ve been playing with a thought experiment: what happens when your code is used for something you fundamentally disagree with? Open ">
<metaproperty="og:title"content="The Ethical Open License (EOL) - Rethinking Open Source Responsibility"/>
<metaproperty="og:title"content="Rethinking Open Source Responsibility (EOL)"/>
<metaproperty="og:description"content="For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: what happens when your code is used for somet">
<metaproperty="og:description"content="For a while now, I’ve been playing with a thought experiment: what happens when your code is used for something you fundamentally disagree with? Open ">
<metaname="twitter:title"content="The Ethical Open License (EOL) - Rethinking Open Source Responsibility">
<metaname="twitter:title"content="Rethinking Open Source Responsibility (EOL)">
<metaname="twitter:description"content="For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: what happens when your code is used for somet">
<metaname="twitter:description"content="For a while now, I’ve been playing with a thought experiment: what happens when your code is used for something you fundamentally disagree with? Open ">
<h2>Rethinking Open Source Responsibility (EOL)</h2>
<p>For a while now, I’ve been playing with a thought experiment: <strong>what happens when your code is used for something you fundamentally disagree with?</strong></p>
<p>Open source is great. It encourages collaboration, innovation, and accessibility. But what it doesn’t do is ask whether there should be <em>any</em> limits on how software is used. Right now, if you release something under a permissive license, you’re essentially saying: “Here, take this. Use it for whatever you want.” And sometimes, that “whatever” includes mass surveillance, AI-driven discrimination, or worse.</p>
<p>Some people argue that this is just the price of open-source. Once you put code out there, it’s out of your hands. But I started wondering: <strong>does it have to be?</strong></p>
<p><em>(Fun fact: Apparently, just asking this question is enough to get your post removed from certain open-source communities. The conversation must be <em>very</em> settled, right?)</em></p>
<h2id="What-Is-the-Ethical-Open-License-EOL"><ahref="#What-Is-the-Ethical-Open-License-EOL"class="headerlink"title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is an attempt to build a licensing model that allows for openness while setting some fundamental ethical limitations. It’s not about restricting everyday users or preventing innovation. It’s about setting clear boundaries on how software can and can’t be used.</p>
<p>Under EOL, your software for example <strong>cannot</strong> be used for:</p>
<h2id="What-Is-the-Ethical-Open-License-EOL"><ahref="#What-Is-the-Ethical-Open-License-EOL"class="headerlink"title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is a hypothetical licensing model that explores whether open-source can include ethical restrictions. This isn’t about restricting everyday users or stifling innovation. It’s about setting clear boundaries on how software <em>shouldn’t</em> be used.</p>
<p>Under EOL, your software <strong>cannot</strong> be used for:</p>
<ul>
<li><strong>Mass surveillance</strong> – No large-scale tracking, unauthorized data collection, or government spying programs.</li>
<li><strong>Autonomous weapons</strong> – Your code should not end up in military AI systems or automated combat technology.</li>
<li><strong>Discriminatory AI</strong> – No training models that reinforce bias or make decisions based on race, gender, or social class.</li>
<li><strong>Exploitation networks</strong> – No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.</li>
</ul>
<p>This is about recognizing that technology has real-world consequences and that developers should have a say in how their work is applied.</p>
<p>This raises a fair question: <em>who decides what’s ethical?</em> That’s something that would need clearer definition (which, to be fair, has been one of the biggest criticisms). But ignoring the question entirely doesn’t seem like the best answer either.</p>
<hr>
<h2id="How-Does-EOL-Work"><ahref="#How-Does-EOL-Work"class="headerlink"title="How Does EOL Work?"></a>How Does EOL Work?</h2><p>EOL follows a familiar open-source structure, but with added restrictions on unethical use. It grants users the right to:</p>
<h2id="How-Does-EOL-Work-If-It-Did-Work"><ahref="#How-Does-EOL-Work-If-It-Did-Work"class="headerlink"title="How Does EOL Work? (If It Did Work?)"></a>How Does EOL Work? (If It Did Work?)</h2><p>EOL would follow a familiar open-source structure, but with added restrictions on unethical use. It would grant users the right to:</p>
<ul>
<li>Use, modify, and distribute the software freely.</li>
<li>Contribute to the project under the same ethical guidelines.</li>
<li>Fork or create derivatives as long as they comply with the license terms.</li>
</ul>
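<p>For a sense of what adoption would look like in practice, here is a minimal sketch of the per-file notice a maintainer might add, in the same spirit as an MIT or Apache header. Everything in it is illustrative: EOL has no registered SPDX identifier, so the <code>EOL-1.0</code> tag, the wording, and the file name are assumptions, not a prescribed format.</p>
<pre><code class="python"># example_module.py
# Illustrative header for a project adopting the (hypothetical) Ethical Open License.
#
# Copyright (c) 2025 Example Author
# License: EOL-1.0  (not a registered SPDX identifier; full text would live in ./LICENSE)
#
# You may use, modify, and distribute this file, and fork or build derivatives,
# provided the ethical-use clauses of the license are respected.

def greet(name: str) -> str:
    """Placeholder function; the interesting part is the header above."""
    return f"Hello, {name}!"
</code></pre>
<p>Day-to-day use would feel like any permissive license; the difference only shows up in the restricted uses above and in the enforcement process described next.</p>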
<p>However, if an entity is found to be violating the ethical clauses of the license, they <strong>lose their right to use the software</strong>. The point is a tangible consequence for misuse while keeping the spirit of open-source collaboration intact.</p>
<p>To avoid ambiguity, EOL also defines a <strong>process</strong> for handling violations – ideally an independent review board (if one could exist without turning into a bureaucratic nightmare) where complaints can be filed, reviewed, and decided on the available evidence.</p>
<p>If a violation is confirmed, the offending party is expected to cease the unethical use immediately or risk losing access to the software under the terms of the license.</p>
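<p>Because the enforcement story is the part people question most, here is a rough sketch of the violation workflow described above, written as code purely to make the steps explicit. The state names, the <code>ReviewBoard</code> class, and the idea of modelling this at all are my own illustration of the flow (complaint filed, reviewed against the evidence, confirmed or dismissed, then remedy or termination) – not part of any actual license text.</p>
<pre><code class="python">from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    FILED = auto()
    UNDER_REVIEW = auto()
    DISMISSED = auto()
    CONFIRMED = auto()           # violation confirmed; licensee must cease the use
    LICENSE_TERMINATED = auto()  # licensee kept going; rights under the license revoked

@dataclass
class Complaint:
    licensee: str
    alleged_use: str             # e.g. "mass surveillance"
    evidence: list[str] = field(default_factory=list)
    status: Status = Status.FILED

class ReviewBoard:
    """Hypothetical independent review board (the IERB the post speculates about)."""

    def review(self, complaint: Complaint, violation_found: bool) -> Complaint:
        # A complaint is examined against the submitted evidence.
        complaint.status = Status.UNDER_REVIEW
        complaint.status = Status.CONFIRMED if violation_found else Status.DISMISSED
        return complaint

    def follow_up(self, complaint: Complaint, use_ceased: bool) -> Complaint:
        # A confirmed violator keeps its rights only by stopping the unethical use.
        if complaint.status is Status.CONFIRMED and not use_ceased:
            complaint.status = Status.LICENSE_TERMINATED
        return complaint

# Usage: a complaint is filed, reviewed on the evidence, and escalates only if ignored.
board = ReviewBoard()
case = Complaint(licensee="ExampleCorp", alleged_use="mass surveillance", evidence=["report.pdf"])
case = board.review(case, violation_found=True)
case = board.follow_up(case, use_ceased=False)
print(case.status)  # Status.LICENSE_TERMINATED
</code></pre>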
<p>Of course, enforceability is a huge concern – another major critique. If bad actors don’t follow the law, why would they follow a license? That’s a fair point, but licensing isn’t always about stopping the worst offenders. Sometimes, it’s about setting expectations and norms.</p>
<hr>
<h2id="The-Cost-Model-of-EOL"><ahref="#The-Cost-Model-of-EOL"class="headerlink"title="The Cost Model of EOL"></a>The Cost Model of EOL</h2><p>One important question that comes up with any new license is: <strong>how does the cost model work?</strong></p>
<p>EOL itself is, like most open-source licenses, <strong>free to use</strong>. Any developer, company, or organization can adopt the license without paying fees. However, the enforcement mechanisms and the potential establishment of an independent ethics review board (IERB) introduce some financial considerations.</p>
@ -296,16 +298,20 @@
</ul>
<p>This model helps create a balance between free access for non-commercial use and fair compensation for commercial beneficiaries, ensuring that ethical oversight remains feasible without burdening smaller developers or independent contributors.</p>
<hr>
<h2id="“Isn’t-open-source-supposed-to-be-neutral-”"><ahref="#“Isn’t-open-source-supposed-to-be-neutral-”"class="headerlink"title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h2><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<h2id="The-Biggest-Pushbacks"><ahref="#The-Biggest-Pushbacks"class="headerlink"title="The Biggest Pushbacks"></a>The Biggest Pushbacks</h2><p><em>(And Why They Might Be Right)</em></p>
<h3id="“Isn’t-open-source-supposed-to-be-neutral-”"><ahref="#“Isn’t-open-source-supposed-to-be-neutral-”"class="headerlink"title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h3><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<p>But software isn’t neutral. A powerful AI model isn’t just some abstract tool – it actively shapes real-world outcomes. A social media algorithm doesn’t just recommend content – it determines what millions of people see and believe. And if we, as developers, recognize that, why should we act as if we have no role in what happens next?</p>
<p>That’s where EOL comes in – not as a perfect solution, but as a proposal for a different way of thinking about open-source responsibility.</p>
<h3id="“This-isn’t-open-source-Stop-pretending-it-is-”"><ahref="#“This-isn’t-open-source-Stop-pretending-it-is-”"class="headerlink"title="“This isn’t open source. Stop pretending it is.”"></a>“This isn’t open source. Stop pretending it is.”</h3><p>Okay, fair. It doesn’t fit the OSI definition, which says that open-source software must allow unrestricted use. If <em>that’s</em> the definition you go by, then sure, EOL isn’t open source. But if you see open source as something that can evolve, it’s at least worth talking about.</p>
<h3id="“Ethics-are-subjective-You-can’t-put-them-in-a-license-”"><ahref="#“Ethics-are-subjective-You-can’t-put-them-in-a-license-”"class="headerlink"title="“Ethics are subjective. You can’t put them in a license.”"></a>“Ethics are subjective. You can’t put them in a license.”</h3><p>Completely true. What’s considered ethical today might not be in 50 years. But laws and policies shift too, and we don’t abandon them just because they’re hard to define. The challenge isn’t that ethics change—it’s how to define them in a way that works.</p>
<h3id="“Nobody-would-use-this-because-it’s-legally-vague-”"><ahref="#“Nobody-would-use-this-because-it’s-legally-vague-”"class="headerlink"title="“Nobody would use this because it’s legally vague.”"></a>“Nobody would use this because it’s legally vague.”</h3><p>Honestly, that’s a solid argument. If a license introduces too much risk, companies won’t touch it. If something like EOL were to work, it would need <em>very</em> clear definitions and solid legal backing. Right now, it’s more of a conversation starter than a practical tool.</p>
<h3id="“Bad-actors-won’t-follow-the-license-anyway-”"><ahref="#“Bad-actors-won’t-follow-the-license-anyway-”"class="headerlink"title="“Bad actors won’t follow the license anyway.”"></a>“Bad actors won’t follow the license anyway.”</h3><p>True again. If someone wants to build something awful, they won’t stop because a license tells them not to. But a license isn’t just about enforcement—it’s about setting a precedent. Big companies <em>do</em> care about compliance, and even if this wouldn’t stop everything, it might influence how some organizations think about responsibility.</p>
<hr>
<h2id="EOL-is-Now-on-GitHub"><ahref="#EOL-is-Now-on-GitHub"class="headerlink"title="EOL is Now on GitHub"></a>EOL is Now on GitHub</h2><p>The <strong>Ethical Open License (EOL) 1.0</strong> is now up on GitHub. It’s not a final product. It’s an open discussion. If you’re interested, check it out, share your thoughts, and let’s figure out if this is something that could work.</p>
<h2id="So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing"><ahref="#So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing"class="headerlink"title="So… Is This a Good Idea? Probably Not. But It’s Worth Discussing."></a>So… Is This a Good Idea? Probably Not. But It’s Worth Discussing.</h2><p>I’m not saying EOL is <em>the</em> answer. I’m not even saying it’s <em>a</em> good answer. What I <em>am</em> saying is that open-source has a responsibility problem that’s at least worth thinking about. If the reaction is just “shut up, open source is freedom,” then maybe the conversation is overdue. </p>
<p>The <strong>Ethical Open License (EOL)</strong> is up on GitHub. It’s not a finished product. It’s an open discussion. If you’re interested, check it out and let me know your thoughts.</p>
<hr>
<h2id="Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win"><ahref="#Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win"class="headerlink"title="Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win."></a>Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win.</h2><p>I don’t see EOL as a replacement for MIT, GPL, or other widely adopted licenses. But I do think it’s worth questioning the idea that software is inherently neutral.</p>
<p>How our code gets used matters. And if we, as developers, have the ability to set ethical boundaries, why wouldn’t we consider it?</p>
<contenttype="html"><p>For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: <strong>what happens when your code is used for something you fundamentally disagree with?</strong></p>
<p>We celebrate open-source as this great force for collaboration, and in many ways, it is. But there’s a gap in how we think about responsibility. Right now, if you release software under a standard open-source license, you’re essentially saying: “Here, take this. Use it for whatever you want.” And that’s fine, until you realize that “whatever you want” includes things like mass surveillance, AI-driven discrimination, child exploitation networks, or even tools used to facilitate human trafficking.</p>
<p>Some people argue that this is just the price of open-source – once you put code out there, it’s out of your hands. But I started asking myself: <strong>does it have to be?</strong></p>
<contenttype="html"><p>For a while now, I’ve been playing with a thought experiment: <strong>what happens when your code is used for something you fundamentally disagree with?</strong></p>
<p>Open source is great. It encourages collaboration, innovation, and accessibility. But what it doesn’t do is ask whether there should be <em>any</em> limits on how software is used. Right now, if you release something under a permissive license, you’re essentially saying: “Here, take this. Use it for whatever you want.” And sometimes, that “whatever” includes mass surveillance, AI-driven discrimination, or worse.</p>
<p>Some people argue that this is just the price of open-source. Once you put code out there, it’s out of your hands. But I started wondering: <strong>does it have to be?</strong></p>
<p><em>(Fun fact: Apparently, just asking this question is enough to get your post removed from certain open-source communities. The conversation must be <em>very</em> settled, right?)</em></p>
<h2 id="What-Is-the-Ethical-Open-License-EOL"><a href="#What-Is-the-Ethical-Open-License-EOL" class="headerlink" title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is an attempt to build a licensing model that allows for openness while setting some fundamental ethical limitations. It’s not about restricting everyday users or preventing innovation. It’s about setting clear boundaries on how software can and can’t be used.</p>
<p>Under EOL, your software for example <strong>cannot</strong> be used for:</p>
<h2 id="What-Is-the-Ethical-Open-License-EOL"><a href="#What-Is-the-Ethical-Open-License-EOL" class="headerlink" title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is a hypothetical licensing model that explores whether open-source can include ethical restrictions. This isn’t about restricting everyday users or stifling innovation. It’s about setting clear boundaries on how software <em>shouldn’t</em> be used.</p>
<p>Under EOL, your software <strong>cannot</strong> be used for:</p>
<ul>
<li><strong>Mass surveillance</strong>– No large-scale tracking, unauthorized data collection, or government spying programs.</li>
<li><strong>Autonomous weapons</strong>– Your code should not end up in military AI systems or automated combat technology.</li>
<li><strong>Discriminatory AI</strong>– No training models that reinforce bias or make decisions based on race, gender, or social class.</li>
<li><strong>Exploitation networks</strong>– No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.</li>
<li><strong>Mass surveillance</strong> – No large-scale tracking, unauthorized data collection, or government spying programs.</li>
<li><strong>Autonomous weapons</strong> – Your code should not end up in military AI systems or automated combat technology.</li>
<li><strong>Discriminatory AI</strong> – No training models that reinforce bias or make decisions based on race, gender, or social class.</li>
<li><strong>Exploitation networks</strong> – No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.</li>
</ul>
<p>This is about recognizing that technology has real-world consequences and that developers should have a say in how their work is applied.</p>
<p>This raises a fair question: <em>who decides what’s ethical?</em> That’s something that would need clearer definition (which, to be fair, has been one of the biggest criticisms). But ignoring the question entirely doesn’t seem like the best answer either.</p>
<hr>
<h2 id="How-Does-EOL-Work"><a href="#How-Does-EOL-Work" class="headerlink" title="How Does EOL Work?"></a>How Does EOL Work?</h2><p>EOL follows a familiar open-source structure, but with added restrictions on unethical use. It grants users the right to:</p>
<h2 id="How-Does-EOL-Work-If-It-Did-Work"><a href="#How-Does-EOL-Work-If-It-Did-Work" class="headerlink" title="How Does EOL Work? (If It Did Work?)"></a>How Does EOL Work? (If It Did Work?)</h2><p>EOL would follow a familiar open-source structure, but with added restrictions on unethical use. It would grant users the right to:</p>
<ul>
<li>Use, modify, and distribute the software freely.</li>
<li>Contribute to the project under the same ethical guidelines.</li>
<li>Fork or create derivatives as long as they comply with the license terms.</li>
</ul>
<p>However, if an entity is found to be violating the ethical clauses of the license, they <strong>lose their right to use the software</strong>. This is meant to create a tangible consequence for misuse while keeping the spirit of open-source collaboration intact.</p>
<p>However, if an entity is found to be violating the ethical clauses of the license, they <strong>lose their right to use the software</strong>. This would be enforced through a <strong>defined process</strong>, ideally involving an independent review board (if one could exist without being a bureaucratic nightmare). </p>
<p>To avoid ambiguity, EOL also provides a <strong>defined process</strong> for addressing violations. This would ideally involve an independent review process where complaints can be filed, reviewed, and addressed based on available evidence.</p>
<p>If a violation is confirmed, the offending party is expected to cease the unethical use immediately or risk losing access to the software under the terms of the license.</p>
<p>Of course, enforceability is a huge concern – another major critique. If bad actors don’t follow the law, why would they follow a license? That’s a fair point, but licensing isn’t always about stopping the worst offenders. Sometimes, it’s about setting expectations and norms.</p>
<hr>
<h2 id="The-Cost-Model-of-EOL"><a href="#The-Cost-Model-of-EOL" class="headerlink" title="The Cost Model of EOL"></a>The Cost Model of EOL</h2><p>One important question that comes up with any new license is: <strong>how does the cost model work?</strong></p>
<p>EOL itself is, like most open-source licenses, <strong>free to use</strong>. Any developer, company, or organization can adopt the license without paying fees. However, the enforcement mechanisms and the potential establishment of an independent ethics review board (IERB) introduce some financial considerations.</p>
@ -88,16 +90,20 @@
</ul>
<p>This model helps create a balance between free access for non-commercial use and fair compensation for commercial beneficiaries, ensuring that ethical oversight remains feasible without burdening smaller developers or independent contributors.</p>
<hr>
<h2 id="“Isn’t-open-source-supposed-to-be-neutral-”"><a href="#“Isn’t-open-source-supposed-to-be-neutral-”" class="headerlink" title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h2><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<h2 id="The-Biggest-Pushbacks"><a href="#The-Biggest-Pushbacks" class="headerlink" title="The Biggest Pushbacks"></a>The Biggest Pushbacks</h2><p><em>(And Why They Might Be Right)</em></p>
<h3 id="“Isn’t-open-source-supposed-to-be-neutral-”"><a href="#“Isn’t-open-source-supposed-to-be-neutral-”" class="headerlink" title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h3><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<p>But software isn’t neutral. A powerful AI model isn’t just some abstract tool – it actively shapes real-world outcomes. A social media algorithm doesn’t just recommend content – it determines what millions of people see and believe. And if we, as developers, recognize that, why should we act as if we have no role in what happens next?</p>
<p>That’s where EOL comes in – not as a perfect solution, but as a proposal for a different way of thinking about open-source responsibility.</p>
<h3 id="“This-isn’t-open-source-Stop-pretending-it-is-”"><a href="#“This-isn’t-open-source-Stop-pretending-it-is-”" class="headerlink" title="“This isn’t open source. Stop pretending it is.”"></a>“This isn’t open source. Stop pretending it is.”</h3><p>Okay, fair. It doesn’t fit the OSI definition, which says that open-source software must allow unrestricted use. If <em>that’s</em> the definition you go by, then sure, EOL isn’t open source. But if you see open source as something that can evolve, it’s at least worth talking about.</p>
<h3 id="“Ethics-are-subjective-You-can’t-put-them-in-a-license-”"><a href="#“Ethics-are-subjective-You-can’t-put-them-in-a-license-”" class="headerlink" title="“Ethics are subjective. You can’t put them in a license.”"></a>“Ethics are subjective. You can’t put them in a license.”</h3><p>Completely true. What’s considered ethical today might not be in 50 years. But laws and policies shift too, and we don’t abandon them just because they’re hard to define. The challenge isn’t that ethics change—it’s how to define them in a way that works.</p>
<h3 id="“Nobody-would-use-this-because-it’s-legally-vague-”"><a href="#“Nobody-would-use-this-because-it’s-legally-vague-”" class="headerlink" title="“Nobody would use this because it’s legally vague.”"></a>“Nobody would use this because it’s legally vague.”</h3><p>Honestly, that’s a solid argument. If a license introduces too much risk, companies won’t touch it. If something like EOL were to work, it would need <em>very</em> clear definitions and solid legal backing. Right now, it’s more of a conversation starter than a practical tool.</p>
<h3 id="“Bad-actors-won’t-follow-the-license-anyway-”"><a href="#“Bad-actors-won’t-follow-the-license-anyway-”" class="headerlink" title="“Bad actors won’t follow the license anyway.”"></a>“Bad actors won’t follow the license anyway.”</h3><p>True again. If someone wants to build something awful, they won’t stop because a license tells them not to. But a license isn’t just about enforcement—it’s about setting a precedent. Big companies <em>do</em> care about compliance, and even if this wouldn’t stop everything, it might influence how some organizations think about responsibility.</p>
<hr>
<h2 id="EOL-is-Now-on-GitHub"><a href="#EOL-is-Now-on-GitHub" class="headerlink" title="EOL is Now on GitHub"></a>EOL is Now on GitHub</h2><p>The <strong>Ethical Open License (EOL) 1.0</strong> is now up on GitHub. It’s not a final product. It’s an open discussion. If you’re interested, check it out, share your thoughts, and let’s figure out if this is something that could work.</p>
<h2 id="So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing"><a href="#So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing" class="headerlink" title="So… Is This a Good Idea? Probably Not. But It’s Worth Discussing."></a>So… Is This a Good Idea? Probably Not. But It’s Worth Discussing.</h2><p>I’m not saying EOL is <em>the</em> answer. I’m not even saying it’s <em>a</em> good answer. What I <em>am</em> saying is that open-source has a responsibility problem that’s at least worth thinking about. If the reaction is just “shut up, open source is freedom,” then maybe the conversation is overdue. </p>
<p>The <strong>Ethical Open License (EOL)</strong> is up on GitHub. It’s not a finished product. It’s an open discussion. If you’re interested, check it out and let me know your thoughts.</p>
<p>Whether this turns into something practical or just sparks a broader conversation, I’d call that a win.</p>
<hr>
<p>I don’t expect EOL to replace MIT, GPL, or any of the widely used licenses. But I do think it’s time we stop pretending that software is neutral.</p>
<p>The way our code is used <strong>matters</strong>. And if we, as developers, have the ability to set ethical boundaries, why shouldn’t we?</p>
<h2 id="Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win"><a href="#Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win" class="headerlink" title="Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win."></a>Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win.</h2><p>I don’t see EOL as a replacement for MIT, GPL, or other widely adopted licenses. But I do think it’s worth questioning the idea that software is inherently neutral.</p>
<p>How our code gets used matters. And if we, as developers, have the ability to set ethical boundaries, why wouldn’t we consider it?</p>
<h4id="What-is-your-blog-all-about"><ahref="#What-is-your-blog-all-about"class="headerlink"title="What is your blog all about?"></a>What is your blog all about?</h4><p>The purpose of this website is to give you a small overview about my projects, interests and opinions.</p>
<h4id="How-can-I-contact-the-author-administrator-of-the-blog"><ahref="#How-can-I-contact-the-author-administrator-of-the-blog"class="headerlink"title="How can I contact the author/administrator of the blog?"></a>How can I contact the author/administrator of the blog?</h4><p>Mail: <ahref="mailto:tim.kicker@protonmail.com">tim.kicker@protonmail.com</a></p>
<h4id="How-can-I-contact-the-author-administrator-of-the-blog"><ahref="#How-can-I-contact-the-author-administrator-of-the-blog"class="headerlink"title="How can I contact the author/administrator of the blog?"></a>How can I contact the author/administrator of the blog?</h4><p>Mail: <ahref="mailto:tim.kicker@protonmail.com">tim.kicker@protonmail.com</a></p>
<h4id="Are-the-articles-on-this-blog-written-by-a-single-author-or-multiple-contributors"><ahref="#Are-the-articles-on-this-blog-written-by-a-single-author-or-multiple-contributors"class="headerlink"title="Are the articles on this blog written by a single author or multiple contributors?"></a>Are the articles on this blog written by a single author or multiple contributors?</h4><p>At the time of writing, all blogs were completely done by myself.</p>
{"title":"The Ethical Open License (EOL) - Rethinking Open Source Responsibility","id":"2025/02/24/eol/","date_published":"02/24/2025","summary":"","url":"https://tim.kicker.dev/2025/02/24/eol/","tags":[],"categories":[]}
{"title":"Rethinking Open Source Responsibility (EOL)","id":"2025/02/24/eol/","date_published":"02/24/2025","summary":"","url":"https://tim.kicker.dev/2025/02/24/eol/","tags":[],"categories":[]}
<pubDate>Mon, 24 Feb 2025 15:20:53 +0000</pubDate>
<description><![CDATA[ <p>For a while now, I’ve been thinking about something that doesn’t get talked about enough in open source: <strong>what happens when your code is used for something you fundamentally disagree with?</strong></p>
<p>We celebrate open-source as this great force for collaboration, and in many ways, it is. But there’s a gap in how we think about responsibility. Right now, if you release software under a standard open-source license, you’re essentially saying: “Here, take this. Use it for whatever you want.” And that’s fine, until you realize that “whatever you want” includes things like mass surveillance, AI-driven discrimination, child exploitation networks, or even tools used to facilitate human trafficking.</p>
<p>Some people argue that this is just the price of open-source – once you put code out there, it’s out of your hands. But I started asking myself: <strong>does it have to be?</strong></p>
<description><![CDATA[ <p>For a while now, I’ve been playing with a thought experiment: <strong>what happens when your code is used for something you fundamentally disagree with?</strong></p>
<p>Open source is great. It encourages collaboration, innovation, and accessibility. But what it doesn’t do is ask whether there should be <em>any</em> limits on how software is used. Right now, if you release something under a permissive license, you’re essentially saying: “Here, take this. Use it for whatever you want.” And sometimes, that “whatever” includes mass surveillance, AI-driven discrimination, or worse.</p>
<p>Some people argue that this is just the price of open-source. Once you put code out there, it’s out of your hands. But I started wondering: <strong>does it have to be?</strong></p>
<p><em>(Fun fact: Apparently, just asking this question is enough to get your post removed from certain open-source communities. The conversation must be <em>very</em> settled, right?)</em></p>
<h2 id="What-Is-the-Ethical-Open-License-EOL"><a href="#What-Is-the-Ethical-Open-License-EOL" class="headerlink" title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is an attempt to build a licensing model that allows for openness while setting some fundamental ethical limitations. It’s not about restricting everyday users or preventing innovation. It’s about setting clear boundaries on how software can and can’t be used.</p>
<p>Under EOL, your software for example <strong>cannot</strong> be used for:</p>
<h2 id="What-Is-the-Ethical-Open-License-EOL"><a href="#What-Is-the-Ethical-Open-License-EOL" class="headerlink" title="What Is the Ethical Open License (EOL)?"></a>What Is the Ethical Open License (EOL)?</h2><p>The <strong>Ethical Open License (EOL)</strong> is a hypothetical licensing model that explores whether open-source can include ethical restrictions. This isn’t about restricting everyday users or stifling innovation. It’s about setting clear boundaries on how software <em>shouldn’t</em> be used.</p>
<p>Under EOL, your software <strong>cannot</strong> be used for:</p>
<ul>
<li><strong>Mass surveillance</strong>– No large-scale tracking, unauthorized data collection, or government spying programs.</li>
<li><strong>Autonomous weapons</strong>– Your code should not end up in military AI systems or automated combat technology.</li>
<li><strong>Discriminatory AI</strong>– No training models that reinforce bias or make decisions based on race, gender, or social class.</li>
<li><strong>Exploitation networks</strong>– No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.</li>
<li><strong>Mass surveillance</strong> – No large-scale tracking, unauthorized data collection, or government spying programs.</li>
<li><strong>Autonomous weapons</strong> – Your code should not end up in military AI systems or automated combat technology.</li>
<li><strong>Discriminatory AI</strong> – No training models that reinforce bias or make decisions based on race, gender, or social class.</li>
<li><strong>Exploitation networks</strong> – No use in systems that facilitate child abuse, human trafficking, or other crimes that exploit vulnerable individuals.</li>
</ul>
<p>This is about recognizing that technology has real-world consequences and that developers should have a say in how their work is applied.</p>
<p>This raises a fair question: <em>who decides what’s ethical?</em> That’s something that would need clearer definition (which, to be fair, has been one of the biggest criticisms). But ignoring the question entirely doesn’t seem like the best answer either.</p>
<hr>
<h2 id="How-Does-EOL-Work"><a href="#How-Does-EOL-Work" class="headerlink" title="How Does EOL Work?"></a>How Does EOL Work?</h2><p>EOL follows a familiar open-source structure, but with added restrictions on unethical use. It grants users the right to:</p>
<h2 id="How-Does-EOL-Work-If-It-Did-Work"><a href="#How-Does-EOL-Work-If-It-Did-Work" class="headerlink" title="How Does EOL Work? (If It Did Work?)"></a>How Does EOL Work? (If It Did Work?)</h2><p>EOL would follow a familiar open-source structure, but with added restrictions on unethical use. It would grant users the right to:</p>
<ul>
<li>Use, modify, and distribute the software freely.</li>
<li>Contribute to the project under the same ethical guidelines.</li>
<li>Fork or create derivatives as long as they comply with the license terms.</li>
</ul>
<p>However, if an entity is found to be violating the ethical clauses of the license, they <strong>lose their right to use the software</strong>. This is meant to create a tangible consequence for misuse while keeping the spirit of open-source collaboration intact.</p>
<p>However, if an entity is found to be violating the ethical clauses of the license, they <strong>lose their right to use the software</strong>. This would be enforced through a <strong>defined process</strong>, ideally involving an independent review board (if one could exist without being a bureaucratic nightmare). </p>
<p>To avoid ambiguity, EOL also provides a <strong>defined process</strong> for addressing violations. This would ideally involve an independent review process where complaints can be filed, reviewed, and addressed based on available evidence.</p>
<p>If a violation is confirmed, the offending party is expected to cease the unethical use immediately or risk losing access to the software under the terms of the license.</p>
<p>Of course, enforceability is a huge concern – another major critique. If bad actors don’t follow the law, why would they follow a license? That’s a fair point, but licensing isn’t always about stopping the worst offenders. Sometimes, it’s about setting expectations and norms.</p>
<hr>
<h2 id="The-Cost-Model-of-EOL"><a href="#The-Cost-Model-of-EOL" class="headerlink" title="The Cost Model of EOL"></a>The Cost Model of EOL</h2><p>One important question that comes up with any new license is: <strong>how does the cost model work?</strong></p>
<p>EOL itself is, like most open-source licenses, <strong>free to use</strong>. Any developer, company, or organization can adopt the license without paying fees. However, the enforcement mechanisms and the potential establishment of an independent ethics review board (IERB) introduce some financial considerations.</p>
@ -92,16 +94,20 @@
</ul>
<p>This model helps create a balance between free access for non-commercial use and fair compensation for commercial beneficiaries, ensuring that ethical oversight remains feasible without burdening smaller developers or independent contributors.</p>
<hr>
<h2 id="“Isn’t-open-source-supposed-to-be-neutral-”"><a href="#“Isn’t-open-source-supposed-to-be-neutral-”" class="headerlink" title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h2><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<h2 id="The-Biggest-Pushbacks"><a href="#The-Biggest-Pushbacks" class="headerlink" title="The Biggest Pushbacks"></a>The Biggest Pushbacks</h2><p><em>(And Why They Might Be Right)</em></p>
<h3 id="“Isn’t-open-source-supposed-to-be-neutral-”"><a href="#“Isn’t-open-source-supposed-to-be-neutral-”" class="headerlink" title="“Isn’t open-source supposed to be neutral?”"></a>“Isn’t open-source supposed to be neutral?”</h3><p>The idea has always been that developers provide the tools, and it’s not their job to dictate how those tools are used.</p>
<p>But software isn’t neutral. A powerful AI model isn’t just some abstract tool – it actively shapes real-world outcomes. A social media algorithm doesn’t just recommend content – it determines what millions of people see and believe. And if we, as developers, recognize that, why should we act as if we have no role in what happens next?</p>
<p>That’s where EOL comes in – not as a perfect solution, but as a proposal for a different way of thinking about open-source responsibility.</p>
<h3 id="“This-isn’t-open-source-Stop-pretending-it-is-”"><a href="#“This-isn’t-open-source-Stop-pretending-it-is-”" class="headerlink" title="“This isn’t open source. Stop pretending it is.”"></a>“This isn’t open source. Stop pretending it is.”</h3><p>Okay, fair. It doesn’t fit the OSI definition, which says that open-source software must allow unrestricted use. If <em>that’s</em> the definition you go by, then sure, EOL isn’t open source. But if you see open source as something that can evolve, it’s at least worth talking about.</p>
<h3 id="“Ethics-are-subjective-You-can’t-put-them-in-a-license-”"><a href="#“Ethics-are-subjective-You-can’t-put-them-in-a-license-”" class="headerlink" title="“Ethics are subjective. You can’t put them in a license.”"></a>“Ethics are subjective. You can’t put them in a license.”</h3><p>Completely true. What’s considered ethical today might not be in 50 years. But laws and policies shift too, and we don’t abandon them just because they’re hard to define. The challenge isn’t that ethics change—it’s how to define them in a way that works.</p>
<h3 id="“Nobody-would-use-this-because-it’s-legally-vague-”"><a href="#“Nobody-would-use-this-because-it’s-legally-vague-”" class="headerlink" title="“Nobody would use this because it’s legally vague.”"></a>“Nobody would use this because it’s legally vague.”</h3><p>Honestly, that’s a solid argument. If a license introduces too much risk, companies won’t touch it. If something like EOL were to work, it would need <em>very</em> clear definitions and solid legal backing. Right now, it’s more of a conversation starter than a practical tool.</p>
<h3 id="“Bad-actors-won’t-follow-the-license-anyway-”"><a href="#“Bad-actors-won’t-follow-the-license-anyway-”" class="headerlink" title="“Bad actors won’t follow the license anyway.”"></a>“Bad actors won’t follow the license anyway.”</h3><p>True again. If someone wants to build something awful, they won’t stop because a license tells them not to. But a license isn’t just about enforcement—it’s about setting a precedent. Big companies <em>do</em> care about compliance, and even if this wouldn’t stop everything, it might influence how some organizations think about responsibility.</p>
<hr>
<h2 id="EOL-is-Now-on-GitHub"><a href="#EOL-is-Now-on-GitHub" class="headerlink" title="EOL is Now on GitHub"></a>EOL is Now on GitHub</h2><p>The <strong>Ethical Open License (EOL) 1.0</strong> is now up on GitHub. It’s not a final product. It’s an open discussion. If you’re interested, check it out, share your thoughts, and let’s figure out if this is something that could work.</p>
<h2 id="So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing"><a href="#So…-Is-This-a-Good-Idea-Probably-Not-But-It’s-Worth-Discussing" class="headerlink" title="So… Is This a Good Idea? Probably Not. But It’s Worth Discussing."></a>So… Is This a Good Idea? Probably Not. But It’s Worth Discussing.</h2><p>I’m not saying EOL is <em>the</em> answer. I’m not even saying it’s <em>a</em> good answer. What I <em>am</em> saying is that open-source has a responsibility problem that’s at least worth thinking about. If the reaction is just “shut up, open source is freedom,” then maybe the conversation is overdue. </p>
<p>The <strong>Ethical Open License (EOL)</strong> is up on GitHub. It’s not a finished product. It’s an open discussion. If you’re interested, check it out and let me know your thoughts.</p>
<p>Whether this turns into something practical or just sparks a broader conversation, I’d call that a win.</p>
<hr>
<p>I don’t expect EOL to replace MIT, GPL, or any of the widely used licenses. But I do think it’s time we stop pretending that software is neutral.</p>
<p>The way our code is used <strong>matters</strong>. And if we, as developers, have the ability to set ethical boundaries, why shouldn’t we?</p>
<h2 id="Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win"><a href="#Whether-this-turns-into-something-real-or-just-sparks-a-broader-conversation-I’ll-count-that-as-a-win" class="headerlink" title="Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win."></a>Whether this turns into something real or just sparks a broader conversation, I’ll count that as a win.</h2><p>I don’t see EOL as a replacement for MIT, GPL, or other widely adopted licenses. But I do think it’s worth questioning the idea that software is inherently neutral.</p>
<p>How our code gets used matters. And if we, as developers, have the ability to set ethical boundaries, why wouldn’t we consider it?</p>