
Enter: Anubis

So, I don’t have a lot of brain-juice to spare lately. Mostly because I’ve been reeling from the existential crisis of turning forty … uh, TOMORROW (as of posting this), and I’ve been adjusting to a different position at work, which also comes with longer hours. But, with the limited juice I have, and the start of a tiny vacation from work, I went ahead and hashed out the things I would need to do to:

  • 1: move my website to my own server
  • 2: enable Anubis in order to protect against all AI scrapers

And I’ve done that! In a miraculously short amount of time.

This website will no longer be hosted on GitHub, where everything you write is being scraped by Copilot, and it'll no longer be subject to "action" fees once you reach into the 2,000+ actions zone (this website does a lot of stuff). This also means that I won't be actively pushing my site to Neocities. But honestly? Having this place hosted on my own server lifts any and all restrictions. Including the ability to install and host Anubis as a way to protect against AI.

Neocities is great, but I have my doubts about the owner and his stance on the grand theft of all things for corporate greed and the destruction of the arts. And for a measly six dollars a month to run a server, I can now do … whatever the hell I want.

It's not just important to protect against AI scrapers so that your work and the things you do remain your own, and not something for Google or Sam Altman to make a profit off of. It's also important because, sometimes, when these bots hit your servers, it's effectively a DDoS. And I don't know how many people realize this, but the bandwidth exploding from aggressive bots trying to steal everything you've ever done WILL eventually incur charges, and they can be astronomical.

If you're reading this, though, your HTTP request has already passed through Anubis, and you are not an AI.

Congratulations.

My only gripe with Anubis is the instructions for getting everything running. They're pretty bare-bones, and it took me putting on my thinking cap to get past that.

Here are the things I'll note for others who want to guard against AI scrapers using Anubis:

  • 1: You have to actually download the package and put it on your server first, and if you constantly forget things (like me), or are new to these sorts of things, you might have forgotten, or might not know, how to do this. Just set up FileZilla with SSH privileges and drop the package into your root folder.
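If you'd rather skip FileZilla, the same upload can be done from a terminal. This is just a sketch: the release filename, version, and server hostname below are placeholders, not the real ones from my setup.

```shell
# Placeholders throughout -- substitute your own release version and host.
# Grab a release tarball on your local machine (check the Anubis releases
# page for the current version and asset name):
curl -LO https://github.com/TecharoHQ/anubis/releases/download/vX.Y.Z/anubis-linux-amd64.tar.gz

# Copy it to the server over SSH (this is what FileZilla does under the hood):
scp anubis-linux-amd64.tar.gz you@your-server:/root/

# Unpack it on the server:
ssh you@your-server 'tar -xzf /root/anubis-linux-amd64.tar.gz -C /root/'
```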

  • 2: In the setup instructions on the Anubis website, it mentions an example where you’re building Anubis to protect a Gitea server. This threw me for a loop, because I was like, “Uh, well, what if I’m just protecting a website?? What do I input??” And this was clarified by the developer themselves … you just write whatever. So, mine is “nova.”
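In other words, the "gitea" in the docs is just an instance name for the per-site environment file, and you can call yours anything. A minimal sketch of what mine looks like, assuming your site listens on port 3000 (the variable names come from the Anubis docs; the values here are assumptions, so check the docs for current defaults):

```ini
# /etc/anubis/nova.env -- "nova" is just the name I picked.
BIND=:8923                        # where Anubis listens for proxied traffic
TARGET=http://localhost:3000      # where your actual site is running
DIFFICULTY=4                      # proof-of-work difficulty for the challenge
METRICS_BIND=:9090                # Prometheus metrics endpoint
SERVE_ROBOTS_TXT=true             # serve a robots.txt that tells bots to go away
```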

  • 3: Reverse-proxying is a fairly advanced thing that regular users probably have zero experience with. But, basically: get rid of your default Nginx server configuration, make sure you've already got SSL running via certbot, and follow everything on the Anubis Nginx configuration page to a T. You'll need the upstream block, and in my case, I created a secondary config file in sites-available where my site runs on port 3000, which the actual config (the one that proxies Anubis) points to.
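The shape of that setup, sketched out. This follows the general pattern from the Anubis Nginx docs, but the domain, ports, and paths are placeholders for illustration, not my actual config:

```nginx
# Public vhost: TLS terminates here (certbot paths), and every request
# is handed to Anubis first.
upstream anubis {
    server 127.0.0.1:8923;  # must match Anubis's BIND
}

server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://anubis;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

# Secondary config (sites-available): the actual site, listening only on
# localhost:3000, which Anubis's TARGET points back to.
server {
    listen 127.0.0.1:3000;
    server_name example.com;
    root /var/www/example.com;
    index index.html;
}
```

So the request path is: visitor → Nginx (443) → Anubis (8923) → Nginx again (3000) → your site.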

Convoluted! But, actually, pretty simple, once you’ve wrapped your head around it.

I could probably confidently walk you through it, if you’d like to do exactly what I just did.

Otherwise, onward! To the future!

… and being forty, I guess.


mkultra.monster is independent, in that it is written, developed, and maintained by one person. Written, developed, and maintained, not for scrapers, bots, scammers, algorithms, or grifters: But for people to follow and read, just like the way it used to be, back in the golden age of the internet.


WEBMENTIONS

Have you written a response to this post? Send me a webmention!

📝 How to send a webmention

To send a webmention, your response page must contain an exact link to this post and be publicly fetchable.

  • A blog post that mentions or links to this article
  • A public webpage that includes the exact canonical URL
  • Any webpage that references this content

After creating your response, paste the URL below. Social posts often need a bridge such as Bridgy before they appear as webmentions here.
