I use Zip Bombs to Protect my Server (idiallo.com)
from some_guy@lemmy.sdf.org to technology@lemmy.world on 29 Apr 22:39
https://lemmy.sdf.org/post/33570833

The one-liner:

dd if=/dev/zero bs=1G count=10 | gzip -c > 10GB.gz

This is brilliant.

#technology

Aatube@kbin.melroy.org on 29 Apr 23:10 next collapse

macOS compresses its memory. Does this mean we'll see bots running on macOS now?

UnbrokenTaco@lemm.ee on 29 Apr 23:20 next collapse

Is it immune to zip bombs?

Aatube@kbin.melroy.org on 29 Apr 23:24 collapse

All I know is it compresses memory. The mechanism mentioned here for ZIP bombs to crash bots is to fill up memory fast with repeating zeroes.

Guidy@lemmy.world on 29 Apr 23:33 collapse

I thought it was to fill all available storage. Maybe it’s both?

ivn@jlai.lu on 29 Apr 23:50 next collapse

Linux and Windows have compressed memory too, for 10 years or more. And that's not how you avoid zip bombs anyway: just limit how much you uncompress and abort if it goes over that limit.
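The "limit and abort" idea, roughly, for anyone curious. This is just a sketch with shell tools (example.com and the 10 MiB cap are placeholders, and it assumes the response really is gzip-encoded):

# decompress at most 10 MiB; once head has read its fill the pipe closes and gunzip/curl get killed off
curl -s -H 'Accept-Encoding: gzip' https://example.com/ | gunzip | head -c 10M > page.html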

timetraveller@lemmy.world on 30 Apr 00:30 collapse

I was going to say the same thing.

tdawg@lemmy.world on 30 Apr 00:00 collapse

No, but that’s an interesting question. Ultimately it probably comes down to hardware specs. Or, depending on the particular bot and its environment, the specs of the container it’s running in.

Even with macOS’s style of compressing inactive memory pages you’ll still have a hard cap that can be reached with the same technique (just with a larger uncompressed file).

4am@lemm.ee on 30 Apr 04:05 collapse

How long would it take to be considered an inactive memory page? Do OOM conditions immediately trigger compression, or would the process die first?

tdawg@lemmy.world on 30 Apr 16:51 collapse

So I’m not an expert but my understanding is the flow is roughly:

  1. Available memory gets low
  2. Compress based on LRU rules
  3. Use swap
  4. OOM

So it’s more meant to be preventative afaik

melroy@kbin.melroy.org on 29 Apr 23:16 next collapse

let me try..

melroy@kbin.melroy.org on 29 Apr 23:17 collapse

Looks fine to me. I think only one CPU core was at 100%.

10+0 records in
10+0 records out
10737418240 bytes (11 GB, 10 GiB) copied, 28,0695 s, 383 MB/s
melroy@kbin.melroy.org on 29 Apr 23:19 collapse

ow.. now the idea is to unzip it right?

nice idea:

if (ipIsBlackListed() || isMalicious()) {
    header("Content-Encoding: deflate, gzip");
    header("Content-Length: " . filesize(ZIP_BOMB_FILE_10G)); // ~10 MB on disk, ~10 GB decompressed
    readfile(ZIP_BOMB_FILE_10G);
    exit;
}
mbirth@lemmy.ml on 30 Apr 00:03 collapse

Might need some

if (ob_get_level()) ob_end_clean();

before the readfile. 😉

comador@lemmy.world on 29 Apr 23:18 next collapse

Funny part is many of us crusty old sysadmins were using derivatives of this decades ago to test RAID-5/6 sequential read and write speeds.

UnbrokenTaco@lemm.ee on 29 Apr 23:19 next collapse

Interesting. I wonder how long it takes until most bots adapt to this type of “reverse DoS”.

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:28 collapse

Then we’ll just be more clever as well. It’s an arms race after all.

lemmylommy@lemmy.world on 29 Apr 23:21 next collapse

Before I tell you how to create a zip bomb, I do have to warn you that you can potentially crash and destroy your own device.

LOL. Destroy your device, kill the cat, what else?

archonet@lemy.lol on 29 Apr 23:55 next collapse

destroy your device by… having to reboot it. the horror! The pain! The financial loss of downtime!

Albbi@lemmy.ca on 30 Apr 01:43 collapse

It’ll email your grandmother all of your porn!

CrazyLikeGollum@lemmy.world on 30 Apr 06:12 next collapse

Ah yes, the infamous “stinky cheese” email virus. Who knew zip bombs could be so destructive. It erased all of the easter eggs off of my DVDs.

turkalino@lemmy.yachts on 30 Apr 07:12 next collapse

Haven’t thought about that Weird Al song in a while

Exec@pawb.social on 30 Apr 07:27 next collapse

outstanding reference

AceFuzzLord@lemm.ee on 01 May 00:03 collapse

The horrors of having your TV record Gigli!

palordrolap@fedia.io on 29 Apr 23:36 next collapse

The article writer kind of complains that they're having to serve a 10MB file, which is the result of the gzip compression. If that's a problem, they could switch to bzip2. It's available pretty much everywhere that gzip is available and it packs the 10GB down to 7506 bytes.

That's not a typo. bzip2 is way better with highly redundant data.
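For anyone who wants to reproduce that, a sketch using the same approach as the article's one-liner, just swapping in bzip2 (exact sizes will vary a bit by version):

# ~10 GiB of zeroes; bzip2's run-length-friendly format squeezes it down to a few KB
dd if=/dev/zero bs=1M count=10240 | bzip2 -9 -c > 10GB.bz2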

just_another_person@lemmy.world on 30 Apr 00:02 next collapse

I believe he’s returning a gzip HTTP response stream, not just a file payload that the requester then downloads and decompresses.

Bzip isn’t used in HTTP compression.

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:12 next collapse

Brotli is an option, and it’s comparable to Bzip. Brotli works in most browsers, so hopefully these bots would support it.

I just tested it, and a 10G file full of zeroes is only 8.3K compressed. That’s pretty good, though a little bigger than BZip.
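If you want to check that yourself, a sketch assuming the brotli command-line tool is installed (quality 11 is the maximum):

# same 10 GiB of zeroes, compressed with brotli at maximum quality
dd if=/dev/zero bs=1M count=10240 | brotli -q 11 -c > 10GB.br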

bss03@infosec.pub on 01 May 18:57 collapse

For scrapers that aren’t just speaking HTTP but are actually trying to extract zip files, you can possibly drive them insane with zip quines: github.com/ruvmello/zip-quine-generator or otherwise compressed files that contain themselves at some level of nesting, possibly with other data so that they recursively expand to an unbounded (“infinite”) size.

some_guy@lemmy.sdf.org on 30 Apr 01:56 next collapse

TIL why I’m gonna start learning more about bzip2. Thanks!

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:25 next collapse

Brotli gets it to 8.3K, and is supported in most browsers, so there’s a chance scrapers also support it.

Aceticon@lemmy.dbzer0.com on 30 Apr 10:43 collapse

Gzip encoding has been part of the HTTP protocol for a long time and every server-side HTTP library out there supports it, and phishing/scraper bots will be built with server-side libraries, not with browser engines.

Further, judging by the guy’s example in his article, he’s not using gzip with maximum compression when generating the zip bomb files: he’d need to add -9 to the gzip command line to get the best compression, though it will be slower. (I tested this and it made no difference at all.)

sugar_in_your_tea@sh.itjust.works on 30 Apr 13:08 collapse

You can make multiple files with different encodings and select based on the Accept-Encoding header.
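A sketch of that idea (file names are made up; the serving code, e.g. the PHP snippet above, would then pick whichever variant matches the client's Accept-Encoding and set Content-Encoding to match):

# one bomb per encoding the server might advertise; zstd support is still less common than gzip/br
dd if=/dev/zero bs=1M count=10240 | gzip -9 -c  > bomb.gz    # Content-Encoding: gzip
dd if=/dev/zero bs=1M count=10240 | brotli -c   > bomb.br    # Content-Encoding: br
dd if=/dev/zero bs=1M count=10240 | zstd -19 -c > bomb.zst   # Content-Encoding: zstd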

Aceticon@lemmy.dbzer0.com on 30 Apr 13:35 collapse

Yeah, good point.

I forgot about that.

Xanza@lemm.ee on 01 May 17:32 collapse

zstd is a significantly better option than anything else available unless you need something specific for a specific reason: github.com/facebook/zstd?tab=readme-ov-file#bench…

LZ4 is likely better than zstd, but it doesn’t have wide usability yet.

palordrolap@fedia.io on 01 May 17:46 collapse

You might be thinking of lzip rather than lz4. Both compress, but the former is meant for high compression whereas the latter is meant for speed. Neither are particularly good at dealing with highly redundant data though, if my testing is anything to go by.

Either way, none of those are installed as standard in my distro. xz (which is lzma based) is installed as standard but, like lzip, is slow, and zstd is still pretty new to some distros, so the recipient could conceivably not have that installed either.

bzip2 is ancient and almost always available at this point, which is why I figured it would be the best option to stand in for gzip.

As it turns out, the question was one of data streams not files, and as at least one other person pointed out, brotli is often available for streams where bzip2 isn't. That's also not installed by default as a command line tool, but it may well be that the recipient, while attempting to emulate a browser, might have actually installed it.

Xanza@lemm.ee on 01 May 20:45 collapse

No. github.com/lz4/lz4

LZ4 already has a caddy layer which interprets and compresses data streams for caddy: github.com/mholt/caddy-l4

It’s also very impressive.

mbirth@lemmy.ml on 30 Apr 00:01 next collapse

And if you want some customisation, e.g. some repeating string over and over, you can use something like this:

yes "b0M" | tr -d '\n' | head -c 10G | gzip -c > 10GB.gz

yes repeats the given string (followed by a line feed) indefinitely - originally meant to type “yes” + ENTER into prompts. tr then removes the line breaks again and head makes sure to only take 10GB and not have it run indefinitely.

If you want to be really fancy, you can even add an HTML header and footer by putting them in files (here called header and footer) and then running it like this:

yes "b0M" | tr -d '\n' | head -c 10G | cat header - footer | gzip -c > 10GB.gz
tal@lemmy.today on 30 Apr 00:54 next collapse

Anyone who writes a spider that’s going to inspect all the content out there is already going to have to have dealt with this, along with about a bazillion other kinds of oddball or bad data.

catloaf@lemm.ee on 30 Apr 02:54 next collapse

Competent ones, yes. Most developers aren’t competent, scraper writers even less so.

idriss@lemm.ee on 30 Apr 11:22 collapse

That’s true. Scraping is a gold mine for the people who don’t know. I worked for a place which crawls the internet and beyond (fetches some internal dumps we pay for). There is no chance a zip bomb would crash the workers, as there are strict timeouts and smell tests (even if one gets through, it will crash an ECS task at worst and we will be alerted to fix it within a short time). We were as honest as it gets though: following GDPR, honoring the robots file, no spiders or scanners allowed, only the home page to extract some insights.

I am aware of some big name EU non-software companies very interested in keeping an eye on some key things that are only possible with scraping.

lennivelkant@discuss.tchncs.de on 30 Apr 05:58 next collapse

That’s the usual case with arms races: unless you are yourself a major power, odds are you’ll never be able to fully stand up to one (at least not on your own, but let’s not stretch the metaphor too far). Often, the best you can do is deter other, minor powers and hope the major ones never have a serious intent to bring you down.

In this specific case, the number of potential minor “attackers” and the low hurdle for an “attack” make it attractive to at least try to overwhelm the amateurs. You’ll never get the pros; you just hope they don’t bother you too much.

YMS@discuss.tchncs.de on 01 May 11:03 collapse

If you have billions of targets to scan, there’s generally no need to handle each and every edge case. Just ignoring what you can’t understand easily and jumping on to the next target is an absolutely viable strategy. You will never be able to process everything anyway.

Of course, it changes a bit if some of these targets actually make your bot crash. If it happens too often, you will want to harden your bot against it. Then again, if it just happens every now and then, it’s still much easier to just restart and continue with the next target.

Bishma@discuss.tchncs.de on 30 Apr 01:08 next collapse

When I was serving high volume sites (that were targeted by scrapers) I had a collection of files in CDN that contained nothing but the word “no” over and over. Scrapers who barely hit our detection thresholds saw all their requests go to the 50M version. Super aggressive scrapers got the 10G version. And the scripts that just wouldn’t stop got the 50G version.

It didn’t move the needle on budget, but hopefully it cost them.
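Generating that kind of filler is trivial, by the way; a sketch (file names made up):

# the word "no" repeated until the target size is reached
yes no | tr -d '\n' | head -c 50M > no-50M.txt
yes no | tr -d '\n' | head -c 10G > no-10G.txt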

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:08 collapse

How do you tell scrapers from regular traffic?

Bishma@discuss.tchncs.de on 30 Apr 03:14 collapse

Most often because they don’t download any of the css or external js files from the pages they scrape. But there are a lot of other patterns you can detect once you have their traffic logs loaded in a time series database. I used an ELK stack back in the day.
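A very rough sketch of that first heuristic, assuming a combined-format access log where field 7 is the request path (log path and format vary by setup):

# list client IPs that requested pages but never fetched a single .css or .js asset
awk '$7 ~ /\.(css|js)/ { assets[$1]=1; next } { pages[$1]=1 }
  END { for (ip in pages) if (!(ip in assets)) print ip }' /var/log/nginx/access.log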

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:36 collapse

That sounds like a lot of effort. Are there any tools that get like 80% of the way there? Like something I could plug into Caddy, nginx, or haproxy?

Bishma@discuss.tchncs.de on 30 Apr 04:11 collapse

My experience is with systems that handle nearly 1000 pageviews per second. We did use a spread of haproxy servers to handle routing and SNI, but they were being fed offender lists by external analysis tools (built in-house).

sugar_in_your_tea@sh.itjust.works on 30 Apr 04:16 collapse

Dang, I was hoping for a FOSS project that would do most of the heavy lifting for me. Maybe such a thing exists, idk, but it would be pretty cool to have a pluggable system that analyzes activity and tags connections w/ some kind of identifier so I could configure a web server to either send it nonsense (i.e. poison AI scrapers), zip bombs (i.e. bots that aren’t respectful of resources), or redirect to a honey pot (i.e. malicious actors).

A quick search didn’t yield anything immediately, but I wasn’t that thorough. I’d be interested if anyone knows of such a project that’s pretty easy to play with.

ABasilPlant@lemmy.world on 30 Apr 05:25 collapse

Not exactly what you asked, but do you know about ufw-blocklist?

I’ve been using this on my multiple VPSes for some time now and the number of fail2ban failed/banned has gone down like crazy. Previously, I had 20k failed attempts after a few months and 30-50 currently-banned IPs at all times; now it’s less than 1k failed after a year and maybe 3-ish banned at any time.

There was also that paid service where users share their spammy IP address attempts with a centralized network, which does some dynamic intelligence monitoring. I forgot the name and search these days isn’t great. Something to do with “Sense”? It was paid, but well recommended as far as I remember.

Edit: seems like the keyword is “threat intelligence platform”

Kolanaki@pawb.social on 30 Apr 01:11 next collapse

How I read that code:

“If the dev folder’s bullshit is equal to 1 gram…”

cy_narrator@discuss.tchncs.de on 30 Apr 01:15 next collapse

First off, be very careful with bs=1G as it may overload the RAM. You will want to set count accordingly

sugar_in_your_tea@sh.itjust.works on 30 Apr 03:09 collapse

Yup, use something sensible like 10M or so.

cy_narrator@discuss.tchncs.de on 30 Apr 07:45 collapse

I would normally go much lower,

bs=4K count=262144 which creates 1G with 4K block size

deaddigger@lemm.ee on 30 Apr 03:07 next collapse

At least in Germany, having one of these on your system is illegal.

lka1988@lemmy.dbzer0.com on 30 Apr 04:20 next collapse

Maybe bots shouldn’t be trying to install malicious code? Sucks to suck.

lennivelkant@discuss.tchncs.de on 30 Apr 05:32 collapse

Still illegal. Not immoral, but a lot of our laws aren’t built on morality.

raltoid@lemmy.world on 30 Apr 05:02 next collapse

Illegal to publicly serve or distribute.

dzso@lemmy.world on 30 Apr 05:35 collapse

Out of curiosity, what is illegal about it, exactly?

deaddigger@lemm.ee on 30 Apr 06:10 collapse

I mean I am not a lawyer.

In Germany we have § 303b StGB. In short it says that if you hinder someone else’s data processing through physical means or malicious data, you can go to jail for up to 3 years. If it is a major process for someone, you can get up to 5 years, and in severe cases up to 10.

So if you have a zip bomb on your system and a crawler reads and unpacks it, you did two crimes. 1. You hindered that crawler’s data processing. 2. Some ISP nodes look into it and can crash too. If the ISP is pissed off enough you can go to jail for 5 years. This applies even if you didn’t crash them because they had protection against it, since trying it is also against the law.

Having a zip bomb is a gray area. Because trying to disrupt data processing is illegal, having a zip bomb can be considered trying; however, I am not aware of any judgement in this regard.

Edit: btw if you password-protect your zip bomb, everything is fine

barsoap@lemm.ee on 30 Apr 06:33 next collapse

Severely disrupting other people’s data processing of significant importance to them: doing it by submitting malicious data requires intent to cause harm; physical destruction, deletion, etc. doesn’t. This is about crashing people’s payroll systems, DDoSing, etc., not burning some CPU cycles and having a crawler subprocess crash with OOM.

Why the hell would an ISP look at this? And even if they did, they’re professional enough to detect zip bombs. Which, btw, is why this whole thing is pointless anyway: if you class requests as malicious, just don’t serve them. If that’s not enough, it’s much more sensible to go the Anubis route and demand proof of work, as that catches crawlers which come from a gazillion IPs with different user agents etc.

deaddigger@lemm.ee on 01 May 15:35 collapse

Telekom, for example, does deep packet inspection. That is rather well known. BEREC made a statement years ago that it is normal for other European ISPs too. Here is a secondary source for it, I can’t find the primary source anymore: netzpolitik.org/…/berec-studie-dpi-bei-vielen-pro…

Whether you are successful in disrupting some data processing doesn’t matter; trying to do it is illegal. If you put it there to disrupt crawlers, you are trying to disrupt an entity’s data processing.

If your ISP does DPI, an archive bomb is able to crash their servers. Even if they have measures against it, it is still illegal, because trying it is illegal.

barsoap@lemm.ee on 01 May 17:13 collapse

The intent is to get rid of crawlers which are disrupting the operation of your servers. That’s not intent to harm the crawler’s operator, or their business. It’s analogous to telling a hawker to fuck off: Polite, no, but them being able to profit off you is not your responsibility, you do not have to accede to that. And intent to harm the ISP is even less reasonable to assume.

cant find the primary source anymore netzpolitik.org/…/berec-studie-dpi-bei-vielen-pro…

That’s out of date anyway. How about this one. DPI is limited to OSI level 5 and only allowed to resolve network issues – and a crawler crashing is not a network issue.

deaddigger@lemm.ee on 01 May 17:20 collapse

That’s out of date anyway. How about this one.

Good to know

A crawler is a data processing machine, nothing more. Therefore you are disrupting data processing through data. If you think it’s not, that’s ok too. I would still advise contacting your lawyer in Germany if you are thinking about hosting a zip bomb.

barsoap@lemm.ee on 01 May 18:00 collapse

A crawler is a data processing machine, nothing more. Therefore you are disrupting data processing through data. If you think it’s not, that’s ok too.

Nah, it’s definitely disrupting data processing, even though at a very low-key level – you’re not causing any data to become invalid or such. It’s the intent to harm the operator that’s the linchpin: “Jemandem einen Nachteil zufügen” (“inflicting a disadvantage on someone”). “Jemand” (someone) needs to be a person, natural or legal. And by stopping a crawler you don’t want to inflict a disadvantage on the operator; you want to, at most, stop them from gaining an advantage. “Inflict disadvantage” and “prevent advantage” are two different things.

I would still advise to contact your lawyer in germany if you are thinking about hosting a zipbomb

Good idea, but as already said before: First, you should contact a sysadmin. Who will tell you it’s a stupid idea.

deaddigger@lemm.ee on 02 May 20:17 collapse

“Jemand” could be the owner of the company; further, (5) explicitly mentions companies, so this is kind of a void argument.

Who will tell you it’s a stupid idea.

I mean, I never argued against that. Like you already posted, Anubis is a way better option and not against German law afaik.

MimicJar@lemmy.world on 30 Apr 08:26 next collapse

I wonder if having a robots.txt file that said to ignore the file/path would help.

I’m assuming a bad bot would ignore the robots.txt file. So you could argue that you put up a clear sign and they chose to ignore it.

deaddigger@lemm.ee on 01 May 15:37 collapse

Good question, I don’t know tbh. Would be an interesting question for a lawyer influencer.

raltoid@lemmy.world on 30 Apr 18:10 collapse

TL;DR: It’s illegal to make one publicly available or to share it.

Creating one for research purposes on your own hardware is not illegal as far as I know. And if it is, I wouldn’t mind seeing someone challenge that with the EU.

deaddigger@lemm.ee on 01 May 15:43 collapse

For research purposes you could make it password protected, which would make it legal, though. Like I said, having one is a gray area, because the law is written extremely vaguely. I don’t know of any judgements about it, but it is still a possibility. If you live in Germany, are inclined to host an archive bomb, and care about your legal safety, contact a lawyer beforehand.

aesthelete@lemmy.world on 30 Apr 03:17 next collapse

This reminds me of shitty FTP sites with ratios when I was on dial-up. I used to push them files full of null characters with filenames that looked like actual content. The modem would compress the upload as it transmitted it which allowed me to upload the junk files at several times the rate of a normal file.

MeThisGuy@feddit.nl on 30 Apr 15:15 collapse

that is pretty darn clever

I use a torrent client that will lie about the upload (x10 or x11, or a myriad of other options) so as to satisfy the upload ratio requirement of many members-only torrent communities.

dwt@feddit.org on 30 Apr 05:14 next collapse

Sadly about the only thing that reliably helps against malicious crawlers is Anubis

anubis.techaro.lol

alehel@lemmy.zip on 30 Apr 05:45 next collapse

That URL is telling me “Invalid response”. Am I a bot?

doorknob88@lemmy.world on 30 Apr 07:02 next collapse

I’m sorry you had to find out this way.

sugar_in_your_tea@sh.itjust.works on 30 Apr 07:08 next collapse
MonkderVierte@lemmy.ml on 30 Apr 09:09 next collapse

You’re using a VPN, right?

Squizzy@lemmy.world on 30 Apr 10:21 next collapse

I’m not, and it gave an invalid response. I’m just chilling on my home wifi.

alehel@lemmy.zip on 30 Apr 14:02 collapse

Nope. Just using Vivaldi on my Android device.

L_Acacia@lemmy.ml on 30 Apr 09:27 next collapse

anubis.techaro.lol/…/known-broken-extensions

If you have JShelter installed, it breaks the proof of work from anubis

xavier666@lemm.ee on 30 Apr 11:03 collapse

Now you know why your mom spent so much time with the Amiga

LainTrain@lemmy.dbzer0.com on 30 Apr 22:01 next collapse

Neat

spicehoarder@lemm.ee on 01 May 20:13 collapse

I don’t really like this approach, not just because I was flagged as a bot, but because I don’t really like captchas. I swear I’m not a bot guys!

dwt@feddit.org on 02 May 05:08 collapse

That’s the reason I say ‘sadly’. It’s definitely not good. But since everything else fails, this is what currently remains.

moopet@sh.itjust.works on 30 Apr 08:43 next collapse

I’d be amazed if this works, since these sorts of tricks have been around since dinosaurs ruled the Earth. Most bots use pretty modern zip libraries which will just return “nope” or throw an exception, which will be treated exactly the same way as any corrupt file – for example a site saying it’s serving a zip file when the contents are a generic 404 html page, which is not uncommon.

Also, be careful because you could destroy your own device? What the hell? No. Unless you’re using dd backwards and as root, you can’t do anything bad, and even then it’s the drive contents you overwrite, not the device you “destroy”.

Lucien@mander.xyz on 01 May 12:02 next collapse

Yeah, this article came across as if written by a complete beginner. They mention having their WordPress hacked, but failed to admit it was because they didn’t upgrade the install.

namingthingsiseasy@programming.dev on 01 May 15:20 collapse

On the other hand, there are lots of bots scraping Wikipedia even though it’s easy to download the entire website as a single archive.

So they’re not really that smart…

[deleted] on 01 May 18:37 collapse

.

frozenpopsicle@lemmy.dbzer0.com on 30 Apr 09:39 next collapse

❤️

arc@lemm.ee on 30 Apr 09:42 next collapse

Probably only works for dumb bots and I’m guessing the big ones are resilient to this sort of thing.

Judging from recent stories the big threat is bots scraping for AIs and I wonder if there is a way to poison content so any AI ingesting it becomes dumber. e.g. text which is nonsensical or filled with counter information, trap phrases that reveal any AIs that ingested it, garbage pictures that purport to show something they don’t etc.

mostlikelyaperson@lemmy.world on 30 Apr 09:49 next collapse

There have been some attempts in that regard, I don’t remember the names of the projects, but there were one or two that’d basically generate a crapton of nonsense to do just that. No idea how well that works.

frezik@midwest.social on 30 Apr 12:56 next collapse

When it comes to attacks on the Internet, doing simple things to get rid of the stupid bots means kicking 90% of attacks out. No, it won’t work against a determined foe, but it does something useful.

Same goes for setting SSH to a random port. Logs are so much cleaner after doing that.

airgapped@piefed.social on 01 May 06:14 collapse

Setting a random SSH port and limiting it to 3/min saw failed login attempts fall by 99% and jailed IPs fall to 0.
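A minimal sketch of that kind of setup (the port number is just an example; the service name and firewall tooling vary by distro, and ufw's built-in limit rule is 6 hits per 30 seconds rather than exactly 3/min):

# in /etc/ssh/sshd_config, pick a non-default port, e.g.:
#   Port 38222
sudo systemctl reload sshd     # the unit is called "ssh" on Debian/Ubuntu
sudo ufw limit 38222/tcp       # allow, but rate-limit, new connections to that port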

WFloyd@lemmy.world on 01 May 18:17 collapse

I’ve found great success using a hardened ssh config with a limited set of supported Ciphers/MACs/KexAlgorithms. Nothing ever gets far enough to even trigger fail2ban. Then of course it’s key-only login from there.
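A sketch of what that can look like in /etc/ssh/sshd_config (the algorithm names below are common modern recommendations, not an authoritative list; check what your build supports with ssh -Q cipher, ssh -Q mac and ssh -Q kex):

# key-only login plus a restricted algorithm set
PasswordAuthentication no
PubkeyAuthentication yes
KexAlgorithms curve25519-sha256,curve25519-sha256@libssh.org
Ciphers chacha20-poly1305@openssh.com,aes256-gcm@openssh.com
MACs hmac-sha2-512-etm@openssh.com,hmac-sha2-256-etm@openssh.com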

echodot@feddit.uk on 30 Apr 13:25 next collapse

I don’t know about poisoning AI, but one thing that I used to do was redirect any suspicious bots, or ones that were hitting the server too much, to a simple html page with no JS or CSS or forward links. Then they used to go away.

delusion@lemmy.myserv.one on 01 May 09:42 next collapse
fmstrat@lemmy.nowsci.com on 30 Apr 11:17 next collapse

This is why I use things like Docusaurus to generate static sites. Vulnerability injections are pretty hard when there’s no code to inject into.

fmstrat@lemmy.nowsci.com on 30 Apr 11:21 next collapse

I’ve been thinking about making an nginx plugin that randomizes words on a page to poison AI scrapers.

some_guy@lemmy.sdf.org on 01 May 00:40 next collapse

If you have the time, I think it’s a great idea.

delusion@lemmy.myserv.one on 01 May 09:41 next collapse

zadzmo.org/code/nepenthes/

fmstrat@lemmy.nowsci.com on 01 May 13:39 collapse

That is a very interesting git repo. Is this just a web view into the actual git folder?

owsei@programming.dev on 01 May 11:30 collapse

There are “AI mazes” that do that.

I remember reading an article about this but haven’t found it yet.

corsicanguppy@lemmy.ca on 01 May 19:26 collapse

The one below, named Anubis, is the one I heard about. Come back to the thread and check the link.

billwashere@lemmy.world on 30 Apr 13:24 next collapse

I want to know how they built that visualization.

Treczoks@lemmy.world on 01 May 08:41 collapse

Have you ever heard of sparse files, and how Linux and Windows deal with zips of it? You’ll love this.