I chose clap as the argument parsing crate for my book on Rust CLI programming and it's been my biggest regret. Folks sometimes complain that Rust itself is a moving target, but I haven't found that to be the case at all. clap, on the other hand, has gone through several incompatible major releases over the past few years. Unfortunately, to keep up this forward momentum it also deprecates things, which meant I had to pin it to a quickly outdated version. If I ever go back and update the earlier chapters I'd pick a simpler, slower-moving library.
On the other hand, this situation is a great example of why keeping things like argument parsing out of the Rust standard library is such a good idea. It's much better to let the community gel around some crates which are able to iterate (and innovate) faster than something with strict backwards-compatibility requirements. Looking at the discussion here there's clearly not one way to solve this - there's no way that something as large and complex as clap has evolved into will ever become "standard".
structopt/Clap's derive magic is one of the first things I miss when I go to write some more-or-less trivial program in a non-Rust language these days. Being able to define all the data for a command line argument in one place (how/where to store it, what the type/valid input is, the association between the name and a variable/field, the documentation for --help...) seems like table stakes but afaict almost every other argument parsing library makes me repeat myself to the point where it takes all the joy out of writing a simple program.
I want to like docopt, but the fact that the only data types it supports are boolean and string (if you want anything else, you have to do another round of parsing and error checking) destroys a lot of the advantage of using a high-level library for handling command-line arguments.
People are used to the `click` way, where you define args as function parameters. It's a little more verbose, but it helps that click is a very established library which also provides many other things needed by CLI tools.
There's also `typer` from the creator of `fastapi` which relies on type annotations. I have not had the opportunity to use it.
In my opinion, clap is a textbook example of over-engineering for a single metric (UX) at the expense of all other considerations (compilation speed, runtime cost, binary size, auditability, and maintainability). It is an 18kloc command-line parser with an additional 125kloc of dependencies that takes nearly 6 seconds to compile (‘only’ 400ms for an incremental build) and which adds nearly 690KiB to an optimised release binary (‘only’ 430KiB if you strip out most of the sugar that only clap provides).
There are many other command-line parsers to choose from that do all the key things that clap does, with half or less the build cost, and most of them with 30x less binary overhead[0]. argh is under 4kloc. gumdrop is under 2kloc. pico-args is under 700loc. What is the value of that extra 10kloc? A 10% better parser?
I am not saying there is no room for a library like clap—it is, at least, a triumphant clown car of features that can handle practically any edge-case anyone ever thought of—but if I got a nickel every time I spent 15 minutes replacing a trivial use of clap with pico-args and thus reduced the binary size and compile time of some project by at least 80%, I would have at least three nickels.
Just to try to pre-empt arguments like “disk space is cheap”, “compiler time is cheaper than human time”, etc.: there are no golden bullets in engineering, only trade-offs. Why would you default to the biggest, slowest option? This is the “every web site must be able to scale like Facebook” type logic. You don’t even have to do more work to use argh or gumdrop. If clap ends up having some magic feature that no other parser has that you absolutely need, you can switch, but I’ve yet to ever encounter such a thing. Its inertia and popularity carry it forward, but it is perhaps the last choice you should pick for a new project—not the first.
You’re right that there are only trade-offs in engineering. But the key to evaluating trade-offs is evaluating impact, and how long my dependencies take to compile when I first check out a repo, or whether it takes 1ms or 2ms to parse my command line (if we’re even talking about something above microseconds), have no discernible impact for approximately all use-cases. If you’re making some odd CLI tool that has to run on a microcontroller with 1MB of RAM or something, fine, agonize about whether your command line parser is parsimonious enough. Otherwise you’ve abjectly failed to evaluate one of the most important trade-offs in engineering: whether something is even worth your time to think about.
> Otherwise you’ve abjectly failed to evaluate one of the most important trade-offs in engineering: whether something is even worth your time to think about.
Phew, several folks have replied to me about how it’s not worth the time thinking about these impacts at all, thus creating a paradox whereby more time has been spent thinking about writing about whether to think about it than has been spent in not thinking about it and just accepting that I wrote a reply on HN about how I feel there are more suitable command-line parsers than clap for most Rust projects! :-)
I agree that much of high-level engineering is knowing whether something is worth thinking about; in this case, I did the thinking already, and I am sharing what I know so others can benefit from (or ignore) my thinking and not have to do so much of their own. If my own personal anecdote of significantly reducing compile times (and binary sizes) by taking a few minutes to replace clap is insufficient, and if the aggregate of other problems I identified don’t matter to others, that’s alright. If reading my comment doesn’t make someone go “huh, I didn’t know {argh|gumdrop|pico-args} existed, clap does seem a little excessive now that you mention it, I will try one of these instead on my next project and see how it goes”, then I suppose they were not the target audience.
I don’t really want to keep engaging on this—as almost everyone (including me) seems to agree, command-line parser minutiae just aren’t that important—but I guess I will just conclude by saying that I believe that anchoring effects have led many programmers to consider any dependency smaller than, say, Electron to be not a big deal (and many think Electron’s fine too, obviously), whereas my experience has been that the difference between good and bad products usually hinges on many such ‘insignificant’ choices combining in aggregate.
Assuming whichever command-line parser one uses operates above a certain baseline—and I believe all of the argparse libraries in that benchmark do—it seems particularly silly to make wasteful choices here because this is such a small part of an application. Choosing wastefulness because it’s technically possible, then rationalising the waste by claiming it increases velocity/usability/scalability/whatever without actually verifying that claim because it’s ‘not worth thinking about’, seems more problematic to me than any spectre of premature or ‘unnecessary’ optimisation. I hope to find better ways to communicate this in future.
Hmm, isn't optimizing to save 690KiB in an optimised release binary, and to get incremental builds significantly under 400ms, actually much closer to the aforementioned "every web site must be able to scale like Facebook" type logic?
The “every website must scale like Facebook” mindset is premature optimization driven by hypothetical future needs, which is exactly what YAGNI advises against. But in your case, you’re investing time upfront to avoid a heavier dependency that already works and has no clear downside for the majority of users.
If you don’t actually need ultra-small binaries or sub-200ms compile times, then replacing Clap just in case seems like a textbook case of violating YAGNI rather than applying it.
> But in your case, you’re investing time upfront to avoid a heavier dependency
This is very confusing to me. What of this API[0], or this one[1], requires “investing time upfront”? With argh, you already know how to use all the basic features before you even start scrolling. These crates are all practically interchangeable already with how similarly they work.
It is only now that I look at clap’s documentation that I feel like I might understand this category of reply to my post. Why does clap need two tutorials and two references and a cookbook and an FAQ and major version migration guidelines? Are you just assuming that all other command-line parsers are as complicated and hard to use as clap?
Neither of those libraries provide cross-shell completions, or coloured output, or "did you mean" suggestions, or even just command aliases, all of which I would consider basic features in a modern CLI. So you need to invest more time to provide those features, whereas they just exist in clap.
That's not to say that clap is always better, but it is significantly more full-featured than the alternatives, and for a larger project, those features are likely to be important.
For a smaller project, like something you're just making for yourself, I can see why you'd go for a less full-featured option, but there's not enough difference between clap and, say, argh that I feel like I'd get much benefit out of argh. If you're really looking for something simple, just use lexopt or something like that, and write the help text by hand.
And you disregard user experience, and the experience of other developers, with your own custom parsing code.
Acting as if there's no trade-off whatsoever in your own decision, with a "my way is holier than thou" attitude to engineering, is beyond sad.
> I got a nickel every time I spent 15 minutes replacing a trivial use of clap with pico-args and thus reduced the binary size and compile time of some project by at least 80%, I would have at least three nickels.
Hahaha, awesome. Thanks for the pico-args recommendation.
It supports the bare minimum.
I sure would like deriving-style parsing and --help auto-generation.
I think deriving-style parsing unavoidably adds build time and complexity.
But it could be done without the dependency explosion.
Among the ones you recommend, argh supports deriving, auto-generates --help and optimizes for code size. And its syntax is very comparable to clap, so migrating is very easy. gumdrop seems very similar in its feature set (specifying help strings a little differently), but I can't find a defining feature for it.
That 690KB savings is 1/97000th of the RAM on the machine I develop and run most of my Rust software on.
If I ever encounter a single howto or blog post or Stack Overflow answer that tells me how to use Clap to do something 5 minutes more quickly than with an alternative, it’s paid for itself.
Amdahl’s Law says you can’t optimize a system by tweaking a single component and get more than that component’s total usage back. If Clap takes 1% of a program’s resources, optimizing that down to 0 will still use 99% of the original resources.
At this point you're just flexing that you have a 96GiB machine. (Average developer machines are more like 16GiB.)
But that's not the point. If every dependency follows the same philosophy, the costs (compile time, binary size, dependency supply chain) add up very quickly.
Not to mention, in big organizations, you have to track each 3rd party and transitive dependency you add to the codebase (for very good reasons).
I can write and have written hand-tuned assembly when every byte is sacred. That’s valuable in the right context. But that’s not the common case. In most situations, I’d rather spend those resources on code ergonomics, a flexible and heavily documented command line, and a widely used standard that other devs know how to use and contribute to.
And by proportion, that library would add an extra 0.7 bytes to a Commodore 64 program. I would have cheerfully “wasted” that much space for something a hundredth as nice as Clap.
I’ve worked in big organizations and been the one responsible for tracking dependencies, their licenses, and their vulnerable versions. No one does that by hand after a certain size. Snyk is as happy to track 1000 dependencies as 10.
96? It sounds more like 64 to me, which is probably above average but not exactly crazy. I've had 64 GB in my personal desktop for years, and most laptops I've used in the past 5 years or so for work have had 32 GB. If it takes up 1/4700 of memory, I don't think it changes things much. Plus, argument parsing tends to be done right at the beginning of the program and completely unused again by the time anything else happens, so even if the parsing itself is inefficient, it seems like maybe the least worrisome place I could imagine to optimize for developer efficiency over performance.
I could understand this if we were talking about JavaScript CLIs that require GBs of dependencies. But 690KiB is a drop in the ocean for modern computing. It is not something you should base a decision on unless you are doing embedded programming.
690KiB would be a fair compromise if Clap provided, for example, better performance or better code readability and organization. But the benchmarks you provided show the performance is practically the same, close to using no library at all.
I did do a bit of CLI work (I try to maintain https://github.com/rust-starter/rust-starter) and will always pick clap. It just happens that even for the simplest CLIs out there, things can get hairy really fast. A good typed interface that lets me define my args as types and get extras on the fly (e.g. shell completion code) is worth the less-than-1MiB overhead.
> Why would you default to the biggest, slowest option?
Because it's not very big, nor very slow. Why wouldn't you default to the most full-featured option when its performance and space usage is adequate for the overwhelming majority of cases?
> Why wouldn't you default to the most full-featured option when its performance and space usage is adequate for the overwhelming majority of cases?
This is the logic of buying a Ford F-150 to drive your kids to school and to commute to the office because you might someday need to maybe haul some wood from the home improvement store once. The compact sedan is the obviously practical choice, but it can’t haul the wood, and you can afford the giant truck, so why not?
> This is the logic of buying a Ford F-150 to drive your kids to school and to commute to the office because you might someday need to maybe haul some wood from the home improvement store once.
No, it's like buying the standard off the shelf $5 backpack instead of the special handmade tiny backpack that you can just barely squeeze your current netbook into. Yes, maybe it's a little bigger than you need, maybe you're wasting some space. But it's really not worth the time worrying about it.
If using clap would take up a significant fraction of your memory/disk/whatever budget then of course investigate alternatives. But sacrificing usability to switch to something that takes up 0.000000001% of available disk space instead of 0.0000001% is a false economy, the opposite of practical; it feels like a sister phenomenon to https://paulgraham.com/selfindulgence.html .
Well you hit the nail on the proverbial head. The compact will handle 99% of people's use-cases, the truck will handle 100%. People don't want the hassle of renting something or paying for help for the 1% of cases their compact wouldn't handle.
Believe it or not, I'm with you; I live somewhere where it's sunny all year round, so I get around with a motorcycle as my primary transportation year-round and evangelize them as the cheap alternative to people struggling with car-related payments. But no, my motorcycle isn't going to carry a 2x4. Someone who cares about supporting that, even if they only need to do so exceptionally rarely, is gonna buy a truck. And then they won't have the money to buy a motorcycle on the side.
Not sure why you’re being downvoted. I also don’t like oversized motor vehicles, but I think the parable is sound:
If the effort of switching out when you need the last 1% is higher than whatever premium you will pay (compilation time/fuel cost) - especially as a small ongoing cost, people will likely choose it.
I’m not saying this as if it’s wisdom about the future, only that we can observe it today with at least a handful of examples.
Maybe. If your build is too slow, fix it. But pre-emptively microoptimising your build time is as bad as pre-emptively microoptimising anything else. Set a build time budget and don't worry about it unless and until you see a risk of exceeding that budget.
According to that link you posted, many of the other argument parsers don't even generate help, only one other offers multiple interfaces, and none of the others are noted as having color or suggested fixes.
These aren't exactly esoteric features, and you're not going to get them for free. I'm happy to pay in space and compile time for clap to gain those features.
This isn't a case of the commandline app needing to be facebook, but rather putting the exponential gains we've made in storage space to good use providing features which should be table stakes at this point.
I like clap a lot, however I find that clap derive is not very easily discoverable. I always have to google for the right macro incantation to get what I want. Whereas editor completions from rust analyzer get me quite far without needing to leave my editor when I'm just using an ordinary library.
I think this is more a criticism of rust-analyzer than of clap itself; I have similar issues with any macro-heavy library.
(Yes I know clap can be used without derive, but I'm willing to deal with the pain to parse directly into a struct)
I hope you don't mind me plugging my thing here, but I had the 100% same problem and made aargvark (https://docs.rs/aargvark/latest/aargvark/). When I was using clap, I'd constantly need to look up how to do X, or what combination of things I needed to put an enum here, or find out that some nesting of data types wasn't supported, etc.
It's still derive macro-based, but there's only one derive (`Aargvark`) rather than `Parser`, `Subcommand`, etc, and it can handle any data structure composition orthogonally (although crazy structures may result in fairly awkward command lines).
FYI, maybe useful for you and other readers: my key to really understanding clap's derive macro was realizing that the macro takes as arguments all of the methods of clap's Command struct: https://docs.rs/clap/latest/clap/struct.Command.html
Looking at this resolved most of my issues about discoverability.
I feel like there's a sweet spot for complexity that the derive macro hits pretty well. When things get more complex it can feel like a maze, but below that complexity it's pretty nice.
PowerShell has everything related to argument parsing, helptext generation, tab completion, type coercion, validation etc. built-in with a declarative DSL. Many things work even directly OOTB, without requiring special annotations.
It is by far the nicest way to create CLI tools I have ever seen. Every other shell or commandline parsing library I ever tried, feels extremely clunky in comparison.
Nice write up. “Good” CLI semantics are pretty devilish, and overall I think clap does a pretty great job of picking the right (or at least most intuitive) behavior.
(One edge case that consistently trips me up, which other argument parsers similarly struggle with: an environment variable fallback has the same “weight” as its option counterpart, so any CLI that makes use of grouping/exclusivity will eventually hit user confusions where the user passes `--exclusive` and gets a failure because of an unrelated environment variable.)
The argument / environment variable duality is a challenging one, especially when developing server software that should take security into account, where you don't want to encourage users to put secrets into scripts. Do you end up with some items that can only be entered via one mechanism or the other? Maybe that's where the fun of being a developer comes in: making those choices.
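One way to give the two sources distinct "weights" is to resolve the environment fallback yourself after parsing, rather than letting the parser treat both identically. A pure-std sketch (the names here are made up):

```rust
use std::env;

/// Where a configuration value came from, so later checks (e.g.
/// exclusivity rules) can treat an explicit flag differently from
/// an environment fallback.
#[derive(Debug, PartialEq)]
enum Source {
    Flag,
    Env,
    Default,
}

/// Precedence: explicit CLI flag > environment variable > default.
fn resolve(flag: Option<String>, env_key: &str, default: &str) -> (String, Source) {
    if let Some(v) = flag {
        (v, Source::Flag)
    } else if let Ok(v) = env::var(env_key) {
        (v, Source::Env)
    } else {
        (default.to_string(), Source::Default)
    }
}
```

Because the returned `Source` records how the value arrived, an exclusivity check can ignore env-derived values, avoiding the confusing failure described in the parent comment.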
10kloc for command line parsing. TEN THOUSAND LINES. pico-args does it in 700 lines and probably handles 99% of real world use cases. Compile times go to shit and binary size bloats, all for some edge case you'll never hit. Most CLI tools need what, three or four flags max, maybe a subcommand or two? You don't need the swiss army knife of argument parsing for that. I tried replacing clap with pico-args on three different projects last month: 80% reduction in compile time every single time, and the binary went from 8MB to 2MB on one of them. The "disk space is cheap" argument is partially acceptable, but compile time isn't. Developer experience isn't. Startup time isn't. Memory usage isn't.
> No help generation
> Only flags, options, free arguments and subcommands are supported
> A properer parser would knew that --arg2 is a key and will return an error, since the value is missing.
> If such behavior is unacceptable to your application, then you have to use a more high-level arguments parsing library.
Yeah, no thank you. If we're talking about 700 LOC, I'm just going to write it myself rather than take on a dependency that won't even describe itself as a proper enough parser. This argument parser doesn't even handle the tedium of generating a help message for the user, and doesn't really parse the arguments -- what's the purpose of using it to do the argument parsing then?
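For the handful-of-flags case, "writing it myself" really can stay small. A pure-std sketch of a bespoke parser (the flag names are illustrative):

```rust
#[derive(Debug, Default)]
struct Opts {
    verbose: bool,
    count: u32,
    inputs: Vec<String>,
}

// Hand-rolled parsing: flags, one valued option, free arguments,
// with errors for unknown options and missing values, plus a
// hand-written help message.
fn parse_args<I: IntoIterator<Item = String>>(args: I) -> Result<Opts, String> {
    let mut opts = Opts { count: 1, ..Default::default() };
    let mut iter = args.into_iter();
    while let Some(arg) = iter.next() {
        match arg.as_str() {
            "-v" | "--verbose" => opts.verbose = true,
            "--count" => {
                let v = iter.next().ok_or("--count requires a value")?;
                opts.count = v.parse().map_err(|_| format!("invalid count: {v}"))?;
            }
            "-h" | "--help" => {
                println!("usage: demo [-v] [--count N] FILE...");
                std::process::exit(0);
            }
            s if s.starts_with('-') => return Err(format!("unknown option: {s}")),
            _ => opts.inputs.push(arg),
        }
    }
    Ok(opts)
}
```

Unlike the pico-args behaviour quoted above, this rejects unknown options and missing values outright, which is most of what "a properer parser" needs for a tool with three or four flags.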
So 700 LOC gets us a mediocre argument parser with no features. What do you get for an additional 9300 LOC? A "properer" parser (dev and user experience+). Help generation (dev experience+). Multiple interfaces (dev experience+). Color terminal output (user experience+). Suggested completions (user experience+).
Is it worth it? I dunno that's a per-project choice. If you absolutely need the smallest footprint and compile times possible, probably you don't want to go with clap. You also probably don't want to go with Rust.
I have used clap to build perspt (https://github.com/eonseed/perspt). The project has extensive documentation on how it was built, as we did it as a learning exercise.
Something I’ve been working on recently is a command line tool [1] to bring clap declarative command line parsing to shell scripts. Unfinished WIP but largely functional.
… I don't want every program to attempt to implement argument parsing; bespoke implementations will be lower quality than clap. Not reinventing an argument parser is an extremely reasonably dependency to take.
Clap, on the non-derive side, has approximately two dependencies: anstream/anstyle (for terminal coloring, another thing that sounds deceptively simple at first pass, if you think all the world is a VT100, but really isn't; this is a reasonable dep. for a CLI arg parser) and strsim (string similarity, again, a reasonable dep for a CLI arg parser…). And that's it¹ for clap's direct deps.
(¹I'm omitting clap_lex, as an "internal" dep.)
On the derive side, there's the usual proc-macro2/quote/syn trio, but those come up frequently in derive crates due to what they do, and other than that, there's just `heck`, which is again an obvious dependency in context.
… what is so quizzical to me about the "so many dependencies!" complaint is that when we do get examples like this, they're almost always on crates that bottle up genuinely tricky functionality (like what we see here) — exactly the sort of thing that a.) is hard to get right and b.) isn't relevant to the problem I want to solve. That's like "absolutely this is a dependency" central, to me…
This shouldn't even need terminal coloring, in fact that sounds annoying because it's going to have to behave differently if you pipe it to less (or it's going to do something dumb like the rust compiler itself and just reopen the tty.)
This actually reminds me of my other issue with this kind of "oh we just get it for free" attitude that tends to result in overbuilding things that I also dislike in rust.
No I think people would be better off with a bespoke option parser actually.
1. `color` feature and thus the `anstream` dep is optional.
2. Even if you use it, it handles all the behaviour correctly regarding the piping and no color support, which is why it is a dependency in the first place.
Why is needing to behave differently when you pipe annoying? Are you saying it doesn't work? But also FWIW I don't think piping command help output is a common use case.
It's useful if your terminal emulator isn't very good and the scrollback buffer doesn't work. This can happen for any number of reasons.
Or maybe I don't feel like using the mouse, or I want to do something like grep it. There are an unlimited number of reasons I might want that, that's how interfaces like these work.
I agree with you that this is a ridiculous criticism to level against clap specifically.
But I also share the same overall sentiment. Every moderately sized rust project I've worked on has quite a lot of transitive deps, and that makes me a little bit nervous.
> On the derive side, there's the usual proc-macro2/quote/syn trio, but those come up frequently in derive crates due to what they do, and other than that, there's just `heck`, which is again an obvious dependency in context.
They're common dependencies, sure, but not necessarily the same versions. So, yeah, it's entirely possible you'll end up building multiple versions of quote/syn.
If all the code was crammed into the std library it'd be fine?
Functions need to build on top of simpler functions to be able to abstract problems and tackle them one at a time. There's innate complexity around and without trying to tame it into smaller functions/packages it seems you'll end up in a worse spot.
Not OP, but more code in stdlib does indeed sound better.
I'm not against abstraction and re-use. What I don't like is that for every given thing I want to do, there are multiple crates that offer the same functionality, and it can be really fatiguing trying to vet them. And it is truly a rarity to find a crate that is past the 1.0 version milestone.
Compare to golang for example. You can get quite far in go without needing to pull in any libraries. But in rust you need a library for even a basic http request.
Trying to get everything into std requires a lot of work that I think gets harder when considering Rust's goals of being low level and paying only for what you use. Not having finalised some aspects of the language itself would also slow this down even more.
I'd rather have libraries built with more freedom and the possibility of having experimental stuff around meanwhile the std worthy solution lands, and if things work fine without them in the standard library then it makes sense to keep them out.
Rust may be lacking an easier way to shop for recommended libraries for common problems. There should be a path to discover all the good and best libraries for each problem. crates.io takes a stab at having this information, but I think more handholding and some sort of community seal of approval is needed.
I'm not the OP here, but I did write the blog post. I have mixed thoughts on how much should be part of std. I think that what we're seeing is a class of library where we could see a new level introduced to open languages like rust that sits somewhere above std, but is maintained and blessed by a broader group than just the nebulous community. Rust's separation of std from no-std is a good example of tiers that I think we could see evolve. Another example in the same category as clap would be serde and some of the specific serde implementations like serde-yaml that is now in a really painful unmaintained/forked status. Maybe these are things to push for more broadly in the rust community.
I think Rust's std really wants to be just the runtime. Most language stds are runtime + common utilities and then often there's an extra set of common environment libraries with varying levels of distinction.
If you got a progress bar, website, and dependency tree for every #include <argp.h>, <stdio.h>, <sstream>, or <curl.h>, it'd feel pretty crazy too. Imagine if `make` went out and pulled the latest upstream changes for `pthreads` every time one of your dependencies used it. For C++, imagine it pulling and building boost, or abseil.
C#? The entire mono/.net toolchain and system/ FFI libraries.
Imagine if we had "dot-h.io" that tracked how many separate C projects used argp. Laughable! Millions!
Every language has gobs of dependencies. So many dependencies it'd make you sick. Thousands upon thousands of lines of code, just to make something that runs on one target and says "Hello world" to the screen. Hell, some languages require you to run a runtime on your operating system that runs on real hardware _just to launch those thousands of lines of code_. And those are written using curl.h, pthreads.h, etc etc (or similar). Bananas!
At least those with package managers allow you to see it, audit it, update it seamlessly.
The compile time guarantees + declarative nature make Clap so amazing and foolproof. This is like heaven compared to imperative, arcane incantations like getopt.
(Python's docopt is also amazing, fwiw)
Not sure if it's on par with Clap, but for Python I don't see enough people talk about SimpleParsing: https://github.com/lebrice/SimpleParsing
It has quirks once you try to do something more complex/advanced, but for most of the simple stuff it's very nice to use.
In Python you can use pydantic to create a cli:
https://docs.pydantic.dev/latest/concepts/pydantic_settings/...
Nice. I had made something similar (but less featureful), funnily enough also for an ML training script usecase. Here it is in a gist:
https://gist.github.com/porridgewithraisins/313a26ee3b827f73...
I love the ergonomics of this method, and I was going to improve it to support subcommands, etc, but now I think I will use the library you posted.
Docopt is great! http://docopt.org/
There are implementations for other languages, too.
TIL about structopt, thanks.
In my opinion, clap is a textbook example of over-engineering for a single metric (UX) at the expense of all other considerations (compilation speed, runtime cost, binary size, auditability, and maintainability). It is an 18kloc command-line parser with an additional 125kloc of dependencies that takes nearly 6 seconds to compile (‘only’ 400ms for an incremental build) and which adds nearly 690KiB to an optimised release binary (‘only’ 430KiB if you strip out most of the sugar that only clap provides).
There are many other command-line parsers to choose from that do all the key things that clap does, with half or less the build cost, and most of them with 30x less binary overhead[0]. argh is under 4kloc. gumdrop is under 2kloc. pico-args is under 700loc. What is the value of that extra 10kloc? A 10% better parser?
I am not saying there is no room for a library like clap—it is, at least, a triumphant clown car of features that can handle practically any edge-case anyone ever thought of—but if I got a nickel every time I spent 15 minutes replacing a trivial use of clap with pico-args and thus reduced the binary size and compile time of some project by at least 80%, I would have at least three nickels.
Just to try to pre-empt arguments like “disk space is cheap”, “compiler time is cheaper than human time”, etc.: there are no silver bullets in engineering, only trade-offs. Why would you default to the biggest, slowest option? This is the “every web site must be able to scale like Facebook” type logic. You don’t even have to do more work to use argh or gumdrop. If clap ends up having some magic feature that no other parser has that you absolutely need, you can switch, but I’ve yet to ever encounter such a thing. Its inertia and popularity carry it forward, but it is perhaps the last choice you should pick for a new project—not the first.
[0] https://github.com/rosetta-rs/argparse-rosetta-rs
You’re right that there are only trade-offs in engineering. But the key to evaluating trade-offs is evaluating impact, and how long my dependencies take to compile when I first check out a repo or whether it takes 1ms or 2ms to parse my command line (if we’re even talking about something above microseconds) have no discernible impact for approximately all use-cases. If you’re making some odd CLI tool that has to run on a microcontroller with 1MB of RAM or something, fine, agonize about whether your command line parser is parsimonious enough. Otherwise you’ve abjectly failed to evaluate one of the most important trade-offs in engineering: whether something is even worth your time to think about.
> Otherwise you’ve abjectly failed to evaluate one of the most important trade-offs in engineering: whether something is even worth your time to think about.
Phew, several folks have replied to me about how it’s not worth the time thinking about these impacts at all, thus creating a paradox whereby more time has been spent thinking about writing about whether to think about it than has been spent in not thinking about it and just accepting that I wrote a reply on HN about how I feel there are more suitable command-line parsers than clap for most Rust projects! :-)
I agree that much of high-level engineering is knowing whether something is worth thinking about; in this case, I did the thinking already, and I am sharing what I know so others can benefit from (or ignore) my thinking and not have to do so much of their own. If my own personal anecdote of significantly reducing compile times (and binary sizes) by taking a few minutes to replace clap is insufficient, and if the aggregate of other problems I identified don’t matter to others, that’s alright. If reading my comment doesn’t make someone go “huh, I didn’t know {argh|gumdrop|pico-args} existed, clap does seem a little excessive now that you mention it, I will try one of these instead on my next project and see how it goes”, then I suppose they were not the target audience.
I don’t really want to keep engaging on this—as almost everyone (including me) seems to agree, command-line parser minutiae just aren’t that important—but I guess I will just conclude by saying that I believe that anchoring effects have led many programmers to consider any dependency smaller than, say, Electron to be not a big deal (and many think Electron’s fine too, obviously), whereas my experience has been that the difference between good and bad products usually hinges on many such ‘insignificant’ choices combining in aggregate.
Assuming whichever command-line parser one uses operates above a certain baseline—and I believe all of the argparse libraries in that benchmark do—it seems particularly silly to make wasteful choices here because this is such a small part of an application. Choosing wastefulness because it’s technically possible, then rationalising the waste by claiming it increases velocity/usability/scalability/whatever without actually verifying that claim because it’s ‘not worth thinking about’, seems more problematic to me than any spectre of premature or ‘unnecessary’ optimisation. I hope to find better ways to communicate this in future.
Hmm, isn't optimizing to save 690KiB in an optimised release binary, and getting incremental builds significantly below 400ms, actually much closer to the aforementioned "every web site must be able to scale like Facebook" type logic?
No, it is following the principle of YAGNI.
The “every website must scale like Facebook” mindset is premature optimization driven by hypothetical future needs, exactly what YAGNI advises against. But in your case, you’re investing time upfront to avoid a heavier dependency that already works and has no clear downside for the majority of users.
If you don’t actually need ultra-small binaries or sub-200ms compile times, then replacing Clap just in case seems like a textbook case of violating YAGNI rather than applying it.
> But in your case, you’re investing time upfront to avoid a heavier dependency
This is very confusing to me. What of this API[0], or this one[1], requires “investing time upfront”? With argh, you already know how to use all the basic features before you even start scrolling. These crates are all practically interchangeable already with how similarly they work.
It is only now that I look at clap’s documentation that I feel like I might understand this category of reply to my post. Why does clap need two tutorials and two references and a cookbook and an FAQ and major version migration guidelines? Are you just assuming that all other command-line parsers are as complicated and hard to use as clap?
[0] https://docs.rs/argh/latest/argh/
[1] https://docs.rs/gumdrop/latest/gumdrop/
Neither of those libraries provide cross-shell completions, or coloured output, or "did you mean" suggestions, or even just command aliases, all of which I would consider basic features in a modern CLI. So you need to invest more time to provide those features, whereas they just exist in clap.
That's not to say that clap is always better, but it is significantly more full-featured than the alternatives, and for a larger project, those features are likely to be important.
For a smaller project, like something you're just making for yourself, I can see why you'd go for a less full-featured option, but there's not much difference between clap and, say, argh, so I don't feel like I'd get much benefit out of argh. If you're really looking for something simple, just use lexopt or something like that, and write the help text by hand.
Rust invites serious disregard for resources and time. Sadly, many will accept that invitation. But some won't.
> disregard for resources and time
There is a tradeoff between compile time and running time.
This matters for programs that run more often than they get compiled.
And you disregard user experience and other developers' experience with your own custom parsing code. Acting as if there's no trade-off whatsoever in your own decision, while taking a holier-than-thou attitude about engineering, is beyond sad.
> I got a nickel every time I spent 15 minutes replacing a trivial use of clap with pico-args and thus reduced the binary size and compile time of some project by at least 80%, I would have at least three nickels.
Hahaha, awesome. Thanks for the pico-args recommendation.
It supports the bare minimum.
I sure would like deriving-style parsing and --help auto-generation.
I think deriving-style parsing unavoidably adds build time and complexity.
But it could be done without the dependency explosion.
There's a list of options here:
https://github.com/rosetta-rs/argparse-rosetta-rs#rust-arg-p...
Among the ones you recommend, argh supports deriving, auto-generates --help and optimizes for code size. And its syntax is very comparable to clap, so migrating is very easy. gumdrop seems very similar in its feature set (specifying help strings a little differently), but I can't find a defining feature for it.
I am wondering how much of this can be mitigated by carefully designing feature flags, and make default feature set small.
That 690KB savings is 1/97000th of the RAM on the machine I develop and run most of my Rust software on.
If I ever encounter a single howto or blog post or Stack Overflow answer that tells me how to use Clap to do something 5 minutes more quickly than with an alternative, it’s paid for itself.
Amdahl’s Law says you can’t optimize a system by tweaking a single component and get more than that component’s total usage back. If Clap takes 1% of a program’s resources, optimizing that down to 0 will still use 99% of the original resources.
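The arithmetic behind that claim, as a small sketch: by Amdahl's law, if a component accounts for fraction `p` of total resource usage and you speed it up by factor `s`, the overall speedup is `1 / ((1 - p) + p / s)`. Even eliminating the component entirely (the `s → ∞` limit) caps the gain at `1 / (1 - p)`:

```rust
// Amdahl's law: overall speedup when a fraction `p` of the total cost
// is sped up by a factor of `s`. Eliminating the component entirely is
// the s -> infinity limit, which leaves a ceiling of 1 / (1 - p).
fn amdahl(p: f64, s: f64) -> f64 {
    1.0 / ((1.0 - p) + p / s)
}

fn main() {
    // If argument parsing is 1% of the program's resources, optimising
    // it away entirely buys at most ~1.0101x overall.
    let ceiling = amdahl(0.01, f64::INFINITY);
    println!("best possible overall speedup: {ceiling:.4}x");
}
```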
It’s just not worth it.
At this point you're just flexing that you have a 96GiB machine. (Average developer machines are more like 16GiB.)
But that's not the point. If every dependency follows the same philosophy, the costs (compile time, binary size, dependency supply chain) add up very quickly.
Not to mention, in big organizations, you have to track each 3rd party and transitive dependency you add to the codebase (for very good reasons).
I can write and have written hand-tuned assembly when every byte is sacred. That’s valuable in the right context. But that’s not the common case. In most situations, I’d rather spend those resources on code ergonomics, a flexible and heavily documented command line, and a widely used standard that other devs know how to use and contribute to.
And by proportion, that library would add an extra .7 bytes to a Commodore 64 program. I would have cheerfully “wasted” that much space for something 100th as nice as Clap.
I’ve worked in big organizations and been the one responsible for tracking dependencies, their licenses, and their vulnerable versions. No one does that by hand after a certain size. Snyk is as happy to track 1000 dependencies as 10.
> No one does that by hand after a certain size
This is not true
96? It sounds more like 64 to me, which is probably above average but not exactly crazy. I've had 64 GB in my personal desktop for years, and most laptops I've used in the past 5 years or so for work have had 32 GB. If it takes up 1/4700 of memory, I don't think it changes things much. Plus, argument parsing tends to be done right at the beginning of the program and completely unused again by the time anything else happens, so even if the parsing itself is inefficient, it seems like maybe the least worrisome place I could imagine to optimize for developer efficiency over performance.
Convenience always wins. If we want smaller, more purposefully built dependencies, then we need better tooling that makes those choices convenient.
I could understand this if we were talking about JavaScript CLIs that require GBs of dependencies. But 690KiB is a drop in the ocean for modern computing. It is not something you should base a decision on unless you are doing embedded programming.
690KiB would be a fair compromise if clap provided, for example, better performance or better code readability and organization. But the benchmarks you provided show the performance is practically the same, close to using no library at all.
I did do a bit of CLI work (I try to maintain https://github.com/rust-starter/rust-starter) and will always pick clap. It just happens that even for the simplest CLIs out there, things can get hairy really fast. A good typed interface that lets me define my args as types and get extras on the fly (e.g. shell completion code) is worth the less-than-1MiB overhead.
> Why would you default to the biggest, slowest option?
Because it's not very big, nor very slow. Why wouldn't you default to the most full-featured option when its performance and space usage is adequate for the overwhelming majority of cases?
> Why wouldn't you default to the most full-featured option when its performance and space usage is adequate for the overwhelming majority of cases?
This is the logic of buying a Ford F-150 to drive your kids to school and to commute to the office because you might someday need to maybe haul some wood from the home improvement store once. The compact sedan is the obviously practical choice, but it can’t haul the wood, and you can afford the giant truck, so why not?
> This is the logic of buying a Ford F-150 to drive your kids to school and to commute to the office because you might someday need to maybe haul some wood from the home improvement store once.
No, it's like buying the standard off the shelf $5 backpack instead of the special handmade tiny backpack that you can just barely squeeze your current netbook into. Yes, maybe it's a little bigger than you need, maybe you're wasting some space. But it's really not worth the time worrying about it.
If using clap would take up a significant fraction of your memory/disk/whatever budget then of course investigate alternatives. But sacrificing usability to switch to something that takes up 0.000000001% of available disk space instead of 0.0000001% is a false economy, the opposite of practical; it feels like a sister phenomenon to https://paulgraham.com/selfindulgence.html .
Well, you hit the nail on the proverbial head. The compact will handle 99% of people's use-cases; the truck will handle 100%. People don't want the hassle of renting something or paying for help for the 1% of cases their compact wouldn't handle.
Believe it or not, I'm with you; I live somewhere where it's sunny all year round, so I get around with a motorcycle as my primary transportation year-round and evangelize them as the cheap alternative to people struggling with car-related payments. But no, my motorcycle isn't going to carry a 2x4. Someone who cares about supporting that, even if they only need to do so exceptionally rarely, is gonna buy a truck. And then they won't have the money to buy a motorcycle on the side.
Not sure why you’re being downvoted. I also don’t like oversized motor vehicles, but I think the parable is sound:
If the effort of switching out when you need the last 1% is higher than whatever premium you will pay (compilation time/fuel cost) - especially as a small ongoing cost, people will likely choose it.
I’m not saying this as if its wisdom into the future, only in that we can observe it today with at least a handful of examples.
These decisions accumulate then all of a sudden you have a project that takes ten minutes to build for almost no benefit.
Maybe. If your build is too slow, fix it. But pre-emptively microoptimising your build time is as bad as pre-emptively microoptimising anything else. Set a build time budget and don't worry about it unless and until you see a risk of exceeding that budget.
According to that link you posted, many of the other argument parsers don't even generate help, only one other offers multiple interfaces, and none of the others are noted as having color or suggested fixes.
These aren't exactly esoteric features, and you're not going to get them for free. I'm happy to pay in space and compile time for clap to gain those features.
This isn't a case of the commandline app needing to be facebook, but rather putting the exponential gains we've made in storage space to good use providing features which should be table stakes at this point.
Clap also has great dev UX, so I wouldn't count maintainability as an expense.
I like clap a lot, however I find that clap derive is not very easily discoverable. I always have to google for the right macro incantation to get what I want. Whereas editor completions from rust analyzer get me quite far without needing to leave my editor when I'm just using an ordinary library.
I think this is more a criticism of rust-analyzer than clap itself, any macro-heavy library I have similar issues with.
(Yes I know clap can be used without derive, but I'm willing to deal with the pain to parse directly into a struct)
I hope you don't mind me plugging my thing here, but I had the 100% same problem and made aargvark (https://docs.rs/aargvark/latest/aargvark/). When I was using clap, every time I'd need to look up how to do X, or what combination of things I needed to put an enum here, or find out that this nesting of data types wasn't supported, etc.
It's still derive macro-based, but there's only one derive (`Aargvark`) rather than `Parser`, `Subcommand`, etc, and it can handle any data structure composition orthogonally (although crazy structures may result in fairly awkward command lines).
FYI, maybe for you and other readers: my key to really understanding clap's derive macro was realising that the macro takes as arguments all the methods of clap's Command struct: https://docs.rs/clap/latest/clap/struct.Command.html
Looking at this resolved most of my issues about discoverability.
I feel like there's a sweet spot for complexity that the derive macro hits pretty well. When things get more complex it can feel like a maze, but below that complexity it's pretty nice.
PowerShell has everything related to argument parsing, helptext generation, tab completion, type coercion, validation etc. built-in with a declarative DSL. Many things work even directly OOTB, without requiring special annotations.
It is by far the nicest way to create CLI tools I have ever seen. Every other shell or commandline parsing library I ever tried, feels extremely clunky in comparison.
https://learn.microsoft.com/en-us/powershell/module/microsof...
Nice write up. “Good” CLI semantics are pretty devilish, and overall I think clap does a pretty great job of picking the right (or at least most intuitive) behavior.
(One edge case that consistently trips me up, which other argument parsers similarly struggle with: an environment variable fallback has the same “weight” as its option counterpart, so any CLI that makes use of grouping/exclusivity will eventually hit user confusions where the user passes `--exclusive` and gets a failure because of an unrelated environment variable.)
The argument / environment variable duality is a challenging one, especially when developing server software that should take security into account where you don't want to encourage users to put secrets into scripts. Do you end up with some items that can only be entered via one mechanism or another? Maybe that's where the fun of being a developer comes in is making those choices.
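One common way out, sketched here with std only (the setting name, `Source` enum, and `resolve` helper are all hypothetical): resolve each setting through an explicit precedence chain so a passed flag always outranks an environment variable, and record where each value came from so that exclusivity errors can say which source set the conflicting value.

```rust
#[derive(Debug, PartialEq)]
enum Source {
    Flag,
    Env,
    Default,
}

// Resolve one setting: an explicit flag beats the environment variable,
// which beats the built-in default. Returning the source lets a later
// grouping/exclusivity check report *why* a conflicting value is set,
// instead of failing on an env var the user forgot about.
fn resolve(flag: Option<&str>, env: Option<&str>, default: &str) -> (String, Source) {
    if let Some(v) = flag {
        (v.to_string(), Source::Flag)
    } else if let Some(v) = env {
        (v.to_string(), Source::Env)
    } else {
        (default.to_string(), Source::Default)
    }
}

fn main() {
    // In a real tool the env var name would come from the setting's spec.
    let env_val = std::env::var("MYTOOL_MODE").ok();
    let (mode, source) = resolve(Some("exclusive"), env_val.as_deref(), "normal");
    println!("mode = {mode} (from {source:?})");
}
```

Whether env-sourced values should participate in exclusivity checks at all is exactly the policy question the parent raises; this just makes the choice explicit rather than implicit.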
10kloc for command line parsing. TEN THOUSAND LINES. pico-args does it in 700 lines and probably handles 99% of real world use cases. Compile times go to shit, binary size bloats, and all for some edge case you'll never hit. Most CLI tools need what, three or four flags max, maybe a subcommand or two. You don't need the swiss army knife of argument parsing for that. Tried replacing clap with pico-args on three different projects last month: 80% reduction in compile time every single time. The binary went from 8mb to 2mb on one of them. The "disk space is cheap" argument is partially acceptable, but compile time isn't. Developer experience isn't. Startup time isn't. Memory usage isn't.
So 700 LOC gets us a mediocre argument parser with no features. What do you get for an additional 9300 LOC? A "properer" parser (dev and user experience+). Help generation (dev experience+). Multiple interfaces (dev experience+). Color terminal output (user experience+). Suggested completions (user experience+).
Is it worth it? I dunno that's a per-project choice. If you absolutely need the smallest footprint and compile times possible, probably you don't want to go with clap. You also probably don't want to go with Rust.
I have used clap to build perspt (https://github.com/eonseed/perspt). The project has extensive documentation on how it was built, as we did it as a learning exercise.
The macro magic in rust is amazing. It's so nice to just define a struct, slap a `#[derive(Parser)]` on it, and call it a day.
That was a great intro to clap, thanks for writing it up!
I've been building clap CLIs for a while and started to put together a template: https://github.com/mootoday/cli-template.
It also includes a crate I developed to reduce the boilerplate code for nested commands: https://crates.io/crates/clap-nested-commands
Thanks for the feedback. Nested commands are definitely full of boilerplate and your crate looks interesting.
Worth noting that you can do this in languages other than Rust. Python, for example, has https://github.com/fastapi/typer. https://github.com/shadawck/awesome-cli-frameworks lists relevant libraries in many languages, though not all of them support clap-style structured parsing.
Something I’ve been working on recently is a command line tool [1] to bring clap declarative command line parsing to shell scripts. Unfinished WIP but largely functional.
[1] https://github.com/fujiapple852/claptrap
It really bothers me how much people use crates in Rust. "Minimalist" crates have tens of dependencies.
It's like the node.js of systems languages. Touching it feels gross.
… I don't want every program to attempt to implement argument parsing; bespoke implementations will be lower quality than clap. Not reinventing an argument parser is an extremely reasonable dependency to take.
Clap, on the non-derive side, has approximately two dependencies: anstream/anstyle (for terminal coloring, another thing that sounds deceptively simple at first pass, if you think all the world is a VT100, but really isn't; this is a reasonable dep. for a CLI arg parser) and strsim (string similarity, again, a reasonable dep for a CLI arg parser…). And that's it¹ for clap's direct deps.
(¹I'm omitting clap_lex, as an "internal" dep.)
On the derive side, there's the usual proc-macro2/quote/syn trio, but those come up frequently in derive crates due to what they do, and other than that, there's just `heck`, which is again an obvious dependency in context.
… what is so quizzical to me about the "so many dependencies!" complaint is that when we do get examples like this, they're almost always on crates that bottle up genuinely tricky functionality (like what we see here) — exactly the sort of thing that a.) is hard to get right and b.) isn't relevant to the problem I want to solve. That's like "absolutely this is a dependency" central, to me…
This shouldn't even need terminal coloring, in fact that sounds annoying because it's going to have to behave differently if you pipe it to less (or it's going to do something dumb like the rust compiler itself and just reopen the tty.)
This actually reminds me of my other issue with this kind of "oh we just get it for free" attitude that tends to result in overbuilding things that I also dislike in rust.
No I think people would be better off with a bespoke option parser actually.
Actually,
1. `color` feature and thus the `anstream` dep is optional.
2. Even if you use it, it handles all the behaviour correctly regarding the piping and no color support, which is why it is a dependency in the first place.
Source: I am clap maintainer
Why is needing to behave differently when you pipe annoying? Are you saying it doesn't work? But also FWIW I don't think piping command help output is a common use case.
It's useful if your terminal emulator isn't very good and the scrollback buffer doesn't work. This can happen for any number of reasons.
Or maybe I don't feel like using the mouse, or I want to do something like grep it. There are an unlimited number of reasons I might want that, that's how interfaces like these work.
Are you saying your terminal emulator can't match on text with ansi color modifiers when searching?
Are you going to demand it when you write your program? That seems like a regression from most CLIs with bespoke option parsers I've used.
I agree with you that this is a ridiculous criticism to level against clap specifically.
But I also share the same overall sentiment. Every moderately sized rust project I've worked on has quite a lot of transitive deps, and that makes me a little bit nervous.
Completely agree. I find it crazy that this is encouraged in the very first example in the official tutorial [1].
[1] https://doc.rust-lang.org/book/ch02-00-guessing-game-tutoria...
If all the code was crammed into the std library it'd be fine?
Functions need to build on top of simpler functions to be able to abstract problems and tackle them one at a time. There's innate complexity around and without trying to tame it into smaller functions/packages it seems you'll end up in a worse spot.
Not OP, but more code in stdlib does indeed sound better.
I'm not against abstraction and re-use. What I don't like is that for every given thing I want to do, there are multiple crates that offer the same functionality, and it can be really fatiguing trying to vet them. And it is truly a rarity to find a crate that is past the 1.0 version milestone.
Compare to golang for example. You can get quite far in go without needing to pull in any libraries. But in rust you need a library for even a basic http request.
Trying to get everything into std requires a lot of work that I think gets harder when considering Rust's goals of being low level and paying only for what you use. Not having finalised some aspects of the language itself would also slow this down even more.
I'd rather have libraries built with more freedom and the possibility of having experimental stuff around meanwhile the std worthy solution lands, and if things work fine without them in the standard library then it makes sense to keep them out.
Rust may be lacking an easier way to shop for recommended libraries for common problems. There should be a path to discover all the good and best libraries for each problem. crates.io takes a stab at having this information, but I think more handholding and some sort of community seal of approval is needed.
I'm not the OP here, but I did write the blog post. I have mixed thoughts on how much should be part of std. I think that what we're seeing is a class of library where we could see a new level introduced to open languages like rust that sits somewhere above std, but is maintained and blessed by a broader group than just the nebulous community. Rust's separation of std from no-std is a good example of tiers that I think we could see evolve. Another example in the same category as clap would be serde and some of the specific serde implementations like serde-yaml that is now in a really painful unmaintained/forked status. Maybe these are things to push for more broadly in the rust community.
I think Rust's std really wants to be just the runtime. Most language stds are runtime + common utilities and then often there's an extra set of common environment libraries with varying levels of distinction.
Yeah I was a little surprised to discover the standard library doesn't even include regex. That's kind of extreme. Even most C environments have that.
Yes, having a good std library would be fine. It would really limit the proliferation of crates.
NIHS?
If you got a progress bar, website, and dependency tree for every
#include <argp.h>, <stdio.h>, <sstream>, or <curl.h>
it'd feel pretty crazy too. Imagine if `make` went out and pulled the latest upstream changes for `pthreads` every time any one of your dependencies used it. In C++, imagine it's pulling and building Boost or Abseil.
C#? The entire mono/.net toolchain and system/ FFI libraries.
Imagine if we had "dot-h.io" that tracked how many separate C projects used argp. Laughable! Millions!
Every language has gobs of dependencies. So many dependencies it'd make you sick. Thousands upon thousands of lines of code, just to make something that runs on one target and says "Hello world" to the screen. Hell, some languages require you to run a runtime on your operating system that runs on real hardware _just to launch those thousands of lines of code_. And those are written using curl.h, pthreads.h, etc etc (or similar). Bananas!
At least those with package managers allow you to see it, audit it, update it seamlessly.
If it's too big, use `no_std`.
I shouldn't have to and will not explain why C's stdio.h and arbitrary 3rd party projects on github are not the same.
The compile time guarantees + declarative nature make Clap so amazing and foolproof. This is like heaven compared to imperative, arcane incantations like getopt.
Clap is the way to go. It makes command line argument parsing a breeze.