The main issue I have with Rust is the lack of a stable Rust ABI for shared libraries, which makes big dependencies shitty to work with. Another is that a lot of the big, nearly ubiquitous libraries don't have great documentation; what gets put up on crates.io is insufficient to quickly get an understanding of the library. It'd also be nice if the error messages coming out of rust-analyzer were as verbose as what the compiler gives you. Other than that, it's a really interesting language with a lot of great ideas. The iterator paradigm is really convenient, and the way enums work leads to really expressive code.
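To illustrate those last two points, a toy sketch (the types and names are all made up):

    enum Shape {
        Circle { radius: f64 },
        Rect { w: f64, h: f64 },
    }

    // Exhaustive match over the enum: the compiler checks every case.
    fn area(s: &Shape) -> f64 {
        match s {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { w, h } => w * h,
        }
    }

    fn main() {
        let shapes = vec![
            Shape::Circle { radius: 1.0 },
            Shape::Rect { w: 2.0, h: 3.0 },
        ];
        // Iterator chain instead of manual index bookkeeping.
        let total: f64 = shapes.iter().map(area).sum();
        println!("total area = {total}");
    }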
Documentation is generally considered one of the stronger points of Rust libraries. Crates.io is not a documentation site; you want https://docs.rs/ for that, though it's generally linked from each crate's page on crates.io. A lot of the bigger crates also have their own online books for more in-depth material. It's not that common to find a larger crate with bad documentation.
One specific example I encountered was ndarray. I couldn't figure out how to write a function that accepts both an owned array and an array view (slice) without duplicating it for each type. This could be because I'm a novice with the language, but it didn't seem obvious. I ended up giving up after trying to dig through the docs for a few hours and went back to C++.
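If I'm reading the ndarray docs right, the trick seems to be making the function generic over the storage type, roughly like this (the function itself is made up):

    use ndarray::{ArrayBase, Data, Ix1};

    // Generic over the storage type S: accepts both owned arrays
    // (Array1<f64>) and views/slices (ArrayView1<f64>) with one definition.
    fn mean<S: Data<Elem = f64>>(a: &ArrayBase<S, Ix1>) -> f64 {
        a.sum() / a.len() as f64
    }

    fn main() {
        let owned = ndarray::Array1::from(vec![1.0, 2.0, 3.0, 4.0]);
        let view = owned.slice(ndarray::s![1..]); // an ArrayView1<f64>
        println!("{}", mean(&owned)); // owned array
        println!("{}", mean(&view));  // a view of it, same function
    }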
As someone who has worked in software for 30 years, deploying complicated software, I think shared libraries are a mistake. You think you get the benefits of smaller size and easy security upgrades, but due to deployment hell you end up using Docker, and now your deployment has effectively added a whole OS in size, and you need to do security upgrades for that OS instead of just your application.
I use Rust for some software now and build it with musl, and I'm struck by how small things get compared to a regular deployment. It feels like magic that I no longer get glibc incompatibility issues.
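For anyone who wants to try it on x86-64 Linux, it's roughly this (adjust the target name for your platform):

    rustup target add x86_64-unknown-linux-musl
    cargo build --release --target x86_64-unknown-linux-musl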
Maybe that's OK for your use cases, but there are many situations where the size savings and ease of upgrading provided by shared libraries are worthwhile. For example, it would suck to need to push a 40+ GB binary to a fleet of systems with a poor or unreliable internet connection. You could try to mitigate this sort of thing by splitting the application up into microservices, but that adds complexity and isn't always a viable tradeoff if maximizing compute efficiency is also a concern.
I'm not so sure that dynamic libraries always reduce size, especially with libraries that are linked by a single binary.
With static libraries, you can conditionally compile only the features you're going to use. With dynamic libraries, however, the whole library must be compiled.
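A rough sketch of what I mean ("json" is a made-up feature name you'd declare in Cargo.toml):

    // When built without `--features json`, none of this module's code
    // (or its dependencies) ends up in the binary at all.
    #[cfg(feature = "json")]
    mod json_support {
        pub fn encode(s: &str) -> String {
            format!("\"{}\"", s.replace('"', "\\\""))
        }
    }

    fn main() {
        #[cfg(feature = "json")]
        println!("{}", json_support::encode("hello"));

        #[cfg(not(feature = "json"))]
        println!("built without the json feature");
    }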
EDIT: just to clarify, I'm not saying that static libraries always result in less size. I'm saying it's not a black-and-white issue.
Technically this is conflating two things: bundling dependencies and static/dynamic linking. But since you have to bundle your dependencies to use static linking, and there's little point in dynamic linking if you bundle your dependencies... most of the time they are synonymous.
Exceptions are things like plugins, but that's pretty rare.
You can just use an unsafe block, though. Or make a thin wrapper of safe functions that internally just contain an unsafe block calling the C ABI function.
Even if Rust had a stable ABI, you would still need that unsafe block.
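Something like this, where c_add stands in for whatever function the C library actually exposes:

    use std::os::raw::c_int;

    // Hypothetical function exported by some C library.
    extern "C" {
        fn c_add(a: c_int, b: c_int) -> c_int;
    }

    // Thin safe wrapper: the unsafe block lives here, and the rest
    // of the program calls an ordinary safe Rust function.
    pub fn add(a: i32, b: i32) -> i32 {
        // SAFETY: c_add takes plain integers and has no other preconditions.
        unsafe { c_add(a, b) }
    }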
Rust is definitely a really cool language (as someone who has played with it just a little), but it's quite headache-inducing, at least for me at the moment.
Mostly the ownership model: trying to remember which functions expect borrowed types and which take ownership, etc. (see the sketch below).
The error messages in Rust are really good, so I can usually get the code working quickly, but I need to properly understand the reason behind the error in order to learn, and that's when I get headaches.
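A minimal example of the kind of thing that trips me up:

    // Borrows: the caller keeps ownership of the String.
    fn shout(s: &str) -> String {
        s.to_uppercase()
    }

    // Takes ownership: the caller can't use the value afterwards.
    fn consume(name: String) -> usize {
        name.len()
    }

    fn main() {
        let name = String::from("ferris");
        println!("{}", shout(&name)); // &String coerces to &str; `name` still usable
        let n = consume(name);        // `name` is moved here
        println!("{}", n);
        // println!("{}", name);      // error[E0382]: borrow of moved value
    }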