Possibly stupid question: is automated testing actually a common practice?
Referring more to smaller places like my own - few hundred employees with ~20 person IT team (~10 developers).
I read enough about testing that it seems industry standard. But whenever I talk to coworkers and my EM, it's generally, "That would be nice, but it's not practical for our size and the business wouldn't allow us to slow down for that." We have ~5 manual testers, so things aren't considered "untested", but issues still frequently slip through. It's insurance software, so at least bugs aren't killing people, but our quality still freaks me out a bit.
I try to write automated tests for my own code, since it seems valuable, but I avoid it whenever it's not straightforward. I've read books on testing, but they generally feel like either toy examples or far more effort than my company would be willing to spend. Over time I'm wondering if I'm just overly idealistic, and automated testing is more of a FAANG / bigger company thing.
My context: I'm at a small ~30-person software company. We do various projects for various customers. We're close to the machine sector, although my team is not. I'm the lead of a small three-person developer team on a continuous project.
I write unit tests when I want to verify things, mostly when I'm working in low-level, algorithmic, or interfacing areas of the code.
I would write more, and write them against our interfaces, if those were exposed to someone or something external that needed that stability and verification.
Our testing is mainly manual (mostly user-/UI-centric), and we have data restrictions and automated consistency validations on our reporting data. (Our project is very data-centric.)
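For the consistency validations, here's a rough sketch of the kind of check I mean, in pytest. `load_report_rows()` and the column names are made-up stand-ins, not our real schema; a real check would query the actual reporting store.

```python
# Rough sketch of an automated consistency check on reporting data (pytest).
# load_report_rows() and the column names are hypothetical placeholders.
import pytest


def load_report_rows():
    # Stand-in for fetching rows from the real reporting database/export.
    return [
        {"policy_id": "P-1", "premium": 120.0, "payments": 120.0},
        {"policy_id": "P-2", "premium": 80.0, "payments": 80.0},
    ]


def test_every_row_has_a_policy_id():
    rows = load_report_rows()
    assert all(row["policy_id"] for row in rows)


def test_premiums_match_recorded_payments():
    # Consistency rule: booked payments should add up to the premium.
    for row in load_report_rows():
        assert row["payments"] == pytest.approx(row["premium"]), row["policy_id"]
```

The point is less the specific rules and more that these run automatically on every change, instead of someone eyeballing the reports.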
it’s not practical for our size and the business wouldn’t allow us to slow down for that
Tests are an investment. The slowdown from implementing tests pays off in maintainability and stability down the line, and that payoff can already show up before delivery (issues noticed in review, before merge, or before release).
It may very well be that they wouldn't even slow you down, because they can lead you to a more thought-out implementation and interfaces, or to noticing issues before they hit review, test, or production.
If you have a project that will be maintained, then it's not a question of slowing down, but of whether you're willing to pay more down the line (effort, complexity, money, instability, and the resulting dissatisfaction) for possibly earlier deliverables.
If tests would make sense and you don't implement them, you're incurring technical debt. That's not sound development or engineering practice, and it should at least require a conscious decision about that fact and an awareness of the cost of not adding tests.
How common automated testing is, I don't know. I think many developers take shortcuts when they can; many are not thorough in that way, and give in to short-sighted time pressure and fallacies.
Perhaps it's just part of being somewhere where tech is seen as a cost center? Technical leadership loves to talk big about how we need to invest in our software and make it more scalable for future growth. But when push comes to shove, they simply say yes to nearly every business request, tell us to fix things later, and we end up making things less scalable and harder to test.
It feels terrible and burns me out, but we never seem to seriously suffer for poor quality, so I thought this could be all in my head. I guess I've just been gaslit by my EM into thinking this lack of testing is a common occurrence.
(A programming lemmy may not be a terribly representative sample, but I don't see anyone here anywhere close to as wild west as my place.)
It feels terrible and burns me out, but we never seem to seriously suffer for poor quality, so I thought this could be all in my head.
The way you suffer for it is in a loss of agility.
When I'm in a project with excellent unit test coverage, I often have no qualms about typing up a hotfix, running it through our automated tests, and then rolling it out, all in less than an hour.
Obviously, if it's a critical target system, you might want to get someone's review anyways, but you don't have to wait multiple days for your manual testers to get around to it.
Another way in which it reduces agility is in terms of moving people between projects.
If all the intended behavior is specified in automated tests, then the intern or someone who just got added to the project can go ham on your codebase without much worry that they'll break something.
And if someone needs to be pulled out from your project, then they don't leave a massive hole, where only they knew the intended behavior for certain parts of the code.
Your management wants this, they just don't yet understand why.
We used to have a scrum master, so we're already agile! /s
They want those things, sure, but I think it would take multiple weeks of dedicated work for me to set up tests on our primary system that would cover much of anything. A big up-front investment that might enable faster future development is exactly what I find hard to sell. I am already seen as the "automated testing guy" on my (separate) project, and it doesn't really look like I'm that much faster than anyone else.
What I've been meaning to do is start underloading my own sprint items by a day or two and try to set up some test infrastructure in my spare Fridays to show some practical use. But boy is that a hard thing to actually hold myself to.
If we end up in a project with too little test coverage, our strategy is usually to formulate unit tests before touching old code.
So, first you figure out what the hell that old code does, then you formulate a unit test until it's green, then you make a commit. And then you tweak your unit test to include the new requirements and make the production code match it (i.e. make the unit test green again).
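To make that concrete, here's a rough pytest sketch of the workflow. `calculate_discount` and the discount rules are made up for illustration; they stand in for whatever legacy code you're actually touching.

```python
# Sketch of the "pin down old behavior first" workflow (pytest).
# calculate_discount and its rules are invented stand-ins for legacy code.

def calculate_discount(order_total):
    # Production code as it looks after step 3; originally only the 10% rule existed.
    if order_total >= 500:
        return order_total * 0.20   # new requirement, added in step 3
    if order_total >= 100:
        return order_total * 0.10   # old behavior, pinned down in step 1
    return 0.0


# Step 1: characterization tests - write down what the old code actually did,
# get them green against the untouched code, then commit.
def test_small_orders_get_no_discount():
    assert calculate_discount(50) == 0.0


def test_orders_from_100_get_ten_percent():
    assert calculate_discount(100) == 10.0


# Steps 2 and 3: extend the tests with the new requirement (here, a made-up
# 20% tier from 500 upwards), watch the new test fail, then change the
# production code until everything is green again.
def test_orders_from_500_get_twenty_percent():
    assert calculate_discount(600) == 120.0
```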
I am already seen as the "automated testing guy" on my (separate) project, and it doesn't really look like I'm that much faster than anyone else.
This isn't about you being faster as you write a feature. I mean, it often does help, even during the first implementation, because you can iterate much quicker than by starting up the whole application. But especially for small applications, it will slow you down when you first write a feature.
Who's sped up by your automated tests are your team members and you-in-three-months.
You should definitely push for automated tests, but you need to make it clear that this needs to be a team effort for it to succeed. You're doing it as a service to everyone else.
If it's only you who's writing automated tests, then that doesn't diminish the value of your automated tests, but it will make it look like you're slower at actually completing a feature, and it will make everyone else look faster at tweaking the features you implemented. You want your management to understand that and be on board with it, so that they don't rate you badly.
Who’s sped up by your automated tests are your team members and you-in-three-months.
Definitely true. I am very thankful when I fail a test and know I broke something and need to clean up after myself. Also very nice as insurance against our more "chaotic" developer(s).
I've advocated for tests as a team effort. Problem is just that we don't really have any technical leadership, just a hands-off EM and hands-off CTO. Best I get from them is "Yes, you should test your code." ...Doesn't really help when some developers just aren't interested in testing. I am warming another developer on my team up to testing, so at least I may get another developer or two on the testing kick for a bit.
And as for management rating me... I don't really worry too much. As I mentioned, hands-off management. Heck, we didn't even get performance reviews last year.