why would you do *NIX development on anything other than the actual kit the code is going to be running on?
I understand this to mean that you think that the dev environment should be identical to the production environment. Am I misunderstanding?
Assuming I'm understanding you correctly, no, I don't think what you said and what I expressed are the same at all.
> A developer (or development team) probably has (and should have) access to all of these environments for testing, but likely does development in the one they're most comfortable with.
If you are saying that development in a different environment is okay, but a dev should do testing in a production-equivalent environment, then we agree. I don't see that distinction in what you've written.
If I'm misunderstanding or misrepresenting what you're saying, please do correct me, as that's not my intent.
> I don't think dev environments need to be harmonized the same as production. If your tests are good, you should catch most of the "it worked on my laptop" problems.
That's true until something breaks in production: then you want to replicate the same situation in the dev environment, as closely as possible.
It's important to ensure that the software you're developing runs as expected in the production environment. That does not mean that the development environment and the production environment need to be the same. For example, it's very likely that the development tools you're using will (or should) not be deployed. This extends to anything else about the development environment, as long as you're able to consistently deploy the software to the production environment.
This also assumes that there is only one production environment. Some software is expected to work on different architectures or OSes, some quite dissimilar to each other. For example, some software runs on Linux, macOS, and Windows. A developer (or development team) probably has (and should have) access to all of these environments for testing, but likely does development in the one they're most comfortable with.
Nonsense. You can have all of it: a local dev environment where you sandbox-test your changes, a "proper" dev environment running Linux, a test environment running Linux that matches production, and a production environment.
When you sandbox-test your changes in your local dev, you also quickly learn which pieces of your code are non-portable or non-standard.
Don't know why this is downvoted, but isn't it a legitimate concern that if dev and prod environments are produced with different toolchains, it might result in discrepancies?
Concrete example: a dev environment built with Nix, but production is a Dockerfile that apt-gets its packages?
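Something like this, say (hypothetical package set). The Nix side resolves versions from whatever nixpkgs snapshot is imported; the apt side gets whatever the mirror ships on the day the image builds, so the two can drift apart without either file changing:

    # shell.nix -- dev environment; versions come from the imported nixpkgs snapshot
    { pkgs ? import <nixpkgs> {} }:
    pkgs.mkShell {
      packages = [ pkgs.python3 pkgs.curl ];
    }

    # Dockerfile -- production image; versions come from the apt mirror at build time
    FROM debian:bookworm
    RUN apt-get update && apt-get install -y python3 curl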
Your dev environment is never going to be identical to your production environment, because you don't write code directly in production. Test and acceptance should be identical to prod, but for dev that's pointless.
"I think it's better if the devs run something close to the production environment."
I used to think so too, but I changed my mind a few years ago. Having diversity in the environments in the phases leading up to staging/testing leads to many problems being uncovered early - hardcoded paths, platform assumptions, potential performance issues that only show under some circumstances, ...
I think it improves the code if various devs use different installation paths, DBs, development tools, and even OSes. I've seen many deployment issues that would have trivially been detected if devs had different environments.
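The classic early catch is something like this (a minimal Python sketch, hypothetical names): it works on every Linux dev box and falls over the first time someone runs it anywhere else.

    import os
    import tempfile

    # hardcoded path: fine on most Linux boxes, absent on Windows
    # and often on locked-down prod images
    scratch = "/tmp/myapp.scratch"

    # portable: ask the platform where temporary files actually live
    scratch = os.path.join(tempfile.gettempdir(), "myapp.scratch")
    print(scratch)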
I think it is overkill to have your dev environment different from your deploy environments. It would mean you maintain dev environments separately from deploy environments, and that you are debugging something other than what you are testing and deploying.
About reproducibility: unless Nix promises to fix all upstreams (apt, PyPI?), I don't see how it can fix reproducibility on the client side only.
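For reference, the client-side pinning Nix offers is just a committed lock file. A minimal flake sketch (hypothetical shell; note it only covers what is fetched through Nix, so a pip install inside that shell is back outside the pinning):

    # flake.nix -- the flake.lock next to this records the exact nixpkgs
    # commit and hash, so every machine resolves the same package set
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";
      outputs = { self, nixpkgs }:
        let pkgs = nixpkgs.legacyPackages.x86_64-linux;
        in {
          devShells.x86_64-linux.default =
            pkgs.mkShell { packages = [ pkgs.python3 ]; };
        };
    }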
Did you read my original comment which specifically mentions that development, test, and production are all matching?
Writing and testing the code before moving to the "proper" development environment running Linux gives you a chance to ensure you're not writing Linux-specific code, in cases where portability is at least a passing concern (read: unless you're developing platform-specific firmware).
It doesn't matter. Developers almost never develop/test in an environment that mimics production - multiple load balancers, multiple app servers, multiple database servers, failover to a second data center, etc.
If your dev environment is not different from prod, you're either insanely rich or your server setup is trivial.
To avoid problems like that, you probably want your dev environment as close as possible to your production environment. You probably also want to be testing your production-specific code/logic (especially if you're not doing that locally).
I guess so. It seems strange to me that anyone would want to inflict upon themselves the pain of using different environments for development vs. production, though.
I would totally want to deploy on the prod environment. If the tools have their symbols resolved properly, they should run on any reasonable environment.
Going the other way around and building a dev environment that is different from prod just makes no sense.
Just to play Devil's Advocate and be argumentative, what is the point of testing in production when your development/staging environment is guaranteed to be identical to your production environment?
How many of the failures caught in dev would have been legitimate problems in production? How about the ones in staging?
If your environments are that different are you even testing the right things?
And if yes, if you need all of those, then why not add a couple more environments? Because more pre-prod environments means more bugs caught in those, right? /s