
Would you consider proprietary solvers or optimization software in that category?



We’re trying to bring new ideas to the optimization space. We’ve used a bunch of technologies over time (CP, MIP, LP, heuristics, etc.). We’ve built our own solver (for www.nextmv.io) and learned a lot along the way.

Solvers are amazing. MIP solvers, for example, are some of the fastest and most mature software packages that exist. They’re also wickedly challenging to use effectively.

Do you use optimization solvers? Which ones? Why? Do you like them? Why or why not?
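To make the discussion concrete, here's the kind of problem a MIP solver chews through at vast scale, shown as a toy 0/1 knapsack solved by brute force (all numbers invented for illustration; a real solver uses branch-and-bound plus cutting planes rather than enumeration):

```python
from itertools import product

# Toy 0/1 knapsack: maximize total value subject to a weight limit.
# Brute force only works because there are 4 items (2^4 = 16 candidates);
# a MIP solver handles millions of binary variables with branch-and-bound.
values = [10, 13, 7, 8]
weights = [5, 6, 4, 3]
capacity = 10

best_value, best_pick = 0, None
for pick in product([0, 1], repeat=len(values)):
    weight = sum(w * x for w, x in zip(weights, pick))
    value = sum(v * x for v, x in zip(values, pick))
    if weight <= capacity and value > best_value:
        best_value, best_pick = value, pick

print(best_pick, best_value)  # (0, 1, 0, 1) 21
```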


I think it depends on whether you're interested in knowing how to solve these problems, or simply in solving them. If the latter, there's a bunch of commercial solvers out there. I won't name any for fear of an adverse reaction :0) but a trawl through the first page of Google results for 'optimization solvers' would probably work.

How does your product compare to major optimization software/mathematical programming products? Do you have public benchmarks?

I have a PhD in optimization and I still think CPLEX and Gurobi seem magical. The devil is in the details. Open source solvers are comparatively not very good.

I strongly disagree. This has very little to do with Excel Solver. This seems to be Microsoft's take on large-scale, real-world optimization problems, possibly with millions of variables and constraints. Can you do that with Excel? I doubt it. This Solver Foundation framework can do Constraint Programming, Quadratic Programming, and Mixed-Integer Programming. This is serious optimization, not the kiddie stuff Excel users deal with. SF seems to be based on commercial C++ solvers such as MOSEK.

I would say that open-source is not of great interest in optimization software. The idea is to write as little code as possible, and to trust that everything performance-critical has been optimized. Even speed is not the main issue for me. I want something that is flexible. I want to write little code because the less I write, the fewer bugs there are. Correctness trumps everything else.

If I am using SF to allocate investments, I want to make sure an optimal solution is found, even if it takes a little longer. Computer time is cheap. Buy a bigger computer. Developer time is more precious. There are only 24 hours in a day.

You want to change the source code? With all due respect, I would speculate that 99.9999% of HN users are not qualified to write numerical optimization code. Looking at it is of little use unless you have a PhD in Applied Math and years and years of experience.

Of limited use to HN readers? To those writing web-apps, perhaps. Those doing Machine Learning will probably be ecstatic to find this.


Not sure I agree with you on SNOPT. There are many, many tools out there that I see people use for convex (and non-convex) optimization. Especially considering a single license costs $6,000, I would be surprised if SNOPT captured even a plurality of solvers in use.

FWIW, in my department (Electrical Engineering), no one I know of is using SNOPT for their work.


How often do these come up? I studied some OR stuff in grad school when I was evaluating solvers, and brushed up against the OR-tools community for some research ideas. From what I could tell, the businesses that really needed it already use these tools, and the other businesses just don't have enough business dependent on optimization to make it worthwhile.

On the other hand, there may be a greater gain in specializing the solver for the problem you're tackling. Besides, they can copy some of the algorithms.

A friend of mine in grad school used to do it for some jurisdictions. The optimization software was his own, he sold the solutions. I really don't know much more about it than that, however.

As someone tinkering away on their own optimization algorithm, how big is the market for these sorts of optimization solutions?

Optimisation solvers are complex and incredibly powerful pieces of software. However, there are lots of different options and choosing which to use can be a daunting task.

There are some open source options ([COIN](https://www.coin-or.org/), [OR-tools](https://developers.google.com/optimization), [Minion](https://constraintmodelling.org/minion/), [CVXPY](https://www.cvxpy.org/)) and some commercial offerings ([Gurobi](https://www.gurobi.com/), [Mosek](https://www.mosek.com/)); other people write their own for their own narrower purposes. Some [benchmarks](http://plato.asu.edu/bench.html) are maintained by Hans Mittelmann.

Personally, I have used OR-tools, maintained by a team at Google, for vehicle routing optimisation and found it powerful but poorly documented with an inconsistent API. I've also used R's [optim](https://www.rdocumentation.org/packages/stats/versions/3.6.2...) function and [lpsolve](https://cran.r-project.org/web/packages/lpSolve/index.html) for linear and integer problems.
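For intuition about what the linear-programming packages mentioned above are doing, here's a deliberately naive sketch: a tiny two-variable LP solved by enumerating the vertices of the feasible polygon (the optimum of an LP always lies at a vertex). The constraints and objective are made up; simplex-based solvers like lpsolve reach the optimal vertex far more cleverly than this:

```python
from itertools import combinations

# Maximize 3x + 4y subject to constraints of the form a*x + b*y <= c,
# by intersecting every pair of constraint boundaries and keeping the
# feasible points. Real LP solvers never enumerate like this.
constraints = [
    (1, 2, 14),    # x + 2y <= 14
    (-3, 1, 0),    # -3x + y <= 0
    (1, -1, 2),    # x - y <= 2
    (-1, 0, 0),    # x >= 0
    (0, -1, 0),    # y >= 0
]

def intersect(c1, c2):
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel boundaries never meet
    # Cramer's rule for the 2x2 system of boundary equations.
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p):
    x, y = p
    return all(a * x + b * y <= c + 1e-9 for a, b, c in constraints)

vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 4 * p[1])
print(best)  # (6.0, 4.0), objective value 34
```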


Absolutely! One of the other issues that hasn't been mentioned already, but seems to come up a lot, is the relative lack of different solution constraints available in these existing packages, preventing the use of optimization beyond parts intended for a limited set of manufacturing processes. It is extremely difficult, from a software development perspective, to create solution constraints when working with a continuous solution approach, yet it is trivial to do so with a discrete approach. This gives us a major advantage and allows us to generalize our software to more engineering problems, letting us reach wider adoption.

Most of what we have right now would still be considered in the "bench test" phase, but we're making progress towards something which could be demonstrated in industry. Our primary focus at the moment is making it easier to import and export CAD geometry to and from the optimizer.


I’m in the optimization space and have never heard of this. Thanks. I recognize many of the names behind it.

I think part of the reason it's not more popular is because QPs aren't the most popular problem type. They're used in MPC problems and any number of L2-loss problems (ML problems among them), but those fields already have custom methods for solving the QP. That doesn't mean they can't adopt a standard high-performance solver; I think this is very promising.
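As background on why QPs are tractable at all: the unconstrained case, minimize 0.5 x'Qx + c'x with positive-definite Q, has the closed-form solution Q x = -c. Dedicated QP solvers earn their keep on the constrained case. A tiny hand-solved instance with invented numbers:

```python
# Unconstrained QP: minimize 0.5 * x'Qx + c'x. Setting the gradient
# Qx + c to zero gives the linear system Q @ x = -c. For this 2x2
# toy we solve it with Cramer's rule; real solvers factorize Q.
Q = [[3.0, 1.0], [1.0, 2.0]]   # symmetric positive definite
c = [-5.0, -5.0]

det = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]
x = ((-c[0]) * Q[1][1] - (-c[1]) * Q[0][1]) / det
y = (Q[0][0] * (-c[1]) - Q[1][0] * (-c[0])) / det
print(x, y)  # 1.0 2.0
```

Adding even simple inequality constraints (say, x >= 1.5) breaks the closed form, which is exactly where active-set and interior-point QP methods come in.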

Also, I'm not sure whether there are any political issues that prevent them from listing this on COIN-OR. That's usually most folks' go-to site for discovering new solvers.


I have been writing optimization solvers of many forms to solve problems in engineering for about 20 years: from "make Excel do linear regression on some data" to linear least squares to some nonlinear methods, simulated annealing, Bayesian methods, deep neural networks. None of this is "a new kind of programming"; it's "do a bunch of data munging, throw a matrix at a function, get a matrix back, interpret/plot."
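The "throw a matrix at a function" step for the linear regression case really is just the normal equations. A hand-rolled sketch for one predictor, with toy data chosen to lie exactly on y = 2x + 1:

```python
# Ordinary least squares fit of y = m*x + b via the normal equations,
# worked by hand for a single predictor. With a design matrix A = [x 1],
# the normal equations are (A'A) [m, b]' = A'y, a 2x2 linear system.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1

n = len(xs)
s_x = sum(xs)
s_y = sum(ys)
s_xx = sum(x * x for x in xs)
s_xy = sum(x * y for x, y in zip(xs, ys))

# Solve [[s_xx, s_x], [s_x, n]] @ [m, b] = [s_xy, s_y] by Cramer's rule.
det = s_xx * n - s_x * s_x
m = (s_xy * n - s_y * s_x) / det
b = (s_xx * s_y - s_x * s_xy) / det
print(m, b)  # 2.0 1.0
```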

there's really no other magic in it than that.


I teach a class on the subject: https://www.youtube.com/playlist?list=PLdkTDauaUnQpzuOCZyUUZ...

Understanding optimization at this level and being able to formulate your own problems in one of the canonical forms gives you superpowers.

You don’t even have to build your own solvers. The commercial/open source ones are typically good enough.


What do you mean by optimization engine? Did you write a solver yourself, or did you use something off-the-shelf? In either case, what are the algorithms involved?

Unless you're solving toy problems, it matters quite a bit. In mathematical optimization, constructing the problem tends to be just as expensive as solving the problem is – sometimes more so. The existence of expensive commercial systems like AMPL and GAMS that only exist to express optimization problems demonstrates that this is a non-trivial issue that people are willing to pay money for. Using C++ APIs to solvers is extremely painful, inflexible, error-prone, and completely locks you into a specific solver.

Yup. And when you get a model that works matched up with an industrial-scale decision problem that's valuable to solve, it's arguably only of academic interest whether you can solve it "optimally". The problem is only a simplified model of reality anyway; it's often better to get a quick, close-enough approximate solution to a problem that's a good approximation of the situation than an exact optimal solution to a simpler problem that's a poorer approximation.

If you're lucky enough to get a problem that's basically stable over time, where the problem structure doesn't change, then maybe you can get improved solutions rapidly at industrial scale by replacing a black-box MIP solver like Gurobi/CPLEX with a decomposition that exploits the problem structure. The sub-problems can be solved by some specialized graph algorithm, heuristic, or brute force (if they all have bounded size), and the general-purpose MIP/LP solver can be left with the job of dealing with the shared resources and constraints that bind the subproblems together. The downside to a highly specialised custom solver is that it usually isn't flexible to changing requirements (unless you get very lucky): a slight change in a business rule can break the problem structure that underpins the entire solution approach.
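A minimal sketch of that decomposition idea, with entirely invented numbers: two knapsack subproblems are coupled only through a shared capacity, so a toy "master" loop just enumerates the capacity split and each subproblem is solved independently (here by brute force, standing in for whatever specialized algorithm fits the structure):

```python
from itertools import product

# Two independent knapsack subproblems linked by one shared resource.
# The "master" enumerates how to split the capacity between them; each
# subproblem is then solved on its own. A real decomposition (e.g.
# Dantzig-Wolfe) coordinates subproblems with prices, not enumeration.
def solve_knapsack(values, weights, cap):
    best = 0
    for pick in product([0, 1], repeat=len(values)):
        w = sum(wi * x for wi, x in zip(weights, pick))
        v = sum(vi * x for vi, x in zip(values, pick))
        if w <= cap:
            best = max(best, v)
    return best

sub_a = ([6, 10, 12], [1, 2, 3])   # (values, weights) for subproblem A
sub_b = ([7, 8, 9], [2, 2, 3])     # (values, weights) for subproblem B
total_capacity = 5

best = max(solve_knapsack(*sub_a, split)
           + solve_knapsack(*sub_b, total_capacity - split)
           for split in range(total_capacity + 1))
print(best)  # 24: give A 3 units of capacity, B the remaining 2
```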


There are nonlinear solvers available but they are slower and not guaranteed to converge on a globally optimal solution.
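A one-variable illustration of that local-optimum caveat (the function and step size are chosen purely for illustration): plain gradient descent converges to whichever basin the starting point sits in, which is why nonlinear solvers often need multi-start strategies or global methods.

```python
# Gradient descent on f(x) = (x^2 - 1)^2 + 0.3*x, which has two local
# minima. The solver lands in whichever basin the start point is in,
# and only one of the two is the global minimum.
def f(x):
    return (x * x - 1) ** 2 + 0.3 * x

def grad(x):
    return 4 * x * (x * x - 1) + 0.3

def descend(x, step=0.01, iters=5000):
    for _ in range(iters):
        x -= step * grad(x)
    return x

left = descend(-1.0)   # lands near x ~= -1.04 (the global minimum)
right = descend(1.0)   # lands near x ~= +0.96 (a worse local minimum)
print(left, right, f(left) < f(right))
```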
