
If you think real computation is in any way feasible, why don't you build such a machine and make a silly amount of money?



These computers do exist on tabletops in labs; the problem is that it's comparatively difficult/expensive to mass-produce and miniaturize these components compared to silicon (where you can etch billions of transistors at once on a modern CPU). There are really cool things you can do, especially in terms of continuous/non-discretized-time computations, but I wouldn't hold your breath for this to become mainstream.

Because that's the point where you take things that don't compute, and make something that computes.

Computation is not theory. Theorists use paper and pencil and sometimes Mathematica, not simulation. As for not being able to rebuild a computer, that’s truly ridiculous.

He is not arguing purely for throwing money at hardware:

A good engineer would balance the tradeoffs and solve the problem within the existing resource constraints. A theoretical computer scientist would whine until he got a big enough machine to implement the mathematical solution with the least amount of fussing about the constraints of the real world.

The post at the start of the Usenet thread asks if any Lisp implementation can handle inverting a 20,000 by 20,000 matrix. Naggum is arguing that if you really had that problem, you are probably working in an industry where the choice of Lisp implementation is irrelevant, because solving the problem is so lucrative that you could afford to write your own Lisp implementation, if needed. He's saying that you will throw money at both hardware and developers to get the job done.
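Naggum's framing can be sanity-checked with some back-of-envelope arithmetic. This is a rough sketch only: the use of double precision and the ~2n³ flop count for LU-based inversion are standard assumptions, not details from the Usenet thread.

```python
# Rough cost of inverting a dense 20,000 x 20,000 matrix
# (the problem from the Usenet thread), assuming 64-bit floats
# and LU-based inversion at roughly 2*n^3 floating-point operations.

n = 20_000

memory_bytes = n * n * 8   # one dense matrix of doubles
flops = 2 * n ** 3         # approximate factor-then-invert cost

print(f"memory per matrix: {memory_bytes / 1e9:.1f} GB")  # 3.2 GB
print(f"arithmetic cost:   {flops / 1e12:.0f} Tflop")     # 16 Tflop
```

At a few gigabytes and a few tens of teraflops, this is minutes of work on ordinary modern hardware, which supports the point that for anyone with this problem, the choice of language implementation is a minor line item.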

Edit: He also does not compare Fortran and Lisp.


Compute is most definitely a marginal cost rather than a fixed cost....

And when you start talking about buying tens of millions of dollars' worth of additional hardware to support the project you're trying to launch, you start thinking pretty hard about whether you can speed things up.

And for practitioners, most of this is not crazy high-level math; it's first-year linear algebra and multivariable calculus at most, and generally just Lego blocks, intuition, and data work.


There's a big difference between having a working implementation of a computable function running on real hardware, and having the idea that a certain phenomenon is potentially computable.

If you want to go with imaginary algorithms that don’t exist yet (or possibly ever) sure such an imaginary computer or system could do it more efficiently.

I think saying 'computation will never be __________' is usually just wrong. We have a decent understanding now of the complexity of a human body, and there's no technical limitation on putting a ton of supercomputers together to do what OP suggested. It's more a matter of waiting until it's economically worth it.

Ego has infected this thread. We can't just say that this project is interesting. We have to talk about "expensive" computation. Why? What does "ambition" have to do with computer science?

If someone can pay for these computing resources, why shouldn't he do it? Also there is no physical law that forbids cheap brain-scale computing. 1 exaflops could easily fit in less than 1 cm^3, given sufficiently advanced logic technology.
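The "no physical law forbids it" claim can be checked against the Landauer limit, the thermodynamic minimum energy of k·T·ln(2) to erase one bit at temperature T. A rough sketch: the figure of 64 bit-erasures per floating-point operation is an illustrative assumption, not a measured value.

```python
# Landauer-limit sanity check for the "1 exaflops in 1 cm^3" claim:
# the minimum energy to erase one bit at temperature T is k*T*ln(2).

import math

k = 1.380649e-23       # Boltzmann constant, J/K
T = 300                # room temperature, K
bits_per_flop = 64     # assumed bit-erasures per op, for illustration

e_bit = k * T * math.log(2)           # ~2.87e-21 J per erased bit
power = 1e18 * bits_per_flop * e_bit  # watts for 1 exaflop/s

print(f"minimum power: {power:.2f} W")
```

The thermodynamic floor comes out well under a watt, so the comment is right that physics permits it; the engineering obstacle is that real logic dissipates orders of magnitude more than the Landauer bound, and removing that heat from 1 cm³ is the hard part.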

It will still cost a crazy amount of computing power. The economics don't allow it.

Unironically, we can already solve the calculation problem with standard computing techniques.

So in other words, it's not a computer, it's a computation.

We recently had some real discussions about this. It’s worth remembering that designing, building, and testing a custom chip runs on the order of 100 million dollars, even for a relatively old process node. Add to that the fact that it would only be useful for a small subset of the problems your users want to use the machine for, and it becomes pretty clear why supercomputer centers go generic as much as feasible. Also, Bitcoin is a bit of a poor example because the memory requirement for each unit is incredibly small. That’s rarely the case for any kind of meaningful simulation.

What an absurd thing to consider. There is no possibility of building a computer the size of the universe. This website is full of crazy people talking about crazy things!

It's only meaningful for traditional computing if you can build circuits orders of magnitude more efficient than we can currently.

Unless you know it to be physically or logically impossible, you cannot really know how likely it is or isn't. Ask anyone in the mid-1970s how likely it was that billions of people would be walking around with a supercomputer in their pockets, and they'd come up with all sorts of reasons why it was extremely unlikely, such as no individual ever needing so much computing power. The practicality of the precision depends only on the ability to measure and the ability to manipulate and simulate large amounts of data, both of which are extremely likely to get better, and better ever faster, as time and technology progress.

From what I’m reading in the comments here, there is a ton of progress to be made by having more and cheaper computing to replace as much physical experimentation as possible. Ideally, we would have a gold-standard, tabula-rasa SOTA benchmark that says “any system up to this molecular weight is perfectly simulable in silico. No ifs and buts, just this many $$$$.”

Exactly. I saw a complaint similar to the GP's in a thread about using classical computing to simulate quantum computing.

My argument then was 'imagine prototyping on a machine that needs to be kept at 17 millikelvin'.

Here the argument is 'imagine prototyping on a two-ton missile traveling at lethal speeds through highly populated areas'.

