
Just keep dividing it; that's one of the key design decisions. Work in milli-bitcoin, or micro-bitcoin, or even single satoshis. There's nothing saying that you have to use full units.
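
A minimal sketch (in Python) of what those units work out to, assuming the usual conventions (1 BTC = 100,000,000 satoshi; mBTC and µBTC are just display conversions):

    # Work in integer satoshis; other units are just display conversions.
    # Assumed conventions: 1 BTC = 100_000_000 sat, 1 mBTC = 0.001 BTC,
    # 1 uBTC = 0.000001 BTC.
    SAT_PER_BTC = 100_000_000

    def to_satoshi(amount, unit="BTC"):
        """Convert an amount in BTC/mBTC/uBTC to integer satoshis."""
        factors = {
            "BTC": SAT_PER_BTC,
            "mBTC": SAT_PER_BTC // 1_000,
            "uBTC": SAT_PER_BTC // 1_000_000,
        }
        return round(amount * factors[unit])

    print(to_satoshi(0.5, "BTC"))   # 50_000_000 sat
    print(to_satoshi(1, "mBTC"))    # 100_000 sat
    print(to_satoshi(1, "uBTC"))    # 100 sat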



I agree with most of those, but what's with the first two?

How are the amounts/divisibility an issue? It divides down to tiny levels and there's enough.

Is it basically the idea that 1 bitcoin is large and we should have used a different base unit?


Ah, that makes sense - if Bitcoin keeps going at its present rate, could 8 decimal places of precision simply not be enough? Would a hard fork be the solution to this?

Yeah, I realise that. Even if it does work (noting that most BCH blocks are currently tiny), the chain will grow by ~135 GB per month (the entire Bitcoin chain is ~145 GB). Running a full node means you now have to buy a hard drive (or two) and hope your internet connection isn't capped. You also have to be OK with really slow block propagation (which you really shouldn't be).
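
Rough back-of-the-envelope for where a number like that comes from, assuming full ~32 MB blocks (roughly BCH's cap) and one block every ~10 minutes:

    # Chain growth estimate, assuming ~32 MB blocks and a ~10-minute interval.
    # All figures are approximate.
    block_size_mb = 32
    blocks_per_day = 24 * 60 / 10                       # ~144 blocks/day
    growth_per_month_gb = block_size_mb * blocks_per_day * 30 / 1000
    print(f"~{growth_per_month_gb:.0f} GB per month")   # ~138 GB/month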

There are some very good reasons for small blocks and higher on-chain transaction fees.


Some very good points. I recall that the mainnet once had a gas-per-block limit of 4.7 million before it was increased. The increase is something the miners vote on, and it has been fluid over the years. (Btw, it is very similar to the block size limit in BTC, although it limits storage + CPU usage rather than just storage as in BTC.)
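
For a rough feel of what a gas-per-block limit means in practice (using the standard 21,000 gas cost of a plain ETH transfer):

    # A plain ETH transfer costs 21,000 gas, so a 4.7M gas limit caps a block
    # at roughly 220 such transfers (real blocks mix many transaction types).
    gas_limit = 4_700_000
    gas_per_simple_transfer = 21_000
    print(gas_limit // gas_per_simple_transfer)   # ~223 transfers per block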

About the point on using floats: one should never use floats when working with money, because floats are not precise and produce rounding errors (e.g. 0.1 + 0.2 results in 0.30000000000000004; see http://0.30000000000000004.com for details). One of the simplest ways to solve this is to work in the smallest unit, so if you're working with dollars you use cents, which lets you use integers and avoid the precision loss. This is how Solidity currently deals with it: by working in the smallest unit of Ether, 'wei'. Some of the units are listed here: https://etherconverter.online
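
A tiny Python illustration of both halves of that point (the wei figure is just Ether's standard smallest-unit conversion):

    # Binary floats cannot represent 0.1 or 0.2 exactly, so errors creep in:
    print(0.1 + 0.2)             # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)      # False

    # Working in the smallest unit (cents, satoshi, wei) keeps sums exact integers:
    total_cents = 10 + 20        # 30, exact
    WEI_PER_ETHER = 10**18       # Ether's smallest unit is wei
    print(3 * WEI_PER_ETHER)     # 3000000000000000000, no rounding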


You can't just scale the number of transactions in a block forever and still have a stable currency. If you have only a few miners working on massive blocks, then they confer very little confidence on the transactions in those blocks. More transactions need more mining behind them. The whole point of the system is to verify transactions, and it stops working if it doesn't do that.

It is already possible to select mBTC (1/1,000) and µBTC (1/1,000,000) units in the client. That should be sufficient, at least for now. For nBTC (1/1,000,000,000) the 8 digits in the protocol are no longer enough, thus some more work will be necessary. But let's not get ahead of ourselves.
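
Sketching why nBTC overruns that precision, assuming amounts are stored as integer satoshis (i.e. 8 decimal places of BTC):

    # mBTC and uBTC are exact display conversions of integer satoshis,
    # but 1 nBTC would be 0.1 sat, below the protocol's resolution.
    SAT_PER_BTC = 10**8
    print(SAT_PER_BTC // 10**3)   # 1 mBTC = 100_000 sat  -> representable
    print(SAT_PER_BTC // 10**6)   # 1 uBTC = 100 sat      -> representable
    print(SAT_PER_BTC / 10**9)    # 1 nBTC = 0.1 sat      -> not an integer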

You realize Bitcoin Core can't scale? It can literally only process ~4 tx/s, whereas the original Bitcoin (BSV) can do thousands of tx/s.
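
Back-of-the-envelope for where a figure like ~4 tx/s comes from, assuming ~1 MB blocks every ~600 s and an average transaction around 400 bytes (both are assumptions, not exact figures):

    # Throughput estimate from block size, block interval, and average tx size.
    block_bytes = 1_000_000
    block_interval_s = 600
    avg_tx_bytes = 400

    print(block_bytes / block_interval_s / 1000)           # ~1.7 kB/s of block space
    print(block_bytes / avg_tx_bytes / block_interval_s)   # ~4 tx/s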

For starters, the whitepaper concludes that a 133 MB base block size is needed for it to work at scale. Bitcoin currently has a 1 MB block size limit, which it will never increase.

Ah. I see! Thank you for that. It makes complete sense now. I was wondering why they had such a convoluted method for calculating the block size.

It's the flaw in the horizontal scaling suggestion I made.

Scaling Bitcoin requires everyone to agree on bigger machines/faster pipes and/or use the Lightning option [which may or may not work]. Neither of those options has the problem I stated.


If you're looking for scaling, look at cryptocurrencies other than Bitcoin. On-chain scaling is far from explored, and it's really the only way to solve it properly.

Satoshi picked that model probably because it's easy to divide by 2 (x >> 1). He didn't want to prematurely optimize or overthink anything, because it was a simple experiment and nobody could have predicted that Bitcoin would become the mammoth it is now.
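
For illustration only: halving an integer amount really is just a one-bit right shift (starting from the 50 BTC subsidy expressed in satoshis):

    # Each halving is a right shift by one bit on the integer satoshi amount.
    subsidy = 50 * 100_000_000        # 50 BTC in satoshis
    for halving in range(4):
        print(halving, subsidy)       # 50 -> 25 -> 12.5 -> 6.25 BTC
        subsidy >>= 1                 # divide by 2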

I agree it's a good rule of thumb, and better than nothing for the most part. I tend to side with the conservative view: there should be a protocol like Bitcoin which makes running full nodes as cheap as possible in the current environment. But could it increase its block size 50% every 4 years? Probably.
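
For illustration, a 50%-every-4-years schedule compounds like this (the 1 MB starting point is just an example):

    # Compounding a 50% increase every 4 years from an illustrative 1 MB start.
    size_mb = 1.0
    for year in range(0, 21, 4):
        print(year, round(size_mb, 2))   # 0:1.0, 4:1.5, 8:2.25, 12:3.38, 16:5.06, 20:7.59
        size_mb *= 1.5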

> What is a reasonable blocksize? I couldn't give you an exact answer. I am not a researcher.

We both agree that 'the right size' should be a sophisticated determination. In my view, that suggests a market-based, variable approach is the way forward.

> In any case, it's possible to design a cryptocurrency that does not need to store transactions forever

> the more important metric is network consensus bandwidth

You're the only person I've heard point out these facts who isn't doing so in the context of Saito. I invite you to check it out, or, while it's small and crypto is relatively inactive, just drop into the Telegram and have a chat with David, an outspoken founder. That project was built on the understandings you are repeating here.


Well, it should mostly see the real Bitcoin chain as valid until it doesn't. Older full nodes have a bug where they can't handle chain reorganisations properly with blocks that are the full 1MB in size, so the moment they see an orphaned block they break horribly. (Relatedly, if any big block pusher argues increasing the block size is just a simple matter of changing a few constants, that's a good sign they don't know what they're talking about.)

Your speculation doesn't square well with the original whitepaper, which preempts your concerns by pointing out that you don't have to run a full node. It introduces SPV and points out you just need to store 80-byte block headers (a total of ~50 MB in 2021). (Satoshi was a big believer in SPV.)
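
Quick sanity check on that figure, assuming 80-byte headers and roughly 670,000 blocks mined by early 2021 (one every ~10 minutes since 2009):

    # SPV header storage estimate; block count is approximate for early 2021.
    header_bytes = 80
    blocks_by_2021 = 670_000
    print(header_bytes * blocks_by_2021 / 1_000_000)   # ~54 MB of headers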

It's a common misconception that block size is somehow related to decentralization. In fact, you don't need the whole blockchain to verify transactions securely; you only need the last few blocks at most. Also, full nodes add nothing to the network: they have no vote, only miners and actual users do.

Cryptocurrency can do this, but Bitcoin artificially keeps block space to ~1.6 kB/s, so people think that small transactions are not practical in a general sense, since they aren't practical there.

Why are you specifying such a small block size? bs=1M or more is much quicker for that sort of thing

I do not understand this move. If blockchain size is an issue, why isn't the focus on pruning the blockchain so that only the hashed headers are stored for past transactions? This defeats the point of having 8 decimal places if you can only reasonably use 3+.