It's Official: Segregated Witness Will Activate on Bitcoin ...

How Blockchain.com harms the cryptocurrency community

How Blockchain.com harms the cryptocurrency community
This post has been written to draw Blockchain.com's attention to the issues of its product.
Blockchain.com (formerly Blockchain.info) was founded in 2011 and with no doubt has helped the Bitcoin community to create a block explorer and has proven itself as a valuable service. Millions of people from all over the world use their wallet. However, time goes on, Bitcoin develops, but one of the main cryptocurrency companies not only slows down the process of its development, but also discredits the usability of cryptocurrencies. Why is that? We have tried to explain that in this post.
https://preview.redd.it/i0yk7qraqkw41.jpg?width=1024&format=pjpg&auto=webp&s=963dc18551d2900bca456bfa3a6cfd3636c7c93c

1. Lack of Segwit Address Support

This is the most painful problem for our service. To receive cryptocurrency we use segwit addresses by default. If a customer contacts our support, we can of course change an address in an order to P2SH (the one that begins with number "3"), but it reduces the usability of the service. Why don't we use P2SH by default? It is less beneficial both for us and for our client, as the cost of the consolidation of the transaction is taken into account when calculating the exchange rate. With a high network load and with orders for a small amount, the commission becomes significant. To compare — segwit addresses (or Bech32 that starts with "bc1") are 15% more advantageous than P2SH.
Segwit (Segregated Witness) was activated in 2017. At the end of the same year the CEO of Blockchain.info announced its support starting from (most likely) 2018. We can understand certain fears at the beginning of the way, as the company's security system is for sure not that bad. However more than 2 years have passed since then and that is a lot for the crypto world.

2. Using Legacy (P2PKH) Addresses Only

As of now only a P2PKH address (that starts with number "1") is used in the Blockchain.com wallet to receive cryptocurrency. Why is that bad? It is unfavorable for the users of this wallet. They spend 29% more than those using P2SH addresses.

3. Confusion with PAX and USD Digital

Changing the names of the currencies from one to another only in a Blockchain.com wallet is a rather strange decision. The key problem is that nowhere in the wallet are there any clues that this is an ERC-20 PAX token on the Ethereum blockchain. New wallet users will most likely be confused by this.
We sometimes get questions: "do we have USD Digital to buy or to sell?" and our technical support is forced to explain that it is PAX.

4. Incompetence of mobile application developers

In fact, this post was inspired by this particular problem. We will not focus on performance or shortcomings, we will just tell you about the main issue.
It is worth starting with questions. What problem does a mobile application solve when a person needs to pay for something? What is the best way to fill in the recipient address and the payment amount on the smartphone? Answer: QR code.
Scanning the QR code in this application is done not just badly, but also in such a way that creates maximum number of problems for a user.
The fact is that in the Android application when scanning code with the bitcoin:
?amount=, the value of the amount to be inserted in the corresponding field may differ from the encoded value by 1-100 Satoshi! Our team simply does not understand how this was implemented.
Do not believe? Try it yourself. Amount to insert — 0.00143452 BTC
bitcoin:3LAxDr5CxwBJT4tCejV8rpAXETz7bUH3tG?amount=0.00143452
After receiving information on such a problem from our users, we began to monitor updates to this application. After 2 updates had come out, the problem was not fixed.
And what about iOS? When scanning a QR code with a sum in iOS , the sum value is simply not inserted into the field! No comments. Bravo!
Blockchain.com wallet has different currencies, not just Bitcoin. Let's try Ethereum. You want to scan QR codes for Ethereum payment with the relevant sum? There is no such possibility. The application will respond with an "Invalid address" to all such codes:
  • eth:
    ?amount=
  • ether:
    ?amount=
  • ethereum:
    ?amount=
For BitcoinCash, the task of identifying the amount in the line is also an impossible task. Line with wallet bitcoincash:qpk0689rt3xkzlw8ap4yy72amp2zpws6zujkcgavptconsidered true, but with the amount bitcoincash:qpk0689rt3xkzlw8ap4yy72amp2zpws6zujkcgavpt?amount=0.1 — "Invalid address"
But there are applications that understand all such formats, or at least one of them. The string parsing function is pretty trivial and should not be a problem for the developer.
This article has been written based on the experience of using the application of the members of our team (who have used it for many years) and our users.
We encourage Blockchain.com to fix at least 3 of the 4 problems that we covered in this article. We still hope that the company will work on the bugs and will earn back trust of its users.
In the meantime — use other applications! ;)
The post is published on our blog:https://fixedfloat.com/blog/guides/how-blockchain-com-harms
submitted by FixedFloat to Bitcoin [link] [comments]

Understanding SegWit

Understanding SegWit
https://preview.redd.it/tb8bvi3nec351.png?width=1920&format=png&auto=webp&s=2c02d9d52f7b00d460ad0ccf87d069e1fc2d31b2
The First layer scaling solution is comprised of 3 different scaling mechanisms:
· Sharding
· Hard fork
· SegWit
In my last two articles, I have already covered Hard Fork and Sharding. So here in this article, I will focus on the last scaling solution i.e SegWit.
What is SegWit?
SegWit stands for Segregating Witness
i.e separating the signatures from the transactions.
In this process, certain parts of a transaction are removed, which will free up space so that more transactions can be added to the chain. The idea behind using this method is to overcome the block size limit of blockchain transactions. In simple terms, SegWit changed the way data are stored, therefore helping the Bitcoin network to run faster and more smoothly.
It was suggested as a soft fork change in the transaction format of Bitcoin in the Bitcoin Improvement Proposal number BIP141.
Problem Statement
In the Bitcoin platform, Blocks are getting generated every 10 minutes and are constrained to a maximum size of 1 megabyte (MB). As the number of transactions is increasing, more blocks need to be added to the chain. But due to the block size constraint, only a certain number of transactions can be added to a block. The weight of the transactions can cause delays in processing and verifying transactions. Sometimes, it takes hours to confirm a transaction as valid. This can slow down further when the network is busy.
The Solution
To overcome the block size limit issue and to enhance the transaction speed, the transaction is divided into two segments. Removing the unlocking signature (witness) from the original portion and appending it as a separate structure at the end. The original portion will still have the sender and receiver data, and the new "witness" structure would contain scripts and signatures. The original data segment would be counted normally, but the new "witness" segment becomes one-fourth of its original size.
Digital signature accounts for 65% of the space in a given transaction.
SegWit is backward compatible, which means nodes that are updated with the SegWit Bitcoin protocol can still work with nodes that haven’t been updated.
SegWit measures blocks by block weight.
The formula used to calculate block weight:
(tx size with witness data stripped) * 3 + (tx size)
Since segregated witness creates a sidechain where witness data is stored, it prevents transaction IDs from being altered by dishonest users. It also addresses signature malleability, by serializing signatures separately from the rest of the transaction data, so that the transaction ID is no longer malleable.
History
Pieter Wuille, a bitcoin developer, first proposed the concept of SegWit.
On 24 July 2017 as a part of the software upgrade process i.e Bitcoin Improvement Proposal (BIP) 91, the concept of Segregated Witness is activated at block 477,120.
Within one week of implementation, the bitcoin price seen a spike of 50%. The transaction usage rate using SegWit further increased from 7% to 10% in the first week of October. As of February 2018, SegWit transactions exceed 30%.
However, a group of China-based bitcoin miners were unhappy with the implementation and later forked to created Bitcoin Cash.
Lightning Network - Layer 2 solution
Lightning Network operates on top of bitcoin and is referred to as a “Layer 2” component. It is an off-chain micropayment system that is designed to enhance the transaction speed in the blockchain network.
SegWit acts as a base component for the Lightning Network. By implementing SegWit, the transaction malleability issue can be prevented which will allow this secure payment system to process millions of transactions per second in the Bitcoin network.
Advantages of SegWit:
· Prevents transaction malleability problem.
· Prevents signature malleability problem.
· Helps in scaling the bitcoin network.
· Increases block size.
· Reduced transaction fees.
· Acts as a base for the lightning protocol.
Conclusion
There is no doubt that Bitcoin technology is very revolutionary but like any other technology, it has certain drawbacks as well as challenges. Scaling is one of them which has restricted in large scale applications adopted. It is capable of processing only 7-10 transactions per second on the base layer. Many developers, researchers from the Bitcoin community are working hard to overcome the problem. SegWit along with the Lightning Network together aiming to allow Bitcoin to process millions (or more) transactions per second. But the real scenario will depend on the success of future projects.

Read More: A Guide to Smart Contracts
submitted by RumaDas to BlockChain_info [link] [comments]

Can anyone honestly respond to these questions without resorting to name calling and hyperbole?

I just want to know the answers. I don’t care about all the shit chucking. Can anyone help me out with the correct articles to read that are well cited and well-reasoned. All I can find are a bunch of shit fights with occasional pieces of interesting information.
What is the response to the mining cartel threat?
I tried to ask “over there” (https://np.reddit.com/Bitcoin/comments/73ek09/comment/dnr14o0?st=J88P2KKX&sh=001b4c8fhttps://www.reddit.com/Bitcoin/comments/73ek09/comment/dnr14o0?st=J88P2KKX&sh=001b4c8f ) but mostly they just want to fling shit and name call; although certainly a lot of that happens here too.
Is it true though? Segwit effectively goes from 1MB to 2.2MB? If that’s the case why does the congestion and high fees remain? What is stopping it from hitting 4MB max during times of high activity?
I guess the means that the extra data is no longer reported in the “block size” on blockchain.info for example. Where can I see the real total data size of each block now? I know that segwit strips the signature data out of the “block” but if the nodes still have to store it’s kind of disingenuous to say the blocks are only 1MB. I also understand that segwit intends to “discard” or at least “optionally discard” the signature data once verified. So is there a way to see the real size of the data live both with the save the signature data option on or off?
As I understand it, so far, only people using segwit addresses can take advantage of a theoretical 60% capacity increase. However, not that many actually use segwit so it doesn’t automatically deliver this the way a simple block size increase would.
I am not trying to be an asshole I’m just actually trying to learn but it’s really hard to sift through all the bullshit and mid-slinging on both sides. Right now a simple increase in block size seems like the best solution and segwit seems like a bad idea for many reasons. Neither one is an end all be all, it’s a intermediate capacity increase on a long journey. But SegWit seems very complicated, in some ways illogical (https://www.coindesk.com/the-risks-of-bitcoins-segregated-witness-problems-under-us-contract-law/) and potentially dangerous (https://news.bitcoin.com/risks-segregated-witness-opening-door-mining-cartels-undermine-bitcoin-network/). I believe there are counter arguments but I have not heard a good one that makes any sense. For me what makes most sense is something like a simple 2-4MB larger block and continuing the discussion about what to do next.
I’m sure the knee jerk reaction of many will be to attack the messenger and while I’m no fan of fake Satoshi his argument on the mining cartels seems reasonable and I have not heard or read a good counter argument.
I’m more than happy to update my opinions based on new information and I’m trying to figure out exactly what to do with yet another hard fork looming on the horizon. Based on everything I read to date though, segwit plus no block size increase seems like the absolute worst option. Obviously there are some important elements I’ve missed, but as I said, it’s really hard to wade through all the bullshit.
Node blocksize calculator: https://iancoleman.github.io/blocksize/
submitted by mdprutj to btc [link] [comments]

03-27 13:34 - 'AiOption (AiOption) receives tens of millions of dollars in financing to help the blockchain empower the financial industry' (self.Bitcoin) by /u/jackzhang0 removed from /r/Bitcoin within 3-13min

'''
In 2020, due to the dual impact of the coronary pneumonia epidemic and the plunge of US oil stocks, the economic situation in the Asia-Pacific region is very grim. Within a week, U.S. stocks melted twice, and crypto digital assets such as Bitcoin plummeted. This seems to indicate that the direction of global financial markets in 2020 will be extremely unstable. In this situation, traditional financial investment methods are not the most valuable means of financial management. AiOption Blockchain Binary Options Platform provides a new direction for financial investment, predicting the rise and fall of encrypted digital assets such as Bitcoin in a fixed period of time to obtain income.
Recently, AiOption, a professional blockchain binary options platform, announced that it has received tens of millions of dollars in financing. This round of financing was led by the Japanese consortium and the Thai royal family. This round of financing is an important milestone in the continuous increase of market competitiveness. At the same time, AiOption has become the largest platform in China to provide blockchain binary options transactions.

[link]1
This round of financing will help the platform to further strengthen the innovation and research and development of original key core technologies, consolidate the company's leading edge in the binary options industry of the blockchain, and help the company continue to expand more application scenarios and accelerate the blockchain's empowerment of the financial industry. In order to further improve the product experience, we will also introduce local special versions based on user habits in different countries and regions. As soon as it entered the promotion in the Asia-Pacific region in 2020, there were more than 100,000 registered users in the first week, achieving very good results. The platform will also launch more promotion activities in combination with local characteristics. The top investment groups such as the Thai Royal Family and the Japanese Consortium gave AiOption a high rating. It is indeed a black technology star product known as Israeli fintech innovation.
AiOption (AiOption) is a professional crypto asset options trading platform with a solid foundation of blockchain technology. It has achieved significant R & D results in distributed network and blockchain security. It has worked closely with more than 8 countries to provide a very simple way to predict the price fluctuations of encrypted digital assets such as Bitcoin and Ethereum. The platform collects price data of multiple trading symbols from multiple selected trusted data sources (such as Binance, coinbase, bittrex, huobi, and some other well-known global exchanges) to merge together, and uses intelligent algorithms to identify and Filter abnormal price data and calculate the final price index for a single coin. Use more innovative and fair ways for players to predict the price of crypto digital assets such as Bitcoin and Ethereum.

[link]2
Safe, efficient, and high-performance systems
AiOption has top risk control, anti-fraud and segregated witness technologies, comprehensively formulates a security policy system, multi-level risk identification control, and multiple security defense methods. The high-frequency transaction matching engine steadily supports large amounts of data, high performance, and high concurrency. It adopts a distributed architecture, and the market and deep data come online at a fast speed. The front-end adopts a firewall anti-attack mechanism and the back-end adopts a hidden and discrete deployment.
AiOption's binary options trading system is equipped with flexible and convenient trading modes and an extremely secure system to ensure the safety of user assets.
Fair and simple, simple and convenient transaction model
On a general options platform, the bet price is real-time Bitcoin price and can be easily manipulated by the platform. When the player wagers the Bitcoin price on the platform, the wager price is the initial Bitcoin price for each round of the game, and manipulation is not allowed! Ensure fair and fair transactions, convenient user transactions, and easy to master gameplay.
  1. The operation is simple. You only need to judge the rise and fall of encrypted digital assets after 90 seconds.
  2. The rate of return is fast, and the single-round profit can be settled in 90 seconds.
  3. Transaction time is unlimited, 90 seconds matching, non-stop trading 7 days and 24 hours.
  4. There is no handling fee, and no dealer control disk.
At the same time, the platform has a unique function of depositing money and managing money. By depositing a certain amount of USDT, excellent players and excellent teams can obtain fixed high returns, with a maximum return of four times!
For many years, AIoption has always adhered to the concept of blockchain technology to empower the financial industry, and has concentrated on polishing products and application scenarios. The top-level blockchain team has achieved certain results in the blockchain and financial fields.
Through this financing, we will continue to focus on the development of blockchain technology and continue to develop in the large field of blockchain binary options services. AiOption's vision is to promote the development of blockchain binary options services, provide customers with better services, and continue to maintain its leading position in the domestic blockchain binary options industry.
'''
AiOption (AiOption) receives tens of millions of dollars in financing to help the blockchain empower the financial industry
Go1dfish undelete link
unreddit undelete link
Author: jackzhang0
1: pr*vi*w.redd.i*/0*i**tuut7p41.png*w***h=6*8&*mp;for*at*png&****=web*&*s*9387b*0a4b5b1*b8*165*517*9*5*bdb*a5e1a*b 2: preview.redd.it*vgy*zpd4u*p41*pn***i**h=769&format=pn*&am*;***o=w*bp&***;s=b69***7339239*967622***bccea*c5*07b*55**
Unknown links are censored to prevent spreading illicit content.
submitted by removalbot to removalbot [link] [comments]

AN INTRODUCTION TO DIGIBYTE

DigiByte

What are cryptocurrencies?
Cryptocurrencies are peer to peer technology protocols which rely on the block-chain; a system of decentralized record keeping which allows people to exchange unmodifiable and indestructible information “coins,” globally in little to no time with little to no fees – this translates into the exchange of value as these coins cannot be counterfeit nor stolen. This concept was started by Satoshi Nakamoto (allegedly a pseudonym for a single man or organization) whom described and coded Bitcoin in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low cost (virtually free) transactions. DigiByte is hard capped at 21 billion coins which will ever be mined, over a period of 21 years. DigiByte was never an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general – those being:
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived Bitcoin was the first of a kind! And came into the hands of a few! The beginnings of a coin such as Bitcoin were difficult, it had to go through a lot of initial growth pains which following coins did not have to face. It is for this reason among others why I believe Bitcoin was capped at 21 million; and why today it has thus secured a place as digital gold.
When Bitcoin was first invented no one knew anything about cryptocurrencies, for the inventor to get them out to the public he would have to give them away. This is how the first Bitcoins were probably passed on, for free! But then as interest grew so did the community. For them to be able to build something and create something which could go on to have actual value, it would have to go through a steady growth phase. Therefore, the control of inflation through mining was extremely important. Also, why the cap for Bitcoin was probably set so low - to allow these coins to amass value without being destroyed by inflation (from mining) in the same way fiat is today! In my mind Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC and must have known and even anticipated others would take his design and build on top of it.
At DigiByte, we are that better design and capped at 21 billion. That's 1000 times larger than the supply of Bitcoin. Why though? Why is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as a digital gold, nor as any sort of commodity, but as a real currency!
Today on planet Earth, we are approximately 7.6 billion people. If each person should want or need to use and live off Bitcoin; then equally split at best each person could only own 0.00276315789 BTC. The market cap for all the money on the whole planet today is estimated to have recently passed 80 trillion dollars. That means that each whole unit of Bitcoin would be worth approximately $3,809,523.81!
$3,809,523.81
This is of course in an extreme case where everyone used Bitcoin for everything. But even in a more conservative scenario the fact remains that with such a low supply each unit of a Bitcoin would become absurdly expensive if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare but it would be nigh impossible. For each Satoshi of a Bitcoin would be worth much, much, more than what is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB. Each whole unit of DGB would be worth approximately $3,809.52.
$3,809.52
This is much more manageable and remember in an extreme case where everyone used DigiByte for everything! I don't expect this to happen anytime soon, but with the supply of DigiByte it would allow us to live and transact in a much more realistic and fluid fashion. Without having to divide large numbers on our phone's calculator to understand how much we owe for that cup of coffee! With DigiByte it's simple, coffee cost 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
There is a reason for DigiByte's large supply, and it is a good one!
Decentralisation
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. This allows for a system which cannot be controlled nor manipulated no matter how large the organization in play or their intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows for people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place – this is the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also through use of conventional graphics cards. The multi-algorithm approach allows for users to mine on a variety of hardware types through use of one of the 5 mining algorithms supported by DigiByte. Those being:
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm such as Bitcoin or Litecoin do is that this allows for people to continually amass mining hardware and hash power. The more hash power one has, the more one can collect more. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency as it is cheaper for large organisations to set up in countries with inexpensive electricity and other such advantages which may be unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows for miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used. This allows for those with dedicated mining rigs to mine alongside those with more modest machines – and all secure the DigiByte chain while maintaining decentralisation.
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment between every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors which may wish to manipulate the DigiByte chain.
Manipulation may be done by a large pool or a single entity with a great amount of hash power mining blocks on the chain; thus, increasing the difficulty of the chain. In some coins such as Bitcoin or Litecoin difficulty is readjusted every 2016 blocks at approximately 10mins each and 2mins respectively. Meaning that Bitcoin’s difficulty is readjusted about every two weeks. This system can allow for large bad actors to mine a coin and then abandon it, leaving it with a difficulty level far too high for the present hash rate – and so transactions can be frozen, and the chain stopped until there is a difficulty readjustment and or enough hash power to mine the chain. In such a case users may be faced with a choice - pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
This difficulty readjustment along with the MultiAlgo approach allows DigiByte to maintain the lowest fees of any UTXO – PoW – chain in the world. Currently fees on the DigiByte block-chain are at about 0.0001 DGB per transaction of 100 000 DGB sent. This depends on the amount sent and currently 100 000 DGB are worth around $2000.00 with the fee being less than 0.000002 cents. It would take 500 000 transactions of 100 000 DGB to equal 1 penny’s worth. This was tested on a Ledger Nano S set to the low fees setting.
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned technology protocols. The use of MultiShield and MultiAlgo allows the mining of the DigiByte chain to always be profitable and thus there is always someone mining your transactions. MultiAlgo allows there to a greater amount of hash power spread world-wide, this along with 15 second block times allows for transactions to be near instantaneous. This speed is also ensured by the use DigiSpeed. DigiSpeed is the protocol by which the DigiByte chain will decrease block timing gradually. Initially DigiByte started with 30 second block times in 2014; which today are set at 15 seconds. This decrease will allow for ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before decentralisation is ensured on the DigiByte block chain by use of the MultiAlgo approach. Each algorithm in the MultiAlgo approach of DigiByte is only allowed about 20% of all new blocks. This in conjunction with MultiShield allows for DigiByte to be the most secure, most reliable, and fastest UTXO block chain on the planet. This means that DigiByte is a proof of work (PoW) block-chain where all transactional activities are stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor sidechains to scale, and thus we get to keep PoW’s security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain they would need to control over 93% of all the hashrate on one algorithm and 51% of the other four. And so DigiByte is immune to the infamous 51% attack to which Bitcoin and Litecoin are vulnerable.
Moreover, the DigiByte block-chain is currently spread over 200 000 plus servers, computers, phones, and other machines world-wide. The fact is that DigiByte is one of the easiest to mine coins there is – this is greatly aided by the recent release of the one click miner. This allows for ever greater decentralisation which in turn assures that there is no single point of failure and the chain is thus virtually un-attackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa the credit card company can handle around 2000 transactions per second (TPS) today. This allows them to ensure customer security and transactional rates nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). All the technological innovations I’ve mentioned above come together to allow for DigiByte to be the fastest PoW block-chain in the world and the most scalable.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can increase the TPS of any block-chain, such is the case with Bitcoin Cash. This is however not scalable. The reason a simple increase in block size is not scalable is because it would eventually lead to some if not a great amount of centralization. This centralization occurs because larger block sizes mean that storage costs and thus hardware cost for miners increases. This increase along with full blocks – meaning many transactions occurring on the chain – will inevitably bar out the average miner after difficulty increases and mining centres consolidate.
Hardware cost, and storage costs decrease over time following Moore’s law and DigiByte adheres to it perfectly. DigiSpeed calls for the increase in block sizes and decrease in block timing every two years by a factor of two. This means that originally DigiByte’s block sizes were 1 MB at 30 seconds each at inception in 2014. In 2016 DigiByte increased block size by two and decreased block timing by the same factor. Perfectly following Moore’s law. Moore’s law dictates that in general hardware increases in power by a factor of two while halving in cost every year.
This would allow for DigiByte to scale at a steady rate and for people to adopt new hardware at an equally steady rate and reasonable expense. Thus so, the average miner can continue to mine DigiByte on his algorithm of choice with entry level hardware.
DigiByte was one of the first block chains to adopt segregated witness (SegWit in 2017) a protocol whereby a part of transactional data is removed and stored elsewhere to decrease transaction data weight and thus increase scalability and speed. This allows us to fit more transactions per block which does not increase in size!
DigiByte currently sits at 560 TPS and could scale to over 280 000 TPS by 2035. This dwarfs any of the TPS capacities; even projected/possible capacities of some coins and even private companies. In essence DigiByte could scale worldwide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of all the top 50 coins in coinmarketcap.com and still run smoothly and below capacity. In fact, to max out DigiByte’s actual maximum capacity (today at 560 TPS) you would have to take all these transactions and multiply them by a factor of 10!
Oher Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its immense robustness, security and scalability make it ideal for building decentralised applications (DAPPS) which it can host. DigiByte can in fact host DAPPS and even centralised versions which rely on the chain which are known as Digi-Apps. This application layer is also accompanied by a smart contract layer.
Thus, DigiByte could host several Crypto Kitties games and more without freezing out or increasing transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, these are done independently of the DigiByte core team. These companies are simply using the DigiByte block-chain as a utility much in the same way one uses a road to get to work. One such example is Loly – a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects such as the following:
The DigiByte Foundation
As previously mentioned DigiByte was not an ICO. The DigiByte foundation was established in 2017 by founder Jared Tate. Its purpose is as a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource as water or air. Know that anyone can work on DigiByte, anyone can create, and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea and or would like to get help on your project do not hesitate to contact the DigiByte foundation either through the official website and or the telegram developer’s channel.
For this reason, it is ever more important to note that the DigiByte foundation cannot exist without public support. And so, this is the reason I encourage all to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
DigiByte
Wallets
Explorers
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
Hello,
I hope you’ve enjoyed my article, I originally wrote this for the reddit sub-wiki where it generally will most likely, probably not, get a lot of attention. So instead I've decided to make this sort of an introductory post, an open letter, to any newcomers to DGB or for those whom are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me why you're in DGB? What convinced you? Me it's the decentralised PoW that really convinced me. Plus, just that transaction speed and virtually no fees! Made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff on my free time, help me pay my debts? Thank you!
D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
https://digiexplorer.info/address/D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
submitted by xeno_biologist to Digibyte [link] [comments]

Could the "Segregated Witness" part of the "Sidechain Elements" project be used to solve the Bitcoin tx malleability problem ?

https://github.com/ElementsProject/elementsproject.github.io/tree/master#segregated-witness
"As transaction IDs no longer cover the signatures, they remove all forms of transaction malleability, in a much more fundamental way than BIP62 aims to do. This results in a larger class of multi-clause contract constructs that become safe to use."
Could it be in some way backported to Bitcoin ?
submitted by mably to Bitcoin [link] [comments]

⚡ Lightning Network Megathread ⚡

Last updated 2018-01-29
This post is a collaboration with the Bitcoin community to create a one-stop source for Lightning Network information.
There are still questions in the FAQ that are unanswered, if you know the answer and can provide a source please do so!

⚡What is the Lightning Network? ⚡

Explanations:

Image Explanations:

Specifications / White Papers

Videos

Lightning Network Experts on Reddit

  • starkbot - (Elizabeth Stark - Lightning Labs)
  • roasbeef - (Olaoluwa Osuntokun - Lightning Labs)
  • stile65 - (Alex Akselrod - Lightning Labs)
  • cfromknecht - (Conner Fromknecht - Lightning Labs)
  • RustyReddit - (Rusty Russell - Blockstream)
  • cdecker - (Christian Decker - Blockstream)
  • Dryja - (Tadge Dryja - Digital Currency Initiative)
  • josephpoon - (Joseph Poon)
  • fdrn - (Fabrice Drouin - ACINQ )
  • pmpadiou - (Pierre-Marie Padiou - ACINQ)

Lightning Network Experts on Twitter

  • @starkness - (Elizabeth Stark - Lightning Labs)
  • @roasbeef - (Olaoluwa Osuntokun - Lightning Labs)
  • @stile65 - (Alex Akselrod - Lightning Labs)
  • @bitconner - (Conner Fromknecht - Lightning Labs)
  • @johanth - (Johan Halseth - Lightning Labs)
  • @bvu - (Bryan Vu - Lightning Labs)
  • @rusty_twit - (Rusty Russell - Blockstream)
  • @snyke - (Christian Decker - Blockstream)
  • @JackMallers - (Jack Mallers - Zap)
  • @tdryja - (Tadge Dryja - Digital Currency Initiative)
  • @jcp - (Joseph Poon)
  • @alexbosworth - (Alex Bosworth - yalls.org)

Medium Posts

Learning Resources

Books

Desktop Interfaces

Web Interfaces

Tutorials and resources

Lightning on Testnet

Lightning Wallets

Place a testnet transaction

Altcoin Trading using Lightning

  • ZigZag - Disclaimer You must trust ZigZag to send to Target Address

Lightning on Mainnet

Warning - Testing should be done on Testnet

Atomic Swaps

Developer Documentation and Resources

Lightning implementations

  • LND - Lightning Network Daemon (Golang)
  • eclair - A Scala implementation of the Lightning Network (Scala)
  • c-lightning - A Lightning Network implementation in C
  • lit - Lightning Network node software (Golang)
  • lightning-onion - Onion Routed Micropayments for the Lightning Network (Golang)
  • lightning-integration - Lightning Integration Testing Framework
  • ptarmigan - C++ BOLT-Compliant Lightning Network Implementation [Incomplete]

Libraries

Lightning Network Visualizers/Explorers

Testnet

Mainnet

Payment Processors

  • BTCPay - Next stable version will include Lightning Network

Community

Slack

IRC

Slack Channel

Discord Channel

Miscellaneous

⚡ Lightning FAQs ⚡

If you can answer please PM me and include source if possible. Feel free to help keep these answers up to date and as brief but correct as possible
Is Lightning Bitcoin?
Yes. You pick a peer and after some setup, create a bitcoin transaction to fund the lightning channel; it’ll then take another transaction to close it and release your funds. You and your peer always hold a bitcoin transaction to get your funds whenever you want: just broadcast to the blockchain like normal. In other words, you and your peer create a shared account, and then use Lightning to securely negotiate who gets how much from that shared account, without waiting for the bitcoin blockchain.
Is the Lightning Network open source?
Yes, Lightning is open source. Anyone can review the code (in the same way as the bitcoin code)
Who owns and controls the Lightning Network?
Similar to the bitcoin network, no one will ever own or control the Lightning Network. The code is open source and free for anyone to download and review. Anyone can run a node and be part of the network.
I’ve heard that Lightning transactions are happening “off-chain”…Does that mean that my bitcoin will be removed from the blockchain?
No, your bitcoin will never leave the blockchain. Instead your bitcoin will be held in a multi-signature address as long as your channel stays open. When the channel is closed; the final transaction will be added to the blockchain. “Off-chain” is not a perfect term, but it is used due to the fact that the transfer of ownership is no longer reflected on the blockchain until the channel is closed.
Do I need a constant connection to run a lightning node?
Not necessarily,
Example: A and B have a channel. 1 BTC each. A sends B 0.5 BTC. B sends back 0.25 BTC. Balance should be A = 0.75, B = 1.25. If A gets disconnected, B can publish the first Tx where the balance was A = 0.5 and B = 1.5. If the node B does in fact attempt to cheat by publishing an old state (such as the A=0.5 and B=1.5 state), this cheat can then be detected on-chain and used to steal the cheaters funds, i.e., A can see the closing transaction, notice it's an old one and grab all funds in the channel (A=2, B=0). The time that A has in order to react to the cheating counterparty is given by the CheckLockTimeVerify (CLTV) in the cheating transaction, which is adjustable. So if A foresees that it'll be able to check in about once every 24 hours it'll require that the CLTV is at least that large, if it's once a week then that's fine too. You definitely do not need to be online and watching the chain 24/7, just make sure to check in once in a while before the CLTV expires. Alternatively you can outsource the watch duties, in order to keep the CLTV timeouts low. This can be achieved both with trusted third parties or untrusted ones (watchtowers). In the case of a unilateral close, e.g., you just go offline and never come back, the other endpoint will have to wait for that timeout to expire to get its funds back. So peers might not accept channels with extremely high CLTV timeouts. -- Source
What Are Lightning’s Advantages?
Tiny payments are possible: since fees are proportional to the payment amount, you can pay a fraction of a cent; accounting is even done in thousandths of a satoshi. Payments are settled instantly: the money is sent in the time it takes to cross the network to your destination and back, typically a fraction of a second.
Does Lightning require Segregated Witness?
Yes, but not in theory. You could make a poorer lightning network without it, which has higher risks when establishing channels (you might have to wait a month if things go wrong!), has limited channel lifetime, longer minimum payment expiry times on each hop, is less efficient and has less robust outsourcing. The entire spec as written today assumes segregated witness, as it solves all these problems.
Can I Send Funds From Lightning to a Normal Bitcoin Address?
No, for now. For the first version of the protocol, if you wanted to send a normal bitcoin transaction using your channel, you have to close it, send the funds, then reopen the channel (3 transactions). In future versions, you and your peer would agree to spend out of your lightning channel funds just like a normal bitcoin payment, allowing you to use your lightning wallet like a normal bitcoin wallet.
Can I Make Money Running a Lightning Node?
Not really. Anyone can set up a node, and so it’s a race to the bottom on fees. In practice, we may see the network use a nominal fee and not change very much, which only provides an incremental incentive to route on a node you’re going to use yourself, and not enough to run one merely for fees. Having clients use criteria other than fees (e.g. randomness, diversity) in route selection will also help this.
What is the release date for Lightning on Mainnet?
Lightning is already being tested on the Mainnet Twitter Link but as for a specific date, Jameson Lopp says it best
Would there be any KYC/AML issues with certain nodes?
Nope, because there is no custody ever involved. It's just like forwarding packets. -- Source
What is the delay time for the recipient of a transaction receiving confirmation?
Furthermore, the Lightning Network scales not with the transaction throughput of the underlying blockchain, but with modern data processing and latency limits - payments can be made nearly as quickly as packets can be sent. -- Source
How does the lightning network prevent centralization?
Bitcoin Stack Exchange Answer
What are Channel Factories and how do they work?
Bitcoin Stack Exchange Answer
How does the Lightning network work in simple terms?
Bitcoin Stack Exchange Answer
How are paths found in Lightning Network?
Bitcoin Stack Exchange Answer
How would the lightning network work between exchanges?
Each exchange will get to decide and need to implement the software into their system, but some ideas have been outlined here: Google Doc - Lightning Exchanges
Note that by virtue of the usual benefits of cost-less, instantaneous transactions, lightning will make arbitrage between exchanges much more efficient and thus lead to consistent pricing across exchange that adopt it. -- Source
How do lightning nodes find other lightning nodes?
Stack Exchange Answer
Does every user need to store the state of the complete Lightning Network?
According to Rusty's calculations we should be able to store 1 million nodes in about 100 MB, so that should work even for mobile phones. Beyond that we have some proposals ready to lighten the load on endpoints, but we'll cross that bridge when we get there. -- Source
Would I need to download the complete state every time I open the App and make a payment?
No you'd remember the information from the last time you started the app and only sync the differences. This is not yet implemented, but it shouldn't be too hard to get a preliminary protocol working if that turns out to be a problem. -- Source
What needs to happen for the Lightning Network to be deployed and what can I do as a user to help?
Lightning is based on participants in the network running lightning node software that enables them to interact with other nodes. This does not require being a full bitcoin node, but you will have to run "lnd", "eclair", or one of the other node softwares listed above.
All lightning wallets have node software integrated into them, because that is necessary to create payment channels and conduct payments on the network, but you can also intentionally run lnd or similar for public benefit - e.g. you can hold open payment channels or channels with higher volume, than you need for your own transactions. You would be compensated in modest fees by those who transact across your node with multi-hop payments. -- Source
Is there anyway for someone who isn't a developer to meaningfully contribute?
Sure, you can help write up educational material. You can learn and read more about the tech at http://dev.lightning.community/resources. You can test the various desktop and mobile apps out there (Lightning Desktop, Zap, Eclair apps). -- Source
Do I need to be a miner to be a Lightning Network node?
No -- Source
Do I need to run a full Bitcoin node to run a lightning node?
lit doesn't depend on having your own full node -- it automatically connects to full nodes on the network. -- Source
LND uses a light client mode, so it doesn't require a full node. The name of the light client it uses is called neutrino
How does the lightning network stop "Cheating" (Someone broadcasting an old transaction)?
Upon opening a channel, the two endpoints first agree on a reserve value, below which the channel balance may not drop. This is to make sure that both endpoints always have some skin in the game as rustyreddit puts it :-)
For a cheat to become worth it, the opponent has to be absolutely sure that you cannot retaliate against him during the timeout. So he has to make sure you never ever get network connectivity during that time. Having someone else also watching for channel closures and notifying you, or releasing a canned retaliation, makes this even harder for the attacker. This is because if he misjudged you being truly offline you can retaliate by grabbing all of its funds. Spotty connections, DDoS, and similar will not provide the attacker the necessary guarantees to make cheating worthwhile. Any form of uncertainty about your online status acts as a deterrent to the other endpoint. -- Source
How many times would someone need to open and close their lightning channels?
You typically want to have more than one channel open at any given time for redundancy's sake. And we imagine open and close will probably be automated for the most part. In fact we already have a feature in LND called autopilot that can automatically open channels for a user.
Frequency will depend whether the funds are needed on-chain or more useful on LN. -- Source
Will the lightning network reduce BTC Liquidity due to "locking-up" funds in channels?
Stack Exchange Answer
Can the Lightning Network work on any other cryptocurrency? How?
Stack Exchange Answer
When setting up a Lightning Network Node are fees set for the entire node, or each channel when opened?
You don't really set up a "node" in the sense that anyone with more than one channel can automatically be a node and route payments. Fees on LN can be set by the node, and can change dynamically on the network. -- Source
Can Lightning routing fees be changed dynamically, without closing channels?
Yes but it has to be implemented in the Lightning software being used. -- Source
How can you make sure that there will be routes with large enough balances to handle transactions?
You won't have to do anything. With autopilot enabled, it'll automatically open and close channels based on the availability of the network. -- Source
How does the Lightning Network stop flooding nodes (DDoS) with micro transactions? Is this even an issue?
Stack Exchange Answer

Unanswered Questions

How do on-chain fees work when opening and closing channels? Who pays the fee?
How does the Lightning Network work for mobile users?
What are the best practices for securing a lightning node?
What is a lightning "hub"?
How does lightning handle cross chain (Atomic) swaps?

Special Thanks and Notes

  • Many links found from awesome-lightning-network github
  • Everyone who submitted a question or concern!
  • I'm continuing to format for an easier Mobile experience!
submitted by codedaway to Bitcoin [link] [comments]

I quit btc.

TL&DR Basically rant why I don’t want to face bitcoin core supporters constant lies and I don’t want to have anything to do with bitcoin core (btc) anymore.
Bitcoin was always about sending safely digital money to anybody, anywhere and without need of central authority. It was very clearly stated in first discussions and first promoting materials, that whole idea is for it to work instantly with no fees, or very little fees and it is for everybody equally and anonymously.
Nobody was ever suggesting that bitcoin is finished product. Probably it is fair to say everybody were expecting some kind of problems and different and unforeseen circumstances that could potentially kill the project any minute and instantly. Many of users could also see potential new use cases and phenomenal possibilities for the future. Bitcoin got quickly recognised as very risky but very promising technology that could change the world. Things like that don’t happened every day.
Evolution of bitcoin was inevitable. Every aspect of bitcoin needed protection and improvement to face problems.
Oh boy, but how I’m surprised what way it all went.
Maximum blocksize was introduce by bitcoin creator as a temporary measure to mitigate problems bitcoin was vulnerable at the time. It was always suppose to be increased when needed and Bitcoin creator (Satoshi Nakamoto) even said how to do it effortlessly. That max block size was trivial temporary fix that not many at the time realised how big obstacle for bitcoin it will become. Unfortunately for all of us, Satoshi left the project, before sorting it out.
Instant transactions were removed when “replace by fee” feature and increasing transaction waiting time in mempool from, I think 3 days to 14 days, were introduced. It was done to allegedly make it easier to estimate correct fee needed to pay to get to next block. In effect though, it enabled race to the top of the fees where in order to keep up with increasing volume, it was better to increase fee above everybody else or face staying in limbo of unconfirmed transaction for two weeks or more in case some party chooses to rebroadcast transaction. What is more terrifying, transactions couldn’t be safely used as instant anymore, as a sender could potentially double spend transaction with sending funds to different than original address with higher fee and more chance to not get rejected. Instant transaction was basically killed. Now we all had to wait for confirmations, preferably 6 of them. Originally, that was only advised as extra safety measure for bigger purchases, but now thanks to rbf, it is a must. Plus fees were encouraged to go up.
Foundations for high fees were set by rbf and 1mb block size. When volume came with increasing adoption and interest from new users, fees skyrocketed to above 1000sat/byte. You could send with lower fees and get lucky, but basically fees were extremely high. Also, not every transaction is simple. This 1000sat/byte could easily result in fee on 100gbp for transaction if you were using many unspent outputs.
That killed adoption. Period. You can’t use bitcoin to send money if you have to pay transaction bigger than often value of transaction itself. Low fee or no fee aspect was killed and even vanished for a while from bitcoin.org site.
Important part is, that all of that above could have been justified. As I mentioned before, bitcoin is not finished and it is vulnerable so any changes should be tested, not rushed. I can understand that. What is more, I can not demand from bitcoin developers changes. I can propose changes myself and even show how to do it though.
But here is the tricky part. Bitcoin core developers killed all progress by censoring every discussion that was not in line with central party rhetoric. You want to talk about big blocks? Ban. You want to ask about why not? Ban. But, but… Ban. So changes can not be proposed anymore and discussed. It was possible to get ban even when taking part in discussion elsewhere or agree to something core didn’t approve and “obviously” being not in line.
Well done guys, you just created central authority that stand against everything that bitcoin was for.
How big fees were justified?
By pushing blame on users. It must be stupid to use bitcoin they said. When you using it you are taking precious resources. You are bad for bitcoin. Bitcoin is not money, it is store of value!!!
Just buy and hold. Sorry. Just buy and “hodl”. Be stupid meme reader. Than tell others to buy and hold. Create perfect ponzi. This is what bitcoin core is now being used mostly for.
Solutions proposed and introduced.
Segwit or Segregated Witness. (didn’t help)
Reorganisation of transaction record that changes the way transaction size is being counted and also fixes malleability issue. At the time of introduction it was being compared to approximately equal to increase to 1.7 mb block size. Now opinions and calculations are vary. Some give it more, but most are very confusing anyway. As misinformation is very common in bitcoin world, I leave it for everybody to check it themselves.
Segwit was mostly needed to introduce Lightning Network that required transaction malleability to be fixed. In normal bitcoin use, it wasn’t really big problem, but lightning apparently had to have it sorted this way.
Lightning network
Fascinating concept really I must admit. It is different layer working on top of bitcoin block chain. Instead of sending every transaction on chain, users were encouraged to use this so called settlement layer, where only final balancing is written on chain. In theory, when network will be big enough and everybody will connect, closing final balances will never be required or for very long time plus when something goes wrong. Lightning network is in even bigger beta than I thought and I don’t think I can say more about its technical side, but already I think it might be very interesting someday. It should not stop on chain scaling though.
My problem with Lightning network is more on idealogical level. It to much looks like trying to replicate existing banking system (I might be totally wrong on this) and there was LIE spread before introducing LN that everybody needs to run full node. It is a lie. Obvious lie.
First of all, the definition of full node has been changed. Originally full node was node that was doing all functions of node and that includes mining. Mining is now highly centralised and it has very big entry price, so normal user rather can’t run full node efficiently.
Definition has changed to call non mining nodes a full node. That implies they are important to bitcoin network. They are not. They are important for Lightning network though, as user has to be connected to it all the time via they're own node.
Not only Lightning Network is build on bitcoin chain but also on the lie and misinformation. That is very bad. Any discussion to put things straight as they are result in ban in every communication channel controlled by central authority of core devs.
Every day I come to reddit or any other social media, I see plenty of lies, usually from people that do not lie, and I am sick of it. Bitcoin is evil, bitcoin is broken, bitcoin is taken over by malicious group, that luckily forked away in August last year and is marked as btc.
Bch chain restored the original value of Bitcoin. Central authority is gone. If it happens again, we will fork away again. It is low fee or no fee system for everybody.
It is fascinating again. There is new development. Look on memo and blockpress. If you can’t see implications of this, I don’t know what to say.
Now is the time people have to choose though. Bitcoin cash has low volume. It is possible people don’t want uncensored money, social network, or network in general. Maybe they need Lambo dream and ponzi scheme? Maybe. I don’t know. But I’m off from btc and I am not coming back.
submitted by MarchewkaCzerwona to btc [link] [comments]

Long live decentralized bitcoin(!) A reading list

Newbs might not know this, but bitcoin recently came out of an intense internal drama. Between July 2015 and August 2017 bitcoin was attacked by external forces who were hoping to destroy the very properties that made bitcoin valuable in the first place. This culminated in the creation of segwit and the UASF (user activated soft fork) movement. The UASF was successful, segwit was added to bitcoin and with that the anti-decentralization side left bitcoin altogether and created their own altcoin called bcash. Bitcoin's price was $2500, soon after segwit was activated the price doubled to $5000 and continued rising until a top of $20000 before correcting to where we are today.
During this drama, I took time away from writing open source code to help educate and argue on reddit, twitter and other social media. I came up with a reading list for quickly copypasting things. It may be interesting today for newbs or anyone who wants a history lesson on what exactly happened during those two years when bitcoin's very existence as a decentralized low-trust currency was questioned. Now the fight has essentially been won, I try not to comment on reddit that much anymore. There's nothing left to do except wait for Lightning and similar tech to become mature (or better yet, help code it and test it)
In this thread you can learn about block sizes, latency, decentralization, segwit, ASICBOOST, lightning network and all the other issues that were debated endlessly for over two years. So when someone tries to get you to invest in bcash, remind them of the time they supported Bitcoin Unlimited.
For more threads like this see UASF

Summary / The fundamental tradeoff

A trip to the moon requires a rocket with multiple stages by gmaxwell (must read) https://www.reddit.com/Bitcoin/comments/438hx0/a_trip_to_the_moon_requires_a_rocket_with/
Bram Cohen, creator of bittorrent, argues against a hard fork to a larger block size https://medium.com/@bramcohen/bitcoin-s-ironic-crisis-32226a85e39f#.558vetum4
gmaxwell's summary of the debate https://bitcointalk.org/index.php?topic=1343716.msg13701818#msg13701818
Core devs please explain your vision (see luke's post which also argues that blocks are already too big) https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/
Mod of btc speaking against a hard fork https://www.reddit.com/btc/comments/57hd14/core_reaction_to_viabtc_this_week/d8scokm/
It's becoming clear to me that a lot of people don't understand how fragile bitcoin is https://www.reddit.com/Bitcoin/comments/59kflj/its_becoming_clear_to_me_that_a_lot_of_people/
Blockchain space must be costly, it can never be free https://www.reddit.com/Bitcoin/comments/4og24h/i_just_attended_the_distributed_trade_conference/
Charlie Lee with a nice analogy about the fundamental tradeoff https://medium.com/@SatoshiLite/eating-the-bitcoin-cake-fc2b4ebfb85e#.444vr8shw
gmaxwell on the tradeoffs https://bitcointalk.org/index.php?topic=1520693.msg15303746#msg15303746
jratcliff on the layering https://www.reddit.com/btc/comments/59upyh/segwit_the_poison_pill_for_bitcoin/d9bstuw/

Scaling on-chain will destroy bitcoin's decentralization

Peter Todd: How a floating blocksize limit inevitably leads towards centralization [Feb 2013] https://bitcointalk.org/index.php?topic=144895.0 mailing list https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2013-February/002176.html with discussion on reddit in Aug 2015 https://www.reddit.com/Bitcoin/comments/3hnvi8/just_a_little_history_lesson_for_everyone_new_the/
Nick Szabo's blog post on what makes bitcoin so special http://unenumerated.blogspot.com/2017/02/money-blockchains-and-social-scalability.html
There is academic research showing that even small (2MB) increases to the blocksize results in drastic node dropoff counts due to the non-linear increase of RAM needed. http://bravenewcoin.com/assets/Whitepapers/block-size-1.1.1.pdf
Reddit summary of above link. In this table, you can see it estimates a 40% drop immediately in node count with a 2MB upgrade and a 50% over 6 months. At 4mb, it becomes 75% immediately and 80% over 6 months. At 8, it becomes 90% and 95%. https://www.reddit.com/Bitcoin/comments/5qw2wa_future_led_by_bitcoin_unlimited_is_a/dd442pw/
Larger block sizes make centralization pressures worse (mathematical) https://petertodd.org/2016/block-publication-incentives-for-miners
Talk at scalingbitcoin montreal, initial blockchain synchronization puts serious constraints on any increase in the block size https://www.youtube.com/watch?v=TgjrS-BPWDQ&t=2h02m06s with transcript https://scalingbitcoin.org/transcript/montreal2015/block-synchronization-time
Bitcoin's P2P Network: The Soft Underbelly of Bitcoin https://www.youtube.com/watch?v=Y6kibPzbrIc someone's notes: https://gist.github.com/romyilano/5e22394857a39889a1e5 reddit discussion https://www.reddit.com/Bitcoin/comments/4py5df/so_f2pool_antpool_btcc_pool_are_actually_one_pool/
In adversarial environments blockchains dont scale https://scalingbitcoin.org/transcript/hongkong2015/in-adversarial-environments-blockchains-dont-scale
Why miners will not voluntarily individually produce smaller blocks https://scalingbitcoin.org/transcript/hongkong2015/why-miners-will-not-voluntarily-individually-produce-smaller-blocks
Hal Finney: bitcoin's blockchain can only be a settlement layer (mostly interesting because it's hal finney and its in 2010) https://www.reddit.com/Bitcoin/comments/3sb5nj/most_bitcoin_transactions_will_occur_between/
petertodd's 2013 video explaining this https://www.youtube.com/watch?v=cZp7UGgBR0I
luke-jr's summary https://www.reddit.com/Bitcoin/comments/61yvvv/request_to_core_devs_please_explain_your_vision/dficjhj/
Another jratcliff thread https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/

Full blocks are not a disaster

Blocks must be always full, there must always be a backlog https://medium.com/@bergealex4/bitcoin-is-unstable-without-the-block-size-size-limit-70db07070a54#.kh2vi86lr
Same as above, the mining gap means there must always be a backlog talk: https://www.youtube.com/watch?time_continue=2453&v=iKDC2DpzNbw transcript: https://scalingbitcoin.org/transcript/montreal2015/security-of-diminishing-block-subsidy
Backlogs arent that bad https://www.reddit.com/Bitcoin/comments/49p011/was_the_fee_event_really_so_bad_my_mind_is/
Examples where scarce block space causes people to use precious resources more efficiently https://www.reddit.com/Bitcoin/comments/4kxxvj/i_just_singlehandedly_increased_bitcoin_network/
https://www.reddit.com/Bitcoin/comments/47d4m2/why_does_coinbase_make_2_transactions_pe
https://www.reddit.com/Bitcoin/comments/53wucs/why_arent_blocks_full_yet/d7x19iv
Full blocks are fine https://www.reddit.com/Bitcoin/comments/5uld1a/misconception_full_blocks_mean_bitcoin_is_failing/
High miner fees imply a sustainable future for bitcoin https://www.reddit.com/BitcoinMarkets/comments/680tvf/fundamentals_friday_week_of_friday_april_28_2017/dgwmhl7/
gmaxwell on why full blocks are good https://www.reddit.com/Bitcoin/comments/6b57ca/full_blocks_good_or_bad/dhjxwbz/
The whole idea of the mempool being "filled" is wrong headed. The mempool doesn't "clog" or get stuck, or anything like that. https://www.reddit.com/Bitcoin/comments/7cusnx/to_the_people_still_doubting_that_this_congestion/dpssokf/

Segwit

What is segwit

luke-jr's longer summary https://www.reddit.com/Bitcoin/comments/6033h7/today_is_exactly_4_months_since_the_segwit_voting/df3tgwg/?context=1
Charlie Shrem's on upgrading to segwit https://twitter.com/CharlieShrem/status/842711238853513220
Original segwit talk at scalingbitcoin hong kong + transcript https://youtu.be/zchzn7aPQjI?t=110
https://scalingbitcoin.org/transcript/hongkong2015/segregated-witness-and-its-impact-on-scalability
Segwit is not too complex https://www.reddit.com/btc/comments/57vjin/segwit_is_not_great/d8vos33/
Segwit does not make it possible for miners to steal coins, contrary to what some people say https://www.reddit.com/btc/comments/5e6bt0/concerns_with_segwit_and_anyone_can_spend/daa5jat/?context=1
https://keepingstock.net/segwit-eli5-misinformation-faq-19908ceacf23#.r8hlzaquz
Segwit is required for a useful lightning network It's now known that without a malleability fix useful indefinite channels are not really possible.
https://www.reddit.com/Bitcoin/comments/5tzqtc/gentle_reminder_the_ln_doesnt_require_segwit/ddqgda7/
https://www.reddit.com/Bitcoin/comments/5tzqtc/gentle_reminder_the_ln_doesnt_require_segwit/ddqbukj/
https://www.reddit.com/Bitcoin/comments/5x2oh0/olaoluwa_osuntokun_all_active_lightning_network/deeto14/?context=3
Clearing up SegWit Lies and Myths: https://achow101.com/2016/04/Segwit-FUD-Clearup
Segwit is bigger blocks https://www.reddit.com/Bitcoin/comments/5pb8vs/misinformation_is_working_54_incorrectly_believe/dcpz3en/
Typical usage results in segwit allowing capacity equivalent to 2mb blocks https://www.reddit.com/Bitcoin/comments/69i2md/observe_for_yourself_segwit_allows_2_mb_blocks_in/

Why is segwit being blocked

Jihan Wu (head of largest bitcoin mining group) is blocking segwit because of perceived loss of income https://www.reddit.com/Bitcoin/comments/60mb9e/complete_high_quality_translation_of_jihans/
Witness discount creates aligned incentives https://segwit.org/why-a-discount-factor-of-4-why-not-2-or-8-bbcebe91721e#.h36odthq0 https://medium.com/@SegWit.co/what-is-behind-the-segwit-discount-988f29dc1edf#.sr91dg406
or because he wants his mining enterprise to have control over bitcoin https://www.reddit.com/Bitcoin/comments/6jdyk8/direct_report_of_jihan_wus_real_reason_fo

Segwit is being blocked because it breaks ASICBOOST, a patented optimization used by bitmain ASIC manufacturer

Details and discovery by gmaxwell https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-April/013996.html
Reddit thread with discussion https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/
Simplified explaination by jonny1000 https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/
http://www.mit.edu/~jlrubin/public/pdfs/Asicboost.pdf
https://medium.com/@jimmysong/examining-bitmains-claims-about-asicboost-1d61118c678d
Evidence https://www.reddit.com/Bitcoin/comments/63yo27/some_circumstantial_evidence_supporting_the_claim/
https://www.reddit.com/Bitcoin/comments/63vn5g/please_dont_stop_us_from_using_asicboost_which/dfxmm75/
https://www.reddit.com/Bitcoin/comments/63soe3/reverse_engineering_an_asic_is_a_significant_task/dfx9nc
Bitmain admits their chips have asicboost but they say they never used it on the network (haha a likely story) https://blog.bitmain.com/en/regarding-recent-allegations-smear-campaigns/
Worth $100m per year to them (also in gmaxwell's original email) https://twitter.com/petertoddbtc/status/849798529929424898
Other calculations show less https://medium.com/@vcorem/the-real-savings-from-asicboost-to-bitmaintech-ff265c2d305b
This also blocks all these other cool updates, not just segwit https://www.reddit.com/Bitcoin/comments/63otrp/gregory_maxwell_major_asic_manufacturer_is/dfw0ej3/
Summary of bad consequences of asicboost https://www.reddit.com/Bitcoin/comments/64qq5g/attempted_explanation_of_the_alleged_asicboost/dg4hyqk/?context=1
Luke's summary of the entire situation https://www.reddit.com/Bitcoin/comments/6ego3s/why_is_killing_asicboost_not_a_priority/diagkkb/?context=1
Prices goes up because now segwit looks more likely https://twitter.com/TuurDemeestestatus/849846845425799168
Asicboost discovery made the price rise https://twitter.com/TuurDemeestestatus/851520094677200901
A pool was caught red handed doing asicboost, by this time it seemed fairly certain that segwit would get activated so it didnt produce as much interest as earlier https://www.reddit.com/Bitcoin/comments/6p7lr5/1hash_pool_has_mined_2_invalid_blocks/ and https://www.reddit.com/Bitcoin/comments/6p95dl/interesting_1hash_pool_mined_some_invalid_blocks/ and https://twitter.com/petertoddbtc/status/889475196322811904
This btc user is outraged at the entire forum because they support Bitmain and ASICBOOST https://www.reddit.com/btc/comments/67t43y/dragons_den_planned_smear_campaign_of_bitmain/dgtg9l2/
Antbleed, turns out Bitmain can shut down all its ASICs by remote control: http://www.antbleed.com/

What if segwit never activates

What if segwit never activates? https://www.reddit.com/Bitcoin/comments/6ab8js/transaction_fees_are_now_making_btc_like_the_banks/dhdq3id/ with https://www.reddit.com/Bitcoin/comments/5ksu3o/blinded_bearer_certificates/ and https://www.reddit.com/Bitcoin/comments/4xy0fm/scaling_quickly/

Lightning

bitcoinmagazine's series on what lightning is and how it works https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-building-a-bidirectional-payment-channel-1464710791/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-creating-the-network-1465326903/ https://bitcoinmagazine.com/articles/understanding-the-lightning-network-part-completing-the-puzzle-and-closing-the-channel-1466178980/
The Lightning Network ELIDHDICACS (Explain Like I Don’t Have Degrees in Cryptography and Computer Science) https://letstalkbitcoin.com/blog/post/the-lightning-network-elidhdicacs
Ligtning will increases fees for miners, not lower them https://medium.com/lightning-resources/the-lightning-paradox-f15ce0e8e374#.erfgunumh
Cost-benefit analysis of lightning from the point of view of miners https://medium.com/@rusty_lightning/miners-and-bitcoin-lightning-a133cd550310#.x42rovlg8
Routing blog post by rusty https://medium.com/@rusty_lightning/routing-dijkstra-bellman-ford-and-bfg-7715840f004 and reddit comments https://www.reddit.com/Bitcoin/comments/4lzkz1/rusty_russell_on_lightning_routing_routing/
Lightning protocol rfc https://github.com/lightningnetwork/lightning-rfc
Blog post with screenshots of ln being used on testnet https://medium.com/@btc_coach/lightning-network-in-action-b18a035c955d video https://www.youtube.com/watch?v=mxGiMu4V7ns
Video of sending and receiving ln on testnet https://twitter.com/alexbosworth/status/844030573131706368
Lightning tradeoffs http://www.coindesk.com/lightning-technical-challenges-bitcoin-scalability/
Beer sold for testnet lightning https://www.reddit.com/Bitcoin/comments/62uw23/lightning_network_is_working_room77_is_accepting/ and https://twitter.com/MrHodl/status/848265171269283845
Lightning will result in far fewer coins being stored on third parties because it supports instant transactions https://medium.com/@thecryptoconomy/the-barely-discussed-incredible-benefit-of-the-lightning-network-4ce82c75eb58
jgarzik argues strongly against LN, he owns a coin tracking startup https://twitter.com/petertoddbtc/status/860826532650123264 https://twitter.com/Beautyon_/status/886128801926795264
luke's great debunking / answer of some misinformation questions https://www.reddit.com/Bitcoin/comments/6st4eq/questions_about_lightning_network/dlfap0u/
Lightning centralization doesnt happen https://www.reddit.com/Bitcoin/comments/6vzau5/reminder_bitcoins_key_strength_is_in_being/dm4ou3v/?context=1
roasbeef on hubs and charging fees https://twitter.com/roasbeef/status/930209165728825344 and https://twitter.com/roasbeef/status/930210145790976000

Immutability / Being a swiss bank in your pocket / Why doing a hard fork (especially without consensus) is damaging

A downside of hard forks is damaging bitcoin's immutability https://www.reddit.com/Bitcoin/comments/5em6vu/what_happens_if_segwit_doesnt_activate/dae1r6c/?context=3
Interesting analysis of miners incentives and how failure is possible, don't trust the miners for long term https://www.reddit.com/Bitcoin/comments/5gtew4/why_an_increased_block_size_increases_the_cost_of/daybazj/?context=2
waxwing on the meaning of cash and settlement https://www.reddit.com/Bitcoin/comments/5ei7m3/unconfirmed_transactions_60k_total_fees_14btc/dad001v/
maaku on the cash question https://www.reddit.com/Bitcoin/comments/5i5iq5/we_are_spoiled/db5luiv/?context=1
Digital gold funamentalists gain nothing from supporting a hard fork to larger block sizes https://www.reddit.com/Bitcoin/comments/5xzunq/core_please_compromise_before_we_end_up_with_bu/dem73xg/?context=1
Those asking for a compromise don't understand the underlying political forces https://www.reddit.com/Bitcoin/comments/6ef7wb/some_comments_on_the_bip148_uasf_from_the/dia236b/?context=3
Nobody wants a contentious hard fork actually, anti-core people got emotionally manipulated https://www.reddit.com/Bitcoin/comments/5sq5ocontentious_forks_vs_incremental_progress/ddip57o/
The hard work of the core developers has kept bitcoin scalable https://www.reddit.com/Bitcoin/comments/3hfgpo/an_initiative_to_bring_advanced_privacy_features/cu7mhw8?context=9
Recent PRs to improve bitcoin scaleability ignored by the debate https://twitter.com/jfnewbery/status/883001356168167425
gmaxwell against hard forks since 2013 https://bitcointalk.org/index.php?topic=140233.20
maaku: hard forks are really bad https://www.reddit.com/Bitcoin/comments/5zxjza/adam_greg_core_devs_and_big_blockers_now_is_the/df275yk/?context=2

Some metrics on what the market thinks of decentralization and hostile hard forks

The price history shows that the exchange rate drops every time a hard fork threatens: https://i.imgur.com/EVPYLR8.jpg
and this example from 2017 https://twitter.com/WhalePanda/status/845562763820912642
http://imgur.com/a/DuHAn btc users lose money
price supporting theymos' moderation https://i.imgur.com/0jZdF9h.png
old version https://i.imgur.com/BFTxTJl.png
older version https://pbs.twimg.com/media/CxqtUakUQAEmC0d.jpg
about 50% of nodes updated to the soft fork node quite quickly https://imgur.com/O0xboVI

Bitcoin Unlimited / Emergent Consensus is badly designed, changes the game theory of bitcoin

Bitcoin Unlimited was a proposed hard fork client, it was made with the intention to stop segwit from activating
A Future Led by Bitcoin Unlimited is a Centralized Future https://blog.sia.tech/a-future-led-by-bitcoin-unlimited-is-a-centralized-future-e48ab52c817a#.p1ly6hldk
Flexible transactions are bugged https://www.reddit.com/Bitcoin/comments/57tf5g/bitcoindev_bluematt_on_flexible_transactions/
Bugged BU software mines an invalid block, wasting 13 bitcoins or $12k
https://www.reddit.com/Bitcoin/comments/5qwtr2/bitcoincom_loses_132btc_trying_to_fork_the/
https://www.reddit.com/btc/comments/5qx18i/bitcoincom_loses_132btc_trying_to_fork_the/
bitcoin.com employees are moderators of btc https://medium.com/@WhalePanda/the-curious-relation-between-bitcoin-com-anti-segwit-propaganda-26c877249976#.vl02566k4
miners don't control stuff like the block size http://hackingdistributed.com/2016/01/03/time-for-bitcoin-user-voice/
even gavin agreed that economic majority controls things https://www.reddit.com/Bitcoin/comments/5ywoi9/in_2010_gavin_predicted_that_exchanges_ie_the/
fork clients are trying to steal bitcoin's brand and network effect, theyre no different from altcoins https://medium.com/@Coinosphere/why-bitcoin-unlimited-should-be-correctly-classified-as-an-attempted-robbery-of-bitcoin-not-a-9355d075763c#.qeaynlx5m
BU being active makes it easier to reverse payments, increases wasted work making the network less secure and giving an advantage to bigger miners https://www.reddit.com/Bitcoin/comments/5g1x84/bitcoin_unlimited_bu_median_value_of_miner_eb/
bitcoin unlimited takes power away from users and gives it to miners https://medium.com/@alpalpalp/bitcoin-unlimiteds-placebo-controls-6320cbc137d4#.q0dv15gd5
bitcoin unlimited's accepted depth https://twitter.com/tdryja/status/804770009272696832
BU's lying propaganda poster https://imgur.com/osrViDE

BU is bugged, poorly-reviewed and crashes

bitcoin unlimited allegedly funded by kraken stolen coins
https://www.reddit.com/btc/comments/55ajuh/taint_analysis_on_bitcoin_stolen_from_kraken_on/
https://www.reddit.com/btc/comments/559miz/taint_analysis_on_btc_allegedly_stolen_from_kraken/
Other funding stuff
https://www.reddit.com/Bitcoin/comments/5zozmn/damning_evidence_on_how_bitcoin_unlimited_pays/
A serious bug in BU https://www.reddit.com/Bitcoin/comments/5h70s3/bitcoin_unlimited_bu_the_developers_have_realized/
A summary of what's wrong with BU: https://www.reddit.com/Bitcoin/comments/5z3wg2/jihanwu_we_will_switch_the_entire_pool_to/devak98/

Bitcoin Unlimited Remote Exploit Crash 14/3/2017

https://www.reddit.com/Bitcoin/comments/5zdkv3/bitcoin_unlimited_remote_exploit_crash/ https://www.reddit.com/Bitcoin/comments/5zeb76/timbe https://www.reddit.com/btc/comments/5zdrru/peter_todd_bu_remote_crash_dos_wtf_bug_assert0_in/
BU devs calling it as disaster https://twitter.com/SooMartindale/status/841758265188966401 also btc deleted a thread about the exploit https://i.imgur.com/lVvFRqN.png
Summary of incident https://www.reddit.com/Bitcoin/comments/5zf97j/i_was_undecided_now_im_not/
More than 20 exchanges will list BTU as an altcoin
https://www.reddit.com/Bitcoin/comments/5zyg6g/bitcoin_exchanges_unveil_emergency_hard_fork/
Again a few days later https://www.reddit.com/Bitcoin/comments/60qmkt/bu_is_taking_another_shit_timberrrrr

User Activated Soft Fork (UASF)

site for it, including list of businesses supporting it http://www.uasf.co/
luke's view
https://www.reddit.com/Bitcoin/comments/5zsk45/i_am_shaolinfry_author_of_the_recent_usedf1dqen/?context=3
threat of UASF makes the miner fall into line in litecoin
https://www.reddit.com/litecoin/comments/66omhlitecoin_global_roundtable_resolution/dgk2thk/?context=3
UASF delivers the goods for vertcoin
https://www.reddit.com/Bitcoin/comments/692mi3/in_test_case_uasf_results_in_miner_consensus/dh3cm34/?context=1
UASF coin is more valuable https://www.reddit.com/Bitcoin/comments/6cgv44/a_uasf_chain_will_be_profoundly_more_valuable/
All the links together in one place https://www.reddit.com/Bitcoin/comments/6dzpew/hi_its_mkwia_again_maintainer_of_uasfbitcoin_on/
p2sh was a uasf https://github.com/bitcoin/bitcoin/blob/v0.6.0/src/main.cpp#L1281-L1283
jgarzik annoyed at the strict timeline that segwit2x has to follow because of bip148 https://twitter.com/jgarzik/status/886605836902162432
Committed intolerant minority https://www.reddit.com/Bitcoin/comments/6d7dyt/a_plea_for_rational_intolerance_extremism_and/
alp on the game theory of the intolerant minority https://medium.com/@alpalpalp/user-activated-soft-forks-and-the-intolerant-minority-a54e57869f57
The risk of UASF is less than the cost of doing nothing https://www.reddit.com/Bitcoin/comments/6bof7a/were_getting_to_the_point_where_a_the_cost_of_not/
uasf delivered the goods for bitcoin, it forced antpool and others to signal (May 2016) https://bitcoinmagazine.com/articles/antpool-will-not-run-segwit-without-block-size-increase-hard-fork-1464028753/ "When asked specifically whether Antpool would run SegWit code without a hard fork increase in the block size also included in a release of Bitcoin Core, Wu responded: “No. It is acceptable that the hard fork code is not activated, but it needs to be included in a ‘release’ of Bitcoin Core. I have made it clear about the definition of ‘release,’ which is not ‘public.’”"
Screenshot of peter rizun capitulating https://twitter.com/chris_belcher_/status/905231603991007232

Fighting off 2x HF

https://twitter.com/MrHodl/status/895089909723049984
https://www.reddit.com/Bitcoin/comments/6h612o/can_someone_explain_to_me_why_core_wont_endorse/?st=j6ic5n17&sh=cc37ee23
https://www.reddit.com/Bitcoin/comments/6smezz/segwit2x_hard_fork_is_completely_useless_its_a/?st=j6ic2aw3&sh=371418dd
https://www.reddit.com/Bitcoin/comments/6sbspv/who_exactly_is_segwit2x_catering_for_now_segwit/?st=j6ic5nic&sh=1f86cadd
https://medium.com/@elliotolds/lesser-known-reasons-to-keep-blocks-small-in-the-words-of-bitcoin-core-developers-44861968185e
b2x is most of all about firing core https://twitter.com/WhalePanda/status/912664487135760384
https://medium.com/@StopAndDecrypt/thats-not-bitcoin-this-is-bitcoin-95f05a6fd6c2

Misinformation / sockpuppets

https://www.reddit.com/Bitcoin/comments/6uqz6k/markets_update_bitcoin_cash_rallies_for_three/dlurbpx/
three year old account, only started posting today https://archive.is/3STjH
Why we should not hard fork after the UASF worked: https://www.reddit.com/Bitcoin/comments/6sl1qf/heres_why_we_should_not_hard_fork_in_a_few_months/

History

Good article that covers virtually all the important history https://bitcoinmagazine.com/articles/long-road-segwit-how-bitcoins-biggest-protocol-upgrade-became-reality/
Interesting post with some history pre-2015 https://btcmanager.com/the-long-history-of-the-fight-over-scaling-bitcoin/
The core scalabality roadmap + my summary from 3/2017 https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-Decembe011865.html my summary https://www.reddit.com/Bitcoin/comments/5xa5fa/the_core_development_scalability_roadmap/
History from summer 2015 https://www.reddit.com/Bitcoin/comments/5xg7f8/the_origins_of_the_blocksize_debate/
Brief reminders of the ETC situation https://www.reddit.com/Bitcoin/comments/6nvlgo/simple_breakdown_of_bip91_its_simply_the_miners/dkcycrz/
Longer writeup of ethereum's TheDAO bailout fraud https://www.reddit.com/ethereumfraud/comments/6bgvqv/faq_what_exactly_is_the_fraud_in_ethereum/
Point that the bigblocker side is only blocking segwit as a hostage https://www.reddit.com/BitcoinMarkets/comments/5sqhcq/daily_discussion_wednesday_february_08_2017/ddi3ctv/?context=3
jonny1000's recall of the history of bitcoin https://www.reddit.com/Bitcoin/comments/6s34gg/rbtc_spreading_misinformation_in_rbitcoinmarkets/dl9wkfx/

Misc (mostly memes)

libbitcoin's Understanding Bitcoin series (another must read, most of it) https://github.com/libbitcoin/libbitcoin/wiki/Understanding-Bitcoin
github commit where satoshi added the block size limit https://www.reddit.com/Bitcoin/comments/63859l/github_commit_where_satoshi_added_the_block_size/
hard fork proposals from some core devs https://bitcoinhardforkresearch.github.io/
blockstream hasnt taken over the entire bitcoin core project https://www.reddit.com/Bitcoin/comments/622bjp/bitcoin_core_blockstream/
blockstream is one of the good guys https://www.reddit.com/Bitcoin/comments/6cttkh/its_happening_blockstream_opens_liquid_sidechain/dhxu4e
Forkers, we're not raising a single byte! Song lyrics by belcher https://gist.github.com/chris-belche7264cd6750a86f8b4a9a
Some stuff here along with that cool photoshopped poster https://medium.com/@jimmysong/bitcoin-realism-or-how-i-learned-to-stop-worrying-and-love-1mb-blocks-c191c35e74cb
Nice graphic https://twitter.com/RNR_0/status/871070843698380800
gmaxwell saying how he is probably responsible for the most privacy tech in bitcoin, while mike hearn screwed up privacy https://www.reddit.com/btc/comments/6azyme/hey_bu_wheres_your_testnet/dhiq3xo/?context=6
Fairly cool propaganda poster https://twitter.com/urbanarson/status/880476631583924225
btc tankman https://i.redd.it/gxjqenzpr27z.png https://twitter.com/DanDarkPill/status/853653168151986177
asicboost discovery meme https://twitter.com/allenscottoshi/status/849888189124947971
https://twitter.com/urbanarson/status/882020516521013250
gavin wanted to kill the bitcoin chain https://twitter.com/allenscottoshi/status/849888189124947971
stuff that btc believes https://www.reddit.com/Bitcoin/comments/6ld4a5/serious_is_the_rbtc_and_the_bu_crowd_a_joke_how/djszsqu/
after segwit2x NYA got agreed all the fee pressure disappeared, laurenmt found they were artificial spam https://twitter.com/i/moments/885827802775396352
theymos saying why victory isnt inevitable https://www.reddit.com/Bitcoin/comments/6lmpll/explaining_why_big_blocks_are_bad/djvxv2o/
with ignorant enemies like these its no wonder we won https://bitco.in/forum/threads/gold-collapsing-bitcoin-up.16/page-999 ""So, once segwit2x activates, from that moment on it will require a coordinated fork to avoid the up coming "baked in" HF. ""
a positive effect of bcash, it made blockchain utxo spammers move away from bitcoin https://www.reddit.com/btc/comments/76lv0b/cryptograffitiinfo_now_accepts_bitcoin_cash/dof38gw/
summary of craig wright, jihan wu and roger ver's positions https://medium.com/@HjalmarPeters/the-big-blockers-bead6027deb2
Why is bitcoin so strong against attack?!?! (because we're motivated and awesome) https://www.reddit.com/btc/comments/64wo1h/bitcoin_unlimited_is_being_blocked_by_antivirus/dg5n00x/
what happened to #oldjeffgarzik https://www.reddit.com/Bitcoin/comments/6ufv5x/a_reminder_of_some_of_jeff_garziks_greatest/
big blockers fully deserve to lose every last bitcoin they ever had and more https://www.reddit.com/BitcoinMarkets/comments/756nxf/daily_discussion_monday_october_09_2017/do5ihqi/
gavinandresen brainstorming how to kill bitcoin with a 51% in a nasty way https://twitter.com/btcdrak/status/843914877542567937
Roger Ver as bitcoin Judas https://imgur.com/a/Rf1Pi
A bunch of tweets and memes celebrating UASF
https://twitter.com/shaolinfry/status/842457019286188032 | https://twitter.com/SatoshiLite/status/888335092560441345 | https://twitter.com/btcArtGallery/status/887485162925285377 | https://twitter.com/Beautyon_/status/888109901611802624 | https://twitter.com/Excellion/status/889211512966873088 | https://twitter.com/lopp/status/888200452197801984 | https://twitter.com/AlpacaSW/status/886988980524396544 | https://twitter.com/BashCo_/status/877253729531162624 | https://twitter.com/tdryja/status/865212300361379840 | https://twitter.com/Excellion/status/871179040157179904 | https://twitter.com/TraceMayestatus/849856343074902016 | https://twitter.com/TraceMayestatus/841855022640033792 | https://fs.bitcoinmagazine.com/img/images/Screen_Shot_2017-08-18_at_01.36.47.original.png
submitted by belcher_ to Bitcoin [link] [comments]

Merkle Trees and Mountain Ranges - Making UTXO Set Growth Irrelevant With Low-Latency Delayed TXO Commitments

Original link: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-May/012715.html
Unedited text and originally written by:

Peter Todd pete at petertodd.org
Tue May 17 13:23:11 UTC 2016
Previous message: [bitcoin-dev] Bip44 extension for P2SH/P2WSH/...
Next message: [bitcoin-dev] Making UTXO Set Growth Irrelevant With Low-Latency Delayed TXO Commitments
Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
# Motivation

UTXO growth is a serious concern for Bitcoin's long-term decentralization. To
run a competitive mining operation potentially the entire UTXO set must be in
RAM to achieve competitive latency; your larger, more centralized, competitors
will have the UTXO set in RAM. Mining is a zero-sum game, so the extra latency
of not doing so if they do directly impacts your profit margin. Secondly,
having possession of the UTXO set is one of the minimum requirements to run a
full node; the larger the set the harder it is to run a full node.

Currently the maximum size of the UTXO set is unbounded as there is no
consensus rule that limits growth, other than the block-size limit itself; as
of writing the UTXO set is 1.3GB in the on-disk, compressed serialization,
which expands to significantly more in memory. UTXO growth is driven by a
number of factors, including the fact that there is little incentive to merge
inputs, lost coins, dust outputs that can't be economically spent, and
non-btc-value-transfer "blockchain" use-cases such as anti-replay oracles and
timestamping.

We don't have good tools to combat UTXO growth. Segregated Witness proposes to
give witness space a 75% discount, in part of make reducing the UTXO set size
by spending txouts cheaper. While this may change wallets to more often spend
dust, it's hard to imagine an incentive sufficiently strong to discourage most,
let alone all, UTXO growing behavior.

For example, timestamping applications often create unspendable outputs due to
ease of implementation, and because doing so is an easy way to make sure that
the data required to reconstruct the timestamp proof won't get lost - all
Bitcoin full nodes are forced to keep a copy of it. Similarly anti-replay
use-cases like using the UTXO set for key rotation piggyback on the uniquely
strong security and decentralization guarantee that Bitcoin provides; it's very
difficult - perhaps impossible - to provide these applications with
alternatives that are equally secure. These non-btc-value-transfer use-cases
can often afford to pay far higher fees per UTXO created than competing
btc-value-transfer use-cases; many users could afford to spend $50 to register
a new PGP key, yet would rather not spend $50 in fees to create a standard two
output transaction. Effective techniques to resist miner censorship exist, so
without resorting to whitelists blocking non-btc-value-transfer use-cases as
"spam" is not a long-term, incentive compatible, solution.

A hard upper limit on UTXO set size could create a more level playing field in
the form of fixed minimum requirements to run a performant Bitcoin node, and
make the issue of UTXO "spam" less important. However, making any coins
unspendable, regardless of age or value, is a politically untenable economic
change.


# TXO Commitments

A merkle tree committing to the state of all transaction outputs, both spent
and unspent, we can provide a method of compactly proving the current state of
an output. This lets us "archive" less frequently accessed parts of the UTXO
set, allowing full nodes to discard the associated data, still providing a
mechanism to spend those archived outputs by proving to those nodes that the
outputs are in fact unspent.

Specifically TXO commitments proposes a Merkle Mountain Range¹ (MMR), a
type of deterministic, indexable, insertion ordered merkle tree, which allows
new items to be cheaply appended to the tree with minimal storage requirements,
just log2(n) "mountain tips". Once an output is added to the TXO MMR it is
never removed; if an output is spent its status is updated in place. Both the
state of a specific item in the MMR, as well the validity of changes to items
in the MMR, can be proven with log2(n) sized proofs consisting of a merkle path
to the tip of the tree.

At an extreme, with TXO commitments we could even have no UTXO set at all,
entirely eliminating the UTXO growth problem. Transactions would simply be
accompanied by TXO commitment proofs showing that the outputs they wanted to
spend were still unspent; nodes could update the state of the TXO MMR purely
from TXO commitment proofs. However, the log2(n) bandwidth overhead per txin is
substantial, so a more realistic implementation is be to have a UTXO cache for
recent transactions, with TXO commitments acting as a alternate for the (rare)
event that an old txout needs to be spent.

Proofs can be generated and added to transactions without the involvement of
the signers, even after the fact; there's no need for the proof itself to
signed and the proof is not part of the transaction hash. Anyone with access to
TXO MMR data can (re)generate missing proofs, so minimal, if any, changes are
required to wallet software to make use of TXO commitments.


## Delayed Commitments

TXO commitments aren't a new idea - the author proposed them years ago in
response to UTXO commitments. However it's critical for small miners' orphan
rates that block validation be fast, and so far it has proven difficult to
create (U)TXO implementations with acceptable performance; updating and
recalculating cryptographicly hashed merkelized datasets is inherently more
work than not doing so. Fortunately if we maintain a UTXO set for recent
outputs, TXO commitments are only needed when spending old, archived, outputs.
We can take advantage of this by delaying the commitment, allowing it to be
calculated well in advance of it actually being used, thus changing a
latency-critical task into a much easier average throughput problem.

Concretely each block B_i commits to the TXO set state as of block B_{i-n}, in
other words what the TXO commitment would have been n blocks ago, if not for
the n block delay. Since that commitment only depends on the contents of the
blockchain up until block B_{i-n}, the contents of any block after are
irrelevant to the calculation.


## Implementation

Our proposed high-performance/low-latency delayed commitment full-node
implementation needs to store the following data:

1) UTXO set

Low-latency K:V map of txouts definitely known to be unspent. Similar to
existing UTXO implementation, but with the key difference that old,
unspent, outputs may be pruned from the UTXO set.


2) STXO set

Low-latency set of transaction outputs known to have been spent by
transactions after the most recent TXO commitment, but created prior to the
TXO commitment.


3) TXO journal

FIFO of outputs that need to be marked as spent in the TXO MMR. Appends
must be low-latency; removals can be high-latency.


4) TXO MMR list

Prunable, ordered list of TXO MMR's, mainly the highest pending commitment,
backed by a reference counted, cryptographically hashed object store
indexed by digest (similar to how git repos work). High-latency ok. We'll
cover this in more in detail later.


### Fast-Path: Verifying a Txout Spend In a Block

When a transaction output is spent by a transaction in a block we have two
cases:

1) Recently created output

Output created after the most recent TXO commitment, so it should be in the
UTXO set; the transaction spending it does not need a TXO commitment proof.
Remove the output from the UTXO set and append it to the TXO journal.

2) Archived output

Output created prior to the most recent TXO commitment, so there's no
guarantee it's in the UTXO set; transaction will have a TXO commitment
proof for the most recent TXO commitment showing that it was unspent.
Check that the output isn't already in the STXO set (double-spent), and if
not add it. Append the output and TXO commitment proof to the TXO journal.

In both cases recording an output as spent requires no more than two key:value
updates, and one journal append. The existing UTXO set requires one key:value
update per spend, so we can expect new block validation latency to be within 2x
of the status quo even in the worst case of 100% archived output spends.


### Slow-Path: Calculating Pending TXO Commitments

In a low-priority background task we flush the TXO journal, recording the
outputs spent by each block in the TXO MMR, and hashing MMR data to obtain the
TXO commitment digest. Additionally this background task removes STXO's that
have been recorded in TXO commitments, and prunes TXO commitment data no longer
needed.

Throughput for the TXO commitment calculation will be worse than the existing
UTXO only scheme. This impacts bulk verification, e.g. initial block download.
That said, TXO commitments provides other possible tradeoffs that can mitigate
impact of slower validation throughput, such as skipping validation of old
history, as well as fraud proof approaches.


### TXO MMR Implementation Details

Each TXO MMR state is a modification of the previous one with most information
shared, so we an space-efficiently store a large number of TXO commitments
states, where each state is a small delta of the previous state, by sharing
unchanged data between each state; cycles are impossible in merkelized data
structures, so simple reference counting is sufficient for garbage collection.
Data no longer needed can be pruned by dropping it from the database, and
unpruned by adding it again. Since everything is committed to via cryptographic
hash, we're guaranteed that regardless of where we get the data, after
unpruning we'll have the right data.

Let's look at how the TXO MMR works in detail. Consider the following TXO MMR
with two txouts, which we'll call state #0:

0
/ \
a b

If we add another entry we get state #1:

1
/ \
0 \
/ \ \
a b c

Note how it 100% of the state #0 data was reused in commitment #1. Let's
add two more entries to get state #2:

2
/ \
2 \
/ \ \
/ \ \
/ \ \
0 2 \
/ \ / \ \
a b c d e

This time part of state #1 wasn't reused - it's wasn't a perfect binary
tree - but we've still got a lot of re-use.

Now suppose state #2 is committed into the blockchain by the most recent block.
Future transactions attempting to spend outputs created as of state #2 are
obliged to prove that they are unspent; essentially they're forced to provide
part of the state #2 MMR data. This lets us prune that data, discarding it,
leaving us with only the bare minimum data we need to append new txouts to the
TXO MMR, the tips of the perfect binary trees ("mountains") within the MMR:

2
/ \
2 \
\
\
\
\
\
e

Note that we're glossing over some nuance here about exactly what data needs to
be kept; depending on the details of the implementation the only data we need
for nodes "2" and "e" may be their hash digest.

Adding another three more txouts results in state #3:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \
/ \
/ \
3 3
/ \ / \
e f g h

Suppose recently created txout f is spent. We have all the data required to
update the MMR, giving us state #4. It modifies two inner nodes and one leaf
node:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ \
/ \
/ \
4 3
/ \ / \
e (f) g h

If an archived txout is spent requires the transaction to provide the merkle
path to the most recently committed TXO, in our case state #2. If txout b is
spent that means the transaction must provide the following data from state #2:

2
/
2
/
/
/
0
\
b

We can add that data to our local knowledge of the TXO MMR, unpruning part of
it:

4
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 4
/ / \
/ / \
/ / \
0 4 3
\ / \ / \
b e (f) g h

Remember, we haven't _modified_ state #4 yet; we just have more data about it.
When we mark txout b as spent we get state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ / \
/ / \
/ / \
5 4 3
\ / \ / \
(b) e (f) g h

Secondly by now state #3 has been committed into the chain, and transactions
that want to spend txouts created as of state #3 must provide a TXO proof
consisting of state #3 data. The leaf nodes for outputs g and h, and the inner
node above them, are part of state #3, so we prune them:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ /
/ /
/ /
5 4
\ / \
(b) e (f)

Finally, lets put this all together, by spending txouts a, c, and g, and
creating three new txouts i, j, and k. State #3 was the most recently committed
state, so the transactions spending a and g are providing merkle paths up to
it. This includes part of the state #2 data:

3
/ \
/ \
/ \
/ \
/ \
/ \
/ \
2 3
/ \ \
/ \ \
/ \ \
0 2 3
/ / /
a c g

After unpruning we have the following data for state #5:

5
/ \
/ \
/ \
/ \
/ \
/ \
/ \
5 4
/ \ / \
/ \ / \
/ \ / \
5 2 4 3
/ \ / / \ /
a (b) c e (f) g

That's sufficient to mark the three outputs as spent and add the three new
txouts, resulting in state #6:

6
/ \
/ \
/ \
/ \
/ \
6 \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
/ \ \
6 6 \
/ \ / \ \
/ \ / \ 6
/ \ / \ / \
6 6 4 6 6 \
/ \ / / \ / / \ \
(a) (b) (c) e (f) (g) i j k

Again, state #4 related data can be pruned. In addition, depending on how the
STXO set is implemented may also be able to prune data related to spent txouts
after that state, including inner nodes where all txouts under them have been
spent (more on pruning spent inner nodes later).


### Consensus and Pruning

It's important to note that pruning behavior is consensus critical: a full node
that is missing data due to pruning it too soon will fall out of consensus, and
a miner that fails to include a merkle proof that is required by the consensus
is creating an invalid block. At the same time many full nodes will have
significantly more data on hand than the bare minimum so they can help wallets
make transactions spending old coins; implementations should strongly consider
separating the data that is, and isn't, strictly required for consensus.

A reasonable approach for the low-level cryptography may be to actually treat
the two cases differently, with the TXO commitments committing too what data
does and does not need to be kept on hand by the UTXO expiration rules. On the
other hand, leaving that uncommitted allows for certain types of soft-forks
where the protocol is changed to require more data than it previously did.


### Consensus Critical Storage Overheads

Only the UTXO and STXO sets need to be kept on fast random access storage.
Since STXO set entries can only be created by spending a UTXO - and are smaller
than a UTXO entry - we can guarantee that the peak size of the UTXO and STXO
sets combined will always be less than the peak size of the UTXO set alone in
the existing UTXO-only scheme (though the combined size can be temporarily
higher than what the UTXO set size alone would be when large numbers of
archived txouts are spent).

TXO journal entries and unpruned entries in the TXO MMR have log2(n) maximum
overhead per entry: a unique merkle path to a TXO commitment (by "unique" we
mean that no other entry shares data with it). On a reasonably fast system the
TXO journal will be flushed quickly, converting it into TXO MMR data; the TXO
journal will never be more than a few blocks in size.

Transactions spending non-archived txouts are not required to provide any TXO
commitment data; we must have that data on hand in the form of one TXO MMR
entry per UTXO. Once spent however the TXO MMR leaf node associated with that
non-archived txout can be immediately pruned - it's no longer in the UTXO set
so any attempt to spend it will fail; the data is now immutable and we'll never
need it again. Inner nodes in the TXO MMR can also be pruned if all leafs under
them are fully spent; detecting this is easy the TXO MMR is a merkle-sum tree,
with each inner node committing to the sum of the unspent txouts under it.

When a archived txout is spent the transaction is required to provide a merkle
path to the most recent TXO commitment. As shown above that path is sufficient
information to unprune the necessary nodes in the TXO MMR and apply the spend
immediately, reducing this case to the TXO journal size question (non-consensus
critical overhead is a different question, which we'll address in the next
section).

Taking all this into account the only significant storage overhead of our TXO
commitments scheme when compared to the status quo is the log2(n) merkle path
overhead; as long as less than 1/log2(n) of the UTXO set is active,
non-archived, UTXO's we've come out ahead, even in the unrealistic case where
all storage available is equally fast. In the real world that isn't yet the
case - even SSD's significantly slower than RAM.


### Non-Consensus Critical Storage Overheads

Transactions spending archived txouts pose two challenges:

1) Obtaining up-to-date TXO commitment proofs

2) Updating those proofs as blocks are mined

The first challenge can be handled by specialized archival nodes, not unlike
how some nodes make transaction data available to wallets via bloom filters or
the Electrum protocol. There's a whole variety of options available, and the
the data can be easily sharded to scale horizontally; the data is
self-validating allowing horizontal scaling without trust.

While miners and relay nodes don't need to be concerned about the initial
commitment proof, updating that proof is another matter. If a node aggressively
prunes old versions of the TXO MMR as it calculates pending TXO commitments, it
won't have the data available to update the TXO commitment proof to be against
the next block, when that block is found; the child nodes of the TXO MMR tip
are guaranteed to have changed, yet aggressive pruning would have discarded that
data.

Relay nodes could ignore this problem if they simply accept the fact that
they'll only be able to fully relay the transaction once, when it is initially
broadcast, and won't be able to provide mempool functionality after the initial
relay. Modulo high-latency mixnets, this is probably acceptable; the author has
previously argued that relay nodes don't need a mempool² at all.

For a miner though not having the data necessary to update the proofs as blocks
are found means potentially losing out on transactions fees. So how much extra
data is necessary to make this a non-issue?

Since the TXO MMR is insertion ordered, spending a non-archived txout can only
invalidate the upper nodes in of the archived txout's TXO MMR proof (if this
isn't clear, imagine a two-level scheme, with a per-block TXO MMRs, committed
by a master MMR for all blocks). The maximum number of relevant inner nodes
changed is log2(n) per block, so if there are n non-archival blocks between the
most recent TXO commitment and the pending TXO MMR tip, we have to store
log2(n)*n inner nodes - on the order of a few dozen MB even when n is a
(seemingly ridiculously high) year worth of blocks.

Archived txout spends on the other hand can invalidate TXO MMR proofs at any
level - consider the case of two adjacent txouts being spent. To guarantee
success requires storing full proofs. However, they're limited by the blocksize
limit, and additionally are expected to be relatively uncommon. For example, if
1% of 1MB blocks was archival spends, our hypothetical year long TXO commitment
delay is only a few hundred MB of data with low-IO-performance requirements.


## Security Model

Of course, a TXO commitment delay of a year sounds ridiculous. Even the slowest
imaginable computer isn't going to need more than a few blocks of TXO
commitment delay to keep up ~100% of the time, and there's no reason why we
can't have the UTXO archive delay be significantly longer than the TXO
commitment delay.

However, as with UTXO commitments, TXO commitments raise issues with Bitcoin's
security model by allowing relatively miners to profitably mine transactions
without bothering to validate prior history. At the extreme, if there was no
commitment delay at all at the cost of a bit of some extra network bandwidth
"full" nodes could operate and even mine blocks completely statelessly by
expecting all transactions to include "proof" that their inputs are unspent; a
TXO commitment proof for a commitment you haven't verified isn't a proof that a
transaction output is unspent, it's a proof that some miners claimed the txout
was unspent.

At one extreme, we could simply implement TXO commitments in a "virtual"
fashion, without miners actually including the TXO commitment digest in their
blocks at all. Full nodes would be forced to compute the commitment from
scratch, in the same way they are forced to compute the UTXO state, or total
work. Of course a full node operator who doesn't want to verify old history can
get a copy of the TXO state from a trusted source - no different from how you
could get a copy of the UTXO set from a trusted source.

A more pragmatic approach is to accept that people will do that anyway, and
instead assume that sufficiently old blocks are valid. But how old is
"sufficiently old"? First of all, if your full node implementation comes "from
the factory" with a reasonably up-to-date minimum accepted total-work
thresholdⁱ - in other words it won't accept a chain with less than that amount
of total work - it may be reasonable to assume any Sybil attacker with
sufficient hashing power to make a forked chain meeting that threshold with,
say, six months worth of blocks has enough hashing power to threaten the main
chain as well.

That leaves public attempts to falsify TXO commitments, done out in the open by
the majority of hashing power. In this circumstance the "assumed valid"
threshold determines how long the attack would have to go on before full nodes
start accepting the invalid chain, or at least, newly installed/recently reset
full nodes. The minimum age that we can "assume valid" is tradeoff between
political/social/technical concerns; we probably want at least a few weeks to
guarantee the defenders a chance to organise themselves.

With this in mind, a longer-than-technically-necessary TXO commitment delayʲ
may help ensure that full node software actually validates some minimum number
of blocks out-of-the-box, without taking shortcuts. However this can be
achieved in a wide variety of ways, such as the author's prev-block-proof
proposal³, fraud proofs, or even a PoW with an inner loop dependent on
blockchain data. Like UTXO commitments, TXO commitments are also potentially
very useful in reducing the need for SPV wallet software to trust third parties
providing them with transaction data.

i) Checkpoints that reject any chain without a specific block are a more
common, if uglier, way of achieving this protection.

j) A good homework problem is to figure out how the TXO commitment could be
designed such that the delay could be reduced in a soft-fork.


## Further Work

While we've shown that TXO commitments certainly could be implemented without
increasing peak IO bandwidth/block validation latency significantly with the
delayed commitment approach, we're far from being certain that they should be
implemented this way (or at all).

1) Can a TXO commitment scheme be optimized sufficiently to be used directly
without a commitment delay? Obviously it'd be preferable to avoid all the above
complexity entirely.

2) Is it possible to use a metric other than age, e.g. priority? While this
complicates the pruning logic, it could use the UTXO set space more
efficiently, especially if your goal is to prioritise bitcoin value-transfer
over other uses (though if "normal" wallets nearly never need to use TXO
commitments proofs to spend outputs, the infrastructure to actually do this may
rot).

3) Should UTXO archiving be based on a fixed size UTXO set, rather than an
age/priority/etc. threshold?

4) By fixing the problem (or possibly just "fixing" the problem) are we
encouraging/legitimising blockchain use-cases other than BTC value transfer?
Should we?

5) Instead of TXO commitment proofs counting towards the blocksize limit, can
we use a different miner fairness/decentralization metric/incentive? For
instance it might be reasonable for the TXO commitment proof size to be
discounted, or ignored entirely, if a proof-of-propagation scheme (e.g.
thinblocks) is used to ensure all miners have received the proof in advance.

6) How does this interact with fraud proofs? Obviously furthering dependency on
non-cryptographically-committed STXO/UTXO databases is incompatible with the
modularized validation approach to implementing fraud proofs.


# References

1) "Merkle Mountain Ranges",
Peter Todd, OpenTimestamps, Mar 18 2013,
https://github.com/opentimestamps/opentimestamps-serveblob/mastedoc/merkle-mountain-range.md

2) "Do we really need a mempool? (for relay nodes)",
Peter Todd, bitcoin-dev mailing list, Jul 18th 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-July/009479.html

3) "Segregated witnesses and validationless mining",
Peter Todd, bitcoin-dev mailing list, Dec 23rd 2015,
https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2015-Decembe012103.html

--
https://petertodd.org 'peter'[:-1]@petertodd.org
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 455 bytes
Desc: Digital signature
URL: <http://lists.linuxfoundation.org/pipermail/bitcoin-dev/attachments/20160517/33f69665/attachment-0001.sig>
submitted by Godballz to CryptoTechnology [link] [comments]

Groestlcoin Christmas Release!

Groestlcoin Dec 2018 Christmas Release Update

As per usual the 3 months has been all hand-on-deck, helping to bring further adoption utilities to Groestlcoin. The markets have been red but as always that doesn't stop the show from going on with regards to the development since the last release update on 24th September. Here's a recap of what has happened so far:

Recap:

What’s New Today?

Groestlcoin on Trezor Model T

As of the latest version of the Trezor Model T firmware, Groestlcoin is now officially supported! The Trezor Model T is the next-generation cryptocurrency hardware wallet, designed to be your universal vault for all of your digital assets. Store and encrypt your coins, passwords and other digital keys with confidence. The Trezor Model T now supports over 500 cryptocurrencies.

Blockbook MainNet & TestNet Block Explorer

Blockbook is an open-source Groestlcoin blockchain explorer with complete REST and websocket APIs that can be used for writing web wallets and other apps that need more advanced blockchain queries than provided by groestlcoind RPC.
Blockbook REST API provides you with a convenient, powerful and simple way to read data from the groestlcoin network and with it, build your own services.

Features:

Blockbook is available via https://blockbook.groestlcoin.org/ Testnet: https://blockbook-test.groestlcoin.org/ Source code: https://github.com/Groestlcoin/blockbook

Edge Wallet

Groestlcoin has been added to the Edge wallet for Android and iOS. Edge wallet is secure, private and intuitive. By including support for ShapeShift, Simplex and Changelly, Edge allows you to seamlessly shift between digital currencies, anywhere with an internet connection.

Features:

Android: https://play.google.com/store/apps/details?id=co.edgesecure.app
iOS: https://itunes.apple.com/us/app/edge-bitcoin-wallet/id1344400091?mt=8
Direct Android: https://edge.app/app

CoinID Wallet

We are excited to announce that Groestlcoin has been added to CoinID! With integrated cold and hot wallet support, and a host of other unique wallet features, CoinID can easily become your go-to wallet for storing Groestlcoin. More details can be found here: https://coinid.org/s/groestlcoin-wallet-overview.pdf

Features

Android: https://play.google.com/store/apps/details?id=org.coinid.wallet.grs
iOS: https://itunes.apple.com/us/app/grs-wallet-for-coinid/id1439638550

Groestlcoin Sentinel - Windows Released

Groestlcoin Sentinel is the easiest and fastest way to track balances of your Groestlcoin addresses.
Features
You can download it using the links below.
Download the Windows Wallet (64 bit) here: https://github.com/Groestlcoin/Groestlcoin-Sentinel-Windows/releases/download/1.0/SentinelSetup_x64.msi
Download the Windows Wallet (32 bit) here: https://github.com/Groestlcoin/Groestlcoin-Sentinel-Windows/releases/download/1.0/SentinelSetup_x86.msi
Source code: https://github.com/Groestlcoin/Groestlcoin-Sentinel-Windows/

Groestlcoin BIP39 Tool 0.3.9 Update

The Groestlcoin BIP39 tool is an open-source web tool for converting BIP39 mnemonic codes to addresses and private keys. This enables the greatest security against third-party wallets potentially disappearing – You’ll still have access to your funds thanks to this tool.
What’s New
Download the Groestlcoin BIP39 tool here: https://github.com/Groestlcoin/bip39/archive/master.zip
Source code: https://github.com/groestlcoin/bip39
Or use hosted version: https://groestlcoin.org/bip39/

Electrum-GRS 3.2.3 Update

Electrum-GRS is a lightweight "thin client" Groestlcoin wallet Windows, MacOS and Linux based on a client-server protocol. Its main advantages over the original Groestlcoin client include support for multi-signature wallets and not requiring the download of the entire block chain.
What’s New

Electrum + Android Version 3.2.3:

Android: https://play.google.com/store/apps/details?id=org.groestlcoin.electrumgrs
Windows & OSX: https://github.com/Groestlcoin/electrum-grs/releases/
Linux:
sudo apt-get install python3-setuptools python3-pyqt5 python3-pip python3-dev libssl-dev sudo pip3 install groestlcoin_hash sudo pip3 install https://github.com/Groestlcoin/electrum-grs/releases/download/v3.2.3/Electrum-grs-3.2.3.tar.gz electrum-grs
GitHub Source server: https://github.com/Groestlcoin/electrumx-grs
Github Source server installer: https://github.com/Groestlcoin/electrumx-grs-installer
Github Source client: https://github.com/Groestlcoin/electrum-grs

Groestlcoin ivendPay Integration

ivendPay and Groestlcoin cryptocurrency have announced the start of integration.
IT company ivendPay, the developer of a universal multicurrency payment module for automatic and retail trade, intends to integrate Groestlcoin cryptocurrency — one of the oldest and the most reputable Bitcoin forks into the payment system. Groestlcoin is characterized by instant transactions with almost zero commission and is optimal for mass retail trade where micropayments are mostly used.
According to Sergey Danilov, founder and CEO of ivendPay, Groestlcoin will become the 11th cryptocurrency integrated into the payment module. The first working vending machines for the sale of coffee, snacks and souvenirs, equipped with ivendPay modules, served the visitors of the CryptoEvent RIW exhibition at VDNKh in Moscow and accepted Bitcoin, Go Byte, Dash, Bitcoin Cash, Ethereum, Ethereum Classic, Zcash, Bitcoin Gold, Dogecoin and Emercoin. ivendPay terminals are designed and patented to accept payments in electronic money, cryptocurrencies and cash when connecting the corresponding cash terminal. Payment for the purchase takes a few seconds, the choice of the payment currency occurs at the time of placing the order on the screen, the payment is made by QR-code through the cryptocurrency wallet on the smartphone.
The interest in equipping vending machines with ivendPay terminals has already been shown by the companies of Malaysia and Israel, where first test networks would be installed. ivendPay compiles a waiting list for vending networks interested in buying terminals and searches for an investor to launch industrial production. According to Sergey Danilov, the universal payment terminal ivendPay for the vending machine will cost about $500. The founder of ivendPay has welcomed the appearance of Groestlcoin among integrated cryptocurrencies, as it is another step towards the realization of the basic idea of digital money - free and cross-border access to goods and services for everybody.
submitted by Yokomoko_Saleen to groestlcoin [link] [comments]

LITECOIN SEGWIT ACTIVATED - AMAZING TIMES FOR LITECOIN MINING Litecoin Vs Bitcoin Mining Profitability Calculator Vs The ... Segwit moving forward on Litecoin and Bitcoin Bitconnect Reinvestment and projected earnings with the compounding Calculator Is Segregated Witness Active? Almost!

Segregated Witness (SegWit) resolves this by separating—or “segregating”—the witness data away from the transaction data, which allows bitcoin’s blockchain to fully read and record the details of the transaction before releasing any actual bitcoins. The Segregated Witness soft-fork (segwit) includes a wide range of features, many of which are highly technical. This page summarises some of the benefits of those features. Malleability Fixes. Bitcoin transactions are identified by a 64-digit hexadecimal hash called a transaction identifier (txid) which is based on both the coins being spent and on who will be able to spend the results of the ... Following the continuous debates among the bitcoin industry and the mining community about Segregated Witness’ (SegWit) implementation since 2016, as wsignaling and waiting periods, the SegWit soft fork has finally been officially activated on the Bitcoin network on August 23 having processed block 481822. “After two years of the crypto community infighting, this upgrade is long overdue, […] Segwit stands for Segregated Witness and was introduced in BIP-0141 (Bitcoin Improvement Proposal 141). The size of a Segwit transaction is lower than the size of a legacy Bitcoin transaction, as Segwit reduces the overall size of the transaction. The difference lies mainly in the size of inputs, as the outputs are roughly the same in size - 1-2 bytes smaller when using Segwit. Thus, having a ... What is Segregated Witness? Segregated Witness or SegWit (“separate witness”) is a implemented protocol update proposed by the Bitcoin Core development team. The aim of this proposal is to optimize the block size, which in the long run will solve the issues of scalability of the Bitcoin network, block overflow with transactions, transaction confirmation speed and lower fees.

[index] [48395] [41848] [37031] [39495] [13251] [25653] [16030] [21756] [44270] [18965]

LITECOIN SEGWIT ACTIVATED - AMAZING TIMES FOR LITECOIN MINING

This video is unavailable. Watch Queue Queue. Watch Queue segregated witness segwit - segwit vs native segwit (bitcoin segregated witness explained). segregated witness explained [segwit] (litecoin/bitcoin). What is... Secure your Bitcoin with a Ledger Hardware Wallet: https:/... Skip navigation Sign in. Search. Loading... Close. This video is unavailable. Watch Queue Queue. Watch Queue Queue. Remove all ... This video is unavailable. Watch Queue Queue. Watch Queue Queue Crypto mining calculator ... Segregated Witness Explained [Segwit] (Litecoin/Bitcoin) - Duration: 12:09. Franklyn [Crypto] 65,886 views. 12:09. Solar Panels on Our House - One Year In - Duration ...

#