
Why I’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and since then I have had a keen interest in smart contract platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why, of all the coins I went through, I’m most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice, and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ (silicon dioxide), as in “silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in it, and they have also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13%, and it will only decrease over time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but that decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it’s good to keep in mind that making a blockchain decentralized and secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralized networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is determined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed on the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equalling 2,000,000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS Block, 10 new DS nodes are allowed in. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
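To make those difficulty numbers a bit more tangible, here’s a quick back-of-the-envelope sketch (just the arithmetic implied by the figures above; the GPU model and hashrates are the ones quoted):

```ocaml
(* Back-of-the-envelope check of the GPU counts quoted above.
   All figures are expressed in Mh/s. *)
let gpu_hashrate_mhs = 35.4            (* one GeForce GTX 1070, Ethash *)
let ds_node_target_mhs = 2_000_000.0   (* ~2 Th/s for a DS node *)
let shard_node_target_mhs = 8_530.0    (* ~8.53 GH/s for a shard node *)

let gpus_needed target = target /. gpu_hashrate_mhs

let () =
  Printf.printf "GPUs for a DS node:    ~%.0f\n" (gpus_needed ds_node_target_mhs);
  Printf.printf "GPUs for a shard node: ~%.0f\n" (gpus_needed shard_node_target_mhs)
  (* prints roughly 56497 and 241, matching the "55 thousand+" and "~240" above *)
```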
 
The PoW cycle of 60 seconds is a burst of peak performance and acts as an entry ticket to the network. The entry ticket is a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx Blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between these 1.5 hours, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit better what a blockchain is on a more fundamental level. Because the core of Zilliqa’s consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to walk through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Paraphrasing examples and definitions from Samuel Brooks’ Medium article, he describes a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole Medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compress it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn’t matter if you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light’s state going from green to red (via amber) and another light going from red to green.
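To make the state-machine idea concrete, here’s a tiny sketch in OCaml (plain OCaml, nothing Zilliqa-specific; the states and events are just the traffic-light example above): the set of states is fixed, and the only way to change state is through one explicit transition function.

```ocaml
(* A traffic light as a state machine: finite, predefined states and
   a single transition function that defines every valid state change. *)
type light = Red | Amber | Green

type event = Timer | ButtonPressed

let next (state : light) (ev : event) : light =
  match state, ev with
  | Red, Timer -> Green
  | Green, ButtonPressed -> Amber   (* a pedestrian requests a crossing *)
  | Amber, Timer -> Red
  | s, _ -> s                       (* any other event leaves the state unchanged *)

(* A blockchain works the same way at a much larger scale: the current world
   state plus a block of transactions deterministically yields the next state,
   and every node can replay the transitions to verify them. *)
```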
 
With public blockchains like Zilliqa, this isn’t so straightforward and simple. It started with block #1 almost 1.5 years ago and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions in it that everyone can verify, from block #1 to the current #647,000+ block. The state is ever changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that is rather insignificant compared to a public blockchain. Zilliqa consists of 2400 nodes that need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, the block height and the DS height, and then hit refresh. As expected, you see new, incremented values for one or all of these parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, so no GPU is involved (only CPU). As a result, the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides, click here. And if you’re somewhere in between the Blockonomi level and the academic (NUS) level, read the Zilliqa Design Story Part 2, dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) while the consensus protocol keeps functioning without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (>66%) do double-spend attacks become possible.
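A rough sketch of those thresholds, assuming a committee of 600 nodes like a Zilliqa shard (the exact signature counts the protocol uses may differ slightly from this simple fraction-based classification):

```ocaml
(* Rough classification of what happens at different numbers of
   dishonest (Byzantine, including offline) nodes in a pBFT committee. *)
let classify ~committee_size ~dishonest =
  let n = float_of_int committee_size and f = float_of_int dishonest in
  if f < n /. 3.0 then "consensus keeps running normally"
  else if f <= 2.0 *. n /. 3.0 then "network stalls; a view change elects a new leader"
  else "safety is lost; double-spends become possible"

let () =
  List.iter
    (fun d ->
      Printf.printf "%d dishonest of 600: %s\n" d (classify ~committee_size:600 ~dishonest:d))
    [ 100; 199; 250; 450 ]
```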
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although this is lengthy already, we skipped over some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the amount of decentralisation.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT. There is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards so the network can keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them up to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased, they are public-facing, and at the same time PoS is applied to incentivize seed node operators and to make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes (from the beginning in 2019) and those are used to pay out ZIL stakers. The 5% block rewards at an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
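A hedged sketch of how those staking numbers hang together; the 5% reward share and the 10.03% yield are the figures above, while the "implied annual block rewards" line is simply what falls out of the arithmetic, not an official number:

```ocaml
(* If 5% of block rewards fund staking payouts and the advertised yield is
   10.03% per year, the amount that can be staked is capped by:
     stakeable = (0.05 * annual_block_rewards) / 0.1003
   Working backwards from the ~610 MM ZIL figure quoted above: *)
let reward_share = 0.05
let annual_yield = 0.1003
let stakeable_zil = 610_000_000.0

let annual_payouts = stakeable_zil *. annual_yield          (* ~61.2 MM ZIL *)
let implied_annual_rewards = annual_payouts /. reward_share (* ~1.22 B ZIL  *)

let () =
  Printf.printf "Annual staking payouts:        %.0f ZIL\n" annual_payouts;
  Printf.printf "Implied annual block rewards:  %.0f ZIL\n" implied_annual_rewards
```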
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I’m taking the ELI5 route (maybe ELI12), but if you are familiar with Javascript, Solidity or specifically OCaml, please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works, and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. And more information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
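Here’s a tiny OCaml flavour of that “data and behaviour are separate things” idea, plus the kind of mistake the type system catches before the code ever runs (plain OCaml, not Scilla, but Scilla reads similarly):

```ocaml
(* Data is just a plain, immutable value... *)
type payment = { sender : string; recipient : string; amount : int }

(* ...and behaviour is a separate, pure function over that data.
   Nothing is mutated; a new value is returned instead. *)
let apply_fee (p : payment) ~(fee : int) : payment =
  { p with amount = p.amount - fee }

(* The compiler enforces exhaustiveness: forget a case in a match over a
   variant type and you get a warning before the code ever runs. *)
type tx_status = Pending | Confirmed | Rejected of string

let describe = function
  | Pending -> "waiting for a block"
  | Confirmed -> "final (pBFT gives absolute finality)"
  | Rejected reason -> "rejected: " ^ reason
```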
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics, and won a so-called Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic on a blockchain means you cannot afford to make mistakes; otherwise, it could cost you. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t as simple as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, and thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions, and require more gas (if gas is a new concept, click here).
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
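To illustrate the ‘unexpected changes to critical state variables’ class of bug, here’s a minimal sketch (OCaml modelling the check, not real Scilla): the contract only updates its owner field if the caller is the current owner. Leave that guard out and anyone can take over the contract.

```ocaml
(* A toy model of contract state with an owner field, and a guarded
   transition that refuses to change the owner unless the caller IS the
   current owner. Omitting such a guard is exactly the kind of bug that
   lets a stranger take over a deployed contract. *)
type contract_state = { owner : string; balance : int }

let set_owner ~(caller : string) ~(new_owner : string) (st : contract_state) :
    (contract_state, string) result =
  if caller = st.owner then Ok { st with owner = new_owner }
  else Error "unauthorized: only the current owner may transfer ownership"

let () =
  (* dummy illustrative addresses *)
  let st = { owner = "zil1-owner"; balance = 1000 } in
  match set_owner ~caller:"zil1-stranger" ~new_owner:"zil1-stranger" st with
  | Ok _ -> print_endline "owner changed"
  | Error msg -> print_endline msg   (* prints the unauthorized message *)
```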
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
“Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and the accompanying tooling, developers can be mathematically sure, and can prove, that the smart contract they’ve written does what they intend it to do.
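To give a flavour of what ‘mathematically prove’ means, here’s a toy example in Lean, a proof assistant in the same family as the Coq tool mentioned above (this is not Scilla’s actual Coq formalisation, just the general idea): we prove that a toy transfer function can never create or destroy funds.

```lean
-- A toy transfer between two balances and a machine-checked proof that it
-- conserves the total supply (assuming the sender has enough funds).
def newSender (balance amt : Nat) : Nat := balance - amt
def newReceiver (balance amt : Nat) : Nat := balance + amt

theorem transfer_conserves_total (a b amt : Nat) (h : amt ≤ a) :
    newSender a amt + newReceiver b amt = a + b := by
  unfold newSender newReceiver
  omega
```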
 
Smart contracts in a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define transaction types: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions it sometimes is (when the address is in the same shard as the smart contract), but with Category 3 you definitely need communication between the shards. Solving that requires making a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
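Here’s a hedged sketch of those three categories and when cross-shard communication becomes necessary. The shard-assignment rule below (hashing the sender’s address) is a stand-in for Zilliqa’s real rule, which uses the last bits of the address, and the category names are just the ones from the paragraph above:

```ocaml
(* Simplified model: an address is mapped to a shard, and a transaction can
   be settled inside one shard only if every account it touches lives there. *)
let num_shards = 3

(* Stand-in for Zilliqa's real assignment rule; here we just hash the string. *)
let shard_of (address : string) : int = Hashtbl.hash address mod num_shards

type tx =
  | Payment of { from_addr : string; to_addr : string }              (* Category 1 *)
  | ContractCall of { from_addr : string; contract : string }        (* Category 2 *)
  | MultiContract of { from_addr : string; contracts : string list } (* Category 3 *)

let needs_cross_shard (t : tx) : bool =
  match t with
  | Payment _ -> false   (* the sender's shard can settle it; the DS committee merges results *)
  | ContractCall { from_addr; contract } -> shard_of from_addr <> shard_of contract
  | MultiContract { from_addr; contracts } ->
      List.exists (fun c -> shard_of c <> shard_of from_addr) contracts
```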
 
And this is where the downside of state sharding currently comes in. All shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don’t need to shop around for information stored on other shards, which would require more communication and add more complexity. Links that require computer science and/or developer knowledge if you want to dig further: Scilla - language grammar; Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain; Gas Accounting; NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming in Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only in technology that Zilliqa seems to be excelling, as their ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought these initiatives live to the market. There is also a growing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developers Building Cities Beyond have recently created a $15 million innovation grant to encourage development on their ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa already seems to be taking advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, a fully regulated bank allowing for the tokenization of assets that aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa, made by Switcheo, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD has already been created and is visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not only from ASEAN or Singapore but global: see the grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with human-readable names and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”; they have a mix of tech people, business people, marketers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter, you'll notice their engagement is much higher than you'd expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’ and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have 1000s of signups and they plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit, haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures and the widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency

So-called "Poison Blocks" (what Greg Maxwell called the "big block attack") are the way Bitcoin was designed to scale and the ONLY way it ever can

Sounds insane, right? Not if you realize Bitcoin works only because it is an economic system. Everything in Bitcoin that falls under the purview of cutthroat market competition works, and everything that doesn't, doesn't.
The error here is that this is seen as a reason not to lift the cap. "We cannot raise the cap or miners would be forced to do work!" This is stated unironically, with no awareness that some miners being left behind and some miners making it is exactly how Bitcoin always had to work.
This is a cry to leave node code optimization out of the purview of cutthroat market competition, because apparently some believe that "cutthroat" implies a bad outcome -- the kind of socialist mindset that thinks cutthroat competition among seatbelt makers would lead to seatbelts that kill you. Anyone who understands economics knows nothing could be further from the truth.
The rallying cry of the Core-style socialist mentality is that "Node code is too important to be left to the market, we need good Samaritan devs to provide it for all miners so that no miner is left behind."
The ultimate result of shielding men from the effects of folly, is to fill the world with fools. -Herbert Spencer
Likewise, the ultimate result of shielding miners from their inability or unwillingness to suitably optimize their node software is to fill Bitcoin with unprofessional miners who can't take us to global adoption.
Without the incentive to upgrade networking and codebase, Bitcoin lacks the crucial vetting process it needs in order to distill miners into a long tail of professionals who have what it takes to ride this train all the way to a billion users, quickly and securely.
I challenge anyone to describe how they think Bitcoin can professionalize as long as there remains an effective subsidy for laggard miners in the areas of networking and node optimization (not meaning protocol optimization, but rather things like parallel validation). As painful as it may seem, the only way Bitcoin scales is over the bankrupt shells of many miners who didn't have what it takes. The cruft cannot come along for the ride.
This means orphan battles, even if just a little at a time. It means stress tests of rapidly increasing scale. While killing off too much hashpower too fast is in no one's interest (hashrate gets too low), moving at a speed that is fast yet manageable by most big-league pros is. And really, the changes that need to be made aren't even reputed by anyone to be incredibly hard problems once you accept, as Satoshi did, that "it ends in datacentres and big server farms."
The fact that people are still arguing against 128MB by referencing tests with laptop nodes suggests that's the real problem here. Core's full node religion still has sway, despite being manufactured from whole cloth. Also known as Blockstream Syndrome, as a play off Stockholm Syndrome (where captives begin to sympathize with their captors).
Whatever the reasons given, critics of removing the cap invariably appeal to the infrastructure "not being ready" as if that were a bad thing. It's a good thing!
First of all, if we were to wait for all miners to be ready, we would be waiting far too long. The right approach, to be determined by the market, is to move ahead somewhere between when 51% are ready and say 90% are ready, which is exactly what we can expect to happen without a cap. The incentives are such that it is profitable to shear away some laggard miners but not too many (as culling too many at a time leaves BCH open to hashpower attack by BTC miners; over the longer term, though, it incentivizes pros to enter and take the place of the failed miners, making BCH even more secure).
Secondly, the idea of a monolithic "infrastructure" ignores the secret sauce that makes Bitcoin work: miners in competition. Some are expected to fail to be ready! If not, how can Bitcoin miners get any more professional? Only the removal or reformation of the laggards can ever ensure Bitcoin ends up with professional infrastructure.
This vetting process is inevitable and essential, and it must apply to all aspects of Bitcoin that we want to see professionalized, including node software.
Now leaving aside a miner filling his block with his own 0-fee transactions (which can be dealt with by other miners rejecting blocks with too many 0-fee txs of low coin age*), Greg Maxwell's "big block attack" -- where big miners try to terrorize smaller (less well capitalized) miners using oversized blocks that a sizable minority of the network can't handle due to their slow networking -- is in fact exactly how Bitcoin MUST scale.
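To make the asterisked aside concrete, here's a purely hypothetical sketch of such a policy (as noted further down, nothing like this exists in any client today; the thresholds and field names are made up): a miner refuses to build on blocks stuffed with zero-fee, low-coin-age transactions.

```ocaml
(* Hypothetical orphaning policy, sketched only to make the idea concrete:
   refuse to build on a block when too large a share of its transactions
   pay zero fee AND spend very young coins. Thresholds are arbitrary. *)
type tx = { fee : int64; coin_age_blocks : int }

let suspicious tx = tx.fee = 0L && tx.coin_age_blocks < 6

let accept_block ?(max_suspicious_share = 0.10) (txs : tx list) : bool =
  let total = List.length txs in
  if total = 0 then true
  else
    let bad = List.length (List.filter suspicious txs) in
    float_of_int bad /. float_of_int total <= max_suspicious_share
```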
It's not an attack, it's a stress test, and one Bitcoin literally cannot scale without. What he called an attack is the solution to scaling, not any kind of problem. Stress tests are incentivized in Bitcoin as a way of calling the bluff of the lazy miners. You gamble some money on an "attack," see who the slowpokes are and take their block rewards for your own.
No miners have had the balls to do this so far, but they will soon, or Bitcoin dies due to the halvings in a few more years, as fee volume won't sustain security. As big blockers said to Core, there is no room for arbitrary "conservatism" in the face of an oncoming train.
Finally, I leave you with a thought experiment. Imagine somehow the community of volunteer developers in Bitcoin was so incredibly generous that it offered all miners ASIC designs, mining pool software, and all manner of hashing optimizations, to the point that miners merely had to buy ASICs and plug them in with no need to understand anything at all, and no need to try innovating on their own with ASIC design since these incredibly skilled volunteers trumped everything they could possibly come up with. Now naturally this situation must eventually come to an end, as the real pros step in, like Samsung.
With security thereby left out of the purview of cutthroat market competition, thanks to overweening volunteerism that continued for too long (no problem with volunteers at the start, just as a child isn't born into the world an adult and needs parenting at first), these miners would be wholly unvetted, unprepared, unable to scale up their hashing operations, and would be obliterated by Samsung or maybe a government 51% attack to kill Bitcoin.
The point here is there is a formative period, and then there is adulthood. Growing up is a process of relying less and less on handouts, being exposed more and more to the cutthroat realities of the world. When is Bitcoin going to grow up? The halvings place a time limit on Bitcoin's security, and overprotective parents (those who don't want to remove the cap) -- in an ostensible effort to be conservative -- may end up keeping Honeybadger holed up in his figurative mom's basement too long for him to accomplish his mission.
*and if your response is, "This doesn't exist yet in any clients," I think you have missed the point of this post: again, that's a good thing. Let miners who are too incompetent to figure out something that simple get sloughed away. Do we really want such sluggards? If so and you're a dev, volunteer some code to them. If not, try to get hired by them instead. I think the pay will be much better.
And if your response is, "But that means some miners might get orphaned unexpectedly and cry foul," then once again I say, that's a good thing. Block creation is fundamentally a speculative process. In other words, it's a gamble, by design. It's a Keynesian beauty contest wherein each miner tries to mine the greediest block they can get away with while not upping their orphan risk appreciably. Messing around with low-coin-age 0-fee tx stuffing might get you orphaned, boo-hoo. Miners are under no obligation to tell other miners their standards for block beauty in advance, even though they typically have done so thus far. Miners are ALWAYS free to orphan a block for ANY reason. That they generally keep to consistent, well-broadcast rules is a courtesy, not a necessity. Preventing general assholery isn't necessarily best effected by being up-front about what you will punish, but even if it is, miners can do that, too (let them figure it out, as they do for hashpower -- unless you have a good argument for why there is no possible solution or why the solution is necessarily too hard for a professional organization to figure out in reasonable time; that's the bar for objection, not "well the volunteer dev code doesn't do this yet").
And if your response is, "That will increase the orphan rate," yes and orphans already happen routinely so it is certainly not any catastrophe. See it as a detox process. It might put some small strain on the network as the slowpokes and dickheads are smacked, but again miners still choose this level of orphaning as well by the same Keynesian-beauty-contest dynamic. Orphans are a key part of why Bitcoin works and why it can scale, but if the orphan rate would interfere with service too much (unlikely if you believe 0-conf works), that also gets taken into account in the beauty contest and gets balanced with the benefits of punishing bad behavior and the costs of stomaching the poison block. The offending miner can also be un-whitelisted, returned to rando-node status, but again why are we trying to coddle miners by coming up with their strategies for being better professionals for them? Hopefully it is clear by now that all such arguments are central planning, which is bad at least after an early parental phase which I think has long since passed its natural life.
submitted by ratifythis to btc

Why i’m bullish on Zilliqa (long read)

Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analysed Ethereum thereafter and from that moment I have a keen interest in smart contact platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk reward ratio. Especially because Zilliqa has found an elegant balance between being secure, decentralised and scalable in my opinion.
 
Below I post my analysis why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ silicon dioxide which means “Silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in the white paper and also created their own open source intermediate level smart contract language called Scilla (functional programming language similar to OCaml) too.
 
Mainnet is live since end of January 2019 with daily transaction rate growing continuously. About a week ago mainnet reached 5 million transactions, 500.000+ addresses in total along with 2400 nodes keeping the network decentralised and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. Maximum supply is 21 billion with annual inflation being 7.13% currently and will only decrease with time.
 
Zilliqa realised early on that the usage of public cryptocurrencies and smart contracts were increasing but decentralised, secure and scalable alternatives were lacking in the crypto space. They proposed to apply sharding onto a public smart contract blockchain where the transaction rate increases almost linear with the increase in amount of nodes. More nodes = higher transaction throughput and increased decentralisation. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue disecting how Zilliqa achieves such from a technological standpoint it’s good to keep in mind that a blockchain being decentralised and secure and scalable is still one of the main hurdles in allowing widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all these premises need to be true otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is being defined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds and the submissions achieving the highest difficulty will be allowed on the network. And to put it in perspective: the average difficulty for one DS node is ~ 2 Th/s equaling 2.000.000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS Block 10 new DS nodes are allowed. And a shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools and if you have large amounts of hashing power (Ethash) available you could mine solo.
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with these identities. And after every 100 Tx Blocks which corresponds to roughly 1,5 hour this PoW process repeats. In between these 1,5 hour no PoW needs to be done meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as:
“A peer-to-peer, append-only datastore that uses consensus to synchronise cryptographically-secure data”.
 
Next he states that: >“blockchains are fundamentally systems for managing valid state transitions”.* For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all its states (red, amber and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button resulting in one traffic lights’ state going from green to red (via amber) and another light from red to green.
 
With public blockchains like Zilliqa this isn’t so straightforward and simple. It started with block #1 almost 1,5 years ago and every 45 seconds or so a new block linked to the previous block is being added. Resulting in a chain of blocks with transactions in it that everyone can verify from block #1 to the current #647.000+ block. The state is ever changing and the states it can find itself in are infinite. And while the traffic light might work together in tandem with various other traffic lights, it’s rather insignificant comparing it to a public blockchain. Because Zilliqa consists of 2400 nodes who need to work together to achieve consensus on what the latest valid state is while some of these nodes may have latency or broadcast issues, drop offline or are deliberately trying to attack the network etc.
 
Now go back to the Viewblock page take a look at the amount of transaction, addresses, block and DS height and then hit refresh. Obviously as expected you see new incremented values on one or all parameters. And how did the Zilliqa blockchain manage to transition from a previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever changing state of the blockchain. pBFT requires a series of network communication between nodes, and as such there is no GPU involved (but CPU). Resulting in the total energy consumed to keep the blockchain secure, decentralised and scalable being low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimisation on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) and the consensus protocol will function without stalling or hiccups. Once there are more than ⅓ of dishonest nodes but no more than ⅔ the network will be stalled and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (>66%) double spend attacks become possible.
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus besides low energy is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain it’s done. Lastly, take a look at this article where three types of finality are being defined: probabilistic, absolute and economic finality. Zilliqa falls under the absolute finality (just like Tendermint for example). Although lengthy already we skipped through some of the inner workings from Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture on it. Enough about PoW, sybil resistance mechanism, pBFT etc. Another thing we haven’t looked at yet is the amount of decentralisation.
 
Decentralisation
 
Currently there are four shards, each one of them consisting of 600 nodes. 1 shard with 600 so called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes of which 250 are shard guards (centralised nodes controlled by the team). The amount of shard guards has been steadily declining from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are being located in the US but those are only the (CPU parts of the) shard nodes who perform pBFT. There is no data from where the PoW sources are coming. And when the Zilliqa blockchain starts reaching their transaction capacity limit, a network upgrade needs to be executed to lift the current cap of maximum 2400 nodes to allow more nodes and formation of more shards which will allow to network to keep on scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them up to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased and they are public facing, while at the same time PoS is applied to incentivise seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes (from the beginning in 2019) and those are used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZIL to a smart contract created by Zilliqa and audited by Quantstamp.
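The relation between the reward pool and the maximum stakeable amount is simple arithmetic. A back-of-the-envelope sketch - the annual reward figure below is my own assumption derived from the 5% / 10.03% / ~610 MM numbers above, not an official parameter:

```python
# Back-of-the-envelope: how many ZIL can be staked at a given target yield,
# assuming a fixed annual reward pool for seed nodes (illustrative numbers only).
annual_seed_node_rewards_zil = 61_000_000   # assumed ~5% slice of yearly block rewards
target_annual_yield = 0.1003                # ~10.03% as quoted above

max_stakeable_zil = annual_seed_node_rewards_zil / target_annual_yield
print(f"Roughly {max_stakeable_zil / 1e6:.0f} MM ZIL can be staked at {target_annual_yield:.2%}")
# -> roughly 608 MM ZIL, in line with the ~610 MM figure quoted above
```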
 
With a high amount of DS & shard nodes and seed nodes becoming more decentralised too, Zilliqa qualifies for the label of decentralised in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. Faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time stamped so you’ll start right away with a platform introduction, R&D roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalised: programming languages can be divided into being ‘object oriented’ or ‘functional’. Here is an ELI5 given by software development academy: > “all programmes have two basic components, data – what the programme knows – and behaviour – what the programme can do with that data. So object-oriented programming states that combining data and related behaviours in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behaviour are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: > OCaml is a general purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented onto other blockchains as well, is recognised by academics and won a so-called Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic on a blockchain - programming - means that you cannot afford to make mistakes. Otherwise it could cost you dearly. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t the same as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, and thus value.
 
Another difference with programming on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here).
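To make the gas idea concrete, here is a toy fee calculation; the gas amounts and price are assumptions chosen to line up with the 0.001 ZIL figure above, not Zilliqa’s actual parameters:

```python
# Toy example: a transaction fee is (gas used) x (gas price). Numbers are illustrative.
def tx_fee_zil(gas_used: int, gas_price_zil: float) -> float:
    return gas_used * gas_price_zil

simple_transfer_fee = tx_fee_zil(gas_used=1, gas_price_zil=0.001)     # ~0.001 ZIL, as above
contract_call_fee = tx_fee_zil(gas_used=400, gas_price_zil=0.001)     # contracts burn more gas

print(simple_transfer_fee, contract_call_fee)   # 0.001 ZIL vs 0.4 ZIL
```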
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue:
"In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and the accompanying tooling, developers can mathematically prove that the smart contract they’ve written does what they intend it to do.
 
Smart contracts in a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define three categories of transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions it works only when that address is in the same shard as the smart contract, but with Category 3 you definitely need communication between the shards. Solving that requires a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
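A toy illustration of why Category 3 is the hard case: if accounts are mapped to shards by their address (a common network-sharding approach - the mapping below is my own simplification, not Zilliqa’s exact rule), two contracts can easily live on different shards:

```python
# Simplified sharding illustration: assign each address to a shard by its low bits.
NUM_SHARDS = 4

def shard_of(address_hex: str) -> int:
    return int(address_hex, 16) % NUM_SHARDS

alice = "0x1a2b3c4d"
contract_a = "0x00000007"
contract_b = "0x0000000a"

print(shard_of(alice), shard_of(contract_a), shard_of(contract_b))  # 1, 3, 2
# Category 1/2 transactions can often be handled inside a single shard,
# but a Category 3 transaction touching contract_a AND contract_b
# spans shards and needs cross-shard communication rules.
```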
 
And this is where the trade-off around state sharding comes in currently. All shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don’t need to shop around for information held on other shards, which would require more communication and add more complexity. Links that require computer science and/or developer knowledge if you want to dig further: Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, and NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming in Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but these are the topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships  
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought the initiatives live to the market. There is also an increasing list of organisations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already be taking advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, AirBnB and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be an ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019) which seems to align well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa, made by Switcheo, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
"There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future."
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”. They have a mix of tech people, business people, marketers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter you’ll notice that their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot and will also run their own non-custodial seed node for staking. This seed node will also allow them to start generating revenue and become a self-sustaining entity that could potentially scale up to become a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started similar initiatives (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilise the ‘power of the community’. This is something you cannot ‘buy with money’ and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have 1000s of signups and they plan to keep upgrading it for more and more people to use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit haha). So many updates are happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance Margin, Futures & Widget, entering the Indian market, and more. The Head of Marketing Colin Miles has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency

Let's discuss some of the issues with Nano

Let's talk about some of Nano's biggest issues. I also made a video about this topic, available here: https://youtu.be/d9yb9ifurbg.
00:12 Spam
Issues
Potential Mitigations & Outstanding Issues
01:58 Privacy
Issues
  • Nano has no privacy. It is pseudonymous (like Bitcoin), not anonymous.
Potential Mitigations & Outstanding Issues
  • Second layer solutions like mixers can help, but some argue that isn't enough privacy.
  • The current protocol design + the computational overhead of privacy does not allow Nano to implement first-layer privacy without compromising its other features (fast, feeless, and scalable transactions).
02:56 Decentralization
Issues
  • Nano is currently not as decentralized as it could be. ~25% of the voting weight is held by Binance.
  • Users must choose representatives, and users don't always choose the best ones (or never choose).
Potential Mitigations & Outstanding Issues
  • Currently 4 unrelated parties (who all have a verifiable interest in keeping the network running) would have to work together to attack the network.
  • Unlike Bitcoin, there is no mining or fees in Nano. This means that there is not a strong incentive for emergent centralization from profit maximization and economies of scale. We've seen this firsthand, as Nano's decentralization has increased over time.
  • Nano representative percentages are not that far off from Bitcoin mining pool percentages.
  • In Nano, voting weight can be remotely re-delegated to anyone at any time. This differs from Bitcoin, where consensus is controlled by miners and requires significant hardware investment.
  • The cost of a 51% attack scales with the market cap of Nano.
06:49 Marketing & adoption
Issues
  • The best technology doesn't always win. If no one knows about or uses Nano, it will die.
Potential Mitigations & Outstanding Issues
  • I would argue that the best technology typically does win, but it needs to be best in every way (price, speed, accessibility, etc). Nano is currently in a good place if you agree with that argument.
  • Bitcoin started small, and didn't spend money on marketing. It takes time to build a community.
  • The developers have said they will market more once the protocol is where they want it to be (v20 or v21?).
  • Community marketing initiatives have started to form organically (e.g. Twitter campaigns, YouTube ads, etc).
  • Marketing and adoption is a very difficult problem to solve, especially when you don't have first mover advantage or consistent cashflow.
08:07 Small developer fund
Issues
  • The developer fund only has 3 million NANO left (~$4MM), what happens after that?
Potential Mitigations & Outstanding Issues
  • The goal for Nano is to be an Internet RFC like TCP/IP or SMTP - development naturally slows down when the protocol is in a good place.
  • Nano development is completely open source, so anyone can participate. Multiple developers are now familiar with the Nano protocol.
  • Businesses and whales that benefit from Nano (exchanges, remittances, merchant services, etc) are incentivized to keep the protocol developed and running.
  • The developer fund was only ~5% of the supply - compare that to some of the other major cryptocurrencies.
10:08 Node incentives
Issues
  • There are no transaction fees, why would people run nodes to keep the network running?
Potential Mitigations & Outstanding Issues
  • The cost of consensus is so low in Nano that the benefits of the network itself are the incentive: decentralized money with 0 transaction fees that can be sent anywhere in the world nearly instantly. Similar to TCP/IP, email servers, and http servers. Just like Bitcoin full nodes.
  • Paying $50-$100 a month for a high-end node is a lot cheaper for merchants than paying 1-3% of total sales.
  • Businesses and whales that benefit from Nano (exchanges, remittances, merchant services, etc) are incentivized to keep the protocol developed and running.
11:58 No smart contracts
Issues
  • Nano doesn't support smart contracts.
Potential Mitigations & Outstanding Issues
  • Nano's sole goal is to be the most efficient peer-to-peer value transfer protocol possible. Adding smart contracts makes keeping Nano feeless, fast, and decentralized much more difficult.
  • Other solutions (e.g. Ethereum) exist for creating and enforcing smart contracts.
  • Code can still interact with Nano, but not on the first layer in a decentralized manner.
  • Real world smart contract adoption and usage is pretty limited at the moment, but that might not always be the case.
13:20 Price stability
Issues
  • Why would anyone accept or spend Nano if the price fluctuates so much?
  • Why wouldn't people just use a stablecoin version of Nano for sending and receiving money?
Potential Mitigations & Outstanding Issues
  • With good fiat gateways (stable, low fees, etc), you can always buy back the fiat equivalent of what you've spent.
  • The hope is that with enough adoption, people and businesses will eventually skip the fiat conversion and use Nano directly.
  • Because Nano is so fast, volatility is less of an issue. Transactions are confirmed in <10 seconds, and prices change less in that timeframe (vs 10 minutes to hours for Bitcoin).
  • Stablecoins reintroduce trust. Stable against what? Who controls the supply, and how do you get people to adopt them? What happens if the assets they're stable against fail? Nano is pure supply and demand.
  • With worldwide adoption, the market capitalization of Nano would be in the trillions. If that happens, even millions of dollars won't move the price significantly.
15:06 Deflation
Issues
  • Nano's current supply == max supply. Why would people spend Nano today if it could be worth more tomorrow?
  • What happens to principal representatives and voting weight as private keys are lost? How do you know keys are lost?
Potential Mitigations & Outstanding Issues
  • Nano is extremely divisible. 1 NANO is 10^30 raw. Since there are no transaction fees, smaller and smaller amounts of Nano could be used to transact, even if the market cap reaches trillions.
  • People will always buy things they need (food, housing, etc).
  • I'm not sure what the plan is to adjust for lost keys. Probably requires more discussion.
Long-term Scalability
Issue
  • Current node software and hardware cannot handle thousands of TPS (low-end nodes fall behind at even 50 TPS).
  • The more representatives that exist, the more vote traffic is required (network bandwidth).
  • Low-end nodes currently slow down the network significantly. Principal representatives waste their resources constantly bootstrapping these weak nodes during network saturation.
Potential Mitigations & Outstanding Issues
  • Even as is, Nano can comfortably handle an average of 50 TPS - roughly the daily transaction volume PayPal was processing in 2011, with nearly 100 million users.
  • Network bandwidth increases 50% a year.
  • There are some discussions of prioritizing bootstrapping by vote weight to limit the impact of weak nodes.
  • Since Nano uses an account balance system, pruning could drastically reduce storage requirements. You only need current state to keep the network running, not the full transaction history.
  • In the future, vote stapling could drastically reduce bandwidth usage by collecting all representative signatures up front and then only sharing that single aggregate signature.
  • Nano has no artificial protocol-based limits (e.g. block sizes or block times). It scales with hardware.
Obviously there is still a lot of work to be done in some areas, but overall I think Nano is a good place. For people that aren't Nano fans, what are your biggest concerns?
submitted by Qwahzi to CryptoCurrency

Bitcoin (BTC): A Peer-to-Peer Electronic Cash System

  • Bitcoin (BTC) is a peer-to-peer cryptocurrency that aims to function as a means of exchange that is independent of any central authority. BTC can be transferred electronically in a secure, verifiable, and immutable way.
  • Launched in 2009, BTC is the first virtual currency to solve the double-spending issue by timestamping transactions before broadcasting them to all of the nodes in the Bitcoin network. The Bitcoin Protocol offered a solution to the Byzantine Generals’ Problem with a blockchain network structure, a notion first created by Stuart Haber and W. Scott Stornetta in 1991.
  • Bitcoin’s whitepaper was published pseudonymously in 2008 by an individual, or a group, with the pseudonym “Satoshi Nakamoto”, whose underlying identity has still not been verified.
  • The Bitcoin protocol uses an SHA-256d-based Proof-of-Work (PoW) algorithm to reach network consensus. Its network has a target block time of 10 minutes and a maximum supply of 21 million tokens, with a decaying token emission rate. To prevent fluctuation of the block time, the network’s block difficulty is re-adjusted through an algorithm based on the past 2016 block times.
  • With a block size limit capped at 1 megabyte, the Bitcoin Protocol has supported both the Lightning Network, a second-layer infrastructure for payment channels, and Segregated Witness, a soft-fork to increase the number of transactions on a block, as solutions to network scalability.

https://preview.redd.it/s2gmpmeze3151.png?width=256&format=png&auto=webp&s=9759910dd3c4a15b83f55b827d1899fb2fdd3de1

1. What is Bitcoin (BTC)?

  • Bitcoin is a peer-to-peer cryptocurrency that aims to function as a means of exchange and is independent of any central authority. Bitcoins are transferred electronically in a secure, verifiable, and immutable way.
  • Network validators, who are often referred to as miners, participate in the SHA-256d-based Proof-of-Work consensus mechanism to determine the next global state of the blockchain.
  • The Bitcoin protocol has a target block time of 10 minutes, and a maximum supply of 21 million tokens. The only way new bitcoins can be produced is when a block producer generates a new valid block.
  • The protocol has a token emission rate that halves every 210,000 blocks, or approximately every 4 years.
  • Unlike public blockchain infrastructures supporting the development of decentralized applications (Ethereum), the Bitcoin protocol is primarily used only for payments, and has only very limited support for smart contract-like functionalities (Bitcoin “Script” is mostly used to create certain conditions that must be met before bitcoins can be spent).

2. Bitcoin’s core features

For a more beginner’s introduction to Bitcoin, please visit Binance Academy’s guide to Bitcoin.

Unspent Transaction Output (UTXO) model

A UTXO transaction works like a cash payment between two parties: Alice gives money to Bob and receives change (i.e., the unspent amount). In comparison, blockchains like Ethereum rely on the account model.
https://preview.redd.it/t1j6anf8f3151.png?width=1601&format=png&auto=webp&s=33bd141d8f2136a6f32739c8cdc7aae2e04cbc47
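To make the UTXO model above concrete, here is a minimal sketch (my own simplification; real Bitcoin transactions involve scripts, signatures and fee selection):

```python
# Simplified UTXO bookkeeping (amounts in satoshis): spend whole outputs,
# and send the unspent remainder back to the sender as a change output.
from dataclasses import dataclass

@dataclass
class UTXO:
    owner: str
    amount: int  # satoshis

def pay(utxo_set, sender, recipient, amount):
    inputs = [u for u in utxo_set if u.owner == sender]
    total = sum(u.amount for u in inputs)
    if total < amount:
        raise ValueError("insufficient funds")
    outputs = [UTXO(recipient, amount)]
    if total > amount:
        outputs.append(UTXO(sender, total - amount))   # change back to the sender
    others = [u for u in utxo_set if u.owner != sender]
    return others + outputs

utxo_set = [UTXO("Alice", 70_000_000), UTXO("Alice", 50_000_000)]
utxo_set = pay(utxo_set, "Alice", "Bob", 100_000_000)
print(utxo_set)  # Bob holds a 1.0 BTC output; Alice holds a 0.2 BTC change output
```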

Nakamoto consensus

In the Bitcoin network, anyone can join the network and become a bookkeeping service provider i.e., a validator. All validators are allowed in the race to become the block producer for the next block, yet only the first to complete a computationally heavy task will win. This feature is called Proof of Work (PoW).
The probability of any single validator to finish the task first is equal to the percentage of the total network computation power, or hash power, the validator has. For instance, a validator with 5% of the total network computation power will have a 5% chance of completing the task first, and therefore becoming the next block producer.
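That proportionality is easy to sanity-check with a toy simulation (the hash-power numbers are arbitrary, illustrative values):

```python
# Simulate many PoW races: a miner's chance of producing the next block
# approaches its share of the total network hash power.
import random

hashrates = {"miner_A": 5, "miner_B": 35, "miner_C": 60}   # arbitrary relative hash power
wins = {m: 0 for m in hashrates}
trials = 100_000

miners = list(hashrates)
weights = list(hashrates.values())
for _ in range(trials):
    winner = random.choices(miners, weights=weights)[0]
    wins[winner] += 1

for m in miners:
    print(m, f"{wins[m] / trials:.1%}")   # ~5%, ~35%, ~60%
```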
Since anyone can join the race, competition is prone to increase. In the early days, Bitcoin mining was mostly done by personal computer CPUs.
As of today, Bitcoin validators, or miners, have opted for dedicated and more powerful devices such as machines based on Application-Specific Integrated Circuit (“ASIC”).
Proof of Work secures the network as block producers must have spent resources external to the network (i.e., money to pay electricity), and can provide proof to other participants that they did so.
With various miners competing for block rewards, it becomes difficult for one single malicious party to gain network majority (defined as more than 51% of the network’s hash power in the Nakamoto consensus mechanism). The ability to rearrange transactions via 51% attacks indicates another feature of the Nakamoto consensus: the finality of transactions is only probabilistic.
Once a block is produced, it is then propagated by the block producer to all other validators to check on the validity of all transactions in that block. The block producer will receive rewards in the network’s native currency (i.e., bitcoin) as all validators approve the block and update their ledgers.

The blockchain

Block production

The Bitcoin protocol utilizes the Merkle tree data structure in order to organize hashes of numerous individual transactions into each block. This concept is named after Ralph Merkle, who patented it in 1979.
With the use of a Merkle tree, though each block might contain thousands of transactions, it has the ability to combine all of their hashes and condense them into one, allowing efficient and secure verification of this group of transactions. This single hash is called the Merkle root, which is stored in the Block Header of a block. The Block Header also stores other meta information of a block, such as a hash of the previous Block Header, which enables blocks to be associated in a chain-like structure (hence the name “blockchain”).
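A simplified sketch of how such a Merkle root is computed (it follows the “duplicate the last hash when a level is odd” rule, but ignores Bitcoin’s little-endian byte-order conventions):

```python
# Build a Merkle root by repeatedly hashing pairs of nodes with double SHA-256.
import hashlib

def dsha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])   # duplicate the last hash on odd-sized levels
        level = [dsha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

txs = [dsha256(f"tx{i}".encode()) for i in range(5)]
print(merkle_root(txs).hex())
```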
An illustration of block production in the Bitcoin Protocol is demonstrated below.

https://preview.redd.it/m6texxicf3151.png?width=1591&format=png&auto=webp&s=f4253304912ed8370948b9c524e08fef28f1c78d

Block time and mining difficulty

Block time is the period required to create the next block in a network. As mentioned above, the node who solves the computationally intensive task will be allowed to produce the next block. Therefore, block time is directly correlated to the amount of time it takes for a node to find a solution to the task. The Bitcoin protocol sets a target block time of 10 minutes, and attempts to achieve this by introducing a variable named mining difficulty.
Mining difficulty refers to how difficult it is for the node to solve the computationally intensive task. If the network sets a high difficulty for the task, while miners have low computational power, which is often referred to as “hashrate”, it would statistically take longer for the nodes to get an answer for the task. If the difficulty is low, but miners have rather strong computational power, statistically, some nodes will be able to solve the task quickly.
Therefore, the 10 minute target block time is achieved by constantly and automatically adjusting the mining difficulty according to how much computational power there is amongst the nodes. The average block time of the network is evaluated after a certain number of blocks, and if it is greater than the expected block time, the difficulty level will decrease; if it is less than the expected block time, the difficulty level will increase.
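The retargeting arithmetic itself is simple; here is a sketch (Bitcoin recalculates every 2016 blocks and clamps each adjustment to a factor of 4 in either direction):

```python
# Simplified Bitcoin-style difficulty retargeting.
TARGET_BLOCK_TIME = 10 * 60    # seconds
RETARGET_INTERVAL = 2016       # blocks

def retarget(old_difficulty: float, actual_timespan_seconds: float) -> float:
    expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL
    # Clamp so difficulty never moves more than 4x per retarget
    actual = min(max(actual_timespan_seconds, expected / 4), expected * 4)
    return old_difficulty * expected / actual

# Blocks came in twice as fast as intended -> difficulty doubles
print(retarget(old_difficulty=1.0, actual_timespan_seconds=2016 * 5 * 60))    # 2.0
# Blocks came in twice as slow as intended -> difficulty halves
print(retarget(old_difficulty=1.0, actual_timespan_seconds=2016 * 20 * 60))   # 0.5
```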

What are orphan blocks?

In a PoW blockchain network, if the block time is too low, it would increase the likelihood of nodes producing orphan blocks, for which they would receive no reward. Orphan blocks are produced by nodes who solved the task but did not broadcast their results to the whole network the quickest due to network latency.
It takes time for a message to travel through a network, and it is entirely possible for 2 nodes to complete the task and start to broadcast their results to the network at roughly the same time, while one’s messages are received by all other nodes earlier as the node has low latency.
Imagine there is a network latency of 1 minute and a target block time of 2 minutes. A node could solve the task in around 1 minute but his message would take 1 minute to reach the rest of the nodes that are still working on the solution. While his message travels through the network, all the work done by all other nodes during that 1 minute, even if these nodes also complete the task, would go to waste. In this case, 50% of the computational power contributed to the network is wasted.
The percentage of wasted computational power would proportionally decrease if the mining difficulty were higher, as it would statistically take longer for miners to complete the task. In other words, if the mining difficulty, and therefore targeted block time is low, miners with powerful and often centralized mining facilities would get a higher chance of becoming the block producer, while the participation of weaker miners would become in vain. This introduces possible centralization and weakens the overall security of the network.
However, given a limited amount of transactions that can be stored in a block, making the block time too long would decrease the number of transactions the network can process per second, negatively affecting network scalability.

3. Bitcoin’s additional features

Segregated Witness (SegWit)

Segregated Witness, often abbreviated as SegWit, is a protocol upgrade proposal that went live in August 2017.
SegWit separates witness signatures from transaction-related data. Witness signatures in legacy Bitcoin blocks often take more than 50% of the block size. By removing witness signatures from the transaction block, this protocol upgrade effectively increases the number of transactions that can be stored in a single block, enabling the network to handle more transactions per second. As a result, SegWit increases the scalability of Nakamoto consensus-based blockchain networks like Bitcoin and Litecoin.
SegWit also makes transactions cheaper. Since transaction fees are derived from how much data is being processed by the block producer, the more transactions that can be stored in a 1MB block, the cheaper individual transactions become.
https://preview.redd.it/depya70mf3151.png?width=1601&format=png&auto=webp&s=a6499aa2131fbf347f8ffd812930b2f7d66be48e
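The fee saving follows directly from how SegWit counts data: witness bytes are discounted in the “virtual size” that fees are charged on. A rough sketch - the weight formula is the one defined in BIP141, while the byte counts and fee rate are made-up examples:

```python
# SegWit fee accounting: weight = 3 * (size without witness) + (total size),
# and fees are paid per "virtual byte" = weight / 4.
def virtual_size(base_size: int, total_size: int) -> float:
    weight = 3 * base_size + total_size
    return weight / 4

FEE_RATE = 20  # sat/vbyte, illustrative

legacy_vsize = virtual_size(base_size=250, total_size=250)   # no witness to discount
segwit_vsize = virtual_size(base_size=140, total_size=250)   # 110 witness bytes discounted

print(legacy_vsize * FEE_RATE, segwit_vsize * FEE_RATE)      # 5000 vs 3350 sats
```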
The legacy Bitcoin block has a block size limit of 1 megabyte, and any change on the block size would require a network hard-fork. On August 1st 2017, the first hard-fork occurred, leading to the creation of Bitcoin Cash (“BCH”), which introduced an 8 megabyte block size limit.
Conversely, Segregated Witness was a soft-fork: it never changed the transaction block size limit of the network. Instead, it added an extended block with an upper limit of 3 megabytes, which contains solely witness signatures, to the 1 megabyte block that contains only transaction data. This new block type can be processed even by nodes that have not completed the SegWit protocol upgrade.
Furthermore, the separation of witness signatures from transaction data solves the malleability issue with the original Bitcoin protocol. Without Segregated Witness, these signatures could be altered before the block is validated by miners. Indeed, alterations can be done in such a way that if the system does a mathematical check, the signature would still be valid. However, since the values in the signature are changed, the two signatures would create vastly different hash values.
For instance, if a witness signature states “6,” it has a mathematical value of 6, and would create a hash value of 12345. However, if the witness signature were changed to “06”, it would maintain a mathematical value of 6 while creating a (faulty) hash value of 67890.
Since the mathematical values are the same, the altered signature remains a valid signature. This would create a bookkeeping issue, as transactions in Nakamoto consensus-based blockchain networks are documented with these hash values, or transaction IDs. Effectively, one can alter a transaction ID to a new one, and the new ID can still be valid.
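The effect of re-encoding a value is easy to demonstrate with plain hashes (illustrative only - real signatures are DER-encoded ECDSA values, not ASCII strings):

```python
# Two byte-level encodings of the "same" value hash to completely different IDs.
import hashlib

sig_a = b"6"
sig_b = b"06"   # same numeric value, different encoding

print(hashlib.sha256(sig_a).hexdigest()[:16])
print(hashlib.sha256(sig_b).hexdigest()[:16])
# Different digests -> a transaction containing sig_b gets a different txid
# than one containing sig_a, even though the signature still "checks out".
```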
This can create many issues, as illustrated in the below example:
  1. Alice sends Bob 1 BTC, and Bob sends Merchant Carol this 1 BTC for some goods.
  2. Bob sends Carol this 1 BTC while the transaction from Alice to Bob is not yet validated. Carol sees this incoming transaction of 1 BTC to her, and immediately ships the goods to Bob.
  3. At the moment, the transaction from Alice to Bob is still not confirmed by the network, and Bob can change the witness signature, therefore changing this transaction ID from 12345 to 67890.
  4. Now Carol will not receive her 1 BTC, as the network looks for transaction 12345 to ensure that Bob’s wallet balance is valid.
  5. As this particular transaction ID changed from 12345 to 67890, the transaction from Bob to Carol will fail, and Bob will get his goods while still holding his BTC.
With the Segregated Witness upgrade, such instances can not happen again. This is because the witness signatures are moved outside of the transaction block into an extended block, and altering the witness signature won’t affect the transaction ID.
Since the transaction malleability issue is fixed, Segregated Witness also enables the proper functioning of second-layer scalability solutions on the Bitcoin protocol, such as the Lightning Network.

Lightning Network

Lightning Network is a second-layer micropayment solution for scalability.
Specifically, Lightning Network aims to enable near-instant and low-cost payments between merchants and customers that wish to use bitcoins.
Lightning Network was conceptualized in a whitepaper by Joseph Poon and Thaddeus Dryja in 2015. Since then, it has been implemented by multiple companies. The most prominent of them include Blockstream, Lightning Labs, and ACINQ.
A list of curated resources relevant to Lightning Network can be found here.
In the Lightning Network, if a customer wishes to transact with a merchant, both of them need to open a payment channel, which operates off the Bitcoin blockchain (i.e., off-chain vs. on-chain). None of the transaction details from this payment channel are recorded on the blockchain, and only when the channel is closed will the end result of both party’s wallet balances be updated to the blockchain. The blockchain only serves as a settlement layer for Lightning transactions.
Since all transactions done via the payment channel are conducted independently of the Nakamoto consensus, both parties involved in transactions do not need to wait for network confirmation on transactions. Instead, transacting parties would pay transaction fees to Bitcoin miners only when they decide to close the channel.
https://preview.redd.it/cy56icarf3151.png?width=1601&format=png&auto=webp&s=b239a63c6a87ec6cc1b18ce2cbd0355f8831c3a8
One limitation of the Lightning Network is that it requires a person to be online to receive transactions addressed to them. Another limitation in user experience could be that one needs to lock up some funds every time one wishes to open a payment channel, and is only able to use those funds within the channel.
However, this does not mean he needs to create new channels every time he wishes to transact with a different person on the Lightning Network. If Alice wants to send money to Carol, but they do not have a payment channel open, they can ask Bob, who has payment channels open to both Alice and Carol, to help make that transaction. Alice will be able to send funds to Bob, and Bob to Carol. Hence, the number of “payment hubs” (i.e., Bob in the previous example) correlates with both the convenience and the usability of the Lightning Network for real-world applications.
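A toy model of a payment channel may help; this is my own simplification - real Lightning channels use commitment transactions, HTLCs and penalty mechanisms:

```python
# Toy payment channel: fund once on-chain, update balances off-chain many times,
# and settle only the final balances on-chain when the channel closes.
class PaymentChannel:
    def __init__(self, alice_deposit: int, bob_deposit: int):
        self.balances = {"Alice": alice_deposit, "Bob": bob_deposit}  # on-chain funding
        self.open = True

    def pay(self, sender: str, recipient: str, amount: int):
        assert self.open and self.balances[sender] >= amount
        self.balances[sender] -= amount        # off-chain update, no miners involved
        self.balances[recipient] += amount

    def close(self):
        self.open = False
        return self.balances                   # only this final state hits the blockchain

channel = PaymentChannel(alice_deposit=100_000, bob_deposit=0)
for _ in range(3):
    channel.pay("Alice", "Bob", 10_000)        # three near-instant micropayments
print(channel.close())                          # {'Alice': 70000, 'Bob': 30000}
```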

Schnorr Signature upgrade proposal

Elliptic Curve Digital Signature Algorithm (“ECDSA”) signatures are used to sign transactions on the Bitcoin blockchain.
https://preview.redd.it/hjeqe4l7g3151.png?width=1601&format=png&auto=webp&s=8014fb08fe62ac4d91645499bc0c7e1c04c5d7c4
However, many developers now advocate for replacing ECDSA with Schnorr Signature. Once Schnorr Signatures are implemented, multiple parties can collaborate in producing a signature that is valid for the sum of their public keys.
This would primarily be beneficial for network scalability. When multiple addresses were to conduct transactions to a single address, each transaction would require their own signature. With Schnorr Signature, all these signatures would be combined into one. As a result, the network would be able to store more transactions in a single block.
https://preview.redd.it/axg3wayag3151.png?width=1601&format=png&auto=webp&s=93d958fa6b0e623caa82ca71fe457b4daa88c71e
The reduced size in signatures implies a reduced cost on transaction fees. The group of senders can split the transaction fees for that one group signature, instead of paying for one personal signature individually.
Schnorr Signature also improves network privacy and token fungibility. A third-party observer will not be able to detect if a user is sending a multi-signature transaction, since the signature will be in the same format as a single-signature transaction.

4. Economics and supply distribution

The Bitcoin protocol utilizes the Nakamoto consensus, and nodes validate blocks via Proof-of-Work mining. The bitcoin token was not pre-mined, and has a maximum supply of 21 million. The initial reward for a block was 50 BTC per block. Block mining rewards halve every 210,000 blocks. Since the average time for block production on the blockchain is 10 minutes, it implies that the block reward halving events will approximately take place every 4 years.
As of May 12th 2020, the block mining rewards are 6.25 BTC per block. Transaction fees also represent a minor revenue stream for miners.
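The emission schedule above can be reproduced in a few lines; the halving arithmetic is the real one, and satoshi-level rounding is what keeps the total just under 21 million:

```python
# Bitcoin block subsidy by height, and the total supply it converges to.
SATS_PER_BTC = 100_000_000
HALVING_INTERVAL = 210_000

def block_subsidy_sats(height: int) -> int:
    halvings = height // HALVING_INTERVAL
    return (50 * SATS_PER_BTC) >> halvings if halvings < 64 else 0

print(block_subsidy_sats(0) / SATS_PER_BTC)         # 50.0
print(block_subsidy_sats(630_000) / SATS_PER_BTC)   # 6.25 (the May 2020 halving)

heights = range(0, 64 * HALVING_INTERVAL, HALVING_INTERVAL)
total_sats = sum(block_subsidy_sats(h) * HALVING_INTERVAL for h in heights)
print(total_sats / SATS_PER_BTC)                    # ~20,999,999.98 BTC, just under the cap
```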
submitted by D-platform to u/D-platform

Staking in Ethereum 2.0: when will it appear and how much can you earn on it?


Why coin staking will be added in Ethereum 2.0

A brief educational program for those who do not follow the updates to Vitalik Buterin's project. Ethereum has long been in need of an update, and the main problem of the network is scalability: the blockchain is overloaded, transactions are slowing down, and the cost of “gas” (transaction fees) is growing. If the consensus algorithm is not updated, the network will someday cease to be operational. To avoid this, developers have been working for several years on moving the network from the PoW algorithm to state 2.0, running on PoS. This should make the network more scalable, faster and cheaper.

In December last year, the first upgrade phase, Istanbul, was implemented in the network, and in April of this year, the Topaz test network with the possibility of staking was launched - the first users already earned 1%. In the PoS algorithm that Ethereum is switching to, there is no mining, and validation happens by delegating users' coins to the masternodes. For the duration of the delegation, these coins are frozen, and for providing their funds for block validation, users receive a portion of the reward. This is staking - a crypto analogue of a bank deposit.

There are several types of staking, with income from dividends or masternodes, but in all of them what matters is not the device’s power, as in PoW algorithms, but the number of coins held. The more coins, the higher the income. For crypto investors, staking is an opportunity to receive passive income from blocked coins. It is assumed that the launch of staking:
  • Will make ETH mining more affordable, but less resource intensive;
  • Will make the network more secure and secure - attacks will become too expensive;
  • Will create an entirely new sector of staking infrastructure around the platform;
  • Provides increased scalability, which will create the opportunity for wider implementation of DeFi protocols;
  • And, most importantly, it will show that Ethereum is a developing project.

The first payments to stakers will come one to two years after the launch of the update

The minimum validator stake will be 32 ETH (≈$6,092 as of today). This is the minimum number of coins that an ETH holder must freeze in order to qualify for payments. Another prerequisite is not to disconnect your wallet from the network. If the user disconnects and goes into automatic mode, he loses his daily income. If at some point the stake drops below 16 ETH, the user will be deprived of the right to be a validator.

The Ethereum network has to go through many more important stages before coin holders can make money on its storage. Collin Myers, the product strategy lead at Ethereum development studio ConsenSys, said that the genesis block of the new network will not be mined until the total amount of frozen funds reaches 524,000 ETH ($99.76 million at the time of publication). That many coins would have to be held by 16,375 validators with a minimum deposit of 32 ETH. Until this moment, none of them will receive a percentage profit. Myers noted that this event is not tied to a specific date and depends on the activity of the community. All validators will have to freeze a rather significant amount for an indefinite period in the new network, without confidence in the growth of the coin's price. It’s hard to say how long this will take. The developers believe that it will take 12-18 or even 24 months.

According to the latest ConsenSys Codefi report, more than 65% of the 300 ETH holders surveyed plan to use the staking opportunity. This sample, of course, is not representative, but it can be assumed that most major coin holders will still be willing to take the chance.

How much can you earn on Ethereum staking

Developers have been arguing for a long time about what the profitability for validators of the Ethereum 2.0 network should be. The economic model of the network keeps the inflation rate below 1% and dynamically adjusts the reward scale for validators. The difficulty is not to overpay, but also not to pay too little. Profitability will be variable, as it depends on the number and size of stakes, as well as other parameters. The fewer frozen coins and validators, the higher the yield, and vice versa. This is an easy way to motivate users to freeze ETH.

According to Collin Myers' October calculations, after the launch of Ethereum 2.0, validators will be able to receive from 4.6% to 10.3% per annum as a reward for their stake. At the summit, he clarified that right after the genesis block it could even reach 20.3%. But as the total amount staked grows, profitability will decline. So, with five million ETH staked, it drops to about 6.6%.

The above numbers are not net returns. They do not include equipment and electricity costs. According to Myers, after the genesis block, the costs of maintaining a validator node will be about 4.75% of the remuneration. They will continue to increase as the number of blocked coins increases, and with five million ETH staked they will grow to about 14.7%. Myers emphasised that profitability will be higher for those who run their own equipment rather than relying on cloud services. The latter, according to his calculations, at current prices can produce a loss of up to minus 15% per year. This, he believes, promotes true decentralisation.

At the end of April, Vitalik Buterin said that validators will be able to earn 5% per annum with a minimum stake of 32 ETH - 1.6 ETH per year, or $304 at the time of publication. However, given the cost of freezing funds, the real return will be around 0.8%.
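These estimates follow an inverse-square-root pattern: the more ETH is staked in total, the lower each validator's percentage return. A rough sketch, with the constant calibrated to the figures quoted above (so this is an approximation of the cited estimates, not the exact spec maths):

```python
# Approximate validator yield as a function of total ETH staked,
# calibrated so ~524k ETH staked gives ~20.3% and 5M ETH gives ~6.6%.
import math

CALIBRATION = 147  # assumed constant; real eth2 rewards follow a similar 1/sqrt curve

def gross_annual_yield(total_eth_staked: float) -> float:
    return CALIBRATION / math.sqrt(total_eth_staked)

for staked in (524_000, 1_000_000, 5_000_000, 10_000_000):
    print(f"{staked:>10,} ETH staked -> ~{gross_annual_yield(staked):.1%} gross yield")
```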

How to calculate profitability from ETH staking

The easiest way to calculate the estimated return for Ethereum staking is to use a special calculator, for example the online services EthereumPrice or Stakingrewards. The service takes into account the latest network profitability indicators as well as additional characteristics: the uptime of the node in the network, the price of the coin, the share of blocked ETH and so on. Depending on these values, the profit of a validator can vary greatly. For example, say you block 32 ETH at today's coin price of $190, 1% of all coins are blocked, and the node is online 99% of the time. According to the EthereumPrice calculator, in this case your yield will be 14.25% per annum, or 4.56 ETH.
Validator earnings from the example above for 10 years according to EthereumPrice.
If you change the data - the same stake, but with 10% of all coins blocked - your annual yield is only 4.51%, or 1.44 ETH.
Validator earnings from the second example over 10 years according to EthereumPrice.
It is important to note that this is profitability excluding expenses. Real returns will be significantly lower and in the second case may even be negative. In addition, you must consider exchange-rate fluctuations. Even with a yield of 14% per annum in ETH, dollar-denominated returns may be negative in a bear market.
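A small worked example of that currency risk, with an assumed bear-market price:

```python
# Gross yield in ETH can still mean a loss in dollar terms if the price falls.
stake_eth = 32
yield_eth = 0.1425              # 14.25% from the calculator example above
price_at_start = 190            # USD per ETH, as in the example
price_after_one_year = 150      # assumed bear-market price

start_value = stake_eth * price_at_start
end_value = stake_eth * (1 + yield_eth) * price_after_one_year
print(f"USD return: {(end_value / start_value - 1):+.1%}")   # about -9.8%
```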

When will the transition to Ethereum 2.0 start

Ben Edgington from Teku, one of the Ethereum 2.0 client teams, said at the last summit that the transition to PoS could be launched in July this year. These deadlines, if there are no new delays, were also mentioned by experts of the BitMEX crypto exchange in their recent report on the transition of the Ethereum ecosystem to stage 2.0. However, on May 12, Vitalik Buterin denied the possibility of launching Ethereum 2.0 in July. The network is not yet ready and is unlikely to be launched before the end of the year. July 30 marks the 5th anniversary of the launch of Ethereum. Unfortunately, it seems that the update will not make it in time for the anniversary.

Full deployment of the update will consist of several stages.

Phase 0. Beacon chain. The "zero" phase, which could be launched in July this year. In fact, it will only be a network test and PoS testing without economic activity, but it will use new ETH coins and the possibility of staking will appear. The "zero" phase will test the first layer of the Ethereum 2.0 architecture, along with Lighthouse, an Ethereum 2.0 client written in Rust that has been in development since 2018.

Phase 1. Sharding - rejection of full nodes in favour of load balancing between all network nodes (shards). This should increase network bandwidth and solve the scalability problem. This is the first full phase of Ethereum 2.0. It will initially be deployed with 64 shards. It is because of sharding that the transition of the network to a new state is so complicated - existing smart contracts cannot be transferred to the new network. Therefore, at first, perhaps for several years, both networks will exist simultaneously.

Phase 2. State execution. In this phase, various applications will work, and it will be possible to conclude smart contracts. This is a full-fledged working Ethereum 2.0 network. After the second phase, two networks will work in parallel - Ethereum and Ethereum 2.0. Coin holders will be able to transfer ETH from the first to the second without the ability to transfer them back. To stimulate network support, coin emission in both networks will increase until they merge. Read more about the phases of the transition to state 2.0 in the aforementioned BitMEX report.

How the upgrade to Ethereum 2.0 will affect the staking market and coin price

The transition of the second-largest coin to PoS will dramatically increase the amount staked across the market. A deposit of 32 ETH is too large for most users, so we should expect more staking offerings from exchanges. The launch of such a service in November was announced by the largest Swiss crypto exchange, Bitcoin Suisse. It will have no minimum deposit and the commission will be 15%.

According to October estimates by Binance Research analysts, the transition of Ethereum to stage 2.0 could double the price of the coin and the share of staking in the market, and it would also make ETH the most popular currency on a PoS algorithm. Adam Cochran, partner at MetaCartel Ventures DAO and developer at DuckDuckGo, argued in his blog that Ethereum's transition to state 2.0 would be the “biggest event” of the cryptocurrency market. He believes that a 3-5% return will attract the capital of large investors, and fear of missing out (FOMO) among retail investors will push them to actively buy coins. The planned coin-burning mechanism for each transaction will reduce the potential oversupply.

However, BitMEX experts in the report mentioned above believe that updating the network will not be as important an event as it seems to many, and will not have a significant impact on the coin price or the staking market. Initially, this will be more of a test of the PoS system than a full-fledged network. There will be no economic activity and smart contracts, and interest on a stake will not be paid immediately. Therefore, most of the economic activity will continue to take place on the original Ethereum network, which will work in parallel with the new one.

Analysts of the exchange emphasised that, thanks to the addition of staking, at first (for a short period, in their opinion) a large number of ETH will be locked up on the network. Most likely, this will limit the supply of coins and lead to higher prices. However, this could also release some of the ETH currently locked in smart contracts, and then the price will not rise. Moreover, the authors of the document are not sure that the demand for coins will be long-term and stable. For this to happen, PoS and sharding must prove that they work stably and provide the benefits for which the update was started. But if this happens, a wave of smart contract and DeFi protocol developers awaits the network.

In any case, quick changes should not be expected. A full transition to Ethereum 2.0 will take years and won’t be smooth - network failures are inevitable. We also believe that one should not rely on Ethereum staking as another panacea for all the problems of the coin and the market. Most likely, the transition of the network to PoS will not have a significant impact on the staking market, but it may positively affect the price of the coin. However, relying on an ETH rally in anticipation of this is too optimistic.
submitted by Smart_Smell to Robopay [link] [comments]

What are Nano's biggest issues? Let's talk about it!

Let's talk about some of Nano's biggest issues. I also made a video about this topic, available here: https://youtu.be/d9yb9ifurbg.
00:12 Spam
Issues
Potential Mitigations & Outstanding Issues
01:58 Privacy
Issues
  • Nano has no privacy. It is pseudonymous (like Bitcoin), not anonymous.
Potential Mitigations & Outstanding Issues
  • Second layer solutions like mixers can help, but some argue that isn't enough privacy.
  • The current protocol design + the computational overhead of privacy does not allow Nano to implement first-layer privacy without compromising its other features (fast, feeless, and scalable transactions).
02:56 Decentralization
Issues
  • Nano is currently not as decentralized as it could be. ~25% of the voting weight is held by Binance.
  • Users must choose representatives, and users don't always choose the best ones (or never choose).
Potential Mitigations & Outstanding Issues
  • Currently 4 unrelated parties (who all have a verifiable interest in keeping the network running) would have to work together to attack the network
  • Unlike Bitcoin, there is no mining or fees in Nano. This means that there is not a strong incentive for emergent centralization from profit maximization and economies of scale. We've seen this firsthand, as Nano's decentralization has increased over time.
  • Nano representative percentages are not that far off from Bitcoin mining pool percentages.
  • In Nano, voting weight can be remotely re-delegated to anyone at any time. This differs from Bitcoin, where consensus is controlled by miners and requires significant hardware investment.
  • The cost of a 51% attack scales with the market cap of Nano.
06:49 Marketing & adoption
Issues
  • The best technology doesn't always win. If no one knows about or uses Nano, it will die.
Potential Mitigations & Outstanding Issues
  • I would argue that the best technology typically does win, but it needs to be best in every way (price, speed, accessibility, etc). Nano is currently in a good place if you agree with that argument.
  • Bitcoin started small, and didn't spend money on marketing. It takes time to build a community.
  • The developers have said they will market more once the protocol is where they want it to be (v20 or v21?).
  • Community marketing initiatives have started to form organically (e.g. Twitter campaigns, YouTube ads, etc).
  • Marketing and adoption is a very difficult problem to solve, especially when you don't have first mover advantage or consistent cashflow.
08:07 Small developer fund
Issues
  • The developer fund only has 3 million NANO left (~$4MM), what happens after that?
Potential Mitigations & Outstanding Issues
  • The goal for Nano is to be an Internet RFC like TCP/IP or SMTP - development naturally slows down when the protocol is in a good place.
  • Nano development is completely open source, so anyone can participate. Multiple developers are now familiar with the Nano protocol.
  • Businesses and whales that benefit from Nano (exchanges, remittances, merchant services, etc) are incentivized to keep the protocol developed and running.
  • The developer fund was only ~5% of the supply - compare that to some of the other major cryptocurrencies.
10:08 Node incentives
Issues
  • There are no transaction fees, why would people run nodes to keep the network running?
Potential Mitigations & Outstanding Issues
  • The cost of consensus is so low in Nano that the benefits of the network itself are the incentive: decentralized money with 0 transaction fees that can be sent anywhere in the world nearly instantly.
  • Paying $50-$100 a month for a high-end node is a lot cheaper for merchants than paying 1-3% of total sales (see the break-even sketch after this list).
  • Businesses and whales that benefit from Nano (exchanges, remittances, merchant services, etc) are incentivized to keep the protocol developed and running.
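To make that comparison concrete, here is a tiny back-of-the-envelope sketch; the $75/month node cost and the 2% card-processing fee are assumptions for illustration, not figures from the post.

```python
# Assumed numbers for illustration: a $75/month node and a 2% card-processing fee.

NODE_COST_PER_MONTH = 75.0   # assumed hosting cost for a high-end node (USD)
CARD_FEE_RATE = 0.02         # assumed payment-processor fee on sales

def breakeven_monthly_sales() -> float:
    """Monthly sales volume above which running a node is cheaper than card fees."""
    return NODE_COST_PER_MONTH / CARD_FEE_RATE

print(f"Node wins once monthly sales exceed ${breakeven_monthly_sales():,.0f}")
```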
11:58 No smart contracts
Issues
  • Nano doesn't support smart contracts.
Potential Mitigations & Outstanding Issues
  • Nano's sole goal is to be the most efficient peer-to-peer value transfer protocol possible. Adding smart contracts makes keeping Nano feeless, fast, and decentralized much more difficult.
  • Other solutions (e.g. Ethereum) exist for creating and enforcing smart contracts.
  • Code can still interact with Nano, but not on the first layer in a decentralized manner.
  • Real world smart contract adoption and usage is pretty limited at the moment, but that might not always be the case.
13:20 Price stability
Issues
  • Why would anyone accept or spend Nano if the price fluctuates so much?
  • Why wouldn't people just use a stablecoin version of Nano for sending and receiving money?
Potential Mitigations & Outstanding Issues
  • With good fiat gateways (stable, low fees, etc), you can always buy back the fiat equivalent of what you've spent.
  • The hope is that with enough adoption, people and businesses will eventually skip the fiat conversion and use Nano directly.
  • Because Nano is so fast, volatility is less of an issue. Transactions are confirmed in <10 seconds, and prices change less in that timeframe (vs 10 minutes to hours for Bitcoin).
  • Stablecoins reintroduce trust. Stable against what? Who controls the supply, and how do you get people to adopt them? What happens if the assets they're stable against fail? Nano is pure supply and demand.
  • With worldwide adoption, the market capitalization of Nano would be in the trillions. If that happens, even millions of dollars won't move the price significantly.
15:06 Deflation
Issues
  • Nano's current supply == max supply. Why would people spend Nano today if it could be worth more tomorrow?
  • What happens to principal representatives and voting weight as private keys are lost? How do you know keys are lost?
Potential Mitigations & Outstanding Issues
  • Nano is extremely divisible. 1 NANO is 10^30 raw. Since there are no transaction fees, smaller and smaller amounts of Nano could be used to transact, even if the market cap reaches trillions (see the conversion sketch after this list).
  • People will always buy things they need (food, housing, etc).
  • I'm not sure what the plan is to adjust for lost keys. Probably requires more discussion.
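As a small illustration of that divisibility, here is a minimal conversion sketch, assuming the commonly used convention that 1 NANO equals 10^30 raw (the protocol's smallest unit).

```python
from decimal import Decimal

# Assumed convention: 1 NANO = 10**30 raw.
RAW_PER_NANO = Decimal(10) ** 30

def nano_to_raw(amount_nano: str) -> int:
    """Convert a decimal NANO amount into an integer number of raw."""
    return int(Decimal(amount_nano) * RAW_PER_NANO)

def raw_to_nano(amount_raw: int) -> str:
    """Convert raw back into a decimal NANO string."""
    return str(Decimal(amount_raw) / RAW_PER_NANO)

print(nano_to_raw("0.000001"))  # even a millionth of a NANO is still 10**24 raw
print(raw_to_nano(1))           # the smallest representable amount
```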
Long-term Scalability
Issue
  • Current node software and hardware cannot handle thousands of TPS (low-end nodes fall behind at even 50 TPS).
  • The more representatives that exist, the more vote traffic is required (network bandwidth).
  • Low-end nodes currently slow down the network significantly. Principal representatives waste their resources constantly bootstrapping these weak nodes during network saturation.
Potential Mitigations & Outstanding Issues
  • Even as is, Nano can comfortably handle 50 TPS on average, which is roughly PayPal's average transaction rate in 2011, when it had nearly 100 million users.
  • Network bandwidth increases 50% a year.
  • There are some discussions of prioritizing bootstrapping by vote weight to limit the impact of weak nodes.
  • Since Nano uses an account balance system, pruning could drastically reduce storage requirements. You only need current state to keep the network running, not the full transaction history.
  • In the future, vote stapling could drastically reduce bandwidth usage by collecting all representative signatures up front and then only sharing that single aggregate signature (a rough bandwidth estimate follows this list).
  • Nano has no artificial protocol-based limits (e.g. block sizes or block times). It scales with hardware.
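To give a feel for the numbers involved, here is a hedged back-of-the-envelope estimate of vote traffic; the ~200 bytes per vote and 100 principal representatives are assumptions for illustration only, not measurements from the network.

```python
# Assumptions for illustration only: ~200 bytes per vote message and 100
# principal representatives; neither number comes from the post above.

def vote_bandwidth_mbps(tps: float, principal_reps: int,
                        vote_bytes: int = 200, stapled: bool = False) -> float:
    """Rough vote traffic, in megabits per second, seen by a node.

    Without aggregation, every principal representative broadcasts a vote per
    transaction; with vote stapling, one aggregate signature is shared instead.
    """
    votes_per_tx = 1 if stapled else principal_reps
    bytes_per_second = tps * votes_per_tx * vote_bytes
    return bytes_per_second * 8 / 1_000_000

print(vote_bandwidth_mbps(tps=50, principal_reps=100))                # one vote per rep
print(vote_bandwidth_mbps(tps=50, principal_reps=100, stapled=True))  # aggregated
```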
submitted by Qwahzi to nanocurrency [link] [comments]

Technical: A Brief History of Payment Channels: from Satoshi to Lightning Network

Who cares about political tweets from some random country's president when payment channels are much more interesting and actually capable of carrying value?
So let's have a short history of various payment channel techs!

Generation 0: Satoshi's Broken nSequence Channels

Because Satoshi's Vision included payment channels, except his implementation sucked so hard we had to go fix it and added RBF as a by-product.
Originally, the plan for nSequence was that mempools would replace any transaction spending certain inputs with another transaction spending the same inputs, but only if the nSequence field of the replacement was larger.
Since 0xFFFFFFFF was the highest value that nSequence could get, this would mark a transaction as "final" and not replaceable on the mempool anymore.
In fact, this "nSequence channel" I will describe is the reason why we have this weird rule about nLockTime and nSequence. nLockTime actually only works if nSequence is not 0xFFFFFFFF i.e. final. If nSequence is 0xFFFFFFFF then nLockTime is ignored, because this if the "final" version of the transaction.
So what you'd do would be something like this:
  1. You go to a bar and promise the bartender to pay by the time the bar closes. Because this is the Bitcoin universe, time is measured in blockheight, so the closing time of the bar is indicated as some future blockheight.
  2. For your first drink, you'd make a transaction paying to the bartender for that drink, paying from some coins you have. The transaction has an nLockTime equal to the closing time of the bar, and a starting nSequence of 0. You hand over the transaction and the bartender hands you your drink.
  3. For your succeeding drink, you'd remake the same transaction, adding the payment for that drink to the transaction output that goes to the bartender (so that output keeps getting larger, by the amount of payment), and having an nSequence that is one higher than the previous one.
  4. Eventually you have to stop drinking. It comes down to one of two possibilities:
    • You drink until the bar closes. Since it is now the nLockTime indicated in the transaction, the bartender is able to broadcast the latest transaction and tells the bouncers to kick you out of the bar.
    • You wisely consider the state of your liver. So you re-sign the last transaction with a "final" nSequence of 0xFFFFFFFF i.e. the maximum possible value it can have. This allows the bartender to get his or her funds immediately (nLockTime is ignored if nSequence is 0xFFFFFFFF), so he or she tells the bouncers to let you out of the bar.
Now that of course is a payment channel. Individual payments (purchases of alcohol, so I guess buying coffee is not in scope for payment channels). Closing is done by creating a "final" transaction that is the sum of the individual payments. Sure there's no routing and channels are unidirectional and channels have a maximum lifetime but give Satoshi a break, he was also busy inventing Bitcoin at the time.
Now if you noticed I called this kind of payment channel "broken". This is because the mempool rules are not consensus rules, and cannot be validated (nothing about the mempool can be validated onchain: I sigh every time somebody proposes "let's make block size dependent on mempool size", mempool state cannot be validated by onchain data). Fullnodes can't see all of the transactions you signed, and then validate that the final one with the maximum nSequence is the one that actually is used onchain. So you can do the below:
  1. Become friends with Jihan Wu, because he owns >51% of the mining hashrate (he totally reorged Bitcoin to reverse the Binance hack right?).
  2. Slip Jihan Wu some of the more interesting drinks you're ordering as an incentive to cooperate with you. So say you end up ordering 100 drinks, you split it with Jihan Wu and give him 50 of the drinks.
  3. When the bar closes, Jihan Wu quickly calls his mining rig and tells them to mine the version of your transaction with nSequence 0. You know, that first one where you pay for only one drink.
  4. Because fullnodes cannot validate nSequence, they'll accept even the nSequence=0 version and confirm it, immutably adding you paying for a single alcoholic drink to the blockchain.
  5. The bartender, pissed at being cheated, takes out a shotgun from under the bar and shoots at you and Jihan Wu.
  6. Jihan Wu uses his mystical chi powers (actually the combined exhaust from all of his mining rigs) to slow down the shotgun pellets, making them hit you as softly as petals drifting in the wind.
  7. The bartender mutters some words, clothes ripping apart as he or she (hard to believe it could be a she but hey) turns into a bear, ready to maul you for cheating him or her of the payment for all the 100 drinks you ordered from him or her.
  8. Steely-eyed, you stand in front of the bartender-turned-bear, daring him to touch you. You've watched Revenant, you know Leonardo di Caprio could survive a bear mauling, and if some posh actor can survive that, you know you can too. You make a pose. "Drunken troll logic attack!"
  9. I think I got sidetracked here.
Lessons learned?

Spilman Channels

Incentive-compatible time-limited unidirectional channel; or, Satoshi's Vision, Fixed (if transaction malleability hadn't been a problem, that is).
Now, we know the bartender will turn into a bear and maul you if you try to cheat the payment channel, and now that we've revealed you're good friends with Jihan Wu, the bartender will no longer accept a payment channel scheme that lets you cooperate with a miner to cheat the bartender.
Fortunately, Jeremy Spilman proposed a better way that would not let you cheat the bartender.
First, you and the bartender perform this ritual:
  1. You get some funds and create a transaction that pays to a 2-of-2 multisig between you and the bartender. You don't broadcast this yet: you just sign it and get its txid.
  2. You create another transaction that spends the above transaction. This transaction (the "backoff") has an nLockTime equal to the closing time of the bar, plus one block. You sign it and give this backoff transaction (but not the above transaction) to the bartender.
  3. The bartender signs the backoff and gives it back to you. It is now valid since it's spending a 2-of-2 of you and the bartender, and both of you have signed the backoff transaction.
  4. Now you broadcast the first transaction onchain. You and the bartender wait for it to be deeply confirmed, then you can start ordering.
The above is probably vaguely familiar to LN users. It's the funding process of payment channels! The first transaction, the one that pays to a 2-of-2 multisig, is the funding transaction that backs the payment channel funds.
So now you start ordering in this way:
  1. For your first drink, you create a transaction spending the funding transaction output and sending the price of the drink to the bartender, with the rest returning to you.
  2. You sign the transaction and pass it to the bartender, who serves your first drink.
  3. For your succeeding drinks, you recreate the same transaction, adding the price of the new drink to the sum that goes to the bartender and reducing the money returned to you. You sign the transaction and give it to the bartender, who serves you your next drink.
  4. At the end:
    • If the bar closing time is reached, the bartender signs the latest transaction, completing the needed 2-of-2 signatures and broadcasting this to the Bitcoin network. Since the backoff transaction's nLockTime is the closing time + 1, it can't be used yet at closing time.
    • If you decide you want to leave early because your liver is crying, you just tell the bartender to go ahead and close the channel (which the bartender can do at any time by just signing and broadcasting the latest transaction: the bartender won't do that because he or she is hoping you'll stay and drink more).
    • If you ended up just hanging around the bar and never ordering, then at closing time + 1 you broadcast the backoff transaction and get your funds back in full.
Now, even if you pass 50 drinks to Jihan Wu, you can't give him the first transaction (the one which pays for only one drink) and ask him to mine it: it's spending a 2-of-2 and the copy you have only contains your own signature. You need the bartender's signature to make it valid, but he or she sure as hell isn't going to cooperate in something that would lose him or her money, so a signature from the bartender validating old state where he or she gets paid less isn't going to happen.
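To summarize the mechanics, here is a minimal toy model of the channel state described above. The signatures are only implied: the point is that old states pay the bartender less, and only the bartender can complete the 2-of-2 with his or her own signature, so old states are worthless to him or her.

```python
# A toy model of a Spilman channel's offchain state, assuming a single funding
# output of `capacity` satoshis. Real channels exchange actual signed Bitcoin
# transactions; this only tracks the amounts in the latest state.

from dataclasses import dataclass

@dataclass
class ChannelState:
    paid_to_bartender: int   # cumulative drinks paid for, in satoshis
    refund_to_customer: int  # remainder returned to the customer

class SpilmanChannel:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.latest = ChannelState(0, capacity)

    def pay(self, amount: int) -> ChannelState:
        """Customer signs a new state that gives the bartender more.

        Old states are useless to the bartender: they pay less, and the
        bartender is the only one who can complete the 2-of-2 signature.
        """
        new_total = self.latest.paid_to_bartender + amount
        if new_total > self.capacity:
            raise ValueError("channel exhausted, open a new one")
        self.latest = ChannelState(new_total, self.capacity - new_total)
        return self.latest

channel = SpilmanChannel(capacity=100_000)
channel.pay(5_000)     # first drink
channel.pay(7_000)     # second drink
print(channel.latest)  # at closing, only this latest state gets broadcast
```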
So, problem solved, right? Right? Okay, let's try it. So you get your funds, put them in a funding tx, get the backoff tx, confirm the funding tx...
Once the funding transaction confirms deeply, the bartender laughs uproariously. He or she summons the bouncers, who surround you menacingly.
"I'm refusing service to you," the bartender says.
"Fine," you say. "I was leaving anyway;" You smirk. "I'll get back my money with the backoff transaction, and posting about your poor service on reddit so you get negative karma, so there!"
"Not so fast," the bartender says. His or her voice chills your bones. It looks like your exploitation of the Satoshi nSequence payment channel is still fresh in his or her mind. "Look at the txid of the funding transaction that got confirmed."
"What about it?" you ask nonchalantly, as you flip open your desktop computer and open a reputable blockchain explorer.
What you see shocks you.
"What the --- the txid is different! You--- you changed my signature?? But how? I put the only copy of my private key in a sealed envelope in a cast-iron box inside a safe buried in the Gobi desert protected by a clan of nomads who have dedicated their lives and their childrens' lives to keeping my private key safe in perpetuity!"
"Didn't you know?" the bartender asks. "The components of the signature are just very large numbers. The sign of one of the signature components can be changed, from positive to negative, or negative to positive, and the signature will remain valid. Anyone can do that, even if they don't know the private key. But because Bitcoin includes the signatures in the transaction when it's generating the txid, this little change also changes the txid." He or she chuckles. "They say they'll fix it by separating the signatures from the transaction body. They're saying that these kinds of signature malleability won't affect transaction ids anymore after they do this, but I bet I can get my good friend Jihan Wu to delay this 'SepSig' plan for a good while yet. Friendly guy, this Jihan Wu, it turns out all I had to do was slip him 51 drinks and he was willing to mine a tx with the signature signs flipped." His or her grin widens. "I'm afraid your backoff transaction won't work anymore, since it spends a txid that is not existent and will never be confirmed. So here's the deal. You pay me 99% of the funds in the funding transaction, in exchange for me signing the transaction that spends with the txid that you see onchain. Refuse, and you lose 100% of the funds and every other HODLer, including me, benefits from the reduction in coin supply. Accept, and you get to keep 1%. I lose nothing if you refuse, so I won't care if you do, but consider the difference of getting zilch vs. getting 1% of your funds." His or her eyes glow. "GENUFLECT RIGHT NOW."
Lesson learned?

CLTV-protected Spilman Channels

Using CLTV for the backoff branch.
This variation is simply Spilman channels, but with the backoff transaction replaced with a backoff branch in the SCRIPT you pay to. It only became possible after OP_CHECKLOCKTIMEVERIFY (CLTV) was enabled in 2015.
Now as we saw in the Spilman Channels discussion, transaction malleability means that any pre-signed offchain transaction can easily be invalidated by flipping the sign of the signature of the funding transaction while the funding transaction is not yet confirmed.
This can be avoided by simply putting any special requirements into an explicit branch of the Bitcoin SCRIPT. Now, the backoff branch is supposed to create a maximum lifetime for the payment channel, and prior to the introduction of OP_CHECKLOCKTIMEVERIFY this could only be done by having a pre-signed nLockTime transaction.
With CLTV, however, we can now make the branches explicit in the SCRIPT that the funding transaction pays to.
Instead of paying to a 2-of-2 in order to set up the funding transaction, you pay to a SCRIPT which is basically "2-of-2, OR this singlesig after a specified lock time".
With this, there is no backoff transaction that is pre-signed and which refers to a specific txid. Instead, you can create the backoff transaction later, using whatever txid the funding transaction ends up being confirmed under. Since the funding transaction is immutable once confirmed, it is no longer possible to change the txid afterwards.
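A hedged sketch of what such a redeem SCRIPT could look like follows. The opcode names are real Bitcoin Script opcodes, but the helper function and placeholder keys are purely illustrative; nothing here serializes or validates a real script.

```python
# Template of the SCRIPT described above: "2-of-2 multisig, OR a single-sig
# refund after an absolute locktime" (the CLTV backoff branch).

def cltv_spilman_script(customer_pubkey: str, bartender_pubkey: str,
                        bar_closing_height: int) -> list[str]:
    return [
        "OP_IF",
            # Cooperative branch: both parties sign the latest channel state.
            "OP_2", customer_pubkey, bartender_pubkey, "OP_2",
            "OP_CHECKMULTISIG",
        "OP_ELSE",
            # Backoff branch: after the bar's closing height the customer
            # can reclaim the funds unilaterally.
            str(bar_closing_height), "OP_CHECKLOCKTIMEVERIFY", "OP_DROP",
            customer_pubkey, "OP_CHECKSIG",
        "OP_ENDIF",
    ]

print(" ".join(cltv_spilman_script("<customer_pub>", "<bartender_pub>", 650_000)))
```

Because the locktime condition lives in the SCRIPT itself rather than in a pre-signed transaction, the refund path no longer depends on a specific txid and is therefore immune to the malleability attack above.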

Todd Micropayment Networks

The old hub-spoke model (that isn't how LN today actually works).
One of the more direct predecessors of the Lightning Network was the hub-spoke model discussed by Peter Todd. In this model, instead of payers directly having channels to payees, payers and payees connect to a central hub server. This allows any payer to pay any payee, using the same channel for every payee on the hub. Similarly, this allows any payee to receive from any payer, using the same channel.
Remember from the above Spilman example? When you open a channel to the bartender, you have to wait around for the funding tx to confirm. This will take an hour at best. Now consider that you have to make channels for everyone you want to pay to. That's not very scalable.
So the Todd hub-spoke model has a central "clearing house" that transports money from payers to payees. The "Moonbeam" project takes this model. Of course, this reveals to the hub who the payer and payee are, and thus the hub can potentially censor transactions. Generally, though, it was considered that a hub would more efficiently censor by just not maintaining a channel with the payer or payee that it wants to censor (since the money it owned in the channel would just be locked uselessly if the hub won't process payments to/from the censored user).
In any case, the ability of the central hub to monitor payments means that it can surveill the payer and payee, and then sell this private transactional data to third parties. This loss of privacy would be intolerable today.
Peter Todd also proposed that there might be multiple hubs that could transport funds to each other on behalf of their users, providing somewhat better privacy.
Another point of note is that at the time such networks were proposed, only unidirectional (Spilman) channels were available. Thus, while one could be a payer, or payee, you would have to use separate channels for your income versus for your spending. Worse, if you wanted to transfer money from your income channel to your spending channel, you had to close both and reshuffle the money between them, both onchain activities.

Poon-Dryja Lightning Network

Bidirectional two-participant channels.
The Poon-Dryja channel mechanism has two important properties:
Both the original Satoshi and the two Spilman variants are unidirectional: there is a payer and a payee, and if the payee wants to do a refund, or wants to pay for a different service or product the payer is providing, then they can't use the same unidirectional channel.
The Poon-Dryja mechanism allows channels, however, to be bidirectional instead: you are not a payer or a payee on the channel, you can receive or send at any time as long as both you and the channel counterparty are online.
Further, unlike either of the Spilman variants, there is no time limit for the lifetime of a channel. Instead, you can keep the channel open for as long as you want.
Both properties, together, form a very powerful scaling property that I believe most people have not appreciated. With unidirectional channels, as mentioned before, if you both earn and spend over the same network of payment channels, you would have separate channels for earning and spending. You would then need to perform onchain operations to "reverse" the directions of your channels periodically. Secondly, since Spilman channels have a fixed lifetime, even if you never used either channel, you would have to periodically "refresh" it by closing it and reopening.
With bidirectional, indefinite-lifetime channels, you may instead open some channels when you first begin managing your own money, then close them only after your lawyers have executed your last will and testament on how the money in your channels gets divided up among your heirs: that's just two onchain transactions in your entire lifetime. That is the potentially very powerful scaling property that bidirectional, indefinite-lifetime channels allow.
I won't discuss the transaction structure needed for Poon-Dryja bidirectional channels --- it's complicated and you can easily get explanations with cute graphics elsewhere.
There is a weakness of Poon-Dryja that people tend to gloss over (because it was fixed very well by RustyReddit):
Another thing I want to emphasize is that while the Lightning Network paper and many of the earlier presentations developed from the old Peter Todd hub-and-spoke model, the modern Lightning Network takes the logical conclusion of removing a strict separation between "hubs" and "spokes". Any node on the Lightning Network can very well work as a hub for any other node. Thus, while you might operate as "mostly a payer", "mostly a forwarding node", "mostly a payee", you still end up being at least partially a forwarding node ("hub") on the network, at least part of the time. This greatly reduces the problems of privacy inherent in having only a few hub nodes: forwarding nodes cannot get significantly useful data from the payments passing through them, because the distance between the payer and the payee can be so large that it would be likely that the ultimate payer and the ultimate payee could be anyone on the Lightning Network.
Lessons learned?

Future

After LN, there's also the Decker-Wattenhofer Duplex Micropayment Channels (DMC). This post is long enough as-is, LOL. But for now, it uses a novel "decrementing nSequence channel", using the new relative-timelock semantics of nSequence (not the broken one originally by Satoshi). It actually uses multiple such "decrementing nSequence" constructs, terminating in a pair of Spilman channels, one in each direction (thus "duplex"). Maybe I'll discuss it some other time.
The realization that channel constructions could actually hold more channel constructions inside them (the way the Decker-Wattenhofer construction puts a pair of Spilman channels inside a series of "decrementing nSequence channels") led to the further thought behind Burchert-Decker-Wattenhofer channel factories. Basically, you could host multiple two-participant channel constructs inside a larger multiparticipant "channel" construct (i.e. host multiple channels inside a factory).
Further, we have the Decker-Russell-Osuntokun or "eltoo" construction. I'd argue that this is "nSequence done right". I'll write more about this later, because this post is long enough.
Lessons learned?
submitted by almkglor to Bitcoin [link] [comments]

Bitcoin has been growing in popularity since it was created almost a decade ago. It is used by more and more people each day, and more transactions mean bigger blocks forming the blockchain. The blocks produced by the Bitcoin network (approximately every 10 minutes) are now exceeding the 1MB limit that was in place before Segregated Witness was introduced in August of last year. A block can contain an indefinite number of individual transactions; the limit is only the block size of 1 megabyte (MB). Miners compete to produce these blocks. Proof-of-work is a cryptographic puzzle that determines which miner is allowed to write the next block into the Bitcoin blockchain. The miners take the transactions (or their Merkle tree) and add a random ...
Regarding block size, Monero does not have a fixed cap, unlike Bitcoin's 4 million block weight units. Instead, it has a dynamic block size, meaning that blocks can expand to accommodate increased demand. Accordingly, if demand is reduced, the permitted size will shrink. The size is calculated by looking at the median size of the previous ...
The process of mining Bitcoin Cash is similar to mining Bitcoin; nonetheless, there are a few differences. For instance, the block size limit of Bitcoin is 1 MB, while that of Bitcoin Cash is 8 MB, which means that you'll need more computing power and investment to mine BCH. However, bigger blocks contain more transaction fees, which go to the ...
Since its genesis block, Bitcoin mining has pivoted from CPUs to GPUs; since 2014, BTC has been mostly mined by Application-Specific Integrated Circuits (ASICs). Chart 1 - Mining difficulty of Bitcoin (BTC) (log scale). Sources: Binance Research, Coin Metrics. As illustrated by chart 1, the mining difficulty of Bitcoin has increased dramatically since 2010. Before 2010, the ...
The next Bitcoin halving event is due in May 2020. The Bitcoin block reward will halve, rewarding miners with 6.25 BTC for each block. A short history of Bitcoin block size: in mid-2015, the then-lead Bitcoin developer, Gavin Andresen, issued a warning that the Bitcoin block size was an impending issue for the Bitcoin network.
The Bitcoin network is scaling, regularly mining blocks over the old 1MB block size limit and recently reaching a new all-time high of 1.3 MB average block size amid record-low transaction fees.
Bitcoin mining firms will grow, but a block size limit can't stop centralization. In regards to Bitcoin, there is one aspect of the economics of competition that is negative. Due to economies of scale, Bitcoin mining firms will grow to an empirically optimal size, which possibly means that one firm could gain 51% or more of the network's hashing power.
The major selling point for Bitcoin Cash over BTC was supposedly the fact that the coin would scale better via bigger 32MB blocks. However, it appears that Bitcoin Cash miners are limiting block sizes to 2MB, which is more or less the limit for Bitcoin.
The revelation raises questions as to whether BCH can continue to claim to be an upgrade from Bitcoin.
