
ECDSA In Bitcoin

Digital signatures are considered the foundation of online sovereignty. The advent of public-key cryptography in 1976 paved the way for the creation of a global communications tool – the Internet, and a completely new form of money – Bitcoin. Although the fundamental properties of public-key cryptography have not changed much since then, dozens of different open-source digital signature schemes are now available to cryptographers.

How ECDSA was incorporated into Bitcoin

When Satoshi Nakamoto, the mysterious founder of the first cryptocurrency, started working on Bitcoin, one of the key decisions was selecting a signature scheme for an open, public financial system. The requirements were clear: the algorithm had to be widely used, well understood, sufficiently secure, easy to apply, and, most importantly, open-source.
Of all the options available at the time, he chose the one that met these criteria: the Elliptic Curve Digital Signature Algorithm, or ECDSA.
At that time, native support for ECDSA was provided in OpenSSL, an open-source set of encryption tools developed by experienced cryptographers to improve the confidentiality of online communications. Compared to other popular schemes, ECDSA offered the same security with far smaller keys and signatures.
These are extremely useful properties for digital money. For example, a 256-bit ECDSA key provides roughly the same level of security as a 3072-bit RSA (Rivest, Shamir, and Adleman) key, at a fraction of the size.

Basic principles of ECDSA

ECDSA is a process that uses elliptic curves and finite fields to “sign” data in such a way that third parties can easily verify the authenticity of a signature, while only the signer retains the ability to create one. In the case of Bitcoin, the “data” being signed is a transaction that transfers ownership of bitcoins.
ECDSA has two separate procedures for signing and verifying. Each procedure is an algorithm consisting of several arithmetic operations. The signature algorithm uses the private key, and the verification algorithm uses only the public key.
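To make the two procedures concrete, here is a toy ECDSA over secp256k1 in pure Python (3.8+ for the three-argument `pow`). This is a sketch for illustration only, not how a real wallet does it (production code uses hardened, constant-time libraries), and the helper names are mine:

```python
import hashlib, secrets

# secp256k1 domain parameters (SEC 2); helper names below are mine
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q  # curve has a = 0
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

def sign(priv, msg):  # signing uses the private key
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    while True:
        k = secrets.randbelow(N - 1) + 1      # fresh random nonce per signature
        r = ec_mul(k)[0] % N
        if r == 0:
            continue
        s = pow(k, -1, N) * (z + r * priv) % N
        if s != 0:
            return (r, s)

def verify(pub, msg, sig):  # verification uses only the public key
    r, s = sig
    if not (0 < r < N and 0 < s < N):
        return False
    z = int.from_bytes(hashlib.sha256(msg).digest(), "big") % N
    w = pow(s, -1, N)
    X = ec_add(ec_mul(z * w % N), ec_mul(r * w % N, pub))
    return X is not None and X[0] % N == r

priv = secrets.randbelow(N - 1) + 1  # private key: a random scalar
pub = ec_mul(priv)                   # public key: priv * G
sig = sign(priv, b"send 1 BTC to Alice")
print(verify(pub, b"send 1 BTC to Alice", sig))    # True
print(verify(pub, b"send 2 BTC to Mallory", sig))  # False
```

Note how the second check fails: the signature binds the private key to one exact message.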
To use ECDSA, a protocol such as Bitcoin must fix a set of parameters for the elliptic curve and its finite field, so that every user of the protocol knows and applies the same parameters. Otherwise, everyone would be solving their own equations, the results would never match, and no one could agree on anything.
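For secp256k1, the fixed parameters everyone must share are published in the SEC 2 standard. A quick sanity check in Python that the agreed generator point actually lies on the agreed curve:

```python
# secp256k1 domain parameters, as published in SEC 2
p = 2**256 - 2**32 - 977          # field prime
a, b = 0, 7                       # curve: y^2 = x^3 + a*x + b
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order

# the generator must satisfy the curve equation mod p
assert (Gy * Gy - (Gx**3 + a * Gx + b)) % p == 0
print("G is on the curve; private keys are scalars below", hex(n))
```

Any two users who hard-code these constants will agree on every signature computation.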
For all of these parameters, Bitcoin uses very, very large (well, astonishingly huge) numbers. This matters: all practical applications of ECDSA use huge numbers, because the security of the algorithm rests on these values being far too large to recover a key by simple brute force. A 384-bit ECDSA key is considered secure enough for the NSA, the most secretive intelligence agency of the United States.

Replacement of ECDSA

Thanks to the hard work of Pieter Wuille (a well-known cryptographer) and his colleagues on libsecp256k1, an optimized implementation of Bitcoin's secp256k1 elliptic curve, ECDSA in Bitcoin has become even faster and more efficient. However, ECDSA still has shortcomings that can justify its complete replacement. After several years of research and experimentation, a new signature scheme was adopted to improve the privacy and efficiency of Bitcoin transactions: the Schnorr digital signature scheme.
Schnorr signatures take the use of keys to a new level. A Schnorr signature occupies only 64 bytes in a block, which reduces the space taken up by transactions by about 4%. And since all Schnorr signatures are the same size, the total size of the part of a block that contains such signatures can be calculated in advance. Being able to pre-calculate block size is key to increasing it safely in the future.
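A simplified textbook Schnorr signature (not the exact BIP340 encoding Bitcoin adopted; helper names are mine) shows where the fixed 64-byte size comes from, a 32-byte nonce x-coordinate plus a 32-byte scalar:

```python
import hashlib, secrets

# secp256k1 parameters (SEC 2); helpers are my own toy versions
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

def H(*parts):  # challenge hash
    h = hashlib.sha256()
    for part in parts:
        h.update(part)
    return int.from_bytes(h.digest(), "big") % N

def xb(pt):  # 32-byte x-coordinate
    return pt[0].to_bytes(32, "big")

d = secrets.randbelow(N - 1) + 1   # private key
Q = ec_mul(d)                      # public key
msg = b"a taproot spend"

k = secrets.randbelow(N - 1) + 1   # per-signature nonce
R = ec_mul(k)
e = H(xb(R), xb(Q), msg)           # challenge binds nonce, key, and message
s = (k + e * d) % N                # signature is (R.x, s): 32 + 32 = 64 bytes

# verification: s*G == R + e*Q, because s*G = k*G + e*d*G
assert ec_mul(s) == ec_add(R, ec_mul(e, Q))
print("signature size:", len(xb(R) + s.to_bytes(32, "big")), "bytes")
```

Because `s` is a single scalar regardless of the message, every signature comes out the same size.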
Keep up with the news of the crypto world at CoinJoy.io Follow us on Twitter and Medium. Subscribe to our YouTube channel. Join our Telegram channel. For any inquiries mail us at [[email protected]](mailto:[email protected]).
submitted by CoinjoyAssistant to btc

submitted by CoinjoyAssistant to Bitcoin

Technical: Upcoming Improvements to Lightning Network

Price? Who gives a shit about price when Lightning Network development is a lot more interesting?????
One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On one hand it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up.
The below is just a small sampling of LN stuff I personally find interesting. There's a bunch of other stuff, like splicing and dual-funding, that I won't cover --- this post is long enough as-is, and besides, some of the below aren't as well-known.
Anyway.....

"eltoo" Decker-Russell-Osuntokun

Yeah the exciting new Lightning Network channel update protocol!

Advantages

Myths

Disadvantages

Multipart payments / AMP

Splitting up large payments into smaller parts!

Details

Advantages

Disadvantages

Payment points / scalars

Using the magic of elliptic curve homomorphism for fun and Lightning Network profits!
Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when inputted to SHA256, returns the given payment hash.
Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point.
This is basically Scriptless Script usage on Lightning, instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
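The contrast between the two lock types can be sketched in a few lines of Python (3.8+; variable and helper names are mine, illustrative only):

```python
import hashlib, secrets

# secp256k1 parameters (SEC 2) and toy helpers
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# HTLC style: the invoice carries a hash, the receiver reveals its preimage
preimage = secrets.token_bytes(32)
payment_hash = hashlib.sha256(preimage).digest()   # goes into the invoice
assert hashlib.sha256(preimage).digest() == payment_hash

# PTLC style: the invoice carries a point, the receiver reveals its scalar
payment_scalar = secrets.randbelow(N - 1) + 1      # the "preimage" is now a private key
payment_point = ec_mul(payment_scalar)             # goes into the invoice
assert ec_mul(payment_scalar) == payment_point
print("both lock types verify")
```

The key difference: hashes can only be checked, but points can be added together, which enables the tricks described below.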

Advantages

Disadvantages

Pay-for-data

Ensuring that payers cannot access data or other digital goods without proof of having paid the provider.
In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
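A toy version of the idea, using a SHA256-derived keystream as a stand-in for a real cipher (illustration only; a production scheme would use an authenticated cipher, and all names here are mine):

```python
import hashlib, secrets

def keystream_xor(key, data):
    # toy stream cipher: XOR with a SHA256-derived keystream (NOT for production)
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

decryption_key = secrets.token_bytes(32)                 # doubles as the payment preimage
payment_hash = hashlib.sha256(decryption_key).digest()   # goes into the Lightning invoice
ciphertext = keystream_xor(decryption_key, b"the digital goods")

# the buyer pays the invoice, learns the preimage, and decrypts:
revealed_preimage = decryption_key
assert hashlib.sha256(revealed_preimage).digest() == payment_hash  # proof of payment
print(keystream_xor(revealed_preimage, ciphertext))  # b'the digital goods'
```

Paying and decrypting become the same act: the proof-of-payment is the decryption key.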

Advantages

Disadvantages

Stuckless payments

No more payments getting stuck somewhere in the Lightning network without knowing whether the payee will ever get paid!
(that's actually overclaiming a bit; payments can still get stuck, but what "stuckless" really enables is that we can now safely run parallel payment attempts until any one of them gets through).
Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
  1. The payment scalar corresponding to the payment point in the invoice signed by the payee.
  2. An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt.
Then, if at least one of the payment attempts reaches the payee, the payee can then acquire the acknowledgment scalar from the payer. Then the payee can acquire the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me", so the payee can only acquire a single acknowledgment scalar, meaning it can only claim a payment once; it can't claim multiple parallel payments.
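The point-addition trick behind this can be sketched directly (pure Python 3.8+; names are mine, not from any spec):

```python
import secrets

# secp256k1 parameters (SEC 2) and toy helpers
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

payment_scalar = secrets.randbelow(N - 1) + 1   # known only to the payee
payment_point = ec_mul(payment_scalar)          # from the signed invoice

ack_scalar = secrets.randbelow(N - 1) + 1       # fresh per attempt, held by the payer
ack_point = ec_mul(ack_scalar)                  # sent along with the payment attempt

# the in-flight contract locks to the sum of the two points;
# claiming it requires knowing BOTH scalars
lock_point = ec_add(payment_point, ack_point)
claim_scalar = (payment_scalar + ack_scalar) % N
assert ec_mul(claim_scalar) == lock_point
print("claim requires payment scalar AND acknowledgment scalar")
```

Since `(a + b) * G == a*G + b*G`, the payee can only satisfy the lock after the payer hands over the acknowledgment scalar.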

Advantages

Disadvantages

Non-custodial escrow over Lightning

The "acknowledgment" scalar used in stuckless can be reused here.
The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes.
If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
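A sketch of how the payer and the escrow can both arrive at the same acknowledgment scalar without ever transmitting it. The derivation below (ECDH x-coordinate hashed together with the contract terms) is my illustrative choice, not a fixed spec:

```python
import hashlib, secrets

# secp256k1 parameters (SEC 2) and toy helpers
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

a = secrets.randbelow(N - 1) + 1; A = ec_mul(a)   # payer's key pair
e = secrets.randbelow(N - 1) + 1; E = ec_mul(e)   # escrow's key pair

# ECDH: both sides compute the same shared point from their own
# private key and the other side's public key
shared_payer = ec_mul(a, E)
shared_escrow = ec_mul(e, A)
assert shared_payer == shared_escrow

contract_hash = hashlib.sha256(b"goods X in exchange for 100k sats").digest()

def derive_ack(shared):
    material = shared[0].to_bytes(32, "big") + contract_hash
    return int.from_bytes(hashlib.sha256(material).digest(), "big") % N

ack_scalar = derive_ack(shared_payer)           # payer's copy
assert derive_ack(shared_escrow) == ack_scalar  # escrow can re-derive it on demand
ack_point = ec_mul(ack_scalar)                  # what the payee sees in the contract
print("payer and escrow agree on the acknowledgment scalar")
```

This is why the escrow can step in and release the scalar if the payer stonewalls: it can recompute the exact same value.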

Advantages

Disadvantages

Payment decorrelation

Because elliptic curve points can be added (unlike hashes), for every forwarding node we can add a "blinding" point / scalar. This prevents multiple forwarding nodes from discovering that they have been on the same payment route. This is unlike the current payment hash + preimage, where the same hash is used along the entire route.
In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
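A sketch of the decorrelation: each hop's lock point is the base payment point plus an accumulated blinding, so no two hops see the same point, yet the final lock is still claimable with the payment scalar plus the sum of the blindings (toy code, Python 3.8+, names mine):

```python
import secrets

# secp256k1 parameters (SEC 2) and toy helpers
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

payment_scalar = secrets.randbelow(N - 1) + 1
blindings = [secrets.randbelow(N - 1) + 1 for _ in range(3)]  # one per hop

lock_points, running = [], 0
for b in blindings:
    running = (running + b) % N
    lock_points.append(ec_add(ec_mul(payment_scalar), ec_mul(running)))

assert len(set(lock_points)) == 3  # every hop sees a different, unlinkable point
# the final hop's lock opens with payment scalar + sum of all blindings
assert ec_mul((payment_scalar + sum(blindings)) % N) == lock_points[-1]
print("hops cannot correlate; final claim scalar still works")
```

With hashes this is impossible: the same `payment_hash` would appear at every hop.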

Advantages

Disadvantages

submitted by almkglor to Bitcoin

Help me code it!

Hi everyone, I am learning Python and it's quite hard for me. I want to calculate a public key from a private key with ECC. I took the code from GitHub and transformed it to Python 3; the parts that needed fixing were the integer division in modinv and the hex formatting of the result:
    # Super simple elliptic curve presentation. No imported libraries, wrappers, nothing.
    # For educational purposes only. Updated from the original Python 2 version to Python 3.
    # Below are the public specs for Bitcoin's curve, secp256k1.

    Pcurve = 2**256 - 2**32 - 2**9 - 2**8 - 2**7 - 2**6 - 2**4 - 1  # The proven prime
    N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # Order of the generator point
    Acurve = 0
    Bcurve = 7  # These two define the elliptic curve: y^2 = x^3 + Acurve*x + Bcurve
    Gx = 55066263022277343669578718895168534326250603453777594175500187360389116729240
    Gy = 32670510020758816978083085130507043184471273380659243275938904335757337482424
    GPoint = (Gx, Gy)  # This is our generator point

    # Individual transaction / personal information
    privKey = 0xA0DC65FFCA799873CBEA0AC274015B9526505DAAAED385155425F7337704883E  # replace with any private key

    def modinv(a, n=Pcurve):  # Extended Euclidean algorithm / 'division' in elliptic curves
        lm, hm = 1, 0
        low, high = a % n, n
        while low > 1:
            ratio = high // low  # must be integer division in Python 3 (the Python 2 original used /)
            nm, new = hm - lm * ratio, high - low * ratio
            lm, low, hm, high = nm, new, lm, low
        return lm % n

    def ECadd(a, b):  # Not true addition; invented for EC. Could have been called anything.
        LamAdd = ((b[1] - a[1]) * modinv(b[0] - a[0], Pcurve)) % Pcurve
        x = (LamAdd * LamAdd - a[0] - b[0]) % Pcurve
        y = (LamAdd * (a[0] - x) - a[1]) % Pcurve
        return (x, y)

    def ECdouble(a):  # This is called point doubling; also invented for EC.
        Lam = ((3 * a[0] * a[0] + Acurve) * modinv(2 * a[1], Pcurve)) % Pcurve
        x = (Lam * Lam - 2 * a[0]) % Pcurve
        y = (Lam * (a[0] - x) - a[1]) % Pcurve
        return (x, y)

    def EccMultiply(GenPoint, ScalarHex):  # Double & add. Not true multiplication.
        if ScalarHex == 0 or ScalarHex >= N:
            raise Exception("Invalid Scalar/Private Key")
        ScalarBin = bin(ScalarHex)[2:]
        Q = GenPoint
        for i in range(1, len(ScalarBin)):  # This is invented EC multiplication.
            Q = ECdouble(Q)
            if ScalarBin[i] == "1":
                Q = ECadd(Q, GenPoint)
        return Q

    PublicKey = EccMultiply(GPoint, privKey)
    print("******* Public Key Generation *********")
    print("the private key:")
    print(hex(privKey))
    print("the uncompressed public key (not address):")
    print(PublicKey)
    print("the uncompressed public key (HEX):")
    print("04" + "%064x" % PublicKey[0] + "%064x" % PublicKey[1])
    print("the official Public Key - compressed:")
    if PublicKey[1] % 2 == 1:  # If the Y value for the Public Key is odd.
        print("03" + "%064x" % PublicKey[0])
    else:  # Or else, if the Y value is even.
        print("02" + "%064x" % PublicKey[0])
submitted by Phuc_Jackson to Bitcoin

Threshold Signature Explained— Bringing Exciting Applications with TSS

— A deep dive into threshold signature without mathematics by ARPA’s cryptographer Dr. Alex Su

Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signature, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Their properties, such as security, practicality, scalability, and decentralization, have been studied in depth.
Because blockchains and signatures are so closely connected, advances in signature algorithms and the introduction of new signature paradigms directly affect the characteristics and efficiency of blockchain networks.
In addition, the key management needs of institutional and personal accounts, stimulated by distributed ledgers, have spawned many wallet applications, and this change has also affected traditional enterprises. Whether in blockchains or traditional financial institutions, threshold signature schemes can improve security and privacy in a variety of scenarios. As an emerging technology, threshold signatures are still the subject of academic research and discussion, with unverified security risks and practical problems remaining.
This article starts from the technical rationale, touching on cryptography and blockchain. We then compare multi-party computation with threshold signatures and discuss the pros and cons of different signature paradigms, before closing with a list of use cases, so that the reader can quickly get up to speed on threshold signatures.
I. Cryptography in Daily Life
Before introducing threshold signatures, let's get a general understanding of cryptography. How does cryptography protect digital information? How do you create an identity in the digital world? At the very beginning, people wanted secure storage and transmission. After creating a key, one can use symmetric encryption to store secrets, and if two people share the same key, they can communicate securely. For example, a king encrypts a command and his general decrypts it with the corresponding key.
But when two people have no secure channel to use, how can they create a shared key? The key exchange protocol came into being to solve this. Analogously, if the king issues an order to everyone in the digital world, how can each person prove that the order really originated from the king? For this, the digital signature protocol was invented. Both protocols are based on public-key cryptography, also known as asymmetric cryptography.


The “Tiger Rune” was a troop deployment tool used by ancient Chinese emperors: a bronze or gold token in the shape of a tiger, split in half, with one half given to the general and the other kept by the emperor. Only when the two halves were combined would the holder gain the right to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here, we focus on digital signature protocols. The key generation process produces a pair of associated keys: the public key and the private key. The public key is open to everyone, while the private key represents the identity and is revealed only to its owner; whoever holds the private key holds the identity the key represents. The signature algorithm takes the private key as input and generates a signature over a piece of information. The verification algorithm uses the public key to verify the validity of the signature and the correctness of the information.
II. Signature in the Blockchain
Looking back at blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide its identity information. Every transaction on the blockchain is identified by the signature of the transaction initiator, and the blockchain can verify that signature according to specific rules to check the transaction's validity, thanks to the immutability and verifiability of signatures.
For cryptography, blockchain means more than a signature protocol here or a Proof-of-Work hash function there. Blockchain builds an infrastructure layer for consensus and transactions, and on top of it novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally suited to distributed networks, can build secure data transfer and machine learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols with blockchain technology will drive the development of the digital world in the next decade, enabling secure data sharing, privacy protection, and applications not yet imaginable.
III. Secure Multi-party Computation and Threshold Signature
After introducing how digital signature protocol affects our lives, and how to help the blockchain build identities and record transactions, we will mention secure multi-party computation (MPC), from where we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts which detailed the technical background and application scenarios.
MPC, by definition, is a secure computation jointly executed by several participants. Security here means that, in one computation, all participants provide their own private inputs and can obtain the result of the calculation, but no one can learn any private information entered by the other parties. In 1982, when Prof. Yao proposed the concept of MPC, he gave an example called the “Millionaires' Problem”: two millionaires want to know who is richer without telling each other the true amount of their assets. Specifically, secure multi-party computation cares about the following properties:
  • Privacy: Any participant cannot obtain any private input of other participants, except for information that can be inferred from the computation results.
  • Correctness and verifiability: The computation should ensure correct execution, and the legitimacy and correctness of this process should be verifiable by participants or third parties.
  • Fairness or robustness: All parties involved in the calculation, if not agreed in advance, should be able to obtain the computation results at the same time or cannot obtain the results.
Supposing we use secure multi-party computation to make a digital signature in a general sense, we will proceed as follows:
  • Key generation phase: all future participants are involved together to do two things: 1) each party generates its own secret private key share; 2) the public key is calculated from the set of private key shares.
  • Signature phase: the participants in a given signature use their own private key shares as private inputs, and the information to be signed as a public input, to perform a joint signing computation and obtain a signature. In this process, the privacy of secure multi-party computation ensures the secrecy of the private key shares, while correctness and robustness guarantee the unforgeability of the signature and that every party obtains it.
  • Verification phase: the public key corresponding to the transaction is used to verify the signature, exactly as in the traditional algorithm. There is no secret input during verification, which means verification can be performed without multi-party computation; this becomes an advantage of MPC-style distributed signatures.
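The key generation phase can be sketched with the simplest possible sharing, additive n-of-n: each party keeps a random scalar, and the joint public key is computable from the individual public shares alone, so the combined private key never needs to exist in one place. This is a toy sketch under my own naming; real threshold schemes add commitments and zero-knowledge proofs on top:

```python
import secrets

# secp256k1 parameters (SEC 2) and toy helpers
P = 2**256 - 2**32 - 977
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p, q):  # point addition; None plays the point at infinity
    if p is None: return q
    if q is None: return p
    if p[0] == q[0] and (p[1] + q[1]) % P == 0: return None
    lam = (3 * p[0] * p[0] * pow(2 * p[1], -1, P) if p == q
           else (q[1] - p[1]) * pow(q[0] - p[0], -1, P)) % P
    x = (lam * lam - p[0] - q[0]) % P
    return (x, (lam * (p[0] - x) - p[1]) % P)

def ec_mul(k, pt=G):  # double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# 1) each of three parties generates its own secret share locally
shares = [secrets.randbelow(N - 1) + 1 for _ in range(3)]

# 2) each party publishes only share_i * G; the joint public key is their sum
pub = None
for s in shares:
    pub = ec_add(pub, ec_mul(s))

# sanity check: pub matches the key of the combined secret,
# even though that combined secret is never assembled in the protocol
assert pub == ec_mul(sum(shares) % N)
print("joint public key derived without assembling the private key")
```

The verifier only ever sees `pub`, which is indistinguishable from an ordinary single public key.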
The signature protocol constructed on the idea of ​​secure multiparty computing is the threshold signature. It should be noted that we have omitted some details, because secure multiparty computing is actually a collective name for a type of cryptographic protocol. For different security assumptions and threshold settings, there are different construction methods. Therefore, the threshold signatures of different settings will also have distinctive properties, this article will not explain each setting, but the comparative result with other signature schemes will be introduced in the next section.
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
Bitcoin, at the beginning, used single signatures: each account is allocated one private key, and a message signed by that key is considered legitimate. Later, in order to avoid a single point of failure, or to allow account management by multiple people, Bitcoin added a multi-signature function. Multi-signature can be simply understood as each account owner signing in turn and posting all signatures to the chain, where they are verified in order; when certain conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
  1. The access structure is not flexible. Once an account's access structure is given, that is, which private keys can complete a legal signature, it cannot be adjusted later; for example, when a participant withdraws or a new party joins. If you must change it, you need to run the initial setup again, which changes the public key and the account address as well.
  2. Lower efficiency. First, on-chain verification consumes the resources of all nodes and therefore requires a fee, and verifying a multi-signature is equivalent to verifying multiple single signatures. Second, performance: verification simply takes more time.
  3. It requires smart contract support and algorithm adaptation that varies from chain to chain, because multi-sig is not natively supported everywhere. Given possible vulnerabilities in smart contracts, this support is considered risky.
  4. No anonymity. This is not trivially a disadvantage or an advantage, because anonymity only matters in specific settings. Here it means that multi-signature directly exposes all participating signers of a transaction.
Correspondingly, the threshold signature has the following features:
  1. The access structure is flexible. Through an additional multi-party computation, the existing private key shares can be extended to assign shares to new participants. This process exposes neither the old nor the newly generated shares, and changes neither the public key nor the account address.
  2. It is more efficient. On chain, a signature produced by a threshold scheme looks no different from a single signature, which means: a) verification is the same as for a single signature and needs no additional fee; b) the signers' identities are invisible, because other nodes verify the signature against the same single public key; c) no on-chain smart contract is needed for additional support.
In addition to the above, there is a distributed signature scheme based on Shamir secret sharing. Secret sharing has a long history: it is used to split information for storage and to provide error correction, from the building blocks of secure computation to error-correcting codes on discs. The technique has always played an important role, but the main problem is that, when used in a signature protocol, Shamir secret sharing needs to reconstruct the master private key.
With multi-signature or threshold signature, by contrast, the master private key is never reconstructed, not even in memory or cache. For vital accounts, even such a short-lived reconstruction is intolerable.
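For contrast, here is a minimal Shamir 3-of-5 sketch over the secp256k1 group order (a prime, so it works as a field; function names are mine). Note how `reconstruct` necessarily brings the whole secret back into one place, which is exactly the drawback described above:

```python
import secrets

# secp256k1 group order: a prime, used here as the field modulus
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def make_shares(secret, k, n):
    # random degree-(k-1) polynomial with f(0) = secret; share i is (i, f(i))
    coeffs = [secret] + [secrets.randbelow(N) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, N) for i, c in enumerate(coeffs)) % N
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0; the master secret exists again here
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % N
                den = den * (xi - xj) % N
        total = (total + yi * num * pow(den, -1, N)) % N
    return total

secret = secrets.randbelow(N)
shares = make_shares(secret, k=3, n=5)
assert reconstruct(shares[:3]) == secret   # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == secret
print("reconstructed, but the secret was fully assembled in memory to do it")
```

A threshold signature scheme keeps the interpolation "inside" the joint computation, so no single machine ever holds the full secret.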
V. Limitations
Just like other secure multi-party computation protocols, the introduction of additional participants changes the security model compared with traditional point-to-point encrypted transmission. Earlier algorithms did not need to account for collusion or malicious participants; the behavior of physical entities cannot be restricted, and adversaries may be introduced into the participating group.
As a result, multi-party cryptographic protocols cannot reach the same security strength as before. Effort is still needed to develop threshold signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
VI. Scenarios
1. Key Management
Using threshold signatures in a key management system allows more flexible administration, as in ARPA's enterprise key management API. One can use the access structure to design authorization patterns for users with different privileges. In addition, when new entities join, the threshold scheme can quickly refresh the key shares; this operation can also be performed periodically, raising the difficulty of compromising multiple private key shares at the same time. Finally, for the verifier, a threshold signature is no different from a traditional signature, so it is compatible with existing equipment and reduces upgrade costs. ARPA's enterprise key management modules already support the Elliptic Curve Digital Signature Algorithm with secp256k1 and ed25519 parameters, and will be compatible with more parameters in the future.

2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. Also, since not all signatures are posted publicly, anonymity can be achieved. Compared to multi-signature, a threshold signature incurs lower transaction fees. As with key management applications, the administration of digital asset accounts can also be more flexible. Furthermore, a threshold signature wallet can support blockchains that do not natively support multi-signature, which reduces the risk of smart contract bugs.

Conclusion

This article describes why people need threshold signatures and the inspiring properties they bring: higher security, more flexible control, and a more efficient verification process. Different signature technologies suit different application scenarios, such as the aggregate signatures not covered here, or BLS-based multi-signatures. Readers are also welcome to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols; it can accomplish far more than threshold signatures, and in the near future it will solve more concrete application questions in the digital world.

About Author

Dr. Alex Su works at ARPA as a cryptography researcher. He received his Bachelor’s degree in Electronic Engineering and his Ph.D. in Cryptography from Tsinghua University. His research interests include multi-party computation and the implementation and acceleration of post-quantum cryptography.

About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA’s multi-party computing technology supports secure data markets, precision marketing, credit score calculations, and even the secure monetization of personal data.
ARPA’s core team is international, including PhDs in cryptography from Tsinghua University; experienced systems engineers from Google, Uber, Amazon, Huawei, and Mitsubishi; blockchain experts from the University of Tokyo, AIG, and the World Bank; data scientists from CircleUp; and financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Philippines): https://t.me/ARPA_Philippines
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
Medium: https://medium.com/@arpa
Twitter: u/arpaofficial
Reddit: https://www.reddit.com/arpachain/
Facebook: https://www.facebook.com/ARPA-317434982266680/54
submitted by arpaofficial to u/arpaofficial [link] [comments]

Upcoming Updates to Bitcoin Consensus

Price and Libra posts are shit boring, so let's focus on a technical topic for a change.
Let me start by presenting a few of the upcoming Bitcoin consensus changes.
(as these are consensus changes and not P2P changes, this list does not include Erlay or Dandelion)
Let's hope the community strongly supports these upcoming updates!

Schnorr

The sexy new signing algo.

Advantages

Disadvantages

MuSig

A provably-secure way for a group of n participants to form an aggregate pubkey and signature. Creating the group pubkey requires no coordination beyond collecting each participant's individual pubkey, but creating a signature does require all participants to be online near-simultaneously.

Advantages

Disadvantages

Taproot

Hiding a Bitcoin SCRIPT inside a pubkey, letting you sign with the pubkey without revealing the SCRIPT, or reveal the SCRIPT without signing with the pubkey.

Advantages

Disadvantages

MAST

Encode each possible branch of a Bitcoin contract separately, and only require revelation of the exact branch taken, without revealing any of the other branches. One of the Taproot script versions will be used to denote a MAST construction. If the contract has only one branch then MAST does not add more overhead.

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

Google’s Quantum Computing Breakthrough Brings Blockchain Resistance Into the Spotlight Again

News by Forbes: Darryn Pollock
Quantum computing has been on the tech radar for some time now, but it has also been lurking in the background of the blockchain ecosystem for very different reasons. This new class of machines can solve certain complex equations and problems exponentially faster than today's computers.
However, it has always been a predominantly futuristic, almost science-fiction-like pursuit; for blockchain that has been just fine, because we have been warned that quantum computation could render existing encryption standards obsolete, threatening the security of every significant blockchain.
This week, news has emerged that Google has made a recent quantum computing breakthrough, achieving quantum supremacy. It is being reported that Google, using a quantum computer, managed to perform a calculation in just over three minutes that would take the world’s most powerful supercomputer 10,000 years.
This could mean panic stations for blockchain as all that has been achieved thus far could be wiped out, and without the right provisions, all the promise and potential could be eliminated overnight.
However, the term quantum supremacy refers to the moment when a quantum computer outperforms the world’s best classical computer in a specific test. This is just the first step, but it is a rather large step that means the spotlight is once again on blockchain to try and resist this kind of technology which can unravel its cryptographic algorithms in minutes.
Google’s first steps
Google has described the recent achievement as a “milestone towards full-scale quantum computing.” They have also said this milestone puts a marker in the ground from which they can start rapidly progressing towards full quantum computing, another concerning statement for blockchains.
Details are a little scarce on what Google has achieved, and how they have done it, but previous proposals essentially involve the quantum computer racing a classical computer simulating a random quantum circuit.
According to Gizmodo, it has long been known that Google was testing a 72-qubit device called Bristlecone, with which it hoped to achieve quantum supremacy; the initial report from the Financial Times says the supremacy experiment was instead performed with a 53-qubit processor codenamed Sycamore.
However, it would be a little early to abandon all hope for Bitcoin, blockchain, and the surrounding technology, as the situation is more complicated than that. Moreover, there are already projects in place that have been preparing for an age of quantum computing in which blockchains must be resistant.
Are blockchains ready to resist?
So, if quantum computing is making significant breakthroughs, is there any evidence of blockchains being prepared for this new age and its new threat? There has been news of blockchain builders putting out quantum-resistant chains, such as eCash inventor David Chaum and his latest cryptocurrency, Praxxis.

David Chaum, Elixxir on Moneyconf Stage during day two of Web Summit 2018 (Photo by Eoin Noonan /Web Summit via Getty Images)

QAN, another project that says it is ready for the quantum computing age, reacted quickly to the news of Google’s breakthrough, with Johann Polecsak, CTO of QAN, telling Bitcoin.com: “The notion of Google achieving a quantum breakthrough sounds very dramatic, but in reality, it’s hard to gauge the significance at this time. How can we be sure that Google’s quantum computer is more powerful than D-wave’s, for example, which surpassed 1,000 qubits four years ago?”
I also reached out to Polecsak to find out more about the threat of quantum computing when, and if, it reaches its pinnacle.
“We should definitely be worried,” he told me. “Many IT professionals and CTOs are neglecting and denying quantum computing threats with the simple reasoning that once it’s seriously coming, we’ll have to redesign almost everything from scratch, and that must surely be a long time ahead.”
“The truth is that one can already rent quantum computers for experimenting with possible attack algorithms and testing theoretical approaches. The maths behind breaking currently used public key cryptography — EC and RSA — were proven, we just need more qubits.”
“In cryptography, it’s best to prepare for the worst, and one can observe in recent literature that past skeptics now instantiate their crypto protocols in a post-quantum setting — just in case. Users shouldn’t worry now, but experts should prepare before it’s too late.”
QAN CTO Johann Polecsak speaking about the threat of quantum computing at a conference in Seoul, South Korea.

What it means to be quantum-resistant
Of course, the technological aspect of the race between quantum computing and blockchain quantum resistance is immense, and it is also quite nuanced. It is not as if quantum computing will, like a light switch, be available and all blockchains will suddenly be vulnerable — but it is still important to be prepared. As it stands, there probably is not enough preparation and planning in place, according to Polecsak.
“Blockchains won’t be ready for such a breakthrough. Since transaction history is the backbone of blockchains, such an improvement in quantum computing could be catastrophic for the whole transaction history,” added the CTO. “There is an extra layer of protection with Bitcoin’s double hashing but assuming a quantum computer is capable of Shor on secp256k1 it’s safe to assume it’s also capable of Grover256. Also, we don’t know bounds for SHA regarding quantum circuits.”
“As for QAN blockchain platform, it is not a linear comparison or a race where we need to keep up side-by-side with increasing qubits. Being Quantum-safe does not mean that we are just increasing bits in currently used algorithms, but that we take a totally different approach which resists the known Quantum attacks by design.”
Prepare to resist
As science-fictiony as it sounds, quantum computing is a threat that needs to be taken seriously in the world of blockchains. It may not be the kill switch everyone imagines because of media hype, but it is certainly something that should be on the radar of anyone involved in the ecosystem.
It is not only because of what has been accomplished in blockchain thus far but also because of what is being built and promised in the space. Blockchain is a major technology revolution on the horizon, and as it permeates deeper into enterprises and governments it would be catastrophic for all that has been done to be undone, and all that has been promised to be eliminated.
submitted by GTE_IO to u/GTE_IO [link] [comments]

The core concepts of DTube's new blockchain

Dear Reddit community,
Following our announcement for DTube v0.9, I have received countless questions about the new blockchain part, avalon. First I want to make it clear, that it would have been utterly impossible to build this on STEEM, even with the centralized SCOT/Tribes that weren't available when I started working on this. This will become much clearer as you read through the whole wall of text and understand the novelties.
SteemPeak says this is a 25 minutes read, but if you are truly interested in the concept of a social blockchain, and you believe in its power, I think it will be worth the time!

MOVING FORWARD

I'm a long-time member of STEEM, with tens of thousands of staked STEEM for 2+ years. I understand the instinctive fear other members of the community feel when they see a new crypto project coming out. We've had two recent examples with the VOICE and LIBRA announcements, which were either hated or ignored. When you are invested morally and financially, and you see competitors popping up, it's normal to be afraid.
But we should remember competition is healthy, and learn from what these projects are doing and how it will influence us. Instead, by reacting the way STEEM reacts, we are putting our heads in the sand and failing to adapt. I currently see STEEM like the "North Korea of blockchains", trying to do everything better than other blockchains, while being #80 on coinmarketcap and slowly but surely losing positions over the months.
When DLive left and revealed their own blockchain, it really got me thinking about why they did it. The way they did it was really scummy and flawed, but I concluded that in the end it was a good choice for them to try to develop their activity, while others waited for SMTs. Sadly, when I tried their new product, I was disappointed, they had botched it. It's purely a donation system, no proof of brain... And the ultra-majority of the existing supply is controlled by them, alongside many other 'anti-decentralization' features. It's like they had learnt nothing from their STEEM experience at all...
STEEM was still the only blockchain able to distribute crypto-currency via social interactions (and no, 'donations' are not social interactions, they are monetary transfers; bitcoin can do it too). It is the killer feature we need. Years of negligence or greed from the witnesses/developers about the economic balance of STEEM is what broke this killer feature. Even when proposing economical changes (which are actually getting through finally in HF21), the discussions have always been centered around modifying the existing model (changing the curve, changing the split, etc), instead of developing a new one.
You never change things by fighting the existing reality.
To change something, build a new model that makes the existing model obsolete.
What if I built a new model for proof of brain distribution from the ground up? I first tried playing with STEEM clones, I played with EOS contracts too. Both systems couldn't do the concepts I wanted to integrate for DTube, unless I did a major refactor of tens of thousands of lines of code I had never worked with before. Making a new blockchain felt like a lighter task, and more fun too.
Before even starting, I had a good idea of the concepts I'd love to implement. Most of these bullet points stemmed from observations of what happened here on STEEM in the past, and what I considered weaknesses for d.tube's growth.

NO POWER-UP

The first concept I wanted to implement deep down in the core of how a DPOS chain works is that I didn't want the token to be staked at all (i.e. no 'powering up'). The cons of staking for a decentralized social platform are obvious:
* complexity for users because of the double token system.
* difficulty onboarding people, as they need to freeze their money, akin to a pyramid scheme.
The only good thing about staking is how it fills your bandwidth and your voting power when you power up, so you don't need to wait for them to grow before transacting. In a fully liquid system, your account resources start at 0% and new users will need to wait for them to grow before they can start transacting. I don't think that's a big issue.
That meant that witness elections had to be run out of the liquid stake. Could it be done? Was it safe for the network? Can we update the cumulative votes for witnesses without rounding issues? Even when the money flows between accounts freely?
Well I now believe it is entirely possible and safe, under certain conditions. The incentive for top witnesses to keep on running the chain is still present even if the stake is liquid. With a bit of discrete mathematics, it's easy to have a perfectly deterministic algorithm to run a decentralized election based off liquid stake, it's just going to be more dynamic as the funds and the witness votes can move around much faster.
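The bookkeeping this requires can be sketched simply: approvals are weighted by the voter's liquid balance, so every transfer must also shift the cumulative totals of the witnesses each party approves, with no rescan of all accounts. The structure and names below are illustrative, not avalon's actual code.

```javascript
// Liquid-stake witness voting: cumulative totals updated on every transfer.
const accounts = {
  alice: { balance: 500n, approves: ['w1', 'w2'] },
  bob:   { balance: 200n, approves: ['w2', 'w3'] },
};
const votes = { w1: 0n, w2: 0n, w3: 0n };

// Initialize cumulative totals from current balances
for (const { balance, approves } of Object.values(accounts))
  for (const w of approves) votes[w] += balance;

// A transfer shifts weight away from the sender's witnesses
// and onto the receiver's — an O(approvals) update, fully deterministic
function transfer(from, to, amount) {
  accounts[from].balance -= amount;
  accounts[to].balance += amount;
  for (const w of accounts[from].approves) votes[w] -= amount;
  for (const w of accounts[to].approves) votes[w] += amount;
}

transfer('alice', 'bob', 100n);
console.log(votes); // { w1: 400n, w2: 700n, w3: 300n }
```

Using integer (BigInt) arithmetic everywhere is what avoids the rounding issues the paragraph above worries about: every transfer moves exact token amounts between totals.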

NO EARLY USER ADVANTAGE

STEEM has had multiple events that influenced the distribution in a bad way. The most obvious one is the inflation settings: one day it was hella-inflationary, then suddenly with hard fork 16 it wasn't anymore. Another major one is the non-linear rewards that ran for a long time, which created a huge early-user advantage that we can still feel today.
I liked linear rewards; it's what gives minnows their best chance while staying sybil-resistant. I just needed Avalon's inflation to be smart, not hyper-inflationary. The key metric to consider for this issue is the number of tokens distributed per user per day. If this metric goes down, the incentive for staying on the network and playing the game goes down every day: you feel like you're making less and less from your efforts. If this metric goes up, the number of printed tokens goes up, the token is hyper-inflationary, and holding it feels really bad unless you are actively earning from the inflation by playing the game.
Avalon ensures that the number of printed tokens is proportional to the number of users with active stake. If more users come in, avalon prints more tokens, if users cash-out and stop transacting, the inflation goes down. This ensures that earning 1 DTC will be about as hard today, tomorrow, next month or next year, no matter how many people have registered or left d.tube, and no matter what happens on the markets.
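The stated policy reduces to a one-liner: issuance scales with active users so per-user issuance stays flat. The constant and names here are illustrative assumptions, not avalon's real parameters.

```javascript
// Issuance proportional to active stakeholders keeps per-user rewards constant.
const DTC_PER_ACTIVE_USER_PER_DAY = 2n; // assumed rate, for illustration only

const dailyIssuance = (activeUsers) =>
  DTC_PER_ACTIVE_USER_PER_DAY * BigInt(activeUsers);

console.log(dailyIssuance(1000));   // 2000n   — small network, small print
console.log(dailyIssuance(50000));  // 100000n — more users, proportionally more
// Either way, issuance per active user stays at 2 DTC/day
```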

NO LIMIT TO MY VOTING POWER

Another big issue that most steemians don't really know about, but that is really detrimental to STEEM, is how the voting power mana bar works. I guess having to manage a 2M SP delegation for @dtube really convinced me of this one.
When your mana bar is full at 100%, you lose out on the potential power generation, and the rewards coming from it. And it only takes 5 days to go from 0% to 100%. A lot of people have very valid reasons to be offline for 5+ days; they shouldn't be punished so hard. This is why most big stakeholders make sure to always spend some of their voting power on a daily basis. And this is why minnows and smaller holders miss out on tons of curation rewards, unless they delegate to a bidbot or join some curation guild... meh. I guess a lot of people would rather just cash out and avoid the trouble of having to optimize their stake.
So why is it even a mana bar? Why can't it grow forever? Well, everything in a computer has to have a limit, but why is this limit proportional to my stake? While I totally understand the purpose of making bandwidth limited and forcing big stakeholders to waste it, I think it's totally unneeded and ill-adapted for voting power. As long as the growth of the VP is proportional to the stake, the system stays sybil-resistant, and there could technically be no limit at all, if it weren't for the fact that this runs on a computer where numbers have a limited number of bits.
On Avalon, I made it so that your voting power grows virtually indefinitely, or at least I don't think anyone will ever reach the current limit of Number.MAX_SAFE_INTEGER: 9007199254740991 or about 9 Peta VP. If you go inactive for 6 months on an account with some DTCs, when you come back you will have 6 months worth of power generation to spend, turning you into a whale, at least for a few votes.
Another awkward limit on STEEM is how a 100% vote spends only 2% of your power. Not only does STEEM force you to be active on a daily basis, you also need to cast a minimum of 10 votes per day to optimize your earnings. On Avalon, you can use 100% of your stored voting power in a single mega-vote if you wish; it's up to you.
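The two mechanics above, uncapped VP accrual proportional to stake and a single vote spending any fraction of it, can be sketched as follows. The growth rate and names are illustrative assumptions, not avalon's real parameters.

```javascript
// Uncapped voting-power growth proportional to stake, spendable in one vote.
function grow(account, seconds) {
  // assumed rate: 1 VP per DTC held per hour — accrues even while offline
  account.vp += account.balance * seconds / 3600;
}

function vote(account, spend) {
  if (spend > account.vp) throw new Error('not enough VP');
  account.vp -= spend;
  return spend; // the full amount counts toward the vote's weight
}

const alice = { balance: 100, vp: 0 };
grow(alice, 6 * 30 * 24 * 3600); // six months offline
console.log(alice.vp);           // 432000 — nothing was wasted at a 100% cap

vote(alice, alice.vp);           // one mega-vote spending everything
console.log(alice.vp);           // 0
```

Sybil resistance is preserved because splitting a balance across many accounts yields exactly the same total VP accrual as keeping it in one.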

A NEW PROOF-OF-BRAIN

No Author rewards

People should vote with the intent of getting a reward from it. If 75% of the value forcibly goes to the author, it's hard to expect a good return from curation. Steem is currently basically a complex donation platform. No one wants to donate when they vote, no matter what they will say, and no matter how much vote-trading, self-voting or bid-botting happens.
So, in order to keep a system where money is printed when votes happen, if we cannot use the username of the author to distribute rewards, the only possibility left is to use the list of previous voters, a.k.a. "curation rewards": the interesting 25% of STEEM that has been totally overshadowed by author rewards for too long.

Downvote rewards

STEEM has always suffered from the issue that the downvote button is unused, or when it is used, it's mostly for evil. This comes from the fact that in STEEM's model, downvotes are not eligible for any rewards. Even if they were, your downvote would lower the final payout of the content, and with it your own curation rewards...
I wanted Avalon's downvotes to be completely symmetric to the upvotes. That means if we revert all the votes (upvotes become downvotes and vice versa), the content should still distribute the same amount of tokens to the same people, at the same time.

No payment windows

Steem has a system of payment windows. When you publish content, it opens a payment window where people can freely upvote or downvote to influence the payout happening 7 days later. This is convenient when you want a system where downvotes lower rewards. Waiting 7 days to collect rewards is also another friction point for new users; some of them might never come back 7 days later to convince themselves that 'it works'. On avalon, when you are among the winners of curation after a vote, you earn it instantly in your account, 100% liquid and transferable.

Unlimited monetization in time

Indeed, the 7-day monetization limit has been our biggest issue for our video platform since day 8. It incentivized our users to create more frequent but lower-quality content, as they knew they weren't going to earn anything over the long haul. Monetization had to be unlimited on DTube, so that even a 2-year-old video could be dug up and generate rewards in the far future.
Infinite monetization is possible, but as removing tokens from a balance is impossible, the downvotes cannot remove money from the payout like they do on STEEM. Instead, downvotes print money in the same way upvotes do, downvotes still lower the popularity in the hot and trending and should only rewards other people who downvoted the same content earlier.

New curation rewards algorithm

STEEM's curation algorithm isn't stupid, but I believe it lacks some elegance. The 15-minute 'band-aid' added to prevent curation bots (bots that auto-vote as fast as possible on content from popular authors) proves it. The way it distributes rewards also feels very flat and boring. The rewards for my votes are very predictable, especially if I'm the biggest voter / stakeholder on the content. My own vote is paying for my own curation rewards; how stupid is that? If no one else votes after my big vote despite the popularity boost, it probably means I deserve 0 rewards, no?
I had to try different approaches to find an algorithm yielding interesting results, with infinite monetization, and without obvious ways to exploit it. The final distribution algorithm is more complex than STEEM's curation but still pretty simple. When a vote is cast, we calculate the 'popularity' at the time of the vote. The first vote is given a popularity of 0; subsequent votes get (total_vp_upvotes - total_vp_downvotes) / time_since_1st_vote. Then we look at the list of previous votes and remove all votes in the opposite direction (up/down). Next we remove all votes cast at a higher popularity if the new vote is an upvote, or at a lower popularity if it is a downvote. The remaining votes in the list are the 'winners'. Finally, akin to STEEM, the amount of tokens generated by the vote is split between the winners proportionally to the voting power each spent (linear rewards, no advantage for whales) and distributed instantly. Instead of purely using the order of the votes, Avalon's distribution is based on when the votes are cast, and each second that passes reduces the popularity of a content, potentially increasing the long-term ROI of the next vote cast on it.
(Graph: the popularity curve that influences the DTC monetary distribution can be charted directly in the d.tube UI.)
This algorithm ensures there are always losers. The person who upvoted at the highest popularity and the one who downvoted at the lowest popularity will never receive any rewards for their vote, and neither will the last upvoter and the last downvoter. Everyone in between may or may not receive something, depending on how the voting and popularity evolved over time. The one with an obvious advantage is the first voter, who is always counted as popularity 0: as long as the content stays at a positive popularity, every upvote earns him rewards. Similarly, being the first downvoter on an overly popular content could easily earn you 100% of the rewards of the next downvote, which could come from a whale, earning you a fat bonus.
While Avalon technically has no author rewards, the first-voter advantage is strong, and the author has the advantage of always being the first voter, so he can still earn from his potentially original creations; he just needs to commit some voting power on his own content to be able to publish.
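The distribution rules described in this section can be sketched as follows. The print rate, field names, and time units are illustrative assumptions, not avalon's actual implementation.

```javascript
// Sketch of popularity-based curation: each vote's printed tokens go to
// earlier same-direction voters who voted at a "worse" popularity.
const TOKENS_PER_VP = 0.01; // assumed print rate, for illustration only

function castVote(content, voter, vp, up, now) {
  const first = content.votes.length === 0;
  const pop = first ? 0
    : (content.upVP - content.downVP) / (now - content.firstVoteAt);
  // winners: earlier votes in the same direction, cast at a lower
  // popularity for upvotes (higher for downvotes)
  const winners = content.votes.filter((v) =>
    v.up === up && (up ? v.pop <= pop : v.pop >= pop));
  const printed = vp * TOKENS_PER_VP;
  const totalVP = winners.reduce((sum, v) => sum + v.vp, 0);
  for (const w of winners)
    w.earned += totalVP ? printed * (w.vp / totalVP) : 0; // instant, liquid
  if (first) content.firstVoteAt = now;
  if (up) content.upVP += vp; else content.downVP += vp;
  content.votes.push({ voter, vp, up, pop, earned: 0 });
}

const post = { votes: [], upVP: 0, downVP: 0, firstVoteAt: 0 };
castVote(post, 'author', 100, true, 0);   // first vote: popularity 0
castVote(post, 'minnow', 10, true, 50);   // pop = 100/50 = 2   → author wins
castVote(post, 'whale', 1000, true, 100); // pop = 110/100 = 1.1 → author wins,
                                          // minnow (pop 2 > 1.1) gets nothing
console.log(post.votes.map((v) => [v.voter, v.earned]));
```

Note how the minnow, who voted at the popularity peak, is excluded from the whale's payout, while the author (popularity 0) collects from every later upvote: exactly the first-voter advantage and "always losers" properties described above.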

ONE CHAIN <==> ONE APP

More scalable than shared blockchains

Another issue with generalist blockchains like ETH/STEEM/EOS/TRX, which are currently hosting dozens of semi-popular web/mobile apps, is the reduced scalability of such shared models. Again, everything in a computer has a limit. For DPOS blockchains, 99%+ of the CPU load of a producing node is spent verifying the signatures of the many transactions coming in every 3 seconds. And sadly this fact will not change with time. Even if we had a huge breakthrough in CPU speeds today, we would need to update the cryptographic standards for blockchains to keep them secure, which means it would NOT become easier to scale up the number of verifiable transactions per second.
Oh, but we are not there yet you're thinking? Or maybe you think that we'll all be rich if we reach the scalability limits so it doesn't really matter? WRONG
The limit is the number of signature verifications the most expensive CPU on the planet can do. Most blockchains use the secp256k1 curve, including Bitcoin, Ethereum, Steem and now Avalon. It was originally chosen for Bitcoin by Satoshi Nakamoto probably because it's decently quick at verifying signatures, and seems to be backdoor-proof (or else someone is playing a very patient game). Maybe some other curves exist with faster signature verification speed, but it won't be improved many-fold, and will likely require much research, auditing, and time to get adopted considering the security implications.
In 2015 Graphene was created, and Bitshares was completely rewritten. This was able to achieve 100,000 transaction per second on a single machine, and decentralized global stress testing achieved 18,000 transactions per second on a distributed network.
So BitShares/STEEM and other DPOS graphene chains in production can validate at most 18000 txs/sec, so about 1.5 billion transactions per day. EOS, Tendermint, Avalon, LIBRA or any other DPOS blockchain can achieve similar speeds, because there's no planet-killing proof-of-works, and thanks to the leader-based/democratic system that reduces the number of nodes taking part in the consensus.
As a comparison, there are about 4 billion likes per day on instagram, so you can probably double that with the actual uploads, stories and comments, password changes, etc. The load is also likely unstable through the day, probably some hours will go twice as fast as the average. You wouldn't be able to fit Instagram in a blockchain, ever, even with the most scalable blockchain tech on the world's best hardware. You'd need like a dozen of those chains. And instagram is still a growing platform, not as big as Facebook, or YouTube.
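A quick sanity check of the arithmetic above, using the post's own numbers and the cited Instagram figure:

```javascript
// 18,000 tx/s sustained ≈ 1.5 billion tx/day
const txPerDay = 18000 * 86400;
console.log(txPerDay); // 1555200000

// Instagram's ~4 billion likes/day alone would need more than one chain
console.log((4e9 / txPerDay).toFixed(1)); // "2.6"
```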
So, splitting this limit between many popular apps? Madness! Maybe it's still working right now, but when many different apps reach millions of daily active users plus bots, it won't fit anymore.
Serious projects with a big user base will need to rethink the shared blockchain models like Ethereum, EOS, TRX, etc because the fees in gas or necessary stake required to transact will skyrocket, and the victims will be the hordes of minnows at the bottom of the distribution spectrum.
If we can't run a full instagram on a DPOS blockchain, there is absolutely no point trying to run medium+reddit+insta+fb+yt+wechat+vk+tinder on one. Being able to run half an instagram is already pretty good and probably enough to actually onboard a fair share of the planet. But if we multiply the load by the number of different app concepts available, then it's never gonna scale.
DTube chain is meant for the DTube UI only. Please do not build something unrelated to video connecting to our chain, we would actively do what we can to prevent you from growing. We want this chain to be for video contents only, and the JSON format of the contents should always follow the one used by d.tube.
If you are interested in avalon's tech but your project isn't about video, it's strongly suggested to fork the blockchain code and run your own avalon chain with a different origin id, instead of trying to connect your project to DTube's mainnet. If you still want to do it, chain leaders would be forced to actively combat your project, as we would consider it useless noise inside our dedicated blockchain.

Focused governance

Another issue of sharing a blockchain, is the issues coming up with the governance of it. Tons of features enabled by avalon would be controversial to develop on STEEM, because they'd only benefit DTube, and maybe even hurt/break some other projects. At best they'd be put at the bottom of a todo list somewhere. Having a blockchain dedicated to a single project enables it to quickly push updates that are focused on a single product, not dozens of totally different projects.
Many blockchain projects are trying to make decentralized governance a reality, but this is absolutely not what I am interested in for DTube. Instead, in avalon the 'init' account, or 'master' account, has very strong permissions. In the DTC case, @dtube:
* will earn 10% fees from all the inflation
* will not have to burn DTCs to create accounts
* will be able to do certain types of transactions when others can't:
  * account creation (during the steem exclusivity period)
  * transfers (during the IEO period)
  * transferring voting power and bandwidth resources (used for easier onboarding)
For example, for our IEO we will setup a mainnet where only @dtube is allowed to transfer funds or vote until the IEO completes and the airdrop happens. This is also what enabled us to create a 'steem-only' registration period on the public testnet for the first month. Only @dtube can create accounts, this way we can enforce a 1 month period where users can port their username for free, without imposters having a chance to steal usernames. Through the hard-forking mechanism, we can enable/disable these limitations and easily evolve the rules and permissions of the blockchain, for example opening monetary transfers at the end of our IEO, or opening account creation once the steem exclusivity ends.
Luckily, avalon is decentralized, and all these parameters (like the @dtube fees, and @dtube permissions) are easily hardforkable by the leaders. @dtube will however be a very strong leader in the chain, as we plan to use our vote to at least keep the #1 producing node for as long as we can.
We reserve the right to 'not follow' an hardfork. For example, it's obvious we wouldn't follow something like reducing our fees to 0% as it would financially endanger the project, and we would rather just continue our official fork on our own and plug d.tube domain and mobile app to it.
On the other end of the spectrum, if other leaders think @dtube is being tyrannical one way or another, they will always have the option of declining new hardforks and putting the system on hold; then @dtube will have an issue and will need to compromise, or betray the trust of 1/3 of the stakeholders, which could prove costly.
The goal is to have harmonious, enterprise-level decision making within the top leaders. We expect these leaders to be financially and emotionally invested in the project and to act for its good. @dtube is expected to be the main good actor on the chain, and any permission granted to it should serve the goal of increasing the DTC marketcap, and nothing else. Leaders and @dtube should be able to keep cooperation high enough to keep the hard-forks focused on the actual issues, and flowing faster than in other blockchain projects striving for totally decentralized governance, a goal they are unlikely to ever achieve.

PERFECT IMBALANCE

A lot of hard-forking

Avalon is easily hard-forkable, and will get hard-forked often, on purpose. No replays will be needed for leaders/exchanges during these hard-forks; just pull the new hardfork code and restart the node before the planned hard-fork time to stay on the main fork. Why is this so crucial? It's about game theory.
I have no formal proof for this, but I assume a social and financial game akin to the one played on steem since 2016 is impossible to perfectly balance, even with a thorough, dichotomous process. It's probably because of some psychological reason, or maybe just the fact that humans are naturally greedy. Or maybe it's just the sheer number of players. They can gang up together, try to counter each other, and find all sorts of creative ideas to earn more and exploit each other. In the end, the slightest change in the rules can cause drastic gameplay changes. It's a real problem; luckily, it's been faced by other people in the past.
Similarly to what popular and successful massively multiplayer games have achieved, I plan to patch or suggest hard-forks for avalon's mainnet on a bi-monthly basis. The goal of this perfect imbalance concept is to force players to re-discover their best strategy often. By introducing regular, small, and semi-controlled changes into this chaos, we can fake balance. This requires players to be more adaptive and aware of the changes, and prevents the game from becoming stale and boring for players, while staying fair.

Death to bots

Automators, on the other hand, will need to re-think their bots and go through the development and testing phase again on every new hard-fork. It will be an unfair cat-and-mouse game. Making small and semi-random changes in frequent hard-forks will be an easy task for the dtube leaders, compared to the workload generated to maintain the bots. In the end, I hope their return on investment will be much lower than the bid-bots', up to a point where there will be no automation.
Imagine how different things would have been if SteemIt Inc had acted strongly against bid-bots or other forms of automation when they started appearing. Imagine if hard-forks had been frequent and they had promised to fight bid-bots and their ilk. Who would have been crazy enough to make a bid-bot apart from @berniesanders then?
I don't want you to earn DTCs unless you are human. The way you are going to prove you are human is not by sending a selfie of you with your passport to a 3rd-party private company located on the other side of the world. You will just need to adapt to the new rules published every two weeks, and your human brain will do it subconsciously by just playing the voting game and seeing the rewards come in.
All these concepts are aimed at directly improving d.tube, making it more resilient, and scaling it both technologically and economically. Having control over the full tech stack required to power our dapp will prevent issues like the one we had with the search engine, where we relied too heavily on a 3rd-party tool, and that created a 6-month-long bug that basically broke 1/3 of the UI.
While d.tube's UI can now run totally independently from any other entity, we kept everything we could working with STEEM, and the user is now able to transparently publish/vote/comment videos on 2 different chains with one click. This way we can keep on leveraging the good general-purpose features of STEEM that our new chain doesn't focus on, such as the dollar-pegged token, the author rewards/donation mechanism, the tribes/communities tokens, and simply the extra exposure d.tube users can get from other websites (steemit.com, busy.org, partiko, steempeak, etc), which reach more people than use d.tube directly.
The public testnet has been running pretty well for 3 weeks now, with 6000+ accounts registered, and already a dozen independent nodes popping up and running for leader. The majority of the videos are cross-posted on both chains, and the daily video volume has slightly increased since the update, despite the added friction of the new 'double login' system and several UI bugs.
If you've read this article, I'm hoping to get some reactions from you in the comments section!
Some even more focused articles about avalon are going to pop up on my blog in the following weeks, such as how to get a node running and how to run for leader/witness, so feel free to follow me to get more news and help me reach 10K followers ;)
submitted by nannal to dtube [link] [comments]

Zcoin Dev Update 14 Feb 2019

Zcoin Dev Update 14 Feb 2019
https://preview.redd.it/ias8xvc5eog21.png?width=1200&format=png&auto=webp&s=25fdf7cf9b9324214a46c7dbd67a4badb4b6452a
Zcoin Electrum Light Wallet
  • Electrum light wallet rebase to 3.3.3 is released with much better performance and stability.
MTP
  • AMD miner (sgminer): fixed crashing issues and increased speed for Vega
  • Made a new release.
  • To work on optimizing cpuminer
Sigma
  • Discussion on an algorithm to automatically determine which mints to use to meet a private spend request.
    • Originally we decided to always pick the largest mints so that the private spend request could be met in the smallest number of spends. However, this leaves many smaller mints behind.
    • A better approach is for the algorithm to calculate the least number of Zerocoin transactions needed to meet a private spend request while also taking into account the re-minting of change, since mints are future spends.
  • Testnet for Sigma to be started soon
  • Some issues with secp256k1 library resolved
  • Plan to release Sigma with core features for testing end of February
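The mint-selection idea discussed above can be sketched as a small search. This is a hypothetical illustration, not Zcoin's actual algorithm; the denominations and wallet contents below are made up:

```python
from itertools import combinations

def pick_mints(mints, amount):
    """Pick the smallest set of held mints whose sum covers `amount`.

    Among equally small sets, prefer the one producing the least change,
    since change has to be re-minted and spent again later (mints are
    future spends).
    """
    for size in range(1, len(mints) + 1):
        candidates = [c for c in combinations(mints, size) if sum(c) >= amount]
        if candidates:
            return min(candidates, key=lambda c: sum(c) - amount)
    return None  # the wallet cannot cover the request

# Hypothetical wallet holding mints of denominations 100, 25, 10, 10, 1
print(pick_mints([100, 25, 10, 10, 1], 20))  # -> (25,) leaving 5 change
```

Brute-force over subsets is only workable for small wallets; a real implementation would presumably use a dynamic-programming or denomination-aware strategy.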
Lelantus
  • Academic paper being rearranged to clearly separate
    • scheme
    • claimed security properties
    • proofs of the claimed properties.
  • This will assist in third party verification of claims
  • Ongoing discussion on whether to allow shielded-to-shielded transactions, which sacrifices an element of supply auditability. Leaning towards launching Lelantus without this feature at first.
  • Exploring alternative ways to cryptographically prove the ‘shielded’ supply so that forging zk proofs alone won’t be enough to cheat the system.
  • Batching of bulletproofs work is almost complete; working code for this portion is expected by the end of the week.
Core Upgrade (Bitcoin Core 0.17)
  • Code to be pushed to public repo in a few days and open for testing
GUI
  • Incomplete shutdown issue located
  • Issue where non-existent mints were showing has been fixed
  • Settings page to include a slider to adjust what percentage of coins should be kept as mints after startup.
  • Found a weighting issue with Zerocoin spends, where each spend is weighted at 4x its size, leading to only 3 spends fitting within a transaction.
  • New release by this weekend.
  • Hierarchical deterministic (HD) minting almost working, which simplifies wallet backup.
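The appeal of HD minting is that one seed backup recovers every mint. A rough sketch of the idea, using a hypothetical HMAC-based derivation (the label and scheme here are made up for illustration; Zcoin's actual HD construction may differ):

```python
import hashlib
import hmac

def derive_mint_secret(master_seed: bytes, index: int) -> bytes:
    """Derive the secret for mint number `index` from a single wallet seed.

    Because every mint secret is a deterministic function of (seed, index),
    backing up the seed once is enough to recover all past and future mints.
    """
    msg = b"mint/" + index.to_bytes(4, "big")  # hypothetical derivation label
    return hmac.new(master_seed, msg, hashlib.sha512).digest()[:32]

seed = hashlib.sha256(b"example wallet seed").digest()
first = derive_mint_secret(seed, 0)   # always the same for this seed
second = derive_mint_secret(seed, 1)  # distinct from mint 0
```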
submitted by Muggles_XZC to zcoin [link] [comments]

Found a gem in the blockchain

If you calculate sha256 of the original bitcoin whitepaper, you get:
b1674191a88ec5cdd733e4240a81803105dc412d6c6708d53ab94fc248f4f553
If you now use this number as a secp256k1 private key (i.e. multiply the curve's generator point by it), you get:
(04)34248547E1430BA78813ACE1053FA1DEB7410C63068CD18CB8574A92836DF6727B37110B1E1B55E69B0648BAC7CC176C49ADDEBDC1E4115DCCD861516F614850
(04 is just the uncompressed-public-key prefix, added after the computation.)
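The multiplication can be reproduced with a short pure-Python sketch (illustrative only; real wallets use hardened, constant-time libraries such as libsecp256k1, and this requires Python 3.8+ for the modular inverse via `pow`):

```python
# secp256k1 domain parameters (SEC 2)
p  = 2**256 - 2**32 - 977
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def ec_add(P, Q):
    """Add two points on y^2 = x^3 + 7 over F_p (None = point at infinity)."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = 3 * x1 * x1 * pow(2 * y1, -1, p) % p  # tangent slope (doubling)
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p   # chord slope (addition)
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Double-and-add multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# SHA-256 of the whitepaper, interpreted as a private-key scalar
d = 0xB1674191A88EC5CDD733E4240A81803105DC412D6C6708D53AB94FC248F4F553
x, y = scalar_mult(d, (Gx, Gy))
print("04%064x%064x" % (x, y))  # uncompressed public key
```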
And guess what: that address had some BTC in it a couple of months ago.
submitted by MRSantos to Bitcoin [link] [comments]

ECDSA Playground

ECDSA Playground: https://8gwifi.org/ecsignverify.jsp

Elliptic Curve Digital Signature Algorithm or ECDSA is a cryptographic algorithm used by Bitcoin to ensure that funds can only be spent by their rightful owners.
This tool is capable of generating keys for the following curves:


https://preview.redd.it/9fwcnzijrgu11.png?width=1127&format=png&auto=webp&s=a4f36c49b74f3122b2bc903f7582c17ea041dec1
"c2pnb272w1", "c2tnb359v1", "prime256v1", "c2pnb304w1", "c2pnb368w1", "c2tnb431r1", "sect283r1", "sect283k1", "secp256r1", "sect571r1", "sect571k1", "sect409r1", "sect409k1", "secp521r1", "secp384r1", "P-521", "P-256", "P-384", "B-409", "B-283", "B-571", "K-409", "K-283", "K-571", "brainpoolp512r1", "brainpoolp384t1", "brainpoolp256r1", "brainpoolp512t1", "brainpoolp256t1", "brainpoolp320r1", "brainpoolp384r1", "brainpoolp320t1", "FRP256v1", "sm2p256v1" 
secp256k1 refers to the parameters of the elliptic curve used in Bitcoin’s public-key cryptography, and is defined in Standards for Efficient Cryptography (SEC)
A few concepts related to ECDSA:
  • private key: A secret number, known only to the person that generated it. A private key is essentially a randomly generated number. In Bitcoin, a private key is a single unsigned 256 bit integer (32 bytes).
  • public key: A number that corresponds to a private key, but does not need to be kept secret. A public key can be calculated from a private key, but not vice versa. A public key can be used to determine if a signature is genuine (in other words, produced with the proper key) without requiring the private key to be divulged.
  • signature: A number that proves that a signing operation took place.
Openssl Generating EC Keys and Parameters
$ openssl ecparam -list_curves
  secp256k1 : SECG curve over a 256 bit prime field
  secp384r1 : NIST/SECG curve over a 384 bit prime field
  secp521r1 : NIST/SECG curve over a 521 bit prime field
  prime256v1: X9.62/SECG curve over a 256 bit prime field
An EC parameters file can then be generated for any of the built-in named curves as follows:
$ openssl ecparam -name secp256k1 -out secp256k1.pem
$ cat secp256k1.pem
-----BEGIN EC PARAMETERS-----
BgUrgQQACg==
-----END EC PARAMETERS-----
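The base64 body of that PEM file is nothing more than the DER-encoded object identifier for secp256k1; a small sketch can decode it:

```python
import base64

der = base64.b64decode("BgUrgQQACg==")
assert der[0] == 0x06          # ASN.1 tag: OBJECT IDENTIFIER
body = der[2:2 + der[1]]       # der[1] is the length byte

# The first byte packs the first two arcs; the rest are base-128 digits
arcs = [body[0] // 40, body[0] % 40]
value = 0
for byte in body[1:]:
    value = (value << 7) | (byte & 0x7F)
    if not byte & 0x80:        # a clear high bit ends the current arc
        arcs.append(value)
        value = 0

print(".".join(map(str, arcs)))  # -> 1.3.132.0.10, the secp256k1 OID
```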
To generate a private/public key pair from a pre-existing parameters file use the following:
$ openssl ecparam -in secp256k1.pem -genkey -noout -out secp256k1-key.pem
$ cat secp256k1-key.pem
-----BEGIN EC PRIVATE KEY-----
MHQCAQEEIKRPdj7XMkxO8nehl7iYF9WAnr2Jdvo4OFqceqoBjc8/oAcGBSuBBAAK
oUQDQgAE7qXaOiK9jgWezLxemv+lxQ/9/Q68pYCox/y1vD1fhvosggCxIkiNOZrD
kHqms0N+huh92A/vfI5FyDZx0+cHww==
-----END EC PRIVATE KEY-----
Examine the specific details of the parameters associated with a particular named curve
$ openssl ecparam -in secp256k1.pem -text -param_enc explicit -noout
Field Type: prime-field
Prime:
    00:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:
    ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:fe:ff:
    ff:fc:2f
A:    0
B:    7 (0x7)
Generator (uncompressed):
    04:79:be:66:7e:f9:dc:bb:ac:55:a0:62:95:ce:87:
    0b:07:02:9b:fc:db:2d:ce:28:d9:59:f2:81:5b:16:
    f8:17:98:48:3a:da:77:26:a3:c4:65:5d:a4:fb:fc:
    0e:11:08:a8:fd:17:b4:48:a6:85:54:19:9c:47:d0:
    8f:fb:10:d4:b8
Order:
    00:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:
    ff:fe:ba:ae:dc:e6:af:48:a0:3b:bf:d2:5e:8c:d0:
    36:41:41
Cofactor:    1 (0x1)
submitted by anish2good to u/anish2good [link] [comments]

elliptic curve secp256k1 vulnerability?

To preface, I have never used bitcoin, so I don't know if this is a true vulnerability, but in doing some background reading on bitcoin cryptography I saw something that seemed a little odd. On their protocol specification wiki they say that in their scripts they provide hexadecimal decompressed x,y coordinates (though these are really r,s values) for the signature of a transaction encoded in DER. They also specify that the curve used is the secp256k1 ECDSA curve, and they go so far as to post the secp256k1 parameters p, a, b, G, n, and h on one of their wiki pages for the curve (though these can be found on the EJBCA site too). The wikipedia page for the Elliptic Curve DSA describes calculating (r,s) as follows:
  1. Calculate e = HASH(m), where HASH is a cryptographic hash function, such as SHA-1 (in bitcoin, SHA-256).
  2. Let z be the Ln leftmost bits of e, where Ln is the bit length of the group order n.
  3. Select a random integer k from [1, n-1].
  4. Calculate the curve point (x1, y1) = k·G (where G is the base point).
  5. Calculate r = x1 mod n. If r = 0, go back to step 3.
  6. Calculate s = k⁻¹(z + r·dA) mod n. If s = 0, go back to step 3.
  7. The signature is the pair (r, s).
Since they provide the signature (r,s) values for a transaction as well as the value of n, couldn't one theoretically decode the (r,s) from DER and then compute the scalar k by iteratively scaling the x-coordinate of G with a variable c such that c·G (mod n) = x, incrementing c until x is equal to r? If k is obtained, the value of dA can be algebraically determined, which from my understanding is the user's private key... i think? I know that n is a large number and this would require a bit of brute force, but it feels like someone with a reasonable number theory background could find some paths to get around that issue. I also feel like this is too big of a loophole for bitcoin to not realize, or anybody for that matter, and so I'd love to know what I'm misunderstanding.
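For reference, the Wikipedia steps quoted above translate into a toy sign/verify pair like this (pure Python, not constant-time, illustration only; note that Bitcoin actually signs the double-SHA256 of the transaction, and plain SHA-256 is used here for brevity):

```python
import hashlib
import secrets

# secp256k1 parameters (SEC 2)
p = 2**256 - 2**32 - 977
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def add(P, Q):
    """Point addition on y^2 = x^3 + 7 over F_p (None = infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    m = (3 * x1 * x1 * pow(2 * y1, -1, p) if P == Q
         else (y2 - y1) * pow(x2 - x1, -1, p)) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, P):
    """Double-and-add multiplication k*P."""
    R = None
    while k:
        if k & 1: R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

def sign(d, msg):
    z = int.from_bytes(hashlib.sha256(msg).digest(), 'big')  # steps 1-2
    while True:
        k = secrets.randbelow(n - 1) + 1                     # step 3
        r = mul(k, G)[0] % n                                 # steps 4-5
        if r == 0: continue
        s = pow(k, -1, n) * (z + r * d) % n                  # step 6
        if s == 0: continue
        return (r, s)                                        # step 7

def verify(Q, msg, sig):
    r, s = sig
    z = int.from_bytes(hashlib.sha256(msg).digest(), 'big')
    w = pow(s, -1, n)
    P = add(mul(z * w % n, G), mul(r * w % n, Q))
    return P is not None and P[0] % n == r
```

Note that r is only the x-coordinate of k·G reduced mod n; recovering k from it is the elliptic-curve discrete logarithm problem, which is exactly what the scheme's security rests on.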
submitted by jsunderland323 to Bitcoin [link] [comments]

[Meta] Re: Bitcoin Core 0.13.2 released | Luke Dashjr | Jan 07 2017

Luke Dashjr on Jan 07 2017:
I don't think release announcements are really appropriate for the bitcoin-dev
mailing list. People who want these can subscribe to the bitcoin-core-dev list
and/or the Core announce mailing list. Maybe sending to bitcoin-discuss would
also make sense, but not bitcoin-dev...
Luke
On Tuesday, January 03, 2017 8:47:36 AM Wladimir J. van der Laan via bitcoin-dev wrote:
Bitcoin Core version 0.13.2 is now available from:
https://bitcoin.org/bin/bitcoin-core-0.13.2/
Or by bittorrent:
magnet:?xt=urn:btih:746697d03db3ff531158b1133bab5d1e4cef4e5a&dn=bitcoin-core-0.13.2&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.publicbt.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.ccc.de%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969&ws=https%3A%2F%2Fbitcoin.org%2Fbin%2F
This is a new minor version release, including various bugfixes and
performance improvements, as well as updated translations.
Please report bugs using the issue tracker at github:
https://github.com/bitcoin/bitcoin/issues
To receive security and update notifications, please subscribe to:
https://bitcoincore.org/en/list/announcements/join/
Compatibility

Microsoft ended support for Windows XP on [April 8th, 2014](https://www.microsoft.com/en-us/WindowsForBusiness/end-of-xp-support), an OS initially released in 2001. This means that not even critical security updates will be released anymore. Without security updates, using a bitcoin wallet on an XP machine is irresponsible at least.
In addition to that, with 0.12.x there have been varied reports of Bitcoin Core randomly crashing on Windows XP. It is [not clear](https://github.com/bitcoin/bitcoin/issues/7681#issuecomment-217439891) what the source of these crashes is, but it is likely that upstream libraries such as Qt are no longer being tested on XP.
We do not have time nor resources to provide support for an OS that is
end-of-life. From 0.13.0 on, Windows XP is no longer supported. Users are
suggested to upgrade to a newer version of Windows, or install an
alternative OS that is supported.
No attempt is made to prevent installing or running the software on Windows
XP, you can still do so at your own risk, but do not expect it to work: do
not report issues about Windows XP to the issue tracker.
From 0.13.1 onwards OS X 10.7 is no longer supported. 0.13.0 was intended
to work on 10.7+, but severe issues with the libc++ version on 10.7.x keep
it from running reliably. 0.13.1 now requires 10.8+, and will communicate
that to 10.7 users, rather than crashing unexpectedly.
Notable changes

Change to wallet handling of mempool rejection
When a newly created transaction failed to enter the mempool due to
the limits on chains of unconfirmed transactions the sending RPC
calls would return an error. The transaction would still be queued
in the wallet and, once some of the parent transactions were
confirmed, broadcast after the software was restarted.
This behavior has been changed to return success and to reattempt
mempool insertion at the same time transaction rebroadcast is
attempted, avoiding a need for a restart.
Transactions in the wallet which cannot be accepted into the mempool
can be abandoned with the previously existing abandontransaction RPC
(or in the GUI via a context menu on the transaction).
0.13.2 Change log

Detailed release notes follow. This overview includes changes that affect
behavior, not code moves, refactors and string updates. For convenience in
locating the code changes and accompanying discussion, both the pull
request and git merge commit are mentioned.

Consensus

  • #9293 e591c10 [0.13 Backport #9053] IBD using chainwork instead of height and not using header timestamp (gmaxwell)
  • #9053 5b93eee IBD using chainwork instead of height and not using header timestamps (gmaxwell)

RPC and other APIs

  • #8845 1d048b9 Don't return the address of a P2SH of a P2SH (jnewbery)
  • #9041 87fbced keypoololdest denote Unix epoch, not GMT (s-matthew-english)
  • #9122 f82c81b fix getnettotals RPC description about timemillis (visvirial)
  • #9042 5bcb05d [rpc] ParseHash: Fail when length is not 64 (MarcoFalke)
  • #9194 f26dab7 Add option to return non-segwit serialization via rpc (instagibbs)
  • #9347 b711390 [0.13.2] wallet/rpc backports (MarcoFalke)
  • #9292 c365556 Complain when unknown rpcserialversion is specified (sipa)
  • #9322 49a612f [qa] Don't set unknown rpcserialversion (MarcoFalke)

Block and transaction handling

  • #8357 ce0d817 [mempool] Fix relaypriority calculation error (maiiz)
  • #9267 0a4aa87 [0.13 backport #9239] Disable fee estimates for a confirm target of 1 block (morcos)
  • #9196 0c09d9f Send tip change notification from invalidateblock (ryanofsky)

P2P protocol and network code

  • #8995 9ef3875 Add missing cs_main lock to ::GETBLOCKTXN processing (TheBlueMatt)
  • #9234 94531b5 torcontrol: Explicitly request RSA1024 private key (laanwj)
  • #8637 2cad5db Compact Block Tweaks (rebase of #8235) (sipa)
  • #9058 286e548 Fixes for p2p-compactblocks.py test timeouts on travis (#8842) (ryanofsky)
  • #8865 4c71fc4 Decouple peer-processing-logic from block-connection-logic (TheBlueMatt)
  • #9117 6fe3981 net: don't send feefilter messages before the version handshake is complete (theuni)
  • #9188 ca1fd75 Make orphan parent fetching ask for witnesses (gmaxwell)
  • #9052 3a3bcbf Use RelevantServices instead of node_network in AttemptToEvict (gmaxwell)
  • #9048 9460771 [0.13 backport #9026] Fix handling of invalid compact blocks (sdaftuar)
  • #9357 03b6f62 [0.13 backport #9352] Attempt reconstruction from all compact block announcements (sdaftuar)
  • #9189 b96a8f7 Always add default_witness_commitment with GBT client support (sipa)
  • #9253 28d0f22 Fix calculation of number of bound sockets to use (TheBlueMatt)
  • #9199 da5a16b Always drop the least preferred HB peer when adding a new one (gmaxwell)

Build system

  • #9169 d1b4da9 build: fix qt5.7 build under macOS (theuni)
  • #9326 a0f7ece Update for OpenSSL 1.1 API (gmaxwell)
  • #9224 396c405 Prevent FD_SETSIZE error building on OpenBSD (ivdsangen)

GUI

  • #8972 6f86b53 Make warnings label selectable (jonasschnelli) (MarcoFalke)
  • #9185 6d70a73 Fix coincontrol sort issue (jonasschnelli)
  • #9094 5f3a12c Use correct conversion function for boost::path datadir (laanwj)
  • #8908 4a974b2 Update bitcoin-qt.desktop (s-matthew-english)
  • #9190 dc46b10 Plug many memory leaks (laanwj)

Wallet

  • #9290 35174a0 Make RelayWalletTransaction attempt to AcceptToMemoryPool (gmaxwell)
  • #9295 43bcfca Bugfix: Fundrawtransaction: don't terminate when keypool is empty (jonasschnelli)
  • #9302 f5d606e Return txid even if ATMP fails for new transaction (sipa)
  • #9262 fe39f26 Prefer coins that have fewer ancestors, sanity check txn before ATMP (instagibbs)

Tests and QA

  • #9159 eca9b46 Wait for specific block announcement in p2p-compactblocks (ryanofsky)
  • #9186 dccdc3a Fix use-after-free in scheduler tests (laanwj)
  • #9168 3107280 Add assert_raises_message to check specific error message (mrbandrews)
  • #9191 29435db 0.13.2 Backports (MarcoFalke)
  • #9077 1d4c884 Increase wallet-dump RPC timeout (ryanofsky)
  • #9098 ecd7db5 Handle zombies and cluttered tmpdirs (MarcoFalke)
  • #8927 387ec9d Add script tests for FindAndDelete in pre-segwit and segwit scripts (jl2012)
  • #9200 eebc699 bench: Fix subtle counting issue when rescaling iteration count (laanwj)

Miscellaneous

  • #8838 094848b Calculate size and weight of block correctly in CreateNewBlock() (jnewbery)
  • #8920 40169dc Set minimum required Boost to 1.47.0 (fanquake)
  • #9251 a710a43 Improvement of documentation of command line parameter 'whitelist' (wodry)
  • #8932 106da69 Allow bitcoin-tx to create v2 transactions (btcdrak)
  • #8929 12428b4 add software-properties-common (sigwo)
  • #9120 08d1c90 bug: Missed one "return false" in recent refactoring in #9067 (UdjinM6)
  • #9067 f85ee01 Fix exit codes (UdjinM6)
  • #9340 fb987b3 [0.13] Update secp256k1 subtree (MarcoFalke)
  • #9229 b172377 Remove calls to getaddrinfo_a (TheBlueMatt)

Credits

Thanks to everyone who directly contributed to this release:
  • Alex Morcos
  • BtcDrak
  • Cory Fields
  • fanquake
  • Gregory Maxwell
  • Gregory Sanders
  • instagibbs
  • Ivo van der Sangen
  • jnewbery
  • Johnson Lau
  • Jonas Schnelli
  • Luke Dashjr
  • maiiz
  • MarcoFalke
  • Masahiko Hyuga
  • Matt Corallo
  • matthias
  • mrbandrews
  • Pavel Janík
  • Pieter Wuille
  • randy-waterhouse
  • Russell Yanofsky
  • S. Matthew English
  • Steven
  • Suhas Daftuar
  • UdjinM6
  • Wladimir J. van der Laan
  • wodry
As well as everyone that helped translating on
Transifex.
bitcoin-dev mailing list
bitcoin-dev at lists.linuxfoundation.org
https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-January/013442.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

Bitcoin Core 0.13.2 released | Wladimir J. van der Laan | Jan 03 2017

Wladimir J. van der Laan on Jan 03 2017:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512
Bitcoin Core version 0.13.2 is now available from:
https://bitcoin.org/bin/bitcoin-core-0.13.2/
Or by bittorrent:
magnet:?xt=urn:btih:746697d03db3ff531158b1133bab5d1e4cef4e5a&dn=bitcoin-core-0.13.2&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.publicbt.com%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.ccc.de%3A80%2Fannounce&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969&ws=https%3A%2F%2Fbitcoin.org%2Fbin%2F
This is a new minor version release, including various bugfixes and
performance improvements, as well as updated translations.
Please report bugs using the issue tracker at github:
https://github.com/bitcoin/bitcoin/issues
To receive security and update notifications, please subscribe to:
https://bitcoincore.org/en/list/announcements/join/
Compatibility

Microsoft ended support for Windows XP on April 8th, 2014,
an OS initially released in 2001. This means that not even critical security
updates will be released anymore. Without security updates, using a bitcoin
wallet on a XP machine is irresponsible at least.
In addition to that, with 0.12.x there have been varied reports of Bitcoin Core
randomly crashing on Windows XP. It is not clear
what the source of these crashes is, but it is likely that upstream
libraries such as Qt are no longer being tested on XP.
We do not have time nor resources to provide support for an OS that is
end-of-life. From 0.13.0 on, Windows XP is no longer supported. Users are
suggested to upgrade to a newer version of Windows, or install an alternative OS
that is supported.
No attempt is made to prevent installing or running the software on Windows XP,
you can still do so at your own risk, but do not expect it to work: do not
report issues about Windows XP to the issue tracker.
From 0.13.1 onwards OS X 10.7 is no longer supported. 0.13.0 was intended to work on 10.7+,
but severe issues with the libc++ version on 10.7.x keep it from running reliably.
0.13.1 now requires 10.8+, and will communicate that to 10.7 users, rather than crashing unexpectedly.
Notable changes

Change to wallet handling of mempool rejection
When a newly created transaction failed to enter the mempool due to
the limits on chains of unconfirmed transactions the sending RPC
calls would return an error. The transaction would still be queued
in the wallet and, once some of the parent transactions were
confirmed, broadcast after the software was restarted.
This behavior has been changed to return success and to reattempt
mempool insertion at the same time transaction rebroadcast is
attempted, avoiding a need for a restart.
Transactions in the wallet which cannot be accepted into the mempool
can be abandoned with the previously existing abandontransaction RPC
(or in the GUI via a context menu on the transaction).
0.13.2 Change log

Detailed release notes follow. This overview includes changes that affect
behavior, not code moves, refactors and string updates. For convenience in locating
the code changes and accompanying discussion, both the pull request and
git merge commit are mentioned.

Consensus

RPC and other APIs

Block and transaction handling

P2P protocol and network code

Build system

GUI

Wallet

Tests and QA

Miscellaneous

Credits

Thanks to everyone who directly contributed to this release:
As well as everyone that helped translating on Transifex.
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1
iQEcBAEBCgAGBQJYa2IbAAoJEHSBCwEjRsmmiQsIALbkHVVwO7nViQKH1Ub2qpD4
TplOuAP0/4vYotizuI12Gqdnu8SjPmhKwAgIXhVinE6TS4OzGNjy+6LtWGzpcpud
B1pcziZ72Mlfxdbdd1UhDMWEjoBumS9RmXMSqzTlMVlHRv4iiISzdaAROu1jHvdF
YTsnmKXB8OvcXOecxRMY9LrnpSzLALM2MYTDmYwlhhExHIA8ZqI2niky6GCfyfDi
KD7bgfIFJzlgFTpAdhQXOXtWoRV5iHqN7T29ot8Y+yIhVCRhHYXS93Z50GKbkqYV
MXsVAkpZF3qqcKYSPFjbif7faMdrMqcEiII6QhXdDTRGI/35IfuTDbWzzQlnVyY=
=ncCY
-----END PGP SIGNATURE-----
original: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2017-January/013412.html
submitted by dev_list_bot to bitcoin_devlist [link] [comments]

Related videos:
  • Elliptic Curve Cryptography Overview - YouTube
  • Flaw in the Enigma Code - Numberphile - YouTube
  • Elliptic Curve Digital Signature Algorithm (ECDSA) - Public Key Cryptography w/ JAVA (tutorial 10)
  • Blockchain tutorial 11: Elliptic Curve key pair generation ...

Secp256k1 calculator. Because the constant a is zero, the ax term in the curve equation vanishes, so the equation reduces to y² = x³ + 7. From the Bitcoin Wiki: secp256k1 refers to the parameters of the ECDSA curve used in Bitcoin. It was created purely by SECG and was almost never used before Bitcoin became popular, but it is now gaining in popularity due to its several nice properties. Most commonly-used curves have a random structure, but secp256k1 was constructed in a special non-random way which allows for especially efficient computation. As a result, it is often more than 30% faster than other curves if the implementation is sufficiently optimized.



  • The flaw which allowed the Allies to break the Nazi Enigma code. More links & stuff in full description below ↓↓↓ First video explaining Enigma: http://youtu...
  • 34:30 Alice uses secp256k1 (the bitcoin curve); 35:22 Bob uses the secp384r1 curve; 36:23 test run of Alice sending signed message(s) and Bob validating whether the message(s) were signed by Alice
  • This is part 11 of the Blockchain tutorial explaining how to generate a public/private key pair using an elliptic curve. In this video series different topics will ...
  • John Wagnon discusses the basics and benefits of Elliptic Curve Cryptography (ECC) in this episode of Lightboard Lessons. Check out this article on DevCentra...
