
Bitcoin's Value: Altcoins – Brave New Coin


It has been difficult to determine what exactly gives Bitcoin its value. Many economists reject the idea that a virtual currency has any intrinsic value and see it merely as a product of speculative demand. Others maintain that the 'moneyness' of Bitcoin – its utility as a medium of exchange and store of value – confers value. But what if Bitcoins are not 'money' at all? Over the past year I have sifted through the data, and after a thorough analysis I conclude that Bitcoin does indeed have value – though perhaps not for the reasons that many enthusiasts believe.

BTC Price Volatility and Noise

The obvious place to begin an investigation into what determines the value of Bitcoin is to track how its market price fluctuates under various circumstances. Logical questions to ask include: Does the price go up as more mining effort is directed at it? Is the Bitcoin price correlated with other financial assets such as the stock market, gold, or the dollar? Does geo-political financial risk cause people to bid up its price?

The problem, however, was that the price volatility of Bitcoin relative to national fiat currencies was enormous. While volatility in and of itself is not always a bad thing, the massive run-up in market price from late 2013 to early 2014 was very likely due to market manipulation and fraudulent activity. The Willy Report found concrete evidence that trading bots operating at the Mt. Gox exchange were artificially pumping up the price of bitcoin using the dollars held in customer accounts. When those dollars ran out and the price of Bitcoin began to decline, Mt. Gox itself collapsed, leaving its user base empty-handed. This sort of noise confounds any earnest effort to investigate the questions posed above.

One positive result of the entire Mt. Gox debacle is that by driving the price to unnaturally high levels, it spurred the interest of many casual observers, as well as the media, who had largely written off the cryptocurrency as a fad limited to the realm of geekdom. As a result of $1,000+ valuations, a flurry of entrepreneurship and capital investment entered the Bitcoin space that wouldn't have otherwise. Of course, the subsequent demise of Mt. Gox and the fall in price to the $200 range didn't bode well for Bitcoin afterward, but it is critical to understand that a $1,000 valuation was never real and was wholly unwarranted at the time.

Altcoins Provide Answers

Fortunately, there exists an entire cryptocurrency ecosystem complete with alternative cryptocurrencies, 'altcoins', created as forks of the Bitcoin blockchain or as attempts to rectify certain perceived flaws in the Bitcoin code. Scrypt was introduced as an 'ASIC-proof' mining algorithm as Bitcoin's SHA-256 became dominated by those application-specific chips, rendering solo miners and their GPU rigs obsolete. Other altcoins were created to allow for certain new applications. Namecoin was intended to become an alternative to the DNS domain name service for the .bit internet domain. Peercoin was intended to allow Proof-of-Stake to coexist alongside Proof-of-Work mining. Dogecoin was created as a practical joke on the Bitcoin community, but gained mass appeal nonetheless.

Regardless, all of these various cryptocurrencies existed alongside Bitcoin, and many had tradeable currency pairs on various exchanges. By stripping away the dollar, euro, and yuan exchange rates, much of that previous noise and volatility disappeared. What remained was a remarkably stable BTC-denominated universe of cryptocurrencies. Trading bots functioned by interfacing with exchanges' APIs to take advantage of any arbitrage opportunity that might arise. Soon, these markets became fairly efficient in terms of the relative pricing of cryptocurrencies. If one coin's currency pair got out of whack, these trading bots would very quickly take advantage of that opportunity and put everything back in line.

Despite the many differences among the altcoins, they all share a common lineage back to Bitcoin, and that meant a set of common attributes that could be measured, compared, and analyzed. All of these coins had the following: a value for the block reward; the targeted time between blocks; the number of blocks to be found before re-targeting the difficulty; the mining algorithm employed; how many total coins will ever come to exist; the halving schedule; and how long each has been in existence. There were also outside variables such as the amount of hashing power directed at a certain coin, which influences its mining difficulty. And of course, the market price in terms of Altcoin/BTC.

Taking all of these variables into account, multiple regression analysis was employed to evaluate how meaningful these inputs were at conferring relative value, if at all. The results were surprising. First, it doesn't matter how long a coin has been around; the date of the genesis block was largely irrelevant. Second, the number of coins ever to exist does not matter for relative value. Bitcoin's 21 million cap, 42Coin's cap of 42, and Peercoin's infinite future supply played no role in value formation. Furthermore, the price of a coin was uncorrelated with its market cap. While 42Coin commanded an exchange price of almost five BTC, its market cap was less than $50,000 all told. The exchange rate had little to do with the money supply.
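
To make the approach concrete, here is a minimal sketch of the kind of cross-sectional regression described above, written with pandas and statsmodels over entirely synthetic data. The variable names, numbers, and the data-generating process are illustrative assumptions, not the author's actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40  # hypothetical universe of altcoins

# Synthetic per-coin attributes (all made up): block reward, mining difficulty,
# age of the genesis block, and total supply cap.
block_reward = rng.lognormal(mean=3.0, sigma=2.0, size=n)   # coins per block
difficulty   = rng.lognormal(mean=10.0, sigma=1.5, size=n)
age_days     = rng.integers(30, 1500, size=n)
max_supply   = rng.lognormal(mean=18.0, sigma=2.0, size=n)

# Toy data-generating process echoing the article's finding: log price falls with
# the block reward and rises with difficulty; age and supply cap are pure noise.
log_price_btc = -1.0 * np.log(block_reward) + 0.8 * np.log(difficulty) \
                - 12.0 + rng.normal(scale=0.3, size=n)

X = sm.add_constant(pd.DataFrame({
    "log_block_reward": np.log(block_reward),
    "log_difficulty":   np.log(difficulty),
    "age_days":         age_days,
    "log_max_supply":   np.log(max_supply),
}))
print(sm.OLS(log_price_btc, X).fit().summary())
```

In a regression of this shape, the block-reward and difficulty coefficients should come out significant while the age and supply-cap terms should not, which is the pattern the article reports for the real data.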

What is statistically significant turns out to be the rate of coin formation for a given unit of hashing power. 42Coin isn't trading at five BTC because there will only ever be a few dozen in existence; rather, it is because the block reward is a mere 0.000042 coins. The larger the block reward, the less valuable the altcoin is at exchange. Scrypt is a more challenging algorithm to solve than SHA-256, so for a given unit of computational effort, effectively fewer Scrypt coins can be mined – and all else equal, a coin with a harder algorithm is more valuable. As the mining difficulty of a coin increases, a unit of hashing power will find fewer coins over a given interval – the higher the difficulty of a coin, the more valuable it is.

Bitcoin Regulates Altcoin Production

Taken together, the main driver of relative value in cryptocurrencies is how many coins a unit of hashpower can find on average over some interval, say a day. In other words, value seems to be related to the production side. Mining, then, could be thought of as a competitive market where producers (miners) compete with each other to mine coins and then bring them to market. Of course, some miners hoard their stash for speculation or for reasons of loyalty and sense of community. But assuming that the majority of miners are rational and driven by the profit motive, the altcoin-producing economy might actually serve as an analog for competitive commodity production.

Bitcoin benefits from path dependence. It was the first mover in the space of blockchain-based virtual currencies, and it has the greatest market depth as well as social and merchant acceptance. Today, it is fairly easy to find a merchant that accepts Bitcoin for payment, whether it be for a cup of coffee at a neighborhood cafe, a new sofa from an online store, or an international flight on a major airline. You would be hard-pressed, however, to find a ticket from New York to Los Angeles in exchange for Litecoin, Darkcoin, or Dogecoin, the three largest altcoins behind Bitcoin. Anybody wanting to use cryptocurrency to transact with the real economy these days must do so with Bitcoin, and would therefore have to exchange their altcoins for Bitcoins. A rational, profit-motivated producer of cryptocurrencies would only decide to point their mining effort at an altcoin, then, if they could exchange it for effectively more Bitcoin than mining for Bitcoin directly would yield. Otherwise they will elect to mine Bitcoin instead by default. Multi-pools that steer a miner's effort automatically to the most profitable coin are an example of this in practice.

This point cannot be overstated: the only reason to mine an altcoin is that it provides the opportunity to exchange it for more Bitcoin than mining for Bitcoin directly. Take for example the choice between mining Bitcoin and some hypothetical altcoin, XYZCoin, where my mining rig can produce an expected one BTC/day. The same rig can produce an expected 33,000 XYZ/day. If the market bid is 0.00003996, I can exchange my XYZ and get 33,000 x 0.00003996 = 1.32 BTC/day, making XYZCoin mining 32% more profitable than BTC.
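
For clarity, here is the same arithmetic as a small Python snippet; the figures are the hypothetical ones from the example above.

```python
btc_per_day = 1.0          # expected BTC/day from the rig mining Bitcoin directly
xyz_per_day = 33_000       # expected XYZ/day from the same rig
xyz_bid = 0.00003996       # market bid, BTC per XYZ

btc_equivalent = xyz_per_day * xyz_bid          # 1.31868 BTC/day
premium = btc_equivalent / btc_per_day - 1      # ~0.32, i.e. 32% more profitable
print(f"{btc_equivalent:.5f} BTC/day, {premium:.0%} premium over direct BTC mining")
```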

Two forces will act on XYZCoin to eliminate this opportunity for excess profits. First, I will keep producing XYZCoin and selling it in the market, driving down its price until it reaches 0.00003030 in our example. At the same time, the extra mining power coming into XYZCoin mining will drive up its difficulty, reducing the amount of XYZCoin expected from the hashpower of my mining rig. Because the market price can adjust continuously, but the difficulty adjustment happens only incrementally, there is a chance that the difficulty adjustment will overshoot, rendering XYZ not equally profitable to mine as Bitcoin (as economic theory would suggest), but less profitable.
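
A short sketch of those two forces using the same hypothetical numbers: the break-even XYZ/BTC bid, and the effect of a discrete difficulty step that overshoots (the 40% step is an arbitrary assumption for illustration).

```python
btc_per_day = 1.0            # expected BTC/day mining Bitcoin directly
xyz_per_day = 33_000         # expected XYZ/day from the same rig
xyz_bid = 0.00003996         # current market bid, BTC per XYZ

# Force 1: selling pressure pushes the bid down to the break-even level.
breakeven_bid = btc_per_day / xyz_per_day
print(f"break-even bid: {breakeven_bid:.8f} BTC/XYZ")        # ~0.00003030

# Force 2: incoming hashpower raises difficulty in a discrete step; a hypothetical
# 40% jump cuts expected output below parity at the old price.
difficulty_step = 1.40
xyz_after = xyz_per_day / difficulty_step
print(f"BTC/day after the step: {xyz_after * xyz_bid:.3f}")  # ~0.942, now less profitable
```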

This doesn't mean that Bitcoin is the optimal technical implementation of the blockchain. Bitcoin has flaws and limitations, and there are certainly altcoins out there that address and resolve many of those problems. History does not always reward the best, however. VHS won out over the technologically superior Betamax. USB over FireWire. TCP/IP over ISO/OSI. QWERTY keyboards. Internal combustion engines. History is littered with sub-optimal technologies becoming the most widely accepted stable equilibrium – and these equilibria are amazingly difficult to dis-entrench.

The implications are significant. When altcoins are produced, they are produced, on average, in order to be sold in return for Bitcoin. This means that altcoins are always for sale, and their market price over time should decline relative to Bitcoin. In fact, this downward trend has been observable for the majority of altcoins to date.

Are Blockchains Key to the Future of Web Encryption?


May 7, 2017 at 11:50 UTC by Joshua Oliver

Encrypted websites now handle more than half the world's web traffic, but the way the keys for those connections are exchanged and verified hasn't changed much in 20 years.

The current system relies on a global network of certificate authorities (CAs) to verify the public key and the holder of each secure website. It has long been criticized for creating central points of failure. And those central points, the CAs, have actually failed in some cases.

Some think blockchains – the technology that manages key exchange for the $25bn bitcoin network – could be the basis for a secure alternative.

The initial idea

Like blockchains, CAs began as a way to facilitate connected commerce. Veteran developer Christopher Allen – who helped set up the first certificate authority, VeriSign – said he imagined a system with several CAs where users would pick which ones to trust.

As the system has scaled, however, it's become impractical for everyday users to actively manage their trust in different authorities. Most now rely on their browser's default settings instead. It's now the browser companies that effectively control trust, giving them big clout within the certificate industry.

"We've got a new centrality, which is the big browser companies," said Allen.

Today’s risks

While control over trust has centralized, the number of certificate authorities has grown. There are now hundreds of authorities in countries around the world, and a failure at any one of them undermines the entire system.

The worst incident to date was the collapse of the Dutch authority DigiNotar in 2011. Hacking DigiNotar allowed attackers to spy on around 300,000 Iranian Gmail accounts, and forced a temporary shutdown of many of the Dutch government's online services.

Since then, there have been dozens of cases where CAs were caught issuing unverified certificates, using substandard security, or even attempting to deceive browser companies. None of these had the same effects as DigiNotar, and the industry has raised security standards many times since 2011, but there are still those who think it’s time to look for a long-term alternative to CAs.

One of those alternatives was outlined in a 2015 white paper, written at a workshop Allen hosted called "Rebooting Web of Trust". The paper set out goals for a decentralized public key infrastructure (DPKI) to replace the current, centralized system.

"The goal of DPKI is to ensure that ... no single third-party can compromise the integrity and security of the system as a whole."

In place of the current system, where domain ownership is recorded in the DNS and keys are verified by CAs, Rebooting Web of Trust envisioned a secure namespace where domain registration and the key for each domain would be recorded on a blockchain.

A new namespace

The Ethereum Name Service (ENS) is attempting to create the same kind of secure namespace for the ethereum community. It gives us a first look at the challenges and opportunities of making these ideas work in practice.

Developer Alex Van de Sande said his team often uses the analogy of a sandwich to explain how ENS is designed. The 'bread' in the ENS sandwich is two simple contracts. One stipulates that if you own a domain, you're entitled to its subdomains. The other handles payments.
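
As a rough illustration of that 'bread' – ownership of a name conferring control of its subdomains – here is a toy, plain-Python registry. It is a sketch of the idea only, not the actual ENS Solidity contracts, and the names and addresses are made up.

```python
# Toy name registry: whoever owns a name may create its subdomains.
class ToyNameRegistry:
    def __init__(self):
        self.owners = {}  # name -> owner address (hypothetical strings)

    def register(self, name, owner):
        if name in self.owners:
            raise ValueError(f"{name} is already registered")
        self.owners[name] = owner

    def register_subdomain(self, parent, label, caller):
        # Only the owner of the parent name may add subdomains under it.
        if self.owners.get(parent) != caller:
            raise PermissionError("caller does not own the parent name")
        self.owners[f"{label}.{parent}"] = caller

registry = ToyNameRegistry()
registry.register("example.eth", "0xAlice")
registry.register_subdomain("example.eth", "pay", "0xAlice")    # allowed
# registry.register_subdomain("example.eth", "pay2", "0xBob")   # would raise PermissionError
```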

Like in a sandwich, the complicated part of ENS is in the middle. That’s the contract that sets the rules for name registration. ENS wants to avoid the problem of domain squatting, which was common during the initial internet domain name boom.

They're also pursuing the 'principle of least surprise': the idea that people shouldn't be too surprised by who actually owns a name. It might seem like common sense that Bank of America should have first dibs on bankofamerica.eth. But Van de Sande said that designing a system to implement that principle is very challenging, maybe even impractical.

He added that ENS will treat the first year after the relaunch as an opportunity to learn how to improve the registration rules. If the rules change, he said, name owners will have a choice to upgrade or surrender their names for a refund.

Van de Sande said he hopes ENS will be a model for a broader use of similar ideas, adding:

“ENS reflects the way we wish the internet would be. It doesn’t mean that it’s actually going to be that way.”

Blockstack’s model

Another way to decentralize the infrastructure behind secure online communication is to ensure that users can verify the actual information they receive, rather than attempting to secure the server-client connection.

Engineer Jude Nelson, who collaborated on the 2015 "Rebooting Web of Trust" white paper, told CoinDesk this is the purpose of his startup, New York-based Blockstack.

Blockstack's system, which is presently in an alpha release, allows users to record their unique name and key on the bitcoin blockchain, and then look up another user in order to verify the information they receive.
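
A minimal sketch of that lookup-and-verify flow, using the third-party cryptography package and an ordinary dict standing in for a blockchain-backed name registry. The name alice.id and the flow itself are illustrative assumptions, not Blockstack's actual API.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# "Registration": Alice publishes her public key under her chosen name.
alice_key = Ed25519PrivateKey.generate()
name_registry = {"alice.id": alice_key.public_key()}   # stand-in for an on-chain record

# Alice signs a document; a reader looks up her key by name and verifies the signature.
document = b"profile data served from any untrusted host"
signature = alice_key.sign(document)

try:
    name_registry["alice.id"].verify(signature, document)
    print("document verified against the key registered under alice.id")
except InvalidSignature:
    print("verification failed: data was tampered with or signed by someone else")
```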

"With Blockstack, we're trying to make it so that developers can build server-less, decentralized applications where users own their own data," said Nelson. "There are no passwords and developers don't have to host either of them."

This could, one day, reduce the need for website encryption altogether.

Sovereign identity and its hurdles

Each of these projects reflects the same overarching objective: to reduce the role of third parties and give users more control.

Allen, who has convened the Rebooting Web of Trust group every six months since 2015, said he is working towards technologies that give users true sovereignty.

The many strings of letters and numbers that represent individuals online today are all registered with third parties. “You’re not truly buying it, you’re renting it. You don’t have true sovereignty,” said Allen.

But Allen also sees many challenges ahead. One is usability. Systems that work for technically adept users may not scale to applications where most users will rely on defaults and won’t be ready to make choices about who to trust.

“We’ve learned in technology that providing users choice often doesn’t work.”

Meanwhile, the centralized system is also changing. Google is in the middle of rolling out its own solution to the pitfalls of the CA system – a plan called Certificate Transparency, which requires CAs to log all trusted certificates in public view.

Google said it can verify log inclusion and the log's honesty with Merkle trees, and the system has already allowed researchers to catch some bad certificates.
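
To give a flavour of how a Merkle tree supports this kind of audit, here is a simplified inclusion-proof sketch in Python. It omits the leaf/node hash prefixes and other details of RFC 6962, so treat it as an illustration of the principle rather than an implementation of Certificate Transparency.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    """Collect the sibling hashes needed to recompute the root from one leaf."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))   # (hash, sibling-is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_inclusion(leaf, proof, root):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

certs = [b"cert-A", b"cert-B", b"cert-C", b"cert-D", b"cert-E"]
root = merkle_root(certs)
proof = inclusion_proof(certs, 2)
print(verify_inclusion(b"cert-C", proof, root))   # True
```

The proof grows only with the logarithm of the log's size, which is what makes it practical for auditors to spot-check a log holding millions of certificates.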

Google's idea is to keep the third party, but remove the trust. And this approach may prove to be a long-term competitor to blockchain-based projects that want to get rid of both.


Adjoint – When Blockchain? When Database?


Blockchains are a particularly interesting topic right now in financial technology for a broad variety of applications. But despite all the hype, there is a significant amount of noise and misdirected interest in a technology that, while revolutionary for some applications, is not suited for each and every use case.

What is a blockchain?

A blockchain is effectively a small variant on top of the same distributed database technology and algorithms that have existed for 25 years. A blockchain is a very simplistic database: an append-only immutable store that can be written to by a collection of agents who sign their transactions and engage in confirming other agents' transactions through a distributed consensus protocol based on cryptographic hashing. There are many implementations of these ideas, but in some form they all share these common features.
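
Here is a bare-bones sketch of that idea in Python: an append-only store in which each block commits to the hash of the previous one, so tampering with history is detectable. Consensus, signatures and networking are deliberately left out; it is an illustration of the data structure, not a usable ledger.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class ToyChain:
    def __init__(self):
        genesis = {"index": 0, "prev_hash": "0" * 64, "timestamp": time.time(), "txs": []}
        self.blocks = [genesis]

    def append(self, txs):
        # Each new block stores the hash of the block before it.
        block = {
            "index": len(self.blocks),
            "prev_hash": block_hash(self.blocks[-1]),
            "timestamp": time.time(),
            "txs": txs,
        }
        self.blocks.append(block)

    def is_valid(self) -> bool:
        # Recompute every link; any rewrite of history breaks a later prev_hash.
        return all(
            self.blocks[i]["prev_hash"] == block_hash(self.blocks[i - 1])
            for i in range(1, len(self.blocks))
        )

chain = ToyChain()
chain.append([{"from": "alice", "to": "bob", "amount": 3}])
chain.append([{"from": "bob", "to": "carol", "amount": 1}])
print(chain.is_valid())                     # True
chain.blocks[1]["txs"][0]["amount"] = 300   # tamper with history
print(chain.is_valid())                     # False
```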

The blockchain is a specialized use case rather than a complete divergence from traditional database technology. In many cases, blockchain technology is not suitable to use. In most cases, the question of whether a blockchain is appropriate can best be answered by asking: what do I need that my traditional database is not providing me?

  1. Do you need a database in the first place?
  2. Does your application depend on extreme fault-tolerance?
  3. Does my application depend on shared writes from parties with potentially misaligned interests?
  4. What time horizon do I need writes and reads to be consistent?
  5. What groups of parties (agents) need to be responsible for consensus?
  6. Is a trusted third party needed to audit transactions?

Do I need a database in the very first place?

First and foremost, we must ask the question: do I need a database in the first place? Typically the criterion for needing a traditional database is having a large amount of data that does not fit in memory (or an Excel file) and that must be queried and manipulated by automated business processes. Often, the size and complexity of the data has reached a point where manual processes and manual human labor cannot keep the data internally consistent (accurately up-to-date) across all entities who need to read from it.

A fairly large number of companies simply have not moved many core data processes into traditional databases at all. The move to a blockchain storage and processing system is a much more drastic migration. The process of turning an Excel file into a traditional relational database is the best first step towards integrating modern database technology.

Blockchain is not a silver bullet that will instantly convert a company run on an overgrown Excel spreadsheet into one running its business on a scalable, globally consistent database.

Does my application depend on extreme fault-tolerance?

In particular, is it necessary that the state of all transactions over the data be replicated across every client, queryable across history, and verified independently for its consistency? Certain blockchain implementations excel at this task of extreme fault-tolerance, but they do so at a very high cost in data movement and write times.

The primary tradeoff of a massively replicated datastore is that each entity involved in the confirmation process must have a full copy of every transaction. For some applications (financial, healthcare, privacy-sensitive, etc.) this is a non-starter. While data can be encrypted on-chain, the metadata about the specific sender and receiver of information is still permanently stored and visible to all entities. If sharing this metadata is infeasible or illegal, blockchain may not be a suitable solution. Instead, a centralized store managed by a single legally accountable entity is the right solution.

Does my application depend on shared writes from parties with potentially misaligned interests?

The killer application of many blockchain datastores is the capability for potentially misaligned entities to transact information without any single party being able to disrupt or manipulate the exchange. Different systems achieve this through different means, and those systems have different tradeoffs. Yet systems like Bitcoin have demonstrated extreme resilience to attacks while routinely allowing large networks of self-interested parties to safely and securely exchange value and data.

What time horizon do I need writes and reads to be consistent?

Different points in the design space of distributed databases make different compromises between the throughput and consistency of the read and write protocols that interface with the underlying storage engine. Some systems are only eventually consistent, with strong probabilistic bounds on the time it takes the system to converge on consensus; such distributed databases are therefore more likely to end up data-consistent. That being said, if your requirements are about speed and throughput rather than data consistency and reliability, then a traditional database is a better fit for you.

What groups of agents need to be responsible for consensus, and are transactions public or private?

In other words, which parties need to be responsible for determining the correctness of data reads and writes? And how private do the transactions need to be? If the answer is a small number of trusted, private parties, then a private blockchain is recommended. If the answer is a large number of agents whom I do not need to know (public) but still need to trust, then a public blockchain is a better fit.

The reasoning behind this is related to the number of parties, the degree of privacy of the transactions, and the trust factor of those parties. Private blockchains require an authority to determine what data will be read from and written to the single source of truth. Public blockchains tend to allow anyone who interacts with them to participate in consensus, so a number of participants may be compromised actors attempting to pass incorrect information to the blockchain. Through consensus, untrusted parties can still be trusted, because the consensus protocol filters out incorrect writes at scale.

Is a trusted third party needed to audit transactions?

There are a number of institutions and parties in the world today whose job is to double-check, audit, and reconcile data. Regulatory bodies are an example; where these checks are post-transactional, they make a better case for blockchain usage. Where the third parties are internal, the term disintermediation comes to mind, since that use of blockchain smart contracts would eliminate their need to exist at all. Otherwise, it is very much a case of "do not use blockchain for now".

A great experience at IFIP

Dr. Mu Mu

A great experience at IFIP/IEEE IM 2017: 5G slicing, cognitive, E2E, blockchain…

The week-long trip to the IFIP/IEEE International Symposium on Integrated Network Management (IM 2017) in Lisbon was fantastic. I had the chance to catch up with old friends and colleagues (Edmundo, Marilia, Alberto, etc.) and to meet other enthusiasts in network management, SDN, QoE, 5G, blockchain and cognitive technologies.

I spent my first day at the QoE-Management workshop, which had one keynote followed by seven presentations. There is a lot of work on measuring different aspects (delay, switching, fairness, buffer underrun) of the quality of adaptive streaming. Machine learning is also gaining popularity in QoE management. In my opinion, the QoE community faces a few hurdles before a major leap ahead: human intent/perception, encrypted traffic, feasible machine learning solutions in communication networks, and end-to-end multi-service management. I am glad to see that this community is very open to the challenges ahead. It was also quite interesting to see Tobias open up the debate on the Mean Opinion Score (MOS). MOS is essentially a method to gather and analyse user opinions in subjective experiments. MOS has been widely used in the QoE community for decades but it is mathematically flawed. I discussed this five years ago in a paper at IEEE CCNC: Statistical Analysis of Ordinal User Opinion Scores (Warning! It will upset you if you've done a lot of work using conventional MOS… If you end up upset, seek a doctor's advice. Preferably a doctor in Mathematics.). The Tactile Internet was mentioned a few times as one of the use cases. I think someone also mentioned NFV in user terminals with incentives? Why not…

The second day's programme started with Raouf Boutaba (University of Waterloo)'s keynote on 5G network slicing. Raouf talked about virtual network embedding (VNE), with which we map virtual network nodes and links onto physical infrastructure. A good VNE would lead to better error tolerance, efficiency, and "shared wellbeing", etc. It is surely linked to the cognitive networking that I am working on. Later on, a few papers from industry dominated the experience track. Some highlights are Cisco's model-driven network analysis using a variation of RFC 7950 YANG (YANG is a data modelling language used to model configuration data, state data, Remote Procedure Calls, and notifications for network management protocols); UNIFY, a framework that brings cross-layer "elasticity" and unifies cloud and service networks; virtualization of radio access networks (for end-to-end management and other purposes); and IBM's "BlueWall", an orchestration of firewalls. BlueWall still keeps a human in the loop, so it is probably more of an Intelligence Augmentation system than Artificial Intelligence. The panel on "Challenges and Issues for 5G E2E Slicing and its Orchestration" was packed with good talks on 5G. People were very optimistic about 5G open slicing, especially its potential for creating a future generation of mobile operators ("anyone can be an operator") and its E2E benefits for VR and emergency use cases.

The third day was led by two inspiring keynotes: "Intent-Driven Networks" from Laurent Ciavaglia, Nokia, and "The Future of Management is Cognitive" from Nikos Anerousis, IBM Research. They recognised that network/service management is moving towards "dark room + algorithms" (machine learning), but humans will still have pivotal roles: referring to and curating knowledge, and training systems to solve complex problems. I then went to the security session and the SDN session for the rest of the day. An Ericsson talk discussed the COMPA (Control, Orchestration, Management, Policy, and Analytics) adaptive control loop as an automation pattern for carrier networks, a good piece of work to follow if you do such high-level designs. There was an interesting paper on addressing the shortage of scarce and expensive TCAM memory on SDN switches using "memory exchange". The idea is to employ the memory of the SDN controller for least frequently used flow rules to free up TCAM space. Is it impractical, naive? I think there are scenarios where this solution will actually work well…

David Gorman from IBM kicked off the fourth day with his excellent keynote talk on "Making Blockchain Real for Business". David shared his vision of a world of shared ledgers, smart contracts, privacy (certificates) and trust. He used auditing as one of the use cases to demonstrate the uniqueness of blockchain in tracking transactions (changes) in comparison to conventional database solutions. His talk then turned to a brief introduction of Hyperledger, a community effort on cross-industry blockchain technologies. I had a brief and interesting discussion with David on the impact and use cases of blockchain in higher education. Ultimately, blockchain is merely a technology and not a solution (in fact, the same applies to SDN). I think it can be a key technology for enabling cross-service end-to-end management, but in many cases a solution is dictated not by the technology but by politics and regulations.

On the last day, I only stayed until lunchtime before I had to leave to catch my flight. The highlight of the day was certainly Alex Galis (UCL)'s talk on Programmability, Softwarization and Management in 5G networking. He emphasised the importance and impact of softwarization and network programmability, especially the quality of slices in future networks. I'd summarise his talk, blending in my own views, as autonomous, adaptive, and automated end-to-end resource management. Alex also spent a few slides concluding on the key challenges of network slicing, which are very helpful to new researchers in this field.

All in all, IM 2017 in Portugal has been a wonderful event (in fact, they've done so well that they also won Eurovision 2017). I am looking forward to its future iterations (NOMS and IM).
