Bitcoin: A method of secure transactions based on wide publication of a decentralized ledger across the Internet. This method contrasts with current credit card systems based on secrecy and centralization, using protected networks and firewalled data centers filled with the personal information of the transactors.
The public ledger of transactions is collected in blocks roughly every ten minutes, beginning with the current block and going back to the “Genesis block” created by Satoshi Nakamoto, the pseudonymous inventor of bitcoin. Each block is confirmed when at least half the participants in the bitcoin verification process—the “miners”—hash the block mathematically with all the previous blocks since the Genesis block. To change or rescind a transaction, therefore, more than half the computers in the system have to agree to recompute and restate all the transactions since Genesis.
Bitcoins are not coins, but metrics or measuring sticks for transactions that are permanently registered in the blockchain.
Blockchain: A database, similar to a cadaster of real estate titles, extended to transactions, events, covenants, patents, licenses, or other permanent records. All are reduced and hashed together mathematically from the origin of the series, with the record distributed and publicized on decentralized Internet nodes.
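A minimal sketch in Python, with hypothetical records and function names, of the principle described above: each record is hashed together with the running hash of everything before it, so tampering with any earlier record changes every later digest.

```python
import hashlib

def chain_hashes(records):
    """Hash each record together with the running hash of all
    records before it, starting from an empty 'genesis' digest."""
    running = b""  # stands in for the origin of the series
    digests = []
    for record in records:
        running = hashlib.sha256(running + record.encode()).digest()
        digests.append(running.hex())
    return digests

ledger = ["deed: lot 1 to Alice", "patent: no. 42 to Bob"]
print(chain_hashes(ledger))
# Altering the first record changes its digest and every digest after it.
```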
Boltzmann’s Entropy: Heat (the total energy of all molecules in a system) over temperature (the average energy of the molecules). Boltzmann identified this difference with missing information, or uncertainty about the arrangement of the molecules, thus opening the way for Claude Shannon and information theory. Both forms of entropy register disorder. Boltzmann entropy is analog and governed by the natural logarithm “e”, while Shannon entropy is digital and governed by log 2.
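In symbols (a standard rendering, not the author's notation), the two entropies are:

```latex
S = k_B \ln W                 % Boltzmann: W = number of molecular arrangements
H = -\sum_i p_i \log_2 p_i    % Shannon: p_i = probability of symbol i
```

The natural logarithm marks the analog measure; the base-two logarithm counts digital bits.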
Chaitin’s Law: Gregory Chaitin, inventor of algorithmic information theory, ordains that you cannot use static, eternal, perfect mathematics to model dynamic creative life. Determinist math traps the mathematician in a mechanical process that cannot yield innovation or surprise, learning or life. You need to transcend the Newtonian math of physics and adopt post-modern math—the math that follows Gödel (1931) and Turing (1936), the mathematics of creativity.
Economic growth: Learning tested by falsifiability or possible bankruptcy. This understanding of economic growth follows from Karl Popper’s insight that a scientific proposition must be framed in terms that are falsifiable or refutable. Government guarantees prevent learning and thus thwart economic growth.
All expanding businesses and industries follow a learning curve that ordains a 20 to 30 percent decrease in costs with every doubling of total units sold. Classic learning curves are Moore’s Law in microchips and Metcalfe’s Law in networking. Raymond Kurzweil generalized the idea as a “law of accelerating returns,” a “law” that Henry Adams introduced in a learning curve chart in The Education of Henry Adams (1907) and applied to the increase of energy output. As a learning process, economic growth does not directly gain from “machine learning” unless the symbols processed are interpreted by humans.
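A toy calculation of the learning curve just described, with illustrative numbers only and a 25 percent cost decline per doubling (a 20 to 30 percent decline corresponds to a rate between 0.20 and 0.30):

```python
from math import log2

def unit_cost(initial_cost, units, decline_per_doubling=0.25):
    """Cost after cumulative volume grows from 1 to `units`,
    falling by `decline_per_doubling` with each doubling."""
    doublings = log2(units)
    return initial_cost * (1 - decline_per_doubling) ** doublings

# $100 first unit; after 1,024 cumulative units (ten doublings):
print(round(unit_cost(100.0, 1024), 2))  # ~5.63
```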
Expansionary fiscal and monetary policy: The attempt by central banks to stimulate economic activity by selling government securities to pay for a governmental deficit.
Keynesians, mostly on the left, believe that central banks sell securities and impart a fiscal stimulus by enabling more government spending.
Monetarists, mostly on the right, believe that central banks stimulate economic activity by creating money to buy government securities. This new money goes to the previous owners of the purchased securities, chiefly banks, which in recent years have used the funds to purchase more securities from the Treasury.
Keynesianism and monetarism converge in expanding the government’s power to spend. In an information economy, both measures attempt to use government power to force growth. But economic growth is learning (accumulating tested knowledge). Learning cannot be forced.
Gödel’s Incompleteness Theorem: Kurt Gödel’s discovery in mathematical logic that any formal system powerful enough to express the truths of arithmetic will be incomplete and dependent on axioms not reducible to the system—truths that cannot be proved within the system itself. In developing his proof, Gödel invented a mathematical machine that used numbers to embody axioms and thus anticipated the discoveries of computer science. By showing that mathematics could not be hermetically sealed or physically determinist, Gödel opened the way to post-modern math: a math of software and creativity. The first person to appreciate and publicize the importance of Kurt Gödel’s demonstration in 1931 that mathematical statements can be true but unprovable was John von Neumann.
As von Neumann saw, Gödel’s proof depended on his invention of a mathematical “machine” that used numbers to encode and prove algorithms also expressed in numbers. This conception of software, absorbed by von Neumann and Alan Turing, launched computer science and information theory and enabled the development of the Internet and the blockchain.
Gold: The monetary element, atomic number 79, tested over centuries and found uniquely suitable as money. In the periodic table, the five precious metals are rhodium, palladium, silver, platinum, and gold. Rhodium and palladium are rare elements that remained undiscovered until the early 19th century. Platinum’s melting point is 3,215 degrees Fahrenheit, making it unworkable without advanced technology. Silver tarnishes and corrodes, and its reactivity makes it more tractable for most industrial purposes than gold. Only gold can function as a durable and unchanging measuring stick for value. Usually thought to be money because it is a useful commodity—pretty, shiny, divisible, portable, scarce, and convertible into jewelry—gold is in fact the monetary element because it is useless. Money is not valuable because it is really jewelry; jewelry is valuable because it is really money. Gold is a metric of valuation based on the time to extract an incremental ounce, which has changed little over the centuries even as gold has become more difficult to extract from deeper and more attenuated lodes. Thus the gold metric is not a function of technology and industrial progress, part of what it measures. It is a pure gauge of value based on the time to extract it.
Hash: Conversion of a digital file of variable length into a string of characters of a specific length—in the Secure Hashing Algorithm SHA-256, used in Bitcoin’s blockchain cryptography, the output is always 32 bytes (256 bits). Hashes are prohibitively hard to invert: knowledge of the hash does not enable knowledge of the file, but knowledge of the file is readily converted into the hash. Any change to the file drastically changes the hash result. Thus hashes reveal any tampering with the hashed data. The most common hash is the checksum at the end of every Internet packet. Hashes are the enabling technology of blockchains and hashgraphs.
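A quick demonstration, using Python's standard hashlib and made-up payment strings, of the two properties noted above: fixed 32-byte output and drastic change from a one-character edit.

```python
import hashlib

h1 = hashlib.sha256(b"pay Alice 10 coins").hexdigest()
h2 = hashlib.sha256(b"pay Alice 11 coins").hexdigest()

print(len(bytes.fromhex(h1)))  # 32 bytes (256 bits), whatever the input size
print(h1)
print(h2)  # one changed character yields an apparently unrelated digest
```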
Hashgraph: Use of chained blocks (called “rounds”) of hashes in a tree-like structure, with an ingenious algorithm called “virtual voting” that achieves consensus without actual voting or proof of work. Proof of work is a complex and laborious process, best avoided whenever possible. Hashgraph’s fast and efficient system may well prevail as the bottom layer of many blockchains.
Hypertrophy of finance: The growth of finance beyond the rate of growth of the commerce it measures and intermediates. For example, international currency trading is roughly seventy-three times more voluminous than all global trading in goods and services and an estimated one hundred times as voluminous as all stock market transactions. Oil futures trading has risen by a factor of one hundred in some three decades, from 10 percent of oil output in 1984 to ten times oil output in 2015. Derivatives on real estate are now nine times global GDP. That’s not capitalism, that’s hypertrophy of finance.
Information Theory: Begun by Kurt Gödel when he made logic into functional math and algorithms, information theory evolved through the minds of Claude Shannon and Alan Turing into its current role as mathematical philosophy. It depicts human creations and communications as transmissions across a channel, whether a wire or the world, in the face of the power of noise, with the outcome measured by its “news” or surprise, defined as entropy and consummated as knowledge.
Entropy is higher or lower depending on the freedom of choice of the sender. It is a libertarian index. The larger the available alphabet of symbols—that is, the larger the set of possible messages—the greater the composer’s choice and the higher the entropy and information of the message. Information theory both enables and describes our digital and analog world.
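For the simplest case of N equally probable symbols, the standard formula reduces to:

```latex
H = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N
```

So doubling the sender's alphabet adds exactly one bit of entropy, or surprise, per symbol.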
Main Street: The symbol of the real economy of workers paid hourly or monthly and sealed off from the accelerated circular loops of Wall Street moneymaking. Perhaps the street where you live, Main Street is the site of local businesses and jobs.
Metcalfe’s Law: The value and power of a network grows by the square of the number of compatible nodes it links. Named for the engineer Robert Metcalfe, a co-inventor of Ethernet, this law is a rough index and deeply counterintuitive. (The Internet is worth less than the square of its six billion connected devices.) But the law applies to smaller networks, and it explains the vectors of value creation of companies such as Facebook, Apple, Google, and Amazon, which now dominate stock market capitalization. Metcalfe’s Law may well apply to the promise of new digital currencies and ultimately assure the success of a new transactions layer for the Internet software stack.
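A toy rendering of the law, with an illustrative constant of proportionality; the square-law growth comes from counting the possible links among nodes.

```python
def metcalfe_value(nodes, value_per_link=1.0):
    """Value proportional to the number of possible links,
    n*(n-1)/2, which grows as the square of n."""
    return value_per_link * nodes * (nodes - 1) / 2

print(metcalfe_value(10))  # 45 possible links
print(metcalfe_value(20))  # 190: doubling the nodes roughly quadruples value
```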
Moore’s Law: Cost-effectiveness in the computer industry doubles every two years. This pace corresponds closely to a faster pace in the number of transistors produced, signifying a learning curve. Formulated by Intel founder Gordon Moore and inspired by Caltech professor Carver Mead’s research, Moore’s Law was originally based on the biennial doubling of the density of transistors on a silicon chip. It now chiefly relies on other vectors of learning, such as parallel processing, multi-threading, lower voltages, and three-dimensional chip architectures. As a learning curve, Moore’s Law is an important principle of information theory.
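As a worked equation (a standard rendering of the two-year doubling above, not the author's notation):

```latex
C(t) = C_0 \cdot 2^{t/2}   % cost-effectiveness C after t years, doubling every two years
```

Twenty years thus yields a factor of 2^10, roughly a thousandfold gain.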
Noise: Interference in a message; any influence of the conduit on the content; an undesired disturbance in a communications channel. A high-entropy message (full of surprise) requires a low-entropy channel (with no surprises). Surprises in the signal are information; surprises in the channel are noise.
Peirce’s Triad: The theorem of leading 19th century mathematician and philosopher Charles Sanders Peirce holding that all symbol and sign systems (such as software and mathematics) are meaningless without an “interpretant” or interpreter. The triad consists of a sign (or symbol), an object, and a human interpreter. Removing the interpreter vitiates the triad and renders it empty and ready to be filled by ideology and artifice (e.g. “machine learning” and “artificial intelligence”).
Public Key Cryptography: Most cryptography is symmetrical: the same key (a string of binary digits) both encrypts and decrypts the message. This is fine if you can personally give the key to the recipient. But the Internet economy depends on continual transactions with people you never see. The answer to this problem is asymmetrical pairs of keys, generated together, with the key that encrypts the message—the public key—unable to decrypt it, and with a private key for decryption. Blockchains all depend on public keys as addresses for transactions that can be consummated by their private keys.
An important payoff for private keys is using them to encrypt files to be decrypted by the related public key. This process enables digital signatures that authenticate the source of a message. You know that the message originated with a unique private key that was generated in a pair with the public key that you hold. This means money can be signed, like a check, assuring authentication without necessarily revealing the source of the signature.
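Modern signature schemes implement this authentication directly rather than as literal private-key encryption. A minimal sketch in Python, assuming the third-party cryptography package is installed; the key type (Ed25519) and the message are illustrative choices, not the book's:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept secret by the signer
public_key = private_key.public_key()       # published as an address/identity

message = b"pay 10 units to address 0xabc"  # hypothetical payment message
signature = private_key.sign(message)

# Anyone holding the public key can check the signature; verify()
# raises InvalidSignature if the message or signature was altered.
public_key.verify(signature, message)
print("signature verified")
```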
This technique reconciles two apparently conflicting goals of cryptocurrencies: privacy and attestation. Privacy entails seamless trusted transactions without exposure of personal data. Attestation requires access to reliable records of property and history for legal purposes. Thus we can have cash-like transactions (with no exposed secrets) together with robust and reliable and immutable records when demanded by the courts or the IRS. Identity and property can be concealed when appropriate and proven when required.
Contrast this system with the current system where identity and property are constantly exposed to untrusted outsiders, but cannot be proven without reliance on possibly corrupt or mendacious third parties, witnesses, or prosecutors.
Real money: A measuring stick, a metric of value, reflecting the scarcity and irreversible passage of time—entropy-based, equally distributed, and founded on the physical limits of the speed of light and the span of life. Bitcoin and gold are both real money in this sense. Government monopoly money is not.
Sand Hill Road: The arboreal abode of California venture capitalists and their “unicorns,” stretching from the Camino Real near Stanford to Route 280 and into the clouds and wealth of Woodside and Silicon Valley. Losing its leadership in entrepreneurial capital to China, Israel, the world’s Initial Coin Offerings (ICOs), and other fund-raising devices, Sand Hill Road is filling up with lawyers and politicians aiming to redistribute its troves of old wealth.
Shannon Entropy: The measure of information as surprise, or unexpected bits. It is most simply measured by the number of binary digits needed to encode a message and is calculated as the probability-weighted sum of the base-two logarithms of the probabilities of the components of the message. The logarithms of probabilities between one and zero are always negative quantities; entropy is rendered positive by a minus sign in front of this sum. This minus sign prompted some eminent theorists to blunder into the idea of negentropy, which is an oxymoron—more than 100 percent probability. Counterintuitively, surprising information is a kind of disorder. Crystals are ordered; snowflakes are ordered. Hamlet and Google are beautifully disordered alphabets conveying surprising information.
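The definition above, computed directly (a small illustrative sketch in Python):

```python
from math import log2

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) in bits; the leading minus sign turns
    the negative logs of probabilities into a positive total."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: one fair coin flip
print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equal choices
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: little surprise
```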
Turing Machine: Inspired by Gödel’s proof, Turing conceived an abstract universal computer model consisting of a control unit administering a set of instructions for reading, writing, and moving one space at a time back and forth along an infinitely long tape divided into squares along its length. He proved that this hypothetical machine could perform any computable function. Silicon Valley has been cheering ever since, despite his further proof that most numbers cannot be generated by a computational process. Turing’s universal computer could not calculate whether any particular program would ever halt. Turing’s machine was a general-purpose computer because it commanded infinite time and space. Necessarily restricted to specific purposes, real computers are not minds.
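A minimal sketch in Python of the model just described: a control unit looks up its rule table and reads, writes, and moves along a tape (finite here, standing in for the infinite one). The machine and its rule table are hypothetical toys; this one flips a string of bits and halts at the first blank.

```python
def run_turing_machine(tape, rules, state="start"):
    """Control unit: look up (state, symbol), then write, move, change state."""
    tape = dict(enumerate(tape))  # sparse tape stands in for the infinite tape
    head = 0
    while state != "halt":
        symbol = tape.get(head, "_")  # "_" marks a blank square
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Rule table: flip each bit, halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", rules))  # -> 0100_
```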
Wealth: Tested knowledge. Physical law dictates that matter is conserved: material resources have not changed since the Stone Age. All enduring economic advances come from the increase of knowledge through learning, as measured by learning curves (e.g., Moore’s Law, Metcalfe’s Law).