Posted by AGORACOM-JC
at 10:30 AM on Monday, December 10th, 2018
SPONSOR: Bougainville Ventures Inc (CSE: BOG) Converting irrigated farmland to greenhouse-equipped farmland. Bougainville does not “touch the plant” and only provides agricultural infrastructure as a landlord for licensed marijuana growers.
————————–
While marijuana stocks have pulled back with the chaos of the broader
markets in late 2018, we cannot forget about the countless catalysts
that are still out there.
One of the major catalysts involves mergers and acquisitions.
Alcohol companies have had significant interest, for example.
Months ago, Constellation Brands increased its stake in Canopy Growth (CGC) by $4 billion.
That came just months after Constellation first took a 10% stake in
Canopy to help create nonalcoholic cannabis-infused drinks and other
products.
All of this comes as sales of beer fall in the United States and
brewers have begun to bet that legalization of marijuana around the
globe, especially the United States, will continue to build momentum and
sales of cannabis products will take off.
Molson Coors even
listed legal cannabis among the biggest possible risks to its business
in its annual shareholder report. Even Coca-Cola expressed some interest
at one point.
Now, cigarette makers are jumping into the fray, too.
For example, shares of Cronos (CRON) rocketed higher last
week on news that Altria Inc. took a 45% stake in the company for $1.8
billion.
“Investing in Cronos Group as our exclusive partner in
the emerging global cannabis category represents an exciting new growth
opportunity for Altria,” said Howard Willard, Altria’s CEO, as quoted by
CNN.
Talks started as the tobacco industry comes under
pressure, with sales beginning to decline sharply. Just last year,
cigarette smoking fell to its lowest point in history. Marijuana sales
may bring back some of that lost revenue, though.
According to
CNBC, “Counting both legal and black-market sales, the total demand for
pot is approximately $52.5 billion, Marijuana Business Daily has
reported.”
This could potentially lead to other major M&A
deals in the industry following the Altria and Constellation news. If
nothing else, such major deals provide further evidence that the
opportunities in the global marijuana industry are much more than just
hype.
A $1.8 billion investment from Altria isn’t small change.
We wouldn’t be shocked to see other pot stocks attract further M&A interest moving forward.
Stay tuned for more on potential deals right here.
Tags: Hemp, Marijuana, THC | Posted in Bougainville Ventures
Posted by AGORACOM-JC
at 12:00 PM on Sunday, December 9th, 2018
Metis Management Consultancy will work closely with KoreConX’s office in Dubai
[New
York, NY – December 09, 2018] – KoreConX, the first
all-in-one platform for companies to manage their business activities, is
partnering with Metis Management Consultancy, a UAE-grown consultancy firm
focused on providing services to SMEs in the MENA region. The company will
become part of the KorePartner’s Ecosystem, a group of selected companies that
works closely with KoreConX to ensure that small and medium enterprises have
all the elements they need to thrive.
In addition to providing businesses with an all-in-one solution for
their management pain points, KoreConX also developed its own fully-compliant
Security Token Protocol, using IBM’s Hyperledger Fabric, a permissioned
Blockchain. Using KoreConX’s platform, companies are able to issue their
Security Tokens (Tokenized & Digitized Securities) and raise capital in
multiple jurisdictions.
“Blockchain plays a crucial role in everything that we do at KoreConX.
Dubai is taking this technology to the next level, by planning to make the city
fully powered by Blockchain by 2020,” said Edwin Lee, director of the MENA Region.
“Dubai is the place to be when advancing into new technologies and Metis is the
company that shares our high standards when it comes to providing companies
with the best advice.”
The same feeling is shared by the Metis Management Consultancy team.
“We always strive to provide companies with high-quality consulting
services and access to the latest tools,” said Nayef Shahin, Founder and
Managing Partner at Metis. “It is only natural to partner with KoreConX, a team
with a deep understanding of compliance, securities regulations and the one to
create the only protocol that is fully tracked on chain through their transfer
agent service.”
In the UAE, SMEs account for over 90 per cent of private enterprises and
contribute nearly 47% of Dubai’s GDP and 52% of its workforce. It is
therefore a top priority that this sector has access to cutting-edge knowledge,
technology and funding.
Metis Management Consultancy is part of the KorePartner ecosystem, a
group of selected broker-dealers, secondary market platforms, capital markets
platforms, lawyers, compliance, investor relations, accounting, and marketing
firms that support the KoreConX security token protocol and adhere to KoreConX
governance standards. KoreConX’s KorePartners are from around the globe and
bring the necessary expertise that a company will need to launch a fully
compliant security token in multiple jurisdictions.
About KoreConX
KoreConX is the world’s first highly-secure
permissioned blockchain ecosystem for fully-compliant tokenized securities
worldwide.
To ensure compliance with securities
regulation and corporate law, the KoreConX all-in-one, AI-based blockchain
platform manages the full lifecycle of tokenized securities including the
issuance, trading, clearing, settlement, management, reporting, corporate
actions, and custodianship. KoreConX connects companies to the capital markets
and secondary markets facilitating access to capital and liquidity for private
investors.
KoreConX is the first secure, all-in-one
platform for private companies to manage their capital market activity and
stakeholder communications. Removing the burden of fragmented systems and
inefficient tools across multiple vendors, KoreConX offers a single environment
to connect companies, investors and broker/dealers. Leveraged for investor
relations and fundraising, private companies can share and manage corporate
records and investments including portfolio management, capitalization table
management, virtual minute book, security registers, transfer agent services
and virtual deal rooms for raising capital.
Metis Management Consultancy is a leading SME business
advisory consultancy in the region. Its mission is to enhance its clients’
corporate value by providing them access to top-tier consulting services
and expertise across their business domains. Metis makes sure that each business
gets its own tailored solution according to its own wants and needs while
providing direction, guidance, and innovative services to turn its clients’
corporate vision and strategy into operational reality and success.
Posted by AGORACOM-JC
at 4:02 PM on Friday, December 7th, 2018
SPONSOR: Tartisan Nickel (TN:CSE) The company’s Kenbridge Property has a measured and indicated resource of 7.14 million tonnes at 0.62% nickel, 0.33% copper. Tartisan also has interests in Peru, including a 20 percent equity stake in Eloro Resources and a 2 percent NSR in their La Victoria property. Click here for more information
—————–
-Vale, the Brazilian mining giant built on supplying the world’s steel mills with iron ore, is now betting on the electric vehicle (EV) revolution to turn its nickel division around.
-“We believe in this revolution to come,” Chief Executive Fabio
Schvartsman told analysts at the company’s investor day presentation in
New York this week.
LONDON (Reuters) – Vale, the Brazilian mining giant built on
supplying the world’s steel mills with iron ore, is now betting on the
electric vehicle (EV) revolution to turn its nickel division around.
“We believe in this revolution to come,” Chief Executive Fabio
Schvartsman told analysts at the company’s investor day presentation in
New York this week.
The use of nickel in lithium ion batteries will translate into at
least 500,000 tonnes of extra demand by 2025, according to Vale, which
is planning to play a leading role in meeting the additional need for
high-grade metal.
However, to do so, it will have to turn around its troubled New
Caledonian operations, a task described by Schvartsman as “maybe our
biggest challenge”.
It will also have to gamble that Chinese players led by the Tsingshan
steel group don’t make the technological breakthrough that would allow
them to convert nickel ore straight into battery-grade nickel.
That would undermine demand for the sort of high-purity material, so-called Class I nickel, that Vale specializes in producing.
STILL WAITING FOR GORO
Vale had been hoping to attract a partner for its Vale New Caledonia (VNC) operations but evidently without success.
It will now go it alone.
What was originally known as the Goro project has been strewn with
operational problems ever since it came on stream, two years late, in
2011.
In theory, it’s perfectly positioned to ride the EV revolution,
producing the right sort of nickel for processing into batteries with a
by-product stream of cobalt, another hot battery metal.
In practice, Vale has never fully mastered the high-pressure acid leach (HPAL) technology used to convert ore to nickel oxides.
The original plan envisaged a three-year ramp-up to nameplate
capacity of 58,000 tonnes of nickel in oxide and hydroxide. In 2017, its
sixth year of operation, it managed 40,000 tonnes.
Alas, even that good run hasn’t lasted into 2018.
Production of what Vale terms “finished nickel products from VNC
source material” fell 17 percent in the first nine months of the year to
24,200 tonnes and VNC reported an operating loss of $42 million in the
third quarter itself.
Vale management is undeterred.
It has, according to Eduardo Bartolomeo, head of the company’s base
metals division, commissioned a “very detailed study to know exactly why
we can’t achieve our nameplate capacity.”
The study found that there is no “insurmountable†bottleneck in the
plant and Vale’s goal is now to invest $500 million to get the plant
operating at 50,000 tonnes per year of nickel products over a two- to
three-year time horizon.
It’s not the first time senior Vale management has vowed to fix Goro,
but the new-found incentive is the coming electric vehicle revolution.
The decision to double down on New Caledonia is “very simple”,
according to Schvartsman. “We will need this operation in order to
supply the market because of the growth in the consumption for
batteries.”
TSINGSHAN CHALLENGE
That is, unless Chinese steel giant Tsingshan can make good on its
ambitions to build an Indonesian plant that can convert nickel ore
straight into battery-quality material.
Since Tsingshan’s original announcement in September, the London
Metal Exchange (LME) nickel price has fallen from just under $13,000 per
tonne to a current $11,000.
Nickel’s shiny electric vehicle premium has been blown away by the
prospect of Indonesia’s abundant nickel ore production, currently
exclusively destined for the stainless steel sector, being diverted into
meeting battery demand.
Such an eventuality could also impact severely demand for the sort of premium nickel product currently produced by Vale.
No-one quite believes Tsingshan’s stated intention of building a
plant to produce 50,000 tonnes per year of contained nickel at a cost of
$700 million with first production next year. Particularly since it is
proposing to use the same HPAL technology that has challenged Vale and
other producers in recent years.
But based on Tsingshan’s track record of single-handedly propelling
Indonesia into the top ranks of stainless steel producers in super-quick
time, no-one’s quite sure either.
Vale’s Schvartsman conceded that “there is no question about the
ingenuity of the Chinese” and that over time “this technology will
become more competitive in their hands”.
But not next year, nor in all likelihood the year after.
To build a plant that size, using that technology with that amount of investment “is totally impossible”, Schvartsman said.
Tsingshan’s September statement, according to Schvartsman, “is more
an issue of communication – there isn’t anything real behind it.”
“Just talk”, agreed Bartolomeo, who noted it would take Tsingshan 18
months just to get a federal marine disposal license. “They have the
provisional license but the rules are very strict”.
NOW A BELIEVER
This time last year, when Vale was actively looking for an investment
partner in VNC, Schvartsman said it was a test of whether the market
really believed that “nickel is something that is important for the
future of EVs.”
Would all the future promise “translate into someone who is eager to invest with us to have more nickel in the future?”
The apparent negative response is in all likelihood far more to do
with Goro’s problematic past performance than nickel’s future prospects.
The metal seems on track to be an early winner in the materials
competition for lithium batteries, partly at the expense of cobalt on
price and supply stability grounds.
But the promise still lies largely in the future. Batteries only account for around 5 percent of total nickel demand.
Right now the price remains beholden to its traditional stainless
steel drivers. Stainless production ran hot through the first part of
this year but is cooling rapidly, an overlooked part of the recent price
sell-off.
Nickel inventories, meanwhile, remain elevated. Visible stocks on the
LME have been falling but there is a strong suspicion that part of the
decline has simply reflected statistically hidden stock building along
the supply chain.
Vale has around 60,000 tonnes of idled production capacity, taken off-line at the end of 2017 due to low prices.
That gives it plenty of optionality in lifting output as and when demand from the battery sector takes off.
Because one thing is for sure. Vale is now an official believer in the electric vehicle story.
To reap the full rewards, though, it needs to sort out once and for
all its problem child, Goro, and keep its fingers crossed that
Tsingshan’s announcement is, for now at least, “just talk”.
Tags: #mining, nickel | Posted in All Recent Posts, Tartisan Nickel
Posted by AGORACOM-JC
at 2:00 PM on Friday, December 7th, 2018
Sponsor: Good Life Networks: Video advertising is the future! Company’s A.I. makes 80,000 calculations / second, targeting 750 million users to deliver higher prices and volume. The company achieved a record $9.7 Million in revenue for 2017 and recently announced entering the video game industry with programmatic technology. Click here for more information
——————
In 2018, more than $47 billion in the US was spent on programmatic display advertising with Facebook and Google taking a large chunk of the pie. By 2020, that figure will climb to nearly $69 billion.
After an intense two days spent at the world’s largest conference on programmatic advertising, Programmatic I/O, it was fascinating to see how US online publishers are utilising data and selling inventory programmatically.
In the United States, there are an average of 14.5 programmatic
tech partners per publisher, whereas in South Africa, we have an
average of just three. This is not a bad situation to be in as our
ecosystem is less fragmented and we have more control over our
inventory. But it does highlight that programmatic is still in its
infancy here.
One of the US speakers, Taylor Schreiner at Adobe, said,
“Organisations are transforming to take advantage of programmatic.
Brands are now more hands-on. They have a better understanding of the
metrics they are facing, and they’re more specific in their directives
to agencies. We’re seeing more clients who now have people in the
organisation who are in a position to think about reach across
channels.”
First-party data, which is essential for publishers, was
another big theme that came through. It is a priority as it gives a
competitive advantage in fighting the duopoly of Google and
Facebook, who account for around 50% of programmatic revenue in the
US. Their audience intelligence, reach and measurement
capabilities, which advertisers are not going to pass up, keep them on
top.
Relationships are key in this fragmented industry. The display
ad tech Lumascape highlighted this fact. There are many touch points
available when it comes to making a deal and publishers need to ensure
they are talking to all parties involved. There is no such thing as ‘set
it and forget it’. Secondly, CPM rates are eroded by tech costs, and the
publisher comes out with only a fraction of what the buyer initially
spends.
The issue of viewability
The issue of viewability came up repeatedly throughout the two-day
conference. A viewable ad is defined as one where at least 50% of the
pixels of a regular creative, or 30% of the pixels of a large-size
creative, are on an in-focus browser tab in the viewable space of the
browser page for a minimum of one continuous second. (This definition
was even questioned a few times by brands.) As far as many brands were
concerned, 100% viewability has to be non-negotiable, and
advertisers/buyers should not pay for a non-viewable ad.
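The viewability rule described above can be sketched in code. This assumes a measurement layer can sample, at a fixed interval, the fraction of an ad’s pixels sitting on an in-focus tab; all names and the large-creative pixel cutoff here are illustrative, not taken from any real vendor SDK.

```python
# Sketch of the viewability rule: the required pixel fraction must stay
# visible for at least one continuous second of samples.

LARGE_CREATIVE_PIXELS = 242_500  # assumed cutoff for "large size" creatives

def is_viewable(visible_fractions, creative_pixels, sample_interval_s=0.1):
    """visible_fractions: per-sample fraction of the ad's pixels visible.
    Returns True once the threshold is held for one continuous second."""
    required = 0.3 if creative_pixels >= LARGE_CREATIVE_PIXELS else 0.5
    needed = round(1.0 / sample_interval_s)  # samples in one second
    run = 0
    for fraction in visible_fractions:
        run = run + 1 if fraction >= required else 0
        if run >= needed:
            return True
    return False
```

Note that the rule is binary per impression, which is exactly why brands argued for a stricter 100% standard: an ad that is half-covered for one second still counts as viewable here.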
Artificial Intelligence was also a strong presence, with a few
interesting developments on the cards. We can’t escape the fact that AI
is and will be an essential part of our lives. The Nest Cam Indoor
security camera, for instance, learns who the regular members of your
household are. If a stranger or visitor is in your home, Nest reports
back to you immediately via your connected device. The Ricoh whiteboard
is another great AI example – once you’ve made your notes on it you can
email the contents to anyone around the world, with full translation
capabilities. All of these things will add to the wonder that is big
data which ultimately will assist advertisers to better target
consumers.
And then there’s brand safety
It was evident that there’s a need for deeper conversations on
brand safety between publishers, agencies and brands who all need to
understand and explain what brand safety means to each of them. Some
brands mentioned that they won’t pay if creative appears in a negative
environment. However, they would consider an environment that has a
positive spin next to controversial content. Unfortunately, safety tools
are screening out these environments if the story contains blacklisted
keywords. Private marketplaces need to be of more help.
Ad fraud and fake news are rife in the US industry; ad fraud
specifically is present, though on a lesser scale, here in South Africa.
Publishers are fighting hard against these practices, and buyers are
turning to technology to help eliminate these deceptive practices and
reduce the ad spend lost to them.
There are many types of ad fraud, but the general challenges in
programmatic include invalid traffic (IVT), domain spoofing, page-level
scripting, ad injection, and poor user experience. Low-quality human
traffic is another issue: through paid media channels (including
clickbait), traffic is pushed to transit hubs by fake authors and
instantaneously bounces off these sites, purely to serve ads and collect
ad revenue. These are all things that advertisers and publishers need to
be cognisant of. Publishers need to adopt ads.txt as a non-negotiable,
and advertisers need to be selective when buying inventory across the
open market.
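For illustration, here is a minimal sketch of reading the ads.txt format referred to above. Each data line lists the ad system’s domain, the publisher’s account ID, a DIRECT or RESELLER relationship, and an optional certification authority ID; comment lines start with ‘#’, and variable lines such as CONTACT= are skipped for brevity. The sample file contents are invented.

```python
# Minimal, illustrative parser for the ads.txt format.

def parse_ads_txt(text):
    records = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or "=" in line.split(",")[0]:
            continue  # blank line or a variable declaration
        fields = [f.strip() for f in line.split(",")]
        if len(fields) < 3:
            continue  # malformed record
        records.append({
            "domain": fields[0].lower(),
            "publisher_id": fields[1],
            "relationship": fields[2].upper(),
            "cert_id": fields[3] if len(fields) > 3 else None,
        })
    return records

sample = """# ads.txt for an example publisher
google.com, pub-1234567890, DIRECT, f08c47fec0942fa0
appnexus.com, 7890, RESELLER
"""
```

A buyer can then check that the seller it is transacting with actually appears in the publisher’s authorised list before bidding, which is what makes ads.txt effective against domain spoofing.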
I learned that a dollar in does not equal a dollar out when it
comes to programmatic and intermediaries are more prevalent than
thought. But most importantly, relationships are more important now than
ever before. Successful partnerships between publishers, agencies and
clients are open and honest about what works for them.
The conference gave invaluable insights into the world of
programmatic and even though we have some catching up to do, it’s an
exciting time in our industry.
Ashleigh Footit is head of techops, programmatic and performance
at SPARK Media. She was responsible for establishing the programmatic
division for the group in 2015 and has been one of the key drivers in
the implementation, management and success of Caxton’s Supply Side and
Data Management Platform.
Posted by AGORACOM-JC
at 10:42 AM on Friday, December 7th, 2018
SPONSOR: Esports Entertainment $GMBL – Esports audience is 350M, growing to 590M, Esports wagering is projected at $23 BILLION by 2020. The company has launched VIE.gg esports betting platform and has accelerated affiliate marketing agreements with an additional 42 Esports teams, bringing total to 176 Esports teams. Click here for more information
———————-
Magic: The Gathering tournaments, whether they’re informal
competitions at local game shops or large, formal affairs, have been an
institution for years. And as announced at the 2018 Game Awards, those
tournaments are now being brought into the esports arena with the reveal of Mythic Championship events and a pro league.
Magic: The Gathering has increasingly been digitized this year, with
the development of Magic: The Gathering Arena, a new way to play the
game online separate from the preexisting Magic: The Gathering Online.
Arena is currently in open beta for PC users, with a full release
planned for 2019. But even though awareness about Arena may benefit most
from this reveal, this new esports structure won’t just apply to the
digital version of the game.
The prize pool is split evenly between two ways to play the game. The
traditional tabletop game and Arena will each have a $5 million prize
pool, with a total of 10 tournaments that begin with the Mythic
Championship being held at next year’s PAX East.
The Magic Pro League, meanwhile, will include the 32 top-ranked
players in the world. Though everyday players will have means to qualify
for championship events (with more details promised for 2019), each of
these players is afforded automatic entry and is promised “competitive
pro contracts,” according to Wizards of the Coast.
Arena players can also receive an esports starter kit by entering the promo code “GameAwards.”
Posted by AGORACOM-JC
at 9:49 AM on Friday, December 7th, 2018
SPONSOR: BetterU Education Corp. Connecting global leading educators to the mass population of India. BetterU Education has the ability to reach 100 MILLION potential learners each week.
Online Education for India
Online education has become popular among working professionals and
students in higher education. These categories of online learners find
immense benefit in the autonomy, and flexibility, that these courses
offer.
Online courses can be planned into their schedule, which may include full-time employment, internships and caring for a family. It can also help them take out quiet time to study.
Online learning in an education system
Distance learning has been around for a long time, even before
technology made it extremely accessible. Traditional schooling is now
seeing an increased proliferation of virtual training materials and
online courses. Even in a world of tried and tested schooling
systems and curricula, the most successful schools are the ones who
adapt to the changing times, as well as to the expectations of students,
parents and the society.
If online education is here to stay, then what are its implications
for traditional learning? Instead of focusing on pros and cons, the
conversation we should be having today is about leveraging online
learning to make our education system more conducive to learning.
Setting goals, tracking progress and meeting deadlines
Online courses call for a greater amount of motivation and
self-discipline than a classroom-based course. A classroom has one or
more instructors and peers, who can hold a student accountable for their
course-work. In contrast, online courses involve setting our own goals,
tracking progress and meeting deadlines.
One does not learn in isolation, so online courses do offer
discussion forums, email, and one-on-one support. Technology also adds
to the visual experience by incorporating animations that can be
used interactively for effective teaching and communication.
The classroom advantage
A school provides structure, support, and a system of rewards and penalties to groom its students. Classroom education has the benefit of face-to-face interactions with peers, which are typically moderated by a teacher.
It provides children, especially those in their early developmental
years, with a stable environment for social interactions, helping them
develop skills like boundary setting, empathy, and cooperation. This
also allows plenty of room for spontaneity, unlike a virtual learning
setup.
Online education in the context of schooling
As students progress to higher classes, they seek more autonomy and
intellectual freedom. Online learning can help them pursue highly
individualized learning programmes, possibly even college-level courses.
These, combined with hands-on exercises, real-world exploration, and
thorough assessments, can be highly beneficial to their learning
progress.
Here’s what the Managing Director of Trio World Academy said:
“They can explore their options, by trying out introductory topics
from different fields, before committing to a specialization. Online
learning platforms can help these students become more independent
learners before they make their way into college,” said Naveen K M.
“I believe that we must not hold students back from picking any
online course, but instead act as their guide as they navigate through
it,” he added.
Teachers and parents should act as anchors and mentors
Mobile apps that provide enhanced learning opportunities for school
children have become mainstream. Since mobile phones have already found
their way into their hands, these apps are being used to supplement
classroom learning.
Teachers and parents need to act as anchors and mentors, curating the
kind of educational content students are exposed to, during the tricky
phase of finding the right career to pursue.
Programmes to support families wishing to home-school
Virtual public schools, which offer a full-scale K-12 education, have
already sprung up in some parts of the world. They even offer a
combination of the traditional system with online education. There are
programmes that provide support to families that wish to home-school
their children, in the form of online course material.
These programmes bring parents and teachers into the fold, by
involving them into their child’s education from the get-go. However,
their effectiveness in the long term needs to be studied.
Online programmes for weaker communities
Online learning programmes will also open up opportunities for
children from weaker socio-economic communities, who have limited
access to learning resources, i.e. teachers, textbooks and
infrastructure.
It will connect them to a global network of online learners, exposing
them to new perspectives. The ideas that they receive will not be
limited by the number of heads in one classroom.
Online education can also be designed to be accommodating of a variety of learning styles among students.
“As educators, it is likely that we will have to put in additional
efforts to incorporate online learning programmes into the curriculum,
in the most suitable manner,” said the managing director.
Online training programmes are helping teachers/educators advance
their skills in curriculum implementation, policy, education systems and
leadership, both independently and with the support of their
institutions.
It lets them collaborate with their peers and learn new
instructional skills that are relevant to their career. These
programmes can help them develop new skills and capabilities in their
students, with the help of technology and interdisciplinary approaches.
Education for the future
As the overlap of the traditional and online educational worlds is
becoming more and more inevitable, we owe it to our students to make
their education relevant to their future, through our own ingenuity,
passion and careful planning.
-Authored article by Naveen K M, Managing Director, Trio World Academy
Posted by AGORACOM-JC
at 4:28 PM on Wednesday, December 5th, 2018
By Dr. Kiran Garimella
In parts 1-3, we briefly touched on some of the historical foundations of blockchains from computer science and mathematics, including their sub-topics such as distributed systems and cryptography. Specific topics in either of these categories were consensus mechanisms, fault-tolerance, scaling, zero-knowledge proofs, etc.
Obviously, this brief series doesn’t do the subject full justice. The history of computing and mathematics is rich, with many interconnections and dependencies. The goal of this series was to provide just enough to make the point that the technologies that power blockchain (whether public or private) were built on a well-established foundation of various topics with contributions from real scientists in both industry and academia. The graphic below depicts the broad brush-strokes of development, clearly showing how current blockchain technologies are based on a wide spectrum of historical developments.
Technologies of Blockchain – Historical Timeline
Conclusion
As you can see, a tremendous amount of development that took place for almost half a century made the modern blockchain possible. Bringing these technologies together—almost all of them based not on just techniques but deep mathematical foundations—into a cohesive whole in the form of a bitcoin application was no doubt a tremendous achievement in itself.
Moving forward, we need to keep in mind the initial motivation for each of these technologies, their strengths, their limitations, and determine how to create different architectures based on business needs. A good example of this is to relax the requirements of anonymity, strengthen safety, incorporate recourse, improve security, and incorporate the enormous complexity of regulatory compliance in securities transactions. Making such trade-offs doesn’t detract from the need for public, decentralized blockchains. On the contrary, this strengthens the use of the blockchain technology ‘horizontally’ across many industries and use cases.
In the near future, we expect to see some innovation in blockchains to improve performance and scalability, which is a special challenge for public blockchains. Along the same lines, there will be new consensus mechanisms going mainstream (such as proof-of-stake). For consensus and validation, blockchain researchers are investigating efficient implementation of zero-knowledge proofs and specific variants such as zkSNARKs.
Posted by AGORACOM-JC
at 4:19 PM on Wednesday, December 5th, 2018
Kiran Garimella
In Part 2, we saw how a simple concept of a linked list can morph into complex, distributed systems. Obviously, this is a simple, conceptual evolution leading up to blockchain, but it’s not the only way distributed systems can arise. Distributed systems need coordination, fault tolerance, consensus, and several layers of technology management (in the sense of systems and protocols).
Distributed systems also have a number of other complex issues. When the nodes in a distributed system are also decentralized (from the perspective of ownership and control), security becomes essential. That’s where complex cryptographic mechanisms come into play. The huge volume of transactions makes it necessary to address performance of any shared or replicated data, thus paving the way to notions of scaling, sharding, and verification of distributed data to ensure that it did not get out of sync or get compromised. In this segment, we will see that these ideas are not new; they were known and have been worked on for several decades.
Cryptography
One important requirement in distributed systems is the security of data and participants. This motivates the introduction of cryptographic techniques. Ralph Merkle, for example, introduced in 1979 the concept of a binary tree of hashes (now known as a Merkle tree). Cryptographic hashing of blocks was implemented in 1991 by Stuart Haber & W. Scott Stornetta. In 1992, they incorporated Merkle trees into their scheme for efficiency.
Hashing functions are well-researched, standard techniques that provide the foundation for much of modern cryptography, including the well-known SSL certificates and the https protocol. Merkle's hash function design, now known as the Merkle-Damgård construction, is used in SHA-1 and SHA-2. Hashcash uses SHA-1 (the original SHA-0 appeared in 1993, SHA-1 in 1995); newer systems use the more secure SHA-2 family (which includes SHA-256 and SHA-512). The still more secure SHA-3 is the next upgrade.
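To make the Merkle tree concrete, here is a minimal Python sketch (the function names are mine, not from any particular blockchain): leaves are hashed, then pairs of hashes are hashed together level by level until a single root remains. Changing any leaf changes the root, which is what makes the tree useful for efficient verification.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Compute the Merkle root of a list of raw leaf data blocks."""
    level = [sha256(leaf) for leaf in leaves]   # hash each leaf first
    while len(level) > 1:
        if len(level) % 2 == 1:                 # odd count: duplicate the last hash
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"])
print(root.hex())
```

The efficiency Haber and Stornetta gained from Merkle trees comes from the fact that proving a single leaf belongs to the tree requires only the hashes along its path to the root, not the whole data set.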
Partitioning, Scaling, Replicating, and Sharding
Since the core of a blockchain is the database in the form of a distributed ledger, the question of how to deal with the rapidly growing size of the database becomes increasingly urgent. Partitioning, replicating, scaling, and sharding are all closely related concepts. These techniques, historically used in enterprise systems, are now being employed in blockchains to address performance limitations.
As with all things blockchain, these are not new concepts either; large companies have been struggling with these issues for many decades, though not from a blockchain perspective. The intuitively obvious solution for a growing database is to split it up into pieces and store the pieces separately. Underlying this seemingly simple solution lie a number of technical challenges, such as how the application layer would know in which 'piece' any particular data record can be found, and how to manage queries across multiple partitions of the data. While these scalability problems are tractable in enterprise systems or in ecosystems that have known and permitted participants (i.e., the equivalent of permissioned blockchains), it gets trickier in public blockchains. The permutations of malicious strategies seem endless and practically impossible to enumerate in advance. The need to preserve reasonable anonymity also increases the complexity of robust solutions.
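As a sketch of how the application layer can know which 'piece' holds a record, here is a toy hash-partitioning scheme in Python (the shard count and function names are illustrative assumptions, not any production design): the record's key is hashed, and the hash deterministically selects a shard, so reads and writes route the same way.

```python
import hashlib

NUM_SHARDS = 4  # illustrative; real systems tune and rebalance this

def shard_for(key: str) -> int:
    """Deterministically map a record's key to a shard number."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Each shard is modeled as its own little key-value store.
shards = {i: {} for i in range(NUM_SHARDS)}

def put(key: str, value) -> None:
    shards[shard_for(key)][key] = value

def get(key: str):
    # The same hash routes the read to the shard that holds the key.
    return shards[shard_for(key)].get(key)

put("account:alice", 90)
print(get("account:alice"))  # prints 90
```

A query that spans many keys, by contrast, must fan out to every shard, which is exactly the cross-partition complexity described above.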
Verification and Validation
Zero-knowledge proofs (ZKPs) are techniques by which one party (the prover) can prove to another party (the verifier) that the prover knows something, without having to disclose what it is that the prover knows. (This sounds magical, but there are many simple examples showing how this is possible, which I'll cover in a later post.) ZKP was first described in the 1985 paper "The Knowledge Complexity of Interactive Proof-Systems" by Shafi Goldwasser, Silvio Micali, and Charles Rackoff (apparently, it was developed earlier, in 1982, but not published until 1985). Zcash, a bitcoin-based cryptocurrency, uses ZKPs (or variants called zkSNARKs, first introduced in 2012 by four researchers) to ensure the validity of transactions without revealing any information about the sender, the receiver, or the amount itself.
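For a taste of how such a proof can work, here is a toy Python sketch of the Schnorr identification protocol, a classic interactive proof of knowledge of a discrete logarithm (the tiny parameters are purely illustrative and hopelessly insecure): the prover convinces the verifier that it knows the secret exponent x behind y = g^x mod p without ever sending x.

```python
import random

# Toy group parameters (hypothetical, insecure sizes): p = 2q + 1,
# and g = 4 generates a subgroup of prime order q modulo p.
p, q, g = 23, 11, 4

x = random.randrange(1, q)   # the prover's secret
y = pow(g, x, p)             # the prover's public value

# Commit: the prover picks a random r and sends t = g^r mod p.
r = random.randrange(1, q)
t = pow(g, r, p)

# Challenge: the verifier replies with a random c.
c = random.randrange(1, q)

# Response: the prover sends s = r + c*x mod q; because r is random,
# s on its own reveals nothing about x.
s = (r + c * x) % q

# Verify: g^s == t * y^c (mod p) holds only if the prover knew x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```

The check works because g^s = g^(r + c*x) = g^r * (g^x)^c = t * y^c, so the algebra lines up exactly when the response was computed from the real secret.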
Some of these proofs, and indeed the transactions themselves, could be implemented by automated code, popularly known as smart contracts. These were first conceived by Nick Szabo in 1996. Despite the name, it is debatable whether these automated pieces of code can be called smart, given the relatively advanced current state of artificial intelligence. Similarly, smart contracts are not quite contracts in the legal sense. A credit card transaction, for example, incorporates a tremendous amount of computation, checking for balances, holds, fraud, unusual spending patterns, etc., with service-level agreements and contractual bindings between the various parties in the complex web of modern financial transactions. Yet we don't usually call this a 'smart contract'. In comparison, even today's 'smart contracts' are fairly simplistic.
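To illustrate the 'automated code' idea without tying it to any particular blockchain platform, here is a hypothetical escrow sketch in Python (the class and method names are mine): the 'contract' releases funds automatically once its conditions are met, with no human intermediary deciding anything.

```python
class EscrowContract:
    """A toy 'smart contract' (all names hypothetical): automated code
    that releases payment once its conditions are met, with no middleman."""

    def __init__(self, buyer: str, seller: str, amount: int):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.funded = False
        self.delivered = False
        self.settled = False

    def deposit(self):
        self.funded = True
        self._try_settle()

    def confirm_delivery(self):
        self.delivered = True
        self._try_settle()

    def _try_settle(self):
        # The code itself enforces the agreement: funds move only
        # when both conditions hold, and only once.
        if self.funded and self.delivered and not self.settled:
            self.settled = True
            print(f"released {self.amount} to {self.seller}")

contract = EscrowContract("alice", "bob", 90)
contract.deposit()
contract.confirm_delivery()  # settles once both conditions hold
```

Simplistic as it is, this captures the essence: the logic runs deterministically, so both parties can predict exactly when and how it will settle.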
Posted by AGORACOM-JC
at 4:11 PM on Wednesday, December 5th, 2018
Kiran Garimella
We saw in Part 1 that linked lists provide the conceptual foundation for blockchain, where a 'block' is a package of data and blocks are strung together by some type of linking mechanism such as pointers, references, addresses, etc. In this Part 2, we will see how this simple concept gives rise to powerful ideas that lay the foundation for distributed systems.
What happens when one of the links in the linked list or one of the computers (aka, 'nodes') in a distributed system falls sick (and responds slowly), gets taken down ('hacked'), or dies? How does the full list (or chain) recover from such tragic events? This brings us to the notion of fault tolerance in distributed systems. Once changes are made to the data in one of the nodes (blocks), how do we ensure that the same information is consistent with other nodes? That introduces the requirement for consensus.
Pushing the analogy of the linked list a bit further, algorithms that manage linked lists are carefully designed not to break the list. Appending links at the end (or the front, for that matter) is an easy operation; we just need to make sure that the markers indicating the start and end of the list are updated correctly. Removing a link from the middle of the chain, or inserting one into it, is trickier, but it is a well-understood problem with known solutions. We won't go into the specifics in this article because the intent is not to describe these operations but to convey a high-level historical perspective.
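For readers who want a flavor of those operations anyway, here is a minimal Python sketch (conceptual only, not how any blockchain stores blocks): inserting into or removing from the middle works by re-pointing the neighboring link, and the order of the pointer updates is what keeps the list from ever breaking.

```python
class Node:
    """One link in the chain: a piece of data plus a pointer onward."""
    def __init__(self, data, next=None):
        self.data, self.next = data, next

def insert_after(node: Node, data) -> None:
    # The new node points onward *before* its predecessor is re-pointed,
    # so the chain is never broken mid-operation.
    node.next = Node(data, node.next)

def remove_after(node: Node) -> None:
    # Unlink the following node by simply bypassing it.
    if node.next is not None:
        node.next = node.next.next

head = Node("a", Node("c"))
insert_after(head, "b")   # a -> b -> c
remove_after(head)        # back to a -> c
```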
In distributed systems, fault tolerance becomes a very important topic. In one sense, it is a logical extension of managing a linked list on a single computer. Obviously, in real-world applications, each of the nodes in a distributed system is an economic entity that depends on other economic entities to achieve its goals. Faults within the system must be minimized as much as possible. When faults are inevitable, recovery must be as quick and complete as possible. Computer scientists began studying methods of fault tolerance in the mid-1950s, resulting in the first fault-tolerant computer, SAPO, in Czechoslovakia.
Besides fault tolerance, when information needs to be added to the distributed system (a bit like adding, deleting, or updating the elements of a linked list), the different parties must agree. The reason for agreement is that the data that goes into the 'linked list' is data that arises out of transactions between these parties. Without agreement, imagine the chaos! My node would record that I sent you $90 while your node would record only $19! Or, if I send you payment for a product, I expect to receive the product. There should be agreement, settlement, and reconciliation between the transacting parties. A stronger requirement in distributed systems is that once the parties agree to something, the data that is agreed upon cannot be changed by one of the parties without the concurrence of the other party or parties. The strongest version of this requirement is 'immutability', where it is technically impossible to make any changes to data that is agreed to and committed to the chain.
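A minimal Python sketch shows how linking blocks by hashes makes committed data tamper-evident (a simplification of real blockchains, with illustrative function names): each block records the hash of its predecessor, so quietly rewriting the $90 to $19 after the fact invalidates every later link.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Canonical JSON so the same block always hashes the same way.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain: list) -> bool:
    """Every block must still hash to what its successor recorded."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"from": "me", "to": "you", "amount": 90})
append_block(chain, {"from": "you", "to": "me", "amount": 50})
assert verify(chain)

chain[0]["data"]["amount"] = 19   # quietly rewrite an agreed record...
assert not verify(chain)          # ...and the chain no longer verifies
```

Real immutability also needs many independent copies of the chain, so that no single party can rewrite both the data and all the downstream hashes unnoticed.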
Fault-Tolerance and Consensus
Distributed systems, therefore, require fault tolerance, consensus, and immutability in varying degrees, depending on the needs of the business. Mechanisms for fault tolerance and consensus have evolved since the early days. Notable developments are:
Byzantine Fault Tolerance (BFT) by Lamport, Shostak, and Pease in 1982, to deal with situations where one or more of the nodes in the distributed system become faulty or malicious.
Proof-of-Work (PoW), first described in 1993 (the term itself was coined in 1999), which is a technique for providing economic disincentives for malicious attacks. A precursor idea of PoW was proposed in 1992 by Cynthia Dwork and Moni Naor as a means of combatting junk mail, a problem that was already a significant nuisance way back in 1992!* Their solution was to require a sender to solve a computational problem that was easy enough when sending email normally but becomes computationally expensive when sending massive amounts of junk email.
Practical Byzantine Fault Tolerance (PBFT), a high-performance version of BFT, by Miguel Castro and Barbara Liskov in 1999.
Paxos**, a family of consensus algorithms by Leslie Lamport, which has its roots in a 1988 work by Dwork, Lynch, and Stockmeyer; Lamport conceived Paxos several years before finally publishing it in 1998.
Raft consensus algorithm was developed by Diego Ongaro and John Ousterhout. Published in 2014, it was designed to be a more understandable alternative to Paxos.
State machine replication (SMR), a framework for fault tolerance and consensus in which replicas resolve conflicts by reaching agreement on state values. SMR's beginnings are in the early 1980s, with an influential 1984 paper by Leslie Lamport, "Using Time Instead of Timeout for Fault-Tolerant Distributed Systems".
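The Dwork-Naor idea above can be sketched in a few lines of Python, in the style of Hashcash (the difficulty parameter is illustrative): finding a nonce whose hash meets a target is moderately expensive, but checking a claimed nonce is nearly free, and that asymmetry is what deters spammers and, later, secures blockchains.

```python
import hashlib
from itertools import count

def proof_of_work(message: str, difficulty: int) -> int:
    """Search for a nonce whose hash has `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify_pow(message: str, nonce: int, difficulty: int) -> bool:
    # Checking a claimed nonce costs a single hash.
    digest = hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = proof_of_work("hello", 4)     # moderately expensive to find...
assert verify_pow("hello", nonce, 4)  # ...but nearly free to check
```

Raising the difficulty by one hex digit multiplies the expected search work by sixteen while leaving verification cost unchanged, which is how the economic disincentive is tuned.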
In Part 3, we will do a high-level review of mechanisms designed to keep distributed systems secure, consistent, and able to handle large volumes of transactions.
*Their paper, "Pricing via Processing or Combatting Junk Mail", begins with a charming expression of exasperation: "Some time ago one of us returned from a brief vacation, only to find 241 messages in our reader."
**No known relation to the blockchain company, Paxos.com
Posted by AGORACOM-JC
at 4:03 PM on Wednesday, December 5th, 2018
Kiran Garimella
Technologies of Blockchain – Part 1: The Foundations
Blockchain is not just a single technology but a package of a number of technologies and techniques. The rich lexicon of blockchain includes terms such as Merkle trees, sharding, state machine replication, fault tolerance, cryptographic hashing, zero-knowledge proofs, zkSNARKs, and other exotic terms.
In this four-part series, we will provide a very high-level overview of each of the main components of the technology. In reality, the technologies, variations, configurations, and trade-offs to consider are numerous. Each piece in this puzzle was motivated by certain business requirements and technical considerations.
In this first part, we look at the origins of the 'chain' and the most important technological advancement that makes blockchain (and all e-commerce) possible, i.e., the Internet.
While there have been genuine innovations within the last decade, blockchain's underlying technologies are mostly quite old (on the computer science time scale). Let us unpack a typical blockchain to trace out the origins of the constituent technologies. In this short post, I'll only point to a very small (some may say, infinitesimally small) subset of the historical origin of technologies that make the modern blockchain possible. I'll make no attempt to trace the development of these concepts from origin to the present time (that would fill up several books). The fact that blockchain's technologies have a long and respectable history should help us gain confidence that blockchain, as a technology, is not some fly-by-night, newfangled idea cooked up by the crypto fandom.
What is less certain and much more controversial is the economic justification for blockchain (or at least some types of blockchain), ranging from the unrealistic expectation that it is a panacea for all of humankind's ills (most optimistically, for social and economic inequities), to the total and premature dismissal of blockchain in its entirety.
The Beginnings
At the conceptual heart of blockchain is the 'chain'. By definition, the links of the chain are, well, linked. It's a list of data elements or packets of information (in blockchain, these are called 'blocks') that are linked. A blockchain is, therefore, a type of linked list.
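That definition fits in a few lines of Python (a conceptual sketch, not any real blockchain's data structure): each block holds some data plus a reference to the previous block, and following the references walks the whole chain.

```python
class Block:
    """A 'block' is just a package of data plus a link to the previous
    block -- structurally, a linked list."""
    def __init__(self, data, prev=None):
        self.data = data
        self.prev = prev   # the linking mechanism: a reference to the prior block

genesis = Block("first entry")
tip = Block("second entry", prev=genesis)
tip = Block("third entry", prev=tip)

# Walking the links recovers the whole chain, newest to oldest.
block, entries = tip, []
while block is not None:
    entries.append(block.data)
    block = block.prev
print(entries)
```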
The concept of a linked list was defined by pioneers of computer science and artificial intelligence, Allen Newell, Cliff Shaw, and Herbert Simon, way back in 1955-56.
In the early days of computer science, data and processing power lived on individual computers. Soon, people wanted these computers to 'talk' to each other. The grand idea of an Intergalactic Computer Network was put forth by J. C. R. Licklider as early as 1963. Unfortunately, even after half a century of rapid development, we have achieved only a planetary-wide Internet so far. An 'intergalactic' network is still a few years away!*
These ideas and the need to connect dispersed computers gave rise to wide-scale distributed systems in the 1960s-70s, with the advent of ARPANET and Ethernet. Technically, these linked computers are not necessarily treated in the same way as a traditional linked list that lived on one computer, but the conceptual idea is similar. When data and computational power get dispersed, layers of management, coordination, and security become increasingly important.
Blockchain would not exist without the Internet, which itself would not exist without TCP/IP, developed by Bob Kahn and Vint Cerf in the 1970s and '80s. Along the way, some scientists managed to have some fun too. They carried out an April Fools' prank in 1990 by issuing an RFC (1149) for the IPoAC protocol (IP over Avian Carriers, i.e., carrier pigeons). The punch line was delivered in April 2001, when a Linux user group implemented CPIP (Carrier Pigeon Internet Protocol) by sending nine data packets over three miles using carrier pigeons. They reported a packet loss of 55%. A joke that takes a decade to pull off is practically Saturday Night Live comedy on the Internet time scale!
In Part 2, we will see how extending the concept of the linked list onto the Internet leads to distributed systems, the attendant challenges, and their solutions.