Conclusion

We have explored the thesis that we are at the start of a new era of computing resulting from the confluence of a number of megatrends (edge computing, AI/ML and decentralised data), and that value capture mechanisms for these cryptoassets are steadily turning from an art to a science. These new tokenised representations of ownership, when deployed in networks of users, machines and other stakeholders, have birthed a whole cadre of decentralised value accrual models in addition to today's well understood centralised business models.

Just as Rome was not built in a day, we believe there is a long journey ahead to build a scalable, secure and private Web3.0 - starting with the technical infrastructure, developer tools and data management frameworks. To support this vision, Fabric Ventures is adapting the patient venture capital model to investing in decentralised networks: backing the boldest technologists & communities at the earliest stages, supporting them throughout their journey, and becoming active participants within the networks they are building.

In Part II we will share some of the nuts and bolts of how Fabric seeks to be selected as a partner by the most discerning entrepreneurs, deliver value to their networks, and more generally push forward the Web3.0 vision.

Contributors

As ever, we are grateful to our friends for inspiration, helpful discussions, and honest feedback on early drafts. All errors and omissions are ours.

We are also grateful to TokenData and TokenAnalyst for providing their data & analysis for the quantitative part of the report.

TokenData is a free platform tracking all publicly available data (qualitative & quantitative) on token sales. TokenData is unaffiliated with mainstream media, cryptocurrency news outlets or token sales. They distribute a bi-weekly data newsletter read by a wide spectrum of people interested in the blockchain space, ranging from cryptocurrency hobbyists to prominent VCs and national regulators.
www.tokendata.io

TokenAnalyst aims to bring transparency to the decentralised economy. They process and analyse every transaction on the blockchain itself, using cutting-edge machine learning techniques to derive data-driven insights and metrics that enable investors, developers, and other stakeholders in this growing economy to fundamentally understand and value the plethora of cryptoassets available today.
www.tokenanalyst.io
State of the Token Market