Infrastructure Testing for Decentralized Applications built on Blockchain or Distributed Ledger Platform

Background: Ethereum was the first programmable blockchain platform offered to the developer community, allowing business logic to be written as Smart Contracts and thereby enabling decentralized applications for virtually any business use case. Once a Smart Contract is developed, it is compiled and deployed to the blockchain; after deployment, the contract is assigned an address through which its methods can be executed via an abstraction layer built over the ABI. Web3, built for Ethereum, is the most popular library for interacting with a local or remote node participating in the underlying blockchain network.

Define the Decentralized Application Architecture for Testing: Testing any decentralized application built on a blockchain platform is not only highly complex but also requires a specialized skill set and the analytical mindset of white-box testers. At Magic FinServ, we have rich hands-on experience with some of the most complicated aspects of testing blockchain-based decentralized applications. Based on this experience, our strategy divides a blockchain-based decentralized application into three isolated layers from a testing perspective –

1. Lowest-Layer – Blockchain platform on which smart contracts are executed.

a. Ethereum Virtual Machine

b. Encryption (Hashing & Digital Signature by using cryptography)

c. P2P Networking

d. Consensus Algorithm

e.  Blockchain Data & State of the network (Key-Value storage)

2.  Middle-Layer – Business Layer (Smart Contract) to build business logic for business use cases

a. Smart Contract development – Smart Contract Compilation, Deployment & Execution in Test Network

b. Smart Contract Audit

3. Upper-Layer – API layer for Contracts, Nodes, Blocks, Messages, Accounts, Key Management & miscellaneous endpoints, providing an interface to execute business logic and query the state of the network at any given point in time. These interfaces can also be embedded between upstream & downstream systems.
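
To make the lowest layer concrete, the sketch below (an illustrative Python simplification, not Ethereum's actual implementation, which uses RLP encoding and Keccak-256) shows how hashing links blocks of key-value state so that tampering is detectable:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministically hash a block's contents (simplified stand-in for Keccak-256 over RLP).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, state_update: dict) -> None:
    # Each new block commits to the hash of its predecessor.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "state": state_update})

chain: list = []
append_block(chain, {"balances": {"alice": 100}})
append_block(chain, {"balances": {"alice": 60, "bob": 40}})

# Any tampering with block 0 breaks the hash link stored in block 1.
tampered = dict(chain[0], state={"balances": {"alice": 999}})
assert chain[1]["prev_hash"] == block_hash(chain[0])
assert chain[1]["prev_hash"] != block_hash(tampered)
```

This is the property that makes "Blockchain Data & State" testable: a verifier only needs the previous block's hash to detect any rewrite of history.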

Based on these defined components of blockchain, we build an encompassing generic testing strategy for each layer in two broad categories –

1. Functional: As the name suggests, this category ensures that every component in each layer functions as per the acceptance criteria defined by the business user, technical analyst, or business analyst. We prefer to include system/integration testing under this category, to verify that the components not only work as defined within each layer but also, as a complete system, accomplish the overall business use case.

2. Non-Functional: This category covers all testing other than functional testing, such as performance, security, usability, volatility & resiliency testing, not only at the node level but also at the container level if Docker is used to host any service.

In defining the generic testing strategy for these two broad categories, we recognize that the infrastructure needs to be set up first & that it will not be the same all the time. Before moving ahead, we need to answer a few questions –

Question1: Why is setting up the infrastructure the most critical & essential activity when strategizing blockchain-based application testing?

Question2: What potential challenges do testers face while setting up the infrastructure?

Question3: What solutions help testers overcome these infrastructure-setup challenges?

To answer the first question:

We need to step back and consider how we test any software application in the traditional approach. To start testing, an environment has to be set up, but that is a one-time activity unless the development team makes a significant change to the underlying platform or technology, which happens very rarely. So testers can continue testing without worrying much about infrastructure.

The two core concepts of blockchain technology are P2P networking & consensus algorithms. Testing these two components depends heavily on the infrastructure setup, i.e., how many nodes we need to test one feature or another.

For P2P, we need varying numbers of connected nodes, with an automated way to kill nodes randomly & then observe the behavior under each condition.
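
One simple way to automate this kill-and-observe loop is a small chaos-style harness. The sketch below (a self-contained Python simulation; a real setup would stop Docker containers or cloud instances instead) kills random nodes and checks whether the surviving peers still form one connected network:

```python
import random

class SimulatedNode:
    def __init__(self, name: str):
        self.name = name
        self.alive = True
        self.peers: set = set()

def kill_random_nodes(nodes, count, rng=random):
    # Randomly stop `count` live nodes, as a volatility test would.
    victims = rng.sample([n for n in nodes if n.alive], count)
    for victim in victims:
        victim.alive = False
        for peer in victim.peers:
            peer.peers.discard(victim)
    return victims

def is_connected(nodes) -> bool:
    # Depth-first search over surviving peers: one component means no partition.
    live = [n for n in nodes if n.alive]
    if not live:
        return False
    seen, stack = set(), [live[0]]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(node.peers)
    return len(seen) == len(live)

# Example: a five-node ring survives the loss of any single node.
rng = random.Random(7)
nodes = [SimulatedNode(f"node{i}") for i in range(5)]
for i, node in enumerate(nodes):
    node.peers = {nodes[(i - 1) % 5], nodes[(i + 1) % 5]}
kill_random_nodes(nodes, 1, rng)
assert is_connected(nodes)
```

The same harness, pointed at real node processes, becomes the automated kill-and-observe loop described above.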

For consensus, it depends on which consensus algorithm is used; based on its nature, different types and numbers of nodes will be needed.
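
For a BFT-style consensus, for example, the node counts follow directly from the fault-tolerance target. The sketch below (a generic arithmetic illustration, not tied to any specific consensus product) captures the classic 3f + 1 bound that drives how many nodes a test network needs:

```python
def min_nodes_for_bft(faulty: int) -> int:
    # Classic Byzantine fault-tolerance bound: tolerating f faulty nodes needs 3f + 1 total.
    return 3 * faulty + 1

def has_quorum(votes_for: int, total_nodes: int) -> bool:
    # A BFT quorum requires strictly more than two-thirds of all nodes to agree.
    return 3 * votes_for > 2 * total_nodes

# Tolerating one faulty node requires a four-node test network;
# three agreeing nodes out of four form a quorum, two do not.
assert min_nodes_for_bft(1) == 4
assert has_quorum(3, 4) and not has_quorum(2, 4)
```

So a test plan targeting "survives two Byzantine nodes" immediately implies a seven-node topology, which is exactly why consensus testing is so sensitive to infrastructure setup.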

Another differentiating factor, not applicable to public blockchains but with a significant impact on private blockchains, is assigning different permissions to different nodes.

There is a frequent requirement to keep changing network topology for verifying most of the features of decentralized applications.

By now, we know how important it is to change the network topology for each feature; otherwise, testing would not be as effective. As we know, a blockchain network is a collection of many machines (a.k.a. nodes) connected via peer-to-peer networking. It is therefore a priority to automate the infrastructure required to mimic the network topology needed for testing.
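
As a starting point for such automation, topology generation itself is easy to script. The sketch below (illustrative Python; the enode-style URLs and addresses are hypothetical placeholders, not real node keys) emits per-node peer lists for ring and full-mesh topologies that a provisioning script could render into each node's static-peers configuration:

```python
def ring_topology(n: int) -> dict:
    # Each node peers with its two ring neighbours.
    return {f"node{i}": [f"node{(i - 1) % n}", f"node{(i + 1) % n}"] for i in range(n)}

def full_mesh_topology(n: int) -> dict:
    # Every node peers with every other node.
    return {f"node{i}": [f"node{j}" for j in range(n) if j != i] for i in range(n)}

def render_static_peers(topology: dict, base_port: int = 30303) -> dict:
    # Render hypothetical enode-style URLs per node (placeholder IDs/IPs for illustration).
    index = {name: i for i, name in enumerate(topology)}
    return {
        name: [f"enode://{peer}@10.0.0.{index[peer] + 2}:{base_port}" for peer in peers]
        for name, peers in topology.items()
    }

mesh = full_mesh_topology(4)
assert all(len(peers) == 3 for peers in mesh.values())
```

Parameterizing the test suite on a topology function like these is what lets every feature get the network shape it needs without manual setup.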

To answer the second question:

1. If we manually spin up instances – let’s assume five instances – with the required software & user setup, we need to spend almost 2–3 hours per instance.

2. Manually setting up machines is highly error-prone & mundane. Even simple automation does not help unless the automation framework is intelligent enough to understand the need for different network topologies.

3. With agile adoption, spending so much time just setting up infrastructure is not acceptable, as the testing team usually does not have that much time to complete testing within a given sprint.

4. Every tester has to understand the infrastructure setup, since all of them need to change the network topology to test most features. Finding testers with good infrastructure knowledge is also challenging.

5. An invalid network topology usually does not show an immediate failure, due to the inherent nature of blockchain. Eventually, an incorrect topology leads to time and effort spent on unproductive testing without finding any real bugs.

6. A high defect-rejection ratio by the development team, caused either by an incorrect network topology or by incorrect peering among nodes.

To answer the third question:

There are four ways to set up a network of nodes –

1. Physical machines with LAN

2. Virtual Machines

3. Docker containers on the same machine, each of which can be treated as an isolated machine

4. Cloud Platform

We use Docker containers & cloud platforms to set up the infrastructure for testing blockchain-based applications, as setting up physical machines or virtual machines is not viable from a maintenance perspective.

Physical machines with LAN: Setting up a blockchain network with physical devices is tough, & scalability is challenging since we need additional devices to achieve the desired testing quality. During infrastructure testing, we need to make machines (a.k.a. nodes) go up & down to verify volatility, which is a cumbersome process for the tester. A network of physical devices also requires physical space and regular maintenance. We usually do not recommend this option; however, if a customer requires testing to be done in such a manner, we can define a detailed strategy to execute it.

Virtual Machines: Compared to a network of physical machines, virtual machines have many advantages. However, increasing the number of VMs on an underlying device complicates matters, since maintaining VMs is not user-friendly, and the resource limits of each VM must be hardcoded beforehand. Combining Option 1 and Option 2 (multiple physical machines, each running multiple VMs) is a better choice, although it still requires a lot of maintenance and carries overheads that act as a time sink for the tester. As reducing time to test is a critical aspect of quality delivery, we focus on saving as much time as possible so it can be invested in higher-value elements of testing.

The advantage of a cloud platform lies in the ability to spin up as many machines as needed without maintenance overheads or any other physical activity. However, maintaining such a network of multiple machines in the cloud is still an uphill task. Eventually, we evaluated Option 3 with Docker and concluded that by combining Option 3 (Docker containers) with Option 4 (cloud platform), we could create a solid strategy for infrastructure testing that overcomes these problems.

Having tried the various options individually and in combination in real-world engagements, our recommended approach is as follows.

Always perform sanity/smoke testing for any build with Docker containers. Once all sanity/smoke tests pass, switch to the cloud to replicate the network topology required for functional testing of new enhancements & features.
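
This flow can be expressed as a simple gate in a test pipeline. The sketch below (an illustrative Python skeleton with hypothetical callables, not a real CI configuration) provisions the cloud topology only after the local Docker smoke run passes:

```python
def run_pipeline(smoke_tests, provision_cloud_topology, functional_tests):
    # Stage 1: cheap, fast smoke tests against local Docker containers.
    failures = [name for name, passed in smoke_tests() if not passed]
    if failures:
        # Fail fast: report to dev immediately, never provision (or pay for) cloud nodes.
        return {"stage": "smoke", "failed": failures}
    # Stage 2: only a healthy build earns the cloud-based network topology.
    env = provision_cloud_topology()
    return {"stage": "functional", "results": functional_tests(env)}

# Hypothetical stand-ins for the real test and provisioning hooks:
bad_build = run_pipeline(
    lambda: [("node-boot", True), ("rpc-up", False)],
    lambda: "cloud-env",
    lambda env: "all-pass",
)
assert bad_build == {"stage": "smoke", "failed": ["rpc-up"]}
```

The gate is what delivers the cost and turnaround benefits listed below: a failing build never reaches the cloud stage at all.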

The advantages of our approach are –

1. Build-failure issues are found sooner & can be reported back to the dev team without the delay that provisioning cloud infrastructure introduces. Before taking this approach, we had to spend 2–3 hours to report any build-failure bug, whereas the same can now be caught in 5–10 minutes, as we always run a selective set of test cases under sanity/smoke.

2. Cloud infrastructure costs are saved when a build fails, as no cloud uptime is incurred for a failed build.

3. The testing team saves a lot of time otherwise spent on infrastructure setup.

4. The dev team gets more time to fix issues found in sanity/smoke testing, as they are reported within minutes.

5. A significant reduction in bugs rejected by the development team.

6. The percentage of builds delivered on time, without any major bug, has increased significantly.

A deeper dive into the testing strategy using Docker containers with a cloud platform will be covered in an upcoming blog, followed by our automation framework for infrastructure-setup testing. We will also answer the questions most frequently asked by customers –

Question1: Why should customers be concerned about Infrastructure testing for decentralized applications?

Question2: Why should customers look for an automated way of Infrastructure Setup testing for blockchain-based decentralized applications?

Stay tuned for the upcoming blogs and follow us on LinkedIn for timely updates.  

Building Smart Contracts using DAML: A comparative exploration

What are Smart Contracts?

Smart contracts are translations of an agreement, including its terms and conditions, into computational code. Blockchain follows the “no central authority” concept, and its primary purpose is to maintain transaction records without a central authority. Smart contracts were introduced to automate this process of recording transactions. They carry several beneficial traits, including automation, immutability, and self-execution.

What is a DAML Smart Contract?

DAML is an open-source language from Digital Asset created to support distributed ledger technology (DLT). It allows multiple parties to transact in a secure and permissioned way. Using DAML as the coding language enables developers to focus on building the smart contracts’ business logic rather than fretting about the nitty-gritty of the underlying technology. DAML smart contracts run on various blockchain platforms and on regular SQL databases while providing the same logical privacy model and immutability guarantees.

A personal perspective

As a technology leader with years of experience, one line I remember from my early days is Sun Microsystems’ “Write Once, Run Anywhere” slogan, coined to highlight the Java language’s cross-platform benefits.

I believe that in the coming years, DAML will enjoy popularity similar to Java’s, thanks to its cross-platform benefits, ease of use, and versatility. DAML can revolutionize how business processes are managed by simplifying contracts and making them ‘smarter.’

Comparing DAML

DAML V/S General Purpose Languages

Today, a few popular general-purpose languages are in use for creating multi-party blockchain applications, e.g., Java, Go, and Kotlin.

All of these can also be used to create smart contracts, but the challenge lies in the sheer complexity of the task at hand. The code needed to build an effective smart contract in these languages is daunting. DAML can achieve the same result with 5–7 times less code, in a much simpler manner.

Smart-contract basic data types are contract-oriented (parties, observers, signatories, and others), in direct contrast to general-purpose languages (int/float/double). The very essence of a smart-contract language such as DAML is one thing only – contracts – making it a superior choice for writing smart contracts.

Comparison with Existing Smart Contract Languages

The domain of smart contracts is better handled by languages purpose-built for them, such as Solidity, DAML, and others. Among these smart-contract languages, DAML is the only one that is open source and Write Once, Run Anywhere (WORA). DAML contracts are also private in nature: at a logical level, DAML enforces a strict privacy policy, while at the persistence level, different ledgers might implement privacy in different ways.

DAML for Private and Public Ledgers

The two types of ledgers, private and public, serve different purposes and should be used accordingly. The underlying concept is that information on the ledgers is immutable once created.

Public Ledgers: Open to all; anybody can join the system, and each member can access, read, and write transactions. Examples: Bitcoin, Ethereum, and others.

Private Ledgers: Also known as permissioned networks or permissioned blockchains, these limit participation; they offer higher security and restricted permissions. Examples: Hyperledger Fabric and Corda. Some private ledgers offer different privacy and data-sharing settings; Hyperledger Sawtooth, for instance, although permissioned, allows all nodes to receive a copy of all transactions.

DAML, being an open-source language, allows the involved parties to transact in a secure and permissioned way, enabling developers to focus on the business logic rather than spending precious time fine-tuning the underlying persistence technology.

Sample of a Smart Contract: 

Reporting a trade transaction between two counterparties to a regulator or reviewing authority using smart contracts:

module Finance where

template Finance
  with
    exampleParty : Party
    exampleParty2 : Party
    exampleParty3 : Party
    regulator : Party
    exampleParameter : Text
    -- more parameters here
  where
    signatory exampleParty, exampleParty2
    observer regulator

    controller exampleParty can
      UpdateExampleParameter : ContractId Finance
        with
          newexampleParameter : Text
        do
          create this with
            exampleParameter = newexampleParameter

template name – the template keyword defines a template, followed by the names of the parameters and their types.

template body – introduced by the where keyword, it can include:

template-local definitions – the let keyword lets you make definitions that have access to the contract arguments and are available in the rest of the template definition.

signatories – the signatory keyword (required). The parties (see the Party type) who must consent to the creation of an instance of this contract. You won’t be able to create an instance of this contract until all of these parties have authorized it.

observers – the observer keyword (optional). Parties that aren’t signatories but who you still want to be able to see this contract. For example, if the SEC wants to know about every contract created, it can be added as an observer.

agreement – optional text that describes the agreement this contract represents.

Explanation of the code snippet

DAML is whitespace-aware and uses layout to structure blocks. Everything below the first line is indented and is thus part of the template’s body.

The signatory keyword specifies the signatories of a contract instance. These are the parties whose authority is required to create the contract or archive it again – just like a real contract. Every contract must have at least one signatory.

Here the contract is created between two parties, exampleParty and exampleParty2, with the regulator as the observer. Every transaction is visible to the observer, i.e., the regulator playing the role of the SEC in this case, so the SEC can look at every transaction. This is the space in which such smart contracts can be created.

DAML’s disclosure policy ensures that exampleParty3 will not be able to view the transactions, as it is neither a signatory, an observer, nor a controller; it is merely declared as a party in the template.

Here are links to repositories provided by Digital Asset containing examples of several such use cases modeled in DAML:

  1. How to write smart contracts using DAML, with various use cases: https://github.com/digital-asset/ex-models

  2. A ledger implementation enabling DAML applications to run on Hyperledger Fabric 2.x: https://github.com/digital-asset/daml-on-fabric

Compilation and Deployment of DAML

DAML provides both the language and the runtime environment (in the form of libraries known as the DAML SDK). Developers can focus on writing smart contracts (using the language features provided by the DAML SDK) without bothering about the underlying persistence layer. It supports existing data structures (List/Map/Tuple) and also provides functionality for creating new data structures.

Other notable features of DAML 

  • A .dar file is the result of compilation done through the DAML Assistant; .dar files are uploaded to the ledger so that contracts can be created from the templates in the file. A .dar is made up of multiple .dalf files. A .dalf file is the output of a compiled DAML package or library, and its underlying format is DAML-LF.
  • Sandbox is a lightweight, in-memory ledger implementation available in the dev environment.
  • Navigator is a tool for exploring what is on the ledger; it shows which contracts are visible to different parties and can submit commands on behalf of those parties.
  • DAML gives you the ability to deploy your smart contracts on the local (in-memory) ledger so that various scenarios can be easily tested.

Testing DAML Smart Contracts

1) DAML has a built-in mechanism for testing templates called ‘scenarios’. Scenarios emulate the ledger: one specifies a linear sequence of actions that various parties take, and these are evaluated with the same consistency, authorization, and privacy rules as on the sandbox ledger or a ledger server. DAML Studio shows the resulting transaction graph.

2) Magic FinServ has launched its test-automation suite, the Intelligent Scenario Robot, or IsRobo™, an AI-driven scenario generator that helps developers test smart contracts written in DAML. It generates unit-test scenarios (positive and negative cases) for a given smart contract without any human intervention, purely based on AI.

Usage in Capital Markets

Smart contracts, in general, have excellent applications across the capital-markets industry. In subsequent blogs, I shall cover use cases outlining how multi-party workflows within enterprises can benefit by minimizing data reconciliation between them and by mutualizing business processes. Some popular applications currently being explored by Magic FinServ are:

  • Onboarding KYC
  • Reference data management
  • Settlement and clearing for trades
  • Regulatory reporting
  • Option writing contracts (Derivatives industry)

Recent Noteworthy Implementations of DAML Smart Contracts are: 

  • International Swaps Derivatives Association (ISDA) is running a pilot for its Common Domain Model (CDM) for clearing of interest rate derivatives using a distributed ledger.
  • The Australian Securities Exchange (ASX) and Swiss investment bank UBS continue to provide inputs to validate the CDM’s additional functionality alongside ISDA and Digital Asset.

DAML Certification process

To get hands-on experience with DAML, free access to docs.daml.com is available, where developers can use the study material, download the runtime, and build sample programs. However, to reinforce learning and add a valuable skill, it is better to become a DAML-certified engineer. Certification is worth pursuing, as the fee is reasonable and the benefits are manifold. DAML developers are not plentiful in the market, so it is a sought-after skill as well.

Conclusion

DAML is ripe for revolutionizing the way business processes are managed and transactions are conducted. 

The smart contracts that are developed on open-source DAML can run on multiple DLTs / blockchains and databases without requiring any changes (write once, run anywhere). 

With the varied applications and relative ease of learning, DAML is surely emerging as a significant skill to add to your bouquet if you are a technologist in the capital markets domain. 

To explore the DAML applications with Magic FinServ, read more here

To schedule a demo, write to us mail@magicfinserv.com 

Revolutionary use of Smart Contracts to solve your Enterprise Reconciliation and Application Synchronicity Challenges

Until recently, your enterprise may have considered smart contracts as a tool to bridge silos from one organization to another, i.e., to establish external connectivity over blockchain. However, what if the same concept could be applied within a firm to address enterprise-wide data reconciliation and system integration/consolidation challenges, expediting time to market and streamlining reporting (internal, regulatory, FP&A, supplier risk)?

After all, about 70–80% of reconciliation activity takes place within the enterprise. The best part? A firm can do this with minimal disruption to its current application suite, operating systems, and tech stack. We will look at traditional approaches and explain how smart contracts are the way to start a journey from which one never looks back.

To set the stage, let’s cover the self-evident truths. Reconciliation tools are expensive, and third-party tool implementations typically require multi-year (and multi-million-dollar) investments. Over 70% of reconciliation requirements are within the enterprise, among internal systems. Most reconciliation resolutions start with an unstructured data input (PDF/email/spreadsheet) that requires manual review/scrubbing before it can be ingested. For mission-critical processes, this “readiness of data” lag can result in delays and lost business, backlogs, unjustifiable cost, and, worst of all, regulatory penalties.

Magic Finserv proposes a three-fold approach to take on this challenge. 

  1. Data readiness: Tackle the unstructured data problem using AI and ML utilities that can access data sources and ingest them into a structured format. Reconciliation is often necessary because of incorrect or incomplete data; ML can anticipate what is wrong or missing based on past transactions and remediate it. This is auto-reconciliation.
  2. Given that unstructured data elements may reside in fragmented platforms or organizational silos, the firm must have an intelligent way of integrating and mutualizing them with minimal intervention. An ETL or data feed may look appealing initially; however, these are error-prone and do not remediate the manual reconciliation tasks of exception management. Alternatively, a smart-contract-based approach can streamline your rule-based processes to create a single data source.
  3. Seamless integration to minimize the disconnect between applications. The goal is to create an environment where, ideally, reconciliation is no longer required.
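
To illustrate the rule-based core of step 1 (before any ML is layered on top), a minimal matching pass might look like the following sketch, assuming both sources have already been ingested into structured records; the field names are hypothetical:

```python
def reconcile(source_a, source_b, key):
    # Index both sources on the matching key, then classify each record.
    a = {rec[key]: rec for rec in source_a}
    b = {rec[key]: rec for rec in source_b}
    shared = a.keys() & b.keys()
    matched = {k: a[k] for k in shared if a[k] == b[k]}          # clean matches
    breaks = {k: (a[k], b[k]) for k in shared if a[k] != b[k]}   # same key, differing fields
    only_a = [a[k] for k in a.keys() - b.keys()]                 # orphans on side A
    only_b = [b[k] for k in b.keys() - a.keys()]                 # orphans on side B
    return matched, breaks, only_a, only_b

trades_internal = [{"id": "T1", "amount": 100}, {"id": "T2", "amount": 50}, {"id": "T3", "amount": 75}]
trades_custodian = [{"id": "T1", "amount": 100}, {"id": "T2", "amount": 55}, {"id": "T4", "amount": 20}]
matched, breaks, only_a, only_b = reconcile(trades_internal, trades_custodian, "id")
assert set(matched) == {"T1"} and set(breaks) == {"T2"}
```

The breaks and orphans are exactly the exception queue that steps 2 and 3 aim to shrink, first by mutualizing the data source and ultimately by removing the need to reconcile at all.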

We have partnered with Digital Asset to outline a solution that brings together an intelligent data-extraction tool, a DAML smart contract, and a capital-markets-focused integration partner to reduce end-to-end manual reconciliation challenges for the enterprise.

Problem statement & Traditional Approach

Given that most enterprise business processes run through multiple disparate applications, each with its own database, a monolithic application approach has proven close to impossible, and it is not recommended given the well-known issues with monolithic architectures. Traditionally, this challenge has been addressed with integration tools such as an enterprise service bus (ESB) or SOA, where the business gets consumed in a cycle of data aggregation, cleansing, and reconciliation. Each database becomes a virtual pipeline of a business process, and an additional staging layer is created to deliver internal/external analytics. In addition, these integration tools are not intelligent: they only capture workflows with adapters (ad hoc business logic) and do not offer privacy restrictions from the outset.

Solution

Digital Asset’s DAML on X initiative extends the concept of the smart contract to multiple platforms, including databases. DAML on X interfaces with the underlying databases using standard protocols; the smart contract mutualizes the data-validation rules as well as the access privileges. Once you create a DAML smart contract, the integrity of the process is built into the code itself, and the DAML runtime makes communication across disparate systems seamless. It is in DAML’s DNA to function as a platform-independent programming language designed specifically for multi-party applications.

Without replacing your current architecture, such as the ESB or your institutional vendor-management tool of choice, you can use the DAML runtime to make application communication seamless and have your ESB invoke the necessary elements of your smart contract via exposed APIs.

Handling Privacy, Entitlements & Identity Management

Every party in the smart contract has a “party ID” that plugs directly into the identity-management solution you use institutionally. You can even embed “trustless authentication”.

The idea is that entitlements, rights & obligations are baked directly into the language itself, as opposed to a typical business-process-management tool, where you build out your business process and then marry in the entitlements at phase 3 of the process, only to realize that the workflow needs to change.

DAML handles this upfront: all of the authentication is taken care of by the persistence layer/IDM that you decide on. The smart-contract template represents a data schema in a database, and the signatories/controllers in our example represent role-level permissioning of who can do what, who can see what, and when.

 The image below shows how the golden source of data is generated.


It is a purpose-built product that contains automatic disclosures and privacy parameters out of the box. You do not need to keep checking your code to see whether the party exercising a command is actually allowed to see the data; all of this is within the scope of the DAML runtime.

Already kickstarted your enterprise blockchain strategy?

Firstly, amazing! Secondly, since DAML smart contracts can run on databases or distributed ledgers of your choice (Fabric, Corda, etc.), it is a unique solution that gives you the flexibility to start building the application and even change the underlying ledger at any point. You can also integrate between multiple instances: if you are running one DAML app on Fabric and another on Corda, both apps can talk to one another.

The key takeaway here is that most enterprises are held up determining which ledger meets their needs. With DAML’s intuitive, business-workflow-focused approach, developing your DAML applications while you select your ledger fabric can expedite revenue capture, enable consistent enterprise reporting, and reduce the burden of reconciliation; the solution, from the smart contract through to the integration layer, is completely portable.

STO Processing Lifecycle (Part 1) -An overview of STO Lifecycle in Primary and Secondary markets

Security Token Offerings (STOs) seem to be the next wave in utilizing blockchain concepts, transforming current security instruments (equity, debt, derivatives, etc.) into digitized securities. STOs have gained popularity and momentum recently due to the lack of regulation in the ICO world, with many outliers among the ICOs of 2018. Many startups are building platforms on programmable blockchains like Ethereum, and recent developments in STO platforms seem to be moving toward building an ecosystem with defined standards as well. The traction on GitHub around standards like ERC-1400, ERC-1410, ERC-1594, ERC-1643, and ERC-1644 gave us, a technology company specialized in ensuring quality standards for blockchain-based applications on Ethereum, the opportunity to think about how we can contribute. We started our journey by defining the complete STO processing cycle in the context of real-world usage (from a functional perspective) on the underlying Ethereum platform (from a technology perspective).

Before defining the STO lifecycle, it is important to list all the major participants & their roles:

  1. Issuers – Legal entities who develop, register & sell securities to raise funds for their business expansion.
  2. Investors – Entities who are ready to invest in securities expecting financial returns.
  3. Legal & Compliance Delegates – Entities that ensure all participants & processes comply with the rules & regulations defined by the jurisdiction.
  4. KYC/AML Service Providers – Entities that provide KYC/AML checks for the required participants.
  5. Smart contract development communities, such as developers, smart contract auditors & QA engineers.

Most companies claiming to provide STO platforms use Ethereum as the underlying programmable blockchain, with few exceptions. The rationale for Ethereum as the first choice: it is a Turing-complete platform for building complex decentralized applications by defining logic inside smart contracts (Solidity being the most popular programming language among developer communities). In parallel, Ethereum keeps maturing, becoming more secure, and improving performance and scalability through a stream of new features and improvements. Very few platforms other than Ethereum are used to build STO processing platforms, and some companies are trying to build completely new blockchain platforms designed specifically for STOs. The last approach seems too optimistic, as building such a system might take years, whereas the current momentum around STOs does not seem likely to wait that long.

Based on the above, we can now define the generic STO lifecycle from a functional standpoint in two phases:

  1. Primary Market – to issue the STO (Issuer) and to invest in the STO (Investor)
  2. Secondary Market – to trade the STO either on exchanges or over the counter

Primary Market

  1. To issue the STO (Issuer) –
    1. Registration of the Issuer
    2. Creation of the STO token
    3. Approval from Legal & Compliance for the STO
    4. Issuance of the STO post Legal & Compliance approval
  2. To invest in the STO (Investor) –
    1. Registration of the Investor
    2. KYC/AML for Investors
    3. Whitelisting of Investors for STOs post KYC/AML
    4. Investment in allowed STOs based on whitelisting of the corresponding STO

Before we get into the technical insight of the underlying blockchain technology, we need to define the STO platform's technical architecture from a user perspective. Every STO platform in existence today has –

  1. A Presentation layer (User Interface with any chosen front end technology, Integration with Wallets)
  2. A Business Layer (JS libraries to provide an interface to interact with Smart Contracts)
  3. A Data Layer (Ethereum data storage in blocks in the form of key-value storage)

Now let’s define a high-level overview from a technical standpoint, assuming the STO platform uses Ethereum as the underlying blockchain platform (and that the backend layer has already been set up) –

  1. Creation of an external account for each participant, to bring everyone onto the Ethereum blockchain
  2. Definition of off-chain and on-chain transactions for all activities defined for Issuers and Investors
  3. Merging of off-chain data with on-chain data
  4. Development of Smart Contracts

a. A standard smart contract built for each STO, depending on the jurisdiction, covering generic processes among the required participants

b. An STO-specific smart contract implementing business/regulation rules

c. Smart contracts holding all business logic, especially for transaction processing
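To make the split between generic and STO-specific contracts concrete, here is a minimal Solidity sketch in which the token delegates its regulation check (here, investor whitelisting) to a separate contract that a compliance role can administer. All contract and function names are illustrative assumptions, not taken from any production platform.

```solidity
pragma solidity ^0.5.0;

// Illustrative whitelist registry, kept outside the token contract so
// jurisdiction-specific rules can be upgraded independently.
contract InvestorWhitelist {
    address public complianceOfficer;
    mapping(address => bool) public isWhitelisted;

    constructor() public {
        complianceOfficer = msg.sender;
    }

    // Only the compliance role may whitelist investors post KYC/AML.
    function setWhitelisted(address investor, bool allowed) external {
        require(msg.sender == complianceOfficer, "not authorised");
        isWhitelisted[investor] = allowed;
    }
}

// Illustrative STO token that enforces the whitelist on every transfer.
contract RestrictedToken {
    InvestorWhitelist public whitelist;
    mapping(address => uint256) public balanceOf;

    constructor(InvestorWhitelist _whitelist) public {
        whitelist = _whitelist;
        balanceOf[msg.sender] = 1000000; // initial issuance to the issuer
    }

    function transfer(address to, uint256 value) external {
        require(whitelist.isWhitelisted(to), "recipient not whitelisted");
        require(balanceOf[msg.sender] >= value, "insufficient balance");
        balanceOf[msg.sender] -= value;
        balanceOf[to] += value;
    }
}
```

Separating the whitelist from the token mirrors the division above: the registry encodes jurisdiction-dependent process, while the token carries the transaction-processing logic.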

Based on the expertise of our group, Magic FinServ can contribute significantly to the development of Smart Contracts (written in Solidity) along with the auditing of contracts.
For more details visit https://www.magicblockchainqa.com/our-services/#smart-contract-testing

In the next part, we will detail the high-level technical overview above alongside the functional overview, followed by deeper insight into these functional & technical flows.

Why does it matter for a technology company like us to embrace global tech standards?

When the ERC-20 standard came into existence, it eased ICO token interoperability across wallets & crypto exchanges for all ERC-20 compliant tokens. Having standards for any process not only widens acceptance but also improves the interoperability needed to build an ecosystem. As a technology-obsessed firm, we have always encouraged standards to be in place. An accepted standard not only helps developers (one of the strongest stakeholder groups in the ecosystem, responsible for delivering workable solutions with the available technology) build the ecosystem, but also minimizes the changes required to implement interoperability. No system in existence today is free from errors or failures in real-world usage; using global standards provides another vital advantage here, since such errors and failures will often already have been resolved by the tech fraternity.

Today, it is of utmost importance to have standards that can not only integrate multiple systems (STO platforms, wallets, and exchanges) with minimal changes but also make security tokens easily interoperable across wallets and exchanges. Security Token Offerings cannot be an exception to having standards, given that they represent the biggest and most complicated technological advancement for transforming existing securities into digitized securities with automated processing over blockchain technology.

The recent traction on ERC-1400 (now moved to ERC-1411) has helped define standard libraries for the complete STO lifecycle, especially for on-chain/off-chain transactions. This compilation of requirements, gathered from the various participants involved and paired with probable interfaces, has excited the technology community globally, as it has the mettle to ease the complete STO lifecycle. On GitHub, for instance, many active developers are participating in discussions and sharing their experiences.

Ethereum Standards (ERC, an abbreviation of Ethereum Request for Comments) related to regulated tokens

The standards below are worth a read to understand in depth the rationale behind targeting more regulated transactions based on Ethereum tokens –

  1. ERC-1404 : Simple Restricted Token Standard
  2. ERC-1462 : Base Security Token
  3. ERC-884 : Delaware General Corporations Law (DGCL) compatible share token

ConsenSys claims to have implemented ERC-1400 in a GitHub repository and has named the solution Dauriel Network. GitHub says, “Dauriel Network is an advanced institutional technology platform for issuing and exchanging tokenized financial assets, powered by the Ethereum blockchain.”

ERC-1400 (Renamed to ERC-1411) Overview

Smart contracts will eventually control all activities automatically: the security issuance process, the trading lifecycle from both an issuer and an investor perspective, and event processing related to the security token. Let’s try to understand the ERC-1400 standard libraries with respect to each activity in the STO lifecycle:

  1. ERC-20: Token Standard
  2. ERC-777: A New Advanced Token Standard
  3. ERC-1410: Partially Fungible Token Standard
  4. ERC-1594: Core Security Token Standard
  5. ERC-1643: Document Management Standard
  6. ERC-1644: Controller Token Operation Standard
  7. ERC-1066: Standard way to design Ethereum Status Codes (ESC)

All the methods defined inside each standard (as Solidity smart contract interfaces) can be mapped to activities at each of the three stages: Pre-Market, Primary Market, and Secondary Market.
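For instance, the transfer-validation surface of ERC-1594 ties directly into ERC-1066: `canTransfer` returns a status byte drawn from the Ethereum Status Codes alongside an application-specific reason. The sketch below is based on the published proposal; treat exact names and signatures as assumptions, since they may vary between revisions of the EIP.

```solidity
pragma solidity ^0.5.0;

// Subset of the ERC-1594 (Core Security Token Standard) interface,
// per the published proposal; check the EIP text for the
// authoritative version.
interface IERC1594Partial {
    // Transfer with off-chain data attached (e.g. a signed authorisation).
    function transferWithData(address to, uint256 value, bytes calldata data) external;

    // Dry-run a transfer: returns a success flag, an ERC-1066 status
    // byte, and an application-specific reason code.
    function canTransfer(address to, uint256 value, bytes calldata data)
        external view returns (bool, byte, bytes32);

    // Issuance controls for the primary market.
    function isIssuable() external view returns (bool);
    function issue(address tokenHolder, uint256 value, bytes calldata data) external;
}
```

The `canTransfer` dry-run is what lets wallets and exchanges reject a non-compliant transfer before submitting it on-chain.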

Before defining the mapping between the standard library methods and the activities across all three stages, it is important to distinguish on-chain activities from off-chain activities, i.e., those processed outside the STO platform. Off-chain activities can be carried out away from the main chain of the underlying blockchain platform and then merged back. However, integration is needed for every activity performed outside an STO platform, and here standards such as ERC-725 & ERC-735 (defined for identity management) play an important role.

All activities related to the Pre-Market stage are expected to happen outside the STO platform, as they are largely documentation-driven: structuring the offering and preparing the required documentation with all internal and external stakeholders, including the legal team, to ensure regulatory compliance. To reference all pre-market documentation on the STO platform, a cryptographic representation of each document (such as its hash) can be used effectively.

Similarly, the KYC/AML process can happen off-chain, integrated into the STO platform through identity management standards such as ERC-725 and ERC-735.

ERC-1400 (now a.k.a. ERC-1411) covers all activities related to the primary and secondary markets, with integration of all off-chain data, bringing the related documentation and identities onto the underlying blockchain platform on which the STO is designed.

Magic and its approach for defining ERC-1411 mapping

Team Magic is working continuously to define the mapping between all defined methods and the real-world activities of the primary and secondary markets. A key part of our strategy is to collect requirements from various stakeholders: security lawyers, exchange operators, KYC providers, custodians, business owners, regulators, and legal advisors. Once all requirements are collected, our experienced business analyst teams (experts in Pricing, Corporate Actions, and Risk Assessment) reconcile them with the ERC-1400 standards, both mapping each requirement and identifying gaps in the standards. Our technology team then prepares the implementation strategy for these standards by developing smart contracts in Solidity. Having an in-house smart contract for a specific case study (provided by our Business Analyst team) helps us define the auditing of ERC-1400-specific smart contracts and the testing strategy for each contract.

Battling Blockchain Vulnerabilities through Quality Audits

The original promise of blockchain technology was security. However, blockchains might not be as invulnerable as initially thought. Smart contracts, the protocols that govern blockchain transactions, have yielded to targeted attacks in the past.

The intricacies of these protocols let programmers implement anything the core system allows, including inserting loops in the code. The more freedom programmers are given, the more structure the code requires, and the more likely it becomes for security vulnerabilities to creep into blockchain-based environments.

The Attacks that Plague Blockchain

Faulty blockchain coding can give rise to several vulnerabilities. For instance, during Ethereum’s Constantinople upgrade in January 2019, reentrancy attacks became a cause for concern. These are possibly the most notorious of all blockchain attacks. A smart contract may interface with an external smart contract by ‘calling’ it; this is an external call. Reentrancy attacks exploit malicious code in the external contract to withdraw money from the original smart contract. A similar flaw was first revealed during the 2016 DAO attack, in which hackers drained $50 million from a Decentralized Autonomous Organization (DAO). Note the following token contract, from programmer Peter Borah, which appears to be a solid attempt at condition-oriented programming:

contract TokenWithInvariants {
    mapping(address => uint) public balanceOf;
    uint public totalSupply;

    modifier checkInvariants {
        _
        if (this.balance < totalSupply) throw;
    }

    function deposit(uint amount) checkInvariants {
        balanceOf[msg.sender] += amount;
        totalSupply += amount;
    }

    function transfer(address to, uint value) checkInvariants {
        if (balanceOf[msg.sender] >= value) {
            balanceOf[to] += value;
            balanceOf[msg.sender] -= value;
        }
    }

    function withdraw() checkInvariants {
        uint balance = balanceOf[msg.sender];
        if (msg.sender.call.value(balance)()) {
            totalSupply -= balance;
            balanceOf[msg.sender] = 0;
        }
    }
}
The above contract executes state-changing operations after an external call: totalSupply and balanceOf are updated only after msg.sender.call returns. It neither makes the external call last nor uses a mutex to prevent reentrant calls. The code does get some things right, such as checking a global invariant that the contract balance (this.balance) should not fall below what the contract believes it holds (totalSupply). However, because the check sits after the `_` placeholder in the modifier, it runs only at function exit, treating the global invariant as a post-condition rather than a condition that holds at all times, including during the external call. The deposit function is also flawed, since it credits the user-supplied amount parameter instead of the ether actually sent (msg.value).

Finally, the invariant check itself has a bug. Instead of,

if (this.balance < totalSupply) throw;

It should be,

if (this.balance != totalSupply) throw;

This matters because the `<` comparison verifies a weaker condition: it only rejects states where the contract holds less than it thinks, while tolerating an actual balance higher than totalSupply, instead of requiring the two to match exactly.

These issues let the contract hold more money than it should. An attacker can then potentially withdraw more than their share, heightening the danger of reentrancy even when the rest of the contract code is watertight.

Overflows and underflows are also significant vulnerabilities that non-ethical hackers can use as a Trojan horse. An overflow error occurs when a number is incremented above its maximum value. Think of an odometer in a car, where the distance resets to zero after surpassing, say, 999,999 km. If we declare a uint8 variable, which occupies 8 bits, it can hold decimal values between 0 and 2^8 - 1 = 255. Now if we code:

uint8 a = 255;
a++;

then a overflows and wraps back to 0, since its maximum value is 255.

At the other end, underflow errors affect smart contracts in the exact opposite direction. Taking a uint8 variable again:

uint8 a = 0;
a--;

we have now effected an underflow, which wraps a around to its maximum value of 255.

Underflow errors are the more probable of the two, since it is far easier to drive a small balance below zero than to accumulate tokens up to the maximum representable value. The Proof of Weak Hands Coin (POWH) scheme, launched by 4chan’s business and finance imageboard /biz/, lost roughly $800k overnight in March 2018 to an underflow attack. Building and auditing secure mathematical libraries that replace the customary arithmetic operators is a sensible defense against these attacks.
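A minimal sketch of such a replacement library, modelled on the widely used SafeMath pattern (the library and function names here are illustrative):

```solidity
pragma solidity ^0.5.0;

// Checked arithmetic that reverts on wraparound instead of silently
// overflowing or underflowing (pre-0.8 Solidity wraps by default).
library CheckedMath {
    function add(uint256 a, uint256 b) internal pure returns (uint256) {
        uint256 c = a + b;
        // If c wrapped around, it will be smaller than a.
        require(c >= a, "addition overflow");
        return c;
    }

    function sub(uint256 a, uint256 b) internal pure returns (uint256) {
        // Reject any subtraction that would go below zero.
        require(b <= a, "subtraction underflow");
        return a - b;
    }
}
```

A token contract would then write `balanceOf[msg.sender] = CheckedMath.sub(balanceOf[msg.sender], value);` rather than using the raw `-=` operator.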

The 51% attack is also prevalent in the world of cryptocurrency: a group of miners gains control of more than 50% of the network’s mining hashrate and can thereby censor or reorder new transactions. Similarly, external contract referencing exploits Ethereum’s ability to reuse code from, and interact with, already deployed contracts, hiding malevolent actors behind these interactions.

Smart contract auditing that combines the attention of manual code analysis and the efficiency of automated analysis is indispensable in preventing such attacks.
Solving the Conundrum
Fixes to such security risks in blockchain-based environments are very much possible. A process-oriented approach with agile quality assurance (QA) models is a must. Robust automation frameworks are also crucial in weeding out coding errors and thereby strengthening smart contracts.

In the case of reentrancy attacks, avoiding external calls is a good first step, as is inserting a mutex: a state variable that locks the contract during code execution and blocks reentrant calls. All logic that changes state variables should run before any external call. Correct auditing will ensure these steps are followed. For overflow and underflow attacks, an audit should verify that safe mathematical libraries replace the raw arithmetic operators; the SafeMath library for Solidity is a good example.
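Applied to the vulnerable withdraw function above, those two defenses look like this. This is a sketch only: state changes happen before the external call, and a mutex rejects reentrant calls.

```solidity
pragma solidity ^0.5.0;

contract SafeWithdraw {
    mapping(address => uint256) public balanceOf;
    bool private locked; // mutex flag

    // Reject any call that re-enters while the contract is executing.
    modifier noReentrancy() {
        require(!locked, "reentrant call");
        locked = true;
        _;
        locked = false;
    }

    function withdraw() external noReentrancy {
        uint256 balance = balanceOf[msg.sender];
        // Effects first: zero the balance before the external call,
        // so even a reentrant caller would find nothing left to take.
        balanceOf[msg.sender] = 0;
        (bool ok, ) = msg.sender.call.value(balance)("");
        require(ok, "transfer failed");
    }
}
```

Either defense alone closes the reentrancy hole; using both follows the defense-in-depth stance an audit should look for.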

To prevent external contract referencing, even a safeguard as simple as using the ‘new’ keyword to create contracts may be missed in the absence of proper auditing. Incidentally, this one step ensures that an instance of the referenced contract is created at execution time, so an attacker cannot substitute anything else for the original contract without changing the smart contract itself.
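By way of illustration: instead of trusting a caller-supplied address for a dependency, the contract instantiates its own copy with `new`, fixing the code it will call at deployment. The contract names below are hypothetical.

```solidity
pragma solidity ^0.5.0;

// A helper contract the main contract depends on.
contract PriceFeed {
    uint256 public price = 100;
}

contract Consumer {
    PriceFeed public feed;

    constructor() public {
        // `new` deploys a fresh PriceFeed instance. An attacker cannot
        // point Consumer at a malicious look-alike contract without
        // changing and redeploying Consumer itself.
        feed = new PriceFeed();
    }

    function currentPrice() external view returns (uint256) {
        return feed.price();
    }
}
```

Contrast this with a constructor that accepts `PriceFeed _feed` as a parameter, where the deployer could pass any address implementing the same interface.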

Magic BlockchainQA’s pioneering QA model has created industry-leading service level agreements (SLAs). Our portfolio of auditing services leverages our expertise in independently verifying blockchain platforms. This ensures decreased losses on investments for fintech firms, along with end-to-end integration, security, and performance. Crucially, it will help usher in widespread acceptance of blockchain-based platforms. As blockchain-based environments evolve, we evolve with them, tackling new challenges and threats while ensuring that our tools can audit these contracts impeccably.

Blockchain technology first came with the promise of unprecedented security. Through correct auditing practices, we can fulfill that original promise. At Magic BlockchainQA, we aim to take that promise to its completion every single time.