mohd sameer, Author at Magic FinServ

Why do FinTechs struggle to delineate their Buy, Build, Partner strategy?

Every now and then, FinTech service providers approach Product Managers who seem insulted by the question: “have you considered amplifying your DevOps team or optimizing your Cloud Strategy with outside help to scale faster and cheaper?”

Convincing these naysayers can sometimes make you feel like you are assuming a “bad cop” parental role – i.e. becoming the person who, based on extensive experience, knows what is good for them, even when they do not see it or believe you at first. So let’s jump right into the spiel.

There are three imperative considerations when implementing a “Buy, Build, Partner” strategy, all resting on a premise that Silicon Valley seemingly forgets from time to time – that time and material resources are finite: Leverage Open-source, Protect your Mindshare and Trust Inorganic Growth.

Steve Jobs once said, “it doesn’t make sense to hire smart people and tell them what to do; we hire smart people so that they can tell us what to do.” Even for Apple, a maximum-security IP fortress, this did not mean augmenting payroll. In 2012, Apple revealed its 15-year association with the likes of Infosys and Wipro, implying that Apple’s journey to a market cap of 1 trillion dollars was not achieved by internal hiring alone. The good news is that this is only getting easier to achieve; the bad news is that too few emerging names are following best practices to accelerate ahead. Instead, the C-Suite is often at the mercy of the apprehension and fear of internal (usually technical) “gatekeepers”. Let’s take a look at three key components of a “Buy, Build, Partner” strategy in 2020, which assumes your partners and service providers aren’t seeking multi-year/million-dollar engagements, that they adapt to your servicing time zone with suitable SLAs, and that they will not compromise on quality.

1. Today’s ubiquitous Open-source tech stacks do not require you to reinvent the wheel

  • The famous four (Apple, Google, Microsoft, and Linux) have now been joined by robust community-driven Open-source code that is not dependent on cyclical patch releases, provides real-time bug fixes and new enhancements, and can address vulnerabilities at T-0 (day zero), as opposed to proprietary solutions. Embracing the world of Open-source and applying your domain knowledge is how you get the biggest bang for your buck. 
  • For instance, Rasa, an Open-source Conversational AI framework, has given multiple verticals (Airline, Retail, Healthcare, Financial Services, and counting…) the opportunity to create enterprise-grade intelligent virtual assistants that are well versed in the context of their industry. 

Extrapolate this tangible product approach to AI/ML solutions, data visualization, testing, cloud strategy and platform engineering with a range of emerging tech stacks such as Kubernetes, the ELK stack (Elasticsearch, Logstash, Kibana) and Terraform. Take your pick and get in touch with me if you are looking to explore!

2. Protect your mindshare, build responsibly, keep costs low and hire for what you don’t know

  • Keep your team the right size and working on exciting stuff such as product development and feature building. A service provider with domain experts can easily handle manual QA, DevOps and migration projects that can be executed without the overhead (read: large fixed costs such as offices, servers, inventory) of doing it on your own. Note again for anxious readers: IP is not at risk, especially if your contract specifies that source code is to be handled and maintained by your firm. Decide on an outcome-based model that establishes clear deliverables. 
  • Burn rates should not make investors or leadership teams uncomfortable. As venture capitalist Mark Suster warns, “a company’s runway should not fall below 6-7 months of cash on hand”, and he reminds us that “high fixed costs and high debt rates killed many great companies in Dot Com 1.0”. Figure out a “Buy” strategy that keeps sticky situations and rainy days to a minimum by increasing variable costs. This, in turn, generates momentum for speed to market and allows you to maintain a position well ahead of your peers. 
  • Even though we are inundated with “self-help” advice on how to manage our personal lives and relationships, institutional introspection is underrated. It is just as important to identify and diagnose weak areas in your company from the outset. Then buy or hire those services from a vendor that spends day and night perfecting that exact skillset. 

3. The Butterfly Effect of Partnering on Business Development 

  • Do not underestimate the “Butterfly Effect” of your outsourcing partner’s ability to drive inorganic growth in unique ways. Unsuspecting partnerships have helped drive:
    • Geographical scale
    • Customer acquisition and adoption
    • IP augmentation 
    • Insights & analytics 
  • Choose the domain experts that can connect you to peers and establish these relationships, just as Wipro and Infosys leveraged their internal IT projects with Apple to amplify adoption. Weaving a web of interconnectivity between Apple products and other clients’ business applications, and adapting best practices, continues to be a win-win for the iPhone/iPad maker as well as its outsourcing providers. 

Finally, the skeptics are not wrong to be wary of anything except “Build”. It has become the dominant fallback approach for many emerging technology companies across Retail, Healthcare, FinTech and Blockchain after “outsourcing” earned a bad reputation over the last decade (read: overcharging, “landing and expanding”, and poor results). However, with the right governance – acknowledging that most engagements can leverage free open-source solutions with effective domain-specific frameworks, and creating equitable partnerships – a little more “Buy” and “Partner” can get you where you want to go exponentially faster. 

Using AI for Contract Lifecycle Management

Contracting as an activity has been around ever since the start of the service economy. Yet despite being a long-established practice, very few companies have mastered the art of managing contracts efficiently or effectively. According to a KPMG report, inefficient contracting leads to a loss of 5% to 40% of the value of a given deal in some cases. 

The main challenge facing companies in the financial services industry is the sheer volume of contracts they have to keep track of; these contracts often lack uniformity and are hard to organize, maintain and update on a regular basis. Manual maintenance of contracts is not only cumbersome but also prone to multiple forms of error. It also poses the risk of missing important deadlines or scheduled follow-ups written into the contract, which could lead to expensive repercussions.

Contract management is a way to manage multifarious contracts from vendors, partners, customers, etc. so that data from these contracts can be easily identified, segregated, labeled and extracted for various use cases, and updated regularly. 

Recent technological advances in Artificial Intelligence (AI) and Machine Learning are now helping companies resolve many of these contracting challenges by delivering efficient contract management as a seamless automated solution. 

Benefits of Using AI in the contract management lifecycle

Basic Search

AI can help enhance the searchability of contracts, including clauses, dates, notes, comments and even the metadata associated with them. The AI method used for this purpose is natural language processing (NLP), and metadata is extracted at a granular level to enable the user to search a vast repository of contracts effectively.

Example: This search function would be extremely useful for relationship managers or chatbots answering customer queries pertaining to a particular contract. 
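As a rough illustration of the metadata extraction described above, the sketch below pulls dates and clause headings out of raw contract text with plain regular expressions. It is a minimal stand-in for a real NLP pipeline; the function name and the sample contract are our own invention.

```python
import re
from datetime import datetime

def extract_metadata(contract_text):
    """Extract simple metadata (dates, clause headings) from raw contract text.

    Illustrative only: a production system would use a full NLP pipeline,
    not regular expressions.
    """
    # Dates written as "1 March 2020", parsed into datetime objects.
    dates = [
        datetime.strptime(m, "%d %B %Y")
        for m in re.findall(
            r"\b(\d{1,2} (?:January|February|March|April|May|June|July|"
            r"August|September|October|November|December) \d{4})\b",
            contract_text,
        )
    ]
    # Clause headings of the form "1. Term and Termination".
    clauses = re.findall(r"^\s*\d+\.\s+([A-Z][^\n]+)$", contract_text, re.MULTILINE)
    return {"dates": dates, "clauses": clauses}

sample = """1. Term and Termination
This agreement commences on 1 March 2020 and expires on 28 February 2021.
2. Confidentiality
"""
meta = extract_metadata(sample)
```

Indexing such fields per contract is what makes granular search across a large repository possible.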

Analysis and Diagnostic Search: AI can proactively identify expiry dates, renewal dates, follow-up dates or low KPI compliance, and then suggest a course of action or raise alerts. Analytics can further be used to predict risks or non-compliance and notify relevant stakeholders about pending payments or negotiations.

Example: This can be effectively utilised to improve customer satisfaction as well as to guide negotiations based on accessible information.
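The date-driven flagging above can be sketched in a few lines. The contract records and field names below are hypothetical, standing in for rows pulled from a contract store:

```python
from datetime import date, timedelta

def flag_expiring(contracts, today, window_days=30):
    """Return names of contracts whose expiry falls within the alert window.

    `contracts` is assumed to be a list of dicts with 'name' and 'expiry'
    keys; a real system would query these from contract metadata.
    """
    horizon = today + timedelta(days=window_days)
    return [c["name"] for c in contracts if today <= c["expiry"] <= horizon]

contracts = [
    {"name": "Vendor A MSA", "expiry": date(2020, 6, 15)},
    {"name": "Vendor B SLA", "expiry": date(2020, 9, 1)},
]
alerts = flag_expiring(contracts, today=date(2020, 6, 1))
```

Hooking the returned list to a notification channel gives stakeholders the kind of renewal alerts described above.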

Cognitive Support: AI is highly sought after for its predictive intelligence. Its predictive capabilities can be used to analyze existing contracts and understand contract terms & clauses, while its pattern-recognition algorithms can identify points of parity and differentiation across pricing, geography, and products & services. Based on this analysis, AI can suggest the inclusion or exclusion of clauses, terms & conditions, etc. when authoring new contracts. 

Example: AI systems may automatically predict and suggest clauses pertaining to an NDA (non-disclosure agreement) based on historical contracts that have been processed previously and the events associated with them.
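One simple way to suggest a clause from historical contracts is text similarity: compare the draft against a clause library and return the closest match. The sketch below uses a bag-of-words cosine similarity as a crude stand-in for the pattern-recognition models described above; the clause texts are invented examples.

```python
import math
import re
from collections import Counter

def bag(text):
    """Lower-cased bag-of-words representation of a piece of text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest_clause(draft, clause_library):
    """Return the historical clause most similar to the draft text."""
    draft_bag = bag(draft)
    return max(clause_library, key=lambda c: cosine(draft_bag, bag(c)))

library = [
    "The receiving party shall keep all disclosed information confidential.",
    "Either party may terminate this agreement with thirty days notice.",
]
best = suggest_clause("keep information confidential and not disclose", library)
```

A production system would replace the bag-of-words model with embeddings trained on the firm’s own contract corpus.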

Dynamic Contracts: Advanced AI can be used to build adaptive, dynamic contracts. Based on past data, and taking into account external factors such as market fluctuations, currency exchange rates, prices, labor rates, and changes in laws and regulations, AI algorithms can draft a contract. Such a contract would still require auditing by an expert, but would nonetheless reduce the effort required to generate it.

Example: AI can be used to assess existing contracts and make them GDPR (General Data Protection Regulation) compliant. It inserts the relevant data-privacy terms and conditions into the contract and subsequently notifies the concerned stakeholders about the changes, so they can be verified.
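The GDPR example boils down to a detect-insert-notify loop. The sketch below shows the mechanics with a simple keyword check; the clause wording is illustrative only, not legal text, and a real system would use NLP rather than substring matching to detect an existing data-protection clause.

```python
# Illustrative placeholder wording, not actual legal language.
GDPR_CLAUSE = (
    "Data Protection: Each party shall process personal data in accordance "
    "with the General Data Protection Regulation (EU) 2016/679."
)

def ensure_gdpr_clause(contract_text):
    """Append a data-protection clause when none is detected.

    Returns the (possibly updated) text plus a flag telling the reviewing
    stakeholder whether the contract was changed and needs verification.
    """
    if "General Data Protection Regulation" in contract_text:
        return contract_text, False
    return contract_text.rstrip() + "\n\n" + GDPR_CLAUSE, True

updated, changed = ensure_gdpr_clause("1. Term\nThis agreement commences on signing.")
```

The `changed` flag is what drives the stakeholder notification step described in the example above.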

Challenges in contract management with AI-ML

The use of AI and Machine Learning for contract management is highly promising, but it also faces a few limitations. 

Machine Learning (ML) is only as effective as the data used to train its algorithms. Therefore, before any AI-ML application is put into practice, an exhaustive dataset of contracts must be developed and then classified, sorted, labeled, and made retrievable via metadata. This provides the base training data upon which AI can build, putting the ‘intelligence’ into the contract management process.

For such an exhaustive dataset to be developed, all the contract data must first be brought together. In many organizations, contracts are still hard copies lying in cabinets; approximately 10% of written agreements are not even traceable. Even when digitized contracts are available, they must be in a uniform format before an AI system can extract insights from them. This requires not only scanning all the documents but also the ability to extract the meaning of the content within the contracts. 

Overcoming the challenges

To make contract portfolios AI-ready, the first step is to digitize the contract documents. This can be done using OCR (optical character recognition), which reads a physical document much as a human eye would and converts it into digitized text that can be searched with ML techniques. While it may be too onerous to scan all historical contracts manually, a CMS (contract management software) capable of converting documents into machine-readable files can accomplish this and build up a significant data pool. AI can then use this data to gain relevant insights; when AI algorithms have access to large pools of data, their ability to decipher patterns and provide insights becomes much stronger.

Predictive insights are achieved by incorporating NLP (natural language processing). NLP allows contract groups to identify when contracts have deviated from defined standards, which makes the approval and negotiation processes much faster because stakeholders know how the current contract version deviates from those standards. NLP is also used to report risk based on the meaning of the language rather than simple string matching – for example, identifying contracts that are about to expire and starting their renewal process.
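Deviation from standard wording can be approximated even without full NLP. As a minimal sketch (using character-level similarity as a crude proxy for the semantic comparison described above, with invented clause text):

```python
import difflib

def deviation_ratio(contract_clause, standard_clause):
    """How far a contract clause deviates from the firm's standard wording.

    0.0 means identical, 1.0 means completely different. A crude stand-in
    for NLP-based comparison: it measures character overlap, not meaning.
    """
    sim = difflib.SequenceMatcher(None, contract_clause, standard_clause).ratio()
    return 1.0 - sim

standard = "Either party may terminate this agreement with thirty days written notice."
draft = "Either party may terminate this agreement with ten days written notice."
dev = deviation_ratio(draft, standard)
```

Clauses whose deviation exceeds a chosen threshold would be routed to a reviewer, which is exactly the approval-acceleration effect described above.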


Potentially, AI in contract management will change the contract lifecycle by elevating the strategic role of contract managers, putting them in a stronger position when negotiating contract terms. It can also help tremendously in strategic planning, risk management, supplier search, and final selection, thus enhancing the efficiency and effectiveness of category managers. AI innovation will continue to play a vital role as contract managers educate themselves and ensure that their contract processes are fully digitized and AI-ready.

Get started with Artificial Intelligence by booking a workshop with us today!

How FinTech Startups Can Outgrow the Financial Giants

According to ‘The Pulse of FinTech 2018’ report by KPMG, fintech startups bagged over $111 billion in investments across 2,196 deals. The technological evolution of fintech startups has outmatched that of traditional financial services by many leagues. Not only has this served to disrupt the space by directly pitting startups against tech giants, but has also transformed the tools of global trade and commerce.  One startup even estimated the total cost of the recent US government shutdown right down to its last cent.

Various emerging technologies have given rise to new business-technology startups that didn’t even exist ten years ago. It’s no surprise then that investments in sectors of regulatory technology (RegTech) have tripled from USD 1.2 billion in 2017 to USD 3.7 billion in 2018.

Meanwhile, the versatile nature of blockchain technology is being used to craft specific solutions for capital markets, everything from cryptocurrencies to capital issuance. Even the simplest technology tools in the hands of FinTech are being used to enhance point-of-sale customer experience while also controlling fraudulent transactions.

However, the most recent breakthrough amongst all of these has been the rise of FinTech startups in capital markets. Since 2010, capital market infrastructure (CMI) linked FinTechs have grown nearly 300%, offering solutions to tackle complex front-, middle-, and back-office problems.

Why Startups?

For startups, success amidst cut-throat competition isn’t easy to achieve. ‘Nine out of ten startups fail’ is an oft-repeated maxim. Compliance and legal issues, along with inadequate funding have been the primary roadblocks in this quest. But despite these difficulties, fintech startups are ideally placed to resolve longstanding issues in the capital markets industry. These issues include high structural expenses, stagnant revenues, and enormous capital costs.

These challenges, combined with the changes demanded by regulators, have led to a decline in the returns on equity (ROE) for investment banks year after year. CMI providers (CMIPs) are compelled to deliver regulatory changes, such as the shift toward compulsory central counterparty clearing of over-the-counter derivatives, or external shifts in investor behavior. These pressures and complexity typically combine to cause organizational fatigue. This leaves high-level management with hardly any scope to invest in initiatives that can increase ROE.

Costs associated with the development and implementation of regulatory compliance systems are unavoidable, but costs incurred by investment banks to maintain disparate systems are unnecessary. Despite wanting to harness cutting edge technologies, they get caught up in the devil’s snare of legacy infrastructure. Instead, they need to leverage an external fintech solution to achieve their goals more optimally. Since startups aren’t tied to any entrenched IT architecture, they can accelerate cutting-edge product and service development.

The agile infrastructure of fintech startups has been proven to improve productivity by 25 to 30% within 6 to 18 months. CMIPs are already being empowered by fintech startups to solve many of their challenges, and these startups are poised to make a significant impact on the capital markets industry. What CMIPs are less certain about is which specific technologies hold the key to resolving their challenges most efficiently, and the best ways to collaborate with fintech firms.

Balancing the Equation

Sopnendu Mohanty, the Chief Fintech Officer at the Monetary Authority of Singapore (MAS), stated that while we normally understand fintech as a technology firm performing banking activities, the reality is that only a fraction operate within the banking segment. Most startups are assisting in the digitization of banks. And while mergers and acquisitions by larger firms have been thought to benefit startups, recent developments along the CMI value chain suggest quite the reverse. 

Most startups assist banks by modernizing their dated infrastructure by becoming vendors and partners. Alliances such as the one between ING and the automated lending platform Kabbage are proof that conventional banks are looking to present new offerings to their customer base, and move to a more streamlined, agile, ‘plug-and-play’ model. 

They will continue to drive greater productivity in post-trade services like regulatory reporting and risk management by deploying automation and robotics. We are already witnessing capital markets seeking out next-gen artificial intelligence solutions to cope with their growing data streams, and blockchain to optimize their transaction exchanges. Startups are well-positioned to open new digital markets, serve as an alternative to conventional access to capital, and enhance the security of global financial systems.

Making Finance Relevant

The disruption brought about by fintech startups is indicative of the agile, mobile-first approach that customers across most sectors want. For the record, smaller startup fintech companies are the most active in the CMI space. Despite their considerable data pools and comprehensive resources, technology giants are being given a run for their money by these startups due to their enhanced agility and lack of legacy burdens.

They operate with existing providers rather than against them, and most of their products act as components within the industry, making integration much easier. Fintech startups have also been heavily backed by venture capital investment from the CMI sector, and this trend is on the rise. CB Insights’ “Fintech 250” list of 2018 further reinforces this reality, with Kabbage, incidentally, being the best-funded fintech startup in business lending and financing.

Ultimately, fintech startups are defying the norm by creating a space for established financial giants to leverage new technologies in a way that will bring about radical but meaningful change. There is no denying that fintech companies will continue to pioneer and outpace traditional financial giants as their technological innovation brings an unparalleled depth of value to capital markets in the 21st century.

New Bottle, Old Wine? – Separating The True Fintech Solutions from the Rest

A New World of Banking

The rise of fintech in the last half-decade or so has taken the financial world by storm. Research suggests that there are now more than 7,500 fintech firms around the world, which have raised nearly USD 109 billion in investment. The sector raked in a record-breaking USD 54 billion of investment in 2018 and USD 10 billion within the first quarter of 2019. 

Clearly, the hunger for fintech is growing, and with it, the fear among banks and traditional financial businesses about potentially lost revenue and customers. The fact that customers increasingly prefer these non-traditional competitors does little to calm the uncertainty. 

As established players in the financial services industry wake up to this new business dynamic, the majority are attempting to collaborate with fintech: to leverage its ever-expanding ecosystem, turn the innovation to their favor, and address the concerns that arise with their business being at risk. Research reveals that as many as 82% of incumbents in the financial industry sector expect to enhance their partnerships with fintech players, going forward. 

Fintech – A Force to Reckon With

Fintech can be rightly characterized as a movement that has brought disruptive and transformative innovation in financial services through cutting-edge technology. Unlike traditional financial institutions, fintech startups have the advantage of not being burdened by age-old regulatory constraints, legacy systems and processes. This has allowed them to move faster and come up with solutions that compete directly with conventional methods of financial service deployments. 

Another aspect that has fuelled the rapid progression of fintech is an entirely new generation of well-informed, connected mobile consumers who continue to reshape financial service requirements. Over time, fintech companies have managed to rope in these digital natives with smart banking platforms. This has given them a head start in the race to capitalize on the 1.7 billion unbanked adults who, according to the World Bank’s Global Findex Database 2017, are naturally inclined towards smart fintech services.

On the other hand, major players in the financial services sector and capital market incumbents have failed to keep pace on this front. Burdened with massive structural costs, hefty capital charges, and stagnant revenues, this sector continues to score low on the innovation index. Additionally, the relentless pressure to stay compliant and adhere to regulatory guidelines leaves organizations short of bandwidth to invest time and resources in initiatives that can improve margins. 

There’s no denying that in the digital age, customer experience (CX) is the final battleground for businesses. And here, fintech has a natural advantage. By placing CX above everything else, fintech offerings have been able to provide their users with far-reaching benefits. For instance, by leveraging smart application programming interfaces (APIs), fintech companies are able to nurture a healthy community of third-party partners around their native software platform. Open APIs allow fintech players to expand their customer services by enabling third-party partners and developers to create their own apps and layers on top of the middleware. 

Apart from this, the algorithmic design and data-rich environment of this sector have proven ideal for machine learning (ML), artificial intelligence (AI) and blockchain-driven product deployments. Developers today are able to leverage these technologies to simplify and optimize cumbersome, effort-intensive processes such as compliance, credit checks, risk management, and P2P payments. 

But there’s good news. These technologies can yield similar results in capital markets as well, provided they are strategically implemented in the right areas. For instance, process automation with Robotic Process Automation (RPA) can help organizations working in the capital markets space replace manual legacy systems, make the systems compliant with Know Your Customer (KYC), Anti-Money laundering (AML) and other regulations, reconcile reports and connect middle and back office functions. On the other hand, more contemporary technologies like AI can simplify cumbersome processes such as trade settlement, compliance reporting, contract management and accounts payable. 

Blockchain is another area which promises to yield unprecedented gains for capital market players. No wonder the financial services industry has witnessed some of the biggest use cases of this technology. For example, in digital trading, blockchain is helping organizations reduce settlement times. In the current trading architecture, a single transaction can take days to settle; a blockchain-based settlement solution significantly curbs this turnaround time. A cryptocurrency token that serves as a proxy for a particular transaction is immediately transferred to the wallet of the beneficiary, confirming the completion of the settlement and the ledger update.

Extrapolating into the Future of the Financial Services Sector

With the gradual implementation of next-generation technologies like ML, neural networks with long short-term memory (LSTM), blockchain, AI and robo-advisors, fintech will continue to gain trust and popularity among customers. 73% of millennials are eager to shift to a new financial paradigm where service products from technology companies like Google, Apple, PayPal, and Amazon are more exciting, intuitive, and CX-friendly than anything traditional financial players currently provide.

The times are clearly changing. Fintechs are fast opening the virtual vault doors to innovation in the once impenetrable banking and financial services sector. Can traditional players take the bold steps necessary to match the frictionless experience that’s the new norm, or will they eventually lose ground to the new entrants? Only time will tell. 

STO Processing Lifecycle (Part 1) -An overview of STO Lifecycle in Primary and Secondary markets

Security Token Offerings (STOs) appear to be the next wave in applying blockchain concepts to transform current security instruments (Equity, Debt, Derivatives, etc.) into digitized securities. STOs have gained popularity and momentum recently, largely because of the lack of regulation in the ICO world and the many failed ICOs of 2018. Many startups are building STO platforms on programmable blockchains like Ethereum, and recent developments suggest the space is moving toward an ecosystem with defined standards. The traction on GitHub around standards like ERC-1400, ERC-1410, ERC-1594, ERC-1643 and ERC-1644 prompted us to think about how a technology company like ours (specialized in ensuring quality standards for blockchain-based applications on Ethereum) can contribute. We started our journey by defining the complete STO processing cycle in the context of real-world usage (from a functional perspective) on underlying Ethereum platforms (from a technology perspective).

Before actually defining the STO lifecycle, it is important to list the major participants and their roles:

  1. Issuers – Legal entities who develop, register & sell securities to raise funds for business expansion.
  2. Investors – Entities who invest in securities expecting financial returns.
  3. Legal & Compliance Delegates – Entities that ensure all participants & processes comply with the rules & regulations defined by the jurisdiction.
  4. KYC/AML service providers – Entities that provide KYC/AML checks for the required participants.
  5. Smart Contract development communities – Developers, Smart Contract Auditors, QA Engineers.

Most of the companies claiming to provide STO platforms use Ethereum as the underlying programmable blockchain, with few exceptions. The rationale for choosing Ethereum first is that it is a Turing-complete platform for building complex decentralized applications, with logic defined inside smart contracts (Solidity being the most popular programming language among developer communities). In parallel, Ethereum keeps maturing, becoming more secure and improving in performance and scalability through a steady stream of new features. Very few players use platforms other than Ethereum for their STO processing platforms, and some are trying to build a completely new blockchain dedicated to STOs. That last approach seems overly optimistic, as it might take years to build such a system, whereas the current momentum around STOs does not seem likely to wait that long.

Based on the above, we can define the generic STO lifecycle, from a functional standpoint, in two phases:

  1. Primary Market – issuing the STO (by the Issuer) and investing in the STO (by Investors)
  2. Secondary Market – trading the STO either on Exchanges or Over The Counter

Primary Market

  1. To issue an STO (Issuer) –
    1. Registration of the Issuer
    2. Creation of the STO token
    3. Approval from Legal & Compliance for the STO
    4. Issuance of the STO post Legal & Compliance approval
  2. To invest in an STO (Investor) –
    1. Registration of the Investor
    2. KYC/AML for Investors
    3. Whitelisting of Investors for STOs post KYC/AML
    4. Investment in allowed STOs based on the whitelisting of the corresponding STO
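The primary-market flow above can be sketched as a small state machine. The class and method names below are our own invention for illustration (not part of any ERC standard), and the model is deliberately minimal:

```python
class STOIssuance:
    """Minimal sketch of the primary-market flow: approve, issue,
    whitelist investors post KYC/AML, then accept investments."""

    def __init__(self, issuer):
        self.issuer = issuer
        self.approved = False
        self.issued = False
        self.whitelist = set()
        self.investments = {}

    def approve(self):
        # Legal & Compliance sign-off must precede issuance.
        self.approved = True

    def issue(self):
        if not self.approved:
            raise RuntimeError("STO not approved by Legal & Compliance")
        self.issued = True

    def whitelist_investor(self, investor, kyc_passed):
        # Only investors who cleared KYC/AML may be whitelisted.
        if kyc_passed:
            self.whitelist.add(investor)

    def invest(self, investor, amount):
        if not self.issued or investor not in self.whitelist:
            raise RuntimeError("investment not permitted")
        self.investments[investor] = self.investments.get(investor, 0) + amount

sto = STOIssuance("ExampleCorp")
sto.approve()
sto.issue()
sto.whitelist_investor("alice", kyc_passed=True)
sto.invest("alice", 1000)
```

On a real platform, the ordering constraints enforced by the `if` checks here would live inside the smart contracts rather than in application code.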

Before we get into the technical insight of the underlying blockchain technology, we need to define the STO platform’s technical architecture from a user perspective. Every STO platform existing today, in whatever state, has –

  1. A Presentation layer (User Interface with any chosen front end technology, Integration with Wallets)
  2. A Business Layer (JS libraries to provide an interface to interact with Smart Contracts)
  3. A Data Layer (Ethereum data storage in blocks in the form of key-value storage)

Now let’s define a high-level overview from a technical standpoint, assuming the STO platform uses Ethereum as its underlying blockchain (and that the backend layer has already been set up) –

  1. Creation of an external account for all participants, bringing everyone onto the Ethereum blockchain
  2. Defining Off-Chain and On-Chain transactions for all activities of the Issuer and Investors
  3. Merging Off-Chain data with On-Chain data
  4. Developing Smart Contracts
    1. A standard smart contract, built per jurisdiction, for the generic processes among the required participants
    2. An STO-specific Smart Contract implementing business/regulation rules
    3. Smart contracts with all business logic, especially for transaction processing
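As an illustration of the kind of business/regulation rule such a smart contract might enforce, the sketch below models a whitelist-based transfer restriction in Python rather than Solidity. The restriction-code pattern (returning a reason code instead of silently failing) loosely mirrors the idea behind ERC-1404; all names are our own:

```python
# Hypothetical restriction codes, in the spirit of ERC-1404's numeric codes.
SUCCESS, NOT_WHITELISTED, INSUFFICIENT = 0, 1, 2

def detect_transfer_restriction(balances, whitelist, sender, receiver, amount):
    """Return a restriction code explaining why a transfer would fail (0 = ok)."""
    if receiver not in whitelist:
        return NOT_WHITELISTED
    if balances.get(sender, 0) < amount:
        return INSUFFICIENT
    return SUCCESS

def transfer(balances, whitelist, sender, receiver, amount):
    """Apply the transfer only if no restriction is detected."""
    code = detect_transfer_restriction(balances, whitelist, sender, receiver, amount)
    if code != SUCCESS:
        raise ValueError(f"transfer restricted (code {code})")
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount

balances = {"issuer": 1_000_000}
whitelist = {"issuer", "alice"}
transfer(balances, whitelist, "issuer", "alice", 500)
```

Returning an explicit reason code lets wallets and exchanges explain to investors why a transfer was blocked, rather than showing an opaque failure.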

Based on the expertise of our group, Magic FinServ can contribute in a very big way in the development of Smart Contracts (Written in Solidity) along with Auditing of contracts.
For more details visit

In the next part, we will detail the high-level technical overview above alongside a high-level functional overview, followed by more insight into each of these functional & technical flows.

Why does it matter for a technology company like us to embrace global tech standards?

When the ERC-20 standard came into existence, it simplified ICO token interoperability across wallets & crypto exchanges for all ERC-20 compliant tokens. Having standards for any process not only drives broader acceptance but also improves the interoperability needed to build an ecosystem. Being a technology-obsessed firm, we have always encouraged standards to be in place. An accepted standard not only helps developers (one of the strongest stakeholder groups in the ecosystem, responsible for delivering workable solutions with the available technology) to build the ecosystem, but also minimizes the changes needed to implement interoperability. No system in existence today is free of errors or failures in real-time usage. Using global standards gives us another vital advantage: resolutions for such errors/failures are often at hand, because similar cases have already been resolved by the tech fraternity.

Today, it is of utmost importance to have standards that can not only integrate multiple systems (STO platforms, wallets and exchanges) with minimal changes but also make security tokens easily interoperable across wallets and exchanges. Security Token Offerings cannot be an exception to having standards, given that they represent one of the biggest and most complicated technological advancements: transforming existing securities into digitized securities with automated processing on top of blockchain technology.

The recent traction on ERC-1400 (since moved to ERC-1411) has helped define standard libraries for the complete STO lifecycle, especially for on-chain/off-chain transactions. This compilation of requirements, gathered from the various participants involved along with probable interfaces, has technology folks globally excited, as it has the mettle to ease the complete STO lifecycle. On GitHub, for instance, many active developers participate in the discussions and share their experiences.

Ethereum Standards (ERC abbreviation of Ethereum Request for Comments) related to regulated tokens

The standards below are worth a read to understand the rationale behind targeting more regulated transactions based on Ethereum tokens:

  1. ERC-1404 : Simple Restricted Token Standard
  2. ERC-1462 : Base Security Token
  3. ERC-884 : Delaware General Corporations Law (DGCL) compatible share token

ConsenSys claims to have implemented ERC-1400 in a GitHub repository named Dauriel Network. According to GitHub, “Dauriel Network is an advanced institutional technology platform for issuing and exchanging tokenized financial assets, powered by the Ethereum blockchain.”

ERC-1400 (Renamed to ERC-1411) Overview

Smart contracts will eventually control all activities automatically: the security issuance process, the trading lifecycle from both issuer and investor perspectives, and event processing related to the security token. Let’s try to understand the ERC-1400 standard libraries with respect to each activity in the STO lifecycle:

  1. ERC-20: Token Standard
  2. ERC-777: An Advanced Token Standard
  3. ERC-1410: Partially Fungible Token Standard
  4. ERC-1594: Core Security Token Standard
  5. ERC-1643: Document Management Standard
  6. ERC-1644: Controller Token Operation Standard
  7. ERC-1066: Standard Way to Design Ethereum Status Codes (ESC)
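As a rough sketch of how these pieces fit together, the mapping below (in Python, purely illustrative) pairs each sub-standard with the lifecycle concern it addresses. The one-line descriptions are our reading of the standards, and the `describe` helper is hypothetical, not part of any ERC.

```python
# Illustrative sketch: grouping the ERC-1400 (ERC-1411) sub-standards by the
# STO lifecycle concern each one addresses. The descriptions reflect our
# reading of the standards, not a normative mapping.
ERC1400_SUBSTANDARDS = {
    "ERC-20":   "Base fungible token interface (transfer, balanceOf, approve)",
    "ERC-777":  "Advanced token features such as operators and send/receive hooks",
    "ERC-1410": "Partially fungible tokens: balances split into partitions (tranches)",
    "ERC-1594": "Core security token: transfer restrictions with reason codes",
    "ERC-1643": "Document management: attach legal documents to the token on-chain",
    "ERC-1644": "Controller operations: forced transfers for regulatory actions",
    "ERC-1066": "Ethereum Status Codes used to report why a transfer is (dis)allowed",
}

def describe(standard: str) -> str:
    """Return the one-line role of a sub-standard, or a not-found message."""
    return ERC1400_SUBSTANDARDS.get(standard, f"{standard}: not part of ERC-1400")
```

A platform team could use a table like this as a checklist when deciding which interfaces a given security token actually needs to expose.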

All the methods defined inside each standard (Solidity smart contract interfaces) belong to one of three activity stages (Pre-Market, Primary Market, Secondary Market) and can be represented pictorially as below:

Before defining the mapping between the standard library methods and the activities across all three stages, it is of utmost importance to distinguish off-chain and on-chain activities, i.e. those that will be processed outside the STO platform. Off-chain activities can be performed outside the main chain of the underlying blockchain platform and then merged back. However, integration will be needed for all activities performed outside an STO platform, where several standards (e.g. ERC-725 and ERC-735, defined for identity management) play an important role.

All Pre-Market activities are expected to happen outside the STO platform, as they relate entirely to documentation: structuring the offering and preparing the required documents with all internal and external stakeholders, including the legal team, to ensure regulatory compliance. To bring references to all pre-market documentation onto the STO platform, cryptographic representations of the documents can be used effectively.

Similarly, the KYC/AML process can happen off-chain, integrated into the STO platform through identity management standards such as ERC-725 and ERC-735.

ERC-1400 (now a.k.a. ERC-1411) covers all activities related to the primary and secondary markets, with integration points for all off-chain data, bringing the related documentation and identity information onto the underlying blockchain platform on which the STO is designed.

Magic and its approach for defining ERC-1411 mapping

Team Magic is working continuously to define the mapping between all the defined methods and the real-world activities of the primary and secondary markets. A key part of our strategy is to collect requirements from the various stakeholders: security lawyers, exchange operators, KYA providers, custodians, business owners, regulators, and legal advisors. Once all requirements are collected, our experienced business analyst teams (experts in pricing, corporate actions, and risk assessment) take over and reconcile them against the ERC-1400 standards, both mapping each requirement and finding the gaps in the standards. After that, our technology team prepares the implementation strategy for those standards by developing smart contracts in Solidity. Having an in-house smart contract for a specific case study (provided by our business analyst team) also helps us define the auditing of ERC-1400-specific smart contracts and the testing strategy for each contract.

Battling Blockchain Vulnerabilities through Quality Audits

The original promise of blockchain technology was security. However, blockchains might not be as invulnerable as initially thought. Smart contracts, the protocols which govern blockchain transactions, have yielded under targeted attacks in the past.

The intricacies of these protocols let programmers implement anything the core system allows, including inserting loops in the code. The more options programmers are given, the more structure the code needs, and the more likely it is that security vulnerabilities enter blockchain-based environments.

The Attacks that Plague Blockchain

Faulty blockchain coding can give rise to several vulnerabilities. For instance, during Ethereum’s Constantinople upgrade in January of this year, reentrancy attacks became a cause for concern. These are possibly the most notorious of all blockchain attacks. A smart contract may interface with an external smart contract by ‘calling’ it; this is an external call. Reentrancy attacks exploit malicious code in the external contract to withdraw money from the original smart contract. A similar flaw was first revealed during the 2016 DAO attack, where hackers drained $50 million from a Decentralized Autonomous Organization (DAO). Consider the following token contract from programmer Peter Borah, which appears to be a solid attempt at condition-oriented programming:

contract TokenWithInvariants {

  mapping(address => uint) public balanceOf;

  uint public totalSupply;

  modifier checkInvariants {
    _;
    if (this.balance < totalSupply) throw;
  }

  function deposit(uint amount) checkInvariants {
    balanceOf[msg.sender] += amount;
    totalSupply += amount;
  }

  function transfer(address to, uint value) checkInvariants {
    if (balanceOf[msg.sender] >= value) {
      balanceOf[to] += value;
      balanceOf[msg.sender] -= value;
    }
  }

  function withdraw() checkInvariants {
    uint balance = balanceOf[msg.sender];
    if (msg.sender.call.value(balance)()) {
      totalSupply -= balance;
      balanceOf[msg.sender] = 0;
    }
  }
}
The above contract executes state-changing operations after an external call. It neither places the external call at the end nor uses a mutex to prevent reentrant calls. The code does perform excellently in some areas, such as checking a global invariant: the contract balance (this.balance) should not be below what the contract perceives it to be (totalSupply). However, these invariant checks are performed in function modifiers at function exit, thereby treating the global invariant as a post-condition rather than ensuring it holds at all times. The deposit function is also flawed, since it credits the caller-supplied amount parameter instead of the ether actually attached to the call (msg.value).

Finally, the invariant check itself has a bug. Instead of,

if (this.balance < totalSupply) throw;

It should be,

if (this.balance != totalSupply) throw;

This is because the original check confirms only the weaker condition that the contract’s actual balance has not fallen below what the contract thinks it holds; checking for inequality enforces the stronger condition of exact equality, which also catches the case where the contract holds more than it should.

These issues enable the contract to hold more money than it should. An attacker can potentially withdraw more than their share, heightening the danger of reentrancy even when the rest of the contract code is watertight.

Overflows and underflows are also significant vulnerabilities that can be used as a Trojan horse by unethical hackers. An overflow error occurs when a number is incremented above its maximum value. Think of odometers in cars, where the distance resets to zero after surpassing, say, 999,999 km. If we declare a uint8 variable, which occupies 8 bits, it can hold decimal numbers between 0 and 2^8 - 1 = 255. Now if we write:

uint8 a = 255;
a++;

then this leads to an overflow, since a’s maximum value is 255.

At the other end, underflow errors affect smart contracts in the exact opposite direction. Taking a uint8 variable again:

uint8 a = 0;
a--;

We have now caused an underflow, which makes a wrap around to its maximum value of 255.
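The wrap-around behavior described above can be sketched in Python by reducing every result modulo 2^8. The `uint8_add`, `uint8_sub`, and `safe_sub` helpers below are illustrative, not Solidity semantics verbatim:

```python
# Sketch: simulating Solidity uint8 wrap-around arithmetic in Python.
# A uint8 holds values 0..255; unchecked additions and subtractions wrap
# modulo 2**8, which is exactly the overflow/underflow hazard described above.
UINT8_MOD = 2 ** 8  # 256

def uint8_add(a: int, b: int) -> int:
    """Unchecked uint8 addition: overflows wrap around past 255."""
    return (a + b) % UINT8_MOD

def uint8_sub(a: int, b: int) -> int:
    """Unchecked uint8 subtraction: underflows wrap around below 0."""
    return (a - b) % UINT8_MOD

# Overflow: 255 + 1 wraps to 0, like an odometer rolling over.
assert uint8_add(255, 1) == 0
# Underflow: 0 - 1 wraps to the maximum value, 255.
assert uint8_sub(0, 1) == 255

def safe_sub(a: int, b: int) -> int:
    """SafeMath-style subtraction: raise instead of silently wrapping."""
    if b > a:
        raise ValueError("subtraction underflow")
    return a - b
```

The `safe_sub` helper mirrors the SafeMath-style defense discussed below: rather than letting a balance silently wrap to 255 tokens, the checked operation aborts the transaction.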

Underflow errors are more probable, since users are less likely to possess a large quantity of tokens. The Proof of Weak Hands Coin (POWH) scheme by 4chan’s business and finance imageboard /biz/ suffered an $800k loss overnight in March 2018 because of an underflow attack. Building and auditing secure mathematical libraries that replace the customary arithmetic operators is a sensible defense against these attacks.

The 51% attack is also prevalent in the world of cryptocurrency: a group of miners controlling more than 50% of the network’s mining hashrate can control all new transactions. Similarly, external contract referencing exploits Ethereum’s ability to reuse code from, and interact with, already existing contracts by hiding malevolent actors in these interactions.

Smart contract auditing that combines the attention of manual code analysis and the efficiency of automated analysis is indispensable in preventing such attacks.
Solving the Conundrum

Fixes to such security risks in blockchain-based environments are very much possible. A process-oriented approach with agile quality assurance (QA) models is a must. Robust automation frameworks are also crucial in weeding out coding errors and thereby strengthening smart contracts.

In the case of reentrancy attacks, avoiding external calls is a good first step. So is inserting a mutex, a state variable to lock the contract during code execution. This will block reentry calls. All logic that changes state variables should occur before an external call. Correct auditing in this instance will ensure these steps are followed. In the case of overflow and underflow attacks, the right auditing tools will build mathematical libraries for safe math operations. The SafeMath library on Solidity is a good example.
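The mutex-plus-ordering defense can be sketched as a simulation in Python rather than Solidity. The `Bank` class and the attacker callback below are hypothetical stand-ins for a contract and a malicious external call, meant only to show why the lock and the "effects before interaction" ordering stop a reentrant withdrawal:

```python
# Sketch (Python simulation, not contract code): a withdraw guarded by a
# mutex state variable, with state changes performed before the external call.
class Bank:
    def __init__(self):
        self.balances = {}
        self._locked = False  # mutex: True while a withdrawal is in flight

    def deposit(self, who, amount):
        self.balances[who] = self.balances.get(who, 0) + amount

    def withdraw(self, who, send):
        if self._locked:
            raise RuntimeError("reentrant call blocked")
        self._locked = True
        try:
            balance = self.balances.get(who, 0)
            # Effects before interaction: zero the balance first...
            self.balances[who] = 0
            # ...then make the "external call" last.
            send(balance)
        finally:
            self._locked = False

bank = Bank()
bank.deposit("attacker", 100)
stolen = []

def malicious_send(amount):
    """Attacker callback: tries to re-enter withdraw() mid-call."""
    stolen.append(amount)
    if amount > 0:
        try:
            bank.withdraw("attacker", malicious_send)
        except RuntimeError:
            pass  # the mutex blocks the nested call

bank.withdraw("attacker", malicious_send)
# The attacker receives their 100 exactly once; the re-entrant attempt fails.
assert sum(stolen) == 100
```

Note that either defense alone would suffice here: the mutex rejects the nested call outright, and even without it, zeroing the balance before the external call means a re-entrant withdraw would find nothing left to take.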

To prevent external contract referencing, even a step as simple as using the ‘new’ keyword to create contracts may be skipped in the absence of proper auditing. Incidentally, this one step ensures that a fresh instance of the referred contract is created at execution time, so an attacker cannot substitute the original contract with anything else without changing the smart contract itself.

Magic BlockchainQA’s pioneering QA model has created industry-leading service level agreements (SLAs). Our portfolio of auditing services leverages our expertise in independently verifying blockchain platforms. This ensures decreased losses on investments for fintech firms, along with end-to-end integration, security, and performance, and crucially, it will help usher in widespread acceptance of blockchain-based platforms. As blockchain-based environments constantly evolve, we evolve as well, tackling new challenges and threats while ensuring that our tools can conduct impeccable auditing of these contracts.

Blockchain technology first came with the promise of unprecedented security. Through correct auditing practices, we can fulfill that original promise. At Magic BlockchainQA, we aim to take that promise to its completion every single time.

The Underlying Process of Predictive Analysis

Predictive Analysis – What is it?

Whenever you hear the term “Predictive Analysis”, a question pops up in your mind: “Can we predict the future?”. The answer is “no”; the future is still a beautiful mystery, as it should be. However, predictive analysis does forecast the possibility of a future event with an acceptable percentage of deviation from the actual result. In business terms, predictive analysis is used to examine historical data and interpret risks and opportunities for the business by recognizing trends and behavioral patterns.

Predictive analysis is one of three forms of data analysis, the other two being descriptive analysis and prescriptive analysis. Descriptive analysis examines historical data and evaluates current metrics to tell whether the business is doing well; predictive analysis forecasts future trends; and prescriptive analysis provides viable solutions to a problem along with their impact on the future. In simpler words, descriptive analysis identifies the problem or scenario, predictive analysis defines the likelihood of the problem or scenario and why it could happen, and prescriptive analysis explores the various solutions and their consequences for the betterment of the business.


Predictive Analysis process

Predictive analysis uses multiple variables to define the likelihood of a future event with an acceptable level of reliability. Let’s have a look at the underlying process:

Requirement – Identify what needs to be achieved

This is the preliminary step of the process, where what needs to be achieved (the requirement) is identified, as it paves the way for the data exploration that is the building block of predictive analysis. It explains what a business needs to do, vis-à-vis what is being done today, to become more valuable and enhance its brand. This step defines which type of data is required for the analysis; the analyst can take the help of domain experts to determine the data and its sources.

  1. Clearly state the requirement, goals, and objective.
  2. Identify the constraints and restrictions.
  3. Identify the data set and scope.

Data Collection – Ask the right question

Once you know the sources, the next step is to collect the data, and one must ask the right questions to do so. For example, to build a predictive model for stock analysis, the historical data must contain prices, volumes, etc., but one must also consider how useful social network analysis would be for discovering behavioral and sentiment patterns.

Data Cleaning – Ensure Consistency

Data may be fetched from multiple sources. Before it can be used, it needs to be normalized into a consistent format. Data cleaning normally includes:

  1. Normalization – convert data into a consistent format
  2. Selection – search for outliers and anomalies
  3. Pre-Processing – search for relationships between variables; generalize the data to form groups and/or structures
  4. Transformation – fill in missing values

Data cleaning removes errors and ensures the consistency of data. If the data is of high quality, clean, and relevant, the results will be sound; this is, in fact, a case of “garbage in, garbage out”. Data cleaning supports better analytics as well as all-round business intelligence, which facilitates better decision making and execution.
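The cleaning steps above can be sketched in plain Python on a hypothetical list of price records. All values and thresholds below are made up for illustration:

```python
# Sketch of basic data cleaning: fill missing values, drop outliers, and
# normalize. `None` marks a missing record; 5000.0 is an obvious outlier.
raw = [101.5, 99.8, None, 102.1, 5000.0, 100.4]

# 1. Transformation: fill missing values with the median of the known ones
#    (the median resists distortion by the outlier, unlike the mean).
known = sorted(x for x in raw if x is not None)
median_known = known[len(known) // 2]
filled = [x if x is not None else median_known for x in raw]

# 2. Selection: drop outliers, here anything more than 3x the median.
ordered = sorted(filled)
median = ordered[len(ordered) // 2]
cleaned = [x for x in filled if x <= 3 * median]

# 3. Normalization: rescale the remaining values into [0, 1].
lo, hi = min(cleaned), max(cleaned)
normalized = [(x - lo) / (hi - lo) for x in cleaned]

assert 5000.0 not in cleaned          # outlier removed
assert all(0.0 <= x <= 1.0 for x in normalized)
```

Real pipelines would use a library such as pandas for this, but the three steps (fill, filter, rescale) are the same regardless of tooling.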

Data collection and cleaning, as described above, require asking the right questions. Volume and variety describe the data collection results, but another important factor to focus on is data velocity. Data must not only be acquired quickly but also processed at a good rate for faster results. Some data has a limited lifetime and will not serve its purpose for long; any delay in processing it would require acquiring new data.

Analyze the data – Use the correct model

Once we have data, we need to analyze it to find hidden patterns and forecast results. The data should be structured in a way that lets us recognize patterns and identify future trends.

Predictive analytics encompasses a variety of statistical techniques, from traditional methods such as data mining and statistics to advanced methods like machine learning and artificial intelligence, which analyze current and historical data to put a numerical value on the likelihood of a scenario. Traditional methods are normally used where the number of variables is manageable; AI and machine learning are used to tackle situations with a large number of variables. Over the years, the computing power available to organizations has increased many-fold, which has driven the focus on machine learning and artificial intelligence.

Traditional Methods:

  1. Regression Techniques: Regression is a mathematical technique used to estimate the cause and effect relationship among variables.

In business, key performance indicators (KPIs) are the measure of the business, and regression techniques can be used to establish the relationship between a KPI and variables such as economic or internal parameters. Normally, two types of regression are used to find the probability of occurrence of an event:

  1. Linear Regression
  2. Logistic Regression
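As a sketch of the first of these, simple linear regression can be fit in closed form with ordinary least squares. The KPI data below is made up purely for illustration:

```python
# Minimal sketch: ordinary least squares for simple linear regression,
# fitting y = a + b*x in pure Python (no libraries).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x  # intercept
    return a, b

# Hypothetical data: a driver variable (x) vs a KPI (y), exactly linear here.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # generated from y = 1 + 2x
a, b = fit_line(xs, ys)
assert abs(a - 1.0) < 1e-9 and abs(b - 2.0) < 1e-9
```

Logistic regression follows the same spirit but passes the linear combination through a sigmoid to yield a probability, which is why it suits event-occurrence questions.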

Time Series Analysis

A time series is a series of data points indexed, listed, or graphed in time order.

Decision Tree

Decision Trees are used to solve classification problems. A Decision Tree determines the predictive value based on a series of questions and conditions.
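A decision tree is, at bottom, a series of questions. The toy tree below uses hypothetical attributes and thresholds to classify loan risk; a real tree would be learned from data rather than hand-written:

```python
# Sketch: a decision tree as nested questions. The attributes (income,
# existing_debt) and the cut-off values are invented for illustration.
def predict_risk(income: float, existing_debt: float) -> str:
    """Walk the tree from the root question down to a leaf label."""
    if income < 30000:
        return "high"
    if existing_debt / income > 0.5:
        return "high"
    if income < 60000:
        return "medium"
    return "low"

assert predict_risk(20000, 1000) == "high"     # low income branch
assert predict_risk(50000, 5000) == "medium"   # mid income, low debt ratio
assert predict_risk(100000, 10000) == "low"    # high income, low debt ratio
```

Tree-learning algorithms such as CART choose these questions automatically, picking at each node the split that best separates the training labels.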

Advanced Methods – Artificial Intelligence / Machine Learning

Special Purpose Libraries

Nowadays, many open-source frameworks and special-purpose libraries are available for developing models. Users can use them to perform mathematical computations and view data flow graphs. These libraries can handle everything from pattern recognition to image and video processing, and can run on a wide range of hardware. They can help with:

  1. Natural Language Processing (NLP). Natural language refers to how humans communicate with each other in day-to-day activities. It could be words, signs, or electronic data such as emails and social media activity. NLP refers to analyzing this unstructured or semi-structured data.
  2. Computer Vision


Several algorithms used in machine learning include:

1. Random Forest

Random Forest is one of the most popular ensemble methods in machine learning. It uses a combination of several decision trees as a base and aggregates their results. These decision trees each use one or more distinct factors to predict the output.
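The aggregation step can be sketched as a majority vote over several trees. The three stub trees below stand in for real learned trees, and the transaction fields (`amount`, `foreign`) are invented for illustration:

```python
# Sketch: the voting step of a random forest. Each "tree" here is a stub
# rule; a real forest would learn many trees from bootstrapped data samples.
from collections import Counter

def tree_a(x): return "fraud" if x["amount"] > 900 else "ok"
def tree_b(x): return "fraud" if x["foreign"] else "ok"
def tree_c(x): return "fraud" if x["amount"] > 500 and x["foreign"] else "ok"

def forest_predict(x, trees=(tree_a, tree_b, tree_c)):
    """Return the label that the most trees voted for."""
    votes = Counter(tree(x) for tree in trees)
    return votes.most_common(1)[0][0]

assert forest_predict({"amount": 950, "foreign": True}) == "fraud"
assert forest_predict({"amount": 100, "foreign": False}) == "ok"
```

Averaging many diverse trees is what gives the forest its robustness: a single tree's quirky rule gets outvoted by the rest.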

2. Neural Networks (NN)

The approach of a neural network is to solve a problem in a way similar to how the human brain would. NNs are widely used in speech recognition, medical diagnosis, pattern recognition, spell checking, paraphrase detection, and more.

3. K-Means

K-Means is used to solve the clustering problem: it finds a fixed number (k) of clusters in a set of data. It is an unsupervised learning algorithm, meaning it works without labeled examples or specific supervision.
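The two alternating steps of k-means (assign each point to its nearest centroid, then move each centroid to the mean of its points) can be written out in a few lines of pure Python. One-dimensional points, naive initialization, and all values below are illustrative:

```python
# Sketch: one-dimensional k-means with k clusters, showing the two
# alternating steps: assignment, then centroid update.
def kmeans_1d(points, k=2, iterations=10):
    centroids = points[:k]  # naive initialization: first k points
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:  # assignment step: nearest centroid wins
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [  # update step (keep old centroid if cluster is empty)
            sum(c) / len(c) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

# Two obvious groups: values around 1 and values around 10.
centroids = sorted(kmeans_1d([1.0, 1.2, 0.8, 9.8, 10.0, 10.2]))
assert abs(centroids[0] - 1.0) < 0.1 and abs(centroids[1] - 10.0) < 0.1
```

Production implementations add smarter initialization (e.g. k-means++) and convergence checks, but the core loop is exactly this.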

Interpret result and decide

Once the data is extracted, cleaned, and checked, it’s time to interpret the results. Predictive analytics has come a long way and goes beyond merely presenting the predicted results and benefits: it provides the decision-maker with an answer to the query “Why will this happen?”.

A few use cases where predictive analysis could be useful for a FinTech business

Compliance – Predictive analysis can be used to detect and prevent trading errors and system oversights. The data can be analyzed to monitor behavioral patterns and prevent fraud. Predictive analytics can help companies conduct better internal audits, identify applicable rules and regulations, and improve the accuracy of audit selection, thus reducing fraudulent activity.

Risk Mitigation – Firms can monitor and analyze operational data to detect error-prone areas, reduce outages, and avoid reacting late to events, thus improving efficiency.

Improving customer service – Customers have always been the center of business. Online reviews, sentiment analysis, social media data analysis could help the business to understand customer behavior and re-engineer their product with tailored offerings.

Being able to predict how customers, industries, markets, and the economy will behave in certain situations can be incredibly useful for a business. Success depends on choosing the right data set with quality data and defining good models, where the algorithms explore the relationships between different data sets to identify patterns and associations. However, FinTech firms have their own challenges in managing data, caused by data silos and incompatible systems. Data sets are becoming so large that it is difficult to analyze them for patterns and to manage risk and return.

Predictive Analysis Challenges

Data Quality / Inaccessible Data

Data quality is still the foremost challenge faced by predictive analysts. Poor data will lead to poor results, while good data will help shape major decision-making.

Data Volume / Variety / Velocity

Many problems in predictive analytics belong to the big data category. The volume of data generated by users can run into petabytes and can challenge existing computing power. With increasing Internet penetration and autonomous data capture, the velocity of data is also rising. As volume and velocity increase, traditional methods like regression models become unstable for analysis.

Correct Model

Defining a correct model can be tricky, especially when much is expected from it, and the same model may need to serve different purposes. Sometimes it does not make sense to create one large, complex model; rather than a single model to cover it all, a large number of smaller models can together deliver better understanding and predictions.

The right set of people

Data analytics is not a one-man-army show. It requires the correct blend of domain knowledge and data science knowledge: data scientists should be able to ask the right what-if questions of domain experts, and domain experts should be able to verify the model against appropriate findings. This is where Magic FinServ can bring value to your business, with the right blend of domain expertise and data science experts to deliver intelligence and insights from data using predictive analytics.

Magic FinServ – Value we bring using Predictive Analysis

Magic Finserv Offerings

Magic FinServ has designed a set of offerings specifically built to solve the unstructured and semi-structured data problem for the financial services industry.

Market Information – Research reports, news, business and financial journals, and websites providing market information generate massive amounts of unstructured data. Magic FinServ provides products and services to tag metadata and extract valuable, accurate information, helping our clients make timely, accurate, and informed decisions.

Trade – Trading generates structured data; however, there is huge potential to optimize operations and make automated decisions. Magic FinServ has created tools, using machine learning and NLP, to automate several process areas, like trade reconciliations, to help improve the quality of decision making and reduce effort. We estimate that almost 33% of the effort can be eliminated in almost every business process in this space.

Reference data – Reference data is structured and standardized; however, it tends to generate several exceptions that require proactive management. Organizations spend millions every year to run reference data operations. Magic FinServ uses machine learning tools to help operations teams reduce the effort in exception management, improve the quality of decision making, and create a clean audit trail.

Client/Employee data – Organizations often do not realize how much client-sensitive data resides on desktops and laptops. Recent regulations like GDPR now make it binding to check this menace. Most of this data is semi-structured and resides in Excel files, Word documents, and PDFs. Magic FinServ offers products and services that help organizations identify the quantum of this risk and then take remedial action.

Choose Your Cloud Partner Wisely

Gartner Says By 2020, a Corporate “No-Cloud” Policy Will Be as Rare as a “No-Internet” Policy Is Today.

With cloud adoption rates rising, it has become instrumental for organizations to build a robust cloud migration strategy.

As per the Commvault report, cloud Fear of Missing Out (FOMO) is driving business leaders to move full speed ahead towards the cloud.

Many organizations are already moving part of their applications to the cloud, or planning to move all of them. Apart from the reliability, scalability, cost optimization, and security benefits, recent disruptions in cognitive technologies like AI, ML, and blockchain are among the driving factors for embracing the cloud as a core IT strategy. Most cloud providers now offer attractive, easy-to-implement AI/ML platforms along with other multidimensional benefits.

However, there are many cloud providers and many cloud services available in the market.

Who is the best service provider? Which service model should be fit for the organizations?

The answers are not one-word choices or a checklist; finding them is a process that must be designed around your business goals.

Hence, choose a cloud service provider who can work as a partner, not as a vendor.

This is a journey through the learning curve for both partners.

I am highlighting a few aspects which must be considered when selecting a cloud partner:

Define your migration strategy – IaaS vs PaaS vs SaaS. You need to select the right partner for platform, infrastructure, and application services. Sometimes you may need to work with multiple providers for different services, or you can have one combined managed-services partner.

The diagram above, from Gartner, shows how ownership is shared in the various cloud service models. For example, if you have a best-in-class application services team, you can procure infrastructure or platform services from a provider and align the internal team to run the cloud service. This requires extensive cloud training for the existing team, hiring some cloud experts to build in-house capability, and a robust service management process to coordinate among different vendors; your cloud partner should take on the role of training partner in such cases. In a SaaS-based model, it is essential that the cloud partner knows your business and industry well, because the cloud service will ultimately be fully integrated with the business model; hence it is very important that the SaaS provider is fully aligned with your business needs. The overall selection criteria can be designed by analyzing and comparing the factors below.

The provider must be knowledgeable about your application, data, interfaces, compliance, security, BCP/DR, and other business requirements. Critical success factors have been defined in various models; however, as per our study, we have listed seven key success factors for cloud computing.

Cloud Partner – the most important step towards success. Choose a partner who will help in your journey.

Cloud Strategy –

  • Create a plan & solution architecture
  • Define the cloud applications and services
  • Prepare the service catalogue
  • Build the capability and processes

Cost & performance – one of the most important success criteria.

  • Plan cost and ROI
  • Benchmark the performance
  • Proactive monitoring
  • Capacity planning
  • Right-sizing & optimization

Security –

  • Build the security strategy – secure all the layers and components
  • Automation, tooling and proactive monitoring
  • Plan the audit, compliance reporting & certification

Contract & SLA –

  • Incorporate all the aspects of the contract carefully with the legal help
  • Build customer and suppliers terms properly
  • Define Service SLA & service credits
  • Manage the contract (an ongoing process)

Automation –

  • Have an automation strategy
  • From Infrastructure to Application – automate the repetitive work
  • Increase the response and resolution
  • Reduce the human error

Manage the stakeholders –

  • Cloud adoption changes the organizational structure and IT landscape drastically.
  • Manage your stakeholders throughout the journey.
  • Assess the impact of positive and negative stakeholders on the project.

A managed service provider is the ideal solution in today’s complex world. At MagicFinServ, we help global FinTech companies build successful SaaS models. Our highly skilled cloud team can align all the moving parts, from architecture to implementation, and deliver a production-ready solution. To know more about our FinTech-focused cloud solutions, please contact us at

5 ways in which Machine Learning can impact FinTech

Machine learning is one of those technologies that is always around us, even when we might not realize it. For instance, machine learning is employed to decide whether an email we received is spam or genuine, how cars can drive themselves, and which product someone is likely to purchase. Every day we see these sorts of machine learning solutions in action: when a mail arrives, it is automatically scanned and, if appropriate, marked as spam and moved to the spam folder. For the past few years, Google, Tesla, and others have been building self-driving systems that may soon augment or replace the human driver. And data giants like Google and Amazon use your search history to predict which things you are looking to buy and ensure you see ads for those things on each webpage you visit. All this useful, and sometimes annoying, behavior is the result of artificial intelligence.

This brings up the key characteristic of machine learning: the system figures out how to solve the problem from example data, instead of us writing specific logic. This is a noteworthy departure from how most programming is done. In more traditional programming, we deliberately examine the problem and write code; this code reads in data and uses its predefined logic to identify the correct branches to execute, which then produce the correct result.

Machine Learning and Conventional Programming

With conventional programming, we use code constructs like if statements, switch-case statements, and control loops implemented with while, for, and do statements. Every one of these statements has conditions that must be defined, and the dynamic data typical of machine learning problems can make defining those conditions very troublesome. With machine learning, by contrast, we do not write the logic that produces the results. Instead, we gather the data we need and modify its format into a form which machine learning can use. We then pass this data to an algorithm, which analyses it and creates a model that implements the solution to the problem based on the data.

Machine Learning: High-Level View

We initially start with lots of data, data that contains patterns. That data goes into the machine learning algorithm, which finds one or several patterns; a predictive model is the outcome of this process. The model embodies the business logic that identifies the probable patterns in new data, and the application supplies new data to the model to see whether it identifies the known pattern. In the case that we took, new data could be the data of more transactions, and identifying probable patterns means the model should predict whether those transactions are fraudulent.
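The pipeline just described can be sketched end to end in a few lines. The "algorithm" below is deliberately tiny and hypothetical: from labeled historical transactions it learns a single amount threshold that best separates fraud from non-fraud, and the resulting model is then applied to new transactions. All data values are invented for illustration:

```python
# Sketch of the data -> algorithm -> model -> prediction flow described above.
def train(history):
    """history: list of (amount, is_fraud). Learn the best amount threshold."""
    candidates = sorted(amount for amount, _ in history)
    def accuracy(t):
        # Count how many historical labels the rule "amount >= t" gets right.
        return sum((amount >= t) == is_fraud for amount, is_fraud in history)
    return max(candidates, key=accuracy)

def model(threshold, amount):
    """The learned model: flag a new transaction as fraud if amount >= threshold."""
    return amount >= threshold

# Labeled historical data (the "lots of data with patterns").
history = [(50, False), (80, False), (120, False), (900, True), (1500, True)]
threshold = train(history)

# The application feeds new transactions to the model.
assert model(threshold, 2000) is True   # flagged as probable fraud
assert model(threshold, 60) is False    # looks like a normal transaction
```

Real fraud models learn over many features at once (merchant, geography, timing), but the flow is identical: labeled history in, a fitted decision rule out, new data scored against it.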

Machine Learning and FinTech
FinTech is one of the industries that could be most affected by machine learning, and it can leverage the technology for better predictions and risk analysis in finance applications. The following are five areas where machine learning could impact finance applications, making financial technology smarter at tasks such as fraud detection, algorithmic trading, and portfolio management.

Risk Management
Applying a predictive model to large volumes of real-time data lets a machine learning algorithm draw on far more data points than manual analysis can. Traditional risk management relied on checking structured data against fixed rules, and was therefore constrained to structured data, yet more than 90% of available data is unstructured. Deep learning can process unstructured data and does not depend solely on static information from loan applications or other financial reports. Predictive analysis can even anticipate how current market trends may affect a loan applicant's financial status.

Internet Banking Fraud
Another example is detecting internet banking fraud. If fraudulent fund transfers keep occurring via internet banking and we have the complete data, we can find the pattern involved and identify the loopholes or hack-prone areas of the application. It is all about recognizing patterns and predicting outcomes from them. Machine learning plays an important role in data mining, image processing, and language processing. It cannot always provide a correct analysis or a perfectly accurate result, but it gives a predictive model, based on historical data, that supports decision-making. The more data available, the better the predictions that can be made.
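One common way to surface suspicious transfers without labeled fraud examples is unsupervised anomaly detection. The sketch below assumes scikit-learn's IsolationForest and uses fabricated transfer amounts; the contamination rate and data are illustrative only.

```python
# Sketch: unsupervised anomaly detection over transfer amounts.
from sklearn.ensemble import IsolationForest

# Mostly routine transfer amounts, with one extreme outlier.
transfers = [[120], [90], [110], [105], [95], [100], [98], [50000]]

# contamination is the assumed share of anomalies in the data.
detector = IsolationForest(contamination=0.2, random_state=0)
detector.fit(transfers)

labels = detector.predict([[101], [48000]])  # 1 = normal, -1 = anomalous
print(labels)
```

In practice the feature set would include far more than the amount (time, device, geolocation, payee history), but the pattern-finding principle is the same.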

Sentiment Analysis
One of the areas where machine learning can play an important role is sentiment or news analysis. Future machine learning applications need no longer depend only on data from trades and stock prices. Traditionally, human intuition about financial activity relied on trade and stock data to discover new trends. Machine learning can be extended to analyze social media and news trends for sentiment analysis: algorithms computationally identify and categorize the opinions expressed by users and feed them into predictive analysis. The more data, the more accurate the predictions.
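At its simplest, categorizing opinions in text can be done with a lexicon of positive and negative terms. The toy scorer below is a pure-Python illustration of that idea; the word lists are invented for the example, and production sentiment models use learned classifiers rather than hand-picked vocabularies.

```python
# Toy lexicon-based sentiment scorer for financial headlines.
# Word lists are illustrative only, not a real sentiment lexicon.
POSITIVE = {"gain", "growth", "beat", "strong", "upgrade"}
NEGATIVE = {"loss", "fraud", "miss", "weak", "downgrade"}

def sentiment(headline: str) -> str:
    words = headline.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Strong earnings beat forecasts"))  # positive
print(sentiment("Regulator probes fraud loss"))     # negative
```

A learned model replaces the fixed word lists with weights estimated from labeled headlines, which is where the "more data, more accuracy" effect comes from.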

Robo-Advisors
Robo-advisors are digital platforms that calibrate a financial portfolio, providing planning services with minimal human intervention. Users furnish details such as their age, current income, and financial status, and expect the robo-advisor to predict the kinds of investments they should make, given current and projected market trends, to meet their retirement goals. The advisor processes this request by spreading investments across financial instruments and asset classes to match the user's goals. The system responds in real time to changes in the user's goals and in market trends, performing predictive analysis to find the best match for the user's investments. Robo-advisors may in future largely displace the human advisors who charge for these services.
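The mapping from user details to an allocation can be sketched with the well-known "100 minus age" rule of thumb for equity exposure, tilted by a risk-tolerance input. The function, its parameters, and the tilt factor are all hypothetical simplifications; real robo-advisors use much richer optimization over market data.

```python
# Toy robo-advisor allocation using the "100 minus age" heuristic.
# The risk tilt and bounds are illustrative assumptions, not advice.
def allocate(age: int, risk_tolerance: float) -> dict:
    """risk_tolerance in [0, 1] tilts the baseline equity share."""
    equity = min(90, max(10, (100 - age) * (0.8 + 0.4 * risk_tolerance)))
    return {"equities": round(equity), "bonds": round(100 - equity)}

print(allocate(age=30, risk_tolerance=0.5))  # {'equities': 70, 'bonds': 30}
print(allocate(age=60, risk_tolerance=0.2))  # {'equities': 35, 'bonds': 65}
```

A production system would revise such an allocation continuously as the user's goals and market conditions change, which is where the predictive-analysis component comes in.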

Secure Authentication
The highest concern for banks and other financial institutions is the security of users and their details, which, if leaked, expose them to hacking and ultimately to financial loss. Traditionally, the system provides a username and password for secure access, falling back on security questions or mobile-number validation when a password is lost or an account must be recovered. Using AI, one could develop an anomaly detection application that relies on biometric data such as facial recognition, voice recognition, or a retina scan. This becomes possible by applying predictive analysis over large amounts of biometric data and iteratively refining the models for more accurate predictions.

How Can Magic FinServ help?

Magic FinServ is aggressively working on visual analytics and artificial intelligence, leveraging machine learning to solve business problems such as financial analysis, portfolio management, and risk management. As a financial services provider, Magic FinServ can foresee the impact of machine learning and predictive analysis on financial services and financial technology. The technology business unit at Magic uses technologies such as Python, Big Data, and Azure Cognitive Services to develop and deliver innovative solutions. Data scientists and technical architects at Magic work hand in hand to provide consulting and to develop forward-looking financial technology services.