Enterprise Blockchain Consortiums – Part 2

Image Credit: Wayne Bishop (https://goo.gl/EewTRW)

 

This is the second part of our article about Blockchain consortiums focusing on the enterprise market. Part I of this article can be found here.

In Part I, we covered the Linux Foundation’s HyperLedger Project, which acts as an umbrella for Fabric, Intel Sawtooth Lake, and Iroha. We also covered R3’s Corda and Digital Asset Holdings’ offerings. In this part we will cover the following three players:

  1. Chain
  2. Ripple
  3. Enterprise Ethereum

Chain

Chain is a Silicon Valley startup that provides Enterprise Blockchain solutions. Chain offers an enterprise-grade platform called Chain Core that allows companies to build and launch their own permissioned blockchains. Chain Core is built on the Chain Protocol, which defines how assets are issued, transferred, and controlled on a blockchain network. It allows a single entity or a group of organizations to operate a network, supports the coexistence of multiple types of assets, and is interoperable with other independent networks.

The Chain Core SDK is available in three languages (Java, Ruby, and Node.js) and provides a robust set of functionality for creating enterprise applications that require a permissioned Blockchain. Chain has put a significant amount of effort into building and documenting the APIs. By abstracting the technical details that impose a steep learning curve when learning DLT, Chain has done a great job of providing simplified libraries that will have an enterprise developer up and running in minimal time.
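
To give a flavor of how simple this is, here is a rough sketch of issuing an asset with the Node.js SDK. It is modeled on the patterns in Chain’s published examples, but treat the method names as illustrative rather than authoritative – consult Chain’s own documentation for the definitive API:

```javascript
// Illustrative sketch modeled on Chain Core's documented Node.js SDK.
// Assumes a running Chain Core with an asset ('gold') and an account
// ('alice') already created, and keys held in the (mock) HSM.
const chain = require('chain-sdk');

const client = new chain.Client();    // connects to the local Chain Core
const signer = new chain.HsmSigner(); // signs transactions with HSM-held keys

client.transactions.build(builder => {
  // Issue 100 units of the asset...
  builder.issue({ assetAlias: 'gold', amount: 100 });
  // ...and place them under the control of an account.
  builder.controlWithAccount({ accountAlias: 'alice', assetAlias: 'gold', amount: 100 });
})
  .then(template => signer.sign(template))            // sign the partial transaction
  .then(signed => client.transactions.submit(signed)) // submit to the network
  .then(tx => console.log('submitted transaction', tx.id));
```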


Image Credit: Chain.com

Chain could have stopped at just releasing the developer documentation – but they went an extra step and open sourced the Chain Protocol as well. This allows third-party developers to inspect the protocol specification and build additional adapters and bridges to other popular blockchain networks and solutions in the market.

Chain has demonstrated its success by building a fully permissioned Blockchain-based solution for Visa called B2B Connect, which gives financial institutions a simple, fast, and secure way to process business-to-business payments globally. Chain has also gathered investments from NASDAQ, Fiserv, Capital One, Citi, Fidelity, First Data, and Orange, among others. With its developer- and enterprise-friendly platform, Chain provides an easy entry to DLT technologies for enterprises that are looking to experiment with Blockchains.

The foundation of the Chain platform includes the Chain Virtual Machine (CVM), which executes smart contract programs written in Java, Ruby, or Node.js. These high-level smart contracts/programs allow business logic to be executed on chain using the CVM instruction set. The programs are Turing complete; to guarantee that a contract does not get stuck in an infinite loop, the CVM terminates any contract that exceeds a preset run limit (similar to the ‘gas’ that powers Ethereum smart contracts).
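
The run-limit mechanism is easy to picture: every instruction carries a cost, and execution halts once the budget is spent. Here is a toy sketch of such a metered loop (the instruction shape and costs are invented for illustration – this is not the actual CVM):

```javascript
// Toy illustration of a run-limit-metered VM loop (not the real CVM).
// Each instruction has a fixed cost; exceeding the budget aborts the
// contract, just as running out of gas aborts an Ethereum contract.
function execute(program, runLimit) {
  let remaining = runLimit;
  for (const instruction of program) {
    remaining -= instruction.cost;
    if (remaining < 0) {
      throw new Error('run limit exceeded'); // contract terminated
    }
    instruction.apply(); // perform the operation
  }
}

// Usage: a program of costed ops runs to completion or until the budget runs out.
execute([{ cost: 1, apply: () => console.log('op') }], 10);
```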


Image Credit: Construct 2017, Coindesk

Ripple/Interledger:

Ripple is one of the first distributed ledger technology companies to successfully integrate with financial institutions around the world to solve the problem of slow cross-border payments. Sending money from one country to another can be a frustrating experience even by today’s standards. Almost all of the banks in the world today use the SWIFT network to move money across borders.

An average SWIFT payment takes days, not hours, to settle. A bank-to-bank transfer usually takes 1-2 days (taking time differences into consideration, if applicable). Depending on the size of the bank and its presence in the receiving country, a correspondent bank may also be involved.

Ripple attempts to eliminate this delay by providing near real-time settlement for cross-border money movement. In their own words:

Ripple’s solution is built around an open, neutral protocol (Interledger Protocol or ILP) to power payments across different ledgers and networks globally. It offers a cryptographically secure end-to-end payment flow with transaction immutability and information redundancy. Architected to fit within a bank’s existing infrastructure, Ripple is designed to comply with risk, privacy and compliance requirements.

Interledger Protocol (ILP) – ILP serves as the backbone of Ripple’s technology. ILP borrows many battle-tested ideas from existing Internet standards (RFC 1122, RFC 1123, and RFC 1009). It is an open suite of protocols for connecting ledgers of different types – from digital wallets to national payment systems and other blockchains. A detailed overview of the ILP protocol can be found in a white paper here.
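
At the heart of ILP is the idea of conditional transfers: each ledger along the path holds funds in escrow against a cryptographic condition, and the receiver’s secret “fulfillment” releases every hop, so no participant can lose money mid-route. Here is a minimal sketch of that hashlock mechanism (conceptual only – not the actual interledger.js API):

```javascript
const crypto = require('crypto');
const sha256 = buf => crypto.createHash('sha256').update(buf).digest('hex');

// The receiver picks a secret fulfillment; its hash is the condition
// attached to the escrowed transfer on every ledger along the path.
const fulfillment = crypto.randomBytes(32);
const condition = sha256(fulfillment);

// A ledger holds the transfer in escrow until the condition is met.
const escrowedTransfer = { amount: 100, condition, released: false };

// Presenting the correct preimage releases the funds; anything else
// is rejected, so intermediaries never pay out without being paid.
function executeTransfer(transfer, preimage) {
  if (sha256(preimage) !== transfer.condition) {
    throw new Error('invalid fulfillment');
  }
  transfer.released = true;
}

executeTransfer(escrowedTransfer, fulfillment);
console.log('escrow released:', escrowedTransfer.released); // true
```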

Ripple Connect – Ripple Connect acts as the glue connecting the various ledgers operated by FI clients around the world. By linking the ledgers of FIs through ILP for real-time settlement of cross-border payments, it preserves each financial institution’s ledger and transaction privacy. It also provides a way for banks to exchange originator and beneficiary information, fees, and the estimated delivery time of a payment before it is initiated, thus providing transactional visibility.

Image Credit: ripple.com

Ripple has also built a sample ILP client for developers, along with supporting documentation that shows how to use ILP. A good overview of the Interledger development documentation can be found here. Complete documentation on developing for the Ripple platform can be found here.

Ripple has successfully proven that DLT can be used to solve the real-world problem of cross-border payments. They are one of the few companies that have managed to acquire a BitLicense – a virtual currency license from the New York State Department of Financial Services.

Here is a list of Ripple’s financial institution clients, who are either using the technology in production or testing it for cross-border money movement:


Image Credit: Ripple.com

Enterprise Ethereum:

Enterprise Ethereum is technically the new kid on the Enterprise Blockchain Consortium block; however, Ethereum as a platform has a robust base of credentials to bring to the fight. Enterprises have shown significant interest in Ethereum as a platform, and many innovation labs have been testing private deployments of Ethereum to understand the technology as well as the use cases it can solve.

Jeremy Millar from ConsenSys elegantly presents the merits of using Ethereum in an enterprise setting:

Ethereum is arguably, the most commonly used blockchain technology for enterprise development today. With more than 20,000 developers globally, the benefits of a public chain holding roughly $1bn of value, and an emerging open source ecosystem of development tools, it is little wonder that Accenture observed ‘every self-respecting innovation lab is running and experimenting with Ethereum’. Cloud vendors are also supporting Ethereum as a first class citizen: Alibaba Cloud, Microsoft Azure, RedHat OpenShift, Pivotal CloudFoundry all feature Ethereum as one of their, if not the primary blockchain offering.

Enterprise Ethereum (EE) brings forth a crucial difference between the technologies used by players in this space – Blockchain vs. Distributed Ledger. While the two terms are used interchangeably by many in the field, there is a subtle difference that catches folks off guard. Antony Lewis of R3 says:

All blockchains are distributed ledgers, but not all distributed ledgers are blockchains!

Enterprise Ethereum attempts to bring the technology behind Ethereum into enterprises, while the other players have redesigned the ideas behind Blockchains to fit enterprise needs. It is still early in the game to see how Enterprise Ethereum will evolve, as few details have been published beyond the launch-day webcast (7 hours). Around the first hour, Vitalik Buterin talks about the Ethereum roadmap.

We also see the alliance partners in various panels talking about what Ethereum in the enterprise means to them and why they are part of the effort. EE has launched with a significant number of important players in the space – most notably JP Morgan, who has also announced the launch of its Ethereum-based blockchain called Quorum.

Enterprise Ethereum Alliance members

Image Credit: http://entethalliance.org/

Conclusion:

The Enterprise Blockchain/Distributed Ledger space is starting to become competitive, with a good number of offerings. There are other players, like Multichain and Monax, that we have not covered here and that deserve a close watch. Just as relational databases changed the way enterprise applications are built, Blockchains and Distributed Ledgers will shape how the next generation of enterprise applications is designed and built.


Enterprise Blockchain Consortiums – Part 1

Image Credit: chuttersnap (https://goo.gl/qzVPS5)

2016 was a big year for Blockchain. The hype around Blockchain was palpable, and you saw news from all around the world about various initiatives picking up steam. While this entry is not meant to provide a comprehensive state of *all* of the consortiums out there, I am attempting to capture the ones that seem to have the most momentum and news coverage.

If you are serious about following the enterprise Blockchain scene, the following players merit your attention:

  1. HyperLedger
  2. R3CEV
  3. Digital Asset Holdings
  4. Chain
  5. Ripple
  6. Enterprise Ethereum (the newest entry on the block, announced as this post was being written).

HyperLedger – Hyperledger is an open source effort by the Linux Foundation to develop a blockchain platform and to support blockchain-based distributed ledgers. The term Hyperledger itself doesn’t point to one specific project; rather, it is a collection of various projects contributed by different members. The projects range from blockchain-based distributed ledgers to a Blockchain-as-a-Service toolkit to a blockchain explorer.

The Hyperledger ecosystem is built on the following tenets:

  • Data/decisions need to remain with project stakeholders
  • Pluggability to enable flexibility
  • Ability to trace transaction history
  • Utilize native SDKs as much as possible
  • Support for HSMs (Hardware Security Modules)
  • Support for ACLs
  • Eliminate single points of failure
  • Collaborate with projects within the Hyperledger ecosystem to pick the best design/architecture


Image Credit: Coin Desk Construct 2017

HyperLedger projects:

HyperLedger Fabric – An implementation of blockchain technology intended as a foundation for developing blockchain applications and solutions. Fabric offers a modular design allowing plug-and-play components like consensus and membership services. It leverages container technology to host smart contracts, called chaincode, that comprise the application logic. Fabric was incubated with contributions from IBM and Digital Asset Holdings (DAH), based on the successful experiment implemented at the F2F hackathon hosted by JPMC in NYC the week of March 21, 2016.

The fastest way to take Fabric for a test drive is to use IBM’s Bluemix free developer edition, which can be found here. On the other hand, if you are a glutton for shell scripts and the terminal, you can build your own Node.js/Vagrant-based Fabric client SDK development environment by following the instructions here. A Java SDK version is apparently in the works.

Iroha – Iroha has an interesting background – it comes from the land of the rising sun. It is a distributed ledger project designed to support infrastructural projects that require distributed ledger technology in a simple and easy way. Iroha is developed in C++ and emphasizes use in mobile environments and application development.

Iroha states its goal as providing the following C++ components that can be used across all Hyperledger projects:

  • Sumeragi consensus library
  • Ed25519 digital signature library
  • SHA-3 hashing library
  • Iroha transaction serialization library
  • P2P broadcast library
  • API server library
  • iOS library
  • Android library
  • JavaScript library
  • Blockchain explorer/data visualization suite

Iroha was added to the Hyperledger family with contributions from Soramitsu, Hitachi, NTT Data, and Colu. A full white paper explaining the design of Iroha can be found here. The client code can be found on GitHub here.

Sawtooth Lake – Sawtooth Lake is a blockchain ledger project from Intel. It is designed to be a modular platform for building and running distributed ledgers. By using a pluggable architecture for consensus, Sawtooth Lake provides the ability to run both permissioned and permissionless Blockchains. Sawtooth Lake comes out of the box with a consensus algorithm called PoET (Proof of Elapsed Time), which is intended to run in a Trusted Execution Environment (TEE), such as Intel® Software Guard Extensions (SGX). PoET is a lottery protocol that follows the Nakamoto consensus model to elect a leader. However, unlike Bitcoin’s consensus, which uses extensive Proof of Work validation that wastes compute cycles and electricity, PoET uses the TEE to provide a guaranteed wait time for each validator (node).

Each validator in the network requests a wait time from a trusted function within the TEE. The validator with the shortest wait time for a particular transaction block is elected the leader. The TEEs provide trust services like CreateTimer() and CheckTimer() to ensure that a particular validator created a timer within a TEE and waited the specified amount of time to win the lottery. By utilizing Intel’s SGX-enabled processors, which provide a TEE, PoET can scale by adding more validators without having to resort to extensive Proof of Work mining.
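
A toy simulation makes the lottery easy to see. In the real protocol the wait time comes from a trusted function inside the SGX enclave and is provable via CreateTimer()/CheckTimer(); here a plain random number stands in:

```javascript
const crypto = require('crypto');

// Stand-in for the trusted wait-time function inside the TEE.
function requestWaitTime() {
  return crypto.randomBytes(4).readUInt32BE(0) % 10000; // milliseconds
}

// Every validator draws a wait time for the next block...
const validators = ['node-a', 'node-b', 'node-c', 'node-d'].map(id => ({
  id,
  waitTime: requestWaitTime(),
}));

// ...and the one with the shortest wait wins the election. Peers would
// verify the winner's claim against the TEE rather than trusting it.
const leader = validators.reduce((a, b) => (a.waitTime < b.waitTime ? a : b));
console.log(`leader for this block: ${leader.id} (waited ${leader.waitTime} ms)`);
```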

Cello – A toolkit for deploying Blockchain-as-a-Service that reduces the effort required for creating, managing, and terminating Blockchains. Getting Hyperledger Fabric up and running across different nodes requires configuring Docker scripts on various machines, which can be time-consuming and error-prone.

Cello avoids these problems by providing pre-configured environments to get a Blockchain running, similar to IBM Bluemix or Microsoft Azure BaaS.

In summary, Hyperledger seems to have a vibrant ecosystem of participants contributing to the development of open source Distributed Ledger technologies. The Hyperledger Technical Steering Committee ensures that the various efforts stay in sync with the core philosophies of open source contribution. You can find more information about Hyperledger at their wiki or in their Slack channel.

R3CEV

R3CEV is a financial services/distributed ledger technology company responsible for creating R3, the most famous blockchain consortium, which has managed to garner the attention of the financial industry. It has an impressive roster of 70+ members who are working with some of the sharpest minds in the DLT space, like Richard Gendal Brown (CTO of R3) and Mike Hearn (former Bitcoin Core developer), to create a new world of financial services infrastructure and applications using ideas derived from Blockchain technology.

According to Clemens Wan, Associate Director at R3, they see their technology as one that requires broad buy-in from enterprises and corporates to achieve a strong network effect and important applications. According to Wan, R3’s open sourced DLT platform Corda is similar to Xbox Live, in the sense that it builds the ecosystem and the connectivity. By focusing on providing the platform and services, R3 will enable its members to innovate using DLT to solve their business problems by building a variety of use cases.

The set of use cases being worked on by R3 members is quite comprehensive.

R3 believes that 2017 will be the year of the pilot and 2018 will be the year when DLT applications hit mainstream production.


Image credit: R3 blog.

The Corda platform has been open sourced as part of the Linux Foundation’s HyperLedger project. Corda does not attempt to replicate the Bitcoin blockchain; rather, it takes an enterprise point of view, creating a shared ledger for managing financial agreements. Corda was designed to address the pain points that stop enterprises from fully embracing DLT. To facilitate this, Corda is designed with these goals in mind, according to Richard Gendal Brown, CTO of R3CEV:

  • Corda has no unnecessary global sharing of data: only those parties with a legitimate need to know can see the data within an agreement
  • Corda choreographs workflow between firms without a central controller
  • Corda achieves consensus between firms at the level of individual deals, not the level of the system
  • Corda’s design directly enables regulatory and supervisory observer nodes
  • Corda transactions are validated by parties to the transaction rather than a broader pool of unrelated validators
  • Corda supports a variety of consensus mechanisms
  • Corda records an explicit link between human language legal prose documents and smart contract code
  • Corda is built on industry-standard tools (Kotlin, a JVM-compatible language)
  • Corda has no native cryptocurrency (it uses real-world currencies instead)

By addressing these design goals, Corda is able to deliver blockchain benefits like validity, uniqueness, immutability, and authentication within the context of financial services applications.

Recently, R3 and Corda were in the news when they announced that their solution is “blockchain inspired” rather than a blockchain. An interesting analysis of this topic by Chris Skinner can be found here.

Digital Asset Holdings (DAH):

DAH is a Distributed Ledger Technology startup founded by Blythe Masters and other industry veterans. DAH aims to use distributed ledger technologies to disrupt the legacy processes that slow down the financial industry. In their own words:

In theory, a shared, immutable ledger enables transparent, peer-to-peer, real-time settlement without the need for financial intermediaries. In reality, markets require known, reliable counterparties, rights of reversal and error correction, high levels of privacy and the operational benefits of net settlement in a system in which legal entities are responsible for the perfection of title to and legal standing of financial assets. Consequently, markets will continue to benefit from and require third party service providers to perform a variety of functions as they do today: from ensuring clean title, to enabling operational and balance sheet netting.

Assets are not currently issued solely into these distributed networks, and may never be. This necessitates careful on- and off-ramping procedures for keeping the two systems in sync as new technology is adopted. These needs can be met at the transaction layer, where Digital Asset software maps business logic and legal processes to cryptographic signature flows. As an example, our software constructs transactions that enjoy privacy and, when required, permit the ability for net settlement.

Digital Asset claims they select the right kind of distributed ledger for the problem at hand; they can work with permissionless ledgers (like Bitcoin and Ethereum) as well as permissioned ledgers (Hyperledger) that provide better control. To enforce the right type of smart contract execution logic, DAH has created a new language called Digital Asset Modeling Language (DAML). DAML is similar to a smart contract language in many ways, but it is designed with the needs of financial institutions in mind. According to DAH, it is optimized for use in a private execution environment rather than an open execution environment (in which it would be processed by all of the nodes in a network). DAML is designed to achieve many of the same benefits as smart contracts.

DAML does not support Turing completeness. This allows it to focus specifically on financial services use cases where the potential outcomes are predictable (and hence avoid the halting problem). DAML focuses on verifiability – by the stakeholders of an agreement rather than by everyone – and on certainty – being able to accurately predict all possible outcomes of the agreement rather than introducing doubt with unnecessary complexity.

Consensus in DAML: DAML ensures that all stakeholders can reach consensus by utilizing a shared log containing the complete provenance of rights and obligations, along with an off-chain execution environment for processing the workflows and behaviors being modeled. DAML also ensures that not all nodes in the network need to know and process the contents of a contract; only the parties specified in the contract are involved in its execution. Contract data is revealed on a need-to-know basis, and even the distributed ledger, which only contains references to the agreement, is encrypted so other entities cannot detect the agreement’s existence on the ledger, let alone its terms.
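
Digital Asset has not published DAML itself, so purely as a hypothetical sketch of the pattern described above – a shared log that stores only an opaque, salted commitment while stakeholders keep the agreement data off-chain – consider:

```javascript
const crypto = require('crypto');

// Hypothetical illustration of need-to-know disclosure; not DAH's actual design.
function commit(data, salt) {
  return crypto.createHash('sha256')
    .update(salt)
    .update(JSON.stringify(data))
    .digest('hex');
}

const agreement = {
  parties: ['alice-bank', 'bob-bank'],
  terms: 'Alice Bank pays Bob Bank 1M USD on 2017-06-30',
};

// Only a salted hash of the agreement goes on the shared log, so other
// participants cannot read the terms or even link the entry to a deal.
const salt = crypto.randomBytes(16);
const sharedLog = [{ commitment: commit(agreement, salt) }];

// A stakeholder (or an authorized regulator handed the data and salt) can
// independently verify that the log entry matches this exact agreement.
console.log(sharedLog[0].commitment === commit(agreement, salt)); // true
```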

DAML Agreements and Hyperledger: The name Hyperledger used below can easily be mistaken for the Linux Foundation’s Hyperledger project. Tim Swanson of R3 explains the naming confusion between the two in his blog post here. Take a few minutes to read it before you proceed. If you do not have the time, here is a brief blurb:

So when someone asks “what is Hyperledger technology?” the short answer is: it is currently the name of a collective set of different codebases managed by the Linux Foundation and is not related to the original distributed ledger product called Hyperledger created by a company called Hyper that was acquired by DAH. The only tenuous connection is the name.

The combination of the Digital Asset Modeling Language and Hyperledger allows for scale and privacy while maintaining a fully reconciled system across multiple parties. DAML serves as a logic and validation layer sitting above the ledger, providing an auditable way to prove the updates that occurred to the distributed ledger. An agreement modeled in DAML is only active if Hyperledger confirms it is valid and not referenced by any other transaction, creating an independently verifiable logical mapping between the original business intent all the way through to the relevant Hyperledger transactions.


Image Credit: Coin Desk Construct 2017

Nodes in the Hyperledger network that are not party to the agreement are still able to agree upon its outcome because they can independently verify that all of the required authorizations have been made without ever actually seeing the contents of the agreement itself. However, the contents of agreements can be provably revealed to authorized third parties such as regulators.

We will cover the last three players (Chain, Ripple, and Enterprise Ethereum) in Part 2 of this series.

Blockchain – What is Permissioned vs Permissionless?


If you have been following distributed ledger technologies (Blockchain), you will have seen the term permissioned blockchain being thrown around. What is the core difference between a permissioned and a permissionless blockchain?

A permissioned blockchain restricts the actors who can contribute to the consensus of the system state. In a permissioned blockchain, only a restricted set of users has the right to validate block transactions. A permissioned blockchain may also restrict the creation of smart contracts to approved actors.

A permissionless blockchain is the opposite of what you read above – here, anyone can join the network, participate in the block verification process to create consensus, and create smart contracts. Good examples of permissionless blockchains are the Bitcoin and Ethereum blockchains, where any user can join the network and start mining.

Now you may wonder what the benefits and disadvantages of each approach are. In a permissionless world, you do not have to prove your identity to the ledger. As long as you are willing to commit processing power to be part of the network and extend the blockchain, you are allowed to play. Any miner playing the game by the rules may be able to solve the hash puzzle and verify the block of transactions to win the mining reward (the higher the mining power, the better the chances of winning the reward).

In the permissioned blockchain world, you need to be an approved actor in the system to participate in growing the chain as well as building consensus. Many of the blockchain consortiums that build private blockchains for financial institutions and other enterprises follow this model.

One other critical difference between the two is the underlying mining model – permissionless blockchains use Proof of Work (PoW) mining, where hashing power is offered to build trust. As long as 51% of the nodes are honest players, network consensus is reached (read about the 51% attack here). While Bitcoin uses PoW mining, Ethereum is proposing to use a Proof of Stake (PoS) model for reaching consensus. Proof of Stake asks users to prove ownership of a certain amount of currency (their “stake” in the currency). Instead of buying computers and electricity for mining as in a PoW system, a PoS system uses that capital to acquire the coins/tokens that allow you to validate transactions.
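
To make the cost of PoW concrete, here is a toy version of the hash puzzle (illustrative only – real Bitcoin mining double-SHA-256es a block header against a far harder target):

```javascript
const crypto = require('crypto');

// Toy proof-of-work: find a nonce so the block hash starts with
// `difficulty` zero hex digits. Each extra zero multiplies the
// expected work by 16 – that burned work is the "trust" PoW buys.
function mine(blockData, difficulty) {
  const target = '0'.repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const hash = crypto.createHash('sha256').update(blockData + nonce).digest('hex');
    if (hash.startsWith(target)) return { nonce, hash };
  }
}

console.log(mine('block #1: alice -> bob 5 BTC', 4));
// A PoS validator skips this loop entirely: the right to validate is
// proportional to staked coins, not to hashes computed.
```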

Permissioned blockchains do not have to use computing-power-based mining to reach consensus, since all of the actors are known; they typically use consensus algorithms like Raft or Paxos. There are also PBFT-style algorithms that can be used to reach consensus without PoW mining.

Let us look at the topic of enterprise blockchains. Almost all of the blockchains being piloted these days are permissioned. There are many reasons why this is the case:

  1. Privacy – a permissioned blockchain allows only actors with the right permissions to view transactions. A permissionless blockchain is essentially a shared database where everyone can read everything, but no single user controls who can write. Imagine you are a large bank using a shared ledger with a list of other banking partners within a consortium – you do not want the volume of your transactions to be visible to your competitors.
  2. Scalability – a permissioned blockchain can use a simplified Proof of Stake model to establish consensus; this avoids Proof of Work’s burning of computational cycles. The ultimate result is scalability compared to a public blockchain network like Bitcoin (see BigChainDB).
  3. Fine-Grained Access Control – a permissioned blockchain allows restricted access to the data within the ledger (see the design model underlying R3CEV’s Corda).

I want to highlight one of the most famous interviews with Bitcoin guru Andreas Antonopoulos – Sewer Rat and the Bubble Boy. When asked, “How are enterprise/private blockchains different from bitcoin’s blockchain?”, Andreas responded:

“The banks and the corporations say, “Oh, bitcoin’s awesome. We want that. Only without the open, decentralized, peer-to-peer, borderless, permissionless part. Could we instead have a closed, controlled, tame, identity-laden permission version of that please?”

It is an interesting argument in which Andreas compares the Bitcoin network to the Internet and private blockchains to the secure intranets within enterprises. He compares them to the sewer rat and the bubble boy:

Bitcoin is a hardened platform because its security is tested on an everyday basis.

“Bitcoin is a sewer rat. It’s missing a leg. Its snout was badly mangled in an accident in last year. It’s not allergic to anything. In fact, it’s probably got a couple of strains of bubonic plague on it which it treats like a common cold. You have a system that is antifragile and dynamic and robust.”

Does this mean that the enterprise blockchains are the bubble boy?

 “let’s take bitcoin, cut off its beard, take away its piercings, put it in a suit, call it blockchain, and present it to the board.” It’s safe. It’s got borders. We can apply the same regulations. We can put barriers to entry and create an anti-competitive environment to control who has access. It will be more efficient than our existing banking system.

Eventually, successful, vibrant, innovative companies are the ones that turn their IT infrastructure inside out. In the future of finance, successful, vibrant, and innovative banks are the ones that turn their infrastructure inside out and make it part of an open, borderless financial system that serves the other 6 billion, that serves all the people who have been excluded from finance.”

I for one cannot wait to see how this whole debate will shape the future of finance. The advent of bitcoin, blockchain, and DLT technologies is a pivotal moment in the history of computing. It will change how we build the systems of the future.

You can read the full text of the interview with Andreas here and watch the video of his talk here.

And the Fintech Country of the year award goes to…

If you have not been following what is happening in India over the last few months, you need to wake up and take a look. The demonetization effort launched by the Indian government has shaken this cash-based economy to the core.

While everyday folks suffered through the trouble of exchanging their 500 and 1000 rupee notes for the newer 2000 or older denominations, the mobile wallet scene in India saw a sudden surge of interest and growth as a fortunate side effect.

Companies like Paytm and MobiKwik have seen their user and transaction volumes skyrocket in the past few months. Here is a link to Paytm’s growth numbers for its year ending 2016, which are crazy!

As a payments industry observer, I find this growth fascinating. I can’t wait for the day Apple releases its ApplePay numbers in the open like this!

The folks at Daily Fintech have done a fantastic job presenting the case for why India is the Fintech country to watch in this article. Go check it out!


You got heart, kid.

A lifetime ago I used to be a developer. I started coding in 9th grade with GW-BASIC on an IBM PC that ran an Intel 8086 processor with 256 KB of RAM. I have to admit this was the most fun I had as a 12-year-old, and I was hooked. I pushed the boundaries of the GW-BASIC interpreter by using commands like PEEK and POKE, which the elders told me to stay away from. In short, PEEK and POKE let you do some wild things with the language, like prodding the brain of the computer. However, you start to realize that as fine an entry-level language as GW-BASIC was, you could only push its boundaries so far. (Developing a hotel management system entirely using random access files in GW-BASIC, with its own custom index scheme that located records using B-trees, was quite painful.)

This was all fun and good, but taking a full day to run all of the month-end accounting sucked. Enter dBase III Plus and FoxBASE, which rocked my world for the next couple of years. As I graduated high school, I got introduced to sexy Pascal and finally C – the mother of all languages. (I did dabble in some 8086 assembly, but I figured I wasn’t a masochist and gave that up.)

C taught me the discipline to code right. You do not take anything for granted. You could not assume you “freed” a pointer, or that the array you allocated had enough space to capture that crazy user input that will someday cause a buffer overflow. In my first paid gig, at Caterpillar, I had to write a symbol table implementation that created a memory manager for strings in C. This memory manager wrapped malloc() and free() to keep track of all strings being handled within the application (which automagically built a service manual for a giant Caterpillar earth mover).

At this stage I was starting to explore C++, but it seemed like an abusive relationship. It just didn’t feel very natural, and I wasn’t too excited to switch my primary development tool of choice at that point in time. This is when a new kid on the block emerged – Java.

Java was touted as “Write Once, Run Anywhere”. In fact, it was so hyped up at the time that the whole world went crazy and tried to shove Java into places where it did not belong. (I am looking at you, Sony – for committing the crime of writing your entire Blu-ray spec/stack in Java, which in my opinion is one of the worst conceived ideas ever.)


By the time the world woke up and came to its senses, enough damage had been done in the form of Java applets. However, the one saving grace that came out of this debacle was server-side Java. While Sun Microsystems was busy in a pissing match with Microsoft and others, the enterprise application server market started blossoming, with players like BEA and IBM making good contributions to the Java enterprise scene. (Remember the venerable Pet Store application?)

Today, many commercial enterprise-level applications and services are powered by software written in Java. The ecosystem and development toolchain are supported by a robust set of offerings from players like the Eclipse Foundation, JetBrains, and Oracle.

Around this timeframe, the concept of n-tier application development hit the mainstream. Java borrowed some good ideas, like MVC from Smalltalk, and made them commercially successful. While Java did a great job on the Controller side, it had some weaknesses in its Model and View tooling. Native database support for Java lived in the java.sql package, and any database developer worth their salt will tell you how much CRUD code you needed to write to get the backend model represented properly – it was quite the chore.

Enter ORM tools like Hibernate. They simplified building models in a more natural fashion by taking away the complexity of writing plumbing code. Java took note and introduced these features into the core platform as the Java Persistence API in later releases.

While the Model side of the world settled on solutions like this, the View on the front end was not a pretty scene. Java Server Pages was the recommended way to build the front end of web apps. This was not a great solution in many ways, as graphic designers would mock up HTML and Java developers would then translate it into backend code. To perform dynamic front-end validations, you resorted to JavaScript.

Now, Java developers always looked down on JavaScript as an inferior development language. There was a philosophical difference between Java as a strongly typed language vs. JavaScript, which was interpreted at runtime. JavaScript developers were treated as “script kiddies,” and JavaScript as a language was not considered for any serious development (primarily due to its front-end applications in the browser).

This all changed in the last few years. A powerful set of libraries started emerging for JavaScript, and a lot of serious development started to happen. With companies like Google, Yahoo, and Facebook committing to JavaScript development, we started seeing unique new tools and features that are making web experiences more engaging and real-time.

A pivotal moment for this development was the advent of Node.js (a JavaScript runtime environment that allows JS code to be executed on the client as well as the server, using Google’s powerful V8 JavaScript engine). jQuery created an abstraction layer on the front end for JavaScript that took care of many browser-level rendering idiosyncrasies.

In fact, so much development is being done in JavaScript these days that there is a rich array of options for a developer to choose from:

Web Frameworks – AngularJS, ReactJS, EmberJS, Meteor

Front End Development – This list is long; I am hyperlinking it here.

Compilers – Babel, Traceur

Build/automation Tools – Grunt, Gulp

Package Managers – npm, bower, yarn

Mobile Cross-platform Development – Sencha Touch, PhoneGap, Ionic, Kendo UI, Mobile Angular UI, Intel XDK

Backend Development – Node.js, Express, Koa, Horizon, socket.io, Redux

JavaScript Databases – RethinkDB, TaffyDB, PouchDB, LokiJS

I was quite amazed by how rich the JavaScript ecosystem has grown. JavaScript is the #1 language on GitHub with the most active repositories and commits, as seen at GitHut.

To summarize, I have developed quite the respect for the JavaScript ecosystem. As Captain America said – you got heart, kid!


(Image Credit – Marvel Studios).

Digital Delight – How are you moved?

2007 was a watershed moment in technology. The iPhone redefined the concept of personal computing, and the world as we knew it was never the same. In just a decade, we have seen the proliferation of smart devices in every aspect of our lives. We take for granted the various conveniences offered by our smartphones.

However, too much of a good thing can also create fatigue. Our smartphones are cluttered with apps that we download, use once or twice, and never revisit again. Like a kid in a candy store, we hoard apps, eventually get tired of them, and settle on a few that truly make our everyday lives easy. If you start filtering apps with this lens, I bet most folks don’t use more than 10 to 15 apps on a recurring basis.

Apple and Google haven’t really done much to help solve this hoarding issue for users. I have always thought a neat feature at the operating system level would be to notify the user of app dormancy – if I haven’t touched a thing in 3+ months, is there any possibility I would ever use it again? Recommend a list of these apps that I can get rid of and keep my sanity!

At the end of the day, the 10 to 15 apps we use may not be the best of the best, but they serve a utility we cannot live without. A good example would be the apps published by your banking and credit card providers. I counted a total of 10 apps on my phone representing my banking, credit card, and investment/retirement account providers.

A fellow Fintech Mafia member, Alex Jimenez, mentioned the other day that most mobile banking apps are just online banking shoved into a small screen. I tend to agree with his assessment. In a race to keep their mobile apps at feature parity, most financial institutions rush to port the kitchen sink into them. While I appreciate a Swiss Army knife of a mobile banking app, we really don’t need 48 features in a form factor meant to engage you for less than a few minutes to take care of quick and important banking transactions.

A mobile banking product manager’s wish is to figure out what makes the customer tick. Everyone wants to build the next Uber of banking services. However, getting the formula right to digitally delight a customer is not an easy task.

What is Digital Delight you ask?

Digital Delight – when an online or mobile product/service creates a pleasurable moment that makes an experience just a little more fun.

How can we introduce this digital delight as product designers? Not every app can be as digitally delightful as the game Monument Valley (which, by the way, still blows my mind!). Sometimes the opportunity lies in taking the most mundane process workflow and turning it into an informative notification that can make a huge difference.

Case in point – Delta launched a newly redesigned app with a feature to track the status of your checked bag.


(Image Credit – Delta)

From the moment your baggage gets tagged into the system, the Fly Delta app starts tracking it in transit. You get notifications that your luggage has been loaded onto your plane. It allows you to see where your bags are at any point in time, even on a satellite map!


(Image Credit – Delta)

I want to send kudos to the Delta team for making a useful feature like this part of their app update. This is certainly one of those features I was delighted to see in action. A nice video of the feature is available here.

What other apps surprise or delight you in this fashion? Share them in the feedback section below. As I see digital delights in the wild, I will make sure to share them as well.

Happy New Year 2017!

Google wifi – Mother-in-Law Approved.


I live in a three-level home, which is a terrible layout for a single wifi router to provide full home network coverage. Interestingly enough, the home I lived in previously was much larger, but my good old 8-year-old Linksys Wireless-N router never had any issues covering 4,000 sq. feet across two levels. When I moved to my current home, I chose to have my cable modem installed at the basement level (which was a terrible mistake). The walls of this home seem to have lead in them, which almost turns the house into a sort of Faraday cage, killing the signal strength on the top level.

Hence started my quest to find a router that worked across all three levels.

Setup 1:

My first move was to buy a cheap router (D-Link) and use it as a slave to my primary Linksys router in the basement. I decided to buy a NetGear Powerline adapter to extend the Ethernet connection to my third floor and then use the D-Link router to set up a slave network. In theory this sounded fine; in practice a few issues arose. I had two distinct wifi networks in my home, and I had to switch between them as I roamed around. Not to mention the fact that both the Linksys and D-Link routers had dual-band radios for 2.4 GHz and 5 GHz, so I had four wifi networks in total!

Why can’t you disable one of the radios, you ask? Well, the 5 GHz band does extremely well for short-distance applications (like streaming Apple TV), while the 2.4 GHz radio penetrates walls better. To make things more interesting, I had the 5 GHz radios in Wireless-N mode while the 2.4 GHz band operated in Wireless G/B mixed mode to provide compatibility with older devices at home. (As crazy as it may sound, an average home these days can easily have more than 20 wireless devices, considering all of the cell phones, tablets, smart TVs, laptops, media players, smart home assistants like Alexa, refrigerators, etc.)

Setup 1 worked OK for a while, but I was getting mysterious signal drops when I was on the slave network on the third level. The connection would randomly freeze, and I would have to reset the wifi radio on my phone to get connectivity back. I was not sure what the culprit was. By process of elimination, I could tell the basement Linksys router was working OK and the slave D-Link upstairs was working fine as well (I swapped them and the problem still persisted). This made me conclude the culprit might be the NetGear Powerline adapter (which uses the home’s electrical wiring as an extended wired Ethernet network).

To test this theory, I bought a TP-Link Nano Ethernet Powerline adapter to see if it solved the problem. After swapping it with the NetGear, the problem seemed to have gone away for a few days – and then it was back again.

House 1: Me 0.

Setup 2:

Based on lessons learned from Setup 1, I realized there might be a hidden electrical goblin in my house wiring eating my IP packets. I started exploring alternate options like wifi extenders, which technically extend the same wifi network across a bigger area. I ended up buying the Amped Wireless extender and connected it to my basement Linksys router. While this seemed to marginally solve the problem, I was still having intermittent issues with connectivity dropping randomly, which was quite frustrating. After a two-week trial run, I returned the Amped back to Costco.

House 2: Me 0.

Setup 3:

By now I was starting to think the issue might be with my Linksys router, which had not had a firmware update from Cisco in four years. I started researching new routers. This is when Google got into the router business and launched their OnHub router with the help of TP-Link and Asus. OnHub was Google’s first foray into the home networking business, and it was supposed to solve all of the coverage issues. I was ready to cry tears of joy and drove to Best Buy to pick it up on day 1 when it was released.

OnHub ended up being a huge disappointment. Since it was a router in its own right, I replaced the basement Linksys with the OnHub. The setup was super easy, with a smartphone app that configured all the settings, which was very cool (compared to the legacy admin consoles hosted at 192.168.1.1 in older routers). It was definitely very user friendly, but it ultimately failed the important test: it did not have the range to penetrate the lead-laden walls of my house up to the third level. I mucked around with various settings in vain. The best speed I was able to get on the third level was a measly 750 kbps. No YouTube or Netflix streaming while I was in bed.

House 3: Me 0.

Setup 4:

OnHub was promptly returned after three days of testing, as it didn’t quite cut it for my needs. At this point I was fairly disillusioned with fancy new routers and did more research to find the holy grail of routers. After reading many reviews on The Wirecutter, I finally decided to give the Archer C9 a try. Its sibling, the Archer C7, was rated by The Wirecutter as the best router in the market for most people. I picked up the Archer C9 from Costco. The added bonus was that it came with a range extender that worked out of the box.

The Archer C9 did OK in most of my testing and has been my primary router for the past year. The extender takes the same wifi network name, so I did not have to keep switching networks as I moved around the house. However, I did notice a little bit of lag in streaming speeds on the third level. It worked most of the time, but sometimes it would make me reset the wifi radio on my phone to get it to work again. (By now you may have realized that turning wifi on/off is something I do at least a few times on an average day, and over a period of four years it gets mighty annoying. You ultimately give up and accept it as the price to pay for a wifi connection – I keep reminding myself of the pre-wifi days, when I bought a 100-meter Ethernet cable 12 years ago because wifi was not yet a common concept!)

House 3.5: Me 0.5.

Setup 5:

Recently I came to know about a new breed of mesh-networked home wifi equipment that has been popping up on the home networking scene. The most prominent are NetGear’s Orbi, Eero Home WiFi, Luma Home, and Google Wifi.

I stayed away from making the leap, as this equipment was all quite pricey (typically more than $400). However, when I watched the Google hardware event on October 4th, I was blown away by some of the things Google launched and announced. They had a wide range of products, from the Pixel phone to VR goggles, and then Google Wifi. Watch this video starting at minute 50 to catch the Google Wifi announcement. The pricing seemed reasonable, with the three-pack starting at $300. I decided to give it one final shot and preordered Google Wifi from Amazon.

I received the Google Wifi 3-pack today and opened it to perform the setup. The packaging was very Apple-like: just a quick start card inside, along with the units, power cords, and an Ethernet cable. Very simple and spartan. Following the quick start instructions, I downloaded the Google Wifi app from the App Store and started the installation process.

I set up the circular, hockey-puck-looking units (a little larger than a puck) at each level of my home. These devices are beautifully designed and blend into the environment. The bottom of each device has the SSID of the unit and a password to connect to it (note: you will only use this SSID and password during setup).

Setup was a little confusing to start with. The app was attempting to connect to a unit without asking me to join the SSID broadcast by the base unit. I had to manually switch to the phone’s wifi settings to connect to the SSID of the base unit. At that point, the setup app took over, asked me to name the location where I placed the unit (basement), and asked me to name the home network.

Once this step was done and the network was created, the app asked me if I had more Google Wifi units. I had two more to set up. The app again went on a wild goose chase trying to locate them. I had to switch to the phone’s wifi settings again to identify the unit in my family room by name. Once I connected to this unit using the SSID and password on the bottom of the unit, the app took over again. This time the app recognized it as a peer unit and configured it as part of the home mesh network. I repeated the same steps for the unit on my third level, and it connected to the network like a charm. I breathed a sigh of relief when I realized all three units were part of the mesh network under the home network SSID I created when I set up the first unit.

Now for the acid test. The Speed Test app blazed through on all three levels, with speeds consistently exceeding 70 Mbps. I tested the speeds while stationary on each level. Then, to put the system to the test, I started a speed test and moved around the house where I knew the connectivity would be weak. Impressively enough, I could see the speeds slowing a bit in dead spots, but the mesh network came to the rescue, with another unit picking up the signal seamlessly, and I could visually see the speeds normalizing to 50 Mbps and above.

Finally, I think I have a winner with Google Wifi!

The best part is that I hope I will no longer have to keep resetting the wifi to stream YouTube on my visiting mother-in-law’s iPad – which is worth its weight in gold!

House KO: Me 🙂

 

2015 is *not* the year of Mobile Payments.


There, I said it. It’s an inside joke amongst payments industry folks to guess which year will finally be the year mobile payments become mainstream. With the highly promoted launch of Apple Pay, it may seem we have finally put to rest the question of whether 2015 is the year mobile payments hit mainstream.

Maybe Not.

For all intents and purposes, I feel Apple Pay, with its marketing momentum, managed to make the general public take notice of mobile payments. Moreover, with Apple finally deciding to get into the ring, it prompted the other struggling efforts to combine forces and put up a fight. I see the playing field with four major players:

  1. Apple Pay with NFC
  2. Google Wallet with NFC (Android Pay?)
  3. MCX/CurrentC/PayPal/Paydiant Wallet
  4. Samsung Pay (With Loop’s MST and NFC)

Each of these players has its own strengths to polarize the market in its own way. Let’s dive in!

ApplePay:

Strengths: The best user experience for making mobile payments in the current landscape. Pioneered the use of tokenization and biometrics to provide a new, easy way to pay. Brought all the payment networks and banks together to execute a near-textbook-perfect launch.

Weaknesses: Merchant adoption. As more merchants adopt NFC-enabled terminals (helped by the EMV liability shift in 2015), the number of locations where you can use Apple Pay will grow. However, don’t expect to use Apple Pay in your neighborhood Walmart anytime soon.

Not having any rewards programs to brag about is another limitation with Apple Pay.

Add to this the other side of the mobile phone pie: the Android user ecosystem, which zealously hates anything Apple makes. These folks will never use Apple Pay, no matter what.

Opportunities: As more of the newer iPhones get sold, Apple Pay will gain traction, and people will try it at least once to see how it works.

Threats: Industry analyst Cherian Abraham has been blogging about the fraud activity surrounding Apple Pay. Even though none of it has anything to do with the security measures Apple put in place in designing Apple Pay, the soft underbelly of the whole scheme is the Yellow Path for provisioning cards for users. Banks were forced to launch Apple Pay without much time to account for these newer security threats, and that may slow some of the momentum.

My take: I would equate the launch of Apple Pay to Tesla’s Model S launch. Both were revolutionary products in their own right. They broke convention in areas where common wisdom dictated it could not easily be done. However, as with any first-generation product, there may be quality issues that get ironed out as future versions are released. Apple Pay is a surprisingly solid offering for v1, and it can only get better.

2. Google Wallet with NFC (Android Pay?)

I don’t even know where to start with this one. Google Wallet has languished forever, mostly due to Google’s approach of not building partnerships with the players in the field. The stars finally aligned for Google Wallet after the Apple Pay launch, which lit the much-needed fire under Google to get its act together. Google was able to broker a partnership with Softcard and provide its new wallet offering as an API.

Strengths: Launching Android Pay as a payment API (see Sundar Pichai’s remarks here) is a wise move. This allows Google to open up the API to device manufacturers and developers to build wallets, which may result in mainstream adoption. Apple’s walled-garden approach would never have worked for Google in the first place. Android has a healthy user base in the US who may try this service.

Weaknesses: Design by committee never produces mind-blowing user experiences. Apple Pay set a very high bar (see Tim Cook’s launch video here). I cannot imagine how the Android Pay UX can be any simpler, given that it has to work on a multitude of devices made by different manufacturers. The revenue-share scheme Google has promised the telcos for pre-installing Google Wallet on newer handsets sounds like something that needs to be vetted after product launch to see how successful it may be in the long run.

Opportunities: Android has an 80% worldwide market share (source). With an open API approach, Google is betting on the sheer volume of its users to make this a success. When you throw enough magic potion into the cauldron, you never know what may come out 🙂

Threats: Not having the first-mover’s advantage can sometimes come back and bite you. Apple Pay alleviated this shortcoming with an amazing UX. With Android’s open model and fragmented ecosystem, the success of Android Pay may not be imminent. If history is any indicator, the ARPU (average revenue per user) for Android is one quarter that of iOS users (according to Benedict Evans’ analysis here).

My Take: Android Pay may be late to the party, but it surely provides an alternative to Apple Pay. How successful it will be is something we will have to wait and see.

3. MCX/CurrentC/PayPal/Paydiant Wallet

It’s hard to analyze a wallet only a few have used in beta. However, PayPal’s acquisition of Paydiant came as a surprise and caught most of us off guard. We knew well that MCX was working with FIS/Paydiant to use their QR code technology for payments. The QR code premise was very promising back when all of the industry pundits declared NFC dead (including me). Add some Apple juice, and suddenly we see a re-animated NFC becoming the de facto payments standard.

The QR code method of payment is still not bad. We use it in Starbucks every day, and most people do not have any issues paying with QR codes. However, the entire premise of the CurrentC wallet, which tries to remove payment networks from the picture (at least in the first iteration), seems like a dicey proposition for a successful launch. From a consumer point of view, I do not understand why a customer would be motivated to link their bank account to save interchange fees for a merchant.

4. Samsung Pay (With Loop’s MST and NFC)

Talk about bad timing and possibly some buyer’s remorse. Samsung acquired LoopPay, which provides out-of-the-box support for existing POS terminals using Magnetic Secure Transmission (MST) technology. While many rumored that a marriage between these two companies could produce a great mobile wallet in Samsung phones, it comes a year too late. With the EMV liability shift fast approaching, many merchants are upgrading their terminals, and the concept of the swipe as we know it may go the way of the dodo pretty soon. (This reminds me of Blu-ray as a technology. I wonder how many people still buy Blu-ray players now that streaming is becoming the preferred way to consume content?)

Naming this technology Samsung Pay is also not a great move in my opinion. For the hardware chops they have, the Loop technology could have been made into a hardware chip that Samsung could sell to any of the device makers, thereby securing their investment and guaranteeing a longer shelf life through strength in numbers. With Google’s announcement of the Android Pay API, I am just not sure how Samsung will promote its offering when it will also have to install Google Wallet.

Conclusion:

So, I hope, my dear reader, if you have come this far, you know how big of a cluster the current mobile payments landscape in the US is. I don’t even want to imagine the free technical support we will have to provide our families and relatives once every one of them gets their hands on a pay-with-a-phone thingamajiggy. So good luck.

For my money, 2015 is surely not the year of mobile payments 😉

UX fails – even Apple is not immune.

I am amazed that, for all the brainpower Apple has in its engineering team, they still manage to deliver a bad user experience in some cases. The general consensus in the industry is that Apple hardware and software just work together and delight the user in ways we could not imagine.

Not always.

Allow me to elucidate.

Back in the 1990s, my biggest frustration as a gamer was not being able to fit massive games (> 5 MB) onto storage media like floppy disks. (I made myself look like a dinosaur, I know!) MS-DOS and Windows had a nasty habit of not telling you how much space was available before you started a copy, and would fail miserably halfway through.

Fast forward to 2015. I am using iOS 8.1.3, the world’s most advanced operating system ever (I didn’t say it, Cook did). All I did was try to shoot a video of my six-week-old smiling with my 64 GB iPhone. The Camera app was recording, and right when the baby was starting to coo and cackle, I got a nasty error dialog saying my device had run out of storage space and the recording was stopped. (Thank the lord it did record some of the footage.)

I just don’t understand how the world’s most advanced operating system cannot provide a word of warning when it starts running out of space. It doesn’t have to be an automated nag; a visual cue in the Camera app showing how many photos can be taken or how many minutes of video can be recorded with the available space would go a long way.

Go ahead and highlight that cue in red when space is running low, so the user knows they don’t have enough room to record the baby video and it’s time to delete that Angry Birds V1 app that is taking up close to a gigabyte. The math involved is trivial, as the sketch below shows.
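Here is a minimal sketch of that estimate in Python. The bitrates and thresholds are my own assumptions, not Apple’s actual numbers, and a real implementation would query the device’s storage APIs rather than `shutil`:

```python
# A back-of-the-envelope estimate of remaining camera headroom.
# All numbers here are assumptions: ~17 MB/minute for 1080p video,
# ~3 MB per photo, and a red warning below 10 minutes of headroom.

import shutil

VIDEO_MB_PER_MIN = 17   # assumed 1080p recording rate
PHOTO_MB = 3            # assumed average photo size
WARN_MINUTES = 10       # threshold for the red "low space" cue

def storage_headroom(path: str = "/") -> dict:
    """Translate free bytes into user-meaningful units."""
    free_mb = shutil.disk_usage(path).free / (1024 * 1024)
    return {
        "video_minutes": int(free_mb / VIDEO_MB_PER_MIN),
        "photos": int(free_mb / PHOTO_MB),
    }

headroom = storage_headroom()
if headroom["video_minutes"] < WARN_MINUTES:
    print(f"Low space: only ~{headroom['video_minutes']} min of video left")
else:
    print(f"~{headroom['video_minutes']} min of video, "
          f"~{headroom['photos']} photos remaining")
```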

And while you are at it, why not give some smarter recommendations based on the age and usage of the apps on the phone? If I haven’t touched an app in months and it’s taking up valuable storage space, advise me to delete that bloat (something like the sketch below).
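For illustration, a toy version of that recommendation logic in Python. The app list, sizes, and last-used dates are made up; a real OS would read them from its app-usage database:

```python
# A toy version of the "delete that bloat" recommendation: rank apps by
# a simple staleness score (size x days since last use).

from datetime import date

apps = [
    {"name": "Angry Birds V1", "size_mb": 980, "last_used": date(2014, 6, 1)},
    {"name": "Camera",         "size_mb": 40,  "last_used": date(2015, 1, 30)},
    {"name": "Some Free Game", "size_mb": 450, "last_used": date(2014, 11, 12)},
]

TODAY = date(2015, 2, 1)

def staleness(app: dict) -> int:
    """Bigger and longer-untouched apps rank higher."""
    return app["size_mb"] * (TODAY - app["last_used"]).days

# Surface the two stalest apps as deletion candidates.
for app in sorted(apps, key=staleness, reverse=True)[:2]:
    print(f"Consider deleting {app['name']} "
          f"({app['size_mb']} MB, unused since {app['last_used']})")
```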

What are the other UX annoyances you face with your favorite Device/OS?

PS: If you are frustrated with other OS annoyances, this Quora thread has a laundry list of things iOS sucks at.

http://www.quora.com/What-are-the-best-examples-of-poor-Apple-user-interface-design

UPDATE: Just a few minutes after I published this post, I saw this news about Apple working on revamping the stability of its OS – good news!

Apple’s iOS 9 to have ‘huge’ stability and optimization focus after years of feature additions

PayPal needs to leave 1998 and move on to 2015.

I rarely use PayPal. The only time I use PayPal is to buy from internet merchants I have never dealt with before, and I never use the money movement feature. That said, a couple of months ago I decided to add a new bank account to PayPal.

A typical way to set up a new bank account for ACH is the trial deposit method of account verification. With this method, the entity setting up the ACH link sends two micro deposits, each usually less than a dollar, to the bank account and asks you to verify the amounts. The process typically takes a day or two due to the underlying limitation of ACH, which still settles transactions with batch files in this day and age.

Recently, a newer way to verify bank information has been floating around, called Instant Verification, where the entity (like PayPal) uses your online banking login to confirm in real time that you really own the account you are trying to link. I have used this method in a few places, and 99% of the time, unless you are linking to one of the big banks, it simply does not work (in which case I fall back to the trial deposit method). A rough sketch of both flows follows.
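To make the two flows concrete, here is a rough Python sketch. Every function name and the aggregator call are hypothetical stand-ins; the real work happens inside an ACH processor and a data aggregator, respectively:

```python
# A rough sketch of the two bank-verification flows described above.

import random

def start_trial_deposits(routing: str, account: str) -> tuple[int, int]:
    """Send two random micro deposits (in cents) over ACH.
    Settlement happens in the next batch window, so the user
    typically waits a day or two before they can verify."""
    amounts = (random.randint(1, 99), random.randint(1, 99))
    # submit_ach_credit(routing, account, amounts)  # hypothetical processor call
    return amounts

def verify_trial_deposits(expected: tuple[int, int],
                          entered: tuple[int, int]) -> bool:
    """The user reads the two amounts off their bank statement."""
    return sorted(expected) == sorted(entered)

def instant_verify(bank: str, username: str, password: str) -> bool:
    """Instant Verification: log in to online banking via an aggregator
    and confirm account ownership in real time. In my experience this
    mostly works only for the big banks."""
    # return aggregator.confirm_ownership(bank, username, password)  # hypothetical
    raise ConnectionError("aggregator could not reach this institution")

# Try instant verification first; fall back to trial deposits on failure.
try:
    instant_verify("Small Local CU", "user", "pass")
except ConnectionError:
    amounts = start_trial_deposits("123456789", "000111222")
    # ...a day or two later, the user enters what they saw on their statement:
    assert verify_trial_deposits(amounts, amounts)
```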

So when I set up a new funding bank account with PayPal using this new Instant Verification method, I was surprised to see it connect to a small local credit union. However, I realized a few months later that instead of linking the checking account, PayPal ended up linking the savings account. (Disclaimer: I am not sure who screwed up here: PayPal, the aggregator they use, or my credit union’s core system.)

During the holiday shopping season I used PayPal a little more than normal, which ended up pulling funds from my savings account (which didn’t have much of a balance in the first place, since this bank account is used purely for everyday spending).

At one point the savings account was overdrawn, and PayPal hit me with a $20 fee for the failed ACH transaction (even though I have a backup credit card set up within PayPal to fall back on in the event of an ACH failure).

Transaction from my bank on 12/23:

[Screenshot: bank transaction from 12/23]

Not wanting to deal with PayPal’s customer service, I decided to remove this bank account from PayPal and leave only the credit card as a funding source. When I tried removing the bank account, I got an error message: “You have a pending transaction – you cannot remove this bank account.”

[Screenshot: PayPal error when removing the bank account]

I gave PayPal a full five-day window and tried removing the account again, only to get the same error message. To add insult to injury, I cannot locate the $20 fee or the pending transaction anywhere in PayPal’s Account Activity section.

Here is PayPal’s account history, where neither the $20 fee nor a pending transaction is visible:

[Screenshot: PayPal account history with no $20 fee or pending transaction]

I even tried the ugly-sister version (the Classic Site) of PayPal to see if the pending transaction and the hidden fee were visible – no luck.

[Screenshot: PayPal Classic Site account history]

I can’t believe that PayPal, a massive platform with so many customers, sucks so badly at user experience. I wrote to PayPal customer service; let’s see what that response looks like 🙂

Update 1 on Jan 1, 2015.

PayPal sent me a generic email about how to add and remove bank accounts when I specifically asked them to remove one. Auto-responders are not cool, especially when you are dealing with a customer’s money.

I tried moving some money from my PayPal account to my bank account. The transaction seems to have gone through, but look at the confirmation screen I got afterwards: an error message telling me to check my card details, followed by a big green check mark presumably indicating that the money transfer was initiated. What does this even mean?

[Screenshot: transfer confirmation showing an error message next to a green check mark]

To facilitate this move, I had to increase my monthly transfer limit, and PayPal cleverly suggests adding another credit or debit card to do so. I added a new card, and PayPal charged $1.95 to validate that the card really belongs to me: they embed a 4-digit code in the transaction memo, and you type that code back into PayPal to prove ownership. A sketch of the idea follows.
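Here is a toy Python version of that verification trick. The function names and the descriptor format are hypothetical; only the mechanism (a random code in the statement memo, entered back by the cardholder) mirrors what PayPal does:

```python
# A sketch of the $1.95 card-verification trick: charge a small amount,
# embed a random 4-digit code in the statement descriptor, and ask the
# cardholder to type that code back in.

import secrets

def start_card_verification(card_last4: str) -> tuple[str, str]:
    """Charge the card and return (code, statement_memo)."""
    code = f"{secrets.randbelow(10000):04d}"
    memo = f"PAYPAL*{code} CODE"  # the 4-digit code rides along in the memo
    # charge_card(card_last4, amount_cents=195, descriptor=memo)  # hypothetical
    return code, memo

def confirm_card(expected_code: str, entered_code: str) -> bool:
    """Only someone who can see the card's statement knows the code.
    (The $1.95 should then be refunded as a single credit.)"""
    return secrets.compare_digest(expected_code, entered_code)

code, memo = start_card_verification("4242")
print(f"Statement shows: {memo}")
assert confirm_card(code, entered_code=code)
```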

Once I finished adding the card, I got a real-time SMS alert from my FI, entered the 4-digit code, and confirmed that the card was mine. After that I was able to move the money from PayPal to the bank account. However, when I viewed the account register again, instead of seeing a debit and a matching credit for the $1.95 PayPal posted, I saw two credits to my account. SMH.

[Screenshot: account register showing two $1.95 credits]

Does this company even do any Quality Assurance Testing on the code they push to production?!?!