Enterprise Blockchain Consortiums – Part 1

chuttersnap-149285

(Image credit: https://goo.gl/qzVPS5)

2016 was a big year for Blockchain. The hype around Blockchain was palpable, and news from all around the world showed various initiatives picking up steam. While this entry is not meant to provide a comprehensive survey of *all* of the consortiums out there, I am attempting to capture the ones that seem to have the most momentum and news coverage.

If you are serious about following the enterprise Blockchain scene, the following players merit your attention:

  1. Hyperledger
  2. R3CEV
  3. Digital Asset Holdings
  4. Chain
  5. Ripple
  6. Enterprise Ethereum (the newest entry on the block as this post was being written)

Hyperledger – Hyperledger is an open source effort by the Linux Foundation to develop a blockchain platform and to support blockchain-based distributed ledgers. The term Hyperledger itself doesn’t point to one specific project; rather, it is a collection of various projects contributed by different members. The projects themselves range from blockchain-based distributed ledgers to a Blockchain-as-a-Service toolkit to a blockchain explorer.

The Hyperledger ecosystem is built on the following tenets:

  • Data/decisions need to remain with project stakeholders
  • Pluggability to enable flexibility
  • Ability to trace transaction history
  • Utilize native SDKs as much as possible
  • Support for HSMs (Hardware Security Modules)
  • Support for ACLs
  • Eliminate single points of failure
  • Collaborate with projects within the Hyperledger ecosystem to pick the best design/architecture

HyperLedger.JPG

Image Credit: Coin Desk Construct 2017

HyperLedger projects:

Hyperledger Fabric – An implementation of blockchain technology intended as a foundation for developing blockchain applications or solutions. Fabric offers a modular design that allows plug-and-play components like consensus and membership services. It leverages container technology to host smart contracts, called chaincode, that comprise the application logic. Fabric was incubated with contributions from IBM and Digital Asset Holdings (DAH), based on a successful experiment implemented at the face-to-face hackathon hosted by JPMC in NYC during the week of March 21, 2016.
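The chaincode pattern is easy to picture in miniature. Real Fabric chaincode is written in Go (with a Java SDK in the works) and reads and writes a key-value "world state" through init/invoke/query entry points. The Python sketch below mirrors that shape with entirely hypothetical names, just to show where application logic like a balance transfer lives:

```python
# A Python sketch of the chaincode pattern: application logic reads and
# writes a key-value world state through init/invoke/query handlers.
# Illustrative only -- not the actual Fabric chaincode API.

class TransferChaincode:
    def __init__(self):
        self.state = {}  # stand-in for the ledger's key-value world state

    def init(self, args):
        # e.g. init(["a", "100", "b", "50"]) seeds two account balances
        for name, value in zip(args[::2], args[1::2]):
            self.state[name] = int(value)

    def invoke(self, func, args):
        if func == "transfer":
            src, dst, amount = args[0], args[1], int(args[2])
            if self.state.get(src, 0) < amount:
                raise ValueError("insufficient funds")
            self.state[src] -= amount
            self.state[dst] = self.state.get(dst, 0) + amount
        else:
            raise ValueError("unknown function")

    def query(self, key):
        return self.state[key]

cc = TransferChaincode()
cc.init(["a", "100", "b", "50"])
cc.invoke("transfer", ["a", "b", "30"])
print(cc.query("a"), cc.query("b"))  # 70 80
```

In real Fabric, the state lives on the shared ledger and the chaincode runs inside a container on the peers, but the handler shape is the same.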

The fastest way to take Fabric for a test drive is to use IBM’s Bluemix free developer edition, which can be found here. On the other hand, if you are a glutton for shell scripts and the terminal, you can build your own Node.js/Vagrant-based Fabric client SDK development environment by following the instructions here. A Java SDK version is apparently in the works.

Iroha – Iroha has an interesting background: it comes from the land of the rising sun. It is a distributed ledger project designed to support infrastructural projects that require distributed ledger technology in a simple, easy way. Iroha is developed in C++ and emphasizes use in mobile environments and application development.

Iroha’s stated goal is to provide the following C++ components that can be used across all Hyperledger projects:

  • Sumeragi consensus library
  • Ed25519 digital signature library
  • SHA-3 hashing library
  • Iroha transaction serialization library
  • P2P broadcast library
  • API server library
  • iOS library
  • Android library
  • JavaScript library
  • Blockchain explorer/data visualization suite

Iroha was added to the Hyperledger family with contributions from Soramitsu, Hitachi, NTT Data and Colu. A full white paper explaining the design of Iroha can be found here. The client code can be found on Github here.
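Two of the building blocks in Iroha’s list – transaction serialization and SHA-3 hashing – are easy to demonstrate with Python’s standard library. This is only a toy sketch of the idea, not Iroha’s actual C++ wire format:

```python
# Toy illustration of transaction serialization plus SHA-3 hashing,
# two of the component libraries Iroha provides.
import hashlib
import json

tx = {"from": "alice", "to": "bob", "amount": 5, "nonce": 1}

# Canonical serialization: sorted keys so every node hashes identical bytes.
serialized = json.dumps(tx, sort_keys=True, separators=(",", ":")).encode()

# SHA-3 yields the transaction digest that would then be signed
# (with Ed25519, in Iroha's case) and broadcast to peers.
digest = hashlib.sha3_256(serialized).hexdigest()
print(digest)
```

Canonical serialization matters: if two nodes serialized the same transaction with different key orders, they would compute different digests and signatures would not verify.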

Sawtooth Lake – Sawtooth Lake is a blockchain ledger project from Intel. It is designed to be a modular platform for building and running distributed ledgers. By using a pluggable architecture for consensus, Sawtooth Lake provides the ability to run both permissioned and permissionless blockchains. Sawtooth Lake comes out of the box with a consensus algorithm called PoET (Proof of Elapsed Time), which is intended to run in a Trusted Execution Environment (TEE), such as Intel® Software Guard Extensions (SGX). PoET is a lottery protocol that follows the Nakamoto consensus model to elect a leader. However, unlike Bitcoin consensus, which uses extensive Proof of Work based validation that wastes compute cycles and electricity, PoET uses the TEE to provide a guaranteed wait time for each validator (node).

Each validator in the network requests a wait time from a trusted function within the TEE. The validator with the shortest wait time for a particular transaction block is elected the leader. The TEE provides trust services like CreateTimer() and CheckTimer() to ensure that a particular validator created a timer within a TEE and waited the specified amount of time to win the lottery. By utilizing Intel’s SGX-enabled processors, which provide a TEE, PoET can scale by adding more validators without resorting to extensive Proof of Work based mining.
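A toy simulation of the lottery described above, assuming only that the TEE hands each validator a trustworthy random wait time (the real protocol proves the wait happened via CreateTimer()/CheckTimer() inside SGX; here we just draw numbers):

```python
# Simplified PoET round: every validator draws a wait time from its "TEE",
# and the shortest wait wins leadership for the block.
import random

def elect_leader(validators, rng):
    # Each validator's TEE hands back a guaranteed-random wait time.
    waits = {v: rng.expovariate(1.0) for v in validators}
    leader = min(waits, key=waits.get)
    return leader, waits

rng = random.Random(42)  # seeded for a repeatable demo
leader, waits = elect_leader(["node-a", "node-b", "node-c"], rng)
print(leader, round(waits[leader], 3))
```

Because the draws are uniform across validators, leadership rotates fairly on average, which is what lets PoET add validators without adding hashing work.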

Cello – Cello is a toolkit for deploying Blockchain-as-a-Service that reduces the effort required for creating, managing, and terminating blockchains. Getting Hyperledger Fabric up and running on different nodes requires configuring Docker scripts on various machines, which can be time-consuming and error-prone.

Cello avoids these problems by providing pre-configured environments to get a blockchain running, similar to IBM Bluemix or Microsoft Azure BaaS.

In summary, Hyperledger has a vibrant ecosystem of participants contributing to the development of open source distributed ledger technologies. The Hyperledger Technical Steering Committee ensures that the various efforts stay in sync with the core philosophies of open source contribution. You can find more information about Hyperledger at their wiki or in their Slack channel.

R3CEV

R3CEV is a financial services/distributed ledger technology company responsible for creating R3, the most famous blockchain consortium, which has managed to garner the attention of the financial industry. It has an impressive roster of 70+ members who are working with some of the sharpest minds in the DLT space, like Richard Gendal Brown (CTO of R3) and Mike Hearn (former Bitcoin Core developer), to create a new world of financial services infrastructure and applications using ideas derived from blockchain technology.

According to Clemens Wan, Associate Director of R3, they see their technology as one that requires broader buy-in from enterprises and corporates to achieve a strong network effect and important applications. According to Wan, R3’s open sourced DLT platform Corda is similar to Xbox Live, in the sense that it builds the ecosystem and the connectivity. By focusing on providing the platform and services, R3 will enable its members to innovate using DLT to solve their business problems by building a variety of use cases.

The list of use cases being worked on by R3 members is quite comprehensive:

R3 believes that 2017 will be the year of pilots and 2018 will be the year when DLT applications hit mainstream production.

R3pilot.jpg

Image credit: R3 blog.

The Corda platform has been open sourced as part of the Linux Foundation’s Hyperledger project. Corda does not attempt to replicate the Bitcoin blockchain; rather, it takes an enterprise point of view, creating a shared ledger for managing financial agreements. Corda was designed to address the pain points that stop enterprises from fully embracing DLT. To facilitate this, Corda is designed with these goals in mind, according to Richard Gendal Brown, CTO of R3CEV:

  • Corda has no unnecessary global sharing of data: only those parties with a legitimate need to know can see the data within an agreement
  • Corda choreographs workflow between firms without a central controller
  • Corda achieves consensus between firms at the level of individual deals, not the level of the system
  • Corda’s design directly enables regulatory and supervisory observer nodes
  • Corda transactions are validated by parties to the transaction rather than a broader pool of unrelated validators
  • Corda supports a variety of consensus mechanisms
  • Corda records an explicit link between human language legal prose documents and smart contract code
  • Corda is built on industry-standard tools (Kotlin, a JVM compatible language).
  • Corda has no native cryptocurrency (it rather uses real world currencies).

By addressing these design goals, Corda is able to deliver blockchain benefits like validity, uniqueness, immutability, and authentication within the context of financial services applications.

Recently, R3 and Corda were in the news when they announced that their solution is “blockchain inspired” rather than a blockchain. An interesting analysis of this topic by Chris Skinner can be found here.

Digital Asset Holdings (DAH):

DAH is a distributed ledger technology startup founded by Blythe Masters and other industry veterans. DAH aims to use distributed ledger technologies to disrupt the legacy processes that slow down the financial industry. In their own words:

In theory, a shared, immutable ledger enables transparent, peer-to-peer, real-time settlement without the need for financial intermediaries. In reality, markets require known, reliable counterparties, rights of reversal and error correction, high levels of privacy and the operational benefits of net settlement in a system in which legal entities are responsible for the perfection of title to and legal standing of financial assets. Consequently, markets will continue to benefit from and require third party service providers to perform a variety of functions as they do today: from ensuring clean title, to enabling operational and balance sheet netting.

Assets are not currently issued solely into these distributed networks, and may never be. This necessitates careful on- and off-ramping procedures for keeping the two systems in sync as new technology is adopted. These needs can be met at the transaction layer, where Digital Asset software maps business logic and legal processes to cryptographic signature flows. As an example, our software constructs transactions that enjoy privacy and, when required, permit the ability for net settlement.

Digital Asset claims to select the right kind of distributed ledger for the problem at hand; they can work with permissionless ledgers (like Bitcoin and Ethereum) as well as permissioned ledgers (Hyperledger) that provide better control. To enforce the right type of smart contract execution logic, DAH has modeled a new language called Digital Asset Modeling Language (DAML). DAML is similar to a smart contract language in many ways, but it is designed with the needs of financial institutions in mind. According to DAH, it is optimized for use in a private execution environment rather than an open execution environment (in which it would be processed by all of the nodes in a network). DAML is designed to achieve many of the same benefits as smart contracts.

DAML is not Turing complete. This allows it to focus specifically on financial services use cases where the potential outcomes are predictable (and hence avoid the halting problem). DAML focuses on verifiability – by the stakeholders of an agreement rather than by everyone – and on certainty – being able to accurately predict all possible outcomes of the agreement rather than introducing doubt with unnecessary complexity.

Consensus in DAML: DAML ensures that all stakeholders can reach consensus by utilizing a shared log containing the complete provenance of rights and obligations, along with an off-chain execution environment for processing the workflows and behaviors being modeled. DAML also ensures that not all nodes in the network need to know and process the contents of a contract; only the parties relevant to the contract are involved in its execution. Contract data is revealed on a need-to-know basis, and even the distributed ledger entry, which only contains a reference to the agreement, is encrypted so other entities cannot detect even its existence on the ledger, let alone its terms.
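One way to picture this need-to-know model is a hash commitment: the shared log holds only an opaque digest referencing the agreement, while the full terms stay with the parties, who can later prove their plaintext matches the ledger entry. This is purely an illustrative sketch, not DAH’s actual scheme:

```python
# Sketch of a need-to-know ledger: the shared ledger stores only a
# commitment (hash) to the agreement; non-parties see nothing else.
import hashlib

def commit(agreement: bytes) -> str:
    return hashlib.sha256(agreement).hexdigest()

terms = b"Alice pays Bob 100 USD on 2017-06-01"
ledger_entry = commit(terms)          # all non-parties ever see

# A regulator given the plaintext can verify it against the ledger reference.
assert commit(terms) == ledger_entry
assert commit(b"tampered terms") != ledger_entry
print("agreement verified against ledger reference")
```

The same idea underlies the regulator scenario below: revealing the plaintext to an authorized third party lets them check it against the ledger without every node ever holding the terms.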

DAML Agreements and Hyperledger: The name Hyperledger used below can be easily mistaken for the Linux Foundation’s Hyperledger project. Tim Swanson from R3 explains the naming confusion between these two in his blog post here. Take a few minutes to read that before you proceed below. If you did not have the time, here is a brief blurb:

So when someone asks “what is Hyperledger technology?” the short answer is: it is currently the name of a collective set of different codebases managed by the Linux Foundation and is not related to the original distributed ledger product called Hyperledger created by a company called Hyper that was acquired by DAH. The only tenuous connection is the name.

The combination of the Digital Asset Modeling Language and Hyperledger allows for scale and privacy while maintaining a fully reconciled system across multiple parties. DAML serves as a logic and validation layer sitting above the ledger, providing an auditable way to prove the updates that occurred to the distributed ledger. An agreement modeled in DAML is only active if Hyperledger confirms it is valid and not referenced by any other transaction, creating an independently verifiable logical mapping between the original business intent all the way through to the relevant Hyperledger transactions.

DAH and HyperLedger.JPG

Image Credit: Coin Desk Construct 2017

Nodes in the Hyperledger network that are not party to the agreement are still able to agree upon its outcome because they can independently verify that all of the required authorizations have been made without ever actually seeing the contents of the agreement itself. However, the contents of agreements can be provably revealed to authorized third parties such as regulators.

We will cover the last three (Chain, Ripple, and Enterprise Ethereum) in Part 2 of this series.

Blockchain – What is Permissioned vs Permissionless?

bitcoin.jpg

If you have been following distributed ledger technologies (Blockchain), you will have seen the term permissioned blockchain being thrown around. What is the core difference between a permissioned and a permissionless blockchain?

A permissioned blockchain restricts the actors who can contribute to the consensus of the system state. In a permissioned blockchain, only a restricted set of users have the rights to validate the block transactions. A permissioned blockchain may also restrict access to approved actors who can create smart contracts.

A permissionless blockchain is the opposite of what you read above – here, anyone can join the network, participate in block verification to build consensus, and create smart contracts. Good examples of permissionless blockchains are Bitcoin and Ethereum, where any user can join the network and start mining.

Now you may wonder what the benefits and disadvantages of each approach are. In a permissionless world, you do not have to prove your identity to the ledger. As long as you are willing to commit processing power to be part of the network and extend the blockchain, you are allowed to play. Any miner playing by the rules may solve the hash puzzle and verify the block of transactions to win the mining reward (the higher the mining power, the better the chances of winning).
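The hash puzzle works roughly like this in miniature: miners grind through nonces until the block hash meets a difficulty target (here, a required prefix of leading zeros), so more hashing power means more attempts per second and better odds:

```python
# Toy Proof of Work: find a nonce whose SHA-256 hash of (block data + nonce)
# starts with a given number of zero hex digits.
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        h = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if h.startswith(prefix):
            return nonce
        nonce += 1

nonce = mine("block-42", difficulty=4)
print(nonce)
```

Each extra zero of difficulty multiplies the expected work by 16, which is why real networks tune difficulty constantly and why the electricity cost grows so quickly.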

In the permissioned blockchain world, you need to be an approved actor in the system to participate in growing the chain as well as building consensus. Many of the blockchain consortiums that build private blockchains for financial institutions and other enterprises follow this model.

One other critical difference between these two is the underlying mining model – permissionless blockchains use Proof of Work (PoW) mining, where hashing power is offered to build trust. As long as 51% of the nodes are honest players, network consensus is reached. (Read about the 51% attack here.) While Bitcoin uses PoW mining, Ethereum is proposing to use a Proof of Stake (PoS) model for reaching consensus. Proof of Stake asks users to prove ownership of a certain amount of currency (their “stake” in the currency). Instead of buying computers and electricity for mining as in a PoW system, a PoS system uses that capital to acquire the coins/tokens that allow you to validate transactions.
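Stake-weighted selection is simple to sketch: instead of grinding hashes, the next validator is drawn at random with probability proportional to the coins staked. A minimal illustration (real PoS schemes add slashing, randomness beacons, and much more):

```python
# Toy Proof of Stake: draw the next validator with probability
# proportional to each participant's stake.
import random

def pick_validator(stakes, rng):
    # random.choices performs the stake-weighted draw
    names, weights = zip(*stakes.items())
    return rng.choices(names, weights=weights, k=1)[0]

stakes = {"alice": 60, "bob": 30, "carol": 10}
rng = random.Random(7)  # seeded for a repeatable demo
picks = [pick_validator(stakes, rng) for _ in range(10_000)]
print(picks.count("alice") / len(picks))  # roughly 0.6: share tracks stake
```

Over many rounds each participant leads in proportion to their stake, which replaces the hash-rate lottery of PoW with a capital lottery.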

Permissioned blockchains do not have to use computing-power-based mining to reach consensus, since all of the actors are known; they end up using consensus algorithms like Raft or Paxos. There are also Byzantine fault tolerant algorithms like PBFT that can be used to reach consensus without PoW mining.
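The arithmetic behind BFT-style algorithms such as PBFT is worth noting: tolerating f faulty validators requires at least 3f + 1 nodes, and decisions need a quorum of 2f + 1 matching replies – no mining involved:

```python
# PBFT sizing rule of thumb: n >= 3f + 1 nodes to tolerate f Byzantine
# faults, with a quorum of 2f + 1 matching replies per decision.
def pbft_sizes(f: int) -> dict:
    return {"faulty_tolerated": f, "min_nodes": 3 * f + 1, "quorum": 2 * f + 1}

for f in range(1, 4):
    print(pbft_sizes(f))
# f=1 needs 4 nodes, f=2 needs 7, f=3 needs 10
```

This is why small consortium networks of a handful of known banks can run BFT consensus comfortably, while it does not scale to thousands of anonymous participants the way PoW does.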

Let us look at the topic of enterprise blockchains. Almost all of the blockchains being piloted these days are permissioned. There are many reasons why this is the case:

  1. Privacy – A permissioned blockchain allows only actors with the appropriate rights to view the transactions. A permissionless blockchain, by contrast, is a shared database where everyone can read everything, but no single user controls who can write. Imagine you are a large bank using a shared ledger with a list of other banking partners within a consortium – you do not want the volume of your transactions to be visible to your competitors.
  2. Scalability – A permissioned blockchain can use a simplified consensus model (such as Proof of Stake) instead of burning computational cycles on Proof of Work. The ultimate result is scalability compared to a public blockchain network like Bitcoin. (See BigChainDB.)
  3. Fine-Grained Access Control – A permissioned blockchain allows restricted access to the data within the ledger. (See the design model underlying R3CEV’s Corda.)

I want to highlight one of the most famous interviews with Bitcoin guru Andreas Antonopoulos – the Sewer Rat and the Bubble Boy. When asked “How are enterprise/private blockchains different from bitcoin’s blockchain?”, Andreas responded:

“The banks and the corporations say, “Oh, bitcoin’s awesome. We want that. Only without the open, decentralized, peer-to-peer, borderless, permissionless part. Could we instead have a closed, controlled, tame, identity-laden permission version of that please?”

It is an interesting argument in which Andreas compares the Bitcoin network to the Internet and private blockchains to the secure intranets within enterprises. He compares them to the sewer rat and the bubble boy:

Bitcoin is a hardened platform because its security is tested on an everyday basis.

“Bitcoin is a sewer rat. It’s missing a leg. Its snout was badly mangled in an accident last year. It’s not allergic to anything. In fact, it’s probably got a couple of strains of bubonic plague on it which it treats like a common cold. You have a system that is antifragile and dynamic and robust.”

Does this mean that the enterprise blockchains are the bubble boy?

“Let’s take bitcoin, cut off its beard, take away its piercings, put it in a suit, call it blockchain, and present it to the board. It’s safe. It’s got borders. We can apply the same regulations. We can put barriers to entry and create an anti-competitive environment to control who has access. It will be more efficient than our existing banking system.

Eventually, successful, vibrant, innovative companies are the ones that turn their IT infrastructure inside out. In the future of finance, successful, vibrant, and innovative banks are the ones that turn their infrastructure inside out and make it part of an open, borderless financial system that serves the other 6 billion, that serves all the people who have been excluded from finance.”

I for one cannot wait to see how this debate will shape the future of finance. The advent of bitcoin, blockchain, and DLT technologies is a pivotal moment in the history of computing and technology. It will change how we build the systems of the future.

You can watch the full-text interview with Andreas here and the video of his talk here.

Digital Delight – How are you moved?

2007 was a watershed moment in technology. The iPhone redefined the concept of personal computing, and the world as we knew it was never the same. In just a decade, we have seen the proliferation of smart devices into every aspect of our lives. We take for granted the various conveniences offered by our smartphones.

However, too much of a good thing can also create fatigue. Our smartphones are cluttered with apps that we download, use once or twice, and never revisit. Like kids in a candy store, we hoard apps, eventually tire of them, and settle on a few that truly make our everyday lives easier. If you filter apps with this lens, I bet most folks don’t use more than 10 to 15 apps on a recurring basis.

Apple and Google haven’t really done much to help solve this hoarding issue for users. I have always thought a neat operating-system-level feature would be to notify the user of app dormancy – if I haven’t touched an app in 3+ months, is there any chance I would ever use it again? Recommend a list of these apps that I can get rid of and keep my sanity!

At the end of the day, the 10 to 15 apps we do use may not be the best of the best, but they serve a utility we cannot live without. A good example would be the apps published by your banking and credit card providers. I counted a total of 10 apps on my phone representing my banking, credit card, and investment/retirement account providers.

A fellow Fintech Mafia member, Alex Jimenez, mentioned the other day that most mobile banking apps are just online banking shoved into a small screen. I tend to agree with his assessment. In a race to keep mobile apps at feature parity, most financial institutions rush to port the kitchen sink into their mobile apps. While I appreciate a Swiss Army knife of a mobile banking app, we really don’t need 48 features in a form factor meant to engage you for less than a few minutes to take care of quick and important banking transactions.

A mobile banking product manager’s wish is to figure out what makes the customer tick. Everyone wants to build the next Uber of banking services. However, getting the formula right to digitally delight a customer is not an easy task.

What is Digital Delight you ask?

Digital delight – when an online or mobile product/service creates a pleasurable moment that makes an experience just a little more fun.

How can we introduce this digital delight as product designers? Not every app can be as digitally delightful as the game Monument Valley (which, by the way, still blows my mind!). Sometimes the opportunity lies in taking the most mundane process workflow and turning it into an informative notification that makes a huge difference.

Case in point – Delta launched a redesigned app with a feature to track the status of your checked bag.

delta1

(Image Credit – Delta)

From the moment a bag gets tagged into the system, the Fly Delta app starts tracking it in transit. You get notifications that your luggage has been loaded onto your plane. It even allows you to see where your bags are at any point in time on a satellite map!

delta2

(Image Credit – Delta)

I want to send kudos to the Delta team for making a useful feature like this part of their app update. This is certainly one of those features I was delighted to see in action. A nice video of the feature is available here.

What other apps surprise/delight you in this fashion? Share them in the feedback section below. As I see digital delights in the wild I will make sure I share them as well.

Happy New Year 2017!

Google wifi – Mother-in-Law Approved.

gwifi

I live in a three-level home, which is a terrible layout for a wifi network to provide full coverage. Interestingly enough, the home I lived in previously was much larger, but my good old 8-year-old Linksys Wireless-N router never had any issues covering 4,000 sq. feet across two levels. When I moved to my current home, I chose to have my cable modem installed at the basement level (which was a terrible mistake). The walls of this home seem to have lead in them, which almost makes the home a sort of Faraday cage, killing the signal strength on the top level.

Hence started my quest to find a router that worked across all three levels.

Setup 1:

My first move was to buy a cheap router (D-Link) and use it as a slave to my primary Linksys router in the basement. I decided to buy a NetGear Powerline adapter to extend the Ethernet connection to my third floor and then use the D-Link router to set up a slave network. In theory this sounded fine; in practice a few issues arose. I had two distinct wifi networks in my home, and I had to switch between them as I roamed around. Not to mention that both the Linksys and D-Link routers had dual-band radios for 2.4 GHz and 5 GHz, so I had four wifi networks in total!

Why not disable one of the radios, you ask? Well, 5 GHz does extremely well for short-distance applications (like streaming to an Apple TV), while the 2.4 GHz radio crosses walls better. To make things more interesting, I had the 5 GHz radios in Wireless-N mode while the 2.4 GHz band operated in Wireless G/B mixed mode to provide compatibility with older devices at home. (As crazy as it may sound, an average home these days can easily have 20+ wireless devices, considering all of the cell phones, tablets, smart TVs, laptops, media players, smart home assistants like Alexa, refrigerators, etc.)

Setup 1 worked OK for a while, but I was getting mysterious signal drops when I was on the slave network on the third level. The connection would randomly freeze, and I would have to reset the wifi radio on my phone to get connectivity back. I was not sure what the culprit was. By process of elimination, I could tell the basement Linksys router was working OK, and the slave D-Link upstairs was working fine as well (I swapped them and the problem still persisted). This made me conclude the culprit might be the NetGear Powerline adapter (which uses the home’s electrical wiring as an extended wired Ethernet network).

To test this theory, I bought a TP-Link Nano Ethernet Powerline adapter to see if it solved the problem. After swapping it with the NetGear, the problem seemed to go away for a few days, and then it was back again.

House 1:  Me 0.

Setup 2:

Based on lessons learned from Setup 1, I realized there might be a hidden electrical goblin in my house wiring eating my IP packets. I started exploring alternate options like wifi extenders, which technically extend the same wifi network across a bigger area. I ended up buying the Amped Wireless extender and connected it to my basement Linksys router. While this seemed to marginally solve the problem, I was still having intermittent issues with connectivity dropping randomly, which was quite frustrating. After a two-week trial run, I returned the Amped to Costco.

House 2: Me 0.

Setup 3:

By now I was starting to think the issue might be my Linksys router, which had not had a firmware update from Cisco in four years. I started researching new routers. This is when Google got into the router business and launched their OnHub router with the help of TP-Link and Asus. OnHub was Google’s first foray into the home networking business and was supposed to solve all of my coverage issues. I was ready to cry tears of joy and drove to Best Buy to pick it up on day 1 when it was released.

OnHub ended up being a huge disappointment. Since it was a router in its own right, I replaced the basement Linksys with the OnHub. The setup was super easy, with a smartphone app that configured all the settings, which was very cool (compared to the legacy admin consoles hosted at 192.168.1.1 in older routers). It was definitely very user friendly, but it ultimately failed the important test: it did not have the range to penetrate the lead-laden walls of my house to the third level. I mucked around with various settings in vain. The best speed I was able to get on the third level was a measly 750 kbps. No YouTube or Netflix streaming while I was in bed.

House 3: Me 0.

Setup 4:

The OnHub was promptly returned after 3 days of testing, as it didn’t quite cut it for my needs. At this point, I was fairly disillusioned with fancy new routers and did more research to find the holy grail. After reading many reviews on The Wirecutter, I finally decided to give the Archer C9 a try. (The Archer C7 was rated by The Wirecutter as one of the best routers on the market for most people.) I picked up the Archer C9 from Costco. As an added bonus, it came with a range extender that worked out of the box.

The Archer C9 did OK in most of my testing and has been my primary router for the past year. The extender takes the same wifi network name, so I did not have to keep switching networks as I moved around the house. However, I did notice a little lag in streaming speeds on the third level. It worked most of the time, but sometimes it would make me reset the wifi radio on my phone to get it working again. (By now you may have realized that turning wifi on and off is something I do at least a few times on an average day, and over a period of four years it gets mighty annoying. You ultimately give up and accept it as the price to pay for a wifi connection – I keep reminding myself of the pre-wifi days, when I bought a 100-meter Ethernet cable 12 years ago because wifi was not yet a common concept!)

House 3.5: Me 0.5.

Setup 5:

Recently I came to know about a new breed of mesh-networked home wifi equipment that has been popping up on the home network scene. The most prominent of these were NetGear’s Orbi, Eero Home WiFi, Luma Home, and Google Wifi.

I stayed away from making the leap, as this equipment was all quite pricey (typically more than $400). However, when I watched the Google hardware event on October 4th, I was blown away by some of the things Google launched and announced. They had a wide range of products, from the Pixel phone to VR goggles to Google Wifi. Watch this video starting at minute 50 to catch the Google Wifi announcement. The pricing seemed reasonable, with the three-pack starting at $300. I decided to give it one final shot and preordered Google Wifi from Amazon.

I received the Google Wifi 3-pack today and opened it to perform the setup. The packaging was very Apple-like. There was just a quick-start card inside, along with the units, power cords, and an Ethernet cable. Very simple and spartan. Following the quick-start instructions, I downloaded the Google Wifi app from the App Store and started the installation process.

I set up the circular, hockey-puck-looking units (a little larger than a puck) at each level of my home. These devices are beautifully designed and blend into the environment. The bottom of each device has the SSID of the unit and a password to connect to it. (Note: you will only use this SSID and password during setup.)

Setup was a little confusing to start with. The app was attempting to connect to a unit without asking me to join the SSID broadcast by the base unit. I had to manually switch to the phone’s wifi settings to connect to the SSID of the base unit. At that point, the setup app took over and asked me to name the location where I placed the unit (basement) and to name the home network.

Once I did this step and the network was created, the app asked if I had more Google Wifi units. I had two more to set up. The app again went on a wild goose chase trying to locate the units. I had to switch to the phone’s wifi settings again to identify the unit in my family room by name. Once I connected to this unit using the ID and password on the bottom of the unit, the app took over again. This time the app recognized it as a peer unit and configured it as part of the home mesh network. I repeated the same steps for the unit on my third level, and it connected to the network like a charm. I breathed a sigh of relief when I saw all three units were part of the mesh network under the home network SSID I had created when setting up the first unit.

Now for the acid test. The Speed Test app blazed through on all three levels, with speeds consistently exceeding 70 Mbps. I tested the speeds while stationary on each level. Then, to really put the system to the test, I started a speed test and moved around the house where I knew the connectivity would be weak. Impressively enough, I could see the speeds slow a bit in dead spots, but the mesh network came to the rescue with another unit picking up the signal seamlessly, and I could visually see the speeds normalize to 50 Mbps and above.

Finally I think I have a winner with Google wifi!

The best part is that I hope I no longer have to keep resetting the wifi to stream YouTube on my visiting mother-in-law’s iPad – which is worth its weight in gold!

House KO: Me 🙂