Sunday, 24 June 2018

Taking the AI training wheels off: moving from PoC to production

In helping dozens of organizations build on-premises AI initiatives, we have seen three fundamental stages organizations go through on their journey to enterprise-scale AI.

First, individual data scientists experiment with promising proof-of-concept (PoC) projects. These PoCs often hit knowledge, data management and infrastructure performance obstacles that keep them from reaching the second stage: delivering optimized, trained models quickly enough to provide value to the organization. Moving to the third and final stage of AI adoption, where AI is integrated across multiple lines of business and requires enterprise-scale infrastructure, presents significant integration, security and support challenges.

Today IBM introduced IBM PowerAI Enterprise and an on-premises AI infrastructure reference architecture to help organizations jump-start AI and deep learning projects, and to remove the obstacles to moving from experimentation to production and ultimately to enterprise-scale AI.

On-premises AI infrastructure reference architecture


AI and deep learning are sophisticated, rapidly changing areas of data analytics. Few people today have the extensive knowledge and experience needed to implement a solution.

To help fill this knowledge gap, IBM has built PowerAI Enterprise – easy-to-use, integrated tools to get AI open source frameworks up and running quickly. These tools utilize cognitive algorithms and automation to dramatically increase the productivity of data scientists throughout the AI workflow. This tested, validated and optimized AI reference architecture includes GPU-accelerated servers purposely built for AI. There is also a scalable storage infrastructure that not only cost-effectively handles the volume of data needed for AI, but also delivers the performance needed to keep data-hungry GPUs busy all of the time.


IBM AI Infrastructure Reference Architecture

Ritu Joyti, Vice President of IDC’s Cloud IaaS, Enterprise Storage and Server practice, noted: “IBM has one of the most comprehensive AI solution stacks that includes tools and software for all the critical personas of AI deployments, including the data scientists. Their solution helps reduce the complexity of AI deployments, helps organizations improve productivity and efficiency, lowers acquisition and support costs, and accelerates adoption of AI.”

One customer that has successfully navigated the new world of AI is Wells Fargo, which uses deep learning models to comply with a critical financial validation process. Their data scientists build, enhance and validate hundreds of models each day, and speed is critical, as is scalability, as they deal with growing amounts of data and more complicated models. As Richard Liu, Quantitative Analytics manager at Wells Fargo, said at IBM Think: “Academically, people talk about fancy algorithms. But in real life, how efficiently the models run in distributed environments is critical.” Wells Fargo uses the IBM AI Enterprise software platform for the speed and the resource scheduling and management functionality it provides. “IBM is a very good partner and we are very pleased with their solution,” added Liu.

When a large Canadian financial institution wanted to build an AI Center of Competency for 35 data scientists to help identify fraud, minimize risk, and increase customer satisfaction, they turned to IBM. By deploying the IBM Systems AI Infrastructure Reference Architecture, they now provide distributed deep learning as a service designed to enable easy-to-deploy, unique environments for each data scientist across shared resources.

Get started quickly


PowerAI Enterprise shortcuts the time needed to get up and running with an AI environment that supports the data scientist from data ingest and preparation, through training and optimization, and finally to testing and inference. Included are fully compiled, ready-to-use IBM-optimized versions of popular open source deep learning frameworks (including TensorFlow and IBM Caffe), as well as a software framework designed to support distributed deep learning and scale to hundreds or thousands of nodes. The whole solution comes with support from IBM, including for the open source frameworks.

The IBM Systems AI Infrastructure Reference Architecture is built on IBM Power System servers and IBM Elastic Storage Server (ESS), with a software stack that includes IBM PowerAI Enterprise and IBM’s award-winning Spectrum Scale. IBM PowerAI Enterprise installs full versions of IBM PowerAI Base, IBM Spectrum Conductor and IBM Spectrum Conductor Deep Learning Impact.


IBM Spectrum Scale’s easy-to-use interface

IBM PowerAI Enterprise


IBM PowerAI Enterprise extends all of the capability we have been packing into our distribution of deep learning and machine learning frameworks, PowerAI, by adding tools that span the entire model development workflow. With these capabilities, customers can develop better models more quickly and, as their requirements grow, efficiently scale and share data science infrastructure.

To shorten data preparation and transformation time, PowerAI Enterprise integrates a structured, template-based approach to building and transforming data sets. It also includes powerful model setup tools designed to eliminate the earliest “dead end” training runs. By instrumenting the training process, PowerAI Enterprise lets a data scientist see real-time feedback on the training cycle, eliminating potentially wasted time and speeding time to accuracy.

Bringing these and other capabilities together accelerates development for data scientists, and the combination of automating the workflow and extending the capabilities of open source frameworks unlocks the hidden value in organizational data.

Saturday, 23 June 2018

Intellectual Property and the software-defined supply chain

Intellectual property issues are often flagged as an area of concern for the development of the software-defined supply chain, yet the nature of these perceived issues is rarely clearly identified.


CAD/CAM, CNC and fast prototyping techniques, FPGAs and ASICs are well-established digital manufacturing techniques; however, they tend to require considerable technical expertise and substantial customization and set-up effort, and they vary a great deal from implementation to implementation. This acts as a barrier to infringement, since even if one were to acquire the digital design files for a particular product, the skills, equipment and effort needed to use them would be prohibitive. The software-defined supply chain's emphasis on small-scale local production on the one hand, and its dependence on low-cost open source solutions on the other, will lead to a high degree of standardization in manufacturing platforms, so that it becomes feasible to reuse copied design files. Furthermore, the dispersed nature of manufacturing means that such files will be more widely disseminated and more likely to be exposed.

The core concern is therefore that it will become possible to download copied plans for a product and have the product manufactured without the original creator having any control or receiving any remuneration.

Protecting your investment in distributable designs


Digital product designs will generally be protected by copyright, which benefits from good harmonization worldwide, and may furthermore be the subject of patents, design rights, registered designs and trademarks, subject to local requirements. On this basis, the consumer and manufacturer in the scenario described above would almost certainly infringe the original creator’s IP rights and expose themselves to civil action. To this extent, the IP system provides satisfactory basic tools for protection of the different types of value that may be embodied in a digital design. The lowering of the manufacturing hurdle makes it all the more important to ensure that relevant IP protection is identified and secured in good time.

The real issues here relate not to the primary law that is applicable, but rather to how it can be enforced. The software-defined supply chain implies a multiplication of possible infringers, most of whom will be small businesses or even private individuals, with minimal liquidity and low liability for damages. The protection available under IP rights corresponds best to large-scale infringements: bringing suit for IP infringement is a long and expensive business, and in a software-defined supply chain context may be out of all proportion to the recoverable damages.

The approaches developed in the electronic media field may provide a useful starting point when looking for solutions to these perceived problems.

For certain digital products it is usual to make distribution subject to contractual provisions which define how the consumer may use the product, address warranty issues, and cover redress and termination matters. Shrink-wrap and click-through licensing approaches have been developed and widely adopted. Open source-style licensing, where contract acceptance is implied through use of the code, is a step further along this path which may be helpful in the context of distributable design files.

Digital Rights Management is another set of techniques developed in the digital media field. Such techniques may well be helpful in the case of distributable design files. DRM mechanisms might be tied to hardware dongles, manufacturing machinery or activation keys to ensure that only authorized manufacturers implement the designs. This suggests the development of distributable design file formats supporting encryption and other DRM-enabling technologies. In some cases this may also call for contributions to the design of manufacturing machinery such as 3D printers to ensure that they are able to function correctly and securely with DRM-protected distributable design files. This may well be relatively straightforward technically, given the open source approach of many such printers, but may need careful management from a social point of view given the general hostility of the open source movement to DRM.

While the primary path to enforcing IP rights is through litigation in the courts, the distributed nature of the software-defined supply chain makes this approach overly cumbersome, since the recoverable damages will often be less than the cost of the process itself. A number of administrative and quasi-judicial measures currently available in some jurisdictions may be taken as possible models for protection of distributable design files.

In some jurisdictions some types of IP infringement may be subject to sanctions under criminal law. This may enable the right holder to enlist the help of public law enforcement bodies to bring infringers to justice. Such activities may not always be within the remit of Police organizations, but Trading Standards and Customs bodies are often more familiar with actions of this kind. Generally such bodies are most comfortable with trademark enforcement, and organizations pursuing a software-defined supply chain model may do well to pay special attention to this aspect of their IP strategy.

Some jurisdictions have developed special measures relating to IP infringements using the internet, and in particular the downloading by consumers of media files from peer-to-peer networks and the like. The details of the process and available sanctions vary widely from jurisdiction to jurisdiction, and may involve a graduated series of warning messages, internet access restrictions and eventually a streamlined judicial process. These measures are designed with a view to the same need for lightweight, low-cost processes that would be equally applicable in the case of software-defined supply chain infringements.

Other IP issues inherent in the software-defined supply chain


The Software-defined supply chain model suggests a variety of uses for open source material:

◉ Open source developed 3D printing equipment
◉ Open source firmware on products
◉ Open source distribution of design files for 3D printing

The adoption of open source materials in mission-critical product manufacturing potentially exposes an organization to certain special risks, in particular:

◉ The viral effects of certain open source licenses
◉ Difficulties in establishing provenance and/or licensing terms for code
◉ Difficulties in interpreting and complying with licensing terms

This suggests that businesses wishing to adopt the software-defined supply chain model will need a high degree of sophistication and strong processes for identifying and resolving such issues.

Handling third party design contributions


A key benefit of the software-defined supply chain model is that it becomes viable to offer many variants of a product, to better correspond to local preferences or even the desires of individual consumers. While merely offering combinations of predefined choices does not raise any special IP issues, the approach's flexibility lends itself to third parties offering custom modifications, and even to the creation of communities of enthusiasts modifying the original designer's works, or indeed the works of other enthusiasts. In this context, the issues of control and ownership arise.

Generally, any such modifications will constitute derivative works of the original design, and as such would infringe the copyright in those designs unless permitted by an applicable license agreement. While such scenarios can be inhibited by the use of the mechanisms described above, it may prove advantageous to foster this type of community contribution to some extent, and the license provides the means to achieve this. Indeed, this may be seen as a further advantage of the software-defined supply chain: IP rights in physical products may be exhausted by sale of the product, whereas in a license-based, software-defined product the original designer decides what rights to give.

Accordingly, one approach may be to separate the design files into two or more licensing categories, with certain parts of the file frozen and other parts left open to modification. The frozen parts might be encrypted or otherwise protected by DRM-type mechanisms. The parts left open to modification, which may correspond to the external appearance of the article, may be licensed under an open source-style license, or a license permitting modification and distribution but stipulating that all modifications are ceded to the original designer. The issue of how modified designs may be used commercially will also need to be addressed.

Friday, 22 June 2018

IBM – Microservices Specialization on Coursera – a learning journey


IBM – Microservices Specialization is intended for application developers seeking to understand the benefits of microservices architecture and container-based applications. The student learns how to develop and deploy microservices applications with Kubernetes on IBM Cloud and IBM Cloud Private via a continuous release pipeline.

There are four self-study courses in this specialization; each course offers exercises followed by a badge quiz. When you complete all the courses and earn the badge for each one, you will also earn the IBM Microservices Specialization badge, which will become available soon. You can also take each course by itself if you only need skills in one area, by going directly to the course page.



In enterprise environments, the microservices architectural style is gaining momentum. In this course, you will learn why microservices are well suited to modern cloud environments, which require short development and delivery cycles. You will learn the characteristics of microservices and compare the microservice architecture with the monolithic style, emphasizing why microservices are well suited to continuous delivery.

While microservices are more modular to develop and may look simpler, you will discover that the complexity does not go away; it shifts. An inevitable organizational complexity comes along with many small interacting pieces, and managing, monitoring, logging and updating microservices creates greater operational complexity. In this course you learn about the tools necessary to successfully deploy, manage and monitor microservice-based applications.

After taking this course, you will have a much better understanding of why microservices are so well suited to cloud environments, the DevOps environments in which microservices run and the tools to manage the complexity that microservices bring to the operational and production environment.


This course provides an introduction to Microclimate, an end-to-end development environment that lets you rapidly create, edit, and deploy applications that run in containers. Microclimate can be installed locally, or on IBM Cloud Private, where you can create a pipeline for continuous integration and delivery.

In this course, you learn how to quickly set up a development environment for working with Microclimate, and import a sample application. Using the integrated Jenkins pipeline and GitHub, you also learn how to deploy a microservice application to IBM Cloud Private.


In this course, you learn how to install the Kubernetes command-line interface (CLI), and create a Kubernetes cluster on which to run applications. Hands-on tutorials show you how to deploy microservices to a Kubernetes cluster. You also learn about securing and managing a Kubernetes cluster, and how to plan your Kubernetes cluster for deployment on IBM Cloud.
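To give a concrete flavor of the kind of deployment the hands-on tutorials cover, here is a minimal, hypothetical Kubernetes Deployment manifest (the application name, labels and container image are placeholders, not part of the course materials):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-microservice          # hypothetical service name
spec:
  replicas: 2                       # run two pods for availability
  selector:
    matchLabels:
      app: hello-microservice
  template:
    metadata:
      labels:
        app: hello-microservice
    spec:
      containers:
      - name: hello
        image: registry.example.com/hello:1.0   # placeholder image
        ports:
        - containerPort: 8080       # port the microservice listens on
```

A manifest like this would typically be applied with `kubectl apply -f deployment.yaml` and inspected with `kubectl get deployments`.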

The ideal candidate for this course has a basic understanding of cloud computing, a working knowledge of developing microservices, and some experience working with IBM Cloud. Experience with Docker and familiarity with YAML are also a plus.


IBM Cloud Private is an application platform for developing and managing on-premises, containerized applications. It includes the container orchestrator Kubernetes, a private image repository, a management console, and monitoring frameworks. In this course, you learn how to install and configure IBM Cloud Private components in your environment, and how to prepare microservices applications for deployment.

Wednesday, 20 June 2018

Is pulling your organization’s IP like pulling teeth?

Many organizations suffer from a lack of time and resources to get anything other than the supercritical accomplished, and often through superhero efforts at that. With the constant pressure to do more with less, who has time to write down a quick summary about an improvement to a customer design that shaved a week off the schedule, or a tweak to a model that generated 10% more accurate simulation results?


What’s more, who even recognizes the achievement as valuable intellectual property? With so much focus on the end result, the critical know-how responsible for getting that result goes undocumented, unprotected and unvalued, putting freedom to operate and the ability to keep IP out of the hands of competitors at serious risk.

So how can one conquer the constant IP-pull battle and create a self-sustaining, IP-push culture of innovation given these intense pressures? Let’s first take a look at various innovation-pulling initiatives and point out what you have probably already seen go wrong when they are used in isolation.

1. Incentives:


Incentives often come up as a way of getting blood out of a stone. Although in many cases incentives (either intrinsic or extrinsic) may help for a short time, things quickly return to the status quo. The same responsible employees bound by their duty to disclose IP (who know how to recognize it and what to do with it at that point) continue to do so; the rest go back to fighting the fires that encompass their day jobs. Incentive programs certainly can be good for both morale and promoting good IP disclosure practices, but for them to become a driving force for changing the culture requires a sustained and focused effort that may be difficult for the very reasons we discussed.

2. Quotas:


How about forcing already overloaded workers to promise to submit, or have their employees submit, a certain number of disclosures as potentially valuable IP? This certainly yields the bare minimum number needed to fill the quota, at the last minute and of relatively low quality. The laws of statistics argue that there may still be valuable IP amongst the chaff, but how much of it was lost earlier in the year, when the pressure wasn’t focused on getting disclosures submitted?

3. Invention Miners:


Perhaps a special-ops task force that constantly mines for IP throughout the organization, like a robotic vacuum cleaner sucking ideas out of people’s heads? This is probably more effective than the first two, but now you’re going to need additional precious and rare requisitions that are slated for higher priorities (according to a likely non-IP-centric executive team). Hiring a lower-cost consulting firm is an option; however, the business expertise lies within your own organization, and directing the ongoing effort would likely require some internal resources, at least in the short term.

4. Innovation Lab:


What about a dedicated “innovation lab” where employees can rotate through on a part-time or full-time basis? We’ve seen significant success for organizations that set up a dedicated workspace that promotes creativity and collaboration. The drawback is, of course, that you are only receiving ideas from a small subset of the organization, and although the ideas may be important, forward-thinking solutions, there may be much more critical-to-the-business-today IP that is not being captured.

5. Innovation Day:


Another tack may be to host an “innovation day” at the office. A message from the executive team about the importance of IP, some encouraging insights from inventors (guests or otherwise), a word about the submission process and some food might go a long way to inspire creativity, spread the word, help employees connect with others and fill a few of the invention coffers. Although the effects may not be long-lasting, it might be sufficient to host something similar once or twice a year to keep getting the message across and bring a few inventions in the door. The trick will be to make it sincere and respected, not hokey.

6. Enlist OC:


Outside counsel (OC) can be a valuable partner resource to help with invention mining as well. If you have good OC, they should be able to assist with invention mining efforts directly with your teams. Being part of the process enables them to gain a clear understanding of the inventions, which typically leads to quality applications that can be filed quickly after the session. They may be expensive; however, many enjoy invention mining and may provide a discount for such services, especially if you want to try a few “pilots”. Invite them to the innovation fair while you’re at it, and have them meet with some of the employees directly.

7. IP Champions:


Volunteer armies of patent or IP champions have been deployed in some organizations to instill innovation awareness, provide training and do some invention mining amongst their teams. While this can be an incredibly effective grassroots effort, it takes very special individuals with a passion for IP to make it work. Most volunteers will quickly go back to their day jobs after a minimal amount of effort (hey, isn’t that what we all do?).

The truth is that the problem doesn’t lie with the innovation programs themselves, and it’s certainly not with the employees, but rather with getting the right mix of programs at the right time to the right people. Easier said than done, but certainly doable. Just look at IBM’s proven patent leadership and resounding culture of innovation, which stem from just the right concoction of invention capture initiatives, carefully managed and always flexible.

Although we have yet to find a silver-bullet cure for making the process of pulling IP from the organization completely pain-free, using a mix of each of the invention cultivation tools at various points in the R&D cycle and calendar year, coupled with careful monitoring of changes to the business and its IP needs, should help build an internally sustainable culture of innovation that creates much more push and requires much less pull. The mix of push and pull initiatives helps ensure you’re covering all your bases: at any given moment, employees are working in various stages of a project, with different teams and often on very different tasks, so what works for some projects and employees might not work as well for others. At the very least, your organization should be better equipped to capture critical IP and mitigate the damaging risks of IP loss or of losing freedom to operate, without breaking the bank.

Sunday, 17 June 2018

Self-sovereign identity: Why blockchain?

One of the most common questions I get when talking to customers and analysts about the self-sovereign identity (SSI) movement is, “Why blockchain?”

This question tends to stem from the notion that data associated with a person’s identity is destined to be stored, shared and used for verification on some form of distributed ledger technology. My hope is that this article will help to debunk that notion and provide a basic foundational understanding of how distributed ledger technology is being used to solve our identity infrastructure dilemma and address the consequences of the internet lacking an identity layer.

Busting the myth of on-chain PII


One of the most common myths surrounding blockchain and identity is that blockchain technology provides an ideal distributed alternative to a centralized database for storing personally identifiable information (PII). There are several flavors of this perception: (a) using a blockchain to store the PII itself; (b) using a blockchain as a distributed hash table (DHT) for PII data stored off-chain.

Yes, blockchain can technically support placing PII on the chain, or it can be used to create attestations on the chain that point to off-chain PII storage. But just because a technology can be applied to a specific problem does not mean that it is the proper tool for the job. This misconception about PII storage in the early stages of the blockchain adoption lifecycle is so pervasive that it recently inspired a Twitter thread dedicated to the debate on why putting hashed PII on any immutable ledger is a bad idea. From GDPR compliance, to correlation, to the cost of block read/write transactions, the debate continues.
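To see why even hashed PII on an immutable ledger is risky, consider the sketch below (the phone-number format is a hypothetical example): identical inputs produce identical digests, which enables correlation across records, and a low-entropy input space lets anyone recover the original value by brute force.

```python
import hashlib

# Hypothetical illustration: hashing low-entropy PII does not protect it.
def ledger_hash(pii: str) -> str:
    return hashlib.sha256(pii.encode()).hexdigest()

# Two services independently hash the same phone number...
record_a = ledger_hash("555-0142")
record_b = ledger_hash("555-0142")
# ...and the identical digests let anyone correlate the two records.
assert record_a == record_b

# Worse, the small input space makes the hash trivially reversible:
target = record_a
recovered = next(
    n for n in (f"555-{i:04d}" for i in range(10000))
    if ledger_hash(n) == target
)
print(recovered)  # 555-0142
```

Because the ledger is immutable, such a digest can never be deleted later, which is one of the GDPR concerns mentioned above.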

Blockchain technology is much more than a distributed storage system. My intent here is to help the inquisitive identity solution researcher debunk beliefs about PII storage approaches by building an understanding of how blockchain can be used as an infrastructure for identity attestations. My hope is that this article will serve as a helpful aid towards that education and awareness.

The SSI initiative is a perfect counterpunch to detrimental PII management practices. An SSI solution uses a distributed ledger to establish immutable recordings of lifecycle events for globally unique decentralized identifiers (DIDs). Consider the global domain name system (DNS) as an exemplar of a widely accepted public mapping utility: this hierarchical, decentralized naming system maps domain names to the numerical IP addresses needed for locating and identifying computers, services and other connected devices using the underlying network protocols. Analogously, an SSI solution based on DIDs complies with the same internet standard that underpins universally unique identifiers (UUIDs) and provides the mapping of a unique identifier, such as a DID, to an entity: a person, organization or connected device. However, the verifiable credentials associated with an individual's DID and PII are never placed on a public ledger. A verifiable credential is cryptographically shared between peers at the edges of the network. The recipient of a verifiable credential in a peer-to-peer connection, known as the verifier, uses the associated DID as a resource locator for the sender's public verification key, so that the data in the verifiable credential can be decoded and validated.
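To make the flow concrete, here is a minimal sketch of that pattern, using textbook RSA with toy parameters purely for illustration (the DID, credential contents and key values are all hypothetical; production SSI stacks use modern signature schemes such as Ed25519):

```python
import hashlib

# The public ledger stores only DID -> public verification key (no PII).
LEDGER = {"did:example:alice": {"e": 17, "n": 3233}}  # toy RSA public key
ALICE_PRIVATE_D = 2753  # held only by Alice, never placed on the ledger

def digest(msg: bytes, n: int) -> int:
    # Reduce a SHA-256 digest into the toy RSA message space.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# Alice signs a verifiable credential and shares it peer-to-peer;
# the PII inside the credential stays off-chain.
credential = b'{"name": "Alice", "degree": "BSc"}'
h = digest(credential, 3233)
signature = pow(h, ALICE_PRIVATE_D, 3233)

# The verifier resolves the DID on the ledger to fetch the public key,
# then checks the signature against the credential it received.
key = LEDGER["did:example:alice"]
assert pow(signature, key["e"], key["n"]) == digest(credential, key["n"])
print("credential verified")
```

The key point the sketch shows is architectural: the ledger acts only as a public-key directory keyed by DID, while the credential itself travels directly between peers.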

No PII on ledger, then why blockchain?


So, what problem is blockchain solving for identity if PII is not being stored on the ledger? The short answer is that blockchain provides a transparent, immutable, reliable and auditable way to address the seamless and secure exchange of cryptographic keys. To better understand this position, let us explore some foundational concepts.

Encryption schemes


Initial cryptography solutions used symmetrical encryption, a scheme based on a single secret key, which can be a number, a word or a string of random letters. Symmetrical encryption blends the secret key with the plain text of a message in an algorithm-specific manner to hide the message. If the sender and the recipient of a message have shared the secret key, they can encrypt and decrypt messages for each other. A drawback of this approach is the requirement to exchange the secret encryption key with all recipients before they can decrypt anything.
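As a toy illustration of the shared-key idea (not a real cipher; production systems use vetted algorithms such as AES), the sketch below derives a keystream from the shared secret and XORs it with the message, so the very same key both encrypts and decrypts:

```python
import hashlib

def keystream(secret: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the shared secret (toy only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(secret: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so this one function encrypts and decrypts.
    ks = keystream(secret, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

shared_key = b"key exchanged in advance"  # both parties must already hold this
ciphertext = xor_cipher(shared_key, b"meet at noon")
assert xor_cipher(shared_key, ciphertext) == b"meet at noon"
```

The last line is exactly the drawback described above: decryption is impossible until the secret key has somehow been exchanged.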

Asymmetrical encryption, or public key cryptography, is a scheme based on two keys. It addresses the shortcomings of symmetrical encryption by using one key to encrypt and another to decrypt a message. Since malicious actors know that anyone with the secret key can decrypt a message encrypted with that same key, they are motivated to obtain access to it. To deter such attempts and improve security, asymmetrical encryption allows a public key to be made freely available to anyone who might want to send you a message, while the second, private key is managed so that only the owner has access to it. A message encrypted with a public key can only be decrypted with the corresponding private key, while a message encrypted with a private key can be decrypted with the corresponding public key.
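The two-key relationship can be demonstrated with textbook RSA using deliberately tiny primes (illustrative only; real deployments use keys thousands of bits long together with padding schemes):

```python
# Textbook RSA with tiny primes p=61, q=53 (toy parameters).
n = 61 * 53          # public modulus (3233)
e = 17               # public exponent, published freely
d = 2753             # private exponent, kept secret by the owner

message = 65                        # a message encoded as a number < n
ciphertext = pow(message, e, n)     # anyone can encrypt with the public key
assert pow(ciphertext, d, n) == 65  # only the private key decrypts

signature = pow(message, d, n)      # signing: transform with the private key
assert pow(signature, e, n) == 65   # anyone can verify with the public key
```

The first pair of lines shows public-key encryption; the second shows the reverse direction, which is the basis of digital signatures.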

Unfortunately, asymmetric encryption introduces the problem of discovering a trusted and authentic public key. Today the most pervasive technique for public key discovery in communications based on a client-server model is the use of digital certificates. A digital certificate is a document that binds metadata about a trusted server to a person or organization. The metadata contained in this digital document includes details such as the organization’s name, the organization that issued the certificate, the user’s email address and country, and the user’s public key. When using digital certificates, the parties that need to communicate in a secure, encrypted manner discover each other’s public keys by extracting them from the certificates obtained from the trusted server.

Trust chains


A trusted server, or certificate authority, uses digital certificates to provide a mechanism whereby trust can be established through a chain of known or associated endorsements. For example, Alice can be confident that the public key in Carol’s digital certificate belongs to Carol because Alice can walk the chain of certificate endorsements from trusted relationships back to a common root of trust.
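Walking such a chain of endorsements can be sketched as follows (the certificate names and fields are hypothetical, and real chains are verified with signatures rather than simple lookups):

```python
# Hypothetical certificate store: each entry records who endorsed whom.
CERTS = {
    "carol":   {"subject": "carol",   "issuer": "acme-ca"},
    "acme-ca": {"subject": "acme-ca", "issuer": "root-ca"},
    "root-ca": {"subject": "root-ca", "issuer": "root-ca"},  # self-signed root
}

def chain_to_root(subject: str) -> list:
    # Follow issuer links until we reach a self-signed root of trust.
    chain = [subject]
    while CERTS[subject]["issuer"] != subject:
        subject = CERTS[subject]["issuer"]
        chain.append(subject)
    return chain

# Alice trusts root-ca, so she can trust Carol's key via the chain.
print(chain_to_root("carol"))  # ['carol', 'acme-ca', 'root-ca']
```

In a real PKI, each step of the walk also verifies that the issuer's signature on the certificate is valid, which is what makes the endorsement chain trustworthy.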


Our current identity authentication scheme on the internet is based on asymmetric encryption and the use of a centralized trust model. Public key infrastructure (PKI) implements this centralized trust model by inserting reliance on a hierarchy of certificate authorities. These certificate authorities establish the authenticity of the binding between a public key and its owner via the issuance of digital certificates.

Understanding the key exchange dilemma


As the identity industry migrates beyond authentication based on the current client-server model towards a peer-to-peer relationship model based on private encrypted connections, it is important to understand the differences between symmetric and asymmetric encryption schemes:

◈ Symmetric encryption uses a single key that must be shared among everyone who needs to read the message.
◈ Asymmetric encryption uses a public/private key pair to encrypt and decrypt messages.
◈ Asymmetric encryption tends to require more setup and processing time than symmetric encryption.
◈ Asymmetric encryption eliminates the need to share a symmetric key by using a pair of public-private keys.
◈ Key discovery and sharing in symmetric key encryption can be addressed using inconvenient and expensive methods:
    ◈ Face-to-face key exchange
    ◈ Reliance on a trusted third party that has a relationship with all message stakeholders
◈ Asymmetric encryption eliminates the problem of secret key exchange, but introduces the issue of trusting the authenticity of a publicly available key. Nevertheless, similar methods can be used for the discovery and sharing of trusted public keys:
    ◈ Face-to-face key exchange
    ◈ Reliance on a trusted third party that has a relationship with all message stakeholders
    ◈ Certificates that provide digitally signed assertions that a specific key belongs to an entity
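The first bullet, the shared-key requirement of symmetric encryption, can be seen in a toy cipher. XOR is used here purely for illustration; it is not a secure cipher, but it makes the point that encryption and decryption are the identical operation with the identical shared key.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating shared key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

shared_key = b"s3cret"                       # both parties must hold this
ciphertext = xor_cipher(b"meet at noon", shared_key)

# Decryption is the same operation with the same key -- which is exactly
# why that key must somehow be exchanged between the parties first:
assert xor_cipher(ciphertext, shared_key) == b"meet at noon"
```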

Rebooting the web of trust


What if we wanted to avoid this centralized reliance on a trust chain of certificate authorities? What if we could leverage distributed ledger technology as a transparent and immutable source for verifying and auditing the authenticity of the binding between a public key and its owner?

An alternative to the PKI-based centralized trust model, which relies exclusively on a hierarchy of certificate authorities, is a decentralized trust model. A web of trust, which relies on an individual’s social network as the source of trust, offers one approach to this decentralized alternative, and the emergence of distributed ledger technology has given new life to the web of trust vision. Solutions using self-sovereign identity (SSI) can leverage a distributed ledger as the basis for a new web of trust model that provides immutable recordings of the lifecycle events associated with the binding between a public key and its owner.

Decentralized PKI in a nutshell


As explained earlier, in a PKI-based system Alice and Bob need to establish a way to exchange and store their public keys. Conversely, in a blockchain-based web of trust model, the storage of public keys is managed on the public ledger. As participants in a global identity network, Alice and Bob create their unique decentralized identifiers (DIDs), attach their public keys and write them to the public ledger. Now any person or organization that discovers these DIDs can access the associated public keys for verification purposes.
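The register-then-resolve flow can be sketched against a ledger modeled as a simple append-only in-memory dict. Real SSI stacks resolve DIDs against an actual distributed ledger; the DID strings and field names below are illustrative.

```python
ledger = {}   # DID -> DID document (stand-in for the public ledger)

def register_did(did, public_key):
    """Write a new DID and its public key to the ledger (append-only)."""
    if did in ledger:
        raise ValueError("DID already registered")
    ledger[did] = {"id": did, "publicKey": public_key}

def resolve_did(did):
    """Anyone who discovers the DID can resolve it to its public key."""
    return ledger[did]["publicKey"]

# Alice and Bob each publish their DID and public key:
register_did("did:example:alice", "alice-public-key")
register_did("did:example:bob", "bob-public-key")

# Any third party can now look up the keys for verification purposes:
assert resolve_did("did:example:alice") == "alice-public-key"
```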


Thursday, 14 June 2018

Environments Where Blockchain Can Thrive


A Blockchain solution can flourish in business scenarios with a high number of participants who all want to track a particular product or item. The more complex the tracking process, the more a Blockchain application can thrive.

For example, if a product traverses a series of steps that starts with its creation and ends with its delivery into the hands of a consumer, then incorporating a Blockchain solution into this process can offer many benefits. It can enhance overall information security, and it can provide substantial time and cost savings for all participants in the product’s life cycle.

Blockchain Benefits


To illustrate this further, a product’s life cycle begins when a manufacturer receives a purchase order for a particular item. Starting with the purchase order, the manufacturer builds the product and hands it over to a shipper. This shipper sends the item to a warehouser, who then ships it to a wholesaler. The wholesaler utilizes another shipper to have it sent to a retailer, and the retailer stocks the item until a consumer purchases it. Giving all participants in these steps a way to view where the product originated, i.e. its provenance, and to trace all of its handling can add many benefits, including:

◈ Transparency within supply chains
◈ Immutable information that can be available to all participants
◈ More efficiency in maintaining records
◈ Organized data for auditors and regulators
◈ Reduced or eliminated administrative record keeping errors
◈ Reduced or eliminated processing paperwork
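The custody trail described above can be sketched as a hash-linked log: each record embeds the hash of the previous record, so tampering with any earlier step is detectable. This is a simplification of what a real Blockchain platform does (no consensus, no distribution), and the participant names are illustrative.

```python
import hashlib, json

def add_record(chain, holder, action):
    """Append a custody record linked to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"holder": holder, "action": action, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({k: body[k] for k in ("holder", "action", "prev")},
                   sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {k: rec[k] for k in ("holder", "action", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

chain = []
for holder, action in [("manufacturer", "built"), ("shipper", "picked up"),
                       ("warehouser", "stored"), ("wholesaler", "received"),
                       ("retailer", "stocked"), ("consumer", "purchased")]:
    add_record(chain, holder, action)

assert verify(chain)
chain[1]["holder"] = "unknown"   # tamper with one record...
assert not verify(chain)         # ...and verification fails
```

Because every participant can recompute the links, the log gives the transparency and immutability benefits listed above without relying on any single party's private records.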

Blockchain for an International Air Services Provider


Recently, an international air services provider, dnata, successfully tested the use of Blockchain technology in its cargo operations. This achievement is a real-life example of the aforementioned scenario.

With the help of IBM and other partners, dnata developed a logistics platform with a Blockchain infrastructure. This platform was put into effect to view supply chain transactions, starting with the purchase order of an item and ending at its delivery to a warehouse. This business use-case exemplifies where a Blockchain can thrive: an environment that has a large number of participants wanting to track products through the supply chain.

Blockchain Solution for Asset Management


Another environment where Blockchain can thrive is when a company transfers assets within a business network. When a company internally transfers a physical asset such as a laptop, or in the case of a trucking company, a semi-trailer, from one location within its business network to another, there can be many people involved and much related paperwork to keep track of its journey. In this case, a Blockchain can establish a clear trail for the asset that has been transferred within this business network. Acting as a shared ledger, the Blockchain allows internal company parties to view where an asset has been moved, who has handled it, its current and past states, how many times it has been transferred and even how many times it has been used – all from the same source, i.e. the shared ledger. And it can be viewed at any time, by anyone with permission to do so.

Also within the asset management process, there can be many issues including having transfer information split among many different record-keeping systems, conflicting information on transactional updates, and long wait times to resolve discrepancies. These can add to costs and subtract from efficiency. A properly executed Blockchain can be the sole source of transfer information, and reduce the number of asset discrepancies as well as the time it takes to resolve them.
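The "sole source of transfer information" idea can be sketched as a single append-only log that every party queries, so questions like "where is the asset now?" and "how many times has it moved?" have exactly one answer. The asset and location names below are illustrative.

```python
transfers = []   # append-only shared log of asset movements

def transfer(asset_id, src, dst):
    """Record one movement of an asset in the shared log."""
    transfers.append({"asset": asset_id, "from": src, "to": dst})

def current_location(asset_id):
    """The latest 'to' entry is the asset's current location."""
    history = [t for t in transfers if t["asset"] == asset_id]
    return history[-1]["to"] if history else None

def transfer_count(asset_id):
    """How many times the asset has been moved."""
    return sum(1 for t in transfers if t["asset"] == asset_id)

transfer("laptop-42", "head-office", "depot-a")
transfer("laptop-42", "depot-a", "branch-3")

assert current_location("laptop-42") == "branch-3"
assert transfer_count("laptop-42") == 2
```

With one log instead of many record-keeping systems, there are no conflicting transactional updates to reconcile, which is where the discrepancy-resolution savings come from.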

Wednesday, 13 June 2018

The Race to a Truly Smart Home

The concept of a smart home has been around for a long time and yet even in 2017, it remains relatively ignored by all but the early adopters.


The number of vendors offering new smart home platforms continues to expand, whether it is new startups, consumer goods giants or the Silicon Valley elite. But as the choice widens, it becomes more difficult to choose and integrate the best equipment.

In any case, smart homes are not something the typical homeowner gives much, if any, thought to. But to paraphrase a rather hackneyed quotation attributed to Henry Ford: if he had asked his customers what they wanted, they would have asked for faster horses rather than a motor car.

The growing popularity of smart speaker devices perhaps gives us the closest vision yet of the future home. However, by themselves they are little more than mobile devices connected to a loudspeaker, and any interaction with lights, thermostats, security devices, etc. requires purchasing and integrating a significant amount of hardware.

The final experience is inevitably going to be rather stilted, not to mention the effort required to install and configure it in the first place. Faster horses perhaps, but horses nonetheless.

What we really need are homes that are built smart, homes where you walk in for the first time and everything just works intuitively.

Wienerberger took a step forward in this direction with the e4 house: an Arup-designed housing concept with a preconfigured supply chain including sensor technology and Building Information Modeling (BIM) in addition to the more traditional building products.

e4 stands for four key principles:

◈ Economy – a house that is affordable yet built to last
◈ Energy – so efficient that energy bills will be just a couple of hundred pounds per year
◈ Environment – a house that minimizes its environmental impact and is responsibly sourced
◈ Emotion – a house that people will want to live in

IBM is helping Wienerberger integrate the digital components of the house and provide the smart technology to complement the e4 principles.

It is just the beginning of the journey, but the smart e4 house can now benefit from a building health monitoring system which will help identify when equipment like the boiler is about to fail, enabling the homeowner to take action when it is most convenient.

Whilst the building is inherently energy efficient due to the materials used, IBM’s artificial intelligence system, Watson, will perform further optimization by analyzing usage patterns and reducing unnecessary energy usage.

Sensors will detect water leaks and ensure that the system can be repaired before too much damage occurs.

Access to the BIM model will enable the homeowner to obtain useful information such as when the warranty on the boiler expires, the location of live wires behind the walls that may be hit with a misplaced drill, or the type of roof tile used during construction so that an exact replacement can be found for one lost during high winds.


It will also enable the development of applications that calculate the amount of paint/tiles/carpet required to decorate a room by accessing the dimensions from the BIM model and perhaps even allow retailers or local tradesmen to make an offer to get the job done.
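The paint-estimate idea could look something like the sketch below. The function name, the default coverage figure and the treatment of openings are all assumptions for illustration; a real application would pull the dimensions from the BIM model rather than take them as arguments.

```python
def paint_litres(length_m, width_m, height_m,
                 openings_m2=0.0, coats=2, coverage_m2_per_litre=12.0):
    """Estimate litres of paint for a rectangular room's walls.

    Wall area = perimeter x height, minus doors/windows (openings_m2).
    Coverage per litre is an assumed typical figure, not a standard.
    """
    wall_area = 2 * (length_m + width_m) * height_m - openings_m2
    return wall_area * coats / coverage_m2_per_litre

# A 4 m x 3 m room with 2.4 m ceilings and 3 m^2 of door/window openings:
litres = paint_litres(4, 3, 2.4, openings_m2=3)
```

An app built on the BIM model would simply feed the stored room dimensions into a calculation like this, then round up to whole tins.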

Perhaps most importantly, a home concierge powered by Watson will enable the homeowner to interact with the house through a natural language interface – voice activated and via a chat interface on a mobile device – whether that is switching on lights, boosting the heating or answering questions like “how can I reduce my electricity usage?” or “when is the recycling next collected?”

A truly smart home needs to solve the real problems that homeowners face in order to be adopted. It needs to become as vital to today’s consumer as electricity or the internet.

House builders now have a unique opportunity to make the running in smart homes. Smart personal devices are starting to plateau in capability but smart vehicles are accelerating rapidly. Even smart workplaces are gathering momentum. Yet the race to a truly smart home has yet to really get started.

The technology is here today, but the winner will need to integrate the technology in a way that transforms the homeowner experience for the better.