Friday, 29 December 2017

The Ultimate Ground Transportation Solution

Here is a distressing fact: on average, a traffic accident occurs every 5.4 seconds in the United States. That’s over 5.7 million per year. Weather-related traffic accidents occur roughly every 26.3 seconds, comprising about 22% of all vehicle crashes. Beyond the direct risk of having one of your company’s drivers involved in a crash, there are indirect costs associated with these events, such as inefficient asset utilization and service delays. Taking weather, traffic, and road conditions into consideration when planning delivery times and routes is therefore a difficult but necessary task.

Ground Transportation Questions and Solution

What can help to avoid traffic delays? What can logistics business leaders, dispatchers, and drivers do to save precious time and lower fuel costs? How can stress levels be lowered for freight, logistics, and delivery team members? What can be done to better meet the expectations of customers who rely on ground shipments, deliveries, and ground transportation services? The answer to all of these questions is found in one solution: Operations Dashboard for Ground Transportation from The Weather Company, an IBM Business.

Operations Dashboard for Ground Transportation

Operations Dashboard for Ground Transportation is a SaaS offering that combines real-time traffic information with continually updated weather and road condition data to increase the overall effectiveness of freight, logistics, and delivery services.

For those directing a company’s efforts in getting people or products from one point to another safely and on-time, clear skies, calm winds, and unobstructed roadways are much appreciated. However, these very same people have to expect and plan for inclement weather and traffic incidents that disrupt their company’s freight, logistics, or delivery responsibilities. This solution seamlessly and intuitively delivers the best available weather, traffic, and road condition data to help formulate those plans.

The Benefits

Operations Dashboard for Ground Transportation has many possible benefits, including:

◈ Improve the safety of your company’s drivers.
◈ Boost on-time performance.
◈ Avoid traffic incidents in all geographies.
◈ Improve fixed asset utilization.
◈ Elevate workforce productivity.
◈ Optimize route selection.

Features include:

◈ Customizable traffic and weather alerts.
◈ Road condition information that includes precipitation, wind, fog, ice and pooling water.
◈ High-resolution live traffic feeds.
◈ Live weather conditions from radar and satellites.
◈ Road-specific weather forecasts.
◈ User-reported traffic conditions.

Furthermore, this solution coalesces the best available weather and road observations onto a smartphone or tablet, delivering dynamic information on mobile platforms that is easy for those out in the field to consult.

Thursday, 28 December 2017

Your documents + AI equal world domination

I’m a movie buff and any movie that deals with machines, technology and artificial intelligence eventually taking over the world has got me hooked. While world domination by machines that can learn is not a matter of immediate concern, it still gets some folks nervous. On the other hand, what gets me excited about it is the potential to use cognitive technology to make documents smarter, workflow more efficient and interactions with clients even more productive.

Think about it – your organization collects, processes and stores a tremendous volume and variety of data. But how much of it are you really putting to work for you? Whether evaluating loan applications, processing insurance claims or managing shipping invoices, human intervention is often needed to review and make sense of that unstructured data. Unfortunately, this type of handling is usually slow, labor-intensive, costly and error-prone.

Cognitive capture solutions – which can better understand, reason and learn – make it easier to extract and automate the use of valuable information from unstructured documents. These solutions quickly turn data into useful information that will help you identify patterns and generate new insights. As part of digital business automation platforms, they play a vital role in transforming how you accelerate business processes, improve efficiency and help reduce costs.

IBM uses advanced cognitive computing capabilities, such as those found in IBM Datacap Insight Edition, to go beyond the limits of traditional capture solutions. IBM’s solution applies advanced imaging, text analytics, natural language processing and machine learning technologies to automatically find, extract and classify important information from complex and variable document types.

The value of Datacap Insight Edition extends to the millions of business documents you might be storing in content repositories. Each document might contain valuable, difficult-to-access information. With Datacap Insight Edition, you can process stored documents to extract information, augment index information, convert old files to newer searchable formats and provide data to analytics engines. The result: putting stored content to work to draw out new insight from old data.
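
As a rough illustration of that unstructured-to-structured idea, here is a minimal Python sketch. It is not the Datacap Insight Edition API; the sample invoice text, field names, and patterns are assumptions chosen purely for illustration.

import re

# Hypothetical example: pull a few structured fields out of raw invoice text.
# Real capture solutions such as Datacap Insight Edition use imaging, NLP and
# machine learning; this sketch only illustrates turning unstructured text
# into structured, searchable data.
INVOICE_TEXT = """
ACME Shipping Co.
Invoice No: INV-2017-00431
Date: 28 Dec 2017
Total Due: $1,284.50
"""

PATTERNS = {
    "invoice_number": r"Invoice No:\s*(\S+)",
    "invoice_date": r"Date:\s*([0-9]{1,2} \w+ [0-9]{4})",
    "total_due": r"Total Due:\s*\$([\d,]+\.\d{2})",
}

def extract_fields(text: str) -> dict:
    """Return whichever fields could be found; missing fields stay None."""
    fields = {}
    for name, pattern in PATTERNS.items():
        match = re.search(pattern, text)
        fields[name] = match.group(1) if match else None
    return fields

if __name__ == "__main__":
    print(extract_fields(INVOICE_TEXT))
    # {'invoice_number': 'INV-2017-00431', 'invoice_date': '28 Dec 2017', 'total_due': '1,284.50'}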

Now back to movies about machines taking over the world.

Data-in-use Protection on IBM Cloud – IBM, Intel, and Fortanix partner to keep enterprises secure to the core

Cloud computing has made collecting, storing, and processing data easier and more affordable than ever, but many risk-conscious organizations struggle with how to control, secure, and protect data that is processed on a public cloud platform. The data protection needs of organizations are driven by concerns about protecting intellectual property, meeting compliance requirements, or navigating the ambiguity of legal protections for data in the cloud. These organizations see the need to independently retain ownership and control of their data.

Security best practices traditionally call for encrypting data-at-rest and data-in-motion, but the advent of cloud computing has created the need for data-in-use encryption as well. The stakes keep rising: the Identity Theft Resource Center (ITRC) anticipates that the number of breaches could reach 1,500 by the end of 2017, a 37 percent annual increase over 2016, when breaches reached a record high of 1,093.

The Cloud Security Alliance (CSA) recommends that “controls should be applied throughout the entire lifecycle (in transit, at rest and in use) to allow the customer to maintain control over the data while the [cloud service provider] hosts and processes it.” The challenge, then, is how to protect data while it is in use.

Intel® Software Guard Extensions (Intel® SGX) is the only technology that can protect data in use through hardware-based server security. Intel SGX gives application developers the ability to protect select code and data from disclosure or modification. Intel SGX makes such protections possible using enclaves: trusted execution environments (TEEs) that use a separate portion of memory that is encrypted for TEE use.

Data-in-use Protection using IBM Cloud Data Guard

Today, Intel SGX application developers need to structure their applications into trusted and untrusted parts, where the trusted parts are executed inside the enclave. Project “IBM Cloud Data Guard”, powered by the Fortanix Runtime Encryption Platform, offers easy-to-use, powerful services that accelerate application protection with Intel SGX enclaves. The Fortanix platform transparently protects applications by creating a portable security envelope to run applications in completely protected states. We extend the reach and benefits of Intel SGX to application developers working in an agile environment by integrating with their CI/CD systems.

Software development teams can leverage IBM Cloud Data Guard to convert their applications or containers into protected applications or containers capable of running in Intel SGX enclaves.

Integration of IBM Cloud Data Guard with Development Pipelines

Today, we are announcing the IBM Cloud Data Guard Preview, which supports the following scenarios so you can start building your protected applications:

1. IBM Cloud SGX-capable bare metal servers: You can provision SGX-capable bare metal servers on IBM Cloud today (model: Intel Xeon E3-1270 v6). You can start building your applications using the Intel SDKs for C/C++ or the Fortanix Rust SDK.
2. Curated Applications: You can pull curated protected applications, built using IBM Cloud Data Guard from our Docker private registry. We initially intend to host MySQL, Nginx, Forgerock OpenDJ, OpenStack Barbican, and software key managers.
3. IBM Cloud Data Guard Preview Toolkit: An early-access toolkit that converts your application container images into protected container images that run your applications inside Intel SGX enclaves.

Monday, 18 December 2017

Encrypted Workers in the IBM Cloud Container Service

The IBM Cloud Container Service combines Docker and Kubernetes to deliver powerful tools, an intuitive user experience, and built-in security and isolation. You can rapidly deliver your apps while leveraging IBM Cloud services such as artificial intelligence with Watson.

Today, we are proud to announce that we are turning on encryption of worker nodes by default. Many internal teams and external customers asked us for encrypted data volumes on worker nodes, and we listened to you!

What this means for you

As of today, this change makes your data in new clusters and workers that you create even more secure by default.

IBM Cloud Container Service provides encrypted data partitions for all worker nodes by provisioning them with two local SSD partitions. The first, the boot partition, is not encrypted; the second partition, mounted at /var/lib/docker, is unlocked at boot time by using LUKS encryption keys. Each worker in each Kubernetes cluster has its own unique LUKS encryption key, managed by the IBM Cloud Container Service. At boot time, the keys are pulled securely and then discarded after the encrypted disk is unlocked.
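
As a rough sketch of that boot-time flow (illustrative only, not the actual IBM Cloud Container Service internals; the key-service URL, device path, and mapper name are assumptions), a worker could fetch its unique key, feed it to cryptsetup on stdin so it never touches disk, and let it be discarded afterwards:

import subprocess
import urllib.request

# Hypothetical illustration of the boot-time unlock flow described above.
# KEY_SERVICE_URL and the device/mapper names are assumptions, not the
# actual IBM Cloud Container Service internals.
KEY_SERVICE_URL = "https://keys.example.internal/worker/this-worker-id"
DATA_PARTITION = "/dev/sdb2"    # second local SSD partition
MAPPER_NAME = "docker_data"     # ends up mounted at /var/lib/docker

def unlock_data_partition() -> None:
    # Pull the per-worker LUKS key over TLS; keep it only in memory.
    key = urllib.request.urlopen(KEY_SERVICE_URL).read()
    # Feed the key to cryptsetup on stdin so it is never written to disk,
    # then let it go out of scope ("discarded") once the disk is open.
    subprocess.run(
        ["cryptsetup", "--key-file", "-", "luksOpen", DATA_PARTITION, MAPPER_NAME],
        input=key,
        check=True,
    )

if __name__ == "__main__":
    unlock_data_partition()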

You might find that some workloads with high-performance disk I/O requirements are impacted when encrypted. In some of our encrypted performance tests, we saw single-digit percentage disk I/O impact, but in most there was no impact. If you have performance-sensitive workloads, you might want to run benchmark tests with encryption enabled and disabled to help you decide whether to turn off encryption.

How to get started

From the IBM Cloud console GUI, encryption is already turned on for you. If you want to turn off encryption, clear the Encrypt local disk check box (see below) when you create a cluster or add a worker to an existing cluster.

From the CLI, to take advantage of default encryption, first update your plug-in with the following command:

bx plugin update container-service -r Bluemix

Now, encryption is turned on by default when you create a cluster or add a worker to an existing cluster! If you want to disable encryption, specify the --disable-disk-encrypt option when using the cluster-create or the worker-add commands.
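
For example, a hypothetical cluster-create invocation with encryption disabled might look like the following (other required options, such as location, machine type, and worker count, are omitted here and depend on your account):

bx cs cluster-create --name my-cluster --disable-disk-encrypt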

Friday, 15 December 2017

Convergence of Blockchain and Cybersecurity

Blockchain technology offers profound opportunities in a myriad of domains, across self-sovereign digital identities, financial services, enterprise asset protection, supply chain risk management, and healthcare IT transformation. Because the technology also offers exciting prospects for cybersecurity, we explore some applications that help re-imagine conventional approaches to managing cybersecurity challenges.

Digital Identity

Imagine if you were able to provide service providers – with a significant degree of confidence – an identity enriched with physical, behavioral, or temporal attributes that you control. IBM is integrating secure, self-sovereign digital identities and profiles alongside behavioral attributes, providing proof in the form of verifiable claims for the delivery of digital services. The goal is to provide individuals in a permissioned blockchain ecosystem with a decentralized, secure, self-sovereign, trusted identity associated with industry-mandated assurance levels.

Following the recent spate of breaches at Equifax and other institutions historically designated as arbiters of citizen identities, the need for a different approach – a departure from centralized architectures – is urgent. Applying distributed ledger technologies to this domain is promising, with the potential to achieve non-repudiation of transactions with high degrees of confidence, earn digital reputation, increase security assurance levels, and meet requirements for regulatory compliance. In effect, adopting blockchains allows the relevant parties to validate who is offering data, who certified its accuracy and authenticity, and who is receiving it.
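
As a toy illustration of that validation idea (a minimal sketch using generic Ed25519 signatures from the Python cryptography library, not any specific IBM identity product), an issuer signs a claim and any relying party can verify who certified it:

import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Minimal sketch of a verifiable claim: an issuer signs an attribute about a
# subject, and any relying party checks the signature against the issuer's
# public key. Illustrative only, not a production identity scheme.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

claim = json.dumps({
    "subject": "did:example:alice",   # hypothetical decentralized identifier
    "attribute": "age_over_21",
    "value": True,
}, sort_keys=True).encode()
signature = issuer_key.sign(claim)

# The verifier establishes who certified the claim without a central registry.
try:
    issuer_public.verify(signature, claim)
    print("claim verified against issuer's public key")
except InvalidSignature:
    print("claim rejected")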

Financial Services

Financial services are arguably the blockchain use case that is farthest along, specifically around facilitating decentralized transactions across financial institutions and a decades-old, vast network of intermediaries. Digital currencies – specifically Bitcoin, which has by far the largest market capitalization of any cryptocurrency – are one application that runs on blockchain technology. The technology inherently offers cryptographic verification and validation of transactions across multiple parties. It effectively disrupts and eliminates core legacy business processes implemented to facilitate “know your customer” and anti-money-laundering checks, regulatory audit and compliance functions, cross-border payments, clearing and settlement, trade finance processes, back-office operations, and more.

Blockchain technology offers new, innovative opportunities, spurs transformation, and yields cost savings around fraud detection and around establishing and managing human and machine identities, with the potential of offering financial products to the 1.2 billion people otherwise excluded from global commerce.

At the same time, cross-border infrastructures such as the SWIFT network remain susceptible to attack and, as recent vulnerability exposures have shown, the resulting systemic, economic, and national security risks could be catastrophic.

Enterprise Asset Protection – IoT Devices, Critical Infrastructure

With the proliferation of technology devices, from the Internet of Things (IoT) to enterprise technology infrastructure, endpoints, and assets, the challenge of effectively managing these devices is significant. These devices extend the traditional IT boundary and attack surface with objects that carry different sets of risks and complexities. To enable and improve situational awareness of IoT devices and critical infrastructure, adoption of innovative blockchain capabilities that augment traditional solutions is necessary. This approach augments traditional security monitoring and mitigation capabilities and provides distributed, continuous monitoring of IoT devices, endpoints, and assets enriched with immutable, tamperproof, cryptographically signed transaction data.

IoT device sensors and critical infrastructure endpoints and assets feed blockchain capabilities, enabling devices to participate in secure monitoring of transactions. Devices will be able to communicate with enterprise-defined, blockchain-based ledgers that autonomously collect, manage, and analyze, through SmartContracts, the security hygiene of endpoints – for example, device information, software versions, most recent vulnerability scan results, and firmware versions. This approach offers a counterpoint to traditional centralized security operations capabilities saddled with the task of detecting known and unknown advanced persistent threats, distributed denial-of-service attacks, man-in-the-middle attacks, and the web application attacks catalogued by the Open Web Application Security Project (OWASP) emerging from the dark web, while providing actionable security intelligence at scale across distributed peers.

Supply Chain Risk Management

In today’s increasingly globalized market, it can be incredibly difficult to prove the authenticity, chain of custody, and provenance of anything from microelectronics in semiconductors embedded in mission-critical military components, medical devices, and enterprise infrastructure (hardware, software, firmware) to IoT devices. Vulnerabilities and exposure to risks are prevalent across the supply chains of all domains. Confidentiality and availability of transactions aside, there are significant uncertainties around maintaining the integrity of data traversing through supply chain lifecycles from procurement to delivery.

One principle that underpins blockchain technology is the ability to achieve consensus among different parties to validate the accuracy of updates to the ledger. Updates, or “transactions,” are confirmed using cryptographic protocols. Agreements between these nodes can be codified into SmartContracts that enforce business logic and can be used to eliminate data integrity issues often seen in supply chains.
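
To make that integrity point concrete, here is a minimal, platform-agnostic Python sketch (an illustrative assumption, not Hyperledger or any particular ledger implementation) of hash-chained custody records with a toy business rule enforced before each update:

import hashlib
import json
import time

# Illustrative hash-chained ledger of custody records with a toy "contract"
# rule; real blockchain platforms add consensus across peers, signatures, etc.

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def approved_transfer(entry: dict) -> bool:
    # Toy business rule codified in code: parts may only move between
    # parties registered in the network (names are hypothetical).
    registered = {"manufacturer", "distributor", "hospital"}
    return entry["from"] in registered and entry["to"] in registered

def append_entry(chain: list, entry: dict) -> None:
    if not approved_transfer(entry):
        raise ValueError("transfer violates the agreed business rule")
    entry["prev_hash"] = chain[-1]["hash"] if chain else None
    entry["timestamp"] = time.time()
    entry["hash"] = entry_hash({k: v for k, v in entry.items() if k != "hash"})
    chain.append(entry)

def verify_chain(chain: list) -> bool:
    # Any tampering with an earlier record breaks every later prev_hash link.
    for i, entry in enumerate(chain):
        expected = entry_hash({k: v for k, v in entry.items() if k != "hash"})
        if entry["hash"] != expected:
            return False
        if i > 0 and entry["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
append_entry(ledger, {"item": "implant-serial-001", "from": "manufacturer", "to": "distributor"})
append_entry(ledger, {"item": "implant-serial-001", "from": "distributor", "to": "hospital"})
print(verify_chain(ledger))  # True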

Standards such as NIST SP 800-161 prescribe and define best practices for supply chain risk management. In addition to the security controls defined in NIST SP 800-53, the “provenance” control family was created to address the many challenges of establishing a secure supply chain, from original manufacture of a component to customer acceptance. Deployed with trust frameworks and traditional risk management standards, and augmented with protocols such as the InterPlanetary File System (IPFS), distributed ledger technology can address many of the concerns outlined in SP 800-161 for supply chain protection.

Healthcare IT Transformation

Healthcare is another domain poised for transformation with blockchain technology. Innovative institutions in the healthcare industry are actively building disruptive blockchain-based applications with a myriad of objectives: realizing significant cost savings, rolling out modern solutions that curb fraud, waste, and abuse, putting patients at the center of their medical data, and achieving patient-centered healthcare outcomes. Standardization of the data security and privacy processes that protect sensitive medical records, and ultimately the privacy of patients and citizens, is a critical design consideration for the long-term viability and mainstream adoption of these healthcare-based blockchain applications.

To that end, constructs of pseudonymity (decoupling data from individual identity) and data minimization techniques (such as zero-knowledge proofs, which offer mechanisms to protect patient privacy and identities while empowering patients with control of their medical data) are vital. Design considerations around preserving personal health information and personally identifiable information consistent with HHS HITECH regulations and guidelines are critical to successful implementation for this use case.

Wednesday, 13 December 2017

Is blockchain secure?

For both public- and private-sector initiatives, blockchain technology must contend with technological, governance and regulatory challenges. For all its transformational business value – immutability, distributed transactions, cryptography, provenance, and so on – pertinent issues around governance, advocacy, SmartContract development, and securing blockchain applications at the edges still must be resolved for blockchain to garner mainstream adoption.

Part I of this two-part abstract describes IBM Global Business Services’ (GBS) cybersecurity perspective on aligning blockchain cybersecurity principles with an enterprise, risk-based approach to permissioned blockchain application development.

Part I: Blockchain Security Assurance

Governance Risk and Compliance – SmartContract Management

In an age of first-mover advantage, first-to-market institutions across all vertical industries ignore game-changing technologies at their peril. Conventional wisdom suggests that those who move quickly to embrace disruptive technologies benefit the most from them. Blockchains and distributed ledger technologies represent a paradigm shift, offering a single version of the truth across complex, disparate ecosystems and processes, thereby achieving shared business value, reducing cost, lowering risk and enabling new business models.

The technology dramatically expands access for new entrants into the global marketplace. Securing transactions is critical to adoption of the blockchain protocol. However, traditional approaches to managing blockchain-based application risks and maintaining security situational awareness remain largely unsolved. While there have been several exploits of blockchain applications to date, it is worth noting that they did not target the blockchain technology itself, but rather SmartContracts (business logic defined in code, intended to facilitate, verify, or enforce contract negotiation) and applications at the edges of blockchain networks.

Blockchain Cybersecurity Assurance

As organizations evolve and development and deployment of blockchain technology proliferate, the complexity of applications, interfaces, and SmartContracts invariably increases, and with it the risk to blockchain applications. There is therefore a need for comprehensive risk management and cybersecurity assurance programs for blockchain applications, pairing skilled cybersecurity professionals with strategy, governance, regulatory, and compliance processes.

Blockchain application developers, together with development operations (DevOps) teams, must consider whether they have the right tools for security and privacy compliance. The industry, as a whole, must examine the security landscape to identify security risks, develop threat modeling tools, establish roadmaps to harden the security posture, and deploy technologies to mitigate risks.

Figure 1 below depicts a blockchain cybersecurity assurance model that addresses blockchain risks based on a domain-specific, risk-based defense methodology and cybersecurity implementation best practices:

Figure 1 – Overview of blockchain security assurance services
  • SmartContract governance and risk assessment – Defining and aligning the security program to blockchain application and ecosystem DevOps by cybersecurity methodologies and NIST’s risk management frameworks;
  • Data security and privacy assessment – Analyzing blockchain application data sets, thus informing legal, policy and regulatory issues, on- and off-chain design considerations, liability and enforceability issues;
  • Key management – Implementing public key infrastructure and the associated key lifecycle management services, including certificate generation, revocation, and destruction;
  • Blockchain application threat modeling and secure coding assessments – Analyzing blockchain network participant ecosystem design, securing micro-services: Service-to-service security; application programming interfaces; access controls; and business associate agreements;
  • Certification and accreditation and authority to operate blockchain business network – Understanding and applying risk-based procedures for evaluating, describing, documenting, testing, and authorizing blockchain applications and business networks;
  • Blockchain cybersecurity intelligence and operations – Continuously monitoring, detecting, analyzing, diagnosing, and mitigating threats to gain insights into the blockchain threat exposure and prevent incidents; and
  • Incident response – Developing an incident response orchestration plan and effectively activating people, processes, and technologies to respond to and recover from security breaches impacting the confidentiality, integrity, or availability of enterprise blockchain applications.

Monday, 11 December 2017

Achieving Holistic Cybersecurity

No longer can security programs rely on “if it’s not broken, don’t fix it” — adversaries could already be inside systems, stealing data or probing to get in. Too many CIOs and CISOs have thought their systems and data were secure when in fact the opposite was true. Security programs need effective protection of valuable information and systems to prevent data breaches and to comply with ever-increasing federal compliance requirements, such as the Federal Information Security Management Act (FISMA), the Privacy Act, policy and guidance from the Office of Management and Budget (OMB) and the National Institute of Standards and Technology (NIST), the General Services Administration’s FedRAMP program, and the Federal Acquisition Regulation (FAR). To be effective, CIOs and CISOs need timely cybersecurity insights so they can take proactive action.

Security challenges are greater than ever

With massive increases in data, mobile devices, and connections, security challenges are increasing in number and scope. The aftermath of a security breach can be devastating to an organization in terms of both reputational and monetary damages. These challenges fall into three major categories: external threats, internal threats, and compliance requirements.

External threats

The Nation faces a proliferation of external attacks against major companies and government organizations. In the past, these threats largely came from individuals working independently. However, these attacks have become increasingly coordinated, launched by groups ranging from criminal enterprises to organized collections of hackers to state-sponsored entities; attackers’ motivations can include profit, prestige, or espionage.

These attacks target ever-more critical organizational assets, including customer databases, intellectual property, and even physical assets that are driven by information systems. They have significant consequences, resulting in IT, legal and regulatory costs, not to mention loss of reputation. Many of these attacks take place slowly over time, masked as normal activity. The vector known as Advanced Persistent Threat (APT) requires specialized continuous monitoring methods to detect threats and vulnerabilities prior to breaches or loss of sensitive data.

Internal threats

In many situations, breaches in information security are not perpetrated by external parties, but by insiders. Insiders today can be employees, contractors, consultants, and even partners and service providers. These breaches range from careless behavior and administrative mistakes (such as giving away passwords, losing back-up tapes or laptops, or inadvertently releasing sensitive information) to deliberate actions taken by disgruntled employees. These actions can lead to harm as dangerous as, or more dangerous than, external attacks.

A strong security program must include capabilities to predict external and internal threats and assess their mission impacts, validated by cognitive technology and cybersecurity experts serving mission operators.

Compliance Requirements and Effective Protection

Public sector enterprises face a steadily increasing number of federal, industry, and local mandates related to security, each with its own standards and reporting requirements. These mandates include FISMA, the Privacy Act, NIST standards and special publications, OMB mandates, FAR and Defense FAR clauses, and FedRAMP; they can also include sector-specific requirements such as HIPAA/HITECH for health information and Sarbanes-Oxley for financial information, as well as general mandates such as state privacy and data-breach laws, COBIT®, and various international standards and privacy directives. Complying with these and similar requirements often takes a significant amount of time and effort to prioritize issues, develop appropriate policies and controls, and monitor compliance.

To address external, internal, and compliance challenges through a proactive approach, a mission-oriented cognitive cybersecurity capability is needed. To achieve this capability, four key areas must be addressed:

◉ Security architecture effectiveness
◉ Critical data protection
◉ Security compliance
◉ Holistic security program

Security Architecture Effectiveness – focuses on rapidly assessing vulnerabilities in the security architecture and developing a prioritized road map to strengthen cyber protection, plug security gaps, and meet policy expectations. Ensuring the identity of users and their access rights, and reducing the number of privileged users, is critically important to an effective security architecture.

Critical Data Protection – focuses on rapidly assessing the data architecture and the shortfalls in tracking and protecting critical data. Prioritized action plans can reshape the data architecture for more focused security protection and improved continuous monitoring.

Security Compliance – focuses on rapidly assessing compliance gaps and establishing a roadmap to prioritize issues, develop appropriate policies and controls, and achieve compliance.

Effectively implementing the first three areas above lays the foundation of a Holistic Security Program that addresses risk management and IT governance at the enterprise level:

■ Risk management identifies the critical business processes that are most important to an Agency’s mission success, as well as the threats and vulnerabilities that can impact those processes.

■ Information Technology (IT) Governance is a key enabler of successful cybersecurity protection – it provides the “tone at the top,” emphasizing that ensuring security and privacy is the responsibility of all staff. In addition, consistent and standardized security and privacy processes and technology configurations support protection at a lower cost.

Making a Holistic Program Actionable

A holistic security program focuses on protection through continuous monitoring of systems and data. This involves moving from a more common defensive-reactive approach to a defensive-proactive (predictive) approach, using cyber analytics to foster “Security Intelligence” that also protects privacy.

Continuous monitoring is now required by OMB and NIST mandates – and it can be supplemented with cyber analytics to proactively highlight risks and to identify, monitor, and address threats. As enterprises bolster their security defenses, predictive analytics plays an increasingly important role. Enterprises can conduct sophisticated correlations to detect advanced persistent threats, while implementing IT governance and automated enterprise risk processes – critical building blocks for enabling security intelligence. This includes the ability to:

◈ identify previous breach patterns and outside threats to predict potential areas of attack,
◈ analyze insider behavior to identify patterns of potential misuse, and
◈ monitor the external environment for potential security threats.

Continuous monitoring combined with cyber analytics via security intelligence can provide key cybersecurity capabilities. Continuous monitoring, along with analysis of cyberthreat related data sources (e.g., through DNS, Netflow, or query results), provides the needed context for fusion of data — data that can be analyzed using tools that produce actionable, meaningful and timely information for CISOs and CIOs to address the most important issues affecting their Agency, to deter and prevent cyber threats.
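
As a simplified illustration of that kind of correlation (a Python sketch with made-up hosts, counts, and thresholds, not an IBM product workflow), one might flag hosts whose DNS query volume suddenly departs from their own baseline:

from statistics import mean, stdev

# Toy anomaly check over per-host DNS query counts; real security-intelligence
# platforms fuse many more sources (Netflow, logs, threat feeds) and use far
# richer models. The data and threshold here are illustrative assumptions.
baseline = {  # hourly DNS query counts observed per host over the past day
    "10.0.0.5": [120, 98, 130, 110, 105, 140],
    "10.0.0.9": [40, 35, 42, 38, 45, 37],
}
latest_hour = {"10.0.0.5": 125, "10.0.0.9": 410}  # 10.0.0.9 spikes

def anomalous_hosts(baseline: dict, latest: dict, z_threshold: float = 3.0) -> list:
    flagged = []
    for host, history in baseline.items():
        mu, sigma = mean(history), stdev(history)
        # Flag hosts whose latest count is far outside their own baseline.
        if sigma and abs(latest[host] - mu) / sigma > z_threshold:
            flagged.append(host)
    return flagged

print(anomalous_hosts(baseline, latest_hour))  # ['10.0.0.9']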

Using cyber analytics to proactively highlight risks, and identify, monitor, and address threats and vulnerabilities, helps to achieve predictive and preventive cybersecurity capabilities. However, cyber analytics can be greatly enhanced, using cognitive-based systems to build knowledge and learn, understand natural language, and reason and interact more naturally with human beings. Cognitive-based systems can also put content into context with confidence-weighted responses and supporting evidence, and can quickly identify new patterns and insights.

Specifically, cognitive solutions have these three critical capabilities that are needed to achieve security intelligence:

1. Engagement: These systems provide expert assistance by developing deep domain insights and presenting the information in a timely, natural and usable way.
2. Decision: These systems have decision-making capabilities. Decisions made by cognitive systems are evidence-based and continually evolve based on new information, outcomes and actions.
3. Discovery: These systems can discover insights that could not be discovered otherwise. Discovery involves finding insights and connections and understanding the vast amounts of information available.

Thus, Agency senior executives involved in cybersecurity can move from a basic to an optimized level of security intelligence as depicted below.

Achieving cybersecurity protection preserves mission success while meeting an Agency’s key objectives for its security program. Government can move from a basic (manual and reactive) to an optimized (automated and proactive) posture to secure critical systems and valuable information through Security Intelligence.

Friday, 8 December 2017

10 Lessons: Design Thinking for Blockchain

Over the past fifteen months I’ve facilitated eighteen design thinking workshops to explore the application of blockchain to customer projects. Most of these have been in the finance industry (12 of the 18) and in Europe (15 of the 18). Some interesting lessons and reflections emerge.

First a little context . . .

We use a standard approach for customer projects, as shown in Figure 1.

Figure 1 – Project Approach

After the vitally important first two “awareness” steps, we start the First Project with a two-day Design Thinking Workshop to select the use case, then analyse it from a user perspective before deciding how blockchain can best be used to transform the user experience.  This gives clear direction to the agile development sprints that follow.

. . . now the lessons

[1] It’s a BUSINESS workshop

Sometimes participants think the workshop is a technical deep dive and a chance to write some chaincode!  This is usually because they have skipped the “Blockchain hands-on” module.  We mitigate this risk by being ultra-clear in the workshop invite, scoping documents and participant selection.  During the workshop we use a “parking lot” to record technical questions for answering later so as not to detract from the business discussion.

[2] Use Case Selection is CRITICAL

Sometimes our customers approach us with a business area where they think blockchain can make a difference, other times with a detailed use case.  So we sometimes must select the use case, and we always unpack it to understand the details, ensuring all participants have the same level of understanding.  This thorough exploration is critical to the success of the workshop.

[3] Hypothesise & test WORKS

Design Thinking thoroughly explores the problem before thinking about a solution.  When I started leading blockchain Design Thinking workshops I was worried that the (pre)selection of blockchain would invalidate this important principle.  This has not been a problem in practice.  I ask participants to hypothesise that blockchain will be a possible solution and continuously test this throughout the workshop.  If we come up with parts of the problem where blockchain is not suited, we call this out and move on.

[4] Business knowledge from customer KEY to success

We need the customer to explain their business challenge and how the current systems work.  We can then explore together how blockchain can be used to improve things.  Without this in-depth business input the workshop loses its “anchor”.  We seek full-time business expert participation during the workshop; this need reduces greatly – but does not disappear – during the agile development process that follows.

[5] DEV team presence gives FAST START

If the scrum master, UX designer and lead developer can be part of the workshop, the project startup is much more efficient.  Their involvement ensures (a) an early infusion of use case knowledge into the project team and (b) that the workshop output is of sufficient quality & depth to get them off to a great start.

[6] Persona choice is IMPORTANT

Once we are clear on the use case and the business network involved, we select specific personas from the business network.  We then analyse the business problem and improvement opportunity from their viewpoint.  We pick two or three personas, depending on the number of workshop participants, BUT only after considering all possible personas in the business network.  Key here is that personas are prioritised for analysis and never lost.  We can always loop back and consider more personas later as needed.

[7] Consolidation & prioritisation NORMAL

The theme introduced above of prioritisation and/or consolidation continues through the workshop, as we need to give clear direction to the DEV team at the end.  But as stated above, nothing is lost: all work products are captured (as photos or videos) and shared with participants at the end of the workshop.

[8] From organisations to NETWORKS

Of the eighteen workshops I’ve facilitated, seven involved multiple members of the business network from the start.  Most of these “consortium” workshops have been in 2017.  Earlier workshops were instigated by one organisation, with other business network members invited to join after a couple of agile iterations, once the value proposition became clear and more tangible.  Both approaches are valid and can work, but cross-organisational workshops need care . . .

[9] Networks need LONGER

If multiple members of the business network are involved from the start, we need to build time into the workshop for them to talk, understand each other and exchange knowledge.  Playbacks will take longer as a result, and it’s a delicate balance between timekeeping and allowing time for quality discussion.  We must also discuss what information they are prepared to share, and what they must keep private.  This will become critical as we move into the agile development phase!

[10] Design Thinking is IDEAL

Yes, it really is!  In one recent workshop, the consortium brought along a requirement and had mapped out user journeys.  I was reluctant to go back to the start, analyse the business network and do persona based analysis.  But I’m pleased I did.  Design Thinking added clarity which helped the DEV team deliver early value and the consortium members explain the benefit of their solution to stakeholders and potential new business network members.

Wednesday, 6 December 2017

AI and Machine Learning: The Key to Managing Data in the Supply Chain

I think we all know how frustrating it is when traffic lights aren’t synced together during our morning commute. It leads to constant jams and delays—not to mention a lot of frustration. The same thing happens in the supply chain when we aren’t properly synced with our suppliers, production teams, and customers; however, in this case, it also leads to lost production time—and lost profit. Luckily, artificial intelligence (AI) and machine learning (ML) are offering smarter ways to connect our supply chains via big data. And the best part is that you don’t need to understand big data to take advantage of it.

The Great Supply Chain of Data

The truth is that we have reached data overload. Whether or not you already have machine learning in place to process your data, it’s likely you’ve been collecting mounds of it for at least a few years now. If you needed to, you could pull data about everything from peak temperatures in your key markets to the average efficiency of a process on your production floor. It is no longer an issue of having enough data. It’s an issue of having so much that no human could possibly make meaningful sense of it. That is where ML and AI come in.

Solutions like IBM Metro Pulse are bringing supply managers even greater control of their connected chains by using cognitive learning to find insights that are generally “locked away” in the data pool. This tool doesn’t just read the data for keywords or trends. It can interpret it—everything from traffic and weather to local holidays and news—and put it into meaningful contexts. In effect, it can merge hyperlocal data with your company’s data to give you the potential to be even more efficient, productive, and profitable.

Creating Smarter, More-Connected Supply Chains

So how does local data make a difference to your global supply chain? Lots of ways. Take a look:
  • By understanding the weather in a certain region, you can better target the supply—and eventually the marketing—of your company’s goods there, be they warm drinks or umbrellas.
  • By knowing the special events happening in local communities, such as marathons or parades, you can better gauge how they may affect the demand for your products—perhaps, negatively, by cutting off customer access to your local retail partners, or by creating an even greater opportunity to sell your goods to a wider audience.
  • By staying on top of traffic patterns, you can better plan deliveries of goods and supplies so that everyone always has the items they need when they need them—and I’m talking not just in the United States, but around the world.
  • By being aware of a local victory—or tragedy—you can be even more helpful and emotionally aware of a community’s experience. For instance, if a flood makes product delivery to a certain community challenging, you can let customers in that region know you’re aware of the problem, and perhaps offer a certain discount or free item to help them through their difficulties. This type of partnership between supply chain and marketing will make customers even more loyal over time.
The most exciting part: with machine learning, all these insights can be gleaned instantly, for every market you serve. With IBM Metro Pulse, you can become a local expert on every community you serve, better anticipating its needs and wants.

Machine learning and AI have already proven their value in the marketing realm, but I think it’s time for them to break the industrial and supply chain sectors wide open. When they are applied to the vast amounts of information that can now be gathered via the Internet of Things, there is truly no limit to the insights and knowledge you can gain about your customers and their communities. In my view, machine learning will become an absolute staple in the supply chain sector. It will become one of those factors that determines which companies succeed—and which companies die out—in this time of digital transformation.

Monday, 4 December 2017

The Voice of Digital: Shake or get shaken

The pace of change in the industry is accelerating. Markets have evolved from a state of organizational centricity, in which manufacturers and service providers largely defined what to produce and market to customers, to one of individual centricity, in which empowered consumers demand insight-driven, customized experiences. And they are continuing to evolve into new forms in which customers, clients and colleagues are becoming active participants rather than passive recipients.

This environment is what we call the everyone-to-everyone (E2E) economy. The E2E economy has four distinct elements: It is orchestrated, based on business ecosystems, which are both collaborative and seamless. It is contextual, in that customer and partner experiences are calibrated and relevant to their specific actions and needs. It is symbiotic, in that everyone and everything, including customers and businesses, are mutually interdependent. And it is cognitive, characterized by data-enabled self-supported learning and predictive capabilities.

Uber and Airbnb are great examples of the E2E economy. Their entire ecosystems comprise partners who come together to offer a seamless experience to individuals like you and me. They are orchestrated, based on business rules; contextual, because customers get only the information they need to know; cognitive, because recommendations and pricing are based on deep algorithms that learn and become sharper with time; and symbiotic, because it is true power to all: unless all players come together, the value will not be delivered to consumers like us.

While the E2E economy directly impacts end-consumer industries like retail, financial services and telecom, it is also significantly impacting B2B businesses like manufacturing due to the advent of technologies such as the Internet of Things (IoT), blockchain and 3D printing.

It has often been said that much of the administrative work done by paralegals, paramedics, bankers and others will be done better using IoT and blockchain.

It is our belief that no one will remain untouched. Early adopters will benefit the most, as it is anticipated that the value for me-too adopters will diminish significantly. This can be seen in a survey the IBM Institute for Business Value conducted in 2016: executives from top-performing firms report a higher revenue impact from proactively engaging with ecosystems than the average executive does.

There is no option but to have a nuanced digital strategy, one that develops new ways of realizing and monetizing value. Initiatives might include spawning new business models, tapping new forms of financing and developing better, more holistic ways of conducting risk assessments.

Shaking in the digital E2E Economy

To set out on the path toward digital, leaders can take four initial steps: envision possibilities, create pilots, deepen capabilities and orchestrate ecosystems.

Step 1: Envision possibilities: Conduct envisioning sessions based on design thinking to produce a definitive reinvention blueprint. For example, through deep conversations and in-depth marketing analysis, develop a better understanding of customer needs, aspirations and desires; brainstorm new ideas to enhance engagement; and visualize unexpected customer scenarios. Incorporate external stakeholders in these sessions, including customers, to encourage thinking that goes beyond business-as-usual.

Step 2: Create pilots: Develop prototypes using agile development, test them with customers and get them to market quickly to promote feedback and iteration. Establish communities of interest to create safe environments to beta test innovations, and incorporate them as a central part of design and development processes.

Step 3: Deepen capabilities: Augment digital capabilities with strategic initiatives, and continue to build and deploy necessary applications aligned to the target digital reinvention operating model and ecosystem strategy. As pilots evolve, impediments around development will emerge, highlighting limitations in existing capabilities. Adopt a continuous, iterative strategy to address these limitations by building new or extending existing capabilities.

Step 4: Orchestrate ecosystems: Embrace a strategy based on holistic reinvention rather than a series of point solutions, maintaining a clear focus on deep needs, aspirations, or desires of customers, clients (such as partners) and colleagues (such as service providers). Focus on ecosystems to expand and align a broader set of capabilities and to help create and deliver on customer promises.

In summary, while there is plenty being said about the impending disruptions caused by digital, the good news is that digital also offers that power to you. This is the right time to be a creator of value; all it needs is focus, speed, and partners who can scale along with you.

Friday, 1 December 2017

How Banks Can Remain Innovative on a Continuous Basis in the Digital / AI Era

With technological advancements in digital and artificial intelligence, in the coming few years human beings will get back to their intrinsic nature: the ability to apply human wisdom and creativity. Most tasks based on mathematics and memory will be delegated to machines.

In banks, more than 50% of the operations related to regulatory compliance, loan processing, and fraud detection are top candidates to be delegated to artificial-intelligence-powered machines. These AI-powered machines will be capable of providing contextual and personalised interactions when engaging with the customer – and at huge scale! They will always be aware of the customer’s spending and investment patterns. They will predict the customer’s financial needs – knowing very well, for example, that the daughter will go to university next year.

Soon, Amazon Alexa or IBM Watson powered voice capabilities in mobile apps, web portals, and IoT wearables will start delivering consumer-experience use cases where the bank’s products are used like a utility in the background. Customers will give a voice command to a Watson-powered app to plan a dinner; reservation of a suitable restaurant and an Uber will happen in the background. Customers will enjoy the dinner, and the credit card will get used in the background as a utility! Credit card usage will not come to the customer’s explicit attention during the entire experience, and the card will cease to be an active player in a customer-facing role.

The other trend induced by modern digital technologies is the DECENTRALIZATION of TRUST. Today, government-appointed authorities publish your credit score. Slowly, trust will move to multiple individuals, institutions, and social influencers. Look at the popularity of services that publish a “Reputation Score” based on social media analytics, your personality analysis, awards, key links, most relevant pictures, and the star ratings you have earned on sites like eBay and Airbnb. The likes of Twitter will become institutions of trust. Today, a small shopkeeper operating on cash transactions does not get a home loan because of the way the current centralized system of trust operates. But soon, this part of the population, which is huge in number, will start being acknowledged by the likes of eBay, Airbnb, Amazon, Uber, Ola, and others. Blockchain will be used for tracking almost anything and everything: from food items to fashion garments to diamonds to property. The diagram below depicts four types of enterprises along the dimensions of trust and product. Enterprises that remain product-centric will lose market relevance sooner or later.

Banks need to choose their role in the ecosystem for the coming five years.

In such a scenario, where customer-experience delivery gains prominence and banks’ traditional products get relegated to utility level, banks and their staff will need to reinvent themselves. They will need to dig into their innermost core and manifest creativity to redesign their products and services for customer-experience-led scenarios. The shift from ‘product’ to ‘experience’ will help banks and their staff survive and thrive. This requires the invocation of human creativity, which cannot be delegated to an artificial-intelligence-powered machine.

Saturday, 25 November 2017

Small doesn’t necessarily mean simple: Implementing SAP S/4HANA and SAP MII

Just 10 short years ago, OMR Group was a small automotive company, based in Italy, with two owners and 50 or so employees. They were satisfied with the paper-based manual processes and systems that OMR had used for over 40 years. But OMR had the drive and ambition to move to the future and expand worldwide. Recently, they’ve experienced tremendous growth in production, employees and subsequently, revenue. Their fast-growing family-owned business has become much more complex and complicated. Simply put, they’ve outgrown their paper-based process.

Before now, OMR had no experience with technology. Moving to a technology-based system is a huge step forward, but it also meant stepping out of their comfort zone. Their growth forced them to review their internal procedures and processes and, ultimately, they realized that they had to make the move.

After much consideration, IBM and SAP were chosen to help transform their business. The customer appreciated our expertise, and our friendly and gentle approach. We designed the project journey as a team. However, implementing an SAP project in a small company doesn’t mean that it’s simple.

IBM presents the flexible approach

In a small family-owned company, the owners maintain the authority to make changes to processes, systems and their organization. And sometimes, it’s tough to convince the owners to embrace change.

Owners that are accustomed to paper-based reports with a specific format, down to a specific font, cannot always recognize the value and power that new technologies can offer. Using analytics tools that allow the user to surf through the data, aggregate and simulate reports online, on any device, without a paper trail can be difficult to accept. Letting go of paper-based, manual processes is not as easy as it sounds.

IBM needed to take a different approach to this project with OMR. We had to tailor our way of working to better meet the company’s culture, rather than forcing an unwanted culture onto them. We addressed OMR’s apprehension with:

◈ Short meetings with the owners to make key decisions instead of long discussions with all managers
◈ Simple presentations instead of long documents with too many details
◈ A prototype approach instead of a long blueprint, with no system demo

Most importantly, IBM placed the decision-making back into the owner’s control, instilling confidence in the overall project and gradually convincing them to embrace new technologies.

Implementing the latest SAP ERP

At OMR, we implemented an innovative solution based on the latest SAP ERP product, SAP S/4HANA, together with SAP MII (Manufacturing Integration and Intelligence). To date, OMR is running SAP Finance in four legal entities and is live with a foundry (all supply chain processes, plus finance). We are planning to roll out to five more plants soon.

We spread the system out to all employees in the company, including factory operators. With the previous systems, only “white-collar employees” were given access to the systems to report data (such as sales orders, purchase orders, and invoices). Operators were asked to report their production data (number of produced goods, number of defects, machine downtime, scrap) on paper, and the day after, a white-collar employee put this information into the system. Real-time data simply did not exist—and the risk of mistakes due to transcription was very high.

With the new S/4HANA and SAP MII architecture, operators now have direct access to MII via touch-screen monitors placed in the factory, and they can report what they did in real time. This new approach has helped OMR make more timely, targeted decisions at the executive level and enhances the ability to trace products and monitor quality.

By implementing the integrated solutions, OMR now has the ability to connect production systems right through to delivered products. Making these advances is crucial to driving international growth.

Friday, 24 November 2017

Cognitive Analytics and Mobility Can Empower Maintenance Service Personnel

One of the important KPIs of maintenance is to reduce the “Mean Time to Repair” (MTTR) of equipment. In the mining industry – as in any industry – maintenance personnel are always hard-pressed for time to rectify or service equipment in the shortest possible time and return it, in a safe operating condition, to its potential capacity. Maintenance personnel need to allocate jobs, allocate and withdraw spares and tools, refer to hard-copy manuals during execution of the job, and connect with the OEM for expert support. Imagine maintenance personnel with tools in one hand, continuously referring to the equipment manual while rectifying or servicing the equipment. Mobility and cognitive analytics put another mobile tool in the hands of maintenance personnel and empower them to reduce maintenance time and increase equipment utilization.
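
For reference, the KPI itself is simple arithmetic: mean time to repair is the total repair time divided by the number of repairs, as in this small illustrative Python calculation (the work-order durations are made up):

# Mean Time to Repair (MTTR) = total time spent repairing / number of repairs.
# The work-order durations below (in hours) are illustrative only.
repair_durations_hours = [4.5, 2.0, 6.5, 3.0, 5.0]

mttr = sum(repair_durations_hours) / len(repair_durations_hours)
print(f"MTTR = {mttr:.1f} hours over {len(repair_durations_hours)} repairs")
# MTTR = 4.2 hours over 5 repairs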

Asset Life cycle of Mining Machinery

The total asset life cycle runs from conception to scrapping, passing through multiple stages along the way.

There are two players in asset maintenance:

1. Original Equipment Manufacturer

◉ Direct OEM – Annual Maintenance Contract
◉ Through Dealer – Annual Maintenance Contract

2. Equipment User – Internal Maintenance

Let us discuss the cases.

Because equipment data is available from the manufacturing stage onward, data from the entire life cycle can be used for maintenance, whether by the OEM or by the equipment user. Characteristic and attribute values help drive root cause analysis of failures. The problems and faults, together with the associated actions, build up a historical record that helps shape future maintenance activities and reduces rectification or service time. All of this data is available to the maintenance supervisor or the maintenance executor, depending on their roles and responsibilities.
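
As a minimal sketch of that idea, the Python snippet below computes MTTR per fault code from a small, invented fault history and surfaces the corrective action that worked most often. The equipment IDs, fault codes and hours are illustrative assumptions, not real data.

from collections import defaultdict
from statistics import mean

# Invented fault history: (equipment_id, fault_code, repair_hours, corrective_action)
history = [
    ("HAUL-07", "HYD-LEAK", 6.5, "replace hose"),
    ("HAUL-07", "HYD-LEAK", 4.0, "replace hose"),
    ("HAUL-12", "ENG-OVERHEAT", 9.0, "clean radiator"),
    ("HAUL-07", "ENG-OVERHEAT", 7.5, "replace coolant pump"),
]

repair_hours = defaultdict(list)
actions = defaultdict(list)
for _, fault, hours, action in history:
    repair_hours[fault].append(hours)
    actions[fault].append(action)

# MTTR per fault code plus the corrective action that worked most often,
# so the next technician starts from what resolved the fault before.
for fault, hours in repair_hours.items():
    usual_fix = max(set(actions[fault]), key=actions[fault].count)
    print(f"{fault}: MTTR {mean(hours):.1f} h, usual fix: {usual_fix}")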

How cognitive analytics can provide support during the entire life cycle

The manufacturer can use the data generated by the user, along with the feedback and support requests received over time, to help reduce SLAs during the annual maintenance contract (AMC), while the equipment user can draw on its own data historian and on online help from the OEM for maintenance.

Further, these analyses can support remote assistance, using mobile devices and the Internet of Things (IoT) to build a better decision framework in which the OEM monitors equipment attributes remotely and suggests actions or remedies.
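
Here is a minimal sketch, in Python, of the kind of remote-monitoring rule an OEM dashboard might apply to streamed attribute values. The attribute names and threshold values are assumptions for illustration only; real limits would come from the OEM's manuals.

# Hypothetical attribute limits.
THRESHOLDS = {"oil_temp_c": 95.0, "vibration_mm_s": 7.1, "oil_pressure_bar": 2.0}

def check_reading(attribute, value):
    """Return a suggested action when a monitored attribute breaches its limit."""
    limit = THRESHOLDS.get(attribute)
    if limit is None:
        return None
    if attribute == "oil_pressure_bar" and value < limit:
        return "Low oil pressure: stop the equipment and inspect the pump"
    if attribute != "oil_pressure_bar" and value > limit:
        return f"{attribute} above {limit}: schedule an inspection"
    return None

# Simulated readings streamed from the equipment to the OEM's remote dashboard.
for attribute, value in [("oil_temp_c", 98.2), ("vibration_mm_s", 3.4)]:
    action = check_reading(attribute, value)
    if action:
        print(f"ALERT: {action}")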

Monday, 20 November 2017

Malls of the future will harness tech to make tenants and shoppers happier

The rise of online shopping has not been kind to malls. Traffic is declining, jobs are dwindling, and many malls are disappearing altogether. By 2022, analysts estimate, one out of every four malls in the U.S. could be out of business. But while more malls will certainly shut down in the years ahead, those that remain will evolve to become more vital than they’ve ever been.

Traditionally, shopping malls have been at the heart of communities across the world. To continue to play that important role, mall operators need to reconsider the operating model that has served them for decades. By combining physical and digital capabilities, mall operators can create new sources of revenue and offer unique experiences for tenants and consumers alike.

Imagining solutions

Smarter city analytics, predictive modeling and advanced analytics could dramatically improve decision-making at all levels in malls. Traditional reporting could be replaced with mobile dashboards, and workflows could be automated to improve efficiency and response times across the ecosystem of service providers.

The entire tenant lifecycle, meanwhile, could be digitized, allowing tenants frictionless access to a mall's services. Lead management, CRM, master data management and mobile apps could support the prospect-to-contract cycle.

In the future, mall operators could, for a fee, offer non-traditional services beyond the traditional leasing contract, such as access to commerce platforms, digital marketing and campaign services, consumer segmentation services, in-store apps, and IoT solutions. Smaller brands that can’t be accommodated within the physical mall could be accommodated in the digital mall.

For shoppers, malls could offer guided navigation, AI-driven digital concierge services, personalized marketing, proximity based offers, seamless Wi-Fi (both inside stores and outside in common areas), and ticketless parking. A mall could even offer services in collaboration with third-party service providers. For instance, a mall could partner with Uber to provide free rides to the mall and back after a set number of visits.

Crafting experiences

Malls of the future will need to offer unique experiences to incentivize shoppers to visit. Some of today’s most successful stores are already offering such experiences. At historic French department store Le Bon Marché, for instance, in addition to shopping the world’s top clothing brands, shoppers can have lunch, buy music on vinyl, and take in an exhibition. Delivering such signature experiences depends on a comprehensive understanding of the customer base, so mall operators will need to gain the ability to mine insights from consumer data for segmentation and targeting.

Mall operators can learn from the collective experience of other industries and avoid the pitfalls of disjointed solutions, opting instead to centralize core capabilities that can be exposed across various channels. We call this collection of centralized capabilities the “Mall Operating System” or “Mall OS” of the future.

With key digital technologies, malls can improve business agility, lower spending and create differentiated experiences. Far from presenting an existential threat, digital technology is the key to the malls of the future.

Friday, 17 November 2017

Risk Management: New Frontiers for Metal Companies

It's high time for metal producers and consumers to explore new frontiers in risk management, given increased volatility, risk and regulatory compliance demands. Enterprise-wide risk management requires an integrated view of market, credit, operational and geopolitical risk. Managing market risk is vital to managing profitability. Many factors affect market risk for metal companies: foreign exchange, interest rates and commodity prices (ore, alloys, finished products, energy and so on). An effective risk management policy covers all levers of influence, such as derivatives trading and hedging; revising contracting and payment terms; pricing policies; capacity utilization and shutdowns; and inventory policy. In this blog, I focus on market risk, specifically the commodity cost and price fluctuations affecting producers and consumers.

Due to increased volatility in metal market prices, both consumers and producers want to secure their profit margins. Volatility is further intensified by demand forecast errors, increased raw material price liquidity, trading arbitrage and shifting customer delivery dates, and these effects ripple across the entire supply chain. The degree of maturity in commodity trading varies among consumers and producers: some execute basic strategies to manage risk exposure, while others derive profits from trading activities as a significant revenue component.

There are three key areas where we need to integrate our risk management and commodity trading strategies:

1. Derivatives trading – There are multiple exchanges for metal trading where producers (miners and metal/steel refiners and manufacturers) can hedge the risk exposure caused by metal price volatility. Financial instruments used for hedging include forward transactions, futures contracts, warrants and zero-cost collar options. It is important to monitor open positions (mark-to-market, net exposure) on physical and financial trades at the transaction level; a minimal sketch of this calculation follows after this list. Achieving it requires end-to-end supply chain integration and visibility.

2. Supply chain integration with end-to-end transaction visibility and analytics – To manage risk, we first need to make our risk positions visible across sales, inventory (raw material, work in progress and finished goods), operations (production, quality, transportation) and procurement. Let's start with sales contract creation. Companies use complex pricing mechanisms based on market and exchange-linked prices for quotational periods in provisional offers and final delivery, including formulas based on metal characteristics. We then need to monitor our exposure position based on open contractual agreements and the mark-to-market values of both physical and traded transactions, including the movement of the physical commodity. This integrates further with production and quality, where assay characteristics are used to calculate bonuses and penalties for provisional and final invoices. The same concept extends throughout the supply chain to procurement.

3. Monitoring and reporting – Regulatory compliance for hedge accounting, fair value disclosures and value at risk, plus monitoring of the other risk categories linked to credit, geopolitical and operational risk.
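
Here is the minimal Python sketch of the open-position monitoring described in points 1 and 2: net exposure and mark-to-market across physical and financial trades. The positions, quantities and prices are invented for illustration.

# Invented positions: physical sales are negative (we must deliver),
# purchases and bought futures are positive. Quantities in tons, prices in USD/t.
positions = [
    {"type": "physical_sale", "metal": "nickel", "qty": -500, "contract_price": 15200},
    {"type": "physical_purchase", "metal": "nickel", "qty": 300, "contract_price": 14800},
    {"type": "futures_long", "metal": "nickel", "qty": 200, "contract_price": 15000},
]
market_price = {"nickel": 16000}  # assumed current exchange price

net_exposure = sum(p["qty"] for p in positions)
mark_to_market = sum(
    p["qty"] * (market_price[p["metal"]] - p["contract_price"]) for p in positions
)
print(f"Net exposure: {net_exposure} t")            # 0 t means the book is fully hedged
print(f"Mark-to-market: {mark_to_market:,.0f} USD")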

Let us take one example from stainless steel. Producing stainless steel requires inputs (iron ore, coal, alloys such as chromium and nickel, and energy). The chromium and nickel content typically ranges from 10-30% and 2-10% respectively. Price fluctuations in nickel, which trades at around US $14,000-20,000 per ton, affect the cost of goods sold far more than chromium, which trades at around $2,300-2,500 per ton. To manage nickel fluctuations, the company in this example is updating its pricing policies and formulas and implementing a commodity management solution that provides end-to-end supply chain visibility to manage risk and profitability at the customer order line level.
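
A short worked calculation, using an assumed grade of 8% nickel and 18% chromium (within the ranges above), shows why nickel dominates the alloy cost:

# Assumed grade: 8% nickel, 18% chromium per ton of stainless steel.
nickel_share, chromium_share = 0.08, 0.18
chromium_price = 2400                      # USD per ton, mid-range figure

for nickel_price in (14000, 20000):        # USD per ton, the range cited above
    alloy_cost = nickel_share * nickel_price + chromium_share * chromium_price
    print(f"Nickel at ${nickel_price:,}/t -> alloy input cost ${alloy_cost:,.0f} per ton of steel")

# The nickel swing alone moves cost by 0.08 * 6,000 = $480 per ton of steel,
# while the chromium contribution stays near 0.18 * 2,400 = $432 per ton.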

Conclusion – Enterprise-wide risk management needs to support a company's growth strategy, including its financial targets. In metals companies, market risk is seen as a major risk category. To manage it effectively, we need to go beyond the day trader's spreadsheet and deploy technology solutions that provide end-to-end supply chain visibility and analytics, with real-time integration of sales, planning, production, quality, transportation, inventory and procurement transactions. We also see metal markets (including steel) heading toward increased liquidity, much like oil and gas. In these liquid markets, producers can explore opportunities to decouple their supply chains in terms of supply, production and demand. In the future, risk management and commodity trading will be an integral part of consumer and producer strategies, influencing the entire supply chain for sustainable profitability.

Wednesday, 15 November 2017

Use the power of digital and win

I board a flight and call my rental car company to book a vehicle. I land at the major airport where my reservation is confirmed but encounter one small detail: all the cars are rented. The manager apologizes, smiles and hands me a coupon for a free day. When changing cities in midweek and trying to be productive, it is pointless for me to argue that I have a reservation. Instead, I find the supervisor and start chatting about whether IBM could help the company fix its planning and scheduling system challenges. Before long, my car arrives and I dial into my next work call.

Monday, 13 November 2017

The role of the Energy Integrator

Continuing disruption in the energy sector is pervasive worldwide and is driving structural, technical and commercial changes in the industry. In the midst of this disruption, utilities need to embrace the role of the Energy Integrator. Three precepts are at the core of what that role means:

◉ The essential grid
◉ The power of markets
◉ The mantle of sustainability

The essential grid

Although the electric grid will remain a central element of the electricity system for the foreseeable future, the energy integrator will have to operate a new grid whose operating characteristics are more complex than those of today's grid. Even with significant automation and new technology, the energy integrator will still be required to operate that grid.

The power of markets

The energy integrator will use marketplace mechanisms to support a robust market for electricity and the associated products and services it requires. An open market construct for pricing is preferred as renewable and demand-side technologies are increasingly integrated into the supply mix as dispatchable resources.
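
As a minimal sketch of such a market mechanism, the Python snippet below clears a set of offers, including renewable and demand-response resources, in merit order against demand. The resources, volumes and prices are assumptions for illustration only.

# Invented offers: (resource, MW offered, price in USD/MWh).
offers = [
    ("wind_farm", 120, 0.0),
    ("solar_park", 80, 0.0),
    ("demand_response", 50, 35.0),
    ("gas_peaker", 200, 60.0),
]
demand_mw = 230

# Dispatch the cheapest offers first until demand is met; the last offer taken
# sets the clearing price.
dispatched, remaining = [], demand_mw
for resource, mw, price in sorted(offers, key=lambda offer: offer[2]):
    take = min(mw, remaining)
    if take > 0:
        dispatched.append((resource, take, price))
        remaining -= take
clearing_price = dispatched[-1][2] if dispatched else None
print(dispatched)
print(f"Clearing price: {clearing_price} USD/MWh")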

The mantle of sustainability

Sustainability as a social objective and political imperative increasingly is being adopted on a global basis. Embracing, fostering and promoting sustainability is a foundational concept for the energy integrator. Sustainable energy production is a fundamental aspect of the energy integrator’s mission and offers opportunities for innovation.

Friday, 10 November 2017

Mobility in the Energy and Utility industry

The use of smartphones and tablets by utility employees and customers has forced utilities to deal with the challenges of the new mobile era. Mobility has accelerated the utility's digital transformation, with employee and customer expectations rising rapidly. Utility employees expect dynamic, configurable services and processes with the same level of experience as their consumer mobile apps, and customers expect a personalized, interactive mobile experience.

The line between personal and professional use of mobile devices has blurred, with more and more employees using their personal devices to get work done. Mobility has expanded our world beyond simple smart devices, creating a new digital world, which means both the IT and business unit leaders in a utility are faced with the challenge of taking advantage of the technology while addressing security and privacy issues.

Even though many organizations have acknowledged the mobile transformation trend, many utilities still do not have a formal mobile strategy. According to IDC Energy Insights,¹ more than 37% of utilities consider mobility a top priority for 2016; however, more than 40% of utilities do not have a mobile strategy. It is clear that mobile is disrupting virtually every existing business and technology process and is forcing utilities to develop a sound mobility strategy.

Why is mobile important?

Mobile technology is helping Energy and Utility organizations become more nimble, responsive and efficient. When outages occur, speed of response and public safety are paramount. From user-centric and process-specific apps, to apps configured "on the fly" in an outage situation, to the ability to give front-line workers detailed information about outage location, customers, network status and external data, as well as enabling collaboration between personnel, mobility is changing the way utilities work.

Mobility enables organizations to improve customer service by providing field workers access to technical data and documentation when and where it is needed. Mobile field crews can be dispatched to the right location with the right inventory, and enabled to collaborate with colleagues in the field or other departments to resolve issues while at the scene. Mobility improves overall efficiency by enabling direct updates from the field, real-time data/picture/video collection and analytics on asset status.

For example, a European water utility with a significant mobile workforce needed to integrate many of its enterprise applications, such as Enterprise Asset Management, Resource Planning and Customer Relationship Management, to make specific data accessible from its field workers' various mobile devices (supporting multiple operating systems and form factors). It was imperative for this utility to provide its field workforce with up-to-date status information on assets, work orders and customer requests, and for work to be provisioned and updated while in the field. Thanks to the utility's well-thought-through mobile strategy, the workforce in the field can now address an asset failure or deal with a customer request more efficiently by accessing information from the company's core operational systems and combining it with data updated in the field, with pictures, voice recordings and videos added to the work order and the system of record.
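
A minimal Python sketch of the kind of field update involved is below; the field names and the offline queue are illustrative assumptions, not any specific product's API.

import json
from datetime import datetime, timezone

# Illustrative field update, assembled on the device while offline and synced
# when connectivity returns.
work_order_update = {
    "work_order_id": "WO-2017-004711",
    "asset_id": "PUMP-STATION-23",
    "status": "repair_completed",
    "crew": "crew-7",
    "notes": "Replaced seal; pressure back within tolerance",
    "attachments": ["photo_before.jpg", "photo_after.jpg", "voice_note.m4a"],
    "gps": {"lat": 48.2082, "lon": 16.3738},
    "reported_at": datetime.now(timezone.utc).isoformat(),
}

pending_queue = []                         # held locally until the network is back
pending_queue.append(json.dumps(work_order_update))
# On reconnect, each queued update would be posted to the enterprise asset
# management system and attached to the work order in the system of record.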

Expanding mobile capabilities for the field workforce is providing utilities with clear benefits, such as less asset paperwork and manual reporting, better crew dispatching and communication in the field, improved workforce safety, best-practice procedures, timesheet management, and easy access to network maps and documentation. Utilities are also offering mobile, flexible solutions not only to their field workers but to their office workers and customers as well. All of these activities mean that utilities are facing the challenges inherent in implementing mobility: device management, security, communication networks and mobile application development.

To take full advantage of mobile transformation, utilities must be able to fully address the business impact of mobile solutions within the entire enterprise – not just for workers in the field. A utility can do this by effectively integrating the unique capabilities offered by mobile devices and technology into its business operations. Mobile app development should focus on dramatically changing the way work is done, integrating mobile with analytics and helping employees make decisions and do their jobs in the way they could never do before. Some examples include:
◉ Mobile technology can enable first responders to communicate with each other in “real-time” and respond faster to emergency situations, which can increase safety for both employees and the public.
◉ Enabling employees to collaborate within the organization can improve workforce efficiency, boost their skills and enable knowledge management.
◉ Providing real-time access to information, location and status of employees and assets can help boost efficiency across the organization, improve reliability and improve customer satisfaction.

Utilities are in the midst of a fundamental transformation. The industry faces many challenges, such as distributed renewable energy, energy efficiency, microgrids, smart metering, carbon emission regulations, grid infrastructure, severe weather events, security, customer expectations and competitive retail markets. However, leading utilities are adopting and leveraging these technologies to help them deliver more services and respond more quickly and effectively to employees, customers, regulators and other stakeholders.

Wednesday, 8 November 2017

The Smart Grid was your first Internet of Things project

For many electric power utilities, the Smart Grid was their first Internet of Things project. Just as some early smart grid projects were started before the term became popular, and were then known as an intelligent utility network or advanced distribution automation project, so too was smart grid an early version of internet of things for utilities.

Monday, 6 November 2017

The energy integrator spheres of operation

The energy integrator provisions the systems of engagement to sustainably balance distribution-side energy supply and demand safely, reliably and securely. The energy integrator operates within three core spheres of operation:

Friday, 3 November 2017

Transacting in real-time: IBM launches new payment systems with Zelle

Earlier this year the Federal Reserve released the second part of its long-anticipated report from its Faster Payments Task Force. The report outlines a path to modernizing the U.S. payments network by 2020.

The Clearing House’s Real-Time Payment System has just come online after two years of development, opening the floodgates for Real Time ACH.

NACHA is marching forward from same-day credit ACH transactions, launched last September, to sub-10-second debit transactions, putting pressure on legacy systems while setting an aggressive expectation for the industry.

Fintechs like Venmo, PayPal, and TransferWise have revolutionized peer-to-peer payments, capturing a latent market beholden to slow ACH processing. They’ve set the bar for customer expectations, and the American banking industry is rapidly responding.

Over the past 18 months, and accelerating into this year, everything in the payments ecosystem has changed. Banks are now expected to settle transactions faster than ever before, putting new pressure on existing platforms and processes. Further complicating the issue, as the speed of payments increases, so do the speed and complexity of payment fraud.

So here we are. Financial Institutions are scrambling for the necessary infrastructure, innovation and expertise to support the new world of faster payments.

Having seen the writing on the wall, IBM began collaborating with The Clearing House and the Zelle network to figure out how to help customers manage these changing market dynamics. At Money20/20 this week, IBM announced that it has integrated its Financial Transaction Manager (FTM) software within Zelle’s payment network.

This big step forward allows customers of banks that use FTM to have immediate access to the Zelle network, to send and receive money in minutes.

While seemingly daunting in scope, a connected, high-speed payments network that lets all U.S. banks handle real-time payments is achievable with the right approach to making hardware, transactional cost and security blend seamlessly together.


In addition to supporting traditional distributed hardware platforms, IBM is uniquely positioned to take advantage of history's most enduring computing platform: the mainframe. What might be a little-known fact is that the vast majority of financial institutions in North America run massive amounts of transaction processing on IBM mainframes. Solutions running on IBM Z Systems have the necessary computing power, speed, security and connectivity to host payments workloads. With October's announcement that FTM for ACH and real-time payments is now supported on the mainframe, with very strong integration capabilities to legacy settlement systems, this new solution offers a better way to tackle the real-time payments challenge.

Transactional Cost

Given that mainframes have always had the computing power, security and capacity to run payment workloads, why haven't we seen widespread adoption in the past? The answer comes down to cost. Historically, mainframe software costs have been based on consumption usage models: how much compute power and capacity is being used across the subsystem stack. Banks have become very good at predicting daily consumption, but payment volumes introduce unpredictable spikes, throwing a wild card into the equation. From one day to the next, a bank could see a spike in payments that pushes its consumption outside the expected range, adding significant cost to the system.

IBM introduced a new pricing model that mitigates the risk of consumption spikes. The model prices payments by transaction, not on the consumption used by the transaction. This makes running real-time payments on Z Systems much more economically viable.
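
The difference between the two models can be sketched in a few lines of Python. All price points and capacity figures below are invented assumptions, not IBM's actual pricing; they only illustrate why per-transaction pricing is easier to budget when volumes spike.

# Illustrative figures only.
def consumption_cost(peak_capacity_units, price_per_unit=120.0):
    # Consumption pricing follows the capacity actually consumed, so an
    # unpredictable payment spike that raises peak usage also raises the bill.
    return peak_capacity_units * price_per_unit

def per_transaction_cost(payment_count, price_per_payment=0.03):
    # Transaction pricing depends only on the number of payments processed,
    # which a bank can forecast and budget for.
    return payment_count * price_per_payment

# A normal day versus a spike day: volume rises 60%, but peak capacity usage
# doubles because the spike is concentrated in a short window.
print(consumption_cost(400), per_transaction_cost(1_000_000))
print(consumption_cost(800), per_transaction_cost(1_600_000))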


More and faster digital transactions mean more and faster payments fraud. With IBM's Safer Payments solution, fraud detection is now possible with ever-increasing accuracy and in sub-second timeframes. Safer Payments is powered by an Artificial Intelligence (AI) engine that learns and optimizes its fraud detection algorithms as it processes more transactions. This allows the system to detect emerging fraud patterns and adapt to them in a way that traditional, manually trained systems would otherwise miss. Safer Payments is unique in the industry because as fraud detection goes up, false positives come down. In the emerging world of real-time payments, real-time fraud detection is a must.
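
To illustrate the general idea of a model that keeps learning as transactions flow through (this is not the Safer Payments engine, whose algorithms are proprietary), here is a minimal Python sketch of adaptive, per-account scoring; the accounts, amounts and threshold are assumptions.

from collections import defaultdict

# Running per-account profile of what "normal" spending looks like.
profiles = defaultdict(lambda: {"count": 0, "mean": 0.0})

def score_and_learn(account, amount, multiple=3.0):
    p = profiles[account]
    # Flag amounts far above this account's running average once there is history.
    suspicious = p["count"] >= 5 and amount > multiple * p["mean"]
    # Online update of the running mean so the model adapts to new behaviour.
    p["count"] += 1
    p["mean"] += (amount - p["mean"]) / p["count"]
    return suspicious

for amount in [40, 55, 38, 60, 45, 52, 900]:   # invented transaction amounts
    if score_and_learn("acct-001", amount):
        print(f"Hold for review: ${amount}")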

We’re thrilled by the new integration of FTM with the Zelle network because it means a better delivery of service to end-user customers, lower costs to banks, and stronger protection against fraud. In this rapidly changing landscape of payments, innovative enterprise thinking builds the right solution for modern banking.