Wednesday, 28 February 2018

Instant Checkout: Transforming Customer Experience with Shell

IBM Tutorials and Materials, IBM Certifications, IBM Guides, IBM Learning

Self-checkouts are supposed to save us time, but crumpled barcodes in unpredictable packaging locations requiring individual scanning, and the dreaded ‘unexpected item in the bagging area’ warning message can quickly turn a trip to the shop into an extreme test of one’s patience. The longer and more arduous the process to check out, the more likely it is the shopper will abandon the basket, a worst-case scenario for the retailer.

Fortunately for shoppers and retailers alike, IBM iX has partnered with a series of specialist companies to develop a breakthrough invention for ‘instant checkout and connected store’ that is set to revolutionize shopping. The first step is to give customers the fastest and most secure way to check out. The long-term vision, however, is to transform the store and its supply chain, reduce waste globally, and empower more ethical consumerism worldwide.

Phase 1: Fastest, easiest, most secure way to check out in stores

To refine the customer experience and technology, the program was trialed over six weeks at a Shell store in the UK with roughly 140 customers.

Customers were universally delighted by the instant checkout’s ease of use and ultra-fast speeds. It takes only five seconds for the instant checkout to complete a transaction for any number of items – unlike traditional self-checkouts and cashier checkouts, which take longer the more items you buy because each barcode must be scanned individually. Even when limited to only 10 items, the difference in speed is still staggering, with the instant checkout generally performing 15 times faster than a self-checkout and seven times faster than a cashier.

Beyond the instant checkout’s speed and ease of use, though, consumers in the Shell trial also greatly valued the extra security provided by the invention. This is because it utilizes a new patent-pending payment process to facilitate payments via a Bluetooth-enabled reader. The reader, unlike traditional card and NFC readers, doesn’t collect or send any customer or payment data. Instead, it sends a unique code to the customer’s app to be matched and routed via a Cloud platform. No identifying information about customers or their payment details is ever transmitted in, or to, the store. When the payment details are matched, customers receive their receipt on the app. This makes the instant checkout more secure than NFC (‘Contactless’) or Chip & PIN.
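
The code-matching flow described above can be sketched as follows. This is a minimal illustration, not the actual patent-pending protocol: the `CloudMatcher` class, its method names and the use of `secrets.token_hex` are all assumptions made for the example.

```python
import secrets

def issue_checkout_code():
    """The reader generates a one-time code; no card or customer data is read."""
    return secrets.token_hex(16)

class CloudMatcher:
    """Hypothetical cloud platform that pairs reader codes with app sessions."""
    def __init__(self):
        self.pending = {}

    def register_reader_code(self, code, basket_total):
        # The in-store reader registers the code it just sent over Bluetooth.
        self.pending[code] = basket_total

    def confirm_from_app(self, code):
        # The customer's app echoes the code it received; the platform matches
        # it and charges the payment method already stored in the cloud.
        total = self.pending.pop(code, None)
        if total is None:
            return None  # no match: the payment is refused
        return {"charged": total, "receipt": f"Receipt for {total:.2f}"}
```

The key property the sketch captures is that only an opaque, single-use code ever travels through the store; the mapping from code to customer and payment details lives entirely in the cloud.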

The Universal Tag: Breaking the Barriers of RFID 

The instant checkout capability uses brand new ‘universal tags’ that employ RFID (Radio Frequency Identification) technology. Unlike RFID tags of the past, these are small, discreet and can work on any product type regardless of materials like metal or liquid.

Once available, the universal tags allow the instant checkout to also deliver retailers a solution for the truly ‘connected store.’ By equipping shelves with ultra-discreet RFID antennas, it’s possible to know exactly what stock is in any store in real-time. This is accomplished without any costly changes to the existing shelving or infrastructure, or the need to install infrared cameras and sensors throughout the store. The only other way to gain this level of real-time inventory data would be through manual stock counts – a costly and potentially inaccurate process that most store staff only get a chance to do during slow periods or after hours.
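
As a rough illustration of how shelf-level reads translate into live stock counts, here is a minimal sketch. The `shelf_inventory` function and the `(tag_id, sku)` read format are hypothetical; real RFID middleware also handles signal filtering and missed reads.

```python
from collections import Counter

def shelf_inventory(tag_reads):
    """Collapse raw RFID reads of (tag_id, product_sku) pairs into stock counts.

    Each physical item carries a unique tag, so repeated reads of the same
    tag are de-duplicated rather than double-counted.
    """
    seen = {}
    for tag_id, sku in tag_reads:
        seen[tag_id] = sku
    return Counter(seen.values())
```

Because every tag is unique to one physical item, an antenna can poll the shelf continuously and the count stays correct even when the same tag answers many times.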

Because customers use the app to pay, retailers are also provided with anonymized data to gain insight into what customers are buying and when. The platform also analyses IBM data about weather, local events, and trends, which means retailers get insights driven by Artificial Intelligence to be able to predict what stock they’ll need, thereby reducing waste and improving profitability for each store individually at a very local level.

Phase Two: Doing Well by Doing Good 

During the first phase, the ‘universal tags’ will be physically attached to each item via specially designed tagging guns. Initially, this enables tags to be placed and encoded in an instant. Once adoption of the instant checkout has achieved sufficient volume, the RFID technology can then be embedded directly into all product packaging and garment labels. This will eliminate the need for applying separate tags and will allow product manufacturers to store more information about each product via its unique ID number and the solution’s Cloud platform.

This is critical because fast checkout is just the beginning of IBM’s vision for this solution. By first providing customers with a fast and secure way to pay, the invention will ultimately drive five major changes for retailers and consumers alike.

1. Transform shopping

Imagine an app that can tell you precisely where everything is in every local store in real-time, without even setting foot inside. It can also navigate you to exactly where a specific product is – even when items are put back in the wrong place. As you shop, the app can also send you discounts for the products you actually buy – instead of its best guess of what you might like. With accurate insights combined with external data feeds, the offers coming through will be relevant and personal to you, rather than misinformed or untimely.

2. Empower consumers

The RFID technology means information about whether a product is truly sustainable, ethical or organic will be stored in an unalterable record that customers can access in stores and on their smart devices via the app. Once the invention is adopted by product manufacturers, the RFID technology can be used to link any individual product to its points of origin, sources of raw ingredients, factory conditions, farm classifications, nutritional information, expiry dates and more. This information is stored on a Blockchain, which is an encrypted shared ledger that every company along a retailer’s supply chain can add to, but not erase or alter.
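
The “add but never alter” property described above can be illustrated with a toy hash-chained ledger. This is a simplified sketch of the general blockchain idea, not IBM’s actual platform; the `ProvenanceLedger` class and its record format are invented for the example.

```python
import hashlib
import json

class ProvenanceLedger:
    """Toy append-only ledger: each entry hashes the previous one, so
    rewriting history breaks the chain (a simplified blockchain property)."""
    def __init__(self):
        self.entries = []

    def append(self, record):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})

    def verify(self):
        # Walk the chain and recompute every hash; any tampering is detected.
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["record"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

Each supply-chain participant can append a record (origin, certification, factory conditions), but silently editing an earlier record invalidates every hash after it.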

3. Evolve the role of the shop assistant

Unlike other futurist concept stores, this invention wants to use the data and AI insights gained by having a fully connected store to evolve, not replace, the role of shop staff. For example, when equipped with the right data, training and technology, the staff in a grocery store could become trusted advisors on cooking, nutrition and food pairings – providing much-needed differentiation for a grocery retail brand that’s struggling to compete with discount rivals and online merchants.

4. Reward recycling

Once universal RFID tags are integrated into product packaging, they’ll survive the journey from the store to the customer, all the way to the recycling plant. There, scanners can be used to detect products and reward customers for recycling.

5. Reduce product packaging overall

A lot of product packaging is designed around the need to fit a lot of product information onto the package or label in a way that’s clear, visually attractive and inexpensive. As such, product manufacturers often turn to plastic. With this RFID technology and its cloud platform, unlimited information in any format can be stored against each individual product. This information could be shared via interactive displays on shelves, and the products themselves can be wrapped in plainer, smaller and more eco-friendly packaging.

Innovation through Collaboration

When fully realized, the goal for instant checkout and the connected store will be to transform not only the way we shop as customers but also the way we behave and consume as humans. To achieve this vision in an accelerated time frame, IBM iX is collaborating with retailers, product manufacturers, wholesale distributors and many other partners from across the globe to expand, perfect and scale this invention.

Sunday, 25 February 2018

Social Media Insights: Classroom of the Future

In 1907, The Journal of Education solemnly condemned “our modern family gathering, silent around the fire, each individual with his head buried in his favorite magazine, is the somewhat natural outcome of the banishment of colloquy from the school…” Now imagine replacing “favorite magazine” with “smartphone” or “app” and you get the kind of remark often made today. So, before analyzing “kids these days” along with their new learning technologies, take a moment to think that maybe each generation is inclined to an exaggerated fear about the latest trends influencing the young, from newspapers to social media.

That said, let’s dive into a discussion about the future of education as reflected by different social media trends. What you’ll be reading next is based on social metrics that reveal insights about new developments in two major areas: online education and personalized learning.

Massive Open Online Courses

Or MOOCs: an acronym filled with great expectations – particularly in 2013, a year proclaimed by The New York Times “The Year of the MOOC”. Since then, the concept has been commercialised to the extent that #moocs (along with #edtech) became some of the most popular education-centric hashtags on Twitter. But what do social media users believe about MOOCs now – more than 3 years after this declaration was made?

Worldwide social data shows that MOOCs continue to be a hotly debated topic. Most English-language social media conversations happen, unsurprisingly, in the US. The non-English-speaking country that showed the most interest in this topic is India, where there is a strong online community discussing digitization in the education sector.

Generally, MOOCs are still considered an opportunity to bridge the skills gap. However, some users seem undecided about whether MOOCs are effective on their own and prefer blended education in an attempt to get the best of both worlds, online and offline. Though the level of euphoria seems to have decreased, there is little negativity associated with MOOCs. In many cases this negativity centers on research claiming that online courses are mostly designed and used by graduate students, and are therefore failing to reach other (underprivileged) groups as much as initially expected. The latest primary research also confirms this catch-22 scenario: while many low-income and low-education Americans would benefit from e-learning, they don’t access it nearly as much as their educated, high-income peers. Additionally, social listening reveals that some negative sentiment is triggered by concerns about students’ privacy, as more and more data is gathered about their learning patterns, behaviours, attendance and results.

But with 87% of college students reporting that access to data analytics about their academic performance has a positive impact on their learning experience, analytics – and therefore technology – looks set to play a central role in the future of education. If tracking progress helps scholars learn better, efforts to improve and standardize data collection and drive better integration across different learning platforms and phases seem justified.

Getting back to our main point of discussion – has the NYT rushed its celebration of MOOCs? Google Trends shows a slow decrease in the level of searches which might indicate the hype is, indeed, fading.

However, social media listening shows it continues to be a trending topic on Twitter, even if part of the initial enthusiasm has been dulled by a number of legitimate concerns.

With so many of us used to interactive content nowadays, it seems likely that videos will become the new textbooks. Already, video is expected to make up about 80% of all consumer Internet traffic by 2019 (up from 64% in 2014), so it is hard to believe MOOCs were just a trend of the past. More likely, MOOCs are shifting from a phase of early hype into a more mature element of mainstream education.

Personalised learning

The Jetsons cartoon’s view of daily life in the future painted a utopian vision filled with elaborate robotic contraptions, holograms, and whimsical inventions that could come to transform transportation, lifestyle or… schooling. In one episode, Elroy (the little boy of that futuristic family) is taught by a robot teacher who leads the class. Was the futuristic pop culture of the 1960s anticipating the way kids will soon learn?

Perhaps we are not anticipating robotic teachers at the head of the class just yet but most commenters tend to agree that we have reached the end of “one size fits all” education. This marks the beginning of a new era. The era of personalized education – tuned to the preferences, needs and performance of the individual. However, social data indicates that opinion is pretty divided on what that means in practical reality.

On Twitter, “personalized learning” means different things to different people. By using text analytics and reading through statistically representative samples of tweets, we were able to identify two main trends that have emerged and defined the past year. For some, it means more time for teachers to engage with students on a personal level, to give courses that personal touch while creating a powerful bond with the students. For others it means innovative learning technology able to identify learning patterns, to tailor coursework and to alert others when students show signs of being at risk of dropping out. I believe that the two trends don’t actually oppose one another.

Of course we need teachers to teach to individual students as much as they can; at the same time, we need non-judgmental personalisation that feeds into students’ interests and aspirations. This does not set up an opposition of relationships vs. algorithms, but might point instead to a fruitful collaboration between human teachers and intelligent, data-driven assistants.

By applying analytics to data about each student’s way of learning, a caring teacher gains fast, powerful insight that allows him/her to act quickly in helping students achieve their potential or prevent them from dropping out. Technology isn’t the focal point here, it’s just a way to connect one teacher with thirty students’ interests, hopes and dreams.

So though the Jetsons might have not entirely got it right when putting a robot in front of a class, they were still very insightful about the way the students of today might leverage learning technology.

Bottom line

By analyzing one year of online chatter about education from teachers, students, parents and digital natives alike, I have observed a deep concern about what the future of education should look like. Instead of a clash of visions, there is a common desire to transform education into a more personalized learning process that provides meaning and motivation. Maybe in an ideal classroom of the future, teachers will draw more on existing technologies and analytics, identifying the problematic areas in students’ ways of learning and designing courses that address them head on. Maybe giving students access to their learning records (from kindergarten to high school) will lead to better career and life choices. After all, as one tweet pointed out: “The problem with ed data is not a technical one, it’s an actionable one”.

Thursday, 22 February 2018

IBM and Skytap ensure smooth tool migration for Veritas

It’s hard to say who has more trepidation about an IT infrastructure change: those who are responsible for making it happen or users. Change, though often inevitable, is not usually without some anxiety.

Two years ago, Veritas Technologies, a leader in multicloud data management, was spun off as an independent entity from Symantec. At that time, Veritas had a large lab environment running on an internal Symantec cloud with more than 10,000 virtual machines (VMs) defined in that cloud and 1,200 templates in active use.

Because of the divestiture from Symantec, Veritas needed to move off the Symantec cloud by a deadline, otherwise it would incur financial penalties. The timeframe was short.

Searching for a solution

The challenge was that users were happy with the tool set they had. It was a homegrown tool that, with customizations, offered breadth and usability.

Veritas issued a formal request for proposals (RFP) for the migration, and reviewed all sorts of different solutions from major providers. Additionally, it considered creating another homegrown solution. Ultimately, the company engaged IBM Cloud Professional Services and IBM Business Partner Skytap to migrate its complex enterprise application to an IBM Cloud data center with IBM Cloud for Skytap Solutions.

This solution offered a pre-built user interface. Having the front-end orchestration layer taken care of would mean faster time to market.

Making the move

The migration involved moving the infrastructure that was located in Tucson, Arizona to Dallas, Texas, which included relocating more than three petabytes of data. How to most effectively move that amount of data was one of the key technical challenges that Veritas faced. The company built a network that could sustain two gigabits per second throughput from Tucson to the IBM Cloud platform that hosts the Skytap environment.
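
A back-of-the-envelope calculation shows why throughput was such a concern: at a sustained 2 Gbps, moving three (decimal) petabytes takes on the order of 139 days of continuous transfer, suggesting the link had to run nearly flat-out across the project, alongside careful staging and scheduling. The helper below is plain arithmetic for illustration, not part of the Veritas tooling.

```python
def transfer_days(petabytes, gbps):
    """Days needed to move `petabytes` (decimal, 10**15 bytes) of data
    over a link sustaining `gbps` gigabits per second, with no overhead."""
    bits = petabytes * 1e15 * 8
    seconds = bits / (gbps * 1e9)
    return seconds / 86400

print(round(transfer_days(3, 2)))  # -> 139 days of continuous transfer
```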

The other challenge the company overcame was converting the VMs: when VMs move to a new environment, subtle settings such as DNS, host names and network configurations have to be changed for each one.

IBM and Skytap wrote scripts to ensure that the VMs would run in the new environment. It would possibly be the largest migration that the Veritas team had ever undertaken.
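
The actual IBM and Skytap scripts aren’t public, but the kind of per-VM change they describe can be sketched like this. Everything here – the `plan_vm_changes` function, the record format and the subnet-mapping approach – is a hypothetical illustration.

```python
def plan_vm_changes(vm, new_domain, dns_servers, subnet_map):
    """Given an exported VM record, list the per-VM settings that must change
    when it lands in the new environment."""
    changes = {}
    # Rename the host into the target environment's DNS domain.
    changes["hostname"] = f"{vm['name']}.{new_domain}"
    changes["dns"] = dns_servers
    # Remap the old subnet prefix to its counterpart in the new network,
    # keeping the host portion of the address stable.
    old_prefix, host_part = vm["ip"].rsplit(".", 1)
    if old_prefix in subnet_map:
        changes["ip"] = f"{subnet_map[old_prefix]}.{host_part}"
    return changes
```

At the scale of 10,000 VMs, generating these change plans from inventory data and applying them automatically is the only practical approach.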

IBM Cloud Professional Services and Skytap were able to meet these challenges and provide a custom solution for Veritas in the IBM Cloud data center. Users were able to use the old system while the migration was in progress.

The “six-month migration” was completed in just four months.

Transitioning smoothly

The Veritas team marveled that this was “one of the biggest non-events” that they had ever been a part of.

At the beginning, it was slow going because of the technical challenges faced, as well as the planning, testing, and initial work. Things sped up as the project progressed. There were some blips here and there, but given the scope and breadth of this migration, the cutover was great. According to the team, every cutover should be this smooth.

From a user perspective, one of the really attractive things, aside from the overwhelming number of features, is the usability of the tool. Skytap monitors the environment proactively to manage any little hiccup effectively and quickly.

The tool has always been internal-facing only, but Veritas is now evaluating whether to expand it externally.

Wednesday, 21 February 2018

IAmI prevents cyberattacks in real time with IBM Cloud

With ever-advancing cyberattacks, hackers are gaining unauthorized access to networks and secure data at alarmingly high and unprecedented rates. Many organizations don’t even know their network security has been compromised until it’s much too late. Once stolen, their private data is impossible to protect.

Hackers typically begin by acquiring a set of login credentials — the easiest part of the attack — to gain initial access to the network. Once in, they will do surveillance, gathering more information on how to make their way further inside the network until they reach and gain access to the central databases and servers. Then comes the data breach.

Preventing cyberattacks in real-time

IAmI Authentications, an IBM Business and Technology Partner, offers a business-to-business cybersecurity solution that protects enterprise networks and sensitive data from intrusion attacks and breaches in real time. The solution empowers users to protect their own login credentials from hackers who would otherwise try to exploit them to gain unauthorized access. Organizations and users can know their login credentials and secure data are shielded from being nefariously exploited or breached.

Each time a user’s login credentials are used to access the network or any privileged access area(s) across the network, the user receives an authentication request from the IAmI smartphone application prompting them to confirm or deny the authenticity of the login.

Authentication requests are sent through two-factor authentication protocols using push technology, and users are required to respond with one simple touch to the app. The IAmI solution advances all current authentication and identity service methods, as this becomes the new way for “trusted people” to gain access to “trusted networks.”

If it’s an attack, users will immediately identify the intrusion and prevent the hacker from gaining unauthorized access. Just by touching the “deny” button on the app, the attack is thwarted.
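
The confirm/deny flow can be sketched as a simple gate on the login path. This is an illustrative model only – the function name, the queue-based transport and the fail-closed timeout are assumptions, not IAmI’s implementation.

```python
import queue

def authenticate_login(username, push_to_app, timeout_s=30):
    """Gate a login on a real-time confirm/deny from the user's device.

    `push_to_app` sends the push notification and returns a queue on which
    the user's app will answer "confirm" or "deny".
    """
    responses = push_to_app(username)
    try:
        answer = responses.get(timeout=timeout_s)
    except queue.Empty:
        return "denied"  # no response in time: fail closed
    return "granted" if answer == "confirm" else "denied"
```

The important design choice is failing closed: a stolen password alone is useless, because silence or a single tap on “deny” blocks the attempt.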

Zero-minute detection rates

Today, the average detection time surpasses 205 days after an attack has taken place. That’s simply too late. IAmI drastically reduces detection time from many days to zero minutes, as clients benefit from real-time network intrusion monitoring and detection as well as real-time intrusion prevention.

IAmI has not only advanced where other authentication providers have not; the solution also prevents data breaches from occurring by extending protection all the way through to mainframe environments, sensitive databases and servers.

The IAmI solution is reinforced with the power of IBM Cloud, and also includes a lightweight API and proprietary smartphone apps running on Apple Watch, iOS and Android. The entire solution operates strictly on tokens, with no Personally Identifiable Information (PII) ever being exposed or requested.

Partnering with IBM

Creating a much-needed solution to a global epidemic meant IAmI needed the support and reliability of a global leader in emerging cloud technology. That’s IBM Cloud.

IBM has been instrumental to the success and growth of the company. Since coming through the IBM Innovation Space in early 2016, IAmI immediately benefited from the IBM Global Entrepreneur Program and continues today to work with IBM to reach clients, while getting advice and guidance from world class executives.

Being an IBM Business and Technology Partner means that IAmI is able to offer its solution to clients from all parts of the world and also deliver it on the reliability of IBM Cloud.

Thursday, 15 February 2018

How IBM Cloud supports innovative IoT

What is the Internet of Things (IoT), exactly? What are the “things”?

They could be any internet-connected objects that collect and share data using embedded sensors. For example, many of us are familiar with fitness trackers or smart appliances that are controlled by cell phones.

Business Insider predicts that there will be more than 24 billion IoT devices on Earth by 2020. That’s approximately four devices for every human being on the planet.

Here are some innovative uses of IoT you might not have imagined.

IBM Cloud Functions powers the “Internet of Garbage”

The “Internet of Garbage” doesn’t refer to all the ridiculous and inane things one might find on social media. It’s literally about garbage trucks with sensors.

GreenQ has installed sensors on trucks to gather real-time data to optimize the waste collection process. When a waste bin is picked up, the sensors on the truck measure the amount of garbage inside the container and monitor the time and location of the pickup.

There’s a cloud-based system that collects, analyzes and displays the real-time data and analytics. GreenQ calls it the Internet of Garbage.

GreenQ migrated to the scalable IBM Cloud infrastructure, the heart of which is IBM Cloud Functions, an on-demand, serverless platform.
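
IBM Cloud Functions actions follow the Apache OpenWhisk model: a `main` function that receives a dict of parameters and returns a JSON-serializable dict. The pickup-event logic below is a hypothetical sketch of what such an action might look like, not GreenQ’s actual code.

```python
# An IBM Cloud Functions (Apache OpenWhisk) Python action is simply a main()
# function that takes a dict of parameters and returns a JSON-serializable dict.
def main(params):
    """Hypothetical action: ingest one pickup event reported by a truck sensor."""
    weight = float(params.get("weight_kg", 0))
    capacity = float(params.get("bin_capacity_kg", 100))
    fill = min(weight / capacity, 1.0)
    return {
        "bin_id": params.get("bin_id"),
        "fill_ratio": round(fill, 2),
        "needs_more_frequent_pickup": fill > 0.9,
    }
```

The serverless model suits this workload well: each pickup is an independent, short-lived event, so capacity scales with the number of trucks on the road rather than with always-on servers.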

GreenQ helps its customers do the same job for less cost or use the same budget to provide a better quality of service. It might be a matter of different routing, different scheduling, different waste bin mapping or what trucks are used.

Free-floating, cloud-based car sharing app car2go takes the keys

For many city dwellers around the world, owning a car is more of a hassle than it’s worth.

First of all, where do you park it? In New York City, for instance, the cost of a parking spot starts at $100 per month on the low end and runs up to $1 million USD for an underground spot in Soho. In fact, the average cost to buy a parking spot in The Big Apple is over $100,000. With prices like that, who can even afford a car, never mind insurance, maintenance and gas?
With car2go, car ownership isn’t a necessity.

With its business model, car2go offers a new way of renting cars.

When car2go started as a Daimler internal pilot project in 2008, there was no established car-sharing business model. The program embodied IoT before it was a buzzword, integrating vehicles, electronics, software and a smartphone app to let customers get from point A to point B in a smart and elegant way. Thus was born free-floating car sharing.

Drivers can rent a car directly on the street using a smartphone. There’s no rental office involved. There’s no handling of keys. It’s a completely online business.

Intelligent services for elevators and escalators built with IBM Watson

If elevators and escalators do not work properly, it has a significant impact on the way cities function. People may not get to work in their office buildings. They may even have difficulty getting home.

KONE helps people move in and between buildings as smoothly and safely as possible. Globally, the company services more than 1.1 million elevators and escalators, which move more than 1 billion people every day.

That’s why KONE launched “24/7 Connected Services,” which uses the IBM Watson IoT platform to bring intelligent services to elevators and escalators. For example, it helps predict the condition of the elevator or escalator, thereby helping customers manage their equipment over its life cycle.

For people who use elevators and escalators, it means less waiting time, fewer delays, and the potential for new, personalized experiences.

In a first for the industry, KONE is revealing real-time machine conversations between elevators and the IoT cloud. Teams at IBM and KONE worked together to introduce a popular marketing campaign that brings a human touch to intelligent services and demystifies a complex topic.

Wednesday, 14 February 2018

IBM Connections 6.0 Delivers a New Level of Collaboration

Announcing the next release of IBM Connections!

IBM Connections 6.0 brings a range of new capabilities to the end user, creating a richer, more focused and more effective experience. With IBM Connections 6.0 we added new functionality across the board, from Communities to Files, Search and Onboarding, as well as the brand-new IBM Connections homepage – Orient Me.

To highlight the main updates, here is a quick summary:


Advanced Community customization capabilities provide Community owners additional options for designing their custom Community. Community owners can:

◈ Create Community experiences with enhanced rich text content editing (HTML), so that Community members can enjoy an engaging experience.
◈ Choose from new modern layouts with a horizontal navigation bar to better use space on Community pages and match the Community’s purpose more effectively.
◈ Create new Communities faster by choosing from existing layouts, which can help to save time and establish guidelines for Community design.
◈ Reduce clutter on Community pages by hiding a widget while retaining its link in the navigation menu.


◈ Files users can select a top-level folder in their Files and mark it for sync. This permits the users to take the content of entire folders offline to their desktop and keep them synchronized with the files on the server.
◈ For IBM Connections environments with large numbers of documents stored in IBM Connections Files, administrators can now leverage IBM Cloud Object Storage, the hybrid cloud object storage that adapts to your workload needs.

Orient Me

Relevant updates are brought front and center – IBM Connections Orient Me can be used with Connections V6.0 to provide new home page capabilities that apply advanced analytics to surface information and people that are most relevant to an individual. Individuals can take advantage of new capabilities provided by Connections Orient Me:

◈ See, at a glance, the updates and information most relevant to users, displayed in a new visual layout and prioritized based on their interaction with content and people.
◈ Apply new content and people filters to better control what users see.
◈ More easily view updates grouped by a person, a Community, or content.
◈ Receive suggestions about the people most likely to be important and relevant to their work.
◈ See a snapshot of their day in the Action Center, accessible throughout Connections.

Monday, 12 February 2018

Blockchain for the EMS (Electronics Manufacturing Services) Supply Chain

When you’re first exposed to blockchain technology, one question that comes up is where and how to apply it. The answer is often a variation of this phrase: “Blockchain excels at solving issues with ‘high-friction[1],’ multi-party processes where there is a lack of trust.”

The Electronics Manufacturing Services (EMS) industry would appear to check all of those boxes. EMS providers are inserted in their customers’ supply chains (between their parts suppliers and their distribution system), which means their processes are often multi-party (supplier, EMS, and customer plus logistics companies). That often upsets existing relationships, creating a lack of trust and causing friction in the business processes between those parties.

Blockchain technology would appear to be a perfect fit for the EMS industry, and experience seems to confirm this: In 2017 we began to see a high level of interest in blockchain, both from EMS providers themselves and from their customers. We also began to see the first proofs of concept and pilot applications developed. In 2018 I expect the number of new applications to grow, and we should begin to see some of those initial PoCs and pilots turn into production blockchain networks.

So how can blockchain be used to solve problems within the EMS industry? And who (the EMS or the OEM) should sponsor blockchain projects? The answer to those questions depends on who benefits the most, but the obvious answer isn’t necessarily the only option. To be sure, some opportunities may primarily benefit one company (which would normally make them the sponsor of the project). But some issues that are mainly experienced by OEMs may be better solved by the EMS on behalf of multiple customers, instead of just one. In those cases, the EMS may find that their OEM customers may even be willing to pay for a well-considered solution.


The nature of the relationship can have an impact as well. Our mental image of EMS providers may be high-volume manufacturing lines churning out millions of consumer electronic products, but some OEMs utilize external manufacturing to build low-volume/high-value products; others ask their EMS partners to configure or even design to order; and some EMS providers provide extended supply chain services, going as far as to handle their OEM customers’ own customer fulfillment operations.

An EMS may sponsor internal blockchain projects to take cost out of their own operations or improve performance in some way. Some potential examples include:

◈ Using blockchain with less sophisticated component suppliers (or with suppliers in less developed regions) as a secure mechanism to manage the entire purchase-order/response, shipment, invoice, and payment process – including not only the suppliers but also their financial institutions.[2]

◈ Using blockchain to build a trade finance process with its financial partners. EMS providers often operate with limited working capital. They may use innovative ways to maximize their access to cash – but those innovative ways are (currently) almost always built on manual, time-consuming and expensive processes. Blockchain can automate those processes and make it easier (and cheaper) to access critical working capital.

◈ Using blockchain to monitor and manage the inbound (and outbound) flow of materials and finished goods. This is truly mission-critical to EMS companies. Knowing when a crucial shipment of components will arrive at a manufacturing site is essential, as is being able to provide valid delivery dates for customers’ finished products. For inbound use, it’s even possible to build a time-stamped electronic Kanban process.
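The PO-to-payment flow described above can be sketched as a shared, append-only ledger with smart-contract-style state checks. The following is a minimal illustration only: the event names and classes are hypothetical, not an actual Hyperledger Fabric or EMS-specific API.

```python
import hashlib
import json
import time

# Allowed state transitions for a purchase order (hypothetical lifecycle).
VALID_TRANSITIONS = {
    "PO_ISSUED": {"PO_ACKNOWLEDGED"},
    "PO_ACKNOWLEDGED": {"SHIPPED"},
    "SHIPPED": {"INVOICED"},
    "INVOICED": {"PAID"},
}

class PurchaseOrderLedger:
    def __init__(self):
        self.entries = []   # append-only chain of ledger entries
        self.state = {}     # current lifecycle state per PO number

    def record(self, po_number, event, party):
        """Validate the transition, then append a hash-chained entry."""
        current = self.state.get(po_number)
        if current is None:
            if event != "PO_ISSUED":
                raise ValueError(f"{po_number}: first event must be PO_ISSUED")
        elif event not in VALID_TRANSITIONS.get(current, set()):
            raise ValueError(f"{po_number}: cannot move {current} -> {event}")
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = {"po": po_number, "event": event, "party": party,
                   "ts": time.time(), "prev": prev_hash}
        payload["hash"] = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        self.entries.append(payload)
        self.state[po_number] = event
        return payload
```

Because each entry embeds the hash of the previous one, a retroactive edit by any party breaks the chain in a way every other party can detect.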

On the other hand, an EMS’s OEM customer may be motivated to solve different problems using blockchain technology:

◈ Blockchain can be used to provide critical data on the quality of an inbound shipment from the EMS to the OEM, so that the OEM can assess and pre-approve the shipment before it arrives. That can help to improve quality (by giving the quality department enough time to do a full review of the data), as well as reduce cycle time when the product arrives on site. This is especially important during the initial weeks of production. A tool like this can even be used to improve processes when the product is handled by another third party (for instance, a 3PL-managed distribution center) instead of directly by the OEM.

◈ Although EMS providers purchase many components in higher volumes than their OEM customers (and therefore often have stronger relationships with the component suppliers), there are often some suppliers which would rather work with the OEM than the EMS. Those suppliers may offer lower prices, and may be more willing to accommodate scheduling changes, when they deal directly with the OEM. Using blockchain technology to implement a virtual buy/sell process permits the physical supply chain (from component supplier to EMS facility to OEM) to be separated from the financial flow (from component supplier to OEM, potentially to the EMS and back again). This can be a win-win-win: the component supplier and the OEM preserve their strong relationship, while the physical flow is streamlined and efficient.

◈ Blockchain can be used as a secure delivery mechanism for sensitive data – data that may need to be incorporated into products manufactured at an EMS. (Examples may include proprietary software, private ‘keys,’ etc.) Not only does blockchain offer advanced security, but it time-stamps each transaction and its smart contract feature can be configured to optimize both security aspects and manufacturing flow.
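The time-stamped, tamper-evident delivery idea in the last bullet can be illustrated in miniature: only a digest of the sensitive payload is registered on the shared record, while the payload itself travels over a separate encrypted channel, and the receiving EMS verifies what it received against the ledger. This is a sketch with hypothetical names, not a real blockchain client.

```python
import hashlib
import time

# In-memory stand-in for the shared ledger (illustrative only).
ledger = []

def register_payload(payload: bytes, sender: str) -> dict:
    """OEM side: record a time-stamped digest of the sensitive payload."""
    entry = {"digest": hashlib.sha256(payload).hexdigest(),
             "sender": sender, "ts": time.time()}
    ledger.append(entry)
    return entry

def verify_payload(payload: bytes) -> bool:
    """EMS side: check the received bytes against a registered digest."""
    digest = hashlib.sha256(payload).hexdigest()
    return any(e["digest"] == digest for e in ledger)
```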

Finally, farsighted EMS companies may find that they can offer “blockchain applications as a service” to multiple customers, if they build applications designed to solve common problems among companies that utilize external manufacturing. Some ideas here include:

◈ Providing “Transparency” – extensive and up-to-the-minute visibility of exactly what quantities of finished products are available, and where, from the EMS facility through the transportation and logistics process, all the way to the OEM’s distribution network or even to the OEM’s own customers. This can provide real value to OEMs. An OEM can’t “sell what they have” if they don’t know what they have.

◈ Providing detailed provenance of the components that go into an OEM’s product, so that the OEM can (a) ensure compliance with laws and regulations (RoHS, for example); and (b) minimize the cost of any recalls, if it turns out that a particular lot code of a component, from a specific supplier, had a problem of some kind.

◈ Providing greatly improved coordination on such aspects as engineering change notices (ECNs) and Demand Forecasts. Using blockchain processes (possibly in combination with existing EDI or RosettaNet processes), including the smart contract feature, can ensure that the OEM and the EMS stay ‘in sync’ with each other, while still permitting the EMS facility to operate as efficiently as possible.

◈ Going beyond coordination on ECNs, blockchain can also provide a mechanism to coordinate design-to-order and configure-to-order processes between OEM and EMS. Increasingly, EMS providers are supporting their OEM customers with processes to build custom products, down to single-unit lot sizes. This requires a much higher level of coordination (and a sort of ongoing synchronization), which blockchain excels at.
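The provenance use case above boils down to a query over component records. A minimal sketch (field names are hypothetical; a real network would expose this through ledger queries rather than an in-memory list):

```python
# Illustrative ledger-style records linking finished units to component lots.
records = [
    {"serial": "UNIT-001", "component": "cap-10uF", "supplier": "S1", "lot": "L42"},
    {"serial": "UNIT-001", "component": "mcu-x",    "supplier": "S2", "lot": "L07"},
    {"serial": "UNIT-002", "component": "cap-10uF", "supplier": "S1", "lot": "L43"},
]

def units_affected(supplier: str, lot: str) -> set:
    """Return serial numbers of finished units containing the recalled lot."""
    return {r["serial"] for r in records
            if r["supplier"] == supplier and r["lot"] == lot}
```

With this kind of traceability, a recall touches only the units that actually contain the suspect lot, rather than an entire production run.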

Not only does blockchain technology offer a good conceptual fit for the EMS industry, it is also a good match for this industry’s frequently risk-averse approach to investment. A new blockchain project doesn’t require a big up-front commitment. A proof of concept or even a small pilot project can be executed quickly (in a matter of a few months) and at minimal cost, giving the sponsoring company hands-on experience with the application before every step of the process has been finalized and before the user interface has been fully designed. This agile methodology lets companies quickly take a good idea to execution, then refine and iterate it until it is ready for full production.

[1] High-friction is used as a way to summarize all the ways that business processes may be unsatisfactory – too time-consuming, resource-intense, expensive, slow, frustrating, etc. – often because they rely too much on manual process steps, require numerous inspections, have to be redone because of varying information standards, or due to other issues.

[2] There’s generally no need to use blockchain to replace a well-functioning, existing EDI or RosettaNet-based process; but if none exists, blockchain can provide a cost-effective way to automate the entire PO-to-invoice process with suppliers.

Saturday, 10 February 2018

Advent of Cognitive and IoT in Process Manufacturing


Process Manufacturing – typically including refining, petrochemicals and commodity chemicals – has traditionally focused on stability, controllability and optimization. Advanced control techniques and information integration have pushed operations closer to economic constraints while maintaining desired objectives around safety, stability and production. The natural progression in smart manufacturing has been to adopt advanced analytics and enhanced decision support.

As depicted, manufacturing assets and processes have related information ranging from real-time data, operating procedures, and regulatory requirements to new information from sources such as augmented reality. These types of structured and unstructured information remain largely untapped for decision support.


This untapped potential is where newer technologies such as IoT (Internet of Things) and Cognitive applications are beginning to make impacts. These technologies are gathering significant momentum in process manufacturing in the form of pilots and proofs of concept. That is not to say that the penetration is of the same order for both trends. The success of these initiatives depends on the functional areas in which they are applied and the approaches that are used to deploy them.

IoT – as applied to process manufacturing

IoT, ever since the term was coined in the late ’90s, has found broad acceptance as a key transformative enabler of Industrie 4.0. While more pervasive in household appliances (thermostats or garage doors, for example), the industrial version of IoT is still on the uptake. Industrial IoT (IIoT, as it is sometimes referred to) has seen broader acceptance in the automotive and electronics segments, where predictability is of a higher order compared to process manufacturing. Affordability also makes IoT attractive in other industries, particularly those that were not well instrumented to begin with. A refinery, on the other hand, is highly instrumented and integrated, except for the odd ones still out there deciphering pneumatic signals. So the question arises as to what additional information this new technology can bring, how it changes operations, and, most importantly, what the substantiated business case is. When it comes to the terminology itself, there are multiple interpretations of what IoT is and how it is relevant to Industrie 4.0.

So, what is IoT? Specifically, what does it mean for the asset intensive environment of a refiner or chemicals manufacturer?

Frequently, the meaning gets lost because of the way it is applied, the associated business imperatives and just pure buzzword potential. For example, programs such as Manufacturing Operations Management (or MOM, as they are referred to) are tagged as IoT, though they may not have elements of IoT in the strictest sense. Advanced applications that leverage information from sensors and actuators through process control networks have been around for a long time. IoT is often loosely attached to those initiatives as well, more so because of sales tactics or internal business buy-ins.

In a simplistic sense, IoT provides newer and larger volumes of asset-related information, delivered using internet/cloud protocols, that was hitherto not available.

First – the information acquisition, management and delivery…

Traditional sensor information is consumed by DCS or SCADA systems through fieldbus, HART, etc. But today’s customer demands a different engagement – consuming plant information the same way they consume their daily news feeds and sports updates. That requires delivering information to different devices using internet protocols such as HTTP. This is in direct conflict with the DMZ (demilitarized zone) requirements of the process control network. How can sensor information be made available to something outside of the PCN (Process Control Network)? Will that compromise security? The choice of information gathered, and its intent, has a direct bearing on its management and delivery. Adoption of wireless networks in the plant environment is one example that highlights the route the information takes to reach decision support.


Information acquired through IoT-enabled devices is typically kept outside of the PCN and used primarily for decision support. Controllability and asset security are still maintained within the DMZ, so that security doesn’t become a roadblock for IoT adoption.

Second – the volume of information being delivered…

According to multiple industry journals, decision support typically uses no more than 15% of the information gathered from the field. Aggregation and filters have been used as a workaround to mask the inability to handle large volumes of information. High-performance computing, real-time streams and analytics have eliminated that constraint. The information gathered by IoT-enabled sensors is therefore processed, contextualized and made available for consumption by other machines or humans with relative ease. This enhances the productivity of engineers and operators. Optimization targets transition from hourly to every minute. Energy management becomes a real-time endeavor instead of a weekly activity. All of this is likely to take another leap forward with the recent advances in quantum computing.
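The contextualization step described above can be illustrated with a simple per-sensor sliding-window average, the kind of aggregation a streaming layer might apply before IoT data reaches decision support. This is a sketch only; the window size and tag names are illustrative.

```python
from collections import defaultdict, deque

class SlidingAverage:
    """Maintain a rolling average of the last `window` samples per sensor tag."""

    def __init__(self, window: int = 60):
        self.window = window
        # One bounded buffer per tag; old samples fall off automatically.
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def add(self, tag: str, value: float) -> float:
        """Ingest one sample and return the current windowed average."""
        buf = self.samples[tag]
        buf.append(value)
        return sum(buf) / len(buf)
```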

Third – the new type of information that is made available…

Smart as well as traditional sensors typically measure process variables such as pressure, temperature, flow and qualities. Enabling newer devices with IoT technology brings information that was not available until now. Examples such as wireless acoustic monitors for valve leaks, flare monitors in stacks, and remote asset inspections using drones/UAVs provide essential information regarding the assets. The benefits are realized in improved operating conditions, enhanced worker safety and increased productivity. More innovations in sensor technology will deliver additional information from the field that can be consumed without burdening the PCN.

Cognitive – as applied to process manufacturing

Terms such as deep learning and AI have been swirling since the resurgence of cognitive technology that followed Watson’s grand entry on the game show ‘Jeopardy!’. Since then it has been applied with encouraging levels of success in industries such as healthcare, automotive, education and so forth. As with any other technology, it has taken its time to reach the asset-intensive domain of the process manufacturer – the refining, petrochemicals or chemicals plant.

To keep things distinct, cognitive is defined as the contextual intelligence gained from unstructured information regarding the asset(s) or operation(s) in question. Though cognitive analytics are derived from both unstructured and structured data, the focus here is on its uniqueness in being able to handle unstructured information.

The type of information that is mined can be varied…

Traditional information – data – is consumed in real time through the sensor network and DCS at the rate of a few thousand tags per minute. This is further filtered and aggregated to suit the capability and needs of the decision-support entities. It is an understatement that much is lost in the process. A cognitive tool such as Watson can process a million pages per second. So what sort of applications exist in the process manufacturing domain that can leverage this ability? The answer depends on the sort of unstructured information available, its volume and how dynamic it is.

Assets possess various types of unstructured information, including design documents, inspection routines, alarm conditions, maintenance manuals, spares specifications, standard operating procedures, asset correlations, etc. They also include external information from bloggers and forums about experiences regarding the assets.

Such information pertains not only to the assets themselves, but also to the corresponding operating conditions. Examples would include process economics, catalyst usage, feedstock variability, impacts on asset corrosiveness, etc. Depending upon the process, some of this information could be dynamic in nature. Published journals, technical forums, conference proceedings, etc. add to the constantly changing knowledge base regarding the process and/or asset. Borrowing from a healthcare example, a doctor cannot be expected to be on top of every breakthrough in his or her field of interest. The same applies to a planner, engineer or operator when it comes to the process operation in their purview.

Capability Progression – Enabled by Cognitive & IoT

In other similarly asset-intensive industries, the adoption of innovations such as autonomous assembly lines and real-time asset condition monitoring has led to the concept of ‘lights-out manufacturing’ environments in the near future. Automation is enabled not only for physically intensive and hazardous tasks, but is also encroaching on the expertise-centric domain. Irrespective of the source and ingestion of information, cognitive or IoT, the derivative analytics are leveraged in decision support.

Information stored on personal hard drives and in the minds of an aging workforce is the target for extraction toward building an enterprise with systemic, inherent knowledge. To a large extent, the loss of the aging workforce is almost behind us, and the level of accessible expertise-based information is close to an all-time low. To maintain competitiveness and improve key metrics such as safety and productivity, leveraging new technology becomes imperative.

The choice between cognitive and IoT innovations, or a combination of the two, depends on the suitability and needs of the functional area. A planner doesn’t have much use for additional IoT information, but can use cognitive ability to understand feedstock cost and product pricing opportunities. A console engineer can use flare information from cameras, as well as the cognitive ability to identify the operating conditions that induced the flaring, to make mitigating adjustments.


The value of cognitive and IoT innovations in augmenting the experience of operators performing vital tasks in the field, so that they work as efficiently and safely as possible, is beyond debate. The objective, to put it simply, is to make every operator perform like the best operator and every engineer perform with the knowledge of a thousand engineers. Application of cognitive and IoT innovations goes a long way in that direction. A transition from an aging workforce has already occurred in the industry, costing a significant loss of expertise over the last few years. It might sound unfathomable at this point in time, but the tipping point is within sight where the effect of employee attrition on an organization’s knowledge drain starts to diminish.


Adoption of cognitive or IoT technologies delivers capabilities with varying levels of complexity to the enterprise, depending on the area of application. Reliability and maintenance is provided as an example in Figure 4. Asset maintenance and, subsequently, unit operations are driven by effectiveness, utilization and availability. The maturity progression covers a statistical approach (typically through univariate analysis), predictive analytics (based on empirical or rigorous models) and cognitive application (leveraging unstructured information). The first two capabilities – statistical and predictive – are further enhanced by the application of IoT innovation. For example, augmented reality can improve inspection efficiency and expedite risk mitigation actions. For the same asset, a cognitive application might be utilized to look at both internal and external reports about the asset to gain additional insight into mitigating actions.


Without a sufficient level of instrumentation, the integration of information systems is a futile exercise. By the same argument, gaining any reasonable level of predictive capability or operational intelligence is not reliable without the right level of integration. The capability progression would take different technology elements, depending upon the domain of interest. Some examples are provided in the progression curve (Figure 5) as an illustration. Not every functional area needs to aspire to the end of the curve. Business value, team readiness, ability to support and complexity of the solution need to be considered when setting targets along the maturity curve.


When it comes to the adoption of IoT or cognitive innovation, there have been many examples of organizations being unclear as to where they should start, or how comprehensive the pilot should be. The approaches often tend to validate the solution across a functional domain with a limited portion of the technology components. This can lead to an inordinate focus on technology elements, resulting in compartmentalization of sensors, analytics or cognitive elements. True business value remains hidden in such an approach. As a result, operations frequently remain unconvinced by the results of the pilot or POC.

A better approach – well tested – is to choose a use case within a functional domain. Once a business benefit estimate has been developed for the use case, it should be addressed through all the relevant technology components required to deliver it. This changes the priority from ‘testing a technology component’ to ‘validating a business capability’. Process maturity models, benchmarking metrics, business process descriptions, and KPIs collectively help accelerate the process of proving that the solution delivers the desired capability to the organization.

Friday, 9 February 2018

Turn-key Kubernetes with data visualization and analytics

There are many reasons why you might choose Kubernetes as your platform for hosting your application(s). In Cloud Foundry, for each application, the platform provides the isolation, OS, runtime, networking and management capabilities. This opinionated environment is ideal for some use cases. Kubernetes provides similar capabilities but gives you the control of the OS, runtime, networking rules of each service, communication between services in your cluster and more.

Monitoring or diagnosing performance issues or errors with your applications, containers, pods or workers doesn’t have to be a challenge. IBM Cloud delivers a set of integrated tools to remove this hurdle, even for a user with little experience managing clusters. These tools provide a single pane of glass to manage application activity and health across multiple clusters, compute options and/or regions.


Follow this tutorial to go from scratch to having an application running in a cloud-hosted Kubernetes cluster with built-in industry standard open source tools for log aggregation, data visualization, monitoring and alerting. Once you’ve created your cluster on IBM Cloud, this tutorial should take you no more than 30 minutes.


This solution starts off by using the IBM Cloud Developer Tools to generate a starter application. There is no need to write your own YAML files or Helm charts: the starters come preconfigured, ready to use as-is or to customize to your liking. Push the containerized application image to the IBM Cloud Container Registry and create a Kubernetes deployment with a single command.


Then, stand up your Log Analysis and Monitoring services, which ingest data from your cluster. These services are meant to be used with Kubernetes clusters, Cloud Foundry or virtual servers. When your application architecture is composed of several microservices, this provides a single dashboard to analyze logs or metrics across all of them. Keep in mind that data can also be sent from outside of IBM Cloud, so this scales well to architectures that span multi-cloud or hybrid environments.


Wednesday, 7 February 2018

How do I monitor my IBM Cloud applications?

If you are a service owner or first responder, the following questions surely cross your mind each day:


◈ What’s going on with my IBM Cloud application?
◈ Are my customers satisfied with the service they’re getting?
◈ Has performance changed recently?
◈ Is anything unusual happening with my application?
◈ What’s the difference between my on-premise and hosted services?

These questions often boil down to issues of performance and availability of your applications and services. You could use the IBM Cloud console and refresh your browser window every few minutes, but realistically, you need a solution that is automated and efficient.

In order to address the concerns above, we at IBM have defined the discipline of Cloud Service Management & Operations (CSMO) as “all the activities that an organization does to plan, design, deliver, operate, and control the IT and cloud services that it offers to customers.” Within that discipline are practices and toolchains which enable us to perform these tasks.

For this first post in the series, we’ll focus our attention on monitoring applications. Of course, there are tasks beyond monitoring that prompt questions like “If we have detected an issue, how will we solve it and return the application to a fully functional state?” These tasks will be discussed in a future post.

Types of monitoring

Monitoring can be roughly divided into three types:

◈ Metrics – collecting numerical information from the application and platform. This may be a number that is calculated by the application (e.g., how many items are in a queue) or exposed by the platform (e.g., how much memory the process is consuming).
◈ Logging – collecting textual information (e.g., an error message generated by the application).
◈ Synthetic monitoring – sending an external message to the application and examining the response to determine the component’s status (e.g., sending a ping to a server or simulating an entire customer transaction).
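A synthetic monitor of the kind described in the last bullet can be as simple as timing an HTTP request and checking the response code. The following is a minimal sketch using only the Python standard library; the probed URL is whatever endpoint you choose, not a specific IBM Cloud API.

```python
import time
import urllib.request

def probe(url: str, timeout: float = 5.0) -> dict:
    """Send one synthetic request and report health plus latency."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Any 2xx response counts as healthy for this simple check.
            healthy = 200 <= resp.status < 300
    except Exception:
        healthy = False  # timeouts, refused connections, HTTP errors
    return {"url": url, "healthy": healthy,
            "latency_ms": round((time.monotonic() - start) * 1000, 1)}
```

A real deployment would run such probes on a schedule from outside the workload and feed the results into the event pipeline.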


Once the monitoring system has discovered that a specific metric has passed a threshold or a log entry matches a test, it will forward an event up the incident management toolchain so that the issue can be solved either automatically or manually.
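The threshold-to-event step can be sketched as a small function that turns a metric sample into an event for the toolchain; the event schema here is hypothetical, not a specific NOI or CEM format.

```python
def check_metric(name, value, threshold):
    """Return an event dict if the metric breached its threshold, else None."""
    if value <= threshold:
        return None
    return {"type": "threshold_breach", "metric": name,
            "value": value, "threshold": threshold, "severity": "warning"}
```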

Choosing the right monitoring for your environment

Today’s cloud environments are not homogeneous; they’re a combination of traditional non-cloud platforms, public and private cloud platforms, and even third-party platforms. This mixing of platforms means your monitoring solution must account for non-cloud and multi-cloud environments. To help guide your decisions, the following table summarizes the practices and tools that apply:


Don’t be intimidated by the size of this table and the plethora of acronyms! The following sections will step through each row and explain them in more detail, going from the bottom row to the top. Let’s begin with the key differentiator, the environment type.

Different types of platforms in hybrid cloud

Below is the bottommost row of the table; it indicates the platform for which the rows above it apply.


◈ Traditional Environment means the on-premise environment of physical or virtualized servers that has been common for years with both automated and non-automated provisioning.
◈ IBM Cloud Private is IBM’s application platform for developing and managing on-premises, containerized applications (PaaS/CaaS).
◈ IBM Cloud is IBM’s one-stop cloud computing solution which provides multiple types of solutions (IaaS/SaaS/PaaS/CaaS/FaaS).
◈ 3rd Party Cloud may be either on-premises or cloud-based, depending on the provider.

The rest of this post will discuss the differences and commonalities of monitoring metrics and logs in the various types of Cloud workloads. We will discuss synthetic monitoring in a future post since, being external to the workload, it is similar in all environment types.

Monitoring the various types of Cloud workloads

Since your organization’s environment is likely to be complex and a hybrid of multiple technologies and cloud types, it is likely that you will need a variety of solutions in order to monitor each service in the best possible way.

◈ Watch the workload’s infrastructure/platform: Infrastructure is divided into on-premise, where you are responsible for the platform down to the hardware, and off-premise/cloud, where your service provider supports the infrastructure and your sole concern is that the platform is available.

◈ Monitor Cloud-Ready workloads: These are workloads that are suitable for running in the cloud, but whose heritage is the traditional environment. Applications running on virtual machines that can be lifted and shifted to the cloud are the classic example of Cloud-Ready applications.

◈ Monitor Cloud-Native workloads: These are workloads that were designed to run in the cloud. Container, runtime and serverless applications are the typical kinds of Cloud-Native workloads.

◈ Collect logs: Since the multiple workloads create a wide variety and large volume of logs, it is critical to have a mechanism to collect and make sense of all the log entries. The collection and aggregation of logs is where problem analysis begins.

Remember that some solutions may be dedicated to specific workloads, but others may monitor multiple workloads. For example, you may use a single instance of Application Performance Manager to monitor both your traditional environment and IBM Cloud Private or install two instances, one for each workload. This decision will be made based on operational considerations and may differ from environment to environment.

Monitoring the cloud platforms

The first level of monitoring is that of the platform (when in the cloud) and the datacenter infrastructure (when on-premises). While each platform and infrastructure usually has a dedicated (siloed) monitoring solution, you can use Netcool Operations Insight (NOI) or Cloud Event Management (CEM) to collect events from these solutions and use Application Performance Management (APM) to monitor them independently.


The following is the list of IBM’s monitoring solutions for cloud platforms:

◈ Application Performance Management (APM) is designed to intelligently monitor, analyze and manage cloud, on-premises and hybrid applications and IT infrastructure. APM can monitor all types of workloads for both Cloud and on-premise applications and infrastructure/platforms.
◈ Netcool Operations Insight (NOI) and Cloud Event Management (CEM) are designed to collect, correlate and consolidate millions of events and alarms from your on- and off-premise environments. You use them to leverage siloed monitoring systems and gather information and events. NOI and CEM have a role in event management and incident management which goes beyond monitoring.
◈ IBM Cloud has a status console that displays the state of the IBM Cloud platform, services and runtimes.

Monitoring cloud ready workloads

Cloud-Ready workloads (virtualized servers, middleware and so on) are also monitored using APM for the application performance and NOI/CEM to collect information from other monitoring solutions.


Those using Cloud Automation Manager (CAM) in IBM Cloud Private can orchestrate and control multiple clouds, but the monitoring of these resources is not performed under IBM Cloud Private itself. In other words, if you use CAM to provision a traditional virtual server within your datacenter, then you will use your traditional solution to monitor the servers and not the IBM Cloud Private monitoring solution.

Monitoring cloud native workloads

Cloud-Native workloads are workloads specifically designed to benefit from the automation and orchestration features that cloud platforms provide. These include containers running under the Kubernetes and Cloud Foundry runtimes in both IBM Cloud and IBM Cloud Private, and IBM Cloud Functions in IBM Cloud.


While the same monitoring solutions used for Cloud-Ready workloads still apply, Cloud-Native workloads have further solutions available:

◈ Prometheus is an open-source systems monitoring and alerting toolkit and, like Kubernetes, a project of the Cloud Native Computing Foundation. It can monitor many kinds of workloads but is mostly used with container workloads. Prometheus comes built in with IBM Cloud Private and can also be deployed manually to monitor IBM Cloud workloads.

◈ IBM Cloud Monitoring automatically collects metric data from IBM Cloud applications and services, eliminating the need for agents. APIs make it easy to add custom metrics and to query your monitoring data. Cloud Monitoring can monitor all types of workloads in the IBM Cloud.

◈ APM for DevOps is a new member of the APM solution suite, dedicated to ensuring the optimal performance of your applications and to making the most efficient use of containerized resources.
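
As a concrete illustration of the Prometheus model mentioned above, the sketch below hand-renders a metric in the text exposition format that a Prometheus server scrapes from an application's /metrics endpoint. In practice you would use an official client library (such as prometheus_client for Python); the metric and label names here are assumptions for illustration only.

```python
# Minimal sketch of the Prometheus text exposition format. A real service
# would use a client library rather than formatting this by hand; this
# version only illustrates what a scraped /metrics payload looks like.

def render_metric(name, help_text, metric_type, value, labels=None):
    """Render one metric in Prometheus text exposition format."""
    label_str = ""
    if labels:
        pairs = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
        label_str = "{" + pairs + "}"
    return (f"# HELP {name} {help_text}\n"
            f"# TYPE {name} {metric_type}\n"
            f"{name}{label_str} {value}\n")

# Hypothetical application metric: total HTTP requests handled.
body = render_metric("app_requests_total", "Total HTTP requests handled",
                     "counter", 42, {"method": "GET"})
print(body)
```

A Prometheus server configured to scrape this endpoint would ingest the counter and make it available for querying and alerting.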

Collecting logs

Due to the dynamism of cloud environments, the collection, aggregation and analysis of logs becomes a cornerstone of the monitoring solution. While Cloud-Ready applications may still write log files to disk and depend on an external collector to read them, Cloud-Native applications usually simply stream messages out; these log entries are lost unless a log collector captures them.
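
The streaming model described above is commonly implemented by writing structured (JSON) log lines to stdout and letting the platform's collector pick them up. A minimal sketch, where the field names ("ts", "level", "msg") are illustrative conventions rather than a requirement of any IBM product:

```python
# Sketch: a Cloud-Native service streaming structured JSON log lines to
# stdout for a platform log collector to capture, instead of writing files.
import json
from datetime import datetime, timezone

def log(level, message, **fields):
    """Emit one log record as a single JSON line on stdout; return the line."""
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "level": level,
        "msg": message,
        **fields,
    }
    line = json.dumps(record)
    print(line)  # the collector tails stdout, not a file on disk
    return line

log("info", "order accepted", order_id="A-1001", items=3)
log("error", "payment gateway timeout", order_id="A-1001")
```

One JSON object per line keeps each record self-describing, so an aggregator can index the fields without parsing free-form text.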


The following is the list of IBM’s solutions for collecting and analyzing logs:

◈ IBM Cloud Log Analysis collects and aggregates application and platform logs for consolidated insights. It enables “zero configuration,” out-of-the-box automated log collection for Cloud Foundry and container workloads, and can collect logs from all types of workloads.

◈ Elasticsearch is the search and analytics engine at the core of the Elastic Stack (previously known as the ELK stack), which enables you to securely and reliably search, analyze, and visualize your data. It is installed as part of IBM Cloud Private.

◈ IBM Operations Analytics – Log Analysis helps turn terabytes of big operational log data into understandable and actionable insights for quicker problem solving and better overall service. It accelerates problem isolation, identification, and repair by providing dashboard views into analyzed sources of log data from solutions and devices across the service management infrastructure.

◈ Log File Agents are components of APM that read and correlate logs.
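
To illustrate the kind of question an aggregated log store like Elasticsearch answers, the sketch below builds a search request body for error-level entries in a time window. The index layout and field names ("level", "@timestamp") are assumptions; the query DSL shape (bool/filter/range/term) is standard Elasticsearch.

```python
# Sketch: building an Elasticsearch search request body to find error-level
# log entries in a time window. Field names are assumed; a real deployment's
# mapping may differ.
import json

def error_logs_query(start, end, level="ERROR"):
    """Return a search body matching log entries of `level` in [start, end)."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"level": level}},
                    {"range": {"@timestamp": {"gte": start, "lt": end}}},
                ]
            }
        },
        "sort": [{"@timestamp": "desc"}],  # newest failures first
        "size": 100,
    }

body = error_logs_query("2018-02-28T00:00:00Z", "2018-03-01T00:00:00Z")
print(json.dumps(body, indent=2))
```

The body would be POSTed to an index's _search endpoint; using filter context (rather than scoring) keeps such queries cacheable and fast.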

Service Management toolchain

While each of the cloud platforms and workloads may benefit from a dedicated monitoring solution, the rest of the service management toolchain benefits from consolidation. For example, it is simpler for the organization to have a single dashboard solution, so that everyone is looking at the same dashboards, and a central ticketing solution to facilitate tracking and transferring tickets within the organization. These considerations are shown in the final and topmost row of the table, Service Management:
