What is Cognitive Computing?

Although computers excel at data processing and calculation, until recently they could not accomplish some of the most basic human tasks, such as telling an apple from an orange in a basket of fruit.
Computers can capture, move, and store data, but they cannot understand what the data mean. Thanks to Cognitive Computing, machines are bringing human-like intelligence to a number of business applications.
Cognitive Computing is a term IBM coined for machines that can interact and think like humans.
In today’s Digital Transformation age, various technological advancements have given machines a greater ability to understand information, to learn, to reason, and to act upon it.
Today, IBM Watson and Google DeepMind are leading the cognitive computing space.
Cognitive Computing systems may include the following components:
· Natural Language Processing – understands meaning and context in a language, allowing a deeper, more intuitive level of discovery and even interaction with information
· Machine Learning with Neural Networks – algorithms that help train the system to recognize images and understand speech (see the sketch after this list)
· Algorithms that learn and adapt with Artificial Intelligence
· Deep Learning – to recognize patterns
· Image recognition – like humans, but much faster
· Reasoning and decision automation – based on limitless data
· Emotional Intelligence
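
To make the neural-network and image-recognition items above concrete, here is a minimal sketch (not from the article) of the idea: a small neural network is trained on labelled example images and then recognizes new ones. It assumes Python with scikit-learn installed and uses the library's built-in handwritten-digit images.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Built-in set of 8x8 handwritten-digit images with known labels.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small feed-forward neural network learns to recognize the digit images.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

# Score on images the network has never seen.
print("Recognition accuracy:", round(model.score(X_test, y_test), 3))
```
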
Cognitive computing can help banking and insurance companies identify risk and fraud. It analyses information to predict weather patterns. In healthcare, it is helping doctors treat patients based on historical data.
Some of the recent examples of Cognitive Computing:
· ANZ Bank of Australia used Watson-based financial services apps to offer investment advice, reading through thousands of investment options and suggesting the best fit for customer-specific profiles, taking into consideration their age, life stage, financial position, and risk tolerance.
· Geico is using Watson-based cognitive computing to learn underwriting guidelines, read risk submissions, and effectively help underwrite.
· Brazilian bank Banco Bradesco is using cognitive assistants at work to help build more intimate, personalized customer relationships.
· Among the personal digital assistants we have Siri, Google Now and Cortana; I find Google Now the easiest to use and the quickest to adapt to your spoken language. There is a voice command for just about everything you need to do: texting, emailing, searching for directions, weather, and news. Speak it; don’t text it!
Big Data gives the ability to store huge amounts of data, Analytics gives the ability to predict what is going to happen, and Cognitive Computing adds the ability to learn from further interactions and suggest the best actions.


Source: customerthink.com-What is Cognitive Computing?

Cloud wars: Google turns aggressive in battle with AWS and Microsoft Azure

Google is getting more aggressive in the cloud as it looks to make up ground on Amazon Web Services and Microsoft Azure.

This may not come as a surprise for many who will have noticed that Google has been pretty active on the acquisition front over the past year, having now spent over $1bn on the likes of Apigee, Orbitera, and other acquisitions.

According to Deutsche Bank estimates, Google Cloud Platform has a $750 million revenue run-rate.

Further findings from the market research study, ‘Google getting more aggressive in the cloud’, predict that GCP is preparing a series of new product announcements in September aimed at strengthening the company’s customer-facing roadmap.

In a total available market of $1 trillion, the combined revenues of AWS, Microsoft Azure, and GCP account for less than $15bn, suggesting that while these companies are big in cloud, there is a lot more market they could grow into.

Deutsche Bank classifies the enterprise IT spending market by combining storage, network, infrastructure software, IT outsourcing and support, data management software, BI and analytics, application software and consulting. The cloud vendors have some capabilities in all of these areas and are looking to build them out.

Google Cloud Platform can be used by developers to tap into AI capabilities.


On the product front, it is expected that GCP will continue to concentrate on machine learning, data analytics and security, including data encryption and identity and access management. The company has already revealed capabilities such as SQL Server Images and the second generation of Cloud SQL.

To support these moves the company is also taking an aggressive approach to building out global infrastructure locations. It announced in its Q4 2015 earnings that 12 new regions would be built in 2016 and 2017.

Of course, Google isn’t alone in expanding its infrastructure footprint for its cloud as both Microsoft and AWS have already made similar moves. Microsoft recently opened a region in the UK and AWS has one planned to open in late 2016 or early 2017.

All of these capabilities and the build-out of infrastructure still rely on someone to sell what the company is offering, which is why Deutsche Bank said that Google is “hiring very aggressively” to increase its enterprise sales rep capacity.

According to the research, these moves appear to be helping Google gain traction among the start-up community. Customers estimated that 25% of start-ups are using GCP today, but 75% are with AWS, so Google still has a long way to go to catch up.

Source: cbronline.com-Cloud wars: Google turns aggressive in battle with AWS and Microsoft Azure

The Role and Value of RPA in Business Process Automation

Over the past several years there’s been a dramatic upswing in awareness of robotic process automation (RPA) in the business world. There’s also been a significant level of this technology integrated into service offerings by BPO providers. But this buzz shouldn’t disguise the fact that most people are just beginning to hear about robotic software and trying to make sense of it.

Ironically, a real challenge in getting knowledgeable about this technology is that the benefits are so compelling while the preconditions are so reasonable. When a manager reads that process costs can be cut by 30% to 60% – if activities are rules-based – the temptation is to stop learning and start doing. Perhaps a quick Google search and a demo download. The better route is to hold off on the demo, stay focused on becoming knowledgeable about the technology, and parlay first moves into smart moves.

A good place to start is by asking the question: why, since business process automation has been available for decades, is robotic process automation causing such a stir? It’s a good question because the answer reveals compelling benefits beyond the “30% to 60%” and “rules-based” sound bites. Once someone understands the full rationale for BPO providers to redesign their service offerings to incorporate robotic software, they’ll be able to make sound judgements for their own company. The answer to the question begins by setting a context for business process automation.

Origins of Business Process Automation

Business process automation has its roots in enterprise resource planning (ERP) systems and business process management systems (BPMS).

First introduced in the early 1990s, ERP systems have a central database that serves a suite of applications for core business processes such as finance & accounting, procurement, HR, etc. The objective was to rationalize and standardize those processes, and then integrate them over a common data set. For example, sales data would be used to forecast demand and order materials; it would also trigger commission calculations that integrated with the payroll application. Drawbacks include: expense; complexity; implementation time; and customization creep.

BPMS emerged as a smaller and more limited automation option for companies that couldn’t afford an ERP investment. It also became an option for ERP customers who found the standardized application suite too confining. For example, many health insurance companies could afford an ERP system but required a specialized claims processing automation system as well. Drawbacks include: expense (though less than ERP); process design complexity; application and data integration complexity; and implementation time.

The difference between the two systems is one of focus: BPMS is focused on optimizing, improving and automating specific business processes. ERP is focused on leveraging centralized data across an integrated suite of core business functions. Despite drawbacks, these two systems remain the foundation for business process automation.

Where RPA fits in Business Process Automation

The fact is that neither ERP nor BPMS – alone or combined – can automate all of an organization’s business processes. The principal issue is ROI. Both ERP and BPMS are expensive and time-consuming to implement. Unless a process employs a critical mass of resources, it cannot justify the automation investment required by either of those systems. The practical reality is that every company has hundreds of processes too small to warrant system automation. The economic reality is that the aggregated inefficiencies of those hundreds of smaller processes are too large to ignore. What to do?

The answer for many companies was business process outsourcing (BPO). In a sense, it was a mirror image of the early “your mess for less” approach used for application outsourcing. BPO providers shadowed and documented the processes, then created a labour arbitrage by shifting the work offshore. The concept worked well over the past ten years. Companies saved double digits in business process costs and BPO providers enjoyed double digit revenue growth.

Labour arbitrage, however, is not scalable – as volume grows, so does headcount. For providers this means flat margins after the first years of productivity gains. For customers – on the other side of the same coin – it means flat savings. Providers lowered onsite headcount more aggressively and optimized to the extent possible, but the situation was essentially static.

During this same ten-year period, desktop automation – starting out as macro scripts – matured into rules-driven desktop robotic process automation, capable of integrating with applications at the user interface layer. Soon many RPA products became able to integrate at other layers as well, using APIs and other technologies.

There were three subsequent RPA innovations that led directly to the current surge in interest and adoption by companies and BPO providers.

  • Enterprise level, server-based robot deployments
  • Centralized dashboard robot management
  • Robot orchestration

These three innovations were game-changers because they redefined RPA from a desktop tool into a server-based, scalable, automation solution. Companies and providers are now able to orchestrate large (dozens or hundreds) groups of robots into workflows capable of addressing large volumes of work.
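
As a rough illustration of that orchestration idea (this is a generic sketch, not any vendor's actual product), the snippet below models a central queue of work items being drained by a group of robot workers; the queue, worker count and processing step are all hypothetical.

```python
import queue
import threading

# Hypothetical sketch: a central queue of pending work items is drained by a
# group of software "robots" running in parallel, the way an orchestrator
# dispatches cases to dozens or hundreds of virtual workers.
work_items = queue.Queue()
for case_id in range(1, 101):       # 100 pending cases
    work_items.put(case_id)

def robot(name):
    """One robot keeps pulling cases from the shared queue until it is empty."""
    handled = 0
    while True:
        try:
            case = work_items.get_nowait()
        except queue.Empty:
            print(f"{name} finished after handling {handled} cases")
            return
        # ... here the robot would validate the case and key it into a system ...
        handled += 1
        work_items.task_done()

robots = [threading.Thread(target=robot, args=(f"robot-{i}",)) for i in range(10)]
for r in robots:
    r.start()
for r in robots:
    r.join()
```
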

Over the past two years the RPA role has shifted to that of an automation solution capable of addressing – both technically and economically – the hundreds of remaining un-automated work processes left behind by ERP and BPMS.

The new value of RPA for the customer lies in the fact that savings are freed from the non-scaling nature of business process outsourcing labour arbitrage. In fact, this automation solution frees processes from labour itself. When the benefits of scalability, accuracy and speed are added to the savings, it becomes easier to understand why so many people are talking about the technology and why BPO providers are revising their service offerings to include it.

The future role and value of RPA lies in the fact that its innovative journey is not over. Industry leaders UiPath and Blue Prism are developing cognitive capabilities for their robotic software. Robots will work with unstructured data (e.g. voice recognition, email, documents) and change rules based on their process experience or additional knowledge. The key for companies and providers will be to proactively implement an automation roadmap based on a well-balanced technology and operational strategy.

Source: professionaloutsourcingmagazine.net-The Role and Value of RPA in Business Process Automation

CIOs still struggling to manage cloud use

 Although companies apply ITSM processes to in-house IT, cloud services are more likely to be managed by providers

Research has revealed that 85 per cent of CIOs think the cloud is preventing organisations from having control over their IT network.

A study by Fruition Partners revealed that cloud services are much less commonly managed by IT service management (ITSM) processes compared to other in-house IT services, which are, on average, managed by six processes.

Much of this is down to IT maturity, Fruition Partners said, but the gap could have a negative impact on the way entire organisations are managed. In fact, 80 per cent of CIOs interviewed by the company said they do not apply the same comprehensive ITSM processes to the cloud as they do to other in-house IT services.

“The maturity of cloud services has started to improve, but it is still leagues away from where it needs to be,” said Paul Cash, managing partner of Fruition Partners UK. “There has to be a recognition that the need for rigorous management is greater, not less, in the cloud.”

Furthermore, the research revealed that 73 per cent of CIOs are happily handing both change management and cloud application management controls over to cloud providers and vendors, which means the IT department as a whole has less control over its organisation’s IT.

He added that CIOs cannot trust that public cloud services will work flawlessly and be delivered perfectly at all times, and therefore should be wary about handing responsibility over to providers without ensuring ITSM principles are applied. If they do not check that there are sufficient safeguards in place, they open themselves up to blame if one of the services fails.

“CIOs should still be managing cloud services internally, rather than abdicating responsibility to the provider. Otherwise they risk losing control, and increasing both cost and risk to themselves and the business,” he added.

Shadow IT is also a concern for CIOs, Fruition Partners’ research revealed: 66 per cent of respondents said there was an increasing culture of shadow IT in their company, and 68 per cent of CIOs reported that the organisation doesn’t ask for advice before buying public cloud-based services.

“Organisations have the tools at hand to ensure IT services are delivered consistently, comprehensively, and without risk. By failing to apply these tools to the cloud, they are doing themselves a major disservice,” Cash continued.

“The longer businesses spend without unifying their approach to both cloud and in-house IT, the harder managing IT will become. Dealing with this is relatively easy in the short term; simply ensuring that ITSM processes are unified across in-house and cloud services will reduce a great number of the challenges and risks associated with cloud.”


Source: cloudpro.co.uk-CIOs still struggling to manage cloud use

How cognitive computing is changing IoT

Cognitive computing means giving computers the ability to work out complex problems for themselves. Just like humans, cognitive computers benefit greatly from experience, learning better ways to solve problems with each encounter. When a traditional system of rules finds a task impossible, cognitive computing sees only an opportunity to expand its knowledge.

The necessity for cognitive computing in the Internet of Things (IoT) arises from the importance of data in modern business. In the smart IoT venues of the future, everyone from startups to enterprises to homeowners will use data to make decisions using facts rather than instincts. Cognitive computing uses data and responds to changes within it to make better decisions on the basis of specific learning from past experiences, compared with a rule-based decision system.

How we define that data is changing, though. Soon, extracting meaning from data will itself require this level of computing, making this new method even more valuable to the development of the IoT.

Cognitive computing implications for IoT

While we are still a long way from talking to our operating systems like they’re our friends, cognitive computing has some immediate applications in the IoT that will allow businesses to use their devices to their fullest potentials.

Consider cognitive computing from a perspective of its immediate return on investment. While no computing system is close to true artificial intelligence yet, breaking up the duties of the cognitive machine into smaller tasks allows it to perform cognitive duties in specific fields with great success. Through bite-sized chunks of cognitive computing such as planning, forecasting, reasoning, and recognizing information such as text and images, companies can incorporate cognitive computing into their existing IoT and immediately reap the benefits.

The banking industry already has several uses for cognitive computing in the IoT, specifically in fraud detection. Previously, detecting fraud relied on rules-based analysis. Is the card being used in another state? Is the card being used for a foreign transaction at an odd hour? With cognitive computing, those rules become small parts of a more comprehensive whole, allowing banks to learn consumers’ spending habits, project the likelihood of future purchases, and put a freeze on a card if the usage pattern indicates it is being used fraudulently.
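
To illustrate the contrast (a generic sketch, not a description of any bank's system), the snippet below compares a fixed rule with an anomaly detector trained on one customer's own transaction history. It assumes Python with NumPy and scikit-learn.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic history for one customer: [amount in dollars, hour of day].
history = np.column_stack([
    rng.normal(40, 15, 500).clip(1, None),   # typical purchases around $40
    rng.normal(14, 3, 500) % 24,             # mostly daytime transactions
])

def rule_based_flag(amount, hour, limit=500.0):
    """Old-style fixed rule: flag only transactions over a hard limit."""
    return amount > limit

# Learned check: an anomaly detector fitted to this customer's own habits.
detector = IsolationForest(random_state=0).fit(history)

def learned_flag(amount, hour):
    return detector.predict([[amount, hour]])[0] == -1   # -1 means "unusual"

txn = (450.0, 3.0)   # a large purchase at 3 a.m.
print("Rule-based flag:", rule_based_flag(*txn))   # False: under the fixed limit
print("Learned flag:   ", learned_flag(*txn))      # likely True: unusual pattern
```
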

As cognitive computing and the IoT grow together, businesses big and small will benefit from the autonomous capabilities of the new technologies.

Improving productivity through technology

In the near future, an IoT powered by cognitive computing will lead a revolution in increased productivity. As more autonomous systems enter the IoT, businesses will need to learn new skills to take advantage of the expanded potential.

Cognitive computing’s ability to forecast more accurately means businesses must become more familiar with anticipatory and predictive systems. As the communication abilities of the technology become more robust, users will need to learn how to respond to and interact with the devices’ queries. Businesses will need to train decision makers in interpreting the advanced data models that cognitive computers can produce in order to reap the full benefits of the technology.

Eventually, cognitive computing in the IoT will lead to products that can make instant, autonomous business decisions without human intervention. From customer interaction to manufacturing and maintenance of equipment, processes that once required guesswork and reactive management will have fact-based, proactive solutions.

The future of data and cognitive computing

Right now, companies don’t have the talent they need to realize the full potential of all the data they measure. Cognitive computing in the IoT will allow data-collection and data-interpreting machines to communicate with one another quickly and completely, opening the door for a surge of new business strategies.

Thankfully, the early generations of these products are already here. Google’s DeepMind is the most visible example, replicating some basic functions of human thought with faster processing speeds to deliver actionable answers to data-based questions. As devices like these become more advanced and more prominent in the business world, companies will be able to test the limits of their application to the extreme, putting data and smart computing to work in ways that will change the landscape of modern business as we know it.

The real benefits in a world of millions of connected devices and sensors come from having a learning engine closer to each sensor, displacing any existing rules. This way, decision-making becomes individual and specific to the sensor or node and purely based on its own experience. For example, in healthcare, health trends and past learning for a specific person are used in decision-making rather than a fixed threshold. The same idea can be applied across other industries as well.

And because all those devices and sensors are interconnected, their exchange of information and collective learning can offset the significant amounts of data and time each node would otherwise need for learning, while also preparing for the dynamic needs of the solution. For example, a particular node exposed to a cyberattack can pass this learning over the network on the fly, which helps safeguard the rest of the nodes.
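
A conceptual sketch of those two ideas (per-node learning, and sharing what is learned with peers) might look like the following; the class, threshold logic and threat signature are illustrative assumptions, not a real IoT protocol.

```python
class SensorNode:
    """Conceptual sketch: each node learns a baseline from its own readings
    and can pass what it learns (e.g. a threat signature) to its peers."""

    def __init__(self, name):
        self.name = name
        self.peers = []
        self.readings = []          # this node's own experience
        self.known_threats = set()  # learned locally or received from peers

    def observe(self, value):
        """Classify a reading against this node's own history, not a fixed rule."""
        if len(self.readings) >= 10:
            mean = sum(self.readings) / len(self.readings)
            std = (sum((r - mean) ** 2 for r in self.readings) / len(self.readings)) ** 0.5
            status = "anomaly" if abs(value - mean) > 3 * std else "normal"
        else:
            status = "learning"     # not enough experience yet
        self.readings.append(value)
        return status

    def learn_threat(self, signature):
        """Record an attack signature and propagate the learning to peer nodes."""
        if signature not in self.known_threats:
            self.known_threats.add(signature)
            for peer in self.peers:
                peer.learn_threat(signature)

# Three interconnected nodes; one node's learning reaches the others on the fly.
a, b, c = SensorNode("a"), SensorNode("b"), SensorNode("c")
a.peers, b.peers = [b], [c]
a.learn_threat("malformed-packet-x42")
print(c.known_threats)   # the signature reached node c without retraining it
```
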

Cognitive computing in the IoT presents as many challenges as it solves, but the challenges will be the kind that businesses want. Rather than worry that they don’t have the talent or resources to collect, read, and act upon their data, companies will soon wonder what to do with the bounty of information and analytics at their fingertips. Luckily, cognitive computing power will be there to help them along the way.

Source: readwrite.com-How cognitive computing is changing IoT

Robotic process automation technology gets to work

When IT veteran Allan Surtees joined Gazprom Energy as head of delivery in 2014, he spent time with each of his business partners at the U.K. gas and electric supplier, talking to them about their perceptions of the IT function and what they wanted from technology. “Their perception was IT didn’t innovate and we didn’t offer them much,” he said.

During his first three months on the job, Surtees also sat down with employees and watched them work. “I observed straightway that a lot of them were doing work with data that gets manually copied and pasted from one system to another. It’s what they call ‘swivel chair’ work — clicking on multiple systems, getting data from one source and putting it into another, where people are actually stuck four or five hours a day just doing this boring, manual nonsense,” he said. “I immediately saw there was an opportunity for RPA.”

RPA, or robotic process automation, has a sexy ring to it these days, especially in the C-suite and company boardrooms. And why not? There’s a lot about this emerging technology to pique a boss’ interest. Robotic process automation technology — defined in simple terms as software that automates other software — promises to improve efficiency, boost productivity and save money by helping with or entirely replacing the manual, routine and often error-prone digital processing jobs still done with human labor at many companies.

Better yet, RPA tools, like the one Surtees put to use at Gazprom Energy, promise to do so without the heavy lifting associated with other types of business software used for automating enterprise work — ERP implementations, for example, or business process management (BPM) suites.

The software robots of the RPA ilk — virtual workers, if you please — interact with computer systems the way most employees do, at the presentation layer through the user interface, requiring minimal code-based programming or deep back-end systems integration. As one vendor put it, RPA software can be a good fit for tasks that wouldn’t be cost-effective to automate through more brute-force means at the SOA layer.
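
By way of illustration only, the sketch below shows what presentation-layer automation means in practice: driving an application through its screen, mouse and keyboard rather than through back-end integration. It assumes the open-source pyautogui library and made-up screen coordinates; the article does not name a specific tool.

```python
import time
import pyautogui  # drives the mouse and keyboard at the user-interface layer

def rekey_order_number(order_number):
    """Re-key a value into another application the way a person would:
    click its input field on screen, type the value, and press Enter."""
    pyautogui.click(x=600, y=400)                 # hypothetical field position
    time.sleep(0.5)                               # give the window time to focus
    pyautogui.write(order_number, interval=0.05)  # type like a human user
    pyautogui.press("enter")

rekey_order_number("ORDER-12345")
```
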

Claims processing, “onboarding” new employees, help desk support, customer service: The makers of RPA platforms say their lightweight tools will automate — and disrupt — how routine work gets done in just about every industry and across departments, from finance to HR to IT operations. In a widely quoted study published in 2013, McKinsey estimated that as many as 140 million “knowledge workers” could be replaced by 2025 through the use of RPA and its AI-centric close relative — cognitive or intelligent automation.

Finding the right RPA target

In the meantime, CIOs interested in using RPA tools for automating IT operations, or who find themselves under pressure from the business to deploy virtual workers, will have their work cut out for them, say analysts who follow the field.

For starters, the term RPA, much like cloud a decade or so ago, is foggy. Robotic process automation is used indiscriminately to cover a wide range of capabilities, from tools that have been around for decades, such as screen scraping and desktop scripting, to the artificial intelligence skills of a Watson, whose 27 or so technologies include RPA. And the capabilities of the products marketed as robotic process automation technology are opaque, according to Gartner analyst Cathy Tornbohm.

“Learning the difference between the plethora of automation and new cognitive tools and how to evaluate the ‘sweet spot’ of each tool in context has proven difficult and confusing,” she wrote in a report on use cases for robotic process automation technology published last year.

But CIOs face bigger challenges with RPA besides sorting through vendor hype, Tornbohm told SearchCIO in a recent phone interview from her home base in the U.K. “What RPA is doing is spotlighting things that should have been automated but haven’t, due to an accident of history,” she said.

Automating such routine, rules-based work was perhaps too costly, complex or time-consuming. Robotic process automation technology can function as a catalyst for digital transformation, Tornbohm said. CIOs could use the “sexiness of robots” to highlight areas that are good candidates for automation and develop an “enterprise automated roadmap” for their business partners that may or may not include RPA.

“What IT can do, very usefully, is help companies understand what should be automated with RPA — and what should maybe be outsourced, or done as a service, or even through another tool that has all the rules written into it.” Another hurdle Tornbohm has observed? “Organizations often have poor insight into ‘what good looks like’ for an end-to-end process and optimal processing options.”

Indeed, matching RPA to the right process is “still an art form,” said Craig Le Clair, principal analyst specializing in enterprise architecture at Forrester Research. Best practices are in their infancy. The immaturity of RPA deployments, he said, means that CIOs are at risk of implementing software without a framework that addresses critical elements for RPA success, including change management; compliance controls; robust user support; and the process links, or “bridges,” to what many believe represent the real payday in business automation — the super cutting-edge cognitive automation tools that combine machine learning, natural language processing, big data analytics and other computing tools to simulate human brain power.

Moreover, in any consideration of RPA, said Tornbohm, CIOs need to provide a reality check on one of robotic process automation technology’s biggest marketing points — that it can be implemented by the business. “These tools are not as easily done by business people as they are purported to be, and you actually do need some coding background to do the scripting. Even writing a macro in Excel is not that easy,” she said. IT needs to monitor the bots, provision the servers, help with security and make sure the solutions are designed well.

Downstream benefits to RPA

Analysts will get no argument on IT’s role in robotic automation projects from Surtees, who came to Gazprom with one RPA implementation under his belt. While at O2, a Telefonica UK company, he worked closely with leading RPA vendor Blue Prism to further develop, test and implement its automation tool.

“This is not a tool the business can just go wild with and start creating processes left and right to automate, because it would be chaos. It has to be properly controlled,” said Surtees, who served as head of IT at Gazprom until July when he left to become director of robotic process automation at Alsbridge Inc., a global sourcing, benchmarking and transformation advisory based in Dallas.

“The people who are trained to be robotic process automation designers, developers, process analysts, whatever, must follow the same delivery rigor as an IT department would,” he said. That includes creating a process design document that meticulously describes what the subject matter expert does on a day-to-day basis. “Sometimes the task is in the subject matter expert’s head, and you have to make sure you capture every scenario, not just the happy parts, as we say, but all possible exceptions.”

At Gazprom, Surtees chose a simple process for implementing robotic process automation technology — the handling of meter readings. “They come in on a single file, and you can have hundreds and hundreds of meter reads coming on a daily basis,” he said. The meter readings are not always accurate or in the right format, and the person handling them often spent about four hours a day on the task, Surtees said — reviewing the meter reads, running a process that marked them as valid or invalid and manually inputting the valid ones into the back-end systems. “They never touched the invalid stuff because they didn’t have time,” he said.

Under the new automated process, which went live in the spring, there is zero human touch. “The files come into the server; the robot goes and gets them, runs the same little program to say whether it is valid or invalid and then processes the valid ones. When it finds incompletions it actually goes to the back-end system or goes externally to find the right data and corrects it internally as well,” he explained.

The invalid meter reads get deposited in a file for an employee to handle. “We re-edited that [part of the] process so the robot … filters them into a certain order as well, making it easier for the person who picks up the invalid processes to work on them,” he said. Anomalies are often resolved by simply calling the customer.
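
A simplified sketch of that flow might look like this in Python; the file layout, field names and back-end call are hypothetical stand-ins, not Gazprom's actual systems.

```python
import csv
from pathlib import Path

def is_valid(read):
    """A read is usable if it has a meter id and a numeric value."""
    value = str(read.get("value") or "")
    return bool(read.get("meter_id")) and value.replace(".", "", 1).isdigit()

def post_to_billing_system(read):
    # Stand-in for the call into the back-end billing system.
    print("posted", read["meter_id"], read["value"])

def process_daily_file(in_path, exceptions_path):
    valid, invalid = [], []
    with in_path.open(newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames or []
        for read in reader:
            (valid if is_valid(read) else invalid).append(read)

    for read in valid:                      # valid reads go straight through
        post_to_billing_system(read)

    # Invalid reads are sorted into a work file so the person picking them up
    # can deal with them in a sensible order, as described above.
    invalid.sort(key=lambda r: str(r.get("meter_id") or ""))
    with exceptions_path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(invalid)

process_daily_file(Path("meter_reads.csv"), Path("exceptions.csv"))
```
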

The benefit was striking. In the first two weeks after the automation went live, the employee was able to validate about 130 invalid meter reads.

“That means we do not have to rebill the customer in those cases,” he said. A service rep takes about six minutes to rebill a customer. Multiply that by the 130 resolved meter readings and it adds up to roughly another 13 hours of work saved. That is a large downstream benefit, Surtees said. He stressed that Gazprom’s RPA journey, as he calls it, is not about eliminating jobs but freeing up people for more valuable work. “That was made very clear upfront. This is about creating more space for people to actually help us grow.”

Gartner outlines use cases and key questions for robotic process automation technology

Use cases for RPA working with structured data

  • Automate an existing manual task or process with minimal process re-engineering
  • Reduce or remove head count from batch data input and output or data rekeying
  • Link to external systems that cannot be connected via other IT options
  • Avoid system integration projects or specific new major application development
  • Replace individual “shadow or citizen IT” desktop automation with enterprise-wide automation

Four questions for evaluating RPA

  • How efficient are your processes?
  • How effective are your processes?
  • What other general technology or business options are available to process this activity?
  • How are IT and process leaders working together to deploy RPA?

Source: “Use Cases for Robotic Process Automation: Providing a Team of ‘Virtual Workers,'” by Gartner’s Cathy Tornbohm

RPA as tactical answer to costly back-end integration

Indeed, one of the mistakes business people make when thinking about robotic process automation technology, according to Gartner’s Tornbohm, is to see it exclusively as a mechanism for cutting costs — labor costs, in particular. “It is more interesting to think of RPA to expand revenue, so, for example, using RPA to make more of your product available on the internet for self-service,” she said.

IT people, especially those with an automation background, come to RPA technology with their own biases, dismissing it as a screen scraping tool when it is “so much more,” Tornbohm said, and they are also predisposed to other automation strategies.

“IT will say, ‘Shouldn’t this be SOA or an API?’ and it probably should, but you’re not going to get around to that in the next five years. If you haven’t been able to address the problem and there isn’t a commercially available tool to deploy in under six months, then RPA can be a great answer,” she advised. “IT can be a great help by not being a barrier.”

Robin Gomez, director of operational excellence at Radial, a business process outsourcing firm, needs no lectures on RPA. He started looking at RPA about four years ago when costlier integration methods were not on the table. The Melbourne, Fla., firm does work for large retailers. Its contact center is a mix of systems it has developed and has acquired, plus the ones clients bring.

“We are the epitome of the swivel chair. We have agents who are handling five to 10 different brands, and each of these brands may have different systems they are leveraging,” Gomez said. It was not uncommon for agents to have 30-plus windows open. Making matters more complex, clients are constantly changing applications as they seek to keep up with the rapidly evolving retail market.

About four years ago he began looking for ways to reduce or eliminate the agents’ manual slog of trawling through screens and data for the right information. While the aim was to bring together all the selling channels for “more of a next-gen CRM functionality” and a so-called 360-degree view of the customer, neither Radial nor its clients were willing to make the investments required for deep integration on the back end, he said.

“The question became how was I going to make it feel like we’re integrated at the desktop [without being] really integrated from the back end,” Gomez said.

Using software from RPA provider OpenSpan, acquired in April by Pegasystems, Radial started with “simple things,” such as assisted customer search. Robotic software captures the data customers have already entered — and often find themselves repeating to an agent, such as their order number — and presents that information to the agent.

“So the agent is engaging with the customer right now, rather than spending upwards of a minute — and in some cases two minutes — just going through the verification process,” he said. The agent’s focus on the customer is key today, as self-service options expand and improve.

“When customers are reaching out to agents, it is because they have a problem. I don’t want our agents struggling through system usage; I want the system to present information automatically for them so they can focus on the issue at hand,” he said.

RPA learnings

Surtees also views RPA software as a substitute and sometimes a stopgap solution for major systems integration work. At O2, a company with deep pockets, architects insisted his team do a proof of concept for a BPM tool, before rolling out RPA. “The costs of taking [the BPM approach] were probably 10 times higher than implementing Blue Prism,” he said. At Gazprom Energy, which is in the midst of a business transformation project, RPA has proved to be a cost-effective tactical solution in the interim, he said.

Still, both RPA enthusiasts caution that robotics automation implementations are not without challenges. For example, after putting robots on a cluster of virtual machines shared by other applications at O2, Surtees learned the RPA software needed its own server with all its virtual machines on it to ensure good performance.

At Radial, network capability was an issue, not so much internally but in supporting the BPO provider’s large work-from-home contingent. “We migrated from a Citrix environment over to a VMware Horizon environment and we’ve had to work through that,” Gomez said, citing initial “slowness challenges.” Browser compatibility was also an issue, and then there are the systems’ nuances to account for, he said. “It’s not as easy as flipping a switch and saying we’ve interrogated the system and now we can automate it; we have to look at the particular system of a client and how we’re going to integrate it across what we’re doing.” Still, RPA work continues apace.

“We’ve automated about 60-plus systems; we still have about 90 to go. So we have a lot of work ahead of us, but at the end of the day we’re getting better at what we’re doing.”

As for cutting jobs, Gomez said he hasn’t. RPA has increased efficiency. The company has been able to reduce its large seasonal ramp-up in hiring, but the core workforce hasn’t been downsized. “I don’t see it as a job-eliminating technology and I don’t think the company does.”

Source: searchcio.techtarget.com-Robotic process automation technology gets to work

A two-speed IT strategy for the digital era

As the business world gets rapidly digitized, the practice of “bimodal IT” is gaining popularity. For CIOs who want to keep up with the latest trends in technology, it is important to understand what the term, coined by Gartner, actually means. Bimodal IT is the practice of maintaining two distinct modes — Mode 1 and Mode 2 — of IT delivery. Mode 1 is more traditional, with a focus on stability, while Mode 2 is more exploratory, with a focus on agility.

In this webcast presentation, Kurt Marko, analyst at MarkoInsights, discusses how the era of digitization is driving the idea of two-speed IT. Read on to find out about the important differences between the two modes, the role of cloud services in a bimodal IT strategy, and why IT shops need to be able to develop digital IT products in an uncertain environment.

Editor’s note: The following is a transcript of the first of four excerpts of Marko’s webcast presentation on two-speed IT. It has been edited for clarity and length.

The term “bimodal IT” was coined by Gartner in 2014. And like many of the buzzwords that Gartner coins, it quickly became a very commonly used phrase, and, of course, they were instrumental in promoting that through their venues or their conferences and white papers.

Unfortunately, I think a lot of people ended up hearing the term “bimodal” — and it does seem sort of stark when you hear it — and it sounds like a psychological disorder. But it’s not any of that, and it’s not nearly as threatening as some people seem to make it out to be. But hopefully after the next 20 or 30 minutes, you’ll understand why.

It really defines two modes of IT where, as I say, they are creatively named Mode 1 and Mode 2. But they’re basically categories to describe different characteristics for IT systems and applications, the operating characteristics, the business requirements, the business environment even.

What is the first mode of two-speed IT?

And the notion is Mode 1 would be what most people would consider traditional IT: your big business systems, your ERP, your finance, your HR, those mission-critical systems that really have defined IT for decades. And IT as a discipline grew up around these systems. In fact, actually the way I got into IT out of product development and engineering was as IT was becoming formalized as a discipline, and it looked like it was an attractive opportunity for me both professionally and just intellectually.

And IT became synonymous with conservative, keep it running, but let’s not take too many risks because the business could be at stake if we mess up. And that worked fine for many, many years. Unfortunately, as the business environments have changed dramatically, through the internet, there have been many catalysts, internet, mobility, consumerization of technology. But … many, many businesses have become IT-centric, where IT, instead of being just an operator of critical infrastructure, is now an instrumental part of the business itself.

What is the second mode of bimodal IT?

And this is where Mode 2 comes in, and it’s a concept Gartner calls the digitization of business. We’ll look at that in a bit. But this type of environment is characterized by digital applications that really take their cue almost from the mobile and the internet world, where things have to be fast, agile, fail-fast, continuous delivery, you have new ideas that need to be tested and implemented at at least a very initial stage rapidly. And it focuses on cloud services and design as vehicles for the deployment. Often, it includes mobile as the target application platform.

And this is partly where some of the confusion about bimodal IT has come in, because cloud services are really instrumental in these Mode 2 applications, because they allow developers and businesses to both create rapidly and deploy at any scale desired. And so many people consider Mode 2 synonymous with cloud, Mode 1 synonymous with on premises. That’s not the case, and we’ll get into some of the subtleties of why in a bit.

Two-speed IT delivery model driven by digital economy

As I mentioned, digital business is driving this notion of two-speed IT. And that’s really what this whole Mode 2-Mode 1 dichotomy is about. It’s about having a set of applications that are run conservatively, but it’s important to understand they’re not on life support. And [then there is] another set of applications which need to be developed, deployed very rapidly but where failure is an option and getting it right the first time or getting it perfect isn’t an absolute requirement.

Explaining this rationale for bimodal IT, Gartner talks about this notion of different eras of IT. And the Mode 1 is characterized by this industrial manufacturing line, kind of the Henry Ford era of IT, where it’s about minimizing risk, planning, being predictable, doing it right, maintaining control, reliability, stability.

However, in this era, this digital business era, which they call a third era, there are very different business drivers, and that prompts different behaviors. They characterize the challenges and activities that IT and businesses need to be capable of as the following five [things]:

  • Absorbing new business models rapidly. Being able to adapt to a changing digital environment.
  • Being able to scale up and down rapidly. Again, this gets to some of the cloud discussion, but the notion is that a successful product may grow demand like a hockey stick, or it may have very seasonal demands.
  • You have to be able to react both over time and to events, as they call them, “business moments,” whether you have a promotional campaign that’s going to coincide with the Super Bowl or the Oscars or some big event that you’re focusing on, or you have a product that gets mentioned on Oprah Winfrey and suddenly demand has shot through the roof.
  • You also have to be able to support different business models. And as we know, in the internet and the digital business era we’re in now, whether it’s sharing services like Uber and Airbnb, or different distribution channels for media like Spotify or digital content books, you have to be able to support different ways of monetizing your digital assets.
  • And you have to be able to do this in an environment where you don’t necessarily understand ahead of time what’s going to work and what’s not. And so you have to be comfortable developing digital IT products in this environment of uncertainty.

Source: searchcio.techtarget.com-A two-speed IT strategy for the digital era

ITSM: The ‘what’ and ‘how’ of digital transformation

When it comes to digital transformation, the question is not just whether organisations are ready for it but how they will carry it out.

Digital transformation is defined as the changes associated with the application of digital technology in all aspects of society – from government to mass communications and from art to medicine and science.

Two steps to digital transformation

In business there are two steps to every digital transformation plan: the ‘what’ and the ‘how’. The ‘what’ are the digital initiatives needed to achieve transformational change and the ‘how’ is ensuring this transformational change delivers business results.

We recently conducted a survey at the SITS 2016 conference in London, which showed that mobility and IT service management (ITSM) are key to executing digital transformation strategies. Over half the SITS survey respondents, who were mostly CIOs and business decision makers, said they are ‘nearly ready’ for digital transformation, followed by 19 per cent who said they are ‘neither ready nor not ready’. In contrast, only 16 per cent said they were ‘completely ready’.

When it comes to digital transformation, mobility is key. While most organisations have implemented an enterprise mobility strategy, risk and security concerns are still preventing them from realising value from it. Having the right tools to manage risk and ensure security has never been more important.

Over half of respondents said mobile ITSM is ‘somewhat important’ to their digital transformation plans, while just over a quarter said it’s ‘highly important’. In the middle, just 12 per cent said mobile ITSM was ‘neither important nor unimportant’, while the naysayers who said it was ‘not very important’ or ‘not relevant’ accounted for 7 per cent and 3 per cent respectively.

Some of the key reasons why ITSM is changing the workplace of the future include the fact that service quality and customer satisfaction have become the biggest technology priorities for organisations. ITSM is moving away from its ‘budget-driven’ IT legacy whilst transforming IT departments into service brokers.

Despite the recognition of the importance of mobile ITSM by the majority of respondents, 34 per cent said they don’t have the right mobility and ITSM tools to operate in a digital world. A plurality (48 per cent) said they do have the right tools, while 19 per cent said they weren’t sure.

In addition to the tools, organisations also need to ensure that they recruit the right people skilled in mobile app provision, mobile app development and management and mobile content management in order to accelerate their mobile strategies.

Given organisations are faced with new challenges introduced by mobility such as mobile device management (MDM) and enterprise mobility management (EMM), they need the right tools.

Device and application management solutions can benefit businesses in numerous ways, including enabling them to automate business process flows and approvals that are mobile-first in approach, ensure licence compliance across multiple operating system platforms and make real-time cost allocations of apps and services.

ITSM: A key investment

ITSM is also one of the key spending areas, the survey found. The majority of respondents said budgets are likely to increase over the next 12 months for security, followed by ITSM, compliance, mobility, hardware, OS upgrades and ID and access management. However, in all of those categories, some respondents said spending is likely to stay the same. Participants said the biggest decrease in spending will be in hardware and OS upgrades.

The majority of respondents said they will be working on ITSM projects over the next 12 months, followed by security and mobile workforce management.

So why does ITSM rank so highly with respondents in terms of spending? They outlined the key benefits of ITSM as:

  • Improvement of service
  • Reduction of costs
  • Improvement of self-service
  • Automation

By helping businesses improve their processes, ITSM meets some of the main objectives of digital transformation, which include driving employee productivity and improving innovation. In the 21st Century, it’s all about collaboration, which is why a human-centric approach to digital transformation is vital. This goes beyond automation replacing manual tasks to equipping employees with the right information at the right time and on the right platform whether that’s their laptop, tablet or mobile phone.

So it’s not just merely a question of whether businesses are ready for digital transformation but how they’re going to execute it.

Source: ITproportal-ITSM: The ‘what’ and ‘how’ of digital transformation

Kofax Advisor | 5 Key Concepts For Robotic Process Automation

I recently attended sharedserviceslink’s analytics and robotics conference in Dallas, where we explored how applying analytics and robotic process automation can improve finance and business decision making.

This was a great opportunity to hear from companies concerning their experiences with robotic process automation (RPA) and where they are on their journey. Many attendees were relatively new to RPA and in the early stages of understanding it. In fact, according to sharedserviceslink’s pre-survey, 39% of attendees did not yet have a defined strategy for RPA.

One company that presented was definitely farther along on the journey than others, and a few attendees I talked with were hoping to kick off a project in 2016.

If you are curious about RPA and wonder how it might benefit your organization, here are five things I learned from the event that you may find helpful.

  1. Many organizations are still in the early stages of understanding what “robotic process automation” is and how it can help them. The market is relatively new and there’s still plenty of time to learn what RPA is all about. First, remember that “robotic process automation” does not involve physical robots; it involves automation software that runs on either a desktop or server. Second, business and IT need to work together to understand where and how RPA can best solve the many integration and automation challenges faced by enterprise organizations.
  2. Your business process outsourcers (BPOs) are likely using some form of RPA technology today. Robotic automation originated in the BPO market. Now companies are seeing the value of what it could mean to their internal processes as well. One company discussed how they plan to support a hybrid model where some work will continue to be outsourced, and other internal processes will be reviewed, and hopefully automated using RPA.
  3. Understand the process well and why employees perform certain steps along the way. It is important to know why an employee performs multiple repetitive tasks to complete a business activity. As one speaker said, it makes no sense to automate five tasks when only three of the tasks make sense. Some employees do crazy things when trying to make their job easier. So don’t take an overly complicated series of human tasks and simply apply robotic automation software.
  4. “Software robots” work like people and can make errors like people, so it is important to pay attention to the details. You may think RPA is automated and perfect. A robot can automate virtually any repetitive task a user performs and do it with 100% accuracy, but if you build a process flow (software robot) to do something wrong, then the automation will repeat that error every time.
  5. The benefits of RPA are much more than just cost savings. While companies can reap cost savings, that is not the only business driver for investing in RPA. Think about your employees and break down the value of their work tasks into three categories: low, medium, and high. RPA automates the low-level tasks so your employees can focus on higher-value tasks.

Over the years, many companies we’ve worked with have realized efficiency gains and the ability to deliver a better customer experience. In the case of logistics and transportation, customer service reps can spend more time servicing clients and speeding up collection disputes, and less time performing data entry work between internal systems and partner/customer web portals.


Source: Kofax Advisor – 5 Key Concepts For Robotic Process Automation

Despite progress, the future of AI will require human assistance

Part one of this story on the future of AI explained how technology developments have led to a resurgence in a field that has progressed in fits and starts since the 1950s. Today’s cheap storage and amped-up and inexpensive compute power, combined with an explosion in data, have revived an interest in deep learning and “neural nets,” which use multiple layers of data processing that proponents sometimes liken to how the brain takes in information.

The field is red hot today, with Google, Facebook and other technology giants racing to apply the technology to consumer products. In the second part of this story, SearchCIO senior news writer Nicole Laskowski reports on where two AI luminaries — Facebook’s Yann LeCun and Microsoft’s Eric Horvitz — see the trend going.

Like Microsoft, IBM and Google, Facebook Inc. is placing serious bets on deep learning, neural networks and natural language processing. The social media maven recently signaled its commitment to advancing these types of machine learning by hiring Yann LeCun, a well-regarded authority on deep learning and neural nets, to head up its new artificial intelligence (AI) lab. A tangible byproduct of this renewed focus on neural nets is Facebook’s digital personal assistant, M, which rolled out to select users a few months back.

Today, M’s AI technology is backed by human assistants, who oversee how M is responding to queries (such as placing a take-out order or making a reservation) and can step in when needed. According to a Wired article, the AI-plus-human system is helping Facebook build a model: When human assistants intervene to perform a task, their process is recorded, creating valuable data for the AI system.

Once enough of the “right data” is collected, M will be built on neural nets, which is where LeCun’s team comes in. But even as the AI behind M advances to neural nets, humans will need to be in the loop to continue training the technology, according to the article.

Still work to be done

That’s because M, like most contemporary AI systems, is a supervised learning system, which means the system can’t teach itself. Instead, if LeCun wants an algorithm to recognize dogs, he has to feed it examples of what dogs look like — and not just a handful of examples. “You have to do it a million times,” LeCun said at EmTech, an emerging technology conference hosted by the MIT Technology Review. “But, of course, humans don’t learn that way. We learn by observing the world. We figure out that the world is three-dimensional, that objects move independently. … We’d like machines to be able to do this, but we don’t have good techniques for this.”

Building machines that have a kind of artificial common sense, according to LeCun, will be the big challenge for the future of AI. “It’s done by solving a problem we don’t really have good solutions for yet, which is unsupervised learning,” he said.

One of the ways Facebook (among others) is trying to insert rudimentary reasoning into AI systems is with vector embedding, where unstructured data is mapped to a sequence of numbers that describe the text or object in detail. LeCun said the process brings together perception, reason, perspective and language capabilities so that if the algorithm encounters an unfamiliar word or image, it can make an educated guess by comparing and contrasting the rich mathematical descriptions of the unknown against the known. One of his clearest explanations about how vector embedding works had to do with language translation: “We can take two pieces of text, one in French and one in English, and figure out if they mean the same thing,” he said.
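
As a toy illustration of that comparison step (the vectors below are made up; real embeddings are learned from data), two sentences that mean the same thing should have vectors that point in nearly the same direction, which cosine similarity measures.

```python
import numpy as np

def cosine_similarity(a, b):
    """1.0 means the two embedding vectors point the same way (same meaning)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Made-up 4-dimensional "embeddings"; real systems learn hundreds of
# dimensions from data, but the comparison step works the same way.
english = np.array([0.90, 0.10, 0.40, 0.05])        # "the cat sits on the mat"
french_same = np.array([0.88, 0.12, 0.35, 0.06])    # "le chat est assis sur le tapis"
french_other = np.array([0.05, 0.90, 0.10, 0.70])   # "il pleut beaucoup aujourd'hui"

print("same meaning:     ", round(cosine_similarity(english, french_same), 3))
print("different meaning:", round(cosine_similarity(english, french_other), 3))
# The sentence pair with the same meaning scores much closer to 1.0.
```
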

Facebook is not alone in taking this approach to improving AI. A recent article in ExtremeTech described the “thought vector” work Google’s doing as a way of training computers to comprehend language, which they are incapable of doing now.

The future of AI

But language comprehension is a far cry from machines that can perform the same intellectual tasks that humans perform. Developing a common sense program, or “artificial general intelligence,” as it’s called, is still a ways off, said LeCun, who shared the EmTech stage with Eric Horvitz, director at the Microsoft Research laboratory in Redmond, Wash. “If Eric were to grab his water bottle and walk off stage, you could close your eyes, be told that, and picture all of the actions he’d have to take to do that.” AI machines, on the other hand, can’t.

Sci-fi films such as Her and Ex Machina may give the impression that the future of AI is conscious machines, but LeCun and Horvitz described generalized intelligence as a really hard problem to solve. “We’re nowhere near being able to think that through,” Horvitz said. “I do think that with the great work on the long-term road toward intelligence, we’ll have competencies, new kinds of machines, and it may well be that deep competency is perceived as consciousness.”

One of the basic obstacles Horvitz is interested in solving is a classic IT problem: AI technologies were essentially built in silos. For systems to become more powerful, they’ll likely need to be knitted together. “When I say we’re trying to build systems where the whole is greater than the sum of its parts, we sometimes see surprising increases in competency when we combine, for example, language and vision together,” said Horvitz, who recently launched (and, along with his wife, is funding) the One Hundred Year Study on Artificial Intelligence at Stanford University, an interdisciplinary research effort on the effects of AI.

Exactly how AI systems should be integrated together is still up for debate, “but I’m pretty sure that the next big leaps in AI will come from these kinds of integrative solutions,” Horvitz said. Silo busting may not be as sexy as creating conscious machines, but it could lead to what Horvitz called a “symphony of intelligence.”

“With every advance, and particularly with the advances in machine learning and deep learning more recently, we get more tools. And these tools are often really interesting components we can add to older components and see them light up in new ways,” he said.

Source: searchcio.techtarget.com-Despite progress, the future of AI will require human assistance