Artificial intelligence, cognitive computing and machine learning are coming to healthcare: Is it time to invest?

With Google, IBM and Microsoft all setting sights squarely on healthcare, and analysts predicting 30 percent of providers will run cognitive analytics on patient data by 2018, the risk of investing too late may outweigh the risk of doing so too soon.

The arrival of artificial intelligence and its ilk — cognitive computing, deep machine learning — has felt like a vague distant future state for so long that it’s tempting to think it’s still decades away from practicable implementation at the point of care.

And while many use cases today are admittedly still the exception rather than the norm, some examples are emerging to make major healthcare providers take note.

Regenstrief Institute and Indiana University School of Informatics and Computing, for instance, recently examined open source algorithms and machine learning tools in public health reporting: the tools bested human reviewers in detecting cancer from pathology reports, and did so faster.

Indeed, more and more leading health systems are looking at ways to harness the power of AI, cognitive computing and machine learning.

“Our initial application of deep learning convinced me that these methods have great value to healthcare,” said Andy Schuetz, a senior data scientist at Sutter Health’s Research Development and Dissemination Group. “Development will be driven by our acute need to gain efficiency.”

Schuetz and his colleagues are not alone. By as soon as 2018, some 30 percent of healthcare systems will be running cognitive analytics against patient data and real-world evidence to personalize treatment regimens, according to industry analyst firm IDC.

What’s more, IDC projects that during the same year physicians will tap cognitive solutions for nearly half of cancer patients and, as a result, will reduce costs and mortality rates by 10 percent.

Race is heating up
IBM’s Watson is the big dog in cognitive computing for healthcare, but the race is on and the track is growing increasingly crowded.

IBM rivals Dell and Hewlett-Packard are readying systems to challenge Watson, while a range of companies including Apple and Hitachi Data Systems are each taking their own tack toward AI, cognitive computing and machine learning.

A report from Deloitte in 2015 rattled off a list of other competitors, including: Luminoso, Alchemy API, Digital Reasoning, Highspot, Lumiata, Sentient Technologies, Enterra, IPSoft and Next IT.

And late last month Google and Microsoft battled it out when Google unwrapped its Cloud Machine Learning and Microsoft shot back that same week with big data analytics of its own and the new phrase “conversational intelligence” to describe its new offerings.

So don’t expect Watson to be the only “thinking machine” option moving forward.

Hurdles ahead
Among the obstacles that healthcare organizations and the intrepid technology vendors trekking toward AI, cognitive computing and machine learning will have to overcome, one looms largest: data.

Data is always going to be an issue for both healthcare providers and technology vendors. Collecting it, storing it, normalizing it, tracing its lineage and the critical – if not particularly sexy – matter of governance are all necessary so providers can harness cutting-edge software and hardware innovations to glean insights that enhance patient care.

“Translating data into action — that is the hard part, isn’t it?” said Sarah Mihalik, vice president of provider solutions at IBM Watson Health.

Achieving the transformative potential of AI, she added, is also going to require a shift in mindset and practice in how providers embrace technologies and acquire talent.

The right data is essential to solving many of today’s problems but the information itself does not a lasting strategy make.

“Analytics is just one part of an overall data strategy,” said Nicholas Marko, MD, chief data officer at Geisinger Health System.

Other key pieces include: business intelligence, enterprise data warehouse, infrastructure, privacy and security.

“If you’re not focusing on how these pieces are all in motion then invariably you’re going to hit some kind of bottleneck,” Marko said. “The strategy has to be a dynamic living thing, not something you just put down on paper. There is not some secret sauce that allows you to lay down an analytics strategy. It’s a lot of hard work. Nobody has the magic solution.”

Not even technology titans.

First-mover advantage
AI, cognitive computing and deep machine learning are still nascent technologies, but consultancies are advising healthcare organizations to begin working with them now rather than waiting.

“The risk of investing too late in smart machines is likely greater than the risk of investing too soon,” according to a report from Gartner Group.

There’s little arguing that the degree of complexity around big data in healthcare is exactly why clinicians, physicians and, indeed, the industry at large need these emerging technologies, which have felt so far away for so long.

“I have no doubt that sophisticated learning and AI algorithms will find a place in healthcare over the coming years,” Sutter’s Schuetz said. “I don’t know if it’s two years or ten — but it’s coming.”


Source: m.healthcareitnews.com – Artificial intelligence, cognitive computing and machine learning are coming to healthcare: Is it time to invest?

Digitisation connects customers to companies on a new level

‘Digital’ is one of those trendy terms that seems to mean different things to different people depending on the point being made or the corner being fought.

But its usage also seems to depend on which function you work for. Marketing, for example, tends to apply it to how customers engage with the brand, and the proposition that brand offers to its customers via online channels.

IT, on the other hand, generally uses digital to refer to technology and platforms such as cloud, mobile and big data, which tend to be linked to the internet in various ways and give users access to new products and services.

But what is common to all these definitions is that digital provides more ways for people and businesses to connect and be connected than ever before. This situation has led consumers to become much more vocal and demanding about what they want, giving them more control over their own customer experience.

A third characteristic of digital relates to market disruption. Organisations such as Uber and Airbnb, for instance, have become poster children for the digital age after undercutting incumbents in the taxi and hotel industries, respectively, shaking up these markets in the process.

Meanwhile, as to where the average UK company is in terms of embracing digital transformation, it appears there is still quite a lot of work to be done. According to management consultancy PA Consulting’s Digital Barometer, which looks at how organisations are adapting to the digital age, 78% of the 400 firms questioned classed themselves as somewhere between “digital dabblers” and “digitising today”.

Of that group, 62% were keen to find ways to create new products, services and even business models using digital means. On the downside, though, only 18% said they understood what going down this route would mean for their organisation and less than 30% felt they had the right approach internally to succeed. Legacy technology and a lack of appropriate skills were also seen as a drag on change.

Leap in awareness

But PA Consulting’s Kevin O’Shaughnessy says that over the past 12 to 18 months he has seen a leap in awareness of the issues, especially at senior levels, and that industries such as retail, travel and retail banking are now “moving quite fast”.

“For incumbents, the drivers are defending their customer base, cutting costs and making themselves appear more relevant and modern,” says O’Shaughnessy. “New firms, on the other hand, are trying to steal market share and scale up to become bigger. So there is a polarisation there.”

While the low capital investment required to employ online technologies and cloud services is making it easier for startups to make money out of their new ideas, more established firms are finding that they need to experiment to get it right.

William Fellows, research vice-president at analyst firm 451 Research, says: “The biggest challenge for many companies is cultural – organisational resistance, conventional budgeting and the like. There is also the issue of where you put your money down and which digital initiative is the best use of your resources.”

A key consideration, whether the company culture supports it or not, is how to introduce change quickly rather than risk being left behind. As a result, an increasingly common approach is to set up joint ventures with third parties, create internal incubator/accelerator models to support new ideas, or even just set up a skunkworks project, says Fellows.

“It’s not really a technology problem,” he says. “There’s lots of technology out there already. It’s really more about how you use and implement it in order to take advantage of new opportunities.”


Source: computerweekly.com-Digitisation connects customers to companies on a new level

Will a robot take your job?

About 35% of current jobs in the UK are at high risk of computerisation over the next 20 years, according to a study by researchers at Oxford University and Deloitte.

Sources

‘The Future of Employment: How susceptible are jobs to automation’. Data supplied by Michael Osborne and Carl Frey, from Oxford University’s Martin School. Figures on UK job numbers and average wages from the Office for National Statistics and Deloitte UK.

Methodology

Oxford University academics Michael Osborne and Carl Frey calculated how susceptible each job is to automation based on nine key skills required to perform it: social perceptiveness, negotiation, persuasion, assisting and caring for others, originality, fine arts, finger dexterity, manual dexterity and the need to work in a cramped work space.

The research was originally carried out using detailed job data from the United States O*NET employment database. The analysis for UK jobs was made by adapting the findings to corresponding occupations in the UK based on Office for National Statistics job classifications. For the purpose of the UK study, some US occupations were merged. In these cases, the probabilities were calculated as weighted averages of the probabilities of automation for each US occupation within the group.
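
To make the merging step concrete, here is a minimal sketch of that weighted-average calculation; the occupations, probabilities and employment weights are invented for illustration (the study itself does not publish this code):

```python
# Hypothetical example of the weighted-average step described above: two US
# occupations merged into one UK occupation, weighted here by employment
# counts (an assumption for illustration; all figures are invented).
us_occupations = [
    # (automation probability, number employed)
    (0.83, 120_000),
    (0.47, 30_000),
]

total_employed = sum(n for _, n in us_occupations)
uk_probability = sum(p * n for p, n in us_occupations) / total_employed
print(f"Merged UK occupation automation risk: {uk_probability:.3f}")  # 0.758
```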

Some job names have been edited for clarity. Where average salary has been mentioned, the median has been used. Figures are not available for occupations in the military, or for politicians.

*Where two jobs have the same figure for their risk of automation but are ranked differently this is because the data goes to more than one decimal place.

Source: BBC-Will a robot take your job?

These 4 Major Paradigm Shifts Will Transform The Future Of Technology

For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.

Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which much of what we’ve come to expect from technology will be undone. What replaces it will be truly new and different.

Over the next decade, Moore’s Law will end. Instead of replacing manual labor, technology will automate routine cognitive work. As information technology fades into the background, second order technologies, such as genomics, nanotechnology and robotics will take center stage. Here are the four major paradigm shifts that we need to watch and prepare for.

From The Chip to The System

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years. He also predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.

That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.

Yet Moore’s Law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck: simply put, it doesn’t matter how fast chips can process if they have to wait too long to communicate with each other.

So we have to shift our approach from the chip to the system. One approach, called 3D stacking, would simply combine integrated circuits into a single three-dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could increase speeds significantly and allow progress to continue.

From Applications To Architectures

Since the 1960s, when Moore wrote his article, the ever-expanding power of computers has made new applications possible. For example, after relational databases were developed in 1970, it became possible to store and retrieve massive amounts of information quickly and easily. That, in turn, dramatically changed how organizations could be managed.

Later innovations, like graphic displays, word processors and spreadsheets, set the stage for personal computers to be widely deployed. The Internet led to email, e-commerce and, eventually, mobile computing. In essence, the modern world is little more than the applications that make it possible.

Until now, all of these applications have run on von Neumann machines—devices with a central processing unit paired with data and applications stored in a separate place. So far, that has worked well enough, but for the things we’ve begun asking computers to do, like powering self-driving cars, the von Neumann bottleneck is proving to be a major constraint.

So the emphasis is moving from developing new applications to developing new architectures that can handle them better. Neuromorphic chips, based on the brain itself, will be thousands of times more efficient than conventional chips. Quantum computers, which IBM has recently made available in the cloud, work far better for security applications. New FPGA chips can be optimized for other applications.

Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.

From Products To Platforms

It used to be that firms looked to launch hit products. If you look at the great companies of the last century, they often rode to prominence on the back of a single great product, like IBM’s System/360, the Apple II or Sony’s Walkman. Those first successes could then lead to follow ups—like the PC and the Macintosh—and lead to further dominance.

Yet look at successful companies today and they make their money off of platforms. Amazon earns the bulk of its profits from third party sellers, Amazon Prime and cloud computing, all of which are platforms. And what would Apple’s iPhone be without the App Store, where so much of its functionality comes from?

Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms, because they give it access to capabilities far beyond its own engineers.

The rise of platforms makes it imperative that managers learn to think differently about their businesses. While in the 20th century, firms could achieve competitive advantage by optimizing their value chains, the future belongs to those who can widen and deepen connections.

From Bits To Atoms

In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, like there was in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.

Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been fairly narrow. The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.

Still, there are indications that the future will look very different from the past. Digital technology is beginning to power new areas in the physical world, such as genomics, nanotechnology and robotics, that are already having a profound impact on such high-potential fields as renewable technology, medical research and logistics.

It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.

In an age of disruption, the only viable strategy is to adapt.

Source: Forbes-These 4 Major Paradigm Shifts Will Transform The Future Of Technology

What is machine learning?

Machine learning is the process of building analytical models that automatically discover previously unknown patterns in data: associations, sequences, anomalies (outliers), classifications, and clusters and segments. These patterns reveal hidden rules about why an event happened—for example, rules that predict likely customer churn. Businesses can put machine learning to several kinds of use (a minimal code sketch follows this list):

  • Segmentation, or grouping sets of customers who have similar buying patterns for targeted marketing
  • Classification based on a set of attributes to make a prediction—for example, propensity to buy, customers with insurance policies likely to lapse and equipment failure that triggers preventive maintenance
  • Forecasts—for example, sales projections based on time series
  • Pattern discovery that associates one product with another to reveal cross-sell opportunities and sequences—for example, products that sell together over time
  • Anomaly detection—for example, detecting fraud
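
As a concrete taste of the classification use case above, here is a minimal sketch in Python using scikit-learn on synthetic data; the attributes and labels are invented, and the article itself names no specific tooling:

```python
# A minimal propensity-to-buy classifier on synthetic customer attributes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 3))                    # e.g. recency, frequency, spend
y = (X[:, 1] + X[:, 2] > 1.0).astype(int)   # synthetic "bought" label

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
print(model.predict_proba(X[:5])[:, 1])     # propensity-to-buy scores
```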

Predictive analytics model methodology

Predictive analytical models are typically developed using the widely adopted Cross Industry Standard Process for Data Mining (CRISP-DM) methodology. CRISP-DM includes six phases: business understanding, data understanding, data preparation, model development using supervised and unsupervised learning, model evaluation and model deployment.

Business understanding

The business understanding phase involves defining the business problem or use case, the business objectives and the business questions that need to be answered. It also involves defining success criteria. Then the standard project-related tasks need to be performed. These tasks include defining resource requirements such as people and money, technology requirements, creating a project plan, defining any constraints, assessing risks and creating a contingency plan.

Data understanding

The data understanding phase involves identifying data requirements, such as internal and external data sources, and data characteristics including volume, variety, velocity and format, as well as whether the data sits in flat files, a relational database or a Hadoop Distributed File System (HDFS), or whether it is live, streaming data.

This phase also includes data exploration using statistical analysis to look at data—for example, basic statistics about each data column and any information about whether data is skewed in any way. Visualizations such as histograms and scatterplots help with drilling down on outliers and errors. In addition, a data quality assessment involves understanding the degree to which data is missing, has errors, is inconsistent and is duplicated.
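
A small sketch of that exploration step, assuming pandas and a hypothetical equipment_readings.csv file (the article prescribes no particular tools):

```python
# Basic column statistics, skew, missing values, duplicates and histograms.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("equipment_readings.csv")   # hypothetical data set

print(df.describe())                 # basic statistics about each column
print(df.skew(numeric_only=True))    # is any column skewed?
print(df.isna().mean())              # fraction of missing values per column
print(df.duplicated().sum())         # count of duplicated records

df.hist(figsize=(10, 6))             # histograms help spot outliers and errors
plt.show()
```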

Data preparation

The objective of the data preparation phase is to produce a set of data that can be fed into machine-learning algorithms. This process requires a number of tasks including data enrichment, filtering and cleaning; data conversion; data transformation; and variable identification, which is also known as feature selection or dimensionality reduction. Variable identification’s objective is to create a data set of the most highly relevant variables to be used as model input to get optimal results. The intention is also to remove variables from a data set that are not useful as model input without compromising the model’s accuracy—for example, the accuracy of the predictions it makes.
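
One common way to perform the variable-identification step is univariate feature selection; the sketch below uses scikit-learn's SelectKBest on synthetic data, purely as an illustration of keeping only the most relevant variables:

```python
# Keep only the k most relevant variables before model input.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

selector = SelectKBest(score_func=f_classif, k=5)
X_reduced = selector.fit_transform(X, y)
print(X_reduced.shape)                      # (300, 5)
print(selector.get_support(indices=True))   # indices of retained variables
```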

Model development

The model development phase is about building a machine-learning model. Models can be built to predict, forecast or analyze data to find patterns such as associations and groups. Two types of machine learning can be used in model development: supervised learning and unsupervised learning.

Typically, predictive models are built using supervised learning. For example, if we want to develop a model that predicts equipment failure, we can use data that describes equipment that has actually failed. We can use that data to train the model to recognize the profile of a piece of equipment that is likely to fail. To accomplish this profile recognition, we split the data set containing failed equipment data records into a training data set and a test data set. Then we train the model by feeding the training data set into an algorithm, several of which can be used for prediction. Then we test the model using the test data set.
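
A minimal sketch of that split-train-test workflow, with synthetic records standing in for the failed-equipment data and scikit-learn as an assumed (not article-specified) toolkit:

```python
# Split labeled records, train a classifier, then test on held-out data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical records labeled failed / healthy.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```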

Unsupervised learning is a process of analyzing data to try and find hidden patterns in the data that indicate product association and groupings—for example, customer segmentation. Grouping is based on maximizing or minimizing similarity. The K-means clustering algorithm is a widely used algorithm for this approach. Predictive and descriptive analytical models can be built using advanced analytics or data mining tools, data science interactive workbooks with procedural or declarative programming languages, analytics clouds and automated model development tools.
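
For the unsupervised case, a toy K-means segmentation might look like this (attributes and cluster count are invented for illustration):

```python
# Group synthetic customers into four segments with K-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
customers = rng.random((200, 2))   # e.g. normalized spend and visit frequency

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_[:10])         # segment assigned to each customer
print(kmeans.cluster_centers_)     # the profile of each segment
```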

Model evaluation

After a model is developed, the next phase is to evaluate the accuracy of predictions or groupings. For predictions, this evaluation means understanding how many predictions were correct and how many were incorrect. Various methods can accomplish this evaluation. Key measures in model evaluation are the number of true positives, false positives, true negatives and false negatives. The bottom line is that we need to make sure that the model is accurate; otherwise, it could generate lots of false positives that may result in incorrect decisions and actions.
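
Those four counts fall out of a confusion matrix; a brief sketch with invented labels shows the bookkeeping:

```python
# True/false positives and negatives from predictions vs. actual outcomes.
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]    # actual outcomes (illustrative)
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]    # model predictions

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp} FP={fp} TN={tn} FN={fn}")                  # TP=3 FP=1 TN=3 FN=1
print(f"Accuracy: {(tp + tn) / (tp + tn + fp + fn):.2f}")  # 0.75
```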

Model deployment

Once we are happy with the model we’ve developed, the final phase involves deploying models to run in many different environments. These environments include spreadsheets, analytics servers, applications, database management systems (DBMSs), analytical relational database management systems (RDBMSs), Apache Hadoop, Apache Spark and streaming analytics platforms.


Source: ibmbigdatahub.com-What is machine learning?

All you need to know about blockchain, explained simply

Many people know it as the technology behind Bitcoin, but blockchain’s potential uses extend far beyond digital currencies.

Its admirers include Bill Gates and Richard Branson, and banks and insurers are falling over one another to be the first to work out how to use it.

So what exactly is blockchain, and why are Wall Street and Silicon Valley so excited about it?

What is blockchain?

Currently, most people use a trusted middleman such as a bank to make a transaction. But blockchain allows consumers and suppliers to connect directly, removing the need for a third party.

Using cryptography to keep exchanges secure, blockchain provides a decentralized database, or “digital ledger”, of transactions that everyone on the network can see. This network is essentially a chain of computers that must all approve an exchange before it can be verified and recorded.
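
To make the “chain” in blockchain concrete, here is a toy sketch of blocks linked by cryptographic hashes; it illustrates only the tamper-evidence idea, with none of the consensus, signatures or proof-of-work a real network adds:

```python
# Each block hashes the previous block, so altering history breaks the chain.
import hashlib
import json

def make_block(transactions, prev_hash):
    block = {"transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block(["Alice pays Bob 5"], prev_hash="0" * 64)
block2 = make_block(["Bob pays Carol 2"], prev_hash=genesis["hash"])

# Tampering with the first block no longer matches block2's stored link:
genesis["transactions"][0] = "Alice pays Bob 500"
recomputed = hashlib.sha256(json.dumps(
    {"transactions": genesis["transactions"],
     "prev_hash": genesis["prev_hash"]},
    sort_keys=True).encode()).hexdigest()
print(recomputed == block2["prev_hash"])   # False -> tampering detected
```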


How does it work in practice?

In the case of Bitcoin, blockchain stores the details of every transaction of the digital currency, and the technology stops the same Bitcoin being spent more than once.

Why is it so revolutionary?

The technology can work for almost every type of transaction involving value, including money, goods and property. Its potential uses are almost limitless: from collecting taxes to enabling migrants to send money back to family in countries where banking is difficult.

Blockchain could also help to reduce fraud because every transaction would be recorded and distributed on a public ledger for anyone to see.

Who is using it?

In theory, if blockchain goes mainstream, anyone with access to the internet would be able to use it to make transactions.

Currently only a very small proportion of global GDP (around 0.025%, or $20 billion) is held in the blockchain, according to a survey by the World Economic Forum’s Global Agenda Council.

But the Forum’s research suggests this will increase significantly in the next decade, as banks, insurers and tech firms see the technology as a way to speed up settlements and cut costs.

Companies racing to adapt blockchain include UBS, Microsoft, IBM and PwC. The Bank of Canada is also experimenting with the technology.

A report from financial technology consultant Aite estimated that banks spent $75 million last year on blockchain. And Silicon Valley venture capitalists are also queuing up to back it.

Source: weforum.org-All you need to know about blockchain, explained simply

Business failing to learn lessons of past cyber attacks, report shows

Business and other organisations are failing to learn the lessons of past cyber attacks, the latest Verizon Data Breach Investigations Report (DBIR) reveals.

The analysis of 2,260 breaches and more than 100,000 incidents at 67 organisations in 82 countries shows that organisations are still failing to address basic issues and well-known attack methods.

“This year’s study underlines that things are not getting better,” said Laurance Dine, managing principal of investigative response at Verizon Enterprise Solutions.

“We continue to see the same kind of attacks exploiting the same vulnerabilities because many organisations still lack basic defences,” he told Computer Weekly.

The 2016 DBIR shows, for example, that nearly two-thirds of confirmed data breaches involved using weak, default or stolen passwords.

The report also shows that most attacks exploit known vulnerabilities that organisations have never patched, despite patches being available for months – or even years – with the top 10 known vulnerabilities accounting for 85% of successful exploits.

“User security awareness continues to be overlooked as organisations fail to understand that they need to make their employees the first line of defence,” said Dine.

“Organisations should be investing in training to help employees know what they should and shouldn’t be doing, and to be aware of the risks so they can alert security teams if they spot anything suspicious,” he said.

For this reason, Dine said it is important for organisations to have the processes in place that make it easy for employees to report security issues.

Phishing attacks

Phishing is one area where increased user awareness could help, said Dine, especially as the use of fraudulent emails to steal credentials or spread malware increased dramatically in the past year.

“If we could reduce phishing through user awareness training, we could probably reduce a lot of the other stuff as well because many of the attacks start with a simple phishing email,” said Dine.

The study shows that 30% of phishing messages were opened – up from 23% in the 2015 report – and 12% clicked on malicious attachments or links that installed malware.

In previous years, phishing was a leading attack pattern for cyber espionage, but now features in most cyber attacks.

According to Verizon researchers, this technique is amazingly effective and offers attackers a number of advantages, such as a very quick time to compromise and the ability to target specific individuals and organisations.

Human error cause of most attacks

Underlining the importance of user awareness and the human element of security, the report shows that human error accounts for the largest proportion of security incidents, with 26% of these errors involving sending sensitive information to the wrong person.


Source: computerweekly.com-Business failing to learn lessons of past cyber attacks, report shows

What is Service Automation?

When one hears the term “service automation,” it’s natural to envision robotic machines serving humans. In fact, this is true to a certain degree. Generally speaking, service automation is a broad term that in most instances encompasses the concept of technology used to help human workers complete necessary tasks faster, easier and with greater efficiency. Beyond this, there are a number of ways an organization might benefit from service automation. Let’s take a look at how and why this concept is becoming more popular by the day.

Streamlining Operations

In every business, there are always going to be certain repetitive tasks that may seem menial, but are in fact necessary to ongoing operational success. Unfortunately, because these services are often tedious, they also bring along a particular degree of inefficiency, which can be costly over time. They are also, by nature, more prone to human error.

Service automation leverages technology to transfer these manual, error-prone tasks from human to machine, speeding up the process and eliminating mistakes along the way. Work is performed faster, which drives up efficiency levels. It also frees up personnel to apply their skills and resources to more business-critical tasks, increasing productivity as a result.

Maintaining Control and Visibility

Today’s technology has made it easy and incredibly affordable for organizations of every size and industry to manage their IT and other key initiatives in-house. One area that has particularly benefited is service automation. Where a small to mid-sized business might once have been forced to outsource or rely on a third party to handle things like maintenance and help desk support, automation has allowed these functions to be moved back on-site.

There are many benefits to maintaining IT functions from within, not the least of which is the ability to stay in complete control of sensitive data and operational processes. With the right service automation tool, housing IT internally can also provide greater visibility, which accelerates and improves the decision-making process.

Facilitating Scalability

Another key area where service automation has virtually revolutionized how businesses function is the instant and powerful scalability it provides. Managing change and accommodating increasing demands was once a significant hurdle for smaller and even mid-sized organizations. In certain situations, such as a sudden and significant increase in workflow, existing human capital simply wasn’t enough to meet the changing needs, making it nearly impossible to compete with larger enterprises.

With the adoption of service automation, however, the amount of work produced can be increased (and performed without the risk of human error) on-demand at the veritable click of a button. This has dramatically changed the competitive landscape and leveled the playing field, opening the doors of practically limitless opportunities for companies of all sizes.

Service automation may come in a wide variety of flavors, but its basic concept remains constant in just about every instance and application. If you haven’t yet looked into how this advanced technology could benefit your business, the time to consider it is now.

Source: Ayehu-What is Service Automation? 

Wipro to deploy AI platform Holmes to do the job of 3,000 engineers

Wipro Ltd will use its artificial intelligence platform Holmes to automate several aspects of its so-called fixed-price projects, saving up to $46.5 million and freeing around 3,000 engineers from mundane software maintenance activities.

The move is part of Wipro’s larger plan to generate $60-$70 million in revenue by selling the platform to new and existing clients in the current financial year.

A Wipro spokesperson declined comment.

About 30,000 of Wipro’s workforce of 110,000 work in fixed-price projects.

New chief executive officer Abidali Neemuchwala has set an ambitious target for Wipro: more than tripling its revenue growth to 12-14% for fiscal 2017.

“Hyper-automation is one of the six themes Abid has outlined. We will move out 1,300 engineers on on-site (fixed price contracts) and about 2,000 people from off-site this year,” said an executive briefed about the development.

The person did not want to be named.

“We have seen good traction from Holmes and across industries, we are seeing early benefits,” said the executive, adding that automation is one of the central pillars of Neemuchwala’s vision of making Wipro a $15 billion firm by 2020 with an operating margin of 23% (versus 20.5% currently).

In fixed-price projects, a client and IT vendor agree on a price, with the outsourcing firm having the final word on the number of people to be deployed on a project.

Wipro’s push to monetize Holmes and use it to save costs, and thereby arrest falling profitability, comes more than a year after it launched the platform.

The move also comes at a time when Indian IT vendors’ traditional approach of outsourcing work to cheaper locations is under pressure, as automation platforms and cloud computing erase the labour arbitrage enjoyed by these firms.

“The suggestion on the large number of personnel being freed up by the emergence of intelligent automation underlines the disruption that is about to hit the supply side,” said Thomas Reuner, managing director of IT outsourcing research at HfS Research.

Wipro’s larger rivals Tata Consultancy Services Ltd (TCS) and Infosys Ltd are also relying on their intelligent platforms, Ignio and Mana, respectively, to improve their profitability and revenue per employee.

TCS is looking to increase by 20% the revenue per employee in its IT infrastructure division by automating tasks previously done by engineers. Infosys expects the productivity boost tied to automation and artificial intelligence to reflect in a meaningful way from April 2017.

Holmes, or “heuristics and ontology-based learning machines and experiential systems”, is an intelligent platform which promises to bring efficiency for clients by reducing errors. Additionally, it helps Wipro save on wage costs by deploying fewer people to complete a task.

For instance, Wipro is using Holmes to help large banks approve and process loans quickly. The platform extracts a new customer’s information, performs a cognitive search comparing the credit history with the bank’s other customers, authenticates and validates the loan process.
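
Wipro has not published how Holmes implements those steps; purely to illustrate the shape of such an extract-compare-validate pipeline, here is a toy sketch in which every function, field and threshold is hypothetical:

```python
# Hypothetical loan-screening pipeline; all names and numbers are invented.

def extract_applicant(form):
    """Pull structured fields from a new customer's application."""
    return {"name": form["name"], "income": form["income"],
            "credit_score": form["credit_score"]}

def similar_customers(applicant, history):
    """Compare credit history against the bank's existing customers."""
    return [c for c in history
            if abs(c["credit_score"] - applicant["credit_score"]) < 25]

def approve(applicant, history, min_score=650):
    """Validate against a score floor and the peers' default rate."""
    peers = similar_customers(applicant, history)
    default_rate = (sum(c["defaulted"] for c in peers) / len(peers)
                    if peers else 1.0)
    return applicant["credit_score"] >= min_score and default_rate < 0.1

history = [{"credit_score": 700, "defaulted": 0},
           {"credit_score": 690, "defaulted": 0}]
form = {"name": "A. Customer", "income": 55_000, "credit_score": 705}
print(approve(extract_applicant(form), history))   # True on these toy inputs
```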

However, don’t expect Holmes to change things for Wipro overnight, cautioned some Wipro executives and experts.

“Freeing a few people on the delivery side of the business will be easier than, say, generating more business from selling Holmes,” said another executive on condition of anonymity.

“Most service providers still don’t have clear ideas as to what will happen to those people being freed from current tasks: will they be re-skilled, re-badged or will there be even redundancies?” asked Reuner. “In all likelihood, it will be a mixture of the three.”

While automation has started to feature prominently in the earnings calls of many Indian IT services firms, most of them, including Wipro, remain tight-lipped about the details.

“We need a more honest and open debate about the transformation of knowledge work. Many tasks and jobs will disappear while new jobs, often highly analytical, will be created,” Reuner said.

His worries aren’t unwarranted.

For the first time in over two decades, two of India’s five largest software services firms, Wipro and HCL Technologies Ltd, reported a net decline in direct hiring. Meanwhile, Tech Mahindra Ltd saw a decline in its existing workforce in the January-March period.

Source: Livemint-Wipro to deploy AI platform Holmes to do the job of 3,000 engineers

Software Robots Are Transforming Old Guard Industries

Automation shows no sign of slowing down, says Blue Prism CEO Alastair Bathgate in an article for Xconomy. This is especially true in ‘old-guard’ industries like financial services and insurance, where robotic process automation helps enterprises become more competitive through the automation of slow, laborious back-office work. Read Alastair’s full article for Xconomy below:

Software Robots Are Transforming Old Guard Industries

Everyone’s talking about robots and automation. Do you imagine having a “Rosie” cleaning your home, or the next industrial revolution of automation putting millions out of work? Neither is quite reality, but there is a highly valuable virtual workforce of “software robots” working in today’s offices. They aren’t rolling around like BB-8 delivering the mail, but they are putting the “human” back into a lot of jobs, saving companies money and making them more competitive.

Software robots mimic humans, but live in the cloud or in the data center. They are the ethereal versions of their machine cousins – following business rules to execute processes within and across systems. Their people managers can configure software robots to drive any application – without code. And unlike traditional computer software, they teach the machines how to complete tasks. Any rules-based procedures – administrative, repetitive processes – are fitting work for a virtual workforce of software robots.

Surprisingly, the use of software robots is growing fastest not in slick tech companies but in old-guard industries like financial services and insurance, where robotic process automation (RPA) can help them be more competitive by taking over slow and laborious back office operations and supporting digital transformations and mobile applications, allowing human employees to have a more significant impact on customer experience and loyalty.

These early adopters of automation and software robots have a lot in common: large customer bases they want to protect, complex product portfolios, and a history of burdensome regulatory environments. They are also under immense pressure from new market entrants: agile startups that aren’t saddled by legacy systems to slow them down.

Back-office operations for these companies are ripe opportunities for software robots. They can quickly realize the benefits of RPA in back office transactions like insurance claims, invoice processing, and others where high volumes of transactions can be processed faster without human error, increasing speed and accuracy. Take, for example, The Co-operative Bank, a Blue Prism customer that deployed a virtual workforce of software robots to manage its excess queue procedure. Previously, eleven individuals worked eight hours a day to manually process 2,500 high-risk accounts. With RPA implemented, nine of these team members have taken on proactive account management positions – engaging with customers to discuss account issues before they arise.

Perhaps most significantly to IT, there is no need to abandon or add layers to legacy platforms. This last bit is important: banks and insurers have hundreds of legacy (and often proprietary) systems they don’t want to replace, nor should they. The right software robots are built to be non-invasive and don’t require customization of existing IT systems.

But efficiency is only one driving force for companies today: 74 percent of insurance companies surveyed by Forrester in North America, for example, said improving the experience of their customers is their number one initiative in the next 12 months. With that in mind, we’re beginning to see companies making more sophisticated decisions about what business functions are automated: while insurance providers can use software robots to process insurance claims behind the scenes without human intervention, consumers expect to have a real person answer the phone when they call a hotline. Automating straightforward tasks that are done the same way every time allows humans to handle the jobs that require improvising in a way that robots can’t handle.

The next wave of massive software robot adoption will happen in the healthcare industry, where driving down costs and improving patient experiences and care are paramount. Like other old-guard industries, healthcare is burdened by legacy systems, complicated regulatory environments, and huge pressure to drive down costs. The efficiencies to be gained without adding IT burdens are tremendous. Alsbridge, a global sourcing, benchmarking, and advisory firm, recently predicted that RPA in claims processing alone could save the healthcare industry more than $1 billion. But imagine the possibilities in healthcare – from back-office administration (claims processing, patient record reconciliation, pharmacy stock controls) to customer service improvements (self-service check-ins, appointment scheduling). Who can imagine a better place than a hospital to take out unnecessary burdens of red tape and enable people to focus on people?

Technology has always changed the way we do business, and that trend isn’t slowing down any time soon. The opportunity before us is to use automation smartly, in a way that allows us to be more human in our jobs. We have emotional and intellectual intelligence that software robots can’t replicate or automate, so let’s put it to better use.

Source: BluePrism-Software Robots Are Transforming Old Guard Industries