IoT to play a part in more than a quarter of cyber attacks by 2020, says Gartner

More than 25% of cyber attacks will involve the internet of things (IoT) by 2020, according to technology research firm Gartner.

And yet, researchers claimed IoT would account for less than 10% of IT security budgets and, as a result, security suppliers would have little incentive to provide usable IoT security features.

They also said the decentralised approach to early IoT implementations in organisations would result in too little focus on security.

Suppliers will focus too much on spotting vulnerabilities and exploits, rather than segmentation and other long-term means that better protect IoT, according to Gartner.

“The effort of securing IoT is expected to focus more and more on the management, analytics and provisioning of devices and their data,” said Gartner research director Ruggero Contu.

“IoT business scenarios will require a delivery mechanism that can also grow and keep pace with requirements in monitoring, detection, access control and other security needs,” he added.

According to Contu, the future of cloud-based security services is, in part, linked with the future of the IoT.

“The IoT’s fundamental strength in scale and presence will not be fully realised without cloud-based security services to deliver an acceptable level of operation for many organisations in a cost-effective manner,” he said.

Gartner predicted that by 2020, at least half of all IoT implementations would use some form of cloud-based security service.

Although overall spending will initially be moderate, Gartner predicted that IoT security market spending would increase at a faster rate after 2020, as improved skills, organisational change and more scalable service options improved execution.

Gartner predicted global spending on IoT security would reach $348m in 2016 – up 23.7% from 2015 – then $433.95m in 2017 and $547m in 2018.

“The market for IoT security products is currently small, but it is growing as both consumers and businesses start using connected devices in ever greater numbers,” said Contu.

“Gartner forecasts that 6.4 billion connected things will be in use worldwide in 2016, up by 30% from 2015, and will reach 11.4 billion units by 2018. However, considerable variation exists among different industry sectors as a result of different levels of prioritisation and security awareness,” he said.
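The growth rates implied by these forecasts are easy to check. The following sketch simply recomputes them from the dollar and unit figures quoted above; nothing beyond the article's own numbers is assumed:

```python
# Figures quoted in the article (spending in millions of USD,
# device counts in billions of units). Everything below is simple
# arithmetic on those published numbers.
spending = {2016: 348.0, 2017: 433.95, 2018: 547.0}

# Year-over-year growth implied by the spending forecast
growth_17 = spending[2017] / spending[2016] - 1  # ~24.7%
growth_18 = spending[2018] / spending[2017] - 1  # ~26.1%

# 2016 spending is stated as 23.7% up on 2015, implying a 2015 base of:
base_2015 = spending[2016] / 1.237  # ~$281m

# 6.4 billion connected things in 2016, up 30% from 2015, implies:
devices_2015 = 6.4 / 1.30  # ~4.9 billion

print(f"2017 growth: {growth_17:.1%}, 2018 growth: {growth_18:.1%}")
print(f"implied 2015 spending: ${base_2015:.0f}m; 2015 devices: {devices_2015:.1f}bn")
```

The implied year-on-year growth edges up only slightly (roughly 24.7% to 26.1%), which is consistent with Gartner's statement that the faster acceleration comes after 2020.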

Source: computerweekly.com – IoT to play a part in more than a quarter of cyber attacks by 2020

Gaining Productivity with Robotics Process Automation (RPA)

With so much hype in the market about Robotic Process Automation (RPA), and with various vendors rebranding their existing software as RPA, customers are confused by all the conflicting information and the range of functionality on offer. Blue Prism brings a number of experts together to discuss what is going on in the marketplace and the most recent research conducted on early adopters in this space.

Join us as Sarah Burnett, VP and global lead for automation research at the analyst firm Everest Group, shares her analysis of the automation market, the technology adoption trends, and the outlook for the future. She is followed by London School of Economics Professor Leslie Willcocks and Dr. Mary Lacity from University of Missouri who have studied 14 early adopters of RPA. The results of that research have culminated in their book “Service Automation: Robots and The Future of Work.”

Source: Vimeo – London School of Economics, University of Missouri & Everest Group

Is Cloud clouding the issue?

Cloud computing isn’t just fashionable, it’s widely considered to be a required component of any modern IT solution. But the varied and ‘catch-all’ nature of a term like ‘cloud computing’ leaves it wide open to confusion. There are fundamental differences between public and private clouds that can substantially affect their suitability for one business or another.

One is occupied by multiple tenants, externally managed and rapidly expandable; the other is privately managed and secured specifically for the needs of a single tenant. The only characteristic these solutions share is virtualisation. Despite being decidedly less ‘sexy’ and marketable, virtualisation is the real game-changer in the IT industry today.

Public cloud

The principal value of a cloud solution is often assumed to be cost saving, which is undoubtedly the case for public cloud. These solutions can be rapidly expanded to meet business requirements and cope with spikes in demand on resources. Opting for public cloud outsources the cost of maintaining, orchestrating and optimising your own systems.

Another key element of the public cloud is the speed with which it can be implemented. Cloud providers offer out-of-the-box solutions that allow for very agile implementations, further keeping costs down. These cost saving elements are a major reason behind the widespread uptake of public cloud by SMEs. Many companies find themselves in the situation where it makes commercial sense to offload the management of their IT infrastructure, lowering the cost-barrier to new technologies and dramatically simplifying a major business process. Public clouds offer access to a smart, automated and adaptive IT solution at a fraction of the cost of a private installation of similar capabilities.

Private cloud

A private cloud brings very different benefits to a business. Because individual businesses are responsible for the management and upkeep of private cloud systems, greater control is the primary goal of a private cloud, not cost saving. When one business is the sole tenant of cloud infrastructure, it can control and customise it to its very specific needs. Processes relating to R&D, supply chain management, business analytics, and enterprise resource planning can all be effectively migrated to a private cloud.

Where public clouds can offer flexibility and scalability to a huge degree at reduced cost, private clouds offer increased control and customisation over a virtualised IT environment. This is an important consideration for businesses in industries or sectors that are governed by strict regulations around data handling, security or transaction speed that may effectively prohibit the use of public cloud solutions.

Moving on from Safe Harbour

The European Court of Justice’s recent decision to nullify the ‘Safe Harbour’ agreement has caused many businesses to re-evaluate their cloud solutions in order to adhere to the new rules governing geographical limitations on data transfer and storage. What becomes clear here is that all cloud implementations are absolutely not created equal. Given the misplaced connotations around the term, talking about ‘cloud’ can actually unhelpfully confuse discussions that should instead be focused on the merits of virtualisation and how they can be applied to the specific IT service needs of a particular business.

Cloud computing is not the simple answer to the challenges facing enterprise IT today. Rather, it is an important IT sourcing decision in the journey towards a multi-sourced, software-defined IT system that is specifically tailored to the needs of modern business. As businesses mature, managed, self-built and public clouds must co-exist with each other as well as with legacy systems.

Here, the role of the data centre in enterprise IT architecture becomes crucial. All clouds have a physical footprint of some kind or another. More often than not, this footprint resides in a data centre that is capable of facilitating the high power density, efficient workload focus and seamless interconnectivity between data sources that underpin the core principles of virtualisation.

The hot topics of enterprise IT – the IoT, big data manipulation, machine to machine communication – all fundamentally rely on the software-defined approach that datacentres can provide. In the question of how enterprise IT can cope with the challenges of these applications, the cloud is only a small part of the answer.

Source: itproportal-Is Cloud clouding the issue? 

2016: CIOs And CMOs Must Rally To Lead Customer-Obsessed Change Now

In the coming weeks, Forrester will publish its annual set of predictions for our major roles, industries, and research themes — more than 35 in total. These predictions for 2016 will feature our calls on how firms will execute in the Age of the Customer, a 20-year business cycle in which the most successful enterprises will reinvent themselves to systematically understand and serve increasingly powerful customers.

In 2016, the gap between customer-obsessed leaders and laggards will widen. Leaders will tackle the hard work of shifting to a customer-obsessed operating model; laggards will aimlessly push forward with flawed digital priorities and disjointed operations. It will require strong leadership to win, and we believe that in 2016 CMOs will step up to lead customer experience efforts. They face a massive challenge: Years of uncoordinated technology adoption across call centers, marketing teams, and product lines make a single view of the customer an expensive and near-impossible endeavor. As a result, in 2016 companies will be limited to fixing their customer journeys.

CMOs will have good partners, though. As they continue to break free of IT gravity and invest in business technology, CIOs will be at their sides. 2016 is the year that a new breed of customer-obsessed CIOs will become the norm. Fast-cycle strategy and governance will be more common throughout technology management and CIOs will push hard on departmental leaders to let go of their confined systems to make room for a simpler, unified, agile portfolio.

Firms without these senior leadership efforts will find themselves falling further behind in 2016, with poor customer experience ratings impacting their bottom line. Look for common symptoms of these laggards: Poorly coordinated investment in digital tools, misguided efforts to invent new C-level titles, and new products with unclear business models.

We will begin publishing our predictions for the CMO and CIO roles on November 2nd in conjunction with our Age of the Customer Executive Summit. A steady stream of predictions will follow in the days after that. In the meantime, I’m providing a sneak peek at our predictions documents by identifying the top 10 critical success factors for winning in the age of the customer:

  1. Disrupt leadership: CEOs will need to consider significant changes to their leadership teams to win a customer-led, digital market; CEOs that hang on to leadership structures to simply preserve current power structures will create unnecessary risk.
  2. Institute a customer-obsessed operating model: Companies that shift to customer-obsessed operations will gain sustainable differentiation; those that preserve old ways of doing business will begin the slow process of failing.
  3. Connect culture to business success: Those that invest in culture to fuel change will gain significant speed in the market; those that avoid or defer culture investments will lose ground in the market.
  4. Personalize the customer experience (CX): Customers will reward companies that anticipate their individual needs and punish those that have to relearn basic information at each touchpoint.
  5. Implement multidiscipline CX strategies: Companies that transform operations to deliver high-value, personalized experiences will drive a wedge between themselves and laggards just executing CX tactics.
  6. Operate at the speed of disruptors: Leaders will animate their scale, brand, and data while operating at the speed of disruptors; laggards will continue to be surprised and play defense in the market.
  7. Evolve loyalty programs: Companies that find ways for customers to participate with their brand and in product design will experience new and powerful levels of affinity; companies that try to optimize existing loyalty programs will see little impact to affinity or revenue.
  8. Convert analytics to customer value: Leaders will use analytics as a competitive asset to deliver personalized services across human and digital touchpoints; laggards will drown in big data.
  9. Master digital: Companies that become experts in digital will further differentiate themselves from those that dabble in a set of digital services that merely decorate their traditional business.
  10. Elevate privacy as a differentiator: Leaders will extend privacy from a risk and legal consideration to a position to win customers; companies that relegate privacy as a niche consideration will play defense and face churn risk.

As you can tell, we’re expecting 2016 to be another year of rapid change as firms learn to cope with and respond to empowered customers and agile competitors. The decisions companies make, and how fast they act, will determine whether they thrive or fail in the age of the customer.

To learn more, download Forrester’s guide to the Top 10 Success Factors To Determine Who Wins And Who Fails In The Age Of The Customer.

Software Eats Everything

A widely quoted phrase these days is “software eats everything.” It refers to the great value that software delivers. I believe it also applies to the profound impact it’s making in the services world. Software is disintermediating the industrialized labor arbitrage model and also infrastructure services. Let’s look at the huge implications for the services industry.

How is software eating services? It’s happening in a number of important ways and areas.

Software eating BPO
First, software enables automation and RPA to replace much of what the current industrialized arbitrage model does. Much of this work is repetitive and screams for a more automated approach. BPO work, for instance, consists largely of labor bridging the gap between records and the systems of record. As I’ve blogged before, software is about to eat BPO labor.

DevOps and software eating infrastructure services
The DevOps revolution’s impact on infrastructure services is another example of software eating everything. A fully integrated DevOps platform allows teams to define code for infrastructure hardware at the same time as code for functionality. Increasingly, in a software-defined infrastructure, companies can build an integrated DevOps platform that configures the entire supply chain simultaneously, from functionality all the way down to the number of cores required to run and test it.
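The idea of defining infrastructure in the same versioned artifact as functionality can be sketched in a few lines. This is a hypothetical illustration only — the `Deployment` class and its fields are invented for this example, not any vendor's API:

```python
from dataclasses import dataclass

@dataclass
class Deployment:
    """A hypothetical declarative spec: the application's functionality
    and the infrastructure it runs on live in the same versioned file,
    so both change through the same code-review process."""
    service: str
    image: str          # application code version (the "functionality")
    cores: int          # compute sized in the same spec ...
    memory_gb: int      # ... down to the hardware the service needs
    replicas: int = 1

    def total_cores(self) -> int:
        # Capacity for the whole stack is derived from the spec,
        # rather than provisioned manually by an operations team.
        return self.cores * self.replicas

# Production and a smaller staging environment come from the same
# definition: configuring the stack is a code change, not a ticket.
prod = Deployment(service="billing", image="billing:2.4.1",
                  cores=4, memory_gb=16, replicas=8)
staging = Deployment(service="billing", image="billing:2.4.1",
                     cores=2, memory_gb=8)

print(prod.total_cores())  # 32 cores requested for production
```

The point of the pattern is that the provisioning steps that were previously labor-based become an output of the code itself.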

Prior to the DevOps movement, all these steps were labor based, and much of this work migrated into the industrialized arbitrage model. They now become largely automated and software controlled.

Software and virtual services eating infrastructure services
Another example within infrastructure is the infrastructure itself. Five years ago, companies operated in a world where they were trying to move from 20 servers per FTE to 50. Most of the infrastructure service providers succeeded based upon their ability to make that shift.

Today, the services industry is trying to get up to somewhere in the range of 200 to 500 servers per FTE. But the highly automated world of Silicon Valley runs over 100,000 virtual servers per person. They have completely severed the link between people and servers. Again, a dramatic example of software eating everything.
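A back-of-the-envelope calculation using the article's own figures shows how large each jump in leverage per person is:

```python
# Servers managed per person at each stage described above (article figures)
five_years_ago = 20       # the old ratio providers moved away from
provider_shift = 50       # what most infrastructure providers achieved
industry_target = 500     # top of the range the industry aims for today
silicon_valley = 100_000  # highly automated, virtualised operations

print(provider_shift / five_years_ago)    # 2.5x: the shift providers made
print(industry_target / provider_shift)   # 10x: today's target beyond that
print(silicon_valley / industry_target)   # 200x: the fully automated world
```

Each stage is a step change, but the last one is of a different kind entirely: it no longer scales with headcount at all.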

SaaS, BPaaS impact
Another dramatic example of software eating everything is the Software-as-a-Service (SaaS) and Business Process as a Service (BPaaS) offerings. These software-based services offerings completely automate and configure the software, hardware, and business process experience for customers. SaaS and BPaaS completely upend the classic functional model previously used to deliver these functions.

Implications for the service industry
Software eating everything is a relentless focus on different ways to sever the traditional link of labor (FTEs) to service. The dislocation to labor-based businesses will be immense over the next few years as this journey to a software-defined world continues and existing business models struggle to adapt.

A software-defined marketplace will dramatically change the current services market. It will create opportunities for new industries to emerge and force tremendous tension on the incumbent service providers to survive by embracing the change and cannibalizing their existing work.

Source: EverestGroup-Software Eats Everything by Peter Bendor-Samuel

Gartner Highlights Five Key Steps to Delivering an Agile I&O Culture

By 2018, 90 Percent of I&O Organizations Attempting to Use DevOps Without Specifically Addressing their Cultural Foundations Will Fail

Infrastructure and operations (I&O) leaders planning a bimodal IT strategy will miss out on the benefits of DevOps support for agile practices unless they transform their I&O culture, according to Gartner, Inc.

Gartner said that the implementation of a bimodal IT strategy requires careful planning and execution. Analysts predict that, by 2018, three quarters of enterprise IT organizations will have tried to create a bimodal capability, but that less than 50 percent of those will reap the benefits of improved agility and risk management.

“I&O leaders are under pressure to support customers who want to go faster, so they are utilizing agile development,” said Ian Head, research director at Gartner. “However, movement to agile will not, and should not, be a wholesale immediate change. Instead, it should first be implemented in areas where there is a very real business need for speed, and then carefully rolled out — taking the culture of the organization into account.”

Gartner has developed the strategy known as “bimodal IT,” which refers to having two modes of IT. Mode 1 is traditional, emphasizing scalability, efficiency, safety and accuracy, while Mode 2 is nonsequential, emphasizing agility and speed.

“Changing the behaviors and culture are fundamental to the success of a bimodal IT approach. We estimate that, by 2018, 90 percent of I&O organizations attempting to use DevOps without specifically addressing their cultural foundations will fail,” said Mr. Head.

“We do not advocate wholesale cultural change in a single organizationwide program. Instead, I&O leaders should focus their efforts on an initial, small Mode 2 team, establish the values and behaviors needed, and take incremental efforts to recognize and reinforce desired outcomes prior to scaling.”

The following five-step approach will help I&O leaders achieve an agile I&O culture:

  • Identify Your Mode 2 Behavioral Gap
  • Work With Key Stakeholders to Gain Consensus on the Approach
  • Start With a Small, Focused Pilot
  • Deploy Behaviors Incrementally to Successive Groups With Feedback Loops
  • Pursue Continual Improvement

Read more at: Gartner Highlights Five Key Steps to Delivering an Agile I&O Culture

Gartner: CIOs boost spending on sci-fi technologies

CIOs in the UK and Ireland are expected to increase their spending by 1.4% in 2015, fuelled by the economic recovery, according to research company Gartner.

Technologies related to the internet of things (IoT), robotics and 3D printing are among the big areas of investment.

Gartner’s research found that 10% of CIOs have already deployed IoT.

“A lot of science fiction is becoming real investment, such as 3D printing,” said Gartner analyst Lee Weldon.

Gartner estimated that 5% of CIOs have already deployed 3D printing, while 9% of UK and Irish CIOs are implementing new forms of robotics. This is higher than the global average.

In 2015, the technologies attracting the highest levels of investment will be analytics and cloud computing. According to Weldon, of the CIOs who said their organisations were developing new software, 21% were considering the cloud as their first choice for hosting it.

“After years of cost-cutting, strategic investment in information and technology is returning,” said Weldon.

He said companies are looking to drive business growth in new areas, pointing to widespread interest in technology trends such as cloud, mobility, IoT and digitisation.

CEOs recognise that growth often comes through technology innovation, according to Gartner’s research. “There is more investment from the business in technology,” said Weldon.

This year, spending on digital technology moved up three places to third priority, behind analytics and cloud but ahead of datacentre and infrastructure spending.

Gartner urged CIOs to spend the majority of their time working with areas of the business outside IT to ensure that the value of information and technology is understood and the right investments are made.

In 2015, leading CIOs will spend less than 40% of their time running the IT organisation, choosing instead to spend time with other CxOs (27% of their time), business unit leaders (18% of their time) and external customers (16% of their time).

Source: Computerweekly.com Gartner: CIOs boost spending on sci-fi technologies by Cliff Saran