Analysis: Why Salesforce and Sage are climbing into bed together

Salesforce customers should expect price hikes in a bid to keep shareholders happy, Forrester warns

Leading SaaS vendor Salesforce has partnered with its on-premise rival, Sage, to boost customer acquisition and allow the UK-based firm to develop a brand new product on the Salesforce platform. ComputerworldUK looks at what brought strange bedfellows CEOs Marc Benioff and Stephen Kelly together.

Stephen Kelly, former UK government COO, today announced a new cloud product, Sage Life, which is “socially-powered, with a mobile control centre” for social business and real-time accounting for small and medium businesses. The product is built entirely on the Salesforce1 platform, which could explain why a team of financial advisers was called into Salesforce towers last week.

He also announced a “complimentary” portal called Sage Impact that integrates with the existing Sage tools that accountants and bookkeepers currently use to run their businesses. Accountants can receive personalised news alerts and updates from social networks as well as the information they need on-the-go in one hub online.

The partnership announced this morning may disappoint those excited by the rumours that Salesforce was up for sale. But it does signal an interesting trend: pure-play SaaS vendors are looking at how on-premise rivals can help them boost profitability, while traditional software vendors are looking for ways to offer customers a worthy cloud product.

Who’s buying whom?

It would appear that no-one is being bought outright, yet. Microsoft and SAP were named (and subsequently denied their involvement) as potential suitors for the hugely successful CRM tool-turned-platform. NetSuite, which offers similar cloud-based enterprise tools, refused to confirm or deny any dealings with Salesforce.

Gossip surrounding the sale of the SaaS vendor has always been in the zeitgeist, thanks to the increasing pressure on it to return profits to investors.

This is because customer acquisition costs Salesforce more than it does traditional on-premise vendors like SAP, whose perpetual license fees provide a steady source of cash. Salesforce relies on investors’ cash, and investors can demand a higher growth rate year-on-year.

“SaaS companies make losses and have problems with profitability, almost without exception,” says Anthony Miller, an analyst at TechMarketView.

“The underlying problem is that delivering SaaS is inherently less profitable than delivering software-as-a-product.”

Vendors like Salesforce are responsible for developing, hosting and providing a high service level for their software, unlike traditional software vendors, who simply send off a download link or CD.

Miller adds: “That may be all well and good if they charged a premium, but it is quite the contrary. The whole way SaaS markets itself is at a lesser cost to their client.”

Comparing SaaS’ business model to airline sales, Miller points out that rather than paying a premium for flexibility, as airline passengers do, the opposite is the case with SaaS customers. “SaaS vendors have turned it on its head and said ‘have as little or as much as you want and it is going to be cheaper than you have had before’.”

When you consider that attracting investment costs SaaS vendors like Salesforce “roughly half of their revenue,” according to Miller, it is easy to see why their growth may appear unsustainable.

So how will this partnership benefit both vendors?

The partnership with Sage could give Salesforce access to Sage’s locked-in small and medium business customer base, boosting Salesforce’s cash flow.

Further, this meeting of minds could be the “dramatic action” needed to boost Sage’s growth, said Peter Roe, another TechMarketView analyst, who cited its “so far disappointing and tardy performance in cloud/SaaS where able and more customer-friendly competitors are making inroads” as the reason for its humdrum financial results.

The partnership also saves Sage time and investment in a technology in which it is unlikely to take a market lead.

How will it affect customers?

The pressure to return profits to shareholders will see customers forced into spending more with Salesforce through higher prices and less innovative products, Forrester analysts warned recently. This may filter through to Sage customers who use Sage Life, given its Salesforce backend. However, for those concerned, Miller believes the market is competitive enough, at least, to keep costs relatively low for enterprise users.

It is important to remember that turning SaaS on and off may not be as simple as it was billed. Price commitments, contracts and minimum user numbers will affect how deeply embedded a product becomes within a company; integration and migration will also be difficult, as the product is hosted by another party.

“They [SaaS vendors] are already trying to put the financial terms around it to make it more difficult for customers to bail out but they would have trouble with doubling prices, for example,” Miller noted.

The voracious machine that is SaaS

While this partnership could signal a good deal for all involved, SaaS vendors are still left on the hamster wheel, chasing growth to secure investment. More partnerships are likely to be announced, Miller forecasts.

“This machine has got a voracious appetite to grow the business abnormally fast,” he says.

The announcement itself may not come as a shock. In February, Sage revealed that its global employees would use Salesforce’s Customer Success Platform to improve its internal system architecture and data management.

Kelly said of the announcement: “Together with Salesforce, Sage is shaping the future of small business…Now running a small business can be as easy as updating your Facebook status.”

Source: computerworlduk-Analysis: Why Salesforce and Sage are climbing into bed together  By Margi Murphy

Hadoop big data adoption fails to live up to hype, says Gartner

More than half of survey respondents have no plans to deploy the open source analytics platform

Gartner research shows that more than half of companies have no current plans to adopt Hadoop-based data analytics, despite large firms like British Airways and Marks & Spencer being big fans of the technology.

Gartner’s 2015 Hadoop Adoption Study has found that investment remains “tentative” in the face of “sizable challenges around business value and skills”. The survey, which was conducted in February and March 2015 among 284 Gartner Research Circle members, found that only 125 respondents had already invested in Hadoop or had plans to do so within the next two years.

The Gartner Research Circle is a Gartner-managed panel composed of IT and business leaders. “Despite considerable hype and reported successes for early adopters, 54 percent of survey respondents report no plans to invest at this time, while only 18 percent have plans to invest in Hadoop over the next two years,” said Nick Heudecker, an analyst at Gartner.

“Furthermore,” he said, “the early adopters don’t appear to be championing substantial Hadoop adoption over the next 24 months; in fact, there are fewer who plan to begin in the next two years than already have.”

Hadoop is an open source framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage.
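The distributed model Hadoop popularised is MapReduce: a map phase emits key-value pairs, the framework shuffles them by key, and a reduce phase aggregates each group. The classic word-count example below is a minimal, purely local sketch of that flow (the function names and sample data are invented for illustration; a real Hadoop job would run these phases across the cluster):

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

if __name__ == "__main__":
    lines = ["big data big hype", "data skills gap"]
    counts = reduce_phase(shuffle(map_phase(lines)))
    print(counts["big"], counts["data"])  # 2 2
```

Because the map and reduce steps are independent per key, the framework can run them in parallel across thousands of machines, which is the scaling property described above.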

According to the Gartner research, only 26 percent of respondents claim to be either deploying, piloting or experimenting with Hadoop, while 11 percent plan to invest within 12 months and seven percent are planning investment in 24 months.

Responses pointed to two interesting reasons for the lack of intent, said the analyst. First, several responded that Hadoop was simply “not a priority”. The second was that Hadoop was “overkill” for the problems the business faced, “implying the opportunity costs of implementing Hadoop were too high relative to the expected benefit”, said Gartner.

Gartner analyst Merv Adrian said: “Future demand for Hadoop looks fairly anaemic over at least the next 24 months. Moreover, the lack of near-term plans for Hadoop adoption suggest that despite continuing enthusiasm for the big data phenomenon, demand for Hadoop specifically is not accelerating.

“The best hope for revenue growth for providers would appear to be in moving to larger deployments within their existing customer base.”

Skills gaps were a major adoption inhibitor for 57 percent of respondents, while figuring out how to get value from Hadoop was cited by 49 percent. “The absence of skills has long been a key blocker,” the analyst said. “Tooling vendors claim their products also address the skills gap. While tools are improving, they primarily support highly skilled users rather than elevate the skills already available in most enterprises,” Gartner said.

Gartner estimates it will take “two to three years” for the Hadoop skills challenge to be addressed.

Source: computerworlduk-Hadoop big data adoption fails to live up to hype, says Gartner

Now Published: IT Standard for Business – a Model for Business driven IT Management

IT Standard for Business (or IT Standard for short) has been developed to give decision makers in both business and IT a holistic picture of how IT should be managed in order to provide maximum value for the business. Since business needs should be the main driver of IT management, the emphasis in the IT Standard is on cooperation between business and IT.

IT Standard for Business differs from many other IT standards and frameworks because it is simple and written in everyday language. This makes it useful for everyone who wants to understand how IT functions should be governed in an organization. The basic framework illustration, called the grid, gives an overview of the five principal elements of IT management, which are the following:

  • Enterprise Development turns business development initiatives into operational actions in IT.
  • Strategy and Governance defines how IT operates and creates value for the business.
  • Sourcing and Supplier Management ensures that the company has the services that best fit its business purposes.
  • Project and Development Management is essential for organizations to improve and create new solutions to succeed in competitive environments.
  • Service Management offers business-aligned services that ensure efficient and uninterrupted business operations.

Read the e-book at: IT Standard for Business – a Model for Business driven IT Management

More information at: https://www.ictstandard.org/

The CIO of the future is integrator and cultural enabler: Coleman, Unisys

Information Age sits down with Ed Coleman, CEO of technology veteran Unisys, to find out how IT will stay relevant through the era of the social, software-defined enterprise, as the pieces of the truly software-defined data centre fall into place 

A company that has been a pioneer in the technology industry for over 130 years, Unisys’ antecedents rolled out the first commercial typewriter and adding machine and later the first commercially-available digital computer.

It was the second largest computer company after IBM in the 1980s, and now focuses on projects as diverse as developing biometric technology for national ID schemes and providing systems integration for airports.

As Unisys CEO Ed Coleman explains, the company owes its remarkable longevity to keeping a sharp eye on IT’s next big sea change, which in today’s industry means a focus on software.

See the interview at: The CIO of the future is integrator and cultural enabler: Coleman, Unisys

Magic Quadrant for Enterprise Network Firewalls

The enterprise network firewall market represented by this Magic Quadrant is composed primarily of purpose-built appliances for securing enterprise corporate networks. Products must be able to support single-enterprise firewall deployments and large and/or complex deployments, including branch offices, multitiered demilitarized zones (DMZs) and, increasingly, the option to include virtual versions, often within the data center. These products are accompanied by highly scalable (and granular) management and reporting consoles, and there is a range of offerings to support the network edge, the data center, branch offices and deployments within virtualized servers.

The companies that serve this market are identifiably focused on enterprises — as demonstrated by the proportion of their sales in the enterprise; as delivered with their support, sales teams and channels; but also as demonstrated by the features dedicated to solve enterprise requirements and serve enterprise use cases.

As the firewall market continues to evolve, NGFWs add new features to better enforce policy (application and user control) or detect new threats (intrusion prevention systems [IPSs], sandboxing and threat intelligence feeds). The stand-alone Secure Sockets Layer (SSL) VPN market has largely been absorbed by the firewall market. Eventually, the NGFW will continue to subsume more of the stand-alone network IPS appliance market at the enterprise edge. This is happening now; however, some enterprises will continue to choose to have best-of-breed IPSs embodied in next-generation IPSs (NGIPSs). More recently, enterprises have begun looking to firewall vendors to provide cloud-based malware-detection instances to aid them in their advanced threat efforts, as a cost-effective alternative to stand-alone sandboxing solutions (see “Market Guide for Network Sandboxing”).

However, next-generation firewalls will not subsume all network security functions. All-in-one or unified threat management (UTM) approaches are suitable for small or midsize businesses (SMBs), but not for the enterprise (see “Next-Generation Firewalls and Unified Threat Management Are Distinct Products and Markets”).

The needs for branch-office firewalls are becoming specialized, and they are diverging from, rather than converging with, UTM products. As part of increasing the effectiveness and efficiency of firewalls, they will need to truly integrate more-granular blocking capability as part of the base product, go beyond port/protocol identification and move toward an integrated service view of traffic, rather than merely performing “sheet metal integration” of point products.

Read the report at: Magic Quadrant for Enterprise Network Firewalls

The transformational CIO: a customer-centric visionary

How CIOs must shift their focus toward systems that support their organisation’s ability to win, serve and retain customers

With digital technologies continuing to create disruption in the enterprise, CEOs require a partner in the C-suite who can combine technology expertise with business skills to successfully navigate the change.

Indeed, in today’s marketplace, CIOs are no longer just responsible for “keeping the lights on”, they play a pivotal role in making the company successful and driving the use of new technologies in a way that adds value to the business.

Gone are the days when CIOs had to manage the day-to-day processes of transactional systems that record everything but provide the business with little operational value.

Consumer technology has worked its way into the enterprise, which means that today’s CIO needs to manage on-premises systems and infrastructure, as well as cloud-based systems.

In addition, they need to deal with structured data coming from sources such as spreadsheets and databases, alongside unstructured data being generated from channels like email and social media.

Ultimately, it translates into dealing with the consumer-grade offerings often favoured by users, but still applying the enterprise-grade security required by customers, regulators and shareholders.

Fundamentally, the end-user’s priority is simplicity and ease – which employees tend to favour over data classification processes or regulatory compliance.

However, mismanaged information across an organisation can cost the enterprise millions of pounds in costs associated with litigation and compliance.

Increasingly, CIOs are assuming more responsibility for massive volumes of enterprise information and the systems that house the data. Yet, with this responsibility comes the key challenges of data governance, information security and authenticated access.

As such, today’s CIOs are responsible for keeping information processes simple but secure and compliant – and at the same time having to unlock the unrealised business potential that IT can enable.

Digital governance

It is therefore no surprise that research from analyst firm IDG shows that, according to 140 technology leaders, an enterprise information management (EIM) strategy should be a top priority for CIOs and IT business executives.

Indeed, CIOs need to be responsible for implementing the most effective digital governance frameworks across the business, in order to ensure that information is secure and managed correctly.

While the role of information management is undeniably paramount, many CIOs struggle to know where to start.

Digital technologies stretch across every facet of a business, and organisations have rapidly growing amounts of information to deal with on a day-to-day basis.

At the same time, it’s becoming even more crucial that people within the business have quick and secure access to the information available, whenever it is needed.

To solve this challenge, CIOs need to combine information management platforms with digital governance frameworks, both to secure the flow of information throughout the business and to provide a platform for digital transformation.

For many organisations, the CIO’s ability to effectively drive digital transformation will ultimately decide whether the company remains competitive or not.

Next wave of transformation

The role of the CIO continues to evolve, and it centres on embracing disruptive technologies to truly deliver the operational excellence and business enablement that new technology affords.

The pace of change is not relenting, and the next ten years are shaping up to be just as disruptive as the last.

Embracing the next round of disruptive IT will require more dramatic changes to internal operations and customer relations.

With products and services becoming increasingly commoditised, using technology to deliver business benefits will become the key differentiator for enterprises looking to get ahead.

This is where the transformational CIO has an opportunity to lead the way. CIOs that adopt this approach will be able to create IT infrastructures that improve the quality of products and services provided, whilst simultaneously driving efficiencies and cost reductions.

Source: The transformational CIO: a customer-centric visionary by Ben Ross

SAP users plan in majority for Hana and hybrid

SAP customers have a lot of work to do to make the most of their SAP environments before they can move to systems based on Hana, the supplier’s high-speed, in-memory database appliance.

The UK & Ireland SAP User Group surveyed 120 member organisations in advance of one of its symposia and found the vast majority of SAP users (93%) find optimising their SAP systems difficult.

Barriers to optimisation include lack of money (60%), lack of time (53%), and lack of knowledge and skills.

The survey also found that 63% of responding organisations had moved or planned to move to Hana. The results showed 10% of respondents are already on Hana, 22% are planning to use it in the next three years, and 31% have it on their roadmap for three or more years’ time.

In the response, two in three organisations also said their underlying infrastructure would be hybrid (on-premise and cloud) in three years’ time.

Paul Cooper, vice-chairman of the user group, said it is clear many users are looking to move to SAP Hana and run SAP on a hybrid infrastructure. However, to ensure a smooth transition and fully realise the benefits, users need to optimise their existing SAP landscape.

“The survey shows that optimisation remains a challenge for the majority of users, both from a resources and technological perspective,” said Cooper. “Nearly half of our respondents (47%) stated that customisations to their existing landscape had hindered the speed at which they could implement enhancement packs and/or adopt newer platforms from SAP, such as cloud and Hana.”

Cooper said the user group’s upcoming symposium would cover “simplify to succeed, document management, Solution Manager and preparing for the future”, including the S/4 Hana enterprise resource planning system, announced in February, and hybrid cloud.

“It is our first big event of the year, and our members want information. It’s early in our understanding of S/4 Hana, and this event helps us gauge topics for 2015, including the conference in November,” said Cooper.

Nearly three-quarters (72%) of respondents stated they simply did not know if S/4 Hana would help them run simpler in the future.

Bigger data

Managing growing data volumes is another area of difficulty. In the survey, 65% said their organisation’s data volumes have increased in excess of 40% over the past three years. Many respondents (53%) face obstacles with respect to data accuracy and governance.

Most respondents (91%) thought SAP could do a better job of making customers aware of the products and services it offers to help with data management.

“The communication needs to be more targeted rather than blanket,” said Cooper.

Three-quarters of users said they had used SAP Solution Manager in the past three years, but nearly half (45%) said they were unaware of SAP’s Expert Guided Implementations (EGI) service.

Cormac Watters, managing director of SAP UK & Ireland, said SAP is committed to building long-term, strategic partnerships that help customers maximise their SAP landscape, from traditional enterprise resource planning (ERP) to new innovations such as S/4 Hana.

“We are working closely with the user group to provide details of the SAP Service and Support portfolio in a way that it is accessible and easy to use,” said Watters.

The survey questioned 120 of the 500 user organisations in the UK and Ireland. The Optimising your SAP Landscape symposium takes place in London on 28 April 2015.

Source: computerweekly-SAP users plan in majority for Hana and hybrid by Brian McKenna

Shadow technology: Four ways to reduce its use, minimize its impact

For years, the centralized IT department, responsible for managing all IT needs for the enterprise, has been losing power — and control of technology resources — within the corporate structure as it gets easier and easier for business units to purchase their own technology. C-level executives are grappling with how to react to the shadow IT dynamic. In this webcast presentation, Derek Lonsdale, IT transformation leader and Lean expert for PA Consulting, says that in order to reduce the use of shadow technology and minimize the disruption caused by it, centralized IT has four main tasks. Find out what those tasks are and get advice on how to tackle them.

The following is a transcript of the first of five parts of Lonsdale’s webcast presentation on shadow technology.

Derek Lonsdale: I always think about IT as the life cycle of plan, build and run, and the way you move into development and operations. And I think, today, businesses tend to buy external help for the build and run options within IT rather than the planning side, and it can be strategic purchasing of IT services or tactical purchasing of IT services, but we do know that the concept of having a centralized IT function that manages all of IT for the enterprise is virtually disappearing. In today’s virtual world it becomes much easier for businesses to buy their own IT and we’re certainly seeing lots of examples of this.

It’s not a random activity. We’ve seen some tactical purchases, but increasingly, many are strategic — [for example] Salesforce.com, Google Docs and others. … And we know that renegade IT, or shadow IT, comes with additional risk and potential inefficiencies, but the business [units] keep on doing it. And why do they insist on buying their own IT when they’ve got an IT department that can provide some of the services? And I think it’s fairly simplistic. The business buys IT because IT can’t do things fast enough for them. The business wants it faster and cheaper, they think they can get to the business outcome much faster than if they use their own IT department.

And I think IT does need to do some basic things. It needs to earn the right to deliver services. It has to fix the basics. Whether it’s server provisioning or onboarding or just desktop provisioning, it has to fix those basic things. It has to understand the business better. We know that technology is increasingly becoming more integral to the business outcome, and in some cases it is the business outcome. It’s likely to become more prevalent. We know that if IT doesn’t get to understand the business better and how the IT services can add value to the business, then renegade IT will probably get more and more used by the business.

So, even if IT transforms into a function that is entirely focused on adding value to the business, there’s still going to be some external partners that provide niche services that IT is just unable to compete with. And if that’s the case, I think we need to be able to embrace the change. Renegade IT is here, but what can IT do to improve it? Can they provide some subject matter expertise in IT design, in IT procurement, into managing relationships with third-party providers? There’s a whole bunch of things that IT can help with and if we embrace that change and provide that expertise, we are more likely to get the business on our side.

Improving governance and supporting processes to improve business and IT engagement … is all about the ways of working. And if we can provide a single point of contact for the business through good governance and have the supporting processes, then I think we can help improve the way that we work together.

One of my colleagues, Deepak Bharathan, wrote an article last December, and the title was a bit of a scary subject. It’s about cutting your IT budget — all of it. I know it’s forward looking, and it goes on to say, ‘Computers will vanish as they get embedded. Computing is already embedded in everything that we use, whether that’s communications, entertainment, transportation, fashion. No matter what company you work for, what product you dream of making, IT is omnipresent. Flashy gadgets [might] continue to wow us, but you’re more likely to see computing technology in everyday items, [like] clothes and toothbrushes.’

He finishes off by saying, ‘If technology is now an inherent part of all businesses, is it right to think about IT investment as if it was a separate bucket?’ And I think that thinking fits nicely to renegade IT. Can we afford to separate renegade IT from what we do within the rest of IT, or do we embrace it? And I think IT has a hard time demonstrating value to the business today and this increases instances of renegade IT.

But as I said earlier, there are risks and inefficiencies in renegade IT. So, how can [IT] make sure … that we’re included in every IT conversation with the business. I think it goes back to these four bullet points:

  • We need to earn the right to deliver services,
  • We absolutely need to understand the business better,
  • We need to embrace this change and provide some subject matter expertise, and
  • [We need to] improve the governance and supporting processes.

And there’s some roles within this governance that I’ll talk about a little bit later in more detail, [including] things like the business relationship managers — an essential role [in] managing the business and decreasing the use of renegade IT.

Questions about this webcast presentation on coping with shadow technology? Email stroy@techtarget.com.

More on Enterprise ITIL and ITSM

  • Fixing broken IT service management processes

    VIDEO – Find out why it’s so important for centralized IT departments to fix IT service management problems as part of their efforts to minimize and mitigate shadow IT.

  • Understanding business processes helps ward off shadow IT

    VIDEO – Learn about the role understanding business-facing processes plays in buttressing the IT department to better compete against shadow IT.

  • Internal vs. external customers? CIO Brook Colangelo doesn’t buy it

    VIDEO – At Houghton Mifflin Harcourt, IT is adopting ‘radical simplicity’ to better serve its internal and external customers.

  • #CIOChat: Taking shadow IT into the light

    News – Shadow IT doesn’t have to be in the shadows. As the debate over how to handle shadow IT continues, it’s often unclear how best to respond. Join #CIOChat on March 25 at 3 p.m. EST to discuss enterprise shadow IT management.( Mar 20, 2015 )

  • Embedded IT vs. central IT: Finding common ground

    Feature – ITSM expert George Spalding says that the key to solving problems between embedded IT and central IT is a common language based on services and not technology components.

  • project scope

    Definition – Project scope is the part of project planning that involves determining and documenting a list of specific project goals, deliverables, tasks and deadlines.

  • Three tips for a better IT service model

    Tip – With IT budgets moving out of the IT department, these three best practices will help put your IT organization in a better position to compete with outside IT services companies.

  • gap analysis

    Definition – A gap analysis is a technique used to assess the differences between the current and desired performance levels of a company’s systems or applications, as well as determine how to meet those requirements.

View the presentation at: http://bcove.me/jwj14sq6

Source: SearchCIO-Shadow technology: Four ways to reduce its use, minimize its impact

Data center components that deserve an update this year

In the next 12 months, consider making a few upgrades to the data center. The TechTarget advisory board explains which ones are the most important.

In the next 12 months, the data center will experience its biggest review yet. IT staff will stop asking, “What updates should be applied?” and start asking, “What is the data center for?”

That’s what Clive Longbottom, analyst at Quocirca and SearchDataCenter advisory board member, forecasts. The SearchDataCenter advisory board picks the most important data center component upgrades for the next 12 months, and shares how to implement them into your infrastructure — and budget.

Clive Longbottom, co-founder and service director at Quocirca: Outsource

Self-owned and managed data centers are becoming less attractive. Now is the time to review what equipment, software and data can be updated or outsourced, and what has to remain under direct control.

Any data center upgrade has to include flexibility. The business may decide that some parts of the existing platform should remain under its direct control, but that ruling can change. Set up the IT architecture so that workloads can move around without trouble.

In many cases, a software as a service (SaaS) model makes sense. SaaS offers greater flexibility, predictable costs and continuous delivery.

Colocation also can make more sense than an owned data center facility, as it allows for elastic growth and reduction of required space as the hybrid model of computing progresses.

To prepare for a move to colo or anything as a service:

  • Perform an asset survey of what is actually in the data center at the hardware and software level;
  • Carry out a usage survey of what is really being used, and why;
  • Cleanse data, preferably including master data modeling for better veracity with new data;
  • Rationalize software instances and versions to the lowest possible number;
  • Consolidate workload via virtualization onto the optimum hardware stack;
  • Virtualize on VMs or containers to make workload transitions from the existing data center platform to colo, hosted, or infrastructure or platform as a service as easy as possible; and
  • Automate the movement of workloads into the cloud via appropriate tools.
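Two of the steps above, the asset survey and the rationalisation of software versions, lend themselves to simple tooling. The sketch below is illustrative only; the inventory records and the "pick the newest version" policy are assumptions, not a prescription.

```python
# Illustrative sketch of two checklist steps: roll up an asset survey
# by product, then propose one consolidation target version per product.
# The inventory data is hypothetical.
from collections import defaultdict

inventory = [
    # (host, product, version) as an asset survey might record them
    ("db01", "postgres", "12.4"),
    ("db02", "postgres", "13.1"),
    ("app01", "java", "8"),
    ("app02", "java", "11"),
    ("app03", "java", "11"),
]

def rationalise(assets):
    """Propose one target version per product (here: the newest seen)."""
    versions = defaultdict(set)
    for _host, product, version in assets:
        versions[product].add(version)
    # Pick the highest version per product as the consolidation target.
    return {product: max(seen, key=lambda v: [int(x) for x in v.split(".")])
            for product, seen in versions.items()}

targets = rationalise(inventory)
```

In practice the target version is a business decision (support contracts, application compatibility), but a script like this quickly shows how far the estate has drifted.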

Robert Crawford, systems programmer: Skills

Stuff in the data center gets updated all the time. This is the year to update your skills.

Updating your skills is always important. For most of us, the problem is finding the time. This year, make a little time in the work week for training.

The mainframe makes learning relatively easy for the autodidact. You can pick up valuable information by looking around in a dump of your favorite address space (advanced level: use an unfamiliar IPCS verb exit to find something you may have missed).

This is also an interesting time for Assembler programmers. IBM’s latest processor, the z13, comes with its own Principles of Operation manual. Take time to familiarize yourself with the new instructions.

If you want a broader mainframe update, there are other new features to explore. For instance, CICS and IMS have recent enhancements for mobile computing. IBM has a z/OS download page with some interesting tools, and user organizations such as SHARE are also helpful.

Carrie Higbie, global director of data center at The Siemon Company: Fabric networks

Expect major reconfigurations surrounding fabrics in data centers. Top-of-rack switching is overly expensive, wasteful and poorly suited to virtualized environments.

Fabrics and Layer 2 switching are the next wave for data center networks. Three-tier switching methods using Layer 3 networking leave half of a company’s network investment waiting around for the primary half to fail. If something goes down, all traffic stops while the network is reconfigured. Further, virtualized servers do not work well over Layer 3 networks and in many cases, servers must share a switch for software migration to work. With fabrics, a layer or two of switching disappears.

Few data centers have sufficient power to install enough servers to populate a top-of-rack switch, leaving many stranded, unusable ports. Implementing fabrics over 10GBASE-T saves money in switch spend and ongoing power and maintenance costs. Fabrics also lead to software-defined networking.

Examine the entire data center ecosystem to find the best tool and right cost. Re-evaluate vendors on fabric and SDN implementation, as well as network virtualization strategies. The benefits are significant with non-blocking, self-healing architectures. Additional benefits include active/active components where the primary and secondary sides both pass packets effectively, doubling bandwidth over the two networks.

Higher-speed backbones will come into play when companies upgrade to support 40 and 100 GbE technologies. This can be done via IEEE standard products or proprietary products. There is the risk with proprietary products of tying your data center’s performance to a vendor’s roadmap.

Involve the entire data center team in design decisions for massive, costly upgrades. Designing around a single application that lasts two to three years is wasteful.

Sander van Vugt, independent trainer and consultant: Object storage

Data center teams should change the way they think about storage. To many data center teams, storage means storage-area network (SAN) — which means big money.

Cloud environments are different. Object storage allows data center admins to scale out their storage needs without any limit, using cheap disks instead of expensive Serial Attached SCSI. The Ceph object storage tool, for example, has recently matured to be a serious option for enterprises.

Break away from vendor lock-in on storage tools. Proprietary SAN isn’t faster — it’s only more expensive. So in the next couple of months, data center administrators should start exploring object storage.
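The shift van Vugt describes is as much conceptual as technical: object storage replaces the block devices and LUNs of a SAN with a flat key-to-blob namespace spread across many cheap disks. The toy below illustrates that idea only; it is not Ceph's actual CRUSH placement algorithm, and the key names are made up.

```python
# Conceptual sketch (NOT Ceph's actual CRUSH algorithm): object storage
# presents a flat key -> blob namespace and spreads objects across many
# cheap disks by hashing the key, so capacity scales out by adding disks.
import hashlib

class ToyObjectStore:
    def __init__(self, num_disks: int):
        # Each "disk" is just a dict here; in practice, a commodity drive.
        self.disks = [dict() for _ in range(num_disks)]

    def _placement(self, key: str) -> int:
        # Deterministic hash of the key decides which disk holds it.
        digest = hashlib.sha256(key.encode()).hexdigest()
        return int(digest, 16) % len(self.disks)

    def put(self, key: str, blob: bytes) -> None:
        self.disks[self._placement(key)][key] = blob

    def get(self, key: str) -> bytes:
        return self.disks[self._placement(key)][key]

store = ToyObjectStore(num_disks=8)
store.put("backups/2015-06-01.tar", b"...archive bytes...")
data = store.get("backups/2015-06-01.tar")
```

Because placement is computed from the key rather than looked up in a central table, clients can locate objects without a bottleneck, which is what lets object stores scale out where SANs scale up.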

Robert McFarlane, principal at Shen Milsom & Wilke: Contain and monitor

Assuming you’ve done the basics — blanking panels, blocking bypass air paths and removing comatose hardware — go to containment to segregate hot and cool air. Even if you found it too difficult or expensive in the past, look again.

New products make working within the newest fire suppression standard (NFPA-75) easier. Whether you use hot aisle or cool aisle, full or partial containment, the cooling improvements and energy savings are well worth it.

Monitoring is also an important data center upgrade to make this year. Consider implementing data center infrastructure management (DCIM) tools, even if you can’t afford, justify or support a whole suite. The best DCIM packages are modular, so start small and add as you’re ready. It takes time to learn from the information, particularly if more is reported than you can use. A good DCIM offering distils huge volumes of data into easily usable information.

If full DCIM is not viable, start with smart power distribution units (ePDUs or CDUs). Include a basic power monitoring package and start tracking your loads and phase balance. If you haven’t already, network them. You don’t need to waste 10-Gb ports on your central switch; cheap switches on a separate small network are fine for this upgrade. Try adding temperature and/or humidity monitoring to the strips.
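The load and phase-balance tracking that smart PDUs enable boils down to a simple calculation. The sketch below is a hedged illustration: the per-phase readings are hypothetical, and the acceptable imbalance threshold varies by site.

```python
# Sketch of the load tracking smart PDUs enable: given current draw per
# phase (amps), compute the phase imbalance. Readings are hypothetical;
# a common rule of thumb keeps imbalance within roughly 10-20%.

def phase_imbalance(amps_per_phase):
    """Max deviation from the mean phase current, as a percentage."""
    mean = sum(amps_per_phase) / len(amps_per_phase)
    if mean == 0:
        return 0.0
    return max(abs(a - mean) for a in amps_per_phase) / mean * 100

readings = [14.0, 10.0, 12.0]          # amps on phases A, B and C
imbalance = phase_imbalance(readings)  # about 16.7% here: phase A runs hot
```

Polling figures like these from the PDUs over that separate monitoring network, and trending them over time, is the "start small" entry point to DCIM described above.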

Source: TechTarget-Data center components that deserve an update this year by Sharon Zaharoff

Changing roles: the CIO as transformer

The successful implementation of ‘game-changing’ transformational technology lies in the back office

The speed of change in the technologies available to organisations over the last five years has been astonishing. In enterprise organisations across EMEA, there is widespread recognition that IT has become an enabler of business and a key force in driving innovation and new ways of working.

As a direct result, the role of the CIO has fundamentally shifted; IT now has a place at the top of the table, advising the Board on strategic business investments.

The continued rise of new technologies that fundamentally change the way businesses operate and engage with their customers is responsible for this dramatic renaissance for the IT department.

Gartner calls this the ‘Nexus of Forces’, the ‘convergence and mutual reinforcement of four interdependent trends which empower individuals as they interact with each other and their information’. These are macro technology trends that are, in no small part, responsible for revolutionising enterprise business today.

We are of course talking about trends such as social, mobility, cloud and information (or data). They have disrupted existing business models and the relationship between IT, the wider organisation, and its customers.

Lines of business and end users now essentially make their own IT decisions in many cases. Business leaders see these new technologies as representing a significant opportunity to innovate, improve efficiency, and provide differentiated customer experiences in highly competitive marketplaces.

CIOs should, in theory, rejoice at the CEO’s newfound interest in cloud. They should be delighted that the CFO wants to know how exploiting big data and business analytics will drive process improvement. And the chance to sit side-by-side with the heads of sales and marketing to plan the rollout of a social CRM system that will revolutionise the company’s customer engagement model is surely an opportunity too good to miss.

The challenge, however, is ensuring that repeated adoption of the ‘next big thing’ delivers maximum benefit and optimal value to the business.

>See also: The evolving CIO: change or suffer

Expectations surrounding what can be achieved through transformational IT projects – such as cloud, mobility, social, and data implementations – are high. And the vendor community is by no means innocent when it comes to fanning the flames of hype and over-promise.

That’s not to say these macro technologies can’t deliver the huge savings and efficiencies advertised on the box, but the promised improvements in business performance, and other such benefits, are being undermined by infrastructures unable to support leading-edge technology.

As a result, in many organisations, expensive applications deployed across the business are failing to reach their potential, at substantial cost to the company.

Recent research commissioned by Riverbed Technology amongst EMEA CIOs in ten countries showed that, in the past 12 months, over four in ten (44%) new applications deployed have failed to meet their performance expectations.

In many cases, there is a disconnect between what the Board wants to achieve and what can be delivered using existing infrastructure. The same research found that in only 6% of cases are Board-level executives fully mindful that new technology innovations can be hampered by poor infrastructure or network performance.

1 in 4 getting it right

However, Riverbed has identified a significant minority (25%) who are utilising technology effectively and transforming their investments into high business performance. This group, which Riverbed has coined ‘the transformers’, is leading the way when it comes to technology investment.

>See also: 5 trends that will make or break the CIO in 2014

Rather than constantly adding new applications onto existing infrastructures, transformer organisations recognise that a high-performance application infrastructure is essential to realising the potential of cutting-edge technology.

Not only are these companies enjoying greater productivity across the business, but they also expect ROI to be realised in a shorter timescale. 70% of transformers expect ROI on most or all new technology innovations within two-to-three years, compared to 57% of non-transformers.

A key factor is that when looking to optimise the value of technology investments, transformers prioritise consolidation and new architectures over other factors, whilst non-transformers prioritise maintenance and ongoing operations. As might be expected, transformers are therefore more likely to have already completed (36%) or be currently implementing (41%) a data centre consolidation project than other companies.

The transformers appreciate that technology is integral to business performance and, as such, are more likely to consider technology as a driver of innovation (66% compared to 40% amongst non-transformers).

Critically, however, network performance is ‘front of line’ amongst IT departments. Over half (53%) of transformers are delivering a high performance IT infrastructure for business applications as a core part of their role, compared to 31% amongst non-transformers.

Educating the Board about the power of IT to transform the business is a battle only half-won. Judging by the findings of the transformers report, the challenge now is to secure the right levels of investment across the IT function and develop a balanced IT operation that combines innovation with consistency.

Source: Changing roles: the CIO as transformer by Ben Rossi