Despite progress, the future of AI will require human assistance

Part one of this story on the future of AI explained how technology developments have led to a resurgence in a field that has progressed in fits and starts since the 1950s. Today’s cheap storage and amped-up and inexpensive compute power, combined with an explosion in data, have revived an interest in deep learning and “neural nets,” which use multiple layers of data processing that proponents sometimes liken to how the brain takes in information.

The field is red hot today, with Google, Facebook and other technology giants racing to apply the technology to consumer products. In the second part of this story, SearchCIO senior news writer Nicole Laskowski reports on where two AI luminaries — Facebook’s Yann LeCun and Microsoft’s Eric Horvitz — see the trend going.

Like Microsoft, IBM and Google, Facebook Inc. is placing serious bets on deep learning, neural networks and natural language processing. The social media maven recently signaled its commitment to advancing these types of machine learning by hiring Yann LeCun, a well-regarded authority on deep learning and neural nets, to head up its new artificial intelligence (AI) lab. A tangible byproduct of this renewed focus on neural nets is Facebook’s digital personal assistant, M, which rolled out to select users a few months back.

Today, M’s AI technology is backed by human assistants, who oversee how M is responding to queries (such as placing a take-out order or making a reservation) and can step in when needed. According to a Wired article, the AI-plus-human system is helping Facebook build a model: When human assistants intervene to perform a task, their process is recorded, creating valuable data for the AI system.

Once enough of the “right data” is collected, M will be built on neural nets, which is where LeCun’s team comes in. But even as the AI behind M advances to neural nets, humans will need to be in the loop to continue training the technology, according to the article.

Still work to be done

That’s because M, like most contemporary AI systems, is a supervised learning system, which means the system can’t teach itself. Instead, if LeCun wants an algorithm to recognize dogs, he has to feed it examples of what dogs look like — and not just a handful of examples. “You have to do it a million times,” LeCun said at EmTech, an emerging technology conference hosted by the MIT Technology Review. “But, of course, humans don’t learn that way. We learn by observing the world. We figure out that the world is three-dimensional, that objects move independently. … We’d like machines to be able to do this, but we don’t have good techniques for this.”
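To make the supervised pattern LeCun describes concrete, here is a minimal sketch in Python, assuming scikit-learn and synthetic data rather than anything Facebook or LeCun actually uses: the model only ever learns what its labeled examples show it.

```python
# A minimal illustration of supervised learning: the model only "knows" what
# the labeled examples tell it. Toy sketch with scikit-learn and synthetic
# data, not Facebook's or LeCun's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each image is summarized by two hand-crafted features,
# and label 1 means "dog", 0 means "not a dog".
n = 100_000                                  # supervised learning needs many examples
features = rng.normal(size=(n, 2))
labels = (features[:, 0] + features[:, 1] > 0).astype(int)

model = LogisticRegression()
model.fit(features, labels)                  # "training" = fitting to labeled data

# The model can now guess labels for new, unseen feature vectors...
print(model.predict([[0.9, 0.4], [-1.2, -0.3]]))
# ...but it learned nothing it wasn't explicitly shown, which is LeCun's point.
```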


Building machines that have a kind of artificial common sense, according to LeCun, will be the big challenge for the future of AI. “It’s done by solving a problem we don’t really have good solutions for yet, which is unsupervised learning,” he said.

One of the ways Facebook (among others) is trying to insert rudimentary reasoning into AI systems is with vector embedding, where unstructured data is mapped to a sequence of numbers that describe the text or object in detail. LeCun said the process brings together perception, reason, perspective and language capabilities so that if the algorithm encounters an unfamiliar word or image, it can make an educated guess by comparing and contrasting the rich mathematical descriptions of the unknown against the known. One of his clearest explanations about how vector embedding works had to do with language translation: “We can take two pieces of text, one in French and one in English, and figure out if they mean the same thing,” he said.
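As a toy illustration of that compare-the-unknown-against-the-known idea (not Facebook's actual embedding system; the vectors below are invented), an unfamiliar item can be matched to its nearest known embedding with cosine similarity:

```python
# Toy sketch of the "compare the unknown against the known" idea behind
# vector embeddings. The vectors here are made up; a real system would
# learn them from data.
import numpy as np

embeddings = {
    "dog": np.array([0.9, 0.1, 0.3]),
    "cat": np.array([0.7, 0.3, 0.6]),
    "car": np.array([0.1, 0.9, 0.7]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

unknown = np.array([0.85, 0.15, 0.35])   # embedding of an unfamiliar word or image

# Rank known concepts by similarity and take the closest as the educated guess.
best = max(embeddings, key=lambda k: cosine(unknown, embeddings[k]))
print(best)  # -> "dog"
```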

Facebook is not alone in taking this approach to improving AI. A recent article in ExtremeTech described the “thought vector” work Google’s doing as a way of training computers to comprehend language, which they are incapable of doing now.

The future of AI

But language comprehension is a far cry from machines that can perform the same intellectual tasks that humans perform. Developing a common sense program, or “artificial general intelligence,” as it’s called, is still a ways off, said LeCun, who shared the EmTech stage with Eric Horvitz, director at the Microsoft Research laboratory in Redmond, Wash. “If Eric were to grab his water bottle and walk off stage, you could close your eyes, be told that, and picture all of the actions he’d have to take to do that.” AI machines, on the other hand, can’t.

Sci-fi films such as Her and Ex Machina may give the impression that the future of AI is conscious machines, but LeCun and Horvitz described generalized intelligence as a really hard problem to solve. “We’re nowhere near being able to think that through,” Horvitz said. “I do think that with the great work on the long-term road toward intelligence, we’ll have competencies, new kinds of machines, and it may well be that deep competency is perceived as consciousness.”


One of the basic obstacles Horvitz is interested in solving is a classic IT problem: AI technologies were essentially built in silos. For systems to become more powerful, they’ll likely need to be knitted together. “When I say we’re trying to build systems where the whole is greater than the sum of its parts, we sometimes see surprising increases in competency when we combine, for example, language and vision together,” said Horvitz, who recently launched (and, along with his wife, is funding) the One Hundred Year Study on Artificial Intelligence at Stanford University, an interdisciplinary research effort on the effects of AI.

Exactly how AI systems should be integrated together is still up for debate, “but I’m pretty sure that the next big leaps in AI will come from these kinds of integrative solutions,” Horvitz said. Silo busting may not be as sexy as creating conscious machines, but it could lead to what Horvitz called a “symphony of intelligence.”

“With every advance, and particularly with the advances in machine learning and deep learning more recently, we get more tools. And these tools are often really interesting components we can add to older components and see them light up in new ways,” he said.

Source: searchcio.techtarget.com-Despite progress, the future of AI will require human assistance

Is Desktop-as-a-Service ready for business?

Licensing complications and an immature market limit where DaaS is useful, but desktops in the cloud are ideal for specific situations.

For companies looking to reduce the cost and complexity of Virtual Desktop Infrastructure (VDI), the attraction of Desktop-as-a-Service (DaaS) is that you can greatly reduce up-front investment. “It’s pay as you go and you only pay for what you need,” says Mark Lockwood, research director at Gartner.

VDI often costs more and delivers less value than expected, Lockwood warns. “VDI is complex and it seems like there’s always something extra you have to buy. You have to worry about bursting, you have to worry about the user experience.” With VDI, you have to pay for the infrastructure up front and depreciate it over time; you also have to buy infrastructure for your peak level of usage. “You’re going to spend a ton of money on storage, on compute, on data centers; and if people don’t use it, you still have to pay for that,” Lockwood points out.

“With DaaS, you only buy what you need and you pay for it month to month,” Lockwood says.

Appealing as that might sound, there are several reasons why you can’t treat DaaS as a replacement for VDI. For one thing, you can’t get all of the same features yet. “There are things you can do with on-premises VDI from Citrix or VMware that you can’t do today with a lot of cloud providers,” Lockwood cautions. “Persistent desktops are not available from all providers. You may not be able to do app layering. You may not be able to get a full GPU as an option. You have to be very careful not to assume that you can do with a DaaS provider what you can do with in-house VDI.”

In fact, you need to check carefully what different DaaS providers offer, because the term covers everything from Microsoft’s Azure Remote App to a full desktop that people can use like a PC. Some services offer a full stack, including capacity management and performance monitoring. Some don’t.

As with VDI, nonpersistent desktops suit task workers (in a call center or a retail store, for example) who can work from a standard, shared image. But if you want users to be able to customize their desktop — pinning programs to the Start menu and taskbar, setting up their own signatures in Outlook and generally treating the desktop like their own PCs — you need to look for persistent desktops.

Cost and complexity

“There are perhaps 15 vendors talking about DaaS,” says Lockwood, “and probably none of them have the same definition. They’re different in every way imaginable. The two biggest providers are Amazon Workspaces and VMware with Horizon Air. They don’t even offer you the same operating systems.”

VMware offers Windows 7, 8.1 and 10. Amazon has Windows Server running a Remote Desktop Session so that it looks like the client OS (and only Windows 7 at that), which can cause problems for some software that needs to run on the Windows client.

Because of licensing restrictions, the Windows Server version of DaaS is cheaper. The DaaS provider can sign up for the Microsoft Service Provider License Agreement (SPLA), buy a Remote Desktop Services (RDS) Subscriber Access License for each user (a license that’s also needed for Microsoft software like Office provided through the service), and share the hardware that Windows Server runs on among multiple customers.

For the Windows desktop OS, in most cases you have to use what Microsoft calls Dedicated Outsourcing, running your own volume licenses (with either Software Assurance or Windows Virtual Desktop Access licenses) on dedicated physical servers that the DaaS provider can’t use for any other customers. You’ll also need Client Access Licenses (CALs) for any Microsoft tools that you use to manage the desktops, like System Center, as well as licenses for the software that runs on them.

“SPLA allows the service provider to share a server running Windows among multiple clients. There’s no such thing for the desktop OS,” says Lockwood. “If I go to VMware and say I want a virtual desktop running Windows 7, they have to put me on my own server — and that’s not terribly cost effective.”

Wes Miller, licensing specialist at analyst firm Directions on Microsoft, points out several other complications. “Without truly dedicated hardware for each tenant, which is generally antithetical to cloud scale, there is no way to license the Windows desktop OS for use in a server-based desktop ‘as a service’,” he warns. “You bring your own VDA license to the provider offering you dedicated hardware. In essence, it’s ‘your’ hardware.” If the desktops access any of your on-premises systems, you’ll need the relevant CALs for that.

A Microsoft Developer Network (MSDN) subscription does include the right to run a Windows 10 virtual machine on Azure, but that’s only for development and testing, not for production use.

In short, he says, don’t expect to save any money on licensing with DaaS. “At this point, the primary benefit you get from DaaS is that you don’t have to run it, and they may know how to design it for your organization in such a way that you save on hardware compared to your own implementation on-premises. But you won’t really pay less for software.”

The logical vendor to offer Windows 10 DaaS, Lockwood points out, would be Microsoft, especially with Azure Infrastructure-as-a-Service (IaaS) and Office 365 as part of a unified service. For a short time last year, the Microsoft Azure site promised that Microsoft would “enable customers with Windows Enterprise per user to run Windows 10 Current Branch for Business on Azure.” But rather than changing Windows licensing to simplify this or entering the market as a DaaS provider itself, Microsoft recently announced that later this year Citrix customers who’ve bought Windows Software Assurance on a per-user basis will be able to host Windows 10 Enterprise Current Branch for Business images on Azure through XenDesktop VDI — without an extra licensing fee.

Lockwood describes the news as “somewhat significant, but it doesn’t change the fact that Citrix is a partial stack provider. And it doesn’t really tick all the boxes of a real Microsoft all-in-one offering, which could give you everything from IaaS to DaaS with OneDrive and Office 365 included. In reality, this is Microsoft bending their ‘no desktop OS on Azure’ rules for their friends at Citrix, but the question must be asked: Will Microsoft do it themselves next?”

To Miller, “this is the beginning of the news we’ve been expecting to hear,” but he notes that the announcement doesn’t cover costs or the specifics of the virtual desktops, and organizations are still responsible for their own licenses. This might be a significant development for business-ready DaaS, but there are a lot of questions still to ask.

Spec your desktops

It’s important to remember that not all your desktop users will be using simple desktop applications. If they’re doing computer-aided design (CAD), financial modeling, video editing or complex graphics editing, not all DaaS services will offer the right specifications. “Most DaaS providers have a limited catalog of disk space, CPU and memory,” warns Lockwood. “You’ll struggle to find more than two or three providers who can offer you a powerful machine with lots of disk space and high performance GPU — and if they do, it’s expensive.”

That might change later this year when Windows Server 2016 launches — and if the XenDesktop on Azure offering comes after that launch, it could take advantage of new options. The improvements to the RemoteFX support that shares GPUs between client virtual machines (VMs) include support for OpenGL 4.4 and OpenCL 1.1, as well as DirectX 11, and the ability to give each VM dedicated video memory. That means demanding software like Photoshop, ArcGIS and CAD packages will have far better performance — and applications will be able to use the virtual GPU for hardware acceleration, like applying filters in Photoshop.

Windows Server 2016 will also allow hardware pass-through of the GPU using Discrete Device Assignment, so the guest VM will get full access to the graphics card, including using the native graphics driver, which will allow DaaS providers to offer more powerful VMs for compute and visualization. (Azure is using that for the N-series VMs that are currently in preview, but they run Windows Server 2016.) And, again, expect those options to be pricey.

You also need to think about connectivity. If users need to access servers that are in your data center, remember that those will no longer be on the same network as their PCs, so they may notice a delay when opening files or loading data. Check if the DaaS provider charges for data coming in and out of the service. Such costs can add up quickly.

Just because you don’t have the up-front investment in infrastructure doesn’t mean that DaaS is cheap. Lockwood’s research shows that a quarter of Amazon Workspaces implementations had negative ROI after a year, even though it’s less expensive than VMware’s offering. “Discounts might be available for larger customers, but looking at the list price, $40 a month adds up really quickly,” he warns. “In eighteen months you’ll have paid much more than you would have for a physical machine or for VDI. The cost is prohibitive for indefinite use right now.”
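A back-of-the-envelope calculation shows why. The $40-a-month list price comes from Lockwood; the up-front cost of the alternative below is an assumed figure, purely for illustration:

```python
# Back-of-the-envelope cost comparison behind Lockwood's warning.
# The $40/month figure comes from the article; the up-front alternative
# cost is an assumption for illustration only.
DAAS_PER_MONTH = 40          # list price quoted in the article, per desktop
UPFRONT_ALTERNATIVE = 600    # assumed cost of a physical machine or VDI seat

for month in (6, 12, 18, 24, 36):
    cumulative = DAAS_PER_MONTH * month
    comparison = "more" if cumulative > UPFRONT_ALTERNATIVE else "less"
    print(f"{month:2d} months of DaaS: ${cumulative} "
          f"({comparison} than the ${UPFRONT_ALTERNATIVE} up-front assumption)")

# At 18 months the cumulative DaaS spend is $720, which is why indefinite
# use looks expensive while short-term, seasonal use looks attractive.
```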

Lockwood expects the DaaS market will settle down as it matures, “but the market hasn’t even decided what it is yet.” Even so, there are a handful of situations when DaaS will prove useful. For example, it’s ideal if you need to provide a large number of desktops for a short period of time. Similarly, DaaS can be an inexpensive way to give developers a secondary machine for testing, but you’ll need to make sure they get sufficient performance, otherwise an IaaS service like Azure may be a better choice.

“If you’re acquiring another company and you very quickly need to give them your company’s experience, so they can get to SAP or use your health benefits system,” DaaS is useful, Lockwood suggests.

It’s also well-suited to proof of concept work. “If you’re testing Windows 10, do you want to buy machines so teams can test their apps and re-image them again when they’re done? Is it feasible to buy a whole bunch of infrastructure for VDI that you only need for two months to do app remediation? If you use DaaS for that, you can turn it off once you don’t need it any more, and you don’t even need to pay for it on the weekends.” It’s worth noting that the Citrix Azure service will include AppDNA, Citrix’s application migration tool, to help you move to Windows 10.

Lockwood expects DaaS prices to drop, but even at current prices it will work well for companies with seasonal workers (for example around tax preparation time). “These people need computers for two or three months a year. Is it a good idea to buy two rooms full of laptops that you’re going to ship out to them and then get back, and have to update the anti-virus and re-join them all to the Active Directory because they’re expired, or have VDI infrastructure that’s in use for two months of the year but that you have to depreciate the whole time?”

The appeal of DaaS is that you can give up the burden of deploying, managing and supporting desktops and configuring remote access, without the up-front costs of VDI. But make sure you can give all your users good enough performance, pick the scenarios where it’s a good fit, and remember that it doesn’t really remove any licensing headaches.

 

Source: CIO-Is Desktop-as-a-Service ready for business?

The End of Unemployment

Recently there was an article about how the driverless truck is coming, and how it’s going to automate millions of jobs. Similarly, intelligent automation of the workforce has become a heavily discussed topic since the World Economic Forum DAVOS 2016 gathering, where the fourth age of the industrial revolution was an area of focus. The discussions got me thinking about the economic indicators used to measure economic activity.

When machines displace an industry completely, like truckers or taxi drivers, how do we account for this in our economic indicators? As of today, the number I’m fascinated with is the unemployment rate.

Unemployment Formula

The unemployment rate is the percentage of the labor force that is currently unemployed, was available for work in the last four weeks and was actively seeking employment during that period.

Unemployment Rate = Unemployed/(Employed + Unemployed)

Short Term Impact in Unemployment

Now what happens when automation replaces a job:

Unemployed increases — you lose your job to a robot

Employed decreases — a robot, not a human, is now employed.

Therefore, Unemployment Rate increases
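A toy calculation makes the arithmetic concrete (the labor-force numbers below are invented purely for illustration):

```python
# Toy numbers to show the short-term effect described above.
def unemployment_rate(unemployed, employed):
    return unemployed / (employed + unemployed)

employed, unemployed = 95_000, 5_000
print(f"before automation: {unemployment_rate(unemployed, employed):.1%}")  # 5.0%

# Robots displace 2,000 workers: Unemployed rises, Employed falls.
displaced = 2_000
employed -= displaced
unemployed += displaced
print(f"after automation:  {unemployment_rate(unemployed, employed):.1%}")  # 7.0%
```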

Medium Term Impact in Unemployment

Now this is where things get interesting — let me caveat the short-term effect with the fact that discouraged workers may be able to find work and get added back to the employed variable. My counterargument is that in a world where automation proliferates at a rapid pace, discouraged workers (especially blue-collar and back-office workers, where automation will hit hardest) will stay discouraged, as the rate of automation will outpace the rate at which one can retool a skill set later in a career. So now

Unemployed increases/decreases — robots take jobs, so Unemployed goes up; discouraged workers give up against the machine and attempt to retool — they leave the equation.

Employed goes down — robots, not humans, take the jobs.

Note: the question I’m still trying to answer is whether discouraged workers outpace unemployed workers in the medium term. If jobs are being replaced faster than the unemployed move to discouraged, unemployment goes up. If the opposite occurs, unemployment goes down.

Therefore, the Unemployment Rate fluctuates quite a bit
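Extending the same toy numbers, the sketch below shows why the measured rate can swing either way once displaced workers become discouraged and drop out of the labor force (again, the figures are invented):

```python
# Continuing the toy example: discouraged workers leave the labor force
# entirely, so they disappear from both the numerator and the denominator.
def unemployment_rate(unemployed, employed):
    return unemployed / (employed + unemployed)

employed, unemployed = 93_000, 7_000          # state after the short-term shock
print(f"before discouragement: {unemployment_rate(unemployed, employed):.1%}")  # 7.0%

discouraged = 3_000                           # give up searching, leave the equation
unemployed -= discouraged
print(f"after discouragement:  {unemployment_rate(unemployed, employed):.1%}")  # ~4.1%

# If automation displaces workers faster than they become discouraged, the
# measured rate rises; if discouragement outpaces displacement, it falls.
```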

Permanent Impact on Unemployment

If we continue down this road of AI & Automation, front office and white collar jobs will be severely impacted too — not to say they won’t be in the short term but usually people in power find ways/excuses to stay in power … “who will manage the robots?”

Also, if we’re trying to automate our lives and have Automation & AI run the show, the cycle of becoming unemployed and then discouraged doesn’t remain a fluid two-way mechanism, with people coming in and out of the process, but rather becomes a permanent state.

Unemployed worker = Discouraged worker = ZERO

Therefore, Unemployment Rate is ZERO.

Great, my country is at zero unemployment — now what?

The unemployment rate premise was based on a foundation that there was fluidity in the job market, humans do the work, and market cycles create new job opportunities. With the rise of the fourth industrial revolution and the upsides we gain in Automation & AI, none of that foundation exists. We may get pretty close to an unemployment rate of zero, but that also means we have a record number of people sitting on the sidelines. What happens to the newfound fortune of time for these retirees:

1. We Party: we no longer need to work so we indulge in all of life’s enjoyment. We go socialist and the government takes care of us because it has nothing else to worry about — the market becomes instantaneously perfect (supply & demand are now dictated by the discrete, predictable actions of a computer). We become reclusive, gluttonous, TV-watching zombies.

2. We Revolt: income inequality shoots through the roof, GDP tanks because households don’t have income to consume goods, Automation & AI are used to cut out the lower and middle classes, crime is through the roof, and we are left with no choice but to revolt against the machines.

3. We Do What Matters: we move past trivial mundane tasks that are our jobs, and we start attempting to answer the more important questions — and maybe that’s the point.

Automation & AI should help free up our time (in both our personal & professional lives) so we can focus on the stuff that really matters.

Source: Medium-The End of Unemployment

How CIOs can use 3D printing technology to their advantage

CIOs may want to label 3D printing — a type of additive manufacturing that constructs layers of plastic, resin or metal into three-dimensional objects — a design or a product development tool and, therefore, technology that falls outside the senior IT manager’s purview.

But, according to experts, the perspective is short-sighted. If 3D printing technology is brought into the enterprise, it will connect to the network, suck up bandwidth, produce data that needs to be collected and secured, and have an impact on file retrieval. If 3D printing is outsourced, the business will need to shuttle large files between the enterprise and 3D printing service bureaus — files that contain sensitive intellectual property that will need to be protected.

Any item on the above list is reason enough for the CIO and the IT organization to make a case for their involvement in 3D printing. But another? Proactively introducing or supporting 3D printing technology in the enterprise could help CIOs establish IT as a business innovation partner, a shift many IT experts argue is necessary as companies increasingly rely on technology to perform just about any business function.

Taking 3D printing out for an enterprise spin, however, won’t be easy. While the business headlines suggest that 3D printing tools are consumer-ready — Mattel recently announced it will sell a 3D printing toy maker for kids this fall — experts caution that the enterprise 3D vendor market is immature. The business applications are still largely confined to manufacturing companies and the technology itself — for all the advances made since its introduction 30 years ago — is not standardized.

Another issue for CIOs? Can they get in on business technology conversations early enough to earn the business’ trust and prove their department’s worth?

Fragmented market

The first hurdle CIOs and their companies will face is the immaturity of the technology itself. Experts describe the vendor market as evolving but still nascent and fragmented — and not changing any time soon.

“There is no dominant player — there is no dominant printer — that can solve all of your needs,” said Sophia Vargas, a Forrester Research analyst who has been watching the 3D printing market for two years.

In fact, to date, experts said the most popular use case for 3D printing technology is still rapid prototyping, an iterative process of building models that can be quickly tweaked. One of the reasons for the limitation is that printers can often print products in one type of material — say a certain type of plastic or ceramic — but not in multiple types of materials. Yet, as Vinod Baya, director at PricewaterhouseCoopers’ Center for Technology and Innovation, said, “Most products we use aren’t made with one material.”

He said the technology will need to overcome three challenges if it wants to go from a “prototyping phase” to a “production phase”:

Performance. Baya said 3D printers need to get faster, more accurate and more consistent. “Right now, in some cases, you print a product on one printer, and if you print the same product on another printer, they could be a bit different,” he said.
Printing in multiple materials. Machines that can 3D print in multiple materials are beginning to show up on the market, but they’re still young. And while they can print in multiple types of resin, for example, they still can’t print in a diverse set of materials, say metals and plastics.
Printing complete systems. Printers that can, for example, print a hearing aid device complete with the plastic ear piece, the circuitry and the battery all in a single go will be the market’s “biggest game changer,” Baya said. But that technology has not yet hit the market and, when asked to put a number on it, he predicted it’s at least five years out.
Using the moment to IT’s advantage

Still, 3D printing is having a moment, experts stressed. Look no further than GE for proof: This year, the company’s new CFM LEAP engines will include the first 3D-printed parts in an aircraft engine platform — 19 fuel nozzles that couldn’t have been constructed otherwise, according to a press release.

If CIOs are smart, they’ll leverage this hot — and quickly evolving — technology to burnish their IT department’s standing as a trusted innovation partner. That’s especially the case for CIOs working in verticals like retail, medical device manufacturing, industrial manufacturing, education, fashion, architecture and product design, Forrester’s Vargas said. These industries are either primed to begin experimenting with 3D printing technology or already doing so, which means the next step will be to extend the technology to other members of the enterprise, posing integration and interoperability questions for the IT department.

The good news? Introducing the enterprise to 3D printing technology doesn’t have to cost much. Pete Basiliere, Gartner analyst, said CIOs can set up a 3D printing innovation lab (what he called a “maker’s space”) for as little as $10,000. Experimenting with 3D printing gives CIOs an opportunity to be a digital thought leader to the business, said Graham Waller, a Gartner analyst and co-author of the new book Digital to the Core.

Up next: 4D printing

Skylar Tibbits, the director of Massachusetts Institute of Technology’s Self-Assembly Lab, has teamed up with Stratasys and Autodesk in an attempt to add a fourth dimension to 3D printing — time. Three-dimensional objects are printed with materials that react to specific environmental stimuli — say heat or moisture — which can trigger self-assembly or a shape-shifting transformation in the product itself. “You can think of it as the field of smart materials,” Tibbits said at EmTech, an emerging technology conference hosted by MIT Technology Review.

“The CIO can often be in a position of being the provocateur or the inspiration,” Waller said, “because often, they’ve got a purview to see across the business. This is the skill that they need to bring — to see across the core business vision, mission, strategy as well as the technology — and then combine those things so that they can distill out what is relevant and material from a strategy side.”

But that requires that CIOs get in on emerging technology conversations with the business early, according to experts. For senior IT managers who are already stretched thin maintaining legacy systems, fighting uptime fires, and keeping costs in line, finding the time to build relationships with the business can be difficult.

“We can never take our eye off the ball,” Waller said. “The trap for the CIO is that they spend all of their personal time doing that and then they don’t have any time to step up and play the more strategic role.”

Digital CIO

One piece of advice from Waller? Delegate. He advised CIOs to consider establishing an operational CIO role to oversee the day-to-day operational IT duties. That way, CIOs could ensure operational service remains excellent while carving out time to think about and talk about emerging technology initiatives with the business, he said.

The payoff in restructuring the IT department to include an operational CIO role isn’t 3D printing, per se. Instead, it’s a strategic move that could help CIOs reframe how the business looks at IT and how the C-suite regards the role of the CIO.

“This is very deep technology and digital-enabled change,” Waller said. “Often a traditional business leader, whether that’s a CEO or any other role in the C-suite, and also boards of directors, can have blind spots as to what is possible. So there’s a tremendous opportunity for CIOs to play a role there.”

Source: searchcio.techtarget.com-How CIOs can use 3D printing technology to their advantage

Is deep learning the key to more human-like AI?

Will the current resurgence in artificial intelligence turn into another AI winter? Two prominent experts in the field — Eric Horvitz and Yann LeCun — don’t think so. “It’s unclear how fast things are going to go with machine intelligence. I’m uncertain myself,” said Horvitz, director at the Microsoft Research laboratory in Redmond, Wash. “But I think it’s pretty clear that we’ll have some big developments soon that will be very valuable.”

Horvitz pointed to two “core developments” that have gotten AI to where it is today. He referred to the first core development as a “revolution of probability” — that is, a move from logic-based reasoning to reasoning under uncertainty. [This development] “stands on the shoulders of disciplines including statistics, operations research and, of course, probability decision science,” Horvitz said at a recent emerging technology conference — dubbed EmTech — hosted by the MIT Technology Review.


But it’s Horvitz’s second core development that will likely strike a chord with CIOs, who should notice parallels between the technological improvements underpinning this latest AI revitalization and those that brought big data to the forefront. “The second has been a revolution in machine learning, but this has largely been fueled in recent years by the almost limitless storage we have for capturing data, as well as the rise of the data sources through the ubiquity and connectivity of the Web,” he said. “That, along with computation, has fueled a renaissance in machine learning, which has become a core pillar of where we are today in AI.”

That’s especially true for one area of AI that’s receiving plenty of buzz these days: deep learning, a type of machine learning described by some proponents as analogous to how the human brain processes information. Yann LeCun, a luminary in the field, a professor at New York University, and the director of AI research at Facebook, described deep learning as an old idea (one that arguably dates back to the 1950s) that is undergoing a renaissance thanks to these technological developments. He pointed specifically to machines that can now handle the large data sets needed to train or “teach” deep learning systems effectively, and to graphics processing units (GPUs), “which are highly parallel — most of the new chips have something like 2,000 cores on them,” LeCun said at EmTech.

The advent of neural networks

In the 1990s, LeCun made a notable mark on a type of deep learning known as convolutional neural networks, the design for which is loosely inspired by the brain’s visual cortex. Convolutional neural networks (convnets) process data in layers to identify an image or text, starting at the lowest layer by identifying the basic features in the unstructured data and passing the findings up to the highest layers where what’s being portrayed is eventually identified. While working at AT&T Bell Laboratories 25 years ago, LeCun built a neural net that could recognize handwritten numbers, a technique that proved so successful, banks implemented the technology at ATMs for check deposits.
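The layered structure LeCun describes is easy to see in code. Below is a minimal sketch of a convnet for 28x28 digit images written with PyTorch, which is an assumption made for illustration and not the tooling LeCun used at Bell Labs:

```python
# Minimal convolutional network for 28x28 grayscale digits, sketched with
# PyTorch. Lower layers pick out simple local features; higher layers combine
# them until the final layer scores the ten digit classes.
import torch
from torch import nn

convnet = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3),   # low layer: detect simple local patterns
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3),  # higher layer: combine patterns into parts
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 10),        # top layer: scores for digits 0-9
)

dummy_batch = torch.randn(4, 1, 28, 28)   # 4 fake images
print(convnet(dummy_batch).shape)         # torch.Size([4, 10])
```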


The time in the limelight for convnets, however, was fleeting. “The machine learning community moved away from it. They did not believe this idea had any legs,” LeCun said, leading to what he referred to as the “second death of neural nets,” the first having occurred in the 1960s when researchers, including MIT’s Marvin Minsky, “exposed the limitations of the approaches people had taken so far,” he said.

Creating powerful neural nets, it turns out, involves adding more layers of processing to the mix, but in the 1960s, as Minsky argued, the technology was not up to the task. In the 1980s, researchers were still limited by technique. “People were only training networks of two to three layers because that’s all we could afford on the machines we had, mostly because the data sets were small,” LeCun said. “If you try to make the networks too big with little data, they don’t work that well.”

In cases like this, neural nets risk learning by rote only, which won’t amount to anything useful. One exception was LeCun’s convolutional net for handwriting, which had “five, six, seven layers,” he said. But by the late 1990s, the concept of neural nets had fallen out of favor and with it the promise of convnets.

For the record

In an interview earlier this year, Facebook’s Yann LeCun told Lee Gomes of the IEEE Spectrum that, “while deep learning gets an inspiration from biology, it’s very, very far from what the brain actually does. And describing it like the brain gives a bit of the aura of magic to it, which is dangerous. It leads to hype; people claim things that are not true. AI has gone through a number of AI winters because people claimed things they couldn’t deliver.”

“People didn’t believe that by just adding layers, it would work,” he said. “That was partly due to the wrong mathematical intuition.”

In the last five years or so, adding multiple layers to neural nets (10 or more) has become the norm, dramatically transforming the technology’s performance and the industry’s perception. Today, “every speech recognition system that runs on your smartphone … ended up using deep learning,” he said. It’s also taken computer vision — the robot version of human vision — by storm.

“Within the space of a year, the entire computer vision community working on image recognition switched from whatever they were using to convolutional nets,” LeCun said. “In my 30 years of research, I’ve never seen anything like this — a technique like this taking over so quickly.”

 

 

Source: searchcio.techtarget.com-Is deep learning the key to more human-like AI?

4 key workforce trends driven by digital transformation

 


Today’s ‘workplace’ is no longer a single place at all, such as a constricted cubicle with a computer. Expectations of employees and employers alike are rapidly changing regarding how and where work gets done. With the rise in mobile and internet penetration, productivity is now independent of work location.

The ability to work from anywhere has improved and traditional working hours have changed. However, there is a tremendous pressure on IT service management (ITSM) teams to make critical services available 24/7 to support the new digital enterprise.

A recent Kensington Productivity Survey found that over 60% of working professionals use multiple devices at work at least half of the time, and 90% believe integrating devices would enhance productivity. In addition, tech-savvy millennials today comprise a larger percentage of the workforce and expect a consumer-like experience at work. They want to make smart use of technology to be productive while working from anywhere, at any time.

With traditional businesses going digital, the workplace is expected to enable employees to choose the productivity tools and technology they want to use. Companies that are not modernizing their workplaces to adequately support their digital businesses will face slower growth and difficulty in attracting and retaining talent.

Below are some interesting ways companies are embracing digital to stay relevant and meet the requirements of a dynamic workforce:

1. Mobile-ready: Digital-savvy millennials have grown up using mobile phones and expect workplaces to offer the technology user experience they’re used to. They need the flexibility to work from anywhere on multiple devices with a seamless experience. A mobile-first approach can also offer unparalleled convenience and productivity to IT service support teams, along with increased customer satisfaction.

2. Appropriate distribution of apps, devices: It is imperative to give individuals easy access to appropriate tools and streamlined service delivery based on their roles, such as a ‘market analyst’ or ‘software developer’. This promotes user understanding, interest, adoption and a better overall user experience.

3. Automation: Automation is key for adoption of the digital workplace. Digital business requires a strategic approach to automation that responds to changes almost immediately. It needs to move at the speed of expectations.

Automation in the form of user self-service, for example, reduces IT staff workloads while improving employee productivity and satisfaction. Reducing the chance of human error and optimizing every step of a process also radically reduces security and compliance risks.

Empowering IT service management to support the digital business is enabling companies to provide self-service access to the answers and tools employees need based on their locations, roles and preferences. Rather than submitting a trouble ticket into a long queue or waiting on hold, the information they need is available through a browser or a mobile app, easing resolution and reducing the burden on IT staff. In addition, by solving their own problems quickly and easily, employees can get back to work promptly to serve customers.

4. Crowdsourcing: Many companies use crowdsourcing to enable employees to help IT map and manage the IT environment. Using crowdsourcing, users add assets to location-aware maps, while IT determines what information needs to be included and controls who can add what information to which maps. Employees can also report outages, providing IT with a real-time flow of asset updates. By building a repository of crowdsourced problems and resolutions, IT empowers employees to find answers to most of their questions with little effort.

Companies today need to rethink how best to utilize their workforces. Optimizing employee performance is not only about making offices more mobile and digital but also about enabling employees to get work done quickly and effectively while working anytime and from anywhere. It is about how employees experience work, and the tools and services they use to increase their success and satisfaction. By reconsidering their digital capabilities and adopting these best practices, companies will be able to raise the bar on how employees engage with customers, drive operational efficiencies and boost overall productivity.

For an enterprise, becoming ‘digital’ is not a plan for the future; it’s a transformation that CIOs need to make now. Digital empowerment is imperative for companies to stay relevant and competitive; without transformative technologies, they face inevitable extinction.

Source: tech.economictimes.indiatimes.com-4 key workforce trends driven by digital transformation

Network Collapse: Why the internet is flirting with disaster

A panel of academic experts recently took part in a discussion on the future of the internet, and among other things highlighted its fragility, the ease with which it can be disrupted and its seeming resistance to change.


The weaknesses arise primarily from the fact that the internet still runs on Layer 3 networking protocols in the TCP/IP stack that were invented many years ago.

“There are a lot of challenges for the internet. We face daily problems,” said Timothy Roscoe, a professor at ETH Zurich, the science, technology and mathematics university in Zurich.

“Most of what we do is at Layer 3, which is what makes the internet the internet.” However, new and incredibly popular services, such as YouTube, Netflix, Twitter and Facebook, have put pressures on these protocols.

New age, old protocols

Laurent Vanbever, an assistant professor at ETH, said: “There is a growing expectation by users that they can watch a 4K video on Netflix while someone else in the house is having a Skype call. They expect it to work but the protocols of the internet were designed in the 1970s and 1980s and we are now stretching the boundaries.”

The internet is often described as a network of networks. What makes these networks communicate with one another is BGP, the Border Gateway Protocol. In essence, it’s the routing protocol used by internet service providers (ISPs). It makes the internet work.

Roscoe said: “BGP is controlled by 60,000 people, who need to cooperate but also compete.” These people, network engineers at major ISPs, email each other to keep the internet running.
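As a rough illustration of what those engineers are coordinating, the sketch below shows one core idea in BGP route selection: among the routes learned for a destination prefix, prefer the shortest AS path. Real BGP layers local policy, many tie-breakers and a lot of mutual trust on top of this, which is part of the fragility the panel describes; the prefixes and AS numbers here are documentation values, invented for illustration.

```python
# Toy illustration of one core idea in BGP route selection: among the routes
# a network learns for a destination prefix, prefer the shortest AS path.
# Real BGP adds local policy, many more tie-breakers and trust assumptions.
routes = {
    "203.0.113.0/24": [
        {"next_hop": "peer-A", "as_path": [64500, 64511, 64520]},
        {"next_hop": "peer-B", "as_path": [64501, 64520]},
        {"next_hop": "peer-C", "as_path": [64502, 64530, 64540, 64520]},
    ],
}

def best_route(candidates):
    # Shortest AS path wins; nothing here verifies the path is truthful,
    # which is one reason the protocol is fragile.
    return min(candidates, key=lambda r: len(r["as_path"]))

print(best_route(routes["203.0.113.0/24"])["next_hop"])   # peer-B
```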

Source: computerweekly.com-Network Collapse: Why the internet is flirting with disaster

Cut IT Costs and Improve Service with Smart Automation

For the past 20 years and to the benefit of the C-suite in companies all over the world, smart automation software has been engineered, refined, and constantly improved by its developers to respond automatically to repetitive and resource-consuming tasks – that is, the work that nobody wants to do.

It is widely accepted that the most basic and repetitive tasks can usually be solved with software, scripts, and simple programs with even limited intelligence, making them the baseline for automation. But that scenario is so 2010: throw just a few variables into the mix and the engine would come to a grinding halt.

Today, organizations such as the largest banks, IT companies, and telecoms are moving beyond the baseline of basic automation, and it isn’t for cost savings alone. Their compliance requirements related to higher-order transactions have increased, and the stakes are much higher for them to remain in compliance. This plus the expense of using expertly trained personnel to keep IT systems running and to solve what boil down to basic and repetitive problems are key drivers in the move to more advanced intelligent automation.

Another key factor is one that has been a constant this decade: so much data flows into the company from transactions, authentications, and other business processes affecting customers that organizations need any help they can find just to keep up. It is the quintessential image of trying to drink from the fire hose, and the fire hose is ever-widening. And with the state of IT today, few organizations – even highly successful ones – have been able to increase IT spending fast enough or significantly enough to match the growing demand for IT in all aspects of the business.

If You Can’t Get Bigger, Get Smarter

What this means, in the end, is either increased costs or increased risks, neither of which is acceptable. But the good news is that the smart automation technologies available today are light years ahead of earlier automation tools, which depended heavily on hand-coded scripts and rigid run books to automate some traditional manual IT tasks. Today, smart automation engines are more capable than ever of analyzing, learning, and responding to a mountain of transactions, which can number in the thousands per minute. These transactions are critical, must be watched and responded to in real time and, as an added benefit, produce critical, valuable, and actionable data.

Source: data-informed.com-Cut IT Costs and Improve Service with Smart Automation

How to embrace the benefits of shadow IT

By making shadow IT a bad word, CIOs are ignoring the benefits of what are business-aligned systems and missing an opportunity to build a cohesive strategy and governance system that includes all the technology systems in an enterprise. Here’s how to better identify, manage and take advantage of business-procured IT.

The term shadow IT conjures up negative images in the minds of most IT organizations. Yet non-IT enterprise functions and lines of business are buying more of their own IT systems than ever before, particularly product, operations and external customer-facing groups and highly dynamic services areas. “As business functions seek to realize the benefits from these non-traditional channels of IT enablement, the shadow IT organizations are growing aggressively in order to help orchestrate and aggregate services into business consumable offerings,” says Craig Wright, managing director of outsourcing and technology consultancy Pace Harmon.

Shadow IT is not necessarily a threat to the IT organization. In fact, it can be an effective way to meet changing business needs and create a greater understanding between IT and the business. But IT leaders must do a better job of identifying, assessing and managing these once stealth systems to both manage their risk and reap their benefits. CIO.com talked to Wright about how IT organizations should rethink their relationship with this realm of IT systems.

CIO.com: The term is largely a pejorative in IT groups—or used to be. What are the legitimate reasons for concern about shadow IT?

Craig Wright, managing director of outsourcing and technology consultancy Pace Harmon: Shadow IT has traditionally had negative connotations for IT groups as it is often perceived as a serious threat to the continued existence of IT as a function.

Many IT organizations have evolved over time, morphing to accommodate major transformation projects such as ERP implementations and refreshes, re-platforming from legacy technologies to current-day solutions, and extending or contracting based on mergers, acquisitions and divestitures. As a result, the size, shape and composition of the traditional IT organization is often as confusing and complex as the myriad of technologies that are woven together into a tapestry of IT solutions that are constantly challenged to keep up with business needs.

Contrast that dynamic with shadow IT, which is often set up by the business for the business; is very well aligned with the affordability and competitive demands of the business; is easily understood because it aligns directly with business functions or products; embraces the latest and greatest technologies via SaaS, PaaS, IaaS, and other consumption-based models; and is agile by design — not as a costly retrofit.

While shadow IT often appears to win out over the traditional IT group, that is not the case where organizations have legitimate concerns in major technology areas, such as:

  • The ability to scale to deliver and support enterprise-wide solutions
  • Conformance with regulatory and quality requirements, particularly where design, construction, installation, operation, or performance [is auditable]
  • The continued use and integration of legacy platforms where there is no as-a-service alternative and down and dirty IT programming skills are required
  • The need to address the corner cases where there is no real business case, but there is an absolute technology-driven need to address obsolescence, vulnerabilities, customization, or localization requirements

CIO.com: So what’s the upside—not just for the business, but also for the IT organization itself?

Wright: Shadow IT demystifies IT. It is a trusted model, relatively inexpensive, and established along operating principles that are clear and obvious for consumers. Enterprise users of IT often have difficulties understanding the terminology and definitions of services used by IT and are even more puzzled by the costs and time to achieve desired outcomes. IT functions that recognize the value of bringing shadow IT under the IT umbrella are viewed by the business as being less intimidating and much more business intimate.

 

Source: CIO-How to embrace the benefits of shadow IT

Artificial intelligence, cognitive computing and machine learning are coming to healthcare: Is it time to invest?

With Google, IBM and Microsoft all setting sights squarely on healthcare, and analysts predicting 30 percent of providers will run cognitive analytics on patient data by 2018, the risk of investing too late may outweigh the risk of doing so too soon.

The arrival of artificial intelligence and its ilk — cognitive computing, deep machine learning — has felt like a vague distant future state for so long that it’s tempting to think it’s still decades away from practicable implementation at the point of care.

And while many use cases today are admittedly still the exception rather than the norm, some examples are emerging to make major healthcare providers take note.

Regenstrief Institute and Indiana University School of Informatics and Computing, for instance, recently examined open source algorithms and machine learning tools in public health reporting: The tools bested human reviewers in detecting cancer using pathology reports and did so faster than people.

Indeed, more and more leading health systems are looking at ways to harness the power of AI, cognitive computing and machine learning.

“Our initial application of deep learning convinced me that these methods have great value to healthcare,” said Andy Schuetz, a senior data scientist at Sutter Health’s Research Development and Dissemination Group. “Development will be driven by our acute need to gain efficiency.”

Schuetz and his colleagues are not alone. By as soon as 2018, some 30 percent of healthcare systems will be running cognitive analytics against patient data and real-world evidence to personalize treatment regiments, according to industry analysts IDC.

What’s more, IDC projects that during the same year physicians will tap cognitive solutions for nearly half of cancer patients and, as a result, will reduce costs and mortality rates by 10 percent.

Race is heating up
IBM’s Watson is the big dog in cognitive computing for healthcare, but the race is on and the track is growing increasingly crowded.

IBM rivals Dell and Hewlett-Packard are readying systems to challenge Watson, while a range of companies including Apple and Hitachi Data Systems are each taking their own tack toward AI, cognitive computing and machine learning.

A report from Deloitte in 2015 rattled off a list of other competitors, including: Luminoso, Alchemy API, Digital Reasoning, Highspot, Lumiata, Sentient Technologies, Enterra, IPSoft and Next IT.

And late last month Google and Microsoft battled it out when Google unwrapped its Cloud Machine Learning and Microsoft shot back that same week with big data analytics of its own and the new phrase “conversational intelligence” to describe its new offerings.

So don’t expect Watson to be the only “thinking machine” option moving forward.

Hurdles ahead
Among the obstacles that healthcare organizations and the intrepid technology vendors trekking toward AI, cognitive computing and machine learning will have to high-step over: data.

Data is always going to be an issue for both healthcare providers and technology vendors. Collecting it, storing it, normalizing it, tracing its lineage and the critical – if not particularly sexy – matter of governance, are all necessary so providers can harness cutting-edge software and hardware innovations to glean insights that enhance patient care.

“Translating data into action — that is the hard part, isn’t it?” said Sarah Mihalik, vice president of provider solutions at IBM Watson Health.

Achieving the transformative potential for AI, she added, is also going to require a mindset and practice shift in how providers embrace technologies and acquire talent.

The right data is essential to solving many of today’s problems but the information itself does not a lasting strategy make.

“Analytics is just one part of an overall data strategy,” said Nicholas Marko, MD, chief data officer at Geisinger Health System.

Other key pieces include: business intelligence, enterprise data warehouse, infrastructure, privacy and security.

“If you’re not focusing on how these pieces are all in motion then invariably you’re going to hit some kind of bottleneck,” Marko said. “The strategy has to be a dynamic living thing, not something you just put down on paper. There is not some secret sauce that allows you to lay down an analytics strategy. It’s a lot of hard work. Nobody has the magic solution.”

Not even technology titans.

First-mover advantage
AI, cognitive computing and deep machine learning are still nascent technologies, but consultancies are suggesting that healthcare organizations begin working with these technologies now rather than waiting.

“The risk of investing too late in smart machines is likely greater than the risk of investing too soon,” according to a report from Gartner Group.

There’s little arguing that the degree of complexity around big data in healthcare is exactly why clinicians, physicians and, indeed, the industry at large need these emerging technologies, which have felt so far away for so long.

“I have no doubt that sophisticated learning and AI algorithms will find a place in healthcare over the coming years,” Sutter’s Schuetz said. “I don’t know if it’s two years or ten — but it’s coming.”

 

Source: m.healthcareitnews.com – Artificial intelligence, cognitive computing and machine learning are coming to healthcare: Is it time to invest?