Emerging enterprise techs to watch


New technologies affecting enterprise IT continue to be invented, commercialized and adopted. The latest batch looming on the horizon includes quantum computing, gamification, reactive programming, augmented reality, transient computing electronics and Named Data Networking.

Of those six, reactive programming and gamification are probably closest to general adoption, while augmented reality awaits better hardware. Transient computing and Named Data Networking remain in the proof-of-concept stage. As for quantum computing, there are multiple ways of looking at it -- of course.

Quantum computing

A future generation of computers may be based on quantum mechanics rather than electronics. Experts are of two minds about the possibility -- which is only appropriate, considering the odd nature of the quantum world.

Quantum computing is based on the quantum bit or qubit, explains Michele Mosca, deputy director of the Institute for Quantum Computing at the University of Waterloo in Canada. While bits in conventional computers can be ones and zeroes, thanks to a quantum mechanical property called "superposition," a qubit can be simultaneously one and zero and therefore represent two values at once. This makes the number of possible configurations two raised to the number of qubits, he explains.
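Mosca's point about exponential growth can be made concrete with a classical simulation. The sketch below (plain Python, not any vendor's quantum toolkit) builds an equal superposition with a Hadamard gate and shows that describing just three qubits already takes 2**3 = 8 amplitudes:

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state [a, b],
    putting |0> into an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def tensor(*states):
    """Combine single-qubit states into one joint state vector.
    n qubits yield 2**n amplitudes -- the scaling Mosca describes."""
    joint = [1.0]
    for st in states:
        joint = [x * y for x in joint for y in st]
    return joint

zero = [1.0, 0.0]          # the classical bit 0 as a qubit
plus = hadamard(zero)      # equal superposition: amplitude 1/sqrt(2) on each value

three_qubits = tensor(plus, plus, plus)
print(len(three_qubits))   # 8 = 2**3 possible configurations
```

A classical machine must track every one of those amplitudes explicitly, which is why simulating even a few dozen qubits becomes intractable.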

Patterns in data can be extracted quickly, without having to compare all the values the data set contains. "For some problems, the speedup is astronomical, and for others the speedup is the square root of the original speed," Mosca notes. "And for some questions that have fast classical algorithms, you get no speedup."

Combining qubits with quantum logic gates, "There is a significant possibility we will have a general-purpose quantum computer in 10 to 15 years," Mosca says. And he's of two minds about that, since one problem likely to surface from an astronomical speedup is the cracking of certain types of encryption, making that data an open book. Security officials need to start looking for "quantum-safe" encryption now, he adds.

A 512-qubit memory device from D-Wave Systems.

Jeremy Hilton, vice president at quantum-hardware vendor D-Wave Systems in Burnaby, British Columbia, says that a general-purpose quantum computer may be decades away -- but that his company already sells a conventional computer with quantum components that's aimed at optimization problems. (Some observers, however, question whether D-Wave's wares are true quantum-based computers.) His firm's D-Wave 2 system has 512 qubits, and a thousand-qubit version will be available in early 2015; NASA, Lockheed Martin and Google are testing the systems.

"Third-party benchmarks show that the hardware can be tens of thousands of times faster than mature, general-purpose problem-solving software," says Hilton. Against highly optimized classical algorithms, performance is about the same, he adds. "We are still exploring how the technology works."

"There is no reason to think the technology won't be viable at some point," agrees Mike Battista, analyst at the Info-Tech Research Group in London, Ontario. "As for whether it's viable now, I'm not sure."

Gamification

You can expect your employees to do what they are supposed to do. Or you can track what they do and then reward them when they do what they are supposed to do, with points, badges, position within a competition and even prizes.

A gamification app for waiters in Applebee’s restaurants.

The latter approach is called gamification. It's hardly new; if you ever chose to patronize a store primarily to get its loyalty points, you were responding to gamification. Today, it's primarily used as a motivational tool, "with both customers and employees," notes analyst Rob Enderle.

The technique is catching on, with many different types of businesses trying it out.

"Gamification is the use of the best ideas of games outside of entertainment," explains gamification consultant Gabe Zichermann. "Instead of being purely entertainment, it creates engagement. Most people's jobs are pretty boring, and just turning them into games will not make them awesome. The idea is to dig into them and find ways to make them more interesting."

Actually spending some development effort on usability and the user experience is a big part of gamification, adds Duncan Lennox, CEO and founder of gamification vendor Qstream. "Enterprise apps have a tradition of atrocious interfaces imposed on users who were expected to use them regardless," he notes.

Most enterprise gamification efforts have been centered around training, says Zichermann. Indeed, Lennox notes that his firm offers a training tool for sales agents involving two-minute online sessions. The short interaction keeps it fun, and the competitive elements keep it engaging, he says.

Beyond that, a gamification effort should be designed around the data that the users or employees are already generating through their activities, says Rajat Paharia, founder and chief product officer of gamification firm Bunchball. However, designers must be careful to reward the right behavior, with incentives that users will respond to, he says.

A gamification effort should be intended to drive a specific, measurable outcome, such as increased order size for sales agents, Paharia adds.
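Paharia's advice -- reward measurable behaviors, with incentives users respond to -- reduces to simple bookkeeping. A minimal sketch (the point values, badge names and behaviors are hypothetical, not Bunchball's API):

```python
from collections import defaultdict

# Hypothetical point values per tracked behavior -- in practice these
# would be tuned to the specific outcome being driven (e.g. order size).
POINTS = {"order_closed": 10, "upsell_added": 25, "training_session": 5}

# Badge thresholds, highest first, so the best earned badge is reported.
BADGES = [(100, "Gold"), (50, "Silver"), (25, "Bronze")]

scores = defaultdict(int)

def record(user, behavior):
    """Award points for a tracked behavior; return the user's current badge."""
    scores[user] += POINTS.get(behavior, 0)
    for threshold, badge in BADGES:
        if scores[user] >= threshold:
            return badge
    return None

record("ana", "order_closed")               # +10
record("ana", "upsell_added")               # +25
badge = record("ana", "training_session")   # +5 -> 40 total
print(scores["ana"], badge)                 # 40 Bronze
```

The design work Paharia describes lives in the two tables at the top: which behaviors earn points, and where the thresholds sit.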

Reactive programming

Hardware may be doubling in power every other year, but there has been no Moore's Law for software. Reactive programming, however, may give software a chance to catch up.

Reactive programs are composed of independent software agents that do one thing and do it well, but only after receiving a request to do so, explains Robin Hillyard, a consultant in Boston. "If one agent does not do the thing you want to happen, it may be able to pass your request to another agent who knows how. A network of agents can be incredibly powerful since it can incorporate intelligence you'd otherwise have a hard time finding."

"Concurrency and distributed operations are now the norm rather than the exception," says Jonas Bonér, co-founder of Typesafe, which is building a platform for creating reactive software. "We believe that the tools that developers have been using are not up to the challenges of multi-core computing and cloud computing, where long serial chains of code won't fly anymore."

But with reactive programming, "The size and complexity you could have is limitless," Hillyard adds. "The (older) procedural model of software does not model the real world anymore."

There is a learning curve, of course. It takes two to three months before a programmer feels truly productive with reactive programming, "with a lot of reading in the first week, but things fall into place in the second," says Bonér. Most programmers who are new to the process "are up to speed after doing a proof of concept for two months." Learning a new language is not necessary and most reactive programming practitioners use Java, he adds.

"Once you get the hang of it, it's actually much easier to write programs in the reactive model than in the traditional threaded model, easier to express what you want, and easier to debug the code," adds Hillyard.
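The agent model Hillyard describes can be sketched with ordinary message queues. The example below (illustrative Python, not the Typesafe platform's API) shows one agent forwarding a request it cannot handle to a peer that can:

```python
import queue
import threading

class Agent(threading.Thread):
    """A minimal message-driven agent: it does one thing well, and
    forwards requests it cannot handle to a peer, as Hillyard describes."""
    def __init__(self, name, can_handle, peer=None):
        super().__init__(daemon=True)
        self.name, self.can_handle, self.peer = name, can_handle, peer
        self.inbox = queue.Queue()

    def send(self, request, reply_to):
        self.inbox.put((request, reply_to))

    def run(self):
        while True:
            request, reply_to = self.inbox.get()
            if request in self.can_handle:
                reply_to.put(f"{self.name} handled {request}")
            elif self.peer:                       # pass it on, never block
                self.peer.send(request, reply_to)
            else:
                reply_to.put(f"no agent for {request}")

replies = queue.Queue()
billing = Agent("billing", {"invoice"})
front = Agent("front-desk", {"greeting"}, peer=billing)
billing.start()
front.start()

front.send("invoice", replies)        # front-desk can't handle it; billing can
print(replies.get(timeout=2))         # billing handled invoice
```

Each agent reacts only when a message arrives, and nothing here depends on how many machines or cores the agents are spread across -- the concurrency Bonér points to falls out of the message-passing style.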

Hillyard says that, currently, reactive programming is found mostly in the financial and healthcare fields, where information is not only critical but changing constantly.

"Maybe in 10 years it will be the normal way of doing things," he adds.

Augmented reality

You look through your smartphone's camera at a building. Thanks to GPS, directional sensors and various data feeds, your smartphone's display can tell you the address and which businesses are located there, overlaid on your view of the building.

That's augmented reality (AR). One example is a smartphone app called Wikitude World Browser, though there are plenty of others at this point.
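At its simplest, the overlay in that scenario comes down to comparing the compass bearing of a point of interest against the camera's heading and field of view. A minimal sketch (the coordinates are hypothetical, and real AR apps also fuse distance, elevation and motion-sensor data):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device to a point of interest."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(device_heading, poi_bearing, fov_deg=60, width_px=1080):
    """Map a POI's bearing to a horizontal pixel position in the camera
    view, or None if it falls outside the field of view."""
    offset = (poi_bearing - device_heading + 180) % 360 - 180  # -180..180
    if abs(offset) > fov_deg / 2:
        return None
    return round(width_px / 2 + offset / (fov_deg / 2) * (width_px / 2))

# Device at a hypothetical spot, camera pointing due north (0 degrees):
b = bearing_deg(40.7580, -73.9855, 40.7590, -73.9850)
print(screen_x(0.0, b))   # pixel column where the label should be drawn
```

The "precise positioning" Bajarin calls for below is exactly where sketches like this break down: small errors in GPS fix or compass heading shift the overlay visibly off its target.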

"AR takes the digital world, superimposes it on the real world and presents the two in a seamless fashion," explains Oscar Diaz, head of FuelFX, a Houston firm that produces training materials for industrial use. There are also animation, audio, and text elements, and a video recording may also be captured of the user's actions, he adds.

When used to guide industrial tasks, AR can cut human error and slash training times by a factor of four, Diaz says. It can also record what the worker does for later verification. "You can't leave your procedures manual behind anymore," he notes.

But the full potential of the technology requires the use of high-definition goggles or eyewear rather than smartphone viewfinders, and that technology is not ready for general use, Diaz admits. The superimposed graphics "should be indistinguishable from reality" and are not. "But while AR evolved from the (viewing) devices themselves, now viewing devices are evolving to meet the needs of AR," Diaz says.

In a tech generation or two, someone will come out with intelligent eyewear that establishes the standard for AR, he predicts. The much-touted Google Glass falls short, he notes, if only because it cannot superimpose over the entire field of view.

"When you see AR demonstrated you can easily see its value proposition," agrees analyst Tim Bajarin, head of Creative Strategies. "But it is too gimmicky now. It demands precise positioning, and for the moment I don't see a mechanism for delivering it cleanly, in a way acceptable to a broader audience."

Transient electronics

While it may seem that the world is filling up with old electronics, the new field of transient electronics (TR) could create devices that can safely dissolve in the environment, or even in the body.

Reza Montazami, assistant professor at Iowa State University, says transient electronics is in the "proof of concept" stage.

"It's the next big wave," predicts analyst Rob Enderle. "The next decades will be about finding creative ways of putting technology inside your body, and building the construct between machine and man."

But so far, "There is no commercialization and very few labs are working on this -- it's in the proof-of-concept stage," says Reza Montazami, assistant professor at Iowa State University.

TR requires that metals dissolve, and there are two main ways to do that, he explains. The first is to use a metal like magnesium, which is both conductive and water-soluble. The other is to use nano-particles of a metal like silver, which is conductive but not soluble, suspended in a polymer that is soluble, or melts at a temperature that the circuit itself can produce.

With the latter type, the transformed polymer would disperse the nano-particles, turning the circuit into silver mush. This would not only erase any data the device contained, but leave little evidence that the device ever existed, a feature that could be desirable in military situations, Montazami notes.

Initial civilian uses might include smartcards whose circuitry can be destroyed, or passports that can be canceled remotely, but commercialization is still several years away, he adds.

Of course, complete disappearance could also be desirable for diagnostic medical devices that would be inserted in the body, or swallowed, he notes.

"There is tremendous promise for things like capsules with cameras," says Bajarin. "But will consumers accept it without it being clear that it will be safe and not invade their privacy? Those two issues will have to be nailed down."

Named Data Networking

While TCP/IP was fine for the apps that dominated the Internet at its birth (like email, file transfers, and remote log-ons), today the Internet is dominated by data-centric apps like Web data retrieval and video streaming. So the hunt is on for an improved protocol, and a leading candidate is Named Data Networking (NDN).

This is not commercial technology, nor is it likely to be ready for market in less than two years, says Dave Oran, a Cisco Fellow who is working on development efforts with the NDN Consortium. "But its premise is that it could replace TCP/IP someday."

Basically, NDN replaces the notion of hosts and IP addresses with the notion of named pieces of data and requests to fetch those pieces of data, Oran explains. "Every primitive piece of data that might exist on the Internet will have a name that a client can ask for. The network will be able to route that request to some app that has that piece of data and then return the data to the requester. That is the basic paradigm," he says.

Cryptographic protection of the data will be built into the protocol, Oran says. Also, since the network can pay attention to the data produced by sensors rather than the ID of the sensor, NDN is more suitable for the Internet of Things. Since data is sent only in response to requests, NDN should also be resistant to many forms of attack, he adds. But NDN will offer no particular speed advantage, he cautions.
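The fetch-by-name paradigm Oran describes -- including opportunistic caching on the return path, which is what lets the network serve popular data without going back to the producer -- can be sketched as a toy forwarder. This is illustrative Python, not the actual NDN packet formats or consortium software:

```python
class NdnNode:
    """A toy named-data forwarder: it answers requests for named data
    from its own content store, or forwards them by name prefix."""
    def __init__(self, name):
        self.name = name
        self.store = {}   # content store: data name -> data
        self.fib = {}     # forwarding table: name prefix -> next node

    def publish(self, data_name, data):
        self.store[data_name] = data

    def add_route(self, prefix, next_node):
        self.fib[prefix] = next_node

    def request(self, data_name):
        """Fetch data by name: serve from cache, else forward by prefix."""
        if data_name in self.store:
            return self.store[data_name]
        for prefix, next_node in self.fib.items():
            if data_name.startswith(prefix):
                data = next_node.request(data_name)
                if data is not None:
                    self.store[data_name] = data  # cache on the return path
                return data
        return None

producer = NdnNode("producer")
producer.publish("/video/intro/segment1", b"frames...")

router = NdnNode("router")
router.add_route("/video/", producer)

print(router.request("/video/intro/segment1"))  # fetched from the producer
print("/video/intro/segment1" in router.store)  # True: router now has a copy
```

Note that no host address appears anywhere: the consumer names the data it wants, and any node holding a copy can answer -- the basic paradigm Oran outlines above.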

More than three decades after its invention in 1982, VoIP has become commercially important but has still not entirely replaced the public switched phone network, and Oran says he assumes NDN's timescale for adoption will be similar. For niche uses, NDN could be available much sooner -- or it may never be adopted, with certain features at best retrofitted to TCP/IP, he adds.

Copyright © 2015 IDG Communications, Inc.
