
Copyright 2001 John Zipperer. 

From and copyright by Internet World:

Internet Whirl
Not Mr. Know-It-All

To devise more exact IT plans, know what you can't know
By John Zipperer

(11/01/01) The company made a good case for its product. I could see how its software fit into the enterprise market, and I could see how users could expect to save money using it. Then its representatives looked across our conference table and said users would see ROI within three months. All I could think in response was, “Cool, if true.”

Perhaps the more appropriate response would have been to question the assumptions and data behind that claim. Leaving aside the reach-for-the-sky rhetoric anyone might use in their marketing efforts, I think there is still a considerable amount of self-delusion within companies about how much they can base IT decisions on hard knowledge. Instead, they are left to go on guesses disguised as knowledge.

Recognizing what we don’t know is important for a number of reasons, but two stand out. First, you can narrow down what you don’t know and figure out whether it’s unknowable or whether it can become known with an acceptable amount of effort. Second, you can save yourself the time and frustration of trying to scapegoat someone after the, er, fact when something turns out differently than expected because it depended on an unknowable factor.

Companies have benefited from a tremendous increase in the realm of Things That Are Knowable. They used to not know what was going on inside their suppliers’ systems; now, they are hooked right into their networks and they know where products and bottlenecks are. They used to not know all kinds of things about their own production system; now, they have desktop (and often even mobile Web) access to any database info they might need.

These thoughts came to me again while reading an essay by Andrew McAfee, one of several experts writing in The Economic Payoff from the Internet Revolution (Brookings Institution Press, 2001). McAfee looked at the benefits the U.S. manufacturing sector was hoping to reap from using the Internet, and he made the point more than once that much is still not known. And yet, companies need to be making decisions about how and where to invest their IT dollars. Shouldn’t they be able to know what their expected payoff is?

But we don’t know some big things, including the possible savings from enterprise use of the Internet. So on what will manufacturing companies be basing their decisions about future IT investment? Technology studies from think tanks? Alan Greenspan’s comments about new economy productivity? Prayer?

“Despite increasingly clear evidence of the value of IT in the Internet era, the U.S. manufacturing sector is still characterized by high levels of IT heterogeneity, both in rates of adoption of these technologies and in levels of success from these investments,” writes McAfee.

“[C]ompanies with a track record of successful implementation and use of IT are more likely to continue these projects in the Internet era, while those that have been disappointed, or worse, by previous IT investments may not be as eager to continue this kind of spending.”

That is disappointing, especially because it is quite possible that many of those companies that have had bad experiences are insufficiently invested in IT. Acting on their universe of knowledge—their own experiences—could be leading them to make exactly the wrong decision.

McAfee writes: “The U.S. Census Bureau has only begun to measure business-to-business e-commerce.... Without comprehensive and objective data on firms’ e-business initiatives, it is difficult to paint a full picture of the digital economy and to test some of its important performance hypotheses.”

In other words, we don’t know, and with currently available data, we can’t know. With an outlay of effort (and money and time), we can know more, perhaps quite a bit more, and much of it valuable. And even though that won’t be complete knowledge, it will increase our universe of Things That Are Knowable. It ain’t perfect, but it’s one step closer to the IT-enabled world of perfect knowledge.


J2EE and .Net
Bridging Troubled Waters

Developers offer solutions for connecting .Net and J2EE
By John Zipperer

(10/01/01) Corporate IT leaders face the task of bridging two technologies that are alien to each other—the familiar J2EE and the emerging .Net. It’s an attempt to create a united whole out of a decentralized collection of technology assets. Through mergers and acquisitions, corporations can pretty quickly build up an organization running on multiple platforms (see “Make Diversity Work,” Aug. 1, 2001, p. 48). But only relatively recently has there been a mania to interconnect all parts of the enterprise, especially with the popularity of various value-chain and business-web ideas. So what’s the answer? Java has found a home in corporate IT departments, and it isn’t going away. Microsoft’s .Net is coming on with a lot of marketing firepower (and some built-in strengths) behind it, so the reality is that many corporate IT departments are being called upon to get the two technologies to work together. A prudent policy is to be prepared to withstand some of that pressure while laying out sound long-term plans.

“I’m personally of the opinion that the whole notion of the bridging between the two is not going to be as important as people make it out to be right now,” says Ted Neward, a software architect and author of several books, including Server-Based Java Programming (Manning Publications, 2000). He suggests that long before one gets to high-tech solutions involving SOAP and Web services, there are simpler partial solutions along the lines of database bridging that have been around for some time.
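
To make Neward’s point concrete, here is a minimal sketch of the kind of database bridging he has in mind, assuming a hypothetical shared orders table that a .Net application writes and a Java batch job reads. The JDBC URL, credentials, table, and column names are invented for illustration, not taken from any product.

// Hypothetical sketch: a Java batch job reads rows that a .Net application
// has written into a shared "orders" table. Both sides agree only on the
// database schema; neither needs to know the other's object model or runtime.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SharedTableReader {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:yourdriver://dbhost/sharedorders"; // assumed URL and driver
        try (Connection con = DriverManager.getConnection(url, "reader", "secret");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT order_id, customer, total FROM orders WHERE exported = 0")) {
            while (rs.next()) {
                // In a real job this would feed a J2EE workflow; here we just print.
                System.out.println(rs.getString("order_id") + "  "
                    + rs.getString("customer") + "  " + rs.getDouble("total"));
            }
        }
    }
}

It is crude compared with Web services, but as Neward notes, this sort of bridging has been around for a long time and often solves the immediate integration problem.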

But companies involved in offering solutions to the bridging challenge naturally are bullish on their proffered fixes, and they tend to view J2EE-.Net bridging as a widespread challenge. Web services figure prominently in their plans.

“The good thing about the computer science industry is that we come up with a new panacea every month,” jokes Dale Skeen, CTO of Vitria, a maker of integration servers. In the universe of options for the harried IT executive, Web services are being suggested as a near-panacea by many who think they neatly meet the challenge of bridging the J2EE and .Net divide. But all of this is part of an evolution in technology toward a world where application developers can reuse modules instead of recreating them, where CXO-level executives can describe accurately what they want an app to do and not worry that IT will drastically change it during creation, and where developers make use of Web services made available on the Internet.

Consider Borland, the well-known e-business vendor. The company touts its strength in the Java field, but it knows it will be working with .Net and sees the Web services architecture as the thing that will make that work. “We think Web services are incredibly important, because it’s the first time we’ve had a good method for getting components to talk to other components,” says Frank Slootman, Borland’s vice president of products. The difficulty of a previous technology, CORBA, prevented it from getting traction. But with an easy-to-use approach that lets you wrap things in Web services interfaces, Borland and others see a serious contender.
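
To show why that wrapping is platform-neutral, consider a minimal client-side sketch, assuming a hypothetical stock-quote service at an invented endpoint; the URL, namespace, and operation name are illustrative only, and nothing below is Borland’s or Microsoft’s actual API. Because only a SOAP envelope travels over HTTP, the caller cannot tell whether a J2EE or a .Net component answers it.

// Minimal sketch: call a hypothetical SOAP Web service using only standard
// Java classes. Endpoint, namespace, and operation are assumptions.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SoapClientSketch {
    public static void main(String[] args) throws Exception {
        // Hand-built SOAP 1.1 envelope for an imaginary GetQuote operation.
        String envelope =
            "<?xml version=\"1.0\"?>" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body><GetQuote xmlns=\"urn:example-quotes\">" +
            "<symbol>BORL</symbol></GetQuote></soap:Body></soap:Envelope>";

        URL url = new URL("http://services.example.com/quotes"); // assumed endpoint
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        con.setRequestProperty("SOAPAction", "urn:example-quotes#GetQuote");

        try (OutputStream out = con.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // raw SOAP response; a real client would parse it
            }
        }
    }
}

A .Net client built with its own SOAP tooling would send essentially the same envelope, which is the sense in which such wrappers bridge the two platforms.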

Someday. Right now, there is more talk than reality to Web services, which suggests a wait-and-see approach for those who can afford the time. A lot of work remains to be done on protocols and standards, including SOAP and even the Web Services User Interface (WSUI) standard.

“The notion of Web services is still not nailed down, in terms of exactly what it is and what it’s going to include,” says Bob Pasker, CTO of Kenemea, which makes the Kenemea Application Network for application communications. “I tend to shy away from a facile answer that it’s Web services and that’s it. There’s not a lot of technology there; it’s mostly an idea about how to do things.” He says for Web services to be successful, developers will have to ensure that they are easy to implement and adopt, and there will have to be a lot of vendors providing support. Many vendors are interested and are saying they are going full-bore into the Web services realm, but Pasker adds a third issue: just how companies will want to use Web services in their bridging projects.

“If you’re building a very transactional system behind the firewall, you’re not going to build those subsystems and call them via Web services,” says Dean Guida, CEO of applications-components builder Infragistics. “You’re going to want to make direct, in-process, in-memory calls. It makes a lot of sense to use Web services when you have a large organization and want to share information across business units.”
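
Guida’s distinction can be sketched in a few lines of hypothetical Java; the interface and class names are invented for illustration. The same inventory lookup can be a plain in-process call or a remote Web-service call, and the remote flavor only earns its network and XML overhead when separate business units need to share the data.

// Illustrative sketch, not any vendor's API: one interface, two delivery styles.
public interface InventoryService {
    int unitsOnHand(String sku);
}

// In-process: a plain method call, no serialization, no network hop.
// This is what a tight transactional system behind the firewall would use.
class LocalInventoryService implements InventoryService {
    private final java.util.Map<String, Integer> stock = new java.util.HashMap<>();
    public int unitsOnHand(String sku) {
        return stock.getOrDefault(sku, 0);
    }
}

// Remote: same interface, but each call would become an HTTP/SOAP round trip
// (body omitted; see the SOAP client sketch above for what that involves).
// Worth it for sharing data across business units, wasteful inside one system.
class RemoteInventoryService implements InventoryService {
    public int unitsOnHand(String sku) {
        throw new UnsupportedOperationException("would POST a SOAP request here");
    }
}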

Others agree, though they sound more optimistic in the process. One suggests that we’ll see another 12 months of fluctuation in standards before stabilization. “Companies like Microsoft and IBM, Oracle, and HP, and all these different companies have different approaches to a specification that are just different enough so that they don’t communicate as well as they need to,” says Rob Mayfield, a lead engineer for Avinon, which sees Web services as a solution to the bridging issue.

The high-profile rollout of Microsoft’s .Net initiative over a period of months and years is going to subject us all to a great deal of talk about its merits relative to J2EE. To a point, the differences break down to this: J2EE is seen as being more enterprise-grade, but .Net is seen as being stronger on the front end, with better usability and—obviously—integration with the Microsoft desktop.

Microsoft is also the business equivalent of a permanent campaigner, in the form of a constant on-message stream of marketing. Coupled with the close contact it keeps with its developer community, Microsoft has advantages in promoting and supporting its .Net initiative that can’t be matched. A visit to this summer’s Microsoft TechEd in Atlanta and the subsequent spread of the VisualStudio.Net beta prove the point.

“The common wisdom is that there will be two major platforms, and the rest will be tactical—custom programmed,” says Borland’s Slootman. “The second observation is that half of all projects will combine J2EE and .Net platforms for single-implementation projects. If that is true, then the interoperability between the platforms will become a big deal.”

SIDEBAR
Five Key Decision Points

What are your needs? Ask what are the internal applications—like human-resources apps—you need, says Tom Clement, Avinon’s director of emerging technologies. “And how do you get at the information you need in order to write those applications?”

Do you need to make a choice of platforms? “You shouldn’t have to,” says Brian C. Reed, Merant DataDirect’s vice president of business development. Instead, you’ll have the power of a Microsoft front end mixed with the strong drive and corporate development on the back end in Java.

You still need to decide how much you can do. “The most you can standardize on is the rules for the content” such as the frequency and content of data extracts, says Adam Greissman, CEO of Universal Data Interface Corp. “Everything else leave to the regions within the corporation to decide.”

What skills does your IT staff need? Let others worry; Web services technology enables your developers to continue using the skills they have. “One of our customers has something like 95 percent of a 1,300-person shop unable to directly implement business applications using .Net or J2EE,” says Jack Greenfield, a chief architect at Rational Software. “That 95 percent of staffers could, however, develop using model-assisted techniques.”

What are your cost constraints? Switching to an all-Java architecture may make sense for some, but if money’s too tight for that and “your business users are still clamoring for one-off applications that have to be delivered,” says Ed Anuff, chief strategy officer of Epicentric, then Web services may solve both problems.

Are you thinking outside the corporation? Connecting systems within and between companies is another plus for platform-neutrality.


Internet Whirl
AIMing at Enterprises

Tools that enable private chat fill a neglected business need
By John Zipperer

(10/01/01) An early experience of mine with the way technology can enable collaboration occurred in high school. My friends and I were fans of a great computer game called Galactic Empires, in which players attempt to take over other planets and exploit the shipbuilding capabilities of the planets they already control. It had a simple, archaic interface that made good use of the limited abilities of computers in the mid-1980s. My friends and I learned the collaborative aspect of the game when Bill and I secretly teamed up against Tom. As wave after wave of our ships zoomed in for the kill, our combined fleets were able to decimate Tom’s planets, and we won the game.

OK, so that isn’t the kind of collaboration people generally mean when they discuss business technology. But—my mind working as it does—I tend to wonder about new technologies and products being used in ways that weren’t intended by their makers or their corporate installers.

There is a lot going on in the collaborative technology space, as companies look at various conferencing and shared-workspace tools. But the simplest one, in my view—and the one most likely to continue spreading like wildfire—is instant messaging.

I’ve heard people in companies large and small reporting their use of simple messaging tools from AOL, MSN, or Yahoo at work. Smaller companies also offer such tools, such as NetLert’s messaging product aimed specifically at the enterprise market. But it’s the use of consumer-focused IM in corporations that surprises me. After all, the corporation is not their intended playground. But companies benefit because many people already have accounts on those services for use at home—and undoubtedly have been covertly using their private accounts at work for some time.

Here at Internet World we recently started an experiment with AOL Instant Messenger (AIM). The troubles I had setting the application up suggest one challenge to its use as a business tool. I had trouble getting the system to use the screen name and password I had originally set up, and when I tried the password-reminder feature, I got an out-of-order message; when I tried later, I got a message telling me that my account had tried to use the password-reminder feature too often—I’d tried it all of three times—and so it was being disabled. A call to AOL’s customer support was phenomenally unhelpful.

I finally registered yet another screen name and got set up. It’s worked well since then, but that was a lot of headache for me to go through, and it could take up a lot of time from IT staffs to do the troubleshooting I did myself. I don’t claim to be a technical genius, but I probably have more interest and abilities than the typical employee. AOL has a nice system, but its customer service and help offerings need to be beefed up.

The second challenge will be more of a problem for some companies than others. I’m talking about the distraction that instant messaging can cause if employees have clued in their families and friends to the IM account name they’re using at work. Of course, chances are that the worst offenders will be the same people who already spend too much company time on personal phone calls or writing personal e-mail. In other words, same culprits, same intent, different technology.

Those two challenges aside, AIM lets me keep in touch with other editors at the magazine, as well as our freelance writers across the country. I have yet to experience the downsides of adding yet another communications tool to my toolbox, but they exist.

Todd Vernon, CTO of Raindance, a maker of Web-conferencing technology, tells the story of one company that uses AIM. Users leave it running all the time so they are always available for messaging, but they often end up making a phone call afterward. “That’s very interesting to me,” Vernon says. “You’ve got a customer using a couple of different products simultaneously.” This observation may well indicate that low-end messaging tools have a short window of opportunity before higher-end enterprise collaboration suites really take over and handle multiple communications methods.

Now, if I could only dream up a business application for Galactic Empires.


Internet Whirl
The Davids Inside Goliath

Internet technology lets regular folk handle technical tasks
By John Zipperer

(09/15/01) If the early theory was that Internet technology would empower the “little guy” against the corporations, then the conflicting reality is that Internet technology is in fact empowering the “little guy” within corporations. Those smaller competitors on the outside may be getting a little disheartened these days as the slow but steady legacy corporations continue to assert their mastery of the field, but sympathy for the entrepreneur shouldn’t stop us from cheering on what’s happening inside enterprises.

What’s occurring there is significant and good. Just as it has been a long time since word processing required you to type in the coding for paragraph breaks or boldface, software development is making it easier for the nontechnical majority to carry out tasks such as database queries, analytics, and content creation and management. These are duties that once required (and in some cases still require) technical experts to perform them. These days, however, I am seeing more pitches from companies that highlight their software products’ advanced but simple interfaces. Anyone familiar with click-and-type and drag-and-drop can learn to use them.

This is important because the people who are using these tools don’t care about the languages or other technologies that are used to create, store, and transfer the information. “They don’t want to know how SAS works,” as David Butler, vice president of product strategy and marketing for software maker Spotfire, told me recently. “They want to access the information.”

Just a few of the latest products to catch my attention with this claim include iOn, enterprise Web content management software from Group EM3; RedDot Professional, a content management product for enterprises from RedDot Solutions; DecisionSite 6.2, an XML-based data analysis tool from Spotfire; and Actinic Business, an e-commerce-in-a-box software product from Actinic Software that lets a business of any size set up and manage basic commerce sites.

Many but not all applications of this type have a Web-based interface, setting the user’s workspace in a familiar context. I’ve seen enough of them to appreciate the way they enable workers to concentrate on the data and information they are manipulating rather than on how they are manipulating the data and information.

The key is knowing who in the organization needs what information and what they need to do with it. That knowledge is necessary throughout the value chain, as Butler points out. These products matter not just because they enable an isolated employee to add value to the company’s efforts, but because people in different departments need to work on projects together. What’s important is their subject-matter expertise, not their technological smarts. EM3, for example, stresses not only the out-of-the-box functionality of its iOn software, but also its role in putting marketing, sales, and editorial people, instead of Web specialists, in charge of managing Web content.

The other benefit to companies is that these products free up both time and money spent on high-priced IT staffers doing work that increasingly can be done with—for lack of a better term—“dumbed down” software. I’d much rather have my IT staff working on mission-critical technology projects than on tasks that can be done with software.

Companies still have to give their employees time and training to get up to speed on these easier tools. After all, even professionals often need to be walked through basic programs such as Microsoft Word or FrontPage if they haven’t used them before.

But there’s no real loser in the empowerment of company workers. It lets everyone—IT and non-IT staffers alike—do what they were hired to do. And that means companies should be asking the people setting up systems for them and selling them software just how much technological expertise is needed to use them. Except for very complicated systems that are probably wedded to IT staffs forever, I predict the successful products will be the ones that either already are run through click-and-drag interfaces or are working on adding them.


Profile
Third Time's a Charm

CEO Chen of Vweb has a chip on her shoulder -- and it could be a boon to video-over-IP
By John Zipperer

(09/15/01) Corporate videoconferencing is at a confusing juncture in its development: Not everyone is sure whether the conferencing technology we have is adequate. Vwebcorp, which provides MPEG video compression and broadband solutions for network applications, thinks it has some improvements to offer: Its VW2000 MPEG-2 video encoder chip is designed for video-over-IP applications, including private videoconferencing and commercial uses such as video-on-demand.

The driving force behind San Jose, Calif.-based Vweb is founder and CEO Sho Long Chen, who has nearly two decades of experience working in DRAM, CPU, and MPEG technologies at such companies as Nexgen Microsystems, Mosel Vitelic, and Stream Machine, which she founded. Stream Machine, in fact, was her second generation of video compression technology; Vweb, which she started in 1998, is the third. “Three is a charm, right?” she jokes. Calling compression a key technology, she says, “We need compression because of storage and bandwidth.”

“Our major differentiation is the innovation in our algorithm,” says Chen. “Our second innovation is that we integrate all of our interface onto a single chip. This minimizes the solution cost. The third is quality of service; we detect the feedback and adjust it to provide the best quality.”

Listening to Chen speak, one appreciates that she is a true technology person and not a marketer. Her history in the broadband space, her knowledge of the players, and her enthusiasm about its potential are evident as she rapidly shares her thoughts. But that doesn’t mean she doesn’t understand the business side of her company.

Chen, born and raised in Taipei, Taiwan, sees opportunity in both the Asian and U.S. markets for her broadband technology. But she says the U.S. differs in that it is a more mature market, where competition is cutthroat and huge investments are needed—and the current economic climate isn’t helping. “The whole U.S. communications market is really bad,” says Chen. “It’s not a market correction; it’s a market reset. It’s the worst I’ve seen in 20 years.”

China’s Internet market is much less developed by comparison. What China has, however, besides a huge population and a hungry market, is a government that uses the former to satisfy the latter. For example, it uses the manpower of its army to lay optical fiber, building up an optical-fiber infrastructure in all of the major cities. In addition, Chen says, the government put up money for the effort, reducing the cost of digital so that it was 70 percent cheaper than analog.

As for the future, Chen sees quality of service throughout networks as a challenge for video-over-IP, as it will be for wireless delivery. In the meantime, she says, she’s looking for a partner to help in deploying digital video over the Web.