Future Watch: Speak to your computer as you would your friends

Future watch: NLP (natural-language processing)

By teaching computers to express and extract information using everyday language, including slang, NLP aims to eliminate the burden of translating data and information requests into forms that a computer can understand. Among other things, this Herculean task requires discovering algorithms to approximate human-language processing.

In any complex exchange between a human and a computer, chunks of time and data are inevitably lost in the translation process. Users are forced to modify their language so that computers can recognise it, dumbing down their questions for a database. In a perfect world, we would not be forced to speak a computer's language. Instead, computers would be capable of NLP (natural-language processing), understanding and generating language in a human-like way.

With NLP, business professionals could formulate questions using their industry-specific lingo and receive answers in the same language. By eliminating the need for users to constantly translate between business-speak and computer-speak, NLP could save time and boost productivity like no other technology launched over the past several decades.

Of course, a huge cheque awaits the development group that introduces a comprehensive NLP solution to the market. Verizon's BBN unit, IBM, AT&T, Sun Microsystems, and Microsoft are all racing for the flag, but the prize remains elusive. As of now, viable content-comprehension tools are not even on the horizon.

The grail that eludes NLP researchers is a working computer model of the brain's linguistic comprehension process. Although today's better NLP systems can understand most of what a five-year-old says, adult language is replete with dialect variations, relaxed syntax, and occupational lingo, all of which confound the computer's preference for fixed vocabularies and rigid grammatical rules.

Some complex statistical models provide limited adaptability to adult-speak, but so far these methods work better in controlled settings, such as classrooms, than in commercial applications.

For example, when you interact with a program such as the Eliza virtual therapist or the plain-language search feature of some Web sites, you are using a shortcut, an NLP cheat that is as old as computers: the keyword-driven parser. The programmer who creates such a parser can give an awe-inspiring demo because he or she has learned to speak the parser's language. That's what makes most so-called NLP systems a cheat: They require you to adopt their rules.
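To make the trick concrete, here is a minimal sketch, in Python, of how a keyword-driven parser works. The rules and responses are hypothetical, not Eliza's actual script, but the mechanism is the same: scan the input for a trigger word, drop the surrounding text into a canned template, and ignore everything else.

import re

# Hypothetical keyword table: each entry pairs a trigger pattern with a
# canned response template. Real Eliza scripts are much larger, but the
# mechanism is the same: match a keyword, echo back the surrounding text.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE), "Tell me more about your {0}."),
    (re.compile(r"\b(yes|no)\b", re.IGNORECASE), "You seem certain about that."),
]

DEFAULT = "Please go on."

def respond(utterance: str) -> str:
    """Return the response for the first rule whose keyword matches."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT  # no keyword recognised: fall back to a stock reply

print(respond("I feel ignored by my computer"))  # Why do you feel ignored by my computer?
print(respond("The stock market closed higher today"))  # Please go on.

The illusion holds only for inputs the rule author anticipated; anything outside the keyword table falls through to the stock reply, which is the cheat described above.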

For developers, open-source and academic NLP projects abound. Project Grok, an open-source project headed by Gann Bierner and Jason Baldridge, parses English sentences using a compact grammar table.
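As a rough illustration of table-driven parsing (and only that: the grammar below is hypothetical and borrows nothing from Grok's own grammar or API), here is a toy chart parser in Python whose entire knowledge of English lives in two small tables, a lexicon and a rule table.

# A toy, table-driven chart parser. The grammar is hypothetical; it only
# illustrates the general idea of a parser driven by a compact grammar table.
LEXICON = {
    "dogs": {"NP"}, "cats": {"NP"},
    "chase": {"V"}, "see": {"V"},
}
RULES = {  # binary rules stored as (B, C) -> {A}, meaning A -> B C
    ("V", "NP"): {"VP"},
    ("NP", "VP"): {"S"},
}

def parse(words):
    """CKY-style chart parse: return the categories spanning the sentence."""
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, word in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(word, ()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # try every split point
                for left in chart[i][k]:
                    for right in chart[k][j]:
                        chart[i][j] |= RULES.get((left, right), set())
    return chart[0][n]

print(parse("dogs chase cats".split()))  # {'S'}: recognised as a sentence
print(parse("chase dogs cats".split()))  # set(): no rule covers this word order

Every sentence such a parser can handle must be reachable from those two tables, which is why keeping the grammar compact matters.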

And Microsoft offers a workable NLP subsystem for SQL Server 2000, but it is far from automatic. The developer must train Microsoft's English Query before it can process user requests. This training is too time-consuming to make English queries commonplace, especially when the underlying data structures change.
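The sketch below suggests why. It is not English Query's interface; it is a hypothetical Python stand-in for the kind of hand-authored mapping such tools depend on, in which every phrasing a user might try has to be wired to a table and column in advance.

import re

# Hypothetical, hand-authored question-to-SQL patterns. This is not
# English Query's interface; it is a stand-in that shows why such tools
# need per-schema training: every phrasing, synonym, and table name has
# to be wired up by a developer before the first question can be answered.
QUESTION_PATTERNS = [
    (re.compile(r"how many (\w+) do we have", re.IGNORECASE),
     "SELECT COUNT(*) FROM {0};"),
    (re.compile(r"list (\w+) in (\w+)", re.IGNORECASE),
     "SELECT * FROM {0} WHERE region = '{1}';"),
]

def to_sql(question: str) -> str:
    """Translate a question into SQL, if a trained pattern covers it."""
    for pattern, template in QUESTION_PATTERNS:
        match = pattern.search(question)
        if match:
            return template.format(*match.groups())
    raise ValueError("question not covered by the trained patterns")

print(to_sql("How many customers do we have?"))  # SELECT COUNT(*) FROM customers;
print(to_sql("List stores in Sydney"))           # SELECT * FROM stores WHERE region = 'Sydney';

Change the schema and the patterns must be revised to match, which is exactly the maintenance burden that keeps English queries from becoming commonplace.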

NLP is not a single problem, so its challenges will not bow to a single solution. Instead, we will see slow progress as programmers, linguists, mathematicians, sociologists, and other scientists combine their knowledge. NLP will only become truly viable when our current complex and fragile linguistic models are translated into elegant and consistent formulas. Until then, we must content ourselves with communicating with computers on their terms.
