The Quest for Meaning (continued)


"What Autonomy has is so important and unique," adds Eric Brown, research director for Forrester Research, "it doesn't just belong 'in a box' - it belongs everywhere."

Not bad for a 3-year-old startup leveraging the 250-year-old brainstorm of a current resident of Bunhill Fields, one of the oldest graveyards in London.

It was while Lynch was a grad student at Cambridge, studying under Peter Rayner, head of the university's Signal Processing and Communication Research Group, that he began thinking about the implications of the curious notions of the Reverend Thomas Bayes. A statistician as well as a cleric, Bayes helped shape the foundation of modern probability theory. Rayner is one of a growing number of researchers turning to computers to take Bayes' method of statistical inference - using mathematics to predict the outcome of events - much further than the shy reverend might have dreamed.

Its methods are complex, but Autonomy's promise is simple: Enable computers to understand context.

Born in 1702, Bayes, a second-generation minister, served as his father's assistant in London and was made wealthy by his inheritance. Bayes spent the final 30 years of his life in a little spa town called Tunbridge Wells, filling notebooks with speculations on diverse subjects but formally publishing only two pieces in his lifetime. The first was an unsigned tract with the marvelous title Divine Benevolence: Or, An Attempt to Prove That the Principal End of the Divine Providence and Government Is the Happiness of His Creatures; the followup, a defense of Newton's calculus against an attack by a bishop who called Newton "an Infidel Mathematician." The latter work earned Bayes a fellowship in the Royal Society, the highest recognition he would earn in his lifetime.

When he died in 1761, Bayes left his papers and £100 to Richard Price, another clergyman whose speculations ventured beyond pondering how many divine agents could dance on the head of a pin. Price is celebrated for creating the first actuarial model for life insurance. Unlike his quiet and cautious friend Bayes, however, Price - a fiery and prolific public figure who wrote defenses of the American and French Revolutions - was an 18th-century genius of hype.

In one of Bayes' notebooks, crammed with musings on astronomy and electricity, Price discovered an unpublished work called An Essay Towards Solving a Problem in the Doctrine of Chances. In broad terms, Bayes had sketched a model for predicting events under conditions of uncertainty. The equation - now known as Bayes' theorem, or Bayes' rule - takes into account knowledge of past events and new observations to infer the probable occurrence of an event. Typically, the minister's son had been modest to a fault about the implications of his work. "So far as mathematics do not tend to make men more sober and rational thinkers, wiser and better men," Bayes wrote, "they are only to be considered as an amusement, which ought not to take us off from serious business."

Price disagreed. Upon publishing the essay after his friend's death, he replaced the self-deprecating introduction with a declaration that Bayes had not only solved "a problem that has never before been solved" but had even successfully "confirm[ed] the argument for the existence of the Deity." Hyperbole aside, Bayes had created a statistical model for harvesting wisdom from experience. Bayes' theorem chains probabilities, maximizing the amount of learned information brought to bear on a problem, and is especially well suited to predicting the outcome of situations where a mass of conflicting or overlapping influences converge on the result.

In the language of statistics, Bayes' theorem relates phenomena observed in the present (the evidence) to phenomena known to have occurred in the past (the prior) and ideas about what is going to happen (the model). Mathematically, the formula is represented as

P(t|y) = P(y|t) × P(t) / P(y)

For non-mathematicians, understanding the implications of the theorem without recourse to real-world metaphors is difficult. So imagine a short-order cook working in a busy café. In a din of conversation, clattering plates, and street noise, the waiters dash by the kitchen counter, shouting orders. What a customer actually ordered is t in the above equation. The garble the cook hears is y. A Bayesian decision process would allow the beleaguered chef to send out as many correct orders as possible.

He has some prior knowledge: Few customers order the smothered duck feet, while the steak frites is a perennial favorite. The cook can use information like this to calculate the prior probability that a given dish was ordered: P(t), or the probability of t. He also knows that one waiter tends to mispronounce any dish with a French name, that a bus roars past the restaurant every 10 minutes, and that the sizzling sounds from the skillets can obscure the difference between, say, ham and lamb. Taking these factors into account, he creates a complicated model of ways the orders might get confused: P(y|t), the probability of y, given t. Bayes' theorem allows the chef to chain the probabilities of all possible influences on what he hears to calculate the probability, P(t|y), that the order he heard was for the dish the customer chose.
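The cook's chain of reasoning can be sketched numerically. The dishes, prior probabilities, and confusion likelihoods below are invented for illustration; only the structure of the calculation follows Bayes' rule.

```python
# A minimal sketch of the cook's Bayesian decision, with invented numbers.
# t = the dish actually ordered; y = the garbled shout the cook hears.

# Prior probabilities P(t): steak frites is a favorite, duck feet are not.
prior = {"steak frites": 0.6, "ham": 0.25, "lamb": 0.1, "duck feet": 0.05}

# Likelihoods P(y|t): how likely each dish is to sound like "lamb"
# over the din (the sizzling skillets blur "ham" and "lamb").
likelihood_heard_lamb = {
    "steak frites": 0.01,
    "ham": 0.4,   # easily confused with "lamb"
    "lamb": 0.7,
    "duck feet": 0.02,
}

# Bayes' theorem: P(t|y) = P(y|t) * P(t) / P(y),
# where the evidence P(y) sums P(y|t) * P(t) over all candidate dishes.
evidence = sum(likelihood_heard_lamb[t] * prior[t] for t in prior)
posterior = {t: likelihood_heard_lamb[t] * prior[t] / evidence for t in prior}

best_guess = max(posterior, key=posterior.get)
print(best_guess)  # prints: ham
```

Note the outcome: even though "lamb" is the better acoustic match, ham's much higher prior wins. That is exactly the kind of prior-versus-evidence tradeoff the theorem formalizes.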

Doctors perform Bayesian exercises in pattern recognition constantly, relating probabilities and beliefs to observations in the dance we think of as seasoned judgment. If a patient has a sore throat, red spots on her skin, and a 102-degree fever, what is the probability that she has chicken pox (and not measles)? In a medical context, the pattern to recognize, P(t|y), is that of the underlying disease.
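The doctor's judgment follows the same arithmetic. The prevalences and symptom likelihoods below are invented placeholders, not clinical data; they only show how the pattern P(t|y) is computed.

```python
# A hypothetical sketch of the diagnostic inference; the numbers are
# invented for illustration, not drawn from medical statistics.

# Prior P(t): assume chicken pox is far more common than measles.
prior = {"chicken pox": 0.9, "measles": 0.1}

# P(y|t): assumed probability of this symptom cluster (sore throat,
# red spots, 102-degree fever) appearing under each disease.
likelihood = {"chicken pox": 0.5, "measles": 0.8}

# Bayes' theorem: posterior P(t|y) = P(y|t) * P(t) / P(y).
evidence = sum(likelihood[t] * prior[t] for t in prior)
posterior = {t: likelihood[t] * prior[t] / evidence for t in prior}

print(posterior["chicken pox"])  # ≈ 0.85
```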

For humans, the detection of a meaningful signal in clouds of data smog happens subliminally all the time. Neither the short-order cook nor the doctor needs to consult a mathematical formula to confirm his or her reasoning.

Bayes' quest to organize and extend that process was necessarily limited by his tools - a pen, a notebook, and the time required to do his calculations. What has made his notions widely applicable in everyday situations is the invention of computers, which can chain millions of probabilities in a heartbeat. With Bayesian "reasoning engines" embedded in software to drive the recognition process, computers can begin to approach the everyday capabilities of the human mind for sifting through chaos and finding meaning. "Bayes gave us a key to a secret garden," says Lynch. "A lot of people have opened up the gate, looked at the first row of roses, said, 'That's nice,' and shut the gate. They don't realize there's a whole new country stretching out behind those roses. With the new, super powerful computers, we can explore that country."


