dANN

Bayesian Network

dANN Information

General
     Description: An Artificial Intelligence Library written in Java.
     Last Activity: Today
     License: OSCL Type C
     Homepage: dANN

Download
     Distributions: 2.1
     Documentation: Documentation Snapshot

Development
     http://gerrit.syncleus.com/dANN
     Bug Tracking: JIRA
     Technical Debt: Sonar

Support
     IRC Room: #dANN on irc.freenode.org
     Mailing Lists: Directory of Mailing Lists

Bayesian networks are similar in many ways to Naive Bayes Classifiers, except that the features of an item are considered to be codependent on each other. For example, if you have a document that you want to test to see if it is spam, you might look for the keywords "buy" and "viagra". While these words directly affect the chance that a document is spam, they also affect each other: the presence of the word "buy" makes it more likely the word "viagra" will appear in the document. Therefore "viagra" is dependent on "buy", and whether the document is spam is dependent on both of these. Bayesian networks can capture this codependency and learn how features influence each other. Once a Bayesian Network has been trained it can answer any Conditional Probability question regarding the mapped features (such as "what is the chance a document is spam if it contains the word buy?").
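
To make the spam example concrete, here is a minimal sketch of how that dependency structure could be wired up, assuming the same SimpleBayesianNetwork and SimpleBayesianNode classes (and the BooleanState enum) used in the walkthrough below; the node names are purely illustrative:

SimpleBayesianNetwork spamNetwork = new SimpleBayesianNetwork();

// one node per keyword feature, plus the node we ultimately want to query
BayesianNode<BooleanState> containsBuy =
     new SimpleBayesianNode<BooleanState>(BooleanState.FALSE, spamNetwork);
BayesianNode<BooleanState> containsViagra =
     new SimpleBayesianNode<BooleanState>(BooleanState.FALSE, spamNetwork);
BayesianNode<BooleanState> isSpam =
     new SimpleBayesianNode<BooleanState>(BooleanState.FALSE, spamNetwork);

spamNetwork.add(containsBuy);
spamNetwork.add(containsViagra);
spamNetwork.add(isSpam);

// "viagra" is dependent on "buy", and whether the document is spam
// is dependent on both keywords
spamNetwork.connect(containsBuy, containsViagra);
spamNetwork.connect(containsBuy, isSpam);
spamNetwork.connect(containsViagra, isSpam);

// training (learnStates) and querying (conditionalProbability) then
// proceed exactly as in the diagnosis example below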

Bayesian Networks don't just apply to text-based documents, although those can be the easiest to understand conceptually. When using the core package com.syncleus.dann.graphicalmodel.bayesian.*, the dependency of features, as well as the extraction of features from an item, is left up to your implementation. There are techniques to help automate this process, but they will be discussed elsewhere.
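
For instance, turning a raw document into node states is plain Java and outside the library's scope. A minimal sketch of a keyword extractor, using the BooleanState enum from the example below (the containsWord helper is a hypothetical name, not part of dANN), might look like this:

// hypothetical helper: maps the presence of a keyword in a document
// to the state the corresponding feature node should be set to
private static BooleanState containsWord(String document, String word)
{
     return document.toLowerCase().contains(word.toLowerCase())
          ? BooleanState.TRUE : BooleanState.FALSE;
}

Each feature node would then be set with setState(containsWord(document, "buy")) before calling learnStates() during training or conditionalProbability() when querying.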

Example: Diagnosing the Sick

We want to start with a simple conceptual example of using Bayesian Networks to help get you started. In this example we will use the problem of diagnosing a patient's symptoms and determining the likelihood that they are sick and need additional medical attention.

The first step is to create the Bayesian Network object; there isn't much to this step:

SimpleBayesianNetwork network = new SimpleBayesianNetwork();

The next step is defining what features need to be looked at to accurately decide if a person is sick or not. For simplicity we will pick only 5 features; in a real application you will likely have many more than this. We are looking at the current season, age, stuffy nose, fever, and tiredness. Each of these has various states; for example, season has winter, spring, fall, and summer as its states. States are represented as Enum types. Since a state cannot be null we must set an arbitrary initial state; however, the initial state isn't important, as we will be setting the state manually later when it comes time to use it.

First we need to define the Enum types representing the possible states for each feature:

private static enum BooleanState{ TRUE,FALSE; }
private static enum SeasonState{ WINTER,SUMMER,SPRING,FALL; }
private static enum AgeState{ BABY,CHILD,TEENAGER,ADULT,SENIOR; }
private static enum FeverState{ LOW,NONE,WARM,HOT; }

Next we need to create the actual nodes representing each feature:

BayesianNode<SeasonState> season =
     new SimpleBayesianNode<SeasonState>(SeasonState.WINTER, network);
BayesianNode<AgeState> age =
     new SimpleBayesianNode<AgeState>(AgeState.BABY, network);
BayesianNode<BooleanState> stuffyNose =
     new SimpleBayesianNode<BooleanState>(BooleanState.TRUE, network);
BayesianNode<FeverState> fever =
     new SimpleBayesianNode<FeverState>(FeverState.HOT, network);
BayesianNode<BooleanState> tired =
     new SimpleBayesianNode<BooleanState>(BooleanState.FALSE, network);
BayesianNode<BooleanState> sick =
     new SimpleBayesianNode<BooleanState>(BooleanState.FALSE, network);

Now we need to add the new nodes to the network so the network knows about them and can operate on them:

network.add(season);
network.add(age);
network.add(stuffyNose);
network.add(fever);
network.add(tired);
network.add(sick);

Next is the important step of defining the codependency of the various nodes/features. This is where we indicate how one node affects the state of another node. For example, if you're older you are more likely to be tired, regardless of the other states, even if you aren't sick. Therefore there is an obvious dependency between the age and tired nodes. Here is an example of how you might choose to connect the various nodes:

network.connect(season, stuffyNose);
network.connect(season, fever);
network.connect(season, tired);
network.connect(season, sick);
network.connect(age, stuffyNose);
network.connect(age, fever);
network.connect(age, tired);
network.connect(age, sick);
network.connect(tired, fever);
network.connect(tired, stuffyNose);
network.connect(tired, sick);
network.connect(stuffyNose, fever);
network.connect(stuffyNose, sick);
network.connect(fever, sick);

Next we need to actually train the network so it can learn the various probabilities present in the network. In this case the best way to do this would be to input information from a real doctor's database. If the database is large enough, and accurate enough, the network should learn how to diagnose by itself. Here is an example of one iteration of learning; in reality you should do many more than just one:

season.setState(SeasonState.WINTER);
age.setState(AgeState.SENIOR);
fever.setState(FeverState.HOT);
stuffyNose.setState(BooleanState.TRUE);
tired.setState(BooleanState.TRUE);
sick.setState(BooleanState.TRUE);
 
network.learnStates();

This teaches the network that in this case the season was winter, the person was a senior, they had a hot fever, a stuffy nose, and were more tired than usual, and they turned out to be sick. This was just one incident with one person; you will want to train the network on data from several different people for it to learn from.
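
A sketch of what that larger training pass might look like, assuming a hypothetical PatientRecord class (its name and accessors are illustrative, not part of dANN) holding one historical case per entry:

for( PatientRecord record : patientRecords )
{
     // copy each recorded observation into the corresponding node...
     season.setState(record.getSeason());
     age.setState(record.getAge());
     fever.setState(record.getFever());
     stuffyNose.setState(record.getStuffyNose());
     tired.setState(record.getTired());
     sick.setState(record.getSick());

     // ...then let the network learn from this combination of states
     network.learnStates();
}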

Now that the network is trained, the final step is calculating some Conditional Probability. Let's say we want to check how a person's fever affects their chance of being sick (without consideration for the other features); this is how we do it:

Set<BayesianNode> goals = new HashSet<BayesianNode>();
goals.add(sick);
 
Set<BayesianNode> influences = new HashSet<BayesianNode>();
influences.add(fever);
 
sick.setState(BooleanState.TRUE);
 
fever.setState(FeverState.LOW);
double lowPercentage = network.conditionalProbability(goals, influences);
 
fever.setState(FeverState.NONE);
double nonePercentage = network.conditionalProbability(goals, influences);
 
fever.setState(FeverState.WARM);
double warmPercentage = network.conditionalProbability(goals, influences);
 
fever.setState(FeverState.HOT);
double hotPercentage = network.conditionalProbability(goals, influences);
 
assert ( (nonePercentage < lowPercentage)
      && (lowPercentage < warmPercentage)
      && (warmPercentage < hotPercentage) );

We expect that no fever carries the lowest chance that a person is sick, followed by a low fever, then a warm fever, and finally a hot fever with the highest chance. Therefore, if the network is properly trained, the assert on the last line above should always pass without an AssertionError.
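
The same mechanism works for any combination of features. For instance, here is a sketch of asking how likely a senior with a hot fever is to be sick, reusing the goal set from above and only calls already shown in this example:

// the goal set still contains only the sick node; this time we
// condition on two influences at once
influences.clear();

sick.setState(BooleanState.TRUE);
age.setState(AgeState.SENIOR);
influences.add(age);
fever.setState(FeverState.HOT);
influences.add(fever);

double seniorHotFeverPercentage = network.conditionalProbability(goals, influences);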

See Also