Machine Learning

Tuesday, March 01, 2005

KIML: known prior work

I mentioned Bayes nets as the best existing example of what "knowledge-intensive machine learning" (KIML) is about. There really aren't many techniques out there for learning given priors or constraints generated from high-level qualitative knowledge, and the few I can think of are all structural.
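
To make "priors generated from qualitative knowledge" concrete, here is a minimal sketch in which a qualitative statement like "wet grass is likely given rain" is turned into Dirichlet pseudo-counts that bias the estimate of a Bayes-net conditional probability table. The variables, data, and pseudo-count values are all hypothetical, chosen just to show the mechanics.

```python
# Qualitative knowledge ("wet grass is likely given rain") encoded as
# Dirichlet pseudo-counts, acting as a prior on a Bayes-net CPT.
# Data and pseudo-count values are illustrative assumptions.
data = [("rain", "wet"), ("rain", "wet"), ("rain", "dry"), ("sun", "dry")]

# Pseudo-counts skewed toward P(wet | rain) being high.
prior = {("rain", "wet"): 8, ("rain", "dry"): 2,
         ("sun", "wet"): 2, ("sun", "dry"): 8}

counts = dict(prior)
for weather, grass in data:
    counts[(weather, grass)] += 1

# Posterior-mean estimate of P(grass | weather): the prior dominates
# where data is scarce and washes out as observations accumulate.
for weather in ("rain", "sun"):
    total = counts[(weather, "wet")] + counts[(weather, "dry")]
    for grass in ("wet", "dry"):
        print(f"P({grass} | {weather}) = {counts[(weather, grass)] / total:.3f}")
```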

Inductive logic programming fits in here in some sense, but it requires the intensional and extensional information to be in the same format -- logical statements -- which is a little unnatural for ML folks, and the existing systems don't deal well with uncertainty. Probabilistic logic programs (BLPs and SLPs) come a little closer, but they still only handle structural domain knowledge (first-order rather than propositional).

Qualitative reasoning techniques give us natural and intuitive ways to express qualitative domain knowledge (in particular, knowledge other than structural knowledge), but they don't use that knowledge to learn better models from data, and they have only primitive notions of uncertainty (in particular, they don't deal with probabilities). Knowledge-based model construction offers ways of using domain knowledge to better answer queries, even probabilistic ones, but it generally assumes hand-engineered probabilities rather than ones learned from data.
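
For a sense of the kind of non-structural knowledge qualitative reasoning makes easy to state, here is a hedged sketch of a monotonic-influence constraint -- "more of the cause never decreases the probability of the effect" -- checked against a learned parameter table. The names and numbers are hypothetical; the point is just that such knowledge constrains the parameters a KIML learner should be allowed to output.

```python
# A qualitative monotonic-influence constraint, e.g. "more smoking
# never decreases cancer risk". All names and values are hypothetical.
def satisfies_monotonic_influence(cpt, levels):
    """cpt maps a cause level to P(effect | cause); the constraint
    requires P(effect | cause) to be nondecreasing in the cause."""
    probs = [cpt[level] for level in levels]
    return all(a <= b for a, b in zip(probs, probs[1:]))

learned_cpt = {"none": 0.02, "light": 0.05, "heavy": 0.04}  # from data
print(satisfies_monotonic_influence(learned_cpt, ["none", "light", "heavy"]))
# -> False: the raw data-driven estimate violates the qualitative
#    knowledge, so a learner using it would project the estimate back
#    onto the constraint set (or penalize the violation).
```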

This leaves out the maxent work, which is of course all about finding distributions that satisfy constraints -- clearly something I need to research more.
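
To make the maxent idea concrete, here is a minimal sketch of the classic constrained-entropy setup: among all distributions over a die's faces, find the one with maximum entropy whose mean matches a given value. The target mean and the choice of scipy's SLSQP solver are my own assumptions for illustration.

```python
# Maxent sketch: maximize entropy over die outcomes subject to a
# moment constraint. Target mean and solver choice are assumptions.
import numpy as np
from scipy.optimize import minimize

outcomes = np.arange(1, 7)          # faces of a die
target_mean = 4.5                   # assumed constraint: E[X] = 4.5

def neg_entropy(p):
    # negative Shannon entropy; small epsilon guards against log(0)
    return np.sum(p * np.log(p + 1e-12))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},            # sums to 1
    {"type": "eq", "fun": lambda p: p @ outcomes - target_mean},  # mean match
]
p0 = np.full(6, 1.0 / 6.0)          # start from the uniform distribution
result = minimize(neg_entropy, p0, bounds=[(0, 1)] * 6,
                  constraints=constraints, method="SLSQP")
print(result.x)  # the maxent solution: an exponential tilt of uniform
```

The solution comes out as an exponential-family distribution over the faces, which is exactly the form maxent theory predicts for linear moment constraints.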
