32 thoughts on “I hope to never again hear the word “ontology””
It's better than "deontological," which for some reason has become popular in the legal academic world.
I guess in that case you're never coming over to join my wife and me for dinner, then. Oh well. (Although to be fair I could certainly subscribe to a qualified version, e.g., "I hope to never again here the word 'ontology' spoken by a Political Scientist.")
Now and again there is some VERY PARTICULAR reason that makes me have to utter it, and I am always laughing at myself even before the word is out of my mouth, and have to frame it to the students in some sort of self-ironizing way!
"Hear". Honestly.
How are you with business professors using "deontology"?
Well,
You can hope, even pray, but in the end, will you achieve success?
As a former student of philosophy (and before that, economics) and now a student of politics, I wouldn't be so concerned with "ontology" as I would with "We have solid financial fundamentals" or other commonsense sentences heard before the economic crisis.
After all, if I say "ontology", maybe I am just trying to appear smarter than I am, but with few consequences. That economic common sense, however, had disastrous consequences.
Too bad. I would love to hear you expand upon this more.
Wow–that's more comments than I expected!
Without giving the whole story, I'll just say that I was at a meeting where a few people (not the political scientists in the room!) kept bringing up "ontology," as in, everybody needs to have an ontology.
After a few rounds of this, I asked how this could be, since I didn't even know what an ontology is! Is it something like the spleen, which each of us needs to live, even those primitive people who don't know what the spleen is?
Nobody even responded to me on this. They just kept using the word "ontology" as if it was a regular word. It was a very strange experience.
You don't need a spleen to live. My college roommate had his out while still quite young, and didn't miss it a bit.
Were they using "ontology" in the computer-science sense? I've encountered it in that context, where it seems to have a reasonably specific meaning, i.e. neither a buzzword nor abstruse philosophical jargon.
Wow–I didn't know that about the spleen. Also, yes, I was told it was some sort of CS term. Still seemed strange to me.
They just kept using the word "ontology" as if it was a regular word. It was a very strange experience
I hear other people sometimes have the same experience with "posterior".
Kieran: Sure, but I don't expect that people outside my subfield will necessarily know what "posterior" means in that context. These people were just throwing around "ontology" without any explanation or any expectation that an explanation or translation was needed.
Hmmm, might be worth invoking Mary Midgley:
http://jennydavidson.blogspot.com/2007/07/unusual…
If there is a CS use of "ontology" as distinct from a philosopher's use, it may be allowable in this scheme…
I don't mean to be giving you a hard time, Andrew. I'd probably have had much the same reaction under the circumstances.
Come on, this is the 21st century. Everyone needs an ontology. Just head to WalMart and get yourself one on the cheap – they're usually on the shelves right between Ethics and Moralities.
Ontology is how you represent knowledge or data. Say you have a dataset and you're going to do statistical modeling on it. The variables and cases of the dataset are a reflection of how one has chosen to represent the problem.
Statistical ontologies tend to be simple: cases, variables, and the values the variables can take. Multi-level modeling can address more complex ontologies, ones where there is nesting of cases. Networks are another type of ontology that requires special modeling approaches – but more often networks are just mapped down to being dealt with as cases. There is a sub-field within machine learning called statistical relational learning, where they try to model relations between different types of cases.
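The flat-versus-nested distinction above can be sketched concretely. Here's a minimal, hypothetical illustration (all the variable names and values are invented for the example) of the same data under a flat case/variable representation and under a multilevel one where cases nest inside higher-level cases:

```python
# Flat representation: each case is a row of variable values.
flat = [
    {"student": "a", "school": "s1", "score": 83},
    {"student": "b", "school": "s1", "score": 91},
    {"student": "c", "school": "s2", "score": 77},
]

# Multilevel representation: student cases nested inside school cases.
nested = {}
for row in flat:
    nested.setdefault(row["school"], []).append(
        {"student": row["student"], "score": row["score"]}
    )

# Same data, different ontology: schools are now cases in their own right.
print(nested["s1"])  # the two students nested under school s1
```

The point is that nothing about the data changed; only the chosen representation of the problem did.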
I had exactly the same experience at a meeting with a group of people whom the NIH paid $18.8M to study ontologies ( http://med.stanford.edu/news_releases/2005/septem… ). It took a long time to get a definition of "ontology", which seems to mean basically how you model things. A more complicated meaning is necessary if, e.g., you have hierarchies of diseases or taxonomies of genes, etc. Given that "ontology" has much overlap with "model", I knew which questions to ask: they agreed that there was no way to select which of two ontologies was the better one, and they seemed to have no interest in statistical approaches to infer an ontology from data.
I would only amend the idea of Aleks' comment to say that your ontology describes how you connect your dictionary and syntax to your semantics. I don't have much of a philosophy background, so I'll [naively] say that I've never seen it used outside of problem solving. So, in this case, your ontology is the scheme by which you connect your [computational] model to the manifest world and vice versa.
You, in particular, will be challenged to relate to the CS convention of understanding ontologies. To the average CS person, developing an ontology is a forward experience. Make a description, decompose that description from [effectively] first principles, make a deliberate step forward with a new description, repeat until you have an algorithm for solving your 'problem'.
In your case, the [above-] average Bayesian statistician's case, the ontology is an emergent phenomenon. You have a pile of data, and you slice it and dice it, and look for the meaning inside it. In the end, you have an 'A and then B, and then …' description of the solution to a 'problem', which you got to by [artfully] wandering through the data.
The ontology is a technical artifact to you, so it doesn't make sense for you to care too much about it. I'll make this analogy. To the CS crowd, the ontology implements innate talent, and to the statistical crowd, the ontology is implemented by acquired skill.
I'm guessing that you were hanging out with machine-learning bubbas, no?
I will (comically) add one more note – I am almost certain that I first learned the word "ontology" when I read Thomas Pynchon's "Gravity's Rainbow" as a very obsessive fifteen-year-old (with a good vocabulary…) and had to make a little notebook of all the words I didn't know and look them up afterwards!
Lexicographically speaking, there are now (at least) two senses of the word "ontology" in play. The old-school one, in which ontology is a branch of metaphysics, and the new-school one, where it's a branch of "knowledge representation".
Whether an artificial intelligence practitioner believes his or her ontology is "real" (in the Platonic sense) is the philosophical issue. Today, lots of folks are ignorant of the 2000-plus years of philosophical debate on metaphysics, and simply treat ontologies as a practical tool for creating machine-readable content.
There's a nifty ontology consisting of hypernym, synonym, antonym, and other lexical-conceptual relations in WordNet. It's basically a dictionary that's easier to read into a computer program for use in applications like search.
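A toy version of that kind of lexical ontology is easy to sketch. The entries below are invented for illustration, not the real WordNet data, but they show the basic hypernym ("is-a") chain that a search application might walk:

```python
# Tiny WordNet-style ontology: word -> its direct hypernym (is-a parent).
hypernym = {
    "dog": "canine",
    "canine": "mammal",
    "mammal": "animal",
    "cat": "feline",
    "feline": "mammal",
}

def ancestors(word):
    """Walk the hypernym chain from a word up to the root concept."""
    chain = []
    while word in hypernym:
        word = hypernym[word]
        chain.append(word)
    return chain

print(ancestors("dog"))  # ['canine', 'mammal', 'animal']
```

A search engine can use such chains to decide, say, that a query for "mammal" should also match documents about dogs and cats.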
For some well-developed examples in medicine, check out the Gene Ontology (aka GO), a controlled vocabulary to describe gene function, or the Medical Subject Headings (aka MeSH), which the U.S. National Library of Medicine (aka NLM) uses to catalogue most of the bio-medical research literature.
One of the most problematic, yet widely used, ontologies is the International Classification of Diseases, 9th edition (aka ICD-9). Pretty much all medical billing in the U.S. must be encoded using ICD-9. My favorite ICD-9 code is E845.9, "accident involving spacecraft injuring other person". It's never been used in practice.
Be careful not to confuse E845.9 with E845.0, "accident involving spacecraft injuring occupant of spacecraft", or E845.8, "Accident involving spacecraft injuring ground crew airline employee". Those things have actually happened. E845.9 is a subclass of E845, "accidents involving spacecraft", which belongs to E840-E845, "air and space transport accidents", which belongs to the top-level concept E800-E999, "external cause of injury and poisoning".
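That subclass chain is simple enough to render as a small lookup table. The codes and labels below are taken from the comment above; the parent-following function is just a sketch of how billing software might resolve a code to its top-level concept:

```python
# ICD-9 fragment from the comment: code -> (label, parent code).
icd9 = {
    "E845.9": ("accident involving spacecraft injuring other person", "E845"),
    "E845.0": ("accident involving spacecraft injuring occupant of spacecraft", "E845"),
    "E845": ("accidents involving spacecraft", "E840-E845"),
    "E840-E845": ("air and space transport accidents", "E800-E999"),
    "E800-E999": ("external cause of injury and poisoning", None),
}

def lineage(code):
    """Follow parent links from a code up to the top-level concept."""
    out = [code]
    while icd9[code][1] is not None:
        code = icd9[code][1]
        out.append(code)
    return out

print(lineage("E845.9"))
# ['E845.9', 'E845', 'E840-E845', 'E800-E999']
```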
I wrote a whole blog entry about ICD-9 and E845.
All I know is that ontology recapitulates philology.
The old-school one, in which ontology is a branch of metaphysics, and the new-school one, where it's a branch of "knowledge representation".
That's funny — so in the CS usage ontology means epistemology.
so in the CS usage ontology means epistemology
Booo! An ontology is an outcome of epistemology.
All I know is that ontology recapitulates philology.
No, phrenology.
Booo! An ontology is an outcome of epistemology.
In CS, so it seems. In philosophy, them's fightin' words.
The way to deal with the unrestrained use of ontology is simply to ask the question, "How does your ontology [say the word with just a bit of scorn in your tone] differ from a mere taxonomy?"
Asking this question will, of course, mark you as an insufferable pedant, but that's a small price to pay for what it'll tell you about your counterparts. If they bluster and avoid the question, you can feel free to simply ignore them. If they patiently explain the difference between ontology and taxonomy, using their previous topic as an example, then you should humbly apologize for your confusion. If socially appropriate, offer to buy them the beverage of their choice. You will have learned many useful things. If they say that the distinction really isn't important in this particular case, you'll know you're dealing with folks who are honest, if a bit sloppy in their language and prone to faddishness.
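For what it's worth, the pedant's distinction can be stated in code. Roughly, a taxonomy carries only is-a links, while an ontology can also carry other typed relations; the triples below are made up purely for illustration:

```python
# Taxonomy: only is-a edges, stored as (subject, relation, object) triples.
taxonomy = {("dog", "is_a", "mammal"), ("wolf", "is_a", "mammal")}

# Ontology: the same is-a backbone, plus arbitrary typed relations.
ontology = taxonomy | {
    ("dog", "domesticated_form_of", "wolf"),
    ("dog", "has_part", "tail"),
}

def relation_types(edges):
    """Collect the distinct relation labels used in a set of triples."""
    return {rel for (_, rel, _) in edges}

print(relation_types(taxonomy))  # {'is_a'}
print(relation_types(ontology))  # three relation types
```

On this reading, every taxonomy is a (degenerate) ontology, but not vice versa, which is roughly what the question above is probing for.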
"Ontology" is one of "Semantic Web" spells.
And "Semantic Web" is the Web 3.0 bubble.
twenty comments about ontology, not one comment about description logic?
boo!
I was just reading a book that argued that researchers need a framework from which they can develop a theory out of which models can be designed and tested. Is this use of the word "ontology" something to insert between theory and model? Ugh.
Personally, I like the words ontology and deontology, but only (as one comment noted) when used in discussing philosophy or ethics. I don't see any use for it here that improves upon the already shifting uses of theory/model vocab.
Hmm, but in political science (here specifically speaking for IR scholars) we use "ontology" the "old school" way (as a branch of metaphysics) to refer to how one conceives of, e.g.: the difference (if any) between social and natural kinds, similarly the distinction (if any) between physical objects and ideas and their import, the (non)reducibility of structures to agents, and so on. Seems like a perfectly useful, and fairly common, way to use the term.
In the context of systems biology, an ontology is a method for extracting money from a funding agency.
Yes! In computer science, we don't care about metaphysics, implicitly embrace post-modernism while remaining outwardly dogmatically empirical, and basically call the data structure and API resulting from our epistemological and political wrangling an ontology.
Let's just call them all data structures or "controlled vocabularies" or "coding standards" and be done with all of this high-falutin', mis-appropriated, pseudo-philosophical nonsense. It'll get the philosophers off your back, too — their ears don't prick up at "data structure" or "application programmer interface".
If you followed the link to my earlier blog post, you'll see that it's all about that child of the 1970s, description logics, which were themselves begat by Minsky's frames and Schank's conceptual dependencies, which were all about knowledge representation.
Sadly, my wife Mitzi is stuck in a small lab at NYU for three days munging her boss's experimental methodology into a form acceptable for the funding agency's ontology police. Ontologies are not only a way to extract money from a funding agency, they're also a way to impose work on fundees. With the illusion that it'll somehow foster sharing across researchers. A previous project she worked on for DARPA had the same integration-by-ontology plan, which hit a rather major roadblock before it was cancelled — no one could shoehorn their data into the agreed-upon ontology, and no one could agree how to change it.
Andrew has the best comment: "In the context of systems biology, an ontology is a method for extracting money from a funding agency."
Ontology is a nice, misused word in all sorts of situations. It covers a multitude of sins… or sloppy thinking.
There's some movement in social science (after Critical Realism) to draw attention to ontology, which just comes down to what you think the world is like. Sociology is a great example of a subject with multiple conflicting ontologies, usually dealing with "how much freedom" an individual has within society.
Of course, saying "everyone needs an ontology" is absurd, because everyone has an ontology. Implicit, certainly; and while explicit analysis can be useful, it's often not essential.
In CS (where I first came across the term), the definition (as given above in the comments) is related to the philosophical one, but not directly. A good question, of course, is: "Do you mean ontology as in metaphysics, or as in information science?" If it's the latter, throw in William Ockham's taxonomy question.