Re: A simplistic definition of "ontology"

"Nicola Guarino" <guarino@ladseb.pd.cnr.it>
Message-id: <guarino.1163724279A@150.178.2.3>
Date: Tue, 10 Oct 95 15:50:39 +0100
From: "Nicola Guarino" <guarino@ladseb.pd.cnr.it>
Subject: Re: A simplistic definition of "ontology"
To: "Eduard Hovy" <hovy@isi.edu>, srkb@cs.umbc.edu,
        "Pat Hayes" <phayes@cs.uiuc.edu>
X-Mailer: VersaTerm Link v1.1
Sender: owner-srkb@cs.umbc.edu
Precedence: bulk

Sorry again for being so slow in my answers...

Ed Hovy wrote:

>Just to stir up some blood: 
>
>  An ontology is a collection of symbols that represent (i.e., name) some 
>  set of phenomena in the "external world" within a computer (or possibly 
>  within other, non-implemented, systems, although who knows what that 
>  would be interesting for).  Typically, the phenomena include objects 
>  and processes and states, and typically, these entities are related 
>  among themselves; usually, the ontology names (some of) these relations.  
>

When you establish such a collection of symbols (e.g., a vocabulary), you
have this set of phenomena in "your mind". Suppose, just for the sake of
simplicity, that this set of phenomena is very simple and finite, e.g. the
possible configurations of black and white pixels on a 9x9 matrix. Suppose
your collection of symbols consists of the letters of the English alphabet.
Is this an ontology? I would say no: the ontology is rather the particular
way in which you organize your set of configurations into subgroups of
similar configurations, one for each letter you have "in mind" [see below].
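
To make the contrast concrete, here is a minimal sketch of my own devising
(not part of the example above): the matrix is shrunk to 3x3 and only two
letter-groups are defined so that it stays short, and every name in it is
an illustrative assumption.

# A 3x3 black/white matrix, encoded as the set of its black pixels.
from typing import FrozenSet, Optional, Tuple

Config = FrozenSet[Tuple[int, int]]

# The bare vocabulary: uninterpreted symbols, with no grouping at all.
VOCABULARY = {"I", "O"}

def classify(config: Config) -> Optional[str]:
    """The ontology, in the sense above: the particular way the 2**9
    configurations are organized into subgroups, one per symbol.
    Most configurations belong to no group at all."""
    if config == frozenset({(0, 1), (1, 1), (2, 1)}):    # vertical stroke
        return "I"
    if config == frozenset({(0, 0), (0, 1), (0, 2),      # closed ring
                            (1, 0),         (1, 2),
                            (2, 0), (2, 1), (2, 2)}):
        return "O"
    return None                                          # unclassified

print(classify(frozenset({(0, 1), (1, 1), (2, 1)})))     # -> I
print(classify(frozenset({(1, 1)})))                     # -> None

The point of the sketch: VOCABULARY alone carries no ontological
commitment; everything that deserves the name "ontology" lives in the
grouping that classify embodies.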

Maybe this example is not the right one, since such configurations are still
"symbols", in a sense. You can play a similar game with toy blocks, for
instance: you have a finite number of them, and you can build houses, tunnels,
towers, bridges, airplanes... An ontology should account for those possible
configurations of blocks which can be called "a house"...

In other words, according to this example, an ontology should be able to
establish a way to distinguish "houses" from "bridges". This can be done in
different ways, of course (NOT NECESSARILY BY USING LOGIC, although logic
could be MUCH better for many AI tasks): for instance, one can devise
suitable linguistic patterns which apply to houses and not to bridges.
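
Just to fix ideas, here is a small sketch of one such non-logical
discriminator; the encoding of blocks as (column, height) cells and the
particular pattern are my own assumptions for illustration, not a serious
proposal.

# A structure is a set of blocks; each block occupies one (column, height)
# cell.  The pattern: some block sits above a column that is empty at
# ground level, i.e. the structure arches over open space -- bridge-like.

def spans_a_gap(structure):
    return any(h > 0 and (c, 0) not in structure
               for (c, h) in structure)

bridge = {(0, 0), (2, 0), (0, 1), (1, 1), (2, 1)}   # two piers and a deck
house  = {(0, 0), (1, 0), (0, 1), (1, 1)}           # a solid stack

print(spans_a_gap(bridge))   # True  -> may be called "a bridge"
print(spans_a_gap(house))    # False -> not bridge-like

A logical axiomatization would express the same discrimination
declaratively rather than procedurally.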

Pat Hayes writes:

>I think the issue can be focussed by asking whether it is enough to simply
>*name* the relations, or should we ask an ontology to somehow *specify*
>them.

I have tried to explain this idea above.

>Can the ontology rely on the knowledge of the reader to interpret
>what its symbols mean, or should we think of it rather as a vehicle for
>representing the knowledge that human users use to do that very
>interpreting? I think this tension has been in the ontology community since
>the beginning. Coming as I do from the 'knowledge representation' (rather
>than the 'glossary') side of the divide, I always wonder just how far away
>the other side can be taken to be. My comment which started this exchange
>wasn't really asking for a definition (maybe it's impossible to give a
>*definition* of an ontology which will satisfy everyone) but raising a
>doubt, or a question, about whether it is useful to talk of a mere glossary
>as being an ontology. Or, to put it another way; if an ontology has no
>axioms in it at all, what do we gain by calling it an ontology?

Thinking more about the difference between the two positions, I don't see a
BIG tension between Michael Gruninger's definition of a "glossary-ontology":

>a coherent characterisation of a domain with the terms and their
>inter-relationships carefully considered and defined.

and the more logical definition I have proposed. Maybe I have been
stressing too much the use of logic in my previous messages, but the crucial
point, in my opinion, is not so much the use of logic as the ability to
somehow discriminate among external (real-world) "models". Gruninger's
definition complies with this point; I have trouble, however, with Ed's
definition, since it is too general, allowing a pure, flat vocabulary to be
considered as an ontology [I am sure, however, that Ed's ontologies are not
of this kind].

Pat Hayes writes:

> What one 'has in mind' is presumably expressed
>there in some mental representation. Now, how do we know that this
>mentalese representation in fact has unique models? All the lessons of
>logic would suggest that it usually doesn't; that usually there will be
>nonstandard models of our thoughts.
>...
>Now, if this is the case, then what warrant can we have for insisting that
>there is "a" model that we have in mind? We have no special access to
>models of our mental language: we can only think of things IN that
>language.

Notice that, in the examples above, I am speaking (and thinking) of
*concrete things* (configurations of pixels or blocks): these are the models
(better, the states of affairs) I have "in mind"...

A final comment regarding something I have not understood in Pat's words:

>At  5:10 PM 10/4/95 +0100, Nicola Guarino wrote:
>.....
>e)
>>
>>In order to stress this multiplicity of models, we may refine the definition
>>as follows:
>>
>>"An ontology is a *partial* specification of the intended *possible* models
>>of a logical language"
>>
>
>But then what else is that logical language itself?

Sorry Pat, I am not able to see your point; could you expand a little bit?
You seem to allude to the fact that in this case there would be no
distinction between the language and the ontology, but by "language" I mean
only a set of symbols (or WFFs), with no proper axioms; an ontology, on the
other hand, is a particular set of proper axioms used to restrict the
models of the language in order to avoid (some of) the non-intended ones.
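
For what it is worth, this distinction can be made concrete on a toy scale;
the vocabulary (a single binary predicate "on" over a two-element domain)
and the axiom below are my own illustrative choices.

# The "language": the predicate on/2 with no proper axioms, so every
# interpretation of 'on' over the domain is a model.  The "ontology":
# a proper axiom (irreflexivity) that excludes some non-intended models.

from itertools import product

DOMAIN = ("a", "b")
PAIRS = list(product(DOMAIN, repeat=2))

# All 2**4 = 16 interpretations of 'on' as a subset of DOMAIN x DOMAIN.
all_models = [frozenset(p for p, bit in zip(PAIRS, bits) if bit)
              for bits in product((0, 1), repeat=len(PAIRS))]

def irreflexive(on):
    """The ontology's proper axiom:  forall x. not on(x, x)."""
    return all((x, x) not in on for x in DOMAIN)

intended = [m for m in all_models if irreflexive(m)]
print(len(all_models))   # 16 models of the bare language
print(len(intended))     # 4 models once the axiom is added

Note that the axiom does not pin down a unique model, it only narrows the
set: a *partial* specification of the intended possible models, exactly in
the sense of the definition quoted above.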


                         ---------------------

Nicola Guarino
National Research Council     phone: +39 49 8295751
LADSEB-CNR                    fax:   +39 49 8295778
Corso Stati Uniti, 4          email: guarino@ladseb.pd.cnr.it
I-35127 Padova                WWW: http://www.ladseb.pd.cnr.it/infor/infor.html
Italy