Thomas R. Gruber and Gregory R. Olsen

Knowledge Systems Laboratory

Stanford University

701 Welch Road, Building C, Palo Alto, CA 94304

gruber@ksl.stanford.edu

We describe an ontology for mathematical modeling in engineering. The ontology includes conceptual foundations for scalar, vector, and tensor quantities, physical dimensions, units of measure, functions of quantities, and dimensionless quantities. The conceptualization builds on abstract algebra and measurement theory, but is designed explicitly for knowledge sharing purposes. The ontology is being used as a communication language among cooperating engineering agents, and as a foundation for other engineering ontologies. In this paper we describe the conceptualization of the ontology and show selected axioms from its definitions. We describe the design of the ontology and justify the important representation choices. We offer evaluation criteria for such ontologies and demonstrate design techniques for achieving them.

To enable the sharing and reuse of engineering models among engineering tools and their users, it is important to specify a conceptual foundation that makes these distinctions explicit and provides a context- and reader-independent semantics. Toward this end, we have developed a formal ontology for mathematical modeling in engineering, called EngMath. The ontology builds on abstract algebra and measurement theory, adapted to meet the expressive needs of engineering modeling. The specification includes a first-order axiomatization of representational vocabulary that is machine and human readable.

This paper is about the EngMath ontology, and how it exemplifies the design and use of such ontologies in support of agent communication and knowledge reuse. Such an ontology differs from what is found in engineering textbooks and philosophy books in that it is designed as a specification for these knowledge sharing purposes. We begin in Section 2 by describing the role of ontologies as formal specifications and the uses of the EngMath ontology. In Section 3, we define the basic concepts and relations in the ontology. In Section 4, we discuss a series of design decisions and their rationale. In Section 5, we offer design criteria--minimizing ontological commitment and maximizing monotonic extendibility--and demonstrate techniques used to achieve them. In Section 6, we discuss the relationship of the EngMath ontologies to relevant work in philosophy and AI.

For the purpose of knowledge sharing, formal ontologies serve as
*specifications of common conceptualizations*
[20]
among agents. In the philosophy literature, ontology is the systematic account
of Existence--aiming to account for all forms and modes of being
[5].
For AI systems, what can exist in a conceptualized world is determined by what
can be represented.[Note 1]
If agents are to communicate in a shared language or if a body of formally
represented knowledge is to be reused, then there must be some agreement about
a universe of discourse. Furthermore, if the shared language includes
vocabulary denoting entities and relationships in the conceptualization, there
must be some way to specify what can be meaningfully stated in this vocabulary.
Ontologies, in the context of knowledge sharing, are a means for making such
content-specific agreements.

If we assume a common syntax and semantics for a core representation language, then we can specify conceptualizations by writing definitions of shared vocabulary. That is the strategy proposed by the ARPA Knowledge Sharing Effort [33, 35], and is the tack we are taking. A Knowledge Interchange Format (KIF) [16] serves as the language for making assertions and definitions, and ontologies provide axiomatic and textual definitions of relations, functions, and objects. By 'definitions' we mean specifications of the well-formed use of the vocabulary. Definitions include axioms that constrain the interpretation.[Note 2] Such an axiomatization specifies a logical theory, but is not intended as a knowledge base. Instead, the ontology serves as a domain-specific representation language in which knowledge is shared and communicated.

In practice, our ontologies define the vocabulary with which queries and assertions are exchanged among interoperating agents, some of which may be passive (e.g., deductive databases). The agents conform to ontological commitments [19, 20], which are agreements to use the shared vocabulary in a coherent and consistent manner. An ontological commitment is a guarantee of consistency, but not completeness, with respect to queries and assertions using the vocabulary defined in the ontology (cf. [23]). Committed agents may "know" things not implied by the shared ontologies, and may not be able to answer queries that follow from the shared ontologies. Furthermore, the "shared knowledge" of these agents can be viewed at the Knowledge Level, as attributed knowledge that is independent of symbol-level encoding [34]. Thus, the agents may operate on any internal representation desired, as long as they use the shared vocabulary consistently in communication. This model of agent collaboration is being pursued by several groups [9, 15, 22, 32].

Textbook notations for physical quantities vary by author and leave much implicit--relying on context and background knowledge of the reader for proper interpretation. The problem of implicit notation is revealed when students try to encode engineering models using mathematical support software. Human expertise is required to map expressions about physical quantities to the purely mathematical constructs of current commercial math tools (e.g., Matlab, Mathematica, Maple).

The EngMath ontology is intended to provide a formal language sufficient to express the models in engineering textbooks and to map them to mathematical software tools. We view the latter application as an instance of agent communication, which is the subject of the next section.

To illustrate a use of the ontology, consider a simple example of agents exchanging symbolic representations of spring behavior. Agent A is a specialist in the design of springs, and agent B is a specialist in quantity algebra. Agent A needs a solution to a set of equations relating spring and material properties that include the following:

k = (d^4 G) / (8 Dm^3 N),   G = 11,500 kpsi

where k is the spring rate (dimension force/length), d is the wire diameter (length), Dm is the mean coil diameter (length), N is the number of active coils (dimensionless), and G is the shear modulus (force/length^2).

Agent A can send Agent B these equations as a set of KIF sentences, using the vocabulary of the EngMath ontology:

(scalar-quantity k)
(= (physical.dimension k) (/ force-dimension length-dimension))
(scalar-quantity d)
(= (physical.dimension d) length-dimension)
(scalar-quantity Dm)
(= (physical.dimension Dm) length-dimension)
(scalar-quantity N)
(= (physical.dimension N) identity-dimension)
(scalar-quantity G)
(= (physical.dimension G) (* force-dimension (expt length-dimension -2)))
(= k (/ (* (expt d 4) G) (* 8 (expt Dm 3) N)))
(= G (* 11.5 (expt 10 6) psi))

After receiving the equations in this form, agent B can answer questions about the values of the terms such as the diameter (d). The vocabulary used in this interaction, such as the function constant physical.dimension, is defined in the EngMath ontology.
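To make the exchange concrete, the kind of computation agent B performs can be sketched in Python. This is a hypothetical illustration, not part of the EngMath specification or the SHADE implementation: each quantity carries its physical dimension as a map from base-dimension names to rational exponents, so evaluating the spring equation also checks the dimensional consistency that the ontology requires. The numeric values are illustrative assumptions in a coherent unit system.

```python
from fractions import Fraction

# A physical dimension as a map from base-dimension names to exponents.
def dim(**exps):
    return {k: Fraction(v) for k, v in exps.items() if v != 0}

def dim_mul(d1, d2):
    # Dimension product: add exponents, dropping zeros.
    out = dict(d1)
    for k, v in d2.items():
        out[k] = out.get(k, Fraction(0)) + v
        if out[k] == 0:
            del out[k]
    return out

def dim_pow(d, r):
    # Real (here rational/integer) exponentiation of a dimension.
    return {k: v * Fraction(r) for k, v in d.items()}

LENGTH = dim(length=1)
FORCE = dim(force=1)

# A scalar quantity as a (value, dimension) pair.
def q_mul(a, b):
    return (a[0] * b[0], dim_mul(a[1], b[1]))

def q_pow(a, r):
    return (a[0] ** r, dim_pow(a[1], r))

# Quantities from the spring model (illustrative values):
d  = (0.1,  LENGTH)                                 # wire diameter
Dm = (1.0,  LENGTH)                                 # mean coil diameter
N  = (20.0, dim())                                  # active coils (dimensionless)
G  = (11.5e6, dim_mul(FORCE, dim_pow(LENGTH, -2)))  # shear modulus

# k = d^4 G / (8 Dm^3 N)
k = q_mul(q_pow(d, 4), q_mul(G, q_pow(q_mul(q_pow(Dm, 3), N), -1)))
k = (k[0] / 8, k[1])

# Dimensional consistency check: a spring rate must be force/length.
assert k[1] == dim_mul(FORCE, dim_pow(LENGTH, -1))
```

Because the dimensions ride along with the values, a dimensionally inconsistent equation fails at evaluation time instead of producing a silently meaningless number.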

The spring example is typical of problems in introductory engineering textbooks. More complex interactions are required to coordinate commercial design tools on industrial problems.

In the SHADE project [22, 32], we have constructed a set of software agents that interact to support collaboration on industrial problems like satellite system design. One SHADE agent is a specialist in rigid body dynamics (RBD), and another is responsible for the geometric layout of satellite components (Layout). Both commit to the EngMath ontology. The RBD agent queries the Layout agent about the inertial characteristics of a particular component. These characteristics include the mass, whose value is a scalar quantity, and the inertia tensor, whose value is a second-order tensor. In reply, the Layout agent specifies the inertia tensor with respect to a global reference frame and point. The reference frame is part of the shared domain theory for the two agents; it is not implicit in the representation of the tensor. This allows the RBD agent to translate the inertia into a different reference frame convenient for dynamic analysis.

Most SHADE agents are commercial tools wrapped so that they conform to
ontological commitments and communication protocols. These agents are designed
to be conformant at the interface, but are not required to represent the
ontologies internally. Some agents can assimilate an ontology and use it as
input. The Unit Conversion Agent is an example. Its contract is specified
entirely by the EngMath ontology. This agent takes KIF expressions over
quantities, and performs services such as symbolic simplification, unit
conversion, and dimensional consistency verification. It can *read*
ontologies that specify other unit systems, and determine whether the system
is complete for the dimensions specified.

The Compositional Modeling Language (CML) [12] is another example of building on the EngMath ontologies. CML is a modeling language that is intended to synthesize and redesign the various formulations of Compositional Modeling [8, 13, 14, 29] to enable model sharing among research groups. Part of the language design of CML is an ontology about time, continuity, object properties, etc. The semantics of the language are specified axiomatically, using the vocabulary of the CML ontology. The CML ontology builds on the EngMath ontology as a foundation.

The entire ontology is too large and complex to present in static, linear form (about 2000 lines of definitions). The complete specification is available on-line on the World Wide Web in cross-indexed, machine-formatted hypertext [21]. To give a flavor of the details, we have included a few axioms from the actual ontologies in this section.

Although we use the term "physical quantity" for this generalized notion of quantitative measure, the definition allows for nonphysical quantities such as amounts of money or rates of inflation. However, it excludes values associated with nominal scales, such as Boolean state and part number, because they are not amenable to these algebraic operations.

(defrelation PHYSICAL-QUANTITY
  (=> (physical-quantity ?x)
      (and (defined (quantity.dimension ?x))
           (physical-dimension (quantity.dimension ?x))
           (or (constant-quantity ?x)
               (function-quantity ?x)))))

(* mass (* (expt length -1) (expt time -2)))

where mass, length, and time denote physical dimensions.

Physical dimensions can be composed from other dimensions using multiplication and exponentiation to a real power. It is important for Dimensional Analysis [31] that dimensions have certain algebraic properties. The product of any two physical dimensions is also a physical dimension, and the multiplication operator * is associative, commutative, and invertible with an identity element called the identity dimension (i.e., it forms an abelian group with *).
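A minimal sketch of this algebra, assuming a conventional encoding that the ontology does not prescribe: each physical dimension is represented as a map from base-dimension names to exponents, under which the dimension product * becomes exponent addition. The abelian-group properties then hold by construction:

```python
def dmul(d1, d2):
    # Dimension product: add exponents, dropping zeros so that equal
    # dimensions always have equal representations.
    out = dict(d1)
    for k, v in d2.items():
        out[k] = out.get(k, 0) + v
        if out[k] == 0:
            del out[k]
    return out

def dinv(d):
    # Inverse dimension: negate every exponent.
    return {k: -v for k, v in d.items()}

IDENTITY = {}                 # the identity dimension
mass = {"mass": 1}
length = {"length": 1}
time = {"time": 1}

# force = mass * length * time^-2
force = dmul(mass, dmul(length, dmul(dinv(time), dinv(time))))

# Spot checks of the abelian-group axioms:
assert dmul(length, time) == dmul(time, length)                      # commutative
assert dmul(dmul(length, time), mass) == dmul(length, dmul(time, mass))  # associative
assert dmul(force, IDENTITY) == force                                # identity
assert dmul(force, dinv(force)) == IDENTITY                          # inverse
```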

(defrelation PHYSICAL-DIMENSION
  (abelian-group physical-dimension * identity-dimension))

Constant quantities whose physical dimension is the identity dimension are called, paradoxically, dimensionless quantities.

(defrelation DIMENSIONLESS-QUANTITY
  (<=> (dimensionless-quantity ?x)
       (and (constant-quantity ?x)
            (= (quantity.dimension ?x) identity-dimension)))
  (=> (real-number ?x)
      (dimensionless-quantity ?x)))

Dimensional homogeneity is a prerequisite to unit conversion and other algebraic operations on quantities. Consider the simplest type of physical quantity, scalar quantities.

Ellis defines a quantity [type] as exactly that which can be linearly ordered. We needed to depart from this on two fronts to accommodate the sorts of quantities found in engineering models. First, comparability is different for higher-order tensors (see Section 3.5); the tensor order and spatial dimensions of the quantities must be compatible for them to be comparable, and the ordering need not be total. Second, for scalars we insist that the order be dense: one can multiply any scalar quantity of a given physical dimension by a real number and obtain another scalar quantity of that physical dimension. This property also holds for mass, and illustrates that calling something a quantity is a modeling decision. That mass is densely ordered in this way is an assumption of continuum mechanics. It was also a consequence of including the reals as a species of physical quantity. Nonetheless, we depart from writers like Ellis primarily because our goals are slightly different. Our primary responsibility is to explicate a coherent framework that is adequate for expressing the content of engineering models.

The notion of physical dimension is intimately tied up with the notion of physical quantity, and both are primitive concepts ultimately grounded in the comparability of quantities in the world. Thus, from our definitions alone a computer program cannot infer that some entity is a physical quantity unless it is defined in terms of other quantities. The practical consequence of including such primitives in a formal ontology is that the types of such entities must be declared.

In our conceptualization, units of measure are quantities themselves (positive, scalar, constant quantities). A unit of measure is an absolute amount of something that can be used as a standard reference quantity. Like all quantities, units have dimensions, and units can be defined like any other scalar quantity. For example, the kilogram is a unit of measure for the mass dimension. The unit called "pound" can be defined as a mass quantity equal to the kilogram times some constant, just as the quantity 50kg is equal to the product of the unit called "kilogram" and the real number 50. What makes the pound special, compared with quantities like 50kg, is a matter of convention. (We will return to the issue of standard units in Section 3.8.) To provide for unit conversion over all physical dimensions, every product and real-valued exponentiation of a unit is also a unit of measure.

(defrelation UNIT-OF-MEASURE
  ;; units are scalar quantities
  (=> (unit-of-measure ?u) (scalar-quantity ?u))
  ;; units are positive
  (=> (unit-of-measure ?u)
      (forall ?u2
        (=> (and (unit-of-measure ?u2)
                 (= (quantity.dimension ?u) (quantity.dimension ?u2)))
            (positive (magnitude ?u ?u2)))))
  ;; units can be combined using *
  (abelian-group unit-of-measure * identity-unit)
  ;; units can be combined using expt
  (=> (and (unit-of-measure ?u) (real-number ?r))
      (unit-of-measure (expt ?u ?r)))
  ;; * is commutative for units and other quantities
  (=> (and (unit-of-measure ?u) (constant-quantity ?q))
      (= (* ?u ?q) (* ?q ?u))))

The requirement for dimensional consistency fits our intuition. The magnitude
of 50kg in kilograms is 50, but the magnitude of 50kg in meters is undefined.
For higher-order tensor quantities, the value of the magnitude function is an
ordinary dimensionless tensor. Since units of measure are *scalar*
quantities, one can think of the magnitude function as factoring out the
physical dimension of a quantity (returning a dimensionless quantity) and
producing a value normalized on a scale corresponding to the unit.

Although a unit of measure implicitly determines a measurement scale, units of measure are not the same thing as scales in this conceptualization. Measurement scales are a more general way to map quantities to numeric values, and are described in Section 4.7.

(deffunction MAGNITUDE
  (<=> (and (defined (magnitude ?q ?unit))
            (= (magnitude ?q ?unit) ?mag))
       (and (constant-quantity ?q)
            (unit-of-measure ?unit)
            (dimensionless-quantity ?mag)
            (= (quantity.dimension ?q) (quantity.dimension ?unit))
            (defined (* ?mag ?unit))
            (= (* ?mag ?unit) ?q)))
  ;; dimensionless magnitudes can be factored
  (forall (?q ?unit ?mag)
    (=> (and (constant-quantity ?q)
             (unit-of-measure ?unit)
             (dimensionless-quantity ?mag)
             (defined (* ?mag ?q)))
        (= (magnitude (* ?mag ?q) ?unit)
           (* ?mag (magnitude ?q ?unit))))))
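The behavior these axioms specify can be illustrated with a small Python sketch. It is hypothetical: the class Q and the sample units are illustrative assumptions, not EngMath vocabulary. magnitude is defined only when the quantity and the unit share a physical dimension, and it supports unit conversion (here, kilograms to pounds, using the conventional factor 0.45359237 kg per pound):

```python
class Q:
    """A constant scalar quantity: a value in some reference system,
    carried together with its physical dimension (a frozenset of
    (base-dimension, exponent) pairs, so dimensions compare by value)."""
    def __init__(self, value, dimension):
        self.value, self.dimension = value, dimension

MASS = frozenset({("mass", 1)})
LENGTH = frozenset({("length", 1)})

KILOGRAM = Q(1.0, MASS)
POUND = Q(0.45359237, MASS)    # defined from the kilogram, as in the text
METER = Q(1.0, LENGTH)

def magnitude(q, unit):
    # Defined only when q and the unit share a physical dimension.
    if q.dimension != unit.dimension:
        raise ValueError("magnitude undefined: dimension mismatch")
    return q.value / unit.value

fifty_kg = Q(50.0, MASS)
print(magnitude(fifty_kg, KILOGRAM))   # 50.0
print(magnitude(fifty_kg, POUND))      # unit conversion: ~110.23

try:
    magnitude(fifty_kg, METER)         # undefined, as in the axioms
except ValueError as e:
    print(e)
```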

This is an interesting representation problem, because both the set of units and the space of physical dimensions are conventions, and both are constrained (but not determined) by the background domain theory assumed in a model. The set of dimensions and their mutual relationships are determined by a physical theory, while the choice of units for each dimension is a measurement convention. The relationship between force, mass, length, and time is given by physics. The theory does not need to give fundamental status to any one physical dimension, but it does say that the force dimension is equal to (* (* length mass) (expt time -2)). One system of measurement may take mass, length, and time to be primitive and derive force; another could take force as primitive and derive mass. The same physical laws could be expressed in either system.

The concept of system of units is defined so that commitments to physical theories, sets of fundamental dimensions, and standard units are independent. To define a system of units, the model builder chooses a set of fundamental dimensions that are orthogonal (i.e., not composable from each other). According to this physical theory, mass and time are orthogonal, but force and mass are not. The set of fundamental dimensions determines the space of possible quantities that can be described in this system--those whose physical dimensions are some algebraic combination of the fundamental dimensions. For each of the fundamental dimensions, the model builder chooses a standard unit of that dimension; these are called the base-units of the system. Then every other standard unit in the system is a composition (using * and expt) of units from the base set. For example, the Système International (SI) is a system of units that defines a set of seven fundamental dimensions with the base-units meter, kilogram, second, ampere, kelvin, mole, and candela.

(defrelation SYSTEM-OF-UNITS
  (<=> (system-of-units ?s)
       (and (class ?s)
            (subclass-of ?s unit-of-measure)
            ;; The base-units of the system are
            ;; those with fundamental dimensions.
            (defined (base-units ?s))
            (=> (member ?unit (base-units ?s))
                (instance-of ?unit ?s))
            (orthogonal-dimension-set
              (setofall ?dim
                (exists ?unit
                  (and (member ?unit (base-units ?s))
                       (= ?dim (quantity.dimension ?unit))))))
            ;; Every unit in the system is the
            ;; standard unit for its dimension.
            (=> (instance-of ?unit ?s)
                (= (standard-unit ?s (quantity.dimension ?unit))
                   ?unit)))))

(defrelation ORTHOGONAL-DIMENSION-SET
  (<=> (orthogonal-dimension-set ?s)
       (and (set ?s)
            (=> (member ?d ?s)
                (and (physical-dimension ?d)
                     (not (dimension-composable-from
                            ?d (difference ?s (setof ?d)))))))))

(defrelation DIMENSION-COMPOSABLE-FROM
  (<=> (dimension-composable-from ?d ?s)
       (or (member ?d ?s)
           (exists (?d1 ?d2)
             (and (dimension-composable-from ?d1 ?s)
                  (dimension-composable-from ?d2 ?s)
                  (= ?d (* ?d1 ?d2))))
           (exists (?d1 ?real)
             (and (dimension-composable-from ?d1 ?s)
                  (real-number ?real)
                  (= ?d (expt ?d1 ?real)))))))
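Under the conventional exponent-vector encoding of dimensions (an illustrative assumption, not part of the ontology), dimension-composable-from with * and real-valued expt amounts to membership in the real linear span of the set's exponent vectors, so an orthogonal-dimension-set is exactly a linearly independent set. A Python sketch using a small rank computation:

```python
def rank(rows):
    # Rank of a small real matrix via Gauss-Jordan elimination.
    m = [list(r) for r in rows]
    rk, col, ncols = 0, 0, len(m[0]) if m else 0
    while rk < len(m) and col < ncols:
        pivot = next((i for i in range(rk, len(m)) if abs(m[i][col]) > 1e-12), None)
        if pivot is None:
            col += 1
            continue
        m[rk], m[pivot] = m[pivot], m[rk]
        for i in range(len(m)):
            if i != rk and abs(m[i][col]) > 1e-12:
                f = m[i][col] / m[rk][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[rk])]
        rk += 1
        col += 1
    return rk

def orthogonal_dimension_set(dims, bases):
    # Dimensions as exponent vectors over a fixed list of base-dimension names;
    # the set is orthogonal iff no member lies in the span of the others,
    # i.e., iff the vectors are linearly independent.
    vecs = [[d.get(b, 0) for b in bases] for d in dims]
    return rank(vecs) == len(vecs)

bases = ["mass", "length", "time"]
mass = {"mass": 1}
length = {"length": 1}
time = {"time": 1}
force = {"mass": 1, "length": 1, "time": -2}

print(orthogonal_dimension_set([mass, time], bases))                 # True
print(orthogonal_dimension_set([force, mass, length, time], bases))  # False
```

The second set fails because force is composable from mass, length, and time, matching the paper's example that mass and time are orthogonal but force and mass are not.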

For vector quantities, the sum of the quantities is only defined where the sum of the dimensionless versions of the vectors would be defined (i.e., the spatial dimensions must align). For higher-order tensors, tensor order and spatial dimensions, as well as the physical dimension, must be homogeneous. Analogous restrictions apply for the multiplication of tensors.

For function quantities, the sum or product of two function quantities is
another function quantity that is only defined where the domains of the
functions are equal. For unary scalar function quantities, the addition and
multiplication operators are defined to handle a mix of function quantities and
constant quantities. The sum of a constant *k* and a time-dependent
function *f*, for example, is a function defined everywhere *f*(t) is
defined and is equal to *f*(t)+*k*. Continuous time-dependent
quantities can also be defined from others using a time-derivative function.
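The lifting of constants into function quantities can be sketched in Python (hypothetical names; partial functions are modeled here by returning None where undefined):

```python
def lift(c):
    # Lift a constant quantity to a unary function quantity.
    return lambda t: c

def fq_add(f, g):
    # Sum of two function quantities: defined only where both are defined.
    def h(t):
        a, b = f(t), g(t)
        return None if a is None or b is None else a + b
    return h

# f is defined only for t >= 0; (f + k)(t) = f(t) + k wherever f is defined.
f = lambda t: 3.0 * t if t >= 0 else None
k = 2.0
g = fq_add(f, lift(k))

print(g(1.0))    # 5.0
print(g(-1.0))   # None: undefined where f is undefined
```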

Most of the axiomatization in the specialized theories for functions, vectors, and tensors is concerned with specifying the conditions under which algebraic operators apply. This is essential for building agents with guarantees of completeness for some class of quantities.

Of course, the model builder is free to define a function from physical objects to quantities, but this function is not the same thing as a mass quantity. Our formulation of quantities does not preclude such object-to-quantity functions; it provides the language in which to describe the range of those functions. In CML [12], for example, there are functions from objects (e.g., pipes) to time-dependent function quantities. This distinction is central to the semantics of CML, which allows both time-varying and static relations. The object-to-quantity functions are time-independent (e.g., the pipe always has a flow rate), but the quantities are functions of time (e.g., the flow rate is a function quantity that has different values throughout a scenario and can be undefined on some values of time).

The rationale for this decision is again to provide generality and flexibility.
We know from physics that there is no physical basis for giving some quantities
primacy. Measurement theories make a distinction between quantities amenable
to fundamental and associative measurement [6]
.
Again, we have made this distinction irrelevant for engineering models by
avoiding the temptation to define quantities in terms of measurement. The
analogous argument holds for not giving special status to some units of measure
or physical dimensions. Even though it is possible to *compose* units
and dimensions, there is no need to stipulate in the shared theory exactly
which units and dimensions are fundamental.

We say that a physical dimension distinguishes a type or class of quantities that can be meaningfully combined with algebraic operations, can undergo unit conversion, and are comparable. Amount of money is a meaningful dimension because one can accumulate sums of money, do currency conversion, and compare relative wealth. Amount of money can be meaningfully combined with other dimensions. A rate of inflation, for example, is computed by dividing an amount of money (a change in price) by a different amount of money (the base price) and dividing the result by a unit of time (a year). The rate is a quantity of dimension (expt time -1). Money is something to be tagged as different in type from other quantities; that's why we see dollar signs (or other units) carried with the numbers in formulae.

As a negative example, consider quantities like number of politicians and number of constituents. We might write a formula such as

(= Nconstituents (* Npoliticians 1000000))

In this model, we are making an abstraction; these quantities of humans are being compared as numbers. The = sign implies that the quantities must be of the same physical dimension, which would be, in this case, the identity dimension. Suppose, however, that we wanted to describe the number of molecules in those politicians. There is an international unit for number of molecules, the Mole.[Note 3] In the modern SI system of units, this is not a dimensionless quantity, but a quantity of the dimension amount of substance. Why does amount of substance get special status over number of politicians? Because chemical models need to distinguish amount of substance quantities from other measures. There is something homogeneous about molecules in that it makes sense to measure stuff by counting them. The formula for the average amount of gas in those politicians would use an amount of substance quantity for the amount of gas, and a dimensionless quantity for the number of politicians. This makes sense in the chemist's abstraction of politicians: that they can be viewed as volumes of gas.

By this mathematical definition, the units of measure in our conceptualization correspond to ratio scales.[Note 4] Each value on the Kelvin scale, for example, is the ratio of the measured quantity to the degree-Kelvin unit. Thus the Kelvin scale can be defined from the degree-Kelvin unit using the magnitude function:

(lambda (?q) (magnitude ?q degree-kelvin))

We don't call Mohs ratings or degrees-Celsius units of measure, because they aren't quantities against which to compare other quantities. Of course one can write a function that does measure any temperature on the Celsius scale, such as

(lambda (?q) (- (magnitude ?q degree-kelvin) 273.15))

Since there is no principled constraint on the form of such functions, we leave it to the model builder to define scales appropriate to a domain.
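The distinction can be illustrated in Python (a hypothetical sketch; the tuple encoding of quantities is an illustrative assumption): the Kelvin scale is a ratio scale induced by a unit via the magnitude function, while the Celsius scale is a model-builder-defined function that no unit induces.

```python
# A quantity as a (dimension tag, value relative to the base unit) pair.
DEGREE_KELVIN = ("temperature", 1.0)

def magnitude(q, unit):
    qdim, qval = q
    udim, uval = unit
    # Defined only for dimensionally homogeneous arguments.
    if qdim != udim:
        raise ValueError("magnitude undefined: dimension mismatch")
    return qval / uval

# A ratio scale induced by a unit of measure ...
kelvin_scale = lambda q: magnitude(q, DEGREE_KELVIN)
# ... versus a scale defined by the model builder, not induced by any unit.
celsius_scale = lambda q: magnitude(q, DEGREE_KELVIN) - 273.15

boiling = ("temperature", 373.15)
print(kelvin_scale(boiling))    # 373.15
print(celsius_scale(boiling))   # ~100.0
```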

Two of the criteria will be illustrated here: to minimize *ontological
commitment* while allowing for *monotonic extendibility*. Minimizing
ontological commitment means making as few claims as possible about the world
being modeled, allowing the parties committed to the ontology freedom to
specialize and instantiate the ontology as needed. Extendibility means an
ontology should be crafted so that one can extend and specialize the ontology
*monotonically*. In other words, one should be able to define new terms
for special uses based on the existing vocabulary, in a way that does not
require the revision of the existing definitions. Both of these criteria hold
a natural tension with the goal of supporting the knowledge sharing needs of a
range of agents with differing abilities and assumptions. Adding vocabulary to
handle a broad range of representation needs will increase ontological
commitment by adding more constraints on the interpretation, and will make it
more likely that some definitions will be incompatible with future
representation needs.

In this section, we will discuss two techniques in the design of the EngMath ontology that help us meet these two criteria. One is the decomposition of a large ontology into modules. The second is another familiar design technique--parameterization--applied to the problem of representing conventions.

Figure 1 shows the inclusion lattice of theories for the EngMath family. The core ontology of physical quantities includes abstract algebra (evolved from the example in the KIF 3.0 specification [16]) and a theory of objects and relations called the Frame Ontology [20]. The EngMath family includes ontologies for Scalar Quantities, Vector Quantities, and Unary Scalar Functions. The Standard Units ontology defines many of the most common physical dimensions and units, and includes the SI system. Other engineering ontologies that build on the EngMath family--for describing component structure, design tasks, discrete events, and specific analysis domains such as kinematics--are being developed.

Figure 1: Inclusion lattice of ontologies

Decomposing into loosely coupled ontologies helps minimize ontological commitment by allowing one to commit to a coherent subset of the axioms of the entire ontology. For example, one can commit to scalars but not vectors; unary functions (for ODE models) but not n-ary functions (for PDE models); and static models (no functions). Even an agent that does not include physical dimensions in its conceptualization can be accommodated, since all the real-number operations are also defined for quantities, and the reals are quantities.

Decomposition also supports the specialization of monotonically extendible ontologies. For example, the CML ontology inherits the basic commitments of the Unary Scalar Functions ontology and adds notions specific to its needs. Since the definitions in Unary Scalar Functions anticipate the general class of time-dependent functions, the CML extensions were possible without redefining the originals.

Designing a family of coherent ontologies is more difficult than designing a
monolithic one, since the designer must anticipate intermodule interactions,
such as axioms in several ontologies constraining the same vocabulary. To help
manage the extension of vocabulary across ontologies, we borrow a technique
from software engineering: *polymorphism*.

Polymorphism allows the same function constant to be defined in several places,
where each definition adds axioms about the use of the constant. The +, *, and
`expt` functions, for example, are polymorphically extended to each type
of quantity and to physical dimensions. The form of a polymorphic
definition of a function F is:

(=> (and (p ?x) (q ?y))
    (=> (= (F ?x ?y) ?z)
        (r ?x ?y ?z)))

where the antecedent restricts the argument types for which the axiom applies, and r constrains the values of F on those arguments.

Due to theory inclusion, the definition of a function is the union of axioms contributed by each theory. For example, the definition of + for the Vector Quantities ontology is the union of the axioms for + for vector quantities, + for scalar quantities, + for physical quantities in general, and + for the reals.
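An analogy from Python may help (illustrative only; EngMath definitions are axiomatic, not procedural): functools.singledispatch lets separate modules each register a case for one generic function, so the effective definition is the union of the registered cases, and a new theory can extend it monotonically without revising the existing ones.

```python
from functools import singledispatch

@singledispatch
def plus(a, b):
    # No theory has contributed a case for this argument type.
    raise TypeError("plus not defined for %r" % type(a).__name__)

# A "Scalar Quantities" module contributes a case for reals ...
@plus.register(float)
def _(a, b):
    return a + b

# ... and a "Vector Quantities" module later contributes another case,
# without touching the earlier definition (monotonic extension).
@plus.register(tuple)
def _(a, b):
    return tuple(x + y for x, y in zip(a, b))

print(plus(1.5, 2.5))                 # 4.0
print(plus((1.0, 2.0), (3.0, 4.0)))   # (4.0, 6.0)
```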

To minimize ontological commitment, we formulate these choices as parameters of the engineering model, rather than global constants of the shared ontology. To support extendibility, we provide an expressive representational vocabulary for specifying values of parameters. For example, to allow the model builder to specify the choices of fundamental dimensions and standard units, we provide the machinery to define systems of units.

It is in this sense that ontologies are a "coupling mechanism" [18]
for knowledge bases and knowledge based agents. Parameters such as the system
of units for a domain model are like abstract data types in software, except
that the latter are ground and parameters of a domain model can be
*theories*. We mentioned that the Unit Conversion agent can take, as
inputs, ontologies specifying systems of units. When the other SHADE tools
exchange sets of equations, they are also exchanging theories. When agents pass
around theories, the constraints on what they can pass around are specified in a
shared ontology. In this sense, ontologies play a role similar to that of database
schemata, except that ontologies may require a more expressive language for
constraints than is typical in database specifications.

The aim of the philosophy texts is to describe The World as it Is in its Entirety, and to relate the results to prior writings. For the philosopher, it is important to relate the nature of quantities to the process of measurement and observation, and more generally, to the question of scientific knowledge. For instance, even the very notion that quantities such as mass can be meaningfully expressed as linear ratios of a standard turns out to be a convention, albeit a very useful and familiar one. Ellis [11] argues that the notion of a unit is incomplete without a choice of how such units are combined (which is related to, but not the same as, measurement procedure). We are assuming that there is a shared interpretation to the result of multiplying the meter times the real number 1000.

For our purposes--the sharing and communication of engineering models in
machine and human readable forms--it is an *advantage* to be able to
isolate the meaning of quantities from the process of measurement. We make no
apologies: this is not a sweeping-under-the-rug of relevant issues, but a
strategic decoupling of issues. In building ontologies we are writing social
contracts. We are free to invent the conceptualization as long as its meaning
can be effectively communicated (which is why we use standard terminology and
logic). By accepting the KIF language as a foundation, for example, we already
commit to including abstract things like sets and relations in the universe of
discourse. We add to that an ontology of abstract algebra, and extend it to
physical quantities. The philosophical ontologist draws a heavy line when the
objects cross from timeless mathematical entities to physical concepts like
mass and length. Our agents need no strong boundary; if our engineering model
states that a quantity of mass exists and is related to a length quantity by
some algebraic expression, it is so for the agent. According to our evaluation
criteria [19]
,
clarity and coherence in the specification are paramount, and faithfulness with
The World is not an issue.

Nonetheless, we can "share and reuse" the analysis found in philosophy writing,
for very pragmatic ends. For example, one of the differences between a casual
writing on quantities and a careful one is the treatment of property
association. One often sees it stated that quantities are "properties" of
"objects" -- like qualities but quantitative. However, the careful writer will
point out that quantities as we use them are *relational*--they are about
comparing objects in some respect, or about relationships among them (e.g.,
distance between reference points). This informs our design. Departing from
the conventional "object oriented" approach, we give independent status to
quantities and leave it as a modeling decision to map from objects to
quantities. This is, in essence, a decoupling of theories about quantities
from theories about model formulation.
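The design stance can be sketched in code. In the following (a Python illustration with hypothetical names, not the EngMath vocabulary itself), quantities are first-class entities; the association between objects and quantities, and among objects via relational quantities such as distance, is stated explicitly as part of the model rather than buried in object attributes:

```python
from dataclasses import dataclass

# Sketch of the design stance discussed above: quantities get
# independent status, and mapping objects to quantities is a separate,
# explicit modeling decision. Names are illustrative, not EngMath terms.

@dataclass(frozen=True)
class Quantity:
    magnitude: float
    unit: str  # dimension machinery elided in this sketch

# A relational quantity: distance holds between a pair of reference
# points; it is not a property of any single object.
distance = {("point-a", "point-b"): Quantity(2.5, "meter")}

# Associating an object with a quantity is itself model content:
mass = {"beam-1": Quantity(12.0, "kilogram")}

print(distance[("point-a", "point-b")].magnitude)  # 2.5
print(mass["beam-1"].unit)                         # kilogram
```

The point of the sketch is the separation: the `Quantity` theory says nothing about objects, and the `distance` and `mass` mappings are modeling decisions layered on top of it.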

Similarly, an understanding of the nature of physical theory and measurement guides us away from the impulse to oversimplify for the sake of computational elegance. For example, it would be simpler from a computational point of view to fix a single basis set of "fundamental" physical dimensions or units of measure, and recursively derive the rest. However, the laws of physics tell us that there are no inherently privileged physical dimensions, and the study of measurement tells us that the choice of basis sets for dimensions and units is a convention. The fact that engineers must deal with at least two systems of units, each of which chooses a different basis set (and which has changed over historical time), motivates us to provide the model builder with the representational machinery to define a system of units as part of the domain model.
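To make the representational machinery concrete, here is a minimal sketch (in Python, with hypothetical names; EngMath itself is axiomatized in KIF) in which a physical dimension is an exponent map over a model-chosen basis, and each system of units is defined as model data, so that no basis set is privileged:

```python
from fractions import Fraction

# Sketch, not the EngMath axiomatization: a physical dimension is a map
# from model-chosen base dimensions to rational exponents. The basis is
# part of the domain model, since no set of dimensions is privileged.
def dim(**exponents):
    return {k: Fraction(v) for k, v in exponents.items()}

def dim_mul(a, b):
    """Compose dimensions by adding exponents (the dimension 'product')."""
    out = dict(a)
    for k, v in b.items():
        out[k] = out.get(k, Fraction(0)) + v
        if out[k] == 0:
            del out[k]
    return out

LENGTH, MASS, TIME = dim(L=1), dim(M=1), dim(T=1)
FORCE = dim_mul(MASS, dim_mul(LENGTH, dim(T=-2)))  # M L T^-2

# A system of units is model data: each unit pairs a dimension with a
# scale factor relative to an arbitrary reference unit per dimension.
si      = {"meter": (LENGTH, 1.0),    "newton":      (FORCE, 1.0)}
english = {"foot":  (LENGTH, 0.3048), "pound-force": (FORCE, 4.448222)}

def convert(value, unit_from, unit_to):
    (d1, s1), (d2, s2) = unit_from, unit_to
    assert d1 == d2, "conversion requires identical physical dimension"
    return value * s1 / s2

print(convert(10, english["foot"], si["meter"]))  # ~3.048
```

Because both systems are defined over the same dimension algebra, conversion between them is meaningful, yet the model builder remains free to introduce new basis dimensions or alternative unit systems.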

A growing body of ontologies is appearing in the literature of the knowledge representation community; see, as a sample, [3, 10, 28, 30, 39].

The most closely related engineering ontology is the thesis by Alberts [2]. Alberts describes a formal ontology intended as the basis for building interoperable and reusable knowledge systems for design. His ontology provides a vocabulary for modeling the structure and behavior of systems, based on systems theory [38] and finite element modeling. While the formalization of systems is exemplary, the treatment of quantities in that ontology is simplistic. First, quantities have no status other than as the values of variables with symbolic 'quantity types.' There is no provision for defining new dimensions or describing complex dimensions; the quantity types appear to be those that are anticipated by systems theory (analogs of effort and flow). Second, the values are always unary scalar functions of time, where time is "exceptional" (i.e., outside the range of the model). This prevents vector and tensor models, PDE models, phase-space models where time is not the independent variable, etc. Third, the values of these functions are tuples of numbers and fixed units.

More recent work by Akkermans and Top [1] develops the systems theory methodology to maturity. It proposes engineering ontologies at four levels of description: functional components, physical processes, mathematical relations, and model data. Work on model formulation using the CML language [12] and SHADE engineering agents [32] aims at a suite of ontologies, not based on systems theory, that have similar coverage. Perhaps the ontology presented in this paper can provide a foundation for the mathematical and data levels of models in these comprehensive engineering ontologies.

[2] L. K. Alberts. *YMIR: An Ontology for Engineering Design*. Doctoral
dissertation, University of Twente, 1993.

[3] J. A. Bateman, R. T. Kasper, J. D. Moore, & R. A. Whitney. A General Organization of Knowledge for Natural Language Processing: The Penman Upper Model. USC/Information Sciences Institute, Marina del Rey, CA, Technical report, 1990.

[4] M. Bunge. *Treatise on Basic Philosophy, Volumes 3-4: Ontology*. D.
Reidel Publishing Company, Dordrecht, Holland, 1979.

[5] H. Burkhardt & B. Smith (Eds.). *Handbook of Metaphysics and
Ontology*. Philosophia Verlag, Munich, 1991.

[6] N. R. Campbell. *An Account of the Principles of Measurement and
Calculations*. Longmans, Green, London, 1928.

[7] C. H. Coombs. The theory and methods of social measurement. In L.
Festinger & D. Katz, Ed., *Research Methods in the Behavioral
Sciences*, Dryden Press, New York, 1952.

[8] J. Crawford, A. Farquhar, & B. Kuipers. QPC: A Compiler from Physical
Models into Qualitative Differential Equations. *Proceedings of the Eighth
National Conference on Artificial Intelligence, *Boston, pages 365-371. AAAI
Press/The MIT Press, 1990.

[9] M. Cutkosky, R. S. Engelmore, R. E. Fikes, T. R. Gruber, M. R. Genesereth,
W. S. Mark, J. M. Tenenbaum, & J. C. Weber. PACT: An experiment in
integrating concurrent engineering systems. *IEEE Computer*,
**26**(1):28-37, 1993.

[10] E. Davis. *Representations of Commonsense Knowledge*. Morgan
Kaufmann, San Mateo, 1990.

[11] B. Ellis. *Basic Concepts of Measurement*. Cambridge University
Press, London, 1966.

[12] B. Falkenhainer, A. Farquhar, D. Bobrow, R. Fikes, K. Forbus, T. Gruber, Y. Iwasaki, & B. Kuipers. CML: A compositional modeling language. Stanford Knowledge Systems Laboratory, Technical Report KSL-94-16, January 1994.

[13] B. Falkenhainer & K. D. Forbus. Compositional modeling: Finding the
right model for the job. *Artificial Intelligence*, **51**:95-143,
1991.

[14] K. D. Forbus. Qualitative Process Theory. *Artificial Intelligence*,
**24**:85-168, 1984.

[15] M. R. Genesereth. An Agent-Based Framework for Software Interoperability.
*Proceedings of the DARPA Software Technology Conference, *Meridian
Corporation, Arlington VA, pages 359-366. 1992.

[16] M. R. Genesereth & R. E. Fikes. Knowledge Interchange Format, Version 3.0 Reference Manual. Computer Science Department, Stanford University, Technical Report Logic-92-1, March 1992.

[17] M. R. Genesereth & N. J. Nilsson. *Logical Foundations of
Artificial Intelligence*. Morgan Kaufmann Publishers, San Mateo, CA, 1987.

[18] T. R. Gruber. The Role of Common Ontology in Achieving Sharable, Reusable
Knowledge Bases. In James A. Allen, Richard Fikes, & Erik Sandewall, Ed.,
*Principles of Knowledge Representation and Reasoning: Proceedings of the
Second International Conference, *Cambridge, MA, pages 601-602. Morgan
Kaufmann, 1991.

[19] T. R. Gruber. Toward principles for the design of ontologies used for
knowledge sharing. In Nicola Guarino, Ed., *International Workshop on Formal
Ontology, *Padova, Italy, 1992. Revised August 1993, to appear in *Formal
Ontology in Conceptual Analysis and Knowledge Representation,* Guarino
& Poli (Eds), Kluwer, in preparation.

[20] T. R. Gruber. A Translation Approach to Portable Ontology Specifications.
*Knowledge Acquisition*, **5**(2):199-220, 1993.

[21] T. R. Gruber & G. R. Olsen. *The Engineering Math Ontologies*.
World Wide Web, URL
"http://www-ksl.stanford.edu/knowledge-sharing/README.html", 1994.

[22] T. R. Gruber, J. M. Tenenbaum, & J. C. Weber. Toward a knowledge
medium for collaborative product development. In John S. Gero, Ed.,
*Artificial Intelligence in Design '92*, pages 413-432. Kluwer Academic
Publishers, Boston, 1992.

[23] N. Guarino. An ontology of meta-level categories. *KR'94*, 1994.

[24] N. Guarino & R. Poli (Eds.). *Formal Ontology in Conceptual
Analysis and Knowledge Representation*. Kluwer, in preparation.

[25] R. V. Guha. *Contexts: A Formalization and Some Applications*.
Doctoral dissertation, Stanford University, 1991.

[26] D. Halliday & R. Resnick. *Physics*. John Wiley and Sons, New
York, 1978.

[27] G. Hirst. Ontological assumptions in knowledge representation.
*KR'89*, 1989.

[28] J. R. Hobbs & R. C. Moore (Eds.). *Formal Theories of the Common
Sense World*. Ablex, Norwood, NJ, 1985.

[29] Y. Iwasaki & C. M. Low. Model Generation and Simulation of Device
Behavior with Continuous and Discrete Changes. *Intelligent Systems
Engineering*, **1**(2), 1993.

[30] D. B. Lenat & R. V. Guha. *Building Large Knowledge-based Systems:
Representation and Inference in the Cyc Project*. Addison-Wesley, Menlo
Park, CA, 1990.

[31] B. S. Massey. *Measures in Science and Engineering: Their Expression,
Relation, and Interpretation*. Ellis Horwood Limited, 1986.

[32] J. G. McGuire, D. R. Kuokka, J. C. Weber, J. M. Tenenbaum, T. R. Gruber,
& G. R. Olsen. SHADE: Technology for knowledge-based collaborative
engineering. *Journal of Concurrent Engineering: Applications and Research
(CERA)*, **1**(2), 1993.

[33] R. Neches, R. E. Fikes, T. Finin, T. R. Gruber, R. Patil, T. Senator,
& W. R. Swartout. Enabling technology for knowledge sharing. *AI
Magazine*, **12**(3):16-36, 1991.

[34] A. Newell. The knowledge level. *Artificial Intelligence*,
**18**(1):87-127, 1982.

[35] R. S. Patil, R. E. Fikes, P. F. Patel-Schneider, D. McKay, T. Finin, T. R.
Gruber, & R. Neches. The DARPA Knowledge Sharing Effort: Progress report.
*Principles of Knowledge Representation and Reasoning: Proceedings of the
Third International Conference, *Cambridge, MA, Morgan Kaufmann, 1992.

[36] D. A. Randall & A. G. Cohn. Modelling topological and metrical
properties in physical processes. *KR'89*, Morgan Kaufmann, 1989.

[37] W. C. Reynolds & H. C. Perkins. *Engineering Thermodynamics*.
McGraw-Hill, 1977.

[38] R. C. Rosenberg & D. C. Karnopp. *Introduction to physical system
dynamics*. McGraw-Hill, New York, 1983.

[39] Y. Shoham. Temporal logics in AI: Semantical and ontological
considerations. *Artificial Intelligence*, **33**:89-104, 1987.

[Note 2] For the purpose of specifying common conceptualizations, we see no justification for restricting the form of the definitions, e.g., to necessary and sufficient conditions, or for requiring that they be conservative definitions (which make no claims about the world).

[Note 3] It's actually the number of particles, but if you understand that distinction you understand our point here.

[Note 4] Ellis [11] gives no special status to units, saying that they are merely the names for the associated scales. We wanted to allow agents to commit to unit conversion without committing to a theory of measurement scales.