959 results for "Topic model"
Abstract:
Various factors are believed to govern the selection of references in citation networks, but a precise, quantitative determination of their importance has remained elusive. In this paper, we show that three factors can account for the referencing pattern of citation networks for two topics, namely "graphenes" and "complex networks", thus allowing one to reproduce the topological features of the networks built with papers as the nodes and edges established by citations. The most relevant factor was content similarity, while the other two - in-degree (i.e. citation counts) and age of publication - had varying importance depending on the topic studied. This dependence indicates that additional factors could play a role. Indeed, one would intuitively expect the reputation (or visibility) of authors and/or institutions to affect the referencing pattern, and this is only indirectly considered via the in-degree, which should correlate with such reputation. Because information on reputation is not readily available, we simulated its effect on artificial citation networks considering two communities with distinct fitness (visibility) parameters. One community was assumed to have twice the fitness value of the other, which amounts to doubling the probability of a paper being cited. While the h-index for authors in the community with larger fitness evolved over time with slightly higher values than in the control network (no fitness considered), a drastic effect was noted for the community with smaller fitness.
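The fitness mechanism described above lends itself to a compact simulation. The sketch below is a minimal illustration, assuming a citation rule proportional to fitness times (in-degree + 1) and two equally sized communities with fitness 2.0 and 1.0; the function name, parameter values, and the omission of the content-similarity and age factors are illustrative simplifications, not the authors' actual model.

```python
import random

def grow_citation_network(n_papers=1000, cites_per_paper=5, seed=42):
    """Grow a toy citation network: each new paper cites existing ones
    with probability proportional to fitness * (in_degree + 1).
    Papers alternate between two communities (fitness 2.0 vs. 1.0)."""
    rng = random.Random(seed)
    fitness, in_degree = [], []
    for p in range(n_papers):
        if p >= cites_per_paper:
            weights = [f * (k + 1) for f, k in zip(fitness, in_degree)]
            total = sum(weights)
            cited = set()
            while len(cited) < cites_per_paper:  # sample distinct targets
                r = rng.uniform(0, total)
                acc = 0.0
                for q, w in enumerate(weights):
                    acc += w
                    if acc >= r:
                        cited.add(q)
                        break
            for q in cited:
                in_degree[q] += 1
        fitness.append(2.0 if p % 2 == 0 else 1.0)  # two communities
        in_degree.append(0)
    return fitness, in_degree

fitness, in_degree = grow_citation_network()
for f in (2.0, 1.0):
    ks = [k for fi, k in zip(fitness, in_degree) if fi == f]
    print(f"fitness {f}: mean citations = {sum(ks) / len(ks):.2f}")
```

Tracking per-author h-indices on top of such a network is then a matter of grouping papers under synthetic authors and recomputing the index at each time step.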
Abstract:
Remanufacturing is the process of rebuilding used products so that the quality of remanufactured products is equivalent to that of new ones. Although the theme is gaining ground, it is still little explored owing to a lack of knowledge and the difficulty of visualizing it systemically and implementing it effectively. Few models treat remanufacturing as a system; most studies still treat it as an isolated process, preventing it from being seen in an integrated manner. Therefore, the aim of this work is to organize the knowledge about remanufacturing, offering a vision of the remanufacturing system and contributing to an integrated view of the theme. The methodology employed was a literature review, adopting General Systems Theory to characterize the remanufacturing system. This work consolidates and organizes the elements of this system, enabling a better understanding of remanufacturing and assisting companies in adopting the concept.
Abstract:
This PhD thesis addresses the topic of large-scale interactions between climate and marine biogeochemistry. To this end, centennial simulations are performed under present and projected future climate conditions with a coupled ocean-atmosphere model containing a complex marine biogeochemistry model. The role of marine biogeochemistry in the climate system is first investigated. Phytoplankton absorption of solar radiation in the upper ocean enhances sea surface temperatures and upper-ocean stratification. The associated increase in ocean latent heat losses raises atmospheric temperatures and water vapor. Atmospheric circulation is modified at tropical and extratropical latitudes, with impacts on precipitation, incoming solar radiation, and ocean circulation, which cause upper-ocean heat content to decrease at tropical latitudes and to increase at middle latitudes. Marine biogeochemistry is tightly related to physical climate variability, which may vary in response to internal natural dynamics or to external forcing such as anthropogenic carbon emissions. Wind changes associated with the North Atlantic Oscillation (NAO), the dominant mode of climate variability in the North Atlantic, affect ocean properties by means of momentum, heat, and freshwater fluxes. Changes in upper-ocean temperature and mixing impact the spatial structure and seasonality of North Atlantic phytoplankton through light and nutrient limitations. These changes affect the capability of the North Atlantic Ocean to absorb atmospheric CO2 and to fix it in sinking particulate organic matter. Low-frequency NAO phases determine a delayed response of ocean circulation, temperature, and salinity, which in turn affects stratification and marine biogeochemistry. In 20th- and 21st-century simulations, natural wind fluctuations in the North Pacific, related to the two dominant modes of atmospheric variability, affect the spatial structure and the magnitude of the phytoplankton spring bloom through changes in upper-ocean temperature and mixing. The impacts of human-induced emissions in the 21st century are generally larger than natural climate fluctuations, with the phytoplankton spring bloom starting one month earlier than in the 20th century and with ~50% lower magnitude. This PhD thesis advances the knowledge of bio-physical interactions within the global climate, highlighting the intrinsic coupling between physical climate and the biosphere, and providing a framework on which future studies of Earth System change can be built.
Abstract:
In this PhD thesis the crashworthiness topic is studied from the perspective of developing a small-scale experimental test able to characterize a material in terms of energy absorption. The material properties obtained are then used to validate a numerical model of the experimental test itself. Consequently, the numerical model, calibrated on the specific material, can be extended to more complex structures and used to simulate their energy absorption behavior. The experimental activity started at the University of Washington in Seattle, WA (USA) and continued at the Second Faculty of Engineering, University of Bologna, Forlì (Italy), where the numerical model for the simulation of the experimental test was implemented and optimized.
Abstract:
The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations, and the inclusion of component flexibility, is developed: both are necessary if one wants to capture the dynamic effects that arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented in order to properly capture the dynamic effects of the main connections in the system: angular contact ball bearings are modelled according to a five-DOF nonlinear scheme to capture the crankshaft main bearing behaviour, while an impedance-based hydrodynamic bearing model provides enhanced operation prediction at the conrod big-end locations. Concerning the second matter, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations suitable for the subsequent multibody analyses. A particular component mode selection procedure is implemented, based on the concept of Effective Interface Mass, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and modal domains. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques. The advantages over the conventional frequency-based truncation approach are discussed.
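As background for the reduction technique named above, the following numpy/scipy sketch shows the core of a Craig-Bampton transformation: static constraint modes plus a truncated set of fixed-interface normal modes. The 4-DOF chain is a toy example, not the thesis's cranktrain model, and the Effective Interface Mass selection and truncation-augmentation refinements described in the abstract are not included.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(M, K, boundary, n_modes):
    """Reduce (M, K) to Craig-Bampton form: boundary DOFs are kept
    physically, interior DOFs are represented by constraint modes
    plus a truncated set of fixed-interface normal modes."""
    n = M.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    Mii = M[np.ix_(i, i)]
    # Constraint modes: static interior response to unit boundary displacements.
    Psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes: eigenmodes with the boundary clamped.
    w2, Phi = eigh(Kii, Mii)
    Phi = Phi[:, :n_modes]
    # Assemble the transformation T such that q = T @ [u_boundary; eta].
    T = np.zeros((n, len(b) + n_modes))
    T[b, :len(b)] = np.eye(len(b))
    T[i, :len(b)] = Psi
    T[i, len(b):] = Phi
    return T.T @ M @ T, T.T @ K @ T, T

# Toy 4-DOF spring-mass chain; keep DOFs 0 and 3 as boundary DOFs.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])
M = np.eye(4)
Mr, Kr, T = craig_bampton(M, K, boundary=[0, 3], n_modes=1)
print(Mr.shape)  # (3, 3): 2 boundary DOFs + 1 modal coordinate
```

Because the boundary DOFs survive the reduction physically, the reduced component can be assembled directly into a multibody model at its joints, which is what makes the representation convenient here.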
Abstract:
Until a few years ago, 3D modelling was a topic confined to a professional environment. Nowadays technological innovations, most notably the 3D printer, have attracted novice users to this application field. This sudden breakthrough was not supported by adequate software solutions. The 3D editing tools currently available do not assist the non-expert user during the various stages of generation, interaction and manipulation of 3D virtual models. This is mainly due to the current paradigm, which is largely supported by two-dimensional input/output devices and strongly affected by obvious geometrical constraints. We have identified three main phases that characterize the creation and management of 3D virtual models. We investigated these directions, evaluating and simplifying the classic editing techniques in order to propose more natural and intuitive tools in a pure 3D modelling environment. In particular, we focused on freehand sketch-based modelling to create 3D virtual models, on interaction and navigation in a 3D modelling environment, and on advanced editing tools for free-form deformation and object composition. In pursuing these goals we asked how new gesture-based interaction technologies can be successfully employed in a 3D modelling environment, how we could improve depth perception and interaction in 3D environments, and which operations could be developed to simplify the classical virtual model editing paradigm. Our main aim was to propose a set of solutions with which a common user can realize an idea in a 3D virtual model, drawing in the air just as he would on paper. Moreover, we tried to use gestures and mid-air movements to explore and interact in the 3D virtual environment, and we studied simple and effective 3D form transformations. The work was carried out adopting the discrete representation of the models, thanks to its intuitiveness, but especially because it is full of open challenges.
Abstract:
In condensed matter systems, the interfacial tension plays a central role in a multitude of phenomena. It is the driving force for nucleation processes, determines the shape and structure of crystalline structures, and is important for industrial applications. Despite its importance, the interfacial tension is hard to determine in experiments and also in computer simulations. While sophisticated simulation methods exist to compute liquid-vapor interfacial tensions, current methods for solid-liquid interfaces produce unsatisfactory results.

As a first approach to this topic, the influence of the interfacial tension on nuclei is studied within the three-dimensional Ising model. This model is well suited because, despite its simplicity, one can learn much about the nucleation of crystalline nuclei. Below the so-called roughening temperature, nuclei in the Ising model are no longer spherical but become cubic because of the anisotropy of the interfacial tension. This is similar to crystalline nuclei, which are in general not spherical but more like a convex polyhedron with flat facets on the surface. In this context, the problem of distinguishing between the two bulk phases in the vicinity of the diffuse droplet surface is addressed. A new definition is found which correctly determines the volume of a droplet in a given configuration when compared to the volume predicted by simple macroscopic assumptions.

To compute the interfacial tension of solid-liquid interfaces, a new Monte Carlo method called the "ensemble switch method" is presented, which allows one to compute the interfacial tension of liquid-vapor as well as solid-liquid interfaces with great accuracy. In the past, the dependence of the interfacial tension on the finite size and shape of the simulation box has often been neglected, although there is a nontrivial dependence on the box dimensions. As a consequence, one needs to systematically increase the box size and extrapolate to infinite volume in order to accurately predict the interfacial tension. Therefore, a thorough finite-size scaling analysis is established in this thesis. Logarithmic corrections to the finite-size scaling are motivated and identified; these are of leading order and therefore must not be neglected. The astounding feature of these logarithmic corrections is that they do not depend at all on the model under consideration. Using the ensemble switch method, the validity of a finite-size scaling ansatz containing the aforementioned logarithmic corrections is carefully tested and confirmed. Combining the finite-size scaling theory with the ensemble switch method, the interfacial tension of several model systems, ranging from the Ising model to colloidal systems, is computed with great accuracy.
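For orientation, the sketch below is a bare-bones Metropolis sampler for the 3D Ising model, the kind of baseline on which droplet and interfacial-tension analyses are built; it is not the ensemble switch method itself, and the lattice size, temperature, and field are arbitrary choices.

```python
import numpy as np

def metropolis_ising3d(L=10, T=3.0, H=0.0, sweeps=200, seed=0):
    """Minimal Metropolis sampler for the 3D Ising model (J = 1) on an
    L x L x L periodic lattice; returns the final spin configuration."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(L, L, L))
    for _ in range(sweeps * L**3):
        x, y, z = rng.integers(0, L, size=3)
        nn = (s[(x+1) % L, y, z] + s[(x-1) % L, y, z]
            + s[x, (y+1) % L, z] + s[x, (y-1) % L, z]
            + s[x, y, (z+1) % L] + s[x, y, (z-1) % L])
        dE = 2 * s[x, y, z] * (nn + H)  # energy change for a single spin flip
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[x, y, z] *= -1
    return s

s = metropolis_ising3d()
print("magnetization per spin:", s.mean())
```

On top of such configurations, droplet studies then need exactly the kind of bulk-phase criterion discussed in the abstract to decide which spins belong to the nucleus.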
Abstract:
It is generally agreed that the mechanical environment of intervertebral disc cells plays an important role in maintaining a balanced matrix metabolism. The precise mechanism by which the signals are transduced into the cells is poorly understood. Osmotic changes in the extracellular matrix (ECM) are thought to be involved. Current in-vitro studies on this topic are mostly short-term and show conflicting data on the reaction of disc cells subjected to osmotic changes, which is partially due to the heterogeneous and often substantially reduced culture systems. The aim of the study was therefore to investigate the effects of cyclic osmotic loading for 4 weeks on metabolism and matrix gene expression in a full-organ intervertebral disc culture system. Intervertebral disc/endplate units were isolated from New Zealand White rabbits and cultured either in iso-osmotic media (335 mosmol/kg) or diurnally exposed for 8 hours to hyper-osmotic conditions (485 mosmol/kg). Cell viability, metabolic activity, matrix composition and matrix gene expression profile (collagen types I/II and aggrecan) were monitored using a Live/Dead cell viability assay, a tetrazolium reduction test (WST-8), proteoglycan and DNA quantification assays, and quantitative PCR. The results show that diurnal osmotic stimulation did not have significant effects on proteoglycan content, cellularity or disc cell viability after 28 days in culture. However, hyperosmolarity caused increased cell death in the early culture phase and counteracted the up-regulation of type I collagen gene expression in nucleus and annulus cells. Moreover, the initially decreased cellular dehydrogenase activity recovered with osmotic stimulation after 4 weeks, and aggrecan gene down-regulation was delayed, although the latter was not significant according to our statistical criteria. In contrast, collagen type II did not respond to the osmotic changes and was down-regulated in both groups. In conclusion, diurnal hyper-osmotic stimulation of a whole-organ disc/endplate culture partially inhibits a matrix gene expression profile as encountered in degenerative disc disease and counteracts cellular metabolic hypo-activity.
Abstract:
Energy efficiency has become an important research topic in intralogistics. In this field the focus is placed especially on automated storage and retrieval systems (AS/RS) utilizing stacker cranes, as these systems are widespread and consume a significant portion of the total energy demand of intralogistics systems. Numerical simulation models have been developed to calculate the energy demand rather precisely for discrete single and dual command cycles. Unfortunately, these simulation models are not suitable for the fast calculations needed to determine the mean energy demand of a complete storage aisle. For this purpose analytical approaches would be more convenient, but until now analytical approaches have only delivered results for certain configurations. In particular, for commonly used stacker cranes equipped with an intermediate circuit connection within their drive configuration, no analytical approach has been available to calculate the mean energy demand. This article addresses this research gap and presents a calculation approach that enables planners to quickly calculate the energy demand of these systems.
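To make the quantity of interest concrete, the toy Monte Carlo sketch below averages a crude per-cycle energy model over random rack positions. It is emphatically not the article's analytical approach (a closed-form solution is precisely the contribution); the velocity-profile logic, the (1 - eta^2) loss factor standing in for recuperation through the shared intermediate DC circuit, and all masses and speeds are invented placeholders.

```python
import random

G = 9.81  # m/s^2

def round_trip_energy(dist, v_max, a, m, eta=0.85, lift=False):
    """Toy net energy (J) for moving one axis to a position and back:
    kinetic energy is invested and recovered on both legs, potential
    energy is invested on the lift and recovered on the lowering; only
    drive-train losses (1 - eta^2) remain, thanks to the DC link."""
    v_peak = min(v_max, (a * dist) ** 0.5)  # triangular vs. trapezoidal profile
    e = m * v_peak**2 * (1 - eta**2)        # kinetic losses, out and back
    if lift:
        e += m * G * dist * (1 - eta**2)    # raise out, lower on return
    return e

def mean_single_cycle_energy(aisle_len, aisle_height, n=100_000, seed=1):
    """Monte Carlo mean over uniformly random storage positions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.uniform(0, aisle_len)
        y = rng.uniform(0, aisle_height)
        total += round_trip_energy(x, v_max=4.0, a=1.5, m=3000)            # travel axis
        total += round_trip_energy(y, v_max=1.0, a=1.0, m=300, lift=True)  # hoist axis
    return total / n

print(f"mean single-cycle energy: {mean_single_cycle_energy(100, 30) / 1000:.1f} kJ")
```

An analytical approach of the kind the article proposes would replace this sampling loop with a closed-form integral over the rack geometry.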
Abstract:
Water-conducting faults and fractures were studied in the granite-hosted Äspö Hard Rock Laboratory (SE Sweden). On a scale of decametres and larger, steeply dipping faults dominate and contain a variety of different fault rocks (mylonites, cataclasites, fault gouges). On a smaller scale, somewhat less regular fracture patterns were found. Conceptual models of the fault and fracture geometries and of the properties of rock types adjacent to fractures were derived and used as input for the modelling of in situ dipole tracer tests that were conducted in the framework of the Tracer Retention Understanding Experiment (TRUE-1) on a scale of metres. After the identification of all relevant transport and retardation processes, blind predictions of the breakthroughs of conservative to moderately sorbing tracers were calculated and then compared with the experimental data. This paper provides the geological basis and model calibration, while the predictive and inverse modelling work is the topic of the companion paper [J. Contam. Hydrol. 61 (2003) 175]. The TRUE-1 experimental volume is highly fractured and contains the same types of fault rocks and alterations as on the decametric scale. The experimental flow field was modelled on the basis of a 2D streamtube formalism with an underlying homogeneous and isotropic transmissivity field. Tracer transport was modelled using the dual-porosity medium approach, which is linked to the flow model by the flow porosity. Given the substantial pumping rates in the extraction borehole, the transport domain has a maximum width of a few centimetres only. It is concluded that both the uncertainty with regard to the length of individual fractures and the detailed geometry of the network along the flowpath between injection and extraction boreholes are not critical, because flow is largely one-dimensional, whether through a single fracture or a network. Process identification and model calibration were based on a single uranine breakthrough (test PDT3), which clearly showed that matrix diffusion had to be included in the model even over the short experimental time scales, evidenced by the characteristic shape of the trailing edge of the breakthrough curve. Using the geological information and therefore considering limited matrix diffusion into a thin fault gouge horizon resulted in a good fit to the experiment. On the other hand, fresh granite was found not to interact noticeably with the tracers over the time scales of the experiments. While fracture-filling gouge materials are very efficient in retarding tracers over short periods of time (hours to days), their volume is very small and, as time progresses, retardation will be dominated by altered wall rock and, finally, by fresh granite. In such rocks, both the porosity (and therefore the effective diffusion coefficient) and the sorption Kd values are more than one order of magnitude smaller than in fault gouge, indicating that long-term retardation is expected to occur but to be less pronounced.
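The diagnostic trailing edge mentioned above can be illustrated with a classical single-fracture result. The sketch below is a minimal illustration assuming a Neretnieks-type step-injection solution with longitudinal dispersion neglected; the pulse response then develops the ~t^(-3/2) tail characteristic of matrix diffusion. The parameter kappa lumps matrix porosity, effective diffusivity, fracture aperture and travel time, and all values here are arbitrary.

```python
import math

def breakthrough(t, t_w, kappa):
    """Step-injection breakthrough C/C0 for a single fracture with
    unlimited matrix diffusion, neglecting longitudinal dispersion:
    C/C0 = erfc(kappa / sqrt(t - t_w)) for t > t_w (water travel time)."""
    if t <= t_w:
        return 0.0
    return math.erfc(kappa / math.sqrt(t - t_w))

# The pulse response (time derivative) exhibits the ~t^(-3/2) tail
# that signals matrix diffusion in the trailing edge of the curve.
t_w, kappa, dt = 1.0, 2.0, 0.01
for t in (2, 5, 10, 50, 100, 500):
    c = (breakthrough(t + dt, t_w, kappa) - breakthrough(t - dt, t_w, kappa)) / (2 * dt)
    print(f"t = {t:>3}: pulse response ~ {c:.3e}")
```

Fitting such a curve to the PDT3 breakthrough is, in spirit, how the need for a matrix-diffusion term was established.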
Abstract:
Neuroenhancement (NE), the use of substances as a means to enhance performance, has garnered considerable scientific attention of late. While ethical and epidemiological publications on the topic accumulate, there is a lack of theory-driven psychological research that aims at understanding the psychological drivers of NE. In this perspective article we argue that self-control strength offers a promising theory-based approach to further understand and investigate NE behavior. Using the strength model of self-control, we derive two theory-driven perspectives on NE-self-control research. First, we propose that individual differences in state/trait self-control strength differentially affect NE behavior based on one's individual experience of NE use. Building upon this, we outline promising research questions that (will) further elucidate our understanding of NE based on the strength model's propositions. Second, we discuss evidence indicating that popular NE substances (like methylphenidate) may counteract imminent losses of self-control strength. We outline how further research on NE's effects on the ego-depletion effect may further broaden our understanding of the strength model of self-control.
Abstract:
The human cytochrome P450 3A (CYP3A) subfamily is responsible for most of the metabolism of therapeutic drugs; however, an adequate in vivo model has yet to be discovered. This study begins with an investigation of a controversial topic surrounding the human CYP3As: estrogen regulation. A novel approach to this topic was used by defining expression in the estrogen-responsive endometrium. This study shows that estrogen down-regulates CYP3A4 expression in the endometrium. On the other hand, analogous studies showed an increase in CYP3A expression with age in liver tissue. Following the discussion of estrogen regulation, an investigation of the cross-species relationships among all of the CYP3As was completed. The study compares isoforms from fish, birds, rodents, canines, ovines, bovines, and primates. Using traditional phylogenetic analyses and employing a novel approach based on exon and intron lengths, the results show that only another primate could be the best animal model for analysis of the regulation of the expression of the human CYP3As. This analysis also demonstrated that the chimpanzee seems to be the best available model. Moreover, the study showed the presence and similarities of one additional isoform in the chimpanzee genome that is absent in humans. Based on these results, initial characterization of the chimpanzee CYP3A subfamily was begun. While the human genome contains four isoforms (CYP3A4, CYP3A5, CYP3A7, and CYP3A43), the chimpanzee genome has five: the four previously mentioned and CYP3A67. Both species express CYP3A4, CYP3A5, and CYP3A43, but humans express CYP3A7 while chimpanzees express CYP3A67. In humans, CYP3A4 is expressed at higher levels than the other isoforms, but some chimpanzee individuals express CYP3A67 at higher levels than CYP3A4. Such a difference is expected to alter significantly the total CYP3A metabolism. On the other hand, any study considering individual isoforms would still constitute a valid method of study for the human CYP3A4, CYP3A5, and CYP3A43 isoforms.
Abstract:
Domestic violence is a major public health problem, yet most physicians do not effectively identify patients at risk. Medical students and residents are not routinely educated on this topic, and little is known about the factors that influence their decisions to include screening for domestic violence in their subsequent practice. In order to assess the readiness of primary care residents to screen all patients for domestic violence, this study utilized a survey incorporating constructs from the Transtheoretical Model, including Stages of Change, Decisional Balance (Pros and Cons) and Self-Efficacy. The survey was distributed to residents at the University of Texas Health Science Center Medical School in Houston in Internal Medicine, Medicine/Pediatrics, Pediatrics, Family Medicine, and Obstetrics and Gynecology. Data from the survey were analyzed to test the hypothesis that residents in the earlier Stages of Change report more costs and fewer benefits with regard to screening for domestic violence, and that those in the later stages exhibit higher Self-Efficacy scores. The findings from this study were consistent with the model in that benefits to screening (Pros) and Self-Efficacy were correlated with later Stages of Change; however, reporting fewer costs (Cons) was not. Very few residents were ready to screen all of their patients.
Abstract:
The disparate burden of breast cancer-related morbidity and mortality experienced by African American women compared with women of other races is a topic of intense debate in the medical and public health arenas. The anomaly is consistently attributed to the fact that at diagnosis, a large proportion of African American women have advanced-stage disease. Extensive research has documented the impacts of cultural and socioeconomic factors in shaping African American women's breast-health practices; however, there is another, more subtle factor that might have some role in establishing these women's vulnerability to this disease: the lack of, or perceived lack of, partner support. Themes expressed in the research literature reflect that many African American breast cancer patients and survivors consider their male partners to be apathetic and nonsupportive.

The purpose of this study was to learn how African American couples' ethnographic paradigms and cultural explanatory model of breast cancer frame the male partner's response to the woman's diagnosis, and to assess his ability to cope and willingness to adapt to the subsequent challenges. The goal of the study was to determine whether these men's coping and adaptation skills positively or negatively affect the women's self-care attitudes and behaviors.

This study involved 4 African American couples in which the woman was a breast cancer survivor. Participants were recruited through a community-based cancer support group and a church-based cancer support group. Recruitment sessions were held at regular meetings of these organizations. Accrual took 2 months. In separate sessions, each male partner and each survivor completed a demographic survey and a questionnaire and were interviewed. Additionally, the couples were asked to participate in a communications activity (Adinkra). This activity was not done to fulfill any part of the study purpose and was not included in the data analysis; rather, it was done to assess its potential use as an intervention to promote dialogue between African American partners about the experience of breast cancer.

The questionnaire was analyzed on the basis of a coding schema, and the interview responses were analyzed on the principles of hermeneutic phenomenology. In both cases, the instruments were used to determine whether the partner's coping skills reflected a compassionate attitude (positive response) versus an apathetic attitude (negative response) and whether his adaptation skills reflected supportive behaviors (positive response) versus nonsupportive behaviors (negative response). Overall, the women's responses showed that they perceived their partners as compassionate yet nonsupportive, and the partners perceived themselves likewise. Only half of the women said that their partners' coping and adaptation abilities enabled them to relinquish traditional concepts of control and focus on their own well-being.

The themes that emerged indicate that African American men's attitudes and behaviors regarding their female partner's diagnosis of breast cancer, and their ability to cope and willingness to adapt, are influenced by their ritualistic mantras, folk beliefs, religious teachings/spiritual values, existential ideologies, socioeconomic status, and environmental factors, and by their established perceptions of what causes breast cancer, what the treatments and outcomes are, and how the disease affects the entire family, particularly the man.

These findings imply that a culturally specific intervention might be useful in educating African American men about breast cancer and their roles in supporting their female partners, physically and psychologically, during diagnosis, treatment, and recovery.
Abstract:
OntoTag - A Linguistic and Ontological Annotation Model Suitable for the Semantic Web
1. INTRODUCTION. LINGUISTIC TOOLS AND ANNOTATIONS: THEIR LIGHTS AND SHADOWS
Computational Linguistics is already a consolidated research area. It builds upon the results of two other major ones, namely Linguistics and Computer Science and Engineering, and it aims at developing computational models of human language (or natural language, as it is termed in this area). Possibly, its best-known applications are the different tools developed so far for processing human language, such as machine translation systems and speech recognizers or dictation programs.
These tools for processing human language are commonly referred to as linguistic tools. Apart from the examples mentioned above, there are also other types of linguistic tools that perhaps are not so well-known, but on which most of the other applications of Computational Linguistics are built. These other types of linguistic tools comprise POS taggers, natural language parsers and semantic taggers, amongst others. All of them can be termed linguistic annotation tools.
Linguistic annotation tools are important assets. In fact, POS and semantic taggers (and, to a lesser extent, also natural language parsers) have become critical resources for the computer applications that process natural language. Hence, any computer application that has to analyse a text automatically and ‘intelligently’ will include at least a module for POS tagging. The more an application needs to ‘understand’ the meaning of the text it processes, the more linguistic tools and/or modules it will incorporate and integrate.
However, linguistic annotation tools have still some limitations, which can be summarised as follows:
1. Normally, they perform annotations only at a certain linguistic level (that is, Morphology, Syntax, Semantics, etc.).
2. They usually introduce a certain rate of errors and ambiguities when tagging. This error rate ranges from 10 percent up to 50 percent of the units annotated for unrestricted, general texts.
3. Their annotations are most frequently formulated in terms of an annotation schema designed and implemented ad hoc.
A priori, it seems that the interoperation and the integration of several linguistic tools into an appropriate software architecture could most likely solve the limitations stated in (1). Besides, integrating several linguistic annotation tools and making them interoperate could also minimise the limitation stated in (2). Nevertheless, in the latter case, all these tools should produce annotations for a common level, which would have to be combined in order to correct their corresponding errors and inaccuracies. Yet, the limitation stated in (3) prevents both types of integration and interoperation from being easily achieved.
In addition, most high-level annotation tools rely on other lower-level annotation tools and their outputs to generate their own. For example, sense-tagging tools (operating at the semantic level) often use POS taggers (operating at a lower level, i.e., the morphosyntactic one) to identify the grammatical category of the word or lexical unit they are annotating. Accordingly, if a faulty or inaccurate low-level annotation tool is to be used by another, higher-level one in its process, the errors and inaccuracies of the former should be minimised in advance. Otherwise, these errors and inaccuracies would be transferred to (and even magnified in) the annotations of the high-level annotation tool.
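One simple way to combine several tools' annotations for a common level, as suggested above, is per-token majority voting. The sketch below is a minimal illustration; it presupposes that the tagsets have already been unified (which is precisely the limitation stated in (3)), and the tagger outputs shown are hypothetical.

```python
from collections import Counter

def combine_annotations(tag_sequences):
    """Combine per-token POS tags from several taggers by majority
    vote; ties fall back to the first (presumably best) tagger."""
    combined = []
    for token_tags in zip(*tag_sequences):
        best, count = Counter(token_tags).most_common(1)[0]
        combined.append(best if count > 1 else token_tags[0])
    return combined

# Three hypothetical taggers disagreeing on the second token:
tagger_a = ["DET", "NOUN", "VERB"]
tagger_b = ["DET", "VERB", "VERB"]
tagger_c = ["DET", "NOUN", "VERB"]
print(combine_annotations([tagger_a, tagger_b, tagger_c]))
# ['DET', 'NOUN', 'VERB']
```

More elaborate combination schemes weight each tagger by its known accuracy, but even plain voting reduces the error rate whenever the taggers' errors are not strongly correlated.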
Therefore, it would be quite useful to find a way to
(i) correct or, at least, reduce the errors and the inaccuracies of lower-level linguistic tools;
(ii) unify the annotation schemas of different linguistic annotation tools or, more generally speaking, make these tools (as well as their annotations) interoperate.
Clearly, solving (i) and (ii) should ease the automatic annotation of web pages by means of linguistic tools, and their transformation into Semantic Web pages (Berners-Lee, Hendler and Lassila, 2001). Yet, as stated above, (ii) is a type of interoperability problem. Then again, ontologies (Gruber, 1993; Borst, 1997) have been successfully applied thus far to solve several interoperability problems. Hence, ontologies should also help solve the aforementioned problems and limitations of linguistic annotation tools.
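As a toy illustration of this last point, an ontology can act as the shared vocabulary onto which each tool's ad hoc tagset is mapped. Everything below is hypothetical: the tag inventories are drastically simplified, and the `onto:` terms are placeholder URIs, not an actual published vocabulary.

```python
# Hypothetical mappings from two ad hoc tagsets onto shared ontological terms.
PENN_TO_ONTO = {"NN": "onto:Noun", "VB": "onto:Verb", "DT": "onto:Determiner"}
EAGLES_TO_ONTO = {"N": "onto:Noun", "V": "onto:Verb", "D": "onto:Determiner"}

def to_ontology(tag, mapping):
    """Translate a tool-specific tag into the shared ontology term;
    unknown tags are flagged rather than silently passed through."""
    return mapping.get(tag, f"onto:Unmapped({tag})")

print(to_ontology("NN", PENN_TO_ONTO), to_ontology("N", EAGLES_TO_ONTO))
# onto:Noun onto:Noun  -> the two tools now interoperate at this level
```

Once both tools emit the same ontological terms, their annotations can be compared, combined, and published as Semantic Web data.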
Thus, to summarise, the main aim of the present work was to somehow combine these separate approaches, mechanisms and tools for annotation from Linguistics and Ontological Engineering (and the Semantic Web) into a sort of hybrid (linguistic and ontological) annotation model, suitable for both areas. This hybrid (semantic) annotation model should (a) benefit from the advances, models, techniques, mechanisms and tools of these two areas; (b) minimise (and even solve, when possible) some of the problems found in each of them; and (c) be suitable for the Semantic Web. The concrete goals that helped attain this aim are presented in the following section.
2. GOALS OF THE PRESENT WORK
As mentioned above, the main goal of this work was to specify a hybrid (that is, linguistically-motivated and ontology-based) model of annotation suitable for the Semantic Web (i.e. it had to produce a semantic annotation of web page contents). This entailed that the tags included in the annotations of the model had to (1) represent linguistic concepts (or linguistic categories, as they are termed in ISO/DCR (2008)), in order for this model to be linguistically-motivated; (2) be ontological terms (i.e., use an ontological vocabulary), in order for the model to be ontology-based; and (3) be structured (linked) as a collection of ontology-based