911 results for CIDOC Conceptual Reference Model
Abstract:
Climatic changes are most pronounced in northern high-latitude regions. Yet there is a paucity of observational data, both spatially and temporally, such that regional-scale dynamics are not fully captured, limiting our ability to make reliable projections. In this study, a group of dynamical downscaling products were created for the period 1950 to 2100 to better understand climate change and its impacts on hydrology, permafrost, and ecosystems at a resolution suitable for northern Alaska. The ERA-Interim reanalysis dataset and the Community Earth System Model (CESM) served as the forcing mechanisms in this dynamical downscaling framework, and the Weather Research and Forecasting (WRF) model, embedded with an optimization for the Arctic (Polar WRF), served as the Regional Climate Model (RCM). The downscaled output consists of multiple climatic variables (precipitation, temperature, wind speed, dew point temperature, and surface air pressure) on a 10 km grid at three-hour intervals. The modeling products were evaluated and calibrated using a bias-correction approach. The ERA-Interim-forced WRF (ERA-WRF) produced reasonable climatic variables, yielding a more closely correlated temperature field than precipitation field when long-term monthly climatology was compared with its forcing and observational data. A linear scaling method then further corrected the bias based on ERA-Interim monthly climatology, and the bias-corrected ERA-WRF fields served as a reference for calibration of both the historical and the projected CESM-forced WRF (CESM-WRF) products. Biases that CESM holds over northern Alaska, namely a cold temperature bias in summer, a warm temperature bias in winter, and a wet bias in annual precipitation, persisted in the CESM-WRF runs.
Linear scaling of CESM-WRF ultimately produced, together with the calibrated ERA-WRF run, high-resolution downscaling products for the Alaskan North Slope for hydrological and ecological research, and the products' utility extends well beyond those applications. Other climatic research has been proposed, including exploration of historical and projected climatic extreme events and their possible connections to low-frequency sea-atmospheric oscillations, as well as near-surface permafrost degradation and shifts in lake ice regimes. These dynamically downscaled, bias-corrected climatic datasets provide the improved spatial and temporal resolution necessary for ongoing modeling efforts in northern Alaska focused on reconstructing and projecting hydrologic changes, ecosystem processes and responses, and permafrost thermal regimes. The dynamical downscaling methods presented in this study can also be used to create more suitable model input datasets for other sub-regions of the Arctic.
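The linear scaling step described above can be sketched in a few lines. This is a minimal illustration of the standard technique (additive correction for temperature, multiplicative for precipitation, per calendar month), not the study's actual code; all names and arrays are illustrative.

```python
# Linear scaling bias correction against a reference monthly climatology.
# Illustrative sketch: additive scaling suits temperature, multiplicative
# scaling suits precipitation.
import numpy as np

def linear_scaling(model_series, model_clim, ref_clim, months, multiplicative=False):
    """Correct a model time series so its monthly means match a reference.

    model_series  : model values (e.g. 3-hourly output)
    model_clim    : length-12 monthly climatology of the model
    ref_clim      : length-12 monthly climatology of the reference (e.g. ERA-WRF)
    months        : month index (1-12) for each element of model_series
    multiplicative: True for precipitation (ratio), False for temperature (offset)
    """
    model_series = np.asarray(model_series, dtype=float)
    months = np.asarray(months)
    corrected = np.empty_like(model_series)
    for m in range(1, 13):
        sel = months == m
        if not np.any(sel):
            continue
        if multiplicative:
            # Scale so the corrected monthly mean matches the reference mean.
            corrected[sel] = model_series[sel] * (ref_clim[m - 1] / model_clim[m - 1])
        else:
            # Shift so the corrected monthly mean matches the reference mean.
            corrected[sel] = model_series[sel] + (ref_clim[m - 1] - model_clim[m - 1])
    return corrected
```

Applied per grid cell and per variable, this preserves the model's sub-monthly variability while pinning its monthly climatology to the reference.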
Abstract:
Identifying inequities in access to health care requires critical scrutiny of the patterns and processes of care decisions. This paper describes a conceptual model, derived from social problems theory, which is proposed as a useful framework for explaining patterns of post-acute care referral and, in particular, individual variations in referral to rehabilitation after traumatic brain injury (TBI). The model is based on three main components: (1) characteristics of the individual with TBI, (2) activities of health care professionals and the processes of referral, and (3) the contexts of care. The central argument is that access to rehabilitation following TBI is a dynamic phenomenon concerning the interpretations and negotiations of health care professionals, which in turn are shaped by the organisational and broader health care contexts. The model developed in this paper provides the opportunity to develop a complex analysis of post-acute care referral based on patient factors, contextual factors and decision-making processes. It is anticipated that this framework will have utility in other areas examining and understanding patterns of access to health care. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The structure and function of the pharyngeal jaw apparatus (PJA) and postpharyngeal alimentary tract of Arrhamphus sclerolepis krefftii, an herbivorous hemiramphid, were investigated by dissection, light and scanning electron microscopy, and X-ray analysis of live specimens. A simple model of PJA operation is proposed, consisting of an adductive power stroke of the third pharyngobranchial that draws it posteriorly while the fifth ceratobranchial is adducted, and a return stroke in which the third pharyngobranchial bone is drawn anteriorly during abduction of the fifth ceratobranchial. Teeth in the posteromedial region of the PJA are eroded into an occlusion zone where the teeth of the third pharyngobranchial are spatulate incisiform and face posteriorly in opposition to the rostrally oriented spatulate incisiform teeth in the wear zone of the fifth ceratobranchial. The shape of the teeth and their pedestals (bone of attachment) is consistent with the model and with the forces likely to operate on the elements of the PJA during mastication. The role of pharyngeal tooth replacement in maintaining the occlusal surfaces in the PJA during growth is described. The postpharyngeal alimentary tract of A. sclerolepis krefftii comprises a stomachless cylinder that attenuates gradually as it passes straight to the anus, interrupted only by a rectal valve. The ratio of gut length to standard length is about 0.5. Despite superficial similarities to the cichlid PJA (Stiassny and Jensen [1987] Bull Mus Comp Zool 151: 269-319), the hemiramphid PJA differs in the fusion of the third pharyngobranchial bones, the anterior orientation of the teeth on the second pharyngobranchials and the fifth ceratobranchial, the presence of slide-like diarthroses between the heads of the fourth epibranchials and the third pharyngobranchial, the occlusion zone of constantly wearing teeth, and the unusual form of the muscularis craniopharyngobranchialis.
The functional relationship between these structures is explained, and the consequences for the fish of a complex PJA and a simple gut are discussed. (C) 2002 Wiley-Liss, Inc.
Abstract:
Descriptive models of social response are concerned with identifying and discriminating between different types of response to social influence. In a previous article (Nail, MacDonald, & Levy, 2000), the authors demonstrated that 4 conceptual dimensions are necessary to adequately distinguish between such phenomena as conformity, compliance, contagion, independence, and anticonformity in a single model. This article expands the scope of the authors' 4-dimensional approach by reviewing selected experimental and cultural evidence, further demonstrating the integrative power of the model. This review incorporates political psychology, culture and aggression, self-persuasion, group norms, prejudice, impression management, psychotherapy, pluralistic ignorance, bystander intervention/nonintervention, public policy, close relationships, and implicit attitudes.
Abstract:
Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers, and their resellers promote as best practice, the aim of this research was to determine whether practitioners still take conceptual modeling seriously. In addition, what are the most popular techniques and tools used for conceptual modeling? And what are the major purposes for which conceptual modeling is used? The study found that the top six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects). Technique use was also found to significantly follow an inverted U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence the decision of analysts to continue to use modeling, viz., communication (using diagrams) to/from stakeholders, internal knowledge (lack of) of techniques, user expectations management, understanding models' integration into the business, and tool/software deficiencies. The highest ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Smallholder farmers in Africa practice traditional cropping techniques such as intercropping. Intercropping is thought to offer higher productivity and resource utilisation than sole cropping. In this study, risk associated with maize-bean intercropping was evaluated by quantifying long-term yield in both intercropping and sole cropping in a semi-arid region of South Africa (Bloemfontein, Free State) with reference to rainfall variability. The crop simulation model was run with different cultural practices (planting date and plant density) for 52 summer crop growing seasons (1950/1951-2001/2002). Eighty-one scenarios, consisting of three levels each of initial soil water, planting date, maize population, and bean population, were simulated. From the simulation outputs, the total land equivalent ratio (LER) was greater than one. The intercrop (equivalent to sole maize) had greater energy value (EV) than sole beans, and the intercrop (equivalent to sole beans) had greater monetary value (MV) than sole maize. From these results, it can be concluded that maize-bean intercropping is advantageous for this semi-arid region. Soil water at planting was the most important of all scenario factors, followed by planting date. Irrigation application at planting, November/December planting, and high plant density of maize for EV and of beans for MV can be among the most effective cultural practices in the study region. With regard to rainfall variability, seasonal (October-April) rainfall positively affected EV and MV, but not LER. There was more intercrop production in La Niña years than in El Niño years. Thus, better cultural practices may be selected to maximize maize-bean intercrop yields for specific seasons in the semi-arid region based on the global seasonal outlook. (c) 2004 Elsevier B.V. All rights reserved.
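The total land equivalent ratio mentioned above has a simple standard definition: the sum, over the component crops, of each crop's intercrop yield divided by its sole-crop yield. A minimal sketch with illustrative yields (not the study's simulation output):

```python
# Total land equivalent ratio (LER) for a two-crop maize-bean intercrop.
# Yields below are hypothetical, for illustration only.
def total_ler(maize_inter, maize_sole, bean_inter, bean_sole):
    """LER = sum over crops of (intercrop yield / sole-crop yield).

    A total LER > 1 means the intercrop would need more than one unit of
    sole-cropped land to produce the same output, i.e. intercropping uses
    land more efficiently.
    """
    return maize_inter / maize_sole + bean_inter / bean_sole

# Example: the intercrop retains 70% of sole maize yield and 50% of sole bean yield.
ler = total_ler(maize_inter=2.8, maize_sole=4.0, bean_inter=0.6, bean_sole=1.2)
# ler is approximately 0.7 + 0.5 = 1.2, so intercropping is advantageous here.
```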
Abstract:
Geospatio-temporal conceptual models provide a mechanism to explicitly represent geospatial and temporal aspects of applications. Such models, which focus on both what and when/where, need to be more expressive than conventional conceptual models (e.g., the ER model), which primarily focus on what is important for a given application. In this study, we view conceptual schema comprehension of geospatio-temporal data semantics in terms of matching the external problem representation (that is, the conceptual schema) to the problem-solving task (that is, syntactic and semantic comprehension tasks), an argument based on the theory of cognitive fit. Our theory suggests that an external problem representation that matches the problem solver's internal task representation will enhance performance, for example, in comprehending such schemas. To assess performance on geospatio-temporal schema comprehension tasks, we conducted a laboratory experiment using two semantically identical conceptual schemas, one of which mapped closely to the internal task representation while the other did not. As expected, we found that the geospatio-temporal conceptual schema that corresponded to the internal representation of the task enhanced the accuracy of schema comprehension; comprehension time was equivalent for both. Cognitive fit between the internal representation of the task and conceptual schemas with geospatio-temporal annotations was, therefore, manifested in accuracy of schema comprehension and not in time for problem solution. Our findings suggest that the annotated schemas facilitate understanding of data semantics represented on the schema.
Abstract:
The Bunge-Wand-Weber (BWW) representation model defines ontological constructs for information systems. Against these constructs, the completeness and efficiency of a modeling technique can be assessed. Ontology plays an essential role in e-commerce: using or updating an existing ontology and providing tools to resolve semantic conflicts are essential steps before putting a system online. We use conceptual graphs (CGs) to implement ontologies. This paper evaluates CG capabilities using the BWW representation model. It finds that CGs are ontologically complete according to Wand and Weber's definition. It also finds that CGs exhibit construct overload and construct redundancy, which can undermine their ontological clarity. This leads us to build a meta-model to avoid some ontological-unclarity problems. We use some of the BWW constructs to build the meta-model. (c) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Although information systems (IS) problem solving involves knowledge of both the IS and application domains, little attention has been paid to the role of application domain knowledge. In this study, which is set in the context of conceptual modeling, we examine the effects of both IS and application domain knowledge on different types of schema understanding tasks: syntactic and semantic comprehension tasks and schema-based problem-solving tasks. Our thesis was that while IS domain knowledge is important in solving all such tasks, the role of application domain knowledge is contingent upon the type of understanding task under investigation. We use the theory of cognitive fit to establish theoretical differences in the role of application domain knowledge among the different types of schema understanding tasks. We hypothesize that application domain knowledge does not influence the solution of syntactic and semantic comprehension tasks, for which cognitive fit exists, but does influence the solution of schema-based problem-solving tasks, for which cognitive fit does not exist. To assess performance on different types of conceptual schema understanding tasks, we conducted a laboratory experiment in which participants with high and low IS domain knowledge responded to two equivalent conceptual schemas that represented high and low levels of application knowledge (familiar and unfamiliar application domains). As expected, we found that IS domain knowledge is important in the solution of all types of conceptual schema understanding tasks in both familiar and unfamiliar application domains, and that the effect of application domain knowledge is contingent on task type. Our findings for the EER model were similar to those for the ER model. Given the differential effects of application domain knowledge on different types of tasks, this study highlights the importance of considering more than one application domain in designing future studies on conceptual modeling.
Abstract:
The application of nonlocal density functional theory (NLDFT) to determine the pore size distribution (PSD) of activated carbons using a nongraphitized carbon black, instead of graphitized thermal carbon black, as a reference system is explored. We show that in this case nitrogen and argon adsorption isotherms in activated carbons are precisely correlated by the theory, and such an excellent correlation would never be possible if the pore wall surface were assumed to be identical to that of graphitized carbon black. This suggests that pore wall surfaces of activated carbon are closer to those of amorphous solids because of defects in the crystalline lattice, finite pore length, the presence of active centers, etc. Application of the NLDFT adapted to amorphous solids resulted in a quantitative description of N-2 and Ar adsorption isotherms on nongraphitized carbon black BP280 at their respective boiling points. In the present paper we determined solid-fluid potentials from experimental adsorption isotherms on nongraphitized carbon black and subsequently used those potentials to model adsorption in slit pores and generate a corresponding set of local isotherms, which we used to determine the PSD functions of different activated carbons. (c) 2005 Elsevier Ltd. All rights reserved.
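The last step, recovering a PSD from a set of local isotherms, amounts to inverting the discretized adsorption integral equation N_exp(p_j) = sum_i f(w_i) n_local(p_j, w_i) under a non-negativity constraint on f. The sketch below uses a synthetic pore-filling kernel purely for illustration (it is not an NLDFT kernel, and all pore widths and values are hypothetical); the inversion itself is done with standard non-negative least squares.

```python
# Recovering a pore size distribution f(w) from the discretized adsorption
# integral equation via non-negative least squares (NNLS).
import numpy as np
from scipy.optimize import nnls

def solve_psd(kernel, n_exp):
    """kernel[j, i]: local isotherm for pore width w_i at pressure p_j.
    n_exp[j]: experimental uptake at pressure p_j.
    Returns non-negative pore-volume weights f(w_i)."""
    f, residual = nnls(kernel, n_exp)
    return f

# Toy kernel (NOT NLDFT): smaller pores fill at lower relative pressure.
pressures = np.linspace(0.01, 1.0, 40)
widths = np.array([0.5, 1.0, 2.0])  # nm, hypothetical slit widths
kernel = np.array([[p / (p + w / 2.0) for w in widths] for p in pressures])

# Synthetic "experimental" isotherm built from a known PSD, then inverted.
true_f = np.array([0.2, 0.5, 0.3])
n_exp = kernel @ true_f
f_recovered = solve_psd(kernel, n_exp)
```

With noisy experimental data, the inversion is ill-conditioned and is typically stabilized by regularization in addition to the non-negativity constraint.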
Abstract:
Background: Published birthweight references in Australia do not fully take into account constitutional factors that influence birthweight and therefore may not provide an accurate reference to identify the infant with abnormal growth. Furthermore, studies in other regions that have derived adjusted (customised) birthweight references have applied untested assumptions in the statistical modelling. Aims: To validate the customised birthweight model and to produce a reference set of coefficients for estimating a customised birthweight that may be useful for maternity care in Australia and for future research. Methods: De-identified data were extracted from the clinical database for all births at the Mater Mother's Hospital, Brisbane, Australia, between January 1997 and June 2005. Births with missing data for the variables under study were excluded. In addition the following were excluded: multiple pregnancies, births less than 37 completed weeks' gestation, stillbirths, and major congenital abnormalities. Multivariate analysis was undertaken. A double cross-validation procedure was used to validate the model. Results: The study of 42 206 births demonstrated that, for statistical purposes, birthweight is normally distributed. Coefficients for the derivation of customised birthweight in an Australian population were developed and the statistical model is demonstrably robust. Conclusions: This study provides empirical data as to the robustness of the model to determine customised birthweight. Further research is required to define where normal physiology ends and pathology begins, and which segments of the population should be included in the construction of a customised birthweight standard.
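The general shape of such a customised-birthweight model is a regression of birthweight on constitutional factors, whose fitted coefficients are then used to compute an expected weight for each pregnancy. The sketch below is only an illustration of that idea on synthetic data: the predictors, coefficients, and sample are all invented, not the Mater Mother's Hospital dataset or the study's actual model.

```python
# Illustrative customised-birthweight regression on synthetic data.
# Predictors and coefficient values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 500
gestation = rng.uniform(37, 42, n)       # completed weeks (term births only)
maternal_height = rng.normal(165, 6, n)  # cm
infant_sex = rng.integers(0, 2, n)       # 1 = male

# Synthetic birthweights (grams) generated from assumed coefficients plus noise.
birthweight = (-4000 + 150 * gestation + 10 * maternal_height
               + 120 * infant_sex + rng.normal(0, 300, n))

# Fit by ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), gestation, maternal_height, infant_sex])
coef, *_ = np.linalg.lstsq(X, birthweight, rcond=None)

# Customised expected birthweight for one pregnancy, against which the
# observed weight would be compared to flag abnormal growth.
expected = X[0] @ coef
```

The study's double cross-validation would correspond to fitting `coef` on one half of the data and checking predictions on the other half, and vice versa.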
Abstract:
In the absence of an external frame of reference, i.e., in background-independent theories such as general relativity, physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate using elementary quantum theory how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet, as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of spin networks introduced by Penrose as a combinatorial description of geometry, and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semiclassical limit to derive the fully relational quantum theory) may offer interesting insights on the low energy limit of quantum gravity.
Abstract:
Semantic data models provide a map of the components of an information system. The characteristics of these models affect their usefulness for various tasks (e.g., information retrieval). The quality of information retrieval has obvious important consequences, both economic and otherwise. Traditionally, database designers have produced parsimonious logical data models. In spite of their increased size, ontologically clearer conceptual models have been shown to facilitate better performance for both problem solving and information retrieval tasks in experimental settings. The experiments producing evidence of enhanced performance for ontologically clearer models have, however, used application domains of modest size. Data models in organizational settings are likely to be substantially larger than those used in these experiments. This research used an experiment to investigate whether the benefits of improved information retrieval performance associated with ontologically clearer models are robust as the size of the application domain increases. The experiment used an application domain approximately twice the size of those tested in prior experiments. The results indicate that, relative to the users of the parsimonious implementation, end users of the ontologically clearer implementation made significantly more semantic errors, took significantly more time to compose their queries, and were significantly less confident in the accuracy of their queries.
Abstract:
Our extensive research has indicated that high-school teachers are reluctant to make use of existing instructional educational software (Pollard, 2005). Even software developed in a partnership between a teacher and a software engineer is unlikely to be adopted by teachers outside the partnership (Pollard, 2005). In this paper we address these issues directly by adopting a reusable architectural design for instructional educational software which allows easy customisation of software to meet the specific needs of individual teachers. By doing this, we will help more teachers regularly use instructional technology within their classrooms. Our domain-specific software architecture, Interface-Activities-Model, was designed specifically to facilitate individual customisation by redefining and restructuring what constitutes an object, so that objects can be readily reused or extended as required. The key to this architecture is the way in which the software is broken into small generic encapsulated components with minimal domain-specific behaviour. The domain-specific behaviour is decoupled from the interface and encapsulated in objects which relate to the instructional material through tasks and activities. The domain model is also broken into two distinct models - Application State Model and Domain-specific Data Model. This decoupling and distribution of control gives the software designer enormous flexibility in modifying components without affecting other sections of the design. This paper sets the context of this architecture, describes it in detail, and applies it to an actual application developed to teach high-school mathematical concepts.
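The decoupling described above can be sketched in miniature: generic interface components carry no domain logic, domain behaviour lives in activity objects, and the domain model splits into a state model and a data model. Apart from the two model names taken from the text, every class and identifier below is illustrative, not the paper's actual design.

```python
# Minimal sketch of the Interface-Activities-Model decoupling.

class ApplicationStateModel:
    """Tracks where the learner currently is (e.g. the active task)."""
    def __init__(self):
        self.current_task = None

class DomainSpecificDataModel:
    """Holds the instructional content, e.g. maths problems."""
    def __init__(self, problems):
        self.problems = list(problems)

class Activity:
    """Encapsulates domain-specific behaviour for one instructional task,
    mediating between the state model and the data model."""
    def __init__(self, state, data):
        self.state, self.data = state, data

    def next_problem(self):
        self.state.current_task = self.data.problems.pop(0)
        return self.state.current_task

class GenericButton:
    """A generic interface component: no domain logic, only a callback."""
    def __init__(self, on_press):
        self.on_press = on_press

    def press(self):
        return self.on_press()

# Wiring: the interface knows nothing about maths. Customising the software
# for a different teacher means swapping the data model or the activity,
# never touching the generic interface components.
state = ApplicationStateModel()
data = DomainSpecificDataModel(["2 + 2 = ?", "3 * 5 = ?"])
activity = Activity(state, data)
button = GenericButton(on_press=activity.next_problem)
```

Because the button only holds a callback, the same interface component can drive any subject's activities, which is the reuse property the architecture is after.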