957 results for model-based reasoning processes


Relevance: 100.00%

Abstract:

Results from the first Sun-to-Earth coupled numerical model developed at the Center for Integrated Space Weather Modeling are presented. The model simulates physical processes occurring in space from the Sun's corona to the Earth's ionosphere, and it represents the first step toward a physics-based numerical tool for predicting space weather conditions in the near-Earth environment. Two 6- to 7-day intervals, representing different heliospheric conditions in terms of the three-dimensional configuration of the heliospheric current sheet, are chosen for simulation. These conditions lead to drastically different responses of the simulated magnetosphere-ionosphere system, highlighting, on the one hand, the challenges encountered in building such forecasting tools and, on the other, the successes that can already be achieved even at this initial stage of Sun-to-Earth modeling.

Relevance: 100.00%

Abstract:

Grass-based diets are of increasing socio-economic importance in dairy cattle farming, but their low supply of glucogenic nutrients may limit the production of milk. Current evaluation systems that assess the energy supply and requirements are based on metabolisable energy (ME) or net energy (NE). These systems do not consider the characteristics of the energy-delivering nutrients. In contrast, mechanistic models take into account the site of digestion, the type of nutrient absorbed and the type of nutrient required for production of milk constituents, and may therefore give a better prediction of nutrient supply and requirement. The objective of the present study is to compare the ability of three energy evaluation systems, viz. the Dutch NE system, the Agricultural and Food Research Council (AFRC) ME system and the Feed into Milk (FIM) ME system, and of a mechanistic model based on Dijkstra et al. [Simulation of digestion in cattle fed sugar cane: prediction of nutrient supply for milk production with locally available supplements. J. Agric. Sci., Cambridge 127, 247-60] and Mills et al. [A mechanistic model of whole-tract digestion and methanogenesis in the lactating dairy cow: model development, evaluation and application. J. Anim. Sci. 79, 1584-97], to predict the feed value of grass-based diets for milk production. The dataset for evaluation consists of 41 treatments of grass-based diets (at least 0.75 g ryegrass/g diet on DM basis). For each model, the predicted energy or nutrient supply, based on observed intake, was compared with the predicted requirement based on observed performance. The error of energy or nutrient supply relative to requirement was assessed by calculating the mean square prediction error (MSPE) and the concordance correlation coefficient (CCC). All energy evaluation systems predicted energy requirement to be 6-11% lower than energy supply.
The root MSPE (expressed as a proportion of the supply) was lowest for the mechanistic model (0.061), followed by the Dutch NE system (0.082), the FIM ME system (0.097) and the AFRC ME system (0.118). For the energy evaluation systems, the error due to overall bias of prediction dominated the MSPE, whereas for the mechanistic model, proportionally 0.76 of the MSPE was due to random variation. CCC analysis confirmed the higher accuracy and precision of the mechanistic model compared with the energy evaluation systems. The error of prediction was positively related to grass protein content for the Dutch NE system, and was positively related to grass DMI level for all models. In conclusion, current energy evaluation systems overestimate energy supply relative to energy requirement on grass-based diets for dairy cattle. The mechanistic model predicted glucogenic nutrients to limit the performance of dairy cattle on grass-based diets, and proved more accurate and precise than the energy systems. The mechanistic model could be improved by allowing the glucose maintenance and utilisation requirement parameters to be variable.
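The two fit statistics used above, root MSPE with its bias/slope/random decomposition and Lin's CCC, can be sketched in a few lines of Python; the supply and requirement values below are made up for illustration, not taken from the study:

```python
import numpy as np

def mspe_decomposition(pred, obs):
    """Split the mean square prediction error into overall-bias,
    slope-deviation and random (disturbance) components."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    r = np.corrcoef(pred, obs)[0, 1]
    sp, so = pred.std(), obs.std()            # population SDs (ddof=0)
    bias = (pred.mean() - obs.mean()) ** 2    # error due to overall bias
    slope = (sp - r * so) ** 2                # error due to deviation of slope
    disturbance = (1 - r ** 2) * so ** 2      # random variation
    return bias, slope, disturbance

def ccc(pred, obs):
    """Lin's concordance correlation coefficient (precision x accuracy)."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    r = np.corrcoef(pred, obs)[0, 1]
    sp, so = pred.std(), obs.std()
    return 2 * r * sp * so / (sp ** 2 + so ** 2 + (pred.mean() - obs.mean()) ** 2)

# Hypothetical energy supply (predicted) vs. requirement, MJ/day:
supply = np.array([95.0, 102.0, 110.0, 121.0, 98.0, 115.0])
requirement = np.array([90.0, 96.0, 101.0, 113.0, 95.0, 104.0])
bias, slope, disturbance = mspe_decomposition(supply, requirement)
root_mspe_prop = np.sqrt(bias + slope + disturbance) / supply.mean()
```

The three components sum exactly to the MSPE, which is how the paper can report what proportion of the error is overall bias versus random variation.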

Relevance: 100.00%

Abstract:

We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, in a 2 × 2 × 2 × 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
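The kind of factorial experiment described can be mimicked with a toy stochastic spread model; everything below (site counts, spread and detection probabilities, the line-plus-links network) is an illustrative assumption, not the published model:

```python
import random

def simulate(trade, inspect, steps=120, n_sites=200, n_seeds=3,
             p_local=0.02, p_trade=0.05, p_detect=0.8, interval=90, seed=0):
    """Toy stochastic spread model.  Sites sit on a line and infect their
    neighbours (natural dispersal); optional trade links add long-distance
    jumps; optional periodic inspections remove detected infections."""
    rng = random.Random(seed)
    links = [(rng.randrange(n_sites), rng.randrange(n_sites))
             for _ in range(n_sites)]                  # hypothetical trade network
    infected = set(rng.sample(range(n_sites), n_seeds))
    for t in range(steps):
        new = set()
        for i in infected:
            for j in (i - 1, i + 1):                   # short-range natural spread
                if 0 <= j < n_sites and rng.random() < p_local:
                    new.add(j)
        if trade:
            for a, b in links:                         # plant movements in trade
                if a in infected and rng.random() < p_trade:
                    new.add(b)
        infected |= new
        if inspect and (t + 1) % interval == 0:        # inspection-with-eradication
            infected = {i for i in infected if rng.random() > p_detect}
    return len(infected)

# Two of the factorial axes (trade x inspection), ten replicates each:
for trade in (False, True):
    for inspect in (False, True):
        sizes = [simulate(trade, inspect, seed=s) for s in range(10)]
        print(trade, inspect, sum(sizes) / len(sizes))
```

Replicates with different seeds show the strong stochastic variation in epidemic size that the abstract reports.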

Relevance: 100.00%

Abstract:

We consider a non-local version of the NJL model, based on a separable quark-quark interaction. The interaction is extended to include terms that bind vector and axial-vector mesons. The non-locality means that no further regulator is required. Moreover, the model is able to confine the quarks by generating a quark propagator without poles at real energies. Working in the ladder approximation, we calculate amplitudes in Euclidean space and discuss features of their continuation to Minkowski energies. Conserved currents are constructed and we demonstrate their consistency with various Ward identities. Various meson masses are calculated, along with their strong and electromagnetic decay amplitudes. We also calculate the electromagnetic form factor of the pion, as well as form factors associated with the processes γγ* → π0 and ω → π0γ*. The results are found to lead to a satisfactory phenomenology and lend some dynamical support to the idea of vector-meson dominance.

Relevance: 100.00%

Abstract:

The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC) and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000-2006. HadGEM simulations of aerosol optical depth using GLOMAP-mode compare better against a data-assimilated aerosol re-analysis and ground-based aerosol observations than those using CLASSIC. Because of differences in wet deposition rates, the residence time of GLOMAP-mode sulphate aerosol is two days longer than that of CLASSIC sulphate aerosol, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by changes in the first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. Results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol-cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.
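The CLASSIC forcings implied by the stated percentages can be back-calculated, reading "X% stronger/weaker" as a ratio of forcing magnitudes (an interpretation assumed here, not stated in the abstract):

```python
glomap_direct = -0.49    # W m-2, GLOMAP-mode present-day direct forcing
glomap_indirect = -1.17  # W m-2, GLOMAP-mode first indirect forcing

# "72% stronger" read as |GLOMAP| = 1.72 x |CLASSIC|
classic_direct = glomap_direct / 1.72      # ≈ -0.28 W m-2
# "20% weaker" read as |GLOMAP| = 0.80 x |CLASSIC|
classic_indirect = glomap_indirect / 0.80  # ≈ -1.46 W m-2

print(classic_direct, classic_indirect)
```

The implied CLASSIC values show the compensation the text describes: a weaker direct forcing but a stronger first indirect forcing than GLOMAP-mode.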

Relevance: 100.00%

Abstract:

A model based on graph isomorphisms is used to formalize software evolution. Step by step we narrow the search space by an informed selection of attributes based on the current state of the art in software engineering, and generate a seed solution. We then traverse the resulting space using graph isomorphisms and other set operations over the vertex sets. The new solutions preserve the desired attributes. The goal of defining an isomorphism-based search mechanism is to construct predictors of evolution that can facilitate the automation of the 'software factory' paradigm. The model allows for automation via software tools implementing the concepts.
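As a minimal sketch of the central operation, here is a brute-force isomorphism test over small graphs; the module-dependency graphs below are hypothetical, and a practical tool would need a scalable algorithm (e.g. VF2) rather than permutation search:

```python
from itertools import permutations

def are_isomorphic(edges_a, nodes_a, edges_b, nodes_b):
    """Brute-force isomorphism test for small undirected graphs: look for
    a vertex bijection mapping one edge set onto the other."""
    if len(nodes_a) != len(nodes_b) or len(edges_a) != len(edges_b):
        return False
    eb = {frozenset(e) for e in edges_b}
    for perm in permutations(nodes_b):
        mapping = dict(zip(nodes_a, perm))
        if {frozenset((mapping[u], mapping[v])) for u, v in edges_a} == eb:
            return True
    return False

# Hypothetical 'before' and 'after' snapshots of a module-dependency graph:
before = ([("a", "b"), ("b", "c")], ["a", "b", "c"])
after = ([("x", "y"), ("y", "z")], ["x", "y", "z"])
print(are_isomorphic(before[0], before[1], after[0], after[1]))  # True
```

Two snapshots being isomorphic means the evolution step preserved structure even though vertex names changed, which is the kind of attribute-preserving traversal the abstract describes.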

Relevance: 100.00%

Abstract:

Trust is one of the most important factors influencing the successful application of network service environments such as e-commerce, wireless sensor networks and online social networks. Computational models of trust and reputation have received special attention in both the computing and service-science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. First, concepts associated with trust and reputation are introduced, and a mathematical formula of trust for B2C e-commerce is given. A dynamical computation model of reputation is then proposed, based on this conception of trust and on the relationship between trust and reputation. Within the proposed model, typical processes by which the reputation of a B2C e-commerce business varies are discussed. Furthermore, iterative trust and reputation computation models are formulated via a set of difference equations based on a closed-loop feedback mechanism. Finally, a group of numerical simulation experiments is performed to illustrate the proposed model. Experimental results show that the proposed model is effective in simulating the dynamical processes of trust and reputation for B2C e-commerce.
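A closed-loop pair of difference equations of the general kind described can be sketched as follows; the update rules and coefficients are illustrative assumptions, not the paper's formulation:

```python
def simulate_reputation(outcomes, alpha=0.3, beta=0.2, t0=0.5, r0=0.5):
    """Illustrative closed-loop difference equations: trust t is updated
    from each transaction outcome plus feedback from reputation r, and
    reputation relaxes toward current trust."""
    t, r = t0, r0
    history = []
    for x in outcomes:  # x in [0, 1]: satisfaction with one transaction
        t = (1 - alpha) * t + alpha * (0.5 * x + 0.5 * r)  # trust update with reputation feedback
        r = (1 - beta) * r + beta * t                      # reputation tracks trust
        history.append((round(t, 3), round(r, 3)))
    return history

# A run of good transactions followed by a failed one, then recovery:
print(simulate_reputation([1.0, 1.0, 1.0, 0.0, 1.0]))
```

Because each update is a convex combination, trust and reputation stay in [0, 1]; the feedback of reputation into the trust update is what closes the loop.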

Relevance: 100.00%

Abstract:

Aim: Professional practice placement programs in dietetics face a number of challenges with respect to quantity, quality and sustainability. The aim of the present study is to report on the development of an innovative placement model, based on a variety of training and supervision approaches, that addresses these challenges.

Methods: The model was developed following an investigation of existing practice and the literature with approaches that were identified as important to the requirements and constraints of dietetics clinical training incorporated into the model.

Results: Although one-on-one supervision is the predominant approach in Australian dietetic education, the educational literature and the authors' experience showed that a variety of approaches are used in some form. The model developed pairs two students with one supervisor, with students changing peer partners and supervisors every three weeks during the nine-week placement to diversify their exposure to working and learning styles. The model integrates four customised approaches: incremental exposure to tasks; use of a clinical reasoning framework to help structure student understanding of the methods and judgements involved in patient care; structured enquiry in group discussions; and peer observation and feedback.

Conclusions: The model has potential to achieve efficiencies in supervisors' involvement by coordinating the skill development activities of students as a group and promoting peer-assisted learning.

Relevance: 100.00%

Abstract:

In recent times, technology applications in many fields, and Business Intelligence (BI) in particular, have developed rapidly; BI is considered one of the most significant uses of information technology and holds a special position among them. The application of BI systems gives organizations a sense of superiority in a competitive environment. Despite the many advantages, companies applying such systems may also encounter problems in the decision-making process because of the highly diversified interactions within the systems. Hence, the choice of a suitable BI platform is important for taking full advantage of information technology in all organizational fields. The current research aims at addressing the problems in the organizational decision-making process by proposing and implementing a suitable BI platform, using Iranian companies as a case study. The paper presents a single model, based on a study of different methods of BI platform choice, and applies the chosen BI platform to different decision-making processes. The results of evaluating the effectiveness of subsequently implementing the model for Iranian industrial companies are discussed.

Relevance: 100.00%

Abstract:

The continuous growth of the user pools of social networking websites such as Facebook and MySpace, and their incessant expansion of services and capabilities, will in future bring them into comparison with today's content distribution networks (CDNs) and peer-to-peer file-sharing applications such as Kazaa and BitTorrent. How can these two mainstream classes of application, each already facing its own security problems, cope with the combined issues: trust for social networks, and content and index poisoning in CDNs? We address the problems of social trust and file sharing with an overlay level of trust, modelled on social activity and transactions; this can enable users to increase the reliability of their online social life and also enhance content distribution, creating a better file-sharing environment. The aim of this research is to lower the risk of malicious activity on a given social network by applying a correlated trust model to guarantee the validity of a user's identity, privacy and trustworthiness in sharing content.

Relevance: 100.00%

Abstract:

Purpose – Contemporary organizations are increasingly paying attention to incorporating diversity management practices into their systems in order to promote socially responsible actions and equitable employment outcomes for minority groups. The aim of this paper is to address a major oversight in the diversity management literature: the integration of organizational justice principles.

Design/methodology/approach – Drawing upon the existing literature on workforce diversity and organizational justice, the authors develop a model based on normative principles of organizational justice for justice-based diversity management processes and outcomes.

Findings – The paper proposes that effective diversity management results from a decision-making process that meets the normative principles of organizational justice (i.e. interactional, procedural and distributive justice). The diversity justice management model introduced in this article provides important theoretical and practical implications for establishing more moral and just workplaces.

Research limitations/implications – The authors have not tested the conceptual framework of the diversity justice management model, and recommend future research to take up the challenge. The payoff for doing so is to enable the establishment of socially responsible workplaces where individuals, regardless of their background, are given an equal opportunity to flourish in their assigned jobs.

Practical implications – The diversity justice management model introduced in this paper provides organizational justice (OJ)-based guidelines for managers to ensure that OJ can be objectively benchmarked and discussed amongst diversity stakeholders to continuously improve actual and perceived OJ outcomes.

Social implications – The social implication of this conceptual paper is reduction of workforce marginalization and establishment of socially responsible organizations whereby those marginalized (e.g. people with disabilities) can effectively work in their organizations.

Originality/value – This is the first attempt to establish a diversity justice management model, which incorporates normative principles of organizational justice into diversity management processes and outcomes.

Relevance: 100.00%

Abstract:

BACKGROUND AND OBJECTIVES: Following observations in the literature that obsessions often contain or imply negative evaluative information about the self, Aardema et al. (2013) recently developed a measure of the feared self relevant to OCD. The current study aimed to examine further the relevance of such feared-self beliefs to obsessive-compulsive processes, in particular whether they partially underlie doubt in OCD-relevant situations. METHOD: Non-clinical participants (N = 463; 291 females; mean age = 25.17 years, SD = 7.47) were presented with three vignettes, related to washing, checking and non-OCD-relevant themes, which assessed doubt by providing alternating sensory and possibility-based information. RESULTS: Higher levels of OCD symptoms and feared-self beliefs each significantly predicted both higher baseline levels of doubt and greater fluctuation in levels of doubt in the contamination and checking scenarios, and to a much lesser extent in the control scenario. Feared-self beliefs did not predict fluctuation in doubt over and above OCD symptoms, consistent with a mediation model. LIMITATIONS: The main limitation was the use of a non-clinical sample, although this allowed sufficient participant numbers to test the hypotheses. CONCLUSIONS: The findings provide further experimental support for reasoning processes in OCD and suggest that feared-self beliefs may make individuals vulnerable to experiencing doubt. Additionally, the results suggest that individuals with high OCD symptoms, and those with high feared-self beliefs, are unable to recognise the improbable nature of possibility-based statements. Implications for treatment and theory are discussed.
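The "over and above" test reported here is, in essence, a hierarchical regression: does adding feared-self beliefs increase explained variance beyond OCD symptoms? A minimal sketch with simulated data (not the study's), in which doubt is driven by OCD symptoms only:

```python
import numpy as np

def r_squared(X, y):
    """R^2 from ordinary least squares with an intercept term."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 400
ocd = rng.normal(size=n)                            # simulated OCD symptom scores
feared = 0.7 * ocd + rng.normal(scale=0.7, size=n)  # feared-self correlated with OCD
doubt = 0.6 * ocd + rng.normal(scale=0.8, size=n)   # doubt driven by OCD only here

r2_ocd = r_squared(ocd, doubt)                              # step 1: OCD alone
r2_both = r_squared(np.column_stack([ocd, feared]), doubt)  # step 2: add feared-self
print(round(r2_both - r2_ocd, 4))  # increment explained over and above OCD
```

A near-zero increment at step 2, despite feared-self correlating with doubt on its own, is the pattern consistent with mediation that the abstract describes.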

Relevance: 100.00%

Abstract:

Communication is an important area in health professional education curricula; however, it has typically been dealt with as a set of discrete skills that can be learned and taught separately from the underlying thinking. Communication of clinical reasoning is a phenomenon that has largely been ignored in the literature. This research sought to examine how experienced physiotherapists communicate their clinical reasoning and to identify the core processes of this communication. A hermeneutic phenomenological research study was conducted using multiple methods of text construction, including repeated semi-structured interviews, observation and written exercises. Hermeneutic analysis of the texts involved iterative reading and interpretation, with the development of themes and sub-themes. Communication of clinical reasoning was perceived to be complex, dynamic and largely automatic. A key finding was that articulating reasoning (particularly during research) does not completely represent actual reasoning processes but represents a (re)construction of the more complex, rapid and multi-layered processes that operate in practice. These communications are constructed in ways that are perceived as being most relevant to the audience, context and purpose of the communication. Five core components of communicating clinical reasoning were identified: active listening, framing and presenting the message, matching the co-communicator, metacognitive aspects of communication, and clinical reasoning abilities. We propose that communication of clinical reasoning is both an inherent part of reasoning and an essential, complementary skill shaped by the contextual demands of the task and situation. In this way clinical reasoning and its communication are intertwined, providing evidence for the argument that they should be learned (and explicitly taught) in synergy and in context.

Relevance: 100.00%

Abstract:

Present-day weather forecast models usually cannot provide realistic descriptions of local, and particularly extreme, weather conditions. However, for lead times of up to a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the sub-scale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, whereby a stable and significant statistical model, based on prior physical reasoning, establishes a posterior statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to NCEP/NCAR re-analysis in order to derive estimates of daily temperature at weather stations in the northeastern region of Brazil.
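The extreme-value calibration step can be sketched as a GEV fit to block maxima; the station data below are synthetic stand-ins (not re-analysis output), and scipy's `genextreme` is one common choice for the fit:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic block maxima of daily temperature (deg C) at one station; in
# practice these would come from station records or downscaled
# NCEP/NCAR re-analysis fields.
rng = np.random.default_rng(42)
block_maxima = 34.0 + rng.gumbel(loc=0.0, scale=1.5, size=40)

# Fit a GEV distribution (shape c = 0 recovers the Gumbel limit):
c, loc, scale = genextreme.fit(block_maxima)

# Probability of the next block's maximum exceeding 38 deg C, and the
# 10-year return level, both from the fitted distribution:
p_exceed = genextreme.sf(38.0, c, loc=loc, scale=scale)
level_10yr = genextreme.isf(1.0 / 10.0, c, loc=loc, scale=scale)
print(p_exceed, level_10yr)
```

The exceedance probability is exactly the kind of probabilistic forecast of an extreme local temperature that the abstract aims at, once the statistical model is driven by the forecast large-scale circulation instead of synthetic data.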