820 results for "approach to information systems"


Relevance: 100.00%

Abstract:

A procedure is proposed to accurately model thin wires in lossy media by finite element analysis. It is based on the determination of a suitable element width in the vicinity of the wire, which must be chosen as a function of the wire radius to yield accurate results. The approach is well suited to the analysis of grounding systems. The numerical results of finite element analysis with the suitably chosen element width are compared with both analytical results and those computed by a commercial package for the analysis of grounding systems, showing very good agreement.
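The key practical ingredient, an element width near the wire tied to the wire radius, can be sketched as a graded radial mesh. The proportionality constant and growth factor below are purely illustrative assumptions; the paper determines the suitable width, and this snippet only shows the mesh-grading mechanics.

```python
import numpy as np

def graded_mesh(wire_radius, outer_radius, k=6.0, growth=1.3):
    """Radial node positions with the first element width tied to the wire radius.

    k and growth are illustrative parameters (not from the paper): the first
    element width is k * wire_radius, and widths grow geometrically outward.
    """
    nodes = [wire_radius]
    h = k * wire_radius
    while nodes[-1] + h < outer_radius:
        nodes.append(nodes[-1] + h)
        h *= growth
    nodes.append(outer_radius)
    return np.array(nodes)

# Fine elements near a 1 cm wire, coarsening toward a 10 m outer boundary.
mesh = graded_mesh(wire_radius=0.01, outer_radius=10.0)
```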

Relevance: 100.00%

Abstract:

This paper describes a hybrid numerical method for an inverse approach to the design of compact magnetic resonance imaging magnets. The problem is formulated as a field synthesis: the desired current density on the surface of a cylinder is first calculated by solving a Fredholm equation of the first kind. Nonlinear optimization methods are then invoked to fit practical magnet coils to the desired current density. The field calculations are performed using a semi-analytical method. The emphasis of this work is on the optimal design of short MRI magnets. Details of the hybrid numerical model are presented, and the model is used to investigate compact, symmetric MRI magnets as well as asymmetric magnets. The results highlight that the method can be used to obtain a compact MRI magnet structure with a very homogeneous magnetic field over the central imaging volume in clinical systems of approximately 1 m in length, significantly shorter than current designs. Viable asymmetric magnet designs, in which the edge of the homogeneous region is very close to one end of the magnet system, are also presented. Unshielded designs are the focus of this work. This method is flexible and may be applied to magnets of other geometries. (C) 2000 American Association of Physicists in Medicine. [S0094-2405(00)00303-5].
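The field-synthesis step, solving a discretized Fredholm equation of the first kind for a surface current density, is ill-posed, so some regularization is needed. A minimal sketch using Tikhonov regularization on a toy smoothing kernel (the paper does not state its regularization scheme; the kernel, grid, and `lam` value here are assumptions):

```python
import numpy as np

def tikhonov_solve(K, b, lam=1e-3):
    """Solve the discretized first-kind equation K j = b for the current
    density j with Tikhonov regularization (lam is illustrative)."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ b)

# Toy smoothing kernel: recovering j from its blurred image is ill-posed.
x = np.linspace(0.0, 1.0, 50)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.01)
j_true = np.sin(np.pi * x)
b = K @ j_true
j_est = tikhonov_solve(K, b, lam=1e-6)
```

The regularized normal equations stay solvable even though `K` itself is nearly singular; the fitted coils are then matched to `j_est` in a separate nonlinear step.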

Relevance: 100.00%

Abstract:

I examine a situation where a firm chooses to locate a new factory in one of several jurisdictions. The value of the factory may differ among jurisdictions, and it depends on the private information held by each jurisdiction. Jurisdictions compete for the location of the new factory. This competition may take the form of expenditures already incurred on infrastructure, commitments to spend on infrastructure, tax incentives, or even cash payments. The model combines two elements that are usually considered separately: competition is desirable because we want the factory to be located in the jurisdiction that values it the most, but competition in itself is wasteful. I show that the expected total amount paid to the firm is the same under a large family of arrangements. Moreover, I show that the ex-ante optimal mechanism (that is, the mechanism that guarantees that the firm chooses the jurisdiction with the highest value for the factory, minimizes the total expected payment to the firm, and balances the budget in an ex-ante sense) can be implemented by running a standard auction and subsidizing participation.
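The revenue-equivalence flavor of the result (the expected total paid to the firm is the same under a large family of arrangements) can be checked numerically in a stylized setting. The setup below, three bidders with independent uniform[0,1] values compared across second-price and first-price formats, is a textbook special case, not the paper's general model:

```python
import random

def expected_payment(n_bidders=3, trials=200000, seed=0):
    """Monte Carlo check of revenue equivalence with uniform[0,1] values:
    a second-price and a first-price auction yield the firm (the seller of
    the location right) the same expected payment."""
    rng = random.Random(seed)
    sp_total = fp_total = 0.0
    for _ in range(trials):
        vals = sorted(rng.random() for _ in range(n_bidders))
        sp_total += vals[-2]                                 # pay 2nd-highest value
        fp_total += (n_bidders - 1) / n_bidders * vals[-1]   # equilibrium first-price bid
    return sp_total / trials, fp_total / trials

sp, fp = expected_payment()
# Both estimates should approach (n-1)/(n+1) = 0.5 for n = 3.
```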

Relevance: 100.00%

Abstract:

Valuation of projects for the preservation of water resources provides important information to policy makers and funding institutions. Standard contingent valuation models rely on distributional assumptions to provide welfare measures. Deviations between the assumed and the actual distribution of benefits are important when designing policies in developing countries, where inequality is a concern. This article applies semiparametric methods to estimate the benefit from a project for the preservation of an important Brazilian river basin. These estimates differ significantly from those obtained using the standard parametric approach.
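A standard distribution-free benchmark in contingent valuation is the Turnbull lower-bound estimate of mean willingness to pay. The article's semiparametric estimator is more elaborate, so this sketch, with made-up bids and acceptance rates, only illustrates the distribution-free idea:

```python
def turnbull_lower_bound(bids, accept_rates):
    """Turnbull lower-bound mean WTP from referendum-style CV data.

    bids: ascending bid amounts; accept_rates: fraction answering "yes" at
    each bid. The survival function is monotonized (pooled downward) and
    each interval's mass is valued at its lower endpoint. Numbers below are
    illustrative only.
    """
    # Enforce a non-increasing survival function over bids.
    s = []
    for r in accept_rates:
        s.append(min(r, s[-1]) if s else r)
    points = [0.0] + list(bids)
    surv = [1.0] + s
    mean = 0.0
    for k in range(1, len(points)):
        mean += points[k - 1] * (surv[k - 1] - surv[k])
    mean += points[-1] * surv[-1]   # mass above the top bid, valued at the top bid
    return mean

wtp = turnbull_lower_bound([5, 10, 20, 40], [0.8, 0.6, 0.3, 0.1])
```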

Relevance: 100.00%

Abstract:

The core structure of the natural sesquiterpene lactones furanoheliangolides, an 11-oxabicyclo[6.2.1]undecane system, was synthesized through a pathway involving two Diels-Alder reactions. (c) 2007 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Sound application of molecular epidemiological principles requires working knowledge of both molecular biological and epidemiological methods. Molecular tools have become an increasingly important part of studying the epidemiology of infectious agents. They have allowed the aetiological agent within a population to be diagnosed with a greater degree of efficiency and accuracy than conventional diagnostic tools; have increased our understanding of the pathogenicity, virulence, and host-parasite relationships of the aetiological agent; have provided information on the genetic structure and taxonomy of the parasite; and have allowed the zoonotic potential of previously unidentified agents to be determined. This review describes the concept of epidemiology and proper study design, describes the array of currently available molecular biological tools, and provides examples of studies that have integrated both disciplines to successfully unravel zoonotic relationships that would otherwise be impossible to resolve using conventional diagnostic tools. The current limitations of applying these tools, including cautions that need to be addressed during their application, are also discussed. (c) 2005 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Fuzzy Bayesian tests were performed to evaluate whether the mothers' seroprevalence and the children's seroconversion to measles vaccine could be considered "high" or "low". The results of the tests were aggregated into a fuzzy rule-based model structure, which allows an expert to influence the model results. The linguistic model was developed considering four input variables. As the model output, we obtain the recommended age-specific vaccine coverage. The inputs of the fuzzy rules are fuzzy sets and the outputs are constant functions, forming the simplest Takagi-Sugeno-Kang model. This fuzzy approach is compared to a classical one, in which the classical Bayes test was performed. Although the fuzzy and classical performances were similar, the fuzzy approach was more detailed and revealed important differences. In addition to taking into account subjective information in the form of fuzzy hypotheses, it can be intuitively grasped by the decision maker. Finally, we show that the Bayesian test of fuzzy hypotheses is an interesting approach from the theoretical point of view, in the sense that it combines two complementary areas of investigation normally seen as competitive. (C) 2007 IMACS. Published by Elsevier B.V. All rights reserved.
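With fuzzy-set inputs and constant-function outputs, the inference step is the zero-order Takagi-Sugeno-Kang scheme: the output is the firing-strength-weighted average of the rule constants. The membership functions and consequents below are illustrative stand-ins, not the calibrated rules from the study:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def tsk_coverage(seroprevalence):
    """Zero-order TSK inference: weighted average of constant consequents.
    The rule constants below are illustrative, not the study's values."""
    rules = [
        (lambda x: tri(x, -0.5, 0.0, 0.6), 0.95),  # "low" prevalence -> high coverage
        (lambda x: tri(x, 0.4, 1.0, 1.5), 0.80),   # "high" prevalence -> lower coverage
    ]
    weights = [mu(seroprevalence) for mu, _ in rules]
    total = sum(weights)  # assumed nonzero: the supports cover [0, 1]
    return sum(w * c for w, (_, c) in zip(weights, rules)) / total

y = tsk_coverage(0.5)
```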

Relevance: 100.00%

Abstract:

Background: Many factors have been associated with the onset and maintenance of depressive symptoms in later life, although this knowledge is yet to be translated into significant health gains for the population. This study gathered information about common modifiable and non-modifiable risk factors for depression with the aim of developing a practical probabilistic model of depression that can be used to guide risk reduction strategies. Methods: A cross-sectional study was undertaken of 20,677 community-dwelling Australians aged 60 years or over in contact with their general practitioner during the preceding 12 months. Prevalent depression (minor or major) according to the Patient Health Questionnaire (PHQ-9) assessment was the main outcome of interest. Other measured exposures included self-reported age, gender, education, loss of mother or father before age 15 years, physical or sexual abuse before age 15 years, marital status, financial stress, social support, smoking and alcohol use, physical activity, obesity, diabetes, hypertension, and prevalent cardiovascular diseases, chronic respiratory diseases and cancer. Results: The mean age of participants was 71.7 +/- 7.6 years and 57.9% were women. Depression was present in 1665 (8.0%) of our subjects. Multivariate logistic regression showed depression was independently associated with age older than 75 years, childhood adverse experiences, adverse lifestyle practices (smoking, risky alcohol use, physical inactivity), intermediate health hazards (obesity, diabetes and hypertension), comorbid medical conditions (clinical history of coronary heart disease, stroke, asthma, chronic obstructive pulmonary disease, emphysema or cancers), and social or financial strain.
We stratified the exposures to build a matrix that showed that the probability of depression increased progressively with the accumulation of risk factors, from less than 3% for those with no adverse factors to more than 80% for people reporting the maximum number of risk factors. Conclusions: Our probabilistic matrix can be used to estimate depression risk and to guide the introduction of risk reduction strategies. Future studies should now aim to clarify whether interventions designed to mitigate the impact of risk factors can change the prevalence and incidence of depression in later life.
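The reported gradient (under 3% with no adverse factors, over 80% at the maximum) is the shape produced by logistic accumulation of log-odds. The intercept and per-factor increment below are hypothetical values chosen only to reproduce that shape, not the study's fitted coefficients:

```python
import math

def depression_probability(n_risk_factors, base_log_odds=-3.7, per_factor=0.45):
    """Logistic risk curve: probability rises as risk factors accumulate.

    base_log_odds and per_factor are hypothetical, chosen so that zero
    factors gives < 3% and twelve factors gives > 80%, matching the shape
    (not the coefficients) reported in the study.
    """
    log_odds = base_log_odds + per_factor * n_risk_factors
    return 1.0 / (1.0 + math.exp(-log_odds))

p_none = depression_probability(0)
p_max = depression_probability(12)
```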

Relevance: 100.00%

Abstract:

This paper examines the problem of establishing a formal relationship of abstraction and refinement between abstract enterprise models and the concrete information systems which implement them. It introduces and justifies a number of reasonableness requirements, which turn out to justify the use of category theoretic concepts, particularly fibrations, to precisely specify a semantics for enterprise models which enables them to be considered as abstractions of the conceptual models from which the implementing information systems are built. The category-theoretic concepts are developed towards the problem of testing whether a system satisfies the fibration axioms, and are applied to case studies to demonstrate their practicability.

Relevance: 100.00%

Abstract:

Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, there are many people who need to code models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, the comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: first, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Second, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
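The isolation step, localizing the residual in the subspace spanned by a class's feature matrix, amounts to choosing the class with the smallest least-squares misfit. The feature matrices and residual below are random placeholders for those derived from the model structure:

```python
import numpy as np

def isolate_error_class(residual, feature_matrices):
    """Return the index of the error class whose feature matrix best spans
    the observed residual (smallest least-squares misfit)."""
    misfits = []
    for F in feature_matrices:
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        misfits.append(np.linalg.norm(F @ coeffs - residual))
    return int(np.argmin(misfits))

rng = np.random.default_rng(1)
F1 = rng.standard_normal((8, 2))   # placeholder feature matrix, class 1
F2 = rng.standard_normal((8, 2))   # placeholder feature matrix, class 2
r = F2 @ np.array([1.0, -2.0])     # residual generated by a class-2 error
cls = isolate_error_class(r, [F1, F2])
```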

Relevance: 100.00%

Abstract:

We apply the quantum trajectory method to current noise in resonant tunneling devices. The results from dynamical simulation are compared with those from the unconditional master equation approach. We show that the stochastic Schrödinger equation approach is useful in modeling the dynamical processes in mesoscopic electronic systems.
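A minimal quantum-jump (Monte Carlo wave-function) trajectory, the standard unravelling behind a stochastic Schrödinger equation, can be illustrated on a resonantly driven two-level system with a single decay channel. This textbook toy model stands in for the paper's mesoscopic device; the jump count plays the role of the detected current:

```python
import numpy as np

def quantum_jump_trajectory(omega=1.0, gamma=1.0, t_max=200.0, dt=0.005, seed=2):
    """Single quantum-jump trajectory for a driven two-level system.

    First-order unravelling: evolve under the non-Hermitian effective
    Hamiltonian between stochastic jumps through the collapse operator c.
    Returns the mean jump (emission) rate along the trajectory.
    """
    rng = np.random.default_rng(seed)
    g = np.array([1.0, 0.0], dtype=complex)                      # ground state
    e = np.array([0.0, 1.0], dtype=complex)                      # excited state
    H = 0.5 * omega * np.array([[0, 1], [1, 0]], dtype=complex)  # (omega/2) sigma_x
    c = np.sqrt(gamma) * np.outer(g, e)                          # decay |e> -> |g>
    cdc = c.conj().T @ c
    H_eff = H - 0.5j * cdc
    psi, jumps = g.copy(), 0
    for _ in range(int(t_max / dt)):
        p_jump = dt * np.real(psi.conj() @ cdc @ psi)
        if rng.random() < p_jump:
            psi = c @ psi          # stochastic jump (a detection event)
            jumps += 1
        else:
            psi = psi - 1j * dt * (H_eff @ psi)   # no-jump evolution
        psi = psi / np.linalg.norm(psi)
    return jumps / t_max

rate = quantum_jump_trajectory()
# Benchmark: the master-equation steady state gives an emission rate of
# gamma * omega**2 / (gamma**2 + 2 * omega**2) = 1/3 for omega = gamma = 1.
```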

Relevance: 100.00%

Abstract:

Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.

Relevance: 100.00%

Abstract:

Background: Tissue Doppler may be used to quantify regional left ventricular function but is limited by segmental variation of longitudinal velocity from base to apex and from free to septal walls. We sought to overcome this by developing a composite of longitudinal and radial velocities. Methods and Results: We examined 82 unselected patients undergoing a standard dobutamine echocardiogram. Longitudinal velocity was obtained in the basal and mid segments of each wall using tissue Doppler in the apical views. Radial velocities were derived in the same segments using an automated border detection system and a centerline method, with regional chords grouped according to segment location and temporally averaged. In 25 patients at low probability of coronary disease, the pattern of regional variation in longitudinal velocity (higher in the septum) was the opposite of radial velocity (higher in the free wall), and the combination was homogeneous. In 57 patients undergoing angiography, velocity in abnormal segments was lower than in normal segments using longitudinal (6.0 +/- 3.6 vs 9.0 +/- 2.2 cm/s, P = .01) and radial velocity (6.0 +/- 4.0 vs 8.0 +/- 3.9 cm/s, P = .02). However, the composite velocity permitted better separation of abnormal and normal segments (13.3 +/- 5.6 vs 17.5 +/- 4.2 cm/s, P = .001). There was no significant difference between the accuracy of this quantitative approach and expert visual wall motion analysis (81% vs 84%, P = .56). Conclusion: Regional variation of unidimensional myocardial velocities necessitates site-specific normal ranges, probably because of different fiber directions. Combined analysis of longitudinal and radial velocities allows the derivation of a composite velocity, which is homogeneous in all segments and may allow better separation of normal and abnormal myocardium.
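The abstract does not spell out how the longitudinal and radial velocities are combined, so the sketch below assumes a plain sum purely for illustration, applied to the group means quoted above:

```python
def composite_velocity(v_longitudinal, v_radial):
    """Illustrative additive combination of segment velocities (cm/s).
    The exact combination rule is not given in the abstract; a plain sum
    is assumed here only to show the arithmetic of pooling the two axes."""
    return v_longitudinal + v_radial

# Group means from the abstract (cm/s).
abnormal = composite_velocity(6.0, 6.0)   # longitudinal 6.0, radial 6.0
normal = composite_velocity(9.0, 8.0)     # longitudinal 9.0, radial 8.0
separation = normal - abnormal            # wider gap than either axis alone
```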

Relevance: 100.00%

Abstract:

We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
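A heavily simplified, fully parametric analogue of the ECM fitting, a two-component exponential mixture with unknown mixing proportion fitted by EM (no covariates, no censoring, constant component hazards), illustrates the alternation of E- and M-steps:

```python
import math
import random

def em_exponential_mixture(times, n_iter=200):
    """EM for a two-component exponential mixture: a deliberately simplified
    analogue of the paper's semi-parametric ECM (no covariates, no censoring,
    constant component hazards)."""
    # Crude initialization from the data spread.
    m = sum(times) / len(times)
    rate1, rate2, pi = 2.0 / m, 0.5 / m, 0.5
    for _ in range(n_iter):
        # E-step: posterior probability each observation came from component 1.
        w = []
        for t in times:
            f1 = pi * rate1 * math.exp(-rate1 * t)
            f2 = (1 - pi) * rate2 * math.exp(-rate2 * t)
            w.append(f1 / (f1 + f2))
        # M-step: update the mixing proportion and the component hazard rates.
        pi = sum(w) / len(w)
        rate1 = sum(w) / sum(wi * t for wi, t in zip(w, times))
        rate2 = (len(w) - sum(w)) / sum((1 - wi) * t for wi, t in zip(w, times))
    return pi, rate1, rate2

# Synthetic data: 60% fast failures (rate 3.0), 40% slow failures (rate 0.3).
rng = random.Random(0)
data = [rng.expovariate(3.0) for _ in range(600)] + \
       [rng.expovariate(0.3) for _ in range(400)]
pi, r1, r2 = em_exponential_mixture(data)
```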