1000 results for Cohort simulation


Relevance: 20.00%

Abstract:

This study evaluated the stress levels at the core layer and the veneer layer of zirconia crowns (an alternative core design vs. a standard core design) under mechanical and thermal simulation, and subjected simulated models to laboratory mouth-motion fatigue. The dimensions of a mandibular first molar were imported into computer-aided design (CAD) software and a tooth preparation was modeled. A crown was designed using the space between the original tooth and the prepared tooth. The alternative core presented an additional lingual shoulder that lowered the veneer bulk of the cusps. Finite element analyses evaluated the residual maximum principal stress fields at the core and veneer of both designs under loading and when cooled from 900 °C to 25 °C. Crowns were fabricated and mouth-motion fatigued, generating master Weibull curves and reliability data. Thermal modeling showed low residual stress fields throughout the bulk of the cusps for both groups. Mechanical simulation depicted a shift in stress levels to the core of the alternative design compared with the standard design. Significantly higher reliability was found for the alternative core. For the alternative configuration, thermal and mechanical computer simulations showed stresses comparable to and higher than those of the standard configuration, respectively. Such a mechanical scenario probably led to the higher reliability of the alternative design under fatigue.
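The fatigue results above were summarized as master Weibull curves and reliability estimates. As a minimal sketch of that final step only, the following fits a two-parameter Weibull distribution to cycles-to-failure data with scipy; the failure-cycle values are invented for illustration, not the study's data.

```python
# Minimal sketch: fitting a two-parameter Weibull distribution to
# hypothetical cycles-to-failure data, as one might do when building
# reliability curves from mouth-motion fatigue results. The data below
# are invented for illustration only.
import numpy as np
from scipy import stats

cycles_to_failure = np.array([1.2e5, 1.8e5, 2.3e5, 3.1e5, 4.0e5, 5.5e5])

# Fix the location parameter at 0 for the standard two-parameter form.
shape, loc, scale = stats.weibull_min.fit(cycles_to_failure, floc=0)

# Reliability (survival probability) at a chosen mission length.
mission = 2.0e5
reliability = stats.weibull_min.sf(mission, shape, loc=loc, scale=scale)
print(f"Weibull shape={shape:.2f}, scale={scale:.3g}")
print(f"Estimated reliability at {mission:.0f} cycles: {reliability:.2f}")
```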

Relevance: 20.00%

Abstract:

Questionnaire surveys, while more economical, typically achieve poorer response rates than interview surveys. We used data from a national volunteer cohort of young adult twins, who were scheduled for assessment by questionnaire in 1989 and by interview in 1996-2000, to identify predictors of questionnaire non-response. Out of a total of 8536 twins, 5058 completed the questionnaire survey (59% response rate), and 6255 completed a telephone interview survey conducted a decade later (73% response rate). Multinomial logit models were fitted to the interview data to identify socioeconomic, psychiatric and health behavior correlates of non-response in the earlier questionnaire survey. Male gender, education below University level, and being a dizygotic rather than monozygotic twin, all predicted reduced likelihood of participating in the questionnaire survey. Associations between questionnaire response status and psychiatric history and health behavior variables were modest, with history of alcohol dependence and childhood conduct disorder predicting decreased probability of returning a questionnaire, and history of smoking and heavy drinking more weakly associated with non-response. Body-mass index showed no association with questionnaire non-response. Despite a poor response rate to the self-report questionnaire survey, we found only limited sampling biases for most variables. While not appropriate for studies where socioeconomic variables are critical, it appears that survey by questionnaire, with questionnaire administration by telephone to non-responders, will represent a viable strategy for gene-mapping studies requiring that large numbers of relatives be screened.
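The non-response analysis rests on multinomial logit models of response status. As a rough sketch of that kind of model fit (with simulated data and illustrative predictor names, not the twin study's dataset), one could use statsmodels:

```python
# Minimal sketch of fitting a multinomial logit model of survey response
# status, in the spirit of the analysis described above. All data here
# are simulated; the predictors are illustrative, not the study's.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "university": rng.integers(0, 2, n),
    "monozygotic": rng.integers(0, 2, n),
})
# Response status: 0 = non-responder, 1 = interview only, 2 = both surveys.
logits = np.column_stack([
    np.zeros(n),
    0.3 * df["university"] - 0.2 * df["male"],
    0.5 * df["monozygotic"] + 0.2 * df["university"],
])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
df["status"] = [rng.choice(3, p=p) for p in probs]

X = sm.add_constant(df[["male", "university", "monozygotic"]])
model = sm.MNLogit(df["status"], X).fit(disp=False)
print(model.summary())
```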

Relevance: 20.00%

Abstract:

The step size determines the accuracy of a discrete element simulation. The position and velocity updates use a pre-calculated table, so step size control cannot rely on the usual integration-formula error estimates. A step size control scheme for use with the table-driven velocity and position calculation instead uses the difference between the result of one big step and that of two small steps. This variable time step method automatically chooses a suitable step size for each particle at each step according to local conditions. Simulation using the fixed time step method is compared with that using the variable time step method. The difference in computation time for the same accuracy using a variable step size (compared to the fixed step) depends on the particular problem. For a simple test case the times are roughly similar. However, the variable step size gives the required accuracy on the first run, whereas a fixed step size may require several runs to check the simulation accuracy, or a conservative step size that results in longer run times. (C) 2001 Elsevier Science Ltd. All rights reserved.
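The step-doubling idea generalizes beyond table-driven updates. A minimal sketch, assuming a toy explicit Euler update and a simple harmonic force in place of the paper's DEM tables: advance by one big step and by two half steps, use their difference as a local error estimate, and adapt the step size.

```python
# Minimal sketch of step-doubling step size control. The Euler update
# and spring force stand in for the paper's table-driven DEM update;
# tolerances are illustrative. This sketch always accepts the step and
# only adapts the size of the *next* one.
def step(x, v, dt, accel):
    """One explicit Euler update (stand-in for the table-driven update)."""
    return x + v * dt, v + accel(x) * dt

def adaptive_step(x, v, dt, accel, tol=1e-6):
    x_big, v_big = step(x, v, dt, accel)          # one big step
    x_half, v_half = step(x, v, dt / 2, accel)    # two half steps
    x_two, v_two = step(x_half, v_half, dt / 2, accel)
    err = max(abs(x_big - x_two), abs(v_big - v_two))
    if err > tol:
        dt_next = dt / 2      # too inaccurate: shrink the next step
    elif err < tol / 10:
        dt_next = dt * 2      # comfortably accurate: grow the next step
    else:
        dt_next = dt
    return x_two, v_two, dt_next

accel = lambda x: -x          # toy spring force
x, v, dt, t = 1.0, 0.0, 0.1, 0.0
while t < 10.0:
    x, v, dt_next = adaptive_step(x, v, dt, accel)
    t += dt
    dt = dt_next
print(f"final state: x={x:.4f}, v={v:.4f}, dt={dt:.3g}")
```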

Relevance: 20.00%

Abstract:

This paper proposes the creation of an objectively acquired reference database to more accurately characterize the incidence and long-term risk of relatively infrequent, but serious, adverse events. Such a database would be maintained longitudinally to provide for ongoing comparison with new rheumatologic drug safety databases collecting the occurrences and treatments of rare events. We propose the establishment of product-specific registries to prospectively follow a cohort of patients with rheumatoid arthritis (RA) who receive newly approved therapies. In addition, a database is required of a much larger cohort of RA patients treated with multiple second-line agents, of sufficient size to enable case-controlled determinations of the relative incidence of rare but serious events in the treated (registry) population versus the larger disease population. The number of patients necessary for agent-specific registries, and a larger patient population adequate to supply a matched case-control cohort, will depend upon estimates of the detectability of an increased incidence over background. We suggest a system to carry out this proposal that will involve an umbrella organization responsible for establishment of this large patient cohort, envisioned to be drawn from around the world.
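The required registry size turns on how small an increase over the background event rate must be detectable. As a rough illustration only (the rates below are invented, not figures from the proposal), a standard two-proportion sample size calculation gives the flavor:

```python
# Rough sketch: sample size per group to detect an increase in a rare
# adverse-event rate over background, using the standard normal
# approximation for comparing two proportions. The rates below are
# invented for illustration, not estimates from the proposal.
from math import ceil, sqrt
from scipy.stats import norm

def n_per_group(p0, p1, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p0 + p1) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return ceil(num / (p1 - p0) ** 2)

# e.g. background rate of 1 per 1000 vs. a hypothetical threefold increase
print(n_per_group(p0=0.001, p1=0.003))   # patients needed in each cohort
```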

Relevance: 20.00%

Abstract:

Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, the comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM1 activated sludge model, in which a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
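The isolation step amounts to matching an observed residual vector against the subspaces spanned by each error class's feature matrix. A minimal sketch of that matching follows; the feature matrices and residual are invented stand-ins, since in the paper they come from the observer design and the model structure:

```python
# Minimal sketch of isolating a coding-error class by measuring how far
# a residual vector lies from each class's feature subspace and picking
# the class whose subspace explains the residual best. The feature
# matrices and residual below are invented for illustration.
import numpy as np

def projection_distance(residual, feature_matrix):
    """Distance from the residual to the span of the feature matrix."""
    Q, _ = np.linalg.qr(feature_matrix)
    proj = Q @ (Q.T @ residual)
    return np.linalg.norm(residual - proj)

feature_matrices = {
    "kinetic-rate error": np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]),
    "stoichiometry error": np.array([[1.0], [1.0], [1.0]]),
}
residual = np.array([0.9, 1.1, 1.0])   # as generated by the observer

scores = {name: projection_distance(residual, F)
          for name, F in feature_matrices.items()}
print(min(scores, key=scores.get))     # -> "stoichiometry error"
```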

Relevance: 20.00%

Abstract:

Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
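The solution method, spectral propagation of a symmetrised master equation matrix, can be sketched in a few lines. Under detailed balance, dp/dt = Mp can be symmetrised as S = D^(-1/2) M D^(1/2) with D = diag(p_eq); S is eigendecomposed once and then propagated to any time. The tiny 3-state rate matrix below is invented; a real application would build M from microcanonical rate coefficients and, as the authors note, use high-precision arithmetic:

```python
# Minimal sketch of spectral propagation for a master equation
# dp/dt = M p. The equilibrium populations and symmetric fluxes below
# are invented; detailed balance M[i,j]*p_eq[j] == M[j,i]*p_eq[i] holds
# by construction, so the symmetrised matrix is real-symmetric.
import numpy as np

p_eq = np.array([0.5, 0.3, 0.2])              # equilibrium populations
W = np.array([[0.0, 0.2, 0.1],                # symmetric equilibrium fluxes
              [0.2, 0.0, 0.05],
              [0.1, 0.05, 0.0]])
M = W / p_eq                                  # off-diagonal rates M[i,j]
np.fill_diagonal(M, -M.sum(axis=0))           # columns sum to zero

d = np.sqrt(p_eq)
S = M / d[:, None] * d[None, :]               # S = D^{-1/2} M D^{1/2}
lam, U = np.linalg.eigh(S)                    # real eigenvalues, one zero

def propagate(p0, t):
    """p(t) = D^{1/2} U exp(lam t) U^T D^{-1/2} p0."""
    y0 = U.T @ (p0 / d)
    return d * (U @ (np.exp(lam * t) * y0))

p0 = np.array([1.0, 0.0, 0.0])                # all population in state 0
print(propagate(p0, 5.0))                     # relaxes toward p_eq
```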

Relevance: 20.00%

Abstract:

The QU-GENE Computing Cluster (QCC) is a hardware and software solution to the automation and speedup of large QU-GENE (QUantitative GENEtics) simulation experiments that are designed to examine the properties of genetic models, particularly those that involve factorial combinations of treatment levels. QCC automates the management of the distribution of components of the simulation experiments among the networked single-processor computers to achieve the speedup.
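The core idea, farming out the factorial combinations of a simulation experiment to many processors, can be sketched with Python's standard library. The worker pool on one machine stands in for QCC's networked single-processor computers, and run_experiment and the factor levels are placeholders, not QU-GENE's actual interface:

```python
# Minimal sketch of distributing the factorial combinations of a
# simulation experiment across worker processes, in the spirit of QCC.
# run_experiment and the factor levels are illustrative placeholders.
from itertools import product
from multiprocessing import Pool

def run_experiment(args):
    heritability, pop_size, n_generations = args
    # Stand-in for one QU-GENE simulation run.
    return args, heritability * pop_size * n_generations

if __name__ == "__main__":
    heritabilities = [0.1, 0.3, 0.5]
    pop_sizes = [100, 500]
    generations = [10, 20]
    combos = list(product(heritabilities, pop_sizes, generations))
    with Pool() as pool:                 # one worker per local CPU core
        for combo, result in pool.map(run_experiment, combos):
            print(combo, result)
```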

Relevance: 20.00%

Abstract:

The development of cropping systems simulation capabilities world-wide, combined with easy access to powerful computing, has resulted in a plethora of agricultural models and, consequently, model applications. Nonetheless, the scientific credibility of such applications and their relevance to farming practice is still being questioned. Our objective in this paper is to highlight some of the model applications from which benefits for farmers were or could be obtained via changed agricultural practice or policy. Changed on-farm practice due to the direct contribution of modelling, while keenly sought after, may in some cases be less achievable than a contribution via agricultural policies. This paper is intended to give some guidance for future model applications. It is not a comprehensive review of model applications, nor is it intended to discuss modelling in the context of social science or extension policy. Rather, we take snapshots around the globe to 'take stock' and to demonstrate that well-defined financial and environmental benefits can be obtained on-farm from the use of models. We highlight the importance of 'relevance' and hence the importance of true partnerships between all stakeholders (farmers, scientists, advisers) for the successful development and adoption of simulation approaches. Specifically, we address some key points that are essential for successful model applications: (1) issues to be addressed must be neither trivial nor obvious; (2) a modelling approach must reduce complexity rather than proliferate choices in order to aid the decision-making process; and (3) the cropping systems must be sufficiently flexible to allow management interventions based on insights gained from models. The pros and cons of normative approaches (e.g. decision support software that can reach a wide audience quickly but is often poorly contextualized for any individual client) versus model applications within the context of an individual client's situation are also discussed. We suggest that a tandem approach is necessary, whereby the latter is used in the early stages of model application for confidence building amongst client groups. This paper focuses on five specific regions that differ fundamentally in terms of environment and socio-economic structure and hence in their requirements for successful model applications. Specifically, we give examples from Australia and South America (high climatic variability, large areas, low input, technologically advanced); Africa (high climatic variability, small areas, low input, subsistence agriculture); India (high climatic variability, small areas, medium-level inputs, technologically progressing); and Europe (relatively low climatic variability, small areas, high input, technologically advanced). The contrast between Australia and Europe further demonstrates how successful model applications are strongly influenced by the policy framework within which producers operate. We suggest that this might eventually lead to better adoption of fully integrated systems approaches and result in the development of resilient farming systems that are in tune with current climatic conditions and are adaptable to biophysical and socioeconomic variability and change. (C) 2001 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimate the probability that the content of the second buffer exceeds some high level L before it becomes empty, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike changes of measure proposed and studied in recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independently of the level L, a new result which has not been established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure for the finite buffer case. In this case, the relative error is shown to be bounded (independently of L) only when the second server is the bottleneck, a result which is known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results using our method suggest that the relative error grows at most linearly in L.
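As a generic illustration of the importance-sampling idea only (the classic single-queue exponential tilt, not the paper's state-dependent change of measure for the tandem network), the sketch below estimates an overflow-before-empty probability for an M/M/1 queue by simulating with swapped arrival and service rates and weighting each path by its likelihood ratio:

```python
# Minimal sketch of importance sampling for a buffer-overflow
# probability: for an M/M/1 queue, estimate P(content reaches L before
# the buffer empties, starting from 1 job) under the exponential change
# of measure that swaps arrival and service rates.
import random

def estimate_overflow(lam, mu, L, n_runs=100_000, seed=1):
    rng = random.Random(seed)
    p = lam / (lam + mu)        # original up-step probability
    p_tilt = mu / (lam + mu)    # tilted measure: swap the rates
    total = 0.0
    for _ in range(n_runs):
        level, weight = 1, 1.0
        while 0 < level < L:
            if rng.random() < p_tilt:       # arrival under the tilt
                level += 1
                weight *= p / p_tilt
            else:                           # departure under the tilt
                level -= 1
                weight *= (1 - p) / (1 - p_tilt)
        if level == L:                      # overflow before empty
            total += weight
    return total / n_runs

print(estimate_overflow(lam=1.0, mu=2.0, L=30))   # exact: 1/(2**30 - 1)
```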

Relevance: 20.00%

Abstract:

A two-dimensional numerical simulation model of interface states in scanning capacitance microscopy (SCM) measurements of p-n junctions is presented. In the model, amphoteric interface states with two transition energies in the Si band gap are represented as fixed charges to account for their behavior in SCM measurements. The interface states are shown to cause a stretch-out and a parallel shift of the capacitance-voltage characteristics in the depletion and neutral regions of p-n junctions, respectively. This explains the discrepancy between the SCM measurement and simulation near p-n junctions, and thus modeling interface states is crucial for SCM dopant profiling of p-n junctions. (C) 2002 American Institute of Physics.
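A toy sketch of the two distortions named above, applied to an invented sigmoidal C-V curve: a stretch-out along the voltage axis and a parallel voltage shift. Both the curve shape and the parameter values are illustrative assumptions; the paper's 2-D device simulation is far more detailed.

```python
# Toy sketch of how interface states distort a capacitance-voltage
# curve: a stretch-out along the voltage axis and a parallel voltage
# shift. The ideal C-V form and the distortion parameters are invented.
import numpy as np

def ideal_cv(v):
    """Placeholder sigmoidal C-V characteristic (arbitrary units)."""
    return 1.0 / (1.0 + np.exp(-v))

v = np.linspace(-5, 5, 11)
stretch = 1.5          # stretch-out factor (depletion region)
shift = 0.8            # parallel shift (neutral region)

c_ideal = ideal_cv(v)
c_distorted = ideal_cv((v - shift) / stretch)

for vi, ci, cd in zip(v, c_ideal, c_distorted):
    print(f"V={vi:+.1f}  C_ideal={ci:.3f}  C_with_states={cd:.3f}")
```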