936 results for Parametric sensitivity analysis


Relevance: 80.00%

Abstract:

This paper examines the integration of a tolerance design process within the Computer-Aided Design (CAD) environment, having identified the potential to create an intelligent Digital Mock-Up [1]. The tolerancing process is complex in nature, and reliance on Computer-Aided Tolerancing (CAT) software and domain experts can create a disconnect between the design and manufacturing disciplines. It is necessary to implement the tolerance design procedure at the earliest opportunity to integrate both disciplines and to reduce the workload of tolerance analysis and allocation at critical stages in product development, when production is imminent.
The work seeks to develop a methodology that allows for a preliminary tolerance allocation procedure within CAD. An approach to tolerance allocation based on sensitivity analysis is implemented on a simple assembly to review its contribution to an intelligent DMU. The procedure is developed using Python scripting for CATIA V5, with analysis results aligning with those in the literature. A review of its implementation and requirements is presented.
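The sensitivity-based allocation described above can be illustrated with a minimal stand-alone sketch (not the paper's CATIA V5 implementation): finite-difference sensitivities of a hypothetical one-dimensional gap function drive a worst-case split of an assembly-level tolerance so that each dimension contributes equally.

```python
def sensitivities(f, nominal, h=1e-6):
    """Central-difference partial derivatives of f at the nominal dimensions."""
    s = []
    for i in range(len(nominal)):
        up = list(nominal); up[i] += h
        dn = list(nominal); dn[i] -= h
        s.append((f(up) - f(dn)) / (2 * h))
    return s

def allocate_worst_case(f, nominal, gap_tolerance):
    """Split an assembly-level tolerance across dimensions so that each
    contributes equally to the worst-case gap variation."""
    s = sensitivities(f, nominal)
    n = len(nominal)
    return [gap_tolerance / (n * abs(si)) for si in s]

# Hypothetical assembly: gap = plate length minus two pin diameters
gap = lambda d: d[0] - d[1] - d[2]
tols = allocate_worst_case(gap, [50.0, 20.0, 20.0], 0.3)
```

With unit sensitivities, each of the three dimensions receives a third of the 0.3 mm assembly tolerance.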

Relevance: 80.00%

Abstract:

In this work we explore optimising parameters of a physical circuit model relative to input/output measurements, using the Dallas Rangemaster Treble Booster as a case study. A hybrid metaheuristic/gradient descent algorithm is implemented, where the initial parameter sets for the optimisation are informed by nominal values from schematics and datasheets. Sensitivity analysis is used to screen parameters, which informs a study of the optimisation algorithm against model complexity by fixing parameters. The results of the optimisation show a significant increase in the accuracy of model behaviour, but also highlight several key issues regarding the recovery of parameters.
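A minimal sketch of the hybrid strategy described above, under simplifying assumptions: nominal values (as read off a schematic) seed randomised restarts, standing in for the metaheuristic layer, and each candidate is polished by finite-difference gradient descent on a least-squares error. The circuit model is replaced here by a toy quadratic misfit, not the Rangemaster model.

```python
import random

def local_descent(err, x, lr=0.05, steps=200, h=1e-6):
    """Plain finite-difference gradient descent on the error surface."""
    for _ in range(steps):
        g = [(err(x[:i] + [x[i] + h] + x[i + 1:]) - err(x)) / h
             for i in range(len(x))]
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def hybrid_fit(err, nominal, spread=0.5, restarts=20, seed=0):
    """Outer loop: perturb nominal component values, polish each candidate
    with gradient descent, keep the best."""
    rng = random.Random(seed)
    return min(
        (local_descent(err, [n * (1 + rng.uniform(-spread, spread))
                             for n in nominal])
         for _ in range(restarts)),
        key=err,
    )

# Toy "measurement": recover two component values from a quadratic misfit
target = [1.0, 2.2]
err = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target))
fit = hybrid_fit(err, [1.5, 1.5])
```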

Relevance: 80.00%

Abstract:

This work evaluates the mechanical behaviour of cementitious materials across different length scales. First, the mechanical properties of concrete produced with a bioplasticizer based on effective microorganisms (EM) are studied by statistical nanoindentation and compared with the mechanical properties of concrete produced with an ordinary superplasticizer (SP). The addition of the EM-based bioplasticizer is found to improve the strength of the C–S–H by increasing the cohesion and friction of the solid nanograins. Statistical analysis of the indentation results suggests that the EM-based bioplasticizer inhibits the precipitation of C–S–H with a larger solid volume fraction. Second, a micromechanics-based multiscale model is derived for the poroelastic behaviour of cement paste at early age. The proposed approach yields the poroelastic properties required for modelling the partially saturated mechanical behaviour of aging cement pastes. The model is shown to predict the percolation threshold and the undrained Young's modulus in agreement with experimental data. A stochastic metamodel based on polynomial chaos is constructed to propagate the uncertainty of the model parameters across several length scales. A sensitivity analysis is carried out by post-processing the metamodel for cement pastes with water-to-cement ratios between 0.35 and 0.70. The underlying uncertainty of the effective poroelastic properties is found to be mainly due to the activation energy of the calcium aluminates at early age and, later, to the elastic modulus of the low-density calcium silicate hydrates.
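As a minimal illustration of variance-based sensitivity analysis of the kind used to post-process such a metamodel (here by brute-force Monte Carlo rather than polynomial chaos), the first-order Sobol index of a toy two-input model can be estimated with a double loop over conditional means:

```python
import random

def sobol_first_order(f, n_outer=2000, n_inner=50, seed=1):
    """Brute-force first-order Sobol index of input 0 for a 2-input model
    with independent U(0,1) inputs (double-loop Monte Carlo)."""
    rng = random.Random(seed)
    # Total variance from plain Monte Carlo
    samples = [f(rng.random(), rng.random()) for _ in range(20000)]
    mean = sum(samples) / len(samples)
    var = sum((y - mean) ** 2 for y in samples) / len(samples)
    # Variance of the conditional mean E[Y | X1]
    cond_means = []
    for _ in range(n_outer):
        x1 = rng.random()
        ys = [f(x1, rng.random()) for _ in range(n_inner)]
        cond_means.append(sum(ys) / n_inner)
    m = sum(cond_means) / n_outer
    var_cond = sum((c - m) ** 2 for c in cond_means) / n_outer
    return var_cond / var

model = lambda x1, x2: 3 * x1 + x2  # toy additive model; exact S1 = 0.9
s1 = sobol_first_order(model)
```

For the additive toy model the estimate should land near the analytical value of 0.9, up to Monte Carlo noise.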

Relevance: 80.00%

Abstract:

In urban areas, interchange spacing and the adequacy of design for weaving, merge, and diverge areas can significantly influence available capacity. Traffic microsimulation tools allow detailed analyses of these critical areas in complex locations that often yield results that differ from the generalized approach of the Highway Capacity Manual. In order to obtain valid results, various inputs should be calibrated to local conditions. This project investigated basic calibration factors for the simulation of traffic conditions within an urban freeway merge/diverge environment. By collecting and analyzing urban freeway traffic data from multiple sources, specific Iowa-based calibration factors for use in VISSIM were developed. In particular, a repeatable methodology for collecting standstill distance and headway/time gap data on urban freeways was applied to locations throughout the state of Iowa. This collection process relies on the manual processing of video for standstill distances and individual vehicle data from radar detectors to measure the headways/time gaps. By comparing the data collected from different locations, it was found that standstill distances vary by location and lead-follow vehicle types. Headways and time gaps were found to be consistent within the same driver population and across different driver populations when the conditions were similar. Both standstill distance and headway/time gap were found to follow fairly dispersed and skewed distributions. Therefore, it is recommended that microsimulation models be modified to include the option for standstill distance and headway/time gap to follow distributions as well as be set separately for different vehicle classes. In addition, for the driving behavior parameters that cannot be easily collected, a sensitivity analysis was conducted to examine the impact of these parameters on the capacity of the facility. 
The sensitivity analysis results can be used as a reference to manually adjust parameters to match the simulation results to the observed traffic conditions. A well-calibrated microsimulation model can enable a higher level of fidelity in modeling traffic behavior and serve to improve decision making in balancing need with investment.
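The recommendation above, to model standstill distance and headway/time gap as distributions rather than single fixed values, can be sketched as follows. The observed time gaps are hypothetical, and a lognormal is used as one plausible skewed distribution.

```python
import math, random, statistics

def fit_lognormal(samples):
    """Moment-match a lognormal to skewed gap data: mu and sigma of the
    logarithms of the observations."""
    logs = [math.log(x) for x in samples]
    return statistics.mean(logs), statistics.stdev(logs)

def draw_gaps(mu, sigma, n, seed=0):
    """Draw per-vehicle time gaps from the fitted distribution rather than
    assigning one fixed value to every driver."""
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

# Hypothetical observed time gaps (seconds), e.g. from radar detectors
observed = [0.9, 1.1, 1.3, 1.6, 1.8, 2.4, 3.1]
mu, sigma = fit_lognormal(observed)
gaps = draw_gaps(mu, sigma, 1000)
```

Separate fits per vehicle class would follow the same pattern with class-specific samples.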

Relevance: 80.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 80.00%

Abstract:

Background Dementia is a global issue, with increasing prevalence rates impacting on health services internationally. People with dementia are frequently admitted to hospital, an environment that may not be suited to their needs. While many initiatives have been developed to improve their care in the acute setting, there is a lack of cohesive understanding of how staff experience and perceive the care they give to people with dementia in the acute setting. Objectives The aim of this qualitative synthesis was to explore health care staff's experiences and perceptions of caring for people with dementia in the acute setting. Qualitative synthesis can bring together isolated findings in a meaningful way that can inform policy development. Settings A screening process, using inclusion/exclusion criteria, identified qualitative studies that focused on health care staff caring for people with dementia in acute settings. Participants Twelve reports of nine studies were included for synthesis. Data extraction was conducted on each report by two researchers. Methods Framework synthesis was employed using the VIPS framework, with Values, Individualised, Perspective, and Social and psychological as the concepts guiding synthesis. The VIPS framework has previously been used for exploring approaches to caring for people with dementia. Quality appraisal was conducted using the Critical Appraisal Skills Programme (CASP), and NVivo facilitated a sensitivity analysis to ensure confidence in the findings. Results Key themes, derived from VIPS, included a number of specific subthemes that examined: infrastructure and care pathways, person-centred approaches to care, how the person interacts with their environment and other patients, and family involvement in care decisions. The synthesis identified barriers to appropriate care for the person with dementia.
These include ineffective pathways of care, unsuitable environments, inadequate resources and staffing levels, and a lack of emphasis on education and training for staff caring for people with dementia. Conclusions This review has identified key issues in the care of people with dementia in the acute setting: improving pathways of care, creating suitable environments, addressing resources and staffing levels, and placing emphasis on education for staff caring for people with dementia. Recommendations are made for practice consideration, policy development and future research. Leadership is required to instil the values needed to care for this client group in an effective and person-centred way. Qualitative evidence synthesis can inform policy and, in this case, recommends VIPS as a suitable framework for guiding decisions around care for people with dementia in acute settings.

Relevance: 80.00%

Abstract:

Aims and objectives The aim of this qualitative evidence synthesis was to explore the experiences and perceptions of health care staff caring for people with dementia in the acute setting. This paper focuses on the methodological process of conducting framework synthesis using NVivo for each stage of the review: screening, data extraction, synthesis and critical appraisal. Background Qualitative evidence synthesis brings together many research findings in a meaningful way that can be used to guide practice and policy development. For this purpose, synthesis must be conducted in a comprehensive and rigorous way. There has been previous discussion of how using NVivo can assist in enhancing and illustrating the rigorous processes involved. Design Qualitative framework synthesis. Methods Twelve documents, or research reports, based on nine studies, were included for synthesis. Conclusion The benefits of using NVivo are outlined in terms of facilitating teams of researchers to systematically and rigorously synthesise findings. NVivo functions were used to conduct a sensitivity analysis. Some valuable lessons were learned, and these are presented to assist and guide researchers who wish to use similar methods in the future.

Relevance: 80.00%

Abstract:

A deterministic model of tuberculosis in Cameroon is designed and analyzed with respect to its transmission dynamics. The model includes lack of access to treatment and weak diagnosis capacity, as well as both frequency- and density-dependent transmission. It is shown that the model is mathematically well-posed and epidemiologically reasonable. Solutions are non-negative and bounded whenever the initial values are non-negative. A sensitivity analysis of the model parameters is performed, and the most sensitive ones are identified by means of a state-of-the-art Gauss-Newton method. In particular, parameters representing the proportion of individuals having access to medical facilities are seen to have a large impact on the dynamics of the disease. The model predicts that a gradual increase of these parameters could significantly reduce the disease burden on the population within the next 15 years.
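A minimal toy version of such a compartment model (not the paper's full Cameroon model) shows how the burden's sensitivity to a treatment-access parameter can be probed by finite differences: here a frequency-dependent susceptible-infectious model in which the access parameter scales the treatment rate.

```python
def tb_burden(access, beta=0.35, recovery=0.1, years=15, dt=0.01):
    """Toy frequency-dependent SI model with treatment: 'access' scales the
    treatment rate. Returns the infectious fraction after `years` (forward
    Euler integration). Parameter values are illustrative only."""
    s, i = 0.99, 0.01
    for _ in range(int(years / dt)):
        new_inf = beta * s * i / (s + i)   # frequency-dependent transmission
        treated = access * recovery * i    # return to susceptible on treatment
        s += dt * (-new_inf + treated)
        i += dt * (new_inf - treated)
    return i

def sensitivity(f, p, h=1e-4):
    """Central-difference sensitivity of the burden to one parameter."""
    return (f(p + h) - f(p - h)) / (2 * h)

low_access, high_access = tb_burden(0.5), tb_burden(0.9)
slope = sensitivity(tb_burden, 0.7)
```

The negative slope reflects the paper's qualitative finding: increasing access to treatment reduces the projected burden.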

Relevance: 80.00%

Abstract:

Modern hydrological challenges, whether in forecasting or related to climate change, are forcing the exploration of new modelling approaches to fill current gaps and improve the assessment of uncertainties. The approach addressed in this thesis is the multimodel (MM). The innovation lies in how the multimodel presented in this study is built: rather than calibrating models individually and using their combination, a collective calibration is performed on the average of the 12 selected lumped conceptual models. One of the challenges raised by this novel approach is the large number of parameters (82), which complicates calibration and use, in addition to introducing potential equifinality problems. The solution proposed in this thesis is a sensitivity analysis that fixes the least influential parameters and thereby reduces the total number of parameters to calibrate. An optimisation procedure with calibration and validation then assesses the performance of the multimodel and of its reduced version, while also improving understanding of the multimodel. The sensitivity analysis is performed with the Morris method, which yields a 51-parameter version of the MM (MM51) that performs just as well as the original 82-parameter MM while reducing the potential equifinality problems. The calibration and validation results of the MM under the split-sample test (SST) are compared with those of the 12 individually calibrated models. This analysis shows that the individual models composing the MM perform worse than when calibrated independently. This drop in individual performance, necessary to obtain good overall MM performance, is accompanied by an increase in the diversity of the outputs of the MM's models. The latter is particularly required for hydrological applications that need an assessment of uncertainties.
All these results lead to a better understanding of the multimodel and to its optimisation, which facilitates not only its calibration but also its potential use in an operational context.
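The Morris screening step can be sketched in a few lines. The toy model below stands in for the 82-parameter multimodel, with one dominant parameter and one inert parameter that could be fixed, as was done for the 82-to-51 reduction.

```python
import random

def morris_mu_star(f, k, r=20, delta=0.1, seed=0):
    """Minimal Morris screening: mean absolute elementary effect (mu*) per
    parameter over r random one-at-a-time perturbations in [0, 1]^k."""
    rng = random.Random(seed)
    effects = [[] for _ in range(k)]
    for _ in range(r):
        x = [rng.uniform(0, 1 - delta) for _ in range(k)]
        base = f(x)
        for i in range(k):
            xp = list(x)
            xp[i] += delta
            effects[i].append(abs(f(xp) - base) / delta)
    return [sum(e) / r for e in effects]

# Toy stand-in model: parameter 0 dominates, parameter 2 is inert
model = lambda p: 10 * p[0] + 2 * p[1] + 0.0 * p[2]
mu = morris_mu_star(model, 3)
```

A parameter with mu* near zero is a candidate for fixing, shrinking the calibration problem.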

Relevance: 80.00%

Abstract:

A fundamental step in understanding the effects of irradiation on metallic uranium and uranium dioxide ceramic fuels, or any material, must start with the nature of radiation damage on the atomic level. The atomic displacement damage results in a multitude of defects that influence the fuel performance. Nuclear reactions are coupled, in that changing one variable will alter others through feedback. In the field of fuel performance modeling, these difficulties are addressed through the use of empirical models rather than models based on first principles. Empirical models can be used as a predictive code through the careful manipulation of input variables for the limited circumstances that are closely tied to the data used to create the model. While empirical models are efficient and give acceptable results, these results are only applicable within the range of the existing data. This narrow window prevents modeling changes in operating conditions that would invalidate the model, as the new operating conditions would not be within the calibration data set. This work is part of a larger effort to correct for this modeling deficiency. Uranium dioxide and metallic uranium fuels are analyzed through a kinetic Monte Carlo (kMC) code as part of an overall effort to generate a stochastic and predictive fuel code. The kMC investigations include sensitivity analysis of point defect concentrations, thermal gradients implemented through a temperature-variation mesh grid, and migration energy values. In this work, fission damage is primarily represented through defects on the oxygen anion sublattice. Results were also compared between the various models. Past studies of kMC point defect migration have not adequately addressed non-standard migration events such as clustering and dissociation of vacancies.
As such, the General Utility Lattice Program (GULP) code was used to generate new migration energies so that additional non-migration events can be included in the kMC code in the future for more comprehensive studies. Defect energies were calculated to generate barrier heights for single-vacancy migration, clustering and dissociation of two vacancies, and vacancy migration under the influence of both an additional oxygen and a uranium vacancy.
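A minimal residence-time kMC loop (with illustrative barrier heights, not GULP-derived values) shows how Arrhenius rates built from migration energies drive event selection and elapsed simulation time:

```python
import math, random

def kmc_hops(barriers_eV, temperature_K, n_events=1000, nu0=1e13, seed=0):
    """Minimal residence-time kMC: pick among candidate migration events
    with Arrhenius rates nu0 * exp(-Eb / kT), accumulate elapsed time.
    Barrier values are illustrative only."""
    kB = 8.617e-5  # Boltzmann constant, eV/K
    rng = random.Random(seed)
    rates = [nu0 * math.exp(-eb / (kB * temperature_K)) for eb in barriers_eV]
    total = sum(rates)
    t, counts = 0.0, [0] * len(rates)
    for _ in range(n_events):
        r = rng.random() * total           # roulette-wheel event selection
        acc, chosen = 0.0, 0
        for i, rate in enumerate(rates):
            acc += rate
            if r <= acc:
                chosen = i
                break
        counts[chosen] += 1
        t += -math.log(rng.random()) / total  # exponential waiting time
    return t, counts

# Two hypothetical vacancy-hop barriers (eV) at 1000 K
elapsed, counts = kmc_hops([0.5, 0.7], 1000.0)
```

The lower-barrier event dominates the event counts, as its Arrhenius rate is roughly an order of magnitude higher at this temperature.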

Relevance: 80.00%

Abstract:

Hydrometallurgical process modeling is the main objective of this Master's thesis work. Three different leaching processes, namely high-pressure pyrite oxidation, direct oxidation zinc concentrate (sphalerite) leaching, and gold chloride leaching using a rotating disc electrode (RDE), are modeled and simulated using the gPROMS process simulation program in order to evaluate its model-building capabilities. The leaching mechanism in each case is described in terms of a shrinking core model. The mathematical modeling carried out included process model development based on the available literature, estimation of reaction kinetic parameters, and assessment of model reliability by checking the goodness of fit and the cross-correlation between the estimated parameters through the use of correlation matrices. The estimated parameter values in each case were compared with those obtained using the Modest simulation program. Further, based on the estimated reaction kinetic parameters, reactor simulation and modeling for direct oxidation zinc concentrate (sphalerite) leaching is carried out in Aspen Plus V8.6. The zinc leaching autoclave is based on the Cominco reactor configuration and is modeled as a series of continuous stirred-tank reactors (CSTRs). The sphalerite conversion is calculated, and a sensitivity analysis is carried out so as to determine the optimum reactor operating temperature and optimum oxygen mass flow rate. In this way, the implementation of reaction kinetic models into the process flowsheet simulation environment has been demonstrated.
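For the surface-reaction-controlled case, the shrinking core model linearises as 1 − (1 − X)^(1/3) = k·t, which makes the rate constant recoverable by a least-squares slope. The sketch below is a stdlib stand-in for the gPROMS/Modest estimation step, with hypothetical, noise-free leaching data.

```python
def shrinking_core_conversion(t, k):
    """Surface-reaction-controlled shrinking core model:
    1 - (1 - X)^(1/3) = k * t, solved for conversion X (capped at 1)."""
    g = min(k * t, 1.0)
    return 1.0 - (1.0 - g) ** 3

def fit_rate_constant(times, conversions):
    """Least-squares slope through the origin of the linearised model."""
    ys = [1.0 - (1.0 - x) ** (1.0 / 3.0) for x in conversions]
    num = sum(t * y for t, y in zip(times, ys))
    den = sum(t * t for t in times)
    return num / den

# Hypothetical leaching data (hours, fractional conversion), k = 0.02 1/h
times = [5.0, 10.0, 20.0, 30.0]
data = [shrinking_core_conversion(t, 0.02) for t in times]
k_hat = fit_rate_constant(times, data)
```

With noisy experimental data, the same linearisation would give the starting point for a proper weighted regression.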

Relevance: 80.00%

Abstract:

In an industrial environment, knowing the process one is working with is crucial to ensuring that it functions well. In the present work, developed at the Prio Biocombustíveis S.A. facilities, the methanol recovery process was characterized using process data collected during the present work together with historical process data, starting with the characterization of key process streams. Based on the information retrieved from the stream characterization, the Aspen Plus® process simulation software was used to replicate the process and perform a sensitivity analysis, with the objective of assessing the relative importance of certain key process variables (reflux/feed ratio, reflux temperature, reboiler outlet temperature, and methanol, glycerol and water feed compositions). The work proceeded with the application of a set of statistical tools, starting with Principal Components Analysis (PCA), from which the interactions between process variables and their contributions to process variability were studied. Next, Design of Experiments (DoE) was to be used to acquire experimental data and, with it, create a model for the water content of the distillate; however, the necessary conditions to perform this method were not met, and so it was abandoned. The Multiple Linear Regression (MLR) method was then used with the available data, creating several empirical models for the water in the distillate, the best fit having an R² of 92.93% and an AARD of 19.44%. Despite the AARD still being relatively high, the model is adequate for fast estimates of the distillate's quality. As for fouling, its presence was noticed many times during this work. As it was not possible to measure the fouling directly, the reboiler inlet steam pressure was used as an indicator of fouling growth and of how that growth varies with the amount of Used Cooking Oil incorporated in the whole process.
Comparing the steam cost associated with the reboiler's operation when fouling is low (steam pressure of 1.5 bar) and when fouling is high (steam pressure of 3 bar), an increase of about 58% occurs as the fouling builds up.
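The MLR step can be sketched with ordinary least squares via the normal equations. The distillate-water data below are hypothetical, with the predictor columns standing in for scaled process variables such as reflux ratio and reboiler temperature.

```python
def mlr_fit(X, y):
    """Ordinary least squares via the normal equations, solved by Gaussian
    elimination with partial pivoting; X rows are [1, x1, x2, ...]."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

def r_squared(X, y, beta):
    """Coefficient of determination of the fitted model."""
    preds = [sum(bi * xi for bi, xi in zip(beta, row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - p) ** 2 for yi, p in zip(y, preds))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical data generated from y = 0.5 + 2*x1 - 1*x2 (noise-free)
X = [[1, 0.2, 0.1], [1, 0.4, 0.3], [1, 0.6, 0.2], [1, 0.8, 0.5], [1, 1.0, 0.4]]
y = [sum(bi * xi for bi, xi in zip([0.5, 2.0, -1.0], row)) for row in X]
beta = mlr_fit(X, y)
```

With real plant data the fit would of course be imperfect, and R² and AARD would be the quantities to report, as in the abstract.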