916 results for High fidelity simulation
Abstract:
Background: The search for alternative and effective forms of training simulation is needed due to the ethical and medico-legal issues involved in training surgical skills on living patients, human cadavers, and living animals. Aims: To evaluate whether bench model fidelity interferes with the acquisition of elliptical excision skills by novice medical students. Materials and Methods: Forty novice medical students were randomly assigned to 5 practice conditions with instructor-directed elliptical excision skills training (n = 8): didactic materials (control); organic bench model (low fidelity); ethylene-vinyl acetate bench model (low fidelity); chicken leg skin bench model (high fidelity); or pig foot skin bench model (high fidelity). Pre- and post-tests were applied. A global rating scale, effect size, and self-perceived confidence on a Likert scale were used to evaluate all elliptical excision performances. Results: The analysis showed that after training, the students practicing on bench models performed better on the global rating scale (all P < 0.0000) and felt more confident performing elliptical excision skills (all P < 0.0000) than the control group. There was no significant difference (all P > 0.05) between the groups that trained on bench models. The magnitude of the effect (basic cutaneous surgery skills training) was considered large (>0.80) in all measurements. Conclusion: The acquisition of elliptical excision skills after instructor-directed training on low-fidelity bench models was similar to that after training on high-fidelity bench models, and students who trained on any of the simulators improved their elliptical excision performance substantially more than those who learned from didactic materials alone.
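The >0.80 threshold cited above matches the usual convention for Cohen's d; assuming that estimator (the abstract does not name it), the standardized effect size for a pre/post comparison is commonly computed as

\[ d = \frac{\bar{x}_{\mathrm{post}} - \bar{x}_{\mathrm{pre}}}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\tfrac{1}{2}\left(s_{\mathrm{pre}}^2 + s_{\mathrm{post}}^2\right)}, \]

with d > 0.8 conventionally read as a large effect. The specific formula is an assumption; the abstract reports only the threshold.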
Abstract:
Virtual testing of composite materials has emerged as a new concept within the aerospace industry. It offers a very large potential to reduce the high certification costs and the long development times associated with experimental campaigns, which involve testing a large number of panels, sub-components, and components. The aim of virtual testing is to replace some experimental tests with high-fidelity numerical simulations. This work is a contribution to the multiscale approach developed at Institute IMDEA Materials to predict the mechanical behavior of a composite laminate from the properties of the ply and the interply. Continuum Damage Mechanics (CDM) formulates intraply damage at the material constitutive level. Intraply CDM is combined with cohesive elements to model interply damage. A CDM model was developed, implemented, and applied to simple mechanical tests of laminates: low- and high-velocity impact, tension of coupons, and shear deformation. The analysis of the results and the comparison with experiments indicated that the performance was reasonably good for the impact tests, but insufficient in the other cases. To overcome the limitations of CDM, the kinematics of the discrete finite element approximation was enhanced to include mesh-embedded discontinuities, the eXtended Finite Element Method (X-FEM).
The X-FEM was adapted to an explicit time integration scheme and was able to reproduce qualitatively the physical failure mechanisms in a composite laminate. However, the results revealed an inconsistency in the formulation that leads to erroneous quantitative results. Finally, the traditional X-FEM was reviewed, and a new method was developed to overcome its limitations: the stable cohesive X-FEM. The properties of the new method were studied in detail, and it was demonstrated that the new method is robust and can be implemented in an explicit finite element formulation, providing a new tool for damage simulation in composite materials.
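For context on the kinematic enrichment mentioned above: in its generic form (the stabilized cohesive variant developed in the thesis is not detailed in the abstract), X-FEM augments the standard finite element displacement field with a term that is discontinuous across the crack surface,

\[ \mathbf{u}^h(\mathbf{x}) = \sum_{i \in \mathcal{I}} N_i(\mathbf{x})\,\mathbf{u}_i + \sum_{j \in \mathcal{J}} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j, \]

where N_i are the standard shape functions, H(x) is the Heaviside function that changes sign across the discontinuity, and a_j are enriched degrees of freedom carried by the nodes J of the elements cut by the crack.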
Abstract:
CcmG is unlike other periplasmic thioredoxin (TRX)-like proteins in that it has a specific reducing activity in an oxidizing environment and a high fidelity of interaction. These two unusual properties are required for its role in c-type cytochrome maturation. The crystal structure of CcmG reveals a modified TRX fold with an unusually acidic active site and a groove formed from two inserts in the fold. Deletion of one of the groove-forming inserts disrupts c-type cytochrome formation. Two unique structural features of CcmG, an acidic active site and an adjacent groove, appear to be necessary to convert an indiscriminately binding scaffold, the TRX fold, into a highly specific redox protein.
Abstract:
Objectives: The objective of this article is to describe the development of an anatomically accurate simulator to aid the training of a perinatal team in the insertion and removal of a fetal endoscopic tracheal occlusion (FETO) balloon in the management of prenatally diagnosed congenital diaphragmatic hernia. Methods: An experienced perinatal team collaborated with a medical sculptor to design a fetal model for the FETO procedure. Measurements derived from 28-week fetal magnetic resonance imaging were used to develop an anatomically precise simulated airway within a silicone rubber preterm fetal model. Clinician feedback then guided multiple iterations of the model, with serial improvements in the anatomic accuracy of the simulator airway. Results: An appropriately sized preterm fetal mannequin with a high-fidelity airway was developed. The team used this model to develop surgical skills in balloon insertion and removal, and to prepare for an integrated response to unanticipated delivery with the FETO balloon still in situ. Conclusions: This fetal mannequin aided the ability of a fetal therapy unit to offer the FETO procedure at their center for the first time. The model may benefit other perinatal centers planning to offer this procedure.
Abstract:
High-fidelity 'proofreading' polymerases are often used in library construction for next-generation sequencing projects, in an effort to minimize errors in the resulting sequence data. The increased template fidelity of these polymerases can come at the cost of reduced template specificity, and library preparation methods based on the AFLP technique may be particularly susceptible. Here, we compare AFLP profiles generated with standard Taq and two versions of a high-fidelity polymerase. We find that Taq produces fewer and brighter peaks than high-fidelity polymerase, suggesting that Taq performs better at selectively amplifying templates that exactly match the primer sequences. Because the higher accuracy of proofreading polymerases remains important for sequencing applications, we suggest that it may be more effective to use alternative library preparation methods.
Abstract:
Objectives: Patients hospitalized in intensive care units (ICU) are often victims of medical errors. The interprofessional nature of ICU teams makes them vulnerable to communication errors. The primary objective of this project is to improve communication in an interprofessional intensive care team through high-fidelity simulation training. Methods: A prospective, randomized, double-blind controlled study was conducted. Ten teams of six ICU professionals each completed three simulated resuscitation scenarios. The intervention group was debriefed on aspects of communication, while the control group was debriefed on technical aspects of resuscitation. Three months later, the teams performed a fourth simulation without debriefing. All simulations were assessed by four evaluators for quality, communication effectiveness, and the sharing of critical information. Results: For the primary outcome, there was no greater improvement in communication in the intervention group compared with the control group. A 16% improvement in communication effectiveness was noted in the intensive care teams regardless of study group. Nurses and respiratory therapists significantly improved their communication effectiveness after three sessions. The observed effect was not maintained at three months. Conclusion: High-fidelity simulator training coupled with debriefing can improve the short-term effectiveness of communication in an interprofessional ICU team.
Abstract:
Simulations of polar ozone losses were performed using the three-dimensional high-resolution (1° × 1°) chemical transport model MIMOSA-CHIM. Three Arctic winters (1999–2000, 2001–2002, 2002–2003) and three Antarctic winters (2001, 2002, and 2003) were considered for the study. The cumulative ozone loss in the Arctic winter 2002–2003 reached around 35% at 475 K inside the vortex, compared to more than 60% in 1999–2000. During 1999–2000, denitrification induced a maximum of about 23% extra ozone loss at 475 K, compared to 17% in 2002–2003. Unlike these two colder Arctic winters, the 2001–2002 Arctic winter was warmer and did not experience much ozone loss. Sensitivity tests showed that the chosen resolution of 1° × 1° provides a better evaluation of ozone loss at the edge of the polar vortex in high solar zenith angle conditions. The simulation results for ozone, ClO, HNO3, N2O, and NOy for the winters 1999–2000 and 2002–2003 were compared with measurements on board the ER-2 and Geophysica aircraft, respectively. Sensitivity tests showed that increasing the heating rates calculated by the model by 50% and doubling the PSC (Polar Stratospheric Clouds) particle density (from 5 × 10⁻³ to 10⁻² cm⁻³) refines the agreement with in situ ozone, N2O, and NOy levels. In this configuration, simulated ClO levels are increased and are in better agreement with observations in January, but are overestimated by about 20% in March. The use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections further increases ClO levels slightly, especially in high solar zenith angle conditions. Comparisons of the modelled ozone values with ozonesonde measurements in the Antarctic winter 2003, and with Polar Ozone and Aerosol Measurement III (POAM III) measurements in the Antarctic winters 2001 and 2002, show that the simulations underestimate the ozone loss rate at the end of the ozone destruction period. A slightly better agreement is obtained with the use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections.
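Cumulative loss figures of this kind are usually diagnosed by comparing chemically active ozone against a passively advected ozone tracer initialized at the start of winter; assuming that standard diagnostic (the abstract does not spell it out), the vortex-averaged loss on an isentropic level θ is

\[ \Delta O_3(\theta, t) = 100 \times \frac{O_3^{\mathrm{passive}}(\theta, t) - O_3^{\mathrm{model}}(\theta, t)}{O_3^{\mathrm{passive}}(\theta, t)} \; \%, \]

so that, for example, the quoted ~35% at 475 K in 2002–2003 would be the fraction of pre-winter ozone chemically destroyed at that level.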
Abstract:
The parameterization of surface heat-flux variability in urban areas relies on adequate representation of surface characteristics. Given the horizontal resolutions (e.g. ≈0.1–1 km) currently used in numerical weather prediction (NWP) models, properties of the urban surface (e.g. vegetated/built surfaces, street-canyon geometries) often have large spatial variability. Here, a new approach based on Urban Zones to characterize Energy partitioning (UZE) is tested within a NWP model (Weather Research and Forecasting model; WRF v3.2.1) for Greater London. The urban land-surface scheme is the Noah/Single-Layer Urban Canopy Model (SLUCM). Detailed surface information (horizontal resolution 1 km) in central London shows that the UZE offers better characterization of surface properties and their variability compared to default WRF-SLUCM input parameters. In situ observations of the surface energy fluxes and near-surface meteorological variables are used to select the radiation and turbulence parameterization schemes and to evaluate the land-surface scheme.
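The energy partitioning that the UZE classification targets is that of the standard urban surface energy balance, quoted here for context (the abstract itself does not state it):

\[ Q^* + Q_F = Q_H + Q_E + \Delta Q_S, \]

where Q* is net all-wave radiation, Q_F the anthropogenic heat flux, Q_H and Q_E the turbulent sensible and latent heat fluxes, and ΔQ_S the net storage heat flux; the SLUCM surface parameters control how the available energy is partitioned among the right-hand-side terms.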
Abstract:
Technological innovations have had a profound influence on how we study sensory perception in humans and other animals. One example was the introduction of affordable computers, which radically changed the nature of visual experiments. It is clear that vision research is now at the cusp of a similar shift, this time driven by the use of commercially available, low-cost, high-fidelity virtual reality (VR). In this review we will focus on: (a) the research questions VR allows experimenters to address and why these research questions are important, (b) the things that need to be considered when using VR to study human perception, (c) the drawbacks of current VR systems, and (d) the future directions vision research may take, now that VR has become a viable research tool.
Abstract:
Meteorological data and high-resolution numerical simulations were used to estimate spatial fields in the eastern Amazon region, where the Caxiuanã Forest and Bay are located, in the State of Pará. The study covered November 2006, when the COBRA-PARÁ field experiment was conducted. Analyses of MODIS sensor images show the occurrence of several local phenomena, such as cloud streets and precipitating convective systems, and an important influence of the interfaces between the forest and the water surfaces. Numerical simulations for 7 November 2006 showed that the model represented the main meteorological variables well. The results show that the Caxiuanã Bay has an important impact on the adjacent meteorological fields, mainly through advection by the northeasterly winds, which induces cooler canopy temperatures to the west of the bay. High-resolution (LES) simulations produced spatial patterns of temperature and humidity aligned with the winds during the daytime, and nocturnal changes caused mainly by the presence of the bay and by convective rainfall. Spatial correlations between mid-level winds and vertical latent heat fluxes showed a shift from negative correlations in the early hours of the day to positive correlations in the afternoon and early evening.
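The spatial correlations in the last sentence are presumably grid-point Pearson correlations between the mid-level wind u and the latent heat flux LE at each output time; the abstract does not define the statistic, so this form is an assumption:

\[ r(t) = \frac{\mathrm{cov}(u, \mathrm{LE})}{\sigma_u \, \sigma_{\mathrm{LE}}}, \]

with the sign change from negative (early morning) to positive (afternoon and evening) indicating that stronger winds become associated with larger latent heat fluxes as the day progresses.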
Abstract:
The enzymatically catalyzed, template-directed extension of an ssDNA/primer complex is an important reaction of extraordinary complexity. The DNA polymerase does not merely facilitate the insertion of dNMP; it also performs rapid screening of substrates to ensure a high degree of fidelity. Several kinetic studies have determined rate constants and equilibrium constants for the elementary steps that make up the overall pathway. This information is used to develop a macroscopic kinetic model, using an approach described by Ninio [Ninio J., 1987. Alternative to the steady-state method: derivation of reaction rates from first-passage times and pathway probabilities. Proc. Natl. Acad. Sci. U.S.A. 84, 663–667]. The principal idea of the Ninio approach is to track a single template/primer complex over time and to identify the expected behavior. The average time to insert a single nucleotide is a weighted sum of several terms, including the actual time to insert a nucleotide plus delays due to polymerase detachment from either the ternary (template-primer-polymerase) or quaternary (+nucleotide) complexes, and time delays associated with the identification and ultimate rejection of an incorrect nucleotide from the binding site. The passage times of all events and their probabilities of occurrence are expressed in terms of the rate constants of the elementary steps of the reaction pathway. The model accounts for variations in the average insertion time with different nucleotides as well as the influence of the G+C content of the sequence in the vicinity of the insertion site. Furthermore, the model provides estimates of error frequencies. If nucleotide extension is recognized as a competition between successful insertions and time-delaying events, it can be described as a binomial process with a probability distribution. The distribution gives the probability of extending a primer/template complex by a certain number of base pairs, and in general it maps annealed complexes into extension products.
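To make the weighted-sum structure concrete, a minimal sketch of the Ninio-style first-passage formulation, in illustrative notation rather than the paper's own: if each delay event k (detachment from the ternary or quaternary complex, rejection of a mismatched nucleotide) occurs with probability p_k and costs a mean time t_k, the expected time to insert one nucleotide is

\[ \langle \tau \rangle = t_{\mathrm{ins}} + \sum_{k} p_k \, t_k . \]

Likewise, if a single insertion cycle succeeds with probability p, the number of bases n added after N cycles follows the binomial distribution

\[ P(n \mid N) = \binom{N}{n} \, p^{\,n} (1-p)^{N-n}, \]

which is the distribution that maps annealed complexes into extension products.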