950 results for Simulation methods.
Abstract:
OBJECTIVES The aim was to study the impact of defect size in endodontically treated incisors, compared with dental implants, used as abutments on the survival of zirconia two-unit anterior cantilever fixed partial dentures (2U-FPDs) during a simulated 10-year period. MATERIALS AND METHODS Human maxillary central incisors were endodontically treated and divided into three groups (n = 24): I, access cavities rebuilt with a composite core; II, teeth decoronated and restored with composite; and III, as in II but additionally supported by fiber posts. In group IV, implants with individual zirconia abutments were used. Specimens were restored with zirconia 2U-FPDs and exposed to two sequences of thermal cycling and mechanical loading (TCML). Statistics: Kaplan-Meier analysis; log-rank tests. RESULTS During TCML, two tooth fractures and two debondings with chipping were found in group I. Only chipping occurred in groups II (2×), IV (2×), and III (1×). No significant difference in survival was found between the different abutments (p = 0.085) or FPDs (p = 0.526). Load capability differed significantly between groups I (176 N) and III (670 N), and between groups III (670 N) and IV (324 N) (p < 0.024). CONCLUSION Within the limitations of an in vitro study, it can be concluded that zirconia-framework 2U-FPDs on decoronated teeth with or without posts showed in vitro reliability comparable to that of restorations on implants. The results indicate that restorations on teeth with only an access cavity perform worse in terms of survival and linear loading. CLINICAL RELEVANCE From a load-capability point of view, even severe defects do not per se justify replacing such a tooth with a dental implant.
Abstract:
In this article we propose an exact and efficient simulation algorithm for the generalized von Mises circular distribution of order two. It is an acceptance-rejection algorithm with a piecewise linear envelope based on the local extrema and the inflexion points of the generalized von Mises density of order two. We show that these points can be obtained from the roots of polynomials of degrees four and eight, which can be easily found by the methods of Ferrari and Weierstrass. A comparative study with the von Neumann acceptance-rejection, ratio-of-uniforms, and Markov chain Monte Carlo algorithms shows that this new method is generally the most efficient.
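The proposed envelope is piecewise linear and built from the density's local extrema and inflexion points; as a much simpler illustration of acceptance-rejection sampling from this density (not the authors' algorithm), the Python sketch below uses a crude constant envelope with bound exp(kappa1 + kappa2), and all parameter values are illustrative assumptions.

```python
import numpy as np

def gvm2_unnormalized(theta, mu1, mu2, kappa1, kappa2):
    """Unnormalized generalized von Mises density of order two."""
    return np.exp(kappa1 * np.cos(theta - mu1) + kappa2 * np.cos(2.0 * (theta - mu2)))

def sample_gvm2_rejection(n, mu1, mu2, kappa1, kappa2, seed=None):
    """Acceptance-rejection sampling with a constant (uniform) envelope.

    The envelope bound exp(kappa1 + kappa2) follows from cos(.) <= 1, so it is
    valid for kappa1, kappa2 >= 0; it is far less efficient than a piecewise
    linear envelope but illustrates the accept/reject step.
    """
    rng = np.random.default_rng(seed)
    bound = np.exp(kappa1 + kappa2)
    samples = []
    while len(samples) < n:
        theta = rng.uniform(0.0, 2.0 * np.pi)   # uniform proposal on the circle
        u = rng.uniform(0.0, bound)             # uniform height under the envelope
        if u < gvm2_unnormalized(theta, mu1, mu2, kappa1, kappa2):
            samples.append(theta)
    return np.array(samples)

draws = sample_gvm2_rejection(10_000, mu1=0.0, mu2=np.pi / 4, kappa1=1.5, kappa2=0.8, seed=0)
```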
Abstract:
We derive multiscale statistics for deconvolution in order to detect qualitative features of the unknown density. An important example covered by this framework is testing for local monotonicity on all scales simultaneously. We investigate the moderately ill-posed setting, in which the Fourier transform of the error density in the deconvolution model is of polynomial decay. For multiscale testing, we consider a calibration motivated by the modulus of continuity of Brownian motion. We investigate the performance of our results from both a theoretical and a simulation-based point of view. A major consequence of our work is that detecting qualitative features of a density in a deconvolution problem is feasible, even though the minimax rates for pointwise estimation are very slow.
Abstract:
Aims: Angiographic ectasias and aneurysms in stented segments have been associated with late stent thrombosis. Using optical coherence tomography (OCT), some stented segments show coronary evaginations reminiscent of ectasias. The purpose of this study was to explore, using computational fluid dynamics (CFD) simulations, whether OCT-detected coronary evaginations can induce local changes in blood flow. Methods and results: OCT-detected evaginations are defined as outward bulges in the luminal vessel contour between struts, with the depth of the bulge exceeding the actual strut thickness. Evaginations can be characterised cross-sectionally by depth and along the stented segment by total length. Assuming an ellipsoid shape, we modelled 3-D evaginations of different sizes by varying the depth from 0.2 to 1.0 mm and the length from 1 to 9 mm. For the flow simulation we used average flow velocity data from non-diseased coronary arteries. The change in flow with varying evagination sizes was assessed using a particle tracing test in which the particle transit time within the segment with the evagination was compared with that of a control vessel. The presence of the evagination caused a delayed particle transit time, which increased with evagination size. The change in flow consisted locally of recirculation within the evagination, as well as flow deceleration due to the larger lumen, seen as a deflection of flow towards the evagination. Conclusions: CFD simulation of 3-D evaginations and blood flow suggests that evaginations affect flow locally, with a flow disturbance that increases with increasing evagination size.
Abstract:
BACKGROUND Microvascular anastomosis is the cornerstone of free tissue transfers. Irrespective of the microsurgical technique one seeks to integrate or improve, the time commitment in the laboratory is significant. After extensive previous training on several animal models, we sought to identify an animal model that circumvents the following issues: ethical rules, cost, the time-consuming and expensive anesthesia and surgical preparation of tissues required to access vessels before microsurgical training can begin, and the fact that laboratories are closed on weekends. METHODS Between January 2012 and April 2012, a total of 91 earthworms were used for 150 microsurgical training exercises simulating vascular end-to-side microanastomosis. The training sessions were divided into ten periods of 7 days. Each training session included 15 simulations of end-to-side vascular microanastomoses: larger than 1.5 mm (n=5), between 1.0 and 1.5 mm (n=5), and smaller than 1.0 mm (n=5). A linear model with the number of weeks (as a numerical covariate) and the size of the animal (as a factor) as the main variables was used to determine the trend in anastomosis time over subsequent weeks as well as the differences between the size groups. RESULTS The linear model shows a significant trend (p<0.001) in anastomosis time over the course of the training, as well as significant differences (p<0.001) between the groups of animals of different sizes. For microanastomoses larger than 1.5 mm, the mean anastomosis time decreased from 19.3±1.0 to 11.1±0.4 min between the first and last week of training (a decrease of 42.5%). For training with smaller diameters, the results showed a decrease in execution time of 43.2% (diameter between 1.0 and 1.5 mm) and 40.9% (diameter<1.0 mm) between the first and last periods. The study demonstrates an improvement in dexterity and in the speed of knot execution. CONCLUSION The earthworm appears to be a reliable experimental model for microsurgical training of end-to-side microanastomoses. Its numerous advantages are discussed here, and we predict that training on earthworms will grow and develop significantly in the near future. LEVEL OF EVIDENCE III This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .
Abstract:
Software for use with patient records is challenging to design and difficult to evaluate because of the tremendous variability of patient circumstances. The authors devised a method to overcome a number of these difficulties. The method objectively evaluates and compares various software products for use in emergency departments, and also compares the software with conventional methods such as dictation and templated chart forms. The technique utilizes oral case simulation and video recording for analysis. The methodology and the experience of executing a study using this case simulation are discussed in this presentation.
Abstract:
Calcium levels in spines play a significant role in determining the sign and magnitude of synaptic plasticity. The magnitude of calcium influx into spines depends strongly on influx through N-methyl-D-aspartate (NMDA) receptors, and therefore on the number of postsynaptic NMDA receptors in each spine. We have previously calculated how the number of postsynaptic NMDA receptors determines the mean and variance of calcium transients in the postsynaptic density, and how this alters the shape of plasticity curves. However, the number of postsynaptic NMDA receptors in the postsynaptic density is not well known. Anatomical methods for estimating the number of NMDA receptors produce estimates that are very different from those produced by physiological techniques. The physiological techniques are based on the statistics of synaptic transmission, and their precision is difficult to estimate experimentally. In this paper we use stochastic simulations to test the validity of a physiological estimation technique based on failure analysis. We find that the method is likely to underestimate the number of postsynaptic NMDA receptors, we explain the source of the error, and we re-derive a more precise estimation technique. We also show that the original failure analysis, as well as our improved formulas, are not robust to small estimation errors in key parameters.
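As a toy illustration of the failure-analysis estimator under scrutiny (not the authors' simulation framework), the following Python sketch simulates trials with a known number of receptors and recovers an estimate from the observed failure fraction; the receptor count, per-receptor opening probability and number of trials are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_true = 10      # true number of postsynaptic receptors (illustrative)
p_open = 0.3     # per-receptor opening probability on a given trial (illustrative)
n_trials = 20_000

# Stochastic simulation: number of receptors that open on each trial.
openings = rng.binomial(N_true, p_open, size=n_trials)

# Failure analysis: a "failure" is a trial with no openings.
# P(failure) = (1 - p_open)**N, hence N_hat = log(P_fail) / log(1 - p_open).
p_fail = np.mean(openings == 0)
N_hat = np.log(p_fail) / np.log(1.0 - p_open)

print(f"true N = {N_true}, failure-analysis estimate = {N_hat:.2f}")
```

In this idealised binomial setting the estimator is consistent; the bias discussed in the abstract arises from the more realistic stochastic receptor and transmission models used there.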
Abstract:
BACKGROUND Electrochemical conversion of xenobiotics has been shown to mimic human phase I metabolism for a few compounds. MATERIALS & METHODS Twenty-one compounds were analyzed with a semiautomated electrochemical setup and mass spectrometry detection. RESULTS The system was able to mimic some metabolic pathways, such as oxygen gain, dealkylation and deiodination, but many of the expected and known metabolites were not produced. CONCLUSION Electrochemical conversion is a useful approach for the preparative synthesis of some types of metabolites, but as a screening method for unknown phase I metabolites, the method is, in our opinion, inferior to incubation with human liver microsomes and in vivo experiments with laboratory animals, for example.
Abstract:
BACKGROUND Efficiently performed basic life support (BLS) after cardiac arrest is proven to be effective. However, cardiopulmonary resuscitation (CPR) is strenuous and rescuers' performance declines rapidly over time. Audio-visual feedback devices reporting CPR quality may prevent this decline. We aimed to investigate the effect of various CPR feedback devices on CPR quality. METHODS In this open, prospective, randomised, controlled trial we compared three CPR feedback devices (PocketCPR, CPRmeter, iPhone app PocketCPR) with standard BLS without feedback in a simulated scenario. 240 trained medical students performed single-rescuer BLS on a manikin for 8 min. Effective compression (compressions with correct depth, pressure point and sufficient decompression) as well as compression rate, flow time fraction and ventilation parameters were compared between the four groups. RESULTS Study participants using the PocketCPR performed 17±19% effective compressions, compared to 32±28% with the CPRmeter, 25±27% with the iPhone app PocketCPR, and 35±30% applying standard BLS (PocketCPR vs. CPRmeter p=0.007, PocketCPR vs. standard BLS p=0.001, others: ns). PocketCPR and CPRmeter prevented a decline in effective compression over time, but overall performance in the PocketCPR group was considerably inferior to standard BLS. Compression depth and rate were within the guideline-recommended ranges in all groups. CONCLUSION While we found differences between the investigated CPR feedback devices, overall BLS quality was suboptimal in all groups. Surprisingly, effective compression was not improved by any CPR feedback device compared to standard BLS. All feedback devices caused a substantial delay in starting CPR, which may worsen outcome.
Abstract:
In this study, two commonly used automated methods for detecting atmospheric fronts in the lower troposphere are compared in various synoptic situations. The first method is a thermal approach relying on the gradient of equivalent potential temperature (TH), while the second is based on temporal changes in the 10 m wind (WND). For a comprehensive objective comparison of the outputs of these frontal identification methods, both schemes are first applied to an idealised strong baroclinic wave simulation in the absence of topography. Then, two case studies (one in the Northern Hemisphere (NH) and one in the Southern Hemisphere (SH)) are used to contrast the fronts detected by the two methods. Finally, we obtain global winter and summer frontal occurrence climatologies (derived from ERA-Interim for 1979–2012) and compare their structure. TH is able to identify cold and warm fronts in strong baroclinic cases that are in good agreement with manual analyses. WND is particularly suited to detecting strongly elongated, meridionally oriented moving fronts, but has very limited ability to identify zonally oriented warm fronts. We note that the areas of main TH frontal activity are shifted equatorwards compared with the WND patterns and are located upstream of the regions of main WND front activity. The number of WND fronts in the NH shows more interseasonal variation than TH fronts, decreasing by more than 50% from winter to summer. In the SH there is a weaker seasonal variation in the number of observed WND fronts, whereas TH front activity decreases from summer (DJF) to winter (JJA). The main motivation is to give an overview of the performance of these methods, so that researchers can choose the one appropriate to their particular interest.
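As a rough sketch of the thermal (TH) idea only, not the actual detection scheme used in the study, the following Python function flags grid points where the horizontal gradient of equivalent potential temperature exceeds a threshold; the regular grid layout, units, and the 4 K per 100 km threshold are illustrative assumptions.

```python
import numpy as np

def theta_e_gradient_mask(theta_e, dy_km, dx_km, threshold_k_per_100km=4.0):
    """Flag grid points where |grad(theta_e)| exceeds a frontal threshold.

    theta_e : 2-D array of equivalent potential temperature [K] on a regular grid
              (axis 0 = y/latitude direction, axis 1 = x/longitude direction).
    dy_km, dx_km : grid spacing in km along those axes.
    The threshold value is illustrative, not the scheme's actual setting.
    """
    dtheta_dy, dtheta_dx = np.gradient(theta_e, dy_km, dx_km)    # K per km
    grad_mag = np.hypot(dtheta_dx, dtheta_dy) * 100.0            # K per 100 km
    return grad_mag >= threshold_k_per_100km

# Example on a synthetic field with a sharp meridional theta_e contrast.
y, x = np.meshgrid(np.linspace(0, 2000, 101), np.linspace(0, 3000, 151), indexing="ij")
theta_e = 310.0 - 10.0 * np.tanh((y - 1000.0) / 50.0)
front_mask = theta_e_gradient_mask(theta_e, dy_km=20.0, dx_km=20.0)
print("flagged grid points:", int(front_mask.sum()))
```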
Abstract:
Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to accurate estimation of motion vector orientation and magnitude. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions, as planned by ESA’s Clean Space initiative, and for contingency scenarios involving ESA spacecraft like ENVISAT. ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the significant internal and external effects involved. To characterise the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool for short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impacts, as well as the optional definition of particular spacecraft-specific influences such as tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
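ιOTA's internals are not described at code level in this abstract; purely as a hedged illustration of the rotational core of such six-degree-of-freedom propagation, the short Python sketch below integrates Euler's rigid-body equations for torque-free tumbling. The inertia values, initial body rates and time span are illustrative assumptions, not ENVISAT or ιOTA parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative principal moments of inertia [kg m^2]; not values of any real spacecraft.
I = np.array([1200.0, 900.0, 400.0])

def euler_rhs(t, omega):
    """Euler's rigid-body equations for torque-free rotation:
    I * domega/dt = -omega x (I * omega)."""
    return -np.cross(omega, I * omega) / I

omega0 = np.array([0.02, 0.001, 0.005])   # initial body rates [rad/s] (illustrative)
sol = solve_ivp(euler_rhs, (0.0, 86400.0), omega0, max_step=10.0)  # propagate one day

print("final body rates [rad/s]:", sol.y[:, -1])
```

A full tool like ιOTA would add attitude kinematics, orbit propagation, and the environmental torques listed above on top of this rotational core.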
Abstract:
This study investigates a theoretical model in which a longitudinal process (a stationary Markov chain) and a Weibull survival process share a bivariate random effect. Furthermore, a quality-of-life adjusted survival is calculated as the weighted sum of survival time. Theoretical values of the population mean adjusted survival for the described model are computed numerically. The parameters of the bivariate random effect significantly affect these theoretical values. Maximum-likelihood and Bayesian methods are applied to simulated data to estimate the model parameters. Based on the parameter estimates, the predicted population mean adjusted survival can then be calculated numerically and compared with the theoretical values. The Bayesian and maximum-likelihood methods provide parameter estimates and population mean predictions of comparable accuracy; however, the Bayesian method suffers from poor convergence due to autocorrelation and inter-variable correlation.
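As a loose toy version of such a shared random-effect joint model (the covariance, Weibull parameters, utilities and the worsening-only transition structure are all illustrative assumptions, not the authors' specification), the following Python sketch draws a bivariate random effect shared by a Weibull survival time and a quality-of-life state chain, and accumulates utility-weighted (quality-adjusted) survival.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_subject(n_states=3, dt=0.1, utilities=(1.0, 0.6, 0.2)):
    """One subject from a toy shared-random-effect joint model.

    b = (b1, b2) is a bivariate normal random effect; b1 shifts the chain's
    tendency to move to a worse state, b2 shifts the Weibull log-scale of the
    survival time. Quality-adjusted survival is the utility-weighted time
    spent in each state up to death.
    """
    cov = np.array([[0.5, 0.3], [0.3, 0.5]])
    b1, b2 = rng.multivariate_normal(np.zeros(2), cov)

    # Weibull survival time with shape k and a subject-specific scale.
    k, base_scale = 1.5, 5.0
    surv_time = base_scale * np.exp(b2) * rng.weibull(k)

    # Per-step probability of moving to a worse state increases with b1.
    p_worse = 1.0 / (1.0 + np.exp(-(-1.5 + b1)))
    state, t, qas = 0, 0.0, 0.0
    while t < surv_time:
        step = min(dt, surv_time - t)
        qas += utilities[state] * step
        if state < n_states - 1 and rng.random() < p_worse:
            state += 1
        t += step
    return surv_time, qas

samples = [simulate_subject() for _ in range(5000)]
print("simulated mean quality-adjusted survival:",
      round(float(np.mean([q for _, q in samples])), 2))
```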
Abstract:
Detecting differential gene expression in microarray data has been a difficulty for many years. Several correction procedures aim to control the rate of false positives in the multiple comparison process, including the Bonferroni and Sidak single-step p-value adjustments, Holm's step-down correction method, and Benjamini and Hochberg's false discovery rate (FDR) correction procedure. Each multiple comparison technique has its advantages and weaknesses. We studied each multiple comparison method through numerical studies (simulations) and applied the methods to real exploratory DNA microarray data used to detect molecular signatures in papillary thyroid cancer (PTC) patients. According to the results of our simulation studies, the Benjamini and Hochberg step-up FDR-controlling procedure is the best of these multiple comparison methods, and we discovered 1277 potential biomarkers among 54675 probe sets after applying the Benjamini-Hochberg method to the PTC microarray data.
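A minimal sketch of the compared corrections on synthetic p-values (not the PTC data set), using the multipletests function from statsmodels, which implements the Bonferroni, Holm and Benjamini-Hochberg step-up procedures; the p-value mixture and the 0.05 level are illustrative assumptions.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(42)

# Toy p-values: 50 "true" signals among 1000 tests (illustrative, not microarray data).
p_null = rng.uniform(size=950)
p_signal = rng.beta(0.5, 20.0, size=50)      # concentrated near zero
pvals = np.concatenate([p_null, p_signal])

# Benjamini-Hochberg step-up FDR control at q = 0.05.
reject_bh, p_adj_bh, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

# For comparison: Bonferroni and Holm family-wise error rate corrections.
reject_bonf = multipletests(pvals, alpha=0.05, method="bonferroni")[0]
reject_holm = multipletests(pvals, alpha=0.05, method="holm")[0]

print("discoveries  BH:", reject_bh.sum(),
      " Bonferroni:", reject_bonf.sum(),
      " Holm:", reject_holm.sum())
```

The FDR procedure typically declares more discoveries than the family-wise corrections at the cost of allowing a controlled proportion of false positives, which is why it is favoured for exploratory biomarker screening.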
Abstract:
Objectives. This paper seeks to assess the effect of regression model misspecification on statistical power in a variety of situations. Methods and results. The effect of misspecification in regression can be approximated by evaluating the correlation between the correct specification and the misspecification of the outcome variable (Harris 2010). In this paper, three misspecified models (linear, categorical and fractional polynomial) were considered. In the first section, the mathematical method of calculating the correlation between correct and misspecified models with simple mathematical forms was derived and demonstrated. In the second section, data from the National Health and Nutrition Examination Survey (NHANES 2007-2008) were used to examine such correlations. Our study shows that, compared with linear or categorical models, the fractional polynomial models, with their higher correlations, provided a better approximation of the true relationship, as illustrated by LOESS regression. In the third section, we present the results of simulation studies demonstrating that misspecification in regression can produce marked decreases in power with small sample sizes. However, the categorical model had the greatest power, ranging from 0.877 to 0.936 depending on the sample size and outcome variable used. The power of the fractional polynomial model was close to that of the linear model, ranging from 0.69 to 0.83, and appeared to be affected by this model's increased degrees of freedom. Conclusion. Correlations between alternative model specifications can be used to provide a good approximation of the effect of misspecification on statistical power when the sample size is large. When model specifications have known simple mathematical forms, such correlations can be calculated mathematically. Actual public health data from NHANES 2007-2008 were used as examples to demonstrate situations in which the correct model specification is unknown or complex. Simulation of power for misspecified models confirmed the results based on the correlation methods, but also illustrated the effect of model degrees of freedom on power.
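As a small Monte Carlo sketch of the simulation idea in the third section, assuming a quadratic true relationship and illustrative sample sizes (not the NHANES data or the authors' exact models), the following Python code estimates power as the rejection rate of the overall F-test under a linear and a categorical misspecification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

def power_by_specification(n=100, n_sims=500, alpha=0.05):
    """Monte Carlo power for two misspecified models of a quadratic truth (toy setup)."""
    hits = {"linear": 0, "categorical": 0}
    for _ in range(n_sims):
        x = rng.uniform(0, 10, n)
        y = 0.05 * x**2 + rng.normal(scale=1.0, size=n)   # true relationship is quadratic
        df = pd.DataFrame({"y": y, "x": x, "xcat": pd.qcut(x, 4, labels=False)})
        # Linear misspecification: overall F-test of the single slope.
        if smf.ols("y ~ x", data=df).fit().f_pvalue < alpha:
            hits["linear"] += 1
        # Categorical misspecification: F-test of the quartile factor.
        if smf.ols("y ~ C(xcat)", data=df).fit().f_pvalue < alpha:
            hits["categorical"] += 1
    return {k: v / n_sims for k, v in hits.items()}

print(power_by_specification())
```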
Abstract:
Li-Fraumeni syndrome (LFS) is characterized by a variety of neoplasms occurring at a young age with an apparent autosomal dominant transmission. Individuals in pedigrees with LFS have a high incidence of second malignancies. Recently, LFS has been found to be associated with germline mutations of a tumor-suppressor gene, p53. Because LFS is rare and indeed not a clear-cut disease, it is not known whether all cases of LFS are attributable to p53 germline mutations or what role p53 plays in cancer occurrence in such cancer syndrome families. In the present study, DNAs from constitutive cells of 233 family members from ten extended pedigrees were screened for p53 mutations. Six of the ten LFS families had germline mutations at the p53 locus, including point and deletion mutations. In these six families, 55 out of 146 members were carriers of p53 mutations. Except for one, all mutations occurred in exons 5 to 8 (i.e., the "hot spot" region) of the p53 gene. The age-specific penetrance of cancer was estimated after the genotype of each family member at risk was determined. The penetrance was 0.15, 0.29, 0.35, 0.77, and 0.91 by ages 20, 30, 40, 50 and 60, respectively, in male carriers, and 0.19, 0.44, 0.76, and 0.90 by ages 20, 30, 40, and 50, respectively, in female carriers. These results indicate that one cannot escape tumorigenesis if one inherits a p53 mutant allele; at least ninety percent of p53 carriers will develop cancer by the age of 60. To evaluate the possible bias due to unexamined blood relatives in LFS families, I performed a simulation analysis in which a p53 genotype was assigned to each unexamined person based on his or her cancer status and liability to cancer. The results showed that the penetrance estimates were not biased by the unexamined relatives. I also determined the sex-, site-, and age-specific penetrance of breast cancer in female carriers and lung cancer in male carriers. The penetrance of breast cancer in female carriers was 0.81 by age 45; the penetrance of lung cancer in male carriers was 0.78 by age 60, indicating that p53 plays a key role in tumorigenesis in common cancers.