983 results for Simulation Testing
Abstract:
Our new simple method for calculating accurate Franck-Condon factors including nondiagonal (i.e., mode-mode) anharmonic coupling is used to simulate the C2H4+ X̃ 2B3u ← C2H4 X̃ 1Ag band in the photoelectron spectrum. An improved vibrational basis-set truncation algorithm, which permits very efficient computations, is employed. Because the torsional mode is highly anharmonic, it is separated from the other modes and treated exactly. All other modes are treated through second-order perturbation theory. The perturbation-theory corrections are significant and lead to good agreement with experiment, although the separability assumption for torsion causes the C2D4 results to be not as good as those for C2H4. A variational formulation to overcome this limitation, and to deal with large anharmonicities in general, is suggested.
Abstract:
Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University's Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.
Abstract:
This study was conducted to assess if fingerprint specialists could be influenced by extraneous contextual information during a verification process. Participants were separated into three groups: a control group (no contextual information was given), a low bias group (minimal contextual information was given in the form of a report prompting conclusions), and a high bias group (an internationally recognized fingerprint expert provided conclusions and case information to deceive this group into believing that it was his case and conclusions). A similar experiment was later conducted with laypersons. The results showed that fingerprint experts were influenced by contextual information during fingerprint comparisons, but not towards making errors. Instead, fingerprint experts under the biasing conditions provided significantly fewer definitive and erroneous conclusions than the control group. In contrast, the novice participants were more influenced by the bias conditions and did tend to make incorrect judgments, especially when prompted towards an incorrect response by the bias prompt.
Abstract:
British mammalogists have used two different systems for surveying the common dormouse Muscardinus avellanarius: a modified bird nest box with the entrance facing the tree trunk, and a smaller, cheaper model called a "nest tube". However, few data comparing different nest box systems are currently available. To determine which system is more efficient, we compared the use of large (GB-type) and small nest boxes (DE-type, a commercial wooden mouse trap without a door) in three Swiss forests. The presence of Muscardinus, potential competitors, and any evidence of occupation were examined in 60 pairs of nest boxes based on 2,280 nest box checks conducted over 5 years. Mean annual occupation and cumulative numbers of Muscardinus present were both significantly higher for the DE than for the GB boxes (64.6% versus 32.1%, and 149 versus 67 dormice, respectively). In contrast, the annual occupation by competitors including Glis glis, Apodemus spp. and hole-nesting birds was significantly higher in the GB than in the DE boxes in all forests (19-68% versus 0-16%, depending on the species and forest). These results suggest that smaller nest boxes are preferred by the common dormouse and are rarely occupied by competitors. These boxes hence appear to be preferable for studying Muscardinus populations.
Abstract:
Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio - defined as the costs per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included public prices of the different tests considered as outpatient procedures, complication costs, and costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £2'685 and $2'126 per patient correctly diagnosed.
Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
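The effectiveness criterion above (costs per patient correctly diagnosed) can be sketched in a few lines. The simplification of effectiveness as pretest likelihood times test-chain sensitivity, and all the numbers, are illustrative assumptions, not figures from the study.

```python
def ce_ratio(cost_per_patient, pretest_likelihood, sensitivity):
    """Costs per patient correctly diagnosed.

    Effectiveness is simplified (hypothetically) to the probability that
    a patient with significant CAD is correctly identified:
    pretest likelihood x test-chain sensitivity.
    """
    return cost_per_patient / (pretest_likelihood * sensitivity)

# Illustrative numbers only: the ratio falls hyperbolically as the
# pretest likelihood of CAD rises, as the abstract reports.
low = ce_ratio(1500.0, 0.20, 0.90)   # low-prevalence cohort
high = ce_ratio(1500.0, 0.80, 0.90)  # high-prevalence cohort
```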
Abstract:
An active strain formulation for orthotropic constitutive laws arising in cardiac mechanics modeling is introduced and studied. The passive mechanical properties of the tissue are described by the Holzapfel-Ogden relation. In the active strain formulation, the Euler-Lagrange equations for minimizing the total energy are written in terms of active and passive deformation factors, where the active part is assumed to depend, at the cell level, on the electrodynamics and on the specific orientation of the cardiac cells. The well-posedness of the linear system derived from a generic Newton iteration of the original problem is analyzed and different mechanical activation functions are considered. In addition, the active strain formulation is compared with the classical active stress formulation from both numerical and modeling perspectives. Taylor-Hood and MINI finite elements are employed to discretize the mechanical problem. The results of several numerical experiments show that the proposed formulation is mathematically consistent and is able to represent the key features of the phenomenon, while allowing savings in computational costs.
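In the active strain approach, the split into active and passive deformation factors is usually expressed as a multiplicative decomposition of the deformation gradient. The notation below is the form commonly used in the active strain literature and may differ from this paper's own symbols.

```latex
% F_A is the active (electrically driven) part; F_E is the elastic part
% that stores the passive (here Holzapfel-Ogden) strain energy \Psi.
F = F_E \, F_A, \qquad F_E = F F_A^{-1}, \qquad
\Psi(F) = \Psi_{\mathrm{passive}}\!\left(F F_A^{-1}\right)
```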
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
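A minimal Poisson τ-leap step (the baseline method this paper extends, not the RK variant itself) can be sketched as follows. The reversible isomerization system and all rate constants are hypothetical examples.

```python
import numpy as np

def tau_leap(x0, rates, stoich, propensity_fn, tau, n_steps, seed=0):
    """Baseline Poisson tau-leap: each channel j fires k_j ~ Poisson(a_j(x)*tau)
    times within every step of fixed size tau (no RK extension here)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        a = propensity_fn(x, rates)          # channel propensities a_j(x)
        k = rng.poisson(a * tau)             # number of firings per channel
        x = x + stoich.T @ k                 # apply net stoichiometric change
    return x

# Hypothetical reversible isomerization A <-> B with mass-action kinetics.
stoich = np.array([[-1,  1],                 # A -> B
                   [ 1, -1]])                # B -> A
def propensities(x, rates):
    return np.array([rates[0] * x[0], rates[1] * x[1]])

x_final = tau_leap([1000.0, 0.0], [1.0, 0.5], stoich, propensities,
                   tau=0.01, n_steps=2000)   # long enough to equilibrate
```

Each reaction moves exactly one molecule from A to B or back, so the total population is conserved; at equilibrium the ratio B/A approaches the rate ratio k1/k2.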
Abstract:
Current monitoring techniques for determination of compaction of earthwork and asphalt generally involve destructive testing of the materials following placement. Advances in sensor technologies show significant promise for obtaining necessary information through nondestructive and remote techniques. To develop a better understanding of suitable and potential technologies, this study was undertaken to conduct a synthesis review of nondestructive testing technologies and perform preliminary evaluations of selected technologies to better understand their application to testing of geomaterials (soil fill, aggregate base, asphalt, etc.). This research resulted in a synthesis of potential technologies for compaction monitoring with a strong emphasis on moisture sensing. Techniques were reviewed and selectively evaluated for their potential to improve field quality control operations. Activities included an extensive review of commercially available moisture sensors, literature review, and evaluation of selected technologies. The technologies investigated in this study were dielectric, nuclear, near infrared spectroscopy, seismic, electromagnetic induction, and thermal. The primary disadvantage of all the methods is the small sample volume measured. In addition, all the methods possessed some sensitivity to non-moisture factors that affected the accuracy of the results. As the measurement volume increases, local variances are averaged out providing better accuracy. Most dielectric methods with the exception of ground penetrating radar have a very small measurement volume and are highly sensitive to variations in density, porosity, etc.
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast amount of protein sequences, while the number of high-resolution structures has seen a limited increase. This has impelled the production of many strategies to build protein structures from their sequences, generating a considerable amount of alternative models. The selection of the closest model to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, knowledge-based potentials, and combinations of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores to predict near-native structures. Our strategy allows circumventing the problem of defining the reference state. In this approach we give the proof for a simple and linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score, we obtained predictions similar to state-of-the-art methods. Besides, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores have detected near-native structures as accurately as state-of-the-art methods and have been successful in identifying wrongly modeled regions of many near-native conformations.
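The idea of turning heterogeneous scoring terms into comparable Z-scores and combining them linearly can be sketched as below. The term names, the equal weights, and the plain weighted sum are illustrative assumptions, not the paper's actual potential splitting.

```python
import statistics

def zscores(values):
    """Z-score of each raw score relative to the whole set of candidate models."""
    mu = statistics.fmean(values)
    sd = statistics.pstdev(values) or 1.0   # guard against a constant term
    return [(v - mu) / sd for v in values]

def composite_zscore(term_scores, weights=None):
    """Linearly combine per-term Z-scores into one composite score per model.

    term_scores: dict mapping term name -> list of raw scores, one per model.
    """
    terms = list(term_scores)
    weights = weights or {t: 1.0 for t in terms}
    per_term = {t: zscores(term_scores[t]) for t in terms}
    n_models = len(next(iter(term_scores.values())))
    return [sum(weights[t] * per_term[t][i] for t in terms)
            for i in range(n_models)]

# Hypothetical terms for three candidate models of one protein.
scores = {"pairwise": [1.0, 2.0, 3.0], "solvation": [3.0, 2.0, 1.0]}
comp = composite_zscore(scores)
```

Because each term is standardized before combination, terms measured on very different raw scales contribute comparably, which is the practical point of working with Z-scores.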
Abstract:
Microsatellite instability (MSI) testing in clinics is becoming increasingly widespread; therefore, there is an urgent need for methodology standardization and the availability of quality control. This study aimed to assess the interlaboratory reproducibility of MSI testing in archival samples by using a panel of 5 recently introduced mononucleotide repeats (MNR). The quality control involved 8 European institutions. Participants were supplied with DNA extracted from 15 archival colon carcinoma samples and from the corresponding normal tissues. Every group was asked to assess the MSI status of the samples by using the BAT25, BAT26, NR21, NR24, and NR27 mononucleotide markers. Four institutions repeated the analysis using the NCI reference panel to confirm the results obtained with the MNR markers. The overall concordance among institutions for MSI analyses at the single-locus level was 97.7% when using the MNR panel and 95.0% with the NCI one. The laboratories obtained full agreement in scoring the MSI status of each patient sample, both with the mononucleotide and with the NCI marker sets. With the NCI marker set, however, concordance was lowered to 85.7% when considering the MSI-Low phenotype. Concordance between the 2 panels in scoring the MSI status of each sample was complete if no discrimination was made between MSI-Stable and MSI-L, whereas it dropped to 76.7% if MSI-L was considered. In conclusion, the use of the MNR panel seems to be a robust approach that yields a very high level of reproducibility. The results obtained with the 5 MNR are diagnostically consistent with those obtained by the use of the NCI markers, except for the MSI-Low phenotype.
Abstract:
Imatinib is the standard of care for patients with advanced metastatic gastrointestinal stromal tumors (GIST), and is also approved for adjuvant treatment in patients at substantial risk of relapse. Studies have shown that maximizing benefit from imatinib depends on long-term administration at recommended doses. Pharmacokinetic (PK) and pharmacodynamic factors, adherence, and drug-drug interactions can affect exposure to imatinib and impact clinical outcomes. This article reviews the relevance of these factors to imatinib's clinical activity and response in the context of what has been demonstrated in chronic myelogenous leukemia (CML), and in light of new data correlating imatinib exposure to response in patients with GIST. Because of the wide inter-patient variability in drug exposure with imatinib in both CML and GIST, blood level testing (BLT) may play a role in investigating instances of suboptimal response, unusually severe toxicities, drug-drug interactions, and suspected non-adherence. Published clinical data in CML and in GIST were considered, including data from a PK substudy of the B2222 trial correlating imatinib blood levels with clinical responses in patients with GIST. Imatinib trough plasma levels <1100 ng/mL were associated with lower rates of objective response and faster development of progressive disease in patients with GIST. These findings have been supported by other analyses correlating free (unbound) imatinib levels with response. These results suggest a future application for imatinib BLT in predicting and optimizing therapeutic response. Nevertheless, early estimates of threshold imatinib blood levels must be confirmed prospectively in future studies and elaborated for different patient subgroups.
Abstract:
Risks of significant infant drug exposure through human milk are poorly defined due to lack of large-scale PK data. We propose to use a Bayesian approach based on population PK (popPK)-guided modeling and simulation for risk prediction. As a proof-of-principle study, we exploited fluoxetine milk concentration data from 25 women. popPK parameters including the milk-to-plasma ratio (MP ratio) were estimated from the best model. The dose of fluoxetine the breastfed infant would receive through mother's milk, and infant plasma concentrations, were estimated from 1000 simulated mother-infant pairs, using random assignment of feeding times and milk volume. A conservative estimate of CYP2D6 activity of 20% of the allometrically-adjusted adult value was assumed. Derived model parameters, including the MP ratio, were consistent with those reported in the literature. Visual predictive check and other model diagnostics showed no signs of model misspecification. The model simulation predicted that infant exposure levels to fluoxetine via mother's milk were below 10% of weight-adjusted maternal therapeutic doses in >99% of simulated infants. The predicted median ratio of infant-mother serum levels at steady state was 0.093 (range 0.033-0.31), consistent with literature-reported values (mean=0.07; range 0-0.59). The predicted incidence of a relatively high infant-mother ratio (>0.2) of steady-state serum fluoxetine concentrations was <1.3%. Overall, our predictions are consistent with clinical observations. Our approach may be valid for other drugs, allowing in silico prediction of infant drug exposure risks through human milk. We will discuss application of this approach to another drug used in lactating women.
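A toy Monte Carlo version of this kind of exposure calculation reads as follows. The dosing numbers, the lognormal variability model, and the 0.15 L/kg/day milk intake are illustrative assumptions, not the study's fitted popPK parameters.

```python
import math
import random

def simulate_relative_infant_dose(n, maternal_dose_mg_per_kg,
                                  mean_milk_conc_mg_per_l, cv,
                                  milk_intake_l_per_kg_day=0.15, seed=1):
    """Relative infant dose (RID) = (milk concentration x daily milk intake
    per kg infant) / maternal weight-adjusted dose. Milk concentrations are
    drawn from a lognormal with the given mean and coefficient of variation."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv ** 2))
    mu = math.log(mean_milk_conc_mg_per_l) - sigma ** 2 / 2.0
    rids = []
    for _ in range(n):
        conc = rng.lognormvariate(mu, sigma)            # mg/L, one "infant"
        infant_dose = conc * milk_intake_l_per_kg_day   # mg/kg/day via milk
        rids.append(infant_dose / maternal_dose_mg_per_kg)
    return rids

# Hypothetical inputs: 0.3 mg/kg maternal dose, 0.1 mg/L mean milk level, 30% CV.
rids = simulate_relative_infant_dose(1000, 0.3, 0.1, 0.3)
```

With these (made-up) inputs the mean RID is about 5%, so the large majority of simulated infants fall below the conventional 10% relative-dose threshold, mirroring the shape of the result reported above.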
Abstract:
The McIsaac scoring system is a tool designed to predict the probability of streptococcal pharyngitis in children aged 3 to 17 years with a sore throat. Although it does not allow the physician to make the diagnosis of streptococcal pharyngitis, it makes it possible to identify those children with a sore throat in whom rapid antigen detection tests have a good predictive value.
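The score itself is a simple point count. The items below are the commonly published McIsaac (modified Centor) criteria, reproduced for illustration only and not as clinical guidance.

```python
def mcisaac_score(temp_over_38, no_cough, tender_anterior_nodes,
                  tonsillar_swelling_or_exudate, age):
    """McIsaac (modified Centor) score as commonly published:
    one point per clinical item, plus an age modifier
    (+1 for ages 3-14, 0 for 15-44, -1 for 45 and over)."""
    score = sum([bool(temp_over_38), bool(no_cough),
                 bool(tender_anterior_nodes),
                 bool(tonsillar_swelling_or_exudate)])
    if 3 <= age <= 14:
        score += 1
    elif age >= 45:
        score -= 1
    return score
```

Higher totals correspond to a higher pretest probability of streptococcal pharyngitis, which is what makes a rapid antigen test informative in the mid-range scores.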