883 results for Simulation Based Method


Relevance: 90.00%

Abstract:

Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
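
To make the semi-automatic construction concrete, here is a minimal sketch of the idea on a toy model where the data are n draws from N(theta, 1): a pilot regression of the parameter on simple data features approximates the posterior mean, which then serves as the summary statistic in rejection ABC. The model, features, and tolerance below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs = 50
y_obs = rng.normal(2.0, 1.0, n_obs)      # "observed" data, true theta = 2

def simulate(theta):
    return rng.normal(theta, 1.0, n_obs)

def features(y):                          # raw data features for the regression
    return np.array([y.mean(), np.median(y), y.std()])

# Pilot stage: linear regression of theta on features approximates E[theta | data].
pilot_theta = rng.uniform(-5, 5, 2000)
X = np.array([features(simulate(t)) for t in pilot_theta])
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, pilot_theta, rcond=None)

def summary(y):                           # learned summary statistic
    return np.concatenate([[1.0], features(y)]) @ beta

# Rejection ABC: accept parameters whose simulated summary is closest to observed.
cand = rng.uniform(-5, 5, 20000)
dist = np.abs([summary(simulate(t)) - summary(y_obs) for t in cand])
posterior = cand[dist < np.quantile(dist, 0.01)]
print(posterior.mean(), posterior.std())
```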

Relevance: 90.00%

Abstract:

Background: The validity of ensemble averaging of event-related potential (ERP) data has been questioned, due to its assumption that the ERP is identical across trials. Thus, there is a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments: a P300 speller paradigm on a single subject and a language processing study on 25 subjects. Results revealed evidence for the existence of 6 clusters in one experimental condition of the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
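
As a rough illustration of the bootstrap-based step, the sketch below scores candidate cluster counts by the agreement of k-means labellings across bootstrap resamples; the paper's exact Stability Index, EMD denoising, and GA centroid initialization are not reproduced, and all names and settings here are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def stability_score(X, k, n_boot=20, seed=0):
    """Mean pairwise agreement of k-means labels over bootstrap resamples."""
    rng = np.random.default_rng(seed)
    labels = []
    for _ in range(n_boot):
        idx = rng.choice(len(X), len(X), replace=True)
        km = KMeans(n_clusters=k, n_init=10).fit(X[idx])
        labels.append(km.predict(X))          # label the full data set
    return float(np.mean([adjusted_rand_score(a, b)
                          for i, a in enumerate(labels)
                          for b in labels[i + 1:]]))

X = np.random.default_rng(1).normal(size=(300, 40))   # placeholder single trials
best_k = max(range(2, 8), key=lambda k: stability_score(X, k))
```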

Relevance: 90.00%

Abstract:

The use of kilometre-scale ensembles in operational forecasting provides new challenges for forecast interpretation and evaluation, to account for uncertainty on the convective scale. A new neighbourhood-based method is presented for evaluating and characterising the local predictability variations from convective-scale ensembles. Spatial scales over which ensemble forecasts agree (agreement scales, S^A) are calculated at each grid point ij, providing a map of the spatial agreement between forecasts. By comparing the average agreement scale obtained from ensemble member pairs, S^A(mm)_ij, with that between members and radar observations, S^A(mo)_ij, this approach allows the location-dependent spatial spread-skill relationship of the ensemble to be assessed. The properties of the agreement scales are demonstrated using an idealised experiment. To demonstrate the methods in an operational context, S^A(mm)_ij and S^A(mo)_ij are calculated for six convective cases run with the Met Office UK Ensemble Prediction System. The S^A(mm)_ij highlight predictability differences between cases, which can be linked to physical processes. Maps of S^A(mm)_ij are found to summarise the spatial predictability in a compact and physically meaningful manner that is useful for forecasting and for model interpretation. Comparison of S^A(mm)_ij and S^A(mo)_ij demonstrates the case-by-case and temporal variability of the spatial spread-skill, which can again be linked to physical processes.
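
The sketch below conveys the flavour of an agreement-scale computation for a pair of forecast fields: at each grid point, the neighbourhood is grown until the neighbourhood means of the two fields agree within a tolerance. The published agreement criterion differs in detail; the tolerance and scale limit here are illustrative assumptions.

```python
import numpy as np

def agreement_scale(f1, f2, s_max=20, tol=0.2):
    """Smallest half-width s at which neighbourhood means of f1, f2 agree."""
    ny, nx = f1.shape
    sa = np.full((ny, nx), s_max)
    for j in range(ny):
        for i in range(nx):
            for s in range(s_max + 1):
                w1 = f1[max(j - s, 0):j + s + 1, max(i - s, 0):i + s + 1].mean()
                w2 = f2[max(j - s, 0):j + s + 1, max(i - s, 0):i + s + 1].mean()
                denom = w1 + w2
                if denom == 0 or abs(w1 - w2) / denom <= tol:
                    sa[j, i] = s          # smallest scale with agreement
                    break
    return sa
```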

Relevance: 90.00%

Abstract:

Conventional procedures employed in modeling the viscoelastic properties of polymers rely on determining the polymer's discrete relaxation spectrum from experimentally obtained data. In the past decades, several analytical regression techniques have been proposed to determine an explicit equation which describes the measured spectra. Taking a different approach, the procedure introduced here constitutes a simulation-based computational optimization technique built on a non-deterministic search method arising from the field of evolutionary computation. Instead of comparing numerical results, the purpose of this paper is to highlight some subtle differences between both strategies and to focus on which properties of the exploited technique emerge as new possibilities for the field. To illustrate this, the cases examined show how the employed technique can outperform conventional approaches in terms of fitting quality. Moreover, in some instances it produces equivalent results with far fewer fitting parameters, which is convenient for computational simulation applications. The problem formulation and the rationale of the highlighted method are discussed herein and constitute the main intended contribution. (C) 2009 Wiley Periodicals, Inc. J Appl Polym Sci 113: 122-135, 2009
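
The flavour of such a simulation-based evolutionary fit can be sketched as follows: a two-mode discrete relaxation spectrum (Prony series), G(t) = sum_i g_i exp(-t/tau_i), is fitted to synthetic data with SciPy's differential evolution optimizer. The data, bounds, and mode count are placeholder assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import differential_evolution

t = np.logspace(-2, 2, 60)
G_meas = 3.0 * np.exp(-t / 0.1) + 1.0 * np.exp(-t / 10.0)   # synthetic "data"

def residual(p):
    g1, tau1, g2, tau2 = p
    model = g1 * np.exp(-t / tau1) + g2 * np.exp(-t / tau2)
    return np.sum((model - G_meas) ** 2)

bounds = [(0, 10), (1e-3, 1e2)] * 2          # bounds for each g_i and tau_i
res = differential_evolution(residual, bounds, seed=1)
print(res.x, res.fun)                        # recovered spectrum and misfit
```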

Relevance: 90.00%

Abstract:

Inhibition of microtubule function is an attractive rational approach to anticancer therapy. Although taxanes are the most prominent among the microtubule stabilizers, their clinical toxicity, poor pharmacokinetic properties, and resistance have stimulated the search for new antitumor agents with the same mechanism of action. Discodermolide is an example of a non-taxane natural product with that mechanism of action, demonstrating superior antitumor efficacy and therapeutic index. These extraordinary chemical and biological properties have qualified discodermolide as a lead structure for the design of novel anticancer agents with optimized therapeutic properties. In the present work, we have employed a specialized fragment-based method to develop robust quantitative structure-activity relationship models for a series of synthetic discodermolide analogs. The generated molecular recognition patterns were combined with three-dimensional molecular modeling studies as a fundamental step on the path to understanding the molecular basis of drug-receptor interactions within this important series of potent antitumoral agents.
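
For illustration only, the sketch below builds a generic fragment-based QSAR from Morgan fragment fingerprints and PLS regression; HQSAR proper uses molecular holograms rather than these fingerprints, and the molecules and activities shown are invented placeholders.

```python
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.cross_decomposition import PLSRegression

smiles = ["CCO", "CCC", "CCN", "CCCl"]        # placeholder analogs
activity = np.array([5.1, 4.3, 5.8, 4.0])     # placeholder pIC50 values

def fragments(smi, n_bits=512):
    """Fragment-occurrence fingerprint as a numpy vector."""
    mol = Chem.MolFromSmiles(smi)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)
    arr = np.zeros(n_bits)
    DataStructs.ConvertToNumpyArray(fp, arr)
    return arr

X = np.array([fragments(s) for s in smiles])
model = PLSRegression(n_components=2).fit(X, activity)
print(model.predict(X).ravel())
```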

Relevance: 90.00%

Abstract:

A myriad of methods are available for the virtual screening of small organic compound databases. In this study we have successfully applied a quantitative model of consensus measurements, using a combination of 3D similarity searches (ROCS and EON), Hologram Quantitative Structure-Activity Relationships (HQSAR) and docking (FRED, FlexX, Glide and AutoDock Vina), to retrieve cruzain inhibitors from collected databases. All methods were assessed individually and then combined in Ligand-Based Virtual Screening (LBVS) and Target-Based Virtual Screening (TBVS) consensus scoring, using Receiver Operating Characteristic (ROC) curves to evaluate their performance. Three consensus strategies were used: scaled-rank-by-number, rank-by-rank and rank-by-vote, of which scaled-rank-by-number performed best, its steep early ROC curve indicating higher enrichment power in the early retrieval of active compounds from the database. The ligand-based methods provided a robust and predictive HQSAR model that showed superior discrimination between active and inactive compounds, better than the ROCS and EON procedures. Overall, the integration of fast computational techniques based on ligand and target structures resulted in a more efficient retrieval of cruzain inhibitors with desired pharmacological profiles, which may be useful to advance the discovery of new trypanocidal agents.
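
A minimal sketch of the scaled-rank-by-number strategy, assuming synthetic scores and labels: each method's scores are min-max scaled, averaged into a consensus score, and evaluated with ROC analysis.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
is_active = rng.random(n) < 0.1                  # placeholder active/decoy labels
# Placeholder score matrix: one column per method (docking, 3D similarity, ...),
# shifted upward for actives so the example has signal.
scores = rng.normal(size=(n, 4)) + 1.5 * is_active[:, None]

scaled = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
consensus = scaled.mean(axis=1)                  # scaled-rank-by-number
print("consensus AUC:", roc_auc_score(is_active, consensus))
```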

Relevance: 90.00%

Abstract:

In this work we have elaborated a spline-based method for solving initial value problems involving ordinary differential equations, with emphasis on linear equations. The method can be seen as an alternative to traditional solvers such as Runge-Kutta, and avoids root calculations in the linear time-invariant case. The method is then applied to a central problem of control theory, namely, the step response problem for linear ODEs with possibly varying coefficients, where root calculations do not apply. We have implemented an efficient algorithm which uses exclusively matrix-vector operations. The working interval (up to the settling time) was determined through a calculation of the least stable mode using a modified power method. Several variants of the method have been compared by simulation. For general linear problems with a fine grid, the proposed method compares favorably with the Euler method. In the time-invariant case, where the alternative is root calculation, we have indications that the proposed method is competitive for equations of sufficiently high order.
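
One plausible reading of the least-stable-mode calculation, assuming a stable system with real negative eigenvalues, is an inverse power iteration that isolates the eigenvalue of the system matrix nearest zero; the slowest mode then bounds the settling time. The paper's modified power method may differ in detail.

```python
import numpy as np

def least_stable_mode(A, iters=200, seed=0):
    """Eigenvalue of A with smallest magnitude, via inverse power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])
    for _ in range(iters):
        v = np.linalg.solve(A, v)       # amplifies the mode of smallest |lambda|
        v /= np.linalg.norm(v)
    return v @ A @ v                    # Rayleigh quotient at the eigenvector

A = np.array([[-0.2, 1.0], [0.0, -5.0]])   # eigenvalues -0.2 and -5
lam = least_stable_mode(A)
print(lam, "settling time ~", 4 / abs(lam))
```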

Relevance: 90.00%

Abstract:

The stretch zone width (SZW) data for 15-5PH steel CTOD specimens fractured at -150 degrees C to +23 degrees C were measured based on focused images and 3D maps obtained by extended depth-of-field reconstruction from light microscopy (LM) image stacks. This LM-based method, with a larger lateral resolution, seems to be as effective for quantitative analysis of SZW as scanning electron microscopy (SEM) or confocal scanning laser microscopy (CSLM), permitting clear identification of stretch zone boundaries. Despite the poorer sharpness of the focused images, a robust linear correlation was established between fracture toughness (KC) and the SZW data for the 15-5PH steel specimens, measured at their center region. The method is an alternative for evaluating the boundaries of stretched zones at a lower cost of implementation and training, since the topographic data from elevation maps can be associated with the reconstructed image, which preserves the original contrast and brightness information. Finally, the extended depth-of-field method is presented here as a valuable tool for failure analysis and a cheaper alternative for investigating rough or fracture surfaces, compared with scanning electron or confocal light microscopes. Microsc. Res. Tech. 75:1155-1158, 2012. (C) 2012 Wiley Periodicals, Inc.
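
A hedged sketch of the extended depth-of-field reconstruction step, assuming OpenCV is available: for each pixel, the slice with the highest local focus measure is kept for the all-in-focus image, and its slice index forms the elevation map. The actual reconstruction algorithm used in the study may differ.

```python
import numpy as np
import cv2

def edf_reconstruct(stack):
    """stack: (n_slices, H, W) array of grayscale float images from the LM z-stack."""
    # Per-slice local focus measure: absolute Laplacian response.
    focus = np.stack([np.abs(cv2.Laplacian(img, cv2.CV_64F, ksize=5))
                      for img in stack])
    best = focus.argmax(axis=0)                    # elevation map (slice index)
    fused = np.take_along_axis(stack, best[None, :, :], axis=0)[0]
    return fused, best                             # all-in-focus image, 3D map
```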

Relevance: 90.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 90.00%

Abstract:

The electromechanical impedance (EMI) technique has been successfully used in structural health monitoring (SHM) systems on a wide variety of structures. The basic concept of this technique is to monitor structural integrity by exciting and sensing a piezoelectric transducer, usually a lead zirconate titanate (PZT) wafer bonded to the structure to be monitored and excited in a suitable frequency range. Because of the piezoelectric effect, there is a relationship between the mechanical impedance of the host structure, which is directly related to its integrity, and the electrical impedance of the PZT transducer, obtained as a ratio between the excitation and the sensing signals. This work presents a study on damage (leak) detection using the EMI-based method. Tests were carried out on a water rig built in a hydraulics laboratory, for different leak conditions in a metallic pipeline. The influence of the position at which the PZT is bonded to the pipeline was also evaluated. The results show that leaks can effectively be detected using common damage-detection metrics such as RMSD and CCDM. Further, the position of the PZT on the pipe was found to be an important variable that has to be controlled.
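
For reference, one common form of the RMSD metric used with EMI signatures compares the measured impedance against a baseline (healthy) signature over the monitored frequency band; the exact variant used in the study may differ.

```python
import numpy as np

def rmsd(z_baseline, z_measured):
    """Root-mean-square deviation between impedance signatures (real parts)."""
    z0 = np.real(z_baseline)
    z1 = np.real(z_measured)
    return np.sqrt(np.sum((z1 - z0) ** 2) / np.sum(z0 ** 2))

# A metric value above a calibrated threshold flags a possible leak.
```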

Relevance: 90.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 90.00%

Abstract:

Although measurement errors can impair statistical analysis, reliability analysis has been neglected in applied microbiology. This study assessed the intra-rater reproducibility of the agar-based method for estimation of phospholipase activity (Pz). Pz readings were performed twice by two examiners (E1, E2), either directly on plates or from photos, using both black and white backgrounds. Pz values were taken from single or triplicate colonies from each sample (n=30). Intra-examiner reproducibility was estimated using the Intraclass Correlation Coefficient (ICC). For both examiners, reading triplicates (ICC E1 = 0.91, ICC E2 = 0.86) was better than reading a single colony (ICC E1 = 0.86, ICC E2 = 0.80). E1 showed excellent concordance when measurements were performed on photos with a white background (ICC = 0.95) and good concordance under the other conditions.
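
As a worked example, a one-way intraclass correlation, ICC(1,1), for two repeated Pz readings per sample can be computed as below; the study may have used a different ICC form (e.g., a two-way model), and the data here are simulated placeholders.

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random-effects ICC(1,1); ratings: (n_samples, k_readings)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)            # between samples
    msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))  # within samples
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(0)
true_pz = rng.uniform(0.5, 1.0, 30)
readings = true_pz[:, None] + rng.normal(0, 0.02, (30, 2))   # two readings each
print(icc_1_1(readings))
```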

Relevance: 90.00%

Abstract:

One of the key objectives in fuel cell technology is to reduce Pt loading by improving its catalytic activity towards alcohol oxidation. Here, a sol-gel based method was used to prepare ternary and quaternary carbon-supported nanoparticles by combining Pt-Ru with Mo, Ta, Pb, Rh or Ir, which were used as electro-catalysts for the methanol and ethanol oxidation reactions in acid medium. Structural characterization performed by XRD measurements revealed that crystalline structures with crystallites ranging from 2.8 to 4.1 nm in size and with different alloy degrees were produced. Tantalum and lead deposited as a heterogeneous mixture of oxides with different valences, resulting in materials with complex structures. The catalysts' activities were evaluated by cyclic voltammetry and by Tafel plots, and the results showed that the activity towards methanol oxidation was highly dependent on the alloy degree, while for ethanol the presence of a metal capable of promoting C-C bond scission, such as Rh, was necessary for good performance. Additionally, the catalysts containing TaOx or PbOx resulted in the best materials due to different effects: the bi-functional mechanism promoted by TaOx and a better dispersion of the catalyst constituents promoted by PbOx. (C) 2012 Elsevier B.V. All rights reserved.
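
For context, a Tafel slope is obtained by a linear fit of overpotential against log10 of current density in the Tafel region, eta = a + b log10(j); the sketch below uses synthetic data and is not tied to the reported measurements.

```python
import numpy as np

j = np.logspace(-4, -2, 20)                 # current density, A/cm^2
eta = 0.25 + 0.12 * np.log10(j / 1e-3)      # synthetic Tafel-region data
b, a = np.polyfit(np.log10(j), eta, 1)      # slope b in V/decade, intercept a
print(f"Tafel slope: {b * 1000:.0f} mV/decade")
```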

Relevance: 90.00%

Abstract:

The continuous advancement and enhancement of wireless systems are enabling new compelling scenarios where mobile services can adapt to the current execution context, represented by the computational resources available at the local device, the current physical location, the people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and that introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware solution. We discuss the main functions needed for context data distribution in large mobile systems, and we argue for the precise definition and strict enforcement of quality-based contracts between context consumers and the CDDI, used to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions. Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to underline the wide technical validity of our analysis and solutions under different network deployments and quality constraints.
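
A loose, hypothetical sketch of the quality-based contract idea between a context consumer and a CDDI is given below; all class and field names are invented for illustration and do not reflect the dissertation's actual APIs.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class QualityContract:
    max_latency_ms: int       # freshness bound on delivered context data
    min_coverage: float       # fraction of relevant sources to reach
    reliability: float        # required delivery probability

@dataclass
class ContextDistributor:
    subscriptions: list = field(default_factory=list)

    def subscribe(self, topic: str, contract: QualityContract,
                  callback: Callable[[dict], None]) -> None:
        # A real CDDI would select transports (WiFi/Bluetooth, ad-hoc or
        # infrastructure) and a replication strategy to honor the contract.
        self.subscriptions.append((topic, contract, callback))

    def publish(self, topic: str, data: dict) -> None:
        for t, contract, cb in self.subscriptions:
            if t == topic:
                cb(data)      # contract enforcement omitted in this sketch
```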

Relevance: 90.00%

Abstract:

For several decades, sports science has been supported in its work by computer-based methods. With the steady advancement of technology, sports practice has in recent years also increasingly benefited from their use. Mathematical and computational models and algorithms are used to optimize performance in both team and individual sports. In the present work, the metamodel PerPot, developed by Prof. Perl in 2000, is adapted to endurance running. The changes concern both the internal model structure and the way the model parameters are determined. So that the model can be used in sports practice, a calibration test was developed with which the specific model parameters are individually fitted to each athlete. With the adapted model it is possible to reproduce the heart-rate curves corresponding to given speed profiles. With the model tuned to the athlete, runs can then be simulated by entering speed profiles. These simulations can be used in practice to optimize training and competition. Training can be optimally controlled by determining, through simulation, an individual anaerobic threshold heart rate. The statistical evaluation of the PerPot threshold shows significant agreement with the invasively determined lactate thresholds customary in sports practice. Competitions can be supported by determining an optimal speed profile through various simulation-based optimization procedures. With the newest method, the athlete even receives live predictions during the race, based on the speed and heart-rate data measured during the competition. The competition target times optimized with PerPot show high predictive quality compared with the target times actually achieved.
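
Very loosely, an antagonistic load-response model in the spirit of PerPot can be sketched as below: the load feeds both a strain and a response potential, whose delayed outflows respectively lower and raise the performance potential. The delays and the update rule here are invented placeholders; calibrating such parameters to the individual athlete is precisely the role of the calibration test described above.

```python
import numpy as np

def perpot_like(load, d_strain=6.0, d_response=2.0, dt=1.0):
    """Toy antagonistic model: returns a bounded performance-potential curve."""
    strain = response = 0.0
    perf = 0.5                                  # performance potential in [0, 1]
    out = []
    for l in load:
        strain += l * dt                        # load raises both potentials
        response += l * dt
        f_s = strain / d_strain * dt            # delayed strain outflow (lowers perf)
        f_r = response / d_response * dt        # delayed response outflow (raises perf)
        strain -= f_s
        response -= f_r
        perf = min(max(perf + f_r - f_s, 0.0), 1.0)
        out.append(perf)
    return np.array(out)

speed = np.concatenate([np.full(30, 0.3), np.full(30, 0.6)])  # toy speed profile
profile = perpot_like(speed)     # response curve for the given speed profile
```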