969 results for bayesian methods
Abstract:
When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of the power of the highest order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and the results compared with the approximate solution. An ablation problem is also analysed and the results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem where the boundary temperature increases exponentially is then analysed; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed; in this case no analytical or numerical results are available to assess the accuracy.
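To make the idea concrete, here is a small toy sketch in Python. For the classical semi-infinite problem with a fixed surface temperature and the profile T = T_s(1 - x/delta)^n, the heat balance integral gives delta^2 = 2n(n+1)*alpha*t, so the dimensionless surface-flux prefactor is n/sqrt(2n(n+1)). For illustration only, we pick n by minimising the mismatch with the known exact (Neumann) prefactor 1/sqrt(pi); note that the paper's own method minimises a residual-based error function and needs no exact solution, so everything below, including the grid search, is purely an assumed illustration.

```python
import math

def hbim_flux_prefactor(n):
    """Dimensionless surface-flux prefactor of the HBIM solution for the
    semi-infinite solid with fixed surface temperature, using the profile
    T = T_s * (1 - x/delta)**n.  The heat balance integral gives
    delta**2 = 2*n*(n+1)*alpha*t, hence
    q * sqrt(alpha*t) / (k*T_s) = n / sqrt(2*n*(n+1))."""
    return n / math.sqrt(2.0 * n * (n + 1.0))

EXACT = 1.0 / math.sqrt(math.pi)  # exact (Neumann) flux prefactor

def best_exponent(lo=1.0, hi=4.0, step=1e-3):
    """Grid search for the exponent minimising the flux mismatch."""
    best_n, best_err = lo, float("inf")
    n = lo
    while n <= hi:
        err = abs(hbim_flux_prefactor(n) - EXACT)
        if err < best_err:
            best_n, best_err = n, err
        n += step
    return best_n

n_opt = best_exponent()
```

Matching the exact flux analytically gives n = 2/(pi - 2), roughly 1.752, which the grid search recovers.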
Abstract:
The recent advances in sequencing technologies have given all microbiology laboratories access to whole genome sequencing. Provided that tools for the automated analysis of sequence data, and databases for the associated meta-data, are developed, whole genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. As a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, with one and two phase changes, which have exact solutions that enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare these to numerical solutions. It is popular to use a quadratic profile as an approximation of the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give still greater improvement, and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is better for various problems, showing that the answer depends largely on the specified boundary conditions.
Abstract:
In this work we have studied the modifications in the biological properties of Trypanosoma cruzi when the parasite is maintained for a long time in axenic culture. The studies were done with a clone from an avirulent strain (Dm30L) and a non-cloned virulent strain (EP) of T. cruzi. Both parasites were maintained, for at least three years, by successive triatomine/mouse alternate passage (control condition), by serial passage in axenic medium (culture condition), or only in the mouse (mouse condition). The comparison between parasites of the culture and control conditions showed that metacyclogenesis capacity was reduced in the former and that the resulting metacyclics displayed an attenuated virulence. In order to compare the virulence of metacyclics from the urine of the insect vector, Rhodnius prolixus were infected by artificial feeding with parasites of the control or culture condition. After three triatomine/triatomine passages, an almost identical biological behaviour was observed for these parasites, indicating that the maintenance of T. cruzi for a long time in axenic culture affects the differentiation capacity and the virulence of the parasite. Additionally, it was demonstrated that it is possible to maintain T. cruzi exclusively through passages in the invertebrate host.
Abstract:
ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as the best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach reduces the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
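As a purely illustrative sketch of such a two-step tiered strategy, consider the Python function below. The assay names follow the abstract, but every threshold and combination rule is a hypothetical placeholder, not one of the cut-offs validated in the ACuteTox project.

```python
def tiered_classification(nru_3t3_ic50_ugml, neurotox_alert,
                          log_p, predicted_absorption, predicted_bbb_passage):
    """Toy two-step tiered strategy in the spirit of the one described
    above: a basal cytotoxicity assay first, refined by a neurotoxicity
    alert plus in-silico kinetic information.  All numeric thresholds
    are hypothetical placeholders."""
    # Step 1: basal cytotoxicity (neutral red uptake in 3T3 cells).
    if nru_3t3_ic50_ugml < 100.0:          # hypothetical cut-off
        return "classified (acutely toxic)"
    # Step 2: rescue chemicals that would be wrongly left unclassified,
    # using the neurotoxicity alert and whether the compound plausibly
    # reaches its target (absorbed and crosses the blood-brain barrier).
    reaches_target = predicted_absorption and (predicted_bbb_passage or log_p > 2.0)
    if neurotox_alert and reaches_target:
        return "classified (neurotoxicity alert)"
    return "not classified (LD50 > 2000 mg/kg b.w.)"
```

For example, a chemical that is weakly cytotoxic but carries a neurotoxicity alert and is predicted to be absorbed would be caught in the second step rather than left unclassified.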
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions on how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information or the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
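To illustrate one of these concepts, the following Python sketch computes the expected value of sample information (EVSI) for a decision about a seizure proportion theta, via a beta-binomial preposterior analysis with losses linear in theta. The prior, the sample size and the loss values are hypothetical choices for the example; real casework would use elicited priors and losses.

```python
import math

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_binomial_pmf(k, n, a, b):
    """Prior predictive probability of k 'positive' items in a sample of
    n, when the population proportion has a Beta(a, b) prior."""
    return math.exp(math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                    + log_beta(a + k, b + n - k) - log_beta(a, b))

def bayes_risk(mean, loss_pos, loss_neg):
    """Smallest expected loss over the two actions, for losses linear in
    theta: L(report, theta) = loss_pos*(1 - theta) and
    L(do not report, theta) = loss_neg*theta, so only the posterior mean
    matters."""
    return min(loss_pos * (1.0 - mean), loss_neg * mean)

def expected_value_of_sample_info(n, a, b, loss_pos, loss_neg):
    """EVSI of inspecting n items, by preposterior analysis: prior Bayes
    risk minus the expected posterior Bayes risk over all outcomes k."""
    prior_risk = bayes_risk(a / (a + b), loss_pos, loss_neg)
    preposterior = sum(beta_binomial_pmf(k, n, a, b)
                       * bayes_risk((a + k) / (a + b + n), loss_pos, loss_neg)
                       for k in range(n + 1))
    return prior_risk - preposterior

# Hypothetical numbers: vague Beta(1, 1) prior, asymmetric losses,
# and a sample of 10 items from the seizure.
evsi = expected_value_of_sample_info(10, 1.0, 1.0, loss_pos=3.0, loss_neg=1.0)
```

EVSI is always non-negative: at worst the sample changes no decision, and any outcome that flips the optimal action lowers the expected loss.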
Stabilized Petrov-Galerkin methods for the convection-diffusion-reaction and the Helmholtz equations
Abstract:
We present two new stabilized high-resolution numerical methods, for the convection–diffusion–reaction (CDR) and the Helmholtz equations respectively. The work embarks upon an a priori analysis of consistency recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The method follows earlier approaches that may be viewed as an upwinding operator plus a discontinuity-capturing operator. Finally, some remarks are made on the extension of the HRPG method to multiple dimensions. Next, we present a new numerical scheme for the Helmholtz equation resulting in quasi-exact solutions. The focus is on the approximation of the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is to provide a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analysis in 1D and 2D illustrates the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
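The averaging idea for the Helmholtz scheme can be checked in 1D with a few lines of Python: inserting a plane wave into a generic three-point stencil yields its discrete dispersion relation, and the stencil obtained by averaging the Galerkin FEM and finite difference stencils shows a far smaller phase error. The 1D reduction and the eight-elements-per-wavelength resolution follow the abstract; the code itself is a sketch, not the authors' implementation.

```python
import math

def discrete_wavenumber(kh, side, center):
    """Numerical wavenumber theta = k_h * h of the three-point scheme
    (u[i-1] - 2u[i] + u[i+1])/h^2
        + k^2 * (side*u[i-1] + center*u[i] + side*u[i+1]) = 0,
    obtained by inserting the plane wave u[i] = exp(1j*theta*i):
    cos(theta) = (2 - center*kh^2) / (2 + 2*side*kh^2)."""
    return math.acos((2.0 - center * kh ** 2) / (2.0 + 2.0 * side * kh ** 2))

kh = 2.0 * math.pi / 8.0  # eight elements per wavelength
fd  = discrete_wavenumber(kh, 0.0, 1.0)               # classical finite differences
fem = discrete_wavenumber(kh, 1.0 / 6.0, 2.0 / 3.0)   # Galerkin FEM, consistent mass
avg = discrete_wavenumber(kh, 1.0 / 12.0, 5.0 / 6.0)  # average of the two stencils
```

The finite difference and FEM schemes lag and lead the exact wavenumber by similar amounts, so their average cancels the leading dispersion error term.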
Abstract:
We introduce and analyze two new semi-discrete numerical methods for the multi-dimensional Vlasov-Poisson system. The schemes are constructed by combining a discontinuous Galerkin approximation to the Vlasov equation with a mixed finite element method for the Poisson problem. We show optimal error estimates in the case of smooth, compactly supported initial data. We also propose a scheme that preserves the total energy of the system.
Abstract:
This study presents classification criteria for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether or not the plantation is legal. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e. prior probability distributions on cannabis type and the consequences of classification, measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criteria.
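A minimal sketch of such a Bayes factor classification is given below, with univariate Gaussian class-conditional densities standing in for the study's statistical models. All means, standard deviations, priors and losses are hypothetical, chosen only to show the mechanics of the decision rule.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def classify_seedling(x, prior_drug, loss_fp, loss_fn,
                      mu_drug=0.8, mu_fiber=0.2, sigma=0.15):
    """Two-class Bayesian classification in the spirit of the procedure
    described above.  x is a hypothetical univariate summary of the
    relative proportions of the three leaf compounds; the Gaussian
    class-conditional models and all numbers are illustrative, not the
    fitted models of the study.  loss_fp is the loss of wrongly calling
    a fiber-type plant 'drug type'; loss_fn the loss of missing one."""
    bayes_factor = normal_pdf(x, mu_drug, sigma) / normal_pdf(x, mu_fiber, sigma)
    prior_odds = prior_drug / (1.0 - prior_drug)
    posterior_odds = bayes_factor * prior_odds
    # Decision-theoretic rule: classify as drug type when the posterior
    # odds exceed the ratio of the misclassification losses.
    return "drug type" if posterior_odds > loss_fp / loss_fn else "fiber type"
```

Raising loss_fp relative to loss_fn shifts the decision boundary so that stronger evidence is required before declaring a seedling drug type.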
Advanced mapping of environmental data: Geostatistics, Machine Learning and Bayesian Maximum Entropy
Abstract:
This book combines geostatistics and global mapping systems to present an up-to-the-minute study of environmental data. Featuring numerous case studies, the reference covers model-dependent (geostatistics) and data-driven (machine learning) analysis techniques such as risk mapping, conditional stochastic simulations, descriptions of spatial uncertainty and variability, artificial neural networks (ANN) for spatial data, Bayesian maximum entropy (BME), and more.
Abstract:
Nandrolone (19-nortestosterone) is an anabolic steroid widely used in sports where strength plays an essential role. Once nandrolone has been metabolised, two major metabolites are excreted in urine, 19-norandrosterone (NA) and 19-noretiocholanolone (NE). In 1997, in France, quite a few sportsmen had concentrations of 19-norandrosterone very close to the IOC cut-off limit (2 ng/ml). At that time, a debate took place about the capability of the human male body to produce these metabolites by itself, without any intake of nandrolone or related compounds. The International Football Federation (FIFA) was very concerned by this issue, especially because the World Cup was about to start in France. In this respect, a statistical study was carried out with all football players from the first and second divisions of the Swiss Football National League. All players gave a urine sample after effort, and around 6% of them showed traces of 19-norandrosterone. These results were compared with those of amateur football players (control group): around 6% of them had very small amounts of 19-norandrosterone and/or 19-noretiocholanolone in urine after effort, whereas none of them had detectable traces of either metabolite before effort. The origin of these compounds in urine after strenuous physical activity is still unknown, but three hypotheses can be put forward. First, an endogenous production of nandrolone metabolites takes place. Second, nandrolone metabolites are released from the fatty tissues after an intake of nandrolone, some related compounds or some contaminated nutritive supplements. Finally, the sportsmen may have taken something during or just before the football game.
Abstract:
A direct agglutination test (DAT) and an immunofluorescence technique (IFAT) were compared for the detection of Leishmania infantum infection in 43 dogs and five foxes from Alto-Douro and Arrábida, two known endemic areas in Portugal. In four dogs with proven canine leishmaniasis, both DAT and IFAT showed positive readings (titres >1:320 and >1:128). Of 34 samples collected from apparently healthy dogs, ten were positive by both serological tests and eight were serologically positive by one test or the other. Three of the five foxes captured in these areas scored titres indicative of leishmaniasis in both DAT and IFAT. The concordance between DAT and IFAT over all 48 collected samples was 81.25%. Considering these and previous studies in the adjacent Mediterranean areas, the seroprevalence of L. infantum infection in the canine and vulpine populations appears to be high.
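The reported 81.25% concordance can be reproduced from the counts in the abstract (4 + 10 + 3 = 17 double-positive samples; 8 stated discordant dogs plus, by assumption, one further discordant sample to reach 9 disagreements out of 48). The Python sketch below also adds Cohen's kappa as a chance-corrected agreement measure; the split of the discordant samples between the two tests is hypothetical.

```python
def agreement_stats(both_pos, both_neg, only_a, only_b):
    """Percent agreement and Cohen's kappa for two binary tests applied
    to the same samples."""
    n = both_pos + both_neg + only_a + only_b
    po = (both_pos + both_neg) / n                      # observed agreement
    pa_pos = (both_pos + only_a) / n                    # test A positive rate
    pb_pos = (both_pos + only_b) / n                    # test B positive rate
    pe = pa_pos * pb_pos + (1 - pa_pos) * (1 - pb_pos)  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return po, kappa

# 17 double positives are grounded in the abstract; 22 double negatives
# follow from n = 48 and 9 disagreements; the 5/4 split of the
# discordant samples between DAT-only and IFAT-only is hypothetical.
po, kappa = agreement_stats(both_pos=17, both_neg=22, only_a=5, only_b=4)
```

Here po = 39/48 = 0.8125, matching the 81.25% concordance reported above.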
Abstract:
Background: It has been suggested that a low dose of valganciclovir can be as effective as a standard dose for cytomegalovirus (CMV) prophylaxis after kidney transplantation. The aim of our study was to determine the ganciclovir exposure observed under a routine daily dosage of 450 mg valganciclovir in kidney transplant recipients with a wide range of renal function. Methods: In this prospective study, kidney transplant recipients with a GFR MDRD above 25 mL/min at risk for CMV (donor or recipient seropositive for CMV) received valganciclovir (450 mg daily) prophylaxis for 3 months. Ganciclovir levels at trough (Ctrough) and at peak (C3h) were measured monthly. Ganciclovir exposure (AUC0-24) was estimated using Bayesian non-linear mixed-effect modelling (NONMEM) and compared between 3 groups of patients according to their kidney function: GFR MDRD 26-39 mL/min (Group 1), GFR MDRD 40-59 mL/min (Group 2) and GFR MDRD 60-90 mL/min (Group 3). CMV DNAemia was assessed during and after prophylaxis using PCR. Results: Thirty-six patients received 450 mg daily of valganciclovir for 3 months. Median ganciclovir C3h was 3.9 mg/L (range 1.3-7.1) and Ctrough was 0.4 mg/L (range 0.1-2.7). Median (range) AUC0-24 of ganciclovir was 59.3 mg.h/L (39.0-85.3) in Group 1 patients, 35.8 mg.h/L (24.9-55.8) in Group 2 patients and 29.6 mg.h/L (22.0-43.2) in Group 3 patients (p<0.001). Anemia was more common in Group 1 patients than in the other groups (p=0.01). No differences in other adverse events according to ganciclovir exposure were observed. CMV DNAemia was not detected during prophylaxis. After discontinuing prophylaxis, CMV DNAemia was seen in 8/34 patients (23.5%) and 4/36 patients (11%) developed CMV disease. Conclusion: A routine dosage of valganciclovir achieved plasma levels of ganciclovir in patients with GFR >60 mL/min similar to those previously reported using oral ganciclovir. A daily dose of 450 mg valganciclovir appears to be acceptable for CMV prophylaxis in most kidney transplant recipients.
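For intuition only, a back-of-the-envelope AUC estimate can be formed from a single peak/trough pair by assuming mono-exponential decline between the 3-hour peak and the pre-dose trough. This crude non-compartmental sketch is not the Bayesian NONMEM estimation used in the study; it merely shows the order of magnitude implied by the median concentrations reported above.

```python
import math

def crude_auc_0_24(c_peak, t_peak, c_trough, tau=24.0):
    """Very rough AUC(0-24h) estimate from one peak/trough pair,
    assuming mono-exponential decline from the peak at t_peak to the
    next pre-dose trough at tau, and a trapezoid for the rising part.
    Units: concentrations in mg/L, times in h, AUC in mg*h/L."""
    ke = math.log(c_peak / c_trough) / (tau - t_peak)  # elimination rate (1/h)
    auc_rise = 0.5 * (c_trough + c_peak) * t_peak      # trapezoid, trough -> peak
    auc_fall = (c_peak - c_trough) / ke                # exponential, peak -> trough
    return auc_rise + auc_fall

# Median values reported above: C3h = 3.9 mg/L, Ctrough = 0.4 mg/L.
auc = crude_auc_0_24(c_peak=3.9, t_peak=3.0, c_trough=0.4)
```

With the median concentrations this gives roughly 39 mg.h/L, the same order of magnitude as the group medians reported above.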