154 results for unknown-input estimation

at Université de Lausanne, Switzerland


Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Patients with unknown stroke onset are generally excluded from acute recanalisation treatments. We designed a pilot study to assess the feasibility of a trial of perfusion computed tomography (PCT)-guided thrombolysis in patients with ischemic tissue at risk of infarction and unknown stroke onset. METHODS: Patients with a supratentorial stroke of unknown onset in the middle cerebral artery territory and a significant volume of at-risk tissue on PCT were randomized to intravenous thrombolysis with alteplase (0.9 mg/kg) or placebo. Feasibility endpoints were randomization and blinded treatment of patients within 2 h of hospital arrival, and the correct application (estimation) of the perfusion imaging criteria. RESULTS: At baseline, there was a trend towards older age [69.5 (57-78) vs. 49 (44-78) years] in the thrombolysis group (n = 6) compared to placebo (n = 6). Regarding feasibility, the hospital-arrival-to-treatment delay exceeded the allowed 2 h in three patients (25%). There were two protocol violations (17%) regarding PCT, both underestimating the predicted infarct in patients randomized to the placebo group. No symptomatic hemorrhage or death occurred during the first 7 days. Three of the four (75%) and one of the five (20%) patients were recanalized in the thrombolysis and placebo groups, respectively. The volume of non-infarcted at-risk tissue was 84 (44-206) cm³ in the treatment arm and 29 (8-105) cm³ in the placebo arm. CONCLUSIONS: This pilot study shows that a randomized PCT-guided thrombolysis trial in patients with stroke of unknown onset may be feasible if issues such as treatment delays and reliable identification of tissue at risk of infarction are resolved. The safety and efficacy of such an approach need to be established.

Relevance:

30.00%

Publisher:

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is roughly one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it remains comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems.

One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which in reality is unknown. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem; accurate knowledge of the source wavelet is therefore critically important for their successful application. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and the electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is directly incorporated into the inverse problem.

Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to evaluating the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and is able to provide adequate tomographic reconstructions.
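The core idea behind deconvolution-based source wavelet estimation can be illustrated with a toy example. The sketch below is not the thesis algorithm itself (which iterates the estimate against forward-simulated wavefields inside the waveform inversion); it only shows the regularized spectral-division step, with a made-up Ricker-style wavelet, an assumed known medium response, and assumed sampling parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
t = np.arange(n) * 0.5e-9                 # 0.5 ns sampling interval (assumption)

# Made-up Ricker-style wavelet centred at 100 MHz (typical GPR band).
fc = 100e6
arg = (np.pi * fc * (t - 30e-9)) ** 2
wavelet = (1 - 2 * arg) * np.exp(-arg)

# Assumed-known "medium response": three arrivals with invented amplitudes.
green = np.zeros(n)
green[[40, 90, 150]] = [1.0, -0.6, 0.3]

# Observed trace = response convolved with the (unknown) wavelet, plus noise.
trace = np.convolve(green, wavelet)[:n]
trace += 0.01 * rng.standard_normal(n)

# Water-level (regularized) deconvolution in the frequency domain:
# W ≈ D G* / (|G|^2 + eps) recovers the wavelet from trace and response.
D, G = np.fft.rfft(trace), np.fft.rfft(green)
eps = 1e-3 * np.max(np.abs(G) ** 2)
W_est = D * np.conj(G) / (np.abs(G) ** 2 + eps)
wavelet_est = np.fft.irfft(W_est, n)

corr = np.corrcoef(wavelet, wavelet_est)[0, 1]
print(f"correlation between true and estimated wavelet: {corr:.3f}")
```

Even with ambient noise added, the regularized division recovers the wavelet shape closely; in the thesis setting the "response" is itself only approximate, which is why the estimation must be iterated with the inversion.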

Relevance:

30.00%

Publisher:

Abstract:

We present a Bayesian approach for estimating the relative frequencies of multi-single-nucleotide-polymorphism (SNP) haplotypes in populations of the malaria parasite Plasmodium falciparum, using microarray SNP data from human blood samples. Each sample comes from a malaria patient and contains one or several parasite clones that may differ genetically. Samples containing multiple parasite clones with different genetic markers pose a special challenge; the situation is comparable to that of a polyploid organism. The data from each blood sample indicate whether the parasites in the blood carry a mutant or a wildtype allele at various selected genomic positions. If both mutant and wildtype alleles are detected at a given position in a multiply infected sample, the data indicate the presence of both alleles, but the ratio is unknown. Thus, the data only partially reveal which specific combinations of genetic markers (i.e. haplotypes across the examined SNPs) occur in distinct parasite clones. In addition, SNP data may contain errors at non-negligible rates. We use a multinomial mixture model with partially missing observations to represent these data and a Markov chain Monte Carlo method to estimate the haplotype frequencies in a population. Our approach addresses both challenges: multiple infections and data errors.
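As a concrete, heavily simplified illustration of the modelling idea, the sketch below simulates two-SNP data under the assumption that every sample contains exactly two clones, then recovers haplotype frequencies with a random-walk Metropolis sampler. The two-clone assumption, the absence of genotyping error, and all numeric values are illustrative simplifications, not the paper's model.

```python
import itertools, math, random
from collections import Counter

random.seed(1)

HAPS = list(itertools.product((0, 1), repeat=2))   # haplotypes 00, 01, 10, 11

def pattern(h1, h2):
    """Per-SNP allele sets visible when clones h1 and h2 co-infect a sample."""
    return tuple(frozenset((a, b)) for a, b in zip(h1, h2))

# Precompute, for every observable pattern, which haplotype pairs produce it.
PAIRS_FOR = {}
for i, h1 in enumerate(HAPS):
    for j, h2 in enumerate(HAPS):
        PAIRS_FOR.setdefault(pattern(h1, h2), []).append((i, j))

def log_likelihood(freqs, pattern_counts):
    """Two clones per sample, drawn i.i.d. from freqs; only allele sets seen."""
    ll = 0.0
    for obs, n in pattern_counts.items():
        p_obs = sum(freqs[i] * freqs[j] for i, j in PAIRS_FOR[obs])
        ll += n * math.log(p_obs)
    return ll

# Simulate 300 doubly infected samples from made-up "true" frequencies.
true_freqs = [0.5, 0.1, 0.1, 0.3]
def draw():
    h1, h2 = random.choices(HAPS, weights=true_freqs, k=2)
    return pattern(h1, h2)
counts = Counter(draw() for _ in range(300))

# Random-walk Metropolis on the simplex: shift a little mass between two
# random haplotypes (a symmetric proposal), flat prior, likelihood-ratio accept.
def propose(freqs, eps=0.05):
    p = list(freqs)
    i, j = random.sample(range(len(p)), 2)
    delta = random.uniform(0.0, eps)
    if p[i] - delta <= 0.0:
        return None                        # would leave the simplex: reject
    p[i] -= delta
    p[j] += delta
    return p

current = [0.25] * 4
current_ll = log_likelihood(current, counts)
kept = []
for step in range(6000):
    prop = propose(current)
    if prop is not None:
        prop_ll = log_likelihood(prop, counts)
        if math.log(random.random()) < prop_ll - current_ll:
            current, current_ll = prop, prop_ll
    if step >= 2000:                       # discard burn-in
        kept.append(current)

post_mean = [sum(f[k] for f in kept) / len(kept) for k in range(4)]
print("posterior mean frequencies:", [round(x, 2) for x in post_mean])
```

Note how samples showing both alleles at both SNPs are compatible with either the {00, 11} or the {01, 10} pair, so they only partially constrain the frequencies; the singly resolved samples do the disambiguating, which is exactly the "partially missing observations" aspect of the paper's mixture model.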

Relevance:

30.00%

Publisher:

Abstract:

A method is proposed for estimating the absolute binding free energy of interaction between proteins and ligands. Conformational sampling of the protein-ligand complex is performed by molecular dynamics (MD) in vacuo, and the solvent effect is calculated a posteriori by solving the Poisson or the Poisson-Boltzmann equation for selected frames of the trajectory. The binding free energy is written as a linear combination of the surface buried upon complexation, SASbur; the electrostatic interaction energy between the ligand and the protein, Eelec; and the difference between the solvation free energies of the complex and the isolated ligand and protein, deltaGsolv. The method uses the buried surface to account for the non-polar contribution to the binding free energy because it is less sensitive to the details of the structure than the van der Waals interaction energy. The parameters of the method were developed on a training set of 16 HIV-1 protease-inhibitor complexes of known 3D structure; a correlation coefficient of 0.91 was obtained, with an unsigned mean error of 0.8 kcal/mol. When applied to a set of 25 HIV-1 protease-inhibitor complexes of unknown 3D structure, the method provides a satisfactory correlation between the calculated binding free energy and the experimental pIC50 without reparametrization.
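To make the three-term linear model concrete, here is a small synthetic sketch of how such coefficients can be fitted by least squares. All data and weights below are invented for illustration; they are not the paper's HIV-1 protease training set or its fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16  # same size as the paper's training set, but synthetic values

# Invented descriptor ranges (buried surface in A^2, energies in kcal/mol).
sas_bur = rng.uniform(500, 1500, n)
e_elec = rng.uniform(-60, -10, n)
dg_solv = rng.uniform(5, 40, n)

# Made-up "true" weights and noisy synthetic binding free energies:
# dG_bind ≈ a*SAS_bur + b*E_elec + c*dG_solv
true_coeffs = np.array([-0.01, 0.3, 0.4])
X = np.column_stack([sas_bur, e_elec, dg_solv])
dg_bind = X @ true_coeffs + rng.normal(0, 0.3, n)

# Least-squares fit of the three coefficients, as one would do on a
# training set of complexes with known structures and measured affinities.
coeffs, *_ = np.linalg.lstsq(X, dg_bind, rcond=None)
predicted = X @ coeffs
r = np.corrcoef(predicted, dg_bind)[0, 1]
print(f"fitted coefficients: {coeffs.round(3)}, correlation r = {r:.2f}")
```

Once fitted, the same three descriptors computed for a new complex give a binding free energy prediction without refitting, which mirrors how the method was applied to the 25 complexes of unknown structure.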

Relevance:

30.00%

Publisher:

Abstract:

As a rigorous combination of probability theory and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches, with the help of Bayesian networks, for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic, context-independent network fragments that users may incorporate as building blocks when constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) to specify graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
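The smallest possible network fragment of this kind is a single binary hypothesis node with one evidence child; the sketch below shows the Bayes update such a fragment encodes when the evidence node is instantiated. The probability values are invented purely for illustration.

```python
def propagate(prior_h, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E observed) for a two-node fragment H -> E,
    obtained by Bayes' theorem (the propagation a BN engine would do)."""
    joint_h = prior_h * p_e_given_h            # P(H) * P(E | H)
    joint_not_h = (1 - prior_h) * p_e_given_not_h
    return joint_h / (joint_h + joint_not_h)   # normalize over both states

# Evidence that is 10x more probable under H than under not-H turns a
# 0.5 prior into a posterior of 10/11 (prior odds 1, likelihood ratio 10).
print(propagate(0.5, 0.8, 0.08))
```

Larger inference models are assembled by chaining such fragments, with the output distribution of one fragment serving as the prior of the next.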

Relevance:

30.00%

Publisher:

Abstract:

The comparison of cancer prevalence with cancer mortality can, under some hypotheses, lead to an estimate of the registration rate. A method is proposed in which the cases with cancer as a cause of death are divided into three categories: (1) cases already known to the registry; (2) unknown cases that occurred before the registry's creation date; (3) unknown cases occurring while the registry operates. The estimate is then the number of cases in the first category divided by the total of those in categories 1 and 3 (only these are to be registered). An application is performed on the data of the Canton de Vaud. Survival rates of the Norwegian Cancer Registry are used to compute the number of unknown cases to be included in the second and third categories, respectively. The discussion focuses on the possible determinants of the completeness rates obtained for various cancer sites.
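The estimator itself reduces to simple arithmetic; a minimal sketch with hypothetical counts (not the Canton de Vaud figures):

```python
def registration_rate(known_cases: int, unknown_during_operation: int) -> float:
    """Completeness estimate: category-1 cases divided by the total of
    category-1 and category-3 cases. Category 2 (cases arising before the
    registry existed) is excluded, since those were never registrable."""
    total_registrable = known_cases + unknown_during_operation
    return known_cases / total_registrable

# Made-up counts: 180 deaths from cases already known to the registry,
# 20 deaths from cases that arose during registry operation but were missed.
rate = registration_rate(180, 20)
print(f"Estimated completeness: {rate:.0%}")
```

In practice the category-2 and category-3 counts are not observed directly; as the abstract notes, they are reconstructed from external survival rates.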

Relevance:

20.00%

Publisher:

Abstract:

Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
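The replicate-based error estimate can be sketched as follows. The genotype strings and the missing-data code are invented for illustration, and a real pipeline would compute such rates separately for loci, alleles and SNPs from Stacks output rather than from a toy list.

```python
def replicate_error_rate(calls_a, calls_b, missing="./."):
    """Fraction of loci, genotyped in BOTH replicates of the same sample,
    whose calls disagree. Replicates should be identical, so any
    disagreement is counted as a genotyping error."""
    compared = disagreed = 0
    for a, b in zip(calls_a, calls_b):
        if a == missing or b == missing:
            continue            # rate is defined only over shared calls
        compared += 1
        if a != b:
            disagreed += 1
    return disagreed / compared if compared else float("nan")

# Two made-up replicate call vectors over six loci:
rep1 = ["0/0", "0/1", "1/1", "0/1", "./.", "0/0"]
rep2 = ["0/0", "0/1", "0/1", "0/1", "0/0", "./."]
print(replicate_error_rate(rep1, rep2))  # 1 mismatch over 4 shared calls
```

Minimizing this rate (while maximizing the number of shared, informative loci) is exactly the criterion the authors use to tune the de novo assembly parameters.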

Relevance:

20.00%

Publisher:

Abstract:

According to the hypothesis of Traub, also known as the 'formula of Traub', postmortem values of glucose and lactate found in the cerebrospinal fluid or vitreous humor are considered indicators of antemortem blood glucose levels. However, because the lactate concentration increases in the vitreous and cerebrospinal fluid after death, some authors have postulated that using the sum value to estimate antemortem blood glucose levels could lead to an overestimation of the number of cases of glucose metabolic disorders with fatal outcomes, such as diabetic ketoacidosis. The aim of our study, performed on 470 consecutive forensic cases, was to ascertain the advantages of using the sum value to estimate antemortem blood glucose concentrations and, consequently, to rule out fatal diabetic ketoacidosis as the cause of death. Other biochemical parameters, such as blood 3-beta-hydroxybutyrate, acetoacetate, acetone, glycated haemoglobin and urine glucose levels, were also determined. In addition, native postmortem CT scanning, autopsy, histology, neuropathology and toxicology were performed to confirm diabetic ketoacidosis as the cause of death. According to our results, the sum value does not add any further information for the estimation of the antemortem blood glucose concentration. The vitreous glucose concentration appears to be the most reliable marker for estimating antemortem hyperglycaemia and, along with the determination of other biochemical markers (such as blood acetone and 3-beta-hydroxybutyrate, urine glucose and glycated haemoglobin), for confirming diabetic ketoacidosis as the cause of death.

Relevance:

20.00%

Publisher:

Abstract:

The dispersal process, by which individuals or other dispersing agents such as gametes or seeds move from birthplace to a new settlement locality, has important consequences for the dynamics of genes, individuals, and species. Many of the questions addressed by ecology and evolutionary biology require a good understanding of species' dispersal patterns. Much effort has thus been devoted to overcoming the difficulties associated with dispersal measurement. In this context, genetic tools have long been the focus of intensive research, providing a great variety of potential solutions to measuring dispersal. This methodological diversity is reviewed here to help (molecular) ecologists find their way toward dispersal inference and interpretation and to stimulate further developments.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Recommendations for statin use for the primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes: the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology guidelines (ESC). With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD >20%; with SCORE, as a 10-year risk of fatal CVD ≥5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increasing statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: The proportions of participants (both genders) classified as high-risk were 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% as high-risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but moderate with ESC. From a population perspective, full compliance with the ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
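The headline population-impact figures follow directly from the reported percentages; a back-of-envelope sketch using only numbers quoted in the abstract (the study's full accounting of current statin use and the 27% success proportion is more detailed than this):

```python
# Expected 10-year CHD deaths in Switzerland, as quoted in the abstract.
expected_chd_deaths = 24_310

# Maximal relative reductions under full guideline compliance (from the abstract).
reduction = {
    "ATP III": 0.179,
    "IAS": 0.173,
    "ESC": 0.108,
    "ESC (extrapolated)": 0.115,
}

for guideline, frac in reduction.items():
    averted = frac * expected_chd_deaths
    print(f"{guideline:>20}: about {averted:,.0f} CHD deaths averted over 10 years")
```

This makes the gap concrete: the choice of scoring scheme changes the projected benefit by several thousand deaths over a decade in a single country.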

Relevance:

20.00%

Publisher:

Abstract:

Developing a predictive understanding of subsurface flow and transport is complicated by the disparity of scales over which the controlling hydrological properties and processes vary. Conventional techniques for characterizing hydrogeological properties (such as pumping, slug, and flowmeter tests) typically rely on borehole access to the subsurface. Because their spatial extent is commonly limited to the vicinity of the wellbores, these methods often cannot provide sufficient information to describe key controls on subsurface flow and transport. The field of hydrogeophysics has evolved in recent years to explore the potential that geophysical methods hold for improving the quantification of subsurface properties and processes relevant to hydrological investigations. This chapter is intended to familiarize hydrogeologists and water-resource professionals with the state of the art, as well as the existing challenges, of hydrogeophysics. We provide a review of the key components of hydrogeophysical studies: geophysical methods commonly used for shallow subsurface characterization; petrophysical relationships used to link the geophysical properties to hydrological properties and state variables; and estimation or inversion methods used to integrate hydrological and geophysical measurements in a consistent manner. We demonstrate the use of these different geophysical methods, petrophysical relationships, and estimation approaches through several field-scale case studies.
Among other applications, the case studies illustrate the use of hydrogeophysical approaches to quantify subsurface architecture that influences flow (such as hydrostratigraphy and preferential pathways); delineate anomalous subsurface fluid bodies (such as contaminant plumes); monitor hydrological processes (such as infiltration, freshwater-seawater interface dynamics, and flow through fractures); and estimate hydrological properties (such as hydraulic conductivity) and state variables (such as water content). The case studies have been chosen to illustrate how hydrogeophysical approaches can yield insights into complex subsurface hydrological processes, provide input that improves flow and transport predictions, and provide quantitative information over field-relevant spatial scales. The chapter concludes by describing existing hydrogeophysical challenges and associated research needs. In particular, we identify quantitative watershed hydrogeophysics as a frontier area in which significant effort is required to advance the estimation of hydrological properties and processes (and their uncertainties) over spatial scales relevant to the management of water resources and contaminants.