999 results for Pseudo-Bayesian Design


Relevance:

20.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying 'true' hydraulic conductivity field. We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
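A minimal numeric sketch of the two-step idea, not the paper's geostatistical procedure: spatial correlation, the sequential simulation machinery and the full Bayesian conditioning are omitted, and all borehole values, grid dimensions and fitted relations below are hypothetical. It only illustrates how coarse electrical-conductivity estimates could be downscaled with a sampled residual and then mapped to hydraulic conductivity with a second sampled residual, so that repeated draws yield multiple stochastic realizations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical collocated borehole data (high resolution)
sigma_coarse_bh = np.array([12.0, 15.0, 18.0, 22.0, 25.0])   # ERT estimate at borehole cells
sigma_fine_bh   = np.array([10.0, 16.0, 17.0, 24.0, 27.0])   # logged electrical conductivity
logK_bh         = np.array([-4.8, -4.2, -4.0, -3.4, -3.1])   # log10 hydraulic conductivity

# Step 1: calibrate a downscaling relation  sigma_fine ~ sigma_coarse
a1, b1 = np.polyfit(sigma_coarse_bh, sigma_fine_bh, 1)
s1 = (sigma_fine_bh - (a1 * sigma_coarse_bh + b1)).std(ddof=1)

# Step 2: calibrate the petrophysical relation  logK ~ sigma_fine
a2, b2 = np.polyfit(sigma_fine_bh, logK_bh, 1)
s2 = (logK_bh - (a2 * sigma_fine_bh + b2)).std(ddof=1)

# Spatially exhaustive, low-resolution ERT field (hypothetical regional grid)
sigma_coarse_grid = rng.uniform(10.0, 28.0, size=(50, 50))

def simulate_logK(n_realizations=10):
    """Draw stochastic log10-K realizations via the two conditional relations."""
    fields = []
    for _ in range(n_realizations):
        # downscale: conditional mean plus a sampled residual
        sigma_fine = a1 * sigma_coarse_grid + b1 + rng.normal(0.0, s1, sigma_coarse_grid.shape)
        # map to hydraulic conductivity, again with a sampled residual
        logK = a2 * sigma_fine + b2 + rng.normal(0.0, s2, sigma_fine.shape)
        fields.append(logK)
    return np.stack(fields)

realizations = simulate_logK()
print(realizations.shape, realizations.mean(), realizations.std())
```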

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Screening tests for subclinical cardiovascular disease, such as markers of atherosclerosis, are increasingly used in clinical prevention to identify individuals at high cardiovascular risk. Being aware of these test results might also enhance patient motivation to change unhealthy behaviors, but the effectiveness of such a screening strategy has been poorly studied. METHODS: The CAROtid plaque Screening trial on Smoking cessation (CAROSS) is a randomized controlled trial in 530 regular smokers aged 40-70 years to test the hypothesis that carotid plaque screening will influence smokers' behavior, with an increased rate of smoking cessation (primary outcome) and improved control of other cardiovascular risk factors (secondary outcomes) after 1-year follow-up. All smokers will receive brief advice for smoking cessation and will subsequently be randomly assigned to either the intervention group (with plaque screening) or the control group (without plaque screening). Carotid ultrasound will be conducted with a standard protocol. Smokers with at least one carotid plaque will receive pictures of their own plaques with a structured explanation of the general significance of plaques. To ensure equal contact conditions, smokers not undergoing ultrasound and those without plaques will receive a relevant explanation of the risks associated with tobacco smoking. Study outcomes will be compared between smokers randomized to plaque screening and smokers not submitted to plaque screening. SUMMARY: This will be the first trial to assess the impact of carotid plaque screening on 1-year smoking cessation rates and levels of control of other cardiovascular risk factors.

Relevance:

20.00%

Publisher:

Abstract:

This study was carried out to evaluate the molecular pattern of all available Brazilian human T-cell lymphotropic virus type 1 Env (n = 15) and Pol (n = 43) nucleotide sequences via epitope prediction, physico-chemical analysis, and identification of potential protein sites, giving support to the Brazilian AIDS vaccine program. In 12 previously described peptides of the Env sequences we found 12 epitopes, while in 4 peptides of the Pol sequences we found 4 epitopes. The total variation in amino acid composition was 9 and 17% for human leukocyte antigen (HLA) class I and class II Env epitopes, respectively. Analysis of the Pol sequences revealed a total amino acid variation of 0.75% for HLA-I and HLA-II epitopes. In 5 of the 12 Env epitopes, the physico-chemical analysis demonstrated that the mutations magnified the antigenicity profile. The potential protein domain analysis of the Env sequences showed the loss of a CK-2 phosphorylation site caused by the D197N mutation in one epitope, and of an N-glycosylation site caused by the S246Y and V247I mutations in another epitope. In addition, the selection pressure analysis identified 8 positively selected sites (ω = 9.59) using codon-based substitution models and maximum-likelihood methods. These findings underscore the importance of this Env region for viral fitness and for the host immune response and, therefore, for the development of vaccine candidates.
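As an illustration of the kind of sequence-level comparison involved, the sketch below computes the percentage of variable positions across a set of aligned epitope peptides. The peptide strings and the exact definition of "variation" are hypothetical placeholders; the study's own epitope sequences and its measure of amino acid composition variation may differ.

```python
# Minimal sketch: percentage of variable alignment columns over aligned epitope
# peptides. The sequences below are hypothetical placeholders, not the actual
# Brazilian HTLV-1 Env/Pol epitopes analyzed in the study.
epitope_alignment = [
    "LPHSNLDHILEPSIPWKSK",
    "LPHSNLDHILEPSIPWKSK",
    "LPHSNLDHVLEPSIPWKSK",   # one substituted position
    "LPHSNLDHILEPSVPWKSK",   # another substituted position
]

def percent_variation(seqs):
    """Fraction of alignment columns showing more than one residue, in percent."""
    n_cols = len(seqs[0])
    variable = sum(1 for i in range(n_cols) if len({s[i] for s in seqs}) > 1)
    return 100.0 * variable / n_cols

print(f"{percent_variation(epitope_alignment):.1f}% variable positions")
```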

Relevance:

20.00%

Publisher:

Abstract:

This case study introduces our ongoing work to enhance the virtual classroom in order to provide faculty and students with an environment that is open to their needs, compliant with learning standards (and therefore compatible with other e-learning environments), and based on open-source software. The result is a modular, sustainable and interoperable learning environment that can be adapted to different teaching and learning situations by incorporating the LMS's integrated tools as well as wikis, blogs, forums and Moodle activities, among others.

Relevance:

20.00%

Publisher:

Abstract:

The present paper describes the design of an experimental study conducted with large groups using educational innovation methodologies at the Polytechnic University of Madrid. Specifically, we have chosen the course titled "History and Politics of Sports", which belongs to the Physical Activity and Sport Science Degree. This course was selected because its syllabus is essentially theoretical and it is taught to four large groups of first-year students who have no previous experience of a teaching-learning process based on educational innovation. It is hoped that the results of this research can be extrapolated to other courses with similar characteristics.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: To improve the efficacy of first-line therapy for advanced non-small cell lung cancer (NSCLC), additional maintenance chemotherapy may be given after initial induction chemotherapy to patients who did not progress during the initial treatment, rather than waiting for disease progression to administer second-line treatment. Maintenance therapy may consist of an agent that either was or was not present in the induction regimen. The antifolate pemetrexed is efficacious in combination with cisplatin for first-line treatment of advanced NSCLC and has shown efficacy as a maintenance agent in studies in which it was not included in the induction regimen. We designed a phase III study to determine whether pemetrexed maintenance therapy improves progression-free survival (PFS) and overall survival (OS) after cisplatin/pemetrexed induction therapy in patients with advanced nonsquamous NSCLC. Furthermore, since evidence suggests that expression levels of thymidylate synthase, the primary target of pemetrexed, may be associated with responsiveness to pemetrexed, translational research will address whether thymidylate synthase expression correlates with the efficacy outcomes of pemetrexed. METHODS/DESIGN: Approximately 900 patients will receive four cycles of induction chemotherapy consisting of pemetrexed (500 mg/m2) and cisplatin (75 mg/m2) on day 1 of a 21-day cycle. Patients with an Eastern Cooperative Oncology Group performance status of 0 or 1 who have not progressed during induction therapy will be randomly assigned (in a 2:1 ratio) to one of two double-blind maintenance regimens: pemetrexed (500 mg/m2 on day 1 of a 21-day cycle) plus best supportive care (BSC) or placebo plus BSC. The primary objective is to compare PFS between treatment arms. Secondary objectives include a fully powered analysis of OS, objective tumor response rate, patient-reported outcomes, resource utilization, and toxicity. Tumor specimens for translational research will be obtained from consenting patients before induction treatment, with a second biopsy performed in eligible patients following the induction phase. DISCUSSION: Although using a drug as maintenance therapy that was not used in the induction regimen exposes patients to an agent with a different mechanism of action, evidence suggests that continuing an agent from the induction regimen as maintenance therapy enables the identification of patients most likely to benefit from maintenance treatment.

Relevance:

20.00%

Publisher:

Abstract:

A promising approach to adoptive transfer therapy of tumors is to reprogram autologous T lymphocytes by TCR gene transfer of defined Ag specificity. An obstacle, however, is the undesired pairing of the introduced TCRalpha- and TCRbeta-chains with the endogenous TCR chains. These events vary depending on the individual endogenous TCR, and they may not only reduce the cell-surface levels of the introduced TCR but also generate hybrid TCRs with unknown Ag specificities. We show that such hybrid heterodimers can be generated even by the pairing of human and mouse TCRalpha- and TCRbeta-chains. To overcome this hurdle, we have identified a pair of amino acid residues in the crystal structure of a TCR that lie at the interface of the associated TCR Calpha and Cbeta domains and are related to each other by both a complementary steric interaction analogous to a "knob-into-hole" configuration and the electrostatic environment. We mutated the two residues so as to invert the sense of this interaction, analogous to a charged "hole-into-knob" configuration. We show that this inversion in the Calpha-Cbeta interface promotes selective assembly of the introduced TCR while preserving its specificity and avidity for the Ag ligand. Notably, this TCR modification was equally efficient for both a murine and a human TCR. Our data suggest that this approach is generally applicable to TCRs independently of their Ag specificity and affinity, subset distribution, and species of origin. Thus, this strategy may optimize TCR gene transfer to efficiently and safely reprogram random T cells into tumor-reactive T cells.

Relevance:

20.00%

Publisher:

Abstract:

Background The 'database search problem', that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached over the last two decades with substantial mathematical analyses, accompanied by lively debate and diametrically opposing conclusions. This represents a challenging obstacle in teaching, and it also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing the probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view of the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond traditional, purely formulaic expressions. The method's graphical environment, along with its computational and probabilistic architectures, represents a rich package that offers analysts and discussants additional modes of interaction, concise representation, and coherent communication.
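A minimal numeric sketch of the kind of probabilistic argument such models encode, under standard textbook-style simplifying assumptions (uniform prior over potential sources, error-free profiling, a database that is a random subset of the population); the numbers and the closed-form expression are illustrative rather than taken from the paper's Bayesian networks.

```python
def posterior_source_probability(N, n, gamma):
    """
    Posterior probability that the single database match is the source of the
    crime stain, given: N equally likely potential sources in the population,
    a searched database of n of them, exactly one match, and a matching
    characteristic occurring with probability gamma in unrelated individuals.
    """
    # likelihood if the matching individual is the source:
    # the other n - 1 database members are correctly excluded
    like_source = (1.0 - gamma) ** (n - 1)
    # likelihood if the true source is one of the N - n people outside the
    # database: the matching individual matches by chance, the rest are excluded
    like_not_source = gamma * (1.0 - gamma) ** (n - 1)
    prior_source = 1.0 / N
    prior_outside = (N - n) / N
    num = prior_source * like_source
    return num / (num + prior_outside * like_not_source)

# Example: population of 1e6 potential sources, database of 10,000 profiles,
# random match probability 1e-6
print(posterior_source_probability(N=1_000_000, n=10_000, gamma=1e-6))
# under these assumptions this equals the closed form 1 / (1 + (N - n) * gamma)
```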

Relevance:

20.00%

Publisher:

Abstract:

Compositional random vectors are fundamental tools in the Bayesian analysis of categorical data. Many of the issues that are discussed with reference to the statistical analysis of compositional data have a natural counterpart in the construction of a Bayesian statistical model for categorical data. This note builds on the idea of cross-fertilization of the two areas recommended by Aitchison (1986) in his seminal book on compositional data. Particular emphasis is put on the problem of what parameterization to use.
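The link the note draws can be illustrated with the standard Dirichlet-multinomial model, in which the posterior over the cell probabilities of a categorical variable is itself a compositional random vector; a minimal sketch with made-up counts and a hypothetical symmetric prior:

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed counts for a categorical variable with 4 levels (illustrative data)
counts = np.array([12, 30, 7, 51])

# Symmetric Dirichlet prior on the probability vector (hypothetical concentration)
alpha_prior = np.ones_like(counts, dtype=float)

# Conjugate update: the posterior over the probability vector is again Dirichlet
alpha_post = alpha_prior + counts

# Posterior draws are compositional vectors: non-negative and summing to one
draws = rng.dirichlet(alpha_post, size=5000)
print("posterior mean:", draws.mean(axis=0))
print("all rows sum to 1:", np.allclose(draws.sum(axis=1), 1.0))
```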

Relevance:

20.00%

Publisher:

Abstract:

This study evaluates the influence of implicit and explicit schemas, both about oneself and about others, on the occurrence of positive pseudo-psychotic experiences, taking as a frame of reference two prominent explanatory models: the attributional model of self-representation and the cognitive threat-anticipation model. The analyses were carried out on a sample of 185 psychology students from the UAB. The results supported the influence of positive explicit self-schemas and negative explicit other-schemas on positive pseudo-psychotic experiences, thereby providing partial support for both models.

Relevance:

20.00%

Publisher:

Abstract:

Leprosy will continue to be a public health problem for several decades. The World Health Organization (WHO) recommends that, for treatment purposes, leprosy cases be classified as either paucibacillary or multibacillary (MB). A uniform leprosy treatment regimen would simplify treatment and halve the treatment duration for MB patients. The clinical trial of uniform multidrug therapy (U-MDT) for leprosy patients (LPs) in Brazil is a randomised, open-label clinical trial to evaluate whether the effectiveness of U-MDT for leprosy equals that of the regular regimen, to determine the acceptability of the U-MDT regimen and to identify prognostic factors. This paper details the clinical trial methodology and patient enrolment data. The study enrolled 858 patients at two centres, and 78.4% of participants were classified as MB according to the WHO criteria. The main difficulty in evaluating a new leprosy treatment regimen is that no reliable data are available for the current treatment regimen. Relapse, reaction and impaired nerve function rates have never been systematically determined, although reaction and impaired nerve function are the two major causes of the nerve damage that leads to impairments and disabilities in LPs. Our study was designed to address the lack of reliable data on the current treatment and to compare its efficacy with that of a uniform regimen.

Relevance:

20.00%

Publisher:

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) is a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. A further complication stems from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that output a single, fixed number of contributors can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is, the number of contributors, and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
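A minimal single-locus sketch of the two strategies being compared, under a purely qualitative model (no peak heights, no dropout or drop-in) with hypothetical allele frequencies and a uniform prior on N; the paper's own analyses use Bayesian networks, multiple loci and a scoring rule, none of which is reproduced here.

```python
from itertools import combinations
import numpy as np

def prob_exact_allele_set(freqs, n_contributors):
    """
    Probability that n_contributors (2n alleles drawn independently with the
    given frequencies) show exactly this set of alleles at one locus, computed
    by inclusion-exclusion over subsets of the observed alleles.
    """
    k = len(freqs)
    total = 0.0
    for size in range(1, k + 1):
        for subset in combinations(freqs, size):
            total += (-1) ** (k - size) * sum(subset) ** (2 * n_contributors)
    return max(total, 0.0)   # guard against tiny negative round-off

# Illustrative locus: 5 distinct alleles observed, with hypothetical frequencies
observed_freqs = [0.11, 0.08, 0.21, 0.05, 0.15]

# Deterministic rule: minimum number of contributors needed to 'explain' 5 alleles
n_min = int(np.ceil(len(observed_freqs) / 2))
print("minimum-number rule:", n_min)

# Probabilistic alternative: posterior over N under a uniform prior on 1..6
candidates = np.arange(1, 7)
likelihood = np.array([prob_exact_allele_set(observed_freqs, n) for n in candidates])
posterior = likelihood / likelihood.sum()
for n, p in zip(candidates, posterior):
    print(f"N = {n}: posterior = {p:.3f}")
```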