973 results for subset consistency


Relevance:

20.00%

Publisher:

Abstract:

Artemisinin-induced dormancy is a proposed mechanism for failures of monotherapy and is linked with artemisinin resistance in Plasmodium falciparum. The biological characterization and dynamics of dormant parasites are not well understood. Here we report that following dihydroartemisinin (DHA) treatment in vitro, a small subset of morphologically dormant parasites was stained with rhodamine 123 (RH), a mitochondrial membrane potential (MMP) marker, and persisted to recovery. FACS-sorted RH-positive parasites resumed growth at 10,000/well, while RH-negative parasites failed to recover at 5 million/well. Furthermore, transcriptional activity for mitochondrial enzymes was only detected in RH-positive dormant parasites. Importantly, after treating dormant parasites with different concentrations of atovaquone, a mitochondrial inhibitor, the recovery of dormant parasites was delayed or stopped. This demonstrates that mitochondrial activity is critical for the survival and regrowth of dormant parasites and that RH staining provides a means of identifying these parasites. These findings provide novel paths for studying and eradicating this dormant stage.

Relevance:

20.00%

Publisher:

Abstract:

The ultimate goal of profiling is to identify the major behavioral and personality characteristics to narrow the suspect pool. Inferences about offender characteristics can be accomplished deductively, based on the analysis of discrete offender behaviors established within a particular case. They can also be accomplished inductively, involving prediction based on abstract offender averages from group data (these methods, and the logic on which they are based, are detailed extensively in Chapters 2 and 4). As discussed, these two approaches are by no means equal.

Relevance:

20.00%

Publisher:

Abstract:

Criminal profiling is an investigative tool used around the world to infer the personality and behavioural characteristics of an offender based on their crime. Case linkage, the process of determining discrete connections between crimes committed by the same offender, is a practice that falls under the general banner of criminal profiling and has been widely criticized. Two theories, behavioural consistency and the homology assumption, are examined, and their impact on profiling in general and case linkage specifically is discussed...

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND Endometriosis is a heritable common gynaecological condition influenced by multiple genetic and environmental factors. Genome-wide association studies (GWASs) have proved successful in identifying common genetic variants of moderate effects for various complex diseases. To date, eight GWAS and replication studies from multiple populations have been published on endometriosis. In this review, we investigate the consistency and heterogeneity of the results across all the studies and their implications for an improved understanding of the aetiology of the condition. METHODS Meta-analyses were conducted on four GWASs and four replication studies including a total of 11 506 cases and 32 678 controls, and on the subset of studies that investigated associations for revised American Fertility Society (rAFS) Stage III/IV disease, including 2859 cases. The datasets included 9039 cases and 27 343 controls of European (Australia, Belgium, Italy, UK, USA) and 2467 cases and 5335 controls of Japanese ancestry. Fixed-effects and Han and Eskin random-effects models, together with heterogeneity statistics (Cochran's Q test), were used to investigate the evidence for the nine reported genome-wide significant loci across datasets and populations. RESULTS Meta-analysis showed that seven of the nine loci had consistent directions of effect across studies and populations, and six of the nine remained genome-wide significant (P < 5 × 10⁻⁸), including rs12700667 on 7p15.2 (P = 1.6 × 10⁻⁹), rs7521902 near WNT4 (P = 1.8 × 10⁻¹⁵), rs10859871 near VEZT (P = 4.7 × 10⁻¹⁵), rs1537377 near CDKN2B-AS1 (P = 1.5 × 10⁻⁸), rs7739264 near ID4 (P = 6.2 × 10⁻¹⁰) and rs13394619 in GREB1 (P = 4.5 × 10⁻⁸). In addition to these six loci, two showed borderline genome-wide significant associations with Stage III/IV endometriosis, namely rs1250248 in FN1 (P = 8 × 10⁻⁸) and rs4141819 on 2p14 (P = 9.2 × 10⁻⁸). Two independent intergenic loci, rs4141819 and rs6734792 on chromosome 2, showed significant evidence of heterogeneity across datasets (P < 0.005). Eight of the nine loci had stronger effect sizes among Stage III/IV cases, implying that they are likely to be implicated in the development of moderate to severe, or ovarian, disease. While three of the nine loci were intergenic, the remainder were in or near genes with known functions of biological relevance to endometriosis, ranging from roles in developmental pathways to cellular growth and carcinogenesis. CONCLUSIONS Our meta-analysis shows remarkable consistency in endometriosis GWAS results across studies, with little evidence of population-based heterogeneity. The results also show that the phenotypic classifications used in GWAS to date have been limited. The stronger associations with Stage III/IV disease observed for most loci emphasize the importance of including detailed sub-phenotype information in future studies. Functional studies in relevant tissues are needed to understand the effect of the variants on downstream biological pathways.
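As a rough illustration of the machinery involved, the hypothetical Python sketch below performs an inverse-variance fixed-effects meta-analysis with Cochran's Q heterogeneity test for a single variant; the effect sizes are made up, and the review's actual pipeline (including the Han and Eskin random-effects model) is not reproduced here.

```python
# Hypothetical sketch, not the review's analysis code: inverse-variance
# fixed-effects meta-analysis of per-study log odds ratios, with Cochran's Q
# test for between-study heterogeneity.
import numpy as np
from scipy import stats

def fixed_effects_meta(beta, se):
    """Combine per-study effect sizes (log OR) and their standard errors."""
    beta, se = np.asarray(beta, float), np.asarray(se, float)
    w = 1.0 / se**2                                  # inverse-variance weights
    beta_pooled = np.sum(w * beta) / np.sum(w)
    se_pooled = np.sqrt(1.0 / np.sum(w))
    p = 2 * stats.norm.sf(abs(beta_pooled / se_pooled))   # two-sided p-value
    q = np.sum(w * (beta - beta_pooled)**2)               # Cochran's Q statistic
    p_het = stats.chi2.sf(q, df=len(beta) - 1)            # heterogeneity p-value
    return beta_pooled, se_pooled, p, q, p_het

# Illustrative (invented) effect sizes for one SNP across four studies.
beta = [0.12, 0.10, 0.15, 0.08]
se = [0.03, 0.04, 0.05, 0.06]
print(fixed_effects_meta(beta, se))
```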

Relevance:

20.00%

Publisher:

Abstract:

Given that there is increasing recognition of the effect that submillimetre changes in collimator position can have on radiotherapy beam dosimetry, this study aimed to evaluate the potential variability in small-field collimation that may exist between otherwise matched linacs. Field sizes and field output factors were measured using radiochromic film and an electron diode, for jaw- and MLC-collimated fields produced by eight dosimetrically matched Varian iX linacs (Varian Medical Systems, Palo Alto, USA). This study used nominal sizes from 0.6×0.6 to 10×10 cm² for jaw-collimated fields, and from 1×1 to 10×10 cm² for MLC-collimated fields, delivered from a zero (head up, beam directed vertically downward) gantry angle. Differences between the field sizes measured for the eight linacs exceeded the uncertainty of the film measurements and the repositioning uncertainty of the jaws and MLCs on one linac. The dimensions of fields defined by MLC leaves were more consistent between linacs, while also differing more from their nominal values, than fields defined by orthogonal jaws. The field output factors measured for the different linacs generally increased with increasing measured field size for the nominal 0.6×0.6 and 1×1 cm² fields, and became consistent between linacs for nominal field sizes of 2×2 cm² and larger. The inclusion in radiotherapy treatment planning system beam data of small-field output factors acquired in fields collimated by jaws (rather than the more reproducible MLCs), associated with either the nominal or the measured field sizes, should be viewed with caution. The size and reproducibility of the fields (especially the small fields) used to acquire treatment planning data should be investigated thoroughly as part of the linac or planning system commissioning process. Further investigation of these issues, using different linac models, collimation systems and beam orientations, is recommended.

Relevance:

20.00%

Publisher:

Abstract:

Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to "catastrophic fusion" in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
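One plausible reading of such a relative test is sketched below in Python with scikit-learn: a candidate range measurement is fused only if the per-point log marginal likelihood of a Gaussian process model of the surface does not degrade. The kernel choice and the per-point normalisation are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch, not the authors' code: accept a new range measurement
# only if the Gaussian process model statistically improves (here, judged by
# the per-point log marginal likelihood) when the point is fused.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def is_consistent(X, y, x_new, y_new):
    """True if adding (x_new, y_new) does not degrade the GP fit."""
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    gp_old = GaussianProcessRegressor(kernel=kernel).fit(X, y)
    gp_new = GaussianProcessRegressor(kernel=kernel).fit(
        np.vstack([X, x_new]), np.append(y, y_new))
    # Normalise by the number of points so the two likelihoods are comparable
    # (a simplifying assumption, not necessarily the paper's exact criterion).
    lml_old = gp_old.log_marginal_likelihood_value_ / len(y)
    lml_new = gp_new.log_marginal_likelihood_value_ / (len(y) + 1)
    return lml_new >= lml_old

# Toy 1-D "surface": points from one sensor, plus candidates from another.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(30, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(30)
print(is_consistent(X, y, np.array([[2.5]]), np.sin(2.5) + 0.02))  # likely True
print(is_consistent(X, y, np.array([[2.5]]), 3.0))                 # likely False
```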

Relevance:

20.00%

Publisher:

Abstract:

We report the results of two studies of aspects of the consistency of truncated nonlinear integral equation based theories of freezing: (i) We show that the self-consistent solutions to these nonlinear equations are unfortunately sensitive to the level of truncation. For the hard sphere system, if the Wertheim–Thiele representation of the pair direct correlation function is used, the inclusion of part but not all of the triplet direct correlation function contribution, as has been common, worsens the predictions considerably. We also show that the convergence of the solutions found, with respect to the number of reciprocal lattice vectors kept in the Fourier expansion of the crystal singlet density, is slow. These conclusions imply great sensitivity to the quality of the pair direct correlation function employed in the theory. (ii) We show that the direct correlation function based and the pair correlation function based theories of freezing can be cast into a form which requires the solution of isomorphous nonlinear integral equations. However, in the pair correlation function theory, the usual neglect of the influence of inhomogeneity of the density distribution on the pair correlation function is shown to be inconsistent to the lowest order in the change of density on freezing, and to lead to erroneous predictions.
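For orientation, the kind of truncated expansion at issue can be written in the standard Ramakrishnan–Yussouff form sketched below; this is a generic textbook statement of the second- and third-order terms and of the Fourier expansion of the crystal singlet density, not a reproduction of the paper's own equations.

```latex
% Generic sketch (standard Ramakrishnan--Yussouff form), not the paper's equations,
% with \Delta\rho(\mathbf{r}) = \rho(\mathbf{r}) - \rho_\ell for liquid density \rho_\ell:
\begin{align*}
\beta\,\Delta F[\rho] \simeq{}& \int \mathrm{d}\mathbf{r}\,
    \Big[\rho(\mathbf{r})\ln\tfrac{\rho(\mathbf{r})}{\rho_\ell} - \Delta\rho(\mathbf{r})\Big]
  - \tfrac{1}{2}\iint \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'\,
    c^{(2)}(|\mathbf{r}-\mathbf{r}'|;\rho_\ell)\,\Delta\rho(\mathbf{r})\,\Delta\rho(\mathbf{r}') \\
 &- \tfrac{1}{6}\iiint \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'\,\mathrm{d}\mathbf{r}''\,
    c^{(3)}(\mathbf{r},\mathbf{r}',\mathbf{r}'';\rho_\ell)\,
    \Delta\rho(\mathbf{r})\,\Delta\rho(\mathbf{r}')\,\Delta\rho(\mathbf{r}''),
\\[4pt]
% Fourier expansion of the crystal singlet density over reciprocal lattice vectors G:
\rho(\mathbf{r}) ={}& \rho_\ell\Big(1 + \eta + \sum_{\mathbf{G}\neq 0}\mu_{\mathbf{G}}\,
    e^{\,i\mathbf{G}\cdot\mathbf{r}}\Big).
\end{align*}
```

Truncating the triplet term (or keeping only part of it) and truncating the sum over reciprocal lattice vectors correspond to the two levels of truncation whose effects the abstract discusses.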

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND Control of pests in stored grain and the evolution of resistance to pesticides are serious problems worldwide. A stochastic individual-based two-locus model was used to investigate the impact of two important issues, the consistency of pesticide dosage through the storage facility and the immigration rate of the adult pest, on overall population control and avoidance of evolution of resistance to the fumigant phosphine in an important pest of stored grain, the lesser grain borer. RESULTS A very consistent dosage maintained good control for all immigration rates, while an inconsistent dosage failed to maintain control in all cases. At intermediate dosage consistency, immigration rate became a critical factor in whether control was maintained or resistance emerged. CONCLUSION Achieving a consistent fumigant dosage is a key factor in avoiding evolution of resistance to phosphine and maintaining control of populations of stored-grain pests; when the dosage achieved is very inconsistent, there is likely to be a problem regardless of immigration rate.
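For intuition only, a hypothetical toy individual-based two-locus simulation is sketched below in Python; the survival function, parameter values and mating scheme are invented for illustration and are not the authors' published model.

```python
# Hypothetical toy sketch, not the authors' model: individuals carry two loci,
# survival under fumigation depends on the resistance alleles carried and on a
# dose that fluctuates around its target (dose consistency), and susceptible
# adults immigrate each generation.
import numpy as np

rng = np.random.default_rng(42)
N0, GENERATIONS, CAPACITY = 500, 40, 5000

def simulate(dose_mean=1.0, dose_sd=0.3, immigrants=20, p_resistant=0.01):
    # Each individual holds 4 alleles (2 loci x 2 copies); True = resistance allele.
    pop = rng.random((N0, 4)) < p_resistant
    for _ in range(GENERATIONS):
        dose = max(rng.normal(dose_mean, dose_sd), 0.0)     # realised fumigant dose
        # Survival falls with dose and rises with each resistance allele carried.
        tolerance = 0.1 + 0.35 * pop.sum(axis=1)
        pop = pop[rng.random(len(pop)) < np.exp(-dose / tolerance)]
        # Immigration of (mostly) susceptible adults from untreated refuges.
        pop = np.vstack([pop, rng.random((immigrants, 4)) < p_resistant])
        # Random mating: each offspring takes one allele per locus from each parent.
        n_off = min(CAPACITY, 3 * len(pop))
        mothers = pop[rng.integers(0, len(pop), n_off)]
        fathers = pop[rng.integers(0, len(pop), n_off)]
        pick_m = rng.integers(0, 2, (n_off, 2))             # which maternal copy per locus
        pick_f = rng.integers(0, 2, (n_off, 2))             # which paternal copy per locus
        pop = np.column_stack([
            mothers[np.arange(n_off), pick_m[:, 0]],         # locus 1, maternal allele
            fathers[np.arange(n_off), pick_f[:, 0]],         # locus 1, paternal allele
            mothers[np.arange(n_off), 2 + pick_m[:, 1]],     # locus 2, maternal allele
            fathers[np.arange(n_off), 2 + pick_f[:, 1]],     # locus 2, paternal allele
        ])
    return len(pop), pop.mean()          # final size and resistance-allele frequency

print(simulate(dose_sd=0.05))   # tight dosing: population tends to stay small
print(simulate(dose_sd=0.6))    # erratic dosing: rebounds in low-dose generations tend to defeat control
```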

Relevance:

20.00%

Publisher:

Abstract:

Purpose – This paper aims to go beyond a bookkeeping approach to evolutionary analysis, whereby surviving firms are deemed better adapted and extinct firms less adapted. From discussion of the preliminary findings of research into the Hobart pizza industry, evidence is presented of the need to adopt a more traditional approach to applying evolutionary theories within organizational research. Design/methodology/approach – After a brief review of the relevant literature, the preliminary findings of research into the Hobart pizza industry are presented. Then, several evolutionary concepts that are commonplace in ecological research are introduced to help explain the emergent findings. The paper concludes with consideration given to advancing a more consistent approach to employing evolutionary theories within organizational research. Findings – The paper finds that the process of selection cannot be assumed to occur evenly across time and/or space. Within geographically small markets, different forms of selection operate in different ways and to different degrees, requiring the use of more traditional evolutionary theories to highlight the causal processes associated with population change. Research limitations/implications – The paper concludes by highlighting Geoffrey Hodgson’s Principle of Consistency. It is demonstrated that a failure to truly understand how and why a theory is used in one domain will likely result in its misuse in another domain, and that, at present, too few evolutionary concepts are employed in organisational research to ensure an appreciation of the underlying causal processes through which social change occurs. Originality/value – The concepts introduced throughout this paper, whilst not new, provide new entry points for organizational researchers intent on employing an evolutionary approach to understand the process of social change.

Relevance:

20.00%

Publisher:

Abstract:

Background: The irreversible epidermal growth factor receptor (EGFR) inhibitors have demonstrated efficacy in NSCLC patients with activating EGFR mutations, but it is unknown whether they are superior to the reversible inhibitors. Dacomitinib is an oral, small-molecule irreversible inhibitor of all enzymatically active HER family tyrosine kinases. Methods: The ARCHER 1009 (NCT01360554) and A7471028 (NCT00769067) studies randomized patients with locally advanced/metastatic NSCLC following progression with one or two prior chemotherapy regimens to dacomitinib or erlotinib. EGFR mutation testing was performed centrally on archived tumor samples. We pooled patients with exon 19 deletion and L858R EGFR mutations from both studies to compare the efficacy of dacomitinib with that of erlotinib. Results: One hundred twenty-one patients with any EGFR mutation were enrolled; 101 had activating mutations in exon 19 or 21. For patients with exon 19/21 mutations, the median progression-free survival was 14.6 months [95% confidence interval (CI) 9.0–18.2] with dacomitinib and 9.6 months (95% CI 7.4–12.7) with erlotinib [unstratified hazard ratio (HR) 0.717 (95% CI 0.458–1.124), two-sided log-rank P = 0.146]. The median survival was 26.6 months (95% CI 21.6–41.5) with dacomitinib versus 23.2 months (95% CI 16.0–31.8) with erlotinib [unstratified HR 0.737 (95% CI 0.431–1.259), two-sided log-rank P = 0.265]. Dacomitinib was associated with a higher incidence of diarrhea and mucositis in both studies compared with erlotinib. Conclusions: Dacomitinib is an active agent with efficacy comparable to erlotinib in EGFR-mutated patients. The subgroup with exon 19 deletion had favorable outcomes with dacomitinib. An ongoing phase III study will compare dacomitinib with gefitinib in first-line therapy of patients with NSCLC harboring common activating EGFR mutations (ARCHER 1050; NCT01774721). Clinical trials number: ARCHER 1009 (NCT01360554) and A7471028 (NCT00769067).

Relevance:

20.00%

Publisher:

Abstract:

Guo and Nixon proposed a feature selection method based on maximizing I(x; Y), the multidimensional mutual information between feature vector x and class variable Y. Because computing I(x; Y) can be difficult in practice, Guo and Nixon proposed an approximation of I(x; Y) as the criterion for feature selection. We show that Guo and Nixon's criterion originates from approximating the joint probability distributions in I(x; Y) by second-order product distributions. We remark on the limitations of the approximation and discuss computationally attractive alternatives to compute I(x; Y).
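To make the quantity concrete, the hypothetical Python sketch below (not taken from the paper) estimates I(x; Y) directly from discrete data via the identity I(x; Y) = H(x) + H(Y) - H(x, Y), treating the feature vector as one joint symbol; the exponential growth of that joint alphabet with the number of features is what makes direct computation difficult and motivates approximations such as Guo and Nixon's.

```python
# Hypothetical illustration, not the paper's code: a plug-in estimate of the
# multidimensional mutual information I(x; Y) for a small set of discrete features.
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of hashable symbols."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def multivariate_mi(X, y):
    """I(x; Y) = H(x) + H(Y) - H(x, Y), with x treated as one joint symbol."""
    joint_x = [tuple(row) for row in X]
    joint_xy = [row + (label,) for row, label in zip(joint_x, y)]
    return entropy(joint_x) + entropy(y) - entropy(joint_xy)

# Toy data: one feature equal to the class, one noisy copy, one pure-noise feature.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 500)
X = np.column_stack([y, y ^ rng.integers(0, 2, 500), rng.integers(0, 2, 500)])
print(multivariate_mi(X, y))          # about 1 bit: the first feature determines Y
print(multivariate_mi(X[:, 2:], y))   # about 0 bits: the noise feature alone
```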

Relevance:

20.00%

Publisher:

Abstract:

A ternary thermodynamic function has been developed based on statistico-thermodynamic considerations, with a particular emphasis on the higher-order terms indicating the effects of truncation at the various stages of the treatment. Although the truncation of a series involved in the equation introduces inconsistency, the latter may be removed by imposing various thermodynamic boundary conditions. These conditions are discussed in the paper. The present equation with higher-order terms shows that the α function of a component reduces to a quadratic function of composition at constant compositional paths involving the other two components in the system. The form of the function has been found to be representative of various experimental observations.

Relevance:

20.00%

Publisher:

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines the consistency proofs for arithmetic by Gentzen from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof in standard natural deduction has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction reduces the first component of the vector and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.