990 results for Inference process
Abstract:
This report is a summary of the feedback from the public consultation process on the current Lifeline contract and future options.
Abstract:
In the course of its complex life cycle, the parasite Schistosoma mansoni needs to adapt to distinct environments and is consequently exposed to various DNA-damaging agents. The Schistosoma genome sequencing initiative has uncovered sequences from genes and transcripts related to the process of DNA damage tolerance, such as the enzymes UBC13, MMS2, and RAD6. In the present work, we evaluate the importance of this process in different stages of the parasite's life cycle. Its importance is evidenced by expression and phylogenetic profiles, which show the evolutionary conservation of this pathway from protozoa to mammals.
Abstract:
While the US jurisprudence of the 1993 Daubert decision requires judges to question not only the methodology behind, but also the principles governing, a body of knowledge in order to qualify it as scientific, can forensic science, based on Locard's and Kirk's principles, aspire to this higher status in the courtroom? Moving away from the disputable American legal debate, this historical and philosophical study screens the relevance of the different logical epistemologies for recognizing the scientific status of forensic science. As a consequence, the authors support a call for its recognition as a science of its own, defined as the science of identifying and associating traces for investigative and security purposes, based on its fundamental principles and on the case assessment and interpretation process that follows, with its specific and relevant mode of inference.
Abstract:
Cryo-electron microscopy of vitreous sections (CEMOVIS) has recently been shown to provide images of biological specimens with unprecedented quality and resolution. Cutting the sections, however, remains the major difficulty. Here, we examine the parameters influencing the quality of the sections and analyse the resulting artefacts, in particular knife marks, compression, crevasses, and chatter. We propose a model taking into account the interplay between viscous flow and fracture. We confirm that crevasses are formed on only one side of the section, and define conditions under which they can be avoided. Chatter is an effect of irregular compression due to friction of the section on the knife edge, and conditions to prevent it are also explored. In the absence of crevasses and chatter, the bulk of the section is compressed approximately homogeneously. Within this approximation, it is possible to correct for compression in the bulk of the section by a simple linear transformation. A research program is proposed to test and refine our understanding of the sectioning process.
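The correction described in the abstract can be illustrated with a minimal sketch: assuming the bulk of a section is compressed homogeneously by a measured factor along the cutting direction, undoing it is a simple linear rescaling of coordinates. The function name and the axis convention below are illustrative, not from the paper.

```python
import numpy as np

def correct_compression(points, c, axis=0):
    """Undo homogeneous compression by factor c along one axis.

    points : (N, 2) array of section coordinates
    c      : measured compression factor (e.g. c = 0.6 means the
             section was shortened to 60% of its original length)
    axis   : the cutting direction along which compression occurred
    """
    scale = np.ones(points.shape[1])
    scale[axis] = 1.0 / c  # stretch back along the cutting direction
    return points * scale

# A section compressed to 60% along x is restored to its original extent;
# the perpendicular coordinate is left unchanged:
pts = np.array([[0.6, 1.0], [1.2, 2.0]])
print(correct_compression(pts, 0.6))  # [[1. 1.] [2. 2.]]
```

This is exactly the "simple linear transformation" regime: it is only valid within the approximation that compression is homogeneous, i.e. in the absence of crevasses and chatter.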
Abstract:
To assess the effectiveness of a multidisciplinary evaluation and referral process in a prospective cohort of general hospital patients with alcohol dependence. Alcohol-dependent patients were identified in the wards of the general hospital and its primary care center. They were evaluated and then referred to treatment by a multidisciplinary team; patients who agreed to participate in this cohort study were consecutively included and followed for 6 months. Patients not included were lost to follow-up, whereas all included patients were assessed at the time of inclusion and 2 and 6 months later by a research psychologist in order to collect standardized baseline patient characteristics, salient process features, and patient outcomes (defined as treatment adherence and abstinence). Multidisciplinary evaluation and therapeutic referral was feasible and effective, with a success rate of 43% for treatment adherence and 28% for abstinence at 6 months. Among patient characteristics, predictors of success were age over 45, not living alone, being employed, and being motivated for treatment (RAATE-A score < 18), whereas successful process characteristics included detoxification of the patient at the time of referral and a full multidisciplinary referral meeting. This multidisciplinary model of evaluation and referral of alcohol-dependent patients in a general hospital had a satisfactory level of effectiveness. Predictors of success and failure make it possible to identify subsets of patients for whom new strategies of motivation and treatment referral should be designed.
Abstract:
Personalization in e-learning allows the adaptation of contents, learning strategies and educational resources to the competencies, previous knowledge or preferences of the student. This project takes a multidisciplinary perspective on building standards-based personalization capabilities into virtual e-learning environments, focusing on the concept of the adaptive learning itinerary, using reusable learning objects as the basis of the system and relying on ontologies and semantic web technologies.
Abstract:
In this research, we analyse the contact-specific mean of the final cooperation probability, distinguishing on the one hand between contacts with household reference persons and with other eligible household members, and on the other hand between first and later contacts. Data come from two Swiss Household Panel surveys. The interviewer-specific variance is higher for first contacts, especially in the case of the reference person. For later contacts with the reference person, the contact-specific variance dominates, meaning that interaction effects and situational factors are decisive. The contact number has negative effects on the performance of contacts with the reference person, and positive effects in the case of other persons. Time elapsed since the previous contact also has negative effects in the case of reference persons. The result of the previous contact has strong effects, especially in the case of the reference person. These findings call for a quick completion of the household grid questionnaire, assigning the best interviewers to conduct the first contact. While obtaining refusals has negative effects, obtaining other contact results has only weak effects on the interviewer's subsequent contact outcome. Using the same interviewer across contacts has no positive effects.
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below the detection limit (non-detects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above the detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
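The isometric logratio (ilr) transformation named in the abstract is a standard, well-defined map from the simplex to real space. A minimal sketch, using the common sequential-binary-partition basis (the paper's exact basis choice is not stated, so this is an assumption):

```python
import numpy as np

def ilr(x):
    """Isometric logratio transform of a D-part composition.

    Maps a composition (parts summing to 1) from the simplex to
    R^(D-1), so that ordinary statistical tools valid in real space
    (distribution fitting, the bootstrap) can be applied, as the
    abstract describes. Uses a sequential-binary-partition basis.
    """
    x = np.asarray(x, dtype=float)
    D = x.size
    z = np.empty(D - 1)
    for i in range(1, D):
        gm = np.exp(np.mean(np.log(x[:i])))  # geometric mean of the first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * np.log(gm / x[i])
    return z

# A perfectly balanced composition maps (numerically) to the origin:
print(ilr([1/3, 1/3, 1/3]))  # approximately [0, 0]
```

Non-detects must be imputed (e.g. from the distribution fitted to readings above the detection limit) before the transform, since the logarithm is undefined at zero.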
Abstract:
The evolution of supervisory systems makes obtaining significant information from processes increasingly important, insofar as it simplifies the specific tasks of the supervision systems. It is therefore important to have signal-treatment tools capable of extracting elaborate information from process data. In this paper, a tool that obtains qualitative data about the trends and oscillation of signals is presented, together with an application. In this case, the tool, implemented in a computer-aided control system design (CACSD) environment, is used to provide input to an expert system for fault detection in a laboratory plant.
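The core idea of qualitative trend extraction can be sketched in a few lines: classify each interval of a signal by the sign of its first difference, with a dead-band to absorb noise. The tool described in the abstract is richer (oscillation detection, CACSD integration); the function and threshold below are purely illustrative.

```python
import numpy as np

def qualitative_trend(signal, eps=1e-3):
    """Label each sample interval of a signal as 'up', 'down', or 'steady'.

    eps is a dead-band: differences smaller than eps in magnitude are
    treated as 'steady' so that measurement noise does not produce
    spurious trend changes.
    """
    d = np.diff(np.asarray(signal, dtype=float))
    labels = np.where(d > eps, "up", np.where(d < -eps, "down", "steady"))
    return labels.tolist()

print(qualitative_trend([0.0, 0.5, 0.5, 0.2]))  # ['up', 'steady', 'down']
```

A fault-detection expert system can then reason over these symbolic labels (e.g. "level steady while inflow up") instead of raw samples.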
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single and fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, using Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N (the number of contributors) and the actual value taken by N.
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
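The probabilistic strategy can be sketched with a toy Monte Carlo model (not the paper's Bayesian-network implementation): each of N contributors carries two alleles drawn from population frequencies, the likelihood of the observed number of distinct alleles is estimated by simulation, and Bayes' theorem converts a prior over N into a posterior. All names, the uniform prior, and the single-locus simplification are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_n(k_obs, freqs, n_max=5, prior=None, sims=20_000):
    """Monte Carlo posterior over the number of contributors N.

    k_obs : number of distinct alleles observed at one locus
    freqs : population allele frequencies at that locus
    Toy model: each of N contributors contributes 2 alleles drawn
    i.i.d. from freqs; P(k_obs | N) is estimated by simulation, then
    combined with a prior over N via Bayes' theorem.
    """
    freqs = np.asarray(freqs, dtype=float)
    like = np.zeros(n_max)
    for n in range(1, n_max + 1):
        draws = rng.choice(len(freqs), size=(sims, 2 * n), p=freqs)
        k = np.array([len(set(row)) for row in draws])
        like[n - 1] = np.mean(k == k_obs)
    prior = np.full(n_max, 1.0 / n_max) if prior is None else np.asarray(prior)
    post = like * prior
    return post / post.sum()

# Four distinct alleles require at least two contributors, so the
# posterior puts zero mass on N = 1 -- the probabilistic framework
# handles such incompatibilities without an inferential impasse:
print(posterior_n(4, [0.25, 0.25, 0.25, 0.25]))
```

This illustrates why a probability distribution over N avoids the impasse described in the abstract: incompatible values of N simply receive zero (or negligible) posterior probability rather than forcing a categorical choice.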
Abstract:
Leishmania parasites expose phosphatidylserine (PS) on their surface, a process that has been associated with the regulation of the host's immune responses. In this study we demonstrate that PS exposure by metacyclic promastigotes of Leishmania amazonensis favours blood coagulation. L. amazonensis accelerates in vitro coagulation of human plasma. In addition, L. amazonensis supports the assembly of the prothrombinase complex, thus promoting thrombin formation. This process was reversed by annexin V, which blocks PS binding sites. During the blood meal, the Lutzomyia longipalpis sandfly injects saliva into the bite site; this saliva contains a series of pharmacologically active compounds that inhibit blood coagulation. Since saliva and parasites are co-injected into the host during natural transmission, we evaluated the anticoagulant properties of sandfly saliva in counteracting the procoagulant activity of L. amazonensis. Lu. longipalpis saliva reverses plasma clotting promoted by promastigotes. It also inhibits thrombin formation by the prothrombinase complex assembled either on phosphatidylcholine (PC)/PS vesicles or on L. amazonensis. Sandfly saliva inhibits factor X activation by the intrinsic tenase complex assembled on PC/PS vesicles and blocks factor Xa catalytic activity. Altogether, our results show that metacyclic promastigotes of L. amazonensis are procoagulant due to PS exposure. Notably, this effect is efficiently counteracted by sandfly saliva.