982 results for training tool
Abstract:
Diverse conditions for stimulating human mononuclear cells to release thymocyte costimulatory factors were tested for their contribution to the generation of supernatants with high titers of these monokines. Activity titers increased with LPS concentration, reaching a plateau between 1 and 10 µg/ml. Indomethacin did not modify monokine production, but the assay for thymocyte costimulatory activity was substantially affected by inhibitory substances produced by the monocytes in the absence of indomethacin. The use of nylon wool columns to trap the cells proved effective in raising cellular densities without decreasing activity titers. As a result, the yield per cell could be maintained even in the absence of serum, an important step toward the goal of purifying the bioactive factors from crude broths.
Abstract:
This technical report is a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with multiple scales and non-equivalent narratives (multiple dimensions/attributes) when defining sustainability issues. There is an unavoidable tension between a "steady-state view" (the perception of what is going on now, reflecting a PAST → PRESENT view of reality) and an "evolutionary view" (the unknown transformation to be expected in the process of becoming of the observed reality and of the observer, reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints on sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then explores external constraints on sustainability (economic development versus the environment).
The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the "bubbles disease"; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting, then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the actual system of accounting in energy statistics; (ii) a critical appraisal of the actual system of accounting in BP statistics; (iii) a critical appraisal of the actual system of accounting in Eurostat statistics. The section ends by proposing an innovative method of representing energy statistics which can prove more useful to those wishing to develop sustainability indicators.
Abstract:
Introduction: As part of the MicroArray Quality Control (MAQC)-II project, this analysis examines how the choice of univariate feature-selection methods and classification algorithms may influence the performance of genomic predictors under varying degrees of prediction difficulty represented by three clinically relevant endpoints. Methods: We used gene-expression data from 230 breast cancers (grouped into training and independent validation sets), and we examined 40 predictors (five univariate feature-selection methods combined with eight different classifiers) for each of the three endpoints. Their classification performance was estimated on the training set by using two different resampling methods and compared with the accuracy observed in the independent validation set. Results: A ranking of the three classification problems was obtained, and the performance of 120 models was estimated and assessed on an independent validation set. The bootstrapping estimates were closer to the validation performance than were the cross-validation estimates. The required sample size for each endpoint was estimated, and both gene-level and pathway-level analyses were performed on the obtained models. Conclusions: We showed that genomic predictor accuracy is determined largely by an interplay between sample size and classification difficulty. Variations on univariate feature-selection methods and choice of classification algorithm have only a modest impact on predictor performance, and several statistically equally good predictors can be developed for any given classification problem.
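The abstract above compares resampling-based performance estimates (bootstrap vs. cross-validation) against accuracy on an independent validation set. The following is a minimal sketch of how such estimates can be computed, using a synthetic stand-in dataset and one illustrative feature-selection/classifier pair — not the study's data or its 40 predictors:

```python
# Sketch: cross-validation vs. out-of-bag bootstrap accuracy estimates.
# Dataset, classifier and feature-selection step are illustrative stand-ins.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.utils import resample

X, y = make_classification(n_samples=230, n_features=500, n_informative=20,
                           random_state=0)

# Feature selection lives inside the pipeline so it is re-fit on every
# resample -- selecting features on the full data first would bias estimates.
model = make_pipeline(SelectKBest(f_classif, k=50),
                      LogisticRegression(max_iter=1000))

# 5-fold cross-validation estimate of accuracy.
cv_acc = cross_val_score(model, X, y, cv=5).mean()

# Bootstrap estimate: fit on a resample, score on the out-of-bag samples.
boot_scores = []
rng = np.random.RandomState(0)
for _ in range(20):
    idx = resample(np.arange(len(y)), random_state=rng)
    oob = np.setdiff1d(np.arange(len(y)), idx)
    model.fit(X[idx], y[idx])
    boot_scores.append(model.score(X[oob], y[oob]))
boot_acc = float(np.mean(boot_scores))
```

Either estimate would then be compared with accuracy on a held-out validation set, as the study does for each endpoint.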
Abstract:
BACKGROUND: Iron deficiency is a common and undertreated problem in inflammatory bowel disease (IBD). AIM: To develop an online tool to support treatment choice at the patient-specific level. METHODS: Using the RAND/UCLA Appropriateness Method (RUAM), a European expert panel assessed the appropriateness of treatment regimens for a variety of clinical scenarios in patients with non-anaemic iron deficiency (NAID) and iron deficiency anaemia (IDA). Treatment options included adjustment of IBD medication only, oral iron supplementation, high-/low-dose intravenous (IV) regimens, IV iron plus erythropoietin-stimulating agent (ESA), and blood transfusion. The panel process consisted of two individual rating rounds (1148 treatment indications; 9-point scale) and three plenary discussion meetings. RESULTS: The panel reached agreement on 71% of treatment indications. 'No treatment' was never considered appropriate, and repeat treatment after previous failure was generally discouraged. For 98% of scenarios, at least one treatment was appropriate. Adjustment of IBD medication was deemed appropriate in all patients with active disease. Use of oral iron was mainly considered an option in NAID and mildly anaemic patients without disease activity. IV regimens were often judged appropriate, with high-dose IV iron being the preferred option in 77% of IDA scenarios. Blood transfusion and IV+ESA were indicated in exceptional cases only. CONCLUSIONS: The RUAM revealed high agreement amongst experts on the management of iron deficiency in patients with IBD. High-dose IV iron was more often considered appropriate than other options. To facilitate dissemination of the recommendations, panel outcomes were embedded in an online tool, accessible via http://ferroscope.com/.
Abstract:
Several cytogenetic traits were tested as species-diagnostic characters in five triatomine species: Rhodnius pictipes, R. nasutus, R. robustus, Triatoma matogrossensis and T. pseudomaculata. Four of them are described for the first time. The detailed analysis of the meiotic process and the application of C-banding allowed us to identify seven cytogenetic characters which are useful for characterizing and differentiating triatomine species.
Abstract:
The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of the spectroscopic data comes from the fact that they are multivariate - a few thousand variables - and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to reveal structure in the data, find clusters, or detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence/absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, six different clusters corresponding to the different pigment compositions are observed when plotting the first two PCs, which account for 37% and 20%, respectively, of the total variance. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, yielding robust results with time-saving data treatments.
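A minimal sketch of the SNV pretreatment, variable-window selection, and PCA exploration described above, run on synthetic stand-in spectra (the index window is illustrative, not a real wavenumber axis, and the data are random rather than measured):

```python
# Sketch: SNV pretreatment + variable selection + PCA on stand-in spectra.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
spectra = rng.normal(loc=1.0, scale=0.2, size=(34, 1800))  # 34 "samples"

def snv(X):
    """Row-wise Standard Normal Variate: centre each spectrum and
    scale it by its own standard deviation."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Variable selection: keep only an informative window of the spectrum
# (index positions here stand in for the 650-1830 cm-1 range).
window = slice(0, 1180)
X = snv(spectra)[:, window]

# Exploratory PCA: scores give sample coordinates on the first PCs,
# explained_variance_ratio_ gives the variance each PC accounts for.
pca = PCA(n_components=4).fit(X)
scores = pca.transform(X)
explained = pca.explained_variance_ratio_.sum()
```

Plotting the first two columns of `scores` against each other is what reveals the cluster structure described in the abstract; HCA would be run on the same pretreated matrix.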
Abstract:
The globalization of markets, changes in the economic context and, finally, the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is commonly accepted today that these play a particularly strategic role in the organization. A company wishing to embark on a policy of managing these capitals will face various problems. Indeed, in order to manage this knowledge and these competences, a long capitalization process must be carried out, passing through various stages such as the identification, extraction and representation of knowledge and competences. Several knowledge- and competence-management methods exist for this purpose, such as MASK, CommonKADS and KOD. Unfortunately, these methods are very cumbersome to implement, are confined to certain types of knowledge and are consequently limited in the functionality they can offer. Moreover, competence management and knowledge management are treated as two separate domains, whereas it would be worthwhile to unify the two approaches into one. Indeed, competences are very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial forms of a company's knowledge, in particular for avoiding the loss of know-how or anticipating the company's future needs, for behind the competences of its staff lies the efficiency of the organization.
Moreover, competence makes it possible to describe many other organizational concepts, such as jobs, missions, projects and training courses. Unfortunately, there is no real consensus on the definition of competence, and the various existing definitions, even if fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge-management method: by their very nature, knowledge and competence are intimately linked, and such a method is therefore perfectly suited to competence management. In order to exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computable way. On this basis, we propose a methodology for building the various company repositories (repositories of competences, missions, jobs, and so on). To model these repositories we chose ontologies, because they provide coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training courses, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge management and competence management led to the realization of a tool offering numerous functionalities, such as the management of mobility areas, strategic analysis, directories and CV management.
Abstract:
Adapting object recognition to mobile robotics requires a new approach and new applications that optimize robot training in order to obtain satisfactory results. The training process is known to be long and tedious, with human intervention absolutely necessary to supervise the robot's behaviour and its progress towards the objectives. For this reason, a tool has been developed that markedly reduces the human effort required for this supervision, automating the process needed to obtain an evaluation of the results and minimizing the time wasted through human error or lack of infrastructure.
Abstract:
This report summarizes a final-year project for the degree in Computer Engineering. It explains the main reasons that motivated the project, together with examples illustrating the resulting application. The software aims to address the current need for Ground Truth data for text-segmentation algorithms on complex colour images. All the processes are explained across the different chapters, from the definition of the problem, the planning, the requirements and the design through to an illustration of the program's results and the resulting Ground Truth data.
Abstract:
Previous evidence reported by us and by other authors revealed the presence, in sera of Schistosoma mansoni-infected patients, of IgG against immunodominant antigens which are enzymes. Besides their immunological interest as possible inducers of protection, several of these enzyme antigens might also be interesting markers of infection in antibody-detecting immunocapture assays that exploit the intrinsic catalytic property of these antigens. It was thus thought important to define some enzymatic and immunological characteristics of these molecules to better exploit their use as antigens. Four different enzymes from adult worms were partially characterized in their biochemical properties and susceptibility to react with antibodies of infected patients, namely alkaline phosphatase (AKP, Mg2+, pH 9.5), type I phosphodiesterase (PDE, pH 9.5), cysteine proteinase (CP, dithiothreitol, pH 5.5) and N-acetyl-β-D-glucosaminidase (NAG, pH 5.5). AKP and PDE are distinct tegumental membrane-bound enzymes, whereas CP and NAG are soluble acid enzymes. Antibodies in infected human sera differed in their capacity to react with and to inhibit these enzyme antigens. Possibly, the specificity of the antibodies, related to the extent of homology between the parasite and host enzymes, might be partly responsible for these differences. The results are also discussed in view of the possible functional importance of these enzymes.
Abstract:
Background and Aims: The international EEsAI study group is currently developing the first activity index specific to Eosinophilic Esophagitis (EoE). None of the existing dysphagia questionnaires takes into account the consistency of the ingested food, which considerably impacts symptom presentation. Goal: To develop an EoE-specific questionnaire assessing dysphagia associated with different food consistencies. Methods: Based on patient chart reviews, an expert panel (EEsAI study group) identified internationally standardized food prototypes typically associated with EoE-related dysphagia. Food consistencies were correlated with EoE-related dysphagia, also considering potential food avoidance. This Visual Dysphagia Questionnaire (VDQ) was then tested, as a pilot, in 10 EoE patients. Results: The following 9 food consistency prototypes were identified: water, soft foods (pudding, jelly), grits, toast bread, French fries, dry rice, ground meat, raw fibrous foods (e.g. apple, carrot), and solid meat. Dysphagia was ranked on a 5-point Likert scale (0 = no difficulties, 5 = very severe difficulties, food will not pass). Severity of dysphagia in the 10 EoE patients was related to the eosinophil load and the presence of esophageal strictures. Conclusions: The VDQ will be the first EoE-specific tool for assessing dysphagia related to internationally defined food consistencies. It performed well in a pilot study and will now be further evaluated in a cohort study including 100 adult and 100 pediatric EoE patients.
Abstract:
This paper analyzes the role of standing facilities in the determination of the demand for reserves in the overnight money market. In particular, we study how the asymmetric nature of the deposit and lending facilities could be used as a powerful policy tool for the simultaneous control of prices and quantities in the market for daily funds.
Abstract:
BACKGROUND: As the diversity of the European population evolves, measuring providers' skillfulness in cross-cultural care, and understanding what contextual factors may influence it, is increasingly necessary. Given limited information about differences in cultural competency by provider role, we compared cross-cultural skillfulness between physicians and nurses working at a Swiss university hospital. METHODS: A survey on cross-cultural care was mailed in November 2010 to front-line providers in Lausanne, Switzerland. This questionnaire included some questions from the previously validated Cross-Cultural Care Survey. We compared physicians' and nurses' mean composite scores and the proportion of "3-good/4-very good" responses for nine perceived-skillfulness items (4-point Likert scale) using the validated tool. We used linear regression to examine how provider role (physician vs. nurse) was associated with composite skillfulness scores, adjusting for demographics (gender, non-French dominant language), workplace (time at institution, work unit "sensitized" to cultural care), reported cultural-competence training, and awareness of problems in cross-cultural care. RESULTS: Of 885 questionnaires sent, 368 (41.2%) were returned: 124 (33.6%) from physicians and 244 (66.4%) from nurses, reflecting the institutional distribution of providers. Physicians had better mean composite scores for perceived skillfulness than nurses (2.7 vs. 2.5, p < 0.005), and a significantly higher proportion of "good/very good" responses for 4 of the 9 items. After adjusting for explanatory variables, physicians remained more likely to have higher skillfulness (β = 0.13, p = 0.05). Among all providers, higher skillfulness was associated with perception/awareness of problems in the following areas: inadequate cross-cultural training (β = 0.14, p = 0.01) and lack of practical experience caring for diverse populations (β = 0.11, p = 0.04). 
In stratified analyses among physicians alone, having French as a dominant language (β = -0.34, p < 0.005) was negatively correlated with skillfulness. CONCLUSIONS: Overall, there is much room for cultural competency improvement among providers. These results support the need for cross-cultural skills training with an inter-professional focus on nurses, education that attunes provider awareness to the local issues in cross-cultural care, and increased diversity efforts in the work force, particularly among physicians.
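The adjusted analysis above regresses skillfulness scores on provider role plus covariates. The following is a minimal sketch of that kind of adjusted linear regression, with simulated stand-in data (the variable names, effect sizes and noise level are invented for illustration, not taken from the survey):

```python
# Sketch: adjusted association between provider role and a skillfulness
# score via ordinary least squares. Data are simulated stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 368
physician = rng.integers(0, 2, n)   # role: 1 = physician, 0 = nurse
training = rng.integers(0, 2, n)    # reported cultural-competence training
# Simulated composite score with a built-in role effect of 0.13.
score = 2.5 + 0.13 * physician + 0.10 * training + rng.normal(0, 0.3, n)

# Design matrix: intercept + role + covariate; the survey's other
# covariates (language, time at institution, ...) would be appended
# as further columns in the same way.
X = np.column_stack([np.ones(n), physician, training])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
role_coef = beta[1]   # adjusted association of physician role with score
```

The coefficient on the role column is the "β" reported in the abstract; p-values would come from the coefficient's standard error, e.g. via a statistics package rather than raw least squares.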
Abstract:
BACKGROUND: Adequate pain assessment is critical for evaluating the efficacy of analgesic treatment in clinical practice and during the development of new therapies. Yet the currently used scores of global pain intensity fail to reflect the diversity of pain manifestations and the complexity of underlying biological mechanisms. We have developed a tool for a standardized assessment of pain-related symptoms and signs that differentiates pain phenotypes independent of etiology. METHODS AND FINDINGS: Using a structured interview (16 questions) and a standardized bedside examination (23 tests), we prospectively assessed symptoms and signs in 130 patients with peripheral neuropathic pain caused by diabetic polyneuropathy, postherpetic neuralgia, or radicular low back pain (LBP), and in 57 patients with non-neuropathic (axial) LBP. A hierarchical cluster analysis revealed distinct association patterns of symptoms and signs (pain subtypes) that characterized six subgroups of patients with neuropathic pain and two subgroups of patients with non-neuropathic pain. Using a classification tree analysis, we identified the most discriminatory assessment items for the identification of pain subtypes. We combined these six interview questions and ten physical tests in a pain assessment tool that we named Standardized Evaluation of Pain (StEP). We validated StEP for the distinction between radicular and axial LBP in an independent group of 137 patients. StEP identified patients with radicular pain with high sensitivity (92%; 95% confidence interval [CI] 83%-97%) and specificity (97%; 95% CI 89%-100%). The diagnostic accuracy of StEP exceeded that of a dedicated screening tool for neuropathic pain and spinal magnetic resonance imaging. In addition, we were able to reproduce subtypes of radicular and axial LBP, underscoring the utility of StEP for discerning distinct constellations of symptoms and signs. 
CONCLUSIONS: We present a novel method of identifying pain subtypes that we believe reflect underlying pain mechanisms. We demonstrate that this new approach to pain assessment helps separate radicular from axial back pain. Beyond diagnostic utility, a standardized differentiation of pain subtypes that is independent of disease etiology may offer a unique opportunity to improve targeted analgesic treatment.
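The validation metrics reported above (sensitivity 92%, 95% CI 83%-97%; specificity 97%, 95% CI 89%-100%) are binomial proportions with confidence intervals. A minimal sketch of how such figures can be computed, here with a Wilson score interval and invented counts chosen only to be consistent with the reported rates (the study's actual counts and interval method are not given in the abstract):

```python
# Sketch: sensitivity/specificity with Wilson score 95% CIs.
# The confusion-matrix counts below are illustrative stand-ins.
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

tp, fn = 69, 6   # radicular-pain patients classified correctly / missed
tn, fp = 60, 2   # axial-pain patients classified correctly / missed

sensitivity = tp / (tp + fn)          # 69/75 = 0.92
specificity = tn / (tn + fp)
sens_ci = wilson_ci(tp, tp + fn)
spec_ci = wilson_ci(tn, tn + fp)
```

Exact (Clopper-Pearson) intervals, often used in clinical reports, would give slightly wider bounds but are computed from the same counts.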