933 results for "participatory evaluation methodology"
Abstract:
Background: Microarray data are frequently used to characterize the expression profile of a whole genome and to compare the characteristics of that genome under several conditions. Geneset analysis methods have been described previously to analyze simultaneously the expression values of several genes related by known biological criteria (metabolic pathway, pathology signature, co-regulation by a common factor, etc.); analyzing many values at once gives these methods more power to uncover the underlying biological mechanisms. Results: As several methods assume different null hypotheses, we propose to reformulate the main question that biologists seek to answer. To determine which genesets are associated with expression values that differ between two experiments, we focused on three ad hoc criteria: expression levels, the direction of individual gene expression changes (up- or down-regulation), and correlations between genes. We introduce the FAERI methodology, tailored from a two-way ANOVA to examine these criteria. The significance of the results was evaluated according to the self-contained null hypothesis, using label sampling or by inferring the null distribution from normally distributed random data. Evaluations performed on simulated data revealed that FAERI outperforms currently available methods for each type of set tested. We then applied the FAERI method to analyze three real-world datasets on hypoxia response. FAERI was able to detect more genesets than other methodologies, and the genesets selected were coherent with current knowledge of the cellular response to hypoxia. Moreover, the genesets selected by FAERI were confirmed when the analysis was repeated on two additional related datasets. Conclusions: The expression values of genesets are associated with several biological effects. The underlying mathematical structure of the genesets allows data from several genes to be analyzed at the same time. Focusing on expression levels, the direction of the expression changes, and correlations, we showed that a two-step data reduction allowed us to significantly improve the performance of geneset analysis using a modified two-way ANOVA procedure, and to detect genesets that current methods fail to detect.
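To make the evaluation strategy concrete, here is a minimal sketch of a self-contained permutation test with label sampling; the between-group statistic is a deliberately simple stand-in and does not reproduce FAERI's modified two-way ANOVA:

```python
import numpy as np

def geneset_label_sampling_pvalue(expr, labels, gene_idx, n_perm=1000, seed=0):
    """Self-contained permutation test for one geneset.

    expr: (genes x samples) expression matrix
    labels: binary condition label per sample (0/1)
    gene_idx: row indices of the genes in the set
    The statistic is a simple mean absolute between-group difference,
    used here only as a placeholder for FAERI's two-way ANOVA statistic.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    sub = expr[gene_idx, :]

    def stat(lbl):
        return np.abs(sub[:, lbl == 1].mean(axis=1)
                      - sub[:, lbl == 0].mean(axis=1)).mean()

    observed = stat(labels)
    # Label sampling: permute condition labels across samples while
    # keeping the geneset fixed (the self-contained null hypothesis).
    perm = np.array([stat(rng.permutation(labels)) for _ in range(n_perm)])
    return (1 + np.sum(perm >= observed)) / (1 + n_perm)
```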
Abstract:
OBJECTIVE: To evaluate parents' and nurses' opinions regarding the adequacy of an educational program on shaken baby syndrome: the Perinatal Shaken Baby Syndrome Prevention Program (PSBSPP). DESIGN: Qualitative and quantitative assessments in the form of interviews and questionnaires administered in French. SETTING: Two birthing institutions in Montréal, QC, Canada: a university hospital and a regional center. PARTICIPANTS: Two hundred and sixty-three parents (73.8% mothers, 26.2% fathers) received the intervention after the birth of their child, and 69 nurses administered it. METHODS: Parents' and nurses' assessments of the adequacy and relevance of the program and nurses' assessments of the training they received to administer the program were evaluated. RESULTS: Both parents and nurses supported this initiative. Most parents appreciated the usefulness of the information. Nurses believed the program was adequate, and their training to deliver the program was satisfactory. All participants reported that the program was highly relevant, especially for new parents. CONCLUSION: The Perinatal Shaken Baby Syndrome Prevention Program achieves the goals of (a) increasing parents' knowledge about infant crying, anger, and shaken baby syndrome and (b) helping parents identify coping strategies. The relevance of introducing the PSBSPP in all birthing institutions is supported. Future studies should focus on vulnerable and culturally diverse populations, and longitudinal follow-up could help determine if the PSBSPP reduces the incidence of shaken baby syndrome.
Abstract:
The aim of this pilot project was to evaluate the feasibility of assessing the deposited particle dose in the lungs by applying a dynamic light scattering-based methodology to exhaled breath condensate (EBC). In parallel, we developed and validated two analytical methods for the determination of an inflammatory biomarker (hydrogen peroxide, H2O2) and a lipoperoxidation biomarker (malondialdehyde, MDA) in exhaled breath condensate. Finally, these methods were used to assess the particle dose and the consecutive inflammatory effect in healthy non-smoking subjects exposed to environmental tobacco smoke under controlled conditions.
Abstract:
OBJECTIVES: The Swiss AIDS prevention strategy has been subject to a continuous process of evaluation for the past 12 years. This paper describes the conceptual approach, methodology, results obtained and contribution to policy-making of that evaluation. DESIGN: The evaluation is ongoing, global with respect to all components of the strategy, and utilization-focused. Each successive phase of the evaluation has included 10-20 studies centred on aspects of process, outcome or environmental context. Findings are synthesized at the end of each phase. METHODS: Both quantitative and qualitative methods are used. Studies generally have one of three functions within the overall evaluation: assessment of trends through surveys or other types of repeated studies; evaluation of specific areas through a series of studies from different viewpoints; and in-depth investigation or rapid assessment through one-off studies. Various methods of triangulation are used to validate findings. RESULTS: The evaluation has allowed for: the observation of behavioural change in different populations; the availability of scientific data in controversial fields such as drug-use policy; and an understanding of the diversity of public appropriation of prevention messages. Recommendations are regularly formulated and have been used by policy-makers and field workers for strategy development. CONCLUSIONS: The global approach adopted corresponds well to the evaluation requirements of an integrated long-term prevention strategy. The cost is low relative to the extent of information provided. Such an evaluation cannot, however, address the question of a causal relationship between the strategy and the observed changes. The evaluation has contributed to the development of a culture of evaluation in Swiss AIDS prevention more generally.
Abstract:
Today a typical embedded system (e.g., a mobile phone) requires high performance to carry out tasks such as real-time encoding/decoding; it must consume little energy so as to run for hours or days on lightweight batteries; it must be flexible enough to integrate multiple applications and standards in a single device; and it must be designed and verified within a short time despite increasing complexity. These challenges drive designers toward new innovations in architectures and design methodologies. Coarse-grained reconfigurable architectures (CGRAs) are emerging as potential candidates to overcome these difficulties, and several architectures of this type have been presented in recent years. Their coarse granularity greatly reduces delay, area, power consumption and configuration time compared with FPGAs; on the other hand, compared with traditional coarse-grained programmable processors, their abundant computational resources allow them to achieve a high degree of parallelism and efficiency. Nevertheless, existing CGRAs have seen little practical adoption, mainly because of the great difficulty of programming such complex architectures. ADRES is a new CGRA designed by the Interuniversity Micro-Electronics Center (IMEC). It combines a very-long instruction word (VLIW) processor and a coarse-grained array, offering two different execution options in the same physical device. Among its advantages are high performance, little communication overhead and ease of programming. Finally, ADRES is a template rather than a concrete architecture: with the help of the DRESC (Dynamically Reconfigurable Embedded System Compiler) compiler, it is possible to find better or application-specific architectures. This work presents the implementation of an MPEG-4 encoder for ADRES. It shows the evolution of the code towards a good implementation for a given architecture, and it presents the main characteristics of ADRES and its compiler (DRESC). The objectives are to minimize the number of cycles (time) needed to run the MPEG-4 encoder and to examine the difficulties of working in the ADRES environment. The results show that cycle counts are reduced by 67% when comparing the initial and final code in VLIW mode, and by 84% when comparing the initial code in VLIW mode with the final code in CGA mode.
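As a quick arithmetic check (not additional data from the thesis), the reported cycle reductions translate into overall speedups as follows:

```python
# A cycle-count reduction r corresponds to a speedup of 1 / (1 - r).
for label, r in [("VLIW initial -> VLIW final", 0.67),
                 ("VLIW initial -> CGA final", 0.84)]:
    print(f"{label}: {1 / (1 - r):.2f}x speedup")
# VLIW initial -> VLIW final: 3.03x speedup
# VLIW initial -> CGA final: 6.25x speedup
```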
Abstract:
The research reported in this series of articles aimed (1) to automate the search of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and that they be compared in an objective and automated way; the latter requirement follows from the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
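As a rough illustration of the automated comparison and probabilistic evaluation described above, the sketch below pairs a correlation score with a score-based likelihood ratio; the metric, the two-Gaussian model and all parameter values are illustrative assumptions, not the published algorithms:

```python
import numpy as np
from scipy.stats import norm

def profile_similarity(a, b):
    """Pearson correlation between two digitized HPTLC intensity
    profiles (an illustrative metric; the papers test several algorithms)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.corrcoef(a, b)[0, 1]

def likelihood_ratio(score, same_mu=0.95, same_sd=0.03,
                     diff_mu=0.60, diff_sd=0.15):
    """Score-based LR: density of the score under a 'same ink source'
    model divided by its density under a 'different sources' model.
    The four parameters would be estimated from a reference ink
    collection; the values here are placeholders."""
    return norm.pdf(score, same_mu, same_sd) / norm.pdf(score, diff_mu, diff_sd)

# Usage: LR > 1 supports a common source; LR < 1 supports different sources.
questioned = np.random.rand(100)
reference = questioned + np.random.normal(0, 0.05, 100)
print(likelihood_ratio(profile_similarity(questioned, reference)))
```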
Abstract:
EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which should be rejected or cleaned up by the practitioner. Since manual screening of human EEGs is inherently error prone and might induce experimental bias, automatic artifact detection is an issue of importance: it is the best guarantee of objective and clean results. We present a new approach, based on the time-frequency shape of muscular artifacts, to achieve reliable and automatic scoring. This methodology makes it possible to evaluate the impact of muscular activity on the signal while keeping the emphasis on the analysis of EEG activity. The method is used to discriminate evoked potentials from several types of recorded muscular artifacts, with a sensitivity of 98.8% and a specificity of 92.2%. Automatic cleaning of EEG data is then successfully achieved using this method combined with independent component analysis. The outcome of the automatic cleaning is compared with the Slepian multitaper spectrum-based technique introduced by Delorme et al (2007 Neuroimage 34 1443-9).
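A minimal sketch of time-frequency scoring of muscular contamination, in the spirit of the approach described; the band edges, window length and threshold are assumptions rather than the published method's parameters:

```python
import numpy as np
from scipy.signal import spectrogram

def muscular_artifact_score(eeg, fs=256.0, band=(30.0, 100.0)):
    """Fraction of spectral power in a high-frequency band, per time
    window. Muscle (EMG) activity is broadband and strongest above
    roughly 30 Hz, so windows with a high ratio are flagged.
    Band edges are illustrative assumptions."""
    f, t, S = spectrogram(eeg, fs=fs, nperseg=int(fs))
    hi = (f >= band[0]) & (f <= band[1])
    ratio = S[hi].sum(axis=0) / S.sum(axis=0)
    return t, ratio

# Usage: flag windows whose high-frequency power ratio exceeds a threshold.
fs = 256.0
eeg = np.random.randn(int(10 * fs))  # 10 s of toy data
t, ratio = muscular_artifact_score(eeg, fs)
flagged = t[ratio > 0.5]  # threshold chosen for illustration only
```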
Abstract:
The objective of this work was to develop backpropagation neural network models to estimate solar radiation from extraterrestrial radiation data, daily temperature range, precipitation, cloudiness and relative sunshine duration. Data from Córdoba, Argentina, were used for development and validation. The agreement between observed values and the estimates produced by the neural networks was assessed for different combinations of inputs. The estimates showed root mean square errors between 3.15 and 3.88 MJ m⁻² d⁻¹, the latter corresponding to the model that computes radiation using only precipitation and daily temperature range. All models fit the seasonal pattern of solar radiation well. These results indicate that the methodology performs adequately and is pertinent for estimating complex phenomena such as solar radiation.
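A minimal sketch of the modelling pipeline described, using a scikit-learn multilayer perceptron as a stand-in for the authors' backpropagation networks; the synthetic data and network size are placeholders, not the Córdoba dataset or the validated architectures:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Toy stand-in features in the spirit of the abstract: extraterrestrial
# radiation, daily temperature range, precipitation. Real inputs would
# come from the station records.
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(20, 45, n),   # extraterrestrial radiation (MJ m^-2 d^-1)
    rng.uniform(2, 20, n),    # daily temperature range (degC)
    rng.exponential(3, n),    # precipitation (mm)
])
y = 0.6 * X[:, 0] * (0.2 + 0.02 * X[:, 1]) + rng.normal(0, 2, n)  # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
print(f"RMSE: {rmse:.2f} MJ m^-2 d^-1")
```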
Abstract:
This Phase II follow-up study of IHRB Project TR-473 focused on the performance evaluation of rubblized pavements in Iowa. The primary objective was to evaluate the structural condition of existing rubblized concrete pavements across Iowa through Falling Weight Deflectometer (FWD) tests, Dynamic Cone Penetrometer (DCP) tests, visual pavement distress surveys, etc. Through backcalculation of FWD deflection data using Iowa State University's advanced layer-moduli backcalculation program, the rubblized-layer moduli were determined for various projects and compared with each other to correlate with long-term pavement performance. The AASHTO structural layer coefficient for the rubblized layer was also calculated from the rubblized-layer moduli. To validate the mechanistic-empirical (M-E) hot mix asphalt (HMA) overlay thickness design procedure developed during the Phase I study, the actual HMA overlay thicknesses from the rubblization projects were compared with the predicted thicknesses obtained from the design software. The results of this study show that rubblization is a valid option for the rehabilitation of portland cement concrete pavements in Iowa, provided the foundation is strong enough to support construction operations during the rubblization process. The M-E structural design methodology developed during Phase I can estimate the HMA overlay thickness reasonably well to achieve long-lasting performance of HMA pavements. The rehabilitation strategy is recommended for continued use in Iowa under conditions conducive to rubblization.
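For readers unfamiliar with the layer-coefficient step, the sketch below applies the AASHTO 1993 correlation for unbound granular base to a backcalculated modulus; whether this particular correlation matches the one used in the report is an assumption:

```python
import math

def aashto_base_layer_coefficient(modulus_psi):
    """AASHTO 1993 correlation for a granular base course:
    a2 = 0.249 * log10(E) - 0.977, with E in psi.
    Applying it to a rubblized PCC layer is an assumption here;
    the study derived its coefficients from FWD backcalculation."""
    return 0.249 * math.log10(modulus_psi) - 0.977

# Example: a backcalculated rubblized-layer modulus of 100,000 psi
print(round(aashto_base_layer_coefficient(100_000), 3))  # 0.268
```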
Abstract:
There are still many vintage portland cement concrete (PCC) pavements, 18 ft (5.4 m) wide and dating back to the pre-World War II era, in use today. Successive overlays have been placed to cover joints and to improve rideability. The average thickness of the existing asphalt cement concrete (ACC) along route E66 in Tama County, Iowa, was 6.13 in. (15.6 cm). The rehabilitation strategy called for widening the base using the top 3 in. (7.6 cm) of the existing ACC through a recycling process involving cold milling and mixing with additional emulsion/rejuvenator. The material was then placed into a widening trench and compacted to match the level of the milled surface. This project was undertaken to develop a methodology for widening these older pavements economically, with a finished surface capable of carrying traffic with little or no additional work.
Abstract:
ABSTRACT: BACKGROUND: The psychiatric arm of the population-based CoLaus study (PsyCoLaus) is designed to: 1) establish the prevalence of threshold and subthreshold psychiatric syndromes in the 35- to 66-year-old population of the city of Lausanne (Switzerland); 2) test the validity of postulated definitions for subthreshold mood and anxiety syndromes; 3) determine the associations between psychiatric disorders, personality traits and cardiovascular diseases (CVD); and 4) identify genetic variants that can modify the risk for psychiatric disorders and determine whether genetic risk factors are shared between psychiatric disorders and CVD. This paper presents the methods as well as the somatic and sociodemographic characteristics of the sample. METHODS: All 35- to 66-year-old persons previously selected for the population-based CoLaus survey on risk factors for CVD were asked to participate in a substudy assessing psychiatric conditions. This investigation included the Diagnostic Interview for Genetic Studies to elicit diagnostic criteria for threshold disorders according to DSM-IV and for algorithmically defined subthreshold syndromes. Complementary information was gathered on potential risk and protective factors for psychiatric disorders and migraine, and on the morbidity of first-degree family members, whereas the collection of DNA and plasma samples was part of the original somatic study (CoLaus). RESULTS: A total of 3,691 individuals completed the psychiatric evaluation (67% participation). The gender distribution of the sample did not differ significantly from that of the general population in the same age range. Although the youngest 5-year band of the cohort was underrepresented and the oldest 5-year band overrepresented, PsyCoLaus participants and individuals who refused to participate had comparable scores on the General Health Questionnaire, a self-rating instrument completed at the somatic exam. CONCLUSIONS: Despite limitations resulting from the relatively low participation in the context of a comprehensive and time-consuming investigation, the PsyCoLaus study should contribute significantly to the current understanding of psychiatric disorders and comorbid somatic conditions by: 1) establishing the clinical relevance of specific psychiatric syndromes below the DSM-IV threshold; 2) determining comorbidity between risk factors for CVD and psychiatric disorders; 3) assessing genetic variants associated with common psychiatric disorders; and 4) identifying DNA markers shared between CVD and psychiatric disorders.
Abstract:
Ultrasound scans in the mid-trimester of pregnancy are now a routine part of antenatal care in most European countries. With the assistance of registries of congenital anomalies, a study was undertaken in Europe. The objective of the study was to evaluate the prenatal detection of congenital heart defects (CHD) by routine ultrasonographic examination of the fetus. All congenital malformations suspected prenatally and all congenital malformations, including chromosome anomalies, confirmed at birth were identified from the congenital malformation registers, comprising 20 registers from the following European countries: Austria, Croatia, Denmark, France, Germany, Italy, Lithuania, Spain, Switzerland, The Netherlands, the UK and Ukraine. These registries follow the same methodology. The study period was 1996-1998; 709,030 births were covered, and 8,126 cases with congenital malformations were registered. If more than one cardiac malformation was present, the case was coded as a complex cardiac malformation. CHD were subdivided into 'isolated' when only a cardiac malformation was present and 'associated' when at least one other major extracardiac malformation was present. The associated CHD were subdivided into chromosomal, syndromic non-chromosomal and multiple. The study comprised 761 associated CHD, including 282 cases with multiple malformations, 375 cases with chromosomal anomalies and 104 cases with non-chromosomal syndromes. The proportion of prenatally diagnosed associated CHD varied with ultrasound screening policy, from 17.9% in countries without routine screening (The Netherlands and Denmark) to 46.0% in countries with one routine fetal scan and 55.6% in countries with two or three routine fetal scans. The prenatal detection rate for chromosomal anomalies was 40.3% (151/375 cases); the corresponding rates for recognized syndromes and for multiple malformations with CHD were 51.9% (54/104 cases) and 48.6% (137/282 cases), respectively. Of the 229 Down syndrome cases, 150 (65.8%) were live births. Among the syndromic cases, the detection rates for deletion 22q11, situs anomalies and the VATER association were 44.4%, 64.7% and 46.6%, respectively. In conclusion, the present study shows large regional variations in the prenatal detection rate of CHD, with the highest rates in European regions with three screening scans. Prenatal diagnosis of CHD is significantly higher when associated malformations are present. Cardiac defects affecting the size of the ventricles have the highest detection rate. The mean gestational age at discovery was 20-24 weeks for the majority of associated cardiac defects.
Abstract:
When gasoline is used to start and/or propagate a fire, inferring the source of the gasoline can establish a link between the fire and a potential source. This source inference is an interesting alternative for providing evidence in this type of event, where physical traces left by the perpetrator are rare. The main purpose of this research was to develop a GC-IRMS method for the analysis of gasoline samples, a non-routine method little investigated in forensic science, and to evaluate its potential for inferring the source of gasoline traces in comparison with the performance of GC-MS. An instrument allowing samples to be analyzed simultaneously by MS and IRMS was used in this research. An analytical method was developed, optimized and validated for this instrument. Gasoline samples from a large sampling campaign representative of the Lausanne-area market were then analyzed. Finally, the data obtained were processed and interpreted using chemometric methods. The analyses showed that the methodology, both for the MS and the IRMS component, can differentiate unweathered gasoline samples from different service stations. It was also demonstrated that each new filling of a station's tanks yields an almost unique gasoline composition. GC-MS achieves a better differentiation of samples collected at different stations, while GC-IRMS is more efficient at distinguishing samples collected after successive fillings of a single tank. These results thus indicate that the two components of the method can be complementary for the analysis of unweathered gasoline samples. The results also showed that evaporation of gasoline samples does not compromise the possibility of grouping samples from the same source by GC-MS, although it is necessary to select variables so as to eliminate those influenced by evaporation. In contrast, evaporation has such a strong influence on the isotopic composition of the samples that it is not possible, even with variable selection, to correctly group evaporated samples by GC-IRMS. Consequently, only the MS component of the methodology allows the source of evaporated gasoline samples to be inferred.
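A minimal sketch of chemometric grouping with variable selection, in the spirit of the analysis described; PCA plus hierarchical clustering and the evaporation-sensitive-variable mask are illustrative assumptions, not the thesis's actual chemometric workflow:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

def group_samples(peak_areas, stable_mask, n_groups):
    """Group gasoline samples from their chromatographic peak areas.

    peak_areas : (samples x variables) matrix of peak areas/ratios
    stable_mask: boolean mask keeping only variables judged robust
                 to evaporation (variable selection, as in the abstract)
    n_groups   : number of source groups to form (illustrative)
    """
    X = StandardScaler().fit_transform(peak_areas[:, stable_mask])
    scores = PCA(n_components=min(5, X.shape[1])).fit_transform(X)
    return AgglomerativeClustering(n_clusters=n_groups).fit_predict(scores)

# Usage on toy data: 12 samples, 20 variables, 5 of which drift with evaporation.
rng = np.random.default_rng(1)
data = rng.normal(size=(12, 20))
mask = np.ones(20, bool)
mask[:5] = False  # drop evaporation-sensitive variables
print(group_samples(data, mask, n_groups=3))
```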