16 results for Brazilian Public Examination Bill Project

at Université de Lausanne, Switzerland


Relevance:

100.00%

Abstract:

The draft of the new law on the confidentiality of personal data severely curtails medical and epidemiological research. This might be detrimental and dangerous to public health. The draft therefore has to be amended.

Relevance:

30.00%

Abstract:

OBJECTIVE: The purpose of this article is to present the specific public health indicators recently developed by EUROCAT that aim to summarize important aspects of the public health impact of congenital anomalies in a few quantitative measures. METHODS: The six indicators are: (1) congenital anomaly perinatal mortality, (2) congenital anomaly prenatal diagnosis prevalence, (3) congenital anomaly termination of pregnancy, (4) Down syndrome livebirth prevalence, (5) congenital anomaly pediatric surgery, and (6) neural tube defects (NTD) total prevalence. Data presented for this report pertained to all cases (livebirths, fetal deaths, or stillbirths after 20 weeks of gestation and terminations of pregnancy for fetal anomaly [TOPFA]) of congenital anomaly from 27 full member registries of EUROCAT that could provide data for at least 3 years during the period 2004 to 2008. Prevalence of anomalies, prenatal diagnosis, TOPFA, pediatric surgery, and perinatal mortality were calculated per 1000 births. RESULTS: The overall perinatal mortality was approximately 1.0 per 1000 births for EUROCAT registries, with almost half due to fetal deaths and the other half due to first-week deaths. There were wide variations in perinatal mortality across the registries, with the highest rates observed in Dublin and Malta, registries in countries where TOPFA are illegal, and in Ukraine. The overall perinatal mortality across EUROCAT registries decreased slightly between 2004 and 2008 due to a decrease in first-week deaths. The prevalence of TOPFA was fairly stable at about 4 per 1000 births. There were variations in livebirth prevalence of cases typically requiring surgery across the registries; however, for most registries this prevalence was between 3 and 5 per 1000 births. Prevalence of NTD decreased by about 10%, from 1.05 per 1000 in 2004 to 0.94 per 1000 in 2008. CONCLUSION: It is hoped that by publishing the data on EUROCAT indicators, the public health importance of congenital anomalies can be clearly summarized to policy makers, the need for accurate data from registries emphasized, the need for primary prevention and treatment services highlighted, and the impact of current services measured.
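
Since all six indicators are expressed per 1,000 births, the underlying arithmetic is a simple rate calculation. The sketch below is illustrative only; the helper function and the example figures are hypothetical, not EUROCAT data:

    def rate_per_1000(case_count, total_births):
        """Express a count of affected cases as a rate per 1,000 births."""
        if total_births <= 0:
            raise ValueError("total_births must be positive")
        return 1000.0 * case_count / total_births

    # Hypothetical registry: 52 perinatal deaths with a congenital anomaly among
    # 48,000 births gives an indicator of about 1.08 per 1,000 births, close to
    # the overall EUROCAT figure of roughly 1.0 quoted above.
    print(round(rate_per_1000(52, 48_000), 2))   # -> 1.08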

Relevance:

30.00%

Abstract:

We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

Relevance:

30.00%

Abstract:

Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve ink samples' analytical and search processes. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library.
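
The abstract does not describe the search algorithm itself, so the following is only a generic sketch of how a calibrated digital ink library might be queried, ranking stored reference profiles by correlation with a questioned sample. The function names and synthetic profiles are hypothetical; this is not the Secret Service software:

    import numpy as np

    def normalise(profile):
        """Scale a measured trace (e.g. a densitometric profile along an HPTLC lane)
        to zero mean and unit norm so that profiles from different plates stay comparable."""
        centred = profile - profile.mean()
        return centred / (np.linalg.norm(centred) + 1e-12)

    def rank_matches(questioned, library):
        """Return (name, correlation) pairs sorted from best to worst match."""
        q = normalise(questioned)
        scores = {name: float(np.dot(q, normalise(ref))) for name, ref in library.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Tiny synthetic example: three fake reference inks with 200-point profiles;
    # the questioned sample is a noisy copy of "ink_1" and should rank first.
    rng = np.random.default_rng(1)
    library = {f"ink_{i}": rng.random(200) for i in range(3)}
    questioned = library["ink_1"] + rng.normal(0.0, 0.05, 200)
    print(rank_matches(questioned, library)[0][0])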

Relevance:

30.00%

Abstract:

BACKGROUND: Data for trends in glycaemia and diabetes prevalence are needed to understand the effects of diet and lifestyle within populations, assess the performance of interventions, and plan health services. No consistent and comparable global analysis of trends has been done. We estimated trends and their uncertainties in mean fasting plasma glucose (FPG) and diabetes prevalence for adults aged 25 years and older in 199 countries and territories. METHODS: We obtained data from health examination surveys and epidemiological studies (370 country-years and 2·7 million participants). We converted systematically between different glycaemic metrics. For each sex, we used a Bayesian hierarchical model to estimate mean FPG and its uncertainty by age, country, and year, accounting for whether a study was nationally, subnationally, or community representative. FINDINGS: In 2008, global age-standardised mean FPG was 5·50 mmol/L (95% uncertainty interval 5·37-5·63) for men and 5·42 mmol/L (5·29-5·54) for women, having risen by 0·07 mmol/L and 0·09 mmol/L per decade, respectively. Age-standardised adult diabetes prevalence was 9·8% (8·6-11·2) in men and 9·2% (8·0-10·5) in women in 2008, up from 8·3% (6·5-10·4) and 7·5% (5·8-9·6) in 1980. The number of people with diabetes increased from 153 (127-182) million in 1980, to 347 (314-382) million in 2008. We recorded almost no change in mean FPG in east and southeast Asia and central and eastern Europe. Oceania had the largest rise, and the highest mean FPG (6·09 mmol/L, 5·73-6·49 for men; 6·08 mmol/L, 5·72-6·46 for women) and diabetes prevalence (15·5%, 11·6-20·1 for men; and 15·9%, 12·1-20·5 for women) in 2008. Mean FPG and diabetes prevalence in 2008 were also high in south Asia, Latin America and the Caribbean, and central Asia, north Africa, and the Middle East. Mean FPG in 2008 was lowest in sub-Saharan Africa, east and southeast Asia, and high-income Asia-Pacific. In high-income subregions, western Europe had the smallest rise, 0·07 mmol/L per decade for men and 0·03 mmol/L per decade for women; North America had the largest rise, 0·18 mmol/L per decade for men and 0·14 mmol/L per decade for women. INTERPRETATION: Glycaemia and diabetes are rising globally, driven both by population growth and ageing and by increasing age-specific prevalences. Effective preventive interventions are needed, and health systems should prepare to detect and manage diabetes and its sequelae. FUNDING: Bill & Melinda Gates Foundation and WHO.
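
As a hedged illustration of the "age-standardised" figures reported above: direct standardisation weights age-specific prevalences by a fixed standard population. The age bands, weights, and prevalences in this sketch are invented, and the paper's actual Bayesian hierarchical model is far more involved:

    def age_standardised_prevalence(prevalence_by_band, standard_weights):
        """Weight age-specific prevalences by a standard population (weights sum to 1)."""
        assert abs(sum(standard_weights.values()) - 1.0) < 1e-9
        return sum(prevalence_by_band[band] * w for band, w in standard_weights.items())

    prevalence = {"25-44": 0.04, "45-64": 0.11, "65+": 0.19}   # hypothetical age-specific prevalences
    weights = {"25-44": 0.50, "45-64": 0.35, "65+": 0.15}      # hypothetical standard population weights
    print(f"{age_standardised_prevalence(prevalence, weights):.3f}")   # -> 0.087, i.e. 8.7%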

Relevance:

30.00%

Abstract:

Geological research on the Mediterranean region is presently characterized by the transition from disciplinary to multidisciplinary research, as well as from national to international investigations. In order to synthesize and integrate the vast disciplinary and national datasets which are available, it is necessary to implement maximum interaction among geoscientists of different backgrounds. The creation of project-oriented task forces in universities and other research institutions, as well as the development of large international cooperation programs, is instrumental in pursuing such a multidisciplinary and supranational approach. The TRANSMED Atlas, an official publication of the 32nd International Geological Congress (Florence 2004), is the result of an international scientific cooperation program which brought together for over two years sixty-three structural geologists, geophysicists, marine geologists, petrologists, sedimentologists, stratigraphers, paleogeographers, and petroleum geologists coming from eighteen countries and working for the petroleum industry, academia, and other institutions, both public and private. The TRANSMED Atlas provides an updated, synthetic, and coherent portrayal of the overall geological-geophysical structure of the Mediterranean domain and the surrounding areas. The initial stimulus for the Atlas came from the realization of the extremely heterogeneous nature of the existing geological-geophysical data on this domain. These data have been gathered by universities, oil companies, geological surveys and other institutions in several countries, often using different procedures and standards. In addition, many of these data are written in languages and published in outlets that are not readily accessible to the general international reader. By synthesizing and integrating a wealth of preexisting and new data derived from surficial geology, seismic sections at various scales, and mantle tomographies, the TRANSMED Atlas provides for the first time a coherent geological overview of the Mediterranean region and represents an ideal springboard for future studies.

Relevance:

30.00%

Abstract:

This paper aims to provide empirical support for the use of the principal-agent framework in the analysis of the public sector and public policies. After reviewing the different conditions to be met for a relevant analysis of the relationship between population and government using principal-agent theory, our paper focuses on the assumption of conflicting goals between the principal and the agent. A principal-agent analysis assumes in effect that inefficiencies may arise because principal and agent pursue different goals. Using data collected during an amalgamation project involving two Swiss municipalities, we show the existence of a gap between the goals of the population and those of the government. Consequently, inefficiencies as predicted by the principal-agent model may arise during the implementation of a public policy, in this case an amalgamation project. In a context of direct democracy where policies are regularly subjected to referendum, the conflict of goals may even lead to a total failure of the policy at the polls.

Relevance:

30.00%

Abstract:

Computed tomography (CT) is an imaging technique whose use has grown steadily since its introduction in the early 1970s. In the clinical environment, this imaging system has become a gold standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure that the benefit-risk balance remains in the patient's favor, it is important to balance image quality and dose and to avoid delivering doses that do not contribute to the diagnosis.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases that require several follow-up examinations over the patient's lifetime. Children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults; for this population, the risk of developing a radiation-induced cancer, whose latency period can exceed 20 years, is significantly higher. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols so that the patient is not irradiated unnecessarily. CT technology has been advancing at a rapid pace, and since 2009 new iterative image reconstruction techniques, known as statistical iterative reconstructions, have been introduced to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose delivered during CT examinations of children and young adults as much as possible while preserving diagnostic image quality, in order to propose optimized protocols.

The optimization step requires evaluating both the delivered dose and the image quality needed for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical" approach, computes physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this technique has limitations because it does not take the radiologists' perception into account, it enables a simple and rapid characterization of certain image properties. The second approach, called the "clinical" approach, is based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists involved in the assessment step were asked to score the diagnostic quality of these structures using a simple rating scale. This approach is more demanding to implement and time-consuming, but it has the advantage of being very close to the radiologists' practice and can be considered the reference method.

Among the main results, this work showed that the statistical iterative reconstructions studied in the clinic (ASIR and VEO) have a strong potential to reduce CT dose (by up to 90%). However, by the way they operate, they modify the appearance of the image, producing a change in texture that could affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also shows that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work, together with the tools developed, can guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
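
The "physical" approach above relies on metrics such as the noise power spectrum (NPS), whose shift toward lower frequencies is one way the texture change introduced by iterative reconstructions shows up. The following is a minimal, simplified sketch of a 2D NPS estimate from uniform-phantom ROIs (assuming NumPy; the detrending and normalisation are cruder than standard practice, and the pixel size and synthetic data are placeholders):

    import numpy as np

    def nps_2d(rois, pixel_size_mm):
        """rois: stack of N square ROIs with shape (N, n, n) from a uniform phantom region."""
        n = rois.shape[-1]
        spectra = []
        for roi in rois:
            noise = roi - roi.mean()                    # crude detrending: subtract the ROI mean
            dft = np.fft.fftshift(np.fft.fft2(noise))
            spectra.append((np.abs(dft) ** 2) * pixel_size_mm ** 2 / (n * n))
        return np.mean(spectra, axis=0)                 # ensemble average over ROIs

    # Synthetic white-noise example: the estimated NPS is roughly flat, whereas
    # statistical iterative reconstructions typically shift noise power toward
    # lower spatial frequencies (the texture change discussed above).
    rng = np.random.default_rng(0)
    rois = rng.normal(0.0, 10.0, size=(32, 64, 64))
    print(nps_2d(rois, pixel_size_mm=0.5).shape)        # -> (64, 64)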

Relevance:

30.00%

Abstract:

Cardiovascular diseases (CVD) remain the main cause of morbidity and mortality in our society. CoLaus is a population-based health examination survey started in 2003 in Lausanne in order to assess: 1. the prevalence of cardiovascular risk factors, 2. new genetic determinants of cardiovascular risk factors such as hypertension, 3. the association of mood disorders with the incidence of cardiovascular events, and 4. trends in the prevalence of cardiovascular risk factors. To do so, over 6000 subjects (aged 35-75 years) provided data on CVD risk factors. Herein we provide preliminary results of this study, in particular on classical risk factors such as hypertension, obesity and diabetes. Implications and perspectives of this population-based study for public health and genetic studies are also discussed.

Relevance:

30.00%

Abstract:

The GENCODE Consortium aims to identify all gene features in the human genome using a combination of computational analysis, manual annotation, and experimental validation. Since the first public release of this annotation data set, few new protein-coding loci have been added, yet the number of alternative splicing transcripts annotated has steadily increased. The GENCODE 7 release contains 20,687 protein-coding and 9640 long noncoding RNA loci and has 33,977 coding transcripts not represented in UCSC genes and RefSeq. It also has the most comprehensive annotation of long noncoding RNA (lncRNA) loci publicly available with the predominant transcript form consisting of two exons. We have examined the completeness of the transcript annotation and found that 35% of transcriptional start sites are supported by CAGE clusters and 62% of protein-coding genes have annotated polyA sites. Over one-third of GENCODE protein-coding genes are supported by peptide hits derived from mass spectrometry spectra submitted to Peptide Atlas. New models derived from the Illumina Body Map 2.0 RNA-seq data identify 3689 new loci not currently in GENCODE, of which 3127 consist of two exon models indicating that they are possibly unannotated long noncoding loci. GENCODE 7 is publicly available from gencodegenes.org and via the Ensembl and UCSC Genome Browsers.
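
As a practical aside, gene-level summaries like the locus counts quoted above can be tallied from the released annotation. The sketch below assumes the standard GENCODE GTF layout (tab-separated, attributes in the ninth column with gene_id and gene_type keys); the file name is a placeholder for whichever release is downloaded from gencodegenes.org, and biotype labels vary between releases:

    import gzip
    import re
    from collections import Counter

    def count_gene_biotypes(gtf_path):
        """Count distinct gene_ids per gene_type in a (possibly gzipped) GENCODE GTF."""
        gene_type_by_id = {}
        opener = gzip.open if gtf_path.endswith(".gz") else open
        with opener(gtf_path, "rt") as fh:
            for line in fh:
                if line.startswith("#"):
                    continue
                fields = line.rstrip("\n").split("\t")
                if len(fields) < 9:
                    continue
                gid = re.search(r'gene_id "([^"]+)"', fields[8])
                gtype = re.search(r'gene_type "([^"]+)"', fields[8])
                if gid and gtype:
                    gene_type_by_id[gid.group(1)] = gtype.group(1)
        return Counter(gene_type_by_id.values())

    # Hypothetical usage with a placeholder file name:
    # counts = count_gene_biotypes("gencode.annotation.gtf.gz")
    # print(counts.most_common(5))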

Relevance:

30.00%

Abstract:

The ASTM standards on Writing Ink Identification (ASTM 1789-04) and on Writing Ink Comparison (ASTM 1422-05) are the most up-to-date guidelines that have been published on the forensic analysis of ink. The aim of these documents is to cover most aspects of the forensic analysis of ink evidence, from the analysis of ink samples, the comparison of the analytical profile of these samples (with the aim to differentiate them or not), through to the interpretation of the result of the examination of these samples in a forensic context. Significant evolutions in the technology available to forensic scientists, in the quality assurance requirements brought onto them, and in the understanding of frameworks to interpret forensic evidence have been made in recent years. This article reviews the two standards in the light of these evolutions and proposes some practical improvements in terms of the standardization of the analyses, the comparison of ink samples, and the interpretation of ink examination. Some of these suggestions have already been included in a DHS funded project aimed at creating a digital ink library for the United States Secret Service.

Relevance:

30.00%

Abstract:

This thesis contains three parts. The first provides the theoretical foundation, tracing the history of the police from their beginnings in the early 19th century to the present day. The emphasis, however, is on the last 40 years, which gave rise to a multitude of innovations such as community, problem-oriented, hot-spot, or zero-tolerance policing. These innovations are described in detail and critically discussed. At the end of this section, I present a scheme in which all the approaches are classified as strategic or methodological innovations but united under a model called "modern policing". The most important finding of this examination is that the innovations are not competing but complementary. The second part of this work presents a unique survey on the implementation of four innovations and eight problem- and community-oriented activities in 85 Swiss police forces. This exploratory study shows that over the last 15 years Swiss police forces have increasingly adopted innovative approaches. The most frequent innovation is community policing, which has been implemented throughout the country. The results also suggest that the implementation of these innovations is mostly substantial and profound. However, there is still room for improvement, particularly in the area of problem-solving. The third section consists of a scientific evaluation of a temporary special unit of the Zurich municipal police, which for nine months fought public drug dealing and illegal prostitution in a particular neighborhood called Langstrasse. The effects of this hot-spot project were measured with police data, observations, and several population surveys. Overall, the special unit achieved a positive outcome and helped to defuse the hot-spot. Additionally, a survey conducted within the police department showed that personal attitudes towards the special unit differed widely among officers. We found significant differences between the East and West police regions, between rank-and-file and higher-ranking officers, across age groups, and depending on the officer's personal connection to the special unit. In fact, the higher the rank, the lower the age, and the closer the connection, the more positive the officers were towards the unit.

Relevance:

30.00%

Abstract:

Action research is a useful instrument for the organization of health care and the clinical governance of psychiatric institutions. What this type of research offers can be illustrated by the cohort study of migrant patients without health insurance who consulted the Department of Psychiatry of the Vaudois university medical center (CHUV) in 2008. While giving greater visibility to the psychological suffering and social distress of these patients, the study also enabled the authors to determine which clinical procedures were actually offered to these patients and how much these procedures cost the department. The small number of cases identified, as well as their uneven distribution among the different services of the department, suggests that considerable efforts must still be made to improve this population's access to public psychiatric services.

Relevance:

30.00%

Abstract:

OBJECTIVE: Candidate genes for non-alcoholic fatty liver disease (NAFLD) identified by a bioinformatics approach were examined for variant associations to quantitative traits of NAFLD-related phenotypes. RESEARCH DESIGN AND METHODS: By integrating public database text mining, trans-organism protein-protein interaction transferal, and information on liver protein expression, a protein-protein interaction network was constructed and from this a smaller isolated interactome was identified. Five genes from this interactome were selected for genetic analysis. Twenty-one tag single-nucleotide polymorphisms (SNPs), which captured all common variation in these genes, were genotyped in 10,196 Danes and analyzed for association with NAFLD-related quantitative traits, type 2 diabetes (T2D), central obesity, and WHO-defined metabolic syndrome (MetS). RESULTS: 273 genes were included in the protein-protein interaction analysis, and EHHADH, ECHS1, HADHA, HADHB, and ACADL were selected for further examination. A total of 10 nominally statistically significant associations (P<0.05) with quantitative metabolic traits were identified. The case-control study also showed associations between variation in the five genes and T2D, central obesity, and MetS, respectively. Bonferroni adjustment for multiple testing negated all associations. CONCLUSIONS: Using a bioinformatics approach, we identified five candidate genes for NAFLD. However, we failed to provide evidence of associations with major effects between SNPs in these five genes and NAFLD-related quantitative traits, T2D, central obesity, and MetS.
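
The Bonferroni step mentioned in the results is mechanically simple: with m tests, only p-values below alpha/m survive. A small sketch with invented p-values (not values from the study):

    def bonferroni_significant(p_values, alpha=0.05):
        """Return the p-values passing the Bonferroni-adjusted threshold, and that threshold."""
        threshold = alpha / len(p_values)
        return [p for p in p_values if p < threshold], threshold

    # e.g. 21 tag SNPs tested against one trait: a nominal p = 0.01 (< 0.05) no longer
    # passes the adjusted threshold of 0.05 / 21, roughly 0.0024; only p = 0.0009 survives.
    hits, thr = bonferroni_significant([0.01, 0.2, 0.0009] + [0.5] * 18)
    print(len(hits), round(thr, 4))   # -> 1 0.0024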

Relevance:

30.00%

Abstract:

The National Academies has stressed the need to develop quantifiable measures for methods that are currently qualitative in nature, such as the examination of fingerprints. Current protocols and procedures for these examinations rely heavily on a succession of subjective decisions, from the initial acceptance of evidence for probative value to the final assessment of forensic results. This project studied the concept of sufficiency associated with the decisions made by latent print examiners at the end of the various phases of the examination process. During this 2-year effort, a web-based interface was designed to capture the observations of 146 latent print examiners and trainees on 15 pairs of latent/control prints. Two main findings resulted from the study. First, the concept of sufficiency is driven mainly by the number of minutiae observed on the latent and control prints and the spatial relationships between them; the data indicate that demographics (training, certification, years of experience) and non-minutiae-based features (such as level 3 features) do not play a major role in examiners' decisions. Second, significant variability was observed in the detection and interpretation of friction ridge features at all levels of detail, as well as for factors that can influence the examination process, such as degradation, distortion, or the influence of the background and the development technique.
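
To make the first finding concrete, one simple way to look at such data is to tabulate the proportion of "sufficient" judgments as a function of the number of corresponding minutiae. The records below are invented for illustration only; they are not data from the study:

    from collections import defaultdict

    decisions = [            # (minutiae in common, examiner judged the comparison sufficient?)
        (4, False), (5, False), (6, False), (7, True), (8, True),
        (8, False), (10, True), (12, True), (12, True), (15, True),
    ]

    by_count = defaultdict(list)
    for n_minutiae, sufficient in decisions:
        by_count[n_minutiae].append(sufficient)

    for n_minutiae in sorted(by_count):
        votes = by_count[n_minutiae]
        print(n_minutiae, f"{sum(votes) / len(votes):.0%} judged sufficient")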