120 results for global trends
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND: Data for trends in glycaemia and diabetes prevalence are needed to understand the effects of diet and lifestyle within populations, assess the performance of interventions, and plan health services. No consistent and comparable global analysis of trends has been done. We estimated trends and their uncertainties in mean fasting plasma glucose (FPG) and diabetes prevalence for adults aged 25 years and older in 199 countries and territories. METHODS: We obtained data from health examination surveys and epidemiological studies (370 country-years and 2·7 million participants). We converted systematically between different glycaemic metrics. For each sex, we used a Bayesian hierarchical model to estimate mean FPG and its uncertainty by age, country, and year, accounting for whether a study was nationally, subnationally, or community representative. FINDINGS: In 2008, global age-standardised mean FPG was 5·50 mmol/L (95% uncertainty interval 5·37-5·63) for men and 5·42 mmol/L (5·29-5·54) for women, having risen by 0·07 mmol/L and 0·09 mmol/L per decade, respectively. Age-standardised adult diabetes prevalence was 9·8% (8·6-11·2) in men and 9·2% (8·0-10·5) in women in 2008, up from 8·3% (6·5-10·4) and 7·5% (5·8-9·6) in 1980. The number of people with diabetes increased from 153 (127-182) million in 1980, to 347 (314-382) million in 2008. We recorded almost no change in mean FPG in east and southeast Asia and central and eastern Europe. Oceania had the largest rise, and the highest mean FPG (6·09 mmol/L, 5·73-6·49 for men; 6·08 mmol/L, 5·72-6·46 for women) and diabetes prevalence (15·5%, 11·6-20·1 for men; and 15·9%, 12·1-20·5 for women) in 2008. Mean FPG and diabetes prevalence in 2008 were also high in south Asia, Latin America and the Caribbean, and central Asia, north Africa, and the Middle East. Mean FPG in 2008 was lowest in sub-Saharan Africa, east and southeast Asia, and high-income Asia-Pacific. 
In high-income subregions, western Europe had the smallest rise, 0·07 mmol/L per decade for men and 0·03 mmol/L per decade for women; North America had the largest rise, 0·18 mmol/L per decade for men and 0·14 mmol/L per decade for women. INTERPRETATION: Glycaemia and diabetes are rising globally, driven both by population growth and ageing and by increasing age-specific prevalences. Effective preventive interventions are needed, and health systems should prepare to detect and manage diabetes and its sequelae. FUNDING: Bill & Melinda Gates Foundation and WHO.
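The age-standardised figures reported above come from direct standardisation: age-specific prevalences are averaged using the weights of a fixed standard population, so that countries with different age structures can be compared. A minimal sketch of the calculation; the age groups, prevalences and weights below are illustrative placeholders, not the study's actual inputs:

```python
# Sketch: direct age standardisation of diabetes prevalence.
# All numbers here are illustrative, not data from the study.

def age_standardised_prevalence(age_specific, weights):
    """Weighted average of age-specific prevalences over a standard population.

    age_specific: dict age_group -> prevalence (proportion)
    weights:      dict age_group -> standard-population weight (must sum to 1)
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(age_specific[g] * weights[g] for g in weights)

# Illustrative numbers: prevalence rises steeply with age, so the
# standardised figure depends strongly on the chosen weights.
prev = {"25-44": 0.03, "45-64": 0.11, "65+": 0.20}
w = {"25-44": 0.50, "45-64": 0.35, "65+": 0.15}
print(round(age_standardised_prevalence(prev, w), 4))  # 0.0835
```

Because the weights are fixed, a rise in the standardised prevalence reflects rising age-specific prevalences rather than population ageing, which is exactly the distinction the abstract's interpretation draws.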
Abstract:
In most areas of the world, thyroid cancer incidence has been appreciably increasing over the last few decades, whereas mortality has steadily declined. We updated global trends in thyroid cancer mortality and incidence using official mortality data from the World Health Organization (1970-2012) and incidence data from Cancer Incidence in Five Continents (1960-2007). Male mortality declined in all the major countries considered, with annual percent changes (APCs) around -2% to -3% over the last decades; only in the United States did mortality decline up to the mid-1980s and increase thereafter. Similarly, mortality in women declined in most countries considered, with APCs around -2% to -5% over the last decades, with the exception of the UK, the United States and Australia, where mortality declined until the late 1980s/late 1990s and then levelled off (or increased). In 2008-2012, most countries had mortality rates (age-standardized, world population) between 0.20 and 0.40/100,000 in men and between 0.20 and 0.60/100,000 in women, the highest rates being in Latvia, Hungary, the Republic of Moldova and Israel (over 0.40/100,000) for men and in Ecuador, Colombia and Israel (over 0.60/100,000) for women. In most countries, a steady increase in the incidence of thyroid cancer (mainly papillary carcinomas) was observed in both sexes. The declines in thyroid cancer mortality reflect both variations in risk-factor exposure and changes in the diagnosis and treatment of the disease, while the increases in incidence are likely due to increased detection of this neoplasm over the last few decades.
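The annual percent change quoted above is conventionally obtained by regressing log-transformed rates on calendar year and converting the slope: APC = (e^slope - 1) × 100. A minimal sketch with ordinary least squares; the rate series below is synthetic, constructed only to illustrate the formula, not the paper's data:

```python
# Sketch: annual percent change (APC) from a log-linear fit of rates on year.
# APC = (exp(slope) - 1) * 100, where slope is from OLS of ln(rate) on year.
import math

def apc(years, rates):
    """APC implied by an ordinary-least-squares fit of ln(rate) on year."""
    n = len(years)
    log_r = [math.log(r) for r in rates]
    my = sum(years) / n
    ml = sum(log_r) / n
    slope = sum((y - my) * (l - ml) for y, l in zip(years, log_r)) / \
            sum((y - my) ** 2 for y in years)
    return (math.exp(slope) - 1.0) * 100.0

# Synthetic check: a rate declining by exactly 3% per year recovers APC = -3.
years = list(range(2000, 2011))
rates = [0.40 * 0.97 ** (y - 2000) for y in years]
print(round(apc(years, rates), 2))  # -3.0
```

Joinpoint regression (used in the gastric cancer analysis below) extends this idea by also estimating the years at which the slope changes, fitting a separate APC to each segment.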
Abstract:
Past and current climate change has already induced drastic biological changes. We need projections of how future climate change will further impact biological systems. Modeling is one approach to forecast future ecological impacts, but requires data for model parameterization. As collecting new data is costly, an alternative is to use the increasingly available georeferenced species occurrence and natural history databases. Here, we illustrate the use of such databases to assess climate change impacts on mountain flora. We show that these data can be used effectively to derive dynamic impact scenarios, suggesting upward migration of many species and possible extinctions when no suitable habitat is available at higher elevations. Systematically georeferencing all existing natural history collections data in mountain regions could allow a larger assessment of climate change impact on mountain ecosystems in Europe and elsewhere.
Abstract:
Spatial data on species distributions are available in two main forms: point locations and distribution maps (polygon ranges and grids). The former are often temporally and spatially biased, and too discontinuous, to be useful (untransformed) in spatial analyses. A variety of modelling approaches are used to transform point locations into maps. We discuss the attributes that point location data and distribution maps must satisfy in order to be useful in conservation planning. We recommend that before point location data are used to produce and/or evaluate distribution models, the dataset should be assessed against a set of criteria, including sample size, age of data, environmental/geographical coverage, independence, accuracy, time relevance and (often forgotten) representation of areas of permanent and natural presence of the species. Distribution maps must satisfy additional attributes if used for conservation analyses and strategies, including minimizing commission and omission errors, credibility of the source/assessors and availability for public screening. We review currently available databases for mammals globally and show that they are highly variable in complying with these attributes. The heterogeneity and weakness of spatial data seriously constrain their utility for global- and sub-global-scale conservation analyses.
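The pre-use screening recommended above can be made operational as a simple checklist run over a dataset's metadata. The sketch below is a hypothetical illustration: the field names, thresholds and subset of criteria are assumptions for the example, not values proposed by the authors:

```python
# Sketch of a pre-use screening of point location data against a subset of
# the recommended criteria. Field names and thresholds are hypothetical.

def screen_point_dataset(meta,
                         min_records=50,
                         max_age_years=20,
                         min_coverage=0.7):
    """Return the list of failed criteria; an empty list means the dataset
    passes this (partial) screen."""
    failures = []
    if meta["n_records"] < min_records:
        failures.append("sample size")
    if meta["oldest_record_age"] > max_age_years:
        failures.append("age of data")
    if meta["range_coverage"] < min_coverage:
        failures.append("environmental/geographical coverage")
    if not meta["georeferencing_checked"]:
        failures.append("accuracy")
    return failures

# Example: plenty of records, but old and covering little of the range.
dataset = {"n_records": 120, "oldest_record_age": 35,
           "range_coverage": 0.55, "georeferencing_checked": True}
print(screen_point_dataset(dataset))
# ['age of data', 'environmental/geographical coverage']
```

Criteria such as independence, time relevance and representation of areas of permanent presence require expert judgement and are deliberately left out of this mechanical sketch.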
Abstract:
New data confirm that antidepressants have a globally protective effect on the risk of suicide, even if they very moderately and transiently increase suicide attempts, particularly in young people. We also present recent data on the following topics. (1) The main results of the STAR*D study, an outpatient study of depression treatment following a very complex algorithm. Nearly all head-to-head comparisons between treatments showed no significant difference, and rates of clinical remission were low for patients who had not responded to their first two treatments. (2) The reintroduction of sertindole, an atypical antipsychotic: cardiac monitoring is necessary but should not prevent its use as a second-line drug. (3) The potential efficacy of dawn simulation and high-density negative air ionization in seasonal depression.
Abstract:
OBJECTIVES: The Swiss AIDS prevention strategy has been subject to a continuous process of evaluation for the past 12 years. This paper describes the conceptual approach, methodology, results obtained and contribution to policy-making of that evaluation. DESIGN: The evaluation is ongoing, global with respect to all components of the strategy, and utilization-focused. Each successive phase of the evaluation has included 10-20 studies centred on aspects of process, outcome or environmental context. Findings are synthesized at the end of each phase. METHODS: Both quantitative and qualitative methods are used. Studies generally have one of three functions within the overall evaluation: assessment of trends through surveys or other types of repeated studies; evaluation of specific areas through a series of studies from different viewpoints; in-depth investigation or rapid assessment through one-off studies. Various methods of triangulation are used to validate findings. RESULTS: The evaluation has allowed for: the observation of behavioural change in different populations; the availability of scientific data in controversial fields such as drug-use policy; an understanding of the diversity of public appropriation of prevention messages. Recommendations are regularly formulated and have been used by policy-makers and field workers for strategy development. CONCLUSIONS: The global approach adopted corresponds well to the evaluation requirements of an integrated long-term prevention strategy. Cost is low relative to the extent of information provided. Such an evaluation cannot, however, address the question of a causal relationship between the strategy and the observed changes. The evaluation has contributed to the development of a culture of evaluation in Swiss AIDS prevention more generally.
Abstract:
An enormous burst of interest in the public health burden of chronic disease in Africa has emerged as a consequence of efforts to estimate global population health. Detailed estimates are now published for Africa as a whole and for each country on the continent. These data have formed the basis for warnings about sharp increases in cardiovascular disease (CVD) in the coming decades. In this essay we briefly examine the trajectory of social development on the continent and its consequences for the epidemiology of CVD and potential control strategies. Since full vital registration has only been implemented in segments of South Africa and the island nations of Seychelles and Mauritius - formally part of WHO-AFRO - mortality data are extremely limited. Numerous sample surveys have been conducted, but they often lack standardization or objective measures of health status. Trend data are even less informative. However, using the best-quality data available, age-standardized trends in CVD are downward and, in the case of stroke, sharply so. While acknowledging that the extremely limited available data cannot be used as the basis for inference to the continent, we raise the concern that general estimates based on imputation to fill in the missing mortality tables may be even more misleading. No immediate remedies to this problem can be identified; however, bilateral collaborative efforts to strengthen local educational institutions and governmental agencies rank as the highest priority for near-term development.
Abstract:
BACKGROUND: In the past century, there has been a significant rise in life expectancy in almost all regions of the world, contributing to an increasingly older population. The aging of the population, in conjunction with urbanization and industrialization, has resulted in an important epidemiological transition marked by a widespread increase in the prevalence of chronic diseases and their sequelae. Current trends suggest that the transition will have a greater impact on developing countries compared to developed countries. An adequate response to the transition requires a strong emphasis on primary prevention and adequate resource allocation.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries of information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focusing on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
Until the mid-1990s, gastric cancer was the leading cause of cancer death worldwide, although rates had been declining for several decades, and gastric cancer has become a relatively rare cancer in North America and in most of Northern and Western Europe, but not in Eastern Europe, Russia and selected areas of Central and South America or East Asia. We analyzed gastric cancer mortality in Europe and other areas of the world from 1980 to 2005 using joinpoint regression analysis, and provided updated site-specific incidence rates from 51 selected registries. Over the last decade, the annual percent change (APC) in mortality rates was around -3% to -4% for the major European countries. APCs were similar for the Republic of Korea (-4.3%), Australia (-3.7%), the USA (-3.6%), Japan (-3.5%), Ukraine (-3.0%) and the Russian Federation (-2.8%). In Latin America, the decline was less marked but constant, with APCs around -1.6% in Chile and Brazil, -2.3% in Argentina and Mexico and -2.6% in Colombia. Cancers of the fundus and pylorus are more common in high-incidence and high-mortality areas and have been declining more than cardia gastric cancer. Steady downward trends persist in gastric cancer mortality worldwide, even in middle-aged populations, and hence further appreciable declines are likely in the near future.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk. Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary. The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects. The method is based on geographic information systems, remote sensing, databases and statistical analysis.
It required a large amount of data (1.7 Tb on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors; this was done for several hazards (e.g. floods, tropical cyclones, earthquakes and landslides). Two different multiple-risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases. Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation exacerbates landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and gave an assessment of the remaining ice volume as well as scenarios of possible evolution. These results were presented to different audiences, including 160 governments. The results and data generated are made available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
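The decomposition above, with risk arising from the intersection of hazard, exposure and vulnerability, can be sketched as a toy multiplicative index. The functional form, the log compression and the country values below are illustrative assumptions for the example, not the study's actual model or data:

```python
# Sketch of a multiplicative risk index: risk ~ hazard x exposure x vulnerability.
# The formula shape and all values are illustrative assumptions.
import math

def risk_index(hazard_freq, exposed_pop, vulnerability):
    """Toy multiplicative index, log-transformed to compress the range so
    that countries spanning orders of magnitude remain comparable."""
    return math.log10(1.0 + hazard_freq * exposed_pop * vulnerability)

countries = {
    # name: (expected hazard events/yr, exposed population, vulnerability weight)
    "A": (2.0, 1_000_000, 0.08),  # frequent hazard, high vulnerability
    "B": (0.5, 5_000_000, 0.02),  # larger exposure, but rarer hazard
}
ranked = sorted(countries, key=lambda c: risk_index(*countries[c]), reverse=True)
print(ranked)  # country A outranks B despite its smaller exposed population
```

The abstract's finding that vulnerability weights vary by hazard type, and shrink as intensity grows, would correspond here to making the vulnerability term a hazard-specific, intensity-dependent function rather than a constant.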