991 results for Disease Data Base
Abstract:
Severe wind storms are one of the major natural hazards in the extratropics and inflict substantial economic damages and even casualties. Insured storm-related losses depend on (i) the frequency, nature and dynamics of storms, (ii) the vulnerability of the values at risk, (iii) the geographical distribution of these values, and (iv) the particular conditions of the risk transfer. It is thus of great importance to assess the impact of climate change on future storm losses. To this end, the current study employs, to our knowledge for the first time, a coupled approach, using output from high-resolution regional climate model scenarios for the European sector to drive an operational insurance loss model. An ensemble of coupled climate-damage scenarios is used to provide an estimate of the inherent uncertainties. Output of two state-of-the-art global climate models (HadAM3, ECHAM5) is used for present (1961–1990) and future climates (2071–2100, SRES A2 scenario). These serve as boundary data for two nested regional climate models with sophisticated gust parametrizations (CLM, CHRM). For validation and calibration purposes, an additional simulation is undertaken with the CHRM driven by the ERA40 reanalysis. The operational insurance model (Swiss Re) uses a European-wide damage function, an average vulnerability curve for all risk types, and contains the actual value distribution of a complete European market portfolio. The coupling between climate and damage models is based on daily maxima of 10 m gust winds, and the strategy adopted consists of three main steps: (i) development and application of a pragmatic selection criterion to retrieve significant storm events, (ii) generation of a probabilistic event set using a Monte Carlo approach in the hazard module of the insurance model, and (iii) calibration of the simulated annual expected losses against a historic loss data base. The climate models considered agree on an increase in the intensity of extreme storms in a band across central Europe (stretching from southern UK and northern France to Denmark and northern Germany and into eastern Europe). This effect increases with event strength, and rare storms show the largest climate change sensitivity, but are also beset with the largest uncertainties. Wind gusts decrease over northern Scandinavia and southern Europe. The highest intra-ensemble variability is simulated for Ireland, the UK, the Mediterranean, and parts of eastern Europe. The resulting changes in European-wide losses over the 110-year period are positive for all layers and all model runs considered and amount to 44% (annual expected loss), 23% (10-year loss), 50% (30-year loss), and 104% (100-year loss). There is a disproportionate increase in losses for rare high-impact events. The changes result from increases in both the severity and the frequency of wind gusts. Considerable geographical variability of the expected losses exists, with Denmark and Germany experiencing the largest loss increases (116% and 114%, respectively). All countries considered except Ireland (−22%) experience some loss increase. Some ramifications of these results for the socio-economic sector are discussed, and future avenues for research are highlighted. The technique introduced in this study and its application to realistic market portfolios offer exciting prospects for future research on the impact of climate change that is relevant for policy makers, scientists and economists.
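As a purely illustrative aside on the loss-layer terminology used above, the following sketch shows how an annual expected loss and 10-, 30- and 100-year losses could be derived from a probabilistic storm event set. The event rates, loss amounts and the Poisson occurrence assumption are invented for the example and are not the Swiss Re hazard module.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical probabilistic event set: each storm event type has an annual
# occurrence rate (events per year) and an associated market loss.
event_rates  = np.array([0.50, 0.20, 0.05, 0.01])   # events per year
event_losses = np.array([1e8, 5e8, 2e9, 8e9])       # loss per event

n_years = 100_000  # number of simulated years

# Draw the number of occurrences of each event in every simulated year
# (Poisson), then aggregate to an annual loss series.
counts = rng.poisson(event_rates, size=(n_years, event_rates.size))
annual_loss = counts @ event_losses

# Annual expected loss and losses for selected return periods.
print(f"annual expected loss: {annual_loss.mean():,.0f}")
for rp in (10, 30, 100):
    # the annual loss exceeded on average once every `rp` years
    print(f"{rp:>3}-year loss: {np.quantile(annual_loss, 1 - 1 / rp):,.0f}")
```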
Abstract:
Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search for correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 to 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the first node detectable growth stage. For sites above the disease threshold, a quantitative model predicted the severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses, in which the search program was rerun using weather data from previous years, and therefore uncorrelated with the disease data, to investigate how likely correlations such as the ones found in our models would have been in the absence of genuine relationships.
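A minimal sketch of the kind of check described above: rerun an exhaustive correlation search on weather that cannot be related to the disease data and see how often a correlation as strong as the observed one arises by chance. The arrays, the number of candidate weather variables and the shuffling scheme are illustrative assumptions, not the original program.

```python
import numpy as np

rng = np.random.default_rng(0)

def best_abs_correlation(disease, weather):
    """Return the strongest absolute correlation found by searching every
    candidate weather variable against disease severity."""
    best = 0.0
    for j in range(weather.shape[1]):
        r = np.corrcoef(disease, weather[:, j])[0, 1]
        best = max(best, abs(r))
    return best

# Hypothetical data: severity for 9 sites x 4 seasons, 20 candidate weather variables.
disease = rng.normal(size=36)
weather = rng.normal(size=(36, 20))

observed = best_abs_correlation(disease, weather)

# Null distribution: repeat the search with the weather rows shuffled, i.e. weather
# unrelated to the disease data (mimicking the "previous years" check).
null = np.array([
    best_abs_correlation(disease, weather[rng.permutation(36)])
    for _ in range(1000)
])
p_value = (null >= observed).mean()
print(f"best |r| = {observed:.2f}, chance of matching it spuriously ≈ {p_value:.2f}")
```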
Abstract:
This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCMs) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models and others are more novel and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization package to an idealization of the planet Earth with a greatly simplified lower boundary that consists of an ocean only; there is no land with its associated orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SSTs) which are specified everywhere with simple, idealized distributions. Thus, in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings, such as those proposed by Held and Suarez (1994) and Boer and Denis (1997), and the Earth-like simulations of the Atmospheric Modeling Intercomparison Project (AMIP, Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the causes of inter-model differences. APE is sponsored by the World Meteorological Organization (WMO) joint Commission on Atmospheric Science (CAS), World Climate Research Program (WCRP) Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each. Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada in the AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations but does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations: one (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE experiment data base holds a wealth of data that is now publicly available from the APE web site: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.
Abstract:
Atmospheric dust is an important feedback in the climate system, potentially affecting the radiative balance and chemical composition of the atmosphere and providing nutrients to terrestrial and marine ecosystems. Yet the potential impact of dust on the climate system, both in the anthropogenically disturbed future and the naturally varying past, remains to be quantified. The geologic record of dust provides the opportunity to test earth system models designed to simulate dust. Records of dust can be obtained from ice cores, marine sediments, and terrestrial (loess) deposits. Although rarely unequivocal, these records document a variety of processes (source, transport and deposition) in the dust cycle, stored in each archive as changes in clay mineralogy, isotopes, grain size, and concentration of terrigenous materials. Although the extraction of information from each type of archive is slightly different, the basic controls on these dust indicators are the same. Changes in the dust flux and particle size might be controlled by a combination of (a) source area extent, (b) dust emission efficiency (wind speed) and atmospheric transport, (c) atmospheric residence time of dust, and/or (d) relative contributions of dry settling and rainout of dust. Similarly, changes in mineralogy reflect (a) source area mineralogy and weathering and (b) shifts in atmospheric transport. The combination of these geological data with process-based, forward-modelling schemes in global earth system models provides an excellent means of achieving a comprehensive picture of the global pattern of dust accumulation rates, their controlling mechanisms, and how those mechanisms may vary regionally. The Dust Indicators and Records of Terrestrial and MArine Palaeoenvironments (DIRTMAP) data base has been established to provide a global palaeoenvironmental data set that can be used to validate earth system model simulations of the dust cycle over the past 150,000 years.
Abstract:
Global syntheses of palaeoenvironmental data are required to test climate models under conditions different from the present. Data sets for this purpose contain data from spatially extensive networks of sites. The data are either directly comparable to model output or readily interpretable in terms of modelled climate variables. Data sets must contain sufficient documentation to distinguish between raw (primary) and interpreted (secondary, tertiary) data, to evaluate the assumptions involved in interpretation of the data, to exercise quality control, and to select data appropriate for specific goals. Four data bases for the Late Quaternary, documenting changes in lake levels since 30 kyr BP (the Global Lake Status Data Base), vegetation distribution at 18 kyr and 6 kyr BP (BIOME 6000), aeolian accumulation rates during the last glacial-interglacial cycle (DIRTMAP), and tropical terrestrial climates at the Last Glacial Maximum (the LGM Tropical Terrestrial Data Synthesis) are summarised. Each has been used to evaluate simulations of Last Glacial Maximum (LGM: 21 calendar kyr BP) and/or mid-Holocene (6 cal. kyr BP) environments. Comparisons have demonstrated that changes in radiative forcing and orography due to orbital and ice-sheet variations explain the first-order, broad-scale (in space and time) features of global climate change since the LGM. However, atmospheric models forced by 6 cal. kyr BP orbital changes with unchanged surface conditions fail to capture quantitative aspects of the observed climate, including the greatly increased magnitude and northward shift of the African monsoon during the early to mid-Holocene. Similarly, comparisons with palaeoenvironmental datasets show that atmospheric models have underestimated the magnitude of cooling and drying of much of the land surface at the LGM. The inclusion of feedbacks due to changes in ocean- and land-surface conditions at both times, and atmospheric dust loading at the LGM, appears to be required in order to produce a better simulation of these past climates. The development of Earth system models incorporating the dynamic interactions among ocean, atmosphere, and vegetation is therefore mandated by Quaternary science results as well as climatological principles. For greatest scientific benefit, this development must be paralleled by continued advances in palaeodata analysis and synthesis, which in turn will help to define questions that call for new focused data collection efforts.
Abstract:
This paper describes a methodology for providing multiprobability predictions for proteomic mass spectrometry data. The methodology is based on a newly developed machine learning framework called Venn machines, which outputs a valid probability interval. The methodology is designed for mass spectrometry data. For demonstration purposes, we applied it to MALDI-TOF data sets in order to predict the diagnosis of heart disease and the early diagnosis of ovarian cancer and breast cancer. The experiments showed that the probability intervals are narrow, that is, the output of the multiprobability predictor is close to a single probability distribution. In addition, the probability intervals produced for the heart disease and ovarian cancer data were more accurate than the output of the corresponding probability predictor. When Venn machines were forced to make point predictions, the accuracy of these predictions was, for most data sets, better than that of the underlying algorithm that outputs a single probability distribution over labels. Application of this methodology to MALDI-TOF data sets empirically demonstrates its validity. The accuracy of the proposed method on the ovarian cancer data rises from 66.7% eleven months in advance of the moment of diagnosis to 90.2% at the moment of diagnosis. The same approach was applied to the heart disease data without time dependency, although the accuracy achieved was not as high (up to 69.9%). The methodology also allowed us to confirm mass spectrometry peaks previously identified as carrying statistically significant information for discrimination between controls and cases.
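To make the idea of a multiprobability (interval) prediction concrete, here is a much-simplified Venn-machine-style predictor with a nearest-centroid taxonomy. The taxonomy choice, the synthetic data and the function names are assumptions for illustration and not the algorithm used in the paper.

```python
import numpy as np

def venn_predict(X_train, y_train, x_test, labels=(0, 1)):
    """Simplified Venn machine: for each tentative label of the test object,
    assign every example to a category (here: the label of its nearest class
    centroid, computed on the augmented set) and read off the empirical label
    frequencies in the test object's category. Returns an interval for label 1."""
    probs = []
    for y_trial in labels:
        X = np.vstack([X_train, x_test])
        y = np.append(y_train, y_trial)
        centroids = {lab: X[y == lab].mean(axis=0) for lab in labels}
        cats = np.array([
            min(labels, key=lambda lab: np.linalg.norm(xi - centroids[lab]))
            for xi in X
        ])
        same_cat = cats == cats[-1]   # examples sharing the test object's category
        probs.append((y[same_cat] == 1).mean())
    return min(probs), max(probs)

# Tiny illustrative data set (e.g. two mass-spectrometry peak intensities per sample).
rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(0.0, 1.0, size=(30, 2)),
                     rng.normal(2.0, 1.0, size=(30, 2))])
y_train = np.array([0] * 30 + [1] * 30)

low, high = venn_predict(X_train, y_train, np.array([1.8, 2.1]))
print(f"probability interval for the positive class: [{low:.2f}, {high:.2f}]")
```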
Abstract:
A new method to measure the epicycle frequency kappa in the Galactic disc is presented. We make use of the large data base on open clusters completed by our group to derive the observed velocity vector (amplitude and direction) of the clusters in the Galactic plane. In the epicycle approximation, this velocity is equal to the circular velocity given by the rotation curve plus a residual or perturbation velocity, the direction of which rotates as a function of time with the frequency kappa. Because the direction of the perturbation velocity at the birth time of the clusters is not random, a plot of the present-day direction angle of this velocity as a function of cluster age reveals systematic trends from which the epicycle frequency can be obtained. Our analysis assumes that the Galactic potential is mainly axisymmetric, or in other words, that the effect of the spiral arms on the Galactic orbits is small; in this sense, our results do not depend on any specific model of the spiral structure. The values of kappa that we obtain provide constraints on the rotation velocity of the Galaxy; in particular, V(0) is found to be 230 +/- 15 km s(-1) even if the short scale (R(0) = 7.5 kpc) of the Galaxy is adopted. The measured kappa at the solar radius is 43 +/- 5 km s(-1) kpc(-1). The distribution of initial velocities of open clusters is also discussed.
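A toy version of the fitting idea described above: in the epicycle approximation the residual-velocity direction angle rotates linearly with age at frequency kappa, so kappa can be read off a linear fit of direction angle against cluster age. All numbers below are synthetic, units are simplified, and angle wrapping is ignored; this is only a sketch of the principle.

```python
import numpy as np

rng = np.random.default_rng(2)

# kappa expressed as an angular frequency in rad/Gyr
# (43 km s^-1 kpc^-1 is roughly 44 rad/Gyr).
kappa_true = 44.0
ages = rng.uniform(0.0, 0.1, size=200)            # cluster ages in Gyr

# Epicycle approximation: theta(age) = theta_birth - kappa * age, plus scatter.
theta_birth = 1.0
theta_obs = theta_birth - kappa_true * ages + rng.normal(0.0, 0.1, size=ages.size)

# Recover kappa from a linear fit of direction angle against age
# (wrapping of the angle modulo 2*pi is ignored in this toy example).
slope, intercept = np.polyfit(ages, theta_obs, 1)
print(f"fitted kappa ≈ {-slope:.1f} rad/Gyr (true value {kappa_true})")
```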
Abstract:
In this paper, we construct a dynamic portrait of the inner asteroidal belt. We use information about the distribution of test particles, which were initially placed on a perfectly rectangular grid of initial conditions, after 4.2 Myr of gravitational interactions with the Sun and five planets, from Mars to Neptune. Using the spectral analysis method introduced by Michtchenko et al., the asteroidal behaviour is illustrated in detail on the dynamical, averaged and frequency maps. On the averaged and frequency maps, we superpose information on the proper elements and proper frequencies of real objects, extracted from the data base, AstDyS, constructed by Milani and Knezevic. A comparison of the maps with the distribution of real objects allows us to detect possible dynamical mechanisms acting in the domain under study; these mechanisms are related to mean-motion and secular resonances. We note that the two- and three-body mean-motion resonances and the secular resonances (strong linear and weaker non-linear) have an important role in the diffusive transportation of the objects. Their long-lasting action, overlaid with the Yarkovsky effect, may explain many observed features of the density, size and taxonomic distributions of the asteroids.
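As a small illustration of the spectral-analysis ingredient behind the frequency maps mentioned above, the sketch below extracts the dominant frequency of an orbital-element time series with an FFT. The time step, the 4.2 Myr span and the secular frequency used in the synthetic signal are assumptions for the example, not values from the study.

```python
import numpy as np

def dominant_frequency(series, dt):
    """Return the dominant non-zero frequency of a time series via an FFT,
    the basic ingredient of spectral-analysis / frequency-map studies."""
    series = series - series.mean()
    spectrum = np.abs(np.fft.rfft(series))
    freqs = np.fft.rfftfreq(series.size, d=dt)
    return freqs[1:][np.argmax(spectrum[1:])]

# Synthetic "orbital element" output: a single secular oscillation plus noise.
dt = 500.0                          # years between outputs
t = np.arange(0.0, 4.2e6, dt)       # 4.2 Myr span, matching the integrations above
f_sec = 1.0 / 36_000.0              # an assumed secular frequency in cycles/yr
rng = np.random.default_rng(4)
ecc = 0.10 + 0.02 * np.sin(2 * np.pi * f_sec * t) + 0.001 * rng.normal(size=t.size)

print(f"dominant frequency ≈ {dominant_frequency(ecc, dt):.2e} cycles/yr")
```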
Abstract:
The study shows the current need for security solutions concerning work with information in different areas. The investigations highlight important solutions for printers' needs to meet the increasingly hard demands for fast and safe digital communications. Important factors to be analyzed in the investigations are: access to different types of information, workers' authority over information, access to the data base register internally and externally, production solutions for effective fault detection, and data base solutions for orders and distribution. Planned and unplanned stops provide a standard measure of the value of interruptions. Internal data bases are protected by so-called "Fire Walls", "Watch Dogs" and Virtual Private Networks. Offset plates are locked in for a definite period of time and subsequently destroyed, and remaining sheets are shredded and recycled. All documentation is digital, in business control systems, which guarantees that no important documents are lying around in working places. Fault detection work is facilitated by the ability to fully track the order numbers on incoming orders.
Abstract:
The problem of finding the best facility locations requires a complete and accurate road network together with the corresponding population data for a specific area. However, the data obtained from road network databases are usually not directly suitable for this use. In this paper we propose a procedure for converting a road network database into a road graph that can be used in localization problems. The road network data come from the national road data base in Sweden. The derived graph is cleaned and reduced to a level suitable for localization problems. The population points are also processed in order to match the graph. The reduction of the graph is done while maintaining most of the accuracy of distance measures in the network.
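A sketch of one common way to reduce a road graph while preserving network distances: collapsing pass-through (degree-2) nodes into single weighted edges, while keeping nodes that carry population or other demand. The function, the `keep` parameter and the example data are illustrative assumptions, not the procedure used in the paper.

```python
from collections import defaultdict

def simplify_road_graph(edges, keep=()):
    """Collapse pass-through (degree-2) nodes so that the road graph becomes
    small enough for facility-location computations while network distances
    are preserved. `edges` holds (node_a, node_b, length) triples; nodes in
    `keep` (e.g. nodes carrying population) are never collapsed."""
    adj = defaultdict(dict)
    for a, b, w in edges:
        adj[a][b] = min(w, adj[a].get(b, float("inf")))
        adj[b][a] = min(w, adj[b].get(a, float("inf")))

    changed = True
    while changed:
        changed = False
        for node in list(adj):
            if node in keep or node not in adj or len(adj[node]) != 2:
                continue
            (u, w1), (v, w2) = adj[node].items()
            # Replace the path u - node - v by a direct edge of the same length,
            # keeping whichever connection is shorter if u and v were already linked.
            new_w = min(w1 + w2, adj[u].get(v, float("inf")))
            adj[u][v] = adj[v][u] = new_w
            del adj[u][node], adj[v][node], adj[node]
            changed = True
    return {a: dict(nbrs) for a, nbrs in adj.items()}

# Tiny example: intermediate nodes 2 and 3 only pass traffic through and are removed.
edges = [("A", 2, 1.0), (2, 3, 2.0), (3, "B", 1.5), ("A", "C", 4.0), ("C", "B", 1.0)]
print(simplify_road_graph(edges, keep={"A", "B", "C"}))
```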
Abstract:
The world is going through a period of uncertainty about exchange rates. The aim of this work is to present a literature review on the subject, covering the models that attempt to explain how the exchange rate is formed in the short and long run. It discusses the reasons that led Brazil to abandon the exchange rate band policy in 1999, the measures taken by the economic authorities, and their impact on the economy. Throughout this analysis, the real and real effective exchange rates are calculated monthly with a base date of July 1994, making it possible to visualise the phases of exchange rate appreciation and depreciation. The chapter closes with a theoretical review of the impact of monetary and fiscal policies on the exchange rate in light of the different existing exchange rate regimes. Finally, it reviews the events that led to the currency crises in Mexico, Thailand, Malaysia, Indonesia, South Korea, Russia and Argentina, analysing the main macroeconomic indicators of these economies in an attempt to find common elements that help explain the causes of the crises.
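For the calculation mentioned above, the following sketch shows how a bilateral real exchange rate series can be computed and rebased to 100 at the July 1994 base date; the monthly figures are invented, and a real effective rate would additionally combine several partner currencies with trade weights.

```python
# Minimal sketch with made-up numbers of the real exchange rate index
# rebased to July 1994 (the "base date" referred to in the text).

def real_exchange_rate(nominal, p_foreign, p_domestic):
    """Bilateral real exchange rate: nominal rate deflated by relative prices."""
    return nominal * p_foreign / p_domestic

def rebase(series, base_key):
    """Express a series as an index equal to 100 at the base date."""
    base = series[base_key]
    return {k: 100.0 * v / base for k, v in series.items()}

# Hypothetical monthly data: BRL/USD rate and CPI-style price indices.
nominal    = {"1994-07": 0.94, "1999-01": 1.98, "2003-01": 3.44}
cpi_us     = {"1994-07": 100.0, "1999-01": 110.0, "2003-01": 120.0}
cpi_brazil = {"1994-07": 100.0, "1999-01": 145.0, "2003-01": 210.0}

rer = {m: real_exchange_rate(nominal[m], cpi_us[m], cpi_brazil[m]) for m in nominal}
print(rebase(rer, "1994-07"))   # values above 100 = real depreciation vs July 1994
```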
Abstract:
With the globalisation of the market and the high level of competitiveness in the education sector, organisations must be agile and competent in order to survive. In this context, the efficient management of resources and the availability of accurate information to support decision-making depend largely on a cost information system. Such a system must be based on a costing method that provides information to meet the distinct needs of managers at different hierarchical levels and in different areas of activity. This work studies a costing methodology applicable to a private Higher Education Institution (HEI) that satisfies three perspectives: providing information to support price setting, to support the decision-making process, and to support expenditure planning and control. To this end, a bibliographic survey of the state of the art on the subject was carried out. A case study, based on qualitative research, was used to identify the cost information needs of the HEI's managers. These needs were then cross-referenced with the existing costing methods, which made it possible to identify the method best suited to the HEI. At this stage it was possible to compare theory and practice, contrasting the proposed method with the method currently adopted by the HEI, which revealed the shortcomings of the current model and their causes. On this basis, a more suitable approach to supporting decision-making is proposed, with the aim of improving the institution's performance. The results show that the objective was achieved: considering the managers' cost information needs, activity-based costing is the most suitable method to support the management of the HEI.
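As a brief illustration of the core allocation logic of activity-based costing, the method the study identifies as most suitable, the sketch below assigns resource costs to activities and then traces them to cost objects through activity drivers; all figures, activities and drivers are invented for the example.

```python
# Step 1: assign resource costs to activities (a direct 1-to-1 mapping for brevity).
resource_costs = {"teaching staff": 500_000.0, "admissions office": 80_000.0}
activity_costs = {"deliver classes": resource_costs["teaching staff"],
                  "enrol students":  resource_costs["admissions office"]}

# Step 2: trace activity costs to cost objects (courses) through activity drivers.
drivers = {
    "deliver classes": {"driver": "class hours", "volume": {"Law": 1200, "Business": 800}},
    "enrol students":  {"driver": "enrolments",  "volume": {"Law": 300,  "Business": 500}},
}

course_cost = {"Law": 0.0, "Business": 0.0}
for activity, cost in activity_costs.items():
    volumes = drivers[activity]["volume"]
    rate = cost / sum(volumes.values())          # cost per unit of driver
    for course, vol in volumes.items():
        course_cost[course] += rate * vol

print(course_cost)
```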
Abstract:
This work addresses the subject of microcredit and analyses its most successful experience in Brazil, CrediAMIGO, run by Banco do Nordeste. It describes the programme and presents a quantitative analysis of its performance, based on a data set composed of the 1997 and 2003 Ecinf surveys and using logistic regressions and the difference-in-differences method. In substantive terms, we conclude that there is a microcredit experience in the poorest region of Brazil, based on joint-liability (solidarity) guarantees, which has increased access to credit by more than in the rest of the country.
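A minimal sketch of how a difference-in-differences effect can be estimated inside a logistic regression, in the spirit of the analysis described above; the synthetic data, variable names and the use of pandas/statsmodels are assumptions for illustration, not the Ecinf analysis itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for Ecinf-style microdata: one row per micro-entrepreneur,
# an indicator for the treated region, the survey wave, and reported credit access.
rng = np.random.default_rng(3)
n = 4000
treated = rng.integers(0, 2, n)
post = rng.integers(0, 2, n)                     # 0 = 1997 survey, 1 = 2003 survey
logit_p = -1.0 + 0.2 * treated + 0.3 * post + 0.5 * treated * post
credit = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))
df = pd.DataFrame({"credit": credit, "treated": treated, "post": post})

# Difference-in-differences inside a logistic regression: the coefficient on the
# treated:post interaction captures the differential change in credit access.
model = smf.logit("credit ~ treated + post + treated:post", data=df).fit(disp=0)
print(model.params["treated:post"])
```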
Abstract:
This work is the final report of a research project funded by the Núcleo de Pesquisas e Publicações of FGV-EAESP. It analyses the Health Service Waste Management Plans (PGRSS) of a sample of 70 Brazilian hospitals, prepared in 2003 as the outcome of a distance-learning course delivered by a consortium formed by the Universidade Federal de Santa Catarina and the Fundação Getulio Vargas. The course was suggested by UNESCO and funded by international banks, with REFORSUS acting as intermediary. For each Plan, 164 items were tabulated in an EXCEL spreadsheet: 12 items of general information about the hospital, 141 concerning the infrastructure and procedures currently in use, and 11 concerning the future waste management plan. The situation of these hospitals was diagnosed with respect to the handling of waste, classified as infectious, chemical, radioactive, common and sharps, from collection, storage and on-site treatment to removal, off-site treatment and final disposal. Water, from the supply source to its consumption, liquid effluents and gaseous emissions were also investigated. The plans drawn up by the hospitals for the future management of their waste were also evaluated from technical and economic standpoints. The results of the research indicate that most of the hospitals studied are at an incipient stage of waste management, lacking infrastructure, financial and human resources and management, with a considerable gap between current waste management and the legal sanitary and environmental requirements the hospitals must meet.
Abstract:
This Research Report presents the Environmental Management Information System planned for CEAMA - Center for Management and Environment. The System consists of a framework of data bases in the environmental field, together with the logic for information treatment. It has been developed to support the current and future requirements of the Center for Management and Environment at the Escola de Administração de Empresas de São Paulo, Fundação Getulio Vargas (EAESP/FGV). The System was developed for the Windows/Access environment and consists of a Main Data Base and a Thematic Data Base. The Main Data Base can store libraries, NGOs, Environmental Impact Reports, technical reports, companies and many other items. The Thematic Data Base can contain several subjects related to the environment, specifying the place where, or the form in which, the data can be obtained. The Report also describes several other environmental information systems, as the result of a survey carried out as part of the research project.
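A minimal relational sketch, with SQLite standing in for the Access environment described above, of how a Main Data Base and a Thematic Data Base along those lines could be laid out; the table and column names are assumptions for illustration, not the CEAMA schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE main_data_base (
    id          INTEGER PRIMARY KEY,
    item_type   TEXT NOT NULL,      -- library, NGO, impact report, company, ...
    name        TEXT NOT NULL,
    description TEXT
);
CREATE TABLE thematic_data_base (
    id          INTEGER PRIMARY KEY,
    subject     TEXT NOT NULL,      -- environmental theme
    location    TEXT,               -- place where the data can be obtained
    access_form TEXT                -- how the data can be accessed
);
""")
conn.execute("INSERT INTO main_data_base (item_type, name) VALUES (?, ?)",
             ("NGO", "Example Environmental NGO"))
print(conn.execute("SELECT * FROM main_data_base").fetchall())
```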