927 results for Adverse Weather
Weather and War – Economic and social vulnerability in Switzerland at the end of the First World War
Abstract:
Neutral Switzerland, though not among the belligerents, was involved in the Great War mainly in economic terms. Because Switzerland is a landlocked country, agriculture became an especially important topic of the war economy with regard to food security. Until 1916 the national food supply was limited but could be maintained through barter trade. In 1916 a crisis at both the supply and the production level occurred, leading to a decline in food availability and to steep price increases that caused social turmoil. This paper aims to outline the factors of food-related vulnerability in Switzerland during the First World War, and further shows the different coping strategies that were undertaken during that time. The paper takes into consideration the work of Mario Aeby and Christian Pfister (University of Bern), which pointed to weather anomalies during the years 1916 and 1917 that aggravated the already tense food situation. Arguing for an overlap of supply and production crises, the paper focuses on agricultural and economic history, including environmental impacts. Further, the paper addresses the question of what makes a food system resilient to such unforeseen impacts.
Abstract:
High-resolution, ground-based and independent observations including co-located wind radiometer, lidar stations, and infrasound instruments are used to evaluate the accuracy of general circulation models and data-constrained assimilation systems in the middle atmosphere at northern hemisphere midlatitudes. Systematic comparisons between observations, the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses including the recent Integrated Forecast System cycles 38r1 and 38r2, NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalyses, and the free-running Max Planck Institute–Earth System Model–Low Resolution (MPI-ESM-LR) climate model are carried out in both the temporal and spectral domains. We find that ECMWF and MERRA are broadly consistent with lidar and wind radiometer measurements up to ~40 km. For both temperature and horizontal wind components, deviations increase with altitude as the assimilated observations become sparser. Between 40 and 60 km altitude, the standard deviation of the mean difference exceeds 5 K for the temperature and 20 m/s for the zonal wind. The largest deviations are observed in winter, when the variability from large-scale planetary waves dominates. Between lidar data and MPI-ESM-LR, there is an overall agreement in spectral amplitude down to 15–20 days. At shorter time scales, the model underestimates the variability by ~10 dB. Infrasound observations indicate generally good agreement with ECMWF wind and temperature products. As such, this study demonstrates the potential of the infrastructure of the Atmospheric Dynamics Research Infrastructure in Europe project, which integrates various measurements and provides a quantitative understanding of stratosphere-troposphere dynamical coupling for numerical weather prediction applications.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is only known to the seller.
Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
Therapy with human immunoglobulin G (IgG) concentrates is a success story that has been ongoing for decades, with ever-increasing demand for this plasma product. The clinical success of IgG concentrates is documented by the slowly increasing number of registered indications and the more rapid increase in off-label uses, a topic dealt with in another contribution to this special issue of Frontiers in Immunology. Part of this success is the adverse event (AE) profile of IgG concentrates, which remains excellent even when life-long therapy is needed. Transmission of pathogens has, over the last decade, been entirely controlled through the earlier introduction of a regulatory network by the authorities and the installation of quality standards by the plasma fractionation industry. The cornerstone of the regulatory network is current good manufacturing practice. Non-infectious AEs occur rarely and are mainly mild to moderate. However, the recent increase in the frequency of hemolytic and thrombotic AEs has raised worrying questions about the possible background of these AEs. Below, we review elements of non-infectious AEs, with a particular focus on hemolysis and thrombosis. We discuss how the introduction of plasma fractionation by ion-exchange chromatography and polishing by immunoaffinity chromatographic steps might alter the repertoire of specificities and influence the AE profiles and efficacy of IgG concentrates.
Abstract:
OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients from a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 d. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from EQ-5D); all parameters were measured at 30 d. RESULTS Of 3186 patients (mean age 71 y, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of the QoL measures. Associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analysis, with evidence of effect modification (P for interaction < 0.05) based on age and main diagnosis groups. CONCLUSION Nutritional risk is common in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.
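The odds ratios reported above each pair a point estimate with a 95% confidence interval. As an illustrative aside (not the study's actual multivariate models), an unadjusted odds ratio and its Wald-type confidence interval can be computed from a 2×2 table; the counts below are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% CI from a 2x2 table.

    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: 10/100 at-risk patients died vs 5/100 not at risk
or_, lower, upper = odds_ratio_ci(10, 90, 5, 95)
```

The adjusted estimates in the abstract come from regression models and would differ from such a crude table-based calculation.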
Abstract:
Systematic consideration of scientific support is a critical element in developing and, ultimately, using adverse outcome pathways (AOPs) for various regulatory applications. Though weight of evidence (WoE) analysis has been proposed as a basis for assessment of the maturity and level of confidence in an AOP, methodologies and tools are still being formalized. The Organization for Economic Co-operation and Development (OECD) Users' Handbook Supplement to the Guidance Document for Developing and Assessing AOPs (OECD 2014a; hereafter referred to as the OECD AOP Handbook) provides tailored Bradford-Hill (BH) considerations for systematic assessment of confidence in a given AOP. These considerations include (1) biological plausibility and (2) empirical support (dose-response, temporality, and incidence) for Key Event Relationships (KERs), and (3) essentiality of key events (KEs). Here, we test the application of these tailored BH considerations and the guidance outlined in the OECD AOP Handbook using a number of case examples to increase experience in more transparently documenting rationales for assigned levels of confidence to KEs and KERs, and to promote consistency in evaluation within and across AOPs. The major lessons learned from experience are documented, and taken together with the case examples, should contribute to better common understanding of the nature and form of documentation required to increase confidence in the application of AOPs for specific uses. Based on the tailored BH considerations and defining questions, a prototype quantitative model for assessing the WoE of an AOP using tools of multi-criteria decision analysis (MCDA) is described. The applicability of the approach is also demonstrated using the case example aromatase inhibition leading to reproductive dysfunction in fish. 
Following the acquisition of additional experience in the development and assessment of AOPs, further refinement of the model's parameterization through expert elicitation is recommended. Overall, the application of quantitative WoE approaches holds promise for enhancing the rigor, transparency, and reproducibility of AOP WoE determinations, and may play an important role in delineating areas where research would have the greatest impact on improving overall confidence in the AOP.
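The abstract does not specify the form of the prototype MCDA model; a common choice in such prototypes is a simple weighted sum over the tailored Bradford-Hill criteria. A minimal sketch, with hypothetical criterion names, ratings, and weights:

```python
def mcda_score(scores, weights):
    """Weighted-sum MCDA aggregation: each criterion's confidence rating
    is multiplied by its weight; the weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[c] * weights[c] for c in weights)

# Hypothetical 1-3 confidence ratings for one Key Event Relationship
scores = {"biological_plausibility": 3, "empirical_support": 2, "essentiality": 1}
# Hypothetical weights reflecting the relative importance of each criterion
weights = {"biological_plausibility": 0.5, "empirical_support": 0.3, "essentiality": 0.2}
overall = mcda_score(scores, weights)  # 3*0.5 + 2*0.3 + 1*0.2 = 2.3
```

An actual MCDA model derived from expert elicitation could use a different aggregation rule and differently scaled criteria; this only illustrates the weighted-sum idea.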
Abstract:
Accurate rainfall data are the key input parameter for modelling river discharge and soil loss. Remote areas of Ethiopia often lack adequate precipitation data and where these data are available, there might be substantial temporal or spatial gaps. To counter this challenge, the Climate Forecast System Reanalysis (CFSR) of the National Centers for Environmental Prediction (NCEP) readily provides weather data for any geographic location on earth between 1979 and 2014. This study assesses the applicability of CFSR weather data to three watersheds in the Blue Nile Basin in Ethiopia. To this end, the Soil and Water Assessment Tool (SWAT) was set up to simulate discharge and soil loss, using CFSR and conventional weather data, in three small-scale watersheds ranging from 112 to 477 ha. Calibrated simulation results were compared to observed river discharge and observed soil loss over a period of 32 years. The conventional weather data resulted in very good discharge outputs for all three watersheds, while the CFSR weather data resulted in unsatisfactory discharge outputs for all of the three gauging stations. Soil loss simulation with conventional weather inputs yielded satisfactory outputs for two of three watersheds, while the CFSR weather input resulted in three unsatisfactory results. Overall, the simulations with the conventional data resulted in far better results for discharge and soil loss than simulations with CFSR data. The simulations with CFSR data were unable to adequately represent the specific regional climate for the three watersheds, performing even worse in climatic areas with two rainy seasons. Hence, CFSR data should not be used lightly in remote areas with no conventional weather data where no prior analysis is possible.
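Performance labels such as "very good" and "unsatisfactory" for SWAT discharge simulations are commonly assigned from goodness-of-fit statistics such as the Nash-Sutcliffe efficiency (NSE); the abstract does not state which statistic was used, so the following is only an illustrative sketch:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean the
    simulation is no better than simply predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    residual = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - residual / variance

# Hypothetical observed vs simulated discharge series
perfect = nse([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # 1.0
baseline = nse([1.0, 2.0, 3.0], [2.0, 2.0, 2.0])  # 0.0 (mean-only prediction)
```

Widely used rating scales then bin the statistic, e.g. treating values above roughly 0.75 as "very good" and values at or below 0.5 as "unsatisfactory"; the exact thresholds vary by author.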
Abstract:
Floods are the leading cause of fatalities related to natural disasters in Texas. Texas leads the nation in flash flood fatalities. From 1959 through 2009 there were three times more fatalities in Texas (840) than in the next-highest state, Pennsylvania (265). Texas also leads the nation in flood-related injuries (7753). Flood fatalities in Texas represent a serious public health problem. This study addresses several objectives of Healthy People 2010, including reducing deaths from motor vehicle accidents (Objective 15-15), reducing nonfatal motor vehicle injuries (Objective 15-17), and reducing drownings (Objective 15-29). The study examined flood fatalities that occurred in Texas between 1959 and 2008. Flood fatality statistics were extracted from three sources: flood fatality databases from the National Climatic Data Center, the Spatial Hazard Event and Loss Database for the United States, and the Texas Department of State Health Services. The data collected for flood fatalities include the date, time, gender, age, location, and type of flood. Inconsistencies among the three databases were identified and discussed. Analysis reveals that most fatalities result from driving into flood water (77%). Spatial analysis indicates that more fatalities occurred in counties containing major urban centers – some of the Flash Flood Alley counties (Bexar, Dallas, Travis, and Tarrant), Harris County (Houston), and Val Verde County (Del Rio). An intervention strategy targeting the behavior of driving into flood water is proposed. The intervention is based on the Health Belief Model. The main recommendation of the study is that flood fatalities in Texas can be reduced through a combination of improved hydrometeorological forecasting, educational programs aimed at enhancing public awareness of flood risk and the seriousness of flood warnings, and timely and appropriate action by local emergency and safety authorities.
Abstract:
Maternal use of selective serotonin reuptake inhibitors (SSRIs) for depression and anxiety during pregnancy has increased over the last decade. Recent studies have questioned the safety of these antidepressants when used during pregnancy. The aim of this project is to assess the associations between maternal SSRI use and gestational hypertension (GH), small-for-gestational-age (SGA) birth, and preterm birth using data from a U.S. population-based study with self-reported exposure information. The study population is comprised of mothers of control infants from the National Birth Defects Prevention Study (NBDPS), an ongoing, multi-state, population-based case-control study. Mothers were asked about any use of medications during pregnancy, including the dates they started and stopped taking each medication. Maternal GH was self-reported, while gestational age and birth weight were calculated from information on birth certificates or medical records. Our study found that women exposed to SSRIs in the first trimester and beyond had higher odds of GH compared to unexposed women (aOR=1.96, 95% CI=1.02-3.74). Women who used SSRIs only in the first trimester had no increased odds of GH (aOR=0.77, 95% CI=0.24-2.50). Women who used SSRIs throughout their entire pregnancy had a two-fold increase in the odds of delivering an SGA infant compared to unexposed women (aOR=2.16, 95% CI=1.01-4.62), while women who reported SSRI use only in the first trimester had decreased odds of delivering an SGA infant (aOR=0.56, 95% CI=0.14-2.34). Finally, both women who used SSRIs in the first trimester only (aOR=1.58, 95% CI=0.71-3.51) and women who used SSRIs in the first trimester and beyond (aOR=1.49, 95% CI=0.76-2.90) had increased odds of delivering preterm compared to unexposed women. Results from our study suggest that women who use SSRIs in the first trimester and beyond have significantly increased odds of GH and SGA. An increase in the odds of preterm birth was also observed among women exposed in this period and is consistent with the results of previous studies which had much larger sample sizes.
Women who use SSRIs only in the first trimester appear to have no increased odds of GH or SGA, but may have increased odds of preterm birth. These findings are consistent with previous studies and highlight how exposure to SSRIs at different points in gestation may result in different risks for these outcomes.
Abstract:
The ascertainment and analysis of adverse reactions to investigational agents presents a significant challenge because of the infrequency of these events, their subjective nature and the low priority of safety evaluations in many clinical trials. A one year review of antibiotic trials published in medical journals demonstrates the lack of standards in identifying and reporting these potentially fatal conditions. This review also illustrates the low probability of observing and detecting rare events in typical clinical trials which include fewer than 300 subjects. Uniform standards for ascertainment and reporting are suggested which include operational definitions of study subjects. Meta-analysis of selected antibiotic trials using multivariate regression analysis indicates that meaningful conclusions may be drawn from data from multiple studies which are pooled in a scientifically rigorous manner.
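The abstract describes pooling across antibiotic trials via multivariate regression. As a simpler, standard illustration of rigorous pooling (not the author's actual method), a fixed-effect inverse-variance meta-analysis combines per-trial log odds ratios like this:

```python
import math

def pooled_log_or(log_ors, variances):
    """Fixed-effect inverse-variance pooling of log odds ratios.

    Each trial's estimate is weighted by the inverse of its variance,
    so more precise trials contribute more to the pooled estimate.
    """
    weights = [1 / v for v in variances]
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1 / sum(weights))  # standard error of the pooled estimate
    return pooled, se

# Hypothetical log odds ratios and variances from two equally precise trials
pooled, se = pooled_log_or([0.0, 1.0], [1.0, 1.0])
```

For rare adverse events with many zero-count trials, methods such as exact or continuity-corrected approaches would be needed; this sketch only shows the basic weighting idea.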
Abstract:
Precipitation for 2011 was less than the long-term climate average. Early in the year, precipitation lagged behind normal, but then tracked close to the normal accumulation rate from mid-April through mid-August. After that time, precipitation amounts greatly lagged behind normal, and the year ended almost 7 in. behind the long-term average (Figure 1). Overall, 2011 will be remembered for good moisture early, but ending the season with almost no rainfall.