988 results for AC losses


Relevance: 20.00%

Abstract:

Given the danger of the unbearable catastrophe of nuclear war, NATO in the mid-1950s abandoned any war aim of achieving victory in an all-out war with the Soviet Union or the Warsaw Treaty Organisation. Instead, it adopted the war aim of a cease-fire or war termination. By contrast, the WTO clung on to the war aim of victory - expressed even in terms of the survival of slightly more Soviet citizens than NATO citizens, after unprecedented losses of life in nuclear exchanges - until the mid-1980s when under Gorbachev the concept of victory in nuclear war was abandoned.

Relevance: 20.00%

Abstract:

We use a soil carbon (C) model (RothC), driven by a range of climate models under a range of climate scenarios, to examine the impacts of future climate on global soil organic carbon (SOC) stocks. The results suggest an overall global increase in SOC stocks by 2100 under all scenarios, but with the extent of increase differing among climate models and emissions scenarios. The impacts of projected land use changes are also simulated, but have relatively minor impacts at the global scale. Whether soils gain or lose SOC depends upon the balance between C inputs and decomposition. Changes in net primary production (NPP) change C inputs to the soil, whilst decomposition usually increases under warmer temperatures, but can also be slowed by decreased soil moisture. Underlying the global trend of increasing SOC under future climate is a complex pattern of regional SOC change. SOC losses are projected to occur in northern latitudes, where higher SOC decomposition rates due to higher temperatures are not balanced by increased NPP, whereas in tropical regions NPP increases override losses due to higher SOC decomposition. The spatial heterogeneity in the response of SOC to changing climate shows how delicately balanced the competing gain and loss processes are, with subtle changes in temperature, moisture, soil type and land use interacting to determine whether SOC increases or decreases in the future. Our results suggest that we should stop looking for a single answer regarding whether SOC stocks will increase or decrease under future climate, since there is no single answer. Instead, we should focus on improving our prediction of the factors that determine the size and direction of change, and on the land management practices that can be implemented to protect and enhance SOC stocks.
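
As a toy illustration of the balance the abstract describes (C inputs from NPP versus temperature- and moisture-dependent decomposition), the single-pool sketch below shows how the same warming can produce either an SOC gain or a loss depending on whether inputs rise enough. This is not RothC, which tracks several pools; all rate constants and fluxes are hypothetical.

```python
import numpy as np

def soc_trajectory(npp_input, k_ref, q10, temp_ref, temps, moisture_factor, c0, dt=1.0):
    """Toy single-pool SOC balance: dC/dt = input - k(T, moisture) * C.

    npp_input       -- annual C input to soil from NPP (t C/ha/yr)
    k_ref           -- decomposition rate at the reference temperature (1/yr)
    q10             -- temperature sensitivity of decomposition
    temp_ref        -- reference temperature (deg C)
    temps           -- mean annual temperature for each simulated year
    moisture_factor -- 0-1 multiplier; drier soils decompose more slowly
    c0              -- initial SOC stock (t C/ha)
    """
    c = c0
    stocks = []
    for t in temps:
        k = k_ref * q10 ** ((t - temp_ref) / 10.0) * moisture_factor
        c += (npp_input - k * c) * dt
        stocks.append(c)
    return np.array(stocks)

# Hypothetical example: 2 degrees C of warming over 100 years, constant moisture.
years = 100
temps = 10.0 + np.linspace(0.0, 2.0, years)
warming_only = soc_trajectory(4.0, 0.03, 2.0, 10.0, temps, 1.0, 130.0)
with_extra_npp = soc_trajectory(4.4, 0.03, 2.0, 10.0, temps, 1.0, 130.0)
print("SOC after 100 yr, warming only       :", round(warming_only[-1], 1), "t C/ha")
print("SOC after 100 yr, warming + 10% NPP  :", round(with_extra_npp[-1], 1), "t C/ha")
```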

Relevance: 20.00%

Abstract:

A method is presented to calculate economic optimum fungicide doses accounting for the risk-aversion of growers responding to variability in disease severity between crops. Simple dose-response and disease-yield loss functions are used to estimate net disease-related costs (fungicide cost, plus disease-induced yield loss) as a function of dose and untreated severity. With fairly general assumptions about the shapes of the probability distribution of disease severity and the other functions involved, we show that a choice of fungicide dose which minimises net costs on average across seasons results in occasional large net costs caused by inadequate control in high disease seasons. This may be unacceptable to a grower with limited capital. A risk-averse grower can choose to reduce the size and frequency of such losses by applying a higher dose as insurance. For example, a grower may decide to accept ‘high loss’ years one year in ten or one year in twenty (i.e. specifying a proportion of years in which disease severity and net costs will be above a specified level). Our analysis shows that taking into account disease severity variation and risk-aversion will usually increase the dose applied by an economically rational grower. The analysis is illustrated with data on septoria tritici leaf blotch of wheat caused by Mycosphaerella graminicola. Observations from untreated field plots at sites across England over three years were used to estimate the probability distribution of disease severities at mid-grain filling. In the absence of a fully reliable disease forecasting scheme, reducing the frequency of ‘high loss’ years requires substantially higher doses to be applied to all crops. Disease resistant cultivars reduce both the optimal dose at all levels of risk and the disease-related costs at all doses.
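
Under simple assumed forms (an exponential dose-response curve, a linear severity-to-loss relation and a lognormal distribution of untreated severity, all purely illustrative rather than the functions fitted in the paper), the contrast between the risk-neutral dose and a risk-averse dose chosen to limit the one-in-ten bad year can be sketched as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) parameters, not those estimated in the study.
untreated_severity = rng.lognormal(mean=np.log(10.0), sigma=0.8, size=10_000)  # % severity
fungicide_price = 25.0     # cost per unit dose (per ha)
loss_per_severity = 15.0   # yield loss value per % severity (per ha)
dose_response_k = 2.5      # higher dose gives stronger severity reduction

def net_cost(dose, severity):
    """Fungicide cost plus disease-induced yield loss for a given dose."""
    controlled = severity * np.exp(-dose_response_k * dose)
    return fungicide_price * dose + loss_per_severity * controlled

doses = np.linspace(0.0, 2.0, 201)
mean_costs = [net_cost(d, untreated_severity).mean() for d in doses]
p90_costs = [np.percentile(net_cost(d, untreated_severity), 90) for d in doses]

dose_mean = doses[int(np.argmin(mean_costs))]  # risk-neutral optimum across seasons
dose_p90 = doses[int(np.argmin(p90_costs))]    # limits the 1-in-10 high-loss year

print(f"dose minimising mean net cost        : {dose_mean:.2f}")
print(f"dose minimising 90th-percentile cost : {dose_p90:.2f}")
```

With these toy numbers the quantile criterion selects a noticeably higher dose than the mean criterion, which is the qualitative behaviour the analysis describes.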

Relevance: 20.00%

Abstract:

This study aims to elucidate the key mechanisms controlling phytoplankton growth and decay within the Thames basin through the application of a modified version of an established river-algal model and comparison with observed stream water chlorophyll-a concentrations. The River Thames showed distinct simulated phytoplankton seasonality and behaviour, with high spring, moderate summer and low autumn chlorophyll-a concentrations. Three main sections were identified along the River Thames with different phytoplankton abundance and seasonality: (i) low chlorophyll-a concentrations from the source to Newbridge; (ii) a steep concentration increase between Newbridge and Sutton; and (iii) high concentrations, with a moderate further increase, from Sutton to the end of the study area (Maidenhead). However, local hydrological features (e.g. locks) and other conditions (e.g. radiation, water depth, grazer dynamics) affected the simulated growth and losses. The model achieved good simulation results during both calibration and testing across a range of hydrological and nutrient conditions. Simulated phytoplankton growth was controlled predominantly by residence time, but during medium to low flow periods available light, water temperature and herbivorous grazing defined algal community development. These results challenge the perceived importance of in-stream nutrient concentrations as the primary control on phytoplankton growth and death.
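
A minimal sketch of the control structure the abstract describes (growth limited by light and temperature, losses from grazing and from flushing set by residence time) is given below. It is not the Thames model itself, and every parameter value is illustrative.

```python
def net_growth_rate(mu_max, light, half_sat_light, temp, grazing_rate, residence_time_days):
    """Toy net growth rate (1/day) for river phytoplankton.

    Growth is limited by light (Monod form) and temperature (a common empirical
    exponential factor); losses combine grazing and downstream flushing,
    the latter approximated as 1 / residence time.
    """
    light_limitation = light / (half_sat_light + light)
    temp_factor = 1.066 ** (temp - 20.0)
    growth = mu_max * light_limitation * temp_factor
    losses = grazing_rate + 1.0 / residence_time_days
    return growth - losses

# Spring: long residence time behind locks, moderate light, few grazers.
print("spring:", round(net_growth_rate(1.5, 150, 100, 12, 0.05, 12.0), 2), "per day")
# Summer low flow: plenty of light but strong grazing pressure.
print("summer:", round(net_growth_rate(1.5, 300, 100, 20, 0.60, 20.0), 2), "per day")
# Autumn high flow: short residence time flushes cells downstream.
print("autumn:", round(net_growth_rate(1.5,  80, 100, 10, 0.05,  2.0), 2), "per day")
```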

Relevance: 20.00%

Abstract:

The assumption that ‘states' primary goal is survival’ lies at the heart of the neorealist paradigm. A careful examination of the assumption, however, reveals that neorealists draw upon a number of distinct interpretations of the ‘survival assumption’ that are then treated as if they are the same, pointing towards conceptual problems that surround the treatment of state preferences. This article offers a specification that focuses on two questions that highlight the role and function of the survival assumption in the neorealist logic: (i) what do states have to lose if they fail to adopt self-help strategies?; and (ii) how does concern for relevant losses motivate state behaviour and affect international outcomes? Answering these questions through the exploration of governing elites' sensitivity towards regime stability and territorial integrity of the state, in turn, addresses the aforementioned conceptual problems. This specification has further implications for the debates among defensive and offensive realists, potential extensions of the neorealist logic beyond the Westphalian states, and the relationship between neorealist theory and policy analysis.

Relevance: 20.00%

Abstract:

The surface mass balance (SMB) for Greenland and Antarctica has been calculated using model data from an AMIP-type experiment for the period 1979–2001 using the ECHAM5 spectral transform model at different triangular truncations. There is a significant reduction in the calculated ablation for the highest model resolution, T319, with an equivalent grid distance of ca. 40 km. As a consequence the T319 model has a positive surface mass balance for both ice sheets during the period. For Greenland, the models at lower resolution, T106 and T63, on the other hand, have a much stronger ablation leading to a negative surface mass balance. Calculations have also been undertaken for a climate change experiment using the IPCC scenario A1B, with a T213 resolution (corresponding to a grid distance of some 60 km), comparing two 30-year periods from the end of the twentieth century and the end of the twenty-first century, respectively. For Greenland there is a change of 495 km³/year, going from a positive to a negative surface mass balance, corresponding to a sea level rise of 1.4 mm/year. For Antarctica there is an increase in the positive surface mass balance of 285 km³/year, corresponding to a sea level fall of 0.8 mm/year. The surface mass balance changes of the two ice sheets lead to a sea level rise of 7 cm at the end of this century compared to the end of the twentieth century. Other possible mass losses, such as those due to changes in the calving of icebergs, are not considered. It appears that such changes must increase significantly, to several times more than the surface mass balance changes, if the ice sheets are to make a major contribution to sea level rise this century. The model calculations indicate large inter-annual variations in all relevant parameters, making it impossible to identify robust trends from the examined periods at the end of the twentieth century. The calculated inter-annual variations are similar in magnitude to observations. The 30-year trend in SMB at the end of the twenty-first century is significant. The increase in precipitation on the ice sheets closely follows the Clausius-Clapeyron relation and is the main reason for the increase in the surface mass balance of Antarctica. On Greenland, precipitation in the form of snow is gradually starting to decrease and cannot compensate for the increase in ablation. Another factor is the proportionally higher temperature increase on Greenland, leading to a larger ablation. It follows that a modest increase in temperature will not be sufficient to offset the increase in accumulation, but this will change when the temperature increase goes beyond a critical limit. Calculations show that such a limit for Greenland might well be passed during this century. For Antarctica this will take much longer, probably well into the following centuries.
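
For reference, the conversion behind the quoted per-year sea-level numbers can be sketched as below, assuming the quoted volume changes are water equivalent and taking a global ocean area of roughly 3.61 × 10⁸ km² (a standard approximate value). The resulting net rate of about 0.6 mm/year sustained over several decades is of the same order as the few-centimetre contribution quoted above.

```python
# Sea-level equivalent of an ice-sheet surface-mass-balance change.
OCEAN_AREA_KM2 = 3.61e8  # approximate global ocean area

def sea_level_mm_per_year(volume_change_km3_per_year):
    """Sea-level change (mm/yr) from a mass change given in km^3/yr water equivalent."""
    return volume_change_km3_per_year / OCEAN_AREA_KM2 * 1.0e6  # km converted to mm

greenland = sea_level_mm_per_year(495.0)    # SMB turns negative: contributes to rise
antarctica = sea_level_mm_per_year(-285.0)  # SMB becomes more positive: offsets rise
print(f"Greenland : +{greenland:.1f} mm/yr")   # about +1.4 mm/yr
print(f"Antarctica: {antarctica:.1f} mm/yr")   # about -0.8 mm/yr
```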

Relevance: 20.00%

Abstract:

The phase shift full bridge (PSFB) converter allows high efficiency power conversion at high frequencies through zero voltage switching (ZVS); the parasitic drain-to-source capacitance of the MOSFET is discharged by a resonant inductance before the switch is gated, resulting in near zero turn-on switching losses. Typically, an extra inductance is added to the leakage inductance of the transformer to form the resonant inductance necessary to charge and discharge the parasitic capacitances of the PSFB converter. However, many PSFB models do not consider the effects of the magnetizing inductance or dead-time in selecting the resonant inductance required to achieve ZVS. The choice of resonant inductance is crucial to the ZVS operation of the PSFB converter: an incorrectly sized resonant inductance will either fail to achieve ZVS or limit the load regulation ability of the converter. This paper presents a unique and accurate equation for calculating the resonant inductance required to achieve ZVS over a wide load range, incorporating the effects of the magnetizing inductance and dead-time. The derived equations are validated against PSPICE simulations of a PSFB converter and against extensive hardware experimentation.
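
The paper's derived equation is not reproduced in the abstract. For orientation, the widely used simplified energy criterion (which, as noted above, neglects magnetizing inductance and dead-time) can be sketched as follows; all component values are hypothetical.

```python
import math

V_IN = 400.0     # bus voltage (V), hypothetical
I_MIN = 2.0      # minimum primary current available for ZVS (A), hypothetical
C_OSS = 200e-12  # effective MOSFET output capacitance (F), hypothetical
C_TR = 100e-12   # transformer winding capacitance (F), hypothetical

def min_resonant_inductance(v_in, i_min, c_oss, c_tr):
    """Simplified textbook ZVS energy criterion for a PSFB leg:
        0.5 * L_r * i_min**2 >= 0.5 * (2 * c_oss + c_tr) * v_in**2
    i.e. the resonant inductance must store enough energy at the minimum
    primary current to swing the switch-node capacitances across the bus.
    It deliberately ignores magnetizing inductance and dead-time, which is
    exactly the simplification the paper's derived equation removes.
    """
    return (2.0 * c_oss + c_tr) * v_in ** 2 / i_min ** 2

l_r = min_resonant_inductance(V_IN, I_MIN, C_OSS, C_TR)
c_total = 2.0 * C_OSS + C_TR
t_quarter = (math.pi / 2.0) * math.sqrt(l_r * c_total)  # dead-time should cover this swing

print(f"L_r >= {l_r * 1e6:.1f} uH")
print(f"resonant quarter period ~ {t_quarter * 1e9:.0f} ns")
```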

Relevance: 20.00%

Abstract:

In recent years a number of chemistry-climate models have been developed with an emphasis on the stratosphere. Such models cover a wide range of time scales of integration and vary considerably in complexity. The results of specific diagnostics are here analysed to examine the differences amongst individual models and observations, to assess the consistency of model predictions, with a particular focus on polar ozone. For example, many models indicate a significant cold bias in high latitudes, the “cold pole problem”, particularly in the southern hemisphere during winter and spring. This is related to wave propagation from the troposphere, which can be improved by increasing model horizontal resolution and by using non-orographic gravity wave drag. As a result of the widely differing modelled polar temperatures, different amounts of polar stratospheric clouds are simulated, which in turn result in varying ozone values in the models. The results are also compared to determine the possible future behaviour of ozone, with an emphasis on the polar regions and mid-latitudes. All models predict eventual ozone recovery, but give a range of results concerning its timing and extent. Differences in the simulation of gravity waves and planetary waves, as well as model resolution, are likely major sources of uncertainty for this issue. In the Antarctic, the ozone hole has probably almost reached its maximum depth, although the vertical and horizontal extent of depletion may increase slightly further over the next few years. According to the model results, Antarctic ozone recovery could begin any year within the range 2001 to 2008. The limited number of models which have been integrated sufficiently far indicate that full recovery of ozone to 1980 levels may not occur in the Antarctic until about the year 2050. For the Arctic, most models indicate that small ozone losses may continue for a few more years and that recovery could begin any year within the range 2004 to 2019. The start of ozone recovery in the Arctic is therefore expected to occur later than in the Antarctic.

Relevance: 20.00%

Abstract:

Safety is an element of extreme priority in mining operations; currently, many traditional mining countries are investing in the implementation of wireless sensors capable of detecting risk factors through early warning signs, in order to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between the geological conditions and the mining method design establishes how a system of continuous monitoring should be implemented. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, and their relevant measuring ranges, are then presented; the advantages for management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining, based on the proportion of these events per year worldwide, in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety at underground mines is proposed; this approach offers a contribution to the design of personalized monitoring networks, and the experience developed in coal mines provides a tool that facilitates the development and application of this technology within underground coal mines.

Relevance: 20.00%

Abstract:

The fungal family Clavicipitaceae includes plant symbionts and parasites that produce several psychoactive and bioprotective alkaloids. The family includes grass symbionts in the epichloae clade (Epichloë and Neotyphodium species), which are extraordinarily diverse both in their host interactions and in their alkaloid profiles. Epichloae produce alkaloids of four distinct classes, all of which deter insects, and some—including the infamous ergot alkaloids—have potent effects on mammals. The exceptional chemotypic diversity of the epichloae may relate to their broad range of host interactions, whereby some are pathogenic and contagious, others are mutualistic and vertically transmitted (seed-borne), and still others vary in pathogenic or mutualistic behavior. We profiled the alkaloids and sequenced the genomes of 10 epichloae, three ergot fungi (Claviceps species), a morning-glory symbiont (Periglandula ipomoeae), and a bamboo pathogen (Aciculosporium take), and compared the gene clusters for four classes of alkaloids. Results indicated a strong tendency for alkaloid loci to have conserved cores that specify the skeleton structures and peripheral genes that determine chemical variations that are known to affect their pharmacological specificities. Generally, gene locations in cluster peripheries positioned them near to transposon-derived, AT-rich repeat blocks, which were probably involved in gene losses, duplications, and neofunctionalizations. The alkaloid loci in the epichloae had unusual structures riddled with large, complex, and dynamic repeat blocks. This feature was not reflective of overall differences in repeat contents in the genomes, nor was it characteristic of most other specialized metabolism loci. The organization and dynamics of alkaloid loci and abundant repeat blocks in the epichloae suggested that these fungi are under selection for alkaloid diversification. We suggest that such selection is related to the variable life histories of the epichloae, their protective roles as symbionts, and their associations with the highly speciose and ecologically diverse cool-season grasses.

Relevance: 20.00%

Abstract:

Dairy cow foot health is a subject of concern because it is considered to be the most important welfare problem in dairy farming and causes economic losses for the farmer. In order to improve dairy cow foot health it is important to take into account the attitude and intention of dairy farmers. In our study the objective was to gain insight into the attitude and intention of dairy farmers to take action to improve dairy cow foot health and determine drivers and barriers to take action, using the Theory of Planned Behavior. Five hundred Dutch dairy farmers were selected randomly and were invited by email to fill in an online questionnaire. The questionnaire included questions about respondents’ intentions, attitudes, subjective norms and perceived behavioral control and was extended with questions about personal normative beliefs. With information from such a framework, solution strategies for the improvement of dairy cow foot health can be proposed. The results showed that almost 70% of the dairy farmers had an intention to take action to improve dairy cow foot health. Most important drivers seem to be the achievement of better foot health with cost-effective measures. Possible barriers to taking action were labor efficiency and a long interval between taking action and seeing an improvement in dairy cow foot health. The feed advisor and foot trimmer seemed to have most influence on intentions to take action to improve dairy cow foot health. Most farmers seemed to be satisfied with the foot health status at their farm, which probably weakens the intention for foot health improvement, especially compared to other issues which farmers experience as more urgent. Subclinical foot disorders (where cows are not visibly lame) were not valued as important with respect to animal welfare. Furthermore, 25% of the respondents did not believe cows could suffer pain. Animal welfare, especially the provision of good care for the cows, was valued as important but was not related to intention to improve dairy cow foot health. The cost-effectiveness of measures seemed to be more important. Providing more information on the effects of taking intervention measures might stimulate farmers to take action to achieve improvement in dairy cow foot health.

Relevance: 20.00%

Abstract:

Power delivery for biomedical implants is a major consideration in their design for both measurement and stimulation. When performed by a wireless technique, transmission efficiency is critically important, not only because of the costs associated with any losses but also because of the nature of those losses; for example, excessive heat can be uncomfortable for the individual involved. In this study, a method and means of wireless power transmission suitable for biomedical implants are both discussed and experimentally evaluated. The procedure initiated is comparable in size and simplicity to those methods already employed; however, some of Tesla’s fundamental ideas have been incorporated in order to obtain a significant improvement in efficiency. This study contains a theoretical basis for the approach taken; however, the emphasis here is on practical experimental analysis.
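
As background for why efficiency dominates this design problem, the standard two-coil resonant link result can be sketched: the maximum achievable efficiency with an optimally matched load depends only on the coupling coefficient k and the coil quality factors. The values below are hypothetical implant-scale numbers, and this is not the specific Tesla-inspired arrangement evaluated in the study.

```python
import math

def max_link_efficiency(k, q1, q2):
    """Maximum efficiency of a two-coil resonant inductive link with an
    optimally matched load; the figure of merit is k**2 * Q1 * Q2.
    """
    fom = k ** 2 * q1 * q2
    return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

# Hypothetical case: weak coupling through tissue, external coil Q ~ 200,
# small implanted coil Q ~ 50.
for k in (0.02, 0.05, 0.10):
    eta = max_link_efficiency(k, 200.0, 50.0)
    print(f"k = {k:.2f}: max link efficiency ~ {eta * 100:.0f}%")
```

Whatever power is not delivered to the implant is dissipated in the coils and surrounding tissue, which is why even modest efficiency gains matter for heating and comfort.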

Relevance: 20.00%

Abstract:

Simulations of polar ozone losses were performed using the three-dimensional high-resolution (1° × 1°) chemical transport model MIMOSA-CHIM. Three Arctic winters (1999–2000, 2001–2002, 2002–2003) and three Antarctic winters (2001, 2002, 2003) were considered for the study. The cumulative ozone loss in the Arctic winter 2002–2003 reached around 35% at 475 K inside the vortex, as compared to more than 60% in 1999–2000. During 1999–2000, denitrification induces a maximum of about 23% extra ozone loss at 475 K, as compared to 17% in 2002–2003. Unlike these two colder Arctic winters, the 2001–2002 Arctic winter was warmer and did not experience much ozone loss. Sensitivity tests showed that the chosen resolution of 1° × 1° provides a better evaluation of ozone loss at the edge of the polar vortex in high solar zenith angle conditions. The simulation results for ozone, ClO, HNO3, N2O, and NOy for winters 1999–2000 and 2002–2003 were compared with measurements on board the ER-2 and Geophysica aircraft, respectively. Sensitivity tests showed that increasing the heating rates calculated by the model by 50% and doubling the PSC (polar stratospheric cloud) particle density (from 5 × 10⁻³ to 10⁻² cm⁻³) refines the agreement with in situ ozone, N2O and NOy levels. In this configuration, simulated ClO levels are increased and are in better agreement with observations in January, but are overestimated by about 20% in March. The use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections slightly increases ClO levels further, especially in high solar zenith angle conditions. Comparisons of the modelled ozone values with ozonesonde measurements in the Antarctic winter 2003 and with Polar Ozone and Aerosol Measurement III (POAM III) measurements in the Antarctic winters 2001 and 2002 show that the simulations underestimate the ozone loss rate at the end of the ozone destruction period. A slightly better agreement is obtained with the use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections.

Relevance: 20.00%

Abstract:

Possible changes in the frequency and intensity of windstorms under future climate conditions during the 21st century are investigated based on an ECHAM5 GCM multi-scenario ensemble. The intensity of a storm is quantified by the associated estimated loss derived using an empirical model. The geographical focus is ‘Core Europe’, which comprises countries of Western Europe. Possible changes in losses are analysed by comparing ECHAM5 GCM data for recent (20C, 1960 to 2000) and future climate conditions (B1, A1B, A2; 2060 to 2100), each with 3 ensemble members. Changes are quantified using both rank statistics and return periods (RPs), the latter estimated by fitting an extreme value distribution to potential storm losses using the peak-over-threshold method. The estimated losses for ECHAM5 20C and reanalysis events show similar statistical features in terms of return periods. Under future climate conditions, all climate scenarios show an increase in both the frequency and magnitude of potential losses caused by windstorms for Core Europe. Future losses that are double the highest ECHAM5 20C loss are identified for some countries. While positive changes of ranking are significant for many countries and multiple scenarios, significantly shorter RPs are mostly found under the A2 scenario for return levels corresponding to 20 yr losses or less. The emergence time of the statistically significant changes in loss varies from 2027 to 2100. These results imply an increased risk of occurrence of windstorm-associated losses, which can be largely attributed to changes in the meteorological severity of the events. Additionally, factors such as changes in the cyclone paths and in the location of the wind signatures relative to highly populated areas are also important in explaining the changes in estimated losses.
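
A minimal sketch of the peak-over-threshold step, fitting a Generalized Pareto Distribution to loss exceedances and reading off return levels, is given below. The loss series is synthetic and the threshold choice is illustrative, not the one used in the study.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)

# Synthetic storm-loss series (arbitrary units); in the study these would be
# the potential losses estimated from the ECHAM5 wind fields.
n_years = 40
losses = rng.pareto(3.0, size=n_years * 25) * 10.0  # roughly 25 events per year

# Peak over threshold: keep exceedances above a high threshold and fit a GPD.
threshold = np.quantile(losses, 0.95)
excesses = losses[losses > threshold] - threshold
shape, _, scale = genpareto.fit(excesses, floc=0.0)
rate = excesses.size / n_years  # exceedances per year

def return_level(T_years):
    """Loss expected to be exceeded on average once every T_years."""
    return threshold + scale / shape * ((rate * T_years) ** shape - 1.0)

for T in (5, 20, 50):
    print(f"{T:>3}-yr return level: {return_level(T):.1f}")
```

Comparing return levels fitted to the 20C and scenario ensembles in this way is what turns the simulated loss series into statements about changing return periods.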

Relevance: 20.00%

Abstract:

A statistical–dynamical regionalization approach is developed to assess possible changes in wind storm impacts. The method is applied to North Rhine-Westphalia (Western Germany) using the FOOT3DK mesoscale model for dynamical downscaling and ECHAM5/OM1 global circulation model climate projections. The method first classifies typical weather developments within the reanalysis period using a K-means cluster algorithm. Most historical wind storms are associated with four weather developments (primary storm-clusters). Mesoscale simulations are performed for representative elements of all clusters to derive a regional wind climatology. Additionally, 28 historical storms affecting Western Germany are simulated. Empirical functions are estimated to relate wind gust fields to insured losses. Transient ECHAM5/OM1 simulations show an enhanced frequency of primary storm-clusters and storms for 2060–2100 compared to 1960–2000. Accordingly, wind gusts increase over Western Germany, reaching locally +5% for 98th-percentile wind gusts (A2 scenario). Consequently, storm losses are expected to increase substantially (+8% for the A1B scenario, +19% for the A2 scenario). Regional patterns show larger changes over north-eastern parts of North Rhine-Westphalia than over western parts. For storms with return periods above 20 yr, loss expectations for Germany may increase by a factor of 2. These results document the method's ability to assess future changes in loss potentials in regional terms.
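
The classification step can be sketched as follows. The pressure-like fields and the storm-day flags are synthetic stand-ins for the reanalysis patterns and the historical storm catalogue, and the cluster count of 10 is arbitrary; the idea is simply to group days into typical weather developments and identify those clusters that account for most storms.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Stand-in for daily large-scale fields (e.g. pressure anomalies) over a
# European domain, flattened to one feature vector per weather development.
n_days, n_gridpoints, n_clusters = 2000, 400, 10
fields = rng.normal(size=(n_days, n_gridpoints))

# Classify typical weather developments with K-means, as in the method's first step.
km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(fields)

# Stand-in storm indicator (in the study: dates of historical wind storms).
is_storm_day = rng.random(n_days) < 0.02

# Identify the "primary storm-clusters": those with the highest share of storm days.
storm_share = np.array([is_storm_day[km.labels_ == c].mean() for c in range(n_clusters)])
primary = np.argsort(storm_share)[::-1][:4]
print("clusters ranked most storm-prone:", primary)
```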