864 results for change models
Abstract:
In this paper, we examine the major predictions made so far regarding the nature of climate change and its impacts on our region, in the light of the known errors of the set of models and the observations over this century. The major predictions of the climate models about the impact of increased concentrations of greenhouse gases are at variance with the observations over the Indian region during the last century, a period characterized by such increases and by global warming. It is important to note that, as far as the Indian region is concerned, the impact of year-to-year variation of the monsoon will continue to dominate over longer-period changes even in the presence of global warming. Recent studies have also brought out the uncertainties in the yields simulated by crop models. It is suggested that a deeper understanding of the links between climate and agricultural productivity is essential for generating reliable predictions of the impact of climate change. Such an insight is also required for identifying cropping patterns and management practices that are tailored for sustained maximum yield in the face of the vagaries of the monsoon.
Abstract:
Relatively few studies have addressed water management and adaptation measures in the face of changing water balances due to climate change. The current work studies the impact of climate change on the performance of a multipurpose reservoir and derives adaptive policies for possible future scenarios. The method developed in this work is illustrated with a case study of the Hirakud reservoir on the Mahanadi river in Orissa, India, which is a multipurpose reservoir serving flood control, irrigation and power generation. Climate change effects on annual hydropower generation and four performance indices (reliability with respect to three reservoir functions, viz. hydropower, irrigation and flood control; resiliency; vulnerability; and deficit ratio with respect to hydropower) are studied. Outputs from three general circulation models (GCMs) for three scenarios each are downscaled to monsoon streamflow in the Mahanadi river for two future time slices, 2045-65 and 2075-95. Increased irrigation demands, rule curves dictated by the increased need for flood storage, and downscaled projections of streamflow from the ensemble of GCMs and scenarios are used for projecting future hydrologic scenarios. It is seen that hydropower generation and the reliabilities with respect to hydropower and irrigation are likely to decrease in the future in most scenarios, whereas the deficit ratio and vulnerability are likely to increase as a result of climate change if the standard operating policy (SOP) using current rule curves for flood protection is employed. An optimal monthly operating policy is then derived using stochastic dynamic programming (SDP) as an adaptive policy for mitigating the impacts of climate change on reservoir operation. The objective of this policy is to maximize reliabilities with respect to the multiple reservoir functions of hydropower, irrigation and flood control. In variations of this adaptive policy, increasingly more weight is given to maximizing reliability with respect to hydropower for two extreme scenarios. It is seen that by marginally sacrificing reliability with respect to irrigation and flood control, hydropower reliability and generation can be increased for future scenarios. This suggests that reservoir rules for flood control may have to be revised in basins where climate change projects an increasing probability of droughts. However, it is also seen that power generation cannot be restored to current levels, due in part to the large projected increases in irrigation demand. This suggests that future water balance deficits may limit the success of adaptive policy options.
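A minimal sketch, with hypothetical data and using standard definitions of these indices rather than the paper's exact formulations, of how reliability, resiliency, vulnerability and deficit ratio can be computed for one reservoir function from a simulated release/demand series:

```python
import numpy as np

def performance_indices(release, demand):
    """Reliability, resiliency, vulnerability and deficit ratio for one
    reservoir function (e.g. hydropower), using standard definitions.
    `release` and `demand` are equal-length monthly series (hypothetical data)."""
    release, demand = np.asarray(release, float), np.asarray(demand, float)
    failure = release < demand                      # months where demand is not met
    reliability = 1.0 - failure.mean()              # fraction of satisfactory months
    # resiliency: probability of recovering from a failure in the next month
    recoveries = np.sum(failure[:-1] & ~failure[1:])
    resiliency = recoveries / failure.sum() if failure.any() else 1.0
    # vulnerability: average deficit in failure months
    deficits = np.where(failure, demand - release, 0.0)
    vulnerability = deficits[failure].mean() if failure.any() else 0.0
    deficit_ratio = deficits.sum() / demand.sum()   # total deficit over total demand
    return reliability, resiliency, vulnerability, deficit_ratio

# Example with synthetic numbers (20 years of monthly data)
rng = np.random.default_rng(0)
demand = np.full(240, 100.0)
release = demand - rng.choice([0.0, 20.0], size=240, p=[0.8, 0.2])
print(performance_indices(release, demand))
```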
Abstract:
Impacts of climate change on hydrology are assessed by downscaling large-scale general circulation model (GCM) outputs of climate variables to local-scale hydrologic variables. This modelling approach is characterized by uncertainties resulting from the use of different models, different scenarios, etc. Modelling uncertainty in climate change impact assessment involves assigning weights to GCMs and scenarios, based on their performance, and providing a weighted mean projection for the future. This projection is then used for water resources planning and adaptation to combat the adverse impacts of climate change. The present article summarizes the recently published work of the authors on uncertainty modelling and the development of adaptation strategies to climate change for the Mahanadi river in India.
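A back-of-the-envelope illustration of the weighted mean projection described above; the weights and streamflow values below are made up for illustration, with the weights assumed to come from some performance measure:

```python
# Hypothetical performance-based weights for three GCM-scenario combinations
weights = [0.5, 0.3, 0.2]                 # assumed, must sum to 1
projections = [820.0, 760.0, 900.0]       # projected monsoon streamflow (arbitrary units)

weighted_mean = sum(w * p for w, p in zip(weights, projections))
print(weighted_mean)  # 0.5*820 + 0.3*760 + 0.2*900 = 818.0
```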
Abstract:
The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology based on generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special kind of change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a different kind of generality, namely whether a generalization continues to hold across possible background conditions. The more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability of generalizations that furnishes us with explanatory generalizations, stability has an important function in the context of explanations: it provides extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which one model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results. Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out reasons as to why they do so: they show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
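A toy sketch, not taken from the dissertation, of the kind of sensitivity analysis described above: a single ecological model (a discrete-time logistic growth model, assumed here) is re-run while one parameter is varied to see how strongly the outcome depends on it:

```python
def logistic(r, K=100.0, n0=5.0, steps=50):
    """Discrete-time logistic growth; returns the final population size."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1.0 - n / K)
    return n

# Sensitivity analysis: re-run the same model while varying one parameter (r)
for r in [0.1, 0.5, 1.0, 1.8, 2.5]:
    print(f"r = {r:.1f}: final N = {logistic(r):.1f}")
```

A robustness analysis, by contrast, would compare different models of the same growth phenomenon (e.g. logistic versus Ricker dynamics) and ask whether their results converge.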
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
We examine the stability of hadron resonance gas models by extending them to include undiscovered resonances through the Hagedorn formula. We find that the influence of unknown resonances on thermodynamics is large but bounded. We model the decays of resonances and investigate the ratios of particle yields in heavy-ion collisions. We find that observables such as hydrodynamics and hadron yield ratios change little upon extending the model. As a result, heavy-ion collisions at RHIC and the LHC are insensitive to a possible exponential rise in the hadronic density of states, thus increasing the stability of the predictions of hadron resonance gas models in this context. Hadron resonance gases are internally consistent up to a temperature higher than the crossover temperature in QCD, but by examining quark number susceptibilities we find that their region of applicability ends below the QCD crossover.
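For orientation, a commonly used form of the Hagedorn density of states referred to above is an exponentially rising mass spectrum; the prefactor and the parameters A, m_0 and the Hagedorn temperature T_H are fit-dependent, and this generic form is not necessarily the paper's exact parameterization:

```latex
\rho(m) \;\simeq\; \frac{A}{\left(m^{2}+m_{0}^{2}\right)^{5/4}}\; e^{\,m/T_{H}}
```

States above the experimentally established part of the spectrum can then be populated according to this density and added to the resonance-gas sum.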
Abstract:
The paper proposes two methodologies for damage identification from the measured natural frequencies of a contiguously damaged reinforced concrete beam, idealised with a distributed damage model. The first method identifies damage from Iso-Eigen-Value-Change contours, plotted between pairs of different frequencies. The performance of the method is checked for a wide variation of damage positions and extents. The method is also extended to a discrete structure in the form of a five-storied shear building, and the simplicity of the method is demonstrated. The second method uses a smeared damage model, where the damage is assumed constant over different segments of the beam and the lengths and centres of these segments are known inputs. A first-order perturbation method is used to derive the relevant expressions. Both methods are based on distributed damage models and have been checked against an experimental programme on simply supported reinforced concrete beams subjected to different stages of symmetric and unsymmetric damage. The results of the experiments are encouraging and show that both methods can be adopted together in a damage identification scenario.
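For context, the standard first-order perturbation relation underlying such frequency-based identification (a textbook result stated here for mass-normalized mode shapes, not reproduced from the paper) is

```latex
\Delta\lambda_i \;=\; \Delta\!\left(\omega_i^{2}\right) \;\approx\; \boldsymbol{\phi}_i^{\mathsf T}\, \Delta\mathbf{K}\, \boldsymbol{\phi}_i
```

so that measured changes in a few natural frequencies yield a set of linear equations in the unknown segment-wise stiffness reductions, i.e. the damage parameters.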
Abstract:
Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date, the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not really a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework where, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable obtained by training a conditional random field (CRF) model on each natural cluster are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster (''cluster linking'') and scaled by the GCM performance with respect to the associated cluster for the present period (''frequency scaling''). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change. Significantly improved agreement between GCM predictions is seen owing to cluster linking and frequency scaling, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
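A minimal sketch of the ''cluster linking'' and ''frequency scaling'' steps described above: each cluster-wise projection is weighted by its projected future regime frequency and by the GCM's skill for that regime in the present climate. All numbers and names are hypothetical, and the simple weighted average below stands in for the paper's richer D-S evidence combination:

```python
import numpy as np

# Hypothetical cluster-wise streamflow projections from one CRF downscaling model
cluster_projection = np.array([450.0, 700.0, 980.0])   # e.g. dry / normal / wet regimes

# ''Cluster linking'': projected future frequency of occurrence of each regime
future_frequency = np.array([0.45, 0.35, 0.20])

# ''Frequency scaling'': GCM skill in reproducing each regime for the present period
gcm_skill = np.array([0.9, 0.7, 0.5])

weights = future_frequency * gcm_skill
weights /= weights.sum()                                # normalise to a weight vector

combined = np.dot(weights, cluster_projection)
print(weights.round(3), combined.round(1))
```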
Abstract:
Cosmopolitan ideals have been on the philosophical agenda for several millennia, but the end of the Cold War started a new discussion on state sovereignty, global democracy, the role of international law and global institutions. The Westphalian state system, in practice since the 17th century, is transforming, and the democracy deficit needs new solutions. An impetus has been the fact that in the present world an international body representing global citizens does not exist. In this Master's thesis, the possibility of establishing a world parliament is examined. In a case analysis, 17 models of a world parliament from two journals, a volume of essays and two other publications are discussed. Based on general observations, the models are divided into four thematic groups. The models are analyzed with an emphasis on feasible and probable elements. Further, a new scenario with a time frame of thirty years is proposed based on the methodology of normative futures studies, taking special interest in causal relationships and actions leading to change. The scenario presents three gradual steps that each need to be realized before a sustainable world parliament is established. The theoretical framework is based on social constructivism, and changes in international and multi-level governance are examined with the concepts of globalization, democracy and sovereignty. A feasible, desirable and credible world parliament is constituted gradually by applying electoral, democratic and legal measures, with members initially drawn from exclusively democratic states, parliamentarians, non-governmental organizations and other groups. The parliament should be located outside the United Nations context, since a new body avoids the problem of inefficiency currently prevailing in the UN. The main objectives of the world parliament are to safeguard peace and international law and to offer legal advice in cases where international law has been violated. A feasible world parliament is advisory in the beginning, but it is granted legislative powers in the future. The number of members in the world parliament could also be extended, following the example of the EU enlargement process.
Synthetic peptide models for the redox-active disulfide loop of glutaredoxin. Conformational studies
Abstract:
Two cyclic peptide disulfides Boc-Cys-Pro-X-Cys-NHMe (X = L-Tyr or L-Phe) have been synthesized as models for the 14-membered redox-active disulfide loop of glutaredoxin. 1H NMR studies at 270 MHz in chloroform solutions establish a type I β-turn conformation for the Pro-X segment in both peptides, stabilized by a 4→1 hydrogen bond between the Cys(1) CO and Cys(4) NH groups. Nuclear Overhauser effects establish that the aromatic ring in the X = Phe peptide is oriented over the central peptide unit. In dimethyl sulfoxide solutions two conformational species are observed in slow exchange on the NMR time scale, for both peptides. These are assigned to type I and type II β-turn structures with -Pro-Tyr(Phe)- as the corner residues. The structural assignments are based on correlation of NMR parameters with model 14-membered cyclic cystine peptides with Pro-X spacers. Circular dichroism studies based on the S-S n→σ* transition suggest a structural change in the disulfide bridge with changing solvent polarity, establishing conformational coupling between the peptide backbone and the disulfide linkage in these systems.
Abstract:
A linear optimization model was used to calculate seven wood procurement scenarios for the years 1990, 2000 and 2010. Productivity and cost functions for seven cutting methods, five terrain transport methods, three long-distance transport methods and various work supervision and scaling methods were calculated from available work study reports. All methods are based on the Nordic cut-to-length system. Finland was divided into three parts to describe the harvesting conditions. Twenty imaginary wood processing points and their wood procurement areas were created for these areas. The procurement systems, which consist of the harvesting conditions and work productivity functions, were described as a simulation model. In the LP model the wood procurement system has to fulfil the volume and wood assortment requirements of the processing points while minimizing the procurement cost. The model consists of 862 variables and 560 constraints. The results show that it is economical to increase the share of mechanized work in harvesting. Cost increment alternatives have only a small effect on the profitability of manual work. The areas of later thinnings and of seed tree and shelterwood cuttings increase at the expense of first thinnings. In mechanized work one method, the 10-tonne single-grip harvester and forwarder, is gaining an advantage over the other methods. Working hours of the forwarder are decreasing, in contrast to those of the harvester. There is little need to increase the number of harvesters and trucks, or of their drivers, from today's level. Quite large fluctuations in the level of procurement and cost can be handled with a constant number of machines, by varying the number of seasonal workers and by running the machines in two shifts. This is possible provided that some environmental problems of large-scale summertime harvesting can be solved.
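A toy sketch of the kind of cost-minimizing LP described above, reduced to a tiny instance with made-up coefficients (the actual model has 862 variables and 560 constraints):

```python
from scipy.optimize import linprog

# Decision variables: volume harvested by method 1 (manual) and method 2 (mechanized)
cost = [14.0, 9.0]                 # procurement cost per m^3, hypothetical

# Constraints in linprog's A_ub @ x <= b_ub form:
#  - the processing point needs at least 100 000 m^3:  -(x1 + x2) <= -100000
#  - harvester capacity limits method 2 to 70 000 m^3
A_ub = [[-1.0, -1.0],
        [ 0.0,  1.0]]
b_ub = [-100_000.0, 70_000.0]

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)              # expected: 30 000 m^3 manual, 70 000 m^3 mechanized
```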
Abstract:
Representation and quantification of uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change.
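A small sketch of Dempster's rule of combination as used conceptually above, applied to two hypothetical basic probability assignments (bpa's) over simplified drought classes; the focal elements, masses and class labels are made up for illustration, and the paper also examines alternative combination rules with different conflict handling:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for bpa's whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                      # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical bpa's from two GCMs over classes D = drought, N = normal, W = wet
m_gcm1 = {frozenset({"D"}): 0.6, frozenset({"N"}): 0.3, frozenset({"D", "N", "W"}): 0.1}
m_gcm2 = {frozenset({"D"}): 0.5, frozenset({"W"}): 0.2, frozenset({"D", "N", "W"}): 0.3}

print(dempster_combine(m_gcm1, m_gcm2))
```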
Abstract:
M.A. (Educ.) Anu Kajamaa from the University of Helsinki, Center for Research on Activity, Development and Learning (CRADLE), examines change efforts and their consequences in health care in the public sector. The aim of her academic dissertation is, by providing a new conceptual framework, to widen our understanding of organizational change efforts and their consequences and managerial challenges. Despite the multiple change efforts, the results of health care development projects have not been very promising, and many developmental needs and managerial challenges exist. The study challenges the predominant, well-framed health care change paradigm and calls for an expanded view to explore the underlying issues and multiplicities of change efforts and their consequences. The study asks what kind of expanded conceptual framework is needed to better understand organizational change as transcending currently dominant oppositions in management thinking, specifically in the field of health care. The study includes five explorative case studies of health care change efforts and their consequences in Finland. Theory and practice are tightly interconnected in the study. The methodology of the study integrates the ethnography of organizational change, a narrative approach and cultural-historical activity theory. From the stance of activity theory, historicity, contradictions, locality and employee participation play significant roles in developing health care. The empirical data of the study has mainly been collected in two projects, funded by the Finnish Work Environment Fund. The data was collected in public sector health care organizations during the years 2004-2010. By exploring the oppositions between distinct views on organizational change and the multi-site, multi-level and multi-logic nature of organizational change, the study develops an expanded, multidimensional activity-theoretical framework on organizational change and management thinking. The findings of the study contribute to activity theory and organization studies, and provide information for health care management and practitioners. The study demonstrates that continuous development efforts that are bridged to one another and anchored to collectively created new activity models can lead to significant improvements and organizational learning in health care. The study presents such expansive learning processes. The ways of conducting change efforts in organizations play a critical role in the creation of collective new practices and tools and in establishing ownership over them. Some of the studied change efforts were discontinuous or encapsulated, not benefiting the larger whole. The study shows that the stagnation and unexpected consequences of change efforts relate to the unconnectedness of the different organizational sites, levels and logics. If not dealt with, unintended consequences such as obstacles, breaks and conflicts may stall promising change and learning processes.