896 results for global approach
Abstract:
Our article focuses on the region of Chilean Patagonia and considers how it has developed as a leading producer of salmon for global food markets. It addresses the problem of how to decentre conventional views of the forces driving regional development that give primacy to the role of capital and technology, instead giving due recognition to the knowledge and practices of situated actors and to the relationships that form between human and non-human entities in food-producing regions. As an alternative, we ask whether an assemblage approach can improve our understanding of regional transformation. To explore this question, we present original ethnographic data on constitutive practices that have transformed the Patagonian region, from the territorialization of Salmonidae species to experimentation in ocean ranching and seawater fish farming, and finally the development of a global industry. The evidence leads us to argue that in a complex globalised world, assemblage theory offers a valuable approach for understanding how regional potential is realised. In the case of Chilean Patagonia, it is apparent that forms of bio-power generate new relations between life, agency and nature, stimulating contemporary regional transformations in ways overlooked by the linear logic of capital objectification discourses. Applying an assemblage approach enables the significance of new contemporary human–non-human relationships and inter-subjectivities to come to the fore, keeping the social in view as potential for regional transformation and new power asymmetries continuously emerge.
Social connection and practice-dependence: some recent developments in the global justice literature
Abstract:
This review essay discusses two recent attempts to reform the framework in which issues of international and global justice are discussed: Iris Marion Young’s ‘social connection’ model and the practice-dependent approach, here exemplified by Ayelet Banai, Miriam Ronzoni and Christian Schemmel’s edited collection. I argue that while Young’s model may fit some issues of international or global justice, it misconceives the problems that many of them pose. Indeed, its difficulties point precisely in the direction of practice dependence as it is presented by Banai et al. I go on to discuss what seem to be the strengths of that method, and particularly Banai et al.’s defence of it against the common claim that it is biased towards the status quo. I also discuss Andrea Sangiovanni and Kate MacDonald’s contributions to the collection.
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
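The sensitivity to the reference period described above can be sketched numerically. The series below are entirely synthetic stand-ins (not CMIP5 output or real observations); the point is only that the fraction of model runs lying above the observations over a recent decade shifts with the choice of baseline:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1961, 2021)
# Synthetic stand-ins: 10 "model runs" and one "observed" series (illustrative only)
models = 0.010 * (years - 1961) + rng.normal(0.0, 0.1, (10, years.size))
obs = 0.008 * (years - 1961) + rng.normal(0.0, 0.1, years.size)

def anomalies(series, years, ref_start, ref_end):
    """Anomalies relative to the mean over the reference period [ref_start, ref_end]."""
    ref = (years >= ref_start) & (years <= ref_end)
    return series - series[..., ref].mean(axis=-1, keepdims=True)

for ref in [(1961, 1990), (1986, 2005)]:
    m = anomalies(models, years, *ref)
    o = anomalies(obs, years, *ref)
    # Fraction of model runs warmer than observations in the final decade
    frac = (m[:, -10:].mean(axis=1) > o[-10:].mean()).mean()
    print(ref, round(frac, 2))
```

Where the observations sit in the model distribution depends on the reference window, even though the underlying series never change.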
Abstract:
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
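A minimal sketch of this kind of empirical system, under strong simplifying assumptions: one value per year, a synthetic CO2-like trend as primary predictor and a synthetic ENSO-like index as secondary predictor. A multiple linear regression supplies the forecast mean, and the hindcast residual spread supplies a Gaussian probabilistic forecast:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 53                                   # hindcast years, 1961-2013
co2 = np.linspace(315, 400, n)           # primary predictor: CO2-equivalent (ppm, synthetic)
enso = rng.normal(0, 1, n)               # secondary predictor: ENSO-like index (synthetic)
temp = 0.01 * co2 + 0.3 * enso + rng.normal(0, 0.2, n)  # synthetic predictand

# Fit the multiple linear regression by least squares
X = np.column_stack([np.ones(n), co2, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

def forecast(co2_new, enso_new):
    """Gaussian probabilistic forecast: mean from the regression, spread from residuals."""
    mean = beta @ np.array([1.0, co2_new, enso_new])
    return mean, resid_sd
```

A real system would select predictors per region and season on physical grounds and verify with correlation and continuous rank probability skill scores, as the abstract describes.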
Abstract:
Purpose – The purpose of this paper is to present the model of the translation of particularly important ideas for the organization and its context, called mythical ideas. Design/methodology/approach – The study is based on ethnographic research. Findings – It is found that change processes based on mythical ideas are especially dynamic but also very vulnerable. The consequences of failure can be vital for the organization and its environment. Originality/value – The paper explores the outcomes to which the translation of a mythical idea can lead. The findings are of value for people involved in organizational change processes.
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also allows eddy kinetic energy to be estimated. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as one single system with regard to its reliability in reproducing climate signals, where both variability and uncertainties are assessed through the ensemble spread and the signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods.
This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that the eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
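The ensemble diagnostics mentioned above (ensemble mean, spread and signal-to-noise ratio) reduce to a few lines; here with four synthetic reanalysis-like series rather than the actual MyOcean systems:

```python
import numpy as np

rng = np.random.default_rng(2)
# Four synthetic "reanalyses" of a yearly ocean index over 1993-2011 (illustrative)
years = np.arange(1993, 2012)
signal = 0.02 * (years - 1993)                       # common climate signal
ens = signal + rng.normal(0, 0.05, (4, years.size))  # per-system differences as noise

ens_mean = ens.mean(axis=0)        # treat the ensemble mean as one single system
spread = ens.std(axis=0, ddof=1)   # uncertainty attributable to assimilation choices
snr = np.abs(ens_mean) / spread    # signal-to-noise ratio per year
```

With real reanalyses, the spread reflects how much the assimilation methods disagree, which is exactly the part of the total uncertainty the abstract says this design can assess.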
Abstract:
This paper introduces the special issue of Climatic Change on the QUEST-GSI project, a global-scale multi-sectoral assessment of the impacts of climate change. The project used multiple climate models to characterise plausible climate futures with consistent baseline climate and socio-economic data and consistent assumptions, together with a suite of global-scale sectoral impacts models. It estimated impacts across sectors under specific SRES emissions scenarios, and also constructed functions relating impact to change in global mean surface temperature. This paper summarises the objectives of the project and its overall methodology, outlines how the project approach has been used in subsequent policy-relevant assessments of future climate change under different emissions futures, and summarises the general lessons learnt in the project about model validation and the presentation of multi-sector, multi-region impact assessments and their associated uncertainties to different audiences.
Abstract:
Biological invasions threaten the native biota of several countries and this threat is even greater in the tropical regions that have the greatest biodiversity. In order to evaluate the representativeness of studies on invasive plants in tropical countries compared to the world, as well as the region of origin and habits of the most reported invasive plants in research, we analyzed the publications from eight of the most important international journals that address the theme, from January 1995 to December 2004. The articles on biological invasions were classified as theoretical or as case studies, and according to their approach, main question, where the study was conducted, and the region of origin and habit of the invasive plant. Case studies predominated, as did questions about the environment's susceptibility to the invasion, the species' invasive power and the impacts it had. The most reported invasive species were herbaceous plants from Asia and Europe. Few articles address tropical environments and only one referred to Brazil. Most referred to North America and Europe. This small number of publications in the tropics indicates the need for a global projection on this subject and underscores the lack of consistent and organized data to understand the phenomenon and propose effective strategies to combat biological invasion.
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first one consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and the parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. These results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
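The column generation idea can be illustrated on the cutting stock part alone. The sketch below is not the authors' combined lot-sizing formulation: the part widths, demands and roll length are made-up numbers, the LP master is solved with SciPy's `linprog`, and the pricing subproblem is an unbounded integer knapsack solved by dynamic programming. The loop stops when no cutting pattern has a negative reduced cost, i.e. when the best knapsack value is at most 1:

```python
import numpy as np
from scipy.optimize import linprog

widths = np.array([45, 36, 31])    # made-up part widths
demand = np.array([97, 610, 395])  # made-up demands
W = 100                            # raw-material (roll) length

def knapsack(values, widths, W):
    """Pricing subproblem: most valuable cutting pattern fitting in one roll."""
    best = np.zeros(W + 1)
    choice = np.full(W + 1, -1)
    for c in range(1, W + 1):
        for i, w in enumerate(widths):
            if w <= c and best[c - w] + values[i] > best[c]:
                best[c] = best[c - w] + values[i]
                choice[c] = i
    pattern = np.zeros(len(widths), dtype=int)
    c = W
    while c > 0 and choice[c] != -1:  # reconstruct the chosen pattern
        pattern[choice[c]] += 1
        c -= widths[choice[c]]
    return best[W], pattern

# Start from the trivial single-width patterns
patterns = [np.eye(len(widths), dtype=int)[i] * (W // widths[i])
            for i in range(len(widths))]
while True:
    A = np.array(patterns).T                 # rows: part types, cols: patterns
    res = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals           # shadow prices of the demand constraints
    value, pattern = knapsack(duals, widths, W)
    if value <= 1 + 1e-9:                    # no column with negative reduced cost
        break
    patterns.append(pattern)

rolls = int(np.ceil(res.fun))                # rounded-up LP bound on rolls needed
```

Rounding the LP relaxation up is only a bound; the paper's combined model additionally couples these pattern frequencies with the lot sizing decisions and storage costs.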
Abstract:
Deviations from the average can provide valuable insights about the organization of natural systems. The present article extends this important principle to the systematic identification and analysis of singular motifs in complex networks. Six measurements quantifying different and complementary features of the connectivity around each node of a network were calculated, and multivariate statistical methods applied to identify singular nodes. The potential of the presented concepts and methodology was illustrated with respect to different types of complex real-world networks, namely the US air transportation network, the protein-protein interactions of the yeast Saccharomyces cerevisiae and the Roget thesaurus networks. The obtained singular motifs possessed unique functional roles in the networks. Three classic theoretical network models were also investigated, with the Barabasi-Albert model resulting in singular motifs corresponding to hubs, confirming the potential of the approach. Interestingly, the number of different types of singular node motifs as well as the number of their instances were found to be considerably higher in the real-world networks than in any of the benchmark networks. Copyright (C) EPLA, 2009
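A reduced sketch of the singular-motif idea, assuming only three connectivity measurements (degree, clustering coefficient, mean neighbour degree) rather than the six used in the article, on a synthetic random graph with one planted hub; nodes are ranked by squared Mahalanobis distance in the standardized measurement space:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60
A = (rng.random((n, n)) < 0.08).astype(int)
A = np.triu(A, 1); A = A + A.T             # random undirected graph (illustrative)
A[0, 1:16] = A[1:16, 0] = 1                # plant one hub-like node at index 0

deg = A.sum(axis=1).astype(float)
tri = np.diag(A @ A @ A) / 2.0             # triangles through each node
wedges = deg * (deg - 1) / 2.0
clust = np.divide(tri, wedges, out=np.zeros(n), where=wedges > 0)
nbr_deg = np.divide(A @ deg, deg, out=np.zeros(n), where=deg > 0)  # mean neighbour degree

# Standardize the measurements and score each node by Mahalanobis distance
M = np.column_stack([deg, clust, nbr_deg])
Z = (M - M.mean(axis=0)) / (M.std(axis=0) + 1e-12)
cov = np.cov(Z.T)
d2 = np.einsum("ij,jk,ik->i", Z, np.linalg.pinv(cov), Z)  # squared Mahalanobis distance
singular = np.argsort(d2)[::-1][:3]        # the most atypical nodes
```

On a scale-free network this kind of score tends to flag hubs, consistent with the Barabasi-Albert result reported above; the article's multivariate analysis is richer but follows the same deviation-from-average principle.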
Abstract:
Complex networks exist in many areas of science such as biology, neuroscience, engineering, and sociology. The growing development of this area has led to the introduction of several topological and dynamical measurements, which describe and quantify the structure of networks. Such characterization is essential not only for the modeling of real systems but also for the study of dynamic processes that may take place in them. However, it is not easy to use several measurements for the analysis of complex networks, due to the correlation between them and the difficulty of their visualization. To overcome these limitations, we propose an effective and comprehensive approach for the analysis of complex networks, which allows the visualization of several measurements in a few projections that contain the largest data variance, and the characterization of networks at three levels of detail: vertices, communities, and the global topology. We also demonstrate the efficiency and universality of the proposed methods on a series of real-world networks at all three levels.
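Projecting many correlated measurements onto the few directions of largest variance is ordinary principal component analysis; a minimal sketch with a synthetic measurement matrix (rows could be vertices, communities, or whole networks):

```python
import numpy as np

def pca_project(M, k=2):
    """Project rows of a measurement matrix onto the k directions of largest variance."""
    Z = (M - M.mean(axis=0)) / (M.std(axis=0) + 1e-12)  # standardize each measurement
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T                                 # first k principal components

# Illustrative: 100 items described by 6 topological measurements (synthetic)
rng = np.random.default_rng(4)
M = rng.normal(size=(100, 6))
M[:, 3] = M[:, 0] * 2 + rng.normal(0, 0.1, 100)  # correlated measurements are typical
coords = pca_project(M, k=2)                     # 2-D coordinates for visualization
```

Because correlated measurements collapse onto shared components, two or three projections often retain most of the variance, which is what makes the visualization feasible.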
Abstract:
A novel global optimization method based on an Augmented Lagrangian framework is introduced for continuous constrained nonlinear optimization problems. At each outer iteration k the method requires the ε_k-global minimization of the Augmented Lagrangian with simple constraints, where ε_k → ε. Global convergence to an ε-global minimizer of the original problem is proved. The subproblems are solved using the αBB method. Numerical experiments are presented.
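A toy illustration of the outer loop, not the authors' algorithm: the ε-global subproblem solver is replaced by exhaustive grid search over a box (standing in for αBB), on the problem min x² + y² subject to x + y = 1, whose solution is (0.5, 0.5):

```python
import numpy as np

def f(x): return x[0]**2 + x[1]**2    # objective (illustrative)
def h(x): return x[0] + x[1] - 1.0    # equality constraint h(x) = 0

def global_min_box(fun, lo=-2.0, hi=2.0, n=201):
    """Approximate global minimization over a box by grid search (stand-in for alphaBB)."""
    g = np.linspace(lo, hi, n)
    X, Y = np.meshgrid(g, g)
    vals = fun(np.stack([X, Y]))
    i = np.unravel_index(np.argmin(vals), vals.shape)
    return np.array([X[i], Y[i]])

lam, rho = 0.0, 1.0
for _ in range(20):                   # outer iterations
    L = lambda x: f(x) + lam * h(x) + 0.5 * rho * h(x)**2  # Augmented Lagrangian
    x = global_min_box(L)
    lam += rho * h(x)                 # multiplier update
    if abs(h(x)) > 0.05:              # tolerance matched to the grid resolution
        rho *= 2.0                    # tighten the penalty while infeasible
```

The iterates drive the multiplier toward its optimal value and the constraint violation toward zero; the real method replaces the grid search with the ε_k-global subproblem solves described in the abstract.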
Abstract:
This paper reports an expert system (SISTEMAT) developed for the structural determination of diverse chemical classes of natural products, including lignans, based mainly on the 13C NMR and 1H NMR data of these compounds. The system is composed of five programs that analyze specific data for a lignan and show a skeleton probability for the compound. At the end of the analyses, the results are grouped, the global probability is computed, and the most probable skeleton is exhibited to the user. SISTEMAT correctly predicted the skeletons of 80% of the 30 lignans tested, demonstrating its usefulness for shortening the course of structural elucidation.
Abstract:
This study has its origin in the failed climate negotiations at the 2009 Copenhagen summit. By conducting a public good game with participants from China and Sweden, my study indicates that previous studies on public good games can predict the outcome of the game to a quite large extent, even though most of my statistical tests came out statistically insignificant. My study also indicates that framing the game as climate negotiations made no statistically significant difference to the level of contributions in comparison to the unframed versions of the game. Awareness of the issues of emissions, global warming and other environmental problems is fairly high, but even so, when push comes to shove, gains in the short run are prioritized over gains in the long run. There is, however, a hypothetical willingness to come to terms with the environmental issues. The results of the study indicate that the outcome of the Copenhagen summit could have been avoided, but additional experiments on cultural differences and behavior would be needed.
Abstract:
When an accurate hydraulic network model is available, direct modeling techniques are very straightforward and reliable for on-line leakage detection and localization in a large class of water distribution networks. In general, this type of technique, based on analytical models, can be seen as an application of the well-known fault detection and isolation theory for complex industrial systems. Nonetheless, the assumption of single-leak scenarios is usually made, considering a certain leak size pattern which may not hold in real applications. Upgrading a leak detection and localization method based on a direct modeling approach to handle multiple-leak scenarios can be, on the one hand, quite straightforward but, on the other hand, highly computationally demanding for a large class of water distribution networks, given the huge number of potential water loss hotspots. This paper presents a leakage detection and localization method suitable for multiple-leak scenarios and a large class of water distribution networks. The method can be seen as an upgrade of the above-mentioned direct modeling approach, into which a global search method based on genetic algorithms has been integrated in order to estimate the network water loss hotspots and the sizes of the leaks. This is an inverse/direct modeling method which tries to benefit from both approaches: on the one hand, the exploration capability of genetic algorithms to estimate network water loss hotspots and leak sizes; on the other hand, the straightforwardness and reliability offered by an accurate hydraulic model to assess the network areas around the estimated hotspots. The application of the resulting method to a DMA of the Barcelona water distribution network is provided and discussed. The obtained results show that leakage detection and localization under multiple-leak scenarios can be performed efficiently with a straightforward procedure.
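The inverse/direct combination can be caricatured as follows. Everything here is a stand-in: a random linear sensitivity matrix replaces the hydraulic model, and the node indices, leak sizes and GA parameters are made up. A simple genetic algorithm searches for leak-size vectors that reproduce the "measured" pressure residuals:

```python
import numpy as np

rng = np.random.default_rng(5)
n_nodes, n_sensors = 30, 8
S = rng.random((n_sensors, n_nodes))     # toy sensitivity: leak size -> pressure residual
true_leaks = np.zeros(n_nodes)
true_leaks[[4, 17]] = [2.0, 1.5]         # two simultaneous leaks (illustrative)
measured = S @ true_leaks                # "measured" residuals from the toy model

def fitness(pop):
    """Misfit between each candidate's predicted residuals and the measurements."""
    return np.linalg.norm(pop @ S.T - measured, axis=1)

pop = rng.random((200, n_nodes)) * 3.0   # initial population of leak-size vectors
for _ in range(300):
    err = fitness(pop)
    parents = pop[np.argsort(err)[:50]]                   # truncation selection
    mates = parents[rng.integers(0, 50, (150, 2))]
    mask = rng.random((150, n_nodes)) < 0.5
    children = np.where(mask, mates[:, 0], mates[:, 1])   # uniform crossover
    mutate = rng.random((150, n_nodes)) < 0.05
    children = np.clip(children + mutate * rng.normal(0, 0.3, (150, n_nodes)), 0, 3)
    pop = np.vstack([parents, children])

best = pop[np.argmin(fitness(pop))]
hotspots = np.argsort(best)[::-1][:2]    # candidate leak locations for direct-model checks
```

In the actual method, the GA's candidate hotspots and leak sizes are then assessed with the accurate hydraulic model over the surrounding network areas, which resolves the ambiguity a few sensors leave in a purely inverse fit.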