855 results for Pareto Frontier
Abstract:
A new frontier in weather forecasting is emerging, with operational forecast models now being run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms and the updrafts within them is much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with the standard mixing length performs best across all diagnostics, although a greater mixing length improves the representation of deep convective storms.
Abstract:
The idea of Sustainable Intensification comes as a response to the challenge of avoiding the overexploitation of resources such as land, water and energy while increasing food production to meet the demand of a growing global population. Sustainable Intensification means that farmers need to simultaneously increase yields and sustainably use limited natural resources, such as water. Within the agricultural sector, water has a number of uses, including irrigation, spraying, drinking for livestock and washing (vegetables, livestock buildings). In order to achieve Sustainable Intensification, measures are needed that inform policy makers and managers about the relative performance of farms as well as possible ways to improve that performance. We provide a benchmarking tool to assess relative water use efficiency at the farm level and suggest pathways to improve farm-level productivity by identifying best practices for reducing excessive use of water for irrigation. Data Envelopment Analysis techniques, including analysis of returns to scale, were used to evaluate any excess in agricultural water use on 66 horticulture farms located in different river basin catchments across England. We found that farms in the sample can, on average, reduce water requirements by 35% and still achieve the same output (gross margin) when compared to their peers on the frontier. In addition, 47% of the farms operate under increasing returns to scale, indicating that these farms will need to develop economies of scale to achieve input cost savings. Regarding the adoption of specific water use efficiency management practices, we found that the use of a decision support tool, the recycling of water and the installation of trickle/drip/spray-line irrigation systems have a positive impact on water use efficiency at the farm level, whereas the use of other irrigation systems, such as overhead irrigation, was found to have a negative effect on water use efficiency.
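For readers unfamiliar with Data Envelopment Analysis, the sketch below shows the kind of linear program behind such a benchmarking score: an input-oriented, variable-returns-to-scale model solved with scipy. The farm data, and the reduction to a single water input and a single gross-margin output, are hypothetical simplifications rather than the paper's specification.

```python
# Minimal sketch of an input-oriented DEA efficiency score (not the paper's exact
# model); farm data below are hypothetical. Requires numpy and scipy.
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, o, vrs=True):
    """Input-oriented DEA score for unit `o`.
    X: (m_inputs, n_units) inputs, e.g. irrigation water used.
    Y: (s_outputs, n_units) outputs, e.g. gross margin.
    Returns theta in (0, 1]; theta < 1 means inputs could shrink by (1 - theta)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    # convexity constraint sum_j lam_j = 1 gives variable returns to scale
    A_eq, b_eq = (np.r_[0.0, np.ones(n)][None, :], [1.0]) if vrs else (None, None)
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical example: water use (input) and gross margin (output) for 5 farms.
X = np.array([[120.0, 80.0, 200.0, 150.0, 60.0]])   # megalitres of water
Y = np.array([[300.0, 260.0, 310.0, 280.0, 200.0]]) # gross margin (thousand GBP)
print([round(dea_input_efficiency(X, Y, o), 3) for o in range(X.shape[1])])
```

Comparing the scores obtained with and without the convexity constraint (the `vrs` flag) is the usual route to the returns-to-scale diagnosis mentioned in the abstract.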
Abstract:
We propose a bargaining process supergame over the strategies to play in a non-cooperative game. The agreement reached by players at the end of the bargaining process is the strategy profile that they will play in the original non-cooperative game. We analyze the subgame perfect equilibria of this supergame and their implications for the original game. We discuss the existence, uniqueness, and efficiency of the agreement reachable through this bargaining process. We illustrate the consequences of applying such a process to several common two-player non-cooperative games: the Prisoner’s Dilemma, the Hawk-Dove Game, the Trust Game, and the Ultimatum Game. In each of them, the proposed bargaining process gives rise to Pareto-efficient agreements that are typically different from the Nash equilibria of the original games.
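The gap the bargaining process is designed to close can be seen in the simplest of the cited games. The sketch below uses a textbook Prisoner's Dilemma payoff matrix (the numbers are illustrative, not taken from the paper) to contrast the pure-strategy Nash equilibrium with the Pareto-efficient strategy profiles.

```python
# Sketch: for a standard Prisoner's Dilemma (usual textbook payoffs, not values
# from the paper), the pure-strategy Nash equilibrium (D, D) is not
# Pareto-efficient, which is exactly the gap the bargaining supergame targets.
from itertools import product

# payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
payoffs = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
actions = ["C", "D"]

def pure_nash(payoffs):
    eq = []
    for r, c in product(actions, actions):
        best_r = all(payoffs[(r, c)][0] >= payoffs[(a, c)][0] for a in actions)
        best_c = all(payoffs[(r, c)][1] >= payoffs[(r, a)][1] for a in actions)
        if best_r and best_c:
            eq.append((r, c))
    return eq

def pareto_efficient(payoffs):
    profiles = list(payoffs)
    def dominated(p):
        return any(all(payoffs[q][i] >= payoffs[p][i] for i in range(2)) and
                   any(payoffs[q][i] > payoffs[p][i] for i in range(2))
                   for q in profiles if q != p)
    return [p for p in profiles if not dominated(p)]

print("Nash:", pure_nash(payoffs))                     # [('D', 'D')]
print("Pareto-efficient:", pareto_efficient(payoffs))  # (C,C), (C,D), (D,C)
```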
Abstract:
Reconsidering the initial Christian Conversion of Scotland in the fifth and sixth centuries AD, using archaeological and historical evidence, it is argued that this was carried out by missionaries from what had been Roman Britain. It is shown that this missionary activity - and similar British missions in Ireland - represents the first instance of Western missionary work beyond the former Roman imperial frontiers. The location of the northern frontier of Roman Britain in the fourth century, and the meaning of Pictish Class 1 symbol stones, are discussed as part of the broader argument.
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production compared to the ‘Fire suppression’ scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that are facing an increased risk of large wildfires.
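As an illustration of the modelling technique, the sketch below runs a tiny fuzzy cognitive map forward in time and clamps one driver to mimic a drier-climate scenario. The concepts, weights and squashing function are generic placeholders, not the maps elicited from the focus groups.

```python
# Minimal fuzzy-cognitive-map sketch. Concepts and weights are illustrative
# placeholders, not the maps constructed in the study.
import numpy as np

concepts = ["drought", "controlled_burning", "escaped_fires", "wildfire_risk",
            "agricultural_production"]
# W[i, j]: signed influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, 0.0, 0.3, 0.7, -0.4],   # drought
    [0.0, 0.0, -0.5, -0.4, 0.2],  # controlled burning
    [0.0, 0.0, 0.0, 0.6, -0.3],   # escaped agricultural fires
    [0.0, 0.0, 0.0, 0.0, -0.6],   # wildfire risk
    [0.0, 0.2, 0.3, 0.0, 0.0],    # agricultural production
])

def run_fcm(W, state, clamp=None, steps=30, lam=2.0):
    """Iterate A <- sigmoid(lam * (A + A @ W)); `clamp` pins scenario drivers."""
    A = state.copy()
    for _ in range(steps):
        A = 1.0 / (1.0 + np.exp(-lam * (A + A @ W)))
        if clamp:
            for name, value in clamp.items():
                A[concepts.index(name)] = value
    return dict(zip(concepts, np.round(A, 2)))

baseline = run_fcm(W, np.full(len(concepts), 0.5))
drier = run_fcm(W, np.full(len(concepts), 0.5), clamp={"drought": 0.9})
print("baseline:     ", baseline)
print("drier climate:", drier)
```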
Abstract:
Based on a large dataset from eight Asian economies, we test the impact of post-crisis regulatory reforms on the performance of depository institutions in countries at different levels of financial development. We allow for technological heterogeneity and estimate a set of country-level stochastic cost frontiers followed by a deterministic bootstrapped meta-frontier to evaluate cost efficiency and cost technology. Our results support the view that liberalization policies have a positive impact on bank performance, while the reverse is true for prudential regulation policies. The removal of activity restrictions, bank privatization and foreign bank entry have a positive and significant impact on technological progress and cost efficiency. In contrast, prudential policies, which aim to protect the banking sector from excessive risk-taking, tend to adversely affect banks' cost efficiency but not cost technology.
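The country-level building block of this analysis is a stochastic cost frontier. The sketch below fits the textbook normal/half-normal cost frontier by maximum likelihood on simulated data; it omits the meta-frontier and bootstrap steps and is not the paper's exact specification.

```python
# Sketch of a normal/half-normal stochastic *cost* frontier fitted by maximum
# likelihood on simulated data (single-country case only).
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))                       # e.g. log output, log input price
beta_true = np.array([1.0, 0.6, 0.3])             # intercept + slopes
v = rng.normal(scale=0.2, size=n)                 # symmetric noise
u = np.abs(rng.normal(scale=0.3, size=n))         # inefficiency, raises cost
X = np.column_stack([np.ones(n), x])
ln_cost = X @ beta_true + v + u

def neg_loglik(theta):
    beta, ls_v, ls_u = theta[:3], theta[3], theta[4]
    s_v, s_u = np.exp(ls_v), np.exp(ls_u)
    sigma = np.hypot(s_v, s_u)
    lam = s_u / s_v
    eps = ln_cost - X @ beta                      # composed error v + u
    # density of eps for a cost frontier: (2/sigma) * phi(eps/sigma) * Phi(lam*eps/sigma)
    ll = (np.log(2.0 / sigma) + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(lam * eps / sigma))
    return -ll.sum()

res = optimize.minimize(neg_loglik,
                        x0=np.r_[np.zeros(3), np.log(0.2), np.log(0.2)],
                        method="BFGS")
print("beta:", np.round(res.x[:3], 2),
      "sigma_v, sigma_u:", np.round(np.exp(res.x[3:]), 2))
```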
Abstract:
Following the 1997 crisis, banking sector reforms in Asia have been characterised by the emphasis on prudential regulation, associated with increased financial liberalisation. Using a panel data set of commercial banks from eight major Asian economies over the period 2001-2010, this study explores how the coexistence of liberalisation and prudential regulation affects banks’ cost characteristics. Given the presence of heterogeneity of technologies across countries, we use a stochastic frontier approach followed by the estimation of a deterministic meta-frontier to provide ‘true’ estimates of bank cost efficiency measures. Our results show that the liberalization of bank interest rates and the increase in foreign banks' presence have had a positive and significant impact on technological progress and cost efficiency. On the other hand, we find that prudential regulation might adversely affect bank cost performance. When designing an optimal regulatory framework, policy makers should combine policies which aim to foster financial stability without hindering financial intermediation.
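For context, the meta-frontier idea is usually expressed through a decomposition of the following form (a sketch in standard notation; the paper's exact formulation may differ):

```latex
% Cost efficiency measured against the meta-frontier factors into the
% country-level score and a technology gap ratio (TGR). Notation is illustrative.
\[
  \mathrm{CE}^{\mathrm{meta}}_{i}
    = \frac{C^{\mathrm{meta}}(w_i, y_i)}{C_i}
    = \underbrace{\frac{C^{\mathrm{meta}}(w_i, y_i)}{C^{k}(w_i, y_i)}}_{\mathrm{TGR}_i \,\le\, 1}
      \times
      \underbrace{\frac{C^{k}(w_i, y_i)}{C_i}}_{\mathrm{CE}^{k}_{i}},
\]
% where C_i is bank i's observed cost, C^k the country-k frontier cost, and
% C^meta the meta-frontier cost that envelops all country frontiers from below.
```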
Abstract:
This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each one of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both an annual cycle and a linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant compared to the null hypothesis of no trend at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with the p-value being virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount has increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
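The model-comparison step can be reproduced in miniature. The sketch below fits a GPD to simulated threshold excesses with (i) a constant scale and (ii) a trend in the scale, and compares the fits with the bias-corrected AIC; the trend is placed on the log of the scale for numerical convenience, which is not necessarily the covariate form used in the paper.

```python
# Sketch: GPD fits to simulated excesses, with and without a trend in the scale,
# compared via AICc (AIC with second-order bias correction).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 800
t = np.linspace(0.0, 1.0, n)                       # time, rescaled to [0, 1]
xi_true = 0.1
sigma_t = 8.0 * np.exp(0.4 * t)                    # slowly increasing scale
# inverse-CDF simulation of GPD excesses
excess = sigma_t / xi_true * ((1 - rng.uniform(size=n)) ** (-xi_true) - 1)

def nll(params, trend):
    if trend:
        a, b, xi = params
        sigma = np.exp(a + b * t)
    else:
        a, xi = params
        sigma = np.exp(a) * np.ones_like(t)
    z = 1.0 + xi * excess / sigma
    if abs(xi) < 1e-6 or np.any(z <= 0):
        return np.inf
    return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(z))

def aicc(nll_value, k, n):
    return 2 * k + 2 * nll_value + 2 * k * (k + 1) / (n - k - 1)

fit0 = minimize(nll, x0=[np.log(5.0), 0.2], args=(False,), method="Nelder-Mead")
fit1 = minimize(nll, x0=[np.log(5.0), 0.0, 0.2], args=(True,), method="Nelder-Mead")
print("AICc, constant scale:", round(aicc(fit0.fun, 2, n), 1))
print("AICc, trend in scale:", round(aicc(fit1.fun, 3, n), 1))
```

A lower AICc for the trend model mirrors the GPD-1 versus GPD-3 comparison described in the abstract.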
Abstract:
The abundance of heavy r-elements may provide a better understanding of the r-process, and the determination of several reference r-elements should allow a better determination of a star's age. The space UV region (lambda < 3000 angstrom) presents a large number of lines of the heavy elements, and in the case of some elements, such as Bi, Pt and Au, detectable lines are not available elsewhere. The extreme "r-process star" CS 31082-001 ([Fe/H] = -2.9) was observed in the space UV to determine abundances of the heaviest stable elements, using STIS on board the Hubble Space Telescope.
Abstract:
We present preliminary results for the estimation of barium [Ba/Fe] and strontium [Sr/Fe] abundance ratios using medium-resolution spectra (1-2 angstrom). We established a calibration between the abundance ratios and line indices for Ba and Sr, using multiple regression and artificial neural network techniques. A comparison between the two techniques (showing the advantage of the latter), as well as a discussion of future work, is presented.
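A minimal version of such an index-to-abundance calibration, fitting both a multiple linear regression and a small neural network with scikit-learn, is sketched below on synthetic stand-in data (the predictors and target are hypothetical, not the observed spectra).

```python
# Sketch of the index -> abundance calibration idea: fit a multiple linear
# regression and a small neural network to map line-strength indices onto [Ba/Fe].
# Synthetic stand-in data, not the survey spectra used in the work.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 300
# hypothetical predictors: a Ba II line index, an Fe-blend index, and [Fe/H]
X = np.column_stack([rng.uniform(0, 2, n), rng.uniform(0, 3, n),
                     rng.uniform(-3, 0, n)])
ba_fe = 0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, ba_fe, random_state=0)
lin = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
print("linear R^2:", round(lin.score(X_te, y_te), 3))
print("ANN    R^2:", round(ann.score(X_te, y_te), 3))
```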
Abstract:
Clustering is a difficult task: there is no single cluster definition and the data can have more than one underlying structure. Pareto-based multi-objective genetic algorithms (e.g., MOCK, Multi-Objective Clustering with automatic K-determination, and MOCLE, Multi-Objective Clustering Ensemble) were proposed to tackle these problems. However, the output of such algorithms can often contain a large number of partitions, making it difficult for an expert to analyze all of them manually. In order to deal with this problem, we present two selection strategies, based on the corrected Rand index, to choose a subset of solutions. To test them, we applied them to the sets of solutions produced by MOCK and MOCLE on several datasets. The study was also extended to select a reduced set of partitions from the initial population of MOCLE. These analyses show that both versions of the proposed selection strategy are very effective. They can significantly reduce the number of solutions and, at the same time, keep the quality and the diversity of the partitions in the original set of solutions. (C) 2010 Elsevier B.V. All rights reserved.
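One plausible corrected-Rand-based selection strategy is sketched below: partitions are added greedily so that each new pick is as dissimilar as possible, in adjusted Rand terms, from those already chosen. The paper's two strategies may differ in detail, and the partitions here are toy label vectors.

```python
# Sketch of a corrected-Rand-based subset selection (greedy "most dissimilar next");
# not necessarily either of the paper's two strategies.
import numpy as np
from sklearn.metrics import adjusted_rand_score

def select_diverse(partitions, k):
    """Pick k partitions so that each new pick has the lowest maximum
    adjusted Rand similarity to those already selected."""
    chosen = [0]                                   # seed with the first partition
    while len(chosen) < k:
        best, best_sim = None, np.inf
        for i in range(len(partitions)):
            if i in chosen:
                continue
            sim = max(adjusted_rand_score(partitions[i], partitions[j])
                      for j in chosen)
            if sim < best_sim:
                best, best_sim = i, sim
        chosen.append(best)
    return chosen

# toy Pareto set: label vectors for the same 8 objects under different partitions
partitions = [
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 1, 2, 2],
    [0, 0, 0, 0, 1, 1, 1, 2],
    [0, 1, 0, 1, 0, 1, 0, 1],
]
print(select_diverse(partitions, k=2))
```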
Abstract:
In this paper, we present an algorithm for cluster analysis that integrates aspects of cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm, with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without the need for expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with Multi-Objective Clustering with automatic K-determination (MOCK), the algorithm most closely related to ours. (C) 2009 Elsevier B.V. All rights reserved.
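The Pareto-based core of such an algorithm amounts to keeping the non-dominated partitions under the chosen validation measures. A minimal sketch of that filtering step is given below, with two generic objectives to be minimised (the actual validation measures used as objectives may differ).

```python
# Sketch of the Pareto step only: keep the partitions that are non-dominated
# under two validation objectives (both minimised here).
import numpy as np

def non_dominated(scores):
    """scores: (n_partitions, n_objectives) array, all objectives minimised.
    Returns indices of partitions not dominated by any other."""
    n = scores.shape[0]
    keep = []
    for i in range(n):
        dominated = any(np.all(scores[j] <= scores[i]) and
                        np.any(scores[j] < scores[i])
                        for j in range(n) if j != i)
        if not dominated:
            keep.append(i)
    return keep

# toy example: (compactness, connectivity) for five candidate partitions
scores = np.array([[0.30, 0.80],
                   [0.50, 0.40],
                   [0.90, 0.10],
                   [0.60, 0.60],   # dominated by [0.50, 0.40]
                   [0.35, 0.90]])  # dominated by [0.30, 0.80]
print(non_dominated(scores))       # [0, 1, 2]
```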
Abstract:
Cortical bones, essential for mechanical support and structure in many animals, involve a large number of canals organized in an intricate fashion. By using state-of-the-art image analysis and computer graphics, the 3D reconstruction of a whole bone (phalange) of a young chicken was obtained and represented in terms of a complex network, where each canal was associated with an edge and every confluence of three or more canals yielded a respective node. The representation of the bone canal structure as a complex network has allowed several methods to be applied in order to characterize and analyze the organization of the canal system and its robustness. First, the distribution of the node degrees (i.e. the number of canals connected to each node) confirmed previous indications that bone canal networks follow a power law, and therefore present some highly connected nodes (hubs). The bone network was also found to be partitioned into communities or modules, i.e. groups of nodes which are more intensely connected to one another than to the rest of the network. We verified that each community exhibited distinct topological properties that are possibly linked with their specific function. In order to better understand the organization of the bone network, its resilience to two types of failures (random attack and cascaded failures) was also quantified comparatively to randomized and regular counterparts. The results indicate that the modular structure improves the robustness of the bone network when compared to a regular network with the same average degree and number of nodes. The effects of disease processes (e.g., osteoporosis) and mutations in genes (e.g., BMP4) that occur at the molecular level can now be investigated at the mesoscopic level by using network-based approaches.
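The robustness comparison can be illustrated with generic graph models. The sketch below, using networkx, removes a random fraction of nodes from a modular graph and from a regular graph of similar average degree and reports the surviving giant-component fraction; the graphs are standard generators, not the reconstructed bone network, and only the random-failure case is shown.

```python
# Sketch of the random-failure comparison: giant-component fraction after random
# node removal, for a modular graph versus a regular graph of similar average degree.
import random
import networkx as nx

def giant_fraction(G, remove_frac, seed=0):
    rng = random.Random(seed)
    H = G.copy()
    kill = rng.sample(list(H.nodes), int(remove_frac * H.number_of_nodes()))
    H.remove_nodes_from(kill)
    if H.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(H), key=len)
    return len(giant) / G.number_of_nodes()

n = 400
modular = nx.planted_partition_graph(8, n // 8, 0.25, 0.01, seed=1)  # 8 communities
regular = nx.random_regular_graph(d=14, n=n, seed=1)                 # similar degree

for f in (0.1, 0.3, 0.5):
    print(f"remove {f:.0%}: modular {giant_fraction(modular, f):.2f}  "
          f"regular {giant_fraction(regular, f):.2f}")
```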
Abstract:
Based on a divide and conquer approach, knowledge about nature has been organized into a set of interrelated facts, allowing a natural representation in terms of graphs: each 'chunk' of knowledge corresponds to a node, while relationships between such chunks are expressed as edges. This organization becomes particularly clear in the case of mathematical theorems, with their intense cross-implications and relationships. We have derived a web of mathematical theorems from Wikipedia and, thanks to the powerful concept of entropy, identified its most central and frontier elements. Our results also suggest that the central nodes are the oldest theorems, while the frontier nodes are those recently added to the network. The network communities have also been identified, allowing further insights into the organization of this network, such as its highly modular structure.
Abstract:
In this paper, a simple relation between the Leimkuhler curve and the mean residual life is established. The result is illustrated with several models commonly used in informetrics, such as exponential, Pareto and lognormal. Finally, relationships with some other reliability concepts are also presented. (C) 2010 Elsevier Ltd. All rights reserved.
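One way such a relation can be written, using only the standard definitions of the Leimkuhler curve and the mean residual life, is sketched below; the paper's exact formulation may differ.

```latex
% Sketch derived from the standard definitions (not necessarily the paper's form).
% With quantile function Q = F^{-1}, mean \mu, and mean residual life
% e(x) = E[X - x \mid X > x]:
\[
  K(p) \;=\; \frac{1}{\mu}\int_{1-p}^{1} Q(t)\,dt
        \;=\; \frac{p}{\mu}\Bigl[\,Q(1-p) + e\bigl(Q(1-p)\bigr)\Bigr],
        \qquad 0 < p \le 1 .
\]
% Example (Pareto, F(x) = 1 - (\sigma/x)^{\alpha}, \alpha > 1): e(x) = x/(\alpha-1),
% and the formula reduces to the familiar K(p) = p^{(\alpha-1)/\alpha}.
```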