804 results for Pareto frontier
Abstract:
This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach where the tails are modelled by the Generalized Pareto Distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density - a GARCH(1,1) model. Our primary finding is that both in-sample and for a hold-out sample, our extreme value approach yields results superior to either of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacy. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
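The semiparametric idea described in this abstract can be sketched as follows: quantiles below a threshold come from the empirical distribution, while tail quantiles use a Generalized Pareto fit to threshold exceedances. This is an illustrative sketch on synthetic heavy-tailed data, not the paper's code; the threshold choice (95th percentile) and the `semiparametric_var` helper are assumptions.

```python
# Sketch (not the paper's implementation): empirical body + GPD tail.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000)  # hypothetical heavy-tailed returns

u = np.quantile(losses, 0.95)             # tail threshold (assumed choice)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)  # shape xi, scale beta

def semiparametric_var(p):
    """Quantile at level p: empirical below the threshold, GPD above."""
    if p <= 0.95:
        return np.quantile(losses, p)
    n, n_u = len(losses), len(exceedances)
    # Peaks-over-threshold quantile: u + (beta/xi)*(((n/n_u)*(1-p))**(-xi) - 1)
    return u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)

print(semiparametric_var(0.99))
```

The GPD governs only the extreme quantiles, which is where the capital requirement calculation is most sensitive.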
Abstract:
The Crusades in the Near East, eastern Baltic and Iberian Peninsula (in the context of the Reconquest/reconquista) were accompanied by processes of colonisation, characterising the expansion of medieval Europe and resulting in the creation of frontier societies at the fringes of Christendom. Colonisation was closely associated with — indeed, depended on — the exploitation of local environments, but this dimension is largely missing from studies of the crusading frontiers. This paper, the product of a European Science Foundation Exploratory Workshop on 'The Ecology of Crusading' in 2009, surveys the potential for investigating the environmental impact of the crusading movement in all three frontier regions. It considers a diverse range of archaeological, palaeoenvironmental and written sources, with the aim of situating the societies created by the Crusades within the context of medieval colonisation and human ecological niche construction. It demonstrates that an abundant range of data exists for developing this largely neglected and disparately studied aspect of medieval frontier societies into a significant research programme.
Abstract:
This paper redefines technical efficiency by incorporating provision of environmental goods as one of the outputs of the farm. The proportion of permanent and rough grassland to total agricultural land area is used as a proxy for the provision of environmental goods. Stochastic frontier analysis was conducted using a Bayesian procedure. The methodology is applied to panel data on 215 dairy farms in England and Wales. Results show that farm efficiency rankings change when provision of environmental outputs by farms is incorporated in the efficiency analysis, which may have important policy implications.
Abstract:
Controllers for feedback substitution schemes demonstrate a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subjected to arbitrary noise power gain and robustness constraints, a Pareto-front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a genetic algorithm (GA) to perform a multi-objective optimization for the controller weights (MOGA). A clonal selection algorithm is used to further provide a directed search of the GA towards the Pareto front. We demonstrate that with the proposed methodology, it is possible to design higher order controllers with superior performance in terms of response time, noise power gain and robustness.
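The multi-objective search described above ultimately reduces to identifying non-dominated designs. The snippet below is a generic non-dominated filter over candidate designs scored on objectives to be minimised; it is not the paper's LSDP/MOGA machinery, and the `(response_time, noise_power_gain)` scores are hypothetical.

```python
# Generic Pareto-front extraction (lower is better on every objective).
import numpy as np

def pareto_front(points):
    """Return the non-dominated rows of `points`."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        # p is dominated if some other point is no worse everywhere
        # and strictly better somewhere.
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(pts) if j != i
        )
        if not dominated:
            keep.append(i)
    return pts[keep]

# Hypothetical (response_time, noise_power_gain) scores for four designs:
designs = [(1.0, 3.0), (2.0, 1.0), (1.5, 2.0), (2.5, 2.5)]
print(pareto_front(designs))
```

In a GA-based optimiser, such a filter ranks each generation so the search is pushed towards the Pareto front rather than towards a single scalarised optimum.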
Abstract:
A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak over threshold, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time-series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the General Circulation Model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed; this corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions; distinct simulations carried out under the same experimental setup provide sensibly different estimates of the 200-yr WAI return level. The consequences of this phenomenon in applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
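The block-maximum method referred to above can be illustrated in a few lines: take the maximum of each block (e.g. each simulated year), fit a GEV, and invert its CDF for a return level. The data below are synthetic, not the ECHAM4.6 indices.

```python
# Illustrative block-maximum analysis with a 200-yr return level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
daily = rng.gumbel(loc=0.0, scale=1.0, size=(200, 365))  # 200 synthetic "years"
block_maxima = daily.max(axis=1)                         # one maximum per block

# Note: SciPy's shape parameter c equals minus the usual GEV shape xi.
c, loc, scale = genextreme.fit(block_maxima)
return_level_200yr = genextreme.ppf(1 - 1 / 200, c, loc, scale)
print(return_level_200yr)
```

Re-running this with different seeds shows how sample variability propagates into the estimated return level, which is precisely the tail-variability issue the abstract highlights.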
Abstract:
Markowitz showed that assets can be combined to produce an 'Efficient' portfolio that will give the highest level of portfolio return for any level of portfolio risk, as measured by the variance or standard deviation. These portfolios can then be connected to generate what is termed an 'Efficient Frontier' (EF). In this paper we discuss the calculation of the Efficient Frontier for combinations of assets, again using the spreadsheet Optimiser. To illustrate the derivation of the Efficient Frontier, we use the data from the Investment Property Databank Long Term Index of Investment Returns for the period 1971 to 1993. Many investors might require a certain specific level of holding or a restriction on holdings in at least some of the assets. Such additional constraints may be readily incorporated into the model to generate a constrained EF with upper and/or lower bounds. This can then be compared with the unconstrained EF to see whether the reduction in return is acceptable. To see the effect that these additional constraints may have, we adopt a fairly typical pension fund profile, with no more than 20% of the total held in Property. The paper shows that it is now relatively easy to use the Optimiser available in at least one spreadsheet (EXCEL) to calculate efficient portfolios for various levels of risk and return, both constrained and unconstrained, so as to be able to generate any number of Efficient Frontiers.
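The constrained efficient-frontier calculation the paper performs in a spreadsheet Optimiser can equally be sketched with a numerical optimiser: minimise portfolio variance for each target return, with weight bounds standing in for holding restrictions. The expected returns and covariance matrix below are invented for illustration, not the Investment Property Databank figures.

```python
# Sketch of tracing a (constrained) efficient frontier numerically.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.10, 0.12])             # hypothetical expected returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])          # hypothetical covariance matrix

def min_variance_weights(target_return, upper=1.0):
    """Minimise variance for a target return; `upper` caps each holding,
    mimicking a constrained EF (e.g. no more than 20% in one asset)."""
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "eq", "fun": lambda w: w @ mu - target_return}]
    res = minimize(lambda w: w @ cov @ w, np.full(n, 1.0 / n),
                   bounds=[(0.0, upper)] * n, constraints=cons)
    return res.x

# Sweep target returns to trace the frontier:
frontier = [min_variance_weights(r) for r in np.linspace(0.085, 0.115, 5)]
print(np.round(frontier[0], 3))
```

Passing, say, `upper=0.2` for a given asset and comparing the two frontiers reproduces the paper's constrained-versus-unconstrained comparison.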
Abstract:
New Mo(II) complexes with 2,2'-dipyridylamine (L1), [Mo(CH(3)CN)(eta(3)-C(3)H(5))(CO)(2)(L1)]OTf (C1a) and [{MoBr(eta(3)-C(3)H(5))(CO)(2)(L1)}(2)(4,4'-bipy)](PF(6))(2) (C1b), with {[bis(2-pyridyl)amino]carbonyl}ferrocene (L2), [MoBr(eta(3)-C(3)H(5))(CO)(2)(L2)] (C2), and with the new ligand N,N-bis(ferrocenecarbonyl)-2-aminopyridine (L3), [MoBr(eta(3)-C(3)H(5))(CO)(2)(L3)] (C3), were prepared and characterized by FTIR and (1)H and (13)C NMR spectroscopy. C1a, C1b, L3, and C2 were also structurally characterized by single crystal X-ray diffraction. The Mo(II) coordination sphere in all complexes features the facial arrangement of allyl and carbonyl ligands, with the axial isomer present in C1a and C2, and the equatorial in the binuclear C1b. In both C1a and C1b complexes, the L1 ligand is bonded to Mo(II) through the nitrogen atoms and the NH group is involved in hydrogen bonds. The X-ray single crystal structure of C2 shows that L2 is coordinated in a kappa(2)-N,N-bidentate chelating fashion. Complex C3 was characterized as [MoBr(eta(3)-C(3)H(5))(CO)(2)(L3)] with L3 acting as a kappa(2)-N,O-bidentate ligand, based on the spectroscopic data, complemented by DFT calculations. The electrochemical behavior of the monoferrocenyl and diferrocenyl ligands L2 and L3 has been studied together with that of their Mo(II) complexes C2 and C3. As much as possible, the nature of the different redox changes has been confirmed by spectrophotometric measurements. The nature of the frontier orbitals, namely the localization of the HOMO in Mo in both C2 and C3, was determined by DFT studies.
Abstract:
Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.
Abstract:
Food is fundamental to human wellbeing and development. Increased food production remains a cornerstone strategy in the effort to alleviate global food insecurity. But despite the fact that global food production over the past half century has kept ahead of demand, today around one billion people do not have enough to eat, and a further billion lack adequate nutrition. Food insecurity is facing mounting supply-side and demand-side pressures; key among these are climate change, urbanisation, globalisation, population increases, disease, as well as a number of other factors that are changing patterns of food consumption. Many of the challenges to equitable food access are concentrated in developing countries where environmental pressures including climate change, population growth and other socio-economic issues are concentrated. Together these factors impede people's access to sufficient, nutritious food; chiefly through affecting livelihoods, income and food prices. Food security and human development go hand in hand, and their outcomes are co-determined to a significant degree. The challenge of food security is multi-scalar and cross-sector in nature. Addressing it will require the work of diverse actors to bring sustained improvements in human development and to reduce pressure on the environment. Unless there is investment in future food systems that are similarly cross-level, cross-scale and cross-sector, sustained improvements in human wellbeing together with reduced environmental risks and scarcities will not be achieved. This paper reviews current thinking, and outlines these challenges. It suggests that essential elements in a successfully adaptive and proactive food system include: learning through connectivity between scales to local experience and technologies; high levels of interaction between diverse actors and sectors, ranging from primary producers to retailers and consumers; and use of frontier technologies.
Abstract:
The paper develops a more precise specification and understanding of the process of national-level knowledge accumulation and absorptive capabilities by applying the reasoning and evidence from the firm-level analysis pioneered by Cohen and Levinthal (1989, 1990). In doing so, we acknowledge that significant cross-border effects due to the role of both inward and outward FDI exist and that assimilation of foreign knowledge is not only confined to catching-up economies but is also carried out by countries at the frontier-sharing phase. We postulate a non-linear relationship between national absorptive capacity and the technological gap, due to the effects of the cumulative nature of the learning process and the increase in complexity of external knowledge as the country approaches the technological frontier. We argue that national absorptive capacity and the accumulation of knowledge stock are simultaneously determined. This implies that different phases of technological development require different strategies. During the catching-up phase, knowledge accumulation occurs predominately through the absorption of trade and/or inward FDI-related R&D spillovers. At the pre-frontier-sharing phase onwards, increases in the knowledge base occur largely through independent knowledge creation and actively accessing foreign-located technological spillovers, inter alia through outward FDI-related R&D, joint ventures and strategic alliances.
Abstract:
Construction professional services (CPSs), such as architecture, engineering, and consultancy, are not only high value-added profit centers in their own right but also have a knock-on effect on other businesses, such as construction and the export of materials and machinery. Arguably, competition in the international construction market has shifted to these knowledge-intensive CPS areas. Yet CPSs represent a research frontier that has received scant attention. This research aims to enrich the body of knowledge on CPSs by examining strengths, weaknesses, opportunities, and threats (SWOT) of Chinese CPSs (CCPSs) in the international context. It does so by triangulating theories with quantitative and qualitative data gleaned from yearbooks, annual reports, interviews, seminars, and interactions with managers in major CCPS companies. It is found that CCPSs present both strengths and weaknesses in talents, administration systems, and development strategies in dealing with the external opportunities and threats brought about by globalization and market evolution. Low price, which has helped the Chinese construction business to succeed in the international market, is also a major CCPS strength. An opportunity for CCPSs is the relatively strong delivery capability possessed by Chinese contractors; by partnering with them CCPSs can better establish themselves in the international arena. This is probably the first ever comprehensive study on the performance of CCPSs in the international marketplace. The research is conducted at an opportune time, particularly when the world is witnessing the burgeoning force of Chinese businesses in many areas including manufacturing, construction, and, potentially, professional services. It adds new insights to the knowledge body of CPSs and provides valuable references to other countries faced with the challenge of developing CPS business efficiently in the international market.
Exploring socioeconomic impacts of forest based mitigation projects: Lessons from Brazil and Bolivia
Abstract:
This paper aims to contribute new insights globally and regionally on how carbon forest mitigation contributes to sustainable development in South America. Carbon finance has emerged as a potential policy option to tackling global climate change, degradation of forests, and social development in poor countries. This paper focuses on evaluating the socioeconomic impacts of a set of forest based mitigation pilot projects that emerged under the United Nations Framework Convention on Climate Change. The paper reviews research conducted in 2001–2002, drawing from empirical data from four pilot projects, derived from qualitative stakeholder interviews, and complemented by policy documents and literature. Of the four projects studied three are located in frontier areas, where there are considerable pressures for conversion of standing forest to agriculture. In this sense, forest mitigation projects have a substantial role to play in the region. Findings suggest, however, that all four projects have experienced cumbersome implementation processes, specifically due to weak social objectives, poor communication, as well as time constraints. In three out of four cases, stakeholders highlighted limited local acceptance at the implementation stages. In the light of these findings, we discuss opportunities for implementation of future forest based mitigation projects in the land use sector.
Abstract:
We present projections of winter storm-induced insured losses in the German residential building sector for the 21st century. With this aim, two structurally independent downscaling methods and one hybrid downscaling method are applied to a 3-member ensemble of ECHAM5/MPI-OM1 A1B scenario simulations. One method uses dynamical downscaling of intense winter storm events in the global model, and a transfer function to relate regional wind speeds to losses. The second method is based on a reshuffling of present day weather situations and sequences, taking into account the change of their frequencies according to the linear temperature trends of the global runs. The third method uses statistical-dynamical downscaling, considering frequency changes of the occurrence of storm-prone weather patterns, and translation into loss by using empirical statistical distributions. The A1B scenario ensemble was downscaled by all three methods until 2070, and by the (statistical-) dynamical methods until 2100. Furthermore, all methods assume a constant statistical relationship between meteorology and insured losses and no developments other than climate change, such as in constructions or claims management. The study utilizes data provided by the German Insurance Association encompassing 24 years at district-scale resolution. Compared to 1971–2000, the downscaling methods indicate an increase of 10-year return values (i.e. loss ratios per return period) of 6–35 % for 2011–2040, of 20–30 % for 2041–2070, and of 40–55 % for 2071–2100, respectively. Convolving various sources of uncertainty in one confidence statement (data-, loss model-, storm realization-, and Pareto fit-uncertainty), the return-level confidence interval for a return period of 15 years expands by more than a factor of two. Finally, we suggest how practitioners can deal with alternative scenarios or possible natural excursions of observed losses.
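One ingredient of the confidence statement mentioned above, the Pareto-fit uncertainty of a return level, can be illustrated with a simple bootstrap: refit the GPD to resampled losses and read off the spread of the 15-year return level. The loss sample below is synthetic, not the German Insurance Association data, and the bootstrap design is an assumption for illustration.

```python
# Bootstrap sketch of sampling uncertainty in a Pareto-fitted return level.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
annual_losses = genpareto.rvs(0.2, scale=1.0, size=80, random_state=rng)

def return_level(sample, period=15):
    """Fit a GPD (location fixed at 0) and invert for the return level."""
    c, loc, scale = genpareto.fit(sample, floc=0)
    return genpareto.ppf(1 - 1 / period, c, loc=loc, scale=scale)

# Resample with replacement and refit 500 times:
boot = [return_level(rng.choice(annual_losses, size=len(annual_losses)))
        for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo, hi)
```

The width of `[lo, hi]` relative to the point estimate gives a feel for how quickly fit uncertainty inflates a return-level confidence interval, even before the climate-scenario and loss-model uncertainties are convolved in.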
Abstract:
The dissymmetrical naphthalene-bridged complexes [Cp′Fe(μ-C10H8)FeCp*] (3; Cp* = η5-C5Me5, Cp′ = η5-C5H2-1,2,4-tBu3) and [Cp′Fe(μ-C10H8)RuCp*] (4) were synthesized via a one-pot procedure from FeCl2(thf)1.5, Cp′K, KC10H8, and [Cp*FeCl(tmeda)] (tmeda = N,N,N′,N′-tetramethylethylenediamine) or [Cp*RuCl]4, respectively. The symmetrically substituted iron ruthenium complex [Cp*Fe(μ-C10H8)RuCp*] (5) bearing two Cp* ligands was prepared as a reference compound. Compounds 3−5 are diamagnetic and display similar molecular structures, where the metal atoms are coordinated to opposite sides of the bridging naphthalene molecule. Cyclic voltammetry and UV/vis spectroelectrochemistry studies revealed that neutral 3−5 can be oxidized to monocations 3+−5+ and dications 32+−52+. The chemical oxidation of 3 and 4 with [Cp2Fe]PF6 afforded the paramagnetic hexafluorophosphate salts [Cp′Fe(μ-C10H8)FeCp*]PF6 ([3]PF6) and [Cp′Fe(μ-C10H8)RuCp*]PF6 ([4]PF6), which were characterized by various spectroscopic techniques, including EPR and 57Fe Mössbauer spectroscopy. The molecular structure of [4]PF6 was determined by X-ray crystallography. DFT calculations support the structural and spectroscopic data and determine the compositions of frontier molecular orbitals in the investigated complexes. The effects of substituting Cp* with Cp′ and Fe with Ru on the electronic structures and the structural and spectroscopic properties are analyzed.
Abstract:
Exascale systems are the next frontier in high-performance computing and are expected to deliver a performance of the order of 10^18 operations per second using massive multicore processors. Very large- and extreme-scale parallel systems pose critical algorithmic challenges, especially related to concurrency, locality and the need to avoid global communication patterns. This work investigates a novel protocol for dynamic group communication that can be used to remove the global communication requirement and to reduce the communication cost in parallel formulations of iterative data mining algorithms. The protocol is used to provide a communication-efficient parallel formulation of the k-means algorithm for cluster analysis. The approach is based on a collective communication operation for dynamic groups of processes and exploits non-uniform data distributions. Non-uniform data distributions can be either found in real-world distributed applications or induced by means of multidimensional binary search trees. The analysis of the proposed dynamic group communication protocol has shown that it does not introduce significant communication overhead. The parallel clustering algorithm has also been extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
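The communication-saving principle behind parallel k-means, as discussed above, is that each process only needs to exchange small per-cluster statistics (coordinate sums and counts), never the raw data. The sketch below simulates that pattern serially with four partitions; it does not implement the paper's dynamic-group protocol or the approximation scheme, and the partitioning and initialisation are arbitrary.

```python
# Serial simulation of the reduction pattern used by parallel k-means.
import numpy as np

def local_stats(points, centroids):
    """One process's contribution: per-cluster coordinate sums and counts."""
    labels = np.argmin(((points[:, None] - centroids) ** 2).sum(-1), axis=1)
    k, d = centroids.shape
    sums = np.zeros((k, d))
    counts = np.zeros(k)
    for lbl, p in zip(labels, points):
        sums[lbl] += p
        counts[lbl] += 1
    return sums, counts

rng = np.random.default_rng(3)
data = rng.normal(size=(400, 2))
partitions = np.array_split(data, 4)      # 4 simulated processes
centroids = data[:3].copy()               # k = 3, arbitrary initialisation

for _ in range(10):
    stats = [local_stats(part, centroids) for part in partitions]
    sums = sum(s for s, _ in stats)       # this combine step is the only
    counts = sum(c for _, c in stats)     # communication a real run needs
    centroids = sums / np.maximum(counts, 1)[:, None]
print(centroids.shape)
```

In a real distributed run the two `sum(...)` lines become a collective reduction; restricting that reduction to dynamic process groups is where the paper's protocol removes the global communication requirement.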