Abstract:
The Aqua-Planet Experiment (APE) was first proposed by Neale and Hoskins (2000a) as a benchmark for atmospheric general circulation models (AGCMs) on an idealised water-covered Earth. The experiment and its aims are summarised, and its context within a modelling hierarchy used to evaluate complex models and to provide a link between realistic simulation and conceptual models of atmospheric phenomena is discussed. The simplified aqua-planet configuration bridges a gap in the existing hierarchy. It is designed to expose differences between models and to focus attention on particular phenomena and their response to changes in the underlying distribution of sea surface temperature.
Abstract:
Asset allocation is concerned with the development of multi-asset portfolio strategies that are likely to meet an investor’s objectives based on the interaction of expected returns, risk, correlation and implementation from a range of distinct asset classes or beta sources. Challenges associated with the discipline are often particularly significant in private markets. Specifically, composition differences between the ‘index’ or ‘benchmark’ universe and the investible universe mean that there can often be substantial and meaningful deviations between the investment characteristics implied in asset allocation decisions and those delivered by investment teams. For example, while allocation decisions are often based on relatively low-risk diversified real estate ‘equity’ exposure, implementation decisions frequently include exposure to higher-risk forms of the asset class as well as investments in debt-based instruments. These differences can have a meaningful impact on the contribution of the asset class to the overall portfolio and, therefore, lead to a potential misalignment between asset allocation decisions and implementation. Despite this, the key conclusion from this paper is not that real estate investors should become slaves to a narrowly defined mandate based on IPD/NCREIF or other forms of benchmark replication. The discussion suggests that such an approach would likely lead to the underutilization of real estate in multi-asset portfolio strategies. Instead, it is that to achieve asset allocation alignment, real estate exposure should be divided into multiple pools representing distinct forms of the asset class. In addition, the paper suggests that associated investment guidelines and processes should be collaborative and reflect the portfolio-wide asset allocation objectives of each pool. Further, where appropriate they should specifically target potential for ‘additional’ beta or, more marginally, ‘alpha’.
Abstract:
We diagnose forcing and climate feedbacks in benchmark sensitivity experiments with the new Met Office Hadley Centre Earth system climate model HadGEM2-ES. To identify the impact of newly-included biogeophysical and chemical processes, results are compared to a parallel set of experiments performed with these processes switched off, and different couplings with the biogeochemistry. In abrupt carbon dioxide quadrupling experiments we find that the inclusion of these processes does not alter the global climate sensitivity of the model. However, when the change in carbon dioxide is uncoupled from the vegetation, or when the model is forced with a non-carbon dioxide forcing – an increase in solar constant – new feedbacks emerge that make the climate system less sensitive to external perturbations. We identify a strong negative dust-vegetation feedback on climate change that is small in standard carbon dioxide sensitivity experiments due to the physiological/fertilization effects of carbon dioxide on plants in this model.
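For context, the diagnosis of forcing and feedbacks in such experiments is commonly framed with the standard linear forcing–feedback relation (a generic statement of the approach, with assumed notation; not a formula quoted from this paper):

```latex
% Linear forcing-feedback framework (Gregory-style regression):
% N       : top-of-atmosphere net radiative imbalance
% F       : effective radiative forcing (e.g. abrupt 4xCO2 or a solar-constant increase)
% \lambda : net climate feedback parameter (negative for a stable climate)
% \Delta T: global-mean surface temperature change
\begin{equation}
  N = F + \lambda\, \Delta T,
  \qquad
  \Delta T_{\mathrm{eq}} = -\frac{F}{\lambda}
  \quad \text{(equilibrium response, reached when } N \to 0\text{)}.
\end{equation}
```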
Abstract:
Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large scale flood preparedness. Such global hazard maps can be generated using large scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium Range Weather Forecasts (ECMWF) land surface model is coupled to ERA-Interim reanalysis meteorological forcing data, and resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and estimate flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably to a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
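To illustrate the return-period step of this methodology, below is a minimal sketch of fitting a Gumbel distribution to annual-maximum flows and reading off return levels with SciPy. The discharge values are made up for illustration; in the study they come from the routed 25×25 km flows over 1979–2010.

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical annual-maximum discharge series (m^3/s) for one grid cell.
annual_maxima = np.array([1250., 980., 1430., 1100., 1680., 1320., 905.,
                          1510., 1210., 1390., 1045., 1275., 1600., 1150.])

# Fit a Gumbel (extreme value type I) distribution to the annual maxima.
loc, scale = gumbel_r.fit(annual_maxima)

# Return level for return period T: the flow exceeded on average once every T years,
# i.e. the (1 - 1/T) quantile of the fitted distribution.
for T in (2, 5, 25, 100, 500):
    level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3}-yr return level: {level:7.1f} m^3/s")
```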
Abstract:
This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCM) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models and others are more novel, and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization package to an idealization of the planet Earth which has a greatly simplified lower boundary that consists of an ocean only. It has no land or associated orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SST) which are specified everywhere with simple, idealized distributions. Thus, in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings, such as those proposed by Held and Suarez (1994) and Boer and Denis (1997), and Earth-like simulations of the Atmospheric Modeling Intercomparison Project (AMIP, Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the cause of inter-model differences. APE is sponsored by the World Meteorological Organization (WMO) Commission on Atmospheric Science (CAS) and World Climate Research Program (WCRP) joint Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each. Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada in the AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations but does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations. One (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE database holds a wealth of data that is now publicly available from the APE website: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.
Abstract:
This article summarizes the main research findings from the first of a series of annual surveys conducted for the British Council of Shopping Centres. The study examines the changing pattern of retailing in the United Kingdom and provides an overview of key research from previous studies in both the U.K. and the United States. The main findings are then presented, including an examination of the impact of e-commerce on sales and rental values and on the future space and ownership/leasing requirements of U.K. retailers for 2000–2005. The impact on a shopping center in a case study town in the U.K. is also considered. The difficulties of isolating the impact of e-commerce from other forces for change in retailing are highlighted. In contrast to other viewpoints, the results show that e-commerce will not mean the death of conventional store-based U.K. retailing, although further benchmark research is needed.
Abstract:
We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a “random” model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). SDBM reproduces observed CO2 seasonal cycles, but its simulated net primary production (NPP) is too high compared with independent measurements. The two DGVMs show little difference for most benchmarks (including the interannual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.
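To make the comparison against mean-value and bootstrap "random" baselines concrete, here is a minimal sketch assuming a normalised-mean-error style metric and synthetic data; the metric definitions used in the benchmark system itself differ in detail.

```python
import numpy as np

def nme(model, obs):
    """Normalised mean error: mean absolute error divided by the mean absolute
    deviation of the observations (0 is perfect; 1 matches the score of always
    predicting the observed mean)."""
    return np.mean(np.abs(model - obs)) / np.mean(np.abs(obs - obs.mean()))

def random_model_score(obs, n_boot=1000, seed=0):
    """Benchmark score of a 'random' model: the observations resampled with
    replacement (bootstrap), averaged over many resamples."""
    rng = np.random.default_rng(seed)
    scores = [nme(rng.choice(obs, size=obs.size, replace=True), obs)
              for _ in range(n_boot)]
    return float(np.mean(scores))

# Hypothetical gridded annual-mean values flattened to 1-D arrays.
obs = np.array([2.1, 3.4, 0.9, 4.2, 2.8, 1.5])
sim = np.array([2.4, 3.0, 1.2, 3.8, 2.5, 1.9])
print("model NME :", nme(sim, obs))
print("mean  NME :", nme(np.full_like(obs, obs.mean()), obs))  # equals 1 by construction
print("random NME:", random_model_score(obs))
```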
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases when they provide them, but that more often than not they provide none.
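As a concrete illustration of the two best-performing scorers, a minimal Term Frequency / Inverse Document Frequency ranking over a toy corpus might look as follows; this is a generic sketch, not the implementations evaluated in the paper.

```python
import math
from collections import Counter

# Toy corpus: each document is a list of candidate keyphrases (already extracted).
corpus = [
    ["neural network", "benchmark", "term frequency"],
    ["benchmark", "keyphrase extraction", "reuters"],
    ["term frequency", "inverse document frequency", "benchmark"],
]

def tf_scores(doc):
    """Term Frequency: rank candidates by how often they occur in the document."""
    counts = Counter(doc)
    return {t: counts[t] / len(doc) for t in counts}

def idf_scores(corpus):
    """Inverse Document Frequency: candidates rare across the corpus score higher."""
    n_docs = len(corpus)
    df = Counter(t for doc in corpus for t in set(doc))
    return {t: math.log(n_docs / df[t]) for t in df}

doc = corpus[0]
tf = tf_scores(doc)
idf = idf_scores(corpus)
ranked = sorted(doc, key=lambda t: tf[t] * idf.get(t, 0.0), reverse=True)
print(ranked)  # candidates ordered by a combined TF*IDF score
```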
Abstract:
Background: Since their inception, Twitter and related microblogging systems have provided a rich source of information for researchers and have attracted interest in their affordances and use. Since 2009 PubMed has included 123 journal articles on medicine and Twitter, but no overview exists as to how the field uses Twitter in research. // Objective: This paper aims to identify published work relating to Twitter indexed by PubMed, and then to classify it. This classification will provide a framework in which future researchers will be able to position their work, and to provide an understanding of the current reach of research using Twitter in medical disciplines. Limiting the study to papers indexed by PubMed ensures the work provides a reproducible benchmark. // Methods: Papers indexed by PubMed on Twitter and related topics were identified and reviewed. The papers were then qualitatively classified based on the paper’s title and abstract to determine their focus. The work that was Twitter-focused was studied in detail to determine what data, if any, it was based on, and from this a categorization of the data set sizes used in the studies was developed. Using open-coded content analysis, additional important categories were also identified, relating to the primary methodology, domain and aspect. // Results: As of 2012, PubMed comprises more than 21 million citations from biomedical literature, and from these a corpus of 134 potentially Twitter-related papers was identified, eleven of which were subsequently found not to be relevant. There were no papers prior to 2009 relating to microblogging, a term first used in 2006. Of the remaining 123 papers which mentioned Twitter, thirty were focused on Twitter (the others referring to it tangentially). The early Twitter-focused papers introduced the topic and highlighted its potential, without carrying out any form of data analysis. The majority of published papers used analytic techniques to sort through thousands, if not millions, of individual tweets, often depending on automated tools to do so. Our analysis demonstrates that researchers are starting to use knowledge discovery methods and data mining techniques to understand vast quantities of tweets: the study of Twitter is becoming quantitative research. // Conclusions: This work is, to the best of our knowledge, the first overview study of medicine-related research based on Twitter and related microblogging. We have used five dimensions to categorise published medicine-related research on Twitter. This classification provides a framework within which researchers studying the development and use of Twitter within medicine-related research, and those undertaking comparative studies of research relating to Twitter in the area of medicine and beyond, can position and ground their work.
Abstract:
Organisations need the right business and IT capabilities in order to achieve future business success. It follows that the sourcing of these capabilities is an important decision. Yet, there is a lack of consensus on the approach to deciding where and how to source the core operational capabilities. Furthermore, developing its dynamic capability enables an organisation to effectively manage change in its operational capabilities. Recent research has proposed that analysing an organisation's business capabilities is a key pre-requisite to defining its Information Technology (IT) solutions. This research builds on these findings by considering the interdependencies between the dynamic business change capability and the sourcing of IT capabilities. Further, it examines the decision-making oversight of these areas as implemented through IT governance. There is a good understanding of the direct impact of IT sourcing decisions on operational capabilities. However, there is a lack of research on the indirect impact on the capability of managing business change. Through a review of prior research and initial pilot field research, a capability framework and three main propositions are proposed, each examining a two-way interdependency. This paper describes the development of the integrated capability framework and the rationale for the propositions. These respectively cover managing business change, IT sourcing and IT governance. Firstly, the sourcing of IT affects both the operational capabilities and the capability to manage business change. Similarly, a business change may result in new or revised operational capabilities, which can influence the IT sourcing decision, resulting in a two-way relationship. Secondly, this IT sourcing is directed under IT governance, which provides a decision-making framework for the organisation. At the same time, the IT sourcing can have an impact on the IT governance capability, for example by outsourcing key capabilities; hence this is potentially again a two-way relationship. Finally, there is a postulated two-way relationship between IT governance and managing business change, in that IT governance provides an oversight of managing business change through portfolio management, while IT governance is a key element of the business change capability. Given the nature and novelty of this framework, a philosophical paradigm of constructivism is preferred. To illustrate and explore the theoretical perspectives provided, this paper reports on the findings of a case study incorporating eight high-level interviews with senior executives in a German bank with 2300 employees. The collected data also include organisational charts, annual reports, project and activity portfolios, and benchmark reports for the IT budget. Recommendations are made for practitioners. An understanding of the interdependencies can support professionals in improving business success through effectively managing business change. Additionally, they can be assisted to evaluate the impact of IT sourcing decisions on the organisation’s operational and dynamic capabilities, using an appropriate IT governance framework.
Abstract:
In this paper, dual-hop amplify-and-forward (AF) cooperative systems in the presence of in-phase and quadrature-phase (I/Q) imbalance, which refers to the mismatch between components in the I and Q branches, are investigated. First, we analyze the performance of the considered AF cooperative protocol without compensation for I/Q imbalance as the benchmark. Furthermore, a compensation algorithm for I/Q imbalance is proposed, which makes use of the received signals at the destination, from the source and relay nodes, together with their conjugates to detect the transmitted signal. The performance of the AF cooperative system under study is evaluated in terms of average symbol error probability (SEP), which is derived considering transmission over Rayleigh fading channels. Numerical results are provided and show that the proposed compensation algorithm can efficiently mitigate the effect of I/Q imbalance.
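For reference, a standard baseband model of I/Q imbalance (generic notation assumed here, not taken from the paper) expresses the impaired signal as a mixture of the ideal signal and its complex conjugate, which is why conjugate terms appear in the compensation:

```latex
% Standard baseband model of I/Q imbalance with gain mismatch g and phase mismatch \phi:
% the impaired signal r(t) mixes the ideal signal s(t) with its complex conjugate.
\begin{align}
  r(t) &= \mu\, s(t) + \nu\, s^{*}(t), \\
  \mu  &= \tfrac{1}{2}\bigl(1 + g\, e^{-j\phi}\bigr), \qquad
  \nu   = \tfrac{1}{2}\bigl(1 - g\, e^{j\phi}\bigr).
\end{align}
% g = 1, \phi = 0 recovers the ideal case (\mu = 1, \nu = 0); the conjugate term
% \nu s^{*}(t) is the distortion introduced by the I/Q mismatch.
```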
Abstract:
We present a Galerkin method with piecewise polynomial continuous elements for fully nonlinear elliptic equations. A key tool is the discretization proposed in Lakkis and Pryer (2011), allowing us to work directly on the strong form of a linear PDE. An added benefit to making use of this discretization method is that a recovered (finite element) Hessian is a byproduct of the solution process. We build on the linear method and ultimately construct two different methodologies for the solution of second order fully nonlinear PDEs. Benchmark numerical results illustrate the convergence properties of the scheme for some test problems as well as the Monge–Ampère equation and the Pucci equation.
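As a concrete instance of the fully nonlinear problems targeted, the Monge–Ampère Dirichlet problem (stated here in its standard form, not quoted from the paper) reads:

```latex
% Monge-Ampere Dirichlet problem: a prototypical fully nonlinear elliptic PDE.
% Ellipticity requires restricting to convex solutions u and data f > 0.
\begin{align}
  \det\bigl(D^{2} u(x)\bigr) &= f(x), & x &\in \Omega, \\
  u(x) &= g(x), & x &\in \partial\Omega.
\end{align}
% A recovered finite element Hessian supplies a discrete counterpart of D^2 u
% for use inside the nonlinear solver.
```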
Abstract:
In this paper, a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained, comprising a considerably smaller number of parameters compared to those generated by the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparisons with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
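The regressor-selection step can be sketched as a greedy forward search that keeps the terms giving the largest reduction in residual variance; this is a simplified stand-in for the OLS/error-reduction-ratio procedure, using hypothetical data, and is not the authors' implementation.

```python
import numpy as np

def greedy_term_selection(X, y, n_terms):
    """Greedily pick columns of X that most reduce the residual sum of squares
    of a least-squares fit to y (a simplified analogue of OLS term selection)."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(n_terms):
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            beta, rss, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss_val = rss[0] if rss.size else np.sum((y - X[:, cols] @ beta) ** 2)
            if rss_val < best_rss:
                best_j, best_rss = j, rss_val
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Example: pick the 2 most significant terms out of 5 candidate regressors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 1] - 2.0 * X[:, 4] + 0.1 * rng.normal(size=200)
print(greedy_term_selection(X, y, n_terms=2))  # expected: [1, 4]
```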
Abstract:
Background: Staphylococcus aureus is a major cause of healthcare-associated mortality, but like many important bacterial pathogens, it is a common constituent of the normal human body flora. Around a third of healthy adults are carriers. Recent evidence suggests that evolution of S. aureus during nasal carriage may be associated with progression to invasive disease. However, a more detailed understanding of within-host evolution under natural conditions is required to appreciate the evolutionary and mechanistic reasons why commensal bacteria such as S. aureus cause disease. Therefore we examined in detail the evolutionary dynamics of normal, asymptomatic carriage. Sequencing a total of 131 genomes across 13 singly colonized hosts using the Illumina platform, we investigated diversity, selection, population dynamics and transmission during the short-term evolution of S. aureus. // Principal Findings: We characterized the processes by which the raw material for evolution is generated: micro-mutation (point mutation and small insertions/deletions), macro-mutation (large insertions/deletions) and the loss or acquisition of mobile elements (plasmids and bacteriophages). Through an analysis of synonymous, non-synonymous and intergenic mutations we discovered a fitness landscape dominated by purifying selection, with rare examples of adaptive change in genes encoding surface-anchored proteins and an enterotoxin. We found evidence for dramatic, hundred-fold fluctuations in the size of the within-host population over time, which we related to the cycle of colonization and clearance. Using a newly developed population genetics approach to detect recent transmission among hosts, we revealed evidence for recent transmission between some of our subjects, including a husband and wife both carrying populations of methicillin-resistant S. aureus (MRSA). // Significance: This investigation begins to paint a picture of the within-host evolution of an important bacterial pathogen during its prevailing natural state, asymptomatic carriage. These results also have wider significance as a benchmark for future systematic studies of evolution during invasive S. aureus disease.
Abstract:
The primary role of land surface models embedded in climate models is to partition the available energy at the surface into upward radiative, sensible and latent heat fluxes. Partitioning of evapotranspiration, ET, is of fundamental importance: as a major component of the total surface latent heat flux, ET affects the simulated surface water balance, the related energy balance, and consequently the feedbacks with the atmosphere. In this context it is also crucial to credibly represent the CO2 exchange between ecosystems and their environment. In this study, JULES, the land surface model used in UK weather and climate models, has been evaluated for temperate Europe. Compared to eddy covariance flux measurements, the CO2 uptake by the ecosystem is underestimated and the ET overestimated. In addition, the contribution to ET from soil and intercepted water evaporation far outweighs the contribution of plant transpiration. To alleviate these biases, adaptations have been implemented in JULES, based on key literature references. These adaptations have improved the simulation of the spatio-temporal variability of the fluxes and the accuracy of the simulated gross primary productivity (GPP) and ET, including its partitioning. This resulted in a shift of the seasonal soil moisture cycle. These adaptations are expected to increase the fidelity of climate simulations over Europe. Finally, the extreme summer of 2003 was used as an evaluation benchmark for the use of the model in climate change studies. The improved model captures the impact of the 2003 drought on the carbon assimilation and the water use efficiency of the plants. It does, however, underestimate the 2003 GPP anomalies. The simulations showed that a reduction of evaporation from the interception and soil reservoirs, albeit not of transpiration, largely explained the good correlation between the carbon and water flux anomalies that was observed during 2003. This demonstrates the importance of being able to discriminate the response of individual components of the ET flux to environmental forcing.