932 results for Most Productive Scale Size


Relevance: 30.00%

Abstract:

We compare rain event size distributions derived from measurements in climatically different regions, which we find to be well approximated by power laws of similar exponents over broad ranges. Differences can be seen in the large-scale cutoffs of the distributions. Event duration distributions suggest that the scale-free aspects are related to the absence of characteristic scales in the meteorological mesoscale.
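
A minimal sketch of how a power-law exponent can be estimated from event sizes, using the standard continuous-case maximum-likelihood estimator (as in Clauset, Shalizi and Newman); the synthetic sample, the exponent and x_min are illustrative assumptions, not the paper's data.

```python
import numpy as np

def powerlaw_mle(sizes, x_min):
    """Continuous-case MLE for p(x) ~ x^(-alpha), x >= x_min:
    alpha = 1 + n / sum(ln(x_i / x_min)), with standard error
    (alpha - 1) / sqrt(n)."""
    tail = sizes[sizes >= x_min]
    n = tail.size
    alpha = 1.0 + n / np.sum(np.log(tail / x_min))
    return alpha, (alpha - 1.0) / np.sqrt(n)

# Illustrative synthetic event sizes drawn from a pure power law
# via inverse-CDF sampling (not the paper's data).
rng = np.random.default_rng(0)
x_min, alpha_true = 1.0, 1.4
u = rng.random(10_000)
sizes = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat, se = powerlaw_mle(sizes, x_min)
print(f"estimated alpha = {alpha_hat:.3f} +/- {se:.3f}")
```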

Relevance: 30.00%

Abstract:

This paper presents a preface to this Special Issue on the results of the QUEST-GSI (Global Scale Impacts) project on climate change impacts on catchment-scale water resources. A detailed description of the unified methodology, subsequently used in all studies in this issue, is provided. The project method involved running simulations of catchment-scale hydrology with a unified set of past and future climate scenarios, including "policy-relevant" prescribed warming scenarios, to enable a consistent analysis of climate impacts around the globe. This is followed by a synthesis of the key findings. Overall, the studies indicate that in most basins the models project substantial changes to river flow, beyond those observed in the historical record, but that in many cases there is considerable uncertainty in the magnitude and even the sign of the projected changes. The implications of this for adaptation activities are discussed.

Relevance: 30.00%

Abstract:

The Geostationary Earth Radiation Budget Intercomparison of Longwave and Shortwave radiation (GERBILS) was an observational field experiment over North Africa during June 2007. The campaign involved 10 flights by the FAAM BAe-146 research aircraft over southwestern parts of the Sahara Desert and coastal stretches of the Atlantic Ocean. Objectives of the GERBILS campaign included characterisation of the geographic distribution and the physical and optical properties of mineral dust, assessment of its impact upon radiation, validation of satellite remote sensing retrievals, and validation of numerical weather prediction model forecasts of aerosol optical depths (AODs) and size distributions. We describe the motivation behind GERBILS and the experimental design, and report the progress made towards each of the objectives. We show that mineral dust in the region is relatively non-absorbing (mean single scattering albedo at 550 nm of 0.97) owing to the relatively small fraction of iron oxides present (1–3%), and that detailed spectral radiances are most accurately modelled using irregularly shaped particles. Satellite retrievals over bright desert surfaces are challenging owing to the lack of spectral contrast between the dust and the underlying surface. However, new techniques have been developed which are shown to be in relatively good agreement with AERONET estimates of AOD and with each other. This encouraging result enables relatively robust validation of numerical models which treat the production, transport, and deposition of mineral dust. The dust models themselves are able to represent large-scale synoptically driven dust events to a reasonable degree, but some deficiencies remain both in the Sahara and over the Sahelian region, where cold pool outflow from convective cells associated with the intertropical convergence zone can lead to significant dust production.

Relevance: 30.00%

Abstract:

The reduction of portfolio risk is important to all investors, but particularly to real estate investors, as property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of under-performance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk is diminished with increasing portfolio size. Increasing portfolio size, however, has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
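
As a rough illustration of the kind of Monte Carlo experiment described, the sketch below draws equally weighted random portfolios of increasing size from a synthetic universe of property returns and estimates the chances of trailing or beating the universe average by a margin. The normal return distribution, the 2% margin and the trial counts are assumptions for illustration only, not the study's data or method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a universe of individual property returns
# (the study itself used 1,728 IPD properties over 1995-2004).
universe = rng.normal(loc=0.10, scale=0.12, size=1728)
benchmark = universe.mean()

def simulate(portfolio_size, n_trials=5000):
    """Returns of equally weighted random portfolios of a given size."""
    returns = np.empty(n_trials)
    for t in range(n_trials):
        picks = rng.choice(universe, size=portfolio_size, replace=False)
        returns[t] = picks.mean()
    return returns

for n in (5, 20, 100, 500):
    r = simulate(n)
    downside = np.mean(r < benchmark - 0.02)   # P(trail benchmark by >2%)
    upside = np.mean(r > benchmark + 0.02)     # P(beat benchmark by >2%)
    print(f"n={n:4d}  sd={r.std():.4f}  "
          f"P(under)={downside:.3f}  P(over)={upside:.3f}")
```

Both tails shrink as n grows, which is exactly the trade-off the abstract describes: less down-side risk, but less up-side potential.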

Relevance: 30.00%

Abstract:

Property portfolio diversification takes many forms, most of which can be associated with asset size. In other words, larger property portfolios are assumed to have greater diversification potential than small portfolios. In addition, since greater diversification is generally associated with lower risk, it is assumed that larger property portfolios will also show reduced return variability compared with smaller portfolios. If large property portfolios can simply be regarded as scaled-up, better-diversified versions of small property portfolios, then the greater a portfolio's asset size, the lower its risk, suggesting a negative relationship between asset size and risk. However, if large property portfolios are not simply scaled-up versions of small portfolios, the relationship between asset size and risk may be unclear. For instance, if large portfolios hold riskier assets or pursue more volatile investment strategies, a positive relationship between asset size and risk might be observed even if large property portfolios are more diversified. This paper tests the empirical relationship between property portfolio size, diversification and risk in institutional portfolios in the UK over the period from 1989 to 1999, to determine which of these two characterisations is more appropriate.

Relevance: 30.00%

Abstract:

Using the virtual porous carbon model proposed by Harris et al., we study the effect of carbon surface oxidation on the pore size distribution (PSD) curve determined from simulated Ar, N2 and CO2 isotherms. It is assumed that surface oxidation is not destructive to the carbon skeleton and that all pores remain accessible to the studied molecules (i.e., only the effect of the change in surface chemical composition is studied). The results show two important things: oxidation of the carbon surface changes the absolute porosity (calculated using the geometric method of Bhattacharya and Gubbins (BG)) only very slightly, whereas PSD curves calculated from simulated isotherms are affected to a greater or lesser extent by the presence of surface oxides. The most reliable results are obtained from Ar adsorption data. Not only is adsorption of this adsorbate practically independent of the presence of surface oxides but, more importantly, for this molecule one can apply the slit-like pore model as a first approximation to recover the average pore diameter of a real carbon structure. For nitrogen, an effect of the carbon surface chemical composition is observed due to the quadrupole moment of this molecule, which shifts the PSD curves relative to those from Ar. The largest differences are seen for CO2: it is clearly demonstrated that PSD curves obtained from adsorption isotherms of this molecule contain artificial peaks, and that the average pore diameter is strongly influenced by the presence of electrostatic adsorbate–adsorbate as well as adsorbate–adsorbent interactions.
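
Determining a PSD from an isotherm amounts to inverting the adsorption integral equation: the measured isotherm is modelled as a weighted sum of single-pore-width kernel isotherms, and the weights form the PSD. The sketch below shows that inversion with a toy Langmuir-style kernel; in practice the kernels come from GCMC or DFT simulations per pore width, so every function and number here is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import nnls

pressures = np.linspace(0.01, 1.0, 60)      # relative pressure grid
widths = np.linspace(0.4, 3.0, 40)          # candidate pore widths (nm)

def kernel(p, w):
    """Toy single-pore isotherm: narrower pores fill at lower pressure.
    Real analyses use GCMC/DFT kernel isotherms computed per pore width."""
    b = np.exp(3.0 / w)                     # stronger affinity in narrow pores
    return b * p / (1.0 + b * p)

# Kernel matrix: column j is the isotherm for pore width widths[j].
K = np.column_stack([kernel(pressures, w) for w in widths])

# Synthetic "measured" isotherm from a known bimodal PSD, plus noise.
rng = np.random.default_rng(2)
true_psd = (np.exp(-((widths - 0.8) / 0.15) ** 2)
            + 0.5 * np.exp(-((widths - 2.0) / 0.3) ** 2))
isotherm = K @ true_psd + rng.normal(0.0, 0.01, pressures.size)

# Recover the PSD as the non-negative least-squares solution.
psd, _ = nnls(K, isotherm)
print("dominant recovered pore width:", widths[np.argmax(psd)], "nm")
```

The sensitivity of this inversion to the assumed kernel is precisely why surface oxides, which change the real fluid-wall interaction but not the assumed kernel, distort the recovered PSD.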

Relevance: 30.00%

Abstract:

Research on arable sandy loam and silty clay loam soils on 4° slopes in England has shown that tramlines (i.e. the unseeded wheeling areas used to facilitate spraying operations in cereal crops) can represent the most important pathway for phosphorus and sediment loss from moderately sloping fields. Detailed monitoring over the October–March period in winters 2005–2006 and 2006–2007 included event-based sampling of surface runoff, suspended and particulate sediment, and dissolved and particulate phosphorus from hillslope segments (each ∼300–800 m²) established in a randomized block design with four replicates of each treatment at each of two sites on lighter and heavier soils. Experimental treatments assessed losses from the cropped area without tramlines and from the uncropped tramline area, and were compared to losses from tramlines which had been disrupted once in the autumn with a shallow tine. On the lighter soil, the effect of removal or shallow incorporation of straw residues was also determined. Research on both sandy and silty clay loam soils across two winters showed that tramline wheelings represented the dominant pathway for surface runoff and transport of sediment, phosphorus and nitrogen from cereal crops on moderate slopes. Results indicated 5.5–15.8% of rainfall lost as runoff, and losses of 0.8–2.9 kg TP ha−1 and 0.3–4.8 t ha−1 sediment in tramline treatments, compared to only 0.2–1.7% of rainfall lost as runoff, and losses of 0.0–0.2 kg TP ha−1 and 0.003–0.3 t ha−1 sediment from treatments without tramlines or those where tramlines had been disrupted. The novel shallow disruption of tramline wheelings using a tine once following the autumn spray operation consistently and dramatically reduced (p < 0.001) surface runoff and loads of sediment, total nitrogen and total phosphorus to levels similar to those measured in cropped areas between tramlines. Results suggest that options for managing tramline wheelings warrant further refinement and evaluation with a view to incorporating them into spatially targeted farm-level management planning using national or catchment-based agri-environment policy instruments aimed at reducing diffuse pollution from land to surface water systems.

Relevance: 30.00%

Abstract:

The aim of this article was to determine which aspects of Huntington's disease (HD) are most important with regard to the health-related quality of life (HrQOL) of patients with this neurodegenerative disease. Seventy patients with HD participated in the study. Assessment comprised the Unified Huntington's Disease Rating Scale (UHDRS) motor, cognitive and functional capacity sections, and the Beck Depression Inventory. Mental and physical HrQOL were assessed using the summary scores of the SF-36. Multiple regression analyses showed that functional capacity and depressive mood were significantly associated with HrQOL, in that greater impairments in HrQOL were associated with higher levels of depressive mood and lower functional capacity. Motor symptoms and cognitive function were not found to be as closely linked with HrQOL. It can therefore be concluded that depressive mood and greater functional incapacity are key factors in HrQOL for people with HD; further longitudinal investigation will be useful to determine their utility as specific targets in intervention studies aimed at improving patient HrQOL, or whether other mediating variables are involved. As these two factors had a similar association with the mental and physical summary scores of the SF-36, this generic measure did not adequately capture and distinguish the true mental and physical components of HrQOL in HD.

Relevance: 30.00%

Abstract:

Housebuilding is frequently viewed as an industry full of small firms. However, large firms exist in many countries. Here, a comparative analysis is made of the housebuilding industries in Australia, Britain and the USA. Housebuilding output is found to be much higher in Australia and the USA than in Britain when measured on a per capita basis. At the same time, the degree of market concentration in Australia and the USA is relatively low, but in Britain it is far greater, with a few firms having quite substantial market shares. Investigation of the size distribution of the top 100 or so firms ranked by output also shows that the decline in firm size from the largest downwards is more rapid in Britain than elsewhere. The exceptionalism of the British case is put down to two principal reasons. First, the close proximity of Britain's regions enables housebuilders to diversify successfully across different markets. The gains from such diversification are best achieved by large firms, because they can gain scale benefits in any particular market segment. Second, land shortages induced by a restrictive planning system encourage firms to take each other over as a quick and beneficial means of acquiring land. The institutional rules of planning also make it difficult for new entrants to come in at the bottom end of the size hierarchy. In this way, concentration grows and a handful of large producers emerge. These conditions do not hold in the other two countries, so their industries are less concentrated. Given the degree of rivalry between firms over land purchases and takeovers, it is difficult to envisage them behaving in a long-term collusive manner, so competition in British housebuilding is probably not unduly compromised by the exceptional degree of firm concentration. Reforms to lower the restrictions, improve the slow responsiveness and reduce the uncertainties associated with the British planning system's role in housing supply would be likely to greatly improve the ability of new firms to enter housebuilding, and of all firms to increase output in response to rising housing demand. Such reforms would also probably lower overall housebuilding firm concentration over time.

Relevance: 30.00%

Abstract:

Climate-G is a large-scale distributed testbed devoted to climate change research. It is an unfunded effort started in 2008 involving a wide community in both Europe and the US. The testbed is an interdisciplinary effort involving partners from several institutions, joining expertise in the fields of climate change and computational science. Its main goal is to allow scientists to carry out geographical and cross-institutional discovery, access, analysis, visualization and sharing of climate data. It represents an attempt to address, in a real environment, challenging data and metadata management issues. This paper presents a complete overview of the Climate-G testbed, highlighting the most important results achieved since the beginning of the project.

Relevance: 30.00%

Abstract:

Executive summary

Nature of the problem
• Environmental problems related to nitrogen concern all economic sectors and impact all media: atmosphere, pedosphere, hydrosphere and anthroposphere.
• Therefore, the integration of fluxes allows an overall coverage of problems related to reactive nitrogen (Nr) in the environment, which is not accessible from sectoral approaches or by focusing on specific media.

Approaches
• This chapter presents a set of high-resolution maps showing key elements of the N flux budget across Europe, including N2 and Nr fluxes.
• Comparative nitrogen budgets are also presented for a range of European countries, highlighting the most efficient strategies for mitigating Nr problems at a national scale. A new European Nitrogen Budget (EU-27) is presented on the basis of state-of-the-art Europe-wide models and databases focusing on different segments of Europe's society.

Key findings
• Of the c. 18 Tg Nr yr−1 input to agriculture in the EU-27, only about 7 Tg Nr yr−1 find their way to the consumer or are further processed by industry.
• Some 3.7 Tg Nr yr−1 is released by the burning of fossil fuels in the EU-27, whereby the contribution of the industry and energy sectors is equal to that of the transport sector. More than 8 Tg Nr yr−1 are disposed of to the hydrosphere, while the EU-27 is a net exporter of reactive nitrogen through atmospheric transport of c. 2.3 Tg Nr yr−1.
• The largest single sink for Nr appears to be denitrification to N2 in European coastal shelf regions (potentially as large as the input of mineral fertilizer, about 11 Tg N yr−1 for the EU-27); however, this sink is also the most uncertain, because of the uncertainty of Nr import from the open ocean.

Major uncertainties
• National nitrogen budgets are difficult to compile from a large range of data sources and are currently available only for a limited number of countries.
• Modelling approaches have been used to fill in the data gaps in some of these budgets, but it became obvious during this study that further research is needed in order to collect the necessary data and make national nitrogen budgets inter-comparable across Europe.
• In some countries, due to inconsistent or contradictory information coming from different data sources, closure of the nitrogen budget was not possible.

Recommendations
• The large variety of problems associated with the excess of Nr in the European environment, including adverse impacts, requires an integrated nitrogen management approach that would allow for the creation and closure of N budgets within European environments.
• Development of nationwide nitrogen budgets, and their assessment and management, could become an effective tool to prioritize measures and prevent unwanted side effects.
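
To make the arithmetic in the key findings explicit, here is a small tabulation of the quoted approximate EU-27 fluxes; this is a hypothetical illustration only ("more than 8" is entered as 8.0).

```python
# Approximate EU-27 reactive-nitrogen fluxes quoted above (Tg Nr per year).
fluxes = {
    "input to agriculture":           18.0,
    "reaching consumers/industry":     7.0,
    "fossil-fuel combustion release":  3.7,
    "disposal to the hydrosphere":     8.0,   # stated as "more than 8"
    "net atmospheric export":          2.3,
    "coastal-shelf denitrification":  11.0,   # the most uncertain sink
}

# Share of the agricultural Nr input that reaches consumers or industry:
share = fluxes["reaching consumers/industry"] / fluxes["input to agriculture"]
print(f"~{share:.0%} of agricultural Nr input reaches consumers")  # ~39%
```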

Relevance: 30.00%

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited to distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or by high latency in communication paths. The lack of scalable and fault-tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.
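
The EpidemicK-Means algorithm itself is not reproduced here. The sketch below only illustrates the epidemic (gossip) averaging primitive on which fully distributed, fault-tolerant aggregation of this kind is typically built: every node repeatedly averages a local statistic (such as a centroid coordinate) with a randomly chosen peer, and all local values converge to the global mean with no global communication or synchronisation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each node holds a local statistic, e.g. one coordinate of a
# local centroid estimate computed from its own data.
n_nodes = 50
local = rng.normal(0.0, 1.0, size=n_nodes)
global_mean = local.mean()

# Push-pull gossip: each node pairs with a random peer and both
# adopt the pairwise average. Averaging preserves the total sum,
# so every value converges to the global mean.
for _ in range(30):
    for i in rng.permutation(n_nodes):
        j = rng.integers(n_nodes)
        local[i] = local[j] = 0.5 * (local[i] + local[j])

print("max deviation from global mean:",
      np.abs(local - global_mean).max())
```

Because each exchange involves only two nodes, a lost message or failed node degrades accuracy gracefully instead of blocking a global synchronisation step.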

Relevance: 30.00%

Abstract:

The time evolution of the circulation change at the end of the Baiu season is investigated using ERA40 data. An end-day is defined for each of the 23 years, based on the 850 hPa θe value at 40°N in the 130–140°E sector exceeding 330 K. Daily time series of variables are composited with respect to this day. These composite time series exhibit a clearer and more rapid change in the precipitation and the large-scale circulation over the whole East Asia region than composites based on calendar days. The precipitation change includes the abrupt end of the Baiu rain, the northward shift of tropical convection (perhaps starting a few days before this), and the start of the heavier rain at higher latitudes. The northward migration of lower-tropospheric warm, moist tropical air, a general feature of the seasonal march in the region, is fast over the continent and slow over the ocean. By mid to late July the cooler air over the Sea of Japan is surrounded on three sides by the tropical air, suggesting that the large-scale stage has been set for a jump to the post-Baiu state, i.e., for the end of the Baiu season. Two likely triggers for the actual change emerge from the analysis. The first is the northward movement of tropical convection into the Philippine region. The second is an equivalent-barotropic Rossby wave train that develops downstream across Eurasia over a 10-day period. It appears likely that in most years one or both mechanisms are important in triggering the actual end of the Baiu season.
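
The compositing step described above is a superposed-epoch calculation: each year's daily series is realigned so that day 0 is that year's end-day, then averaged across years. Below is a minimal sketch with synthetic data and hypothetical end-days standing in for the ERA40 fields and the θe-based definition.

```python
import numpy as np

rng = np.random.default_rng(4)

n_years, n_days = 23, 365
# Synthetic daily anomalies per year (stand-in for an ERA40 field).
series = rng.normal(0.0, 1.0, size=(n_years, n_days))
# Hypothetical per-year end-days (day-of-year indices).
end_day = rng.integers(170, 210, size=n_years)

# Composite relative to the key day: average across years at each
# lag in a +/- 20 day window around each year's own end-day.
window = 20
lags = np.arange(-window, window + 1)
composite = np.array([
    series[np.arange(n_years), end_day + lag].mean()
    for lag in lags
])
print(composite.shape)  # (41,): mean anomaly at each lag
```

Aligning on the event rather than the calendar is what sharpens the transition in the composite, since the end of the Baiu season falls on a different date each year.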

Relevance: 30.00%

Abstract:

The rapid increase in the size and number of databases demands data mining approaches that scale to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently on several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
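
As a concrete (and purely illustrative) example of the data-parallel pattern common in this literature, the sketch below evaluates the coverage of candidate rules on partitions of the data concurrently and merges the per-partition counts; the rule representation, the data and all names are assumptions, not a method from the paper.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def coverage_counts(args):
    """Count, per candidate rule, how many instances in one data
    partition the rule covers and how many of those it labels
    correctly. Rules have the illustrative form IF x[f] > t THEN label."""
    (X, y), rules = args
    counts = []
    for feature, threshold, label in rules:
        mask = X[:, feature] > threshold
        counts.append((int(mask.sum()), int((y[mask] == label).sum())))
    return counts

def merge(per_partition):
    """Sum the (covered, correct) counts across partitions, per rule."""
    return [tuple(map(sum, zip(*c))) for c in zip(*per_partition)]

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    X = rng.random((10_000, 4))
    y = rng.integers(0, 2, size=10_000)
    rules = [(0, 0.5, 1), (2, 0.8, 0)]          # illustrative candidates
    parts = [((X[i::4], y[i::4]), rules) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        per_partition = list(pool.map(coverage_counts, parts))
    print(merge(per_partition))  # global (covered, correct) per rule
```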

Relevance: 30.00%

Abstract:

Advances in hardware and software in the past decade have made it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time, as soon as they are captured, for example when the data stream is infinite, fast-changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drift. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision-tree-based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
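
The skeleton below is not the authors' eRules implementation; it merely illustrates the behaviour the abstract describes, a rule set that abstains when no rule fires and prunes rules whose accuracy decays on the stream. All rule forms, names and thresholds are hypothetical.

```python
from collections import deque

class Rule:
    """Illustrative rule form: IF x[feature] > threshold THEN label."""
    def __init__(self, feature, threshold, label):
        self.feature, self.threshold, self.label = feature, threshold, label
        self.hits = deque(maxlen=200)   # sliding record of correctness

    def covers(self, x):
        return x[self.feature] > self.threshold

    def accuracy(self):
        return sum(self.hits) / len(self.hits) if self.hits else 1.0

class AbstainingRuleClassifier:
    """Adaptive rule set: predict only when some rule fires, track each
    rule's recent accuracy on the stream, and prune decayed rules."""
    def __init__(self, rules, min_accuracy=0.6):
        self.rules = list(rules)
        self.min_accuracy = min_accuracy

    def predict(self, x):
        for rule in self.rules:
            if rule.covers(x):
                return rule.label
        return None                      # abstain rather than guess

    def update(self, x, y_true):
        for rule in self.rules:
            if rule.covers(x):
                rule.hits.append(rule.label == y_true)
        # Remove rules whose recent accuracy has decayed (concept drift).
        self.rules = [r for r in self.rules
                      if r.accuracy() >= self.min_accuracy]

# Usage on a toy stream of (x, y_true) pairs:
clf = AbstainingRuleClassifier([Rule(0, 0.5, 1), Rule(1, 0.7, 0)])
for x, y in [((0.9, 0.1), 1), ((0.2, 0.9), 0), ((0.1, 0.2), 1)]:
    print(clf.predict(x))   # None when no rule covers the instance
    clf.update(x, y)
```

A full stream learner would also induce new rules from recent instances (especially the unclassified ones); the skeleton shows only the evaluate-and-prune loop.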