50 results for index model


Relevance:

40.00%

Publisher:

Abstract:

We have developed a model that allows players in the building and construction sector, and energy policy makers working on energy strategies, to gauge the interest of investors in the Kingdom of Bahrain in conducting Building Integrated Photovoltaic (BIPV) or Building Integrated Wind Turbine (BIWT) projects, i.e. partially sustainable or green buildings. The model allows the calculation of the Sustainable Building Index (SBI), which ranges from 0.1 (lowest) to 1.0 (highest); the higher the figure, the greater the chance of launching BIPV or BIWT projects. The model was tested in Bahrain, and the calculated SBI was found to be 0.47. This means that an extensive effort must be made through policies on renewable energy, renewable energy education, incentives for BIPV and BIWT projects, environmental awareness, and promotion of clean and sustainable energy for building and construction projects. Our model can be used internationally to create a "Global SBI" database. The Sustainable Buildings and Construction Initiative (SBCI) of the United Nations could take on the task of establishing such a database using this model.
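The abstract does not give the SBI formula, but an index of this kind is typically a weighted average of factor scores. The sketch below is a hypothetical illustration only; the factor names, scores and weights are invented and are not taken from the published model.

```python
def sustainable_building_index(scores, weights):
    """Weighted mean of factor scores, each scaled to [0, 1].

    Hypothetical placeholder formula: the published model's actual
    factors and weighting are not stated in the abstract."""
    if len(scores) != len(weights):
        raise ValueError("scores and weights must have the same length")
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Invented factor scores: policy support, education, incentives, awareness
scores = [0.4, 0.5, 0.4, 0.6]
weights = [0.3, 0.2, 0.3, 0.2]
sbi = sustainable_building_index(scores, weights)
```

Any index built this way stays inside the [0.1, 1.0] band described above as long as the individual factor scores do.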

Relevance:

40.00%

Publisher:

Abstract:

We examine whether a three-regime model that allows for dormant, explosive and collapsing speculative behaviour can explain the dynamics of the S&P 500. We extend existing models of speculative behaviour by including a third regime that allows a bubble to grow at a steady rate, and propose abnormal volume as an indicator of the probable time of bubble collapse. We also examine the financial usefulness of the three-regime model by studying a trading rule formed using inferences from it, whose use leads to higher Sharpe ratios and end-of-period wealth than employing existing models or a buy-and-hold strategy.
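The Sharpe-ratio comparison used to evaluate the trading rule can be sketched as follows. This is a generic illustration with synthetic return series, not the paper's S&P 500 data or its regime-based rule.

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0, periods_per_year=252):
    """Annualised Sharpe ratio of a series of per-period returns."""
    excess = np.asarray(returns, dtype=float) - risk_free
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

# Synthetic daily returns standing in for a regime-based rule vs buy-and-hold
rng = np.random.default_rng(0)
rule_returns = rng.normal(0.0006, 0.008, 1000)
buy_hold_returns = rng.normal(0.0003, 0.010, 1000)

rule_sharpe = sharpe_ratio(rule_returns)
buy_hold_sharpe = sharpe_ratio(buy_hold_returns)
```

Comparing annualised Sharpe ratios in this way rewards a strategy that earns its returns with less volatility, which is the criterion the abstract applies alongside end-of-period wealth.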

Relevance:

30.00%

Publisher:

Abstract:

Models developed to identify the rates and origins of nutrient export from land to stream require an accurate assessment of the nutrient load present in the water body in order to calibrate model parameters and structure. These data are rarely available at a representative scale and in an appropriate chemical form except in research catchments. Observational errors associated with nutrient load estimates based on these data lead to a high degree of uncertainty in modelling and nutrient budgeting studies. Here, daily paired instantaneous P and flow data for 17 UK research catchments covering a total of 39 water years (WY) have been used to explore the nature and extent of the observational error associated with nutrient flux estimates based on partial fractions and infrequent sampling. The daily records were artificially decimated to create 7 stratified sampling records, 7 weekly records, and 30 monthly records from each WY and catchment. These were used to evaluate the impact of sampling frequency on load estimate uncertainty. The analysis underlines the high uncertainty of load estimates based on monthly data and individual P fractions rather than total P. Catchments with a high baseflow index and/or low population density were found to return a lower RMSE on load estimates when sampled infrequently than those with a low baseflow index and high population density. Catchment size was not shown to be important, though a limitation of this study is that daily records may fail to capture the full range of P export behaviour in smaller catchments with flashy hydrographs, leading to an underestimate of uncertainty in load estimates for such catchments. Further analysis of sub-daily records is needed to investigate this fully.
Here, recommendations are given on load estimation methodologies for different catchment types sampled at different frequencies, and on the ways in which this analysis can be used to identify observational error and uncertainty for model calibration and nutrient budgeting studies. (c) 2006 Elsevier B.V. All rights reserved.
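The decimation idea — estimating an annual load from every nth day of a complete daily record and comparing it with the full-record load — can be sketched as below. The synthetic data and the simple estimator are illustrative assumptions; the study uses real catchment records and compares several load estimation methods.

```python
import numpy as np

def annual_load(conc, flow):
    """'True' annual load from a complete daily record: sum of conc x flow."""
    return float(np.sum(conc * flow))

def decimated_load(conc, flow, step):
    """Load estimated from every `step`-th day, scaled to the full year
    (a naive mean-flux estimator, used here only for illustration)."""
    idx = np.arange(0, len(conc), step)
    return float(np.mean(conc[idx] * flow[idx]) * len(conc))

# Synthetic daily series standing in for a year of paired P and flow data
rng = np.random.default_rng(1)
flow = rng.lognormal(0.0, 1.0, 365)       # daily flow (arbitrary units)
conc = 0.05 + 0.02 * rng.random(365)      # daily total P concentration

true_load = annual_load(conc, flow)
monthly_estimate = decimated_load(conc, flow, 30)  # ~monthly sampling
error_pct = 100.0 * (monthly_estimate - true_load) / true_load
```

Repeating this over many years and catchments, and summarising the errors as an RMSE, is the kind of comparison that underlies the sampling-frequency conclusions above.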

Relevance:

30.00%

Publisher:

Abstract:

The intraseasonal variability (ISV) of the Indian summer monsoon is dominated by a 30–50 day oscillation between “active” and “break” events of enhanced and reduced rainfall over the subcontinent, respectively. These organized convective events form in the equatorial Indian Ocean and propagate north to India. Atmosphere–ocean coupled processes are thought to play a key role in the intensity and propagation of these events. A high-resolution, coupled atmosphere–mixed-layer-ocean model, HadKPP, is assembled. HadKPP comprises the Hadley Centre Atmospheric Model (HadAM3) and the K Profile Parameterization (KPP) mixed-layer ocean model. Following studies showing that fine upper-ocean vertical resolution and sub-diurnal coupling frequencies improve the simulation of ISV in SSTs, KPP is run at 1 m vertical resolution near the surface, and the atmosphere and ocean are coupled every three hours. HadKPP accurately simulates the 30–50 day ISV in rainfall and SSTs over India and the Bay of Bengal, respectively, but suffers from low ISV on the equator, because the HadAM3 convection scheme produces limited ISV in surface fluxes. HadKPP demonstrates little of the observed northward propagation of intraseasonal events, producing instead a standing oscillation. The lack of equatorial ISV in convection in HadAM3 constrains the ability of KPP to produce equatorial SST anomalies, which further weakens the ISV of convection. It is concluded that while atmosphere–ocean interactions are undoubtedly essential to an accurate simulation of ISV, they are not a panacea for model deficiencies. In regions where the atmospheric forcing is adequate, such as the Bay of Bengal, KPP produces SST anomalies that are comparable to the Tropical Rainfall Measuring Mission Microwave Imager (TMI) SST analyses in both their magnitude and their timing with respect to rainfall anomalies over India.
HadKPP also displays a much-improved phase relationship between rainfall and SSTs relative to a HadAM3 ensemble forced by observed SSTs, when both are compared to observations. Coupling to mixed-layer models such as KPP has the potential to improve operational predictions of ISV, particularly when the persistence time of SST anomalies is shorter than the forecast lead time.

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the response of the wintertime North Atlantic Oscillation (NAO) to increasing concentrations of atmospheric carbon dioxide (CO2), as simulated by 18 global coupled general circulation models that participated in phase 2 of the Coupled Model Intercomparison Project (CMIP2). The NAO has been assessed in control and transient 80-year simulations produced by each model under constant forcing and under 1% per year increasing concentrations of CO2, respectively. Although generally able to simulate the main features of the NAO, the majority of models overestimate the observed mean wintertime NAO index of 8 hPa by 5-10 hPa. Furthermore, none of the models, in either the control or perturbed simulations, is able to reproduce decadal trends as strong as that seen in the observed NAO index from 1970 to 1995. Of the 15 models able to simulate the NAO pressure dipole, 13 predict a positive increase in the NAO with increasing CO2 concentrations. The magnitude of the response is generally small and highly model-dependent, which leads to large uncertainty in multi-model estimates such as the median estimate of 0.0061 +/- 0.0036 hPa per %CO2. Although an increase of 0.61 hPa in the NAO for a doubling of CO2 represents only a relatively small shift of 0.18 standard deviations in the probability distribution of the winter mean NAO, it can cause large relative increases in the probabilities of extreme values of the NAO associated with damaging impacts. Despite the large differences in NAO responses, the models robustly predict similar statistically significant changes in winter mean temperature (warmer over most of Europe) and precipitation (an increase over Northern Europe). Although these changes have a pattern similar to that expected from an increase in the NAO index, linear regression shows that the response is much greater than can be attributed to small increases in the NAO.
NAO trends are therefore not the key contributor to model-predicted climate change in wintertime mean temperature and precipitation over Europe and the Mediterranean region. However, the models' inability to capture the observed decadal variability in the NAO might also signify a major deficiency in their ability to simulate NAO-related responses to climate change.
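The point that a small mean shift produces a large relative change in tail probabilities can be illustrated with a normal approximation. This is an illustration of the statistical argument only, not the paper's own calculation, and the 2-standard-deviation threshold is an arbitrary choice for the example.

```python
from math import erf, sqrt

def norm_sf(z):
    """Survival function of the standard normal: P(Z > z)."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Probability of exceeding 2 standard deviations before and after
# shifting the mean by +0.18 standard deviations (the quoted NAO shift)
p_before = norm_sf(2.0)
p_after = norm_sf(2.0 - 0.18)
relative_increase = p_after / p_before
```

Even though 0.18 standard deviations barely moves the bulk of the distribution, the exceedance probability at the 2-sigma threshold rises by roughly half, which is the sense in which extreme-value probabilities respond disproportionately.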

Relevance:

30.00%

Publisher:

Abstract:

Much of the uncertainty in the radiative effect of mineral dust in the atmosphere stems from uncertainty in the imaginary part of its refractive index. A synthesis of optical, chemical and physical in-situ aircraft measurements from the DODO experiments during February and August 2006 is used to calculate the refractive index of mineral dust encountered over West Africa. Radiative transfer modeling and measurements of broadband shortwave irradiance at a range of altitudes are used to test and validate these calculations for a specific dust event on 23 August 2006 over Mauritania. Two techniques are used to determine the refractive index: firstly, a method combining measurements of scattering, absorption and size distributions with Mie code simulations; and secondly, a method using composition measured on filter samples to apportion the content of internally mixed quartz, calcite and iron oxide-clay aggregates, where the iron oxide is represented by either hematite or goethite and the clay by either illite or kaolinite. The imaginary part of the refractive index at 550 nm (ni550) is found to range between 0.0001i and 0.0046i, and where filter samples are available, agreement between the methods is found, depending on the mineral combination assumed. The refractive indices also agree well with AERONET data where comparisons are possible. ni550 is found to vary with dust source, which is investigated with the NAME model for each case. The relationships of both size distribution and ni550 with the accumulation-mode single scattering albedo at 550 nm (ω0550) are examined: size distribution is found to have no correlation with ω0550, while ni550 shows a strong linear relationship with ω0550. Radiative transfer modeling was performed with different models (Mie-derived refractive indices, but also filter-sample composition assuming both internal and external mixing).
Our calculations indicate that the Mie-derived values of ni550 and the externally mixed dust in which the iron oxide-clay aggregate corresponds to the goethite-kaolinite combination give the best agreement with the irradiance measurements. The radiative effect of the dust is found to be very sensitive to the mineral combination (and hence refractive index) assumed, and to whether the dust is assumed to be internally or externally mixed.
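The single scattering albedo referred to above is the ratio of scattering to total extinction, computable directly from the in-situ scattering and absorption coefficients. The coefficient values in the example are hypothetical, not DODO measurements.

```python
def single_scattering_albedo(scattering, absorption):
    """omega_0 = scattering / (scattering + absorption).

    A value near 1 indicates a weakly absorbing aerosol; stronger
    absorption (larger ni550) lowers omega_0."""
    return scattering / (scattering + absorption)

# Hypothetical accumulation-mode coefficients at 550 nm, in Mm^-1
omega_550 = single_scattering_albedo(scattering=95.0, absorption=5.0)
```

This direct dependence of omega_0 on absorption is why ni550, which controls absorption, correlates strongly with ω0550 while the size distribution does not.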

Relevance:

30.00%

Publisher:

Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analysis of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
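Youden's index itself is straightforward to compute from each study's 2x2 table; the sketch below shows the per-study quantity that a Mantel–Haenszel-type estimator would then pool across studies. The study counts are hypothetical, not those of the ultrasound example.

```python
def youden_index(tp, fn, tn, fp):
    """Youden's J = sensitivity + specificity - 1.

    J ranges from 0 (uninformative test) to 1 (perfect test), and
    1 - J is the overall misclassification measure discussed above."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1.0

# Hypothetical 2x2 counts (tp, fn, tn, fp) from three studies
studies = [(45, 5, 90, 10), (30, 10, 70, 10), (60, 15, 80, 20)]
j_values = [youden_index(*s) for s in studies]
```

Because J combines sensitivity and specificity symmetrically, a study that shifts its cut-off trades one for the other while J moves comparatively little, which is the robustness property the paper exploits.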

Relevance:

30.00%

Publisher:

Abstract:

The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration; its optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
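The two skill measures quoted — the correlation coefficient and the root mean square error as a percentage of mean yield — can be computed as below. The yield series shown are hypothetical stand-ins, not the Indian groundnut data.

```python
import numpy as np

def validation_stats(observed, simulated):
    """Pearson correlation and RMSE expressed as % of the observed mean."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    r = np.corrcoef(obs, sim)[0, 1]
    rmse_pct = 100.0 * np.sqrt(np.mean((sim - obs) ** 2)) / obs.mean()
    return r, rmse_pct

# Hypothetical observed vs simulated annual yields (t/ha)
obs_yield = [0.8, 1.1, 0.9, 1.3, 1.0]
sim_yield = [0.9, 1.0, 0.9, 1.2, 1.1]
r, rmse_pct = validation_stats(obs_yield, sim_yield)
```

Expressing the RMSE relative to the mean yield, as the abstract does (8.4%), makes the error comparable across regions with different absolute yield levels.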

Relevance:

30.00%

Publisher:

Abstract:

Increasing rates of obesity and heart disease are compromising quality of life for a growing number of people. There is much research linking adult disease with growth and development both in utero and during the first year of life. The pig is an ideal model for studying the origins of developmental programming. The objective of this paper was to construct percentile growth curves for the pig for use in biomedical studies. The body weight (BW) of pigs was recorded from birth to 150 days of age, and their crown-to-rump length was measured over the neonatal period to enable the ponderal index (PI; kg/m³) to be calculated. Data were normalised and percentile curves were constructed using Cole's lambda-mu-sigma (LMS) method for BW and PI. The construction of these percentile charts for use in biomedical research will allow a more detailed and precise tracking of the growth and development of individual pigs under experimental conditions.
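Cole's LMS method summarises the measurement distribution at each age by a skewness parameter (L), a median (M) and a coefficient of variation (S); percentile curves follow from the z-score transform below. The parameter values in the example are hypothetical, not the pig growth data.

```python
from math import log

def lms_zscore(x, L, M, S):
    """Z-score of measurement x under Cole's LMS transform.

    Percentile curves are obtained by inverting:
    x = M * (1 + L*S*z) ** (1/L) for L != 0, and x = M * exp(S*z) for L == 0."""
    if abs(L) < 1e-12:
        return log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical LMS parameters for body weight at a single age
z = lms_zscore(6.2, L=0.4, M=5.8, S=0.12)
```

A measurement equal to the median M always maps to z = 0, i.e. the 50th percentile, which is the anchoring property that makes the charts usable for tracking individual animals.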

Relevance:

30.00%

Publisher:

Abstract:

In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing practised to maintain high stocking rates is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research those objectives were elicited first, and from their ranking two, 'asset value of cattle' (representing cattle ownership) and 'present value of economic returns', were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model, a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that the benefits from holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or uncaring about, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple 'no overgrazing' rule is an insufficient strategy to maintain the long-term sustainability of beef production systems in Central Brazil.
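Compromise programming ranks alternatives by their weighted Lp distance from an "ideal point" at which every criterion takes its best value. A minimal sketch follows, with invented criterion values; the paper's two criteria are cattle asset value and present value of economic returns, but its actual data and weights are not given in the abstract.

```python
def compromise_distance(values, ideal, anti_ideal, weights, p=2):
    """Weighted L_p distance from the ideal point, with each criterion
    normalised by its ideal/anti-ideal range. Smaller is better."""
    total = 0.0
    for v, best, worst, w in zip(values, ideal, anti_ideal, weights):
        total += (w * abs(best - v) / abs(best - worst)) ** p
    return total ** (1.0 / p)

# Hypothetical best and worst attainable values for the two criteria
ideal = [100.0, 50.0]      # asset value of cattle, economic returns
anti_ideal = [40.0, 10.0]

# Two hypothetical stocking-rate options scored on the two criteria
option_a = compromise_distance([90.0, 30.0], ideal, anti_ideal, [0.5, 0.5])
option_b = compromise_distance([60.0, 45.0], ideal, anti_ideal, [0.5, 0.5])
```

Varying the weights shifts which option is preferred, which is how scenarios reflecting different farmer attitudes can be explored within one model.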
