950 results for Mean Absolute Scaled Error (MASE)


Relevance: 100.00%

Abstract:

Introduction. Leaf area is often related to plant growth, development, physiology and yield. Many non-destructive models have been proposed for leaf area estimation of several plant genotypes, demonstrating that leaf length, leaf width and leaf area are closely correlated. Thus, the objective of our study was to develop a reliable model for leaf area estimation from linear measurements of leaf dimensions for citrus genotypes. Materials and methods. Leaves of citrus genotypes were harvested, and their dimensions (length, width and area) were measured. Values of leaf area were regressed against length, width, the square of length, the square of width and the product (length × width). The most accurate equations, either linear or second-order polynomial, were regressed again with a new data set; then the most reliable equation was defined. Results and discussion. The first analysis showed that the variables length, width and the square of length gave better results in second-order polynomial equations, while the linear equations were more suitable and accurate when the width and the product (length × width) were used. When these equations were regressed with the new data set, the coefficient of determination (R²) and the agreement index 'd' were higher for the one that used the variable product (length × width), while the Mean Absolute Percentage Error was lower. Conclusion. The product of the simple leaf dimensions (length × width) can provide a reliable and simple non-destructive model for leaf area estimation across citrus genotypes.
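
A minimal sketch of the kind of fit the study describes, using hypothetical leaf measurements (not the citrus data) to illustrate the length × width model, R² and the agreement index d:

```python
import numpy as np

# Hypothetical leaf measurements (cm and cm^2); the study used real citrus leaves.
length = np.array([6.1, 7.4, 8.0, 9.2, 10.5, 11.3])
width = np.array([3.0, 3.6, 4.1, 4.5, 5.2, 5.6])
area = np.array([12.9, 18.7, 23.1, 29.0, 38.3, 44.5])

# Linear model: area = b0 + b1 * (length * width)
x = length * width
b1, b0 = np.polyfit(x, area, 1)
pred = b0 + b1 * x

# Coefficient of determination R^2
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Willmott's agreement index d
d = 1 - np.sum((pred - area) ** 2) / np.sum(
    (np.abs(pred - area.mean()) + np.abs(area - area.mean())) ** 2
)
print(f"area ≈ {b0:.2f} + {b1:.3f}·(L·W), R² = {r2:.3f}, d = {d:.3f}")
```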

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Graduate Program in Electrical Engineering (Pós-graduação em Engenharia Elétrica) - FEIS

Relevance: 100.00%

Abstract:

Graduate Program in Electrical Engineering (Pós-graduação em Engenharia Elétrica) - FEIS

Relevance: 100.00%

Abstract:

The Box-Cox transformation is a technique commonly used to make the probability distribution of time series data approximately normal, which helps statistical and neural models produce more accurate forecasts. However, it introduces a bias when the transformation is reverted on the predicted data. Statistical methods for a bias-free reversion necessarily assume that the transformed data are Gaussian, which is rarely the case for real-world time series. The aim of this study was therefore to provide an effective method for removing the bias introduced when the Box-Cox transformation is reverted. The developed method is based on a focused time-lagged feedforward neural network, which does not require any assumption about the distribution of the transformed data. To evaluate the proposed method, numerical simulations were conducted, and the Mean Absolute Percentage Error, the Theil Inequality Index and the signal-to-noise ratio of 20-step-ahead forecasts of 40 time series were compared; the results indicate that the proposed reversion method is valid and justifies further studies.
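
The bias the study addresses can be illustrated with a small sketch: transform skewed data with a Box-Cox transformation, predict the mean in the transformed space, and revert naively. The data and the simple mean "forecast" here are assumptions for illustration; the study's neural-network correction is not reproduced.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

rng = np.random.default_rng(0)

# Skewed (lognormal) "series"; Box-Cox makes it roughly Gaussian.
y = rng.lognormal(mean=1.0, sigma=0.6, size=5000)
z, lam = boxcox(y)                # transformed data and fitted lambda

# A forecaster working in the transformed space predicts the mean of z.
z_hat = z.mean()

# Naive reversion: apply the inverse transform to the predicted mean.
naive = inv_boxcox(z_hat, lam)

# Because the inverse transform is convex here, the naive reversion is biased
# low relative to the mean of the original series (Jensen's inequality).
print(f"lambda = {lam:.3f}")
print(f"naive reversion of mean forecast: {naive:.3f}")
print(f"actual mean of the series:        {y.mean():.3f}")
```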

Relevance: 100.00%

Abstract:

Graduate Program in Agronomy (Plant Production) (Pós-graduação em Agronomia, Produção Vegetal) - FCAV

Relevance: 100.00%

Abstract:

Graduate Program in Agronomy (Plant Production) (Pós-graduação em Agronomia, Produção Vegetal) - FCAV

Relevance: 100.00%

Abstract:

The aim of this paper is to compare 18 reference evapotranspiration models to the standard Penman-Monteith model in the Jaboticabal, Sao Paulo, region at the following time scales: daily, 5-day, 15-day and seasonal. Five years of daily meteorological data were used for the following analyses: accuracy (mean absolute percentage error, MAPE), precision (R²) and tendency (systematic error, SE, i.e. bias). The results were also compared at the 95% probability level with Tukey's test. The Priestley-Taylor (1972) method was the most accurate for all time scales, the Tanner-Pelton (1960) method was the most accurate in winter, and the Thornthwaite (1948) method was the most accurate of the methods that use only temperature data in their equations.
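
A minimal sketch of the three evaluation statistics named above (MAPE, R² and systematic error), computed for hypothetical daily ETo values against a Penman-Monteith reference series:

```python
import numpy as np

def mape(obs, est):
    """Mean absolute percentage error (accuracy)."""
    return 100.0 * np.mean(np.abs((est - obs) / obs))

def r2(obs, est):
    """Coefficient of determination of the estimates vs. the reference (precision)."""
    ss_res = np.sum((obs - est) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1 - ss_res / ss_tot

def systematic_error(obs, est):
    """Mean bias of the estimates (tendency)."""
    return np.mean(est - obs)

# Hypothetical daily ETo (mm/day): Penman-Monteith reference vs. an alternative model.
eto_pm = np.array([3.9, 4.2, 4.8, 5.1, 4.4, 3.7, 4.0])
eto_alt = np.array([4.1, 4.0, 5.0, 5.4, 4.2, 3.9, 4.3])

print(f"MAPE = {mape(eto_pm, eto_alt):.1f}%  R² = {r2(eto_pm, eto_alt):.3f}  "
      f"SE = {systematic_error(eto_pm, eto_alt):+.2f} mm/day")
```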

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Using a convenient and fast HPLC procedure, we determined serum concentrations of the fungistatic agent 5-fluorocytosine (5-FC) in 375 samples from 60 patients treated with this drug. The mean trough concentration (n = 127) was 64.3 mg/l (range: 11.8-208.0 mg/l), the mean peak concentration (n = 122) was 99.9 mg/l (range: 25.6-263.8 mg/l), and the mean nonpeak/nontrough concentration (n = 126) was 80.1 mg/l (range: 10.5-268.0 mg/l). In total, 134 (35.7%) samples were outside the therapeutic range (25-100 mg/l), 108 (28.8%) being too high and 26 (6.9%) too low. Forty-four (73%) patients showed 5-FC serum concentrations outside the therapeutic range at least once during the treatment course. In a prospective study we performed 65 dosage predictions on 30 patients using a 3-point method previously developed for aminoglycoside dosage adaptation. The mean prediction error of the dosage adaptation was +0.7 mg/l (range: -26.0 to +28.0 mg/l), and the root mean square prediction error was 10.7 mg/l. The mean predicted concentration (65.3 mg/l) agreed very well with the mean measured concentration (64.6 mg/l). The frequency distribution of 5-FC serum concentrations indicates that 5-FC monitoring is important. The applied pharmacokinetic method allows individual adaptation of 5-FC dosage with a clinically acceptable prediction error.
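
A small sketch of how the reported prediction-error summaries are computed from predicted and measured concentrations; the values below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical predicted vs. measured 5-FC serum concentrations (mg/l).
predicted = np.array([62.0, 88.5, 40.2, 71.0, 55.3])
measured = np.array([58.4, 95.1, 37.0, 80.2, 51.8])

errors = predicted - measured
mean_prediction_error = errors.mean()      # signed bias of the predictions
rmse = np.sqrt(np.mean(errors ** 2))       # root mean square prediction error

print(f"mean prediction error = {mean_prediction_error:+.1f} mg/l")
print(f"root mean square prediction error = {rmse:.1f} mg/l")
```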

Relevance: 100.00%

Abstract:

Background: Individuals with type 1 diabetes (T1D) have to count the carbohydrates (CHOs) of their meal to estimate the prandial insulin dose needed to compensate for the meal’s effect on blood glucose levels. CHO counting is very challenging but also crucial, since an error of 20 grams can substantially impair postprandial control. Method: The GoCARB system is a smartphone application designed to support T1D patients with CHO counting of nonpacked foods. In a typical scenario, the user places a reference card next to the dish and acquires 2 images with his/her smartphone. From these images, the plate is detected and the different food items on the plate are automatically segmented and recognized, while their 3D shape is reconstructed. Finally, the food volumes are calculated and the CHO content is estimated by combining the previous results and using the USDA nutritional database. Results: To evaluate the proposed system, a set of 24 multi-food dishes was used. For each dish, 3 pairs of images were taken and for each pair, the system was applied 4 times. The mean absolute percentage error in CHO estimation was 10 ± 12%, which led to a mean absolute error of 6 ± 8 CHO grams for normal-sized dishes. Conclusion: The laboratory experiments demonstrated the feasibility of the GoCARB prototype system, since the error was below the initial goal of 20 grams. However, further improvements and evaluation are needed prior to launching a system able to accommodate inter- and intracultural eating habits.

Relevance: 100.00%

Abstract:

This study demonstrated that accurate, short-term forecasts of Veterans Affairs (VA) hospital utilization can be made using the Patient Treatment File (PTF), the inpatient discharge database of the VA. Accurate, short-term forecasts of two years or less can reduce required inventory levels, improve allocation of resources, and are essential for better financial management. These are all necessary achievements in an era of cost-containment. Six years of non-psychiatric discharge records were extracted from the PTF and used to calculate four indicators of VA hospital utilization: average length of stay, discharge rate, multi-stay rate (a measure of readmissions) and days of care provided. National and regional levels of these indicators were described and compared for fiscal year 1984 (FY84) to FY89 inclusive. Using the observed levels of utilization for the 48 months between FY84 and FY87, five techniques were used to forecast monthly levels of utilization for FY88 and FY89. Forecasts were compared to the observed levels of utilization for these years. Monthly forecasts were also produced for FY90 and FY91. Forecasts for days of care provided were not produced: current inpatients with very long lengths of stay contribute substantially to this indicator, so it cannot be accurately calculated. During the six-year period between FY84 and FY89, average length of stay declined substantially, nationally and regionally. The discharge rate was relatively stable, while the multi-stay rate increased slightly during this period. FY90 and FY91 forecasts show a continued decline in the average length of stay, while the discharge rate is forecast to decline slightly and the multi-stay rate is forecast to increase very slightly. Over a 24-month-ahead horizon, all three indicators were forecast within a 10 percent average monthly error. The 12-month-ahead forecast errors were slightly lower. Average length of stay was less easily forecast, while the multi-stay rate was the easiest indicator to forecast. No single technique performed significantly better as determined by the Mean Absolute Percent Error, a standard measure of error. However, Autoregressive Integrated Moving Average (ARIMA) models performed well overall and are recommended for short-term forecasting of VA hospital utilization.
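
A minimal sketch of the recommended approach, assuming a hypothetical monthly series and an arbitrarily chosen ARIMA order, with statsmodels used to produce 24-month-ahead forecasts scored by MAPE:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly average length of stay (days) over six fiscal years.
rng = np.random.default_rng(1)
idx = pd.date_range("1983-10-01", periods=72, freq="MS")
alos = pd.Series(20 - 0.05 * np.arange(72) + rng.normal(0, 0.4, 72), index=idx)

train, test = alos[:48], alos[48:]            # 48 months fit, 24 months holdout

model = ARIMA(train, order=(1, 1, 1)).fit()   # order chosen for illustration only
forecast = model.forecast(steps=len(test))

mape = 100.0 * np.mean(np.abs((forecast.values - test.values) / test.values))
print(f"24-month-ahead MAPE = {mape:.1f}%")
```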

Relevance: 100.00%

Abstract:

Electricity price forecasting is an interesting problem for all the agents involved in electricity market operation. For instance, every profit maximisation strategy is based on the computation of accurate one-day-ahead forecasts, which is why electricity price forecasting has been a growing field of research in recent years. In addition, the increasing concern about environmental issues has led to a high penetration of renewable energies, particularly wind. In some European countries such as Spain, Germany and Denmark, renewable energy is having a deep impact on the local power markets. In this paper, we propose a model that is optimal from the perspective of forecasting accuracy; it consists of a combination of several univariate and multivariate time series methods that account for the amount of energy produced with clean energies, particularly wind and hydro, which are the most relevant renewable energy sources in the Iberian Market. This market is used to illustrate the proposed methodology, as it is one of the markets in which wind power production is most relevant in terms of its percentage of total demand, but of course our method can be applied to any other liberalised power market. As far as our contribution is concerned, first, the methodology proposed by García-Martos et al. (2007, 2012) is generalised twofold: we allow the incorporation of wind power production and hydro reservoirs, and we do not impose the restriction of using the same model for all 24 hours of the day. A computational experiment and a Design of Experiments (DOE) are performed for this purpose. Then, for those hours in which there are two or more models without statistically significant differences in terms of their forecasting accuracy, a combination of forecasts is proposed by weighting the best models (according to the DOE) and minimising the Mean Absolute Percentage Error (MAPE). The MAPE is the most popular accuracy metric for comparing electricity price forecasting models. We construct the combination of forecasts by solving several nonlinear optimisation problems that allow computation of the optimal weights for building the combination of forecasts. The results are obtained from a large computational experiment that entails calculating out-of-sample forecasts for every hour of every day in the period from January 2007 to December 2009. In addition, to reinforce the value of our methodology, we compare our results with those reported in recent published works in the field. This comparison shows the superiority of our methodology in terms of forecasting accuracy.
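
A minimal sketch of the weight-optimisation step, assuming three hypothetical candidate forecasts and using scipy to find non-negative weights, summing to one, that minimise the MAPE of the combination:

```python
import numpy as np
from scipy.optimize import minimize

def mape(actual, forecast):
    return 100.0 * np.mean(np.abs((forecast - actual) / actual))

# Hypothetical hourly prices and three candidate model forecasts for the same hours.
actual = np.array([41.2, 39.8, 45.0, 50.3, 48.1, 43.7])
candidates = np.array([
    [40.0, 41.0, 44.2, 51.5, 47.0, 44.8],   # model A
    [42.5, 38.9, 46.1, 49.0, 49.3, 42.9],   # model B
    [41.0, 40.5, 45.8, 52.0, 47.8, 44.1],   # model C
])

def objective(w):
    # MAPE of the weighted combination of the candidate forecasts.
    return mape(actual, w @ candidates)

n = candidates.shape[0]
result = minimize(
    objective,
    x0=np.full(n, 1.0 / n),                            # start from equal weights
    bounds=[(0.0, 1.0)] * n,                           # non-negative weights
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
    method="SLSQP",
)
print("optimal weights:", np.round(result.x, 3))
print(f"combined MAPE: {objective(result.x):.2f}%")
```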

Relevance: 100.00%

Abstract:

This dissertation aimed to improve travel time estimation for the purpose of transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which were difficult to consider in planning models. For this purpose, an analytical model has been developed. The model parameters were calibrated based on data from CORSIM microscopic simulation, with signal timing plans optimized using the TRANSYT-7F software. Independent variables in the model are link length, free-flow speed, and traffic volumes from the competing turning movements. The developed model has three advantages compared to traditional link-based or node-based models. First, the model considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, the model describes the non-uniform spatial distribution of delay along a link, making it possible to estimate the impacts of queues at different upstream locations of an intersection and to attribute delays to a subject link and its upstream link. Third, the model shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% for a set of field data from the Minnesota Department of Transportation (MDOT); this is close to the MAPE of uniform delay in the HCM 2000 method (11%). The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method for a set of Miami-Dade County data representing congested traffic conditions, with a MAPE of 29% compared to 31% for the HCM 2000 method. The advantages of the proposed model make it feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation. An assignment model with the developed travel time estimation method has been implemented in a South Florida planning model, which improved assignment results.
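
For reference, the uniform-delay term of the HCM 2000 method that the model is benchmarked against can be computed as follows; this is the standard formula, and the signal parameters below are illustrative, not taken from the dissertation:

```python
def hcm_uniform_delay(cycle_s, g_over_c, x):
    """Uniform delay d1 (s/veh) from the HCM 2000 control-delay model.

    cycle_s   -- cycle length C in seconds
    g_over_c  -- effective green ratio g/C
    x         -- volume-to-capacity ratio; capped at 1.0 in this term
    """
    x = min(x, 1.0)
    return 0.5 * cycle_s * (1 - g_over_c) ** 2 / (1 - x * g_over_c)

# Illustrative signal: 90 s cycle, g/C = 0.45, v/c = 0.85
print(f"d1 = {hcm_uniform_delay(90, 0.45, 0.85):.1f} s/veh")
```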

Relevance: 100.00%

Abstract:

Yield management helps hotels more profitably manage the capacity of their rooms. Hotels tend to have two types of business: transient and group. Yield management research and systems have been designed for transient business in which the group forecast is taken as a given. In this research, forecast data from approximately 90 hotels of a large North American hotel chain were used to determine the accuracy of group forecasts and to identify factors associated with accurate forecasts. Forecasts showed a positive bias and had a mean absolute percentage error (MAPE) of 40% at two months before arrival; 30% at one month before arrival; and 10-15% on the day of arrival. Larger hotels, hotels with a higher dependence on group business, and hotels that updated their forecasts frequently during the month before arrival had more accurate forecasts.
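
A small sketch of summarising group-forecast accuracy by lead time (bias and MAPE), assuming a hypothetical table of forecast and actual group room nights rather than the hotel chain's data:

```python
import numpy as np
import pandas as pd

# Hypothetical group room-night forecasts at different lead times vs. actuals.
df = pd.DataFrame({
    "lead_days": [60, 60, 30, 30, 0, 0],
    "forecast":  [120, 95, 110, 88, 101, 82],
    "actual":    [ 90, 80, 100, 82, 104, 80],
})

df["error"] = df["forecast"] - df["actual"]
df["ape"] = (df["error"].abs() / df["actual"]) * 100

summary = df.groupby("lead_days").agg(
    bias=("error", "mean"),          # positive bias = over-forecasting
    mape=("ape", "mean"),
)
print(summary)
```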