898 results for trajectory accuracy


Relevance: 20.00%

Publisher:

Abstract:

This work proposes and discusses an approach for inducing Bayesian classifiers aimed at balancing the tradeoff between the precise probability estimates produced by time-consuming unrestricted Bayesian networks and the computational efficiency of Naive Bayes (NB) classifiers. The proposed approach is based on the fundamental principles of heuristic-search Bayesian network learning. The Markov Blanket concept, as well as a proposed "approximate Markov Blanket", is used to reduce the number of nodes that form the Bayesian network to be induced from data. Consequently, the usually high computational cost of heuristic-search learning algorithms can be lessened, while Bayesian network structures better than NB can be achieved. The resulting algorithms, called DMBC (Dynamic Markov Blanket Classifier) and A-DMBC (Approximate DMBC), are empirically assessed in twelve domains that illustrate scenarios of particular interest. The obtained results are compared with NB and Tree Augmented Network (TAN) classifiers, and confirm that both proposed algorithms can provide good classification accuracies and better probability estimates than NB and TAN, while being more computationally efficient than the widely used K2 algorithm.
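As a minimal illustration of the pruning idea (the network and node names below are invented, not taken from the paper), the Markov blanket of a target node in a DAG is its parents, its children, and its children's other parents:

```python
# Hypothetical sketch: the Markov blanket of a node in a DAG consists of
# its parents, its children, and the other parents of its children.
# Approaches like DMBC use such a set to prune candidate nodes; the toy
# graph below is purely illustrative.

def markov_blanket(parents, target):
    """parents maps each node to the set of its parent nodes."""
    children = {n for n, ps in parents.items() if target in ps}
    blanket = set(parents.get(target, set())) | children
    for child in children:                      # co-parents ("spouses")
        blanket |= set(parents[child])
    blanket.discard(target)
    return blanket

# Toy network:  A -> C <- B,  C -> D,  E -> D
dag = {"A": set(), "B": set(), "C": {"A", "B"}, "D": {"C", "E"}, "E": set()}
print(sorted(markov_blanket(dag, "C")))  # -> ['A', 'B', 'D', 'E']
```

Conditioned on this set, the target is independent of the rest of the network, which is what justifies discarding the remaining nodes.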

Relevance: 20.00%

Publisher:

Abstract:

In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delay and uncertainties into account, can be designed for multi-objective management problems and for large-scale systems. Nonetheless, a critical obstacle that needs to be overcome in MPC is the large computational burden when a large-scale system is considered or a long prediction horizon is involved. In order to solve this problem, we use an adaptive prediction accuracy (APA) approach that can reduce the computational burden almost by half. The proposed MPC-APA scheme is tested on the northern Dutch water system, which comprises Lake IJssel, Lake Marker, the River IJssel and the North Sea Canal. The simulation results show that the MPC-APA scheme reduces the computational time to a large extent and that flood protection problems over longer prediction horizons can be solved well.
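One common way to trade prediction accuracy against computation along an MPC horizon is a non-uniform time grid: fine steps near the present, where control decisions matter most, and coarse steps further out. Whether this exactly matches the paper's APA scheme is an assumption; the step sizes below are invented for illustration:

```python
# Sketch of a non-uniform prediction grid: hourly decision points for the
# first day, four-hourly afterwards.  Halving the number of decision
# variables roughly halves the size of the optimization problem solved
# at each MPC step.

def prediction_grid(horizon_h, fine_until_h=24, fine_dt_h=1, coarse_dt_h=4):
    """Return decision-time points (in hours) over the prediction horizon."""
    fine = list(range(0, min(fine_until_h, horizon_h), fine_dt_h))
    coarse = list(range(fine_until_h, horizon_h, coarse_dt_h))
    return fine + coarse

uniform = list(range(0, 96, 1))      # 96 decision points
adaptive = prediction_grid(96)       # 24 fine + 18 coarse = 42 points
print(len(uniform), len(adaptive))   # -> 96 42
```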

Relevance: 20.00%

Publisher:

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria selecting lag and rank simultaneously have superior performance in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
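As a sketch of the unrestricted baseline that such restrictions act on, a VAR(1), y_t = A y_{t-1} + e_t, can be estimated equation-by-equation with ordinary least squares. The system simulated below is illustrative, not taken from the paper:

```python
import numpy as np

# Simulate a stationary bivariate VAR(1) and recover its coefficient
# matrix by OLS; a one-step-ahead forecast is then A_hat @ y_T.

rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.2],
                   [0.1, 0.4]])
y = np.zeros((500, 2))
for t in range(1, 500):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

Y, X = y[1:], y[:-1]                              # regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T    # OLS: Y ~ X @ A_hat.T
forecast = A_hat @ y[-1]                          # one-step-ahead forecast
print(np.round(A_hat, 2))
```

Cointegration and SCCF restrictions shrink the parameter space of exactly this kind of regression, which is where the gains in estimation uncertainty come from.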

Relevance: 20.00%

Publisher:

Abstract:

Industrial companies in developing countries are facing rapid growth, which requires having in place the best organizational processes to cope with market demand. Sales forecasting, as a tool aligned with the general strategy of the company, needs to be as accurate as possible in order to achieve sales targets, making the right information available for purchasing, production planning and control, and ultimately meeting the generated demand on time and in full. The present dissertation uses a single case study from the Brazilian subsidiary of an international explosives company, Maxam, which is experiencing high sales growth and therefore faces the challenge of adapting its structure and processes to the rapid growth expected. Diverse sales forecasting techniques have been analyzed to compare the actual monthly sales forecast, based on the sales representatives' market knowledge, with forecasts based on the analysis of historical sales data. The dissertation findings show how combining qualitative and quantitative forecasts, through a combined forecast that brings together the sales force's knowledge of client demand and time-series analysis, improves the accuracy of the company's sales forecast.
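A minimal sketch of forecast combination, not necessarily the dissertation's exact method: weight the judgmental and the statistical forecast inversely to their historical mean absolute errors. All numbers below are invented:

```python
# Inverse-error weighting: the forecast source that has been less wrong
# in the past gets the larger weight in the combination.

def combine(f_judgmental, f_statistical, mae_j, mae_s):
    """Combine two forecasts with weights proportional to 1/MAE."""
    w_j = (1 / mae_j) / (1 / mae_j + 1 / mae_s)
    return w_j * f_judgmental + (1 - w_j) * f_statistical

# The judgmental forecast has been twice as erroneous -> gets 1/3 weight.
print(round(combine(120.0, 90.0, mae_j=10.0, mae_s=5.0), 2))  # -> 100.0
```

Even this simple scheme captures the abstract's point: neither source is discarded, but each contributes in proportion to its demonstrated reliability.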

Relevance: 20.00%

Publisher:

Abstract:

This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant works concluding that macroeconomic models can explain the short-term exchange rate, and we also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance described by Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To this end, we use a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we select currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. Having found that we cannot claim the implemented model yields more accurate forecasts than a random walk, we assess whether the model is at least capable of generating "rational", or "consistent", forecasts. For this purpose, we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged returns predicted by the structural model explain the observed contemporaneous values.
We find that the fundamental model is unable to anticipate realized returns.
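The benchmark comparison at the heart of such studies can be sketched as follows: the random walk forecasts no change, and the structural model must beat its out-of-sample RMSE. The series below are simulated, not actual exchange rates or the study's model output:

```python
import numpy as np

# Out-of-sample RMSE comparison against the random-walk benchmark.
# Monthly log-return changes and model forecasts are both simulated
# white noise, i.e. the "model" here is deliberately uninformative.

rng = np.random.default_rng(1)
actual_changes = rng.normal(0.0, 0.03, size=120)   # realized monthly changes
model_forecasts = rng.normal(0.0, 0.03, size=120)  # structural-model forecasts

rmse_model = np.sqrt(np.mean((actual_changes - model_forecasts) ** 2))
rmse_rw = np.sqrt(np.mean(actual_changes ** 2))    # random walk forecasts 0

print(rmse_model, rmse_rw)
```

An uninformative model adds its own noise to the forecast error, so its RMSE exceeds the random walk's; in practice the comparison is made with formal tests (e.g. of equal predictive accuracy) rather than raw RMSEs, which is the paper's point about test choice mattering.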

Relevance: 20.00%

Publisher:

Abstract:

This paper discusses social housing policy in Brazil since the 1990s by analyzing government programs' institutional arrangements, their sources of revenue and the formatting of the related financial systems. The conclusion suggests that these arrangements have not constituted a comprehensive housing policy with the clear aim of enhancing housing conditions in the country. Housing 'policies' since the 1990s, as proposed by the Fernando Collor de Mello, Itamar Franco, Fernando Henrique Cardoso and Luís Inácio Lula da Silva governments (in the latter case, despite much progress towards subsidized investment programs), have sought to consolidate financial instruments in line with global markets, restructuring the way private interests operate within the system, a necessary, however incomplete, course of action. Contrary to the rhetoric, this has resulted in failure, as the more fundamental social results for the poor have not yet been achieved.

Relevance: 20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 20.00%

Publisher:

Abstract:

After removal of Selective Availability in 2000, the ionosphere became the dominant error source for Global Navigation Satellite Systems (GNSS), especially for high-accuracy (cm-mm) demanding applications like Precise Point Positioning (PPP) and Real Time Kinematic (RTK) positioning. The common practice of eliminating the ionospheric error, e.g. by the ionosphere-free (IF) observable, which is a linear combination of observables on two frequencies such as GPS L1 and L2, accounts for about 99% of the total ionospheric effect, known as the first-order ionospheric effect (Ion1). The remaining 1% residual range errors (RREs) in the IF observable are due to the higher-order, second and third, ionospheric effects, Ion2 and Ion3, respectively. Both terms are related to the electron content along the signal path; moreover, the Ion2 term is associated with the influence of the geomagnetic field on the ionospheric refractive index, and Ion3 with the ray bending effect of the ionosphere, which can cause significant deviation in the ray trajectory (due to strong electron density gradients in the ionosphere), such that the error contribution of Ion3 can exceed that of Ion2 (Kim and Tinin, 2007). The higher order error terms do not cancel out in the (first-order) ionospherically corrected observable and, as such, when not accounted for, they can degrade the accuracy of GNSS positioning, depending on the level of solar activity and geomagnetic and ionospheric conditions (Hoque and Jakowski, 2007). Simulation results from the early 1990s show that Ion2 and Ion3 would contribute to the ionospheric error budget by less than 1% of the Ion1 term at GPS frequencies (Datta-Barua et al., 2008).
Although the IF observable may provide sufficient accuracy for most GNSS applications, Ion2 and Ion3 need to be considered for higher-accuracy-demanding applications, especially at times of higher solar activity. This paper investigates the higher order ionospheric effects (Ion2 and Ion3, excluding the ray bending effects associated with Ion3) on GNSS positioning in the European region, considering the precise point positioning (PPP) method. For this purpose, observations from four European stations were considered. These observations were taken in four time intervals corresponding to various geophysical conditions: the active and quiet periods of the solar cycle, 2001 and 2006, respectively, excluding the effects of disturbances in the geomagnetic field (i.e. geomagnetic storms), as well as the years 2001 and 2003, this time including the impact of geomagnetic disturbances. The program RINEX_HO (Marques et al., 2011) was used to calculate the magnitudes of Ion2 and Ion3 on the range measurements, as well as the total electron content (TEC) observed on each receiver-satellite link. The program also corrects the GPS observation files for Ion2 and Ion3; thereafter it is possible to perform PPP with both the original and corrected GPS observation files to analyze the impact of the higher order ionospheric error terms on the estimated station coordinates, excluding the ray bending effect, which may become significant especially at low elevation angles (Ioannides and Strangeways, 2002).
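The first-order correction can be sketched directly: because Ion1 scales with 1/f², a dual-frequency linear combination cancels it exactly. The pseudorange and delay values below are invented for illustration:

```python
# Ionosphere-free (IF) pseudorange combination for GPS L1/L2.  The
# first-order delay on each frequency is I * (f_ref/f)^2, so the
# combination below removes it exactly (higher-order terms would remain).

F_L1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F_L2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def iono_free(p1, p2, f1=F_L1, f2=F_L2):
    """First-order ionosphere-free pseudorange combination (metres)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

rho = 21_000_000.0                       # ionosphere-free range, metres
ion1 = 5.0                               # first-order delay on L1, metres
p1 = rho + ion1                          # L1 pseudorange
p2 = rho + ion1 * (F_L1 / F_L2) ** 2     # L2 delay scales as 1/f^2
print(abs(iono_free(p1, p2) - rho) < 1e-3)  # -> True (Ion1 removed)
```

Ion2 and Ion3 do not obey the 1/f² law (they scale roughly as 1/f³ and 1/f⁴), which is why they survive this combination and must be modeled separately, e.g. by RINEX_HO.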

Relevance: 20.00%

Publisher:

Abstract:

Objective: To assess the viability of developing percentage body fat cutoffs based on blood pressure values in Brazilian adolescents. Methods: A cross-sectional study was conducted with a sample of 358 male subjects aged 8 to 18 years. Blood pressure was measured by the oscillometric method, and body composition was measured by dual-energy X-ray absorptiometry (DXA). Results: For the identification of elevated blood pressure, these nationally developed body fat cutoffs presented relative accuracy. The cutoffs were significantly associated with elevated blood pressure [odds ratio = 5.91 (95% confidence interval: 3.54-9.86)]. Conclusions: The development of national body fat cutoffs is viable, since the cutoffs showed high accuracy in indicating elevated blood pressure.
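For reference, an odds ratio and its Wald-type 95% confidence interval are computed from a 2×2 table as sketched below; the cell counts are invented, not the study's data:

```python
import math

# Odds ratio from a 2x2 table (exposed = above the body-fat cutoff,
# outcome = elevated blood pressure) with a Wald 95% CI on the log scale.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = outcome yes/no among exposed; c,d = outcome yes/no among unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 20, 30, 90)     # illustrative counts
print(round(or_, 1), round(lo, 2), round(hi, 2))
```

An interval that excludes 1, as in the study's (3.54-9.86), is what makes the association statistically significant.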

Relevance: 20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 20.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To determine the accuracy of the variables stair-climbing time (tTE), stair-climbing power (PTE), six-minute walk test (TC6) and forced expiratory volume in one second (VEF1), using maximal oxygen uptake (VO2max) as the gold standard. METHODS: The tests were performed on 51 patients. VEF1 was obtained by spirometry. The TC6 was performed in a flat 120 m corridor. The stair-climbing test was performed on a staircase of 6 flights, yielding tTE and PTE. VO2max was obtained by cardiopulmonary exercise testing using the Balke protocol. Pearson's linear correlation (r) and p-values between VO2max and the variables were calculated. For the accuracy calculation, cutoff points were obtained from the receiver operating characteristic (ROC) curve. The Kappa statistic (k) was used to assess agreement. RESULTS: The following accuracies were obtained: tTE, 86%; TC6, 80%; PTE, 71%; VEF1 (L), 67%; VEF1%, 63%. For tTE and TC6 combined in parallel, sensitivity was 93.5%; combined in series, specificity was 96.4%. CONCLUSION: tTE was the variable with the best accuracy. When combined, tTE and TC6 can reach specificity and sensitivity close to 100%. These tests should be used more routinely, especially when cardiopulmonary exercise testing for measuring VO2max is not available.
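The parallel and series figures follow from the standard rules for combining two diagnostic tests: a parallel combination is positive if either test is positive, a series combination only if both are. The per-test values below are illustrative, not the paper's:

```python
# Combining two independent diagnostic tests.  In parallel the combined
# test misses a case only if both tests miss it, so sensitivity rises;
# in series a false positive requires both tests to be falsely positive,
# so specificity rises.

def parallel_sensitivity(se1, se2):
    return 1 - (1 - se1) * (1 - se2)

def series_specificity(sp1, sp2):
    return 1 - (1 - sp1) * (1 - sp2)

print(round(parallel_sensitivity(0.80, 0.70), 2))  # -> 0.94
print(round(series_specificity(0.85, 0.80), 2))    # -> 0.97
```

The trade-off is symmetric: parallel combination sacrifices specificity, series combination sacrifices sensitivity, which is why the paper reports one figure for each mode.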

Relevance: 20.00%

Publisher:

Abstract:

Scintimammography using Tc-99m-sestamibi is a noninvasive and painless diagnostic imaging method that is used to detect breast cancer when mammography is inconclusive. Because of the advantages of labeling with Tc-99m-sestamibi and its high efficiency in detecting carcinomas, it is the most widespread agent for this purpose. Its accumulation in the tumor has multifactorial causes and does not depend on the presence of architectural distortion or local or diffuse density variation in the breast. The objective of this study was to evaluate the accuracy of scintimammography for detecting breast cancer. One hundred and fifty-seven patients presenting 158 palpable and non-palpable breast nodules were evaluated. Three patients were male and 154 were female, aged between 14 and 81 years. All patients underwent scintimammography, and each nodule was subjected to cytological or histological study, i.e., the gold standard for diagnosing cancer. One hundred and eleven malignant and 47 benign nodules were detected, with a predominance of ductal carcinomas (n=94) and fibroadenoma/fibrocystic condition (n=11/n=11), respectively. The mean size was 3.11 cm (0.7-10 cm) among the malignant nodules and 2.07 cm (0.5-10 cm) among the benign nodules. The sensitivity, specificity, positive predictive value, negative predictive value and accuracy were 89, 89, 95, 78 and 89%, respectively. Analysis of the histological types showed that the technique was more effective on more aggressive tumors, such as ductal carcinoma. In this study, Tc-99m-sestamibi scintimammography was shown to be an important tool for diagnosing breast cancer when mammography was inconclusive.
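The reported figures can be reproduced from a 2×2 table. The cell counts below are reconstructed from the reported totals (111 malignant, 47 benign) and the rounded percentages, so the exact table is an assumption:

```python
# Diagnostic metrics from a 2x2 confusion matrix.  With tp=99, fn=12,
# tn=42, fp=5 (a reconstruction consistent with 111 malignant and 47
# benign nodules), the study's 89/89/95/78/89% figures come out.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

m = diagnostic_metrics(tp=99, fp=5, fn=12, tn=42)
print({k: round(v, 2) for k, v in m.items()})
```

Note how the NPV (78%) lags the other metrics: with 111 of 158 nodules malignant, a negative scan is issued against high prior odds of disease, which drags the negative predictive value down.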