819 results for rule-based algorithms
Abstract:
Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most GIS-based hydrological models apply simple interpolation techniques to extrapolate the point-scale values recorded at weather stations to the watershed scale. In mountainous areas, where the monitoring network covers the complex terrain heterogeneity only poorly, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent software package. The interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007) and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event-duration or annual scales. It offers its own pre- and post-processing tools, including video visualization, map printing and the possibility of exporting the maps to image or ASCII ArcGIS formats. This study presents the software's user-friendly interface and shows some case studies with applications to hydrological modeling.
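The height corrections mentioned above lend themselves to a minimal sketch (a simplified, assumed workflow, not MeteoMap's actual implementation; the lapse rate, the station data and the inverse-distance weighting are illustrative): station temperatures are reduced to sea level with a constant lapse rate, interpolated, and then restored at the elevation of each DEM cell.

```python
import numpy as np

def idw(xy_stations, values, xy_grid, power=2.0):
    """Inverse-distance-weighted interpolation of station values onto grid points."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-6)                      # avoid division by zero at station locations
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

def interpolate_temperature(xy_st, z_st, t_st, xy_grid, z_grid, lapse_rate=-0.0065):
    """Elevation-corrected temperature map (lapse_rate in degC per metre, assumed value)."""
    t_sea = t_st - lapse_rate * z_st             # reduce station readings to sea level
    t_grid_sea = idw(xy_st, t_sea, xy_grid)      # interpolate the detrended field
    return t_grid_sea + lapse_rate * z_grid      # restore using the DEM elevation

# toy example: three stations, two grid cells
xy_st = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z_st = np.array([500.0, 1500.0, 2500.0])
t_st = np.array([15.0, 9.0, 2.0])
xy_grid = np.array([[5.0, 5.0], [8.0, 2.0]])
z_grid = np.array([1800.0, 1200.0])
print(interpolate_temperature(xy_st, z_st, t_st, xy_grid, z_grid))
```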
Abstract:
The determination of the term structure of interest rates is one of the central topics in the management of financial assets. Given the great importance of financial assets for the conduct of economic policies, it is fundamental to understand how this structure is determined. The main objective of this study is to estimate the term structure of Brazilian interest rates together with the short-term interest rate. The term structure is modeled on the basis of an affine model. The estimation includes three latent factors and two macroeconomic variables, using the Bayesian technique of Markov Chain Monte Carlo (MCMC).
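As a rough illustration of the affine structure (not the estimated Brazilian model; the loadings, factors and maturities below are invented placeholders), each yield is an affine function of the state vector, here three latent factors stacked with two macroeconomic variables; in the MCMC estimation this measurement equation defines the likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

maturities = np.array([3, 6, 12, 24, 48])        # months (illustrative)
n_factors = 5                                    # 3 latent factors + 2 macro variables

# a(tau) and b(tau) would come from no-arbitrage recursions; here they are placeholders
a = 0.08 * np.log1p(maturities / 12.0)
b = rng.normal(scale=0.1, size=(len(maturities), n_factors))

def yields(latent, macro, noise_sd=0.001):
    """Measurement equation y_t(tau) = a(tau) + b(tau)' x_t + e_t of an affine model."""
    x = np.concatenate([latent, macro])
    return a + b @ x + rng.normal(scale=noise_sd, size=len(maturities))

print(yields(latent=np.array([0.01, -0.02, 0.005]), macro=np.array([0.04, 0.03])))
```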
Abstract:
The number of research papers available today is growing at a staggering rate, generating a huge amount of information that people cannot keep up with. According to a tendency indicated by the United States' National Science Foundation, more than 10 million new papers will be published in the next 20 years. Because most of these papers will be available on the Web, this research focuses on exploring issues in recommending research papers to users, in order to lead users directly to papers of their interest. Recommender systems are used to recommend items to users among a huge stream of available items, according to the users' interests. This research focuses on the two most prevalent techniques to date, namely Content-Based Filtering and Collaborative Filtering. The first explores the text of the paper itself, recommending items similar in content to the ones the user has rated in the past. The second explores the citation web existing among papers. As these two techniques have complementary advantages, we explored hybrid approaches to recommending research papers. We created standalone and hybrid versions of the algorithms and evaluated them through both offline experiments on a database of 102,295 papers and an online experiment with 110 users. Our results show that the two techniques can be successfully combined to recommend papers, and that coverage increases to 100% with the hybrid algorithms. In addition, we found that different algorithms are more suitable for recommending different kinds of papers. Finally, we verified that users' research experience influences the way they perceive recommendations. In parallel, we found no significant differences in recommending papers to users from different countries. However, our results showed that users interacting with a research paper recommender system are much happier when the interface is presented in their native language, regardless of the language in which the papers are written. Therefore, the interface should be tailored to the user's mother tongue.
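A minimal sketch of how the two techniques can be combined (the toy corpus, citation matrix and 50/50 weighting are illustrative assumptions, not the algorithms evaluated in the study): content-based scores come from TF-IDF cosine similarity to papers the user liked, collaborative scores from co-citation overlap, and the hybrid score is a weighted sum of the two.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

papers = [
    "collaborative filtering for research paper recommendation",
    "content based filtering with tfidf representations",
    "hybrid recommender systems combining content and citations",
    "citation analysis of scientific literature",
]
# rows cite columns (toy citation web)
cites = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [0, 0, 0, 0]])
liked = [0]                                    # indices of papers the user rated positively

tfidf = TfidfVectorizer().fit_transform(papers)
content = cosine_similarity(tfidf)             # content-based similarity
cocite = cosine_similarity(cites)              # collaborative signal from co-citations

def hybrid_scores(liked, w=0.5):
    """Average similarity to liked papers, blending content and citation evidence."""
    s = w * content[liked].mean(axis=0) + (1 - w) * cocite[liked].mean(axis=0)
    s[liked] = -np.inf                         # do not recommend already-rated papers
    return s

print(hybrid_scores(liked).round(3))
```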
Abstract:
Consumption is an important macroeconomic aggregate, amounting to about 70% of GNP. Finding sub-optimal behavior in consumption decisions casts serious doubt on whether optimizing behavior is applicable on an economy-wide scale, which, in turn, challenges whether it is applicable at all. This paper makes several contributions to the literature on consumption optimality. First, we provide a new result on the basic rule-of-thumb regression, showing that it is observationally equivalent to the one obtained in a well-known optimizing real-business-cycle model. Second, for rule-of-thumb tests based on the Asset-Pricing Equation, we show that the omission of the higher-order term in the log-linear approximation yields inconsistent estimates when lagged observables are used as instruments; however, these are exactly the instruments that have traditionally been used in this literature. Third, we show that nonlinear estimation of a system of N Asset-Pricing Equations can be done efficiently even if the number of asset returns (N) is high vis-a-vis the number of time-series observations (T). We argue that efficiency can be restored by aggregating returns into a single measure that fully captures intertemporal substitution. Indeed, we show that there is no reason why return aggregation cannot be performed in the nonlinear setting of the Pricing Equation, since the latter is a linear function of individual returns. This forms the basis of a new test of rule-of-thumb behavior, which can be viewed as testing for the importance of rule-of-thumb consumers when the optimizing agent holds an equally weighted or otherwise weighted portfolio of traded assets. Using our setup, we find no signs of either rule-of-thumb behavior by U.S. consumers or habit formation in consumption decisions in econometric tests. Indeed, we show that the simple representative-agent model with CRRA utility is able to explain the time-series data on consumption and aggregate returns. There, the intertemporal discount factor is significant and ranges from 0.956 to 0.969, while the relative risk-aversion coefficient is precisely estimated, ranging from 0.829 to 1.126. There is no evidence of rejection in over-identifying-restriction tests.
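The Asset-Pricing (Euler) Equation under CRRA utility, E_t[beta (C_{t+1}/C_t)^{-gamma} R_{t+1}] = 1, can be sketched as a sample moment condition with lagged observables as instruments (the simulated data and the instrument choice below are illustrative, not the paper's estimation):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200
cons_growth = 1.005 + 0.01 * rng.standard_normal(T)   # gross consumption growth (simulated)
returns = 1.01 + 0.04 * rng.standard_normal(T)        # gross return on an aggregated portfolio (simulated)

def euler_moments(beta, gamma, instruments):
    """Sample GMM moments E[z_t (beta * g_{t+1}^{-gamma} * R_{t+1} - 1)] for each instrument z."""
    pricing_error = beta * cons_growth[1:] ** (-gamma) * returns[1:] - 1.0
    return np.array([np.mean(z[:-1] * pricing_error) for z in instruments])

# lagged observables as instruments: a constant, lagged consumption growth, lagged return
instruments = [np.ones(T), cons_growth, returns]
print(euler_moments(beta=0.96, gamma=1.0, instruments=instruments))
```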
Abstract:
The goal of this paper is to debate the degree of effectiveness of the rule of law in Brazil, through a survey measuring perceptions, attitudes and habits of Brazilians with regard to compliance with the law. The survey conducted in Brazil is based on the study conducted by Tom R. Tyler in the United States, entitled Why People Obey the Law (New Haven, CT: Yale University Press, 1990). The main argument of Tyler's study is that people obey the law when they believe it is legitimate, and not because they fear punishment. We test the same argument in Brazil, relying on five indicators: (i) behavior, which depicts the frequency with which respondents declared having engaged in conduct in disobedience of the law; (ii) instrumentality, measuring the perception of losses associated with violating the law, especially fear of punishment; (iii) morality, measuring the perception of how right or wrong it is to engage in certain conduct in violation of the law; (iv) social control, which measures the perception of social disapproval of certain types of behavior in violation of the law; and (v) legitimacy, which measures the perception of respect for the law and for certain authorities. Results indicate that fear of sanctions is not the strongest driver of compliance with the law; rather, more than legitimacy, the indicators of morality and social control are the strongest in explaining why people obey the law in Brazil.
Abstract:
We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We prove a formula for the computation of the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. We prove the almost sure convergence of these decomposition methods when the relatively complete recourse assumption holds. We also prove the almost sure convergence of these algorithms when applied to risk-averse multistage stochastic linear programs that do not satisfy the relatively complete recourse assumption. The analysis is first done assuming the underlying stochastic process is interstage independent and discrete, with a finite set of possible realizations at each stage. We then indicate two ways of extending the methods and convergence analysis to the case when the process is interstage dependent.
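The cuts have the familiar Benders form: from an optimal dual multiplier pi of the next-stage problem with constraints W y >= h - T x, one obtains theta >= pi'h - (pi'T) x. A minimal sketch of assembling such a cut from given duals (the data are illustrative; the risk-averse and nonlinear variants change how pi is obtained, not this assembly step):

```python
import numpy as np

def benders_cut(pi, h, T_mat):
    """Return (intercept, gradient) of the cut theta >= intercept + gradient . x,
    built from dual multipliers pi of the subproblem constraints W y >= h - T x."""
    intercept = pi @ h
    gradient = -T_mat.T @ pi
    return intercept, gradient

# toy data: 2 subproblem constraints, 3 first-stage variables
pi = np.array([0.7, 1.2])                      # optimal duals of the next-stage problem
h = np.array([4.0, 2.5])
T_mat = np.array([[1.0, 0.0, 2.0],
                  [0.5, 1.0, 0.0]])

intercept, gradient = benders_cut(pi, h, T_mat)
x_trial = np.array([1.0, 0.5, 0.0])
print(intercept + gradient @ x_trial)          # lower bound on the recourse value at x_trial
```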
Abstract:
We consider risk-averse convex stochastic programs expressed in terms of extended polyhedral risk measures. We derive computable confidence intervals on the optimal value of such stochastic programs using the Robust Stochastic Approximation and the Stochastic Mirror Descent (SMD) algorithms. When the objective functions are uniformly convex, we also propose a multistep extension of the Stochastic Mirror Descent algorithm and obtain confidence intervals on both the optimal values and optimal solutions. Numerical simulations show that our confidence intervals are much less conservative and are quicker to compute than previously obtained confidence intervals for SMD and that the multistep Stochastic Mirror Descent algorithm can obtain a good approximate solution much quicker than its nonmultistep counterpart. Our confidence intervals are also more reliable than asymptotic confidence intervals when the sample size is not much larger than the problem size.
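As a rough illustration of the algorithm behind these bounds (a generic, assumed setup, not the paper's risk-averse programs): Stochastic Mirror Descent over the probability simplex with the entropy mirror map takes multiplicative steps along stochastic subgradients and reports the averaged iterate, as in robust stochastic approximation.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_iter, step = 5, 2000, 0.05
A = rng.standard_normal((d, d))

def stochastic_grad(x):
    """Noisy subgradient of f(x) = 0.5 * ||A x||^2 (illustrative objective)."""
    return A.T @ (A @ x) + 0.1 * rng.standard_normal(d)

x = np.full(d, 1.0 / d)                 # start at the centre of the simplex
avg = np.zeros(d)
for _ in range(n_iter):
    g = stochastic_grad(x)
    x = x * np.exp(-step * g)           # entropic mirror (multiplicative) step
    x /= x.sum()                        # prox back onto the simplex
    avg += x
avg /= n_iter                           # averaged iterate reported as the solution
print(avg.round(3))
```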
Abstract:
This project describes an authentication technique that is shoulder-surfing resistant. Shoulder surfing is an attack in which an attacker gains access to private information by observing the user's interaction with a terminal, or by using recording tools to capture that interaction and later analyze the recorded data, with the objective of gaining unauthorized access to the target user's personal information. The technique described here relies on gestural analysis coupled with a secondary channel of authentication that uses button pressing. The thesis presents and evaluates multiple alternative algorithms for gesture analysis, and furthermore assesses the effectiveness of the technique.
Abstract:
Nowadays, fraud detection is important to avoid nontechnical energy losses. Various electric companies around the world have been faced with such losses, mainly from industrial and commercial consumers. This problem has traditionally been dealt with using artificial intelligence techniques, although their use can result in difficulties such as a high computational burden in the training phase and problems with parameter optimization. A recently developed pattern recognition technique called the optimum-path forest (OPF), however, has been shown to be superior to state-of-the-art artificial intelligence techniques. In this paper, we propose to use OPF for nontechnical loss detection, as well as to apply its learning and pruning algorithms to this purpose. Comparisons against neural networks and other techniques demonstrate the robustness of OPF for the automatic identification of commercial losses.
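A minimal sketch of the optimum-path forest classification rule (simplified: the training samples and their optimum path costs are assumed to be precomputed, whereas real OPF derives them from a minimum-spanning-tree step over the training graph): a test sample receives the label of the training node that minimizes the maximum of that node's path cost and its distance to the sample.

```python
import numpy as np

# toy trained forest: training samples, their labels, and their optimum path costs
train_x = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [3.2, 2.9]])
train_y = np.array([0, 0, 1, 1])
path_cost = np.array([0.0, 0.2, 0.0, 0.3])   # would come from the OPF training phase

def opf_classify(sample):
    """Assign the label of argmin_s max(cost(s), d(s, sample)) (the fmax path-cost rule)."""
    d = np.linalg.norm(train_x - sample, axis=1)
    scores = np.maximum(path_cost, d)
    return train_y[np.argmin(scores)]

print(opf_classify(np.array([0.1, 0.3])))    # -> 0 (e.g. regular consumption profile)
print(opf_classify(np.array([2.8, 3.1])))    # -> 1 (e.g. possible nontechnical loss)
```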
Abstract:
In the tool switches minimization problem, we seek a sequence in which to process a set of jobs so that the number of tool switches required is minimized. In this work, different variations of a heuristic based on partially ordered job sequences are implemented and evaluated. All variations adopt a depth-first strategy on the enumeration tree. The computational test results indicate that good results can be obtained by a variation that keeps the best three branches at each node of the enumeration tree and randomly chooses, among all active nodes, the next node to branch on when backtracking.
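Any such heuristic needs a subroutine that, given a complete job sequence, counts the tool switches; the sketch below uses the classic keep-tool-needed-soonest (KTNS) policy for that count, an assumed, standard choice since the abstract does not specify the evaluation routine.

```python
def count_switches(sequence, tools_of_job, magazine_size):
    """Number of tool switches for a job sequence under the KTNS policy."""
    magazine, switches = set(), 0
    for i, job in enumerate(sequence):
        needed = tools_of_job[job]
        missing = needed - magazine
        # evict tools needed furthest in the future until the missing ones fit
        while len(magazine) + len(missing) > magazine_size:
            def next_use(tool):
                for j in range(i + 1, len(sequence)):
                    if tool in tools_of_job[sequence[j]]:
                        return j
                return len(sequence)             # never needed again
            victim = max(magazine - needed, key=next_use)
            magazine.remove(victim)
        switches += len(missing)                 # each newly loaded tool counts as one switch
        magazine |= missing
    return switches

tools_of_job = {0: {1, 2}, 1: {2, 3}, 2: {1, 4}, 3: {3, 4}}
print(count_switches([0, 2, 1, 3], tools_of_job, magazine_size=3))
```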
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Petroleum well drilling is an expensive and risky operation. In this context, well design is a fundamental key to decreasing the costs and risks involved. The experience acquired by engineers is notably an important factor in the elaboration of good drilling designs; therefore, the loss of this knowledge may entail additional problems and costs. This work is thus an initiative to model a case-based architecture for petroleum well design. Tests with a prototype showed that a system built with this architecture may help in well design and enable the preservation of corporate knowledge. (C) 2003 Elsevier B.V. All rights reserved.
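A minimal sketch of the case-retrieval step such an architecture relies on (the attributes, weights and cases are illustrative assumptions; the abstract does not describe the actual case representation): a new well design problem is matched against past cases by a weighted, range-normalized similarity, and the closest case is reused as a starting point.

```python
import numpy as np

# past well design cases: [total depth (m), water depth (m), pore pressure gradient (sg)]
cases = np.array([
    [3200.0,  800.0, 1.10],
    [4500.0, 1800.0, 1.25],
    [2800.0,  150.0, 1.05],
])
case_ids = ["well-A", "well-B", "well-C"]
weights = np.array([0.5, 0.3, 0.2])              # attribute importance (assumed)

def retrieve(query):
    """Return the id of the most similar past case (weighted, range-normalized distance)."""
    span = cases.max(axis=0) - cases.min(axis=0)
    dist = (weights * np.abs(cases - query) / span).sum(axis=1)
    return case_ids[int(np.argmin(dist))]

print(retrieve(np.array([4300.0, 1700.0, 1.22])))   # -> "well-B"
```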
Abstract:
The evaluation of free carrier concentration based on Drude's theory can be performed using optical transmittance in the range 800-2000 nm (near infrared) for Sb-doped SnO2 thin films. In this article, we estimate the free carrier concentration for these films, which are deposited via sol-gel dip-coating. At approximately 900 nm, there is a separation among the transmittance curves of doped and undoped samples. The plasma resonance approach leads to a free carrier concentration of about 5 × 10^20 cm^-3. Increasing the Sb concentration increases the film conductivity; however, the magnitude of the measured resistivity is still very high. The only way to combine such a high free carrier concentration with a rather low conductivity is to have a very low mobility. This becomes possible when the crystallite dimensions are taken into account. We obtain grains with an average size of 5 nm by estimating the grain size from line broadening in the X-ray diffraction pattern. The low conductivity is due to very intense scattering at the grain boundaries created by the presence of a large number of nanoscopic crystallites. This result is in accordance with X-ray photoemission spectroscopy data, which point to Sb incorporation proportional to the free electron concentration evaluated according to Drude's model. (c) 2006 Elsevier Ltd. All rights reserved.
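The two estimates described above follow textbook relations: the Drude plasma resonance, N = 4 pi^2 c^2 eps0 eps_inf m* / (e^2 lambda_p^2), and the Scherrer equation, D = K lambda / (beta cos theta). A sketch with assumed SnO2 parameters (eps_inf and m* are not given in the abstract, so the Drude output is only indicative):

```python
import numpy as np

e = 1.602e-19        # elementary charge, C
eps0 = 8.854e-12     # vacuum permittivity, F/m
me = 9.109e-31       # electron mass, kg
c = 2.998e8          # speed of light, m/s

def drude_carrier_density(lambda_p, eps_inf, m_eff):
    """Free carrier density (m^-3) from the plasma resonance wavelength (Drude model)."""
    return 4 * np.pi**2 * c**2 * eps0 * eps_inf * m_eff * me / (e**2 * lambda_p**2)

def scherrer_size(wavelength, fwhm_rad, theta_rad, K=0.9):
    """Crystallite size from X-ray line broadening (Scherrer equation)."""
    return K * wavelength / (fwhm_rad * np.cos(theta_rad))

# assumed eps_inf and m*; the abstract reports ~5e20 cm^-3, this gives the same order of magnitude
N = drude_carrier_density(lambda_p=900e-9, eps_inf=4.0, m_eff=0.3)
print(f"carrier density ~ {N / 1e6:.1e} cm^-3")

# illustrative Cu K-alpha wavelength and peak width giving a ~5 nm crystallite
D = scherrer_size(wavelength=1.5406e-10, fwhm_rad=np.deg2rad(1.7), theta_rad=np.deg2rad(13.3))
print(f"grain size ~ {D * 1e9:.1f} nm")
```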
Abstract:
To enhance the global search ability of population-based incremental learning (PBIL) methods, it is proposed that multiple probability vectors be included in existing PBIL algorithms. The strategy for updating those probability vectors, as well as the negative learning and mutation operators, are redefined accordingly. Moreover, to strike the best tradeoff between exploration and exploitation, an adaptive updating strategy for the learning rate is designed. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm.
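A minimal sketch of PBIL with multiple probability vectors (the objective, parameter values and the adaptive learning-rate schedule are illustrative assumptions, not the paper's exact operators):

```python
import numpy as np

rng = np.random.default_rng(3)

def onemax(bits):
    return bits.sum(axis=1)                      # toy objective to maximize

n_bits, n_vectors, pop_per_vec, n_gens = 20, 3, 10, 60
lr, mut_p, mut_shift = 0.1, 0.02, 0.05

P = np.full((n_vectors, n_bits), 0.5)            # several probability vectors
for gen in range(n_gens):
    for k in range(n_vectors):
        pop = (rng.random((pop_per_vec, n_bits)) < P[k]).astype(int)
        fit = onemax(pop)
        best, worst = pop[fit.argmax()], pop[fit.argmin()]
        P[k] = (1 - lr) * P[k] + lr * best                       # positive learning toward the best sample
        mask = best != worst
        P[k, mask] = (1 - lr) * P[k, mask] + lr * best[mask]     # negative learning where best and worst disagree
        mutate = rng.random(n_bits) < mut_p
        P[k, mutate] = (1 - mut_shift) * P[k, mutate] + mut_shift * rng.integers(0, 2, mutate.sum())
    lr = 0.05 + 0.1 * gen / n_gens               # simple adaptive learning-rate schedule (assumed)

print(P.round(2))
```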