967 results for Classification models
Abstract:
This thesis aims at the design and evaluation of a wireless, real-time system for counting and classifying motor vehicles. It is also intended as an alternative to current equipment, which is highly intrusive on roadways. The thesis includes a study of wireless communications suitable for a network of road sensor devices, a study of the use of the magnetic field as the physical medium for detecting and counting vehicles, and a study of the energy autonomy of the in-road devices, relying, among other sources, on solar energy. The project carried out within this thesis includes, among other components, the real-time digitization of the magnetic signature that a passing vehicle leaves in the Earth's magnetic field, its transmission to a server via radio and a WAN (Wide Area Network), and the development of software based on the ZigBee protocol stack. Applications were developed for the sensor device, the coordinator, the control panel, and the interface library of a future application server. The software developed for the sensor device implements detection and digitization cycles with low-power sleep pauses, activating the radio only during the transmission phase, thus ensuring an energy-saving strategy. The results confirm the viability of this technology for detecting and counting vehicles, as well as for capturing signatures using magnetoresistive sensors. They also made it possible to verify the range of the wireless communications with the sensor device embedded in the asphalt, and to confirm the model for sizing the solar panel surface as well as the energy consumption model of the sensor device.
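A minimal sketch of the energy-saving detect, digitize, transmit, sleep cycle described in the abstract, assuming a simple threshold detector; the hardware functions below are simulated stand-ins, not the thesis's firmware:

```python
import random
import time

# Illustrative stand-ins for the node's hardware drivers; on the real device
# these would wrap the magnetoresistive sensor and the ZigBee radio.
def read_magnetometer():
    return 512 + random.randint(-3, 3)   # simulated Earth-field baseline with noise

def radio_on(): pass
def radio_off(): pass
def send_packet(samples): print(f"sent signature with {len(samples)} samples")

DETECTION_THRESHOLD = 50   # deviation from the baseline that flags a vehicle
SAMPLE_PERIOD_S = 0.002    # digitization rate while a vehicle is over the node
SLEEP_PERIOD_S = 0.050     # low-power pause between detection attempts

def sensor_cycle(baseline, cycles=20):
    for _ in range(cycles):
        sample = read_magnetometer()                    # single low-cost reading
        if abs(sample - baseline) > DETECTION_THRESHOLD:
            signature = []
            while abs(sample - baseline) > DETECTION_THRESHOLD:
                signature.append(sample)                # digitize the magnetic signature
                time.sleep(SAMPLE_PERIOD_S)
                sample = read_magnetometer()
            radio_on()                                  # radio powered only while sending
            send_packet(signature)
            radio_off()
        time.sleep(SLEEP_PERIOD_S)                      # stands in for the deep-sleep pause

sensor_cycle(baseline=512)
```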
Abstract:
INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important issues to achieve accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to automatically identify the underlying cause of death. OBJECTIVE: This work was conceived to compare the underlying causes of death processed respectively by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation of the underlying causes of death processed by the ACME and SCB systems was performed using the input data file for the ACME system, which included deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between the underlying causes selected by the ACME and SCB systems found in the month of June, when considered as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: The processing of the underlying causes of death by the ACME and SCB systems resulted in 3,278 differences, which were analysed and ascribed to a lack of answers to dialogue boxes during processing, to deaths due to human immunodeficiency virus [HIV] disease, for which there was no specific provision in either system, to coding and/or keying errors, and to actual problems. The detailed analysis of the latter disclosed that the majority of the underlying causes of death processed by the SCB system were correct, that each system gave a different interpretation to some of the mortality coding rules, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were identified as SCB errors. CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, warrant the use of the version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
Abstract:
We present new population growth models: generalized logistic models that are proportional to beta densities with shape parameters p and 2, where p > 1, with Malthusian parameter r. The complex dynamical behaviour of these models is investigated in the parameter space (r, p), in terms of topological entropy, using explicit methods, as the Malthusian parameter r increases. This parameter space is split into different regions according to the chaotic behaviour of the models.
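One plausible reading of "generalized logistic models proportional to Beta(p, 2) densities" is the family of interval maps below; the normalisation actually used by the authors may differ:

```latex
% Assumed form of the growth map: proportional to a Beta(p,2) density
% x^{p-1}(1-x), scaled by the Malthusian parameter r.
f_{r,p}(x) = r\, x^{\,p-1} (1 - x), \qquad x \in [0,1], \; p > 1, \; r > 0.
% For p = 2 this reduces to the classical logistic map f_r(x) = r\, x (1 - x).
```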
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −∞ and whose expected return tends to +∞, i.e., (risk = −∞, return = +∞). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to some possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is obtained by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is of interest in itself, even if there are short-selling restrictions.
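For reference, the two risk measures named above have standard textbook definitions, stated here for a loss variable L at level α ∈ (0, 1); sign and level conventions vary across the literature, so the paper's exact convention may differ:

```latex
% Standard definitions for a loss variable L; CVaR is the average of VaR
% over the tail levels (equivalently, the expected loss beyond VaR for
% continuous distributions).
\mathrm{VaR}_{\alpha}(L) = \inf\{\, \ell \in \mathbb{R} : \mathbb{P}(L \le \ell) \ge \alpha \,\},
\qquad
\mathrm{CVaR}_{\alpha}(L) = \frac{1}{1-\alpha}\int_{\alpha}^{1} \mathrm{VaR}_{u}(L)\, du .
```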
Abstract:
The aim of this paper is to analyze the forecasting ability of the CARR model proposed by Chou (2005) using the S&P 500. We extend the data sample, allowing for the analysis of different stock market circumstances, and propose the use of various range estimators in order to analyze their forecasting performance. Our results show that there are two range-based models that outperform the forecasting ability of the GARCH model. The Parkinson model is better for upward trends and for volatilities both higher and lower than the mean, while the CARR model is better for downward trends and mean volatilities.
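The Parkinson estimator mentioned above has a standard closed form based on the daily high/low range. A short Python sketch (with made-up prices, not the paper's S&P 500 data) compares it with the classical close-to-close estimator:

```python
import numpy as np

def parkinson_volatility(high, low):
    """Parkinson (1980) range-based estimator:
    sigma^2 = mean( ln(H/L)^2 ) / (4 ln 2), from daily highs and lows."""
    log_range = np.log(np.asarray(high) / np.asarray(low))
    return np.sqrt(np.mean(log_range ** 2) / (4.0 * np.log(2.0)))

def close_to_close_volatility(close):
    """Classical estimator based on squared log returns, for comparison."""
    returns = np.diff(np.log(np.asarray(close)))
    return np.std(returns, ddof=1)

# Illustrative numbers only.
high  = [101.2, 102.5, 101.9, 103.1]
low   = [ 99.8, 100.9, 100.4, 101.6]
close = [100.5, 102.0, 101.1, 102.8]
print(parkinson_volatility(high, low), close_to_close_volatility(close))
```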
Abstract:
We consider a simple model consisting of particles with four bonding sites ("patches"), two of type A and two of type B, on the square lattice, and investigate its global phase behavior by simulations and theory. We set the interaction between B patches to zero and calculate the phase diagram as the ratio between the AB and the AA interactions, ε*_AB, varies. In line with previous work on three-dimensional off-lattice models, we show that the liquid-vapor phase diagram exhibits a re-entrant or "pinched" shape for the same range of ε*_AB, suggesting that the ratio of the energy scales - and the corresponding empty fluid regime - is independent of the dimensionality of the system and of the lattice structure. In addition, the model exhibits an order-disorder transition that is ferromagnetic in the re-entrant regime. The use of low-dimensional lattice models allows the simulation of sufficiently large systems to establish the nature of the liquid-vapor critical points and to describe the structure of the liquid phase in the empty fluid regime, where the size of the "voids" increases as the temperature decreases. We have found that the liquid-vapor critical point is in the 2D Ising universality class, with a scaling region that decreases rapidly as the temperature decreases. The results of simulations and theoretical analysis suggest that the line of order-disorder transitions intersects the condensation line at a multi-critical point at zero temperature and density, for patchy particle models with a re-entrant, empty fluid, regime. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3657406]
Abstract:
We study the implications for two-Higgs-doublet models of the recent announcement at the LHC giving a tantalizing hint for a Higgs boson of mass 125 GeV decaying into two photons. We require that the experimental result be within a factor of 2 of the theoretical standard model prediction, and analyze the type I and type II models as well as the lepton-specific and flipped models, subject to this requirement. It is assumed that there is no new physics other than two Higgs doublets. In all of the models, we display the allowed region of parameter space taking the recent LHC announcement at face value, and we analyze the W⁺W⁻, ZZ, b̄b, and τ⁺τ⁻ expectations in these allowed regions. Throughout the entire range of parameter space allowed by the γγ constraint, the numbers of events for Higgs decays into WW, ZZ, and bb̄ are not changed from the standard model by more than a factor of 2. In contrast, in the lepton-specific model, decays to τ⁺τ⁻ are very sensitive across the entire γγ-allowed region.
Abstract:
The idiomatic expression “In Rome be a Roman” can be applied to leadership training and development as well. Leaders who can act as role models inspire other future leaders in their behaviour, attitudes and ways of thinking. Based on two examples of current leaders in the fields of Politics and Public Administration, I support the idea that exposure to role models during their training was decisive for their career paths and current activities as prominent figures in their profession. Issues such as how students should be prepared for community or national leadership, as well as for cross-cultural engagement, are raised here. The hypothesis of transculturalism and cross-cultural commitment as a factor of leadership is presented. Based on the current literature on Leadership, as well as on the presented case studies, I expect to raise a debate focusing on strategies for improving leaders' training in cross-cultural awareness.
Abstract:
The attached document is the post-print version (the version corrected by the publisher).
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, those types of decisions have to be supported by a robust price forecasting methodology. This paper reports a different approach to long-term price forecasting that tries to answer that need. Making use of regression models, the proposed methodology's main objective is to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem's complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results are compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
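A minimal sketch of tuning regression parameters with PSO, assuming a plain linear regression, a squared-error cost and synthetic MCP data; the paper's regression form, parameter bounds and PSO variant are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_fit(X, y, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Fit linear-regression coefficients by minimising squared error with a
    basic Particle Swarm Optimiser (illustrative only)."""
    dim = X.shape[1]
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    cost = lambda p: np.mean((X @ p - y) ** 2)
    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Toy example: explain MCP by two hypothetical drivers (demand, fuel price).
X = np.column_stack([np.ones(100), rng.uniform(20, 40, 100), rng.uniform(5, 15, 100)])
mcp = X @ np.array([10.0, 1.2, 0.8]) + rng.normal(0, 2, 100)
print(pso_fit(X, mcp))
```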
Abstract:
This paper presents an integrated system for vehicle classification. The system aims to classify vehicles using different approaches: 1) based on the height of the first axle and the number of axles; 2) based on volumetric measurements; and 3) based on features extracted from the captured image of the vehicle. The system uses a laser sensor for the measurements and a set of image analysis algorithms to compute some visual features. By combining different classification methods, it is shown that the system improves its accuracy and robustness, enabling its use in more difficult environments while satisfying the requirements established by the Portuguese motorway contractor BRISA.
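A toy illustration of combining the three approaches by weighted voting; the class labels, thresholds, weights and function names below are hypothetical and do not describe the deployed system:

```python
from collections import Counter

# Each hypothetical sub-classifier maps its own measurements to a coarse class.
def axle_based(first_axle_height_m, n_axles):
    return "heavy" if n_axles > 2 or first_axle_height_m > 1.1 else "light"

def volume_based(length_m, height_m):
    return "heavy" if length_m * height_m > 20.0 else "light"

def image_based(visual_features):
    return "heavy" if visual_features.get("cab_detected") else "light"

def combined_class(m, weights=(0.4, 0.3, 0.3)):
    """Combine the three decisions by weighted voting."""
    votes = [
        (axle_based(m["first_axle_height_m"], m["n_axles"]), weights[0]),
        (volume_based(m["length_m"], m["height_m"]), weights[1]),
        (image_based(m["visual"]), weights[2]),
    ]
    score = Counter()
    for label, w in votes:
        score[label] += w
    return score.most_common(1)[0][0]

vehicle = {"first_axle_height_m": 1.3, "n_axles": 3, "length_m": 12.0,
           "height_m": 3.5, "visual": {"cab_detected": True}}
print(combined_class(vehicle))   # -> "heavy"
```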
Abstract:
In music genre classification, most approaches rely on statistical characteristics of low-level features computed on short audio frames. In these methods, it is implicitly considered that frames carry equally relevant information loads and that either individual frames, or distributions thereof, somehow capture the specificities of each genre. In this paper we study the representation space defined by short-term audio features with respect to class boundaries, and compare different processing techniques to partition this space. These partitions are evaluated in terms of accuracy on two genre classification tasks, with several types of classifiers. Experiments show that a randomized and unsupervised partition of the space, used in conjunction with a Markov model classifier, leads to accuracies comparable to the state of the art. We also show that unsupervised partitions of the space tend to create fewer hubs.
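A compact sketch of such a pipeline, assuming a random-codebook partition (frames mapped to their nearest randomly drawn centroid), per-genre first-order Markov models over the resulting symbol sequences, and classification by likelihood; the features and data are synthetic, and the paper's actual partitioning strategies and classifier details are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_partition(frames, k):
    """Randomised, unsupervised partition: k frames drawn at random act as
    centroids; every frame is mapped to the index of its nearest centroid."""
    centroids = frames[rng.choice(len(frames), size=k, replace=False)]
    return centroids

def to_symbols(frames, centroids):
    d = ((frames[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

def markov_model(symbols, k, eps=1.0):
    """First-order transition matrix with additive smoothing."""
    counts = np.full((k, k), eps)
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(symbols, transition):
    return sum(np.log(transition[a, b]) for a, b in zip(symbols[:-1], symbols[1:]))

# Toy data: 2-D "audio features" for two genres (illustrative only).
k = 8
genre_a = rng.normal(0.0, 1.0, (500, 2))
genre_b = rng.normal(1.5, 1.0, (500, 2))
centroids = random_partition(np.vstack([genre_a, genre_b]), k)
models = {g: markov_model(to_symbols(x, centroids), k)
          for g, x in [("a", genre_a), ("b", genre_b)]}

test = rng.normal(1.5, 1.0, (100, 2))   # unseen track drawn from genre "b"
print(max(models, key=lambda g: log_likelihood(to_symbols(test, centroids), models[g])))
```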
Abstract:
This paper describes a methodology developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample of databases resulting from a monitoring campaign, Data Mining (DM) techniques are used to discover a set of typical load profiles of MV consumers and, therefore, to extract knowledge regarding electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance was compared using adequacy measures. In the second stage, a classification model was developed to allow new consumers to be classified into one of the clusters obtained in the previous stage. Finally, the interpretation of the discovered knowledge is presented and discussed.
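A minimal two-stage sketch of this kind of pipeline using scikit-learn, with synthetic daily load profiles standing in for the monitoring-campaign data; the clustering algorithms, adequacy measures and classifier actually used in the paper are not specified here:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)

# Synthetic 24-hour load profiles: a smooth daily curve and a flat daytime block.
smooth = 1.0 + 0.8 * np.sin(np.linspace(0, 2 * np.pi, 24))
daytime = np.where((np.arange(24) >= 8) & (np.arange(24) < 18), 2.5, 0.5)
profiles = np.vstack([smooth + rng.normal(0, 0.1, (50, 24)),
                      daytime + rng.normal(0, 0.1, (50, 24))])

# Stage 1: hierarchical (agglomerative) clustering into typical load profiles.
clusters = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(profiles)

# Stage 2: a classification model that assigns a new consumer to a cluster.
clf = DecisionTreeClassifier(max_depth=3).fit(profiles, clusters)

new_consumer = daytime + rng.normal(0, 0.1, 24)
print("assigned cluster:", clf.predict(new_consumer.reshape(1, -1))[0])
```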
Abstract:
The growing importance and influence of new resources connected to power systems has caused many changes in their operation. Environmental policies and several well-known advantages have made renewable-based energy resources widely disseminated. These resources, including Distributed Generation (DG), are being connected at lower voltage levels, where Demand Response (DR) must also be considered. These changes increase the complexity of system operation, due both to new operational constraints and to the amounts of data to be processed. Virtual Power Players (VPP) are entities able to manage these resources. Addressing these issues, this paper proposes a methodology to support VPP actions when they act as a Curtailment Service Provider (CSP) that provides DR capacity to a DR program declared by the Independent System Operator (ISO) or by the VPP itself. The amount of DR capacity that the CSP can assure is determined using data mining techniques applied to a database obtained for a large set of operation scenarios. The paper includes a case study based on 27,000 scenarios considering a diversity of distributed resources in a 33-bus distribution network.
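A much simplified stand-in for the data-mining step, assuming the assured DR capacity is taken as a conservative quantile of the capacity available across scenarios matching the current operating conditions; the scenario data and the 5% risk level are illustrative, not the paper's 33-bus case study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic scenario database: for each scenario, the hour of day and the
# amount of curtailable (DR) load available in MW.
n_scenarios = 27_000
hour = rng.integers(0, 24, n_scenarios)
dr_available = 2.0 + 1.5 * np.sin(np.pi * hour / 24) + rng.normal(0, 0.4, n_scenarios)
dr_available = np.clip(dr_available, 0.0, None)

def assured_capacity(target_hour, risk=0.05):
    """DR capacity exceeded in (1 - risk) of the scenarios for that hour."""
    return np.quantile(dr_available[hour == target_hour], risk)

print(f"capacity the CSP can assure at 19h: {assured_capacity(19):.2f} MW")
```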
Abstract:
Master's degree in Radiation Applied to Health Technologies - Area of specialization: Digital X-Ray Imaging.