833 results for Robustness
Abstract:
This paper presents three new methods for color detection and segmentation of road signs. The images are taken by a digital camera mounted on a car. The RGB images are converted into the IHLS color space, and new methods are applied to extract the colors of the road signs under consideration. The methods were tested on hundreds of outdoor images under different lighting conditions and show high robustness. This project is part of the research taking place at Dalarna University, Sweden, in the field of ITS.
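The RGB-to-IHLS conversion the abstract mentions can be sketched as follows. This is a minimal version of the Hanbury-Serra IHLS transform, written here as an illustration; the luminance weights and the hue formula are standard textbook choices, not values taken from the paper.

```python
import numpy as np

def rgb_to_ihls(rgb):
    """Convert an RGB image (floats in [0, 1]) to IHLS hue, luminance, saturation.

    A minimal sketch of the improved HLS transform; coefficients are
    standard Rec. 709-style assumptions, not the paper's own.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Luminance: a weighted sum of the channels.
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Saturation: max minus min of the channels, independent of luminance.
    sat = rgb.max(axis=-1) - rgb.min(axis=-1)
    # Hue: angle of the chrominance vector, in radians.
    cx = r - 0.5 * g - 0.5 * b
    cy = (np.sqrt(3.0) / 2.0) * (g - b)
    hue = np.arctan2(cy, cx)
    return hue, lum, sat
```

Because saturation here does not depend on luminance, thresholding on hue and saturation tends to be more stable under the varying outdoor lighting the abstract refers to.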
Abstract:
The ever-increasing spurt in digital crimes such as image manipulation, image tampering, signature forgery, image forgery, and illegal transactions has intensified the demand to combat these forms of criminal activity. In this direction, biometrics, the computer-based validation of a person's identity, is becoming more and more essential, particularly for high-security systems. The essence of biometrics is the measurement of a person's physiological or behavioral characteristics, which enables authentication of that person's identity. Biometric-based authentication is also becoming increasingly important in computer-based applications because the amount of sensitive data stored in such systems is growing. The new demands on biometric systems are robustness, high recognition rates, the capability to handle imprecision and uncertainties of a non-statistical kind, and great flexibility. It is exactly here that soft computing techniques come into play. The main aim of this write-up is to present a pragmatic view of applications of soft computing techniques in biometrics and to analyze their impact. It is found that soft computing has already made inroads, through individual methods or in combination. Applications of varieties of neural networks top the list, followed by fuzzy logic and evolutionary algorithms. In a nutshell, soft computing paradigms are used for biometric tasks such as feature extraction, dimensionality reduction, pattern identification, pattern mapping, and the like.
Abstract:
The demands on image processing systems are robustness, high recognition rates, the capability to handle incomplete digital information, and great flexibility in capturing the shape of an object in an image. It is exactly here that convex hulls come into play. The objective of this paper is twofold. First, we summarize the state of the art in computational convex hull development for researchers interested in using convex hulls in image processing to build their intuition or generate nontrivial models. Second, we present several applications involving convex hulls in image processing tasks. By this, we have striven to show researchers the rich and varied set of applications they can contribute to. This paper also makes a humble effort to enthuse prospective researchers in this area. We hope that the resulting awareness will lead to new advances in specific image recognition applications.
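A standard way to compute the hull of a 2-D point set (for instance, the pixel coordinates of an object mask) is Andrew's monotone chain algorithm, shown below as a self-contained illustration; it is not the paper's own code.

```python
def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order.

    O(n log n) after sorting; an illustrative implementation, not taken
    from the surveyed literature.
    """
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Each chain ends where the other begins, so drop the duplicates.
    return lower[:-1] + upper[:-1]
```

For an object mask, the ratio of object area to hull area (solidity) is one simple shape descriptor this construction enables.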
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
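As a concrete illustration of the classical non-parametric end of the spectrum the survey describes, a Brown-Forsythe-style variance-heterogeneity test can be run per locus: phenotype values are replaced by absolute deviations from their genotype-group medians, which are then compared with a one-way ANOVA. This sketch uses SciPy and is an assumption about the family of tests, not code from any of the surveyed methods.

```python
import numpy as np
from scipy import stats

def brown_forsythe(*groups):
    """Brown-Forsythe test for unequal phenotype variance across groups.

    An illustrative vQTL-style screen: one group per genotype class at a
    locus; a small p-value suggests the locus controls variability.
    """
    # Absolute deviations from each group's median are robust to outliers.
    z = [np.abs(np.asarray(g) - np.median(g)) for g in groups]
    # A one-way ANOVA on the deviations is exactly the Brown-Forsythe statistic.
    stat, pval = stats.f_oneway(*z)
    return stat, pval
```

This is equivalent to `scipy.stats.levene(..., center='median')`, so either form can be used in practice.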
Abstract:
Developing successful navigation and mapping strategies is an essential part of autonomous robot research. However, hardware limitations often make for inaccurate systems. This project serves to investigate efficient alternatives to mapping an environment, by first creating a mobile robot, and then applying machine learning to the robot and controlling systems to increase the robustness of the robot system. My mapping system consists of a semi-autonomous robot drone in communication with a stationary Linux computer system. There are learning systems running on both the robot and the more powerful Linux system. The first stage of this project was devoted to designing and building an inexpensive robot. Utilizing my prior experience from independent studies in robotics, I designed a small mobile robot that was well suited for simple navigation and mapping research. When the major components of the robot base were designed, I began to implement my design. This involved physically constructing the base of the robot, as well as researching and acquiring components such as sensors. Implementing the more complex sensors became a time-consuming task, involving much research and assistance from a variety of sources. A concurrent stage of the project involved researching and experimenting with different types of machine learning systems. I finally settled on using neural networks as the machine learning system to incorporate into my project. Neural nets can be thought of as a structure of interconnected nodes, through which information filters. The type of neural net that I chose to use is a type that requires a known set of data that serves to train the net to produce the desired output. Neural nets are particularly well suited for use with robotic systems as they can handle cases that lie at the extreme edges of the training set, such as may be produced by "noisy" sensor data. 
Through experimenting with available neural net code, I became familiar with the code and its function, and modified it to be more generic and reusable for multiple applications of neural nets.
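A supervised network of the kind described, trained on a known input-output set by backpropagation, can be sketched as below. The data, architecture, and learning rate are all illustrative assumptions; XOR stands in for a small non-linear sensor-to-response mapping with known desired outputs.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical known training set (inputs and desired outputs).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# One hidden layer of four sigmoid units, with bias terms.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass: information filters through the interconnected nodes.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    losses.append(float((err ** 2).mean()))
    # Backpropagation of the squared-error gradient through both layers.
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= h.T @ d_out
    b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h
    b1 -= d_h.sum(axis=0)
```

The same training loop applies unchanged to noisy sensor data; only the training set and layer sizes would differ.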
Abstract:
The US term structure of interest rates plays a central role in fixed-income analysis. For example, estimating the US term structure accurately is a crucial step for anyone analyzing Brazilian Brady bonds such as IDUs, DCBs, FLIRBs, EIs, etc. In this work we present a statistical model to estimate the US term structure of interest rates. We address all the major issues that guided us in implementing the model, concentrating on important practical matters such as computational efficiency, the robustness of the final implementation, and the statistical properties of the final model. Numerical examples are provided to illustrate the use of the model on a daily basis.
Abstract:
We give necessary and sufficient conditions for the existence of a symmetric equilibrium without ties in interdependent-values auctions with multidimensional independent types and no monotonicity assumptions. In this setting, non-monotonic equilibria can arise. When the necessary and sufficient conditions are not satisfied, ties occur with positive probability. In that case, we are still able to prove the existence of a pure-strategy equilibrium under an all-pay-auction tie-breaking rule. As a direct implication of these results, we obtain a generalization of the Revenue Equivalence Theorem. Given the robustness of equilibrium existence for all-pay auctions in multidimensional settings, our results can also provide a new justification for the use of tournaments in practice.
Abstract:
This paper measures the degree of segmentation in the Brazilian labor market. Controlling for observable and unobservable characteristics, workers earn more in the formal sector, which supports the segmentation hypothesis. We break down the degree of segmentation by socio-economic attributes to identify the groups where this phenomenon is more prevalent. We investigate the robustness of our findings to the inclusion of self-employed individuals, and apply a two-stage panel probit model with a self-selection correction strategy to investigate a potential weakness of the fixed-effects estimator.
Abstract:
This article analyzed the relationship between public saving and economic growth. First, a theoretical-descriptive analysis of this relationship showed that public saving is a more complete indicator of fiscal sustainability than the primary surplus and tends to have more positive effects on output than the operational surplus. The estimated equations and robustness tests of the subsequent econometric analysis, which used multiple regression models for a panel of 38 nations, confirmed, at high confidence levels, the hypothesis of a positive relationship between public saving rates and per capita economic growth, indicating the direction of causality between the two, and also provided interesting and consistent results on how growth is associated with other variables. The central conclusion was that a one-unit increase in the public saving rate should lead, on average, to a 0.17-unit rise in the per capita economic growth rate.
Abstract:
The main objective of the present work is to identify and analyze the impact of ITIL practices on the organizational flexibility of a multinational IT company; the study is quali-quantitative and exploratory in nature. To achieve this objective, theoretical studies on bureaucracy, organizational flexibility, control, IT governance, and ITIL were carried out as a way to better understand the research problem. For the analysis, a set of eleven ITIL processes was considered: service desk, incident management, problem management, change management, configuration management, release management, service level management, availability management, capacity management, continuity management, and IT financial services management, grouped into the two core areas of service support and service delivery. A scale was then constructed and validated, based on theoretical models developed by Volberda (1997), Tenório (2002), and Golden and Powell (1999), to measure the flexibility related to each process in the ITIL core. The dimensions adopted to measure flexibility were: organization design task, managerial task, IT impact on the work force, HR management, efficiency impact, sensitivity, versatility, and robustness. The research instrument was a semi-structured interview divided into two parts. Data were collected from ten interviewees at an IT multinational, selected by convenience; some were managers and others users, some ITIL-certified and others not. Student's t-test and the non-parametric Wilcoxon test were adopted. The results indicated that the ITIL service support area, with its greater operational focus, tends toward flexibility. The opposite was found for the service delivery area, which has a greater tactical focus.
The results also suggest that the change management discipline contributed most to flexibility inside the company, followed by the incident management discipline and the service desk function.
Abstract:
This dissertation aims to study and improve controller design methods for power systems; it deals with the dynamic stability of power systems and, therefore, with the design of controllers that damp electromechanical oscillations in these systems. The choice of the methods studied here was guided by the requirements a power system stabilizer (PSS) must meet: robustness, decentralization, and coordination. Some of the methods had their characteristics improved to meet these requirements. The study of the methods was restricted to time-domain analysis, because the time-domain approach makes it easier to model parametric uncertainties, meeting the robustness requirement, and also allows the decentralized control problem to be formulated in a simple way. Moreover, the time-domain approach allows the design problem to be formulated using linear matrix inequalities (LMIs), which have the advantage that the solution set is always convex and that efficient algorithms exist to compute a solution. Indeed, several computational packages are available for solving linear matrix inequality problems. For this reason, the output-feedback controller design methods always seek to cast the problem in LMI form, since that form guarantees a solution can be obtained whenever one exists.
Abstract:
This paper presents a poverty profile for Brazil, based on three different sources of household data for 1996. We use PPV consumption data to estimate poverty and indigence lines. “Contagem” data is used to allow for an unprecedented refinement of the country’s poverty map. Poverty measures and shares are also presented for a wide range of population subgroups, based on the PNAD 1996, with new adjustments for imputed rents and spatial differences in cost of living. Robustness of the profile is verified with respect to different poverty lines, spatial price deflators, and equivalence scales. Overall poverty incidence ranges from 23% with respect to an indigence line to 45% with respect to a more generous poverty line. More importantly, however, poverty is found to vary significantly across regions and city sizes, with rural areas, small and medium towns and the metropolitan peripheries of the North and Northeast regions being poorest.
Abstract:
This dissertation aims to verify whether there is a positive relationship between capital constraints and the credit assignment operations carried out by Brazilian banks. The analysis period, from June 2004 to June 2009, includes the impact of the financial crisis, a relevant period given the difficulty financial institutions faced as interbank lending contracted. The study is divided into four sections, in addition to the introduction and conclusion. It first surveys the existing literature and the institutional and regulatory aspects of credit assignment by Brazilian banks. The following section presents descriptive statistics and an overview of the evolution of credit assignment operations in Brazil from June 2004 to June 2009. The next section presents the methodology and the models applied (Logit, Tobit, and their panel-data counterparts, considering total credit assignments, with and without recourse), together with a description of the variables included in each case. Overall, the results and robustness tests indicate that an increase in equity and debt capital constraints (in the latter case, especially for the without-recourse modality) is related to the expansion of credit assignment operations by Brazilian banks. The hypothesis that the effects of capital constraints on assignments are exacerbated during financial crises is also confirmed.
Abstract:
The impacts of adopting International Financial Reporting Standards (IFRSs) have been debated in professional and academic circles, yet little research has addressed the repercussions of IFRS adoption for forensic accounting work in criminal investigations. The objective of this study is therefore to capture and analyze the perception of Brazilian Federal Criminal Forensic Experts regarding the impacts of IFRS adoption on official forensic examinations of accounting fraud. It relies on quantitative and qualitative approaches to test for associations between perceptions, using Pearson's chi-square test and content analysis, respectively. The results show that most respondents partially or fully agree that IFRS adoption will make federal forensic work easier, a perception statistically associated with the views that frauds committed without financial engineering are easier to prove and that greater room for technical judgment has a positive impact on forensic work. Other reported benefits were increased comparability, reduced complexity, and professional recognition. The perceived risks, however, were a possible increase in technical challenges to expert reports, the risk of bias, and the need for further qualification, although these showed no statistical association with the perception that IFRSs will or will not ease forensic work. No statistical differences in perception were found by level of knowledge of IFRS precepts or by the respondents' theoretical and practical knowledge. The study's limitations concern mainly the generalization of the results, since the approach was qualitative and quantitative and the number of completed questionnaires did not allow more robust statistical tests.
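The Pearson chi-square test of association used in the study can be sketched as below. The contingency table is entirely hypothetical: rows are agreement that IFRS adoption eases forensic work, columns are agreement that greater room for technical judgment is positive.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 contingency table of respondent counts
# (agree / disagree on each of the two perception questions).
table = np.array([[30, 5],
                  [8, 12]])

# Pearson's chi-square test of independence between the two perceptions.
chi2, p, dof, expected = stats.chi2_contingency(table)
```

For a 2x2 table SciPy applies Yates' continuity correction by default; with small expected counts, Fisher's exact test (`stats.fisher_exact`) would be the usual fallback.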
Abstract:
This paper estimates the impact of the use of structured methods on the quality of education of students in primary public schools in Brazil. Structured methods encompass a range of pedagogical and managerial instruments applied to the education system. In recent years, several municipalities in the State of São Paulo have contracted out private educational providers to implement these structured methods in their schooling systems. Their pedagogical proposal involves structuring curriculum contents, elaborating and using teachers' and students' textbooks, and training and supervising teachers and instructors. Using a difference-in-differences estimation strategy, we find that fourth- and eighth-grade students in municipalities with structured methods performed better in Portuguese and Math than students in municipalities not exposed to the methods. We find no differences in approval rates. However, a robustness check cannot rule out the possibility that unobserved municipal characteristics affect the results.
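The identification strategy behind the estimate can be sketched in its canonical 2x2 form: the treatment effect is the change in treated-group outcomes minus the change in control-group outcomes. This is an illustration of the method only; the paper's actual specification includes controls and multiple grades.

```python
import numpy as np

def did_estimate(y, treated, post):
    """Canonical 2x2 difference-in-differences from group-period means.

    y: outcomes (e.g. test scores); treated: 1 for municipalities that
    adopted structured methods; post: 1 for the post-adoption period.
    """
    y, treated, post = map(np.asarray, (y, treated, post))
    # Mean outcome for one (group, period) cell.
    m = lambda t, p: y[(treated == t) & (post == p)].mean()
    # Treated change minus control change nets out common time trends.
    return (m(1, 1) - m(1, 0)) - (m(0, 1) - m(0, 0))
```

The estimator nets out time-invariant municipal differences and common shocks, but, as the paper's robustness check notes, it cannot rule out time-varying unobserved municipal characteristics.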