877 results for "new method"
Abstract:
This study aims to identify the seasonal coefficients of selected economic variables (industrial production, exports, and imports), free from the structural changes recorded in the economy. It examines whether the stabilization plans implemented by the government over the previous fifteen years affected the seasonal pattern of those series. To that end, X-12-ARIMA is applied, the new seasonal adjustment method developed by the U.S. Bureau of the Census. This method is necessary because the other known methods prevent our hypothesis from being tested: they do not allow intervention variables and therefore do not yield the best estimators for the seasonal coefficients. The study covers the period from 1980 to 1997, and the results confirm our hypothesis of a change in the seasonal pattern over the period: the economic variables examined were affected, in one way or another, by the stabilization plans implemented over the previous fifteen years.
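X-12-ARIMA itself is a stand-alone Census Bureau program, but the core idea - a seasonal ARIMA model augmented with intervention regressors for structural breaks - can be sketched with statsmodels' SARIMAX. Everything below (the series, dates, model orders, and the choice of plan) is an illustrative assumption, not taken from the study:

```python
# Hedged sketch: seasonal ARIMA with an intervention dummy (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly industrial production index, 1980-1997.
idx = pd.date_range("1980-01", "1997-12", freq="MS")
rng = np.random.default_rng(0)
y = pd.Series(100 + np.cumsum(rng.normal(0, 1, len(idx))), index=idx)

# Level-shift dummy for a stabilization plan (the date is an assumption).
plan = (idx >= "1994-07-01").astype(float)  # e.g., Plano Real

# Airline-type seasonal model with the intervention as an exogenous regressor.
model = sm.tsa.statespace.SARIMAX(
    y, exog=plan, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)
)
res = model.fit(disp=False)
print(res.summary())  # the exog coefficient captures the structural shift
```

The coefficient on the dummy absorbs the level shift, so the seasonal terms are estimated free of the break - the role interventions play in the abstract's argument.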
Abstract:
Internet recruiting has been heralded in the job market as a modern tool for attracting the best and brightest professionals to companies. The main objective of this study is to analyze the Internet resources applied to the recruitment and selection process, in order to understand how these tools can facilitate recruitment with respect to cost efficiency, time spent, and the suitability of the candidates selected to fill job vacancies. The study also examines the role of intermediaries in the job market, specifically executive search consultants, given the intensive use of Internet tools by companies that, in this new way, get in touch directly with a large pool of potential candidates. It further investigates how these new resources for developing in-house capabilities to manage online recruiting can bring savings, faster processes, and better-qualified candidates. The empirical section is based on a case study of Companhia Distribuidora de Gás do Rio de Janeiro (CEG), in which the new Internet-based method is compared with the traditional one, analyzing the new method's impact on the main variables in the process.
Keywords: Labor Market, Recruitment and Selection, On-line Recruitment, Human Resource Management, Internet
Abstract:
The author reviews the international and national literature on the subject, describes the various phases in the development of a new drug, and critically analyzes the state of management of pharmaco-clinical research. He discusses the positions of Clinical Pharmacology and of Medicine within the pharmaceutical industry abroad and in Brazil, points out the difficulties in conducting and controlling this research, and proposes a new managerial approach for pharmaco-clinical studies. The proposal draws on twelve years of experience, from 1975 to 1987, in several areas of the medical-scientific departments of pharmaceutical companies, with more than 80% of this period dedicated to planning, implementing, monitoring, concluding, and publishing pharmaco-clinical research; during this time the author was responsible for the publication of no fewer than fifteen (15) trials, participated in at least 24 other published studies, and in another 31 not yet published (Annex 3). Based on this analysis of the situation, the author proposes a management method for pharmaco-clinical research conducted in Brazil, so that all the steps required for these trials are systematized, planned, and properly evaluated, and can be stored and readily retrieved at any time, thereby becoming an excellent database for a wide range of information, including official purposes.
Abstract:
The objective of this study is to understand and evaluate the use of results-based contracting by the São Paulo Municipal Department of Social Assistance with private non-profit entities, in particular for the provision of institutional shelter services. To that end, we present the characteristics of this new format for the relationship between the State and civil society and assess the extent to which results agreements can contribute to improving the effectiveness of this service in the city of São Paulo.
Abstract:
This thesis comprises three articles and a note, one per chapter, all in the field of Applied Microeconomics and Labor Economics. The first article extends the traditional decomposition of unemployment-rate fluctuations of Shimer (2012) by separating formal from informal employment. With this modification, the main results of the methodology change, and we conclude that the main factors behind the fall in unemployment over the previous decade were (i) the drop in the participation rate, driven mainly by lower entry into the labor force, and (ii) the rise in formalization, achieved both through a higher probability of finding a formal job and through changes in the probability of leaving formal employment. The second chapter presents estimates of the returns to education in Brazil using a new methodology that does not require exclusion restrictions. Its advantage over instrumental-variables approaches is that it allows the average return to be evaluated for all workers (not only those affected by the instruments) and at any point in time. The results lead us to conclude that OLS estimates understate the average return; possible explanations for this phenomenon are discussed. The third article deals with labor outsourcing in Brazil, measuring the wage differential between outsourced workers and those hired directly. An unconditional comparison indicates that outsourced workers earned, on average, 17% less over the period 2007 to 2012; with estimates that account for worker fixed effects, however, this differential falls to 3.0%. Moreover, the differential is quite heterogeneous across types of services: those employing low-skill workers show lower wages, while in higher-skill occupations outsourced workers earn the same as or more than directly hired ones. The evidence also points to a narrowing of the differential over the period analyzed. Finally, the note that closes the thesis documents two relevant and little-known aspects of IBGE's Monthly Employment Survey (Pesquisa Mensal de Emprego) that can lead to inaccurate results in studies using this panel if not handled properly.
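A schematic illustration of the fixed-effects comparison in the third article: demeaning wages and the outsourcing indicator within each worker absorbs worker fixed effects, so the slope is identified from workers who switch status. The panel below is a made-up toy, not the thesis' matched data:

```python
# Hedged sketch: within-worker estimate of the outsourcing wage differential.
import numpy as np
import pandas as pd

# Hypothetical panel: one row per worker-year.
df = pd.DataFrame({
    "worker_id": [1, 1, 2, 2, 3, 3],
    "log_wage":  [2.0, 1.9, 2.5, 2.4, 1.8, 1.8],
    "outsourced": [0, 1, 0, 1, 0, 1],
})

# Within transformation: demean by worker to absorb worker fixed effects.
demeaned = df[["log_wage", "outsourced"]] - \
    df.groupby("worker_id")[["log_wage", "outsourced"]].transform("mean")

# OLS slope of demeaned log wage on demeaned outsourcing status.
x, y = demeaned["outsourced"].values, demeaned["log_wage"].values
beta = (x @ y) / (x @ x)
print(f"FE wage differential: {beta:.3f}")  # log-point gap, outsourced vs direct
```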
Abstract:
In this work we present a new adaptive-step numerical method, based on the local linearization approach, for integrating stochastic differential equations with additive noise. We also propose a computational scheme that allows an efficient implementation of this method, suitably adapting the Padé algorithm with the scaling-and-squaring strategy for computing the matrix exponentials involved. Before introducing the construction of the method, we briefly present what stochastic differential equations are, the mathematics underlying them, their relevance for modeling a wide range of phenomena, and the importance of numerical methods for evaluating such equations. A brief study of numerical stability is also provided. With this, we aim to lay the groundwork for the construction of the new method and scheme. Finally, several numerical experiments are carried out to demonstrate, in practice, the effectiveness of the proposed method and to compare it with other commonly used methods.
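A minimal sketch of one local-linearization step for dX = f(X) dt + g dW with additive noise: the drift is linearized at the current state and the resulting linear ODE is solved exactly with a matrix exponential (scipy's expm implements precisely the Padé scaling-and-squaring approach mentioned above). The noise increment here is the plain Euler one and the step is fixed, whereas the thesis' scheme is adaptive; this illustrates the idea, not the proposed method itself:

```python
# Hedged sketch: one fixed-step local-linearization (LL) update for
#   dX = f(X) dt + g dW   (additive noise), illustrative only.
import numpy as np
from scipy.linalg import expm  # Pade approximation with scaling-and-squaring

def ll_step(x, f, jac, g, h, rng):
    """One LL step: exact solution of the locally linearized drift
    plus an Euler-type additive-noise increment."""
    d = len(x)
    J, fx = jac(x), f(x)
    # Augmented-matrix trick: the top-right block of expm(M*h) equals
    # the integral of expm(J*s) @ fx over [0, h].
    M = np.zeros((d + 1, d + 1))
    M[:d, :d], M[:d, d] = J, fx
    drift_increment = expm(M * h)[:d, d]
    noise_increment = g @ rng.normal(0.0, np.sqrt(h), size=g.shape[1])
    return x + drift_increment + noise_increment

# Toy example: damped oscillator with additive noise.
f = lambda x: np.array([x[1], -x[0] - 0.5 * x[1]])
jac = lambda x: np.array([[0.0, 1.0], [-1.0, -0.5]])
g = 0.1 * np.eye(2)
rng = np.random.default_rng(1)
x = np.array([1.0, 0.0])
for _ in range(1000):
    x = ll_step(x, f, jac, g, 1e-2, rng)
print(x)
```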
Abstract:
BARBOSA, André F.; SOUZA, Bryan C.; PEREIRA JUNIOR, Antônio; MEDEIROS, Adelardo A. D. de. Implementação de Classificador de Tarefas Mentais Baseado em EEG. In: CONGRESSO BRASILEIRO DE REDES NEURAIS, 9., 2009, Ouro Preto, MG. Anais... Ouro Preto, MG, 2009.
Abstract:
Spiking neural networks - networks that encode information in the timing of spikes - are emerging as a new approach within the artificial neural network paradigm, rooted in cognitive science. One of these new models is the pulsed neural network with radial basis function, a network able to store information in the axonal propagation delays of its neurons. Learning algorithms have been proposed for this model that seek to map input pulses onto output pulses. Recently, a new method was proposed to encode constant data into a temporal sequence of spikes, stimulating deeper studies to establish the abilities and limits of this new approach. A well-known problem of this kind of network, however, is its high number of free parameters - more than 15 - that must be properly configured or tuned for the network to converge. This work presents, for the first time, a new learning function for training this network that allows the automatic configuration of one of its key parameters: the synaptic weight decreasing factor.
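For context, one common scheme for turning a constant value into spike times - population coding with Gaussian receptive fields, used in SpikeProp-style pulsed networks - is sketched below. This is a generic illustration, not necessarily the specific encoding method the abstract refers to:

```python
# Hedged sketch: encoding a scalar into spike times via Gaussian
# receptive fields (generic population-coding illustration).
import numpy as np

def encode(value, n_neurons=8, lo=0.0, hi=1.0, t_max=10.0):
    """Each neuron has a Gaussian tuning curve; stronger activation
    means an earlier spike. Returns one spike time per neuron (ms)."""
    centers = np.linspace(lo, hi, n_neurons)
    width = (hi - lo) / (n_neurons - 1)
    activation = np.exp(-0.5 * ((value - centers) / width) ** 2)  # in (0, 1]
    return t_max * (1.0 - activation)  # high activation -> early spike

print(np.round(encode(0.3), 2))  # earliest spikes from neurons centered near 0.3
```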
Abstract:
Near-infrared Raman spectroscopy (NIRS) is a particularly promising technique that has been used in recent years for many biomedical applications. Optical spectroscopy has gained increasing prominence as a tool for quantitative analysis of biological samples, clinical diagnosis, concentration measurements of blood metabolites and therapeutic drugs, and analysis of the chemical composition of human tissues. Toxoplasmosis is an important zoonosis in public health, and domestic cats are the most important transmitters of the disease. It can be detected by several serological tests, which usually have a high cost and require a long time. The goal of this work was to investigate a new method for diagnosing Toxoplasma gondii infections using NIRS. To confirm antibody detection, 24 cat blood serum samples were analyzed by their Raman spectra, of which 23 presented positive serology for toxoplasmosis and one was a reference negative serum. Characteristic Raman peaks allowed differentiation between negative and positive sera, confirming the possibility of antibody detection by Raman spectroscopy. These results give the first evidence that this technique can be useful to quantify antibodies in cat sera.
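As a rough illustration of the peak-based differentiation described above, the sketch below locates prominent peaks in a synthetic spectrum with scipy; the band positions and thresholds are invented, not the Raman bands reported in the study:

```python
# Hedged sketch: locating characteristic peaks in a spectrum (synthetic data).
import numpy as np
from scipy.signal import find_peaks

wavenumber = np.linspace(600, 1800, 1200)  # cm^-1, illustrative range
rng = np.random.default_rng(2)
spectrum = (
    np.exp(-0.5 * ((wavenumber - 1004) / 8) ** 2)          # made-up band
    + 0.6 * np.exp(-0.5 * ((wavenumber - 1450) / 12) ** 2)  # made-up band
    + 0.05 * rng.normal(size=wavenumber.size)               # noise
)

peaks, _ = find_peaks(spectrum, height=0.3, distance=50)
print(np.round(wavenumber[peaks]))  # candidate characteristic bands
```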
Abstract:
This work introduces a new method for mapping environments with three-dimensional information extracted from visual data, aimed at accurate robot navigation. Many approaches to 3D mapping with occupancy grids typically require high computational effort both to build and to store the map. We introduce a 2.5D occupancy-elevation grid mapping, a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain at that place in the environment, and the variance of this height. This 2.5-dimensional representation allows a mobile robot to know whether a place in the environment is occupied by an obstacle and how high that obstacle is, so it can decide whether the obstacle can be traversed. The sensory information needed to construct the map is provided by a stereo vision system, modeled with a robust probabilistic approach that accounts for the noise present in stereo processing. The resulting maps support tasks such as decision making in autonomous navigation, exploration, localization, and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
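A minimal sketch of the kind of cell the abstract describes: occupancy kept in log-odds form and updated with each hit, plus a height estimate with variance fused as a one-dimensional Kalman-style update. The field names and update rules are illustrative assumptions, not the thesis' exact probabilistic model:

```python
# Hedged sketch: one cell of a 2.5D occupancy-elevation grid (illustrative).
import math
from dataclasses import dataclass

@dataclass
class Cell:
    log_odds: float = 0.0    # occupancy in log-odds form (p = 0.5 initially)
    height: float = 0.0      # estimated terrain/obstacle height (m)
    height_var: float = 1e3  # variance of the height estimate

    def update(self, p_hit: float, z: float, z_var: float) -> None:
        # Bayesian occupancy update in log-odds form.
        self.log_odds += math.log(p_hit / (1.0 - p_hit))
        # 1D Kalman-style fusion of the new height measurement z.
        k = self.height_var / (self.height_var + z_var)
        self.height += k * (z - self.height)
        self.height_var *= (1.0 - k)

    @property
    def occupancy(self) -> float:
        return 1.0 - 1.0 / (1.0 + math.exp(self.log_odds))

# A stereo measurement suggesting an obstacle 0.4 m tall at this cell.
cell = Cell()
cell.update(p_hit=0.7, z=0.4, z_var=0.05)
print(round(cell.occupancy, 2), round(cell.height, 2))
```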
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches explored during the development of the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are made and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes, aiming to obtain an expression for the probability that a point belongs to one of the classes. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed in order to study the robustness of the method and to derive heuristics for the choice of the correct threshold. Throughout the work, aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
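A minimal sketch of the two-stage idea: vector-quantize the data into Na auxiliary clusters (plain k-means here) and then link auxiliary clusters whose centroids lie closer than dt, so that connected components become the final classes. Plain Euclidean centroid distance stands in for the dissimilarity and divergence measures studied in the work:

```python
# Hedged sketch: link auxiliary k-means clusters whose centroids are
# closer than a threshold dt; connected components = final classes.
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import cdist

rng = np.random.default_rng(3)
data = np.vstack([rng.normal(0, 0.3, (100, 2)),
                  rng.normal(3, 0.3, (100, 2))])  # two true classes

Na, dt = 8, 1.0  # number of auxiliary clusters and linkage threshold
centroids, labels = kmeans2(data, Na, seed=3, minit="++")

# Link auxiliary clusters with centroid distance below dt.
adjacency = cdist(centroids, centroids) < dt
n_classes, merge = connected_components(adjacency, directed=False)

final_labels = merge[labels]
print("classes found:", n_classes)  # expected: 2 for this data
```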
Abstract:
One of the objectives of this work is the analysis of planar structures using PBG (Photonic Bandgap) materials, a new method of controlling the propagation of electromagnetic waves in devices with dielectrics. The basic theory of these structures is presented, as well as applications and the determination of certain parameters. The analysis covers PBG structures, including basic theory and applications in planar structures, and considerations are made regarding the implementation of devices. The TTL (Transverse Transmission Line) method is employed, characterized by the simplicity of its treatment of the equations that govern the propagation of electromagnetic waves in the structure. In this method, the fields in the x and z directions are expressed as functions of the fields in the transverse direction y in the FTD (Fourier Transform Domain). The method is useful for determining the complex propagation constant, with applications at high frequencies and in photonics. Structures at the micrometric scale operating at frequencies in the terahertz range are addressed, a first step toward operation in the visible spectrum. The mathematical basis for determining the electromagnetic fields in the structure is developed from the TTL method, taking into account the dimensions considered in this work, and calculations of the complex propagation constant are also carried out. The computational implementation is presented for high frequencies; the analysis is first done for open microstrip lines with a semiconductor substrate. Finally, considerations are made regarding applications of these devices in telecommunications, together with suggestions for future work.
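As generic context for the bandgap behavior these structures exploit, the sketch below computes the reflectance of a one-dimensional dielectric stack with the standard transfer-matrix method. This is not the TTL/FTD formulation used in the work, and all layer parameters are illustrative:

```python
# Hedged sketch: reflectance of a 1D Bragg stack via the standard
# transfer-matrix method (generic PBG illustration, normal incidence).
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of one dielectric layer (index n, thickness d)."""
    phase = 2 * np.pi * n * d / wavelength
    return np.array([[np.cos(phase), 1j * np.sin(phase) / n],
                     [1j * n * np.sin(phase), np.cos(phase)]])

def reflectance(wavelength, n_pairs=10, n1=1.5, n2=2.5, design_wl=1.0):
    d1, d2 = design_wl / (4 * n1), design_wl / (4 * n2)  # quarter-wave layers
    M = np.eye(2, dtype=complex)
    for _ in range(n_pairs):
        M = M @ layer_matrix(n1, d1, wavelength) @ layer_matrix(n2, d2, wavelength)
    n_in = n_out = 1.0  # air on both sides
    (m11, m12), (m21, m22) = M
    r = ((m11 + m12 * n_out) * n_in - (m21 + m22 * n_out)) / \
        ((m11 + m12 * n_out) * n_in + (m21 + m22 * n_out))
    return abs(r) ** 2

# Inside the bandgap (design wavelength) vs outside it.
print(reflectance(1.0), reflectance(0.6))
```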
Abstract:
A new method to perform TCP/IP fingerprinting is proposed. TCP/IP fingerprinting is the process of identifying a remote machine through a TCP/IP-based computer network. It has many applications related to network security: both intrusion and defence procedures may use it to achieve their objectives. There are many known methods that perform this process under favorable conditions; nowadays, however, many adversities reduce identification performance. This work aims at creating a new OS fingerprinting tool that bypasses these problems. The proposed method is based on attractor reconstruction and neural networks to characterize and classify pseudo-random number generators.
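A minimal sketch of the attractor-reconstruction step on which such a classifier could be built: a time-delay (Takens) embedding of a scalar sequence, e.g. TCP initial sequence numbers. The generator, embedding dimension, and delay below are arbitrary illustrative choices:

```python
# Hedged sketch: time-delay (Takens) embedding of a scalar sequence,
# e.g. TCP initial sequence numbers, for attractor reconstruction.
import numpy as np

def delay_embed(series, dim=3, tau=1):
    """Stack delayed copies: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# Toy stand-in for an ISN generator: a linear congruential generator.
def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    x, out = seed, []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)  # normalize to [0, 1)
    return np.array(out)

points = delay_embed(lcg(seed=42, n=1000), dim=3, tau=1)
print(points.shape)  # (998, 3): features a classifier could consume
```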
Abstract:
Following the new tendency toward interdisciplinarity in modern science, a new field called neuroengineering has come to light in recent decades; after 2000, scientific journals and conferences around the world have been created on this theme. The present work comprises three subareas related to neuroengineering and electrical engineering - neural stimulation, theoretical and computational neuroscience, and neuronal signal processing - as well as biomedical engineering. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds: using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marchenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heart beats was developed that does not rely on a database for training and is not specialized in specific pathologies; it is based on wavelet decomposition and normality measures of random variables. Altogether, the results presented in these three fields of knowledge represent qualification in neural and biomedical engineering.
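A minimal sketch of the Marchenko-Pastur criterion in part (ii): for an N-neurons-by-T-bins matrix of z-scored spike counts, eigenvalues of the correlation matrix above the bound (1 + sqrt(N/T))^2 signal candidate assemblies. The data below are independent by construction, so typically no eigenvalue crosses the bound:

```python
# Hedged sketch: Marchenko-Pastur bound for detecting neuronal assemblies
# from a z-scored spike-count matrix (neurons x time bins).
import numpy as np

rng = np.random.default_rng(4)
N, T = 50, 5000                      # neurons, time bins (illustrative)
counts = rng.poisson(2.0, (N, T)).astype(float)
z = (counts - counts.mean(axis=1, keepdims=True)) / counts.std(axis=1, keepdims=True)

corr = (z @ z.T) / T                 # correlation matrix of z-scored counts
eigvals = np.linalg.eigvalsh(corr)

lambda_max = (1 + np.sqrt(N / T)) ** 2   # Marchenko-Pastur upper bound
n_assemblies = int((eigvals > lambda_max).sum())
print(n_assemblies)  # ~0 for independent neurons; >0 signals assemblies
```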
Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while observing desired coverage, Quality of Service (QoS), and capacity. An alternative for further enhancing data rates is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. AMC implementations usually rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g. Gaussianity of the noise). This work proposes a new method to perform AMC using a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
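A minimal sketch of the correntropy coefficient itself: a kernel-based similarity estimated from samples with a Gaussian kernel, centered and normalized as in the usual ITL definition. The kernel width and toy signals are arbitrary choices:

```python
# Hedged sketch: correntropy coefficient between two signals using a
# Gaussian kernel (centered correntropy, normalized).
import numpy as np

def centered_correntropy(x, y, sigma=1.0):
    k = lambda u: np.exp(-u**2 / (2 * sigma**2))
    v = k(x - y).mean()                        # correntropy E[k(x - y)]
    cross = k(x[:, None] - y[None, :]).mean()  # mean over all sample pairs
    return v - cross

def correntropy_coefficient(x, y, sigma=1.0):
    u_xy = centered_correntropy(x, y, sigma)
    u_xx = centered_correntropy(x, x, sigma)
    u_yy = centered_correntropy(y, y, sigma)
    return u_xy / np.sqrt(u_xx * u_yy)

# Toy check: a BPSK-like signal vs a noisy copy and vs unrelated noise.
rng = np.random.default_rng(5)
s = rng.choice([-1.0, 1.0], 1000)
print(correntropy_coefficient(s, s + 0.1 * rng.normal(size=1000)))  # near 1
print(correntropy_coefficient(s, rng.normal(size=1000)))            # near 0
```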