878 results for "Observational techniques and algorithms"
Abstract:
Dissertation submitted for the degree of Master in Mechanical Engineering, in the area of Maintenance and Production
Abstract:
This work analyzes the intervention method used by SEBRAE in Rio Grande do Norte for design in craftwork, from the viewpoint of the actors involved, especially the artisans and design consultants. The research methodology was based on ergonomic work analysis (AET), applying observational and interactional techniques. Data tabulated from the observation matrix were analyzed to generate both qualitative and quantitative information. The results allow us to affirm that, for the innovation process through design intervention to be satisfactory for all involved, bonds of interaction must be established and joint actions taken, so that the innovation process bears fruit in favor of the sustainability of the artisan groups.
Abstract:
In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any kind of sample pretreatment, using near-infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms for pre-processing data, selecting variables and building multivariate regression models were compared, such as partial least squares (PLS) regression, non-linear regression by artificial neural networks (ANN), interval partial least squares (iPLS) regression, the genetic algorithm (GA) and the successive projections algorithm (SPA), among others. For the determinations in rat blood plasma samples, the variable-selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the root mean square error of prediction (RMSEP) values for the three analytes, especially triglycerides and HDL cholesterol. The RMSEP values for glucose, triglycerides and HDL cholesterol obtained with the best PLS models were 6.08, 16.07 and 2.03 mg dL⁻¹, respectively. For the determinations in human blood plasma, in contrast, the predictions obtained by the PLS models were unsatisfactory, with a non-linear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, given its ability to model data from non-linear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol with the best ANN models were 13.20, 10.31 and 12.35 mg dL⁻¹, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying these analytes (glucose, triglycerides and cholesterol) even when they are present in highly complex biological fluids such as blood plasma.
Abstract:
Graduate Program in School Education - FCLAR
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
We present an implementation of the F-statistic to carry out the first search in data from the Virgo laser interferometric gravitational wave detector for periodic gravitational waves from a priori unknown, isolated rotating neutron stars. We searched a frequency f₀ range from 100 Hz to 1 kHz and the frequency-dependent spindown f₁ range from −1.6(f₀/100 Hz) × 10⁻⁹ Hz s⁻¹ to zero. A large part of this frequency-spindown space was unexplored by any of the all-sky searches published so far. Our method consisted of a coherent search over two-day periods using the F-statistic, followed by a search for coincidences among the candidates from the two-day segments. We have introduced a number of novel techniques and algorithms that allow the use of the fast Fourier transform (FFT) algorithm in the coherent part of the search, resulting in a fifty-fold speed-up in the computation of the F-statistic with respect to the algorithm used in the other pipelines. No significant gravitational wave signal was found. The sensitivity of the search was estimated by injecting signals into the data. In the most sensitive parts of the detector band, more than 90% of signals with dimensionless gravitational-wave amplitude greater than 5 × 10⁻²⁴ would have been detected.
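The role of the FFT in the coherent stage can be illustrated with a toy example (this is not the Virgo pipeline; the sampling rate, segment length and injected signal are invented): a single FFT over a data segment evaluates the power in every frequency bin at once, instead of testing each candidate frequency separately.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 4096.0                      # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1 / fs)    # a 2-second data segment
f_signal = 300.0                 # frequency of the injected signal, Hz

# A weak sinusoid buried in white noise, standing in for a periodic signal
data = 0.2 * np.sin(2 * np.pi * f_signal * t) + rng.normal(size=t.size)

# Coherent stage: one FFT gives the power in every frequency bin of the
# segment simultaneously; the injected frequency shows up as the peak
spectrum = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_recovered = freqs[np.argmax(spectrum)]
print(f"recovered frequency: {f_recovered:.2f} Hz")
```

The real pipeline computes the F-statistic after demodulating for sky position and spindown, but the frequency scan at its core reduces to exactly this kind of FFT.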
Abstract:
In this work we discuss a project started by the Emilia-Romagna Regional Government concerning the management of public transport. In particular, we perform a data mining analysis on the data set of this project. After introducing the Weka software used for our analysis, we cover the most useful data mining techniques and algorithms, and we show how these results can be used to violate the privacy of the public transport operators themselves. Finally, although it is off topic for this work, we also spend a few words on how this kind of attack can be prevented.
Abstract:
At present there is great expectation regarding the introduction of new tools and methods for software product development, which in the near future will allow an engineering approach to the software production process. The new methodologies now emerging take an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the system-construction process is very low and is concentrated in the last phases of the software life cycle, which means that the cost reduction obtained is insignificant and, more importantly, that the quality of the resulting software products is not guaranteed. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented conforms to the CASE development cycle model, which consists of the analysis, design and testing phases, its field of application being information systems. We first establish the basic principles on which the CASE methodology rests. Then, since the methodology starts by fixing the objectives of the company demanding a computer system, we employ techniques for gathering and validating information which at the same time provide an easy communication language between end users and developers. These same techniques also specify all the system requirements completely, consistently and unambiguously. Likewise, a set of techniques and algorithms is presented to automate, from the system requirements specification, both the logical design of the Process Model and that of the Data Model, each validated against the prior requirements specification.
Finally, we define formal procedures that indicate the set of activities to be carried out in the construction process and how to perform them, thereby achieving integrity across the different stages of the development process.
Abstract:
Artisanal fishing, which involves historical-cultural, environmental, social, political and economic factors, among others, is today an important source of income, creating jobs and food and contributing to keeping people in their birthplace. However, fish, considered one of the most perishable foods, requires proper handling and conservation, from capture to its availability in the market, in order to slow the deterioration process. This dissertation therefore studies the effect of handling practices on fish quality, from capture to landing on the beach, in the activity of the fishermen of Ponta Negra - Natal/RN. It also aims to analyze the quality of the fish and to propose recommendations for proper handling, as well as possible ways to add value to the product through improved quality and good handling practices. The methodology was based on ergonomic analysis of the fishermen's work, using observational and interactional techniques with the focus group, the jangadeiros, to understand their activity; the freshness and quality of the fish were evaluated by sensory analysis and by the microbiological and physicochemical parameters of the existing legislation: Ordinance No. 185 of May 13, 1997, RIISPOA as amended on December 1, 2007, and RDC No. 12 of January 2, 2001. According to the results obtained in the laboratory tests, the quality of the fish was acceptable under the parameters of the existing rules and regulations, with no significant deterioration caused by poor handling or improper storage.
Abstract:
The research systematized in this analysis sought to apprehend the articulation of the socio-educational service network for adolescents serving socio-educational confinement measures in the Seridó region of the state of Rio Grande do Norte, especially in the city of Caicó, the central town of this region. The study was motivated by an interest in unraveling the contradictory reality imposed by the neoliberal State, which is sparing in guaranteeing rights, especially to these teenagers, who are seen as offenders and are stigmatized by capitalist society. The research was carried out between July and September 2013, from a critical perspective, using documental analysis, observational techniques and interviews with professionals of the Educational Center (CEDUC), the Unified Health System (SUS), the social assistance policies, and the State Department of Education, which should make up the service network that gravitates around the National System of Socio-educational Services (SINASE). The Statute of Children and Adolescents (ECA) and SINASE establish that socio-educational measures cannot be applied in isolation from public policies, making it indispensable to articulate the system with the social policies of social assistance, education and health. However, it was observed that the neoliberal logic of the capitalist State has produced fragmented, disconnected, focal and superficial social policies, which fail to give effect to rights acquired beyond the legal sphere. From this perspective, it is possible to affirm that the everyday life of poor Brazilian teenagers is marked by the action of the State, which aims to control those who disturb the order of capital and who threaten production, the market, consumption and private property.
In this way, actions are promoted that criminalize poverty and impose a legal response on this expression of the social question, to the detriment of social policies that meet the real needs of adolescents. In the face of this reality, it becomes necessary to put the struggle for rights on the agenda of the here and now, aiming at a broad public debate involving professionals, researchers and social movements in support of the realization of rights, in order to support reflection and strengthen ways of confronting this social problem. Through the approximations of this study, it was learned that the struggle for rights is a struggle for another project of society, beyond what is already established.
Abstract:
This master's thesis analyzes the activity of operators in a control room for on-shore petroleum production processes, focusing on the sociotechnical constraints that interfere with operators' decision-making and actions and, consequently, on the strategies (individual and collective) used to regulate and maintain both the required operator action and the safety of the system. The activity in focus involves supervising and controlling the production of thousands of barrels of oil per day in a complex and dispersed production structure spread over 80 km. This operational framework highlights the importance of the activity for meeting local and corporate efficiency targets and for good management of the environment and of operators' health and safety. This is exploratory field research using the methodology of Ergonomic Work Analysis, composed of observational and interactional techniques, with the control room of the on-shore production processes of an oil company as its locus. The population of the research consists of control room operators of a Brazilian oil company. The results showed that supervising and controlling superheated steam injection is a complex activity that demands heightened attention, concentration, calculations, comparisons, trend analysis and decision-making. The activity is collectively constructed by the control room operator, the field operator and the steam supplier. The research showed that the processes of communication and collaboration between the control room, the field and the support staff are the key elements of this activity. The study shows that the operators have the autonomy and the elements necessary for the work, that there is continuous investment in improving the technology used, and that the operators report sleep disturbances as a result of chronic exposure to night work.
The study contributed proposals for transforming this activity: installing an area reserved for meals in the control room, updating the supervisory screens to the current operating condition, periodic visits by control room operators to the field, standardization of production reports, and development of assistance and standardized nomenclature for the controlling stations of the steam systems. These aim to improve the conditions under which the activity is carried out, improve the quality of the products produced by the operators, and help reduce the possibility of slips or deviations in the activity.
Abstract:
The mining environment presents a challenging prospect for stereo vision. Our objective is to produce a stereo vision sensor suited to close-range scenes consisting mostly of rocks. This sensor should produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this application. This paper compares a number of stereo matching algorithms in terms of robustness and suitability to fast implementation. These include traditional area-based algorithms, and algorithms based on non-parametric transforms, notably the rank and census transforms. Our experimental results show that the rank and census transforms are robust with respect to radiometric distortion and introduce less computational complexity than conventional area-based matching techniques.
Abstract:
The domination and Hamilton circuit problems are of interest both in algorithm design and in complexity theory. The domination problem has applications in facility location, and the Hamilton circuit problem has applications in routing problems in communications and operations research. The problem of deciding whether G has a dominating set of cardinality at most k, and the problem of determining whether G has a Hamilton circuit, are NP-complete. Polynomial-time algorithms are, however, available for a large number of restricted classes. A motivation for the study of these algorithms is that they not only give insight into the characterization of these classes but also require a variety of algorithmic techniques and data structures, so the search for efficient algorithms for these problems on many classes still continues. A class of perfect graphs which is practically important and mathematically interesting is the class of permutation graphs. The domination problem is polynomial-time solvable on permutation graphs, but the algorithms already available have time complexity O(n²) or more, and space complexity O(n²), on these graphs. The Hamilton circuit problem is open for this class. We present a simple O(n) time and O(n) space algorithm for the domination problem on permutation graphs. Unlike the existing algorithms, we use the concept of the geometric representation of permutation graphs. Further, exploiting this geometric notion, we develop an O(n²) time and O(n) space algorithm for the Hamilton circuit problem.
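The geometric representation the abstract relies on can be made concrete with a short brute-force sketch (an O(n²) illustration of the adjacency rule only, not the thesis's linear-time algorithm): in the matching diagram, each vertex i is a segment from position i on a top line to the position of i in the permutation on a bottom line, and two vertices are adjacent exactly when their segments cross.

```python
def permutation_graph_edges(perm):
    """Edges of the permutation graph of `perm` (a sequence of values 1..n).

    Two segments cross -- i.e. vertices i < j are adjacent -- exactly when
    the pair (i, j) is inverted in the permutation (j appears before i).
    """
    pos = {v: k for k, v in enumerate(perm)}          # bottom-line positions
    n = len(perm)
    return {(i, j)
            for i in range(1, n + 1)
            for j in range(i + 1, n + 1)
            if pos[i] > pos[j]}                       # segments cross

# For perm = (2, 4, 1, 3): the segment pairs (1,2), (1,4) and (3,4) cross
print(sorted(permutation_graph_edges((2, 4, 1, 3))))  # → [(1, 2), (1, 4), (3, 4)]
```

Efficient algorithms avoid this pairwise check by sweeping the diagram and exploiting the ordering of segment endpoints, which is what enables the O(n) domination result.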
Abstract:
To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques, and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction (>40%) of the out-of-field dose comes from out-of-field head-scatter fluence, which is not adequately modeled by the TPS. Based on the MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared to conventional techniques and may offer the most favorable plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).
Abstract:
Magnetic Resonance Imaging (MRI) is a multi-sequence medical imaging technique in which stacks of images are acquired with different tissue contrasts. Simultaneous observation and quantitative analysis of normal brain tissues and small abnormalities across these many different sequences is a great challenge in clinical applications. Multispectral MRI analysis can simplify the job considerably by combining any number of available co-registered sequences in a single suite. However, the poor performance of the multispectral approach with conventional image classification and segmentation methods makes it inappropriate for clinical analysis. Recent work in multispectral brain MRI analysis has attempted to resolve this issue through improved feature extraction approaches, such as transform-based methods, fuzzy approaches and algebraic techniques. Transform-based feature extraction methods like Independent Component Analysis (ICA) and its extensions have been used effectively in recent studies to improve the performance of multispectral brain MRI analysis. However, these global transforms were found to be inefficient and inconsistent in identifying less frequently occurring features, such as small lesions, in large amounts of MR data. The present thesis focuses on improving ICA-based feature extraction techniques to enhance the performance of multispectral brain MRI analysis. Methods using spectral clustering and wavelet transforms are proposed to resolve the inefficiency of ICA in identifying small abnormalities, as well as problems due to ICA over-completeness. The effectiveness of the new methods in brain tissue classification and segmentation is confirmed by a detailed quantitative and qualitative analysis with synthetic and clinical data, both normal and abnormal. In comparison with conventional classification techniques, the proposed algorithms provide better performance in classifying normal brain tissues and significant small abnormalities.
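The basic ICA feature-extraction idea can be sketched as follows, using scikit-learn's FastICA. This is only a minimal illustration: the two non-Gaussian "sources", the 3×2 mixing matrix and the sample count are invented to stand in for tissue signals mixed across co-registered MR sequences, and are not the thesis's data or its spectral-clustering/wavelet extensions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n = 5000
# Two synthetic non-Gaussian "tissue" sources (invented for illustration)
S = np.column_stack([rng.laplace(size=n), rng.uniform(-1, 1, size=n)])

# Three co-registered "sequences" as linear mixtures of the sources,
# standing in for multispectral MRI channels; mixing matrix is arbitrary
A = np.array([[1.0, 0.5],
              [0.4, 1.2],
              [0.8, 0.3]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)     # recovered sources, up to order/sign/scale

# Each recovered component should correlate strongly with one true source
corr = np.abs(np.corrcoef(S.T, S_est.T)[:2, 2:])
print(np.round(corr, 2))
```

In the brain-MRI setting the recovered components play the role of tissue-specific feature images that are then fed to a classifier; the limitation the thesis targets is that a globally mixed, rarely occurring source (a small lesion) contributes too little to the statistics for this global unmixing to isolate it reliably.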