22 results for preprocessing

in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"


Relevance:

20.00%

Publisher:

Abstract:

This paper investigates properties of integer programming models for a class of production planning problems. The models are developed within a decision support system to advise a sales team on the products on which to focus their efforts in gaining new orders in the short term. The products generally require processing on several manufacturing cells and involve precedence relationships. The cells are already (partially) committed to products for stock and to satisfying existing orders, and therefore only the residual capacities of each cell in each time period of the planning horizon are considered. The determination of production recommendations to the sales team that make use of residual capacities is a nontrivial optimization problem. Solving such models is computationally demanding, and techniques for speeding up solution times are highly desirable. An integer programming model is developed, and various preprocessing techniques are investigated and evaluated. In addition, a number of cutting-plane approaches have been applied. The performance of these approaches, which are both general and application specific, is examined.
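One of the simplest preprocessing steps for such models is variable fixing on knapsack-type residual-capacity constraints: any binary variable whose coefficient exceeds the residual capacity can be fixed to zero before the solver runs. A minimal sketch (the constraint data below is hypothetical, not taken from the paper):

```python
def preprocess_binary_constraint(coeffs, capacity):
    """Preprocess a knapsack-type constraint sum(a_j * x_j) <= capacity
    with binary variables x_j: any variable whose coefficient a_j
    exceeds the residual capacity can never be 1, so it is fixed to 0.
    Returns (indices fixed to zero, remaining {index: coefficient})."""
    fixed_to_zero = [j for j, a in enumerate(coeffs) if a > capacity]
    reduced = {j: a for j, a in enumerate(coeffs) if a <= capacity}
    return fixed_to_zero, reduced
```

Applied to a residual capacity of 10 with coefficients [5, 12, 7], the second variable is fixed to zero and the constraint shrinks to two variables, reducing the search space before branch-and-bound starts.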

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a filter based on a general regression neural network and a moving average filter, for preprocessing half-hourly load data for short-term multinodal load forecasting, discussed in another paper. Tests made with half-hourly load data from nine New Zealand electrical substations demonstrate that this filter is able to handle noise, missing data and abnormal data. © 2011 IEEE.
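The moving-average half of such a filter is straightforward to sketch; a centered window smooths noise while the edges are handled by shrinking the window (the neural-network half of the paper's filter is not reproduced here):

```python
import numpy as np

def moving_average(series, window=5):
    """Centered moving-average filter for a load series.
    Near the boundaries the window is clipped to the available samples."""
    x = np.asarray(series, dtype=float)
    half = window // 2
    out = np.empty_like(x)
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out[i] = x[lo:hi].mean()
    return out
```

A constant series passes through unchanged, while an isolated spike is spread out and attenuated, which is the behaviour needed for flagging abnormal half-hourly load samples.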

Relevance:

10.00%

Publisher:

Abstract:

To prevent large errors in GPS positioning, cycle slips should be detected and corrected. This procedure is not trivial, mainly for single-frequency receivers, and it normally goes unnoticed by users. We therefore discuss some practical and widely used methods for cycle-slip detection and correction using only GPS single-frequency observations. For detection, triple differences (TD) and tetra differences were used. For correction, in general, each slip is corrected in the preprocessing; otherwise, other strategies must be adopted during the processing. In this paper the second option was chosen, and two strategies were tested. In one of them, the elements of the covariance matrix of the involved ambiguities are modified and a new ambiguity estimation starts. In the other, a new ambiguity is introduced as an additional unknown when a cycle slip is detected. These possibilities are discussed and compared in this paper, as well as aspects related to the practicality, implementation, and viability of each one. Some experiments were carried out using simulated data with cycle slips on different satellites and at different epochs. This allowed assessing and comparing the results of cycle-slip occurrence and correction under several conditions.
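The detection idea can be illustrated on a single series: differencing the carrier phase between epochs removes the smooth trend, so a cycle slip shows up as an outlier in the differenced series. This is a simplified sketch (a real detector forms triple differences across receivers and satellites, and the threshold below is an assumed value, not from the paper):

```python
import numpy as np

def detect_cycle_slips(phase, threshold=0.5):
    """Flag epochs where the between-epoch difference of a carrier-phase
    series (in cycles) jumps by more than `threshold` relative to the
    nominal epoch-to-epoch change. Returns the indices of suspect epochs."""
    d = np.diff(np.asarray(phase, dtype=float))
    trend = np.median(d)  # robust estimate of the nominal change per epoch
    return [i + 1 for i, v in enumerate(d) if abs(v - trend) > threshold]
```

Once an epoch is flagged, either strategy from the abstract can be applied: resetting the affected ambiguity's covariance entries, or introducing a new ambiguity unknown from that epoch onward.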

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION: The comet assay, or single-cell gel electrophoresis technique, is widely used to assess DNA damage and repair in individual cells. The material can be stained with fluorescence techniques or with silver salt. The latter offers technical advantages, such as the type of microscope required and the possibility of storing the slides. Comets can be analyzed visually, but this has the drawback of subjective results, which can be minimized by automated digital analysis. OBJECTIVES: To develop and validate a digital analysis method for comets stained with silver salt. METHODS: Fifty comets were photographed in a standardized way and printed on paper. Besides being measured manually, these images were classified into five categories by three raters, before and after automatic preprocessing with the ImageJ 1.38x software. The raters' estimates were compared for correlation and reproducibility. Digital analysis algorithms for the measurements were then developed, based on median and minimum statistical filters. The values obtained were compared with those estimated manually and visually after preprocessing. RESULTS: Manual measurements of the preprocessed images showed higher intraclass correlation than the original images. The automated parameters showed high correlation with the manual measurements on preprocessed images, suggesting that this system increases the objectivity of the analysis and can be used to estimate comet parameters. CONCLUSION: The proposed digital analysis for the silver-stained comet assay proved feasible and more reproducible than visual analysis.
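The median filter mentioned in the methods is a standard nonlinear preprocessing step: it removes isolated noisy pixels (common in silver-stained images) without blurring comet edges. A naive sketch, not the paper's implementation:

```python
import numpy as np

def median_filter(img, size=3):
    """Naive 2-D median filter. Each output pixel is the median of the
    size x size neighbourhood; windows are clipped at the image borders."""
    img = np.asarray(img, dtype=float)
    half = size // 2
    out = np.empty_like(img)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            win = img[max(0, r - half):r + half + 1,
                      max(0, c - half):c + half + 1]
            out[r, c] = np.median(win)
    return out
```

An isolated bright pixel on a dark background is completely removed, since the median of its neighbourhood ignores the single outlier; a minimum filter works analogously, replacing `np.median` with `np.min`.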

Relevance:

10.00%

Publisher:

Abstract:

This paper describes an interactive environment built entirely upon public domain or free software, intended to be used as the preprocessor of a finite element package for the simulation of three-dimensional electromagnetic problems.

Relevance:

10.00%

Publisher:

Abstract:

The edge-detection model based on nonlinear anisotropic diffusion is a mathematical smoothing model based on a partial differential equation (PDE), an alternative to conventional low-pass filters. The smoothing is a selective process in which homogeneous areas of the image are smoothed intensely as the temporal evolution of the model progresses. The level of smoothing is related to the amount of undesired information contained in the image, i.e., the model seeks an optimal level of smoothing that eliminates the undesired information while selectively keeping the features of interest for the cartography domain. The model is well suited to cartographic applications: its role is to preprocess the image without losing edges and other important details, mainly airport runways and paved roads. Experiments carried out with digital images showed that the methodology allows such features, e.g., airport runways, to be extracted efficiently.
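The classical discretization of this kind of PDE is the Perona-Malik scheme: each pixel exchanges intensity with its four neighbours, weighted by a conduction coefficient g(|∇I|) that vanishes near strong gradients, so edges survive while flat regions are smoothed. A minimal sketch with assumed parameter values (the paper's exact diffusion function and stopping rule may differ):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=20.0, lam=0.2):
    """Perona-Malik style nonlinear anisotropic diffusion.
    kappa controls edge sensitivity; lam <= 0.25 keeps the explicit
    scheme stable. Zero-flux boundary conditions are used."""
    u = np.asarray(img, dtype=float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)  # conduction coefficient
    for _ in range(n_iter):
        # nearest-neighbour differences (wrap-around fluxes zeroed)
        dn = np.roll(u, 1, axis=0) - u;  dn[0, :] = 0
        ds = np.roll(u, -1, axis=0) - u; ds[-1, :] = 0
        de = np.roll(u, -1, axis=1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, axis=1) - u;  dw[:, 0] = 0
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

Because the neighbour fluxes are antisymmetric, the mean intensity is conserved while the variance of noisy homogeneous regions shrinks, which is exactly the selective smoothing the abstract describes.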

Relevance:

10.00%

Publisher:

Abstract:

The use of mobile robots is attractive in activities where the action of a human specialist is difficult or dangerous. Mobile robots are often used for exploration in areas of difficult access, such as rescue operations and space missions, to avoid exposing human experts to risky situations. They are also used in agriculture for planting tasks, as well as for keeping the application of pesticides to minimal amounts to mitigate environmental pollution. In this paper we present the development of a system to control the navigation of an autonomous mobile robot through tracks in plantations. Track images are used to control the robot's direction: they are preprocessed to extract image features, which are then submitted to a support vector machine in order to find the most appropriate route. The overall goal of the project to which this work is connected is to develop a real-time robot control system to be embedded into a hardware platform. In this paper we report the software implementation of a support vector machine, which so far has presented around 93% accuracy in predicting the appropriate route. © 2012 IEEE.
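The core of such a classifier can be sketched with a Pegasos-style sub-gradient linear SVM. This is a stand-in, not the paper's implementation: labels are assumed to be ±1 (e.g. steer left vs. right), the features are toy 2-D points rather than image features, and the bias term is omitted for simplicity:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style sub-gradient training of a linear SVM.
    Labels must be in {-1, +1}; no bias term (data assumed
    roughly centred around the separating hyperplane)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            if y[i] * (X[i] @ w) < 1:      # margin violated: hinge-loss step
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                          # only regularisation shrinkage
                w = (1 - eta * lam) * w
    return w

def predict(w, X):
    return np.sign(np.asarray(X, dtype=float) @ w)
```

In an embedded setting the appeal of this formulation is that prediction is a single dot product per class, which maps cheaply onto a hardware platform.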

Relevance:

10.00%

Publisher:

Abstract:

The Finite Element Method (FEM) is a well-known technique, extensively applied in different areas. Studies using FEM aim to improve cardiac ablation procedures. For such simulations, the finite element meshes should consider the size and histological features of the target structures. However, some methods and tools used to generate meshes of human body structures are still limited, due to non-detailed models, nontrivial preprocessing, or, mainly, limitations in their conditions of use. In this paper, alternatives are demonstrated for solid modeling and the automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. © 2013 E. Pavarino et al.

Relevance:

10.00%

Publisher:

Abstract:

An important tool for heart disease diagnosis is the analysis of electrocardiogram (ECG) signals, given the non-invasive nature and simplicity of the ECG exam. Depending on the application, ECG data analysis consists of steps such as preprocessing, segmentation, feature extraction, and classification, aiming to detect cardiac arrhythmias (i.e., cardiac rhythm abnormalities). To make the cardiac arrhythmia signal classification process fast and accurate, we apply and analyze a recent and robust supervised graph-based pattern recognition technique, the optimum-path forest (OPF) classifier. To the best of our knowledge, this is the first time the OPF classifier has been used for the ECG heartbeat signal classification task. We then compare the performance (in terms of training and testing time, accuracy, specificity, and sensitivity) of the OPF classifier with that of three other well-known expert-system classifiers, i.e., the support vector machine (SVM), Bayesian, and multilayer artificial neural network (MLP) classifiers, using features extracted with six main approaches considered in the literature for ECG arrhythmia analysis. In our experiments, we use the MIT-BIH Arrhythmia Database and the evaluation protocol recommended by the Association for the Advancement of Medical Instrumentation. A discussion of the obtained results shows that the OPF classifier presents a robust performance, i.e., there is no need for parameter setup, as well as high accuracy at an extremely low computational cost. Moreover, on average, the OPF classifier yielded greater performance than the MLP and SVM classifiers in terms of classification time and accuracy, and produced quite similar performance to the Bayesian classifier, showing itself to be a promising technique for ECG signal analysis. © 2012 Elsevier Ltd. All rights reserved.
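The OPF idea can be sketched compactly with the f_max path cost: training propagates costs from prototype samples with a Dijkstra-like search, and a test sample takes the label of the training sample offering the cheapest path. This is a simplified illustration on toy data, not the full method (in particular, prototypes here are just the closest inter-class pair, whereas the original algorithm takes them from the minimum spanning tree):

```python
import numpy as np
from heapq import heappush, heappop

def opf_train(X, y):
    """Minimal optimum-path forest training with the f_max path cost.
    Returns (samples, conquered labels, optimum path costs)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(y)
    dist = lambda i, j: float(np.linalg.norm(X[i] - X[j]))
    # simplified prototype choice: closest pair from different classes
    best = (np.inf, 0, 1)
    for i in range(n):
        for j in range(i + 1, n):
            if y[i] != y[j] and dist(i, j) < best[0]:
                best = (dist(i, j), i, j)
    cost = np.full(n, np.inf)
    label = y.copy()
    heap, done = [], [False] * n
    for p in (best[1], best[2]):
        cost[p] = 0.0
        heappush(heap, (0.0, p))
    # Dijkstra-like propagation: path cost is the maximum edge on the path
    while heap:
        c, i = heappop(heap)
        if done[i]:
            continue
        done[i] = True
        for j in range(n):
            if not done[j]:
                new_cost = max(c, dist(i, j))
                if new_cost < cost[j]:
                    cost[j] = new_cost
                    label[j] = label[i]
                    heappush(heap, (new_cost, j))
    return X, label, cost

def opf_classify(model, x):
    """Label of the training sample minimising max(trained cost, distance)."""
    X, label, cost = model
    x = np.asarray(x, dtype=float)
    scores = [max(cost[t], float(np.linalg.norm(X[t] - x)))
              for t in range(len(cost))]
    return label[int(np.argmin(scores))]
```

The parameter-free character mentioned in the abstract is visible here: unlike an SVM or MLP, nothing needs tuning; the competition between prototypes alone partitions the feature space.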

Relevance:

10.00%

Publisher:

Abstract:

Pós-graduação em Ciências Cartográficas - FCT

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)