23 results for "computational algorithm"
Abstract:
Raman imaging spectroscopy is a highly useful analytical tool that provides spatial and spectral information on a sample. However, the CCD detectors used in dispersive instruments have the drawback of being sensitive to cosmic rays, which give rise to spikes in Raman spectra. Spikes distort variance structures and must be removed before multivariate techniques are applied. A new algorithm for correcting spikes in Raman imaging was developed using an approach based on the comparison of nearest-neighbor pixels. The algorithm proved simple, fast, and selective, removing spikes from hyperspectral images with high quality.
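The nearest-neighbor comparison described above can be sketched as follows. This is a minimal illustration, not the authors' published algorithm: the despiking rule (flag values that exceed the spatial-neighbor mean by a robust threshold, then replace them with that mean) and the threshold value are assumptions.

```python
import numpy as np

def despike_neighbors(cube, threshold=5.0):
    """Remove positive spikes from a hyperspectral cube of shape
    (rows, cols, bands) by comparing each pixel's spectrum with the
    mean spectrum of its 4-connected spatial neighbours."""
    rows, cols, _ = cube.shape
    out = cube.copy()
    for i in range(rows):
        for j in range(cols):
            # spectra of the 4-connected spatial neighbours
            neigh = [cube[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= a < rows and 0 <= b < cols]
            ref = np.mean(neigh, axis=0)
            diff = cube[i, j] - ref
            sigma = 1.4826 * np.median(np.abs(diff)) + 1e-12  # robust scale (MAD)
            spikes = diff > threshold * sigma                 # cosmic-ray spikes are positive
            out[i, j, spikes] = ref[spikes]
    return out
```

Because cosmic-ray events hit isolated detector pixels, a spike is very unlikely to appear at the same band in adjacent pixels, which is what makes the neighbor spectrum a safe reference.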
Abstract:
One of the main issues concerning solar energy is how to compare it economically with other energy sources, both alternative and conventional (such as the electric grid). The purpose of this work was to develop software that gathers the main technical and economic data needed to identify, through microeconomic analysis methods, the commercial viability of sizing photovoltaic systems, besides considering the benefits arising from on-site energy generation. Considering the useful life of the components of the photovoltaic generation system, the costs of energy from the conventional grid were identified. For the comparison with conventional sources (electric grid and diesel generation), three cost scenarios for photovoltaic panels and two for the availability factor of diesel generation were used. The results showed that when the cost of the panels is low and the installation site is far from the electric grid, the photovoltaic system becomes the best option.
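The distance-dependent comparison in the conclusion can be sketched as a break-even calculation. Every figure below is an illustrative assumption, not the study's data (which also covers diesel availability scenarios and life-cycle energy costs).

```python
# Hedged sketch of the grid-versus-photovoltaic break-even logic:
# compare the capital cost of a small PV system against extending
# the electric grid over a given distance. All numbers are invented.
pv_panel_cost_per_kw = 4000.0   # a panel-cost scenario would vary this
pv_balance_of_system = 3000.0   # inverter, batteries, installation
grid_cost_per_km = 9000.0       # rural line extension
grid_connection_fixed = 1500.0

def pv_cost(kw):
    return pv_panel_cost_per_kw * kw + pv_balance_of_system

def grid_cost(distance_km):
    return grid_connection_fixed + grid_cost_per_km * distance_km

# distance beyond which a 1 kW PV system beats extending the grid
breakeven_km = (pv_cost(1.0) - grid_connection_fixed) / grid_cost_per_km
```

Lower panel costs shrink `pv_cost`, pulling the break-even distance toward zero, which is the qualitative result the abstract reports.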
Abstract:
A base-cutter, represented by a four-bar mechanism, was developed using the AutoCAD program. The normal reaction force of the profile at the contact point was determined through dynamic analysis. The dynamic equilibrium equations were based on the Newton-Euler laws. The linkage was subjected to an optimization technique that took the peak value of the soil reaction force as the objective function to be minimized, while the link lengths and the spring constant varied over a specified range. The Sequential Quadratic Programming (SQP) algorithm was implemented in the Matlab computational program. Results were very encouraging: the maximum value of the normal reaction force was reduced from 4,250.33 to 237.13 N, making the floating process much less disturbing to the soil and the sugarcane ratoons. Later, other variables were incorporated into the optimized mechanism and a new optimization process was carried out.
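The SQP optimization of design variables within bounds can be sketched with SciPy's SLSQP routine. The quadratic surrogate objective and all numbers below are assumptions for illustration; the study's real objective (peak reaction force) comes from the Newton-Euler dynamic model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy surrogate for the peak soil reaction force as a smooth function of two
# design variables: a link length L (m) and the spring constant k (N/m).
# Its assumed minimum sits at L = 0.3 m, k = 500 N/m.
def peak_reaction(x):
    L, k = x
    return 1e4 * (L - 0.3) ** 2 + 1e-2 * (k - 500.0) ** 2 + 200.0

x0 = np.array([0.5, 800.0])                 # initial design
bounds = [(0.1, 1.0), (100.0, 2000.0)]      # allowed ranges for L and k
res = minimize(peak_reaction, x0, method="SLSQP", bounds=bounds)
```

SLSQP is SciPy's sequential quadratic programming implementation, the same family of algorithm the abstract names.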
Abstract:
Remotely sensed imagery has been widely used for land use/cover classification thanks to periodic data acquisition and the widespread use of digital image processing systems offering a wide range of classification algorithms. The aim of this work was to evaluate some of the most commonly used supervised and unsupervised classification algorithms under different landscape patterns found in Rondônia, including (1) areas of mid-size farms, (2) fish-bone settlements and (3) a gradient of forest and Cerrado (Brazilian savannah). Comparison with a reference map based on the kappa statistic yielded good to superior indicators (best results - K-means: k=0.68, k=0.77, k=0.64 and MaxVer: k=0.71, k=0.89, k=0.70, respectively, for the three areas mentioned). The results show that choosing a specific algorithm requires taking into account both its capacity to discriminate among the various spectral signatures under different landscape patterns and a cost/benefit analysis of the different steps the operator performs to produce a land cover/use map. It is suggested that a more systematic assessment of the implementation options for a specific project is needed before beginning a land use/cover mapping job.
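The kappa statistic used to compare each classification with the reference map is computed from a confusion matrix as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A minimal sketch with an invented two-class matrix (the study's matrices are not reproduced here):

```python
import numpy as np

def cohen_kappa(confusion):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement).
    Rows index the reference classes, columns the classified classes."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1.0 - p_chance)

kappa = cohen_kappa([[40, 10], [10, 40]])   # hypothetical confusion matrix
```

Values around 0.6-0.8, like those reported above, are conventionally read as substantial agreement.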
Abstract:
This work addresses the forced-air cooling of strawberries through numerical simulation. The mathematical model used describes the heat transfer process, based on Fourier's law, in spherical coordinates, simplified to a one-dimensional process. To solve the equation expressed by the mathematical model, an algorithm was developed based on an explicit finite-difference scheme and implemented in the scientific computing program MATLAB 6.1. The mathematical model was validated by comparing theoretical and experimental data from strawberries cooled with forced air. The results showed that it is possible to determine the convective heat transfer coefficient by fitting the numerical results to the experimental data. The numerical simulation methodology proved to be a promising decision-support tool for using or developing equipment for the forced-air cooling of spherical fruits.
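An explicit finite-difference scheme for one-dimensional radial conduction in a sphere with a convective surface can be sketched as below. All property values are illustrative assumptions, not the paper's fitted data.

```python
import numpy as np

# Explicit finite differences for dT/dt = alpha*(T'' + (2/r) T') in a sphere
# (Fourier's law, 1-D radial), with a convective boundary at the surface.
R, N = 0.015, 30                 # fruit radius (m), radial nodes
alpha = 1.4e-7                   # thermal diffusivity (m^2/s)
k_th, h = 0.55, 30.0             # conductivity (W/m.K), convective coeff. (W/m^2.K)
T_air, T0 = 1.0, 20.0            # cooling-air and initial fruit temperature (C)
dr = R / (N - 1)
dt = 0.2 * dr**2 / alpha         # safely below the explicit stability limit
r = np.linspace(0.0, R, N)
T = np.full(N, T0)
for _ in range(int(1800.0 / dt)):            # simulate 30 minutes
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + dt * alpha * (
        (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dr**2
        + (2.0 / r[1:-1]) * (T[2:] - T[:-2]) / (2.0 * dr)
    )
    Tn[0] = Tn[1]                             # symmetry at the centre
    Bi = h * dr / k_th                        # convective condition:
    Tn[-1] = (Tn[-2] + Bi * T_air) / (1.0 + Bi)   # -k dT/dr = h (T_s - T_air)
    T = Tn
```

Fitting the convective coefficient, as the paper does, would wrap this loop in an optimizer that matches the simulated center temperature to the measured cooling curve.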
Abstract:
The physical model was based on the Newton-Euler method and was developed using the scientific computing program Mathematica®. Several simulations were run varying the travel speeds (0.69, 1.12, 1.48, 1.82 and 2.12 m s-1), soil profiles (sinusoidal, ascending and descending ramp) and profile heights (0.025 and 0.05 m) to obtain the normal soil reaction force. After the initial simulations, the mechanism was optimized using the scientific computing program Matlab®, taking as the criterion (objective function) the minimization of the normal reaction force of the profile (FN). The design variables were the bar lengths (L1y, L2, L3 and L4), the operating height (L7), the initial length of the spring (Lmo) and the elastic constant of the spring (kt). The mechanism's lack of robustness with respect to the operating height was addressed by using a spring of low stiffness and large length. The results demonstrated that the optimized mechanism showed better flotation performance than the initial mechanism.
Abstract:
Animal welfare has been an important research topic in animal production, particularly regarding assessment methods. Vocalization is an interesting tool for evaluating welfare, as it provides data in a non-invasive way and allows easy automation of the process. The present research aimed to implement an algorithm based on an artificial neural network capable of identifying vocalizations indicative of welfare patterns. The research was done in two parts: the first was the development of the algorithm, and the second its validation with field data. Previous recordings allowed the algorithm to be developed from behaviors observed in sows housed in farrowing crates. Matlab® software was used to implement the network. A gradient backpropagation algorithm was selected for training the network, with the following stop criteria: a maximum of 5,000 iterations or a sum of squared errors smaller than 0.1. Validation was done with sows and piglets housed on a commercial farm. Among the usual behaviors, the ones deserving attention were disputes over feed at farrowing and the risk of involuntary aggression among the piglets or between the piglets and the sow. The algorithm was able to identify, through noise intensity, situations posing an inherent risk of reduced piglet welfare.
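The training loop with the stop criteria quoted above (at most 5,000 iterations, or sum of squared errors below 0.1) can be sketched with a tiny feed-forward network. The two-feature "vocalization" data are invented stand-ins, not the study's field recordings.

```python
import numpy as np

# Minimal gradient-backpropagation sketch on a 2-4-1 sigmoid network.
rng = np.random.default_rng(0)
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]])  # (intensity, duration)
y = np.array([[1.0], [1.0], [0.0], [0.0]])                      # 1 = "risk" call

W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr, sse = 0.5, np.inf
for _ in range(5000):                       # stop criterion 1: max iterations
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    err = y - out
    sse = float((err ** 2).sum())
    if sse < 0.1:                           # stop criterion 2: SSE < 0.1
        break
    d_out = err * out * (1.0 - out)         # backpropagated deltas
    d_hid = (d_out @ W2.T) * hidden * (1.0 - hidden)
    W2 += lr * hidden.T @ d_out; b2 += lr * d_out.sum(axis=0)
    W1 += lr * X.T @ d_hid; b1 += lr * d_hid.sum(axis=0)
```

The topology, learning rate and features here are assumptions; only the stop criteria come from the abstract.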
Abstract:
In order to determine the energy needed to artificially dry an agricultural product, the latent heat of vaporization of moisture in the product, H, must be known. Generally, the expressions for H reported in the literature are of the form H = h(T)f(M), where h(T) is the latent heat of vaporization of free water and f(M) is a function of the equilibrium moisture content, M, which is a simplification. In this article, a more general expression for the latent heat of vaporization, namely H = g(M,T), is used to determine H for cowpea, always-green variety. For this purpose, a computer program was developed that automatically fits about 500 functions, with one or two independent variables, embedded in its library to experimental data. The program uses nonlinear regression and ranks the best functions according to the lowest reduced chi-squared. A set of statistical tests shows that the generalized expression for H used in this work produces better results for cowpea than other equations found in the literature.
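The fit-and-rank step can be sketched as follows: fit candidate two-variable functions by nonlinear regression and keep the one with the lowest reduced chi-squared. The data are synthetic (generated from an assumed "true" g(M,T) form with known noise) and the two candidates stand in for the program's library of ~500 functions.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
M = rng.uniform(5.0, 35.0, 80)              # moisture content (% d.b., assumed)
T = rng.uniform(20.0, 60.0, 80)             # temperature (C, assumed)
H_true = (2502.0 - 2.39 * T) * (1.0 + 1.5 * np.exp(-0.2 * M))
H_obs = H_true + rng.normal(0.0, 5.0, M.size)   # known noise sigma = 5

candidates = {
    "generalized g(M,T)": (
        lambda X, a, b, c, d: (a - b * X[1]) * (1.0 + c * np.exp(-d * X[0])),
        [2500.0, 2.0, 1.0, 0.1],
    ),
    "plane a - b*T + c*M": (
        lambda X, a, b, c: a - b * X[1] + c * X[0],
        [2500.0, 2.0, 1.0],
    ),
}
scores = {}
for name, (f, p0) in candidates.items():
    popt, _ = curve_fit(f, (M, T), H_obs, p0=p0, maxfev=20000)
    resid = H_obs - f((M, T), *popt)
    # reduced chi-squared: sum((resid/sigma)^2) / (n - n_params)
    scores[name] = float(((resid / 5.0) ** 2).sum() / (M.size - len(p0)))
best = min(scores, key=scores.get)
```

A reduced chi-squared near 1 means the model explains the data to within the measurement noise, which is why it is a natural ranking criterion.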
Abstract:
Among the important changes in production processes, it is necessary to guarantee the sustainability of human enterprises, which leads us to foresee changes in management practices to fit a new model incorporating the concepts of Clean Production, Cleaner Production, Lean Manufacturing and Total Productive Maintenance (TPM). The main focus of this work was to elaborate a methodology to guarantee the reliability of the hydraulic systems of sugarcane harvesters, identifying and analyzing failure modes, in order to improve the environmental and socioeconomic quality of the sugarcane industry environment through a significant decrease in hydraulic oil spills. Through the existing reports in the ERP (Enterprise Resource Planning) system used in a sugarcane industrial plant, it was possible to monitor the operational performance of the sugarcane harvesters during three crop seasons with respect to the failure modes of their hydraulic systems; in one of the crop seasons, total control of the hydraulic systems of five harvesters was carried out. Based on the data obtained and the methodology developed, it was possible to develop software that supports decision-making.
Abstract:
The aim of this paper is to discuss some rhythmic differences between European and Brazilian Portuguese and their relationship to pretonic vowel reduction phenomena. After the basic facts of EP and BP are presented, we show that the issue cannot be discussed without taking into account secondary stress placement, and we proceed to present the algorithm-based approach to secondary stress in Portuguese, representative of Metrical Phonology analyses. After showing that this deterministic approach cannot adequately explain the variable position of secondary stress in both languages in words with an even number of pretonic syllables, we argue for the interpretation of secondary stress, and therefore for the construction of rhythmic units, at the PF interface, as suggested in Chomsky's Minimalist Program. We also propose, inspired by the constraint hierarchies of Optimality Theory, that such interpretation must take into account two different constraint rankings in EP and BP. These different rankings would ultimately explain the rhythmic differences between the two languages, as well as the different behavior of pretonic vowels with respect to reduction processes.
Abstract:
In this paper I present some evidence that forces us to conclude that, within the Minimalist Program (Chomsky 1993; 1995), Binding Theory (BT) should be computed after LF (Logical Form). I show that derivations leading to structures containing violations of BT principles must converge at LF, since less economical alternative derivations respecting those principles are also ungrammatical. Being irrelevant to the notion of convergence, BT must apply after LF. A similar reasoning reveals that the Theta-Criterion should have the status of a bare output condition applying at LF, since less economical derivations are allowed by the computational system to prevent violations of it.
Abstract:
Losses of horticultural products in Brazil are significant, and among the main causes are the use of inappropriate boxes and the absence of a cold chain. A box design is proposed, based on computer simulations, optimization and experimental validation, seeking to minimize the amount of wood while accounting for structural and ergonomic aspects and the effective area of the openings. Three box prototypes were designed and built using straight laths with different configurations and opening areas (54% and 36%). The cooling efficiency for Tommy Atkins mango (Mangifera indica L.) was evaluated by determining the cooling time for fruit packed in the wooden models and in the commercially used cardboard boxes, submitted to cooling in a forced-air system at a temperature of 6ºC and an average relative humidity of 85.4±2.1%. The Finite Element Method was applied for the dimensioning and structural optimization of the model with the best cooling behavior. All wooden boxes with fruit underwent vibration testing for two hours (20 Hz). There was no significant difference in average cooling time among the wooden boxes (36.08±1.44 min); however, the difference was significant in comparison to the cardboard boxes (82.63±29.64 min). In the model chosen for structural optimization (36% effective opening area and two side laths), the reduction in total volume of material was 60%, and 83% in the cross-section of the columns. There was no indication of mechanical damage to the fruit after the vibration test. Computer simulations and structural studies may be used as a support tool for developing box designs with geometric, ergonomic and thermal criteria.
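Cooling times like those compared above are commonly extracted by fitting the lumped exponential model theta = exp(-k t) to pulp-temperature readings and reporting, for example, the seven-eighths cooling time (theta = 1/8). A sketch with synthetic readings; the rate and temperatures are assumed values, not the experiment's data.

```python
import numpy as np

T_air, T0 = 6.0, 26.0                 # cooling air and initial pulp temperature (C)
t = np.linspace(0.0, 60.0, 13)        # minutes
k_true = 0.06                         # assumed cooling rate (1/min)
T_meas = T_air + (T0 - T_air) * np.exp(-k_true * t)   # synthetic readings

theta = (T_meas - T_air) / (T0 - T_air)      # dimensionless temperature ratio
k_fit = -np.polyfit(t, np.log(theta), 1)[0]  # slope of ln(theta) versus t
t78 = np.log(8.0) / k_fit                    # seven-eighths cooling time
```

A box that restricts airflow yields a smaller `k_fit` and therefore a longer `t78`, which is the quantitative basis for ranking the packaging designs.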
Abstract:
A common breeding strategy is to carry out basic studies to investigate the hypothesis of a single gene controlling the trait (major gene), with or without polygenes of minor effect. In this study we used Bayesian inference to fit additive-dominance genetic models of inheritance to plant breeding experiments with multiple generations. Normal densities with different means, according to the major gene genotype, were considered in a linear model in which the design matrix of the genetic effects had unknown coefficients (estimated on an individual basis). An actual data set from an inheritance study of parthenocarpy in zucchini (Cucurbita pepo L.) was used for illustration. Model fitting included posterior probabilities for all individual genotypes. The analysis agrees with results in the literature, but this approach was far more efficient than previous alternatives that assumed the design matrix was known for the generations. Parthenocarpy in zucchini is controlled by a major gene with an important additive effect and partial dominance.
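The genotype-posterior step at the heart of such a mixture model can be sketched with Bayes' rule: for one individual phenotype, combine normal densities with genotype-specific means and a shared residual SD with a Hardy-Weinberg F2 prior. All numbers are illustrative assumptions, not the zucchini data.

```python
import math

# Assumed genotype means show an additive pattern with partial dominance
# (Aa closer to AA than to aa), a shared residual SD, and a 1:2:1 F2 prior.
means = {"AA": 10.0, "Aa": 8.0, "aa": 4.0}
sd = 1.5
prior = {"AA": 0.25, "Aa": 0.50, "aa": 0.25}

def genotype_posterior(y):
    """P(genotype | phenotype y) for a normal mixture with known parameters."""
    weight = {g: prior[g] * math.exp(-0.5 * ((y - m) / sd) ** 2)
              for g, m in means.items()}
    total = sum(weight.values())
    return {g: w / total for g, w in weight.items()}

post = genotype_posterior(9.2)
```

In the full Bayesian analysis these per-individual probabilities are what allow the unknown design-matrix coefficients to be estimated rather than assumed known.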
Abstract:
Evolving interfaces were initially focused on solutions to scientific problems in Fluid Dynamics. With the advent of the more robust modeling provided by the Level Set method, their original boundaries of applicability were extended. Specifically in the Geometric Modeling area, works published to date relating Level Set to three-dimensional surface reconstruction have centered on reconstruction from a cloud of points dispersed in space; the approach based on parallel planar slices transversal to the object to be reconstructed is still incipient. Based on this fact, the present work analyzes the feasibility of the Level Set method for three-dimensional reconstruction, offering a methodology that integrates ideas already published and proven efficient for this approach with proposals for handling limitations of the method not yet satisfactorily treated, in particular the excessive smoothing of fine contour features evolving under Level Set. In this regard, the Particle Level Set variant is suggested as a solution, for its proven intrinsic capability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to qualitatively evaluate the presented three-dimensional surface reconstruction methodology.
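The front-propagation machinery underlying the Level Set method can be sketched in 2-D: a circle's signed-distance function is evolved under a constant outward normal speed F with first-order upwind differences (the Osher-Sethian scheme for phi_t + F |grad phi| = 0). This only illustrates the evolution step; the slice-based reconstruction pipeline and the Particle Level Set correction are beyond this snippet, and the grid size and speed are assumptions.

```python
import numpy as np

n, L = 101, 2.0
x = np.linspace(-L, L, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 1.0      # zero level set: a unit circle
F, dt = 1.0, 0.5 * dx                 # CFL-safe time step for |F| = 1
for _ in range(20):                   # grow the circle by 20 * dt = 0.4
    dxm = (phi - np.roll(phi, 1, axis=1)) / dx   # one-sided differences
    dxp = (np.roll(phi, -1, axis=1) - phi) / dx
    dym = (phi - np.roll(phi, 1, axis=0)) / dx
    dyp = (np.roll(phi, -1, axis=0) - phi) / dx
    # upwind gradient magnitude for F > 0
    grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2
                   + np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
    phi = phi - dt * F * grad         # phi_t + F |grad phi| = 0
# estimate the front radius from the zero crossing along the centre row
radius = np.sum(phi[n // 2] < 0.0) * dx / 2.0
```

The numerical dissipation of exactly this kind of upwind scheme is what smooths fine front features over time, motivating the Particle Level Set variant discussed above.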