912 results for Simulated annealing algorithms
Abstract:
Global warming increases the occurrence of events such as extreme heat waves. Research on the thermal and air conditions of the broiler-rearing environment is important for evaluating animal welfare under extreme heat and for devising mitigation measures. This study evaluated the effect of a heat wave, simulated in a climatic chamber, on the thermal and air environment of 42-day-old broilers. One hundred and sixty broilers were housed and reared for 42 days in a climatic chamber, divided into eight pens. The heat wave was simulated on the 42nd day, the period of greatest impact, when the data were sampled. The variables analyzed were room and litter temperatures, relative humidity, and the concentrations of oxygen, carbon monoxide, and ammonia in each pen. These variables were assessed every two hours, starting at 8 am and simulating a day that heats up until 4 pm, when the maximum temperature is reached. From the results, we concluded that increasing room temperatures promoted a proportional rise in litter temperatures, contributing to ammonia volatilization. In addition, oxygen concentrations decreased with increasing temperatures, and carbon monoxide was only detected at temperatures above 27.0 °C, relative humidity above 88.4%, and litter temperatures above 30.3 °C.
Abstract:
Plot-scale overland flow experiments were conducted to evaluate the efficiency of streamside management zones (SMZs) in retaining herbicides in runoff generated by silvicultural activities. Herbicide retention was evaluated for five slopes (2, 5, 10, 15, and 20%), two cover conditions (undisturbed O horizon and raked surface), and two periods with contrasting soil moisture conditions (summer dry and winter wet season), and was correlated with O horizon and site conditions. Picloram (highly soluble in water) and atrazine (moderately sorbed onto soil particles), at concentrations of about 55 and 35 µg L-1 respectively, and kaolin clay (approximately 5 g L-1) were mixed with 13,000 liters of water and dispersed over the top of 5 x 10 m forested plots. Surface flow was collected 2, 4, 6, and 10 m below the disperser to evaluate the changes in concentration as the flow moved through the O horizon and the surface soil horizon mixing zone. The results showed that, on average, a 10 m long forested SMZ removed around 25% of the initial concentration of atrazine and was generally ineffective in reducing the more soluble picloram; retention of picloram was only 6% of the applied quantity. Mass reduction by infiltration was 36% for atrazine and 20% for picloram. The relationship between O horizon depth and atrazine retention was stronger than with any other measured variable, suggesting that the better solid-solution contact associated with flow through deeper O horizons is more important than either velocity or soil moisture as a determinant of sorption.
Abstract:
Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm that determines this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
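The incremental horizon idea can be illustrated in one dimension. The sketch below (ours, not code from the thesis) computes the backward-looking horizon angle at every sample of a 1D height profile in amortized O(1) per sample: an upper convex hull of the samples visited so far is kept on a stack, each sample is pushed and popped at most once, and after the convexity pops the top of the stack is exactly the horizon point. A 2D horizon map follows by running such sweeps over the height field in several azimuthal directions.

```python
import numpy as np

def horizon_angles_1d(h, dx=1.0):
    """Backward-looking horizon angle per sample, amortized O(1) each."""
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); >= 0 means no right turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    hull = []                          # (x, height) stack: upper convex hull
    ang = np.full(len(h), -np.pi / 2)  # fully open horizon if no occluder
    for i, hi in enumerate(h):
        p = (i * dx, hi)
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()                 # occluded: never the horizon again
        if hull:
            qx, qh = hull[-1]          # tangent point = horizon for sample i
            ang[i] = np.arctan2(qh - hi, p[0] - qx)
        hull.append(p)
    return ang
```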
Abstract:
In the present study, using noise-free simulated signals, we performed a comparative examination of several preprocessing techniques used to transform a cardiac event series into a regularly sampled time series appropriate for spectral analysis of heart rhythm variability (HRV). First, a group of noise-free simulated point event series, each representing a time series of heartbeats, was generated by an integral pulse frequency modulation model. To evaluate the performance of the preprocessing methods, the differences between the spectra of the preprocessed simulated signals and the true spectrum (the spectrum of the model input modulating signals) were surveyed by visual analysis and by comparing merit indices. Estimated spectra should match the true spectrum as closely as possible, showing a minimum of harmonic components and other artifacts. The merit indices proposed to quantify these mismatches were the leakage rate, defined as the proportion of leakage components (located outside narrow windows centered at the frequencies of the model input modulating signals) relative to the whole spectral content, and the numbers of leakage components with amplitudes greater than 1%, 5% and 10% of the total spectral components. Our data, obtained from a noise-free simulation, indicate that using heart rate values instead of heart period values in the derivation of signals representative of heart rhythm results in more accurate spectra. Furthermore, our data support the efficiency of the widely used preprocessing technique based on the convolution of inverse interval function values with a rectangular window, and suggest the technique based on cubic polynomial interpolation of inverse interval function values and subsequent spectral analysis as another efficient and fast method for the analysis of HRV signals.
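As an illustration of one of the techniques found efficient here, the sketch below resamples a heartbeat event series into an evenly sampled instantaneous heart-rate signal by cubic interpolation of inverse interval function values and then estimates its spectrum. The 4 Hz resampling rate, the midpoint placement of each rate value, and the use of SciPy are our assumptions, not details from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import periodogram

def hrv_spectrum_from_beats(beat_times, fs=4.0):
    """Spectrum of the inverse interval function of a beat series."""
    rr = np.diff(beat_times)              # RR intervals (s)
    t_mid = beat_times[:-1] + rr / 2      # place each rate at the midpoint
    hr_inst = 1.0 / rr                    # inverse interval function (Hz)
    spline = CubicSpline(t_mid, hr_inst)  # cubic polynomial interpolation
    t_even = np.arange(t_mid[0], t_mid[-1], 1.0 / fs)
    hr = spline(t_even)                   # regularly sampled heart rate
    return periodogram(hr - hr.mean(), fs=fs)  # (frequencies, PSD)
```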
Abstract:
Several lines of evidence point to the participation of serotonin (5-HT) in anxiety; its specific role, however, remains obscure. The objective of the present study was to evaluate the effect of reducing 5-HT neurotransmission, through acute tryptophan depletion, on anxiety induced by a simulated public speaking (SPS) test. Two groups of 14-15 subjects were submitted to a 24-h diet with a low or normal content of tryptophan and received an amino acid mixture without (TRY-) or with (TRY+) tryptophan under double-blind conditions. Five hours later they were submitted to the SPS test. The state-trait anxiety inventory (STAI) and the visual analogue mood scale (VAMS) were used to measure subjective anxiety. Both scales showed that SPS induced a significant increase in anxiety. Although no overall difference between groups was found, there was a trend (P = 0.078) toward a group x gender x SPS phase interaction, and a separate analysis of each gender showed an increase in anxiety measured by the STAI in females of the TRY- group. The results for the female TRY- group also suggested a greater arousing effect of the SPS test. In conclusion, the tryptophan depletion procedure employed in the present study did not induce a significant general change in subjective anxiety, but tended to induce anxiety in females, suggesting a greater sensitivity of the 5-HT system to the effects of the procedure in this gender.
Abstract:
Identification of low-dimensional structures and of the main sources of variation in multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. The objective of this thesis is thus to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered the relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust-region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate. Examples include the identification of faults from seismic data and the identification of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
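To make the ridge-projection step concrete: a point lies on a one-dimensional ridge of a density when the gradient is orthogonal to the eigenvectors of the Hessian associated with the smallest eigenvalues. The thesis develops a trust-region Newton method for this projection; the sketch below instead uses a simpler fixed-step, subspace-constrained iteration on a Gaussian kernel density, purely to illustrate the ridge condition (the names, step size and iteration count are ours).

```python
import numpy as np

def kde_grad_hess(x, data, bw):
    """Unnormalized gradient and Hessian of a Gaussian KDE at x."""
    d = data - x                                    # (n, dim) offsets to x
    w = np.exp(-0.5 * np.sum(d**2, axis=1) / bw**2)
    g = (w[:, None] * d).sum(axis=0) / bw**2
    hess = (np.einsum('n,ni,nj->ij', w, d, d) / bw**2
            - np.eye(x.size) * w.sum()) / bw**2
    return g, hess

def project_to_ridge(x, data, bw, steps=200, lr=0.1):
    """Move x onto a 1D ridge: ascend the gradient only within the span
    of the Hessian's smallest eigenvectors (across-the-ridge directions)."""
    for _ in range(steps):
        g, hess = kde_grad_hess(x, data, bw)
        _, vecs = np.linalg.eigh(hess)              # ascending eigenvalues
        v = vecs[:, :x.size - 1]                    # across-ridge subspace
        x = x + lr * v @ (v.T @ g)
    return x
```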
Abstract:
We compared the cost-benefit of two algorithms recently proposed by the Centers for Disease Control and Prevention, USA, with the conventional one, to determine which is most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of anti-HCV ELISA samples, using the s/co value that shows ≥95% concordance with immunoblot (IB) positivity. In algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, with IB for PCR-negative samples. In algorithm C, all ELISA-positive or -inconclusive samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, of which 283 were concordant across all three. Indeterminate results from algorithms A and C were resolved by PCR (expanded algorithm), which detected two more positive samples. The estimated costs of algorithms A and B were US$21,299.39 and US$32,397.40, respectively, 43.5% and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples, while algorithm B provides early information about the presence of viremia.
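The three testing flows can be read as decision logic. The sketch below is a hypothetical encoding of algorithms A-C as described above; the s/co cutoff and the test callables pcr() and ib() are placeholders standing in for laboratory results, not values from the paper.

```python
def algorithm_a(s_co, cutoff, ib):
    """A: call positives directly from a high ELISA s/co ratio (a value
    chosen for >= 95% concordance with IB); only lower-ratio samples
    need the supplemental immunoblot."""
    return 'positive' if s_co >= cutoff else ib()

def algorithm_b(elisa, pcr, ib):
    """B: reflex PCR for ELISA-positive/-inconclusive samples, giving
    early viremia information; IB only for PCR-negative samples."""
    if elisa not in ('positive', 'inconclusive'):
        return 'negative'
    return 'positive' if pcr() else ib()

def algorithm_c(elisa, ib):
    """C (conventional): IB for every ELISA-positive/-inconclusive sample."""
    return ib() if elisa in ('positive', 'inconclusive') else 'negative'
```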
Abstract:
Our objective was to evaluate the accuracy of three algorithms in differentiating the origins of outflow tract ventricular arrhythmias (OTVAs). This study involved 110 consecutive patients with OTVAs whose standard 12-lead surface electrocardiogram (ECG) showed typical left bundle branch block morphology with an inferior axis. All the ECG tracings were retrospectively analyzed using the following three recently published ECG algorithms: 1) the transitional zone (TZ) index, 2) the V2 transition ratio, and 3) the V2 R wave duration and R/S wave amplitude indices. Considering all patients, the V2 transition ratio had the highest sensitivity (92.3%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (93.9%); the latter also had the maximal area under the ROC curve, 0.925. In patients with left ventricular (LV) rotation, the V2 transition ratio had the highest sensitivity (94.1%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (87.5%); here the former had the maximal area under the ROC curve, 0.892. All three published ECG algorithms are effective in differentiating the origin of OTVAs, the V2 transition ratio and the V2 R wave duration and R/S wave amplitude indices being the most sensitive and the most specific, respectively. Among all patients, the V2 R wave duration and R/S wave amplitude algorithm had the maximal area under the ROC curve, but in patients with LV rotation the V2 transition ratio algorithm had the maximal area under the ROC curve.
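Of these metrics, the V2 transition ratio is the simplest to state: the percentage R wave in lead V2, R/(R+S), measured during the arrhythmia, divided by the same quantity measured during sinus rhythm. The sketch below computes it from the four amplitudes; the ≥0.6 cutoff favoring a left-sided (LVOT) origin mentioned in the comment comes from the wider literature and is not necessarily the threshold used in this study.

```python
def v2_transition_ratio(r_vt, s_vt, r_sinus, s_sinus):
    """V2 transition ratio = [R/(R+S)] during arrhythmia
                           / [R/(R+S)] during sinus rhythm, in lead V2.
    In the originating literature, a ratio >= 0.6 suggests a left
    (LVOT) rather than right (RVOT) outflow tract origin."""
    return (r_vt / (r_vt + s_vt)) / (r_sinus / (r_sinus + s_sinus))
```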
Abstract:
The yam (Dioscorea sp.) is a tuber rich in carbohydrates, vitamins and mineral salts, besides several components that serve as raw material for medicines. It grows well in tropical and subtropical climates, develops well in zones with annual rainfall of around 1300 mm, and, with proper cultural treatments, its productivity can exceed 30 t/ha. When harvested, the tubers contain about 70% moisture and are sold "in natura" at ambient temperature, which can cause their fast deterioration. The present work studied the drying of yam in the form of slices of 1.0 and 2.5 cm thickness, as well as fillets of 1.0 x 1.0 x 5.0 cm, with the drying air varying from 40 to 70 °C. The process was modeled mathematically, allowing the drying to be simulated as a function of the drying air conditions and of the initial and final moisture of the product. The energy expenditure as a function of air temperature was also investigated. Drying in the form of fillets, with air at a temperature between 45 and 50 °C, was shown to be the most viable process when combining product quality and energy expenditure.
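The abstract does not reproduce the fitted drying equation, so as a placeholder the sketch below uses the Lewis (Newton) thin-layer model, MR = exp(-kt), a common starting point for this kind of simulation; in practice the rate constant k would be fitted separately for each air temperature and product geometry.

```python
import numpy as np

def moisture_ratio(t, k):
    """Lewis thin-layer model: moisture ratio MR = exp(-k t)."""
    return np.exp(-k * t)

def drying_time(m0, mf, me, k):
    """Time to dry from initial moisture m0 to final moisture mf (dry
    basis), given equilibrium moisture me, by inverting the Lewis model."""
    mr = (mf - me) / (m0 - me)
    return -np.log(mr) / k
```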
Abstract:
Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive; yet real-time performance, high accuracy and low power consumption are essential measures of such a system. When all these requirements are combined, hardware acceleration of these algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of these hardware acceleration solutions: which algorithms have been implemented in hardware and what modifications have been made to adapt these algorithms to hardware.
Abstract:
Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably presented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
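One family of algorithms such surveys typically cover is edge-collapse simplification driven by the quadric error metric of Garland and Heckbert. The sketch below (ours, not code from the paper) shows the core of that metric: accumulate a plane quadric per triangle and score a candidate collapse by the quadric error of the merged vertex position.

```python
import numpy as np

def plane_quadric(a, b, c):
    """Fundamental quadric K = p p^T of the plane through triangle
    (a, b, c), where p = (nx, ny, nz, d) with unit normal n."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)
    p = np.append(n, -n @ a)
    return np.outer(p, p)

def collapse_cost(q_sum, v):
    """Quadric error v^T Q v of placing the merged vertex at v, where
    Q sums the quadrics accumulated at both endpoints of the edge."""
    vh = np.append(v, 1.0)
    return vh @ q_sum @ vh
```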
Abstract:
Tightening requirements in the manufacturing industry with respect to time-to-market, quality, cost-efficiency and safety create pressure to seek new ways of working. Often it is not possible to test the control algorithms of equipment against the real equipment, leaving virtual modeling of the real equipment as the only predictive option. One of the new practices is virtual commissioning, in which a production line or piece of equipment is modeled and its behavior is simulated in order to improve and verify the control algorithms. The goal of this master's thesis was to implement a virtual commissioning environment in which a 3D-modeled virtual model of the equipment can be controlled in real time by the control hardware of the real equipment. The ultimate aim of implementing the commissioning environment is to investigate what benefits it can bring to the design and commissioning of automation systems at Outotec (Finland) Oy in order to meet the tightening requirements. The commissioning environment implemented in this work can simulate the operation of a part of the 3D-modeled equipment in real time. The requirements derived from the characteristics of the real equipment were not met, for cost reasons, since it was first desired to verify the capabilities, functionality and suitability of the chosen platform. The implementation is nevertheless considered to meet the criterion of soft real time, with a cycle time of about 40 ms and a reaction time of 80 ms. The implemented virtual commissioning environment proved functional and suitable, and it was found to bring potential benefits to Outotec (Finland) Oy, for example improved touch-screen visualization, the possibility of hybrid commissioning, and the development of automation controls. Based on this work, it will be assessed whether Outotec needs to proceed, on the chosen platform, to a real-time implementation that meets the timing requirements of the real equipment, as outlined in the thesis.
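To illustrate the soft real-time criterion reported above (a roughly 40 ms cycle with occasional overruns tolerated), here is a minimal fixed-cycle stepping loop; the function names and the miss budget are ours, and step() merely stands in for advancing the 3D model from the control hardware's I/O.

```python
import time

CYCLE = 0.040  # 40 ms target cycle, per the reported soft real-time level

def run_soft_realtime(step, n_cycles=1000, miss_budget=0.05):
    """Call step(dt) every CYCLE seconds; soft real time means rare
    deadline misses are tolerated rather than treated as failures."""
    misses = 0
    next_t = time.monotonic()
    for _ in range(n_cycles):
        step(CYCLE)
        next_t += CYCLE
        delay = next_t - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        else:
            misses += 1       # overrun: acceptable if rare enough
    return misses / n_cycles <= miss_budget
```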