980 results for Algorithm efficiency


Relevance:

20.00%

Publisher:

Abstract:

This work develops a method for solving ordinary differential equations, that is, initial-value problems, whose solutions are approximated by Legendre polynomials. An iterative procedure for adjusting the polynomial coefficients, based on a genetic algorithm, is developed. The procedure is applied to several examples, comparing its results with the best polynomial fit when numerical solutions from the traditional Runge-Kutta or Adams methods are available. The resulting algorithm provides reliable solutions even when numerical solutions are unavailable, that is, when the mass matrix is singular or the equation leads to unstable integration.
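
As a hedged illustration of the idea, the sketch below fits Legendre-series coefficients to the toy initial-value problem y' = -y, y(0) = 1 with a basic genetic algorithm; the test equation, the GA operators, and all hyperparameters are assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): fit Legendre-series
# coefficients to the IVP y' = -y, y(0) = 1 on [0, 1] with a basic GA.
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(0)
DEG, POP, GEN = 6, 200, 300
x = np.linspace(0.0, 1.0, 50)          # collocation points

def fitness(c):
    """Residual of y' + y = 0 plus a penalty on the initial condition."""
    y, dy = L.legval(x, c), L.legval(x, L.legder(c))
    return np.mean((dy + y) ** 2) + (L.legval(0.0, c) - 1.0) ** 2

pop = rng.normal(0.0, 1.0, size=(POP, DEG + 1))
for _ in range(GEN):
    f = np.array([fitness(c) for c in pop])
    elite = pop[np.argsort(f)[:POP // 4]]   # keep the best quarter
    # offspring: uniform crossover of two random elites + Gaussian mutation
    pa = elite[rng.integers(len(elite), size=POP)]
    pb = elite[rng.integers(len(elite), size=POP)]
    mask = rng.random((POP, DEG + 1)) < 0.5
    pop = np.where(mask, pa, pb) + rng.normal(0.0, 0.05, (POP, DEG + 1))
    pop[:len(elite)] = elite                # elitism

best = min(pop, key=fitness)
print("max error vs exp(-x):", np.abs(L.legval(x, best) - np.exp(-x)).max())
```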

Relevance:

20.00%

Publisher:

Abstract:

The objective of the present study was to estimate (co)variance components for length of productive life (LPL) and some alternative reproductive traits of 6-year-old Nellore cattle. The data set contained 57,410 age-at-first-calving records from Nellore females and was edited to remove animals with uncertain paternity and cows with only one calving record. Only animals with age at first calving between 23 and 48 months and calving intervals between 11 and 24 months were kept for analysis. LPL and life production (LP) were used to describe productive life. LPL was defined as the number of months a cow was kept in the herd until she was 6 years old, given that she was alive at first calving, and LP was defined as the total number of calves in that time. Four traits were used to describe reproduction: two breeding efficiencies on the original scale, estimated with the Wilcox and Tomar functions (BEW and BET, respectively), and their arcsine-transformed counterparts (ASBEW and ASBET, respectively), obtained as arcsin(sqrt(BE_i/100)). Estimates of heritability for LPL and LP were low, ranging from 0.04 to 0.05. Estimates of heritability for breeding efficiencies on the original and transformed scales ranged from 0.18 to 0.32. Estimates of genetic correlations ranged from -0.57 to 0.79 between LPL and the other traits and from 0.28 to 0.63 between LP and the other traits.
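
A minimal sketch of the transformation used for ASBEW and ASBET, assuming hypothetical breeding-efficiency percentages:

```python
# Toy illustration of the arcsine square-root transform applied to
# breeding-efficiency percentages; the sample values are made up.
import numpy as np

be = np.array([72.5, 88.0, 65.3, 91.2])   # hypothetical BE values (%)
asbe = np.arcsin(np.sqrt(be / 100.0))      # arcsin(sqrt(BE/100)), radians
print(asbe.round(3))
```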

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new statistical algorithm to estimate rainfall over the Amazon Basin region using the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). The algorithm relies on empirical relationships, derived for different raining-type systems, between coincident measurements of surface rainfall rate and 85-GHz polarization-corrected brightness temperature as observed by the precipitation radar (PR) and TMI on board the TRMM satellite. The scheme includes rain/no-rain area delineation (screening) and system-type classification routines for rain retrieval. The algorithm is validated against independent measurements of the TRMM-PR and S-band dual-polarization Doppler radar (S-Pol) surface rainfall data for two different periods. Moreover, the performance of this rainfall estimation technique is evaluated against well-known methods, namely, the TRMM-2A12 [the Goddard profiling algorithm (GPROF)], the Goddard scattering algorithm (GSCAT), and the National Environmental Satellite, Data, and Information Service (NESDIS) algorithms. The proposed algorithm shows a normalized bias of approximately 23% for both PR and S-Pol ground truth datasets and a mean error of 0.244 mm h⁻¹ (PR) and -0.157 mm h⁻¹ (S-Pol). For rain volume estimates using PR as reference, a correlation coefficient of 0.939 and a normalized bias of 0.039 were found. With respect to rainfall distributions and rain area comparisons, the results showed that the proposed formulation is efficient and compatible with the physics and dynamics of the observed systems over the area of interest. Among the other algorithms, GSCAT presented a low normalized bias for rain areas and rain volume [0.346 (PR) and 0.361 (S-Pol)], and GPROF showed a rainfall distribution similar to that of the PR and S-Pol but with a bimodal shape. Last, the five algorithms were evaluated during the TRMM Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) 1999 field campaign to verify the precipitation characteristics observed during the easterly and westerly Amazon wind-flow regimes. The proposed algorithm reproduced the observed cumulative rainfall distribution during the easterly regime but underestimated rainfall rates above 5 mm h⁻¹ during the westerly period. NESDIS(1) overestimated in both wind regimes but gave the best westerly representation. NESDIS(2), GSCAT, and GPROF underestimated in both regimes, but GPROF was closest to the observations during the easterly flow.
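
The sketch below illustrates the screening-plus-classification structure described above, not the published algorithm: the PCT thresholds and power-law coefficients are placeholders, not the paper's calibrated regressions.

```python
# Schematic retrieval: screen raining pixels by an 85-GHz PCT threshold,
# classify by depression depth, then apply a per-class empirical power law
# R = a * (255 - PCT)^b. All thresholds and coefficients are invented.
import numpy as np

def retrieve_rain(pct85):
    pct85 = np.asarray(pct85, dtype=float)
    rain = np.zeros_like(pct85)
    raining = pct85 < 255.0                      # rain/no-rain screening
    convective = pct85 < 225.0                   # crude system-type split
    depr = np.clip(255.0 - pct85, 0.0, None)     # scattering depression (K)
    strat = raining & ~convective
    rain[strat] = 0.02 * depr[strat] ** 1.1
    rain[convective] = 0.05 * depr[convective] ** 1.3
    return rain                                  # mm h^-1

print(retrieve_rain([280.0, 245.0, 210.0, 180.0]).round(2))
```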

Relevance:

20.00%

Publisher:

Abstract:

Context. The luminous material in clusters of galaxies exists in two forms: the visible galaxies and the X-ray emitting intra-cluster medium. The hot intra-cluster gas is the major observed baryonic component of clusters, about six times more massive than the stellar component; the mass contained within visible galaxies is approximately 3% of the dynamical mass. Aims. Our aim was to analyze both baryonic components, combining X-ray and optical data for a sample of five galaxy clusters (Abell 496, 1689, 2050, 2631 and 2667) within the redshift range 0.03 < z < 0.3, and to determine the contribution of stars in galaxies and of the intra-cluster medium to the total baryon budget. Methods. We used public XMM-Newton data to determine the gas mass and to identify X-ray substructures. Using the optical counterparts from SDSS or CFHT, we determined the stellar contribution. Results. We examine the relative contributions of galaxies, intra-cluster light and the intra-cluster medium to the cluster baryon budget through the stellar-to-gas mass ratio, estimated with recent data. We find that the stellar-to-gas mass ratio within r_500 (the radius within which the mean cluster density exceeds the critical density by a factor of 500) is anti-correlated with the ICM temperature: the ratio falls from 24% to 6% as the temperature rises from 4.0 to 8.3 keV. This indicates that less massive, cooler clusters are more prolific star-forming environments than massive, hot clusters.
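
As an illustration of the quantity driving the result, the snippet below computes stellar-to-gas mass ratios and their correlation with ICM temperature; the five sets of masses and temperatures are invented numbers, not the measured values.

```python
# Illustrative only: stellar-to-gas mass ratios within r_500 and their
# anti-correlation with ICM temperature. Masses and kT are hypothetical.
import numpy as np

m_star = np.array([2.1, 3.0, 1.8, 2.5, 2.9]) * 1e12   # Msun (hypothetical)
m_gas  = np.array([0.9, 3.5, 1.1, 2.8, 4.1]) * 1e13   # Msun (hypothetical)
kT     = np.array([4.0, 7.5, 4.8, 6.2, 8.3])          # keV (hypothetical)

ratio = m_star / m_gas
r = np.corrcoef(kT, ratio)[0, 1]                      # Pearson correlation
print(f"stellar-to-gas ratios (%): {np.round(100 * ratio, 1)}")
print(f"correlation with kT: {r:.2f}")                # negative => anti-corr.
```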

Relevance:

20.00%

Publisher:

Abstract:

Context. B[e] supergiants are luminous, massive post-main-sequence stars exhibiting non-spherical winds, forbidden lines, and hot dust in a disc-like structure. The physical properties of their rich and complex circumstellar environments (CSEs) are not well understood, partly because these CSEs cannot be easily resolved at the large distances of B[e] supergiants (typically ≳ 1 kpc). Aims. From mid-IR spectro-interferometric observations obtained with VLTI/MIDI, we seek to resolve and study the CSE of the Galactic B[e] supergiant CPD-57° 2874. Methods. For a physical interpretation of the observables (visibilities and spectrum) we use our ray-tracing radiative transfer code (FRACS), which is optimised for thermal spectro-interferometric observations. Results. Thanks to the short computing time required by FRACS (< 10 s per monochromatic model), best-fit parameters and uncertainties for several physical quantities of CPD-57° 2874 were obtained, such as the inner dust radius, the relative flux contributions of the central source and of the dusty CSE, the dust temperature profile, and the disc inclination. Conclusions. The analysis of VLTI/MIDI data with FRACS allowed one of the first direct determinations of the physical parameters of the dusty CSE of a B[e] supergiant based on interferometric data and a full model-fitting approach. In a larger context, the study of B[e] supergiants is important for a deeper understanding of the complex structure and evolution of hot, massive stars.
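
The fitting strategy can be sketched with a far simpler model than FRACS: the toy below least-squares fits a uniform-disc angular diameter to fabricated mid-IR visibilities, standing in for the full ray-tracing model.

```python
# Sketch of the model-fitting idea only (FRACS itself is a ray-tracing
# radiative-transfer code): least-squares fit of a uniform-disc angular
# diameter to mid-IR visibilities. The data points are fabricated.
import numpy as np
from scipy.special import j1

MAS = np.pi / 180.0 / 3600.0 / 1000.0          # milliarcsec -> radians

def vis_disc(baseline_m, theta_mas, lam=10e-6):
    """Uniform-disc visibility, V = 2 J1(x)/x with x = pi*theta*B/lambda."""
    x = np.pi * theta_mas * MAS * baseline_m / lam
    return np.where(x > 0, 2.0 * j1(x) / np.maximum(x, 1e-12), 1.0)

b = np.array([20.0, 40.0, 60.0, 80.0])         # baselines (m), invented
v_obs = np.array([0.92, 0.71, 0.48, 0.27])     # "observed" visibilities
v_err = 0.05

grid = np.linspace(1.0, 40.0, 400)             # candidate diameters (mas)
chi2 = [np.sum(((vis_disc(b, t) - v_obs) / v_err) ** 2) for t in grid]
print(f"best-fit diameter: {grid[int(np.argmin(chi2))]:.1f} mas")
```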

Relevance:

20.00%

Publisher:

Abstract:

We study the star/galaxy classification efficiency of 13 different decision tree algorithms applied to photometric objects in the Sloan Digital Sky Survey Data Release Seven (SDSS-DR7). Each algorithm is defined by a set of parameters which, when varied, produce different final classification trees. We extensively explore the parameter space of each algorithm, using the set of 884,126 SDSS objects with spectroscopic data as the training set. The efficiency of star-galaxy separation is measured using the completeness function. We find that the Functional Tree (FT) algorithm yields the best results as measured by the mean completeness in two magnitude intervals: 14 ≤ r ≤ 21 (85.2%) and r ≥ 19 (82.1%). We compare the performance of the tree generated with the optimal FT configuration to the classifications provided by the SDSS parametric classifier, 2DPHOT, and Ball et al. We find that our FT classifier is comparable to or better in completeness over the full magnitude range 15 ≤ r ≤ 21, with much lower contamination than all but the Ball et al. classifier. At the faintest magnitudes (r > 19), our classifier is the only one that maintains high completeness (> 80%) while simultaneously achieving low contamination (~2.5%). We also examine the SDSS parametric classifier (psfMag - modelMag) to see whether the dividing line between stars and galaxies can be adjusted to improve it. We find that stars in close pairs are currently often misclassified as galaxies, and we suggest a new cut to improve the classifier. Finally, we apply our FT classifier to separate stars from galaxies in the full set of 69,545,326 SDSS photometric objects in the magnitude range 14 ≤ r ≤ 21.
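
A rough stand-in for the approach: the Functional Tree implementation used in the paper comes from Weka, so the sketch below uses a scikit-learn decision tree on synthetic photometric features and computes completeness and contamination as defined above; all data are simulated.

```python
# Stand-in sketch: a scikit-learn CART tree plays the role of the FT
# classifier on synthetic photometry. Completeness = fraction of true
# galaxies recovered; contamination = stars among predicted galaxies.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 20000
is_galaxy = rng.random(n) < 0.6
# psfMag - modelMag concentrates near 0 for stars, larger for galaxies
conc = np.where(is_galaxy, rng.normal(0.35, 0.15, n), rng.normal(0.02, 0.05, n))
r_mag = rng.uniform(14.0, 21.0, n)
X = np.column_stack([r_mag, conc])

tree = DecisionTreeClassifier(max_depth=6, min_samples_leaf=50)
tree.fit(X[: n // 2], is_galaxy[: n // 2])
pred = tree.predict(X[n // 2 :])
truth = is_galaxy[n // 2 :]

completeness = (pred & truth).sum() / truth.sum()
contamination = (pred & ~truth).sum() / max(pred.sum(), 1)
print(f"completeness={completeness:.3f} contamination={contamination:.3f}")
```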

Relevance:

20.00%

Publisher:

Abstract:

The importance of interface effects in organic devices has long been recognized, but detailed knowledge of the extent of such effects remains a major challenge because they are difficult to distinguish from bulk effects. This paper addresses the interface effects on the emission efficiency of poly(p-phenylene vinylene) (PPV) by producing layer-by-layer (LBL) films of PPV alternated with dodecylbenzenesulfonate. In films with thickness varying from ~15 to 225 nm, the structural defects were controlled empirically by converting the films at two temperatures, 110 and 230 °C, while the optical properties were characterized using optical absorption, photoluminescence (PL), and photoluminescence excitation spectra. Blueshifts in the absorption and PL spectra for LBL films with fewer than 25 bilayers (< 40-50 nm) pointed to a larger number of PPV segments with a low degree of conjugation, regardless of the conversion temperature. For these thin films, the mean free path for diffusion of photoexcited carriers decreased, and energy transfer may have been hampered owing to the low mobility of the excited carriers. The emission efficiency was then found to depend on the concentration of structural defects, i.e., on the conversion temperature. For thick films with more than 25 bilayers, on the other hand, the PL signal did not depend on the PPV conversion temperature. We also checked that the interface effects were not caused by waveguiding of the excitation light. Overall, the electronic states at the interface were more localized, and this applied to film thicknesses of up to 40-50 nm. Because this is a typical film thickness in devices, the implication is that interface phenomena should be a primary concern in the design of any organic device. (C) 2011 American Institute of Physics. [doi:10.1063/1.3622143]

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of estimating the interaction neighborhood from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interacting neighborhoods: the selection rule chooses a small prior set of possible interacting points, and a cutting step removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, on the structure of the interaction graph, or on the range of the interactions, which provides a large class of applications for our results. We give a computationally efficient procedure for these models and, finally, demonstrate the practical efficiency of our approach in a simulation study.
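
A toy version of the two-step idea (not the paper's estimator) is sketched below: candidate interaction neighborhoods of one site of a small Ising chain are scored by a penalized empirical conditional log-likelihood; the Gibbs sampler, the penalty form, and all constants are assumptions.

```python
# Toy neighborhood selection: score candidate neighborhoods of site 1 of a
# 4-site Ising chain by BIC-penalized empirical conditional log-likelihood.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_sites, beta, n_samp = 4, 0.8, 4000

# Gibbs sampling from an Ising chain: the neighbors of i are i-1 and i+1
x = rng.choice([-1, 1], size=n_sites)
samples = np.empty((n_samp, n_sites), dtype=int)
for t in range(n_samp * 5):
    i = rng.integers(n_sites)
    h = beta * sum(x[j] for j in (i - 1, i + 1) if 0 <= j < n_sites)
    x[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * h)) else -1
    if t % 5 == 4:
        samples[t // 5] = x

def score(nbhd, site=1):
    """Empirical conditional log-likelihood of `site` given `nbhd`, penalized."""
    ll = 0.0
    ctx = samples[:, list(nbhd)] if nbhd else np.zeros((n_samp, 0), int)
    for key in {tuple(r) for r in ctx}:
        sel = np.all(ctx == key, axis=1)
        p = np.clip(samples[sel, site].mean() * 0.5 + 0.5, 1e-6, 1 - 1e-6)
        y = (samples[sel, site] + 1) // 2
        ll += (y * np.log(p) + (1 - y) * np.log(1 - p)).sum()
    return ll - 0.5 * (2 ** len(nbhd)) * np.log(n_samp)   # BIC-style penalty

cands = [c for k in range(3) for c in itertools.combinations([0, 2, 3], k)]
print(max(cands, key=score))   # expect the true neighborhood (0, 2)
```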

Relevance:

20.00%

Publisher:

Abstract:

Estimates of greenhouse-gas emissions from deforestation are highly uncertain because of high variability in key parameters and the limited number of studies providing field measurements of them. One such parameter is burning efficiency, which determines how much of the original forest's aboveground carbon stock is released in the burn, as well as how much is later released by decay and how much remains as charcoal. In this paper we examined the fate of biomass from a semideciduous tropical forest in the "arc of deforestation," where clearing activity is concentrated along the southern edge of the Amazon forest. We estimated carbon content, charcoal formation and burning efficiency by direct measurements (cutting and weighing) and by line-intersect sampling (LIS) carried out along the axis of each plot before and after burning of the felled vegetation. The total aboveground dry biomass found here (219.3 Mg ha⁻¹) is lower than the values found in studies in other parts of the Amazon region. Values for burning efficiency (65%) and charcoal formation (6.0%, or 5.98 Mg C ha⁻¹) were much higher than those found in past studies in tropical areas. The percentage of trunk biomass lost in burning (49%) was also substantially higher than in previous studies. This difference may be explained by the concentration of stems in the smaller diameter classes and the low humidity of the fuel (the dry season was unusually long in 2007, the year of the burn). This study provides the first measurements of forest burning parameters for a group of forest types that is now undergoing rapid deforestation. The burning parameters estimated here indicate substantially higher burning efficiency than has been found in other Amazonian forest types. Quantification of burning efficiency is critical to estimates of trace-gas emissions from deforestation. (C) 2009 Elsevier B.V. All rights reserved.
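
A back-of-envelope version of the carbon bookkeeping implied by these figures, assuming a typical carbon fraction of dry biomass (the 0.485 value is an assumption, not from the study):

```python
# Back-of-envelope carbon bookkeeping from the figures in the abstract;
# the 0.485 carbon fraction of dry biomass is an assumed typical value.
biomass = 219.3                 # aboveground dry biomass, Mg ha^-1
c_frac = 0.485                  # assumed carbon content of dry biomass
c_stock = biomass * c_frac      # ~106 Mg C ha^-1
released = c_stock * 0.65       # 65% burning efficiency
charcoal = c_stock * 0.060      # 6.0% charcoal formation
decay_pool = c_stock - released - charcoal
print(f"stock={c_stock:.1f} burned={released:.1f} "
      f"charcoal={charcoal:.1f} left to decay={decay_pool:.1f} Mg C/ha")
```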

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to test the hypotheses that, in obese children: (1) ventilatory efficiency (VentE) is decreased during graded exercise; (2) weight loss through diet alone (D) improves VentE; and (3) diet combined with exercise training (DET) improves VentE more than D alone. Thirty-eight obese children (10 ± 0.2 years; BMI > 95th percentile) were randomly divided into two study groups: D (n = 17; BMI = 30 ± 1 kg/m²) and DET (n = 21; 28 ± 1 kg/m²). Ten lean children were included as a control group (10 ± 0.3 years; 17 ± 0.5 kg/m²). All children performed maximal treadmill testing with breath-by-breath respiratory gas analysis to determine the ventilatory anaerobic threshold (VAT) and peak oxygen consumption (VO2peak). VentE was determined by the VE/VCO2 method at the VAT. Obese children showed lower VO2peak and lower VentE than controls (p < 0.05). After the interventions, all obese children had reduced body weight (p < 0.05). The D group did not improve in terms of VO2peak or VentE (p > 0.05). In contrast, the DET group showed increased VO2peak (p = 0.01) and improved VentE (ΔVE/VCO2 = -6.1 ± 0.9; p = 0.01). In conclusion, VentE is decreased in obese children, and weight loss by means of DET, but not D alone, improves VentE and cardiorespiratory fitness during graded exercise.
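
A minimal sketch of the VE/VCO2 index at the VAT, with fabricated breath-by-breath values:

```python
# Sketch of the VE/VCO2 ventilatory-efficiency index at the anaerobic
# threshold; the breath-by-breath values below are fabricated.
import numpy as np

ve   = np.array([18.2, 21.5, 25.1, 29.8, 35.4])  # ventilation, L/min
vco2 = np.array([0.55, 0.68, 0.82, 1.01, 1.24])  # CO2 output, L/min
i_vat = 2                                         # index of the VAT breath

vent_e = ve[i_vat] / vco2[i_vat]                  # lower = more efficient
print(f"VE/VCO2 at VAT = {vent_e:.1f}")
```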

Relevance:

20.00%

Publisher:

Abstract:

The enzymatic hydrolysis of sugarcane bagasse was investigated by treating peroxide-alkaline-pretreated bagasse with pineapple stem juice, xylanase and cellulase. Pre-treatment procedures for sugarcane bagasse with alkaline hydrogen peroxide were evaluated and compared. Analyses were performed using a 2⁴ factorial design, with pre-treatment time, temperature, magnesium sulfate concentration and hydrogen peroxide concentration as factors. The responses evaluated were the yields of cellobiose and glucose released from pretreated bagasse after enzymatic hydrolysis. The results show that the highest enzymatic conversion was obtained for bagasse treated with 2% hydrogen peroxide at 60 °C for 16 h in the presence of 0.5% magnesium sulfate. Bagasse (5%) was then treated with pineapple stem extract, which contains a mixture of protease and esterase, in combination with xylanase and cellulase. The amount of glucose and cellobiose released from the bagasse increased with the enzyme mixture. It is believed that the enzymes present in pineapple extracts are capable of hydrolyzing specific linkages, facilitating the action of the cell-wall-digesting enzymes. This increases the amount of glucose and other hexoses released during the enzymatic treatment and also reduces the amount of cellulase needed in a typical hydrolysis. (C) 2010 Elsevier Ltd. All rights reserved.
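
A minimal sketch of the 2⁴ factorial analysis, assuming fabricated yield responses:

```python
# Minimal 2^4 factorial-design sketch: build the 16-run design matrix and
# estimate main effects on yield; the response values are fabricated.
import itertools
import numpy as np

factors = ["time", "temperature", "MgSO4", "H2O2"]
design = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 runs
rng = np.random.default_rng(3)
# hypothetical glucose yields; imagine H2O2 and temperature dominate
yield_g = 10 + 3 * design[:, 3] + 2 * design[:, 1] + rng.normal(0, 0.5, 16)

for name, col in zip(factors, design.T):
    effect = yield_g[col == 1].mean() - yield_g[col == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```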

Relevance:

20.00%

Publisher:

Abstract:

Power loss reduction in distribution systems (DSs) is a nonlinear, multiobjective problem. Service restoration in DSs is computationally harder still, since it additionally requires a solution in real time. Both DS problems are computationally complex: for large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the solution simpler. In turn, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) is the approach proposed here for solving DS problems on large-scale networks. Simulation results show that MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, MEAN exhibits a sublinear running time as a function of system size. Tests with networks ranging from 632 to 5166 switches indicate that MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively little running time.
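
A sketch of the property that makes the NDE convenient: storing the tree as (node, depth) pairs in depth-first order makes every subtree a contiguous slice, so reconfiguration operators can move whole subtrees cheaply. The example feeder below is invented and the operator simplified.

```python
# Sketch of the node-depth encoding (NDE) idea: a tree stored as a list of
# (node, depth) pairs in DFS order, so a subtree is one contiguous slice.
def subtree(nde, i):
    """Return the slice of `nde` rooted at position i (contiguous by DFS)."""
    d = nde[i][1]
    j = i + 1
    while j < len(nde) and nde[j][1] > d:
        j += 1
    return nde[i:j]

# feeder rooted at bus 0: 0 -> {1 -> {3, 4}, 2 -> {5}}
nde = [(0, 0), (1, 1), (3, 2), (4, 2), (2, 1), (5, 2)]
print(subtree(nde, 1))   # everything fed through bus 1: [(1,1),(3,2),(4,2)]
```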

Relevance:

20.00%

Publisher:

Abstract:

The productivity of commonly available disassembly methods today seldom makes disassembly the preferred end-of-life solution for massive take-back product streams: systematic reuse of parts or components, or recycling of pure material fractions, is often not achievable in an economically sustainable way. In this paper, a case-based review of current disassembly practices is used to analyse the factors influencing disassembly feasibility. Data mining techniques were used to identify the major factors influencing the profitability of disassembly operations; case characteristics such as involvement of the product manufacturer in the end-of-life treatment and continuous ownership are among the important dimensions. Economic models demonstrate that the efficiency of disassembly operations should be increased by an order of magnitude to ensure the competitiveness of ecologically preferred, disassembly-oriented end-of-life scenarios for large waste electrical and electronic equipment (WEEE) streams. Technological means available to increase the productivity of disassembly operations are summarized. Automated disassembly techniques can contribute to the robustness of the process, but they cannot close the efficiency gap unless combined with appropriate product design measures. Innovative reversible joints, collectively activated by external trigger signals, form a promising approach to low-cost mass disassembly in this context. A short overview of the state of the art in the development of such self-disassembling joints is included. (c) 2008 CIRP.

Relevance:

20.00%

Publisher:

Abstract:

In this article a novel algorithm based on the chemotaxis of Escherichia coli is developed to solve multiobjective optimization problems. The algorithm uses a fast nondominated sorting procedure, communication between colony members, and a simple chemotactic strategy to change the bacterial positions, exploring the search space to find several optimal solutions. The proposed algorithm is validated on 11 benchmark problems using three different performance measures to compare its performance with the NSGA-II genetic algorithm and the particle-swarm-based algorithm NSPSO. (C) 2009 Elsevier Ltd. All rights reserved.
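
The nondominated-sorting step named above is Deb's fast nondominated sort from NSGA-II; a self-contained version (minimizing all objectives, with made-up example points) is sketched below.

```python
# Deb's fast nondominated sorting: rank solutions into Pareto fronts.
def fast_nondominated_sort(points):
    dominates = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
    s = [[] for _ in points]          # s[i]: solutions dominated by i
    n = [0] * len(points)             # n[i]: count of solutions dominating i
    fronts = [[]]
    for i, p in enumerate(points):
        for j, q in enumerate(points):
            if dominates(p, q):
                s[i].append(j)
            elif dominates(q, p):
                n[i] += 1
        if n[i] == 0:
            fronts[0].append(i)
    while fronts[-1]:                 # peel off successive fronts
        nxt = []
        for i in fronts[-1]:
            for j in s[i]:
                n[j] -= 1
                if n[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 4)]
print(fast_nondominated_sort(pts))    # [[0, 1, 2], [3], [4]]
```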

Relevance:

20.00%

Publisher:

Abstract:

The general flowshop scheduling problem is a production problem where a set of n jobs must be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop so as to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization in flowshop scheduling. The CGA was proposed recently as an alternative to traditional GA approaches, in particular for evaluating schemata directly. The population, initially formed only of schemata, evolves under recombination into a population of well-adapted structures (schemata instantiation). The CGA implemented here is based on the classic NEH heuristic, with a local search heuristic used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
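
The NEH heuristic that the CGA builds on can be sketched compactly: jobs are ordered by decreasing total processing time, then each is inserted at the position minimizing the partial-schedule makespan. The processing times below are a toy instance.

```python
# NEH constructive heuristic for the permutation flowshop makespan problem.
def makespan(seq, p):
    m = len(p[0])
    c = [0.0] * m                     # completion time on each machine
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def neh(p):
    jobs = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = [jobs[0]]
    for j in jobs[1:]:                # try every insertion position
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p))
    return seq

p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]   # p[job][machine]
best = neh(p)
print(best, makespan(best, p))
```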