20 results for benchmark

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

10.00%

Abstract:

We aimed to develop site-specific sediment quality guidelines (SQGs) for two estuarine and port zones in Southeastern Brazil (Santos Estuarine System and Paranaguá Estuarine System) and three in Southern Spain (Ría of Huelva, Bay of Cádiz, and Bay of Algeciras), and to compare these values against national and traditionally used international benchmark values. Site-specific SQGs were derived from sediment physical-chemical, toxicological, and benthic community data integrated through multivariate analysis. This technique allowed the identification of chemicals of concern and the establishment of effect ranges correlated with individual contaminant concentrations for each study site. The results revealed that sediments from the Santos channel, as well as inner portions of the SES, are highly polluted (exceeding SQGs-high) by metals, PAHs and PCBs. High pollution by PAHs and some metals was found in the São Vicente channel. In the PES, sediments from inner portions (near the Ponta do Mix port's terminal and the Port of Paranaguá) are highly polluted by metals and PAHs, including one zone inside the limits of an environmental protection area. In the Gulf of Cádiz, SQG exceedances were found in the Ría of Huelva (all analysed metals and PAHs), in the surroundings of the Port of Cádiz (Bay of Cádiz) (metals), and in the Bay of Algeciras (Ni and PAHs). The site-specific SQGs derived in this study are more restrictive than the national SQGs applied in Brazil and Spain, as well as international guidelines. This finding confirms the importance of developing site-specific SQGs to support the characterisation of sediments and dredged material. The use of the same methodology to derive SQGs in Brazilian and Spanish port zones confirmed the applicability of this technique on an international scale and provided a harmonised methodology for site-specific SQG derivation. (C) 2009 Elsevier B.V. All rights reserved.
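The screening step behind such guidelines can be sketched simply: each measured concentration is compared against a low and a high guideline value per contaminant. The thresholds, contaminant names and class labels below are purely illustrative, not the site-specific SQGs derived in the study.

```python
# Illustrative-only screening of measured sediment concentrations (mg/kg
# dry weight) against a pair of guideline values per contaminant; these
# thresholds are hypothetical, not values from the paper.
SQGS = {
    "Pb": (46.7, 218.0),
    "Zn": (150.0, 410.0),
    "PAHs": (4.0, 44.8),
}

def classify(sample):
    """Return a pollution class per contaminant for one sediment sample."""
    result = {}
    for chem, conc in sample.items():
        low, high = SQGS[chem]
        if conc >= high:
            result[chem] = "highly polluted (exceeds SQG-high)"
        elif conc >= low:
            result[chem] = "moderately polluted"
        else:
            result[chem] = "unpolluted"
    return result
```

A sample is then flagged for further toxicological or benthic assessment whenever any contaminant falls in the upper class.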

Relevance:

10.00%

Abstract:

In this work, we report the construction of potential energy surfaces for the ³A″ and ³A′ states of the system O(³P) + HBr. These surfaces are based on extensive ab initio calculations employing the MRCI+Q/CBS+SO level of theory. The complete basis set energies were estimated by extrapolation of MRCI+Q/aug-cc-pVnZ(-PP) (n = Q, 5) results, with corrections for spin-orbit effects obtained at the CASSCF/aug-cc-pVTZ(-PP) level of theory. These energies, calculated over the region of configuration space relevant to the reaction O(³P) + HBr → OH + Br, were used to generate functions based on the many-body expansion. The three-body potentials were interpolated using the reproducing kernel Hilbert space method. The resulting surface for the ³A″ electronic state contains van der Waals minima in the entrance and exit channels and a transition state 6.55 kcal/mol above the reactants. This barrier height was then scaled to reproduce the value of 5.01 kcal/mol estimated from coupled cluster benchmark calculations that include high-order and core-valence correlation, as well as scalar relativistic effects. The ³A′ surface was also scaled, based on the fact that these two electronic states are degenerate at the collinear saddle-point geometry. The vibrationally adiabatic barrier heights are 3.44 kcal/mol for the ³A″ state and 4.16 kcal/mol for the ³A′ state. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4705428]
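The abstract does not state which CBS extrapolation formula was used; a common choice for correlation energies from n = 4 (Q) and n = 5 basis sets is the two-point n⁻³ formula, sketched here with illustrative energies.

```python
def cbs_two_point(e_n4, e_n5):
    """Two-point n**-3 extrapolation to the complete-basis-set limit,
    assuming E(n) = E_CBS + A/n**3, from n = 4 (Q) and n = 5 energies.
    (A common scheme; the paper does not specify which formula was used.)"""
    n4, n5 = 4, 5
    return (n5**3 * e_n5 - n4**3 * e_n4) / (n5**3 - n4**3)

# Illustrative correlation energies in hartree (not values from the paper):
e_cbs = cbs_two_point(-0.3500, -0.3550)
```

The extrapolated energy always lies below the n = 5 value, since the residual basis-set error is removed.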

Relevance:

10.00%

Abstract:

The irregular shape packing problem is addressed. The container has a fixed width and an open dimension to be minimized. The proposed algorithm constructs the solution from an ordered list of items and a placement heuristic. Simulated annealing is the metaheuristic adopted to solve the optimization problem. A two-level algorithm is used to minimize the open dimension of the container. To ensure feasible layouts, the concept of the collision-free region is used. A collision-free region represents all possible translations for an item to be placed, and may be degenerate. For a moving item, the proposed placement heuristic detects the presence of exact fits (when the item is fully constrained by its surroundings) and exact slides (when the item position is constrained in all but one direction). The relevance of these positions is analyzed and a new placement heuristic is proposed. Computational comparisons on benchmark problems show that the proposed algorithm generates highly competitive solutions. Moreover, our algorithm improved on some best known results. (C) 2012 Elsevier Ltd. All rights reserved.
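A minimal sketch of the simulated-annealing outer loop over item orderings. The paper places irregular items via collision-free regions; here a naive shelf heuristic for rectangles stands in for the placement heuristic, so the code only illustrates the search strategy, not the geometry.

```python
import math
import random

def shelf_height(order, items, width):
    """Toy stand-in for the placement heuristic: shelf-pack rectangles
    (w, h) into a strip of fixed width and return the used height."""
    x = used = shelf = 0.0
    for i in order:
        w, h = items[i]
        if x + w > width:            # current shelf full: open a new one
            used += shelf
            x, shelf = 0.0, 0.0
        x += w
        shelf = max(shelf, h)
    return used + shelf

def anneal(items, width, t0=1.0, cooling=0.995, steps=2000, seed=0):
    """Simulated annealing over the ordered list of items (swap moves)."""
    rng = random.Random(seed)
    order = list(range(len(items)))
    cur = best = shelf_height(order, items, width)
    best_order = order[:]
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(items)), 2)
        order[i], order[j] = order[j], order[i]
        cand = shelf_height(order, items, width)
        if cand <= cur or rng.random() < math.exp(-(cand - cur) / t):
            cur = cand
            if cur < best:
                best, best_order = cur, order[:]
        else:
            order[i], order[j] = order[j], order[i]   # reject: undo the swap
        t *= cooling
    return best, best_order
```

The two-level structure of the paper would wrap a search over the open dimension around this inner loop.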

Relevance:

10.00%

Abstract:

Based on a structured literature review, the ceramic tile sectors of Italy (the benchmark) and Brazil (the world's second-largest producer and consumer) are compared along four strategic dimensions: regulation, market, technology and strategic management, in order to identify critical risks for a strategically important national sector. The paper proposes guidelines for a strategic re-planning of the Brazilian ceramic tile sector, making Brazilian producers aware of the fragility of the national market (in spite of its recent remarkable growth) and helping policy makers reflect on the need to review strategic planning methods and practice, to design new targeted programs (based on coherence between operations and business strategies), and to provide improved management that strengthens the sector against unfair competition from low-cost producers while enhancing the necessary infrastructure in technology, labor, marketing and quality management. The analysis is limited to the single-firing production technology. This wide-coverage strategic analysis of the Brazilian ceramic tile sector, which until now has received little scientific study, emphasizes the importance of applying a rigorous research methodology and may be valuable to both scholars and practitioners. It also highlights the need for investment in innovation (product design and production technology) and the fundamental role of sector organization, identifying its different dimensions. It is possible to conclude that the recent growth of Brazilian production is not the result of genuine strengthening of the sector or of sound enterprise strategy, but rather of a temporary and favorable economic contingency.

Relevance:

10.00%

Abstract:

Purpose - The purpose of this paper is to develop an efficient numerical algorithm for the self-consistent solution of the Schrödinger and Poisson equations in one-dimensional systems. The goal is to compute the charge-control and capacitance-voltage characteristics of quantum wire transistors. Design/methodology/approach - The paper presents a numerical formulation employing a non-uniform finite-difference discretization scheme, in which the wavefunctions and electronic energy levels are obtained by solving the Schrödinger equation through the split-operator method, while a relaxation method in the FTCS ("Forward Time Centered Space") scheme is used to solve the two-dimensional Poisson equation. Findings - The numerical model is validated against previously published results as a benchmark and then applied to yield the charge-control characteristics and the capacitance-voltage relationship for a split-gate quantum wire device. Originality/value - The paper helps to fulfill the need for C-V models of quantum wire devices. To do so, the authors implemented a straightforward method for calculating the two-dimensional electronic carrier density n(x,y). The formulation reduces the computational procedure to a much simpler problem, similar to the one-dimensional quantization case, significantly diminishing running time.
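The relaxation idea behind the Poisson step can be illustrated in one dimension; this Jacobi sketch is not the paper's 2D FTCS implementation, only the same fixed-point principle of repeatedly averaging neighbors until the update stalls.

```python
def relax_poisson_1d(rho, phi_left, phi_right, dx, eps=1.0,
                     tol=1e-8, max_iter=100000):
    """Jacobi relaxation for the 1D Poisson equation phi'' = -rho/eps
    with fixed (Dirichlet) boundary values.
    Update: phi[i] <- (phi[i-1] + phi[i+1] + rho[i]*dx**2/eps) / 2."""
    n = len(rho)
    phi = [0.0] * n
    phi[0], phi[-1] = phi_left, phi_right
    for _ in range(max_iter):
        new = phi[:]
        for i in range(1, n - 1):
            new[i] = 0.5 * (phi[i - 1] + phi[i + 1] + rho[i] * dx * dx / eps)
        if max(abs(a - b) for a, b in zip(new, phi)) < tol:
            return new
        phi = new
    return phi
```

In the self-consistent loop, the potential produced here would feed back into the Schrödinger solve, whose charge density in turn updates `rho`, until both converge.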

Relevance:

10.00%

Abstract:

The single machine scheduling problem with a common due date and non-identical job ready times is examined in this work. Performance is measured by the minimization of the weighted sum of earliness and tardiness penalties of the jobs. Since this problem is NP-hard, the application of constructive heuristics that exploit specific characteristics of the problem to improve their performance is investigated. The proposed approaches are examined through a comparative computational study on a set of 280 benchmark test problems with up to 1000 jobs.
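The objective being minimized can be written down directly. The sketch below evaluates the weighted earliness-tardiness of a given job sequence, assuming a simple non-delay schedule (no deliberate idle time inserted, which such problems may in general allow).

```python
def weighted_earliness_tardiness(seq, ready, proc, w_early, w_tardy, due):
    """Total weighted earliness + tardiness of a job sequence on a single
    machine with non-identical ready times and a common due date.
    Jobs are processed back-to-back as soon as they are ready."""
    t = 0.0
    total = 0.0
    for j in seq:
        t = max(t, ready[j]) + proc[j]   # completion time of job j
        total += (w_early[j] * max(0.0, due - t)
                  + w_tardy[j] * max(0.0, t - due))
    return total
```

A constructive heuristic would build `seq` incrementally, using this evaluation (or an incremental version of it) to decide which job to append next.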

Relevance:

10.00%

Abstract:

This work presents numerical simulations of two fluid flow problems involving moving free surfaces: the impacting drop and fluid jet buckling. The viscoelastic model used in these simulations is the eXtended Pom-Pom (XPP) model. To validate the code, numerical predictions for the drop impact problem with Newtonian and Oldroyd-B fluids are presented and compared with other methods. In particular, a numerical benchmark for an XPP drop impacting on a rigid plate is performed over a wide range of the relevant parameters. Finally, as an additional application of free surface flows of XPP fluids, the viscous jet buckling problem is simulated and discussed. (C) 2011 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Abstract:

The oxygen reduction reaction (ORR) was studied in KOH electrolyte on carbon-supported ε-manganese dioxide (ε-MnO₂/C). The ε-MnO₂/C catalyst was prepared via thermal decomposition of mixtures of manganese nitrate and carbon powder (Vulcan XC-72). X-ray powder diffraction (XRD) measurements were performed in order to determine the crystalline structure of the resulting composite, while energy dispersive X-ray analysis (EDX) was used to evaluate the chemical composition of the synthesized material. The electrochemical studies were conducted using cyclic voltammetry (CV) and quasi-steady-state polarization measurements carried out in an ultra-thin-layer rotating ring/disk electrode (RRDE) configuration. The electrocatalytic results obtained for 20% (w/w) Pt/C (E-TEK Inc., USA) and for α-MnO₂/C, considered one of the most active manganese oxide based catalysts for the ORR in alkaline media, were included for comparison. The RRDE results revealed that the ORR on the MnO₂ catalysts proceeds preferentially through the complete 4e⁻ reduction pathway via a 2+2e⁻ reduction process involving hydrogen peroxide as an intermediate. In the kinetically controlled region, the ε-MnO₂/C material performed close to the 20% (w/w) Pt/C benchmark and better than α-MnO₂/C, but a higher amount of HO₂⁻ was obtained when ε-MnO₂/C was used as catalyst. The higher production of hydrogen peroxide on ε-MnO₂/C was related to the presence of structural defects typical of this oxide, while the better catalytic performance in the kinetically controlled region compared to α-MnO₂/C was related to its higher electrochemical activity for proton insertion kinetics, which is a structure-sensitive process. (C) 2012 Elsevier Ltd. All rights reserved.
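The 4e⁻ versus 2+2e⁻ pathways are commonly quantified from RRDE disk and ring currents with the standard collection-efficiency relations, sketched here (currents taken as magnitudes; the abstract does not report the actual current values).

```python
def electrons_transferred(i_disk, i_ring, collection_eff):
    """Apparent number of electrons per O2 from RRDE currents:
        n = 4*I_D / (I_D + I_R/N),
    where N is the ring collection efficiency (standard RRDE relation)."""
    return 4.0 * i_disk / (i_disk + i_ring / collection_eff)

def peroxide_fraction(i_disk, i_ring, collection_eff):
    """Fraction of O2 reduced only to peroxide (HO2-):
        X = 2*(I_R/N) / (I_D + I_R/N)."""
    return 2.0 * (i_ring / collection_eff) / (i_disk + i_ring / collection_eff)
```

With zero ring current the apparent n is 4 (complete reduction); as more HO₂⁻ escapes to the ring, n falls toward 2.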

Relevance:

10.00%

Abstract:

Recent research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm through population initialization and different search strategies. Taking a different approach to achieve similar results, this paper presents a technique to discover promising regions in the continuous search space of an optimization problem. Using machine-learning techniques, the algorithm, named Smart Sampling (SS), finds regions with a high probability of containing a global optimum. A metaheuristic can then be initialized inside each region to find that optimum. SS and DE were combined (originating the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE and UQODE authors. Results show that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced, and that the success rate improves, when SS is employed first. These results are also in consonance with the literature on the importance of an adequate starting population. Moreover, SS finds initial populations of superior quality compared to the other three algorithms that employ oppositional learning. Finally, and most importantly, the SS performance in finding promising regions is independent of the metaheuristic with which it is combined, making SS suitable for improving the performance of a large variety of optimization techniques. (C) 2012 Elsevier Inc. All rights reserved.
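For reference, a minimal DE/rand/1/bin loop in pure Python. This is a generic textbook sketch, not the SSDE implementation; in the paper's approach the search bounds would first be restricted to the promising regions found by SS before a loop like this runs.

```python
import random

def differential_evolution(f, bounds, pop_size=20, f_weight=0.5, cr=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimizer over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # three distinct individuals, all different from i
            a, b, c = rng.sample([k for k in range(pop_size) if k != i], 3)
            jrand = rng.randrange(dim)   # force at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == jrand:
                    v = pop[a][j] + f_weight * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:             # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

Replacing the uniform initialization of `pop` with samples drawn from SS-identified regions is exactly the kind of change the paper evaluates.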

Relevance:

10.00%

Abstract:

This article is the product of research that analyzed the work of bus drivers at a public transportation company considered a benchmark in its field of operations, one that strives for operating excellence. Within this context, the authors sought to understand how such a company has managed to maintain a policy capable of reconciling quality public transport with working conditions compatible with the professional development, comfort and health of its workers. Ergonomic work analysis and activity analysis were the guiding methods of this study. Initial analyses indicate that the drivers' activity consists of serving the population and providing mobility for it, which depends both on driving the vehicle itself and on relationships with colleagues, users, pedestrians, other drivers and others.

Relevance:

10.00%

Abstract:

Solution of structural reliability problems by the First Order method requires optimization algorithms to find the smallest distance between a limit state function and the origin of the standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on HLRF, but uses a new differentiable merit function with Wolfe conditions to select the step length in the line search. It is shown in the article that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. The performance and robustness of the new algorithms are compared to those of the classic augmented Lagrangian method, HLRF and the improved HLRF (iHLRF) algorithm, in the solution of 25 benchmark problems from the literature. The new HLRF-based algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as the iHLRF algorithm. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
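The classic HLRF iteration mentioned above can be sketched compactly. This bare version has no merit function or line search, which is precisely why it can fail to converge on hard problems; the paper's contribution is to add those safeguards.

```python
def hlrf(g, u0, tol=1e-8, max_iter=100, h=1e-6):
    """Bare Hasofer-Lind-Rackwitz-Fiessler iteration: find the point on the
    limit state g(u) = 0 closest to the origin of standard Gaussian space.
        u_{k+1} = [(grad_g . u_k - g(u_k)) / |grad_g|^2] * grad_g
    Forward-difference gradient; convergence is not guaranteed in general."""
    u = list(u0)
    for _ in range(max_iter):
        gu = g(u)
        grad = []
        for i in range(len(u)):
            up = u[:]
            up[i] += h
            grad.append((g(up) - gu) / h)
        norm2 = sum(d * d for d in grad)
        scale = (sum(d * ui for d, ui in zip(grad, u)) - gu) / norm2
        new = [scale * d for d in grad]
        if max(abs(a - b) for a, b in zip(new, u)) < tol:
            return new
        u = new
    return u
```

For a linear limit state the iteration lands on the design point in one step; the distance of that point from the origin is the reliability index β.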

Relevance:

10.00%

Abstract:

Transportation planning is currently confronted with a broader planning view, given by the concept of mobility. The Index of Sustainable Urban Mobility (I_SUM) is among the tools developed to support the implementation of this new concept. It is a tool for assessing the current mobility conditions of any city, and it can also be applied in policy formulation. This study focuses on the application of I_SUM to the city of Curitiba, Brazil. Since the city is known worldwide as a reference for successful urban and transportation planning, the index application should confirm this. An additional objective of the study was to evaluate the index itself, i.e., the underlying assessment method and reference values. A global I_SUM value of 0.747 confirmed that the city indeed has very positive characteristics regarding sustainable mobility policies. However, some deficiencies were also detected, particularly with respect to non-motorized transport modes. The application also showed that a few I_SUM indicators were not able to capture some of the positive aspects of the city, which may suggest the need for changes in their formulation. Finally, the index application in parts of the city suggests that the city provides fair and equitable mobility conditions to citizens throughout the city. This is certainly a good attribute for becoming a benchmark of sustainable mobility, even if it is not yet the ideal model. (C) 2012 Elsevier Ltd. All rights reserved.
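Indicator-based indices such as I_SUM aggregate normalized scores with weights. The one-level sketch below only illustrates the aggregation arithmetic; the actual I_SUM hierarchy of domains, themes and indicators, and its weights, are not reproduced here.

```python
def weighted_index(scores, weights):
    """One-level weighted aggregation of normalized indicator scores in
    [0, 1]; weights must sum to 1. (Illustrative only: I_SUM aggregates
    through a hierarchy, and these weights are hypothetical.)"""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))
```

A global value such as 0.747 would be the top of such a hierarchy of weighted sums.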

Relevance:

10.00%

Abstract:

This work studies the optimization and control of a styrene polymerization reactor. The proposed strategy addresses the case in which, because of market conditions and equipment deterioration, the optimal operating point of the continuous reactor changes significantly over the operation time, and the control system has to search for this optimum while keeping the reactor system stable at any possible operating point. The approach considered here consists of three layers: Real Time Optimization (RTO), Model Predictive Control (MPC), and a Target Calculation (TC) layer that coordinates the communication between the other two and guarantees the stability of the whole structure. The proposed algorithm is simulated with the phenomenological model of a styrene polymerization reactor, which has been widely used as a benchmark for process control. The complete optimization structure for the styrene process, including disturbance rejection, is developed. The simulation results show the robustness of the proposed strategy and its capability to handle disturbances while the economic objective is optimized.
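The coordination between the layers can be illustrated with a toy scalar plant x[k+1] = a·x[k] + b·u[k], a hypothetical stand-in for the styrene reactor model: the TC layer projects an unreachable RTO optimum onto the steady states attainable with a bounded input, and a one-step controller then drives the plant to that target.

```python
def target_calculation(x_opt, a, b, u_min, u_max):
    """Toy TC layer: project the RTO optimum onto the steady states
    x_ss = b*u/(1 - a) reachable with u in [u_min, u_max]."""
    u_ss = (1.0 - a) * x_opt / b
    u_ss = min(max(u_ss, u_min), u_max)
    return b * u_ss / (1.0 - a), u_ss

def control_move(x, x_target, a, b, u_min, u_max):
    """One-step (deadbeat-like) move toward the target, saturated.
    A stand-in for the MPC layer of the paper."""
    u = (x_target - a * x) / b
    return min(max(u, u_min), u_max)

# Closed loop: RTO asks for x_opt = 2, but input bounds only allow x = 1.
a, b, u_min, u_max = 0.9, 0.1, 0.0, 1.0
x_target, _ = target_calculation(2.0, a, b, u_min, u_max)
x = 0.0
for _ in range(100):
    x = a * x + b * control_move(x, x_target, a, b, u_min, u_max)
```

The point of the TC layer in the paper is exactly this mediation: the MPC always receives a target it can actually reach, which is what makes the stacked structure stable.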

Relevance:

10.00%

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Due to technical biases, normalization of the intensity levels is a prerequisite to performing further statistical analyses, so choosing a suitable normalization approach can be critical and deserves judicious consideration.

Results: Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods not previously used for normalization, namely kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets.

Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization due to its robustness in estimating the normalization curve.
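As an illustration of one of the compared estimators, a Nadaraya-Watson kernel smoother can be fitted to the log-ratios and subtracted as the normalization curve (data and bandwidth below are illustrative, and this is the estimator the study found least robust).

```python
import math

def kernel_smooth(x, y, x_eval, bandwidth):
    """Nadaraya-Watson regression with a Gaussian kernel: at each point,
    a locally weighted mean of y estimates the trend."""
    out = []
    for xe in x_eval:
        w = [math.exp(-0.5 * ((xe - xi) / bandwidth) ** 2) for xi in x]
        out.append(sum(wi * yi for wi, yi in zip(w, y)) / sum(w))
    return out

def normalize(intensity, log_ratio, bandwidth=1.0):
    """Subtract the fitted intensity-dependent trend from the log-ratios."""
    trend = kernel_smooth(intensity, log_ratio, intensity, bandwidth)
    return [m - t for m, t in zip(log_ratio, trend)]
```

The other methods in the study differ only in how the trend curve is estimated (Loess, splines, wavelets, SVR); the subtraction step is the same.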

Relevance:

10.00%

Abstract:

Background: A large number of probabilistic models used in sequence analysis assign non-zero probability values to most input sequences. To decide when a given probability is sufficient, the most common approach is Bayesian binary classification, in which the probability that the model characterizing the sequence family of interest assigns to the sequence is compared to that of an alternative probability model, which can be a null model. This is the scoring technique used by sequence analysis tools such as HMMER, SAM and INFERNAL. The most prevalent null models are position-independent residue distributions, including the uniform distribution, the genomic distribution, the family-specific distribution and the target sequence distribution. This paper presents a study evaluating the impact of the choice of null model on the final result of classifications. In particular, we are interested in minimizing the number of false predictions in a classification, a crucial issue for reducing the cost of biological validation.

Results: In all tests with random sequences, the target null model produced the lowest number of false positives. The study was performed on DNA sequences using GC content as the measure of compositional bias, but the results should also be valid for protein sequences. To broaden the applicability of the results, the study used randomly generated sequences; previous studies were performed on amino acid sequences, used only one probabilistic model (HMM) and a specific benchmark, and lack more general conclusions about the performance of null models. Finally, a benchmark test with P. falciparum confirmed these results.

Conclusions: Of the evaluated models, the best suited for classification are the uniform model and the target model. However, the uniform model presents a GC bias that can cause more false positives for candidate sequences with extreme compositional bias, a characteristic not described in previous studies. In these cases the target model is more dependable for biological validation due to its higher specificity.
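For position-independent models, the scoring scheme described above reduces to a log-odds sum over residues; a sketch with illustrative distributions (not the ones evaluated in the study):

```python
import math

def log_odds(seq, model, null):
    """Log-odds score: log P(seq | family model) - log P(seq | null model),
    both taken as position-independent residue distributions."""
    return sum(math.log(model[c] / null[c]) for c in seq)

def target_null(seq):
    """Target-sequence null model: the residue frequencies of the scored
    sequence itself, one of the null models discussed above."""
    return {c: seq.count(c) / len(seq) for c in set(seq)}
```

With a GC-rich family model and a uniform null, GC-rich sequences score positively regardless of family membership, which is the compositional-bias effect the study measures.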