94 results for Engineering, Industrial | Engineering, System Science | Operations Research


Relevance: 100.00%

Abstract:

The flowshop scheduling problem with blocking in-process is addressed in this paper. In this environment, there are no buffers between successive machines; therefore, intermediate queues of jobs waiting in the system for their next operations are not allowed. Heuristic approaches are proposed to minimize the total tardiness criterion. A constructive heuristic that explores specific characteristics of the problem is presented. Moreover, a GRASP-based heuristic is proposed and coupled with a path relinking strategy to search for better outcomes. Computational tests are presented, and the comparisons made with an adaptation of the NEH algorithm and with a branch-and-bound algorithm indicate that the new approaches are promising. (c) 2007 Elsevier Ltd. All rights reserved.
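The completion-time recurrence that such heuristics must evaluate can be sketched as follows (a minimal, hypothetical Python illustration, not the authors' code): with no buffers, a job departs machine k only when it has finished processing there and machine k+1 is free.

```python
def blocking_flowshop_tardiness(seq, p, due):
    """Total tardiness of a job permutation in a blocking flowshop.
    p[j][k] = processing time of job j on machine k+1; due[j] = due date."""
    m = len(p[0])
    prev = [0.0] * (m + 1)      # departure times of the previous job from machines 1..m
    total = 0.0
    for j in seq:
        d = [0.0] * (m + 1)
        d[0] = prev[1]          # job j starts machine 1 when the previous job leaves it
        for k in range(1, m):
            # blocking: job j leaves machine k when done AND machine k+1 is free
            d[k] = max(d[k - 1] + p[j][k - 1], prev[k + 1])
        d[m] = d[m - 1] + p[j][m - 1]    # completion time C_j
        total += max(0.0, d[m] - due[j])
        prev = d
    return total
```

A constructive or GRASP heuristic would call this as the evaluation function for candidate sequences, accumulating sum(max(0, C_j - d_j)) over the permutation.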

Relevance: 100.00%

Abstract:

Selenium (Se) intake is generally from food, whose Se content depends on soil Se and plant accumulation. For humans, adequate Se intake is essential for several selenoenzymes. In the Lower Tapajos region of the Brazilian Amazon, Se status is elevated, with large inter-community variability. Se intake in this region, where mercury (Hg) exposure is among the highest in the world, may be important to counteract Hg toxicity. The present study was conducted in 2006 with 155 persons from four communities of the Lower Tapajos. The objectives were: i) to evaluate Se content in their typical diet and drinking water; ii) to compare food Se concentrations with respect to geographic location; and iii) to examine the contribution of consumption of different food items to blood Se. More than 400 local foods and 40 drinking water samples were collected. Participants responded to an interview-administered food frequency questionnaire and provided blood samples. Food, water and blood Se levels were assessed by ICP-MS. Since Brazil nuts may also contain significant levels of barium (Ba) and strontium (Sr), these elements were likewise analyzed in nuts. The highest Se concentrations were found in Brazil nuts, but concentrations were highly variable (median: 13.9 μg/g; range: 0.4-158.4 μg/g). Chicken, game meat, eggs and beef also contained considerable levels of Se, with median concentrations from 0.3 to 1.4 μg/g. There was no particular geographic distribution of food Se. Se concentration in drinking water was very low (<1.4 μg/L). Blood Se covered a wide range (103-1500 μg/L) and was positively related to regular consumption of Brazil nuts, domestic chicken and game meat. Brazil nuts were found to contain highly variable and often very high concentrations of Ba (median: 88.0 μg/g; range: 1.9-1437 μg/g) and Sr (median: 38.7 μg/g; range: 3.3-173 μg/g). Further studies should address multiple nutrient/toxic interactions in the diet and related effects on health. (c) 2010 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

This paper proposes the use of the q-Gaussian mutation with self-adaptation of the shape of the mutation distribution in evolutionary algorithms. The shape of the q-Gaussian mutation distribution is controlled by a real parameter q. In the proposed method, the real parameter q of the q-Gaussian mutation is encoded in the chromosome of individuals and hence is allowed to evolve during the evolutionary process. In order to test the new mutation operator, evolution strategy and evolutionary programming algorithms with self-adapted q-Gaussian mutation generated from anisotropic and isotropic distributions are presented. The theoretical analysis of the q-Gaussian mutation is also provided. In the experimental study, the q-Gaussian mutation is compared to Gaussian and Cauchy mutations in the optimization of a set of test functions. Experimental results show the efficiency of the proposed method of self-adapting the mutation distribution in evolutionary algorithms.
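A minimal sketch of the idea, assuming the generalized Box-Muller method of Thistleton et al. for sampling q-Gaussian deviates and a log-normal update for the encoded parameter q (the paper's exact encoding and update rule may differ; all names here are illustrative):

```python
import math
import random

def q_log(x, q):
    # Tsallis q-logarithm; recovers ln(x) as q -> 1
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_gaussian(q):
    # generalized Box-Muller sampling of a q-Gaussian deviate (valid for q < 3);
    # for q = 1 this reduces to the ordinary Box-Muller transform
    q_prime = (1.0 + q) / (3.0 - q)
    u1, u2 = random.random(), random.random()
    return math.sqrt(-2.0 * q_log(u1, q_prime)) * math.cos(2.0 * math.pi * u2)

def self_adaptive_mutation(x, q, sigma, tau=0.1):
    # q is carried in the chromosome: mutate it first (log-normal update,
    # clamped below 3), then draw the mutation of x from the new distribution
    q_new = min(2.9, q * math.exp(tau * random.gauss(0.0, 1.0)))
    x_new = [xi + sigma * q_gaussian(q_new) for xi in x]
    return x_new, q_new
```

Because q evolves with the individual, the search can interpolate between Gaussian-like (q near 1) and heavy-tailed, Cauchy-like (q near 2) mutations as selection favors one or the other.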

Relevance: 100.00%

Abstract:

Mesenchymal stem cells (MSC) are multipotent cells which can be obtained from several adult and fetal tissues, including human umbilical cord units. We have recently shown that umbilical cord tissue (UC) is richer in MSC than umbilical cord blood (UCB), but their origin and characteristics in blood as compared to the cord remain unknown. Here we compared, for the first time, the exonic protein-coding and intronic noncoding RNA (ncRNA) expression profiles of MSC from match-paired UC and UCB samples, harvested from the same donors, processed simultaneously and under the same culture conditions. The patterns of intronic ncRNA expression in MSC from UC and UCB paired units were highly similar, indicative of their common donor origin. The respective exonic protein-coding transcript expression profiles, however, were significantly different. Hierarchical clustering based on protein-coding expression similarities grouped MSC according to their tissue location rather than original donor. Genes related to systems development, osteogenesis and immune system were expressed at higher levels in UCB, whereas genes related to cell adhesion, morphogenesis, secretion, angiogenesis and neurogenesis were more expressed in UC cells. These molecular differences verified in tissue-specific MSC gene expression may reflect functional activities influenced by distinct niches and should be considered when developing clinical protocols involving MSC from different sources. In addition, these findings reinforce our previous suggestion on the importance of banking the whole umbilical cord unit for research or future therapeutic use.

Relevance: 100.00%

Abstract:

We consider the two-level network design problem with intermediate facilities. This problem consists of designing a minimum cost network respecting some requirements, usually described in terms of the network topology or in terms of a desired flow of commodities between source and destination vertices. Each selected link must receive one of two types of edge facilities, and the connection of different edge facilities requires a costly and capacitated vertex facility. We propose a hybrid decomposition approach which heuristically obtains tentative solutions for the number and location of the vertex facilities and uses these solutions to limit the computational burden of a branch-and-cut algorithm. We test our method on instances of the power system secondary distribution network design problem. The results show that the method is efficient both in terms of solution quality and computational times. (C) 2010 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

A novel mathematical framework, inspired by Morse theory, is introduced for the topological characterization of triangles in 2D meshes; it is useful for applications involving the creation of mesh models of objects whose geometry is not known a priori. The framework guarantees precise control of the topological changes introduced by triangle insertion/removal operations and enables the definition of intuitive high-level operators for managing the mesh while preserving its topological integrity. An application is described in the implementation of an innovative approach for the detection of 2D objects from images that integrates the topological control enabled by geometric modeling with traditional image processing techniques. (C) 2008 Published by Elsevier B.V.

Relevance: 100.00%

Abstract:

Policy hierarchies and automated policy refinement are powerful approaches to simplify administration of security services in complex network environments. A crucial issue for the practical use of these approaches is to ensure the validity of the policy hierarchy, i.e. since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon the validation conditions and upon axioms about the model representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
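As a deliberately simplified illustration of the two criteria (not the paper's formalism), suppose each policy level is flattened, after the refinement mapping, into a set of (subject, target, permission) triples; consistency and completeness then reduce to two set inclusions:

```python
def validate_refinement(abstract, derived):
    """Toy check of the two validation criteria for a policy hierarchy.
    Policies are modelled as sets of (subject, target, permission) triples."""
    # consistency: the derived level grants nothing the abstract level forbids
    consistent = derived <= abstract
    # completeness: every abstract-level permission is realised below
    complete = abstract <= derived
    return consistent, complete

# hypothetical example: one abstract rule is lost during refinement
high = {("staff", "mail_server", "allow"), ("staff", "web_proxy", "allow")}
low = {("staff", "mail_server", "allow")}
```

In the paper's setting the comparison is mediated by the abstraction mapping between layers rather than by raw set equality, but the intuition is the same: a valid refinement must be both consistent and complete.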

Relevance: 100.00%

Abstract:

OWL-S is an application of OWL, the Web Ontology Language, that describes the semantics of Web Services so that their discovery, selection, invocation and composition can be automated. The research literature reports the use of UML diagrams for the automatic generation of Semantic Web Service descriptions in OWL-S. This paper demonstrates a higher level of automation by generating complete Web applications from OWL-S descriptions that have themselves been generated from UML. Previously, we proposed an approach for processing OWL-S descriptions in order to produce MVC-based skeletons for Web applications. The OWL-S ontology undergoes a series of transformations in order to generate a Model-View-Controller application implemented by a combination of Java Beans, JSP, and Servlets code, respectively. In this paper, we show in detail the documents produced at each processing step. We highlight the connections between OWL-S specifications and executable code in the various Java dialects and show the Web interfaces that result from this process.

Relevance: 100.00%

Abstract:

In this paper, we present a new reformulation of the KKT system associated to a variational inequality as a semismooth equation. The reformulation is derived from the concept of differentiable exact penalties for nonlinear programming. The best theoretical results are presented for nonlinear complementarity problems, where simple, verifiable, conditions ensure that the penalty is exact. We close the paper with some preliminary computational tests on the use of a semismooth Newton method to solve the equation derived from the new reformulation. We also compare its performance with the Newton method applied to classical reformulations based on the Fischer-Burmeister function and on the minimum. The new reformulation combines the best features of the classical ones, being as easy to solve as the reformulation that uses the Fischer-Burmeister function while requiring as few Newton steps as the one that is based on the minimum.
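For the scalar case, the classical Fischer-Burmeister reformulation mentioned above can be sketched as follows (a hypothetical illustration of that baseline; the paper's new exact-penalty reformulation is not reproduced here):

```python
import math

def fischer_burmeister(a, b):
    # phi(a, b) = sqrt(a^2 + b^2) - a - b
    # phi(a, b) = 0  <=>  a >= 0, b >= 0, a * b = 0
    return math.hypot(a, b) - a - b

def ncp_newton(F, dF, x0, tol=1e-10, max_iter=50):
    """Semismooth Newton on phi(x, F(x)) = 0 for a scalar NCP (illustration only)."""
    x = x0
    for _ in range(max_iter):
        f = F(x)
        phi = fischer_burmeister(x, f)
        if abs(phi) < tol:
            return x
        r = math.hypot(x, f) or 1.0   # guard the nondifferentiable point (0, 0)
        dphi = (x / r - 1.0) + (f / r - 1.0) * dF(x)
        x -= phi / dphi
    return x

# example NCP: find x >= 0 with F(x) = x - 1 >= 0 and x * F(x) = 0; solution x = 1
x_star = ncp_newton(lambda x: x - 1.0, lambda x: 1.0, x0=5.0)
```

The same pattern extends componentwise to the KKT system of a variational inequality, which is where the comparison between the Fischer-Burmeister, minimum-based and new reformulations takes place.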

Relevance: 100.00%

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the usage of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphical processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons that received random external input and speedups of 9 for a network with 200k neurons and 20M neuronal connections, in a single computer with two graphics boards with two GPUs each, when compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
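The per-neuron computation assigned to each thread can be illustrated with a single forward-Euler step of the classic Hodgkin-Huxley equations (standard squid-axon parameters, with the membrane potential v measured in mV from rest; a plain-Python sketch, not the paper's CUDA kernel):

```python
import math

# classic Hodgkin-Huxley squid-axon constants
C, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3       # uF/cm^2, mS/cm^2
E_NA, E_K, E_L = 115.0, -12.0, 10.6             # reversal potentials, mV

def hh_step(v, m, h, n, i_ext, dt=0.01):
    """One forward-Euler step of the coupled ODEs each thread would integrate."""
    # voltage-dependent gating rates
    a_m = 0.1 * (25 - v) / (math.exp((25 - v) / 10) - 1)
    b_m = 4.0 * math.exp(-v / 18)
    a_h = 0.07 * math.exp(-v / 20)
    b_h = 1.0 / (math.exp((30 - v) / 10) + 1)
    a_n = 0.01 * (10 - v) / (math.exp((10 - v) / 10) - 1)
    b_n = 0.125 * math.exp(-v / 80)
    # ionic currents and state update
    i_ion = (G_NA * m**3 * h * (v - E_NA)
             + G_K * n**4 * (v - E_K)
             + G_L * (v - E_L))
    v += dt * (i_ext - i_ion) / C
    m += dt * (a_m * (1 - m) - b_m * m)
    h += dt * (a_h * (1 - h) - b_h * h)
    n += dt * (a_n * (1 - n) - b_n * n)
    return v, m, h, n
```

In the GPU version, thousands of such state updates run in parallel, one thread per neuron, with the CPU coordinating the exchange of synaptic input between GPUs.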

Relevance: 100.00%

Abstract:

Considering a series representation of a coherent system using a shift transform of the component lifetimes T_i at their critical levels Y_i, we study two problems. First, under such a shift transform, we analyse the preservation properties of the non-parametric distribution classes, and second, the preservation of association among the component lifetimes under such transformations. (c) 2007 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

Diagnostic methods have been an important tool in regression analysis for detecting anomalies in fitted models, such as departures from the error assumptions and the presence of outliers and influential observations. Assuming censored data, we considered both a classical analysis and a Bayesian analysis with noninformative priors for the parameters of a model with a cure fraction. The Bayesian approach used Markov chain Monte Carlo methods with Metropolis-Hastings steps to obtain the posterior summaries of interest. Several influence measures, such as local influence, total local influence of an individual, local influence on predictions and generalized leverage, were derived, analyzed and discussed for survival data with a cure fraction and covariates. The relevance of the approach was illustrated with a real data set, where it is shown that, by removing the most influential observations, the decision about which model best fits the data is changed.
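The cure-fraction structure underlying such a model can be sketched with the standard mixture cure survival function (an illustrative assumption; the paper's exact parametric form is not specified here):

```python
import math

def mixture_cure_survival(t, pi, rate):
    """Standard mixture cure model: a cured fraction pi never experiences the
    event, while susceptibles here follow an exponential survival exp(-rate*t)."""
    return pi + (1.0 - pi) * math.exp(-rate * t)
```

Unlike an ordinary survival curve, this one plateaus at the cure fraction pi instead of decaying to zero, which is why censored long-term survivors carry so much information (and potential influence) in these models.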

Relevance: 100.00%

Abstract:

Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication, leading to misinterpretation and ultimately to errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal.
Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance: 100.00%

Abstract:

The masses of neutron-deficient nuclides near the N=Z line with A=64-80 have been determined using a direct time-of-flight technique which employed a cyclotron as a high-resolution spectrometer. The measured atomic masses for ⁶⁸Se and ⁸⁰Y were 67.9421(3) u and 79.9344(2) u, respectively. The new values agree with the 2003 Atomic Mass Evaluation. The result for ⁶⁸Se confirms that this nucleus is a waiting point of the rp-process, and that for ⁸⁰Y resolves the conflict between earlier measurements. Using the present results and the 2003 Atomic Mass Evaluation compilation, the empirical interaction between the last proton and the last neutron in N=Z nuclei has been revisited and extended.
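The empirical proton-neutron interaction is commonly extracted as a double difference of binding energies; a schematic version is sketched below (conventions for even/odd nuclei vary, and the paper's exact definition may differ):

```python
def delta_v_pn(B, Z, N):
    """Double difference of binding energies used to probe the empirical
    interaction between the last protons and neutrons (even-even convention;
    sign and normalisation conventions vary across the literature)."""
    return 0.25 * (B(Z, N) - B(Z, N - 2) - B(Z - 2, N) + B(Z - 2, N - 2))
```

Any smooth, separable contribution to B(Z, N) cancels in the double difference, so the quantity isolates the residual proton-neutron correlation that is enhanced in N=Z nuclei.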

Relevance: 100.00%

Abstract:

We present an analysis of the absorption of acoustic waves by a black hole analogue in (2 + 1) dimensions generated by a fluid flow in a draining bathtub. We show that the low-frequency absorption length is equal to the acoustic hole circumference and that the high-frequency absorption length is 4 times the ergoregion radius. For intermediate values of the wave frequency, we compute the absorption length numerically and show that our results are in excellent agreement with the low- and high-frequency limits. We analyze the occurrence of superradiance, manifested as negative partial absorption lengths for corotating modes at low frequencies.