Abstract:
Magma flow in dykes is still not well understood: some reported magnetic fabrics are contradictory, and the potential effects of exsolution and metasomatism on magnetic properties remain open to debate. We therefore studied the Messejana-Plasencia dyke (MPD), a long dyke composed of segments of different thickness that record distinct degrees of metasomatism. Oriented dolerite samples were collected along several cross-sections and characterized by microscopy and magnetic analyses. The results show that metasomatism has important effects on rock mineralogy, but that metasomatic processes greatly influence the anisotropy degree and mean susceptibility only when the rocks are strongly affected. Petrography, scanning electron microscopy (SEM) and bulk magnetic analyses reveal a high-temperature oxidation-exsolution event, experienced by the very early Ti-spinels during the early stages of magma cooling, observed mostly in the central domains of the thick dyke segments. Exsolution reduced the grain size of the magnetic carrier (multidomain to single-domain transformation), producing composite fabrics that involve inverse fabrics. These are likely responsible for a significant number of the 'abnormal' fabrics, which make the interpretation of magma flow much more complex. By using only the 'normal' fabrics for magma flow determination, we reduced the number of relevant sites by 50 per cent. In these sites, the imbrication angle of the magnetic foliation relative to the dyke wall strongly suggests flow, with end-members indicating vertical-dominated flow (seven sites) and horizontal-dominated flow (three sites).
Abstract:
When a paleomagnetic pole is sought in an igneous body, the host rocks should be subjected to a contact test to ensure that the determined paleopole has the age of the intrusion. If the contact test is positive, it precludes the possibility that the measured magnetization is a later effect. We therefore investigated the variations of the remanent magnetization along cross-sections of the rocks hosting the Foum Zguid dyke (southern Morocco) and of the dyke itself. A positive contact test was obtained, but it is mainly related to Chemical/Crystalline Remanent Magnetization due to metasomatic processes in the host rocks during magma intrusion and cooling, and not only to Thermo-Remanent Magnetization, as ordinarily assumed in standard studies. Paleomagnetic data obtained within the dyke therefore reflect the Earth's magnetic field during emplacement of this well-dated (196.9 +/- 1.8 Ma) intrusion.
Abstract:
Topology optimization consists of finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximum structural stiffness or maximum fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a tree-based representation is developed to generate initial feasible individuals that remain feasible upon crossover and mutation, and which therefore do not require any repair operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures, where the objective function is the maximization of the stiffness or the maximization of the first or second eigenfrequency of a plate, all cases having a prescribed material volume constraint.
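The key idea, a representation under which crossover and mutation preserve feasibility so that no repair operator is needed, can be illustrated with a minimal sketch. This is not the paper's tree-based representation: it is a hypothetical set-based encoding on a 6x6 cell grid with a fixed material volume, chosen only to show feasibility-by-construction.

```python
import random

N_CELLS, K_FILLED = 36, 12  # hypothetical 6x6 grid, fixed material volume

def random_individual():
    # exactly K_FILLED cells carry material, so the volume constraint holds
    return frozenset(random.sample(range(N_CELLS), K_FILLED))

def crossover(a, b):
    # sample K_FILLED cells from the parents' union: the child is feasible
    # by construction, no repair needed
    return frozenset(random.sample(sorted(a | b), K_FILLED))

def mutate(ind):
    # swap one filled cell for one empty cell: the volume is preserved
    filled = sorted(ind)
    empty = sorted(set(range(N_CELLS)) - ind)
    out, inc = random.choice(filled), random.choice(empty)
    return (ind - {out}) | {inc}
```

Because every operator maps feasible individuals to feasible individuals, the GA never leaves the feasible region, which is the property the paper's representation guarantees.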
Abstract:
With accelerated market volatility, faster response times and increased globalization, business environments are going through a major transformation, and firms have intensified their search for strategies that can give them competitive advantage. This requires that companies continuously innovate and think of new ideas that can be transformed or implemented as products, processes or services, generating value for the firm. Innovative solutions and processes are usually developed by a group of people working together. A group of people that shares and creates new knowledge can be considered a Community of Practice (CoP). CoPs provide a sound basis for organizational learning and encourage knowledge creation and acquisition. Virtual Communities of Practice (VCoPs) can perform a central role in promoting communication and collaboration between members who are dispersed in both time and space. Nevertheless, it is known that not all CoPs and VCoPs share the same levels of performance or produce the same results, which means that there are factors that enable or constrain the process of knowledge creation. With this in mind, we developed a case study to identify both the motivations and the constraints that members of an organization experience when taking part in the knowledge-creating processes of VCoPs. Results show that organizational culture and professional and personal development play an important role in these processes. No interviewee referred to direct financial rewards as a motivating factor for participation in VCoPs. Most identified the difficulty of aligning the objectives established by management with the justification for the time spent in the VCoP. The interviewees also said that technology is not a constraint.
Abstract:
Paper to be presented at the ESREA Conference Learning to Change? The Role of Identity and Learning Careers in Adult Education, 7-8 December, 2006, Université Catholique Louvain, Louvain–la-Neuve, Belgium
Abstract:
This article describes work performed on the assessment of the levels of airborne ultrafine particles emitted in two welding processes, metal-active gas (MAG) welding of carbon steel and friction-stir welding (FSW) of aluminium, in terms of the area deposited in the alveolar tract of the lung, using a nanoparticle surface-area monitor. The results show the dependence of the emitted ultrafine particles on the process parameters and clearly demonstrate the presence of ultrafine particles when compared with background levels. They also show that FSW, unlike MAG, results in the lowest levels of alveolar-deposited surface area. Nevertheless, all the tested processes resulted in significant doses of ultrafine particles deposited in the lungs of exposed workers.
Abstract:
In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of the profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are duly drawn. (C) 2011 Elsevier Ltd. All rights reserved.
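The scenario-based formulation can be sketched as follows. The scenario values (probability, price, wind output), the imbalance penalty and the candidate-bid grid are illustrative assumptions, not data from the Portuguese case study; the sketch simply picks the bid that maximizes probability-weighted profit over the scenarios.

```python
# each scenario: (occurrence probability, market price [EUR/MWh], wind output [MWh])
# illustrative numbers, not case-study data
scenarios = [(0.3, 50.0, 80.0), (0.5, 40.0, 100.0), (0.2, 60.0, 60.0)]

def expected_profit(bid, penalty=1.5):
    # profit per scenario: revenue for delivered energy minus a penalized
    # imbalance cost when the bid exceeds the realized wind output
    total = 0.0
    for prob, price, wind in scenarios:
        delivered = min(bid, wind)
        shortfall = max(bid - wind, 0.0)
        total += prob * (price * delivered - penalty * price * shortfall)
    return total

# choose the offered quantity on a coarse candidate grid [0, 120] MWh
best_bid = max(range(0, 121, 5), key=expected_profit)
```

In a full stochastic program the bid would be a decision variable of an optimization model and a risk measure (e.g. CVaR) would be added; here simple enumeration over candidates makes the scenario logic visible.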
Abstract:
Dissertation presented to the Instituto Superior de Contabilidade for the degree of Master in Auditing. Supervisor: Mestre Agostinho Sousa Pinto
Abstract:
Project work presented to the Instituto Superior de Contabilidade e Administração do Porto for the degree of Master in Auditing. Supervisor: Doutora Alcina Augusta de Sena Portugal Dias
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. It makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Because of the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise can be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor, where each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise while accounting for distinct spatial and temporal correlations, using an anisotropic 3-D filter that may be tuned separately in the space and time dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm, together with a comparison with several state-of-the-art algorithms.
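The idea of an anisotropic 3-D filter tuned separately in space and time can be sketched as follows. This is not the paper's Bayesian algorithm: it is a plain separable Gaussian smoother over a (time, y, x) stack, with assumed sigma values, shown only to make the "separately tuned in space and time" notion concrete.

```python
import numpy as np

def gauss_kernel(sigma, radius):
    # normalized 1-D Gaussian kernel of length 2*radius + 1
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth_axis(vol, sigma, axis):
    # separable smoothing: convolve every 1-D line along the given axis
    k = gauss_kernel(sigma, radius=max(int(3 * sigma), 1))
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), axis, vol)

def anisotropic_denoise(stack, sigma_space=1.5, sigma_time=0.5):
    # stack: (time, y, x); smooth more in space than along time (assumed tuning)
    out = smooth_axis(stack, sigma_space, axis=1)
    out = smooth_axis(out, sigma_space, axis=2)
    return smooth_axis(out, sigma_time, axis=0)
```

A smaller temporal sigma preserves the photobleaching-induced intensity decay along time while still averaging out Poisson fluctuations spatially; the Bayesian method of the paper achieves this adaptively rather than with fixed Gaussian kernels.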
Abstract:
In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular, when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type. We prove under the common assumptions used in direct search for single objective optimization that at least one limit point of the sequence of iterates generated by DMS lies in (a stationary form of) the Pareto front. However, extensive computational experience has shown that our methodology has an impressive capability of generating the whole Pareto front, even without using a search step. Two by-products of this paper are (i) the development of a collection of test problems for MOO and (ii) the extension of performance and data profiles to MOO, allowing a comparison of several solvers on a large set of test problems, in terms of their efficiency and robustness to determine Pareto fronts.
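The Pareto-dominance test used to maintain the list of nondominated points can be sketched as follows (minimization assumed; a naive O(n^2) filter for illustration, not the DMS implementation):

```python
def dominates(u, v):
    # u dominates v (minimization): no worse in every objective,
    # strictly better in at least one
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def nondominated(points):
    # keep the points that no other point dominates: the current
    # approximation of the Pareto front
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

In DMS the poll centers for the next iteration are chosen from exactly such a list, so the search spreads along the front instead of collapsing to a single aggregated optimum.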
Abstract:
This paper presents a methodology that aims to increase the probability of delivering power to any load point of the electrical distribution system by identifying new investments in distribution components. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling of the system component outage parameters. The fuzzy membership functions of the system component outage parameters are obtained from statistical records. A mixed-integer non-linear optimization technique is developed to identify adequate investments in distribution network components that increase the availability level for any customer in the distribution system at minimum cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a real distribution network.
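Fuzzy membership functions for uncertain parameters such as outage rates are commonly triangular; a minimal sketch follows. The breakpoints a, b, c would be derived from the statistical records; the values in the test are illustrative only.

```python
def triangular(x, a, b, c):
    # triangular fuzzy membership function: 0 outside [a, c],
    # rising linearly to a peak of 1 at b, then falling back to 0
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)
```

Each component outage parameter (e.g. failure rate, repair time) would carry such a function, and the optimization then propagates the fuzziness instead of a single crisp estimate.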
Abstract:
This work addresses the treatment by nanofiltration (NF) of solutions containing NaCN and NH(4)Cl at various pH values. The NF experiments are carried out in a Lab-Unit equipped with NF-270 membranes, for model solutions that are surrogates of the industrial ammoniacal wastewaters generated in coke-making processes. The applied pressure is 30 bar. The main objective is the separation of NaCN and NH(4)Cl and the optimization of this separation as a function of pH. Membrane performance is highly dependent on solution composition and characteristics, namely on pH. In fact, the rejection coefficients for the binary model solution containing sodium cyanide are always higher than the rejection coefficients for the ammonium chloride model solution. For ternary solutions (cyanide/ammonium/water), it was observed that for pH values lower than 9 the rejection coefficients for ammonium are well above those observed for the cyanides, but for pH values higher than 9.5 there is a drastic decrease in the ammonium rejection coefficients with increasing pH. These results reflect the changes that occur in solution with increasing pH, namely in the solute species that are predominant. The fluxes of the model solutions decreased with increased pH. (C) 2010 Elsevier B.V. All rights reserved.
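The rejection coefficient compared throughout is conventionally the observed rejection R = 1 - Cp/Cf, where Cp and Cf are the permeate and feed concentrations; a one-line sketch with illustrative concentrations:

```python
def rejection(feed_conc, permeate_conc):
    # observed rejection coefficient R = 1 - Cp/Cf;
    # R -> 1 means the membrane retains the solute almost completely
    return 1.0 - permeate_conc / feed_conc
```

So "higher rejection for cyanide than for ammonium" means the permeate carries proportionally less cyanide than ammonium at a given feed composition.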
Abstract:
This paper addresses the problem of energy resources management using modern metaheuristic approaches, namely Particle Swarm Optimization (PSO), New Particle Swarm Optimization (NPSO) and Evolutionary Particle Swarm Optimization (EPSO). The problem addressed in this paper is intended for use by aggregators operating in a smart grid context, dealing with Distributed Generation (DG) and with gridable vehicles intelligently managed on a multi-period basis according to their users' profiles and requirements. The aggregator can also purchase additional energy from external suppliers. The paper includes a case study considering a 30 kV distribution network with one substation, 180 buses and 90 load points. The distribution network in the case study presents intense penetration of DG, including 116 units from several technologies, and one external supplier. A scenario of 6000 EVs in the given network is simulated over 24 periods, corresponding to one day. The results of applying the PSO approaches to this case study are discussed in depth in the paper.
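A generic PSO iteration, a velocity update pulled toward the personal and global bests followed by a bounded position update, can be sketched as follows. This is a textbook formulation with assumed inertia and acceleration coefficients, not the NPSO/EPSO variants of the paper, and the cost function is a placeholder for the energy-resources objective.

```python
import random

def pso(cost, dim, n_particles=20, iters=150,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    # initialize random positions, zero velocities, and the best-so-far memories
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        for i, p in enumerate(pos):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - p[d])
                             + c2 * r2 * (gbest[d] - p[d]))
                # clamp the updated position to the search bounds
                p[d] = min(max(p[d] + vel[i][d], lo), hi)
            if cost(p) < cost(pbest[i]):
                pbest[i] = p[:]
                if cost(p) < cost(gbest):
                    gbest = p[:]
    return gbest
```

In the paper's setting each particle would encode a full multi-period dispatch (DG units, EV charging, external purchases) and the cost would include the aggregator's operation objective and constraint penalties.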
Abstract:
The performance of metaheuristics is highly dependent on their parameters, which need to be tuned. Parameter tuning may allow greater flexibility and robustness, but requires careful initialization: the process of deciding which parameter settings to use is not obvious, since the values depend mainly on the problem, the instance to be solved, the search time available for solving the problem, and the required solution quality. This paper proposes a learning module for the autonomous parameterization of metaheuristics, integrated into a Multi-Agent System for the resolution of Dynamic Scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented using Case-based Reasoning, which uses data from previous similar cases to solve new ones, under the assumption that similar cases have similar solutions. After a literature review of the topics involved, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated, the results obtained are compared with previous ones, conclusions are drawn, and future work is outlined. We expect this proposal to be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
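The Case-based Reasoning step, retrieving the most similar previously solved instance and reusing its parameter setting, can be sketched as follows. The feature vectors, the similarity measure and the case base below are illustrative assumptions, not the AutoDynAgents case representation.

```python
def similarity(f1, f2):
    # inverse Euclidean distance on instance-feature vectors (assumed measure)
    d = sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5
    return 1.0 / (1.0 + d)

# case base: (instance features: n_jobs, n_machines) -> parameters that
# worked well on that instance; illustrative entries only
case_base = [
    ((20, 5),   {"pop_size": 30, "mutation_rate": 0.10}),
    ((100, 10), {"pop_size": 80, "mutation_rate": 0.05}),
]

def retrieve_params(features):
    # reuse the parameter setting of the most similar previously solved case,
    # assuming similar cases have similar solutions
    best_case = max(case_base, key=lambda c: similarity(c[0], features))
    return best_case[1]
```

In the full CBR cycle the reused parameters would then be revised after the run and retained as a new case, closing the self-optimization loop.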