967 results for Palaeomagnetism Applied to Tectonics
Abstract:
Paper presented at the 9th European Conference on Knowledge Management, Southampton Solent University, Southampton, UK, 4-5 Sep. 2008. URL: http://academic-conferences.org/eckm/eckm2008/eckm08-home.htm
Abstract:
In this work, the effect of incorporating recycled glass fibre reinforced plastics (GFRP) waste, obtained by means of shredding and milling processes, on the mechanical behavior of polyester polymer mortar (PM) materials was assessed. For this purpose, different contents of GFRP recyclates (between 4% and 12% by mass) were incorporated into polyester PM materials as sand aggregate and filler replacements. The effect of adding a silane coupling agent to the resin binder was also evaluated. The waste material used came from the shredding of the leftovers resulting from the cutting and assembly of GFRP pultrusion profiles. Currently, these leftovers, together with unfinished products and scrap from the pultrusion manufacturing process, are landfilled, with additional costs. Thus, besides the evident environmental benefits, a viable and feasible solution for these wastes would also lead to significant economic advantages. Design of experiments and data treatment were accomplished by means of a full factorial design approach and analysis of variance (ANOVA). The experimental results were promising regarding the recyclability of GFRP waste materials as aggregate and reinforcement for PM materials, with significant improvements in mechanical properties compared with non-modified formulations.
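As a worked illustration of the full factorial design and ANOVA workflow mentioned above, here is a minimal Python sketch. The factor levels and response values are invented for illustration and are not the paper's data; the factor names (recyclate content, silane treatment) merely mirror the variables described in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical two-factor full factorial: GFRP recyclate content (% by mass)
# and silane coupling agent (with/without), two replicates per cell.
# The response (flexural strength, MPa) is illustrative only.
data = pd.DataFrame({
    "recyclate": [4, 4, 8, 8, 12, 12] * 2,
    "silane":    ["no"] * 6 + ["yes"] * 6,
    "strength":  [21.0, 20.5, 23.1, 23.4, 22.0, 21.8,
                  22.9, 23.2, 25.0, 25.3, 23.8, 24.1],
})

# Fit a linear model with main effects and their interaction, then run
# ANOVA to test which effects are significant.
model = smf.ols("strength ~ C(recyclate) * C(silane)", data=data).fit()
print(anova_lm(model, typ=2))
```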
Abstract:
This paper presents the creation and development of technological schools directly linked to the business community and to public higher education. By establishing themselves as the key interface between the two sectors, they make a significant contribution to competitiveness in the face of increasing competition in traditional markets. The development of new business strategies supported by references of excellence, quality and competitiveness also favours the establishment of partnerships aimed at the qualification of intermediate-level staff, bridging the technological school and technology-based higher education. We present a case study illustrating the success of the Escola Tecnológica de Vale de Cambra.
Abstract:
A new method, based on linear correlation and phase diagrams, was successfully developed for processes like the sedimentary process, where the deposition phase can have different durations - represented by repeated values in a series - and where erosion can play an important role by deleting values from a series. The sampling process itself can also cause repeated values - a large stratum sampled twice - or deleted values - a tiny stratum falling between two consecutive samples. We developed a mathematical procedure which, based on the evolution of chemical composition with depth, allows the establishment of the boundaries and the periodicity of different sedimentary environments. The basic tool is no more than a linear correlation analysis, which allows us to detect eventual evolution rules connected with cyclical phenomena within time series (treating depth as time), with the final objective of prediction. A very interesting discovery was the phenomenon of repeated sliding windows, which represent quasi-cycles of a series of quasi-periods. An accurate forecast can be obtained if we are inside a quasi-cycle (it is possible to predict the remaining elements of the cycle with a probability related to the number of repeated and deleted points). Because this is an innovative methodology, its efficiency is being tested in several case studies, with remarkable results that show its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time-series, conditional probability.
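A minimal Python sketch of the sliding-window linear correlation idea described above (not the authors' actual procedure): compare each window of the depth series with every equally sized later window, and flag highly correlated pairs as candidate repeated windows, whose lag estimates the quasi-period. The threshold and the example series are illustrative assumptions.

```python
import numpy as np

def quasi_cycle_candidates(series, window, r_min=0.95):
    """Flag pairs of equal-length windows whose Pearson correlation
    exceeds r_min; repeated windows suggest a quasi-cycle."""
    x = np.asarray(series, dtype=float)
    hits = []
    for i in range(len(x) - 2 * window + 1):
        a = x[i:i + window]
        for j in range(i + window, len(x) - window + 1):
            b = x[j:j + window]
            r = np.corrcoef(a, b)[0, 1]
            if r >= r_min:
                hits.append((i, j, r))  # lag j - i estimates the quasi-period
    return hits

# Illustrative depth series: a noisy repeating compositional pattern.
rng = np.random.default_rng(0)
depth_series = np.tile([1.0, 3.0, 2.0, 5.0, 4.0], 6) + rng.normal(0, 0.1, 30)
for i, j, r in quasi_cycle_candidates(depth_series, window=5):
    print(f"windows at {i} and {j} correlate, r = {r:.3f}")
```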
Abstract:
Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space gets even more difficult if, beyond performance and area, we also consider metrics like performance efficiency and area efficiency, where the designer tries to achieve the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on the experimental execution results of a particular benchmark of algorithms, our approach is to make a formal analysis of the algorithms considering the main architectural aspects and to determine how each particular architectural aspect is related to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and derived an equation relating the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture was designed. The results indicate that a 100 mm² integrated circuit design of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating-point) for a memory bandwidth of 16 GB/s. This corresponds to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
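To make the idea of an analytical cycle equation concrete, here is a minimal roofline-style sketch in Python. The model form, the blocking assumption and all parameter values are illustrative assumptions, not the equation derived in the paper.

```python
# Hypothetical cycle model for blocked dense matrix multiplication on a
# many-core chip: runtime is bounded by the larger of compute time and the
# time to stream operand blocks over the external-memory bandwidth.

def cycles_estimate(n, cores, flops_per_core_per_cycle,
                    bytes_per_cycle_memory, block):
    flops = 2 * n ** 3                      # multiply-adds for n x n matrices
    compute_cycles = flops / (cores * flops_per_core_per_cycle)
    # With block x block tiling, operand panels are re-read about n/block
    # times (8 bytes per double); an illustrative traffic model.
    traffic_bytes = 8 * (2 * n ** 3 / block + n ** 2)
    memory_cycles = traffic_bytes / bytes_per_cycle_memory
    return max(compute_cycles, memory_cycles)

# Example: 256 cores, 2 flops/core/cycle, 16 bytes/cycle from memory.
print(f"{cycles_estimate(4096, 256, 2, 16, block=64):.3e} cycles")
```

Plugging different core counts, local-memory-driven block sizes and bandwidths into such an equation is what lets the designer choose the architecture analytically rather than by benchmarking.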
Abstract:
The elastic behavior of demand, used jointly with other available resources such as distributed generation (DG), can play a crucial role in the success of smart grids. The intensive use of Distributed Energy Resources (DER) and the technical and contractual constraints result in large-scale nonlinear optimization problems that require computational intelligence methods to be solved. This paper proposes a Particle Swarm Optimization (PSO) based methodology to support the minimization of the operation costs of a virtual power player that manages the resources in a distribution network and the network itself. Resources include the DER available in the considered time period and the energy that can be bought from external energy suppliers. Network constraints are considered. The proposed approach uses Gaussian mutation of the strategic parameters and contextual self-parameterization of the maximum and minimum particle velocities. The case study considers a real 937-bus distribution network, with 20,310 consumers and 548 distributed generators. The obtained solutions are compared with a deterministic approach and with PSO without mutation and Evolutionary PSO, both using self-parameterization.
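The following Python sketch shows what Gaussian mutation of a PSO strategic parameter can look like. It is a generic minimiser on a toy objective: the paper's cost function, network constraints and the exact set of self-parameterized quantities (including the velocity limits) are not reproduced here.

```python
import numpy as np

def pso_gaussian_mutation(f, bounds, n_particles=30, iters=200,
                          w=0.7, c1=1.5, c2=1.5, sigma=0.1, seed=0):
    """PSO minimiser whose inertia weight is Gaussian-mutated per
    particle at each iteration (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    vmax = 0.2 * (hi - lo)                       # velocity limits
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        # Gaussian mutation of the strategic parameter, per particle.
        wi = w + sigma * rng.normal(size=(n_particles, 1))
        r1, r2 = rng.random((2, n_particles, dim))
        v = wi * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        v = np.clip(v, -vmax, vmax)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Example on a toy quadratic cost.
lo, hi = np.full(5, -10.0), np.full(5, 10.0)
best_x, best_f = pso_gaussian_mutation(lambda z: float(np.sum(z**2)), (lo, hi))
print(best_f)
```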
Abstract:
In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results.
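As a pointer to how the Monte Carlo uncertainty analysis can be structured, here is a minimal Python sketch with a deliberately simplified impact model (impact proportional to concentration times a persistence-like fate factor, divided by an effect concentration). All distributions and parameter values are invented; the paper's multimedia fate model and PAF-based effect factors are far more detailed.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 10_000                                    # Monte Carlo draws

# Toy input distributions (lognormal, as is common for such quantities).
conc = rng.lognormal(np.log(1e-7), 1.0, n)    # effluent concentration, kg/m3
fate = rng.lognormal(np.log(5.0), 0.5, n)     # fate/persistence factor, days
ec50 = rng.lognormal(np.log(1e-4), 1.2, n)    # effect concentration, kg/m3

impact = conc * fate / ec50                   # simplified impact score
print("median impact:", np.median(impact))
print("95% interval:", np.percentile(impact, [2.5, 97.5]))

# Rank correlation of each input with the output shows which variable
# contributes most to the output variance (cf. the contribution analysis
# described in the abstract).
for name, v in [("conc", conc), ("fate", fate), ("ec50", ec50)]:
    print(name, round(spearmanr(v, impact)[0], 2))
```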
Abstract:
Paper presented at the 8th Congresso Nacional de Administração Pública - Desafios e Soluções, held in Carcavelos, 21-22 November 2011.
Abstract:
The alkaline-soluble Trypanosoma cruzi epimastigote antigen (ASEA) was assessed in a dot-ELISA for the diagnosis of Chagas' disease. A total of 355 serum samples from chagasic and non-chagasic patients were studied, and IgG antibodies to ASEA were found in all patients with chronic Chagas' disease. Among non-chagasic patients, 95.6% were negative; the exceptions were patients with leishmaniasis (visceral and mucocutaneous) and some patients from the control group, who reacted at low titers. The data indicate that dot-ELISA using ASEA is suitable for seroepidemiologic surveys in areas where Chagas' disease is endemic.
Abstract:
This project aims to study the implementation of Lean principles and tools at several levels of logistics, from internal logistics to the interface with the distribution center and suppliers, in an industrial plant. The main focus of all efforts is to create the conditions to approach a continuous flow scenario in the manufacturing processes. The subject of the improvement actions is a company whose core activity is car seat production, more specifically car seat cover production and assembly. The focus is the assembly process, which requires a considerable variety of components and is therefore an important obstacle to the implementation of continuous flow. The most salient issues relate to inefficient interaction between sections and the late supply of components to the assembly lines, forcing operators to abandon their work stations and leading to production interruptions. As an operational methodology, actions from the Lean philosophy and from optimization were implemented according to project management principles.
Abstract:
Thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the subject of Electrical and Computer Engineering by the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Performance appraisal plays an increasingly important role in any organizational environment. In the trucking industry, drivers are the company's image, and it is therefore important to develop and increase their performance and their commitment to the company's goals. This paper aims to create a performance appraisal model for truck drivers based on a multi-criteria decision aid methodology. The PROMETHEE and MMASSI methodologies were adapted using the criteria employed for performance appraisal by the trucking company studied. The appraisal involved all the truck drivers, their supervisors and the company's Managing Director. The final output is a ranking of the drivers, based on their performance, for each of the scenarios used. The results are to be used as a decision-making tool for allocating drivers to the domestic haul service.
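For readers unfamiliar with outranking methods, the Python sketch below implements plain PROMETHEE II net-flow ranking with the simplest ("usual") preference function. The criteria, weights and scores are hypothetical; the paper adapts PROMETHEE (and MMASSI) to the company's own appraisal criteria, which will differ.

```python
import numpy as np

def promethee_ii(scores, weights):
    """Rank alternatives by PROMETHEE II net outranking flow.
    scores: (n_alternatives, n_criteria), higher is better on every
    criterion; uses the 'usual' preference function (1 if better, else 0)."""
    n = scores.shape[0]
    pi = np.zeros((n, n))                     # aggregated preference indices
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            pi[a, b] = np.sum(weights * (d > 0))
    phi_plus = pi.sum(axis=1) / (n - 1)       # how much each driver outranks
    phi_minus = pi.sum(axis=0) / (n - 1)      # how much each is outranked
    return phi_plus - phi_minus               # net flow: rank descending

# Hypothetical example: 4 drivers scored on 3 appraisal criteria.
scores = np.array([[7.0, 8.0, 6.0],
                   [6.0, 9.0, 7.0],
                   [8.0, 6.0, 8.0],
                   [5.0, 7.0, 9.0]])
weights = np.array([0.5, 0.3, 0.2])           # criterion weights, sum to 1
net_flow = promethee_ii(scores, weights)
print("ranking (best first):", np.argsort(-net_flow))
```

Sorting drivers by decreasing net flow yields the kind of per-scenario ranking the abstract describes.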
Abstract:
This paper presents a decision support methodology to help virtual power players (VPPs) in the Smart Grid (SG) context solve the day-ahead energy resource scheduling problem, considering the intensive use of Distributed Generation (DG) and Vehicle-to-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach and a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling, minimizing total operation costs from the aggregator's point of view. A realistic mathematical formulation, considering the electric network constraints and the V2G charging and discharging efficiencies, is presented. A full AC power flow calculation is included in the hybrid method so that the network constraints can be taken into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance of the proposed method.
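To give a feel for the deterministic side of such a hybrid method, here is a deliberately tiny day-ahead scheduling example in Python, solved as a pure LP with scipy. It omits the integer variables, network constraints and V2G efficiencies of the paper's formulation; the prices, limits and demand profile are all invented.

```python
import numpy as np
from scipy.optimize import linprog

T = 24
demand = 10 + 4 * np.sin(np.linspace(0, 2 * np.pi, T))   # hourly load, MW
c_dg, c_sup = 40.0, 60.0     # illustrative costs, EUR/MWh (DG vs supplier)
dg_max, sup_max = 8.0, 20.0  # illustrative capacity limits, MW

# Decision variables per hour: [dg_t, sup_t]; minimise total energy cost.
c = np.tile([c_dg, c_sup], T)

# Balance constraint for every hour: dg_t + sup_t = demand_t.
A_eq = np.zeros((T, 2 * T))
for t in range(T):
    A_eq[t, 2 * t:2 * t + 2] = 1.0

res = linprog(c, A_eq=A_eq, b_eq=demand,
              bounds=[(0, dg_max), (0, sup_max)] * T)
print("total day-ahead cost:", round(res.fun, 1), "EUR")
```

In the hybrid scheme described above, a deterministic model of this kind would be combined with the particle swarm search and validated against a full AC power flow.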
Abstract:
The aggregation and management of Distributed Energy Resources (DERs) by a Virtual Power Player (VPP) is an important task in a smart grid context. The Energy Resource Management (ERM) of these DERs can become a hard and complex optimization problem. The large-scale integration of several DERs, including Electric Vehicles (EVs), may lead to a scenario in which the VPP needs several hours to obtain a solution to the ERM problem. This is why metaheuristic methodologies are needed to reach a good solution in a reasonable amount of time. This paper proposes a Simulated Annealing (SA) approach to the ERM problem considering an intensive use of DERs, mainly EVs. The possibility of applying Demand Response (DR) programs to the EVs is considered, and a trip-reduction DR program is implemented. The SA methodology is tested on a 32-bus distribution network with 2000 EVs, and the SA results are compared with a deterministic technique and with particle swarm optimization results.
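The core of a Simulated Annealing approach is a simple accept/reject loop with a cooling schedule. The Python sketch below shows that generic loop on a toy charging problem; the paper's ERM cost function, EV and network constraints, and DR modelling are not reproduced, and the neighbourhood and cooling parameters here are arbitrary.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, alpha=0.995,
                        iters=20_000, seed=0):
    """Generic SA minimiser: geometric cooling, Boltzmann acceptance."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbour(x, rng)
        fy = cost(y)
        # Accept improvements always; accept worse moves with
        # probability exp(-(fy - fx) / t), which shrinks as t cools.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha                      # geometric cooling schedule
    return best, fbest

# Toy example: choose hourly EV charging levels minimising a quadratic cost.
def cost(x):
    return sum((xi - 0.5) ** 2 for xi in x)

def neighbour(x, rng):
    y = list(x)
    i = rng.randrange(len(y))
    y[i] = min(1.0, max(0.0, y[i] + rng.uniform(-0.1, 0.1)))
    return y

print(simulated_annealing(cost, neighbour, [0.0] * 24)[1])
```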
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering