12 results for Fluid dynamics -- Data processing


Relevance:

100.00%

Publisher:

Abstract:

Estuaries are perhaps the most threatened environments of the coastal fringe; the coincidence of high natural value and attractiveness for human use has led to conflicts between conservation and development. These conflicts occur in the Sado Estuary, which lies next to the industrialised zone of the Setúbal Peninsula while a large part of the estuary is classified as a Natural Reserve because of its high biodiversity. These facts motivated the implementation of a model for environmental management and quality assessment, based on methodologies that allow the quality of the Sado Estuary and the human pressures acting on it to be evaluated. These methodologies rely on indicators that best depict the state of the environment, rather than on everything that could be measured or analysed. Sediments have long been regarded as a temporary source of some compounds, a sink for other materials, and an interface where a great diversity of biogeochemical transformations occurs; they are therefore of great importance in the formulation of coastal management systems. Many authors have used sediments to monitor aquatic contamination, with clear advantages over traditional water-column sampling. The main objective of this thesis was to develop an estuarine environmental management framework for the Sado Estuary based on the DPSIR model (EMMSado), covering data collection, data processing and data analysis. The supporting infrastructure of EMMSado is a set of spatially contiguous and homogeneous regions of sediment structure (management units). The environmental quality of the estuary was assessed through sediment quality assessment and integrated, at a preliminary stage, with the human pressure for development. Besides the advantages explained above, basing the assessment mainly on indicators and indices of the sediment compartment also makes the methodology simpler, faster and less demanding in human and financial resources, which are essential conditions for efficient environmental management of coastal areas. Data management, visualisation, processing and analysis were achieved through the combined use of indicators and indices, sampling optimisation techniques, Geographical Information Systems, remote sensing, spatial statistics, Global Positioning Systems and expert judgement. Overall, of the nineteen management units delineated and analysed, three showed no ecological risk (18.5 % of the study area). The areas of greatest concern (5.6 % of the study area) are located in the North Channel and are under strong human pressure, mainly from industrial activities. These areas also have low hydrodynamics and are therefore associated with high deposition levels. In particular, the areas near the Lisnave and Eurominas industries can also accumulate contamination coming from the Águas de Moura Channel, since particles from that channel can settle there due to the residual flow. In these areas the contaminants of concern, among those analysed, are heavy metals and metalloids (Cd, Cu, Zn and As exceeded the PEL guidelines) and the pesticides BHC isomers, heptachlor, isodrin, DDT and its metabolites, endosulfan and endrin. In the remaining management units (76 % of the study area) there is a moderate potential for adverse ecological effects, and in some of these areas no stress agents could be identified.
This emphasises the need for further research, since unmeasured chemicals may be causing or contributing to these adverse effects. Special attention must be given to the units with a moderate potential for adverse ecological effects that are located inside the natural reserve. Non-point source pollution from agriculture and aquaculture also seems to contribute a significant pollution load to the estuary through the Águas de Moura Channel; this pressure is expressed as a moderate potential for ecological risk in the areas near the entrance of this channel. Pressures may also come from the Alcácer Channel, although these were not quantified in this study. The management framework presented here, including all the methodological tools, may be applied and tested in other estuarine ecosystems, allowing comparisons between estuaries in other parts of the globe.
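
As a minimal illustration of the kind of sediment data processing such a framework involves (not the thesis's actual implementation), the Python sketch below flags management units in which any measured contaminant exceeds its PEL guideline. The unit names, concentrations and PEL thresholds shown are assumed placeholder values.

# Illustrative sketch: flag management units whose sediment contaminant
# concentrations exceed Probable Effect Level (PEL) guidelines.
# PEL values and unit data below are hypothetical placeholders.
PEL_MG_KG = {"Cd": 4.2, "Cu": 108.0, "Zn": 271.0, "As": 41.6}  # assumed thresholds

management_units = {
    "MU-01": {"Cd": 1.1, "Cu": 35.0, "Zn": 120.0, "As": 12.0},
    "MU-07": {"Cd": 5.3, "Cu": 140.0, "Zn": 310.0, "As": 50.0},  # hypothetical hotspot
}

def exceedances(concentrations, guidelines):
    """Return the contaminants whose concentration exceeds the guideline."""
    return [m for m, c in concentrations.items() if c > guidelines.get(m, float("inf"))]

for unit, data in management_units.items():
    over = exceedances(data, PEL_MG_KG)
    status = "potential ecological risk" if over else "no exceedance"
    print(f"{unit}: {status}" + (f" ({', '.join(over)})" if over else ""))

In a real assessment this kind of screening would be combined with the pesticide data, spatial statistics and expert judgement described above rather than used on its own.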

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Master in Electrical and Computer Engineering, at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance:

100.00%

Publisher:

Abstract:

A thesis submitted in fulfillment of the requirements for the degree of Master in Molecular Genetics and Biomedicine

Relevance:

100.00%

Publisher:

Abstract:

A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing two aqueous streams with different saline concentrations is spontaneous and releases energy. The theoretically obtainable global power from salinity gradient energy, due to the discharge of the world's rivers into the oceans, has been estimated at 1.4-2.6 TW. Reverse electrodialysis (RED) is one of the emerging membrane-based technologies for harvesting salinity gradient energy. A typical RED stack is composed of alternately arranged cation- and anion-exchange membranes stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes produces an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the need for new energy sources for power generation, the present study aims at better understanding and solving the current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feed waters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and flow entrance effects on mass transfer were found to enable an increase in power generation in RED stacks. Increasing the linear flow velocity also decreases the DBL thickness, but at the cost of a higher pressure drop. The pressure drop inside RED stacks was successfully simulated by the mathematical model developed here, which includes the contributions of several pressure drops that had not previously been considered. The effect of each pressure drop on RED stack performance was identified and rationalised, and guidelines for the planning and/or optimisation of RED stacks were derived. The design of new profiled membranes with a chevron corrugation structure was proposed using computational fluid dynamics (CFD) modelling. The performance of the suggested corrugation geometry was compared with existing geometries, as well as with the use of conductive and non-conductive spacers. According to these estimates, the chevron structures yield the highest net power density values, offering the best compromise between the mass transfer coefficient and the pressure drop. Finally, long-term experiments with natural waters were performed, during which fouling occurred. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on the ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. The information obtained was then used to predict net power density, stack electric resistance and pressure drop with multivariate statistical models based on projection to latent structures (PLS). The use in such models of 2D fluorescence data, containing information about fouling on the membrane surfaces that is hidden but extractable by PARAFAC, considerably improved the fit of the models to the experimental data.
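
To make the trade-off between flow velocity, pressure drop and net power more concrete, the Python sketch below estimates the pressure drop of laminar flow in an empty (spacer-free) RED compartment using the plane Poiseuille relation and subtracts the corresponding pumping power density from an assumed gross power density. This is a simplified textbook estimate, not the model developed in the thesis, and all numerical values are assumptions.

# Illustrative sketch (not the thesis model): pressure drop in an empty,
# spacer-free RED compartment treated as laminar flow between parallel
# plates, and the resulting pumping power density per membrane area.
def pressure_drop_parallel_plates(mu, length, velocity, height):
    """Plane Poiseuille estimate: dp = 12 * mu * L * v / h**2 [Pa]."""
    return 12.0 * mu * length * velocity / height**2

def pumping_power_density(dp, velocity, height, length):
    """Pumping power per unit membrane area [W/m2]: dp * Q / A = dp * v * h / L."""
    return dp * velocity * height / length

mu = 1.0e-3        # water viscosity, Pa*s (assumed)
length = 0.10      # compartment length, m (assumed)
height = 200e-6    # compartment thickness, m (assumed)
velocity = 0.01    # linear flow velocity, m/s (assumed)
gross_power = 2.0  # gross power density, W/m2 (assumed)

dp = pressure_drop_parallel_plates(mu, length, velocity, height)
pump = 2 * pumping_power_density(dp, velocity, height, length)  # dilute + concentrated compartments
print(f"pressure drop: {dp:.0f} Pa, net power density: {gross_power - pump:.2f} W/m2")

A spacer-filled or profiled-membrane channel would deviate from this idealised estimate, which is precisely why the thesis resorts to a dedicated pressure-drop model and CFD.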

Relevance:

100.00%

Publisher:

Abstract:

Only about 100 years ago was it finally established that biological systems have the ability to fix nitrogen. Chemistry was still lagging behind, but in 1913 Haber and Bosch designed the first commercial nitrogen-fixation plant and founded the modern inorganic chemical industry. The mechanisms of these two related processes are, however, very different. Laplaza and Cummins presented in Science a reaction that represents the culmination of 30 years of work by the chemical community on nitrogen fixation under near-atmospheric conditions. This discovery introduced a simple inorganic complex that can break the triple bond of the nitrogen molecule to form a new nitride complex without the need for any other reagent. That publication inspired the work presented in this thesis. Since mass transfer between the phases - gaseous (nitrogen) and the liquid solvent - was a factor limiting the reaction kinetics, the use of a supercritical solvent seemed an obvious improvement. Xenon is the only fluid that is supercritical at room temperature and chemically inert enough to be used as a solvent in contact with an extremely reactive substance capable of breaking the bond of the nitrogen molecule. In this work, the reaction discovered by Laplaza and Cummins was carried out in supercritical xenon. Performing this reaction involved several steps: 1. synthesis of the compound Mo(NRAr)3 (1) (where R is C(CD3)2CH3 and Ar is 3,5-C6H3(CH3)2), using a glove box and techniques under argon atmosphere; 2. construction of a new apparatus designed for the reaction of compound 1 with nitrogen in supercritical xenon, with continuous monitoring by visible spectrophotometry; 3. successive modifications, prompted by unforeseen experimental difficulties, leading to substantial reconstruction of the first apparatus so that the solubility of compound 1 in supercritical xenon could be measured and the diffusion of nitrogen in xenon ensured; 4. measurement of the solubility of complex 1 in supercritical xenon at room temperature and pressures between 6 and 10 MPa; 5. measurement of the kinetics of the reaction of compound 1 with gaseous nitrogen, using supercritical xenon as solvent, under various conditions, with different solvent environments, with different amounts of compound 1, and using detection methods completely different from the spectroscopic method initially planned; 6. use of CFD (computational fluid dynamics) simulations to interpret the results obtained. These simulations suggest that the high density of xenon induces slow sedimentation of the excess (undissolved) compound 1, which controls the distribution of the reaction product inside the reactor. The main conclusion was that cleavage of the bond of the nitrogen molecule by compound 1 can be achieved within seconds in supercritical xenon.
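
As a rough illustration of why a dense fluid slows the sedimentation of undissolved solid, the sketch below evaluates the Stokes terminal settling velocity of a small particle of complex 1 in supercritical xenon. The particle size and all fluid and solid properties are assumed order-of-magnitude values, not data from the thesis.

# Illustrative sketch: Stokes settling velocity of an undissolved particle in
# dense supercritical xenon. All property values are assumptions chosen only
# to show the order of magnitude; they are not taken from the thesis.
G = 9.81  # gravitational acceleration, m/s2

def stokes_velocity(radius, rho_particle, rho_fluid, viscosity):
    """Terminal settling velocity of a small sphere in creeping (Stokes) flow."""
    return 2.0 * (rho_particle - rho_fluid) * G * radius**2 / (9.0 * viscosity)

radius = 1e-6          # particle radius, m (assumed)
rho_particle = 1300.0  # kg/m3, assumed density of the solid complex
rho_xenon = 1100.0     # kg/m3, assumed supercritical xenon density near 8 MPa and room temperature
mu_xenon = 5e-5        # Pa*s, assumed viscosity

v = stokes_velocity(radius, rho_particle, rho_xenon, mu_xenon)
print(f"settling velocity ~ {v*1e6:.1f} um/s")  # small density contrast -> slow settling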

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Biomedical Engineering. This dissertation was developed at the Erasmus Medical Center in Rotterdam, the Netherlands

Relevance:

100.00%

Publisher:

Abstract:

Dissertation to obtain the degree of Doctor in Chemical Engineering, speciality of Biochemical Engineering

Relevance:

100.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

100.00%

Publisher:

Abstract:

Existing 3D scanning cameras and microscopes on the market use digital or discrete sensors, such as CCDs or CMOS imagers, for object detection applications. However, these combined systems are not fast enough for some application scenarios, since they require large data-processing resources and can be cumbersome. There is therefore a clear interest in exploring the possibilities and performance of analogue sensors, such as arrays of position sensitive detectors (PSDs), with the final goal of integrating them into 3D scanning cameras or microscopes for object detection. The work performed in this thesis deals with the implementation of prototype systems to explore object detection using amorphous silicon position sensors with 32 and 128 lines, produced in the clean room at CENIMAT-CEMOP. In the first phase of this work, the starting point was the fabrication and the study of the static and dynamic characteristics of the sensors, as well as their conditioning, in relation to the existing scientific and technological knowledge. Subsequently, the relevant data acquisition and signal processing electronics were assembled. Various prototypes were developed for the 32- and 128-line PSD array sensors, and appropriate optical solutions were integrated with the constructed prototypes, allowing the required experiments to be carried out and the results presented in this thesis to be obtained. All control, data acquisition and 3D rendering software was implemented for these systems, and all components were combined into several integrated systems for the 32- and 128-line PSD 3D sensors. The performance of the 32-line PSD array sensor and system was evaluated for machine vision applications, such as 3D object rendering, and for microscopy applications, such as detecting the movement of micro-objects. Trials were also performed with the 128-line PSD sensor systems. Sensor channel non-linearities of approximately 4 to 7% were obtained. The overall results show that a linear array of 32/128 1D line sensors based on amorphous silicon technology can be used to render 3D profiles of objects. The system and setup presented allow 3D rendering at high speeds and high frame rates. The minimum detail or gap that the sensor system can detect is approximately 350 μm with the current setup. It is also possible to render an object in 3D within a scanning angle range of 15° to 85° and to identify its real height as a function of the scanning angle and the image displacement distance on the sensor. Simple and less simple objects, such as a rubber and a plastic fork, can be rendered in 3D properly, accurately and at high resolution using this sensor and system platform. The nip-structure sensor system can detect primary and even derived colours of objects through a proper adjustment of the system integration time and by combining white, red, green and blue (RGB) light sources; a mean colorimetric error of 25.7 was obtained. It is also possible to detect the movement of micrometre-sized objects using the 32-line PSD sensor system. This kind of setup makes it possible to detect whether a micro-object is moving, and to determine its dimensions and its position in two dimensions, even at high speeds. The results show a non-linearity of about 3% and a spatial resolution of < 2 µm.
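
As an illustration of the basic signal processing behind such a system, the Python sketch below converts the two electrode photocurrents of a 1D PSD line into a spot position and then into an object height via a simplified triangulation relation. The read-out formula is the standard one for lateral-effect position sensors, but the triangulation geometry (h = d / (M * tan(theta))) and all numerical values are assumptions, not necessarily the exact optical setup used in this work.

# Illustrative sketch: 1D PSD read-out and a simplified triangulation step.
import math

def psd_position(i1, i2, length):
    """Light-spot position along a 1D PSD of active length `length`, centred at 0."""
    return 0.5 * length * (i2 - i1) / (i1 + i2)

def object_height(displacement, scan_angle_deg, magnification):
    """Assumed textbook triangulation: height from image displacement and scan angle."""
    return displacement / (magnification * math.tan(math.radians(scan_angle_deg)))

x = psd_position(i1=1.2e-6, i2=1.8e-6, length=2.5e-3)   # photocurrents in A (assumed)
h = object_height(displacement=x, scan_angle_deg=45.0, magnification=0.5)
print(f"spot at {x*1e6:.1f} um -> estimated height {h*1e6:.1f} um")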

Relevance:

100.00%

Publisher:

Abstract:

Deep eutectic solvents (DES) are considered novel renewable and biodegradable solvents, with a cheap and easy synthesis that produces no waste. A new subclass of DES was later discovered that can even be biocompatible, since its synthesis uses primary metabolites from organisms, such as amino acids, organic acids and sugars. This subclass was named natural deep eutectic solvents (NADES). Owing to these properties, the interaction between these solvents and biopolymers was studied with the aim of producing functionalised fibres for biomedical applications. Fibres were produced using the electrospinning technique. First, however, it was necessary to study some physical properties of NADES, as well as the influence of water on those properties. It was concluded that water has a strong influence on NADES properties, as shown by the rheology and viscosity studies: both the fluid dynamics and the viscosity changed. Next, the viability of using a starch blend was tested. The dissolution of these biopolymers in NADES was examined to assess their suitability for electrospinning, but the results were not satisfactory, since the starch polymers studied did not dissolve in any NADES or even in organic solvents. The approach was therefore changed and other biocompatible polymers were used. Poly(ethylene oxide), poly(vinyl alcohol) and gelatin were the other biopolymers tested for electrospinning with NADES. All of these polymers gave good results, since fibres could be obtained. For gelatin, however, only eutectic mixtures containing active pharmaceutical ingredients (APIs) were used, instead of NADES; in this case, mandelic acid (antimicrobial properties), choline chloride, ibuprofen (anti-inflammatory properties) and menthol (analgesic properties) were employed. The polymers and the produced fibres were characterised by scanning electron microscopy (SEM), transmission electron microscopy (TEM) and Fourier transform infrared spectroscopy (FTIR). With the help of these techniques it was possible to conclude that NADES could be encapsulated within the fibres. Rheology was also studied for poly(ethylene oxide) and poly(vinyl alcohol), in order to understand the influence of polymer concentration on the electrospinning technique. For gelatin, in addition to the characterisation techniques, cytotoxicity and drug-release studies were performed. The gelatin membranes did not show any toxicity to the cells, since cell viability was maintained. Regarding the controlled-release experiment, no conclusion could be drawn, due to the rapid and complete dissolution of the gelatin in the buffer solution. It was, however, possible to quantify the mixture of choline chloride with mandelic acid, which completed and confirmed the information already obtained with the other characterisation techniques.

Relevance:

100.00%

Publisher:

Abstract:

With the growth of the internet, through the semantic web, together with improvements in communication speed and the fast development of storage devices, the volume of data and information rises considerably every day. Because of this, in the last few years there has been a growing interest in structures for formal representation with suitable characteristics, such as the possibility of organising data and information, as well as the reuse of their contents for the generation of new knowledge. Controlled vocabularies, and specifically ontologies, stand out as representation structures with high potential: they not only allow data to be represented, but also allow such data to be reused for knowledge extraction, with subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies need maintenance. Ontology Learning is the area that studies the details of updating and maintaining ontologies. The relevant literature already presents first results on automatic ontology maintenance, but still at a very early stage; human-based processes remain the usual way to update and maintain an ontology, which makes this a cumbersome task. The generation of new knowledge for ontology growth can be based on Data Mining techniques, an area that studies techniques for data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources, using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and the semantic relations present in an ontology. In order to verify the applicability of the proposed method, a proof of concept was developed and its results are presented, applied to the building and construction sector.
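
As a minimal, purely illustrative example of the kind of pattern discovery such a method can build on (the thesis method itself is richer), the Python sketch below counts co-occurrences of known ontology concepts in the sentences of an unstructured corpus and proposes frequently co-occurring pairs as candidate relations for human validation. The concept list, corpus and threshold are hypothetical.

# Illustrative sketch only: co-occurrence based pattern discovery suggesting
# candidate (unlabelled) relations between known ontology concepts.
from itertools import combinations
from collections import Counter

concepts = {"concrete", "formwork", "curing", "reinforcement"}  # assumed concept set

corpus = [
    "The concrete is poured into the formwork before curing begins.",
    "Steel reinforcement is placed inside the formwork.",
    "Curing of the concrete takes several days.",
]

pair_counts = Counter()
for sentence in corpus:
    found = {c for c in concepts if c in sentence.lower()}
    pair_counts.update(combinations(sorted(found), 2))

# Propose a candidate relation for every concept pair co-occurring at least twice.
candidates = [pair for pair, n in pair_counts.items() if n >= 2]
print(candidates)  # e.g. [('concrete', 'curing')] - to be validated by a human expert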

Relevance:

100.00%

Publisher:

Abstract:

The protection of personal data is an indisputably complex and transversal subject, and it is the topic of this report, which results from a curricular internship at the Portuguese Commission for Data Protection, the competent authority for the control and supervision of personal data processing. The subject around which this report was prepared is the protection of personal data, analysed from several perspectives. The protection of personal data has for some time been a topic that raises many concerns, because it is closely linked to constitutionally protected fundamental rights. The fundamental rights inherent to each of us follow from Article 1 of the Constitution of the Portuguese Republic, in the sense that the dignity of the human person is affirmed as the first value around which the Portuguese legal system is built; in other words, the dignity of the human person is the highest value in the Portuguese legal system. It was the development of societies to the point we know today that gave personal data its current importance. In modern societies it is possible to know everything about everyone, and the curiosity of others seems undeterred by the harm it causes to the rights of citizens. Where new technologies serve as an excuse for the excessive processing of personal data, and where data subjects do not seem to mind their personal data crossing the world, it is important that legal systems value the protection of personal data and address the implications of its misuse, since these data are the mirror of the identity of each of us and can be used against their owners, causing irreparable damage to their fundamental rights. If the protection of personal data is understood as the possibility for each citizen to decide on the use of their data and how they may be used, then this protection depends essentially on each of us, as data subjects: the protection of our data begins with ourselves.