940 results for Low cost piezoelectric sensor
Abstract:
A report from the National Institutes of Health defines a disease biomarker as a “characteristic that is objectively measured and evaluated as an indicator of normal biologic processes, pathogenic processes, or pharmacologic responses to a therapeutic intervention.” Early diagnosis is a crucial factor for incurable diseases such as cancer and Alzheimer’s disease (AD). During the last decade, researchers have discovered that the biochemical changes caused by a disease can be detected considerably earlier than its physical manifestations or symptoms. In this dissertation, electrochemical detection was utilized as the detection strategy because it offers high sensitivity and specificity, ease of operation, and the capability for miniaturization and multiplexed detection. Electrochemical detection of biological analytes is an established field that has matured at a rapid pace over the last 50 years and adapted itself to advances in micro/nanofabrication procedures. Carbon fiber microelectrodes were utilized as the platform sensor due to their high signal-to-noise ratio, ease and low cost of fabrication, biocompatibility, and active carbon surface, which allows conjugation with biorecognition moieties. This dissertation specifically focuses on the detection of three extensively validated biomarkers for cancer and AD. First, vascular endothelial growth factor (VEGF), a cancer biomarker, was detected using a one-step, reagentless immunosensing strategy. The immunosensing strategy provided a rapid and sensitive means of VEGF detection, with a detection limit of about 38 pg/mL and a linear dynamic range of 0–100 pg/mL. Direct detection of the AD-related biomarker amyloid beta (Aβ) was achieved by exploiting its inherent electroactivity. Quantification of the Aβ1-40/Aβ1-42 ratio (the Aβ ratio) has been established, through human clinical trials, as a reliable test for diagnosing AD. Triple-barrel carbon fiber microelectrodes were used to simultaneously detect Aβ1-40 and Aβ1-42 in cerebrospinal fluid from rats, within detection ranges of 100 nM to 1.2 μM and 400 nM to 1 μM, respectively. In addition, the release of the DNA damage/repair biomarker 8-hydroxydeoxyguanosine (8-OHdG) under oxidative stress from a single lung endothelial cell was monitored using an activated carbon fiber microelectrode. The sensor was used to test the influence of nicotine, one of the most biologically active chemicals present in cigarette smoke and smokeless tobacco.
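As a point of reference (not stated in the abstract, which may use a different estimator), detection limits for calibrated electrochemical immunosensors of this kind are commonly computed from the blank noise and the calibration slope:

```latex
\mathrm{LOD} \;\approx\; \frac{3\,\sigma_{\mathrm{blank}}}{S}, \qquad
S = \left.\frac{\partial i}{\partial C}\right|_{\text{linear range}}
```

where $\sigma_{\mathrm{blank}}$ is the standard deviation of the blank signal and $S$ the sensitivity over the linear range (here 0–100 pg/mL).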
Abstract:
A Wireless Sensor Network (WSN) consists of devices distributed over an area to monitor physical variables such as temperature, pressure, vibration, and motion, as well as environmental conditions, in places where wired networks would be difficult or impractical to implement, for example, industrial applications with difficult access, monitoring and control of on-shore or off-shore oil wells, and monitoring of large agricultural and livestock areas, among others. To be viable, a WSN must meet important requirements such as low cost, low latency, and, especially, low power consumption. To meet these requirements, however, these networks operate with limited resources and are sometimes deployed in hostile environments, which leads to high failure rates, such as segmented routes and message loss, reducing efficiency and potentially compromising the entire network. This work presents FTE-LEACH, a fault-tolerant and energy-efficient routing protocol that maintains efficiency in communication and data dissemination. The protocol was developed based on the IEEE 802.15.4 standard and is suitable for industrial networks with limited energy resources.
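The abstract does not detail FTE-LEACH's election rule, but protocols in the LEACH family build on the classic LEACH cluster-head rotation threshold. The sketch below shows only that textbook baseline (function names are illustrative), not FTE-LEACH itself:

```python
import random

def leach_threshold(p: float, r: int) -> float:
    """Classic LEACH threshold T(n) for round r, where p is the
    desired fraction of cluster heads per round."""
    return p / (1 - p * (r % round(1 / p)))

def elects_itself(p: float, r: int, was_head_this_epoch: bool) -> bool:
    """A node self-elects as cluster head with probability T(n);
    nodes that already served this epoch (1/p rounds) abstain."""
    if was_head_this_epoch:
        return False
    return random.random() < leach_threshold(p, r)

# Example: target 5% cluster heads per round, round 3 of the epoch.
print(elects_itself(p=0.05, r=3, was_head_this_epoch=False))
```

Rotating the cluster-head role this way spreads the energy cost of long-range transmissions across the network, the baseline that fault-tolerant variants extend.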
Abstract:
The main focus of this thesis is the relative localization problem of a heterogeneous team comprising both ground robots and micro aerial vehicles. This team configuration makes it possible to combine the advantages of increased accessibility and better perspective provided by aerial robots with the greater computational and sensory resources provided by the ground agents, to realize a cooperative multi-robot system suitable for hostile autonomous missions. In such a scenario, however, the strict constraints on flight time, sensor payload, and computational capability of micro aerial vehicles limit the practical applicability of popular map-based localization schemes for GPS-denied navigation. The resource-limited aerial platforms of this team therefore demand simpler localization means for autonomous navigation. Relative localization is the process of estimating the formation of a robot team using acquired inter-robot relative measurements. This allows the team members to know their relative formation even without a global localization reference, such as GPS or a map. A typical robot team would thus benefit from a relative localization service, since it would allow the team to implement formation control, collision avoidance, and supervisory control tasks independent of a global localization service. More importantly, a heterogeneous team, such as ground robots and computationally constrained aerial vehicles, would benefit from a relative localization service since it provides the crucial localization information required for autonomous operation of the weaker agents. This enables less capable robots to assume supportive roles and contribute to the more powerful robots executing the mission. Hence, this study proposes a relative localization-based approach for ground and micro aerial vehicle cooperation and develops the inter-robot measurement, filtering, and distributed computing modules necessary to realize the system. The research results in three significant contributions. First, the work designs and validates a novel inter-robot relative measurement hardware solution with the accuracy, range, and scalability characteristics necessary for relative localization. Second, it performs the analysis and design of a novel nonlinear filtering method, which allows the implementation of relative localization modules and attitude reference filters on low-cost devices with optimal tuning parameters. Third, it designs and validates a novel distributed relative localization approach, which harnesses the distributed computing capability of the team to minimize communication requirements, achieve consistent estimation, and enable efficient data correspondence within the network. The complete relative localization-based system is validated through multiple indoor experiments and numerical simulations. The relative localization-based navigation concept, with the sensing, filtering, and distributed computing methods introduced in this thesis, complements the system limitations of a ground and micro aerial vehicle team and also targets hostile environmental conditions. The work thus constitutes an essential step towards realizing autonomous navigation of heterogeneous teams in real-world applications.
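As a minimal illustration of relative localization from inter-robot measurements (a generic sketch, not the thesis's filter; all positions, ranges, and names are hypothetical), the following estimates a teammate's 2D position from range measurements by Gauss-Newton least squares:

```python
import numpy as np

def locate_from_ranges(anchors, ranges, x0, iters=10):
    """Gauss-Newton estimate of a 2D position from ranges to
    teammates at known relative positions (the 'anchors')."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                    # (N, 2) offsets
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        residuals = dists - ranges
        J = diffs / dists[:, None]             # Jacobian of ||x - a_i||
        x -= np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x

# Hypothetical setup: three ground robots, noise-free ranges.
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
true_pos = np.array([2.0, 2.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
print(locate_from_ranges(anchors, ranges, x0=[1.0, 1.0]))  # ~[2, 2]
```

A real system would wrap such a measurement model in a recursive filter and fuse odometry, which is where the thesis's nonlinear filtering and distributed computing contributions come in.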
Abstract:
This thesis explores methods for the fabrication of nanohole arrays and their integration into a benchtop system for use as sensors or anti-counterfeit labels. Chapter 1 gives an introduction to plasmonics, and more specifically to nanohole arrays and their potential as label-free sensors compared to the current biosensors on the market. Various fabrication methods are explored, including focused ion beam (FIB) milling, electron beam lithography, nanoimprint lithography, template stripping, and phase-shift lithography. FIB was chosen to fabricate the nanohole arrays due to its suitability for rapid prototyping and its relatively low cost. In chapter 2, the fabrication of nanohole arrays using FIB is described and the samples are characterised. The fabricated nanohole arrays are tested as bulk refractive index sensors, before a bioassay using whole-molecule human IgG antibodies and antigen is developed and performed on the sensor. In chapter 3, the fabricated sensors are integrated into a custom-built system capable of real-time, multiplexed detection of biomolecules. Here, scFv antibodies of two biomolecules relevant to the detection of pancreatic cancer (C1q and C3) are attached to the nanohole arrays, and detection of their complementary proteins is demonstrated both in buffer (10 nM detection of C1q Ag) and in human serum. Chapter 4 explores arrays of anisotropic (elliptical) nanoholes and shows how the shape anisotropy induces polarisation-sensitive transmission spectra, in both simulations and fabricated arrays. The potential use of such samples as visible and NIR tags for anti-counterfeiting applications is demonstrated. Finally, chapter 5 gives a summary of the work completed and discusses potential future work in this area.
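For context, the sensing behaviour of such nanohole arrays rests on the extraordinary-optical-transmission resonance; for a square array of period $P$, the transmission maxima are commonly approximated by the standard momentum-matching relation (a textbook result, not specific to this thesis):

```latex
\lambda_{\max}(i,j) \;\approx\; \frac{P}{\sqrt{i^{2}+j^{2}}}
\sqrt{\frac{\varepsilon_{m}\,\varepsilon_{d}}{\varepsilon_{m}+\varepsilon_{d}}}
```

where $(i,j)$ indexes the grating order and $\varepsilon_m$, $\varepsilon_d$ are the metal and dielectric permittivities. Since $\varepsilon_d$ belongs to the medium in contact with the array, binding events shift $\lambda_{\max}$, which is what the bulk and bioassay measurements in chapter 2 track.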
Abstract:
The absence of rapid, low-cost, and highly sensitive biodetection platforms has hindered the implementation of the next generation of cheap, early-stage, clinical or home-based point-of-care diagnostics. Label-free optical biosensing with high sensitivity, throughput, compactness, and low cost plays an important role in resolving these diagnostic challenges and pushes the detection limit down to the single-molecule level. Optical nanostructures, specifically resonant waveguide grating (RWG) and nano-ribbon-cavity-based biodetection, are promising in this context. The main element of this dissertation is the design, fabrication, and characterization of RWG sensors for different spectral regions (e.g. visible, near infrared) for use in label-free optical biosensing, together with the exploration of different RWG parameters to maximize sensitivity and increase detection accuracy. The design and fabrication of waveguide-embedded resonant nano-cavities are also studied. Multi-parametric analyses were performed using a customized optical simulator to understand the operational principle of these sensors and, more importantly, the relationship between the physical design parameters and sensor sensitivities. Silicon nitride (SixNy) is a useful waveguide material because of its wide transparency across the whole infrared, visible, and part of the UV spectrum, and its comparatively higher refractive index than the glass substrate. SixNy-based RWGs on glass substrates are designed and fabricated applying both electron beam lithography and low-cost nano-imprint lithography techniques. A chromium-hard-mask-aided nanofabrication technique is developed for making very high aspect ratio optical nanostructures on glass substrates. An aspect ratio of 10 for very narrow (~60 nm wide) grating lines is achieved, the highest presented so far. The fabricated RWG sensors are characterized for both bulk (183.3 nm/RIU) and surface (0.21 nm/nm-layer) sensitivity, and then used for successful detection of Immunoglobulin G (IgG) antibodies and antigen (~1 μg/mL) both in buffer and in serum. Widely used optical biosensors such as surface plasmon resonance and optical microcavities are limited in separating the bulk response from surface binding events, which is crucial for ultralow-level biosensing under thermal or other perturbations. An RWG-based dual-resonance approach is proposed and verified by controlled experiments for separating the bulk and surface responses. The dual-resonance approach gives a sensitivity ratio of 9.4, whereas the competing polarization-based approach offers only 2.5. The improved performance of the dual-resonance approach should help reduce the probability of false readings in precise bioassay experiments where thermal variations are likely, such as portable diagnostics.
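To put the reported bulk sensitivity in perspective, the smallest resolvable index change is the readout's wavelength resolution divided by the sensitivity; assuming, purely for illustration, a 10 pm spectral resolution (a hypothetical figure, not from the dissertation):

```latex
\Delta n_{\min} \;=\; \frac{\delta\lambda_{\min}}{S_{b}}
\;=\; \frac{0.01\ \text{nm}}{183.3\ \text{nm/RIU}}
\;\approx\; 5.5\times 10^{-5}\ \text{RIU}
```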
Abstract:
This thesis involved the development of two biosensors and their associated assays for the detection of diseases, namely IBR and BVD for veterinary use and C1q protein, a biomarker for pancreatic cancer, for medical application, using Surface Plasmon Resonance (SPR) and nanoplasmonics. SPR techniques have been used by a number of groups, both in research [1-3] and commercially [4, 5], as a diagnostic tool for the detection of various biomolecules, especially antibodies [6-8]. The biosensor market is an ever-expanding field, with new technology and new companies rapidly emerging, for both human [8] and veterinary applications [9, 10]. In Chapter 2, we discuss the development of a simultaneous IBR and BVD virus assay for the detection of antibodies in bovine serum on an SPR-2 platform. Pancreatic cancer is the most lethal cancer by organ site, partially due to the lack of a reliable molecular signature for diagnostic testing. C1q protein has recently been proposed as a biomarker within a panel for the detection of pancreatic cancer. The third chapter discusses the fabrication, assays, and characterisation of nanoplasmonic arrays. We discuss the development of C1q scFv antibody assays and clone screening of the antibodies, and subsequently move the assays onto the nanoplasmonic array platform for static assays, as well as onto a custom hybrid benchtop system, as a diagnostic method for the detection of pancreatic cancer. Finally, in chapter 4, we move on to Guided Mode Resonance (GMR) sensors as a low-cost option for potential use in Point-of-Care diagnostics. The C1q and BVD assays used in the prior formats are transferred to this platform to ascertain its usability as a cost-effective, reliable sensor for diagnostic testing. We discuss the fabrication, characterisation, and assay development, as well as their use in the benchtop hybrid system.
Abstract:
Because of their high efficacy, long lifespan, and environment-friendly operation, LED lighting devices have become more and more popular in every part of our life, such as ornamental/interior lighting, outdoor lighting, and flood lighting. The LED driver is the most critical part of the LED lighting fixture: it heavily affects the purchase cost, the operating cost, and the light quality. The goal of this work is to design a high-efficiency, low-component-cost, flicker-free LED driver. The conventional single-stage LED driver achieves low cost and high efficiency; however, it inevitably produces significant twice-line-frequency lighting flicker, which adversely affects our health. The conventional two-stage LED driver achieves flicker-free LED driving at the expense of significantly added component cost, greater design complexity, and lower efficiency. The basic ripple cancellation LED driving method is proposed in chapter three. It achieves high efficiency and low component cost, like the single-stage LED driver, while also obtaining flicker-free LED driving performance. The basic ripple cancellation LED driver is the foundation of the entire thesis. As the research evolved, another two ripple cancellation LED drivers were developed to improve different aspects of the basic design. The primary-side-controlled ripple cancellation LED driver is proposed in chapter four to further reduce the cost of the control circuit. It eliminates the secondary-side compensation circuit and an opto-coupler from the design while maintaining flicker-free LED driving. A potential integrated primary-side controller can be designed based on the proposed LED driving method. The energy-channeling ripple cancellation LED driver is proposed in chapter five to further reduce the cost of the power stage circuit. In the previous two ripple cancellation LED drivers, an additional DC-DC converter is needed to achieve ripple cancellation. In the energy-channeling design, a power transistor successfully replaces the separate DC-DC converter, thereby achieving lower cost. Detailed analysis supports the theory of the proposed ripple cancellation LED drivers, and simulations and experiments verify them.
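The twice-line-frequency flicker mentioned above follows directly from single-phase AC power: with unity power factor, the instantaneous input power of a single-stage driver pulsates at twice the line frequency, and without a second stage or a ripple canceller that pulsation reaches the LED current:

```latex
p_{\mathrm{in}}(t) \;=\; V_{m}\sin(\omega t)\,\cdot\, I_{m}\sin(\omega t)
\;=\; \frac{V_{m} I_{m}}{2}\left(1-\cos 2\omega t\right)
```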
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including host-based IDS (HIDS) and network-based IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, networks, sessions, assets, alerts, and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for management of the system and analysis of data, such as Sguil, Snorby, Squert, and Enterprise Log Search and Archive (ELSA). These interfaces can be used for analysis of alerts and captured events, which can then be exported for further analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME, or Xplico. The Security Onion platform also provides various methods of management, such as Secure Shell (SSH) for management of server and sensors, and Web client remote access. All of this, together with the ability to replay and analyse example malicious traffic, makes Security Onion a suitable low-cost alternative for network security management. In this paper, we review the features and functionality of Security Onion in terms of types of data, configuration, interfaces, tools, and system management.
Abstract:
In today’s big data world, data is being produced in massive volumes, at great velocity, and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks, and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership, and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage, and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built, and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
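To illustrate the progressive-sampling idea behind NOW! (a generic sketch with hypothetical names, not the system's actual API), the following computes an aggregate over progressively larger deterministic samples, emitting early estimates that converge to the exact answer:

```python
import random

def progressive_mean(data, batch_size=1000, seed=42):
    """Yield successive estimates of the mean over a growing,
    deterministically shuffled sample; the last uses all data."""
    rng = random.Random(seed)      # fixed seed: repeatable semantics
    sample = data[:]
    rng.shuffle(sample)
    total, count = 0.0, 0
    for i in range(0, len(sample), batch_size):
        for x in sample[i:i + batch_size]:
            total += x
            count += 1
        yield total / count        # early result after each batch

# Example: watch the estimate converge on synthetic data.
data = [random.gauss(100.0, 15.0) for _ in range(10_000)]
for estimate in progressive_mean(data, batch_size=2500):
    print(round(estimate, 2))
```

The fixed shuffle order is what gives repeatable semantics across runs, one of the properties the abstract highlights.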
Abstract:
Fiber-optic sensors have played an important role in applications for monitoring the health of civil infrastructures such as bridges, oil rigs, and railroads. Due to the reduction in cost of fiber-optic components and systems, fiber-optic sensors have been studied extensively for their higher sensitivity, precision, and immunity to electrical interference compared to their electrical counterparts. A fiber Bragg grating (FBG) strain sensor was employed in this study to detect and distinguish normal and lateral loads on rail tracks. A theoretical analysis of the relationship between strain and displacement under vertical and horizontal strains on an aluminum beam was performed, and the results are in excellent agreement with the measured strain data. A single-FBG sensor system with an erbium-doped fiber amplifier broadband source was then implemented. Force and temperature applied to the system resulted in changes of 0.05 nm per 50 με and 0.094 nm per 10 °C at the center wavelength of the FBG. Furthermore, a low-cost fiber-optic sensor system with a distributed feedback (DFB) laser as the light source was implemented, and we show that it has superior noise and sensitivity performance compared to strain gauge sensors. The design was extended to accommodate multiple sensors with negligible crosstalk. When two cascaded sensors on a rail track section were tested, strain readings of the sensor 20 inches away from the position of the applied force decayed to one seventh of those of the sensor at the applied force location. The two FBG sensor systems can detect 1 ton of vertical load with a square-wave pattern and 0.1 ton of lateral load (3 tons and 0.5 ton, respectively, for strain gauges). Moreover, a single FBG sensor was found capable of detecting and distinguishing lateral and normal strains applied at different frequencies. FBG sensors are promising alternatives to electrical sensors for their high sensitivity, ease of installation, and immunity to electromagnetic interference.
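The reported responses are consistent with the standard FBG strain-temperature relation (a textbook result; the coefficient values below are typical assumptions, not taken from the thesis):

```latex
\frac{\Delta\lambda_{B}}{\lambda_{B}}
\;=\; \left(1-p_{e}\right)\varepsilon \;+\; \left(\alpha+\xi\right)\Delta T
```

With an effective photoelastic constant $p_e \approx 0.22$, a grating near $\lambda_B \approx 1.3\ \mu\text{m}$ gives $\lambda_B(1-p_e) \approx 1\ \text{pm}/\mu\varepsilon$, matching the reported 0.05 nm per 50 με.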
Abstract:
Growing global urbanization has led to increased levels of atmospheric pollutants and a corresponding deterioration of air quality. Controlling air pollution and monitoring air quality are fundamental steps towards implementing reduction strategies and fostering citizens' environmental awareness. To this end, several techniques and technologies can be used to monitor air quality. The use of microsensors has emerged as an innovative tool for air quality monitoring. Although the performance of microsensors enables a new strategy, delivering fast responses, low operating costs, and high efficiencies that cannot be achieved with conventional approaches alone, deeper knowledge is still needed in order to integrate these new technologies, particularly regarding the verification of sensor performance against reference methods in experimental campaigns. This dissertation, developed as an internship at the Instituto do Ambiente e Desenvolvimento, aimed to evaluate the performance of low-cost sensors against reference methods, based on an air quality monitoring campaign carried out in the centre of Aveiro during two weeks of October 2014. More specifically, it seeks to determine to what extent low-cost sensors can be used in compliance with the requirements specified in legislation and the specifics of the standards, thereby establishing a protocol for microsensor evaluation. The work also included the characterization of air quality in the centre of Aveiro for the period of the monitoring campaign. The deployment of electrochemical, MOS, and OPC microsensors in parallel with reference equipment in this field study made it possible to evaluate the reliability and uncertainty of these new monitoring technologies. This work showed that electrochemical microsensors are more accurate than metal-oxide-based microsensors, exhibiting strong correlations with the reference methods for several pollutants. The results obtained by the optical particle counters were satisfactory, but could be improved both by the sampling approach and by the data processing method applied. Ideally, microsensors should show strong correlations with the reference method and high data collection efficiency. However, some problems were identified in the sensors' data collection efficiency, which may be related to relative humidity and high temperatures during the campaign, intermittent communication failures, and also instability and reactivity caused by interfering gases. Once the limitations of sensor technologies are overcome and adequate quality assurance and quality control procedures can be followed, low-cost sensors have great potential to enable air quality monitoring with high spatial coverage, which is mainly beneficial in urban areas.
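A minimal sketch of the kind of sensor-versus-reference comparison described above, assuming paired hourly averages (Pearson correlation and data-capture rate are common metrics in such protocols, though the dissertation's exact criteria are not stated in the abstract; all values and names here are hypothetical):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired sensor and reference values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def data_capture_rate(records, expected):
    """Fraction of expected reporting periods with valid sensor data."""
    return sum(1 for r in records if r is not None) / expected

# Hypothetical paired hourly NO2 averages: microsensor vs. reference.
sensor    = [41.0, 38.5, 44.2, 50.1, 47.3, 39.8]
reference = [40.2, 37.9, 45.0, 51.5, 46.1, 38.7]
print(f"r = {pearson_r(sensor, reference):.3f}")
print(f"capture = {data_capture_rate(sensor, expected=8):.0%}")
```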
Abstract:
The need to control the quality of liquids quickly, effectively, and precisely, especially water, an essential good for human beings, has driven the development of devices capable of relating the measurement of physical parameters to water quality. These devices must be able to monitor the numerous parameters needed to assess the quality of the liquid, such as turbidity, refractive index, sediment concentration, and chromatic properties; multi-parameter sensors are therefore required. POF (polymer optical fiber) technology has been identified as having high potential for the development of optical sensors in a wide range of applications, given its low cost, immunity to electromagnetic interference, and flexibility. In this work, a multi-parameter POF sensor is proposed that combines, in a single device, the ability to measure both high and low turbidity values, as well as the ability to measure several parameters simultaneously and in real time. The results show an increased dynamic range for turbidity and sediment concentration compared to multi-parameter sensors already on the market, since the sensor responded well at both high and low values. A method for decorrelating the different parameters was applied successfully. Building on this type of sensor, WATGRID LDA. aims to provide its customers with intelligent, integrated solutions (platforms) for assessing and managing the quality of liquids for consumption (e.g. water and wine). These solutions will allow WATGRID LDA.'s customers to increase their efficiency and product quality and gain greater process control while reducing costs.
Abstract:
Energy efficiency is a topic of the day, with a significant effort being made to obtain ever more efficient equipment. A significant share of global energy consumption, as well as of the emission of harmful and greenhouse gases, is associated with heating and cooling, both domestic and industrial. It is therefore important to develop more efficient technologies in this domain. The main objective of this work is the development of an interface module for a lambda probe for combustion monitoring in biomass boilers. This module measures the oxygen concentration at the boiler outlet, enabling dynamic adjustment of the combustion parameters in order to maximize efficiency and minimize the emission of pollutant gases. The module developed is low-cost and presents a very simple interface, facilitating its incorporation into existing equipment. The results obtained proved consistent with the theoretical values provided by the manufacturer of the probe used, so the work can be considered a success.
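For reference, a zirconia lambda probe behaves as an oxygen concentration cell, so its output follows the Nernst equation; the sketch below inverts it to recover the exhaust oxygen partial pressure (illustrative only; the reading is hypothetical, and the module's actual conversion may follow the manufacturer's tables instead):

```python
import math

R = 8.314         # J/(mol*K), universal gas constant
F = 96485.0       # C/mol, Faraday constant
P_O2_REF = 0.209  # atm, O2 partial pressure on the reference air side

def o2_partial_pressure(v_probe: float, temp_k: float) -> float:
    """Invert the Nernst relation V = (R*T)/(4*F) * ln(p_ref / p_exh)
    to obtain the exhaust O2 partial pressure in atm."""
    return P_O2_REF / math.exp(4 * F * v_probe / (R * temp_k))

# Hypothetical reading: 30 mV at a probe temperature of 1000 K.
print(f"{o2_partial_pressure(0.030, 1000.0):.4f} atm")  # ~0.05 atm O2
```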
Abstract:
The main task is to analyze the state of the art of grating coupler production and low-cost polymer substrates, and then to provide a recommendation of a new or adapted process for the production of metallic gratings on polymer sheets, based on a Failure Mode and Effects Analysis (FMEA). To achieve this, the thesis is divided into four chapters. After the introductory first chapter, the second chapter provides details about the state of the art in optical technology platforms, with a focus on polymers and their main features for the intended application, such as flexibility, low cost, and roll-to-roll compatibility. It then defines diffraction gratings and their specifications, and closes with an explanation of the adhesion mechanisms of inorganic materials on polymer substrates. The third chapter discusses the processing of grating couplers. It introduces the basic fabrication methods and details a selection of current fabrication schemes found in the literature, with an assessment of their potential use for the desired application. The last chapter is an FMEA of the selected fabrication process, called Flip and Fuse, in order to check its capability to realize the grating structure.
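For context, the pitch of such a grating coupler is set by the standard phase-matching condition between the incident beam and the guided mode (a textbook relation in generic notation, not the thesis's own symbols):

```latex
n_{\mathrm{eff}} \;=\; n_{c}\sin\theta \;+\; m\,\frac{\lambda}{\Lambda}
```

where $\Lambda$ is the grating period, $m$ the diffraction order, $\theta$ the incidence angle in a cover medium of index $n_c$, and $n_{\mathrm{eff}}$ the effective index of the guided mode; this is why the dimensional fidelity of the metallic grating, checked by the FMEA, is critical.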