Abstract:
Multiple breath wash-out (MBW) testing requires prior wash-in of an inert tracer gas. Wash-in efficiency can be enhanced by rebreathing the tracer in a closed circuit. Previous attempts to deploy this did not account for the impact of CO2 accumulation on patients and were unsuccessful. We hypothesised that an effective rebreathe wash-in could be delivered and that it would not alter wash-out parameters. Computer modelling was used to assess the impact of the rebreathe method on wash-in efficiency. Clinical testing of open- and closed-circuit wash-in/wash-out was performed in healthy controls and adult patients with cystic fibrosis (CF), using a circuit with an effective CO2 scrubber and a refined wash-in protocol. Wash-in efficiency was enhanced by rebreathing. There was no difference in mean lung clearance index between the two wash-in methods for controls (6.5 versus 6.4; p=0.2, n=12) or patients with CF (10.9 versus 10.8; p=0.2, n=19). Test time was reduced by rebreathe wash-in (156 versus 230 s for CF patients, p<0.001) and both methods were well tolerated. End wash-in CO2 was maintained below 2% in most cases. Rebreathe wash-in is a promising development that, when correctly deployed, reduces wash-in time and facilitates portable MBW testing. For mild CF, wash-out outcomes are equivalent to those of an open circuit.
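The lung clearance index compared in this abstract is conventionally computed as the cumulative expired volume needed to wash the tracer down to 1/40 of its end-wash-in concentration, divided by the functional residual capacity. A minimal sketch of that arithmetic (the numbers are illustrative, not data from the study):

```python
def lung_clearance_index(cumulative_expired_volume_l, frc_l):
    """LCI: number of lung turnovers needed to wash the inert tracer
    down to 1/40 of its end-wash-in concentration."""
    return cumulative_expired_volume_l / frc_l

# Washing out 19.5 L of expired gas against an FRC of 3.0 L gives LCI = 6.5
print(lung_clearance_index(19.5, 3.0))
```

A higher LCI therefore directly reflects more ventilation inhomogeneity: more turnovers are needed to clear the tracer.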
Abstract:
Roadside safety barrier designs are tested with passenger cars in Europe using standard EN 1317, in which the impact angle for normal, high and very high containment level tests is 20°. In comparison to EN 1317, the US standard MASH has higher impact angles for cars and pickups (25°) and different vehicle masses. Studies in Europe (RISER) and the US have reported values for the 90th percentile impact angle of 30°–34°. Thus, the limited evidence available suggests that the 20° angle applied in EN 1317 may be too low.
The first goal of this paper is to use the US NCHRP database (Project NCHRP 17-22) to assess the distribution of impact angle and collision speed in recent run-off-road (ROR) accidents. Second, based on the findings of the statistical analysis and on the analysis of impact angles and speeds in the literature, an LS-DYNA finite element analysis was carried out to evaluate the normal containment level of concrete barriers in non-standard collisions. The FE model was validated against a crash test of a portable concrete barrier carried out at the UK Transport Research Laboratory (TRL).
The accident data analysis for run-off-road accidents indicates that a substantial proportion of accidents have an impact angle in excess of 20°. The baseline LS-DYNA model showed good agreement with experimental acceleration severity index (ASI) data, and the parametric analysis indicates a very significant influence of impact angle on ASI. Accordingly, a review of European run-off-road accidents and of the test configuration in EN 1317 should be performed.
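The acceleration severity index used to compare the FE model with the TRL crash test is defined in EN 1317 from 50 ms moving averages of the vehicle accelerations (in g), each normalised by a limit value (12 g longitudinal, 9 g lateral, 10 g vertical), with the ASI taken as the peak of the resulting norm over time. A sketch of that computation (the signals below are illustrative):

```python
import numpy as np

def asi(ax_g, ay_g, az_g, fs_hz, window_s=0.05, limits=(12.0, 9.0, 10.0)):
    """Acceleration Severity Index per EN 1317: peak over time of the
    norm of 50 ms moving-averaged accelerations scaled by axis limits."""
    n = max(1, int(round(window_s * fs_hz)))   # samples in the 50 ms window
    kernel = np.ones(n) / n
    averaged = [np.convolve(a, kernel, mode="valid") for a in (ax_g, ay_g, az_g)]
    asi_t = np.sqrt(sum((a / lim) ** 2 for a, lim in zip(averaged, limits)))
    return float(asi_t.max())
```

With constant accelerations exactly at the three limit values, every term equals 1 and the ASI is sqrt(3), which is a convenient sanity check on an implementation.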
Abstract:
Biogas from anaerobic digestion of sewage sludge is a renewable resource with high energy content, formed mainly of CH4 (40-75 vol.%) and CO2 (15-60 vol.%). Other components such as water (H2O, 5-10 vol.%) and trace amounts of hydrogen sulfide and siloxanes can also be present. A CH4-rich stream can be produced by removing the CO2 and other impurities so that the upgraded bio-methane can be injected into the natural gas grid or used as a vehicle fuel. The main objective of this paper is to develop a new modeling methodology to assess the technical and economic performance of biogas upgrading processes using ionic liquids which physically absorb CO2. Three ionic liquids, namely 1-ethyl-3-methylimidazolium bis[(trifluoromethyl)sulfonyl]imide, 1-hexyl-3-methylimidazolium bis[(trifluoromethyl)sulfonyl]imide and trihexyl(tetradecyl)phosphonium bis[(trifluoromethyl)sulfonyl]imide, are considered for CO2 capture in a pressure-swing regenerative absorption process. The simulation software Aspen Plus and Aspen Process Economic Analyzer are used to account for mass and energy balances as well as equipment cost. In all cases, the biogas upgrading plant consists of a multistage compressor for biogas compression, a packed absorption column for CO2 absorption, a flash evaporator for solvent regeneration, a centrifugal pump for solvent recirculation, a pre-absorber solvent cooler and a gas turbine for electricity recovery. The evaluated processes are compared in terms of energy efficiency, capital investment and bio-methane production cost. The overall plant efficiency ranges from 71% to 86%, whereas the bio-methane production cost ranges from £6.26 to £7.76 per GJ (LHV). A sensitivity analysis is also performed to determine how several technical and economic parameters affect the bio-methane production cost.
The results of this study show that the simulation methodology developed can predict plant efficiencies and production costs of large-scale CO2 capture processes using ionic liquids without relying on experimental gas-solubility data.
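The pressure-swing physical absorption step described above can be sized to first order with Henry's law: the rich-solvent CO2 loading in equilibrium with the feed at the column bottom sets the minimum ionic-liquid circulation rate. A rough, dilute-limit sketch (the Henry constant and flows below are illustrative assumptions, not values from the paper):

```python
def min_solvent_rate_kmol_h(biogas_kmol_h, y_co2, p_abs_bar, henry_bar):
    """Minimum solvent circulation for physical CO2 absorption, assuming
    Henry's-law equilibrium with the feed at the column bottom, a
    CO2-free lean solvent, and the dilute-solution limit."""
    co2_to_absorb = biogas_kmol_h * y_co2        # kmol CO2 per hour
    x_eq = y_co2 * p_abs_bar / henry_bar         # rich-solvent CO2 mole fraction
    return co2_to_absorb / x_eq

# 100 kmol/h of 40% CO2 biogas absorbed at 8 bar into an IL with H = 40 bar
print(min_solvent_rate_kmol_h(100.0, 0.40, 8.0, 40.0))  # ~500 kmol/h
```

The inverse dependence on absorber pressure is why these processes compress the biogas first: doubling the pressure halves the solvent flow, trading compressor work against pumping and column size.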
Abstract:
Given the rapid rise in fossil fuel prices and the uncertainty over their future availability, there has been renewed interest in biomass technologies for the production of heat, electricity or synthetic fuels. However, the thermochemical conversion of a solid biomass particle involves rather complex phenomena that lead first to drying of the fuel, then to pyrolysis and finally to combustion or gasification proper. A relatively incomplete description of some of these conversion stages remains an obstacle to the development of these technologies that needs to be overcome. In particular, the high volatile matter content of biomass highlights the practical interest of studying pyrolysis. The importance of pyrolysis during biomass combustion was demonstrated in this work through experiments in a pilot-scale bubbling fluidised bed reactor. It was found that the process occurs largely at the bed surface, with diffusion flames caused by the release of volatiles, which makes it difficult to control the reactor temperature above the bed. In the case of biomass gasification, pyrolysis can even determine the chemical efficiency of the process. This was shown in this work during gasification tests in a 2 MWth fluidised bed reactor, where a new measurement method made it possible to close the mass balance over the gasifier and monitor the degree of biomass conversion. From these results it became clear that biomass pyrolysis needs to be adequately described for process design and control. In engineering applications there is particular interest in the stoichiometry and properties of the main pyrolysis products.
This work sought to address that need, first by organising literature data on yields of char, pyrolytic liquids and gases, as well as elemental compositions and heating values. The result was a set of empirical parameters of practical interest that clarify the general behaviour of biomass pyrolysis over a wide range of operating conditions. In addition, an empirical model for the composition of the volatiles was proposed, which can be integrated into comprehensive reactor models provided the parameters used are appropriate for the fuel under test. This approach motivated a series of pyrolysis experiments with several biomasses, lignin and cellulose, at temperatures between 600 and 975°C. High fuel heating rates were achieved in laboratory bubbling fluidised bed and fixed bed reactors, while a thermogravimetric system made it possible to study the effect of lower heating rates. The results show that, under conditions typical of combustion and gasification processes, the amount of volatiles released from biomass is little influenced by reactor temperature but varies considerably between fuels. Further analysis showed that the char yield is closely related to the O/C ratio of the parent fuel, and a simple model is proposed to describe this relation. Although the total amount of volatiles released is set by the biomass composition, their chemical composition depends strongly on reactor temperature. Yields of condensable species (water and organic species), CO2 and light hydrocarbons pass through a maximum with temperature, giving way to CO and H2 at the highest temperatures. Nevertheless, over certain temperature ranges the yields of some of the main gas species (e.g. CO, H2, CH4) are well correlated with one another, which allowed empirical models to be developed that minimise the effect of operating conditions while highlighting the effect of the fuel on gas composition. In summary, the pyrolysis experiments carried out in this work showed that the stoichiometry of biomass pyrolysis relates in several ways to the elemental composition of the parent fuel, opening up several possibilities for the assessment and design of biomass combustion and gasification processes.
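The element bookkeeping behind such empirical volatile-composition models follows from a simple balance: whatever the parent fuel contains and the char retains must leave with the volatiles. A sketch of that by-difference calculation (the analyses below are illustrative, not data from this work):

```python
def volatile_composition(fuel_wt, char_yield, char_wt):
    """Elemental composition (wt. fractions) of the total volatiles,
    obtained by difference from the fuel and char analyses:
    volatiles = (fuel - char_yield * char) / (1 - char_yield)."""
    return {e: (fuel_wt[e] - char_yield * char_wt[e]) / (1.0 - char_yield)
            for e in fuel_wt}

# Dry, ash-free fuel and char analyses; 20 wt.% char yield
fuel = {"C": 0.50, "H": 0.06, "O": 0.44}
char = {"C": 0.90, "H": 0.02, "O": 0.08}
print(volatile_composition(fuel, 0.20, char))
```

Because the fractions are defined on a common dry, ash-free basis, the computed volatile composition automatically sums to one whenever the fuel and char analyses do.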
Abstract:
The increased capabilities (e.g., processing, storage) of portable devices, along with the constant need of users to retrieve and send information, have introduced a new form of communication. Users can seamlessly exchange data by means of opportunistic contacts among them, and this is what characterizes opportunistic networks (OppNets). OppNets allow users to communicate even when an end-to-end path may not exist between them. Since 2007, there has been a trend to improve the exchange of data by considering social similarity metrics. Social relationships, shared interests, and popularity are examples of such metrics that have been employed successfully: as users interact based on relationships and interests, this information can be used to decide on the best next forwarders of information. This Thesis combines the features of today's devices found in the regular urban environment with the current social-awareness trend in the context of opportunistic routing. To achieve this goal, the work was divided into different tasks that map to a set of specific objectives, leading to the following contributions: i) an up-to-date opportunistic routing taxonomy; ii) a universal evaluation framework that aids in devising and testing new routing proposals; iii) three social-aware utility functions that consider dynamic user behavior and can be easily incorporated into other routing proposals; iv) two opportunistic routing proposals based on the users' daily routines and on the content traversing the network and the users' interest in such content; and v) a structural analysis of the social-based network formed by the approaches devised in this work.
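The core idea behind social-aware utility functions like those mentioned above can be pictured with a toy forwarding rule: hand a message to an encountered node only if its utility toward the destination (here, simply encounter frequency) exceeds the current carrier's. This is an illustrative stand-in, not one of the thesis's three functions:

```python
def should_forward(encounter_counts, carrier, candidate, destination):
    """Toy social-aware routing decision: forward the message if the
    encountered candidate has met the destination more often than the
    current carrier has."""
    def utility(node):
        return encounter_counts.get((node, destination), 0)
    return utility(candidate) > utility(carrier)

# bob meets dave far more often than alice does, so alice hands over
history = {("bob", "dave"): 5, ("alice", "dave"): 2}
print(should_forward(history, "alice", "bob", "dave"))  # True
```

Real proposals replace the raw encounter count with richer, dynamically updated metrics (shared interests, community membership, popularity), but the comparison-of-utilities structure of the forwarding decision stays the same.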
Abstract:
Over the last decade, the most widespread approaches to traditional network management were based on the Simple Network Management Protocol (SNMP) or the Common Management Information Protocol (CMIP). However, both have several problems in terms of scalability, due to their centralized nature. Although distributed management approaches exhibit better scalability, they still underperform regarding communication costs, autonomy, extensibility, flexibility, robustness, and cooperation between network nodes. Cooperation between network nodes normally requires excessive overhead for synchronization and dissemination of management information in the network. For emerging dynamic and large-scale networking environments, as envisioned in Next Generation Networks (NGNs), exponential growth in the number of network devices, mobile communications and application demands is expected. Thus, a high degree of management automation is an important requirement, along with new mechanisms that promote it optimally and efficiently, taking into account the need for close cooperation between nodes. Current approaches to self- and autonomic management allow the network administrator to manage large areas, reacting quickly and efficiently to unexpected problems. Management functionality should be delegated to a self-organized plane operating within the network, which decreases network complexity and the control-information flow, as opposed to relying on centralized or external servers. This Thesis proposes and develops a communication framework for distributed network management which integrates a set of mechanisms for initial communication, exchange of management information, network (re)organization and data dissemination, attempting to meet the autonomic and distributed management requirements posed by NGNs.
The mechanisms are lightweight and portable, can operate on different hardware architectures, and include all the requirements to maintain the basis for efficient communication between nodes in order to ensure autonomic network management. Moreover, these mechanisms were explored under diverse network conditions and events, such as device and link errors and different traffic/network loads and requirements. The results obtained through simulation and real experimentation show that the proposed mechanisms provide lower convergence time, a smaller overhead impact on the network, faster dissemination of management information, increased stability and quality of node associations, and support for efficient data delivery, in comparison with the baseline mechanisms analyzed. Finally, all the mechanisms for communication between nodes proposed in this Thesis, which support and distribute the management information and network control functionality, were devised and developed to operate in completely decentralized scenarios.
Abstract:
Systems equipped with multiple antennas at the transmitter and at the receiver, known as MIMO (Multiple Input Multiple Output) systems, offer higher capacities, allowing an efficient exploitation of the available spectrum and/or the deployment of more demanding applications. It is well known that the radio channel is characterized by multipath propagation, a phenomenon long deemed problematic, whose mitigation has been achieved through techniques such as diversity, beamforming or adaptive antennas. By conveniently exploiting the spatial domain, MIMO systems turn the characteristics of the multipath channel into an advantage and allow the creation of multiple parallel and independent virtual channels. However, the achievable benefits are constrained by the propagation channel's characteristics, which may not always be ideal. This work focuses on the characterization of the MIMO radio channel. It begins with the presentation of the fundamental results from information theory that triggered the interest in these systems, including a discussion of some of their potential benefits and a review of existing channel models for MIMO systems. The characterization of the MIMO channel developed in this work is based on experimental measurements of the double-directional channel. The measurement system is based on a vector network analyzer and a two-dimensional positioning platform, both controlled by a computer, allowing the measurement of the channel's frequency response at the locations of a synthetic array. Data are then processed using the SAGE (Space-Alternating Generalized Expectation-Maximization) algorithm to obtain the parameters (delay, direction of arrival and complex amplitude) of the channel's most relevant multipath components. Afterwards, a clustering algorithm groups these data into clusters. Finally, statistical information is extracted, allowing the characterization of the channel's multipath components.
The information about the multipath characteristics of the channel, induced by the scatterers present in the propagation scenario, enables the characterization of the MIMO channel and thus the evaluation of its performance. The method was finally validated using MIMO measurements.
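The fundamental information-theoretic result that triggered the interest in MIMO, referred to above, is the log-det capacity of a flat-fading channel with equal power allocation across transmit antennas. A minimal sketch:

```python
import numpy as np

def mimo_capacity(H, snr_linear):
    """Capacity C = log2 det(I + (SNR/Nt) * H H^H) of a flat MIMO
    channel H (Nr x Nt) with equal power per transmit antenna,
    in bit/s/Hz."""
    nr, nt = H.shape
    gram = np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T)
    _, logdet = np.linalg.slogdet(gram)  # stable log-determinant
    return float(logdet / np.log(2.0))

# Two ideal, independent parallel channels at linear SNR = 2: 2 bit/s/Hz
print(mimo_capacity(np.eye(2), 2.0))
```

The identity-channel example shows where the MIMO gain comes from: rich multipath that decorrelates the antenna paths makes H well conditioned, so capacity grows roughly linearly with min(Nr, Nt) rather than only logarithmically with SNR.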
Abstract:
A range of instruments is available to measure the thermal conductivity of building materials, including the heat-flow meter, hot plate, hot box and heat-transfer analyzer. Thermal conductivity data derived using different instruments can differ from each other. These variations in thermal conductivity have significant implications for the commercial profile of insulation products and for calculating the energy savings from large-scale use of a specific insulation. It is therefore important to know which measuring instrument produces relatively accurate and representative thermal conductivity results. This paper first reviews the methods and instruments for measuring the thermal conductivity of building materials, and then compares and analyses the results of testing the thermal conductivity of fibrous insulations using a heat analyzer and a hot plate.
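The steady-state instruments mentioned (heat-flow meter, hot plate, hot box) all ultimately reduce the measurement to Fourier's law for one-dimensional conduction through the specimen. A sketch of that reduction (the numbers are illustrative):

```python
def thermal_conductivity(heat_flow_w, thickness_m, area_m2, delta_t_k):
    """Steady-state one-dimensional Fourier's law:
    k = Q * d / (A * dT), in W/(m.K)."""
    return heat_flow_w * thickness_m / (area_m2 * delta_t_k)

# 5 W through a 50 mm fibrous specimen of 0.25 m2 with a 10 K drop
print(thermal_conductivity(5.0, 0.050, 0.25, 10.0))  # 0.1 W/(m.K)
```

Instrument-to-instrument differences then come from how well each apparatus realises the assumptions baked into this formula: truly one-dimensional heat flow, accurately measured Q and dT, and steady state.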
Abstract:
The goal of the project "SmartVision: active vision for the blind" is to develop a small and portable but intelligent and reliable system for assisting the blind and visually impaired while navigating autonomously, both outdoors and indoors. In this paper we present an overview of the prototype, its design issues, and its different modules, which integrate a GIS with GPS, Wi-Fi, RFID tags and computer vision. The prototype addresses global navigation by following known landmarks, local navigation with path tracking and obstacle avoidance, and object recognition. The system does not replace the white cane, but extends it beyond its reach. The user-friendly interface consists of a 4-button hand-held box, a vibration actuator in the handle of the cane, and speech synthesis. A future version may also employ active RFID tags for marking navigation landmarks, and speech recognition may complement speech synthesis.
Abstract:
The SmartVision prototype is a small, cheap and easily wearable navigation aid for blind and visually impaired persons. Its functionality addresses global navigation, for guiding the user to some destination, and local navigation, for negotiating paths, sidewalks and corridors, with avoidance of static as well as moving obstacles. Local navigation applies to both indoor and outdoor situations. In this article we focus on local navigation: the detection of path borders and obstacles in front of the user and just beyond the reach of the white cane, such that the user can be assisted in centering on the path and alerted to looming hazards. Using a stereo camera worn at chest height, a portable computer in a shoulder-strapped pouch or pocket, and only one earphone or a small speaker, the system is inconspicuous, poses no hindrance while walking with the cane, and does not block normal surrounding sounds. The vision algorithms are optimised such that the system can work at a few frames per second.
Abstract:
Master's thesis in Biomedical Engineering and Biophysics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2015
Abstract:
Thesis (Master's)--University of Washington, 2015
Abstract:
Food product safety is one of the most promising areas for the application of electronic noses. The performance of a portable electronic nose has been evaluated in monitoring the spoilage of beef fillet stored aerobically at different storage temperatures (0, 4, 8, 12, 16 and 20°C). This paper proposes a fuzzy-wavelet neural network model which incorporates a clustering pre-processing stage for the definition of fuzzy rules. The dual purpose of the proposed modeling approach is not only to classify beef samples into the respective quality class (i.e. fresh, semi-fresh and spoiled), but also to predict their associated microbiological population directly from volatile compound fingerprints. Comparison results indicated that the proposed modeling scheme can be considered a valuable detection methodology in food microbiology.
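The quality-class step (fresh / semi-fresh / spoiled) can be pictured with a much simpler stand-in than the paper's fuzzy-wavelet network: assign each volatile fingerprint to the nearest class centroid in sensor space. Purely illustrative, with made-up two-sensor data:

```python
import numpy as np

def classify(fingerprint, centroids):
    """Nearest-centroid classification of an e-nose volatile fingerprint:
    return the label of the closest class centroid (Euclidean distance)."""
    labels = list(centroids)
    distances = [np.linalg.norm(np.asarray(fingerprint) - np.asarray(centroids[k]))
                 for k in labels]
    return labels[int(np.argmin(distances))]

centroids = {"fresh": [0.1, 0.2], "semi-fresh": [0.5, 0.5], "spoiled": [0.9, 0.8]}
print(classify([0.15, 0.25], centroids))  # fresh
```

Models like the one proposed in the paper improve on this baseline by learning soft class boundaries (fuzzy rules) and by additionally regressing the microbial population, rather than only assigning a discrete label.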
Abstract:
Freshness and safety of muscle foods are generally considered the most important parameters for the food industry. The performance of a portable electronic nose has been evaluated in monitoring the spoilage of beef fillet stored aerobically at different storage temperatures (0, 4, 8, 12, 16 and 20°C). An adaptive fuzzy logic system model that utilizes a prototype defuzzification scheme has been developed to classify beef samples into their respective quality class and to predict their associated microbiological population directly from volatile compound fingerprints. Results confirmed the superiority of the adopted methodology and indicated that volatile information, in combination with an efficient choice of modeling scheme, can be considered an alternative methodology for the accurate evaluation of meat spoilage.
Abstract:
Acetate is a short-chain fatty acid produced by fermentation of ingested fibres by the gut microbiota. While it has been shown to reduce cell proliferation in some cancer cell lines1,2, more recent studies on liver3 and brain4 tumours suggest that acetate may actually promote tumour growth. Acetate in the cell is normally converted into acetyl-CoA by two enzymes, mitochondrial (ACSS1) and cytosolic (ACSS2) acetyl-CoA synthetase, and then metabolized. In the mitochondria, acetyl-CoA is utilized in the TCA cycle; in the cytosol, it is utilized in lipid synthesis. In this study, the effect of acetate treatment on the growth of the HT29 colon cancer cell line and its mechanism of action were assessed. HT29 human colorectal adenocarcinoma cells were treated with 10 mM NaAc, and cell viability, cellular bioenergetics and gene expression were investigated. Cell viability was assessed 24 hours after treatment using an MTT assay (Sigma, UK; n=8). Cellular oxygen consumption rate (OCR) and extracellular acidification rate (ECAR) were measured with an XFe Analyzer (Seahorse Bioscience, USA). After a baseline reading, cells were treated and OCR and ECAR measurements were recorded for 18 hours (n=4). Total mRNA was isolated 24 hours after treatment using an RNeasy kit (Qiagen, USA). Quantitative PCR reactions were performed using Taqman gene expression assays and Taqman Universal PCR Master Mix (ThermoFisher Scientific, UK) on an Applied Biosystems 7500 Fast Real-Time PCR System (Life Technologies, USA) and analysed using the ΔΔCt method (n=3). Acetate treatment led to a significant reduction in cell viability (15.9%, Figure 1). OCR, an indicator of oxidative phosphorylation, was significantly increased (p<0.0001), while ECAR, an indicator of glycolysis, was significantly reduced (p<0.0001, Figure 2). Gene expression of ACSS1 was increased to 1.7-fold of control (p=0.07) and ACSS2 expression was reduced to 0.6-fold of control (p=0.06, Figure 3).
In conclusion, in colon cancer cells acetate supplementation induces cell death and increases oxidative capacity. These changes, together with the trend towards decreased ACSS2 expression, suggest suppression of lipid synthesis pathways. We hypothesize that the reduction of tumour growth by acetate is a consequence of the suppression of ACSS2 and lipid synthesis, both effects previously reported to reduce tumour growth3–5. These effects clearly warrant further investigation.
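The ΔΔCt analysis used for the ACSS1/ACSS2 expression data reduces to one line of arithmetic: fold change = 2^-ΔΔCt, where each Ct is first normalised to a reference gene within its condition and then compared across conditions. A sketch (the Ct values below are illustrative):

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method: normalise the target
    gene to a reference gene in each condition, then compare the
    treated condition against the control."""
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(dct_treated - dct_control)

# Target crosses threshold one cycle earlier after treatment, relative
# to control, so expression doubles
print(fold_change(24.0, 20.0, 25.0, 20.0))  # 2.0
```

The exponent of 2 encodes the assumption of perfect doubling per PCR cycle; each unit of ΔΔCt therefore corresponds to a two-fold change in expression.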