905 results for Robot sensing systems
Abstract:
This thesis designed and built a verification platform for assistive-behavior robots, on which force-control research for such robots can be carried out. To meet the practical requirements of robot force control, a flexible joint structure for the assistive robot was designed and the system parameters were identified. The control system software was designed on the QNX real-time operating system. It consists mainly of two parts, sensor data acquisition and the control algorithm, and satisfies the control requirements of the flexible joint.
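The abstract above describes control software that couples sensor data acquisition with a control algorithm for a flexible joint. As a rough illustration of that loop (not the thesis's identified model), the sketch below runs a PD-plus-feedforward torque controller on a series-elastic joint, where joint torque is sensed as spring deflection; the stiffness, inertia and gains are invented values.

```python
# Hypothetical series-elastic joint: torque = k * (theta_motor - theta_link).
# All numbers are illustrative, not parameters identified in the thesis.

def simulate_torque_control(tau_ref=1.0, k=100.0, kp=5.0, kd=0.5,
                            inertia=0.01, dt=0.001, steps=5000):
    """Drive the sensed spring torque to tau_ref with PD + feedforward."""
    theta_m, omega_m = 0.0, 0.0      # motor position and velocity
    theta_l = 0.0                    # link held fixed for simplicity
    tau = 0.0
    for _ in range(steps):
        tau = k * (theta_m - theta_l)            # "sensor acquisition" stage
        err = tau_ref - tau
        u = tau_ref + kp * err - kd * omega_m    # "control algorithm" stage
        omega_m += (u - tau) / inertia * dt      # semi-implicit Euler step
        theta_m += omega_m * dt
    return tau
```

With these gains the loop settles to the commanded torque; in a real system the two stages would run as separate real-time tasks under QNX.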
Abstract:
Networked teleoperation robot systems combine network technology with robotics. They extend the operator's perception and manipulation capabilities, allowing the operator to remain in a safe environment while completing tasks in hazardous ones; they also improve the robot's adaptability to its working environment, since with the operator's decision-making support the robot can operate in unstructured environments. As an important branch of robotics, networked teleoperation has attracted the attention of many research institutions and researchers over the past decade or more.

This thesis addresses the sampled-data control structure of networked teleoperation robots, using control strategies to solve the real-time control problem in unstructured environments. To this end, the modeling of the sampled-data teleoperation system is studied first. Most existing models assume a single sampling period; a unified model with different master and slave sampling periods has not yet been reported. Because the operator is part of the sampled-data teleoperation system, building such a model first requires an operator model; however, operator modeling is difficult, so existing teleoperation models generally avoid it. Building on an analysis of existing teleoperation control schemes and system models, this thesis focuses on the modeling and control of sampled-data networked teleoperation systems.

On operator modeling, the case of an operator using the forearm to manipulate a force-feedback joystick is studied, and a dynamic model of the manipulation process is developed. First, the unmeasurable quantity in the human skeletal-muscle mechanics model, the muscle activation level, is studied experimentally, yielding the relationship between muscle activation and muscle contraction length when the operator maintains a constant level of tension. On this basis, taking the arm and joystick dynamics into account, a muscle-force-driven arm-joystick dynamic model is established. Using this operator model, a dynamic compensator is designed to compensate for the dynamics of the operator's manipulation of the joystick, resolving the mismatch between what the operator intends and what the operator does that is caused by the muscle dynamics, overcoming the operator's response delay and improving system performance.

On sampled-data system modeling, for the control structure with different master and slave sampling periods, dual-port RAM is introduced to synchronize master- and slave-side sampling. On the basis of this synchronized control structure, a discrete state-space representation of the slave side is established; the lifting technique is applied to this representation over the teleoperation period, and sampled-data system theory yields a unified master-slave model. Finally, the stability, controllability and observability of the slave system before and after lifting are analyzed, showing that all three properties are preserved by lifting.

On control strategy, a sampling switching control method based on delay prediction is proposed. Internet node-to-node delays are measured and analyzed, showing that the delay distribution between any two nodes over any time interval can be described by a shifted Gamma probability density. The shifted Gamma parameters are estimated by fitting the sample probability density curve, which determines the type of the distribution; from this type the mean delay, and hence the desired sampling period, are obtained. To achieve stable control under arbitrary sampling periods, the stability of the sampled switching system is studied, with the conclusion that if the slave system is uniformly asymptotically stable, then the networked teleoperation sampled-data system is stable under arbitrary sampling switching control of the slave side.

For experimental verification, a networked teleoperation experimental platform with force-feedback control and local autonomy was built, with a mobile robot as the controlled object. A virtual force model was built with the artificial potential field method, and a method for realizing the virtual force on the force-feedback joystick is given; taking autonomous obstacle avoidance of the mobile robot as an example, a fuzzy-control design for slave-side autonomy and the design of the teleoperation software are presented.

Experimental results demonstrate that the proposed models and control methods are effective and feasible and are of practical significance for building high-performance networked teleoperation systems. Many of the conclusions are also of reference value for the theoretical study and practical application of networked teleoperation systems in general.
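The experimental platform above renders a virtual force, built with the artificial potential field method, on the force-feedback joystick. Below is a minimal sketch of the classic repulsive-field formulation; the gain `eta` and influence radius `d0` are illustrative, not the thesis's values.

```python
import math

def repulsive_force(dist, d0=1.0, eta=0.5):
    """Classic repulsive potential-field magnitude:
    eta * (1/d - 1/d0) / d**2 inside the influence radius d0, else 0."""
    if dist >= d0 or dist <= 0.0:
        return 0.0
    return eta * (1.0 / dist - 1.0 / d0) / dist ** 2

def joystick_feedback(robot, obstacles, d0=1.0, eta=0.5):
    """Sum repulsive forces from all obstacles; the resulting 2-D vector
    would be rendered on the force-feedback joystick."""
    fx = fy = 0.0
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        f = repulsive_force(d, d0, eta)
        if f > 0.0:
            fx += f * dx / d   # unit vector pointing away from obstacle
            fy += f * dy / d
    return fx, fy
```

In practice the summed vector would be scaled to the joystick's force range before display.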
Abstract:
Wireless sensor networks have recently emerged as enablers of important applications such as environmental, chemical and nuclear sensing systems. Such applications have sophisticated spatial-temporal semantics that set them apart from traditional wireless networks. For example, the computation of temperature averaged over the sensor field must take local densities into account. This is crucial because otherwise the estimated average temperature can be biased by over-sampling areas where many more sensors exist. We therefore envision that a fundamental service a wireless sensor network should provide is the estimation of local densities. In this paper, we propose a lightweight probabilistic density inference protocol, which we call DIP, that allows each sensor node to implicitly estimate its neighborhood size without the explicit exchange of node identifiers required by existing density discovery schemes. The theoretical basis of DIP is a probabilistic analysis that relates the number of sensor nodes contending in the neighborhood of a node to the level of contention measured by that node. Extensive simulations confirm the premise of DIP: it can provide statistically reliable and accurate estimates of local density at a very low energy cost and in constant running time. We demonstrate how applications could be built on top of our DIP-based service by computing density-unbiased statistics from estimated local densities.
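The relationship DIP exploits, between neighborhood size and measured contention, can be illustrated with a simple slotted model in which each of n neighbors transmits in a slot with probability q, so the busy-slot probability is p = 1 - (1 - q)^n. This channel model and the numbers below are assumptions for illustration, not the paper's exact analysis.

```python
import math
import random

def estimate_neighbors(busy_fraction, q):
    """Invert p = 1 - (1 - q)**n to estimate the neighborhood size n."""
    if busy_fraction <= 0.0:
        return 0.0
    if busy_fraction >= 1.0:
        return float("inf")
    return math.log(1.0 - busy_fraction) / math.log(1.0 - q)

def measure_busy_fraction(n, q, slots, rng):
    """Simulate `slots` contention slots with n neighbors transmitting
    independently with probability q, and report the busy fraction."""
    busy = sum(1 for _ in range(slots)
               if any(rng.random() < q for _ in range(n)))
    return busy / slots
```

A node thus only needs to count busy slots, never exchange identifiers, to get a density estimate.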
Abstract:
Judging by total numbers, biomass and the variety of environments colonized, bacteria can be considered the most successful life form on Earth. This success stems from the ability of bacteria and other microorganisms to respond phenotypically, via biomolecular processes, to the stresses of their surrounding environment. Staphylococci owe much of their ability to colonize surfaces, and hence to adopt the biofilm as their preferential phenotype, to a multitude of biomolecular and regulatory factors involved in adhesion and biofilm formation. This review focuses on the specific pathways involved in the adhesion of the Gram-positive bacteria Staphylococcus epidermidis and Staphylococcus aureus, with reference to the role of specific cell-surface adhesins, the ica operon, accumulation-associated proteins and quorum-sensing systems, and their significance in medical device-related infection.
Abstract:
Pseudomonas aeruginosa and Escherichia coli are the most prevalent Gram-negative biofilm-forming pathogens associated with medical devices, particularly with respect to catheter-associated urinary tract infections. As in Gram-positive bacteria, Gram-negative biofilm formation is fundamentally determined by a series of steps outlined more fully in this review, namely adhesion, cellular aggregation, and the production of an extracellular polymeric matrix. More specifically, this review explores the biosynthesis and role of pili and flagella in Gram-negative adhesion and accumulation on surfaces in Pseudomonas aeruginosa and Escherichia coli. The process of biofilm maturation is compared and contrasted in both species, namely the production of exopolysaccharides via the polysaccharide synthesis locus (Psl), pellicle formation (Pel) and alginic acid synthesis in Pseudomonas aeruginosa, and UDP-4-amino-4-deoxy-L-arabinose and colanic acid synthesis in Escherichia coli. An emphasis is placed on the importance of the LuxR homologue SdiA and of the luxS/autoinducer-II, autoinducer-III/epinephrine/norepinephrine and indole-mediated quorum-sensing systems in enabling Gram-negative bacteria to adapt to their environments. The majority of Gram-negative biofilms consist of polysaccharides of simple sugar structure (either homo- or heteropolysaccharides) that provide an optimum environment for the survival and maturation of bacteria, allowing them to display increased resistance to antibiotics and predation.
Abstract:
Bridge Weigh in Motion (B-WIM) uses accurate sensing systems to transform an existing bridge into a mechanism for determining actual traffic loading. This information on traffic loading can enable efficient and economical management of transport networks and is becoming a valuable tool for bridge safety assessment. B-WIM can provide site-specific traffic loading on deteriorating bridges, which can be used to determine whether the reduced capacity is still sufficient for the structure to remain operational, minimising unnecessary replacement or rehabilitation costs and preventing disruption to traffic. There have been numerous reports on the accuracy classifications of existing B-WIM installations, and some common issues have emerged. This paper details some of the recent developments in B-WIM aimed at overcoming these issues. A new system has been developed at Queen's University Belfast using fibre optic sensors (FOS) to provide accurate axle detection and improved accuracy overall. The results presented in this paper show that the fibre optic system provided much more accurate results than conventional WIM systems, as the FOS provide clearer signals at high scanning rates, requiring less filtering and less post-processing. A major disadvantage of existing B-WIM systems is the inability to deal with more than one vehicle on the bridge at the same time; sensor strips have been proposed to overcome this issue. A bridge can be considered safe if the probability that load exceeds resistance is acceptably low; hence B-WIM information from advanced sensors can provide confidence in our ageing structures.
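Conventional B-WIM typically infers axle weights by least-squares fitting the measured bridge response to a load-position model built from an influence line. The sketch below uses an assumed triangular influence line and a two-axle vehicle; the span, axle spacing and weights are invented for illustration, not data from the paper.

```python
def influence(x, span=20.0):
    """Assumed triangular mid-span influence line (zero outside the span)."""
    if x < 0.0 or x > span:
        return 0.0
    return x / 2.0 if x <= span / 2.0 else (span - x) / 2.0

def solve_axle_weights(measurements, positions, spacing, span=20.0):
    """Two-axle least squares reduced to a 2x2 normal-equation solve."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for m, x in zip(measurements, positions):
        i1, i2 = influence(x, span), influence(x - spacing, span)
        a11 += i1 * i1; a12 += i1 * i2; a22 += i2 * i2
        b1 += i1 * m;  b2 += i2 * m
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

def demo_recovery():
    """Recover invented axle weights (60 and 80 kN) from noise-free data."""
    positions = [0.5 * k for k in range(61)]        # first axle at 0..30 m
    meas = [60.0 * influence(x) + 80.0 * influence(x - 4.0)
            for x in positions]
    return solve_axle_weights(meas, positions, spacing=4.0)
```

Clearer high-rate signals, as provided by the FOS, mean more usable samples per vehicle passage and hence a better-conditioned fit.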
Abstract:
Optical spectroscopy is a very important measurement technique with high potential for numerous applications in industry and science. Low-cost, miniaturized spectrometers, for example, are particularly needed for modern sensor systems in "smart personal environments", used above all in energy technology, metrology, safety and security, IT and medical technology. Among all miniaturized spectrometers, one of the most attractive miniaturization approaches is the Fabry-Pérot filter. In this approach, the combination of a Fabry-Pérot (FP) filter array and a detector array can function as a microspectrometer. Each detector corresponds to a single filter and detects the very narrow band of wavelengths transmitted by that filter. An array of FP filters is used in which each filter selects a different spectral filter line. The spectral position of each wavelength band is defined by the individual cavity height of the filter. Arrays have been developed with filter sizes limited only by the array dimensions of the individual detectors. However, existing FP-filter microspectrometers require complicated fabrication steps to pattern the 3D filter cavities with different heights, which is not cost-effective for industrial production. To reduce costs while retaining the outstanding advantages of the FP filter structure, a new method for fabricating miniaturized FP filters using nanoimprint technology is developed and presented. Here the multiple cavity-fabrication steps are replaced by a single step that uses the high vertical resolution of 3D nanoimprint technology. Because nanoimprint technology is used, the FP-filter-based miniaturized spectrometer is called a nanospectrometer.
A static nanospectrometer consists of a static FP filter array on a detector array (see Fig. 1). Each FP filter in the array consists of a lower distributed Bragg reflector (DBR), a resonance cavity and an upper DBR. The upper and lower DBRs are identical and consist of periodically alternating thin dielectric layers of materials with high and low refractive index. The optical thickness of each dielectric thin-film layer in the DBR corresponds to a quarter of the design wavelength. Each FP filter is assigned to a defined area of the detector array; this area may consist of individual detector elements or groups of them. The lateral geometries of the cavities are therefore built to match the corresponding detectors. The lateral and vertical dimensions of the cavities are defined precisely by 3D nanoimprint technology. The cavities differ by only a few nanometers in the vertical direction. The vertical precision of the cavity is an important factor that determines the accuracy of the spectral position and the transmittance of the filter's transmission line.
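The quarter-wave DBR layers and the cavity-height-controlled peak position reduce to two one-line formulas. The refractive indices and wavelengths below are generic illustrative values, not the materials of this work, and the idealized resonance condition ignores the mirror phase:

```python
def quarter_wave_thickness(design_wavelength_nm, refractive_index):
    """Physical thickness whose optical thickness is lambda/4: t = lambda / (4n)."""
    return design_wavelength_nm / (4.0 * refractive_index)

def fp_peak_wavelength(cavity_height_nm, n_cavity=1.0, order=1):
    """Idealized FP resonance condition m * lambda = 2 * n * L."""
    return 2.0 * n_cavity * cavity_height_nm / order
```

For first order and an air cavity, the peak shifts by twice any cavity-height change, which is why a few nanometers of vertical imprint error directly translate into spectral-position error.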
Abstract:
As ubiquitous systems have moved out of the lab and into the world, the need to think more systematically about how they are realised has grown. This talk will present intradisciplinary work I have been engaged in with other computing colleagues on how we might develop more formal models and understanding of ubiquitous computing systems. The formal modelling of computing systems has proved valuable in areas as diverse as reliability, security and robustness. However, the emergence of ubiquitous computing raises new challenges for formal modelling due to their contextual nature and dependence on unreliable sensing systems. In this work we undertook an exploration of modelling an example ubiquitous system called the Savannah game using the approach of bigraphical rewriting systems. This required an unusual intradisciplinary dialogue between formal computing and human-computer interaction researchers to model systematically four perspectives on Savannah: computational, physical, human and technical. Each perspective in turn drew upon a range of different modelling traditions. For example, the human perspective built upon previous work on proxemics, which uses physical distance as a means to understand interaction. In this talk I hope to show how our model explains observed inconsistencies in Savannah and extend it to resolve these. I will then reflect on the need for intradisciplinary work of this form and the importance of the bigraph diagrammatic form to support this form of engagement. Speaker Biography: Tom Rodden (rodden.info) is a Professor of Interactive Computing at the University of Nottingham. His research brings together a range of human and technical disciplines, technologies and techniques to tackle the human, social, ethical and technical challenges involved in ubiquitous computing and the increasing use of personal data.
He leads the Mixed Reality Laboratory (www.mrl.nott.ac.uk), an interdisciplinary research facility that is home to a team of over 40 researchers. He founded and currently co-directs the Horizon Digital Economy Research Institute (www.horizon.ac.uk), a university-wide interdisciplinary research centre focusing on the ethical use of our growing digital footprint. He previously directed the EPSRC Equator IRC (www.equator.ac.uk), a national interdisciplinary research collaboration exploring the place of digital interaction in our everyday world. He is a Fellow of the British Computer Society and the ACM and was elected to the ACM SIGCHI Academy in 2009 (http://www.sigchi.org/about/awards/).
Abstract:
The author's experience with intelligent agents and their application to robots that emulate the game of soccer provided the background needed to open up and propose the topic addressed in this thesis: how to make a complicated robot extract the maximum benefit from self-knowledge of the control structure embedded in its own physical body, so that it can cooperate better with other agents and optimize performance when solving cooperation problems. To address this question, the thesis proposes incorporating the dynamics of the physical body into the cooperative decisions of physical agents, unifying the worlds of control engineering, robotics and artificial intelligence through the notion of capability: the capability seen as the entity in which control engineers deposit their knowledge, and at the same time as the utility in which an agent deposits the self-knowledge of its physical body obtained through introspection. The thesis presents the DPAA architecture, organized as a vertical hierarchy with three levels of abstraction or modules (control, supervisor and agent), which share a homogeneous internal structure that simplifies agent design. These modules have a specific set of capabilities that allow them to evaluate what the actions to be executed in the future will be like. Specifically, in the control module (low level of abstraction) the capabilities consist of parameters describing the dynamic and static behavior that results from executing a given controller, i.e. they encapsulate the control engineer's knowledge. Through the inter-module communication mechanisms, this knowledge can be fed into the decision mechanisms of the higher modules (supervisor and agent), so that when the dynamic and static parameters indicate possible problems at the low level, the higher modules can take responsibility for inhibiting, or not, the execution of certain actions.
This internal top-down process of evaluating the feasibility of executing a given action is called the introspection process. Several examples illustrate how a physical agent with its own dynamics can be designed using the DPAA architecture as a reference. In particular, the complete design process of a real system consisting of two robots in convoy formation is shown, together with how the collision problem can be solved using capabilities derived from the DPAA design specifications. The fifth chapter presents the analysis and design process in a more complex domain: a group of robots that emulate the game of soccer. The results concern the evaluation of the validity of the architecture for solving the ball-passing problem. Several results show that it is possible to evaluate whether a pass is feasible or not. Although this possibility has already been demonstrated in other work, the contribution of this thesis is that feasibility can be evaluated from the encapsulation of the dynamics in specific capabilities, i.e. it is possible to know what the characteristics of the pass will be: the kick time, the precision, or even the geometry of the kicking robot's movement. The results show that negotiating the conditions of a pass is possible using atomic capabilities, which include information about the dynamic characteristics of the controllers. The complexity of the proposed domain makes it difficult to compare the results with other work. Note that the results were obtained with a custom-built simulator that incorporates the dynamics of the robot motors and of the ball. In this respect, no published work on the passing problem takes the robots' dynamics into account.
The present work shows that including dynamic parameters in the set of capabilities of the physical agent yields better collective robot behavior, and that this improvement is due to the fact that, in the decision stages, the agents use information about the feasibility of their actions, a feasibility that can be computed from the dynamic behavior of the controllers. In fact, defining capabilities from dynamic parameters makes it easy to work with heterogeneous autonomous systems: the physical agent can be aware of its actuation capabilities through internal introspection mechanisms, which allows it to make commitments with other physical agents.
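As an illustration of how a capability that encapsulates controller dynamics can feed a feasibility decision such as the ball pass, the sketch below models the kicker's heading controller as a first-order system with time constant `tau`; all names and numbers are hypothetical, not the DPAA implementation.

```python
import math

def alignment_time(angle_error, time_constant, tolerance=0.05):
    """Time for a first-order heading controller to shrink the angular
    error below `tolerance` rad: t = tau * ln(|e0| / tol)."""
    if abs(angle_error) <= tolerance:
        return 0.0
    return time_constant * math.log(abs(angle_error) / tolerance)

def pass_feasible(angle_error, time_constant, ball_time, opponent_time):
    """A pass is viable if aligning the kicker plus the ball's travel
    time beats the opponent's estimated interception time."""
    return alignment_time(angle_error, time_constant) + ball_time < opponent_time
```

Here `time_constant` plays the role of the capability parameter: a low-level quantity measured by the control engineer that the agent module can query to accept or inhibit the pass.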
Abstract:
A series of model experiments with the coupled Max-Planck-Institute ECHAM5/OM climate model has been investigated and compared with microwave measurements from the Microwave Sounding Unit (MSU) and re-analysis data for the period 1979-2008. The evaluation is carried out by computing the Temperature in the Lower Troposphere (TLT) and the Temperature in the Middle Troposphere (TMT) using the MSU weights from both the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS), restricting the study primarily to the tropical oceans. When forced by analysed sea surface temperatures, the model accurately reproduces the time evolution of the mean outgoing tropospheric microwave radiation, especially over tropical oceans, but with a minor bias towards higher temperatures in the upper troposphere. The latest re-analysis data from the 25-year Japanese Re-Analysis (JRA25) and the European Centre for Medium-Range Weather Forecasts Interim Re-Analysis agree very closely with the time evolution of the MSU data, with correlations of 0.98 and 0.96, respectively. The re-analysis trends are similar to the trends obtained from UAH but smaller than the trends from RSS. Comparison of TLT, computed from UAH and RSS observations, with sea surface temperature indicates that RSS has a warm bias after 1993. In order to assess the significance of the tropospheric linear temperature trends, we determined the natural variability of 30-year trends from a 500-year control integration of the coupled ECHAM5 model. The model exhibits natural unforced variations of the 30-year tropospheric trend within ±0.2 K/decade for the tropical oceans. This general result is supported by similar results from the Geophysical Fluid Dynamics Laboratory (GFDL) coupled climate model. Present MSU observations from UAH for the period 1979-2008 lie well within this range, but RSS is close to the upper positive limit of this variability.
We have also compared the trend of the vertical lapse rate over the tropical oceans, assuming that the difference between TLT and TMT is an approximate measure of the lapse rate. The TLT-TMT trend is larger in both the measurements and JRA25 than in the model runs, by 0.04-0.06 K/decade. Furthermore, all 30-year TLT-TMT trends in the unforced 500-year integration vary within ±0.03 K/decade, suggesting that the models have a minor systematic warm bias in the upper troposphere.
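The natural variability of 30-year trends in a long control run reduces to computing an ordinary-least-squares slope over every overlapping 30-sample window. A minimal sketch with synthetic data (the real analysis used the 500-year ECHAM5 control integration):

```python
def linear_trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    xmean = (n - 1) / 2.0
    ymean = sum(series) / n
    sxy = sum((i - xmean) * (y - ymean) for i, y in enumerate(series))
    sxx = sum((i - xmean) ** 2 for i in range(n))
    return sxy / sxx

def trend_range(control, window):
    """Min and max trend over all overlapping `window`-length segments."""
    trends = [linear_trend(control[i:i + window])
              for i in range(len(control) - window + 1)]
    return min(trends), max(trends)
```

Applied to annual-mean temperatures in K, the slope times ten gives the trend in K/decade, and `trend_range` gives the unforced envelope against which observed trends are judged.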
Abstract:
The tectonic activity on the southern border of the Parnaíba Basin resulted in a wide range of brittle structures affecting siliciclastic sedimentary rocks. This tectonic activity and the related faults, joints and folds are poorly known. The main aims of this study were (1) to identify lineaments using several remote-sensing systems, (2) to check how interpretation based on these systems at several scales influences the identification of lineaments, and (3) to contribute to the knowledge of brittle tectonics on the southern border of the Parnaíba Basin. The integration of orbital and aerial systems allowed a multi-scale identification, classification and quantification of lineaments. Lineament maps were produced at the following scales: 1:200,000 (SRTM, Shuttle Radar Topography Mission), 1:50,000 (Landsat 7 ETM+ satellite), 1:10,000 (aerial photographs) and 1:5,000 (QuickBird satellite). The classification of the features with structural significance allowed the determination of four structural sets: NW, NS, NE and EW. They were usually identified in all remote-sensing systems. The NE-trending set was not easily identified in aerial photographs but was better visualized in images from medium-resolution systems (SRTM and Landsat 7 ETM+); the same behavior characterizes the NW-trending set. The NS- and EW-trending sets were better identified in images from high-resolution systems (aerial photographs and QuickBird). The structural meaning of the lineaments was established after fieldwork. The NE-trending set is associated with normal and strike-slip faults, including deformation bands. These are the oldest structures identified in the region and are related to the reactivation of Precambrian basement structures of the Transbrazilian Lineament. The NW-trending set represents strike-slip and subordinate normal faults. The high dispersion of this set suggests a more recent origin than the previous structures.
The NW-trending set may be related to the Picos-Santa Inês Lineament. The NS- and EW-trending sets correspond to large joints (100 m to 5 km long). The truncation relationships between these joint sets indicate that the EW-trending set is older than the NS-trending set. The methodology developed in the present work is an excellent tool for understanding regional and local tectonic structures in the Parnaíba Basin and helps in choosing the best remote-sensing system to identify brittle features in a poorly known sedimentary basin.
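Assigning each mapped lineament to one of the four structural sets amounts to binning its azimuth. The 45-degree-wide bins below are an assumed convention for illustration, not necessarily the limits used in the study:

```python
def classify_azimuth(az_deg):
    """Bin a lineament azimuth (degrees) into the NS / NE / EW / NW sets.
    Azimuths are folded into 0-180 since a lineament has no direction."""
    az = az_deg % 180.0
    if az < 22.5 or az >= 157.5:
        return "NS"
    if az < 67.5:
        return "NE"
    if az < 112.5:
        return "EW"
    return "NW"
```

Counting the classified azimuths per remote-sensing system then quantifies which sets each system resolves best.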
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
The normal gut microbiota has several important functions in host physiology and metabolism and plays a key role in health and disease. Bifidobacteria, which are indigenous components of the gastrointestinal microbiota, may play an important role in maintaining the well-being of the host, although their precise function is very difficult to study. Their physiological and biochemical activities are controlled by many factors, particularly diet and environment. Adherence and colonization capacity are considered contributing factors for immune modulation, pathogen exclusion, and enhanced contact with the mucosa. In this way, bifidobacteria fortify the microbiota that forms an integral part of the mucosal barrier and of colonization resistance against pathogens. Bifidobacteria are subjected to stressful conditions not only in industrial processes but also in nature, where the ability to respond quickly to stress is essential for survival. Bifidobacteria, like other microorganisms, have evolved sensing systems for, and defences against, stress that allow them to withstand harsh conditions and sudden environmental changes. Bacterial stress responses rely on the coordinated expression of genes that alter various cellular processes and structures (e.g. DNA metabolism, housekeeping genes, cell-wall proteins, membrane composition) and act in concert to improve bacterial stress tolerance. The integration of these stress responses is accomplished by regulatory networks that allow the cell to react rapidly to various and sometimes complex environmental changes. This work examined the effect of important stressful conditions, such as changes in pH and osmolarity, on the biosynthesis of cell-wall proteins in B. pseudolongum subsp. globosum. These environmental factors all heavily influence the expression of BIFOP (BIFidobacterial Outer Proteins) in the cell wall and can have an impact on the interaction with the host.
Evidence has also been collected linking low sugar concentration in the culture medium with the presence or absence of extrachromosomal DNA.
Abstract:
The subject of this Ph.D. research thesis is the development and application of multiplexed analytical methods based on bioluminescent whole-cell biosensors. One of the main goals of analytical chemistry is multianalyte testing, in which two or more analytes are measured simultaneously in a single assay. The advantages of multianalyte testing are work simplification, high throughput, and a reduction in the overall cost per test. The availability of multiplexed portable analytical systems is of particular interest for on-field analysis of clinical, environmental or food samples as well as for the drug discovery process. To allow highly sensitive and selective analysis, these devices should combine biospecific molecular recognition with ultrasensitive detection systems. To address the current need for rapid, highly sensitive and inexpensive devices for obtaining more data from each sample, genetically engineered whole-cell biosensors as biospecific recognition elements were combined with ultrasensitive bioluminescence detection techniques. Genetically engineered cell-based sensing systems were obtained by introducing into bacterial, yeast or mammalian cells a vector expressing a reporter protein whose expression is controlled by regulatory proteins and promoter sequences. The regulatory protein is able to recognize the presence of the analyte (e.g., compounds with hormone-like activity, heavy metals…) and consequently to activate the expression of the reporter protein, which can be readily measured and directly related to the bioavailable concentration of the analyte in the sample. Bioluminescence represents the ideal detection principle for miniaturized analytical devices and multiplexed assays thanks to its high detectability in small sample volumes, allowing accurate signal localization and quantification.
The first chapter of this dissertation discusses the development of improved bioluminescent proteins emitting at different wavelengths, with increased thermostability, enhanced emission decay kinetics and better spectral resolution. The second chapter focuses on the use of these proteins in the development of whole-cell assays with improved analytical performance. In particular, since the main drawback of whole-cell biosensors is the high variability of their analyte-specific response, mainly caused by variations in cell viability due to nonspecific effects of the sample matrix, an additional bioluminescent reporter was introduced to correct the analytical response, thus increasing the robustness of the bioassays. The feasibility of using a combination of two or more bioluminescent proteins to obtain biosensors with internal signal correction, or for the simultaneous detection of multiple analytes, was demonstrated by developing a dual-reporter yeast biosensor for measuring androgenic activity and a triple-reporter mammalian cell-based biosensor for simultaneously monitoring the activation of two CYP450 enzymes involved in cholesterol degradation, using two spectrally resolved intracellular luciferases and a secreted luciferase as a control for cell viability. The third chapter presents the development of a portable multianalyte detection system. In order to develop a portable system that can be used outside the laboratory environment, even by non-skilled personnel, cells were immobilized in a new biocompatible and transparent polymeric matrix within a modified clear-bottom black 384-well microtiter plate to obtain a bioluminescent cell array. The cell array was placed in contact with a portable charge-coupled device (CCD) light sensor able to localize and quantify the luminescent signal produced by the different bioluminescent whole-cell biosensors.
This multiplexed biosensing platform containing whole-cell biosensors was successfully used to measure the overall toxicity of a given sample, to obtain dose-response curves for heavy metals, and to detect hormonal activity in clinical samples (PCT/IB2010/050625: “Portable device based on immobilized cells for the detection of analytes.” Michelini E, Roda A, Dolci LS, Mezzanotte L, Cevenini L, 2010). At the end of the dissertation, some future development steps are discussed towards a point-of-care testing (POCT) device that combines portability, minimal sample pre-treatment and highly sensitive multiplexed assays in a short assay time. In this POCT perspective, field-flow fractionation (FFF) techniques, in particular the gravitational variant (GrFFF), which exploits the Earth's gravitational field to structure the separation, have been investigated for cell fractionation, characterization and isolation. Thanks to the simplicity of its equipment, amenable to miniaturization, the GrFFF technique appears particularly suited for implementation in POCT devices and may be used as an integrated pre-analytical module that drives target analytes of raw samples directly to the modules where biospecific recognition reactions based on ultrasensitive bioluminescence detection occur, providing an increase in overall analytical output.
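The internal signal correction with a viability reporter described above amounts to ratioing the analyte-specific signal against the control luciferase, so that nonspecific matrix effects that scale both reporters cancel out. A minimal sketch with invented numbers:

```python
def corrected_response(analyte_signal, control_signal, control_baseline):
    """Normalize the analyte-specific reporter by the viability reporter:
    dividing by (control / baseline) cancels nonspecific losses that
    affect both reporters equally (e.g. reduced cell viability)."""
    if control_signal <= 0.0:
        raise ValueError("control reporter signal must be positive")
    return analyte_signal / (control_signal / control_baseline)
```

If a toxic matrix halves cell viability, both reporters drop by half and the corrected response recovers the value an unstressed culture would have given.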
Abstract:
Traffic is forecast to increase in the future, while space and funding for building additional roads are scarce. Existing capacity must therefore be used more sensibly through better traffic control, e.g. by traffic management systems. This requires spatially resolved data, i.e. data reflecting the areal distribution of traffic, which are currently lacking. So far, traffic data could only be collected where fixed measuring installations exist, but these cannot provide the missing data. Remote sensing systems offer the possibility of acquiring such data area-wide with a view from above. After decades of experience with remote sensing methods for recording and studying a wide variety of phenomena on the Earth's surface, this methodology is now being applied to the traffic domain within a pilot project. Since the late 1990s, traffic has been observed with airborne optical and infrared imaging systems. In poor weather conditions, however, and in particular under cloud cover, no usable images can be acquired. With an imaging radar method, data are collected independently of weather, daylight and cloud conditions. This thesis investigates to what extent traffic data can be acquired, processed and usefully applied with airborne synthetic aperture radar (SAR). Not only are the new technique of along-track interferometry (ATI) and the processing of the acquired traffic data presented in detail; a data set produced with this method is also compared with a traffic simulation and evaluated. Finally, an outlook on future developments in radar remote sensing for traffic data acquisition is given.
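In along-track interferometry the two SAR images are acquired with a short time lag, and a mover's line-of-sight velocity maps to an interferometric phase. A minimal sketch of the standard relation v_r = lambda * phi / (4 * pi * tau); note that the effective lag tau = B / v depends on the acquisition mode, and the numbers in the test are purely illustrative:

```python
import math

def ati_radial_velocity(phase_rad, wavelength_m, baseline_m, platform_speed):
    """Radial (line-of-sight) velocity from ATI phase:
    v_r = lambda * phi / (4 * pi * tau), with the effective time lag
    tau = baseline / platform_speed (mode-dependent convention)."""
    tau = baseline_m / platform_speed
    return wavelength_m * phase_rad / (4.0 * math.pi * tau)
```

Because moving vehicles also appear azimuth-displaced in SAR images, this velocity estimate is what allows them to be repositioned onto the road network before comparison with a traffic simulation.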