928 results for Portable.
Abstract:
Nucleic acid biosensors represent a powerful tool for the detection of clinical and environmental pathogens. For applications such as point-of-care biosensing, it is essential to develop sensors that are automatic, inexpensive, portable and require as little user expertise as possible. To determine the presence of pathogens present in very small amounts, such as in the screening of pathogens in drinking water, an amplification step must be implemented. These determinations should often be performed with simple, automatic and inexpensive hardware, so a chemical (or nanotechnological) isothermal solution is desirable. My Ph.D. project focused on the study and testing of four isothermal reactions that can be used either to amplify the nucleic acid analyte before the binding event on the sensor surface or to amplify the signal after the hybridization event with the probe. Recombinase polymerase amplification (RPA) and ligation-mediated rolling circle amplification (L-RCA) were investigated as methods for DNA and RNA amplification. Hybridization chain reaction (HCR) and terminal deoxynucleotidyl transferase-mediated amplification were investigated as strategies to enhance the signal after the surface hybridization event between target and probe. In conclusion, only a small subset of the biochemical strategies proven to amplify nucleic acids in solution truly work in the context of amplifying the signal of a pathogen detection system. Among those tested during my Ph.D. activity, recombinase polymerase amplification seems the best candidate for implementation in diagnostic or environmental applications.
Abstract:
Quality control of medical radiological systems is of fundamental importance and requires efficient methods for accurately determining the X-ray source spectrum. Straightforward measurements of X-ray spectra under standard operating conditions require the high photon flux to be limited, and therefore the measurement has to be performed in a laboratory. Optimal quality control, however, requires frequent in situ measurements, which can only be performed using a portable system. To reduce the photon flux by three orders of magnitude, an indirect technique based on the scattering of the X-ray source beam by a solid target is used. The measured spectrum lacks information because of transport and detection effects. The source spectrum is then recovered by unfolding, i.e., by solving the matrix equation that formally represents the scattering problem. However, the algebraic system is ill-conditioned and, therefore, a satisfactory solution cannot be obtained directly. Special strategies are necessary to circumvent the ill-conditioning. Numerous attempts have been made to solve this problem using purely mathematical methods. In this thesis, a more physical point of view is adopted. The proposed method uses both the forward and the adjoint solutions of the Boltzmann transport equation to generate a better-conditioned linear algebraic system. The procedure was first tested on numerical experiments, giving excellent results. The method was then verified with experimental measurements performed at the Operational Unit of Health Physics of the University of Bologna. The reconstructed spectra were compared with those obtained from straightforward measurements, showing very good agreement.
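To make the unfolding problem explicit, the relation between the measured scattered spectrum and the unknown source spectrum can be written as a discretized linear system (a generic formulation with assumed notation, not the specific operators used in the thesis):

% Generic discretization of the spectrum-unfolding problem (notation assumed):
% m_i  : counts recorded in detector channel i
% R_ij : response matrix accounting for scattering, transport and detection effects
% s_j  : unknown source spectrum in energy bin j
\begin{equation}
  m_i = \sum_{j} R_{ij}\, s_j , \qquad \mathbf{m} = \mathbf{R}\,\mathbf{s}
\end{equation}
% Direct inversion of R fails in practice because the system is ill-conditioned:
% the condition number \kappa(\mathbf{R}) = \lVert\mathbf{R}\rVert\,\lVert\mathbf{R}^{-1}\rVert
% is large, so small errors in \mathbf{m} are strongly amplified in the recovered \mathbf{s}.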
Abstract:
Starting from a historical survey of the last Portuguese colonial empire, the thesis analyses how the myth of empire contributed to defining Portugal's national identity and its cultural and literary heritage. The myth is investigated above all in its linguistic and discursive components, as a peculiar way of constructing "constellations" of images. In this context literature assumes a specific relevance, since its formal resources allow it to oppose to the fixity of the myth a powerful articulation capable of dismantling the automatisms linked to a translatio imperii aimed at reiterating the imagination of the centre. The analysis of a literary corpus belonging to what we might call the literature "of the retornados", which focuses above all on the definition of narrative focalization and on the reworking of narrative figures, seeks to identify the different ways of critically assuming the imperial canon and of moving beyond it.
Abstract:
Electrochemical biosensors provide an attractive means to analyze the content of a biological sample, thanks to the direct conversion of a biological event into an electronic signal, enabling the development of cheap, small, portable and simple devices that allow multiplexed and real-time detection. At the same time, nanobiotechnology is drastically revolutionizing biosensor development, and different transduction strategies exploit concepts developed in this field to simplify analysis operations for operators and end users, offering higher specificity, higher sensitivity, higher operational stability, integrated sample treatment and shorter analysis times. The aim of this PhD work has been the application of nanobiotechnological strategies to electrochemical biosensors for the detection of biological macromolecules. Specifically, one project focused on the application of a DNA nanotechnology called hybridization chain reaction (HCR) to amplify the hybridization signal in an electrochemical DNA biosensor. A second project concerned the development of an electrochemical biosensor based on a biological model membrane anchored to a solid surface (tBLM), for the recognition of interactions between the lipid membrane and different types of target molecules.
Abstract:
The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly flanked by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is an open-source scheduling framework, which is capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems by exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, highlighting comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been further extended first to asymmetric ones, which are subject to major restrictions such as the lack of support for task migrations, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator, which offloads most of the scheduling operations to the hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework presented many interesting software challenges, one of which was timekeeping. In this regard, a further contribution is a novel data structure, called addressable binary heap (ABH). The ABH, which is conceptually a pointer-based implementation of a binary heap, shows very interesting average- and worst-case performance when addressing the problem of tick-less timekeeping with high-resolution timers.
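To make the idea of an addressable binary heap concrete, the following Python sketch shows a minimal pointer-based binary min-heap whose insert operation returns a node handle. It is purely illustrative (class and method names are assumptions) and does not reproduce the specific structure or the performance guarantees described in the dissertation:

class ABHNode:
    """Heap element reachable directly through its handle."""
    __slots__ = ("key", "payload", "parent", "left", "right")

    def __init__(self, key, payload=None):
        self.key, self.payload = key, payload
        self.parent = self.left = self.right = None


class AddressableBinaryHeap:
    """Illustrative pointer-based min-heap with O(log n) insert/extract."""

    def __init__(self):
        self.root = None
        self.size = 0

    def _node_at(self, pos):
        # Walk from the root to the 1-indexed, level-order position `pos`:
        # the binary digits of `pos` (without the leading 1) encode the path.
        node, path = self.root, bin(pos)[3:]
        for bit in path:
            node = node.right if bit == "1" else node.left
        return node

    def insert(self, key, payload=None):
        node = ABHNode(key, payload)
        self.size += 1
        if self.size == 1:
            self.root = node
            return node
        parent = self._node_at(self.size // 2)
        node.parent = parent
        if self.size % 2 == 0:
            parent.left = node
        else:
            parent.right = node
        self._sift_up(node)
        return node  # handle usable by the caller

    @staticmethod
    def _swap(a, b):
        # Swap node contents; a full implementation would re-link the nodes
        # instead, so that handles keep following their elements.
        a.key, b.key = b.key, a.key
        a.payload, b.payload = b.payload, a.payload

    def _sift_up(self, node):
        while node.parent is not None and node.key < node.parent.key:
            self._swap(node, node.parent)
            node = node.parent

    def _sift_down(self, node):
        while True:
            smallest = node
            for child in (node.left, node.right):
                if child is not None and child.key < smallest.key:
                    smallest = child
            if smallest is node:
                return
            self._swap(node, smallest)
            node = smallest

    def extract_min(self):
        assert self.size > 0, "heap is empty"
        top, last = self.root, self._node_at(self.size)
        result = (top.key, top.payload)
        self._swap(top, last)
        if last.parent is not None:  # detach the (former) last node
            if last.parent.left is last:
                last.parent.left = None
            else:
                last.parent.right = None
        else:
            self.root = None
        self.size -= 1
        if self.size:
            self._sift_down(self.root)
        return result

A production timekeeping structure would additionally support arbitrary removal through the handle (for timer cancellation), which is where the pointer-based layout pays off compared with an array-backed heap.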
Abstract:
A system for digital-holographic imaging of airborne objects, suitable for ground-based field measurements, was developed and constructed. Depending on the depth position, it is suitable for directly determining the size of airborne objects larger than about 20 µm, as well as their shape for sizes from about 100 µm up to the millimetre range. The development additionally included an algorithm for automated improvement of the hologram quality and for semi-automatic distance determination of large objects. A way of intrinsically increasing the efficiency of the depth-position determination by computing angle-averaged profiles was presented. Furthermore, a method was developed that, using an iterative approach for isolated objects, allows recovery of the phase information and thus removal of the twin image. In addition, the effects of various limitations of digital holography, such as the finite pixel size, were investigated and discussed by means of simulations. The appropriate representation of the three-dimensional position information poses a particular problem in digital holography, since the three-dimensional light field is not physically reconstructed. A method was developed and implemented that, by constructing a stereoscopic representation of the numerically reconstructed measurement volume, allows a quasi-three-dimensional, magnified view. Selected digital holograms recorded during field campaigns at the Jungfraujoch were reconstructed. They showed, in part, a very high fraction of irregular crystal shapes, in particular as a result of heavy riming. Objects down to the range of ≤ 20 µm were observed even during periods with formally ice-subsaturated conditions. Furthermore, applying the theory of the "phase edge effect" developed here, an object of only about 40 µm in size could be identified as an ice platelet. The greatest disadvantage of digital holography compared with conventional photographic imaging techniques is the need for elaborate numerical reconstruction, which entails a high computational effort to achieve a result comparable to a photograph. On the other hand, digital holography has unique features: access to the three-dimensional position information can serve the local investigation of relative object distances. However, it turned out that the characteristics of digital holography currently make it difficult to observe sufficiently large numbers of objects on the basis of individual holograms. It was demonstrated that complete object boundaries could be reconstructed even when an object was partly or entirely outside the geometric measurement volume. Furthermore, the sub-pixel reconstruction, first demonstrated in simulations, was applied to real holograms; it could be shown that quasi-point-like objects could be localized with sub-pixel accuracy, and that additional information could also be obtained for extended objects. Finally, interference patterns were observed on reconstructed ice crystals and in some cases tracked over time. At present, both internal crystal reflection and the existence of a (quasi-)liquid layer appear possible as explanations, with some arguments favouring the latter.
As a result of this work, a system comprising a new measurement instrument and extensive algorithms is now available. S. M. F. Raupach, H.-J. Vössing, J. Curtius and S. Borrmann: Digital crossed-beam holography for in-situ imaging of atmospheric particles, J. Opt. A: Pure Appl. Opt. 8, 796-806 (2006); S. M. F. Raupach: A cascaded adaptive mask algorithm for twin image removal and its application to digital holograms of ice crystals, Appl. Opt. 48, 287-301 (2009); S. M. F. Raupach: Stereoscopic 3D visualization of particle fields reconstructed from digital inline holograms (accepted for publication, Optik - Int. J. Light El. Optics, 2009).
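As background on the numerical reconstruction step mentioned above: reconstructing a digital in-line hologram essentially means numerically propagating the recorded hologram H(ξ, η) to a chosen depth z, and every depth of interest requires one such evaluation, which is the origin of the computational cost discussed above. A standard, generic formulation (not the specific algorithm of this work) is the Fresnel diffraction integral:

% Generic Fresnel propagation of a recorded hologram H(\xi,\eta) to depth z
% (k = 2\pi/\lambda); shown for context only.
\begin{equation}
  U(x, y; z) = \frac{e^{\mathrm{i} k z}}{\mathrm{i}\,\lambda z}
  \iint H(\xi, \eta)\,
  \exp\!\left[\frac{\mathrm{i} k}{2 z}\Bigl((x-\xi)^2 + (y-\eta)^2\Bigr)\right]
  \mathrm{d}\xi\,\mathrm{d}\eta
\end{equation}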
Abstract:
Neurorehabilitation is a process through which individuals affected by neurological diseases aim at achieving a complete recovery or at realizing their optimal potential of physical, mental and social well-being. Essential elements of an effective rehabilitation are a clinical assessment by a multidisciplinary team, a targeted rehabilitation programme and the evaluation of the achieved outcomes through scientifically sound and clinically appropriate measures. The main objective of this thesis was to develop quantitative methods and tools for the treatment and motor assessment of neurological patients. Conventional rehabilitation treatments require neurological patients to perform repetitive exercises, reducing their motivation. Virtual reality and feedback can engage them in the treatment, while allowing repeatability and standardization of the protocols. A tool based on augmented feedback for trunk control was developed and evaluated. Moreover, virtual reality makes it possible to tailor the treatment to the patient's needs. A virtual application for gait rehabilitation was developed and tested during a training programme with multiple sclerosis patients, assessing its feasibility and acceptance and demonstrating the efficacy of the treatment. The quantitative assessment of patients' motor abilities is usually performed with motion capture systems. Since their use in clinical practice is limited, a methodology based on inertial sensors was proposed to assess arm swing in Parkinsonian subjects. These sensors are small, accurate and flexible, but accumulate errors during long measurements. This problem was addressed, and the results suggest that, if the sensor is placed on the foot and the accelerations are integrated starting from the mid-stance phase, the error and its consequences on the estimation of the spatial parameters remain limited. Finally, a validation of the Kinect for gait tracking in a virtual environment was presented. Preliminary results allow the field of use of the sensor in rehabilitation to be defined.
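The mid-stance integration strategy described above can be sketched as follows; this is a minimal, generic illustration (function name, drift-correction scheme and frame conventions are assumptions, not the exact method of the thesis):

import numpy as np

def stride_lengths_from_midstance(acc, fs, midstance_idx):
    """Estimate stride lengths by double-integrating foot acceleration.

    acc           : (N, 3) gravity-compensated foot acceleration in a fixed
                    frame [m/s^2], with the first two axes horizontal
    fs            : sampling frequency [Hz]
    midstance_idx : indices of consecutive mid-stance instants, where the
                    foot velocity is assumed to be (close to) zero
    """
    dt = 1.0 / fs
    strides = []
    for start, stop in zip(midstance_idx[:-1], midstance_idx[1:]):
        a = acc[start:stop]
        # Integrate acceleration to velocity; starting at mid-stance lets the
        # zero-velocity assumption reset the integration at every stride.
        vel = np.cumsum(a, axis=0) * dt
        # Remove the residual linear drift so that the velocity is zero again
        # at the next mid-stance (a simple zero-velocity update).
        vel -= np.linspace(0.0, 1.0, len(vel))[:, None] * vel[-1]
        pos = np.cumsum(vel, axis=0) * dt
        strides.append(float(np.linalg.norm(pos[-1, :2])))  # horizontal displacement
    return strides

Resetting the integration at every stride is what keeps the accumulated sensor error, and hence the error on the spatial parameters, bounded.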
Abstract:
In the last few years, the vision of our connected and intelligent information society has evolved to embrace novel technological and research trends. The diffusion of ubiquitous mobile connectivity and advanced handheld portable devices has amplified the importance of the Internet as the communication backbone for accessing services and data. The diffusion of mobile and pervasive computing devices, featuring advanced sensing technologies and processing capabilities, has triggered the adoption of innovative interaction paradigms: touch-responsive surfaces, tangible interfaces and gesture or voice recognition are finally entering our homes and workplaces. We are experiencing the proliferation of smart objects and sensor networks, embedded in our daily living and interconnected through the Internet. This ubiquitous network of always-available interconnected devices is enabling new applications and services, ranging from enhancements to home and office environments, to remote healthcare assistance and the birth of smart environments. This work will present some evolutions in the hardware and software development of embedded systems and sensor networks. Different hardware solutions will be introduced, ranging from smart objects for interaction to advanced inertial sensor nodes for motion tracking, focusing on system-level design. They will be accompanied by the study of innovative data processing algorithms developed and optimized to run on board the embedded devices. Gesture recognition, orientation estimation and data reconstruction techniques for sensor networks will be introduced and implemented, with the goal of optimizing the trade-off between performance and energy efficiency. Experimental results will provide an evaluation of the accuracy of the presented methods and validate the efficiency of the proposed embedded systems.
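As an example of the kind of lightweight on-board processing mentioned above, the sketch below shows a generic complementary filter for tilt (roll/pitch) estimation from gyroscope and accelerometer data; it is an assumed, textbook-style illustration, not the orientation algorithm actually developed in this work:

import numpy as np

def complementary_filter(gyro, acc, fs, alpha=0.98):
    """Estimate roll and pitch on a low-power inertial node.

    gyro  : (N, 3) angular rates [rad/s]
    acc   : (N, 3) accelerations [m/s^2]
    fs    : sampling frequency [Hz]
    alpha : blend factor between gyroscope and accelerometer estimates
    """
    dt = 1.0 / fs
    roll = pitch = 0.0
    out = np.empty((len(gyro), 2))
    for i, (w, a) in enumerate(zip(gyro, acc)):
        # Propagate the attitude with the fast but drifting gyroscope...
        roll += w[0] * dt
        pitch += w[1] * dt
        # ...and correct it with the noisy but drift-free gravity direction.
        roll_acc = np.arctan2(a[1], a[2])
        pitch_acc = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        roll = alpha * roll + (1.0 - alpha) * roll_acc
        pitch = alpha * pitch + (1.0 - alpha) * pitch_acc
        out[i] = roll, pitch
    return out

A first-order filter of this kind trades some accuracy for a very small memory and computation footprint, which is the performance/energy trade-off targeted by embedded sensor nodes.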
Abstract:
Nowadays microfluidics is becoming an important technology in many chemical and biological processes and analytical applications. The potential to replace large-scale conventional laboratory instrumentation with miniaturized and self-contained systems (called lab-on-a-chip (LOC) or point-of-care testing (POCT) systems) offers a variety of advantages, such as low reagent consumption, faster analysis and the capability of operating on a massively parallel scale in order to achieve high throughput. Micro-electro-mechanical-systems (MEMS) technologies enable both the fabrication of miniaturized systems and the development of compact and portable instruments. The work described in this dissertation is directed towards the development of micromachined separation devices for both high-speed gas chromatography (HSGC) and gravitational field-flow fractionation (GrFFF) using MEMS technologies. Concerning HSGC, a complete platform of three MEMS-based GC core components (injector, separation column and detector) is designed, fabricated and characterized. The microinjector consists of a set of pneumatically driven microvalves based on a polymeric actuating membrane. Experimental results demonstrate that the microinjector guarantees low dead volumes, fast actuation times, a wide operating temperature range and high chemical inertness. The separation column is an all-silicon microcolumn with a nearly circular cross-section channel; its extensive characterization produced separation performance very close to the theoretical expectations. A thermal conductivity detector (TCD) is chosen as the most suitable detector to be miniaturized, since reducing the volume of the detector chamber improves the sensitivity and reduces dead volumes. The microTCD shows good sensitivity and a very wide dynamic range. Finally, a feasibility study for miniaturizing a channel suited for GrFFF was performed. The proposed GrFFF microchannel is at an early stage of development, but represents a first step towards the realization of a highly portable and potentially low-cost POCT device for biomedical applications.
Abstract:
The two Mars Exploration Rovers (MER), Spirit and Opportunity, landed on the Martian surface in January 2004 and have since collected a wealth of information about their landing sites. As part of their payload, the miniaturised Mössbauer spectrometer MIMOS II contributes to the success of the mission by identifying iron-bearing minerals and by determining the iron oxidation states in them. The basis of this work is the data set obtained at Opportunity's landing site at Meridiani Planum. A portion of this data set is evaluated with different methods, with the aim of thoroughly characterizing lithologic components at Meridiani Planum and possible relations between them.

MIMOS II is able to measure Mössbauer spectra at different energies simultaneously, bearing information from different sampling depths of the investigated target. The ability of depth-selective Mössbauer spectroscopy to characterize weathered surface layers is illustrated through its application to two suitable rock targets that were investigated on Mars. In both cases, an enhanced concentration of iron oxides at the rock surface was detected, pointing to a low degree of aqueous alteration.

The mineral hematite (α-Fe2O3) is present in the matrix of outcrop rocks and in spherules weathering out of the outcrop. Simultaneous fitting of Mössbauer spectra was applied to data sets obtained on both target types to characterize the hematite component in detail. This approach reveals that two hematite populations are present, both in the outcrop matrix and in the spherules. The hematite component with a comparably high degree of crystallinity and/or chemical purity is present in the outcrop matrix. The investigation of hematite at Meridiani Planum has shown that simultaneous fitting is a suitable and useful method to evaluate a large, correlated set of Mössbauer spectra.

Opportunity encountered loose, cm-sized rocks along its traverse. Based on their composition and texture, these "cobbles" can be divided into three different groups. Outcrop fragments are impact-derived ejecta from local outcrop rocks. Cobbles of meteoritic origin contain the minerals kamacite (Fe,Ni) and troilite (FeS) and exhibit high Ni contents. Melt-bearing impact breccias bear similarities to local outcrop rocks and basaltic soil, with a phase composition and texture consistent with a formation scenario involving partial melting and the inclusion of small, bright outcrop clasts.

Owing to their metallic nature, iron meteorites on the Martian surface experience weathering in the presence of even trace amounts of water. Opportunity encountered and investigated four iron meteorites, which exhibit evidence of physical and chemical weathering. Discontinuous coatings contain iron oxides, pointing to the influence of limited amounts of water.

A terrestrial analogue site for Meridiani Planum is the Rio Tinto basin in south-west Spain. With its deposits of sulfate- and iron-oxide-bearing minerals, the region provides an adequate test bed for instrumentation for future Mars missions. In-situ investigations at Rio Tinto were carried out with a special focus on the combined use of Mössbauer spectroscopy with MIMOS II and Raman spectroscopy with a field-portable instrument. The results demonstrate that the two instruments provide complementary information about the investigated samples.
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, and the availability of almost ubiquitous Internet connectivity, make it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly analyzed in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and by identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that can be used by researchers to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
Abstract:
Body-centric communications are emerging as a new paradigm in the panorama of personal communications. Being concerned with human behaviour, they are suitable for a wide variety of applications. The advances in the miniaturization of portable devices to be placed on or around the body foster the diffusion of these systems, where the human body is the key element defining communication characteristics. This thesis investigates the human impact on body-centric communications in its distinctive aspects. First of all, the unique propagation environment defined by the body is described through a scenario-based channel modeling approach, according to the communication scenario considered, i.e., on-body or on- to off-body. The novelty introduced pertains to the description of radio channel features accounting for multiple sources of variability at the same time. Secondly, the importance of a proper channel characterisation is shown by integrating the on-body channel model into a system-level simulator, allowing a more realistic comparison of different Physical and Medium Access Control layer solutions. Finally, the structure of a comprehensive simulation framework for system performance evaluation is proposed. It aims at merging into one tool the mobility and social features typical of human beings, together with the propagation aspects, in a scenario where multiple users interact sharing space and resources.
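As an illustration of what a scenario-based channel description may look like, a common generic form for the on-body path loss combines a log-distance term with a random term collecting the additional sources of variability (a textbook-style expression with assumed symbols, not the specific model derived in the thesis):

% Generic log-distance path-loss form often used for on-body channels:
% d_0 is a reference distance, n the path-loss exponent, and S a random term
% (in dB) collecting variability due to posture, movement and surroundings.
\begin{equation}
  PL(d)\,[\mathrm{dB}] = PL(d_0) + 10\, n \log_{10}\!\frac{d}{d_0} + S
\end{equation}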
Abstract:
Efficient energy storage and conversion is playing a key role in overcoming present and future challenges in energy supply. Batteries provide portable, electrochemical storage of green energy sources and potentially allow for a reduction of the dependence on fossil fuels, which is of great importance with respect to the issue of global warming. In view of both energy density and energy drain, rechargeable lithium ion batteries outperform other present accumulator systems. However, despite great efforts over the last decades, the ideal electrolyte in terms of key characteristics such as capacity, cycle life and, most importantly, reliable safety has not yet been identified.

Steps ahead in lithium ion battery technology require a fundamental understanding of lithium ion transport, salt association, and ion solvation within the electrolyte. Indeed, well-defined model compounds allow for systematic studies of molecular ion transport. Thus, in the present work, based on the concept of "immobilizing" ion solvents, three main series with a cyclotriphosphazene (CTP), hexaphenylbenzene (HPB), and tetramethylcyclotetrasiloxane (TMS) scaffold were prepared. Lithium ion solvents, among others ethylene carbonate (EC), which together with propylene carbonate has proven to fulfil safety and market requirements in commercial lithium ion batteries, were attached to the different cores via alkyl spacers of variable length.

All model compounds were fully characterized, pure and thermally stable up to at least 235 °C, covering the required broad range of glass transition temperatures from -78.1 °C up to +6.2 °C. While the CTP models tend to rearrange at elevated temperatures over time, which questions the general stability of alkoxide-related (poly)phosphazenes, both the HPB- and CTP-based models show no evidence of core stacking. In particular, the CTP derivatives represent good solvents for various lithium salts, exhibiting no significant differences in the ionic conductivity σ_dc and thus indicating comparable salt dissociation and rather independent motion of cations and anions.

In general, the temperature-dependent bulk ionic conductivities investigated via impedance spectroscopy follow a Williams-Landel-Ferry (WLF) type behavior. Modifications of the alkyl spacer length were shown to influence the ionic conductivities only in combination with changes in the glass transition temperatures. Though the glass transition temperatures of the blends are low, their conductivities are only in the range of typical polymer electrolytes. The highest σ_dc obtained at ambient temperature was 6.0 × 10⁻⁶ S cm⁻¹, strongly suggesting a rather tight coordination of the lithium ions to the solvating 2-oxo-1,3-dioxolane moieties, as supported by the increased σ_dc values of the oligo(ethylene oxide) based analogues.

Further insights into the mechanism of lithium ion dynamics were derived from 7Li and 13C solid-state NMR investigations. While localized ion motion was probed by, e.g., 7Li spin-lattice relaxation measurements, with apparent activation energies E_a of 20 to 40 kJ/mol, long-range macroscopic transport was monitored by Pulsed-Field Gradient (PFG) NMR, providing an E_a of 61 kJ/mol. The latter is in good agreement with the values determined from bulk conductivity data, indicating that the major contribution to ion transport is the long-range motion detected by PFG NMR. However, the µm-scale diffusion is rather slow, emphasizing the strong lithium coordination to the carbonyl oxygens, which hampers sufficient ionic conductivity and suggests exploring "softer" solvating moieties in future electrolytes.
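For reference, the WLF-type behaviour mentioned above is commonly written in the following standard form (generic textbook expression; the fitted constants of this work are not reproduced here):

% Standard Williams-Landel-Ferry (WLF) expression for the temperature
% dependence of the dc ionic conductivity relative to a reference
% temperature T_ref (often taken near T_g); C_1 and C_2 are empirical
% constants obtained from the fit.
\begin{equation}
  \log_{10}\frac{\sigma_{\mathrm{dc}}(T)}{\sigma_{\mathrm{dc}}(T_{\mathrm{ref}})}
  = \frac{C_1\,(T - T_{\mathrm{ref}})}{C_2 + (T - T_{\mathrm{ref}})}
\end{equation}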
Ultrasensitive chemiluminescence bioassays based on microfluidics in miniaturized analytical devices
Abstract:
The activity carried out during my PhD was principally devoted to the development of portable microfluidic analytical devices based on biospecific molecular recognition reactions and chemiluminescence (CL) detection. In particular, the development of biosensors required the study of different materials and procedures for their construction, with particular attention to the development of suitable immobilization procedures and fluidic systems and to the selection of suitable detectors. Different methods were exploited, such as gene-probe hybridization assays or immunoassays, based on different platforms (functionalized glass slides or nitrocellulose membranes), trying to improve the simplicity of the assay procedure. Different CL detectors were also employed and compared with each other in the search for the best compromise between portability and sensitivity. The work was therefore aimed at the miniaturization and simplification of analytical devices, and the study involved all aspects of the system, from the analytical methodology to the type of detector, in order to combine high sensitivity with ease of use and rapidity. The latest development, involving the use of a smartphone as a chemiluminescence detector, paves the way for a new generation of analytical devices in the clinical diagnostic field, thanks to the ideal combination of the sensitivity and simplicity of CL with the steadily increasing performance of new-generation smartphone cameras. Moreover, the connectivity and data processing offered by smartphones can be exploited to perform analyses directly at home with simple procedures. The system could eventually be used to monitor patient health and directly notify the physician of the analysis results, allowing a decrease in costs and an increase in healthcare availability and accessibility.
Abstract:
This thesis presents a CMOS amplifier with high common-mode rejection, designed in UMC 130 nm technology. The goal is to achieve a high amplification factor for a wide range of biological signals (with frequencies in the range of 10 Hz-1 kHz) and to reject the common-mode noise signal. A data acquisition system is presented, composed of a Delta-Sigma-like modulator and an antenna, which is the core of a portable low-complexity radio system; the amplifier is designed to interface the data acquisition system with a sensor that acquires the electrical signal. The modulator asynchronously acquires and samples human muscle activity, sending a quasi-digital pattern that encodes the acquired signal. There is only a minor loss of information in translating the muscle activity with this pattern, compared to an encoding technique that uses a standard digital signal via Impulse-Radio Ultra-Wide Band (IR-UWB). The biological signals needed for electromyographic analysis have an amplitude of 10-100 µV and need to be highly amplified and separated from the overwhelming 50 mV common-mode noise signal. Various proof-of-concept tests are presented, as well as evidence that the design also works with different sensors, such as radiation measurement for dosimetry studies.
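For reference, the common-mode rejection targeted above is usually quantified through the common-mode rejection ratio (generic definition, not a figure reported for this design):

% Generic definition of the common-mode rejection ratio (CMRR), where A_d is
% the differential-mode gain and A_cm the common-mode gain of the amplifier.
\begin{equation}
  \mathrm{CMRR} = 20 \log_{10}\frac{|A_d|}{|A_{cm}|}\ \text{dB}
\end{equation}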