954 results for continuous biometric authentication system
Abstract:
Digital Microfluidics (DMF) is a second-generation technique derived from conventional microfluidics that, instead of using continuous liquid flows, uses individual droplets driven by external electric signals. In this thesis, a new DMF control/sensing system for visualization, droplet control (movement, dispensing, merging and splitting) and real-time impedance measurement has been developed. The software for the proposed system was implemented in MATLAB with a graphical user interface. An Arduino was used as the control board, and dedicated circuits for voltage switching and contacts were designed and implemented on printed circuit boards. A high-resolution camera was integrated for visualization. In our new approach, the DMF chips are driven by a dual-tone signal in which the sum of two independent AC signals (one for droplet operations and the other for impedance sensing) is applied to the electrodes and afterwards independently evaluated by a lock-in amplifier. With this new approach we were able to choose the appropriate amplitudes and frequencies for the different purposes (actuation and sensing). The measurements were used to evaluate the droplet impedance in real time, enabling its position and velocity to be determined. This new approach opens new possibilities for impedance sensing and feedback control in DMF devices.
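The dual-tone scheme described above can be illustrated with a short numerical sketch: two AC tones are summed on one electrode, and a software lock-in (I/Q demodulation followed by averaging) recovers the amplitude of each tone independently. The sample rate, frequencies and amplitudes below are illustrative assumptions, not the values used in the thesis.

```python
import numpy as np

FS = 1_000_000                    # sample rate, Hz (illustrative)
F_ACT, F_SENSE = 1_000, 100_000   # actuation / sensing tones (hypothetical values)

def dual_tone(t, a_act=100.0, a_sense=1.0):
    """Sum of the actuation and sensing AC signals applied to an electrode."""
    return a_act * np.sin(2 * np.pi * F_ACT * t) + a_sense * np.sin(2 * np.pi * F_SENSE * t)

def lock_in_amplitude(signal, t, f_ref):
    """Recover the amplitude of the f_ref component by I/Q demodulation."""
    i = signal * np.sin(2 * np.pi * f_ref * t)
    q = signal * np.cos(2 * np.pi * f_ref * t)
    # averaging plays the role of the low-pass stage of a lock-in amplifier
    return 2.0 * np.hypot(i.mean(), q.mean())

t = np.arange(10_000) / FS        # 10 ms acquisition window
v = dual_tone(t)
print(lock_in_amplitude(v, t, F_SENSE))   # ≈ 1.0: sensing tone recovered despite the large actuation tone
```

Because the two tones are at different frequencies, each amplitude can be chosen independently, which is the point of the dual-tone approach.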
Abstract:
Nowadays, authentication studies for paintings require a multidisciplinary approach, based on the contribution of visual feature analysis but also on the characterization of materials and techniques. Moreover, it is important that the assessment of the authorship of a painting is supported by technical studies of a selected number of original artworks that cover the entire career of an artist. This dissertation concerns the work of the modernist painter Amadeo de Souza-Cardoso. It is divided into three parts. In the first part, we propose a tool based on image processing that combines information obtained by brushstroke and materials analysis. The resulting tool provides qualitative and quantitative evaluation of the authorship of paintings; the quantitative element is particularly relevant, as it could be crucial in solving authorship controversies, such as judicial disputes. The brushstroke analysis was performed by combining two algorithms for feature detection, namely the Gabor filter and the Scale Invariant Feature Transform (SIFT). Thanks to this combination (and to the use of the Bag-of-Features model), the proposed method shows an accuracy higher than 90% in distinguishing between images of Amadeo's paintings and images of artworks by other contemporary artists. For the molecular analysis, we implemented a semi-automatic system that uses hyperspectral imaging and elemental analysis. The system provides as output an image that depicts the mapping of the pigments present, together with any areas made using materials not consistent with Amadeo's palette. This visual output is a simple and effective way of assessing the results of the system. The tool based on the combination of brushstroke and molecular information was tested on twelve paintings, obtaining promising results. The second part of the thesis presents a systematic study of four selected paintings made by Amadeo in 1917. Although untitled, three of these paintings are commonly known as BRUT, Entrada and Coty; they are considered his most successful and genuine works. The materials and techniques of these artworks had never been studied before. The paintings were studied with a multi-analytical approach using micro-Energy Dispersive X-ray Fluorescence spectroscopy, micro-Infrared and Raman Spectroscopy, micro-Spectrofluorimetry and Scanning Electron Microscopy. The characterization of the materials and techniques used in Amadeo's last paintings, as well as the investigation of some of the conservation problems that affect them, is essential to enrich the knowledge of this artist. Moreover, the study of the materials in the four paintings reveals commonalities between the paintings BRUT and Entrada. This observation is also supported by the analysis of the elements present in a photograph of a collage (conserved at the Art Library of the Calouste Gulbenkian Foundation), the only remaining evidence of a supposed maquette of these paintings. The final part of the thesis describes the application of the image-processing tools developed in the first part to a set of case studies; this experience demonstrates the potential of the tool to support painting analysis and authentication studies. The brushstroke analysis was used as an additional analysis in the evaluation process of four paintings attributed to Amadeo, and the system based on hyperspectral analysis was applied to the painting dated 1917. The case studies therefore serve as a bridge between the first two parts of the dissertation.
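The brushstroke classification pipeline (local feature extraction followed by a Bag-of-Features encoding) can be sketched as follows. The random vectors stand in for Gabor/SIFT descriptors, and the tiny k-means codebook is a minimal illustration under those assumptions, not the dissertation's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    """Tiny k-means: learns a codebook of k visual words from descriptors."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def bof_histogram(descriptors, codebook):
    """Encode one painting as a normalized histogram of visual-word counts."""
    words = np.argmin(((descriptors[:, None] - codebook[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

# Toy stand-ins for descriptor sets extracted from two painters' brushstrokes
painter_a = rng.normal(0.0, 1.0, size=(200, 8))
painter_b = rng.normal(3.0, 1.0, size=(200, 8))
codebook = kmeans(np.vstack([painter_a, painter_b]), k=4)

h_a = bof_histogram(painter_a, codebook)
h_b = bof_histogram(painter_b, codebook)
# Histograms from the two painters should differ markedly (L1 distance)
print(np.abs(h_a - h_b).sum())
```

In a real pipeline these histograms would feed a classifier trained on images of Amadeo's paintings versus other artists' works.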
Abstract:
Injectable biomaterials with in situ cross-linking reactions have been suggested to minimize the invasiveness associated with most implantation procedures. However, problems related to the rapid liquid-to-gel transition can arise, because it is difficult to predict the reliability of the reaction and its end products, as well as to mitigate cytotoxicity to the surrounding tissues. An alternative minimally invasive approach to delivering solid implants in vivo is based on injectable microparticles, which can be processed in vitro with high fidelity and reliability while showing low cytotoxicity. Their delivery to the defect can be performed by injection through a small-diameter syringe needle. We present a new methodology for the continuous, solvent- and oil-free production of photopolymerizable microparticles containing encapsulated human dermal fibroblasts. A precursor solution of cells in photo-reactive PEG-fibrinogen (PF) polymer was transported through a transparent injector exposed to light irradiation before being atomized in a jet-in-air nozzle. Shear rheometry data provided the cross-linking kinetics of each PF/cell solution, which were then used to determine the amount of irradiation required to partially polymerize the mixture prior to atomization. The partially polymerized drops fell into a gelation bath for further polymerization. The system was capable of producing cell-laden microparticles with high cellular viability, with average diameters between 88.1 µm and 347.1 µm and dispersities between 1.1 and 2.4, depending on the parameters chosen.
Abstract:
The assessment of concrete mechanical properties during the construction of concrete structures is of paramount importance for many intrinsic operations. However, many of the available non-destructive methods for mechanical properties have limitations for use on construction sites. One such methodology is EMM-ARM, a variant of classic resonant frequency methods. This paper aims to demonstrate the efforts towards the in-situ applicability of EMM-ARM, so as to provide real-time information about concrete mechanical properties such as E-modulus and compressive strength. To achieve this objective, a set of adaptations to the method has been successfully implemented and tested: (i) the reduction of the beam span; (ii) the use of a different mould material; and (iii) a new support system for the beams. Based on these adaptations, a reusable mould was designed to enable easier systematic use of EMM-ARM. A pilot test was successfully performed under in-situ conditions during a bridge construction.
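As a rough illustration of the principle behind resonant-frequency methods such as EMM-ARM, the sketch below back-calculates a composite flexural stiffness EI from a measured first resonant frequency, assuming a simply supported Euler-Bernoulli beam, and then subtracts a constant mould stiffness to isolate the concrete contribution. The formula choice, boundary conditions and all numbers are assumptions for illustration, not the paper's actual configuration or data.

```python
import math

def ei_from_frequency(f1_hz, span_m, mass_per_length):
    """Composite flexural stiffness EI [N*m^2] from the first resonant frequency
    of a simply supported Euler-Bernoulli beam: f1 = (pi / (2 L^2)) * sqrt(EI / m)."""
    return mass_per_length * (2.0 * math.pi * f1_hz) ** 2 * span_m ** 4 / math.pi ** 4

def concrete_e_modulus(ei_total, ei_mould, inertia_concrete):
    """Subtract the (constant) mould stiffness to isolate the concrete E-modulus."""
    return (ei_total - ei_mould) / inertia_concrete

# Illustrative numbers only (not from the paper):
EI = ei_from_frequency(f1_hz=25.0, span_m=0.45, mass_per_length=2.0)
E = concrete_e_modulus(EI, ei_mould=20.0, inertia_concrete=1.0e-7)
print(EI, E)
```

Tracking f1 continuously over time would then give the hardening curve of the E-modulus, which is the real-time information the method provides.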
Abstract:
The Internet of Things (IoT) is a concept that can foster the emergence of innovative applications. In order to minimize parents' concerns about their children's safety, this paper presents the design of a smart Internet of Things system for identifying dangerous situations. The system will be based on the real-time collection and analysis of physiological signals monitored by non-invasive and non-intrusive sensors, Radio-Frequency IDentification (RFID) tags and a Global Positioning System (GPS) to determine when a child is in danger. A state of danger is assumed based on the validation of a certain number of biometric reactions to specific situations, according to a self-learning algorithm developed for this architecture. The results of the analysis of the collected data and the location of the child will be available in real time to the child's caregivers in a web application.
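A minimal sketch of the kind of self-learning danger detection described above: each child's baseline is learned per signal, and a danger state is assumed only when several signals deviate simultaneously. The signal names, thresholds and z-score rule are hypothetical illustrations, not the paper's algorithm.

```python
import statistics

class DangerDetector:
    """Hypothetical sketch: flag a danger state when several physiological
    signals deviate simultaneously from a child's learned baseline."""

    def __init__(self, signals, min_deviations=2, z_threshold=3.0):
        self.baseline = {s: [] for s in signals}
        self.min_deviations = min_deviations
        self.z = z_threshold

    def learn(self, reading):
        # accumulate normal readings to build each signal's baseline
        for name, value in reading.items():
            self.baseline[name].append(value)

    def is_danger(self, reading):
        deviating = 0
        for name, value in reading.items():
            hist = self.baseline[name]
            mu, sd = statistics.mean(hist), statistics.pstdev(hist)
            if sd > 0 and abs(value - mu) / sd > self.z:
                deviating += 1
        return deviating >= self.min_deviations

det = DangerDetector(["heart_rate", "skin_conductance"])
for hr, sc in [(90, 2.0), (95, 2.2), (92, 2.1), (88, 1.9)]:
    det.learn({"heart_rate": hr, "skin_conductance": sc})
print(det.is_danger({"heart_rate": 150, "skin_conductance": 9.0}))  # True
print(det.is_danger({"heart_rate": 91, "skin_conductance": 2.0}))   # False
```

Requiring several simultaneous deviations, rather than one, matches the idea of validating a certain number of biometric reactions before assuming danger.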
Abstract:
The blood-brain barrier (BBB) and the blood-cerebrospinal fluid barrier (BCSFB) form the barriers of the brain. These barriers are essential not only for protecting the brain, but also for regulating the exchange of cells and molecules in and out of the brain. The choroid plexus (CP) epithelial cells and the arachnoid membrane form the BCSFB. The CP is structurally divided into two independent compartments: one formed by a unique and continuous line of epithelial cells that rest upon a basal lamina, and a second consisting of a central core of connective and highly vascularized tissue populated by diverse cell types (fibroblasts, macrophages and dendritic cells). Here, we review how the CP transcriptome and secretome vary depending on the nature and duration of the stimuli to which the CP is exposed. Specifically, when the peripheral stimulation is acute, the CP response is rapid, strong and transient, whereas if the stimulation is sustained in time, the CP response persists but is weaker. Furthermore, not all of the epithelium responds at the same time to peripheral stimulation, suggesting the existence of a synchrony system between individual CP epithelial cells.
Abstract:
Bundle of capillaries, drying kinetics, continuous model, relative permeability, capillary pressure, control volume method
Abstract:
Magdeburg, Univ., Fak. für Informatik, Diss., 2013
Abstract:
Magdeburg, Univ., Fak. für Verfahrens- und Systemtechnik, Diss., 2015
Abstract:
While mobile technologies can provide great personalized services for mobile users, they also threaten their privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where a user's behaviors, movements and habits can be associated with a consumer's personal identity. In this thesis, I studied privacy issues in the mobile context, focusing in particular on the design of an adaptive privacy-management system for context-aware mobile devices, and explored the role of personalization and control over users' personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single sign-on solution that uses a user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. I then followed a design-science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control in my model and how these two elements interact with privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefit of personalized services and the user's privacy concerns they may cause. By pointing out new opportunities to rethink how a user's context information can protect private data, it also suggests new elements for privacy-related business models.
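The adaptive single sign-on idea, using context as an implicit identifier, can be sketched as a simple risk score over context signals: silent sign-on is granted only in familiar contexts. The signals, weights and threshold below are hypothetical illustrations, not the thesis's actual design.

```python
def context_risk(context, profile):
    """Hypothetical sketch: score how unfamiliar the current context is.
    Each unfamiliar signal (location, network, hour) raises the risk."""
    risk = 0
    if context["location"] not in profile["known_locations"]:
        risk += 2
    if context["wifi"] not in profile["known_networks"]:
        risk += 1
    if context["hour"] not in profile["usual_hours"]:
        risk += 1
    return risk

def sso_decision(context, profile, threshold=2):
    """Grant silent single sign-on only when the context looks familiar."""
    return "auto_signon" if context_risk(context, profile) < threshold else "reauthenticate"

profile = {"known_locations": {"home", "office"},
           "known_networks": {"home_wifi"},
           "usual_hours": set(range(7, 23))}

print(sso_decision({"location": "home", "wifi": "home_wifi", "hour": 9}, profile))     # auto_signon
print(sso_decision({"location": "airport", "wifi": "free_wifi", "hour": 3}, profile))  # reauthenticate
```

The design choice is that personalization (the learned profile) is what makes the privacy protection adaptive, which is exactly the trade-off the thesis studies.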
Abstract:
A new approach to dengue vector surveillance based on permanent egg-collection using a modified ovitrap and Bacillus thuringiensis israelensis (Bti) was evaluated in different urban landscapes in Recife, Northeast Brazil. From April 2004 to April 2005, 13 egg-collection cycles of four weeks were carried out. Geo-referenced ovitraps containing grass infusion, Bti and three paddles were placed at fixed sampling stations distributed over five selected sites. Continuous egg-collection yielded more than four million eggs laid into 464 sentinel ovitraps over one year. The overall positive ovitrap index was 98.5% (over 5,616 trap observations). The egg density index ranged from 100 to 2,500 eggs per trap-cycle, indicating widespread and high-density Aedes aegypti (Diptera: Culicidae) breeding populations in all sites. Fluctuations in population density over time were observed, particularly a marked increase from January on, or later, according to site. Massive egg-collection carried out at one of the sites prevented such a population outbreak. At the intra-site level, egg counts made it possible to identify spots where the vector population is consistently concentrated over time, pinpointing areas that should be considered high priority for control activities. The results indicate that these could be promising strategies for detecting and preventing Ae. aegypti population outbreaks.
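The two surveillance indices reported above can be computed from raw trap counts roughly as follows; the exact operational definitions used in the study may differ, and the counts below are toy values.

```python
def ovitrap_indices(egg_counts):
    """Sketch of two ovitrap surveillance indices (simplified definitions):
    - positive ovitrap index (POI): share of trap observations with eggs, in %
    - egg density index (EDI): mean number of eggs per positive trap."""
    positives = [c for c in egg_counts if c > 0]
    poi = 100.0 * len(positives) / len(egg_counts)
    edi = sum(positives) / len(positives) if positives else 0.0
    return poi, edi

# Toy counts for one collection cycle (not the study's data)
counts = [0, 120, 450, 2300, 800, 0, 95, 610]
poi, edi = ovitrap_indices(counts)
print(poi, round(edi, 1))  # 75.0 729.2
```

Computing these indices per trap and per cycle is what allows the intra-site hotspots and temporal outbreaks described above to be detected.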
Abstract:
Forests are key ecosystems of the earth and are associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true-colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data from a small-footprint laser-scanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for the sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit.
However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.
Oral cancer treatments and adherence: medication event monitoring system assessment for capecitabine
Abstract:
Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (still in the process of inclusion) divided into 4 groups. The aim is to evaluate adherence and offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years), monitored for an average duration of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. 5 patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily), according to the questionnaire results at the end of the monitoring period. Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities to maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. With longer follow-up, a better evaluation of our method and its impact will be possible. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.
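Persistence and quality of execution can be derived from MEMS opening timestamps along these lines; the definitions below are simplified assumptions (the study's operational definitions may differ), and the dates are illustrative.

```python
from datetime import date, timedelta

def adherence_metrics(opening_days, start, end, doses_per_day=1):
    """Sketch of two MEMS-style adherence metrics (simplified definitions):
    - persistence: share of the monitoring period up to the last recorded opening
    - execution: share of scheduled doses actually taken while persistent."""
    period = (end - start).days + 1
    last = max(opening_days)
    persistent_days = (last - start).days + 1
    persistence = persistent_days / period
    execution = len(opening_days) / (persistent_days * doses_per_day)
    return persistence, execution

# Illustrative record: one opening per day, with day 4 missed and stopping on day 9
start, end = date(2024, 1, 1), date(2024, 1, 10)
openings = [start + timedelta(days=d) for d in (0, 1, 2, 3, 5, 6, 7, 8)]
p, e = adherence_metrics(openings, start, end)
print(round(p, 2), round(e, 2))  # 0.9 0.89
```

Separating the two metrics matters clinically: a patient can execute perfectly while persistent and still discontinue early, or persist while skipping doses.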
Abstract:
Viruses rapidly evolve, and HIV in particular is known to be one of the fastest evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and of the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space, where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
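A minimal numerical sketch of such a model: viral density on a one-dimensional phenotype axis, with diffusion modelling random mutation and growth proportional to fitness relative to the population mean, so the distribution drifts towards higher fitness. The grid sizes, rates and the linear fitness r(x) = x are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

# Viral density v(x, t) on a 1-D phenotype axis: diffusion stands in for
# random mutation; the replicator term rewards fitness above the mean,
# which acts as a global competition constraint keeping the total bounded.
N, D, dt, dx = 200, 0.5, 0.001, 0.1
x = np.arange(N) * dx
v = np.exp(-((x - 2.0) ** 2) / 0.5)   # initial strain distribution
v /= v.sum() * dx                      # normalize total density to 1

def laplacian(v):
    lap = np.empty_like(v)
    lap[1:-1] = (v[2:] + v[:-2] - 2.0 * v[1:-1]) / dx ** 2
    lap[0] = (v[1] - v[0]) / dx ** 2   # zero-flux boundaries
    lap[-1] = (v[-2] - v[-1]) / dx ** 2
    return lap

def mean_fitness(v):
    return (v * x).sum() / v.sum()

def step(v):
    growth = v * (x - mean_fitness(v))  # replicator term: fitness vs. the mean
    return np.clip(v + dt * (D * laplacian(v) + growth), 0.0, None)

f0 = mean_fitness(v)
for _ in range(2000):
    v = step(v)
print(mean_fitness(v) > f0)  # True: the distribution moves to higher fitness
```

Plotting v at successive times would show the travelling-wave profile the abstract describes: a bump of roughly constant shape moving towards higher fitness.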
Abstract:
Background: In addition to opportunistic infections of the central nervous system (CNS), which are due to the immunosuppression related to HIV, the virus itself can cause neuropathological abnormalities, located mainly in the basal ganglia and characterized by microglial giant cells, reactive astrocytosis and perivascular monocytes. This HIV encephalopathy is characterized clinically by psycho-motor slowing, memory loss, difficulties in complex tasks requiring executive functions, as well as motor disorders. These cognitive deficits are grouped under the acronym HAND (HIV-associated neurocognitive disorders). HANDs are subdivided into three groups according to the severity of the cognitive impairment: Asymptomatic Neurocognitive Impairment (ANI), Mild/moderate Neurocognitive Disorders (MND) and HIV-Associated Dementia (HAD). While the incidence of HAD has significantly decreased in the era of combined antiretroviral therapy (cART), the prevalence of milder forms of HAND seems to have increased; there are many potential reasons for this. An important question is how soon the brain may be affected by HIV. Since performing a brain biopsy in these patients is not an option, the study of the cerebrospinal fluid (CSF) represents the best available way to look for putative biomarkers of inflammation/neurodegeneration in the CNS. Here, we examined the putative usefulness of different biomarkers as early indicators of antiretroviral failure at the level of the CNS. We chose to study the CSF levels of amyloid-β 1-42 (Aβ42), total Tau (tTau), phosphorylated Tau (pTau), neopterin and S100-β. Indeed, these molecules are representative biomarkers of the major cells of the CNS, i.e. neurons, macrophages/microglia and astrocytes. To examine how sensitively these CSF biomarkers indicate CNS insults caused by HIV, we took advantage of the MOST (Monotherapy Switzerland/Thailand) study, recently published in AIDS, in collaboration with Prof. Pietro Vernazza in St-Gall. In the MOST study, monotherapy (MT) consisting of ritonavir-boosted lopinavir (LPV/r) was compared with continuous conventional antiretroviral therapy including several molecules, hereafter referred to as CT. Methods: We tested 61 cerebrospinal fluid (CSF) samples from 52 patients enrolled in MOST, including 34 CSF samples from CT and 27 from MT (mean duration on MT: 47±20 weeks), in patients who maintained full viral load (VL) suppression in blood (<50 cps/ml). Using enzyme-linked immunosorbent assays (ELISA), we determined the CSF concentrations of S100-beta (astrocytosis), neopterin (microglia, inflammation), total Tau (tTau), phosphorylated Tau (pTau) and amyloid-beta 1-42 (Abeta), the latter three markers indicating neuronal damage. The CSF samples of 37 HIV-negative patients with Alzheimer dementia (AD) served as controls. Results are expressed in pg/ml and reported as median ± interquartile range. The Mann-Whitney U test was used to compare the results of a given biomarker between two groups, and the Fisher test to compare frequencies. Results: We found a higher concentration of S100-beta (570±1132) and neopterin (2.5±2.9) in the CSF of MT versus CT (0±532, p=0.002 and 1.2±2.5, p=0.058, respectively). A cutoff of 940 pg/ml for S100-beta allowed discrimination of MT (11 above versus 16 below) from CT (1 vs 33, p=0.0003). To a lesser extent, a cutoff of 11 pg/ml for neopterin separated MT (4 above versus 23 below) from CT (0 vs 34, p=0.034) (Figure). In AD, tTau was higher (270±414) and Abeta lower (234±328) than in CT (150±153, p=0.0078, and 466±489, p=0.007, respectively). As in CT, Abeta was lower in AD than in MT (390±412, p=0.01). However, in contrast with CT, the levels of tTau were not different between AD and MT (199±177, p=0.11). S100-beta (173±214; p=0.0006) and neopterin (1.1±0.9; p=0.0014) were lower in AD than in MT. Conclusions: Despite full VL suppression in blood, HIV monotherapy is sufficient to trigger inflammation and, especially, astrocytosis. CSF markers of patients on CT have the same profile as reported for healthy subjects, suggesting that CT permits good control of HIV in the brain. Finally, the levels of tTau, which are relatively similar between AD and MT patients, suggest that neurons are damaged during monotherapy.
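The Fisher exact test used for the S100-beta cutoff can be reproduced from the reported 2x2 table (MT: 11 above / 16 below; CT: 1 above / 33 below) with a minimal standard-library implementation; the two-sided convention below (summing hypergeometric probabilities no larger than the observed one) is one common choice.

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed that of the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def hyper(k):
        return comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    p_obs = hyper(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(hyper(k) for k in range(lo, hi + 1) if hyper(k) <= p_obs * (1 + 1e-9))

# S100-beta cutoff of 940 pg/ml: MT 11 above / 16 below, CT 1 above / 33 below
p = fisher_exact_p(11, 16, 1, 33)
print(round(p, 4))  # ≈ 0.0003, in line with the reported p-value
```

The same function applied to the neopterin table (4/23 vs 0/34) would reproduce the weaker separation reported above.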