985 results for Human-centered computing
Abstract:
OBJECTIVES To evaluate the advantages of cytology and PCR of high-risk human papillomavirus (PCR HR-HPV) infection in biopsy-derived diagnosis of high-grade squamous intraepithelial lesions (HSIL = AIN2/AIN3) in HIV-positive men who have sex with men (MSM). METHODS This is a single-centered study conducted between May 2010 and May 2014 in patients (n = 201, mean age 37 years) recruited from our outpatient clinic. Samples of anal canal mucosa were taken into liquid medium for PCR HPV analysis and for cytology. Anoscopy was performed for histology evaluation. RESULTS Anoscopy showed 33.8% were normal, 47.8% low-grade squamous intraepithelial lesions (LSIL), and 18.4% HSIL; 80.2% had HR-HPV. PCR HR-HPV had greater sensitivity than cytology (88.8% vs. 75.7%) in HSIL screening, with similar positive predictive values (PPV; 20.3% vs. 22.9%) and negative predictive values (NPV; 89.7% vs. 88.1%). Combining both tests increased the sensitivity and NPV of HSIL diagnosis to 100%. Correlation of cytology with histology was generally very low, and correlation of PCR HR-HPV with histology was non-existent (<0.2) or low (<0.4). Area under the receiver operating characteristic (AUROC) curve analysis of cytology and PCR HR-HPV for the diagnosis of HSIL was poor (<0.6). Multivariate regression analysis showed that the protective factors against HSIL were viral suppression (OR: 0.312; 95% CI: 0.099-0.984) and syphilis infection (OR: 0.193; 95% CI: 0.045-0.827). HSIL risk was associated with the HPV-68 genotype (OR: 20.1; 95% CI: 2.04-197.82). CONCLUSIONS When cytology and PCR HR-HPV findings are normal, the diagnosis of pre-malignant HSIL can be reliably ruled out in HIV-positive patients. Viral suppression with treatment protects against the appearance of HSIL.
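The predictive values quoted above follow directly from a 2×2 confusion matrix. A minimal sketch (not the study's code; all counts and names below are hypothetical) of how such statistics, and the parallel "either test positive" combination, are computed:

```python
# Hypothetical sketch of the screening statistics quoted in the abstract:
# sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix, plus
# the sensitivity of the parallel combination of two tests.

def screening_stats(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV as percentages."""
    sensitivity = 100 * tp / (tp + fn)
    specificity = 100 * tn / (tn + fp)
    ppv = 100 * tp / (tp + fp)
    npv = 100 * tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

def combined_sensitivity(pos_a, pos_b, cases):
    """A case counts as detected if either test flags it (parallel testing)."""
    return 100 * len((pos_a | pos_b) & cases) / len(cases)

# Hypothetical counts for one test against biopsy-proven HSIL:
print("sens=%.1f%% spec=%.1f%% ppv=%.1f%% npv=%.1f%%"
      % screening_stats(tp=33, fp=130, fn=4, tn=34))
```

Combining tests in parallel can only add detected cases, which is why the joint sensitivity (and hence NPV) rises, at the cost of specificity.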
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that automatically detects epileptiform events and discriminates them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 ± 22% at a specificity of 86 ± 7% (mean ± SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
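A hedged sketch in the spirit of the abstract, not the authors' published algorithm: unmix multichannel EEG epochs with FastICA and derive simple per-component features for a linear classifier separating epileptiform events from eye blinks and background. The feature set and pipeline are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def ica_features(epochs):
    """epochs: (n_epochs, n_channels, n_samples) EEG segments."""
    n_epochs, n_channels, n_samples = epochs.shape
    ica = FastICA(n_components=n_channels, random_state=0)
    # Fit the unmixing matrix on the whole recording (samples x channels).
    ica.fit(np.hstack(list(epochs)).T)
    feats = []
    for ep in epochs:
        sources = ica.transform(ep.T)  # (n_samples, n_components)
        # Per-component variance and peak amplitude as crude descriptors.
        feats.append(np.concatenate([sources.var(axis=0),
                                     np.abs(sources).max(axis=0)]))
    return np.asarray(feats)

# Usage: X = ica_features(epochs); clf = LinearDiscriminantAnalysis().fit(X, y)
```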
Abstract:
Report on the scientific sojourn carried out at the School of Computing of the University of Dundee, United Kingdom, from 2010 to 2012. This document is a scientific report of the work done, the main results, publications, and the accomplishment of the objectives of the 2-year post-doctoral research project with reference number BP-A 00239. The project has addressed the topic of older people (60+) and Information and Communication Technologies (ICT), a topic of growing social and research interest, from a Human-Computer Interaction perspective. Over a 2-year period (June 2010-June 2012), we conducted a classical ethnography of ICT use in a computer clubhouse in Scotland, addressing interaction barriers and strategies, social sharing practices in Social Network Sites, and ICT learning, and carried out rapid ethnographic studies related to geo-enabled ICT and e-government services towards supporting independent living and active ageing. The main results have provided a much deeper understanding of (i) the everyday use of Computer-Mediated Communication tools, such as video-chats and blogs, and its evolution as older people’s experience with ICT increases over time, (ii) cross-cultural aspects of ICT use in the north and south of Europe, (iii) the relevance of cognition over vision in interacting with geographical information and a wide range of ICT tools, despite common stereotypes (e.g. make things bigger), (iv) the importance of the offline-online relationship in providing older people with socially inclusive and meaningful e-services for independent living and active ageing, (v) how older people carry out social sharing practices on the popular YouTube platform, (vi) their user experiences there, and (vii) the challenges they face in ICT learning and the strategies they use to become successful ICT learners over time. The research conducted in this project has been published in 17 papers: 4 in journals – two of which in JCR-indexed journals – 5 in conferences, 4 in workshops, and 4 in magazines. Other public output consists of 10 invited talks and seminars.
Abstract:
Background: The human FOXI1 gene codes for a transcription factor involved in the physiology of the inner ear, testis, and kidney. Using three interspecies comparisons, it has been suggested that this may be a gene under human-specific selection. We sought to confirm this finding by using an extended set of orthologous sequences. Additionally, we explored for signals of natural selection within humans by sequencing the gene in 20 Europeans, 20 East Asians and 20 Yorubas and by analysing SNP variation in a 2 Mb region centered on FOXI1 in 39 worldwide human populations from the HGDP-CEPH diversity panel.
Results: The genome sequences recently available from other primate and non-primate species showed that FOXI1 divergence patterns are compatible with neutral evolution. Sequence-based neutrality tests were not significant in Europeans, East Asians or Yorubas. However, the Long Range Haplotype (LRH) test, as well as the iHS and XP-Rsb statistics, revealed significantly extended tracts of homozygosity around FOXI1 in Africa, suggesting a recent episode of positive selection acting on this gene. A functionally relevant SNP, as well as several SNPs either on the putatively selected core haplotypes or with significant iHS or XP-Rsb values, displayed allele frequencies strongly correlated with the absolute geographical latitude of the populations sampled.
Conclusions: We present evidence for recent positive selection in the FOXI1 gene region in Africa. Climate might be related to this recent adaptive event in humans. Of the multiple functions of FOXI1, its role in kidney-mediated water-electrolyte homeostasis is the most obvious candidate for explaining a climate-related adaptation.
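The frequency-latitude relationship reported in the conclusions is, at its core, a Pearson correlation across population samples. A minimal sketch with made-up frequencies and latitudes rather than the study's data:

```python
# Hypothetical sketch: correlate a SNP's allele frequency across population
# samples with the absolute geographical latitude of each sample.
import numpy as np

abs_latitude = np.array([0.5, 8.0, 15.0, 30.0, 42.0, 55.0])   # |latitude|, degrees
allele_freq = np.array([0.12, 0.18, 0.25, 0.44, 0.58, 0.71])  # per population

r = np.corrcoef(abs_latitude, allele_freq)[0, 1]
print(f"Pearson r between |latitude| and allele frequency: {r:.2f}")
```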
Abstract:
The goal of this study was to investigate the impact of computing parameters and of the location of volumes of interest (VOI) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement (the sampling distances b_x,y,z, the VOI lengths L_x,y,z, the number of VOIs N_VOI, and the structured noise) was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (L_x,y,z = 64, b_x,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This showed that the VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
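A hedged sketch of a 3D NPS estimator consistent with the abstract's notation: sampling distances b_x,y,z (mm), VOI lengths L_x,y,z (voxels), averaging over N_VOI volumes, with first-order detrending to suppress structured noise. This is the standard formulation, not necessarily the study's exact implementation.

```python
import numpy as np

def detrend_first_order(voi):
    """Remove a best-fit linear (first-order) trend from a 3D volume."""
    lx, ly, lz = voi.shape
    x, y, z = np.meshgrid(np.arange(lx), np.arange(ly), np.arange(lz),
                          indexing="ij")
    A = np.column_stack([np.ones(voi.size), x.ravel(), y.ravel(), z.ravel()])
    coef, *_ = np.linalg.lstsq(A, voi.ravel(), rcond=None)
    return voi - (A @ coef).reshape(voi.shape)

def nps_3d(vois, bx, by, bz):
    """vois: (N_VOI, Lx, Ly, Lz) noise-only volumes; returns the 3D NPS
    estimate NPS(f) = (bx*by*bz)/(Lx*Ly*Lz) * mean |DFT(detrended VOI)|^2."""
    vois = np.asarray(vois, dtype=float)
    n_voi, lx, ly, lz = vois.shape
    acc = np.zeros((lx, ly, lz))
    for voi in vois:
        acc += np.abs(np.fft.fftn(detrend_first_order(voi))) ** 2
    return (bx * by * bz) / (lx * ly * lz) * acc / n_voi
```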
Abstract:
Different signatures of natural selection persist over varying time scales in our genome, revealing possible episodes of adaptive evolution during human history. Here, we identify genes showing signatures of ancestral positive selection in the human lineage and investigate whether some of those genes have been evolving adaptively in extant human populations. Specifically, we compared more than 11,000 human genes with their orthologs in chimpanzee, mouse, rat and dog and applied a branch-site likelihood method to test for positive selection on the human lineage. Among the significant cases, a robust set of 11 genes was then further explored for signatures of recent positive selection using SNP data. We genotyped 223 SNPs in 39 worldwide populations from the HGDP Diversity panel and supplemented this information with available genotypes for up to 4,814 SNPs distributed along 2 Mb centered on each gene. After exploring the allele frequency spectrum, population differentiation and the maintenance of long unbroken haplotypes, we found signals of recent adaptive phenomena in only one of the 11 candidate gene regions. However, the signal of recent selection in this region may come from a different, neighbouring gene (CD5) rather than from the candidate gene itself (VPS37C). For this set of positively selected genes in the human lineage, we find no indication that these genes maintained their rapid evolutionary pace among human populations. Based on these data, it therefore appears that adaptation for human-specific and for population-specific traits may have involved different genes.
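The branch-site analysis ultimately reduces to a likelihood-ratio test between a null model and an alternative allowing positive selection on the human branch. A minimal sketch of that final step, assuming log-likelihoods obtained from a tool such as PAML's codeml (values are placeholders; the df=1 chi-squared reference used here is a common, slightly conservative convention):

```python
from scipy.stats import chi2

def branch_site_lrt(lnl_null, lnl_alt, df=1):
    """Likelihood-ratio statistic 2*(lnL1 - lnL0) and its chi2 p-value."""
    stat = 2.0 * (lnl_alt - lnl_null)
    return stat, chi2.sf(stat, df)

# Placeholder log-likelihoods, e.g. parsed from codeml output files:
stat, p = branch_site_lrt(lnl_null=-10234.7, lnl_alt=-10229.1)
print(f"LRT = {stat:.2f}, p = {p:.4f}")
```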
Abstract:
Discussion on improving the power of genome-wide association studies to identify candidate variants and genes is generally centered on issues of maximizing sample size; less attention is given to the role of phenotype definition and ascertainment. The authors used genome-wide data from patients infected with human immunodeficiency virus type 1 (HIV-1) to assess whether differences in type of population (622 seroconverters vs. 636 seroprevalent subjects) or the number of measurements available for defining the phenotype resulted in differences in the effect sizes of associations between single nucleotide polymorphisms and the phenotype, HIV-1 viral load at set point. The effect estimate for the top 100 single nucleotide polymorphisms was 0.092 (95% confidence interval: 0.074, 0.110) log10 viral load (log10 copies of HIV-1 per mL of blood) greater in seroconverters than in seroprevalent subjects. The difference was even larger when the authors focused on chromosome 6 variants (0.153 log10 viral load) or on variants that achieved genome-wide significance (0.232 log10 viral load). The estimates of the genetic effects tended to be slightly larger when more viral load measurements were available, particularly among seroconverters and for variants that achieved genome-wide significance. Differences in phenotype definition and ascertainment may affect the estimated magnitude of genetic effects and should be considered in optimizing power for discovering new associations.
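The per-SNP effect sizes being compared are, in essence, regression slopes of log10 set-point viral load on allele dosage, estimated separately in each subgroup. A minimal OLS sketch under that assumption (not the study's analysis code):

```python
import numpy as np

def snp_effect(dosage, log10_vl):
    """OLS slope per allele (in log10 viral load) with an approximate 95% CI.
    dosage: genotype coded 0/1/2; log10_vl: phenotype per subject."""
    X = np.column_stack([np.ones(len(dosage)), dosage])
    beta, *_ = np.linalg.lstsq(X, log10_vl, rcond=None)
    resid = log10_vl - X @ beta
    sigma2 = resid @ resid / (len(dosage) - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1], (beta[1] - 1.96 * se, beta[1] + 1.96 * se)

# Usage: fit snp_effect in seroconverters and in seroprevalent subjects
# separately, then compare the two slopes for each SNP.
```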
Abstract:
Psychophysical studies suggest that humans preferentially use a narrow band of low spatial frequencies for face recognition. Here we asked whether artificial face recognition systems have an improved recognition performance at the same spatial frequencies as humans. To this end, we estimated recognition performance over a large database of face images by computing three discriminability measures: Fisher Linear Discriminant Analysis, Non-Parametric Discriminant Analysis, and Mutual Information. In order to address frequency dependence, discriminabilities were measured as a function of (filtered) image size. All three measures revealed a maximum at the same image sizes, whose spatial frequency content corresponds to the psychophysically determined frequencies. Our results therefore support the notion that the critical band of spatial frequencies for face recognition in humans and machines follows from inherent properties of face images, and that the use of these frequencies is associated with optimal face recognition performance.
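Of the three measures, the Fisher criterion is the simplest to sketch: the ratio of between-class to within-class scatter over vectorized face images. A hedged illustration (using scatter-matrix traces rather than the full eigen-analysis; the filtering/resizing loop is an assumed experimental setup, not the authors' code):

```python
import numpy as np

def fisher_ratio(X, y):
    """X: (n_samples, n_features) vectorized face images; y: identity labels.
    Returns trace(S_between) / trace(S_within), a scalar discriminability."""
    overall_mean = X.mean(axis=0)
    between, within = 0.0, 0.0
    for label in np.unique(y):
        Xc = X[y == label]
        mu = Xc.mean(axis=0)
        between += len(Xc) * np.sum((mu - overall_mean) ** 2)
        within += np.sum((Xc - mu) ** 2)
    return between / within

# Usage: compute fisher_ratio on face images low-pass filtered and resampled
# to a range of sizes; the size maximizing it marks the critical band.
```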
Abstract:
The User-centered design (UCD) Gymkhana is a tool for human-computer interaction practitioners to demonstrate, through a game, the key user-centered design methods and how they interrelate in the design process. The target audiences are other organizational departments unfamiliar with UCD but whose work is related to the definition, creation, and update of a product or service.
Abstract:
The advent of the Internet had a great impact on distance education, and e-learning rapidly became a killer application. Education institutions worldwide are taking advantage of the available technology in order to deliver education to a growing audience. Every day, more and more people use e-learning systems, environments and contents for both training and learning. E-learning promotes education among people who, for different reasons, could not otherwise have access to it: people who cannot travel, people with very little free time, people with disabilities, etc. As e-learning systems grow and more people access them, it is necessary to consider the diverse needs and characteristics of different users when designing virtual environments. This allows building systems that people can use easily, efficiently and effectively, where the learning process leads to a good user experience and becomes a good learning experience.
Abstract:
In accordance with Moore's law, the increasing number of on-chip integrated transistors has enabled modern computing platforms with not only higher processing power but also more affordable prices. As a result, these platforms, including portable devices, workstations and data centres, are becoming an inevitable part of human society. However, with the demand for portability and the rising cost of power, energy efficiency has emerged as a major concern for modern computing platforms. As the complexity of on-chip systems increases, Network-on-Chip (NoC) has proven to be an efficient communication architecture which can further improve system performance and scalability while reducing the design cost. Therefore, in this thesis, we study and propose energy optimization approaches based on the NoC architecture, with a special focus on the following aspects. As the architectural trend of future computing platforms, 3D systems have many benefits, including higher integration density, smaller footprint, heterogeneous integration, etc. Moreover, 3D technology can significantly improve network communication and effectively avoid long wirings, and can therefore provide higher system performance and energy efficiency. Given the dynamic nature of on-chip communication in large-scale NoC-based systems, run-time system optimization is of crucial importance in order to achieve higher system reliability and, essentially, energy efficiency. In this thesis, we propose an agent-based system design approach where agents are on-chip components which monitor and control system parameters such as supply voltage, operating frequency, etc. With this approach, we have analysed the implementation alternatives for dynamic voltage and frequency scaling and power gating techniques at different granularities, which reduce both dynamic and leakage energy consumption. Topologies, being one of the key factors for NoCs, are also explored for energy-saving purposes. A Honeycomb NoC architecture is proposed in this thesis with turn-model-based deadlock-free routing algorithms. Our analysis and simulation-based evaluation show that Honeycomb NoCs outperform their Mesh-based counterparts in terms of network cost, system performance, and energy efficiency.
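The DVFS and power-gating trade-off studied here rests on the first-order model in which dynamic power scales as C·V²·f while leakage accrues whenever a block is powered on. A hedged sketch with illustrative constants (not taken from the thesis):

```python
# First-order energy model behind DVFS / power-gating decisions.
# All constants are illustrative assumptions.

def task_energy(cycles, v, f, c_eff=1e-9, i_leak=0.05, gated_idle=False,
                deadline=None):
    """Energy (J) to run `cycles` cycles at voltage v (V) and frequency f (Hz).
    Dynamic energy: c_eff * v^2 * f * t_busy; leakage: v * i_leak while on."""
    t_busy = cycles / f
    e_dyn = c_eff * v**2 * f * t_busy
    t_idle = max(0.0, (deadline or t_busy) - t_busy)
    e_leak = v * i_leak * (t_busy + (0 if gated_idle else t_idle))
    return e_dyn + e_leak

# Race-to-idle at high V/f vs. stretching the task to its deadline at low V/f:
fast = task_energy(1e9, v=1.1, f=2e9, deadline=1.0)
slow = task_energy(1e9, v=0.8, f=1e9, deadline=1.0)
print(f"race-to-idle: {fast:.3f} J, DVFS: {slow:.3f} J")
```

Because dynamic energy falls quadratically with voltage, running slower at a lower voltage wins whenever leakage during the stretched runtime does not eat the savings; power gating the idle interval shifts the balance further.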
Abstract:
Advances in technology have provided new ways of using entertainment and game technology to foster human interaction. Games and playing with games have always been an important part of people’s everyday lives. Traditionally, human-computer interaction (HCI) research was seen as a psychological cognitive science focused on human factors, with the engineering sciences providing the computer science part of it. Although cognitive science has made significant progress over the past decade, the influence of people’s emotions on design is increasingly important, especially when the primary goal is to challenge and entertain users (Norman 2002). Game developers have explored the key issues in game design and identified that the driving force in the success of games is user experience. User-centered design integrates knowledge of users’ activity practices, needs, and preferences into the design process. Geocaching is a location-based treasure hunt game created by a community of players. Players use GPS (Global Positioning System) technology to find “treasures” and create their own geocaches; the game evolves as players invent new caches and use their imagination in creating them. This doctoral dissertation explores the user experience of geocaching and its applications in tourism and education. According to the Geocaching.com webpage, geocaching is played in about 180 countries and there are more than 10 million registered geocachers worldwide (Geocaching.com, 25.11.2014). This dissertation develops and presents an interaction model called the GameFlow Experience model that can be used to support the design of treasure hunt applications in tourism and education contexts. The GameFlow model presents and clarifies various experiences; it situates these experiences in a real-life context, offers desirable design targets to be utilized in service design, and offers a perspective to consider when evaluating the success of adventure game concepts. User-centered game design has adapted human factors research from mainstream computing science. For many years, the user-centered design approach has been the most important research field in software development. Research has focused on user-centered design in software development, such as office programs, but the same ideas and theories that reflect the needs of user-centered research are now also being applied to game design (Charles et al. 2005). For several years, we have seen a growing interest in user experience design. Digital games are experience providers, and game developers need tools to better understand the user experience related to the products and services they have created. This thesis aims to present what the user experience is in geocaching and treasure hunt games and how it can be used to develop new concepts for the treasure hunt. Engineers, designers, and researchers should have a clear understanding of what user experience is, what its parts are, and most importantly, how we can influence user satisfaction. In addition, we need to understand how users interact with electronic products and people, and how different elements combine to shape their experiences. This doctoral dissertation represents pioneering work on the user experience of geocaching and treasure hunt games in the context of tourism and education. The research also provides a model for game developers who are planning treasure hunt concepts.
Abstract:
Despite recent well-known advancements in patient care in the medical fields, such as patient-centeredness and evidence-based medicine and practice, rather less is known about their effects on the particulars of clinician-patient encounters. The emphasis in clinical encounters remains mostly on treatment and diagnosis and less on communicative competency or engagement for medical professionals. The purpose of this narrative study was to explore interactive competencies in diagnostic and therapeutic encounters and intake protocols within the context of the physicians’, nurses’, and medical receptionists’ perspectives and experiences. Literature on narrative medicine, phenomenology and medicine, therapeutic relationships, cultural and communication competency, and non-Western perspectives on human communication provided the guiding theoretical frameworks for the study. Three data sets were used: 13 participant interviews (5 physicians, 4 nurses, and 4 medical receptionists), policy documents (physicians, nurses, and medical receptionists), and a website (Communication and Cultural Competency). The researcher then engaged in triangulated analyses, including NVivo-assisted, manifest and latent analysis, Mishler’s (1984, 1995) narrative elements and Charon’s (2005, 2006a, 2006b, 2013) narrative themes, in recursive, overlapping, comparative and intersecting analysis strategies. A common factor affecting physicians’ relationships with their clients was the limitation of time, including limited time (a) to listen, (b) to come up with a proper diagnosis, and (c) to engage in decision making in critical conditions, as well as limited time for patients’ visits. For almost all nurse participants in the study, establishing therapeutic relationships meant being compassionate and empathetic. The goals of intake protocols for the medical receptionists were about being empathetic to patients, being an attentive listener, developing rapport, and being conventionally polite to patients. Participants with the least amount of training and preparation (medical receptionists) appeared to be more committed to working narratively in connecting with patients and establishing human relationships, as well as in listening to patients’ stories and providing support to narrow down the reason for their visit. The diagnostic and intake “success stories” regarding patient clinical encounters for the other study participants were focused on a timely securing of patient information, with some acknowledgement of rapport and empathy. Patient-centeredness emerged as a discourse practice, with ambiguous or nebulous enactment of its premises in most clinical settings.
Abstract:
Self-adaptive software provides a profound solution for adapting applications to changing contexts in dynamic and heterogeneous environments. Having emerged from Autonomic Computing, it incorporates fully autonomous decision making based on predefined structural and behavioural models. The most common approach for architectural runtime adaptation is the MAPE-K adaptation loop implementing an external adaptation manager without manual user control. However, it has turned out that adaptation behaviour lacks acceptance if it does not correspond to a user’s expectations – particularly in Ubiquitous Computing scenarios with user interaction. Adaptations can be irritating and distracting if they are not appropriate for a certain situation. In general, uncertainty during development and at run-time causes problems when users are outside the adaptation loop. In a literature study, we analyse publications on self-adaptive software research. The results show a discrepancy between the motivated application domains, the maturity of examples, and the quality of evaluations on the one hand and the provided solutions on the other hand. Only a few publications analysed the impact of their work on the user, but many employ user-oriented examples for motivation and demonstration. To incorporate the user within the adaptation loop and to deal with uncertainty, our proposed solutions enable user participation for interactive self-adaptive software while at the same time maintaining the benefits of intelligent autonomous behaviour. We define three dimensions of user participation, namely temporal, behavioural, and structural user participation. This dissertation contributes solutions for user participation in the temporal and behavioural dimensions. The temporal dimension addresses the moment of adaptation, which is classically determined by the self-adaptive system. We provide mechanisms allowing users to influence or to define the moment of adaptation. With our solution, users can have full control over the moment of adaptation, or the self-adaptive software considers the user’s situation more appropriately. The behavioural dimension addresses the actual adaptation logic and the resulting run-time behaviour. Application behaviour is established during development and does not necessarily match run-time expectations. Our contributions are three distinct solutions which allow users to make changes to the application’s run-time behaviour: dynamic utility functions, fuzzy-based reasoning, and learning-based reasoning. The foundation of our work is a notification and feedback solution that improves the intelligibility and controllability of self-adaptive applications by implementing bi-directional communication between the self-adaptive software and the user. The different mechanisms from the temporal and behavioural participation dimensions require the notification and feedback solution to inform users of adaptation actions and to provide a mechanism to influence adaptations. Case studies show the feasibility of the developed solutions. Moreover, an extensive user study with 62 participants was conducted to evaluate the impact of notifications before and after adaptations. Although the study revealed that there is no preference for a particular notification design, participants clearly appreciated intelligibility and controllability over autonomous adaptations.
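A minimal sketch of a MAPE-K loop extended with the kind of user participation argued for above: before a planned adaptation is executed, the user is notified and may veto it. All class and method names here are illustrative, not the dissertation's API.

```python
from dataclasses import dataclass, field

@dataclass
class Knowledge:
    """Shared knowledge base of the MAPE-K loop."""
    context: dict = field(default_factory=dict)
    user_prefs: dict = field(default_factory=dict)

class InteractiveMapeK:
    def __init__(self, knowledge, notify_user):
        self.k = knowledge
        self.notify_user = notify_user  # callback: plan -> bool (approved?)

    def monitor(self, sensors):
        self.k.context.update(sensors())

    def analyze(self):
        # Illustrative trigger condition.
        return self.k.context.get("battery", 100) < 20

    def plan(self):
        return {"action": "dim_display", "level": 0.3}

    def execute(self, plan):
        # Behavioural user participation: the user can veto the adaptation.
        if self.k.user_prefs.get("ask_first", True) and not self.notify_user(plan):
            return  # vetoed: keep the current behaviour
        print("executing", plan)

    def run_once(self, sensors):
        self.monitor(sensors)
        if self.analyze():
            self.execute(self.plan())

# Usage: loop = InteractiveMapeK(Knowledge(), notify_user=lambda plan: True)
#        loop.run_once(lambda: {"battery": 15})
```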
Abstract:
The main objective of this paper is to develop a methodology that takes into account the human factor extracted from the database used by recommender systems, and which allows resolving specific problems of prediction and recommendation. In this work, we propose to extract the user's human values scale from the user database in order to improve its suitability in open environments such as recommender systems. For this purpose, the methodology is applied to the user's data after the user has interacted with the system. The methodology is exemplified with a case study.
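The abstract does not detail the extraction step, so the following is only a loose, hypothetical sketch of the general idea: derive a per-user values profile from rated items and weight collaborative-filtering neighbours by profile similarity. The value dimensions, names, and weighting scheme are all assumptions, not the paper's methodology.

```python
import numpy as np

def values_profile(user_ratings, item_values):
    """Rating-weighted average of the value annotations of a user's items.
    user_ratings: {item_id: rating}; item_values: {item_id: value vector}."""
    items, ratings = zip(*user_ratings.items())
    w = np.array(ratings, dtype=float)
    V = np.array([item_values[i] for i in items])
    return w @ V / w.sum()

def predict_rating(target, neighbours, item, ratings, item_values):
    """Neighbour ratings for `item`, weighted by values-profile similarity
    (cosine); assumes all profiles are non-zero."""
    p_t = values_profile(ratings[target], item_values)
    num = den = 0.0
    for u in neighbours:
        if item not in ratings[u]:
            continue
        p_u = values_profile(ratings[u], item_values)
        sim = p_t @ p_u / (np.linalg.norm(p_t) * np.linalg.norm(p_u))
        num += sim * ratings[u][item]
        den += abs(sim)
    return num / den if den else None
```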