916 results for Gesture based audio user interface
Abstract:
Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and reliable analysis of microbial species as opposed to higher-level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included with few false positives or false negatives. Subsequent comparative benchmarking against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and to demonstrate similar or better accuracy than the other programs. Lastly, Genometa was shown to be many times faster than BLAST due to its use of modern short read aligners. Our method is highly accurate if bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource-efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer.
Abstract:
The purpose of this study is to investigate latency differences in user interfaces between natively developed apps (developed separately for each platform) and apps of the "generated apps" type. Since the work aims to contribute information about performance, an experimental method was considered the best choice. Loading times were measured with a video camera that filmed the execution of the experiments, which kept the method simple and close to what a user will actually experience. The scope was limited to the Android and iOS platforms, and Xamarin was chosen as the framework among the technologies that create generated apps. Measurement data from experiments examining loading times, experiments with users interacting with the responsiveness of lists, and an examination of CPU and memory usage indicate a recurring pattern. Xamarin Forms with XAML is the technology that performed worst in the experiments, followed by Xamarin Forms. Xamarin Android/iOS did not show equally large performance losses compared with the natively developed counterparts. In general, Xamarin Forms manages the phone's resources worse than Xamarin Android/iOS and native development do. The results of the study can be used as decision support when choosing a technology. The study also contributes data that can be used in further research in the area.
Abstract:
The goal of this work was to use action research to study how well user interface development carried out with agile software development methods can respond to customers' real needs. The work sought user interface implementation alternatives for a new version of an existing tool at the case-study company, and high-fidelity prototypes were implemented based on them. The values and principles of agile software development proved to be an excellent fit for the development process. An iterative approach to development and close cooperation between the case-study company and the bachelor's thesis author made it possible to meet the company's expectations. The tool's user interface was brought to a level that allows further development to begin. Including more comprehensive user testing in the development process would have helped achieve an even better end result.
Abstract:
Phase change problems arise in many practical applications such as air-conditioning and refrigeration, thermal energy storage systems and thermal management of electronic devices. The physical phenomena in such applications are complex and are often difficult to study in detail with the help of experimental techniques alone. The efforts to improve computational techniques for analyzing two-phase flow problems with phase change are therefore gaining momentum. The development of numerical methods for multiphase flow has been motivated generally by the need to account more accurately for (a) large topological changes such as phase breakup and merging, (b) sharp representation of the interface and its discontinuous properties and (c) accurate and mass conserving motion of the interface. In addition to these considerations, numerical simulation of multiphase flow with phase change introduces additional challenges related to discontinuities in the velocity and the temperature fields. Moreover, the velocity field is no longer divergence free. For phase change problems, the focus of developmental efforts has thus been on numerically attaining a proper conservation of energy across the interface in addition to the accurate treatment of fluxes of mass and momentum conservation as well as the associated interface advection. Among the initial efforts related to the simulation of bubble growth in film boiling applications, the work in \cite{Welch1995} was based on the interface tracking method using a moving unstructured mesh. That study considered moderate interfacial deformations. A similar problem was subsequently studied using moving, boundary fitted grids \cite{Son1997}, again for regimes of relatively small topological changes.
A hybrid interface tracking method with a moving interface grid overlapping a static Eulerian grid was developed \cite{Juric1998} for the computation of a range of phase change problems including three-dimensional film boiling \cite{esmaeeli2004computations}, multimode two-dimensional pool boiling \cite{Esmaeeli2004} and film boiling on horizontal cylinders \cite{Esmaeeli2004a}. The handling of interface merging and pinch-off, however, remains a challenge for methods that explicitly track the interface. As large topological changes are crucial for phase change problems, attention has turned in recent years to front capturing methods utilizing implicit interfaces that are more effective in treating complex interface deformations. The VOF (Volume of Fluid) method was adopted in \cite{Welch2000} to simulate the one-dimensional Stefan problem and the two-dimensional film boiling problem. The approach employed a specific model for mass transfer across the interface involving a mass source term within cells containing the interface. This VOF based approach was further coupled with the level set method in \cite{Son1998}, employing a smeared-out Heaviside function to avoid the numerical instability related to the source term. The coupled level set and volume of fluid method and the diffused interface approach were used for film boiling with water and R134a at the near critical pressure condition \cite{Tomar2005}. The effects of superheat and saturation pressure on the frequency of bubble formation were analyzed with this approach. The work in \cite{Gibou2007} used the ghost fluid and the level set methods for phase change simulations. A similar approach was adopted in \cite{Son2008} to study various boiling problems including three-dimensional film boiling on a horizontal cylinder, nucleate boiling in a microcavity \cite{lee2010numerical} and flow boiling in a finned microchannel \cite{lee2012direct}.
The work in \cite{tanguy2007level} also used the ghost fluid method and proposed an improved algorithm based on enforcing continuity and a divergence-free condition for the extended velocity field. The work in \cite{sato2013sharp} employed a multiphase model based on volume fraction with an interface sharpening scheme and derived a phase change model based on local interface area and mass flux. Among the front capturing methods, sharp interface methods have been found to be particularly effective both for implementing sharp jumps and for resolving the interfacial velocity field. However, sharp velocity jumps render the solution susceptible to erroneous oscillations in pressure and also lead to spurious interface velocities. To implement phase change, the work in \cite{Hardt2008} employed point mass source terms derived from a physical basis for the evaporating mass flux. To avoid numerical instability, the authors smeared the mass source by solving a pseudo time-step diffusion equation. This measure however led to mass conservation issues due to non-symmetric integration over the distributed mass source region. The problem of spurious pressure oscillations related to point mass sources was also investigated by \cite{Schlottke2008}. Although their method is based on the VOF, the large pressure peaks associated with a sharp mass source were observed to be similar to those for the interface tracking method. Such spurious fluctuations in pressure are particularly undesirable because their effect is globally transmitted in incompressible flow. Hence, the pressure field arising from phase change needs to be computed with greater accuracy than is reported in the current literature. The accuracy of interface advection in the presence of interfacial mass flux (mass flux conservation) has been discussed in \cite{tanguy2007level,tanguy2014benchmarks}. The authors found that the method of extending one phase's velocity to the entire domain suggested by Nguyen et al.
in \cite{nguyen2001boundary} suffers from a lack of mass flux conservation when the density difference is high. To improve the solution, the authors impose a divergence-free condition on the extended velocity field by solving a constant-coefficient Poisson equation. The approach has shown good results for an enclosed bubble or droplet but is not general for more complex flows and requires the additional solution of a linear system of equations. In the current thesis, an improved approach that addresses both the numerical oscillation of pressure and the spurious interface velocity field is presented, featuring (i) continuous velocity and density fields within a thin interfacial region and (ii) temporal velocity correction steps to avoid an unphysical pressure source term. I also propose (iii) a general mass flux projection correction for improved mass flux conservation. The pressure and temperature gradient jump conditions are treated sharply. A series of one-dimensional and two-dimensional problems are solved to verify the performance of the new algorithm. Two-dimensional and cylindrical film boiling problems are also demonstrated and show good qualitative agreement with experimental observations and heat transfer correlations. Finally, a study of Taylor bubble flow with heat transfer and phase change in a small vertical tube in axisymmetric coordinates is carried out using the new multiphase phase change method.
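The divergence-free correction of an extended velocity field via a constant-coefficient Poisson equation can be illustrated in a few lines. The sketch below assumes a doubly periodic 2-D domain so the projection can be done spectrally; it is a generic illustration of the projection idea, not the thesis's actual solver, and the function name and grid setup are invented for the example:

```python
import numpy as np

def project_divergence_free(u, v):
    """Project a 2-D periodic velocity field onto its divergence-free part.

    Solves the constant-coefficient Poisson equation  lap(phi) = div(u)
    spectrally, then subtracts grad(phi) from (u, v)."""
    n = u.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)           # integer wavenumbers on [0, 2*pi)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
    div_hat = 1j * kx * u_hat + 1j * ky * v_hat
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                             # mean of phi is arbitrary
    phi_hat = -div_hat / k2                    # lap(phi) = div  =>  -k^2 phi = div
    phi_hat[0, 0] = 0.0
    u_new = np.real(np.fft.ifft2(u_hat - 1j * kx * phi_hat))
    v_new = np.real(np.fft.ifft2(v_hat - 1j * ky * phi_hat))
    return u_new, v_new
```

By construction the spectral divergence of the corrected field vanishes to round-off, which is the property the cited works exploit to restore mass flux conservation; a finite-difference implementation would instead solve the same Poisson equation with a discrete Laplacian consistent with the divergence operator.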
Abstract:
The traditional process of filling the medicine trays and dispensing the medicines to the patients in hospitals is done manually by reading the printed paper medicine chart. This process can be very strenuous and error-prone, given the number of sub-tasks involved in the entire workflow and the dynamic nature of the work environment. Therefore, efforts are being made to digitalise the medication dispensation process by introducing a mobile application called the Smart Dosing application. The introduction of the Smart Dosing application into the hospital workflow raises security concerns and calls for security requirement analysis. This thesis is written as a part of the smart medication management project at the Embedded Systems Laboratory, Åbo Akademi University. The project aims at digitising the medicine dispensation process by integrating information from various health systems and making it available through the Smart Dosing application. This application is intended to be used on a tablet computer which will be incorporated on the medicine tray. The smart medication management system includes the medicine tray, the tablet device, and the medicine cups with the cup holders. Introducing the Smart Dosing application should not interfere with the existing process carried out by the nurses, and it should result in minimal modifications to the tray design and the workflow. The re-designing of the tray would include integrating the device running the application into the tray in a manner that users find convenient and that leads to fewer errors in use. The main objective of this thesis is to enhance the security of the hospital medicine dispensation process by ensuring the security of the Smart Dosing application at various levels.
The method used in this thesis was to analyse how the tray design and the application user interface design can help prevent errors, and what secure technology choices have to be made before starting development of the next prototype of the Smart Dosing application. The thesis first establishes the context of use of the application, the end-users and their needs, and the errors made in the everyday medication dispensation workflow, through continuous discussions with the nursing researchers. The thesis then gains insight into the vulnerabilities, threats and risks of using a mobile application in the hospital medication dispensation process. The resulting list of security requirements was produced by analysing the previously built prototype of the Smart Dosing application, through continuous interactive discussions with the nursing researchers, and through an exhaustive state-of-the-art study on the security risks of using mobile applications in a hospital context. The thesis also uses the Octave Allegro method to help readers understand the likelihood and impact of threats, and what steps should be taken to prevent or fix them. The security requirements obtained are, as a result, a starting point for the developers of the next iteration of the Smart Dosing application prototype.
Abstract:
Internship report for obtaining the Master's degree in the area of Education and Multimedia Communication
Abstract:
A collaboration between dot.rural at the University of Aberdeen and the iSchool at Northumbria University, POWkist is a pilot study exploring potential usages of currently available linked datasets within the cultural heritage domain. Many privately-held family history collections (shoebox archives) remain vulnerable unless a sustainable, affordable and accessible model of citizen-archivist digital preservation can be offered. Citizen-historians have used the web as a platform to preserve cultural heritage; however, with no accessible or sustainable model, these digital footprints have been ad hoc and rarely connected to broader historical research. Similarly, current approaches to connecting material on the web by exploiting linked datasets do not take into account the data characteristics of the cultural heritage domain. Funded by Semantic Media, the POWkist project is investigating how best to capture, curate, connect and present the contents of citizen-historians' shoebox archives in an accessible and sustainable online collection. Using the Curios platform - an open-source digital archive - we have digitised a collection relating to a prisoner of war during WWII (1939-1945). Following a series of user group workshops, POWkist is now connecting these 'made digital' items with the broader web using a semantic technology model and identifying appropriate linked datasets of relevant content such as DBPedia (an archived linked dataset of Wikipedia) and Ordnance Survey Open Data. We are analysing the characteristics of cultural heritage linked datasets, so that these materials are better visualised, contextualised and presented in an attractive and comprehensive user interface. Our paper will consider the issues we have identified and the solutions we are developing, and include a demonstration of our work-in-progress.
Abstract:
Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well-received. However, as users become predominantly mobile, little has been done to consider the impacts of the wireless environment, especially the capacity constraints and changing channel. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks, by deciding what content to deliver, when and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach exploits the broadcast nature of wireless communication and the social nature of content through multicasting and precaching. Results indicate this novel joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. By utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system to handle transmissions for both system-recommended content ('push') and active user requests ('pull'). Further, we extend the joint optimization framework to a wireless infrastructure with multiple base stations. The problem becomes much harder in that there are many more system configurations, including but not limited to power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit with dedicated spectrum resources and thus no interference; and 'in-band', in which they share the spectrum and need to mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocation individually; 2) the system consolidates the decisions and allocations, reducing redundant transmissions.
Additionally, if the social network applications could provide the predictions of how the social contents disseminate, the wireless networks could schedule the transmissions accordingly and significantly improve the dissemination performance by reducing the delivery delay. We propose a novel method utilizing: 1) hybrid systems to handle active disseminating requests; and 2) predictions of dissemination dynamics from the social network applications. This method could mitigate the performance degradation for content dissemination due to wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.
Abstract:
Annotation Pro - a description of the techniques and methods implemented in the tool, together with a list of all built-in functionalities and features of the user interface, and usage tips.
Abstract:
Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes of the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in the findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question in mind and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery.
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. 
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
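A core difficulty named above - running many statistical tests at once and deciding which results are significant - is usually addressed with a multiple-comparison correction. The sketch below shows the standard Benjamini-Hochberg false-discovery-rate procedure in Python; it is a generic illustration of that ingredient, not the dissertation's actual HVHT implementation, and the function name is invented:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of rejected null hypotheses under
    Benjamini-Hochberg false-discovery-rate control at level alpha."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)                       # ranks p-values ascending
    thresh = alpha * np.arange(1, m + 1) / m    # BH step-up thresholds
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])        # largest rank passing its threshold
        reject[order[: k + 1]] = True           # reject all tests up to that rank
    return reject
```

For example, `benjamini_hochberg([0.01, 0.02, 0.03, 0.5], alpha=0.05)` rejects the first three hypotheses and keeps the last, whereas a naive per-test cutoff at 0.05/4 would reject only the first.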
Abstract:
Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required so it can be used properly. To help in the progression of Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments as well as process the raw speckle images into contrast images while they are being acquired. The design of the system was successful and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI as well as offering a full introduction to the history, theory and applications of LSCI.
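The contrast computation at the heart of LSCI is well established: the speckle contrast K = sigma/mu is evaluated over a small sliding window of the raw image. The NumPy sketch below illustrates that step; it is a stand-in for, not a copy of, the thesis's Matlab GUI code, and the 7x7 window is simply a common choice:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(raw, window=7):
    """Spatial speckle contrast K = sigma / mu over a sliding window.

    Pads with reflection so the output has the same shape as the input."""
    pad = window // 2
    img = np.pad(raw.astype(float), pad, mode="reflect")
    win = sliding_window_view(img, (window, window))
    mu = win.mean(axis=(-1, -2))       # local mean intensity
    sigma = win.std(axis=(-1, -2))     # local standard deviation
    K = np.zeros_like(mu)
    np.divide(sigma, mu, out=K, where=mu > 0)   # avoid division by zero
    return K
```

Regions with faster motion show more temporal blurring of the speckle pattern and hence lower K, which is what the processed contrast images visualise.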
Abstract:
The International Space Station (ISS) requires a substantial amount of potable water for use by the crew. The economic and logistic limitations of transporting the vast amount of water required onboard the ISS necessitate onboard recovery and reuse of the aqueous waste streams. Various treatment technologies are employed within the ISS water processor to render the waste water potable, including filtration, ion exchange, adsorption, and catalytic wet oxidation. The ion exchange resins and adsorption media are combined in multifiltration beds for removal of ionic and organic compounds. A mathematical model (MFBMODEL™) designed to predict the performance of a multifiltration (MF) bed was developed. MFBMODEL consists of ion exchange models for describing the behavior of the different resin types in a MF bed (e.g., mixed bed, strong acid cation, strong base anion, and weak base anion exchange resins) and an adsorption model capable of predicting the performance of the adsorbents in a MF bed. Multicomponent ion exchange equilibrium models that incorporate the water formation reaction, electroneutrality condition, and degree of ionization of weak acids and bases for mixed bed, strong acid cation, strong base anion, and weak base anion exchange resins were developed and verified. The equilibrium models developed use a tanks-in-series approach that allows for consideration of variable influent concentrations. The adsorption modeling approach was developed in related studies and application within the MFBMODEL framework was demonstrated in the Appendix to this study. MFBMODEL consists of a graphical user interface programmed in Visual Basic and Fortran computational routines. This dissertation shows MF bed modeling results in which the model is verified for a surrogate of the ISS waste shower and handwash stream.
In addition, a multicomponent ion exchange model that incorporates mass transfer effects was developed; it is capable of describing the performance of strong acid cation (SAC) and strong base anion (SBA) exchange resins, though it does not include reaction effects. This dissertation presents results showing the mass transfer model's capability to predict the performance of binary and multicomponent column data for SAC and SBA exchange resins. The ion exchange equilibrium and mass transfer models developed in this study are also applicable to terrestrial water treatment systems. They could be applied for removal of cations and anions from groundwater (e.g., hardness, nitrate, perchlorate) and from industrial process waters (e.g., boiler water, ultrapure water in the semiconductor industry).
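The tanks-in-series approach used by the equilibrium models can be sketched without the ion exchange chemistry: a cascade of ideal mixed tanks naturally accommodates a variable influent concentration. The Python skeleton below is an illustrative flow model only, with invented function and parameter names, and omits the sorption and reaction terms that MFBMODEL actually handles:

```python
import numpy as np

def tanks_in_series(c_in, n_tanks, tau, dt):
    """Ideal tanks-in-series cascade (no sorption or reaction).

    c_in : influent concentration at each time step (array)
    tau  : total residence time; each tank holds tau / n_tanks
    Returns the effluent concentration history."""
    tau_i = tau / n_tanks              # residence time of one tank
    c = np.zeros(n_tanks)              # tank concentrations, initially clean
    out = np.empty(len(c_in))
    for t, cin in enumerate(c_in):
        feed = cin
        for i in range(n_tanks):
            # explicit Euler mixing balance: dc/dt = (feed - c) / tau_i
            c[i] += dt / tau_i * (feed - c[i])
            feed = c[i]                # this tank feeds the next
        out[t] = c[-1]
    return out
```

Driving the cascade with a step input produces the familiar S-shaped breakthrough curve, and increasing `n_tanks` sharpens it toward plug flow; the real model layers equilibrium chemistry for each resin type onto this flow skeleton.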
Abstract:
In this thesis, a machine learning approach was used to develop a predictive model for residual methanol concentration in industrial formalin produced at the Akzo Nobel factory in Kristinehamn, Sweden. The MATLAB™ computational environment, supplemented with the Statistics and Machine Learning Toolbox™ from MathWorks, was used to test various machine learning algorithms on the formalin production data from Akzo Nobel. The Gaussian Process Regression algorithm was found to provide the best results and was used to create the predictive model. The model was compiled into a stand-alone application with a graphical user interface using MATLAB Compiler™.
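The prediction step of Gaussian Process Regression can be sketched from scratch: with a radial-basis-function kernel, the posterior mean at test points is a kernel-weighted combination of the training targets. This Python sketch is a generic illustration with arbitrary hyperparameters, not the model fitted in the thesis (which used MATLAB's toolbox implementation):

```python
import numpy as np

def gp_predict(x_train, y_train, x_test, length_scale=1.0, noise=1e-6):
    """Posterior mean of a zero-mean Gaussian process with an RBF kernel.

    Computes mean(x*) = K(x*, X) @ (K(X, X) + noise*I)^-1 @ y."""
    def rbf(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length_scale) ** 2)

    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)     # kernel weights for the targets
    return rbf(x_test, x_train) @ alpha
```

In practice the length scale and noise level would be chosen by maximizing the marginal likelihood, which is what toolbox implementations automate; the sketch fixes them only to keep the example short.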
Abstract:
The goals of this work were to distinguish the components of NMR signals from hydrated materials and to monitor their evolution after the addition of water to the powders during the first two days of hydration, and to implement the 3 Tau Model in a MATLAB script, called 3TM, provided with a graphical user interface (GUI), so that the 3 Tau Model can be used easily with NMRD profiles. The 3 Tau Model, developed a few years ago, is used for interpreting the dispersion (NMRD profiles, i.e. the dependence on the Larmor frequency) of the longitudinal relaxation times of liquids confined in porous media. This model describes the molecular dynamics of confined molecules by introducing three characteristic correlation times and additional outputs.