17 results for User-Machine System
at Université de Lausanne, Switzerland
Abstract:
Centrifuge is a user-friendly system for simultaneously accessing Arabidopsis gene annotations and intra- and inter-organism sequence comparison data. The tool allows rapid retrieval of user-selected data for each annotated Arabidopsis gene, providing, in any combination, data on the following features: predicted protein properties such as mass, pI, cellular location and transmembrane domains; SWISS-PROT annotations; InterPro domains; Gene Ontology records; verified transcription; and BLAST matches to the proteomes of A. thaliana, Oryza sativa (rice), Caenorhabditis elegans, Drosophila melanogaster and Homo sapiens. The tool lends itself particularly well to the rapid analysis of contigs or of tens or hundreds of genes identified by high-throughput gene expression experiments. In these cases, a summary table of the principal predicted protein features of all genes is given, followed by a more detailed report for each individual gene. Centrifuge can also be used for single-gene analysis or in a word-search mode. AVAILABILITY: http://centrifuge.unil.ch/ CONTACT: edward.farmer@unil.ch.
Abstract:
Drug delivery is one of the most common clinical routines in hospitals and is critical to patients' health and recovery. It involves a decision-making process in which a medical doctor decides the amount (dose) and frequency (dose interval) on the basis of a set of available patient feature data and the doctor's clinical experience (a priori adaptation). This process can be computerized to make the prescription procedure fast, objective, inexpensive, non-invasive and accurate. This paper proposes a Drug Administration Decision Support System (DADSS) to help clinicians and patients compute the initial dose. The system is based on a Support Vector Machine (SVM) algorithm that estimates the potential drug concentration in a patient's blood, from which the best combination of dose and dose interval is selected at the DSS level. The addition of the RANdom SAmple Consensus (RANSAC) technique enhances prediction accuracy by selecting inliers for SVM modeling. Experiments on the imatinib case study show an improvement of more than 40% in prediction accuracy compared with previous work. An important extension to the patient feature data is also proposed in this paper.
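A minimal sketch of this two-stage idea, assuming scikit-learn: RANSAC repeatedly fits the SVM regressor on random subsets, keeps the largest consensus (inlier) set, and refits on it. The patient features, constants and data below are hypothetical stand-ins, not the paper's.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RANSACRegressor

rng = np.random.default_rng(0)
# Hypothetical features: [age (y), body weight (kg), dose (mg), dose interval (h)]
X = rng.uniform([20, 45, 100, 12], [80, 110, 800, 24], size=(200, 4))
# Synthetic "true" blood concentration plus noise and a few gross outliers
y = 0.02 * X[:, 2] / X[:, 1] * 24.0 / X[:, 3] + rng.normal(0, 0.01, 200)
y[rng.choice(200, size=10, replace=False)] += 1.0   # corrupted records

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.005))
model = RANSACRegressor(estimator=svr, min_samples=50, random_state=0)
model.fit(X, y)   # SVR is refitted on the largest consensus (inlier) set

print("inliers kept:", model.inlier_mask_.sum(), "of", len(y))
print("predicted concentration:", model.predict([[55, 70, 400, 24]])[0])
```

Wrapping the regressor in RANSAC means a handful of corrupted records (for example, mis-logged doses) no longer drags the fitted concentration model, which is the effect the paper attributes to inlier selection.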
Abstract:
BACKGROUND: DNA sequence integrity, mRNA concentrations and protein-DNA interactions have been subject to genome-wide analyses based on microarrays, with ever-increasing efficiency and reliability, over the past fifteen years. Very recently, however, novel technologies for Ultra High-Throughput DNA Sequencing (UHTS) have been harnessed to study these phenomena with unprecedented precision. As a consequence, the extensive bioinformatics environment available for array data management, analysis, interpretation and publication must be extended to include these novel sequencing data types. DESCRIPTION: MIMAS was originally conceived as a simple, convenient and local Microarray Information Management and Annotation System focused on GeneChips for expression profiling studies. MIMAS 3.0 enables users to manage data from high-density oligonucleotide SNP chips, expression arrays (both 3'UTR and tiling) and promoter arrays, BeadArrays, as well as UHTS data, using a MIAME-compliant standardized vocabulary. Importantly, researchers can export data in MAGE-TAB format and upload them to the EBI's ArrayExpress certified data repository in a one-step procedure. CONCLUSION: We have vastly extended the capability of the system so that it processes the data output of six types of GeneChips (Affymetrix), two different BeadArrays for mRNA and miRNA (Illumina) and the Genome Analyzer (a popular ultra-high-throughput DNA sequencer, Illumina), without compromising its flexibility and user-friendliness. MIMAS, appropriately renamed Multiomics Information Management and Annotation System, is currently used by scientists working in approximately 50 academic laboratories and genomics platforms in Switzerland and France. MIMAS 3.0 is freely available at http://multiomics.sourceforge.net/.
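For readers unfamiliar with MAGE-TAB, an export of this kind boils down to two tab-delimited files: an IDF describing the investigation and an SDRF listing samples and data files. The sketch below is a hypothetical illustration in plain Python with invented field values, not MIMAS code.

```python
import csv

# Minimal IDF (investigation description) and SDRF (sample table) contents
idf_rows = [
    ["Investigation Title", "Example expression profiling study"],
    ["Experimental Design", "time_series_design"],
    ["SDRF File", "example.sdrf.txt"],
]
sdrf_header = ["Source Name", "Characteristics[Organism]", "Array Data File"]
sdrf_rows = [
    ["sample_1", "Homo sapiens", "sample_1.CEL"],
    ["sample_2", "Homo sapiens", "sample_2.CEL"],
]

with open("example.idf.txt", "w", newline="") as f:
    csv.writer(f, delimiter="\t").writerows(idf_rows)
with open("example.sdrf.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(sdrf_header)
    writer.writerows(sdrf_rows)
```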
Abstract:
While mobile technologies can provide great personalized services to mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, in which a user's behaviors, movements and habits can be associated with his or her personal identity. In this thesis, I studied privacy issues in the mobile context, focusing on the design of an adaptive privacy management system for context-aware mobile devices, and explored the role of personalization and of control over users' personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive Single Sign-On solution that uses a user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase the user's perceived ease of use of the system and the service provider's authentication security. I then followed a design-science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the roles of personalization and control ability in my model and how these two elements interact with privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how users' context information can protect private data, it also suggests new elements for privacy-related business models.
Abstract:
This paper presents general problems and approaches in spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases where the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report, some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns the incorporation of real-space constraints such as geomorphology, networks and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazards risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
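A minimal sketch of the regression/mapping task described above, assuming scikit-learn: an SVM learns a continuous environmental variable from coordinates augmented with a DEM-derived geo-feature, then predicts on a regular grid to produce the map. The data and the feature are synthetic stand-ins, not from the report's case studies.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(300, 2))                # monitoring locations
elev = np.sin(xy[:, 0]) + 0.5 * np.cos(xy[:, 1])      # stand-in DEM feature
X = np.column_stack([xy, elev])                       # geo-feature space
y = elev + 0.2 * xy[:, 0] + rng.normal(0, 0.05, 300)  # noisy field to map

model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X, y)

# Predict on a regular grid to produce the final map
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid_elev = np.sin(gx.ravel()) + 0.5 * np.cos(gy.ravel())
grid = np.column_stack([gx.ravel(), gy.ravel(), grid_elev])
field = model.predict(grid).reshape(gx.shape)         # 50 x 50 prediction map
```

Adding further geo-features (slope, curvature, remote sensing bands) simply widens `X`, which is exactly the high-dimensional setting in which the report argues these methods outperform classical spatial statistics.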
Abstract:
Digitalization gives the Internet its power by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium for securing digital identities. Identity functionality is increasingly delivered as sets of services rather than monolithic applications, so an identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available to on-demand calls is a more realistic and acceptable situation. Identity and privacy should be interoperable and distributed through the adoption of service orientation and implementation based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable, user-centric, digital identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to guarantee a resolution of the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason, they should be apprehended from a global perspective, through an integrated and multidisciplinary approach. This approach dictates that privacy law, policies, regulations and technologies be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. The requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that would help an organization's security team turn business interoperability into technical interoperability in the form of a set of services that could accommodate a Service-Oriented Architecture (SOA): a Privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for a vital understanding between business management and technical managers of digital identity-related privacy initiatives. The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of a DigIdeRP project roadmap; in practice, however, there is an iterative process to ensure that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team could follow to successfully implement PaaSS. Several block descriptions are based on the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented seven services that form the PaaSS and described their consumption. The PaaSS Java (JEE) project, WSDL, and XSD code are given and explained.
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data implies the adoption of flexible, robust and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored, and fundamental solutions to change detection problems are provided. In the first part, change detection with user supervision is considered. In a first application, a nonlinear classifier is applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel is injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what he or she believes has changed or not. In the second part, a completely automatic and unsupervised method for precise binary detection of changes is proposed. The technique allows very accurate mapping without any user intervention, which is particularly useful when the readiness and reaction time of the system are crucial constraints. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches are proposed to transform the pair of bi-temporal images and reduce their differences unrelated to changes in land cover. The methods align the distributions of the images so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or their spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
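A simple stand-in for the distribution-alignment idea of the third part (the thesis' own transforms are more sophisticated): per-band histogram matching, assuming scikit-image, on synthetic 4-band images. Once the second acquisition's distribution is mapped onto the first, pixel-wise differencing mostly reflects genuine land-cover change rather than radiometric shift.

```python
import numpy as np
from skimage.exposure import match_histograms

rng = np.random.default_rng(2)
img_t1 = rng.normal(0.4, 0.1, size=(64, 64, 4)).clip(0, 1)  # acquisition 1
img_t2 = img_t1 * 1.3 + 0.05                    # global radiometric shift
img_t2[20:30, 20:30, :] = 0.9                   # a genuine land-cover change
img_t2 = img_t2.clip(0, 1)

# Align the second image's distribution onto the first, band by band
aligned = match_histograms(img_t2, img_t1, channel_axis=-1)

# Pixel-wise comparison now mostly reflects real change, not the shift
diff = np.abs(aligned - img_t1).mean(axis=-1)
change_mask = diff > diff.mean() + 3 * diff.std()   # crude binary detection
print("flagged pixels:", change_mask.sum())
```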
Abstract:
BACKGROUND: The yeast Schizosaccharomyces pombe is frequently used as a model for studying the cell cycle. The cells are rod-shaped and divide by medial fission. The process of cell division, or cytokinesis, is controlled by a network of signaling proteins called the Septation Initiation Network (SIN); SIN proteins associate with the spindle pole bodies (SPBs) during nuclear division (mitosis). Some SIN proteins associate with both SPBs early in mitosis and then display strongly asymmetric signal intensity at the SPBs in late mitosis, just before cytokinesis. This asymmetry is thought to be important for the correct regulation of SIN signaling and the coordination of cytokinesis and mitosis. In order to study the dynamics of organelles or large protein complexes such as the SPB, which have been labeled with a fluorescent protein tag in living cells, a number of image analysis problems must be solved: the cell outline must be detected automatically, and the position and signal intensity associated with the structures of interest within the cell must be determined. RESULTS: We present a new 2D and 3D image analysis system, implemented as a user-friendly software package, that permits versatile, fast and robust analysis of motile, fluorescently labeled structures in large numbers of rod-shaped cells. We have developed new robust algorithms, which we combined with existing methodologies to facilitate fast and accurate analysis. Our software permits the detection and segmentation of rod-shaped cells in either static or dynamic (i.e. time-lapse) multi-channel images. It enables tracking of two structures (for example SPBs) in two different image channels. For 2D or 3D static images, the locations of the structures are identified, and intensity values are then extracted together with several quantitative parameters, such as length, width, cell orientation, background fluorescence and the distance between the structures of interest. Furthermore, two kinds of kymographs of the tracked structures can be established: one representing their migration with respect to their relative position, the other representing their individual trajectories inside the cell. This software package, called "RodCellJ", allowed us to analyze a large number of S. pombe cells to understand the rules that govern SIN protein asymmetry. CONCLUSIONS: RodCellJ is freely available to the community as a package of several ImageJ plugins for simultaneously analyzing the behavior of a large number of rod-shaped cells in an extensive manner. The integration of different image-processing techniques in a single package, together with the development of novel algorithms, not only speeds up the analysis relative to existing tools but also achieves higher accuracy. Its utility was demonstrated on both 2D and 3D static and dynamic images to study the septation initiation network of the yeast Schizosaccharomyces pombe. More generally, it can be used in any biological context where fluorescent-protein-labeled structures need to be analyzed in rod-shaped cells. AVAILABILITY: RodCellJ is freely available at http://bigwww.epfl.ch/algorithms.html (after acceptance of the publication).
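RodCellJ itself is a suite of ImageJ plugins; purely to illustrate the kind of per-cell measurements it reports, here is a hypothetical Python/scikit-image sketch on a synthetic image: segment one rod-shaped cell, read off its length, width and orientation, and locate two bright SPB-like spots.

```python
import numpy as np
from skimage.measure import label, regionprops
from skimage.feature import peak_local_max

img = np.zeros((60, 120))
img[25:35, 20:100] = 0.4            # rod-shaped cell body
img[29, 35] = img[30, 80] = 1.0     # two bright SPB-like signals

cells = label(img > 0.2)            # threshold + connected components
for cell in regionprops(cells, intensity_image=img):
    print(f"length={cell.major_axis_length:.1f}px "
          f"width={cell.minor_axis_length:.1f}px "
          f"orientation={np.degrees(cell.orientation):.1f}deg")

spots = peak_local_max(img, min_distance=5, threshold_abs=0.8)
print("SPB positions (row, col):", spots.tolist())
print("SPB intensities:", img[tuple(spots.T)])
```

Tracking over a time-lapse stack would repeat the spot detection per frame and link nearest positions, which is the raw material for the two kymograph types the abstract describes.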
Abstract:
The mitochondrial 70-kDa heat shock protein (mtHsp70), also known in humans as mortalin, is a central component of the mitochondrial protein import motor and plays a key role in the folding of matrix-localized mitochondrial proteins. MtHsp70 is assisted by a member of the 40-kDa heat shock protein co-chaperone family named Tid1 and by a nucleotide exchange factor. Whereas yeast mtHsp70 has been extensively studied in the context of protein import into mitochondria, and the bacterial 70-kDa heat shock protein was recently shown to act as an ATP-fuelled unfolding enzyme capable of detoxifying stably misfolded polypeptides into harmless, natively refolded proteins, little is known about the molecular functions of human mortalin in protein homeostasis. Here, we developed novel and efficient purification protocols for mortalin and the two splice variants of Tid1, Tid1-S and Tid1-L, and showed that mortalin can mediate the in vitro ATP-dependent reactivation of stable, preformed heat-denatured model aggregates, with the assistance of Mge1 and either the Tid1-L or Tid1-S co-chaperone, or of yeast Mdj1. Thus, in addition to being a central component of the protein import machinery, human mortalin, together with Tid1, may serve as a protein-disaggregating machine which, for lack of Hsp100/ClpB disaggregating co-chaperones, may carry out alone the scavenging of toxic protein aggregates in stressed, diseased or aging human mitochondria.
Abstract:
Fluvial deposits are a challenge for modelling flow in subsurface reservoirs. The connectivity and continuity of permeable bodies have a major impact on fluid flow in porous media. Contemporary object-based and multipoint statistics methods face the problem of robustly representing connected structures. An alternative approach to modelling petrophysical properties is based on a machine learning algorithm: Support Vector Regression (SVR). Semi-supervised SVR is able to establish spatial connectivity by taking into account prior knowledge of natural similarities. As a learning algorithm, SVR is robust to noise and captures dependencies from all available data. Semi-supervised SVR applied to a synthetic fluvial reservoir demonstrated robust results that are well matched to the flow performance.
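scikit-learn offers no semi-supervised SVR, so the sketch below shows only the supervised core of the approach: SVR interpolating permeability from sparse well data over a 2-D reservoir grid. All locations and values are synthetic stand-ins, not the paper's reservoir.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
wells = rng.uniform(0, 1000, size=(40, 2))           # well locations (m)
# Permeability is high inside a sinuous east-west channel, low outside
channel_y = 500 + 150 * np.sin(wells[:, 0] / 200.0)
perm = np.where(np.abs(wells[:, 1] - channel_y) < 60, 800.0, 20.0)
perm += rng.normal(0, 10, size=perm.shape)           # measurement noise (mD)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0))
svr.fit(wells, perm)

gx, gy = np.meshgrid(np.linspace(0, 1000, 100), np.linspace(0, 1000, 100))
perm_map = svr.predict(np.column_stack([gx.ravel(), gy.ravel()]))
perm_map = perm_map.reshape(gx.shape)                # predicted property map
```

In the semi-supervised variant, unlabeled grid locations would additionally constrain the solution through a similarity term, which is what lets the method respect channel connectivity between wells.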
Abstract:
PURPOSE: The objective of this experiment was to establish continuous postmortem circulation in the vascular system of porcine lungs and to evaluate the pulmonary distribution of the perfusate. This research is performed within the broader scope of a revascularization project on Thiel-embalmed specimens. The technique enables teaching anatomy, practicing surgical procedures and doing research under lifelike circumstances. METHODS: After cannulation of the pulmonary trunk and the left atrium, the vascular system was flushed with paraffinum perliquidum (PP) through a heart-lung machine. A continuous circulation was then established using red PP, during which perfusion parameters were measured. The distribution of contrast-containing PP in the pulmonary circulation was visualized on computed tomography. Finally, the amount of leakage from the vascular system was calculated. RESULTS: Reperfusion of the vascular system was maintained for 37 min. The flow rate ranged between 80 and 130 ml/min throughout the experiment, with acceptable perfusion pressures (range: 37-78 mm Hg). Computed tomography imaging and 3D reconstruction revealed a diffuse vascular distribution of PP and a decreasing vascularization ratio in the cranial direction. A self-limiting leak (66.8% of the circulating volume) into the tracheobronchial tree due to vessel rupture was also measured. CONCLUSIONS: PP enables circulation in an isolated porcine lung model with an acceptable pressure-flow relationship, resulting in excellent recruitment of the vascular system. Despite these promising results, rupture of vessel walls may cause leaks. Further exploration of the perfusion capacities of PP in other organs is necessary. Eventually, this could lead to the development of reperfused Thiel-embalmed human bodies, which would have several applications.
Abstract:
Extracorporeal life support systems (ECLS) have become common in cardiothoracic surgery but are still terra incognita in other medical fields, because perfusion units are normally bound to cardiothoracic centres. The Lifebridge B2T is an ECLS meant to be used as an easy, fast-track extracorporeal cardiac support providing short-term perfusion for the transport of a patient to a specialized centre. With the Lifebridge B2T it is now possible to provide extracorporeal bypass for patients in hospitals without a perfusion unit. The Lifebridge B2T was tested on three calves to analyze the handling, performance and safety of the system. The Lifebridge B2T can safely be used clinically and can provide full extracorporeal support for patients in cardiac or pulmonary failure. Flows of up to 3.9 +/- 0.2 l/min were reached, with an inflow pressure of -103 +/- 13 mmHg, using a 21 Fr BioMedicus (Medtronic, Minneapolis, MN, USA) venous cannula. The "plug and play" philosophy, with semi-automatic priming, an integrated checklist, a long battery time of over two hours and an intuitively designed user interface, makes this device very interesting for units with high-risk interventions, such as catheterisation labs. If such a system is needed in an emergency unit, the Lifebridge can provide a high level of safety, even in centres not acquainted with cardiopulmonary bypass.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet market was formed, with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and the related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk leverage, since customers' production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will, in due course, face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between its incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and related control-point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
The Learning Affect Monitor (LAM) is a new computer-based assessment system that integrates basic dimensional evaluation and discrete description of affective states in daily life, based on an autonomously adapting system. Subjects evaluate their affective states in a three-dimensional space (a valence and activation circumplex, plus global intensity) and then qualify them using up to 30 adjective descriptors chosen from a list. The system gradually adapts to the user, enabling the affect descriptors it presents to become increasingly relevant. An initial study with 51 subjects, using 1-week time-sampling with 8 to 10 randomized signals per day, produced n = 2,813 records with good reliability measures (e.g., a response rate of 88.8% and a mean split-half reliability of .86), user acceptance and usability. Multilevel analyses show circadian and hebdomadal (weekly) patterns, as well as significant individual and situational variance components in the basic dimension evaluations. Validity analyses indicate sound assignment of qualitative affect descriptors in the bidimensional semantic space, in accordance with the circumplex model of basic affect dimensions. The LAM assessment module can be implemented on different platforms (handheld, desktop, mobile phone) and provides very rapid and meaningful data collection, preserving complex and interindividually comparable information in the domain of emotion and well-being.
Abstract:
OBJECTIVE: The purpose of this study was to adapt and improve a minimally invasive two-step postmortem angiographic technique for use on human cadavers. Detailed mapping of the entire vascular system is almost impossible with conventional autopsy tools. The technique described should be valuable in the diagnosis of vascular abnormalities. MATERIALS AND METHODS: Postmortem perfusion with an oily liquid is established with a circulation machine. An oily contrast agent is introduced as a bolus injection, and radiographic imaging is performed. In this pilot study, the upper or lower extremities of four human cadavers were perfused. In two cases, the vascular system of a lower extremity was visualized with anterograde perfusion of the arteries. In the other two cases, in which the suspected cause of death was drug intoxication, the veins of an upper extremity were visualized with retrograde perfusion of the venous system. RESULTS: In each case, the vascular system was visualized up to the level of the small supplying and draining vessels. In three of the four cases, vascular abnormalities were found. In one instance, a venous injection mark engendered by the self-administration of drugs was rendered visible by exudation of the contrast agent. In the other two cases, occlusion of the arteries and veins was apparent. CONCLUSION: The method described is readily applicable to human cadavers. After establishment of postmortem perfusion with paraffin oil and injection of the oily contrast agent, the vascular system can be investigated in detail and vascular abnormalities rendered visible.