9 results for Computer System Management
at Université de Lausanne, Switzerland
Abstract:
International standardisation refers to voluntary technical specifications pertaining to the production and exchange of goods and services across borders. The paper outlines a theoretical framework which spells out the contention that hybrid forms of non-state authority are emerging in the global realm. It argues that international standardisation is confronted with a deep rift between promoters of a further socialisation of international standards (i.e. a transfer of the universal scope of law into the official framework of standard-setting bodies) and multinational corporations in favour of a globalisation of technical standards (i.e. universal recognition of minimal sectoral market-based standards). The problems related to the development of a possible ISO standard for system management in corporate social responsibility provide evidence for the argument.
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies, both in software and in hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and it will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
BACKGROUND: DNA sequence integrity, mRNA concentrations and protein-DNA interactions have been subject to genome-wide analyses based on microarrays, with ever-increasing efficiency and reliability, over the past fifteen years. Very recently, however, novel technologies for ultra-high-throughput DNA sequencing (UHTS) have been harnessed to study these phenomena with unprecedented precision. As a consequence, the extensive bioinformatics environment available for array data management, analysis, interpretation and publication must be extended to include these novel sequencing data types. DESCRIPTION: MIMAS was originally conceived as a simple, convenient and local Microarray Information Management and Annotation System focused on GeneChips for expression-profiling studies. MIMAS 3.0 enables users to manage data from high-density oligonucleotide SNP chips, expression arrays (both 3'UTR and tiling) and promoter arrays, BeadArrays, as well as UHTS data, using MIAME-compliant standardized vocabulary. Importantly, researchers can export data in MAGE-TAB format and upload them to the EBI's ArrayExpress certified data repository in a one-step procedure. CONCLUSION: We have vastly extended the capability of the system such that it processes the data output of six types of GeneChips (Affymetrix), two different BeadArrays for mRNA and miRNA (Illumina) and the Genome Analyzer (a popular ultra-high-throughput DNA sequencer, Illumina), without compromising its flexibility and user-friendliness. MIMAS, appropriately renamed the Multiomics Information Management and Annotation System, is currently used by scientists working in approximately 50 academic laboratories and genomics platforms in Switzerland and France. MIMAS 3.0 is freely available via http://multiomics.sourceforge.net/.
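As a rough illustration of the export step, MAGE-TAB stores sample-to-data relationships in a tab-delimited SDRF table; the short Python sketch below writes a one-row SDRF file. The column names follow the MAGE-TAB specification, but the sample values and file names are invented, and this is a hypothetical sketch, not MIMAS code.

```python
import csv

# Toy illustration of the MAGE-TAB idea: sample-to-data relationships
# are stored as a tab-delimited SDRF table. Column names follow the
# MAGE-TAB specification; the values below are invented.
rows = [
    {"Source Name": "sample_1",
     "Characteristics[organism]": "Saccharomyces cerevisiae",
     "Protocol REF": "P-HYB-1",
     "Assay Name": "hyb_1",
     "Array Data File": "sample_1.CEL"},
]

with open("experiment.sdrf.txt", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=rows[0].keys(), delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```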
Abstract:
While mobile technologies can provide great personalized services for mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where users' behaviors, movements and habits can be associated with their personal identity. In this thesis, I studied privacy issues in the mobile context, focusing in particular on the design of an adaptive privacy management system for context-aware mobile devices, and explored the role of personalization and of control over users' personal data. This allowed me to make multiple contributions, both theoretical and practical. In the theoretical realm, I propose and prototype an adaptive single sign-on solution that uses the user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase users' perceived ease of use of the system and the service provider's authentication security. I then followed a design-science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of the privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control ability in my model, and how these two elements interact with the privacy calculus and the technology acceptance model. In the practical realm, this thesis contributes to the understanding of the tradeoff between the benefits of personalized services and the privacy concerns they may raise. By pointing out new opportunities to rethink how users' context information can protect private data, it also suggests new elements for privacy-related business models.
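The abstract gives no implementation details, but the idea of using context as an authentication factor can be sketched as follows. In this hypothetical Python example, a similarity score between a stored context profile and the currently observed context decides whether sign-on is granted silently; all features, weights and thresholds are illustrative assumptions, not the thesis prototype.

```python
from dataclasses import dataclass

@dataclass
class Context:
    location: tuple   # (latitude, longitude)
    wifi_ssids: set   # Wi-Fi networks currently visible
    hour: int         # local hour of day, 0-23

def similarity(profile: Context, current: Context) -> float:
    """Weighted similarity between a stored context profile and the
    observed context (features and weights are illustrative only)."""
    # Rough coordinate distance in degrees; good enough for a toy example.
    d = ((profile.location[0] - current.location[0]) ** 2 +
         (profile.location[1] - current.location[1]) ** 2) ** 0.5
    loc_score = 1.0 if d < 0.01 else 0.0
    # Jaccard overlap of visible Wi-Fi networks.
    wifi_score = (len(profile.wifi_ssids & current.wifi_ssids) /
                  max(len(profile.wifi_ssids | current.wifi_ssids), 1))
    # Circular distance between hours of the day.
    dh = abs(profile.hour - current.hour)
    hour_score = 1.0 - min(dh, 24 - dh) / 12
    return 0.4 * loc_score + 0.4 * wifi_score + 0.2 * hour_score

def adaptive_sign_on(profile: Context, current: Context,
                     threshold: float = 0.7) -> str:
    """Grant silent sign-on only when the context looks familiar;
    otherwise fall back to explicit authentication."""
    return "grant" if similarity(profile, current) >= threshold else "require_credentials"
```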
Abstract:
A score system integrating the evolution of efficacy and tolerability over time was applied to a subpopulation of the STRATHE trial, a trial performed according to a parallel-group design with double-blind, random allocation to either a fixed-dose combination strategy (perindopril/indapamide 2 mg/0.625 mg, with the possibility to increase the dose to 3 mg/0.935 mg and 4 mg/1.250 mg if needed, n = 118), a sequential monotherapy approach (atenolol 50 mg, followed by losartan 50 mg and amlodipine 5 mg if needed, n = 108), or a stepped-care strategy (valsartan 40 mg, followed by valsartan 80 mg and valsartan 80 mg + hydrochlorothiazide 12.5 mg if needed, n = 103). The aim was to lower blood pressure below 140/90 mmHg within a 9-month period. The treatment could be adjusted after 3 and 6 months. Only patients in whom the study protocol was strictly applied were included in this analysis. At completion of the trial, the total score averaged 13.1 +/- 70.5 (mean +/- SD) with the fixed-dose combination strategy, compared with -7.2 +/- 81.0 with the sequential monotherapy approach and -17.5 +/- 76.4 with the stepped-care strategy. In conclusion, the use of a score system allows the comparison of antihypertensive therapeutic strategies, taking efficacy and tolerability into account at the same time. In the STRATHE trial, the best results were observed with the fixed-dose combination containing low doses of an angiotensin-converting enzyme inhibitor (perindopril) and a diuretic (indapamide).
Abstract:
The Learning Affect Monitor (LAM) is a new computer-based assessment system that integrates basic dimensional evaluation and discrete description of affective states in daily life, based on an autonomously adapting system. Subjects evaluate their affective states within a tridimensional space (a valence and activation circumplex, plus global intensity) and then qualify them using up to 30 adjective descriptors chosen from a list. The system gradually adapts to the user, enabling the affect descriptors it presents to become increasingly relevant. An initial study with 51 subjects, using one week of time-sampling with 8 to 10 randomized signals per day, produced n = 2,813 records with good reliability measures (e.g., a response rate of 88.8% and a mean split-half reliability of .86), user acceptance, and usability. Multilevel analyses show circadian and hebdomadal (weekly) patterns, and significant individual and situational variance components in the basic dimension evaluations. Validity analyses indicate sound assignment of qualitative affect descriptors in the bidimensional semantic space according to the circumplex model of basic affect dimensions. The LAM assessment module can be implemented on different platforms (handheld, desktop, mobile phone) and provides very rapid and meaningful data collection, preserving complex and interindividually comparable information in the domain of emotion and well-being.
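As a toy illustration of the "gradually adapts to the user" idea, the hypothetical Python sketch below ranks the adjective descriptors a user picks most often to the top of the list presented at the next signal; the record layout and vocabulary are invented, not taken from the LAM implementation.

```python
from collections import Counter

# Illustrative affect record: valence and activation in [-1, 1],
# global intensity in [0, 1], plus chosen adjective descriptors.
record = {"valence": 0.6, "activation": -0.2, "intensity": 0.7,
          "descriptors": ["calm", "content"]}

class DescriptorAdapter:
    """Toy version of the gradual-adaptation idea: descriptors a user
    picks often are ranked first the next time the list is shown."""

    def __init__(self, vocabulary):
        self.vocabulary = list(vocabulary)
        self.usage = Counter()

    def register(self, chosen):
        # Update per-user usage counts after each completed record.
        self.usage.update(chosen)

    def ranked_list(self, limit=30):
        # Most frequently used descriptors first, unused ones after.
        return sorted(self.vocabulary, key=lambda d: -self.usage[d])[:limit]

adapter = DescriptorAdapter(["calm", "content", "tense", "sad", "alert"])
adapter.register(record["descriptors"])
print(adapter.ranked_list())  # ['calm', 'content', 'tense', 'sad', 'alert']
```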
Abstract:
BACKGROUND: Frequent emergency department users represent a small number of patients but account for a large number of emergency department visits. They deserve attention because they are often vulnerable patients with many risk factors affecting their quality of life (QoL). Case management interventions have resulted in a significant decrease in emergency department visits, but their association with QoL has not been assessed. One aim of our study was to examine to what extent an interdisciplinary case management intervention, compared with standard emergency care, improved frequent emergency department users' QoL. METHODS: The data are part of a randomized controlled trial designed to improve frequent emergency department users' QoL and use of health-care resources at Lausanne University Hospital, Switzerland. In total, 250 frequent emergency department users (≥5 attendances during the previous 12 months; ≥18 years of age) were interviewed between May 2012 and July 2013. Following an assessment focused on social characteristics; social, mental, and somatic determinants of health; risk behaviors; health-care use; and QoL, participants were randomly assigned to the control or the intervention group (n = 125 in each group). The final sample included 194 participants (20 deaths, 36 dropouts; n = 96 in the intervention group, n = 99 in the control group). Participants in the intervention group received a case management intervention from an interdisciplinary, mobile team in addition to standard emergency care. The case management intervention involved four nurses and a physician who provided counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system. The participants' QoL was evaluated by a study nurse using the WHOQOL-BREF five times during the study (at baseline and at 2, 5.5, 9, and 12 months). Four of the six WHOQOL dimensions of QoL were retained here: physical health, psychological health, social relationships, and environment, with scores ranging from 0 (low QoL) to 100 (high QoL). A linear mixed-effects model with participants as a random effect was run to analyze the change in QoL over time. The effects of time, participants' group, and the interaction between time and group were tested. These effects were controlled for sociodemographic characteristics and health-related variables (i.e., age, gender, education, citizenship, marital status, type of financial resources, proficiency in French, somatic and mental health problems, and risk behaviors).
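For readers unfamiliar with this class of model, the hypothetical Python sketch below shows how a linear mixed-effects model with a random intercept per participant and a time-by-group interaction can be specified using statsmodels; the variable names, covariates and data file are illustrative assumptions, not the study's actual data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant per assessment
# (baseline, 2, 5.5, 9 and 12 months), with one QoL domain score per row.
df = pd.read_csv("qol_long.csv")  # columns: pid, qol, months, group, age, gender, education, ...

# Random intercept per participant; fixed effects of time, group and
# their interaction, adjusted for covariates as described in the abstract.
model = smf.mixedlm(
    "qol ~ months * group + age + gender + education",
    data=df,
    groups=df["pid"],
)
result = model.fit()
print(result.summary())
```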