29 results for Computer systems organization: general-emerging technologies
at Université de Lausanne, Switzerland
Abstract:
Ants (Hymenoptera, Formicidae) represent one of the most successful eusocial taxa in terms of both their geographic distribution and species number. The publication of seven ant genomes within the past year was a quantum leap for socio- and ant genomics. The diversity of social organization in ants makes them excellent model organisms for studying the evolution of social systems. Comparing the ant genomes with those of the honeybee, a lineage that evolved eusociality independently of ants, and with those of solitary insects suggests significant differences in key aspects of genome organization between social and solitary insects, as well as among ant species. Altogether, these seven ant genomes open exciting new research avenues and opportunities for understanding the genetic basis and regulation of social species, and of adaptive complex systems in general.
Abstract:
BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost-effective fashion. This is an updated version of an earlier Cochrane systematic review, by Walton et al, published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialised register (June 1996 to December 2006), MEDLINE (1966 to December 2006) and EMBASE (1980 to December 2006), hand-searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007), and checked reference lists from primary articles. SELECTION CRITERIA: Randomised controlled trials, controlled trials, controlled before-and-after studies and interrupted time series analyses of computerised advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used); any change in the health of patients resulting from computerised advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (compared with fifteen comparisons in the original review), covering a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors, although some studies attempted to influence prescriptions by pharmacists and nurses. Although all studies used reliable outcome measures, their quality was generally low. Computerised advice for drug dosage gave significant benefits by: (1) increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92); (2) increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82); (3) reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08); (4) reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70); and (5) reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17). AUTHORS' CONCLUSIONS: This review suggests that computerised advice for drug dosage has some benefits: it increased the initial dose of drug, increased serum drug concentrations and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of hospital stay. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that certain decision-support technical features (such as integration into a computer physician order entry system) or aspects of the organisation of care (such as the setting) could optimise the effect of computerised advice.
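For readers unfamiliar with the effect metric reported above, the following sketch shows how a standardised mean difference and its approximate 95% CI are obtained from two-arm summary data. The input numbers are invented, and the small-sample (Hedges) correction often applied in Cochrane meta-analyses is omitted for brevity.

```python
import math

def standardised_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with a normal-approximation 95% CI (hypothetical inputs)."""
    # Pooled standard deviation across the two arms
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Standard large-sample variance approximation for d
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    se = math.sqrt(var_d)
    return d, (d - 1.96 * se, d + 1.96 * se)

# Invented example: intervention arm vs. control arm
d, (lo, hi) = standardised_mean_difference(12.0, 2.5, 40, 9.5, 2.2, 38)
print(f"SMD = {d:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```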
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used for both industrial automation and business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also supports several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed, with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones has created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-based disruptions in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated designs consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and their customers' high exposure to risk, as production depends on the fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take given the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and it will eventually also affect industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
Breakthrough technologies that now enable the sequencing of individual genomes will irreversibly change the way diseases are diagnosed, predicted, prevented and treated. For these technologies to reach their full potential requires, upstream, access to high-quality biomedical data and samples from a large number of properly informed and consenting individuals and, downstream, the possibility of transforming the emerging knowledge into clinical utility. The Lausanne Institutional Biobank was designed as an integrated, highly versatile infrastructure to harness the power of these emerging technologies, catalyse the discovery and development of innovative therapeutics and biomarkers, and advance the field of personalised medicine. Described here are its rationale, design and governance, as well as the parallel initiatives launched locally to address the societal, ethical and technological issues associated with this new bio-resource. Since January 2013, inpatients admitted to Lausanne CHUV University Hospital have been systematically invited to provide a general consent for the use of their biomedical data and samples for research, to complete a standardised questionnaire, to donate a 10-ml sample of blood for future DNA extraction and to be re-contacted for future clinical trials. Over the first 18 months of operation, 14,459 patients were contacted, and 11,051 agreed to participate in the study. This initial 18-month experience shows that a systematic hospital-based biobank is feasible; it demonstrates strong engagement in research from the patient population in this university hospital setting, and the need for a broad, integrated approach if the future of medicine is to reach its full potential.
Abstract:
Curated databases are an integral part of the tool set that researchers use daily in their work. For most users, however, how these databases are maintained, and by whom, is rather obscure. The International Society for Biocuration (ISB) represents biocurators, software engineers, developers and researchers with an interest in biocuration. Its goals include fostering communication between biocurators, promoting and describing their work, and highlighting the added value of biocuration to the world. The ISB recently conducted a survey of biocurators to better understand their educational and scientific backgrounds, their motivations for choosing a curatorial job and their career goals; the results are reported here. From the responses received, it is evident that biocuration is performed by highly trained scientists and perceived to be a stimulating career, offering both intellectual challenges and the satisfaction of performing work essential to the modern scientific community. It is also apparent that the ISB has at least a dual role to play in facilitating biocurators' work: (i) to promote biocuration as a career within the greater scientific community; (ii) to aid the development of resources for biomedical research by promoting nomenclature and data-sharing standards that will allow the interconnection of biological databases and better exploitation of the pivotal contributions that biocurators are making. DATABASE URL: http://biocurator.org.
Abstract:
[Table of contents] 1. Introduction. 2. Structure (introduction, hierarchy). 3. Processes (overview, patient flows, activity flows, resource flows, temporal aspects, accounting aspects). 4. Descriptors (qualification, quantification). 5. Indicators (definitions, productivity, relevance, adequacy, efficacy, effectiveness, efficiency, standards). 6. Bibliography.
Abstract:
PURPOSE: Afferent asymmetry of visual function is detectable in both normal and pathologic conditions. With a computerized test, we assessed the variability in measuring afferent asymmetry of the pupillary light reflex, that is, the relative afferent pupillary defect. METHODS: In ten normal subjects, pupillary responses to an alternating light stimulus were recorded with computerized infrared pupillography. The relative afferent pupillary defect for each test was determined using a new computer analysis. The 95% confidence interval of each determination of the relative afferent pupillary defect was used to represent the short-term fluctuation in its measurement. To optimize the test for clinical use, we studied the influence of stimulus intensity, duration, and number on the variability of the relative afferent pupillary defect. RESULTS: When the relative afferent pupillary defect was based on only a few light alternations (stimulus pairs), there was excessive variability in its measurement (95% confidence interval > 0.5 log units). With approximately 200 stimulus pairs, the 95% confidence interval was reduced to less than 0.1 log unit (relative afferent pupillary defect +/- 0.05 log unit). Also, there was less variability when the dark interval between alternating light stimuli was less than one second. CONCLUSIONS: Computerized infrared pupillography can standardize the alternating light test and minimize the error in quantifying a relative afferent pupillary defect. A reproducible measurement of the relative afferent pupillary defect is desirable for defining afferent injury and following the course of disease.
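The narrowing of the 95% confidence interval with the number of stimulus pairs follows from the standard error of the mean. A minimal sketch, assuming a purely illustrative per-pair noise level of 0.35 log units (the abstract does not describe the authors' exact per-pair estimator):

```python
import numpy as np

def rapd_ci(pair_estimates):
    """Mean RAPD and 95% CI half-width from per-pair log-unit estimates."""
    est = np.asarray(pair_estimates)
    half_width = 1.96 * est.std(ddof=1) / np.sqrt(est.size)
    return est.mean(), half_width

rng = np.random.default_rng(0)
# Simulated per-pair estimates around a true RAPD of 0.3 log units
for n_pairs in (5, 50, 200):
    mean, hw = rapd_ci(rng.normal(0.3, 0.35, n_pairs))
    print(f"{n_pairs:4d} pairs: RAPD {mean:+.2f} +/- {hw:.2f} log units")
```

With roughly 200 pairs the half-width drops to about 0.05 log units, consistent with the figure reported above.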
Abstract:
OBJECTIVE: In order to improve the quality of our Emergency Medical Services (EMS), to raise bystander cardiopulmonary resuscitation rates and thereby meet what is becoming a universal standard of quality for emergency services, we decided to implement systematic dispatcher-assisted, or telephone, CPR (T-CPR) in our medical dispatch center, a non-Advanced Medical Priority Dispatch System. The aim of this article is to describe the implementation process, costs and results following the introduction of this new "quality" procedure. METHODS: This was a prospective study. Over an 8-week period, our EMS dispatchers were given new procedures for providing T-CPR. We then collected data on all non-traumatic cardiac arrests within our state (Vaud, Switzerland) for the following 12 months. For each event, the dispatchers had to record in writing the reason they either ruled out cardiac arrest (CA) or did not propose T-CPR when they did suspect CA. All emergency call recordings were reviewed by the medical director of the EMS. The analysis of the recordings and the dispatchers' written explanations were then compared. RESULTS: During the 12-month study period, a total of 497 patients (both adults and children) were identified as having had a non-traumatic cardiac arrest. Of this total, 203 cases were excluded and 294 cases were eligible for T-CPR. Among these eligible cases, dispatchers proposed T-CPR on 202 occasions (69% of eligible cases). They also erroneously proposed T-CPR on 17 occasions in which a CA was wrongly identified (false positives), representing 7.8% of all T-CPR. No costs were incurred to implement our study protocol and procedures. CONCLUSIONS: This study demonstrates that it is possible, using a brief sensitization campaign but no specific training, to implement systematic dispatcher-assisted CPR in a non-Advanced Medical Priority Dispatch System such as our EMS, which had no prior experience with systematic T-CPR. The results in terms of T-CPR delivery rate and false positives are similar to those of previous studies, and we found them satisfying given the short time frame of the study. Our results demonstrate that it is possible to improve the quality of emergency services at moderate or even no additional cost, which should be of interest to all EMS that do not currently use T-CPR procedures. EMS that do not yet offer T-CPR should consider implementing the technique as soon as possible, and we expect our experience may provide answers for those planning to incorporate T-CPR into their daily practice.
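The two headline percentages can be reproduced directly from the reported counts:

```python
# Counts taken from the RESULTS section above
eligible = 294          # eligible non-traumatic cardiac arrests
tcpr_proposed = 202     # T-CPR proposed in truly eligible cases
false_positives = 17    # T-CPR proposed when CA was wrongly identified

delivery_rate = tcpr_proposed / eligible                        # ~69%
fp_share = false_positives / (tcpr_proposed + false_positives)  # ~7.8%
print(f"T-CPR delivery rate: {delivery_rate:.0%}")
print(f"False positives among all T-CPR: {fp_share:.1%}")
```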
Abstract:
The ATP-binding cassette (ABC) family of proteins comprises a group of membrane transporters involved in the transport of a wide variety of compounds, such as xenobiotics, vitamins, lipids, amino acids, and carbohydrates. Determining their regional expression patterns along the intestinal tract will further characterize their transport functions in the gut. The mRNA expression levels of murine ABC transporters in the duodenum, jejunum, ileum, and colon were examined using the Affymetrix MuU74v2 GeneChip set. Eight ABC transporters (Abcb2, Abcb3, Abcb9, Abcc3, Abcc6, Abcd1, Abcg5, and Abcg8) displayed significant differential gene expression along the intestinal tract, as determined by two statistical models (a global error assessment model and a classic ANOVA, both with P < 0.01). Concordance with semiquantitative real-time PCR was high. Analyzing the promoters of the differentially expressed ABC transporters did not identify common transcriptional motifs between family members or with other genes; however, the expression profile for Abcb9 was highly correlated with that of fibulin-1, and both genes share a common complex promoter model involving the NFkappaB, zinc binding protein factor (ZBPF), GC-box factors SP1/GC (SP1F), and early growth response factor (EGRF) transcription binding motifs. The cellular location of another differentially expressed ABC transporter, Abcc3, was examined by immunohistochemistry. Staining revealed that the protein is consistently expressed in the basolateral compartment of enterocytes along the anterior-posterior axis of the intestine, and the intensity of the staining pattern is concordant with the expression profile. This agrees with previous findings in which the mRNA, protein, and transport function of Abcc3 were increased in the rat distal intestine. These data reveal regional differences in gene expression profiles along the intestinal tract and demonstrate that a complete understanding of intestinal ABC transporter function can only be achieved by examining the physiologically distinct regions of the gut.
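A minimal sketch of the per-gene screen for regional differential expression, framed as a one-way ANOVA across the four segments at the study's P < 0.01 threshold. The intensities below are invented, and the complementary global error assessment model used in the study is not reproduced here.

```python
from scipy.stats import f_oneway

# Hypothetical normalised probe-set intensities per intestinal segment
expression = {
    "duodenum": [8.1, 8.3, 7.9],
    "jejunum":  [8.0, 8.2, 8.1],
    "ileum":    [9.4, 9.6, 9.2],
    "colon":    [7.1, 7.0, 7.3],
}

f_stat, p = f_oneway(*expression.values())
if p < 0.01:  # significance threshold used in the study
    print(f"Differential expression along the tract: F = {f_stat:.1f}, P = {p:.2g}")
```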
Abstract:
The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing, navigator-gated 3D coronary magnetic resonance angiography (MRA) with real-time motion correction was investigated in a moving phantom. The objective image-quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that a short navigator evaluation time is of crucial importance for improved image quality, whereas navigator spatial resolution showed minimal influence on image quality.
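As a rough illustration of the SNR comparison, the sketch below applies a generic region-of-interest definition (mean signal over background-noise standard deviation) to a synthetic phantom image; the paper's actual ROI placement and its vessel sharpness metric are not specified in the abstract.

```python
import numpy as np

def snr(image, signal_mask, noise_mask):
    """SNR as mean ROI signal over background-noise standard deviation."""
    img = np.asarray(image, dtype=float)
    return img[signal_mask].mean() / img[noise_mask].std(ddof=1)

# Synthetic 64x64 phantom: a bright "vessel" block on a noisy background
rng = np.random.default_rng(1)
img = rng.normal(10.0, 2.0, (64, 64))
img[24:40, 24:40] += 100.0

signal = np.zeros((64, 64), dtype=bool); signal[24:40, 24:40] = True
noise = np.zeros((64, 64), dtype=bool);  noise[:16, :16] = True
print(f"SNR = {snr(img, signal, noise):.1f}")
```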
Abstract:
OBJECTIVE: Before a patient can be connected to a mechanical ventilator, the controls of the apparatus need to be set appropriately. Today, this is done by the intensive care professional. With the advent of closed-loop controlled mechanical ventilation, methods will be needed to select appropriate start-up settings automatically. The objective of our study was to test such a computerized method, which could eventually be used as a start-up procedure (first 5-10 minutes of ventilation) for closed-loop controlled ventilation. DESIGN: Prospective study. SETTINGS: ICUs in two adult hospitals and one children's hospital. PATIENTS: 25 critically ill adult patients (age > or = 15 y) and 17 critically ill children, selected at random, were studied. INTERVENTIONS: To simulate 'initial connection', the patients were disconnected from their ventilator and transiently connected to a modified Hamilton AMADEUS ventilator for at most one minute. During that time they were ventilated with a fixed, standardized breath pattern (Test Breaths) based on pressure-controlled synchronized intermittent mandatory ventilation (PCSIMV). MEASUREMENTS AND MAIN RESULTS: Airway flow, airway pressure and instantaneous CO2 concentration were measured at the mouth during application of the Test Breaths, using a mainstream CO2 analyzer. Test Breaths were analyzed in terms of tidal volume, expiratory time constant and series dead space. Using these data, an initial ventilation pattern consisting of respiratory frequency and tidal volume was calculated. This ventilation pattern was compared to the one measured prior to the onset of the study using a two-tailed paired t-test, and additionally to a conventional method for setting up ventilators. The computer-proposed ventilation pattern did not differ significantly from the actual pattern (p > 0.05), while the conventional method did. However, the scatter was large, and in 6 cases deviations in minute ventilation of more than 50% were observed. CONCLUSIONS: The analysis of standardized Test Breaths allows automatic determination of an initial ventilation pattern for intubated ICU patients. While this pattern does not appear to be superior to the one chosen by the conventional method, it is derived fully automatically and without the need for manual entry of patient data such as weight or height. This makes the method potentially useful as a start-up procedure for closed-loop controlled ventilation.
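The abstract does not publish the mapping from the three measured quantities to the proposed pattern, but a speculative sketch conveys the general shape such a start-up rule could take. The alveolar-ventilation target and the three-time-constant expiratory constraint below are assumptions for illustration, not the authors' algorithm.

```python
def initial_pattern(vt_measured, tau_exp, v_dead, alv_target=5.0):
    """Propose (rate in breaths/min, tidal volume in L) from Test Breath data.

    alv_target is a hypothetical alveolar minute ventilation goal (L/min).
    """
    vt = vt_measured                      # start from the measured tidal volume
    rate = alv_target / (vt - v_dead)     # rate meeting the alveolar target
    # Cap the rate so expiration lasts at least 3 time constants,
    # assuming expiration occupies half of each breath cycle
    max_rate = 60.0 / (2 * 3 * tau_exp)
    return min(rate, max_rate), vt

rate, vt = initial_pattern(vt_measured=0.45, tau_exp=0.6, v_dead=0.15)
print(f"Proposed start-up pattern: {rate:.0f} breaths/min at {vt*1000:.0f} ml")
```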
Abstract:
BACKGROUND AND OBJECTIVES: Experimental assessment of photodynamic therapy (PDT) for malignant pleural mesothelioma using a polyethylene glycol conjugate of meta-tetrahydroxyphenylchlorin (PEG-mTHPC). STUDY DESIGN/MATERIALS AND METHODS: (a) PDT was tested on H-meso-1 xenografts (652 nm laser light; fluence 10 J/cm(2); 0.93, 9.3, or 27.8 mg/kg of PEG-mTHPC; drug-light intervals 3-8 days). (b) Intraoperative PDT under similar treatment conditions was performed in the chest cavity of minipigs (n = 18) following extrapleural pneumonectomy (EPP), using an optical integrating balloon device combined with in situ light dosimetry. RESULTS: (a) PDT using PEG-mTHPC resulted in a larger extent of tumor necrosis than in untreated tumors (P < or = 0.01) without causing damage to normal tissue. (b) Intraoperative PDT following EPP was well tolerated in 17 of 18 animals. Mean fluences and fluence rates measured at four sites of the chest cavity ranged from 10.2 +/- 0.2 to 13.2 +/- 2.3 J/cm(2) and from 5.5 +/- 1.2 to 7.9 +/- 1.7 mW/cm(2) (mean +/- SD). Histology 3 months after light delivery revealed no PDT-related tissue injury in all but one animal. CONCLUSIONS: PEG-mTHPC-mediated PDT showed selective destruction of mesothelioma xenografts without causing damage to intrathoracic organs in pigs under similar treatment conditions. The light delivery system afforded regular light distribution to the different parts of the chest cavity.
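The illumination time per site follows from the prescribed fluence and the measured fluence rate (time = fluence / fluence rate). Taking a mid-range rate of 7 mW/cm(2) from the values above for illustration:

```python
# time = fluence / fluence rate
fluence = 10.0            # J/cm^2, prescribed fluence
rate_mw = 7.0             # mW/cm^2, a mid-range measured fluence rate
seconds = fluence / (rate_mw / 1000.0)
print(f"~{seconds / 60:.0f} min of light delivery per site")  # ~24 min
```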
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult because of extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and the implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how the gradual introduction of TDM altered the empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria, and carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics, 109 of them carbapenems. Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with the daily dose (r = 0.7). Similar numbers of inappropriate meropenem trough levels (30.4% in total) fell below and above the predefined limits. The introduction of real-time TDM increased the empirical dose of imipenem/cilastatin, but not of meropenem. CONCLUSIONS: The availability of real-time carbapenem TDM significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
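A minimal sketch of the trough-level triage implied by the results, with each measurement scored against predefined lower and upper limits; the numeric limits and levels below are placeholders, not the unit's actual targets.

```python
def classify_trough(level, lower, upper):
    """Score one trough serum level against predefined limits."""
    if level < lower:
        return "insufficient"   # cf. 11/23 imipenem/cilastatin troughs (47.8%)
    if level > upper:
        return "excessive"
    return "within target"

# Hypothetical trough levels in mg/L against placeholder limits
for trough in (1.2, 3.5, 9.8, 0.7):
    print(trough, classify_trough(trough, lower=2.0, upper=8.0))
```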