25 results for DMS (Computer system)

at Université de Lausanne, Switzerland


Relevance:

80.00%

Publisher:

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of complex-system global software support. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

40.00%

Publisher:

Abstract:

The Learning Affect Monitor (LAM) is a new computer-based assessment system that integrates basic dimensional evaluation and discrete description of affective states in daily life, based on an autonomously adapting system. Subjects evaluate their affective states in a tridimensional space (a valence and activation circumplex, plus global intensity) and then qualify them using up to 30 adjective descriptors chosen from a list. The system gradually adapts to the user, enabling the affect descriptors it presents to be increasingly relevant. An initial study with 51 subjects, using 1-week time-sampling with 8 to 10 randomized signals per day, produced n = 2,813 records with good reliability measures (e.g., response rate of 88.8%, mean split-half reliability of .86), user acceptance, and usability. Multilevel analyses show circadian and weekly (hebdomadal) patterns, as well as significant individual and situational variance components in the basic dimension evaluations. Validity analyses indicate sound assignment of qualitative affect descriptors in the bidimensional semantic space according to the circumplex model of basic affect dimensions. The LAM assessment module can be implemented on different platforms (handheld, desktop, mobile phone) and provides very rapid and meaningful data collection, preserving complex and interindividually comparable information in the domain of emotion and well-being.
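
As an illustration of the kind of data such a system collects, the following minimal sketch (hypothetical field names, not the actual LAM schema) represents one self-report as a point on the valence/activation circumplex plus a global intensity and the chosen adjective descriptors:

```python
# Hypothetical sketch of a single affect record: valence/activation circumplex
# coordinates, a global intensity, and the adjective descriptors chosen by the user.
# Field names and the angle helper are illustrative, not the LAM data schema.
import math
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class AffectRecord:
    timestamp: datetime
    valence: float          # -1.0 (unpleasant) .. +1.0 (pleasant)
    activation: float       # -1.0 (calm)       .. +1.0 (aroused)
    intensity: float        #  0.0 .. 1.0, global intensity
    descriptors: List[str] = field(default_factory=list)   # up to 30 adjectives

    def circumplex_angle(self) -> float:
        """Angular position (degrees) of the state on the valence/activation plane."""
        return math.degrees(math.atan2(self.activation, self.valence)) % 360.0

rec = AffectRecord(datetime.now(), valence=0.6, activation=0.3,
                   intensity=0.7, descriptors=["content", "relaxed"])
print(f"{rec.circumplex_angle():.0f} degrees, intensity {rec.intensity}")
```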

Relevance:

30.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
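
To make the a posteriori (Bayesian) adjustment these tools perform more concrete, here is a minimal sketch: a maximum a posteriori (MAP) fit of individual clearance and volume for a hypothetical one-compartment intravenous drug, combining log-normal population priors with two measured concentrations. The model, parameter values and dosing target are illustrative assumptions, not taken from any of the benchmarked programs:

```python
# Minimal, hypothetical sketch of the a posteriori (Bayesian / MAP) dose-adjustment
# step such tools perform. One-compartment IV bolus model with log-normal priors on
# clearance (CL) and volume (V); all numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

dose = 500.0                        # mg, administered as IV bolus (hypothetical)
t_obs = np.array([2.0, 12.0])       # h, sampling times
c_obs = np.array([18.0, 6.5])       # mg/L, measured concentrations

# Population priors (geometric means and log-scale SDs), hypothetical values
cl_pop, v_pop = 3.0, 30.0           # L/h, L
omega_cl, omega_v = 0.3, 0.2        # between-subject variability (log scale)
sigma = 0.15                        # proportional residual error

def conc(t, cl, v):
    return dose / v * np.exp(-cl / v * t)

def neg_log_posterior(theta):
    cl, v = np.exp(theta)           # optimise on the log scale
    pred = conc(t_obs, cl, v)
    loglik = -0.5 * np.sum(((np.log(c_obs) - np.log(pred)) / sigma) ** 2)
    logprior = (-0.5 * ((theta[0] - np.log(cl_pop)) / omega_cl) ** 2
                - 0.5 * ((theta[1] - np.log(v_pop)) / omega_v) ** 2)
    return -(loglik + logprior)

fit = minimize(neg_log_posterior, x0=np.log([cl_pop, v_pop]))
cl_map, v_map = np.exp(fit.x)
# Dose needed to reach a (hypothetical) average target concentration at steady state
target_cavg, tau = 12.0, 12.0       # mg/L, dosing interval in h
print(f"MAP CL={cl_map:.2f} L/h, V={v_map:.1f} L, "
      f"suggested dose={target_cavg * cl_map * tau:.0f} mg every {tau:.0f} h")
```

Real TDM programs build on full population pharmacokinetic models with covariates and inter-occasion variability, but the MAP individualization step is conceptually the same.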

Relevance:

30.00%

Publisher:

Abstract:

Freehand positioning of the femoral drill guide is difficult during hip resurfacing, and the surgeon is often unsure of the implant position achieved peroperatively. The purpose of this study was to find out whether, by using a navigation system, acetabular and femoral component positioning could be made easier and more precise. Eighteen patients operated on by the same surgeon were matched by sex, age, BMI, diagnosis and ASA score (nine patients with computer assistance, nine with the regular instrumentation). Pre-operative planning was done on standard AP and axial radiographs, with CT scan views for the computer-assisted operations. The final position of the implants was evaluated on the same radiographs for all patients. The follow-up was at least 1 year. No difference between the two groups was observed in terms of femoral component position (p > 0.05). There was also no difference in femoral notching. A trend towards better cup position was observed for the navigated hips, especially for cup anteversion. There was no additional operating time for the navigated hips. Hip navigation for resurfacing surgery may allow improved visualisation and hip implant positioning, but its advantage will probably be more obvious with mini-incisions than with regular-incision surgery.

Relevance:

30.00%

Publisher:

Abstract:

Acetabular cup orientation is a key factor determining hip stability, and standard mechanical guides have been of little help in improving alignment. An in vitro study was carried out to compare the accuracy and precision of a new gravity-assisted guidance system with those of a standard mechanical guide. Three hundred and ten cups were impacted by 5 surgeons, and the final cup orientation was measured. With the new guide, the average error in anteversion was 0.4 degrees, compared with 10.4 degrees with the standard guide; for abduction angles, the errors were 0.3 degrees and -4.7 degrees, respectively. The average time required for orienting the cups was similar for both guides. The accuracy and reproducibility obtained with the new guide were better (P < .0001). These good results still require clinical validation.

Relevance:

30.00%

Publisher:

Abstract:

Malposition of the acetabular component during hip arthroplasty increases the occurrence of impingement, reduces range of motion, and increases the risk of dislocation and long-term wear. To prevent malpositioned hip implants, an increasing number of computer-assisted orthopaedic systems have been described, but their accuracy is not well established. The purpose of this study was to determine the reproducibility and accuracy of conventional versus computer-assisted techniques for positioning the acetabular component in total hip arthroplasty. Using a lateral approach, 150 cups were placed by 10 surgeons in 10 identical plastic pelvis models (freehand, with a mechanical guide, and with computer assistance). Conditions for cup implantation were made to mimic the operating room situation. Preoperative planning was done from a computed tomography scan. The accuracy of cup abduction and anteversion was assessed with an electromagnetic system. Freehand placement showed a mean accuracy of cup anteversion and abduction of 10 degrees and 3.5 degrees, respectively (maximum error, 35 degrees). With the cup positioner, these angles measured 8 degrees and 4 degrees (maximum error, 29.8 degrees), respectively, and with computer assistance, 1.5 degrees and 2.5 degrees (maximum error, 8 degrees), respectively. Computer-assisted cup placement was an accurate and reproducible technique for total hip arthroplasty. It was more accurate than traditional methods of cup positioning.

Relevance:

30.00%

Publisher:

Abstract:

Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
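
For context, an S-system models each state variable with a power-law production and a power-law degradation term, dX_i/dt = α_i ∏_j X_j^{g_ij} − β_i ∏_j X_j^{h_ij}. The sketch below synthesises a noisy time course from a small two-dimensional S-system and recovers parameters with a generic least-squares fit; it illustrates the estimation task only and is not the Newton-flow method described in the abstract:

```python
# Illustrative sketch only: a generic least-squares fit of a small S-system from a
# noisy time course. All parameter values are hypothetical.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def s_system(x, t, alpha, beta, g, h):
    """dx_i/dt = alpha_i * prod_j x_j**g_ij - beta_i * prod_j x_j**h_ij"""
    x = np.maximum(x, 1e-9)                 # keep the power laws defined
    return alpha * np.prod(x ** g, axis=1) - beta * np.prod(x ** h, axis=1)

# "True" 2-dimensional system used to synthesise a noisy profile (hypothetical values)
alpha_t, beta_t = np.array([2.0, 1.0]), np.array([1.0, 1.5])
g_t = np.array([[0.0, -0.8], [0.5, 0.0]])
h_t = np.array([[0.5, 0.0], [0.0, 0.5]])
t = np.linspace(0, 5, 40)
x0 = np.array([1.0, 0.5])
data = odeint(s_system, x0, t, args=(alpha_t, beta_t, g_t, h_t))
data = data + np.random.normal(scale=0.02, size=data.shape)   # measurement noise

def residuals(p):
    alpha, beta = p[:2], p[2:4]
    g, h = p[4:8].reshape(2, 2), p[8:12].reshape(2, 2)
    sim = odeint(s_system, x0, t, args=(alpha, beta, g, h))
    return (sim - data).ravel()

p0 = np.full(12, 0.5)                        # naive initial guess
fit = least_squares(residuals, p0)
print("estimated parameters:", np.round(fit.x, 2))
```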

Relevance:

30.00%

Publisher:

Abstract:

Four standard radiation qualities (from RQA 3 to RQA 9) were used to compare the imaging performance of a computed radiography (CR) system (general purpose and high resolution phosphor plates of a Kodak CR 9000 system), a selenium-based direct flat panel detector (Kodak Direct View DR 9000), and a conventional screen-film system (Kodak T-MAT L/RA film with a 3M Trimax Regular screen of speed 400) in conventional radiography. Reference exposure levels were chosen according to the manufacturer's recommendations to be representative of clinical practice (exposure index of 1700 for digital systems and a film optical density of 1.4). With the exception of the RQA 3 beam quality, the exposure levels needed to produce a mean digital signal of 1700 were higher than those needed to obtain a mean film optical density of 1.4. In spite of intense developments in the field of digital detectors, screen-film systems are still very efficient detectors for most of the beam qualities used in radiology. An important outcome of this study is the behavior of the detective quantum efficiency of the digital radiography (DR) system as a function of beam energy. The practice of users to increase beam energy when switching from a screen-film system to a CR system, in order to improve the compromise between patient dose and image quality, might not be appropriate when switching from screen-film to selenium-based DR systems.
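
For reference, the detective quantum efficiency compared in this study is defined as the ratio of squared output to squared input signal-to-noise ratio; a commonly used measurement form (stated here as general background, with symbols not taken from the abstract) is

```latex
\mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}_{\mathrm{out}}^{2}(f)}{\mathrm{SNR}_{\mathrm{in}}^{2}(f)}
\;\approx\; \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NPS}(f)},
```

where MTF is the modulation transfer function, NPS the measured noise power spectrum, d-bar the mean detector signal and q-bar the incident photon fluence, assuming a linear, Poisson-limited input.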

Relevance:

30.00%

Publisher:

Abstract:

The potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has, in recent years, been explored mainly in control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership functions (T2MFs) and the footprint of uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs), with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the chromosome structure has fewer genes than in other GA methods, and chromosome initialization is more precise. The proposed approach addresses the application of the interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung computer-aided detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
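
As a concrete illustration of an interval type-2 membership function and its footprint of uncertainty, the sketch below implements one common textbook construction, a Gaussian primary membership function with an uncertain mean; the GA and cross-validation tuning described in the abstract are not reproduced here:

```python
# Minimal sketch of a Gaussian interval type-2 membership function with an uncertain
# mean m in [m1, m2] and fixed sigma -- a common textbook construction, not the
# specific IT2MFs tuned in the paper.
import numpy as np

def gaussian(x, m, sigma):
    return np.exp(-0.5 * ((x - m) / sigma) ** 2)

def it2_gaussian_uncertain_mean(x, m1, m2, sigma):
    """Return (lower, upper) membership grades; the gap between them is the FOU."""
    x = np.asarray(x, dtype=float)
    upper = np.where(x < m1, gaussian(x, m1, sigma),
             np.where(x > m2, gaussian(x, m2, sigma), 1.0))
    lower = np.where(x <= (m1 + m2) / 2.0,
                     gaussian(x, m2, sigma),
                     gaussian(x, m1, sigma))
    return lower, upper

lo, up = it2_gaussian_uncertain_mean(np.linspace(-3, 3, 7), m1=-0.5, m2=0.5, sigma=1.0)
print(np.round(lo, 3))
print(np.round(up, 3))
```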

Relevance:

30.00%

Publisher:

Abstract:

Among the numerous magnetic resonance imaging (MRI) techniques, perfusion MRI provides non-invasive insight into the passage of blood through the brain's vascular network. Studying disease models and transgenic mice would intrinsically help in understanding underlying brain functions, cerebrovascular disease and brain disorders. This study evaluates the feasibility of performing continuous arterial spin labeling (CASL) on all cranial arteries for mapping murine cerebral blood flow at 9.4 T. We showed that, with an actively detuned two-coil system, a labeling efficiency of 0.82 ± 0.03 was achieved with minimal magnetization transfer residuals in the brain. The resulting cerebral blood flow of the healthy mouse was 99 ± 26 mL/100 g/min, in excellent agreement with other techniques. In conclusion, high magnetic fields deliver high sensitivity, allowing not only CASL but also other MR techniques, e.g. (1)H MRS and diffusion MRI, to be used in studying murine brains.
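
As background, a commonly used steady-state CASL quantification model (a general textbook relation, not a formula quoted from this abstract) links the measured control/label signal difference to cerebral blood flow via the labeling efficiency α reported above:

```latex
\mathrm{CBF} \;=\; \frac{\lambda}{T_{1,\mathrm{app}}}\cdot
\frac{M_{\mathrm{control}}-M_{\mathrm{label}}}{2\,\alpha\,M_{\mathrm{control}}},
```

where λ is the brain-blood partition coefficient, T_{1,app} the apparent longitudinal relaxation time of tissue water, α the labeling efficiency, and M_control and M_label the signals from the control and labeled acquisitions.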

Relevance:

30.00%

Publisher:

Abstract:

With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and fastidious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, covering four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurately labelling and sorting each inoculated medium. The challenge for clinical bacteriologists is to determine the ideal automated system for their own laboratory. Indeed, different solutions will be preferred according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is difficult, because audits proposed by manufacturers risk being biased towards the solution proposed by their own company, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of the computer connections between the laboratory information system and the instrument. This article therefore summarizes the main parameters that need to be taken into account when choosing the optimal system, and provides some clues to help clinical bacteriologists make their choice.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Clinical practice does not always reflect best practice and evidence, partly because of unconscious acts of omission, information overload, or inaccessible information. Reminders may help clinicians overcome these problems by prompting the doctor to recall information that they already know or would be expected to know, and by providing information or guidance in a more accessible and relevant format, at a particularly appropriate time.
OBJECTIVES: To evaluate the effects of reminders automatically generated through a computerized system and delivered on paper to healthcare professionals on processes of care (related to healthcare professionals' practice) and outcomes of care (related to patients' health condition).
SEARCH METHODS: For this update the EPOC Trials Search Co-ordinator searched the following databases between June 11-19, 2012: The Cochrane Central Register of Controlled Trials (CENTRAL) and Cochrane Library (Economics, Methods, and Health Technology Assessment sections), Issue 6, 2012; MEDLINE, OVID (1946- ), Daily Update, and In-process; EMBASE, Ovid (1947- ); CINAHL, EbscoHost (1980- ); EPOC Specialised Register, Reference Manager; and INSPEC, Engineering Village. The authors reviewed reference lists of related reviews and studies.
SELECTION CRITERIA: We included individual- or cluster-randomized controlled trials (RCTs) and non-randomized controlled trials (NRCTs) that evaluated the impact of computer-generated reminders delivered on paper to healthcare professionals on processes and/or outcomes of care.
DATA COLLECTION AND ANALYSIS: Review authors working in pairs independently screened studies for eligibility and abstracted data. We contacted authors to obtain important missing information for studies that were published within the last 10 years. For each study, we extracted the primary outcome when it was defined, or calculated the median effect size across all reported outcomes. We then calculated the median absolute improvement and interquartile range (IQR) in process adherence across included studies, using the primary outcome or median outcome as the representative outcome.
MAIN RESULTS: In the 32 included studies, computer-generated reminders delivered on paper to healthcare professionals achieved moderate improvement in professional practices, with a median improvement of processes of care of 7.0% (IQR: 3.9% to 16.4%). Implementing reminders alone improved care by 11.2% (IQR 6.5% to 19.6%) compared with usual care, while implementing reminders in addition to another intervention improved care by only 4.0% (IQR 3.0% to 6.0%) compared with the other intervention. The quality of evidence for these comparisons was rated as moderate according to the GRADE approach. Two reminder features were associated with larger effect sizes: providing space on the reminder for the provider to enter a response (median 13.7% versus 4.3% for no response, P value = 0.01) and providing an explanation of the content or advice on the reminder (median 12.0% versus 4.2% for no explanation, P value = 0.02). Median improvement in processes of care also differed according to the behaviour the reminder targeted: for instance, reminders to vaccinate improved processes of care by 13.1% (IQR 12.2% to 20.7%) compared with other targeted behaviours. In the only study that had sufficient power to detect a clinically significant effect on outcomes of care, reminders were not associated with significant improvements.
AUTHORS' CONCLUSIONS: There is moderate quality evidence that computer-generated reminders delivered on paper to healthcare professionals achieve moderate improvement in process of care. Two characteristics emerged as significant predictors of improvement: providing space on the reminder for a response from the clinician and providing an explanation of the reminder's content or advice. The heterogeneity of the reminder interventions included in this review also suggests that reminders can improve care in various settings under various conditions.
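
For readers unfamiliar with the summary statistic used above, the following small snippet shows how a median absolute improvement and its interquartile range could be computed from per-study effect sizes; the numbers are synthetic placeholders, not data from the review:

```python
# Illustration only: computing a median improvement and its IQR across studies
# from per-study effect sizes (synthetic numbers, not the review's data).
import numpy as np

improvements = np.array([3.9, 5.2, 7.0, 11.2, 16.4, 4.3, 13.7])  # % improvement per study
median = np.median(improvements)
q1, q3 = np.percentile(improvements, [25, 75])
print(f"median improvement {median:.1f}% (IQR {q1:.1f}% to {q3:.1f}%)")
```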

Relevance:

30.00%

Publisher:

Abstract:

Visualization of the vascular systems of organs or of small animals is important for an assessment of basic physiological conditions, especially in studies that involve genetically manipulated mice. For a detailed morphological analysis of the vascular tree, it is necessary to demonstrate the system in its entirety. In this study, we present a new lipophilic contrast agent, Angiofil, for performing postmortem microangiography using microcomputed tomography. The new contrast agent was tested in 10 wild-type mice. Imaging of the vascular system revealed vessels down to the caliber of capillaries, and the digital three-dimensional data obtained from the scans allowed for virtual cutting, amplification, and scaling without destroying the sample. Using computer software, parameters such as vessel length and caliber could be quantified and remapped by color coding onto the surface of the vascular system. The liquid Angiofil is easy to handle and highly radio-opaque. Because of its lipophilic properties, it is retained intravascularly; this facilitates virtual vessel segmentation and yields an enduring signal, which is advantageous during repeated investigations or when samples need to be transported from the site of preparation to the place of analysis. These characteristics make Angiofil a promising novel contrast agent; when combined with microcomputed tomography, it has the potential to become a powerful method for rapid vascular phenotyping.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents the gold standard TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this assignment. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.
Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: Twelve software tools were identified, tested and ranked, representing a comprehensive review of the characteristics of the available software. The number of drugs handled varies widely, and 8 programs offer users the ability to add their own drug models. Ten computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.
Conclusion: Although two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computer tools to support therapeutic monitoring is still growing. Although developers have put effort into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.