33 results for IOS (Computer operating system)

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

The Learning Affect Monitor (LAM) is a new computer-based assessment system integrating basic dimensional evaluation and discrete description of affective states in daily life, based on an autonomous adapting system. Subjects evaluate their affective states within a tridimensional space (a valence-activation circumplex plus global intensity) and then qualify them using up to 30 adjective descriptors chosen from a list. The system gradually adapts to the user, so that the affect descriptors it presents become increasingly relevant. An initial study with 51 subjects, using a 1-week time-sampling design with 8 to 10 randomized signals per day, produced n = 2,813 records with good reliability measures (e.g., a response rate of 88.8% and a mean split-half reliability of .86), user acceptance, and usability. Multilevel analyses show circadian and weekly patterns, and significant individual and situational variance components in the basic dimension evaluations. Validity analyses indicate sound assignment of the qualitative affect descriptors in the bidimensional semantic space according to the circumplex model of basic affect dimensions. The LAM assessment module can be implemented on different platforms (handheld, desktop, mobile phone) and provides very rapid and meaningful data collection, preserving complex and interindividually comparable information in the domain of emotion and well-being.
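As a rough illustration of the descriptor-assignment idea described above, the following sketch (Python) ranks adjective descriptors by their distance from a rated point in the valence-activation circumplex. The descriptor coordinates are invented for the example; the LAM's actual descriptor list and its adaptive ranking are not reproduced here.

    import math

    # Hypothetical coordinates of a few adjective descriptors in the
    # valence-activation circumplex (both axes scaled to [-1, 1]).
    DESCRIPTORS = {
        "excited": (0.7, 0.7),
        "content": (0.8, -0.3),
        "tense":   (-0.5, 0.8),
        "tired":   (-0.3, -0.8),
    }

    def rank_descriptors(valence, activation):
        """Rank descriptors by Euclidean distance to the rated affect state."""
        def distance(coords):
            return math.hypot(coords[0] - valence, coords[1] - activation)
        return sorted(DESCRIPTORS, key=lambda name: distance(DESCRIPTORS[name]))

    # A pleasant, low-activation state should surface "content" first.
    print(rank_descriptors(valence=0.6, activation=-0.2))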

Relevance:

100.00%

Publisher:

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated designs consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents: first, through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second, through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets and will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

100.00%

Publisher:

Abstract:

Evidence collected from smartphone users shows a growing desire for the personalization offered by services for mobile devices. However, the need to accurately identify users' contexts has important implications for user privacy, and it increases the amount of trust users are asked to place in service providers. In this paper, we introduce a model that describes the role of personalization and control in users' assessment of the costs and benefits associated with the disclosure of private information. We present an instantiation of this model, a context-aware application for smartphones based on the Android operating system, in which users' private information is protected. Focus group interviews were conducted to examine users' privacy concerns before and after they had used our application. The results confirm the utility of our artifact and provide support for our theoretical model, which extends previous literature on privacy calculus and user acceptance of context-aware technology.

Relevance:

40.00%

Publisher:

Abstract:

Freehand positioning of the femoral drill guide is difficult during hip resurfacing, and the surgeon is often unsure of the implant position achieved intraoperatively. The purpose of this study was to find out whether, by using a navigation system, acetabular and femoral component positioning could be made easier and more precise. Eighteen patients operated on by the same surgeon were matched by sex, age, BMI, diagnosis and ASA score (nine patients with computer assistance, nine with the conventional instrumentation). Pre-operative planning was done on standard AP and axial radiographs, with CT scan views for the computer-assisted operations. The final position of the implants was evaluated on the same radiographs for all patients. The follow-up was at least 1 year. No difference between the two groups in terms of femoral component position was observed (p > 0.05). There was also no difference in femoral notching. A trend towards better cup position was observed for the navigated hips, especially for cup anteversion. There was no additional operating time for the navigated hips. Hip navigation for resurfacing surgery may allow improved visualisation and hip implant positioning, but its advantage will probably be more obvious with mini-incisions than with regular incision surgery.

Relevance:

40.00%

Publisher:

Abstract:

Malposition of the acetabular component during hip arthroplasty increases the occurrence of impingement, reduces range of motion, and increases the risk of dislocation and long-term wear. To prevent malpositioned hip implants, an increasing number of computer-assisted orthopaedic systems have been described, but their accuracy is not well established. The purpose of this study was to determine the reproducibility and accuracy of conventional versus computer-assisted techniques for positioning the acetabular component in total hip arthroplasty. Using a lateral approach, 150 cups were placed by 10 surgeons in 10 identical plastic pelvis models (freehand, with a mechanical guide, and with computer assistance). Conditions for cup implantation were made to mimic the operating room situation. Preoperative planning was done from a computed tomography scan. The accuracy of cup abduction and anteversion was assessed with an electromagnetic system. Freehand placement showed a mean error in cup anteversion and abduction of 10 degrees and 3.5 degrees, respectively (maximum error, 35 degrees). With the cup positioner, these angles measured 8 degrees and 4 degrees (maximum error, 29.8 degrees), respectively, and with computer assistance, 1.5 degrees and 2.5 degrees (maximum error, 8 degrees), respectively. Computer-assisted cup placement was an accurate and reproducible technique for total hip arthroplasty. It was more accurate than traditional methods of cup positioning.
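For readers unfamiliar with how such summary figures are obtained, the short sketch below (Python) computes a mean and maximum anteversion error from measured versus planned cup angles. The angle values are made up for illustration; this is not the study's analysis.

    # Planned target and hypothetical measured anteversion angles, in degrees.
    planned_anteversion = 15.0
    measured_anteversion = [18.0, 25.0, 12.0, 20.0, 14.0]

    errors = [abs(angle - planned_anteversion) for angle in measured_anteversion]
    mean_error = sum(errors) / len(errors)
    max_error = max(errors)
    print(f"mean anteversion error: {mean_error:.1f} deg, max: {max_error:.1f} deg")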

Relevance:

30.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from a measured blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should improve further, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
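As a minimal illustration of the a posteriori Bayesian adjustment these TDM programs automate, the sketch below (Python) estimates an individual clearance by maximum a posteriori (MAP) fitting of a one-compartment model to a single measured concentration. The drug model, all parameter values and the variable names are hypothetical; real tools rely on published population pharmacokinetic models and full error models.

    import math
    from scipy.optimize import minimize_scalar

    dose, volume = 500.0, 30.0            # mg, L (volume assumed known)
    t_obs, c_obs = 12.0, 4.0              # h, mg/L (hypothetical measured level)
    cl_pop, omega, sigma = 3.0, 0.3, 0.5  # L/h, between-subject SD (log scale), assay SD

    def predicted_conc(cl, t):
        """One-compartment IV bolus: C(t) = Dose/V * exp(-CL/V * t)."""
        return dose / volume * math.exp(-cl / volume * t)

    def map_objective(cl):
        """Penalised fit: prior on log(CL) plus residual on the observed level."""
        prior = ((math.log(cl) - math.log(cl_pop)) / omega) ** 2
        residual = ((c_obs - predicted_conc(cl, t_obs)) / sigma) ** 2
        return prior + residual

    cl_map = minimize_scalar(map_objective, bounds=(0.1, 20.0), method="bounded").x
    print(f"individual clearance (MAP): {cl_map:.2f} L/h")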

Relevance:

30.00%

Publisher:

Abstract:

Acetabular cup orientation is a key factor determining hip stability, and standard mechanical guides have been of little help in improving alignment. An in vitro study was carried out to compare the accuracy and precision of a new gravity-assisted guidance system with a standard mechanical guide. Three hundred ten cups were impacted by 5 surgeons, and the final cup orientation was measured. With the new guide, the average error in anteversion was 0.4 degrees, compared with 10.4 degrees with the standard guide; the corresponding errors in abduction were 0.3 degrees and -4.7 degrees, respectively. The average time required for orienting the cups was similar for both guides. The accuracy and reproducibility obtained with the new guide were better (P < .0001). These good results still require clinical validation.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Many patients with an implantable cardioverter-defibrillator (ICD) have indications for magnetic resonance imaging (MRI). However, MRI is generally contraindicated in ICD patients because of potential risks from hazardous interactions between the MRI and ICD system. OBJECTIVE: The purpose of this study was to use preclinical computer modeling, animal studies, and bench and scanner testing to demonstrate the safety of an ICD system developed for 1.5-T whole-body MRI. METHODS: MRI hazards were assessed and mitigated using multiple approaches: design decisions to increase safety and reliability, modeling and simulation to quantify clinical MRI exposure levels, animal studies to quantify the physiologic effects of MRI exposure, and bench testing to evaluate safety margin. RESULTS: Modeling estimated the incidence of a chronic change in pacing capture threshold >0.5V and 1.0V to be less than 1 in 160,000 and less than 1 in 1,000,000 cases, respectively. Modeling also estimated the incidence of unintended cardiac stimulation to occur in less than 1 in 1,000,000 cases. Animal studies demonstrated no delay in ventricular fibrillation detection and no reduction in ventricular fibrillation amplitude at clinical MRI exposure levels, even with multiple exposures. Bench and scanner testing demonstrated performance and safety against all other MRI-induced hazards. CONCLUSION: A preclinical strategy that includes comprehensive computer modeling, animal studies, and bench and scanner testing predicts that an ICD system developed for the magnetic resonance environment is safe and poses very low risks when exposed to 1.5-T normal operating mode whole-body MRI.

Relevance:

30.00%

Publisher:

Abstract:

Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
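For reference, an S-system has the power-law form dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij. The sketch below (Python) defines a two-variable example and generates the kind of time-course profile from which such parameters are estimated; the parameter values are invented and unrelated to the paper's 4- and 30-dimensional benchmark systems.

    import numpy as np
    from scipy.integrate import solve_ivp

    alpha = np.array([2.0, 1.0])          # production rate constants
    beta  = np.array([1.0, 1.0])          # degradation rate constants
    g = np.array([[0.0, -0.8],            # production kinetic orders g_ij
                  [0.5,  0.0]])
    h = np.array([[0.5,  0.0],            # degradation kinetic orders h_ij
                  [0.0,  0.5]])

    def s_system_rhs(t, x):
        """dX_i/dt = alpha_i * prod_j x_j**g_ij - beta_i * prod_j x_j**h_ij."""
        production  = alpha * np.prod(x ** g, axis=1)
        degradation = beta  * np.prod(x ** h, axis=1)
        return production - degradation

    t_eval = np.linspace(0.0, 10.0, 50)
    sol = solve_ivp(s_system_rhs, (0.0, 10.0), y0=[0.5, 1.5], t_eval=t_eval)
    print(sol.y[:, -1])                   # concentrations at t = 10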

Relevance:

30.00%

Publisher:

Abstract:

Four standard radiation qualities (from RQA 3 to RQA 9) were used to compare the imaging performance of a computed radiography (CR) system (general purpose and high resolution phosphor plates of a Kodak CR 9000 system), a selenium-based direct flat panel detector (Kodak Direct View DR 9000), and a conventional screen-film system (Kodak T-MAT L/RA film with a 3M Trimax Regular screen of speed 400) in conventional radiography. Reference exposure levels were chosen according to the manufacturer's recommendations to be representative of clinical practice (exposure index of 1700 for digital systems and a film optical density of 1.4). With the exception of the RQA 3 beam quality, the exposure levels needed to produce a mean digital signal of 1700 were higher than those needed to obtain a mean film optical density of 1.4. In spite of intense developments in the field of digital detectors, screen-film systems are still very efficient detectors for most of the beam qualities used in radiology. An important outcome of this study is the behavior of the detective quantum efficiency of the digital radiography (DR) system as a function of beam energy. The practice of users to increase beam energy when switching from a screen-film system to a CR system, in order to improve the compromise between patient dose and image quality, might not be appropriate when switching from screen-film to selenium-based DR systems.
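For context, the detective quantum efficiency mentioned above is commonly evaluated at each spatial frequency as DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident photon fluence and NNPS is the noise power spectrum normalised by the squared mean signal. The toy calculation below (Python) applies that relation with purely illustrative numbers, not measurements from this study.

    def dqe(mtf: float, nnps: float, q: float) -> float:
        """Detective quantum efficiency at a single spatial frequency."""
        return mtf ** 2 / (q * nnps)

    # Illustrative values: MTF of 0.6 at 1 cycle/mm, NNPS of 2.5e-6 mm^2,
    # incident fluence of 3e5 photons/mm^2.
    print(f"DQE(1 mm^-1) = {dqe(0.6, 2.5e-6, 3e5):.2f}")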

Relevance:

30.00%

Publisher:

Abstract:

The potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has, in recent years, been explored mainly in control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership functions (T2MFs) and the footprint of uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs) with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the chromosome structure has fewer genes than in other GA methods, and chromosome initialization is more precise. The proposed approach addresses the application of the interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung computer-aided detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
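As a concrete illustration of the construct being tuned, the sketch below (Python) evaluates a Gaussian interval type-2 membership function with an uncertain mean, returning the lower and upper membership grades whose gap forms the footprint of uncertainty at a given input. The parameter values are invented; in the paper they are learned from the training data by the GA and cross-validation procedure.

    import math

    def gaussian(x, m, sigma):
        return math.exp(-((x - m) ** 2) / (2 * sigma ** 2))

    def it2mf(x, m1, m2, sigma):
        """Gaussian IT2MF with uncertain mean in [m1, m2] and fixed sigma."""
        # Upper MF: plateau of 1 between the two means, Gaussian shoulders outside.
        if x < m1:
            upper = gaussian(x, m1, sigma)
        elif x > m2:
            upper = gaussian(x, m2, sigma)
        else:
            upper = 1.0
        # Lower MF: the smaller of the two shifted Gaussians.
        lower = min(gaussian(x, m1, sigma), gaussian(x, m2, sigma))
        return lower, upper

    print(it2mf(x=0.4, m1=0.3, m2=0.6, sigma=0.15))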

Relevance:

30.00%

Publisher:

Abstract:

Understanding the evolution of intraspecific variance is a major research question in evolutionary biology. While its importance for processes operating at the individual and population levels is well documented, much less is known about its role in macroevolutionary patterns. Nevertheless, both experimental and theoretical evidence suggests that intraspecific variance is susceptible to selection, can transform into interspecific variation and is therefore crucial for macroevolutionary processes. The main objectives of this thesis were: (1) to investigate which factors impact the evolution of intraspecific variation in Polygonaceae and determine whether the evolution of intraspecific variation influences species diversification; and (2) to develop a novel comparative phylogenetic method to model the evolution of intraspecific variation. Using the buckwheat family, Polygonaceae, as a study system, I demonstrated which life-history and ecological traits are relevant to the evolution of intraspecific variation. I analyzed how differential intraspecific variation drives species diversification patterns. I showed with computer simulations the shortcomings of existing comparative methods with respect to intraspecific variation. I developed a novel comparative model that readily incorporates intraspecific variance into phylogenetic comparative methods. The results obtained are complementary, because they affect both the empirical and the methodological aspects of comparative analysis. Overall, I highlight that intraspecific variation is an important contributor to macroevolutionary patterns and should be explicitly considered in comparative phylogenetic analyses.
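One standard way intraspecific variance can be folded into a phylogenetic comparative analysis, and a rough analogue of the kind of model discussed above, is to add within-species sampling variance to the diagonal of the Brownian-motion covariance matrix before a generalized least squares fit. The sketch below (Python) illustrates that idea with an invented three-taxon tree and trait values; it is not the specific model developed in the thesis.

    import numpy as np

    C = np.array([[1.0, 0.6, 0.2],     # shared-path covariances from a 3-taxon tree
                  [0.6, 1.0, 0.2],
                  [0.2, 0.2, 1.0]])
    species_means = np.array([2.1, 2.4, 3.0])
    within_var    = np.array([0.05, 0.20, 0.10])   # intraspecific variance / sample size

    sigma2 = 0.5                                   # Brownian rate (assumed)
    V = sigma2 * C + np.diag(within_var)           # total expected covariance

    # Generalized least squares estimate of the root (ancestral) state.
    ones = np.ones(3)
    Vinv = np.linalg.inv(V)
    root_state = (ones @ Vinv @ species_means) / (ones @ Vinv @ ones)
    print(f"GLS estimate of root state: {root_state:.3f}")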

Relevance:

30.00%

Publisher:

Abstract:

Among the numerous magnetic resonance imaging (MRI) techniques, perfusion MRI provides insight into the passage of blood through the brain's vascular network non-invasively. Studying disease models and transgenic mice would intrinsically help in understanding the underlying brain functions, cerebrovascular disease and brain disorders. This study evaluates the feasibility of performing continuous arterial spin labeling (CASL) on all cranial arteries for mapping murine cerebral blood flow at 9.4 T. We showed that, with an actively detuned two-coil system, a labeling efficiency of 0.82 ± 0.03 was achieved with minimal magnetization transfer residuals in the brain. The resulting cerebral blood flow of the healthy mouse was 99 ± 26 mL/100g/min, in excellent agreement with other techniques. In conclusion, high magnetic fields deliver high sensitivity, allowing not only CASL but also other MR techniques, e.g. (1)H MRS and diffusion MRI, to be used in studying murine brains.
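For orientation, a simplified steady-state CASL quantification (ignoring transit-delay corrections) computes CBF = lambda * (M_control - M_label) / (2 * alpha * T1app * M_control). The sketch below (Python) applies this relation with illustrative values; only the labeling efficiency of 0.82 comes from the abstract, the other numbers are assumptions.

    lam   = 0.9        # blood-brain partition coefficient, mL/g
    alpha = 0.82       # labeling efficiency (reported in the abstract)
    t1app = 1.9        # apparent tissue T1 at high field, s (assumed)
    dM_over_M0 = 0.05  # relative control-minus-label signal difference (assumed)

    cbf = lam * dM_over_M0 / (2 * alpha * t1app)   # mL/g/s
    cbf_per_100g_min = cbf * 6000                  # convert to mL/100 g/min
    print(f"CBF ~ {cbf_per_100g_min:.0f} mL/100g/min")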

Relevance:

30.00%

Publisher:

Abstract:

With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and fastidious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples in four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurately labelling and sorting each inoculated medium. The challenge for clinical bacteriologists is to determine the ideal automated system for their own laboratory. Indeed, different solutions will be preferred according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is difficult, because audits proposed by manufacturers risk being biased towards the solution proposed by their own company, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article thus summarizes the main parameters that need to be taken into account when choosing the optimal system, and provides some clues to help clinical bacteriologists make their choice.