91 results for "internet computing"


Relevance: 20.00%

Abstract:

Many individuals with unhealthy alcohol use have little or no contact with the health care system and are therefore unlikely to receive information or a brief intervention from a health care professional. Consequently, many Internet-based interventions, which can reach a large population, have been developed. We present in this report www.alcooquizz.ch, a website providing tailored feedback and information on alcohol use and its consequences. In six and a half months, more than 15,000 individuals visited the website. It appropriately targets individuals with unhealthy alcohol use, and user satisfaction was high. The Internet is a valuable option for providing health-related information and secondary prevention interventions for unhealthy alcohol use.

Relevance: 20.00%

Abstract:

This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find an optimal value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimal memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches using a fixed number of jobs or a fixed memory threshold.
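The adaptive threshold idea above can be illustrated with a minimal fuzzy controller. This is a hedged sketch, not the paper's actual algorithm: the membership functions, rule weights and 1 GB step size are illustrative assumptions.

```python
# Minimal sketch of a fuzzy-logic memory-threshold controller.
# Membership functions and rules are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adapt_threshold(threshold_gb, mem_load):
    """Return a new memory threshold (GB) given overall load in [0, 1]."""
    low = tri(mem_load, -0.01, 0.0, 0.5)    # load is low
    ok = tri(mem_load, 0.3, 0.6, 0.9)       # load is moderate
    high = tri(mem_load, 0.7, 1.0, 1.01)    # load is high
    # Rules: low load -> raise threshold (+1 GB), moderate -> keep,
    # high load -> lower threshold (-1 GB); defuzzify by weighted average.
    delta = (low * 1.0 + ok * 0.0 + high * -1.0) / (low + ok + high)
    return max(1.0, threshold_gb + delta)
```

A fully loaded node (load 1.0) would lower an 8 GB threshold to 7 GB, so fewer new jobs are admitted; a lightly loaded node raises it, which is the behaviour the abstract describes for keeping memory consumption stable.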

Relevance: 20.00%

Abstract:

Motivation: Genome-wide association studies have become widely used tools to study the effects of genetic variants on complex diseases. While it is of great interest to extend existing analysis methods by considering interaction effects between pairs of loci, the large number of possible tests presents a significant computational challenge. The number of computations is further multiplied in the study of gene expression quantitative trait mapping, in which tests are performed for thousands of gene phenotypes simultaneously. Results: We present FastEpistasis, an efficient parallel solution extending the PLINK epistasis module, designed to test for epistasis effects when analyzing continuous phenotypes. Our results show that the algorithm scales with the number of processors and offers a reduction in computation time when several phenotypes are analyzed simultaneously. FastEpistasis is capable of testing the association of a continuous trait with all single nucleotide polymorphism (SNP) pairs from 500,000 SNPs, totaling 125 billion tests, in a population of 5,000 individuals in 29, 4 or 0.5 days using 8, 64 or 512 processors respectively.
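The 125 billion figure follows directly from the pairwise combinatorics, which can be checked in a couple of lines:

```python
# Worked check of the abstract's arithmetic: the number of unordered
# SNP pairs from n SNPs is n * (n - 1) / 2.
def snp_pairs(n):
    """Number of distinct SNP pairs to test for epistasis."""
    return n * (n - 1) // 2

pairs = snp_pairs(500_000)
# 124,999,750,000 pairs, i.e. roughly the 125 billion tests quoted above.
```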

Relevance: 20.00%

Abstract:

There has been relatively little change over recent decades in the methods used in research on self-reported delinquency. Face-to-face interviews and self-administered interviews in the classroom are still the predominant alternatives envisaged. Recent computer technology, the Internet, and the increasing availability of computer equipment and Internet access in schools have brought new methods into the picture. In the autumn of 2004, a controlled experiment was conducted with 1,203 students in Lausanne (Switzerland), in which "paper-and-pencil" questionnaires were compared with computer-assisted interviews over the Internet. The experiment included a test of two different definitions of the (same) reference period. After the introductory question ("Did you ever..."), students were asked how many times they had done it (or experienced it), if ever, "over the last 12 months" or "since the October 2003 vacation". Few significant differences were found between the results obtained by the two methods and for the two definitions of the reference period in the answers concerning victimisation, self-reported delinquency, drug use, and failure to respond (missing data). Students were found to be more motivated to respond through the Internet, to take less time filling out the questionnaire, and to be apparently more confident of privacy, while school principals were less reluctant to allow classes to be interviewed through the Internet. The Internet method also involves considerable cost reductions, which is a critical advantage if self-reported delinquency surveys are to become a routinely applied method of evaluation, particularly in countries with limited resources. On balance, the Internet may be instrumental in making research on self-reported delinquency far more feasible in situations where limited resources have so far prevented its implementation.

Relevance: 20.00%

Abstract:

BACKGROUND: Knowledge of normal heart weight ranges is important information for pathologists. Comparing the measured heart weight to reference values is one of the key elements used to determine whether the heart is pathological, as heart weight increases in many cardiac pathologies. The current reference tables are old and in need of an update. AIMS: The purposes of this study are to establish new reference tables for normal heart weights in the local population and to determine the best predictive factor for normal heart weight. We also aim to provide technical support for calculating the predicted normal heart weight. METHODS: The reference values are based on a retrospective analysis of adult Caucasian autopsy cases without any obvious pathology that were collected at the University Centre of Legal Medicine in Lausanne from 2007 to 2011. We selected 288 cases. The mean age was 39.2 years. There were 118 men and 170 women. Regression analyses were performed to assess the relationship of heart weight to body weight, body height, body mass index (BMI) and body surface area (BSA). RESULTS: Heart weight increased along with an increase in all the parameters studied. The mean heart weight was greater in men than in women at a similar body weight. BSA was determined to be the best predictor of normal heart weight. New reference tables for predicted heart weights are presented as a web application that enables comparison of heart weights observed at autopsy with the reference values. CONCLUSIONS: Reference tables for heart weight and other organs should be systematically updated and adapted to the local population. Web access and smartphone applications for the predicted heart weight represent important investigational tools.
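A sketch of the kind of calculation such a web application might perform is shown below. The Du Bois formula for body surface area is a standard one; the linear-model coefficients, however, are hypothetical placeholders, since the study's fitted regression is not given in the abstract.

```python
# Hedged sketch: predicting normal heart weight from BSA.
# The Du Bois BSA formula is standard; the regression coefficients
# below are hypothetical placeholders, not the study's fitted values.

def body_surface_area(height_cm, weight_kg):
    """Du Bois BSA estimate in m^2."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def predicted_heart_weight(height_cm, weight_kg, slope=170.0, intercept=25.0):
    """Hypothetical linear model: heart weight (g) predicted from BSA."""
    return slope * body_surface_area(height_cm, weight_kg) + intercept
```

In use, a pathologist would compare the measured heart weight at autopsy against the predicted value (plus a tolerance band) for the subject's height and weight.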

Relevance: 20.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications owing to their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware.
They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 20.00%

Abstract:

Introduction: The Pharmaceutical Assistance unit of the HUG Pharmacy functions as a drug information centre and manages information made available on the web. This information is intended primarily for HUG health care staff and is accessible on the intranet/Internet site (http://www.hcuge.ch/Pharmacie), launched in 1998. The objective of this work was to evaluate the quality of the information on the intranet/Internet site and to make the necessary improvements. Methods: The HUG pharmacy intranet/Internet site was evaluated in autumn 2004 using two tools. NetScoring: a grid for evaluating the quality of health information on the Internet (http://www.chu-rouen.fr/netscoring/). It comprises 49 criteria divided into 8 categories. Each criterion is rated on a 5-point scale and then weighted according to its importance (multiplied by 3 if the criterion is essential, by 2 if it is important, or by 1 if it is minor). AMDEC analysis: a failure modes, effects and criticality analysis method for breaking down a process and analysing its failure modes, their effects and their criticality (Qual Saf Health Care 2005:14(2);93-98). A score is assigned to each identified failure mode in terms of frequency, severity and detectability. Multiplying the 3 scores yields an overall criticality result (criticality index IC, max. 810), which allows the risks to be ranked. Results: NetScoring assessment: the overall quality of the intranet/Internet site was good (202 points out of 312). The strengths concerned the relevance and usefulness of the site, the quality of the content, search engine and design, the loading speed of the site, the selection of external links offered, and respect for medical confidentiality.
The weaknesses lay in the absence of a regular update policy, of systematic annotation of the update status of documents, of an editorial and scientific board, of keywords in English, and of a list identifying the authors. AMDEC analysis: Four categories (document creation, conversion, site structure and document publication) and 19 failure modes were characterised. Three failure modes were associated with a high IC: errors during document creation (IC 256), inadequate information because a practice was not validated or a recommendation was not generalisable (IC 147), and the absence of proofreading after conversion of the document into a publishable format (e.g. PDF) (IC 144). Corrective measures: A standard operating procedure (SOP) was drawn up for managing the intranet/Internet site. It clearly defines the standard format of the information (author's initials, creation and update dates, pharmacy logo), the validation and update policy for documents, and the archiving procedure. A tracking sheet accompanying each document was created to ensure traceability of all modifications made and compliance with the required revision frequency. Discussion and conclusion: This study made it possible to identify and quantify the critical points to be improved on the HUG Pharmacy intranet/Internet site. The corrective measures undertaken should address the main weaknesses and failures highlighted. The establishment of an editorial and scientific board will have to be evaluated in the future. NetScoring and AMDEC analysis are useful tools for evaluating and continuously improving the quality of a website, provided the results are interpreted critically before corrective measures are put in place. Despite completely different approaches, these tools revealed similar shortcomings.
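The AMDEC criticality calculation described above is a simple product of three scores. The sketch below illustrates it; the individual frequency/severity/detectability values are invented for illustration, since the abstract reports only the resulting IC values.

```python
# Hedged sketch of the AMDEC criticality index: IC = frequency x
# severity x detectability, used to rank failure modes. The score
# decompositions below are illustrative assumptions; the abstract
# gives only the products (e.g. IC 256, IC 144).

def criticality(frequency, severity, detectability):
    """Criticality index IC for one failure mode."""
    return frequency * severity * detectability

modes = {
    "document creation error": criticality(8, 8, 4),   # IC 256
    "missing proofreading": criticality(6, 4, 6),      # IC 144
}
# Rank failure modes by IC, highest risk first.
ranked = sorted(modes, key=modes.get, reverse=True)
```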

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess whether problematic internet use is associated with somatic complaints and whether this association remains when controlling for internet activity, among a random sample of adolescents living in the canton of Vaud, Switzerland. METHODS: Cross-sectional survey of 3,067 8th graders (50.3% females) divided into average (n = 2,708) and problematic (n = 359) internet users and compared for somatic complaints (backache, overweight, headaches, musculoskeletal pain, sleep problems and sight problems), controlling for sociodemographic and internet-related variables. Logistic regressions were performed for each complaint and for all of them simultaneously, controlling for the variables significant at the bivariate level. RESULTS: At the multivariate level, when taken separately, problematic internet users were more likely to have a chronic condition (adjusted odds ratio [aOR] with 95% CI: 1.58 [1.11:2.23]) and to report back pain (aOR: 1.46 [1.04:2.05]), overweight (aOR: 1.74 [1.03:2.93]), musculoskeletal pain (aOR: 1.36 [1.00:1.84]) and sleep problems (aOR: 2.16 [1.62:2.88]). When considered in the full model, only sleep problems remained significant (aOR: 2.03 [1.50:2.74]). CONCLUSIONS: Our results confirm that problematic internet users report health problems more frequently, with lack of sleep being the most strongly associated complaint and seemingly acting as a mediator for the others. Clinicians should remember to screen for excessive internet use in patients complaining of sleep-related problems, back or musculoskeletal pain, or overweight. Clinicians should advise parents to limit the amount of time their adolescent children can spend online on leisure activities. Furthermore, limiting the number of devices used to connect to the internet could help ensure enough sleep time.
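The adjusted odds ratios above come from logistic regression, where the aOR for a binary predictor is the exponential of its fitted coefficient. A minimal sketch of that relationship, using a coefficient backed out from the reported sleep-problems aOR rather than the study's data:

```python
# Hedged sketch: how an adjusted odds ratio relates to a logistic
# regression coefficient (aOR = exp(beta) for a binary predictor).
# beta_sleep is derived from the reported aOR, not from the study's data.
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit change in a logistic regression predictor."""
    return math.exp(beta)

beta_sleep = math.log(2.16)  # back out the coefficient from aOR = 2.16
# odds_ratio(beta_sleep) recovers 2.16: problematic users have roughly
# twice the odds of reporting sleep problems.
```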

Relevance: 20.00%

Abstract:

The goal of this study was to investigate the impact of computing parameters and of the location of volumes of interest (VOIs) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement, namely the sampling distances b_x,y,z, the VOI lengths L_x,y,z, the number of VOIs N_VOI and the structured noise, was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (L_x,y,z = 64, b_x,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located at the center of the volume or from small VOIs located on a concentric circle. This shows that VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
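The estimation procedure described above can be sketched as follows. This is a hedged illustration of the standard 3D NPS estimator, not the authors' exact implementation: simple mean subtraction stands in for their first-order detrending, and the array shapes and voxel sizes are illustrative.

```python
# Hedged sketch of a 3D noise power spectrum estimate: detrend each VOI
# (mean subtraction stands in for first-order detrending), take the 3D
# DFT, and average the squared magnitude over VOIs, scaled by the voxel
# volume over the number of voxels.
import numpy as np

def nps_3d(vois, voxel_size):
    """vois: array of shape (n_voi, Lx, Ly, Lz);
    voxel_size: sampling distances (bx, by, bz) in mm.
    Returns the 3D NPS estimate (units of mm^3 times signal^2)."""
    n_voi, lx, ly, lz = vois.shape
    bx, by, bz = voxel_size
    spectra = np.zeros((lx, ly, lz))
    for voi in vois:
        detrended = voi - voi.mean()   # crude stand-in for detrending
        spectra += np.abs(np.fft.fftn(detrended)) ** 2
    return (bx * by * bz) / (lx * ly * lz) * spectra / n_voi
```

By Parseval's theorem, summing the resulting NPS over all frequency bins and dividing by the spanned volume recovers the noise variance, which is a useful sanity check on the normalisation.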