881 results for computer systems


Relevance:

60.00%

Publisher:

Abstract:

This report describes the results of a study evaluating stringless paving using a combination of global positioning and laser technologies. CMI and Geologic Computer Systems developed this technology and successfully implemented it on construction earthmoving and grading projects; concrete paving is a new application area for it. Fred Carlson Co. agreed to test the stringless paving technology on two challenging concrete paving projects in Washington County, Iowa, during the summer of 2003. The research team from Iowa State University monitored guidance and elevation conformance to the original design, employing a combination of physical depth checks, surface location and elevation surveys, concrete yield checks, and a physical survey of the control stakes and string line elevations. A final check on the profile of the pavement surface was performed with the Iowa Department of Transportation Light Weight Surface Analyzer (LISA). Because of the speed of paving and the rapid changes in terrain, the laser technology was abandoned for this project, and full control of the guidance and elevation of the slip-form paver was moved from string line to global positioning systems (GPS). The evaluation was a success, and the results indicate that GPS control is feasible and approaching the desired goals of guidance and profile control with the use of three-dimensional design models. Further enhancements are needed in the physical features of the slip-form paver oil system controls and in the computer program for controlling elevation.
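The switch from string line to GPS control relies on a 3D design model that can return a design elevation for any machine position. As a minimal sketch of that lookup, assuming a regular grid of design elevations (the report summary does not describe the actual CMI/Geologic data structures):

```python
# Illustrative sketch (not the CMI/Geologic implementation): look up the
# design elevation for a GPS position by bilinear interpolation over a
# regular grid of design elevations, as a 3D design model might supply it.

def design_elevation(grid, cell, x, y):
    """Bilinearly interpolate the elevation at (x, y) from a regular grid.

    grid -- 2D list of elevations, grid[row][col], spaced `cell` apart
    cell -- grid spacing in the same units as x and y
    """
    col, row = x / cell, y / cell
    c0, r0 = int(col), int(row)
    fx, fy = col - c0, row - r0
    z00, z01 = grid[r0][c0], grid[r0][c0 + 1]
    z10, z11 = grid[r0 + 1][c0], grid[r0 + 1][c0 + 1]
    top = z00 * (1 - fx) + z01 * fx        # interpolate along one row
    bot = z10 * (1 - fx) + z11 * fx        # and along the next row
    return top * (1 - fy) + bot * fy       # then between the rows

grid = [[100.0, 100.2],
        [100.4, 100.6]]                    # elevations on a 10 m grid
print(design_elevation(grid, 10.0, 5.0, 5.0))   # centre of the cell
```

In a real stringless system this lookup would run continuously against the machine's GPS position to drive the paver's elevation controls.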

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Maintaining therapeutic concentrations of drugs with a narrow therapeutic window is a complex task. Several computer systems have been designed to help doctors determine the optimum drug dosage. Significant improvements in health care could be achieved if computer advice improved health outcomes and could be implemented in routine practice in a cost-effective fashion. This is an updated version of an earlier Cochrane systematic review by Walton et al., published in 2001. OBJECTIVES: To assess whether computerised advice on drug dosage has beneficial effects on the process or outcome of health care. SEARCH STRATEGY: We searched the Cochrane Effective Practice and Organisation of Care Group specialised register (June 1996 to December 2006), MEDLINE (1966 to December 2006), and EMBASE (1980 to December 2006), hand-searched the journal Therapeutic Drug Monitoring (1979 to March 2007) and the Journal of the American Medical Informatics Association (1996 to March 2007), and checked reference lists from primary articles. SELECTION CRITERIA: Randomized controlled trials, controlled trials, controlled before-and-after studies, and interrupted time series analyses of computerized advice on drug dosage were included. The participants were health professionals responsible for patient care. The outcomes were: any objectively measured change in the behaviour of the health care provider (such as changes in the dose of drug used), and any change in the health of patients resulting from computerized advice (such as adverse reactions to drugs). DATA COLLECTION AND ANALYSIS: Two reviewers independently extracted data and assessed study quality. MAIN RESULTS: Twenty-six comparisons (23 articles) were included (compared with fifteen comparisons in the original review), covering a wide range of drugs in inpatient and outpatient settings. Interventions usually targeted doctors, although some studies attempted to influence prescriptions by pharmacists and nurses.
Although all studies used reliable outcome measures, their quality was generally low. Computerized advice on drug dosage gave significant benefits by:
1. increasing the initial dose (standardised mean difference 1.12, 95% CI 0.33 to 1.92);
2. increasing serum concentrations (standardised mean difference 1.12, 95% CI 0.43 to 1.82);
3. reducing the time to therapeutic stabilisation (standardised mean difference -0.55, 95% CI -1.03 to -0.08);
4. reducing the risk of toxic drug levels (rate ratio 0.45, 95% CI 0.30 to 0.70);
5. reducing the length of hospital stay (standardised mean difference -0.35, 95% CI -0.52 to -0.17).
AUTHORS' CONCLUSIONS: This review suggests that computerized advice on drug dosage has some benefits: it increased the initial dose of the drug, increased serum drug concentrations, and led to more rapid therapeutic control. It also reduced the risk of toxic drug levels and the length of time spent in the hospital. However, it had no effect on adverse reactions. In addition, there was no evidence to suggest that particular decision-support technical features (such as integration into a computerized physician order entry system) or aspects of the organization of care (such as the setting) could optimise the effect of computerised advice.
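For readers unfamiliar with the effect measure quoted above, a standardised mean difference and its approximate large-sample 95% confidence interval can be computed as follows. The numbers here are invented for illustration and are not data from the review:

```python
import math

# Illustrative sketch with made-up numbers (not the review's data): a
# standardised mean difference (Cohen's d) and its approximate 95% CI,
# the effect measure quoted throughout the review's results.

def smd_with_ci(m1, sd1, n1, m2, sd2, n2):
    # pooled standard deviation of the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # large-sample standard error of d
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, d - 1.96 * se, d + 1.96 * se

# e.g. a computer-advised group vs a control group for the initial dose
d, lo, hi = smd_with_ci(m1=12.0, sd1=3.0, n1=40, m2=9.0, sd2=3.0, n2=40)
print(f"SMD = {d:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```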

Relevance:

60.00%

Publisher:

Abstract:

This report describes the development of an autonomous computing system able to capture data from its surroundings and communicate it, via an open data-interchange protocol, to a receiving system for subsequent analysis.
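As an illustrative sketch of the pattern described, assuming JSON as the open interchange format and hypothetical field names (the summary does not specify either):

```python
import json

# Hypothetical sketch of the capture-and-communicate pattern: an autonomous
# node packages environment readings into an open interchange format (JSON
# here) for a receiving system to decode and analyse later. Field names
# are invented for illustration.

def make_payload(node_id, readings):
    return json.dumps({
        "node": node_id,
        "timestamp": 1700000000,   # fixed here; a real node would use a clock
        "readings": readings,
    }, sort_keys=True)

payload = make_payload("node-01", {"temperature_c": 21.5, "humidity_pct": 48})
print(payload)
# the receiving system can recover the readings with json.loads(payload)
```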

Relevance:

60.00%

Publisher:

Abstract:

The ATP-binding cassette (ABC) family of proteins comprises a group of membrane transporters involved in the transport of a wide variety of compounds, such as xenobiotics, vitamins, lipids, amino acids, and carbohydrates. Determining their regional expression patterns along the intestinal tract will further characterize their transport functions in the gut. The mRNA expression levels of murine ABC transporters in the duodenum, jejunum, ileum, and colon were examined using the Affymetrix MuU74v2 GeneChip set. Eight ABC transporters (Abcb2, Abcb3, Abcb9, Abcc3, Abcc6, Abcd1, Abcg5, and Abcg8) displayed significant differential gene expression along the intestinal tract, as determined by two statistical models (a global error assessment model and a classic ANOVA, both with P < 0.01). Concordance with semiquantitative real-time PCR was high. Analyzing the promoters of the differentially expressed ABC transporters did not identify common transcriptional motifs between family members or with other genes; however, the expression profile for Abcb9 was highly correlated with fibulin-1, and both genes share a common complex promoter model involving the NFkappaB, zinc binding protein factor (ZBPF), GC-box factors SP1/GC (SP1F), and early growth response factor (EGRF) transcription binding motifs. The cellular location of another of the differentially expressed ABC transporters, Abcc3, was examined by immunohistochemistry. Staining revealed that the protein is consistently expressed in the basolateral compartment of enterocytes along the anterior-posterior axis of the intestine. Furthermore, the intensity of the staining pattern is concordant with the expression profile. This agrees with previous findings in which the mRNA, protein, and transport function of Abcc3 were increased in the rat distal intestine.
These data reveal regional differences in gene expression profiles along the intestinal tract and demonstrate that a complete understanding of intestinal ABC transporter function can only be achieved by examining the physiologically distinct regions of the gut.
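The "classic ANOVA" screen mentioned above can be sketched as follows, using invented expression values for a transporter that, like Abcc3, rises toward the distal intestine:

```python
# Sketch of a one-way ANOVA screen for regional differential expression,
# on made-up expression values across the four gut regions (not data
# from the study).

def one_way_anova_F(groups):
    """Return the F statistic for a one-way ANOVA over lists of values."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

duodenum = [1.0, 1.2, 0.9]
jejunum  = [1.1, 1.0, 1.2]
ileum    = [2.8, 3.1, 2.9]
colon    = [3.5, 3.6, 3.4]   # expression rising distally
print(one_way_anova_F([duodenum, jejunum, ileum, colon]))
```

A large F statistic (compared against the F distribution with k-1 and n-k degrees of freedom) corresponds to a small P value, i.e. a regionally differential expression profile.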

Relevance:

60.00%

Publisher:

Abstract:

The study aims to determine how accessible the websites of the Catalan universities are. The indicators used are those of priority level 1 of the Web Content Accessibility Guidelines, version 1.0 (WCAG), of the World Wide Web Consortium, together with some complementary ones. The main finding is that only one of the 43 pages analysed meets the accessibility thresholds.
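One of the priority-1 requirements behind such an audit, text equivalents for images (WCAG 1.0 checkpoint 1.1), can be sketched as an automated check. This is only a fragment of a real audit, which combines many checkpoints and manual review:

```python
from html.parser import HTMLParser

# Sketch of one priority-1 WCAG 1.0 check (checkpoint 1.1, text
# equivalents): count <img> tags that lack an alt attribute.

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

checker = MissingAltChecker()
checker.feed('<p><img src="logo.png" alt="University logo">'
             '<img src="banner.png"></p>')
print(checker.missing)  # images without a text equivalent
```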

Relevance:

60.00%

Publisher:

Abstract:

The query interface of a web database establishes the communication between people seeking information and information retrieval systems, and it is one of the most important parts of the conceptual design of a database. The query interface consists of a set of pages, among which the following stand out: the query page, the results page, the full-document display, general information, and help. The aim of this text is to determine the basic elements that should be present on each of these pages in order to make the information retrieval process easier for users.

Relevance:

60.00%

Publisher:

Abstract:

This article presents the results of research carried out within the European DECIMAL project, whose objective is the development of an integrated decision-support module for automated systems used in small and medium-sized libraries. The quantitative and qualitative research conducted in the United Kingdom, Italy, and Spain combined several methods: a literature review, semi-structured interviews, questionnaires, and focus groups held at the two presentation seminars. Two basic lines of research are distinguished: the first concerns the actual use of indicators and measures for managing and evaluating the centre's activity, as well as their potential interest where they have not yet been applied; the second concerns the most common types of decisions in the centres and the factors that influence this process (information sources used, institutional culture, training, level of satisfaction). The article focuses on the results obtained in Spanish libraries, although the overall results are also mentioned for comparison. The study concluded with a specification of user needs, on the basis of which the decision-support module was designed. The project ended with a prototype evaluation phase that involved the development of four successive versions of the module in order to resolve the problems encountered during the evaluation process.

Relevance:

60.00%

Publisher:

Abstract:

In the past decade, a number of trends in the general sphere of computing have come together that have profoundly affected libraries. The popularisation of the Internet, the appearance of open and interoperable systems, improvements in graphics and multimedia, and the generalised installation of LANs are some of the events of the period. Taken together, these changes mean that libraries have undergone an important functional shift, from simple information depositories to information disseminators. Integrated library management systems have not remained unaffected by this transformation, and those that have not adapted to the new technological surroundings are now referred to as legacy systems. The article describes the characteristics of the systems on today's market and outlines future trends that, according to various authors, include the disappearance of the integrated library management system as it has traditionally been sold.

Relevance:

60.00%

Publisher:

Abstract:

The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing, navigator-gated 3D coronary magnetic resonance angiography (MRA), including real-time motion correction, was investigated in a moving phantom. The objective image quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that a short navigator evaluation time is of crucial importance for improved image quality, whereas navigator spatial resolution showed minimal influence on image quality.
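As a rough illustration of the objective parameters compared, here are common working definitions of SNR and vessel sharpness on synthetic pixel values; the study's exact definitions may differ:

```python
import statistics

# Common working definitions (assumptions, not necessarily the study's
# exact ones): SNR as mean signal over the standard deviation of a
# background noise region, and vessel sharpness from the steepest edge
# of an intensity profile drawn across the vessel.

def snr(signal_roi, noise_roi):
    return statistics.mean(signal_roi) / statistics.stdev(noise_roi)

def sharpness(profile):
    """Maximum adjacent-pixel gradient, normalised by the profile range."""
    max_grad = max(abs(b - a) for a, b in zip(profile, profile[1:]))
    return max_grad / (max(profile) - min(profile))

vessel  = [210, 205, 215, 208]                      # pixels inside the vessel
noise   = [4, 6, 5, 7, 4, 6]                        # background pixels
profile = [10, 12, 80, 200, 205, 198, 90, 15, 11]   # across-vessel intensities
print(round(snr(vessel, noise), 1), round(sharpness(profile), 2))
```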

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: Before a patient can be connected to a mechanical ventilator, the controls of the apparatus need to be set up appropriately. Today this is done by the intensive care professional. With the advent of closed-loop controlled mechanical ventilation, methods will be needed to select appropriate start-up settings automatically. The objective of our study was to test such a computerized method, which could eventually be used as a start-up procedure (first 5-10 minutes of ventilation) for closed-loop controlled ventilation. DESIGN: Prospective study. SETTINGS: ICUs in two adult hospitals and one children's hospital. PATIENTS: 25 critically ill adult patients (age ≥ 15 y) and 17 critically ill children, selected at random, were studied. INTERVENTIONS: To simulate 'initial connection', the patients were disconnected from their ventilator and transiently connected to a modified Hamilton AMADEUS ventilator for at most one minute. During that time they were ventilated with a fixed, standardized breath pattern (Test Breaths) based on pressure-controlled synchronized intermittent mandatory ventilation (PCSIMV). MEASUREMENTS AND MAIN RESULTS: Airway flow, airway pressure, and instantaneous CO2 concentration (using a mainstream CO2 analyzer) were measured at the mouth during application of the Test Breaths. The Test Breaths were analyzed in terms of tidal volume, expiratory time constant, and series dead space. From these data an initial ventilation pattern, consisting of respiratory frequency and tidal volume, was calculated. This ventilation pattern was compared to the one measured prior to the onset of the study using a two-tailed paired t-test, and additionally to a conventional method for setting up ventilators. The computer-proposed ventilation pattern did not differ significantly from the actual pattern (p > 0.05), while the conventional method did.
However, the scatter was large, and in six cases deviations in minute ventilation of more than 50% were observed. CONCLUSIONS: The analysis of standardized Test Breaths allows automatic determination of an initial ventilation pattern for intubated ICU patients. While this pattern does not appear to be superior to the one chosen by the conventional method, it is derived fully automatically and without the need for manual entry of patient data such as weight or height. This makes the method potentially useful as a start-up procedure for closed-loop controlled ventilation.
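The idea of deriving a start-up pattern from Test Breath measurements can be sketched with textbook relations: pick a frequency that achieves a target alveolar minute ventilation given the measured dead space, and cap it so expiration lasts a few time constants. This is an illustration of the concept only; the formulas, defaults, and numbers are assumptions, not the authors' algorithm:

```python
# Hedged sketch (textbook relations, not the authors' exact method):
# derive a starting frequency and tidal volume from quantities the Test
# Breaths yield -- series dead space VD (L), expiratory time constant
# tau (s) -- plus an assumed target alveolar minute ventilation VA (L/min).

def initial_pattern(va_target_l_min, vd_l, tau_s, vt_l=0.5):
    # frequency needed for the target alveolar ventilation: VA = f * (VT - VD)
    f = va_target_l_min / (vt_l - vd_l)
    # cap frequency so expiration lasts at least ~3 time constants,
    # assuming expiration gets roughly two thirds of each cycle
    f_max = 60.0 * (2 / 3) / (3 * tau_s)
    if f > f_max:
        f = f_max
        vt_l = vd_l + va_target_l_min / f   # enlarge VT to compensate
    return round(f, 1), round(vt_l, 3)

# e.g. target VA 4.2 L/min, measured VD 0.15 L, measured tau 0.8 s
print(initial_pattern(va_target_l_min=4.2, vd_l=0.15, tau_s=0.8))
```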

Relevance:

60.00%

Publisher:

Abstract:

Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations, and it is the focus of attention for research in fields as diverse as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system, and multiple studies suggest that observers use face information to infer personality characteristics. Interactive computer systems are trying to take advantage of these findings and apply them to make interaction more natural and to improve system performance. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g. dominance) can be made using the full appearance information of the face, and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information, and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict the perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of the perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable prediction of facial trait judgments is obtained by certain types of holistic descriptions of the face appearance; and (c) for some traits, such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.
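The structural route can be illustrated with a toy least-squares fit from one geometric relation among salient points to an observer rating. The feature and all numbers below are hypothetical, chosen only to show the shape of the approach:

```python
# Toy sketch of the structural approach (synthetic numbers, hypothetical
# feature): fit a least-squares line from one geometric relation among
# facial salient points -- say, a normalised eyebrow-to-eye distance --
# to a perceived-dominance rating, then predict the rating for a new face.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# feature value per training face, and its mean observer rating (1-7 scale)
feature = [0.10, 0.12, 0.15, 0.18, 0.20]
rating  = [5.1, 4.8, 4.0, 3.3, 3.0]
slope, intercept = fit_line(feature, rating)
print(round(slope * 0.16 + intercept, 2))   # predicted rating for a new face
```

The paper's actual models are far richer (many landmarks, several classification rules), but they follow the same template: geometric relations in, trait judgment out.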

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: Experimental assessment of photodynamic therapy (PDT) for malignant pleural mesothelioma using a polyethylene glycol conjugate of meta-tetrahydroxyphenylchlorin (PEG-mTHPC). STUDY DESIGN/MATERIALS AND METHODS: (a) PDT was tested on H-meso-1 xenografts (652 nm laser light; fluence 10 J/cm²; 0.93, 9.3, or 27.8 mg/kg of PEG-mTHPC; drug-light intervals 3-8 days). (b) Intraoperative PDT with similar treatment conditions was performed in the chest cavity of minipigs (n = 18) following extrapleural pneumonectomy (EPP), using an optical integrating balloon device combined with in situ light dosimetry. RESULTS: (a) PDT using PEG-mTHPC resulted in a larger extent of tumor necrosis than in untreated tumors (P ≤ 0.01) without causing damage to normal tissue. (b) Intraoperative PDT following EPP was well tolerated in 17 of 18 animals. Mean fluences and fluence rates measured at four sites of the chest cavity ranged from 10.2 ± 0.2 to 13.2 ± 2.3 J/cm² and 5.5 ± 1.2 to 7.9 ± 1.7 mW/cm² (mean ± SD). Histology 3 months after light delivery revealed no PDT-related tissue injury in all but one animal. CONCLUSIONS: PEG-mTHPC-mediated PDT showed selective destruction of mesothelioma xenografts without causing damage to intrathoracic organs in pigs under similar treatment conditions. The light delivery system afforded regular light distribution to the different parts of the chest cavity.
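The light-dose bookkeeping above follows from a simple relation: fluence (J/cm²) = fluence rate (W/cm²) × exposure time (s). A quick sketch of the exposure times implied by the reported mean fluence rates:

```python
# Fluence (J/cm^2) = fluence rate (W/cm^2) x time (s), so the exposure
# time needed to reach a fluence target at a given rate is:

def exposure_time_s(fluence_j_cm2, rate_mw_cm2):
    return fluence_j_cm2 / (rate_mw_cm2 / 1000.0)

# delivering the 10 J/cm^2 target at the mean rates reported above
for rate in (5.5, 7.9):
    t = exposure_time_s(10.0, rate)
    print(f"{rate} mW/cm^2 -> {t / 60:.1f} min")
```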


Relevance:

60.00%

Publisher:

Abstract:

The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC, the RISC processor-architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels previously occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research, and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents, and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, bearing in mind the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards such as the IBM PC, Unix, Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others, created the new markets of personal computers, smartphones, and tablets, and it will eventually also affect industrial automation through game-changing commoditization and the related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.