955 results for Embedded computer systems
Abstract:
The study aims to determine the degree of accessibility of the websites of Catalan universities. The indicators used are those of Priority 1 of the Web Content Accessibility Guidelines, version 1.0 (WCAG), of the World Wide Web Consortium, complemented by additional indicators. The main result is that only one of the 43 pages analysed meets the accessibility thresholds.
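The abstract does not describe the checking procedure itself. Purely as an illustrative sketch of how one WCAG 1.0 Priority 1 checkpoint (1.1, text alternatives for images) could be verified automatically, the following Python snippet counts img elements without alt text; the URL and the reporting format are hypothetical and not part of the study.

```python
# Illustrative sketch only: automated check of one WCAG 1.0 Priority 1
# checkpoint (1.1, text alternatives for images). Not the study's actual
# methodology; the URL below is a hypothetical example.
import urllib.request
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Counts <img> tags and how many of them lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.images = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.images += 1
            alt = (dict(attrs).get("alt") or "").strip()
            if not alt:
                self.missing_alt += 1

def check_page(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    checker = AltTextChecker()
    checker.feed(html)
    print(f"{url}: {checker.images} images, {checker.missing_alt} without alt text")

if __name__ == "__main__":
    check_page("https://www.example.edu/")  # hypothetical university home page
```

A full Priority 1 audit covers many more checkpoints (frames, colour, scripts, tables); this snippet only shows the general shape of an automated check.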
Abstract:
The query interface of a web-based database establishes communication between people seeking information and information retrieval systems, and it is one of the most important parts of the conceptual design of a database. The query interface consists of a set of pages, among which the following stand out: the query page, the results page, the full-document display, general information and help. The aim of the text is to determine the basic elements that should be present in each of these pages in order to help users in the information retrieval process.
Abstract:
The article presents the results of research carried out within the European DECIMAL project, whose objective is the development of an integrated decision-support module for automated systems used in small and medium-sized libraries. The quantitative and qualitative research conducted in the United Kingdom, Italy and Spain was based on a combination of methods: literature review, semi-structured interviews, questionnaires and focus groups held during the two presentation seminars. Two main lines of research are distinguished: the first concerns the actual use of indicators and measures for managing and evaluating the centre's activity, as well as their potential interest where they have not yet been applied; the second concerns the most common types of decisions made in the centres and the factors that influence this process (information sources used, institutional culture, training, level of satisfaction). The article focuses on the results obtained in Spanish libraries, although the overall results are also mentioned for comparison. The conclusions of the study led to the specification of user requirements, on the basis of which the decision-support module was designed. The project concluded with an evaluation phase of the prototype, which involved the development of four successive versions of the module in order to resolve the problems encountered during the evaluation process.
Abstract:
In the past decade, a number of trends have come together in the general sphere of computing that have profoundly affected libraries. The popularisation of the Internet, the appearance of open and interoperable systems, the improvements in graphics and multimedia, and the generalised installation of LANs are some of the events of the period. Taken together, the result has been that libraries have undergone an important functional change, representing the switch from simple information depositories to information disseminators. Integrated library management systems have not remained unaffected by this transformation, and those that have not adapted to the new technological surroundings are now referred to as legacy systems. The article describes the characteristics of systems existing in today's market and outlines future trends that, according to various authors, include the disappearance of the integrated library management systems that have traditionally been sold.
Abstract:
The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing navigator-gated 3D coronary magnetic resonance angiography (MRA), including real-time motion correction, was investigated in a moving phantom. The objective image quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that for improved image quality a short navigator evaluation time is of crucial importance. Navigator spatial resolution showed minimal influence on image quality.
Abstract:
OBJECTIVE: Before a patient can be connected to a mechanical ventilator, the controls of the apparatus need to be set up appropriately. Today, this is done by the intensive care professional. With the advent of closed-loop controlled mechanical ventilation, methods will be needed to select appropriate start-up settings automatically. The objective of our study was to test such a computerized method which could eventually be used as a start-up procedure (first 5-10 minutes of ventilation) for closed-loop controlled ventilation. DESIGN: Prospective study. SETTINGS: ICUs in two adult hospitals and one children's hospital. PATIENTS: 25 critically ill adult patients (age ≥ 15 y) and 17 critically ill children selected at random were studied. INTERVENTIONS: To simulate 'initial connection', the patients were disconnected from their ventilator and transiently connected to a modified Hamilton AMADEUS ventilator for at most one minute. During that time they were ventilated with a fixed and standardized breath pattern (Test Breaths) based on pressure-controlled synchronized intermittent mandatory ventilation (PCSIMV). MEASUREMENTS AND MAIN RESULTS: Measurements of airway flow, airway pressure and instantaneous CO2 concentration using a mainstream CO2 analyzer were made at the mouth during application of the Test Breaths. Test Breaths were analyzed in terms of tidal volume, expiratory time constant and series dead space. Using these data, an initial ventilation pattern consisting of respiratory frequency and tidal volume was calculated. This ventilation pattern was compared to the one measured prior to the onset of the study using a two-tailed paired t-test. Additionally, it was compared to a conventional method for setting up ventilators. The computer-proposed ventilation pattern did not differ significantly from the actual pattern (p > 0.05), while the conventional method did. However, the scatter was large, and in 6 cases deviations in the minute ventilation of more than 50% were observed. CONCLUSIONS: The analysis of standardized Test Breaths allows automatic determination of an initial ventilation pattern for intubated ICU patients. While this pattern does not seem to be superior to the one chosen by the conventional method, it is derived fully automatically and without the need for manual patient data entry such as weight or height. This makes the method potentially useful as a start-up procedure for closed-loop controlled ventilation.
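The abstract does not give the formulas used to turn the Test Breath measurements into an initial pattern. The sketch below only illustrates how such a computation might be structured, using generic textbook assumptions (the measured minute ventilation is preserved; expiration lasts at least three time constants); these relations and all example values are assumptions, not the authors' method.

```python
# Purely illustrative sketch: deriving an initial ventilation pattern from
# Test Breath measurements. The relations below (preserve the measured minute
# ventilation; allow at least three expiratory time constants per breath)
# are generic textbook assumptions, NOT the formulas used in the study.
from dataclasses import dataclass

@dataclass
class TestBreathAnalysis:
    tidal_volume_l: float        # measured tidal volume (L)
    exp_time_constant_s: float   # expiratory time constant (s)
    series_dead_space_l: float   # series dead space (L)
    resp_rate_bpm: float         # rate measured during the Test Breaths (1/min)

def initial_pattern(tb, exp_fraction=2 / 3):
    """Return (respiratory frequency in 1/min, tidal volume in L).

    exp_fraction is the fraction of each cycle spent in expiration
    (2/3 corresponds to an I:E ratio of 1:2), a hypothetical default.
    """
    minute_ventilation = tb.tidal_volume_l * tb.resp_rate_bpm
    # Allow >= 3 expiratory time constants per breath to limit air trapping
    # (generic rule of thumb, not the study's rule).
    min_cycle_s = 3.0 * tb.exp_time_constant_s / exp_fraction
    max_frequency = 60.0 / min_cycle_s
    frequency = min(tb.resp_rate_bpm, max_frequency)
    tidal_volume = minute_ventilation / frequency
    # Keep the tidal volume comfortably above the series dead space (assumption).
    tidal_volume = max(tidal_volume, 2.0 * tb.series_dead_space_l)
    return frequency, tidal_volume

# Hypothetical example values for a single adult patient:
print(initial_pattern(TestBreathAnalysis(0.45, 0.6, 0.15, 14.0)))
```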
Abstract:
Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations, and it is the focus of attention for research in diverse fields such as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system. Multiple studies suggest that observers use face information to infer personality characteristics. Interactive computer systems are trying to take advantage of these findings and apply them to increase the naturalness of interaction and to improve the performance of interactive computer systems. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g. dominance) can be made by using the full appearance information of the face, and whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information, and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to (a) derive a facial trait judgment model from training data and (b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict the perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that (a) prediction of the perception of facial traits is learnable by both holistic and structural approaches; (b) the most reliable prediction of facial trait judgments is obtained by a certain type of holistic description of the face appearance; and (c) for some traits, such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.
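The abstract does not name the specific learning algorithms used. As an illustrative sketch of how the two representations could be compared, the snippet below trains a support vector machine (scikit-learn) on a holistic feature set and on a structural feature set built from pairwise landmark distances; the synthetic data, the 32x32 image size and the 20 landmarks are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch: comparing a holistic (raw-appearance) representation
# with a structural (landmark-distance) representation for facial trait
# prediction. The learning algorithm (an SVM) and the synthetic data are
# assumptions for illustration; the study's actual methods may differ.
import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_faces = 200

# Holistic representation: flattened grey-level face images (hypothetical 32x32).
holistic = rng.random((n_faces, 32 * 32))

# Structural representation: pairwise distances among hypothetical salient points.
landmarks = rng.random((n_faces, 20, 2))          # 20 (x, y) facial points per face
structural = np.array([
    [np.linalg.norm(face[i] - face[j]) for i, j in combinations(range(20), 2)]
    for face in landmarks
])

# Binary trait labels (e.g. "dominant" vs "not dominant"), hypothetical.
labels = rng.integers(0, 2, n_faces)

for name, features in [("holistic", holistic), ("structural", structural)]:
    scores = cross_val_score(SVC(kernel="rbf"), features, labels, cv=5)
    print(f"{name:10s} accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real labeled faces, the same cross-validated comparison would show whether the reduced structural description loses predictive power relative to the full appearance.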
Abstract:
BACKGROUND AND OBJECTIVES: Experimental assessment of photodynamic therapy (PDT) for malignant pleural mesothelioma using a polyethylene glycol conjugate of meta-tetrahydroxyphenylchlorin (PEG-mTHPC). STUDY DESIGN/MATERIALS AND METHODS: (a) PDT was tested on H-meso-1 xenografts (652 nm laser light; fluence 10 J/cm²; 0.93, 9.3, or 27.8 mg/kg of PEG-mTHPC; drug-light intervals 3-8 days). (b) Intraoperative PDT with similar treatment conditions was performed in the chest cavity of minipigs (n = 18) following extrapleural pneumonectomy (EPP), using an optical integrating balloon device combined with in situ light dosimetry. RESULTS: (a) PDT using PEG-mTHPC resulted in a larger extent of tumor necrosis than in untreated tumors (P ≤ 0.01) without causing damage to normal tissue. (b) Intraoperative PDT following EPP was well tolerated in 17 of 18 animals. Mean fluence and fluence rates measured at four sites of the chest cavity ranged from 10.2 ± 0.2 to 13.2 ± 2.3 J/cm² and from 5.5 ± 1.2 to 7.9 ± 1.7 mW/cm² (mean ± SD). Histology 3 months after light delivery revealed no PDT-related tissue injury in all but one animal. CONCLUSIONS: PEG-mTHPC-mediated PDT showed selective destruction of mesothelioma xenografts without causing damage to intrathoracic organs in pigs under similar treatment conditions. The light delivery system afforded regular light distribution to different parts of the chest cavity.
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This has led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC started to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertical integration, consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and to their customers' high risk exposure, as production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
The main objective of this project is to set out a classification of the different types of IT projects that exist today. We present a classification based on a number of differentiating criteria, finally selecting a typology of IT projects that are strongly differentiated from one another. We then exhaustively analyse the different stages and phases of which they are composed, in order to identify the working techniques needed to carry out each of the indicated project types correctly.
Abstract:
Statistical properties of binary complex networks are well understood, and recently many attempts have been made to extend this knowledge to weighted ones. There are, however, subtle yet important considerations to be made regarding the nature of the weights used in this generalization. Weights can be either continuous or discrete magnitudes, and in the latter case they can additionally be of an undistinguishable or a distinguishable nature. This fact has not been addressed in the literature so far and has deep implications for the network statistics. In this work we address this problem by introducing multiedge networks as graphs where multiple (distinguishable) connections between nodes are considered. We develop a statistical mechanics framework where it is possible to obtain information about the most relevant observables given a large spectrum of linear and nonlinear constraints, including those depending both on the number of multiedges per link and on their binary projection. The latter case is particularly interesting, as we show that binary projections can be understood from multiedge processes. The implications of these results are important, as many real agent-based problems mapped onto graphs require this treatment for a proper characterization of their collective behavior.
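As a toy illustration of the distinction drawn in the abstract (not the paper's statistical mechanics framework), the sketch below samples a multiedge network in which the number of distinguishable connections on each node pair is Poisson distributed and then compares it with its binary projection; all parameter values are arbitrary.

```python
# Toy illustration (not the paper's framework): a multiedge network where each
# node pair carries an integer number of distinguishable connections, and its
# binary projection. Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(42)

n_nodes = 50
mean_multiedges = 0.3          # hypothetical expected multiedges per node pair

# Sample a symmetric multiedge adjacency matrix with Poisson-distributed counts.
upper = rng.poisson(mean_multiedges, size=(n_nodes, n_nodes))
multi = np.triu(upper, k=1)
multi = multi + multi.T        # undirected: mirror the upper triangle

# Binary projection: a link exists if at least one multiedge is present.
binary = (multi > 0).astype(int)

total_multiedges = multi.sum() // 2
total_links = binary.sum() // 2
print(f"multiedges: {total_multiedges}, binary links: {total_links}")
print(f"mean multiedges per existing link: {total_multiedges / total_links:.2f}")
```

A Poisson occupancy per node pair is simply the easiest choice that makes the multiedge/binary-projection distinction visible; the framework described in the abstract handles a much broader spectrum of constraints.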
Abstract:
This report describes results from a study evaluating the use of stringless paving using a combination of global positioning and laser technologies. CMI and Geologic Computer Systems developed this technology and successfully implemented it on construction earthmoving and grading projects. Concrete paving is a new area for considering this technology. Fred Carlson Co. agreed to test the stringless paving technology on two challenging concrete paving projects located in Washington County, Iowa. The evaluation was conducted on two paving projects in Washington County, Iowa, during the summer of 2003. The research team from Iowa State University monitored the guidance and elevation conformance to the original design. They employed a combination of physical depth checks, surface location and elevation surveys, concrete yield checks, and a physical survey of the control stakes and string line elevations. A final check on the profile of the pavement surface was accomplished through the use of the Iowa Department of Transportation Light Weight Surface Analyzer (LISA). Due to the speed of paving and the rapid changes in terrain, the laser technology was abandoned for this project. Total control of the guidance and elevation controls on the slip-form paver was moved from string line to global positioning systems (GPS). The evaluation was a success, and the results indicate that GPS control is feasible and is approaching the desired goals of guidance and profile control with the use of three-dimensional design models. Further enhancements are needed in the physical features of the slip-form paver's oil system controls and in the computer program for controlling elevation.
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of workshops on astronomical data analysis techniques.
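The framework itself is not reproduced in the abstract. Purely as an illustration of the map/reduce pattern it refers to, the sketch below bins hypothetical star records into a small two-dimensional histogram ("hypercube") using plain Python map and reduce steps rather than any particular Hadoop interface; the field names, example records and bin size are invented.

```python
# Illustration of the map/reduce pattern for building a 2D histogram
# ("hypercube") over star records. This is NOT the Gaia framework itself;
# field names, bin size and the in-memory driver are invented for the example.
from collections import Counter
from functools import reduce

# Hypothetical star records: (galactic longitude deg, galactic latitude deg, magnitude).
stars = [(12.3, -5.1, 14.2), (12.9, -5.4, 15.0), (250.0, 30.2, 11.7), (251.1, 30.9, 12.3)]

BIN_DEG = 1.0  # hypothetical cell size of the histogram

def map_star(star):
    """Map step: emit ((lon_bin, lat_bin), 1) for one star."""
    lon, lat, _mag = star
    return (int(lon // BIN_DEG), int(lat // BIN_DEG)), 1

def reduce_counts(acc, keyed_count):
    """Reduce step: accumulate counts per histogram cell."""
    key, count = keyed_count
    acc[key] += count
    return acc

histogram = reduce(reduce_counts, map(map_star, stars), Counter())
for cell, count in sorted(histogram.items()):
    print(f"cell {cell}: {count} stars")
```

In a real MapReduce deployment the same map and reduce functions would run distributed over the catalogue rather than over an in-memory list.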