990 results for SOFTWARE-RELIABILITY
Abstract:
In today's digital era, the Internet is part of our lives and has brought changes to a globalized society. Some of these changes give us new ways of relating to one another and of managing knowledge, giving meaning to what we now understand as the network society. In the environment that surrounds us there are therefore continual global collaborative actions that foster communication and the sharing of information of many kinds, with the aim of learning and staying constantly informed. Schools in particular cannot remain on the sidelines, since they must prepare students for this society.

These changes present major challenges for schools, challenges that cannot be addressed from the classroom alone. Schools need to adapt to a model compatible with the network society, and for that reason a networked-centre model (centro-red) is proposed, one with an organizational structure suited to the era in which we are immersed. Collaboration networks in schools make it possible to exchange information and add value to education with the goal of educational improvement. Schools must therefore be flexible and able to adapt to the agents and organizations around them. However, the current structure of a school is rigid, and this evolution represents one of the greatest challenges for the education system.

Along these lines, vocational training (Formación Profesional) centres show a trend toward collaborative models with the business fabric, among other agents, and it is here that this project focuses its research: more precisely, on the creation of a collaboration network with the agent selected by the school.

ICT plays an essential role and must be placed at the service of the problem described above in order to help solve it. In this sense, designing the artefact with Free Software is appropriate: it offers multiple benefits for this goal, the most important of which, in my view, is its link with the philosophy of sharing knowledge, which guarantees symbiosis with the collaborative network and is the reason this research topic is relevant to the school.

As mentioned previously, ICT can help foster the collaborative network, but it is not only the ICT artefact produced in this project that must exhibit characteristics such as flexibility; it is also critical that the school and the agents of the network internalize the collaborative culture in their actions, with the involvement and commitment this requires. As one can imagine, that cultural change is not a simple task and presents problems. To mitigate them and foster a networked culture, specific processes are required that allow it to be incorporated as far as possible. To this end, the combination of systemic innovation and design-based research in education are appropriate methodologies.

Throughout this process we will therefore investigate how collaboration networks and Free Software allow the school to adapt to its environment, and how they can help the school strengthen vocational training and guarantee the durability of its actions, with the aim that the knowledge and the collaboration network itself endure for the sake of educational improvement.
Abstract:
This project seeks to analyze, design, and implement a new telephony solution for the Centro Social de Oficiales of the Policía Nacional, considering the option of migrating to a VoIP system based on free software with Asterisk. Accordingly, the current technologies must be evaluated with the aim of providing new features in the telephone service while keeping implementation, operating, and maintenance costs low.
Abstract:
Based on the results of an evaluation performed during the winter of 1985-86, six Troxler 3241-B Asphalt Content Gauges were purchased for District use in monitoring project asphalt contents. Use of these gauges will help reduce the need for chemical-based extractions. Effective use of the gauges depends on the accurate preparation and transfer of project mix calibrations from the Central Lab to the Districts. The objective of this project was to evaluate the precision and accuracy of a gauge in determining asphalt contents and to develop a mix calibration transfer procedure for implementation during 1987 construction. The first part of the study was accomplished by preparing mix calibrations in the Central Lab gauge and taking multiple measurements of a sample with known asphalt content. The second part was accomplished by preparing transfer pans, obtaining count data on the pans using each gauge, and transferring calibrations from one gauge to another through the use of calibration transfer equations. The transferred calibrations were tested by measuring samples with a known asphalt content. The study established that the Troxler 3241-B Asphalt Content Gauge yields results of acceptable accuracy and precision, as evidenced by a standard deviation of 0.04% asphalt content on multiple measurements of the same sample. The calibration transfer procedure proved feasible and resulted in the calibration transfer portion of Materials I.M. 335 - Method of Test for Determining the Asphalt Content of Bituminous Mixtures by the Nuclear Method.
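The governing transfer equations are documented in Materials I.M. 335 and are not reproduced here; the sketch below only illustrates the general idea of a calibration transfer under stated assumptions: relate the two gauges' counts on shared transfer pans with a linear fit, then remap the Central Lab calibration points to District-gauge counts. All counts and asphalt contents are hypothetical.

    # Illustrative sketch only: a linear calibration transfer between two nuclear
    # asphalt content gauges, assuming paired count readings on shared transfer pans.
    # Pan counts and the calibration itself are made-up values, not data from I.M. 335.
    import numpy as np

    # Paired counts on the same transfer pans (hypothetical data)
    counts_central = np.array([18250.0, 17410.0, 16620.0, 15890.0])   # Central Lab gauge
    counts_district = np.array([18010.0, 17230.0, 16380.0, 15700.0])  # District gauge

    # Fit counts_district ~ slope * counts_central + offset
    slope, offset = np.polyfit(counts_central, counts_district, deg=1)

    def transfer_calibration_point(central_count: float) -> float:
        """Map a Central Lab calibration count to its District-gauge equivalent."""
        return slope * central_count + offset

    # A mix calibration prepared on the Central Lab gauge: (count, % asphalt content)
    central_calibration = [(18100.0, 4.5), (17300.0, 5.0), (16500.0, 5.5)]
    district_calibration = [(transfer_calibration_point(c), ac)
                            for c, ac in central_calibration]
    print(district_calibration)

The transferred calibration would then be checked, as in the study, by measuring samples of known asphalt content on the receiving gauge.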
Abstract:
Three pavement design software packages were compared with regard to how they differ in determining design input parameters and how those parameters influence pavement thickness. StreetPave designs the concrete pavement thickness based on the PCA method, along with the equivalent asphalt pavement thickness. WinPAS performs both concrete and asphalt pavement designs following the AASHTO 1993 design method. The APAI software designs asphalt pavements based on the pre-mechanistic/empirical AASHTO methodology. First, four critical design input parameters were identified: traffic, subgrade strength, reliability, and design life. A sensitivity analysis of these four input parameters was performed using the three pavement design software packages to identify which parameters require the most attention during pavement design. Based on current pavement design procedures and the sensitivity analysis results, a prototype pavement design and sensitivity analysis (PD&SA) software package was developed to retrieve the pavement thickness design value for a given condition and to allow a user to perform a pavement design sensitivity analysis. The prototype PD&SA software stores pavement design results in a database designed so that the user can enter design data from a variety of design programs and query design results for given conditions; it was developed to demonstrate the concept of retrieving pavement design results from the database for a design sensitivity analysis. This final report does not include the prototype software, which will be validated and tested during the next phase.
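As a rough illustration of the database concept described above, and not the actual PD&SA schema, the sketch below stores thickness results keyed by the four critical inputs and queries them for a given condition. Table, column names, and values are hypothetical.

    # Minimal sketch of storing design results from several programs and querying
    # them by design condition. Schema and data are hypothetical placeholders.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE design_results (
            program      TEXT,    -- e.g. StreetPave, WinPAS, APAI
            pavement     TEXT,    -- 'concrete' or 'asphalt'
            traffic_esal REAL,    -- design traffic (ESALs)
            subgrade_cbr REAL,    -- subgrade strength
            reliability  REAL,    -- percent
            design_life  INTEGER, -- years
            thickness_in REAL     -- resulting thickness, inches
        )
    """)
    conn.execute(
        "INSERT INTO design_results VALUES ('WinPAS', 'concrete', 1.0e6, 5.0, 90.0, 20, 8.5)"
    )

    # Retrieve the stored thickness for a given condition
    row = conn.execute(
        "SELECT program, thickness_in FROM design_results "
        "WHERE pavement = 'concrete' AND reliability >= 90 AND design_life = 20"
    ).fetchone()
    print(row)

Sweeping one input (say, reliability) while holding the others fixed and querying the resulting thicknesses is the kind of sensitivity analysis the prototype is meant to support.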
Abstract:
A good system of preventive bridge maintenance enhances the ability of engineers to manage and monitor bridge conditions and to take proper action at the right time. Traditionally, infrastructure inspection is performed via infrequent, periodic visual inspection in the field. Wireless sensor technology provides an alternative, cost-effective approach for the constant monitoring of infrastructure. Scientific data-acquisition systems make reliable structural measurements, even in inaccessible and harsh environments, by using wireless sensors. With advances in sensor technology and the availability of low-cost integrated circuits, wireless monitoring sensor networks have come to be considered the new generation of technology for structural health monitoring. The main goal of this project was to implement a wireless sensor network for monitoring the behavior and integrity of highway bridges. At the core of the system is a low-cost, low-power wireless strain sensor node whose hardware design is optimized for structural monitoring applications. The key components of the system are the control unit, sensors, software, and communication capability; the extensive information developed for each of these areas was used to design the system. The performance and reliability of the proposed wireless monitoring system were validated on a 34-foot-span composite beam-in-slab bridge in Black Hawk County, Iowa. Microstrain data were successfully extracted from the output-only response collected by the wireless monitoring system. The energy efficiency of the system was investigated to estimate the battery lifetime of the wireless sensor nodes. This report also documents the system design, the method used for data acquisition, and system validation and field testing. Recommendations on further implementation of wireless sensor networks for long-term monitoring are provided.
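The report's own battery-life analysis is not reproduced here; the sketch below only illustrates the usual first-order estimate for a duty-cycled node, namely average current from the active and sleep draws, then capacity divided by that average. All currents, capacities, and duty cycles are hypothetical.

    # Back-of-the-envelope battery lifetime estimate for a duty-cycled sensor node.
    # The numbers are illustrative, not measurements from this project.
    def battery_lifetime_hours(capacity_mah: float,
                               active_ma: float,
                               sleep_ma: float,
                               duty_cycle: float) -> float:
        """Lifetime as battery capacity divided by the average current draw."""
        average_ma = duty_cycle * active_ma + (1.0 - duty_cycle) * sleep_ma
        return capacity_mah / average_ma

    # Example: 2600 mAh battery, 20 mA while sampling/transmitting,
    # 0.05 mA asleep, active 1% of the time.
    hours = battery_lifetime_hours(2600.0, 20.0, 0.05, 0.01)
    print(f"{hours:.0f} h, about {hours / 24:.0f} days")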
Abstract:
This manual describes how to use the Iowa Bridge Backwater software and documents the methods and equations used in its calculations. The main body describes how to use the software, and the appendices cover technical aspects. The Bridge Backwater software performs five main tasks: Design Discharge Estimation, Stream Rating Curves, Floodway Encroachment, Bridge Backwater, and Bridge Scour. The intent of the program is to provide a simplified method for analyzing bridge backwater for rural structures located in areas with low flood damage potential. The software is written in Microsoft Visual Basic 6.0 and runs under Windows 95 or newer versions (Windows 98, NT, 2000, XP, and later).
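The program's own equations are documented in its appendices. Purely as a generic illustration of what a stream rating curve involves, the sketch below applies Manning's equation to a simple rectangular channel in US customary units; this is a textbook formulation, not necessarily the method coded in the Iowa Bridge Backwater software, and the channel parameters are hypothetical.

    # Stage-discharge (rating curve) illustration with Manning's equation,
    # Q = (1.49 / n) * A * R^(2/3) * S^(1/2), for a rectangular channel (US units).
    import math

    def manning_discharge_cfs(depth_ft: float, width_ft: float,
                              slope: float, n: float) -> float:
        area = depth_ft * width_ft                     # flow area, ft^2
        wetted_perimeter = width_ft + 2.0 * depth_ft   # ft
        hydraulic_radius = area / wetted_perimeter     # ft
        return (1.49 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

    # Simple rating curve: 40 ft wide channel, slope 0.001, Manning's n = 0.035
    for depth in (1.0, 2.0, 4.0, 6.0):
        print(depth, round(manning_discharge_cfs(depth, 40.0, 0.001, 0.035), 1))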
Abstract:
Information technologies are used today across all areas of business, from management systems (ERPs) and document management to information analysis with Business Intelligence systems; they can even become an entirely new platform for providing companies with new sales channels, as is the case with the Internet. The motivation for this final degree project (TFC) arises from our client's initial need to expand through a new sales channel in order to reach new markets and diversify its customer base. Given the current characteristics of information technologies and the Internet, the two form a perfect pairing for defining this TFC, which covers all the aspects required to arrive at a final product: a real estate web portal adapted to the requirements demanded by today's Internet users.
Abstract:
MRI has evolved into an important diagnostic technique in medical imaging. However, the reliability of the derived diagnosis can be degraded by artifacts, which challenge both radiologists and automatic computer-aided diagnosis. This work proposes a fully automatic method for measuring the image quality of three-dimensional (3D) structural MRI. Quality measures are derived by analyzing the air background of magnitude images and are capable of detecting image degradation from several sources, including bulk motion, residual magnetization from incomplete spoiling, blurring, and ghosting. The method has been validated on 749 3D T1-weighted 1.5T and 3T head scans acquired at 36 Alzheimer's Disease Neuroimaging Initiative (ADNI) study sites operating with various software and hardware combinations. Results are compared against qualitative grades assigned by the ADNI quality control center (taken as the reference standard). The derived quality indices are independent of the MRI system used and agree with the reference standard quality ratings with high sensitivity and specificity (>85%). The proposed procedures for quality assessment could be of great value for both research and routine clinical imaging, and could greatly improve workflow through the ability to rule out the need for a repeat scan while the patient is still in the magnet bore.
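As a deliberately simplified illustration of the air-background idea, and not the validated ADNI measures, the sketch below compares the background level in the image bands where ghosting and motion artifacts typically spill over against a pure-noise estimate taken from an image corner; structured signal in the background pushes the index above 1. The patch sizes and the synthetic data are hypothetical.

    # Toy background-based quality index for a magnitude MRI slice.
    import numpy as np

    def background_artifact_index(magnitude_slice: np.ndarray,
                                  corner: int = 16) -> float:
        """Ratio of the mean intensity in the lateral background bands to the
        noise level estimated from a corner patch assumed to contain air only."""
        noise_patch = magnitude_slice[:corner, :corner]
        # For magnitude MRI, a pure-noise background is Rayleigh distributed with
        # mean sigma * sqrt(pi/2), so estimate sigma from the corner mean.
        sigma = noise_patch.mean() / np.sqrt(np.pi / 2.0)
        band = np.concatenate([magnitude_slice[:, :corner],
                               magnitude_slice[:, -corner:]], axis=1)
        return float(band.mean() / (sigma * np.sqrt(np.pi / 2.0)))

    # Synthetic check: a pure Rayleigh background gives an index near 1;
    # ghosting or motion residue in the bands would raise it above 1.
    rng = np.random.default_rng(0)
    noise = np.abs(rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256)))
    print(round(background_artifact_index(noise), 2))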
Abstract:
Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modelling, and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned mainly with developing techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust, and efficient modelling tools. They can solve classification, regression, and probability density modelling problems in high-dimensional geo-feature spaces composed of geographical coordinates and additional relevant spatially referenced features ("geo-features"). They are well suited to serve as predictive engines in decision-support systems for environmental data mining, including pattern recognition, modelling and prediction, and automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, and they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multi-layer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF), and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression, and density estimation.
Exploratory data analysis (EDA) is the initial and very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach, experimental variography, and with machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps detect the presence of spatial patterns described, at least, by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a current hot topic: the automatic mapping of geospatial data. General regression neural networks (GRNN) are proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under the emergency scenario. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both for many teaching courses, including international workshops in China, France, Italy, Ireland, and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil, and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user friendly and easy to use.
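As a minimal sketch of the GRNN idea used for automatic mapping, essentially a Nadaraya-Watson kernel regression with a single smoothing parameter, the Python fragment below predicts values at query locations from synthetic 2D training data. It is not the Machine Learning Office implementation, and all data and parameter values are illustrative.

    # Toy general regression neural network (GRNN): Gaussian-kernel weighted mean.
    import numpy as np

    def grnn_predict(x_train: np.ndarray, y_train: np.ndarray,
                     x_query: np.ndarray, sigma: float) -> np.ndarray:
        """Predict y at each query point as a kernel-weighted mean of y_train."""
        # Squared Euclidean distances between every query and training point
        d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ y_train) / w.sum(axis=1)

    # Synthetic 2D spatial data: noisy samples of a smooth surface
    rng = np.random.default_rng(1)
    x_train = rng.uniform(0.0, 1.0, size=(200, 2))
    y_train = (np.sin(3.0 * x_train[:, 0]) + np.cos(2.0 * x_train[:, 1])
               + rng.normal(scale=0.1, size=200))

    x_query = np.array([[0.5, 0.5], [0.1, 0.9]])
    print(grnn_predict(x_train, y_train, x_query, sigma=0.1))

In practice the smoothing parameter sigma is not fixed by hand but tuned, for example by cross-validation, which is what makes the model attractive for automatic mapping.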
Abstract:
We will investigate how collaboration networks and free software allow the school to adapt to its environment, and how they can help the school strengthen vocational training and guarantee the durability of its actions, with the aim that the knowledge and the collaboration network itself endure for the sake of educational improvement.
Abstract:
This work shows, using free technologies and building on open operating systems, how a company dedicated to implementing and developing free software technologies can sustain a high level of work. It presents the setup of a development laboratory that allows us to understand how GNU/Linux, and the software built on it, operates and is deployed within the company's infrastructure.
Abstract:
The objective of this work was to build mock-ups of complete yerba mate plants at several stages of development, using the InterpolMate software, and to compute photosynthesis on the interpolated structure. The yerba mate mock-ups were first built in the VPlants software for three growth stages. Male and female plants grown in two contrasting environments (monoculture and forest understory) were considered. To model the dynamic 3D architecture of yerba mate plants during the biennial growth interval between two subsequent prunings, data sets of branch development collected on 38 dates were used. The estimated values obtained from the mock-ups, including leaf photosynthesis and sexual dimorphism, are very close to those observed in the field. However, this similarity was limited to reconstructions that included growth units from the original data sets. The modeling of growth dynamics enables the estimation of photosynthesis for the entire yerba mate plant, which is not easily measurable in the field. The InterpolMate software is efficient for building yerba mate mock-ups.
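For orientation only: leaf-level photosynthesis models of the kind that can be evaluated on such mock-ups often take a light-response form. The sketch below uses a generic rectangular-hyperbola light-response curve with hypothetical parameters; it is not necessarily the model implemented in InterpolMate.

    # Generic leaf light-response curve (rectangular hyperbola); summing leaf
    # contributions over a 3D mock-up would approximate whole-plant photosynthesis.
    # Parameter values are illustrative placeholders.
    def leaf_photosynthesis(par: float, a_max: float = 12.0,
                            quantum_yield: float = 0.05, r_d: float = 1.0) -> float:
        """Net assimilation (umol CO2 m-2 s-1) at incident PAR (umol photons m-2 s-1)."""
        gross = (quantum_yield * par * a_max) / (quantum_yield * par + a_max)
        return gross - r_d

    for par in (100.0, 500.0, 1500.0):
        print(par, round(leaf_photosynthesis(par), 2))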
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validation, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
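For reference, the reliability statistic reported above, Cronbach's alpha, is computed from the item variances and the variance of the total score. The minimal sketch below uses randomly generated placeholder data, not ECNQ-CCV responses, so its output alpha is meaningless except as a demonstration of the calculation.

    # Cronbach's alpha from a respondents-by-items score matrix.
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """scores: 2D array, rows = respondents, columns = questionnaire items."""
        n_items = scores.shape[1]
        item_variances = scores.var(axis=0, ddof=1)
        total_variance = scores.sum(axis=1).var(ddof=1)
        return (n_items / (n_items - 1)) * (1.0 - item_variances.sum() / total_variance)

    rng = np.random.default_rng(42)
    # 205 respondents x 19 items with ordinal scores 0-10 (placeholder data;
    # independent random items give an alpha near zero, unlike real scale data)
    responses = rng.integers(0, 11, size=(205, 19)).astype(float)
    print(round(cronbach_alpha(responses), 3))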
Abstract:
Aim. Several software packages (SWP) and models have been released for the quantification of myocardial perfusion (MP). Although each is validated in some way, the question remains how well their values agree. The present analysis focused on a cross-comparison of three SWP for MP quantification in 13N-ammonia PET studies. Materials & Methods. Forty-eight rest and stress MP 13N-ammonia PET studies of hypertrophic cardiomyopathy (HCM) patients (Sciagrà et al., 2009) were analysed with three software packages, Carimas, PMOD, and FlowQuant, by three observers blinded to one another's results. All SWP implement the one-tissue-compartment model (1TCM; DeGrado et al., 1996), and the first two also implement the two-tissue-compartment model (2TCM; Hutchins et al., 1990). A linear mixed model for repeated measures was fitted to the data, and Bland-Altman plots were used where appropriate. Reproducibility was assessed at the global, regional, and segmental levels. Intraclass correlation coefficients (ICC) and differences between the SWP and between models were obtained. ICC ≥ 0.75 indicated excellent reproducibility, 0.4 ≤ ICC < 0.75 fair-to-good reproducibility, and ICC < 0.4 poor reproducibility (Rosner, 2010). Results. When 1TCM MP values were compared, agreement between the packages at the global and regional levels was excellent, except for Carimas vs. PMOD at the RCA (ICC = 0.715) and PMOD vs. FlowQuant at the LCX (ICC = 0.745), which were good. In the segmental analysis, agreement between all SWP was excellent in five segments (7, 12, 13, 16, and 17); in the remaining 12 segments agreement varied between the compared SWP. Carimas showed excellent agreement with FlowQuant in 13 segments and good agreement in four (segments 1, 5, 6, 11: 0.687 ≤ ICC ≤ 0.73); Carimas had excellent agreement with PMOD in 11 segments, good agreement in five (segments 4, 9, 10, 14, 15: 0.682 ≤ ICC ≤ 0.737), and poor agreement in segment 3 (ICC = 0.341). PMOD had excellent agreement with FlowQuant in eight segments and substantial-to-good agreement in nine (segments 1, 2, 3, 5, 6, 8-11: 0.585 ≤ ICC ≤ 0.738). Agreement between Carimas and PMOD for the 2TCM was good at the global level (ICC = 0.745), excellent at the LCX (0.780) and RCA (0.774), and good at the LAD (0.662); agreement was excellent for ten segments, fair-to-substantial for segments 2, 3, 8, 14, 15 (0.431 ≤ ICC ≤ 0.681), and poor for segments 4 (0.384) and 17 (0.278). Conclusions. The three SWP, used by different operators to analyse 13N-ammonia PET MP studies, provide results that agree well at the global and regional levels and, for the most part, at the segmental level. Agreement is better for the 1TCM. The poor agreement at segments 4 and 17 for the 2TCM needs further clarification.
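For reference, the one-tissue-compartment model cited above (DeGrado et al., 1996) is commonly written in the standard form below, where C_a is the arterial input function, C_T the myocardial tissue concentration, and the uptake rate constant K_1 is the quantity from which myocardial perfusion is estimated. This is textbook background, not a detail specific to any of the three software packages.

    \frac{dC_T(t)}{dt} = K_1\, C_a(t) - k_2\, C_T(t),
    \qquad
    C_T(t) = K_1 \int_0^t C_a(\tau)\, e^{-k_2\,(t-\tau)}\, d\tau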