980 results for software-defined storage


Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Informatics Engineering

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Chemical and Biochemical Engineering

Relevance:

30.00%

Publisher:

Abstract:

Cryocoolers have been progressively replacing stored cryogens in the cryogenic chains used for detector cooling, thanks to their ever-increasing reliability. However, the mechanical vibrations, electromagnetic interference and temperature fluctuations inherent to their operation can reduce the sensor's sensitivity. To minimize this problem, compact thermal energy storage units (ESU) are studied: devices able to store thermal energy without a significant temperature increase. These devices can be used as a temporary cold source, making it possible to turn the cryocooler OFF and provide a suitable environment for the sensor. A heat switch thermally decouples the ESU from the cryocooler, whose temperature increases when it is turned OFF. In this work, several prototypes working around 40 K were designed, built and characterized. They consist of a low-temperature cell containing liquid neon, connected to an expansion volume at room temperature for gas storage during the evaporation phase. To make the system insensitive to the direction of gravity, the liquid is retained in the low-temperature cell by capillary effect in a porous material. Thanks to pressure regulation of the liquid neon bath, 900 J were stored at 40 K. The higher latent heat of the liquid and the absence of triple-point transitions at 40 K make pressure-controlled evaporation a versatile and compact alternative to an ESU working at a triple-point transition. A rather compact second prototype ESU, directly connected to the cryocooler cold finger, was tested as a temperature stabilizer. This device was able to stabilize the cryocooler temperature (≈ 40 K ± 1 K) despite sudden heat bursts corresponding to twice the cooling power of the cryocooler. This thesis describes the construction of these devices as well as the tests performed. It is also shown that the thermal model developed to predict the thermal behaviour of these devices, implemented in software, describes the experimental results quite well. Solutions to improve these devices are also proposed.
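
For scale, a back-of-envelope estimate (a sketch only; the latent heat and density below are textbook values near neon's normal boiling point, not figures from the thesis, and the latent heat is lower near 40 K, close to neon's critical temperature) of how much liquid neon a 900 J buffer corresponds to:

L_NBP = 85.8e3    # J/kg, latent heat of vaporization of neon at ~27 K (assumed)
RHO_LIQ = 1207.0  # kg/m^3, liquid neon density near the boiling point (assumed)
E_STORED = 900.0  # J, energy stored by the prototype at 40 K

mass = E_STORED / L_NBP        # kg of neon evaporated
volume = mass / RHO_LIQ        # m^3 of liquid consumed
print(f"~{mass * 1e3:.1f} g of neon, ~{volume * 1e6:.1f} cm^3 of liquid")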

Relevance:

30.00%

Publisher:

Abstract:

Identification and characterization of the problem. One of the most important problems associated with software construction is its correctness. In the search for guarantees that software works correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Because of their nature, applying formal methods requires considerable experience and knowledge, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been limited to critical systems, that is, systems whose malfunction can cause serious damage, even though the benefits these techniques provide are relevant to all kinds of software. Bringing the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is of great importance. Examples are several powerful formal-methods-based analysis tools that target source code directly. In the vast majority of these tools, the gap between the notions developers are used to and those required to apply formal analysis remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits, and in many cases the output of the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing adequate tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts to be analyzed grow (scalability). This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims at building formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to scalability levels beyond those known for these techniques in general settings. The adaptations of the chosen techniques will be implemented in tools that can be used by developers familiar with the application context but not necessarily with the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment; the methods will be those of discrete mathematics, logic and software engineering. Expected results. One expected result of the project is the identification of specific domains for the application of formal analysis methods.
It is also expected that the project will produce analysis tools whose usability is adequate for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem essentially by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. As a result, their acceptance by software engineers is rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, it is clear that the advantages formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis and, in many cases, are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes in which to apply automated formal analysis techniques. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
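
As a minimal illustration of the kind of SMT-based checking mentioned above (using the off-the-shelf Z3 solver, not a tool produced by this project), the property "for non-negative integers x and y, x + y is never smaller than x" can be verified by asserting its negation and checking for unsatisfiability:

from z3 import Ints, Solver, unsat  # pip install z3-solver

x, y = Ints("x y")
s = Solver()
s.add(x >= 0, y >= 0, x + y < x)  # negation of the property
if s.check() == unsat:
    print("property verified: no counterexample exists")
else:
    print("counterexample:", s.model())

The same pattern scales to assertions extracted from code, which is where scalability and usability become the central concerns the project targets.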

Relevance:

30.00%

Publisher:

Abstract:

The object of study of this project is thermosyphon solar water heating systems. In particular, it deals with two designs produced by manufacturers in the Province of Córdoba who have asked the Solar Energy Group (GES) for advice on improving the thermal performance of their equipment. The two systems use non-traditional materials and also differ in the arrangement of the storage tank: one is vertical and the other horizontal. Based on the results of a test under an international standard, in which several points open to improvement were detected, this project proposes a detailed analysis of the equipment, for which the systems must be completely disassembled, in order to carry out an analytical and experimental study aimed at a theoretical-analytical description of their behaviour, with the implementation of improvement proposals and verification of the results. The objective is therefore to improve the thermal performance of these systems through an experimental and analytical study. Assuming this possibility of improvement, the hypothesis is that the operation of these systems can be represented by physical-mathematical models built from known equations and correlations, with processes interpreted through numerical solutions and specific simulation software. The plan is thus to fully dismantle the equipment in order to study its structure and internal connections in detail and, from the geometry, dimensions and thermophysical properties of the construction materials and working fluids, to build physical-mathematical models that allow properties and geometry to be varied in search of the combinations that yield more thermally efficient equipment. The models will be coded in high-level languages so that, after validation, simulations can be run in internationally recognized software that can incorporate such models through a communication protocol, making the powerful features of that software applicable to our models. The study will be complemented by an exergy analysis to identify the critical points at which opportunities to exploit the available energy are lost, and to analyze how to solve the problems at those points. The materials to be used will be the systems themselves, provided by the manufacturers and suitably modified to operate as prototypes. The expected outcome is a thorough understanding of the processes and operating principles of the equipment, which will allow improvements to be proposed and implemented in the prototypes, followed by a measurement under the same standard as the initial one to determine to what extent the expected improvements are achieved. It is also intended that the improvements, in the stage of transfer to the companies involved, yield not only a technical benefit but also an economic one.
To that end, the manufacturing processes and methods will also be studied so that the improved systems are no more expensive than the originals and, if possible, even cheaper, all with the aim of spreading solar thermal energy and making such equipment, so valuable for the uptake of clean energy, available to everyone. The project will also benefit the knowledge of the scientific community in general by contributing new results on novel designs with new materials. In addition, the institution will benefit from the training obtained by the project members, many of whom are carrying out their postgraduate studies and are at an important stage of their careers as researchers. The main goal of this project is the improvement of two thermosyphon solar water heating systems, made of non-conventional materials and with different arrangements of their storage tanks: one is vertical and the other horizontal. The thermosyphon systems are provided by manufacturers of the Córdoba Province, who came to the Solar Energy Group (GES) of the National University of Río Cuarto looking for help with the design of their products. In agreement with these manufacturers, this project was proposed in order to work analytically and experimentally towards physical-mathematical models of the two systems, which allow the best changes to implement in the equipment to be sought by means of simulations in order to improve its thermal performance. The materials to be used are therefore the actual systems provided by the manufacturers, which will be disassembled and studied in detail. After the analytical study, the improvement proposals will be implemented in a high-level programming language to perform simulations in the environment of a well-known energy simulation package (TRNSYS). After the simulations, the best modifications will be physically implemented in the prototypes, which will finally undergo the same standardized test as at the beginning to check the magnitude of the implemented improvements. The importance of this project lies in the better systems the companies would be able to offer, which would benefit the deployment of solar thermal energy. Another relevant point is to keep the new equipment at the same cost as the previous one, or cheaper, in order to achieve good deployment of solar water heating systems; the manufacturing processes and methods must therefore be studied to obtain not only good technical solutions but also affordable equipment. In addition, this project will contribute to the growth of knowledge in the area of thermosyphon solar systems and to the training of postgraduate students.
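
As an illustration of the physical-mathematical modelling approach described above, a minimal lumped (single-node) energy balance for the storage tank could look as follows; the tank size, loss coefficient and collector gain profile are invented placeholders, not data from the project or from TRNSYS:

WATER_CP = 4186.0          # J/(kg K), specific heat of water
M_CP = 200.0 * WATER_CP    # J/K, thermal capacity of a 200 L tank (assumed)
UA = 3.5                   # W/K, overall tank heat-loss coefficient (assumed)
T_AMB = 20.0               # deg C, ambient temperature
DT = 60.0                  # s, time step

def collector_gain(t_s):
    """Crude assumed daytime collector gain profile, in W."""
    hour = (t_s / 3600.0) % 24
    return 1500.0 if 9 <= hour <= 17 else 0.0

T = 20.0                   # deg C, initial tank temperature
for step in range(24 * 60):                       # one day in 1-minute steps
    t_s = step * DT
    T += (collector_gain(t_s) - UA * (T - T_AMB)) * DT / M_CP
print(f"tank temperature after one day: {T:.1f} C")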

Relevance:

30.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
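
For illustration, the a posteriori (Bayesian) step these programs automate can be sketched as a maximum a posteriori estimate of an individual pharmacokinetic parameter; the one-compartment model, priors and measurement below are invented and do not come from any of the benchmarked programs:

import numpy as np
from scipy.optimize import minimize_scalar

DOSE, TAU, V = 1000.0, 12.0, 30.0   # mg, dosing interval (h), volume (L); V fixed for simplicity
CL_POP, OMEGA = 4.0, 0.3            # population clearance (L/h), log-normal SD
SIGMA = 0.15                        # proportional residual error
T_OBS, C_OBS = 11.5, 9.0            # sampling time (h), observed concentration (mg/L)

def c_pred(cl):
    """Steady-state concentration for a repeated IV bolus regimen."""
    k = cl / V
    return DOSE / V * np.exp(-k * T_OBS) / (1.0 - np.exp(-k * TAU))

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    prior = ((log_cl - np.log(CL_POP)) / OMEGA) ** 2
    likelihood = ((np.log(C_OBS) - np.log(c_pred(cl))) / SIGMA) ** 2
    return 0.5 * (prior + likelihood)

res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(20.0)), method="bounded")
print(f"individual clearance estimate: {np.exp(res.x):.2f} L/h")

The individual estimate would then drive the suggested dose adjustment toward the target range.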

Relevance:

30.00%

Publisher:

Abstract:

The purpose of designing this application is to provide the users of a computer network with a tool that manages the resolution of technical incidents, so that service is interrupted for as little time as possible.

Relevance:

30.00%

Publisher:

Abstract:

Percutaneous transluminal renal angioplasty (PTRA) is an invasive technique that is costly and involves the risk of complications and renal failure. The ability of PTRA to reduce the administration of antihypertensive drugs has been demonstrated. A potentially greater benefit, which nevertheless remains to be proven, is the deferral of the need for chronic dialysis. The aim of the study (ANPARIA) was to assess the appropriateness of PTRA to impact on the evolution of renal function. A standardized expert panel method was used to assess the appropriateness of medical treatment alone or medical treatment with revascularization in various clinical situations. The choice of revascularization by either PTRA or surgery was examined for each clinical situation. Analysis was based on a detailed literature review and on systematically elicited expert opinion, which were obtained during a two-round modified Delphi process. The study provides detailed responses on the appropriateness of PTRA for 1848 distinct clinical scenarios. Depending on the major clinical presentation, the appropriateness of revascularization varied from 32% to 75% for individual scenarios (overall, 48%). Uncertainty as to revascularization was 41% overall. When revascularization was appropriate, PTRA was favored over surgery in 94% of the scenarios, except in certain cases of aortic atheroma where surgery was the preferred choice. Kidney size >7 cm, absence of coexisting disease, acute renal failure, a high degree of stenosis (≥70%), and absence of multiple arteries were identified as predictive variables of favorable appropriateness ratings. Situations such as cardiac failure with pulmonary edema or acute thrombosis of the renal artery were defined as indications for PTRA. This study identified clinical situations in which PTRA or surgery are appropriate for renal artery disease. We built a decision tree which can be used via the Internet: the ANPARIA software (http://www.chu-clermontferrand.fr/anparia/). In numerous clinical situations, uncertainty remains as to whether PTRA prevents deterioration of renal function.
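
The panel-based rating step behind such appropriateness studies can be sketched as follows; this is a simplified RAND/UCLA-style rule with an invented disagreement criterion, not the exact procedure used in ANPARIA:

import statistics

def classify_scenario(ratings):
    """Classify one clinical scenario from panel ratings on a 1-9 scale."""
    median = statistics.median(ratings)
    # crude disagreement rule: ratings present in both extreme tertiles
    disagreement = any(r <= 3 for r in ratings) and any(r >= 7 for r in ratings)
    if median >= 7 and not disagreement:
        return "appropriate"
    if median <= 3 and not disagreement:
        return "inappropriate"
    return "uncertain"

print(classify_scenario([7, 8, 8, 9, 7, 8, 6, 7, 8]))  # -> appropriate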

Relevance:

30.00%

Publisher:

Abstract:

This research aims to find a solution for a distributed storage system (DSS) adapted to CoDeS. By studying how DSSs work and how they are implemented, we can determine how to implement a DSS compatible with the CoDeS requirements.

Relevance:

30.00%

Publisher:

Abstract:

Background: The analysis and usage of biological data is hindered by the spread of information across multiple repositories and the difficulties posed by different nomenclature systems and storage formats. In particular, there is an important need for data unification in the study and use of protein-protein interactions. Without good integration strategies, it is difficult to analyze the whole set of available data and its properties. Results: We introduce BIANA (Biologic Interactions and Network Analysis), a tool for biological information integration and network management. BIANA is a Python framework designed to achieve two major goals: i) the integration of multiple sources of biological information, including biological entities and their relationships, and ii) the management of biological information as a network where entities are nodes and relationships are edges. Moreover, BIANA uses properties of proteins and genes to infer latent biomolecular relationships by transferring edges to entities sharing similar properties. BIANA is also provided as a plugin for Cytoscape, which allows users to visualize and interactively manage the data. A web interface to BIANA providing basic functionalities is also available. The software can be downloaded under GNU GPL license from http://sbi.imim.es/web/BIANA.php. Conclusions: BIANA's approach to data unification solves many of the nomenclature issues common to systems dealing with biological data. BIANA can easily be extended to handle new specific data repositories and new specific data types. The unification protocol allows BIANA to be a flexible tool suitable for different user requirements: non-expert users can use a suggested unification protocol while expert users can define their own specific unification rules.
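
A toy sketch of the unification idea described above (grouping records from different sources that share an identifier into a single network node); this is not BIANA's actual API, and the identifiers below are used purely as an example:

import networkx as nx

records = [
    {"source": "uniprot", "ids": {"P04637", "TP53_HUMAN"}},
    {"source": "genbank", "ids": {"TP53_HUMAN", "NM_000546"}},
    {"source": "uniprot", "ids": {"P38398", "BRCA1_HUMAN"}},
]

# unification: records sharing at least one identifier collapse into one entity
entities = []
for rec in records:
    for ent in entities:
        if ent & rec["ids"]:
            ent |= rec["ids"]
            break
    else:
        entities.append(set(rec["ids"]))

g = nx.Graph()
for i, ent in enumerate(entities):
    g.add_node(i, ids=sorted(ent))
g.add_edge(0, 1)  # a reported interaction between the two unified entities

print(list(g.nodes(data=True)))
print(list(g.edges()))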

Relevance:

30.00%

Publisher:

Abstract:

A large percentage of bridges in the state of Iowa are classified as structurally or functionally deficient. These bridges annually compete for a share of Iowa's limited transportation budget. To avoid an increase in the number of deficient bridges, the state of Iowa decided to implement a comprehensive Bridge Management System (BMS) and selected the Pontis BMS software as a bridge management tool. This program will be used to provide a selection of maintenance, repair, and replacement strategies for the bridge networks to achieve an efficient and possibly optimal allocation of resources. The Pontis BMS software uses a new rating system to evaluate extensive and detailed inspection data gathered for all bridge elements. To manually collect these data would be a highly time-consuming job. The objective of this work was to develop an automated, computerized methodology for an integrated database that includes the rating conditions as defined in the Pontis program. Several of the available techniques that can be used to capture inspection data were reviewed, and the most suitable method was selected. To accomplish the objectives of this work, two user-friendly programs were developed. One program is used in the field to collect inspection data following a step-by-step procedure without the need to refer to the Pontis user's manuals. The other program is used in the office to read the inspection data and prepare input files for the Pontis BMS software. These two programs require users to have very limited knowledge of computers. On-line help screens as well as options for preparing, viewing, and printing inspection reports are also available. The developed data collection software will improve and expedite the process of conducting bridge inspections and preparing the required input files for the Pontis program. In addition, it will eliminate the need for large storage areas and will simplify retrieval of inspection data. Furthermore, the approach developed herein will facilitate transferring these captured data electronically between offices within the Iowa DOT and across the state.
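
The field-to-office hand-off described above might be sketched as follows; the element-level record layout and CSV output are hypothetical and do not reproduce the actual Pontis input-file format:

import csv

inspection = [
    # bridge id, element key, total quantity, unit, quantities in condition states 1-4
    {"bridge": "IA-001234", "element": 12, "qty": 450, "unit": "m2", "states": [400, 40, 10, 0]},
    {"bridge": "IA-001234", "element": 107, "qty": 120, "unit": "m", "states": [120, 0, 0, 0]},
]

with open("pontis_input.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["bridge", "element", "qty", "unit", "cs1", "cs2", "cs3", "cs4"])
    for rec in inspection:
        # field-program check: condition-state quantities must sum to the element total
        assert sum(rec["states"]) == rec["qty"]
        writer.writerow([rec["bridge"], rec["element"], rec["qty"], rec["unit"], *rec["states"]])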

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In the last decades computer programs have been developed to assist clinicians in this assignment. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a standard personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software's characteristics. The numbers of drugs handled vary widely, and 8 programs offer users the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly. Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.
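
The weighted scoring grid described above can be sketched as a weighted sum over evaluation criteria; the criteria names follow the abstract, but the weights and raw scores are invented placeholders:

WEIGHTS = {
    "pharmacokinetic relevance": 0.35,
    "user-friendliness": 0.25,
    "computing aspects": 0.15,
    "interfacing": 0.15,
    "storage": 0.10,
}

def weighted_score(scores):
    """scores: criterion -> raw score on a 0-10 scale."""
    return sum(WEIGHTS[criterion] * scores[criterion] for criterion in WEIGHTS)

example = {"pharmacokinetic relevance": 8, "user-friendliness": 9,
           "computing aspects": 7, "interfacing": 5, "storage": 6}
print(f"overall score: {weighted_score(example):.2f} / 10")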

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The numbers of drugs handled vary from 2 to more than 180, and some programs integrate different population types. 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been made in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.
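
The a priori step mentioned above (proposing a regimen from patient covariates alone, before any concentration is measured) can be illustrated with a toy covariate model; the allometric weight scaling and all parameter values are invented and are not taken from any of the benchmarked programs:

def a_priori_daily_dose(weight_kg, target_auc24=400.0, typical_cl=4.0):
    """Return a daily dose (mg) expected to reach a target AUC0-24 (mg*h/L)."""
    predicted_cl = typical_cl * (weight_kg / 70.0) ** 0.75  # predicted clearance, L/h
    return target_auc24 * predicted_cl                      # dose = AUC x CL

print(f"{a_priori_daily_dose(90.0):.0f} mg/day for a 90 kg patient")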

Relevance:

30.00%

Publisher:

Abstract:

Software projects have proved troublesome to implement, and as software size keeps increasing it becomes more and more important to follow up on projects. The proportion of successful software projects is still quite low despite research on, and development of, project control methodologies. The success and failure factors of projects are known, as are the project risks, but projects nevertheless still have problems keeping to the schedule and the budget and achieving the defined functionality and adequate quality. The purpose of this thesis was to find out what deviations currently occur in projects, what causes them, and what is measured in projects. Project deviation was also defined from the viewpoint of the literature and of field experts. The analysis was made using a qualitative research approach. It was found that software projects still deviate in schedule, budget, quality, requirements, documentation, effort, and resources; changes in requirements were also identified. It was further found that, for example, schedule deviations can be addressed by reducing the size of a task and by adding measurements.
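
As a small illustration of the kind of measurement the thesis argues for, planned-versus-actual deviation percentages can be tracked per metric; the figures and the 10% flagging threshold below are invented:

def deviation_pct(planned, actual):
    return 100.0 * (actual - planned) / planned

project = {"schedule_days": (120, 150), "budget_keur": (300, 340), "effort_pm": (18, 23)}
for metric, (planned, actual) in project.items():
    d = deviation_pct(planned, actual)
    flag = "DEVIATION" if abs(d) > 10.0 else "ok"
    print(f"{metric}: planned {planned}, actual {actual}, {d:+.1f}% [{flag}]")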