978 results for Software Testing
Abstract:
A study of network resource optimization based on the use of a dedicated server managed with free software.
Abstract:
This work focuses on providing a software solution for property management using new technologies, while also supplying the tools needed to build a community around it. Development of the solution covers a description of the tools employed and the stages of construction, including analysis, design, implementation, and subsequent deployment. Emphasis is placed on the chosen framework in order to demonstrate the advantages of applying it. On the community side, the text describes the tools used to publicize the project, including the publication of a project page, the use of social networks and advertising pages, and the roll-out of collaboration software to manage the project's development.
Abstract:
The emergence of open source software in recent years has become a common topic of study in different fields, from its most technical characteristics to its economic aspects. This paper examines the current state of the literature on the economics of open source and explores the uses, infrastructure, and expectations regarding it among retail businesses and institutions in the town of Igualada. This qualitative case study finds that current equipment and levels of ICT use are low, and that the town's stores are receptive to a potential introduction of open source software.
Abstract:
In this project, research is carried out both into finding predictors via clustering techniques and into reviewing free Data Mining software. The research is based on a case study in which, in addition to the free KDD software used by the scientific community, a new free tool for pre-processing the data is presented. The predictors are intended for the e-learning domain, as the data from which they must be inferred are student grades from different e-learning environments. Through our case study, not only are clustering algorithms tested, but additional goals are also proposed.
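As a minimal illustration of the kind of clustering-based predictor search described above, the sketch below groups toy student grades with k-means. The library, feature layout, and cluster count are our assumptions for illustration, not the project's actual pipeline.

```python
# Hypothetical sketch: clustering student grades to derive candidate
# predictors, in the spirit of the project described above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Toy matrix: one row per student, one column per graded activity.
grades = np.array([
    [9.0, 8.5, 9.2],
    [4.0, 5.1, 3.8],
    [8.8, 9.0, 8.7],
    [3.5, 4.2, 4.9],
    [6.5, 6.0, 7.1],
])

scaled = StandardScaler().fit_transform(grades)  # pre-processing step
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaled)

# Cluster assignments and centroids; centroids act as candidate
# performance profiles from which predictors could be inferred.
print(model.labels_)
print(model.cluster_centers_)
```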
Abstract:
This project aims to implement a free system under GNU/Linux for receiving and evaluating papers submitted to an Argentine conference on embedded systems. Looking ahead, there are also plans to organize a peer-reviewed free-software event, for which it would be consistent to use a system released under a free software license.
Abstract:
Development of software for quality control and the automated generation of technical reports on status files produced by AUVs (autonomous underwater vehicles).
Abstract:
The main objective of the study is to evaluate the possible introduction into an ERP of a complex production environment such as the world of electronics. We start from the hypothesis that the company wants to replace its current, internally developed management software, and we consider the possibility of bringing this environment under the current ERP or using third-party management software.
Abstract:
The aim of this study was to investigate the performance of a new and accurate method for the detection of isoniazid (INH) and rifampicin (RIF) resistance among Mycobacterium tuberculosis isolates using a crystal violet decolourisation assay (CVDA). Fifty-five M. tuberculosis isolates obtained from culture stocks stored at -80°C were tested. After bacterial inoculation, the samples were incubated at 37°C for seven days, and 100 µL of CV (25 mg/L stock solution) was then added to the control and sample tubes. The tubes were incubated for an additional 24-48 h. CV (blue/purple) is decolourised in the presence of bacterial growth; thus, if CV lost its colour in a sample containing a drug, the tested isolate was reported as resistant. The sensitivity, specificity, positive predictive value, negative predictive value, and agreement were 92.5%, 96.4%, 96.1%, 93.1%, and 94.5%, respectively, for INH, and 88.8%, 100%, 100%, 94.8%, and 96.3%, respectively, for RIF. Results were obtained within eight to nine days. This study shows that CVDA is an effective method for detecting M. tuberculosis resistance to INH and RIF in developing countries. The method is rapid, simple, and inexpensive. Nonetheless, further studies are necessary before routine laboratory implementation.
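For reference, the reported accuracy measures follow the standard definitions computed from the 2x2 contingency table against the reference method; the formulas below are the textbook ones, offered as a gloss rather than the paper's own notation.

```latex
% TP/TN/FP/FN are true/false positives/negatives with respect to the
% reference susceptibility result.
\begin{align*}
\text{Sensitivity} &= \frac{TP}{TP + FN} &
\text{Specificity} &= \frac{TN}{TN + FP} \\
\text{PPV} &= \frac{TP}{TP + FP} &
\text{NPV} &= \frac{TN}{TN + FN}
\end{align*}
```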
Abstract:
The accuracy of the MicroScan WalkAway, BD Phoenix, and Vitek-2 systems for susceptibility testing of quinolones and aminoglycosides against 68 enterobacteria carrying qnrB, qnrS, and/or aac(6')-Ib-cr was evaluated using reference microdilution. Overall, one very major error (0.09%), six major errors (0.52%), and 45 minor errors (3.89%) were noted.
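The error categories reported above follow the usual taxonomy for comparing an automated system against a reference method. A minimal sketch, assuming the conventional S/I/R coding; the category definitions reflect common susceptibility-testing practice, not code from the study.

```python
# Classify a discrepancy between a reference result and a test-system
# result, using the conventional error taxonomy:
#   very major error = false susceptible (reference R, test S)
#   major error      = false resistant  (reference S, test R)
#   minor error      = any discrepancy involving an intermediate (I)
def classify_error(reference: str, test: str) -> str:
    """reference/test: 'S' (susceptible), 'I' (intermediate), 'R' (resistant)."""
    if reference == test:
        return "agreement"
    if reference == "R" and test == "S":
        return "very major error"
    if reference == "S" and test == "R":
        return "major error"
    return "minor error"

print(classify_error("R", "S"))  # -> very major error
```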
Abstract:
This project aims to develop the tools needed to create a conceptual map of an organization's applications, to represent this map graphically, and to monitor the status of each application. Specifically, it involves developing an XML format that makes it possible to identify and describe an application, detail the technology it is built with and the components it uses, specify its interactions or dependencies with other systems, and so on.
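A minimal sketch of what such an XML descriptor and its processing could look like; every element and attribute name below is invented for illustration, since the abstract does not fix the format.

```python
# Hypothetical application descriptor parsed with the standard library;
# each <dependency> element is an edge in the organization's application map.
import xml.etree.ElementTree as ET

doc = """
<application name="billing" technology="Java/Spring">
  <component>invoice-engine</component>
  <component>pdf-renderer</component>
  <dependency target="crm" kind="REST"/>
  <dependency target="ledger" kind="JMS"/>
</application>
"""

app = ET.fromstring(doc)
print(app.get("name"), "built with", app.get("technology"))
for dep in app.findall("dependency"):
    print("  depends on", dep.get("target"), "via", dep.get("kind"))
```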
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions, obeying a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume on one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent.

We review the basic mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics, all necessary for building quantitative strategies. We also contrast these models with real market data from the Dow Jones Industrial Average (DJIA), sampled at one-minute frequency.

Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped under the so-called technical models, the latter under so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time; that is, if we can tell whether the signal has occurred by examining the information up to the current time, or, more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is by contrast fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.

A model for forecasting an economic or financial magnitude may be defined with full scientific rigor and yet lack any economic value, making it useless from a practical point of view. This is why the project could not be complete without a backtest of the strategies above. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. For this reason we emphasize the calibration of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts in market-neutral strategies, and that calibration must be performed at high sampling frequency to track the current market situation.

As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest on real financial market data. The data sources used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics shown in this project were implemented from scratch in MATLAB as part of this thesis; no other mathematical or statistical software was used.
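For reference, the Ornstein-Uhlenbeck dynamics mentioned above are conventionally written as the stochastic differential equation below, together with the Euler-Maruyama discretization typically used to simulate it; this is the textbook form and a standard scheme, not necessarily the thesis's own notation or implementation.

```latex
% OU mean-reverting dynamics for a spread X_t, with reversion speed
% \theta > 0, long-run level \mu, volatility \sigma, and standard
% Brownian motion W_t:
dX_t = \theta(\mu - X_t)\,dt + \sigma\,dW_t
% Euler-Maruyama discretization with step \Delta t and
% \epsilon_k \sim N(0, 1):
X_{t+\Delta t} = X_t + \theta(\mu - X_t)\,\Delta t
               + \sigma\sqrt{\Delta t}\,\epsilon_k
```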
Abstract:
In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration, and voxel-wise statistical analysis of autoradiographs of mouse brain sections. The tool has been developed within the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain, from raw data acquisition to 3D statistical group analysis. Results of the group comparison, in the context of a study on spatial learning, are shown as an illustration of the data that can be obtained with this tool.
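As an illustration of the last step of such a chain, the sketch below runs an independent two-sample t-test at every voxel of two groups of registered volumes; shapes, values, and threshold are invented, and this is not JULIDE's actual code.

```python
# Voxel-wise group comparison on synthetic registered volumes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two groups of registered 3D volumes: (subjects, x, y, z).
group_a = rng.normal(100.0, 5.0, size=(8, 16, 16, 16))
group_b = rng.normal(103.0, 5.0, size=(8, 16, 16, 16))

# Independent two-sample t-test at every voxel (axis 0 = subjects).
t_map, p_map = stats.ttest_ind(group_a, group_b, axis=0)

# Uncorrected significance mask; a real analysis would also correct
# for multiple comparisons across voxels.
print((p_map < 0.001).sum(), "voxels below p = 0.001")
```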
Abstract:
A test kit based on living, lyophilized bacterial bioreporters that emit bioluminescence in response to arsenite and arsenate was applied during a field campaign in six villages across Bangladesh. Bioreporter field measurements of arsenic in groundwater from tube wells were in satisfactory agreement with the results of spectroscopic analyses of the same samples conducted in the lab. The practicability of the bioreporter test in terms of logistics and material requirements, suitability for high sample throughput, and waste disposal was much better than that of two commercial chemical test kits included as references. The campaigns furthermore demonstrated large local heterogeneity of arsenic in groundwater, underscoring the value of well switching as an effective remedy for avoiding high arsenic exposure.
Abstract:
In the eighties, John Aitchison (1986) developed a new methodological approach for the statistical analysis of compositional data. This methodology was first implemented in Basic routines grouped under the name CODA, and later in Matlab as NEWCODA (Aitchison, 1997). Since then, several other authors have published extensions to this methodology: Martín-Fernández and others (2000), Barceló-Vidal and others (2001), Pawlowsky-Glahn and Egozcue (2001, 2002), and Egozcue and others (2003). (...)
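At the core of Aitchison's methodology are log-ratio transforms. Below is a minimal sketch of the centred log-ratio (clr) transform, clr(x) = ln(x / g(x)) with g(x) the geometric mean of the parts; the definition is standard, and the code is illustrative rather than taken from CODA or NEWCODA.

```python
# Centred log-ratio transform of a composition of positive parts.
import numpy as np

def clr(x: np.ndarray) -> np.ndarray:
    """clr(x)_i = ln(x_i) - mean(ln(x)), i.e. ln(x_i / geometric_mean(x))."""
    logx = np.log(x)
    return logx - logx.mean()

composition = np.array([0.2, 0.3, 0.5])  # parts sum to 1
print(clr(composition))                  # components sum to ~0
```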
Abstract:
Job loss is widely known to lead to a substantial decrease in workers' subjective well-being. Functionalist theories explain this fact by arguing that the fundamental needs that work fulfills go unmet during unemployment. Recent evidence from longitudinal studies, however, contradicts this approach, showing that workers who find a new job do not fully regain their former level of well-being upon reemployment; other mechanisms must therefore be at work. We suggest that changes in social or economic domains of workers' lives, triggered by job displacement, lead to the observed changes in well-being. Drawing on a unique data set from a survey of workers displaced by plant closure in Switzerland after the financial crisis of 2008, our analysis confirms the previous result that finding a job after displacement does not completely restore workers' pre-displacement level of well-being. The factors that best explain this outcome are changes in social domains, notably changes in workers' job-related social status and their relationships with friends. This result provides valuable insight into the long-lasting scars that job displacement leaves on workers' lives.