947 results for Test method
Abstract:
The conformance of semantic technologies has to be systematically evaluated to measure and verify the real adherence of these technologies to the Semantic Web standards. Current evaluations of semantic technology conformance are not exhaustive enough and do not directly cover user requirements and usage scenarios, which raises the need for a simple, extensible and parameterizable method to generate test data for such evaluations. To address this need, this paper presents a keyword-driven approach for generating ontology language conformance test data that can be used to evaluate semantic technologies, details the definition of a test suite for evaluating OWL DL conformance using this approach, and describes the use and extension of this test suite during the evaluation of some tools.
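As a hedged illustration of the keyword-driven idea (the paper's actual keyword vocabulary and OWL serialization are not reproduced here; every name below is hypothetical), each keyword can be bound to a small generator of ontology axioms, so that a test script, i.e. a list of keyword invocations, expands into a conformance test document:

# Minimal sketch of keyword-driven test-data generation (hypothetical names;
# the real keyword vocabulary and OWL output format are defined in the paper).

def make_class(name):
    # Emit a named class declaration in OWL functional-style syntax.
    return f"Declaration(Class(:{name}))"

def make_subclass(sub, sup):
    # Emit a subsumption axiom between two classes.
    return f"SubClassOf(:{sub} :{sup})"

# Each keyword maps to a generator; a test script is a list of keyword calls.
KEYWORDS = {
    "class": make_class,
    "subclass": make_subclass,
}

def generate_test_data(script):
    # Expand a keyword script into a list of OWL axioms (one test document).
    return [KEYWORDS[kw](*args) for kw, *args in script]

if __name__ == "__main__":
    script = [("class", "Person"), ("class", "Student"),
              ("subclass", "Student", "Person")]
    print("\n".join(generate_test_data(script)))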
Abstract:
This paper describes a new and original method for designing oscillators based on the Normalized Determinant Function (NDF) and the Transpose Return Relations (RRT). Firstly, a review of the loop-gain method is presented, covering its pros and cons together with examples in which it yields wrong solutions. The method produces wrong solutions in some cases because certain necessary conditions have not been fulfilled. The necessary conditions required to guarantee a correct solution are described, and the need to use the NDF, or the Transpose Return Relations (RRT), which are related to the true loop gain, to test these additional conditions is demonstrated. To conclude, the steps for oscillator design and analysis using the proposed NDF/RRT method are presented. The wrong solutions of the loop-gain method are compared with the NDF/RRT results, and the accuracy of the method in estimating the oscillation frequency and the loaded Q (QL) is demonstrated. Additional examples of reference-plane oscillators (Z/Y/T) are added and analyzed with the proposed NDF/RRT method, even though these oscillators cannot be analyzed using the classic loop-gain method.
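For orientation only (the abstract does not reproduce the formulas; this summary follows the Platzker-Struble formulation commonly associated with the NDF and is an assumption, not a statement of the paper's derivation): the NDF is the network determinant normalized to its passive counterpart, it factors into transpose return relations evaluated one active device at a time, and the oscillation condition is imposed on the NDF rather than on an apparent loop gain:

\mathrm{NDF}(s) = \frac{\Delta(s)}{\Delta_0(s)} = \prod_{i=1}^{n}\bigl(1+\mathrm{RRT}_i(s)\bigr), \qquad \mathrm{NDF}(j\omega_0) = 0 \ \text{at steady-state oscillation},

where \Delta is the full network determinant, \Delta_0 is the determinant with all dependent sources nulled, and \mathrm{RRT}_i is the transpose return relation of the i-th active device computed with devices 1,\dots,i-1 already nulled.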
Abstract:
Old-growth trees play a very important role in the maintenance of biodiversity in forests. However, no clear definition is yet available to help identify them, since tree age is usually not recorded in National Forest Inventories. The aim of this study was to develop and test a new method to identify old-growth trees using a species-specific diameter threshold in National Forest Inventories. Different nonlinear mixed models for the diameter-age relationship were fitted using data from the Spanish Forest Inventory in order to identify the most appropriate one for Aleppo pine in its south-western distribution area. The asymptote of the optimal model indicates the threshold diameter for defining an old-growth tree. Additionally, five site index curves were examined to analyze the influence of site quality on these models.
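A minimal sketch of the underlying idea (the paper fits nonlinear mixed models; this toy uses a plain fixed-effects Chapman-Richards fit, and scipy together with the sample data are assumptions): fit an asymptotic diameter-age curve and read the old-growth threshold off its asymptote.

import numpy as np
from scipy.optimize import curve_fit

def chapman_richards(age, A, k, c):
    # Asymptotic growth curve: diameter approaches the asymptote A as age grows.
    return A * (1.0 - np.exp(-k * age)) ** c

# Toy (age, diameter) observations standing in for inventory data.
age = np.array([20, 40, 60, 80, 120, 160, 200], dtype=float)
dbh = np.array([8, 15, 21, 26, 33, 37, 39], dtype=float)

params, _ = curve_fit(chapman_richards, age, dbh,
                      p0=(45.0, 0.02, 1.0), bounds=(0, np.inf))
A, k, c = params
print(f"Estimated asymptote (old-growth diameter threshold): {A:.1f} cm")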
Abstract:
The boundary element method (BEM) has been applied successfully to many engineering problems during the last decades. Compared with domain-type methods like the finite element method (FEM) or the finite difference method (FDM), the BEM can handle problems where the medium extends to infinity much more easily, as there is no need to develop special boundary conditions (quiet or absorbing boundaries) or infinite elements at the boundaries introduced to limit the domain under study. The determination of the dynamic stiffness of arbitrarily shaped footings is just one of the fields where the BEM has been the method of choice, especially in the 1980s. With the continuous development of computer technology and the available hardware, the size of the problems under study grew and, as the flop count for solving the resulting linear system of equations grows with the third power of the number of equations, there was a need for iterative methods with better performance. In [1] the GMRES algorithm was presented, which is now widely used in implementations of the collocation BEM. While the FEM results in sparsely populated coefficient matrices, the BEM leads, in general, to fully or densely populated ones, depending on the number of subregions, posing a serious memory problem even for today's computers. If the geometry of the problem permits the surface of the domain to be meshed with equally shaped elements, many of the resulting coefficients will be calculated and stored repeatedly. The present paper shows how these unnecessary operations can be avoided, reducing both the calculation time and the storage requirement. To this end a similar coefficient identification algorithm (SCIA) has been developed and implemented in a program written in Fortran 90. The vertical dynamic stiffness of a single pile in layered soil has been chosen to test the performance of the implementation. The results obtained with the 3D model may be compared with those obtained with an axisymmetric formulation, which are considered the reference values as the mesh quality is much better. The entire 3D model comprises more than 35,000 dofs, the biggest single region being a soil region with 21,168 dofs. Note that the memory necessary to store all coefficients of this single region is about 6.8 GB, an amount usually not available in personal computers. In the problem under study, the interface zone between the two adjacent soil regions as well as the surface of the top layer may be meshed with equally sized elements. In this case the application of the SCIA leads to an important reduction in memory requirements: the maximum memory used during the calculation has been reduced to 1.2 GB. The application of the SCIA thus permits problems to be solved on personal computers which would otherwise require much more powerful hardware.
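For orientation, the quoted figure is consistent with a dense complex double-precision matrix: 21,168 x 21,168 entries at 16 bytes each give about 7.2e9 bytes, i.e. roughly 6.7 GB. The sketch below illustrates the idea behind similar-coefficient identification (the actual SCIA and its Fortran 90 implementation are the paper's; the Python names, and the simplifying assumption that equally shaped, equally oriented elements are congruent whenever their relative position matches, are illustrative only): coefficients are keyed by relative geometry so each distinct configuration is integrated only once.

import numpy as np

def pair_key(src, fld, tol=1e-9):
    # Key a (source element, field point) pair by their relative geometry,
    # rounded so that congruent configurations hash identically.
    rel = np.round((np.asarray(fld) - np.asarray(src)) / tol) * tol
    return tuple(rel.ravel())

def influence(src, fld):
    # Stand-in for the expensive BEM coefficient integration
    # (here: a simple 3D Laplace fundamental solution).
    r = np.linalg.norm(np.asarray(fld) - np.asarray(src))
    return 1.0 / (4.0 * np.pi * r)

cache = {}
def cached_influence(src, fld):
    # Compute each distinct relative configuration only once, then reuse it.
    key = pair_key(src, fld)
    if key not in cache:
        cache[key] = influence(src, fld)
    return cache[key]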
Abstract:
Background: Analysis of exhaled volatile organic compounds (VOCs) in breath is an emerging approach to cancer diagnosis, but little is known about its potential use as a biomarker for colorectal cancer (CRC). We investigated whether a combination of VOCs could distinguish CRC patients from healthy volunteers. Methods: In a pilot study, we prospectively analyzed the exhaled breath of 38 CRC patients and 43 healthy controls, all scheduled for colonoscopy, older than 50 and in the average-risk category. The samples were ionized and analyzed using Secondary ElectroSpray Ionization coupled with a time-of-flight mass spectrometer (SESI-MS). After a minimum of 2 hours of fasting, volunteers exhaled deeply into the system. Each test requires three soft exhalations and takes less than ten minutes. No breath condensate or collection is required, and VOC masses are detected in real time, which also allows a spirometric profile to be analyzed along with the VOCs. A new sampling system prevents ambient air from entering the system, so background contamination is reduced by an overall factor of ten. Potential confounding variables from the patient or the environment that could interfere with the results were analyzed. Results: 255 VOCs with masses ranging from 30 to 431 Da were identified in the exhaled breath. Using a classification technique based on the ROC curve of each VOC, a set of 9 biomarkers discriminating CRC patients from healthy volunteers was obtained, showing an average recognition rate of 81.94%, a sensitivity of 87.04% and a specificity of 76.85%. Conclusions: A combination of qualitative and quantitative analysis of VOCs in the exhaled breath could be a powerful diagnostic tool for the average-risk CRC population. These results should be taken with caution, as many endogenous or exogenous contaminants could interfere as confounding variables. On-line analysis with SESI-MS is less time-consuming and does not require sample preparation. We are recruiting for a new pilot study that includes breath-cleaning procedures and spirometric analysis incorporated into the post-processing algorithms, to better control for confounding variables.
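A hedged sketch of the kind of per-VOC ROC screening described above (the study's actual classifier and its 9-marker panel are not reproduced; scikit-learn and the synthetic data are assumptions):

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_subjects, n_vocs = 80, 255
X = rng.normal(size=(n_subjects, n_vocs))   # intensity of each VOC mass
y = rng.integers(0, 2, size=n_subjects)     # 1 = CRC, 0 = healthy control

# Score every VOC by how well it alone separates the two groups.
aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(n_vocs)])

# Keep the most discriminative markers (AUC far from 0.5 in either direction).
panel = np.argsort(np.abs(aucs - 0.5))[::-1][:9]
print("Candidate 9-VOC panel (column indices):", panel)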
Abstract:
The design of optical systems, considered an art by some and a science by others, has been practised for centuries. Imaging optical systems have been evolving since Ancient Egyptian times, as have the associated design techniques. Nevertheless, the most important developments in design techniques have taken place over the past 50 years, in part due to advances in manufacturing techniques and the development of increasingly powerful computers, which have enabled fast and efficient calculation and analysis of ray tracing through optical systems. This has led the design of optical systems to evolve from designs developed solely from paraxial optics to modern designs created using different multiparametric optimization techniques. The main problem the designer faces is that the different optimization techniques require an initial design as a starting point, and this starting point can constrain the reachable solutions. In other words, if the starting point is far from the global minimum, i.e. the optimal design for the set conditions, the final design may be a local minimum close to the starting point and far from the global minimum. This type of problem has led to the development of global optimization methods that are increasingly less sensitive to the starting point of the optimization process. Even though it is possible to obtain good designs with these techniques, many attempts are needed to reach the desired solution, and the whole process is surrounded by uncertainty, since there is no guarantee that the optimal solution will be found.

The Simultaneous Multiple Surfaces (SMS) method, originally conceived as a tool to calculate anidolic (nonimaging) concentrators, has also proved useful for the design of image-forming optical systems, although until now it has only occasionally been used for the design of imaging systems. This thesis aims to present the SMS method as a technique that can be used in general for the design of any optical system, whether with a fixed focal length or afocal with a defined magnification, and also as a tool that can be industrialized to help designers tackle the design of complex optical systems in a simple way.

The thesis is divided into five chapters. Chapter 1 establishes the basics, presenting the fundamental concepts the reader needs, even without an extensive background in image-forming optics, to understand the approach and the results presented in the following chapters. Chapter 2 addresses the problem of optimizing optical systems. Here the SMS method is presented as an ideal tool to obtain a starting point for the optimization process, and the importance of the starting point for the final solution is demonstrated through a worked example. Additionally, this chapter introduces various techniques for the interpolation and optimization of the surfaces obtained through the application of the SMS method. Even though only the SMS2D method is used in this thesis, a method based on radial basis functions (RBF) is also presented for the interpolation and optimization of the clouds of points obtained from the SMS3D method. Chapter 3 presents the design, manufacturing and measurement of a catadioptric panoramic lens designed to work in the long-wavelength infrared (LWIR, 8-12 microns) for perimeter surveillance applications. The lens is designed using the SMS method for three input wavefronts and four surfaces. The power of the design method is revealed by the ease with which this complex system is designed, and the images presented show how the prototype perfectly fulfils its purpose. Chapter 4 addresses the problem of designing ultra-compact optical systems. The concept of multichannel systems, i.e. optical systems composed of a series of channels that work in parallel, is introduced; such systems are especially suitable for the design of afocal systems. Design strategies for multichannel systems, both monochromatic and polychromatic, are presented, and the novel technique introduced in this chapter is used to design a telescope with a magnification of six and a half. Chapter 5 presents a generalization of the SMS method for meridian rays, together with the algorithm to be used for the design of any fixed-focal optical system. The so-called phase-1 optimization is inserted into the algorithm so that, by changing the initial conditions of the SMS design, the skew rays behave similarly even though the design is carried out for meridian rays. To test the power of the developed algorithm, a set of designs with different numbers of surfaces is presented. The stability and strength of the algorithm become apparent as, for the first time, a six-surface system is designed with the SMS method.
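As a hedged illustration of the RBF interpolation mentioned for SMS3D point clouds (the thesis' own scheme is not reproduced; scipy's RBFInterpolator and the toy surface are assumptions):

import numpy as np
from scipy.interpolate import RBFInterpolator

# Toy SMS3D-like cloud: sampled sag z over (x, y) on an optical surface.
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
z = 0.1 * (xy[:, 0] ** 2 + 0.8 * xy[:, 1] ** 2)   # stand-in sag values

# Build a smooth surface through the cloud and evaluate it on a grid.
surface = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=0.0)
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 50),
                            np.linspace(-1, 1, 50)), axis=-1).reshape(-1, 2)
z_grid = surface(grid)
print(z_grid.shape)   # (2500,)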
Abstract:
Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviate such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with symbolic expressions as input arguments rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, and the translated program is then executed symbolically by relying on the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special operations for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of execution paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes the inter-procedural path explosion by analyzing each component (method) of the program under test separately, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy based on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of the parts that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the execution through the parts that interest the programmer or tester most. (4) A new heap-constraint solver is proposed which efficiently handles heap-related constraints and aliasing of references during symbolic execution, greatly outperforming the standard state-of-the-art technique known as lazy initialization. (5) All the techniques above have been implemented in the PET system (the compositional approach also in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
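A minimal sketch of the path-exploration idea at the core of symbolic execution (the thesis symbolically executes CLP translations of Java programs; this toy instead uses the z3 solver, an assumption, to derive one test input per feasible path of a two-branch function):

from z3 import Int, Solver, And, Not, sat

# Program under test (conceptually):
#   def f(x):
#       if x > 0:
#           if x < 10: A
#           else:      B
#       else:          C
x = Int("x")
paths = {
    "A": And(x > 0, x < 10),
    "B": And(x > 0, Not(x < 10)),
    "C": Not(x > 0),
}

# One concrete input per feasible path condition = a path-coverage test suite.
for name, path_condition in paths.items():
    s = Solver()
    s.add(path_condition)
    if s.check() == sat:
        print(f"path {name}: test input x = {s.model()[x]}")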
Abstract:
In warm and dry climates, porous training systems should be required in order to allow a better leaf distribution inside the plant, providing more space in the cluster area and enhancing certain physiological processes both in the leaf (photosynthesis, ventilation, transpiration) and in the berry (growth and maturation). Plant geometry indexes, yield and must composition were studied in three different systems: sprawl with 12 shoots/m (S1), sprawl with 18 shoots/m (S2) and a vertically shoot-positioned system (VSP) with 12 shoots/m (VSP1). Total leaf area increases as the crop load does; however, surface area depends on two factors, crop load and the training system (VSP vs. sprawl), which can produce differences in leaf exposure efficiency. The main objective of this study was to validate digital photography measurements used to compare porosity differences among treatments and to assess how they affect plant microclimate and, therefore, yield and berry quality. All previously studied indexes (LAI, SA, SFEr) tended to overestimate the relationship between exposed leaf surface and porosity of each treatment, but the digital method proved to be an effective tool for assessing canopy porosity. Results showed that the non-positioned, free systems (sprawl) had between 25% and 50% more porosity in the cluster area than the fixed vertical system (VSP), which resulted in a better plant microclimate under the test conditions, mainly by improving the exposure of internal clusters and internal canopy ventilation. On the other hand, the higher crop load treatment (S2) showed a real increase in yield (16%) without any relevant change in must composition, and even improved the total anthocyanin content of the berry during ripening.
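A hedged sketch of the digital-photography porosity measurement (the paper's exact imaging and thresholding protocol is not reproduced; numpy and the brightness threshold are assumptions): porosity is taken as the fraction of gap pixels, i.e. bright-background pixels, in the cluster-zone region of a canopy image.

import numpy as np

def canopy_porosity(gray_image, gap_threshold=200):
    # gray_image: 2-D array of 0-255 brightness values from a canopy photo
    # taken against a bright background; bright pixels are canopy gaps.
    gaps = gray_image >= gap_threshold
    return gaps.mean()   # fraction of the frame that is gap, in [0, 1]

# Toy image: a mostly dark canopy with a bright gap band across it.
img = np.full((100, 100), 60, dtype=np.uint8)
img[40:60, :] = 230
print(f"porosity = {canopy_porosity(img):.0%}")   # -> 20%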
Abstract:
Thin-film (TF) photovoltaic modules have gained importance in the photovoltaic (PV) market, and new PV plants increasingly use TF technologies. In order to have a reliable sample of a PV module population, a large number of modules must be measured. There is a wide variety of materials used in TF technology: some modules are made of amorphous or microcrystalline silicon, others of CIS or CdTe. Not all these materials respond in the same way under standard test conditions (STC) of power measurement, and the power rating of a module may vary depending on both the extent and the history of its sunlight exposure. Thus, a testing method adapted to each TF technology is necessary, and this test must guarantee the repeatability of the measurements of generated power. This paper shows the responses of different commercial TF PV modules to sunlight exposure. Several test procedures were performed in order to find the best methodology to obtain measurements of TF PV modules at STC in the easiest way, and a methodology for indoor measurements adapted to these technologies is described.
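For context, a commonly used simplified translation of a power measurement to STC (1000 W/m2 irradiance, 25 degC cell temperature) scales linearly with irradiance and corrects with the module's power temperature coefficient. This is a generic approximation, not the paper's procedure, and the coefficient value below is only a placeholder:

def power_at_stc(p_measured, irradiance, cell_temp, gamma=-0.0035):
    """Translate a measured DC power to STC (1000 W/m2, 25 degC).

    gamma: power temperature coefficient in 1/degC (module-specific;
    -0.35 %/degC here is only a typical placeholder value).
    """
    p_irr = p_measured * 1000.0 / irradiance           # linear irradiance scaling
    return p_irr / (1.0 + gamma * (cell_temp - 25.0))  # temperature correction

# Example: 72 W measured at 850 W/m2 and 41 degC cell temperature.
print(f"{power_at_stc(72.0, 850.0, 41.0):.1f} W at STC")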
Abstract:
The comparison of the different bids in the tender for a project under the traditional contract system, based on re-measurement of quantities and fixed unit rates, requires analysis tools that are able to discriminate between proposals which, despite having a similar overall amount, may have a very different economic impact during the execution of the works. One situation not easily detected by traditional methods is the behaviour of the actual cost in response to changes in the quantities of work actually executed on site with respect to those estimated in the project. This paper addresses this situation by means of a quantitative risk analysis technique, the Monte Carlo method. This procedure, as is well known, consists in letting the input data that define the problem vary according to defined probability functions, generating a large number of test cases and treating the results statistically to obtain the most probable final values, together with the parameters needed to measure the reliability of the estimate. We present a model for the comparison of bids, designed so that it can be applied to real cases by applying to the known data variation conditions that are easy to set up by the professionals who perform these tasks.
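A minimal sketch of the Monte Carlo comparison described above (the paper's model, bill of items and probability functions are not reproduced; the two bids, the item list and the triangular +/-20% quantity variation are assumptions): each bid prices the same bill of quantities, the quantities are perturbed on every draw, and the resulting total-cost distributions are compared.

import numpy as np

rng = np.random.default_rng(42)

estimated_qty = np.array([1200.0, 300.0, 80.0])   # project-estimated quantities
unit_rates = {                                    # unit rates offered per bid
    "bid_A": np.array([10.0, 55.0, 210.0]),
    "bid_B": np.array([12.0, 48.0, 190.0]),
}

N = 100_000
# Quantities actually executed: triangular variation of +/-20% per item.
qty = rng.triangular(0.8 * estimated_qty, estimated_qty, 1.2 * estimated_qty,
                     size=(N, estimated_qty.size))

for name, rates in unit_rates.items():
    totals = qty @ rates
    print(f"{name}: mean = {totals.mean():,.0f}, "
          f"P95 = {np.percentile(totals, 95):,.0f}")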
Abstract:
Small punch (SP) test techniques are typically used to study the mechanical properties of materials or components from miniature-sized specimens. This kind of test was originally developed to assess the ductility loss caused in steel by irradiation or thermal treatment, particularly when the amount of metal available was limited, but it soon proved to be a powerful method for estimating several other properties.
Abstract:
The aim of this paper is to develop a probabilistic modeling framework for the segmentation of structures of interest from a collection of atlases. Given a subset of atlases registered to the target image for a particular region of interest (ROI), a statistical model of appearance and shape is computed for fusing the labels. Segmentations are obtained by minimizing an energy function associated with the proposed model using a graph-cut technique. We test different label fusion methods on publicly available MR images of human brains.
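As a hedged baseline illustration of label fusion (the paper's method is probabilistic and solved with graph cuts; this sketch shows only the simplest alternative, per-voxel majority voting over registered atlas labels, with numpy as an assumption):

import numpy as np

def majority_vote(atlas_labels):
    # atlas_labels: array of shape (n_atlases, X, Y, Z) of integer labels,
    # all already registered to the target image.
    n_labels = atlas_labels.max() + 1
    # Count, per voxel, how many atlases propose each label, then take argmax.
    votes = np.stack([(atlas_labels == l).sum(axis=0) for l in range(n_labels)])
    return votes.argmax(axis=0)

# Toy example: 3 atlases over a 2x2x1 volume, labels {0, 1}.
atlases = np.array([
    [[[0], [1]], [[1], [1]]],
    [[[0], [0]], [[1], [1]]],
    [[[1], [1]], [[0], [1]]],
])
print(majority_vote(atlases))   # fused segmentation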
Abstract:
This document details the planning and development of a package that follows the S4 standard of object-oriented programming in the R language. The package consists of a set of methods and classes for generating multiple-choice test exams and their solutions from an xls file, which plays the role of a database. The proposed design is object-oriented and develops a set of classes that represent the contents of a multiple-choice assessment: statements, questions and answers. A simple prototype implementing the basic functions needed to generate the tests has been produced. In addition, the documentation required to build the package has been generated; this means that every method has a help page that can be consulted from an R terminal, and this documentation includes execution examples for each method.
Abstract:
This paper presents a gravimetric study (based on 382 gravimetric stations in an area of about 32 km²) of a nearly flat basin: the Low Andarax valley. This alluvial basin, close to the river mouth, is located in the extreme south of the province of Almería and coincides with one of the existing depressions in the Betic Cordillera. The paper presents new methodological work to adapt a published inversion approach (the GROWTH method) to the case of an alluvial valley (sedimentary stratification, with density increasing downward). The adjusted 3D density model reveals several features in the topography of the discontinuity layers between the calcareous basement (2,700 kg/m³) and two sedimentary layers (2,400 and 2,250 kg/m³). We interpret several low-density alignments as corresponding to SE faults striking about N140-145°E. Some detected basement elevations (such as the one in Viator village, previously known from boreholes) appear to be connected with the fault pattern. The outcomes of this work are: (1) new gravimetric data, (2) new methodological options, and (3) the resulting structural conclusions.