862 results for Interactive Video Instruction: A Training Tool Whose Time Has Come


Relevance: 100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


BACKGROUND: Previous studies suggested that some interactive video games induce cardiovascular responses. However, some styles of video game have not been investigated. OBJECTIVE: We aimed to evaluate cardiovascular responses induced by video game boxing performance in healthy women. METHOD: We evaluated ten sedentary female volunteers, aged 20.9 ± 1.4 years, weight 58.7 ± 8.0 kg, height 163.2 ± 5.4 cm. All subjects were weighed and measured. Their heart rate, blood pressure and lactate levels were recorded before and after video game performance. The volunteers played a Nintendo® Wii video game using the boxing mode, in which all volunteers played for 10 minutes without interruption. At the end of the game the volunteers were reassessed using the same parameters mentioned above. RESULTS: At the end of the video game boxing performance we observed highly significant increases in lactate production (p < 0.0035), and the double product (heart rate × systolic blood pressure) was also higher (p < 0.0001). Both parameters indicate that the performance increased demands on the cardiovascular system. CONCLUSION: We conclude that a ten-minute video game boxing performance induces cardiovascular responses similar to aerobic exercise. This may be a practical form of exercise, but care should be taken with subjects with cardiovascular disorders.
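The double product (rate-pressure product) used above is simply heart rate multiplied by systolic blood pressure, an index of myocardial workload. A minimal sketch; the values below are illustrative, not data from the study:

```python
def double_product(heart_rate_bpm, systolic_bp_mmhg):
    """Rate-pressure product: heart rate x systolic blood pressure."""
    return heart_rate_bpm * systolic_bp_mmhg

# Hypothetical values for illustration only (not from the study):
rest = double_product(70, 110)    # at rest
after = double_product(140, 150)  # after 10 min of active play
assert after > rest  # higher double product = higher cardiovascular demand
```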


Graduate Program in Education - FFC


Calegari VC, Abrantes JL, Silveira LR, Paula FM, Costa JM Jr, Rafacho A, Velloso LA, Carneiro EM, Bosqueiro JR, Boschero AC, Zoppi CC. Endurance training stimulates growth and survival pathways and the redox balance in rat pancreatic islets. J Appl Physiol 112: 711-718, 2012. First published December 15, 2011; doi:10.1152/japplphysiol.00318.2011.

Endurance training has been shown to increase pancreatic beta-cell function and mass. However, whether exercise modulates signaling in beta-cell growth and survival pathways is not completely understood. This study investigated the effects of exercise on the levels of growth and apoptotic markers in rat pancreatic islets. Male Wistar rats were randomly assigned to 8 wk of endurance training or to a sedentary control group. Afterwards, pancreatic islets were isolated; gene expression and the total content and phosphorylation of several proteins related to growth and apoptotic pathways, as well as the main antioxidant enzymes, were determined by real-time polymerase chain reaction and Western blot analysis, respectively. Reactive oxygen species (ROS) production was measured by fluorescence. Endurance training increased the time to reach fatigue by 50%. Endurance training resulted in increased protein phosphorylation of AKT (75%), AKT substrate (AS160; 100%), mTOR (60%), p70s6k (90%), and ERK1/2 (50%), compared with islets from the control group. Catalase protein content was 50% higher, whereas ROS production was 49% and 77% lower in islets from trained rats under basal and glucose-stimulated conditions, respectively. Bcl-2 mRNA and protein levels increased by 46% and 100%, respectively. Bax and cleaved caspase-3 protein contents were reduced by 25% and 50% in islets from trained rats, respectively. In conclusion, these results demonstrate that endurance training favors beta-cell growth and survival by activating the AKT and ERK1/2 pathways, enhancing antioxidant capacity, and reducing ROS production and apoptotic protein content.


Gastrointestinal stromal tumors (GISTs) are the most common mesenchymal tumors of the gastrointestinal tract. This work considers the pharmacological response of GIST patients treated with imatinib from two different angles: the genetic and the somatic point of view. We analyzed the influence of polymorphisms on treatment outcome, considering SNPs in genes involved in drug transport and in the folate pathway. Naturally, these intriguing results cannot be considered the only main mechanism in imatinib response. GIST depends mainly on oncogenic gain-of-function mutations in the tyrosine kinase receptor genes KIT or PDGFRA, and the mutational status of these two genes, or the acquisition of secondary mutations, is considered the main player in GIST development and progression. To this purpose we analyzed the secondary mutations to better understand how they are involved in imatinib resistance. In our analysis we considered both imatinib and the second-line treatment, sunitinib, in a subset of progressive patients. KIT/PDGFRA mutation analysis is an important tool for physicians, as specific mutations may guide therapeutic choices. Currently, the only adaptation in treatment strategy is an imatinib starting dose of 800 mg/day in KIT exon-9-mutated GISTs. In the attempt to individualize treatment, genetic polymorphisms represent a novelty in the definition of biomarkers of imatinib response, in addition to the use of tumor genotype. Accumulating data indicate a contributing role of pharmacokinetics in imatinib efficacy, as well as in initial response, time to progression and acquired resistance. At the same time, it is becoming evident that genetic host factors may contribute to the observed inter-patient pharmacokinetic variability. Genetic polymorphisms in transporters and metabolism may affect the activity or stability of the encoded enzymes. Thus, integrating pharmacogenetic data on imatinib transporters and metabolizing genes, whose interplay has yet to be fully unraveled, has the potential to provide further insight into imatinib response/resistance mechanisms.


CONCLUSION: Our self-developed planning and navigation system has proven its capacity for accurate surgery on the anterior and lateral skull base. With the incorporation of augmented reality, image-guided surgery will evolve into 'information-guided surgery'. OBJECTIVE: Microscopic or endoscopic skull base surgery is technically demanding, and its outcome has a great impact on a patient's quality of life. The project aimed at developing and evaluating enabling navigation surgery tools for simulation, planning, training, education, and performance. This clinically applied technological research was complemented by a series of patients (n=406) who were treated by anterior and lateral skull base procedures between 1997 and 2006. MATERIALS AND METHODS: Optical tracking technology was used for positional sensing of instruments. A newly designed dynamic reference base with specific registration techniques, using a fine-needle pointer or ultrasound, enables the surgeon to work with a target error of < 1 mm. An automatic registration assessment method, which provides the user with a color-coded fused representation of CT and MR images, indicates to the surgeon the location and extent of registration (in)accuracy. Integration of a small tracker camera mounted directly on the microscope permits an ergonomically advantageous way of working in the operating room. Additionally, guidance information (augmented reality) from multimodal datasets (CT, MRI, angiography) can be overlaid directly onto the surgical microscope view. The virtual simulator, as a training tool in endonasal and otological skull base surgery, provides an understanding of the anatomy as well as preoperative practice using real patient data. RESULTS: Using our navigation system, no major complications occurred, even though the series included difficult skull base procedures. An improved quality of surgical outcome was identified compared with our control group without navigation and with the literature. Surgical time was reduced and more minimally invasive approaches were possible. According to the participants' questionnaires, the educational effect of the virtual simulator in our residency program received a high ranking.


Introduction: As students become more connected with the internet and other current technologies, the school of nursing has continued to investigate more innovative, meaningful, and effective uses of technology. One particular technology whose use has increased is the portable music/video player. Like the cell phone, MP3 players and iPods have become a standard accessory for students. To capitalize on this popular technology, the School has started several pilot projects involving podcasting in undergraduate and graduate nursing classes, and has also been involved in one research project using video iPods. [See PDF for complete abstract]


Medical doctors often do not trust the result of fully automatic segmentations because they have no possibility to make corrections if necessary. On the other hand, manual corrections can introduce a user bias. In this work, we propose to integrate the possibility of quick manual corrections into a fully automatic segmentation method for brain tumor images. This allows for necessary corrections while maintaining high objectivity. The underlying idea is similar to the well-known GrabCut algorithm, but here we combine decision forest classification with conditional random field regularization for interactive segmentation of 3D medical images. The approach has been evaluated by two different users on the BraTS2012 dataset. Accuracy and robustness improved compared to a fully automatic method, and our interactive approach was ranked among the top performing methods. Time for computation, including manual interaction, was less than 10 minutes per patient, which makes it attractive for clinical use.
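The general pattern here, per-voxel classifier probabilities regularized by a pairwise smoothness term, can be illustrated with a toy 2D sketch. This is not the paper's decision-forest/CRF implementation; it uses simple iterated conditional modes (ICM) with a Potts-style pairwise cost to show how regularization flips isolated, noisy classifier decisions:

```python
import numpy as np

def icm_regularize(prob_fg, beta=1.0, n_iter=5):
    """Smooth a per-pixel foreground-probability map (e.g. classifier output)
    with a Potts-style pairwise term, in the spirit of CRF regularization.
    Returns a binary label map. Toy illustration, not the paper's method."""
    labels = (prob_fg > 0.5).astype(int)
    h, w = labels.shape
    # Unary costs: negative log-probability of each label per pixel.
    unary = np.stack([-np.log(1 - prob_fg + 1e-9), -np.log(prob_fg + 1e-9)])
    for _ in range(n_iter):
        for y in range(h):
            for x in range(w):
                # 4-connected neighbours currently assigned.
                nb = []
                if y > 0: nb.append(labels[y - 1, x])
                if y < h - 1: nb.append(labels[y + 1, x])
                if x > 0: nb.append(labels[y, x - 1])
                if x < w - 1: nb.append(labels[y, x + 1])
                # Pick the label minimizing unary + disagreement penalty.
                costs = [unary[l, y, x] + beta * sum(n != l for n in nb)
                         for l in (0, 1)]
                labels[y, x] = int(costs[1] < costs[0])
    return labels

# A confident foreground region with one noisy low-probability pixel:
prob = np.full((6, 6), 0.9)
prob[3, 3] = 0.2
out = icm_regularize(prob)
assert out[3, 3] == 1  # the isolated outlier is smoothed away
```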


In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. In this paper, as a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions, such as the number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and can easily work with different image formats while respecting the real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
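The PU/Central Unit split can be sketched minimally as a supervisor collecting results from one worker per camera. This is a hedged illustration of the architecture's shape only (threads and a shared queue stand in for the real processing pipeline; `frame * 2` stands in for detection work):

```python
import threading
import queue

def processing_unit(cam_id, frames, results):
    # A PU owns acquisition (simulated by an input list) and processing
    # (a stand-in transform) for one camera, pushing results upstream.
    for frame in frames:
        results.put((cam_id, frame * 2))

def central_unit(frames_per_cam):
    # The Central Unit supervises the PUs: one per camera, then it
    # waits for completion and gathers all results.
    results = queue.Queue()
    pus = [threading.Thread(target=processing_unit, args=(cid, f, results))
           for cid, f in frames_per_cam.items()]
    for p in pus:
        p.start()
    for p in pus:
        p.join()
    return sorted(results.queue)

out = central_unit({"cam0": [1, 2], "cam1": [3]})
```

Adding a camera means adding one PU, which is the sense in which the design scales with the number of cameras.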


INTRODUCTION: EVA (Endoscopic Video Analysis), a new tracking system for extracting the motions of laparoscopic instruments based on non-obtrusive video tracking, was developed. The feasibility of using EVA in laparoscopic settings has been tested in a box trainer setup. METHODS: EVA makes use of an algorithm that employs information on the laparoscopic instrument's shaft edges in the image, the instrument's insertion point, and the camera's optical centre to track the 3D position of the instrument tip. A validation study of EVA comprised a comparison of the measurements achieved with EVA and with the TrEndo tracking system. To this end, 42 participants (16 novices, 22 residents, and 4 experts) were asked to perform a peg transfer task in a box trainer. Ten motion-based metrics were used to assess their performance. RESULTS: Construct validation of EVA has been obtained for seven motion-based metrics. Concurrent validation revealed a strong correlation between the results obtained by EVA and the TrEndo for metrics such as path length (ρ = 0.97), average speed (ρ = 0.94) or economy of volume (ρ = 0.85), proving the viability of EVA. CONCLUSIONS: EVA has been successfully used in the training setup, showing the potential of endoscopic video analysis to assess laparoscopic psychomotor skills. The results encourage further implementation of video tracking in training setups and in image-guided surgery.
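The concurrent-validation step boils down to correlating the same metric (e.g. path length) as measured by the two trackers. A minimal Pearson-correlation sketch; the data values are hypothetical, not taken from the study:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical path lengths for the same trials from the two systems:
eva    = [10.2, 14.8, 9.1, 20.5]
trendo = [10.0, 15.1, 9.4, 19.8]
r = pearson(eva, trendo)
assert r > 0.9  # strong agreement between the two trackers
```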


Background: Cognitive skills training for minimally invasive surgery has traditionally relied upon diverse tools, such as seminars or lectures. Web technologies for e-learning have been adopted to provide ubiquitous training and to serve as structured repositories for the vast number of laparoscopic video sources available. However, these technologies fail to offer such features as formative and summative evaluation, guided learning, or collaborative interaction between users. Methodology: The "TELMA" environment is presented as a new technology-enhanced learning platform that enhances the user's experience through a four-pillar architecture: (1) an authoring tool for the creation of didactic content; (2) a learning content and knowledge management system that incorporates a modular and scalable system to capture, catalogue, search, and retrieve multimedia content; (3) an evaluation module that provides learning feedback to users; and (4) a professional network for collaborative learning between users. Face validation of the environment and the authoring tool is presented. Results: Face validation of TELMA reveals the positive perception of surgeons regarding the implementation of TELMA and their willingness to use it as a cognitive skills training tool. Preliminary validation data also reflect the importance of providing an easy-to-use, functional authoring tool to create didactic content. Conclusion: The TELMA environment is currently installed and used at the Jesús Usón Minimally Invasive Surgery Centre and several other Spanish hospitals. Face validation results confirm the acceptance and usefulness of this new minimally invasive surgery training environment.


The aim of this doctoral Thesis is to develop a methodology for the automatic detection of anomalies from hyperspectral data, or imaging spectrometry, and their mapping under different typological conditions of surface and terrain. Hyperspectral technology, or imaging spectrometry, offers the potential to characterize precisely the state of the materials that make up the various surfaces on the basis of their spectral response. This state is usually variable, while observations are obtained in limited numbers and under particular illumination conditions. As the number of spectral bands grows, so does the number of samples needed to define the classes spectrally, in what is known as the Curse of Dimensionality or Hughes Effect (Bellman, 1957); such samples are usually unavailable and costly to obtain (one need only consider what this implies for Planetary Exploration). Under the definition of an anomaly, in the spectral sense, as the response of an image pixel that differs significantly from its surroundings, the central problem addressed in the Thesis is, first, how to reduce the dimensionality of the information in hyperspectral data, discriminating the part most significant for detecting anomalous responses, and second, how to establish the relationship between detected spectral anomalies and what we have called informational anomalies, that is, anomalies that convey some kind of real information about the surfaces or materials that produce them. Anomaly detection assumes no prior knowledge of the targets, so that pixels are separated automatically according to spectral information that differs significantly from a background estimated either globally for the whole scene or locally by image segmentation.

The methodology developed centres on the statistical definition of the spectral background, proposing a new approach that makes it possible to discriminate anomalies against backgrounds segmented into different groups of wavelengths of the spectrum, exploiting the potential separation between the reflective and emissive parts of the electromagnetic spectrum. The efficiency of the main anomaly detection algorithms was studied, contrasting the results of the RX algorithm (Reed and Xiaoli, 1990), adopted as a standard by the scientific community, with the UTD (Uniform Targets Detector) method, its RXD-UTD variant, subspace-based methods such as SSRX (Subspace RX), and methods based on image subspace projections, such as OSPRX (Orthogonal Subspace Projection RX) and PP (Projection Pursuit). A new method was developed, evaluated and contrasted against the preceding ones; it is a variation of PP that describes the spectral background by means of discriminant analysis of bands of the electromagnetic spectrum, separating the anomalies with an algorithm called the Thermal Background Anomaly Detector (Detector de Anomalías de Fondo Térmico, DAFT), applicable to sensors that record data in the emissive spectrum. The different anomaly detection methods were evaluated over ranges of the electromagnetic spectrum: visible and near infrared (VNIR), shortwave infrared (SWIR), mid infrared (MIR) and thermal infrared (TIR). The response of surfaces at the various wavelengths of the electromagnetic spectrum, together with their surroundings, influences the type and frequency of the spectral anomalies they may cause. For this reason, the research used hyperspectral data cubes from airborne sensors whose strategies and designs for spectrometric image construction differ.

Test datasets from the AHS (Airborne Hyperspectral System), HyMAP Imaging Spectrometer, CASI (Compact Airborne Spectrographic Imager), AVIRIS (Airborne Visible Infrared Imaging Spectrometer), HYDICE (Hyperspectral Digital Imagery Collection Experiment) and MASTER (MODIS/ASTER Simulator) sensors were evaluated. Experiments were designed over natural, urban and semi-urban settings of differing complexity. The behaviour of the different anomaly detectors was evaluated through 23 tests corresponding to 15 study areas grouped into 6 spaces or scenarios: Urban (E1), Semi-urban/Industrial/Urban periphery (E2), Forest (E3), Agricultural (E4), Geological/Volcanic (E5) and Other Spaces: Water, Clouds and Shadows (E6). The sensors evaluated are characterized by recording images over a wide range of narrow, contiguous bands of the electromagnetic spectrum. The Thesis centred on developing techniques that automatically separate and extract pixels, or groups of pixels, whose spectral signature differs in a discriminant way from those around them, adopting as the sample space part or all of the spectral bands in which the hyperspectral sensor recorded radiance. A factor taken into account in the research was the measuring instrument itself, that is, the characterization of the different subsystems, imaging and auxiliary sensors, involved in the process. In order to use the measured data quantitatively, it was necessary to define the spatial and spectral relationships of the sensor with the observed surface and with the potential anomalies and target detection patterns.

The influence of the sensor type on anomaly detection was analyzed, both in its spectral configuration and in the design strategies for recording the radiation coming from surfaces; the two main types of sensors studied were whiskbroom (rotating-mirror) scanners and pushbroom scanners. Different scenarios were defined in the research, covering a wide variability of geomorphological environments and cover types, in Mediterranean, mid-latitude and tropical settings. In summary, this Thesis presents an anomaly detection technique for hyperspectral data named DAFT, a variant of PP, based on dimensionality reduction that projects the background onto a range of thermal-spectrum wavelengths distinct from the projection of anomalies or targets with no known spectral signature. The proposed methodology was tested with real hyperspectral images from different sensors and in different scenarios or spaces, and therefore with different spectral backgrounds as well; the results show the benefits of the approach in detecting a wide variety of objects whose spectral signatures deviate sufficiently from the background. The technique turns out to be automatic in the sense that no parameter tuning is needed, giving significant results in all cases. Even subpixel-sized objects, which cannot be distinguished with the naked eye in the original image, can be detected as anomalies. In addition, a comparison is made between the proposed approach, the popular RX technique and other detectors, in both their global and local modes. The proposed method outperforms the others in certain scenarios, demonstrating its capacity to reduce the false alarm rate.

The results of the automatic DAFT algorithm developed have demonstrated an improvement in the qualitative definition of the spectral anomalies that identify distinct entities on or below the surface, replacing the classical normal-distribution model with a robust method that considers different alternatives from the very moment of hyperspectral data acquisition. Achieving this required analyzing the relationship between biophysical parameters, such as the reflectance and emissivity of materials, and the spatial distribution of detected entities with respect to their surroundings. Finally, the DAFT algorithm was chosen as the most suitable for sensors that acquire data in the TIR, since it shows the best agreement with the reference data, demonstrating great computational efficiency that facilitates its implementation in a mapping system that automatically projects the detected anomalies onto a geographic reference frame, confirming a significant advance towards what is called real-time mapping.

The aim of this Thesis is to develop a specific methodology to be applied in automatic anomaly detection processes using hyperspectral data, also called hyperspectral scenes, and to improve classification processes. Several scenarios and areas, and their relationship with surfaces and objects, have been tested. The spectral characteristics of the reflectance and emissivity parameters in the pattern recognition of urban materials in several hyperspectral scenes have also been tested.
Spectral ranges of the visible-near infrared (VNIR), shortwave infrared (SWIR) and thermal infrared (TIR) from hyperspectral data cubes of AHS (Airborne Hyperspectral System), HyMAP Imaging Spectrometer, CASI (Compact Airborne Spectrographic Imager), AVIRIS (Airborne Visible Infrared Imaging Spectrometer), HYDICE (Hyperspectral Digital Imagery Collection Experiment) and MASTER (MODIS/ASTER Simulator) have been used in this research. It is assumed that there is no prior knowledge of the targets in anomaly detection. Thus, the pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background, either globally for the full scene or locally by image segmentation. Several experiments on different scenarios have been designed, analyzing the behaviour of the standard RX anomaly detector and of different methods based on subspaces, image projection and segmentation-based anomaly detection. Results and their consequences for unsupervised classification processes are discussed. Detection of spectral anomalies aims at automatically extracting pixels that show significant responses in relation to their surroundings. This Thesis deals with the unsupervised technique of target detection, also called anomaly detection. Since this technique assumes no prior knowledge about the target or the statistical characteristics of the data, the only available option is to look for objects that are differentiated from the background. Several methods have been developed in recent decades, allowing a better understanding of the relationships between image dimensionality and the optimization of search procedures, as well as the subpixel differentiation of the spectral mixture and its implications for anomalous responses. In another sense, imaging spectrometry has proven efficient in the characterization of materials, based on statistical methods using specific reflection and absorption bands.

Spectral configurations in the VNIR, SWIR and TIR have been successfully used for mapping materials in different urban scenarios. There has been increasing interest in the use of high-resolution data (both spatial and spectral) to detect small objects and to discriminate surfaces in areas of urban complexity. This has come to be known as target detection, which can be either supervised or unsupervised. In supervised target detection, algorithms rely on prior knowledge, such as the spectral signature. The detection process for matching signatures is not straightforward, owing to the complications of relating airborne sensor data to material spectra on the ground. This can be further complicated by the large number of possible objects of interest, as well as uncertainty as to the reflectance or emissivity of these objects and surfaces. An important objective of this research is to establish relationships that allow linking spectral anomalies with what can be called informational anomalies and, therefore, to identify information related to anomalous responses in some places rather than simply spotting differences from the background. The development in recent years of new hyperspectral sensors and techniques widens the possibilities for applications in remote sensing of the Earth. Remote sensing systems measure and record the electromagnetic disturbances that surveyed objects induce in their surroundings, by means of different sensors mounted on airborne or space platforms. Map updating is important for management and decision makers, because of the fast changes that usually happen in natural, urban and semi-urban areas. It is necessary to optimize the methodology for getting the best out of remote sensing techniques with hyperspectral data. The first problem with hyperspectral data is to reduce the dimensionality while keeping the maximum amount of information.

Hyperspectral sensors considerably augment the amount of information; this allows better precision in separating materials, but at the same time a larger number of parameters must be calculated, and precision drops as the number of bands increases. This is known as the Hughes effect (Bellman, 1957). Hyperspectral imagery allows discrimination between a huge number of different materials; however, some land and urban covers are made of similar materials and respond similarly, which produces confusion in the classification. The training and the algorithm used for mapping are also important for the final result, and some properties of the thermal spectrum for detecting land cover are studied. In summary, this Thesis presents a new technique for anomaly detection in hyperspectral data, called DAFT, as a variant of PP, based on dimensionality reduction by projecting anomalies or targets with unknown spectral signature against the background, in a range of thermal-spectrum wavelengths. The proposed methodology has been tested with hyperspectral images from different imaging spectrometers corresponding to several places or scenarios, and therefore with different spectral backgrounds. The results show the benefits of the approach for the detection of a variety of targets whose spectral signatures deviate sufficiently from the background. DAFT is an automated technique in the sense that it is not necessary to adjust parameters, providing significant results in all cases. Subpixel anomalies, which cannot be distinguished by the human eye in the original image, can nevertheless be detected as outliers owing to the projection of the VNIR endmembers with a very strong thermal contrast. Furthermore, a comparison between the proposed approach and the well-known RX detector is performed in both modes, global and local.

The proposed method outperforms the existing ones in particular scenarios, demonstrating its capacity to reduce the probability of false alarms. The results of the automatic DAFT algorithm have demonstrated an improvement in the qualitative definition of spectral anomalies, replacing the classical normal-distribution model with a robust method. Achieving this required analyzing the relationship between biophysical parameters, such as reflectance and emissivity, and the spatial distribution of detected entities with respect to their environment, for example buried or semi-buried materials, or building covers of asbestos, cellular polycarbonate-PVC or metal composites. Finally, the DAFT method has been chosen as the most suitable for anomaly detection with imaging spectrometers that acquire data in the thermal infrared spectrum, since it presents the best results in comparison with the reference data, demonstrating great computational efficiency that facilitates its implementation in a mapping system towards what is called Real-Time Mapping.
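The standard RX detector used as the baseline above scores each pixel by its Mahalanobis distance to the global background statistics. A minimal sketch of global RX on synthetic data (this illustrates the RX baseline only, not the thesis's DAFT method):

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector (Reed and Xiaoli, 1990).
    cube: (rows, cols, bands) array. Returns a per-pixel anomaly score:
    the Mahalanobis distance to the scene-wide mean and covariance."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b)
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    # Small ridge term keeps the covariance invertible.
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(b))
    diff = pixels - mu
    scores = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    return scores.reshape(h, w)

# Synthetic scene: homogeneous background plus one anomalous pixel.
rng = np.random.default_rng(0)
cube = rng.normal(0.5, 0.05, size=(16, 16, 4))
cube[8, 8] += 0.5  # spectral anomaly at (8, 8)
scores = rx_scores(cube)
assert np.unravel_index(scores.argmax(), scores.shape) == (8, 8)
```

No parameters need tuning here beyond the numerical ridge, which is part of why RX became the community standard against which methods like DAFT are compared.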


This paper presents an open-source simulation tool being developed within the framework of a European research project. The tool, whose final version will be freely available through a website, allows the modelling and design of different types of grid-connected PV systems, such as large grid-connected plants and building-integrated installations. The tool is based on previous software developed by IES-UPM, whose models and energy-loss scenarios have been validated in the commissioning of PV projects carried out in Spain, Portugal, France and Italy, with an aggregated capacity of nearly 300 MW. This link between design and commissioning is one of the key points of the tool presented here, and it is not usually addressed by present commercial software. The tool provides, among other simulation results, the energy yield, the analysis and breakdown of energy losses, and estimations of financial returns adapted to the legal and financial frameworks of each European country. Besides, educational facilities will be developed and integrated in the tool, devoted not only to learning how to use the software, but also to training users in best PV system design practices. The tool will also include the recommendations of several PV community experts, who have been invited to identify present needs in the field of PV systems simulation: for example, the possibility of using meteorological forecasts as input data, or modelling the integration of large energy storage systems, such as vanadium redox or lithium-ion batteries. Finally, it is worth mentioning that during the verification and testing stages of this software development, it will also be open to suggestions received from the different actors of the PV community, such as promoters, installers, consultants, etc.
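A hedged sketch of the simplest form of the energy-yield estimate such tools refine: the widely used performance-ratio formula, E = P_STC · (H / G_STC) · PR, where PR aggregates the energy losses the tool breaks down in detail. The numbers below are illustrative, not outputs of the IES-UPM software:

```python
def annual_energy_yield(p_stc_kw, h_poa_kwh_m2, performance_ratio):
    """First-order yearly energy estimate for a grid-connected PV system.
    p_stc_kw: nominal array power at STC (kW).
    h_poa_kwh_m2: yearly in-plane irradiation (kWh/m^2).
    performance_ratio: aggregate loss factor (typically ~0.70-0.85)."""
    g_stc = 1.0  # STC irradiance, 1 kW/m^2
    return p_stc_kw * (h_poa_kwh_m2 / g_stc) * performance_ratio

# Hypothetical 100 kWp plant at a sunny site with PR = 0.80:
e_kwh = annual_energy_yield(100.0, 1800.0, 0.80)
assert abs(e_kwh - 144000.0) < 1e-6  # roughly 144 MWh/year
```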

Resumo:

Context. This thesis is framed within experimental software engineering. More concretely, it addresses the problems that arise when assessing process conformance in test-driven development experiments conducted by UPM's Experimental Software Engineering group. Process conformance was studied using Besouro, an Eclipse plug-in tool. It has been observed that Besouro does not work correctly in some circumstances, which casts doubt on the correctness of the existing experimental data and renders them unusable. Aim. The main objective of this work is the identification and correction of Besouro's faults. A secondary goal is fixing, to the maximum possible extent, the datasets already obtained in past experiments, so that existing experimental results can be used with confidence. Method. (1) Testing Besouro with different sequences of events (creation of methods, assertions, etc.) to identify the underlying faults; (2) fixing the code; and (3) fixing the datasets using code specially created for this purpose. Results. (1) We confirmed the existence of several faults in Besouro's code that affected Test-First and Test-Last episode identification; these faults caused the incorrect identification of 20% of episodes. (2) We were able to fix Besouro's code. (3) The correction of the existing datasets was possible, subject to some restrictions (such as the impossibility of tracing code-size increases back to programming time). Conclusion. The results of past experiments dependent on Besouro's data cannot be trusted. We suspect that more faults remain in Besouro's code, whose identification requires further analysis.
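The Test-First versus Test-Last distinction at the heart of these faults can be illustrated with a toy classifier over an ordered event stream. The event vocabulary and the rule below are hypothetical simplifications, not Besouro's actual heuristics:

```python
def classify_episode(events):
    """Classify a development episode by whether test code was edited
    before production code (hypothetical rule, not Besouro's)."""
    first_test = next((i for i, e in enumerate(events) if e == "test-edit"), None)
    first_prod = next((i for i, e in enumerate(events) if e == "production-edit"), None)
    if first_test is None or first_prod is None:
        return "unclassified"
    return "test-first" if first_test < first_prod else "test-last"
```

A fault in such logic — say, events delivered out of order — would misclassify episodes exactly as described above, and fixing an archived dataset amounts to re-running a corrected classifier over the stored event streams.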

Resumo:

This thesis analyzes the criteria with which concrete structures were designed and built up to 1973, the year of the Spanish EH-73 Instruction, which in content, format and approach consolidated the criteria used in their modern form up to the present day, and which is also heir to the CEB 1970 recommendations. Those years mark the shift from the Classic Theory to the Limit States approach. The goals pursued are, in summary: 1) To fill a clear gap in the study of the evolution of knowledge. There are treatises on the history of concrete that cover the story of people and achievements very completely, but not, at least not sufficiently, the evolution of knowledge. 2) To help today's engineers understand the structural configurations, geometries, reinforcement layouts, safety formats, etc. used in the past, supporting better-founded preliminary assessments of existing structures. 3) To serve as a reference for studies assessing the load-bearing capacity of existing constructions, forming the basis of a pre-normative document oriented in that direction. Indeed, this thesis aims to help today's engineers who face the need to preserve and repair reinforced concrete structures that form part of the inherited heritage. The great majority of these structures were built more than 40 years ago, so it is necessary to know the criteria that governed their design, calculation and construction. The thesis seeks to determine the ultimate limits, and hence the safety, of structures dimensioned with the criteria of the past when analyzed with today's calculation methodology. In this way, the "real" safety margin of structures dimensioned and calculated with criteria "different" from today's can be determined.
Knowing how structures built under the Classic Theory behave according to current criteria will allow today's engineer to address, in the most appropriate way, the range of needs that an existing structure may present. This work focuses on the evolution of knowledge, so construction processes are not included. Regarding design criteria, until the mid-20th century these were strongly influenced by tests and the resulting authors' works, on which the regulations of some countries were based: the Prussian regulation of 1904, the French Circular Order of 1906, the Congress of Liège of 1930. From the second half of the 20th century, the contributions of Spanish engineers such as Alfredo Páez Balaca, Eduardo Torroja and Pedro Jiménez Montoya, among others, stand out, advancing the calculation and safety criteria for concrete structures up to those known today. The guiding criterion for the design of concrete structures was founded, as is well known, on the postulates of the Classic Theory, in particular on the "critical moment", that at which concrete and steel simultaneously reach their admissible stresses, thereby ensuring the fullest use of the materials and, without consciously intending it, maximum ductility. If the acting moment exceeds the critical one, compression reinforcement is provided.
After studying many existing structures of the period, including the Official Bridge Collections of Juan Manuel de Zafra, Eugenio Ribera and Carlos Fernández Casado, the author concludes that their geometric definition does not correspond exactly to that resulting from the critical moment, since, as now, it was necessary to harmonize the reinforcement criteria at section level with the arrangement of the rebar along the different structural members. The calculation parameters, material strengths and safety formats evolved over the years. Material performance became better known, experience of the construction processes themselves grew and, to a lesser extent, so did knowledge of the acting loads; the associated uncertainties were consequently narrowed, allowing the safety coefficients used in calculation to be progressively adjusted. For example, for concrete a safety coefficient of 4 was used at the end of the 19th century, evolving to 3.57 after the publication of the French Circular Order of 1906, and to 3 after the Spanish Instruction of 1939. For steel, a much better-known material because of its extensive previous use, the safety coefficient remained almost constant over the years, at a value of 2. Another cause of the evolution of the calculation parameters was the improved understanding of structural behavior thanks to the vast planning and execution of tests, with the consequent theoretical studies, carried out by numerous authors, mainly Austrian and German, but also American and French. As for the calculation criteria, today's engineer may be surprised by how well the behavior of concrete was understood from the earliest years of its use.
Engineers knew about the non-linear behavior of concrete, but limited its working range to a linear stress-strain regime, because this ensured a prediction of structural behavior consistent with the hypotheses of Linear Elasticity and Strength of Materials, very well known at the beginning of the 20th century (unlike the theory of Plasticity, still unformulated, although implicit in the approaches of some engineers specialized in masonry (stone or brick) and steel structures). Moreover, this made the design somewhat independent of the actual strengths of the materials, removing the need for tests that, in practice, could hardly be performed given the scarcity of laboratories. Nor did they have computer programs or any of today's facilities that would have allowed them to work with concrete in a non-linear range. Thus, wisely and prudently, they limited the stresses and strains of the material to a known range. The modus operandi followed in preparing this thesis has been the following: -Documentary study: authors' documents, recommendations and regulations generated in this field, both in Spain and internationally, have been studied systematically following the index of the document. In this process, gaps in knowledge (and their effect on structural safety, where relevant) were detected, and differences from today's procedures were identified. It was also necessary to adapt the notation and terminology of the time to current criteria, which added further difficulty. -Development of the document: from the preliminary study, the following parts, which make up the content of the thesis, were developed: o People and institutions relevant for their contributions to the knowledge of concrete structures (research, regulation, teaching).
o Characterization of the mechanical properties of the materials (concrete and reinforcement): strengths, stress-strain diagrams, deformation moduli, moment-curvature diagrams, etc., including the classic characterization of concretes and the geometry and nature of the reinforcement. o Safety formats: a complex chapter from which enough information is extracted to let today's engineers understand the criteria used then and compare them with current ones. o Study of sections and members subjected to normal and tangential stresses: presenting the evolution in the treatment of simple and combined bending, shear, longitudinal shear, torsion, etc. This part also covers aspects that, while not a direct concern of the engineers of the past (cracking and deflections), matter more today in the face of changes of use and durability conditions. o Reinforcement details: bond, anchorage, the lapping of bars, the curtailment of bars, reinforcement arrangements as a function of member geometry and loading, etc., a chapter of obvious importance for today's engineers. An annex is included with the most significant references to the experimental studies on which the landmark proposals in the evolution of knowledge were based. Finally, together with the most important conclusions, proposals for future studies are set out. This thesis analyzes the criteria with which reinforced concrete structures were designed and constructed prior to 1973. Initially, the year 1970 was chosen as the starting point, coinciding with the CEB recommendations, but as the thesis developed it was decided that 1973 was the better choice, coinciding with the Spanish regulations of 1973, whose content, format and approach introduced the current criteria.
The period studied includes the Classic Theory. The intended goals of this thesis are: 1) To cover a clear gap in the study of the evolution of knowledge about reinforced concrete. The concept and accomplishments of reinforced concrete itself have been treated very completely by the main researchers in this area, but not the evolution of knowledge in this subject area. 2) To help engineers understand the structural configurations, geometries, dispositions of steel, safety formats, etc. that will support experts' preliminary judgments on existing structures. 3) To be a reference for studies on the assessment of the resistant capacity of existing constructions, constituting the basic study for a pre-regulation document. This thesis intends to help the current generation of engineers who need to preserve and repair reinforced concrete structures that have existed for a significant number of years. Most of the structures in question were constructed more than 40 years ago, so it is necessary to know the criteria that influenced their design, calculation and construction. This thesis sets out to determine the safety limits of the old structures and to analyze them in the context of the current regulations and their methodology. It will then be possible to determine the safety of these structures once measured and calculated with the current criteria, allowing engineers to optimize their treatment. This work considers the evolution of knowledge, so constructive methods are not included. Regarding design criteria, until the middle of the 20th century there existed a large number of diverse European tests and regulations, such as the Prussian norm of 1904, the French Circular Order of 1906 and the Congress of Liège of 1930, as well as individual engineers' own notes and criteria incorporating the results of their own tests.
From the second half of the 20th century, the contributions of Spanish engineers such as Alfredo Páez Balaca, Eduardo Torroja and Pedro Jiménez Montoya, among others, were significant, advancing the criteria for the calculation and safety of concrete structures, many of which persist to the present day. The design and calculation of reinforced concrete structures under the Classic Theory was based on the "critical bending moment", at which concrete and steel simultaneously reach their admissible stresses, allowing the best use of the materials and the best ductility. If the acting bending moment is greater than the critical bending moment, compression steel must be introduced. After studying the designs of many existing structures of that time, including the Historical Collections of Juan Manuel de Zafra, Eugenio Ribera and Carlos Fernández Casado, the author concludes that the geometric definition of the structures does not correspond exactly with the critical bending moment inherent in them. The parameters of these calculations changed throughout the years. The principal reason is that the materials were improving gradually and the uncertainties involved in calculation were decreasing, allowing a reduction of the safety coefficients used in calculation. For example, a safety coefficient of 4 was used for concrete towards the end of the 19th century, which evolved to 3.57 after the publication of the French Circular Order of 1906, and then to 3 after the Spanish Instruction of 1939. In the case of steel, a much better-known material, the safety coefficient remained almost constant throughout the years, with a value of 2. Another reason for the evolution of the calculation parameters was that the tests and research undertaken by an ever-increasing number of engineers allowed a more complete knowledge of the behavior of reinforced concrete.
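The quoted evolution of the global safety coefficient translates directly into admissible stresses. A minimal sketch, using the concrete coefficients cited in the text and an assumed, purely illustrative strength of 30 MPa:

```python
def admissible_stress(strength_mpa, safety_coefficient):
    """Classic Theory admissible stress: material strength divided
    by a single global safety coefficient."""
    return strength_mpa / safety_coefficient

# Coefficients for concrete as cited in the text; the strength is illustrative.
for year, coeff in [(1900, 4.0), (1906, 3.57), (1939, 3.0)]:
    print(year, f"{admissible_stress(30.0, coeff):.2f} MPa")
```

For steel the coefficient stayed at 2 throughout, so a bar with an assumed 240 MPa elastic limit would have worked at 120 MPa in every era.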
What is surprising is the extent of knowledge about the behavior of concrete that existed from the outset. Engineers of the early years knew that the behavior of concrete was non-linear, but they limited their work to a linear stress-strain range. This was due to the difficulty of working in a non-linear range: they had neither laboratories to test concrete nor facilities such as computers with appropriate software, something unthinkable today. For these reasons, engineers of previous generations limited the stresses and strains of the material to a known range. The modus operandi followed in developing this thesis is the following: -Document study: engineers' documents, recommendations and regulations generated in this area, both from Spain and overseas, have been studied systematically in accordance with the index of the document. In this process, gaps in knowledge concerning structural safety have been detected, and differences from current procedures have been identified and noted. It has also been necessary to adapt the notation and terminology of the Classic Theory to the current criteria, which posed an additional difficulty. -Development of the thesis: starting from this basic study, the following chapters of the thesis have been developed: o People and institutions relevant for their contributions to the knowledge of reinforced concrete structures (research, regulation, teaching). o Determination of the mechanical properties of the materials (concrete and steel): their strengths, stress-strain diagrams, moduli of deformation, moment-curvature diagrams, etc., including the classic characterizations of concrete and the geometry and nature of the steel.
o Safety formats: a difficult chapter intended to provide enough information to allow the present-day engineer to understand the criteria used in the Classic Theory and compare them with current theories. o Study of sections and members subjected to normal and tangential stresses: it presents the evolution in the treatment of simple and combined bending, shear, etc. Other aspects examined include those that were of little importance in the Classic Theory but are important today, such as deformation and cracking. o Details of reinforcement: bond, anchorage, the lapping of bars, the curtailment of bars, the dispositions of reinforcement depending on the geometry of the members and their loading, etc., a chapter of obvious importance for current engineers. The document includes an annex with references to the most significant experimental studies on which the milestone proposals in the evolution of knowledge in this area were based. Finally, conclusions and suggestions for future studies are included. A deep study of the documentation and research of the time has been carried out, juxtaposing its criteria and results with those considered relevant today, and comparing the resulting safety levels according to the Classic Theory criteria and to currently used criteria. This thesis fundamentally intends to be a guide for engineers who have to assess or repair a structure constructed according to the Classic Theory criteria.