978 results for Testing Power
Abstract:
OBJECTIVE: To assess the safety, feasibility, and results of early exercise testing in patients with chest pain admitted to the emergency room of the chest pain unit, in whom acute myocardial infarction and high-risk unstable angina had been ruled out. METHODS: A study of 1060 consecutive patients with chest pain admitted to the emergency room of the chest pain unit was carried out. Of these, 677 (64%) patients were eligible for exercise testing, but only 268 (40%) underwent the test. RESULTS: The mean age of the patients studied was 51.7±12.1 years, and 188 (70%) were males. Twenty-eight (10%) patients had a previous history of coronary artery disease, 244 (91%) had a normal or nonspecific electrocardiogram, and 150 (56%) underwent exercise testing within 12 hours of admission. The results of the exercise test were as follows: 34 (13%) were positive, 191 (71%) were negative, and 43 (16%) were inconclusive. In the group with a positive exercise test, 21 (62%) underwent coronary angiography, 11 underwent angioplasty, and 2 underwent myocardial revascularization. In univariate analysis, type A/B chest pain (definitely/probably anginal) (p<0.0001), previous coronary artery disease (p<0.0001), and route 2 (higher-risk patients) (p<0.0001) correlated with a positive or inconclusive test. CONCLUSION: In patients with chest pain in whom acute myocardial infarction and high-risk unstable angina had been ruled out, early exercise testing proved to be feasible, safe, and well tolerated.
Abstract:
This study aims to construct an explanatory model of political behaviour based on the contribution that social frameworks (social norms, citizenship norms, ideology, political trust) and socio-cognitive frameworks (affective intelligence, political interest, political efficacy, political knowledge, sense of community) show, in terms of the relationships among these variables. Our attention centres not only on the political behaviour of the citizenry, where most previous studies have been conducted, but also on the power elites that constitute the political system (judges, provincial legislators, and representatives of government institutions and non-governmental organisations). The study also seeks to establish the differences that may emerge in the relationship between these variables and political behaviour across the different groups studied in the city of Córdoba. To this end, a first instrumental-study stage will be carried out to analyse the psychometric properties of the instruments used to operationalise the variables, based on a convenience sample of 250 people between 18 and 65 years of age. Subsequently, two ex post facto study stages will be conducted in order to build the proposed models: the first will work with a convenience sample of 100 representatives of the power groups studied, and the second with a probability sample of 500 citizens of Córdoba between 18 and 65 years of age.
Abstract:
The use of wind energy is increasingly common worldwide as a response to the growing demand for energy, constant price increases, the scarcity of fossil fuels, and the ever more evident impacts of climate change. Consequently, interest in the participation of this new form of generation in the electric power system has grown considerably in recent years. Incorporating wind generation into the power system requires a detailed analysis of the electrical system as a whole, considering the interaction between wind farms and wind generation units, conventional generation plants, and the power system. Integrating renewable generation into the conventional power system presents new challenges that can be attributed to the characteristics of this type of generation, for example the fluctuation of energy due to the variable nature of the wind, the distributed nature of wind generation, and the construction characteristics and grid-connection methods of the different wind turbine models. The purpose of this research project is to investigate the impact on a competitive electricity market caused by the addition of wind generation. As a starting point, models of wind generation plants will be developed and then incorporated into power system models in order to carry out economic dispatch, load flow, transient analysis, and dynamic studies of the system.
Abstract:
The mechanisms of production and reproduction of political influence have been an important area of study in political science in recent decades. Different theories have competed in this field, from those that posit the predominant influence of power groups and corporate sectors on both state decisions and non-decisions, to those that argue that different interests compete within the State without any one group predominating. Network analysis makes it possible to study this object by observing the structure of relations among influential actors in provincial politics. Within this area, this project proposes to study how political influence is produced and reproduced in the Province of Córdoba. The project's hypotheses are the following: H1: The provincial socio-political power structure takes a network configuration in which a core of actors represents organised traditional interests and allows scant access to new organisations that defend diffuse social interests. H2: The provincial socio-political influence process operates through direct and indirect interpersonal influence mechanisms (brokerage) that allow actors to reach and influence public decision-makers. H3: A diversity of power resources intervenes in the socio-political influence process, which actors use to influence public policy. Accordingly, the project's objectives are: 1) Identify and analyse the power and influence structure underlying provincial politics. 2) Analyse the interests, actors, and sectors included in and excluded from the structure of political influence. 3) Analyse the mechanisms and resources for producing and reproducing power and influence. 4) Analyse the provincial policy areas in which the actors and sectors that make up the socio-political power structure exert influence. 5) Analyse the collective decision system (policy domain) in two provincial policy areas. 6) Analyse the resources that enable actors to exert power and influence in the policy areas studied. For the empirical verification of the hypotheses, the research design includes mapping and analysing two different types of political networks: the "network of influence in provincial politics" and the network of influence in a "public policy area". The political networks will be reconstructed through semi-structured interviews with social and political actors, using non-probabilistic "snowball" sampling. The research aims to contribute to the understanding of political coordination and, in that sense, to achieve an adequate description and understanding of the processes of influence and power structuring in the Province of Córdoba.
Abstract:
Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures enable the acceleration of algorithms when the algorithms are properly parallelised and exploit the specific processing power of the underlying architecture. However, recasting an algorithm in parallel form is complex, and that form is specific to each type of parallel hardware. Most current processors are targeted at general purposes and integrate several processor cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit. Nowadays even desktop computers make use of multicore processors, and the industry trend is to increase the number of integrated processor cores as technology matures. On the other hand, Graphics Processor Units (GPUs), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration. Currently available GPUs are able to run on the order of 200 to 400 threads for parallel processing. Scientific computing can be implemented on this hardware thanks to the programmability of new GPUs, which have come to be called General Processing Graphics Processor Units (GPGPUs). However, GPGPUs offer little memory compared with that available to general-purpose processors, so the implementation of algorithms needs to be addressed carefully. Finally, Field Programmable Gate Arrays (FPGAs) are programmable devices that can implement hardware logic with low latency, high parallelism, and deep pipelines. These devices can be used to implement specific algorithms that need to run at very high speeds; however, their programming is harder than software approaches and debugging is typically time-consuming. In this context, where several alternatives for speeding up algorithms are available, our work aims at determining the main features of these architectures and developing the know-how required to accelerate algorithm execution on them. We seek to identify which algorithms fit best on a given architecture, and how the architectures can be combined so that they complement one another. Specifically, starting from the characteristics of the hardware, we will determine the properties a parallel algorithm must have in order to be accelerated; those properties in turn determine which of these types of hardware is most suitable for its implementation. In particular, we will consider the degree of data dependency, the need for synchronisation during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
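The trade-off discussed above, between how much of an algorithm can be parallelised and how many cores or threads the hardware offers, is commonly quantified with Amdahl's law. The abstract does not cite Amdahl's law by name; this is a minimal generic illustration.

```python
# Amdahl's law: the upper bound on speedup when a fraction p of an algorithm's
# work can be parallelised across n processing units. Illustrative only; the
# abstract does not cite Amdahl's law by name.
def amdahl_speedup(p, n):
    """Best-case speedup for parallel fraction p on n processing units."""
    return 1.0 / ((1.0 - p) + p / n)

# A 95%-parallel algorithm on an 8-core SMP versus ~400 GPU threads:
# extra units help, but the serial 5% caps the attainable speedup.
print(round(amdahl_speedup(0.95, 8), 2))
print(round(amdahl_speedup(0.95, 400), 2))
```

The point of the sketch is that beyond a few hundred units the serial fraction, not the hardware, dominates; this is why the data-dependency and synchronisation properties listed above matter when matching an algorithm to SMP, GPGPU, or FPGA hardware.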
Abstract:
The objective of this dissertation is to investigate the effect wind energy has on the Electricity Supply Industry in Ireland. Wind power generation is a source of renewable energy that is in abundant supply in Ireland and is fast becoming a resource on which Ireland depends for a diverse and secure supply of energy. However, wind is an intermittent resource and, coupled with variable demand, it raises integration issues in balancing demand and supply effectively. To maintain a secure supply of electricity to customers, wind power requires an operational reserve to ensure appropriate backup in situations of low wind but high demand. This dissertation examines the effect of this integration by comparing wind generation with conventional generation in the national grid, in order to ascertain the cost benefits of wind power generation against a scenario with no wind generation. The analysis then examines whether wind power can meet the pillars of sustainability. This entails looking at wind in a practical scenario to observe how it meets these pillars under the criteria of environmental responsibility, displacement of conventional fuel, cost competitiveness, and security of supply.
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
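The abstract does not describe the framework's actual API, so the following is purely an illustrative sketch (all names hypothetical) of the general pattern such a harness follows: run each algorithm over a set of input/ground-truth pairs and aggregate a score.

```python
# Illustrative sketch of an automated evaluation harness for image processing
# algorithms. All names here are hypothetical; the paper does not publish its API.

def evaluate(algorithm, test_cases, metric):
    """Run `algorithm` over (input, expected) pairs; return the mean metric score."""
    scores = [metric(algorithm(sample), expected) for sample, expected in test_cases]
    return sum(scores) / len(scores)

# Toy stand-ins: an "algorithm" that doubles pixel values, scored by exact match.
double = lambda img: [p * 2 for p in img]
exact_match = lambda out, exp: 1.0 if out == exp else 0.0
cases = [([1, 2], [2, 4]), ([3], [6]), ([5], [11])]

print(evaluate(double, cases, exact_match))  # fraction of cases passed
```

A real harness would substitute image files for the toy lists and a perceptual or pixel-difference metric for exact match, but the control flow (batch execution plus scoring) is what automation buys over manual, case-by-case testing.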
Abstract:
Due to the global crisis of climate change, many countries throughout the world are installing the renewable energy of wind power into their electricity systems. Wind energy causes complications when it is being integrated into the electricity system due to its intermittent nature. Additionally, wind's intermittency can result in penalties being enforced due to the deregulation of the electricity market. Wind power forecasting can play a pivotal role in easing the integration of wind energy. Wind power forecasts at 24 and 48 hours ahead of time are deemed the most crucial for determining an appropriate balance on the power system. In the electricity market, wind power forecasts can also assist participants in applying a suitable bidding strategy or unit commitment, and can have an impact on the value of the spot price. For these reasons, this study investigates the importance of wind power forecasts for such players as Transmission System Operators (TSOs) and Independent Power Producers (IPPs). The study also investigates, through various case studies, the impacts that wind power forecasts can have on the electricity market in relation to bidding strategies, spot price, and unit commitment. The results of these case studies give a clear and insightful indication of the significance of availing of the information provided by wind power forecasts. The accuracy of a particular wind power forecast is also explored: forecast data are examined for both 24- and 48-hour horizons, and the accuracy of the forecasts is displayed through a variety of statistical approaches. The results of the investigation can assist market participants taking part in the electricity pool and also provide a platform that can be applied to any forecast when attempting to define its accuracy.
This study contributes significantly to knowledge in the area of wind power forecasting by explaining its importance within the energy sector. Its innovativeness and uniqueness lie in determining the accuracy of a particular wind power forecast that was previously unknown.
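The abstract does not name the statistical approaches it uses to display forecast accuracy. Two measures commonly applied to wind power forecasts are the mean absolute error and the root mean square error, sketched here with made-up data.

```python
import math

# Two common accuracy measures for a wind power forecast. Generic sketch with
# hypothetical data; the study's actual statistical approaches are not named.

def mae(actual, forecast):
    """Mean absolute error: the average miss, in the units of the series (MW)."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean square error: penalises large forecast misses more heavily."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

# Hypothetical hourly wind power output (MW) and its 24-hour-ahead forecast.
observed = [120, 135, 150, 140, 110, 95]
predicted = [110, 140, 145, 150, 100, 90]

print(round(mae(observed, predicted), 2))
print(round(rmse(observed, predicted), 2))
```

Comparing the same metric across the 24- and 48-hour horizons is what lets a market participant judge how much a forecast's usefulness degrades with lead time.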
Abstract:
Stand-alone solar-powered refrigeration and water desalination, two of the most popular and sought-after applications of solar energy systems, have been selected as the topic of research for the work presented in this thesis. A water desalination system based on evaporation and condensation was found to be the most suitable one to be powered by solar energy. It has been established that high-output fast-response solar heat collectors, used to achieve high rates of evaporation, and a reliable solar-powered cooling system, for faster rates of condensation, are the most important factors in achieving increased outputs in solar-powered desalination systems. Comprehensive reviews of solar-powered cooling/refrigeration and water desalination techniques are presented. Given that the Institute of Technology, Sligo has a well-established history of research and development in the production of state-of-the-art high-efficiency fast-response evacuated solar heat collectors, it was decided to use this know-how in the work described in this thesis; achieving high rates of evaporation was therefore not a problem. It was, consequently, the solar-powered refrigeration envisaged to facilitate rapid condensation of the evaporated water that had to be addressed first. The principles of various solar-powered refrigeration techniques are also reviewed. The first step in the work on solar-powered refrigeration was to successfully modify a conventional refrigerator of the Platen-Munters design to be powered by high-output fast-response evacuated solar heat collectors. In this work, the first successful attempt in the field, temperatures as low as -19°C were achieved in the icebox. A new approach in the use of photovoltaic technology to power a conventional domestic refrigerator was also attempted.
This was done by modifying a conventional domestic refrigerator to be powered by photovoltaic panels in the most efficient way. In the system developed and successfully tested in this approach, the power demand was reduced phenomenally, making it possible to achieve 48 hours of cooling with exposure to just 7 hours of sunshine. The successful development of the first multi-cycle intermittent solar-powered icemaker is without doubt the most exciting breakthrough in the work described in this thesis. An output of 74.3 kg of ice per module, with a total exposure area of 2.88 m2 (25.73 kg per m2 per day), is a major improvement over the roughly 5-6 kg of ice per m2 per day reported for all single-cycle intermittent systems. This system then became the basis for the development of a new solar-powered refrigeration system with even higher output, named the "composite" system, described in this thesis. Another major breakthrough of this work is the successful development and testing of a high-output water desalination system, which uses a combination of the high-output fast-response evacuated solar heat collectors and the multi-cycle icemaker. The system is capable of producing a maximum of 141 litres of distilled water per day per module with an exposure area of 3.24 m2, a production rate of 43.5 litres per m2 per day. When this result is compared to the reported daily output of 5 litres of desalinated water per m2 per day, the significance of this piece of work becomes apparent. In the presentation of many of the components and systems described in this thesis, CAD parametric solid modelling has been used instead of photographs to illustrate them more clearly. The multi-cycle icemaker and the high-output desalination systems are the subject of two patent applications.
Abstract:
Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line, because of the number of variables that require monitoring. The risk of elevated false alarms, due to the high-volume nature of the process, can present problems if univariate methods are used. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented is conducted on a manufacturing line which evaluates the performance of a fuel cell. It has three stages of production assembly that contribute to the final end-product performance. The product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using individual and batch observations. Modern techniques using multivariate control charts based on Hotelling's T2 are compared to other multivariate methods, such as Principal Components Analysis (PCA). The latter, PCA, was identified as the most suitable method. Control charts, such as scores, T2, and DModX charts, are constructed from the PCA model. Diagnostic procedures using contribution plots, for out-of-control points detected by these control charts, are also discussed; these plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared to the individual observations typically seen in continuous processes. Recommendations for the introduction of multivariate techniques appropriate for most high-volume processes are also covered.
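The false-alarm inflation that motivates the move away from univariate charts can be made concrete: with m independent charts, each with per-chart false-alarm rate alpha, the chance that at least one signals on an in-control sample is 1 - (1 - alpha)^m. A small generic illustration, not drawn from the thesis's data:

```python
# Why monitoring many variables with separate univariate charts inflates false
# alarms: with m independent charts, each with per-chart false-alarm rate alpha,
# the chance that at least one signals on an in-control sample grows with m.
# Generic illustration, not the thesis's data.
def family_false_alarm_rate(alpha, m):
    """Probability that at least one of m independent charts signals falsely."""
    return 1.0 - (1.0 - alpha) ** m

alpha = 0.0027  # conventional 3-sigma Shewhart limit
for m in (1, 10, 50):
    print(m, round(family_false_alarm_rate(alpha, m), 4))
```

At 50 monitored variables, roughly one in-control sample in eight would trigger an alarm somewhere, which is why a single multivariate statistic such as Hotelling's T2, with one overall alpha, is preferred on high-volume lines.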
Abstract:
The research described in this thesis was developed as part of the Information Management for Green Design (IMAGREE) Project. The IMAGREE Project was funded by Enterprise Ireland under a Strategic Research Grant Scheme as a partnership project between Galway Mayo Institute of Technology and CIMRU, University College Galway. The project aimed to develop a CAD-integrated software tool to support environmental information management for design, particularly for the electronics-manufacturing sector in Ireland.
Abstract:
This is a study of a state-of-the-art implementation of a new computer integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software, and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation, and computer technology. It is shown that introducing computer technology into the area of testing can have a major effect on issues such as efficiency, flexibility, data accuracy, test quality, and data integrity. The findings reaffirm how computer integration continues to benefit any organisation; with recent advances in computer technology, communication methods, and software capabilities, less expensive and more sophisticated test solutions are now possible, allowing more organisations to benefit from the many advantages associated with CIT. Examples of computer-integrated test set-ups and their associated benefits are discussed.
Abstract:
Distribution systems, eigenvalue analysis, nodal admittance matrix, power quality, spectral decomposition
Abstract:
ST2 is a biomarker belonging to the interleukin-1 receptor family, and circulating soluble ST2 concentrations are believed to reflect cardiovascular stress and fibrosis. Recent studies have demonstrated soluble ST2 to be a strong predictor of cardiovascular outcomes in both chronic and acute heart failure. It is a new biomarker that meets all the required criteria for a useful biomarker. Of note, it adds information to the natriuretic peptides (NPs), and some studies have shown it to be superior in terms of risk stratification. Since the introduction of NPs, this has been the most promising biomarker in the field of heart failure and may be particularly useful as a therapy guide.