978 results for Eddy Current Testing


Relevance: 20.00%

Abstract:

OBJECTIVE: To evaluate the influence of circadian variation on tilt-table testing (TTT) results by comparing the positivity rate of tests performed in the morning with that of tests performed in the afternoon, and to evaluate the reproducibility of the results in different periods of the day. METHODS: One hundred twenty-three patients with recurrent unexplained syncope or near-syncope referred for TTT were randomized into 2 groups. In group I (68 patients), TTT was performed first in the afternoon and then in the morning. In group II (55 patients), the test was performed first in the morning and then in the afternoon. RESULTS: The TTT protocol was the prolonged passive test, without drug sensitization. Twenty-nine (23.5%) patients had a positive result in at least one of the periods. The positivity rate for each period was similar: 20 (16.2%) patients in the afternoon and 19 (15.4%) in the morning (p=1.000). Total reproducibility (positive/positive and negative/negative) was observed in 49 (89%) patients in group II and in 55 (81%) in group I. Reproducibility of the results was obtained in 94 (90.4%) patients with a first negative test, but in only 10 (34%) patients with a first positive test. CONCLUSION: TTT may be performed during any period of the day, or even in both periods to enhance positivity. Given the low reproducibility of positive tests, serial TTT to evaluate therapeutic efficacy should be performed during the same period of the day.

Relevance: 20.00%

Abstract:

The job of health professionals, including nurses, is considered inherently stressful (Lee & Wang, 2002; Rutledge et al., 2009), so it is important to develop and refine measures that are sensitive to the specific demands health professionals face. This study analysed the psychometric properties of three instruments that focus on the professional experiences of nurses in aspects related to occupational stress, cognitive appraisal, and mental health. The evaluation protocol included the Stress Questionnaire for Health Professionals (SQHP; Gomes, 2014), the Cognitive Appraisal Scale (CAS; Gomes, Faria, & Gonçalves, 2013), and the General Health Questionnaire-12 (GHQ-12; Goldberg, 1972). Validity and reliability were examined with statistical analyses (i.e., confirmatory factor analysis, convergent validity, and composite reliability) that revealed adequate values for all of the instruments: a six-factor structure for the SQHP, a five-factor structure for the CAS, and a two-factor structure for the GHQ-12. In conclusion, this study offers three consistent instruments that may be useful for analysing nurses' adaptation to work contexts.
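
For context, the composite reliability mentioned here is conventionally computed from the standardized factor loadings and error variances of a confirmatory factor model; a standard formulation (general psychometric background, not a formula taken from this study) is:

```latex
\mathrm{CR} = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}
{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\operatorname{Var}(\varepsilon_i)}
```

where \lambda_i are the loadings of the k items on a factor and \operatorname{Var}(\varepsilon_i) their error variances; values of roughly 0.70 and above are usually read as adequate.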

Relevance: 20.00%

Abstract:

OBJECTIVE: To compare the blood pressure response to dynamic exercise in hypertensive patients taking trandolapril or captopril. METHODS: We carried out a prospective, randomized, blinded study of 40 patients with primary hypertension and no other associated disease. The patients were divided into 2 groups (n=20), paired by age, sex, race, and body mass index, and underwent 2 symptom-limited exercise tests on a treadmill before and after 30 days of treatment with captopril (75 to 150 mg/day) or trandolapril (2 to 4 mg/day). RESULTS: The groups were similar prior to treatment (p>0.05), and both drugs reduced blood pressure at rest (p<0.001). During treatment, trandolapril produced a greater increase in functional capacity (+31%) than captopril did (+17%; p=0.01) and provided better blood pressure control during exercise, observed as a reduction in the variation of systolic blood pressure per MET (trandolapril: 10.7±1.9 mmHg/MET vs 7.4±1.2 mmHg/MET, p=0.02; captopril: 9.1±1.4 mmHg/MET vs 11.4±2.5 mmHg/MET, p=0.35), a reduction in peak diastolic blood pressure (trandolapril: 116.8±3.1 mmHg vs 108.1±2.5 mmHg, p=0.003; captopril: 118.2±3.1 mmHg vs 115.8±3.3 mmHg, p=0.35), and a reduction in test interruptions due to excessive elevation in blood pressure (trandolapril: 50% vs 15%, p=0.009; captopril: 50% vs 45%, p=0.32). CONCLUSION: Trandolapril monotherapy is more effective than captopril monotherapy at controlling blood pressure during exercise in hypertensive patients.

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess the safety, feasibility, and results of early exercise testing in patients with chest pain admitted to the emergency room of a chest pain unit, in whom acute myocardial infarction and high-risk unstable angina had been ruled out. METHODS: A study of 1060 consecutive patients with chest pain admitted to the emergency room of the chest pain unit was carried out. Of these, 677 (64%) patients were eligible for exercise testing, but only 268 (40%) underwent the test. RESULTS: The mean age of the patients studied was 51.7±12.1 years, and 188 (70%) were males. Twenty-eight (10%) patients had a previous history of coronary artery disease, 244 (91%) had a normal or nonspecific electrocardiogram, and 150 (56%) underwent exercise testing within a 12-hour interval. Overall, 34 (13%) tests were positive, 191 (71%) were negative, and 43 (16%) were inconclusive. In the group of patients with a positive exercise test, 21 (62%) underwent coronary angiography, 11 underwent angioplasty, and 2 underwent myocardial revascularization. In a univariate analysis, type A/B chest pain (definitely/probably anginal) (p<0.0001), previous coronary artery disease (p<0.0001), and route 2 (patients at higher risk) (p<0.0001) correlated with a positive or inconclusive test. CONCLUSION: In patients with chest pain in whom acute myocardial infarction and high-risk unstable angina had been ruled out, exercise testing proved feasible, safe, and well tolerated.

Relevance: 20.00%

Abstract:

Advances in computing power today are driven by the parallel processing capabilities of new hardware architectures. Using this hardware well accelerates the algorithms (programs) being executed; however, properly converting an algorithm into its parallel form is complex, and that parallel form is specific to each type of parallel hardware. The most common general-purpose processors today are multicore processors, parallel processors also known as Symmetric Multi-Processors (SMP). It is now hard to find a desktop processor without some form of SMP-style parallelism, and the trend is toward processors with ever more cores. Graphics Processor Units (GPU), originally designed for video processing, have in turn grown their computing power by integrating multiple processing units, to the point that boards capable of running 200 to 400 parallel processing threads are now easy to find. These processors are very fast and highly specialised for the task they were built for, chiefly video processing; but because that task has much in common with scientific computing, these devices have been reoriented under the name General Processing Graphics Processor Unit (GPGPU). Unlike the SMP processors noted above, GPGPUs are not general-purpose: their use is complicated by the limited memory available on each board and by the type of parallel processing they must perform to be used productively. Finally, Field Programmable Gate Arrays (FPGA) are programmable logic devices capable of performing large numbers of operations in parallel, with low latency and deep pipelines, so they can be used to implement specific algorithms that exploit the parallelism they offer; their drawback is the complexity of programming and testing the algorithm instantiated on the device. Given this diversity of parallel processors, our work analyses the specific characteristics of each and their impact on the structure of algorithms, so that execution performance matches the resources used and the platforms can be combined to complement one another. Specifically, starting from the characteristics of the hardware, we determine the properties a parallel algorithm must have in order to be accelerated; these properties in turn determine which of the hardware types is best suited to its implementation. In particular, we take into account the degree of data dependence, the need for synchronisation during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
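
To make the data-dependence criterion concrete, here is a minimal Python sketch (an illustration, not code from the thesis, which targets SMP, GPGPU, and FPGA back-ends) of how a task with no dependence between elements maps directly onto an SMP multicore through a process pool:

```python
from multiprocessing import Pool
import math
import time

def work(x):
    # CPU-bound kernel with no data dependence between inputs,
    # so each element can be processed on a separate core.
    return sum(math.sin(x + i) for i in range(100_000))

if __name__ == "__main__":
    data = list(range(64))

    t0 = time.perf_counter()
    serial = [work(x) for x in data]          # one core
    t1 = time.perf_counter()

    with Pool() as pool:                      # one worker per core by default
        parallel = pool.map(work, data)       # same work spread across cores
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s  parallel: {t2 - t1:.2f}s")
```

An algorithm with strong data dependence (e.g., a recurrence in which step i needs the result of step i-1) would gain little from this mapping; that is exactly the algorithm property the abstract identifies as deciding which hardware, if any, can accelerate it.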

Relevance: 20.00%

Abstract:

The adoption of a sustainable approach to meeting the energy needs of society has recently taken on a more central and urgent place in the minds of many people, for ecological, environmental, and economic reasons. One area where a sustainable approach has become very relevant is the production of electricity. The contribution of renewable sources to the energy mix supplying the electricity grid is nothing new, but the focus has begun to move away from conventional renewable sources such as wind and hydro. Exploring new and innovative sources of renewable energy is now seen as imperative as the older forms (i.e. hydro) reach the saturation point of their possible exploitation. One such innovative source currently beginning to be utilised is tidal energy. The purpose of this thesis is to isolate one specific drawback of tidal energy that could be considered a roadblock to this source becoming a major contributor to the Irish national grid: the inconsistent way in which a tidal device generates energy over the course of a 24-hour period. This inconsistency of supply can force the cycling of conventional power plants to even out the supply, leading to additional costs. The thesis includes a review of the literature on tidal and other marine energy sources, with an emphasis on state-of-the-art devices currently in development or production. The research carried out included analysing tidal data and building from it a model of the power-generating potential at specific sites. A solution to the inconsistency of supply is then proposed: positioning tidal generation installations at specifically selected locations around the Irish coast. The temporal shift achieved in the power supply profiles of the individual sites by locating the installations correctly produced an overall power supply profile with a smoother curve and a consistent base-load energy supply. Some limitations of the method employed are also outlined, and suggestions for further improvements are made.
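
The smoothing effect of geographically spreading the installations can be sketched in a few lines of Python (illustrative only: the site phases, the pure M2 sinusoid, and the cubic power law are assumptions for the sketch, not the thesis's data or model):

```python
import numpy as np

M2_PERIOD_H = 12.42          # principal lunar semidiurnal tidal period
t = np.linspace(0, 24, 481)  # one day at 3-minute resolution

# Hypothetical sites whose tidal phase lags (radians) are spread
# roughly evenly across half a tidal cycle.
site_phases = {"site_A": 0.0, "site_B": 1.05, "site_C": 2.09}

def site_power(phase, rated=1.0):
    """Instantaneous power of one installation, normalised to its rating.

    Tidal-stream power scales with the cube of current speed; the speed
    is modelled here as a pure M2 sinusoid with the given phase lag.
    """
    speed = np.cos(2 * np.pi * t / M2_PERIOD_H + phase)
    return rated * np.abs(speed) ** 3

profiles = {name: site_power(ph) for name, ph in site_phases.items()}
combined = sum(profiles.values())

# One site swings between zero (slack water) and full rating, while
# the phase-spread aggregate stays much flatter:
print(f"single-site min/max: {profiles['site_A'].min():.2f} / "
      f"{profiles['site_A'].max():.2f}")
print(f"combined   min/max: {combined.min():.2f} / {combined.max():.2f}")
```

Because tidal power at a site scales with the cube of a current speed that is phase-shifted from one location to another, spacing the phases across the tidal cycle fills in each site's slack-water gaps, producing the smoother aggregate curve and firm base load described above.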

Relevance: 20.00%

Abstract:

The purpose of this research is to examine the main economic, legislative, and socio-cultural factors currently influencing the pub trade in Ireland and their specific impact on a sample of publicans in both Galway city and county. In approaching this task, the author engaged in a comprehensive literature review on the origin, history, and evolution of the Irish pub; examined the socio-cultural and economic role of the public house in Ireland; and developed a profile of the Irish pub by undertaking a number of semi-structured interviews with pub owners from the area. In doing so, the author obtained the publicans' views on the current state of their businesses, the extent to which patterns of trade have changed over recent years, the challenges and factors currently influencing their trade, the actions they believed necessary to promote the trade and address perceived difficulties, and how they viewed the future of the pub business within the framework of the current regulatory regime. In light of this research, the author identified a number of key findings and put forward a series of recommendations designed to promote the future success and development of the pub trade in Ireland. The research established that public houses are currently operating under a very unfavourable regulatory framework that has resulted in a serious decline of the trade over the last decade. This decline appears to have coincided initially with the introduction of the ban on smoking in the workplace and was exacerbated by the advent of more severe drink-driving laws, especially mandatory breath testing. Other unfavourable conditions include high levels of excise duty, value added tax, and local authority commercial rates. In addition to these regulatory factors, the research established that a major impediment to the pub trade is unfair competition from supermarkets and other off-licence retail outlets, especially the phenomenon of below-cost selling of alcohol. The recession has also been a major contributory factor in the decline of the trade, as has the trend towards lifestyle changes and home drinking, mirroring practice in some continental European countries.

Relevance: 20.00%

Abstract:

As digital image processing techniques become increasingly used in a broad range of consumer applications, developers have come to recognise the evaluation of algorithm performance as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is crucial that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
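
A minimal sketch of such an automated evaluation loop (the PSNR metric, directory layout, and function names are illustrative assumptions, not the framework described in the paper):

```python
import glob
import os
import numpy as np
from PIL import Image

def psnr(reference, result):
    """Peak signal-to-noise ratio between two 8-bit images."""
    mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

def evaluate(algorithm, image_dir, truth_dir):
    """Run `algorithm` over every test image and score it against ground truth."""
    scores = {}
    for path in sorted(glob.glob(os.path.join(image_dir, "*.png"))):
        name = os.path.basename(path)
        img = np.asarray(Image.open(path))
        truth = np.asarray(Image.open(os.path.join(truth_dir, name)))
        scores[name] = psnr(truth, algorithm(img))
    return scores

if __name__ == "__main__":
    # Example: score a trivial identity "algorithm" over a test corpus.
    results = evaluate(lambda img: img, "test_images", "ground_truth")
    for name, score in results.items():
        print(f"{name}: {score:.1f} dB")
```

Automating this loop is what shortens the development cycle: every algorithm revision is re-scored against the same corpus, so performance regressions surface immediately instead of during ad hoc manual testing.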

Relevance: 20.00%

Abstract:

The research described in this thesis was developed as part of the Information Management for Green Design (IMAGREE) Project. The IMAGREE Project was funded by Enterprise Ireland under a Strategic Research Grant Scheme as a partnership project between Galway-Mayo Institute of Technology and CIMRU, University College Galway. The project aimed to develop a CAD-integrated software tool to support environmental information management for design, particularly for the electronics manufacturing sector in Ireland.

Relevance: 20.00%

Abstract:

This is a study of a state-of-the-art implementation of a new computer integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software, and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation, and computer technology. It is shown that the introduction of computer technology into testing can have a major effect on efficiency, flexibility, data accuracy, test quality, data integrity, and more. The findings reaffirm how computer integration continues to benefit any organisation; moreover, with recent advances in computer technology, communication methods, and software capabilities, less expensive and more sophisticated test solutions are now possible, allowing more organisations to benefit from the many advantages associated with CIT. Examples of computer-integrated test set-ups and the benefits associated with computer integration are discussed.
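
As a sketch of what the smallest useful CIT loop looks like, the following Python script samples a (simulated) sensor at a fixed rate, applies a live pass/fail check, and logs to CSV. The sensor function, limits, and file name are illustrative assumptions; a real system would call the DAQ vendor's driver instead:

```python
import csv
import math
import random
import time

def read_temperature_c(elapsed_s):
    # Simulated sensor: a real CIT system would query the DAQ hardware here.
    return 4.0 + 2.0 * math.sin(elapsed_s / 30.0) + random.gauss(0, 0.05)

def run_test(duration_s=10, sample_period_s=1.0, log_path="test_log.csv"):
    """Sample the unit under test at a fixed rate and log results to CSV."""
    start = time.time()
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "temperature_c", "in_spec"])
        while (elapsed := time.time() - start) < duration_s:
            temp = read_temperature_c(elapsed)
            in_spec = 1.0 <= temp <= 7.0   # pass/fail check applied live
            writer.writerow([f"{elapsed:.1f}", f"{temp:.3f}", in_spec])
            time.sleep(sample_period_s)

if __name__ == "__main__":
    run_test()
```

Because acquisition, the pass/fail check, and logging run in a single program, the gains in data integrity, accuracy, and repeatability that the study describes follow directly from removing manual transcription from the test loop.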

Relevance: 20.00%

Abstract:

Abdominal Aortic Aneurysm (AAA) haemorrhaging is a life-threatening disease. An aneurysm is a permanent swelling of an artery due to a weakness in its wall. Current surgical repair involves opening the chest or abdomen, gaining temporary vascular control of the aorta, and suturing a prosthetic graft to the healthy aorta within the aneurysm itself. The outcome of this surgical approach is not perfect, and the quality of life after this repair is impaired by postoperative pain, sexual dysfunction, and a lengthy hospital stay resulting in high health costs. All these negative effects are related to the large incision and extensive tissue dissection. Endovascular grafting is an alternative to the standard surgical method and a less invasive way of treating aortic aneurysms. It involves a surgical exposure of the common femoral arteries, through which the stent graft can be inserted by an over-the-wire technique. All manipulations are controlled from a remote site by means of a catheter, and this technique avoids the need to expose the diseased artery directly through a large incision or an extensive dissection. The design method outlined in this project develops the endovascular approach. The main aim is to design a unitary bifurcated stent graft (i.e., a bifurcated graft as a single component) to treat these abdominal aortic aneurysms. This includes the delivery system and deployment mechanism necessary to first accurately position the stent graft across the aneurysm sac and across the iliac bifurcation, and second to fix the stent graft in position using expandable metal stents, thus excluding the aneurysm from the circulation and preventing rupture. Miniaturisation is a critical aspect of this design, as the smaller the crimped stent graft, the easier it is to guide through the vascular system to the desired location. Biocompatibility is equally important: the preferred materials for this prosthesis are shape memory alloys for the stent and a multifilament fabric for the graft. A tapered geometry is applied, as this gives favourable flow characteristics and reduced wave reflections. Adequate testing of the stent graft to prove its durability and the ease of the method of deployment is a prerequisite. A bench test facility has been designed and built to replicate the cardiovascular system and the disease in question, aortic aneurysms at the iliac bifurcation. The testing shows the feasibility of the proposed delivery system and the durability of the stent graft across the aneurysm sac. Finally, these endovascular treatments offer the economic advantage of short hospital stays, or even treatment as an outpatient, as well as elimination of the need for postoperative intensive care. The risk of developing an aneurysm increases with age, which is one of the main reasons to look for less invasive ways of treating aneurysms. Consequently, there is enormous pressure to develop and use these devices rapidly.

Relevance: 20.00%

Abstract:

ST2 is a biomarker belonging to the interleukin-1 receptor family, and circulating soluble ST2 concentrations are believed to reflect cardiovascular stress and fibrosis. Recent studies have demonstrated soluble ST2 to be a strong predictor of cardiovascular outcomes in both chronic and acute heart failure. It is a new marker that meets all the criteria required of a useful biomarker. Of note, it adds information to the natriuretic peptides (NPs), and some studies have shown it to be superior in terms of risk stratification. Since the introduction of NPs, this has been the most promising biomarker in the field of heart failure, and it might be particularly useful as a guide to therapy.

Relevance: 20.00%

Abstract:

This work describes a test tool for performance testing of different end-to-end available-bandwidth estimation algorithms and of their different implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in the congestion control mechanism of high-performance reliable transport protocols. The paper describes the options for providing an available-bandwidth estimation mechanism for high-speed data transport protocols and develops the basic functionality of a test tool with which the entities of the test application on all participating test hosts can be managed, aided by middleware.
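
One classic technique that such a tool would exercise is packet-pair probing: the bottleneck link spreads two back-to-back probe packets apart, and their dispersion yields a capacity estimate. A minimal simulated sketch (an illustration of the technique, not the paper's tool):

```python
import random

def bottleneck_dispersion(packet_bytes, capacity_bps, jitter_s=0.00002):
    """Time gap between two back-to-back packets after the bottleneck link.

    The bottleneck serialises the second packet behind the first, so the
    pair leaves separated by at least packet_bytes * 8 / capacity seconds;
    queueing noise is modelled as additive non-negative jitter.
    """
    return packet_bytes * 8 / capacity_bps + max(0.0, random.gauss(0, jitter_s))

def estimate_capacity(packet_bytes=1500, true_capacity_bps=100e6, pairs=50):
    # Taking the median over many pairs filters out gaps inflated by
    # cross traffic queued between the probe packets.
    gaps = sorted(bottleneck_dispersion(packet_bytes, true_capacity_bps)
                  for _ in range(pairs))
    median_gap = gaps[len(gaps) // 2]
    return packet_bytes * 8 / median_gap

if __name__ == "__main__":
    est = estimate_capacity()
    print(f"estimated bottleneck capacity: {est / 1e6:.1f} Mbit/s")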

Relevance: 20.00%

Abstract:

Today, usability testing is essential in the development of software and systems. A stationary usability lab offers many possibilities for evaluating usability, but it reaches its limits in terms of flexibility and experimental conditions. Mobile usability studies deliberately take outside influences into account, and they require a specially adapted approach to preparation, implementation, and evaluation. Using the example of a mobile eye-tracking study, the difficulties and opportunities of mobile testing are considered.