822 results for Impedance based structural health monitoring
Abstract:
Perfluoroalkylated substances are a group of chemicals that have been widely employed over the last 60 years in numerous applications, spreading and accumulating in the environment due to their extreme resistance to degradation. As a consequence, they have also been found in various types of food as well as in drinking water, proving that they can easily reach humans through the diet. The available information concerning their adverse effects on health has recently increased interest in these contaminants and highlighted the importance of investigating all potential sources of human exposure, among which diet has proved to be the most relevant. This need has been underlined by the European Union through Recommendation 2010/161/EU: in this document, Member States were called upon to monitor the presence of these substances in food and to produce accurate estimates of human exposure. The purpose of the research presented in this thesis, which is the result of a partnership between an Italian and a French laboratory, was to develop reliable tools for the analysis of these pollutants in food, to be used for generating data on potentially contaminated matrices. An efficient method based on liquid chromatography-mass spectrometry for the detection of 16 different perfluorinated compounds in milk was validated in accordance with current European regulatory guidelines (2002/657/EC) and is currently under evaluation for ISO 17025 accreditation. The proposed technique was applied to cow's milk, powdered milk and human breast milk samples from Italy and France to provide preliminary monitoring data on the presence of these contaminants. In accordance with the above-mentioned European Recommendation, this project also led to the development of a promising technique for the quantification of some precursors of these substances in fish. This method showed very satisfactory performance in terms of linearity and limits of detection, and will be useful for future surveys.
Abstract:
The quantification of the structural properties of snow is traditionally based on model-based stereology. Model-based stereology requires assumptions about the shape of the investigated structure. Here, we show how the density, specific surface area, and grain boundary area can be measured using a design-based method, where no assumptions about structural properties are necessary. The stereological results were also compared to X-ray tomography to control the accuracy of the method. The specific surface area calculated with the stereological method was 19.8 ± 12.3% smaller than with X-ray tomography. For the density, the stereological method gave results that were 11.7 ± 12.1% larger than X-ray tomography. The statistical analysis of the estimates confirmed that the stereological method and the sampling used are accurate. This stereological method was successfully tested not only on artificially produced ice beads but also on several snow types. Combining stereology and polarisation microscopy provides a good estimate of grain boundary areas in ice beads and in natural snow, with some limitations.
Abstract:
BACKGROUND: Steam pops are a risk of irrigated radiofrequency catheter ablation (RFA) and may cause cardiac perforation. Data to guide radiofrequency (RF) energy titration to avoid steam pops are limited. OBJECTIVE: This study sought to assess the frequency and consequence of audible pops and to determine the feasibility of using the magnitude of impedance change to predict pops. METHODS: We reviewed consecutive endocardial open-irrigated RFA for ventricular tachycardia (VT) with continuously recorded ablation data in 142 patients with structural heart disease. Steam pops were defined as an audible pop associated with a sudden spike in impedance. Ablation lesions before or after pops served as controls. RESULTS: From a total of 4,107 ablation lesions, 62 (1.5%) steam pops occurred in 42 procedures in 38 patients. Perforation with tamponade occurred with 1 of 62 (2%) pops. Applications with pops had a greater impedance decrease (22 ± 7 Ω vs. 18 ± 8 Ω, P = .001) and a higher maximum power (45 ± 5 W vs. 43 ± 6 W, P = .011), but did not differ in maximum catheter tip temperature (40 ± 4 °C vs. 40 ± 4 °C, P = .180) from applications without pops. Eighty percent of pops occurred after impedance decreased by at least 18 Ω. CONCLUSION: During VT ablation with open irrigation, audible pops are infrequent and do not usually cause perforation. Limiting RF power to achieve an impedance decrease of <18 Ω is a feasible method of reducing the likelihood of a pop when perforation risk is of concern.
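To make the threshold concrete, here is a minimal, purely illustrative Python sketch (not part of the study) that flags when the impedance drop during an application approaches the 18 Ω criterion reported above; the sampling interval, trace values and 2 Ω warning margin are hypothetical.

```python
def impedance_drop_warning(samples_ohm, threshold_drop=18.0, margin=2.0):
    """Return (index, drop) pairs where the impedance decrease from the baseline
    (first sample) comes within `margin` ohms of `threshold_drop`."""
    baseline = samples_ohm[0]
    warnings = []
    for i, z in enumerate(samples_ohm):
        drop = baseline - z
        if drop >= threshold_drop - margin:
            warnings.append((i, drop))
    return warnings

# Hypothetical impedance trace, one sample per second during a lesion.
trace = [148, 145, 141, 138, 135, 133, 131, 130, 129]
for idx, drop in impedance_drop_warning(trace):
    print(f"t={idx}s: impedance has fallen {drop:.0f} ohms from baseline")
```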
Abstract:
Gastro-oesophageal reflux disease (GERD) is a highly prevalent condition in Western countries, leading to millions of outpatient visits per year. GERD symptoms, including heartburn, regurgitation and chest pain, are caused by reflux of gastric content into the oesophagus even in the absence of endoscopically visible mucosal lesions. Several procedures are used to identify gastro-oesophageal reflux; those most widely used in clinical practice are conventional (catheter-based) pH monitoring, wireless oesophageal pH monitoring (Bravo), bilirubin monitoring (Bilitec), and combined multichannel intraluminal impedance-pH monitoring (MII-pH). Each technique has strengths and limitations of which clinicians and investigators should be aware when deciding which to choose for a particular patient. An important consideration is the ability to quantify gastro-oesophageal reflux and to evaluate the relationship between symptoms and reflux episodes. The present review summarises the technical aspects of performing and interpreting oesophageal reflux monitoring procedures.
Abstract:
Introduction Prospective memory (PM), the ability to remember to perform intended activities in the future (Kliegel & Jäger, 2007), is crucial for success in everyday life. PM seems to improve gradually over the childhood years (Zimmermann & Meier, 2006), yet little is known about PM competences in young school children in general, and even less is known about the factors influencing its development. Currently, a number of studies suggest that executive functions (EF) are potentially influencing processes (Ford, Driscoll, Shum & Macaulay, 2012; Mahy & Moses, 2011). Additionally, metacognitive processes (MC: monitoring and control) are assumed to be involved in optimizing one's performance (Krebs & Roebers, 2010; 2012; Roebers, Schmid, & Roderer, 2009). Yet, the relations between PM, EF and MC remain relatively unspecified. We intend to empirically examine the structural relations between these constructs. Method A cross-sectional study including 119 2nd graders (mean age = 95.03 months, SD = 4.82) will be presented. Participants (n = 68 girls) completed three EF tasks (Stroop, updating, shifting), a computerised event-based PM task and a MC spelling task. The latent variables PM, EF and MC, each represented by manifest variables derived from these tasks, were interrelated using structural equation modelling. Results Analyses revealed clear associations between the three cognitive constructs PM, EF and MC (r PM-EF = .45, r PM-MC = .23, r EF-MC = .20). A three-factor model, as opposed to one- or two-factor models, provided an excellent fit to the data (χ²(17, N = 119) = 18.86, p = .34, RMSEA = .030, CFI = .990, TLI = .978). Discussion The results indicate that already in young elementary school children, PM, EF and MC are empirically well distinguishable, but nevertheless substantially interrelated. PM and EF seem to share a substantial amount of variance, while for MC, more unique processes may be assumed.
Devices in heart failure: potential methods for device-based monitoring of congestive heart failure.
Abstract:
Congestive heart failure has long been one of the most serious medical conditions in the United States; in fact, in the United States alone, heart failure accounts for 6.5 million days of hospitalization each year. One important goal of heart-failure therapy is to inhibit the progression of congestive heart failure through pharmacologic and device-based therapies. Therefore, there have been efforts to develop device-based therapies aimed at improving cardiac reserve and optimizing pump function to meet metabolic requirements. The course of congestive heart failure is often worsened by other conditions, including new-onset arrhythmias, ischemia and infarction, valvulopathy, decompensation, end-organ damage, and therapeutic refractoriness, that have an impact on outcomes. The onset of such conditions is sometimes heralded by subtle pathophysiologic changes, and the timely identification of these changes may promote the use of preventive measures. Consequently, device-based methods could in the future have an important role in the timely identification of the subtle pathophysiologic changes associated with congestive heart failure.
Abstract:
INTRODUCTION: Voluntary muscle activity, including swallowing, decreases during the night. The association between nocturnal awakenings and swallowing activity is under-researched, with limited information on the frequency of swallows during awake and asleep periods. AIM: The aim of this study was to assess nocturnal swallowing activity and identify a cut-off predicting awake and asleep periods. METHODS: Patients undergoing impedance-pH monitoring as part of GERD work-up were asked to wear a wrist activity detecting device (Actigraph®) at night. Swallowing activity was quantified by analysing impedance changes in the proximal esophagus. Awake and asleep periods were determined using a validated scoring system (Sadeh algorithm). Receiver operating characteristic (ROC) analyses were performed to determine the sensitivity, specificity and accuracy of swallowing frequency in identifying awake and asleep periods. RESULTS: Data from 76 patients (28 male, 48 female; mean age 56 ± 15 years) were included in the analysis. The ROC analysis found that 0.33 sw/min (i.e. one swallow every 3 min) had the optimal sensitivity (78 %) and specificity (76 %) to differentiate awake from asleep periods. A swallowing frequency of 0.25 sw/min (i.e. one swallow every 4 min) was 93 % sensitive and 57 % specific in identifying awake periods. A swallowing frequency of 1 sw/min was 20 % sensitive but 96 % specific in identifying awake periods. CONCLUSIONS: Impedance-pH monitoring detects differences in swallowing activity during awake and asleep periods. Swallowing frequency noticed during ambulatory impedance-pH monitoring can predict the state of consciousness during nocturnal periods.
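As a brief illustration of how such a cut-off translates into sensitivity and specificity, the following Python sketch tallies them for labelled nocturnal periods; the 0.33 sw/min cut-off is taken from the abstract, while the per-period rates and awake/asleep labels are hypothetical, not the actigraphy-scored study data.

```python
def sensitivity_specificity(periods, cutoff_sw_per_min=0.33):
    """periods: iterable of (swallowing_rate_per_min, is_awake) pairs.
    A period is called 'awake' when its swallowing rate reaches the cut-off."""
    tp = fn = tn = fp = 0
    for rate, is_awake in periods:
        called_awake = rate >= cutoff_sw_per_min
        if is_awake:
            tp += called_awake
            fn += not called_awake
        else:
            fp += called_awake
            tn += not called_awake
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labelled periods.
periods = [(0.50, True), (0.40, True), (0.20, True),
           (0.10, False), (0.05, False), (0.35, False)]
sens, spec = sensitivity_specificity(periods)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```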
Abstract:
The brain is a complex neural network with a hierarchical organization, and mapping its elements and connections is an important step towards understanding its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo at a large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed to improve the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while in the second paper the validity of further metrics based on the concept of communicability was evaluated. Communicability is a broader measure of connectivity which also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected when considering local network metrics or when using different thresholding strategies. In addition, communicability metrics were found to add some insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to investigate the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) as well as diffusion-weighted imaging data. The multimodal approach that was applied revealed a positive relationship between individual fluctuations of the EEG alpha-frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first two studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies alternative approaches were presented. The common discussion of the four studies enabled us to highlight the benefits and possibilities of connectome analysis as well as some current limitations.
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ monitoring with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ monitoring compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.
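The published model is far richer, but a toy Monte Carlo sketch can illustrate the underlying reasoning that monitoring interval and per-visit detection probability jointly determine time spent on failing ART; every rate and probability below is a hypothetical placeholder, not a parameter from the South African cohorts.

```python
import random

def mean_months_on_failing_art(visit_interval_months, detect_prob,
                               failure_rate_per_month=0.01,
                               horizon_months=60, n_children=50_000, seed=1):
    """Average months between virological failure and its detection (or the end
    of follow-up) among children who fail within the 5-year horizon."""
    random.seed(seed)
    total_months, n_failed = 0.0, 0
    for _ in range(n_children):
        t_fail = random.expovariate(failure_rate_per_month)
        if t_fail >= horizon_months:
            continue                           # no virological failure within 5 years
        n_failed += 1
        t = (int(t_fail // visit_interval_months) + 1) * visit_interval_months
        while t <= horizon_months:             # failure can only be seen at a visit
            if random.random() < detect_prob:
                break
            t += visit_interval_months
        total_months += min(t, horizon_months) - t_fail
    return total_months / n_failed

print("12-monthly, sensitive test  :", mean_months_on_failing_art(12, 0.95))
print("6-monthly, insensitive test :", mean_months_on_failing_art(6, 0.30))
```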
Abstract:
The function of the esophagus is to transport nutrients from the oropharyngeal cavity to the stomach. This is achieved by coordinated contraction and relaxation of the tubular esophagus and the upper and lower esophageal sphincters. Multichannel intraluminal impedance monitoring offers quantification of esophageal bolus transit and/or retention without the use of ionizing radiation. Combined with conventional or high-resolution manometry, impedance measurements complement the quantification of esophageal body contraction and sphincter relaxation, offering a more comprehensive evaluation of esophageal function. Further studies evaluating the utility of quantifying bolus transit will help clarify the role and position of impedance measurements.
Abstract:
BACKGROUND Oesophageal clearance has scarcely been studied. AIMS Oesophageal clearance in endoscopy-negative heartburn was assessed to detect differences in bolus clearance time among patients sub-grouped according to impedance-pH findings. METHODS In 118 consecutive endoscopy-negative heartburn patients, impedance-pH monitoring was performed off therapy. Acid exposure time, number of refluxes, baseline impedance, post-reflux swallow-induced peristaltic wave index and both automated and manual bolus clearance time were calculated. Patients were sub-grouped into pH/impedance positive (abnormal acid exposure and/or number of refluxes) and pH/impedance negative (normal acid exposure and number of refluxes), the former further subdivided on the basis of abnormal/normal acid exposure time (pH+/-) and abnormal/normal number of refluxes (impedance+/-). RESULTS Poor correlation (r = 0.35) between automated and manual bolus clearance time was found. Manual bolus clearance time progressively decreased from pH+/impedance+ (42.6 s), pH+/impedance- (27.1 s) and pH-/impedance+ (17.8 s) to pH-/impedance- (10.8 s). There was an inverse correlation between manual bolus clearance time and both baseline impedance and the post-reflux swallow-induced peristaltic wave index, and a direct correlation between manual bolus clearance time and acid exposure time. A manual bolus clearance time of 14.8 s had an accuracy of 93% in differentiating pH/impedance positive from pH/impedance negative patients. CONCLUSIONS When manually measured, bolus clearance time reflects reflux severity, confirming the pathophysiological relevance of oesophageal clearance in reflux disease.
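A short Python sketch shows how the 14.8 s cut-off would be applied and its accuracy tallied; the clearance times and group labels below are hypothetical examples, not the study's patients.

```python
def classify(bct_seconds, cutoff=14.8):
    """Call a patient pH/impedance positive if the manually measured
    bolus clearance time exceeds the cut-off."""
    return "positive" if bct_seconds > cutoff else "negative"

# Hypothetical (manual bolus clearance time in s, true pH/impedance group) pairs.
patients = [(42.6, "positive"), (27.1, "positive"), (17.8, "positive"),
            (10.8, "negative"), (13.5, "negative")]
correct = sum(classify(bct) == truth for bct, truth in patients)
print(f"accuracy: {correct}/{len(patients)}")
```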
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historical data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system simultaneously evaluating multiple sources of data on livestock health.
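For readers unfamiliar with the approach, the sketch below shows a stripped-down quasi-Poisson exceedance check in the spirit of the improved Farrington algorithm: an intercept-only model is fitted to historical counts with the dispersion estimated from Pearson's X², and the current count is compared to the upper 0.975 prediction quantile. The trend and seasonality terms, down-weighting of past outbreaks and the 2/3-power prediction interval of the full algorithm are omitted, and the monthly condemnation counts are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def exceeds_threshold(history, current, quantile=0.975):
    """Fit an intercept-only quasi-Poisson model to historical counts and test
    whether the current count exceeds the upper prediction quantile."""
    y = np.asarray(history, dtype=float)
    X = np.ones((len(y), 1))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(scale="X2")  # Pearson X2 dispersion
    mu = float(fit.predict([[1.0]])[0])
    # Normal approximation to the prediction interval, inflated by the dispersion.
    threshold = mu + norm.ppf(quantile) * np.sqrt(fit.scale * mu)
    return current > threshold, threshold

history = [14, 9, 12, 18, 11, 15, 13, 10, 16, 12, 14, 11]   # hypothetical baseline counts
flag, thr = exceeds_threshold(history, current=27)
print(f"threshold = {thr:.1f}, outbreak flagged: {flag}")
```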
Abstract:
The Houston region is home to arguably the largest petrochemical and refining complex anywhere. The effluent of this complex includes many potentially hazardous compounds. Study of some of these compounds has led to the recognition that a number of known and probable carcinogens are present at elevated levels in ambient air. Two of these, benzene and 1,3-butadiene, have been found in concentrations that may pose a health risk for residents of Houston. Recent popular journalism and publications by local research institutions have increased public interest in Houston's air quality. Much of the literature has been critical of local regulatory agencies' oversight of industrial pollution. A number of citizens in the region have begun to volunteer with air quality advocacy groups in the testing of community air. Inexpensive methods exist for monitoring ozone, particulate matter and airborne toxic ambient concentrations. This study is an evaluation of a technique that has been successfully applied to airborne toxics. This technique, solid phase microextraction (SPME), has been used to measure airborne volatile organic hydrocarbons at community-level concentrations. It has yielded accurate and rapid concentration estimates at a relatively low cost per sample. Examples of its application to the measurement of airborne benzene exist in the literature; none have been found for airborne 1,3-butadiene. These compounds were selected to evaluate SPME as a community-deployed technique, to replicate its previous application to benzene, to extend its application to 1,3-butadiene, and because of the salience of these compounds in this community. This study demonstrates that SPME is a useful technique for quantification of 1,3-butadiene at concentrations observed in Houston. Laboratory background levels precluded recommendation of the technique for benzene. One type of SPME fiber, 85 μm Carboxen/PDMS, was found to be a sensitive sampling device for 1,3-butadiene under temperature and humidity conditions common in Houston. This study indicates that these variables affect instrument response, which suggests the necessity of calibration under specific conditions of temperature and humidity. While deployment of this technique was less expensive than other methods of quantifying 1,3-butadiene, the complexity of calibration may exclude an SPME method from broad deployment by community groups.
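The calibration point can be made concrete with a small least-squares sketch relating instrument response to concentration while including temperature and relative humidity as covariates; the standards, peak areas and the linear model form are hypothetical illustrations rather than the fiber calibration actually performed.

```python
import numpy as np

# Hypothetical calibration standards: concentration (ppbv), temperature (C), RH (%), peak area.
standards = np.array([
    [1.0, 25, 40, 1050], [2.0, 25, 40, 2080], [4.0, 25, 40, 4200],
    [1.0, 35, 70,  880], [2.0, 35, 70, 1790], [4.0, 35, 70, 3550],
    [2.0, 25, 70, 1960], [2.0, 35, 40, 1900],
])
conc, temp, rh, area = standards.T

# Least-squares fit of: area ~ b0 + b1*conc + b2*temp + b3*rh
X = np.column_stack([np.ones_like(conc), conc, temp, rh])
coef, *_ = np.linalg.lstsq(X, area, rcond=None)

def estimate_concentration(peak_area, temperature_c, rh_percent):
    """Invert the calibration for a field sample collected at known T and RH."""
    b0, b1, b2, b3 = coef
    return (peak_area - b0 - b2 * temperature_c - b3 * rh_percent) / b1

print(estimate_concentration(peak_area=2500, temperature_c=30, rh_percent=55))
```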
Abstract:
The purpose of this multiple case study was to determine how hospital subsystems (such as physician monitoring and credentialing; quality assurance; risk management; and peer review) were supporting the monitoring of physicians. Three large metropolitan hospitals in Texas were studied and designated as hospitals #1, #2, and #3. Recognizing that a hospital subsystem is both a unique entity and part of a larger system, conclusions were drawn on the premises of a quality control system, in relation to the tools of government (particularly the Health Care Quality Improvement Act (HCQIA)), and in relation to the subsystem itself as a tool of a hospital. Three major analytical assessments were performed. First, the subsystems were analyzed for "completeness"; secondly, for "performance"; and thirdly, with reference to the interaction of completeness and performance. The physician credentialing and monitoring and the peer review subsystems, as quality control systems, were most complete, efficient, and effective in hospitals #1 and #3. The HCQIA did not seem to be an influencing factor in the completeness of the subsystem in hospital #1. The quality assurance and risk management subsystem in hospital #2 was not representative of completeness and performance, and the HCQIA was not an influencing factor in the completeness of the Q.A. or R.M. systems in any hospital. The efficiency (computerization) of the physician credentialing, quality assurance and peer review subsystems in hospitals #1 and #3 seemed to contribute to their effectiveness (system-wide effect). The results indicated that the more complete, effective, and efficient subsystems were characterized by (1) all defined activities being met, (2) the HCQIA being an influencing factor, (3) a decentralized administrative structure, (4) computerization being an important element, and (5) staff being sophisticated in subsystem operations. However, other variables were identified that deserve further research as to their effect on the completeness and performance of subsystems. These include (1) medical staff affiliations, (2) system funding levels, (3) the system's administrative structure, and (4) the "cultural" characteristics of the physician staff. Perhaps by understanding other influencing factors, health care administrators may plan subsystems that will be compatible with legislative requirements and administrative objectives.
Abstract:
Contraction of cardiac muscle is regulated through the Ca²⁺-dependent protein-protein interactions of the troponin complex (Tn). The critical role cardiac troponin C (cTnC) plays as the Ca²⁺ receptor in this complex makes it an attractive target for positive inotropic compounds. In this study, the ten Met methyl groups in cTnC, [98% ¹³Cε]-Met cTnC, are used as structural markers to monitor conformational changes in cTnC and to identify sites of interaction between cTnC and cardiac troponin I (cTnI) responsible for the Ca²⁺-dependent interactions. In addition, the structural consequences that a number of Ca²⁺-sensitizing compounds have on free cTnC and the cTnC·cTnI complex were characterized. Heteronuclear NMR experiments monitoring chemical shift changes in the ten Met methyl ¹H-¹³C correlations of 3Ca²⁺ cTnC when bound to cTnI revealed an anti-parallel arrangement of the two proteins, such that the N-domain of cTnI interacts with the C-domain of cTnC. The large chemical shift changes in Mets-81, -120, and -157 identified points of contact between the proteins that include the C-domain hydrophobic surface of cTnC and the A, B, and D helical interface located in the regulatory N-domain of cTnC. TnI association [cTnI(33–80), cTnI(86–211), or cTnI(33–211)] was also found to dramatically reduce flexibility in the D/E central linker of cTnC, as monitored by line broadening in the Met ¹H-¹³C correlations of cTnC induced by a nitroxide spin label, MTSSL, covalently attached to cTnC at Cys 84. TnI association resulted in an extended cTnC that is unlike the compact structure observed for free cTnC. The Met ¹H-¹³C correlations also allowed the binding characteristics of bepridil, TFP, levosimendan, and EMD 57033 to the apo, 2Ca²⁺, and Ca²⁺-saturated forms of cTnC to be determined. In addition, the location of drug binding on the 3Ca²⁺ cTnC·cTnI complex was identified for bepridil and TFP. Use of a novel spin-labeled phenothiazine, and detection of isotope-filtered NOEs, allowed identification of drug binding sites in the shallow hydrophobic cup in the C-terminal domain and on two hydrophobic surfaces of the N-regulatory domain in free 3Ca²⁺ cTnC. In contrast, only one N-domain drug binding site exists in the 3Ca²⁺ cTnC·cTnI complex. The methyl groups of Met 45, 60 and 80, which are grouped in a hydrophobic patch near site II in cTnC, showed the greatest change upon titration with bepridil or TFP, suggesting that this is a critical site of drug binding both in free cTnC and when associated with cTnI. The strongest NOEs were seen for Met-60 and -80, which are located on helices C and D, respectively, of Ca²⁺ binding site II. These results support the conclusion that the small hydrophobic patch which includes Met-45, -60, and -80 constitutes a drug binding site, and that binding drugs at this site will lead to an increase in the Ca²⁺ binding affinity of site II while preserving maximal cTnC activity. Thus, this subregion of cTnC makes a likely target against which to design new and selective Ca²⁺-sensitizing compounds.
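As a worked illustration of how the Met ¹H-¹³C chemical shift changes can be reduced to a single per-residue perturbation for mapping contact surfaces, the sketch below uses the common weighted-Euclidean combination with a 0.25 scaling of the ¹³C axis; the scaling choice and all shift values are assumptions for illustration, not the published data.

```python
import math

def combined_shift_perturbation(d1h_ppm, d13c_ppm, c_weight=0.25):
    """Weighted Euclidean combination of 1H and 13C chemical shift changes (ppm)."""
    return math.sqrt(d1h_ppm ** 2 + (c_weight * d13c_ppm) ** 2)

# Hypothetical (delta 1H, delta 13C) changes for a few Met methyls on complex formation.
met_shifts = {"Met-45": (0.02, 0.10), "Met-81": (0.12, 0.60), "Met-120": (0.10, 0.45)}
for residue, (dh, dc) in met_shifts.items():
    print(residue, round(combined_shift_perturbation(dh, dc), 3))
```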