927 results for finite difference time-domain analysis


Relevance: 100.00%

Abstract:

Magnetic resonance imaging (MRI) is today contraindicated for patients bearing active implantable medical devices (AIMDs). The great advantages of this diagnostic modality, together with the increasing number of people benefiting from implantable devices, in particular pacemakers (PMs) and implantable cardioverter-defibrillators (ICDs), are prompting the scientific community to study the possibility of extending MRI also to implanted patients. The MRI-induced specific absorption rate (SAR) and the consequent heating of biological tissues are among the major concerns that make patients bearing metallic structures contraindicated for MRI scans. To date, both in-vivo and in-vitro studies have demonstrated the potentially dangerous temperature increase caused by the radiofrequency (RF) field generated during MRI procedures in the tissues surrounding thin metallic implants. On the other hand, the technical evolution of MRI scanners and of AIMDs, together with published data on the lack of adverse events, has reopened interest in this field and suggests that, under given conditions, MRI can be safely performed also in implanted patients. With a better understanding of the hazards of performing MRI scans on implanted patients, as well as the development of MRI-safe devices, we may soon enter an era in which this imaging modality can be more widely used to assist in the appropriate diagnosis of patients with devices. In this study both experimental measurements and numerical analyses were performed. The aim of the study was to systematically investigate the effects of the MRI RF field on implantable devices and to identify the elements that play a major role in the induced heating. Furthermore, we aimed at developing a realistic numerical model able to simulate the interactions between an RF coil for MRI and biological tissues implanted with a PM, and to predict the induced SAR as a function of the particular path of the PM lead.
The methods developed and validated during the PhD program led to the design of an experimental framework for the accurate measurement of PM lead heating induced by MRI systems. In addition, numerical models based on Finite-Difference Time-Domain (FDTD) simulations were validated to obtain a general tool for investigating the large number of parameters and factors involved in this complex phenomenon. The results demonstrated that MRI-induced heating of metallic implants is a real risk that represents a contraindication to extending MRI scans to patients bearing a PM, an ICD, or other thin metallic objects. On the other hand, both the experimental data and the numerical results show that, under particular conditions, MRI procedures might be considered reasonably safe also for an implanted patient. The complexity and the large number of variables involved make it difficult to define a unique set of such conditions: when the benefits of an MRI investigation cannot be obtained using other imaging techniques, the possibility of performing the scan should not be immediately excluded, but some considerations are always needed.
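The FDTD method named in this abstract can be illustrated with a minimal one-dimensional sketch. This is a generic textbook example in normalized units (unit Courant number, hypothetical grid size and source position), not the thesis's actual coil-and-tissue model:

```python
import numpy as np

def fdtd_1d(nx=200, nt=400, src=50):
    """Minimal 1-D vacuum FDTD (Yee) sketch in normalized units.

    Unit Courant number; PEC boundaries at both ends; a Gaussian
    soft source is injected at cell `src`.
    """
    ez = np.zeros(nx)        # electric field at integer grid points
    hy = np.zeros(nx - 1)    # magnetic field at half-grid points
    for t in range(nt):
        hy += np.diff(ez)                            # H update from curl E
        ez[1:-1] += np.diff(hy)                      # E update from curl H
        ez[src] += np.exp(-((t - 30) / 10.0) ** 2)   # Gaussian pulse source
    return ez
```

A realistic SAR model would add material parameters, absorbing boundaries and a 3-D grid; the leapfrog update structure, however, is the same.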

Relevance: 100.00%

Abstract:

This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of aspects that have been analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are first discussed. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. Limitations of the standard approaches for large events emerge in this chapter; the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation of the observations are proposed. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damaged area are proposed and tested. The first is a threshold-based method that uses traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
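The abstract does not specify its evolutionary magnitude estimator. Purely as an illustration of the idea (a magnitude estimate that is updated as the P-wave window expands), here is a running-peak-displacement sketch; the scaling coefficients `a` and `b` are placeholders, not calibrated values from this work:

```python
import numpy as np

def evolving_magnitude(displacement, a=1.2, b=5.9):
    """Magnitude estimate updated as the P-wave window expands.

    Inverts a generic peak-displacement scaling log10(Pd) = a*M - b;
    the coefficients a and b here are placeholders, not calibrated
    values. Returns one estimate per sample of the expanding window.
    """
    pd = np.maximum.accumulate(np.abs(np.asarray(displacement, float)))
    with np.errstate(divide="ignore"):
        return (np.log10(pd) + b) / a
```

Because the running peak can only grow, the estimate is non-decreasing, which captures the "evolutionary" character: early values are lower bounds that saturate only once the rupture is fully expressed in the signal.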

Relevance: 100.00%

Abstract:

Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has dramatically increased, proving how appropriate those systems are for large-scale and efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, careful processing, inversion, post-processing, data integration and data calibration constitute the proper approach for providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of integrating several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further attempt to use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can be further used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction. 
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, compared to having only a ground-based TEM dataset and/or only borehole data.
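The abstract does not detail its inversion scheme. As a generic sketch of the smoothness-constrained least-squares inversion commonly applied to AEM data (the forward operator `G`, data `d`, and weight `alpha` are hypothetical placeholders, not this study's setup):

```python
import numpy as np

def tikhonov_invert(G, d, alpha=1.0):
    """Smoothness-constrained least squares:

        minimize ||G m - d||^2 + alpha^2 ||L m||^2

    where L is a first-difference roughness operator penalizing
    jumps between adjacent model cells (e.g. layer resistivities).
    """
    n = G.shape[1]
    L = np.diff(np.eye(n), axis=0)              # (n-1, n) roughness operator
    A = np.vstack([G, alpha * L])               # stack data and smoothness rows
    b = np.concatenate([d, np.zeros(n - 1)])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m
```

Larger `alpha` yields smoother models; joint inversion of AEM and ground data amounts to stacking additional data rows into `A` and `b` with appropriate weights.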

Relevance: 100.00%

Abstract:

The aim of this work is to assess the repeatability of spectral-domain OCT (SD-OCT) retinal nerve fiber layer (RNFL) thickness measurements in a non-glaucoma group and in patients with glaucoma, and to compare these results to conventional time-domain OCT (TD-OCT).

Relevance: 100.00%

Abstract:

The original cefepime product was withdrawn from the Swiss market in January 2007 and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa toward carbapenems, ceftazidime, and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.
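The segmented-regression core of an interrupted time series analysis like the one above can be sketched with ordinary least squares. This is a simplified single-series version that ignores seasonality and autocorrelation, which the study's generalized model presumably handled:

```python
import numpy as np

def its_fit(y, break_idx):
    """Segmented (interrupted time series) OLS fit for one series.

    Model: y = b0 + b1*t + b2*step + b3*(t - break_idx)*step,
    where step = 1 from the interruption onward, b2 is the level
    change and b3 the slope change at the interruption.
    """
    t = np.arange(len(y), dtype=float)
    step = (t >= break_idx).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - break_idx) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, baseline slope, level change, slope change]
```

Applied to monthly antibiotic consumption, `b2` would quantify the immediate jump at the shortage and `b3` any change in trend, which is exactly the kind of effect the study reports for piperacillin-tazobactam and meropenem.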

Relevance: 100.00%

Abstract:

The purpose of this study was to compare inter-observer agreement of Stratus™ OCT versus Spectralis™ OCT image grading in patients with neovascular age-related macular degeneration (AMD). Thirty eyes with neovascular AMD were examined with Stratus™ OCT and Spectralis™ OCT. Four different scan protocols were used for imaging. Three observers graded the images for the presence of various pathologies. Inter-observer agreement between OCT models was assessed by calculating intra-class correlation coefficients (ICC). In Stratus™ OCT the highest inter-observer agreement was found for subretinal fluid (ICC: 0.79), and in Spectralis™ OCT for intraretinal cysts (IRC) (ICC: 0.93). Spectralis™ OCT showed superior inter-observer agreement for IRC and epiretinal membranes (ERM) (ICC(Stratus™): 0.61 for IRC, 0.56 for ERM; ICC(Spectralis™): 0.93 for IRC, 0.84 for ERM). The increased image resolution of Spectralis™ OCT improved inter-observer agreement for grading intraretinal cysts and epiretinal membranes, but not for other retinal changes.
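An intra-class correlation coefficient can be computed from two-way ANOVA mean squares. The sketch below implements ICC(3,1) (consistency, single rater) as one common variant; the abstract does not state which ICC form the study actually used:

```python
import numpy as np

def icc_consistency(ratings):
    """ICC(3,1): two-way model, consistency, single rater.

    `ratings` is an (n_subjects, k_raters) array. Computed from
    ANOVA mean squares as (MSR - MSE) / (MSR + (k-1)*MSE), where
    MSR is the between-subjects and MSE the residual mean square.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)   # subjects
    ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)   # raters
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)
```

Under this consistency definition, constant offsets between raters do not reduce the ICC; only disagreements in the ordering or spacing of subjects do.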

Relevance: 100.00%

Abstract:

Objective: We compared the prognostic strength of the lymph node ratio (LNR), the number of positive lymph nodes (+LNs) and the number of collected lymph nodes (LNcoll) using a time-dependent analysis in colorectal cancer patients stratified by mismatch repair (MMR) status. Method: 580 stage III-IV patients were included. Multivariable Cox regression analysis and time-dependent receiver operating characteristic (tROC) curve analysis were performed. The area under the curve (AUC) over time was compared for the three features. Results were validated on a second cohort of 105 stage III-IV patients. Results: The AUC for the LNR was 0.71 and outperformed +LNs and LNcoll by 10-15% in both MMR-proficient and MMR-deficient cancers. LNR and +LNs were both significant (p<0.0001) in multivariable analysis, but the effect was considerably stronger for the LNR [LNR: HR=5.18 (95% CI: 3.5-7.6); +LNs: HR=1.06 (95% CI: 1.04-1.08)]. Similar results were obtained for patients with >12 LNcoll. An optimal cut-off score of LNR=0.231 was validated on the second cohort (p<0.001). Conclusion: The LNR outperforms +LNs and LNcoll even in patients with >12 LNcoll. Its clinical value is not confounded by MMR status. A cut-off score of 0.231 may best stratify patients into prognostic subgroups and could be a basis for future prospective analyses of the LNR.
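The LNR itself and the proposed 0.231 cut-off are straightforward to compute; a minimal sketch (function names are illustrative, not from the study):

```python
import numpy as np

def lymph_node_ratio(positive, collected):
    """LNR = number of positive nodes / number of collected nodes."""
    return np.asarray(positive, float) / np.asarray(collected, float)

def stratify(lnr, cutoff=0.231):
    """Dichotomize patients at the study's proposed cut-off."""
    return np.where(np.asarray(lnr) > cutoff, "high-risk", "low-risk")
```

Because the LNR normalizes nodal positivity by the extent of lymphadenectomy, two patients with the same +LN count but different LNcoll can fall on opposite sides of the cut-off, which is the intuition behind its stronger prognostic performance.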

Relevance: 100.00%

Abstract:

RATIONALE: Both psychotropic drugs and mental disorders have typical signatures in quantitative electroencephalography (EEG). Previous studies found that some psychotropic drugs had EEG effects opposite to the EEG effects of the mental disorders treated with these drugs (key-lock principle). OBJECTIVES: We performed a placebo-controlled pharmaco-EEG study on two conventional antipsychotics (chlorpromazine and haloperidol) and four atypical antipsychotics (olanzapine, perospirone, quetiapine, and risperidone) in healthy volunteers. We investigated differences between conventional and atypical drug effects and whether the drug effects were compatible with the key-lock principle. METHODS: Fourteen subjects underwent seven EEG recording sessions, one for each drug (dosage equivalent of 1 mg haloperidol). In a time-domain analysis, we quantified the EEG by identifying clusters of transiently stable EEG topographies (microstates). Frequency-domain analysis used absolute power across electrodes and the location of the center of gravity (centroid) of the spatial distribution of power in different frequency bands. RESULTS: Perospirone increased duration of a microstate class typically shortened in schizophrenics. Haloperidol increased mean microstate duration of all classes, increased alpha 1 and beta 1 power, and tended to shift the beta 1 centroid posterior. Quetiapine decreased alpha 1 power and shifted the centroid anterior in both alpha bands. Olanzapine shifted the centroid anterior in alpha 2 and beta 1. CONCLUSIONS: The increased microstate duration under perospirone and haloperidol was opposite to effects previously reported in schizophrenic patients, suggesting a key-lock mechanism. The opposite centroid changes induced by olanzapine and quetiapine compared to haloperidol might characterize the difference between conventional and atypical antipsychotics.

Relevance: 100.00%

Abstract:

A time series is a sequence of observations made over time. Examples in public health include daily ozone concentrations, weekly admissions to an emergency department or annual expenditures on health care in the United States. Time series models are used to describe the dependence of the response at each time on predictor variables including covariates and possibly previous values in the series. Time series methods are necessary to account for the correlation among repeated responses over time. This paper gives an overview of time series ideas and methods used in public health research.
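A basic example of the serial correlation these methods must account for is a first-order autoregressive, AR(1), process. A short sketch of simulating one and estimating its lag-1 autocorrelation:

```python
import numpy as np

def simulate_ar1(n, phi, sigma=1.0, seed=0):
    """Simulate an AR(1) series: y[t] = phi * y[t-1] + Gaussian noise."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(0.0, sigma)
    return y

def lag1_autocorr(y):
    """Sample lag-1 autocorrelation of a series."""
    y = y - y.mean()
    return float(np.dot(y[:-1], y[1:]) / np.dot(y, y))
```

For an AR(1) process the lag-1 autocorrelation equals `phi`; ignoring this dependence when analyzing, say, daily ozone or weekly admissions counts leads to understated standard errors.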

Relevance: 100.00%

Abstract:

We present an overview of different methods for decomposing a multichannel spontaneous electroencephalogram (EEG) into sets of temporal patterns and topographic distributions. All of the methods presented here consider the scalp electric field as the basic analysis entity in space. In time, the resolution of the methods is between milliseconds (time-domain analysis), subseconds (time- and frequency-domain analysis) and seconds (frequency-domain analysis). For any of these methods, we show that large parts of the data can be explained by a small number of topographic distributions. Physically, this implies that the brain regions that generated one of those topographies must have been active with a common phase. If several brain regions are producing EEG signals at the same time and frequency, they have a strong tendency to do this in a synchronized mode. This view is illustrated by several examples (including combined EEG and functional magnetic resonance imaging (fMRI)) and a selective review of the literature. The findings are discussed in terms of short-lasting binding between different brain regions through synchronized oscillations, which could constitute a mechanism to form transient, functional neurocognitive networks.
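One elementary frequency-domain step of the kind mentioned above (power within a frequency band) can be sketched from a raw periodogram. This is a single-channel sketch without the windowing, segment averaging, or topographic analysis used in real EEG work:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power in the band [f_lo, f_hi) Hz from a raw periodogram.

    `signal` is a 1-D array sampled at `fs` Hz.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= f_lo) & (freqs < f_hi)
    return float(psd[band].sum())
```

Extending this per-channel quantity across all electrodes, and locating the center of gravity of the resulting spatial power distribution, is the kind of analysis the multichannel methods in this overview generalize.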

Relevance: 100.00%

Abstract:

Prediction of radiated fields from transmission lines has not previously been studied from a panoptical power system perspective. The application of BPL technologies to overhead transmission lines would benefit greatly from an ability to simulate real power system environments, not limited to the transmission lines themselves. Presently, circuit-based transmission line models used by EMTP-type programs utilize Carson's formula for a waveguide parallel to an interface. This formula is not valid for calculations at high frequencies when the effects of earth return currents are considered. This thesis explains the challenges of developing such improved models, explores an approach to combining circuit-based and electromagnetics modeling to predict radiated fields from transmission lines, exposes inadequacies of simulation tools, and suggests methods of extending the validity of transmission line models into very high frequency ranges. Electromagnetics programs are commonly used to study radiated fields from transmission lines. However, the approach proposed here is also able to incorporate the components of a power system through the combined use of EMTP-type models. Carson's formulas address the series impedance of electrical conductors above and parallel to the earth. These equations have been analyzed to show their inherent assumptions and implications. Additionally, their lack of validity at higher frequencies has been demonstrated, showing the need to replace Carson's formulas for these types of studies. 
This body of work leads to several conclusions about the relatively new study of BPL. Foremost, there is a gap in modeling capabilities which has been bridged through the integration of circuit-based and electromagnetics modeling, allowing more realistic prediction of BPL performance and radiated fields. The proposed approach is limited in its scope of validity due to the formulas used by EMTP-type software. To extend the range of validity, a new set of equations must be identified and implemented in the approach. Several potential methods of implementation have been explored. Though an appropriate set of equations has not yet been identified, further research in this area will benefit from a clear depiction of the next important steps and how they can be accomplished.
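The simplified low-frequency form of Carson's self impedance with earth return can be sketched as follows. This is the standard power-engineering approximation (values in ohm/km, with a common textbook default for earth resistivity), i.e. precisely the kind of formula whose high-frequency breakdown the thesis discusses:

```python
import numpy as np

def carson_self_impedance(f, r_cond, gmr, rho=100.0):
    """Simplified low-frequency Carson self impedance with earth return.

    Ohm/km, using:
      De      = 658.368 * sqrt(rho / f)   equivalent earth-return depth [m]
      R_earth = 9.869e-4 * f              earth-return resistance [ohm/km]
      X       = 2*pi*f * 2e-4 * ln(De / gmr)
    `f` in Hz, `r_cond` conductor AC resistance [ohm/km], `gmr` the
    conductor geometric mean radius [m], `rho` earth resistivity [ohm*m].
    """
    de = 658.368 * np.sqrt(rho / f)
    r_earth = 9.869e-4 * f
    x = 2.0 * np.pi * f * 2.0e-4 * np.log(de / gmr)
    return complex(r_cond + r_earth, x)
```

The truncated-series correction terms baked into `r_earth` and `De` are accurate at power frequencies but not in the MHz range used by BPL, which is why the thesis argues for replacing these formulas in EMTP-type models.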

Relevance: 100.00%

Abstract:

The number of record-breaking events expected to occur in a strictly stationary time series depends only on the number of values in the series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or from present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time series? We find a significant decrease in variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by the time period considered? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
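The distribution-free baseline this abstract relies on is a classical result: a stationary i.i.d. series of n values yields on average H_n = 1 + 1/2 + ... + 1/n record highs, independent of the underlying distribution. It can be checked numerically with a short sketch:

```python
import numpy as np

def count_records(x):
    """Number of record-breaking highs in a sequence (first value counts)."""
    return int(np.sum(x == np.maximum.accumulate(x)))

def expected_records(n):
    """Stationary i.i.d. theory: E[#records] = H_n = 1 + 1/2 + ... + 1/n."""
    return float(np.sum(1.0 / np.arange(1, n + 1)))
```

For a century of annual values (n = 100), H_n is only about 5.2 records, which is why departures of the observed high/low record ratio from this baseline are a sensitive indicator of trends in the mean or variance.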