912 results for Frequency-Domain Analysis


Relevance:

80.00%

Publisher:

Abstract:

Simulating surface wind over complex terrain is a challenge in regional climate modelling. Therefore, this study aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme and the way WRF is nested to the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups, all sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, where the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser. The results demonstrate the important role of horizontal resolution: the step from 6 to 2 km significantly improves model performance. In summary, the combination of a 2 km grid size, the non-local PBL scheme modified to explicitly account for non-resolved orography, and analysis or spectral nudging is the superior configuration when dynamical downscaling aims at reproducing real wind fields.
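As a rough illustration of the kind of point verification behind such comparisons (this is not code from the study; the array names and the circular treatment of direction errors are assumptions), surface-wind skill can be summarised as speed bias/RMSE plus a wrapped direction error:

```python
import numpy as np

def wind_verification(ws_mod, ws_obs, wd_mod, wd_obs):
    """Speed bias and RMSE (m/s) plus mean absolute wind-direction error (deg)."""
    ws_mod, ws_obs = np.asarray(ws_mod, float), np.asarray(ws_obs, float)
    bias = np.mean(ws_mod - ws_obs)
    rmse = np.sqrt(np.mean((ws_mod - ws_obs) ** 2))
    # wrap direction differences into [-180, 180) before averaging
    dd = (np.asarray(wd_mod, float) - np.asarray(wd_obs, float) + 180.0) % 360.0 - 180.0
    return bias, rmse, np.mean(np.abs(dd))
```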

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Several parameters of heart rate variability (HRV) have been shown to predict the risk of sudden cardiac death (SCD) in cardiac patients. There is consensus that risk prediction improves when HRV is measured during specific provocations such as an orthostatic challenge. For the first time, we provide data on the reproducibility of such a test in patients with a history of acute coronary syndrome. METHODS: Sixty male patients (65 ± 8 years) with a history of acute coronary syndrome on stable medication were included. HRV was measured in the supine (5 min) and standing (5 min) positions on two occasions separated by two weeks. For risk assessment, the relevant time-domain [standard deviation of all R-R intervals (SDNN) and root mean square of successive differences between adjacent R-R intervals (RMSSD)], frequency-domain [low-frequency power (LF), high-frequency power (HF) and the LF/HF power ratio] and short-term fractal scaling (DF1) parameters were computed. Absolute reproducibility was assessed with the standard errors of the mean (SEM) and 95% limits of random variation, and relative reproducibility with the intraclass correlation coefficient (ICC). RESULTS: We found comparable SEMs and ICCs in the supine position and after the orthostatic challenge test. All ICCs were good to excellent (ICCs between 0.636 and 0.869). CONCLUSIONS: Reproducibility of HRV parameters during orthostatic challenge is good and comparable with that in the supine position.
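The parameters listed above follow standard HRV definitions. The sketch below is not code from the study; the 4 Hz resampling rate, Welch settings and LF/HF band limits are conventional choices and the function name is hypothetical. It computes SDNN, RMSSD and the LF/HF ratio from a 5-min series of R-R intervals:

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_metrics(rr_ms, fs=4.0):
    """SDNN, RMSSD (ms) and LF/HF power ratio from R-R intervals in milliseconds."""
    rr = np.asarray(rr_ms, float)
    sdnn = rr.std(ddof=1)                               # time domain
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))

    t = np.cumsum(rr) / 1000.0                          # beat times (s)
    t_even = np.arange(t[0], t[-1], 1.0 / fs)           # evenly resampled tachogram
    rr_even = interp1d(t, rr, kind="cubic")(t_even)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

    def band(lo, hi):                                   # spectral power in [lo, hi) Hz
        sel = (f >= lo) & (f < hi)
        return np.trapz(pxx[sel], f[sel])

    lf, hf = band(0.04, 0.15), band(0.15, 0.40)
    return {"SDNN": sdnn, "RMSSD": rmssd, "LF": lf, "HF": hf, "LF/HF": lf / hf}
```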

Relevance:

80.00%

Publisher:

Abstract:

AIMS Patients with ST-segment elevation myocardial infarction (STEMI) feature thrombus-rich lesions with a large necrotic core, which are usually associated with delayed arterial healing and impaired stent-related outcomes. The use of bioresorbable vascular scaffolds (Absorb) has the potential to overcome these limitations owing to restoration of the native vessel lumen and physiology in the long term. The purpose of this randomized trial was to compare the short-term arterial healing response, as a surrogate for safety and efficacy, between the Absorb and the metallic everolimus-eluting stent (EES) in patients with STEMI. METHODS AND RESULTS ABSORB-STEMI TROFI II was a multicentre, single-blind, non-inferiority, randomized controlled trial. Patients with STEMI who underwent primary percutaneous coronary intervention were randomly allocated 1:1 to treatment with the Absorb or EES. The primary endpoint was the 6-month optical frequency domain imaging healing score (HS), based on the presence of uncovered and/or malapposed stent struts and intraluminal filling defects. The main secondary endpoint was the device-oriented composite endpoint (DOCE) according to the Academic Research Consortium definition. Between 06 January 2014 and 21 September 2014, 191 patients (Absorb [n = 95] or EES [n = 96]; mean age 58.6 years; 17.8% women) were enrolled at eight centres. At 6 months, the HS was lower in the Absorb arm than in the EES arm [1.74 (2.39) vs. 2.80 (4.44); difference (90% CI) -1.06 (-1.96, -0.16); P for non-inferiority < 0.001]. The device-oriented composite endpoint was also comparably low between groups (1.1% Absorb vs. 0% EES). One case of definite subacute stent thrombosis occurred in the Absorb arm (1.1% vs. 0% EES; P = NS). CONCLUSION Stenting of culprit lesions with Absorb in the setting of STEMI resulted in nearly complete arterial healing, comparable with that of the metallic EES at 6 months. These findings provide the basis for further exploration in clinically oriented outcome trials.

Relevance:

80.00%

Publisher:

Abstract:

AIMS The Absorb bioresorbable vascular scaffold (Absorb BVS) provides clinical outcomes similar to those of a durable polymer-based everolimus-eluting metallic stent (EES) in patients with stable coronary artery disease. ST-elevation myocardial infarction (STEMI) lesions have been associated with delayed arterial healing and impaired stent-related outcomes. The purpose of the present study is to compare directly the arterial healing response, angiographic efficacy and clinical outcomes between the Absorb BVS and the metallic EES. METHODS AND RESULTS A total of 191 patients with acute STEMI were randomly allocated 1:1 to treatment with the Absorb BVS or a metallic EES. The primary endpoint is the neointimal healing (NIH) score, calculated from the presence of uncovered and malapposed stent struts, intraluminal filling defects and excessive neointimal proliferation, as detected by optical frequency domain imaging (OFDI) six months after the index procedure. The study will provide 90% power to show non-inferiority of the Absorb BVS compared with the EES. CONCLUSIONS This will be the first randomised study investigating the arterial healing response following implantation of the Absorb BVS compared with the EES. The healing response, assessed by a novel NIH score, in conjunction with results on angiographic efficacy parameters and device-oriented events, will elucidate disease-specific applications of bioresorbable scaffolds.

Relevance:

80.00%

Publisher:

Abstract:

MRSI grids frequently show spectra of poor quality, mainly because of the high sensitivity of MRS to field inhomogeneities. These poor-quality spectra are prone to quantification and/or interpretation errors that can have a significant impact on the clinical use of spectroscopic data. Therefore, quality control of the spectra should always precede their clinical use. When performed manually, quality assessment of MRSI spectra is not only a tedious and time-consuming task, but is also affected by human subjectivity. Consequently, automatic, fast and reliable methods for spectral quality assessment are of utmost interest. In this article, we present a new random forest-based method for automatic quality assessment of ¹H MRSI brain spectra, which uses a new set of MRS signal features. The random forest classifier was trained on spectra from 40 MRSI grids that were classified as acceptable or non-acceptable by two expert spectroscopists. To account for the effects of intra-rater reliability, each spectrum was rated for quality three times by each rater. The automatic method classified these spectra with an area under the curve (AUC) of 0.976. Furthermore, in the subset of spectra containing only the cases that were classified the same way every time by the spectroscopists, an AUC of 0.998 was obtained. Feature importance for the classification was also evaluated. Frequency-domain skewness and kurtosis, as well as time-domain signal-to-noise ratios (SNRs) in the ranges 50-75 ms and 75-100 ms, were the most important features. Given that the method is able to assess a whole MRSI grid faster than a spectroscopist (approximately 3 s versus approximately 3 min), and without loss of accuracy (agreement between a classifier trained with just one labelling session and any of the other labelling sessions, 89.88%; agreement between any two labelling sessions, 89.03%), the authors suggest its implementation in the clinical routine. The method presented in this article was implemented in jMRUI's SpectrIm plugin. Copyright © 2016 John Wiley & Sons, Ltd.
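A minimal sketch of this kind of classifier is shown below. It is not the authors' implementation: the exact feature definitions, the FID windowing and the forest settings are assumptions, and X/y stand for a hypothetical labelled set of spectra.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def spectrum_features(fid, dwell_ms=0.5):
    """Frequency-domain shape (skewness, kurtosis) and time-domain SNR windows for one FID."""
    spec = np.abs(np.fft.fftshift(np.fft.fft(fid)))
    t = np.arange(fid.size) * dwell_ms                      # time axis (ms)
    noise = np.std(np.abs(fid[t > 0.8 * t[-1]]))            # noise level from the FID tail
    snr_50_75 = np.abs(fid[(t >= 50) & (t < 75)]).mean() / noise
    snr_75_100 = np.abs(fid[(t >= 75) & (t < 100)]).mean() / noise
    return [skew(spec), kurtosis(spec), snr_50_75, snr_75_100]

# X: one feature row per spectrum; y: expert labels (1 = acceptable, 0 = non-acceptable)
# X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
# clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
# print(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), clf.feature_importances_)
```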

Relevance:

80.00%

Publisher:

Abstract:

Purpose. To investigate and understand the illness experiences of patients and their family members living with congestive heart failure (CHF). Design. Focused ethnographic design. Setting. One outpatient cardiology clinic, two outpatient heart failure clinics, and informants' homes in a large metropolitan city located in southeast Texas. Sample. A purposeful sampling technique was used to select a sample of 28 informants. The following somewhat overlapping sampling strategies were used to implement the purposeful method: criterion; typical case; operational construct; maximum variation; atypical case; opportunistic; and confirming and disconfirming case sampling. Methods. Naturalistic inquiry consisted of data collected from observations, participant observations, and interviews. Open-ended semi-structured illness narrative interviews included questions designed to elicit informants' explanatory models of the illness, which served as a synthesizing framework for the analysis. A thematic analysis process was conducted through domain analysis and construction of data into themes and sub-themes. Credibility was enhanced through informant verification and a process of peer debriefing. Findings. Thematic analysis revealed that patients and their family members living with CHF experience a process of disruption, incoherence, and reconciling. Reconciling emerged as the salient experience described by informants. Sub-themes of reconciling that emerged from the analysis included: struggling; participating in partnerships; finding purpose and meaning in the illness experience; and surrendering. Conclusions. Understanding the experiences described in this study allows for a better understanding of living with CHF in everyday life. Findings from this study suggest that the experience of living with CHF entails more than the medical story can tell. It is important for nurses and other providers to understand the experiences of this population in order to develop appropriate treatment plans in a successful practitioner-patient partnership.

Relevance:

80.00%

Publisher:

Abstract:

Continuous condensation particle (CP) observations were conducted from 1984 through 2009 at Neumayer Station under stringent contamination control. During this period, the CP concentration (median 258 cm⁻³) showed no significant long-term trend but exhibited a pronounced seasonality, characterized by a stepwise increase starting in September and reaching an annual maximum of around 10³ cm⁻³ in March. Minimum values below 10² cm⁻³ were observed during June/July. Dedicated time series analyses in the time and frequency domain revealed no significant correlations between inter-annual CP concentration variations and atmospheric circulation indices such as the Southern Annular Mode (SAM) or the Southern Ocean Index (SOI). Neither the Pinatubo volcanic eruption nor strong El Niño events affected CP concentrations. From thermodenuder experiments we deduced that the fraction of volatile (at 125 °C) and semi-volatile (at 250 °C) particles, both of which could be associated with biogenic sulfur aerosol, was largest during austral summer, while during winter non-volatile sea salt particles dominated. From September through April we frequently observed enhanced concentrations of ultrafine particles within the nucleation mode (between 3 nm and 7 nm particle diameter), preferentially in the afternoon.
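As an illustration of such time- and frequency-domain association tests (not the authors' code; the series names, monthly sampling and Welch segment length are assumptions), lag-zero correlation and magnitude-squared coherence between the CP record and a circulation index can be computed as follows:

```python
import numpy as np
from scipy.signal import coherence, detrend

def association(cp_monthly, index_monthly, fs=12.0):
    """Lag-0 correlation plus coherence spectrum of two monthly series (fs in cycles/yr)."""
    x = detrend(np.asarray(cp_monthly, float))
    y = detrend(np.asarray(index_monthly, float))
    r = np.corrcoef(x, y)[0, 1]
    f, cxy = coherence(x, y, fs=fs, nperseg=min(len(x), 120))   # 10-yr Welch segments
    return r, f, cxy
```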

Relevance:

80.00%

Publisher:

Abstract:

For the first time, annually resolved accumulation rates have been determined in central Antarctica by counting seasonal signals of ammonium, calcium, and sodium. All records, obtained from three intermediate-depth ice cores from Dronning Maud Land, East Antarctica, show rather constant accumulation rates throughout the last 9 centuries, with mean values of 63, 61, and 44 mm H₂O yr⁻¹ and a typical year-to-year variation of about 30%. For the last few decades, no trend was detected, given the high natural variability of all records. A significant but weak intersite correlation is apparent only between two cores, and only when the high-frequency part with periods of less than 30 years is removed. Analysis of the records in the frequency domain revealed no persistent periodicities. This suggests that the snow accumulation in this area is mainly influenced by local deposition patterns and may be additionally masked by redistribution of snow by wind. By comparing accumulation rates over the last 2 millennia, a distinct change in layer thickness was found in one of the three cores, which might be attributed either to an area upstream of the drilling site with lower accumulation rates, or to deposition processes influenced by surface undulations. The absence of a clear correlation between the accumulation rate histories at the three locations is also important for the interpretation of small, short-term variations in past precipitation records obtained from deep ice cores.
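A minimal sketch of the intersite comparison after removing the high-frequency part (periods below 30 years) might look as follows; the fourth-order Butterworth filter and the array names are assumptions, not the authors' procedure:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_correlation(core_a, core_b, cutoff_years=30.0, dt_years=1.0):
    """Correlate two annual accumulation records after suppressing periods < cutoff_years."""
    nyquist = 0.5 / dt_years
    b, a = butter(4, (1.0 / cutoff_years) / nyquist, btype="low")
    return np.corrcoef(filtfilt(b, a, core_a), filtfilt(b, a, core_b))[0, 1]
```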

Relevance:

80.00%

Publisher:

Abstract:

Ten ODP sites drilled in a depth transect (2164-4775 m water depth) during Leg 172 recovered high-deposition-rate (>20 cm/kyr) sedimentary sections from sediment drifts in the western North Atlantic. For each site an age model covering the past 0.8-0.9 Ma has been developed. The time scales have a resolution of 10-20 kyr and are derived by tuning variations of estimated carbonate content to the orbital parameters precession and obliquity. Based on the similarity in the signature of proxy records and the spectral character of the time series, the sites are divided into two groups: precession cycles are better developed in carbonate records from a group of shallow sites (2164-2975 m water depth, Sites 1055-1058), while the deeper sites (2995-4775 m water depth, Sites 1060-1063) are characterized by higher spectral density in the obliquity band. The resulting time scales show excellent coherence with other dated carbonate and isotope records from low latitudes. Besides the typical Milankovitch cyclicity, significant variance of the resulting carbonate time series is concentrated in millennial-scale changes with periods of about 12, 6, 4, 2.5, and 1.5 kyr. Comparisons of carbonate records from the Blake Bahama Outer Ridge and the Bermuda Rise reveal a remarkable similarity in the time and frequency domains, indicating a basin-wide uniform sedimentation pattern during the last 0.9 Ma.
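The spectral grouping described above can be illustrated with a simple periodogram check of power in the obliquity (~41 kyr) and precession (~21 kyr) bands; the 2-kyr sampling step and band limits below are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.signal import periodogram

def orbital_band_power(carbonate, dt_kyr=2.0):
    """Mean spectral power of a carbonate series in the obliquity and precession bands."""
    f, pxx = periodogram(carbonate - np.mean(carbonate), fs=1.0 / dt_kyr)

    def band(period_lo_kyr, period_hi_kyr):          # mean power between the two periods
        sel = (f >= 1.0 / period_hi_kyr) & (f <= 1.0 / period_lo_kyr)
        return pxx[sel].mean()

    return {"obliquity_41kyr": band(36, 46), "precession_21kyr": band(19, 24)}
```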

Relevance:

80.00%

Publisher:

Abstract:

A particle accelerator is any device that, using electromagnetic fields, is able to impart energy to charged particles (typically electrons or ionized atoms), accelerating and/or energizing them up to the level required for its purpose. The applications of particle accelerators are countless, ranging from a common TV CRT, through medical X-ray devices, to the large ion colliders used to probe the smallest details of matter. Other engineering applications include ion implantation devices used to obtain better semiconductors and materials with remarkable properties. The development of materials able to withstand irradiation in future nuclear fusion plants also benefits from particle accelerators. Many devices are required for the correct operation of a particle accelerator. The most important are the particle sources, the guiding, focusing and correcting magnets, the radiofrequency accelerating cavities, the fast deflection devices, the beam diagnostic mechanisms and the particle detectors. Historically, most fast particle deflection devices have been built using copper coils and ferrite cores, which could produce a relatively fast magnetic deflection but needed large voltages and currents to counteract the high coil inductance, with response times in the microsecond range. Beam stability considerations and the new range of energies and sizes of present-day accelerators and their rings require new devices featuring improved wakefield behaviour and faster response (in the nanosecond range). This can only be achieved by an electromagnetic deflection device based on a transmission line. The electromagnetic deflection device (strip-line kicker) produces a transverse displacement of the particle beam travelling close to the speed of light, in order to extract the particles to another experiment or to inject them into a different accelerator. The deflection is carried out by means of two short, opposite-phase pulses; the particles are diverted by the integrated Lorentz force of the electromagnetic field travelling along the kicker. This Thesis deals with a detailed calculation, manufacturing and test methodology for strip-line kicker devices. The methodology is then applied to two real cases which are fully designed, built, tested and finally installed in the CTF3 accelerator facility at CERN (Geneva). Analytical and numerical calculations, both in 2D and 3D, are detailed, starting from the basic specifications, in order to obtain a conceptual design. Time-domain and frequency-domain calculations are developed in the process using different FDM and FEM codes. Among other concepts, scattering parameters, resonant higher-order modes and wakefields are analysed. Several contributions are presented in the calculation process, dealing specifically with strip-line kicker devices fed by electromagnetic pulses. Materials and components typically used for the fabrication of these devices are analysed in the manufacturing section. Mechanical supports and electrode connections are also detailed, with some interesting contributions on these concepts. The electromagnetic and vacuum tests, required to ensure that the manufactured devices fulfil the specifications, are then analysed. Finally, and only from the analytical point of view, the strip-line kickers are studied together with a pulsed power supply based on solid-state power switches (MOSFETs). The solid-state technology applied to pulsed power supplies is introduced, and several circuit topologies are modelled and simulated to obtain fast pulses with good flat tops.
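As a back-of-the-envelope illustration of the strip-line deflection principle (a textbook estimate under simplifying assumptions: ideal parallel electrodes, ultra-relativistic beam counter-propagating to the pulse, fringe fields and the geometric coverage factor neglected; it is not a result from the Thesis):

```python
def stripline_kick_angle(v_pulse, gap_m, length_m, beam_energy_eV):
    """Small-angle transverse kick (rad) of a strip-line kicker.

    Opposite-polarity pulses of total voltage v_pulse across a gap give E = V/gap;
    for a beam counter-propagating to the TEM pulse the electric and magnetic
    kicks add, hence the factor of 2. beam_energy_eV plays the role of pc/e.
    """
    return 2.0 * v_pulse * length_m / (gap_m * beam_energy_eV)

# e.g. a hypothetical 1 m kicker with a 40 mm gap, +-6 kV pulses, 200 MeV beam:
# stripline_kick_angle(12e3, 0.04, 1.0, 200e6) -> 3e-3 rad
```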

Relevance:

80.00%

Publisher:

Abstract:

This paper shows that today's modelling of electrical noise as coming from noisy resistances is a nonsensical one, contradicting the nature of resistors as systems bearing electrical noise. We present a new model for electrical noise that, while including the work of Johnson and Nyquist, also agrees with the quantum-mechanical description of noisy systems given by Callen and Welton, in which electrical energy fluctuates and is dissipated with time. Through the two currents that the Admittance function links in the frequency domain with their common voltage, this new model shows the cause-effect connection that exists between Fluctuation and Dissipation of energy in the time domain. In spite of its radical departure from today's beliefs on electrical noise in resistors, this Complex model for electrical noise is obtained from Nyquist's result by basic concepts of Circuit Theory and Thermodynamics that also apply to capacitors and inductors.
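For reference, the standard Nyquist relations underlying this Admittance-based picture can be written as below; pairing the resistance R with a shunt capacitance C is an illustrative assumption used to make the fluctuation (C) and dissipation (R) roles explicit, not a formula quoted from the paper.

```latex
% One-sided Johnson-Nyquist density of the voltage noise of a resistance R at temperature T
S_V(f) = 4 k_B T R \qquad \left[\mathrm{V^2/Hz}\right]

% Seen through the admittance Y(j\omega) = 1/R + j\omega C of R shunted by a capacitance C,
% integrating the filtered density recovers the mean-square fluctuation stored in C:
\left\langle v^2 \right\rangle
  = \int_0^{\infty} \frac{4 k_B T R}{1 + (2\pi f R C)^2}\, \mathrm{d}f
  = \frac{k_B T}{C}
```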

Relevance:

80.00%

Publisher:

Abstract:

Application of the spectrum analyzer for illustrating several concepts associated with mobile communications is discussed. Specifically, two groups of observable features are described. First, time variation and frequency selectivity of multipath propagation can be revealed by carrying out simple measurements on commercial-network GSM and UMTS signals. Second, the main time-domain and frequency-domain features of GSM and UMTS radio signals can be observed. This constitutes a valuable tool for teaching mobile communication courses.
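The time-variation part of such a demonstration can also be reproduced offline; in the sketch below (hypothetical: iq is assumed to be a complex baseband capture of a narrowband carrier at sample rate fs), the short-term received power is computed, and its dips reveal multipath fading:

```python
import numpy as np

def power_vs_time(iq, fs, window_s=0.01):
    """Short-term received power (dB) in consecutive windows of a complex baseband capture."""
    n = int(window_s * fs)
    nwin = len(iq) // n
    p = np.abs(np.asarray(iq)[: nwin * n]).reshape(nwin, n) ** 2
    return 10.0 * np.log10(p.mean(axis=1))
```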

Relevance:

80.00%

Publisher:

Abstract:

This paper shows a physically cogent model for electrical noise in resistors that has been obtained from thermodynamical reasoning. This new model, derived from the works of Johnson and Nyquist, also agrees with the quantum model for noisy systems handled by Callen and Welton in 1951, thus unifying these two physical viewpoints. The new model is a Complex or 2-D noise model based on an Admittance that considers both Fluctuation and Dissipation of electrical energy, thereby improving on the Real or 1-D model in use, which considers only Dissipation. Through the two orthogonal currents linked with a common noise voltage by an Admittance function, the new model is presented in the frequency domain. Its use in the time domain exposes the pitfall behind a paradox of Statistical Mechanics about systems considered energy-conserving and deterministic on the microscale yet dissipative and unpredictable on the macroscale, and it also shows how to use the Fluctuation-Dissipation Theorem properly.
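A quick numerical check of the fluctuation-dissipation bookkeeping described above (the component values are arbitrary assumptions): integrating the Nyquist density shaped by the R-C admittance returns the kT/C fluctuation regardless of R, which only sets how fast that fluctuation is dissipated.

```python
import numpy as np
from scipy.integrate import quad

k_B, T, R, C = 1.380649e-23, 300.0, 1e3, 1e-12   # hypothetical resistor and shunt capacitance

# One-sided Nyquist density 4kTR, filtered by the admittance Y = 1/R + j*2*pi*f*C.
# The substitution u = 2*pi*f*R*C keeps the integrand well scaled for quad:
lorentzian, _ = quad(lambda u: 1.0 / (1.0 + u**2), 0.0, np.inf)      # = pi/2
v2 = 4 * k_B * T * R * lorentzian / (2 * np.pi * R * C)

print(v2, k_B * T / C)   # both ~4.14e-09 V^2: the kT/C fluctuation, independent of R
```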