52 results for ARTIFICIAL NOISE
Abstract:
We describe a device made of artificial muscle for the treatment of end-stage heart failure as an alternative to current heart assist devices. The key component is a matrix of nitinol wires and aramid fibers called biometal muscle (BM). When heated electrically, it produces a motorless, smooth, and lifelike motion. The BM is connected to a carbon fiber scaffold that tightens around the heart, providing simultaneous assistance to the left and right ventricles. A pacemaker-like microprocessor drives the contraction of the BM. We tested the device in a dedicated bench model of the diseased heart. It generated a systolic pressure of 75 mm Hg and ejected a maximum of 330 ml/min, with an ejection fraction of 12%, while requiring a power supply of only 6 V at 250 mA. This could be the beginning of an era in which BMs support or replace the mechanical function of natural muscles.
Abstract:
Artificial radionuclides ((137)Cs, (90)Sr, Pu, and (241)Am) are present in soils as a result of nuclear weapon tests and accidents in nuclear facilities. Their distribution with soil depth varies according to soil characteristics, their own chemical properties, and their deposition history. For this project, we studied the atmospheric deposition of (137)Cs, (90)Sr, Pu, (241)Am, (210)Pb, and stable Pb. We compared the distribution of these elements in soil profiles from different soil types of an alpine valley (Val Piora, Switzerland) with the distribution of selected major and trace elements in the same soils. Our goals were to explain the distribution of the radioisotopes as a function of soil parameters and to identify stable elements with analogous behavior. We found that Pu and (241)Am are relatively immobile and accumulate in the topsoil. In all soils, (90)Sr is more mobile and shows some accumulation at depth in Fe-Al-rich horizons. This behavior is also observed for Cu and Zn, indicating that these elements may be used as chemical analogues for the migration of (90)Sr in the soil.
Abstract:
OBJECTIVE: Imaging during a period of minimal myocardial motion is of paramount importance for coronary MR angiography (MRA). The objective of our study was to evaluate the utility of FREEZE, a custom-built automated tool for identifying the period of minimal myocardial motion, in both a moving phantom at 1.5 T and 10 healthy adults (nine men, one woman; mean age, 24.9 years; age range, 21-32 years) at 3 T. CONCLUSION: Quantitative analysis of the moving phantom showed that dimension measurements approached those obtained in the static phantom when FREEZE was used. In vivo, vessel sharpness, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were significantly improved when coronary MRA was performed during the software-prescribed period of minimal myocardial motion (p < 0.05). Consistent with these objective findings, image quality assessments by consensus review also improved significantly with the automated prescription of the period of minimal myocardial motion. The use of FREEZE improves the image quality of coronary MRA while minimizing operator dependence and improving ease of use.
Abstract:
The noise power spectrum (NPS) is the reference metric for characterizing the noise content of computed tomography (CT) images. To evaluate the noise properties of clinical multidetector CT (MDCT) scanners, local 2D and 3D NPSs were computed for different acquisition and reconstruction parameters. A 64-row and a 128-row MDCT scanner were used. Measurements were performed on a water phantom in axial and helical acquisition modes, with an identical CT dose index for both installations. The influence of parameters such as the pitch, the reconstruction filter (soft, standard, and bone), and the reconstruction algorithm (filtered back-projection (FBP) and adaptive statistical iterative reconstruction (ASIR)) was investigated. Images were also reconstructed in the coronal plane using a reformat process, and 2D and 3D NPSs were then computed. In axial acquisition mode, the 2D axial NPS showed an important variation in magnitude along the z-direction when measured at the phantom center. In helical mode, a directional dependency with a lobular shape was observed, while the magnitude of the NPS remained constant. Strong effects of the reconstruction filter, pitch, and reconstruction algorithm on the 3D NPS were observed for both MDCTs. With ASIR, a reduction of the NPS magnitude and a shift of the NPS peak toward the low-frequency range were visible. The 2D coronal NPS obtained from the reformatted images was affected by the interpolation when compared with the 2D coronal NPS obtained from 3D measurements. The noise properties of volumes measured on last-generation MDCTs were thus studied using a local 3D NPS metric; the impact of noise non-stationarity, however, may need further investigation.
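The local 2D NPS estimate described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes an ensemble of same-size noise-only ROIs extracted from repeated water-phantom scans, and all names and values are illustrative.

```python
import numpy as np

def nps_2d(rois, pixel_mm):
    """Local 2D noise power spectrum from an ensemble of ROIs.

    rois: array of shape (n_rois, ny, nx) holding noise-only patches,
          e.g. from repeated water-phantom acquisitions.
    pixel_mm: in-plane pixel size in mm.
    """
    n_rois, ny, nx = rois.shape
    # Detrend each ROI by subtracting its own mean (removes the DC term).
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    # Ensemble average of the squared 2D DFT magnitude, scaled so that the
    # integral of the NPS over frequency equals the pixel variance.
    spectra = np.abs(np.fft.fft2(detrended)) ** 2
    nps = spectra.mean(axis=0) * pixel_mm ** 2 / (nx * ny)
    return np.fft.fftshift(nps)  # zero frequency at the array center

# Sanity check with synthetic white noise (sigma = 10): the NPS is flat and
# its integral over frequency recovers the variance (Parseval's theorem).
rng = np.random.default_rng(0)
rois = rng.normal(0.0, 10.0, size=(64, 128, 128))
nps = nps_2d(rois, pixel_mm=0.5)
variance = nps.sum() / (128 * 128 * 0.5 ** 2)  # ~100
```

For a 3D NPS the same recipe applies with a 3D DFT over volumetric patches; the z-dependence and directional effects reported above correspond to the shape and magnitude of this spectrum along different frequency axes.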
Abstract:
The generic concept of the artificial meteorite experiment STONE is to fix rock samples bearing microorganisms on the heat shield of a recoverable space capsule and to study their modification during atmospheric re-entry. The STONE-5 experiment was performed mainly to answer astrobiological questions. The rock samples mounted on the heat shield were used (i) as carriers for microorganisms and (ii) as internal controls to verify whether the physical conditions during atmospheric re-entry were comparable to those experienced by "real" meteorites. Samples of dolerite (an igneous rock), sandstone (a sedimentary rock), and gneiss impactite from Haughton Crater carrying endolithic cyanobacteria were fixed to the heat shield of the unmanned recoverable capsule FOTON-M2. Holes drilled on the back side of each rock sample were loaded with bacterial and fungal spores and with dried vegetative cryptoendoliths. The front of the gneissic sample was also soaked with cryptoendoliths.

The mineralogical differences between pre- and post-flight samples are detailed. Despite intense ablation resulting in deeply eroded samples, all rocks partly survived atmospheric re-entry. Temperatures attained during re-entry were high enough to melt the dolerite, the silica, and the gneiss impactite sample. The formation of fusion crusts in STONE-5 was a real novelty and strengthens the link with real meteorites. The exposed part of the dolerite is covered by a fusion crust consisting of silicate glass formed from the rock sample with an admixture of holder material (silica). Compositionally, the fusion crust varies from silica-rich areas (undissolved silica fibres of the holder material) to areas whose composition is "basaltic". Likewise, the fusion crust on the exposed gneiss surface was formed from gneiss with an admixture of holder material; its composition varies from silica-rich areas to areas with "gneiss" composition (the main component being potassium-rich feldspar).

The sandstone sample was retrieved intact and did not develop a fusion crust: thermal decomposition of the calcite matrix, followed by disintegration and liberation of the silicate grains, prevented the formation of a melt.

Furthermore, the non-exposed surfaces of all samples experienced strong thermal alteration. Hot gases released during ablation pervaded the empty space between sample and sample holder, leading to intense local heating. This heating below the protective sample holder led to surface melting of the dolerite and to the formation of calcium-silicate rims on quartz grains in the sandstone sample. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Spatial data analysis, mapping, and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on empirical data (measurements). A number of state-of-the-art methods can be used for this task: deterministic interpolation; geostatistical methods such as the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANNs) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. Environmental empirical data are always contaminated by noise, often of unknown nature. This is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as a realization of a spatial random process. To obtain a kriging estimate, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated when the number of measurements is insufficient, and the variogram is sensitive to outliers and extremes. ANNs are a powerful tool, but they also suffer from a number of drawbacks; ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear and robust to noise in the measurements, can deal with small empirical datasets, and has a solid mathematical background is of great importance. The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression.
SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines (SVMs), have shown good results on different machine learning tasks, and the results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of the SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR to spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000). The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and the basic equations for nonlinear modeling are given in section 2. Section 3 discusses the application of SVR to spatial data mapping on a real case study: soil pollution by the 137Cs radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
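As a concrete illustration of the kind of SVR model discussed here, the sketch below fits an epsilon-SVR with an RBF kernel to noisy synthetic spatial data. It is not the paper's 137Cs case study: the field, noise level, and hyperparameter values are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(42)
# Synthetic "measurements": a smooth 2-D spatial field plus noise,
# with a few gross outliers mimicking contaminated data.
xy = rng.uniform(0.0, 1.0, size=(300, 2))
z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1]) + rng.normal(0.0, 0.1, 300)
z[:5] += 3.0  # outliers

# The epsilon-insensitive loss is what makes SVR robust: residuals below
# epsilon are ignored, and larger ones are penalized only linearly, so a
# few outliers cannot dominate the fit as they would in a squared-error model.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=5.0)
model.fit(xy, z)

# Predict on a mapping grid; here a single location for brevity.
pred = model.predict(np.array([[0.5, 0.5]]))
```

In a real spatial-mapping workflow, `predict` would be called on a dense regular grid of coordinates to produce the map, and C, epsilon, and gamma would be tuned, e.g. by cross-validation.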
Abstract:
This paper presents a validation study on statistical nonsupervised brain tissue classification techniques in magnetic resonance (MR) images. Several image models assuming different hypotheses regarding the intensity distribution model, the spatial model and the number of classes are assessed. The methods are tested on simulated data for which the classification ground truth is known. Different noise and intensity nonuniformities are added to simulate real imaging conditions. No enhancement of the image quality is considered either before or during the classification process. This way, the accuracy of the methods and their robustness against image artifacts are tested. Classification is also performed on real data where a quantitative validation compares the methods' results with an estimated ground truth from manual segmentations by experts. Validity of the various classification methods in the labeling of the image as well as in the tissue volume is estimated with different local and global measures. Results demonstrate that methods relying on both intensity and spatial information are more robust to noise and field inhomogeneities. We also demonstrate that partial volume is not perfectly modeled, even though methods that account for mixture classes outperform methods that only consider pure Gaussian classes. Finally, we show that simulated data results can also be extended to real data.
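One of the model families such validation studies compare, a purely intensity-based Gaussian mixture with a fixed number of classes, can be sketched as follows. This is a hedged toy example on synthetic 1-D intensities, not the paper's method or data; class count and intensity values are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic voxel intensities for three "tissue" classes with additive
# Gaussian noise (values are illustrative, not real MR units).
intensities = np.concatenate([
    rng.normal(30.0, 5.0, 1000),   # e.g. CSF-like
    rng.normal(70.0, 5.0, 1000),   # grey-matter-like
    rng.normal(110.0, 5.0, 1000),  # white-matter-like
]).reshape(-1, 1)

# Unsupervised EM fit assuming pure Gaussian classes. Note there is no
# spatial model here, which is precisely why such intensity-only models
# degrade under noise and field inhomogeneities.
gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
labels = gmm.predict(intensities)
class_means = np.sort(gmm.means_.ravel())
```

Adding a spatial model, for example a Markov random field prior on the labels, is the step that, per the study's results, buys robustness to noise and inhomogeneity.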
Abstract:
The neuropathology of Alzheimer disease is characterized by senile plaques, neurofibrillary tangles, and cell death. These hallmarks develop according to the differential vulnerability of brain networks, senile plaques accumulating preferentially in associative cortical areas and neurofibrillary tangles in the entorhinal cortex and the hippocampus. We suggest that the main aetiological hypotheses, such as the beta-amyloid cascade hypothesis or its variant, the synaptic beta-amyloid hypothesis, will have to consider neural networks not just as targets of degenerative processes but also as contributors to the disease's progression and phenotype. Three domains of research are highlighted in this review. First, the cerebral reserve and the redundancy of a network's elements are related to brain vulnerability; indeed, an enriched environment appears to increase the cerebral reserve as well as the threshold of disease onset. Second, disease progression and memory performance cannot be explained by synaptic or neuronal loss alone, but also involve compensatory mechanisms, such as synaptic scaling, at the microcircuit level. Third, some phenotypes of Alzheimer disease, such as hallucinations, appear to be related to progressive dysfunction of neural networks resulting, for instance, from a decreased signal-to-noise ratio involving diminished activity of the cholinergic system. Overall, converging results from studies of biological as well as artificial neural networks lead to the conclusion that changes in neural networks contribute strongly to the progression of Alzheimer disease.
Abstract:
The aim of the present study is to determine the level of correlation between the 3-dimensional (3D) characteristics of trabecular bone microarchitecture, as evaluated using microcomputed tomography (μCT) reconstruction, and the trabecular bone score (TBS), as evaluated using 2D projection images directly derived from the 3D μCT reconstruction (TBSμCT). Moreover, we have evaluated the effects of image degradation (resolution and noise) and of the X-ray energy of the projection on these correlations. Thirty human cadaveric vertebrae were acquired on a microscanner at an isotropic resolution of 93 μm. The 3D microarchitecture parameters were obtained using MicroView (GE Healthcare, Wauwatosa, MI). The 2D projections of these 3D models were generated using the Beer-Lambert law at different X-ray energies. Degradation of image resolution was simulated (from 93 to 1488 μm). Relationships between the 3D microarchitecture parameters and TBSμCT at different resolutions were evaluated using linear regression analysis. Significant correlations were observed between TBSμCT and the 3D microarchitecture parameters, regardless of resolution. Correlations were strongly to intermediately positive for connectivity density (0.711 ≤ r² ≤ 0.752) and trabecular number (0.584 ≤ r² ≤ 0.648), and negative for trabecular space (-0.491 ≤ r ≤ -0.407), up to a pixel size of 1023 μm. In addition, the TBSμCT values were strongly correlated with each other (0.77 ≤ r² ≤ 0.96). The study results show that the correlations between TBSμCT at 93 μm and the 3D microarchitecture parameters are only weakly affected by the degradation of image resolution and the presence of noise.
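The projection step described above, generating 2D images from a 3D μCT volume via the Beer-Lambert law, can be sketched as follows. The function name, attenuation values, and toy volume are illustrative assumptions, not those of the study.

```python
import numpy as np

def project_beer_lambert(mu, voxel_mm, i0=1.0):
    """2-D projection of a 3-D attenuation map mu (in 1/mm) along axis 0,
    using the Beer-Lambert law I = I0 * exp(-integral of mu along the ray)."""
    path_integral = mu.sum(axis=0) * voxel_mm  # discrete line integral
    return i0 * np.exp(-path_integral)

# Toy binary "trabecular" volume: sparse attenuating voxels in air.
rng = np.random.default_rng(1)
volume = (rng.uniform(size=(50, 32, 32)) < 0.2) * 0.05  # mu in 1/mm
image = project_beer_lambert(volume, voxel_mm=0.093)    # 93 um voxels
```

Different X-ray energies enter through mu, since attenuation coefficients are energy dependent, and resolution degradation can be simulated by downsampling the projected image before computing the TBS.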