945 results for Gabor wavelet filters
Abstract:
Short-chain fatty acids (SCFAs) have recently attracted attention as potential mediators of the effects of the gut microbiota on intestinal inflammation. Some of these effects have been suggested to occur through the direct actions of SCFAs on the GPR43 receptor in neutrophils, though the precise role of this receptor in neutrophil activation is still unclear. We show that mouse bone marrow-derived neutrophils (BMNs) can chemotax effectively through polycarbonate filters towards a source of acetate, propionate or butyrate. Moreover, we show that BMNs move with good speed and directionality towards a source of propionate in an EZ-Taxiscan chamber coated with fibrinogen. These effects of SCFAs were mimicked by low concentrations of the synthetic GPR43 agonist phenylacetamide-1 and were abolished in GPR43(-/-) BMNs. SCFAs and phenylacetamide-1 also elicited GPR43-dependent activation of PKB, p38 and ERK, and these responses were sensitive to pertussis toxin, indicating a role for Gi proteins. Phenylacetamide-1 also elicited rapid and transient activation of Rac1/2 GTPases and phosphorylation of ribosomal protein S6. Genetic and pharmacological intervention identified important roles for PI3Kγ, Rac2, p38 and ERK, but not mTOR, in GPR43-dependent chemotaxis. These results identify GPR43 as a bona fide chemotactic receptor for neutrophils in vitro and begin to define important elements of its signal transduction pathways.
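The speed and directionality measured in the EZ-Taxiscan assay can be summarized with standard track statistics. A minimal sketch of those two measures, where the cell track and gradient direction are made-up illustrative data, not values from the paper:

```python
import math

def chemotaxis_stats(track, gradient=(1.0, 0.0)):
    """Summarize a cell track given as a list of (x, y) positions.

    directionality  = net displacement / total path length (0..1)
    chemotactic idx = net displacement along the gradient / path length
    """
    path = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    net = (track[-1][0] - track[0][0], track[-1][1] - track[0][1])
    net_len = math.hypot(*net)
    # Projection of the net displacement onto the gradient direction.
    g_len = math.hypot(*gradient)
    along = (net[0] * gradient[0] + net[1] * gradient[1]) / g_len
    return net_len / path, along / path

# A cell drifting toward +x (the chemoattractant) with sideways wobble:
track = [(0, 0), (1, 0.5), (2, -0.3), (3, 0.2), (4, 0)]
directionality, ci = chemotaxis_stats(track)
```

A perfectly straight run toward the source gives both measures equal to 1; random wandering drives both toward 0.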
Abstract:
This paper studies a nonlinear, discrete-time matrix system arising in the stability analysis of Kalman filters. These systems present an internal coupling between the state components that gives rise to complex dynamic behavior. The problem of partial stability, which requires that a specific component of the state of the system converge exponentially, is studied and solved. The convergent state component is strongly linked with the behavior of Kalman filters, since it can be used to provide bounds for the error covariance matrix under uncertainties in the noise measurements. We exploit the special features of the system, mainly its connections with linear systems, to obtain an algebraic test for partial stability. Finally, motivated by applications in which polynomial divergence of the estimates is acceptable, we study and solve a partial semistability problem.
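The Kalman-filter connection can be illustrated in the simplest setting: the scalar Riccati recursion for the error variance, which converges exponentially to a fixed point regardless of the initial condition, the kind of behavior a partial-stability result captures. The model parameters below are illustrative, not taken from the paper:

```python
def riccati_step(p, a=0.9, q=1.0, r=2.0):
    """One step of the scalar discrete-time Riccati recursion
    P_{k+1} = a^2 P - a^2 P^2 / (P + r) + q
    for the model x_{k+1} = a x_k + w_k (var q), y_k = x_k + v_k (var r)."""
    return a * a * p - (a * a * p * p) / (p + r) + q

# Iterate from two very different initial variances; both converge
# exponentially to the same fixed point.
p1, p2 = 0.1, 50.0
for _ in range(200):
    p1, p2 = riccati_step(p1), riccati_step(p2)
```

After a few hundred iterations `p1` and `p2` agree to machine precision, and the common value is the fixed point of the recursion.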
Abstract:
In the Tokamak Chauffage Alfvén Brésilien [R. M. O. Galvao, Plasma Phys. Controlled Fusion 43, 1181 (2001)], high magnetohydrodynamic (MHD) activity may appear spontaneously or during discharges with a voltage-biased electrode inserted at the plasma edge. The turbulent electrostatic fluctuations, measured by Langmuir probes, are modulated by Mirnov oscillations presenting a dominant peak at a common frequency around 10 kHz. We report the occurrence of phase locking of the turbulent potential fluctuations driven by MHD activity at this frequency. Using wavelet cross-spectral analysis, we characterized the phase and frequency synchronization in the plasma edge region. We introduced an order parameter to characterize the radial dependence of the phase-locking intensity. (c) 2008 American Institute of Physics.
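An order parameter for phase locking between two signals is, in spirit, Kuramoto-like: the magnitude of the time-averaged unit phasor of their phase difference. A minimal sketch of that quantity, using synthetic phases rather than TCABR probe data (the 10 kHz carrier and the jitter level are made up for illustration):

```python
import cmath, math, random

def phase_locking_value(phase_a, phase_b):
    """Order parameter in [0, 1]: ~1 for phase-locked signals,
    ~0 when the relative phase is uniformly random."""
    phasors = [cmath.exp(1j * (pa - pb)) for pa, pb in zip(phase_a, phase_b)]
    return abs(sum(phasors) / len(phasors))

random.seed(0)
n = 5000
# MHD-driven, locked case: a 10 kHz phase ramp plus a constant lag
# and small Gaussian jitter.
mhd = [2 * math.pi * 10e3 * k * 1e-6 for k in range(n)]
locked = [p + 0.8 + random.gauss(0, 0.1) for p in mhd]
# Unlocked case: completely random phase.
unlocked = [random.uniform(0, 2 * math.pi) for _ in range(n)]

plv_locked = phase_locking_value(mhd, locked)
plv_unlocked = phase_locking_value(mhd, unlocked)
```

The locked pair yields a value near 1, the random pair a value near 0, which is what makes the quantity usable as a radial profile of phase-locking intensity.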
Abstract:
Aerosol samples were collected at a pasture site in the Amazon Basin as part of the project LBA-SMOCC-2002 (Large-Scale Biosphere-Atmosphere Experiment in Amazonia - Smoke Aerosols, Clouds, Rainfall and Climate: Aerosols from Biomass Burning Perturb Global and Regional Climate). Sampling was conducted during the late dry season, when the aerosol composition was dominated by biomass burning emissions, especially in the submicron fraction. A 13-stage Dekati low-pressure impactor (DLPI) was used to collect particles with nominal aerodynamic diameters (Dp) ranging from 0.03 to 10 µm. Gravimetric analyses of the DLPI substrates and filters were performed to obtain aerosol mass concentrations. The concentrations of total, apparent elemental, and organic carbon (TC, ECa, and OC) were determined using thermal and thermal-optical analysis (TOA) methods. A light transmission method (LTM) was used to determine the concentration of equivalent black carbon (BCe), i.e., the absorbing fraction at 880 nm, for the size-resolved samples. During the dry period, due to the pervasive presence of fires in the region upwind of the sampling site, concentrations of fine aerosols (Dp < 2.5 µm: average 59.8 µg m⁻³) were higher than those of coarse aerosols (Dp > 2.5 µm: 4.1 µg m⁻³). Carbonaceous matter, estimated as the sum of the particulate organic matter (i.e., OC × 1.8) plus BCe, comprised more than 90% of the total aerosol mass. Concentrations of ECa (estimated by thermal analysis with a correction for charring) and BCe (estimated by LTM) averaged 5.2 ± 1.3 and 3.1 ± 0.8 µg m⁻³, respectively. The determination of EC was improved by extracting water-soluble organic material from the samples, which reduced the average light absorption Ångström exponent of particles in the size range of 0.1 to 1.0 µm from >2.0 to approximately 1.2. The size-resolved BCe measured by the LTM showed a clear maximum between 0.4 and 0.6 µm in diameter.
The concentrations of OC and BCe varied diurnally during the dry period, a variation related to diurnal changes in boundary layer thickness and in fire frequency.
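The absorption Ångström exponent cited above follows from the standard two-wavelength power-law relation b_abs ∝ λ^(-α). A minimal sketch of that formula; the absorption-coefficient values below are illustrative, not measurements from the campaign:

```python
import math

def angstrom_exponent(b_abs_1, lambda_1, b_abs_2, lambda_2):
    """Absorption Angstrom exponent alpha from absorption coefficients
    at two wavelengths, assuming b_abs ~ lambda^(-alpha)."""
    return -math.log(b_abs_1 / b_abs_2) / math.log(lambda_1 / lambda_2)

# Pure black carbon absorbs roughly as lambda^-1 (alpha ~ 1); brown
# carbon / water-soluble organics push alpha well above 2, which is why
# extracting them moved the exponent from >2.0 toward ~1.2.
alpha = angstrom_exponent(20.0, 470.0, 10.2, 880.0)
```

With the illustrative pair above (20.0 at 470 nm, 10.2 at 880 nm) the exponent comes out close to 1, i.e. black-carbon-like absorption.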
Abstract:
Nitrogen-doped carbon nanotubes can provide reactive sites at porphyrin-like defects. It is well known that many porphyrins host transition-metal atoms, and we have explored transition-metal atoms bonded to these porphyrin-like defects in N-doped carbon nanotubes. The electronic structure and transport are analyzed by means of a combination of density functional theory and recursive Green's function methods. The results show that the heme B-like defect (an iron atom bonded to four nitrogens) is the most stable and yields the highest current polarization for a single defect. With randomly positioned heme B defects in nanotubes a few hundred nanometers long, the polarization reaches nearly 100%, meaning these tubes act as effective spin filters. A disorder-induced magnetoresistance effect is also observed in these long nanotubes, with values as high as 20,000% calculated with nonmagnetic electrodes.
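The polarization and magnetoresistance figures quoted above follow from the usual definitions in terms of spin-resolved conductances and the two resistance states. A minimal sketch of those definitions; the conductance and resistance values are illustrative, not DFT results:

```python
def spin_polarization(g_up, g_down):
    """Current spin polarization from spin-resolved conductances."""
    return (g_up - g_down) / (g_up + g_down)

def magnetoresistance(r_high, r_low):
    """MR ratio in percent: (R_high - R_low) / R_low * 100."""
    return 100.0 * (r_high - r_low) / r_low

# Near-perfect spin filtering: the minority channel is almost blocked,
# so the polarization approaches 1 (i.e., ~100%).
p = spin_polarization(1.0, 0.001)
# A 201:1 resistance contrast corresponds to an MR of 20,000%.
mr = magnetoresistance(201.0, 1.0)
```

This makes the headline numbers concrete: a 20,000% MR just means the high-resistance state is 201 times the low-resistance one.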
Abstract:
The Brazilian Amazon is one of the most rapidly developing agricultural frontiers in the world. The authors assess changes in cropland area and the intensification of cropping in the Brazilian agricultural frontier state of Mato Grosso using remote sensing, and develop a greenhouse gas emissions budget. The most common type of intensification in this region is a shift from single- to double-cropping patterns and associated changes in management, including increased fertilization. Using the enhanced vegetation index (EVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor, the authors created a green-leaf phenology for 2001-06 that was temporally smoothed with a wavelet filter. The wavelet-smoothed green-leaf phenology was analyzed to detect cropland areas and their cropping patterns. The authors document cropland extensification and double-cropping intensification, validated with field data, with 85% accuracy for detecting croplands and 64% and 89% accuracy for detecting single- and double-cropping patterns, respectively. The results show that croplands more than doubled from 2001 to 2006, to cover about 100,000 km², and that new double-cropping intensification occurred on over 20% of croplands. Variations are seen in the annual rates of extensification and double-cropping intensification. Greenhouse gas emissions for the period 2001-06 due to the conversion of natural vegetation and pastures to row-crop agriculture in Mato Grosso are estimated to have averaged 179 Tg CO₂-e yr⁻¹, over half the typical fossil fuel emissions for the country in recent years.
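Once the EVI phenology has been smoothed, distinguishing single- from double-cropping essentially reduces to counting green-up peaks per crop year. A toy peak-counting sketch on synthetic phenologies (the EVI curves, the 0.5 threshold, and the 46-composite year are assumptions for illustration; the actual classifier also relies on the wavelet smoothing and field-calibrated rules):

```python
import math

def count_crop_peaks(evi, threshold=0.5):
    """Count local maxima above a green-leaf threshold in a smoothed
    EVI series: 1 peak ~ single cropping, 2 peaks ~ double cropping."""
    peaks = 0
    for prev, cur, nxt in zip(evi, evi[1:], evi[2:]):
        if cur > threshold and cur >= prev and cur > nxt:
            peaks += 1
    return peaks

# Synthetic year of 46 8-day composites with two vs. one green-up cycle:
t = [i / 46 for i in range(46)]
double = [0.25 + 0.45 * max(0.0, math.sin(2 * math.pi * 2 * x)) for x in t]
single = [0.25 + 0.45 * max(0.0, math.sin(2 * math.pi * 1 * x)) for x in t]
```

On these synthetic curves the counter returns 2 for the double-cropped pixel and 1 for the single-cropped one.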
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. Then, we provide the average cross-validation accuracy values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their sensitivity to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of wavelet family appears less relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
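One of the feature families above, statistics computed over discrete-wavelet-transform subbands, can be sketched with a single-level Haar DWT (an assumption for illustration; the study used several wavelet bases and deeper decompositions, and the statistics fed to the kernel machines may differ):

```python
import math, statistics

def haar_dwt(signal):
    """Single-level orthonormal Haar DWT: (approximation, detail)."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def subband_features(signal):
    """Per-subband summary statistics of the kind used as classifier input."""
    feats = []
    for band in haar_dwt(signal):
        feats += [statistics.mean(band), statistics.stdev(band),
                  max(band), min(band)]
    return feats

# A toy "EEG" segment: a 10 Hz sine sampled at 256 Hz plus one spike.
eeg = [math.sin(2 * math.pi * 10 * k / 256) for k in range(256)]
eeg[100] += 5.0
features = subband_features(eeg)
```

Because the Haar transform is orthonormal, the subbands preserve the signal energy, so the statistics genuinely partition the signal's content between slow (approximation) and fast (detail) activity.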
Abstract:
The search for more realistic modeling of financial time series reveals several stylized facts of real markets. In this work we focus on the multifractal properties found in price and index signals. Although the usual minority game (MG) models do not exhibit multifractality, we study here one of its variants that does. We show that the nonsynchronous MG model in the nonergodic phase is multifractal and, in this sense, together with other stylized facts, constitutes a better modeling tool. Using the structure function (SF) approach we detected the stationary and scaling ranges of the time series generated by the MG model and, from the linear (nonlinear) behavior of the SF, we identified the fractal (multifractal) regimes. Finally, using the wavelet transform modulus maxima (WTMM) technique, we obtained the multifractal spectrum width for different dynamical regimes. (C) 2009 Elsevier Ltd. All rights reserved.
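The structure-function approach estimates scaling exponents ζ(q) from S_q(τ) = ⟨|x(t+τ) - x(t)|^q⟩ ∝ τ^ζ(q); a linear ζ(q) indicates a monofractal, a nonlinear one a multifractal. A minimal sketch of the estimator, run on an ordinary Gaussian random walk (a monofractal with H = 1/2, so ζ(2) ≈ 1) rather than on MG output:

```python
import math, random

def structure_function(x, q, taus):
    """S_q(tau) = mean of |x[t+tau] - x[t]|^q for each lag tau."""
    return [sum(abs(x[t + tau] - x[t]) ** q for t in range(len(x) - tau))
            / (len(x) - tau) for tau in taus]

def scaling_exponent(x, q, taus):
    """Least-squares slope of log S_q(tau) vs log tau, i.e. zeta(q)."""
    s = structure_function(x, q, taus)
    lt = [math.log(t) for t in taus]
    ls = [math.log(v) for v in s]
    mt, ms = sum(lt) / len(lt), sum(ls) / len(ls)
    return (sum((a - mt) * (b - ms) for a, b in zip(lt, ls))
            / sum((a - mt) ** 2 for a in lt))

random.seed(1)
walk = [0.0]
for _ in range(20000):
    walk.append(walk[-1] + random.gauss(0, 1))

zeta2 = scaling_exponent(walk, 2.0, [1, 2, 4, 8, 16, 32])
```

For the random walk, ζ(q) = q/2 for all q; deviations of the estimated ζ(q) curve from a straight line are exactly the multifractal signature the paper looks for in MG time series.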
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of parameter values. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Today several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image data applications, self-organizing methods for unsupervised classification have been successfully applied to cluster pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named Enhanced Independent Component Analysis Mixture Model (EICAMM), built by introducing modifications into the Independent Component Analysis Mixture Model (ICAMM). These modifications address some of the model's limitations and aim to make it more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) for image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segmenting images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results for the proposals presented herein. (C) 2008 Published by Elsevier B.V.
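The Sobel stage of the pre-processing pipeline is standard and easy to sketch; a pure-Python gradient-magnitude pass over a grayscale image (the SCS denoiser, which would run before this, is omitted, and the test image is synthetic):

```python
def sobel_magnitude(img):
    """Gradient magnitude of a 2-D grayscale image (list of lists)
    using the 3x3 Sobel kernels; border pixels are left at zero."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the response is strong only along the boundary.
img = [[0] * 4 + [10] * 4 for _ in range(8)]
edges = sobel_magnitude(img)
```

On the step image the magnitude is 40 on the two columns flanking the edge and 0 in the flat regions, which is exactly the kind of boundary map a segmentation stage can exploit.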
Abstract:
There are about 7500 water treatment plants in Brazil. The wastes these plants generate in their decantation tanks and filters are discharged directly into the same brooks and rivers that supply water for treatment. Another serious environmental problem is the unregulated disposal of construction and demolition rubble, which increases the expenditure of public resources, degrades the urban environment, and contributes to aggravated flooding and the proliferation of vectors harmful to public health. In this study, an evaluation was made of the possibility of recycling water treatment sludge in construction and demolition waste recycling plants. The axial compressive strength and water absorption of concretes and mortars produced with the exclusive and joint addition of these two types of waste were also determined. The eco-efficiency of this recycling was evaluated by determining the concentration of aluminum in the leached extract resulting from the solubilization of the recycled products. The production of concretes and mortars with the joint addition of water treatment sludge and recycled concrete rubble aggregates proved to be a viable recycling alternative from the standpoint of axial compressive strength, modulus of elasticity, water absorption and tensile strength by the Brazilian test method. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
In developing countries such as Brazil, the wastes generated in the decanters and filters of water treatment plants are discharged directly into the same rivers and streams that supply water for treatment. Another environmental problem is the unregulated discarding of wood wastes. The lumber and wood products industry generates large quantities of this waste, from logging to the manufacture of the end product. Brazil has few biomass plants, and therefore only a minor part of these wastes is reused. This paper presents the results of the first study involving a novel scientific and technological approach to evaluating the possibility of combining these two types of waste in the production of a lightweight composite for concrete. The concrete produced with cement:sand:composite:water mass ratios of 1:2.5:0.67:0.6 displayed an axial compressive strength of 11.1 MPa, a tensile strength by diametral compression of 1.2 MPa, water absorption of 8.8%, and a specific mass of 1847 kg/m³. The mechanical properties obtained with this concrete render it suitable for application in non-structural elements. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Optical monitoring systems are necessary to manufacture multilayer thin-film optical filters with tight tolerances on the spectrum specification. Furthermore, for better accuracy in the measurement of film thickness, direct monitoring is a must. Direct monitoring means acquiring spectrum data, in real time, from the optical component undergoing the film deposition itself. The high-vacuum evaporation chamber is the most popular equipment for depositing films on the surfaces of optical components. Inside the evaporator, at the top of the chamber, there is a metallic support with several holes in which the optical components are mounted. This metallic support rotates to promote film homogenization. To measure the spectrum of the film during deposition, it is necessary to pass a light beam through a glass witness undergoing the film deposition process and to collect a sample of the light beam with a spectrometer. As both the light beam and the light collector are stationary, a synchronization system is required to identify the moment at which the optical component passes through the light beam.
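The synchronization described above amounts to triggering the spectrometer only when a hole in the rotating support lets the beam reach the collector. A minimal sketch using a rising-edge threshold on a photodetector signal (the signal shape, threshold, and sampling are all assumptions for illustration):

```python
def crossing_triggers(photodetector, threshold):
    """Return the sample indices where the detector signal rises above
    the threshold, i.e. the moments the witness glass crosses the beam
    and a spectrum acquisition should be triggered."""
    return [i for i in range(1, len(photodetector))
            if photodetector[i] >= threshold > photodetector[i - 1]]

# Simulated rotation: the beam reaches the detector once every 10
# samples (one revolution); otherwise only stray light is seen.
signal = [1.0 if i % 10 == 0 else 0.05 for i in range(50)]
triggers = crossing_triggers(signal, 0.5)
```

Each trigger index marks one revolution, so the spectrometer acquires exactly one spectrum per pass of the monitored component through the beam.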
Abstract:
An experimental study of Polarization Dependent Loss (PDL) is performed in an Optical Recirculating Loop (RCL). The RCL makes it possible to simulate transmission through various optical links using just one optical fiber spool, one in-line amplifier, and some optical filters and devices, in a low-cost manner. Because of its statistical nature, the total PDL in a recirculating loop differs from the simple sum of the PDL of each element of the loop, since the alignment of the PDL elements varies with time depending on environmental conditions such as fiber stress and temperature. In this paper, theoretical studies are also performed using the formalism of Jones and Mueller matrices in order to represent the different optical elements in the recirculating loop. PDL must be correctly characterized in order to properly evaluate its impact on the performance of next-generation DWDM systems. A comparison of theoretical and experimental results shows that a depolarization of 7% occurs in the experimental setup, probably caused by the optical amplifier due to the depolarized nature of amplified spontaneous emission.
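In the Jones formalism, the PDL of an element or a cascade follows from the singular values of its 2x2 Jones matrix, and the cascade PDL depends on the relative alignment of the elements' axes, which is why it is not a simple sum. A minimal sketch with two identical partial polarizers (the element values and angles are illustrative, not from the experiment):

```python
import math

def matmul2(a, b):
    """Product of two 2x2 complex matrices (lists of lists)."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def pdl_db(j):
    """PDL in dB from a Jones matrix: 10 log10(Tmax/Tmin), where
    Tmax, Tmin are the eigenvalues of J^dagger J (2x2 closed form)."""
    jd = [[j[0][0].conjugate(), j[1][0].conjugate()],
          [j[0][1].conjugate(), j[1][1].conjugate()]]
    h = matmul2(jd, j)
    tr = (h[0][0] + h[1][1]).real
    det = (h[0][0] * h[1][1] - h[0][1] * h[1][0]).real
    d = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return 10 * math.log10((tr + d) / (tr - d))

def rot(theta):
    """Rotation of the polarization axes by theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[complex(c), complex(-s)], [complex(s), complex(c)]]

# A partial polarizer: field transmission 1.0 on one axis, 0.8 on the other.
elem = [[complex(1.0), 0j], [0j, complex(0.8)]]
# Aligned axes: the PDLs of the two elements add up (in dB).
aligned = pdl_db(matmul2(elem, elem))
# A 90-degree rotation between them: the two PDLs cancel completely.
crossed = pdl_db(matmul2(elem, matmul2(rot(math.pi / 2), elem)))
```

The aligned cascade has exactly twice the single-element PDL, while the crossed cascade has essentially zero, the two extremes between which the time-varying alignment in the loop makes the total PDL wander.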
Abstract:
This paper proposes a novel computer vision approach that processes video sequences of people walking and then recognises those people by their gait. Human motion carries different information that can be analysed in various ways. The skeleton carries motion information about human joints, and the silhouette carries information about boundary motion of the human body. Moreover, binary and gray-level images contain different information about human movements. This work proposes to recover these different kinds of information to interpret the global motion of the human body based on four different segmented image models, using a fusion model to improve classification. Our proposed method considers the set of the segmented frames of each individual as a distinct class and each frame as an object of this class. The methodology applies background extraction using the Gaussian Mixture Model (GMM), a scale reduction based on the Wavelet Transform (WT) and feature extraction by Principal Component Analysis (PCA). We propose four new schemas for motion information capture: the Silhouette-Gray-Wavelet model (SGW) captures motion based on grey level variations; the Silhouette-Binary-Wavelet model (SBW) captures motion based on binary information; the Silhouette-Edge-Binary model (SEW) captures motion based on edge information and the Silhouette Skeleton Wavelet model (SSW) captures motion based on skeleton movement. The classification rates obtained separately from these four different models are then merged using a new proposed fusion technique. The results suggest excellent performance in terms of recognising people by their gait.
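The final merging of the four models' classification rates can be sketched with the generic sum-rule baseline for score-level fusion (the paper proposes its own fusion technique, and the per-class scores below are made up; this only illustrates the general idea of combining SGW/SBW/SEW/SSW outputs):

```python
def fuse_scores(model_scores, weights=None):
    """Sum-rule fusion: combine per-class score dicts from several
    models into one fused score dict, optionally weighting each model.
    Returns the winning class label and the fused scores."""
    weights = weights or [1.0] * len(model_scores)
    fused = {}
    for scores, w in zip(model_scores, weights):
        for label, s in scores.items():
            fused[label] = fused.get(label, 0.0) + w * s
    return max(fused, key=fused.get), fused

# Hypothetical per-subject scores from the four segmented-image models;
# SBW disagrees, but the other three tip the fused decision to "alice".
sgw = {"alice": 0.6, "bob": 0.4}
sbw = {"alice": 0.3, "bob": 0.7}
sew = {"alice": 0.55, "bob": 0.45}
ssw = {"alice": 0.7, "bob": 0.3}
winner, fused = fuse_scores([sgw, sbw, sew, ssw])
```

The point of fusing is visible even in this toy case: a single model can be wrong on a given gait sample while the combined score still identifies the right person.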