886 results for Projection


Relevance: 10.00%

Abstract:

The random eigenvalue problem arises in frequency and mode shape determination for a linear system with uncertainties in structural properties. Among several methods of characterizing this random eigenvalue problem, one computationally fast method that gives good accuracy is a weak formulation using polynomial chaos expansion (PCE). In this method, the eigenvalues and eigenvectors are expanded in PCE, and the residual is minimized by a Galerkin projection. The goals of the current work are (i) to implement this PCE-characterized random eigenvalue problem in the dynamic response calculation under random loading and (ii) to explore the computational advantages and challenges. In the proposed method, the response quantities are also expressed in PCE, followed by a Galerkin projection. A numerical comparison with a perturbation method and Monte Carlo simulation shows that when the loading has a random amplitude but deterministic frequency content, the proposed method gives more accurate results than a first-order perturbation method and accuracy comparable to that of Monte Carlo simulation at lower computational cost. However, as the frequency content of the loading becomes random, or for general random process loadings, the method loses its accuracy and computational efficiency. Issues in implementation, limitations, and further challenges are also addressed.
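
The abstract centres on expanding random eigenvalues and responses in PCE and projecting onto the chaos basis. The sketch below illustrates the projection idea in its simplest, non-intrusive form for a single standard-normal germ and probabilists' Hermite polynomials; the 2-DOF matrices, quadrature order and function names are illustrative assumptions, and the paper's intrusive Galerkin formulation (joint residual minimization for eigenvalues, eigenvectors and responses) is not reproduced here.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from scipy.linalg import eigh

# Illustrative 2-DOF system whose stiffness depends on one Gaussian parameter xi.
K0 = np.array([[2.0, -1.0], [-1.0, 2.0]])
K1 = 0.2 * np.eye(2)                          # part of the stiffness scaled by xi
M = np.eye(2)

def smallest_eigenvalue(xi):
    """Lowest generalized eigenvalue of (K0 + xi*K1, M) for one realization of xi."""
    return eigh(K0 + xi * K1, M, eigvals_only=True)[0]

order = 4                                      # PCE order
nodes, weights = hermegauss(20)                # probabilists' Gauss-Hermite rule
weights = weights / np.sqrt(2.0 * np.pi)       # normalize: weights now sum to 1

samples = np.array([smallest_eigenvalue(x) for x in nodes])

# Projection onto He_k: lambda_k = E[lambda(xi) He_k(xi)] / E[He_k^2], with E[He_k^2] = k!
coeffs = []
for k in range(order + 1):
    basis_k = np.zeros(k + 1)
    basis_k[k] = 1.0                           # coefficient vector selecting He_k
    coeffs.append(np.sum(weights * samples * hermeval(nodes, basis_k)) / math.factorial(k))

mean = coeffs[0]
var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print("PCE mean and std of the lowest eigenvalue:", mean, np.sqrt(var))
```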

Relevance: 10.00%

Abstract:

In this paper, we present a machine learning approach for subject-independent human action recognition using a depth camera, emphasizing the importance of depth in the recognition of actions. The proposed approach uses the flow information of all 3 dimensions to classify an action. In our approach, we have obtained the 2-D optical flow and used it along with the depth image to obtain the depth flow (Z motion vectors). The obtained flow captures the dynamics of the actions in space-time. Feature vectors are obtained by averaging the 3-D motion over a grid laid over the silhouette in a hierarchical fashion; these hierarchical fine-to-coarse windows capture the motion dynamics of the object at various scales. The extracted features are used to train a Meta-cognitive Radial Basis Function Network (McRBFN) that uses a Projection Based Learning (PBL) algorithm, referred to as PBL-McRBFN henceforth. PBL-McRBFN begins with zero hidden neurons and builds the network based on the best human learning strategy, namely, self-regulated learning in a meta-cognitive environment. When a sample is used for learning, PBL-McRBFN uses the sample overlapping conditions and a projection based learning algorithm to estimate the parameters of the network. The performance of PBL-McRBFN is compared to that of Support Vector Machine (SVM) and Extreme Learning Machine (ELM) classifiers, with every person and action represented in both the training and testing datasets. The performance study shows that PBL-McRBFN outperforms these classifiers in recognizing actions in 3-D. Further, a subject-independent study is conducted using a leave-one-subject-out strategy and its generalization performance is tested. It is observed from the subject-independent study that McRBFN is capable of generalizing actions accurately. The performance of the proposed approach is benchmarked on the Video Analytics Lab (VAL) dataset and the Berkeley Multimodal Human Action Database (MHAD). (C) 2013 Elsevier Ltd. All rights reserved.
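
As a rough sketch of the kind of 3-D flow features described here, the code below computes 2-D optical flow with OpenCV, derives a depth-flow (Z motion) channel by differencing depth along the estimated motion, and averages the three channels over fine-to-coarse grids. The flow parameters, grid sizes and the use of the whole frame (rather than the segmented silhouette) are simplifying assumptions, and the PBL-McRBFN classifier itself is not shown.

```python
import cv2
import numpy as np

def depth_flow_features(prev_gray, next_gray, prev_depth, next_depth, levels=(2, 4, 8)):
    # 2-D optical flow (u, v) between consecutive grayscale frames
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    u, v = flow[..., 0], flow[..., 1]

    # Z motion: change in depth along the estimated 2-D motion (nearest-pixel warp)
    h, w = prev_depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xw = np.clip((xs + u).round().astype(int), 0, w - 1)
    yw = np.clip((ys + v).round().astype(int), 0, h - 1)
    z = next_depth[yw, xw].astype(np.float32) - prev_depth.astype(np.float32)

    # Hierarchical fine-to-coarse grids laid over the frame: average (u, v, z) per cell
    feats = []
    for g in levels:
        for mat in (u, v, z):
            cells = [np.mean(block) for row in np.array_split(mat, g, axis=0)
                                     for block in np.array_split(row, g, axis=1)]
            feats.extend(cells)
    return np.array(feats, dtype=np.float32)
```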

Relevance: 10.00%

Abstract:

Single receive antenna selection (AS) allows single-input single-output (SISO) systems to retain the diversity benefits of multiple antennas with minimal hardware cost. We propose a single receive AS method for time-varying channels that accounts for practical limitations imposed by next-generation wireless standards, such as training, packetization and antenna switching time. The proposed method utilizes low-complexity projections onto subspaces spanned by discrete prolate spheroidal (DPS) sequences. It uses only knowledge of the Doppler bandwidth, and does not need detailed correlation knowledge. Results show that the proposed AS method outperforms both an ideal conventional SISO system with perfect channel state information (CSI) but no AS at the receiver, and AS based on the conventional Fourier estimation/prediction method. A closed-form expression for the symbol error probability (SEP) of M-ary phase-shift keying (MPSK) with symbol-by-symbol receive AS is derived.
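
A minimal sketch of the subspace-projection ingredient: scipy's discrete prolate spheroidal (Slepian) sequences provide a basis for channels band-limited to the Doppler bandwidth, and noisy pilot-based channel estimates are least-squares fitted onto that basis to smooth and extrapolate the channel between pilots. The block length, normalized Doppler and dimension rule below are illustrative, not the paper's configuration; per antenna, the antenna whose projected channel is predicted to be strongest at the upcoming symbol would then be selected.

```python
import numpy as np
from scipy.signal.windows import dpss

M = 256                                   # block length in symbols
nu_max = 0.02                             # normalized maximum Doppler (f_d * T_s), assumed known
D = int(np.ceil(2 * nu_max * M)) + 1      # approximate signal-space dimension

# DPS sequences time-limited to M samples and band-limited to [-nu_max, nu_max]
basis = dpss(M, M * nu_max, D).T          # shape (M, D)

def project_channel(pilot_idx, pilot_est):
    """Least-squares fit of noisy pilot channel estimates onto the DPS subspace,
    returning a smoothed/extrapolated channel estimate over the whole block."""
    A = basis[pilot_idx, :]                          # (num_pilots, D)
    coef, *_ = np.linalg.lstsq(A, pilot_est, rcond=None)
    return basis @ coef                              # channel estimate for all M symbols
```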

Relevance: 10.00%

Abstract:

We present a new method for rapid NMR data acquisition and assignments applicable to unlabeled (C-12) or C-13-labeled biomolecules/organic molecules in general and metabolomics in particular. The method involves the acquisition of three two-dimensional (2D) NMR spectra simultaneously using a dual receiver system. The three spectra, namely (1) G-matrix Fourier transform (GFT) (3,2)D [C-13, H-1] HSQC-TOCSY, (2) 2D H-1-H-1 TOCSY and (3) 2D C-13-H-1 HETCOR, are acquired in a single experiment and provide mutually complementary information to completely assign individual metabolites in a mixture. The GFT (3,2)D [C-13, H-1] HSQC-TOCSY provides 3D correlations in a reduced dimensionality manner, facilitating high resolution and unambiguous assignments. The experiments were applied for complete H-1 and C-13 assignments of a mixture of 21 unlabeled metabolites corresponding to a medium used in assisted reproductive technology. Taken together, the experiments provide time gains of orders of magnitude compared to conventional data acquisition methods and can be combined with other fast NMR techniques such as non-uniform sampling and covariance spectroscopy. This opens new avenues for using multiple receivers and projection NMR techniques for high-throughput approaches in metabolomics.

Relevance: 10.00%

Abstract:

Elastic net regularizers have shown much promise in designing sparse classifiers for linear classification. In this work, we propose an alternating optimization approach to solve the dual problems of elastic net regularized linear classification Support Vector Machines (SVMs) and logistic regression (LR). One of the sub-problems turns out to be a simple projection. The other sub-problem can be solved using dual coordinate descent methods developed for non-sparse L2-regularized linear SVMs and LR, without altering their iteration complexity and convergence properties. Experiments on very large datasets indicate that the proposed dual coordinate descent-projection (DCD-P) methods are fast and achieve comparable generalization performance after the first pass through the data, with extremely sparse models.
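
The paper's alternating scheme pairs a dual coordinate descent step (as in L2-regularized linear SVM/LR solvers) with a sub-problem that reduces to a simple projection. As a hedged illustration of that second ingredient only, the snippet below shows two closed-form operations of the kind such splittings rely on: the proximal map of the L1 term (soft thresholding) and a Euclidean projection of dual variables onto a box. This is not the authors' exact dual sub-problem.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau*||.||_1: argmin_z 0.5*||z - v||^2 + tau*||z||_1 (closed form)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def project_box(alpha, lo=0.0, hi=1.0):
    """Euclidean projection of dual variables onto [lo, hi]^n (SVM-style box constraints)."""
    return np.clip(alpha, lo, hi)
```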

Relevance: 10.00%

Abstract:

A method to reliably extract object profiles even in the presence of surface discontinuities that lead to 2nπ phase jumps is proposed. The proposed method uses an amplitude-modulated Ronchi grating, which allows one to extract the phase and unwrap it with a single image. A Ronchi-equivalent image can be derived from the modified grating image, which aids in extracting the wrapped phase using Fourier transform profilometry. The amplitude of the modified grating aids in phase unwrapping. As only a projector that projects an amplitude-modulated grating is needed, the proposed method allows one to extract the three-dimensional profile without using full video projectors. This article also deals with noise reduction algorithms for fringe projection techniques. (C) 2014 Society of Photo-Optical Instrumentation Engineers (SPIE)
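
A minimal sketch of the Fourier transform profilometry step mentioned here: a single fringe image is transformed with a 2-D FFT, one carrier lobe is isolated with a band-pass mask, and the wrapped phase follows from the angle of the inverse transform. The carrier location and mask width are assumed known; the amplitude-modulated Ronchi grating and the amplitude-based unwrapping that distinguish the paper's method are not reproduced.

```python
import numpy as np

def wrapped_phase(fringe_img, carrier_col, half_width):
    """Extract the wrapped phase of a fringe pattern via 2-D FFT band-pass filtering."""
    F = np.fft.fftshift(np.fft.fft2(fringe_img.astype(float)))
    mask = np.zeros_like(F)
    c = fringe_img.shape[1] // 2 + carrier_col          # centre of the +1 order along columns
    mask[:, c - half_width:c + half_width] = 1.0        # keep only the +1 carrier lobe
    analytic = np.fft.ifft2(np.fft.ifftshift(F * mask)) # complex fringe signal
    return np.angle(analytic)                           # wrapped phase in (-pi, pi]
```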

Relevance: 10.00%

Abstract:

Global change in climate and the consequent large impacts on regional hydrologic systems have, in recent years, motivated significant research efforts in water resources modeling under climate change. In an integrated future hydrologic scenario, it is likely that water availability and demands will change significantly due to modifications in hydro-climatic variables such as rainfall, reservoir inflows, temperature, net radiation, wind speed and humidity. An integrated regional water resources management model should capture the likely impacts of climate change on water demands and water availability, along with the uncertainties associated with climate change impacts and with management goals and objectives under non-stationary conditions. Uncertainties in an integrated regional water resources management model, accumulating from various stages of decision making, include climate model and scenario uncertainty in the hydro-climatic impact assessment, uncertainty due to conflicting interests of the water users, and uncertainty due to the inherent variability of the reservoir inflows. This paper presents an integrated regional water resources management modeling approach that considers uncertainties at various stages of decision making by integrating a hydro-climatic variable projection model, a water demand quantification model, a water quantity management model and a water quality control model. Modeling tools of canonical correlation analysis, stochastic dynamic programming and fuzzy optimization are used in an integrated framework in the approach presented here. The proposed modeling approach is demonstrated with a case study of the Bhadra Reservoir system in Karnataka, India.

Relevance: 10.00%

Abstract:

We study the merging and splitting of quasi-two-dimensional Bose-Einstein condensates with strong dipolar interactions. We observe that if the dipoles have a non-zero component in the plane of the condensate, the dynamics of merging or splitting along two orthogonal directions, parallel and perpendicular to the projection of the dipoles onto the plane of the condensate, are different. The anisotropic merging and splitting of the condensate is a manifestation of the anisotropy of the roton-like mode in the dipolar system. The difference in dynamics disappears if the dipoles are oriented at right angles to the plane of the condensate, since in this case the Bogoliubov dispersion, despite having roton-like features, is isotropic.

Relevance: 10.00%

Abstract:

We apply the objective method of Aldous to the problem of finding the minimum-cost edge cover of the complete graph with random, independent and identically distributed edge costs. The limit, as the number of vertices goes to infinity, of the expected minimum cost for this problem is known via a combinatorial approach of Hessler and Wastlund. We provide a proof of this result using the machinery of the objective method and local weak convergence, which was used to prove the ζ(2) limit of the random assignment problem. A proof via the objective method is useful because it provides us with more information on the nature of the edges incident on a typical root in the minimum-cost edge cover. We further show that a belief propagation algorithm converges asymptotically to the optimal solution. This can be applied to a computational linguistics problem of semantic projection. The belief propagation algorithm yields a near-optimal solution with lower complexity than the best known algorithms designed for optimality in worst-case settings.
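
As a concrete handle on the object being analysed, the toy code below computes the exact minimum-cost edge cover of a small complete graph with i.i.d. uniform edge costs by enumeration, and averages it over random instances. It only illustrates the finite-n quantity whose large-n limit is discussed; neither the objective-method proof nor the belief propagation algorithm is sketched here.

```python
import itertools
import numpy as np

def min_cost_edge_cover(n, rng):
    """Exact minimum-cost edge cover of K_n by enumeration. With positive costs an
    optimal cover is minimal, hence uses between ceil(n/2) and n-1 edges."""
    edges = list(itertools.combinations(range(n), 2))
    costs = rng.random(len(edges))                  # i.i.d. Uniform(0,1) edge costs
    best = np.inf
    for r in range((n + 1) // 2, n):
        for subset in itertools.combinations(range(len(edges)), r):
            if len({v for e in subset for v in edges[e]}) == n:   # every vertex covered
                best = min(best, sum(costs[e] for e in subset))
    return best

rng = np.random.default_rng(0)
print("mean minimum cost, n = 5:", np.mean([min_cost_edge_cover(5, rng) for _ in range(200)]))
```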

Relevance: 10.00%

Abstract:

Several statistical downscaling models have been developed in the past couple of decades to assess the hydrologic impacts of climate change by projecting station-scale hydrological variables from large-scale atmospheric variables simulated by general circulation models (GCMs). This paper presents and compares different statistical downscaling models that use multiple linear regression (MLR), positive coefficient regression (PCR), stepwise regression (SR), and support vector machine (SVM) techniques for estimating monthly rainfall amounts in the state of Florida. Mean sea level pressure, air temperature, geopotential height, specific humidity, U wind, and V wind are used as the explanatory variables/predictors in the downscaling models. Data for these variables are obtained from the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalysis dataset and from simulations of the Canadian Centre for Climate Modelling and Analysis (CCCma) Coupled Global Climate Model, version 3 (CGCM3). Principal component analysis (PCA) and the fuzzy c-means clustering method (FCM) are used as part of the downscaling models to reduce the dimensionality of the dataset and to identify clusters in the data, respectively. Evaluation of the performance of the models using different error and statistical measures indicates that the SVM-based model performed better than all the other models in reproducing most monthly rainfall statistics at 18 sites. Output from the third-generation CGCM3 GCM for the A1B scenario was used for future projections. For the projection period 2001-10, MLR was used to relate variables at the GCM and NCEP grid scales. Use of MLR in linking the predictor variables at the GCM and NCEP grid scales yielded better reproduction of monthly rainfall statistics at most of the stations (12 out of 18) compared to the spatial interpolation technique used in earlier studies.
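
A hedged sketch of one of the compared pipelines: principal component analysis for dimensionality reduction of the large-scale predictor fields followed by support vector regression onto station-scale monthly rainfall, using scikit-learn. The synthetic arrays, component count and hyper-parameters below are placeholders rather than the values calibrated for the Florida sites, and the FCM clustering step is omitted.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR

# X: (months, grid_points * variables) large-scale predictors; y: (months,) station rainfall
rng = np.random.default_rng(0)
X = rng.normal(size=(360, 120))            # placeholder for MSLP, temperature, winds, ...
y = rng.gamma(2.0, 50.0, size=360)         # placeholder monthly rainfall (mm)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=10),
                      SVR(kernel="rbf", C=10.0, epsilon=1.0))
model.fit(X[:300], y[:300])                # calibrate on the reanalysis (NCEP) period
print("held-out R^2:", model.score(X[300:], y[300:]))
```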

Relevance: 10.00%

Abstract:

The use of Projection Reconstruction (PR) to obtain two-dimensional (2D) spectra from one-dimensional (1D) data in the solid state is illustrated. The method exploits multiple 1D spectra obtained using magic angle spinning and off-magic-angle spinning. The spectra, recorded under the influence of scaled heteronuclear scalar and dipolar couplings in the presence of homonuclear dipolar decoupling sequences, have been used to reconstruct J/D-resolved 2D NMR spectra. The use of just two 1D spectra is observed to be sufficient to reconstruct a J-resolved 2D spectrum, while a Separated Local Field (SLF) 2D NMR spectrum could be obtained from three 1D spectra. The experimental techniques for recording the 1D spectra and the reconstruction procedure are discussed, and the reconstructed results are compared with 2D experiments recorded using traditional methods. The technique has been applied to a solid polycrystalline sample and to a uniaxially oriented liquid crystal. Implementation of PR-NMR in the solid state provides high-resolution spectra and leads to a significant reduction in experimental time. The experiments are relatively simple and are devoid of several technical complications involved in performing the 2D experiments.
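
The core reconstruction idea can be illustrated in a few lines: each 1D projection is back-projected along its tilt direction and the pointwise minimum over projections is taken (the lowest-value reconstruction commonly used in PR-NMR). The grid, interpolation and angles below are arbitrary; how the scaled J/D couplings in the on/off magic-angle-spinning spectra define those projections is specific to the paper and not reproduced.

```python
import numpy as np

def lowest_value_reconstruction(projections, angles_rad, n):
    """Reconstruct an n x n spectrum from 1D projections P_j(r) taken at angles theta_j,
    by back-projection combined with a pointwise minimum over projections."""
    x = np.arange(n) - n / 2.0
    X, Y = np.meshgrid(x, x)
    recon = np.full((n, n), np.inf)
    for P, theta in zip(projections, angles_rad):
        r = X * np.cos(theta) + Y * np.sin(theta)              # coordinate along the projection axis
        idx = np.clip(np.round(r + n / 2.0).astype(int), 0, n - 1)
        recon = np.minimum(recon, np.asarray(P)[idx])          # smear back, keep the minimum
    return recon
```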

Relevance: 10.00%

Abstract:

The goal of this work is to reduce the cost of computing the coefficients in the Karhunen-Loeve (KL) expansion. The KL expansion serves as a useful and efficient tool for discretizing second-order stochastic processes with known covariance function. Its applications in engineering mechanics include discretizing random field models for elastic moduli, fluid properties, and structural response. The main computational cost of finding the coefficients of this expansion arises from numerically solving an integral eigenvalue problem with the covariance function as the integration kernel; mathematically, this is a homogeneous Fredholm integral equation of the second kind. One widely used method for solving this integral eigenvalue problem is to discretize the eigenfunctions in a finite element (FE) basis, followed by a Galerkin projection. This method is computationally expensive. In the current work it is first shown that the shape of the physical domain in a random field does not affect the realizations of the field estimated using the KL expansion, although the individual KL terms are affected. Based on this domain independence property, a numerical-integration-based scheme, accompanied by a modification of the domain, is proposed. In addition to presenting mathematical arguments to establish the domain independence, numerical studies are conducted to demonstrate and test the proposed method. Numerically it is demonstrated that, compared to the Galerkin method, the computational speed gain of the proposed method is three to four orders of magnitude for a two-dimensional example, and one to two orders of magnitude for a three-dimensional example, while retaining the same level of accuracy. It is also shown that for separable covariance kernels a further cost reduction of three to four orders of magnitude can be achieved. Both normal and lognormal fields are considered in the numerical studies. (c) 2014 Elsevier B.V. All rights reserved.
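
A minimal sketch of the numerical-integration route to the integral eigenvalue problem, for a 1-D field with exponential covariance on [0, L]: the Fredholm equation is discretized at quadrature points (a Nystrom-type scheme), symmetrized with the square-root weights, and solved as an ordinary symmetric eigenproblem. The rectangle rule, correlation length and truncation order are illustrative, and the paper's domain-modification idea is not reproduced.

```python
import numpy as np

def kl_terms(n_pts=200, length=1.0, corr_len=0.3, sigma2=1.0, n_terms=10):
    x = (np.arange(n_pts) + 0.5) * (length / n_pts)        # quadrature points
    w = np.full(n_pts, length / n_pts)                      # rectangle-rule weights
    C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

    # Symmetrized Nystrom eigenproblem: W^{1/2} C W^{1/2} phi = lambda phi
    sqW = np.sqrt(w)
    A = sqW[:, None] * C * sqW[None, :]
    lam, phi = np.linalg.eigh(A)
    lam, phi = lam[::-1][:n_terms], phi[:, ::-1][:, :n_terms]   # largest eigenvalues first
    eigfun = phi / sqW[:, None]                                 # back to unweighted eigenfunctions
    return x, lam, eigfun

# One realization of the field: sum_k sqrt(lambda_k) * xi_k * f_k(x), xi_k ~ N(0, 1)
x, lam, f = kl_terms()
xi = np.random.default_rng(0).standard_normal(lam.size)
field = f @ (np.sqrt(lam) * xi)
```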

Relevance: 10.00%

Abstract:

We consider the problem of optimizing the workforce of a service system. Adapting the staffing levels in such systems is non-trivial due to large variations in workload, and the large number of system parameters does not allow for a brute-force search. Further, because these parameters change on a weekly basis, the optimization should not take longer than a few hours. Our aim is to find the optimum staffing levels from a discrete high-dimensional parameter set that minimize the long-run average of the single-stage cost function, while adhering to constraints relating to queue stability and service-level agreement (SLA) compliance. The single-stage cost function balances the conflicting objectives of utilizing workers better and attaining the target SLAs. We formulate this problem as a constrained Markov cost process parameterized by the (discrete) staffing levels. We propose novel simultaneous perturbation stochastic approximation (SPSA)-based algorithms for solving this problem. The algorithms include both first-order and second-order methods and incorporate SPSA-based gradient/Hessian estimates for primal descent, while performing dual ascent for the Lagrange multipliers. Both algorithms are online and update the staffing levels in an incremental fashion. Further, they involve a certain generalized smooth projection operator, which is essential to project the continuous-valued worker parameter tuned by our algorithms onto the discrete set. The smoothness is necessary to ensure that the underlying transition dynamics of the constrained Markov cost process is itself smooth (as a function of the continuous-valued parameter): a critical requirement for proving the convergence of both algorithms. We validate our algorithms via performance simulations based on data from five real-life service systems. For comparison, we also implement a scatter-search-based algorithm using the state-of-the-art optimization toolkit OptQuest. From the experiments, we observe that both our algorithms converge empirically and consistently outperform OptQuest in most of the settings considered. This finding, coupled with the computational advantage of our algorithms, makes them amenable for adaptive labor staffing in real-life service systems.
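
A schematic of one first-order SPSA iteration for this kind of problem, assuming a black-box simulator simulate_cost that returns the long-run average cost for an integer staffing vector. The rounding, box bounds and step sizes here are stand-ins: the paper's algorithms additionally run dual ascent on Lagrange multipliers for the SLA/stability constraints and use a generalized smooth projection onto the discrete set rather than the simple clip-and-round used below.

```python
import numpy as np

def spsa_step(theta, simulate_cost, step, perturb, lo, hi, rng):
    """One first-order SPSA update on a continuous-valued staffing parameter theta."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)            # Rademacher perturbation
    cost_plus = simulate_cost(np.rint(theta + perturb * delta))  # only two simulations needed,
    cost_minus = simulate_cost(np.rint(theta - perturb * delta)) # whatever the dimension
    grad_hat = (cost_plus - cost_minus) / (2.0 * perturb * delta)
    theta = theta - step * grad_hat                              # primal descent
    return np.clip(theta, lo, hi)                                # keep theta in the feasible box
```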

Relevance: 10.00%

Abstract:

X-ray polarimeters based on the Time Projection Chamber (TPC) geometry are currently being studied and developed to make sensitive measurements of polarization in the 2-10 keV energy range. TPC soft X-ray polarimeters exploit the fact that the emission direction of the photoelectron ejected via the photoelectric effect in a gas proportional counter carries the information of the polarization of the incident X-ray photon. Operating parameters such as pressure, drift field and drift gap affect the performance of a TPC polarimeter. The simulations presented here showcase the effect of these operating parameters on the modulation factor of the TPC polarimeter. Garfield models are used to study the photoelectron interaction in the gas and the drift of the electron cloud towards the Gas Electron Multiplier (GEM). The emission direction is reconstructed from the image and the modulation factor is computed. Our study has shown that Ne/DME (50/50) at lower pressure and drift field can be used for a TPC polarimeter with a modulation factor of 50-65%.
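
As a small illustration of the figure of merit being optimized, the sketch below estimates a modulation factor from a set of reconstructed photoelectron emission angles, assuming a modulation curve of the form N(phi) proportional to 1 + mu*cos(2(phi - phi0)). The angles are generated synthetically here; in the study they would come from the Garfield-simulated track images.

```python
import numpy as np

def modulation_factor(phi):
    """Moment estimator: for f(phi) = (1 + mu*cos(2(phi - phi0))) / (2*pi),
    E[cos 2phi] = (mu/2) cos 2phi0 and E[sin 2phi] = (mu/2) sin 2phi0."""
    c, s = np.mean(np.cos(2 * phi)), np.mean(np.sin(2 * phi))
    return 2.0 * np.hypot(c, s)

# Synthetic emission angles with a 60% modulation (rejection sampling)
rng = np.random.default_rng(1)
mu_true, phi0 = 0.6, 0.3
phi = rng.uniform(-np.pi, np.pi, 200000)
keep = rng.random(phi.size) < (1 + mu_true * np.cos(2 * (phi - phi0))) / (1 + mu_true)
print("estimated mu:", modulation_factor(phi[keep]))
```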

Relevance: 10.00%

Abstract:

Time Projection Chamber (TPC) based X-ray polarimeters using Gas Electron Multipliers (GEMs) are currently being developed to make sensitive measurements of polarization in the 2-10 keV energy range. The emission direction of the photoelectron ejected via the photoelectric effect carries the information of the polarization of the incident X-ray photon. The performance of a gas-based polarimeter is affected by operating drift parameters such as gas pressure, drift field and drift gap. We present simulation studies carried out to understand the effect of these operating parameters on the modulation factor of a TPC polarimeter. Garfield models are used to study the photoelectron interaction in the gas and the drift of the electron cloud towards the GEM. Our study is aimed at achieving higher modulation factors by optimizing the drift parameters. The study has shown that Ne/DME (50/50) at lower pressure and drift field can lead to the desired performance of a TPC polarimeter.