133 results for "Sequential production"


Relevance: 20.00%

Abstract:

This study presents the synthesis, characterization, and kinetics of the steam reforming of methane and the water-gas shift (WGS) reaction over highly active and coke-resistant Zr0.93Ru0.05O2-δ. The catalyst showed high activity at low temperatures for both reactions. For the WGS reaction, 99% conversion of CO with 100% H2 selectivity was observed below 290 °C. Detailed kinetic studies, including the influence of gas-phase product species and the effects of temperature and catalyst loading on the reaction rates, were carried out. For the reforming reaction, the rate is first order in CH4 concentration and independent of CO and H2O concentrations, indicating that the adsorptive dissociation of CH4 is the rate-determining step. The catalyst also showed excellent coke resistance, even at a stoichiometric steam/carbon ratio. The lack of CO methanation activity is an important finding of the present study and is attributed to the ionic nature of the Ru species. An associative mechanism involving surface formate as an intermediate was used to correlate the experimental data. © 2013, Hydrogen Energy Publications, LLC. Published by Elsevier Ltd. All rights reserved.
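The reported rate law (first order in CH4, zero order in CO and H2O) can be sketched numerically. This is a minimal illustration only: the Arrhenius parameters `k0` and `ea` below are illustrative placeholders, not the paper's fitted values.

```python
import math

def reforming_rate(c_ch4, temp_K, k0=1.0e6, ea=1.0e5):
    """First-order steam-reforming rate r = k(T) * [CH4].

    As reported, the rate is first order in CH4 and independent of CO
    and H2O concentrations. k0 (pre-exponential, 1/s) and ea
    (activation energy, J/mol) are illustrative, not fitted, values.
    """
    R = 8.314  # gas constant, J/(mol K)
    k = k0 * math.exp(-ea / (R * temp_K))  # Arrhenius rate constant
    return k * c_ch4

# Doubling [CH4] doubles the rate; CO and H2O do not enter at all.
r1 = reforming_rate(0.01, 800.0)
r2 = reforming_rate(0.02, 800.0)
assert abs(r2 / r1 - 2.0) < 1e-9
# The rate also rises with temperature, as the Arrhenius factor dictates.
assert reforming_rate(0.01, 900.0) > reforming_rate(0.01, 800.0)
```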

Relevance: 20.00%

Abstract:

The paper focuses on the use of oxygen and steam as gasification agents in the thermochemical conversion of biomass to produce hydrogen-rich syngas, using a downdraft reactor configuration. Performance of the reactor is evaluated for different equivalence ratios (ER), steam-to-biomass ratios (SBR) and fuel moisture contents. The results are compared and evaluated against chemical equilibrium analysis and reaction kinetics, along with results available in the literature. The parametric study suggests that, as SBR increases, the hydrogen fraction in the syngas increases, but an increase in ER is needed to maintain the reactor temperature and stable operating conditions. SBR is varied from 0.75 to 2.7 and ER from 0.18 to 0.3. The peak hydrogen yield is found to be 104 g/kg of biomass at an SBR of 2.7. Further, significant enhancement in H2 yield and H2-to-CO ratio is observed in the higher SBR range (1.5-2.7) compared with the lower range (0.75-1.5). Experiments were conducted using wet wood chips to introduce moisture into the reacting system and to compare the performance with dry wood plus steam. The results clearly indicate that both hydrogen generation and gasification efficiency (η_g) are better in the latter case. With increasing SBR, gasification efficiency (η_g) and the lower heating value (LHV) of the syngas tend to decrease: an efficiency of 85.8% with an LHV of 8.9 MJ Nm^-3 is reported at an SBR of 0.75, compared with 69.5% efficiency and a lower LHV of 7.4 MJ Nm^-3 at an SBR of 2.7. These trends are explained by the energy required for steam generation and the extent of steam consumption during the reaction, which is subsequently reflected in the LHV of the syngas. From the analysis of the results, it is evident that reaction kinetics plays a crucial role in the conversion process.
The study also highlights the importance of reaction kinetics, which controls the overall performance in terms of efficiency, H2 yield, H2-to-CO fraction and syngas LHV, and their dependence on the process parameters SBR and ER. Copyright © 2013 John Wiley & Sons, Ltd.
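The efficiency trend can be illustrated with the standard cold-gas efficiency definition (syngas chemical energy per unit of fuel energy). This is a textbook formula, not the paper's exact accounting, and the gas yield and biomass LHV used below are illustrative assumptions.

```python
def cold_gas_efficiency(lhv_gas_MJ_per_Nm3, gas_yield_Nm3_per_kg,
                        lhv_biomass_MJ_per_kg):
    """Cold-gas efficiency (%): chemical energy in the syngas divided by
    the energy in the biomass fed. Shown only to illustrate why a higher
    steam-to-biomass ratio (SBR) tends to lower efficiency: the syngas
    LHV drops while steam-raising consumes extra energy."""
    return (100.0 * lhv_gas_MJ_per_Nm3 * gas_yield_Nm3_per_kg
            / lhv_biomass_MJ_per_kg)

# Illustrative comparison at a fixed (assumed) gas yield of 1.8 Nm3/kg
# and an assumed biomass LHV of 18.7 MJ/kg: the 8.9 vs 7.4 MJ/Nm3 gas
# LHVs mirror the low- and high-SBR cases reported above.
low_sbr = cold_gas_efficiency(8.9, 1.8, 18.7)
high_sbr = cold_gas_efficiency(7.4, 1.8, 18.7)
assert low_sbr > high_sbr
```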

Relevance: 20.00%

Abstract:

Structural Support Vector Machines (SSVMs) and Conditional Random Fields (CRFs) are popular discriminative methods for classifying structured and complex objects such as parse trees, image segments and part-of-speech tags. The datasets involved are very high-dimensional, and the models designed using typical training algorithms for SSVMs and CRFs are non-sparse. This non-sparseness results in slow inference. Thus, there is a need for new algorithms for sparse SSVM and CRF classifier design. The use of the elastic net and the L1 regularizer has already been explored for solving the primal CRF and SSVM problems, respectively, to design sparse classifiers. In this work, we focus on the dual elastic-net-regularized SSVM and CRF. By exploiting the weakly coupled structure of these convex programming problems, we propose a new sequential alternating proximal (SAP) algorithm to solve the dual problems. The algorithm works by sequentially visiting each training example and solving a simple subproblem restricted to the small subset of variables associated with that example. Numerical experiments on various benchmark sequence-labeling datasets demonstrate that the proposed algorithm scales well. Further, the classifiers designed are sparser than those obtained by solving the respective primal problems, with comparable generalization performance. Thus, the proposed SAP algorithm is a useful alternative for sparse SSVM and CRF classifier design.
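The elastic-net proximal step at the heart of such methods is easy to sketch. The code below is not the paper's SAP algorithm (which works example-wise on the dual SSVM/CRF problem); it applies the same proximal operator in a plain proximal-gradient loop on a toy elastic-net regression, to show how the L1 term produces exact zeros and hence sparse models.

```python
import numpy as np

def prox_elastic_net(v, step, l1, l2):
    """Proximal operator of step*(l1*||x||_1 + 0.5*l2*||x||^2):
    soft-threshold, then shrink by the ridge factor."""
    shrunk = np.sign(v) * np.maximum(np.abs(v) - step * l1, 0.0)
    return shrunk / (1.0 + step * l2)

def prox_gradient_elastic_net(A, b, l1=0.05, l2=0.001, iters=500):
    """Proximal-gradient (ISTA-style) sketch for
    0.5*||Ax - b||^2 + l1*||x||_1 + 0.5*l2*||x||^2.
    Illustrative only; the paper's SAP method instead visits one
    training example at a time on the dual problem."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth part
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the loss
        x = prox_elastic_net(x - step * grad, step, l1, l2)
    return x

# Toy problem: only the first 3 of 10 coefficients are truly nonzero.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = prox_gradient_elastic_net(A, b)
# The L1 term drives the irrelevant coordinates to (near) zero.
assert np.max(np.abs(x_hat[3:])) < 0.05
```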

Relevance: 20.00%

Abstract:

This paper describes a spatio-temporal registration approach for speech articulation data obtained from electromagnetic articulography (EMA) and real-time magnetic resonance imaging (rtMRI), motivated by the potential for combining the complementary advantages of the two types of data. The registration method is validated on EMA and rtMRI datasets obtained at different times but using the same stimuli. The aligned corpus offers the advantages of high temporal resolution (from EMA) and a complete mid-sagittal view (from rtMRI). The co-registration also yields the optimum placement of EMA sensors as articulatory landmarks on the magnetic resonance images, providing richer spatio-temporal information about articulatory dynamics. © 2014 Acoustical Society of America.
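The temporal half of such a co-registration can be illustrated with a simple lag search by normalized cross-correlation. This toy stand-in aligns two 1-D signals only; the paper's method is considerably more elaborate and handles the spatial registration as well.

```python
import numpy as np

def best_lag(x, y, max_lag=50):
    """Return the integer lag that best aligns y to x, found by
    maximizing the Pearson correlation over candidate lags. A toy
    illustration of temporal alignment between two recordings."""
    best, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:lag], y[-lag:]
        n = min(len(a), len(b))
        a, b = a[:n], b[:n]
        score = np.dot(a - a.mean(), b - b.mean()) / (n * a.std() * b.std())
        if score > best_score:
            best, best_score = lag, score
    return best

# Two copies of the same trajectory, offset by a known 7-sample shift.
t = np.linspace(0, 4 * np.pi, 400)
x = np.sin(t)
y = np.roll(x, -7)  # y leads x by 7 samples
assert best_lag(x, y) == 7
```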

Relevance: 20.00%

Abstract:

We consider ZH and WH production at the Large Hadron Collider, where the Higgs decays to a b b̄ pair. We use jet substructure techniques to reconstruct the Higgs boson and construct angular observables involving the leptonic decay products of the vector bosons. These efficiently discriminate between the tensor structure of the HVV vertex expected in the Standard Model and that arising from possible new physics, as quantified by higher-dimensional operators. This can then be used to examine the CP nature of the Higgs, as well as CP-mixing effects in the HZZ and HWW vertices separately. © 2014 Elsevier B.V.

Relevance: 20.00%

Abstract:

Taxol® (generic name paclitaxel) is one of the most clinically valuable natural products discovered in recent decades. More than two decades have elapsed since the notable discovery of the first Taxol®-producing endophytic fungus, which was followed by a plethora of reports on other endophytes with similar biosynthetic potential. However, industrial-scale Taxol® production using fungal endophytes, although seemingly promising, has not seen the light of day. In this opinion article, we survey the current state of knowledge on Taxol® biosynthesis, focusing on the chemical ecology of its producers, and ask whether it is actually possible to produce Taxol® using endophyte biotechnology. The key problems that have prevented the exploitation of potent endophytic fungi in industrial bioprocesses for sustained Taxol® production are discussed.

Relevance: 20.00%

Abstract:

Our work is motivated by the impromptu (or "as-you-go") deployment of wireless relay nodes along a path, a need that arises in many situations. In this paper, the path is modeled as starting at the origin (where the data sink, e.g., the control center, is located) and evolving randomly over a lattice in the positive quadrant. A person walks along the path, deploying relay nodes as he goes. At each step, the path can randomly either continue in the same direction, take a turn, or come to an end, at which point a data source (e.g., a sensor) that will send packets to the data sink has to be placed. A decision has to be made at each step whether or not to place a wireless relay node. Assuming that the packet generation rate at the source is very low, and simple link-by-link scheduling, we consider the problem of sequential relay placement so as to minimize the expectation of an end-to-end cost metric (a linear combination of the sum of convex hop costs and the number of relays placed). This impromptu relay placement problem is formulated as a total-cost Markov decision process. First, we derive the optimal policy in terms of an optimal placement set and show that this set is characterized by a boundary (with respect to the position of the last placed relay) beyond which it is optimal to place the next relay. Next, based on a simpler one-step-look-ahead characterization of the optimal policy, we propose an algorithm that provably converges to the optimal placement set in a finite number of steps and is faster than value iteration. We show by simulations that the distance-threshold-based heuristic usually assumed in the literature is close to optimal, provided that the threshold distance is carefully chosen. © 2014 Elsevier B.V. All rights reserved.
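The threshold structure induced by a convex hop cost can be sketched with a toy one-step-look-ahead rule. Everything here is a simplification under stated assumptions: the quadratic hop cost, the relay cost of 4.0 and the deterministic comparison are illustrative, not the paper's MDP-derived policy.

```python
def should_place(d, relay_cost=4.0, hop_cost=lambda r: r ** 2):
    """Toy one-step-look-ahead rule for impromptu relay placement.

    With a convex hop cost (here r**2, an illustrative choice),
    stretching the current hop from d to d+1 steps costs
    hop_cost(d+1) - hop_cost(d), which grows with d. Place a relay
    once that marginal stretch cost exceeds the cost of a relay plus
    starting a fresh one-step hop. This mimics the distance-threshold
    structure of the optimal policy; the paper's actual rule comes
    from the total-cost MDP, not this simplification."""
    stretch = hop_cost(d + 1) - hop_cost(d)  # cost of deferring one step
    fresh = relay_cost + hop_cost(1)         # cost of placing now
    return stretch > fresh

# Convexity makes the rule a distance threshold: never place for small d,
# always place beyond some d*.
decisions = [should_place(d) for d in range(1, 8)]
assert decisions == sorted(decisions)  # False, ..., False, True, ..., True
```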

Relevance: 20.00%

Abstract:

USC-TIMIT is an extensive database of multimodal speech production data, developed to complement existing resources available to the speech research community and intended to be continuously refined and augmented. The database currently includes real-time magnetic resonance imaging data from five male and five female speakers of American English. Electromagnetic articulography data have also been collected from four of these speakers. The two modalities were recorded in two independent sessions while the subjects produced the same 460-sentence corpus used previously in the MOCHA-TIMIT database. In both cases the audio signal was recorded and synchronized with the articulatory data. The database and companion software are freely available to the research community. © 2014 Acoustical Society of America.

Relevance: 20.00%

Abstract:

This paper considers cooperative spectrum sensing algorithms for cognitive radios that focus on reducing the number of samples needed for a reliable detection. We propose algorithms based on decentralized sequential hypothesis testing, in which the cognitive radios sequentially collect observations, make local decisions, and send them to a fusion center for further processing into a final decision on spectrum usage. The reporting channel between the cognitive radios and the fusion center is modeled, more realistically, as a Multiple Access Channel (MAC) with receiver noise. Furthermore, the communication for reporting is limited, reducing the communication cost. We start with an algorithm in which the fusion center uses an SPRT-like (Sequential Probability Ratio Test) procedure, and we analyze its performance theoretically. Asymptotically, its performance is close to that of the optimal centralized test without fusion center noise. We then modify this algorithm to improve its performance at practical operating points. Finally, we generalize these algorithms to handle uncertainties in SNR and fading. © 2014 Elsevier B.V. All rights reserved.
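The fusion-center building block is Wald's classical SPRT, sketched below for a Gaussian mean. This is the textbook test only; the paper's decentralized setting layers local decisions, a noisy MAC reporting channel and SNR uncertainty on top of it, and the means, sigma and error targets below are illustrative choices.

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald's Sequential Probability Ratio Test for a Gaussian mean
    (H0: mean mu0 vs H1: mean mu1, known sigma). Returns the decision
    and the number of samples consumed before stopping."""
    A = math.log((1 - beta) / alpha)   # upper threshold: accept H1
    B = math.log(beta / (1 - alpha))   # lower threshold: accept H0
    llr, n = 0.0, 0
    for n, x in enumerate(samples, 1):
        # log-likelihood-ratio increment for one Gaussian observation
        llr += (mu1 - mu0) * (x - 0.5 * (mu0 + mu1)) / sigma ** 2
        if llr >= A:
            return "H1", n
        if llr <= B:
            return "H0", n
    return "undecided", n

# Observations consistently at mu1 cross the upper threshold after only
# 6 samples (each adds 0.5 to the LLR; the threshold is ln(19) ~ 2.94).
assert sprt([1.0] * 100) == ("H1", 6)
```

The sequential stopping rule is exactly what lets the cognitive radios above economize on samples: easy instances terminate early instead of using a fixed sample size.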

Relevance: 20.00%

Abstract:

The concurrent planning of sequential saccades offers a simple model for studying the nature of visuomotor transformations, since the second saccade vector needs to be remapped to foveate the second target after the first saccade. Remapping is thought to occur through egocentric mechanisms involving an efference copy of the first saccade that is available around the time of its onset. In contrast, an exocentric representation of the second target relative to the first target, if available, can be used to code the second saccade vector directly. While human volunteers performed a modified double-step task, we examined the role of exocentric encoding in concurrent saccade planning by shifting the first target location well before the efference copy could be used by the oculomotor system. The impact of the first target shift on concurrent processing was tested by examining the end-points of second saccades following a shift of the second target during the first saccade. The frequency of second saccades to the old versus the new location of the second target, and the propagation of first-saccade localization errors, both indices of concurrent processing, were significantly reduced in trials with the first target shift compared to those without it. A similar decrease in concurrent processing was obtained when we shifted the first target but kept the second saccade vector constant. Overall, these results suggest that the brain can use relatively stable visual landmarks, independently of efference-copy-based egocentric mechanisms, for the concurrent planning of sequential saccades.

Relevance: 20.00%

Abstract:

We estimate the transverse single-spin asymmetry (TSSA) in the process e + p↑ → J/ψ + X using the color evaporation model of charmonium production. We take into account the transverse-momentum-dependent (TMD) evolution of the Sivers function and the parton distribution function, and show that there is a reduction in the asymmetry compared to our earlier estimates, in which the Q^2 evolution was implemented only through DGLAP evolution of the unpolarized gluon densities.

Relevance: 20.00%

Abstract:

Measurement of the self-coupling of the 125 GeV Higgs boson is one of the most crucial tasks for the high-luminosity run of the LHC, and it can only be measured in the di-Higgs final state. In the minimal supersymmetric standard model, the heavy CP-even Higgs (H) can decay into a pair of the lighter 125 GeV Higgs bosons (h) and can therefore influence the rate of di-Higgs production. We investigate the role of single H production in the context of measuring the self-coupling of h. We find that the H → hh decay can change the extracted value of the Higgs (h) self-coupling substantially in the low tan β regime, where the mass of the heavy Higgs boson lies between 250 and 600 GeV; depending on the parameter space, this may appear as an enhancement of the self-coupling of the 125 GeV Higgs boson.

Relevance: 20.00%

Abstract:

We analyse the hVV (V = W, Z) vertex in a model-independent way using Vh production. To that end, we consider possible corrections to the Standard Model Higgs Lagrangian in the form of higher-dimensional operators that parametrise the effects of new physics. In our analysis, we pay special attention to linear observables that can be used to probe CP violation in this vertex. Considering the associated production of a Higgs boson with a vector boson (W or Z), we use jet substructure methods to define angular observables that are sensitive to new-physics effects, including an asymmetry that is linearly sensitive to the presence of CP-odd effects. We demonstrate how to use these observables to place bounds on the presence of higher-dimensional operators, and quantify these statements using a log-likelihood analysis. Our approach allows one to probe the hZZ and hWW vertices separately, involving arbitrary combinations of BSM operators, at the Large Hadron Collider.