962 results for Variational Principle


Relevance: 20.00%

Publisher:

Abstract:

In CO2 capture from power generation, the energy penalties for the capture are one of the main challenges. Nowadays, post-combustion methods have energy penalties lower than oxy-combustion and pre-combustion technologies. One of the main disadvantages of the post-combustion method is that the capture of CO2 at atmospheric pressure requires quite large equipment for the high flow rates of flue gas, and the low partial pressure of the CO2 generates an important loss of energy. The Allam cycle presented by NETPOWER gives high efficiencies in power production and low energy penalties. A simulation of this cycle is made together with simulations of power plants with pre-combustion and post-combustion capture, and without capture, for natural gas and for coal. The simulations give lower efficiencies than those proposed by NETPOWER: for natural gas the efficiency is 52% instead of the 59% presented, and 33% instead of 51% in the case of using coal as fuel. Problems are brought to light in the CO2 compressor due to the high flow of CO2 that is compressed up to 300 bar to be recycled into the combustor.

Relevance: 20.00%

Publisher:

Abstract:

Remote sensing imaging systems for the measurement of oceanic sea states have recently attracted renewed attention. Imaging technology is economical, non-invasive and enables a better understanding of the space-time dynamics of ocean waves over an area rather than at the selected point locations of previous monitoring methods (buoys, wave gauges, etc.). We present recent progress in space-time measurement of ocean waves using stereo vision systems on offshore platforms. Both traditional disparity-based systems and modern elevation-based ones are presented in a variational optimization framework: the main idea is to pose the stereoscopic reconstruction problem of the surface of the ocean in a variational setting and design an energy functional whose minimizer is the desired temporal sequence of wave heights. The functional combines photometric observations as well as spatial and temporal smoothness priors. Disparity methods estimate the disparity between images as an intermediate step toward retrieving the depth of the waves with respect to the cameras, whereas elevation methods estimate the ocean surface displacements directly in 3-D space. Both techniques are used to measure ocean waves from real data collected at offshore platforms in the Black Sea (Crimean Peninsula, Ukraine) and the Northern Adriatic Sea (Venice coast, Italy). Then, the statistical and spectral properties of the resulting observed waves are analyzed. We show the advantages and disadvantages of the presented stereo vision systems and discuss the improvement of their performance in critical issues such as the robustness of the camera calibration in spite of undesired variations of the camera parameters.
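As a toy illustration of the variational setting described above, the sketch below minimizes a one-dimensional energy combining a data (photometric-like) term with a spatial smoothness prior by gradient descent. The function name, weights, and synthetic wave profile are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def reconstruct_surface(observations, lam=5.0, steps=2000, lr=0.02):
    """Minimize E(h) = sum_i (h_i - obs_i)^2 + lam * sum_i (h_{i+1} - h_i)^2
    by gradient descent: a 1-D toy of an energy functional with a
    data term plus a spatial smoothness prior (hypothetical example)."""
    h = np.zeros_like(observations)
    for _ in range(steps):
        data_grad = 2.0 * (h - observations)
        diffs = h[1:] - h[:-1]
        smooth_grad = np.zeros_like(h)
        smooth_grad[1:] += 2.0 * diffs
        smooth_grad[:-1] -= 2.0 * diffs
        h -= lr * (data_grad + lam * smooth_grad)
    return h

# noisy samples of a smooth synthetic wave profile
x = np.linspace(0.0, 2.0 * np.pi, 50)
obs = np.sin(x) + 0.3 * np.random.default_rng(0).normal(size=50)
surface = reconstruct_surface(obs)
```

The smoothness weight `lam` trades fidelity to the observations against regularity of the recovered surface, the same trade-off encoded by the priors in the abstract's functional.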

Relevance: 20.00%

Publisher:

Abstract:

Recent developments to fit the so-called Free Formulation into a variational framework have suggested the possibility of introducing a new category of error estimates for finite element computations. Such error estimates are based on differences between certain multifield functionals, which give the same value for the true solution. In the present paper the formulation of some estimates of this kind is introduced for elasticity and plate bending problems, and several examples of their performance are discussed. The observed numerical behavior of the new accuracy measures seems to be acceptable from an engineering point of view. However, further numerical experimentation is still needed to establish practical tolerance levels for real problems.

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this paper is to sketch the outlines of what will, hopefully, become my PhD thesis: the proposal of an evolutionary leadership behavioral pattern, Hoshin Kanri, that enables process owners in organizations to attain alignment by achieving their goals while increasing trust. I intend to develop such a behavioral pattern or "proper way" (善道) by adopting a network perspective over two organizational dimensions: the process dimension and the process owner dimension. In both dimensions I will propose measures of their topological structure and functionality. From there I will propose quantifiable characteristics that will help me enunciate several hypotheses about their dynamical and evolutionary characteristics. I intend to test these hypotheses in the field and report the results to enable others to challenge this research.

Relevance: 20.00%

Publisher:

Abstract:

The calibration results of one anemometer equipped with several rotors of varying size were analyzed. In each case, the 30-pulses-per-turn output signal of the anemometer was studied using Fourier series decomposition and correlated with the anemometer factor (i.e., the anemometer transfer function). Also, a 3-cup analytical model was correlated to the data resulting from the wind tunnel measurements. Results indicate good correlation between the post-processed output signal and the working condition of the cup anemometer. This correlation was also reflected in the results from the proposed analytical model. The present work reveals the possibility of remotely checking cup anemometer status, indicating the presence of anomalies and, therefore, a decrease in the wind sensor's reliability.
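A minimal sketch of the kind of Fourier-series decomposition mentioned above, assuming a synthetic rotation-speed signal sampled 30 times per turn with a third harmonic (the oscillation one would expect from a 3-cup rotor). The signal shape and amplitudes are made up for illustration, not wind-tunnel data.

```python
import numpy as np

def harmonic_amplitudes(signal, n_harmonics=4):
    """Fourier-series decomposition of one rotation of a sampled signal:
    returns the amplitudes of the mean term and the first harmonics."""
    n = len(signal)
    coeffs = np.fft.rfft(signal) / n
    amps = np.abs(coeffs[: n_harmonics + 1])
    amps[1:] *= 2.0  # fold the one-sided spectrum into full amplitudes
    return amps

# synthetic rotation-speed signal, 30 pulses per turn: a mean speed plus
# a 3rd harmonic (one speed oscillation per cup of a 3-cup rotor)
theta = np.linspace(0.0, 2.0 * np.pi, 30, endpoint=False)
signal = 10.0 + 0.8 * np.cos(3.0 * theta)
amps = harmonic_amplitudes(signal)
```

A change in the relative size of the harmonics between calibrations could then serve as the kind of remote status indicator the abstract describes.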

Relevance: 20.00%

Publisher:

Abstract:

Validating modern oceanographic theories using models produced through stereo computer vision principles has recently emerged. Space-time (4-D) models of the ocean surface may be generated by stacking a series of 3-D reconstructions independently generated for each time instant or, in a more robust manner, by simultaneously processing several snapshots coherently in a true "4-D reconstruction." However, the accuracy of these computer-vision-generated models is subject to the estimation of camera parameters, which may be corrupted under the influence of natural factors such as wind and vibrations. Therefore, removing the unpredictable errors of the camera parameters is necessary for an accurate reconstruction. In this paper, we propose a novel algorithm that can jointly perform a 4-D reconstruction as well as correct the camera parameter errors introduced by external factors. The technique is founded upon variational optimization methods to benefit from their numerous advantages: continuity of the estimated surface in space and time, robustness, and accuracy. The performance of the proposed algorithm is tested using synthetic data produced through computer graphics techniques, based on which the errors of the camera parameters arising from natural factors can be simulated.
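The joint-estimation idea can be caricatured with a toy least-squares problem: two views of the same surface, one corrupted by an unknown calibration offset, solved by alternating minimization. The offset model and all numbers are invented for illustration and are far simpler than the paper's variational 4-D formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
true_h = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
true_bias = 0.7  # hypothetical calibration drift of the second view

obs1 = true_h + 0.05 * rng.normal(size=n)
obs2 = true_h + true_bias + 0.05 * rng.normal(size=n)

# alternating minimization of ||h - obs1||^2 + ||h + b - obs2||^2,
# jointly recovering the surface h and the unknown offset b
b = 0.0
for _ in range(30):
    h = 0.5 * (obs1 + obs2 - b)   # optimal heights given the offset
    b = float(np.mean(obs2 - h))  # optimal offset given the heights
```

The alternation converges geometrically here; the paper's setting replaces the scalar offset with full camera parameters and the quadratic terms with a spatio-temporal variational energy.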

Relevance: 20.00%

Publisher:

Abstract:

In the maximum parsimony (MP) and minimum evolution (ME) methods of phylogenetic inference, evolutionary trees are constructed by searching for the topology that shows the minimum number of mutational changes required (M) and the smallest sum of branch lengths (S), respectively, whereas in the maximum likelihood (ML) method the topology showing the highest likelihood (L) of observing a given data set is chosen. However, the theoretical basis of the optimization principle remains unclear. We therefore examined the relationships of M, S, and L for the MP, ME, and ML trees with those for the true tree by using computer simulation. The results show that M and S are generally greater for the true tree than for the MP and ME trees when the number of nucleotides examined (n) is relatively small, whereas L is generally lower for the true tree than for the ML tree. This finding indicates that the optimization principle tends to give incorrect topologies when n is small. To deal with this disturbing property of the optimization principle, we suggest that more attention should be given to testing the statistical reliability of an estimated tree rather than to finding the optimal tree with excessive effort. When a reliability test is conducted, simplified MP, ME, and ML algorithms such as the neighbor-joining method generally give conclusions about phylogenetic inference very similar to those obtained by the more extensive tree search algorithms.
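For the parsimony criterion, the minimum number of changes M at a single site can be computed with Fitch's small-parsimony algorithm; the sketch below is a generic textbook version on an invented tree and site pattern, not the simulation code used in the study.

```python
def fitch(node):
    """Fitch small parsimony for one site on a rooted binary tree.
    Leaves are one-letter nucleotide strings; internal nodes are
    (left, right) tuples. Returns (candidate_states, n_changes)."""
    if isinstance(node, str):
        return {node}, 0
    left_states, left_m = fitch(node[0])
    right_states, right_m = fitch(node[1])
    common = left_states & right_states
    if common:  # states agree: no extra change needed at this node
        return common, left_m + right_m
    return left_states | right_states, left_m + right_m + 1

# site pattern A, A, G, T on the tree ((A,A),(G,T))
states, m = fitch((("A", "A"), ("G", "T")))  # m -> 2 changes
```

Summing `m` over all sites gives the tree's parsimony score M; the MP method then searches topologies for the smallest such total.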

Relevance: 20.00%

Publisher:

Abstract:

Transport of peptides across the membrane of the endoplasmic reticulum for assembly with MHC class I molecules is an essential step in antigen presentation to cytotoxic T cells. This task is performed by the major histocompatibility complex-encoded transporter associated with antigen processing (TAP). Using a combinatorial approach, we have analyzed the substrate specificity of human TAP at high resolution and in the absence of any given sequence context, revealing the contribution of each peptide residue in stabilizing binding to TAP. Human TAP was found to be highly selective, with peptide affinities covering at least three orders of magnitude. Interestingly, the selectivity is not equally distributed over the substrate. Only the N-terminal three positions and the C-terminal residue are critical, whereas effects from other peptide positions are negligible. A major influence from the peptide backbone was uncovered by peptide scans and libraries containing D-amino acids. Again, independent of peptide length, critical positions were clustered near the peptide termini. These approaches demonstrate that human TAP is selective, with residues determining the affinity located in distinct regions, and point to the role of the peptide backbone in binding to TAP. This binding mode of TAP has implications in an optimized repertoire selection and in a coevolution with the major histocompatibility complex/T cell receptor complex.
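The finding that only a few positions carry the selectivity can be mimicked with a toy additive, position-specific scoring model. The weights, residues, and 9-mer length below are hypothetical illustrations chosen only to mirror the described pattern, not measured TAP affinities.

```python
# hypothetical per-position weights for a 9-mer: large contributions at
# the N-terminal three positions and the C-terminus, tiny elsewhere,
# mirroring the distribution of selectivity described in the abstract
POSITION_WEIGHT = [3.0, 2.5, 2.0, 0.1, 0.1, 0.1, 0.1, 0.1, 3.0]

def binding_score(peptide, favorable="RKY"):
    """Sum independent positional contributions: a position adds its
    weight when it holds a (toy) 'favorable' residue."""
    return sum(w for res, w in zip(peptide, POSITION_WEIGHT)
               if res in favorable)

high = binding_score("RKYAAAAAR")  # favorable residues at the termini
low = binding_score("AAARKYAAA")   # same residues buried mid-peptide
```

In such an additive model, the same residues score very differently depending on where they sit, which is exactly the positional clustering the combinatorial libraries revealed.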

Relevance: 20.00%

Publisher:

Abstract:

In this paper, the chemical reactivity of C3 of phosphoenolpyruvate (PEP) has been analyzed in terms of density functional theory quantified through quantum chemistry calculations. PEP is involved in a number of important enzymatic reactions, in which its C3 atom behaves like a base. In three different enzymatic reactions analyzed here, C3 sometimes behaves like a soft base and sometimes behaves like a hard base in terms of the hard-soft acid-base principle. This dual nature of C3 of PEP was found to be related to the conformational change of the molecule. This leads to a testable hypothesis: that PEP adopts particular conformations in the enzyme-substrate complexes of different PEP-using enzymes, and that the enzymes control the reactivity through controlling the dihedral angle between the carboxylate and the C=C double bond of PEP.
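The hard-soft acid-base language above rests on the conceptual-DFT definition of chemical hardness, eta = (I - A) / 2 in the finite-difference approximation. The sketch below just encodes that standard formula with made-up energies for two conformations; these are not computed values for PEP.

```python
def hardness(ionization_energy, electron_affinity):
    """Chemical hardness eta = (I - A) / 2, the finite-difference
    approximation of conceptual DFT; energies in eV."""
    return (ionization_energy - electron_affinity) / 2.0

def softness(ionization_energy, electron_affinity):
    """Global softness taken here as 1 / eta (one common convention)."""
    return 1.0 / hardness(ionization_energy, electron_affinity)

# hypothetical values for two conformations of a reactive center:
# a larger gap between I and A means a harder, less polarizable site
eta_hard = hardness(9.0, 1.0)  # -> 4.0 eV
eta_soft = hardness(7.0, 2.0)  # -> 2.5 eV
```

A conformational change that narrows the gap between I and A would thus shift the site from hard toward soft behavior, the mechanism the hypothesis attributes to the enzymes' control of the dihedral angle.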

Relevance: 20.00%

Publisher:

Abstract:

In the cerebral cortex, the small volume of the extracellular space in relation to the volume enclosed by synapses suggests an important functional role for this relationship. It is well known that there are atoms and molecules in the extracellular space that are absolutely necessary for synapses to function (e.g., calcium). I propose here the hypothesis that the rapid shift of these atoms and molecules from extracellular to intrasynaptic compartments represents the consumption of a shared, limited resource available to local volumes of neural tissue. Such consumption results in a dramatic competition among synapses for resources necessary for their function. In this paper, I explore a theory in which this resource consumption plays a critical role in the way local volumes of neural tissue operate. On short time scales, this principle of resource consumption permits a tissue volume to choose those synapses that function in a particular context and thereby helps to integrate the many neural signals that impinge on a tissue volume at any given moment. On longer time scales, the same principle aids in the stable storage and recall of information. The theory provides one framework for understanding how cerebral cortical tissue volumes integrate, attend to, store, and recall information. In this account, the capacity of neural tissue to attend to stimuli is intimately tied to the way tissue volumes are organized at fine spatial scales.

Relevance: 20.00%

Publisher:

Abstract:

We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.
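A concrete instance of the composability requirement is the Tsallis entropy, whose group law on independent systems is S_q(A x B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B). The sketch below verifies this identity numerically; the distributions and the value of q are arbitrary choices for illustration.

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), with k_B = 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# two independent systems: joint probabilities are the products p_i * r_j
pA = np.array([0.5, 0.3, 0.2])
pB = np.array([0.6, 0.4])
q = 1.5

joint = np.outer(pA, pB).ravel()
lhs = tsallis(joint, q)
rhs = (tsallis(pA, q) + tsallis(pB, q)
       + (1.0 - q) * tsallis(pA, q) * tsallis(pB, q))
# the composition (group) law makes lhs and rhs agree
```

For q -> 1 the extra cross term vanishes and the law reduces to the ordinary additivity of the Shannon entropy, which is the sense in which the group structure generalizes the classical composition axiom.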