141 results for High-Dimensional Space Geometrical Informatics (HDSGI)


Relevance:

30.00%

Publisher:

Abstract:

Operator spaces of Hilbertian JC∗-triples E are considered in the light of the universal ternary ring of operators (TRO) introduced in recent work. For these operator spaces, it is shown that their triple envelope (in the sense of Hamana) is the TRO they generate, that a complete isometry between any two of them is always the restriction of a TRO isomorphism, and that distinct operator space structures on a fixed E are never completely isometric. In the infinite-dimensional cases, operator space structure is shown to be characterized by severe and definite restrictions upon finite-dimensional subspaces. Injective envelopes are explicitly computed.

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the possibility of combining moderate vacuum frying with post-frying application of a high vacuum during the oil drainage stage, with the aim of reducing the oil content of potato chips. Potato slices were initially vacuum fried under two operating conditions (140 °C, 20 kPa and 162 °C, 50.67 kPa) until the moisture content reached 10 and 15 % (wet basis), prior to holding the samples in the headspace under a high vacuum (1.33 kPa). This two-stage process was found to reduce the amount of oil taken up by the potato chips by as much as 48 % compared with drainage at the same pressure as the frying pressure. Reducing the pressure to 1.33 kPa lowered the water saturation temperature to about 11 °C, causing the product to continuously lose moisture during the course of drainage. The continuous release of water vapour prevented the occluded surface oil from penetrating into the product structure and released it from the surface of the product. When frying and drainage occurred at the same pressure, the temperature of the product fell below the water saturation temperature soon after it was lifted out of the oil, which resulted in oil being sucked into the product. Thus, lowering the pressure after frying to a value well below the frying pressure is a promising way to reduce oil uptake by the product.
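
The link between drainage pressure and water saturation temperature can be checked with the Antoine equation for water. The short sketch below is an illustrative calculation, not part of the paper; the coefficients are the commonly tabulated values for water in the 1-100 °C range, with pressure in mmHg and temperature in °C.

```python
import math

def water_saturation_temp_c(pressure_kpa: float) -> float:
    """Water saturation temperature (deg C) from the Antoine equation.

    Antoine: log10(p_mmHg) = A - B / (C + T), so T = B / (A - log10(p_mmHg)) - C.
    A, B, C are the standard coefficients for water between 1 and 100 deg C.
    """
    A, B, C = 8.07131, 1730.63, 233.426
    p_mmhg = pressure_kpa * 760.0 / 101.325   # convert kPa to mmHg
    return B / (A - math.log10(p_mmhg)) - C

# Pressures mentioned in the study (kPa): atmospheric, the two frying pressures, drainage.
for p in (101.325, 50.67, 20.0, 1.33):
    print(f"{p:7.2f} kPa -> saturation temperature {water_saturation_temp_c(p):5.1f} deg C")
```

At 1.33 kPa the equation gives roughly 11 °C, consistent with the value quoted above, which is why the product keeps flashing off moisture while it drains under high vacuum.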

Relevance:

30.00%

Publisher:

Abstract:

Listeria monocytogenes is a psychrotrophic food-borne pathogen that is problematic for the food industry because of its ubiquitous distribution in nature and its ability to grow at low temperatures and in the presence of high salt concentrations. Here we demonstrate that the process of adaptation to low temperature after cold shock includes elevated levels of cold shock proteins (CSPs) and that the levels of CSPs are also elevated after treatment with high hydrostatic pressure (HHP). Two-dimensional gel electrophoresis combined with Western blotting performed with anti-CspB of Bacillus subtilis was used to identify four 7-kDa proteins, designated Csp1, Csp2, Csp3, and Csp4. In addition, Southern blotting revealed four chromosomal DNA fragments that reacted with a csp probe, which also indicated that a CSP family is present in L. monocytogenes LO28. After a cold shock in which the temperature was decreased from 37°C to 10°C, the levels of Csp1 and Csp3 increased 10- and 3.5-fold, respectively, but the levels of Csp2 and Csp4 were not elevated. Pressurization of L. monocytogenes LO28 cells resulted in 3.5- and 2-fold increases in the levels of Csp1 and Csp2, respectively. Strikingly, the level of survival after pressurization of cold-shocked cells was 100-fold higher than that of cells growing exponentially at 37°C. These findings imply that cold-shocked cells are protected from HHP treatment, which may affect the efficiency of combined preservation techniques.

Relevance:

30.00%

Publisher:

Abstract:

The goal of this paper is to study and further develop the orthogonality sampling or stationary waves algorithm for the detection of the location and shape of objects from the far field pattern of scattered waves in electromagnetics or acoustics. Orthogonality sampling can be seen as a special beam-forming algorithm with some links to the point source method and to the linear sampling method. The basic idea of orthogonality sampling is to sample the space under consideration by calculating scalar products of the measured far field pattern u^∞ with a test function exp(iκ x̂·y) for all y in a subset Q of the space ℝ^m, m = 2, 3. The way in which this is carried out is important to extract the information which the scattered fields contain. The theoretical foundation of orthogonality sampling is only partly resolved, and the goal of this work is to initiate further research by numerical demonstration of the high potential of the approach. We implement the method in a two-dimensional setting for the Helmholtz equation, which represents electromagnetic scattering when the setup is independent of the third coordinate. We show reconstructions of the location and shape of objects from measurements of the scattered field for one or several directions of incidence and one or many frequencies or wave numbers, respectively. In particular, we visualize the indicator function for both the Dirichlet and Neumann boundary conditions and for complicated inhomogeneous media.
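
To make the sampling idea concrete, here is a small numerical sketch (illustrative only, not code from the paper): the far field pattern of a point-like scatterer at z is synthesised as exp(-iκ x̂·z), and the indicator is the magnitude of its scalar product with the test function exp(iκ x̂·y) evaluated over a grid of sampling points y.

```python
import numpy as np

# Hypothetical far field data: a point-like scatterer at z_true radiates a far field
# pattern proportional to exp(-1j * kappa * xhat . z_true) in direction xhat.
kappa = 10.0
z_true = np.array([0.3, -0.2])
angles = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
xhat = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # measurement directions
u_far = np.exp(-1j * kappa * xhat @ z_true)                 # simulated measurements

# Orthogonality sampling indicator: magnitude of the scalar product of u_far with
# the test function exp(1j * kappa * xhat . y), for every grid point y in the domain Q.
grid = np.linspace(-1.0, 1.0, 101)
Y1, Y2 = np.meshgrid(grid, grid)
pts = np.stack([Y1.ravel(), Y2.ravel()], axis=1)            # sampling points y
indicator = np.abs(np.exp(1j * kappa * pts @ xhat.T) @ u_far) / len(angles)
indicator = indicator.reshape(Y1.shape)

# The indicator peaks near the true scatterer location.
i, j = np.unravel_index(np.argmax(indicator), indicator.shape)
print("indicator peak at y =", (grid[j], grid[i]), "  true position z =", z_true)
```

For extended objects and multiple incident directions or frequencies, the same indicator is accumulated over the available data, which is the setting studied in the paper.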

Relevance:

30.00%

Publisher:

Abstract:

In 'Avalanche', a group of players must lower an object while all of them stay in contact with it throughout. Normally the task is easily accomplished; however, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The behaviour is seen to arise only when the geometric effects apply (number of players greater than the degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various other situations: conflicting strategic objectives in organizations, the Prisoners' Dilemma, and integrated bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
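
As a purely illustrative toy model (not the authors' system dynamics model), the sketch below encodes the two balancing loops described above for each player: ease the finger down while in contact, and push back up quickly after losing contact. With several players and over-correction, the object's height ratchets upwards instead of descending.

```python
import numpy as np

# Toy "Avalanche" sketch: the object rests on the highest fingers; each player has
# two balancing loops -- lower the finger slowly while in contact, and raise it
# quickly after losing contact. Over-correction on the second loop lets the
# interaction between players push the object upwards.
rng = np.random.default_rng(1)
n_players, steps, dt = 8, 300, 0.05
lower_rate, raise_rate, tol, noise = 0.1, 1.0, 0.005, 0.05

fingers = np.zeros(n_players)
trace = []
for _ in range(steps):
    obj = fingers.max()                          # object height = highest finger
    gap = obj - fingers
    in_contact = gap < tol                       # loop 1: in contact, ease downwards
    step = np.where(in_contact, -lower_rate, raise_rate)   # loop 2: chase the object
    fingers += dt * step + dt * noise * rng.standard_normal(n_players)
    trace.append(fingers.max())

print(f"object height: start {trace[0]:+.2f}, end {trace[-1]:+.2f}")  # drifts upwards
```

With a single player the same rules simply lower the object; the upward 'chase' only appears once several players interact, echoing the geometric condition (more players than degrees of freedom of the object) identified in the paper.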

Relevance:

30.00%

Publisher:

Abstract:

In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with the FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants' recognition rates for nine emotions were high, and human and FACSGen expressions were rated as highly similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research.

Relevance:

30.00%

Publisher:

Abstract:

The currently available model-based global data sets of atmospheric circulation are a by-product of the daily requirement of producing initial conditions for numerical weather prediction (NWP) models. These data sets have been quite useful for studying fundamental dynamical and physical processes, and for describing the nature of the general circulation of the atmosphere. However, due to limitations in the early data assimilation systems and inconsistencies caused by numerous model changes, the available model-based global data sets may not be suitable for studying global climate change. A comprehensive analysis of global observations based on a four-dimensional data assimilation system with a realistic physical model should be undertaken to integrate space and in situ observations to produce internally consistent, homogeneous, multivariate data sets for the earth's climate system. The concept is equally applicable for producing data sets for the atmosphere, the oceans, and the biosphere, and such data sets will be quite useful for studying global climate change.

Relevance:

30.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been intensively used for many years. Weather services have thus based their analysis not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations: we have fairly good coverage of surface observations 8 times a day, and several upper air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the 12-hour step that is applied most often, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.

Relevance:

30.00%

Publisher:

Abstract:

While stirring and mixing properties in the stratosphere are reasonably well understood in the context of balanced (slow) dynamics, as is evidenced in numerous studies of chaotic advection, the strongly enhanced presence of high-frequency gravity waves in the mesosphere gives rise to a significant unbalanced (fast) component of the flow. The present investigation analyses results from two idealized shallow-water numerical simulations representative of stratospheric and mesospheric dynamics on a quasi-horizontal isentropic surface. A generalization of the Hua–Klein Eulerian diagnostic to divergent flow reveals that velocity gradients are strongly influenced by the unbalanced component of the flow. The Lagrangian diagnostic of patchiness nevertheless demonstrates the persistence of coherent features in the zonal component of the flow, in contrast to the destruction of coherent features in the meridional component. Single-particle statistics demonstrate t² scaling for both the stratospheric and mesospheric regimes in the case of zonal dispersion, and distinctive scaling laws for the two regimes in the case of meridional dispersion. This is in contrast to two-particle statistics, which in the mesospheric (unbalanced) regime show a more rapid approach to Richardson's t³ law in the case of zonal dispersion and give evidence of enhanced meridional dispersion.
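
As background on how such diagnostics are computed, the sketch below (illustrative only, with synthetic trajectories standing in for the simulated particle paths) evaluates single-particle (absolute) and two-particle (relative) dispersion and estimates the power-law exponent by a log-log fit; the t² and t³ scalings quoted above correspond to exponents of 2 and 3 in such a fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 500, 400, 1.0

# Synthetic trajectories: a per-particle mean drift plus a random walk, used only
# as a stand-in for particles advected on the isentropic surface.
vel = 0.5 + 0.1 * rng.standard_normal((n_particles, 2))
x = np.zeros((n_steps, n_particles, 2))
for k in range(1, n_steps):
    x[k] = x[k - 1] + dt * vel + 0.2 * rng.standard_normal((n_particles, 2))

t = np.arange(1, n_steps) * dt
# Single-particle (absolute) dispersion: mean squared displacement from the start.
abs_disp = np.mean(np.sum((x[1:] - x[0]) ** 2, axis=-1), axis=1)
# Two-particle (relative) dispersion: mean squared separation of particle pairs.
half = n_particles // 2
sep = x[1:, :half] - x[1:, half:]
rel_disp = np.mean(np.sum(sep ** 2, axis=-1), axis=1)

# Estimate the exponent alpha in dispersion ~ t**alpha by a log-log fit.
for name, d in (("single-particle", abs_disp), ("two-particle", rel_disp)):
    alpha = np.polyfit(np.log(t), np.log(d), 1)[0]
    print(f"{name} dispersion scales roughly as t^{alpha:.2f}")
```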

Relevance:

30.00%

Publisher:

Abstract:

This paper generalises and applies recently developed blocking diagnostics in a two-dimensional latitude-longitude context, which takes into consideration both mid- and high-latitude blocking. These diagnostics identify characteristics of the associated wave-breaking as seen in the potential temperature (θ) on the dynamical tropopause, in particular the cyclonic or anticyclonic Direction of wave-Breaking (DB index), and the Relative Intensity (RI index) of the air masses that contribute to blocking formation. The methodology is extended to a 2-D domain and a cluster technique is deployed to classify mid- and high-latitude blocking according to the wave-breaking characteristics. Mid-latitude blocking is observed over Europe and Asia, where the meridional gradient of θ is generally weak, whereas high-latitude blocking is mainly present over the oceans, to the north of the jet-stream, where the meridional gradient of θ is much stronger. They occur respectively on the equatorward and poleward flank of the jet-stream, where the horizontal shear ∂u/∂y is positive in the first case and negative in the second case. A regional analysis is also conducted. It is found that cold-anticyclonic and cyclonic blocking divert the storm-track respectively to the south and to the north over the East Atlantic and western Europe. Furthermore, warm-cyclonic blocking over the Pacific and cold-anticyclonic blocking over Europe are identified as the most persistent types and are associated with large-amplitude anomalies in temperature and precipitation. Finally, the high-latitude, cyclonic events seem to correlate well with low-frequency modes of variability over the Pacific and Atlantic Ocean.

Relevance:

30.00%

Publisher:

Abstract:

Simulations of ozone loss rates using a three-dimensional chemical transport model and a box model during recent Antarctic and Arctic winters are compared with experimentally derived loss rates. The study focused on the Antarctic winter 2003, during which the first Antarctic Match campaign was organized, and on the Arctic winters 1999/2000 and 2002/2003. The maximum ozone loss rates retrieved by the Match technique for the winters and levels studied reached 6 ppbv per sunlit hour, and both types of simulations could generally reproduce the observations at the 2-sigma error bar level. In some cases, for example for the Arctic winter 2002/2003 at the 475 K level, an excellent agreement within the 1-sigma standard deviation level was obtained. An overestimation was also found with the box model simulation at some isentropic levels for the Antarctic winter and the Arctic winter 1999/2000, indicating an overestimation of chlorine activation in the model. Loss rates in the Antarctic show signs of saturation in September, which has to be considered in the comparison. Sensitivity tests were performed with the box model in order to assess the impact of the kinetic parameters of the ClO-Cl2O2 catalytic cycle and of the total bromine content on the ozone loss rate. These tests resulted in a maximum change in ozone loss rates of 1.2 ppbv per sunlit hour, generally in high solar zenith angle conditions. In some cases, better agreement was achieved with the fastest photolysis of Cl2O2 and an additional source of total inorganic bromine, but at the expense of overestimating the smaller ozone loss rates derived later in the winter.

Relevance:

30.00%

Publisher:

Abstract:

Simulations of polar ozone losses were performed using the three-dimensional high-resolution (1° × 1°) chemical transport model MIMOSA-CHIM. Three Arctic winters (1999–2000, 2001–2002, 2002–2003) and three Antarctic winters (2001, 2002, and 2003) were considered for the study. The cumulative ozone loss in the Arctic winter 2002–2003 reached around 35% at 475 K inside the vortex, as compared to more than 60% in 1999–2000. During 1999–2000, denitrification induces a maximum of about 23% extra ozone loss at 475 K, as compared to 17% in 2002–2003. Unlike these two colder Arctic winters, the 2001–2002 Arctic winter was warmer and did not experience much ozone loss. Sensitivity tests showed that the chosen resolution of 1° × 1° provides a better evaluation of ozone loss at the edge of the polar vortex in high solar zenith angle conditions. The simulation results for ozone, ClO, HNO3, N2O, and NOy for the winters 1999–2000 and 2002–2003 were compared with measurements on board the ER-2 and Geophysica aircraft, respectively. Sensitivity tests showed that increasing the heating rates calculated by the model by 50% and doubling the PSC (polar stratospheric cloud) particle density (from 5 × 10⁻³ to 10⁻² cm⁻³) refines the agreement with in situ ozone, N2O and NOy levels. In this configuration, simulated ClO levels are increased and are in better agreement with observations in January, but are overestimated by about 20% in March. The use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections further increases ClO levels slightly, especially in high solar zenith angle conditions. Comparisons of the modelled ozone values with ozonesonde measurements in the Antarctic winter 2003 and with Polar Ozone and Aerosol Measurement III (POAM III) measurements in the Antarctic winters 2001 and 2002 show that the simulations underestimate the ozone loss rate at the end of the ozone destruction period. A slightly better agreement is obtained with the use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections.

Relevance:

30.00%

Publisher:

Abstract:

Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy is presented that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data selection procedure allows the use of larger localization domains that may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear and a nonlinear advection model, using both in situ and remote sounding observations, are discussed.
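
To make the subspace limitation concrete, the following sketch (an illustrative stochastic ensemble Kalman filter update, not code from the paper) builds the Kalman gain from an ensemble of m = 10 members in an n = 100 dimensional state space and verifies that every analysis increment lies in the span of the m - 1 ensemble anomalies.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 100, 10, 40                     # state dim, ensemble size, number of obs

# Prior ensemble and its anomalies (perturbations about the ensemble mean).
ens = rng.standard_normal((n, m))
anom = (ens - ens.mean(axis=1, keepdims=True)) / np.sqrt(m - 1)

# Linear observation operator and perturbed observations (stochastic EnKF).
H = rng.standard_normal((p, n))
r = 0.5 ** 2                              # observation-error variance
y = rng.standard_normal(p)
y_pert = y[:, None] + np.sqrt(r) * rng.standard_normal((p, m))

# Kalman gain built from the ensemble covariance Pf = anom @ anom.T.
S = H @ anom                              # observed anomalies
K = anom @ S.T @ np.linalg.inv(S @ S.T + r * np.eye(p))
increments = K @ (y_pert - H @ ens)       # analysis increments for each member

# Every increment lies in the span of the ensemble anomalies: projecting onto that
# subspace leaves a numerically zero residual.
Q, _ = np.linalg.qr(anom)                 # orthonormal basis of the anomaly subspace
residual = increments - Q @ (Q.T @ increments)
print("rank of anomaly subspace:", np.linalg.matrix_rank(anom))    # at most m - 1
print("max residual outside subspace:", np.abs(residual).max())    # ~ machine precision
```

Because the gain has the form K = A Sᵀ(S Sᵀ + R)⁻¹ with A the anomaly matrix, any increment K d is a linear combination of the columns of A, which is exactly the limitation that motivates assimilating only the observational components that matter most.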

Relevance:

30.00%

Publisher:

Abstract:

New representations and efficient calculation methods are derived for the problem of propagation from an infinite, regularly spaced array of coherent line sources above a homogeneous impedance plane, and for the Green's function for sound propagation in the canyon formed by two infinitely high, parallel, rigid or sound-soft walls and an impedance ground surface. The infinite sum of source contributions is replaced by a finite sum, and the remainder is expressed as a Laplace-type integral. A pole subtraction technique is used to remove poles in the integrand which lie near the path of integration, yielding a smooth integrand more suitable for numerical integration, and a specific numerical integration method is proposed. Numerical experiments show highly accurate results across the frequency spectrum for a range of ground surface types. It is expected that the methods proposed will prove useful in boundary element modeling of noise propagation in canyon streets and in ducts, and for problems of scattering by periodic surfaces.
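
The pole subtraction idea can be illustrated with a generic one-dimensional example; the sketch below is not taken from the paper, and the integrand exp(-t)/(t - t0) is purely hypothetical. The singular part residue/(t - t0) is subtracted so that the remainder is smooth and cheap to integrate numerically, and the subtracted term is added back using its analytic antiderivative log(t - t0).

```python
import numpy as np
from scipy.integrate import quad

# Integrand with a simple pole at t0 lying just off the integration path [0, 1].
t0 = 0.5 + 0.01j
f = lambda t: np.exp(-t) / (t - t0)

# Residue of f at t0; the subtracted term residue/(t - t0) integrates analytically.
residue = np.exp(-t0)
smooth = lambda t: f(t) - residue / (t - t0)                  # smooth near t0
analytic = residue * (np.log(1.0 - t0) - np.log(0.0 - t0))    # exact integral of the pole term

def complex_quad(g, a, b):
    """Integrate a complex-valued g over [a, b]; return (value, total evaluations)."""
    out_re = quad(lambda t: g(t).real, a, b, full_output=1)
    out_im = quad(lambda t: g(t).imag, a, b, full_output=1)
    return out_re[0] + 1j * out_im[0], out_re[2]["neval"] + out_im[2]["neval"]

direct, n_direct = complex_quad(f, 0.0, 1.0)          # adaptive quadrature fights the spike
rest, n_rest = complex_quad(smooth, 0.0, 1.0)         # smooth remainder integrates cheaply
print("direct quadrature :", direct, f"({n_direct} evaluations)")
print("pole subtraction  :", rest + analytic, f"({n_rest} evaluations)")
```

Both routes return the same value, but the smoothed integrand needs far fewer function evaluations; the paper applies the same principle to the Laplace-type remainder integrals whose poles lie near the path of integration.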