934 results for compression


Relevance:

10.00%

Publisher:

Abstract:

This paper presents an improved parallel Two-Pass Hexagonal (TPA) algorithm, constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS), for motion estimation. Motion Vectors (MV) are generated from the first-pass LHMEA and used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of Macroblocks (MBs). We introduced the hashtable into video processing and completed a parallel implementation. The hashtable structure of LHMEA is improved compared to the original TPA and LHMEA. We propose and evaluate parallel implementations of the LHMEA stage of TPA on clusters of workstations for real-time video compression. The implementation contains spatial and temporal approaches. The performance of the algorithm is evaluated using standard video sequences and the results are compared to current algorithms.
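
For illustration, the sketch below (Python, not taken from the paper) shows the two-pass idea in miniature: a hashtable built from the reference frame supplies a motion-vector predictor per macroblock, and a small local search refines it. The block signature used as the hash key, the SAD cost and all parameter values are simplifying assumptions rather than the LHMEA/HEXBS specifics.

import numpy as np

MB = 16  # macroblock size in pixels

def signature(blk):
    # Coarse block signature used as the hash key (quadrant means); an assumption,
    # not the linear hashtable function of the paper.
    return (int(blk[:8, :8].mean()), int(blk[:8, 8:].mean()),
            int(blk[8:, :8].mean()), int(blk[8:, 8:].mean()))

def sad(blk, ref, x, y):
    # Sum of absolute differences against the reference frame at (x, y); inf if out of bounds.
    if x < 0 or y < 0 or y + MB > ref.shape[0] or x + MB > ref.shape[1]:
        return np.inf
    return np.abs(blk.astype(int) - ref[y:y + MB, x:x + MB].astype(int)).sum()

def build_table(ref):
    # First-pass support structure (stand-in for the LHMEA hashtable): index every
    # reference-frame block position by its signature for O(1) predictor lookup.
    table = {}
    for y in range(ref.shape[0] - MB + 1):
        for x in range(ref.shape[1] - MB + 1):
            table.setdefault(signature(ref[y:y + MB, x:x + MB]), (x, y))
    return table

def two_pass_mv(cur, ref, table, x, y, radius=2):
    # Pass 1: motion-vector predictor from the hashtable lookup.
    # Pass 2: small refinement around the predictor (a stand-in for HEXBS), so only
    # a handful of candidate macroblock positions are evaluated.
    blk = cur[y:y + MB, x:x + MB]
    px, py = table.get(signature(blk), (x, y))
    best = (px - x, py - y, sad(blk, ref, px, py))
    for ddy in range(-radius, radius + 1):
        for ddx in range(-radius, radius + 1):
            c = sad(blk, ref, px + ddx, py + ddy)
            if c < best[2]:
                best = (px - x + ddx, py - y + ddy, c)
    return best[:2]

rng = np.random.default_rng(0)
ref = rng.integers(0, 255, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))            # simulate a uniform shift
print(two_pass_mv(cur, ref, build_table(ref), 16, 16))   # should recover the imposed shift (-3, -2)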

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a novel two-pass algorithm constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS) for block-based motion compensation. Building on previous algorithms, especially the fast motion estimation algorithm known as hexagonal search (HEXBS), we propose the LHMEA and the Two-Pass Algorithm (TPA). We introduced the hashtable into video compression. In this paper we employ LHMEA for the first-pass search over all Macroblocks (MB) in the picture. Motion Vectors (MV) are then generated from the first pass and used as predictors for the second-pass HEXBS motion estimation, which searches only a small number of MBs. The evaluation of the algorithm considers three important metrics: time, compression rate and PSNR. The performance of the algorithm is evaluated using standard video sequences and the results are compared to current algorithms. Experimental results show that the proposed algorithm can offer the same compression rate as the Full Search. LHMEA with TPA significantly improves on HEXBS and points to a direction for improving other fast motion estimation algorithms, for example Diamond Search.
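
As a companion to the predictor sketch above, the following Python sketch illustrates the hexagonal search pattern itself: a large six-point hexagon is re-centred on its best point until the centre is the minimum, then a small four-point pattern refines the result. The toy cost function stands in for the macroblock-matching cost; this is not the authors' implementation.

LARGE_HEX = [(-2, 0), (2, 0), (-1, -2), (1, -2), (-1, 2), (1, 2)]
SMALL_HEX = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def hexagon_search(cost, start=(0, 0), max_iters=32):
    # Coarse stage: keep re-centring the large hexagon on its best point until the
    # centre itself is the minimum, then do one pass of the small hexagon.
    (cx, cy), best = start, cost(*start)
    for _ in range(max_iters):
        cand = [(cx + dx, cy + dy) for dx, dy in LARGE_HEX]
        costs = [cost(x, y) for x, y in cand]
        if min(costs) >= best:
            break
        i = costs.index(min(costs))
        (cx, cy), best = cand[i], costs[i]
    for dx, dy in SMALL_HEX:
        c = cost(cx + dx, cy + dy)
        if c < best:
            (cx, cy), best = (cx + dx, cy + dy), c
    return (cx, cy), best

# Toy block-matching cost with its minimum at displacement (3, -2):
print(hexagon_search(lambda x, y: (x - 3) ** 2 + (y + 2) ** 2))   # ((3, -2), 0)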

Relevance:

10.00%

Publisher:

Abstract:

The high variability of the intensity of suprathermal electron flux in the solar wind is usually ascribed to the high variability of sources on the Sun. Here we demonstrate that a substantial amount of the variability arises from peaks in stream interaction regions, where fast wind runs into slow wind and creates a pressure ridge at the interface. Superposed epoch analysis centered on stream interfaces in 26 interaction regions previously identified in Wind data reveals a twofold increase in 250 eV flux (integrated over pitch angle). Whether the peaks result from the compression there or are solar signatures of the coronal hole boundary, to which interfaces may map, is an open question. Suggestive of the latter, some cases show a displacement between the electron and magnetic field peaks at the interface. Since solar information is transmitted to 1 AU much more quickly by suprathermal electrons than by convected plasma signatures, the displacement may imply a shift in the coronal hole boundary through transport of open magnetic flux via interchange reconnection. If so, however, the facts that displacements occur in both directions and that the electron and field peaks in the superposed epoch analysis are nearly coincident indicate that any systematic transport expected from differential solar rotation is overwhelmed by a random pattern, possibly owing to transport across a ragged coronal hole boundary.
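
For readers unfamiliar with the technique, the sketch below (Python, illustrative only) shows what a superposed epoch analysis amounts to: segments of a time series are aligned on a list of event times, here stand-ins for the stream-interface crossings, and averaged sample by sample. The synthetic flux data and the imposed twofold peak are assumptions for the demonstration, not the Wind data processing used in the study.

import numpy as np

def superposed_epoch(series, epoch_indices, window):
    # Average fixed-length segments of `series` centred on each epoch index.
    segments = [series[i - window:i + window + 1]
                for i in epoch_indices
                if i - window >= 0 and i + window < len(series)]
    return np.mean(segments, axis=0)

rng = np.random.default_rng(1)
flux = rng.lognormal(sigma=0.3, size=5000)          # synthetic 250 eV flux proxy
interfaces = rng.integers(100, 4900, size=26)       # 26 synthetic interface crossings
for i in interfaces:
    flux[i - 5:i + 5] *= 2.0                        # impose a twofold peak at each epoch
profile = superposed_epoch(flux, interfaces, window=50)
print(profile[50].round(2), profile[0].round(2))    # enhanced at zero epoch vs. the wings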

Relevance:

10.00%

Publisher:

Abstract:

A new self-tuning implicit pole-assignment algorithm is presented which, through the use of a pole compression factor and different RLS model and control structures, overcomes the stability and convergence problems encountered in previously available algorithms. The computational requirements of the technique are much reduced compared to explicit pole-assignment schemes, while the inherent robustness of the strategy is retained.
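
The algorithm above builds on recursive least squares (RLS) parameter estimation, the standard engine of self-tuning controllers. The sketch below shows a plain RLS update with a forgetting factor; the paper's particular model/control structures and the pole compression factor are not reproduced here, and the example system is purely illustrative.

import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    # One recursive least squares step: update the parameter estimate `theta` and
    # covariance `P` given regressor `phi`, measurement `y` and forgetting factor `lam`.
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + (phi.T @ P @ phi).item())   # gain vector
    err = y - (phi.T @ theta).item()                 # prediction error
    theta = theta + k * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Identify y(t) = 1.5*u(t-1) - 0.7*y(t-1) from noisy input/output data.
rng = np.random.default_rng(0)
theta, P = np.zeros((2, 1)), 1e3 * np.eye(2)
y_prev = u_prev = 0.0
for _ in range(200):
    u = rng.normal()
    y = 1.5 * u_prev - 0.7 * y_prev + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, np.array([u_prev, y_prev]), y)
    y_prev, u_prev = y, u
print(theta.ravel().round(3))                        # converges to ~[1.5, -0.7]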

Relevance:

10.00%

Publisher:

Abstract:

We present stereoscopic images of an Earth-impacting Coronal Mass Ejection (CME). The CME was imaged by the Heliospheric Imagers onboard the twin STEREO spacecraft during December 2008. The apparent acceleration of the CME is used to provide independent estimates of its speed and direction from the two spacecraft. Three distinct signatures within the CME were all found to be closely Earth-directed. At the time the CME was predicted to pass the ACE spacecraft, in-situ observations contained a typical CME signature. At Earth, ground-based magnetometer observations showed a small but widespread sudden response to the compression of the geomagnetic cavity at CME impact. In this case, STEREO could have given warning of CME impact at least 24 hours in advance. These stereoscopic observations represent a significant milestone for the STEREO mission and offer considerable potential for improving operational space weather forecasting.

Relevance:

10.00%

Publisher:

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This ‘Cartesian’ description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise ‘blueprint’ of the original data.

We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of ‘fundamental’, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer.

We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the ‘computational neuroanatomy’ strategy for neuroscience databases.
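
As a concrete illustration of the ‘Cartesian’ level of description mentioned above, the sketch below (Python, not from the cited tools) represents each dendritic compartment as a cylinder with a diameter, spatial coordinates and a parent link, and extracts one classical morphometric parameter, the branching order. The record layout and the toy tree are assumptions for illustration.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Compartment:
    ident: int
    x: float
    y: float
    z: float
    diameter: float
    parent: Optional[int]                  # None marks the root (soma)

def branching_order(tree: List[Compartment], ident: int) -> int:
    # Count the bifurcations passed while walking from a compartment back to the root.
    children = {}
    for c in tree:
        children.setdefault(c.parent, []).append(c.ident)
    by_id = {c.ident: c for c in tree}
    order, node = 0, by_id[ident]
    while node.parent is not None:
        if len(children.get(node.parent, [])) > 1:   # the parent is a bifurcation point
            order += 1
        node = by_id[node.parent]
    return order

# A four-compartment toy tree: soma (0) -> trunk (1) -> two daughter branches (2, 3).
tree = [Compartment(0, 0.0, 0.0, 0.0, 10.0, None),
        Compartment(1, 5.0, 0.0, 0.0, 2.0, 0),
        Compartment(2, 10.0, 2.0, 0.0, 1.5, 1),
        Compartment(3, 10.0, -2.0, 0.0, 1.5, 1)]
print(branching_order(tree, 2))                      # -> 1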

Relevance:

10.00%

Publisher:

Abstract:

Current force-feedback haptic interface devices are generally limited to the display of low-frequency, high-amplitude spatial data. A typical device consists of a low-impedance framework of one or more degrees-of-freedom (dof), allowing a user to explore a pre-defined workspace via an end effector such as a handle, thimble, probe or stylus. The movement of the device is then constrained using high-gain positional feedback, thus reducing the apparent dof of the device and conveying the illusion of hard contact to the user. Such devices are, however, limited to a narrow bandwidth of frequencies, typically below 30 Hz, and are not well suited to the display of surface properties, such as object texture. This paper details a device to augment an existing force-feedback haptic display with a vibrotactile display, thus providing a means of conveying low-amplitude, high-frequency spatial information about object surface properties.

1. Haptics and Haptic Interfaces

Haptics is the study of human touch and interaction with the external environment via touch. Information from the human sense of touch can be classified into two categories, cutaneous and kinesthetic. Cutaneous information is provided via the mechanoreceptive nerve endings in the glabrous skin of the human hand. It is primarily a means of relaying information regarding small-scale details in the form of skin stretch, compression and vibration.
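
The "hard contact via high-gain positional feedback" idea described above is commonly rendered as a stiff virtual wall. The sketch below (Python) is a minimal, assumed one-dimensional version of that force law, not the device's actual controller; the stiffness and damping values are illustrative.

def virtual_wall_force(x, v, wall=0.0, k=5000.0, b=5.0):
    # Positions below `wall` are inside the virtual surface; in free space the force
    # is zero, inside it a stiff spring-damper pushes the end effector back out.
    penetration = wall - x
    if penetration <= 0.0:
        return 0.0
    return k * penetration - b * v        # N, with x in metres and v in m/s

# Example: 1 mm of penetration while still moving 10 mm/s into the wall.
print(virtual_wall_force(x=-0.001, v=-0.010))   # ~5.05 N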

Relevance:

10.00%

Publisher:

Abstract:

Four fat blends based on palm fractions in combination with high oleic sunflower oil (HOSF), with a relatively low saturated fatty acid content (29.2±0.85%, i.e. less than 50% of that of butter), were prepared. The saturated fat was located in different triacylglycerol (TAG) structures in each blend. Principal saturated TAG were derived from palm stearin (POs, containing tripalmitoyl glycerol - PPP), palm mid fraction (PMF, containing 1,3-dipalmitoyl-2-oleoyl glycerol - POP) and interesterified PMF (inPMF, containing PPP, POP and rac-1,2-dipalmitoyl-3-oleoyl glycerol - PPO). Thus, in blend 1, composed of POs and HOSF, the saturates resided principally in PPP. In blend 2, composed of POs, PMF and HOSF, the principal saturate-containing TAG were PPP and POP. Blend 3, composed of inPMF and HOSF, was similar to blend 2 except that the disaturated TAG comprised a 2:1 mixture of PPO:POP. Finally, blend 4, a mixture of PMF and HOSF, had saturates present mainly as POP. The physical properties and the functionality of the blends, as shortenings for puff pastry laminated in a warm bakery environment (20-30°C), were compared with each other and with butter. Puff pastry prepared with blend 1 (POs:HOSF 29:71) and blend 4 (PMF:HOSF 41:59) was very hard; blend 2 (POs:PMF:HOSF 13:19:68) was most similar to butter in the compressibility of the baked product and performed well in an independent baking trial; blend 3 (inPMF:HOSF 40:60) gave a product that required a higher force for compression than butter.

Relevance:

10.00%

Publisher:

Abstract:

An active pharmaceutical ingredient (API) was found to dissociate from the highly crystalline hydrochloride form to the amorphous free base form, with consequent alterations to tablet properties. Here, a wet granulation manufacturing process has been investigated using in situ Fourier transform (FT)-Raman spectroscopic analyses of granules and tablets prepared with different granulating fluids and under different manufacturing conditions. Dosage form stability under a range of storage stresses was also investigated. Despite the spectral similarities between the two drug forms, low levels of API dissociation could be quantified in the tablets; the technique allowed discrimination of around 4% of the API content as the amorphous free base (i.e. less than 1% of the tablet compression weight). API dissociation was shown to be promoted by extended exposure to moisture. Aqueous granulating fluids, manufacturing delays between the granulation and drying stages, and storage of the tablets in open conditions at 40 °C/75% relative humidity (RH) led to dissociation. In contrast, non-aqueous granulating fluids, no delay in processing, and storage of the tablets in either sealed containers or at lower temperature/humidity prevented detectable dissociation. It is concluded that appropriate manufacturing process and storage conditions for the finished product involve minimising the exposure of the API to moisture. Analysis of the drug using FT-Raman spectroscopy allowed rapid optimisation of the process whilst offering quantitative molecular information concerning the dissociation of the drug salt to the amorphous free base form.

Relevance:

10.00%

Publisher:

Abstract:

A supramolecular polymer blend, formed via π-π interactions between a π-electron-rich pyrenyl-endcapped oligomer and a chain-folding oligomer containing pairs of π-electron-poor naphthalene-diimide (NDI) units, has been reinforced with cellulose nanocrystals (CNCs) to afford a healable nanocomposite material. Nanocomposites with varying weight percentages of CNCs (from 1.25 to 20.0 wt.%) within the healable supramolecular polymeric matrix have been prepared via solvent casting followed by compression molding, and their mechanical properties and healing behavior have been evaluated. It is found that homogeneously dispersed films can be formed with CNCs at less than 10 wt.%; above 10 wt.% CNC, heterogeneous nanocomposites were obtained. All the nanocomposites formed could be re-healed upon exposure to elevated temperatures, although for the homogeneous films the healing rate was reduced with increasing CNC content. The best combination of healing efficiency and mechanical properties was obtained with the 7.5 wt.% CNC nanocomposite, which exhibited a tensile modulus enhanced by as much as a factor of 20 over the matrix material alone and could be fully re-healed at 85 °C within 30 minutes. Thus, it is demonstrated that supramolecular nanocomposites can afford greatly enhanced mechanical properties relative to the unreinforced polymer, while still allowing efficient thermal healing.

Relevance:

10.00%

Publisher:

Abstract:

Written for communications and electronic engineers, technicians and students, this book begins with an introduction to data communications and goes on to explain the concept of layered communications. Other chapters deal with physical communications channels, baseband digital transmission, analog data transmission, error control and data compression codes, physical layer standards, the data link layer, the higher layers of the protocol hierarchy, and local area networks (LANs). Finally, the book explores some likely future developments.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy Filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model.

The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed.

The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement on the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
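
For concreteness, the sketch below shows a leapfrog integration of a simple oscillation with the Robert-Asselin-Williams (RAW) filter applied, under assumed parameter values; it illustrates the filter's form, not the SPEEDY implementation. Setting alpha = 1 recovers the classical Robert-Asselin filter, while alpha < 1 splits the filter displacement between the current and new time levels so that the mean and amplitude are better preserved.

import numpy as np

def leapfrog_raw(omega=1.0, dt=0.1, steps=200, nu=0.2, alpha=0.53):
    # Leapfrog integration of dx/dt = i*omega*x with the RAW filter applied each step.
    x_old = 1.0 + 0.0j                                    # x(t - dt)
    x_now = np.exp(1j * omega * dt)                       # x(t), started from the exact solution
    for _ in range(steps):
        x_new = x_old + 2.0 * dt * 1j * omega * x_now     # leapfrog step
        d = 0.5 * nu * (x_old - 2.0 * x_now + x_new)      # filter displacement
        x_old = x_now + alpha * d                         # RAW: filter the middle level...
        x_now = x_new + (alpha - 1.0) * d                 # ...and the new level
    return x_now

print(abs(leapfrog_raw(alpha=1.0)))    # alpha = 1: classical Robert-Asselin, amplitude damped
print(abs(leapfrog_raw(alpha=0.53)))   # alpha = 0.53: RAW, amplitude better preserved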

Relevance:

10.00%

Publisher:

Abstract:

In this paper, numerical analyses of the thermal performance of an indirect evaporative air cooler incorporating an M-cycle cross-flow heat exchanger have been carried out. The numerical model was established by solving the coupled governing equations for heat and mass transfer between the product and working air, using the finite-element method. The model was developed in the EES (Engineering Equation Solver) environment and validated against published experimental data. Correlations between the cooling (wet-bulb) effectiveness, the system COP and a number of air flow/exchanger parameters were developed. It is found that lower channel air velocity, lower inlet air relative humidity, and a higher working-to-product air ratio yielded higher cooling effectiveness. The recommended average air velocities in the dry and wet channels should not be greater than 1.77 m/s and 0.7 m/s, respectively. The optimum working-to-product air flow ratio for this cooler is 50%. The channel geometric sizes, i.e. channel length and height, also have a significant impact on system performance. A longer channel length and a smaller channel height increase the system cooling effectiveness but reduce the system COP. The recommended channel height is 4 mm and the dimensionless channel length, i.e. the ratio of the channel length to height, should be in the range 100 to 300. The numerical results indicated that this new type of M-cycle heat and mass exchanger can achieve 16.7% higher cooling effectiveness compared with the conventional cross-flow heat and mass exchanger for indirect evaporative coolers. A model of this kind is new and not yet reported in the literature. The results of the study help with the design and performance analysis of this new type of indirect evaporative air cooler and, further, help increase the uptake of the technology within the building air-conditioning sector, which is currently dominated by conventional compression refrigeration technology.
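
The two performance figures referred to above can be written down compactly. The sketch below (Python, with illustrative numbers only) evaluates the wet-bulb cooling effectiveness, i.e. the product-air temperature drop relative to the maximum drop down to the inlet wet-bulb temperature, and a COP defined as sensible cooling capacity divided by fan power; the definitions are standard, but the example values are assumptions.

def wet_bulb_effectiveness(t_in_db, t_out_db, t_in_wb):
    # Product-air temperature drop relative to the maximum possible drop, i.e. down
    # to the inlet wet-bulb temperature (all temperatures in degrees Celsius).
    return (t_in_db - t_out_db) / (t_in_db - t_in_wb)

def cooling_cop(m_dot_product, t_in_db, t_out_db, fan_power_w, cp_air=1006.0):
    # Sensible cooling delivered to the product air divided by the electrical input.
    return m_dot_product * cp_air * (t_in_db - t_out_db) / fan_power_w

# Example: 30 degC inlet dry bulb, 20 degC inlet wet bulb, product air leaving at 22 degC,
# with 0.5 kg/s of product air moved by a 300 W fan.
print(wet_bulb_effectiveness(30.0, 22.0, 20.0))          # 0.8
print(cooling_cop(0.5, 30.0, 22.0, fan_power_w=300.0))   # ~13.4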

Relevance:

10.00%

Publisher:

Abstract:

A discrete element model is used to study shear rupture of sea ice under convergent wind stresses. The model includes compressive, tensile, and shear rupture of the viscous-elastic joints connecting floes that move under the action of the wind stresses. The adopted shear rupture is governed by Coulomb's criterion. The ice pack is a 400 km square domain consisting of 4 km floes. In the standard case, with a tensile strength 10 times smaller than the compressive strength, under uniaxial compression the failure regime is mainly shear rupture, with the most probable scenario corresponding to that with the minimum failure work. The orientation of cracks delineating the formed aggregates is bimodal, with peaks around the angles given by the wing crack theory determining diamond-shaped blocks. The ice block (floe aggregate) size decreases as the wind stress gradient increases, since the elastic strain energy grows faster, leading to a higher speed of crack propagation. As the tensile strength grows, shear rupture becomes harder to attain and compressive failure becomes equally important, leading to elongation of blocks perpendicular to the compression direction, and the blocks grow larger. In the standard case, as the wind stress confinement ratio increases, the failure mode changes at a confinement ratio within 0.2–0.4, which corresponds to the analytical critical confinement ratio of 0.32. Below this value the cracks are bimodal, delineating diamond-shaped aggregates, while above this value failure becomes isotropic and is determined by small-scale stress anomalies due to irregularities in floe shape.
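
The Coulomb criterion governing shear rupture in the model can be stated in one line: a joint fails in shear when the shear stress exceeds the cohesion plus friction times the normal (compressive) stress. The sketch below (Python) uses illustrative cohesion and friction values, not the parameters of the sea ice model.

def coulomb_shear_failure(shear_stress, normal_stress, cohesion=20e3, mu=0.7):
    # Shear rupture when |tau| exceeds the cohesion plus friction times the normal
    # (compressive) stress; compression is taken as positive here.
    return abs(shear_stress) > cohesion + mu * max(normal_stress, 0.0)

# Example: 50 kPa of shear on a joint carrying 30 kPa of compression.
print(coulomb_shear_failure(50e3, 30e3))   # True: 50 kPa > 20 kPa + 0.7 * 30 kPa = 41 kPa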

Relevance:

10.00%

Publisher:

Abstract:

A new rheology that explicitly accounts for the subcontinuum anisotropy of the sea ice cover is implemented in the Los Alamos sea ice model. This is in contrast to all models of sea ice included in global circulation models, which use an isotropic rheology. The model contains one new prognostic variable, the local structure tensor, which quantifies the degree of anisotropy of the sea ice, and two parameters that set the time scale of the evolution of this tensor. The anisotropic rheology provides a subcontinuum description of the mechanical behavior of sea ice and accounts for a continuum-scale stress with a large shear-to-compression ratio and a tensile stress component. Results from a stand-alone version of the model over the Arctic are presented, and anisotropic model sensitivity runs are compared with a reference elasto-visco-plastic simulation. Under realistic forcing, sea ice quickly becomes highly anisotropic over large length scales, as is observed from satellite imagery. The influence of the new rheology on the state and dynamics of the sea ice cover is discussed. Our reference anisotropic run reveals that the new rheology leads to a substantial change in the spatial distribution of ice thickness and ice drift relative to the reference standard visco-plastic isotropic run, with ice thickness regionally increased by more than 1 m and ice speed reduced by up to 50%.
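
As an informal illustration of what a local structure tensor measures, the sketch below (Python) builds a 2x2 second-moment tensor from a set of orientations; its eigenvalues are (0.5, 0.5) for an isotropic distribution and (0, 1) for perfect alignment. This is an assumed construction for intuition only, not the model's prognostic formulation or its evolution equations.

import numpy as np

def structure_tensor(angles_rad):
    # Average of the outer products n (x) n over unit vectors n at the given orientations.
    n = np.column_stack([np.cos(angles_rad), np.sin(angles_rad)])
    return (n[:, :, None] * n[:, None, :]).mean(axis=0)

rng = np.random.default_rng(0)
isotropic = rng.uniform(0.0, np.pi, 1000)                 # randomly oriented floes/leads
aligned = np.full(1000, 0.3)                              # all oriented at 0.3 rad
print(np.linalg.eigvalsh(structure_tensor(isotropic)).round(2))   # ~[0.5, 0.5]: isotropic
print(np.linalg.eigvalsh(structure_tensor(aligned)).round(2))     # [0.0, 1.0]: fully aligned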