7 results for Large-scale experiments

at Universidad de Alicante


Relevance: 100.00%

Abstract:

Paper presented at the X Workshop of Physical Agents, Cáceres, 10-11 September 2009.

Relevance: 100.00%

Abstract:

The recent massive growth of online media and the rise of user-generated content (e.g. weblogs, Twitter, Facebook) pose challenges for accessing and interpreting multilingual data in an efficient, fast and affordable way. The goal of the TredMiner project is to develop innovative, portable, open-source methods that work in real time for summarization and cross-lingual mining of large-scale social media. The results are being validated in three use cases: decision support in the financial domain (with analysts, traders, regulators and economists), political monitoring and analysis (with journalists, economists and politicians), and monitoring of social media about health in order to detect information on adverse drug effects.

Relevance: 100.00%

Abstract:

This article describes an effective procedure for reducing the water content of the excess sludge produced by a wastewater treatment plant by increasing its concentration and, as a consequence, minimizing the volume of sludge to be managed. It consists of a sludge pre-dewatering process, used as a preliminary step or as an alternative to thickening. It is made up of two discontinuous sequential stages: the first is resettling and the second is filtration through a porous medium. The process is strictly physical, with no chemical additives or electromechanical equipment involved. The experiment was carried out in a pilot-scale system consisting of a sedimentation column that incorporates a filter medium. Different sludge heights were tested over the filter to verify the influence of hydrostatic pressure on the final concentrations of each stage. The results show that the initial sludge concentration may increase by more than 570% by the end of the process, with the final volume of sludge being reduced in similar proportion and hydrostatic pressure having only a limited effect on this final concentration. Moreover, the value of the hydrostatic pressure at which the critical specific cake resistance is reached is established.
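As a quick sanity check of the concentration-volume relation stated in the abstract, the sketch below applies a mass balance on dry solids; the initial concentration and volume are assumed values for illustration, and the reported "570%" is read as a concentration factor of 5.7.

```python
# Mass balance on dry solids: concentration * volume is conserved, so a
# 5.7x increase in concentration implies a 5.7x reduction in volume
# ("reduced in similar proportion"). All numbers below are assumptions.
c0 = 10.0         # initial sludge concentration, g/L (assumed)
v0 = 100.0        # initial sludge volume, L (assumed)
solids = c0 * v0  # dry-solids mass, g, conserved through the process

c1 = 5.7 * c0     # concentration after the reported >570% increase
v1 = solids / c1  # volume holding the same solids at the new concentration

reduction = v0 / v1  # volume reduction factor, matching the 5.7x above
```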

Relevance: 100.00%

Abstract:

Vela X–1 is the prototype of the class of wind-fed accreting pulsars in high-mass X-ray binaries hosting a supergiant donor. We have analysed in a systematic way 10 years of INTEGRAL data of Vela X–1 (22–50 keV) and found that, outside the X-ray eclipse, the source undergoes several luminosity drops in which the hard X-ray luminosity falls below ∼3 × 10³⁵ erg s⁻¹, becoming undetectable by INTEGRAL. These drops in the X-ray flux are usually referred to as 'off-states' in the literature. We have investigated the distribution of these off-states along the ∼8.9 d orbit of Vela X–1, finding that their orbital occurrence displays an asymmetric distribution, with a higher probability of observing an off-state near the pre-eclipse phase than during the post-eclipse phase. This asymmetry can be explained by scattering of hard X-rays in a region of ionized wind that reduces the source's hard X-ray brightness preferentially near eclipse ingress. We associate this large-scale ionized wind structure with the photoionization wake produced by the interaction of the supergiant wind with the X-ray emission from the neutron star. We emphasize that this observational result could be obtained thanks to the accumulation of a decade of INTEGRAL data, with observations covering the whole orbit several times, allowing us to detect an asymmetric pattern in the orbital distribution of off-states in Vela X–1.

Relevance: 100.00%

Abstract:

In this work, we propose a new methodology for the large-scale optimization and process integration of complex chemical processes that have been simulated using modular chemical process simulators. Units with significant numerical noise or large CPU times are substituted by surrogate models based on Kriging interpolation. Using a degree-of-freedom analysis, some of those units can be aggregated into a single unit to reduce the complexity of the resulting model. As a result, we solve a hybrid simulation-optimization model formed by units in the original flowsheet, Kriging models, and explicit equations. We present a case study of the optimization of a sour water stripping plant in which we simultaneously consider economics, heat integration and environmental impact using the ReCiPe indicator, which incorporates recent advances in Life Cycle Assessment (LCA). The optimization strategy guarantees convergence to a local optimum within the tolerance of the numerical noise.
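As a rough illustration of the surrogate idea described above, the sketch below fits a minimal Kriging (Gaussian-process) model to samples of a noisy "unit" and evaluates the smooth, cheap replacement. The test function, sample count, length-scale and nugget are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_unit(x):
    """Stand-in for a flowsheet unit with numerical noise (assumed form)."""
    return np.sin(3 * x) + 0.01 * rng.standard_normal(x.shape)

def rbf(A, B, length=0.5):
    """Squared-exponential covariance between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * length ** 2))

# Sample the expensive/noisy unit at a few design points.
X = np.linspace(0.0, 2.0, 20).reshape(-1, 1)
y = noisy_unit(X).ravel()

# Kriging weights: solve (K + nugget*I) w = y; the nugget absorbs the noise.
K = rbf(X, X) + 1e-3 * np.eye(len(X))
w = np.linalg.solve(K, y)

def surrogate(x):
    """Smooth prediction that replaces the noisy unit in the optimization."""
    return rbf(np.atleast_2d(x), X) @ w

pred = surrogate(np.array([[0.5]]))[0]
```

Because the surrogate is smooth, a gradient-based optimizer can use it where the raw unit's numerical noise would stall the search.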

Relevance: 90.00%

Abstract:

Paper submitted to the IFIP International Conference on Very Large Scale Integration (VLSI-SOC), Darmstadt, Germany, 2003.

Relevance: 90.00%

Abstract:

In the current Information Age, data production and processing demands are ever increasing. This has motivated the appearance of large-scale distributed data. The phenomenon also affects Pattern Recognition, so that classic and widely used algorithms, such as the k-Nearest Neighbour, cannot be applied directly. To improve the efficiency of this classifier, Prototype Selection (PS) strategies can be used. Nevertheless, current PS algorithms were not designed to deal with distributed data, and their performance under these conditions is therefore unknown. This work carries out an experimental study on a simulated framework in which PS strategies can be compared under classical conditions as well as those expected in distributed scenarios. Our results show that performance generally degrades as conditions approach more realistic scenarios. However, our experiments also show that some methods achieve performance fairly similar to that of the non-distributed scenario. Thus, although there is a clear need to develop PS methodologies and algorithms specifically for these situations, the methods that showed higher robustness against such conditions may be good candidates from which to start.
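To make the distributed setting concrete, here is a minimal, hypothetical sketch (not the paper's actual framework or PS algorithms): synthetic data is split across partitions, a naive prototype-selection step runs locally on each (keeping one centroid per class), and a 1-NN classifier then queries the pooled prototypes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated Gaussian classes (illustrative data, not from the paper).
X = np.vstack([rng.normal(0.0, 0.3, size=(60, 2)),
               rng.normal(3.0, 0.3, size=(60, 2))])
y = np.array([0] * 60 + [1] * 60)

def select_prototypes(Xp, yp):
    """Naive local PS: keep one centroid per class on this partition."""
    classes = np.unique(yp)
    protos = np.array([Xp[yp == c].mean(axis=0) for c in classes])
    return protos, classes

# Simulate a distributed scenario: split the data into 3 partitions,
# run PS locally on each, then pool the surviving prototypes.
P, L = [], []
for idx in np.array_split(rng.permutation(len(X)), 3):
    p, l = select_prototypes(X[idx], y[idx])
    P.append(p)
    L.append(l)
P, L = np.vstack(P), np.concatenate(L)

def knn_predict(x, protos, labels):
    """1-NN over the pooled prototype set instead of the full data."""
    return labels[np.argmin(np.linalg.norm(protos - x, axis=1))]

pred = knn_predict(np.array([2.9, 3.1]), P, L)
```

The pooled set holds 6 prototypes instead of 120 training points, which is the kind of reduction that makes k-NN feasible at scale; real PS methods select actual training instances rather than centroids.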