35 results for STOCKINGS, COMPRESSION
Abstract:
We explore the ability of the recently established quasilocal density functional theory to describe the isoscalar giant monopole resonance. Within this theory we use the scaling approach and perform constrained calculations to obtain the cubic and inverse energy-weighted moments (sum rules) of the RPA strength. The meaning of the sum rule approach in this case is discussed. Numerical calculations are carried out using Gogny forces, and excellent agreement is found with HF+RPA results previously reported in the literature. The nuclear matter compression modulus predicted by our model lies in the range 210–230 MeV, which agrees with earlier findings. The information provided by the sum rule approach in the case of nuclei near the neutron drip line is also discussed.
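For context, the moments referred to are the standard energy-weighted sum rules of the RPA strength function; the relations below are the textbook definitions usually invoked in this approach, sketched here for reference rather than quoted from the paper:

```latex
% Moments (sum rules) of the RPA strength function S(E):
\[
  m_k = \int_0^{\infty} E^{k}\, S(E)\, \mathrm{d}E .
\]
% The scaling approach yields the cubic-weighted estimate of the
% resonance energy, while the constrained calculation yields the
% inverse-weighted one; the two bracket the mean excitation energy:
\[
  E_{\mathrm{scal}} = \sqrt{\frac{m_3}{m_1}}, \qquad
  E_{\mathrm{constr}} = \sqrt{\frac{m_1}{m_{-1}}}, \qquad
  E_{\mathrm{constr}} \le E_{\mathrm{scal}} .
\]
```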
Abstract:
We have measured the structure, magnetic response, and dielectric response of epitaxial thin films of the orthorhombic phase of YMnO3 grown on Nb:SrTiO3 (001) substrates. We have found that substrate-induced strain produces an in-plane compression of the YMnO3 unit cell. The magnetization-versus-temperature curves display a significant hysteresis between zero-field-cooling (ZFC) and field-cooling measurements below the Néel temperature (T_N ≈ 45 K). The dielectric constant increases gradually (by up to 26%) below T_N and mimics the ZFC magnetization curve. We argue that these effects could be a manifestation of magnetoelectric coupling in YMnO3 thin films and that the magnetic structure of YMnO3 can be controlled by substrate selection and/or growth conditions.
Abstract:
Evidence on trends in the prevalence of disease and disability can clarify whether countries are experiencing a compression or an expansion of morbidity. An expansion of morbidity, as indicated by disease, has appeared in Europe and other developed regions. It is likely that better treatment, preventive measures, and rising education levels have contributed to the declines in mortality and the gains in life expectancy. This paper examines whether there has been an expansion of morbidity in Catalonia (Spain). It uses trends in mortality and morbidity from major causes of death, and links these with survival, to provide estimates of life expectancy with and without diseases and functioning loss. We use a repeated cross-sectional health survey carried out in 1994 and 2011 for measures of morbidity; mortality information comes from the Spanish National Statistics Institute. Our findings show that at age 65 the percentage of life with disease increased from 52% to 70% for men and from 56% to 72% for women, and the proportion of life expectancy spent unable to function increased from 24% to 30% for men and from 40% to 47% for women between 1994 and 2011. These changes were attributable to increases in the prevalence of diseases and of moderate functional limitation. Overall, we find an expansion of morbidity over the period. Increasing survival among people with diseases can lead to a higher prevalence of diseases in the older population. A higher prevalence of health problems can in turn put greater pressure on the health care system and a growing burden of disease on individuals.
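The abstract does not spell out how survival and prevalence are combined; a standard way to produce life expectancy with and without disease from exactly these inputs is Sullivan's method, sketched below with hypothetical numbers (the function and the toy life table are illustrative, not the paper's data):

```python
def sullivan(L, prevalence, l0=100000):
    """Sullivan's method: split life expectancy into years lived
    with and without a condition.

    L          -- life-table person-years lived in each age interval
    prevalence -- disease prevalence (0..1) for the same intervals
    l0         -- survivors at the starting age (life-table radix)
    """
    total = sum(L) / l0                                   # total life expectancy
    with_disease = sum(Lx * p for Lx, p in zip(L, prevalence)) / l0
    return total, with_disease, with_disease / total      # share of life with disease

# Toy example: three age intervals from age 65 (made-up numbers).
L = [450000, 300000, 120000]          # person-years in each interval
prev = [0.50, 0.65, 0.80]             # prevalence of disease by interval
le, le_dis, share = sullivan(L, prev)
print(f"LE: {le:.1f} y, with disease: {le_dis:.1f} y ({share:.0%})")
```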
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper-motion, and radial-velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular available solutions (which are not always the best fit for such ad hoc applications), and describe some user experiences with the framework, which was employed for several dedicated workshops on astronomical data analysis techniques.
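The paper's framework targets Hadoop and is not reproduced here; as a language-agnostic illustration of the underlying idea, the sketch below builds a two-dimensional histogram (a minimal "hypercube") with an in-memory map/reduce pair over hypothetical star records (field names and bin sizes are assumptions):

```python
from collections import defaultdict

def mapper(star, bin_size=0.5):
    """Map phase: reduce each star record to a hypercube cell key,
    one bin index per histogram dimension (here: magnitude, colour)."""
    key = (int(star["mag"] // bin_size), int(star["color"] // bin_size))
    yield key, 1

def reduce_counts(pairs):
    """Reduce phase: sum the counts emitted for each hypercube cell."""
    cells = defaultdict(int)
    for key, count in pairs:
        cells[key] += count
    return dict(cells)

# Toy catalogue: the 2-D histogram of three stars.
stars = [{"mag": 12.3, "color": 0.7},
         {"mag": 12.4, "color": 0.6},
         {"mag": 15.1, "color": 1.2}]
hypercube = reduce_counts(pair for s in stars for pair in mapper(s))
print(hypercube)   # {(24, 1): 2, (30, 2): 1}
```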
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute definition scripts for the JPEG standard's progressive operation mode using a quantization-based approach. It is therefore no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, which reduces cost. First, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Second, we study the JPEG standard's progressive operation mode from a quantization-based approach. A relationship is found between the measured image quality at a given stage of the coding process and a quantization matrix. Thus, the definition-script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The PSNR estimate usually has an error smaller than 1 dB, and this figure decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image-quality improvement during the decoding process.
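The paper's matrix-design procedure itself is not reproduced here; the sketch below only illustrates the two building blocks the abstract names: inverting the PSNR–MSE relation, and numerically finding the uniform quantization step that meets a given MSE budget for a Laplacian source (the Laplacian scale and the per-coefficient budget are illustrative assumptions, not the paper's values):

```python
import math

def psnr_to_mse(psnr, peak=255.0):
    """Invert PSNR = 10 * log10(peak^2 / MSE)."""
    return peak**2 / 10**(psnr / 10.0)

def laplacian_quant_mse(step, b, n=20000):
    """MSE of a uniform mid-tread quantizer Q(x) = step*round(x/step)
    applied to a Laplacian source with scale b (numerical integration;
    integrate x >= 0 and double, since pdf and quantizer are symmetric)."""
    x_max = 20 * b
    dx = x_max / n
    mse = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        err = x - step * round(x / step)
        mse += err**2 * math.exp(-x / b) / (2 * b) * dx
    return 2 * mse

def step_for_mse(target_mse, b, lo=1e-3, hi=1e3):
    """Bisection: largest step whose quantization MSE stays within budget."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if laplacian_quant_mse(mid, b) > target_mse:
            hi = mid
        else:
            lo = mid
    return lo

# Example: MSE budget for a 35 dB PSNR target, assigned (hypothetically)
# in full to a single DCT coefficient modelled with Laplacian scale b = 8.
budget = psnr_to_mse(35.0)          # global MSE for PSNR = 35 dB
print(step_for_mse(budget, b=8.0))  # quantization step meeting the budget
```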
Abstract:
This paper proposes a novel high-capacity robust audio watermarking algorithm that uses the high-frequency band of the wavelet decomposition, to whose alteration the human auditory system (HAS) is not very sensitive. The main idea is to divide the high-frequency band into frames and, for embedding, to change the wavelet samples depending on the average of the relevant frame's samples. The experimental results show that the method has a very high capacity (about 11,000 bps) without significant perceptual distortion (ODG in [−1, 0] and SNR about 30 dB), and provides robustness against common audio signal processing such as additive noise, filtering, echo, and MPEG compression (MP3).
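The authors' exact embedding rule is not given in the abstract; the Python sketch below (using PyWavelets) only illustrates the general pattern described: frame the high-frequency wavelet band and shift each frame so that a simple average-based statistic, here hypothetically the sign of the frame mean, carries one bit:

```python
import numpy as np
import pywt

def embed(signal, bits, wavelet="db4", frame=256, strength=0.01):
    """Embed bits in the high-frequency wavelet band: shift each frame of
    the level-1 detail coefficients so the sign of its mean encodes one bit
    (a hypothetical rule standing in for the paper's average-based one)."""
    approx, detail = pywt.wavedec(signal, wavelet, level=1)
    detail = detail.copy()
    for i, bit in enumerate(bits):
        frm = detail[i * frame:(i + 1) * frame]
        target = strength if bit else -strength
        frm += target - frm.mean()            # force the frame mean's sign
    return pywt.waverec([approx, detail], wavelet)

def extract(signal, n_bits, wavelet="db4", frame=256):
    """Recover bits from the sign of each frame's mean."""
    _, detail = pywt.wavedec(signal, wavelet, level=1)
    return [int(detail[i * frame:(i + 1) * frame].mean() > 0)
            for i in range(n_bits)]

# Round trip on a toy signal.
rng = np.random.default_rng(0)
audio = rng.standard_normal(16384) * 0.1
marked = embed(audio, [1, 0, 1, 1])
print(extract(marked, 4))   # [1, 0, 1, 1]
```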
Abstract:
This correspondence addresses the problem of non-data-aided waveform estimation for digital communications. Based on the unconditional maximum likelihood criterion, the main contribution of this correspondence is the derivation of a closed-form solution to the waveform estimation problem in the low signal-to-noise ratio regime. The proposed estimation method is based on the second-order statistics of the received signal, and a clear link is established between maximum likelihood estimation and correlation matching techniques. Compression onto the signal subspace is also proposed to improve robustness against noise and to mitigate the impact of abnormal samples or outliers.
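The closed-form solution is the correspondence's own result and is not reproduced here; purely as a generic illustration of estimation from second-order statistics with signal-subspace compression, the sketch below recovers a waveform, up to sign and scale, as the dominant eigenvector of the sample correlation matrix under a deliberately simplified signal model:

```python
import numpy as np

# Simplified model: each received block is a_k * s + noise, with s the
# unknown waveform and a_k a random +/-1 symbol. The waveform then spans
# the dominant eigenvector of the second-order statistics.
rng = np.random.default_rng(1)
s = np.hanning(32)                       # "true" waveform (illustrative)
blocks = np.array([a * s + 0.5 * rng.standard_normal(32)
                   for a in rng.choice([-1.0, 1.0], size=400)])

R = blocks.T @ blocks / len(blocks)      # sample correlation matrix
eigval, eigvec = np.linalg.eigh(R)       # ascending eigenvalues
estimate = eigvec[:, -1]                 # compression onto signal subspace

# Compare with the true waveform (up to sign and scale).
estimate *= np.sign(estimate @ s)
print(np.corrcoef(estimate, s)[0, 1])    # close to 1 at this SNR
```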
Abstract:
We have studied the effect of pressure on the structural and vibrational properties of lanthanum tritungstate, La2(WO4)3. This compound crystallizes under ambient conditions in the modulated scheelite-type structure known as the α phase. We have performed x-ray diffraction and Raman scattering measurements up to a pressure of 20 GPa, as well as ab initio calculations within the framework of density functional theory. Up to 5 GPa, the three methods provide a similar picture of the evolution of α-La2(WO4)3 under pressure. At 5 GPa we begin to observe some structural changes, and above 6 GPa we find that the x-ray patterns cannot be indexed as a single phase. However, a mixture of two phases with C2/c symmetry accounts for all diffraction peaks. Our ab initio study confirms the existence of several C2/c structures that are very close in energy in this compression range. According to our measurements, a state with medium-range order appears above 9 GPa (x-ray diffraction) and 11 GPa (Raman experiments). Based on our theoretical calculations, we propose several high-pressure candidate structures with high cationic coordination at these pressures. The compound evolves into a partially amorphous phase at pressures above 20 GPa.
Abstract:
Iberia underwent intraplate deformation during the Mesozoic and Cenozoic. In eastern Iberia, compression took place during the Palaeogene and early Miocene, giving rise to the Iberian Chain, and extension started during the early Miocene in the coastal areas and the Valencia trough; during the early Miocene, compression continued in the western Iberian Chain whereas extension had already started in the eastern Iberian Chain. From the kinematic data obtained from the major compressional and extensional structures formed during the Cenozoic, a simple dynamic model using Bott's (1959) formula is presented. The results show that both extension and compression may have been produced assuming a main horizontal stress axis oriented approximately N-S, in a direction similar to that of the convergence between Europe, Iberia and Africa during the Cenozoic.
Abstract:
The AMADE research group at the Universitat de Girona specializes in the characterization of composite materials. To assess the strength and damage tolerance of these structures, two tests are typically performed: low-velocity impact and compression after impact (CAI), respectively. The aim of the CAI test is to obtain the residual compressive strength of the structure. The aim of this work is to create an adaptation of the test fixture capable of testing thin plates. Finite element modelling is used to quantitatively evaluate the critical buckling loads for different plate thicknesses, as sketched below. The test plates are composite plates made of unidirectional plies in a symmetric, balanced layup.
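As a closed-form point of reference for the finite-element buckling values, classical laminated-plate theory (a standard result, not this work's model) gives the critical load of a simply supported, specially orthotropic plate of dimensions a × b, loaded along a, with m half-waves along the load:

```latex
% Bending stiffnesses D_ij scale with t^3, so halving the plate thickness
% cuts the critical buckling load by roughly a factor of eight:
\[
  N_{x,\mathrm{cr}} = \frac{\pi^{2}}{b^{2}}
  \left[ D_{11}\left(\frac{m b}{a}\right)^{2}
       + 2\left(D_{12} + 2 D_{66}\right)
       + D_{22}\left(\frac{a}{m b}\right)^{2} \right].
\]
```

This cubic dependence on thickness is why thin CAI specimens tend to buckle well below their residual compressive strength, motivating the adapted anti-buckling fixture.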