948 results for Multicast Application Level
Abstract:
In many high-income developed countries, obesity is inversely associated with educational level. In some countries, a widening gap in obesity between educational groups has been reported. The aim of this study was to assess trends in body mass index (BMI) and in the prevalence of overweight and obesity, and their association with educational level, in the adult Swiss population. Four cross-sectional national health interview surveys were conducted in 1992/93 (n = 14,521), 1997 (n = 12,474), 2002 (n = 18,908) and 2007 (n = 17,879) using representative samples of the Swiss population (age range 18-102 years). BMI was derived from self-reported data. Overweight was defined as BMI ≥25 and <30 kg/m², and obesity as BMI ≥30 kg/m². Mean (± standard deviation) BMI increased from 24.7 ± 3.6 kg/m² in 1992/93 to 25.4 ± 3.6 kg/m² in 2007 in men, and from 22.8 ± 3.8 to 23.7 ± 4.3 kg/m² in women. Between 1992/93 and 2007, the prevalence of overweight plus obesity increased from 40.4% to 49.5% in men and from 22.3% to 31.3% in women, while the prevalence of obesity increased from 6.3% to 9.4% in men and from 4.9% to 8.5% in women. The rate of increase in the prevalence of obesity was greater between 1992/93 and 2002 (men: +0.26%/year; women: +0.31%/year) than between 2002 and 2007 (men: +0.10%/year; women: +0.10%/year). A sizable fraction (approximately 25%) of the increase in mean BMI was due to the increasing age of the participants over time. The increase was larger in low than in high education strata of the population. BMI was strongly associated with low educational level among women, and this gradient remained fairly constant over time. A weaker similar gradient by educational level was apparent in men, but it tended to increase over time. In Switzerland, overweight and obesity increased between 1992 and 2007 and were associated with low education status in both men and women. A trend towards stabilization of mean BMI levels was noted in most age categories since 2002.
The increase in the prevalence of obesity was larger in low education strata of the population. These findings suggest that obesity preventive measures should be targeted according to educational level in Switzerland.
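The BMI definitions used in this abstract are simple enough to state as a short sketch (the helper names below are ours, not the study's):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value):
    """Categories as defined in the study: overweight is 25 <= BMI < 30 kg/m2,
    obesity is BMI >= 30 kg/m2."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    return "normal or underweight"
```

For example, a person of 80 kg and 1.75 m has a BMI of about 26.1 kg/m² and falls in the overweight band.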
Abstract:
The paper proposes and applies statistical tests for poverty dominance that check whether poverty comparisons can be made robustly over ranges of poverty lines and classes of poverty indices. This helps provide both normative and statistical confidence in establishing poverty rankings across distributions. The tests, which can take into account the complex sampling procedures that statistical agencies typically use to generate household-level surveys, are implemented using the Canadian Survey of Labour and Income Dynamics (SLID) for 1996, 1999 and 2002. Although the yearly cumulative distribution functions cross at the lower tails of the distributions, the more recent years tend to dominate earlier years over a relatively wide range of poverty lines. Failing to take into account SLID's sampling variability (as is sometimes done) can significantly inflate one's confidence in ranking poverty. Taking into account SLID's complex sampling design (as has not been done before) can also substantially narrow the range of poverty lines over which a poverty ranking can be inferred.
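The dominance criterion behind such tests can be illustrated with a minimal point-estimate sketch (hypothetical function names; it deliberately ignores the sampling variability that the paper's tests account for):

```python
import numpy as np

def headcount(incomes, z):
    """Headcount poverty ratio: share of the sample with income below poverty line z."""
    return float(np.mean(np.asarray(incomes) < z))

def first_order_dominates(incomes_a, incomes_b, z_grid):
    """Point-estimate first-order check: A has no more poverty than B if the
    headcount of A is <= that of B at every poverty line in z_grid."""
    return all(headcount(incomes_a, z) <= headcount(incomes_b, z) for z in z_grid)
```

In the paper's setting the comparison is additionally subjected to statistical tests that respect the survey's complex sampling design, which this sketch does not attempt.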
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
This paper investigates vulnerability to poverty in Haiti. Research on vulnerability in developing countries has been scarce due to the high data requirements of vulnerability studies (e.g. panel data or long series of cross-sections). The methodology adopted here allows the assessment of vulnerability to poverty by exploiting the short panel structure of nested data at different levels. The decomposition method reveals that vulnerability in Haiti is largely a rural phenomenon and that schooling correlates negatively with vulnerability. Most importantly, among the different shocks affecting household income, it is found that meso-level shocks are in general far more important than covariate shocks. This finding points to interesting policy implications for decentralizing policies to alleviate vulnerability to poverty.
Abstract:
Nowadays, Cloud service providers offer complex services that are ready to be used as if they were a commodity, like water or electricity, without any extra effort for their customers. However, providing these services implies a high management effort that requires a lot of human interaction. Furthermore, an efficient resource management mechanism that considers only the provider's own resources is necessary but not sufficient, because the provider's profit is limited by the amount of resources it owns. Dynamically outsourcing resources to other providers in response to demand variation avoids this problem and allows the provider to earn more profit. A key technology for achieving these goals is virtualization, which facilitates the provider's management and provides on-demand virtual environments that are isolated and consolidated in order to achieve a better utilization of the provider's resources. Nevertheless, exploiting some virtualization capabilities demands an effort from the user. To avoid this problem, we contribute to the research community a virtualized environment manager that aims to provide virtual machines fulfilling the user's requirements. Another challenge is sharing resources among different federated Cloud providers while exploiting the features of virtualization in a new approach to facilitating providers' management. This project aims to reduce the provider's costs while fulfilling the quality of service agreed with the customers and maximizing the provider's revenue. It considers resource management at several layers: locally to each node in the provider, among different nodes in the provider, and among different federated providers. This last layer supports the novel capabilities of outsourcing when the local resources are not enough to fulfil users' demand, and of offering resources to other providers when the local resources are underused.
Abstract:
This paper is about the role played by the stock of human capital in the location decisions of new manufacturing plants. We analyse the effect of several skill levels (from basic schooling to PhD) on plant location decisions in various industries and, therefore, at different technological levels. We also test whether the level of spatial aggregation biases the results, and we determine the most appropriate areas to consider in analyses of these phenomena. Our main statistical source is the Register of Manufacturing Establishments of Catalonia (REIC), which has plant-level microdata on the locations of new manufacturing plants. Keywords: agglomeration economies, industrial location, human capital, count-data models, spatial econometrics.
Abstract:
To start off, this document describes the Catalan model for emergency response and its frame of reference in terms of geography, location and population. In addition, it describes the main actors involved in emergency response, such as the police, the Fire and Rescue Emergency Service, the Emergency Medical System, Civil Protection, Reception and Management of Emergency Calls, Rural Agents, the ADFs and the UME. Civil Protection, firefighters and police are included in the training model developed by the Institute for Public Safety of Catalonia (ISPC), which also conducts research in both security and safety matters. Research activities are performed by the ISPC's Area for Research, Knowledge and International Cooperation; examples of these activities are European research projects such as COIM-Best (Coordination Improvement by Best Practices) and BESECU (cross-cultural differences of human behaviour in fire disasters and other crisis situations), among others.
Abstract:
In this paper we address the complexity of the analysis of water use in relation to the issue of sustainability. In fact, the flows of water on our planet represent a complex reality that can be studied using many different perceptions and narratives referring to different scales and dimensions of analysis. For this reason, a quantitative analysis of water use has to be based on analytical methods that are semantically open: they must be able to define what we mean by the term "water" when crossing different scales of analysis. We propose here a definition of water as a resource that deals with the many services it provides to humans and ecosystems. We argue that water can fulfil so many of them because the element has many characteristics that allow the resource to be labelled with different attributes depending on the end use, such as drinkable. Since the services for humans and the functions for ecosystems associated with water flows are defined on different scales but are still interconnected, it is necessary to organize our assessment of water use across different hierarchical levels. In order to do so, we define how to approach the study of water use in societal metabolism by proposing the Water Metabolism, organized in three levels: societal level, ecosystem level and global level. The possible end uses we distinguish for society are: personal/physiological use, household use and economic use. Organizing the study of "water use" across all these levels increases the usefulness of the quantitative analysis and the possibility of finding relevant and comparable results. To achieve this result, we adapted a method developed for multi-level, multi-scale analysis, the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach, to the analysis of water metabolism.
In this paper, we discuss the peculiar analytical identity that "water" shows within multi-scale metabolic studies: water represents a flow element when considering the metabolism of social systems (at a small scale, when describing the water metabolism inside the society) and a fund element when considering the metabolism of ecosystems (at a larger scale, when describing the water metabolism outside the society). The theoretical analysis is illustrated using two cases that characterize the metabolic patterns of water use in a productive system in Catalonia and a water management policy in the Andarax River Basin in Andalusia.
Abstract:
The objective was to evaluate the effect of ZnO-functionalised sepiolite (ZnO-Sepiolite) on fulfilling the Zn requirements and health status of weaning piglets. The pre-starter basal diet (BD; corn–soybean based, from weaning until 14 days on trial) was calculated to provide 27 mg Zn/kg feed from raw materials and had no added ZnO and no antibiotics or organic acids. Treatments during the pre-starter period were: 1) BD + 90% of NRC Zn requirements completed with ZnO (ZnO90); 2) BD + 90% of NRC Zn requirements completed with ZnO-Sepiolite (ZnOS90); 3) BD + 3000 mg ZnO/kg of diet (ZnO3000); 4) BD + 150 mg added Zn/kg diet from ZnO-Sepiolite (ZnOS150). The starter feed (corn–soybean based, from 14 until 31 days on trial) was common to all piglets and met 90% of NRC Zn requirements by adding ZnO. Diarrhea affected more than 50% of the animals in ZnO90, ZnOS90 and ZnOS150, and 33% of the ZnO3000 animals. Animals from ZnOS90 tended (P < 0.10) to improve the gain-to-feed ratio (G:F) compared to animals from ZnO90 (0.830 kg/kg vs. 0.811 kg/kg). Performance of animals from ZnO3000 was not significantly different from the other treatments and was numerically similar to that of animals from ZnOS90. The inclusion of ZnO at 3000 mg/kg of feed in the pre-starter period numerically decreased serum P at the end of this period, with no effect on the Ca level; normal levels were restored after 2 weeks of feeding the same levels of Zn as the other animals. Animals fed ZnO-Sepiolite diets had numerically higher serum Ca than ZnO90 and ZnO3000 at 12 days, and higher than ZnO90 at 28 days. Serum Zn levels were significantly higher for ZnO3000 than for the other treatments.
Abstract:
The Institute of Public Health in Ireland (IPH) is a partner in the European project DETERMINE, building on its previous involvement in the Closing the Gap project in 2004-2006. In the first year of the project (2007-2008) 15 DETERMINE partners identified policies and actions that have taken place within countries, and at the EU level, to address Social Determinants of Health Inequalities. These policies and actions were identified via a questionnaire, which also identified structures and tools/mechanisms being used in the country to support a 'health in all policy' approach.
Abstract:
Objective Biomonitoring of solvents using the unchanged substance in urine as an exposure indicator is still relatively scarce due to some discrepancies between the results reported in the literature. Based on the assessment of toluene exposure, the aim of this work was to evaluate the effects of some steps likely to bias the results and to measure urinary toluene both in volunteers experimentally exposed and in workers of rotogravure factories. Methods Static headspace was used for toluene analysis. o-Cresol was also measured for comparison. Urine collection, storage and conservation conditions were studied to evaluate possible loss or contamination of toluene in controlled situations applied to six volunteers in an exposure chamber according to four scenarios with exposure at stable levels from 10 to 50 ppm. Kinetics of elimination of toluene were determined over 24 h. A field study was then carried out in a total of 29 workers from two rotogravure printing facilities. Results Potential contamination during urine collection in the field is confirmed to be a real problem, but technical precautions for sampling, storage and analysis can easily be followed to control the situation. In the volunteers at rest, urinary toluene showed a rapid increase after 2 h with a steady level after about 3 h. At 47.1 ppm the mean cumulated excretion was about 0.005% of the amount of the toluene ventilated. Correlation between the toluene levels in air and in the end-of-exposure urine sample was excellent (r = 0.965). In the field study, the median personal exposure to toluene was 32 ppm (range 3.6-148). According to the correlations between environmental and biological monitoring data, the post-shift urinary toluene (r = 0.921) and o-cresol (r = 0.873) concentrations were, respectively, 75.6 µg/l and 0.76 mg/g creatinine for 50 ppm toluene personal exposure. The corresponding urinary toluene concentration before the next shift was 11 µg/l (r = 0.883).
Conclusion Urinary toluene was shown once more to be a very interesting surrogate for o-cresol and could be recommended as a biomarker of choice for solvent exposure. [Authors]
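As a rough illustration of the reported air-to-urine relation (75.6 µg/l post-shift at 50 ppm personal exposure), the following sketch assumes simple proportionality through that single point, which the study's fitted regression need not obey exactly; the constant and function names are ours:

```python
# Reported post-shift urinary toluene at 50 ppm personal exposure (from the abstract).
URINARY_TOLUENE_UG_L_AT_50_PPM = 75.6

def expected_urinary_toluene(air_ppm):
    """Rough post-shift urinary toluene estimate (ug/l) for a given personal
    air exposure (ppm), assuming direct proportionality through the 50 ppm point."""
    return URINARY_TOLUENE_UG_L_AT_50_PPM * air_ppm / 50.0
```

Under that assumption, a 25 ppm exposure would correspond to roughly 37.8 µg/l; any real interpretation should use the study's regression, not this shortcut.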
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using a computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach and including entropic terms provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for the TCR-p-MHC binding are analyzed and some possible side chains replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes. 
The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
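The correlations quoted above (0.67-0.74) are ordinary Pearson coefficients between computed binding free energy changes and experimentally determined activity differences; a minimal self-contained implementation (not the authors' code) is:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences,
    e.g. computed ddG values vs. experimental activity differences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A perfectly linear relation gives r = 1 (or -1), while the reported values around 0.7 indicate a strong but imperfect linear agreement between the MM-GBSA estimates and experiment.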
Abstract:
Recent progress in the experimental determination of protein structures allows us to understand, at a very detailed level, the molecular recognition mechanisms that are at the basis of living matter. This level of understanding makes it possible to design rational therapeutic approaches, in which effector molecules are adapted or created de novo to perform a given function. An example of such an approach is drug design, where small inhibitory molecules are designed using in silico simulations and tested in vitro. In this article, we present a similar approach to rationally optimize the sequence of killer T lymphocyte receptors to make them more efficient against melanoma cells. The architecture of this translational research project is presented together with its implications both at the level of basic research and in the clinic.
Abstract:
Given the urgency of a new paradigm in wireless digital transmission that should allow for higher bit rates, lower latency and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture with high capacity, higher bandwidth and a more satisfactory end-user experience. At the core of each transceiver there are inherently analog devices capable of providing the carrier signal: the oscillators. It is strongly believed that many limitations in today's communication protocols could be relieved by permitting high-carrier-frequency radio transmission and by having some degree of reconfigurability. This led us to study distributed oscillator architectures that work in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software have been reviewed. An application of all the aforementioned techniques has been shown by considering a system of three coupled oscillators (a "triple-push" oscillator) in which the stability of the various oscillating modes has been studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology permits a rise in the output power of the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting with the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements has been presented.
All the design steps have been reported, and for the first time a method for an optimized design with reduced variations in the output power has been presented. Ongoing work is devoted to modelling a wideband DVCO and to implementing a frequency divider.
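The triple-push principle mentioned in the abstract (fundamentals cancel, third harmonics add when the three outputs keep a 120° phase distribution) can be checked numerically with a small sketch; the waveforms and amplitudes below are illustrative, not taken from the actual circuit:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
f0 = 5.0  # fundamental frequency, arbitrary units

# Sum of three oscillator outputs with the required 120-degree phase offsets;
# each output carries a fundamental and a weaker third-harmonic component.
total = np.zeros_like(t)
for k in range(3):
    phase = 2 * np.pi * k / 3
    total += np.cos(2 * np.pi * f0 * t + phase)              # fundamental
    total += 0.3 * np.cos(3 * (2 * np.pi * f0 * t + phase))  # third harmonic

# Single-sided spectrum magnitude: fundamentals cancel at bin f0,
# third harmonics (whose phase offsets become multiples of 2*pi) add at bin 3*f0.
spectrum = np.abs(np.fft.rfft(total)) / len(t)
```

Here the third harmonics arrive in phase (their offsets become exact multiples of 2π), so the combined third-harmonic amplitude is three times that of a single stage, while the fundamental is suppressed by symmetry, which is exactly why the topology boosts third-harmonic output power.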