896 results for global approach


Abstract:

Child malnutrition and poverty are associated with each other, and both in turn with countries' progress. Understanding the social and economic determinants of underweight in childhood is necessary in order to create settings conducive to adequate early-childhood development and thereby contribute to overcoming poverty within the framework of equitable health systems. We describe the socio-economic characteristics of a sample of infants from one of the most vulnerable and poorest sectors of Bogotá (Colombia) and analyse possible associations between these characteristics and low weight in infancy. The rate of underweight in the study sample is higher than that reported for Bogotá and for Colombia (8.5%, 2.9% and 3.4%, respectively). The analysis of possible associations between low weight and the study variables shows that these relationships are weak; displacement status shows the strongest positive association with nutritional deficiency, followed by the 25-36 month age range. The condition most independent of child underweight is home ownership, followed by sex. Child malnutrition is present at substantial levels in the most vulnerable sectors, with implications for the adequate development of infants and for the country's efforts to reduce poverty rates. The strengthening of public policies that favour child development, the overcoming of poverty and the reduction of inequities in health systems must include comprehensive actions aimed at the most vulnerable, with the participation of civil society and the public and private sectors, the political and economic commitment of governments, and clear rules that contribute to a structural solution to poverty and promote adequate child development.
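
A minimal illustrative sketch (not the study's code), assuming a simple 2x2 contingency table between displacement status and underweight: a chi-squared test and Cramér's V are one standard way to quantify the weak associations the abstract describes. All counts and variable names are invented.

```python
# Hedged sketch (not the study's code): strength of association between a binary
# socio-economic condition (e.g. displacement status) and child underweight.
# The contingency table below is invented for illustration only.
import numpy as np
from scipy.stats import chi2_contingency

#                 underweight  not underweight
table = np.array([[12,          88],    # displaced
                  [ 9,         191]])   # not displaced

chi2, p_value, dof, expected = chi2_contingency(table)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2={chi2:.2f}  p={p_value:.3f}  Cramér's V={cramers_v:.2f}  (weak if V << 0.3)")
```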

Abstract:

Introduction: The assessment of global and regional myocardial function plays a critical role in the diagnosis and management of patients with coronary artery disease, with important prognostic implications; newer echocardiographic techniques such as strain imaging have been validated as an objective, comprehensive and accurate tool for evaluating these parameters. Objective: To determine the ability of global longitudinal strain to detect significant coronary stenosis, the number of territories involved and the anatomical territory of the culprit vessel in patients with acute myocardial infarction and no history of prior coronary artery disease. Design: Retrospective diagnostic test study using coronary angiography as the gold standard; 64 patients with a transthoracic echocardiogram performed prior to coronary angiography were selected. Results: ROC curve analysis showed intermediate accuracy of global longitudinal strain for detecting coronary stenosis, with an area under the curve of 0.78, p = 0.000 (CI 0.6-1.0), a sensitivity of 96.5% (91.7%, 101.3%), a specificity of 40.0% (9.6%, 70.4%) and a true prevalence of coronary artery disease of 85.1% (76.5%, 93.6%). Conclusions: Measuring global and regional function by means of global longitudinal strain identifies patients with acute myocardial infarction who have significant coronary stenosis, the number of territories affected and the anatomical distribution of the likely culprit vessels; nevertheless, its use should be restricted to settings where it can be interpreted appropriately. Keywords: two-dimensional global strain, detection of significant coronary stenosis, myocardial infarction.
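
A hedged sketch of the kind of diagnostic-accuracy evaluation described above, using synthetic data: a continuous index standing in for global longitudinal strain is compared against a binary angiographic gold standard via ROC analysis and a dichotomised cut-off. The threshold, variable names and data are assumptions, not values from the study.

```python
# Hedged sketch (not the paper's analysis): evaluating a continuous index such as
# global longitudinal strain (GLS) against a binary angiographic reference.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)

# Synthetic data: GLS is a negative percentage; values closer to zero indicate
# worse longitudinal function.
n = 64
stenosis = rng.integers(0, 2, size=n)                  # 1 = significant stenosis on angiography
gls = np.where(stenosis == 1,
               rng.normal(-14, 3, size=n),             # impaired strain
               rng.normal(-20, 3, size=n))             # preserved strain

# AUC: score with -gls so that higher values correspond to disease.
auc = roc_auc_score(stenosis, -gls)

# Dichotomise at an illustrative cut-off (e.g. GLS > -17%) for sensitivity/specificity.
predicted = (gls > -17.0).astype(int)
tn, fp, fn, tp = confusion_matrix(stenosis, predicted).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"AUC={auc:.2f}  sensitivity={sensitivity:.2%}  specificity={specificity:.2%}")
```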

Abstract:

The main objective pursued in this thesis is the development and systematization of a methodology for addressing management problems in the dynamic operation of Urban Wastewater Systems. The proposed methodology suggests operational strategies that can improve the overall performance of the system under certain problematic situations through a model-based approach. It has three main steps. The first step includes the characterization and modeling of the case study, the definition of scenarios, the evaluation criteria and the operational settings that can be manipulated to improve the system's performance. In the second step, Monte Carlo simulations are launched to evaluate how the system performs for a wide range of combinations of operational settings, and a global sensitivity analysis is conducted to rank the most influential operational settings. Finally, the third step consists of a screening methodology that applies a multi-criteria analysis to select the best combinations of operational settings.
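
The second step can be illustrated with a minimal sketch, assuming invented operational settings, ranges and a placeholder performance model: setting combinations are sampled by Monte Carlo, the system response is evaluated, and a simple rank-correlation screening stands in for the full global sensitivity analysis.

```python
# Hedged sketch of the second step (not the thesis code). The settings, their ranges
# and the performance model are invented placeholders.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical operational settings and their admissible ranges.
settings = {
    "aeration_setpoint": (0.5, 2.5),    # mg O2/L
    "recycle_ratio":     (0.5, 1.5),    # -
    "pump_flow":         (50.0, 200.0), # m3/h
}

def performance(x):
    """Placeholder performance index (lower is better); stands in for the UWS model."""
    return (x["aeration_setpoint"] - 1.8) ** 2 + 0.5 * abs(x["recycle_ratio"] - 1.0) \
           + 0.001 * x["pump_flow"]

# Monte Carlo sampling of setting combinations and evaluation of each run.
n_runs = 2000
samples = {k: rng.uniform(lo, hi, n_runs) for k, (lo, hi) in settings.items()}
scores = np.array([performance({k: samples[k][i] for k in settings}) for i in range(n_runs)])

# Crude global sensitivity ranking: absolute rank correlation between each setting and
# the performance index (a stand-in for a full variance-based sensitivity analysis).
def rank_corr(a, b):
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

ranking = sorted(settings, key=lambda k: -abs(rank_corr(samples[k], scores)))
print("Most influential settings first:", ranking)
```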

Abstract:

The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for CAC in ATM networks with small buffers: if the source characteristics are known, the actual CLR can be estimated very well, and this estimation is always conservative, allowing the retention of the network performance guarantees. Several experiments have been carried out and investigated to explain the deviation between the proposed method and simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method for bandwidth allocation, giving sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new evaluation method is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj), and an expression for evaluating CLRj is presented. We can conclude that, by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
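
A minimal sketch of the ECA idea under simplifying assumptions (on/off sources with identical per-class parameters, so the per-class state reduces to a binomial, a special case of the multinomial used in the ECA; the class definitions and link capacity are invented): each class contributes a partial state distribution, the partial results are multi-convolved into the global state, and the probability of congestion follows in a buffer-less setting.

```python
# Hedged sketch of the Enhanced Convolution Approach idea (not the thesis code).
import numpy as np
from math import comb

# Each class: (number of sources, activity probability, peak cell rate per source).
classes = [
    (20, 0.10, 2.0),   # class 1
    (10, 0.30, 5.0),   # class 2
]
link_capacity = 60.0   # cells per time unit (illustrative)

def class_state(n, p, rate, resolution=1.0):
    """Distribution of the aggregate rate demanded by one class (binomial over active sources)."""
    bins = int(n * rate / resolution) + 1
    pmf = np.zeros(bins)
    for k in range(n + 1):
        pmf[int(k * rate / resolution)] += comb(n, k) * p**k * (1 - p)**(n - k)
    return pmf

# Multi-convolution of the partial (per-class) states into the global state.
global_pmf = np.array([1.0])
for n, p, rate in classes:
    global_pmf = np.convolve(global_pmf, class_state(n, p, rate))

# Probability of congestion: probability that the demanded rate exceeds the link capacity
# (in a buffer-less model this bounds the cell loss ratio).
rates = np.arange(global_pmf.size)   # resolution = 1.0 cell/time unit
prob_congestion = global_pmf[rates > link_capacity].sum()
print(f"P(congestion) ≈ {prob_congestion:.4f}")
```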

Abstract:

Focus on “social determinants of health” provides a welcome alternative to the bio-medical illness paradigm. However, the tendency to concentrate on the influence of “risk factors” related to the living and working conditions of individuals, rather than to examine more broadly the dynamics of the social processes that affect population health, has triggered critical reaction not only from the Global North but especially from voices in the Global South, where there is a long history of addressing questions of health equity. In this article, we elaborate on how focusing instead on the language of “social determination of health” has prompted us to attempt to apply more equity-sensitive approaches to research and related policy and praxis.

Abstract:

With temperatures in the Arctic rising at twice the pace of anywhere else in the world, the European Union (EU) decided in 2008 to begin formulating an overall Arctic policy tackling maritime, environmental, energy and transport challenges. This attempt to draft a comprehensive policy on a topic that the EU had rarely touched upon unavoidably ran up against the existing strategies of Arctic and non-Arctic states. Against this background, this paper examines whether the EU's current Arctic policy is conducive to framing a strategy that is both correctly targeted and flexible enough to represent Europe's interests. It shows that the EU's approach can serve as an effective foreign policy tool to establish the Union's legitimacy as an Arctic player. However, the EU's Arctic policy still underestimates its potential to find common ground with the strategic partners Russia and China. A properly targeted Arctic policy could help influence Russia over the EU's interests in the Northern Sea Route and strengthen cooperation with China in an endeavour to gain recognition as relevant Arctic players.

Abstract:

Purpose - The purpose of this paper is to offer an exploratory case study comparing one Brazilian beef processor's relationships in supplying two different distribution channels: an EU importer and an EU retail chain operating in Brazil. Design/methodology/approach - The paper begins with a short review of global value chains and the recent literature on trust. It gives the background to the Brazilian beef chain and presents data obtained through in-depth interviews, annual reports and direct observation with the Brazilian beef processor, the EU importer and the retailer. The interviews were conducted with individual firms, but the analysis places them in a chain context, identifying the links and relationships between the agents of the chains and aiming to describe each distribution channel. Findings - Executive chain governance exercised by the domestic retailer stimulates technical upgrading and the transfer of best practices to local suppliers. Consequently, this kind of relationship results in more trust within the global value chain. Practical implications - There are difficulties and challenges facing this Brazilian beef processor that are partly related to the need to comply with increasingly complex and demanding food safety and food quality standards. There is still a gap between practices adopted for the export market and practices adopted locally. The strategies of transnational retailers in offering differentiated beef should be taken into account. Originality/value - The research outlines an interdisciplinary framework able to explain chain relationships and the kind of trust that emerges in relationships between an EU importer/retailer and a developing-country supplier.

Abstract:

The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how this trial was designed, and in particular how the simultaneous objectives of taking the three assessment scales into account, performing a series of interim analyses, and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres, were addressed. Copyright (C) 2008 John Wiley & Sons, Ltd.
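
A hedged sketch, not the trial's analysis code, of a global test in the spirit described above: the three scales are dichotomised into favourable/unfavourable outcomes and a GEE model with an exchangeable working correlation estimates a common treatment odds ratio across them. All data below are simulated and the cut-offs are illustrative.

```python
# Hedged sketch of a global test over three dichotomized stroke scales (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
treatment = rng.integers(0, 2, n)

# Simulated "favourable outcome" indicators at 12 weeks, one per scale
# (e.g. Barthel >= 95, modified Rankin <= 1, NIHSS <= 1), with a small treatment benefit.
p_fav = 0.40 + 0.10 * treatment
outcomes = {scale: rng.binomial(1, p_fav) for scale in ["barthel", "rankin", "nihss"]}

# Long format: one row per patient per scale, patient id as the clustering variable.
rows = []
for scale in ["barthel", "rankin", "nihss"]:
    for i in range(n):
        rows.append({"patient": i, "treatment": treatment[i],
                     "scale": scale, "favourable": outcomes[scale][i]})
long = pd.DataFrame(rows)

# GEE with exchangeable working correlation over the three outcomes of each patient.
model = sm.GEE.from_formula("favourable ~ treatment", groups="patient", data=long,
                            family=sm.families.Binomial(),
                            cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
print(result.summary())
print("Common (global) treatment odds ratio:", np.exp(result.params["treatment"]))
```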

Abstract:

Can autonomic computing concepts be applied to traditional multi-core systems found in high-performance computing environments? In this paper, we propose a novel synergy between parallel computing and swarm robotics to offer a new computing paradigm, 'Swarm-Array Computing', that can harness and apply autonomic computing to parallel computing systems. One of the three proposed approaches in swarm-array computing, based on landscapes of intelligent cores, in which the cores of a parallel computing system are abstracted to swarm agents, is investigated. In the proposed approach, a task is executed and transferred seamlessly between cores, thereby achieving the self-ware properties that characterize autonomic computing. FPGAs are considered as an experimental platform, taking into account their application in space robotics. The feasibility of the proposed approach is validated on the SeSAm multi-agent simulator.
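
A toy sketch (not the SeSAm model) of the task-migration behaviour described above, with invented names and a simplistic failure model: cores are abstracted as agents and a task hops to the nearest healthy core when its host fails.

```python
# Hedged toy sketch of seamless task migration between cores abstracted as swarm agents.
import random

class CoreAgent:
    def __init__(self, core_id):
        self.core_id = core_id
        self.healthy = True
        self.task = None

def step(cores, failure_prob=0.05):
    """One simulation step: random failures, then task migration to a healthy neighbour."""
    for core in cores:
        if core.healthy and random.random() < failure_prob:
            core.healthy = False
    for i, core in enumerate(cores):
        if core.task is not None and not core.healthy:
            # Hand the task over to the nearest healthy core, mimicking self-healing.
            for j in sorted(range(len(cores)), key=lambda j: abs(j - i)):
                if cores[j].healthy:
                    cores[j].task, core.task = core.task, None
                    print(f"task migrated from core {i} to core {j}")
                    break

cores = [CoreAgent(i) for i in range(8)]
cores[0].task = "payload-task"
for _ in range(20):
    step(cores)
```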

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow one to define such properties as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology to study general Climate Change problems on virtually any time scale by resorting only to well-selected simulations, and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
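
A minimal sketch, assuming standard parameter choices, of the Lorenz 96 test bed and a crude finite-difference estimate of the response of a global observable (mean energy) to a change in the forcing F; this brute-force estimate is only a stand-in for the rigorous Ruelle response formalism used in the paper.

```python
# Hedged sketch (not the paper's code): Lorenz 96 model plus a finite-difference
# sensitivity of the time-averaged energy to the forcing F. Run lengths are illustrative.
import numpy as np

def lorenz96_tendency(x, forcing):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with periodic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, forcing, dt):
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def mean_energy(forcing, n_vars=36, dt=0.01, spinup=2000, steps=20000, seed=0):
    """Long-time average of the global observable e = mean(x_i^2) / 2."""
    rng = np.random.default_rng(seed)
    x = forcing + 0.01 * rng.standard_normal(n_vars)
    for _ in range(spinup):
        x = rk4_step(x, forcing, dt)
    total = 0.0
    for _ in range(steps):
        x = rk4_step(x, forcing, dt)
        total += 0.5 * np.mean(x ** 2)
    return total / steps

# Crude sensitivity of the mean energy to the forcing parameter around F = 8.
f0, df = 8.0, 0.25
sensitivity = (mean_energy(f0 + df) - mean_energy(f0 - df)) / (2 * df)
print(f"d<e>/dF ≈ {sensitivity:.3f} near F = {f0}")
```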

Abstract:

We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which is the most successful statistical method depends on the region considered, GCM data used and prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
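
A minimal sketch of the constructed-analogue idea on synthetic data (array shapes and the ridge regularisation are illustrative assumptions): the current anomaly field is fitted as a weighted combination of historical fields, and the same weights applied to the fields a lead time later give the forecast.

```python
# Hedged sketch (not the paper's code) of a constructed-analogue forecast.
import numpy as np

def constructed_analogue_forecast(library, current_state, lead, ridge=1e-2):
    """
    library:       array (n_times, n_gridpoints) of historical SST anomaly fields
    current_state: array (n_gridpoints,) to be predicted forward
    lead:          forecast lead in library time steps
    """
    predictors = library[:-lead]          # states whose future is known
    successors = library[lead:]           # the same states `lead` steps later
    # Ridge-regularised least squares for the analogue weights alpha:
    # minimise || predictors.T @ alpha - current_state ||^2 + ridge * ||alpha||^2
    gram = predictors @ predictors.T + ridge * np.eye(predictors.shape[0])
    alpha = np.linalg.solve(gram, predictors @ current_state)
    return successors.T @ alpha

# Toy usage with synthetic data.
rng = np.random.default_rng(0)
library = rng.standard_normal((200, 500))      # 200 historical states, 500 grid points
forecast = constructed_analogue_forecast(library, library[-1], lead=10)
print(forecast.shape)                          # (500,)
```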

Abstract:

Six land surface models and five global hydrological models participate in a model intercomparison project (WaterMIP), which for the first time compares simulation results of these different classes of models in a consistent way. In this paper the simulation setup is described and aspects of the multi-model global terrestrial water balance are presented. All models were run at 0.5 degree spatial resolution for the global land areas for a 15-year period (1985-1999) using a newly developed global meteorological dataset. Simulated global terrestrial evapotranspiration, excluding Greenland and Antarctica, ranges from 415 to 586 mm year-1 (60,000 to 85,000 km3 year-1) and simulated runoff ranges from 290 to 457 mm year-1 (42,000 to 66,000 km3 year-1). Both the mean and median runoff fractions for the land surface models are lower than those of the global hydrological models, although the range is wider. Significant simulation differences between land surface and global hydrological models are found to be caused by the snow scheme employed: the physically based energy balance approach used by land surface models generally results in lower snow water equivalent values than the conceptual degree-day approach used by global hydrological models. Some differences in simulated runoff and evapotranspiration are explained by model parameterizations, although the processes included and parameterizations used are not distinct to either land surface models or global hydrological models. The results show that differences between models are a major source of uncertainty. Climate change impact studies thus need to use not only multiple climate models, but also some other measure of uncertainty (e.g. multiple impact models).
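
A minimal sketch contrasting the two snow-scheme families mentioned above, with illustrative parameter values only: a conceptual degree-day melt formulation versus a highly simplified energy-balance melt.

```python
# Hedged sketch (not WaterMIP code): degree-day vs simplified energy-balance snowmelt.
# Parameter values (degree-day factor, albedo, fluxes) are illustrative only.
import numpy as np

LATENT_HEAT_FUSION = 3.34e5   # J/kg
WATER_DENSITY = 1000.0        # kg/m3

def degree_day_melt(temp_c, ddf=3.0, t_threshold=0.0):
    """Melt (mm water equivalent per day) from daily mean temperature only."""
    return ddf * np.maximum(temp_c - t_threshold, 0.0)

def energy_balance_melt(shortwave_wm2, longwave_net_wm2, turbulent_wm2, albedo=0.7):
    """Melt (mm w.e. per day) from a simplified surface energy balance."""
    net_energy = (1.0 - albedo) * shortwave_wm2 + longwave_net_wm2 + turbulent_wm2  # W/m2
    melt_energy = np.maximum(net_energy, 0.0) * 86400.0                             # J/m2/day
    return melt_energy / (LATENT_HEAT_FUSION * WATER_DENSITY) * 1000.0              # mm/day

# Toy comparison for one spring day.
print("degree-day melt:", degree_day_melt(np.array([2.5])), "mm/day")
print("energy-balance melt:", energy_balance_melt(np.array([250.0]), -60.0, 10.0), "mm/day")
```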

Abstract:

The calibration of the CloudSat spaceborne cloud radar has been thoroughly assessed using very accurate internal link budgets before launch, comparisons with predicted ocean surface backscatter at 94 GHz, direct comparisons with airborne cloud radars, and statistical comparisons with ground-based cloud radars at different locations of the world. It is believed that the calibration of CloudSat is accurate to within 0.5–1 dB. In the present paper it is shown that an approach similar to that used for the statistical comparisons with ground-based radars can now be adopted the other way around to calibrate other ground-based or airborne radars against CloudSat and/or to detect anomalies in long time series of ground-based radar measurements, provided that the calibration of CloudSat is followed up closely (which is the case). The power of using CloudSat as a global radar calibrator is demonstrated using the Atmospheric Radiation Measurement cloud radar data taken at Barrow, Alaska, the cloud radar data from the Cabauw site, Netherlands, and, for the first time, airborne Doppler cloud radar measurements taken along the CloudSat track in the Arctic by the Radar System Airborne (RASTA) cloud radar installed on the French ATR-42 aircraft. It is found that the Barrow radar data in 2008 are calibrated too high by 9.8 dB, while the Cabauw radar data in 2008 are calibrated too low by 8.0 dB. The calibration of the RASTA airborne cloud radar using direct comparisons with CloudSat agrees well with the expected gains and losses resulting from the change in configuration that required verification of the RASTA calibration.
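
A generic, heavily simplified sketch (not the paper's method) of how a calibration offset between two radars can be expressed: the shift in dB that aligns the reflectivity distributions they report for comparable scenes, estimated here from matched percentiles of synthetic samples.

```python
# Hedged, generic sketch: calibration offset as the median difference of matched
# reflectivity percentiles between a reference and a candidate radar. Synthetic data only.
import numpy as np

def calibration_offset_db(reference_dbz, candidate_dbz, percentiles=np.arange(5, 96, 5)):
    """Estimate the dB offset of `candidate_dbz` relative to `reference_dbz`."""
    ref_q = np.percentile(reference_dbz, percentiles)
    cand_q = np.percentile(candidate_dbz, percentiles)
    return float(np.median(cand_q - ref_q))

# Synthetic example: a ground radar reporting roughly 8 dB too low relative to CloudSat.
rng = np.random.default_rng(3)
cloudsat = rng.normal(-20.0, 8.0, 5000)          # dBZ, reference sample
ground = rng.normal(-28.0, 8.0, 4000)            # dBZ, mis-calibrated sample
print(f"estimated offset: {calibration_offset_db(cloudsat, ground):+.1f} dB")
```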