979 results for Gemstone Team ANSWER Poverty (Assessing the Need for Services Which Effectively Reduce Poverty)


Relevance:

100.00%

Publisher:

Abstract:

Colorectal cancer (CRC) ranks third in both incidence and mortality among cancers in Texas. This study investigated the adherence of CRC treatment to standard treatment guidelines and the association between standard treatment and CRC survival in Texas. We used Texas Cancer Registry (TCR) and Medicare linked data to study CRC treatment patterns and the factors associated with standard treatment in patients who were older than 65 years and were diagnosed from 2001 through 2007. We also determined whether adherence to standard treatment affects patients' survival. Multiple logistic regression and Cox regression analyses were used to analyze the data; both regression models were adjusted for demographic and tumor characteristics. We found that among the 3977 regional colon cancer patients 80 years old or younger, 60.2% received chemotherapy, in adherence with the recommended treatment guidelines. Patients with younger age, female gender, higher education and lower comorbidity scores were more likely to adhere to this chemotherapy guideline. Patients adherent to chemotherapy in this cohort had better survival than those who were not (HR: 0.76, 95% CI: 0.68-0.84). Among the 12709 colon cancer patients treated with surgery, 49.3% had more than 12 lymph nodes removed, in adherence with the treatment guidelines. Patients with younger age, female gender, higher education, regional stage, larger tumor size and lower comorbidity scores were more likely to adhere to this surgical guideline. Patients with more than 12 lymph nodes removed in this cohort had better survival (HR: 0.86, 95% CI: 0.82-0.91). Among the 1211 regional rectal cancer patients 80 years old or younger, 63.2% were adherent to radiation treatment. Patients with smaller tumor size and lower comorbidity scores were more likely to adhere to this radiation guideline. There was no significant survival difference between radiation-adherent and non-adherent patients (HR: 1.03, 95% CI: 0.82-1.29). Among the 1122 regional rectal cancer patients 80 years old or younger who were treated with surgery, 76.0% received postoperative chemotherapy, in adherence with the treatment guidelines. Younger age and lower comorbidity scores were associated with higher adherence rates. Patients adherent to adjuvant chemotherapy in this cohort had better survival than those who were not (HR: 0.60, 95% CI: 0.45-0.79).
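The survival comparisons above are reported as adjusted hazard ratios with 95% confidence intervals. As a hedged illustration only (not the study's actual code), the sketch below shows how such an adjusted hazard ratio for guideline adherence could be estimated with a Cox proportional hazards model using the lifelines library; the file name and column names are hypothetical.

```python
# Illustrative sketch (not the study's actual analysis): estimating the hazard
# ratio for guideline-adherent vs. non-adherent patients with a Cox model,
# adjusted for demographic and tumor covariates, using the lifelines library.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis file: one row per patient with follow-up time (months),
# death indicator, adherence flag and adjustment covariates.
df = pd.read_csv("crc_cohort.csv")  # columns: time, death, adherent, age, female, stage, comorbidity

cph = CoxPHFitter()
cph.fit(
    df[["time", "death", "adherent", "age", "female", "stage", "comorbidity"]],
    duration_col="time",
    event_col="death",
)

# The exp(coef) entry for 'adherent' in the summary is the adjusted hazard
# ratio, reported together with its 95% confidence interval, analogous to the
# HRs quoted above.
print(cph.summary)
```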

Relevance:

100.00%

Publisher:

Abstract:

'Pedro Giménez' is a white Criolla variety grown in Argentina, mainly in Mendoza and San Juan, and it is the most widely planted white variety destined for winemaking in the country. Its origin remains unknown, as does its relationship with the Spanish variety 'Pedro Ximénez', grown mostly in Jerez, Spain. Previous work has shown that most of the Criolla varieties currently existing in America are the offspring of 'Muscat of Alexandria' x 'Criolla Chica'. The aim of the present work was to compare 'Pedro Giménez' with the Spanish variety 'Pedro Ximénez' and to establish its degree of relatedness to 'Muscat of Alexandria' and 'Criolla Chica'. To this end, we used a set of 18 nuclear SSR loci and 3 chloroplast SSR loci. 'Pedro Giménez' shared only 38% of the analyzed alleles with 'Pedro Ximénez', indicating that they are indeed two different varieties. Across all 18 polymorphic nuclear SSR loci, 'Pedro Giménez' shared 50% of its alleles with 'Muscat of Alexandria', while the other 50% of the alleles present in 'Pedro Giménez' were also present in 'Criolla Chica'. These data, along with those from the chloroplast SSR analysis, strongly suggest that 'Pedro Giménez' is the progeny of 'Muscat of Alexandria' x 'Criolla Chica', with the latter being the most likely female progenitor.
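As a rough illustration of this kind of allele-sharing analysis (the allele sizes below are hypothetical, not the published SSR profiles; only the locus names are real grapevine markers), the following sketch computes the fraction of a putative offspring's alleles found in a candidate parent and checks the basic trio-compatibility rule that at least one allele per locus must come from each parent.

```python
# Minimal sketch with hypothetical genotypes: fraction of SSR alleles a
# putative offspring shares with each candidate parent, plus a simple
# parent-offspring compatibility check per locus.

# Genotypes as {locus: (allele1, allele2)}, alleles coded by fragment size.
pedro_gimenez = {"VVS2": (133, 143), "VVMD5": (226, 236), "VVMD7": (239, 249)}
muscat_alex   = {"VVS2": (133, 141), "VVMD5": (226, 232), "VVMD7": (239, 247)}
criolla_chica = {"VVS2": (135, 143), "VVMD5": (230, 236), "VVMD7": (243, 249)}

def shared_fraction(child, parent):
    """Fraction of the child's alleles (2 per locus) also present in the parent."""
    shared = sum(a in parent[locus] for locus, alleles in child.items() for a in alleles)
    return shared / (2 * len(child))

def compatible_trio(child, parent_a, parent_b):
    """True if, at every locus, the child carries one allele from each parent."""
    return all(
        any(a in parent_a[loc] for a in child[loc]) and any(a in parent_b[loc] for a in child[loc])
        for loc in child
    )

print(shared_fraction(pedro_gimenez, muscat_alex))               # ~0.5, as reported above
print(compatible_trio(pedro_gimenez, muscat_alex, criolla_chica))  # True for a plausible trio
```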

Relevance:

100.00%

Publisher:

Abstract:

Organisms inhabiting coastal waters naturally experience diel and seasonal physico-chemical variations. According to various assumptions, coastal species are either considered to be highly tolerant to environmental changes or, conversely, living at the thresholds of their physiological performance. Therefore, these species are either more resistant or more sensitive, respectively, to ocean acidification and warming. Here, we focused on Crepidula fornicata, an invasive gastropod that colonized bays and estuaries on northwestern European coasts during the 20th century. Small (<3 cm in length) and large (>4.5 cm in length), sexually mature individuals of C. fornicata were raised for 6 months in three different pCO2 conditions (390 µatm, 750 µatm, and 1400 µatm) at four successive temperature levels (10°C, 13°C, 16°C, and 19°C). At each temperature level and in each pCO2 condition, we assessed the physiological rates of respiration, ammonia excretion, filtration and calcification on small and large individuals. Results show that, in general, temperature positively influenced respiration, excretion and filtration rates in both small and large individuals. Conversely, increasing pCO2 negatively affected calcification rates, leading to net dissolution in the most drastic pCO2 condition (1400 µatm) but did not affect the other physiological rates. Overall, our results indicate that C. fornicata can tolerate ocean acidification, particularly in the intermediate pCO2 scenario. Moreover, in this eurythermal species, moderate warming may play a buffering role in the future responses of organisms to ocean acidification.

Relevance:

100.00%

Publisher:

Abstract:

More than one-third of the World Trade Organization-notified services trade agreements that were in effect between January 2008 and August 2015 involved at least one South or Southeast Asian trading partner. Drawing on Baier and Bergstrand’s (2004) determinants of preferential trade agreements and using the World Bank’s database on the restrictiveness of domestic services regimes (Borchert, Gootiiz, and Mattoo 2012), we examine the potential for negotiated regulatory convergence in Asian services markets. Our results suggest that Asian economies with high levels of preexisting bilateral merchandise trade and wide differences in services regulatory frameworks are more likely candidates for services trade agreement formation. Such results lend support to the hypothesis that the heightened “servicification” of production generates demand for the lowered services input costs resulting from negotiated market openings.
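A hedged sketch of the kind of binary-choice regression such a determinants analysis implies (not the authors' actual specification or data; the file and variable names are hypothetical): the probability that a country pair has a services trade agreement is modeled as a function of bilateral merchandise trade and the gap between the partners' services regulatory restrictiveness.

```python
# Illustrative logit on hypothetical country-pair data: agreement formation as a
# function of bilateral trade and the absolute gap in services restrictiveness.
import pandas as pd
import statsmodels.api as sm

pairs = pd.read_csv("country_pairs.csv")  # columns: sta, log_bilateral_trade, stri_i, stri_j

pairs["stri_gap"] = (pairs["stri_i"] - pairs["stri_j"]).abs()
X = sm.add_constant(pairs[["log_bilateral_trade", "stri_gap"]])
model = sm.Logit(pairs["sta"], X).fit()

# Positive coefficients on both regressors would mirror the finding that high
# pre-existing trade and wide regulatory differences raise the likelihood of an
# agreement.
print(model.summary())
```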

Relevance:

100.00%

Publisher:

Abstract:

More than a third of the World Trade Organization (WTO)-notified services trade agreements (STAs) in effect between January 2008 and August 2015 involved at least one (South or Southeast) Asian trading partner. Drawing on Baier and Bergstrand's (2004) determinants of preferential trade agreements and using the World Bank's database on the restrictiveness of domestic services regimes (Borchert et al. 2012), we examine the potential for negotiated regulatory convergence in Asian services markets. Our results suggest that countries within Asia with high levels of pre-existing bilateral merchandise trade and wide differences in services regulatory frameworks are more likely candidates for STA formation. Such results lend support to the hypothesis that the heightened "servicification" of production generates a demand for the lowered service input costs resulting from negotiated market opening.

Relevance:

100.00%

Publisher:

Abstract:

Think piece by Pierre Sauvé for the E15 Initiative on Strengthening the Global Trade System. In his latest essay for the ICTSD-World Economic Forum E15 Initiative on Strengthening the Global Trade and Investment System for Sustainable Development, WTI Director of External Programmes and Academic Partnerships and faculty member Pierre Sauvé explores the case for fusing the law of goods with that of services in a world of global value chains. The paper does so by directing attention to whether the current architectures of multilateral and preferential trade governance are compatible with a world of trade in tasks; whether the existing rules offer globally active firms a coherent structure for doing business in a predictable environment; whether it is feasible to redesign the structure and content of existing trade rules to align them with the reality of production fragmentation; and what steps can be envisaged to better align policy and marketplace realities if the prospects for restructuring appear unfavourable. The paper argues that fusing trade disciplines for goods and services is neither needed nor feasible, and may actually deflect attention from a number of worthwhile policy initiatives where more realistic (if never easily secured) prospects of generic rule-making may well exist.

Relevance:

100.00%

Publisher:

Abstract:

In Brazil, a low-latitude country characterized by high availability and uniformity of solar radiation, the use of building-integrated PV solar energy is still incipient. However, several current initiatives suggest that this will change shortly. In countries where this technology is already a daily reality, such as Germany, Japan or Spain, recommendations and basic criteria to avoid losses due to orientation and tilt are widespread. Extrapolating measures developed for high latitudes to all regions, without a deeper prior analysis, is standard practice, but they do not always correspond to reality, which frequently leads to false assumptions and may become an obstacle in a country that is taking its first steps in this area. In this paper, the solar potential yield of different surfaces in Brazilian cities (located at latitudes between 0° and 30°S) is analyzed, with the aim of providing the necessary tools to evaluate the suitability of building envelopes for photovoltaic use.

Relevance:

100.00%

Publisher:

Abstract:

Changes in the geomorphology of rivers have serious repercussions, causing losses in the dynamics and naturalness of their forms: in many cases a meandering channel, with constant erosion and sedimentation processes, becomes a narrow channelized river with rigid, stable margins, where the only possible movement is vertical, so that the only changes in channel geometry occur in the river bed. These changes also seriously affect the naturalness of the banks, preventing the development of riparian vegetation and reducing the cross connectivity of the riparian corridor. Common channelizations and disconnections of meanders increase the slope, and therefore the flow velocity, resulting in regressive erosion, an effect amplified by the narrowing of the channel and the concentration of flows. This incision process may leave the floodplain "hanging", completely disconnected from the water table, with important consequences for vegetation. As an example of the effects of these changes, the case of the Arga River was chosen. The Arga River has been channelized and rectified as it passes along the Ramal Hondo and Soto Gil meanders (Funes, Navarra). The effects of remeandering the Arga River on fish habitat and riparian vegetation are presented. Two very contrasting restoration scenarios, in terms of geomorphology, were established to assess the effects such changes have on the habitat of one of the major fish species in the area (Luciobarbus graellsii) and on the riparian vegetation. To accomplish this goal, a digital elevation model provided by a LiDAR flight, bathymetric data and flow data were used as inputs to a 2D hydraulic simulation model (InfoWorks RS). The results obtained helped not only to evaluate the effects of past alterations of the geomorphological characteristics, but also to predict fish and vegetation habitat responses to this type of change.
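The abstract does not specify the habitat metric used, so the following is only an assumed illustration of a common approach: a weighted usable area (WUA) computed over the wet cells of a 2D hydraulic model, with hypothetical depth and velocity suitability curves for Luciobarbus graellsii.

```python
# Illustrative sketch only: WUA = sum over wet cells of (cell area x combined
# habitat suitability), using hypothetical triangular suitability curves.
import numpy as np

def suitability(depth_m, velocity_ms):
    """Hypothetical habitat suitability index in [0, 1]."""
    hsi_depth = np.clip(1.0 - np.abs(depth_m - 1.0) / 1.0, 0.0, 1.0)    # optimum assumed ~1 m
    hsi_vel = np.clip(1.0 - np.abs(velocity_ms - 0.4) / 0.4, 0.0, 1.0)  # optimum assumed ~0.4 m/s
    return hsi_depth * hsi_vel

def weighted_usable_area(cell_area_m2, depth_m, velocity_ms):
    """Weighted usable area over all model cells."""
    return float(np.sum(cell_area_m2 * suitability(depth_m, velocity_ms)))

# Example with three model cells (area in m2, depth in m, velocity in m/s).
area = np.array([25.0, 25.0, 25.0])
depth = np.array([0.6, 1.1, 1.8])
vel = np.array([0.2, 0.5, 0.9])
print(weighted_usable_area(area, depth, vel))
```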

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a method for the landscape characterisation and assessment of public works associated with fluvial landscapes, which is validated in the middle section of the Tajo River. In this method, a set of criteria is identified that unifies various characteristics of the landscape associated with the infrastructures. A specific weight is then assigned to each criterion so as to produce a semi-quantitative value ranging from a minimum of 0 to a maximum of 10. Taken together, these criteria enable us to describe and assess the value of the public works selected for study, in this case helping us to evaluate the sections of the Tajo River analysed in our study area. Accordingly, the value of all the infrastructures associated with a stretch of the river covering several hundred kilometres was determined and, after dividing this stretch into sections, they were compared under equivalent conditions to provide a hierarchical ranking.
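As a minimal sketch of such a weighted scoring scheme (the criteria names and weights below are purely illustrative, not the paper's actual set), each criterion score in the 0-10 range is combined through weights that sum to 1, yielding the semi-quantitative 0-10 value described above.

```python
# Weighted aggregation of criterion scores into a single 0-10 landscape value.
def landscape_value(scores, weights):
    """scores: criterion -> value in 0-10; weights: criterion -> weight, summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical assessment of one river section.
weights = {"heritage": 0.3, "visibility": 0.2, "integration": 0.3, "conservation": 0.2}
scores  = {"heritage": 8,   "visibility": 6,   "integration": 7,   "conservation": 5}
print(landscape_value(scores, weights))  # 6.7
```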

Relevance:

100.00%

Publisher:

Abstract:

A notable advantage of wireless transmission is a significant reduction and simplification in wiring and harness. There are many applications of wireless systems, but on many occasions sensor nodes require a specific housing to protect the electronics from harsh environmental conditions. Information on the dynamic behaviour of WSN (wireless sensor network) and RFID devices is currently scarce and nonspecific. The purpose of this study is therefore to evaluate the dynamic behaviour of the sensors. A series of trials was designed and performed covering temperature steps between a cold room (5 °C), room temperature (23 °C) and a heated environment (35 °C). The sensor nodes were: three Crossbow motes; a surface-mounted Nlaza module (with the Sensirion sensor located on the motherboard); an aerial-mounted Nlaza module (with the Sensirion sensor at the end of a cable); and four RFID tags, Turbo Tag T700 (with and without housing) and 702-B (with and without housing). To assess the dynamic behaviour, a first-order response approach is used and fitted with dedicated optimization tools programmed in Matlab that allow extracting the time constant (τ) and the corresponding coefficient of determination (r²) with respect to the experimental data. The shortest response time (20.9 s) is found for the uncoated T700 tag, whose encapsulated version shows a significantly longer response (107.2 s). High τ values correspond to the Crossbow modules (144.4 s) and the surface-mounted Nlaza module (288.1 s), while the module with the aerial-mounted sensor gives a response (42.8 s) only slightly above that of the uncoated T700. In conclusion, the dynamic response of temperature sensors within wireless and RFID nodes is dramatically influenced by the way they are housed (to protect them from the environment) as well as by the heat released by the node electronics itself; its characterization is essential to allow monitoring of high-rate temperature changes and to certify the cold chain. Moreover, the time to rise and the time to recover are significantly different, the latter generally being longer than the former.
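The study's fitting was performed with Matlab tools; as a hedged equivalent sketch in Python (synthetic data, not the trial measurements), the first-order model T(t) = T_final + (T_0 − T_final)·exp(−t/τ) is fitted to a temperature step and τ and r² are extracted.

```python
# First-order step-response fit: extract the time constant tau and r2.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, t_final, t_0, tau):
    return t_final + (t_0 - t_final) * np.exp(-t / tau)

# Hypothetical step from 5 C to 23 C logged every 5 s (replace with sensor data).
rng = np.random.default_rng(0)
t = np.arange(0, 300, 5.0)
temp = 23.0 + (5.0 - 23.0) * np.exp(-t / 45.0) + rng.normal(0, 0.1, t.size)

popt, _ = curve_fit(first_order, t, temp, p0=[temp[-1], temp[0], 30.0])
residuals = temp - first_order(t, *popt)
r2 = 1.0 - np.sum(residuals**2) / np.sum((temp - temp.mean())**2)
print(f"tau = {popt[2]:.1f} s, r2 = {r2:.3f}")
```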

Relevance:

100.00%

Publisher:

Abstract:

A methodology is established for evaluating the cartography of GIS layers.

Relevance:

100.00%

Publisher:

Abstract:

Accessibility is an essential concept widely used to evaluate the impact of land-use and transport strategies in transport and urban planning. Accessibility is typically evaluated by using a transport model or a land-use model independently or successively without a feedback loop, thus neglecting the interaction effects between the two systems and the induced competition effects among opportunities due to accessibility improvements. More than a mere methodological curiosity, failure to account for land-use/transport interactions and the competition effect may result in a large underestimation of the policy effects. With the recent development of land-use and transport interaction (LUTI) models, there is a growing interest in using these models to adequately measure accessibility and evaluate its impact. The current study joins this research stream by embedding an accessibility measure in a LUTI model, with two main aims. The first aim is to account for adaptive accessibility, namely the adjustment of the potential accessibility due to the effect of competition among opportunities (e.g., workplaces) as a result of improved accessibility. LUTI models are particularly suitable for assessing adaptive accessibility because the competition factor is a function of the number of jobs, which is related to land-use attractiveness, and of the number of workers, which is related, among other factors, to the transport demand. The second aim is to identify the optimal implementation scenario of policy measures on the basis of the potential and adaptive accessibility and to analyse the results in terms of social welfare and accessibility. The metropolitan area of Madrid is used as a case study, and two transport policy instruments, namely a cordon toll and a bus frequency increase, have been chosen for the simulation study in order to present the usefulness of the approach to urban planners and policy makers. The MARS model (Metropolitan Activity Relocation Simulator) calibrated for Madrid was employed as the analysis tool. The impact of accessibility is embedded in the model through a social welfare function that includes not only costs and benefits to both road users and transport operators, but also costs and benefits for the government and society in general (external costs). An optimisation procedure is performed by the MARS model to maximise the value of the objective function in order to find the best (optimal) policy implementation intensity (i.e., price, frequency). Last, the two policy strategies are evaluated in terms of their accessibility. Results show that accessibility with the competition factor influences the optimal policy implementation level and also generates different results in terms of social welfare. In addition, mapping the difference between the potential and the adaptive accessibility indicators shows that the main changes occur in areas where there is strong competition among land-use opportunities.
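As a hedged sketch of the competition-adjusted ("adaptive") accessibility idea described above (a generic potential-accessibility formulation, not the exact MARS model), zone-level access to jobs is deflated where many workers elsewhere can reach, and therefore compete for, the same jobs; all numbers below are hypothetical.

```python
# Potential vs. competition-adjusted accessibility on a toy 3-zone example.
import numpy as np

def potential_accessibility(jobs, cost, beta=0.1):
    """A_i = sum_j jobs_j * exp(-beta * c_ij)."""
    return np.exp(-beta * cost) @ jobs

def adaptive_accessibility(jobs, workers, cost, beta=0.1):
    """Each zone's job pool is deflated by the demand potential of all workers
    that can reach it, before being summed into accessibility."""
    demand = np.exp(-beta * cost.T) @ workers          # D_j = sum_k workers_k * f(c_kj)
    return np.exp(-beta * cost) @ (jobs / demand)

cost = np.array([[5.0, 20.0, 40.0],     # travel costs c_ij in minutes (hypothetical)
                 [20.0, 5.0, 25.0],
                 [40.0, 25.0, 5.0]])
jobs = np.array([1000.0, 5000.0, 2000.0])
workers = np.array([4000.0, 3000.0, 1000.0])
print(potential_accessibility(jobs, cost))
print(adaptive_accessibility(jobs, workers, cost))
```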

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a first in-depth study of solar-fossil hybridization from a general perspective. It develops a set of useful parameters for analyzing and comparing hybrid plants, studies the case of hybridizing Brayton cycles with current solar technologies, and shows a tentative extrapolation of the results to integrated solar combined cycle systems (ISCCS). In particular, three points have been analyzed: the technical requirements for solar technologies to be hybridized with Brayton cycles; the temperatures and pressures at which hybridization would produce maximum power per unit of fossil fuel; and their mapping to current solar technologies and Brayton cycles. The major conclusions are that a hybrid plant works at optimum conditions that are not equal to those of the solar or power blocks considered independently, and that hybridizing at the Brayton cycle of a combined cycle could be energetically advantageous.
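As a simplified, air-standard illustration of the "power per unit of fossil fuel" parameter mentioned above (ideal Brayton cycle with constant cp and isentropic machines; an assumption-laden sketch, not the paper's model), the compressed air is preheated by a solar source before the combustor and the net specific work is divided by the fossil heat input.

```python
# Ideal-gas Brayton cycle with solar preheating of the compressed air:
# returns net specific work and work per unit of fossil heat input.
CP = 1.005      # kJ/(kg K), air
GAMMA = 1.4

def power_per_fossil_heat(t_amb, t_turbine_inlet, t_solar, pressure_ratio):
    """Temperatures in K; returns (net specific work [kJ/kg], work / fossil heat [-])."""
    exp_ = (GAMMA - 1.0) / GAMMA
    t2 = t_amb * pressure_ratio ** exp_            # compressor outlet (isentropic)
    t3 = t_turbine_inlet
    t4 = t3 / pressure_ratio ** exp_               # turbine outlet (isentropic)
    w_net = CP * ((t3 - t4) - (t2 - t_amb))        # turbine work minus compressor work
    q_fossil = CP * (t3 - max(t2, t_solar))        # combustor heat after solar preheating
    return w_net, w_net / q_fossil

# Example: 15 C ambient, 1200 C turbine inlet, solar receiver at 700 C, pressure ratio 15.
print(power_per_fossil_heat(288.15, 1473.15, 973.15, 15.0))
```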

Relevance:

100.00%

Publisher:

Abstract:

A proper assessment of the safety margins of a nuclear facility, for example a nuclear power plant, takes into account all the uncertainties that affect the calculations of its design, operation and accident response. One source of uncertainty is nuclear data, which affect neutronics, fuel burn-up and material activation calculations. These calculations allow the evaluation of the response functions that are essential for correct behaviour during operation and also during accidents; examples of such responses are the neutron multiplication factor or the decay heat after reactor shutdown. It is therefore necessary to evaluate the impact of these uncertainties on such calculations. In order to perform uncertainty propagation calculations, methodologies capable of evaluating the impact of nuclear data uncertainties must be implemented, but it is also necessary to know which uncertainty data are available in order to be able to handle them. At present, great efforts are being invested in improving the capability to analyse, handle and produce uncertainty data, especially for isotopes that are important for advanced reactors. At the same time, new programs/codes are being developed and implemented to use such data and analyse their impact. All of these points are part of the objectives of the European ANDES project, which has provided the framework for the development of this doctoral thesis. Therefore, a review of the state of the art of nuclear data and their uncertainties was first carried out, focusing on the three types of data: decay data, fission yields and cross sections. A review of the state of the art of the methodologies for propagating these nuclear data uncertainties was also performed. Within the Department of Nuclear Engineering (DIN), a methodology was proposed for propagating uncertainties in isotopic evolution (depletion) calculations: the Hybrid Method. This methodology has been taken as the starting point of this thesis; it has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. The Hybrid Method is used together with the ACAB depletion code and is based on Monte Carlo sampling of the nuclear data with uncertainties. Within this methodology, different approaches are presented depending on the energy-group structure of the cross sections: one group, one group with correlated sampling, and multigroup. Different sequences have been developed to use nuclear data libraries stored in different formats: ENDF-6 (for the evaluated libraries), COVERX (for the SCALE multigroup libraries) and EAF (for the activation libraries). The review of the state of the art of fission yield data identified a lack of uncertainty information, specifically of complete covariance matrices. In addition, given the renewed interest of the international community, expressed through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), dedicated to assessing nuclear data improvement needs, a review of the methodologies for generating covariance data has been carried out.
The Bayesian/GLS updating approach was selected for implementation, thereby addressing the lack of complete covariance matrices for fission yields. Once the Hybrid Method had been implemented, developed and extended, together with the capability to generate complete covariance matrices for fission yields, different nuclear applications were studied. First, the decay heat after a fission pulse is studied, owing to its importance for any event after reactor shutdown/trip and because it is a clear exercise for showing the importance of decay data and fission yield uncertainties together with the new complete covariance matrices. Two fuel cycles of advanced reactors have been studied: that of the European Facility for Industrial Transmutation (EFIT) and that of the European Sodium Fast Reactor (ESFR), analysing the impact of nuclear data uncertainties on the isotopic composition, decay heat and radiotoxicity. Different nuclear data libraries have been used in these studies, thereby comparing the impact of their uncertainties. These studies also serve to compare the different approaches of the Hybrid Method with other methodologies for propagating nuclear data uncertainties: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons demonstrate the advantages of the Hybrid Method, as well as revealing its limitations and its range of application.

ABSTRACT

For an adequate assessment of safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations can predict critical response functions during operation and in the event of an accident, such as decay heat and the neutron multiplication factor. Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. Nevertheless, it is necessary to understand the current status of nuclear data and their uncertainties, in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse/process/produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies/codes are being developed and implemented for using and evaluating the impact of uncertainty data. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided a framework for the development of this PhD Thesis. Accordingly, first a review of the state-of-the-art of nuclear data and their uncertainties is conducted, focusing on the three kinds of data: decay, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code, and is based on Monte Carlo sampling of variables with uncertainties. Different approaches are presented depending on the cross-section energy-group structure: one-group, one-group with correlated sampling and multi-group. Differences and applicability criteria are presented. Sequences have been developed for using different nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries) and COVERX (for multi-group libraries of SCALE), as well as EAF format (for activation libraries). A revision of the state-of-the-art of fission yield data shows inconsistencies in uncertainty data, specifically with regard to complete covariance matrices. Furthermore, the international community has expressed a renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), which is dedicated to assessing the need for complete nuclear data. This gives rise to the present review of the state-of-the-art of methodologies for generating covariance data for fission yields. A Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to answer this need. Once the Hybrid Method has been implemented, developed and extended, along with the fission yield covariance generation capability, different applications are studied. The Fission Pulse Decay Heat problem is tackled first because of its importance during events after shutdown and because it is a clean exercise for showing the impact and importance of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and response function uncertainties such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparing it with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
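As a conceptual sketch only of the Monte Carlo sampling idea behind the Hybrid Method (not the ACAB implementation, and with entirely hypothetical yields, decay constants and covariances), nuclear data are drawn from a multivariate normal distribution defined by a covariance matrix, a toy decay-heat response is evaluated for each sample, and the spread of the responses estimates the propagated uncertainty.

```python
# Monte Carlo propagation of (hypothetical) fission-yield uncertainties through
# a toy decay-heat response function.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical best-estimate fission yields and a small relative covariance matrix.
yields_mean = np.array([0.060, 0.030, 0.015])
rel_cov = np.array([[0.0004, 0.0001, 0.0000],
                    [0.0001, 0.0009, 0.0002],
                    [0.0000, 0.0002, 0.0016]])
cov = rel_cov * np.outer(yields_mean, yields_mean)   # absolute covariance

decay_const = np.array([1e-3, 5e-4, 1e-4])   # 1/s, hypothetical
q_decay = np.array([1.2, 0.8, 2.0])          # MeV per decay, hypothetical

def decay_heat(yields, t):
    """Toy fission-pulse decay heat at time t for independently decaying nuclides."""
    return np.sum(yields * q_decay * decay_const * np.exp(-decay_const * t))

samples = rng.multivariate_normal(yields_mean, cov, size=5000)
responses = np.array([decay_heat(s, t=600.0) for s in samples])
print(f"decay heat: {responses.mean():.3e} +/- {responses.std():.3e} (arbitrary units)")
```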

Relevance:

100.00%

Publisher:

Abstract:

Since the 1980s, the literature on economic development has paid attention to the cases of countries that industrialized after the first industrial revolution. One of the most relevant aspects analyzed has been the role of technology as a factor that promotes or delays the process of catching up with the technology leaders. As a result of this interest, new and more suitable indicators were identified to provide a coherent explanation of technological activities and their relationship with economic efficiency. Although the earliest studies focused on analyzing research and development (R&D) activities, the focus of analysis has recently shifted to other types of variables, more oriented towards the processes of innovation and the accumulation of knowledge and capabilities, for which patents provide relevant information.