33 results for intergovernmental transfers
Abstract:
The Intergovernmental Panel on Climate Change (IPCC) has produced four assessment reports since 1990, along with a number of special reports and greenhouse gas inventory guidelines. It has rigorous and robust procedures and guidelines for preparing the assessment reports, largely based on synthesis of peer-reviewed, published scientific literature. The IPCC has attracted controversy since the Second Assessment Report of 1995. The recent controversies surrounding the IPCC reports surfaced nearly two years after the release of the 2007 report, especially in the wake of the crucial Copenhagen climate convention. Many of the controversies can be traced to the use of information sourced from reports published outside scientific journals, such as reports of the World Wildlife Fund. It is true that there are a few errors in the IPCC reports, which may have escaped the multilayered, rigorous review process. Many of the errors found in the main reports, which run to over a thousand pages each, are not quoted in the crucial and most widely cited Summary for Policymakers. The IPCC may have to develop a more robust policy for sourcing literature published outside scientific journals. The United Nations Secretary-General has requested the prestigious InterAcademy Council to review the IPCC principles, procedures and guidelines. The controversies raised in the recent past do not in any way change the main conclusions of the IPCC assessment reports.
Abstract:
A simple three-step procedure was used to purify microsomal NADH-cytochrome b5 (ferricyanide) reductase to homogeneity from the higher plant C. roseus. The microsome-bound reductase was solubilized using the zwitterionic detergent CHAPS. The solubilized reductase was subjected to affinity chromatography on octylamino-Sepharose 4B, Blue 2-Sepharose CL-6B and NAD+-agarose. The homogeneous enzyme has an apparent molecular weight of 33,000 as estimated by SDS-PAGE. The purified enzyme catalyzes the reduction of purified cytochrome b5 from C. roseus in the presence of NADH. The reductase also readily transfers electrons from NADH to ferricyanide (Km 56 μM), 2,6-dichlorophenolindophenol (Km 65 μM) and cytochrome c via cytochrome b5, but not to menadione.
Abstract:
EcoP15I DNA methyltransferase recognizes the sequence 5'-CAGCAG-3' and transfers a methyl group to N-6 of the second adenine residue in the recognition sequence. All N-6 adenine methyltransferases contain two highly conserved sequences: FxGxG (motif I), postulated to form part of the S-adenosyl-L-methionine binding site, and (D/N/S)PP(Y/F) (motif IV), involved in catalysis. Using site-directed mutagenesis, we altered the second glycine residue in motif I of EcoP15I DNA methyltransferase to arginine and serine, and substituted the tyrosine in motif IV with tryptophan. The mutant enzymes were overexpressed, purified and characterized by biochemical methods. The mutations in motif I completely abolished AdoMet binding but left target DNA recognition unaltered. Although the mutation in motif IV resulted in loss of enzyme activity, we observed enhanced crosslinking of S-adenosyl-L-methionine and DNA. This implies that the DNA and AdoMet binding sites are close to motif IV. Taken together, these results reinforce the importance of motif I in AdoMet binding and of motif IV in catalysis. Additionally, limited proteolysis and UV crosslinking experiments with EcoP15I DNA methyltransferase imply that DNA binds in a cleft formed by two domains in the protein. Methylation protection analysis provides evidence that EcoP15I DNA MTase makes contacts in the major groove of its substrate DNA. Interestingly, hypermethylation of the guanine residue next to the target adenine indicates that the protein probably flips out the target adenine residue.
Abstract:
We describe the design of a directory-based shared memory architecture on a hierarchical network of hypercubes. The distributed directory scheme comprises two separate hierarchical networks for handling cache requests and transfers. Further, the scheme assumes a single address space and each processing element views the entire network as contiguous memory space. The size of individual directories stored at each node of the network remains constant throughout the network. Although the size of the directory increases with the network size, the architecture is scalable. The results of the analytical studies demonstrate superior performance characteristics of our scheme compared with those of other schemes.
Abstract:
We develop analytical models for estimating the energy spent by stations (STAs) in infrastructure WLANs when performing TCP-controlled file downloads. We focus on the energy spent in radio communication when the STAs are in the Continuously Active Mode (CAM) or in the static Power Save Mode (PSM). Our approach is to develop accurate models for obtaining the fractions of time the STA radios spend idling, receiving and transmitting. We consider two traffic models for each mode of operation: (i) each STA performs one large file download, and (ii) the STAs perform short file transfers. We evaluate the rate of STA energy expenditure with long file downloads, and show that static PSM is worse than simply using CAM. For short file downloads we compute the number of downloads that can be completed with a given battery capacity, and show that PSM performs better than CAM in this case. We validate our analytical models using the NS-2 simulator.
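As a rough illustration of the modeling idea in the abstract above, the STA's average radio power can be written as a weighted sum of the fractions of time the radio spends idling, receiving and transmitting. The power values and time fractions below are illustrative assumptions, not measurements or results from the paper:

```python
# Rough sketch: average STA radio power as a weighted sum of per-state time
# fractions. Power draws (watts) and fractions are illustrative assumptions.

def energy_rate(frac_idle, frac_rx, frac_tx, p_idle=0.9, p_rx=1.3, p_tx=1.6):
    """Average power (watts) given time fractions and per-state power draws."""
    assert abs(frac_idle + frac_rx + frac_tx - 1.0) < 1e-9
    return frac_idle * p_idle + frac_rx * p_rx + frac_tx * p_tx

def downloads_per_battery(battery_joules, joules_per_download):
    """How many short file downloads a given battery capacity can sustain."""
    return battery_joules // joules_per_download

# A CAM station doing a long TCP download mostly idles or receives:
cam_power = energy_rate(frac_idle=0.7, frac_rx=0.25, frac_tx=0.05)
print(round(cam_power, 3))                 # -> 1.035 (watts)
print(downloads_per_battery(10000, 12.5))  # -> 800.0
```

Because idle power is a large fraction of receive power in typical radios, a mode that merely reshuffles idle time (as static PSM can under TCP downloads) need not save energy, which is consistent with the abstract's CAM-vs-PSM finding.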
Abstract:
The poor performance of TCP over multi-hop wireless networks is well known. In this paper we explore to what extent network coding can help to improve the throughput of TCP-controlled bulk transfers over a chain-topology multi-hop wireless network. The nodes use a CSMA/CA mechanism, such as IEEE 802.11's DCF, to perform distributed packet scheduling. The reverse-flowing TCP ACKs are XORed with forward-flowing TCP data packets. We find that, without any modification to the MAC protocol, the gain from network coding is negligible: the inherent coordination problem of carrier-sensing-based random access in multi-hop wireless networks dominates the performance. We provide a theoretical analysis that yields a throughput bound with network coding. We then propose a distributed modification of the IEEE 802.11 DCF, based on tuning the back-off mechanism using a feedback approach. Simulation studies show that the proposed mechanism, when combined with network coding, improves the performance of a TCP session by more than 100%.
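The XOR coding step described above can be sketched in a few lines: a relay XORs a forward-flowing TCP data packet with a reverse-flowing TCP ACK into a single transmission, and each endpoint recovers the packet meant for it by XORing the coded packet with the packet it already knows. The packet contents below are illustrative placeholders, not real TCP segments:

```python
# Minimal sketch of XOR network coding at a relay node. Payloads are
# illustrative placeholders, not real TCP segments.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # Pad the shorter packet with zero bytes so lengths match.
    n = max(len(a), len(b))
    a, b = a.ljust(n, b"\x00"), b.ljust(n, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

data_pkt = b"TCP DATA seq=1000"  # forward direction
ack_pkt = b"TCP ACK 1000"        # reverse direction

coded = xor_bytes(data_pkt, ack_pkt)  # the relay sends one coded packet

# The ACK's sender already knows ack_pkt, so it recovers the data packet;
# the data's sender already knows data_pkt, so it recovers the ACK.
recovered_data = xor_bytes(coded, ack_pkt)[:len(data_pkt)]
recovered_ack = xor_bytes(coded, data_pkt)[:len(ack_pkt)]

assert recovered_data == data_pkt and recovered_ack == ack_pkt
```

One coded transmission replaces two separate ones, which is where the potential throughput gain comes from; as the abstract notes, however, the MAC-layer coordination problem limits how often such coding opportunities actually arise without back-off tuning.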
Abstract:
In this paper, we consider the problem of association of wireless stations (STAs) with an access network served by a wireless local area network (WLAN) and a 3G cellular network. There is a set of WLAN Access Points (APs), a set of 3G Base Stations (BSs), and a number of STAs, each of which needs to be associated with one of the APs or one of the BSs. We concentrate on downlink bulk elastic transfers. Each association provides each STA with a certain transfer rate. We evaluate an association on the basis of the sum log utility of the transfer rates and seek the utility-maximizing association. We also obtain the optimal time scheduling of service from a 3G BS to its associated STAs. We propose a fast iterative heuristic algorithm to compute an association. Numerical results show that our algorithm converges in a few steps, yielding an association that is within 1% (in objective value) of the optimum obtained through exhaustive search; in most cases the algorithm yields an optimal solution.
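To make the objective concrete, here is a hypothetical greedy sketch of this kind of association problem: attach each STA, one at a time, to the AP or BS that currently offers the best log transfer rate, assuming an attachment point splits its rate equally among its associated STAs. The rates and names are made up, and the paper's actual iterative heuristic differs in its details:

```python
# Hypothetical greedy sketch of log-utility-based STA association.
# peak_rate[p][s] = rate STA s would get if associated alone at point p.
# All topology and rate values are illustrative assumptions.
import math

def greedy_associate(stas, points, peak_rate):
    load = {p: 0 for p in points}  # STAs currently attached to each point
    assoc = {}
    for s in stas:
        # Rate is shared equally, so joining point p yields peak/(load + 1);
        # pick the point maximizing the STA's log transfer rate.
        best = max(points, key=lambda p: math.log(peak_rate[p][s] / (load[p] + 1)))
        assoc[s] = best
        load[best] += 1
    return assoc

peak = {"AP1": {"s1": 54.0, "s2": 54.0}, "BS1": {"s1": 2.0, "s2": 2.0}}
print(greedy_associate(["s1", "s2"], ["AP1", "BS1"], peak))
```

The log utility makes the objective proportionally fair: adding one more STA to an already loaded point costs more utility than adding it to a lightly loaded one, which is what drives load balancing across APs and BSs.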
Abstract:
A two-stage methodology is developed to obtain future projections of daily relative humidity (RH) in a river basin for climate change scenarios. In the first stage, Support Vector Machine (SVM) models are developed to downscale nine sets of predictor variables (large-scale atmospheric variables) to RH in a river basin at the monthly scale, for the Intergovernmental Panel on Climate Change Special Report on Emissions Scenarios (SRES) scenarios A1B, A2, B1 and COMMIT. Uncertainty in the future projections of RH is studied across combinations of SRES scenarios and selected predictors. Subsequently, in the second stage, the monthly sequences of RH are disaggregated to the daily scale using the k-nearest neighbor method. The effectiveness of the developed methodology is demonstrated through application to the catchment of the Malaprabha reservoir in India. For downscaling, the probable predictor variables are extracted from (1) the National Centers for Environmental Prediction reanalysis data set for the period 1978-2000 and (2) simulations of the third-generation Canadian Coupled Global Climate Model for the period 1978-2100. The performance of the downscaling and disaggregation models is evaluated by split-sample validation. Results show that, among the SVM models, the model developed using predictors pertaining only to the land location performed better. RH is projected to increase in the future for the A1B and A2 scenarios, while no trend is discerned for B1 and COMMIT.
Abstract:
Climate projections for the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) are made using the newly developed representative concentration pathways (RCPs) under the Coupled Model Intercomparison Project Phase 5 (CMIP5). This article provides multi-model, multi-scenario temperature and precipitation projections for India for the period 1860-2099 based on the new climate data. We find that the CMIP5 ensemble mean climate is closer to the observed climate than that of any individual model. The key findings of this study are: (i) under the business-as-usual (between RCP6.0 and RCP8.5) scenario, mean warming in India is likely to be in the range 1.7-2 degrees C by the 2030s and 3.3-4.8 degrees C by the 2080s relative to pre-industrial times; (ii) all-India precipitation under the business-as-usual scenario is projected to increase by 4-5% by the 2030s and by 6-14% towards the end of the century (2080s) compared to the 1961-1990 baseline; (iii) while precipitation projections are generally less reliable than temperature projections, model agreement in precipitation projections increases from RCP2.6 to RCP8.5 and from short- to long-term projections, indicating that long-term precipitation projections are generally more robust than their short-term counterparts; and (iv) there is a consistent positive trend in the frequency of extreme precipitation days (e.g. > 40 mm/day) for the 2060s and beyond. These new climate projections should be used in future assessments of climate change impacts and in adaptation planning. There is a need to consider not just the mean climate projections but also the more important extreme projections, both in impact studies and in adaptation planning.
Abstract:
The impact of global warming on daily rainfall is examined using atmospheric variables from five General Circulation Models (GCMs) and a stochastic downscaling model. Daily rainfall at eleven rain gauges over the Malaprabha catchment of India and National Centers for Environmental Prediction (NCEP) reanalysis data at grid points over the catchment for the continuous period 1971-2000 (current climate) are used to calibrate the downscaling model. The downscaled rainfall simulations obtained using GCM atmospheric variables corresponding to the IPCC-SRES (Intergovernmental Panel on Climate Change - Special Report on Emissions Scenarios) A2 emission scenario for the same period are used to validate the results. Following this, future downscaled rainfall projections are constructed and examined for two 20-year time slices, viz. 2055 (2046-2065) and 2090 (2081-2100). The model shows reasonable skill in simulating the rainfall over the study region for the current climate. The downscaled rainfall projections indicate no significant changes in the rainfall regime of this catchment in the future. More specifically, decreases of 2% by 2055 and 5% by 2090 in monsoon (JJAS) rainfall, compared to the current climate (1971-2000), are noticed under global warming conditions. Pre-monsoon (JFMAM) rainfall is projected to increase by 2% in 2055 and 6% in 2090, and post-monsoon (OND) rainfall by 2% in 2055 and 12% in 2090, over the region. On an annual basis, slight decreases of 1% and 2% are noted for 2055 and 2090, respectively.
Abstract:
We present a centralized, integrated approach for: 1) enhancing the performance of an IEEE 802.11 infrastructure wireless local area network (WLAN), and 2) managing the access link that connects the WLAN to the Internet. Our approach, which is implemented on a standard Linux platform and which we call ADvanced Wi-fi Internet Service EnhanceR (ADWISER), is an extension of our previous system, WLAN Manager (WM). ADWISER addresses several infrastructure WLAN performance anomalies, such as mixed-rate inefficiency, unfair medium sharing between simultaneous TCP uploads and downloads, and inefficient utilization of the Internet access bandwidth when Internet transfers compete with LAN-WLAN transfers. The approach uses centralized queueing and scheduling, with a novel, configurable, cascaded packet queueing and scheduling architecture and an adaptive service rate. In this paper, we describe the design of ADWISER and report results of extensive experimentation conducted on a hybrid testbed consisting of real end-systems and an emulated WLAN on Qualnet. We also present results from a physical testbed consisting of one access point (AP) and a few end-systems.
Abstract:
Empirical research available on technology transfer initiatives is either North American or European. The literature of the last two decades shows various research objectives, such as identifying the variables to be measured and the statistical methods to be used in studying university-based technology transfer initiatives. AUTM survey data from 1996 to 2008 provide insightful patterns about North American technology transfer initiatives, and we use these data in our paper. This paper has three sections: a comparison of North American universities with (n=1129) and without medical schools (n=786), an analysis of the top 75th percentile of these samples, and a DEA analysis of these samples. We use 20 variables. Researchers have attempted to classify university-based technology transfer variables into multiple stages, namely disclosures, patents and license agreements. Using the same approach, with minor variations, three stages are defined in this paper. The first stage takes R&D expenditure as input and invention disclosures as output. The second stage takes invention disclosures as input and patents issued as output. The third stage takes patents issued as input and technology transfers as outcomes.
Abstract:
Exploiting the performance potential of GPUs requires managing the data transfers to and from them efficiently, which is an error-prone and tedious task. In this paper, we develop a software coherence mechanism to fully automate all data transfers between the CPU and GPU without any assistance from the programmer. Our mechanism uses compiler analysis to identify potential stale accesses and uses a runtime to initiate transfers as necessary. This allows us to avoid the redundant transfers exhibited by all other existing automatic memory management proposals. We integrate our automatic memory manager into the X10 compiler and runtime, and find that it not only results in smaller and simpler programs, but also eliminates redundant memory transfers. Tested on eight programs ported from the Rodinia benchmark suite, it achieves (i) a 1.06x speedup over hand-tuned manual memory management, and (ii) a 1.29x speedup over another recently proposed compiler-runtime automatic memory management system. Compared to other existing runtime-only and compiler-only proposals, it also transfers 2.2x to 13.3x less data on average.
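The coherence idea above can be illustrated with a toy model: track which side (CPU "host" or GPU "device") holds a valid copy of a buffer, and issue a transfer only when the requested side is stale. This mirrors the stale-access analysis in spirit only; the actual system works at compile time inside the X10 compiler and runtime, and all names here are illustrative:

```python
# Toy runtime-only coherence tracker: transfers are simulated by list copies,
# and counted so redundant transfers are visible. Illustrative sketch only.

class CoherentBuffer:
    def __init__(self, data):
        self.host, self.device = list(data), None
        self.valid = {"host": True, "device": False}
        self.transfers = 0  # count of host<->device copies actually issued

    def _sync(self, side):
        if not self.valid[side]:  # stale copy on this side: transfer now
            self.transfers += 1
            if side == "device":
                self.device = list(self.host)
            else:
                self.host = list(self.device)
            self.valid[side] = True

    def read(self, side):
        self._sync(side)
        return self.host if side == "host" else self.device

    def write(self, side, data):
        # A full overwrite needs no sync; it simply invalidates the other side.
        if side == "host":
            self.host = list(data)
        else:
            self.device = list(data)
        self.valid = {"host": side == "host", "device": side == "device"}

buf = CoherentBuffer([1, 2, 3])
buf.read("device")    # first device access: one transfer
buf.read("device")    # valid copy cached on device: no redundant transfer
print(buf.transfers)  # -> 1
```

A naive manager would copy on every access; tracking validity is what eliminates the redundant transfers the abstract refers to, with the compiler analysis deciding at which program points such checks are needed.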
Abstract:
The two-phase Brust-Schiffrin method (BSM) is used to synthesize highly stable nanoparticles of noble metals. A phase transfer catalyst (PTC) is used to bring aqueous-phase-soluble precursors into the organic phase to enable particle synthesis there. Two different mechanisms for phase transfer are advanced in the literature. The first mechanism considers the PTC to bring in an aqueous-phase-soluble precursor by complexing with it. The second mechanism considers the ionic species to be contained in inverse micelles of the PTC, with a water core inside. A comprehensive experimental study, involving measurement of interfacial tension, viscosity, water content by Karl Fischer titration, static light scattering, H-1 NMR, and small-angle X-ray scattering, is reported in this work to establish that the phase transfer catalyst tetraoctylammonium bromide transfers ions by complexing with them, rather than by encapsulating them in inverse micelles. The findings have implications for particle synthesis in two-phase methods such as the BSM and their modification to produce more monodisperse particles.
Abstract:
In Mycobacterium tuberculosis, the Rv1027c-Rv1028c genes are predicted to encode the KdpDE two-component system, which is highly conserved across bacterial species. Here, we show that the system is functionally active: the KdpD sensor kinase undergoes autophosphorylation and transfers the phosphoryl group to the response regulator protein KdpE. We identified His(642) and Asp(52) as the conserved phosphorylation sites in KdpD and KdpE, respectively, and confirmed the physical interaction between the two proteins by SPR analysis. KdpD was purified with prebound divalent ions, and their importance in phosphorylation was established using protein refolding and ion chelation approaches. Genetically, a single transcript encodes both the KdpD and KdpE proteins. Overall, we report that the M. tuberculosis KdpDE system operates like a canonical two-component system.