44 results for Reciprocal patchiness of resources

at Indian Institute of Science - Bangalore - India


Relevance: 100.00%

Publisher:

Abstract:

Unlike the invertases from mesophilic fungi and yeasts, invertase from a thermophilic fungus, Thermomyces lanuginosus, was unusually unstable both in vivo and in vitro. The following observations suggested that the unstable nature of the enzyme activity in the cell-free extracts was due to the oxidation of the cysteine residue(s) in the enzyme molecule: (a) the addition of dithiothreitol or reduced glutathione stabilized invertase activity during storage of the extracts and also revived enzyme activity in extracts which had become inactive with time; (b) N-ethylmaleimide, iodoacetamide, oxidized glutathione, cystine, or oxidized coenzyme A inactivated invertase; (c) invertase activity was low when the ratio of reduced to oxidized glutathione was low and high when this ratio was high, suggesting regulation of the enzyme by a thiol/disulfide exchange reaction. In contrast to the activation of invertase by the thiol compounds and its inactivation by the disulfides in the cell-free extracts, the purified enzyme did not respond to these compounds. Following its inactivation, the purified enzyme required a helper protein in addition to dithiothreitol for maximal activation. A cellular protein that promoted activation of invertase by dithiothreitol was identified and named “PRIA”, for the protein which helps in restoring invertase activity. The revival of enzyme activity was due to the conversion of inactive invertase molecules into an active form. A model is presented to explain the modulation of invertase activity by the thiol compounds and the disulfides, both in the crude cell-free extracts and in the purified preparations. The requirement of free sulfhydryl group(s) for enzyme activity and, furthermore, the reciprocal effects of the thiols and the disulfides on invertase activity have not been reported for invertase from any other source. The finding of a novel invertase which shows a distinct mode of regulation demonstrates the diversity in an enzyme that has figured prominently in the development of biochemistry.


Polyembryony, referring here to situations where a nucellar embryo is formed along with the zygotic embryo, has different consequences for the fitness of the maternal parent and the offspring. We have developed genetic and inclusive fitness models to derive the conditions that permit the evolution of polyembryony under maternal and offspring control. We have also derived expressions for the optimal allocation (evolutionarily stable strategy, ESS) of resources between zygotic and nucellar embryos. It is seen that: (i) Polyembryony can evolve more easily under maternal control than under that of either the offspring or the ‘selfish’ endosperm. Under maternal regulation, the evolution of polyembryony can occur for any clutch size; under offspring control, polyembryony is more likely to evolve for high clutch sizes and is unlikely for low clutch sizes (<3). This conflict between mother and offspring decreases with increasing clutch size and favours the evolution of polyembryony at high clutch sizes. (ii) Polyembryony can evolve for values of “x” (the power of the function relating fitness to seed resource) greater than 0.5758; the possibility of its occurrence increases with “x”, indicating that a more efficient conversion of resources into fitness favours polyembryony. (iii) Under both maternal and offspring control, the evolution of polyembryony becomes increasingly unlikely as the level of inbreeding increases. (iv) The proportion of resources allocated to the nucellar embryo at the ESS is always higher than that which maximizes the rate of spread of the allele against a non-polyembryonic allele. Finally, we argue that polyembryony is a maternal counter-strategy to compensate for the loss in her fitness due to brood reduction caused by sibling rivalry. We support this assertion with two lines of empirical evidence: (a) the extent of polyembryony is positively correlated with brood reduction in Citrus, and (b) species exhibiting polyembryony are more often those that frequently exhibit brood reduction.


A new class of nets, called S-nets, is introduced for the performance analysis of scheduling algorithms used in real-time systems. Deterministic timed Petri nets do not adequately model the scheduling of resources encountered in real-time systems, and need to be augmented with resource places, signal places, and a scheduler block to facilitate the modeling of scheduling algorithms. The tokens are colored, and the transition firing rules are suitably modified. Further, the concept of transition folding is used to obtain intuitively simple models of multiframe real-time systems. Two generic performance measures, called “load index” and “balance index,” which characterize the resource utilization and the uniformity of workload distribution, respectively, are defined. The utility of S-nets for evaluating heuristic-based scheduling schemes is illustrated by considering three heuristics for real-time scheduling. S-nets are useful in tuning the hardware configuration and the underlying scheduling policy so that system utilization is maximized and the workload distribution among the computing resources is balanced.
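The abstract does not define the two measures formally, so the sketch below uses placeholder definitions purely for illustration: load index as mean resource utilization, and balance index as the min-to-max utilization ratio (1.0 meaning a perfectly uniform workload). Both definitions are assumptions, not the paper's.

```python
# Illustrative placeholder metrics for resource utilization and workload
# balance; the paper's exact definitions are not given in the abstract.
from statistics import mean

def load_index(utilizations):
    """Assumed definition: average busy fraction across all resources."""
    return mean(utilizations)

def balance_index(utilizations):
    """Assumed definition: min/max utilization; 1.0 = perfectly balanced."""
    return min(utilizations) / max(utilizations)

# Hypothetical utilizations of three processors under some heuristic schedule
util = [0.80, 0.65, 0.95]
print(round(load_index(util), 2))     # 0.8
print(round(balance_index(util), 2))  # 0.68
```

A scheduler tuned as the abstract suggests would try to raise the first number while driving the second toward 1.0.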


Two dhole (Cuon alpinus) packs were monitored in Mudumalai Sanctuary, southern India, during 1989-93 to examine population dynamics, movement patterns, and foraging strategy, and their inter-relationship with the maintenance of social groups. Pack size fluctuated substantially (4-18 and 4-25 in the two packs) owing to dispersal and demographic factors such as females not breeding in a given year. Both packs killed a much higher proportion of chital (Axis axis) and sambar (Cervus unicolor) fawns (< one year old) than expected from their availability in the population. There was no correlation between pack size and body weight of prey killed, while per capita consumption of meat declined with increasing pack size. Home-range area (83.3 km(2) and 54.2 km(2) for the two packs) was not correlated with pack size. Pack movement from one resource patch (consisting of resting sites and aggregations of prey species) to another was neither random nor based on factors such as inter-patch distance or relative prey densities. There was no difference in mean residence time of the pack across the four resource patches; the pack moved across these sequentially in one direction. We conclude that dholes live in groups not because of any advantages accruing from enhanced group size through increased per capita yield of food, but as a consequence of the dispersion of resources.


Representatives of several Internet access providers have expressed their wish to see a substantial change in the pricing policies of the Internet. In particular, they would like to see content providers pay for use of the network, given the large amount of resources they use. This would be in clear violation of the “network neutrality” principle that has characterized the development of the wireline Internet. Our first goal in this paper is to propose and study possible ways of implementing such payments and of regulating their amount. We introduce a model that includes the internauts' behavior, the utilities of the ISP and of the content providers, and the monetary flow that involves the internauts, the ISP and the content provider, in particular the content provider's revenues from advertisements. We consider various game models and study the resulting equilibria; they are all combinations of a noncooperative game (in which the service and content providers determine how much they will charge the internauts) with a cooperative one, in which the content provider and the service provider bargain with each other over payments to one another. We include in our model a possible asymmetric bargaining power, represented by a parameter that varies between zero and one. We then extend our model to study the case of several content providers. We also provide a very brief study of the equilibria that arise when one of the content providers enters into an exclusive contract with the ISP.
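The asymmetric bargaining step can be sketched as follows; `gamma` is our name for the paper's asymmetry parameter, and the payoff structure below is a deliberately simplified stand-in for the paper's utility model, not its actual formulation.

```python
# Sketch of an asymmetric Nash bargaining split between an ISP and a content
# provider (CP). The split maximizes (u_isp)^gamma * (u_cp)^(1-gamma) over
# transfers, which gives the ISP a fraction gamma of the net surplus.
# All names and numbers here are illustrative assumptions.

def asymmetric_nash_split(surplus, gamma, d_isp=0.0, d_cp=0.0):
    """Return (ISP payoff, CP payoff) when splitting `surplus`.
    d_isp, d_cp are disagreement payoffs; gamma in [0, 1] is the ISP's
    bargaining power. Maximizing (x - d_isp)**gamma * (surplus - x - d_cp)**(1 - gamma)
    yields x = d_isp + gamma * (surplus - d_isp - d_cp)."""
    net = surplus - d_isp - d_cp
    return d_isp + gamma * net, d_cp + (1 - gamma) * net

isp_payoff, cp_payoff = asymmetric_nash_split(10.0, gamma=0.7)
print(round(isp_payoff, 6), round(cp_payoff, 6))
```

With `gamma = 0.5` the split reduces to the symmetric Nash bargaining solution; the two extremes 0 and 1 give one side the entire net surplus.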


Growing concern over the status of global and regional bioenergy resources has necessitated the analysis and monitoring of land cover and land use parameters on spatial and temporal scales. Knowledge of land cover and land use is very important in understanding natural resource utilization, conversion and management. Land cover, land use intensity and land use diversity are land quality indicators for sustainable land management. Optimal management of resources aids in maintaining the ecosystem balance and thereby ensures the sustainable development of a region. Sustainable development of a region thus requires a synoptic ecosystem approach to the management of natural resources that relates the dynamics of natural variability and the effects of human intervention to key indicators of biodiversity and productivity. Spatial and temporal tools such as remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS) provide spatial and attribute data at regular intervals, and the functionalities of a decision support system (visualisation, querying, analysis, etc.) aid in the sustainable management of natural resources. Remote sensing data and GIS technologies play an important role in spatially evaluating bioresource availability and demand. This paper explores various land cover and land use techniques that could be used for bioresource monitoring, considering the spatial data of Kolar district, Karnataka state, India. Slope-based and distance-based vegetation indices are computed for qualitative and quantitative assessment of land cover using remote spectral measurements. Mapping of the land use pattern in Kolar district at different scales is done using supervised classification approaches. Slope-based vegetation indices show the area under vegetation ranging from 47.65% to 49.05%, while distance-based vegetation indices show a range from 40.40% to 47.41%. Land use analyses using the maximum likelihood classifier indicate that 46.69% of the area is agricultural land, 42.33% is wasteland (barren land), 4.62% is built up, 3.07% is plantation, 2.77% is natural forest and 0.53% is water bodies. A comparative analysis of various classifiers indicates that the Gaussian maximum likelihood classifier has the fewest errors. The computation of taluk-wise bioresource status shows that Chikballapur taluk has better availability of resources compared to other taluks in the district.
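As a hedged illustration of the two index families named above (not necessarily the exact indices used in the paper): NDVI is a common slope-based index, while the Perpendicular Vegetation Index (PVI) is distance-based, measuring the offset from an assumed soil line. The reflectance values and soil-line parameters below are made up.

```python
import numpy as np

# Slope-based vs. distance-based vegetation indices, computed per pixel from
# red and near-infrared (NIR) reflectances. All values here are hypothetical.
red = np.array([0.12, 0.30, 0.08])   # hypothetical red-band reflectances
nir = np.array([0.45, 0.35, 0.50])   # hypothetical NIR-band reflectances

# Slope-based: NDVI, ratio of band difference to band sum, in [-1, 1]
ndvi = (nir - red) / (nir + red)

# Distance-based: PVI, perpendicular distance from a soil line NIR = a*RED + b
a, b = 1.2, 0.04                      # assumed soil-line slope and intercept
pvi = (nir - a * red - b) / np.sqrt(1 + a**2)

print(np.round(ndvi, 2))
print(np.round(pvi, 2))
```

Pixels above some NDVI or PVI threshold would be counted as vegetated, which is how per-district vegetation percentages like those reported above are typically derived.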


Freshwater ecosystems vary in size and composition and contain a wide range of organisms which interact with each other and with the environment. These interactions between organisms and the environment include nutrient cycling, biomass formation and transfer, maintenance of the internal environment and exchanges with the external environment. The range of organisms present in aquatic communities decides the generation and transfer of biomass, which defines and characterises the system. These organisms have distinct roles as they occupy particular trophic levels, forming an interconnected system in a food chain. Availability of resources and competition primarily determine the balance of individual species within the food web, which in turn influences the variety and proportions of the different organisms, with important implications for the overall functioning of the system. This dynamic and diverse relationship decides the physical, chemical and biological elements across spatial and temporal scales in the aquatic ecosystem, which can be recorded by regular inventorying and monitoring to maintain the integrity of and conserve the ecosystem. Regular environmental monitoring, particularly water quality monitoring, allows us to detect, assess and manage the overall impacts on rivers. The appreciation of water quality is in constant flux. Water quality assessments derived through biotic indices, i.e. assessments based on observations of the resident floral and faunal communities, have gained importance in recent years. Biological evaluations provide a description of the water quality that is often not achievable from elemental analyses alone. A biological indicator (or bioindicator) is a taxon or taxa selected based on its sensitivity to a particular attribute, and then assessed to make inferences about that attribute. In other words, bioindicators are a substitute for directly measuring abiotic features or other biota.
Bioindicators are evaluated through presence or absence, condition, relative abundance, reproductive success, community structure (i.e. composition and diversity), community function (i.e. trophic structure), or any combination thereof. Biological communities reflect overall ecological integrity by integrating various stresses, thus providing a broad measure of their synergistic impacts. Aquatic communities, both plants and animals, integrate and reflect the effects of chemical and physical disturbances that occur over extended periods of time. Monitoring procedures based on the biota measure the health of a river and the ability of aquatic ecosystems to support life, as opposed to simply characterising the chemical and physical components of a particular system. This is the central purpose of assessing the biological condition of the aquatic communities of a river. Diatoms (Bacillariophyceae), blue-green algae (Cyanophyceae), green algae (Chlorophyceae), and red algae (Rhodophyceae) are the main groups of algae in flowing water. These organisms are widely used as biological indicators of environmental health in aquatic ecosystems because algae occupy the most basic level in the transfer of energy through natural aquatic systems. The distribution of algae in an aquatic ecosystem is directly related to fundamental physical, chemical and biological constituents. Soft algae (all the algal groups except diatoms) have also been used as indicators of biological integrity, but they may be less efficient than diatoms in this respect due to their highly variable morphology. The diatoms (Bacillariophyceae) comprise a ubiquitous, highly successful and distinctive group of unicellular algae whose most obvious distinguishing characteristic is their siliceous cell walls (frustules). The photosynthetic organisms living within the photic zone are responsible for about one-half of global primary productivity.
The most successful organisms are thought to be photosynthetic prokaryotes (cyanobacteria and prochlorophytes) and a class of eukaryotic unicellular algae known as diatoms. Diatoms are likely to have arisen around 240 million years ago following an endosymbiotic event between a red eukaryotic alga and a heterotrophic flagellate related to the Oomycetes. The importance of algae to riverine ecology is easily appreciated when one considers that they are primary producers that convert inorganic nutrients into biologically active organic compounds while providing physical habitat for other organisms. As primary producers, algae transform solar energy into food from which many invertebrates obtain their energy. Algae also transform inorganic nutrients, such as atmospheric nitrogen, into organic forms such as ammonia and amino acids that can be used by other organisms. Algae stabilise the substrate and create mats that form structural habitats for fish and invertebrates. Algae are a source of organic matter and provide habitat for other organisms such as non-photosynthetic bacteria, protists, invertebrates, and fish. Algae's crucial role in stream ecosystems and their excellent indicator properties make them an important component of environmental studies assessing the effects of human activities on stream health. Diatoms are used as biological indicators for a number of reasons: 1. They occur in all types of aquatic ecosystems. 2. They collectively show a broad range of tolerance along a gradient of aquatic productivity, while individual species have specific water chemistry requirements. 3. They have one of the shortest generation times of all biological indicators (~2 weeks); they reproduce and respond rapidly to environmental change and provide early measures of both pollution impacts and habitat restoration. 4. It takes only two to three weeks before changes are reflected to a measurable extent in assemblage composition.


Social, economic and political development of a region depends on the health and quantity of its natural resources. Integrated approaches to the management of natural resources would ensure sustainability, which demands inventorying, mapping and monitoring of resources considering all components of an ecosystem. Monitoring the hydrological and catchment landscape of river resources has a vital role in the conservation and management of aquatic resources. This paper presents a case study of the Venkatapura river basin in Uttara Kannada district of Karnataka State, India, based on stream hydrology and land use analyses. The results revealed variations in dissolved oxygen and free carbon dioxide according to the flow characteristics of the water, and increased amounts of phosphates and coliform contamination in streams closer to anthropogenic activities.


In this paper we propose the architecture of an SoC fabric onto which applications described in a high-level language (HLL) are synthesized. The fabric is a homogeneous layout of computation, storage and communication resources on silicon. Through a process of composition of resources (as opposed to decomposition of applications), application-specific computational structures are defined on the fabric at runtime to realize different modules of the applications in hardware. Applications synthesized on this fabric offer performance comparable to ASICs while retaining the programmability of processing cores. We outline the application synthesis methodology through examples, and compare our results with software implementations on traditional platforms with unbounded resources.


It is being realized that the traditional closed-door and market-driven approaches to drug discovery may not be the best-suited model for diseases of the developing world such as tuberculosis and malaria, because most patients suffering from these diseases have poor paying capacity. To ensure that new drugs are created for patients suffering from these diseases, it is necessary to formulate an alternative paradigm for the drug discovery process. The current model, constrained by limitations on collaboration and on sharing of resources with confidentiality, hampers the opportunities for bringing in expertise from diverse fields. These limitations hinder the possibilities of lowering the cost of drug discovery. The Open Source Drug Discovery project initiated by the Council of Scientific and Industrial Research, India has adopted an open source model to power wide participation across geographical borders. Open Source Drug Discovery emphasizes integrative science through collaboration, open sharing, multi-faceted approaches and accruing benefits from advances on different fronts of new drug discovery. Because the open source model is based on community participation, it has the potential to self-sustain continuous development by generating a storehouse of alternatives in the continued pursuit of new drug discovery. Since the inventions are community generated, the new chemical entities developed by Open Source Drug Discovery will be taken up for clinical trials in a non-exclusive manner by the participation of multiple companies, with majority funding from Open Source Drug Discovery. This will ensure the availability of drugs through a lower-cost, community-driven drug discovery process for diseases afflicting people with poor paying capacity. Hopefully, what Linux and the World Wide Web have done for information technology, Open Source Drug Discovery will do for drug discovery. (C) 2011 Elsevier Ltd. All rights reserved.


The setting considered in this paper is one of distributed function computation. More specifically, there is a collection of N sources possessing correlated information and a destination that would like to acquire a specific linear combination of the N sources. We address both the case when the common alphabet of the sources is a finite field and the case when it is a finite, commutative principal ideal ring with identity. The goal is to minimize the total amount of information that needs to be transmitted by the N sources while enabling reliable recovery at the destination of the linear combination sought. One means of achieving this goal is for each of the sources to compress all the information it possesses and transmit this to the receiver. The Slepian-Wolf theorem of information theory governs the minimum rate at which each source must transmit while enabling all data to be reliably recovered at the receiver. However, recovering all the data at the destination is often wasteful of resources, since the destination is only interested in computing a specific linear combination. An alternative explored here is one in which each source is compressed using a common linear mapping and then transmitted to the destination, which then proceeds to use linearity to directly recover the needed linear combination. The article is part review and in part presents new results: the portion of the paper that deals with finite fields is previously known material, while that dealing with rings is mostly new. Attempting to find the best linear map that will enable function computation forces us to consider the linear compression of a source. While in the finite field case it is known that a source can be linearly compressed down to its entropy, it turns out that the same does not hold in the case of rings. An explanation for this curious interplay between algebra and information theory is also provided in this paper.
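The linearity the scheme relies on can be illustrated with a toy GF(2) example: if every source applies the same linear map A, the destination can add the compressed messages and obtain A applied to the desired sum, without recovering any individual source. The matrix below is an arbitrary random map, not an optimized code from the paper.

```python
import numpy as np

# Toy GF(2) illustration of common linear compression for sum computation.
# A is an assumed 3x5 compressor shared by both sources; x1, x2 are their data.
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(3, 5))            # common linear map over GF(2)
x1 = rng.integers(0, 2, size=5)                # source 1 data
x2 = rng.integers(0, 2, size=5)                # source 2 data

lhs = (A @ x1 + A @ x2) % 2                    # destination adds compressed messages
rhs = (A @ ((x1 + x2) % 2)) % 2                # A applied to the desired sum
print(np.array_equal(lhs, rhs))                # True: linearity over GF(2)
```

Each source here sends 3 bits instead of 5; the paper's question is how small that compressed dimension can be made while the sum stays decodable, and whether the finite-field answer carries over to rings (it does not, per the abstract).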


This paper presents a comparative evaluation of the average and switching models of a dc-dc boost converter from the point of view of real-time simulation. Both models are used to simulate the converter in real time on a Field Programmable Gate Array (FPGA) platform. The converter is considered to function over a wide range of operating conditions, and can transition between continuous conduction mode (CCM) and discontinuous conduction mode (DCM). While the average model is known to be computationally efficient from the perspective of off-line simulation, it is shown here to consume more logic resources than the switching model for real-time simulation of the dc-dc converter. Further, evaluation of the boundary condition between CCM and DCM is found to be the main reason for the increased resource consumption of the average model.
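As a sketch of the kind of boundary evaluation involved, the classical steady-state condition for an ideal boost converter compares K = 2L/(R*Ts) against Kcrit = D*(1-D)^2, with operation in CCM when K > Kcrit. This is the standard textbook condition, not necessarily the exact test implemented in the paper's real-time model.

```python
# Textbook CCM/DCM boundary check for an ideal boost converter.
# D: duty ratio, L: inductance, R: load resistance, Ts: switching period.
# Component values below are illustrative assumptions.

def is_ccm(L, R, Ts, D):
    """Return True if the converter operates in CCM: K > Kcrit."""
    K = 2 * L / (R * Ts)          # dimensionless conduction parameter
    K_crit = D * (1 - D) ** 2     # boundary value for the boost topology
    return K > K_crit

# Example: L = 100 uH, R = 50 ohm, fs = 100 kHz (Ts = 10 us), D = 0.4
print(is_ccm(100e-6, 50.0, 10e-6, 0.4))    # True: K = 0.4 > Kcrit = 0.144
print(is_ccm(100e-6, 1000.0, 10e-6, 0.4))  # False: light load pushes into DCM
```

Because K and Kcrit depend on the instantaneous operating point, an average model must re-evaluate this comparison every simulation step, which is consistent with the abstract's finding that the boundary test drives the extra logic consumption.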


A ubiquitous network plays a critical role in providing services to nodes running ubiquitous applications. To provide appropriate resources, the nodes need to be monitored continuously. Monitoring a node in a ubiquitous network is challenging because of the dynamicity and heterogeneity of the network. The network monitor has to track resource parameters, such as data rate, delay and throughput, as well as events such as node failure, network failure and faults in the system, in order to curb system failure. In this paper, we propose a method to develop a ubiquitous system monitoring protocol using agents. Earlier works on network monitoring using agents assume that the agents are designed for a particular network, whereas our work takes the heterogeneity of the network into account. We show that node behaviour can be easily monitored using agents (both static and mobile). The past behaviour of the application and network, and the past history of the Unode and its predecessor, are taken into consideration to help the SA take appropriate decisions in emergency situations, such as unavailability of resources at the local administration, and to predict the migration of the Unode based on previous node history. The results obtained in simulation reflect the effectiveness of the technique.


In this paper we present a framework for realizing arbitrary instruction set extensions (IEs) that are identified post-silicon. The proposed framework has two components, viz., an IE synthesis methodology and the architecture of a reconfigurable data-path for the realization of such IEs. The IE synthesis methodology ensures maximal utilization of resources on the reconfigurable data-path. In this context we present the techniques used to realize IEs for applications that demand high throughput or that must process data streams. The reconfigurable hardware, called HyperCell, comprises a reconfigurable execution fabric; the fabric is a collection of interconnected compute units. A typical use case of HyperCell is as a co-processor with a host, accelerating the execution of IEs that are defined post-silicon. We demonstrate the effectiveness of our approach by evaluating the performance of some well-known integer kernels realized as IEs on HyperCell. Our methodology for realizing IEs through HyperCells permits overlapping of potentially all memory transactions with computations. We show significant improvement in performance for streaming applications over general-purpose processor based solutions by fully pipelining the data-path. (C) 2014 Elsevier B.V. All rights reserved.


Representatives of several Internet service providers (ISPs) have expressed their wish to see a substantial change in the pricing policies of the Internet. In particular, they would like to see content providers (CPs) pay for use of the network, given the large amount of resources they use. This would be in clear violation of the “network neutrality” principle that has characterized the development of the wireline Internet. Our first goal in this article is to propose and study possible ways of implementing such payments and of regulating their amount. We introduce a model that includes the users' behavior, the utilities of the ISP and of the CPs, and the monetary flow that involves the content users, the ISP and the CP, in particular the CP's revenues from advertisements. We consider various game models and study the resulting equilibria; they are all combinations of a noncooperative game (in which the ISPs and CPs determine how much they will charge the users) with a “cooperative” one on how the CP and the ISP share the payments. We include in our model a possible asymmetric weighting parameter (that varies between zero and one). We also study equilibria that arise when one of the CPs colludes with the ISP. We also study two dynamic game models, as well as the convergence of prices to the equilibrium values.