8 results for Phasing plan
in Indian Institute of Science - Bangalore - India
Abstract:
The Government of India has announced the Greening India Mission (GIM) under the National Climate Change Action Plan. The Mission aims to restore and afforest about 10 Mha over the period 2010-2020 under different sub-missions covering moderately dense and open forests, scrub/grasslands, mangroves, wetlands, croplands and urban areas. Although the main focus of the Mission is to address mitigation and adaptation in the context of climate change, the adaptation component is inadequately addressed. There is a need for increased scientific input in the preparation of the Mission. The mitigation potential is estimated by simply multiplying global default biomass growth rates by area; this estimate is incomplete because it does not account for all the carbon pools, phasing, differing growth rates, etc. According to the Comprehensive Mitigation Analysis Process model, the GIM could offset 6.4% of the projected national greenhouse gas emissions in 2020, compared to the GIM's own estimate of only 1.5%, excluding any emissions due to harvesting or disturbances. The selection of potential locations for different interventions and the choice of species under the GIM must be based on modelling, remote sensing and field studies. The forest sector provides an opportunity to promote mitigation-adaptation synergy, which is not adequately addressed in the GIM. Since many of the proposed interventions are innovative and limited scientific knowledge exists, there is a need for an unprecedented level of collaboration between research institutions and implementing agencies such as the Forest Departments, which is currently non-existent. The GIM could propel systematic research into forestry and climate change issues and thereby provide global leadership in this new and emerging science.
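For illustration only, a minimal Python sketch of the simple area-times-growth-rate calculation criticized above; the growth rate and carbon fraction below are placeholder values, not GIM or IPCC figures, and a complete estimate would additionally account for the separate carbon pools, the phasing of planting, differing growth rates and losses from harvest or disturbance.

    def simple_mitigation_estimate(area_mha, growth_t_per_ha_yr, carbon_fraction=0.47):
        # Annual carbon uptake in Mt C/yr: Mha * (t dry matter/ha/yr) * carbon fraction
        return area_mha * growth_t_per_ha_yr * carbon_fraction

    # 10 Mha at a placeholder growth rate of 3 t/ha/yr -> about 14 Mt C/yr
    print(simple_mitigation_estimate(area_mha=10.0, growth_t_per_ha_yr=3.0))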
Abstract:
Estimates of predicate selectivities by database query optimizers often differ significantly from those actually encountered during query execution, leading to poor plan choices and inflated response times. In this paper, we investigate mitigating this problem by replacing selectivity-error-sensitive plan choices with alternative plans that provide robust performance. Our approach is based on the recent observation that even the complex and dense "plan diagrams" associated with industrial-strength optimizers can be efficiently reduced to "anorexic" equivalents featuring only a few plans, without materially impacting query processing quality. Extensive experimentation with a rich set of TPC-H and TPC-DS-based query templates in a variety of database environments indicates that plan diagram reduction typically retains plans that are substantially resistant to selectivity errors on the base relations. However, it can sometimes also be severely counter-productive, with the replacements performing much worse. We address this problem through a generalized mathematical characterization of plan cost behavior over the parameter space, which lends itself to efficient criteria for when it is safe to reduce. Our strategies are fully non-invasive and have been implemented in the Picasso optimizer visualization tool.
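As a rough illustration of the safety idea described above (not the paper's actual formulation or the Picasso tool's API), a replacement plan could be accepted for an estimated selectivity point only if its estimated cost stays within a threshold of the original plan's cost at every location the actual selectivities might occupy; all names and the threshold are assumptions for the sketch.

    def robust_replacement(estimate, error_locations, orig_cost, repl_cost, lam=0.2):
        # estimate: the optimizer's estimated selectivity point (e.g. a tuple of selectivities)
        # error_locations: points the actual selectivities might take at run time
        # orig_cost / repl_cost: functions mapping a point to an estimated plan cost
        points = [estimate] + list(error_locations)
        return all(repl_cost(q) <= (1.0 + lam) * orig_cost(q) for q in points)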
Abstract:
Given a parametrized n-dimensional SQL query template and a choice of query optimizer, a plan diagram is a color-coded pictorial enumeration of the execution plan choices of the optimizer over the query parameter space. These diagrams have proved to be a powerful metaphor for the analysis and redesign of modern optimizers, and are gaining currency in diverse industrial and academic institutions. However, their utility is adversely impacted by the impractically large computational overheads incurred when standard brute-force exhaustive approaches are used for producing fine-grained diagrams on high-dimensional query templates. In this paper, we investigate strategies for efficiently producing close approximations to complex plan diagrams. Our techniques are customized to the features available in the optimizer's API, ranging from the generic optimizers that provide only the optimal plan for a query, to those that also support costing of sub-optimal plans and enumerating rank-ordered lists of plans. The techniques collectively feature both random and grid sampling, as well as inference techniques based on nearest-neighbor classifiers, parametric query optimization and plan cost monotonicity. Extensive experimentation with a representative set of TPC-H and TPC-DS-based query templates on industrial-strength optimizers indicates that our techniques are capable of delivering 90% accurate diagrams while incurring less than 15% of the computational overheads of the exhaustive approach. In fact, for full-featured optimizers, we can guarantee zero error with less than 10% overheads. These approximation techniques have been implemented in the publicly available Picasso optimizer visualization tool.
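A hypothetical Python sketch of the grid-sampling-plus-inference idea mentioned above, using a nearest-neighbor classifier over a two-dimensional selectivity space; optimize_at stands in for a call to an optimizer API, and the resolution, step size and dimensionality are assumptions for illustration rather than the paper's setup.

    from itertools import product

    def approximate_plan_diagram(optimize_at, resolution=100, grid_step=10):
        # Optimize only at a coarse grid of selectivity points (the expensive calls)
        seeds = {}
        for x, y in product(range(0, resolution, grid_step), repeat=2):
            seeds[(x, y)] = optimize_at(x / resolution, y / resolution)

        # Infer the plan at every remaining point from its nearest sampled neighbor
        def nearest_plan(x, y):
            sx, sy = min(seeds, key=lambda s: (s[0] - x) ** 2 + (s[1] - y) ** 2)
            return seeds[(sx, sy)]

        return {(x, y): nearest_plan(x, y)
                for x, y in product(range(resolution), repeat=2)}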
Abstract:
A "plan diagram" is a pictorial enumeration of the execution plan choices of a database query optimizer over the relational selectivity space. We have shown recently that, for industrial-strength database engines, these diagrams are often remarkably complex and dense, with a large number of plans covering the space. However, they can often be reduced to much simpler pictures, featuring significantly fewer plans, without materially affecting the query processing quality. Plan reduction has useful implications for the design and usage of query optimizers, including quantifying redundancy in the plan search space, enhancing useability of parametric query optimization, identifying error-resistant and least-expected-cost plans, and minimizing the overheads of multi-plan approaches. We investigate here the plan reduction issue from theoretical, statistical and empirical perspectives. Our analysis shows that optimal plan reduction, w.r.t. minimizing the number of plans, is an NP-hard problem in general, and remains so even for a storage-constrained variant. We then present a greedy reduction algorithm with tight and optimal performance guarantees, whose complexity scales linearly with the number of plans in the diagram for a given resolution. Next, we devise fast estimators for locating the best tradeoff between the reduction in plan cardinality and the impact on query processing quality. Finally, extensive experimentation with a suite of multi-dimensional TPCH-based query templates on industrial-strength optimizers demonstrates that complex plan diagrams easily reduce to "anorexic" (small absolute number of plans) levels incurring only marginal increases in the estimated query processing costs.
Abstract:
Large animals are disproportionately likely to go extinct, and the effects of this on ecosystem processes are unclear. Megaherbivores (weighing over 1000 kg) are thought to be particularly effective seed dispersers, yet only a few plant species solely or predominantly adapted for dispersal by megaherbivores have been identified. The reasons for this paradox may be elucidated by examining the ecology of so-called megafaunal fruiting species in Asia, where large-fruited species have been only sparsely researched. We conducted focal tree watches, camera trapping, fruit ageing trials, dung seed counts and germination trials to understand the ecology of Dillenia indica, a large-fruited species thought to be elephant-dispersed, in a tropical moist forest (Buxa Tiger Reserve, India). We find that the initial hardness of the fruit of D. indica ensures that its small (6 mm) seeds will primarily be consumed and dispersed by elephants and perhaps other megaherbivores. Elephants removed 63.3% of camera trap-monitored fruits taken by frugivores. If the fruit of D. indica is not removed by a large animal, the seeds of D. indica become available to successively smaller frugivores as its fruits soften. Seeds from both hard and soft fruits are able to germinate, meaning these smaller frugivores may provide a mechanism for dispersal without megaherbivores. Synthesis: Dillenia indica's strategy for dispersal allows it to realize the benefits of dispersal by megaherbivores without becoming fully reliant on these less abundant species. This risk-spreading dispersal behaviour suggests D. indica will be able to persist even if its megafaunal disperser becomes extinct.
Abstract:
Transcriptional regulation enables adaptation in bacteria. Typically, only a few transcriptional events are well understood, leaving many others unidentified. The recent genome-wide identification of transcription factor binding sites in Mycobacterium tuberculosis has changed this by deciphering a molecular road-map of transcriptional control, indicating active events and their immediate downstream effects.
Abstract:
This paper studies a pilot-assisted physical layer data fusion technique known as Distributed Co-Phasing (DCP). In this two-phase scheme, the sensors first estimate the channel to the fusion center (FC) using pilots sent by the latter; they then simultaneously transmit their common data by pre-rotating it by the estimated channel phase, thereby achieving physical layer data fusion. First, by analyzing the symmetric mutual information of the system, it is shown that the use of higher order constellations (HOC) can improve the throughput of DCP compared to the binary signaling considered heretofore. Using an HOC in the DCP setting requires the estimation of the composite DCP channel at the FC for data decoding. To this end, two blind algorithms are proposed: 1) the power method, and 2) a modified K-means algorithm. The latter algorithm is shown to be computationally efficient and to converge significantly faster than the conventional K-means algorithm. Analytical expressions for the probability of error are derived, and it is found that even at moderate to low SNRs, the modified K-means algorithm achieves a probability of error comparable to that achievable with a perfect channel estimate at the FC, while requiring no pilot symbols to be transmitted from the sensor nodes. Also, the problem of signal corruption due to imperfect DCP is investigated, and constellation shaping to minimize the probability of signal corruption is proposed and analyzed. The analysis is validated, and the promising performance of DCP for energy-efficient physical layer data fusion is illustrated, using Monte Carlo simulations.
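For illustration, a short Monte Carlo sketch in Python (NumPy) of the pre-rotation step that makes the sensor signals add nearly coherently at the fusion center; Rayleigh fading, channel reciprocity and all parameter values are assumptions for the sketch, not the paper's simulation setup.

    import numpy as np

    rng = np.random.default_rng(0)
    n_sensors, n_trials, pilot_snr_db = 8, 10000, 5
    sigma = 10 ** (-pilot_snr_db / 20)  # pilot noise std. dev. (unit-power pilot assumed)

    # Rayleigh-fading channels from each sensor to the fusion center
    h = (rng.standard_normal((n_trials, n_sensors)) +
         1j * rng.standard_normal((n_trials, n_sensors))) / np.sqrt(2)

    # Each sensor estimates its channel phase from a noisy pilot observation
    noise = sigma * (rng.standard_normal(h.shape) +
                     1j * rng.standard_normal(h.shape)) / np.sqrt(2)
    phase_hat = np.angle(h + noise)

    # All sensors transmit the same symbol, pre-rotated by the negative estimated phase
    s = 1.0
    received = np.sum(h * np.exp(-1j * phase_hat) * s, axis=1)

    # Per-sensor coherent gain; approaches E|h| (about 0.89) with perfect phase estimates
    print("average coherent gain per sensor:", np.mean(received.real) / n_sensors)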