832 results for Web content adaptation


Relevance:

30.00%

Publisher:

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable so that the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, and also at some online services which he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends rather than letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we concluded that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
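
The abstract's exact reference format is not described; the sketch below only illustrates the general idea of a cryptographically verifiable reference, assuming (hypothetically) that a SHA-256 content hash serves as the item's identifier.

```python
# Minimal sketch of a self-verifying data reference. A SHA-256 content hash is
# assumed here as the identifier; Peerscape's actual scheme is not shown in the
# abstract, so this is an illustration only.
import hashlib

def make_reference(data: bytes) -> str:
    """Derive a verifiable reference (hex digest) from the item's bytes."""
    return hashlib.sha256(data).hexdigest()

def verify_reference(data: bytes, reference: str) -> bool:
    """Check that an item fetched from any replica matches its reference."""
    return hashlib.sha256(data).hexdigest() == reference

photo = b"...family photo bytes..."
ref = make_reference(photo)
assert verify_reference(photo, ref)            # same item, from any replica
assert not verify_reference(b"tampered", ref)  # forged content is rejected
```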


Relevance:

30.00%

Publisher:

Abstract:

Climate change is projected to lead to a shift of forest types, causing irreversible damage to forests by rendering several species extinct and potentially affecting the livelihoods of local communities and the economy. Approximately 47% and 42% of tropical dry deciduous grids are projected to undergo shifts under the A2 and B2 SRES scenarios respectively, as opposed to less than 16% of grids comprising tropical wet evergreen forests. Similarly, tropical thorny scrub forest is projected to undergo shifts in the majority of forested grids under the A2 (more than 80%) as well as the B2 scenario (50% of grids). Thus forest managers and policymakers need to adapt to the ecological as well as the socio-economic impacts of climate change. This requires the formulation of effective forest management policies and practices, incorporating climate concerns into long-term forest policy and management plans. India has formulated a large number of innovative and progressive forest policies, but a mechanism to ensure effective implementation of these policies is needed. Additional policies and practices may be needed to address the impacts of climate change. This paper discusses an approach and the steps involved in developing an adaptation framework, as well as the policies, strategies and practices needed for mainstreaming adaptation to cope with projected climate change. Further, the existing barriers which may affect proactive adaptation planning, given the scale, accuracy and uncertainty associated with assessing climate change impacts, are presented.

Relevance:

30.00%

Publisher:

Abstract:

Due to large-scale afforestation programs and forest conservation legislation, India's total forest area seems to have stabilized or even increased. In spite of such efforts, forest fragmentation and degradation continue, with forests being subject to increased pressure from anthropogenic factors. Such fragmentation and degradation is causing forest cover to change from very dense to moderately dense and open forest: 253 km² of very dense forest was converted to moderately dense forest, open forest, scrub and non-forest during 2005-2007. Similarly, 4,120 km² of moderately dense forest degraded to open forest, scrub and non-forest, resulting in a net loss of 936 km² of moderately dense forest. Additionally, 4,335 km² of open forest degraded to scrub and non-forest. Coupled with pressure from anthropogenic factors, climate change is likely to be an added stress on forests. Forest sector programs and policies are major factors that determine the status of forests and, potentially, their resilience to the projected impacts of climate change. An attempt is made to review the forest policies and programs and their implications for the status of forests and for the vulnerability of forests to projected climate change. The study concludes that forest conservation and development policies and programs need to be oriented to incorporate climate change impacts, vulnerability and adaptation.

Relevance:

30.00%

Publisher:

Abstract:

We examine the potential for adaptation to climate change in Indian forests, and derive the macroeconomic implications of forest impacts and adaptation in India. The study is conducted by integrating results from the dynamic global vegetation model IBIS and the computable general equilibrium model GRACE-IN, which estimates macroeconomic implications for six zones of India. By comparing a reference scenario without climate change with a climate impact scenario based on the IPCC A2 scenario, we find major variations in the pattern of change across zones. Biomass stock increases in all zones except the Central zone. The increase in biomass growth is smaller, and growth declines in one additional zone, the South zone, despite the higher stock. In the four zones with increases in biomass growth, harvest increases by only approximately one third of the change in biomass growth. This is due to two market effects of increased biomass growth. One is that an increase in biomass growth encourages more harvest, other things being equal. The other is that more harvest leads to a higher supply of timber, which lowers market prices. As a result, the rent on forested land also decreases. The lower prices and rent discourage more harvest even though they may induce higher demand, which increases the pressure on harvest. In a less perfect world than the model describes, these two effects may contribute to an increase in the risk of deforestation as a consequence of higher biomass growth. Furthermore, higher harvest demands more labor and capital input in the forestry sector. Given a fixed total supply of labor and capital, this increases the cost of production in all other sectors, although only very slightly. Forestry-dependent communities with declining biomass growth may, however, experience local unemployment as a result.
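
The price mechanism described above can be made concrete with a toy supply-demand calculation; this is not the GRACE-IN model, and all numbers below are hypothetical.

```python
# Toy illustration (not GRACE-IN): an outward shift in timber supply, as from
# higher biomass growth, lowers the market price, which dampens the harvest
# response, so harvest rises by less than the supply shift. Numbers are made up.
def equilibrium(demand_intercept, demand_slope, supply_intercept, supply_slope):
    """Solve demand a - b*p = supply c + d*p for price p and traded quantity."""
    price = (demand_intercept - supply_intercept) / (demand_slope + supply_slope)
    quantity = demand_intercept - demand_slope * price
    return price, quantity

p0, q0 = equilibrium(100, 1.0, 20, 1.0)   # baseline timber supply
p1, q1 = equilibrium(100, 1.0, 35, 1.0)   # supply shifted outward by 15 units
print(f"price falls from {p0:.1f} to {p1:.1f}; harvest rises only from {q0:.1f} to {q1:.1f}")
```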

Relevance:

30.00%

Publisher:

Abstract:

Protein-protein docking programs typically perform four major tasks: (i) generation of docking poses, (ii) selection of a subset of poses, (iii) their structural refinement and (iv) scoring and ranking for the final assessment of the true quaternary structure. Although the tasks can be integrated or performed in serial order, they are by nature modular, allowing an opportunity to substitute one algorithm with another. We have implemented two modular web services: (i) PRUNE, to select a subset of docking poses generated during the sampling search (http://pallab.serc.iisc.ernet.in/prune), and (ii) PROBE, to refine, score and rank them (http://pallab.serc.iisc.ernet.in/probe). The former uses a new interface-area-based edge-scoring function to eliminate > 95% of the poses generated during the docking search. In contrast to other multi-parameter screening functions, this single-parameter elimination reduces the computational time significantly, in addition to increasing the chances of selecting native-like models in the top-ranked list. The PROBE server ranks the pruned poses after structure refinement and scoring, using a regression model for geometric compatibility and normalized interaction energy. While web services similar to PROBE are infrequent, no web service akin to PRUNE has been described before. Both servers are publicly accessible and free to use.
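
PRUNE's actual edge-scoring function is not given in the abstract; the sketch below only shows the general single-score pruning step, assuming each pose carries a hypothetical precomputed interface-area score.

```python
# Minimal sketch of single-score pose pruning. `interface_score` is a
# hypothetical precomputed value per pose; it stands in for PRUNE's
# interface-area-based edge score, which is not specified in the abstract.
from dataclasses import dataclass

@dataclass
class Pose:
    pose_id: int
    interface_score: float  # higher assumed to mean more native-like

def prune(poses, keep_fraction=0.05):
    """Keep only the best-scoring fraction of poses, discarding the rest."""
    ranked = sorted(poses, key=lambda p: p.interface_score, reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]

poses = [Pose(i, s) for i, s in enumerate([0.91, 0.12, 0.55, 0.78, 0.05])]
print([p.pose_id for p in prune(poses, keep_fraction=0.4)])  # -> [0, 3]
```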

Relevance:

30.00%

Publisher:

Abstract:

Depth measures the extent of atom/residue burial within a protein. It correlates with properties such as protein stability, hydrogen exchange rate, protein-protein interaction hot spots, post-translational modification sites and sequence variability. Our server, DEPTH, accurately computes depth and solvent-accessible surface area (SASA) values. We show that depth can be used to predict small-molecule ligand binding cavities in proteins. Often, some of the residues lining a ligand binding cavity are both deep and solvent exposed. Using the depth-SASA value pair for a residue, its likelihood of forming part of a small-molecule binding cavity is estimated. The parameters of the method were calibrated over a training set of 900 high-resolution X-ray crystal structures of single-domain proteins bound to small molecules (molecular weight < 1.5 kDa). The prediction accuracy of DEPTH is comparable to that of other geometry-based prediction methods, including LIGSITE, SURFNET and Pocket-Finder (all with a Matthews correlation coefficient of about 0.4), over a test set of 225 single- and multi-chain protein structures. Users have the option of tuning several parameters to detect cavities of different sizes, for example geometrically flat binding sites. The input to the server is a protein 3D structure in PDB format. Users can tune the values of four parameters associated with the computation of residue depth and the prediction of binding cavities. The computed depths, SASA values and binding cavity predictions are displayed in 2D plots and mapped onto 3D representations of the protein structure using Jmol. Links are provided to download the outputs. Our server is useful for all structural analyses based on residue depth and SASA, such as guiding site-directed mutagenesis experiments and small-molecule docking exercises, in the context of protein functional annotation and drug discovery.
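
The depth-SASA idea can be sketched as a simple two-threshold filter; the cutoffs below are hypothetical placeholders, whereas DEPTH calibrates its parameters on a training set of protein-ligand complexes.

```python
# Sketch of the depth-SASA criterion only: flag residues that are relatively
# deep yet still solvent exposed as candidate cavity-lining residues. The
# cutoff values are hypothetical, not DEPTH's calibrated parameters.
def candidate_cavity_residues(residues, depth_cutoff=4.0, sasa_cutoff=20.0):
    """residues: iterable of (name, depth_in_angstrom, sasa_in_angstrom_sq)."""
    return [name for name, depth, sasa in residues
            if depth >= depth_cutoff and sasa >= sasa_cutoff]

residues = [("ALA12", 2.1, 45.0), ("HIS57", 5.3, 30.2), ("LEU99", 6.0, 1.5)]
print(candidate_cavity_residues(residues))  # ['HIS57']: deep yet solvent exposed
```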

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a new method of data handling for web servers, which we call Network Aware Buffering and Caching (NABC for short). NABC reduces data copies in a web server's data sending path by doing three things: (1) laying out the data in main memory in a way that allows protocol processing to be done without data copies, (2) keeping a unified cache of data in the kernel and ensuring safe access to it by the various processes and the kernel, and (3) passing only the necessary metadata between processes so that the time spent handling bulk data during IPC is reduced. We realize NABC by implementing a set of system calls and a user library. The end product of the implementation is a set of APIs specifically designed for use by web servers. We port an in-house web server called SWEET to the NABC APIs and evaluate performance using a range of workloads, both simulated and real. The results show a very impressive gain of 12% to 21% in throughput for static file serving, and a 1.6 to 4 times gain in throughput for lightweight dynamic content serving, for a server using the NABC APIs over one using UNIX APIs.
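
The general copy-avoidance idea behind NABC can be illustrated with an existing zero-copy primitive; the sketch below uses Python's os.sendfile and is not the NABC API, which the paper implements as new system calls plus a user library.

```python
# Illustration of the zero-copy idea only: the file's bytes move from the page
# cache to the socket inside the kernel, never passing through user-space
# buffers. This is os.sendfile, not NABC; the response header (small metadata)
# still goes through the normal send path.
import os, socket

def serve_static(conn: socket.socket, path: str) -> None:
    size = os.path.getsize(path)
    header = f"HTTP/1.1 200 OK\r\nContent-Length: {size}\r\n\r\n".encode()
    conn.sendall(header)                      # metadata: ordinary copy path
    with open(path, "rb") as f:
        offset, remaining = 0, size
        while remaining > 0:                  # bulk data: in-kernel transfer
            sent = os.sendfile(conn.fileno(), f.fileno(), offset, remaining)
            if sent == 0:
                break
            offset += sent
            remaining -= sent
```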

Relevance:

30.00%

Publisher:

Abstract:

A graphics package has been developed to display the main-chain torsion angles phi and psi (φ, ψ; the Ramachandran angles) in a protein of known structure. In addition, the package calculates the Ramachandran angles at the central residue of a stretch of three amino acids, given the specified flanking residue types. The package displays the Ramachandran angles along with a detailed analysis output. The software is integrated with all the protein structures available in the Protein Data Bank.
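
The original package is not reproduced here; as a sketch of the same underlying computation (per-residue φ/ψ extraction), the snippet below uses Biopython's Bio.PDB, with "protein.pdb" as a placeholder input file.

```python
# Sketch of phi/psi (Ramachandran angle) extraction with Biopython, standing in
# for the package described above. "protein.pdb" is a placeholder file name.
import math
from Bio.PDB import PDBParser, PPBuilder

structure = PDBParser(QUIET=True).get_structure("prot", "protein.pdb")
for peptide in PPBuilder().build_peptides(structure):
    for residue, (phi, psi) in zip(peptide, peptide.get_phi_psi_list()):
        if phi is None or psi is None:        # chain termini have undefined angles
            continue
        print(residue.get_resname(), residue.get_id()[1],
              round(math.degrees(phi), 1), round(math.degrees(psi), 1))
```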

Relevance:

30.00%

Publisher:

Abstract:

Several replacement policies for web caches have been proposed and studied extensively in the literature. Different replacement policies perform better in terms of (i) the number of objects found in the cache (cache hits), (ii) the network traffic avoided by fetching the referenced object from the cache, or (iii) the savings in response time. In this paper, we propose a simple and efficient replacement policy (hereafter known as SE) which improves all three performance measures. Trace-driven simulations were performed to evaluate the performance of SE. We compare SE with two widely used and efficient replacement policies, namely the Least Recently Used (LRU) and Least Unified Value (LUV) algorithms. Our results show that SE performs at least as well as, if not better than, both these replacement policies. Unlike various other replacement policies proposed in the literature, our SE policy does not require parameter tuning or a priori trace analysis, and it has an efficient and simple implementation that can be incorporated into any existing proxy server or web server with ease.
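
The SE policy itself is not specified in the abstract; for reference, the sketch below is a minimal version of LRU, one of the baseline policies it is compared against, using an OrderedDict keyed by object URL.

```python
# Minimal LRU cache (a baseline policy from the comparison, not SE itself).
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None                       # cache miss
        self.store.move_to_end(key)           # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict the least recently used

cache = LRUCache(2)
cache.put("/a", "A"); cache.put("/b", "B"); cache.get("/a"); cache.put("/c", "C")
print(list(cache.store))                      # ['/a', '/c'] -- '/b' was evicted
```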

Relevance:

30.00%

Publisher:

Abstract:

Many web sites incorporate dynamic web pages to deliver customized content to their users. However, dynamic pages result in increased user response times due to their construction overheads. In this paper, we consider mechanisms for reducing these overheads by utilizing the excess capacity with which web servers are typically provisioned. Specifically, we present a caching technique that integrates fragment caching with anticipatory page pre-generation in order to deliver dynamic pages faster during normal operating conditions. A feedback mechanism is used to tune the page pre-generation process to match the current system load. The experimental results from a detailed simulation study of our technique indicate that, given a fixed cache budget, page construction speedups of more than fifty percent can be consistently achieved compared to a pure fragment-caching approach.
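
A rough sketch of the combination described above follows; `render_fragment`, `predict_next_pages`, `compose_page` and the load threshold are hypothetical stand-ins, and the paper's feedback mechanism is more elaborate than this single cutoff.

```python
# Sketch only: cache page fragments on demand, and when server load is below a
# threshold, spend the spare capacity pre-generating whole pages expected to be
# requested soon. The callbacks and threshold are hypothetical placeholders.
fragment_cache = {}
page_cache = {}

def get_fragment(frag_id, render_fragment):
    if frag_id not in fragment_cache:
        fragment_cache[frag_id] = render_fragment(frag_id)   # construct on miss
    return fragment_cache[frag_id]

def maybe_pregenerate(current_load, predict_next_pages, compose_page,
                      load_threshold=0.6):
    """Use excess capacity to assemble likely-next pages ahead of requests."""
    if current_load >= load_threshold:
        return                                # back off when the server is busy
    for page_id in predict_next_pages():
        if page_id not in page_cache:
            page_cache[page_id] = compose_page(page_id)
```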

Relevance:

30.00%

Publisher:

Abstract:

In species-rich assemblages, differential utilization of vertical space can be driven by resource availability. For animals that communicate acoustically over long distances under habitat-induced constraints, access to an effective transmission channel is a valuable resource. The acoustic adaptation hypothesis suggests that habitat acoustics imposes a selective pressure that drives the evolution of both signal structure and the choice of calling sites by signalers. This predicts that species-specific signals transmit best in native habitats. In this study, we tested the hypothesis that vertical stratification of calling heights of acoustically communicating species is driven by acoustic adaptation, in an assemblage of 12 coexisting species of crickets and katydids in a tropical wet evergreen forest. We carried out transmission experiments using natural calls at different heights from the forest floor to the canopy. We measured signal degradation using 3 different measures: total attenuation, signal-to-noise ratio (SNR), and envelope distortion. Different sets of species supported the hypothesis depending on which attribute of signal degradation was examined. The hypothesis was upheld by 5 species for attenuation and by 3 species each for SNR and envelope distortion. Only 1 species of 12 provided support for the hypothesis by all 3 measures of signal degradation. The results thus provided no overall support for acoustic adaptation as a driver of vertical stratification of coexisting cricket and katydid species.
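
One of the three degradation measures, SNR, can be sketched as a simple power ratio; the arrays and parameters below are illustrative, and the authors' exact analysis pipeline is not reproduced here.

```python
# Sketch of an SNR estimate (in dB) from a call recording and a noise-only
# segment. Synthetic data stand in for field recordings; this is not the
# authors' analysis, only an illustration of the measure.
import numpy as np

def snr_db(recording: np.ndarray, noise_only: np.ndarray) -> float:
    """Signal-to-noise ratio in decibels from mean signal and noise power."""
    signal_power = np.mean(recording.astype(float) ** 2)
    noise_power = np.mean(noise_only.astype(float) ** 2)
    return 10.0 * np.log10(signal_power / noise_power)

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 0.1, 44100)           # 1 s of background noise
call = np.sin(2 * np.pi * 5000 * np.linspace(0, 1, 44100)) + noise
print(round(snr_db(call, noise), 1))          # higher values mean less degradation
```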

Relevance:

30.00%

Publisher:

Abstract:

Printed by the Diputación Foral de Álava, D.L. VI-430/99.

Relevance:

30.00%

Publisher:

Abstract:

Members of the family Gammaridae are very closely interrelated, which raises the question of how far they also differ among themselves in their physiological characteristics. Comparative respiratory physiology experiments were carried out on the five euryhaline species Gammarus locusta, G. oceanicus, G. salinus, G. zaddachi and G. duebeni. The respiratory measurements made within the framework of this study examined the relationship between oxygen consumption and body size as a function of salinity. They also aimed to determine the variations in metabolic intensity after an abrupt change in the salt content of the external medium, and to establish the time required for the process of adaptation. Because the experiments were carried out polarographically in a continuous flow-through apparatus, and the method applied permitted continuous recording over prolonged intervals, comparisons could also be made between metabolism at rest and during activity, and changes in oxygen consumption during moulting could be measured.