Resumo:
Noble metal substituted ionic catalysts were synthesized by solution combustion technique. The compounds were characterized by X-ray diffraction, FT-Raman spectroscopy, and X-ray photoelectron spectroscopy. Zirconia supported compounds crystallized in tetragonal phase. The solid solutions of ceria with zirconia crystallized in fluorite structure. The noble metals were substituted in ionic form. The water-gas shift reaction was carried out over the catalysts. Negligible conversions were observed with unsubstituted compounds. The substitution of a noble metal ion was found to enhance the reaction rate. Equilibrium conversion was obtained below 250 degrees C in the presence of Pt ion substituted compounds. The formation of Bronsted acid-Bronsted base pairs was proposed to explain the activity of zirconia catalysts. The effect of oxide ion vacancies on the reactions over substituted ceria-zirconia solid solutions was established. (c) 2010 Elsevier B.V. All rights reserved.
Resumo:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, but also at some online services which he uses. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored in various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file-sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database.
Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
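The idea of cryptographically verifiable references can be illustrated with a minimal content-addressing sketch: a data item's reference is a hash of its bytes, so any replica, wherever it is stored, can be checked against the reference. This is a generic sketch of the principle, not Peerscape's actual scheme (which also uses signatures against forgery).

```python
import hashlib

def make_ref(data: bytes) -> str:
    """Derive a content-based reference: the SHA-256 digest of the item."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, ref: str) -> bool:
    """Check that a copy fetched from any replica still matches its reference."""
    return make_ref(data) == ref

photo = b"family photo album, page 1"
ref = make_ref(photo)

assert verify(photo, ref)                    # an intact replica verifies
assert not verify(photo + b"tampered", ref)  # a corrupted copy is rejected
```

Because the reference depends only on the content, the same item can circulate among family members' machines and online services while keeping a single verifiable identity.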
Resumo:
Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing how big the extent of OA is, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer reviewed scholarly journal articles, which are available openly in full text on the web, was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers' sites. For an additional 11.9% free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry and chemistry publishing in OA journals was more common. In all other fields author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in the uptake. Due to the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. The results should also interest academic publishers, who need to take into account OA in their business strategies and copyright policies, as well as research funders, who, like the NIH, are starting to require OA availability of results from research projects they fund. The method and search tools developed also offer a good basis for more in-depth studies as well as longitudinal studies.
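The abstract's headline figure is the sum of the two OA routes it distinguishes; a one-line check of the arithmetic, using only the percentages reported above:

```python
# Reproducing the abstract's arithmetic: OA at publishers' sites ("gold")
# plus author-posted manuscript versions ("green") gives the overall share.
gold_oa = 8.5    # % freely available at publishers' sites (2008 sample)
green_oa = 11.9  # % additional free manuscript versions found via search engines
overall = round(gold_oa + green_oa, 1)
print(overall)  # 20.4
```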
Resumo:
Encoding protein 3D structures into 1D strings using short structural prototypes or structural alphabets opens a new front for structure comparison and analysis. Using the well-documented 16 motifs of Protein Blocks (PBs) as a structural alphabet, we have developed a methodology to compare protein structures that are encoded as sequences of PBs by aligning them using dynamic programming with a substitution matrix for PBs. This methodology is implemented in the applications available in the Protein Block Expert (PBE) server. PBE addresses common issues in the field of protein structure analysis such as comparison of protein structures and identification of protein structures in structural databanks that resemble a given structure. PBE-T provides a facility to transform any PDB file into sequences of PBs. PBE-ALIGNc performs comparison of two protein structures based on the alignment of their corresponding PB sequences. PBE-ALIGNm is a facility for mining the SCOP database for similar structures based on the alignment of PBs. In addition, PBE provides an interface to a database (PBE-SAdb) of preprocessed PB sequences from SCOP culled at 95% and of all-against-all pairwise PB alignments at family and superfamily levels. The PBE server is freely available at http://bioinformatics.univ-reunion.fr/PBE/.
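The core operation described above, aligning two structures via their structural-alphabet strings with dynamic programming and a substitution matrix, can be sketched as a standard Needleman-Wunsch global alignment. The 4-letter alphabet, substitution scores, and gap penalty below are illustrative stand-ins, not the 16-letter PB alphabet or PBE's actual matrix.

```python
# Minimal sketch of the PBE-ALIGNc idea: global alignment of two
# "structural sequences" by dynamic programming. Toy parameters only.
GAP = -2
SUB = {  # toy substitution scores: match rewarded, mismatch mildly penalized
    (a, b): (3 if a == b else -1)
    for a in "abcd" for b in "abcd"
}

def align_score(s1: str, s2: str) -> int:
    """Needleman-Wunsch global alignment score of two PB-like strings."""
    n, m = len(s1), len(s2)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * GAP          # leading gaps in s2
    for j in range(1, m + 1):
        dp[0][j] = j * GAP          # leading gaps in s1
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(
                dp[i - 1][j - 1] + SUB[(s1[i - 1], s2[j - 1])],  # substitute
                dp[i - 1][j] + GAP,                              # gap in s2
                dp[i][j - 1] + GAP,                              # gap in s1
            )
    return dp[n][m]

print(align_score("abcd", "abcd"))  # 12: four matches at +3 each
print(align_score("abcd", "abd"))   # 7: three matches plus one gap
```

In PBE the same recurrence runs over 16 PB letters with an empirically derived substitution matrix, so alignment scores reflect structural rather than sequence similarity.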
Resumo:
Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate detection of remotely related proteins. We have recently demonstrated the Cascade PSI-BLAST approach, in which we perform PSI-BLAST for many 'generations', initiating searches from new homologues as well. Such a rigorous propagation through generations of PSI-BLAST effectively exploits the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds and its performance in detecting superfamily level relationships is ~35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
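The generational propagation described above amounts to a transitive closure over the "detectable similarity" relation: a direct search finds only immediate homologues, while re-launching searches from each new hit reaches remote relatives through intermediates. The toy `hits` relation below is hypothetical, standing in for real PSI-BLAST output.

```python
# Schematic of the Cascade PSI-BLAST idea: searches are re-initiated from
# every newly found homologue, generation by generation, until no new
# sequences appear. 'hits' is a made-up direct-detectability relation.
hits = {
    "query": {"A"},
    "A": {"B"},    # A is an intermediate linking the query to B
    "B": {"C"},    # B links on to the remote homologue C
    "C": set(),
}

def cascade(start: str) -> set:
    """Propagate searches through generations until nothing new is found."""
    found, frontier = set(), {start}
    while frontier:
        nxt = set()
        for seq in frontier:
            for hit in hits.get(seq, set()):
                if hit not in found and hit != start:
                    found.add(hit)
                    nxt.add(hit)
        frontier = nxt
    return found

print(sorted(hits["query"]))     # ['A']            -- one direct search stops here
print(sorted(cascade("query")))  # ['A', 'B', 'C']  -- the cascade reaches C
```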
Resumo:
Instrument landing systems (ILS) and the upcoming microwave landing systems (MLS) are (or are planned to be) very important navigational aids at most major airports of the world. However, their performance is directly affected by the features of the site in which they are located. Currently, validation of the ILS performance is through costly and time-consuming experimental methods. This paper outlines a powerful and versatile analytical approach for performing the site evaluation, as an alternative to the experimental methods. The approach combines a multi-plate model for the terrain with a powerful and exhaustive ray-tracing technique and a versatile and accurate formulation for estimating the electromagnetic fields due to the array antenna in the presence of the terrain. It can model the effects of the undulation, the roughness and the impedance (depending on the soil type) of the terrain at the site. The results computed from the analytical method are compared with the actual measurements and good agreement is shown. Considerations for site effects on MLS are also outlined.
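The simplest case of the terrain modelling described above, a single flat, perfectly reflecting plate, can be sketched with the image method: the reflected ray is the direct ray from a mirror-image antenna below the ground plane, and the direct/reflected path difference sets the multipath interference at the receiver. The geometry and frequency below are illustrative, not from the paper's measurements.

```python
import math

# Two-ray ground-reflection sketch (image method) for one flat plate.
def two_ray_path_difference(h_tx: float, h_rx: float, d: float) -> float:
    """Path-length difference (m) between direct and ground-reflected rays."""
    direct = math.hypot(d, h_rx - h_tx)
    reflected = math.hypot(d, h_rx + h_tx)  # via the image antenna at -h_tx
    return reflected - direct

# Illustrative geometry: a low antenna, an aircraft receiver 1 km away.
delta = two_ray_path_difference(h_tx=3.0, h_rx=30.0, d=1000.0)
wavelength = 3e8 / 332e6                  # ~0.90 m near the ILS glide-path band
phase = 2 * math.pi * delta / wavelength  # extra phase of the reflected ray
print(round(delta, 3))  # 0.18 m extra path for the reflected ray
```

A full site evaluation replaces the single plate with many tilted, rough, finitely conducting plates and traces all significant ray paths, but each path's contribution is still governed by this kind of geometric delay and phase.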
Resumo:
The selective hydroxylation of proline residues in nascent procollagen chains by prolyl hydroxylase (EC 1.14.11.2) can be understood in terms of the conformational feature of the -Pro-Gly-segments in linear peptides and globular proteins. The folded beta-turn conformation in such segments appears to be the conformational requirement for proline hydroxylation. The available data on the hydroxylation of native and synthetic substrates of prolyl hydroxylase are explained on the basis of the extent of beta-turn formation in them. Taken in conjunction with the conformational features of the hydroxyproline residue, our results bring out the conformational reason for the posttranslational proline hydroxylation which, it is proposed, leads to the "straightening" of the beta-turn segments into the linear triple-helical conformation.
Resumo:
We used PCR-DGGE fingerprinting and direct sequencing to analyse the response of fungal and actinobacterial communities to changing hydrological conditions at 3 different sites in a boreal peatland complex in Finland. The experimental design involved a short-term (3 years; STD) and a long-term (43 years; LTD) water-level drawdown. Correspondence analyses of DGGE bands revealed differences in the communities between natural sites representing the nutrient-rich mesotrophic fen, the nutrient-poorer oligotrophic fen, and the nutrient-poor ombrotrophic bog. Still, most fungi and actinobacteria found in the pristine peatland seemed robust to the environmental variables. Both fungal and actinobacterial diversity was higher in the fens than in the bog. Fungal diversity increased significantly after STD whereas actinobacterial diversity did not respond to hydrology. Both fungal and actinobacterial communities became more similar between peatland types after LTD, which was not apparent after STD. Most sequences clustered equally between the two main fungal phyla Ascomycota and Basidiomycota. Sequencing revealed that basidiomycetes may respond more (either positively or negatively) to hydrological changes than ascomycetes. Overall, our results suggest that fungal responses to water-level drawdown depend on peatland type. Actinobacteria seem to be less sensitive to hydrological changes, although the response of some may similarly depend on peatland type. (C) 2009 Elsevier Ltd. All rights reserved.
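Diversity comparisons of the kind reported above (fen vs. bog, before vs. after drawdown) are commonly made with the Shannon index over band (taxon) relative abundances from the DGGE profiles. The band-intensity vectors below are invented for illustration; the abstract does not give the underlying data.

```python
import math

def shannon(abundances):
    """Shannon diversity index H' over a vector of band abundances."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in ps)

fen_bands = [5, 4, 4, 3, 3, 2]  # richer, more even community (hypothetical)
bog_bands = [12, 3, 1]          # fewer, more dominated bands (hypothetical)
print(shannon(fen_bands) > shannon(bog_bands))  # True: the fen profile is more diverse
```

H' rises both with the number of bands and with the evenness of their intensities, which is why the species-rich fens score above the bog in such analyses.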
Resumo:
Distributions and impact models of toxic substances in the environmental risk analysis of waste disposal sites.
Resumo:
The world of mapping has changed. Earlier, only professional experts were responsible for map production, but today ordinary people without any training or experience can become map-makers. The number of online mapping sites and of volunteer mappers has increased significantly. Developments in technology, such as satellite navigation systems, Web 2.0, broadband Internet connections, and smartphones, have played a key role in enabling the rise of volunteered geographic information (VGI). As opening governmental data to the public is a current topic in many countries, the opening of high-quality geographical data has a central role in this study. The aim of this study is to investigate the quality of spatial data produced by volunteers by comparing it with map data produced by public authorities, to follow what occurs when spatial data are opened to users, and to examine the user profile of these volunteer mappers. A central part of this study is the OpenStreetMap project (OSM), whose aim is to create a map of the entire world through volunteer effort. Anyone can become an OpenStreetMap contributor, and the data created by the volunteers are free for anyone to use, without restrictive copyrights or licence charges. In this study OpenStreetMap is investigated from two viewpoints. In the first part of the study, the aim was to investigate the quality of volunteered geographic information. A pilot project was implemented by following what occurs when high-resolution aerial imagery is released freely to the OpenStreetMap contributors. The quality of VGI was investigated by comparing the OSM datasets with the map data of the National Land Survey of Finland (NLS). The quality of OpenStreetMap data was assessed by inspecting the positional accuracy and completeness of the road datasets, as well as the differences in attributes between the studied datasets.
The OSM community was also analysed, and the development of the OpenStreetMap data was followed by visual analysis. The aim of the second part of the study was to analyse the user profile of OpenStreetMap contributors, and to investigate how the contributors act when collecting data and editing OpenStreetMap. A further aim was to investigate what motivates users to map and how they perceive the quality of volunteered geographic information. The second part of the study was implemented as a web survey of OpenStreetMap contributors. The results show that the quality of OpenStreetMap data, compared with the data of the National Land Survey of Finland, can be considered good. OpenStreetMap differs from the NLS map especially in its degree of uncertainty: for example, the completeness and uniformity of the map are not known. The results reveal that opening spatial data notably increased the amount of data in the study area, and both positional accuracy and completeness improved significantly. The study confirms earlier findings that a small minority of contributors have created the majority of the data in OpenStreetMap. The survey of OpenStreetMap users revealed that data are most often collected on foot or by bicycle using a GPS device, or by editing the map against aerial imagery. According to the responses, users take part in the OpenStreetMap project because they want to make maps better and to produce maps containing up-to-date information that cannot be found on any other map. Almost all of the users make use of the maps themselves, the most popular methods being downloading the map into a navigator or a mobile device. The users regard the quality of OpenStreetMap as good, especially because of the map's up-to-dateness and accuracy.
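Positional-accuracy comparisons such as the OSM-versus-NLS one described above typically reduce to measuring the ground distance between corresponding features in the two datasets, for example with the haversine formula on WGS84 latitude/longitude pairs. The coordinates below are illustrative points near Helsinki, not data from the actual study.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, r=6371000.0):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Offset between an OSM node and its reference counterpart (hypothetical points).
d = haversine_m(60.1700, 24.9400, 60.1701, 24.9400)
print(round(d, 1))  # 11.1: roughly 11 m per 0.0001 degree of latitude
```

Aggregating such offsets over many matched road vertices (mean, RMSE) gives the dataset-level positional accuracy figures that studies like this one compare before and after data openings.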
Resumo:
Fifty-six teachers, from four European countries, were interviewed to ascertain their attitudes to and beliefs about the Collaborative Learning Environments (CLEs) which were designed under the Innovative Technologies for Collaborative Learning Project. Their responses were analysed using categories based on a model from cultural-historical activity theory [Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit; Engeström, Y., Engeström, R., & Suntio, A. (2002). Can a school community learn to master its own future? An activity-theoretical study of expansive learning among middle school teachers. In G. Wells & G. Claxton (Eds.), Learning for life in the 21st century. Oxford: Blackwell Publishers]. The teachers were positive about CLEs and their possible role in initiating pedagogical innovation and enhancing personal professional development. This positive perception held across cultures and national boundaries. Teachers were aware of the fact that demanding planning was needed for successful implementations of CLEs. However, the specific strategies through which the teachers can guide students' inquiries in CLEs and the assessment of new competencies that may characterize student performance in the CLEs were poorly represented in the teachers' reflections on CLEs. The attitudes and beliefs of the teachers from separate countries had many similarities, but there were also some clear differences, which are discussed in the article. (c) 2005 Elsevier Ltd. All rights reserved.
Resumo:
Eighteen corpora striata from normal human foetal brains ranging in gestational age from 16 to 40 weeks and five from postnatal brains ranging from 23 days to 42 years were analysed for the ontogeny of dopamine receptors using [3H]spiperone as the ligand, with 10 mM dopamine hydrochloride used in blanks. Spiperone binding sites were characterized in a 40-week-old foetal brain to be dopamine receptors by the following criteria: (1) binding was localized in a crude mitochondrial pellet that included synaptosomes; (2) binding was saturable at 0.8 nM concentration; (3) the dopaminergic antagonists spiperone, haloperidol, pimozide, trifluperazine and chlorpromazine competed for the binding with IC50 values in the range of 0.3–14 nM, while the agonists apomorphine and dopamine gave IC50 values of 2.5 and 10 μM, respectively, suggesting a D2-type receptor. Epinephrine and norepinephrine inhibited the binding much less efficiently, while mianserin at 10 μM and serotonin at 1 mM concentration did not inhibit the binding. Bimolecular association and dissociation rate constants for the reversible binding were 5.7 × 108 M−1 min−1 and 5.0 × 10−2 min−1, respectively. The equilibrium dissociation constant was 87 pM, and the KD obtained by saturation binding was 73 pM. During the foetal age of 16 to 40 weeks, the receptor concentration remained in the range of 38–60 fmol/mg protein or 570–1080 fmol/g striatum, but it increased two-fold postnatally, reaching a maximum at 5 years. Significantly, at lower foetal ages (16–24 weeks) the [3H]spiperone binding sites exhibited a heterogeneity with a high (KD, 13–85 pM) and a low (KD, 1.2–4.6 nM) affinity component, the former accounting for 13–24% of the total binding sites. This heterogeneity persisted even when sulpiride was used as a displacer.
The number of high affinity sites increased from 16 weeks to 24 weeks, and after 28 weeks of gestation all the binding sites showed only a single high affinity. GTP decreased the agonist affinity, as observed by dopamine competition of [3H]spiperone binding in 20-week-old foetal striata and at all subsequent ages. GTP increased the IC50 values of dopamine 2- to 4.5-fold, and Hill coefficients also increased, becoming closer to one, suggesting that the dopamine receptor was susceptible to regulation from foetal life onwards.
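The equilibrium dissociation constant reported above follows directly from the two kinetic rate constants, since at equilibrium K_D = k_off / k_on; the check below uses only the numbers given in the abstract.

```python
# Reproducing the reported K_D from the kinetic rate constants: K_D = k_off / k_on.
k_on = 5.7e8     # bimolecular association rate constant, M^-1 min^-1
k_off = 5.0e-2   # dissociation rate constant, min^-1
kd_pM = k_off / k_on * 1e12   # convert M to pM
print(round(kd_pM))  # 88 -- consistent with the reported 87 pM
```

The small difference from the stated 87 pM is rounding in the reported rate constants; the independently measured saturation-binding KD of 73 pM is of the same order, as expected for a single high-affinity site.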
Resumo:
A catalytic hydrogen combustion reaction was carried out over noble metal catalysts substituted in ZrO2 and TiO2 in ionic form. The catalysts were synthesized by the solution combustion technique. The compounds showed high activity and CO tolerance for the reaction. The activities of Pd and Pt ion substituted TiO2 were comparable to each other and higher than those of Pd and Pt ion substituted ZrO2. The mechanisms of the reaction over the two supports were proposed on the basis of X-ray photoelectron and FT-infrared spectroscopic observations. The reaction over ZrO2 supported catalysts was proposed to take place by the utilization of the surface hydroxyl groups, while the reaction over TiO2 supported catalysts was hypothesized to follow a hybrid mechanism utilizing surface hydroxyl groups and the lattice oxygen.