927 results for peer-to-peer (P2P) computing


Relevance: 40.00%

Publisher:

Abstract:

We report the synthesis and characterisation of new examples of meso-hydroxynickel(II) porphyrins with 5,15-diphenyl and 10-phenyl-5,15-diphenyl/diaryl substitution. The OH group was introduced by using carbonate or hydroxide as the nucleophile under palladium/phosphine catalysis. The NiPor-OH species exist in solution in equilibrium with the corresponding oxy radicals NiPor-O•. The 15-phenyl group stabilises the radicals, so that the 1H NMR spectra of NiPor-OH are extremely broad due to chemical exchange with the paramagnetic species. The radical concentration for the diphenylporphyrin analogue is only 1%, and its NMR line broadening could be studied by variable-temperature NMR spectroscopy. The EPR signals of NiPor-O• are consistent with somewhat delocalised porphyrinyloxy radicals, and the spin distributions calculated by using density functional theory match the EPR and NMR spectroscopic observations. Nickel(II) meso-hydroxy-10,20-diphenylporphyrin was oxidatively coupled to a dioxo-terminated porphodimethene dyad, the strongly red-shifted electronic spectrum of which was successfully modelled by using time-dependent DFT calculations.

Relevance: 40.00%

Publisher:

Abstract:

Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing how large the extent of OA is, in particular studies covering all fields of science.

Methodology/Principal Findings: The proportion of peer-reviewed scholarly journal articles that are available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers' sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry and chemistry, publishing in OA journals was more common. In all other fields author-posted manuscript copies dominated the picture.

Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in the uptake. Due to the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. The results should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders, who, like the NIH, are starting to require OA availability of results from research projects they fund. The method and search tools developed also offer a good basis for more in-depth studies as well as longitudinal studies.
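As a rough illustration of how the overall OA share above combines the two routes (8.5% free at publisher sites plus 11.9% author-posted manuscripts, about 20.4% in total), here is a minimal sketch with hypothetical sample counts; the real study worked from a random sample of 1837 journal titles.

```python
# Illustrative only: hypothetical counts standing in for the sampled articles.
sample = {
    "articles_checked": 1000,      # articles examined (hypothetical)
    "free_at_publisher": 85,       # full text free on the publisher's site
    "free_manuscript_only": 119,   # only a self-archived manuscript copy found
}

gold_share = sample["free_at_publisher"] / sample["articles_checked"]
green_share = sample["free_manuscript_only"] / sample["articles_checked"]
overall_oa = gold_share + green_share

print(f"Free at publisher site: {gold_share:.1%}")    # ~8.5%
print(f"Manuscript copy only:   {green_share:.1%}")   # ~11.9%
print(f"Overall OA share:       {overall_oa:.1%}")    # ~20.4%
```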

Relevance: 40.00%

Publisher:

Abstract:

One of the effects of the Internet is that the dissemination of scientific publications has within a few years migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as relating to the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.

Relevance: 40.00%

Publisher:

Abstract:

The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.

Relevance: 40.00%

Publisher:

Abstract:

When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journals on very incomplete information about how well the journals serve the authors' purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and the service provided by the journal during the review and publication process. The method draws on data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rates). The calculation of some important parameters (for instance average time from submission to publication, or regional spread of authorship) is possible but requires a considerable amount of work. It can be difficult to get reasonable response rates to author surveys. All in all, we believe that the method we propose, taking a "service to authors" perspective as a basis for benchmarking scientific journals, is useful and can provide information that is valuable to prospective authors in selected scientific disciplines.
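One of the parameters listed above, average time from submission to publication, can be computed directly once submission and publication dates have been collected for a set of articles; a minimal sketch with made-up dates (not data from the study):

```python
from datetime import date

# Hypothetical (submitted, published) date pairs harvested from article front matter.
articles = [
    (date(2009, 1, 15), date(2009, 9, 2)),
    (date(2009, 3, 1),  date(2010, 1, 20)),
    (date(2009, 6, 10), date(2009, 12, 5)),
]

delays = [(published - submitted).days for submitted, published in articles]
print(f"Average submission-to-publication delay: {sum(delays) / len(delays):.0f} days")
```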

Relevance: 40.00%

Publisher:

Abstract:

This book deals with the television of the future and certain marketing-law and copyright issues raised by new forms of television broadcasting. TV technology has developed considerably in recent years, and it is above all the transmission technology that has undergone the greatest changes. The emphasis of this work is on the delivery of TV broadcasts over the Internet using peer-to-peer (P2P) technology and the marketing-law and copyright questions this raises. The new delivery technology enables a range of new marketing methods, and the book examines how these methods relate to current marketing-law regulation. The distribution of copyright-protected material over the Internet has led to several reforms of copyright legislation aimed at strengthening the rights holders' exclusive rights. An important question the book addresses is how current copyright legislation should be interpreted in relation to new forms of distribution over the Internet. The book also takes up questions related to the problem that a large share of the material distributed over the Internet is shared without the rights holder's consent, which entails an economic loss for the rights holder. The book presents alternative solutions for how Internet distribution could provide the rights holder with financial compensation even though the distribution itself is free of charge.

Relevance: 40.00%

Publisher:

Abstract:

Erasure coding techniques are used to increase the reliability of distributed storage systems while minimizing storage overhead. Also of interest is minimization of the bandwidth required to repair the system following a node failure. In a recent paper, Wu et al. characterize the tradeoff between the repair bandwidth and the amount of data stored per node. They also prove the existence of regenerating codes that achieve this tradeoff. In this paper, we introduce Exact Regenerating Codes, which are regenerating codes possessing the additional property of being able to duplicate the data stored at a failed node. Such codes require low processing and communication overheads, making the system practical and easy to maintain. An explicit construction of exact regenerating codes is provided for the minimum-bandwidth point on the storage-repair bandwidth tradeoff, relevant to distributed-mail-server applications. A subspace-based approach is provided and shown to yield necessary and sufficient conditions for a linear code to possess the exact regeneration property, as well as to prove the uniqueness of our construction. Also included in the paper is an explicit construction of regenerating codes for the minimum-storage point for parameters relevant to storage in peer-to-peer systems. This construction supports a variable number of nodes and can handle multiple simultaneous node failures. All constructions given in the paper are of low complexity, requiring a low field size in particular.
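The storage-repair bandwidth tradeoff mentioned above has two well-known extreme points, minimum storage (MSR) and minimum bandwidth (MBR). The sketch below evaluates the standard expressions for these points from the cut-set bound framework of Wu et al.; it only illustrates the tradeoff and is not the paper's explicit code construction. A file of M symbols is stored across n nodes so that any k of them suffice to reconstruct it, and a failed node is repaired from d helper nodes.

```python
# Endpoints of the storage/repair-bandwidth tradeoff for regenerating codes.
# (The total number of nodes n does not enter these per-node expressions.)
def msr_point(M, k, d):
    """Minimum-storage regenerating (MSR) point: (storage per node, repair bandwidth)."""
    alpha = M / k                       # symbols stored per node
    gamma = d * M / (k * (d - k + 1))   # symbols downloaded to repair one node
    return alpha, gamma

def mbr_point(M, k, d):
    """Minimum-bandwidth regenerating (MBR) point."""
    gamma = 2 * M * d / (k * (2 * d - k + 1))
    alpha = gamma                       # at MBR, a node stores exactly what a repair downloads
    return alpha, gamma

M, k, d = 1200, 3, 4                    # hypothetical parameters (file of 1200 symbols)
print("MSR:", msr_point(M, k, d))       # least storage, more repair traffic
print("MBR:", mbr_point(M, k, d))       # more storage, least repair traffic
```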

Relevance: 40.00%

Publisher:

Abstract:

More than half a decade has passed since the December 26th 2004 tsunami hit the Indian coast leaving a trail of ecological, economic and human destruction in its wake. We reviewed the coastal ecological research carried out in India in the light of the tsunami. In addition, we also briefly reviewed the ecological research in other tsunami-affected countries in Asia, namely Sri Lanka, Indonesia, Thailand and Maldives, in order to provide a broader perspective of ecological research after the tsunami. A basic search in ISI Web of Knowledge using the keywords "tsunami" and "India" resulted in 127 peer-reviewed journal articles, of which 39 articles were pertaining to ecological sciences. In comparison, Sri Lanka, Indonesia, Thailand and Maldives had, respectively, eight, four, 21 and two articles pertaining to ecology. In India, bioshields received the major share of scientific interest (14 out of 39) while only one study (each) was dedicated to corals, seagrasses, seaweeds and meiofauna, pointing to the paucity of research attention dedicated to these critical ecosystems. We noted that very few interdisciplinary studies looked at linkages between pure/applied sciences and the social sciences in India. In addition, there appears to be little correlation between the limited research that was done and its influence on policy in India. This review points to gap areas in ecological research in India and highlights the lessons learnt from research in other tsunami-affected countries. It also provides guidance on the links between science and policy that are required for effective coastal zone management.

Relevance: 40.00%

Publisher:

Abstract:

Fragment Finder 2.0 is a web-based interactive computing server which can be used to retrieve structurally similar protein fragments from 25 and 90% nonredundant data sets. The computing server identifies structurally similar fragments using the protein backbone C alpha angles. In addition, the identified fragments can be superimposed using either of the two structural superposition programs, STAMP and PROFIT, provided in the server. The freely available Java plug-in Jmol has been interfaced with the server for the visualization of the query and superposed fragments. The server is the updated version of a previously developed search engine and employs an in-house-developed fast pattern matching algorithm. This server can be accessed freely over the World Wide Web through the URL http://cluster.physics.iisc.ernet.in/ff/.
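As an illustration of the general idea of comparing fragments through backbone C alpha geometry, the sketch below computes the virtual bond angle at each C alpha from three consecutive positions and scores two equal-length fragments by their mean angle difference. This is a toy stand-in with hypothetical coordinates, not the server's in-house pattern-matching algorithm.

```python
import math

def ca_angles(ca_coords):
    """Virtual bond angle (degrees) at every interior C-alpha position."""
    angles = []
    for a, b, c in zip(ca_coords, ca_coords[1:], ca_coords[2:]):
        u = [a[i] - b[i] for i in range(3)]
        v = [c[i] - b[i] for i in range(3)]
        dot = sum(u[i] * v[i] for i in range(3))
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        cos_t = max(-1.0, min(1.0, dot / (nu * nv)))
        angles.append(math.degrees(math.acos(cos_t)))
    return angles

def angle_distance(query, fragment):
    """Mean absolute angle difference between two equal-length fragments."""
    return sum(abs(q - f) for q, f in zip(query, fragment)) / len(query)

# Hypothetical coordinates for a 5-residue query and one database fragment.
query_ca = [(0, 0, 0), (3.8, 0, 0), (5.5, 3.4, 0), (9.2, 3.9, 1.0), (10.5, 7.4, 1.5)]
hit_ca   = [(0, 0, 0), (3.8, 0.2, 0), (5.6, 3.5, 0.1), (9.1, 4.0, 1.1), (10.6, 7.3, 1.6)]
print(f"Mean angle difference: {angle_distance(ca_angles(query_ca), ca_angles(hit_ca)):.2f} deg")
```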

Relevance: 40.00%

Publisher:

Abstract:

Hydrogen bonds in biological macromolecules play significant structural and functional roles. They are the key contributors to most of the interactions without which no living system exists. In view of this, a web-based computing server, the Hydrogen Bonds Computing Server (HBCS), has been developed to compute hydrogen-bond interactions and their standard deviations for any given macromolecular structure. The computing server is connected to a locally maintained Protein Data Bank (PDB) archive. Thus, the user can calculate the above parameters for any deposited structure, and options have also been provided for the user to upload a structure in PDB format from the client machine. In addition, the server has been interfaced with the molecular viewers Jmol and JSmol to visualize the hydrogen-bond interactions. The proposed server is freely available and accessible via the World Wide Web at http://bioserver1.physics.iisc.ernet.in/hbcs/.
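A purely geometric sketch of the kind of hydrogen-bond detection such a server automates: flag donor-acceptor atom pairs that fall within a distance cutoff. The donor/acceptor lists, the 3.5 Å cutoff and the coordinates below are illustrative assumptions; the server's actual criteria and its standard-deviation estimates are not reproduced here.

```python
import math

# Very small illustrative donor/acceptor tables and a common distance criterion.
DONORS = {("ARG", "NH1"), ("ARG", "NH2"), ("LYS", "NZ"), ("SER", "OG")}
ACCEPTORS = {("ASP", "OD1"), ("ASP", "OD2"), ("GLU", "OE1"), ("SER", "OG")}
CUTOFF = 3.5  # angstroms, donor-acceptor distance

def distance(a, b):
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

# Hypothetical atoms: (residue name, atom name, residue number, xyz)
atoms = [
    ("LYS", "NZ",  12, (10.1, 4.2, 3.3)),
    ("ASP", "OD1", 45, (11.8, 5.0, 4.6)),
    ("SER", "OG",  60, (25.0, 9.9, 1.2)),
]

for res_d, atom_d, num_d, xyz_d in atoms:
    for res_a, atom_a, num_a, xyz_a in atoms:
        if num_d == num_a:
            continue  # skip pairs within the same residue
        if (res_d, atom_d) in DONORS and (res_a, atom_a) in ACCEPTORS:
            d = distance(xyz_d, xyz_a)
            if d <= CUTOFF:
                print(f"{res_d}{num_d} {atom_d} ... {res_a}{num_a} {atom_a}: {d:.2f} A")
```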

Relevance: 40.00%

Publisher:

Abstract:

Setting total allowable catches (TACs) is an endogenous process in which different agents and institutions, often with conflicting interests and opportunistic behaviour, try to influence policy-makers. Such policy-makers, far from being the benevolent social planners many would wish them to be, may also pursue self-interest when making final decisions. Although restricted knowledge of stock abundance and population dynamics, and weakness in enforcement, have effects, these other factors may explain why TAC management has failed to guarantee sustainable exploitation of fish resources. Rejecting the exogeneity of the TAC and taking advantage of the fruitful debate on economic policy (i.e. the rules vs. discretion debate, and that surrounding the independence of central banks), two institutional developments are analysed as potential mechanisms for addressing these misconceptions about TACs: long-term harvest control rules, and a central bank of fish.
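To make the notion of a long-term harvest control rule concrete, the sketch below implements a generic "hockey-stick" rule in which the TAC follows mechanically from the biomass estimate rather than from annual negotiation. The reference points and target rate are hypothetical and are not taken from the paper.

```python
# Hypothetical reference points for a pre-committed, rule-based TAC.
B_LIM = 100_000       # biomass (tonnes) below which fishing is closed
B_TRIGGER = 300_000   # biomass above which the full target rate applies
F_TARGET = 0.25       # target exploitation rate (fraction harvested per year)

def tac(biomass_estimate: float) -> float:
    """Total allowable catch implied by the rule for a given biomass estimate."""
    if biomass_estimate <= B_LIM:
        return 0.0
    if biomass_estimate >= B_TRIGGER:
        return F_TARGET * biomass_estimate
    # Linear ramp of the exploitation rate between the limit and trigger points.
    scale = (biomass_estimate - B_LIM) / (B_TRIGGER - B_LIM)
    return F_TARGET * scale * biomass_estimate

for b in (80_000, 200_000, 400_000):
    print(f"Biomass {b:>7,} t -> TAC {tac(b):>9,.0f} t")
```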

Relevance: 40.00%

Publisher:

Abstract:

The scalability of CMOS technology has driven computation into a diverse range of applications across the power consumption, performance and size spectra. Communication is a necessary adjunct to computation, and whether this is to push data from node to node in a high-performance computing cluster or from the receiver of a wireless link to a neural stimulator in a biomedical implant, interconnect can take up a significant portion of the overall system power budget. Although a single interconnect methodology cannot address such a broad range of systems efficiently, there are a number of key design concepts that enable good interconnect design in the age of highly-scaled CMOS: an emphasis on highly-digital approaches to solving ‘analog’ problems, hardware sharing between links as well as between different functions (such as equalization and synchronization) in the same link, and adaptive hardware that changes its operating parameters to mitigate not only variation in the fabrication of the link, but also link conditions that change over time. These concepts are demonstrated through the use of two design examples, at the extremes of the power and performance spectra.

A novel all-digital clock and data recovery technique for high-performance, high density interconnect has been developed. Two independently adjustable clock phases are generated from a delay line calibrated to 2 UI. One clock phase is placed in the middle of the eye to recover the data, while the other is swept across the delay line. The samples produced by the two clocks are compared to generate eye information, which is used to determine the best phase for data recovery. The functions of the two clocks are swapped after the data phase is updated; this ping-pong action allows an infinite delay range without the use of a PLL or DLL. The scheme's generalized sampling and retiming architecture is used in a sharing technique that saves power and area in high-density interconnect. The eye information generated is also useful for tuning an adaptive equalizer, circumventing the need for dedicated adaptation hardware.
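A behavioural sketch of the ping-pong phase update described above: one clock phase recovers data while the other is swept across the delay line to build an eye profile, and the data phase is then moved to the best-scoring setting so the two clocks can swap roles. The sampler and all constants below are hypothetical stand-ins for the circuit, which compares the scanning sample against the data sample.

```python
import random

DELAY_STEPS = 64                 # delay-line settings spanning 2 UI
EYE_CENTER = 23                  # "true" optimum setting, unknown to the CDR

def sample_agrees(phase: int) -> bool:
    """Toy sampler: agreement with the data sample is most likely near the eye centre."""
    margin = max(0, 16 - abs(phase - EYE_CENTER))
    return random.random() < margin / 16

def ping_pong_cdr(sweeps: int = 4) -> int:
    data_phase = 0               # setting of the clock currently recovering data
    for _ in range(sweeps):
        # Sweep the other clock across the whole delay line and score each setting.
        scores = [sum(sample_agrees(p) for _ in range(200)) for p in range(DELAY_STEPS)]
        # Move the data phase to the best-scoring setting; in hardware the two clocks
        # then swap roles (ping-pong), giving an unbounded delay range without a PLL/DLL.
        data_phase = max(range(DELAY_STEPS), key=lambda p: scores[p])
    return data_phase

random.seed(1)
print("Recovered data phase:", ping_pong_cdr())   # settles near EYE_CENTER
```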

On the other side of the performance/power spectra, a capacitive proximity interconnect has been developed to support 3D integration of biomedical implants. In order to integrate more functionality while staying within size limits, implant electronics can be embedded onto a foldable parylene (‘origami’) substrate. Many of the ICs in an origami implant will be placed face-to-face with each other, so wireless proximity interconnect can be used to increase communication density while decreasing implant size, as well as facilitate a modular approach to implant design, where pre-fabricated parylene-and-IC modules are assembled together on-demand to make custom implants. Such an interconnect needs to be able to sense and adapt to changes in alignment. The proposed array uses a TDC-like structure to realize both communication and alignment sensing within the same set of plates, increasing communication density and eliminating the need to infer link quality from a separate alignment block. In order to distinguish the communication plates from the nearby ground plane, a stimulus is applied to the transmitter plate, which is rectified at the receiver to bias a delay generation block. This delay is in turn converted into a digital word using a TDC, providing alignment information.
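The alignment-sensing chain described above (plate overlap, coupling capacitance, rectified bias, delay generation, TDC code) can be captured in a simple behavioural model; every relationship and constant below is a hypothetical placeholder meant only to show how misalignment maps to a digital word.

```python
def coupling_capacitance(overlap_fraction: float, c_max_ff: float = 50.0) -> float:
    """Plate-to-plate capacitance (fF) scales with the overlapping plate area."""
    return c_max_ff * max(0.0, min(1.0, overlap_fraction))

def delay_from_bias(c_ff: float, k_ns: float = 20.0) -> float:
    """Rectified stimulus biases a delay cell: more coupling -> more bias -> less delay."""
    return k_ns / (1.0 + c_ff)          # ns, toy monotonic relationship

def tdc_code(delay_ns: float, lsb_ns: float = 0.05, bits: int = 6) -> int:
    """Time-to-digital converter: quantise the delay into a digital alignment word."""
    return min((1 << bits) - 1, int(delay_ns / lsb_ns))

for overlap in (1.0, 0.6, 0.2):
    c = coupling_capacitance(overlap)
    code = tdc_code(delay_from_bias(c))
    print(f"overlap {overlap:.0%}: C = {c:.0f} fF, TDC code = {code}")
```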