985 results for Web version


Relevance: 20.00%

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate it. Copies of the user's data could be stored on some of his family members' computers, on several of his own computers, and also at some online services he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to store the same data at various locations. We studied three systems, Persona, Freenet, and GNUnet, which suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items are protected with cryptography against forgery but are not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. Our early tests suggest that the platform works well for simple applications. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
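The cryptographically verifiable references described above can be illustrated with a minimal content-addressing sketch. This is our own illustration of the general idea, not Peerscape's actual scheme:

```python
import hashlib

def make_ref(data: bytes) -> str:
    # A reference is the SHA-256 digest of the item's bytes, so any replica
    # holding the bytes can prove they match the reference.
    return hashlib.sha256(data).hexdigest()

def verify(ref: str, data: bytes) -> bool:
    # Recompute the digest; a mismatch means the copy was forged or altered.
    return make_ref(data) == ref

album = b"family photo album, revision 3"  # stand-in for a real data item
ref = make_ref(album)
assert verify(ref, album)
assert not verify(ref, b"forged album")
```

Because the reference is derived from the content itself, the same item stored on a family member's computer or at an online service verifies identically, which is what allows one logical copy to exist in many places.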


Relevance: 20.00%

Abstract:

Electronic document management (EDM) technology has the potential to enhance information management in construction projects considerably, without radical changes to current practice. Over the past fifteen years this topic has been overshadowed by building product modelling in the construction IT research world, but at present EDM is quickly being introduced in practice, in particular in bigger projects. Often this is done in the form of third-party services available over the World Wide Web. In the paper, a typology of research questions and methods is presented, which can be used to position the individual research efforts surveyed in the paper. Questions dealt with include: What features should EDM systems have? How much are they used? Are there benefits from use and how should these be measured? What are the barriers to widespread adoption? Which technical questions need to be solved? Is there scope for standardisation? How will the market for such systems evolve?

Relevance: 20.00%

Abstract:

Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing the extent of OA, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer-reviewed scholarly journal articles available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers' sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry and chemistry, publishing in OA journals was more common. In all other fields author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in its uptake. Due to the lack of awareness of OA publishing among scientists in most fields outside physics, the results should be of general interest to all scholars. The results should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders who, like the NIH, are starting to require OA availability of results from the research projects they fund. The method and search tools developed also offer a good basis for more in-depth as well as longitudinal studies.
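The headline percentages combine additively; a trivial sketch using the figures quoted above:

```python
# Shares of articles published in 2008, as reported in the study above.
gold_oa = 0.085    # freely available at the publishers' sites
green_oa = 0.119   # free manuscript versions found via search engines

overall = gold_oa + green_oa
print(f"Overall OA share: {overall:.1%}")  # prints "Overall OA share: 20.4%"
```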

Relevance: 20.00%

Abstract:

When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journal on very incomplete information about how well the journals serve the authors' purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and the service provided by the journal during the review and publication process. The method uses data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through author surveys; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data-acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rate). The calculation of some important parameters (for instance, average time from submission to publication, or regional spread of authorship) can be done but requires quite a lot of work. It can be difficult to get reasonable response rates to author surveys. All in all, we believe that the method we propose, taking a "service to authors" perspective as a basis for benchmarking scientific journals, is useful and can provide information valuable to prospective authors in selected scientific disciplines.

Relevance: 20.00%

Abstract:

The current mainstream scientific-publication process has so far been only marginally affected by the possibilities offered by the Internet, despite some pioneering attempts with free electronic-only journals and electronic preprint archives. Additional electronic versions of traditional paper journals, for which one needs a subscription, are not a solution. A clear trend, for young researchers in particular, is to go around subscription barriers (both for paper and electronic material) and rely almost exclusively on what they can find free on the Internet, which often includes working versions posted on the home pages of the authors. A survey of how scientists retrieve publications was conducted in February 2000, aimed at measuring to what extent the opportunities offered by the Internet are already changing the scientific information exchange and how researchers feel about this. This paper presents the results based on 236 replies to an extensive Web-based questionnaire, which was announced to around 3,000 researchers in the domains of construction information technology and construction management. The questions dealt with how researchers find, access, and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and criteria for where they eventually decide to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's Web site. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university.

Relevance: 20.00%

Abstract:

Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide, as well as the share of these articles available openly on the Web, either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory), supplemented by sampling and Google searches. Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies, which counted the copies in identified repositories; we instead start from a random sample of articles and test whether copies can be found by a Web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available, and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the ongoing debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
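Combining the estimates above into absolute counts, a back-of-the-envelope sketch using only figures from the abstract:

```python
# 2006 estimates reported in the study above.
total_articles = 1_350_000
immediate_oa = 0.046   # openly available immediately on publication
delayed_oa = 0.035     # free after an embargo of, typically, one year
green_copies = 0.113   # usable copies in repositories or on home pages

oa_share = immediate_oa + delayed_oa + green_copies
oa_articles = round(total_articles * oa_share)
print(f"{oa_articles:,} of {total_articles:,} articles openly available")
```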

Relevance: 20.00%

Abstract:

Publishers of academic journals can be seen as service providers to authors, in addition to their traditional role as providers of research results to readers. The purpose of this study was to analyse how authors' choice of journal in construction management is affected by quality and service perceptions. Seven journals were identified and, for each article published in 2006, one author e-mail address was extracted. A web-based questionnaire was sent to 397 authors and 35% responded. It was found that three journals were regularly followed by at least half the respondents. Most of the other four journals have scopes broader than construction management and receive lower scores for characteristics such as impact on researchers. No open access journals were included, and authors in the field of construction management rarely post openly accessible copies of their manuscripts or publications on the web. Authors' ranking of journals for their next submission was found to be related to general criteria such as academic status, circulation figures and ISI indexation.

Relevance: 20.00%

Abstract:

The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing the scientific information exchange and how researchers feel about the changes. The paper presents the results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how many and what publications they read; how often and to which conferences they travel; how much they publish; and their criteria for where they eventually decide to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available freely in their entirety on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web. In these circumstances, the role of paper-based journals published by established publishers is shifting from core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.

Relevance: 20.00%

Abstract:

Purpose – The aim of this paper is to explore what kind of measures personnel managers have taken to intervene in workplace harassment and to explore how organisational characteristics and the characteristics of the personnel manager affect the choice of response strategies. Design/methodology/approach – The study was exploratory and used a survey design. A web-based questionnaire was sent to the personnel managers of all Finnish municipalities and data on organisational responses and organisational characteristics were collected. Findings – The study showed that the organisations surveyed relied heavily on reconciliatory measures for responding to workplace harassment and that punitive measures were seldom used. Findings indicated that personnel manager gender, size of municipality, use of “sophisticated” human resource management practices and having provided information and training to increase awareness about harassment all influence the organisational responses chosen. Research limitations/implications – Only the effects of organisational and personnel manager characteristics on organisational responses were analysed. Future studies need to include perpetrator characteristics and harassment severity. Practical implications – The study informs both practitioners and policy makers about the measures that have been taken and that can be taken in order to stop harassment. It also questions the effectiveness of written anti-harassment policies for influencing organisational responses to harassment and draws attention to the role of gendered perceptions of harassment for choice of response strategy. Originality/value – This paper fills a gap in harassment research by reporting on the use of different response strategies and by providing initial insights into factors affecting choice of responses.

Relevance: 20.00%

Abstract:

Encoding protein 3D structures into 1D strings using short structural prototypes, or structural alphabets, opens a new front for structure comparison and analysis. Using the well-documented 16 motifs of Protein Blocks (PBs) as a structural alphabet, we have developed a methodology to compare protein structures that are encoded as sequences of PBs, aligning them by dynamic programming with a substitution matrix for PBs. This methodology is implemented in the applications available on the Protein Block Expert (PBE) server. PBE addresses common issues in the field of protein structure analysis, such as comparison of protein structures and identification of protein structures in structural databanks that resemble a given structure. PBE-T transforms any PDB file into sequences of PBs. PBE-ALIGNc compares two protein structures based on the alignment of their corresponding PB sequences. PBE-ALIGNm mines the SCOP database for similar structures based on the alignment of PBs. In addition, PBE provides an interface to a database (PBE-SAdb) of preprocessed PB sequences from SCOP culled at 95%, and of all-against-all pairwise PB alignments at family and superfamily levels. The PBE server is freely available at http://bioinformatics.univ-reunion.fr/PBE/.
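The core comparison step, aligning two structures encoded as strings over a structural alphabet by dynamic programming with a substitution matrix, can be sketched as follows. The two-letter alphabet, toy scores and gap penalty are illustrative assumptions; the actual method uses the 16-letter PB alphabet and its dedicated substitution matrix.

```python
# Minimal global alignment (Needleman-Wunsch style) over a structural alphabet.
def align(s1: str, s2: str, sub, gap: int = -2) -> int:
    n, m = len(s1), len(s2)
    # dp[i][j] = best score aligning s1[:i] with s2[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = max(
                dp[i - 1][j - 1] + sub[(s1[i - 1], s2[j - 1])],  # match/mismatch
                dp[i - 1][j] + gap,                              # gap in s2
                dp[i][j - 1] + gap,                              # gap in s1
            )
    return dp[n][m]

# Toy substitution matrix over two hypothetical Protein Blocks 'a' and 'b'.
sub = {('a', 'a'): 2, ('b', 'b'): 2, ('a', 'b'): -1, ('b', 'a'): -1}
score = align("aabba", "aaba", sub)
print(score)  # 6: four matches (score 2 each) and one gap (-2)
```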

Relevance: 20.00%

Abstract:

Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate detection of remotely related proteins. We have recently demonstrated the use of Cascade PSI-BLAST, where we perform PSI-BLAST for many 'generations', initiating searches from new homologues as well. Such a rigorous propagation through generations of PSI-BLAST effectively exploits the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds, and its performance in detecting superfamily-level relationships is ~35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
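The cascading idea, re-launching searches from every new homologue for several generations so that intermediates bridge distant relatives, can be sketched generically. `search_homologues` below is a hypothetical stand-in for a single PSI-BLAST run against a database:

```python
from collections import deque

def cascade(query, search_homologues, generations=3):
    # Breadth-first propagation: every newly found homologue seeds a
    # fresh search in the next generation, up to a generation limit.
    found = {query}
    frontier = deque([(query, 0)])
    while frontier:
        seq, gen = frontier.popleft()
        if gen >= generations:
            continue
        for hit in search_homologues(seq):
            if hit not in found:
                found.add(hit)
                frontier.append((hit, gen + 1))
    return found - {query}

# Toy "database": A finds B, B finds C, so the distant C is reached only
# via the intermediate B.
toy = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
hits = cascade("A", lambda s: toy.get(s, []))
print(sorted(hits))  # ['B', 'C']
```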

Relevance: 20.00%

Abstract:

The unique characteristics of marketspace, in combination with the fast-growing number of consumers interested in e-commerce, have created new research areas of interest to both marketing and consumer behaviour researchers. Consumer behaviour researchers interested in the decision-making processes of consumers have two new sets of questions to answer. The first set of questions concerns how useful theories developed for the marketplace are in a marketspace context. Cyber auctions, Internet communities and the possibilities for consumers to establish dialogues not only with companies but also with other consumers make marketspace unique. The effects of these distinctive characteristics on the behaviour of consumers have not been systematically analysed and therefore constitute the second set of questions to be studied. Most companies feel that they have to be online even though the effects of being on the Net are not unambiguously positive. The relevance of the relationship marketing paradigm in a marketspace context also has to be studied. The relationship-enhancement effects of websites from the customers' point of view are therefore emphasized in this research paper. Representatives of the Net generation were analysed, and the results show that companies should develop marketspace strategies, as Net presence has a value-added effect on consumers. The results indicate that the decision-making processes of consumers are also changing as a result of the progress of marketspace.

Relevance: 20.00%

Abstract:

The title of the 14th International Conference on Electronic Publishing (ELPUB), “Publishing in the networked world: Transforming the nature of communication”, is a timely one. Scholarly communication and scientific publishing have recently been undergoing subtle changes. Published papers are no longer the fixed physical objects they once were. The “convergence” of information, communication, publishing and web technologies, along with the emergence of Web 2.0 and social networks, has completely transformed scholarly communication, and scientific papers have turned into living, changing entities in the online world. The themes selected for the conference (electronic publishing and social networks; scholarly publishing models; and technological convergence) are meant to address the issues involved in this transformation process. We are pleased to present the proceedings book with more than 30 papers and short communications addressing these issues. What you hold in your hands is the culmination of almost a year's work by many people, including conference organizers, authors, reviewers, editors and print and online publishers. The ELPUB 2010 conference was organized and hosted by the Hanken School of Economics in Helsinki, Finland. Professors Turid Hedlund of Hanken School of Economics and Yaşar Tonta of the Hacettepe University Department of Information Management (Ankara, Turkey) served as General Chair and Program Chair, respectively. We received more than 50 submissions from several countries. All submissions were peer-reviewed by members of an international Program Committee, whose contributions proved most valuable and appreciated. The 14th ELPUB conference carries on the tradition of previous conferences held in the United Kingdom (1997 and 2001), Hungary (1998), Sweden (1999), Russia (2000), the Czech Republic (2002), Portugal (2003), Brazil (2004), Belgium (2005), Bulgaria (2006), Austria (2007), Canada (2008) and Italy (2009). 
The ELPUB Digital Library, http://elpub.scix.net, serves as an archive for the papers presented at ELPUB conferences through the years. The 15th ELPUB conference will be organized by the Department of Information Management of Hacettepe University and will take place in Ankara, Turkey, from 14 to 16 June 2011. (Details can be found on the ELPUB web site as the conference date nears.) We thank Marcus Sandberg and Hannu Sääskilahti for copyediting, and Library Director Tua Hindersson-Söderholm for agreeing to publish the online as well as the print version of the proceedings. Thanks also to Patrik Welling for maintaining the conference web site and Tanja Dahlgren for administrative support. We warmly acknowledge the support of colleagues at the Hanken School of Economics in organizing the conference, and of our sponsors.

Relevance: 20.00%

Abstract:

This thesis examines consumers' views on climate change and on the acceptability of a monitoring and feedback system for the climate impacts of consumption as a policy instrument for steering consumption. Fifteen consumers tried a demonstration version of such a monitoring and feedback system for one month and, based on the trial, took part in an online discussion on the topic. The online-discussion material is analysed from the perspective of everyday reasoning, studying consumers' everyday knowledge of climate change and environmental responsibility, as well as the heuristics they apply to the acceptability of the service. Climate change was generally felt to be a rather abstract and ambiguous phenomenon, which makes it hard for consumers to understand the concrete significance of their own choices for climate change. Although much information on the climate impacts of consumption is available, information produced by companies in particular was perceived as contradictory and partly unreliable. Consumers also criticised the narrow view of the environmental impacts of consumption offered by the climate-change debate. Instead of focusing on carbon dioxide emissions, they felt environmental impacts should be examined as a whole, of which climate impacts form only one part. Climate change affected the consumption habits of the participants to varying degrees. For some, climate change had become a central norm guiding their consumption, while others said they considered climate impacts mainly for major purchases. In environmental responsibility, what mattered was finding a balance and a personal sense of acting correctly. Climate and environmental questions are weighed flexibly in choices, together with other factors. Even when consumers have the knowledge and the will to take climate and environmental impacts into account in their choices, the operating environment fundamentally limits their possibilities to act in an environmentally responsible way. 
The study revealed four heuristics that consumers used when considering the conditions for the acceptability of a climate-impact monitoring and feedback system and its functionality as a policy instrument. First, the service must be quick and effortless to use and offer information in an illustrative, easily understandable form. Second, the information provided by the service must be absolutely reliable and relevant to consumers' choices, so that the service takes different consumers and information needs into account. Third, the service should be implemented comprehensively and transparently, in cooperation between several retail groups and public actors. Fourth, the implementation should take into account the service's ability to motivate and its links to other policy instruments.