26 results for google
Abstract:
Many producers of geographic information are now disseminating their data using open web service protocols, notably those published by the Open Geospatial Consortium. There are many challenges inherent in running robust and reliable services at reasonable cost. Cloud computing provides a new kind of scalable infrastructure that could address many of these challenges. In this study we implement a Web Map Service for raster imagery within the Google App Engine environment. We discuss the challenges of developing GIS applications within this framework and the performance characteristics of the implementation. Results show that the application scales well to multiple simultaneous users and performance will be adequate for many applications, although concerns remain over issues such as latency spikes. We discuss the feasibility of implementing services within the free usage quotas of Google App Engine and the possibility of extending the approaches in this paper to other GIS applications.
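A Web Map Service like the one studied must first parse the standard GetMap query parameters before any raster rendering can happen. The sketch below shows that parsing step for a WMS 1.1.1-style request; it is a minimal illustration, not the paper's actual App Engine code, and the function name is ours.

```python
from urllib.parse import parse_qs

def parse_getmap(query_string):
    """Parse the core parameters of a WMS 1.1.1 GetMap request.

    Hypothetical sketch: real services must also validate STYLES,
    FORMAT, VERSION and return OGC service exceptions on error.
    """
    params = {k.upper(): v[0] for k, v in parse_qs(query_string).items()}
    if params.get("REQUEST") != "GetMap":
        raise ValueError("not a GetMap request")
    minx, miny, maxx, maxy = (float(x) for x in params["BBOX"].split(","))
    return {
        "layers": params["LAYERS"].split(","),
        "bbox": (minx, miny, maxx, maxy),
        "width": int(params["WIDTH"]),
        "height": int(params["HEIGHT"]),
        "srs": params.get("SRS", "EPSG:4326"),
    }
```

The bounding box, pixel dimensions and layer list returned here are exactly the inputs a tile-rendering back end needs to select and scale the stored raster imagery.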
Abstract:
Purpose - The purpose of this paper is to identify the most popular techniques used to rank a web page highly in Google. Design/methodology/approach - The paper presents the results of a study into 50 highly optimized web pages that were created as part of a Search Engine Optimization competition. The study focuses on the most popular techniques that were used to rank highest in this competition, and includes an analysis on the use of PageRank, number of pages, number of in-links, domain age and the use of third party sites such as directories and social bookmarking sites. A separate study was made into 50 non-optimized web pages for comparison. Findings - The paper provides insight into the techniques that successful Search Engine Optimizers use to ensure a page ranks highly in Google. It recognizes the importance of PageRank and links as well as directories and social bookmarking sites. Research limitations/implications - Only the top 50 web sites for a specific query were analyzed. Analyzing more web sites and comparing with similar studies in different competitions would provide more concrete results. Practical implications - The paper offers a revealing insight into the techniques used by industry experts to rank highly in Google, and the success or otherwise of those techniques. Originality/value - This paper fulfils an identified need for web sites and e-commerce sites keen to attract a wider web audience.
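PageRank, one of the factors analyzed above, is computed by iteratively redistributing each page's score across its out-links. The following is a textbook power-iteration sketch of the idea, not Google's production algorithm; the graph representation and parameter values are illustrative assumptions.

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank over a {page: [outlinks]} graph.

    Textbook sketch only: each page's rank is split among its
    out-links, damped by a teleport factor; dangling pages spread
    their rank evenly across all pages.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    if q in new:
                        new[q] += damping * share
            else:  # dangling node: distribute rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank
```

Because the total rank mass is conserved each iteration, the scores always sum to one and can be compared directly across pages.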
Abstract:
With the advent of mass digitization projects, such as the Google Book Search, a peculiar shift has occurred in the way that copyright works are dealt with. Contrary to what has so far been the case, works are turned into machine-readable data to be automatically processed for various purposes without the expression of works being displayed to the public. In the Google Book Settlement Agreement, this new kind of usage is referred to as ‘non-display uses’ of digital works. The legitimacy of these uses has not yet been tested by Courts and does not comfortably fit in the current copyright doctrine, plainly because the works are not used as works but as something else, namely as data. Since non-display uses may prove to be a very lucrative market in the near future, with the potential to affect the way people use copyright works, we examine non-display uses under the prism of copyright principles to determine the boundaries of their legitimacy. Through this examination, we provide a categorization of the activities carried out under the heading of ‘non-display uses’, we examine their lawfulness under the current copyright doctrine and approach the phenomenon from the spectrum of data protection law that could apply, by analogy, to the use of copyright works as processable data.
Abstract:
Results are presented from a new web application called OceanDIVA - Ocean Data Intercomparison and Visualization Application. This tool reads hydrographic profiles and ocean model output and presents the data on either depth levels or isotherms for viewing in Google Earth, or as probability density functions (PDFs) of regional model-data misfits. As part of the CLIVAR Global Synthesis and Observations Panel, an intercomparison of water mass properties of various ocean syntheses has been undertaken using OceanDIVA. Analysis of model-data misfits reveals significant differences between the water mass properties of the syntheses, such as the ability to capture mode water properties.
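The misfit-PDF idea described above amounts to binning model-minus-observation differences and normalising the histogram to integrate to one. The sketch below is a hypothetical illustration of that step, not OceanDIVA's actual code; the function name and binning scheme are assumptions.

```python
def misfit_pdf(model, observed, bins=10):
    """Histogram-based probability density of model-data misfits.

    Illustrative sketch: equal-width bins over the misfit range,
    normalised so density * bin_width sums to one.
    """
    misfits = [m - o for m, o in zip(model, observed)]
    lo, hi = min(misfits), max(misfits)
    width = (hi - lo) / bins or 1.0  # guard against zero range
    counts = [0] * bins
    for x in misfits:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    n = len(misfits)
    density = [c / (n * width) for c in counts]
    edges = [lo + i * width for i in range(bins + 1)]
    return density, edges
```

A narrow density peaked at zero indicates a synthesis that reproduces the observed water mass properties well; a broad or offset density exposes the kind of inter-synthesis differences the study reports.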
Abstract:
Smooth flow of production in construction is hampered by disparity between individual trade teams' goals and the goals of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling addresses some of these issues by providing a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.
Abstract:
We consider a non-local version of the NJL model, based on a separable quark-quark interaction. The interaction is extended to include terms that bind vector and axial-vector mesons. The non-locality means that no further regulator is required. Moreover the model is able to confine the quarks by generating a quark propagator without poles at real energies. Working in the ladder approximation, we calculate amplitudes in Euclidean space and discuss features of their continuation to Minkowski energies. Conserved currents are constructed and we demonstrate their consistency with various Ward identities. Various meson masses are calculated, along with their strong and electromagnetic decay amplitudes. We also calculate the electromagnetic form factor of the pion, as well as form factors associated with the processes γγ* → π0 and ω → π0γ*. The results are found to lead to a satisfactory phenomenology and lend some dynamical support to the idea of vector-meson dominance.
Abstract:
The aim of using GPS for Alzheimer's patients is to give carers and families of those affected by Alzheimer's disease, as well as all the other dementia related conditions, a service that can, via SMS text message, notify them should their loved one leave their home. Through a custom website, it enables the carer to remotely manage a contour boundary that is specifically assigned to the patient as well as the telephone numbers of the carers. The technique makes liberal use of technologies such as Google Maps.
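The core geofencing test behind such a service is to check whether a GPS fix lies outside the boundary assigned to the patient. The sketch below assumes a simple circular boundary around the home (the paper describes a more general contour boundary) and uses the haversine great-circle distance; the function name and radius parameter are our illustrative assumptions.

```python
import math

def outside_boundary(lat, lon, home_lat, home_lon, radius_m):
    """Return True if (lat, lon) lies outside a circle of radius_m
    metres around (home_lat, home_lon).

    Hypothetical sketch using the haversine formula; a True result
    would be the trigger for the SMS notification described.
    """
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(home_lat), math.radians(lat)
    dphi = math.radians(lat - home_lat)
    dlmb = math.radians(lon - home_lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    dist = 2 * r * math.asin(math.sqrt(a))
    return dist > radius_m
```

In practice the check would run on each incoming GPS fix, with hysteresis to avoid repeated alerts as the patient moves along the boundary.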
Abstract:
The development of large scale virtual reality and simulation systems has been mostly driven by the DIS and HLA standards community. A number of issues are coming to light about the applicability of these standards, in their present state, to the support of general multi-user VR systems. This paper pinpoints four issues that must be readdressed before large scale virtual reality systems become accessible to a larger commercial and public domain: a reduction in the effects of network delays; scalable causal event delivery; update control; and scalable reliable communication. Each of these issues is tackled through a common theme of combining wall clock and causal time-related entity behaviour, knowledge of network delays and prediction of entity behaviour, which together overcome many of the effects of network delay.
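The prediction theme above is commonly realised as dead reckoning: each host extrapolates a remote entity's state from its last reported position and velocity plus the measured network delay, so rendering masks part of the latency. The following is a minimal sketch of that extrapolation, with names and a linear motion model chosen for illustration.

```python
def predict_position(last_pos, velocity, sent_time, now):
    """Dead-reckon an entity's current position.

    Hypothetical sketch: linear extrapolation from the last reported
    state; dt covers both network delay and time since the update.
    Real systems add error thresholds that force a fresh update when
    prediction drifts too far from truth.
    """
    dt = now - sent_time
    return tuple(p + v * dt for p, v in zip(last_pos, velocity))
```

Because updates only need to be sent when the sender's own prediction of itself diverges from its true state, this also reduces update traffic, touching the update-control issue the paper raises.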
Abstract:
A live work where digital and analogue media collide. This work uses the Internet as a central point of departure in that the script is taken from the Wikipedia entry for the word 'slideshow'. Words are randomly extracted and transferred onto photographic 35mm slide to be projected with analogue carousel slide projectors taking the audience into a visual wordplay, from Google to PowerPoint presentation. The sound of projectors is manipulated gradually into a clashing, confrontational, digital/analogue crescendo. 'Slideshow' investigates how information is sourced, navigated and considered in a culture of accelerating mediation. It posits the notion of a post-digital era in which we are increasingly faced with challenging questions of authenticity and authority.
Abstract:
Advances in hardware and software in the past decade make it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances in order to cope with the real time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data of continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, in some applications it is required to analyse the data in real time as soon as it is being captured, for example if the data stream is infinite, fast changing, or simply too large in size to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drifts. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new and removing old rules. It is different from the more popular decision tree based classifiers as it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
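The abstention behaviour described for eRules can be sketched as a rule list where a rule fires only when all of its attribute tests match, and an instance covered by no rule is left unclassified rather than forced into a class. This is a hypothetical illustration of that behaviour only; eRules' actual rule induction and concept-drift handling are not shown, and the class and method names are ours.

```python
class RuleClassifier:
    """Minimal abstaining rule-list classifier (illustrative sketch)."""

    def __init__(self):
        self.rules = []  # list of (conditions: dict, label)

    def add_rule(self, conditions, label):
        # e.g. conditions = {"colour": "red"} -> label "stop"
        self.rules.append((conditions, label))

    def remove_stale(self, keep):
        # drop rules a drift detector has flagged; keep is a predicate
        self.rules = [r for r in self.rules if keep(r)]

    def classify(self, instance):
        for conditions, label in self.rules:
            if all(instance.get(a) == v for a, v in conditions.items()):
                return label
        return None  # abstain instead of guessing
```

Returning None for uncovered instances is the design choice the paper highlights: an unclassified instance can be buffered for later rule induction instead of contributing a possibly wrong prediction.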
Abstract:
The present invention provides a process comprising substitution of an acceptor molecule comprising a group -XC(O)- wherein X is O, S or NR8, where R8 is C1-6 alkyl, C6-12 aryl or hydrogen, with a nucleophile, wherein the acceptor molecule is cyclised such that said nucleophilic substitution at -XC(O)- occurs without racemisation. This process has particular application for the production of a peptide by extension from the activated carboxy-terminus of an acyl amino acid residue without epimerisation.
Abstract:
Purpose – This paper aims to analyze the research productivity and impact of the finalists of the AIB best dissertation award, now titled the Buckley and Casson Award, but from 1987 to 2012 the Farmer Award. Specifically, this paper examines whether there is a relationship between winning the best dissertation award and subsequent publication productivity and impact. Relationships between academic institution and institutional geographic location and finalists are also examined. Design/methodology/approach – The paper examines 25 years of citation counts and the number of publications in Google Scholar of Farmer Award winners and finalists of the AIB best dissertation award from inception in 1987 to 2009, with cited publications as a measure of productivity and citations as a measure of impact. Top performers in productivity and impact are identified, and the averages of winners and non-winners are analyzed in aggregate, over time and per year. Data on finalists' institution and geographic location of institution are analyzed to describe the importance of location and institution to the award. Findings – It is found that the overall average citations of the winners of the award is less than that of the non-winners, and that in the large majority of years the non-winners have an average citation count higher than that of the winners. However, taking averages in five year increments shows more mixed results, with non-winners performing better in two periods and winners performing better in two periods, with the remaining period being split as to research productivity and impact. Originality/value – Aggarwal et al. in this journal summarized a variety of data on Farmer Award finalists from the 1990s to gain insights on institutions represented by finalists, the publication record of finalists, and content of dissertations, among other characteristics. 
This paper updates some of the insights from that paper by examining data on award winners from 1987 to 2013, and adds further insight by examining for the first time cited publications and citation counts of winners and non-winners for the same period excluding the last two years.
Abstract:
A method and oligonucleotide compound for inhibiting replication of a nidovirus in virus-infected animal cells are disclosed. The compound (i) has a nuclease-resistant backbone, (ii) is capable of uptake by the infected cells, (iii) contains between 8-25 nucleotide bases, and (iv) has a sequence capable of disrupting base pairing between the transcriptional regulatory sequences in the 5′ leader region of the positive-strand viral genome and negative-strand 3′ subgenomic region. In practicing the method, infected cells are exposed to the compound in an amount effective to inhibit viral replication.
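The base-pairing disruption described above relies on the oligonucleotide being complementary to its target region. The sketch below shows the underlying reverse-complement relationship for plain DNA bases; it is an illustrative aid only, and does not model the nuclease-resistant backbone chemistry or the RNA targets of the patented compounds.

```python
# Watson-Crick pairing for DNA bases (illustrative; the patented
# compounds target RNA sequences with a modified backbone).
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq):
    """Reverse complement of a DNA sequence string."""
    return "".join(COMPLEMENT[b] for b in reversed(seq))

def binds(oligo, target):
    """True if the oligo is exactly the reverse complement of target,
    i.e. the two strands can base-pair in antiparallel orientation."""
    return oligo == reverse_complement(target)
```

An antisense compound of the kind claimed would be designed so that `binds` holds against the 5′ leader transcriptional regulatory sequence, thereby blocking the leader/subgenomic pairing needed for replication.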