132 results for internet-based computations


Relevance: 30.00%

Abstract:

Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the ‘read-write web’, in which users actively contribute to, as well as consume, online content; and Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in Web 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to “bring the internet down to earth” through geographically aware, or locative, media. The internet was earlier heralded as “the death of geography”, with predictions that, with anyone able to access information from anywhere, geography would no longer matter. Mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally relevant information, or to upload content geotagged to a specific location. It also allows locally specific content to be sent to users when they enter a particular space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access of local maps increased by 347% over the previous 12 months, and of restaurant guides/reviews by 174%. The central tenet of cultural geography is that places are culturally constructed, comprising the physical space itself, culturally inflected perceptions of that space, and people’s experiences of the space (Lefebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or “the geoweb” (Lake 2004). The geoweb is such a new concept that, to date, critical discourse has treated it as a somewhat homogeneous spatial formation.
To counter this, and to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space: the personal/aesthetic, in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which address individuals as they pass in Minority Report; and the social, in which one’s location is defined by the proximity of friends rather than by geography.

Relevance: 30.00%

Abstract:

In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations that are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. A significant research problem in Web service composition is therefore how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal: the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations, constraints which existing approaches cannot handle. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and scalability of the penalty-based GA.
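The penalty idea can be illustrated with a toy sketch (this is not the paper's algorithm or data; all QoS values, weights, constraints and GA parameters below are invented): infeasible chromosomes are kept in the population but charged a large fitness penalty, steering the search toward selections that satisfy the inter-service dependencies and conflicts.

```python
# Toy penalty-based GA for QoS-aware service selection (illustrative only).
import random

random.seed(1)

# candidates[i][j] = (response_time, price) for implementation j of service i
candidates = [
    [(120, 5), (80, 9), (100, 7)],
    [(60, 4), (90, 2), (70, 6)],
    [(200, 1), (150, 3), (110, 8)],
]
# dependency: choosing impl 1 of service 0 requires impl 2 of service 1
dependencies = [((0, 1), (1, 2))]
# conflict: impl 0 of service 2 cannot be combined with impl 0 of service 0
conflicts = [((2, 0), (0, 0))]
PENALTY = 1000.0

def cost(chrom):
    # aggregate QoS: total response time + 10 * total price (weights assumed)
    rt = sum(candidates[i][j][0] for i, j in enumerate(chrom))
    price = sum(candidates[i][j][1] for i, j in enumerate(chrom))
    c = rt + 10 * price
    # penalty terms: infeasible individuals survive but score badly
    for (s1, i1), (s2, i2) in dependencies:
        if chrom[s1] == i1 and chrom[s2] != i2:
            c += PENALTY
    for (s1, i1), (s2, i2) in conflicts:
        if chrom[s1] == i1 and chrom[s2] == i2:
            c += PENALTY
    return c

def evolve(pop_size=30, gens=60, mut=0.2):
    n = len(candidates)
    pop = [[random.randrange(len(candidates[i])) for i in range(n)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:          # point mutation
                i = random.randrange(n)
                child[i] = random.randrange(len(candidates[i]))
            children.append(child)
        pop = parents + children
    best = min(pop, key=cost)
    return best, cost(best)

best, best_cost = evolve()
print(best, best_cost)
```

A returned cost below the penalty constant indicates that the best individual found violates no dependency or conflict.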

Relevance: 30.00%

Abstract:

Young drivers aged 17–24 are consistently overrepresented in motor vehicle crashes. Research has shown that a young driver’s crash risk increases when carrying similarly aged passengers, with fatal crash risk increasing two- to three-fold with two or more passengers. Recent growth in access to and use of the internet has led to a corresponding increase in the number of web-based behaviour change interventions, and an increasing body of literature describes the evaluation of web-based programs targeting risk behaviours and health issues. Evaluations have shown promise for such strategies, with evidence of positive changes in knowledge, attitudes and behaviour. The growing popularity of web-based programs is due in part to their wide accessibility, their capacity for personalised tailoring of intervention messages, and users’ ability to self-direct and pace their progress through online content. Young people are also highly receptive to the internet, and the interactive elements of online programs are particularly attractive to them. The current study was designed to assess the feasibility of a web-based intervention to increase the use of personal and peer protective strategies among young adult passengers. An extensive review was conducted of the development and evaluation of web-based programs. Year 12 students were also surveyed about their use of the internet in general and for health and road safety information. All students reported internet access at home or at school, and 74% had searched for road safety information. Additional findings show promise for the development of a web-based passenger safety program for young adults. Design and methodological issues will be discussed.

Relevance: 30.00%

Abstract:

Cloud computing is a recent computing paradigm in which applications, data and IT services are provided over the Internet. The Cloud has become a major medium for Software as a Service (SaaS) providers to host their SaaS, as it can provide the scalability a SaaS requires. The challenges in the composite SaaS placement process stem from several factors, including the large size of the Cloud network, the SaaS’s competing resource requirements, the interactions among its components, and the interactions between its components and their data. Existing application placement methods for data centres, however, do not consider the placement of a component’s data, and a Cloud network is much larger than the data centre networks discussed in existing studies. This paper proposes a penalty-based genetic algorithm (GA) for the composite SaaS placement problem in the Cloud. We believe this is the first attempt to place a SaaS together with its data on a Cloud provider’s servers. Experimental results demonstrate the feasibility and scalability of the GA.
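A minimal sketch of how such a GA's fitness function might fold placement constraints into a single score (this is an assumption-laden illustration, not the paper's model; all names, demands, capacities and the penalty weight are invented): communication cost between interacting components, including data components, is summed, and server capacity violations are penalised rather than forbidden outright, so the GA can traverse infeasible regions of the search space.

```python
# Illustrative penalty-based fitness for composite SaaS placement.
def placement_cost(placement, demand, capacity, traffic, dist, penalty=1e6):
    """placement[c] = server hosting component c (data components included).

    demand[c]:        resource units needed by component c
    capacity[s]:      resource units available on server s
    traffic[(a, b)]:  data volume exchanged between components a and b
    dist[(s, t)]:     network distance between servers s and t
    """
    # communication cost between interacting components and their data
    cost = sum(vol * dist[(placement[a], placement[b])]
               for (a, b), vol in traffic.items())
    # capacity overloads incur a penalty instead of making the
    # individual invalid, keeping the GA's search space connected
    used = {}
    for c, s in enumerate(placement):
        used[s] = used.get(s, 0) + demand[c]
    overload = sum(max(0, u - capacity[s]) for s, u in used.items())
    return cost + penalty * overload

# two servers, three components (component 2 is a data component)
demand = [2, 3, 4]
capacity = {0: 6, 1: 6}
traffic = {(0, 2): 10, (1, 2): 5}
dist = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(placement_cost([0, 1, 0], demand, capacity, traffic, dist))  # → 5.0
```

Co-locating a component with its data component zeroes out their communication term, which is what drives the GA toward data-aware placements.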

Relevance: 30.00%

Abstract:

TCP is the dominant protocol for reliable communication over the internet. It provides flow, congestion and error control mechanisms designed for reliable wired networks. Its congestion control mechanism is not suitable for wireless links, where data corruption and loss rates are higher. The physical link is transparent to TCP, which attributes every packet loss to congestion and responds by reducing its transmission speed. This wastes the already limited bandwidth available on wireless links; there is little point in researching ways to increase the bandwidth of wireless links while the available bandwidth is not utilised optimally. This paper proposes a hybrid scheme called TCP Detection and Recovery (TCP-DR) that distinguishes congestion-, corruption- and mobility-related losses and then instructs the sending host to take the appropriate action, so that link utilisation remains optimal when losses are due to high bit error rates or mobility.
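The abstract does not reproduce TCP-DR's actual classification rules, but the general idea of mapping a loss to a cause and then to a sender action can be sketched as follows. Everything here is a hypothetical heuristic for illustration: the RTT-inflation test, the bit-error-rate threshold, and the parameter values are invented, not taken from the paper.

```python
# Hedged sketch: classify a packet loss so the sender can react
# appropriately instead of always shrinking its congestion window.
def classify_loss(rtt_samples, link_error_rate, disconnect_seen,
                  rtt_congestion_factor=1.5, ber_threshold=1e-5):
    """Guess why a segment was lost (illustrative thresholds only)."""
    baseline = min(rtt_samples)   # best RTT seen: proxy for empty queues
    current = rtt_samples[-1]
    if disconnect_seen:
        return "mobility"         # handoff: pause, don't shrink cwnd
    if current > rtt_congestion_factor * baseline:
        return "congestion"       # queues building: back off normally
    if link_error_rate > ber_threshold:
        return "corruption"       # wireless bit errors: retransmit, keep rate
    return "congestion"           # default to the conservative behaviour

print(classify_loss([100, 105, 180], 1e-7, False))  # rising RTT → congestion
```

Only the "congestion" outcome triggers standard TCP backoff; the other two outcomes let the sender keep its rate, which is what preserves wireless link utilisation.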

Relevance: 30.00%

Abstract:

Today’s evolving networks experience a large number of different attacks, ranging from system break-ins to infection by automated attack tools such as worms, viruses and trojan horses, and denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate, targeting Internet addresses without regard to whether those addresses are legitimately allocated. In the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and to unearth unusual attack behaviors. Such analysis is difficult, however, due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic to accurately and quickly detect new and ongoing network anomalies. Detection is based on the observation that anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. We use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces, unveiling both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window, non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic, and have introduced dynamic thresholds that detect both changes in traffic behavior and when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
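The flavour of a sliding-window, non-parametric CUSUM with a dynamic threshold can be sketched as below. This is a generic textbook-style detector, not the dissertation's implementation: the slack and threshold multipliers, the window length, and the reset-on-alarm behaviour are all illustrative assumptions.

```python
# Sketch of a one-sided non-parametric CUSUM over a sliding window of
# per-interval packet counts, with a threshold scaled by the window's
# own statistics (so it adapts to the traffic level).
from collections import deque

def cusum_detect(series, window=20, k_sigma=0.5, h_sigma=4.0):
    """Return indices where an upward change in the mean is flagged.

    k_sigma: slack (drift allowance) in units of the window's std dev.
    h_sigma: decision threshold in units of the window's std dev.
    """
    history = deque(maxlen=window)
    s = 0.0                        # one-sided CUSUM statistic
    alarms = []
    for i, x in enumerate(series):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((v - mean) ** 2 for v in history) / window
            std = var ** 0.5 or 1.0
            # accumulate only deviations above mean + slack
            s = max(0.0, s + (x - mean - k_sigma * std))
            if s > h_sigma * std:  # dynamic threshold tracks traffic scale
                alarms.append(i)
                s = 0.0            # reset after signalling the change
        history.append(x)
    return alarms

# quiet traffic followed by a sustained burst (e.g. onset of a scan)
trace = [10, 11, 9, 10, 12, 10, 11, 9, 10, 11] * 3 + [40] * 10
print(cusum_detect(trace))  # first alarm at index 30, the burst onset
```

Detecting when a change has *ended*, as the dissertation also does, would require a mirrored downward-direction statistic; it is omitted here for brevity.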

Relevance: 30.00%

Abstract:

A browser is a convenient way to access resources located remotely on computer networks. Security in browsers has become a crucial issue for users who rely on them for sensitive applications without knowledge of the hazards. This research uses a structured approach to analyse browser security and propose enhancements to it. Standard evaluation of computer products is important, as it helps users ensure that the product they use is appropriate for their needs. Security in browsers has therefore been evaluated using the Common Criteria. The outcome was a security requirements profile which attempts to formalise the security needs of browsers. The information collected during the research was used to produce a prototype model for a secure browser program, and modifications to the Lynx browser were made to demonstrate the proposed enhancements.

Relevance: 30.00%

Abstract:

Increasingly, software is no longer developed as a single system, but rather as a smart combination of so-called software services. Each of these provides an independent, specific and relatively small piece of functionality, which is typically accessible through the Internet from internal or external service providers. To the best of our knowledge, there are no standards or models that describe the sourcing process of these software-based services (SBS). We identify the sourcing requirements for SBS and associate the key characteristics of SBS with the sourcing requirements introduced. Furthermore, we compare the sourcing of SBS with related work in the fields of classical procurement, business process outsourcing, and information systems sourcing. Based on this analysis, we conclude that the direct adoption of these approaches for SBS is not feasible, and that new approaches are required for sourcing SBS.

Relevance: 30.00%

Abstract:

Increasingly, software is no longer developed as a single system, but rather as a smart combination of so-called software services. Each of these provides an independent, specific and relatively small piece of functionality, which is typically accessible through the Internet from internal or external service providers. There are no standards or models that describe the sourcing process of these software-based services (SBS). The authors identify the sourcing requirements for SBS and associate the key characteristics of SBS with the sourcing requirements introduced. Furthermore, this paper compares the sourcing of SBS with related work in the fields of classical procurement, business process outsourcing, and information systems sourcing. Based on this analysis, the authors conclude that the direct adoption of these approaches for SBS is not feasible, and that new approaches are required for sourcing SBS.

Relevance: 30.00%

Abstract:

Internet and Web services are used in both teaching and learning and are gaining popularity in today’s world. E-learning is becoming widespread and is considered the latest advance in technology-based learning. Despite its potential advantages for learning in a small country like Bhutan, there is a lack of e-services at the Paro College of Education. This study investigated students’ attitudes towards online communities, their frequency of access to the Internet, and how students locate and use different sources of information in their project tasks. Since improvement was at the heart of this research, an action research approach was used. Based on the idea of purposeful sampling, a semi-structured interview and observations were used as data collection instruments, and ten randomly selected students (five girls and five boys) participated in the research as the study group. The findings indicated a lack of educational information technology services, such as e-learning, at the college, with the very slow Internet connection being the main barrier to learning via e-learning or accessing Internet resources. There is a strong relationship between the quality of a written task and the source of its information, and between Web searching and learning. The sources of information used in assignments and project work are limited to books in the library, which are often outdated and of poor quality, and project tasks submitted by most of the students were of poor quality.

Relevance: 30.00%

Abstract:

With the emergence of Web 2.0, Web users can classify Web items of interest to them by using tags. Tags reflect users’ understanding of the items collected under each tag, so exploring user tagging behavior provides a promising way to understand users’ information needs. However, a free and relatively uncontrolled vocabulary has drawbacks: a lack of standardization and semantic ambiguity. Moreover, the relationships among tags have not been explored, even though these rich relationships could provide valuable information for better understanding users. In this paper, we propose a novel approach to constructing a tag ontology, based on the widely used general ontology WordNet, that captures the semantics and the structural relationships of tags. Ambiguity of tags is a challenging problem in constructing a high-quality tag ontology; we propose strategies to find the semantic meanings of tags and to disambiguate them based on the judgments of the WordNet lexicographers. To evaluate the usefulness of the constructed tag ontology, we apply it in a tag recommendation experiment; we believe this is the first application of a tag ontology to recommendation making. Initial results show that by using the tag ontology to re-rank the recommended tags, the accuracy of tag recommendation can be improved.
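One simple way such lexicographer-based disambiguation could work is sketched below. WordNet itself is not bundled here, so a tiny hand-made dictionary stands in for it (the sense ids and categories are illustrative, not real WordNet entries), and the rule used (prefer the sense whose lexicographer category is most common among the co-occurring tags' senses) is a simplification assumed for illustration, not necessarily the paper's strategy.

```python
# Stand-in for WordNet: tag -> [(sense_id, lexicographer_category), ...]
SENSES = {
    "apple":  [("apple.n.01", "noun.food"),
               ("apple.n.02", "noun.plant"),
               ("apple.n.03", "noun.group")],   # illustrative senses only
    "recipe": [("recipe.n.01", "noun.communication")],
    "pie":    [("pie.n.01", "noun.food")],
    "cake":   [("cake.n.01", "noun.food"), ("cake.n.02", "noun.artifact")],
}

def disambiguate(tag, co_tags):
    """Pick the sense of `tag` whose lexicographer category appears most
    often among the candidate senses of its co-occurring tags."""
    context = [lex for t in co_tags for _, lex in SENSES.get(t, [])]
    def score(sense):
        _, lex = sense
        return context.count(lex)   # votes from the co-occurring tags
    return max(SENSES[tag], key=score)[0]

# "apple" tagged alongside food-related tags resolves to the food sense
print(disambiguate("apple", ["recipe", "pie", "cake"]))  # → apple.n.01
```

With a real WordNet, the candidate senses and their lexicographer files would come from the synset lookup rather than a hand-built table.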

Relevance: 30.00%

Abstract:

We derive an explicit method of computing the composition step in Cantor’s algorithm for group operations on Jacobians of hyperelliptic curves. Our technique is inspired by the geometric description of the group law and applies to hyperelliptic curves of arbitrary genus. While Cantor’s general composition involves arithmetic in the polynomial ring F_q[x], the algorithm we propose solves a linear system over the base field which can be written down directly from the Mumford coordinates of the group elements. We apply this method to give more efficient formulas for group operations in both affine and projective coordinates for cryptographic systems based on Jacobians of genus 2 hyperelliptic curves in general form.
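For context, the standard form of Cantor's composition step that the paper's linear-system method replaces can be stated as follows (textbook formulas, not the paper's new derivation):

```latex
% Mumford representation of a divisor class on y^2 + h(x) y = f(x):
%   D = (u(x), v(x)),  u monic,  \deg v < \deg u \le g,  u \mid v^2 + h v - f.
% Cantor's composition of D_1 = (u_1, v_1) and D_2 = (u_2, v_2):
d = \gcd\bigl(u_1,\; u_2,\; v_1 + v_2 + h\bigr)
  = s_1 u_1 + s_2 u_2 + s_3 (v_1 + v_2 + h),
\qquad
u_3 = \frac{u_1 u_2}{d^2},
\qquad
v_3 \equiv \frac{s_1 u_1 v_2 + s_2 u_2 v_1 + s_3 (v_1 v_2 + f)}{d} \pmod{u_3}.
```

The extended-gcd computation over F_q[x] shown here is exactly the step the paper recasts as a linear system over F_q written directly from the Mumford coordinates.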