987 results for web communications
Abstract:
The management of main material prices in provincial highway project quotas suffers from lag and blindness. First, a web-based framework for a provincial highway project quota data management information system (MIS) and a main material price data warehouse was established. Concrete procedures for predicting provincial highway project main material prices were then put forward based on the back-propagation (BP) neural network algorithm. The standard BP algorithm, a BP network algorithm improved with additional momentum, and a BP network algorithm improved with a self-adaptive learning rate were then compared in predicting highway project main material prices. The results indicate that predicting highway main material prices with a BP neural network is feasible, and that the BP network algorithm improved with a self-adaptive learning rate performs best.
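The comparison above rests on the standard back-propagation update, a momentum term, and a learning rate that adapts to the training error. As a hedged illustration only (not the authors' implementation; the network size, hyperparameters, and input features are hypothetical), a single-hidden-layer BP regressor with momentum and a simple self-adaptive learning-rate rule might be sketched like this:

```python
# A minimal sketch of a single-hidden-layer back-propagation (BP) network for
# price regression, with momentum and a simple self-adaptive learning-rate rule:
# grow the rate when the training error falls, shrink it when the error rises.
# Hypothetical sizes and hyperparameters; not the authors' implementation.
import numpy as np

def train_bp(X, y, hidden=8, epochs=2000, lr=0.01, momentum=0.9,
             lr_up=1.05, lr_down=0.7, seed=0):
    """X: (n, d) feature matrix; y: (n, 1) target prices."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, 1)); b2 = np.zeros(1)
    vel = [np.zeros_like(p) for p in (W1, b1, W2, b2)]
    prev_err = np.inf
    for _ in range(epochs):
        # forward pass
        h = np.tanh(X @ W1 + b1)
        pred = h @ W2 + b2
        err = np.mean((pred - y) ** 2)
        # self-adaptive learning rate: expand on improvement, contract otherwise
        lr = lr * lr_up if err < prev_err else lr * lr_down
        prev_err = err
        # backward pass for the mean-squared-error loss
        g_pred = 2 * (pred - y) / n
        gW2, gb2 = h.T @ g_pred, g_pred.sum(0)
        g_h = (g_pred @ W2.T) * (1 - h ** 2)
        gW1, gb1 = X.T @ g_h, g_h.sum(0)
        # momentum update applied in place to each parameter
        for i, (p, g) in enumerate(zip([W1, b1, W2, b2], [gW1, gb1, gW2, gb2])):
            vel[i] = momentum * vel[i] - lr * g
            p += vel[i]
    return W1, b1, W2, b2
```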
Abstract:
Cipher Cities was a practice-led research project developed in three stages between 2005 and 2007, resulting in the creation of a unique online community, ‘Cipher Cities’, that provides simple authoring tools and processes for individuals and groups to create their own mobile events and event journals, build community profiles and participate in other online community activities. Cipher Cities was created to revitalise people’s relationship to everyday places by giving them the opportunity and motivation to create and share complex digital stories in simple and engaging ways. To do so, we developed new design processes and methods for both the research team and the end user to appropriate web and mobile technologies, and we collaborated with ethnographers, designers and ICT researchers and developers. In teams we ran a series of workshops in a wide variety of cities in Australia to refine an engagement process and to test a series of iteratively developed prototypes, refining the systems that supported community motivation and collaboration. The result of the research is twofold: 1. a sophisticated prototype for researchers and designers to further experiment with community engagement methodologies using existing and emerging communications technologies; 2. a ‘human dimensions matrix’. This matrix assists in the identification and modification of place-based interventions in the social, technical, spatial, cultural and pedagogical conditions of any given community. It has now become an essential part of a number of subsequent projects and assists design collaborators to successfully conceptualise, generate and evaluate interactive experiences. The research team employed practice-led action research methodologies involving a collaborative effort across the fields of interaction design and social science, in particular ethnography, in order to: 1. seek, contest and refine a design methodology that would maximise the successful application of a dynamic system to create new kinds of interactions between people, places and artefacts; 2. design and deploy an application that intervenes in place-based and mobile technologies and offers people simple interfaces to create and share digital stories. Cipher Cities was awarded three separate CRC competitive grants (over $270,000 in total) to assist the three stages of research covering the development of the Ethnographic Design Methodologies, the development of the tools, and the testing and refinement of both the engagement models and technologies. The resulting methodologies and tools are in the process of being commercialised by the Australasian CRC for Interaction Design.
Abstract:
Information and Communications Technologies globally are moving towards Service Oriented Architectures and Web Services. The healthcare environment is rapidly moving to the use of Service Oriented Architecture/Web Services systems interconnected via the global open Internet. Such moves present major challenges where these structures are not based on highly trusted operating systems. This paper argues the need for a radical rethink of access control in the contemporary healthcare environment in light of modern information system structures, legislative and regulatory requirements, and security operation demands in Health Information Systems. This paper proposes Open and Trusted Health Information Systems (OTHIS), a viable solution, including an override capability, for providing appropriate levels of secure access control to protect sensitive health data.
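The abstract does not describe OTHIS internally, so the following is only a generic, hypothetical sketch of the kind of access control with an audited override ("break-glass") capability the paper calls for; the role names, permissions, and logging scheme are invented for illustration and are not the OTHIS design:

```python
# Hypothetical sketch: role-based access control with an audited emergency
# override ("break-glass") path. Roles, permissions, and log fields are invented.
from dataclasses import dataclass
from datetime import datetime, timezone

ROLE_PERMISSIONS = {                      # hypothetical role-to-permission mapping
    "clinician": {"read_record", "write_record"},
    "admin_clerk": {"read_demographics"},
}

@dataclass
class AccessDecision:
    granted: bool
    overridden: bool = False
    reason: str = ""

audit_log: list[dict] = []                # every decision, especially overrides, is logged

def check_access(user_role: str, permission: str,
                 override: bool = False, justification: str = "") -> AccessDecision:
    allowed = permission in ROLE_PERMISSIONS.get(user_role, set())
    if allowed:
        decision = AccessDecision(True, reason="role permits access")
    elif override and justification:
        # break-glass: grant access but flag the event for mandatory review
        decision = AccessDecision(True, overridden=True, reason=justification)
    else:
        decision = AccessDecision(False, reason="permission denied")
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "role": user_role, "permission": permission,
        "granted": decision.granted, "override": decision.overridden,
        "reason": decision.reason,
    })
    return decision

# Example: a clerk invokes the override path in an emergency.
print(check_access("admin_clerk", "read_record",
                   override=True, justification="emergency department triage"))
```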
Abstract:
Traditional media are under assault from digital technologies. Online advertising is eroding the financial basis of newspapers and television, demarcations between different forms of media are fading, and audiences are fragmenting. We can podcast our favourite radio show, data accompanies television programs, and we catch up with newspaper stories on our laptops. Yet mainstream media remain enormously powerful. The Media and Communications in Australia offers a systematic introduction to this dynamic field. Fully updated and revised to take account of recent developments, this third edition outlines the key media industries and explains how communications technologies are impacting on them. It provides a thorough overview of the main approaches taken in studying the media, and includes new chapters on social media, gaming, telecommunications, sport and cultural diversity. With contributions from some of Australia's best researchers and teachers in the field, The Media and Communications in Australia is the most comprehensive and reliable introduction to media and communications available. It is an ideal student text, and a reference for teachers of media and anyone interested in this influential industry.
Abstract:
One of the minimum requirements of most modern professional jobs is to handle and manage the World Wide Web [WWW] and communications such as electronic mail [email]. In my office all staff, including administration, marketing, management, and all levels of quantity surveyors ranging from cadet to director, must manage electronic communication. One of the many aspects of my professional development I have struggled with is managing the tasks dictated to me through email.
Abstract:
The enhanced accessibility, affordability and capability of the Internet have created enormous possibilities in terms of designing, developing and implementing innovative teaching methods in the classroom. As existing pedagogies are revamped and new ones are added, there is a need to assess the effectiveness of these approaches from the students’ perspective. For more than three decades, proven qualitative and quantitative research methods associated with learning environments research have yielded productive results for educators. This article presents the findings of a study in which Getsmart, a teacher-designed website, was blended into science and physics lessons at an Australian high school. Students’ perceptions of this environment were investigated, together with differences in the perceptions of students in junior and senior years of schooling. The article also explores the impact of teachers in such an environment. The investigation undertaken in this study also gave an indication of how effective Getsmart was as a teaching model in such environments.
Abstract:
The shift from 20th century mass communications media towards convergent media and Web 2.0 has raised the possibility of a renaissance of the public sphere, based around citizen journalism and participatory media culture. This paper will evaluate such claims both conceptually and empirically. At a conceptual level, it is noted that the question of whether media democratization is occurring depends in part upon how democracy is understood, with some critical differences in understandings of democracy, the public sphere and media citizenship. The empirical work in this paper draws upon various case studies of new developments in Australian media, including online-only newspapers, developments in public service media, and the rise of commercially based online alternative media. It is argued that participatory media culture is being expanded if understood in terms of media pluralism, but that implications for the public sphere depend in part upon how media democratization is defined.
Abstract:
Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the ‘read-write web’ in which users actively contribute to as well as consume online content; Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in mobile 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to “bring the internet down to earth” through geographically-aware, or locative media. The internet was earlier heralded as “the death of geography” with predictions that with anyone able to access information from anywhere, geography would no longer matter. But mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally-relevant information, or to upload content which is geotagged to the specific location. It also allows locally-specific content to be sent to the user when the user enters a specific space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access of local maps increased by 347% over the previous 12 months, and restaurant guides/reviews increased by 174%. The central tenet of cultural geography is that places are culturally-constructed, comprised of the physical space itself, culturally-inflected perceptions of that space, and people’s experiences of the space (LeFebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or “the geoweb” (Lake 2004). The geoweb is such a new concept that to date, critical discourse has treated it as a somewhat homogenous spatial formation. In order to counter this, and in order to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space: including the personal/aesthetic in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which speak to individuals as they pass in Minority Report; and the social, in which one’s location is defined by the proximity of friends rather than by geography.
Abstract:
In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, context constraints on Web services are common, so scheduling must consider not only the quality properties of Web services but also the inter-service dependencies that these context constraints create. In this paper, we present a repair genetic algorithm, namely the minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
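The abstract does not spell out the algorithm's internals, so the following is a hedged, minimal sketch of the general idea as named: a genetic algorithm assigns one candidate service per workflow task to maximise an aggregate QoS score, and infeasible offspring are repaired by minimal-conflict hill climbing over the inter-service dependency constraints. The QoS table and dependency encoding are invented for illustration:

```python
# Hypothetical sketch of a repair genetic algorithm for QoS-based service selection.
# Infeasible plans are repaired by minimal-conflict hill climbing: reassign the task
# involved in the most violated constraints to the candidate leaving the fewest violations.
import random

QOS = [[0.7, 0.9, 0.4], [0.5, 0.8, 0.6], [0.9, 0.3, 0.7]]   # QOS[task][candidate]
# dependency constraints: (task_a, cand_a, task_b, cand_b) meaning
# "if task_a uses cand_a then task_b must use cand_b"
DEPENDENCIES = [(0, 1, 1, 2), (2, 0, 0, 1)]

def violations(plan):
    return [(a, ca, b, cb) for (a, ca, b, cb) in DEPENDENCIES
            if plan[a] == ca and plan[b] != cb]

def fitness(plan):
    return sum(QOS[t][c] for t, c in enumerate(plan))

def repair(plan, max_steps=50):
    for _ in range(max_steps):
        conf = violations(plan)
        if not conf:
            break
        counts = {}                              # how often each task appears in a conflict
        for (a, _, b, _) in conf:
            counts[a] = counts.get(a, 0) + 1
            counts[b] = counts.get(b, 0) + 1
        task = max(counts, key=counts.get)
        # candidate for this task that minimises the remaining violations
        best = min(range(len(QOS[task])),
                   key=lambda c: len(violations(plan[:task] + [c] + plan[task + 1:])))
        if plan[task] == best:                   # no improving move left: stop
            break
        plan[task] = best
    return plan

def genetic_search(pop_size=20, generations=50):
    tasks = len(QOS)
    pop = [repair([random.randrange(len(QOS[t])) for t in range(tasks)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, tasks)
            child = p1[:cut] + p2[cut:]          # one-point crossover
            if random.random() < 0.2:            # mutation
                t = random.randrange(tasks)
                child[t] = random.randrange(len(QOS[t]))
            children.append(repair(child))       # repair infeasible offspring, don't discard
        pop = parents + children
    return max(pop, key=fitness)

print(genetic_search())
```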
Abstract:
Young drivers aged 17-24 are consistently overrepresented in motor vehicle crashes. Research has shown that a young driver’s crash risk increases when carrying similarly aged passengers, with fatal crash risk increasing two to three fold with two or more passengers. Recent growth in access to and use of the internet has led to a corresponding increase in the number of web based behaviour change interventions. An increasing body of literature describes the evaluation of web based programs targeting risk behaviours and health issues. Evaluations have shown promise for such strategies with evidence for positive changes in knowledge, attitudes and behaviour. The growing popularity of web based programs is due in part to their wide accessibility, ability for personalised tailoring of intervention messages, and self-direction and pacing of online content. Young people are also highly receptive to the internet and the interactive elements of online programs are particularly attractive. The current study was designed to assess the feasibility for a web based intervention to increase the use of personal and peer protective strategies among young adult passengers. An extensive review was conducted on the development and evaluation of web based programs. Year 12 students were also surveyed about their use of the internet in general and for health and road safety information. All students reported internet access at home or at school, and 74% had searched for road safety information. Additional findings have shown promise for the development of a web based passenger safety program for young adults. Design and methodological issues will be discussed.
Abstract:
Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns for 964,780 Web searching sessions, composed of 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the results demonstrate that the first- and second-order models provide the best predictability, between 28 and 40% overall and higher than 70% for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance.
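As a hedged illustration of the n-gram approach (with invented state labels and toy session data rather than the study's taxonomy or log), a second-order model that counts transitions between reformulation states and predicts the most likely next state could be sketched as:

```python
# Minimal sketch of an n-gram model over query-reformulation states: estimate
# the probability of the next state from the previous n states, then predict
# the most frequent successor. State labels and sessions are illustrative only.
from collections import Counter, defaultdict

def build_ngram_model(sessions, n=2):
    """sessions: list of state sequences, e.g. [["New", "Reformulation", ...], ...]"""
    counts = defaultdict(Counter)
    for seq in sessions:
        for i in range(len(seq) - n):
            context = tuple(seq[i:i + n])        # previous n states
            counts[context][seq[i + n]] += 1     # state that followed them
    return counts

def predict_next(counts, context):
    context = tuple(context)
    if context not in counts:
        return None                              # unseen context: no prediction (no coverage)
    return counts[context].most_common(1)[0][0]

# toy usage with hypothetical session data
sessions = [
    ["New", "Reformulation", "Assistance", "Reformulation"],
    ["New", "Reformulation", "Reformulation", "Content Change"],
    ["New", "Assistance", "Reformulation", "Reformulation"],
]
model = build_ngram_model(sessions, n=2)              # second-order model
print(predict_next(model, ["New", "Reformulation"]))  # most frequent observed successor
```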
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The period rarely affects navigational and transactional queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
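As a hedged, simplified illustration of one-step-ahead prediction on a search-log-derived series (synthetic data, a plain AR(1) model standing in for the study's models, and no transfer-function component), a rolling one-step forecast might look like:

```python
# Minimal sketch of rolling one-step-ahead time-series prediction for a synthetic
# hourly series of sponsored-link clicks. Not the study's data or exact models.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# synthetic hourly click counts with a daily cycle plus noise (placeholder data)
rng = np.random.default_rng(1)
hours = pd.date_range("2024-01-01", periods=14 * 24, freq="h")
clicks = 50 + 20 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 5, len(hours))
series = pd.Series(clicks, index=hours)

train, test = series[:-24], series[-24:]
res = ARIMA(train, order=(1, 0, 0)).fit()        # simple AR(1) stand-in

# rolling one-step-ahead forecasts: predict the next hour, then append the
# observed value and move to the following hour (no refitting, for speed)
preds = []
for observed in test:
    preds.append(float(res.forecast(steps=1).iloc[0]))
    res = res.append([observed], refit=False)

mae = np.mean(np.abs(np.array(preds) - test.values))
print(f"one-step-ahead MAE over the final 24 hours: {mae:.2f} clicks")
```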
Abstract:
This paper reports results from a study in which we automatically classified the query reformulation patterns for 964,780 Web searching sessions (composed of 1,523,072 queries) in order to predict what the next query reformulation would be. We employed an n-gram modeling approach to describe the probability of searchers transitioning from one query reformulation state to another and predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction. Findings show that Reformulation and Assistance account for approximately 45 percent of all query reformulations. Searchers seem to seek system searching assistance early in the session or after a content change. The results of our evaluations show that the first- and second-order models provided the best predictability, between 28 and 40 percent overall, and higher than 70 percent for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance in real time.
Abstract:
This paper reports preliminary results from a study modeling the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Study participants conducted three Web searches on personal information problems. Data collection techniques included pre- and post-search questionnaires, think-aloud protocols, Web search logs, observation, and post-search interviews. Key findings include: (1) users’ Web searches included multitasking, cognitive shifting and cognitive coordination processes; (2) cognitive coordination is the hinge linking multitasking and cognitive shifting that enables Web search construction; (3) cognitive shift levels determine the process of cognitive coordination; and (4) cognitive coordination is the interplay of task, mechanism and strategy levels that underpins multitasking and task switching. An initial model depicts the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Implications of the findings and further research are also discussed.
Abstract:
The artwork describes the web as a network environment and a space where people are connected; as a result, it can reshape the viewer into an interactive participant who is able to regenerate an object in a new form through truly collaborative and cooperative interactions with others. The artwork was created based on research findings on four characteristics of the web: 1) Participatory (Slater 2002, p. 536), 2) Communicational (Rheingold 1993), 3) Connected (Jordan 1999, p. 80), and 4) Stylising (Jordan 1999, p. 69). The artwork conceptualises and visualises these characteristics of the web based on principles of graphic design and visual communication.