832 results for Web content adaptation


Relevance: 20.00%

Abstract:

Cipher Cities was a practice-led research project developed in three stages between 2005 and 2007, resulting in the creation of a unique online community, ‘Cipher Cities’, that provides simple authoring tools and processes for individuals and groups to create their own mobile events and event journals, build community profiles and participate in other online community activities. Cipher Cities was created to revitalise people’s relationship to everyday places by giving them the opportunity and motivation to create and share complex digital stories in simple and engaging ways. To do so, we developed new design processes and methods for both the research team and the end user to appropriate web and mobile technologies, collaborating with ethnographers, designers and ICT researchers and developers. In teams we ran a series of workshops in a wide variety of cities in Australia to refine an engagement process and to test a series of iteratively developed prototypes, refining the systems that supported community motivation and collaboration. The result of the research is twofold: 1. a sophisticated prototype for researchers and designers to further experiment with community engagement methodologies using existing and emerging communications technologies; 2. a ‘human dimensions matrix’. This matrix assists in the identification and modification of place-based interventions in the social, technical, spatial, cultural and pedagogical conditions of any given community. It has now become an essential part of a number of subsequent projects and assists design collaborators to successfully conceptualise, generate and evaluate interactive experiences. The research team employed practice-led action research methodologies that involved a collaborative effort across the fields of interaction design and social science, in particular ethnography, in order to: 1. seek, contest and refine a design methodology that would maximise the successful application of a dynamic system to create new kinds of interactions between people, places and artefacts; 2. design and deploy an application that intervenes in place-based and mobile technologies and offers people simple interfaces to create and share digital stories. Cipher Cities was awarded three separate CRC competitive grants (over $270,000 in total) to assist the three stages of research covering the development of the ethnographic design methodologies, the development of the tools, and the testing and refinement of both the engagement models and technologies. The resulting methodologies and tools are in the process of being commercialised by the Australasian CRC for Interaction Design.

Relevance: 20.00%

Abstract:

The Internet theoretically enables marketers to personalize a website to an individual consumer. This article examines optimal website design from the perspective of personality trait theory and resource-matching theory. The influence of two traits relevant to the processing of websites, sensation seeking and need for cognition, was studied in the context of resource matching and different levels of website complexity. Data were collected at two points in time: personality-trait data and a laboratory experiment using constructed websites. Results reveal that (a) subjects prefer websites of a medium level of complexity, rather than high or low complexity; (b) high sensation seekers prefer complex visual designs, and low sensation seekers simple visual designs, both in websites of medium complexity; and (c) high need-for-cognition subjects evaluated websites with high verbal and low visual complexity more favourably.

Relevance: 20.00%

Abstract:

The enhanced accessibility, affordability and capability of the Internet has created enormous possibilities in terms of designing, developing and implementing innovative teaching methods in the classroom. As existing pedagogies are revamped and new ones are added, there is a need to assess the effectiveness of these approaches from the students’ perspective. For more than three decades, proven qualitative and quantitative research methods associated with learning environments research have yielded productive results for educators. This article presents the findings of a study in which Getsmart, a teacher-designed website, was blended into science and physics lessons at an Australian high school. Students’ perceptions of this environment were investigated, together with differences in the perceptions of students in junior and senior years of schooling. The article also explores the impact of teachers in such an environment. The investigation undertaken in this study also gave an indication of how effective Getsmart was as a teaching model in such environments.

Relevance: 20.00%

Abstract:

Although Alvin Toffler’s “prosumer” (“Prosument”) is of central interest in this volume, it is worth first stepping back and briefly outlining what this model rests on and which basic models it is intended to modify. Prosumption is meant to extend and improve the conventional value chain that was established with the transition to industrial mass production. The need to build, operate and maintain industrial means of production, and to distribute mass-produced goods to their target markets, quickly led to an ever greater separation of producers, distributors and consumers as distinct stations in the value chain of industrial production. Especially at the beginning of the industrial age, such a separation was an appropriate and effective organisational model, dividing participation in industrial society into three clearly defined roles.

Relevance: 20.00%

Abstract:

This is the second in a series of discussion papers (Table 1) on the pedagogy of supervision in the technology disciplines. The papers form part of an Australian Learning and Teaching Council Fellowship program conducted by ALTC Associate Fellow, Professor Christine Bruce, Queensland University of Technology.

Relevance: 20.00%

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although SQ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. This suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
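The contrast between the disconfirmation and perceptions-only approaches can be sketched in a few lines of code. This is a minimal illustration, not the thesis’s instrument: the dimension names follow SERVQUAL, but the 1–7 ratings below are invented.

```python
# SERVQUAL's five dimensions; ratings (1-7 scale) are invented examples.
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.0,
                "empathy": 5.5, "responsiveness": 6.2}
perceptions  = {"reliability": 5.8, "assurance": 6.1, "tangibles": 5.2,
                "empathy": 4.9, "responsiveness": 5.0}

def perceptions_only_score(perc):
    """Perceptions-only (SERVPERF-style) overall evaluation:
    the mean of the perception ratings alone."""
    return sum(perc.values()) / len(perc)

def disconfirmation_gaps(exp, perc):
    """Disconfirmation (SERVQUAL-style) gap per dimension
    (perception - expectation); negative values flag shortfalls."""
    return {d: round(perc[d] - exp[d], 2) for d in exp}

print(round(perceptions_only_score(perceptions), 2))   # overall SQ score
print(disconfirmation_gaps(expectations, perceptions)) # per-dimension gaps
```

The gap view costs an extra round of data collection (expectations) but pinpoints where service falls short; the perceptions-only view is cheaper when only an overall evaluation is needed.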

Relevance: 20.00%

Abstract:

Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the ‘read-write web’ in which users actively contribute to as well as consume online content; Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in Web 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to “bring the internet down to earth” through geographically aware, or locative, media. The internet was earlier heralded as “the death of geography”, with predictions that, with anyone able to access information from anywhere, geography would no longer matter. But mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally relevant information, or to upload content which is geotagged to the specific location. It also allows locally specific content to be sent to the user when the user enters a specific space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access of local maps increased by 347% over the previous 12 months, and restaurant guides/reviews increased by 174%. The central tenet of cultural geography is that places are culturally constructed, comprising the physical space itself, culturally inflected perceptions of that space, and people’s experiences of the space (Lefebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or “the geoweb” (Lake 2004). The geoweb is such a new concept that, to date, critical discourse has treated it as a somewhat homogeneous spatial formation.

In order to counter this, and to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space: the personal/aesthetic, in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which speak to individuals as they pass in Minority Report; and the social, in which one’s location is defined by the proximity of friends rather than by geography.

Relevance: 20.00%

Abstract:

In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, context constraints on Web services are very common, so scheduling must consider not only the quality properties of Web services but also the inter-service dependencies formed by the context constraints imposed on them. In this paper, we present a repair genetic algorithm, namely the minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
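The abstract does not give the algorithm’s details, but the general shape of a repair genetic algorithm with a minimal-conflict hill-climbing repair step can be sketched as follows. The task/candidate-service instance, the QoS scores and the dependency rule are all invented for illustration:

```python
import random

random.seed(42)  # deterministic toy run

# Toy instance (invented): 4 workflow tasks, 3 candidate services per task,
# each service with a scalar QoS score (higher is better).
TASKS = 4
CANDS = {t: [f"s{t}_{i}" for i in range(3)] for t in range(TASKS)}
QOS = {s: random.uniform(0.0, 1.0) for t in CANDS for s in CANDS[t]}

# Inter-service dependency (invented): if task 0 uses s0_0,
# then task 1 must use s1_1.
DEPENDENCIES = [("s0_0", 1, "s1_1")]

def conflicts(chrom):
    """Return the set of task indices that violate a dependency."""
    bad = set()
    for trigger, task, required in DEPENDENCIES:
        if trigger in chrom and chrom[task] != required:
            bad.add(task)
    return bad

def repair(chrom):
    """Minimal-conflict hill climbing: reassign a conflicting task to the
    candidate that leaves the fewest conflicts, breaking ties by QoS."""
    chrom = list(chrom)
    while True:
        bad = conflicts(chrom)
        if not bad:
            return chrom
        task = bad.pop()
        chrom[task] = min(
            CANDS[task],
            key=lambda s: (len(conflicts(chrom[:task] + [s] + chrom[task + 1:])),
                           -QOS[s]))

def fitness(chrom):
    return sum(QOS[s] for s in chrom)

def evolve(pop_size=20, generations=30):
    pop = [repair([random.choice(CANDS[t]) for t in range(TASKS)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, TASKS)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                 # mutation
                t = random.randrange(TASKS)
                child[t] = random.choice(CANDS[t])
            children.append(repair(child))            # repair, don't discard
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 3))
```

The design choice illustrated here is the one the name implies: infeasible offspring are repaired toward feasibility rather than discarded or penalised, which keeps the whole population inside the constrained search space.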

Relevance: 20.00%

Abstract:

Aim: To measure latitude-related body size variation in field-collected Paropsis atomaria Olivier (Coleoptera: Chrysomelidae) individuals and to conduct common-garden experiments to determine whether such variation is due to phenotypic plasticity or local adaptation.
Location: Four collection sites from the east coast of Australia were selected for our present field collections: Canberra (latitude 35°19' S), Bangalow (latitude 28°43' S), Beerburrum (latitude 26°58' S) and Lowmead (latitude 24°29' S). Museum specimens collected over the past 100 years and covering the same geographical area as the present field collections came from one state, one national and one private collection.
Methods: Body size (pronotum width) was measured for 118 field-collected beetles and 302 specimens from collections. We then reared larvae from the latitudinal extremes (Canberra and Lowmead) to determine whether the size cline was the result of phenotypic plasticity or evolved differences (= local adaptation) between sites.
Results: Beetles decreased in size with increasing latitude, representing a converse Bergmann cline. A decrease in developmental temperature produced larger adults for both Lowmead (low latitude) and Canberra (high latitude) individuals, and those from Lowmead were larger than those from Canberra when reared under identical conditions.
Main conclusions: The converse Bergmann cline in P. atomaria is likely to be the result of local adaptation to season length.

Relevance: 20.00%

Abstract:

Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns for 964,780 Web searching sessions, composed of 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the results demonstrate that the first- and second-order models provide the best predictability, between 28 and 40% overall and higher than 70% for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance.
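The n-gram approach described above amounts to counting transitions from a history of reformulation states to the next state, then predicting the most probable successor. A minimal first-order sketch (the session data and state names here are invented; the study’s taxonomy is richer):

```python
from collections import Counter, defaultdict

# Toy session data (invented): each session is a sequence of
# query-reformulation states.
sessions = [
    ["New", "Reformulation", "Assistance", "Reformulation"],
    ["New", "Specialization", "Reformulation", "Assistance"],
    ["New", "Reformulation", "Assistance", "Assistance"],
]

def train_ngram(sessions, order=1):
    """Count transitions from an `order`-length state history to the next state."""
    model = defaultdict(Counter)
    for s in sessions:
        for i in range(len(s) - order):
            model[tuple(s[i:i + order])][s[i + order]] += 1
    return model

def predict(model, history):
    """Most likely next reformulation state, with its estimated probability."""
    counts = model[tuple(history)]
    if not counts:
        return None, 0.0
    state, n = counts.most_common(1)[0]
    return state, n / sum(counts.values())

model = train_ngram(sessions, order=1)    # first-order model
print(predict(model, ["Reformulation"]))  # → ('Assistance', 1.0)
```

Raising `order` trades coverage for specificity, which mirrors the paper’s finding that first- and second-order models predict best while higher orders fragment the data.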

Relevance: 20.00%

Abstract:

This paper reports findings from a study investigating the effect of integrating sponsored and nonsponsored search engine links into a single web listing. The premise underlying this research is that web searchers are chiefly interested in relevant results. Given the reported negative bias that web searchers have concerning sponsored links, separate listings may be a disservice to web searchers as they might not direct searchers to relevant websites. Some web meta-search engines integrate sponsored and nonsponsored links into a single listing. Using a web search engine log of over 7 million interactions from hundreds of thousands of users of a major web meta-search engine, we analysed the click-through patterns for both sponsored and nonsponsored links. We also classified web queries as informational, navigational and transactional based on the expected type of content, and analysed the click-through patterns of each classification. The findings show that for more than 35% of queries, there are no clicks on any result. More than 80% of web queries are informational in nature, approximately 10% are transactional, and 10% navigational. Sponsored links account for approximately 15% of all clicks. Integrating sponsored and nonsponsored links does not appear to increase the clicks on sponsored listings. We discuss how these research results could enhance future sponsored search platforms.
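The informational/navigational/transactional taxonomy used above can be operationalised as simple query heuristics. The actual rules used in the study are not given in the abstract, so the keyword lists below are illustrative assumptions only:

```python
# Hypothetical cue lists; a real classifier would use richer features.
NAV_HINTS = ("www.", ".com", ".org", "homepage")          # seeking a site
TRANS_HINTS = ("buy", "download", "order", "price")        # seeking an action

def classify_query(query: str) -> str:
    """Rule-of-thumb query-intent classification."""
    q = query.lower()
    if any(h in q for h in NAV_HINTS):
        return "navigational"
    if any(h in q for h in TRANS_HINTS):
        return "transactional"
    return "informational"   # the dominant class (>80% in the study)

print(classify_query("buy laptop"))       # → transactional
print(classify_query("www.example.com"))  # → navigational
print(classify_query("history of tea"))   # → informational
```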

Relevance: 20.00%

Abstract:

In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The period rarely affects navigational queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
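As a minimal illustration of one-step prediction, simple exponential smoothing can stand in for the study’s (unspecified) time series model; the hourly click counts below are invented:

```python
def one_step_forecasts(series, alpha=0.5):
    """Simple exponential smoothing: after observing series[t], the current
    level is the one-step-ahead forecast for t+1."""
    level = series[0]
    forecasts = [level]              # forecast made after the first point
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        forecasts.append(level)
    return forecasts

# Invented hourly sponsored-link click counts, tailing off late at night
# as in the pattern described above.
clicks = [120, 110, 130, 125, 90, 30, 10]
print([round(f, 1) for f in one_step_forecasts(clicks)])
```

A transfer function model, as used in the study’s second phase, would additionally let an input series (e.g. query volume) drive the forecast of the output series; that machinery is beyond this sketch.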

Relevance: 20.00%

Abstract:

This paper reports preliminary results from a study modeling the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Study participants conducted three Web searches on personal information problems. Data collection techniques included pre- and post-search questionnaires, think-aloud protocols, Web search logs, observation, and post-search interviews. Key findings include: (1) users’ Web searches included multitasking, cognitive shifting and cognitive coordination processes; (2) cognitive coordination is the hinge linking multitasking and cognitive shifting that enables Web search construction; (3) cognitive shift levels determine the process of cognitive coordination; and (4) cognitive coordination is the interplay of task, mechanism and strategy levels that underpins multitasking and task switching. An initial model depicts the interplay between multitasking, cognitive coordination, and cognitive shifts during Web search. Implications of the findings and further research are also discussed.

Relevance: 20.00%

Abstract:

Examined the social adaptation of 32 children in grades 3–6 with mild intellectual disability: 13 Ss were partially integrated into regular primary school classes and 19 Ss were full-time in separate classes. Sociometric status was assessed using best friend and play rating measures. Consistent with previous research, children with intellectual disability were less socially accepted than were a matched group of 32 children with no learning disabilities. Children in partially integrated classes received more play nominations than those in separate classes, but had no greater acceptance as a best friend. On teachers' reports, disabled children had higher levels of inappropriate social behaviours, but there was no significant difference in appropriate behaviours. Self-assessments by integrated children were more negative than those by children in separate classes, and their peer-relationship satisfaction was lower. Ratings by disabled children of their satisfaction with peer relationships were associated with ratings of appropriate social skills by themselves and their teachers, and with self-ratings of negative behaviour. The study confirmed that partial integration can have negative consequences for children with an intellectual disability.

Relevance: 20.00%

Abstract:

In recent years greater emphasis has been placed by many Law Schools on teaching not only the substantive content of the law but also the skills needed for the practice of the law. Negotiation is one such skill. However, effective teaching of negotiation may be problematic in the context of large numbers of students studying in a variety of modes and often juggling other time commitments. This paper examines the Air Gondwana program, a blended learning environment designed to address these challenges. The program demonstrates that ICT can be used to create an authentic learning experience which engages and stimulates students.