863 results for World wide web (www)


Relevance: 100.00%

Publisher:

Abstract:

Stigmergy is a biological term used when discussing insect or swarm behavior, describing a model in which communication takes place through the environment rather than directly between artefacts or agents. This phenomenon is demonstrated in the behavior of ants and their food-gathering process when following pheromone trails, or similarly in termites and their mound-building process. What is interesting about this mechanism is that highly organized societies are achieved without any apparent management structure. Stigmergic behavior is implicit in the Web, where the volume of users provides self-organization and self-contextualization of content on sites that facilitate collaboration. However, the majority of content is generated by a minority of Web participants. A significant contribution from this research would be to create a model of Web stigmergy, identifying virtual pheromones and their importance in the collaborative process. This paper explores how exploiting stigmergy has the potential to provide a valuable mechanism for identifying and analyzing online user behavior, recording actionable knowledge otherwise lost in existing web interaction dynamics. Ultimately this might assist us in building better collaborative Web sites.
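
As an illustration of the idea (not taken from the paper), the sketch below models hypothetical "virtual pheromones" on Web content: every user interaction deposits pheromone on an item, and a periodic evaporation step lets unvisited items fade, so frequently revisited content self-organizes to the top. All class and method names are invented for this example.

```python
# Minimal sketch (not from the paper): modelling "virtual pheromones" on Web content.
# Each user interaction deposits pheromone on an item; periodic evaporation lets
# stale items fade, so frequently revisited content self-organizes to the top.

from collections import defaultdict

class PheromoneMap:
    def __init__(self, deposit=1.0, evaporation=0.1):
        self.levels = defaultdict(float)   # item id -> pheromone level
        self.deposit = deposit             # amount added per interaction
        self.evaporation = evaporation     # fraction lost per time step

    def interact(self, item_id):
        """A user view, edit or link acts like an ant laying pheromone."""
        self.levels[item_id] += self.deposit

    def evaporate(self):
        """Called once per time step; unvisited items decay toward zero."""
        for item_id in list(self.levels):
            self.levels[item_id] *= (1.0 - self.evaporation)

    def ranked(self, top_n=10):
        """Content ordered by accumulated trail strength."""
        return sorted(self.levels.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

# Example: one item revisited repeatedly, one touched once.
trails = PheromoneMap()
for _ in range(5):
    trails.interact("wiki/CollaborativePage")
trails.interact("wiki/StalePage")
trails.evaporate()
print(trails.ranked())
```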

Relevance: 100.00%

Publisher:

Abstract:

Teaching awards, grants and fellowships are strategies used to recognise outstanding contributions to learning and teaching, encourage innovation, and shift learning and teaching from the edge to centre stage. Examples range from school, faculty and institutional award and grant schemes to national schemes such as those offered by the Australian Learning and Teaching Council (ALTC), the Carnegie Foundation for the Advancement of Teaching in the United States, and the Fund for the Development of Teaching and Learning in higher education in the United Kingdom. The Queensland University of Technology (QUT) has experienced outstanding success in all areas of ALTC funding since the inception of the Carrick Institute for Learning and Teaching in 2004. This paper reports on a study of the critical factors that have enabled sustainable and resilient institutional engagement with ALTC programs. As a lens for examining the QUT environment and practices, the study draws upon the five conditions of the framework for effective dissemination of innovation developed by Southwell, Gannaway, Orrell, Chalmers and Abraham (2005, 2010):
1. Effective, multi-level leadership and management
2. Climate of readiness for change
3. Availability of resources
4. Comprehensive systems in institutions and funding bodies
5. Funding design
The discussion of the critical factors, and of the practical and strategic lessons learnt for successful university-wide engagement, offers insights for university leaders and staff who are responsible for learning and teaching award, grant and associated internal and external funding schemes.

Relevance: 100.00%

Publisher:

Abstract:

This conference celebrates the passing of 40 years since the establishment of the Internet (dating this, presumably, to the first connection between two nodes on ARPANET in October 1969). For a gathering of media scholars such as this, however, it may be just as important not only to mark the first testing of the core technologies upon which much of our present-day Net continues to build, but also to reflect on another recent milestone: the 20th anniversary of what is today arguably the chief interface through which billions around the world access and experience the Internet – the World Wide Web, launched by Tim Berners-Lee in 1989.

Relevance: 100.00%

Publisher:

Abstract:

Over the last two decades, the internet and e-commerce have reshaped the way we communicate, interact and transact. In the converged environment enabled by high-speed broadband, Web 2.0, social media, virtual worlds, user-generated content, cloud computing, VoIP, open-source software and open content have rapidly become established features of our online experience. Business and government alike are increasingly using the internet as the preferred platform for the delivery of their goods and services and for effective engagement with their clients. New ways of doing things online, and challenges to existing business, government and social activities, have tested current laws and often demand new policies and laws adapted to the new realities. The focus of this book is the regulation of social, cultural and commercial activity on the World Wide Web. It considers developments in the law that have been, and continue to be, brought about by the emergence of the internet and e-commerce. It analyses how the law is applied to define rights and obligations in relation to online infrastructure, content and practices.

Relevance: 100.00%

Publisher:

Abstract:

Australian universities are currently engaging with new governmental policies and regulations that require them to demonstrate enhanced quality and accountability in teaching and research. The development of national academic standards for learning outcomes in higher education is one such instance of this drive for excellence. These discipline-specific standards articulate the minimum, or Threshold Learning Outcomes, to be addressed by higher education institutions so that graduating students can demonstrate their achievement to their institutions, accreditation agencies, and industry recruiters. This has an impact not only on the design of Engineering courses (with particular emphasis on pedagogy and assessment), but also on the preparation of academics to engage with these standards and implement them in their day-to-day teaching practice at a micro level.

This imperative for enhanced quality and accountability in teaching is also significant at a meso level: according to the Australian Bureau of Statistics, about 25 per cent of teachers in Australian universities are aged 55 and above and more than 54 per cent are aged 45 and above (ABS, 2006). A number of institutions have undertaken recruitment drives to regenerate and enrich their academic workforce by appointing capacity-building research professors and increasing the numbers of early- and mid-career academics.

This nationally driven agenda for quality and accountability in teaching also permeates the micro level of engineering education, since the demand for enhanced academic standards and learning outcomes requires both strong advocacy for a shift to authentic, collaborative, outcomes-focused education and the mechanisms to support academics in transforming their professional thinking and practice. Outcomes-focused education means giving greater attention to the ways in which curriculum design, pedagogy, assessment approaches and teaching activities can most effectively make a positive, verifiable difference to students' learning. Such education is authentic when it is couched firmly in the realities of learning environments, student and academic staff characteristics, and trustworthy educational research. That education will be richer and more efficient when staff work collaboratively, contributing their knowledge, experience and skills to achieve learning outcomes based on agreed objectives.

We know that the school or departmental levels of universities are the most effective loci of change in approaches to teaching and learning practices in higher education (Knight & Trowler, 2000). Heads of Schools are increasingly being entrusted with more responsibilities: in addition to setting strategic directions and managing the operational and sometimes financial aspects of their school, they are also expected to lead the development and delivery of teaching, research and other academic activities. Guiding and mentoring individuals and groups of academics is one critical aspect of the Head of School's role, yet Heads do not always have the resources or support to help them mentor staff, especially the more junior academics. In summary, the international trend in undergraduate engineering course accreditation towards the demonstration of attainment of graduate attributes poses new challenges in addressing academic staff development needs and the assessment of learning.

This paper will give some insights into the conceptual design, implementation and empirical effectiveness to date of a Fellow-In-Residence Engagement (FIRE) program. The program is proposed as a model for achieving better engagement of academics with contemporary issues and for effectively enhancing their teaching and assessment practices. The paper will also report on the program's collaborative approach to working with Heads of Schools to better support academics, especially early-career ones, by utilizing formal and informal mentoring. Further, it will discuss possible factors that may assist the achievement of the intended outcomes of such a model, and will examine its contributions to engendering outcomes-focused thinking in engineering education.

Relevance: 100.00%

Publisher:

Abstract:

Although topic detection and tracking techniques have made great progress, most researchers pay little attention to the following two aspects. First, the construction of a topic model does not take the characteristics of different topics into consideration. Second, the factors that determine the formation and development of hot topics are not analyzed further. In order to correctly extract news blog hot topics, this paper views the above problems from a new perspective based on the W2T (Wisdom Web of Things) methodology, in which the characteristics of blog users, the context of topic propagation and information granularity are investigated in a unified way. The motivations and features of blog users are first analyzed to understand the characteristics of news blog topics. Then the context of topic propagation is decomposed into the blog community, the topic network and the opinion network. Important factors such as user behavior patterns, opinion leaders and network opinion are identified to track the development trends of news blog topics. Moreover, a blog hot topic detection algorithm is proposed, in which news blog hot topics are identified by measuring duration, topic novelty, the attention degree of users and topic growth. Experimental results show that the proposed method is feasible and effective. These results are also useful for further studying the formation mechanism of opinion leaders in blogspace.
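
A minimal sketch of how the four measures named in the abstract (duration, topic novelty, user attention and topic growth) might be combined into a hotness score follows; the weights, normalisations and threshold are assumptions made for illustration, not the paper's actual formulas.

```python
# Illustrative sketch only: hot topics are identified by combining duration,
# topic novelty, user attention and topic growth. The weights and the
# saturation rule below are assumed values, not taken from the paper.

from dataclasses import dataclass

@dataclass
class TopicStats:
    duration_days: float     # how long the topic has stayed active
    novelty: float           # 0..1, dissimilarity to previously seen topics
    attention: float         # 0..1, normalised comments/views from blog users
    growth: float            # 0..1, normalised increase in posts per day

def hotness(t: TopicStats, w=(0.2, 0.3, 0.3, 0.2)) -> float:
    """Weighted combination of the four factors named in the abstract."""
    duration_score = min(t.duration_days / 7.0, 1.0)   # saturate after a week
    return (w[0] * duration_score + w[1] * t.novelty
            + w[2] * t.attention + w[3] * t.growth)

def detect_hot_topics(topics, threshold=0.6):
    return [name for name, stats in topics.items() if hotness(stats) >= threshold]

topics = {
    "election-debate": TopicStats(5, 0.8, 0.9, 0.7),
    "local-weather":   TopicStats(2, 0.1, 0.2, 0.1),
}
print(detect_hot_topics(topics))   # -> ['election-debate']
```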

Relevance: 100.00%

Publisher:

Abstract:

Structural dynamics is the study of the response of structures to dynamic, or time-varying, loads. The topic has emerged as one of importance to all structural engineers due to three developments in structural engineering in the new millennium: (i) vibration problems in slender structures that have emerged due to new material technology and aesthetic requirements, (ii) ageing structures such as bridges whose health needs to be monitored and appropriate retrofitting carried out to prevent failure, and (iii) the increased vulnerability of structures to random loads such as seismic, impact and blast loads. Knowledge of structural dynamics is necessary to address these issues and their consequences. During the past two decades, research in structural dynamics has generated a considerable amount of new information to address these issues. This new knowledge is not readily made available to practicing engineers, and very little or none of it enters the classroom. There is no universal emphasis on including structural dynamics and this recently generated knowledge in the civil/structural curriculum. This paper argues for the need to include structural dynamics in the syllabus of all civil engineering courses, especially those having a first or second major in structural engineering. This will enable our future structural engineers to design and maintain safe and efficient structures.
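
For readers unfamiliar with the field, the standard textbook statement of the problem (not specific to this paper) is the multi-degree-of-freedom equation of motion, which relates the structure's displacement response u(t) to a time-varying load F(t) through the mass, damping and stiffness matrices:

\[
  \mathbf{M}\,\ddot{\mathbf{u}}(t) + \mathbf{C}\,\dot{\mathbf{u}}(t) + \mathbf{K}\,\mathbf{u}(t) = \mathbf{F}(t)
\]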

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, to improve the accuracy and efficiency of WebPut. In addition, several optimization techniques are proposed to reduce the cost of estimating the confidence of imputation queries at both the tuple level and the database level. Experiments based on several real-world data collections demonstrate not only the effectiveness of WebPut compared to existing approaches, but also the efficiency of our proposed algorithms and optimization techniques.
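
The greedy scheduling idea can be sketched as follows; the function and parameter names are hypothetical stand-ins rather than WebPut's actual interfaces, and confidence estimation is abstracted into a caller-supplied function.

```python
# Sketch only (names and scoring are assumptions, not WebPut's actual API):
# a greedy loop that repeatedly imputes the missing value whose best web-search
# query currently has the highest estimated confidence, so that each filled value
# can strengthen the queries built for the values that remain.

def greedy_impute(missing_cells, build_queries, estimate_confidence, run_query):
    """
    missing_cells: list of (row, attribute) positions with missing values
    build_queries: (row, attribute) -> candidate web search queries
    estimate_confidence: query -> float in [0, 1]
    run_query: query -> imputed value, or None if the search finds nothing
    """
    remaining = list(missing_cells)
    results = {}
    while remaining:
        # Pick the cell whose best candidate query we trust the most right now.
        scored = []
        for cell in remaining:
            best = max(build_queries(cell), key=estimate_confidence)
            scored.append((estimate_confidence(best), best, cell))
        confidence, query, cell = max(scored, key=lambda s: s[0])
        value = run_query(query)
        if value is not None:
            results[cell] = value      # a filled value can improve later queries
        remaining.remove(cell)
    return results

# Tiny illustration with stubbed components (purely hypothetical data):
cells = [("row1", "capital"), ("row2", "capital")]
queries = {("row1", "capital"): ["capital of France"],
           ("row2", "capital"): ["capital of Pluto"]}
answers = {"capital of France": "Paris"}           # the second query finds nothing
print(greedy_impute(cells,
                    build_queries=lambda c: queries[c],
                    estimate_confidence=lambda q: 0.9 if q in answers else 0.2,
                    run_query=lambda q: answers.get(q)))
# -> {('row1', 'capital'): 'Paris'}
```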

Relevance: 100.00%

Publisher:

Abstract:

Currently we are facing an overwhelming growth in the number of reliable information sources on the Internet. The quantity of information available to everyone via the Internet is growing dramatically each year [15]. At the same time, the temporal and cognitive resources of human users are not changing, giving rise to the phenomenon of information overload. The World Wide Web is one of the main sources of information for decision makers (reference to my research). However, our studies show that, at least in Poland, decision makers see some important problems when turning to the Internet as a source of decision information. One of the most common obstacles raised is the distribution of relevant information among many sources, and the consequent need to visit different Web sources in order to collect all the important content and analyze it. A few research groups have recently turned to the problem of information extraction from the Web [13]. Most effort so far has been directed toward collecting data from dispersed databases accessible via web pages (referred to as data extraction, or information extraction from the Web) and toward understanding natural language texts by means of fact, entity and association recognition (referred to as information extraction). Data extraction efforts show some interesting results; however, proper integration of web databases is still beyond us. The information extraction field has recently been very successful in retrieving information from natural language texts, but it still lacks the ability to understand more complex information, which requires the use of common-sense knowledge, discourse analysis and disambiguation techniques.

Relevance: 100.00%

Publisher:

Abstract:

Web service and business process technologies are widely adopted to facilitate business automation and collaboration. Given the complexity of business processes, it is a sought-after feature to show a business process in different views to cater for the diverse interests, authority levels, etc., of different users. Aiming to implement such flexible process views in the Web service environment, this paper presents a novel framework named FlexView to support view abstraction and concretisation of WS-BPEL processes. In the FlexView framework, a rigorous view model is proposed to specify the dependency and correlation between structural components of process views, with emphasis on the characteristics of WS-BPEL, and a set of rules is defined to guarantee the structural consistency between process views during transformations. A set of algorithms is developed to shift the abstraction and concretisation operations to the operational level. A prototype is also implemented for proof-of-concept purposes.
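
To illustrate what view abstraction and concretisation mean in practice, here is a toy sketch (not the actual FlexView model or WS-BPEL syntax, and all names are invented): a group of activities in a process tree is collapsed into a single abstract activity, and the recorded mapping lets the view be expanded back into the full process.

```python
# Toy sketch of process view abstraction/concretisation (not FlexView's model):
# collapsing chosen activities of a process into one abstract activity and
# expanding them again from the remembered mapping.

class Activity:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def __repr__(self):
        return self.name if not self.children else f"{self.name}{self.children}"

def abstract(process, group, abstract_name):
    """Replace the activities named in `group` with one abstract node,
    placed at the position of the first collapsed child."""
    new_children, collapsed, inserted = [], [], False
    for child in process.children:
        if child.name in group:
            collapsed.append(child)
            if not inserted:
                new_children.append(Activity(abstract_name))
                inserted = True
        else:
            new_children.append(child)
    return Activity(process.name, new_children), {abstract_name: collapsed}

def concretise(view, mapping):
    """Expand abstract nodes back into the underlying activities."""
    children = []
    for child in view.children:
        children.extend(mapping.get(child.name, [child]))
    return Activity(view.name, children)

order = Activity("sequence", [Activity("receiveOrder"), Activity("checkCredit"),
                              Activity("chargeCard"), Activity("shipGoods")])
view, mapping = abstract(order, {"checkCredit", "chargeCard"}, "handlePayment")
print(view)                       # payment steps hidden in this user's view
print(concretise(view, mapping))  # full process restored
```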

Relevance: 100.00%

Publisher:

Abstract:

We present an empirical evaluation and comparison of two content extraction methods in HTML: absolute XPath expressions and relative XPath expressions. We argue that the relative XPath expressions, although not widely used, should be used in preference to absolute XPath expressions in extracting content from human-created Web documents. Evaluation of robustness covers four thousand queries executed on several hundred webpages. We show that in referencing parts of real world dynamic HTML documents, relative XPath expressions are on average significantly more robust than absolute XPath ones.
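
A small illustration of the two styles of expression being compared (using the lxml library as an assumed tool and an invented page, not the paper's evaluation setup): the absolute XPath spells out the full path from the document root and breaks when the surrounding structure changes, while the relative XPath anchors on a stable attribute of the target element.

```python
# Illustration only: absolute vs relative XPath addressing of the same element.
from lxml import html

page = html.fromstring("""
<html><body>
  <div id="header">Site name</div>
  <div class="content">
    <h1>Article title</h1>
    <p class="price">42.00</p>
  </div>
</body></html>""")

absolute = "/html/body/div[2]/p"      # breaks if another div is added above the content
relative = "//p[@class='price']"      # survives unrelated layout changes

print(page.xpath(absolute)[0].text)   # '42.00'
print(page.xpath(relative)[0].text)   # '42.00'
```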

Relevance: 100.00%

Publisher:

Abstract:

The study of social phenomena on the World Wide Web has been rather fragmentary, and there is no coherent, research-based theory about sense of community in the Web environment. Sense of community is the part of one's self-concept that has to do with perceiving oneself as belonging to, and feeling affinity with, a certain social grouping. The present study aimed to find evidence for sense of community in the Web environment, and specifically to find out what the most critical psychological factors of sense of community would be. Based on known characteristics of real-life communities and sense of community, and a few occasional studies of Web communities, it was hypothesized that the following factors would be the most critical ones and that they could be grouped as prerequisites, facilitators and consequences of sense of community: awareness and social presence (prerequisites); criteria for membership and borders, common purpose, social interaction and reciprocity, norms and conformity, and common history (facilitators); and trust and accountability (consequences). In addition to the critical factors, the present study aimed to find out whether this kind of grouping would be valid. Furthermore, the effect of Web community members' background variables on sense of community was of interest. In order to answer these questions, an online questionnaire was created and tested. It included propositions that reflect factors that precede, facilitate and follow sense of community in the Web environment. A factor analysis was calculated to find the critical factors, and analyses of variance were calculated to see whether the grouping into prerequisites, facilitators and consequences was right and how the background variables would affect sense of community in the Web environment. The results indicated that the psychological structure of sense of community in the Web environment could not be represented by critical variables grouped as prerequisites, facilitators and consequences. Most factors did facilitate the sense of community, but based on this data it could not be argued that some of the factors chronologically precede sense of community and some follow it. Instead, the factor analysis revealed that the most critical factors in sense of community in the Web environment are 1) reciprocal involvement, 2) basic trust in others, 3) similarity and common purpose of members, and 4) shared history of members. The most influential background variables were the member's own participation activity (indicated by reading and writing messages) and the phase in the membership life cycle (from visitor to leader). The more the member participated and the further along the membership life cycle he was, the more he felt a sense of community. There are many descriptions of sense of community, but the present study was one of the first to actually measure the phenomenon in the Web environment, and it gained well-documented, valid results based on large data, proving that sense of community in the Web environment is possible and clarifying its psychological structure, thus enhancing the understanding of sense of community in the Web environment. Keywords: sense of community, Web community, psychology of the Internet

Relevance: 100.00%

Publisher:

Abstract:

Small firms are always vulnerable to complex technological change that may render their existing business model obsolete. This paper emphasises the need to understand how the Internet's ubiquitous World Wide Web is impacting on their operating environments. Consideration of evolutionary theory and the absorptive capacity construct provides the foundation for a discussion of how learning and discovery take place within individuals, firms and the environments with which they interact. Small firms, we argue, face difficulties identifying which routines and competencies are best aligned with the seemingly invisible dominant designs that support the pursuit of new enterprise in web-impacted environments. We argue that such difficulties largely relate to an inability to acquire external knowledge and a subsequent reliance on existing internal selection processes that may reinforce the known at the expense of the unknown. The paper concludes with consideration of how managers can overcome the expected difficulties through the development of internal routines that support the continual search for, evaluation of and acquisition of specific external knowledge.

Relevance: 100.00%

Publisher:

Abstract:

Many small firms increasingly operate in markets under siege from new entrants who exploit the technologies associated with the Internet's World Wide Web (the web). In these circumstances, interpreting the operating environment is like a vuja de, the opposite of déjà vu: a place and time where they have never been, where they have no idea what they are doing, and where they do not know who could help them. Through the story of the Caterpillar and the Butterfly, this paper considers the inherent difficulties faced by small firms contemplating the prospect of becoming an e-firm. When considered from an evolutionary perspective, the journey from small firm to small e-firm is seen not as one of choice, but rather one of necessity. In such markets, a race currently appears to exist between entrepreneurs exploiting the web's technologies and the process of natural selection acting upon firms whose routines have lost favour.

Relevance: 100.00%

Publisher:

Abstract:

The TCP protocol is used by most Internet applications today, including the recent mobile wireless terminals that use TCP for their World Wide Web, e-mail and other traffic. Recent wireless network technologies, such as GPRS, are known to cause delay spikes in packet transfer, which lead to unnecessary TCP retransmission timeouts. This dissertation proposes a mechanism, Forward RTO-Recovery (F-RTO), for detecting unnecessary TCP retransmission timeouts, thus allowing TCP to take appropriate follow-up actions. We analyze a Linux F-RTO implementation in various network scenarios and investigate different alternatives to the basic algorithm. The second part of this dissertation focuses on quickly adapting TCP's transmission rate when the underlying link characteristics change suddenly. This can happen, for example, due to vertical hand-offs between GPRS and WLAN wireless technologies. We investigate the Quick-Start algorithm which, in collaboration with the network routers, aims to quickly probe the available bandwidth on a network path and allow TCP's congestion control algorithms to use that information. Through extensive simulations we study the different router algorithms and parameters for Quick-Start, and discuss the challenges Quick-Start faces in the current Internet. We also study the performance of Quick-Start when applied to vertical hand-offs between different wireless link technologies.
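
A simplified sketch of the F-RTO decision logic follows (corner cases of the full algorithm are omitted, and the function and parameter names are invented for illustration): after a retransmission timeout the sender retransmits only the first unacknowledged segment and then uses the next two acknowledgements to decide whether the timeout was spurious.

```python
# Simplified sketch of the F-RTO idea described in the dissertation; details and
# corner cases of the actual algorithm are omitted, and all names are invented.

def frto_after_timeout(retransmit_unacked, send_new_data, acks):
    """
    retransmit_unacked(first_only=True): retransmit the first unacknowledged segment
    send_new_data(segments=2): transmit previously unsent segments
    acks: iterator of booleans, True if the incoming ACK advances the send window
    Returns 'spurious' or 'genuine'.
    """
    retransmit_unacked(first_only=True)   # step 1: conventional RTO retransmission

    if not next(acks):                    # first ACK is a duplicate:
        return "genuine"                  # fall back to normal RTO recovery

    send_new_data(segments=2)             # step 2: send new data, not retransmissions

    if next(acks):                        # second ACK also advances the window:
        return "spurious"                 # original segments were delayed, not lost
    return "genuine"                      # only duplicate ACKs: treat as a real loss

# Example: first ACK advances the window, second one is a duplicate -> genuine loss.
print(frto_after_timeout(lambda first_only: None,
                         lambda segments: None,
                         iter([True, False])))
```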