381 results for User generated contents (UGC)


Relevance: 100.00%

Abstract:

This article examines the merger between AOL and the Huffington Post. The broader issues around the merger will be investigated, especially the implications for rights, in particular free expression, and the conditions for their exercise online. One major issue is the status of user-generated content and how the existing legal regime reflects the ethical concerns of users over how their content, data and information are used and commodified by for-profit Internet intermediaries and platforms, especially when they start to merge and form concentrations. The extent to which current legal regimes, especially human rights law, deal with these problems adequately will be assessed, along with some suggested alternative approaches that may be more effective in practice.

Relevance: 100.00%

Abstract:

This paper provides a framework for understanding Twitter as a historical source. We address digital humanities scholars to enable the transfer of concepts from traditional source criticism to new media formats, and to encourage the preservation of Twitter as a cultural artifact. Twitter has established itself as a key social media platform which plays an important role in public, real-time conversation. Twitter is also unique in that its content is being archived by a public institution (the Library of Congress). In this paper we show that much of the contextual information beyond the tweet texts themselves must be assumed already lost, and we propose additional objectives for preservation.

Relevance: 40.00%

Abstract:

We describe research into the identification of anomalous events and event patterns as manifested in computer system logs. Prototype software has been developed that identifies anomalous events based on usage patterns or user profiles, and alerts administrators when such events are identified. To reduce the number of false positive alerts, we have investigated different user profile training techniques and introduced the use of abstractions to group together related applications. Our results suggest that the number of false alerts generated is significantly reduced when a growing time window is used for user profile training and when abstraction into groups of applications is used.
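A minimal sketch of the profiling-and-abstraction idea described above, assuming a toy log format; the application groupings, the frequency threshold and all names are illustrative assumptions, not the prototype's actual implementation:

```python
from collections import Counter

# Illustrative abstraction: map related applications to a single group
# (assumed groupings, not the paper's actual taxonomy).
APP_GROUPS = {
    "firefox": "browser", "chrome": "browser",
    "vim": "editor", "emacs": "editor",
    "ssh": "remote", "scp": "remote",
}

def build_profile(events):
    """Count how often each application group appears in a user's history."""
    return Counter(APP_GROUPS.get(app, app) for app in events)

def is_anomalous(profile, app, min_count=1):
    """Flag an event whose group was (almost) never seen during training."""
    return profile[APP_GROUPS.get(app, app)] < min_count

# Growing time window: retrain on *all* history seen so far, so the profile
# keeps absorbing legitimate new behaviour and repeat false alerts drop.
history = []
for day_events in [["firefox", "vim"], ["chrome", "ssh"], ["scp"]]:
    profile = build_profile(history)  # profile built from the growing window
    for app in day_events:
        if history and is_anomalous(profile, app):
            print(f"ALERT: unusual application group for this user: {app}")
    history.extend(day_events)
```

Note how the abstraction suppresses a false alert for "chrome" (its "browser" group was already seen via "firefox"), while the growing window prevents "scp" from alerting once "ssh" has established the "remote" group.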

Relevance: 30.00%

Abstract:

An important aspect of designing any product is validation. The virtual design process (VDP) is an alternative to hardware prototyping in which designs can be analysed without manufacturing physical samples. In recent years, VDPs have been generated mainly for animation and filming applications. This paper proposes a virtual reality design process model for use as a validation tool, generating a complete design guideline and validation tool for product design. To support the design process, a virtual environment and VDP method were developed that support validation and an initial design cycle performed by a designer. A car carrier product model is used as the illustration for which a virtual design was generated. The loading and unloading sequence for the prototype was generated using automated reasoning techniques and completed by interactively animating the product in the virtual environment before the complete design was built. Using the VDP, critical issues such as loading, unloading, Australian Design Rules (ADR) compliance and clearance analysis were addressed. The process saves time and money in physical sampling and, to a large extent, in complete math generation; since only schematic models are required, it also saves time in math modelling and in handling larger assemblies with complex models. This extension of the VDP to design evaluation is unique, and it was developed and implemented successfully. A Toll Logistics and J Smith and Sons car carrier, developed under the author's responsibility, is used to illustrate our approach of generating design validation via the VDP.
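To illustrate the kind of automated reasoning described above, here is a hedged sketch of rule-based load/unload sequencing with a simple clearance check; the deck clearances, slot model and vehicles are invented for illustration and are not data from the actual car carrier design:

```python
# Assumed deck clearance limits in metres (illustrative values only).
DECK_CLEARANCE_M = {"lower": 2.0, "upper": 1.6}

def check_clearance(vehicle):
    """Reject any vehicle taller than its assigned deck's clearance."""
    limit = DECK_CLEARANCE_M[vehicle["deck"]]
    if vehicle["height_m"] >= limit:
        raise ValueError(f"{vehicle['id']} fails clearance on {vehicle['deck']} deck")

def plan_sequences(vehicles):
    """Load rear slots first; unloading is the reverse (a vehicle in a front
    slot would otherwise block the ramp), giving a LIFO sequence per deck."""
    for v in vehicles:
        check_clearance(v)
    load = sorted(vehicles, key=lambda v: (v["deck"], -v["slot"]))
    return load, list(reversed(load))

vehicles = [
    {"id": "sedan", "deck": "upper", "slot": 1, "height_m": 1.45},
    {"id": "wagon", "deck": "upper", "slot": 2, "height_m": 1.50},
    {"id": "suv",   "deck": "lower", "slot": 1, "height_m": 1.85},
]
load_seq, unload_seq = plan_sequences(vehicles)
print([v["id"] for v in load_seq])    # loading order
print([v["id"] for v in unload_seq])  # unloading order
```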

Relevance: 30.00%

Abstract:

Worldwide interest is growing in the use of fibre reinforced polymer (FRP) composites in the rehabilitation of reinforced concrete structures. As a replacement for traditional steel plates or external post-tensioning in strengthening applications, various types of FRP plates, with their high strength-to-weight ratio and good corrosion resistance, represent an ideal class of material for external retrofitting. Within the last ten years, many design guidelines have been published to provide guidance for the selection, design and installation of FRP systems for the external strengthening of concrete structures. Use of these guidelines requires an understanding of a number of issues pertaining to the properties and structural failure modes specific to these materials. A research initiative funded by the CRC for Construction Innovation was undertaken (primarily at RMIT) to develop a decision support tool and a user-friendly guide for the use of FRP composites in the rehabilitation of concrete structures. The user guidelines presented in this report were developed after industry consultation and a comprehensive review of state-of-the-art technology. The scope of the guide was developed mainly from the outcomes of two workshops with the Queensland Department of Main Roads (QDMR). The document covers material properties, recommended construction requirements, design philosophy, flexural, shear and torsional strengthening of beams, and strengthening of columns. In developing this document, the guidelines published in FIB Bulletin 14 (2002) by Task Group 9.3 of the International Federation for Structural Concrete (FIB) and the American Concrete Institute Committee 440 report (2002) were consulted, in conjunction with the provisions of the Austroads Bridge Design Code (1992) and the Australian concrete structures code AS3600 (2002). The user guide concludes with design examples covering typical strengthening scenarios.

Relevance: 30.00%

Abstract:

SAP and its research partners have been developing a language for describing details of services from various viewpoints, called the Unified Service Description Language (USDL). At the time of writing, version 3.0 describes technical implementation aspects of services, as well as stakeholders, pricing, lifecycle, and availability. Work is also underway to address other business and legal aspects of services. This language is designed to be used in service portfolio management, with a repository of service descriptions being available to various stakeholders in an organisation to allow for service prioritisation, development, deployment and lifecycle management. The structure of the USDL metadata is specified using an object-oriented metamodel that conforms to UML, MOF and EMF Ecore. As such it is amenable to code generation for implementations of repositories that store service description instances. Although Web services toolkits can be used to make these programming language objects available as a set of Web services, the practicalities of writing distributed clients against over one hundred class definitions, containing several hundred attributes, will make for very large WSDL interfaces and highly inefficient "chatty" implementations. This paper gives the high-level design for a completely model-generated repository for any version of USDL (or any other data-only metamodel), which uses the Eclipse Modelling Framework's Java code generation, along with several open source plugins, to create a robust, transactional repository running in a Java application with a relational datastore. However, the repository exposes a generated WSDL interface at a coarse granularity, suitable for distributed client code and user-interface creation. It uses heuristics to drive code generation to bridge between the Web service and EMF granularities.
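The paper's repository is generated in Java from EMF Ecore, but the granularity-bridging heuristic can be illustrated language-agnostically. The sketch below (in Python, with invented class and method names) shows the core idea: exposing one coarse operation that transfers a whole containment tree, rather than a remote getter/setter for every attribute of every object:

```python
from dataclasses import dataclass, field, asdict

# Fine-grained, data-only model (a stand-in for generated metamodel classes).
@dataclass
class PricePlan:
    name: str
    amount: float

@dataclass
class Service:
    id: str
    title: str
    plans: list[PricePlan] = field(default_factory=list)

class ServiceRepository:
    """Coarse-grained facade: one call moves a whole Service tree, instead
    of a 'chatty' remote getter/setter per attribute of every object."""
    def __init__(self):
        self._store = {}

    def put_service(self, service: Service) -> None:
        self._store[service.id] = service

    def get_service_tree(self, service_id: str) -> dict:
        # Serialise the entire containment tree in a single response,
        # analogous to what a coarse generated WSDL operation would return.
        return asdict(self._store[service_id])

repo = ServiceRepository()
repo.put_service(Service("s1", "Payroll", [PricePlan("basic", 9.99)]))
print(repo.get_service_tree("s1"))
```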

Relevance: 30.00%

Abstract:

Information overload has become a serious issue for web users. Personalisation can provide effective solutions to overcome this problem. Recommender systems are one popular personalisation tool for helping users deal with this issue. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affects the performance of recommender systems and other personalisation systems. In Web 2.0, emerging user information provides new possible solutions for profiling users. Folksonomy, or tag information, is a typical kind of Web 2.0 information; it implies users' topic interests and opinions, and has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise, such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solve the tag quality problem and profile users accurately. Harvesting the wisdom of crowds and of experts, three new user profiling approaches are proposed: a folksonomy-based approach, a taxonomy-based approach, and a hybrid approach based on both folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user- and item-based collaborative filtering approaches, combined with content filtering methods, are proposed to make recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites. The experimental results demonstrate that the proposed user profiling and recommendation approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on advanced cloud computing techniques such as Hadoop, MapReduce and Cascading. The scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users solve information overload issues by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better use of taxonomy information given by experts and folksonomy information contributed by users in Web 2.0.
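As a rough illustration of folksonomy-based profiling, the sketch below builds tag-vector profiles and compares users with cosine similarity, a common basis for user-based collaborative filtering; the weighting scheme and data are assumptions, not the thesis's exact approach:

```python
import math
from collections import Counter

def tag_profile(bookmarks):
    """Build a user profile as a weighted tag vector from tagging history."""
    profile = Counter()
    for tags in bookmarks.values():
        profile.update(tags)
    return profile

def cosine(p, q):
    """Cosine similarity between two tag vectors (Counters)."""
    dot = sum(p[t] * q[t] for t in p)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0

# Illustrative tagging data in the style of CiteULike/Amazon bookmarks.
alice = tag_profile({"item1": ["python", "ml"], "item2": ["ml", "data"]})
bob   = tag_profile({"item3": ["ml", "data"], "item4": ["cooking"]})

# User-based collaborative filtering would recommend items bookmarked
# by the most similar users.
print(f"user similarity: {cosine(alice, bob):.2f}")
```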

Relevance: 30.00%

Abstract:

This thesis is a study of how the contents of volatile memory on the Windows operating system can be better understood and utilised for the purposes of digital forensic investigations. It proposes several techniques to improve the analysis of memory, with a focus on improving the detection of unknown code such as malware. These contributions allow the creation of a more complete reconstruction of the state of a computer at acquisition time, including whether or not the computer has been infected by malicious code.
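The thesis's specific techniques are not detailed in this abstract; as one hedged illustration of detecting unknown code in memory, the sketch below hashes fixed-size pages of a raw memory image and flags pages absent from a whitelist of known-good code (the page size, paths and the load_whitelist helper are hypothetical):

```python
import hashlib

PAGE_SIZE = 4096  # x86 page granularity (an assumption for this sketch)

def hash_pages(image_path):
    """Hash each page of a raw memory image so executable pages can be
    compared against a whitelist of known-good code."""
    with open(image_path, "rb") as f:
        offset = 0
        while page := f.read(PAGE_SIZE):
            yield offset, hashlib.sha256(page).hexdigest()
            offset += PAGE_SIZE

def flag_unknown(image_path, known_hashes):
    """Pages matching nothing in the whitelist are candidates for unknown
    (possibly malicious) code and deserve manual analysis."""
    return [off for off, h in hash_pages(image_path) if h not in known_hashes]

# Usage (paths and the whitelist loader are hypothetical):
# suspicious = flag_unknown("memory.raw", load_whitelist("goodhashes.db"))
```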

Relevance: 30.00%

Abstract:

The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and to the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool that allowed access to, manipulation of, and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. The tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. It accounts for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within a platform, such as a tweet, a YouTube video, a Facebook post, an email or a television episode. Core content is produced by the creators of multiplatform experiences to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other. By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
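A hedged sketch of the aggregation idea behind such a tool: per-platform engagement records normalised into one structure that can be combined, filtered and sorted. The field names and data are assumptions, not Hoodlum's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EngagementEvent:
    """One row of the aggregated model: people / platform / content / time."""
    platform: str      # e.g. "twitter", "facebook", "tv"
    content_id: str    # the discrete media text (tweet, post, episode...)
    content_type: str  # "core" (creator-made) or "complementary" (audience-made)
    user: str
    timestamp: datetime

def engagement_by_platform(events):
    """Combine, filter and sort: count interactions per platform so the
    relative value of individual components can be compared."""
    counts = {}
    for e in events:
        counts[e.platform] = counts.get(e.platform, 0) + 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

events = [
    EngagementEvent("twitter", "t1", "core", "fan_a", datetime(2013, 5, 1)),
    EngagementEvent("twitter", "t2", "complementary", "fan_b", datetime(2013, 5, 2)),
    EngagementEvent("facebook", "p1", "core", "fan_a", datetime(2013, 5, 1)),
]
print(engagement_by_platform(events))  # [('twitter', 2), ('facebook', 1)]
```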

Relevance: 30.00%

Abstract:

With the spread of social media websites on the internet, and the huge number of users participating in and generating vast amounts of content on these websites, personalisation has become a necessity. One of the major issues in personalisation is building users' profiles, which depends on many elements, such as the data used, the application domain they aim to serve, the representation method and the construction methodology. Recently, this area of research has been a focus for many researchers, and hence the number of proposed methods is growing very quickly. This survey discusses the available user modelling techniques for social media websites, highlights the weaknesses and strengths of these methods, and provides a vision for future work in user modelling for social media websites.

Relevance: 30.00%

Abstract:

To ensure the safe operation of Web-based systems in Web environments, we propose an SSPA (Server-based SHA-1 Page-digest Algorithm) to verify the integrity of Web contents before the server issues an HTTP response to a user request. In addition to standard security measures, our Java implementation of the SSPA, called the Dynamic Security Surveillance Agent (DSSA), provides further security in terms of content integrity for Web-based systems. Its function is to prevent the display of Web contents that have been altered through the malicious acts of attackers and intruders on client machines. This protects the reputation of organisations from cyber-attacks and ensures the safe operation of Web systems by dynamically monitoring the integrity of a Web site's content on demand. We discuss our findings in terms of the applicability and practicality of the proposed system. We also discuss its time metrics, specifically its computational overhead at the Web server and the overall latency from the clients' point of view, using different Internet access methods. The SSPA, our DSSA implementation, some experimental results and related work are all discussed.
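A minimal sketch of the digest check the SSPA performs, assuming a simple in-memory store of trusted digests; the actual DSSA is a Java agent and its storage details are not given in this abstract:

```python
import hashlib

# Trusted digests recorded when pages were published (storage is assumed;
# in the paper a server-side agent maintains these).
trusted_digests = {}

def register_page(path, content: bytes):
    """Record the SHA-1 digest of a page at publication time."""
    trusted_digests[path] = hashlib.sha1(content).hexdigest()

def verify_before_response(path, content: bytes) -> bool:
    """Recompute the SHA-1 page digest and compare it with the trusted
    value; serve the page only if they match, otherwise withhold it."""
    return hashlib.sha1(content).hexdigest() == trusted_digests.get(path)

register_page("/index.html", b"<html>welcome</html>")
assert verify_before_response("/index.html", b"<html>welcome</html>")
assert not verify_before_response("/index.html", b"<html>defaced!</html>")
```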

Relevance: 30.00%

Abstract:

This study constructs performance prediction models to estimate end-user perceived video quality on mobile devices for the latest video encoding techniques, VP9 and H.265. Both subjective and objective video quality assessments were carried out to collect data and select the most desirable predictors. Using statistical regression, two models were generated, achieving prediction accuracies of 94.5% and 91.5% respectively, depending on whether the predictor derived from the objective assessment is included. These models can be used directly by media industries for video quality estimation, and will ultimately help them ensure a positive end-user quality of experience on future mobile devices following the adoption of the latest video encoding technologies.
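As a rough illustration of the modelling approach (statistical regression over assessment-derived predictors), the sketch below fits a linear model; the predictor names, training values and library choice are assumptions, not the study's actual model or data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Features per clip: [bitrate_kbps, vertical_resolution, objective_score]
# (invented predictors standing in for those selected in the study).
X = np.array([
    [500, 360, 0.62],
    [1500, 720, 0.81],
    [3000, 1080, 0.90],
    [6000, 1080, 0.95],
])
# Target: mean opinion score (MOS) from a subjective assessment, on 1-5.
y = np.array([2.1, 3.4, 4.2, 4.6])

model = LinearRegression().fit(X, y)
predicted_mos = model.predict([[2000, 720, 0.85]])
print(f"predicted MOS: {predicted_mos[0]:.2f}")
```

Dropping the objective-score column would correspond to the study's second model, built without the objective-assessment predictor.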

Relevance: 30.00%

Abstract:

BACKGROUND OR CONTEXT
Thermodynamics is a core subject for mechanical engineers, yet it is notoriously difficult. Evidence suggests students struggle to understand and apply its core fundamental concepts, with analysis indicating a problem with student learning and engagement. A contributing factor is that thermodynamics is a 'science involving concepts based on experiments' (Mayhew 1990), with subject matter that cannot be completely defined a priori. To succeed, students must engage in a deep, holistic approach while taking ownership of their learning. The difficulty in achieving this often manifests itself in students 'not getting' the principles and declaring thermodynamics 'hard'.

PURPOSE OR GOAL
Traditionally, students practise and 'learn' the application of thermodynamics in their tutorials; however, these do not consider prior conceptions (Holman & Pilling 2004). As 'hands-on' learning is the desired outcome of tutorials, it is pertinent to study methods of improving their efficacy. Within the Australian context, the format of thermodynamics tutorials has remained relatively unchanged over the decades, relying, anecdotally, on a primarily didactic pedagogical approach. Such approaches are not conducive to deep learning (Ramsden 2003), with students often disengaged from the learning process. Evidence suggests (Haglund & Jeppsson 2012), however, that a deeper level and ownership of learning can be achieved using a more constructivist approach, for example through self-generated analogies. This pilot study aimed to collect data to support the hypothesis that the 'difficulty' of thermodynamics is associated with the pedagogical approach of tutorials rather than with actual difficulty in the subject content or a deficiency in students.

APPROACH
Successful application of thermodynamic principles requires solid knowledge of the core concepts. Typically, tutorial sessions guide students in this application. However, a lack of deep and comprehensive understanding can lead to student confusion, resulting in learning the 'process' of application without understanding 'why'. The aim of this study was to gain empirical data on student learning of both concepts and application within thermodynamics tutorials. The approach taken for data collection and analysis was:
1. Four concurrent tutorial streams were timetabled to examine student engagement and learning in traditional 'didactic' (3 weeks) and non-traditional (3 weeks) formats. In each week, two of the four selected sessions were traditional and two non-traditional, providing a control group for each week.
2. The non-traditional tutorials involved activities designed to promote student-centred deep learning. The specific pedagogies employed were: self-generated analogies, constructivist learning, peer-to-peer learning, inquiry-based learning, ownership of learning and active learning.
3. After a three-week period, the teaching styles of the selected groups were switched, to allow each group to experience both approaches with the same tutor. This also acted to minimise any influence of tutor personality or style on the data.
4. At the conclusion of the trial, participants completed a 'five-minute essay' on how they liked the sessions, a small questionnaire modelled on the modified (Christo & Hoang 2013) SPQ designed by Biggs (1987), and a small formative quiz to gauge the level of learning achieved.

DISCUSSION
Preliminary results indicate that overall students respond positively to in-class demonstrations (inquiry-based learning) and active learning activities. Within the active learning exercises, the current data suggest students preferred individual rather than group or peer-to-peer activities. Preliminary results from the open-ended questions, such as "What did you like most/least about this tutorial?" and "Do you have other comments on how this tutorial could better facilitate your learning?", however, indicated polarising views on the non-traditional tutorial. Some students responded that they really liked the format and the emphasis on understanding the concepts, while others were very vocal that they 'hated' the style and just wanted the solutions to be presented by the tutor.

RECOMMENDATIONS/IMPLICATIONS/CONCLUSION
Preliminary results indicated a mixed but overall positive response by students to more collaborative tutorials employing tasks promoting inquiry-based, peer-to-peer, active, and ownership-of-learning activities. Preliminary student feedback supports evidence that students learn differently, and that running tutorials focused on only one pedagogical approach (typically didactic) may not be beneficial to all students. Further, preliminary data suggest that the learning and teaching styles of both students and tutor are important in promoting deep learning. Data collection is ongoing and scheduled for completion at the end of First Semester (Australian academic calendar). The final paper will examine the results and analysis of this project in more detail.

Relevance: 30.00%

Abstract:

Twitter's hashtag functionality is now used for a very wide variety of purposes, from covering crises and other breaking news events, through gathering an instant community around shared media texts (such as sporting events and TV broadcasts), to signalling emotive states from amusement to despair. These divergent uses of the hashtag are increasingly recognised in the literature, with attention paid especially to the ability of hashtags to facilitate the creation of ad hoc or hashtag publics. A more comprehensive understanding of these different uses of hashtags has yet to be developed, however. Previous research has explored the potential for a systematic analysis of the quantitative metrics that can be generated from processing a series of hashtag datasets. Such research found, for example, that crisis-related hashtags exhibited a significantly larger incidence of retweets and tweets containing URLs than hashtags relating to televised events, and on this basis hypothesised that the information-seeking and -sharing behaviours of Twitter users in these different contexts were substantially divergent. This article updates that study and its methodology by examining the communicative metrics of a considerably larger and more diverse set of hashtag datasets, compiled over the past five years. This provides an opportunity both to confirm earlier findings and to explore whether hashtag use practices have shifted as Twitter's userbase has developed further; it also enables the identification of further hashtag types beyond the "crisis" and "mainstream media event" types outlined to date. The article also explores the presence of such patterns beyond recognised hashtags, by incorporating an analysis of a number of keyword-based datasets. This large-scale, comparative approach contributes towards the establishment of a more comprehensive typology of hashtags and their publics, and the metrics it describes can also be used to classify new hashtags as they emerge. In turn, this may enable researchers to develop systems for automatically classifying newly trending topics into a number of event types, which may be useful, for example, for the automatic detection of acute crises and other breaking news events.
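A small sketch of how the retweet and URL incidence metrics discussed here could be computed over a hashtag dataset; the tweet-dict format and the heuristics for detecting retweets and URLs are simplifying assumptions:

```python
def hashtag_metrics(tweets):
    """Compute per-dataset metrics of the kind the article describes:
    the share of retweets and of tweets containing URLs."""
    n = len(tweets)
    retweets = sum(1 for t in tweets if t["text"].startswith("RT @"))
    with_urls = sum(1 for t in tweets if "http" in t["text"])
    return {"% retweets": 100 * retweets / n, "% with URLs": 100 * with_urls / n}

# Crisis-style datasets tend to show higher values on both metrics than
# televised-event datasets, which is the basis for the proposed typology.
sample = [
    {"text": "RT @abc: evacuation routes http://example.org/map"},
    {"text": "watching the game #final"},
    {"text": "RT @news: flood warning issued"},
]
print(hashtag_metrics(sample))
```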