912 results for pacs: neural computing technologies
Abstract:
This paper illustrates the damage identification and condition assessment of a three-story bookshelf structure using a new frequency response function (FRF) based damage index and artificial neural networks (ANNs). A major obstacle to using measured frequency response function data is the large number of input variables presented to the ANNs. This problem is overcome by applying a data reduction technique called principal component analysis (PCA). In the proposed procedure, ANNs, with their powerful pattern recognition and classification ability, are used to extract damage information such as damage locations and severities from measured FRFs. Simple neural network models, trained by back-propagation (BP), are developed to associate the FRFs with the damaged or undamaged state of the structure and with the location and severity of the damage. Finally, the effectiveness of the proposed method is illustrated and validated using real data provided by the Los Alamos National Laboratory, USA. The results show that the PCA-based artificial neural network method is suitable and effective for damage identification and condition assessment of building structures. In addition, it is clearly demonstrated that the accuracy of the proposed damage detection method can be improved by increasing the number of baseline datasets and the number of principal components retained from the baseline dataset.
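The PCA data-reduction step described in this abstract can be sketched roughly as follows. This is a minimal illustration using SVD-based PCA on synthetic FRF-like data, not the authors' actual pipeline; the neural network stage and the real Los Alamos measurements are omitted, and the array sizes are invented for illustration:

```python
import numpy as np

def pca_reduce(frfs, n_components):
    """Reduce FRF magnitude vectors (one row per measurement) to a few
    principal-component scores via SVD on the mean-centred data."""
    mean = frfs.mean(axis=0)
    centred = frfs - mean
    # Right singular vectors of the centred data are the principal directions
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    components = vt[:n_components]
    scores = centred @ components.T  # low-dimensional coordinates for the ANN
    return scores, components, mean

# Toy data: 20 "measurements", each a 500-point FRF magnitude curve
rng = np.random.default_rng(0)
frfs = rng.normal(size=(20, 500))
scores, components, mean = pca_reduce(frfs, n_components=5)
print(scores.shape)  # (20, 5) -- 500 inputs compressed to 5 per sample
```

In a scheme like the one the abstract describes, the 5-dimensional score vectors, rather than the raw 500-point FRFs, would form the ANN's input layer.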
Abstract:
Local governments struggle to engage time-poor and seemingly apathetic citizens, as well as the city's young digital natives, the digital locals. This project aims to provide a lightweight, technological contribution towards removing the hierarchy between those who build the city and those who use it. We aim to narrow this gap by enhancing people's experience of physical spaces with digital, civic technologies that are directly accessible within that space. This paper presents the findings of a design trial allowing users to interact with a public screen via their mobile phones. The screen facilitated a feedback platform about a concrete urban planning project by posing specific questions and encouraging direct, in-situ, real-time responses via SMS and Twitter. This new mechanism offers additional benefits for civic participation, as it gives voice to residents who otherwise would not be heard. It also promotes a positive attitude towards local governments and gathers information different from that obtained through more traditional public engagement tools.
Abstract:
In recent years, various observers have pointed to the shifting paradigms of cultural and societal participation and economic production in developed nations. These changes are facilitated (although, importantly, not solely driven) by the emergence of new, participatory technologies of information access, knowledge exchange, and content production, many of which are associated with Internet and new media technologies. In an online context, such technologies are now frequently described as social software, social media, or Web 2.0, but their impact is no longer confined to cyberspace as an environment that is somehow different and separate from 'real life': user-led content and knowledge production is increasingly impacting on media, economy, law, social practices, and democracy itself.
Abstract:
Public transportation is an environment with great potential for applying innovative ubiquitous computing services to enhance user experiences. This paper provides the underpinning rationale for research that will look at how real-time passenger information systems deployed by transit authorities can provide a core platform to improve commuters' user experiences during all stages of their journey. The proposal builds on this platform to inform the design and development of innovative social media, mobile computing and geospatial information applications, with the aim of creating fun and meaningful experiences for passengers during their everyday travel. Furthermore, we present the findings of our pilot study, which aims to offer a better understanding of passengers' activities and social interactions during their daily commute.
Abstract:
In this article we identify how computational automation achieved through programming has enabled a new class of music technologies with generative music capabilities. These generative systems can have a degree of music-making autonomy that impacts on our relationships with them; we suggest that this coincides with a shift in the music-equipment relationship from tool use to a partnership. This partnership relationship can occur when we use technologies that display qualities of agency. It raises questions about the kinds of skills and knowledge that are necessary to interact musically in such a partnership. These are qualities of musicianship we call eBility. In this paper we seek to define what eBility might consist of and how consideration of it might affect music education practice. The 'e' in eBility refers not only to the electronic nature of computing systems but also to the ethical, enabling, experiential and educational dimensions of the creative relationship with technologies with agency. We hope to initiate a discussion around differentiating what we term representational technologies from those with agency and to begin to uncover the implications of these ideas for music educators in schools and communities. We hope also to elucidate the emergent theory and practice that has enabled the development of strategies for optimising this kind of eBility where the tool becomes partner. The identification of musical technologies with agency adds to the authors' list of metaphors for technology use in music education, which previously included tool, medium and instrument. We illustrate these ideas with examples and with data from our work with the jam2jam interactive music system. In this discussion we outline our experiences with jam2jam as an example of a technology with agency and describe the aspects of eBility that interaction with it promotes.
Abstract:
Privacy issues have hindered the evolution of e-health since its emergence. Patients demand better solutions for the protection of private information. Health professionals demand open access to patient health records. Existing e-health systems find it difficult to fulfil these competing requirements. In this paper, we present an information accountability framework (IAF) for e-health systems. The IAF is intended to address privacy issues and the competing concerns related to e-health. The capabilities of the IAF adhere to information accountability principles and e-health requirements. Policy representation and policy reasoning are key capabilities introduced in the IAF. We investigate how these capabilities can be realised using Semantic Web technologies. Using a case scenario, we discuss how the different types of policies in the IAF can be represented using the Open Digital Rights Language (ODRL).
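As a rough illustration of the kind of policy representation this abstract describes, the sketch below encodes a toy e-health access policy using field names borrowed from ODRL's JSON serialisation (permission, prohibition, target, action, assignee). The identifiers and the naive permission check are invented for illustration and do not reflect the IAF's actual reasoning capabilities:

```python
# A toy e-health access policy in the spirit of ODRL's JSON model.
# The patient/physician URIs below are hypothetical examples.
policy = {
    "@type": "Policy",
    "uid": "http://example.org/policy/ehr-001",
    "permission": [{
        "target": "http://example.org/ehr/patient-42/record",
        "action": "read",
        "assignee": "http://example.org/role/treating-physician",
    }],
    "prohibition": [{
        "target": "http://example.org/ehr/patient-42/record",
        "action": "distribute",
        "assignee": "http://example.org/role/treating-physician",
    }],
}

def is_permitted(policy, assignee, action, target):
    """Naive check: permitted iff a matching permission exists and no
    matching prohibition overrides it (real ODRL reasoning is richer)."""
    def matches(rule):
        return (rule["assignee"] == assignee and rule["action"] == action
                and rule["target"] == target)
    allowed = any(matches(r) for r in policy.get("permission", []))
    denied = any(matches(r) for r in policy.get("prohibition", []))
    return allowed and not denied

print(is_permitted(policy, "http://example.org/role/treating-physician",
                   "read", "http://example.org/ehr/patient-42/record"))  # True
```

A Semantic Web implementation, as the paper proposes, would express such policies in RDF and apply ontology-based reasoning rather than this hand-rolled check.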
Abstract:
The dynamic interplay between the elements of existing learning frameworks (people, pedagogy, learning spaces and technology) is challenging the traditional lecture. A paradigm is emerging from the correlation of change amongst these elements, offering new possibilities for improving the quality of the learning experience. For many universities, the design of physical learning spaces has been the focal point for blending technology and flexible learning spaces to promote learning and teaching. As the pace of technological change intensifies, affording new opportunities for engaging learners, pedagogical practice in higher education is not evolving at a comparable pace. The resulting disparity presents an opportunity to reconsider pedagogical practice for increased student engagement in physical learning spaces through active learning. This interplay between students, staff and technology is challenging the value for students of attending physical learning spaces such as the traditional lecture. Why should students attend classes devoted to content delivery when streaming and web technologies afford more flexible learning opportunities? Should we still lecture? Reconsideration of pedagogy is driving learning design at Queensland University of Technology, seeking new approaches that afford increased student engagement via active learning experiences within large lectures. This paper provides an overview and an evaluation of one of these initiatives, Open Web Lecture (OWL), an experimental web-based student response application developed by Queensland University of Technology. OWL seamlessly integrates a virtual learning environment within physical learning spaces, fostering active learning opportunities. This paper evaluates the pilot of this initiative by considering its effectiveness in increasing student engagement through web-enabled active learning opportunities in physical learning spaces.
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
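The spam/ham classification task this abstract describes can be illustrated, in a much-simplified form, with a small Naive Bayes text classifier. This is a generic sketch on invented toy reviews, not the paper's text mining or semantic language models:

```python
from collections import Counter
import math

def train(docs):
    """Multinomial Naive Bayes over word counts; docs is [(text, label)]."""
    counts = {"spam": Counter(), "ham": Counter()}
    priors = Counter()
    for text, label in docs:
        priors[label] += 1
        counts[label].update(text.lower().split())
    vocab = set(counts["spam"]) | set(counts["ham"])
    return counts, priors, vocab

def classify(text, counts, priors, vocab):
    """Pick the label with the highest log-posterior for the given text."""
    total = sum(priors.values())
    best, best_lp = None, -math.inf
    for label in priors:
        lp = math.log(priors[label] / total)
        denom = sum(counts[label].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((counts[label][w] + 1) / denom)  # Laplace smoothing
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy training reviews, invented for illustration
docs = [
    ("best product ever buy now amazing deal", "spam"),
    ("amazing amazing must buy five stars", "spam"),
    ("battery life is decent but the screen scratches easily", "ham"),
    ("arrived late packaging damaged works fine though", "ham"),
]
model = train(docs)
print(classify("amazing deal buy now", *model))  # spam
```

The abstract's point is precisely that such surface-level features are often insufficient for deceptive reviews, which is why the authors integrate text mining with a semantic language model rather than relying on a bag-of-words classifier like this one.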
Abstract:
This volume represents the proceedings of the 13th ENTER conference, held at Lausanne, Switzerland, during 2006. The conference brought together academics and practitioners across four tracks: eSolutions, refereed research papers, work-in-progress papers, and a Ph.D. workshop. The proceedings contain 40 refereed papers, fewer than the 51 papers presented in 2005. However, the editors advise that the scientific committee was stricter than in previous years, to the extent that the acceptance rate was 50%. A significant change in the current proceedings is the inclusion of extended abstracts of the 23 work-in-progress presentations. The papers cover a diverse range of topics across 16 research streams. This reviewer has adopted the approach of succinctly summarising the contribution of each of the 40 refereed papers, in the order in which they appear...
Abstract:
This volume represents the proceedings of the 12th ENTER conference, held at Innsbruck in 2005. While the conference also accepted work-in-progress papers and included a Ph.D. workshop, the proceedings contain 51 research papers by 102 authors. The general theme of the conference was "eBusiness is here—what is next?" and the papers cover a diverse range of topics across nine tracks. This reviewer has adopted the approach of succinctly summarising the contribution of each of the papers, in the order in which they appear...
Abstract:
"This volume represents the proceedings of the 10th ENTER conference, held in Helsinki, Finland during January 2003. The conference theme was ‘technology on the move’, and the 476pp. proceedings offer 50 papers by 108 authors. The editors advise all papers were subject to a double blind peer review. The research has been categorised into 18 broad headings, which reflects the diversity of topics addressed. This reviewer has adopted the approach of succinctly summarising each of the papers, in the order they appear, to assist readers of Tourism Management in judging the potential value of the content for their own work..." -- publisher website
Abstract:
This summary is based on an international review of leading peer reviewed journals, in both technical and management fields. It draws on highly cited articles published between 2000 and 2009 to investigate the research question, "What are the diffusion determinants for passive building technologies in Australia?". Using a conceptual framework drawn from the innovation systems literature, this paper synthesises and interprets the literature to map the current state of passive building technologies in Australia and to analyse the drivers for, and obstacles to, their optimal diffusion. The paper concludes that the government has a key role to play through its influence over the specification of building codes.
Abstract:
An online survey of recent ICT graduates in the workplace was carried out as part of a recent project funded by the Australian Learning and Teaching Council. The survey was concerned with the ICT curriculum in relation to workplace job requirements and university preparation for those requirements. The survey contained quantitative and qualitative components, and findings from the former have been published (Koppi et al., 2009). This paper reports on a quantitative comparison of responses from graduates who had workplace experience and those who did not, and a qualitative analysis of text responses from all ICT graduates to open-ended questions concerning the curriculum and their perceived university preparation for the workplace. The overwhelming call from ICT graduates in the workplace was for more industry-related learning, encompassing industry involvement, workplace learning and business experience, up-to-date teaching and technologies, practical applications, and real-world activities. A closer relationship between academia and industry was strongly advocated by ICT graduates in the workplace.
Abstract:
This paper summarises recent studies on learning approaches that have utilised some form of Web 2.0 service in curriculum design to enhance learning, and then presents a generic implementation model to illustrate the overall learning implementation process. Recently, the integration of Web 2.0 technologies into the learning curriculum has begun to gain wide acceptance among teaching instructors across higher learning institutions. This is evidenced by numerous studies reporting the implementation of a range of Web 2.0 technologies in learning designs to improve learning delivery. Moreover, recent studies have shown that current students embrace Web 2.0 technologies more readily than earlier learning technologies. Despite the various attempts teachers have made at such integration, researchers have noted the lack of an integration standard to guide curriculum design. The absence of such a standard restricts the adaptation of Web 2.0 to learning and adds complexity to the provision of meaningful learning. This paper therefore draws a conceptual integration model that reflects how learning activities facilitated by Web 2.0 are currently being implemented. The design of this model is based on experiences shared by many scholars, as well as feedback gathered from two separate surveys conducted with teachers and with a group of 180 students. Furthermore, this paper identifies some key components generally involved in the design of Web 2.0 teaching and learning which need to be addressed accordingly. The paper is organised as follows. The first part introduces the importance of Web 2.0 implementation in teaching and learning from the perspective of higher education institutions, together with the challenges surrounding this area. The second part summarises related work in this field and brings forward the concept of designing learning with the incorporation of Web 2.0 technology. The next part presents the results of the analysis derived from the two surveys of students and teachers on using Web 2.0 during learning activities. The paper concludes by presenting a model that reflects several key entities that may be involved in the learning design.
Abstract:
Cities accumulate and distribute vast sets of digital information. Many decision-making and planning processes in councils, local governments and organisations are based on both real-time and historical data. Until recently, only a small, carefully selected subset of this information was released to the public, usually for specific purposes (e.g. train timetables, or the release of planning applications through websites, to name just a few). This situation is, however, changing rapidly. Regulatory frameworks, such as freedom of information legislation in the US, the UK, the European Union and many other countries, guarantee public access to data held by the state. One of the results of this legislation and of changing attitudes towards open data has been the widespread release of public information as part of recent Government 2.0 initiatives. This includes the creation of public data catalogues such as data.gov (US), data.gov.uk (UK) and data.gov.au (Australia) at federal government levels, and datasf.org (San Francisco) and data.london.gov.uk (London) at municipal levels. The release of this data has opened up the possibility of a wide range of future applications and services which are now the subject of intensified research efforts. Previous research endeavours have explored the creation of specialised tools to aid decision-making by urban citizens, councils and other stakeholders (Calabrese, Kloeckl & Ratti, 2008; Paulos, Honicky & Hooker, 2009). While these initiatives represent an important step towards open data, they too often result in mere collections of data repositories. Proprietary database formats and the lack of an open application programming interface (API) limit the full potential achievable by allowing these data sets to be cross-queried. Our research, presented in this paper, looks beyond the pure release of data. It is concerned with three essential questions. First, how can data from different sources be integrated into a consistent framework and made accessible? Second, how can ordinary citizens be supported in easily composing data from different sources in order to address their specific problems? Third, what interfaces make it easy for citizens to interact with data in an urban environment, and how can data be accessed and collected?
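The cross-querying of open data sets that this abstract argues for can be sketched as a simple join on a shared key. The two CSV extracts and their column names below are invented for illustration; real catalogue data would be fetched from sources such as data.gov.uk rather than embedded as strings:

```python
import csv
import io

# Two hypothetical open-data extracts sharing a stop_id key
stops_csv = """stop_id,name,suburb
1,Central,CBD
2,Roma St,CBD
3,South Bank,South Brisbane
"""
usage_csv = """stop_id,daily_boardings
1,12000
2,8000
3,5000
"""

# Index the first data set by its key, then join the second against it
stops = {row["stop_id"]: row for row in csv.DictReader(io.StringIO(stops_csv))}
joined = [{**stops[row["stop_id"]], **row}
          for row in csv.DictReader(io.StringIO(usage_csv))]

# A cross-query neither source answers alone: the busiest stop by name
busiest = max(joined, key=lambda r: int(r["daily_boardings"]))
print(busiest["name"])  # Central
```

Such a join is only possible when both publishers expose machine-readable formats and a common identifier, which is exactly the open-API gap the abstract identifies.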