622 results for pacs: neural computing technologies
Abstract:
The education sector has changed dramatically in the past half decade. In a time of globalisation of education and tightening budgets, various paradigm shifts and challenges have rapidly changed learning and teaching. These include meeting student expectations for more engaging, more interactive learning experiences, an increased focus on delivering content online, and the complexities of fast-changing technologies. Rising to these challenges is a complex and multi-faceted task. This paper discusses educational theories and issues and explores current educational practices in the context of teaching undergraduate students via distance education at university. A case study applies a framework drawn from engineering education using the learner-centric concept of academagogy. Results showed that academagogy actively empowers students to build effective learning and engages facilitators in meaningful teaching and delivery methods.
Abstract:
Located in the Gulf of Mexico in nearly 8,000 ft of water, the Perdido project is the deepest spar application to date in the world and Shell’s first fully integrated application of its in-house digital oilfield technology, called “Smart Field”, in the Western hemisphere. Developed by Shell on behalf of partners BP and Chevron, the spar and the subsea equipment connected to it will eventually capture about an order of magnitude more data than is collected from any other Shell-designed and -managed development operating in the Gulf of Mexico. This article describes Shell’s digital oilfield design philosophy, briefly explains the five design elements that underpin “smartness” in Shell’s North and South American operations, and sheds light on the process by which a highly customized digital oilfield development and management plan was put together for Perdido. Although Perdido is the first instance in North and South America in which these design elements and processes were applied in an integrated way, all of Shell’s future new developments in the Western hemisphere are expected to follow the same overarching design principles. Accordingly, this article uses Perdido as a real-world example to outline the high-level details of Shell’s digital oilfield design philosophy and processes.
Abstract:
This thesis is an investigation of the media's representation of children and ICT. The study draws on moral panic theory and Queensland newspaper media to identify the impact of newspaper reporting on the public's perceptions of young people and ICT.
Abstract:
The huge amount of CCTV footage available makes it very burdensome for human operators to process these videos manually. This has made automated processing of video footage through computer vision technologies necessary. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task where the system is trained on normal data and is required to detect events which do not fit the learned ‘normal’ model. There is no precise and exact definition of an abnormal activity; it depends on the context of the scene. Hence there is a requirement for different feature sets to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting the presence of abnormal objects in the scene. These include optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modelled using state-of-the-art models such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM) to analyse their performance. Further, we apply perspective normalisation to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
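As a rough illustration of the novelty-detection formulation described above, the sketch below fits a Gaussian mixture model to feature vectors from "normal" footage and flags test frames whose likelihood under that model falls below a threshold. The feature generation, dimensionality, component count and threshold are illustrative placeholders, not the authors' actual pipeline.

```python
# Minimal sketch of GMM-based novelty detection on per-frame feature vectors.
# Assumes features (e.g. optical-flow statistics) are already extracted;
# the dimensionality and threshold below are illustrative placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
normal_features = rng.normal(size=(500, 8))   # training features from "normal" frames
test_features = rng.normal(size=(50, 8))      # features from frames to be scored

# Fit the 'normal' model on training data only.
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(normal_features)

# Score test frames: low log-likelihood under the normal model => anomaly.
log_likelihood = gmm.score_samples(test_features)
threshold = np.percentile(gmm.score_samples(normal_features), 5)  # e.g. 5th percentile
anomalous = log_likelihood < threshold
print(f"{anomalous.sum()} of {len(test_features)} frames flagged as abnormal")
```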
Abstract:
This chapter presents an historical narrative on the recent evolution of information and communications technology (ICT) that has been, and is, utilized for purposes of learning. In other words, it presents an account of the development of e-learning supported through the Web and other similar virtual environments. It does not attempt to present a definitive account, as such an exercise is fraught with assumptions, contextual bias, and probable conjecture. The concern here is more with contextualizing the role of inquiry in learning and the evolving digital tools that enable interfaces that promote and support it. In tracking this evolution, both multi-disciplinary and trans-disciplinary research has been pursued. Key historical developments are identified, as are interpretations of the key drivers of e-learning over time and into what might be better described as digital learning. Innovations in the development of digital tools are described as dynamic and emergent, evolving as a consequence of multiple, sometimes hidden drivers of change. Yet conflating advancements in learning technologies with e-learning seems to be pervasive, as does the push for the “open” agenda – a growing number of initiatives and movements dominated by themes associated with access, intellectual property, public benefit, sharing and technical interoperability. Openness is also explored in this chapter, though more in terms of what it means when associated with inquiry. By investigating opportunities for the stimulation and support of questioning online – in particular, why-questioning – this chapter is focused on “opening” content – not just for access but for inquiry and deeper learning.
Abstract:
Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that will allow users to leverage the available infrastructure and create custom-tailored solutions. It explores how this notion can be utilized in the context of energy monitoring to improve conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. This system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences. Therefore, it allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not warrant sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
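A minimal sketch of what a unified interface over heterogeneous data streams and user-composed "building blocks" might look like, to illustrate the composition idea described above. All names here (DataStream, CompositeStream, latest) are hypothetical and are not the thesis's actual API.

```python
# Illustrative sketch of a unified stream interface plus a composite "building block".
from abc import ABC, abstractmethod
from typing import Callable, List


class DataStream(ABC):
    """A named source of readings (sensor, sub-meter, web service, ...)."""

    def __init__(self, name: str, unit: str):
        self.name = name
        self.unit = unit

    @abstractmethod
    def latest(self) -> float:
        """Return the most recent reading."""


class ConstantStream(DataStream):
    """Trivial stream used here only to make the example runnable."""

    def __init__(self, name: str, unit: str, value: float):
        super().__init__(name, unit)
        self._value = value

    def latest(self) -> float:
        return self._value


class CompositeStream(DataStream):
    """Combines several streams into one derived value (a user-defined building block)."""

    def __init__(self, name: str, unit: str, parts: List[DataStream],
                 combine: Callable[[List[float]], float]):
        super().__init__(name, unit)
        self.parts = parts
        self.combine = combine

    def latest(self) -> float:
        return self.combine([p.latest() for p in self.parts])


# Example: total household power composed from two sub-meter streams.
total = CompositeStream("total_power", "W",
                        [ConstantStream("kitchen", "W", 320.0),
                         ConstantStream("office", "W", 150.0)],
                        combine=sum)
print(total.latest())  # 470.0
```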
Abstract:
Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. The existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include precise receiver clocks, zenith tropospheric delays, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may also optionally be provided. In such a mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models, and the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. With station-based solutions from three reference stations within distances of 22–103 km, the user receiver positioning results, under various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to the standard float PPP solutions without station augmentation or ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
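For context, the common observation model under which PPP and RTK processing can be unified is usually written as the standard undifferenced pseudorange and carrier-phase equations; the generic textbook notation below is not necessarily the paper's exact formulation.

```latex
% Standard undifferenced GNSS observation equations (generic notation):
%   P - pseudorange, \Phi - carrier phase, \rho - geometric range,
%   dt_r, dt^s - receiver/satellite clock errors, T - tropospheric delay,
%   I - ionospheric delay, \lambda N - carrier-phase ambiguity, \varepsilon - noise.
\begin{align}
  P_r^s    &= \rho_r^s + c\,(dt_r - dt^s) + T_r^s + I_r^s + \varepsilon_P \\
  \Phi_r^s &= \rho_r^s + c\,(dt_r - dt^s) + T_r^s - I_r^s + \lambda\,N_r^s + \varepsilon_\Phi
\end{align}
```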
Abstract:
With the advancement of new technologies, this author started in 2010 to engineer an online learning environment for investigating the nature and development of spatial abilities, and the teaching and learning of geometry. This paper documents how this new digital learning environment can afford the opportunity to integrate learning about 3D shapes with direction, location and movement, and how young children can mentally and visually construct virtual 3D shapes using movements in both egocentric and fixed frames of reference (FOR). Findings suggest that Year 4 (aged 9) children can develop the capacity to construct a cube using the egocentric FOR only, the fixed FOR only, or a combination of both. However, these young participants were unable to articulate the effect of individual or combined FOR movements. Directions for future research are proposed.
Abstract:
Topic modelling has been widely used in the fields of information retrieval, text mining, machine learning, etc. In this paper, we propose a novel model, the Pattern Enhanced Topic Model (PETM), which improves topic modelling by semantically representing topics with discriminative patterns, and which also makes an innovative contribution to information filtering by using PETM to determine document relevance based on topic distributions and the maximum matched patterns proposed in this paper. Extensive experiments are conducted to evaluate the effectiveness of PETM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model significantly outperforms both state-of-the-art term-based models and pattern-based models.
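As a loose illustration of combining a topic distribution with matched patterns to score document relevance, the sketch below weights each topic by the size of its largest pattern fully contained in the document. The scoring formula and data structures are hypothetical and are not the PETM algorithm itself.

```python
# Hypothetical pattern-weighted relevance score: each topic contributes its
# weight scaled by the size of its largest pattern found in the document.
from typing import Dict, List, Set

def relevance_score(doc_terms: Set[str],
                    topic_weights: Dict[int, float],
                    topic_patterns: Dict[int, List[Set[str]]]) -> float:
    score = 0.0
    for topic, weight in topic_weights.items():
        matched = [p for p in topic_patterns.get(topic, []) if p <= doc_terms]
        if matched:
            score += weight * max(len(p) for p in matched)
    return score

# Toy example: two topics, each represented by frequent term patterns.
doc = {"neural", "network", "training", "market"}
weights = {0: 0.7, 1: 0.3}
patterns = {0: [{"neural", "network"}, {"neural", "network", "training"}],
            1: [{"stock", "market"}]}
print(relevance_score(doc, weights, patterns))  # 0.7 * 3 = 2.1
```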
Abstract:
Study Approach: The results presented in this report are part of a larger global study on the major issues in BPM. Only one part of the larger study is reported here, viz. interviews with BPM experts. Interviews with BPM tool vendors, together with focus group studies involving user organizations, were conducted in parallel and set the groundwork for the identification of BPM issues on a global scale. Through this multi-method approach, we identify four distinct sets of outcomes. First, as is the focus of this report, we identify the BPM issues as perceived by BPM experts. Second, the research design allows us to gain insight into the opinions of organizations deploying BPM solutions. Third, an understanding of organizations’ misconceptions of BPM technologies, as confronted by BPM tool vendors, is obtained. Last, we seek to gain an understanding of BPM issues on a global scale, together with knowledge of matters of concern. This final outcome is intended to produce an industry-driven research agenda that will inform practitioners and, in particular, the research community worldwide on issues and challenges that are prevalent or emerging in BPM and related areas...
Abstract:
The interest generated by business process re-engineering and information technologies reveals the emergence of the process management paradigm. Although many studies have been published on alternative process modelling tools and techniques, little attention has been paid to the post-hoc evaluation of process modelling activities or to establishing guidelines on how to conduct process modelling effectively. The present study aims to fill this gap. We present the results of a detailed case study, conducted in a leading Australian organisation, with the aim of building a process modelling success model.
Abstract:
It is widely acknowledged that effective asset management requires an interdisciplinary approach, in which synergies should exist between traditional disciplines such as accounting, engineering, finance, humanities, logistics, and information systems technologies. Asset management is also an important yet complex business practice. Business process modelling is proposed as an approach to manage the complexity of asset management through the modelling of asset management processes. A sound foundation for the systematic application and analysis of business process modelling in asset management is, however, yet to be developed. Fundamentally, a business process consists of activities (termed functions), events/states, and control flow logic. As both events/states and control flow logic are somewhat dependent on the functions themselves, it is a logical step to first identify the functions within a process. This research addresses the current gap in knowledge by developing a method to identify functions common to various industry types (termed core functions). This lays the foundation for extracting such functions, so as to identify both commonalities and variation points in asset management processes. The method uses manual text mining and a taxonomy approach. An example is presented.
Abstract:
Much of what is written about digital technologies in preschool contexts focuses on young children’s acquisition of skills rather than their meaning-making during use of technologies. In this paper, we consider how the viewing of a YouTube video was used by a teacher and children to produce shared understandings about it. Conversation analysis of talk and interaction during the viewing of the video establishes some of the ways in which individual accounts of events were produced for others and then endorsed as shared understandings. The analysis establishes how adults and children made use of verbal and embodied actions during interactions to produce shared understandings of the YouTube video, the events it recorded and the written commentary about those events.
Abstract:
Server consolidation using virtualization has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is the key to server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing virtual machine placement approaches consider only the energy consumed by physical machines and ignore the energy consumed by the communication network in a data center. The energy consumption of the communication network in a data center is not trivial and should therefore be considered in virtual machine placement. In our preliminary research, we proposed a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the physical machines and the communication network in a data center. Aiming to improve the performance and efficiency of that genetic algorithm, this paper presents a hybrid genetic algorithm for the energy-efficient virtual machine placement problem. Experimental results show that the hybrid genetic algorithm significantly outperforms the original genetic algorithm and that it is scalable.
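As a rough sketch of the kind of objective such a placement algorithm might optimise, the function below combines an active physical-machine energy term with a simple inter-machine traffic cost. The encoding (one PM index per VM) and the cost coefficients are assumptions for illustration, not the paper's actual genetic-algorithm formulation.

```python
# Illustrative fitness function for energy-aware VM placement, combining
# physical-machine energy with a network-communication cost term.
from typing import List

IDLE_POWER = 100.0      # watts drawn by a powered-on physical machine (assumed)
POWER_PER_LOAD = 150.0  # additional watts at full utilisation (assumed)
LINK_COST = 5.0         # cost per unit of traffic crossing physical machines (assumed)

def placement_energy(placement: List[int],
                     vm_load: List[float],
                     traffic: List[List[float]],
                     n_pms: int) -> float:
    """Total energy proxy: active-PM power plus inter-PM communication cost."""
    pm_load = [0.0] * n_pms
    for vm, pm in enumerate(placement):
        pm_load[pm] += vm_load[vm]

    machine_energy = sum(IDLE_POWER + POWER_PER_LOAD * load
                         for load in pm_load if load > 0)

    network_energy = sum(LINK_COST * traffic[i][j]
                         for i in range(len(placement))
                         for j in range(i + 1, len(placement))
                         if placement[i] != placement[j])
    return machine_energy + network_energy

# Toy example: 3 VMs on 2 PMs; co-locating the chattiest VMs lowers the score.
loads = [0.3, 0.4, 0.2]
traffic = [[0, 8, 1], [8, 0, 1], [1, 1, 0]]
print(placement_energy([0, 0, 1], loads, traffic, n_pms=2))  # chatty pair together
print(placement_energy([0, 1, 1], loads, traffic, n_pms=2))  # chatty pair split
```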
Abstract:
The introduction of safety technologies into complex socio-technical systems requires an integrated and holistic approach to human factors (HF) and engineering, considering the effects of failures not only within system boundaries but also at the interfaces with other systems and humans. Level crossing warning devices are examples of such systems, where technically safe states within the system boundary can influence road user performance, giving rise to other hazards that degrade the safety of the system. Chris will discuss the challenges that have been encountered to date in developing a safety argument in support of low-cost level crossing warning devices. The design and failure modes of level crossing warning devices are known to have a significant influence on road user performance; however, quantifying this effect is one of the ongoing challenges in determining appropriate reliability and availability targets for low-cost level crossing warning devices.