227 results for Case Based Computing


Relevance:

40.00%

Publisher:

Abstract:

This book develops tools and techniques that will help urban residents gain access to urban computing. Metaphorically speaking, it is taking computing to the street by giving the general public – rather than just researchers and professionals – the power to leverage available city infrastructure and create solutions tailored to their individual needs. It brings together five chapters that are based on presentations given at the Street Computing Workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction Conference (OZCHI 2009). This book focuses on applying urban informatics, urban and community sensing and open application programming interfaces (APIs) to the public space through the delivery of online services, on demand and in real time. It then offers a case study of how the city of Singapore has harnessed the potential of an online infrastructure so that residents and visitors can access services electronically. This book was published as a special issue of the Journal of Urban Technology, 19(2), 2012.

Relevance:

40.00%

Publisher:

Abstract:

This case study investigated pedagogical responses to internationalisation by a faith-based secondary school in Australia. Using social constructivism as the theoretical framework, the study examined teaching and learning for culturally and linguistically diverse students. Data generated through questionnaires, focus groups, individual interviews and document archives were analysed and interpreted using thematic analysis. The findings showed that teachers believed themselves to be ill-equipped to teach international students. Their concerns centred on a lack of explicit pedagogical, cultural and linguistic knowledge to help the students acculturate and learn. Recommendations include the dissemination of school policies to teachers, intentional staff collaboration, and professional development to address the teachers' needs for internationalisation.

Relevance:

40.00%

Publisher:

Abstract:

Bioacoustic data can provide an important basis for environmental monitoring. To explore the large volume of field recordings collected, this paper presents an automated similarity search algorithm. A user specifies a region of a recording defined by frequency and time bounds; the content of that region is used to construct a query. During retrieval, the algorithm automatically scans through recordings to search for similar regions. In detail, we present a feature extraction approach based on the visual content of vocalisations – in this case, ridges – and develop a generic regional representation of vocalisations for indexing. Our feature extraction method works best for bird vocalisations showing ridge characteristics. The regional representation allows the content of an arbitrary region of a continuous recording to be described in a compressed format.
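As a rough illustration of the query-by-region idea, the sketch below cuts a user-specified time/frequency region out of a spectrogram and summarises it as a fixed-size descriptor for similarity comparison. A coarse intensity grid stands in here for the paper's ridge-based features, the recordings are assumed to be mono WAV files, and all function names and parameters are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch: region-based similarity search over spectrograms.
# The intensity-grid descriptor is a stand-in for ridge features.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def region_features(wav_path, t_lo, t_hi, f_lo, f_hi, bins=(16, 16)):
    """Summarise a time/frequency region as a coarse intensity grid."""
    rate, samples = wavfile.read(wav_path)  # assumes mono audio
    f, t, sxx = spectrogram(samples.astype(float), fs=rate)
    # Clip the spectrogram to the user-specified query bounds.
    region = sxx[(f >= f_lo) & (f <= f_hi)][:, (t >= t_lo) & (t <= t_hi)]
    # Downsample to a fixed-size grid so regions of any extent compare.
    rows = np.array_split(region, bins[0], axis=0)
    grid = np.array([[blk.mean() for blk in np.array_split(r, bins[1], axis=1)]
                     for r in rows])
    return grid.ravel() / (np.linalg.norm(grid) + 1e-12)

def similarity(query_vec, candidate_vec):
    """Cosine similarity between two normalised region descriptors."""
    return float(np.dot(query_vec, candidate_vec))
```

Normalising each descriptor makes the cosine comparison insensitive to overall loudness differences between recordings, so a scan can rank candidate regions by `similarity` alone.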

Relevance:

40.00%

Publisher:

Abstract:

There is a growing awareness worldwide of the significance of social media to communication in times of both natural and human-created disasters and crises. While the media have long been used as a means of broadcasting messages to communities in times of crisis – bushfires, floods, earthquakes, etc. – the significance of social media in enabling many-to-many communication through ubiquitous networked computing and mobile media devices is becoming increasingly important in the fields of disaster and emergency management. This paper analyses the uses made of social media during two recent natural disasters: the January 2011 floods in Brisbane and South-East Queensland in Australia, and the February 2011 earthquake in Christchurch, New Zealand. It is part of a wider project undertaken by a research team based at the Queensland University of Technology in Brisbane, Australia, working with the Queensland Department of Community Safety (DCS) and the EIDOS Institute, and funded by the Australian Research Council (ARC) through its Linkage program. The project combines large-scale, quantitative social media tracking and analysis techniques with qualitative cultural analysis of communication efforts by citizens and officials, to enable both emergency management authorities and news media organisations to develop, implement, and evaluate new social media strategies for emergency communication.

Relevance:

40.00%

Publisher:

Abstract:

Carrying capacity assessments model a population's potential self-sufficiency. A crucial first step in developing such models is to examine the basic resource-based parameters defining the population's production and consumption habits. These parameters include basic human needs such as food, water, shelter and energy, together with climatic, environmental and behavioural characteristics. Each parameter imposes land-usage requirements in different ways and to varying degrees, so their incorporation into carrying capacity modelling also differs. Given that the availability and values of production parameters may differ between locations, no two carrying capacity models are likely to be exactly alike. However, the essential parameters themselves can remain consistent, so one example, the Carrying Capacity Dashboard, is offered as a case study to highlight one way in which these parameters are utilised. While examples exist of findings made from carrying capacity assessment modelling, guidelines for replicating such studies at other regions and scales have to date largely been overlooked. This paper addresses this shortcoming by describing a process for including and calibrating the most important resource-based parameters in a way that could be repeated elsewhere.
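To make the parameter-driven approach concrete, here is a minimal sketch of a resource-based carrying capacity estimate: per-capita land requirements for the basic needs named above are summed and divided into the available productive land. The per-capita figures and the `yield_factor` adjustment are invented placeholders, not values from the Carrying Capacity Dashboard.

```python
# Minimal sketch: resource-based carrying capacity from per-capita
# land requirements. All figures below are illustrative placeholders.
PER_CAPITA_HA = {
    "food_cropping": 0.25,    # ha of arable land per person per year
    "grazing": 0.40,          # ha of pasture per person per year
    "water_catchment": 0.05,  # ha needed to harvest household water
    "energy_biomass": 0.30,   # ha of woodlot for fuel/energy
    "shelter_infrastructure": 0.02,
}

def carrying_capacity(available_ha, yield_factor=1.0):
    """People supportable given available productive land (ha).

    yield_factor scales per-capita demand for climatic, environmental
    or behavioural characteristics, e.g. a drier region or a
    higher-consumption diet raises demand above the baseline.
    """
    demand_ha_per_person = sum(PER_CAPITA_HA.values()) * yield_factor
    return available_ha / demand_ha_per_person

print(carrying_capacity(10_000))  # ~9804 people on 10,000 ha under defaults
```

Calibrating a model for a new region then amounts to replacing these parameter values with locally measured ones, which is the replication process the paper describes.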

Relevance:

40.00%

Publisher:

Abstract:

Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current climate change predictions including rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk does not translate into greater economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ) in cooperation with the Local Government Unit of Ormoc City, Leyte, The Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems.

The project was developed using a 1D-2D coupled model in SOBEK (Deltares' hydrological modelling software package) and was also used as a case study to analyse and understand the influence of factors such as land use, schematisation, time step size and tidal variation on flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data. Different methods were used to partially calibrate and validate the model, and finally to simulate and study two climate change scenarios based on scenario A1B predictions.

It was observed that large areas currently considered not prone to floods will become low flood risk (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of the Lilo-an's Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during the project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to map a larger set of possible flood-prone areas in relation to rainfall intensity and to study possible improvements to the current early warning system and the monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancing the geo-hydro-meteorological data to improve the model's accuracy, mainly in areas of interest, will also be presented to the LGU.
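The hazard bands reported above lend themselves to a simple post-processing step over the model's water-depth output. The sketch below classifies a depth grid into those bands; the synthetic grid stands in for SOBEK output, and the ">2 m high" class is an assumed extension beyond the two bands named in the abstract.

```python
# Minimal sketch: classify simulated water depths into hazard bands
# for flood hazard mapping. The input grid here is synthetic.
import numpy as np

def hazard_class(depth_m):
    """Map a grid of water depths (m) to hazard categories."""
    classes = np.full(depth_m.shape, "none", dtype=object)
    classes[(depth_m >= 0.1) & (depth_m < 1.0)] = "low (0.1-1 m)"
    classes[(depth_m >= 1.0) & (depth_m < 2.0)] = "moderate (1-2 m)"
    classes[depth_m >= 2.0] = "high (>2 m)"  # assumed upper band
    return classes

depths = np.array([[0.05, 0.4],
                   [1.3, 2.6]])
print(hazard_class(depths))
```

Running such a classification for each simulated scenario (for example, the two A1B-based climate change scenarios) yields directly comparable hazard maps for the same grid cells.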

Relevance:

40.00%

Publisher:

Abstract:

GNSS computing modes currently fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. Existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may optionally be provided. In this mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. With station-based solutions from three reference stations within distances of 22–103 km, the user receiver positioning results, under various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to standard float PPP solutions without station augmentation and ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to existing network-based RTK or regionally augmented PPP systems.
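A rough sketch of the station-based solution exchanged in such a distributed mode might look as follows. The field list mirrors the parameters named above, but the class layout, units and the nearby-station selection logic are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: the per-station solution record a distributed GNSS
# framework could publish, plus nearby-station selection for a user.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class StationSolution:
    station_id: str
    epoch: float                            # e.g. GPS seconds of week
    receiver_clock_s: float                 # precise receiver clock offset
    zenith_trop_delay_m: float              # estimated ZTD
    code_biases_ns: Dict[str, float]        # differential code bias per signal
    ambiguities: Dict[str, float]           # per-satellite ambiguity estimates
    iono_delays_m: Dict[str, float]         # per-satellite slant ionosphere
    az_el_deg: Dict[str, Tuple[float, float]]  # line-of-sight azimuth/elevation
    covariance: Optional[list] = None       # optional parameter covariance

def corrections_for_user(solutions, baselines_km, max_baseline_km=100.0):
    """Select nearby station solutions a PPP/RTK user would draw on."""
    return [s for s in solutions
            if baselines_km.get(s.station_id, float("inf")) <= max_baseline_km]
```

The point of the record is that each station (or its hosting server) computes these parameters locally, so the data server only relays compact solutions rather than raw observation streams.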

Relevance:

40.00%

Publisher:

Abstract:

Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. However, metrics to quantify process compliance have been defined only recently, and a major criticism is that the existing measures appear unintuitive. In this paper, we trace this problem back to a more foundational question: which notion of behavioural equivalence is appropriate for discussing compliance? We present a quantification approach based on behavioural profiles, a process abstraction mechanism. Behavioural profiles can be regarded as weaker than existing equivalence notions such as trace equivalence, and they can be calculated efficiently. As a validation, we present an implementation that measures the compliance of logs against a normative process model. This implementation is being evaluated in a case study with an international service provider.
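For intuition, the sketch below derives a behavioural profile over activity pairs from a set of traces: strict order, interleaving, or exclusiveness. It follows the published notion only loosely (computing the profile from observed traces rather than from a process model), and the implementation details are assumptions rather than the authors' code.

```python
# Minimal sketch: behavioural profile relations over activity pairs,
# derived here from traces for illustration.
from itertools import combinations

def weak_order(traces):
    """Pairs (a, b) where a occurs before b in some trace."""
    order = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                order.add((a, b))
    return order

def behavioural_profile(traces):
    alphabet = {a for t in traces for a in t}
    order = weak_order(traces)
    profile = {}
    for a, b in combinations(sorted(alphabet), 2):
        ab, ba = (a, b) in order, (b, a) in order
        if ab and ba:
            profile[(a, b)] = "||"   # interleaving: both orders observed
        elif ab:
            profile[(a, b)] = "->"   # strict order
        elif ba:
            profile[(a, b)] = "<-"   # reversed strict order
        else:
            profile[(a, b)] = "+"    # exclusiveness: never co-occur
    return profile
```

Because the profile only records pairwise relations rather than full state spaces, it is a deliberately weaker abstraction than trace equivalence, which is what makes it cheap to compute and compare.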

Relevance:

40.00%

Publisher:

Abstract:

Process compliance measurement is receiving increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. To judge the compliance of business processing, the degree of behavioural deviation of a case, i.e., an observed execution sequence, is quantified with respect to a process model (referred to as fitness, or recall). Recently, different compliance measures have been proposed; still, nearly all of them are grounded in state-based techniques, the trace equivalence criterion in particular. As a consequence, these approaches have to deal with the state explosion problem. In this paper, we argue that a behavioural abstraction may be leveraged to measure the compliance of a process log – a collection of cases. To this end, we utilise causal behavioural profiles, which capture the behavioural characteristics of process models and cases and can be computed efficiently. We propose different compliance measures based on these profiles, discuss the impact of noise in process logs on our measures, and show how diagnostic information on non-compliance is derived. As a validation, we report on findings from applying our approach in a case study with an international service provider.
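Building on profile dictionaries like those produced by the sketch after the previous abstract, a log-level measure can be phrased as the share of constrained activity-pair relations each case respects, averaged over the log. The consistency rule used here (model interleaving permits any observed order) is an assumption for illustration, not the paper's exact measure.

```python
# Minimal sketch: profile-based compliance of a log (a collection of
# cases) against a model's causal behavioural profile.
def case_compliance(model_profile, case_profile):
    shared = [pair for pair in case_profile if pair in model_profile]
    if not shared:
        return 1.0  # nothing observed that the model constrains
    ok = sum(1 for pair in shared
             if case_profile[pair] == model_profile[pair]
             or model_profile[pair] == "||")  # '||' permits any order
    return ok / len(shared)

def log_compliance(model_profile, case_profiles):
    """Average case compliance over the whole log."""
    return (sum(case_compliance(model_profile, cp) for cp in case_profiles)
            / len(case_profiles))

# Toy usage: the model orders a before b; the second case reversed them.
model = {("a", "b"): "->", ("a", "c"): "||"}
cases = [{("a", "b"): "->"}, {("a", "b"): "<-"}]
print(log_compliance(model, cases))  # 0.5
```

The per-case scores also localise non-compliance: the violated pairs in `shared` name exactly which activity orderings deviate, which is the kind of diagnostic information the paper derives.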

Relevance:

40.00%

Publisher:

Abstract:

Passenger experience has become a major factor influencing the success of an airport. In this context, passenger flow simulation has been used in designing and managing airports. However, most passenger flow simulations have failed to consider group dynamics when developing passenger flow models. In this paper, an agent-based model is presented to simulate passenger behaviour during the airport check-in and evacuation processes. The simulation results show that passenger behaviour can significantly influence the performance and utilisation of services in airport terminals. The model was created using AnyLogic software, and its parameters were initialised using recent research data published in the literature.
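As a toy stand-in for the AnyLogic model, the sketch below simulates a check-in queue in which groups arrive, queue and are served together, which is enough to see how group dynamics stretch waiting times relative to solo travellers. The arrival rate, service time and group-size distribution are invented parameters, not the study's calibrated values.

```python
# Minimal sketch: check-in queue with group behaviour; groups occupy
# a desk as one unit. All parameter values are illustrative.
import random

def simulate_checkin(minutes=120, desks=4, arrival_rate=1.5,
                     service_min=2.0, group_prob=0.4, seed=1):
    """Return mean passenger wait (minutes) over one simulated period."""
    random.seed(seed)
    free_at = [0.0] * desks          # time each desk becomes free
    waits, t = [], 0.0
    while t < minutes:
        t += random.expovariate(arrival_rate)              # next arrival
        size = random.randint(2, 5) if random.random() < group_prob else 1
        desk = min(range(desks), key=free_at.__getitem__)  # shortest queue
        start = max(t, free_at[desk])
        free_at[desk] = start + service_min * size         # group served as unit
        waits.extend([start - t] * size)                   # each member waits
    return sum(waits) / max(len(waits), 1)

print(f"mean wait: {simulate_checkin():.2f} min")
```

Varying `group_prob` in such a model shows the effect the paper points to: the same passenger count produces very different desk utilisation and waiting times once groups move and are processed together.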

Relevance:

40.00%

Publisher:

Abstract:

Knowledge Management (KM) is a vital factor in successfully undertaking projects. The temporary nature of projects necessitates employing useful KM practices to reduce issues such as knowledge leakage and rework. The Project Management Office (PMO) is a unit within organisations that facilitates and oversees organisational projects. Project Management Maturity Models (PMMM) describe the development of PMOs from immature to mature levels. Existing PMMMs have focused on project management (PM) practices; the management of project knowledge at the various levels of maturity is yet to be addressed. A research project was undertaken to investigate this gap by addressing KM practices within the existing PMMMs. Given the exploratory and inductive nature of this research, qualitative case study methods were chosen to investigate the problem in the real world. In total, three cases were selected from different industries (research, mining and government organisations) to provide broad categories for the research, and the research questions were examined using the developed framework. This paper presents the findings from the investigation of the research organisation, which has the lowest level of maturity. From a KM process point of view, knowledge creation and capturing are the most important processes, while knowledge transfer and reuse received less attention. It was also revealed that the provision of "knowledge about the client" and "project management knowledge" are the most important types of knowledge required at this level of maturity. The results further showed that PMOs with a higher maturity level have better knowledge management, although some improvement is still needed, and that the importance of KM processes varies across levels of maturity. In conclusion, the outcomes of this paper could provide powerful guidance, from a KM point of view, to PMOs at the lowest level of maturity.

Relevance:

40.00%

Publisher:

Abstract:

Over the past 20 years the labour market, workforce and work organisation of most if not all industrialised countries have been significantly refashioned by the increased use of more flexible work arrangements, variously labelled as precarious employment or contingent work. There is now a substantial and growing body of international evidence that many of these arrangements are associated with a significant deterioration in occupational health and safety (OHS), using a range of measures such as injury rates, disease, hazard exposures and work-related stress. Moreover, there is an emerging body of evidence that these arrangements pose particular problems for conventional regulatory regimes. Recognition of these problems has aroused the concern of policy makers – especially in Europe, North America and Australia – and a number of responses have been adopted in terms of modifying legislation, producing new guidance material and codes of practice and revised enforcement practices. This article describes one such initiative in Australia with regard to home-based clothing workers. The regulatory strategy developed in one Australian jurisdiction (and now being 'exported' into others) seeks to counter this process via contractual tracking mechanisms to follow the work, tie in liability and shift overarching legal responsibility to the top of the supply chain. The process also entails the integration of minimum standards relating to wages, hours and working conditions; OHS and access to workers' compensation. While home-based clothing manufacture represents a very old type of 'flexible' work arrangement, it is one that regulators have found especially difficult to address. Further, the elaborate multi-tiered subcontracting and diffuse work locations found in this industry are also characteristic of newer forms of contingent work in other industries (such as some telework) and the regulatory challenges they pose (such as the tendency of elaborate supply chains to attenuate and fracture statutory responsibilities, at least in terms of the attitudes and behaviour of those involved).

Relevance:

40.00%

Publisher:

Abstract:

Knowledge Management (KM) is a vital factor in successfully undertaking projects. The temporary nature of projects necessitates employing useful KM practices for tackling issues such as knowledge leakage and rework. The Project Management Office (PMO) is a unit within organizations that facilitates and oversees organizational projects. Project Management Maturity Models (PMMM) describe the development of PMOs from immature to mature levels. Existing PMMMs have focused on project management (PM) practices; the management of project knowledge at the various levels of maturity is yet to be addressed. This research project was undertaken to investigate this gap by addressing KM practices within the existing PMMMs. Given the exploratory and inductive nature of this research, qualitative methods were chosen as the research methodology. In total, three cases were selected from different industries (research, mining and government organizations) to provide broad categories for the research, and the research questions were examined using the developed framework. This paper presents partial findings from the investigation of the research organisation with the lowest level of maturity. The results show that knowledge creation and capturing are the most important processes, while knowledge transfer and reuse are not as important as the other two. It was also revealed that the provision of "knowledge about the client" and "project management knowledge" are the most important types of knowledge required at this level of maturity. In conclusion, the outcomes of this paper shall provide powerful guidance, from a KM point of view, to PMOs at the lowest level of maturity.

Relevance:

40.00%

Publisher:

Abstract:

Biological systems are typically complex and adaptive, involving large numbers of entities, or organisms, and many-layered interactions between them. System behaviour evolves over time and typically benefits from previous experience by retaining a memory of past events. Given the dynamic nature of these phenomena, it is non-trivial to provide a comprehensive description of complex adaptive systems and, in particular, to define the importance and contribution of low-level unsupervised interactions to the overall evolution process. In this chapter, the authors focus on the application of the agent-based paradigm in the context of the immune response to HIV. Explicit implementation of lymph nodes and the associated lymph network, including the lymphatic chain structure, is a key objective and requires parallelisation of the model. Steps taken towards an optimal communication strategy are detailed.
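As a flavour of the kind of structure whose parallelisation the chapter discusses, the sketch below moves cell agents along a small lymph-node chain with local infection events; migration along the chain edges is exactly the communication that must cross process boundaries once nodes are distributed. The topology, agent counts and update rules are illustrative assumptions only.

```python
# Minimal sketch: cell agents in a lymph-node chain with local
# infection and migration along lymphatic links. Purely illustrative.
import random

def step(nodes, edges, infect_prob=0.05, migrate_prob=0.1, rng=random):
    """One tick: local infection events, then migration along edges."""
    for cells in nodes.values():
        if any(c == "infected" for c in cells):      # local interaction only
            for i, c in enumerate(cells):
                if c == "healthy" and rng.random() < infect_prob:
                    cells[i] = "infected"
    for src, dst in edges:                           # lymphatic chain links
        movers = [c for c in nodes[src] if rng.random() < migrate_prob]
        for c in movers:
            nodes[src].remove(c)
            nodes[dst].append(c)

nodes = {name: ["healthy"] * 50 for name in ("LN1", "LN2", "LN3")}
nodes["LN1"][0] = "infected"                         # seed one infection
edges = [("LN1", "LN2"), ("LN2", "LN3")]
for _ in range(10):
    step(nodes, edges)
print({n: sum(c == "infected" for c in cells) for n, cells in nodes.items()})
```

In a parallel implementation, each node (or group of nodes) would run on its own process, and only the migrating agents on chain edges need to be exchanged, which is why the communication strategy dominates the scaling behaviour.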