721 results for cloud computing, software as a service, SaaS, enterprise systems, IS success
Abstract:
Service-oriented architectures and Web services have matured and become more widely accepted and used by industry. This growing adoption has increased the demand for new ways of using Web service technology. Users have started re-combining and mediating other providers' services in ways that were not anticipated by their original providers. Within organisations and cross-organisational communities, discoverable services are organised in repositories that provide convenient access to adaptable end-to-end business processes. This idea is captured in the term Service Ecosystem. This paper addresses the question of how quality management can be performed in such service ecosystems. Service quality management is a key challenge when services are composed of a dynamic set of heterogeneous sub-services from different service providers. This paper contributes to this important area by developing a reference model of quality management in service ecosystems. We illustrate the application of the reference model in an exploratory case study. With this case study, we show how the reference model helps to derive requirements for the implementation and support of quality management in an exemplary service ecosystem in public administration.
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
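The splitting step can be illustrated with a small sketch. The following Python fragment is a simplified illustration, not the paper's implementation, of how an event log might be partitioned into a parent log and a subprocess log using a data attribute that links child events to their parent case; the attribute names case_id, parent_id and activity are assumptions made for the example.

    from collections import defaultdict

    def split_log(events, parent_key="case_id", child_key="parent_id"):
        """Partition a flat event list into a parent log and a subprocess log."""
        parent_log = defaultdict(list)      # parent case id -> list of activities
        subprocess_log = defaultdict(list)  # subprocess instance id -> list of activities
        for event in events:
            if event.get(child_key):        # event points to a parent case: subprocess event
                subprocess_log[event[parent_key]].append(event["activity"])
            else:                           # top-level event of the parent process
                parent_log[event[parent_key]].append(event["activity"])
        return dict(parent_log), dict(subprocess_log)

    events = [
        {"case_id": "c1", "parent_id": None, "activity": "Receive order"},
        {"case_id": "s1", "parent_id": "c1", "activity": "Pick item"},
        {"case_id": "s1", "parent_id": "c1", "activity": "Pack item"},
        {"case_id": "c1", "parent_id": None, "activity": "Ship order"},
    ]
    parent, sub = split_log(events)
    print(parent)   # {'c1': ['Receive order', 'Ship order']}
    print(sub)      # {'s1': ['Pick item', 'Pack item']}
    # Each log can then be mined separately with an existing flat-discovery technique
    # before the subprocess model is re-attached as a BPMN subprocess node.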
Abstract:
Enterprise Systems (ES) purport to bring innovation to organizations. Yet no past studies, from either the innovation or the ES discipline, have merged their knowledge to understand how ES could facilitate lifecycle-wide innovation. This study therefore forms a conceptual bridge between the two disciplines. In this research, we seek to understand how ES could facilitate innovation across its lifecycle phases. We associate classifications of innovation, such as radical vs. incremental and administrative vs. technical innovation, with the three phases of the ES lifecycle. We introduce Continuous Restrained Innovation (CRI) as a new type of innovation specific to ES, considering the restraints of technology, business processes and organization. Our empirical data collection at the implementation phase, drawing on data from both the client and the implementation partner, shows preliminary evidence of CRI. In addition, we find that both parties consider the implementation of ES a radical innovation yet are less interested in seeking further innovations through the system.
Abstract:
Enterprise Resource Planning (ERP) software is the dominant strategic platform for supporting enterprise-wide business processes. However, it has been criticised for being inflexible and for not meeting specific organisation and industry requirements. An alternative, Best of Breed (BoB), integrates components of standard packages and/or custom software. The objective is to develop enterprise systems that are more closely aligned with the business processes of an organisation. A case study of a BoB implementation facilitates a comparative analysis of the issues associated with this strategy and with the single-vendor ERP alternative. The paper illustrates the differences in implementation complexity, levels of functionality, business process alignment potential and associated maintenance.
Abstract:
Monitoring the environment with acoustic sensors is an effective method for understanding changes in ecosystems. Through extensive monitoring, large-scale, ecologically relevant datasets can be produced that can inform environmental policy. The collection of acoustic sensor data is a solved problem; the current challenge is the management and analysis of raw audio data to produce useful datasets for ecologists. This paper presents the applied research we use to analyze big acoustic datasets. Its core contribution is the presentation of practical, large-scale acoustic data analysis methodologies. We describe details of the data workflows we use to give both citizen scientists and researchers practical access to large volumes of ecoacoustic data. Finally, we propose a work-in-progress large-scale analysis architecture driven by a hybrid cloud-and-local, production-grade website.
Abstract:
This paper begins with a brief review of recent literature about relationships between offending behaviour and mental illness, classifying studies by the settings within which they occurred. The establishment and role of a mental health court liaison (MHCL) service is then described, together with findings from a 3-year service audit, including an examination of relationships between clients’ characteristics and offence profiles, and comparisons with regional offence data. During the audit period, 971 clients (767 males, 204 females) were referred to the service, comprising 1139 service episodes, 35.5% of which involved a comorbid substance use diagnosis. The pattern of offences for MHCL clients was reasonably similar to the regional offence data, except that among MHCL clients there were proportionately more offences against justice procedures (e.g., breaches of apprehended violence orders [AVOs]) and fewer driving offences and “other offences”. Additionally, male MHCL clients had proportionately more malicious damage and robbery offences and lower rates of offensive behaviour and drug offences. A range of service and research issues is also discussed. Overall, the new service appears to have forged more effective links between the mental health and criminal justice systems.
Abstract:
Climate change and solar ultraviolet radiation may affect vaccine-preventable infectious diseases (VPID), the human immune response process and the immunization service delivery system. We systematically reviewed the scientific literature and identified 37 relevant publications. Our study shows that climate variability and ultraviolet radiation may affect VPID and the immunization delivery system by modulating vector reproduction and vaccination effectiveness, possibly influencing human immune responses to vaccination, and disturbing immunization service delivery. Further research is needed to determine these effects on climate-sensitive VPID and on the human immune response to common vaccines. Such research will facilitate the development and delivery of optimal vaccination programs for target populations, to meet the goal of disease control and elimination.
Abstract:
One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a compute-intensive task. Recent advances in high-performance computing provide part of the solution: multicore systems are now more affordable and more accessible. In this paper, we investigate how this can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
Abstract:
The Field and Service Robotics (FSR) conference is a single-track conference with a specific focus on field and service applications of robotics technology. The goal of FSR is to report and encourage the development of field and service robotics. These are non-factory robots, typically mobile, that must operate in complex and dynamic environments. Typical field robotics applications include mining, agriculture, building and construction, forestry, cargo handling and so on. Field robots may operate on the ground (of Earth or planets), under the ground, underwater, in the air or in space. Service robots are those that work closely with humans, particularly the elderly and sick, to help them with their lives. The first FSR conference was held in Canberra, Australia, in 1997. Since then the meeting has been held every 2 years in Asia, America, Europe and Australia. It has been held in Canberra, Australia (1997), Pittsburgh, USA (1999), Helsinki, Finland (2001), Mount Fuji, Japan (2003), Port Douglas, Australia (2005), Chamonix, France (2007), Cambridge, USA (2009), Sendai, Japan (2012) and most recently in Brisbane, Australia (2013). This year we received 54 submissions, of which 36 were selected for oral presentation. The organisers would like to thank the international committee for their invaluable contribution to the review process, ensuring the overall quality of contributions. The organising committee would also like to thank Ben Upcroft, Felipe Gonzalez and Aaron McFadyen for helping with the organisation and proceedings. The conference was sponsored by the Australian Robotics and Automation Association (ARAA), CSIRO, Queensland University of Technology (QUT), Defence Science and Technology Organisation Australia (DSTO) and the Rio Tinto Centre for Mine Automation, University of Sydney.
Abstract:
The requirement for distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Although Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, significantly affecting overall computing performance. To address these problems, this paper presents a scalable and efficient data and task distribution strategy for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments with bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines has been achieved using the approach presented in this paper.
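As a rough illustration of the task-distribution idea (not the strategy proposed in the paper, which also addresses storage use and data locality), the Python sketch below generates all pairwise comparison tasks and assigns them to heterogeneous workers in proportion to an assumed relative computing capacity; the worker names and capacities are illustrative assumptions.

    from itertools import combinations

    def distribute_pairs(items, workers):
        """workers: dict mapping worker name -> relative computing capacity."""
        pairs = list(combinations(items, 2))          # every unordered comparison task
        total = sum(workers.values())
        assignment, start = {}, 0
        for i, (name, capacity) in enumerate(workers.items()):
            # the last worker takes the remainder so no pair is dropped to rounding
            if i == len(workers) - 1:
                count = len(pairs) - start
            else:
                count = round(len(pairs) * capacity / total)
            assignment[name] = pairs[start:start + count]
            start += count
        return assignment

    tasks = distribute_pairs([f"seq{i}" for i in range(10)],
                             {"fast-node": 3.0, "slow-node": 1.0})
    print({w: len(t) for w, t in tasks.items()})      # fast-node receives roughly 3x the pairs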
Abstract:
Variability is observed at all levels of cardiac electrophysiology. Yet, the underlying causes and importance of this variability are generally unknown, and difficult to investigate with current experimental techniques. The aim of the present study was to generate populations of computational ventricular action potential models that reproduce experimentally observed intercellular variability of repolarisation (represented by action potential duration) and to identify its potential causes. A systematic exploration of the effects of simultaneously varying the magnitude of six transmembrane current conductances (transient outward, rapid and slow delayed rectifier K(+), inward rectifying K(+), L-type Ca(2+), and Na(+)/K(+) pump currents) in two rabbit-specific ventricular action potential models (Shannon et al. and Mahajan et al.) at multiple cycle lengths (400, 600, 1,000 ms) was performed. This was accomplished with distributed computing software specialised for multi-dimensional parameter sweeps and grid execution. An initial population of 15,625 parameter sets was generated for both models at each cycle length. Action potential durations of these populations were compared to experimentally derived ranges for rabbit ventricular myocytes. 1,352 parameter sets for the Shannon model and 779 parameter sets for the Mahajan model yielded action potential duration within the experimental range, demonstrating that a wide array of ionic conductance values can be used to simulate a physiological rabbit ventricular action potential. Furthermore, by using clutter-based dimension reordering, a technique that allows visualisation of multi-dimensional spaces in two dimensions, the interaction of current conductances and their relative importance to the ventricular action potential at different cycle lengths were revealed. Overall, this work represents an important step towards a better understanding of the role that variability in current conductances may play in experimentally observed intercellular variability of rabbit ventricular action potential repolarisation.
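For illustration, 15,625 parameter sets corresponds to five scaling levels for each of the six conductances (5^6 = 15,625). The Python sketch below shows one way such a sweep could be enumerated and screened against an experimental action potential duration range; the scaling levels, conductance labels and the simulate_apd() stub are assumptions for the example, not the study's actual software.

    from itertools import product

    CONDUCTANCES = ["g_to", "g_Kr", "g_Ks", "g_K1", "g_CaL", "g_NaK"]
    LEVELS = [0.5, 0.75, 1.0, 1.25, 1.5]          # assumed multipliers of the baseline values

    def parameter_sets():
        """Yield every combination of conductance scaling factors (5**6 = 15,625 sets)."""
        for scales in product(LEVELS, repeat=len(CONDUCTANCES)):
            yield dict(zip(CONDUCTANCES, scales))

    def simulate_apd(params, cycle_length_ms):
        """Placeholder for running a ventricular AP model and returning its APD in ms."""
        raise NotImplementedError("plug in the Shannon or Mahajan model here")

    def accept(params, cycle_length_ms, apd_range):
        """Keep only parameter sets whose simulated APD falls in the experimental range."""
        apd = simulate_apd(params, cycle_length_ms)
        return apd_range[0] <= apd <= apd_range[1]

    print(sum(1 for _ in parameter_sets()))        # 15625 candidate parameter sets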
Abstract:
In this paper, we present implementation results for various algorithms that sort data encrypted with a Fully Homomorphic Encryption scheme over the integers. We analyze the complexities of sorting algorithms over encrypted data by considering Bubble Sort, Insertion Sort, Bitonic Sort and Odd-Even Merge Sort. Our complexity analysis, together with the implementation results, shows that Odd-Even Merge Sort performs better than the other sorting techniques. We observe that sorting in the homomorphic domain always incurs its worst-case complexity, independent of the nature of the input. In addition, we show that combining different sorting algorithms to sort encrypted data gives no performance gain compared with applying the sorting algorithms individually.
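The observation that sorting over encrypted data always incurs worst-case cost follows from the fact that ciphertext comparisons cannot be branched on, so every compare-and-swap in the sorting network must be executed regardless of the input order. The plaintext Python sketch below illustrates this with a data-oblivious compare-and-swap and Batcher's odd-even merge sorting network; it is an illustration of the principle, not an implementation of the integer-based FHE scheme used in the paper.

    def oblivious_compare_swap(a, b):
        # 'bit' would be an encrypted 0/1 value in the homomorphic setting
        bit = 1 if a > b else 0
        low = bit * b + (1 - bit) * a       # multiplexer selecting min(a, b)
        high = bit * a + (1 - bit) * b      # multiplexer selecting max(a, b)
        return low, high

    def odd_even_merge_sort(values):
        """Batcher's odd-even merge sorting network (input length must be a power of 2)."""
        data = list(values)
        n = len(data)
        p = 1
        while p < n:
            k = p
            while k >= 1:
                for j in range(k % p, n - k, 2 * k):
                    for i in range(min(k, n - j - k)):
                        if (i + j) // (2 * p) == (i + j + k) // (2 * p):
                            data[i + j], data[i + j + k] = oblivious_compare_swap(
                                data[i + j], data[i + j + k])
                k //= 2
            p *= 2
        return data

    print(odd_even_merge_sort([7, 3, 8, 1, 5, 2, 6, 4]))   # [1, 2, 3, 4, 5, 6, 7, 8]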
Abstract:
The values that gave rise to the ethos of public service broadcasting (PSB) almost a century ago, and which have provided the rationale for PSBs around the world across that time, are under question. This article argues that the process of reinventing PSBs is enhanced by repositioning the innovation rationale for public service media (PSM). It is organized around a differentiation that is part of the standard repertoire of innovation studies – that between product, process and organizational innovation – as these are being practised by the two Australian PSBs, the Australian Broadcasting Corporation (ABC) and the Special Broadcasting Service (SBS). The article then considers the general problematics of innovation for PSBs through an analysis of the operation of the public value test in the context of European PSM, and its non-application, to this stage, in Australia. The innovation rationale is argued to be a distinctive via media between complementary and comprehensive roles for PSM, which in turn suggests an international, policy-relevant research agenda focusing on circumstances in which the public broadcaster is not market dominant.
Abstract:
This research investigates the Bhutan Civil Service's human resource management (HRM) strategies, policies and practices, and their contribution to achieving the national goal of Gross National Happiness (GNH). The study finds that the HRM of the Bhutanese civil service is meeting its strategic objective of contributing to GNH. The civil service in Bhutan plays an important role in socio-economic development, influences private sector practices, strengthens good governance and provides continuity to the government. Participants in the study were government ministers and senior, highly experienced civil servants. A model of civil service HRM in Bhutan is developed.
Abstract:
Tridiagonal diagonally dominant linear systems arise in many scientific and engineering applications. The standard Thomas algorithm for solving such systems is inherently serial, forming a bottleneck in computation. Algorithms such as cyclic reduction and SPIKE reduce a single large tridiagonal system into multiple small independent systems that can be solved in parallel. We have developed portable OpenCL implementations of the cyclic reduction and SPIKE algorithms, with the intent to target a range of co-processors in a heterogeneous computing environment, including Field Programmable Gate Arrays (FPGAs), Graphics Processing Units (GPUs) and other multi-core processors. In this paper, we evaluate these designs in terms of solver performance, resource efficiency and numerical accuracy.
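For context, the serial bottleneck of the Thomas algorithm lies in its loop-carried forward-elimination sweep, which cyclic reduction and SPIKE avoid by splitting the system into independent pieces. The Python sketch below shows the serial algorithm for reference; it is illustrative only and is not the paper's OpenCL implementation.

    def thomas_solve(a, b, c, d):
        """Solve a tridiagonal system with sub-, main- and super-diagonals a, b, c
        and right-hand side d (a[0] and c[-1] are unused)."""
        n = len(d)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0] = c[0] / b[0]
        dp[0] = d[0] / b[0]
        for i in range(1, n):                      # forward elimination: each step needs the previous one
            denom = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / denom if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):             # back substitution, also serial
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Diagonally dominant example: main diagonal 4, off-diagonals 1, right-hand side of ones
    n = 5
    print(thomas_solve([0.0] + [1.0] * (n - 1), [4.0] * n,
                       [1.0] * (n - 1) + [0.0], [1.0] * n))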