680 results for cloud service pricing
Abstract:
The beliefs, attitudes and understandings of pre-service teachers towards bullying and, more recently, cyberbullying remain unclear. Previous studies have found them to be generally lacking in the confidence needed to address bullying, which could impact negatively on school climate if, when they enter the profession, these beliefs undermine existing anti-bullying initiatives. This study explores Australian pre-service teachers' (N = 717) understanding and knowledge of traditional bullying and cyberbullying, and their confidence and capacity to deal with it. Findings from anonymous self-report questionnaires completed by students attending three universities in Australia indicated that two thirds (66%) of current pre-service teachers felt informed to very informed, and 62% felt capable to very capable, of dealing with school bullying, while 90% could discern cyber and traditional bullying behaviours from other online and offline aggressive acts. Gender and year-level differences were found. The potential impact of their knowledge and understanding of bullying and cyberbullying on school climate, and on sustaining and maintaining anti-bullying interventions as they enter the profession, is discussed.
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, those algorithms have not been widely used in today's cloud data centers because they do not consider the cost of migrating from the current VM placement to the new, optimal one. As a result, the gain from optimizing the placement may be less than the migration cost incurred in reaching it. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption and the total inter-VM traffic flow of the new placement. The GA has been implemented and evaluated experimentally, and the results show that it outperforms two well-known algorithms for the VM placement problem.
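The abstract does not spell out the paper's fitness function, but the idea of a penalty-based GA objective combining the three costs above can be sketched. The following Python fragment is a minimal illustration under assumed linear cost models; all names, weights and the overload penalty are hypothetical, not the authors' implementation.

```python
# Hypothetical penalty-based fitness for a GA over VM placements (lower is
# better). placement[v] and current[v] give the host index of VM v in the
# candidate and in the running system; traffic maps VM pairs to flow volumes.
def fitness(placement, current, vm_load, host_cap, traffic,
            w_energy=1.0, w_traffic=1.0, w_migration=2.0, w_penalty=1000.0):
    hosts_on = set(placement)
    energy = len(hosts_on)  # assume one energy unit per powered-on host

    # inter-VM traffic that must cross between different hosts
    traffic_cost = sum(flow for (i, j), flow in traffic.items()
                       if placement[i] != placement[j])

    # migration cost: number of VMs that would move off their current host
    migrations = sum(1 for v in range(len(placement))
                     if placement[v] != current[v])

    # overloaded hosts are penalised rather than rejected outright, which
    # keeps infeasible candidates usable as stepping stones for the GA
    overload = sum(
        max(0.0, sum(vm_load[v] for v in range(len(placement))
                     if placement[v] == h) - host_cap[h])
        for h in hosts_on)

    return (w_energy * energy + w_traffic * traffic_cost
            + w_migration * migrations + w_penalty * overload)

# Example: 4 VMs on 2 hosts; the candidate moves VM 3, so it pays one
# weighted migration penalty on top of its energy and traffic costs.
score = fitness([0, 0, 1, 1], [0, 0, 1, 0],
                vm_load=[2, 3, 1, 4], host_cap=[8, 8],
                traffic={(0, 1): 5.0, (2, 3): 2.0})
```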
Abstract:
Purpose – While many studies have predominantly looked at the benefits and risks of cloud computing, little is known about whether, and to what extent, institutional forces play a role in cloud computing adoption. The purpose of this paper is to explore the role of institutional factors in the top management team's (TMT's) decision to adopt cloud computing services. Design/methodology/approach – A model is developed and tested with data from an Australian survey using the partial least squares modeling technique. Findings – The results suggest that mimetic and coercive pressures influence the TMT's beliefs in the benefits of cloud computing. The results also show that the TMT's beliefs drive TMT participation, which in turn affects the intention to increase the adoption of cloud computing solutions. Research limitations/implications – Future studies could incorporate the influences of local actors who might also press for innovation. Practical implications – Given the influence of institutional forces and the plethora of cloud-based solutions on the market, it is recommended that TMTs exercise a high degree of caution when deciding which types of applications to outsource, as organizational requirements in terms of performance and security will differ. Originality/value – The paper contributes to the growing empirical literature on cloud computing adoption and offers the institutional framework as an alternative lens with which to interpret cloud-based information technology outsourcing.
Abstract:
This study proposes that the adoption of complex enterprise-wide systems (e.g. cloud ERP) should be observed as a multi-stage process. Two theoretical lenses were utilised: critical adoption factors were identified through the theory of planned behaviour, and the progression of each adoption factor was observed through Ettlie's (1980) multi-stage adoption model. Using a survey method, the study employed data gathered from 162 decision-makers in small and medium-sized enterprises (SMEs). Applying both linear and non-linear approaches to the data analysis, the findings show that the level of importance of adoption factors changes across adoption stages.
Abstract:
The space and positioning of Indigenous knowledges (IK) within Australian curricula and pedagogy are often contentious, informed by the broader Australian socio-cultural, political and economic landscape. Against changing educational policy, historically based on the myth of terra nullius, we discuss the shifting priorities for embedding Indigenous knowledges in educational practice in university and school curricula and pedagogy. In this chapter, we argue that personal and professional commitment to social justice is an important starting point for embedding Indigenous knowledges in Australian school curricula and pedagogy. Developing teacher knowledge around embedding IK is required to enable teachers' preparedness to navigate a contested historical/colonising space in curriculum decision-making, teaching and learning. We draw on empirical data from a recent research project on supporting pre-service teachers as future curriculum leaders; the project was funded by the Office of Learning and Teaching (OLT) and aimed to support future curriculum leaders to develop their knowledge of embedding IK at one Australian university. We propose supporting the embedding of IK in situ with pre-service teachers and their supervising teachers on practicum in real, sustained and affirming ways that shift the recognition of IK from a personal commitment to social justice in education to one that values Indigenous knowledges as content to educate (Connell, 1993). We argue that sustained engagement with and appreciation of IK has the potential to decolonise Australian curricula, shift policy directions and enhance race relations between Indigenous and non-Indigenous Australians.
Abstract:
Designers have been aware for the past decade of the importance of creating strong emotional experiences intertwined with new tangible products; more recently, however, firms have shown increased interest in developing new service and business models as complementary forms of emotion-driven innovation. This interdisciplinary study draws from the psychological sciences (theory of emotion) and the management sciences (business model literature) to introduce this new innovation agenda. The term visceral hedonic rhetoric (VHR) is defined as the properties of a product (and, in this paper, its service and business model extensions) that persuasively induce the pursuit of pleasure at an instinctual level of cognition. This paper lays the foundation for VHR beyond a product setting, presenting the results of an empirical study in which organizations explored the possibilities for VHR in the context of their business. The results found that firms currently believe VHR is perceived in the products and/or services they provide. The implications suggest shifting the perspective surrounding the use of VHR across a firm's business model design in order to influence the outcomes of its product and/or service design, resulting in an overall stronger emotional connection with the customer.
Abstract:
The climate in the Arctic is changing faster than anywhere else on earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from in situ measurements in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for the development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean, and the associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets, suggests the possibility of primary marine organically derived cloud condensation nuclei (CCN) in Arctic stratocumulus clouds. Direct observations of aerosol surface fluxes could not, however, explain the observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains an open question. A lack of CCN was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from the late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for the validation of satellite retrievals, operational models, and reanalysis data sets.
Abstract:
An intrinsic challenge in evaluating proposed techniques for detecting Distributed Denial-of-Service (DDoS) attacks and distinguishing them from Flash Events (FEs) is the extreme scarcity of publicly available real-world traffic traces. Those that are available are either heavily anonymised or too old to accurately reflect current trends in DDoS attacks and FEs. This paper proposes a traffic generation and testbed framework for synthetically generating different types of realistic DDoS attacks, FEs and other benign traffic traces, and for monitoring their effects on the target. Using only modest hardware resources, the proposed framework, built around a customised software traffic generator called 'Botloader', is capable of generating a configurable mix of two-way traffic for emulating either large-scale DDoS attacks, FEs or benign traffic traces that are experimentally reproducible. Botloader uses IP aliasing, a well-known technique available on most computing platforms, to create thousands of interactive UDP/TCP endpoints on a single computer, each bound to a unique IP address, thereby emulating large numbers of simultaneous attackers or benign clients.
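Botloader's internals are not given here, but the IP-aliasing technique it relies on is easy to illustrate. The Python sketch below binds one UDP socket per aliased address so that each endpoint appears as a distinct client; the address block, count and target are illustrative assumptions, and the aliases must already be configured on the host (e.g. with `ip addr add 10.0.0.1/24 dev eth0` on Linux).

```python
# Minimal sketch of IP aliasing for traffic emulation: one UDP socket per
# aliased source address, so a single machine can emulate many clients.
# Assumes addresses 10.0.0.1 .. 10.0.0.<count> are already aliased locally;
# bind() raises OSError for any address the host does not own.
import socket

def make_endpoints(base="10.0.0.", count=100):
    endpoints = []
    for i in range(1, count + 1):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind((f"{base}{i}", 0))  # port 0: let the OS pick a free port
        endpoints.append(s)
    return endpoints

# Each socket now sends from its own source IP, emulating a distinct
# attacker or benign client (192.0.2.1:9999 is a placeholder target):
# for s in make_endpoints(count=10):
#     s.sendto(b"probe", ("192.0.2.1", 9999))
```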
Abstract:
Objective: Chest pain is one of the most common complaints among patients presenting to an emergency department. Delays in management, due to a lack of readily available objective tests to risk-stratify patients with possible acute coronary syndromes, can lead to unnecessarily lengthy admissions, placing pressure on hospital beds, or to inappropriate discharge. The need was identified for a co-ordinated system of clinical management based on enhanced communication between departments, and on timely and appropriate triage, clinical investigation, diagnosis, and treatment. Methods: An evidence-based Chest Pain Management Service and clinical pathway were developed and implemented, including the introduction of after-hours exercise stress testing. Results: Between November 2005 and March 2013, 5662 patients were managed according to the Chest Pain Management pathway, avoiding 5181 admission nights through the more timely identification of low-risk patients who could then be discharged. In addition, 1360 days were avoided for high-risk patients who received earlier diagnosis and treatment. Conclusions: The creation of the Chest Pain Management pathway and the extended exercise stress testing service resulted in earlier discharge for low-risk patients and timely treatment for patients with positive and equivocal exercise stress test results. The service demonstrated a significant saving in overnight admissions.
Abstract:
Background: This paper examines changing patterns in the utilisation of, and geographic access to, health services in Great Britain using National Travel Survey data (1985-2012). The National Travel Survey (NTS) is a series of household surveys designed to provide data on personal travel and to monitor changes in travel behaviour over time. The utilisation rate was derived from the proportion of journeys made to access health services. Geographic access was analysed by separating the concept into its accessibility and mobility dimensions. Methods: Variables from the PSU, households, and individuals datasets were used as explanatory variables, whereas variables extracted from the journeys dataset were used as dependent variables to identify patterns of utilisation (i.e. the proportion of journeys made by different groups to access health facilities within a particular journey distance or time band, or by mode of transport) and geographic access to health services. A binary logistic regression analysis was conducted to identify the utilisation rate over the different time periods between different groups. This analysis shows the odds ratios (ORs) for different groups making a trip to utilise health services compared with their respective counterparts. Linear multiple regression analyses were then conducted to identify patterns of change in accessibility and mobility levels. Results: Analysis of the data showed that journey distances to health facilities were significantly shorter, and gradually reduced over the period in question, for Londoners, females, those without a car or on low incomes, and older people, although their rates of utilisation of health services were significantly lower because of longer journey times. These findings indicate that the rate of utilisation of health services largely depends on mobility level, although previous research studies have traditionally overlooked the mobility dimension. Conclusions: This finding therefore suggests the need to improve geographic access to services, together with enhanced mobility options for disadvantaged groups, in order for them to have improved levels of access to health facilities. This research also found that the volume of car trips to health services increased steadily over the period 1985-2012 while all other modes accounted for a smaller number of trips. However, it is difficult to conclude from this research whether this increase in the volume of car trips was due to a lack of alternative transport or due to an increase in the level of car-ownership.
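As a hedged illustration of the analysis described (not the authors' code), a binary logistic regression on an NTS journeys extract might look as follows, with odds ratios obtained by exponentiating the fitted coefficients. The file and column names are hypothetical.

```python
# Hypothetical sketch: did a journey access a health service (1) or not (0),
# modelled against traveller characteristics; exponentiated coefficients are
# the odds ratios (ORs) reported in analyses like the one above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

journeys = pd.read_csv("nts_journeys.csv")  # hypothetical NTS extract

model = smf.logit(
    "health_trip ~ C(sex) + C(car_ownership) + C(income_band) + C(age_group)",
    data=journeys,
).fit()

odds_ratios = np.exp(model.params)  # OR > 1: higher odds of a health trip
print(odds_ratios.round(2))
```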
Abstract:
Executive Summary: Queensland University of Technology (QUT) was contracted to conduct an evaluation of an integrated chronic disease nurse practitioner service conducted at Meadowbrook Primary Care Practice. This evaluation is a collaborative project with nurse practitioners (NPs) from Logan Hospital. The integrated chronic disease nurse practitioner service is an outpatient clinic for patients with two or more chronic diseases, including chronic kidney disease (CKD), heart failure (HF) and diabetes (type I or II). This document reports on the first 12 months of the service (4 June 2014 to 25 May 2015). During this period:
• 55 patients attended the NP clinic, with 278 occasions of service provided
• Almost all (95.7%) patients attended their scheduled appointments (only 4.3% did not attend an appointment)
• Since attending the NP clinic, the majority of patients (77.6%) had no emergency department visits related to their chronic disease; only 3 required hospital admission
• 3 patients under the service were managed with Hospital in the Home, avoiding more than 25 hospital bed days
• 41 patients consented to join a prospective cohort study of patient-reported outcomes and patient satisfaction
• 14 patient interviews and 3 stakeholder focus groups were also conducted to gather feedback on perceptions of the NP-led service innovation
The report concludes with seven recommendations.
Abstract:
Food retail is known for its use of flexible labour and for the centralisation of functions at head office, resulting in a reduction of managerial autonomy at store level. This article employs a typology of controls developed from labour process scholarship to explore how retail managers negotiate the control of their predominantly part-time workforce. Using an Australian supermarket chain as a case, and mixed methods, the article demonstrates that supermarkets use a multiplicity of forms of control across their workforce. For front line service workers, the article identifies a new configuration of controls which intersects with employment status and acts differentially for checkout operators on different employment contracts.
Abstract:
Similar to most other creative industries, the evolution of the music industry is heavily shaped by media technologies. This was equally true in 1999, when the global recorded music industry had experienced two decades of continuous growth, largely driven by the rapid transition from vinyl records to Compact Discs. The transition encouraged avid music listeners to purchase much of their music collections all over again in order to listen to their favourite music with 'digital sound'. As a consequence of this successful product innovation, recorded music sales (in unit terms) more than doubled between the early 1980s and the end of the 1990s. It was against this backdrop that the first peer-to-peer file sharing service was developed and released to the mainstream music market in 1999 by the college student Shawn Fanning. The service was named Napster, and it marks the beginning of an era that is now a classic example of how an innovation can disrupt an entire industry and make large swathes of existing industry competences obsolete. File sharing services such as Napster, followed by a range of similar services in its path, reduced physical unit sales in the music industry to levels that had not been seen since the 1970s. The severe impact of the internet on physical sales shocked many music industry executives, who spent much of the 2000s vigorously trying to reverse the decline and make the disruptive technologies go away. In the end, they learned that their efforts were to no avail and that the impact on the music industry was transformative, irreversible and, to many music industry professionals, also devastating. Thousands of people lost their livelihoods, and large and small music companies have folded or been forced into mergers or acquisitions. But as always during periods of disruption, the past 15 years have also been very innovative, spurring a plethora of new music business models. These new business models have mainly emerged outside the music industry, and the innovators have often been required to be both persuasive and persistent in order to gain acceptance from the risk-averse and cash-poor music industry establishment. Apple was one such change agent: in 2003 it became the first company to open up a functioning and legal market for online music. iTunes Music Store was the first online retail outlet able to offer the music catalogues of all the major music companies; it used an entirely novel pricing model, and it allowed consumers to de-bundle the music album and buy only the songs that they actually liked. Songs had previously been bundled by physical necessity, as discs or cassettes, but with iTunes Music Store the institutionalized album bundle slowly started to fall apart. The consequences had an immediate impact on music retailing, and within just a few years many brick-and-mortar record stores were forced out of business in markets across the world. The transformation also had disruptive consequences beyond music retailing and redefined music companies' organizational structures, work processes and routines, as well as professional roles. iTunes Music Store was in one sense a disruptive innovation, but it was at the same time relatively incremental, since the major labels' positions and power structures remained largely unscathed. The rights holders still controlled their intellectual properties, and the structures governing the royalties paid per song sold were predictable, transparent and in line with established music industry practices.
Abstract:
The values that gave rise to the ethos of public service broadcasting (PSB) almost a century ago, and which have provided the rationale for PSBs around the world across that time, are under question. This article argues that the process of reinvention of PSBs is enhanced by repositioning the innovation rationale for public service media (PSM). It is organized around a differentiation that is part of the standard repertoire of innovation studies, that between product, process and organizational innovation, as these are being practised by the two Australian PSBs, the Australian Broadcasting Corporation (ABC) and the Special Broadcasting Service (SBS). The article then considers the general problematics of innovation for PSBs through an analysis of the operation of the public value test in the context of European PSM, and of its non-application, to this stage, in Australia. The innovation rationale is argued to be a distinctive via media between complementary and comprehensive roles for PSM, which in turn suggests a policy-relevant research agenda focusing on international circumstances in which the public broadcaster is not market dominant.
Abstract:
This paper investigates several competing procedures for computing the prices of vanilla European options, such as puts, calls and binaries, in models whose characteristic function is known in semi-closed form. The algorithms investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. Their performance is assessed in simulation experiments in which an analytical solution is available, and also for a simple affine model of stochastic volatility in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. However, there are two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular those with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a case for pricing options using a full-range Fourier series rather than a half-range Fourier cosine series when a large number of options must be priced in as short a time as possible.
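To make the comparison concrete, here is a short implementation of the first of the three algorithms, the half-range Fourier cosine series, for a European call under the Black-Scholes characteristic function (a case where an analytical benchmark exists for checking). The cumulant-based truncation interval and the choices N = 256 and L = 10 are conventional defaults, not necessarily the paper's settings.

```python
import numpy as np

def cos_call_price(S0, K, T, r, sigma, N=256, L=10.0):
    """European call priced with a half-range Fourier cosine expansion of the
    risk-neutral density, using the Black-Scholes characteristic function."""
    x = np.log(S0 / K)                      # log-moneyness
    c1 = x + (r - 0.5 * sigma**2) * T       # first cumulant of ln(S_T / K)
    c2 = sigma**2 * T                       # second cumulant
    a = c1 - L * np.sqrt(c2)                # truncation interval [a, b]
    b = c1 + L * np.sqrt(c2)

    k = np.arange(N)
    u = k * np.pi / (b - a)

    # characteristic function of y = ln(S_T / K) under Black-Scholes
    phi = np.exp(1j * u * c1 - 0.5 * c2 * u**2)

    # cosine coefficients of the call payoff K * max(e^y - 1, 0) over [0, b]
    chi = ((-1.0) ** k * np.exp(b) - np.cos(u * a)
           + u * np.sin(u * a)) / (1.0 + u**2)
    psi = np.empty(N)
    psi[0] = b
    psi[1:] = np.sin(u[1:] * a) / u[1:]
    Vk = 2.0 / (b - a) * K * (chi - psi)

    terms = np.real(phi * np.exp(-1j * u * a)) * Vk
    terms[0] *= 0.5                         # first cosine term has weight 1/2
    return np.exp(-r * T) * terms.sum()

# Sanity check: cos_call_price(100, 100, 0.25, 0.05, 0.2) ≈ 4.61,
# matching the Black-Scholes closed-form price.
```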