177 results for Heterogeneous firms
Abstract:
This study explores the relationship between the qualities of an information system and management accounting adaptability (MAA) and effectiveness in firms. Design/methodology/approach: I develop and empirically test a model in which the qualities of the information system and management accounting effectiveness are mediated by MAA. Findings: Information system flexibility (ISF) and shared knowledge had a significant and positive correlation with MAA, which in turn had a positive and significant correlation with management accounting effectiveness. There was also a moderation effect of ISF on the correlation between information system integration and MAA. Research implications: Information system integration may not lead to management accounting stability, but the lack of flexibility of a system and a lack of cooperation between the stakeholders might lead to its stagnation. Practical implications: Organizations are advised to implement solutions that are relatively flexible and modular, as well as to encourage cooperation between stakeholders to fully leverage and improve existing and future systems. Originality/value: The study extends the discourse on the interaction between management accounting and information systems by exploring the role of a number of factors that drive MAA.
Abstract:
A holistic consideration of innovation and associated activities is still very new to consulting engineering firms. This research will have benefits for both industry and academia. The final outcome of this research is a prioritised decision-making innovation model that consulting engineering firms can use to make informed decisions by investing in appropriate innovation activities that positively impact project performance. This supports an informed approach to investment rather than 'hit-and-miss' trialling.
Abstract:
Interest in the area of collaborative Unmanned Aerial Vehicles (UAVs) in a Multi-Agent System is growing to complement the strengths and weaknesses of the human-machine relationship. To achieve effective management of multiple heterogeneous UAVs, the agents must communicate their status models to each other. This paper presents the effects on operator Cognitive Workload (CW), Situation Awareness (SA), trust and performance of increasing autonomy capability transparency through text-based communication from the UAVs to the human agents. The results revealed a reduction in CW, an increase in SA, increases in the Competence, Predictability and Reliability dimensions of trust, and improved operator performance.
Abstract:
This study provides evidence that after several decades of fighting for equal pay for equal work, an unexplained gender pay gap remains amongst senior executives in ASX-listed firms. After controlling for a large suite of personal, occupational and firm observables, we find female senior executives receive, on average, 22.58 percent less in base salary for the period 2002–2013. When executives are awarded performance-based pay, females receive on average 16.47 percent less in cash bonus and 18.21 percent less in long-term incentives than males. The results are robust to using firm fixed effects and propensity-score matching. Blinder–Oaxaca decomposition results show that the mean pay gap cannot be attributed to gender differences in attributes, including job titles. Instead, the results point to differences in returns on firm-specific variables, in particular firm risk.
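The Blinder–Oaxaca decomposition used above splits a mean outcome gap into a part explained by group differences in attributes and an unexplained part attributed to differences in returns. A minimal sketch on synthetic data (the variables, coefficients and group labels here are hypothetical illustrations, not the study's data or its full specification):

```python
import numpy as np

rng = np.random.default_rng(0)

def ols(X, y):
    # Ordinary least squares coefficients (first column of X is the intercept).
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Synthetic data: log salary driven by one attribute, with a lower
# return on that attribute for group B (the "unexplained" component).
n = 1000
def simulate(intercept, slope):
    x = rng.normal(10, 2, n)                      # e.g. years of experience
    y = intercept + slope * x + rng.normal(0, 0.1, n)
    return np.column_stack([np.ones(n), x]), y

X_a, y_a = simulate(1.0, 0.10)                    # group A
X_b, y_b = simulate(1.0, 0.08)                    # group B: lower returns

b_a, b_b = ols(X_a, y_a), ols(X_b, y_b)
xbar_a, xbar_b = X_a.mean(axis=0), X_b.mean(axis=0)

gap = y_a.mean() - y_b.mean()
explained = (xbar_a - xbar_b) @ b_a               # differences in attributes
unexplained = xbar_b @ (b_a - b_b)                # differences in returns
# gap == explained + unexplained holds exactly for OLS with an intercept
```

With group A coefficients as the reference, the unexplained term captures what the abstract describes: a gap driven by differences in returns rather than in attributes.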
Abstract:
Tridiagonal diagonally dominant linear systems arise in many scientific and engineering applications. The standard Thomas algorithm for solving such systems is inherently serial, forming a bottleneck in computation. Algorithms such as cyclic reduction and SPIKE reduce a single large tridiagonal system into multiple small independent systems which can be solved in parallel. We have developed portable OpenCL implementations of the cyclic reduction and SPIKE algorithms with the intent to target a range of co-processors in a heterogeneous computing environment, including Field Programmable Gate Arrays (FPGAs), Graphics Processing Units (GPUs) and other multi-core processors. In this paper, we evaluate these designs in the context of solver performance, resource efficiency and numerical accuracy.
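The Thomas algorithm cited above is a specialised LU factorisation: a forward elimination sweep followed by back substitution, whose loop-carried dependence is exactly what makes it serial. A minimal NumPy version (an illustrative sketch, not the paper's OpenCL code):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d.  len(b) == len(d) == n,
    len(a) == len(c) == n - 1.  Stable for diagonally dominant systems."""
    n = len(b)
    cp = np.empty(n - 1)
    dp = np.empty(n)
    # Forward sweep: each step depends on the previous one (serial).
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    # Back substitution, also a serial dependence chain.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Cyclic reduction and SPIKE break exactly this dependence chain, decoupling the system into independent sub-systems that parallel devices can solve concurrently.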
Abstract:
Small, not-for-profit organisations fulfil a need in the economy that is typically not satisfied by for-profit firms. They also operate in ways that are distinct from larger organisations. While such firms employ a substantial proportion of the workforce, research addressing human resource management (HRM) practices in these settings is limited. This article used data collected from five small not-for-profit firms in Australia to examine the way one significant HRM practice – the provision and utilisation of flexible work arrangements – operates in the sector. Drawing on research from several scholarly fields, the article firstly develops a framework comprising three tensions in not-for-profits that have implications for HRM. These tensions are: (1) contradictions between an informal approach to HRM vs. a formal regulatory system; (2) employee values that favour social justice vs. external market forces; and (3) a commitment to service vs. external financial expectations. The article then empirically examines how these tensions are managed in relation to the specific case of flexible work arrangements. The study reveals that tensions around providing and accessing flexible work arrangements are managed in three ways: discretion, leadership style and distancing. These findings more broadly inform the way HRM is operationalised in this under-examined sector.
Abstract:
- Provided a practical variable-stepsize implementation of the exponential Euler method (EEM).
- Introduced a new second-order variant of the scheme that enables the local error to be estimated at the cost of a single additional function evaluation.
- The new EEM implementation outperformed sophisticated implementations of the backward differentiation formulae (BDF) of order 2 and was competitive with BDF of order 5 for moderate to high tolerances.
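For a semilinear system y' = Ay + g(y), the basic exponential Euler step is y_{n+1} = e^{hA} y_n + h φ1(hA) g(y_n), with φ1(z) = (e^z − 1)/z. A fixed-step sketch (a textbook illustration with names of my choosing, not the paper's variable-stepsize implementation or its error estimator):

```python
import numpy as np
from scipy.linalg import expm

def phi1(M):
    # φ1(M) = M^{-1}(e^M - I), computed without inverting M: for the
    # block matrix [[M, I], [0, 0]], the top-right block of its
    # exponential is exactly phi1(M).
    n = M.shape[0]
    aug = np.zeros((2 * n, 2 * n))
    aug[:n, :n] = M
    aug[:n, n:] = np.eye(n)
    return expm(aug)[:n, n:]

def exponential_euler(A, g, y0, h, steps):
    # y_{n+1} = e^{hA} y_n + h * phi1(hA) g(y_n); the stiff linear part
    # is integrated exactly, only g is frozen over the step.
    E = expm(h * A)
    P = h * phi1(h * A)
    y = np.asarray(y0, dtype=float)
    out = [y.copy()]
    for _ in range(steps):
        y = E @ y + P @ g(y)
        out.append(y.copy())
    return np.array(out)
```

Because the linear part is handled exactly, the scheme stays stable for stiff A at step sizes where an explicit method would blow up, which is what makes it competitive with implicit BDF codes.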
Abstract:
Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
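The coupled Heston dynamics can be simulated per path with a standard discretisation; below is a full-truncation Euler sketch in NumPy (an illustrative scheme of my choosing, not necessarily the discretisation or the particle-filter machinery used in the paper):

```python
import numpy as np

def heston_paths(s0, v0, mu, kappa, theta, xi, rho, T, n_steps, n_paths, seed=0):
    """Full-truncation Euler simulation of the Heston model:
        dS = mu*S dt + sqrt(v)*S dW1
        dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,  corr(dW1, dW2) = rho
    Returns the terminal prices of n_paths independent trajectories."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, float(s0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        # Correlate the two Brownian increments.
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)       # "full truncation": clamp v at zero
        s = s * np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v = v + kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s
```

Each path reads only its own state, so the inner updates map one-to-one onto the work-items of an OpenCL kernel, which is what makes the problem well suited to heterogeneous platforms.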
Abstract:
This thesis examines green marketing and green consumption behaviours, addressing limited understanding of how consumers interpret their green consumption behaviour in their everyday lives, what motivates people to purchase green products, and what barriers exist to this behaviour. Findings reveal that enhancing green consumption through green marketing depends on consumers' enthusiasm to engage in green practices and on green behavioural influences. The research supports the need for qualitative research to provide rich insights into the relationships between consumer behaviour, green marketing and green consumption, and builds a stronger knowledge foundation by introducing social practice theory into the marketing discipline.
Abstract:
Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed along with an algorithmic implementation to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
Abstract:
Lung cancer is the second most common type of cancer in the world and is the most common cause of cancer-related death in both men and women. Research into causes, prevention and treatment of lung cancer is ongoing and much progress has been made recently in these areas; however, survival rates have not significantly improved. Therefore, it is essential to develop biomarkers for early diagnosis of lung cancer, prediction of metastasis and evaluation of treatment efficiency, as well as using these molecules to provide some understanding about tumour biology and translate highly promising findings in basic science research to clinical application. In this investigation, two-dimensional difference gel electrophoresis and mass spectrometry were initially used to analyse conditioned media from a panel of lung cancer and normal bronchial epithelial cell lines. Significant proteins were identified, with heterogeneous nuclear ribonucleoprotein A2B1 (hnRNPA2B1), pyruvate kinase M2 isoform (PKM2), Hsc-70 interacting protein and lactate dehydrogenase A (LDHA) selected for analysis in serum from healthy individuals and lung cancer patients. hnRNPA2B1, PKM2 and LDHA levels differed significantly in all comparisons. Tissue analysis and knockdown of hnRNPA2B1 using siRNA subsequently demonstrated both the overexpression and a potential role for this molecule in lung tumorigenesis. The data presented highlight a number of in vitro derived candidate biomarkers subsequently verified in patient samples and also provide some insight into their roles in the complex intracellular mechanisms associated with tumour progression.
Abstract:
Construction firms that employ collaborative procurement approaches develop operating routines through joint learning so as to improve infrastructure project performance. This paper reports a study based on a survey sample of 320 construction practitioners who were involved in collaborative infrastructure delivery in Australia. The study developed valid and reliable scales for measuring collaborative learning capability (CLC), and used the scales to evaluate the CLC of contractor and consultant firms within the sample. The evaluation suggests that whilst these firms explore knowledge from both internal and external sources, transform both explicit and tacit knowledge, and apply and internalise new knowledge, they can improve the extent to which these routines are applied to optimise project performance.