953 results for Accounting data
Abstract:
Purpose: The aim of this paper is to identify and gain insights into the significance of barriers contributing to the purported "gap" between academic management accounting (MA) research and practice. Design/methodology/approach: Drawing on diffusion of innovations theory, this study collects and analyses data from a questionnaire survey and follow-up interviews with 19 representatives of the four principal professional accounting bodies in Australia. Findings: Professional accounting bodies perceive the gap between academic research and practice in management accounting to be of limited concern to practitioners. The two most significant barriers to research utilisation by practitioners are identified as: difficulties in understanding academic research papers; and limited access to research findings. In acting as a conduit between the worlds of academia and practice, professional bodies have an important role to play by demonstrating the mutual value to both academics and practitioners resulting from a closer engagement between MA research and practice. Research limitations/implications: As one of the few empirically-based, theoretically informed investigations exploring the research-practice gap in management accounting, this study provides insights rather than "answers". Its findings therefore serve as a foundational basis for further empirical and theoretical enquiry. Originality/value: This study contributes to the conversation about the "research-practice gap" in management accounting by adopting a distinct theoretical vantage point to organise, analyse and interpret empirical evidence obtained from Australian professional accounting bodies about management accounting practice. © Emerald Group Publishing Limited.
Abstract:
The results of an experimental study of retail investors' use of eXtensible Business Reporting Language (XBRL) tagged (interactive) data and the PDF format for making investment decisions are reported. The main finding is that data format made no difference to participants' ability to locate and integrate information from statement footnotes to improve investment decisions. Interactive data were perceived by participants as quick and 'accurate', but they failed to facilitate identification of the adjustment needed to make the ratios accurate for comparison. An important implication is that regulators and software designers should work to reduce user reliance on the comparability of ratios generated automatically using interactive data.
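To make the comparability point concrete, here is a minimal sketch (all figures hypothetical, not from the study): two firms whose automatically generated current ratios look identical until a footnote adjustment, of the kind participants had to identify, is applied.

# Hypothetical illustration: current ratios computed automatically from tagged
# filings appear comparable, but a footnote adjustment changes the picture.
firm_a = {"current_assets": 500.0, "current_liabilities": 250.0}
firm_b = {"current_assets": 480.0, "current_liabilities": 240.0}

# Hypothetical footnote for firm B: 60 of its current liabilities are
# obligations that firm A classifies as long-term.
footnote_adjustment_b = 60.0

naive_a = firm_a["current_assets"] / firm_a["current_liabilities"]      # 2.00
naive_b = firm_b["current_assets"] / firm_b["current_liabilities"]      # 2.00
adjusted_b = firm_b["current_assets"] / (
    firm_b["current_liabilities"] - footnote_adjustment_b)              # 2.67

print(f"naive:    A={naive_a:.2f}  B={naive_b:.2f}  (ratios appear identical)")
print(f"adjusted: A={naive_a:.2f}  B={adjusted_b:.2f}  (comparability breaks down)")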
Abstract:
The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October 2007 the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both prior to and following the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from actor-network theory (ANT), and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group: the non-professional investor.
Abstract:
The increasing adoption of international accounting standards and global convergence of accounting regulations is frequently heralded as serving to reduce diversity in financial reporting practice. In a process said to be driven in large part by the interests of international business and global financial markets, one might expect the greatest degree of convergence to be found amongst the world's largest multinational financial corporations. This paper challenges such claims and presumptions. Its content analysis of longitudinal data for the period 2000-2006 reveals substantial, ongoing diversity in the market risk disclosure practices, both numerical and narrative, of the world's top-25 banks. The significance of such findings is reinforced by the sheer scale of the banking sector's risk exposures that have been subsequently revealed in the current global financial crisis. The variations in disclosure practices documented in the paper apply both across and within national boundaries, leading to a firm conclusion that, at least in terms of market risk reporting, progress towards international harmonisation remains rather more apparent than real.
Abstract:
The primary aim of this research is to understand what constitutes management accounting and control (MACs) practice and how these control processes are implicated in the day-to-day work practices and operations of the organisation. It also examines the changes that happen in MACs practices over time as multiple actors within organisational settings interact with each other. I adopt a distinctive practice theory approach (i.e. sociomateriality) and the concept of imbrication in this research to show that MACs practices emerge from the entanglement between human/social agency and material/technological agency within an organisation. Changes in the pattern of MACs practices happen in imbrication processes, which are produced as the two agencies entangle. The theoretical approach employed in this research offers an interesting and valuable lens which seeks to reveal the depth of these interactions and uncover the way in which the social and the material imbricate. The theoretical framework helps to reveal how these constructions impact on and produce modifications of MACs practices. The exploration of control practices at different hierarchical levels (i.e. from the operational level to middle and senior management) using the concept of the imbrication process also maps the dynamic flow of controls from operational to top management and vice versa in the organisation. The empirical data at the focus of this research were gathered from a case study of a large, vertically integrated palm oil company in Malaysia, specifically its refinery sector. The palm oil industry is a significant industry in Malaysia, contributing an average of 4.5% of Malaysian Gross Domestic Product over the period 1990-2010. The Malaysian palm oil industry also has a significant presence in the global food oil supply, contributing 26% of total global oils and fats trade in 2010. The case organisation is a significant contributor to the Malaysian palm oil industry. The research access provided an interesting opportunity to explore the interactions between different groups of people and material/technology in a relatively heavy-process food industry setting. My research examines how these interactions shape and are shaped by control practices in a dynamic cycle of imbrications over both short and medium time periods.
Abstract:
We use a panel data set of UK-listed companies over the period 2005–2009 to analyse the actuarial assumptions used to value pension plan liabilities under IAS 19. The valuation process requires companies to make assumptions about financial and demographic variables, notably discount rate, price inflation, salary inflation and mortality/life expectancy of plan members/beneficiaries. We use regression analysis to analyse the relationships between these key assumptions (except mortality, where disclosures are limited) and company-specific factors such as the pension plan funding position and duration of pension liabilities. We find evidence of selective ‘management’ of the three assumptions investigated, although the nature of this appears to differ from the findings of US authors. We conclude that IAS 19 does not prevent the use of managerial discretion, particularly by companies whose pension plan funding positions are weak, thereby reducing the representational faithfulness of the reported pension figures. We also highlight that the degree of discretion used reflects the extent to which IAS 19 defines how the assumptions are to be determined. We therefore suggest that companies should be encouraged to justify more explicitly their choice of assumptions.
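The following is a minimal sketch of the kind of panel regression described above (data and variable names are hypothetical, not the authors'): the assumed discount rate is regressed on the plan funding position and liability duration, with year effects.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per company-year over 2005-2009.
years = list(range(2005, 2010))
df = pd.DataFrame({
    "company": [c for c in "ABC" for _ in years],
    "year": years * 3,
    # plan assets / plan liabilities
    "funding_ratio": [0.92, 0.95, 0.97, 0.85, 0.88,
                      0.80, 0.83, 0.86, 0.75, 0.78,
                      1.05, 1.07, 1.08, 0.98, 1.01],
    # liability duration, years
    "duration":      [18.0, 17.5, 17.2, 17.0, 16.8,
                      22.0, 21.5, 21.2, 21.0, 20.6,
                      15.0, 14.8, 14.5, 14.4, 14.2],
    # assumed discount rate, %
    "discount_rate": [5.4, 5.1, 5.0, 6.1, 5.7,
                      5.8, 5.5, 5.4, 6.6, 6.2,
                      5.2, 4.9, 4.8, 5.8, 5.5],
})

# Pooled OLS with year dummies; a negative coefficient on funding_ratio would
# be consistent with weakly funded plans choosing higher discount rates.
model = smf.ols("discount_rate ~ funding_ratio + duration + C(year)", data=df).fit()
print(model.summary())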
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
We develop a framework for estimating the quality of transmission (QoT) of a new lightpath before it is established, as well as for calculating the expected degradation it will cause to existing lightpaths. The framework correlates the QoT metrics of established lightpaths, which are readily available from coherent optical receivers that can be extended to serve as optical performance monitors. Past similar studies used only space (routing) information and thus neglected spectrum, while they focused on old-generation, noncoherent networks. The proposed framework accounts for correlation in both the space and spectrum domains and can be applied to both fixed-grid wavelength division multiplexing (WDM) and elastic optical networks. It is based on a graph transformation that exposes and models the interference between spectrum-neighboring channels. Our results indicate that our QoT estimates are very close to the actual performance data, that is, to having perfect knowledge of the physical layer. The proposed estimation framework is shown to provide up to 4 × 10⁻² lower pre-forward error correction bit error ratio (BER) compared to the worst-case interference scenario, which overestimates the BER. The higher accuracy can be harvested when lightpaths are provisioned with low margins; our results showed up to 47% reduction in required regenerators, a substantial saving in equipment cost.
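A minimal sketch of the correlation idea (topology, weighting scheme and monitor readings are hypothetical simplifications, not the paper's graph transformation): a candidate lightpath's pre-FEC BER is estimated from monitors on established lightpaths, weighted by shared links (space domain) and spectral proximity (spectrum domain).

# Hypothetical sketch: estimate the pre-FEC BER of a candidate lightpath from
# monitored BERs of established lightpaths that share links and occupy
# neighbouring spectrum slots.
established = [
    # (links traversed, centre slot index, monitored pre-FEC BER)
    ({"A-B", "B-C"}, 10, 1.2e-4),
    ({"B-C", "C-D"}, 12, 3.5e-4),
    ({"A-B"},        40, 0.9e-4),  # spectrally distant: should barely count
]

def estimate_ber(candidate_links, candidate_slot, neighbours, slot_reach=4):
    """Weight each monitor reading by link overlap and spectral proximity."""
    weights, readings = [], []
    for links, slot, ber in neighbours:
        shared = len(candidate_links & links) / len(candidate_links)
        spectral = max(0.0, 1.0 - abs(candidate_slot - slot) / slot_reach)
        w = shared * spectral
        if w > 0:
            weights.append(w)
            readings.append(ber)
    if not weights:
        return None  # no correlated monitors: fall back to a worst-case model
    return sum(w * b for w, b in zip(weights, readings)) / sum(weights)

print(estimate_ber({"A-B", "B-C"}, 11, established))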
Abstract:
Permanent water bodies not only store dissolved CO₂ but are essential for the maintenance of wetlands in their proximity. From the viewpoint of greenhouse gas (GHG) accounting, wetland functions comprise carbon sequestration under anaerobic conditions and methane release. The investigated area in central Siberia covers boreal and sub-arctic environments. Small inundated basins, which constitute important freshwater ecosystems, are abundant on the sub-arctic Taymir lowlands but also in parts of the severe boreal climate zone where permafrost ice content is high. Satellite radar imagery (ENVISAT ScanSAR), acquired in summer 2003 and 2004, has been used to derive open water surfaces at 150 m resolution, covering an area of approximately 3 million km². The open water surface maps were derived using a simple threshold-based classification method. The results were assessed against Russian forest inventory data, which include detailed information about water bodies. The resulting classification was further used to estimate the extent of tundra wetlands and to determine their importance for methane emissions. Tundra wetlands cover 7% (400,000 km²) of the study region, and methane emissions from hydromorphic soils are estimated at 45,000 t/d for the Taymir peninsula.
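A minimal sketch of threshold-based open-water classification of the kind described (threshold and backscatter values are illustrative): smooth open water returns weak radar backscatter, so pixels below a decision threshold are labelled water and summed into an area estimate.

import numpy as np

# Illustrative ScanSAR-like backscatter tile in dB (values hypothetical);
# smooth open water backscatters weakly, so it shows up as low values.
sigma0_db = np.array([
    [-8.1,  -7.9, -16.5, -17.2],
    [-7.5, -15.9, -17.8, -16.8],
    [-7.2,  -8.0,  -8.3, -15.4],
])

THRESHOLD_DB = -14.0           # assumed decision boundary, tuned per scene in practice
water_mask = sigma0_db < THRESHOLD_DB

pixel_area_km2 = 0.15 * 0.15   # 150 m resolution
open_water_km2 = water_mask.sum() * pixel_area_km2
print(water_mask)
print(f"open water: {open_water_km2:.3f} km² in this tile")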
Abstract:
Owing to their important roles in biogeochemical cycles, phytoplankton functional types (PFTs) have been the target of an increasing number of ocean color algorithms. Yet, none of the existing methods are based on phytoplankton carbon (C) biomass, which is a fundamental biogeochemical and ecological variable and the "unit of accounting" in Earth system models. We present a novel bio-optical algorithm to retrieve size-partitioned phytoplankton carbon from ocean color satellite data. The algorithm is based on existing methods to estimate particle volume from a power-law particle size distribution (PSD). Volume is converted to carbon concentrations using a compilation of allometric relationships. We quantify absolute and fractional biomass in three PFTs based on size: picophytoplankton (0.5-2 µm in diameter), nanophytoplankton (2-20 µm) and microphytoplankton (20-50 µm). The mean spatial distributions of total phytoplankton C biomass and individual PFTs, derived from global SeaWiFS monthly ocean color data, are consistent with current understanding of oceanic ecosystems, i.e., oligotrophic regions are characterized by low biomass and dominance of picoplankton, whereas eutrophic regions have high biomass to which nanoplankton and microplankton contribute relatively larger fractions. Global climatological, spatially integrated phytoplankton carbon biomass standing stock estimates using our PSD-based approach yield ~0.25 Gt of C, consistent with analogous estimates from two other ocean color algorithms and several state-of-the-art Earth system models. Satisfactory in situ closure observed between PSD and POC measurements lends support to the theoretical basis of the PSD-based algorithm. Uncertainty budget analyses indicate that absolute carbon concentration uncertainties are driven by the PSD parameter N₀, which determines particle number concentration to first order, while uncertainties in PFTs' fractional contributions to total C biomass are mostly due to the allometric coefficients. The C algorithm presented here, which is not empirically constrained a priori, partitions biomass in size classes and introduces improvement over the assumptions of the other approaches. However, the range of phytoplankton C biomass spatial variability globally is larger than estimated by any of the other models considered here, which suggests that an empirical correction to the N₀ parameter is needed, based on PSD validation statistics. These corrected absolute carbon biomass concentrations validate well against in situ POC observations.
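A minimal sketch of the PSD-based accounting step (all constants are illustrative, not the paper's fitted values): a power-law particle size distribution is integrated over each size class, and cell volume is converted to carbon with an assumed allometric relationship.

import numpy as np
from scipy.integrate import quad

# Power-law PSD: n(D) = N0 * (D / D0)**(-xi), D in um (constants illustrative).
N0, D0, xi = 1.0e4, 1.0, 4.0   # number-concentration scale, reference size, slope
a, b = 0.216, 0.939            # assumed allometry: C_per_cell = a * V**b (pg C)

size_classes = {"pico": (0.5, 2.0), "nano": (2.0, 20.0), "micro": (20.0, 50.0)}

def carbon_density(D):
    """Carbon contributed by cells of diameter D (illustrative units)."""
    n = N0 * (D / D0) ** (-xi)     # abundance of cells at size D
    V = (np.pi / 6.0) * D ** 3     # cell volume, um^3
    return n * a * V ** b          # abundance x carbon-per-cell

contributions = {name: quad(carbon_density, dmin, dmax)[0]
                 for name, (dmin, dmax) in size_classes.items()}
total = sum(contributions.values())
for name, c in contributions.items():
    print(f"{name:5s}: {c:.3e} pg C  ({c / total:.1%} of total)")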
Abstract:
Determining the past record of temperature and salinity of ocean surface waters is essential for understanding past changes in climate, such as those which occur across glacial-interglacial transitions. As a useful proxy, the oxygen isotope composition (δ¹⁸O) of calcite from planktonic foraminifera has been shown to reflect both surface temperature and seawater δ¹⁸O, itself an indicator of global ice volume and salinity (Shackleton, 1974; Rostek et al., 1993, doi:10.1038/364319a0). In addition, magnesium/calcium (Mg/Ca) ratios in foraminiferal calcite show a temperature dependence (Nürnberg, 1995, doi:10.2113/gsjfr.25.4.350; Nürnberg et al., 1996, doi:10.1016/0016-7037(95)00446-7; Lea et al., 1999, doi:10.1016/S0016-7037(99)00197-0) due to the partitioning of Mg during calcification. Here we demonstrate, in a field-based calibration experiment, that the variation of Mg/Ca ratios with temperature is similar for eight species of planktonic foraminifera (when accounting for Mg dissolution effects). Using a multi-species record from the Last Glacial Maximum in the North Atlantic Ocean we found that past temperatures reconstructed from Mg/Ca ratios followed the two other palaeotemperature proxies: faunal abundance (CLIMAP, 1981; Mix et al., 1999, doi:10.1029/1999PA900012) and alkenone saturation (Müller et al., 1998, doi:10.1016/S0016-7037(98)00097-0). Moreover, combining Mg/Ca and δ¹⁸O data from the same faunal assemblage, we show that reconstructed surface-water δ¹⁸O values from all foraminiferal species record the same glacial-interglacial change, representing changing hydrography and global ice volume. This reinforces the potential of this combined technique in probing past ocean-climate interactions.
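A minimal sketch of the combined Mg/Ca-δ¹⁸O technique (calibration coefficients are indicative of published forms such as Shackleton-type equations, not this paper's fitted values): temperature is recovered by inverting an exponential Mg/Ca calibration, then seawater δ¹⁸O is solved from a palaeotemperature equation.

import math

# Illustrative exponential Mg/Ca calibration: Mg/Ca = B * exp(A * T).
A, B = 0.09, 0.38        # per deg C and mmol/mol; values indicative only

def temperature_from_mgca(mgca_mmol_mol):
    """Invert the exponential calibration for calcification temperature (deg C)."""
    return math.log(mgca_mmol_mol / B) / A

def d18o_seawater(T, d18o_calcite):
    """Solve a Shackleton-type palaeotemperature equation,
    T = 16.9 - 4.38*x + 0.10*x**2 with x = d18Oc - d18Ow, for d18Ow.
    (Coefficients indicative of published forms, not this paper's.)"""
    # quadratic in x: 0.10*x**2 - 4.38*x + (16.9 - T) = 0; take the physical root
    disc = 4.38 ** 2 - 4 * 0.10 * (16.9 - T)
    x = (4.38 - math.sqrt(disc)) / (2 * 0.10)
    return d18o_calcite - x

mgca = 2.1               # mmol/mol, hypothetical measurement
d18oc = 0.8              # per mil vs VPDB, hypothetical, from the same shells

T = temperature_from_mgca(mgca)
print(f"Mg/Ca temperature: {T:.1f} C")
print(f"seawater d18O:     {d18o_seawater(T, d18oc):+.2f} per mil")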
Abstract:
This Bachelor's thesis was carried out as a literature review, the aim of which is to identify applications of data analytics and the impact of data utilisation on business. The thesis discusses the use of data analytics and the challenges of exploiting data effectively. Its scope is limited to corporate financial control, where analytics is applied in management accounting and financial accounting. The exponential growth rate of data volumes creates new challenges and opportunities for the use of data analytics. Data in itself, however, is of little value to a company; value is created through processing. Although data analytics is already widely studied and used, it offers possibilities far greater than its current applications. One of the key findings of this thesis is that data analytics can make management accounting more effective and ease the tasks of financial accounting. However, the volume of available data is growing so quickly that the available technology and level of expertise cannot keep pace with the development. In particular, the wider adoption of big data and its effective exploitation will increasingly influence financial control practices and applications in the future.
Abstract:
In the last thirty years, the emergence and progression of biologging technology has led to great advances in marine predator ecology. Large databases of location and dive observations from biologging devices have been compiled for an increasing number of diving predator species (such as pinnipeds, sea turtles, seabirds and cetaceans), enabling complex questions about animal activity budgets and habitat use to be addressed. Central to answering these questions is our ability to correctly identify and quantify the frequency of essential behaviours, such as foraging. Despite technological advances that have increased the quality and resolution of location and dive data, accurately interpreting behaviour from such data remains a challenge, and analytical methods are only beginning to unlock the full potential of existing datasets. This review evaluates both traditional and emerging methods and presents a starting platform of options for future studies of marine predator foraging ecology, particularly from location and two-dimensional (time-depth) dive data. We outline the different devices and data types available, discuss the limitations and advantages of commonly used analytical techniques, and highlight key areas for future research. We focus our review on pinnipeds - one of the most studied taxa of marine predators - but offer insights that will be applicable to other air-breathing marine predator tracking studies. We highlight that traditionally used methods for inferring foraging from location and dive data, such as first-passage time and dive shape analysis, have important caveats and limitations depending on the nature of the data and the research question. We suggest that more holistic statistical techniques, such as state-space models, which can synthesise multiple track, dive and environmental metrics whilst simultaneously accounting for measurement error, offer more robust alternatives. Finally, we identify a need for more research to elucidate the role of physical oceanography, device effects, study animal selection, and developmental stages in predator behaviour and data interpretation.
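A minimal sketch of the kind of dive-shape heuristic the review contrasts with state-space models (profiles and thresholds are hypothetical): each time-depth record is summarised, and dives with a long flat bottom phase are labelled U-shaped, a common, if caveat-laden, proxy for foraging.

import numpy as np

def classify_dive(depths_m, interval_s=5, bottom_frac=0.8, u_threshold=0.5):
    """Label a single dive's time-depth profile as 'U' (flat-bottomed, often read
    as foraging) or 'V' (direct descent/ascent). Thresholds are illustrative."""
    depths = np.asarray(depths_m, dtype=float)
    max_depth = depths.max()
    duration_s = len(depths) * interval_s
    # fraction of samples spent below bottom_frac of the maximum depth
    time_at_bottom = np.mean(depths >= bottom_frac * max_depth)
    shape = "U" if time_at_bottom >= u_threshold else "V"
    return {"max_depth_m": max_depth, "duration_s": duration_s,
            "bottom_time_frac": round(float(time_at_bottom), 2), "shape": shape}

# Two hypothetical dives sampled every 5 s
u_dive = [0, 20, 60, 95, 100, 98, 99, 100, 97, 60, 20, 0]
v_dive = [0, 30, 70, 100, 70, 30, 0]
print(classify_dive(u_dive))
print(classify_dive(v_dive))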
Abstract:
This paper explores the effect of using regional data for livestock attributes on estimation of greenhouse gas (GHG) emissions for the northern beef industry in Australia, compared with using state/territory-wide values, as currently used in Australia's national GHG inventory report. Regional GHG emissions associated with beef production are reported for 21 defined agricultural statistical regions within state/territory jurisdictions. A management scenario for reduced emissions that could qualify as an Emissions Reduction Fund (ERF) project was used to illustrate the effect of regional-level model parameters on estimated abatement levels. Using regional parameters, instead of state-level parameters, for liveweight (LW), LW gain and proportion of cows lactating, and an expanded number of livestock classes, gives a 5.2% reduction in estimated emissions (range +12% to –34% across regions). Estimated GHG emissions intensity (emissions per kilogram of LW sold) varied across the regions by up to 2.5-fold, ranging from 10.5 kg CO₂-e kg⁻¹ LW sold for the Darling Downs, Queensland, through to 25.8 kg CO₂-e kg⁻¹ LW sold for the Pindan and North Kimberley, Western Australia. This range was driven by differences in production efficiency, reproduction rate, growth rate and survival. This suggests that some regions in northern Australia are likely to have substantial opportunities for GHG abatement and higher livestock income. However, this must be coupled with the availability of management activities that can be implemented to improve production efficiency; wet-season phosphorus (P) supplementation is one such practice. An ERF case study comparison showed that P supplementation of a typical-sized herd produced an estimated reduction of 622 t CO₂-e year⁻¹, or 7%, compared with a non-P-supplemented herd. However, the different model parameters used by the National Inventory Report and the ERF project mean that there was an anomaly between the herd emissions for project cattle excised from the national accounts (13 479 t CO₂-e year⁻¹) and the baseline herd emissions estimated for the ERF project (8 896 t CO₂-e year⁻¹) before P supplementation was implemented. Regionalising livestock model parameters in both ERF projects and the national accounts offers the attraction of being able to more easily and accurately reflect emissions savings from this type of emissions reduction project in Australia's national GHG accounts.
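A minimal sketch of the emissions-intensity arithmetic described above (herd figures are hypothetical; only the 5.2% reduction and the 10.5-25.8 intensity range come from the abstract): herd emissions divided by liveweight sold, compared under state-level versus regional parameter sets.

# Hypothetical herd-level sketch; only the 5.2% reduction and the reported
# intensity range (10.5-25.8 kg CO2-e per kg LW sold) come from the abstract.
def emissions_intensity(herd_emissions_t_co2e, lw_sold_t):
    """kg CO2-e per kg liveweight sold (t/t cancels to kg/kg)."""
    return herd_emissions_t_co2e / lw_sold_t

lw_sold_t = 500.0                    # hypothetical liveweight sold per year, tonnes

state_emissions_t = 9_000.0          # hypothetical estimate, state-wide parameters
regional_emissions_t = state_emissions_t * (1 - 0.052)   # 5.2% lower (abstract)

print(f"state-level: {emissions_intensity(state_emissions_t, lw_sold_t):.1f} kg CO2-e/kg LW")
print(f"regional:    {emissions_intensity(regional_emissions_t, lw_sold_t):.1f} kg CO2-e/kg LW")
# Reported regional intensities spanned 10.5 (Darling Downs) to 25.8 (Pindan /
# North Kimberley) kg CO2-e per kg LW sold, an approximately 2.5-fold range.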