74 results for Information theory.


Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the social and environmental disclosure practices of two large multinational companies, specifically Nike and Hennes & Mauritz. Utilising a joint consideration of legitimacy theory and media agenda-setting theory, we investigate the linkage between negative media attention and positive corporate social and environmental disclosures. Our results generally support the view that, for those industry-related social and environmental issues attracting the greatest amount of negative media attention, these corporations react by providing positive social and environmental disclosures. The results were particularly significant in relation to labour practices in developing countries - the issue attracting the greatest amount of negative media attention for the companies in question.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates whether the narrative section of Iranian companies' annual reports satisfies the information requirements of financial analysts employed by institutional investors. Taking a group of stakeholders (i.e. financial analysts) as the sample, a questionnaire survey was conducted to identify their top three information needs from the narrative sections of company annual reports in each of three information categories: Present, Analytical and Prospective. Following this survey, a checklist was prepared to analyse whether Iranian companies disclose the information required by financial analysts. Overall, the results partially support stakeholder theory, as there is a general lack of information flow on the part of Iranian listed companies in meeting their stakeholders' information needs.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study is to extend the existing literature on the antecedents of public sector accountability performance by including a new variable: preparer-commanders’ beliefs about the usefulness of whole-of-government consolidated financial reporting. Goldberg’s (1965) Commander Theory was used as the theoretical framework. Survey results provided insights into preparer-commanders’ beliefs as to the usefulness of whole-of-government consolidated financial reports for the discharge of accountability. While there appears to be a view that such reports may be useful for decision-making purposes, there is relatively little evidence to suggest that this type of information is suitable for government resource allocation decisions.

Relevance:

30.00%

Publisher:

Abstract:

Selecting a suitable proximity measure is one of the fundamental tasks in clustering. How to effectively utilize all available side information, including instance-level information in the form of pair-wise constraints and attribute-level information in the form of attribute order preferences, is an essential problem in metric learning. In this paper, we propose a learning framework in which both the pair-wise constraints and the attribute order preferences can be incorporated simultaneously. The theory behind it and the related parameter-adjusting technique are described in detail. Experimental results on benchmark data sets demonstrate the effectiveness of the proposed method.
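The abstract gives no implementation details, but the two kinds of side information it names can be combined in a simple learned metric. A minimal sketch, assuming a diagonal Mahalanobis metric and hinge-style penalties (the function name, loss form, and parameters are illustrative, not taken from the paper):

```python
import numpy as np

def learn_diagonal_metric(X, must_link, cannot_link, attr_prefs,
                          margin=1.0, lr=0.01, epochs=200):
    """Learn diagonal metric weights from two kinds of side information:
    instance-level pairwise constraints and attribute-level order
    preferences (hypothetical formulation, for illustration only).

    must_link / cannot_link: lists of (i, j) index pairs into X.
    attr_prefs: list of (a, b) meaning attribute a should weigh at
                least as much as attribute b.
    """
    n_features = X.shape[1]
    w = np.ones(n_features)                  # diagonal metric weights
    for _ in range(epochs):
        grad = np.zeros(n_features)
        # pull must-link pairs together (shrink weights on differing attributes)
        for i, j in must_link:
            grad += (X[i] - X[j]) ** 2
        # push cannot-link pairs apart until they exceed the margin
        for i, j in cannot_link:
            d2 = (X[i] - X[j]) ** 2
            if w @ d2 < margin:
                grad -= d2
        # hinge penalty enforcing the attribute order preferences
        for a, b in attr_prefs:
            if w[a] < w[b]:
                grad[a] -= 1.0
                grad[b] += 1.0
        w = np.maximum(w - lr * grad, 1e-6)  # keep weights positive
    return w

# Toy usage: the must-link pair disagrees mainly on attribute 1,
# so its weight is driven down relative to attribute 0.
X = np.array([[0.0, 0.0], [0.1, 5.0], [4.0, 0.2]])
w = learn_diagonal_metric(X, must_link=[(0, 1)], cannot_link=[(0, 2)],
                          attr_prefs=[(0, 1)])
```

Here the pairwise constraints reshape the metric while the order preference keeps the first attribute weighted at least as heavily as the second, showing how both forms of side information act on the same set of parameters.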

Relevance:

30.00%

Publisher:

Abstract:

Agencies charged with nature conservation and protecting built assets from fire face a policy dilemma because management that protects assets can have adverse impacts on biodiversity. Although conservation is often a policy goal, protecting built assets usually takes precedence in fire management implementation. To make decisions that can better achieve both objectives, existing trade-offs must first be recognized, and then policies implemented to manage multiple objectives explicitly. We briefly review fire management actions that can conflict with biodiversity conservation. Through this review, we find that common management practices might not appreciably reduce the threat to built assets but could have a large negative impact on biodiversity. We develop a framework based on decision theory that could be applied to minimize these conflicts. Critical to this approach are (1) the identification of the full range of management options and (2) the collection of data for evaluating the effectiveness of those options in achieving asset protection and conservation goals. This information can be used to compare explicitly the effectiveness of different management choices for conserving species and for protecting assets, given budget constraints. The challenge now is to gather data to quantify these trade-offs so that fire policy and practices can be better aligned with multiple objectives.
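The framework is described only in outline; one way to make the explicit trade-off comparison concrete is to enumerate portfolios of management options under a budget and score each portfolio against both objectives. A toy sketch, where every option name, cost, and score is invented for illustration:

```python
from itertools import combinations

# Hypothetical options: (name, cost, asset protection gain, biodiversity impact)
options = [
    ("prescribed burn",     40, 0.6, -0.3),
    ("mechanical thinning", 30, 0.5, -0.1),
    ("fuel break",          50, 0.7, -0.5),
    ("ignition patrols",    20, 0.3,  0.0),
]

def best_portfolio(options, budget, w_asset=0.5, w_bio=0.5):
    """Enumerate all subsets of options, discard those over budget, and
    rank the rest by a weighted sum of the two objectives."""
    best, best_score = (), float("-inf")
    for r in range(len(options) + 1):
        for combo in combinations(options, r):
            cost = sum(o[1] for o in combo)
            if cost > budget:
                continue
            score = sum(w_asset * o[2] + w_bio * o[3] for o in combo)
            if score > best_score:
                best, best_score = combo, score
    return [o[0] for o in best], best_score

names, score = best_portfolio(options, budget=60)
```

With a budget of 60 and equal weights, the search prefers the cheaper, more biodiversity-neutral combination over the single most protective option, which is exactly the kind of trade-off the paper argues should be made explicit rather than left implicit in practice.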

Relevance:

30.00%

Publisher:

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results. There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprised substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it - and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only ‘factor’ that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand. Value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition.
Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier, firms’ pricing decisions, the neglect of money and credit, and perfectly rational agents with infinite computational capacity. Two critical areas stand out. First, the underappreciated Sonnenschein-Mantel-Debreu results showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. Second, in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on. The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network.
Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right then that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
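The paper outlines rather than specifies the ABM; the core bargaining step it describes, where market power determines how the surplus between buyer and seller is split, can be sketched as follows (the class, function, and parameters are hypothetical, not the authors' model):

```python
class Agent:
    """Hypothetical bargaining agent."""
    def __init__(self, reservation, power):
        self.reservation = reservation   # buyer: max price; seller: min price
        self.power = power               # bargaining power, > 0

def bargain(buyer, seller):
    """One bargaining step: if the buyer values the good above the
    seller's minimum, the surplus is split in proportion to relative
    market power, fixing the transaction price. Returns None when no
    mutually acceptable price exists."""
    surplus = buyer.reservation - seller.reservation
    if surplus <= 0:
        return None
    seller_share = seller.power / (seller.power + buyer.power)
    return seller.reservation + seller_share * surplus

# Equal power splits the surplus evenly: price midway between reservations.
price = bargain(Agent(10.0, 0.5), Agent(6.0, 0.5))   # -> 8.0
```

Shifting power toward the seller moves the price toward the buyer's maximum, which is how a model of this shape lets market power, and its potential abuse, directly determine prices and the distribution of wealth.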

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we identify the necessary capabilities of the IT function to create agility in existing information systems. Agility is the ability to quickly sense and respond to environmental perturbations. We contrast the agility perspective from a widely used industry framework with research perspectives on agility in the IS literature. We suggest Beer’s Viable System Model (VSM) is a useful meta-level theory to house agility elements from the IS research literature, and apply VSM principles to identify the capabilities required of the IT function. Indeed, by means of a survey of 34 organisations, we confirm that the meta-level theory better correlates with reported agility measures than existing practice measures do on their own. From a research perspective, the incorporation of the VSM mechanism helps to explain ‘why’ the IT function is capable of creating agility. From a practical perspective of ‘how’, the findings point to a new set of capabilities of the IT function for future versions of the industry frameworks to enable agility.

Relevance:

30.00%

Publisher:

Abstract:

This research identifies how the IT function can create agility in existing information systems. Agility is the capability to quickly sense and respond to environmental perturbations. We contrasted the perspective on agility in a widely used industry framework with that of the IS research literature. Beer’s Viable System Model was a useful meta-level theory to house agility elements from IS research, and it introduced cybernetic principles to identify the processes required of the IT function. Indeed, our surveys of 70 organizations confirmed that the applied theory better correlates with reported agility than does existing industry best practice.

The research conducted two quantitative surveys to test the applied theory. The first survey mailed a Likert-type questionnaire to the clients of an Australian IT consultancy. The second survey invited international members of professional interest groups to complete a web-based questionnaire. The responses were analyzed using partial-least-squares modeling. The data analysis positively correlated the maturity of the IT function processes prescribed by the VSM with the likelihood of agility in existing information systems. We claim our findings generalize to other large organizations in OECD member countries.

The research offers an agility-capability model of the IT function to explain and predict agility in existing information systems. A further contribution is to improve industry ‘best practice’ frameworks by prescribing processes of the IT function to develop in maturity.

Relevance:

30.00%

Publisher:

Abstract:

The previous decade in Australia has been well known for the methodological wars, which culminated in a modest increase in interpretivist research. However, critical theory and postmodernist methodologies have not been taken up with any great enthusiasm, either in Australia or elsewhere. This paper first outlines the practical and historical reasons, and the theoretical difficulties, behind the failure of these paradigms in contemporary IS. Secondly, it identifies why IS must embrace these paradigms, and the emerging postmodernist approaches in particular, in light of the rise of ubiquitous information systems. The paper proposes a new set of research questions for the discipline.

Relevance:

30.00%

Publisher:

Abstract:

During the image formation process of the camera, explicit 3D information about the scene or objects in the scene is lost. Therefore, 3D structure or depth information has to be inferred implicitly from the 2D intensity images. This problem is commonly referred to as 3D reconstruction. In this work a complete 3D reconstruction algorithm is presented, capable of reconstructing dimensionally accurate 3D models of objects, based on stereo vision and multi-resolution analysis. The developed system uses a reference depth model of the objects under observation to improve the estimated disparity maps. Only a few features are extracted from that reference model: the relative locations of the discontinuities and the z-dimensional extremes of the objects' depth. The maximum error deviation of the estimated depth is less than 0.5 mm along the surfaces and less than 1.5 mm along the discontinuities. The developed system is invariant to illumination variations and to the orientation, location and scaling of the objects under consideration, which makes it highly robust.
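The abstract does not reproduce the geometry, but any stereo system of this kind ultimately rests on the standard pinhole relation Z = fB/d between disparity and depth. A minimal sketch (the function name and the parameter values used below are illustrative, not from the paper):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo relation Z = f * B / d: depth equals focal length
    (in pixels) times camera baseline (in metres) divided by disparity
    (in pixels). Zero or negative disparity maps to infinite depth."""
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, focal_px * baseline_m / d, np.inf)

# Example: a 100 px disparity with a 700 px focal length and 12 cm
# baseline corresponds to a depth of 0.84 m.
z = depth_from_disparity(100.0, 700.0, 0.12)
```

Because depth error under this relation grows roughly with the square of distance (dZ ≈ Z²/(fB) per pixel of disparity error), sub-millimetre accuracy of the kind reported here is only achievable at close range with very accurate disparity estimates, which is where the reference depth model helps.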

Relevance:

30.00%

Publisher:

Abstract:

Purpose – Accounting and water industry experts are developing general-purpose water accounting (GPWA) to report information about water and rights to water. The system has the potential to affect water policies, pricing and management, and investment and other decisions that are affected by GPWA report users' understanding of water risks faced by an entity. It may also affect financial returns to accounting and auditing firms and firms in water industries. In this paper the authors aim to examine the roles of the accounting profession, water industries and other stakeholders in governing GPWA. Recognising that the fate of GPWA depends partly upon regulatory power and economics, they seek to apply regulatory theories that explain financial accounting standards development to speculate about the national and international future of GPWA.

Design/methodology/approach – Official documents, internal Water Accounting Standards Board documents and unstructured interviews underpin the authors' analysis.

Findings – The authors speculate about the benefits that might accrue to various stakeholder groups from capturing the GPWA standard-setting process. They also suggest that internationally, water industries may dominate early GPWA standards development in the public interest and that regulatory capture by accounting or water industry professionals will not necessarily conflict with public interest benefits.

Practical implications – Accounting for water can affect allocations of environmental, economic, social and other resources; also, accounting and water industry professional standing and revenues. In this paper the authors identify factors influencing GPWA standards and standard-setting institutional arrangements, and thereby these resource allocations. The paper generates an awareness of GPWA's emergence and practical implications.

Originality/value – This is an early study to investigate water accounting standard-setting regulatory influences and their impact.

Relevance:

30.00%

Publisher:

Abstract:

Relationships of authority and control and their effect on information systems actors have interested IS researchers since at least the 1980s. The study of power itself has also troubled organisational and sociological theorists, from whom information systems researchers have drawn various lines of attack. Our approach to power rests on a historical synchronic theory that seeks to uncover the places and operation of power through an examination of narrative ‘testaments’, which are analysed not from the perspective of the giving individual but from the structural elements of discourse that they may represent. This paper complements previous research methods on the topic of power, especially in expert reports and systems development methodologies; provides specific guidance on how to apply the notion of discourse synchronically; and reconstructs the commercial practice of information systems, not as a broad church, but as one of competing and epistemologically incommensurate discourses, where the fates of the powerful are balanced against the fearful and silent disciplined.

Relevance:

30.00%

Publisher:

Abstract:

This research considers information systems development (ISD) projects as complex adaptive systems. We investigate whether complex adaptive systems (CAS) theory is relevant as a theoretical foundation for understanding ISD and, if so, what kind of understanding can be achieved by utilizing the theory. We introduce key concepts of CAS theory such as interaction, emergence, interconnected autonomous agents, self-organization, co-evolution, poise at the edge of chaos, time pacing, and poise at the edge of time to analyse and understand ISD in practice. We demonstrate the strength of such a CAS approach through the presentation and analysis of an empirical case study. While our work contributes to a complexity theory of ISD, the case examination also provides practical advice derived from this perspective for coping successfully with complexity in ISD in an adaptive manner.

Relevance:

30.00%

Publisher:

Abstract:

Existing distinctions between macro and micro approaches have been jeopardising the advance of Information Systems (IS) research. Each approach has been criticized for explaining one level while neglecting the other; the current situation therefore calls for multilevel research to address these deficiencies. Instead of studying a single level (macro or micro), multilevel research entails more than one level of conceptualization and analysis simultaneously. As the notion of multilevel is borrowed from reference disciplines, confusions and inconsistencies exist within the IS discipline, which hinder the adoption of multilevel research. This paper argues for the potential value of multilevel research by investigating its current application status within the IS domain. A content analysis of multilevel research articles from major IS conferences and journals is presented. The results suggest that IS scholars have applied multilevel research to produce high-quality work across a variety of topics. However, researchers have not been defining “multilevel” consistently, leading to idiosyncratic meanings of multilevel research based, most often, on authors’ own interpretations. We argue that a rigorous definition of “multilevel research” needs to be explicated for consistency in the research community.

Relevance:

30.00%

Publisher:

Abstract:

Locating information systems education foursquare within rigorous and substantial educational research is crucial if the discipline is to receive the scholarly attention that it warrants. One way to do that is to highlight how current information systems course design in Australian undergraduate and postgraduate programs exhibits the strongest possible elements of contemporary learning theories. This paper analyses selected features of the design of an information systems postgraduate course in an Australian university, including the use of peer review of journal entries and writing professional reports to enhance authentic learning and maximise quality assessment design. The analysis is framed by the principles of instructional design theory (Snyder 2009). The authors argue that, by demonstrating theoretically grounded and effective educational practice, the course highlights the value of being located in wider educational research, and of bringing the two fields more closely together.