87 results for Odyssey Stand Alone
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models offer little insight into the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to be validated against data acquired at the same sites, so that their outputs are truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. The models also needed to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models to a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled: some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations, merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams. On-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps downstream of signalised intersections and those downstream of unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
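The headway and capacity relationships described above can be sketched numerically. The following Python sketch is illustrative only: the bunching calibration (alpha = 1 − Δq) and the demand flow are assumptions, while the 1 s minimum headway and 1–1.2 s follow-on times echo the figures quoted above. It uses the standard M3 gap-acceptance capacity formula rather than the limited-priority variant established by Troutbeck (1995).

```python
import math

def m3_survival(t, q, delta=1.0, alpha=None):
    """P(headway > t) under Cowan's M3: flow q (veh/s), minimum headway
    delta (s), and a proportion alpha of free (non-bunched) vehicles with
    shifted-exponential headways. Requires q * delta < 1."""
    if alpha is None:
        alpha = max(0.0, 1.0 - delta * q)  # assumed linear bunching model
    lam = alpha * q / (1.0 - delta * q)
    if t < delta:
        return 1.0  # no headway can be shorter than the minimum
    return alpha * math.exp(-lam * (t - delta))

def absorption_capacity(q, t_c, t_f, delta=1.0, alpha=None):
    """Minor-stream (merge) capacity in veh/s for critical gap t_c and
    follow-on time t_f, via the standard M3 gap-acceptance formula."""
    if alpha is None:
        alpha = max(0.0, 1.0 - delta * q)
    lam = alpha * q / (1.0 - delta * q)
    return (q * alpha * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

q = 1200 / 3600.0  # assumed kerb-lane (major stream) flow, veh/s
cap_vph = absorption_capacity(q, t_c=2.0, t_f=1.2) * 3600.0
```

With these assumed inputs the sketch predicts a merge capacity of roughly 1700 veh/h; a critical gap between t_f and t_f + 1 s, as calibrated in the study, keeps the prediction within the bounds described above.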
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and provide further insight into the nature of operations.
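The distribution of distances required to merge can be illustrated with a small Monte Carlo sketch: headways are sampled from Cowan's M3, a merging driver rejects gaps shorter than the critical gap, and the accumulated waiting time is converted to distance using the design speed. All parameter values below (flow, critical gap, an 80 km/h design speed) are assumptions for illustration, and this independent-gap scan is a simplification of the study's simulation model.

```python
import random

def m3_headway(q, delta=1.0, alpha=None, rng=random):
    """Sample one headway from Cowan's M3: with probability (1 - alpha) a
    bunched headway of exactly delta, else delta plus an exponential tail."""
    if alpha is None:
        alpha = max(0.0, 1.0 - delta * q)  # assumed linear bunching model
    lam = alpha * q / (1.0 - delta * q)
    if rng.random() >= alpha:
        return delta
    return delta + rng.expovariate(lam)

def merge_distance(q, t_c, speed_ms, rng=random):
    """Distance travelled along the taper before a gap >= t_c appears,
    assuming the merging driver holds the design speed while waiting."""
    delay = 0.0
    h = m3_headway(q, rng=rng)
    while h < t_c:       # reject gaps below the critical gap
        delay += h
        h = m3_headway(q, rng=rng)
    return speed_ms * delay

random.seed(1)
q = 1200 / 3600.0                      # assumed major-stream flow, veh/s
dists = [merge_distance(q, t_c=2.0, speed_ms=22.2) for _ in range(5000)]
p_within_100m = sum(d <= 100.0 for d in dists) / len(dists)
```

The resulting empirical distribution is exactly the kind of measurable quantity the abstract proposes for validation: for a given taper length, the fraction of sampled distances below that length estimates the merging probability.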
Abstract:
This is the first outdoor test of small-scale dye-sensitized solar cells (DSC) powering a stand-alone nanosensor node. A solar cell test station (SCTS) has been developed using standard DSC to power a gas nanosensor, a radio transmitter, and the control electronics (CE) for battery charging. The station is remotely monitored through a wired (Ethernet cable) or wireless connection (radio transmitter) in order to evaluate in real time the performance of the solar cells and devices under different weather conditions. The 408 cm² active surface module produces enough energy to power a gas nanosensor and a radio transmitter during the day and part of the night. Also, by using a programmable load we keep the system working at the maximum power point (MPP), quantifying the total energy generated and stored in a battery. These experiments provide useful data for future outdoor applications such as nanosensor networks.
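Holding a PV module at its maximum power point with a programmable load is typically done with a hill-climbing scheme such as perturb-and-observe. The sketch below is a generic illustration, not the station's actual control electronics: the toy I-V curve, open-circuit voltage, starting point and step size are all assumed values.

```python
def pv_power(v, v_oc=21.0, i_sc=0.8):
    """Toy PV model used only for illustration: current stays near the
    short-circuit value and collapses close to open-circuit voltage."""
    if v <= 0.0 or v >= v_oc:
        return 0.0
    i = i_sc * (1.0 - (v / v_oc) ** 8)
    return v * i

def track_mpp(steps=200, v=5.0, dv=0.1):
    """Perturb-and-observe: nudge the operating voltage each step and
    reverse direction whenever the measured power drops."""
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(steps):
        v += direction * dv
        p = pv_power(v)
        if p < p_prev:
            direction = -direction  # wrong way: reverse the perturbation
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = track_mpp()
```

The tracker climbs to the knee of the curve and then oscillates around it, which is the characteristic steady-state behaviour (and known drawback) of perturb-and-observe.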
Abstract:
Historically, cities as urban forms have been critical to human development. In 1950, 30% of the world's population lived in major cities. By the year 2000 this had increased to 47%, with further growth to 50% expected by the end of 2007. Projections suggest that city-based densities will edge towards 60% of the global total by 2030. Such rapidly increasing urbanisation, in both developed and developing economies, challenges options for governance and planning, as well as crisis and disaster management. A common issue for the livability of cities as urban forms through time has been access to a clean and reliable water supply. This issue is particularly important in countries with arid ecosystems, such as Australia. This paper examines preliminary aspects, and the theoretical basis, of a study into the resilience of the (potable) water supply system in Southeast Queensland (SEQ), an area with one of the most significant urban growth rates in Australia. The first stage will be to assess needs and requirements for gauging resilience characteristics of a generic water supply system, consisting of supply catchment, storage reservoir/s and treatment plant/s. The second stage will extend the analysis to examine the resilience of the SEQ water supply system, incorporating specific characteristics of the SEQ water grid, made increasingly vulnerable by climate variability and projected impacts on rainfall characteristics, and compounded by increasing demands due to population growth. Longer-term findings will inform decision-making based on the application of the concept of resilience to designing and operating stand-alone and networked water supply infrastructure systems, as well as its application to water resource systems more generally.
Abstract:
The research undertaken in these two major doctoral studies investigates the field of arts-based learning, a pedagogical approach to individual and organisational learning and development, my professional creative facilitation practice, and my development as a researcher. While the studies are stand-alone projects, they are intended to build on each other in order to tell the evolving story of my research and professional practice. The first study combines The Role of Arts-based Learning in a Creative Economy; The Need for Artistry in Professional Education: the art of knowing what to do when you don't know what to do; and Lines of Inquiry: Making Sense of Research and Professional Practice. The Role of Arts-based Learning in a Creative Economy provides an overview of the field of arts-based learning in business. The study focuses on the relevant literature and interviews with people working in the field. The paper argues that arts-based learning is a valuable addition to organisations for building a culture of creativity and innovation. The Need for Artistry in Professional Education continues that investigation. It explores the way artists approach their work and considers what skills and capabilities from artistic practice can be applied to other professions' practices. From this research the Sphere of Professional Artistry model is developed, depicting the process of moving toward professional artistry. Lines of Inquiry: making sense of research and professional practice through artful inquiry is a self-reflective study. It explores my method of inquiry as a researcher and as a creative facilitation practitioner using arts-based learning processes to facilitate groups of people for learning, development and change. It discusses how my research and professional practice influence and inspire each other, and draws on case studies.
The second major research study, Artful Inquiry: Arts-based Learning for Inquiry, Reflection and Action in Professional Practice, is a one-year practice-led inquiry. It continues the research into arts-based and aesthetic learning experiences and my arts-based facilitation practice. The research is conducted with members of a Women's Network in a large government service agency. It develops the concept of 'Artful Inquiry': a creative, holistic and embodied approach to facilitation, inquiry, learning, reflection and action. Storytelling as Inquiry is used as a methodology for understanding participants' experiences of being involved in arts-based learning experiences. The study reveals the complex and emergent nature of practice and research. It demonstrates what it can mean to do practice-led research with others, within an organisational context, and to what effect.
Abstract:
Individuals, community organisations and industry have always been involved to varying degrees in efforts to address the Queensland road toll. Traditionally, road crash prevention efforts have been led by state and local government organisations. While community and industry groups have sometimes become involved (e.g. the Driver Reviver campaign), their efforts have largely been uncoordinated and under-resourced. A common strength of these initiatives lies in the energy, enthusiasm and persistence of community-based efforts. Conversely, a weakness has sometimes been the lack of knowledge, awareness or prioritisation of evidence-based interventions, or the capacity to build on collaborative efforts. In 2000, the Queensland University of Technology's Centre for Accident Research and Road Safety – Queensland (CARRS-Q) identified this issue as an opportunity to bridge practice and research and began acknowledging a selection of these initiatives, in partnership with the RACQ, through the Queensland Road Safety Awards program. After nine years it became apparent there was a need to strengthen this connection, with the Centre establishing a Community Engagement Workshop in 2009 as part of the overall Awards program. With the aim of providing community participants opportunities to see, hear and discuss the experiences of others, this event was further developed, and with the collaboration of the Queensland Department of Transport and Main Roads, the RACQ, the Queensland Police Service and Leighton Contractors Pty Ltd, a stand-alone Queensland Road Safety Awards Community Engagement Workshop was held in 2010. Each collaborating organisation recognised a need to mobilise the community through effective information and knowledge sharing, and recognised that learning and discussion can influence lasting behaviour change and action in this often emotive, yet not always evidence-based, area.
This free event featured a number of speakers representing successful projects from around Australia and overseas. Attendees were encouraged to interact with the speakers, to ask questions and, most importantly, to forge connections with other attendees and build a 'community road safety army', all working throughout Australia on projects underpinned by evaluated research. The workshop facilitated the integration of research, policy and grass-roots action, enhancing the success of community road safety initiatives. For the collaboration partners, the event enabled them to transfer their knowledge in an engaged approach, working within a more personal communication process. An analysis of the success factors for this event identified openness to community groups and individuals, relevance of content to local initiatives, generous support through the provision of online materials, and ongoing communication with key staff members as critical. It supports the view that the university can directly provide both the leadership and the research needed for effective and credible community-based initiatives to address injury and death on the roads.
Abstract:
The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently. There are also many healthcare organisations which still have stand-alone systems, not integrated for management of information and decision-making. This shows there is a need for an effective system to capture, collate and distribute health data. Implementing the data warehouse concept in healthcare is therefore potentially one of the solutions for integrating health data. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem to be addressed is: how can data warehousing assist the decision-making process in healthcare? To address this problem the researcher narrowed the investigation to focus on a cardiac surgery unit, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, much of the time, the interaction between the cardiac surgery unit information system and other units is minimal. There is only limited, basic two-way interaction with the other clinical and administrative databases at TPCH which support decision-making processes. The aims of this research are to investigate what decision-making issues are faced by healthcare professionals with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models.
As part of the research the researcher proposed and developed a suitable data warehouse prototype based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)]. The goal is to improve the current decision-making processes. The main objectives of this research are to improve access to integrated clinical and financial data, providing potentially better information for decision-making. Based on responses to the questionnaire and on the consulted literature, the results indicate a centralised data warehouse model for the cardiac surgery unit at this stage. A centralised data warehouse model addresses current needs and can also be upgraded to an enterprise-wide warehouse model or a federated data warehouse model, as discussed in the many consulted publications. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. In the final stage, the data warehouse prototype was evaluated by collecting feedback from the end users. This was achieved by using output created from the data warehouse prototype as examples of the data desired and possible in a data warehouse environment. According to the feedback collected from the end users, implementation of a data warehouse was seen to be a useful tool to inform management options, provide a more complete representation of factors related to a decision scenario, and potentially reduce information product development time. However, many constraints existed in this research.
Examples include technical issues such as data incompatibilities and the integration of the cardiac surgery database and e-DS database servers; Queensland Health information restrictions (Queensland Health information-related policies, patient data confidentiality and ethics requirements); limited availability of support from IT technical staff; and time restrictions. These factors influenced the process of warehouse model development, necessitating an incremental approach. This highlights the presence of many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents and a single-site case report study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite this constraint, the research demonstrates that by implementing a data warehouse at the service level, decision-making is supported and data quality issues related to access and availability can be reduced, providing many benefits. Output reports produced from the data warehouse prototype demonstrated usefulness for the improvement of decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model selected can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.
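The core integration idea, joining clinical, costing and discharge data so that a single query can answer a combined clinical-and-financial question, can be sketched with a toy "fact table". All table names, columns and figures below are invented for illustration and do not reflect the actual TPCH schemas.

```python
import sqlite3

# Hypothetical, simplified tables standing in for the cardiac surgery,
# clinical costing (Transition II) and discharge summary (e-DS) sources.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE surgery   (patient_id INTEGER, procedure TEXT);
CREATE TABLE costing   (patient_id INTEGER, cost REAL);
CREATE TABLE discharge (patient_id INTEGER, los_days INTEGER);
INSERT INTO surgery   VALUES (1, 'CABG'),   (2, 'valve');
INSERT INTO costing   VALUES (1, 42000.0), (2, 55000.0);
INSERT INTO discharge VALUES (1, 7),       (2, 11);
-- A flat fact table integrating the three sources, as a centralised
-- warehouse would, so one query serves clinical plus financial needs.
CREATE TABLE fact_episode AS
SELECT s.patient_id, s.procedure, c.cost, d.los_days
FROM surgery s
JOIN costing c   ON c.patient_id = s.patient_id
JOIN discharge d ON d.patient_id = s.patient_id;
""")

# One query now combines procedure, cost and length of stay.
row = con.execute(
    "SELECT procedure, cost / los_days FROM fact_episode WHERE patient_id = 1"
).fetchone()
```

Before integration, answering "cost per day of stay for this procedure" would require manually reconciling three stand-alone databases; after the load step it is a single query against the fact table.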
Abstract:
Virtual prototyping emerges as a new technology to replace existing physical prototypes for product evaluation, which are costly and time consuming to manufacture. Virtualization technology allows engineers and ergonomists to perform virtual builds and different ergonomic analyses on a product. Digital Human Modelling (DHM) software packages such as Siemens Jack often integrate with CAD systems to provide a virtual environment which allows investigation of operator and product compatibility. Although the integration between DHM and CAD systems allows for the ergonomic analysis of anthropometric design, human musculoskeletal, multi-body modelling software packages such as the AnyBody Modelling System (AMS) are required to support physiologic design. They provide muscular force analysis, estimate human musculoskeletal strain and help address human comfort assessment. However, the independent characteristics of the modelling systems Jack and AMS constrain engineers and ergonomists in conducting a complete ergonomic analysis. AMS is a stand-alone programming system without the capability to integrate into CAD environments. Jack provides CAD-integrated human-in-the-loop capability, but without considering musculoskeletal activity. Consequently, engineers and ergonomists need to perform many redundant tasks during product and process design. Besides, the existing biomechanical model in AMS uses a simplified estimation of body proportions, based on a segment-mass-ratio-derived scaling approach. This is insufficient to represent user populations anthropometrically correctly in AMS. In addition, sub-models are derived from different sources of morphologic data and are therefore anthropometrically inconsistent. Therefore, an interface between the biomechanical AMS and the virtual human model Jack was developed to integrate a musculoskeletal simulation with Jack posture modelling.
This interface provides direct data exchange between the two man-models, based on a consistent data structure and a common body model. The study assesses the kinematic and biomechanical model characteristics of Jack and AMS, and defines an appropriate biomechanical model. The information content for interfacing the two systems is defined and a protocol is identified. The interface program is developed and implemented in Tcl and Jack script (Python), and interacts with the AMS console application to operate AMS procedures.
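The "consistent data structure and common body model" can be illustrated with a minimal sketch of the segment-mass-ratio scaling that the abstract attributes to AMS. The segment list, ratio values and record layout below are illustrative assumptions, not the actual interface format or AMS's real model.

```python
# Illustrative segment mass ratios (fractions of whole-body mass). These
# particular values are assumed for the sketch, not taken from AMS.
SEGMENT_MASS_RATIO = {
    "trunk": 0.497, "thigh": 0.100, "shank": 0.0465,
    "upper_arm": 0.028, "forearm": 0.016, "head": 0.081,
}

def scale_segments(body_mass_kg):
    """Mass-ratio scaling: each segment mass is a fixed fraction of
    whole-body mass, as in the simplified approach described above."""
    return {seg: round(r * body_mass_kg, 3)
            for seg, r in SEGMENT_MASS_RATIO.items()}

def to_interface_record(subject_id, stature_m, body_mass_kg):
    """A single hypothetical record both man-models could read, mirroring
    the interface's role of exchanging one consistent body description."""
    return {
        "subject": subject_id,
        "stature_m": stature_m,
        "segment_mass_kg": scale_segments(body_mass_kg),
    }

record = to_interface_record("P01", 1.78, 80.0)
```

Because both sides read the same record, the posture model and the musculoskeletal model stay anthropometrically consistent by construction, which is the inconsistency the interface was built to remove.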
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program's object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program's security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool's capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
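A design-level metric of the general kind described can be sketched in a few lines: given classes whose attributes are marked with a visibility and a classified (high-security) flag, compute the share of classified attributes exposed through public interfaces. The model, flags and metric name below are invented for illustration; they are not the study's actual metric definitions.

```python
# Toy object-oriented design model: each attribute is a tuple of
# (name, visibility, is_classified). Entirely hypothetical example data.
design = {
    "Account": {"attrs": [("balance", "private", True),
                          ("owner", "public", False),
                          ("pin", "public", True)]},
    "Logger": {"attrs": [("level", "public", False)]},
}

def classified_attribute_exposure(model):
    """Share of classified (high-security) attributes that are publicly
    visible; lower is better, since fewer sensitive fields leak through
    the class interfaces (an encapsulation-style security measure)."""
    classified = [vis for cls in model.values()
                  for (_name, vis, secret) in cls["attrs"] if secret]
    if not classified:
        return 0.0
    return sum(vis == "public" for vis in classified) / len(classified)
```

Because the measure is a ratio over the design model alone, two revisions of the same system can be compared directly, e.g. a refactoring that makes `pin` private would drop the exposure from 0.5 to 0.0.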
Abstract:
It is nearly 10 years since the introduction of s 299(1)(f) of the Corporations Act, which requires the disclosure of information regarding a company's environmental performance within its annual report. This provision has generated considerable debate in the years since its introduction, fundamentally between proponents of either a voluntary or a mandatory environmental reporting framework. This study examines the adequacy of the current regulatory framework. The environmental reporting practices of 24 listed companies in the resources industries are assessed relative to a standard set by the Global Reporting Initiative (GRI) Sustainability Reporting Guidelines. These Guidelines are argued to represent "international best practice" in environmental reporting, and a "scorecard" approach is used to score the quality of disclosure according to this voluntary benchmark. Larger companies in the sample tend to report environmental information over and above the level required by legislation. Some, but not all, companies present a stand-alone environmental/sustainability report. However, smaller companies provide minimal information in compliance with s 299(1)(f). The findings indicate that "international best practice" environmental reporting is unlikely to be achieved by Australian companies under the current regulatory framework. In the current regulatory environment that scrutinises s 299(1)(f), this article provides some preliminary evidence of the quality of disclosures generated in the Australian market.
Abstract:
This work focuses on the development of a stand-alone gas nanosensor node, powered by solar energy, to track concentrations of pollutant gases such as NO2, N2O and NH3. Gas sensor networks have been widely developed over recent years, but the rise of nanotechnology is allowing the creation of a new range of gas sensors [1] with higher performance, smaller size and an inexpensive manufacturing process. This work has created a gas nanosensor node prototype to evaluate the future field performance of this new generation of sensors. The sensor node has four main parts: (i) solar cells; (ii) control electronics; (iii) gas sensor and sensor board interface [2-4]; and (iv) data transmission. The station is remotely monitored through a wired (Ethernet cable) or wireless connection (radio transmitter) [5, 6] in order to evaluate, in real time, the performance of the solar cells and sensor node under different weather conditions. The energy source of the node is a module of polycrystalline silicon solar cells with 410 cm² of active surface. The prototype is equipped with a Resistance-To-Period circuit [2-4] to measure the wide range of resistances (kΩ to GΩ) from the sensor in a simple and accurate way. The system shows high performance in (i) managing the energy from the solar panel, (ii) powering the system load and (iii) recharging the battery. The results show that the prototype is suitable for work with any kind of resistive gas nanosensor and provides useful data for future nanosensor networks.
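The appeal of a resistance-to-period front end is that six decades of sensor resistance map onto an easily timed oscillation period. As a sketch, assume a generic linear law T = k·R·C; the real circuit's constant k and reference capacitor C will differ, and are stand-in values here.

```python
def resistance_from_period(t_s, c_farads=1e-9, k=2.0):
    """Invert an assumed oscillator law T = k * R * C to recover the
    sensor resistance R (ohms) from a measured period t_s (seconds).
    Both k and the 1 nF reference capacitor are illustrative values."""
    return t_s / (k * c_farads)

# With C = 1 nF, periods from microseconds to seconds span kΩ to GΩ:
r_low = resistance_from_period(2e-6)   # microsecond period -> ~1 kΩ
r_high = resistance_from_period(2.0)   # two-second period  -> ~1 GΩ
```

A single microcontroller timer can resolve both ends of that period range, which is why the period-based readout handles kΩ-to-GΩ sensors "in a simple and accurate way" where a fixed-gain analog front end could not.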
Abstract:
Australia requires decisive action on climate change and issues of sustainability. The Urban Informatics Research Lab has been funded by the Queensland State Government to conduct a three-year study (2009–2011) exploring ways to support Queensland residents in making more sustainable consumer and lifestyle choices. We conduct user-centred design research that informs the development of real-time, mobile, locational, networked information interfaces, feedback mechanisms, and persuasive and motivational approaches that in turn assist in-situ decision making and environmental awareness in everyday settings. The study aims to deliver usable and useful prototypes offering individual and collective visualisations of ecological impact, and opportunities for engagement and collaboration, in order to foster a participatory and sustainable culture of life in Australia. Raising people's awareness with environmental data and educational information does not necessarily trigger sufficient motivation to change their habits towards a more environmentally friendly and sustainable lifestyle. Our research seeks to develop a better understanding of how to go beyond merely informing, towards motivating and encouraging action and change. Drawing on participatory culture, ubiquitous computing and real-time information, the study delivers research that leads to viable new design approaches and information interfaces which will strengthen Australia's position to meet the targets of the Clean Energy Future strategy, and contribute to the sustainability of a low-carbon future in Australia. As part of this program of research, the Urban Informatics Research Lab has been invited to partner with GV Community Energy Pty Ltd on a project funded by the Victorian Government Sustainability Fund. This feasibility report specifically looks at the challenges and opportunities of energy monitoring in households in Victoria that include a PV solar installation.
The report is structured into two parts. In Part 1, we first review a range of energy monitoring solutions, both stand-alone and internet-enabled; this section primarily focusses on their technical capabilities. However, in order to understand this information and make an informed decision, it is crucial to understand the basic principles and limitations of energy monitoring, as well as the opportunities and challenges of a networked approach towards energy monitoring, which are discussed in Section 2.
Abstract:
It is certain that there will be changes in environmental conditions across the globe as a result of climate change. Such changes will require the building of biological, human and infrastructure resilience. In some instances the building of such resilience will be insufficient to deal with extreme changes in environmental conditions, and legal frameworks will be required to provide recognition and support for people dislocated because of environmental change. Such dislocation may occur internally, within the country of origin, or externally, into another State's territory. International and national legal frameworks do not currently recognise or assist people displaced as a result of environmental factors, including displacement occurring as a result of climate change. Legal frameworks developed to deal with this issue will need to consider the legal rights of those people displaced and the legal responsibilities of those countries required to respond to such displacement. The objective of this article is to identify the most suitable international institution to host a program addressing climate displacement. There are a number of areas of international law that are relevant to climate displacement, including refugee law, human rights law and international environmental law. These regimes, however, were not designed to protect people relocating as a result of environmental change. As such, while they may indirectly be of relevance to climate displacement, they currently do nothing to directly address this complex issue. In order to determine the most appropriate institution to address and regulate climate displacement, it is imperative to consider issues of governance.
This article seeks to examine this issue and determine whether it is preferable to place climate displacement programs into existing international legal frameworks, or whether it is necessary to regulate this area through an entirely new institution specifically designed to deal with the complex and cross-cutting issues surrounding the topic. Commentators in this area have proposed three different regulatory models for addressing climate displacement: (a) expand the definition of refugee under the Refugee Convention to encompass persons displaced by climate change; (b) implement a new stand-alone Climate Displacement Convention; and (c) implement a Climate Displacement Protocol to the UNFCCC. This article will examine each of these proposed models against a number of criteria to determine the model that is most likely to address the needs and requirements of people displaced by climate change. It will also identify the model that is likely to be most politically acceptable and realistic for those countries likely to attract responsibilities under its implementation. In order to assess whether the rights and needs of the people to be displaced are to be met, theories of procedural, distributive and remedial justice will be used to consider the equity of the proposed schemes. In order to consider the most politically palatable and realistic scheme, reference will be made to previous state practice and compliance with existing obligations in the area. It is suggested that the criteria identified by this article should underpin any future climate displacement instrument.
Abstract:
Securing the IT infrastructures of our modern lives is a challenging task because of their increasing complexity, scale and agile nature. Monolithic approaches, such as using stand-alone firewalls and IDS devices to protect the perimeter, cannot cope with complex malware and multi-step attacks. Collaborative security emerges as a promising approach. However, research results in collaborative security are not yet mature and require continuous evaluation and testing. In this work, we present CIDE, a Collaborative Intrusion Detection Extension for the network security simulation platform NeSSi2. Built-in functionalities include dynamic group formation based on node preferences, group-internal communication, group management and an approach for handling the infection process in malware-based attacks. The CIDE simulation environment provides functionalities for the easy implementation of collaborating nodes in large-scale setups. We evaluate the group communication mechanism and, in a case study, assess our collaborative security evaluation platform in a signature exchange scenario.
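The group-formation mechanism the abstract describes (nodes clustering by detection preference so they can exchange signatures within a group) can be illustrated with a minimal sketch. The function name, data shapes and preference labels below are illustrative assumptions, not the CIDE or NeSSi2 API:

```python
# Hypothetical sketch: cluster collaborating detection nodes by a declared
# preference (e.g. the attack class they monitor). Not the CIDE API.
from collections import defaultdict

def form_groups(nodes):
    """Group nodes by shared detection preference.

    nodes: iterable of (node_name, preference) pairs.
    Returns a mapping preference -> list of member names; each group
    can then run group-internal communication such as signature exchange.
    """
    groups = defaultdict(list)
    for name, preference in nodes:
        groups[preference].append(name)
    return dict(groups)

members = form_groups([
    ("ids-1", "worm"), ("ids-2", "ddos"),
    ("ids-3", "worm"), ("fw-1", "ddos"),
])
# members["worm"] is ["ids-1", "ids-3"]; members["ddos"] is ["ids-2", "fw-1"]
```

In a real simulation, group membership would also change dynamically as node preferences change; this static sketch shows only the initial assignment step.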
Abstract:
Despite the ubiquitous nature of the discourse on human rights, there is currently little research on the emergence of disclosure by multinational corporations on their human rights obligations, or on the regulatory dynamic that may lie behind this trend. In an attempt to begin to explore the extent to which, if any, the language of human rights has entered the discourse of corporate accountability, this paper investigates the adoption of the International Labour Organisation's (ILO) human rights standards by major multinational garment retail companies that source products from developing countries, as disclosed through their reporting media. The paper has three objectives. Firstly, to empirically explore the extent to which a group of multinational garment retailers invoke the language of human rights when disclosing their corporate responsibilities. The paper reviews corporate reporting media, including social responsibility codes of conduct, annual reports and stand-alone social responsibility reports, released by 18 major global clothing and retail companies during the period 1990 to 2007. We find that the number of companies adopting and disclosing on the ILO's workplace human rights standards has significantly increased since 1998 – the year in which the ILO's standards were endorsed and accepted by the global community (ILO, 1998). Secondly, drawing on a combination of Responsive Regulation theory and neo-institutional theory, we tentatively seek to understand the regulatory space that may have influenced these large corporations to adopt the language of human rights obligations. In particular, we study the role that International Governmental Organisations (IGOs), such as the ILO, may have played in these disclosures. Finally, we provide some critical reflections on the power and potential within the corporate adoption of the language of human rights.
Abstract:
Purpose – The purpose of this paper is to examine the environmental disclosure initiatives of Niko Resources Ltd – a Canada-based multinational oil and gas company – following the two major environmental blowouts at a gas field in Bangladesh in 2005. As part of the examination, the authors particularly focus on whether Niko's disclosure strategy was associated with public concern pertaining to the blowouts. Design/methodology/approach – The authors reviewed news articles about Niko's environmental incidents in Bangladesh and Niko's communication media, including annual reports, press releases and a stand-alone social responsibility report, over the period 2004-2007, to understand whether news media attention, as a proxy for public concern, had an impact on Niko's disclosure practices in relation to the affected local community in Bangladesh. Findings – The findings show that Niko did not provide any non-financial environmental information within its annual reports and press releases as part of its responsibility to the local community affected by the blowouts, but it did produce a stand-alone report to address the issue. However, financial environmental disclosures, such as the environmental contingent liability disclosure, were adequately provided through annual reports to meet the regulatory requirements concerning environmental prosecutions. The findings also suggest that Niko's non-financial disclosure within a stand-alone report was associated with public pressure as measured by negative media coverage of the Niko blowouts. Research limitations/implications – This paper concludes that the motive for Niko's non-financial environmental disclosure, via a stand-alone report, reflected survival considerations: the company's reaction did not suggest any real attempt to hold broader accountability for its activities in a developing country.