960 results for European copyright code
Abstract:
The proliferation of innovative schemes to address climate change at international, national and local levels signals a fundamental shift in the priority and role of the natural environment for society, organizations and individuals. This shift in shared priorities invites academics and practitioners to consider the role of institutions in shaping and constraining responses to climate change at multiple levels of organizations and society. Institutional theory provides an approach to conceptualising and addressing climate change challenges by focusing on the central logics that guide society, organizations and individuals and their material and symbolic relationship to the environment. For example, framing a response to climate change in the form of an emissions trading scheme evidences a practice informed by a capitalist market logic (Friedland and Alford 1991). However, not all responses need necessarily align with a market logic. Indeed, Thornton (2004) identifies six broad societal sectors, each with its own logic (markets, corporations, professions, states, families, religions). Hence, understanding the logics that underpin successful (and unsuccessful) climate change initiatives contributes to revealing how institutions shape and constrain practices, and provides valuable insights for policy makers and organizations. This paper develops models and propositions to consider the construction of, and challenges to, climate change initiatives based on institutional logics (Thornton and Ocasio 2008). We propose that the challenge of understanding and explaining how climate change initiatives are successfully adopted be examined in terms of their institutional logics, and how these logics evolve over time. To achieve this, a multi-level framework of analysis that encompasses society, organizations and individuals is necessary (Friedland and Alford 1991). However, to date most studies of institutional logics have tended to emphasize one level over the others (Thornton and Ocasio 2008: 104). In addition, existing studies related to climate change initiatives have largely been descriptive (e.g. Braun 2008) or prescriptive (e.g. Boiral 2006) in terms of the suitability of particular practices. This paper contributes to the literature on logics in two ways. First, it examines multiple levels: the proliferation of the climate change agenda provides a site in which to study how institutional logics play out across multiple, yet embedded, levels within society through the institutional forums in which change takes place. Second, the paper specifically examines how institutional logics provide society with organising principles (material practices and symbolic constructions) which enable and constrain actions and help define motives and identity. Based on this model, we develop a series of propositions about the conditions required for the successful introduction of climate change initiatives. The paper proceeds as follows. We first review the literature on institutional logics and develop a generic model of the process by which institutional logics operate. We then consider how this model applies to key initiatives related to climate change. Finally, we develop a series of propositions to guide insights into the successful implementation of climate change practices.
Networks in the shadow of markets and hierarchies: calling the shots in the visual effects industry
Abstract:
The nature and organisation of the creative industries and the creative economy have received increased attention in recent academic and policy literatures (Florida 2002; Grabher 2002; Scott 2006a). Constituted as one variant on new economy narratives, creativity, alongside knowledge, has been presented as a key competitive asset. Such industries, ranging from advertising to film and new media, are seen not merely as expanding their scale and scope, but as leading-edge proponents of a more general trend towards new forms of organization and economic coordination (Davis and Scase 2000). The idea of network forms (and the consequent displacement of markets and hierarchies) has been at the heart of attempts to differentiate the field economically and spatially. Across both the discussion of production models and of work/employment relations runs the assertion of the enhanced importance of trust and non-market relations in coordinating structures and practices. This reflects an influential view in the sociological, management, geography and other literatures that social life is ‘intrinsically networked’ (Sunley 2008: 12) and that we can confidently use the term ‘network society’ to describe contemporary structures and practices (Castells 1996). Our paper is sceptical of the conceptual and empirical foundations of such arguments. We draw on a number of theoretical resources, including institutional theory, global value chain analysis and labour process theory (see Smith and McKinlay 2009), to explore how a more realistic and grounded analysis of the nature of, and limits to, networks can be articulated. Given space constraints, we cannot address all the dimensions of network arguments or evidence. Our focus is on inter- and intra-firm relations, and draws on research into a particular creative industry, visual effects, which is a relatively new though increasingly important global production network. Through this examination a different model of the creative industries and creative work emerges: one in which market rules and patterns of hierarchical interaction structure the behaviour of economic actors and remain a central focus of analysis. The next section outlines and unpacks in more detail arguments concerning the role and significance of networks, markets and hierarchies in production models and work organisation in creative industries and the ‘creative economy’.
Abstract:
This report is the primary output of Project 4: Copyright and Intellectual Property, the aim of which was to produce a report considering how greater access to and use of government information could be achieved within the scope of current copyright law. In our submission for Project 4, we undertook to address:
• the policy rationales underlying copyright and how they apply in the context of materials owned, held and used by government;
• the recommendations of the Copyright Law Review Committee (CLRC) in its 2005 report on Crown copyright;
• the legislative and regulatory barriers to information sharing in key domains, including where legal impediments such as copyright have been relied upon (whether rightly or wrongly) to justify a refusal to provide access to government data;
• copyright licensing models appropriate to government materials, and examples of licensing initiatives in Australia and other relevant jurisdictions; and
• issues specific to the galleries, libraries, archives and museums (“GLAM”) sector, including management of copyright in legacy materials and “orphan” works.
In addressing these areas, we analysed the submissions received in response to the Government 2.0 Taskforce Issues Paper, consulted with members of the Taskforce as well as several key stakeholders, and considered the comments posted on the Taskforce’s blog. This Project Report sets out our findings on the above issues. It puts forward recommendations for consideration by the Government 2.0 Taskforce on steps that can be taken to ensure that copyright and intellectual property promote access to and use of government information.
Abstract:
Process modeling grammars are used by analysts to describe information systems domains in terms of the business operations an organization conducts. While prior research has examined the factors that lead to continued usage behavior, little is known about the extent to which characteristics of the users of process modeling grammars inform usage behavior. In this study, a theoretical model is advanced that incorporates determinants of continued usage behavior as well as key antecedent individual difference factors of the grammar users, such as modeling experience, modeling background and perceived grammar familiarity. Findings from a global survey of 529 grammar users support the hypothesized relationships of the model. The study offers three central contributions. First, it provides a validated theoretical model of post-adoptive modeling grammar usage intentions. Second, it discusses the effects of individual difference factors of grammar users in the context of modeling grammar usage. Third, it provides implications for research and practice.
Abstract:
First-degree relatives of men with prostate cancer have a higher risk of being diagnosed with prostate cancer than men without a family history. The present review examines the prevalence and predictors of testing in first-degree relatives, perceptions of risk, prostate cancer knowledge and the psychological consequences of screening. The Medline, PsycInfo and Cinahl databases were searched for articles examining the risk perceptions or screening practices of first-degree relatives of men with prostate cancer for the period 1990 to August 2007. Eighteen studies were eligible for inclusion. First-degree relatives participated in prostate-specific antigen (PSA) testing more often, and perceived their risk of prostate cancer to be higher, than men without a family history. Family history factors (e.g. being an unaffected son rather than an unaffected brother) were consistent predictors of PSA testing. Studies were characterized by sampling biases and a lack of longitudinal assessments. Prospective, longitudinal assessments with well-validated and comprehensive measures are needed to identify the factors that cue the uptake of screening and, from this, to develop an evidence base for decision support. Men with a family history may benefit from targeted communication about the risks and benefits of prostate cancer testing that responds to the implications of their heightened risk.
Abstract:
This paper presents the effects of end-windings on shaft voltage in AC generators. A variety of design parameters have been considered in calculating the parasitic capacitive couplings in the machine structure, using finite element simulations and mathematical calculations. End-winding capacitances have also been calculated to provide a precise estimate of shaft voltage and its relationship with design parameters in AC generators.
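As a rough illustration of how such parasitic couplings translate into a shaft voltage, the sketch below treats the machine as a common-mode capacitive voltage divider. The lumped-element model and all capacitance values are illustrative assumptions, not parameters from the paper:

```python
# Illustrative sketch: shaft voltage estimated from a capacitive
# voltage divider formed by parasitic couplings in an AC machine.
# The lumped model and all values are assumptions for illustration,
# not parameters taken from the paper.

def shaft_voltage(v_cm, c_wr, c_rf, c_b1, c_b2):
    """Shaft (rotor) voltage produced by a common-mode source v_cm.

    c_wr : winding-to-rotor capacitance (F)
    c_rf : rotor-to-frame capacitance (F)
    c_b1, c_b2 : drive-end / non-drive-end bearing capacitances (F)
    """
    # The rotor node couples to the source through c_wr, while the
    # remaining capacitances sit in parallel to the grounded frame.
    return v_cm * c_wr / (c_wr + c_rf + c_b1 + c_b2)

# Example with hypothetical picofarad-scale values.
v = shaft_voltage(v_cm=100.0, c_wr=100e-12, c_rf=1000e-12,
                  c_b1=200e-12, c_b2=200e-12)
print(f"Estimated shaft voltage: {v:.1f} V")  # ~6.7 V
```

In a divider model of this kind, enlarging the end-winding increases the winding-to-rotor coupling and hence the fraction of the common-mode voltage appearing on the shaft, which is why end-winding capacitances matter for the estimate.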
Abstract:
The Restrung New Chamber Festival was a practice-led research project which explored the intricacies of musical relationships. Specifically, it investigated the relationships between new music ensembles and pop-oriented bands inspired by the new music genre. The festival, held at the Brisbane Powerhouse (28 February to 2 March 2009), comprised 17 diverse groups including the Brodsky Quartet, Topology, Wood, Fourplay and CODA. Restrung used a new and distinctive model which presented new music and syncretic musical genres within an immersive environment. Restrung brought together approaches used in both contemporary classical and popular music festivals, using musical, visual and spatial aspects to engage audiences. Interactivity was encouraged through video and sound installations, workshops and forums. This paper investigates some of the issues surrounding the conception and design of the Restrung model, within the context of an overview of European new music trends. It includes a discussion of curating such an event in a musically sensitive and effective way, and of approaches to identifying new and receptive audiences. As a guide to programming Restrung, I formulated a working definition of new music, further developed through interviews with specialists in Australia and Europe; this definition is outlined below.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. AE is potentially more efficient than a two-step process in which confidentiality is provided by encrypting the message and integrity protection is provided in a separate pass by generating a Message Authentication Code (MAC) tag. This paper presents results of the analysis of three AE stream ciphers submitted to the recently completed eSTREAM competition. We classify the ciphers based on the methods they use to provide authenticated encryption, and discuss possible methods for mounting attacks on these ciphers.
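To make the one-pass versus two-pass distinction concrete, the sketch below contrasts the two constructions using Python's `cryptography` package. The eSTREAM ciphers analysed in the paper are not available in standard libraries, so AES-GCM and an AES-CTR plus HMAC composition stand in purely to illustrate the structural difference:

```python
# Illustrative only: AES-GCM and AES-CTR+HMAC stand in for the
# eSTREAM AE stream ciphers, which standard libraries do not ship.
import os
from cryptography.hazmat.primitives import hashes, hmac
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

message = b"attack at dawn"

# One-pass authenticated encryption: a single primitive produces
# the ciphertext and the integrity tag together.
key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ct_and_tag = AESGCM(key).encrypt(nonce, message, None)
assert AESGCM(key).decrypt(nonce, ct_and_tag, None) == message  # raises if tampered

# Two-step generic composition (Encrypt-then-MAC): encrypt first,
# then MAC the ciphertext in a separate pass.
enc_key, mac_key = os.urandom(16), os.urandom(32)
ctr_nonce = os.urandom(16)
encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(ctr_nonce)).encryptor()
ciphertext = encryptor.update(message) + encryptor.finalize()
mac = hmac.HMAC(mac_key, hashes.SHA256())
mac.update(ciphertext)
tag = mac.finalize()  # receiver verifies the tag before decrypting
```

The efficiency argument in the abstract follows from the structure: the two-step composition processes the message data twice (once to encrypt, once to authenticate), whereas a dedicated AE design can compute both outputs in a single pass.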
Abstract:
Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc. The geometric and dosimetric accuracy of CTCombine’s output has been assessed by simulating simple and complex treatments applied to a rotated planar phantom and a rotated humanoid phantom and comparing the resulting virtual EPID images with the images acquired using experimental measurements and independent simulations of equivalent phantoms. It is expected that CTCombine will be useful for Monte Carlo studies of EPID dosimetry as well as other EPID imaging applications.
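The two core operations the abstract attributes to CTCombine, rotating a CT data volume and converting CT numbers to mass densities, can be sketched as follows. This is not CTCombine's actual implementation; the rotation call and the calibration points in the density ramp are illustrative assumptions:

```python
# Illustrative sketch of two operations described in the abstract:
# rotating a CT volume and mapping CT numbers (HU) to mass densities.
# The calibration points and interpolation are assumptions, not
# CTCombine internals.
import numpy as np
from scipy import ndimage

def rotate_ct_volume(volume, beam_angle_deg):
    """Rotate a CT volume in the axial plane (assumed axes 0 and 1)."""
    return ndimage.rotate(volume, beam_angle_deg, axes=(0, 1),
                          reshape=False, order=1, mode="nearest")

def hu_to_density(hu):
    """Piecewise-linear CT-number-to-density ramp (illustrative)."""
    hu_points = [-1000.0, 0.0, 1000.0, 3000.0]   # air, water, bone-like
    rho_points = [0.001, 1.0, 1.6, 2.8]          # g/cm^3 (assumed)
    return np.interp(hu, hu_points, rho_points)

# Hypothetical volume and 45-degree beam angle for demonstration.
ct = np.random.randint(-1000, 2000, size=(64, 64, 32)).astype(float)
densities = hu_to_density(rotate_ct_volume(ct, beam_angle_deg=45.0))
```

Rotating the CT data rather than the simulated beam is the key trick: the beam and EPID geometry stay fixed in the DOSXYZnrc coordinate system, so non-zero beam angles no longer pose a problem.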
Abstract:
There is wide agreement that in order to manage the increasingly complex and uncertain tasks of business, government and community, organizations can no longer operate in supreme isolation, but must develop a more networked approach. Networks are not ‘business as usual’. Of particular note is what has been referred to as collaborative networks. Collaborative networks now constitute a significant part of our institutional infrastructure. A key driver for the proliferation of these multiorganizational arrangements is their ability to facilitate the learning and knowledge necessary to survive or to respond to increasingly complex social issues. In this regard the emphasis is on the importance of learning in networks. Learning applies to networks in two different ways, both referring to the kinds of learning that occur as part of the interactive processes of networks. This paper looks at the importance of these two kinds of learning in collaborative networks. The first kind of learning relates to networks as learning networks or communities of practice. In learning networks, people exchange ideas with each other and bring this new knowledge back for use in their own organizations. The second kind of learning is referred to as network learning: how people in collaborative networks learn new ways of communicating and behaving with each other. Network learning has been described as transformational, in that it leads to major systems changes and innovation. All networks need to operate as learning networks in order to be effective; collaborative networks, however, must also engage in network learning. In addition to these two kinds of learning in collaborative networks, this paper also focuses on the importance of how we learn about collaborative networks. Maximizing the benefits of working through collaborative networks depends on understanding their unique characteristics and how these impact on their operation. This requires a new look at how we specifically teach about collaborative networks, and how this is similar to and/or different from how we currently teach about interorganizational relations.
Abstract:
ANDS Guides (http://ands.org.au/guides/index.html). These guides provide information about ANDS services and some fundamental issues in data-intensive research and research data management. These are not rules, prescriptions or proscriptions. They are guidelines and checklists to inform and broaden the range of possibilities for researchers, data managers, and research organisations.
Abstract:
One of the classic forms of intermediate representation used for communication between compiler front-ends and back-ends is that based on abstract stack machines. It is possible to compile the stack machine instructions into machine code by means of an interpretive code generator, or to simulate the stack machine at runtime using an interpreter. This paper describes an approach intermediate between these two extremes. The front-end for a commercial Modula 2 compiler was ported to the "industry standard PC", and a partially compiling back-end was written. The object code runs with the assistance of an interpreter, but may be linked with libraries which are fully compiled. The intent was to provide a programming environment on the PC identical to that of the same compilers on 32-bit UNIX machines. This objective has been met, and the compiler is available to educational institutions as freeware. The design basis of the new compiler is described, and its performance critically evaluated.
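As background for readers unfamiliar with the technique, the minimal sketch below shows what simulating an abstract stack machine with an interpreter looks like. The instruction set is a toy invented for illustration; it is not the intermediate form used by the Modula 2 compiler described in the paper:

```python
# Minimal toy stack-machine interpreter, for illustration only.
# The opcodes are invented and are not the Modula 2 compiler's
# actual intermediate representation.

def run(program):
    stack = []
    for op, *args in program:
        if op == "PUSH":           # push an immediate value
            stack.append(args[0])
        elif op == "ADD":          # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":          # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":        # pop and print the top of stack
            print(stack.pop())
        else:
            raise ValueError(f"unknown opcode {op}")

# Computes and prints (2 + 3) * 4 = 20.
run([("PUSH", 2), ("PUSH", 3), ("ADD",),
     ("PUSH", 4), ("MUL",), ("PRINT",)])
```

An interpretive code generator, the other extreme named above, would walk the same instruction stream once and emit native machine code for each opcode instead of executing it; the paper's approach sits between the two.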
Abstract:
Multi-storey buildings are highly vulnerable to terrorist bombing attacks in various parts of the world. Large numbers of casualties and extensive property damage result not only from blast overpressure, but also from the failure of structural components. Understanding the blast response and damage consequences of reinforced concrete (RC) building frames is therefore important when assessing multi-storey buildings designed to resist normal gravity loads. However, limited research has been conducted to identify the blast response and damage of RC frames in order to assess the vulnerability of entire buildings. This paper discusses the blast response and damage evaluation of a three-dimensional (3D) RC rigid frame under potential blast load scenarios. Explicit finite element modelling and analysis under time-history blast pressure loads were carried out using the LS-DYNA code. A complete 3D RC frame was developed with relevant reinforcement details and material models incorporating strain rate effects. Idealised triangular blast pressures calculated from standard manuals were applied to the front face of the model in the present investigation. The analysis results show the blast response in terms of displacements and material yielding of the structural elements in the RC frame. The level of damage is evaluated and classified according to the selected load case scenarios. Residual load carrying capacities are evaluated and the level of damage is presented using the defined damage indices. This information is necessary to determine the vulnerability of existing multi-storey buildings with RC frames and to identify the level of damage under typical external explosion environments. It also provides basic guidance for the design of new buildings to resist blast loads.
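The idealised triangular blast pressure history mentioned above has a simple closed form: the overpressure decays linearly from its peak to zero over the positive phase duration. The sketch below uses hypothetical peak pressure and duration values, not the load cases from the paper:

```python
# Idealised triangular blast pressure history (illustrative values).
# P(t) = P_peak * (1 - t / t_d) for 0 <= t <= t_d, and 0 otherwise.
import numpy as np

P_PEAK = 500e3   # peak reflected overpressure in Pa (assumed)
T_D = 0.01       # positive phase duration in s (assumed)

def blast_pressure(t):
    """Triangular pressure pulse evaluated at time t (seconds)."""
    return np.where((t >= 0) & (t <= T_D),
                    P_PEAK * (1 - t / T_D), 0.0)

# Sample the pulse over twice the positive phase duration.
t = np.linspace(0.0, 0.02, 201)
p = blast_pressure(t)

# The impulse (area under the triangle) is P_peak * t_d / 2.
impulse = 0.5 * P_PEAK * T_D
print(f"Impulse: {impulse:.0f} Pa*s")  # 2500 Pa*s
```

A time history of this form, applied as a pressure load on the front face of the frame, is what drives the explicit dynamic analysis described in the abstract.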
Abstract:
The portability and runtime safety of programs executed on the Java Virtual Machine (JVM) make the JVM an attractive target for compilers of languages other than Java. Unfortunately, the JVM was designed with the Java language in mind, and lacks many of the primitives required for a straightforward implementation of other languages. Here, we discuss how the JVM may be used to implement other object-oriented languages. As a practical example of the possibilities, we report on a comprehensive case study. The open source Gardens Point Component Pascal compiler compiles the entire Component Pascal language, a dialect of Oberon-2, to JVM bytecodes. This compiler achieves runtime efficiencies comparable to those of native-code implementations of procedural languages.
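To illustrate the kind of mapping involved, the sketch below lowers a simple assignment from a procedural language into a JVM-style stack-code sequence. The mnemonics follow real JVM instructions, but the translator is a toy invented for illustration, not the Gardens Point Component Pascal back-end:

```python
# Toy lowering of `x := a + b * 2` into JVM-style stack code.
# Illustrative only: not the Gardens Point Component Pascal compiler.

local_slots = {"a": 0, "b": 1, "x": 2}   # local-variable slots (assumed)

def lower_assignment():
    # The JVM is a stack machine: operands are pushed, and operators
    # pop their arguments and push the result.
    return [
        f"iload_{local_slots['a']}",   # push local a
        f"iload_{local_slots['b']}",   # push local b
        "iconst_2",                    # push the constant 2
        "imul",                        # b * 2
        "iadd",                        # a + (b * 2)
        f"istore_{local_slots['x']}",  # pop the result into local x
    ]

print("\n".join(lower_assignment()))
```

Arithmetic and control flow map onto the JVM this directly; the difficulties noted in the abstract arise with features the JVM lacks primitives for, which a compiler must encode indirectly through classes and methods.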