996 results for Open landscapes
Abstract:
For Bakhtin, it is always important to know from where one speaks. The place from which I speak is that of a person who grew up in Italy during the economic miracle (pre-1968) in a working-class family, watching film matinees on television during school holidays. All sorts of films and genres were shown: from film noir to westerns, to Jean Renoir's films, German expressionism, Italian neorealism and Italian comedy. Cinema has come to represent over time a sort of memory extension that supplements lived memory of events, and one which, especially, mediates the intersection of many cultural discourses. When later in life I moved to Australia and started teaching film studies, my choice of a film emblematic of neorealism went naturally to Roma città aperta (hereafter Open City) by Roberto Rossellini (1945), and not to Paisan, Sciuscià or Bicycle Thieves. My choice was certainly grounded in my personal memory, especially those aspects transmitted to me by my parents, who lived through the war and maintained that Open City had truly made them cry. With a mother who voted for the Christian Democratic Party and a father who was a unionist, I thought that this was normal in Italian families and society. In the early 1960s, the Resistance still offered a narrative of suffering and redemption shared by Catholics and Communists alike. This construction of psychological realism is what I believe Open City continues to offer over time.
Abstract:
Executive Summary: The objective of this report was to use the Sydney Opera House as a case study of the application of Building Information Modelling (BIM). The Sydney Opera House is a large, complex building with a very irregular configuration, which makes it a challenging test. A number of key concerns are evident at SOH:
• The building structure is complex, and building service systems, already the major cost of ongoing maintenance, are undergoing technology change, with new computer-based services becoming increasingly important.
• The current "documentation" of the facility comprises several independent systems, some overlapping, and is inadequate to support current and future services.
• The building has reached a milestone age in terms of the condition and maintainability of key public areas and service systems, the functionality of spaces, and longer-term strategic management.
• Many business functions, such as space or event management, require up-to-date information about the facility that is currently inadequately delivered, and expensive and time-consuming to update and deliver to customers.
• Major building upgrades are being planned that will put considerable strain on existing Facilities Portfolio services and their capacity to manage them effectively.
While some of these concerns are unique to the House, many will be common to larger commercial and institutional portfolios. The work described here supported a complementary task which sought to identify whether a building information model, an integrated building database, could be created that would support asset and facility management functions (see Sydney Opera House – FM Exemplar Project, Report Number: 2005-001-C-4, Building Information Modelling for FM at Sydney Opera House), a business strategy that has been well demonstrated. The development of the BIMSS (Open Specification for BIM) has been surprisingly straightforward.
The lack of technical difficulty in converting the House's existing conventions and standards to the new model-based environment can be related to three key factors:
• SOH Facilities Portfolio, the internal group responsible for asset and facility management, already has well-established building and documentation policies in place. The setting of, and adherence to, well thought out operational standards has been driven by the need to create an environment that is understood by all users and that addresses the major business needs of the House.
• The second factor is the nature of the IFC Model Specification used to define the BIM protocol. The IFC standard is based on building practice and nomenclature widely used in the construction industries across the globe. For example, the nomenclature of building parts (e.g. ifcWall) corresponds to normal terminology, while extending the traditional drawing environment currently used for design and documentation. This demonstrates that the international IFC model accurately represents local practice for building data representation and management.
• A BIM environment sets up opportunities for innovative processes that can exploit the rich data in the model and improve services and functions for the House. For example, several high-level processes have been identified that could benefit from standardised Building Information Models, such as maintenance processes using engineering data; business processes using scheduling, venue access and security data; and benchmarking processes using building performance data. The new technology matches business needs for current and new services. The adoption of IFC-compliant applications opens the way for shared building model collaboration and new processes, a significant new focus of the BIM standards.
In summary, SOH's current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. These BIM standards and their application to the Opera House are intended as a template for other organisations to adopt for their own procurement and facility management activities. Appendices provide an overview of the IFC Integrated Object Model and an introduction to understanding IFC Model Data.
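The correspondence between IFC entity names and everyday building terminology, noted above, can be sketched as a toy data model. This is an illustration only: the class, field and identifier values below are assumptions made for the sketch, not the actual IFC schema or an SOH dataset.

```python
from dataclasses import dataclass, field

# Toy sketch of IFC-style building elements: each element carries a
# unique identifier, a human-readable name, and an IFC entity type
# such as "IfcWall", mirroring how the IFC nomenclature lines up with
# everyday construction terminology.
@dataclass
class Element:
    global_id: str    # unique identifier (every IFC entity has one)
    name: str         # label as used in drawings and documentation
    entity_type: str  # e.g. "IfcWall", "IfcDoor"

@dataclass
class BuildingModel:
    elements: list = field(default_factory=list)

    def by_type(self, entity_type: str) -> list:
        # The kind of typed lookup a facility-management process would
        # use, e.g. "list every wall in the model".
        return [e for e in self.elements if e.entity_type == entity_type]

model = BuildingModel()
model.elements.append(Element("wall-001", "Concourse wall", "IfcWall"))
model.elements.append(Element("door-001", "Stage door", "IfcDoor"))
print([e.name for e in model.by_type("IfcWall")])  # ['Concourse wall']
```

In a real BIM workflow the same kind of query would run against an IFC file through an IFC toolkit rather than a hand-rolled class; the point is only that the model's typed entities support the maintenance and benchmarking queries described above.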
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Among others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the systems of governance that underlie open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007); collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008); issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006); public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006); technological competition issues related to standards battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007); and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activity: providers of packaged open source solutions, IT services and software engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: providers of packaged solutions and IT services and software engineering firms base their activities on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software itself. This paper focuses on open source software publishers' business models, as the issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business model for open source software publishers: the "bundling" business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws on an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which these generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (bundling, dual licensing and mutualisation), as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of the business model used by scholars in the open source literature. In this article, a business model is considered not only as a way of generating income (a "revenue model" (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.
Abstract:
A small group of companies including Intel, Microsoft, and Cisco have used "platform leadership" with great effect as a means for driving innovation and accelerating market growth within their respective industries. Prior research in this area emphasizes that trust plays a critical role in the success of this strategy. However, many of the categorizations of trust discussed in the literature tend to ignore or undervalue the fact that trust and power are often functionally equivalent, and that the coercion of weaker partners is sometimes misdiagnosed as collaboration. In this paper, I use case study data focusing on Intel's shift from ceramic/wire-bonded packaging to organic/C4 packaging to characterize the relationships between Intel and its suppliers, and to determine if these links are based on power in addition to trust. The case study shows that Intel's platform leadership strategy is built on a balance of both trust and a relatively benevolent form of power that is exemplified by the company's "open kimono" principle, through which Intel insists that suppliers share detailed financial data and highly proprietary technical information to achieve mutually advantageous objectives. By explaining more completely the nature of these inter-firm linkages, this paper usefully extends our understanding of how platform leadership is maintained by Intel, and contributes to the literature by showing how trust and power can be used simultaneously within an inter-firm relationship in a way that benefits all of the stakeholders.
Abstract:
The endeavour to obtain estimates of the durability of components, for use in lifecycle assessment or costing and in infrastructure and maintenance planning systems, is a large one. The factor method and the reference service life concept provide a very valuable structure, but do not resolve the central dilemma: the need to derive an extensive database of service lives. Traditional methods of estimating service life, such as dose functions or degradation models, can play a role in developing this database; however, the scale of the problem clearly indicates that individual dose functions cannot be derived for each component in each different local and geographic setting. Thus, a wider range of techniques is required in order to derive reference service lives. This paper outlines the approaches being taken in the Cooperative Research Centre for Construction Innovation project to predict reference service life. Approaches include the development of fundamental degradation and microclimate models, the development of a situation-based reasoning 'engine' to vary the 'estimator' of service life, and the development of a database of expert performance (Delphi study). These methods should be viewed as complementary rather than as discrete alternatives. As discussed in the paper, the situation-based reasoning approach in fact has the possibility of encompassing all the other methods.
Abstract:
The full economic, cultural and environmental value of information produced or funded by the public sector can be realised through enabling greater access to and reuse of the information. To do this effectively it is necessary to describe and implement a policy framework that supports greater access and reuse among a distributed, online network of information suppliers and users. The objective of this study was to identify materials dealing with policies, principles and practices relating to information access and reuse in Australia and in other key jurisdictions internationally. Open Access Policies, Practices and Licensing: A review of the literature in Australia and selected jurisdictions sets out the findings of an extensive review of published materials dealing with policies, practices and legal issues relating to information access and reuse, with a particular focus on materials generated, held or funded by public sector bodies. The report was produced as part of the work program of the project “Enabling Real-Time Information Access in Both Urban and Regional Areas”, established within the Cooperative Research Centre for Spatial Information (CRCSI).
Abstract:
The analysis and value of digital evidence in an investigation has been a domain of discourse in the digital forensic community for several years. While many works have considered different approaches to modelling digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cybercrime and attack. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. We present the forensic integration architecture (FIA), which provides a framework for abstracting evidence source and storage format information from digital evidence, and explores the concept of integrating evidence information from multiple sources. The FIA identifies evidence information from multiple sources, enabling an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology-independent approach. FIA is also open and extensible, making it simple to adapt to technological change. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value the architecture brings to the field.
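The layered, technology-independent integration the FIA describes can be illustrated with a small sketch built around the paper's hypothetical car theft case. The class and field names here are assumptions for the sketch, not the FIA's actual interfaces: each evidence item is normalised into a format-independent representation, and a higher layer merges items from different sources into one timeline.

```python
from dataclasses import dataclass

@dataclass
class EvidenceItem:
    source: str    # where the item was recovered (CCTV, phone, ...)
    fmt: str       # original storage format, hidden from upper layers
    content: dict  # normalised, format-independent representation

def integrate(items: list) -> list:
    # Merge evidence from multiple sources into a single timeline,
    # consulting only the normalised content so this layer stays
    # independent of each item's original technology.
    return sorted(items, key=lambda item: item.content["timestamp"])

timeline = integrate([
    EvidenceItem("CCTV", "mp4", {"timestamp": 3, "event": "car leaves lot"}),
    EvidenceItem("GPS logger", "nmea", {"timestamp": 1, "event": "ignition on"}),
    EvidenceItem("phone", "sqlite", {"timestamp": 2, "event": "call placed"}),
])
print([item.content["event"] for item in timeline])
# ['ignition on', 'call placed', 'car leaves lot']
```

The design choice mirrored here is the one the abstract emphasises: because source and format details are abstracted away at a lower layer, adding a new evidence technology requires only a new normaliser, not changes to the integration logic.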
Abstract:
The Open and Trusted Health Information Systems (OTHIS) Research Group was formed in response to the health sector's privacy and security requirements for contemporary Health Information Systems (HIS). Given recent research developments in trusted computing concepts, it is now both timely and desirable to move electronic HIS towards privacy-aware and security-aware applications. We introduce the OTHIS architecture in this paper. The scheme proposes a feasible and sustainable solution for meeting real-world application security demands using commercial off-the-shelf systems and commodity hardware and software products.
Abstract:
Information and communications technologies globally are moving towards Service Oriented Architectures and Web Services. The healthcare environment is rapidly moving to the use of Service Oriented Architecture/Web Services systems interconnected via the global, open Internet. Such moves present major challenges where these structures are not based on highly trusted operating systems. This paper argues the need for a radical rethink of access control in the contemporary healthcare environment, in light of modern information system structures, legislative and regulatory requirements, and security operation demands on Health Information Systems. It proposes Open and Trusted Health Information Systems (OTHIS) as a viable solution, including an override capability, for the provision of appropriate levels of secure access control to protect sensitive health data.
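The override capability proposed above corresponds to what the access-control literature calls a "break-glass" mechanism: access beyond a role's normal permissions is granted in an emergency but always audited. A minimal sketch (the roles, actions and audit rule are illustrative assumptions, not the OTHIS design):

```python
# Role-based permissions; roles and actions are illustrative only.
PERMISSIONS = {
    "clinician": {"read_record", "write_record"},
    "clerk": {"read_demographics"},
}

audit_log = []  # every override is recorded for later review

def access(role: str, action: str, override: bool = False) -> bool:
    if action in PERMISSIONS.get(role, set()):
        return True  # normal, role-based grant
    if override:
        # Break-glass: grant access outside the role, but audit it.
        audit_log.append((role, action, "OVERRIDE"))
        return True
    return False

print(access("clerk", "read_record"))                 # False
print(access("clerk", "read_record", override=True))  # True
print(audit_log)  # [('clerk', 'read_record', 'OVERRIDE')]
```

The trade-off sketched here is the healthcare one the paper addresses: denying emergency access outright can endanger patients, so access is granted but made fully accountable after the fact.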
Abstract:
This article presents a reflective view of three teaching colleagues from Queensland University of Technology, Brisbane who had attended and participated in the 'Landscapes of Rights' Conference in Adelaide, July 2009. The conference is a biennial event run by the Reggio Emilia-Australia Information Exchange. The authors explore and reflect on the provocations posed throughout this conference and consider these in light of their ongoing work in the field of teacher education, of early childhood teaching and as active supporters of children's rights.
Abstract:
The range of political information sources available to modern Australians is greater and more varied today than at any point in the nation’s history, incorporating print, broadcast, Internet, mainstream and non-mainstream media. In such a competitive media environment, the factors which influence the selection of some information sources above others are of interest to political agents, media institutions and communications researchers alike. A key factor in information source selection is credibility. At the same time that the range of political information sources is increasing rapidly, due to the development of new information and communication technologies, audience research suggests that trust in mainstream media organisations in many countries is declining. So if people distrust the mainstream media, but have a vast array of alternative political information sources available to them, what do their personal media consumption patterns look like? How can we analyse such media consumption patterns in a meaningful way? In this paper I will briefly map the development of media credibility research in the US and Australia, leading to a discussion of one of the most recent media credibility constructs to be shown to influence political information consumption, media scepticism. Looking at the consequences of media scepticism, I will then consider the associated media consumption construct, media diet, and evaluate its usefulness in an Australian, as opposed to US, context. Finally, I will suggest alternative conceptualisations of media diets which may be more suited to Australian political communications research.
Abstract:
Folio submission is universally regarded as the most appropriate means of measuring a student's performance in the studio. However, developing meaningful and defensible assessment criteria is a persistent challenge for all tertiary art educators. In discipline-based studios, the parameters provided by medium and technique offer useful points of reference for assessing creative performance. But how can student performance be evaluated when there is no discipline-based framework to act as a point of reference? The 'open' studio approach to undergraduate teaching presents these and other pedagogical challenges. This paper discusses innovative approaches to studio-based teaching and assessment at QUT. Vital to the QUT open studio model is the studio rationale, an exegetical document that establishes an individualised theoretical framework through which a student's understandings can be, in part, evaluated. This paper argues that the exegetical folio effectively reconciles the frequently divergent imperatives of creative, professional and academic skills, while retaining the centrality of the studio as a site for the production of new material, processual and conceptual understandings.
Abstract:
Since at least the 1960s, art has assumed a breadth of form and medium as diverse as social reality itself. Where once it was marginal and transgressive for artists to work across a spectrum of media, today it is common practice. In this 'post-medium' age, fidelity to a specific branch of media is a matter of preference, rather than a code of practice policed by gallerists, curators and critics. Despite the openness of contemporary art practice, the teaching of art at most universities remains steadfastly discipline-based. Discipline-based art teaching, while offering the promise of focussed 'mastery' of a particular set of technical skills and theoretical concerns, does so at the expense of a deeper and more complex understanding of the possibilities of creative experimentation in the artist's studio. By maintaining a hermetic approach to medium, it does not prepare students sufficiently for the reality of art making in the twenty-first century. In fact, by pretending that there is a select range of techniques fundamental to the artist's trade, discipline-based teaching can often appear to be more engaged with the notion of skills preservation than with purposeful art training. If art schools are to survive and prosper in an increasingly vocationally oriented university environment, they need to fully synthesise the professional reality of contemporary art practice into their approach to teaching and learning. This paper discusses the way in which the 'open' studio approach to visual art study at QUT endeavours to incorporate the diversity and complexity of contemporary art while preserving the sense of collective purpose that discipline-based teaching fosters.
By allowing students to independently develop their own art practices while also applying collaborative models of learning and assessment, the QUT studio program aims to equip students with a strong sense of self-reliance, a broad awareness and appreciation of contemporary art, and a deep understanding of studio-based experimentation unfettered by the boundaries of traditional media: all skills fundamental to the practice of contemporary art.
Abstract:
An asset registry arguably forms the core system that needs to be in place before other systems can operate or interoperate. Most systems have rudimentary asset registry functionality that stores assets, relationships or characteristics, which leads to different asset management systems storing similar sets of data in multiple locations in an organisation. As organisations have slowly moved their information architecture toward a service-oriented architecture, they have also been consolidating their multiple data stores to form a "single point of truth". As part of a strategy to integrate several asset management systems in an Australian railway organisation, a case study in developing a consolidated asset registry was conducted. A decision was made to use the MIMOSA OSA-EAI CRIS data model as well as the OSA-EAI Reference Data in building the platform, owing to the standard's relative maturity and completeness. A pilot study of electrical traction equipment was selected, and the data sources feeding into the asset registry were primarily diagram-based. This paper presents the pitfalls encountered, approaches taken, and lessons learned during the development of the asset registry.
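The "single point of truth" idea above, one store holding assets, their characteristics and the relationships between them, can be sketched with a minimal schema. The table layout and sample traction equipment are illustrative assumptions, not the MIMOSA OSA-EAI CRIS model:

```python
import sqlite3

# One consolidated store for assets, characteristics and relationships,
# instead of similar data duplicated across several systems.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE asset (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE characteristic (asset_id INTEGER, key TEXT, value TEXT);
    CREATE TABLE relationship (parent_id INTEGER, child_id INTEGER, kind TEXT);
""")
db.execute("INSERT INTO asset VALUES (1, 'Traction substation')")
db.execute("INSERT INTO asset VALUES (2, 'Rectifier')")
db.execute("INSERT INTO characteristic VALUES (2, 'rated_voltage', '1500 V DC')")
db.execute("INSERT INTO relationship VALUES (1, 2, 'contains')")

# Any consuming system answers structural questions from the one store.
children = db.execute("""
    SELECT a.name FROM relationship r
    JOIN asset a ON a.id = r.child_id
    WHERE r.parent_id = 1 AND r.kind = 'contains'
""").fetchall()
print(children)  # [('Rectifier',)]
```

Separating relationships and characteristics from the asset table itself is what lets one registry serve many asset management systems: each system reads the same rows rather than keeping its own copy.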