992 results for OPEN CLUSTERS
Abstract:
A month-long intensive measurement campaign was conducted in March/April 2007 at Agnes Water, a remote coastal site just south of the Great Barrier Reef on the east coast of Australia. Particle and ion size distributions were continuously measured during the campaign. Coastal nucleation events were observed in clean, marine air masses coming from the south-east on 65% of the days. The events usually began at ~10:00 local time and lasted for 1-4 hrs. They were characterised by the appearance of a nucleation mode with a peak diameter of ~10 nm. The freshly nucleated particles grew within 1-4 hrs up to sizes of 20-50 nm. The events occurred when solar intensity was high (~1000 W m-2) and RH was low (~60%). Interestingly, the events were not related to tide height. The volatile and hygroscopic properties of freshly nucleated particles (17-22.5 nm), simultaneously measured with a volatility-hygroscopicity-tandem differential mobility analyser (VH-TDMA), were used to infer chemical composition. The majority of the volume of these particles was attributed to internally mixed sulphate and organic components. After ruling out coagulation as a source of significant particle growth, we conclude that the condensation of sulphate and/or organic vapours was most likely responsible for driving particle growth during the nucleation events. We cannot make any direct conclusions regarding the chemical species that participated in the initial particle nucleation. However, we suggest that nucleation may have resulted from the photo-oxidation products of unknown sulphur or organic vapours emitted from the waters of Hervey Bay, or from the formation of DMS-derived sulphate clusters over the open ocean that were activated to observable particles by condensable vapours emitted from the nutrient rich waters around Fraser Island or Hervey Bay. Furthermore, a unique and particularly strong nucleation event was observed during northerly wind. 
The event began early one morning (08:00) and lasted almost the entire day, resulting in the production of a large number of ~80 nm particles (the average modal concentration during the event was 3200 cm-3). The Great Barrier Reef was the most likely source of the precursor vapours responsible for this event.
Abstract:
For Bakhtin, it is always important to know from where one speaks. The place from which I speak is that of a person who grew up in Italy during the economic miracle (pre-1968) in a working-class family, watching film matinees on television during school holidays. All sorts of films and genres were shown: from film noir to westerns, to Jean Renoir's films, German expressionism, Italian neorealism and Italian comedy. Cinema has come over time to represent a sort of memory extension that supplements lived memory of events, and one which, especially, mediates the intersection of many cultural discourses. When later in life I moved to Australia and started teaching in film studies, my choice of a film that was emblematic of neorealism went naturally to Roma città aperta (hereafter Open City) by Roberto Rossellini (1945), and not to Paisan, Sciuscià or Bicycle Thieves. My choice was certainly grounded in my personal memory - especially those aspects transmitted to me by my parents, who lived through the war and maintained that Open City had truly made them cry. With a mother who voted for the Christian Democratic Party and a father who was a unionist, I thought that this was normal in Italian families and society. In the early 1960s, the Resistance still offered a narrative of suffering and redemption, shared by Catholics and Communists alike. This construction of psychological realism is what I believe Open City continues to offer over time.
Abstract:
Executive Summary: The objective of this report was to use the Sydney Opera House as a case study of the application of Building Information Modelling (BIM). The Sydney Opera House is a large, complex building with a very irregular configuration, which makes it a challenging test. A number of key concerns are evident at SOH:
• the building structure is complex, and building service systems - already the major cost of ongoing maintenance - are undergoing technology change, with new computer-based services becoming increasingly important;
• the current "documentation" of the facility comprises several independent systems, some overlapping, and is inadequate to support current and future service requirements;
• the building has reached a milestone age in terms of the condition and maintainability of key public areas and service systems, the functionality of spaces, and longer-term strategic management;
• many business functions, such as space or event management, require up-to-date information about the facility that is currently inadequately delivered, and is expensive and time-consuming to update and deliver to customers;
• major building upgrades are being planned that will put considerable strain on existing Facilities Portfolio services and their capacity to manage them effectively.
While some of these concerns are unique to the House, many will be common to larger commercial and institutional portfolios. The work described here supported a complementary task which sought to identify whether a building information model - an integrated building database - could be created that would support asset and facility management functions (see Sydney Opera House - FM Exemplar Project, Report Number: 2005-001-C-4, Building Information Modelling for FM at Sydney Opera House), a business strategy that has been well demonstrated. The development of the BIMSS - Open Specification for BIM - has been surprisingly straightforward.
The lack of technical difficulties in converting the House's existing conventions and standards to the new model-based environment can be related to three key factors:
• SOH Facilities Portfolio - the internal group responsible for asset and facility management - already has well-established building and documentation policies in place. The setting of, and adherence to, well thought out operational standards has been based on the need to create an environment that is understood by all users and that addresses the major business needs of the House.
• The second factor is the nature of the IFC Model Specification used to define the BIM protocol. The IFC standard is based on building practice and nomenclature widely used in the construction industries across the globe. For example, the nomenclature of building parts - e.g. IfcWall - corresponds to our normal terminology, but extends the traditional drawing environment currently used for design and documentation. This demonstrates that the international IFC model accurately represents local practice for building data representation and management.
• A BIM environment sets up opportunities for innovative processes that can exploit the rich data in the model and improve services and functions for the House. For example, several high-level processes have been identified that could benefit from standardised Building Information Models, such as maintenance processes using engineering data; business processes using scheduling, venue access and security data; and benchmarking processes using building performance data.
The new technology matches business needs for current and new services. The adoption of IFC-compliant applications opens the way for shared building model collaboration and new processes, a significant new focus of the BIM standards.
In summary, SOH's current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. These BIM standards and their application to the Opera House are intended as a template for other organisations to adopt for their own procurement and facility management activities. Appendices provide an overview of the IFC Integrated Object Model and an understanding of IFC Model Data.
Abstract:
Much of the literature on clusters has focused on the economic advantages of clusters and how these can be achieved in terms of competition, regional development and local spillovers. Some studies have focused at the level of the individual firm; however, human resource management (HRM) in individual clustered firms has received scant attention. This paper innovatively utilises the extended Resource Based View (RBV) of the firm as a framework to conceptualise the human resource processes of individual firms within a cluster. RBV is argued to be a useful tool because it explains rents generated outside a firm's boundaries. The paper concludes that HRM can assist in generating rents for firms, and for clusters more broadly, when the function supports the valuable inter-firm relationships important for realising inter-firm advantages.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007); collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008); issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006); public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006); technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007); and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT services and software engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT services and software engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which each can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is considered not only as a way of generating income (the ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models in terms of these two components.
Abstract:
A small group of companies including Intel, Microsoft, and Cisco have used "platform leadership" with great effect as a means for driving innovation and accelerating market growth within their respective industries. Prior research in this area emphasizes that trust plays a critical role in the success of this strategy. However, many of the categorizations of trust discussed in the literature tend to ignore or undervalue the fact that trust and power are often functionally equivalent, and that the coercion of weaker partners is sometimes misdiagnosed as collaboration. In this paper, I use case study data focusing on Intel's shift from ceramic/wire-bonded packaging to organic/C4 packaging to characterize the relationships between Intel and its suppliers, and to determine if these links are based on power in addition to trust. The case study shows that Intel's platform leadership strategy is built on a balance of both trust and a relatively benevolent form of power that is exemplified by the company's "open kimono" principle, through which Intel insists that suppliers share detailed financial data and highly proprietary technical information to achieve mutually advantageous objectives. By explaining more completely the nature of these inter-firm linkages, this paper usefully extends our understanding of how platform leadership is maintained by Intel, and contributes to the literature by showing how trust and power can be used simultaneously within an inter-firm relationship in a way that benefits all of the stakeholders.
Abstract:
The endeavour to obtain estimates of the durability of components for use in life-cycle assessment or costing, and in infrastructure and maintenance planning systems, is a large one. The factor method and the reference service life concept provide a very valuable structure, but do not resolve the central dilemma: the need to derive an extensive database of service lives. Traditional methods of estimating service life, such as dose functions or degradation models, can play a role in developing this database; however, the scale of the problem clearly indicates that individual dose functions cannot be derived for each component in each different local and geographic setting. Thus, a wider range of techniques is required in order to derive reference service lives. This paper outlines the approaches being taken in the Cooperative Research Centre for Construction Innovation project to predict reference service life. Approaches include the development of fundamental degradation and microclimate models, the development of a situation-based reasoning 'engine' to vary the 'estimator' of service life, and the development of a database of expert performance (Delphi study). These methods should be viewed as complementary rather than as discrete alternatives. As discussed in the paper, the situation-based reasoning approach in fact has the possibility of encompassing all the other methods.
Abstract:
The full economic, cultural and environmental value of information produced or funded by the public sector can be realised through enabling greater access to and reuse of the information. To do this effectively, it is necessary to describe and implement a policy framework that supports greater access and reuse among a distributed, online network of information suppliers and users. The objective of this study was to identify materials dealing with policies, principles and practices relating to information access and reuse in Australia and in other key jurisdictions internationally. Open Access Policies, Practices and Licensing: A review of the literature in Australia and selected jurisdictions sets out the findings of an extensive review of published materials dealing with policies, practices and legal issues relating to information access and reuse, with a particular focus on materials generated, held or funded by public sector bodies. The report was produced as part of the work program of the project "Enabling Real-Time Information Access in Both Urban and Regional Areas", established within the Cooperative Research Centre for Spatial Information (CRCSI).
Abstract:
The analysis and value of digital evidence in an investigation has been a domain of discourse in the digital forensic community for several years. While many works have considered different approaches to modeling digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, proactive measures are integral to keeping abreast of all forms of cybercrimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. We present the Forensic Integration Architecture (FIA), which provides a framework for abstracting evidence source and storage format information from digital evidence, and we explore the concept of integrating evidence information from multiple sources. FIA identifies evidence information from multiple sources in a way that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology-independent approach. It is also open and extensible, making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value FIA brings to the field.
Abstract:
The Open and Trusted Health Information Systems (OTHIS) Research Group was formed in response to the health sector's privacy and security requirements for contemporary Health Information Systems (HIS). Due to recent research developments in trusted computing concepts, it is now both timely and desirable to move electronic HIS towards privacy-aware and security-aware applications. We introduce the OTHIS architecture in this paper. This scheme proposes a feasible and sustainable solution for meeting real-world application security demands using commercial off-the-shelf systems and commodity hardware and software products.
Abstract:
Information and Communications Technologies globally are moving towards Service Oriented Architectures and Web Services. The healthcare environment is rapidly moving to the use of Service Oriented Architecture/Web Services systems interconnected via the global open Internet. Such moves present major challenges where these structures are not based on highly trusted operating systems. This paper argues the need for a radical rethink of access control in the contemporary healthcare environment in light of modern information system structures, legislative and regulatory requirements, and security operation demands in Health Information Systems. It proposes the Open and Trusted Health Information Systems (OTHIS) architecture: a viable solution, including override capability, for the provision of appropriate levels of secure access control to protect sensitive health data.
Abstract:
Folio submission is universally regarded as the most appropriate means of measuring a student's performance in the studio. However, developing meaningful and defensible assessment criteria is a persistent challenge for all tertiary art educators. In discipline-based studios, the parameters provided by medium and technique offer useful points of reference for assessing creative performance. But how can student performance be evaluated when there is no discipline-based framework to act as a point of reference? The 'open' studio approach to undergraduate teaching presents these and other pedagogical challenges. This paper discusses the innovative approaches to studio-based teaching and assessment at QUT. Vital to the QUT open studio model is the studio rationale - an exegetical document that establishes an individualised theoretical framework through which a student's understandings can be, in part, evaluated. This paper argues that the exegetical folio effectively reconciles the frequently divergent imperatives of creative, professional and academic skills, while retaining the centrality of the studio as a site for the production of new material, processual and conceptual understandings.
Abstract:
Since at least the 1960s, art has assumed a breadth of form and medium as diverse as social reality itself. Where once it was marginal and transgressive for artists to work across a spectrum of media, today it is common practice. In this ‘post-medium’ age, fidelity to a specific branch of media is a matter of preference, rather than a code of practice policed by gallerists, curators and critics. Despite the openness of contemporary art practice, the teaching of art at most universities remains steadfastly discipline-based. Discipline-based art teaching, while offering the promise of focussed ‘mastery’ of a particular set of technical skills and theoretical concerns, does so at the expense of a deeper and more complex understanding of the possibilities of creative experimentation in the artist’s studio. By maintaining an hermetic approach to medium, it does not prepare students sufficiently for the reality of art making in the twenty-first century. In fact, by pretending that there is a select range of techniques fundamental to the artist’s trade, discipline-based teaching can often appear to be more engaged with the notion of skills preservation than purposeful art training. If art schools are to survive and prosper in an increasingly vocationally-oriented university environment, they need to fully synthesise the professional reality of contemporary art practice into their approach to teaching and learning. This paper discusses the way in which the ‘open’ studio approach to visual art study at QUT endeavours to incorporate the diversity and complexity of contemporary art while preserving the sense of collective purpose that discipline-based teaching fosters. 
By allowing students to independently develop their own art practices while also applying collaborative models of learning and assessment, the QUT studio program aims to equip students with a strong sense of self-reliance, a broad awareness and appreciation of contemporary art, and a deep understanding of studio-based experimentation unfettered by the boundaries of traditional media: all skills fundamental to the practice of contemporary art.
Abstract:
Examined whether discrete working memory deficits underlie positive, negative and disorganised symptoms of schizophrenia. 52 outpatients (mean age 37.5 yrs) with schizophrenia were studied using items drawn from the Positive and Negative Syndrome Scale (PANSS). Linear regression and correlational analyses were conducted to examine whether symptom dimension scores were related to performance on several tests of working memory function. Severity of negative symptoms correlated with reduced production of words during a verbal fluency task, impaired ability to hold letter and number sequences on-line and manipulate them simultaneously, reduced performance during a dual task, and compromised visuospatial working memory under distraction-free conditions. Severity of disorganisation symptoms correlated with impaired visuospatial working memory under conditions of distraction, failure of inhibition during a verbal fluency task, perseverative responding on a test of set-shifting ability, and impaired ability to judge the veracity of simple declarative statements. The present study provides evidence that the positive, negative and disorganised symptom dimensions of the PANSS constitute independent clusters, associated with unique patterns of working memory impairment.
Abstract:
This paper considers the history of the cluster concept in urban economic geography, and its relationship to recent debates about creative cities. It then looks at the role that universities can play in the development of a creative cluster, as well as some of the potential pitfalls.