361 results for software asset creation
Abstract:
Software as a Service (SaaS) is a promising approach for small and medium-sized enterprises (SMEs), in particular those focused on growing fast and leveraging new technology, due to the potential benefits arising from its inherent scalability, reduced total cost of ownership and ease of access to global innovations. This paper proposes a dynamic perspective on IS capabilities to understand and explain how SMEs source and leverage SaaS. The model is derived by combining the IS capabilities of Feeny and Willcocks (1998) with the dynamic capabilities of Teece (2007) and contextualizing them for SMEs and SaaS. We conclude that SMEs sourcing and leveraging SaaS require leadership, business systems thinking and informed buying for sensing and seizing SaaS opportunities, and require leadership and vendor development for transforming, in terms of aligning and realigning specific tangible and intangible assets.
Abstract:
Over the last twenty years, the use of open content licences has become increasingly and surprisingly popular. The use of such licences challenges the traditional incentive-based model of exclusive rights under copyright. Instead of providing a means to charge for the use of particular works, what seems important is mitigating potential personal harm to the author and, in some cases, preventing non-consensual commercial exploitation. It is interesting in this context to observe the primacy of what are essentially moral rights over the exclusionary economic rights. The core elements of common open content licences map somewhat closely to continental conceptions of the moral rights of authorship. Most obviously, almost all free software and free culture licences require attribution of authorship. More interestingly, there is a tension between social norms developed in free software communities and those that have emerged in the creative arts over integrity and commercial exploitation. For programmers interested in free software, licence terms that prohibit commercial use or modification are almost completely inconsistent with the ideological and utilitarian values that underpin the movement. For those in the creative industries, on the other hand, non-commercial terms and, to a lesser extent, terms that prohibit all but verbatim distribution continue to play an extremely important role in the sharing of copyright material. While prohibitions on commercial use often serve an economic imperative, there is also a certain personal interest for many creators in avoiding harmful exploitation of their expression – an interest that has sometimes been recognised as forming a component of the moral right of integrity. One particular continental moral right – the right of withdrawal – is present neither in Australian law nor in any of the common open content licences. Despite some marked differences, both free software and free culture participants are using contractual methods to articulate the norms of permissible sharing. Legal enforcement is rare and often prohibitively expensive, and the various communities accordingly rely upon shared understandings of acceptable behaviour. The licences that are commonly used represent a formalised expression of these community norms and provide the theoretically enforceable legal baseline that lends them legitimacy. The core terms of these licences are designed primarily to alleviate risk and minimise transaction costs in sharing and using copyright expression. Importantly, however, the range of available licences reflects different optional balances in the norms of creating and sharing material. Generally, it is possible to see that, stemming particularly from the US, open content licences are fundamentally important in providing a set of normatively accepted copyright balances that reflect the interests sought to be protected through moral rights regimes. As the cost of creation, distribution, storage, and processing of expression continues to fall towards zero, there are increasing incentives to adopt open content licences to facilitate wide distribution and reuse of creative expression. Thinking of these protocols not only as reducing transaction costs but as setting normative principles of participation assists in conceptualising the role of open content licences and the continuing tensions that permeate modern copyright law.
Abstract:
The most common software tools for measuring fluorescence images handle two-dimensional (2D) data, relying on manual settings for the inclusion and exclusion of data points and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images, with powerful analytical software reconstructing the images from confocal stacks to provide a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximations and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are composed of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the difficulty of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be used to analyze fluorescence data that are not continuous, because it ideally builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for cells of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
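As a rough illustration of what such a user-modifiable 3D morphometric analysis involves, the sketch below segments an amorphous fluorescence signal in a 3D stack and extracts per-object measurements with no assumed cell shape. It uses Python with scikit-image purely for illustration (the abstract's own platform is Imaris XT interfaced to MATLAB); the function name, threshold choice, and size cut-off are assumptions, not the authors' code.

```python
import numpy as np
from skimage import filters, measure, morphology

def quantify_3d_stack(stack: np.ndarray, min_voxels: int = 64) -> list:
    """Segment amorphous fluorescence signal in a 3D (z, y, x) stack and
    return per-object morphometrics, with no assumed cell shape."""
    # Global Otsu threshold separates signal voxels from background.
    mask = stack > filters.threshold_otsu(stack)
    # Remove speckle noise below a tunable minimum voxel count.
    mask = morphology.remove_small_objects(mask, min_size=min_voxels)
    # Label touching voxels as one object (26-connectivity in 3D).
    labels = measure.label(mask, connectivity=3)
    results = []
    for region in measure.regionprops(labels, intensity_image=stack):
        results.append({
            "label": region.label,
            "volume_voxels": region.area,        # 3D 'area' = voxel count
            "centroid_zyx": region.centroid,
            "mean_intensity": region.mean_intensity,
        })
    return results

# Usage on a synthetic stack standing in for a confocal volume:
demo = np.zeros((32, 64, 64))
demo[10:20, 20:40, 20:40] = 1.0 + np.random.default_rng(0).random((10, 20, 20))
print(quantify_3d_stack(demo))
```

Because every step (threshold, size filter, connectivity) is an explicit parameter rather than a preset cell model, this style of pipeline remains applicable to amorphous network structures of the kind the abstract targets.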
Abstract:
Starting from the vantage point that explaining success at creating a venture should be the unique contribution—or at least one unique contribution—of entrepreneurship research, we argue that this success construct has not yet been adequately defined and operationalized. We thus offer suggestions for more precise conceptualization and measurement of this central construct. Rather than regarding the various success proxies used in prior research as poor operationalizations of success, we argue that they represent other important aspects of the venture creation process: engagement, persistence and progress. We hold that, in order to attain a better understanding of venture creation, these constructs also need to be theoretically defined. Further, their respective drivers need to be theorized and tested separately. We suggest theoretical definitions of each. We then develop and test hypotheses concerning how human capital, venture idea novelty and business planning have different impacts on the different assessments of the process represented by engagement, persistence, progress and success. The results largely confirm the stated hypotheses, suggesting that the conceptual and empirical approach we propose is a path towards improved understanding of the central entrepreneurship phenomenon of new venture creation.
Abstract:
The topic of this research is a novel entertainment form currently emerging from the youngest human communication technology, the Internet. This form, products based on it, and the conceptual framework describing it are all referred to as Entertainment Architecture (‘entarch’ for short). Entarch is classified as Internet-native transmedia entertainment — it fully utilises the unique communicative characteristics of the Internet and is not based on just one medium. A number of entarch examples are explored through ‘immersive’ textual analysis — a new mode of textual analysis required for research into this kind of entertainment. As a secondary priority, entarch is compared with the movie — chosen as an exemplary existing entertainment form that finds itself in a radically uncertain formal, business, and industrial environment and is accordingly struggling financially. Throughout, the formal, business, and industrial consequences of the emergence of Entertainment Architecture are explored. This research is an example of applied cultural science, as it treats culture as a source of innovation and a complex dynamic system with technological as well as human characteristics. It analyses the dynamics of cultural change in the context of business development, consumer experience, and economic evolution — with an intrinsically transdisciplinary methodology.
Abstract:
Engineering asset management (EAM) is a rapidly growing and developing field. However, efforts to select and develop engineers in this area are complicated by our lack of understanding of the full range of competencies required to perform effectively. This exploratory study sought to clarify and categorise the professional competencies required of individuals at different hierarchical levels within EAM. Data from 14 interviews and 61 online survey participants have informed the development of an initial Professional Competency Framework. The nine competency categories indicate that engineers working in this field need to be able to collaborate with and influence others, complete objectives within organizational guidelines, and manage themselves effectively. Limitations of the framework and its potential uses in practice and research are discussed.
Abstract:
This chapter reviews the challenge of examining and reforming Indonesian practice in state asset management law and policy, specifically as it relates to public housing, public buildings, parklands, and vacant land. A critical issue in beginning this review is how Indonesia currently conceptualizes the notion of asset governance, and how this meaning is embodied in recent changes in law and policy and, importantly, in options for future change. The chapter discusses the potential complexities of uniquely Indonesian characteristics such as the decentralisation and regional autonomy regime, political history, and bureaucratic culture.
Abstract:
Optimal Asset Maintenance decisions are imperative for efficient asset management. Decision Support Systems are often used to help asset managers make maintenance decisions, but high quality decision support must be based on sound decision-making principles. For long-lived assets, a successful Asset Maintenance decision-making process must effectively handle multiple time scales. For example, high-level strategic plans are normally made for periods of years, while daily operational decisions may need to be made within a space of mere minutes. When making strategic decisions, one usually has the luxury of time to explore alternatives, whereas routine operational decisions must often be made with no time for contemplation. In this paper, we present an innovative, flexible decision-making process model which distinguishes meta-level decision making, i.e., deciding how to make decisions, from the information gathering and analysis steps required to make the decisions themselves. The new model can accommodate various decision types. Three industrial case studies are given to demonstrate its applicability.
Abstract:
Building Information Modelling (BIM) appears to be the next evolutionary link in project delivery within the AEC (Architecture, Engineering and Construction) industry. There have been several surveys of implementation at the local level, but to date little is known of the international context. This paper is a preliminary report of a large-scale electronic survey of the implementation of BIM and its impact on AEC project delivery and project stakeholders in Australia and internationally. National and regional patterns of BIM usage will be identified. These patterns will include disciplinary users, project lifecycle stages, technology integration (including software compatibility) and organisational issues such as human resources and interoperability. Also considered is the current status of the inclusion of BIM within tertiary-level curricula and the potential for the creation of a new discipline.
Abstract:
Recent literature in project management has urged a re-conceptualisation of projects as a value co-creation process. Contrary to the traditional output-focused project methodology, the value creation perspective argues for the importance of creating new knowledge, processes, and systems for suppliers and customers. Stakeholder involvement is important in this new perspective, as balancing the competing needs of stakeholders in mega projects becomes a major challenge in managing the value co-creation process. In this study, we present interview data from three Australian defence mega projects to demonstrate that senior executives have a more complex understanding of project success than traditional iron triangle measures capture. In these mega defence projects, customers and other stakeholders actively engage in the value creation process, and over time both content and process value are created to increase defence and national capability. Value created and captured during and after projects is the key to true success.
Abstract:
Asset management (AM) processes play an important role in assisting enterprises to manage their assets more efficiently. To visualise and improve AM processes, the processes need to be modelled using appropriate process modelling methodologies. Understanding the requirements for AM process modelling is essential for selecting or developing effective AM process modelling methodologies; however, little research has been done on analysing these requirements. This paper attempts to fill that gap by investigating the features of AM processes. It is concluded that AM process modelling requires intuitive representation of processes, ‘fast’ implementation of process modelling, effective evaluation of processes, and sound system integration.
Abstract:
The ability to forecast machinery health is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models which attempt to forecast machinery health based on condition data such as vibration measurements. This paper demonstrates how the population characteristics and condition monitoring data (both complete and suspended) of historical items can be integrated to train an intelligent agent to predict asset health multiple steps ahead. The model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan–Meier estimator and a degradation-based failure probability density function estimator. The trained network estimates future survival probabilities when a series of asset condition readings is supplied as input. The output survival probabilities collectively form an estimated survival curve. Pump data from a pulp and paper mill were used for model validation and comparison. The results indicate that the proposed model can predict both more accurately and further ahead than similar models which neglect population characteristics and suspended data. This work presents a compelling concept for longer-range fault prognosis, utilising the available information more fully and accurately.
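To make the training setup concrete, here is a minimal sketch under stated assumptions: synthetic condition features and Weibull lifetimes stand in for real pump histories, a basic Kaplan–Meier routine (not the paper's variation, and without its degradation-based density estimator) produces the survival-probability targets, and a scikit-learn feed-forward network replaces the paper's own network; the multi-step, series-input prediction is simplified to a single reading per item.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def kaplan_meier(times: np.ndarray, events: np.ndarray):
    """Basic Kaplan-Meier survival curve; events=1 failure, 0 suspension."""
    order = np.argsort(times)
    t, e = times[order], events[order]
    n_at_risk, s, surv = len(t), 1.0, []
    for i in range(len(t)):
        if e[i] == 1:                    # only failures reduce survival
            s *= (n_at_risk - 1) / n_at_risk
        n_at_risk -= 1                   # suspensions just leave the risk set
        surv.append(s)
    return t, np.array(surv)

# Synthetic stand-ins for historical item data (illustrative only):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                 # condition-indicator features
times = rng.weibull(2.0, size=200) * 100.0    # lifetimes, arbitrary units
events = (rng.random(200) < 0.7).astype(int)  # ~30% suspended histories

# Survival-probability training targets read off the Kaplan-Meier curve,
# so suspended items still contribute through the risk set.
t_km, s_km = kaplan_meier(times, events)
y = np.interp(times, t_km, s_km)

# Feed-forward network learns condition readings -> survival probability.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, y)
print(net.predict(X[:3]))                     # estimated survival probabilities
```

The key design point the sketch preserves is that the network's targets come from an estimator that uses suspended as well as complete histories, which is what lets the approach exploit more of the available information than failure-only models.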
Abstract:
Since 2007, Kite Arts Education Program (KITE), based at Queensland Performing Arts Centre (QPAC), has been engaged in delivering a series of theatre-based experiences for children in low socio-economic primary schools in Queensland. The artist-in-residence (AIR) project titled Yonder includes performances developed by the children, with the support and leadership of teacher artists from KITE, for their community and parents/carers, supported by a peak community cultural institution. In 2009, Queensland Performing Arts Centre partnered with Queensland University of Technology (QUT) Creative Industries Faculty (Drama) to conduct a three-year evaluation of the Yonder project to understand its operational dynamics, artistic outputs and educational benefits. This paper outlines the research findings for children engaged in the Yonder project in the interrelated areas of literacy development and social competencies. Findings are drawn from six iterations of the project in suburban locations on the edge of Brisbane city and in regional Queensland.
Abstract:
The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure at future times is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators and operating environment indicators and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed. All of these existing covariate-based hazard models were developed based on the underlying theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models neglect to fully utilise all three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in a single model for more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data. Condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and condition measurements as well as operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators.
Condition indicators provide information about the health condition of an asset; therefore, they update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators arise from the environment in which an asset operates and have not been explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of operating environment indicators could be nil in EHM, condition indicators are always present, because they are observed and measured for as long as an asset is operational and has survived. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industry applications, due to sparse failure event data, the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of the other existing covariate-based hazard models. The comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
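Although the abstract does not give the model's functional form, the contrast it draws with PHM can be sketched in assumed notation: in PHM the baseline hazard depends on time alone, whereas EHM is described as having a baseline hazard that depends on both time and the condition indicators, with the operating environment indicators entering through a covariate function that scales the hazard up or down. The symbols z(t), e(t), psi, beta, and gamma below are illustrative choices, not taken from the thesis.

```latex
% Proportional Hazard Model: baseline hazard depends on time only.
h(t \mid \mathbf{x}) = h_0(t)\,\exp\!\left(\boldsymbol{\beta}^{\top}\mathbf{x}\right)

% Illustrative EHM-style form (assumed notation):
%   z(t): condition indicators (response variables, enter the baseline),
%   e(t): operating environment indicators (explanatory variables,
%         accelerating or decelerating the hazard via the covariate
%         function psi).
h\left(t \mid \mathbf{z}(t), \mathbf{e}(t)\right)
    = h_0\!\left(t, \mathbf{z}(t)\right)\,
      \psi\!\left(\boldsymbol{\gamma}^{\top}\mathbf{e}(t)\right)
```

Written this way, the two distinguishing claims of the abstract are visible at a glance: the non-homogeneous covariates are treated differently (z(t) inside the baseline, e(t) in the multiplier), and hazards for different assets need not be proportional, since the baseline itself varies with each asset's condition history.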