426 results for software creation methodology


Relevance: 20.00%

Abstract:

Software as a Service (SaaS) is anticipated to provide significant benefits to small and medium enterprises (SMEs) through ease of access to high-end applications, 24/7 availability, utility pricing, and so on. However, underlying SaaS is the assumption that SMEs will interact directly with the SaaS vendor and use a self-service model. In practice, we see the rise of SaaS intermediaries who support SMEs in using SaaS. This paper reports on an empirical study of the role of intermediaries in terms of how they support SMEs in sourcing and leveraging SaaS for their business. The knowledge contributions of this paper are: (1) the identification and description of the role of SaaS intermediaries, and (2) the specification of two distinct intermediary roles: a more basic role with a technology orientation and an operational alignment perspective, and a more value-adding role with a customer orientation and a strategic alignment perspective.

Relevance: 20.00%

Abstract:

Statistical methodology was applied to a survey of time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia. -from Authors

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to develop an effective methodology for implementing lean manufacturing strategies and a leanness evaluation metric using continuous performance measurement (CPM). Design/methodology/approach – Based on five lean principles, a systematic lean implementation methodology for manufacturing organizations has been proposed. A simplified leanness evaluation metric consisting of both efficiency and effectiveness attributes of manufacturing performance has been developed for continuous evaluation of lean implementation. A case study to validate the proposed methodology has been conducted, and the proposed CPM metric has been used to assess the manufacturing leanness. Findings – The proposed methodology is able to systematically identify manufacturing wastes, select appropriate lean tools, identify relevant performance indicators, achieve significant performance improvement and establish a lean culture in the organization. Continuous performance measurement metrics in terms of efficiency and effectiveness prove to be appropriate methods for continuous evaluation of lean performance. Research limitations/implications – The effectiveness of the method developed has been demonstrated by applying it in a real-life assembly process. However, more tests/applications will be necessary to generalize the findings. Practical implications – Results show that, by applying the methods developed, managers can successfully identify and remove manufacturing wastes from their production processes. By improving process efficiency, they can optimize their resource allocations. Manufacturers now have a validated step-by-step methodology for successfully implementing lean strategies. Originality/value – To the best of the authors’ knowledge, this is the first known study to propose a systematic lean implementation methodology based on lean principles and continuous improvement techniques.
Evaluation of performance improvement by lean strategies is a critical issue. This study develops a simplified leanness evaluation metric considering both efficiency and effectiveness attributes and integrates it with the lean implementation methodology.
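
As a rough illustration of the kind of continuous leanness evaluation described above, the sketch below assumes leanness is a weighted average of normalised efficiency and effectiveness scores; the weighting and the normalisation to [0, 1] are illustrative assumptions, not the paper's actual metric.

```python
# Illustrative sketch only: the abstract does not give the exact formula, so
# leanness is assumed here to be a weighted average of normalised efficiency
# and effectiveness scores.

def leanness(efficiency, effectiveness, w_eff=0.5):
    """Combine efficiency and effectiveness (both in [0, 1]) into one score."""
    if not (0.0 <= efficiency <= 1.0 and 0.0 <= effectiveness <= 1.0):
        raise ValueError("scores must be normalised to [0, 1]")
    return w_eff * efficiency + (1.0 - w_eff) * effectiveness

# Track the metric over successive measurement periods (continuous evaluation).
periods = [(0.60, 0.70), (0.68, 0.74), (0.75, 0.80)]
scores = [leanness(e, f) for e, f in periods]
```

A rising score across periods would indicate that the lean implementation is taking hold; the weight `w_eff` lets an organisation emphasise efficiency or effectiveness as it sees fit.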

Relevance: 20.00%

Abstract:

Exposure control and case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, by using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The proposed methodology is illustrated using a set of police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration, and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high if the posted speed limits along the approaches are greater than 60 km/h. The crash risk at three-legged intersections is also high when they are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that the combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and is transferable to other road users.
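
The quasi-induced exposure idea can be sketched as follows. In this hedged toy example (not the authors' log-linear formulation), not-at-fault parties in multi-vehicle crashes are treated as a proxy for exposure, so a group's relative crash risk is estimated as its share of at-fault involvements divided by its share of not-at-fault involvements.

```python
# Hedged sketch of quasi-induced exposure: the not-at-fault party in a
# two-vehicle crash is assumed to be a quasi-random sample of traffic,
# so its group frequencies approximate exposure. All data are invented.

from collections import Counter

def relative_risk(crashes):
    """crashes: list of (at_fault_group, not_at_fault_group) pairs."""
    at_fault = Counter(g for g, _ in crashes)
    not_at_fault = Counter(g for _, g in crashes)
    n = len(crashes)
    return {
        g: (at_fault[g] / n) / (not_at_fault[g] / n)
        for g in at_fault
        if not_at_fault[g] > 0
    }

# Toy data: motorcycles are over-represented as the at-fault party here.
data = ([("motorcycle", "car")] * 30 + [("car", "motorcycle")] * 10
        + [("car", "car")] * 60)
risks = relative_risk(data)
```

A ratio above 1 flags over-involvement relative to exposure; the study layers a log-linear model on top of this idea to handle interactions among roadway, environmental and traffic factors.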

Relevance: 20.00%

Abstract:

The fastest-growing segment of jobs in the creative sector is in those firms that provide creative services to other sectors (Hearn, Goldsmith, Bridgstock and Rodgers 2014, this volume; Cunningham 2014, this volume). There are also a large number of Creative Services workers (Architecture and Design, Advertising and Marketing, Software and Digital Content occupations) embedded in organizations in other industry sectors (Cunningham and Higgs 2009). Ben Goldsmith (2014, this volume) shows, for example, that the Financial Services sector is the largest employer of digital creative talent in Australia. But why should this be? We argue it is because ‘knowledge-based intangibles are increasingly the source of value creation and hence of sustainable competitive advantage’ (Mudambi 2008, 186). This value creation occurs primarily at the research and development (R&D) and marketing ends of the supply chain. Both of these areas require strong creative capabilities in order to design for, and to persuade, consumers. It is no surprise that Jess Rodgers (2014, this volume), in a study of Australia’s Manufacturing sector, found designers and advertising and marketing occupations to be the most numerous creative occupations. Greg Hearn and Ruth Bridgstock (2013, forthcoming) suggest that ‘the creative heart of the creative economy […] is the social and organisational routines that manage the generation of cultural novelty, both tacit and codified, internal and external, and [cultural novelty’s] combination with other knowledges […] produce and capture value’. Moreover, the main ‘social and organisational routine’ is usually a team (for example, Grabher 2002; 2004).

Relevance: 20.00%

Abstract:

The methoxyamine group represents an ideal protecting group for the nitroxide moiety. It can be easily and selectively introduced in high yield (typically >90%) to a range of functionalised nitroxides using FeSO4.7H2O and H2O2 in DMSO. Its removal is readily achieved under mild conditions in high yield (70-90%) using mCPBA in a Cope-type elimination process.

Relevance: 20.00%

Abstract:

In Australia, collaborative contracts, and in particular project alliances, have been increasingly used to govern infrastructure projects. These contracts use formal and informal governance mechanisms to manage the delivery of infrastructure projects. Formal mechanisms such as financial risk sharing are specified in the contract, while informal mechanisms such as integrated teams are not. Given that the literature contains a multiplicity of often untestable definitions, this paper reports on a review of the literature to operationalize the concepts of formal and informal governance. This work is the first phase of a study that will examine the optimal balance of formal and informal governance structures. A desk-top review of leading journals in the areas of construction management and business management, as well as recent government documents and industry guidelines, was undertaken to conceptualize and operationalize formal and informal governance mechanisms. The study primarily draws on transaction-cost economics (e.g. Williamson 1979; Williamson 1991), relational contract theory (Feinman 2000; Macneil 2000) and social psychology theory (e.g. Gulati 1995). Content analysis of the literature was undertaken to identify key governance mechanisms. Content analysis is a commonly used methodology in the social sciences. It provides rich data through the systematic and objective review of literature (Krippendorff 2004). NVivo 9, a qualitative data analysis software package, was used to assist in this process. A previous study by the authors identified that formal governance mechanisms can be classified into seven measurable categories: (1) negotiated cost, (2) competitive cost, (3) commercial framework, (4) risk and reward sharing, (5) qualitative performance, (6) collaborative multi-party agreement, and (7) early contractor involvement.
Similarly, informal governance mechanisms can be classified into four measurable categories: (1) leadership structure, (2) integrated team, (3) team workshops, and (4) joint management system. This paper explores and further defines the key operational characteristics of each mechanism category, highlighting its impact on value for money in alliance project delivery. The paper’s contribution is that it provides the basis for future research to compare the impact of a range of individual mechanisms within each category, as a means of improving the performance of construction projects.

Relevance: 20.00%

Abstract:

A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations such as drilling, sampling, blasting and excavating under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
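
A minimal flow-shop-style sketch gives a flavour of the multi-stage structure described above, with each mining block passing through drilling, blasting and excavating in order and one machine per stage; it is only a forward pass over completion times, not the authors' shifting-bottleneck procedure or disjunctive graph model.

```python
# Toy multi-stage schedule sketch (invented data, not the paper's algorithm):
# blocks visit the stages in a fixed order, each stage has a single machine,
# and a block starts a stage only when both it and the machine are free.

def schedule(proc_times):
    """proc_times[b][s] = processing time of block b at stage s."""
    n_stages = len(proc_times[0])
    stage_free = [0.0] * n_stages   # when each stage's machine is next free
    finish = []
    for block in proc_times:
        t = 0.0
        for s, p in enumerate(block):
            t = max(t, stage_free[s]) + p   # wait for the machine, then process
            stage_free[s] = t
        finish.append(t)
    return finish

# Three blocks, processing times for (drilling, blasting, excavating):
times = [(2.0, 1.0, 3.0), (2.0, 2.0, 2.0), (1.0, 1.0, 4.0)]
makespans = schedule(times)
```

The shifting-bottleneck procedure goes further by repeatedly identifying the stage that most constrains the schedule and sequencing it first, which is what allows near-optimal solutions on realistic mine data.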

Relevance: 20.00%

Abstract:

This paper presents a higher-order beam-column formulation that can capture the geometrically non-linear behaviour of steel framed structures which contain a multiplicity of slender members. Despite advances in computational frame software, analyses of large frames can still be problematic from a numerical standpoint and so the intent of the paper is to fulfil a need for versatile, reliable and efficient non-linear analysis of general steel framed structures with very many members. Following a comprehensive review of numerical frame analysis techniques, a fourth-order element is derived and implemented in an updated Lagrangian formulation, and it is able to predict flexural buckling, snap-through buckling and large displacement post-buckling behaviour of typical structures whose responses have been reported by independent researchers. The solutions are shown to be efficacious in terms of a balance of accuracy and computational expediency. The higher-order element forms a basis for augmenting the geometrically non-linear approach with material non-linearity through the refined plastic hinge methodology described in the companion paper.

Relevance: 20.00%

Abstract:

Fire incidents in buildings are common, so the fire safety design of framed structures is imperative, especially for unprotected or partly protected bare steel frames. However, software for structural fire analysis is not widely available. As a result, performance-based structural fire design should be based on user-friendly, conventional nonlinear computer analysis programs, so that engineers do not need to acquire new structural analysis software for structural fire analysis and design. The tool should be capable of efficiently simulating different fire scenarios and the associated detrimental effects, including second-order P-Δ and P-δ effects and material yielding. Moreover, the nonlinear behaviour of a large-scale structure becomes complicated under fire, so its simulation relies on an efficient and effective numerical analysis to cope with the intricate nonlinear effects due to fire. To this end, the present fire study utilizes the second-order elastic/plastic analysis software NIDA to predict the structural behaviour of bare steel framed structures at elevated temperatures. The study considers thermal expansion and material degradation due to heating. Degradation of material strength with increasing temperature is modelled mainly by a set of temperature-stress-strain curves according to BS5950 Part 8, which implicitly allows for creep deformation. The finite element stiffness formulation of beam-column elements is derived from the fifth-order PEP element, which facilitates computer modelling with one member per element. The Newton-Raphson method is used in the nonlinear solution procedure in order to trace the nonlinear equilibrium path at specified elevated temperatures. Several numerical and experimental verifications of framed structures are presented and compared against solutions in the literature.
The proposed method permits engineers to adopt the performance-based structural fire analysis and design using typical second-order nonlinear structural analysis software.
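
The Newton-Raphson iteration used to trace a nonlinear equilibrium path can be sketched on a single degree of freedom. The spring law, tangent stiffness and all numbers below are invented for illustration and have nothing to do with NIDA's actual element formulation.

```python
# One-degree-of-freedom illustration of the Newton-Raphson scheme: solve
# internal_force(u) = load by repeatedly correcting u with the residual
# divided by the consistent tangent stiffness.

def newton_raphson(internal_force, tangent_stiffness, load, u0=0.0,
                   tol=1e-10, max_iter=50):
    """Solve internal_force(u) = load for the displacement u."""
    u = u0
    for _ in range(max_iter):
        residual = load - internal_force(u)
        if abs(residual) < tol:
            return u
        u += residual / tangent_stiffness(u)  # incremental correction
    raise RuntimeError("Newton-Raphson failed to converge")

# Hypothetical spring that softens with displacement: f(u) = 100*u - 4*u**3.
f = lambda u: 100.0 * u - 4.0 * u**3
k = lambda u: 100.0 - 12.0 * u**2   # consistent tangent df/du
u = newton_raphson(f, k, load=50.0)
```

In a structural fire analysis the same loop runs at each specified temperature, with the stiffness and internal force degraded by the temperature-stress-strain curves, so the equilibrium path is traced as the material weakens.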

Relevance: 20.00%

Abstract:

This paper presents the theory and practice of the Futures Action Model (FAM). FAM has been in development for over a decade, in a number of contexts and iterations. It is a creative methodology that uses a variety of concepts and tools to guide participants through the conception and modeling of enterprises, services, social innovations and projects in the context of emerging futures. It is used to generate strategic options that people can utilise to build opportunities for value creation as they move into the future. This paper details examples in its development, and provides theoretical and practical guidelines for educators and business facilitators to use the FAM system in their own workplaces.

Relevance: 20.00%

Abstract:

The detection and correction of defects remains among the most time consuming and expensive aspects of software development. Extensive automated testing and code inspections may mitigate their effect, but some code fragments are necessarily more likely to be faulty than others, and automated identification of fault prone modules helps to focus testing and inspections, thus limiting wasted effort and potentially improving detection rates. However, software metrics data is often extremely noisy, with enormous imbalances in the size of the positive and negative classes. In this work, we present a new approach to predictive modelling of fault proneness in software modules, introducing a new feature representation to overcome some of these issues. This rank sum representation offers improved or at worst comparable performance to earlier approaches for standard data sets, and readily allows the user to choose an appropriate trade-off between precision and recall to optimise inspection effort to suit different testing environments. The method is evaluated using the NASA Metrics Data Program (MDP) data sets, and performance is compared with existing studies based on the Support Vector Machine (SVM) and Naïve Bayes (NB) Classifiers, and with our own comprehensive evaluation of these methods.
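
One plausible reading of a rank-based feature representation is sketched below; the paper's exact construction may differ. Each raw metric is replaced by its rank over all modules, the ranks are summed into a single fault-proneness score, and moving the decision threshold over that score trades precision against recall, which is how inspection effort can be tuned to a testing environment.

```python
# Hedged sketch of a rank-sum style representation (invented construction):
# ranking makes the score robust to the heavy skew typical of software
# metrics data, since only orderings matter, not raw magnitudes.

def ranks(values):
    """Rank of each value (1 = smallest); ties broken by position."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def rank_sum_scores(modules):
    """modules: list of metric tuples, e.g. (loc, complexity, churn)."""
    per_metric = [ranks(col) for col in zip(*modules)]
    return [sum(col[i] for col in per_metric) for i in range(len(modules))]

# Toy data: the last module is large, complex and frequently changed.
mods = [(120, 4, 2), (300, 9, 5), (80, 2, 1), (900, 25, 14)]
scores = rank_sum_scores(mods)
flagged = [i for i, s in enumerate(scores) if s >= 9]  # threshold is ad hoc
```

Lowering the threshold flags more modules (higher recall, lower precision); raising it concentrates inspection effort on the most suspicious ones.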

Relevance: 20.00%

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed, to some degree, to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as it requires many interactions between agents, which can learn and have goals. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and consumers’ behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into (a) assets, which describe the entities’ physical characteristics, and (b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, in which both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required, depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels), or to describe the assets and their relation to one another (e.g. the network assets). Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is part of future work with the addition of electric vehicles.
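
The asset/agent separation described above can be sketched as follows; the class and method names are invented for illustration and are not MODAM's API. The same physical asset can be driven by different behaviours depending on the simulation's purpose.

```python
# Illustrative sketch of separating physical characteristics (asset) from
# behaviour (agent); all names and numbers are hypothetical, not MODAM code.

class Battery:
    """Asset: physical characteristics only."""
    def __init__(self, capacity_kwh):
        self.capacity_kwh = capacity_kwh
        self.charge_kwh = 0.0

class PeakShavingAgent:
    """Agent: one possible behaviour attached to an asset."""
    def __init__(self, asset):
        self.asset = asset

    def step(self, demand_kw, threshold_kw):
        # Discharge only the portion of demand that exceeds the threshold.
        excess = max(0.0, demand_kw - threshold_kw)
        delivered = min(excess, self.asset.charge_kwh)
        self.asset.charge_kwh -= delivered
        return demand_kw - delivered   # net demand seen by the network

battery = Battery(capacity_kwh=10.0)
battery.charge_kwh = 5.0
agent = PeakShavingAgent(battery)
net = [agent.step(d, threshold_kw=4.0) for d in [3.0, 6.0, 7.0]]
```

Swapping `PeakShavingAgent` for, say, a self-consumption behaviour would reuse the same `Battery` asset unchanged, which is the composability advantage the paper describes.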

Relevance: 20.00%

Abstract:

This column features a conversation (via email, image sharing, and Facetime) that took place over several months between two international theorists of digital filmmaking from schools in two countries—Professors Jason Ranker (Portland State University, Oregon, United States) and Kathy Mills (Queensland University of Technology, Australia). The authors discuss emerging ways of thinking about video making, sharing tips and anecdotes from classroom experience to inspire teachers to explore with adolescents the meaning potentials of digital video creation. The authors briefly discuss their previous work in this area, and then move into a discussion of how the material spaces in which students create videos profoundly shape the films' meanings and significance. The article ends with a discussion of how students can take up creative new directions, pushing the boundaries of the potentials of classroom video making and uncovering profound uses of the medium.

Relevance: 20.00%

Abstract:

Games and the broader interactive entertainment industry are the major ‘born global/born digital’ creative industry. The videogame industry (formally referred to as interactive entertainment) is the economic sector that develops, markets and sells videogames to millions of people worldwide. There are over 11 countries with industry revenues of over $1 billion. The global market was expected to grow 9.1 per cent annually to $48.9 billion in 2011 and $68 billion in 2012, making it the fastest-growing component of the international media sector (Scanlon, 2007; Caron, 2008).