906 results for software creation methodology
Abstract:
This study examined the effect that venture creation action has on the outcomes of nascent entrepreneurship. A theoretical model was developed which proposes action as a fundamental mechanism in venture creation. Thus, action should rightly be considered a means rather than an end in itself. In this respect, action transmits the effects of venture resource endowments onto venture creation outcomes. This conceptual model was empirically supported in a random sample of nascent ventures. Ventures with higher levels of human or social capital tend to be more active in venture creation. In turn, more active venture attempts are more likely to achieve improved results.
Abstract:
We apply Lazear’s jack-of-all-trades theory to investigate the effect of nascent entrepreneurs’ balanced skill set across various functional areas on the performance of nascent projects. Analyzing longitudinal data on innovative nascent projects, we find that nascent entrepreneurs with a more balanced skill set are more successful in that they progress faster in the venture creation process.
Abstract:
This poster presents three projects funded by the Australian National Data Service (ANDS): a) the Greenhouse Gas Emissions Project (N2O) with Prof. Peter Grace from QUT’s Institute of Sustainable Resources; b) the Q150 Project for the management of multimedia data collected at festival events with Prof. Phil Graham from QUT’s Institute of Creative Industries; and c) bio-diversity environmental sensing with Prof. Paul Roe from the QUT Microsoft eResearch Centre. For the purposes of these projects the Eclipse Rich Client Platform (Eclipse RCP) was chosen as an appropriate software development framework within which to develop the respective software. The poster presents a brief overview of the requirements of the projects, describes the project team’s experiences in using Eclipse RCP, reports on the advantages and disadvantages of using Eclipse, and offers the team’s perspective on Eclipse as an integrated tool for supporting future data management requirements.
Abstract:
The concept of produsage developed from the realisation that new language was needed to describe the new phenomena emerging from the intersection of Web 2.0, user-generated content, and social media since the early years of the new millennium. When hundreds, thousands, maybe tens of thousands of participants utilise online platforms to collaborate in the development and continuous improvement of a wide variety of content – from software to informational resources to creative works – and when this work takes place through a series of more or less unplanned, ad hoc, almost random cooperative encounters, then to describe these processes using terms which were developed during the industrial revolution no longer makes much sense. When – exactly because what takes place here is no longer a form of production in any conventional sense of the word – the outcomes of these massively distributed collaborations appear in the form of constantly changing, permanently mutable bodies of work which are owned at once by everyone and no-one, by the community of contributors as a whole but by none of them as individuals, then to conceptualise them as fixed and complete products in the industrial meaning of the term is to miss the point. When what results from these efforts is of a quality (in both depth and breadth) that enables it to substitute for, replace, and even undermine the business model of long-established industrial products, even though it relies precariously on volunteer contributions, and when their volunteering efforts make it possible for some contributors to find semi- or fully professional employment in their field, then conventional industrial logic is turned on its head.
Abstract:
To the action researcher, who laboriously spends his or her hours working within the local contexts of communities or organisations to co-generate meaningful research, and whose theories are hardened on the anvil of creating meaningful social change, futures studies might seem the discipline most peripheral to their interests, and the most ill-equipped to deal with the local and intimate domain of community existence. To the futurist, who laboriously spends his or her hours understanding the nuances of history and social change, and who through persistent work begins to make sense of the weak signals and the subtle shifts, action research might seem simply an auxiliary field, inappropriate for understanding the greater scheme. I invite the reader, however, whether they belong to one camp or the other, to let go of their respective disciplinary perspectives, and to see each field as belonging to the other. [Introduction]
Abstract:
Design for Manufacturing (DFM) is a methodology integral to product development from the concept development phase onward, with the aim of improving manufacturing productivity while maintaining product quality. While Design for Assembly (DFA) focuses on the elimination of parts or their combination with other components (Boothroyd, Dewhurst and Knight, 2002), which in most cases means performing a function and a manufacturing operation in a simpler way, DFM follows a more holistic approach. In DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer to the developer team. Although DFM has been a well-established methodology for about 30 years, a 2009 Fraunhofer IAO study found that DFM was still one of the key challenges facing the German manufacturing industry. A new, knowledge-based approach to DFM, which eliminates steps of DFM, was introduced in Paul and Al-Dirini (2009). The concept focuses on a concurrent engineering process between the manufacturing engineering and product development systems, whereas current product realization cycles depend on a rigorous back-and-forth examine-and-correct approach to ensure the compatibility of any proposed design with the DFM rules and guidelines adopted by the company. The key to achieving reductions is to incorporate DFM considerations into the early stages of the design process. A case study of DFM application in an automotive powertrain engineering environment is presented. It is argued that a DFM database needs to be interfaced with the CAD/CAM software, constraining designers to the DFM criteria; a notable reduction of development cycles can thereby be achieved. The case study follows the hypothesis that current DFM methods do not improve product design in the manner claimed by the DFM method. The critical case was to identify DFA/DFM recommendations or program actions that appear repeatedly in different sources. Repetitive DFM measures are identified and analyzed, and it is shown how a modified DFM process can mitigate a non-fully-integrated DFM approach.
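The rule-database idea can be made concrete with a short sketch. The following Python snippet is an illustration only, not the paper's implementation; all rule names and threshold values are hypothetical. It shows how a DFM database interfaced with the design workflow could check proposed design parameters against stored manufacturability rules:

    # Illustrative sketch only (not the paper's implementation): a minimal
    # DFM rule database checked against proposed design parameters.
    # All rule names and threshold values below are hypothetical.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class DFMRule:
        name: str
        check: Callable[[dict], bool]  # True if the design passes the rule
        message: str

    RULES = [
        DFMRule("min_wall_thickness",
                lambda d: d["wall_thickness_mm"] >= 2.0,
                "Wall thickness below 2.0 mm is hard to cast reliably."),
        DFMRule("max_hole_aspect_ratio",
                lambda d: d["hole_depth_mm"] / d["hole_diameter_mm"] <= 4.0,
                "Deep, narrow holes exceed standard drilling capability."),
    ]

    def check_design(design: dict) -> list:
        """Return the messages of all violated rules (empty = DFM-clean)."""
        return [r.message for r in RULES if not r.check(design)]

    # A hypothetical design that violates both rules:
    for msg in check_design({"wall_thickness_mm": 1.5,
                             "hole_depth_mm": 40.0,
                             "hole_diameter_mm": 8.0}):
        print("DFM violation:", msg)

Run inside a CAD/CAM integration, such checks would flag violations at design time rather than in a later examine-and-correct cycle, which is the reduction the abstract argues for.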
Abstract:
Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
Abstract:
With the wide diffusion of Business Process Management (BPM) automation suites, the possibility of managing process-related risks arises. This paper introduces an innovative framework for process-related risk management and describes a working implementation realized by extending the YAWL system. The framework covers three aspects of risk management: risk monitoring, risk prevention, and risk mitigation. Risk monitoring functionality is provided using a sensor-based architecture, where sensors are defined at design time and used at run-time for monitoring purposes. Risk prevention functionality is provided in the form of suggestions about what should be executed, by whom, and how, through the use of decision trees. Finally, risk mitigation functionality is provided as a sequence of remedial actions (e.g. reallocating, skipping, or rolling back a work item) that should be executed to restore the process to a normal situation.
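As an illustration of the sensor-based monitoring idea, the Python sketch below expresses risk sensors as design-time predicates evaluated over the state of a running case. It is a sketch of the general technique, not the YAWL implementation; sensor names, thresholds, and case-state fields are hypothetical.

    # Illustrative sketch only (not the YAWL implementation): risk sensors
    # defined at design time as predicates over the state of a running case.
    # Sensor names, thresholds, and case-state fields are hypothetical.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class RiskSensor:
        name: str
        condition: Callable[[dict], bool]  # True when the risk is detected
        mitigation: str                    # suggested remedial action

    SENSORS = [
        RiskSensor("deadline_risk",
                   lambda c: c["elapsed_h"] > 0.8 * c["deadline_h"],
                   "reallocate the work item to a faster resource"),
        RiskSensor("overload_risk",
                   lambda c: c["queue_length"] > 10,
                   "skip an optional work item"),
    ]

    def monitor(case: dict) -> list:
        """Return (sensor name, suggested mitigation) for triggered sensors."""
        return [(s.name, s.mitigation) for s in SENSORS if s.condition(case)]

    for name, action in monitor({"elapsed_h": 20, "deadline_h": 24,
                                 "queue_length": 14}):
        print(f"{name}: {action}")

Evaluating such predicates at each monitoring interval yields the run-time alerts that the mitigation component could then act on.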
Abstract:
Purpose: The purpose of this paper is to clarify how end-users’ tacit knowledge can be captured and integrated in an overall business process management (BPM) approach. Current approaches to supporting stakeholders’ collaboration in the modelling of business processes envision an egalitarian environment where stakeholders interact in the same context, using the same languages and sharing the same perspectives on the business process. Therefore, such stakeholders have to collaborate in the context of process modelling using a language that some of them do not master, and have to integrate their various perspectives. Design/methodology/approach: The paper applies the SECI knowledge management process to analyse the problems of traditional top-down BPM approaches and BPM collaborative modelling tools. The SECI model is also applied to Wikipedia, a successful Web 2.0-based knowledge management environment, to identify how tacit knowledge is captured in a bottom-up approach. Findings: The paper identifies a set of requirements for a hybrid BPM approach, both top-down and bottom-up, and describes a new BPM method based on a stepwise discovery of knowledge. Originality/value: This new approach, Processpedia, enhances collaborative modelling among stakeholders without enforcing egalitarianism. In Processpedia, tacit knowledge is captured and standardised into the organisation’s business processes by fostering an ecological participation of all the stakeholders and capitalising on stakeholders’ distinctive characteristics.
Abstract:
This project investigates machine listening and improvisation in interactive music systems with the goal of improvising musically appropriate accompaniment to an audio stream in real-time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation to allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent – the Jambot – capable of real-time musical improvisation in an ensemble setting. Conceptually the agent’s functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real-time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting. All of these innovations have been combined into a computational agent – the Jambot – which is capable of producing improvised percussive musical accompaniment to an audio stream in real-time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussions of the Jambot and its component algorithms, along with some comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
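To give a flavour of the reception-domain task, the sketch below implements a generic spectral-flux onset detector in Python. It illustrates the standard technique, not the Jambot's algorithms; the frame size, hop size, and peak-picking threshold are assumptions.

    # Illustrative sketch of the standard spectral-flux technique, not the
    # Jambot's algorithms. Frame size, hop size, and the peak-picking
    # threshold are assumptions.
    import numpy as np

    def onset_strength(x, frame=1024, hop=512):
        """Half-wave rectified spectral flux per analysis frame."""
        window = np.hanning(frame)
        n_frames = 1 + (len(x) - frame) // hop
        mags = [np.abs(np.fft.rfft(x[i * hop:i * hop + frame] * window))
                for i in range(n_frames)]
        flux = np.zeros(n_frames)
        for i in range(1, n_frames):
            diff = mags[i] - mags[i - 1]
            flux[i] = np.sum(diff[diff > 0])  # count only energy increases
        return flux

    def pick_onsets(flux, hop=512, sr=44100, k=1.5):
        """Onset times (s) where flux exceeds mean + k*std; a real detector
        would add local-maximum picking to avoid duplicate hits."""
        frames = np.flatnonzero(flux > flux.mean() + k * flux.std())
        return frames * hop / sr

    # Synthetic test: an impulse every 0.5 s buried in low-level noise.
    sr = 44100
    x = 0.01 * np.random.randn(2 * sr)
    x[::sr // 2] += 1.0
    print(pick_onsets(onset_strength(x), sr=sr))

Classifying kick, snare and hi-hat, and doing all of this at low latency on complex polyphonic audio, is the harder problem the thesis addresses beyond this baseline.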
Abstract:
The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. For modest image sizes the processing rate can be sufficiently 'real-time' to allow for closed-loop control. Focus-of-attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a firewire or web camera (support provided) and output to a robot (not provided) it would be possible to implement a visual servo system entirely in MATLAB. The Toolbox provides many functions that are useful in machine vision and vision-based control, and is useful for photometry, photogrammetry, and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion.
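As a rough analogue of one operation in this list, blob feature extraction, the following Python/SciPy sketch (not the Toolbox's MATLAB API; the test image is synthetic) labels connected components and reports their centroids and areas:

    # A Python/SciPy analogue of blob feature extraction, not the Toolbox's
    # MATLAB API. The test image is synthetic.
    import numpy as np
    from scipy import ndimage

    # Synthetic binary image containing two rectangular "blobs".
    img = np.zeros((64, 64))
    img[10:20, 10:20] = 1.0
    img[40:55, 30:50] = 1.0

    binary = img > 0.5                        # threshold
    labels, n_blobs = ndimage.label(binary)   # connected-component labelling
    centroids = ndimage.center_of_mass(binary, labels, range(1, n_blobs + 1))
    areas = ndimage.sum(binary, labels, range(1, n_blobs + 1))

    for (row, col), area in zip(centroids, areas):
        print(f"blob at ({row:.1f}, {col:.1f}), area {area:.0f} px")

Blob centroids of this kind are a typical image feature fed to a visual servo loop of the sort the abstract describes.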
Abstract:
This year marks the completion of data collection for year three (Wave 3) of the CAUSEE study. This report uses data from the first three years and focuses on the process of learning and adaptation in the business creation process. Most start-ups need to change their business model, their product, their marketing plan, their market or something else about the business to be successful. PayPal changed their product at least five times, moving from handheld security, to enterprise apps, to consumer apps, to a digital wallet, to payments between handhelds, before finally stumbling on the model that made it a multi-billion-dollar company revolving around email-based payments. PayPal is not alone, and anecdotes abound of start-ups changing direction: Symantec started as an artificial intelligence company, Apple started out selling plans to build computers, and Microsoft tried to peddle compilers before licensing an operating system out of New Mexico. To what extent do Australian new ventures change and adapt as their ideas and businesses develop? As a longitudinal study, CAUSEE was designed specifically to observe development in the venture creation process. In this research briefing paper, we compare development over time of randomly sampled Nascent Firms (NF) and Young Firms (YF), concentrating on the surviving cases. We also compare NFs with YFs at each yearly interval. The 'high potential' oversample is not used in this report.
Abstract:
The ninth release of the Toolbox represents over fifteen years of development and a substantial level of maturity. This version captures a large number of changes and extensions generated over the last two years in support of my new book “Robotics, Vision & Control”. The Toolbox has always provided many functions that are useful for the study and simulation of classical arm-type robotics, for example kinematics, dynamics, and trajectory generation. The Toolbox is based on a very general method of representing the kinematics and dynamics of serial-link manipulators. These parameters are encapsulated in MATLAB® objects; robot objects can be created by the user for any serial-link manipulator, and a number of examples are provided for well-known robots such as the Puma 560 and the Stanford arm, amongst others. The Toolbox also provides functions for manipulating and converting between datatypes such as vectors, homogeneous transformations and unit quaternions, which are necessary to represent 3-dimensional position and orientation. This ninth release has been significantly extended to support mobile robots. For ground robots the Toolbox includes standard path planning algorithms (bug, distance transform, D*, PRM), kinodynamic planning (RRT), localization (EKF, particle filter), map building (EKF) and simultaneous localization and mapping (EKF), as well as a Simulink model of a non-holonomic vehicle. The Toolbox also includes a detailed Simulink model of a quadcopter flying robot.
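The distance-transform planner in that list is simple enough to sketch from scratch. The Python version below illustrates the algorithm in general, not the Toolbox's MATLAB implementation; the occupancy grid is a toy example. It computes a wavefront of distances from the goal over the grid and then descends it from the start:

    # Illustrative from-scratch Python version of a grid-based
    # distance-transform (wavefront) planner, not the Toolbox's MATLAB
    # implementation. The occupancy grid is a toy example.
    from collections import deque

    def wavefront(grid, goal):
        """BFS distance from the goal to every free cell (None = unreachable).
        grid: 2D list with 0 = free cell, 1 = obstacle."""
        rows, cols = len(grid), len(grid[0])
        dist = [[None] * cols for _ in range(rows)]
        dist[goal[0]][goal[1]] = 0
        queue = deque([goal])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 0 and dist[nr][nc] is None):
                    dist[nr][nc] = dist[r][c] + 1
                    queue.append((nr, nc))
        return dist

    def descend(dist, start):
        """Follow strictly decreasing distances from the start to the goal."""
        rows, cols = len(dist), len(dist[0])
        path, (r, c) = [start], start
        while dist[r][c] != 0:
            neighbours = [(r + dr, c + dc)
                          for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= r + dr < rows and 0 <= c + dc < cols
                          and dist[r + dr][c + dc] is not None]
            r, c = min(neighbours, key=lambda p: dist[p[0]][p[1]])
            path.append((r, c))
        return path

    grid = [[0, 0, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0]]
    print(descend(wavefront(grid, goal=(2, 3)), start=(0, 0)))

Because the wavefront is a breadth-first distance field, descending it always yields a shortest grid path around the obstacles.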
Abstract:
In this article Caroline Heim explores an avenue for the audience's contribution to the theatrical event that has emerged as increasingly important over the past decade: post-performance discussions. With the exception of theatres that actively encourage argument, such as the Staatstheater Stuttgart, most extant audience discussions in Western mainstream theatres privilege the voice of the theatre expert. Caroline Heim presents case studies of post-performance discussions held after performances of Anne of the Thousand Days and Who's Afraid of Virginia Woolf? which trialled a new model of audience co-creation. An audience text which informs the theatrical event was created, and a new role, that of audience critic, was established in the process.