426 results for software creation methodology
Abstract:
This project investigates musicalisation and intermediality in the writing and devising of composed theatre. Its research question asks: “How does the narrative of a musical play differ when it emerges from a setlist of original songs?”, the aim being to create a performance event that is neither music nor theatre. This involves the composition of lyrics, music, action, spoken text and projected image, gathered in a script and presented in performance. Scholars such as Kulezic-Wilson (in Kendrick and Roesner, 2011: 34) outline the acoustic dimension of the ‘performative turn’ (Mungen, Ernst and Bentzweizer, 2012) as heralding “…a shift of emphasis on how meaning is created (and veiled) and how the spectrum of theatrical creation and reception is widened.” Rebstock and Roesner (2012) capture approaches similar to this, building on Lehmann’s work on the post-dramatic, under the new term ‘composed theatre’. This practice-led research draws influence from these new theoretical frames, pushing beyond ‘the musical’. Springing from a set of original songs in dialogue with performed narrative, Bear with Me is a 45-minute, music-driven work for children involving projected image and participatory action. Bear with Me’s intermedial hybrid of theatrical, screen and concert presentations shows that a simple setlist of original songs can be the starting point for the structure of a complex intermedial performance. Bear with Me was programmed into the Queensland Performing Arts Centre’s Out of the Box Festival. It was first performed in the Tony Gould Gallery at the Queensland Performing Arts Centre in June 2012. The season sold out. A masterclass on my playwriting methodology was presented at the Connecting The Dots Symposium, which ran alongside the festival.
Abstract:
Nurse researchers are increasingly adopting qualitative methodologies for research practice and theory development. These approaches to research are, in many cases, more appropriate for the field of nursing inquiry than the previously dominant techno-rational methods. However, there remains the issue of adapting methodologies developed in other academic disciplines to the nursing research context. This paper draws upon my own experience with interpretive research to raise questions about the issue of nursing research within a social science research framework. The paper argues that by integrating the characteristics of nursing practice with the characteristics of research practice, the researcher can develop a 'nursing lens', an approach to qualitative research that brings an added dimension to social science methodologies in the nursing research context. Attention is drawn to the unique nature of the nurse-patient relationship, and the ways in which this aspect of nursing practice can enhance nursing research. Examples are given from interview transcripts to support this position.
Abstract:
Building Information Modelling (BIM) appears to be the next evolutionary link in project delivery within the AEC (Architecture, Engineering and Construction) industry. There have been several surveys of implementation at the local level, but to date little is known of the international context. This paper is a preliminary report of a large-scale electronic survey of the implementation of BIM and its impact on AEC project delivery and project stakeholders in Australia and internationally. National and regional patterns of BIM usage will be identified. These patterns will include disciplinary users, project lifecycle stages, technology integration (including software compatibility) and organisational issues such as human resources and interoperability. Also considered are the current status of the inclusion of BIM within tertiary-level curricula and the potential for the creation of a new discipline.
Abstract:
Ethnography is now a well-established research methodology for virtual environments, and the vast majority of accounts, whether of textual or graphic environments, have one aspect in common: the embodied avatar. In this article, I first discuss the applicability of such a methodology to non-avatar environments such as Eve Online, considering where the methodology works and the issues that arise in its implementation, particularly for the consideration of sub-communities within the virtual environment. Second, I consider what alternative means exist for obtaining the information gained through an ethnographic study of the virtual environment. To that end, I consider the practical and ethical implications of utilizing existing accounts, the importance of the meta-game discourse, including those sources outside the control of the environment developer, and finally the utility of combining personal observations with the accounts of other ethnographers, both within and between environments.
Abstract:
Since 2007 Kite Arts Education Program (KITE), based at Queensland Performing Arts Centre (QPAC), has been engaged in delivering a series of theatre-based experiences for children in low socio-economic primary schools in Queensland. The artist-in-residence (AIR) project titled Yonder includes performances developed by the children, with the support and leadership of teacher artists from KITE, for their community and parents/carers, supported by a peak community cultural institution. In 2009, Queensland Performing Arts Centre partnered with Queensland University of Technology (QUT) Creative Industries Faculty (Drama) to conduct a three-year evaluation of the Yonder project to understand the operational dynamics, artistic outputs and the educational benefits of the project. This paper outlines the research findings for children engaged in the Yonder project in the interrelated areas of literacy development and social competencies. Findings are drawn from six iterations of the project in suburban locations on the edge of Brisbane city and in regional Queensland.
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that use such a technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
This research examines the entrepreneurship phenomenon, and the question: Why are some venture attempts more successful than others? This question is not a new one. Prior research has answered it by describing those who engage in nascent entrepreneurship. Yet this approach yielded little consensus and offers little comfort for those newly considering venture creation (Gartner, 1988). Rather, this research considers the process of venture creation, by focusing on the actions of nascent entrepreneurs. However, the venture creation process is complex (Liao, Welsch, & Tan, 2005) and multi-dimensional (Davidsson, 2004). The process can vary in the amount of action engaged by the entrepreneur; the temporal dynamics of how action is enacted (Lichtenstein, Carter, Dooley, & Gartner, 2007); or the sequence in which actions are undertaken. And little is known about whether any, or all three, of these dimensions matter. Further, there exists scant general knowledge about how the venture creation process influences venture creation outcomes (Gartner & Shaver, 2011). Therefore, this research conducts a systematic study of what entrepreneurs do as they create a new venture. The primary goal is to develop general principles so that advice may be offered on how to ‘proceed’, rather than how to ‘be’. Three integrated empirical studies were conducted that separately focus on each of the interrelated dimensions. The basis for this was a randomly sampled longitudinal panel of nascent ventures. Upon recruitment these ventures were in the process of being created, but yet to be established as new businesses. The ventures were tracked one year later to follow up on outcomes. Accordingly, this research makes the following original contributions to knowledge. First, the findings suggest that all three of the dimensions play an important role: action, dynamics, and sequence. This implies that future research should take a multi-dimensional view of the venture creation process. Failing to do so can only result in a limited understanding of a complex phenomenon. Second, action is the fundamental means through which venture creation is achieved. Simply put, more active venture creation efforts are more likely to be successful. Further, action is the medium which allows resource endowments their effect upon venture outcomes. Third, the dynamics of how venture creation plays out over time is also influential. Here, a process with a high rate of action which increases in intensity will more likely achieve positive outcomes. Fourth, sequence analysis suggests that the order in which actions are taken will also drive outcomes. Although venture creation generally flows in sequence from discovery toward exploitation (Shane & Venkataraman, 2000; Eckhardt & Shane, 2003; Shane, 2003), processes that actually proceed in this way are less likely to be realized. Instead, processes which specifically intertwine discovery and exploitation action together in symbiosis more likely achieve better outcomes (Sarasvathy, 2001; Baker, Miner, & Eesley, 2003). Further, an optimal venture creation order exists somewhere between these sequential and symbiotic process archetypes. A process which starts out as symbiotic discovery and exploitation, but switches to focus exclusively on exploitation later on, is most likely to achieve venture creation. These sequence findings are unique, and suggest future integration between opposing theories for order in venture creation.
Abstract:
Smartphones are steadily gaining popularity, creating new application areas as their capabilities increase in terms of computational power, sensors and communication. Emerging features of mobile devices open opportunities for new threats. Android is one of the newer operating systems targeting smartphones. While based on a Linux kernel, Android has unique properties and specific limitations due to its mobile nature. This makes it harder to detect and react to malware attacks using conventional techniques. In this paper, we propose an Android Application Sandbox (AASandbox) which is able to perform both static and dynamic analysis on Android programs to automatically detect suspicious applications. Static analysis scans the software for malicious patterns without installing it. Dynamic analysis executes the application in a fully isolated environment, i.e. a sandbox, which intervenes and logs low-level interactions with the system for further analysis. Both the sandbox and the detection algorithms can be deployed in the cloud, providing fast and distributed detection of suspicious software in a mobile software store akin to Google's Android Market. Additionally, AASandbox might be used to improve the efficiency of classical anti-virus applications available for the Android operating system.
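To make the static-analysis step concrete, here is a minimal Python sketch of pattern-based APK scanning in the spirit described above; the pattern list, file handling and function names are illustrative assumptions, not AASandbox's actual implementation.

```python
import sys
import zipfile

# Illustrative byte patterns whose presence in Dalvik bytecode may flag
# an app for closer dynamic analysis (an assumed list, not AASandbox's).
SUSPICIOUS_PATTERNS = [
    b"Runtime",          # Runtime.exec: shell command execution
    b"sendTextMessage",  # silent SMS sending (premium-rate abuse)
    b"DexClassLoader",   # dynamic loading of hidden code
    b"/system/bin/su",   # attempts to obtain root privileges
]

def static_scan(apk_path: str) -> list:
    """Scan an APK's classes.dex for suspicious byte patterns without
    installing or executing the application."""
    with zipfile.ZipFile(apk_path) as apk:
        dex = apk.read("classes.dex")
    return [p.decode() for p in SUSPICIOUS_PATTERNS if p in dex]

if __name__ == "__main__":
    findings = static_scan(sys.argv[1])
    print("suspicious:" if findings else "clean", findings)
```

In a pipeline like the one described, apps flagged by such a cheap static pass would then be executed in the isolated sandbox for low-level logging.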
Abstract:
Purpose: Within the context of high global competitiveness, knowledge management (KM) has proven to be one of the major factors contributing to enhanced business outcomes. Furthermore, knowledge sharing (KS) is one of the most critical of all KM activities. From a manufacturing industry perspective, supply chain management (SCM) and product development process (PDP) activities require a high proportion of company resources such as budget and manpower. Therefore, manufacturing companies are striving to strengthen SCM, PDP and KS activities in order to accelerate rates of manufacturing process improvement, ultimately resulting in higher levels of business performance (BP). A theoretical framework along with a number of hypotheses is proposed and empirically tested through correlation, factor and path analyses. Design/methodology/approach: A questionnaire survey was administered to a sample of electronic manufacturing companies operating in Taiwan to facilitate testing the proposed relationships. More than 170 respondents from 83 organisations responded to the survey. The study identified top management commitment and employee empowerment, supplier evaluation and selection, and design simplification and modular design as the key business activities that are strongly associated with business performance. Findings: The empirical study supports that key manufacturing business activities (i.e., SCM, PDP, and KS) are positively associated with BP. The findings also revealed that some specific business activities such as SCMF1, PDPF2, and KSF1 have the strongest influencing power on particular business outcomes (i.e., BPF1 and BPF2) within the context of electronic manufacturing companies operating in Taiwan. Practical implications: The finding regarding the relationship between SCM and BP identified the essential role of supplier evaluation and selection in improving business competitiveness and long-term performance. The process of forming knowledge in companies, such as creation, storage/retrieval, and transfer, does not necessarily lead to enhanced business performance; only effectively applying knowledge to the right person at the right time does. Originality/value: Based on this finding it is recommended that companies should involve suppliers in partnerships to continuously improve operations and enhance product design efforts, which would ultimately enhance business performance. Business performance depends more on an employee’s ability to turn knowledge into effective action.
Abstract:
This series of research vignettes is aimed at sharing current and interesting research findings from our team of international Entrepreneurship researchers. This vignette, written by Mr. Darren Kavanagh and Professor Per Davidsson, takes a closer look at job creation by new firms.
Abstract:
Background Predicting protein subnuclear localization is a challenging problem. Some previous works based on non-sequence information, including Gene Ontology annotations and kernel fusion, have respective limitations. The aim of this work is twofold: one is to propose a novel individual feature extraction method; another is to develop an ensemble method to improve prediction performance using comprehensive information represented in the form of a high-dimensional feature vector obtained by 11 feature extraction methods. Methodology/Principal Findings A novel two-stage multiclass support vector machine is proposed to predict protein subnuclear localizations. It only considers those feature extraction methods based on amino acid classifications and physicochemical properties. In order to speed up our system, an automatic search method for the kernel parameter is used. The prediction performance of our method is evaluated on four datasets: the Lei dataset, a multi-localization dataset, the SNL9 dataset and a new independent dataset. The overall accuracy of prediction for 6 localizations on the Lei dataset is 75.2% and that for 9 localizations on the SNL9 dataset is 72.1% in leave-one-out cross-validation; the figures are 71.7% for the multi-localization dataset and 69.8% for the new independent dataset, respectively. Comparisons with existing methods show that our method performs better for both single-localization and multi-localization proteins and achieves more balanced sensitivities and specificities on large-size and small-size subcellular localizations. The overall accuracy improvements are 4.0% and 4.7% for single-localization proteins and 6.5% for multi-localization proteins. The reliability and stability of our classification model are further confirmed by permutation analysis. Conclusions It can be concluded that our method is effective and valuable for predicting protein subnuclear localizations. A web server has been designed to implement the proposed method. It is freely available at http://bioinformatics.awowshop.com/snlpred_page.php.
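As an illustration of the evaluation protocol named above (a multiclass RBF-kernel SVM with an automatic kernel-parameter search, scored by leave-one-out cross-validation), the following is a minimal scikit-learn sketch on synthetic stand-in data; it is not the authors' two-stage system, and the feature matrix, class count and parameter grids are assumptions.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in data: 60 "proteins" x 40 features (the paper builds
# far richer vectors from 11 feature-extraction methods).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 40))
y = rng.integers(0, 6, size=60)  # 6 subnuclear localization classes

# Automatic search over the RBF kernel parameter, mirroring the paper's
# "automatic search method for the kernel parameter" (grids are assumed).
search = GridSearchCV(
    SVC(kernel="rbf", decision_function_shape="ovr"),
    param_grid={"gamma": np.logspace(-3, 1, 5), "C": [1.0, 10.0, 100.0]},
    cv=5,
)

# Leave-one-out cross-validation, the protocol behind the reported accuracies.
accuracy = cross_val_score(search, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {accuracy:.3f}")
```

Nesting the parameter search inside each leave-one-out fold, as here, keeps the kernel-parameter selection from leaking information into the held-out protein.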
Multi-level knowledge transfer in software development outsourcing projects: the agency theory view
Abstract:
In recent years, software development outsourcing has become even more complex. Outsourcing partners have begun ‘re-outsourcing’ components of their projects to other outsourcing companies to minimize cost and gain efficiencies, creating a multi-level hierarchy of outsourcing. This research-in-progress paper presents preliminary findings of a study designed to understand the knowledge transfer effectiveness of multi-level software development outsourcing projects. We conceptualize the SD-outsourcing entities using Agency Theory. This study conceptualizes, operationalises and validates the concept of knowledge transfer as a three-phase multidimensional formative index of 1) domain knowledge, 2) communication behaviors, and 3) clarity of requirements. Data analysis identified substantial, significant differences between the Principal and the Agent on two of the three constructs. Using Agency Theory, supported by the preliminary findings, the paper also provides prescriptive guidelines for reducing the friction between the Principal and the Agent in multi-level software outsourcing.
Abstract:
Although topic detection and tracking techniques have made great progress, most researchers pay little attention to the following two aspects. First, the construction of a topic model does not take the characteristics of different topics into consideration. Second, the factors that determine the formation and development of hot topics are not further analyzed. In order to correctly extract news blog hot topics, this paper views the above problems from a new perspective based on the W2T (Wisdom Web of Things) methodology, in which the characteristics of blog users, the context of topic propagation and information granularity are investigated in a unified way. The motivations and features of blog users are first analyzed to understand the characteristics of news blog topics. Then the context of topic propagation is decomposed into the blog community, the topic network and the opinion network, respectively. Some important factors such as user behavior patterns, opinion leaders and network opinion are identified to track the development trends of news blog topics. Moreover, a blog hot topic detection algorithm is proposed, in which news blog hot topics are identified by measuring duration, topic novelty, the attention degree of users and topic growth. Experimental results show that the proposed method is feasible and effective. These results are also useful for further study of the formation mechanism of opinion leaders in blogspace.
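To illustrate how the four indicators named above (duration, topic novelty, user attention and topic growth) might be combined into a single detection score, here is a small Python sketch; the weighted-sum form, weights, thresholds and field names are assumptions for illustration, not the paper's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    duration_days: float  # how long the topic has persisted
    novelty: float        # 0..1 dissimilarity from earlier topics
    attention: float      # 0..1 normalized user attention (views, comments)
    growth: float         # 0..1 normalized growth rate of related posts

def hotness(t: Topic, weights=(0.2, 0.3, 0.3, 0.2)) -> float:
    """Weighted combination of the four indicators; topics that vanish
    within a day are discarded before scoring."""
    if t.duration_days < 1.0:
        return 0.0
    persistence = min(t.duration_days / 7.0, 1.0)  # saturate at one week
    signals = (persistence, t.novelty, t.attention, t.growth)
    return sum(w * s for w, s in zip(weights, signals))

example = Topic(duration_days=5, novelty=0.8, attention=0.6, growth=0.7)
print(f"hotness = {hotness(example):.2f}")  # flag as hot above a threshold
```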
Abstract:
The main objective of this paper is to describe the development of a remote-sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture, and then by selecting and integrating each subsystem. A multifunctional air sampling instrument, with the capability for simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA’s Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted in order to evaluate the performance of the air sampling instrument in controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once the fully operative remote air sampling system was obtained, the problem of mission design was analyzed through the simulation of different scenarios. Furthermore, flight tests of the complete air sampling system were conducted to check the dynamic characteristics of the UAS with the air sampling system and to prove its capability to perform an air sampling mission following a specific flight path.
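As a sketch of what real-time geo-location monitoring of samples can look like at the data level, the following Python fragment pairs each sensor reading with a GPS fix in a time-indexed record; the schema, field names and values are purely illustrative assumptions, not the ARCAA/Flamingo telemetry format.

```python
import csv
from dataclasses import asdict, dataclass, fields

@dataclass
class Sample:
    timestamp: float      # UNIX time of the reading
    lat: float            # GPS latitude (deg)
    lon: float            # GPS longitude (deg)
    alt_m: float          # altitude (m)
    particle_ugm3: float  # particle concentration
    gas_ppm: float        # gas concentration

def log_mission(samples, path="mission_log.csv"):
    """Write geo-indexed samples so each measurement can later be
    mapped back onto the flight path."""
    names = [f.name for f in fields(Sample)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=names)
        writer.writeheader()
        for s in samples:
            writer.writerow(asdict(s))

# Hypothetical single reading over a remote site.
log_mission([Sample(0.0, -27.47, 153.03, 120.0, 14.2, 0.41)])
```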
Abstract:
Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and for communicating this to stakeholders and the broader community. To help overcome these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can the outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour-mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulated piezometric surfaces. Both alluvial-system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated through standard functions, plus the production of 2D cross-sections, data selection from the 3D scene, a back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems, and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for the interpretation of groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems.
To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted online are included in the references.