603 results for software project dependencies
Abstract:
This important research is published at a critical time in the history of PRINCE2. The world’s project managers are under incredible scrutiny and pressure to ensure their projects deliver quality on time and on budget – and even more so during a world recession. The research shows that PRINCE2 goes a long way to helping them achieve these goals. Although it originated in the UK, PRINCE2 now has a truly international reach. We are delighted that the Queensland University of Technology (QUT) has undertaken this global, thorough and informative research project. While it highlights the strengths of the methodology itself, the report also looks at the challenges organisations face when using a project management method such as PRINCE2. We’re sure the challenges will resonate with project managers around the world. Securing executive support to champion the adoption of PRINCE2, creating a robust business case and prioritising project governance are key issues that all project managers will grapple with during their career. The research also shows that to be thoroughly effective, organisations need to properly embed PRINCE2 and tailor it to suit their particular circumstances. Many successful organisations have sought the help of accredited consulting organisations to assist them in developing a programme to tailor and inculcate this method into their organisational culture. The latest version incorporates a whole chapter on tailoring PRINCE2. We believe that the publication of Directing Successful Projects with PRINCE2 and the development of further support in the form of materials, mentoring and training for senior executives will be of significant benefit to contemporary project-based organisations. The APM Group has already developed a qualification for sponsors in conjunction with the UK’s Home Office to help with this.
Abstract:
This paper develops a framework for classifying term dependencies in query expansion with respect to the role terms play in structural linguistic associations. The framework is used to classify and compare the query expansion terms produced by the unigram and positional relevance models. As the unigram relevance model does not explicitly model term dependencies in its estimation process it is often thought to ignore dependencies that exist between words in natural language. The framework presented in this paper is underpinned by two types of linguistic association, namely syntagmatic and paradigmatic associations. It was found that syntagmatic associations were a more prevalent form of linguistic association used in query expansion. Paradoxically, it was the unigram model that exhibited this association more than the positional relevance model. This surprising finding has two potential implications for information retrieval models: (1) if linguistic associations underpin query expansion, then a probabilistic term dependence assumption based on position is inadequate for capturing them; (2) the unigram relevance model captures more term dependency information than its underlying theoretical model suggests, so its normative position as a baseline that ignores term dependencies should perhaps be reviewed.
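To make the estimation process discussed above concrete, the following is a minimal Python sketch of a Dirichlet-smoothed unigram relevance model (RM1) that weights expansion terms by the query likelihood of pseudo-relevant feedback documents; the function names, smoothing parameter and data structures are illustrative assumptions, not taken from the paper.

```python
from collections import Counter
from typing import Dict, List

def unigram_relevance_model(feedback_docs: List[List[str]],
                            query: List[str],
                            mu: float = 2000.0) -> Dict[str, float]:
    """Estimate P(w | R) for expansion terms from feedback documents (RM1 sketch)."""
    # Collection statistics used for Dirichlet smoothing.
    collection = Counter()
    for doc in feedback_docs:
        collection.update(doc)
    coll_size = sum(collection.values())

    def p_w_given_d(word: str, doc_counts: Counter, doc_len: int) -> float:
        p_coll = collection[word] / coll_size if coll_size else 0.0
        return (doc_counts[word] + mu * p_coll) / (doc_len + mu)

    relevance = Counter()
    for doc in feedback_docs:
        counts = Counter(doc)
        length = len(doc)
        # The query likelihood P(Q | D) acts as the document weight.
        q_likelihood = 1.0
        for q in query:
            q_likelihood *= p_w_given_d(q, counts, length)
        # Each term's contribution is weighted by its document's query likelihood.
        for word in counts:
            relevance[word] += q_likelihood * p_w_given_d(word, counts, length)

    total = sum(relevance.values())
    return {w: s / total for w, s in relevance.items()} if total else {}
```

The highest-weighted terms returned by such a model are the expansion terms whose linguistic associations (syntagmatic or paradigmatic) the paper's framework classifies.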
Abstract:
Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints, http://eprints.qut.edu.au) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button which is a feature of the EPrints software. At QUT, the repository is managed by the Library. The repository is embedded into a number of other systems at QUT including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required. Getting the information from the ‘back end’ of the repository was very time-consuming for the Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensible, support available, skills required) and then gave each a weighting. After considering all the known options available, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for our requirements. It was deemed capable of meeting 21 out of the 31 high priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository level, organisational unit level, individual author level and individual item level. The data available include cumulative total deposits, time series deposits, deposits by item type, % fulltexts, % open access, cumulative downloads, time series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal vs external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data are displayed in chart, map and table formats. The new statistics dashboards are a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information to be very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice Chancellor-Research) to compare the full-text deposit rates (i.e.
mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.
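As an illustration of the weighted-criteria evaluation described above, here is a minimal Python sketch of scoring candidate packages against weighted business criteria; the criteria, weights and scores shown are hypothetical and do not reproduce the actual 69-item list.

```python
# Hypothetical weighted-criteria comparison of candidate statistics packages.
# Criteria, weights and scores are illustrative only.
criteria_weights = {"extensible": 3, "support_available": 2, "skills_required": 2, "granularity": 3}

candidate_scores = {
    "IRStats":       {"extensible": 4, "support_available": 3, "skills_required": 3, "granularity": 4},
    "AWStats":       {"extensible": 2, "support_available": 4, "skills_required": 4, "granularity": 2},
    "Google Urchin": {"extensible": 1, "support_available": 4, "skills_required": 5, "granularity": 2},
}

def weighted_total(scores: dict) -> int:
    """Sum each criterion score multiplied by its weight."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank packages by weighted total, highest first.
ranking = sorted(candidate_scores, key=lambda p: weighted_total(candidate_scores[p]), reverse=True)
for package in ranking:
    print(package, weighted_total(candidate_scores[package]))
```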
Abstract:
Recent studies suggest that meta-evaluation can be valuable in developing new approaches to evaluation, building evaluation capacities, and enhancing organizational learning. These new extensions of the concept of meta-evaluation are significant, given the growing emphasis on improving the quality and effectiveness of evaluation practices in the South Asian region. Following a review of the literature, this paper presents a case study of the use of concurrent meta-evaluation in the four-year project Assessing Communication for Social Change, which developed and trialled a participatory impact assessment methodology in collaboration with a development communication non-government organization (NGO) in Nepal. Key objectives of the meta-evaluation were to: continuously develop, adapt and improve the impact assessment methodology, Monitoring and Evaluation (M&E) systems and processes, and other project activities; identify impacts of the project; and build capacities in critical reflection and review. Our analysis indicates that this meta-evaluation was essential to understanding various constraints related to the organizational context that affected the success of the project and the development of improved M&E systems and capacities within the NGO. We identified several limitations of our meta-evaluation methods, which were balanced by the strengths of other methods. Our case study suggests that as well as assessing the quality, credibility and value of evaluation practices, meta-evaluations need to focus on important contextual issues that can have significant impacts on the outcomes of participatory evaluation projects. These include hierarchical organizational cultures, communication barriers, power/knowledge relations, and the time and resources available. Meta-evaluations also need to consider wider issues such as the sustainability of evaluation systems and approaches.
Abstract:
A breaker restrike is an abnormal arcing phenomenon leading to possible breaker failure. Eventually, this failure leads to interruption of the transmission and distribution of the electricity supply system until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on restrike measurement and interpretation produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008 a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the limitations of the radiometric measurement method are a band-limited frequency response as well as limitations in amplitude determination. Current restrike detection methods and algorithms require the use of wide bandwidth current transformers and high voltage dividers. A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms, which supports diagnostics, is proposed. Restrike phenomena thus become the basis of a new diagnostic process using measurements, ATP and Wavelet Transforms for online interrupter monitoring. This research project investigates the restrike switch model parameter ‘A’, the dielectric voltage gradient, related to normal and slowed cases of the contact opening velocity and the escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap should rise to a point to withstand this transient voltage. If it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap. This is the point at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in opening contact velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, an ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses along with experimental investigations. This also enhanced a mathematical CB model with the empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if there are similar restrike waveform signatures for measured and simulated waveforms. The restrike switch model applications are used for: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high frequency nozzle current magnitude was applied to an equation (derived from the literature) which can calculate the life extension of the interrupter of an SF6 high voltage CB. The restrike waveform signatures for a medium and high voltage CB identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using MATLAB software for automatic detection.
Experimental investigation of a 12 kV vacuum CB diagnostic was carried out for the parameter determination, and a passive antenna calibration was also successfully developed with applications for field implementation. The degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage associated with the slow opening velocity measurement gives a degree of contact degradation. A predictive interpretation technique is a computer-modelling approach for assessing switching device performance which allows one to vary a single parameter at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis outcome is that it is a non-intrusive method developed using measurements, ATP and Wavelet Transforms to predict and interpret breaker restrike risk. The measurements on high voltage circuit-breakers can identify degradation that can interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for the monitoring of restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to the interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
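As a rough illustration of wavelet-based restrike detection of the kind described, the following Python sketch (using the PyWavelets library rather than the MATLAB implementation referred to above) flags samples whose finest-scale detail coefficients exceed a robust threshold; the wavelet family, decomposition level and threshold factor are assumptions, not those of the thesis.

```python
import numpy as np
import pywt  # PyWavelets

def detect_restrike(signal: np.ndarray, wavelet: str = "db4",
                    level: int = 4, k: float = 5.0) -> np.ndarray:
    """Return approximate sample indices where high-frequency detail energy
    exceeds a robust threshold, a rough proxy for restrike transients."""
    # Multi-level discrete wavelet decomposition; the finest detail band
    # carries the fast transients superimposed on the power-frequency waveform.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    detail = coeffs[-1]  # finest-scale detail coefficients

    # Robust noise estimate via the median absolute deviation.
    sigma = np.median(np.abs(detail)) / 0.6745
    hits = np.where(np.abs(detail) > k * sigma)[0]

    # Map coefficient positions back to approximate sample positions.
    return hits * (len(signal) // max(len(detail), 1))
```

In practice the flagged positions would be compared against simulated ATP restrike waveform signatures before declaring a restrike.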
Abstract:
The tricky terrain of intercultural communication within the pressure-cooker environment of creating new performance work is explored through the experiences of five Australians working with 55 artists in Hanoi, Vietnam on a project called Through the Eyes of the Phoenix. Key cultural communication issues such as the concept of ‘face’, identity, translation, adaptability, ambiguity tolerance, empathy, enmeshment and the development of shared understandings are examined in relation to theories of high and low context cultures and individualist collectivist frameworks. The experiences of both Australian and Vietnamese artists are foregrounded, revealing the importance of other intercultural communication modes such as visual, kinaesthetic and tactile languages as well as the languages of their art forms. Immersion in social activities and the importance of the emotional domain are also highlighted as essential factors to survive and thrive in intense creative collaborations across cultures. These dance perspectives, embedded in practice, provide alternative contributions to the messy complexities of intercultural communication.
Abstract:
This volume represents the second collection of working papers and articles by participants in the Higher Education Policy Project (HEPP), a project funded by the Australian Research Council and based in the Graduate School of Education at the University of Queensland. The first volume, 'Higher Education in Transition: Working Papers of the Higher Education Policy Project' (Bella, McCollow and Knight, 1993), took the broad theme of "higher education in transition" in order to introduce readers to the HEPP and give them some idea of the breadth of the research being pursued by the HEPP research team itself and by the cohort of post-graduate students also associated with the project. Since then, higher education has remained in transition. Stubborn and resurgent questions continue, such as what a university ought to be, what forms of research should be supported in a mass system, and how institutional accountability can be demonstrated. In differing ways and using a variety of research perspectives and methodologies, the contributors to this volume explore these and other questions of relevance to higher education today.
Abstract:
The Queensland University of Technology (QUT) in Brisbane, Australia, is involved in a number of projects funded by the Australian National Data Service (ANDS). Currently, QUT is working on a project (Metadata Stores Project) that uses open source VIVO software to aid in the storage and management of metadata relating to data sets created/managed by the QUT research community. The registry (called QUT Research Data Finder) will support the sharing and reuse of research datasets, within and external to QUT. QUT uses VIVO for both the display and the editing of research metadata.
Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. Thus, the TxDOT Austin District Pavement Engineer (DPE) has investigated methods to strategically allocate available pavement funding to potential projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports the Austin District 4-year pavement management plan development. The study developed candidate project selection and ranking algorithms that evaluate the pavement condition of each candidate project using data contained in the Pavement Management Information System (PMIS) database and incorporate insights from Austin District pavement experts, and then implemented the developed method and supporting algorithm. This process previously required weeks to complete, but now requires about 10 minutes including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluation and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and allocating funds to the right projects at the right time.
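A minimal Python sketch of a network-level screening and ranking step of the kind described follows; the condition fields, weights and thresholds are hypothetical illustrations and do not reproduce TxDOT's PMIS scoring.

```python
from dataclasses import dataclass

@dataclass
class Section:
    """A candidate pavement section with simplified PMIS-style condition data."""
    section_id: str
    distress_score: float   # 0-100, higher is better
    ride_score: float       # 0-5, higher is better
    traffic_aadt: int       # annual average daily traffic

def priority(section: Section) -> float:
    """Hypothetical priority: worse condition and higher traffic rank higher."""
    condition = 0.7 * section.distress_score + 0.3 * (section.ride_score / 5.0) * 100.0
    need = 100.0 - condition
    return need * (1.0 + section.traffic_aadt / 100_000.0)

def screen_and_rank(sections: list[Section], min_need: float = 10.0) -> list[Section]:
    """Drop sections already in good condition, then rank the rest by priority."""
    candidates = [s for s in sections if priority(s) >= min_need]
    return sorted(candidates, key=priority, reverse=True)
```

Expert judgment of the kind mentioned in the abstract would typically enter as adjustments to such weights or as overrides after the automated ranking.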
Abstract:
The most common software analysis tools available for measuring fluorescence images are for two-dimensional (2D) data that rely on manual settings for inclusion and exclusion of data points, and computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, which then provide a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed from the approximation and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (tree-like structures). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data provide information about an extended cellular network by using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and make the software more suitable to a biological application, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous because it ideally builds the cell surface without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but not familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
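For readers outside the Imaris/MATLAB environment, the following Python sketch (using scikit-image, not the authors' Imaris XT platform) illustrates the general idea of segmenting an amorphous 3D fluorescence stack and reporting per-object volumes; the global Otsu threshold and the voxel size are assumptions, and a real pipeline would add denoising and more careful segmentation.

```python
import numpy as np
from skimage import filters, measure

def measure_3d_objects(stack: np.ndarray, voxel_volume_um3: float = 0.1):
    """Segment a 3D fluorescence stack (z, y, x) and report per-object volumes."""
    # Global intensity threshold; assumed adequate for this illustration.
    threshold = filters.threshold_otsu(stack)
    binary = stack > threshold

    # 3D connected-component labelling (26-connected).
    labels = measure.label(binary, connectivity=3)

    results = []
    for region in measure.regionprops(labels):
        results.append({
            "label": region.label,
            "volume_um3": region.area * voxel_volume_um3,  # area == voxel count in 3D
            "centroid_zyx": region.centroid,
        })
    return results
```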
Abstract:
Software as a Service (SaaS) in the Cloud has recently become more and more significant among software users and providers. A SaaS that is delivered as a composite application has many benefits, including reduced delivery costs, flexible offers of the SaaS functions and decreased subscription cost for users. However, this approach has introduced a new problem in managing the resources allocated to the composite SaaS. The resource allocation that has been done at the initial stage may be overloaded or wasted due to the dynamic environment of a Cloud. A typical data center resource management usually triggers a placement reconfiguration for the SaaS in order to maintain its performance as well as to minimize the resources used. Existing approaches for this problem often ignore the underlying dependencies between SaaS components. In addition, the reconfiguration also has to comply with SaaS constraints in terms of its resource requirements, placement requirements as well as its SLA. To tackle the problem, this paper proposes a penalty-based Grouping Genetic Algorithm for multiple composite SaaS components clustering in Cloud. The main objective is to minimize the resources used by the SaaS by clustering its components without violating any constraint. Experimental results demonstrate the feasibility and the scalability of the proposed algorithm.
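As a sketch of the penalty idea, the following Python snippet shows a fitness function that charges a penalty for each group of SaaS components whose combined demand exceeds a server's capacity; the encoding, demand figures and penalty weight are illustrative and are not the paper's algorithm.

```python
from typing import Dict, List

def penalised_fitness(grouping: List[List[str]],
                      cpu_demand: Dict[str, float],
                      server_capacity: float,
                      penalty_weight: float = 10.0) -> float:
    """Lower is better: number of servers used plus a penalty for every
    group whose total CPU demand exceeds the server capacity."""
    servers_used = len(grouping)
    violations = 0
    for group in grouping:
        load = sum(cpu_demand[c] for c in group)
        if load > server_capacity:
            violations += 1
    return servers_used + penalty_weight * violations

# Hypothetical usage: two servers, one of them overloaded.
demand = {"web": 2.0, "db": 3.5, "cache": 1.0, "worker": 2.5}
grouping = [["web", "db"], ["cache", "worker"]]
print(penalised_fitness(grouping, demand, server_capacity=4.0))
```

In a grouping genetic algorithm, such a fitness value would guide selection while crossover and mutation operate on whole groups rather than individual components; dependency and SLA constraints could be added as further penalty terms.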
Abstract:
Important differences exist in how service firms operate in comparison with manufacturing firms (cf. Johne & Storey, 1998; Tether, 2002). Despite these significant differences, little is known about whether they extrapolate to entrepreneurship in the services industry. This study seeks to address this gap by investigating how value creation occurs when project-oriented firms adopt client adaptiveness as part of their entrepreneurial posture. Specifically, we examine the effect of client adaptiveness on sustained competitive advantage. Client adaptiveness is conceptualized as the extent to which an organization engages in identifying and responding to perceived client needs and wants, which reflects the service firm’s propensity to dynamically synchronize with project/client requirements.
Abstract:
The QUT Centre for Subtropical Design reviewed tools and indices that measure ‘liveability’ on behalf of the Brisbane Development Association. This review provides insight into the concept of ‘liveability’ and how various international and local tools measure or value ‘liveability’ of cities. Liveability is subjective, and can mean different things to different individuals depending upon their situation and lifecycle stage, and is therefore difficult to define. Essentially, the term ‘liveability’ constitutes thoughts of quality of life and wellbeing of residents in urban environments.
Abstract:
Program management serves as an overall vehicle for the transformation effort. It aims to support the implementation of the decided strategy in order to achieve the expected benefits in a business transformation initiative. A program is defined as a group of related projects managed in a coordinated way to obtain benefits and control not available when managing them individually. A project, on the other hand, is a temporary endeavor undertaken to create a unique product, service, or result. Projects tend to have definite start and finish points, with the aim of delivering a predetermined output, giving them relatively clear development paths from initiation to delivery. Programs, in contrast, exist to create value beyond what could be achieved by managing the projects in isolation. Programs typically have a more strategic vision of the desired end goal, but no clearly defined path to get there. Therefore, program management is expected to deal with the uncertainty surrounding the achievement of the vision, whereas projects work best where the outputs can be well defined.
Abstract:
Buildings are key mediators between human activity and the environment around them, but details of energy usage and activity in buildings are often poorly communicated and understood. ECOS is an Eco-Visualization project that aims to contextualize the energy generation and consumption of a green building in a variety of different climates. The ECOS project is being developed for a large public interactive space installed in the new Science and Engineering Centre of the Queensland University of Technology that is dedicated to delivering interactive science education content to the public. This paper focuses on how design can develop ICT solutions from large data sets to create meaningful engagement with environmental data.