957 results for Database systems
Abstract:
In the long term, with the development of skill, knowledge, exposure and confidence within the engineering profession, rigorous analysis techniques have the potential to become a reliable and far more comprehensive method for design and verification of the structural adequacy of OPS, write Nimal J Perera, David P Thambiratnam and Brian Clark. This paper explores the potential to enhance operator safety of self-propelled mechanical plant subjected to rollover and impact of falling objects, using the non-linear and dynamic response simulation capabilities of analytical processes to supplement the quasi-static testing methods prescribed in International and Australian Codes of Practice for bolt-on Operator Protection Systems (OPS) that are post-fitted. The paper is based on research work carried out by the authors at the Queensland University of Technology (QUT) over a period of three years, involving instrumented prototype tests, scale model tests in the laboratory and rigorous analysis using validated Finite Element (FE) models. The FE codes used were ABAQUS for implicit analysis and LS-DYNA for explicit analysis. The rigorous analysis and dynamic simulation technique described in the paper can be used to investigate the structural response to accident scenarios such as multiple rollover, impact of multiple objects and combinations of such events, and thereby enhance the safety and performance of Roll Over and Falling Object Protection Systems (ROPS and FOPS). The analytical techniques are based on sound engineering principles and well-established practice for investigation of dynamic impact on all self-propelled vehicles, and are used for many other similar applications where experimental techniques are not feasible.
Abstract:
Overcoming many of the constraints to early-stage investment in biofuels production from sugarcane bagasse in Australia requires an understanding of the complex technical, economic and systemic challenges associated with the transition of established sugar industry structures from single-product agri-businesses to new diversified multi-product biorefineries. While positive investment decisions in new infrastructure require technically feasible solutions and the attainment of project economic investment thresholds, many other systemic factors will influence the investment decision. These factors include the interrelationships between feedstock availability and energy use, competing product alternatives, technology acceptance and perceptions of project uncertainty and risk. This thesis explores the feasibility of a new cellulosic ethanol industry in Australia based on the large sugarcane fibre (bagasse) resource available. The research explores industry feasibility from multiple angles, including the challenges of integrating ethanol production into an established sugarcane processing system, scoping the economic drivers and key variables relating to bioethanol projects, and considering the impact of emerging technologies in improving industry feasibility. The opportunities available from pilot-scale technology demonstration are also addressed. Systems analysis techniques are used to explore the interrelationships between the existing sugarcane industry and the developing cellulosic biofuels industry. This analysis has resulted in the development of a conceptual framework for a bagasse-based cellulosic ethanol industry in Australia, and this framework is used to assess the uncertainty in key project factors and investment risk.
The analysis showed that the fundamental issue affecting investment in a cellulosic ethanol industry from sugarcane in Australia is the uncertainty in the future price of ethanol, and that government support reducing the risks associated with early-stage investment is likely to be necessary to promote commercialisation of this novel technology. Comprehensive techno-economic models have been developed and used to assess the potential quantum of ethanol production from sugarcane in Australia, to assess the feasibility of a soda-based biorefinery at the Racecourse Sugar Mill in Mackay, Queensland, and to assess the feasibility of reducing the cost of production of fermentable sugars through the in-planta expression of cellulases in sugarcane in Australia. These assessments show that ethanol from sugarcane in Australia has the potential to make a significant contribution to reducing Australia's transportation fuel requirements from fossil fuels, and that economically viable projects exist depending upon assumptions relating to product price, ethanol taxation arrangements and greenhouse gas emission reduction incentives. The conceptual design and development of a novel pilot-scale cellulosic ethanol research and development facility is also reported in this thesis. The establishment of this facility enables the technical and economic feasibility of new technologies to be assessed in a multi-partner, collaborative environment. As a key outcome of this work, the study has delivered a facility that will enable novel cellulosic ethanol technologies to be assessed in a low investment-risk environment, reducing the potential risks associated with early-stage investment in commercial projects and hence promoting more rapid technology uptake.
While the study has focussed on an exploration of the feasibility of a commercial cellulosic ethanol industry from sugarcane in Australia, many of the same key issues will be of relevance to other sugarcane industries throughout the world seeking diversification of revenue through the implementation of novel cellulosic ethanol technologies.
Abstract:
Is it possible to control identities using performance management systems (PMSs)? This paper explores the theoretical fusion of management accounting and identity studies, providing a synthesised view of control, PMSs and identification processes. It argues that the effective use of PMSs generates a range of obtrusive mechanistic and unobtrusive organic controls that mediate identification processes to achieve a high level of identity congruency between individuals and collectives—groups and organisations. This paper contends that mechanistic control of PMSs provides sensebreaking effects and also creates structural conditions for sensegiving in top-down identification processes. These processes encourage individuals to continue the bottom-up processes of sensemaking, enacting identity and constructing identity narratives. Over time, PMS activities and conversations periodically mediate episodes of identification to connect past, current and future identities. To explore this relationship, the dual locus of control—collectives and individuals—is emphasised to explicate their interplay. This multidisciplinary approach contributes to explaining the multidirectional effects of PMSs in obtrusive as well as unobtrusive ways, in order to control the nature of collectives and individuals in organisations.
Abstract:
In many “user centred design” methods, participants are used as informants to provide data but they are not involved in further analysis of that data. This paper investigates a participatory analysis approach in order to identify the strengths and weaknesses of involving participants collaboratively in the requirements analysis process. Findings show that participants are able to use information that they themselves have provided to analyse requirements and to draw upon that analysis for design, producing insights and suggestions that might not have been available otherwise to the design team. The contribution of this paper is to demonstrate an example of a participatory analysis process.
Abstract:
This paper explores how the effective use of performance management systems (PMS) essentialises collective identities through the use of textual performances. The discursive effect of PMS operates to simplify members' logic, allowing them to understand and negotiate the complex nature of collective performance. Two case studies, drawing on a qualitative study of the implementation of PMS in two public sector organisations, point to the unique contribution of the symbolic effects of one popular PMS, the balanced scorecard (BSC). Findings suggest that the BSC, by visualising the trajectory of achieving organisational vision through multiple perspectives, measures and linkages, is a valuable identity product for achieving organisational success. The case studies also provide an analysis that contrasts aspects of the diffusion and promotion of collective identities through the use of the BSC. This demonstrates that clear direction in the identity management process is an important factor in the design and implementation of successful PMS programs. The value of this paper is to heighten recognition of the symbolic agency of PMS, as it serves as a subtle mechanism for identity management, and also to foster the collaboration of communication specialists and management accountants to achieve common organisational goals.
Abstract:
Identity is unique, multiple and dynamic. This paper explores common attributes of organisational identities, and examines the role of performance management systems (PMSs) on revealing identity attributes. One of the influential PMSs, the balanced scorecard, is used to illustrate the arguments. A case study of a public-sector organisation suggests that PMSs now place a value on the intangible aspects of organisational life as well as the financial, periodically revealing distinctiveness, relativity, visibility, fluidity and manageability of public-sector identities that sustain their viability. This paper contributes to a multi-disciplinary approach and its practical application, demonstrating an alternative pathway to identity-making using PMSs.
Abstract:
The paper explores the results of an ongoing research project to identify factors influencing the success of international and non-English speaking background (NESB) graduate students in the fields of Engineering and IT at three Australian universities: the Queensland University of Technology (QUT), the University of Western Australia (UWA), and Curtin University (CU). While the larger study explores the influence of factors from both sides of the supervision equation (e.g., students and supervisors), this paper focusses primarily on the results of an online survey involving 227 international and/or NESB graduate students in the areas of Engineering and IT at the three universities. The study reveals cross-cultural differences in perceptions of student and supervisor roles, as well as differences in the understanding of the requirements of graduate study within the Australian Higher Education context. We argue that in order to assist international and NESB research students to overcome such culturally embedded challenges, it is important to develop a model which recognizes the complex interactions of factors from both sides of the supervision relationship, in order to understand this cohort's unique pedagogical needs and develop intercultural sensitivity within postgraduate research supervision.
Abstract:
This PhD study examines whether water allocation becomes more productive when it is re-allocated from 'low' to 'high' efficiency alternative uses in village irrigation systems (VISs) in Sri Lanka. Reservoir-based agriculture is a collective farming economic activity in which inter-sectoral allocation of water is assumed to be inefficient due to market imperfections and weak user rights. Furthermore, the available literature shows that a 'head-tail syndrome' is the most common issue for intra-sectoral water management in 'irrigation' agriculture. This research analyses the issue of water allocation using primary data collected from two surveys of 460 rice farmers and 325 fish farming groups in two administrative districts in Sri Lanka. Technical efficiency estimates are undertaken for both rice farming and culture-based fisheries (CBF) production. The equi-marginal principle is applied for inter- and intra-sectoral allocation of water. Welfare benefits of water re-allocation are measured through consumer surplus estimation. Based on these analyses, the overall findings of the thesis can be summarised as follows. The estimated mean technical efficiency (MTE) for rice farming is 73%. For CBF production, the estimated MTE is 33%. The technical efficiency distribution is skewed to the left for rice farming, while it is skewed to the right for CBF production. The results show that technical efficiency of rice farming can be improved by formalising transferability of land ownership and, therefore, water user rights by enhancing the institutional capacity of Farmer Organisations (FOs). Other effective tools for improving technical efficiency of CBF production are strengthening group stability of CBF farmers, improving the accessibility of official consultation, and attracting independent investments. Inter-sectoral optimal allocation shows that the estimated inefficient volume of water in rice farming, which can be re-allocated to CBF production, is 32%.
With the application of successive policy instruments (e.g., a community transferable quota system and promoting CBF activities), there is potential for a threefold increase in the marginal value product (MVP) of total reservoir water in VISs. The existing intra-sectoral inefficient volume of water use in tail-end fields and head-end fields can potentially be removed by reducing water use by 10% and 23% respectively and re-allocating this to middle fields. This re-allocation may enable a twofold increase in the MVP of water used in rice farming without reducing the existing rice output, but will require developing irrigation practices to facilitate this re-allocation. Finally, the total productivity of reservoir water can be increased by responsible village-level institutions and primary-level stakeholders (i.e., co-management) sharing responsibility for water management, while allowing market forces to guide efficient re-allocation decisions. This PhD has demonstrated that instead of farmers allocating water between uses haphazardly, they can now base their decisions on efficient water use with a view to increasing water productivity. Such an approach will no doubt enhance farmer incomes and community welfare.
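The equi-marginal principle applied in this thesis for allocating water between uses can be stated compactly. As a hedged sketch in our own notation (the thesis's symbols are not reproduced here): with a fixed reservoir volume W split between rice farming (w_r) and CBF production (w_c),

```latex
\max_{w_r,\, w_c} \; V_r(w_r) + V_c(w_c)
\quad \text{subject to} \quad w_r + w_c = W ,
\qquad \text{with first-order condition} \qquad
\mathrm{MVP}_r \;=\; \frac{\partial V_r}{\partial w_r}
\;=\; \frac{\partial V_c}{\partial w_c} \;=\; \mathrm{MVP}_c \;=\; \lambda ,
```

where \(\lambda\) is the shadow price of reservoir water. Whenever the two marginal value products differ, re-allocating water from the low-MVP use to the high-MVP use raises total value, which is the logic behind the 32% re-allocation figure reported above.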
Abstract:
In keeping with the proliferation of free software development initiatives and the increased interest in the business process management domain, many open source workflow and business process management systems have appeared during the last few years and are now under active development. This upsurge gives rise to two important questions: What are the capabilities of these systems? And how do they compare to each other and to their closed source counterparts? In other words: what is the state of the art in the area? To gain an insight into these questions, we have conducted an in-depth analysis of three of the major open source workflow management systems – jBPM, OpenWFE, and Enhydra Shark – the results of which are reported here. This analysis is based on the workflow patterns framework and provides a continuation of the series of evaluations performed using the same framework on closed source systems, business process modelling languages, and web-service composition standards. The results from evaluations of the three open source systems are compared with each other and also with the results from evaluations of three representative closed source systems: Staffware, WebSphere MQ, and Oracle BPEL PM. The overall conclusion is that open source systems are targeted more toward developers than business analysts. They generally provide less support for the patterns than closed source systems, particularly with respect to the resource perspective, i.e. the various ways in which work is distributed amongst business users and managed through to completion.
Abstract:
It is not uncommon for enterprises today to be faced with the demand to integrate and incorporate many different and possibly heterogeneous systems, which are generally independently designed and developed, to allow seamless access. In effect, the integration of these systems results in one large whole system that must be able, at the same time, to maintain local autonomy and to continue working as an independent entity. This problem has introduced a new distributed architecture called federated systems. The most challenging issue in federated systems is how members can cooperate efficiently while preserving their autonomous characteristics, especially security autonomy. This thesis intends to address this issue. The thesis reviews the evolution of the concept of federated systems and discusses the organisational characteristics as well as remaining security issues with the existing approaches. The thesis examines how delegation can be used as a means to achieve better security, especially authorisation, while maintaining autonomy for the participating members of the federation. A delegation taxonomy is proposed as one of the main contributions. The major contribution of this thesis is to study and design a mechanism to support delegation within and between multiple security domains with constraint management capability. A novel delegation framework is proposed, comprising two modules: a Delegation Constraint Management module and a Policy Management module. The first module is designed to effectively create, track and manage delegation constraints, especially for delegation processes which require re-delegation (indirect delegation). It employs two algorithms: one to trace the root authority of a delegation constraint chain, and one to prevent potential conflicts when creating a delegation constraint chain. The first module is designed for conflict prevention, not conflict resolution.
The second module is designed to support the first module via its policy comparison capability. The major function of this module is to provide the delegation framework with the capability to compare policies and constraints (written in the format of a policy). The module is an extension of Lin et al.'s work on policy filtering and policy analysis. Throughout the thesis, case studies are used as examples to illustrate the discussed concepts. These two modules are designed to capture one of the most important aspects of the delegation process: the relationships between delegation transactions and the involved constraints, which are not well addressed by existing approaches. This contribution is significant because these relationships provide the information needed to track and enforce the involved delegation constraints and, therefore, play a vital role in maintaining and enforcing security for transactions across multiple security domains.
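The root-authority tracing mentioned above can be illustrated with a minimal sketch. The class name, fields and linked-chain representation below are our own illustrative assumptions, not the thesis's actual framework; the point is only that a re-delegation (indirect delegation) chain can be walked back to its original issuer:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DelegationConstraint:
    """One link in a delegation constraint chain (hypothetical structure)."""
    issuer: str                                        # who grants the right
    delegate: str                                      # who receives it
    parent: Optional["DelegationConstraint"] = None    # previous link, None at the root

def root_authority(constraint: DelegationConstraint) -> str:
    """Walk the re-delegation chain back to the original (root) issuer."""
    node = constraint
    while node.parent is not None:
        node = node.parent
    return node.issuer

# a chain of indirect delegation: admin -> alice -> bob -> carol
c1 = DelegationConstraint("admin", "alice")
c2 = DelegationConstraint("alice", "bob", parent=c1)
c3 = DelegationConstraint("bob", "carol", parent=c2)
```

With this representation, conflict prevention at chain-creation time could compare a new constraint against every ancestor on the walk back to the root, though the thesis's actual algorithms are not detailed in the abstract.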
Abstract:
The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system to a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. We found that both systems performed excellently in linear accuracy tests, with absolute errors not exceeding 1%. In gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
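Linear-accuracy figures like the "absolute errors not exceeding 1%" quoted above are typically computed from repeated measurements of a known-length reference object. A minimal sketch of that arithmetic (the function names and sample readings are ours, not the paper's data or protocol):

```python
import math

def rms_error(reference, measured):
    """Root-mean-square error between simultaneously sampled values (e.g. mm)."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(reference, measured)) / len(reference))

def percent_absolute_error(true_length, measured_lengths):
    """Mean absolute linear error as a percentage of a known reference length."""
    return 100.0 * sum(abs(m - true_length) for m in measured_lengths) / (len(measured_lengths) * true_length)

# hypothetical readings (mm) of a 500 mm calibration bar
readings = [499.0, 501.5, 500.5]
err_pct = percent_absolute_error(500.0, readings)  # 0.2% here, i.e. under a 1% threshold
```

The same computation applies to dynamic trials by sampling the inter-marker distance of a rigid wand over time and comparing it with the wand's known length.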
Abstract:
Design for Manufacturing (DFM) is a highly integral methodology in product development, starting from the concept development phase, with the aim of improving manufacturing productivity and maintaining product quality. While Design for Assembly (DFA) focuses on the elimination of parts or their combination with other components (Boothroyd, Dewhurst and Knight, 2002), which in most cases means performing a function and manufacturing operation in a simpler way, DFM follows a more holistic approach. In DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer to the developer team. Although DFM has been a well-established methodology for about 30 years, a Fraunhofer IAO study from 2009 found that DFM was still one of the key challenges of the German manufacturing industry. A new, knowledge-based approach to DFM, eliminating steps of DFM, was introduced in Paul and Al-Dirini (2009). The concept focuses on a concurrent engineering process between the manufacturing engineering and product development systems, whereas current product realization cycles depend on a rigorous back-and-forth examine-and-correct approach to ensure compatibility of any proposed design with the DFM rules and guidelines adopted by the company. The key to achieving reductions is to incorporate DFM considerations into the early stages of the design process. A case study of DFM application in an automotive powertrain engineering environment is presented. It is argued that a DFM database needs to be interfaced to the CAD/CAM software, which will constrain designers to the DFM criteria. Consequently, a notable reduction of development cycles can be achieved. The case study follows the hypothesis that current DFM methods do not improve product design in the manner claimed by the DFM method.
The critical case was to identify DFA/DFM recommendations or program actions that appear repeatedly in different sources. Repetitive DFM measures are identified and analyzed, and it is shown how a modified DFM process can mitigate a non-fully integrated DFM approach.
Abstract:
In 1999 Richards compared the accuracy of commercially available motion capture systems commonly used in biomechanics. Richards identified that in static tests the optical motion capture systems generally produced RMS errors of less than 1.0 mm. During dynamic tests, the RMS error increased to up to 4.2 mm in some systems. In the last 12 years motion capture systems have continued to evolve and now include high-resolution CCD or CMOS image sensors, wireless communication, and high full-frame sampling frequencies. In addition to hardware advances, there have also been a number of advances in software, including improved calibration and tracking algorithms, real-time data streaming, and the introduction of the c3d standard. These advances have allowed the system manufacturers to maintain a high retail price in the name of advancement. In areas such as gait analysis and ergonomics, many of the advanced features such as high-resolution image sensors and high sampling frequencies are not required due to the nature of the tasks often investigated. Recently Natural Point introduced low-cost cameras which, on face value, appear suitable as at the very least a high-quality teaching tool in biomechanics, and possibly even a research tool when coupled with the correct calibration and tracking software. The aim of the study was therefore to compare both the linear accuracy and the quality of angular kinematics from a typical high-end motion capture system and a low-cost system during a simple task.
Abstract:
The Cardiac Access-Remoteness Index of Australia (Cardiac ARIA) used geographic information systems (GIS) to model population-level, road network accessibility to cardiac services before and after a cardiac event for all 20,387 population localities in Australia. The index ranged from 1A (access to all cardiac services within 1 h driving time) to 8E (limited or no access). The methodology derived an objective geographic measure of accessibility to required cardiac services across Australia. Approximately 71% of the 2006 Australian population had very good access to acute hospital services and to services after hospital discharge. This GIS model could be applied to other regions or health conditions where spatially enabled data are available.
Abstract:
Recommender systems are one of the recent inventions to deal with the ever-growing information overload in relation to the selection of goods and services in a global economy. Collaborative Filtering (CF) is one of the most popular techniques in recommender systems. CF recommends items to a target user based on the preferences of a set of similar users known as the neighbours, generated from a database made up of the preferences of past users. With sufficient background information on item ratings, its performance is promising, but research shows that it performs very poorly in a cold-start situation, where there is not enough previous rating data. As an alternative to ratings, trust between users can be used to choose neighbours for recommendation making. Better recommendations can be achieved using an inferred trust network which mimics real-world "friend of a friend" recommendations. To extend the boundaries of the neighbourhood, an effective trust inference technique is required. This thesis proposes a trust inference technique called the Directed Series Parallel Graph (DSPG), which performs better than other popular trust inference algorithms such as TidalTrust and MoleTrust. Another problem is that reliable explicit trust data is not always available. In real life, people trust "word of mouth" recommendations made by people with similar interests, and this is often assumed in recommender systems. Through a survey, we confirm that interest similarity has a positive relationship with trust, and this can be used to generate a trust network for recommendation. In this research, we also propose a new method called SimTrust for developing trust networks based on users' interest similarity in the absence of explicit trust data. To identify interest similarity, we use users' personalised tagging information. However, we are interested in which resources a user chooses to tag, rather than the text of the tags applied.
The commonalities of the resources tagged by users can be used to form the neighbours used in the automated recommender system. Our experimental results show that our proposed tag-similarity-based method outperforms the traditional collaborative filtering approach, which usually uses rating data.
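The resource-overlap idea described above can be illustrated with a toy neighbour search: users are compared by which resources they tagged (not the tag text), here via cosine similarity over sets. This is our illustrative sketch under stated assumptions, not the thesis's SimTrust or DSPG algorithm:

```python
import math

def cosine_similarity(a: set, b: set) -> float:
    """Cosine similarity between two users' sets of tagged resource ids."""
    if not a or not b:
        return 0.0
    return len(a & b) / math.sqrt(len(a) * len(b))

def nearest_neighbours(target: str, tagged: dict, k: int = 2):
    """Rank other users by overlap in the resources they chose to tag."""
    scores = [(user, cosine_similarity(tagged[target], resources))
              for user, resources in tagged.items() if user != target]
    return sorted(scores, key=lambda pair: -pair[1])[:k]

# hypothetical tagging profiles: user -> set of resources they tagged
tagged = {
    "u1": {"r1", "r2", "r3"},
    "u2": {"r2", "r3"},
    "u3": {"r9"},
}
```

For user "u1", "u2" ranks first (two shared resources) while "u3" scores zero; in a full recommender, items tagged by the top-ranked neighbours but not yet seen by the target user would then be recommended.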