936 results for Process analysis
Abstract:
Qualitative research methods require transparency to ensure the ‘trustworthiness’ of the data analysis. The intricate processes of organizing, coding and analyzing the data are often rendered invisible in the presentation of the research findings, which requires a ‘leap of faith’ for the reader. Computer assisted data analysis software can be used to make the research process more transparent, without sacrificing rich, interpretive analysis by the researcher. This article describes in detail how one software package was used in a poststructural study to link and code multiple forms of data to four research questions for fine-grained analysis. This description will be useful for researchers seeking to use qualitative data analysis software as an analytic tool.
Abstract:
An important aspect of designing any product is validation. The virtual design process (VDP) is an alternative to hardware prototyping in which designs can be analysed without manufacturing physical samples. In recent years, VDPs have been generated mainly for animation and film applications. This paper proposes a virtual reality design process model for one such application, its use as a validation tool, and the technique is used to generate a complete design guideline and validation tool for product design. To support the design process of a product, a virtual environment and a VDP method were developed that support validation and an initial design cycle performed by a designer. A car carrier product model is used as the illustration for which the virtual design was generated. The loading and unloading sequence of the prototype model was generated using automated reasoning techniques and completed by interactively animating the product in the virtual environment before the complete design was built. Using the VDP, critical issues such as loading, unloading, Australian Design Rules (ADR) compliance and clearance analysis were addressed. The process saves time and money in physical sampling and, to a large extent, in complete math-model generation; because only schematic models are required, it also reduces the time spent on math modelling and on handling large assemblies with complex models. This extension of the VDP to design evaluation is unique and was developed and implemented successfully. In this paper a Toll logistics and J Smith and Sons car carrier developed under the author's responsibility is used to illustrate the approach of generating design validation via VDP.
Abstract:
A curriculum for a university-level course called Business Process Modeling is presented in order to provide guidance for the increasing number of institutions that are currently developing such content. The course caters to undergraduate and postgraduate students. Its content is drawn from recent research, industry practice, and established teaching material, and it teaches ways of specifying business processes for the analysis and design of process-aware information systems. The teaching approach is a blend of lectures and classroom exercises with innovative case studies, as well as reviews of research material. Students are asked to conceptualize, analyze, and articulate real-life process scenarios. Tutorials and cheat sheets assist with the learning experience. Course evaluations from 40 students suggest the adequacy of the teaching approach. Specifically, evaluations show a high degree of satisfaction with course relevance, content presentation, and teaching approach.
Abstract:
The ability to assess a commercial building for its impact on the environment at the earliest stage of design is a goal which is achievable by integrating several approaches into a single procedure directly from the 3D CAD representation. Such an approach enables building design professionals to make informed decisions on the environmental impact of the building and its alternatives during the design development stage, instead of at the post-design stage when options become limited. The indicators of interest are those which relate to consumption of resources and energy, contributions to pollution of air, water and soil, and impacts on the health and wellbeing of people in the built environment as a result of constructing and operating buildings. 3D object-oriented CAD files contain a wealth of building information which can be interrogated for the details required to analyse the performance of a design. The quantities of all components in the building can be automatically obtained from the 3D CAD objects and their constituent materials identified, to calculate a complete list of the amounts of all building products such as concrete, steel, timber and plastic. When this information is combined with a life cycle inventory database, key internationally recognised environmental indicators can be estimated. Such a fully integrated tool, known as LCADesign, has been created for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. This paper outlines the key features of LCADesign and its application to the environmental assessment of commercial buildings.
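At its core, the automated assessment described above reduces to a quantity take-off multiplied by life cycle inventory (LCI) factors and summed per indicator. The sketch below illustrates that calculation only; it is not LCADesign's implementation, and all material names and factor values are illustrative assumptions.

```python
# Minimal sketch (not LCADesign itself): combining a quantity take-off
# extracted from a 3D CAD model with life cycle inventory (LCI) factors
# to estimate environmental indicators. Values are illustrative placeholders.

# Quantities per material, e.g. derived from 3D CAD object volumes (kg)
quantities_kg = {"concrete": 250_000, "steel": 18_000, "timber": 6_500}

# Hypothetical LCI factors per kg of material
lci_factors = {
    "concrete": {"embodied_energy_MJ": 1.1, "co2e_kg": 0.13, "water_L": 1.7},
    "steel":    {"embodied_energy_MJ": 24.0, "co2e_kg": 1.9, "water_L": 28.0},
    "timber":   {"embodied_energy_MJ": 10.0, "co2e_kg": 0.45, "water_L": 5.0},
}

def aggregate_indicators(quantities, factors):
    """Sum each indicator over all materials: indicator = sum(qty * factor)."""
    totals = {}
    for material, qty in quantities.items():
        for indicator, per_kg in factors[material].items():
            totals[indicator] = totals.get(indicator, 0.0) + qty * per_kg
    return totals

print(aggregate_indicators(quantities_kg, lci_factors))
```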
Abstract:
The role that heparanase plays during metastasis and angiogenesis in tumors makes it an attractive target for cancer therapeutics. Despite this enzyme's significance, most of the assays developed to measure its activity are complex. Moreover, they usually rely on labeling variable preparations of the natural substrate heparan sulfate, making comparisons across studies precarious. To overcome these problems, we have developed a convenient assay based on the cleavage of the synthetic heparin oligosaccharide fondaparinux. The assay measures the appearance of the disaccharide product of heparanase-catalyzed fondaparinux cleavage colorimetrically using the tetrazolium salt WST-1. Because this assay has a homogeneous substrate with a single point of cleavage, the kinetics of the enzyme can be reliably characterized, giving a Km of 46 μM and a kcat of 3.5 s−1 with fondaparinux as substrate. The inhibition of heparanase by the published inhibitor PI-88 was also studied, and a Ki of 7.9 nM was determined. The simplicity and robustness of this method should not only greatly assist the routine assay of heparanase activity but could also be adapted for high-throughput screening of compound libraries, with the data generated being directly comparable across studies.
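Because fondaparinux provides a single, well-defined cleavage site, the reported parameters can be dropped straight into the Michaelis-Menten rate law. The sketch below uses the abstract's Km, kcat and PI-88 Ki values; the enzyme concentration, substrate range, and the assumption of competitive inhibition are illustrative and not taken from the paper.

```python
# Minimal sketch: Michaelis-Menten rate using the kinetic parameters reported
# in the abstract (Km = 46 uM, kcat = 3.5 per second for fondaparinux).
# Enzyme concentration and substrate range are illustrative assumptions.

KM_UM = 46.0      # Michaelis constant, micromolar
KCAT_PER_S = 3.5  # turnover number, per second

def mm_rate(substrate_um: float, enzyme_um: float = 0.01) -> float:
    """v = kcat * [E] * [S] / (Km + [S]), in uM/s."""
    return KCAT_PER_S * enzyme_um * substrate_um / (KM_UM + substrate_um)

def mm_rate_inhibited(substrate_um: float, inhibitor_nm: float,
                      ki_nm: float = 7.9, enzyme_um: float = 0.01) -> float:
    """Assumed competitive inhibition (for illustration only):
    v = kcat*[E]*[S] / (Km*(1 + [I]/Ki) + [S])."""
    km_app = KM_UM * (1.0 + inhibitor_nm / ki_nm)
    return KCAT_PER_S * enzyme_um * substrate_um / (km_app + substrate_um)

for s in (10, 46, 100, 500):  # substrate concentrations in uM
    print(f"[S]={s:>3} uM  v={mm_rate(s):.4f} uM/s  "
          f"v(+20 nM PI-88)={mm_rate_inhibited(s, 20):.4f} uM/s")
```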
Abstract:
Historically, asset management focused primarily on the reliability and maintainability of assets; organisations have since accepted the notion that a much larger array of processes govern the life and use of an asset. With this, asset management's new paradigm seeks a holistic, multi-disciplinary approach to the management of physical assets. A growing number of organisations now seek to develop integrated asset management frameworks and bodies of knowledge. This research seeks to complement the existing outputs of these organisations through the development of an asset management ontology. Ontologies define a common vocabulary for both researchers and practitioners who need to share information in a chosen domain. A by-product of ontology development is the realisation of a process architecture, for which there is also no evidence in the published literature. To develop the ontology and the subsequent asset management process architecture, a standard knowledge-engineering methodology is followed. This involves text analysis, definition and classification of terms, and visualisation through an appropriate tool (in this case, the Protégé application was used). The result of this research is the first attempt at developing an asset management ontology and process architecture.
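As a rough illustration of the kind of artefact such a knowledge-engineering exercise produces, the sketch below encodes a few asset management terms as a small OWL class hierarchy using rdflib. The class and property names are invented for illustration; this is not the ontology developed in the paper (which was built in Protégé).

```python
# Minimal sketch: a tiny, hypothetical asset management class hierarchy in OWL,
# built with rdflib. All names are illustrative assumptions.
from rdflib import Graph, Namespace, RDF, RDFS, OWL, Literal

AM = Namespace("http://example.org/asset-management#")  # hypothetical namespace
g = Graph()
g.bind("am", AM)

# Declare classes and a simple taxonomy
for cls in ("Asset", "PhysicalAsset", "Process", "MaintenanceProcess"):
    g.add((AM[cls], RDF.type, OWL.Class))
g.add((AM.PhysicalAsset, RDFS.subClassOf, AM.Asset))
g.add((AM.MaintenanceProcess, RDFS.subClassOf, AM.Process))

# Relate processes to the assets they govern
g.add((AM.governs, RDF.type, OWL.ObjectProperty))
g.add((AM.governs, RDFS.domain, AM.Process))
g.add((AM.governs, RDFS.range, AM.Asset))
g.add((AM.MaintenanceProcess, RDFS.comment,
       Literal("A process that restores or preserves the condition of an asset.")))

print(g.serialize(format="turtle"))
```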
Abstract:
With increasingly complex engineering assets and tight economic requirements, asset reliability becomes more crucial in Engineering Asset Management (EAM). Improving the reliability of systems has always been a major aim of EAM. Reliability assessment using degradation data has become a significant approach to evaluating the reliability and safety of critical systems. Degradation data often provide more information than failure time data for assessing reliability and predicting the remnant life of systems. In general, degradation is the reduction in performance, reliability, and life span of assets. Many failure mechanisms can be traced to an underlying degradation process. Degradation is a stochastic phenomenon and can therefore be modelled with several approaches. Degradation modelling techniques have generated a great amount of research in the reliability field. While degradation models play a significant role in reliability analysis, there are few review papers on them. This paper presents a review of the existing literature on commonly used degradation models in reliability analysis. The current research and developments in degradation models are reviewed and summarised, and the models are synthesised and classified into groups. Additionally, the paper attempts to identify the merits, limitations, and applications of each model, and it points to potential applications of these degradation models in asset health and reliability prediction.
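One commonly used family of such models is the Wiener (drift-diffusion) degradation process, in which failure is the first passage of a degradation threshold. The sketch below simulates that model to illustrate how degradation data translate into a life estimate; the drift, diffusion and threshold values are illustrative assumptions, not results from the paper.

```python
# Minimal sketch: Wiener-process degradation model with a failure threshold.
# Parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

DRIFT = 0.05        # mean degradation per time unit
DIFFUSION = 0.10    # volatility of the degradation increments
THRESHOLD = 10.0    # degradation level treated as failure
DT = 1.0

def simulate_failure_time(max_steps: int = 10_000) -> float:
    """Simulate X(t+dt) = X(t) + drift*dt + diffusion*sqrt(dt)*N(0,1)
    and return the first passage time of the failure threshold."""
    level = 0.0
    for step in range(1, max_steps + 1):
        level += DRIFT * DT + DIFFUSION * np.sqrt(DT) * rng.standard_normal()
        if level >= THRESHOLD:
            return step * DT
    return float("inf")

failure_times = np.array([simulate_failure_time() for _ in range(1000)])
print(f"mean simulated life: {failure_times.mean():.1f} time units")
# For a Wiener process the analytical mean first passage time is threshold/drift:
print(f"analytical mean life: {THRESHOLD / DRIFT:.1f} time units")
```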
Abstract:
CRTA technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite by approaching equilibrium conditions of decomposition through the elimination of the slow transfer of heat to the sample as a controlling parameter on the process of decomposition. Constant-rate decomposition processes of non-isothermal nature reveal changes in the sepiolite as the sepiolite is converted to an anhydride. In the dynamic experiment two dehydration steps are observed over the ~20-170 and 170-350°C temperature ranges. In the dynamic experiment three dehydroxylation steps are observed over the temperature ranges 201-337, 337-638 and 638-982°C. The CRTA technology enables the separation of the thermal decomposition steps.
Abstract:
The composition of many professional services firms in the Urban Development area has moved away from a discipline-specific ‘silo’ structure to a more multidisciplinary environment. The benefits of multidisciplinarity have been seen in industry by providing synergies across many of the related disciplines. Similarly, the Queensland University of Technology Bachelor of Urban Development degree has sought to broaden the knowledge base of students and achieve a greater level of synergy between related urban development disciplines through the introduction of generic and multidisciplinary units. This study aims to evaluate the effectiveness of delivering core property units in a multidisciplinary context. A comparative analysis was undertaken between core property units and more generic units offered in a multidisciplinary context from the introductory, intermediate and advanced years of the property program. This analysis was based on data collected from course performance surveys, student performance results and a student focus group, and was informed by a reflective process from the student perspective and lecturer/tutor feedback. The study showed that there are many benefits associated with multidisciplinary unit offerings across the QUT Urban Development program, particularly in the more generic units. However, these units require a greater degree of management. It is more difficult to organise, teach and coordinate multidisciplinary student cohorts due to differences in prior knowledge and experience between the discipline groups. In addition, the interaction between lecturers/tutors and the students frequently becomes more limited. A perception exists within the student body that this more limited face-to-face contact with academic staff is not valuable, which may be exacerbated by the quality of the complementary online teaching materials. For many academics, non-attendance at lectures was coupled with an increase in email communication. From the limited data collected during the study, there appears to be no clear correlation between large multidisciplinary student classes and student academic performance or satisfaction.
Abstract:
"By understanding how places have evolved, we are better able to guide development and change in the urban fabric and avoid the incongruity created by so much of the modern environment" (MacCormac, R (1996), An anatomy of London, Built Environment, Dec 1996 This paper proposes a theory on the relevance of mapping the evolutionary aspects of historical urban form in order to develop a measure of evaluating architectural elements within urban forms, through to deriving parameters for new buildings. By adopting Conzen's identification of the tripartite division of urban form; the consonance inurban form of a particular palce resides in the elements and measurable values tha makeup the fine grain aggregates of urban form. The paper will demonstrate throughthe case study of Brisbane in Australia, a method of conveying these essential components that constitute a cities continuity of form and active usage. By presenting the past as a repository of urban form characteristics, it is argued that concise architectural responses that stem from such knowledge should result in an engaged urban landscape. The essential proposition is that urban morphology is a missing constituent in the process of urban design, and that the approach of the geographical discipline to the study of urban morphology holds the key to providing the evidence of urban growth characteristics, and this methodology suggests possibilities for an architectural approach that can comprehensively determine qualitative aspects of urban buildings. The relevance of this research lies in a potential to breach the limitations of current urban analysis whilst continuing the evolving currency of urban morphology as an integral practice in the design of our cities.
Abstract:
While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge concerns dealing with process model complexity in order to improve the understanding of a process model by stakeholders and process analysts. Features for dealing with this complexity can be classified in two categories: 1) those that are solely concerned with the appearance of the model, and 2) those that in essence change the structure of the model. In this paper we focus on the former category and present a collection of patterns that generalize and conceptualize various existing features. The paper concludes with a detailed analysis of the degree of support of a number of state-of-the-art languages and language implementations for these patterns.
Abstract:
Public key cryptography, and with it the ability to compute digital signatures, have made it possible for electronic commerce to flourish. It is thus unsurprising that the proposed Australian NECS will also utilise digital signatures in its system so as to provide a fully automated process from the creation of electronic land title instruments to the digital signing and electronic lodgment of these instruments. This necessitates an analysis of the fraud risks raised by the use of digital signatures, because a compromise of the integrity of digital signatures will lead to a compromise of the Torrens system itself. This article will show that digital signatures may in fact offer greater security against fraud than handwritten signatures; but to achieve this, digital signatures require an infrastructure whereby each component is properly implemented and managed.
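For readers unfamiliar with the mechanism the article relies on, the sketch below shows the basic sign-and-verify cycle that would underpin electronic lodgment: a private key produces a signature over an instrument, and any alteration invalidates it. It uses the Python cryptography package; the instrument content and key choices are illustrative assumptions, not NECS specifications.

```python
# Minimal sketch of digital signing and verification with ECDSA.
# The "instrument" below is a placeholder, not a real land title format.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

instrument = b"Transfer of Lot 12 on Plan 3456 from A to B"  # placeholder content

private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

# Sign: only the holder of the private key can produce this signature.
signature = private_key.sign(instrument, ec.ECDSA(hashes.SHA256()))

# Verify: anyone with the public key (e.g. via a PKI certificate) can check
# that the instrument has not been altered since signing.
try:
    public_key.verify(signature, instrument, ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature INVALID")

# Any tampering with the instrument invalidates the signature.
tampered = instrument.replace(b"Lot 12", b"Lot 13")
try:
    public_key.verify(signature, tampered, ec.ECDSA(hashes.SHA256()))
    print("tampered instrument accepted (should not happen)")
except InvalidSignature:
    print("tampered instrument rejected")
```

The article's point that the infrastructure must be properly implemented and managed maps onto the parts this sketch takes for granted: key generation, key custody, and trustworthy distribution of the public key.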
Abstract:
Controlled rate thermal analysis (CRTA) technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite by approaching equilibrium conditions of decomposition through the elimination of the slow transfer of heat to the sample as a controlling parameter on the process of decomposition. Constant-rate decomposition processes of non-isothermal nature reveal changes in the sepiolite as the sepiolite is converted to an anhydride. In the dynamic experiment two dehydration steps are observed over the ~20–170 and 170–350 °C temperature ranges. In the dynamic experiment three dehydroxylation steps are observed over the temperature ranges 201–337, 337–638 and 638–982 °C. The CRTA technology enables the separation of the thermal decomposition steps.
Abstract:
Construction projects can involve a diverse range of stakeholders and the success of the project depends very much on fulfilling their needs and expectations. It is important, therefore, to identify and recognize project stakeholders and develop a rigorous stakeholder management process. However, limited research has investigated the impact of stakeholders on construction projects in developing countries. A stakeholder impact analysis (SIA), based on an approach developed by Olander (2007), was adopted to investigate the stakeholders' impact on state-owned civil engineering projects in Vietnam. This involved the analysis of a questionnaire survey of 57 project managers to determine the relative importance of different stakeholders. The results show the client to have the highest level of impact on the projects, followed by project managers and the senior management of state-owned engineering firms. The SIA also provides suggestions to project managers in developing and evaluating the stakeholder management process.
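The ranking step described above (relative importance derived from project managers' ratings) can be illustrated with a minimal calculation: average each stakeholder group's ratings and sort. The sketch below uses invented ratings; it is not the study's data, nor the full stakeholder impact analysis of Olander (2007).

```python
# Minimal sketch: ranking stakeholder groups by mean rated impact from a
# questionnaire survey. Names and ratings are illustrative placeholders.
from statistics import mean

# Each list holds one rating per responding project manager (e.g. 1-5 scale).
survey_ratings = {
    "Client":                   [5, 5, 4, 5, 4],
    "Project manager":          [4, 5, 4, 4, 4],
    "Senior management (firm)": [4, 4, 3, 4, 4],
    "Local community":          [2, 3, 2, 3, 2],
}

ranked = sorted(((mean(r), name) for name, r in survey_ratings.items()),
                reverse=True)
for score, name in ranked:
    print(f"{name:<26} mean impact rating = {score:.2f}")
```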