850 results for Model Identification
Abstract:
A mathematical model for the galvanostatic discharge and recovery of porous, electrolytic manganese dioxide cathodes, similar to those found within primary alkaline batteries, is presented. The phenomena associated with discharge are modeled over three distinct size scales: a cathodic (or macroscopic) scale, a porous manganese oxide particle (or microscopic) scale, and a manganese oxide crystal (or submicroscopic) scale. The physical and chemical coupling between these size scales is included in the model. In addition, the model explicitly accounts for the graphite phase within the cathode. The model is used to predict the effects of manganese oxide particle size and proton diffusion on cathodic discharge, as well as the effects of intraparticle voids and microporous electrode structure.
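The submicroscopic step of such a model — protons diffusing into a manganese oxide crystal while current is drawn at a constant rate — can be sketched with a simple one-dimensional finite-difference scheme. This is a minimal illustration under assumed slab geometry and illustrative parameter values, not the three-scale model of the abstract; the function name and boundary treatment are choices made here.

```python
def discharge_profile(D, L, flux, n_nodes, dt, n_steps):
    """Explicit finite differences for 1-D proton diffusion,
    dc/dt = D * d2c/dx2, in a slab crystal of thickness L.
    Galvanostatic discharge: constant proton flux into the surface
    at x = 0; zero flux at the far face x = L (ghost-node boundaries)."""
    dx = L / (n_nodes - 1)
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable; reduce dt"
    c = [0.0] * n_nodes  # initial proton concentration
    for _ in range(n_steps):
        new = c[:]
        # interior nodes: standard second-difference update
        for i in range(1, n_nodes - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        # surface node: ghost node carrying the imposed proton flux
        new[0] = c[0] + 2 * r * (c[1] - c[0]) + 2 * dt * flux / dx
        # far face: zero-flux (mirror) ghost node
        new[-1] = c[-1] + 2 * r * (c[-2] - c[-1])
        c = new
    return c
```

With these ghost-node boundaries the scheme conserves protons exactly: the trapezoidal integral of the profile equals flux × time, a convenient sanity check for any discretisation of the crystal-scale diffusion step.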
Abstract:
The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been an interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially both 3D-CRT and IMRT were planned for 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypo-fractionation was investigated. The biological responses were calculated using the Källman S-model. The complication-free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than for most other tumour cell types. The effect of this on the modelled biological response for the different fractionation schedules was also investigated.
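The fractionation sensitivity the abstract describes can be illustrated with the standard linear-quadratic (LQ) model. This is a generic textbook sketch, not the Källman S-model used in the study; the parameter values (alpha, alpha/beta, clonogen number) are illustrative assumptions, and a composite such as P+ ≈ TCP·(1−NTCP) would additionally assume independent tumour and normal-tissue responses.

```python
import math

def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose under the LQ model:
    BED = n*d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

def poisson_tcp(n_fractions, dose_per_fraction, alpha, alpha_beta, n_clonogens):
    """Poisson TCP from LQ cell survival: TCP = exp(-N * SF)."""
    d = dose_per_fraction
    beta = alpha / alpha_beta
    sf = math.exp(-n_fractions * (alpha * d + beta * d * d))
    return math.exp(-n_clonogens * sf)

# For a fixed 60 Gy total dose, hypofractionation (20 x 3 Gy vs 30 x 2 Gy)
# raises the BED far more when alpha/beta is low, as suggested for prostate.
low_ab_gain = bed(20, 3.0, 1.5) - bed(30, 2.0, 1.5)    # alpha/beta = 1.5 Gy
high_ab_gain = bed(20, 3.0, 10.0) - bed(30, 2.0, 10.0)  # alpha/beta = 10 Gy
```

The gap between the two gains is exactly why a low prostate alpha/beta ratio makes hypo-fractionated schedules attractive in the modelled response.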
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless, his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
Abstract:
Business process modeling has undoubtedly emerged as a popular and relevant practice in Information Systems. Despite being an actively researched field, anecdotal evidence and experiences suggest that the focus of the research community is not always well aligned with the needs of industry. The main aim of this paper is, accordingly, to explore the current issues and the future challenges in business process modeling, as perceived by three key stakeholder groups (academics, practitioners, and tool vendors). We present the results of a global Delphi study with these three groups of stakeholders, and discuss the findings and their implications for research and practice. Our findings suggest that the critical areas of concern are standardization of modeling approaches, identification of the value proposition of business process modeling, and model-driven process execution. These areas are also expected to persist as business process modeling roadblocks in the future.
Abstract:
Providing support for reversible transformations as a basis for round-trip engineering is a significant challenge in model transformation research. While there are a number of current approaches, they require the underlying transformation to exhibit injective behaviour when changes are reversed. This, however, does not serve all practical transformations well. In this paper, we present a novel approach to round-trip engineering that does not place restrictions on the nature of the underlying transformation. Based on abductive logic programming, it allows us to compute a set of legitimate source changes that equate to a given change to the target model. Encouraging results are derived from an initial prototype that supports most concepts of the Tefkat transformation language.
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used to describe the services, together with their input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the user’s requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content present in the Web service description language document, the support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of the query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost for traversal. The third phase, system integration, integrates the results from the preceding two phases by using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of the standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
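The link-analysis phase described above — services as graph nodes, a composition as a minimum-cost path — can be sketched with the Floyd–Warshall all-pairs shortest-path algorithm. The service names and linking costs below are hypothetical; this illustrates the graph step only, not the thesis's semantic kernel or fusion engine.

```python
import itertools

def all_pairs_shortest_paths(services, edges):
    """Floyd-Warshall over a directed service graph.
    edges: (from_service, to_service, linking_cost) triples.
    Returns (dist, nxt): minimum composition cost between every pair,
    plus a successor table for reconstructing the composition."""
    INF = float("inf")
    dist = {(u, v): (0.0 if u == v else INF) for u in services for v in services}
    nxt = {}
    for u, v, w in edges:
        dist[(u, v)] = w
        nxt[(u, v)] = v
    # k must be the outermost loop: relax through each intermediate service
    for k, u, v in itertools.product(services, services, services):
        if dist[(u, k)] + dist[(k, v)] < dist[(u, v)]:
            dist[(u, v)] = dist[(u, k)] + dist[(k, v)]
            nxt[(u, v)] = nxt[(u, k)]
    return dist, nxt

def composition(nxt, u, v):
    """Recover the cheapest chain of services from u to v, or None."""
    if (u, v) not in nxt:
        return None
    chain = [u]
    while u != v:
        u = nxt[(u, v)]
        chain.append(u)
    return chain
```

Floyd–Warshall costs O(n³) in the number of candidate services, which is consistent with restricting linking to the 10 to 15 services returned by phase-I.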
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practiced in the Building Industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally, these mainly two-dimensional representations, such as plans, sections, elevations and schedules, were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer Aided Design and Drafting (CADD), or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the Construction Industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and finally, in the real world. There is, however, no established model for learning through the use of this technology in Architecture courses.
Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. As construction industry practices evolve, BIM skills are in increasing demand throughout the industry. As such, during the second half of 2008, QUT 4th year architectural students were formally introduced for the first time to BIM, as both a technology and an industry practice. This paper outlines the teaching team’s experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provides a description of the learning model. The paper presents the results of a survey on the learners’ perspectives of both BIM and their learning experiences as they learn about and through this technology.
Abstract:
Recent decisions of the Family Court of Australia reflect concerns over the adversarial nature of the legal process. The processes and procedures of the judicial system militate against a detailed examination of the issues and rights of the parties in dispute. The limitations of the family law framework are particularly demonstrated in disputes over the custody of children, where the Court has tended to neglect the rights and interests of the primary carer. An alternative "unified family court" framework will be examined, in which the Court pursues a more active and interventionist approach in the determination of family law disputes.
Abstract:
It has previously been found that complexes composed of vitronectin and growth factors (VN:GF) enhance keratinocyte protein synthesis and migration. More specifically, these complexes have been shown to significantly enhance the migration of dermal keratinocytes derived from human skin. In view of this, it was thought that these complexes may hold potential as a novel therapy for healing chronic wounds. However, there was no evidence indicating that the VN:GF complexes would retain their effect on keratinocytes in the presence of chronic wound fluid. The studies in this thesis demonstrate for the first time that the VN:GF complexes not only stimulate proliferation and migration of keratinocytes, but also that these effects are maintained in the presence of chronic wound fluid in a 2-dimensional (2-D) cell culture model. Whilst the 2-D culture system provided insights into how the cells might respond to the VN:GF complexes, this investigative approach is not ideal as skin is a 3-dimensional (3-D) tissue. In view of this, a 3-D human skin equivalent (HSE) model, which reflects more closely the in vivo environment, was used to test the VN:GF complexes on epidermopoiesis. These studies revealed that the VN:GF complexes enable keratinocytes to migrate, proliferate and differentiate on a de-epidermalised dermis (DED), ultimately forming a fully stratified epidermis. In addition, fibroblasts were seeded on DED and shown to migrate into the DED in the presence of the VN:GF complexes and hyaluronic acid, another important biological factor in the wound healing cascade. This HSE model was then further developed to enable studies examining the potential of the VN:GF complexes in epidermal wound healing. Specifically, a reproducible partial-thickness HSE wound model was created in fully-defined media and monitored as it healed. In this situation, the VN:GF complexes were shown to significantly enhance keratinocyte migration and proliferation, as well as differentiation.
This model was also subsequently utilized to assess the wound healing potential of a synthetic fibrin-like gel that had previously been demonstrated to bind growth factors. Of note, keratinocyte re-epithelialisation was shown to be markedly improved in the presence of this 3-D matrix, highlighting its future potential for use as a delivery vehicle for the VN:GF complexes. Furthermore, when this synthetic fibrin-like gel was injected into a 4 mm diameter full-thickness wound created in the HSE, both keratinocytes and fibroblasts were shown to migrate into the gel, as revealed by immunofluorescence. Interestingly, keratinocyte migration into this matrix was found to be dependent upon the presence of the fibroblasts. Taken together, these data indicate that reproducible wounds, as created in the HSEs, provide a relevant ex vivo tool to assess potential wound healing therapies. Moreover, the models will decrease our reliance on animals for scientific experimentation. Additionally, it is clear that these models will significantly assist in the development of novel treatments, such as the VN:GF complexes and the synthetic fibrin-like gel described herein, ultimately facilitating their clinical trial in the treatment of chronic wounds.
Abstract:
Vibration-based damage identification methods examine changes in primary modal parameters or in quantities derived from them. As one method may have advantages over another under particular circumstances, a multi-criteria approach is proposed. Case studies are conducted separately on beam, plate and plate-on-beam structures. Using numerically simulated modal data obtained through finite element analysis software, indices based on changes in flexibility and strain energy before and after damage are computed and used to assess the state of structural health. Results show that the proposed multi-criteria method is effective in identifying damage in these structures.
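One of the criteria mentioned above — a flexibility-based index — can be sketched as follows. The modal flexibility matrix is the truncated sum F = Σᵢ φᵢφᵢᵀ/ωᵢ², and the per-DOF index is the largest absolute change in the corresponding column between the undamaged and damaged states. This is a generic illustration assuming mass-normalised mode shapes, not the specific algorithms of the case studies.

```python
def flexibility(mode_shapes, frequencies):
    """Modal flexibility F = sum_i phi_i phi_i^T / omega_i^2,
    truncated to the measured modes (mass-normalised shapes assumed)."""
    n = len(mode_shapes[0])
    F = [[0.0] * n for _ in range(n)]
    for phi, omega in zip(mode_shapes, frequencies):
        w2 = omega ** 2
        for a in range(n):
            for b in range(n):
                F[a][b] += phi[a] * phi[b] / w2
    return F

def flexibility_change_index(modes_u, freqs_u, modes_d, freqs_d):
    """Per-DOF damage index: max absolute change in each flexibility
    column between the undamaged (u) and damaged (d) states."""
    Fu = flexibility(modes_u, freqs_u)
    Fd = flexibility(modes_d, freqs_d)
    n = len(Fu)
    return [max(abs(Fd[a][b] - Fu[a][b]) for a in range(n)) for b in range(n)]
```

Because each mode is weighted by 1/ωᵢ², the index is dominated by the lowest modes, which is why flexibility converges quickly from the few modes that finite element simulation or testing typically provides.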