Abstract:
A mathematical model for the galvanostatic discharge and recovery of porous, electrolytic manganese dioxide cathodes, similar to those found within primary alkaline batteries, is presented. The phenomena associated with discharge are modeled over three distinct size scales: a cathodic (or macroscopic) scale, a porous manganese oxide particle (or microscopic) scale, and a manganese oxide crystal (or submicroscopic) scale. The physical and chemical coupling between these size scales is included in the model. In addition, the model explicitly accounts for the graphite phase within the cathode. The model is used to predict the effects of manganese oxide particle size and proton diffusion on cathodic discharge, as well as the effects of intraparticle voids and the microporous electrode structure.
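For orientation, transport at the crystal (submicroscopic) scale in models of this kind is typically governed by Fick's second law; the following is a minimal sketch, assuming a spherical crystal of radius R with constant proton diffusivity D_H (the notation is illustrative and not taken from the paper):

```latex
\frac{\partial c_{\mathrm{H}}}{\partial t}
  = \frac{D_{\mathrm{H}}}{r^{2}}\,
    \frac{\partial}{\partial r}\!\left(r^{2}\,
    \frac{\partial c_{\mathrm{H}}}{\partial r}\right),
  \qquad 0 < r < R
```

with a surface flux condition of the form D_H (∂c_H/∂r)|_{r=R} = j/F (up to sign convention), tying proton insertion at the crystal surface to the local galvanostatic current density j, where F is Faraday's constant.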
Abstract:
The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially, both 3D-CRT and IMRT were planned for 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypofractionation was investigated. The biological responses were calculated using the Källman S-model. The complication-free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than for most other tumour cell types. The effect of this on the modelled biological response for the different fractionation schedules was also investigated.
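For readers unfamiliar with these quantities, the sketch below computes a TCP from a generic linear-quadratic (LQ) survival model with a Poisson control probability and combines it with an assumed NTCP. It illustrates the quantities involved, not the Källman S-model used in the study, and all parameter values are invented:

```python
import math

def lq_surviving_fraction(dose_per_fx, n_fx, alpha, beta):
    """Linear-quadratic cell survival after n_fx fractions of size dose_per_fx (Gy)."""
    d = dose_per_fx
    return math.exp(-n_fx * (alpha * d + beta * d * d))

def poisson_tcp(n_clonogens, surviving_fraction):
    """Poisson tumour control probability: chance no clonogen survives."""
    return math.exp(-n_clonogens * surviving_fraction)

def p_plus(tcp, ntcp):
    """A simple complication-free tumour control measure (assumes independence)."""
    return tcp * (1.0 - ntcp)

# Example: 30 x 2 Gy = 60 Gy; alpha/beta = 3 Gy (illustrative values only).
sf = lq_surviving_fraction(2.0, 30, alpha=0.2, beta=0.0667)
tcp = poisson_tcp(1e7, sf)
print(f"TCP = {tcp:.3f}, P+ = {p_plus(tcp, 0.05):.3f}")  # assumes NTCP = 5%
```

Lowering the assumed alpha/beta ratio makes the quadratic term dominate, which is why hypofractionation (fewer, larger fractions) changes the modelled response so strongly for prostate carcinoma.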
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless, his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
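The quoted elements (code script, development rules, mapping to a virtual model, environment, selection criteria) correspond to the loop of a standard genetic algorithm; a deliberately toy sketch follows, with every name and fitness criterion invented for illustration and no claim to reproduce Frazer's systems:

```python
import random

GENES = "01"  # the "genetic code script" alphabet

def develop(genome):
    """Development rules: map the code script to a 'virtual model' (a bit count)."""
    return sum(int(g) for g in genome)

def fitness(model, environment_target=24):
    """Selection criterion: closeness to an (invented) environmental target."""
    return -abs(model - environment_target)

def evolve(pop_size=20, genome_len=32, generations=50):
    pop = ["".join(random.choice(GENES) for _ in range(genome_len))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(develop(g)), reverse=True)
        survivors = pop[: pop_size // 2]          # selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(genome_len)    # crossover
            child = list(a[:cut] + b[cut:])
            i = random.randrange(genome_len)      # mutation
            child[i] = random.choice(GENES)
            children.append("".join(child))
        pop = survivors + children
    return max(pop, key=lambda g: fitness(develop(g)))

print(evolve())
```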
Abstract:
This paper assesses and compares the performance of two daylight collection strategies, one passive and one active, for large-scale mirrored light pipes (MLPs) illuminating deep-plan buildings. Both strategies use laser cut panels (LCPs) as the main component of the collection system. The passive system comprises LCPs in pyramid form, whereas the active system uses a tilted LCP on a simple rotation mechanism that rotates 360° in 24 hours. Performance is assessed using scale model testing under sunny sky conditions and mathematical modelling. Results show average illuminance levels ranging from 50 to 250 lux for the pyramid LCP and from 150 to 200 lux for the rotating LCP. Both systems improve the performance of an MLP: compared with an open pipe, the pyramid LCP increases performance by a factor of 2.5 and the rotating LCP by a factor of 5, particularly at low sun elevation angles.
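As a quick check on the active system's geometry, one revolution per 24 hours corresponds to the sun's mean hour-angle rate of 15°/hour; a small worked example follows (the offset parameter is a hypothetical alignment term, not from the paper):

```python
# One full revolution in 24 hours matches the mean solar hour-angle rate.
DEG_PER_HOUR = 360.0 / 24.0  # = 15 degrees per hour

def panel_rotation(clock_hours, offset_deg=0.0):
    """Panel rotation angle (degrees) at a given clock time.
    offset_deg is a hypothetical term aligning the panel with solar noon."""
    return (DEG_PER_HOUR * clock_hours + offset_deg) % 360.0

print(panel_rotation(12.0))  # 180.0 degrees at noon with zero offset
```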
Abstract:
Denial-of-service (DoS) and distributed denial-of-service (DDoS) attacks attempt to temporarily disrupt users or computer resources, causing service unavailability to legitimate users in the internetworking system. The most common type of DoS attack occurs when adversaries flood a large amount of bogus data to interfere with or disrupt the service on the server. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are an example approach to help the server validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before successfully detecting bogus messages, adversaries may be able to exploit this flaw to mount an attack that overwhelms the server's resources. The attacker is able to perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify the attacks. This is an example of DoS threats in most key establishment protocols: they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service-resistant mechanisms in key establishment protocols. In particular, we focus on the design of cryptographic protocols related to key establishment that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS resistance. Basically, the formal analysis approach is used not only to carefully analyse and verify the security of a cryptographic scheme but also to help in the design stage of new protocols with a high level of security guarantee. In this research, we focus on Meadows' cost-based analysis framework, and we implement a DoS-resistant model using Coloured Petri Nets. Meadows' cost-based framework is proposed directly to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols using interactive simulations. In addition, Coloured Petri Nets help the protocol designer to clarify and reduce inconsistencies in the protocol specification. Therefore, the second objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend a formal analysis approach to our new framework for improving DoS resistance and evaluating the performance of the newly proposed mechanism. In summary, the specific outcomes of this research include the following results:
1. A taxonomy of denial-of-service-resistant strategies and techniques used in key establishment protocols;
2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols;
3. An implementation of Meadows' cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and
4. The development of new, efficient and practical DoS-resistant mechanisms to improve the resistance to denial-of-service attacks in key establishment protocols.
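For context, a client puzzle forces a client to spend computation before the server commits resources, while verification remains a single cheap check on the server side. The sketch below is a generic hash-preimage puzzle, not the specific mechanism developed in this thesis:

```python
import hashlib
import os

# Generic hash-preimage client puzzle: the server sends a fresh nonce and a
# difficulty k; the client must find a counter whose hash (with the nonce)
# starts with k zero bits. Expected client work is about 2**k hashes, while
# server-side verification is one hash.

def make_puzzle(k_bits=20):
    return os.urandom(16), k_bits

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve(nonce: bytes, k_bits: int) -> int:
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= k_bits:
            return counter
        counter += 1

def verify(nonce: bytes, k_bits: int, solution: int) -> bool:
    digest = hashlib.sha256(nonce + solution.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= k_bits

nonce, k = make_puzzle(16)  # small difficulty so the demo finishes quickly
s = solve(nonce, k)
assert verify(nonce, k, s)
```

The asymmetry (expensive to solve, cheap to verify) is what lets a server under attack defer expensive authentication work until the client has demonstrated effort.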
Abstract:
Providing support for reversible transformations as a basis for round-trip engineering is a significant challenge in model transformation research. While there are a number of current approaches, they require the underlying transformation to exhibit an injective behaviour when reversing changes. This, however, does not serve all practical transformations well. In this paper, we present a novel approach to round-trip engineering that does not place restrictions on the nature of the underlying transformation. Based on abductive logic programming, it allows us to compute a set of legitimate source changes that equate to a given change to the target model. Encouraging results are derived from an initial prototype that supports most concepts of the Tefkat transformation language.
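To make the abductive step concrete: the reverse direction searches for source-side changes whose forward transformation reproduces the observed target. Below is a toy sketch of that search, with the transformation and all names invented for illustration (it is not Tefkat or the paper's engine):

```python
# Toy abductive search: given a non-injective forward transformation and an
# observed target model, enumerate candidate source edits and keep those
# whose transformed result reproduces the target. All names are invented.

def transform(source: dict) -> dict:
    """Forward transformation; non-injective because 'note' is dropped."""
    return {"name": source["name"].upper()}

def candidate_edits(source: dict):
    """A finite, illustrative space of possible source changes."""
    yield {**source, "name": "widget"}
    yield {**source, "name": "Widget"}
    yield {**source, "name": "widget", "note": "renamed"}  # also matches
    yield {**source, "note": "touched"}                    # target unchanged

def abduce(source: dict, new_target: dict) -> list:
    """All candidate source models that explain the observed target."""
    return [s for s in candidate_edits(source) if transform(s) == new_target]

old_source = {"name": "gadget", "note": ""}
new_target = {"name": "WIDGET"}  # the observed change on the target side
# Several distinct source changes map to the same target -- exactly why
# demanding an injective reverse transformation is too restrictive.
print(abduce(old_source, new_target))
```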
Abstract:
With the advent of Service-Oriented Architecture, Web services have gained tremendous popularity. Given the availability of a large number of Web services, finding an appropriate Web service that matches the requirements of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery in matching the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, together with their input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individually matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of Web Services Description Language documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms that otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum traversal cost. The third phase, system integration, combines the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning-based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results further ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
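Since the link analysis phase applies an all-pairs shortest-path algorithm over a graph of services, a minimal Floyd-Warshall sketch may help; the service names and linking costs below are invented, not taken from the thesis:

```python
# Floyd-Warshall all-pairs shortest paths over a small service graph.
# Nodes are Web services; an edge means one service's output can feed
# another's input; weights are illustrative linking costs.
INF = float("inf")

services = ["search", "geocode", "route", "weather"]
idx = {s: i for i, s in enumerate(services)}
n = len(services)

cost = [[INF] * n for _ in range(n)]
for i in range(n):
    cost[i][i] = 0.0
edges = [("search", "geocode", 1.0), ("geocode", "route", 2.0),
         ("search", "weather", 5.0), ("route", "weather", 1.0)]
for a, b, w in edges:
    cost[idx[a]][idx[b]] = w

for k in range(n):          # allow service k as an intermediate step
    for i in range(n):
        for j in range(n):
            if cost[i][k] + cost[k][j] < cost[i][j]:
                cost[i][j] = cost[i][k] + cost[k][j]

# Cheapest composition from 'search' to 'weather' is the three-service
# chain via geocode and route (cost 4.0), beating the direct link (5.0).
print(cost[idx["search"]][idx["weather"]])
```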
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practiced in the building industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally, these mainly two-dimensional representations, such as plans, sections, elevations and schedules, were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer-Aided Design and Drafting (CADD), or ubiquitously simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the construction industry. CAD is mostly a technical tool that conforms to existing industry practices; BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and finally in the real world. There is, however, no established model for learning through the use of this technology in architecture courses. Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as its practices evolve. As such, during the second half of 2008, QUT fourth-year architecture students were formally introduced to BIM for the first time, as both a technology and an industry practice. This paper will outline the teaching team's experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provide a description of the learning model. The paper will present the results of a survey on the learners' perspectives of both BIM and their learning experiences as they learn about and through this technology.