887 results for 3D Computer Graphics
Abstract:
Hybrid face recognition, using image (2D) and structural (3D) information, has explored the fusion of Nearest Neighbour classifiers. This paper examines the effectiveness of feature modelling for each individual modality, 2D and 3D. Furthermore, it is demonstrated that the fusion of feature modelling techniques for the 2D and 3D modalities yields performance improvements over the individual classifiers. By fusing the feature modelling classifiers for each modality with equal weights, the average Equal Error Rate improves from 12.60% for the 2D classifier and 12.10% for the 3D classifier to 7.38% for the Hybrid 2D+3D classifier.
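The equal-weight fusion described above can be sketched as score-level fusion of the two modality classifiers. This is a minimal illustration only: the normalisation step and all function names are assumptions, not the paper's actual method or API.

```python
# Sketch of equal-weight score-level fusion of a 2D and a 3D face classifier.
# Scores are treated as distances (lower = better match); min-max
# normalisation puts both modalities on a comparable [0, 1] scale.

def minmax(scores):
    """Normalise a list of match scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def fuse_equal_weight(scores_2d, scores_3d):
    """Average the normalised per-gallery-entry scores of both modalities."""
    n2, n3 = minmax(scores_2d), minmax(scores_3d)
    return [0.5 * a + 0.5 * b for a, b in zip(n2, n3)]

def identify(scores_2d, scores_3d):
    """Nearest-neighbour decision: gallery index with the lowest fused distance."""
    fused = fuse_equal_weight(scores_2d, scores_3d)
    return min(range(len(fused)), key=fused.__getitem__)
```

With distances of, say, `[0.9, 0.2, 0.5]` from the 2D classifier and `[0.8, 0.1, 0.9]` from the 3D classifier, both modalities agree on gallery entry 1, and the fused decision returns it.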
Abstract:
Knowmore (House of Commons) is a large-scale generative interactive installation that incorporates embodied interaction, dynamic image creation, new furniture forms, touch sensitivity, innovative collaborative processes and multichannel generative sound creation. A large circular table is spun by hand while a computer-controlled video projection falls onto its top, creating an uncanny blend of physical object and virtual media. Participants’ presence around the table, and how they touch it, is registered, allowing up to five people to collaboratively ‘play’ this deeply immersive audiovisual work. Set within an ecological context, the work subtly asks what kinds of resources and knowledges might be necessary to move us past simply knowing what needs to be changed to actually embodying that change, whilst hinting at other deeply relational ways of understanding and knowing the world. The work has operated successfully in two high-traffic public environments, generating a subtle form of interactivity that allows different people to interact at different paces and with differing intentions, each contributing towards dramatic public outcomes. The research field involved developing new interaction and engagement strategies for eco-political media arts practice. The context was the creation of improved embodied, performative and improvisational experiences for participants, further informed by ‘Sustainment’ theory. The central question was: what ontological shifts may be necessary to better envision and align our everyday life choices in ways that respect that which is shared by all - 'The Commons'? The methodology was primarily practice-led, in concert with underlying theories.
The work’s knowledge contribution was to question how new media interactive experience and embodied interaction might prompt participants to reflect upon the kinds of resources and knowledges required to move past simply knowing what needs to be changed to actually embodying that change. This was achieved by focusing on the power of embodied learning implied by the work's strongly physical interface (i.e. the spinning of a full-size table) in concert with the complex field of layered imagery and sound. The work was commissioned by the State Library of Queensland and Queensland Artworkers Alliance and significantly funded by The Australia Council for the Arts, Arts Queensland, QUT, RMIT Centre for Animation and Interactive Media and industry partners E2E Visuals. After premiering for 3 months at the State Library of Queensland it was curated into the significant ‘Mediations Biennial of Modern Art’ in Poznan, Poland. The work formed the basis of two papers, was reviewed in Realtime (90), was overviewed at Subtle Technologies (2010) in Toronto, was shortlisted for ISEA 2011 Istanbul, and was included in the edited book/catalogue ‘Art in Spite of Economics’, a collaboration between Leonardo/ISAST (MIT Press); Goldsmiths, University of London; ISEA International; and Sabanci University, Istanbul.
Abstract:
The design of a building is a complicated process: diverse components must be formulated through unique tasks involving different personalities and organisations in order to satisfy multi-faceted client requirements. To do this successfully, the project team must encapsulate an integrated design that accommodates various social, economic and legislative factors. In this era of increasing global competition, integrated design has therefore been increasingly recognised as a way to deliver value to clients. The ‘From 3D to nD modelling’ project at the University of Salford aims to support integrated design: to enable and equip the design and construction industry with a tool that allows users to create, share, contemplate and apply knowledge from multiple perspectives of user requirements (accessibility, maintainability, sustainability, acoustics, crime, energy simulation, scheduling, costing etc.). It thus takes the concept of 3-dimensional computer modelling of the built environment to an almost infinite number of dimensions, to cope with whole-life construction and asset management issues in the design of modern buildings. The project is funded by a four-year platform grant from the Engineering and Physical Sciences Research Council (EPSRC) in the UK, awarded to a multi-disciplinary research team to enable flexibility in the research strategy and to produce leading innovation. This paper reports on the development of a business process and IT vision for how integrated environments will allow nD-enabled construction and asset management to be undertaken, and further develops many of the key issues of a future vision arising from previous CIB W78 conferences.
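The 'nD model' idea above, a single shared building model carrying an open-ended set of analysis perspectives beyond 3D geometry, can be illustrated with a minimal data structure. This is a sketch of the concept only; the class, field names and example values are invented for illustration and are not the project's actual schema.

```python
# Minimal sketch of an nD building element: 3D geometry plus an open-ended
# mapping from analysis perspective (costing, acoustics, accessibility, ...)
# to that perspective's data, all attached to one shared model element.
from dataclasses import dataclass, field

@dataclass
class NDElement:
    name: str
    geometry: tuple                                  # placeholder for 3D geometry
    dimensions: dict = field(default_factory=dict)   # perspective -> data

    def add_dimension(self, perspective, data):
        """Attach a further analysis 'dimension' to the same element."""
        self.dimensions[perspective] = data

wall = NDElement("external wall", geometry=(0.0, 0.0, 0.0))
wall.add_dimension("costing", {"unit_cost_gbp": 120.0})
wall.add_dimension("acoustics", {"rw_db": 52})
```

The point of the design is that new perspectives (crime, energy simulation, scheduling, ...) can be added without changing the underlying geometric model.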
Abstract:
To address the difficulty of propagating and synthesising information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach demonstrate a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules: a mechanism layout sketching method based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20 times increase in design efficacy with these enhancement modules in the GPAL system on a general 3D CAD platform.
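The knowledge-based function-to-form mapping mentioned above can be pictured as a rule base that links each required function to candidate structural forms. The sketch below is purely illustrative: the rule contents and the empty-list fallback (standing in, loosely, for a default-reasoning step) are assumptions, not the GPAL system's actual knowledge base.

```python
# Illustrative function-to-form rule base: each functional requirement maps
# to candidate mechanical forms. Rule contents are invented for illustration.
FUNCTION_TO_FORM = {
    "grip": ["parallel-jaw gripper", "three-finger hand"],
    "rotate": ["revolute joint", "worm gear"],
    "translate": ["prismatic joint", "rack and pinion"],
}

def map_functions_to_forms(required_functions):
    """Collect candidate forms for each required function; an unknown
    function falls back to an empty candidate list."""
    return {f: FUNCTION_TO_FORM.get(f, []) for f in required_functions}
```

For a mechanical hand, for instance, `map_functions_to_forms(["grip", "rotate"])` would return candidate forms for each of the two functions, which a later stage could assemble into a layout sketch.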
Abstract:
The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved, the dosimetry is assessed using the physical parameters of dose and volume. There has been interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models available in a commercial treatment planning system were used to compare three-dimensional conformal radiotherapy (3D-CRT) and intensity-modulated radiotherapy (IMRT) treatments of the prostate. Initially, both 3D-CRT and IMRT were planned for 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypo-fractionation was investigated. The biological responses were calculated using the Källman S-model. The complication-free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than for most other tumour cell types. The effect of this on the modelled biological response for the different fractionation schedules was also investigated.
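The role the alpha/beta ratio plays in comparing fractionation schedules can be illustrated with the standard linear-quadratic biologically effective dose (BED). Note this is the textbook LQ formula, not the Källman S-model the study actually uses, and the parameter values are illustrative only.

```python
# Standard linear-quadratic biologically effective dose:
#   BED = n * d * (1 + d / (alpha/beta))
# where n is the number of fractions and d the dose per fraction (Gy).
# A low alpha/beta (as suggested for prostate carcinoma) magnifies the
# biological effect of hypofractionation (fewer, larger fractions).

def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose (Gy) for a fractionation schedule."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Illustrative comparison at alpha/beta = 1.5 Gy (a low, prostate-like value):
conventional = bed(30, 2.0, 1.5)   # 30 x 2 Gy = 60 Gy physical dose
hypo = bed(20, 3.0, 1.5)           # 20 x 3 Gy = 60 Gy physical dose
```

For the same 60 Gy physical dose, the hypofractionated schedule gives a higher BED (180 Gy vs 140 Gy here), which is why a low tumour alpha/beta ratio makes hypo-fractionation attractive for the tumour response.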
Abstract:
John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria.
He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless, his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
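The elements Frazer enumerates (a genetic code script, rules for its development, a mapping to a virtual model, an environment and selection criteria) describe a genetic algorithm. The following is a generic minimal sketch of that loop, not Frazer's actual system: the bit-string encoding, the placeholder fitness function and all parameter values are invented for illustration.

```python
import random

# Minimal genetic algorithm: a bit-string "genetic code script", tournament
# selection, one-point crossover and per-bit mutation. The fitness function
# stands in for the "environment" and "criteria for selection".

def evolve(fitness, genome_len=16, pop_size=30, generations=50, seed=1):
    """Evolve a population of bit-strings; return the fittest genome found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)      # binary tournament selection
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p, q = pick(), pick()
            cut = rng.randrange(1, genome_len)
            child = p[:cut] + q[cut:]      # one-point crossover
            if rng.random() < 0.1:         # occasional mutation
                child[rng.randrange(genome_len)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Placeholder selection criterion: maximise the number of set bits.
best = evolve(sum)
```

Swapping in a fitness function that scores a decoded architectural model against environmental criteria turns this same loop into the kind of generative design process the passage describes.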