535 results for Computer Architecture
Abstract:
Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set so as to maximise the determinant of the variance-covariance matrix of the predictions.
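The one-at-a-time augmentation strategy described above can be illustrated with a short sketch: fit a Gaussian-process emulator to the existing runs, then add the candidate point with the largest prediction variance. This is a minimal illustration, not the authors' code; the toy function stands in for the fire model, and all names and settings are assumptions.

```python
# Sequential augmentation sketch: pick the candidate with maximum prediction variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def toy_model(x):
    # placeholder for an expensive simulator run (stands in for the fire model)
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(10, 2))            # initial design (e.g. a Latin hypercube)
y = toy_model(X)

candidates = rng.uniform(0, 1, size=(500, 2))  # candidate set for augmentation

for _ in range(5):                             # add five extra runs, one at a time
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    best = candidates[np.argmax(sd)]           # point with maximum prediction variance
    X = np.vstack([X, best])
    y = np.append(y, toy_model(best[None, :]))
```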
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. Complex computer models or codes, rather than physical experiments, are used to investigate many scientific phenomena of this nature, leading to the study of computer experiments. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, it studies how many runs a computer experiment should have and how experiments should be augmented, with attention given to the case where the response is a function over time.
Abstract:
Classical architecture has a long history of representing the idealized proportions of the human body, derived from the Vitruvian man. This association with the idealized human form has also cast architecture as symbiotic with prevailing power structures. Because architecture is always loaded with some signification, it creates a highly inscribed space. In the absence of architecture, space is not necessarily without inscription, for within the void there can exist an anti-architecture. Like the black-box theatre, it is both empty and full at the same time; in the absence of architecture, the void of space and how it is occupied becomes much more profound. As Dorita Hannah writes, ‘In denying a purely visual apprehension of built space, and suggesting a profound interiority, the black-box posits a new way of regarding the body in space.’ This paper analyses the work of Harold Pinter and his use of the body to create an anti-architecture that subverts oppressors and power structures. Pinter’s works are an important case study in this research because of their political nature. His works are also heavily tied to territory, which binds them in a dependent relationship with a simulated ‘place’. The citation accompanying the playwright’s Nobel Prize states that '...in his plays [he] uncovers the precipice under everyday prattle and forces entry into oppression's closed rooms.' In Pinter’s work, oppression manifests itself in the representation of a room, the architecture, which is the cause of a power struggle when objectified and is defeated when subjectified. The following work examines how Pinter uses the body to subjectify and represent architecture as authority, first in his earlier works, which relied on detailed mimetic sets of domestic rooms, and then in his later political works, which were freed of representational scenography. This paper also looks at the adaptation of Pinter’s work by the Belarus Free Theatre in their 2008 production ‘Being Harold Pinter’. The work of Pinter and the Belarus Free Theatre is concerned with authoritarian political structures, that is, political structures that work against ideas of individualism, ascribing to a mass-produced body as an artifact of dictatorship and conservatism. The focus on the body in space on an empty stage draws attention to the individual – the body amongst scenography can become merely another prop, lost in the borders and boundaries the scenery dictates. Through an analysis of selected works by Harold Pinter and their interpretations, this paper examines this paradox of emptiness and fullness through the body as anti-architecture in performance.
Abstract:
The exchange between the body and architecture walks a fine line between violence and pleasure. It is through the body that the subject engages with the architectural act, not via thought or reason, but through action. The materiality of architecture is often the catalyst for intense associations: the wall that defines gender or class, the double-bolted door that incarcerates, the enclosed privacy of the bedroom for the love affair. Architecture is the physical manifestation of Lefebvre’s inscribed space. It enacts social and political systems through bodily occupation. Architecture, when tested by the occupation of bodies, anchors ideology in both space and time. The architect’s script can be powerful when rehearsed honestly to the building’s intentions, and just as beautiful when rebuked by the act of protest or unfaithful occupation. This research examines this fine line between violence and pleasure in architecture through performance, in Bryony Lavery’s play Stockholm and in Revolving Door by Allora & Calzadilla, part of the recent Kaldor Public Art Projects exhibition 13 Rooms in Sydney. The research is underpinned by the work of the architect and theorist Bernard Tschumi in his two essays, Violence of Architecture and The Pleasure of Architecture. Studying architecture through the lens of performance shifts the focus of examination from pure thought to the body, because architecture is occupied through the body and not the mind.
Abstract:
Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are also often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Enabling isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification outlining the functional requirements for LibVM, identify the conditions under which these requirements can be met, define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interposition, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software-based approaches while also reducing the size of the trusted computing base, thus increasing confidence in the solution’s correctness. This thesis concludes that not only is it feasible to create such isolation domains for individual components, but that such isolation should also be a fundamental, operating-system-supported abstraction, which would lead to more stable and secure applications.
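To give a feel for the kind of interface such an architecture implies, the sketch below shows a conceptual Python analogue of calling into a component through an explicit isolation boundary rather than sharing the host's address space. It is not the thesis's LibVM API; a separate process stands in for an isolation domain, and all names (IsolationDomain, call) are hypothetical.

```python
# Conceptual sketch: call an untrusted component through an isolation boundary.
import multiprocessing as mp

def _domain_worker(conn, module_name):
    component = __import__(module_name)        # load the component inside the domain
    while True:
        msg = conn.recv()
        if msg is None:
            break
        func, args = msg
        try:
            conn.send(("ok", getattr(component, func)(*args)))
        except Exception as exc:               # a failure stays inside the domain
            conn.send(("err", repr(exc)))

class IsolationDomain:
    def __init__(self, module_name):
        self._parent, child = mp.Pipe()
        self._proc = mp.Process(target=_domain_worker, args=(child, module_name))
        self._proc.start()

    def call(self, func, *args):
        # explicit, marshalled call across the boundary (no shared pointers)
        self._parent.send((func, args))
        status, value = self._parent.recv()
        if status == "err":
            raise RuntimeError(f"component failed: {value}")
        return value

    def close(self):
        self._parent.send(None)
        self._proc.join()

# Usage: dom = IsolationDomain("json"); dom.call("dumps", {"a": 1}); dom.close()
```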
Abstract:
As buildings have become more advanced and complex, our ability to understand how they are operated and managed has diminished. Modern technologies have given us systems that look after us, but they appear to have taken away our say in how we would like our environment to be managed. The aim of this paper is to discuss our research concerning spaces that are sensitive to changing needs and that allow building users a certain level of freedom to understand and control their environment. We discuss why what we call the Active Layer is needed in modern buildings, how building inhabitants are to interact with it, and the development of interface prototypes to test the consequences of having the Active Layer in our environment.
Abstract:
The latest paradigm shift in government, termed Transformational Government, puts the citizen at the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture to structure the content of its future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general, distinguishing between non-statistical, statistical, and hybrid approaches. On the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors’ research partner to develop a customer-focussed governmental one-stop portal, providing decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
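One common statistical treatment of card-sort data, sketched below for illustration, is to build an item co-occurrence matrix from participants' groupings and hierarchically cluster it to suggest an information-architecture grouping. This is a generic example, not the study's own method; the cards and sorts are invented.

```python
# Card-sort analysis sketch: co-occurrence matrix + hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["rego renewal", "driver licence", "school enrolment", "library card"]
sorts = [                                      # each participant's grouping of card indices
    [[0, 1], [2, 3]],
    [[0, 1], [2], [3]],
    [[0], [1], [2, 3]],
]

n = len(cards)
co = np.zeros((n, n))
for sort in sorts:
    for group in sort:
        for i in group:
            for j in group:
                co[i, j] += 1
co /= len(sorts)                               # proportion of participants grouping i with j

dist = 1.0 - co                                # convert similarity to distance
np.fill_diagonal(dist, 0.0)
labels = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")
for card, lab in zip(cards, labels):
    print(lab, card)                           # suggested top-level categories
```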
Abstract:
This paper demonstrates, following Vygotsky, that language and tool use have a critical role in the collaborative problem-solving behaviour of school-age children. It reports original ethnographic classroom research examining the convergence of speech and practical activity in children’s collaborative problem solving with robotics programming tasks. The researchers analysed children’s interactions during a series of problem-solving experiments in which Lego Mindstorms toolsets were used by teachers to create robotics design challenges for 24 students in a Year 4 Australian classroom (students aged 8.5–9.5 years). The design challenges were incrementally difficult, beginning with basic programming of straight-line movement and progressing to more complex challenges in which the robots were programmed as pulleys, using string and recycled materials, to raise Lego figures from conduit pipes. Data collection involved micro-genetic analysis of students’ speech interactions with tools, peers, and other experts, teacher interviews, and student focus groups. Coding the repeated patterns in the transcripts, the authors outline the structure of the children’s social speech in joint problem solving, demonstrating the patterns of speech and interaction that play an important role in the socialisation of the school-age child’s practical intellect.
Abstract:
This study aims to redefine spaces of learning as places of learning through the direct engagement of local communities, as a way to examine and learn from real-world issues in the city. This paper exemplifies Smart City Learning, where the key goal is to promote the generation and exchange of urban design ideas for the future development of South Bank in Brisbane, Australia, informing the creation of new design policies that respond to the needs of local citizens. Specific to this project was the implementation of urban informatics techniques and approaches to promote innovative engagement strategies. Architecture and Urban Design students were encouraged to review and appropriate the real-time, ubiquitous technology, social media, and mobile devices used by urban residents to augment and mediate the physical and digital layers of urban infrastructure. Our experience in this study found that urban informatics provides an innovative opportunity to enrich students’ place of learning within the city.
Abstract:
Molecular-level computer simulations of restricted water diffusion can be used to develop models for relating diffusion tensor imaging measurements of anisotropic tissue to microstructural tissue characteristics. The diffusion tensors resulting from these simulations can then be analyzed in terms of their relationship to the structural anisotropy of the model used. As the translational motion of water molecules is essentially random, their dynamics can be effectively simulated using computers. In addition to modeling water dynamics and water-tissue interactions, the simulation software of the present study was developed to automatically generate collagen fiber networks from user-defined parameters. This flexibility provides the opportunity for further investigations of the relationship between the diffusion tensor of water and morphologically different models representing different anisotropic tissues.
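The kind of simulation described can be sketched in a few lines: random-walking water molecules are confined by an impermeable boundary aligned with one axis, and an apparent diffusion tensor is estimated from their displacements. This is a toy illustration under assumed geometry and parameters, not the study's simulation software.

```python
# Restricted-diffusion random walk and diffusion-tensor estimate (toy example).
import numpy as np

rng = np.random.default_rng(1)
n_mol, n_steps, dt = 2000, 500, 1e-3           # molecules, steps, step duration (s)
D0 = 2.0e-9                                    # free diffusion coefficient (m^2/s)
step_sd = np.sqrt(2 * D0 * dt)                 # per-axis step standard deviation
radius = 4e-6                                  # confining "fibre" radius (m), axis = z

pos = np.zeros((n_mol, 3))
for _ in range(n_steps):
    trial = pos + rng.normal(scale=step_sd, size=pos.shape)
    outside = np.hypot(trial[:, 0], trial[:, 1]) > radius
    trial[outside] = pos[outside]              # reject steps crossing the boundary
    pos = trial

t = n_steps * dt
# Einstein relation: D_ij = <x_i x_j> / (2 t) for displacements from the origin
D = pos.T @ pos / (n_mol * 2 * t)
print(np.round(D, 12))                         # restricted in x,y; close to D0 along z
```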
Abstract:
In this study, a hierarchical nano/microfibrous chitosan/collagen scaffold that approximates the structural and functional attributes of native extracellular matrix (ECM) has been developed for applicability in skin tissue engineering. Scaffolds were produced by electrospinning of chitosan followed by imbibing of collagen solution, freeze-drying, and subsequent cross-linking of the two polymers. Scanning electron microscopy showed the formation of layered scaffolds with a nano/microfibrous architecture. Physico-chemical properties of the scaffolds, including tensile strength, swelling behavior and biodegradability, were found satisfactory for the intended application. 3T3 fibroblasts and HaCaT keratinocytes showed a good in vitro cellular response on the scaffolds, indicating the matrices' cytocompatible nature. Scaffolds tested in an ex vivo human skin equivalent (HSE) wound model, as a preliminary alternative to animal testing, showed keratinocyte migration and wound re-epithelization — a prerequisite for healing and regeneration. Taken together, the proposed chitosan/collagen scaffold shows good potential for skin tissue engineering.
Abstract:
Historic house museums form a significant component of the built heritage and social history of a country. They vary from the elaborate mansions of the wealthy to the modest dwellings of the working class. Regardless of the original owner's status in society, these house museums are vital to an understanding of the architecture, culture and society of a bygone era. Newstead House, the oldest surviving residence in Brisbane, is the first house to be designated a 'Historic House Museum' in Queensland. It is a representative example of a house that demonstrates the British colonial heritage of 19th-century Australia. Originally a modest cottage on 34 acres of land, Newstead House was built by a Scottish migrant. The ownership of the house and land changed many times between 1847 and 1939, during which period a series of prominent residents of Brisbane either owned or rented the residence, including an officer of the Royal Navy, politicians, magistrates, merchant ship owners, and a Consul General of the United States of America. As a result, the house went through a series of renovations and extensions to accommodate the needs of its owners and their position in society. This paper aims to investigate the significance of historic house museums in educating the community on aspects of the social history, culture and architecture of 19th-century Australia. It focuses on the heritage-listed Newstead House as a case study to demonstrate the significance of the house as an artefact and an educational tool.
Abstract:
Classifier selection is a problem encountered by multi-biometric systems that aim to improve performance through fusion of decisions. A particular decision fusion architecture that combines multiple instances (n classifiers) and multiple samples (m attempts at each classifier) has been proposed in previous work to achieve a controlled trade-off between false alarms and false rejects. Although analysis on text-dependent speaker verification has demonstrated better performance for fusion of decisions with favourable dependence compared to statistically independent decisions, the performance is not always optimal. Given a pool of instances, the best performance with this architecture is obtained for certain combinations of instances. Heuristic rules and diversity measures have been commonly used for classifier selection, but it is shown that optimal performance is achieved with the 'best combination performance' rule. As the search complexity of this rule increases exponentially with the addition of classifiers, a measure - the sequential error ratio (SER) - is proposed in this work that is specifically adapted to the characteristics of the sequential fusion architecture. The proposed measure can be used to select the classifier that is most likely to produce a correct decision at each stage. Error rates for fusion of text-dependent HMM-based speaker models using SER are compared with other classifier selection methodologies. SER is shown to achieve near-optimal performance for sequential fusion of multiple instances, with or without the use of multiple samples. The methodology applies to multiple speech utterances for telephone- or internet-based access control, and to other systems such as identity verification based on multiple fingerprints or multiple handwriting samples.
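The multiple-instance, multiple-sample fusion architecture can be illustrated with a short sketch: each classifier instance is visited in turn and allowed up to m sample attempts, and a hypothetical OR-over-samples / AND-over-instances rule shows how the false-accept versus false-reject trade-off is controlled. The scoring functions and the particular rule are placeholders for illustration, not the paper's SER-based selection.

```python
# Sequential decision fusion sketch: n instances, up to m samples each.
from typing import Callable, Sequence

def verify(classifiers: Sequence[Callable[[object], bool]],
           samples_per_classifier: Sequence[Sequence[object]],
           m: int) -> bool:
    """Return True if the claimed identity is accepted."""
    for clf, samples in zip(classifiers, samples_per_classifier):
        accepted = False
        for sample in samples[:m]:             # up to m attempts at this instance
            if clf(sample):                    # OR over samples: one accept suffices
                accepted = True
                break
        if not accepted:                       # AND over instances: all must accept
            return False
    return True

# Toy threshold classifiers standing in for HMM-based speaker models
clfs = [lambda s: s > 0.5, lambda s: s > 0.4]
print(verify(clfs, [[0.3, 0.6], [0.7]], m=2))  # True
print(verify(clfs, [[0.3, 0.2], [0.7]], m=2))  # False
```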
Abstract:
Plug-in electric vehicles (PEVs) are increasingly popular amid the global trend towards energy saving and environmental protection. However, the uncoordinated charging of numerous PEVs can have significant negative impacts on the secure and economic operation of the power system concerned. In this context, a hierarchical decomposition approach is presented to coordinate the charging/discharging behaviors of PEVs. The major objective of the upper-level model is to minimize the total cost of system operation by jointly dispatching generators and electric vehicle aggregators (EVAs). The lower-level model aims at strictly following the dispatching instructions from the upper-level decision-maker by designing appropriate charging/discharging strategies for each individual PEV in a specified dispatching period. Two highly efficient commercial solvers, AMPL/IPOPT and AMPL/CPLEX, are used to solve the developed hierarchical decomposition model. Finally, a modified IEEE 118-bus test system including 6 EVAs is employed to demonstrate the performance of the developed model and method.
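The flavour of the lower-level scheduling problem can be conveyed with a toy linear program: given a price/dispatch signal from the upper level, a single PEV's charging is scheduled over the dispatching periods to meet its energy requirement within charger limits at minimum cost. This is a simplified stand-in for the paper's model (no discharging, no network constraints), and all numbers are illustrative.

```python
# Toy lower-level PEV charging schedule as a linear program.
import numpy as np
from scipy.optimize import linprog

T = 6                                          # dispatch periods (e.g. hours)
price = np.array([0.30, 0.25, 0.15, 0.10, 0.12, 0.20])   # $/kWh signal per period
p_max = 7.0                                    # charger limit for this PEV (kW)
energy_needed = 20.0                           # kWh required over the horizon

# Decision variables: charging power in each period (kW, 1-hour periods)
res = linprog(
    c=price,                                   # minimise charging cost
    A_eq=np.ones((1, T)), b_eq=[energy_needed],  # meet the energy requirement exactly
    bounds=[(0, p_max)] * T,
    method="highs",
)
print(np.round(res.x, 2))                      # charging concentrated in cheap periods
```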
Abstract:
This paper discusses computer-mediated distance learning on a Master's-level course in the UK and student perceptions of this as a quality learning environment.