Abstract:
This paper introduces a parallel implementation of an agent-based model applied to electricity distribution grids. A fine-grained shared-memory parallel implementation is presented, detailing how the agents are grouped and executed on a multi-threaded machine, as well as how the model is built (in a composable manner), which aids the parallelisation. Current results show a moderate speedup of 2.6, but improvements are expected by incorporating newer distributed or parallel ABM schedulers into this implementation. While domain-specific, this parallel algorithm can be applied to similarly structured ABMs (directed acyclic graphs).
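The abstract does not include the scheduling algorithm itself, so the following is a minimal sketch of one plausible reading: agents arranged in a directed acyclic graph are grouped by topological level, and each level is stepped in parallel on a shared-memory thread pool, since agents within a level have no dependencies on one another. All names and the toy grid are illustrative, not from the paper.

```python
# Sketch: level-parallel execution of DAG-structured agents.
# Agents in the same topological level are independent, so each level
# can be stepped concurrently; levels run in dependency order.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def topological_levels(deps):
    """Group nodes by depth; deps maps node -> set of upstream nodes."""
    levels, depth = defaultdict(list), {}
    def d(node):
        if node not in depth:
            depth[node] = 1 + max((d(p) for p in deps[node]), default=-1)
        return depth[node]
    for node in deps:
        levels[d(node)].append(node)
    return [levels[k] for k in sorted(levels)]

def step_model(agents, deps, pool):
    for level in topological_levels(deps):
        # All agents in a level run in parallel on the thread pool.
        list(pool.map(lambda name: agents[name].step(), level))

class Agent:
    def __init__(self, name): self.name = name
    def step(self): pass  # e.g. update local grid state

deps = {"feeder": set(), "bus1": {"feeder"}, "bus2": {"feeder"},
        "load": {"bus1", "bus2"}}
agents = {n: Agent(n) for n in deps}
with ThreadPoolExecutor(max_workers=4) as pool:
    step_model(agents, deps, pool)
```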
Abstract:
We introduce Kamouflage: a new architecture for building theft-resistant password managers. An attacker who steals a laptop or cell phone with a Kamouflage-based password manager is forced to carry out a considerable amount of online work before obtaining any user credentials. We implemented our proposal as a replacement for the built-in Firefox password manager, and provide performance measurements and the results from experiments with large real-world password sets to evaluate the feasibility and effectiveness of our approach. Kamouflage is well suited to become a standard architecture for password managers on mobile devices.
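One published reading of the theft-resistance property described above is that the real credential set is hidden among many structurally plausible decoy sets, so offline inspection of the stolen vault cannot tell which set is real and every candidate must be verified against the online services. The sketch below illustrates only that decoy idea; the template function and set sizes are assumptions for illustration, not the paper's exact construction.

```python
# Illustrative sketch of a decoy-vault idea: hide the real password set
# among structurally similar decoys, forcing online verification.
import random
import string

def same_shape(password, rng):
    """Sample a decoy with the same character-class template."""
    out = []
    for ch in password:
        if ch.islower():   out.append(rng.choice(string.ascii_lowercase))
        elif ch.isupper(): out.append(rng.choice(string.ascii_uppercase))
        elif ch.isdigit(): out.append(rng.choice(string.digits))
        else:              out.append(ch)  # keep symbols in place
    return "".join(out)

def build_vaults(real_set, n_decoys, seed=0):
    rng = random.Random(seed)
    vaults = [[same_shape(p, rng) for p in real_set] for _ in range(n_decoys)]
    vaults.insert(rng.randrange(n_decoys + 1), list(real_set))  # hide real set
    return vaults

vaults = build_vaults(["hunter2", "Tr0ub4dor"], n_decoys=9)
# The attacker now faces 10 indistinguishable sets to test online.
```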
Abstract:
Uanda house is of historical importance to Queensland both in terms of its architectural design and its social history. Uanda is a low-set, single-storey house built in 1928, located in the inner-city Brisbane suburb of Wilston. Architecturally, the house has a number of features that distinguish it from the surrounding bungalow-influenced inter-war houses. The house has been described as a Queensland-style house with neo-Georgian influences. Historically, it is associated with the entry of women into the profession of architecture in Queensland. Uanda is the only remaining intact work of architect/draftswoman Nellie McCredie and one of very few examples of work by pioneering women architects in Queensland. The house was entered into the Queensland Heritage Register in 2000, after an appeal against Brisbane City Council’s refusal of an application to demolish the house was heard in the Queensland Planning and Environment Court in 1998/1999. In the court’s judgment, Judge Robin QC, DCJ, stated that, “The importance of preserving women's history and heritage, often previously marginalised or lost, is now accepted at government level, recognising that role models are vital for bringing new generations of women into the professions and public life.” While acknowledging women’s contribution to the profession of architecture is an important endeavour, it also has the potential to isolate women architects as separate from a mainstream history of architecture. As Julie Willis writes, it can imply an atypical, feminine style of architecture. What is the impact, or what are the potential implications, of recognising heritage buildings designed by women architects? The judge also highlights the absence of a recorded history of unique Brisbane houses and questions the authority of the heritage register. This research examines these points of difference through a case study of Uanda house. The paper investigates the process of adding the house to the heritage register, the court case, and existing research on Nellie McCredie and Uanda house.
Abstract:
This paper presents a modulation and controller design method for paralleled Z-source inverter systems applicable to alternative energy sources such as solar cells, fuel cells, or variable-speed wind turbines with front-end diode rectifiers. A modulation scheme is designed based on the simple shoot-through principle with interleaved carriers to give enhanced ripple reduction in the system. Subsequently, a control method is proposed to equalize the amount of power injected by the inverters in the grid-connected mode and also to provide a reliable supply to sensitive loads on site in the islanding mode. The modulation and control methods are designed for modular independence so that redundancy, maintainability, and improved reliability of supply can be achieved. The performance of the proposed paralleled Z-source inverter configuration is validated with simulations carried out using Matlab/Simulink/Powersim. Moreover, a prototype is built in the laboratory to obtain experimental verification.
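As a rough illustration of the interleaved-carrier, simple shoot-through idea described above (not the paper's exact controller), the sketch below builds two triangle carriers and inserts shoot-through states whenever a carrier's magnitude exceeds 1 − d_st. Because shoot-through events occur at both carrier peaks, a quarter-period shift is used here to stagger the two inverters' events, which is what spreads out the combined input ripple; the waveform details and the interleaving angle are assumptions for illustration.

```python
# Sketch: interleaved triangle carriers with simple-boost
# shoot-through insertion for two paralleled Z-source inverters.
import numpy as np

def triangle(t, f, phase=0.0):
    """Symmetric triangle carrier in [-1, 1]."""
    x = (t * f + phase) % 1.0
    return 4.0 * np.abs(x - 0.5) - 1.0

t = np.linspace(0, 2e-3, 20000)           # 2 ms window
f_carrier = 10e3                          # 10 kHz switching (assumed)
d_st = 0.2                                # total shoot-through duty
level = 1.0 - d_st                        # simple-boost threshold

c1 = triangle(t, f_carrier)               # inverter 1 carrier
c2 = triangle(t, f_carrier, phase=0.25)   # inverter 2, quarter-period shift

st1 = np.abs(c1) > level                  # shoot-through gating, inverter 1
st2 = np.abs(c2) > level                  # shoot-through gating, inverter 2

print("duty 1:", st1.mean())              # ~0.2 per inverter
print("overlap:", (st1 & st2).mean())     # ~0 -> staggered ripple
```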
Abstract:
The present study explores reproducing the closest geometry of a high-pressure-ratio, single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multipurpose Small Power Unit. The commercial software ANSYS-Vista RTD, along with a built-in module, BladeGen, is used to conduct a meanline design and create the 3D geometry of one flow passage. After the proposed design has been carefully examined against the geometrical and experimental data, ANSYS-TurboGrid is applied to generate the computational mesh. CFD simulations are performed with ANSYS-CFX, in which the three-dimensional Reynolds-averaged Navier-Stokes equations are solved subject to appropriate boundary conditions. Results are compared with numerical and experimental data published in the literature in order to reproduce the exact geometry of the existing turbine and validate the numerical results against the experimental ones.
Abstract:
Descriptions of patients' injuries are recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for the mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families: decision tree, probabilistic, neural network, instance-based, ensemble-based, and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, instead of having parts of them removed as stop words. Abbreviations appearing in many different forms of entry are manually identified and normalised to a single form. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative-text injury dataset under consideration is composed of many short documents. The data can be characterised as high-dimensional and sparse: few features are irrelevant, but features are correlated with one another. Therefore, matrix factorisation techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space. Classifiers have then been built on these reduced feature spaces. In experiments, a set of tests is conducted to determine which classification method is best for medical text classification. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all the tested traditional classifiers. We also found that TF/IDF weighting, which works well for long-text classification, is inferior to binary weighting in short-document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
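As a rough sketch of the best-performing pipeline reported above (binary term weighting, NNMF dimensionality reduction, then a support vector machine), the following uses scikit-learn; the component count and toy data are assumptions for illustration, not the study's settings.

```python
# Sketch: binary bag-of-words -> NNMF -> linear SVM, mirroring the
# reported pipeline (n_components and the toy corpus are assumed).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

docs = ["fell from ladder fractured wrist",
        "burn to left hand from hot oil",
        "fell off bicycle head injury",
        "scald injury hot water right arm"]
codes = ["fall", "burn", "fall", "burn"]     # target injury codes

clf = make_pipeline(
    CountVectorizer(binary=True),            # binary weighting beat TF/IDF
    NMF(n_components=2, init="nndsvda", max_iter=500),
    LinearSVC(),
)
clf.fit(docs, codes)
print(clf.predict(["burned arm with hot tea"]))  # likely ['burn']
```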
Abstract:
The creation of a commercially viable, large-scale purification process for plasmid DNA (pDNA) production requires a whole-systems continuous or semi-continuous purification strategy employing optimised stationary adsorption phase(s), without the use of expensive and toxic chemicals, avian/bovine-derived enzymes, or numerous unit processes, all of which affect overall plasmid recovery, processing time, and economics. Continuous stationary phases are known to offer fast separation due to their large pore diameter, which makes the large pDNA molecule easily accessible with limited mass-transfer resistance even at high flow rates. A monolithic stationary sorbent was synthesised via free-radical liquid-porogenic polymerisation of ethylene glycol dimethacrylate (EDMA) and glycidyl methacrylate (GMA), with surface and pore characteristics tailored specifically for plasmid binding, retention, and elution. The polymer was functionalised with an amine active group for anion-exchange purification of pDNA from cleared lysate obtained from E. coli DH5α-pUC19 pellets in an RNase/protease-free process. Characterisation of the resin showed a unique porous material with 70% of the pore sizes above 300 nm. The final product, isolated by anion-exchange purification in only 5 min, was pure and homogeneous supercoiled pDNA with no gDNA, RNA, or protein contamination, as confirmed by DNA electrophoresis, restriction analysis, and SDS-PAGE. The resin showed a maximum binding capacity of 15.2 mg/mL, and this capacity persisted after several applications of the resin. This technique is cGMP-compatible and commercially viable for rapid isolation of pDNA.
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes, named the RMX hash function mode, was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss the generic online birthday existential forgery attack of Dang and Perlner on RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes as well. We point out practical limitations of the generic forgery attack on RMX-hash-then-sign schemes. We then show that these limitations can be overcome for RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as it is for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean's method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with 'built-in' randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
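The fixed-point property exploited above is easy to demonstrate: in a Davies-Meyer compression function H_i = E_m(H_{i-1}) ⊕ H_{i-1}, choosing H = E_m^{-1}(0) for any block m gives E_m(H) ⊕ H = 0 ⊕ H = H. The sketch below shows this with a toy 32-bit Feistel cipher; the cipher is an illustrative stand-in, not a real primitive.

```python
# Sketch: Dean-style fixed points for a Davies-Meyer compression
# function H_i = E_m(H_{i-1}) XOR H_{i-1}. Picking H = E_m^{-1}(0)
# gives E_m(H) = 0, so the output equals H: a fixed point for block m.
import hashlib

def f(half, key, r):
    """Toy round function: 16-bit output derived from SHA-256."""
    d = hashlib.sha256(half.to_bytes(2, "big") + key + bytes([r])).digest()
    return int.from_bytes(d[:2], "big")

def encrypt(m, block):                      # toy 32-bit Feistel cipher
    L, R = block >> 16, block & 0xFFFF
    for r in range(8):
        L, R = R, L ^ f(R, m, r)
    return (L << 16) | R

def decrypt(m, block):                      # inverse of encrypt
    L, R = block >> 16, block & 0xFFFF
    for r in reversed(range(8)):
        L, R = R ^ f(L, m, r), L
    return (L << 16) | R

def davies_meyer(h, m):
    return encrypt(m, h) ^ h

m = b"any message block"
h = decrypt(m, 0)                           # fixed point: E_m(h) = 0
assert davies_meyer(h, m) == h              # h maps to h under block m
print(hex(h))
```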
Abstract:
It is commonplace to use digital video cameras in robotic applications. These cameras have built-in exposure control, but they have no knowledge of the environment, the lens being used, or the important areas of the image, and they do not always produce optimal exposure. Therefore, it is desirable and often necessary to control the exposure off the camera. In this paper we present a scheme for exposure control which enables the user application to determine the area of interest. The proposed scheme introduces an intermediate transparent layer between the camera and the user application, which combines information from both to produce optimal exposure. We present results from indoor and outdoor scenarios using directional and fish-eye lenses, showing the performance and advantages of this framework.
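The abstract does not give the control law, so the following is a minimal sketch of one plausible reading: the intermediate layer meters only the application-specified region of interest and nudges exposure with a proportional feedback loop toward a target mean brightness. The camera interface (`get_frame`, `set_exposure`) and the gains are hypothetical.

```python
# Sketch: region-of-interest exposure feedback. The user application
# supplies the ROI; this layer meters only that region and steers the
# (hypothetical) camera's exposure toward a target mean brightness.
import numpy as np

TARGET = 128.0        # desired ROI mean on an 8-bit scale
GAIN = 0.005          # proportional gain (tuning assumption)

def adjust_exposure(camera, roi, steps=50):
    x0, y0, x1, y1 = roi
    for _ in range(steps):
        frame = camera.get_frame()                    # hypothetical API
        mean = float(frame[y0:y1, x0:x1].mean())      # meter ROI only
        error = TARGET - mean
        if abs(error) < 2.0:                          # close enough
            break
        # Multiplicative update keeps exposure positive.
        camera.set_exposure(camera.exposure * (1.0 + GAIN * error))

class FakeCamera:
    """Stand-in camera whose brightness scales with exposure."""
    def __init__(self): self.exposure = 1.0
    def get_frame(self):
        return np.clip(np.full((480, 640), 60.0) * self.exposure, 0, 255)
    def set_exposure(self, e): self.exposure = max(e, 1e-3)

cam = FakeCamera()
adjust_exposure(cam, roi=(200, 150, 440, 330))
print(round(cam.exposure, 2))   # converges near 128/60, about 2.13
```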
Abstract:
The use of capacitors for electrical energy storage actually predates the invention of the battery. The invention of the battery in 1800 is attributed to Alessandro Volta, who first described a battery as an assembly of plates of two different materials (such as copper and zinc) placed in an alternating stack and separated by paper soaked in brine or vinegar [1]. Accordingly, this device was referred to as Volta’s pile, and it formed the basis of subsequent revolutionary research and discoveries on the chemical origin of electricity. Before the advent of Volta’s pile, however, eighteenth-century researchers relied on Leyden jars as a source of electrical energy. Built in the mid-1700s at the University of Leyden in Holland, a Leyden jar is an early capacitor consisting of a glass jar coated inside and outside with a thin layer of silver foil [2, 3]. With the outer foil grounded, the inner foil could be charged with an electrostatic generator or another source of static electricity, and could produce a strong electrical discharge from a small and comparatively simple device.
Abstract:
Multi-agent systems involve a high degree of concurrency at both the inter- and intra-agent levels. The Scalable fault-tolerant Agent Grooming Environment (SAGE), a second-generation, FIPA-compliant MAS, requires a built-in mechanism to achieve both inter- and intra-agent concurrency. This paper describes an attempt to provide a reliable, efficient, and lightweight solution for intra-agent concurrency within the internal agent architecture of SAGE. It addresses the issues of using the Java threading model to provide this level of concurrency to the agent and offers an alternative approach based on an event-driven, concurrent, and user-scalable multitasking model for the agent's internals. The findings of this paper show that the proposed approach provides an efficient and lightweight concurrent task model for SAGE and considerably outperforms the Java-based multi-threaded tasking model in terms of throughput and efficiency. This is illustrated through the practical implementation and evaluation of both models. © 2004 IEEE.
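To make the comparison concrete, here is a minimal sketch contrasting the two models discussed above: a single event-driven dispatcher that multiplexes many agent tasks over one worker thread, versus spawning one thread per task. This illustrates the general idea only; it is not SAGE's actual internal API.

```python
# Sketch: event-driven intra-agent multitasking vs. thread-per-task.
# One dispatcher thread runs many small agent tasks in order, avoiding
# per-task thread creation and context-switch overhead.
import queue
import threading

class EventDrivenAgent:
    """Agent whose tasks are events handled by a single worker thread."""
    def __init__(self):
        self.events = queue.Queue()
        self.worker = threading.Thread(target=self._loop, daemon=True)
        self.worker.start()

    def post(self, task, *args):
        self.events.put((task, args))     # enqueue; caller never blocks

    def _loop(self):
        while True:
            task, args = self.events.get()
            if task is None:              # sentinel -> shut down
                break
            task(*args)                   # run to completion, in order

    def stop(self):
        self.events.put((None, ()))
        self.worker.join()

def handle_message(msg):
    pass  # e.g. parse an ACL message, update agent state

agent = EventDrivenAgent()
for i in range(10_000):                   # 10k tasks, still one thread
    agent.post(handle_message, f"msg-{i}")
agent.stop()

# The thread-per-task alternative would create 10,000 OS threads:
# threading.Thread(target=handle_message, args=(msg,)).start()
```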
Abstract:
Multi-agent systems (MAS) advocate an agent-based approach to software engineering based on decomposing problems in terms of decentralized, autonomous agents that can engage in flexible, high-level interactions. This chapter introduces the Scalable fault-tolerant Agent Grooming Environment (SAGE), a second-generation Foundation for Intelligent Physical Agents (FIPA)-compliant multi-agent system developed at NIIT-Comtec, which provides an environment for creating distributed, intelligent, and autonomous entities that are encapsulated as agents. The chapter focuses on SAGE's main highlight: its decentralized fault-tolerant architecture, which can be used to develop applications in a number of areas such as e-health, e-government, and e-science. In addition, the SAGE architecture provides tools for runtime agent management, directory facilitation, and monitoring and editing of the messages exchanged between agents. SAGE also provides a built-in mechanism for programming agent behaviour and capabilities with the help of its autonomous agent architecture, which is the other major highlight of this chapter. The authors believe that the market for agent-based applications is growing rapidly and that SAGE can play a crucial role in future intelligent application development. © 2007, IGI Global.