959 results for Hardware system


Relevance: 20.00%

Publisher:

Abstract:

In the modern built environment, building construction and demolition consume large amounts of energy and emit greenhouse gases, largely because of widely used conventional construction materials such as reinforced and composite concrete. These materials consume large quantities of natural resources and possess high embodied energy, and more energy is required to recycle or reuse them at the cessation of use. It is therefore important to use recyclable or reusable new materials in building construction in order to conserve natural resources and reduce the energy and emissions associated with conventional materials. Advancements in materials technology have introduced new composite and hybrid materials into infrastructure construction as alternatives to conventional materials. This research project has developed a lightweight, prefabricatable Hybrid Composite Floor Plate System (HCFPS) as an alternative to conventional floor systems, with desirable properties: it is easy to construct, economical, demountable, recyclable and reusable. The component materials of HCFPS are a central Polyurethane (PU) core, outer layers of Glass-fiber Reinforced Cement (GRC), and steel laminates at the tensile regions. This research explored the structural adequacy and performance characteristics of hybridised GRC, PU and steel laminate for the development of HCFPS. Performance characteristics of HCFPS were investigated using Finite Element (FE) simulations supported by experimental testing. Parametric studies were conducted to develop the HCFPS to satisfy static performance requirements, using sectional configurations, spans, loading and material properties as the parameters. The dynamic response of HCFPS floors was investigated through parametric studies using material properties, walking frequency and damping as the parameters. The findings show that HCFPS can be used in office and residential buildings and provides acceptable static and dynamic performance.
Design guidelines were developed for this new floor system. HCFPS is easy to construct and economical compared to conventional floor systems because it is a lightweight, prefabricatable floor system. It can also be demounted and reused or recycled at the cessation of use owing to its component materials.

Relevance: 20.00%

Publisher:

Abstract:

Many software applications extend their functionality by dynamically loading executable components into their allocated address space. Such components, exemplified by browser plugins and other software add-ons, not only enable reusability but also promote programming simplicity, as they reside in the same address space as their host application, supporting easy sharing of complex data structures and pointers. However, such components are often of unknown provenance and quality and may be riddled with accidental bugs or, in some cases, deliberately malicious code. Statistics show that such component failures account for a high percentage of software crashes and vulnerabilities. Isolation of such fine-grained components is therefore necessary to increase the stability, security and resilience of computer programs. This thesis addresses this issue by showing how host applications can create isolation domains for individual components, while preserving the benefits of a single address space, via a new architecture for software isolation called LibVM. Towards this end, we define a specification outlining the functional requirements for LibVM and identify the conditions under which they can be met. We define an abstract Application Programming Interface (API) that encompasses the general problem of isolating shared libraries, thus separating policy from mechanism, and prove its practicality with two concrete implementations based on hardware virtualization and system call interposition, respectively. The results demonstrate that hardware isolation minimises the difficulties encountered with software-based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution's correctness.
This thesis concludes that not only is it feasible to create such isolation domains for individual components, but also that this should be a fundamental abstraction supported by the operating system, which would lead to more stable and secure applications.
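The idea of an isolation domain that separates policy from mechanism can be sketched in a few lines. The class below is purely illustrative (`IsolationDomain`, `BuggyPlugin` and the fault model are invented here, not taken from LibVM): a component fault is represented as a caught Python exception, standing in for the hardware-virtualization or system-call-interposition mechanisms the thesis actually uses, so the host survives a misbehaving component.

```python
class IsolationDomain:
    """Toy stand-in for a LibVM-style isolation domain.

    The thesis backs such domains with hardware virtualization or
    system call interposition; here a component fault is modelled as
    a Python exception, so the host survives a buggy component.
    """

    def __init__(self, component):
        self.component = component

    def call(self, name, *args):
        """Invoke an exported entry point and contain any fault."""
        try:
            return ("ok", getattr(self.component, name)(*args))
        except Exception as exc:
            return ("fault", type(exc).__name__)


class BuggyPlugin:
    """Hypothetical untrusted add-on with one good and one buggy export."""

    def double(self, x):
        return 2 * x

    def crash(self, x):
        return x / 0  # accidental bug: must not bring down the host


dom = IsolationDomain(BuggyPlugin())
ok = dom.call("double", 21)
fault = dom.call("crash", 1)
```

The host decides the policy (what to do with a `"fault"` result); the domain supplies only the containment mechanism, mirroring the policy/mechanism split of the abstract API.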

Relevance: 20.00%

Publisher:

Abstract:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step process of encrypting the message for confidentiality and, in a separate pass, generating a Message Authentication Code (MAC) for integrity. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, analysing the mechanisms for providing confidentiality and integrity in such algorithms. Greater emphasis is placed on the analysis of the integrity mechanisms, as there is little in the public literature on these in the context of authenticated encryption. The thesis has four main contributions. The first contribution is a framework that classifies AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place.
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method treats the nonlinear filter (NLF) of these ciphers as a combiner with memory, enabling the construction of equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack, and conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents it. The third contribution is a new general matrix-based model for MAC generation in which the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers, namely SSS, NLSv2 and SOBER-128, can be considered instances of this model, which is more general than previous investigations of direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation in which the input message is injected indirectly into the internal state.
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers, namely ZUC, Grain-128a and Sfinks, can be considered instances of this model. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
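The direct-injection accumulation described by the third contribution can be illustrated with a toy nonlinear filter generator. Everything below (register size, filter function, number of padding rounds) is invented for illustration and corresponds to none of SSS, NLSv2 or SOBER-128:

```python
def toy_direct_injection_mac(message_bits, key_state, rounds=8):
    """Toy model of direct message injection: each message bit is
    XORed straight into the internal state of a nonlinear filter
    generator, and the final state serves as the MAC.

    Illustrative only -- not any of the ciphers named above.
    """
    state = list(key_state)            # register initialised from the key
    for bit in message_bits:
        state[0] ^= bit                # direct injection into the state
        # toy nonlinear state update: shift plus nonlinear feedback
        fb = (state[-1] ^ (state[0] & state[1])) & 1
        state = state[1:] + [fb]
    # extra blank rounds diffuse the last message bits through the state
    for _ in range(rounds):
        fb = (state[-1] ^ (state[0] & state[1])) & 1
        state = state[1:] + [fb]
    return tuple(state)


mac1 = toy_direct_injection_mac([1, 0, 1, 1], [1, 0, 0, 1])
mac2 = toy_direct_injection_mac([1, 0, 1, 0], [1, 0, 0, 1])
```

Flipping one message bit changes the accumulated state, which is the collision behaviour the matrix-based model reasons about; a real cipher would of course use a much larger register and a cryptographically strong filter.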

Relevance: 20.00%

Publisher:

Abstract:

One remaining difficulty in the Information Technology (IT) business value evaluation domain is establishing a direct linkage between IT value and its underlying determinants or surrogates. This paper proposes research that examines the interacting effects of the determinants of IT value and their influences on IT value. The overarching research question is how those determinants interact with each other and affect IT value at the organizational level. To address this, the research embraces a multilevel, complex adaptive systems view, in which IT value emerges from the interaction of the underlying determinants. The research is theoretically grounded in three organizational theories: multilevel theory, complex adaptive systems theory, and adaptive structuration theory. By integrating these theoretical paradigms, the research proposes a conceptual model that focuses on the process by which IT value is created from the interactions of those determinants. To answer the research question, an agent-based modeling technique is used to build a computational representation of the conceptual model, on which computational experiments will be conducted. Validation procedures will be applied to consolidate the validity of the model, and hypotheses will then be tested using the computational experimentation data.
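As a minimal sketch of the agent-based approach, the toy model below lets business-unit agents carrying two hypothetical determinants (system integration and shared knowledge) interact pairwise, with organizational IT value emerging as an aggregate of the units' local values. The determinants, parameters and update rule are all assumptions for illustration, not the paper's model:

```python
import random


def simulate_it_value(n_units=20, steps=50, seed=1):
    """Toy agent-based sketch: in each step two business units are
    compared, and the unit with the lower local IT value (product of
    its two determinant levels) adapts halfway toward the stronger
    unit. Organizational IT value is the mean of local values."""
    rng = random.Random(seed)
    units = [{"integration": rng.random(), "knowledge": rng.random()}
             for _ in range(n_units)]
    history = []
    for _ in range(steps):
        a, b = rng.sample(range(n_units), 2)
        va = units[a]["integration"] * units[a]["knowledge"]
        vb = units[b]["integration"] * units[b]["knowledge"]
        lo, hi = (a, b) if va < vb else (b, a)
        for k in units[lo]:
            units[lo][k] += 0.5 * (units[hi][k] - units[lo][k])
        history.append(sum(u["integration"] * u["knowledge"]
                           for u in units) / n_units)
    return history


trace = simulate_it_value()
```

Even this crude rule exhibits the emergent, path-dependent aggregate behaviour that motivates computational experimentation over closed-form analysis.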

Relevance: 20.00%

Publisher:

Abstract:

Mixtures of single odours were used to explore the receptor response profile across individual antennae of Helicoverpa armigera (Hübner) (Lepidoptera: Noctuidae). Seven odours, including floral and green-leaf volatiles, were tested: phenyl acetaldehyde, benzaldehyde, β-caryophyllene, limonene, α-pinene, 1-hexanol and 3Z-hexenyl acetate. Electroantennograms of responses to paired mixtures of odours showed considerable variation in receptor tuning across the receptor field between individuals. Data from some moth antennae showed no additivity, indicating a restricted receptor profile. Responses of other moth antennae to the same odour mixtures showed a range of partial additivity, indicating that a wider array of receptor types was present in these moths, with a greater percentage of receptors tuned exclusively to each odour. Peripheral receptor fields thus vary in their response spectrum within a population of moths when exposed to high doses of plant volatiles. This may be related to the variation in host choice within moth populations reported by other authors.

Relevance: 20.00%

Publisher:

Abstract:

The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems that combine computational hardware such as CPUs and GPUs are attractive for scientific computing because of the potential of GPUs to accelerate data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function.
This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme run on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes would be required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of three over the eight-core CPU implementation.
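The key trick that makes GPU-resident implicit time stepping possible is that a Krylov method never needs the Jacobian matrix itself, only its action on a vector, which a finite difference of residual evaluations approximates. A minimal sketch of that matrix-free Jacobian-vector product, with a small stand-in residual `F` rather than the thesis's flow equations:

```python
import numpy as np


def jacobian_vector_product(residual, u, v, eps=1e-7):
    """Matrix-free approximation J(u) @ v ~ (F(u + eps*v) - F(u)) / eps.
    Only residual evaluations are needed, so the Jacobian is never
    assembled -- the operation a Krylov solver can run on the GPU."""
    return (residual(u + eps * v) - residual(u)) / eps


def F(u):
    """Stand-in nonlinear residual (not the flow/transport equations)."""
    return np.array([u[0] ** 2 + u[1] - 3.0,
                     u[0] + u[1] ** 2 - 5.0])


u = np.array([1.0, 2.0])
v = np.array([1.0, 0.0])
Jv = jacobian_vector_product(F, u, v)
# Exact Jacobian at u is [[2*u0, 1], [1, 2*u1]], so J @ v = [2, 1].
```

The choice of `eps` trades truncation error against floating-point cancellation; production inexact Newton-Krylov codes scale it with the norms of `u` and `v`.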

Relevance: 20.00%

Publisher:

Abstract:

In this thesis, three mathematical models describing the growth of a solid tumour, incorporating the host tissue and the immune system response, are developed and investigated. The initial model describes the dynamics of the growing tumour and the immune response; the second model extends it by introducing a time-varying dendritic cell-based treatment strategy. Finally, in the third model, we present a mathematical model of a growing tumour using a hybrid cellular automaton. These models can inform pre-experimental work and assist in designing more effective and efficient laboratory experiments on tumour growth, its interactions with the immune system, and immunotherapy.
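As a flavour of the first kind of model, the sketch below integrates a toy two-variable tumour-immune system with explicit Euler steps. The equations and every parameter value are generic illustrations of tumour-immune dynamics, not the models developed in the thesis:

```python
def tumour_immune_step(T, E, dt=0.01, r=0.4, K=1.0, k=1.2, s=0.1, d=0.2):
    """One explicit-Euler step of a toy tumour (T) vs effector-cell (E)
    model: logistic tumour growth minus immune kill; constant effector
    recruitment plus tumour-stimulated proliferation minus decay.
    Parameter values are illustrative only."""
    dT = r * T * (1 - T / K) - k * E * T
    dE = s + 0.5 * E * T - d * E
    return T + dt * dT, E + dt * dE


# Integrate to t = 20 from a small tumour and a weak immune response.
T, E = 0.5, 0.1
for _ in range(2000):
    T, E = tumour_immune_step(T, E)
```

With these parameters the effector population builds up and suppresses the tumour toward the tumour-free equilibrium; changing the kill rate `k` or recruitment `s` flips the outcome, which is the kind of pre-experimental question such models address.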

Relevance: 20.00%

Publisher:

Abstract:

Recent experiments [F. E. Pinkerton, M. S. Meyer, G. P. Meisner, M. P. Balogh, and J. J. Vajo, J. Phys. Chem. C 111, 12881 (2007) and J. J. Vajo and G. L. Olson, Scripta Mater. 56, 829 (2007)] demonstrated that the recycling of hydrogen in the coupled LiBH4/MgH2 system is fully reversible. The rehydrogenation of MgB2 is an important step toward this reversibility. Using ab initio density functional theory calculations, we found that the activation barriers for the dissociation of H2 are 0.49 and 0.58 eV on the B- and Mg-terminated MgB2(0001) surfaces, respectively. This implies that the dissociation kinetics of H2 on a MgB2(0001) surface should be greatly improved compared to pure Mg materials. Additionally, the diffusion of dissociated H atoms on the Mg-terminated MgB2(0001) surface is almost barrierless. Our results shed light on the experimentally observed reversibility and improved kinetics of the coupled LiBH4/MgH2 system.

Relevance: 20.00%

Publisher:

Abstract:

Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density (ED) of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, using a conversion between CT number and the electron density of various tissues. MV CBCT has particular advantages over treatment planning with kilovoltage CT in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. A study was therefore undertaken to characterise the pixel value to electron density relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units (MU) used for acquisition. If a significant difference with the number of monitor units were seen, separate pixel value to ED conversions might be required for each clinical setting. Calibration of the MV CT images for electron density offers the possibility of daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, which ranged from 0.292 to 1.707 in electron density relative to the background solid water, was determined by taking the mean value within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron densities of each insert, and a linear least-squares fit was performed. This procedure was carried out for images acquired with 5, 8, 15 and 60 monitor units.
Results: A linear relationship between MV CT pixel value and ED was demonstrated for all monitor unit settings and over a range of electron densities. The number of monitor units was found to have no significant impact on this relationship. Discussion: The number of MU used does not significantly alter the pixel value obtained for different ED materials. However, to ensure the most accurate and reproducible pixel value to ED calibration, one MU setting should be chosen and used routinely, and it should correspond to the setting used clinically. If more than one MU setting is used clinically, an average of the CT values acquired with different numbers of MU could be used without loss of accuracy. Conclusions: No significant differences were found in the pixel value to ED conversion for the Siemens MV CBCT unit with change in monitor units, so a single conversion curve can be used for MV CT treatment planning. To fully utilise MV CT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices. This will potentially allow the cumulative dose distribution to be determined over the patient's multi-fraction treatment, and adaptive treatment strategies to be developed to optimise the tumour response.
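The calibration step described in the Methods can be sketched as a linear least-squares fit of pixel value against relative electron density, inverted for use at planning time. The numbers below are invented for illustration, not the measured Gammex values:

```python
import numpy as np

# Hypothetical calibration data: mean MV CBCT pixel value for a few
# phantom inserts of known relative electron density (illustrative
# numbers only, not measurements from the MVision system).
relative_ed = np.array([0.292, 0.480, 1.000, 1.280, 1.707])
pixel_value = np.array([312.0, 467.0, 895.0, 1126.0, 1478.0])

# Linear least-squares fit: pixel = slope * ED + intercept ...
slope, intercept = np.polyfit(relative_ed, pixel_value, 1)


# ... inverted so a measured pixel value can be converted back to ED
# for dose calculation.
def pixel_to_ed(p):
    return (p - intercept) / slope


ed_of_water_like_pixel = pixel_to_ed(895.0)
```

A separate fit per MU setting would reveal whether the conversion curve shifts with monitor units, which is exactly the comparison the study performed.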

Relevance: 20.00%

Publisher:

Abstract:

- The RAH trauma team was activated by over 2500 trauma calls in 2009, more than twice the number put out by similar services.
- Many trauma calls (in particular L2 trauma calls) from the existing system do not warrant activation of the trauma team.
- Trauma calls are sometimes activated for non-trauma reasons (e.g. rapid access to radiology, departmental pressures).
- The excess of trauma calls has several deleterious effects, particularly on time management for trauma service staff: ward rounds/tertiary survey rounds, education, quality improvement and research.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents an approach for identifying the limit states of resilience in a water supply system under different types of pressure (disturbing) forces. Understanding systemic resilience helps identify the trigger points for early managerial action, avoiding further loss of the ability to provide satisfactory service availability when the capacity to supply water is under pressure. The approach illustrates the usefulness of a surrogate measure of resilience depicted in a three-dimensional space of independent pressure factors. This enables visualisation of the transition of the system state (resilience) from high- to low-resilience regions and acts as an early warning trigger for decision-making. A surrogate measure is necessary because resilience cannot be measured directly and must instead be linked to the identified pressures. The basis for identifying the resilience surrogate and exploring the interconnected relationships within the complete system is a meta-system model of three nested sub-systems representing the water catchment and reservoir; the treatment plant; and the distribution system and end-users. This approach can be used as a framework for assessing levels of resilience in other infrastructure systems by identifying a surrogate measure and its relationship to the relevant pressures acting on the system.

Relevance: 20.00%

Publisher:

Abstract:

Teaching introductory programming has challenged educators for years. Although Intelligent Tutoring Systems that teach programming have been developed to help address the problem, none have been developed to teach web programming. This paper describes the design and evaluation of the PHP Intelligent Tutoring System (PHP ITS), which addresses this gap. The evaluation showed that students who used the PHP ITS achieved a significant improvement in test scores.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a methodology for determining the vertical hydraulic conductivity (Kv) of an aquitard in a multilayered leaky system, based on harmonic analysis of arbitrary water-level fluctuations in aquifers. Kv of the aquitard is expressed as a function of the phase shift between water-level signals measured in the two adjacent aquifers. Based on this expression, we propose a robust method to calculate Kv by linear regression analysis of log-transformed frequencies and phases. The frequencies at which Kv is calculated are identified by coherence analysis. The proposed methods are validated by a synthetic case study and are then applied to the Westbourne and Birkhead aquitards, which form part of a five-layered leaky system in the Eromanga Basin, Australia.
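A generic building block of such harmonic analysis is estimating the phase shift between two water-level records at a common frequency from their cross-spectrum. The sketch below uses synthetic signals with a known lag and a hypothetical 12-hour period (chosen so it aligns exactly with an FFT bin); it illustrates only the phase-shift step, not the paper's Kv formula:

```python
import numpy as np


def phase_shift(sig_a, sig_b, dt):
    """Phase lag (radians) of sig_b relative to sig_a at the dominant
    frequency, estimated from the cross-spectrum of the two records."""
    fa, fb = np.fft.rfft(sig_a), np.fft.rfft(sig_b)
    k = np.argmax(np.abs(fa[1:])) + 1          # dominant nonzero bin
    cross = fa[k] * np.conj(fb[k])
    return np.angle(cross), np.fft.rfftfreq(len(sig_a), dt)[k]


# Synthetic records: the lower aquifer sees the same periodic signal,
# attenuated and delayed by propagation through the aquitard.
dt, n = 3600.0, 24 * 30                        # hourly samples, 30 days
t = np.arange(n) * dt
f0 = 1.0 / (12 * 3600.0)                       # hypothetical 12 h period
upper = np.cos(2 * np.pi * f0 * t)
lower = 0.3 * np.cos(2 * np.pi * f0 * t - 0.8)  # known 0.8 rad lag

lag, freq = phase_shift(upper, lower, dt)
```

Repeating this at several coherent frequencies and regressing the log-transformed phases against log-transformed frequencies is the shape of the regression step the paper proposes.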

Relevance: 20.00%

Publisher:

Abstract:

The thesis provides a framework for the potential implementation of the design-build (DB) project delivery system in road infrastructure projects in Indonesia. The framework proposes a hierarchy of factors promoting the potential implementation of the DB project delivery system and introduces ways to implement the system through the levels of this hierarchy. These findings benefit not only academic knowledge but also public officials, guiding them on the priority of the promoting factors in the process of implementing the DB system.

Relevance: 20.00%

Publisher:

Abstract:

In this study, we explore the relationship between the qualities of the information system environment and management accounting adaptability. The information system environment refers to three distinct elements: the degree of information system integration, system flexibility, and shared knowledge between business unit managers and the IT function. We draw on the literature on integrated information systems (IIS) and management accounting change and propose a model to test the hypothesized relationships. The sample for this study consists of Australian companies from all industries.