138 results for integrity in closed-loop


Relevance:

100.00%

Publisher:

Abstract:

The two-dimensional free surface flow of a finite-depth fluid into a horizontal slot is considered. For this study, the effects of viscosity and gravity are ignored. A generalised Schwarz-Christoffel mapping is used to formulate the problem in terms of a linear integral equation, which is solved exactly with the use of a Fourier transform. The resulting free surface profile is given explicitly in closed-form.
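As a hedged, generic illustration of the solution strategy sketched above (not the paper's actual kernel, geometry or boundary conditions), a convolution-type linear integral equation on the real line is inverted in closed form by a Fourier transform:

    % Illustrative only: K and f are placeholders, not the paper's quantities.
    \int_{-\infty}^{\infty} K(x - s)\,\phi(s)\,\mathrm{d}s = f(x)
    \quad\Longrightarrow\quad
    \hat{K}(k)\,\hat{\phi}(k) = \hat{f}(k)
    \quad\Longrightarrow\quad
    \phi(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty}
              \frac{\hat{f}(k)}{\hat{K}(k)}\,e^{\mathrm{i}kx}\,\mathrm{d}k .

When the inverse transform can be evaluated explicitly, the unknown (here, a free-surface profile) is obtained in closed form; the paper's actual equation arises from the generalised Schwarz-Christoffel mapping and is not reproduced here.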

Relevance:

100.00%

Publisher:

Abstract:

Pathological mineralization of articular cartilage is a characteristic feature of osteoarthritis (OA); however, the underlying mechanisms, and their relevance to cartilage degeneration, are not clear. The involvement of subchondral bone changes in OA has been reported previously, with the characterization of abnormal subchondral bone mineral density (BMD), osteoid volume, altered bone mechanical parameters and an increase in bone turnover markers. A number of osteoarthritic animal models have demonstrated that subchondral bone changes often precede cartilage degeneration. In this study, site-specific localization of mineralization markers was detected in the OA cartilage. Chondrocytes and osteoblasts derived from OA cartilage and subchondral bone showed a significant increase in the mRNA expression of mineralization markers. Interestingly, osteoblasts from OA subchondral bone could significantly decrease cartilage matrix expression while increasing mineralization of chondrocytes (Figure 1). Osteogenic factors, such as CBFA1, ALP and type X collagen (Col-X), were detected in chondrocytes under mineralization conditions (Figure 2). Furthermore, chondrocyte mineralization was followed by increased mRNA and protein levels of MMP-2, MMP-9 and MMP-13, all of which are detrimental to cartilage integrity in vivo. The data reported here suggest that the upregulation of subchondral bone mineralization, typical of OA progression, causes cartilage mineralization, and that mineralization of chondrocytes induces increased MMP levels, with subsequent degradation of the articular cartilage.

Relevance:

100.00%

Publisher:

Abstract:

Monitoring and assessing environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods of time. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data effectively and efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling analysis of acoustic data: collaboration, manual, automatic and human-in-the-loop analysis.

Relevance:

100.00%

Publisher:

Abstract:

Reduced element spacing in antenna arrays gives rise to strong mutual coupling between array elements and may cause significant performance degradation. These effects can be alleviated by introducing a decoupling network consisting of interconnected reactive elements. The existing design approach for the synthesis of a decoupling network for circulant symmetric arrays allows calculation of element values using closed-form expressions, but the resulting circuit configuration requires multilayer technology for implementation. In this paper, a new structure for the decoupling of circulant symmetric arrays of more than four elements is presented. Element values are no longer obtained in closed form, but the resulting circuit is much simpler and can be implemented on a single layer.

Relevance:

100.00%

Publisher:

Abstract:

World Ethics Forum conference proceedings: the joint conference of the International Institute for Public Ethics (IIPE) and the World Bank: leadership, ethics and integrity in public life, 9-11 April 2006, Keble College, University of Oxford, UK.

Relevance:

100.00%

Publisher:

Abstract:

Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
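As a loose, hypothetical sketch of the bit-level modelling idea described above (the function name and the propagation rule for a constant mask are illustrative assumptions, not the toolkit's actual API), consider how a C-style masking expression blocks most classified bits:

    # Hypothetical sketch: bit-level taint through a constant-mask AND expression.
    # Each value carries the set of bit positions that hold classified data; the
    # rule for 'x & mask' keeps only the taint on bits the mask does not clear.

    def taint_through_and_mask(tainted_bits: set, mask: int) -> set:
        """Classified bits that can still flow out of 'x & mask'."""
        return {b for b in tainted_bits if (mask >> b) & 1}

    classified = set(range(32))                      # every bit of x is classified
    surviving = taint_through_and_mask(classified, 0x0F)
    assert surviving == {0, 1, 2, 3}                 # only the low nibble can leak

An expression-atomic analysis would treat 'x & 0x0F' as fully classified; the bit-level model shows the expression downgrades 28 of 32 bits, which is the kind of evidence used to discard false-positive flow paths.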

Relevance:

100.00%

Publisher:

Abstract:

The ‘Dream Circle’ is designed by and operated through Indigenous educator footprints as a safe space for the school’s deadly jarjums (Indigenous children). The ‘Dream Circle’ uses a kinnected methodology drawing on the rich vein of Murri cultural knowledges and Torres Strait Islander supports within the local community to provide a safe and supportive circle. The ‘Dream Circle’ operates on a school site in the Logan area as an after-school homework and cultural studies class. The ‘Dream Circle’ embodies practices and ritualises processes which ensure cultural safety and integrity. In this way the ‘Dream Circle’ balances the measures that Sarra (2005) argues are the stronger, smarter realities needed for positive change in Indigenous education.

Relevance:

100.00%

Publisher:

Abstract:

Virtual prototyping emerges as a new technology to replace existing physical prototypes for product evaluation, which are costly and time-consuming to manufacture. Virtualization technology allows engineers and ergonomists to perform virtual builds and different ergonomic analyses on a product. Digital Human Modelling (DHM) software packages such as Siemens Jack often integrate with CAD systems to provide a virtual environment which allows investigation of operator and product compatibility. Although the integration between DHM and CAD systems allows for the ergonomic analysis of anthropometric design, human musculoskeletal multi-body modelling software packages such as the AnyBody Modelling System (AMS) are required to support physiologic design. They provide muscular force analysis, estimate human musculoskeletal strain and help address human comfort assessment. However, the independent characteristics of the modelling systems Jack and AMS constrain engineers and ergonomists in conducting a complete ergonomic analysis. AMS is a stand-alone programming system without the capability to integrate into CAD environments. Jack provides CAD-integrated human-in-the-loop capability, but without considering musculoskeletal activity. Consequently, engineers and ergonomists need to perform many redundant tasks during product and process design. In addition, the existing biomechanical model in AMS uses a simplified estimation of body proportions, based on a segment mass ratio derived scaling approach. This is insufficient to represent user populations in AMS in an anthropometrically correct manner. Moreover, sub-models are derived from different sources of morphologic data and are therefore anthropometrically inconsistent. Therefore, an interface between the biomechanical AMS and the virtual human model Jack was developed to integrate a musculoskeletal simulation with Jack posture modelling. This interface provides direct data exchange between the two man-models, based on a consistent data structure and common body model. The study assesses kinematic and biomechanical model characteristics of Jack and AMS, and defines an appropriate biomechanical model. The information content for interfacing the two systems is defined and a protocol is identified. The interface program is developed and implemented through Tcl and Jack-script (Python), and interacts with the AMS console application to operate AMS procedures.
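As a purely hypothetical sketch of the interfacing pattern described above (the executable name, flag and file layout below are placeholders, not the real Jack or AnyBody interfaces), posture data can be exported from one tool and handed to a console-driven solver:

    # Hypothetical sketch: exchange posture data with a console-driven solver.
    # The executable name, flag and JSON layout are placeholders, not the actual
    # Jack or AMS interfaces.
    import json
    import subprocess
    import tempfile
    from pathlib import Path

    def export_posture(joint_angles_deg: dict) -> Path:
        """Write joint angles from the posture model to an interchange file."""
        path = Path(tempfile.mkstemp(suffix=".json")[1])
        path.write_text(json.dumps({"joint_angles_deg": joint_angles_deg}))
        return path

    def run_solver(posture_file: Path) -> str:
        """Invoke a (placeholder) musculoskeletal solver on the exported posture."""
        try:
            result = subprocess.run(
                ["musculoskeletal_solver_placeholder", "--posture", str(posture_file)],
                capture_output=True, text=True, check=False)
            return result.stdout
        except FileNotFoundError:
            return "placeholder solver not installed"

    print(run_solver(export_posture({"elbow_flexion": 45.0, "shoulder_abduction": 30.0})))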

Relevance:

100.00%

Publisher:

Abstract:

Monitoring environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling analysis of acoustic data: online collaboration, manual, automatic and human-in-the-loop analysis.

Relevance:

100.00%

Publisher:

Abstract:

There is a need for an accurate real-time quantitative system that would enhance decision-making in the treatment of osteoarthritis. To achieve this objective, significant research is required that will enable articular cartilage properties to be measured and categorized for health and functionality without the need for laboratory tests involving biopsies for pathological evaluation. Such a system would provide access to the internal condition of the cartilage matrix and thus extend the vision-based arthroscopy that is currently used beyond the subjective evaluation of surgeons. The system required must be able to non-destructively probe the entire thickness of the cartilage and its immediate subchondral bone layer. In this thesis, near infrared spectroscopy is investigated for the purpose mentioned above. The aim is to relate the structure and load-bearing properties of the cartilage matrix to the near infrared absorption spectrum and establish functional relationships that will provide objective, quantitative and repeatable categorization of cartilage condition outside the area of visible degradation in a joint. Based on results from traditional mechanical testing, their innovative interpretation and their relationship with spectroscopic data, new parameters were developed. These were then evaluated for their consistency in discriminating between healthy, viable and degraded cartilage. The mechanical and physico-chemical properties were related to specific regions of the near infrared absorption spectrum that were identified as part of the research conducted for this thesis. The relationships between the tissue's near infrared spectral response and the new parameters were modeled using multivariate statistical techniques based on partial least squares regression (PLSR). With significantly high levels of statistical correlation, the modeled relationships were demonstrated to possess considerable potential in predicting the properties of unknown tissue samples in a quick and non-destructive manner. In order to adapt near infrared spectroscopy for clinical applications, the balance between probe diameter and the number of active transmit-receive optic fibres must be optimized. This was achieved in the course of this research, resulting in an optimal probe configuration that could be adapted for joint tissue evaluation. Furthermore, as a proof of concept, a protocol for obtaining the new parameters from the near infrared absorption spectra of cartilage was developed, implemented in graphical user interface (GUI)-based software, and used to assess cartilage-on-bone samples in vitro. This conceptual implementation has demonstrated, in part through the individual parametric relationships with the near infrared absorption spectrum, the capacity of the proposed system to facilitate real-time, non-destructive evaluation of cartilage matrix integrity. In summary, the potential of near infrared spectroscopy for evaluating the articular cartilage and bone laminate has been demonstrated in this thesis. The approach could have spin-offs for other soft tissues and organs of the body. It builds on the earlier work of the group at QUT, enhancing the near infrared component of the ongoing research on developing a tool for cartilage evaluation that goes beyond visual and subjective methods.
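As a minimal sketch of PLSR-based prediction of a tissue property from absorption spectra (synthetic data; the wavelength count, component number and scoring are illustrative assumptions, not the thesis's actual protocol):

    # Minimal PLSR sketch on synthetic 'spectra'; illustrative assumptions only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 200
    spectra = rng.normal(size=(n_samples, n_wavelengths))          # absorbance spectra
    prop = spectra[:, 40:60].mean(axis=1) + 0.1 * rng.normal(size=n_samples)

    pls = PLSRegression(n_components=5)
    print("cross-validated R^2:", cross_val_score(pls, spectra, prop, cv=5).mean())

For a regressor, cross_val_score defaults to R^2, which is one common way to summarise the predictive strength of such a model.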

Relevance:

100.00%

Publisher:

Abstract:

The INEX 2011 Relevance Feedback track offered a refined approach to the evaluation of Focused Relevance Feedback algorithms through simulated exhaustive user feedback. Run in largely identical fashion to the Relevance Feedback track in INEX 2010 [2], we simulated a user-in-the-loop by re-using the assessments of ad-hoc retrieval obtained from real users who assessed focused ad-hoc retrieval submissions. We present the evaluation methodology, its implementation, and experimental results obtained for four submissions from two participating organisations. As the task and evaluation methods did not change between INEX 2010 and now, explanations of these details from the INEX 2010 version of the track have been repeated verbatim where appropriate.
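As a toy illustration of the simulated user-in-the-loop idea (the ranking rule, names and data below are assumptions, not the track's software or topics):

    # Toy simulation of exhaustive user-in-the-loop relevance feedback: the
    # 'user' is the pre-existing ad-hoc assessments (qrels); after each ranking
    # the top unjudged document receives its real judgement before re-ranking.
    def run_simulated_feedback(qrels, collection, iterations=4):
        judged = {}
        for _ in range(iterations):
            if len(judged) == len(collection):
                break
            relevant_terms = {t for d, r in judged.items() if r for t in collection[d]}
            ranking = sorted(collection,
                             key=lambda d: -len(collection[d] & relevant_terms))
            top = next(d for d in ranking if d not in judged)
            judged[top] = qrels.get(top, 0)          # reuse the ad-hoc assessment
        return judged

    collection = {"d1": {"loop", "user"}, "d2": {"user", "feedback"},
                  "d3": {"loop", "xml"}, "d4": {"xml", "index"}}
    qrels = {"d2": 1, "d3": 1}
    print(run_simulated_feedback(qrels, collection))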

Relevance:

100.00%

Publisher:

Abstract:

In the field of face recognition, Sparse Representation (SR) has received considerable attention during the past few years. Most of the relevant literature focuses on holistic descriptors in closed-set identification applications. The underlying assumption in SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the more challenging face verification scenario, where an algorithm is required to determine if two faces (where one or both have not been seen before) belong to the same person. In this paper, we first discuss why previous attempts with SR might not be applicable to verification problems. We then propose an alternative approach to face verification via SR. Specifically, we propose to use explicit SR encoding on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which are then concatenated to form an overall face descriptor. Due to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, we evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN), and an implicit probabilistic technique based on Gaussian Mixture Models. Thorough experiments on the AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the proposed local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, in both verification and closed-set identification problems. The experiments also show that l1-minimisation based encoding has a considerably higher computational cost than the other techniques, but leads to higher recognition rates.
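A rough sketch of the local sparse-coding pipeline described above follows (the patch size, dictionary, sparsity level and four-way region split are illustrative assumptions, not the paper's settings):

    # Sketch of a local sparse-representation face descriptor: encode small
    # patches against a dictionary, average-pool codes per region, concatenate.
    import numpy as np
    from sklearn.feature_extraction.image import extract_patches_2d
    from sklearn.decomposition import SparseCoder

    rng = np.random.default_rng(0)
    face = rng.random((32, 32))                          # stand-in for an aligned face
    dictionary = rng.normal(size=(100, 64))              # 100 atoms for 8x8 patches
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

    patches = extract_patches_2d(face, (8, 8)).reshape(-1, 64)
    coder = SparseCoder(dictionary=dictionary,
                        transform_algorithm="lasso_lars", transform_alpha=0.1)
    codes = coder.transform(patches)                     # one sparse code per patch

    regions = np.array_split(codes, 4)                   # toy stand-in for spatial regions
    descriptor = np.concatenate([r.mean(axis=0) for r in regions])
    print(descriptor.shape)                              # (400,) = 4 regions x 100 atoms

The averaging step is what discards exact spatial position within a region, which is the source of the robustness to misalignment noted above.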

Relevance:

100.00%

Publisher:

Abstract:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than applying a two-step process of providing confidentiality for a message by encrypting the message, and in a separate pass providing integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided by either stream ciphers with built-in authentication mechanisms or block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that analyses these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms. It analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little on these in the public literature in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal states of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent SBox in the NLF twice, and using two different SBoxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model, namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
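As a loose toy illustration of the direct-injection accumulation idea (a generic stand-in, not SSS, NLSv2 or SOBER-128; the update matrix and filter below are arbitrary assumptions):

    # Toy direct-injection MAC accumulation: each message bit is XORed straight
    # into an internal register that is stepped by a fixed binary matrix; a
    # nonlinear filter compresses the final state into a short tag.
    import numpy as np

    STATE = 16
    rng = np.random.default_rng(1)
    A = rng.integers(0, 2, size=(STATE, STATE))          # stand-in linear update (over GF(2))

    def accumulate(message_bits, key_state):
        state = np.array(key_state)
        for bit in message_bits:
            state = (A @ state) % 2                      # linear state update
            state[0] ^= bit                              # direct injection of the message bit
        return state

    def nonlinear_filter(state):
        # Toy nonlinear filter: AND/XOR a few state bits into a 4-bit tag.
        return [int((state[i] & state[i + 1]) ^ state[i + 2]) for i in range(4)]

    key_state = rng.integers(0, 2, size=STATE)
    print(nonlinear_filter(accumulate([1, 0, 1, 1, 0, 0, 1], key_state)))

In the indirect-injection model described last, the message bits would instead act as a controller selecting which keystream values are accumulated into a separate register, rather than being XORed into the cipher state themselves.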