164 results for Computer aided analysis, Machine vision, Video surveillance
Abstract:
Spontaneous facial expressions differ from posed ones in appearance, timing and accompanying head movements. Still images cannot provide timing or head-movement information directly. Indirectly, however, the distances between key points on a face, extracted from a still image using active shape models, can capture some movement and pose changes. This information is superposed on information about the non-rigid facial movement that is also part of the expression. Does geometric information improve the discrimination between spontaneous and posed facial expressions arising from discrete emotions? We investigate the performance of a machine vision system for discriminating between posed and spontaneous versions of six basic emotions that uses SIFT appearance-based features and FAP geometric features. Experimental results on the NVIE database demonstrate that fusing in geometric information leads to only a marginal improvement over appearance features alone. Using the fused features, surprise is the easiest emotion to distinguish (83.4% accuracy), while disgust is the most difficult (76.1%). Our results show that the facial regions important for discriminating posed from spontaneous versions of an emotion differ from those important for classifying that emotion against other emotions. The distribution of the selected SIFT features shows that the mouth is more important for sadness and the nose for surprise, whereas both the nose and mouth are important for disgust, fear, and happiness. The eyebrows, eyes, nose and mouth are all important for anger.
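For readers who want a concrete picture of the feature-level fusion step, the following is a minimal sketch (in Python, using scikit-learn) of concatenating precomputed appearance and geometric descriptors and comparing classification accuracy with and without fusion. The array shapes, sample counts and random stand-in features are assumptions for illustration; the actual SIFT/FAP extraction and the NVIE experimental protocol are not reproduced here.

```python
# Minimal sketch of appearance + geometric feature fusion for posed-vs-spontaneous
# classification, assuming features have already been extracted per face image.
# The feature extraction itself (SIFT descriptors around facial landmarks, FAP-style
# distances between active-shape-model key points) is not reproduced; the arrays
# below are random stand-ins for those precomputed features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 200                                   # hypothetical number of face images
sift_feats = rng.normal(size=(n_samples, 128))    # appearance features (e.g. pooled SIFT)
fap_feats = rng.normal(size=(n_samples, 20))      # geometric features (e.g. FAP distances)
labels = rng.integers(0, 2, size=n_samples)       # 0 = posed, 1 = spontaneous

# Feature-level fusion: concatenate appearance and geometric descriptors.
fused = np.hstack([sift_feats, fap_feats])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
for name, X in [("appearance only", sift_feats), ("fused", fused)]:
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```

The concatenate-then-normalise pattern shown here is simply a common baseline for feature-level fusion; on real data the two accuracy figures would reveal how much the geometric channel adds.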
Abstract:
Corporate reputation is viewed as fundamental to firm performance, growth and survival, and the maintenance and enhancement of that reputation is a key responsibility of senior executives. However, relatively little is known about the main dimensions of corporate reputation and the amount of attention given to them by senior executives. Based on the corporate reputation and intangible resources literatures, thirteen reputational elements were identified, and the amount of attention given to those elements in a large, longitudinal sample of annual reports from Australian firms was measured using computer-aided text analysis. This identified five main reputational dimensions that were both stable over time and related to firms’ future financial performance.
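As an illustration of what a computer-aided text-analysis measure of attention can look like, here is a hedged sketch of a keyword-frequency score over an annual report. The element names and keyword lists are invented placeholders, not the thirteen elements identified in the study.

```python
# A minimal sketch of computer-aided text analysis: estimating how much attention an
# annual report gives to a few reputational elements via keyword frequencies.
# The element names and keyword lists are illustrative only.
import re
from collections import Counter

ELEMENT_KEYWORDS = {
    "financial performance": ["profit", "earnings", "revenue", "dividend"],
    "product quality": ["quality", "reliability", "innovation"],
    "social responsibility": ["community", "environment", "sustainability"],
}

def attention_scores(report_text: str) -> dict[str, float]:
    """Return the share of words in the report attributable to each element."""
    words = re.findall(r"[a-z']+", report_text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return {
        element: sum(counts[k] for k in keywords) / total
        for element, keywords in ELEMENT_KEYWORDS.items()
    }

print(attention_scores("Revenue and profit grew while our community programs expanded."))
```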
Abstract:
Computational Intelligence Systems (CIS) are a class of advanced software that occupy an important position in solving single-objective, reverse/inverse and multi-objective design problems in engineering. This paper hybridises a CIS for optimisation with the concept of Nash equilibrium, used as an optimisation pre-conditioner to accelerate the optimisation process. The hybridised CIS (Hybrid Intelligence System), coupled to a Finite Element Analysis (FEA) tool and a Computer Aided Design (CAD) system, GiD, is applied to an inverse engineering design problem: reconstruction of High Lift Systems (HLS). Numerical results obtained by the hybridised CIS are compared to the results obtained by the original CIS. The benefits of using the concept of Nash equilibrium are clearly demonstrated in terms of solution accuracy and optimisation efficiency.
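The following is a generic sketch of the Nash-equilibrium pre-conditioning idea: the design variables are split between players that each optimise their own subset with the others frozen, and the resulting equilibrium point is used to warm-start a full optimisation. The toy analytic objective and the two-variable split are assumptions for illustration; the paper's Hybrid Intelligence System, the GiD coupling and the FEA-based HLS reconstruction problem are not reproduced.

```python
# Generic sketch of a Nash-equilibrium game used as an optimisation pre-conditioner.
# The objective below is a toy analytic function standing in for an expensive
# FEA-based fitness evaluation.
import numpy as np
from scipy.optimize import minimize

def objective(x: np.ndarray) -> float:
    # Non-separable toy objective: the cross term couples the two players.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + 0.5 * x[0] * x[1]

def nash_preconditioner(x0, n_rounds=10):
    x = np.array(x0, dtype=float)
    for _ in range(n_rounds):
        # Player 1 optimises x[0] while x[1] is frozen.
        r1 = minimize(lambda a: objective(np.array([a[0], x[1]])), [x[0]])
        x[0] = r1.x[0]
        # Player 2 optimises x[1] while x[0] is frozen.
        r2 = minimize(lambda b: objective(np.array([x[0], b[0]])), [x[1]])
        x[1] = r2.x[0]
    return x

x_eq = nash_preconditioner([5.0, 5.0])
final = minimize(objective, x_eq)   # full coupled optimisation seeded at the equilibrium
print("Nash seed:", x_eq, "final:", final.x)
```

The equilibrium point is cheap to compute because each player solves a lower-dimensional subproblem; seeding the full optimiser from it is what accelerates convergence in this kind of hybrid scheme.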
Abstract:
Proteases regulate a spectrum of diverse physiological processes, and dysregulation of proteolytic activity drives a plethora of pathological conditions. Understanding protease function is essential to appreciating many aspects of normal physiology and progression of disease. Consequently, development of potent and specific inhibitors of proteolytic enzymes is vital to provide tools for the dissection of protease function in biological systems and for the treatment of diseases linked to aberrant proteolytic activity. The studies in this thesis describe the rational design of potent inhibitors of three proteases that are implicated in disease development. Additionally, key features of the interaction of proteases and their cognate inhibitors or substrates are analysed and a series of rational inhibitor design principles are expounded and tested. Rational design of protease inhibitors relies on a comprehensive understanding of protease structure and biochemistry. Analysis of known protease cleavage sites in proteins and peptides is a commonly used source of such information. However, model peptide substrate and protein sequences have widely differing levels of backbone constraint and hence can adopt highly divergent structures when binding to a protease’s active site. This may result in identical sequences in peptides and proteins having different conformations and diverse spatial distribution of amino acid functionalities. Despite this, protein and peptide cleavage sites are often regarded as being equivalent. One of the key findings in the following studies is a definitive demonstration of the lack of equivalence between these two classes of substrate and invalidation of the common practice of using the sequences of model peptide substrates to predict cleavage of proteins in vivo. Another important feature for protease substrate recognition is subsite cooperativity. This type of cooperativity is commonly referred to as protease or substrate binding subsite cooperativity and is distinct from allosteric cooperativity, where binding of a molecule distant from the protease active site affects the binding affinity of a substrate. Subsite cooperativity may be intramolecular, where interactions between neighbouring residues in a substrate affect the scissile bond’s susceptibility to protease cleavage. Subsite cooperativity can also be intermolecular, where a particular residue’s contribution to binding affinity changes depending on the identity of neighbouring amino acids. Although numerous studies have identified subsite cooperativity effects, these findings are frequently ignored in investigations probing subsite selectivity by screening against diverse combinatorial libraries of peptides (positional scanning synthetic combinatorial library; PS-SCL). This strategy for determining cleavage specificity relies on the averaged rates of hydrolysis for an uncharacterised ensemble of peptide sequences, as opposed to the defined rate of hydrolysis of a known specific substrate. Further, since PS-SCL screens probe the preference of the various protease subsites independently, this method is inherently unable to detect subsite cooperativity. However, mean hydrolysis rates from PS-SCL screens are often interpreted as being comparable to those produced by single peptide cleavages.
Before this study, no large systematic evaluation had been made to determine the level of correlation between protease selectivity as predicted by screening against a library of combinatorial peptides and cleavage of individual peptides. This subject is specifically explored in the studies described here. In order to establish whether PS-SCL screens could accurately determine the substrate preferences of proteases, a systematic comparison of data from PS-SCLs with libraries containing individually synthesised peptides (sparse matrix library; SML) was carried out. These SML libraries were designed to include all possible sequence combinations of the residues that were suggested to be preferred by a protease using the PS-SCL method. SML screening against the three serine proteases kallikrein 4 (KLK4), kallikrein 14 (KLK14) and plasmin revealed highly preferred peptide substrates that could not have been deduced by PS-SCL screening alone. Comparing protease subsite preference profiles from screens of the two types of peptide libraries showed that the most preferred substrates were not detected by PS-SCL screening as a consequence of intermolecular cooperativity being negated by the very nature of PS-SCL screening. Sequences that are highly favoured as a result of intermolecular cooperativity achieve optimal protease subsite occupancy, and thereby interact with very specific determinants of the protease. Identifying these substrate sequences is important since they may be used to produce potent and selective inhibitors of proteolytic enzymes. This study found that highly favoured substrate sequences that relied on intermolecular cooperativity allowed for the production of potent inhibitors of KLK4, KLK14 and plasmin. Peptide aldehydes based on preferred plasmin sequences produced high-affinity transition state analogue inhibitors for this protease. The most potent of these maintained specificity over plasma kallikrein (known to have a very similar substrate preference to plasmin). Furthermore, the efficiency of this inhibitor in blocking fibrinolysis in vitro was comparable to aprotinin, which previously saw clinical use to reduce perioperative bleeding. One substrate sequence particularly favoured by KLK4 was substituted into the 14-amino-acid circular sunflower trypsin inhibitor (SFTI). This resulted in a highly potent and selective inhibitor (SFTI-FCQR) which attenuated protease-activated receptor signalling by KLK4 in vitro. Moreover, SFTI-FCQR and paclitaxel synergistically reduced growth of ovarian cancer cells in vitro, making this inhibitor a lead compound for further therapeutic development. Similar incorporation of a preferred KLK14 amino acid sequence into the SFTI scaffold produced a potent inhibitor for this protease. However, the conformationally constrained SFTI backbone enforced a different intramolecular cooperativity, which masked a KLK14-specific determinant. As a consequence, the level of selectivity achievable was lower than that found for the KLK4 inhibitor. Standard mechanism inhibitors such as SFTI rely on a stable acyl-enzyme intermediate for high-affinity binding. This is achieved by a conformationally constrained canonical binding loop that allows for reformation of the scissile peptide bond after cleavage. Amino acid substitutions within the inhibitor to target a particular protease may compromise structural determinants that support the rigidity of the binding loop and thereby prevent the engineered inhibitor from reaching its full potential.
An in silico analysis was carried out to examine the potential for further improvements to the potency and selectivity of the SFTI-based KLK4 and KLK14 inhibitors. Molecular dynamics simulations suggested that the substitutions within SFTI required to target KLK4 and KLK14 had compromised the intramolecular hydrogen bond network of the inhibitor and caused a concomitant loss of binding loop stability. Furthermore, in silico amino acid substitution revealed a consistent correlation between a higher frequency of formation and number of internal hydrogen bonds in SFTI variants and lower inhibition constants. These predictions allowed for the production of second-generation inhibitors with enhanced binding affinity toward both targets and highlight the importance of considering intramolecular cooperativity effects when engineering proteins or circular peptides to target proteases. The findings from this study show that although PS-SCLs are a useful tool for high-throughput screening of approximate protease preference, later refinement by SML screening is needed to reveal optimal subsite occupancy due to cooperativity in substrate recognition. This investigation has also demonstrated the importance of maintaining structural determinants of backbone constraint and conformation when engineering standard mechanism inhibitors for new targets. Combined, these results show that backbone conformation and amino acid cooperativity have more prominent roles than previously appreciated in determining substrate/inhibitor specificity and binding affinity. The three key inhibitors designed during this investigation are now being developed as lead compounds for cancer chemotherapy, control of fibrinolysis and cosmeceutical applications. These compounds form the basis of a portfolio of intellectual property which will be further developed in the coming years.
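To make the PS-SCL averaging argument concrete, the following toy example (with invented residue names and rates, not data from the thesis) shows how positional averaging can point to a suboptimal sequence, while an individually synthesised sparse-matrix library reveals a cooperative pair with a much higher hydrolysis rate.

```python
# Toy illustration of why positional averaging (as in PS-SCL screening) can miss
# substrates whose high cleavage rates depend on intermolecular subsite cooperativity.
# Residue names and rate values are invented for illustration.
from itertools import product

# Hydrolysis rates for every P2/P1 combination (a sparse-matrix-library style dataset).
rates = {
    ("A", "X"): 2.0, ("A", "Y"): 2.0, ("A", "Z"): 2.0,
    ("B", "X"): 0.1, ("B", "Y"): 0.1, ("B", "Z"): 5.0,   # B works only with Z (cooperative pair)
}
p2_residues, p1_residues = ["A", "B"], ["X", "Y", "Z"]

# PS-SCL-style readout: average each residue over all partners at the other position.
p2_avg = {r: sum(rates[(r, q)] for q in p1_residues) / len(p1_residues) for r in p2_residues}
p1_avg = {q: sum(rates[(r, q)] for r in p2_residues) / len(p2_residues) for q in p1_residues}
predicted = (max(p2_avg, key=p2_avg.get), max(p1_avg, key=p1_avg.get))

# Individual-substrate readout: the actual best combination.
best = max(product(p2_residues, p1_residues), key=lambda s: rates[s])

print("PS-SCL-style prediction:", predicted, "rate =", rates[predicted])
print("True best substrate:   ", best, "rate =", rates[best])
```

Because residue B averages poorly outside its cooperative partner Z, the averaged readout nominates A at P2 and misses the fastest-cleaved sequence, which is exactly the failure mode that screening individually synthesised peptides exposes.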
Abstract:
This report discusses the findings of a case study into "CADD, BIM and IPD" undertaken as part of the retrospective analysis component of Sustainable Built Environment National Research Centre (SBEnrc) Project 2.7, Leveraging R&D investment for the Australian Built Environment. The case study investigated the evolution that has taken place in the Queensland Department of Public Works Division of Project Services during the last 20 years: from the initial implementation of computer aided design and documentation (CADD); to experimentation with building information modelling (BIM) from the mid-2000s; to embedding integrated practice (IP); to current steps towards integrated project delivery (IPD) with the integration of contractors in the design/delivery process. This case study should be read in conjunction with Part 1 of this suite of reports.
Abstract:
Several track-before-detection approaches for image-based aircraft detection have recently been examined in an important automated aircraft collision detection application. A particularly popular approach is a two-stage processing paradigm which involves: a morphological spatial filter stage (which aims to emphasize the visual characteristics of targets) followed by a temporal or track filter stage (which aims to emphasize the temporal characteristics of targets). In this paper, we propose new spot detection techniques for this two-stage processing paradigm that fuse together raw and morphological images or fuse together various different morphological images (we call these approaches morphological reinforcement). On the basis of flight test data, the proposed morphological reinforcement operations are shown to offer superior signal-to-noise characteristics when compared to standard spatial filter options (such as the close-minus-open and adaptive contour morphological operations). However, system operating characteristic curves, which examine detection versus false alarm characteristics after both processing stages, illustrate that system performance is very data dependent.
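As a concrete illustration of the spatial filter stage, the sketch below implements a close-minus-open filter and a simple element-wise fusion of responses at two scales. The fusion rule and structuring-element sizes are assumptions for illustration; they are not the specific morphological reinforcement operations proposed in the paper.

```python
# Minimal sketch of the two-stage spot-detection idea's first stage: a close-minus-open
# (CMO) morphological filter to emphasise small dim targets, followed by a simple
# pixel-wise fusion of two differently scaled responses.
import numpy as np
from scipy.ndimage import grey_closing, grey_opening

def cmo(image: np.ndarray, size: int) -> np.ndarray:
    """Close-minus-open filter: responds to structures smaller than the window."""
    return grey_closing(image, size=size) - grey_opening(image, size=size)

def fused_response(image: np.ndarray) -> np.ndarray:
    """Fuse CMO responses at two scales; a target must respond at both to survive."""
    return np.minimum(cmo(image, size=3), cmo(image, size=5))

# Synthetic frame: a low-contrast point target on a noisy sky background.
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 1.0, size=(64, 64))
frame[32, 32] += 8.0                      # dim point-like aircraft target
response = fused_response(frame)
print("Peak response at:", np.unravel_index(np.argmax(response), response.shape))
```

In the full paradigm the fused spatial response would then be passed to the temporal/track filter stage, which accumulates evidence over frames before declaring a detection.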
Abstract:
The most common software analysis tools available for measuring fluorescence images are designed for two-dimensional (2D) data and rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed from the approximation and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable to a biological application, Imaris developed Imaris Cell. This was a scientific project with the Eidgenössische Technische Hochschule, developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous because it ideally builds the cell surface without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for cells of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but limited familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
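For orientation, the following generic sketch shows the kind of shape-independent 3D morphometry such a platform automates: thresholding a confocal stack, labelling 3D connected components and reporting per-object volume and centroid. It uses scikit-image rather than the Imaris XT/MATLAB interface described above, and the synthetic stack and voxel size are placeholders.

```python
# Generic sketch of automated 3D morphometry on a confocal stack (z, y, x): threshold,
# label connected components, and report per-object volume and centroid. This is not
# the Imaris XT / MATLAB platform described above; it only illustrates the kind of
# shape-independent 3D measurement the platform is designed to automate.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def measure_stack(stack: np.ndarray, voxel_size=(0.5, 0.1, 0.1)):
    """Return volume (in voxel-size units) and centroid of each labelled 3D object."""
    mask = stack > threshold_otsu(stack)
    labels = label(mask)                       # 3D connected components
    voxel_volume = float(np.prod(voxel_size))  # e.g. cubic micrometres per voxel
    return [
        {"volume": r.area * voxel_volume, "centroid": r.centroid}
        for r in regionprops(labels)
    ]

# Synthetic stack with two fluorescent blobs.
stack = np.zeros((20, 64, 64), dtype=float)
stack[5:10, 10:20, 10:20] = 1.0
stack[12:16, 40:50, 40:48] = 1.0
stack += np.random.default_rng(2).normal(0, 0.05, stack.shape)
for obj in measure_stack(stack):
    print(obj)
```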
Abstract:
The development and design of high-power electric devices with electromagnetic computer-aided engineering (EM-CAE) software, based on methods such as the Finite Element Method (FEM) and the Boundary Element Method (BEM), has been widely adopted. This paper presents the analysis of a Fault Current Limiter (FCL), which acts as a high-voltage surge protector for power grids. A prototype FCL was built. The magnetic flux in the core and the resulting electromagnetic forces in the winding of the FCL were analyzed using both FEM and BEM. An experiment on the prototype was conducted in a laboratory. The data obtained from the experiment are compared to the numerical solutions to determine the suitability and accuracy of the two methods.
Abstract:
Data structures such as k-D trees and hierarchical k-means trees perform very well in approximate k-nearest-neighbour matching, but are only marginally more effective than linear search when performing exact matching in high-dimensional image descriptor data. This paper presents several improvements to linear search that allow it to outperform existing methods, and recommends two approaches to exact matching. The first method reduces the number of operations by evaluating the distance measure in order of significance of the query dimensions and terminating when the partial distance exceeds the search threshold. This method does not require preprocessing and significantly outperforms existing methods. The second method improves query speed further by presorting the data using a data structure called d-D sort. The order information is used as a priority queue to reduce the time taken to find the exact match and to restrict the range of data searched. Construction of the d-D sort structure is very simple to implement, does not require any parameter tuning, requires significantly less time than the best-performing tree structure, and allows data to be added to the structure relatively efficiently.
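A minimal sketch of the first improvement is given below: the squared distance is accumulated dimension by dimension and a candidate is abandoned as soon as the running total exceeds the best distance found so far. The "significance" ordering used here (expected contribution of each dimension to the distance from the query) is an assumed heuristic for illustration, and the paper's d-D sort presorting structure is not reproduced.

```python
# Exact nearest-neighbour search with partial-distance evaluation and early termination.
import numpy as np

def exact_nn_partial_distance(data: np.ndarray, query: np.ndarray) -> tuple[int, float]:
    # Visit dimensions with the largest expected squared difference from the query first,
    # so large distance terms accumulate early and candidates can be rejected sooner.
    expected_contrib = np.var(data, axis=0) + (np.mean(data, axis=0) - query) ** 2
    order = np.argsort(-expected_contrib)
    best_idx, best_dist = -1, np.inf
    for i, point in enumerate(data):
        partial = 0.0
        for d in order:
            partial += (point[d] - query[d]) ** 2
            if partial >= best_dist:          # early termination: cannot beat current best
                break
        else:
            best_idx, best_dist = i, partial  # full distance computed and it is a new best
    return best_idx, best_dist

data = np.random.default_rng(3).random((1000, 128)).astype(np.float32)
query = data[42] + 0.01
idx, dist = exact_nn_partial_distance(data, query)
print(idx, float(dist))
```

Because a candidate is only discarded once its partial distance already exceeds the best complete distance, the result is guaranteed to be the exact nearest neighbour; the ordering heuristic affects only how early the rejections happen.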
Abstract:
An evolution in the use of digital modelling has occurred in the Queensland Department of Public Works Division of Project Services over the last 20 years: from the initial implementation of computer aided design and documentation (CADD); to experimentation with building information modelling (BIM); to embedding integrated practice (IP); to current steps towards integrated project delivery (IPD), including the active involvement of consultants and contractors in the design/delivery process. This case study is one of three undertaken through the Australian Sustainable Built Environment National Research Centre investigating past R&D investment. The intent of these cases is to inform the development of policy guidelines for future investment in the construction industry in Australia. This research is informing the activities of CIB Task Group 85 R&D Investment and Impact. The uptake of digital modelling by Project Services has followed an incremental learning approach, driven by a strong and clear vision focused on developing more efficient delivery mechanisms through the use of new technology coupled with process change. Findings reveal an organisational focus on several areas including: (i) strategic decision making, including the empowerment of innovation leaders and champions; (ii) the acquisition and exploitation of knowledge; (iii) product and process development (with a focus on efficiency and productivity); (iv) organisational learning; (v) maximising the use of technology; and (vi) supply chain integration. Key elements of this approach include pilot projects, researcher engagement, industry partnerships and leadership.
Abstract:
This paper considers the design of a radial-flux permanent-magnet ironless-core brushless DC motor for use in an electric wheel drive with an integrated epicyclic gear reduction. The motor has been designed for a continuous output torque of 30 Nm and a peak rating of 60 Nm, with a maximum operating speed of 7000 RPM. In the design of brushless DC motors with a toothed iron stator, the peak air-gap magnetic flux density is typically chosen to be close to the remanence value of the magnets used. This paper demonstrates that for an ironless motor the optimal peak air-gap flux density is closer to the flux density at the maximum energy product of the magnets used. The use of a radial flux topology allows for high-frequency operation and can be shown to give high specific power output while maintaining a relatively low magnet mass. Two-dimensional finite element analysis is used to predict the air-gap flux density. The motor design is based around commonly available NdFeB bar magnet sizes.
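A brief note on why the optimum sits near the maximum-energy-product point rather than at remanence: for an idealised linear demagnetisation characteristic (an assumption made here for illustration; real NdFeB curves are only approximately linear over the normal operating range), the energy product is maximised at half the remanent flux density:

```latex
B(H) = B_r\!\left(1 + \frac{H}{H_c}\right), \qquad -H_c \le H \le 0
\quad\Longrightarrow\quad
|BH| = B_r\,|H|\left(1 - \frac{|H|}{H_c}\right),
\]
\[
\frac{\mathrm{d}\,|BH|}{\mathrm{d}\,|H|} = 0
\;\Rightarrow\;
|H|_{\mathrm{opt}} = \frac{H_c}{2},\qquad
B_{\mathrm{opt}} = \frac{B_r}{2},\qquad
(BH)_{\max} = \frac{B_r H_c}{4}.
```

Under this simplification, operating the ironless air gap near the magnets' maximum energy product implies a peak flux density of roughly half the remanence, which is why the toothed-stator rule of thumb (flux density near the remanence value) does not carry over to the ironless design.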
Abstract:
Successful anatomic fitting of a total artificial heart (TAH) is vital to achieve optimal pump hemodynamics after device implantation. Although many anatomic fitting studies have been completed in humans prior to clinical trials, few reports exist that detail the experience in animals for in vivo device evaluation. Optimal hemodynamics are crucial throughout the in vivo phase to direct design iterations and ultimately validate device performance prior to pivotal human trials. In vivo evaluation in a sheep model allows a realistically sized representation of a smaller patient, which smaller third-generation TAHs have the potential to treat. Our study aimed to assess the anatomic fit of a single-device rotary TAH in sheep prior to animal trials and to use the data to develop a three-dimensional, computer-aided design (CAD)-operated anatomic fitting tool for future TAH development. Following excision of the native ventricles above the atrio-ventricular groove, a prototype TAH was inserted within the chest cavity of six sheep (28–40 kg). Adjustable rods representing inlet and outlet conduits were oriented toward the center of each atrial chamber and the great vessels, with conduit lengths and angles recorded for future analysis. A three-dimensional, CAD-operated anatomic fitting tool was then developed, based on the results of this study, and used to determine the inflow and outflow conduit orientation of the TAH. The mean diameters of the sheep left atrium, right atrium, aorta, and pulmonary artery were 39, 33, 12, and 11 mm, respectively. The center-to-center distance and outer-edge-to-outer-edge distance between the atria, found to be 39 ± 9 mm and 72 ± 17 mm in this study, were identified as the most critical geometries for successful TAH connection. This geometric constraint restricts the maximum separation allowable between the left and right inlet ports of a TAH to ensure successful alignment within the available atrial circumference.
Abstract:
Digital Human Models (DHM) have been used for over 25 years. They have evolved from simple drawing templates, which are nowadays still used in architecture, to complex, Computer Aided Engineering (CAE) integrated design and analysis tools for various ergonomic tasks. DHM are most frequently used for applications in product design and production planning, with many successful implementations documented. DHM from other domains, for example computer user interfaces, artificial intelligence, training and education, or the entertainment industry, show that there is also an ongoing development towards a comprehensive understanding and holistic modeling of human behavior. While the development of DHM for the game sector has seen significant progress in recent years, advances of DHM in the area of ergonomics have been comparatively modest. As a consequence, we need to question whether current DHM systems are fit for the design of future mobile work systems. So far it appears that DHM in ergonomics are rather limited to some traditional applications. According to Dul et al. (2012), future characteristics of Human Factors and Ergonomics (HFE) can be assigned to six main trends: (1) global change of work systems; (2) cultural diversity; (3) ageing; (4) information and communication technology (ICT); (5) enhanced competitiveness and the need for innovation; and (6) sustainability and corporate social responsibility. Based on a literature review, we systematically investigate the capabilities of current ergonomic DHM systems against these ‘Future of Ergonomics’ requirements. It is found that DHMs already provide broad functionality in support of trends (1) and (2), and more limited options in regard to trend (3). Today’s DHM provide access to a broad range of national and international databases for correct differentiation and characterization of anthropometry for global populations. Some DHM explicitly address social and cultural modeling of groups of people. In comparison, the trends of the growing importance of ICT (4), the need for innovation (5) and sustainability (6) are addressed primarily from a hardware-oriented and engineering perspective and are not reflected in DHM. This reflects a persistent separation between hardware design (engineering) and software design (information technology) in the view of DHM, a disconnection which needs to be urgently overcome in the era of software-defined user interfaces and mobile devices. The design of a mobile ICT device is discussed to exemplify the need for a comprehensive future DHM solution. Designing such mobile devices requires an approach that includes organizational aspects as well as technical and cognitive ergonomics. Multiple interrelationships between the different aspects result in a challenging setting for future DHM. In conclusion, the ‘Future of Ergonomics’ poses particular challenges for DHM in regard to the design of mobile work systems and, moreover, mobile information access.