927 results for feature inspection method
Abstract:
In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time-invariant safety. Since the time-invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempt has been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, no assessment has been made of how accurately the modified EB method estimates safety in the presence of time-variant safety and regression-to-the-mean (RTM) effects. This study derives the closed-form marginal distribution and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method, because the modified EB method simultaneously accounts for the RTM and time-variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of time-variant safety and RTM effects.
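The posterior implied by this framework is compact enough to sketch. Below is a minimal Python illustration of the modified EB estimate under the multivariate Poisson-gamma mixture described above; the hyper-parameter and SPF values are hypothetical stand-ins, not estimates from any real data.

```python
import numpy as np

# Modified EB sketch under a multivariate Poisson-gamma mixture.
# Site effect: theta ~ Gamma(shape=phi, rate=phi), so E[theta] = 1.
# Year-t count: x_t ~ Poisson(theta * mu_t), with mu_t a time-variant
# safety performance function (SPF) prediction.

phi = 2.5                            # assumed overdispersion hyper-parameter
mu = np.array([3.1, 3.4, 2.9, 3.6])  # assumed SPF predictions per year
x = np.array([5, 2, 4, 7])           # observed crash counts per year

# Gamma posterior for theta: shape = phi + sum(x), rate = phi + sum(mu).
# Marginalising theta out of this mixture gives the NM distribution.
post_mean = (phi + x.sum()) / (phi + mu.sum())

eb = mu * post_mean                  # EB-adjusted expected crashes per year
print(eb.round(2))
```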
A modified inverse integer Cholesky decorrelation method and the performance on ambiguity resolution
Abstract:
One of the research focuses in the integer least squares problem is the decorrelation technique, used to reduce the number of integer parameter search candidates and improve the efficiency of the integer parameter search method. Decorrelation remains a challenging issue in determining carrier phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proved and demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be used directly as a performance measure, since it denotes the size of the search space; however, a smaller initial volume of the search ellipsoid does not always correspond to a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated using the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate ambiguity validation performance. The methods were tested and compared using both simulated and real data. The simulation scenarios employ an isotropic probabilistic model with a predetermined eigenvalue and without any geometry or weighting-system constraints. The MIICD method outperformed the other three methods, improving conditioning over the LAMBDA method by 78.33% and 81.67% without and with the eigenvalue constraint respectively. The real-data scenarios involve both single-constellation and dual-constellation cases. Experimental results demonstrate that, compared with LAMBDA, the MIICD method can significantly reduce the condition number, by 78.65% and 97.78% in the single-constellation and dual-constellation cases respectively. It also improves the number of search candidate points by 98.92% and 100% in the single-constellation and dual-constellation cases.
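As a rough illustration of the criterion involved (not the MIICD algorithm itself, which the abstract does not specify), the sketch below applies one integer Gaussian decorrelation step to a hypothetical 2x2 ambiguity covariance matrix and shows the condition number dropping; a full method would iterate such steps together with permutations.

```python
import numpy as np

# One integer Gaussian decorrelation step on a hypothetical 2x2
# float-ambiguity covariance matrix Q, judged by the condition number.
Q = np.array([[6.290, 5.978],
              [5.978, 6.292]])

def integer_gauss_step(Q):
    """Z is unimodular with integer entries, so the transformed
    ambiguities z = Z a remain integers."""
    mu = int(np.rint(Q[0, 1] / Q[1, 1]))   # nearest-integer multiplier
    Z = np.array([[1, -mu], [0, 1]])
    return Z, Z @ Q @ Z.T

Z, Qz = integer_gauss_step(Q)
print(np.linalg.cond(Q))   # ~39: strongly correlated, elongated ellipsoid
print(np.linalg.cond(Qz))  # ~10: better conditioned after one step
```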
Abstract:
This paper presents a method of voice activity detection (VAD) for high-noise scenarios, using a noise-robust voiced-speech detection feature. The developed method is based on the fusion of two systems. The first system utilises the maximum peak of the normalised time-domain autocorrelation function (MaxPeak). The second system uses a novel combination of cross-correlation and the zero-crossing rate of the normalised autocorrelation to approximate a measure of signal pitch and periodicity (CrossCorr) that is hypothesised to be noise robust. The scores output by the two systems are then merged using weighted-sum fusion to create the proposed autocorrelation zero-crossing rate (AZR) VAD. The accuracy of AZR was compared with state-of-the-art and standardised VAD methods, and it outperformed the best-performing system with an average relative improvement of 24.8% in half-total error rate (HTER) on the QUT-NOISE-TIMIT database, created using real recordings from high-noise environments.
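A minimal per-frame sketch of the two cues and their weighted-sum fusion is given below. The frame rate, lag range, weights and threshold are illustrative assumptions (the paper's tuned values are not given in the abstract), and the CrossCorr system is reduced here to its zero-crossing-rate component.

```python
import numpy as np

# Per-frame sketch of MaxPeak, a ZCR-based periodicity cue, and their
# weighted-sum fusion. Assumes 8 kHz frames of at least 200 samples.

def normalised_autocorr(frame):
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    return ac / (ac[0] + 1e-12)

def max_peak_score(frame, lo=20, hi=160):
    # MaxPeak: largest autocorrelation peak in the plausible pitch-lag
    # range (50-400 Hz at 8 kHz); high for voiced speech, low for noise.
    return normalised_autocorr(frame)[lo:hi].max()

def zcr_score(frame):
    # Periodic (voiced) frames yield a low, regular zero-crossing rate
    # in the autocorrelation; map "few crossings" to a high score.
    ac = normalised_autocorr(frame)
    zcr = np.mean(np.abs(np.diff(np.sign(ac))) > 0)
    return 1.0 - zcr

def azr_vad(frame, w=0.6, threshold=0.5):
    score = w * max_peak_score(frame) + (1 - w) * zcr_score(frame)
    return score > threshold          # True = speech detected
```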
Abstract:
To date, the majority of films that utilise or feature hip hop music and culture have been either documentaries or 'show musicals' (where the film musical's device of characters bursting into song is justified by a narrative about pursuing a career in the entertainment industry). Thus, most films that feature hip hop expression have in some way been tied to the subject of hip hop. A research interest was developed in utilising hip hop expression in film in a new way, one that would extend the narrative possibilities of hip hop film to wider topics and themes. The creation of the thesis film Out of My Cloud, and the writing of this accompanying exegesis, investigate the potential for the use of hip hop expression in an 'integrated musical' film (where characters break into song without conceit or explanation). Context and rationale for Out of My Cloud (an Australian hip hop 'integrated musical' film) are provided in this writing. It is argued that hip hop is particularly suitable for use in a modern narrative film, and particularly in an 'integrated musical' film, because of its current vibrancy and popularity, rap music's focus on lyrical message and meaning (rap being the vocal element of hip hop), and rap's use as an everyday, non-performative method of communication. It is also argued that Australian hip hop deserves greater representation in film and literature because of its current popularity and its nature as a unique and distinct form of hip hop. To date, representation of Australian hip hop in film and television has been restricted almost solely to the documentary form. Out of My Cloud borrows from elements of social realist cinema such as contrasts with mainstream cinema, an exploration of the relationship between environment and the development of character, the use of non-actors, location shooting, the political intent of the filmmaker, sympathy for an underclass, representation of underrepresented character types and topics, and a loose narrative structure that does not offer solid resolution. A case is made that it may be appropriate to marry elements of social realist film with hip hop expression because of common characteristics: representation of marginalised or underrepresented groups and issues in society, the political objectives of the artists, and sympathy for an underclass. In developing and producing Out of My Cloud, a specific method of working with, and filming, actor improvisation was developed. This method was informed by the improvisation and associated camera techniques of filmmakers such as Charlie Chaplin, Mike Leigh, Khoa Do, the Dogme 95 filmmakers, and Lars von Trier (post-Dogme 95). A review of the techniques used by these filmmakers is provided in this writing, along with the impact they had on my approach. The method used in Out of My Cloud was most influenced by Khoa Do's technique of guiding actors to improvise fairly loosely but with a predetermined endpoint in mind. A variation of this technique, which involved filming with two cameras to allow edits from multiple angles, was developed for Out of My Cloud. Specific processes for creating Out of My Cloud are described and explained in this writing, with particular attention given to the approaches to the story elements and the music elements. Various significant aspects of the process are discussed, including the filming and recording of live musical performances, the recording of 'freestyle' performances (lyrics composed and performed spontaneously), and the creation of a scored musical scene involving a vocal performance without regular timing or rhythm. The documentation of these processes serves to make the successful elements of this film transferable and replicable for other practitioners in the field, while flagging missteps so that fellow practitioners can avoid them in future projects. While Out of My Cloud is not without its shortcomings as a short film (for example in the areas of story and camerawork), it provides a significant contribution to the field as a working example of how hip hop may be utilised in an 'integrated musical' film, as well as being a rare example of a narrative film that features Australian hip hop. This film and the accompanying exegesis provide insights that contribute to an understanding of techniques, theories and knowledge in the field of filmmaking practice.
Abstract:
A recent advance in biosecurity surveillance design aims to benefit island conservation through early and improved detection of incursions by non-indigenous species. The novel aspects of the design are that it achieves a specified power of detection in a cost-managed system, while acknowledging heterogeneity of risk in the study area and stratifying the area to target surveillance deployment. The design also utilises a variety of surveillance system components, such as formal scientific surveys, trapping methods, and incidental sightings by non-biologist observers. These advances in design were applied to black rats (Rattus rattus), representing the group of invasive rats including R. norvegicus and R. exulans, which are potential threats to Barrow Island, Australia, a high-value conservation nature reserve where a proposed liquefied natural gas development is a potential source of incursions. Rats are important to consider as they are prevalent invaders worldwide, difficult to detect early when present in low numbers, and able to spread and establish relatively quickly after arrival. The 'exemplar' design for the black rat is then applied in a manner that enables the detection of a range of non-indigenous rat species that could potentially be introduced. Many of the design decisions were based on expert opinion, as there are gaps in the empirical data. The surveillance system was able to take into account factors such as collateral effects on native species, the availability of limited resources on an offshore island, financial costs, demands on expertise and other logistical constraints. We demonstrate the flexibility and robustness of the surveillance system and discuss how it could be updated as empirical data are collected to supplement expert opinion and provide a basis for adaptive management. Overall, the surveillance system promotes an efficient use of resources while providing a defined power to detect early rat incursions, translating to reduced environmental, resourcing and financial costs.
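The core design constraint, achieving a specified power of detection from heterogeneous components, can be sketched in a few lines. The component sensitivities and unit counts below are hypothetical illustrations, not values from the Barrow Island system.

```python
# System-wide power of detection from heterogeneous surveillance
# components; sensitivities and unit counts are hypothetical.
components = {                 # per-unit probability of detecting an incursion
    "trap": 0.02,
    "formal_survey": 0.10,
    "incidental_sighting": 0.005,
}
units = {"trap": 120, "formal_survey": 6, "incidental_sighting": 200}

miss = 1.0
for name, p in components.items():
    miss *= (1.0 - p) ** units[name]   # every unit of this component fails

power = 1.0 - miss                     # compare against the design target
print(f"system power = {power:.3f}")
```

Stratifying by risk amounts to running this calculation per stratum and shifting units toward strata where the marginal gain in power per dollar is highest.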
Abstract:
The ability to reproducibly load bioactive molecules into polymeric microspheres is a challenge. Traditional microsphere fabrication methods typically provide inhomogeneous release profiles and suffer from a lack of batch-to-batch reproducibility, hindering their potential for scale-up and translation to the clinic. This deficit in homogeneity is in part attributed to broad size distributions and variability in the morphology of particles. It is thus desirable to control the morphology and size of non-loaded particles in the first instance, in preparation for obtaining the desired release profiles of loaded particles at a later stage. This is achieved by identifying the key parameters involved in particle production and understanding how adapting these parameters affects the final characteristics of the particles. In this study, electrospraying is presented as a promising technique for generating reproducible particles made of polycaprolactone, a biodegradable, FDA-approved polymer. Narrow size distributions were obtained by controlling the electrospraying flow rate and polymer concentration, with average particle sizes ranging from 10 to 20 µm. Particles were shown to be spherical with a homogeneous embossed texture, determined by the polymer entanglement regime occurring during electrospraying. No toxic residue from this process was detected in preliminary cell work using DNA quantification assays, validating the method as suitable for subsequent loading of bioactive components.
Abstract:
Scalable high-resolution tiled display walls are becoming increasingly important to decision makers and researchers because high pixel counts in combination with large screen areas facilitate the content-rich, simultaneous display of computer-generated visualization information and high-definition video data from multiple sources. This tutorial is designed to cater for new users as well as researchers who currently operate tiled display walls or 'OptiPortals'. We will discuss current and future applications of display wall technology and explore opportunities for participants to collaborate and contribute in a growing community. Multiple tutorial streams will cover hands-on practical development as well as policy and method design for embedding these technologies in the research process. Attendees will gain an understanding of how to get started with developing similar systems themselves, in addition to becoming familiar with typical applications and large-scale visualisation techniques. Presentations in this tutorial will describe current implementations of tiled display walls that highlight the effective use of screen real estate with various visualization datasets, including collaborative applications such as visualcasting, classroom learning and video conferencing. A feature presentation for this tutorial will be given by Jurgen Schulze from Calit2 at the University of California, San Diego. Jurgen is an expert in scientific visualization in virtual environments, human-computer interaction, real-time volume rendering, and graphics algorithms on programmable graphics hardware.
Abstract:
In the past 20 years, mesoporous materials have attracted great attention due to their significant features of large surface area, ordered mesoporous structure, tunable pore size and volume, and well-defined surface properties. They have many potential applications, such as catalysis, adsorption/separation, biomedicine, etc. [1]. Recently, studies of the applications of mesoporous materials have expanded into the field of biomaterials science. A new class of bioactive glass, referred to as mesoporous bioactive glass (MBG), was first developed in 2004. This material has a highly ordered mesopore channel structure with a pore size ranging from 5 to 20 nm [1]. Compared to non-mesoporous bioactive glass (BG), MBG possesses a more optimal surface area and pore volume, and improved in vitro apatite mineralization in simulated body fluids [1,2]. Vallet-Regí et al. systematically investigated the in vitro apatite formation of different types of mesoporous materials, and demonstrated that an apatite-like layer can be formed on the surfaces of Mobil Composition of Matter (MCM)-48, hexagonal mesoporous silica (SBA-15), phosphorus-doped MCM-41, bioglass-containing MCM-41 and ordered mesoporous MBG, allowing their use in biomedical engineering for tissue regeneration [2-4]. Chang et al. found that MBG particles can be used in a bioactive drug-delivery system [5,6]. Our study has shown that MBG powders, when incorporated into a poly(lactide-co-glycolide) (PLGA) film, significantly enhance the apatite-mineralization ability and cell response of PLGA films compared to BG [7]. These studies suggest that MBG is a very promising bioactive material for bone regeneration. It is known that for bone defect repair, tissue engineering represents a promising option through the creation of three-dimensional (3D) porous scaffolds, which have more advantages than powders or granules, as 3D scaffolds provide an interconnected macroporous network to allow cell migration, nutrient delivery, bone ingrowth and, eventually, vascularization [8]. For this reason, we seek to apply MBG to bone tissue engineering by developing MBG scaffolds. However, one of the main disadvantages of MBG scaffolds is their low mechanical strength and high brittleness; another issue is their very quick degradation, which leads to an unstable surface for bone cell growth, limiting their applications. Silk fibroin, as a new family of native biomaterials, has been widely studied for bone and cartilage repair applications in the form of pure silk or silk composite scaffolds [9-14]. Compared to traditional synthetic polymer materials, such as PLGA and poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBV), the chief advantage of silk fibroin is its water-soluble nature, which eliminates the need for the organic solvents, highly cytotoxic, that are otherwise used in scaffold preparation [15]. Other advantages of silk scaffolds are their excellent mechanical properties, controllable biodegradability and cytocompatibility [15-17]. However, for the purposes of bone tissue engineering, the osteoconductivity of pure silk scaffolds is suboptimal. It is expected that combining MBG with silk to produce MBG/silk composite scaffolds would greatly improve their physicochemical and osteogenic properties for bone tissue engineering applications. Therefore, in this chapter, we introduce the research development of MBG/silk scaffolds for bone tissue engineering.
Abstract:
The use of appropriate features to represent an output class or object is critical for all classification problems. In this paper, we propose a biologically inspired object descriptor to represent the spectral-texture patterns of image-objects. The proposed feature descriptor is generated from the pulse spectral frequencies (PSF) of a pulse coupled neural network (PCNN), and is invariant to rotation, translation and small scale changes. The proposed method is first evaluated on rotation- and scale-invariant texture classification using the USC-SIPI texture database. It is further evaluated in an application to vegetation species classification for power line corridor monitoring using airborne multi-spectral imagery. The results from the two experiments demonstrate that the PSF feature is effective in representing the spectral-texture patterns of objects and outperforms classic color histogram and texture features.
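For readers unfamiliar with PCNNs, the sketch below shows a heavily simplified pulse-coupled iteration whose per-iteration firing counts form a pulse signature in the spirit of the PSF descriptor; all constants are illustrative, and the model omits the feeding and linking decay terms of a full PCNN.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Heavily simplified PCNN: per-iteration firing counts give a pulse
# signature in the spirit of the PSF descriptor. Constants are
# illustrative, not the paper's tuned parameters.
def pcnn_signature(img, n_iter=40, beta=0.2, a_t=0.3, v_t=20.0):
    S = img.astype(float) / (img.max() + 1e-12)   # normalised stimulus
    Y = np.zeros_like(S)                          # pulses from last step
    T = np.ones_like(S)                           # dynamic thresholds
    sig = []
    for _ in range(n_iter):
        link = uniform_filter(Y, size=3)          # 3x3 neighbour coupling
        U = (S + link) * (1.0 + beta * link)      # internal activity
        Y = (U > T).astype(float)                 # neurons that fire now
        T = np.exp(-a_t) * T + v_t * Y            # decay, then recharge firers
        sig.append(Y.sum())                       # firing count this step
    # Total firing activity per step is unchanged by rotating or
    # translating the object, which is what makes such signatures useful.
    return np.array(sig)
```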
Abstract:
Trust can be used for neighbor formation to generate automated recommendations, and explicitly user-assigned rating data can be used for this purpose. However, explicit rating data are not always available. In this paper we present a new method of generating a trust network based on users' interest similarity. To identify interest similarity, we use users' personalized tag information. This trust network can then be used to find neighbors for making automated recommendations. Our experimental results show that the proposed method outperforms the traditional collaborative filtering approach in precision.
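A minimal sketch of the tag-based similarity step is shown below: users are compared by the cosine similarity of their tag-frequency vectors, and the most similar users become trust neighbors. The users, tags, and thresholds are hypothetical, and the paper's full trust-network construction is not reproduced.

```python
import numpy as np

# Infer trust neighbors from tag-based interest similarity instead of
# explicit ratings. All data below is hypothetical.
user_tags = {                       # user -> {tag: frequency}
    "alice": {"python": 5, "ml": 3, "gis": 1},
    "bob":   {"python": 2, "ml": 4},
    "carol": {"film": 6, "music": 2},
}

def cosine(a, b):
    keys = set(a) | set(b)
    va = np.array([a.get(k, 0) for k in keys], float)
    vb = np.array([b.get(k, 0) for k in keys], float)
    denom = np.linalg.norm(va) * np.linalg.norm(vb)
    return float(va @ vb / denom) if denom else 0.0

def trust_neighbors(user, k=2, min_sim=0.1):
    scores = {v: cosine(user_tags[user], tags)
              for v, tags in user_tags.items() if v != user}
    return sorted((v for v, s in scores.items() if s >= min_sim),
                  key=scores.get, reverse=True)[:k]

print(trust_neighbors("alice"))     # bob is the closest interest match
```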
Abstract:
The vibration serviceability limit state is an important design consideration for two-way, suspended concrete floors that is not always well understood by practicing structural engineers. Although the field of floor vibration has been extensively developed, at present there are no convenient design tools that deal with this problem. Results from this research have enabled the development of a much-needed new method for assessing the vibration serviceability of flat, suspended concrete floors in buildings, named the Response Coefficient-Root Function (RCRF) method. Full-scale laboratory tests were conducted on a post-tensioned floor specimen at Queensland University of Technology's structural laboratory. Special support brackets were fabricated to act as frictionless, pinned connections at the corners of the specimen. A series of static and dynamic tests were performed in the laboratory to obtain the basic material and dynamic properties of the specimen. Finite element models were calibrated against the data collected from the laboratory experiments, and the computational finite element analysis was extended to investigate a variety of floor configurations. Field measurements of floors in existing buildings are in good agreement with the computational studies. Results from this parametric investigation led to the development of a new approach for predicting the design frequencies and accelerations of flat concrete floor structures. The RCRF method is a convenient tool to assist structural engineers in designing for the vibration serviceability limit state of in-situ concrete floor systems.
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE), first described by Craske and Kenny in 1981. They reported that when, without vision, participants pressed a button that caused a probe to simultaneously touch the contralateral limb at a displaced location, they perceived an apparent change in limb length. The current study did not fully replicate these earlier findings. Participants did not perceive any reduction in the sagittal separation of the button and probe following repeated exposure to the tactile stimuli present on both arms. However, a localised and partial medio-lateral fusion was observed, with the touched positions seeming closer together. In addition, tactile acuity was found to decrease progressively for more distal positions on the upper limb, and a foreshortening effect was found that may result from a line-of-sight judgment and represent a feature of the reporting method used. A number of years have elapsed since the description of the original KFE; although it is frequently cited in the literature, there has been no further investigation into its mechanisms of action. The results of the current study are considered in light of more recent literature concerning intersensory integration. Future research should focus on further clarifying the specific conditions that must be present for a fusion effect to occur. Finally, this thesis will benefit future studies that require participants to report the perceived locations of unseen limbs.
Abstract:
A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections that arise from human error, a lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with machine learning algorithms that use examples of fault-prone and not-fault-prone modules to develop predictive models of quality. To learn the numerical mapping between a module and its classification, each module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality; in this work, data is obtained from two sources, the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data - Naive Bayes and the Support Vector Machine - and the predictive results are compared to those of previous efforts, proving superior on selected data sets and comparable on others. In addition, a new classification method, Rank Sum, is proposed, in which a ranking abstraction is laid over bin densities for each class and a classification is determined from the sum of ranks over features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into a 2D rank-sum space; an SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
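The experimental pipeline described above (software metrics in, fault-proneness out, with feature selection before learning) can be sketched with scikit-learn. The synthetic, imbalanced dataset below is a stand-in for the NASA MDP and Eclipse data, and the Rank Sum classifier itself is not reproduced here.

```python
# Sketch of the metrics-based fault prediction pipeline; the dataset is
# a synthetic stand-in, not the NASA MDP or Eclipse data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Imbalanced classes mimic real fault data: most modules are not faulty.
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           weights=[0.85], random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()), ("SVM", SVC(kernel="rbf"))]:
    pipe = make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=8),  # feature selection first
                         clf)
    auc = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc").mean()
    print(name, round(auc, 3))
```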
Abstract:
Background and purpose: The appropriate fixation method for hemiarthroplasty of the hip as it relates to implant survivorship and patient mortality is a matter of ongoing debate. We examined the influence of fixation method on revision rate and mortality.

Methods: We analyzed approximately 25,000 hemiarthroplasty cases from the AOA National Joint Replacement Registry. Deaths at 1 day, 1 week, 1 month, and 1 year were compared for all patients and among subgroups based on implant type.

Results: Patients treated with cemented monoblock hemiarthroplasty had a 1.7-times higher day-1 mortality compared to uncemented monoblock components (p < 0.001). This finding was reversed by 1 week, 1 month, and 1 year after surgery (p < 0.001). Modular hemiarthroplasties did not reveal a difference in mortality between fixation methods at any time point.

Interpretation: This study shows lower (or similar) overall mortality with cemented hemiarthroplasty of the hip.
Abstract:
Unlike conventional methods for structural reliability evaluation, such as first/second-order reliability methods (FORM/SORM) or Monte Carlo simulation based on corresponding limit state functions, this paper proposes a novel approach based on a dynamic object-oriented Bayesian network (DOOBN) for predicting the structural reliability of a steel bridge element. The DOOBN approach can effectively model the deterioration processes of a steel bridge element and predict its structural reliability over time. The approach is also able to perform Bayesian updating with observed information from measurements, monitoring and visual inspection. Moreover, the computational capacity embedded in the approach can be used to facilitate integrated management and maintenance optimization in a bridge system. A steel bridge girder is used to validate the proposed approach, and the predicted results are compared with those evaluated by the FORM method.
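The flavor of the DOOBN updating step can be conveyed with a toy discrete-state sketch: a deterioration state evolves annually and is updated with imperfect inspection evidence. The transition and evidence matrices below are illustrative assumptions, not values calibrated to a real girder.

```python
import numpy as np

# Toy dynamic-Bayesian-network sketch: discrete deterioration state,
# annual transitions, and Bayesian updating on inspection outcomes.
states = ["good", "fair", "poor"]
T = np.array([[0.90, 0.09, 0.01],     # annual transition (deterioration)
              [0.00, 0.92, 0.08],
              [0.00, 0.00, 1.00]])
E = np.array([[0.80, 0.15, 0.05],     # P(inspection outcome | true state)
              [0.15, 0.70, 0.15],
              [0.05, 0.15, 0.80]])

belief = np.array([1.0, 0.0, 0.0])    # a new element starts in "good"
for year, obs in enumerate([None, None, 1, None, 2], start=1):
    belief = belief @ T               # predict one year of deterioration
    if obs is not None:               # update on inspection evidence
        belief *= E[:, obs]
        belief /= belief.sum()
    print(year, dict(zip(states, belief.round(3))))
```

Linking each state to a conditional failure probability would then give a reliability estimate comparable, in spirit, to the FORM result used for validation.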