Abstract:
A new algorithm for extracting features from images for object recognition is described. The algorithm uses higher order spectra to provide desirable invariance properties, to provide noise immunity, and to incorporate nonlinearity into the feature extraction procedure, thereby allowing the use of simple classifiers. An image can be reduced to a set of 1D functions via the Radon transform, or alternatively, the Fourier transform of each 1D projection can be obtained from a radial slice of the 2D Fourier transform of the image according to the Fourier slice theorem. A triple product of Fourier coefficients, referred to as the deterministic bispectrum, is computed for each 1D function and is integrated along radial lines in bifrequency space. Phases of the integrated bispectra are shown to be translation- and scale-invariant. Rotation invariance is achieved by a regrouping of these invariants at a constant radius followed by a second stage of invariant extraction. Rotation invariance is thus converted to translation invariance in the second step. Results using synthetic and actual images show that isolated, compact clusters are formed in feature space. These clusters are linearly separable, indicating that the nonlinearity required in the mapping from the input space to the classification space is incorporated well into the feature extraction stage. The use of higher order spectra results in good noise immunity, as verified with synthetic and real images. Classification of images using the higher order spectra-based algorithm compares favorably to classification using the method of moment invariants.
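As a sketch of the core quantity, the following Python snippet computes the deterministic bispectrum of a 1D projection and the phase of its integral along one radial line in bifrequency space. The line parameterisation and the absence of normalisation are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def bispectrum(x):
    """Deterministic bispectrum B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2))."""
    X = np.fft.fft(x)
    n = len(x)
    f1, f2 = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return X[f1] * X[f2] * np.conj(X[(f1 + f2) % n])

def integrated_bispectrum_phase(x, slope):
    """Phase of the bispectrum integrated along the radial line f2 = slope * f1."""
    B = bispectrum(x)
    n = len(x)
    f1 = np.arange(1, n // 2)
    f2 = np.clip((slope * f1).astype(int), 0, n - 1)
    return np.angle(B[f1, f2].sum())

# Circular shifts leave the phase unchanged, illustrating translation invariance:
x = np.random.default_rng(0).standard_normal(64)
assert np.isclose(integrated_bispectrum_phase(x, 1.0),
                  integrated_bispectrum_phase(np.roll(x, 7), 1.0))
```

The invariance follows because the linear phase terms introduced by a shift cancel in the triple product X(f1) X(f2) conj(X(f1 + f2)).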
Abstract:
Features derived from the trispectra of DFT magnitude slices are used for multi-font digit recognition. These features are insensitive to translation, rotation, or scaling of the input. They are also robust to noise. Classification accuracy tests were conducted on a common database of 256 × 256 pixel bilevel images of digits in 9 fonts. Randomly rotated and translated noisy versions were used for training and testing. The results indicate that the trispectral features are better than moment invariants and affine moment invariants. They achieve a classification accuracy of 95% compared to about 81% for Hu's (1962) moment invariants and 39% for the Flusser and Suk (1994) affine moment invariants on the same data in the presence of 1% impulse noise using a 1-NN classifier. For comparison, a multilayer perceptron with no normalization for rotations and translations yields 34% accuracy on 16 × 16 pixel low-pass filtered and decimated versions of the same data.
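A hedged sketch of two of the ingredients named above: one deterministic trispectrum term of a 1D slice, and the plain 1-NN classifier behind the quoted accuracies. The feature details (slice selection, integration, normalisation) are omitted, as the abstract does not specify them.

```python
import numpy as np

def trispectrum_term(slice_1d, f1, f2, f3):
    """One deterministic trispectrum term of a 1D DFT-magnitude slice:
    X(f1) X(f2) X(f3) conj(X(f1 + f2 + f3)), with frequencies taken modulo n."""
    X = np.fft.fft(slice_1d)
    n = len(X)
    return X[f1] * X[f2] * X[f3] * np.conj(X[(f1 + f2 + f3) % n])

def one_nn_accuracy(train_x, train_y, test_x, test_y):
    """Nearest-neighbour (1-NN) classification accuracy over feature vectors."""
    hits = sum(
        train_y[np.argmin(np.linalg.norm(train_x - t, axis=1))] == y
        for t, y in zip(test_x, test_y)
    )
    return hits / len(test_y)
```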
Abstract:
An application of image processing techniques to recognition of hand-drawn circuit diagrams is presented. The scanned image of a diagram is pre-processed to remove noise and converted to bilevel. Morphological operations are applied to obtain a clean, connected representation using thinned lines. The diagram comprises nodes, connections and components. Nodes and components are segmented using appropriate thresholds on a spatially varying object pixel density. Connection paths are traced using a pixel-stack. Nodes are classified using syntactic analysis. Components are classified using a combination of invariant moments, scalar pixel-distribution features, and vector relationships between straight lines in polygonal representations. A node recognition accuracy of 82% and a component recognition accuracy of 86% were achieved on a database comprising 107 nodes and 449 components. This recogniser can be used for layout “beautification” or to generate input code for circuit analysis and simulation packages.
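A minimal sketch of the pre-processing steps described above, using scikit-image; the structuring element and the density window size are assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.filters import threshold_otsu
from skimage.morphology import closing, skeletonize

def preprocess_diagram(gray):
    """Binarise a scanned diagram, clean it morphologically, and thin the lines."""
    bilevel = gray < threshold_otsu(gray)                # dark ink -> foreground
    cleaned = closing(bilevel, np.ones((3, 3), bool))    # bridge small stroke gaps
    return skeletonize(cleaned)                          # one-pixel-wide lines

def object_pixel_density(binary, win=32):
    """Spatially varying object-pixel density; thresholding it separates nodes
    and components from plain connection lines."""
    return uniform_filter(binary.astype(float), size=win)
```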
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, it is obvious that selecting the optimal forecasting model is challenging. The aim of this thesis is to thoroughly investigate how effective many commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. In addition, an empirical study investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they all can identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts does vary: QLIKE is identified as the most effective loss function, followed by portfolio variance, which is in turn followed by MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
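For reference, the sketch below gives one standard parameterisation of the three discriminating losses named above, each evaluated against a volatility proxy; the thesis's exact scalings may differ.

```python
import numpy as np

def mse_loss(H, proxy):
    """Matrix MSE: squared Frobenius distance between forecast H and the proxy."""
    d = H - proxy
    return np.sum(d * d)

def qlike_loss(H, proxy):
    """Multivariate QLIKE: log|H| + tr(H^{-1} proxy)."""
    _, logdet = np.linalg.slogdet(H)
    return logdet + np.trace(np.linalg.solve(H, proxy))

def portfolio_variance_loss(H, proxy):
    """Realised variance of the minimum-variance weights implied by forecast H."""
    ones = np.ones(H.shape[0])
    w = np.linalg.solve(H, ones)
    w /= ones @ w
    return w @ proxy @ w
```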
Abstract:
Maternal and infant mortality is a global health issue with a significant social and economic impact. Each year, over half a million women worldwide die due to complications related to pregnancy or childbirth, four million infants die in the first 28 days of life, and eight million infants die in the first year. Ninety-nine percent of maternal and infant deaths are in developing countries. Reducing maternal and infant mortality is among the key international development goals. In China, the national maternal mortality ratio and infant mortality rate were reduced greatly in the past two decades, yet a large discrepancy remains between urban and rural areas. To address this problem, a large-scale Safe Motherhood Programme was initiated in 2000. The programme was implemented in Guangxi in 2003. Interventions in the programme included both demand-side and supply-side interventions focusing on increasing health service use and improving birth outcomes. Little is known about the effects and economic outcomes of the Safe Motherhood Programme in Guangxi, although it has been implemented for seven years. The aim of this research is to estimate the effectiveness and cost-effectiveness of the interventions in the Safe Motherhood Programme in Guangxi, China. The objectives of this research are: 1. To evaluate whether changes in health service use and birth outcomes are associated with the interventions in the Safe Motherhood Programme. 2. To estimate the cost-effectiveness of the interventions in the Safe Motherhood Programme and quantify the uncertainty surrounding the decision. 3. To assess the expected value of perfect information associated with both the whole decision and individual parameters, and interpret the findings to inform priority setting in further research and policy making in this area. A quasi-experimental study design was used in this research to assess the effectiveness of the programme in increasing health service use and improving birth outcomes. The study subjects were 51 intervention counties and 30 control counties. Data on health service use, birth outcomes and socio-economic factors from 2001 to 2007 were collected from the programme database and statistical yearbooks. Based on the profile plots of the data, general linear mixed models were used to evaluate the effectiveness of the programme while controlling for the effects of baseline levels of the response variables, changes in socio-economic factors over time, and correlations among repeated measurements from the same county. Redundant multicollinear variables were deleted from the mixed model using the results of the multicollinearity diagnoses. For each response variable, the best covariance structure was selected from 15 alternatives according to fit statistics including the Akaike information criterion, the finite-population corrected Akaike information criterion, and Schwarz's Bayesian information criterion. Residual diagnostics were used to validate the model assumptions. Statistical inferences were made to show the effect of the programme on health service use and birth outcomes. A decision analytic model was developed to evaluate the cost-effectiveness of the programme, quantify the decision uncertainty, and estimate the expected value of perfect information associated with the decision. The model was used to describe the transitions between health states for women and infants and to reflect the change in both costs and health benefits associated with implementing the programme.
Results gained from the mixed models and other relevant evidence identified were synthesised appropriately to inform the input parameters of the model. Incremental cost-effectiveness ratios of the programme were calculated for the two groups of intervention counties over time. Uncertainty surrounding the parameters was dealt with using probabilistic sensitivity analysis, and uncertainty relating to model assumptions was handled using scenario analysis. Finally, the expected value of perfect information for both the whole model and individual parameters in the model was estimated to inform priority setting for further research in this area. The annual change rates of the antenatal care rate and the institutionalised delivery rate improved significantly in the intervention counties after the programme was implemented. Significant improvements were also found in the annual change rates of the maternal mortality ratio, the infant mortality rate, the incidence rate of neonatal tetanus and the mortality rate of neonatal tetanus in the intervention counties after the implementation of the programme. The annual change rate of the neonatal mortality rate also improved, although the improvement was only close to statistical significance. The influences of the socio-economic factors on the health service use indicators and birth outcomes were identified. Rural income per capita had a significant positive impact on the health service use indicators, and a significant negative impact on the birth outcomes. The number of beds in healthcare institutions per 1,000 population and the number of rural telephone subscribers per 1,000 were found to be significantly positively related to the institutionalised delivery rate. The length of highway per square kilometre negatively influenced the maternal mortality ratio. The percentage of employed persons in the primary industry had a significant negative impact on the institutionalised delivery rate, and a significant positive impact on the infant mortality rate and neonatal mortality rate. The incremental costs of implementing the programme over the existing practice were US $11.1 million from the societal perspective, and US $13.8 million from the perspective of the Ministry of Health. Overall, 28,711 life years were generated by the programme, producing an overall incremental cost-effectiveness ratio of US $386 per life year from the societal perspective, and US $480 per life year from the perspective of the Ministry of Health, both of which were below the threshold willingness-to-pay ratio of US $675. The expected net monetary benefit generated by the programme was US $8.3 million from the societal perspective, and US $5.5 million from the perspective of the Ministry of Health. The overall probability that the programme was cost-effective was 0.93 and 0.89 from the two perspectives, respectively. The incremental cost-effectiveness ratio of the programme was insensitive to the different estimates of the three parameters relating to the model assumptions. Further research could be conducted to reduce the uncertainty surrounding the decision, for which the upper limit of investment was US $0.6 million from the societal perspective, and US $1.3 million from the perspective of the Ministry of Health. It is also worthwhile to obtain a more precise estimate of the improvement in the infant mortality rate.
The population expected value of perfect information associated with this parameter was US $0.99 million from the societal perspective, and US $1.14 million from the perspective of the Ministry of Health. The findings from this study show that the interventions in the Safe Motherhood Programme were both effective and cost-effective in increasing health service use and improving birth outcomes in rural areas of Guangxi, China. The programme therefore represents a good public health investment and should be adopted and further expanded to an even broader area if possible. This research provides economic evidence to inform efficient decision making in improving maternal and infant health in developing countries.
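The headline societal-perspective figures can be reproduced directly from the quantities reported above; the short sketch below does the arithmetic.

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per life year gained."""
    return delta_cost / delta_effect

def net_monetary_benefit(delta_cost, delta_effect, wtp):
    """NMB = willingness-to-pay threshold * incremental effect - incremental cost."""
    return wtp * delta_effect - delta_cost

# Societal perspective: US $11.1 million incremental cost, 28,711 life years.
print(icer(11.1e6, 28_711))                        # ~US $386 per life year
print(net_monetary_benefit(11.1e6, 28_711, 675))   # ~US $8.3 million
```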
Abstract:
This paper adopts an epistemic community framework to explicate the dual role of epistemic communities as influencers of accounting policy within regulatory space and as implementers who effect change within the domain of accounting. The context is the adoption and implementation of fair value accounting within local government in New South Wales (NSW). The roles and functions of Australian local government are extensive, and include the development and maintenance of infrastructure, provision of recreational facilities, certain health and community services, buildings, cultural facilities, and in some cases, water and sewerage (Australian Local Government Association, 2009). The NSW state Department of Local Government (DLG) is responsible for legislation and policy development to ensure that local councils are able to deliver ‘quality services to their communities in a sustainable manner’ (DLG, 2008c). These local councils receive revenue from various sources including property rates, government grants and user-pays service provision. In July 2006 the DLG issued Circular 06-45 to councils (DLG, 2006c), mandating the staged adoption of fair value measurement of infrastructure assets. This directive followed the policy of NSW State Treasury (NSW Treasury, 2007) and an independent inquiry into the financial sustainability of local councils (LGSA, 2006). It was an attempt to resolve the inconsistency in public sector asset valuation in NSW local governments, and to provide greater usefulness and comparability of financial statements. The focus of this study is the mobilization of accounting change by the DLG within this wider political context. When a regulatory problem arises, those with political power seek advice from professionals with relevant skill and expertise (Potter, 2005). This paper explores the way in which professionals diffuse accounting ‘problems’ and the associated accounting solutions ‘across time and space’ (Potter, 2005, p. 277). The DLG’s fair value accounting policy emanated from a ‘regulatory space’ (Hancher and Moran, 1989) as a result of negotiations between many parties, including accounting and finance professionals. Operating within the local government sector, these professionals were identified by the DLG as being capable of providing helpful input. They were also responsible for the implementation of the new policy within local councils. Accordingly, they have been identified as an epistemic community with the ability to translate regulatory power by changing the domain of accounting (Potter, 2005, p. 278). The paper is organised as follows. The background to the DLG’s decision to require the introduction of fair value accounting for infrastructure assets is explored. Following this, the method of the study is described, and the epistemic community framework outlined. In the next sections, evidence of the influencing and implementing roles of epistemic groups is provided. Finally, conclusions are drawn about the significance of these groups both within regulatory space in developing accounting regulation, and in embedding change within the domain of accounting.
Abstract:
Traffic simulation models tend to have their own data input and output formats. In an effort to standardise the input for traffic simulations, we introduce in this paper a set of data marts that aim to serve as a common interface between the necessary data, stored in dedicated databases, and the software packages that require the input in a certain format. The data marts are developed based on real-world objects (e.g. roads, traffic lights, controllers) rather than abstract models and hence contain all the necessary information, which can be transformed by the importing software package to its needs. The paper contains a full description of the data marts for network coding, simulation results, and scenario management, which have been discussed with industry partners to ensure sustainability.
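To make the idea concrete, here is a hypothetical fragment of a network-coding data mart expressed as Python dataclasses. The entities follow the real-world objects mentioned above, but every field name is an illustrative assumption rather than the paper's schema.

```python
from dataclasses import dataclass

@dataclass
class Road:
    road_id: int
    name: str
    lanes: int
    length_m: float
    speed_limit_kmh: float

@dataclass
class Controller:
    controller_id: int
    plan: str             # e.g. a fixed-time or adaptive plan identifier

@dataclass
class TrafficLight:
    signal_id: int
    road_id: int          # the road this signal head faces
    controller_id: int    # the controller driving this signal group
```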
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage with frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and Artificial Neural Networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence impossible. Therefore, a data reduction technique, Principal Component Analysis (PCA), is introduced in the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of this method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Using these features, damage indices for different damage cases of the structure are identified after reconstructing the available FRF data using PCA. The obtained damage indices corresponding to different damage locations and severities are introduced as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using the finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
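A minimal sketch of the data-reduction step, assuming FRF measurements arranged as rows of a matrix; the damage index shown is a generic relative-change form, not the paper's exact definition.

```python
import numpy as np

def pca_reduce(frf_matrix, n_components):
    """Project FRF measurements (rows) onto their leading principal components,
    yielding compact input features for the artificial neural network."""
    centred = frf_matrix - frf_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:n_components].T

def relative_change_index(frf_intact, frf_damaged):
    """Generic FRF damage index: relative change against the intact baseline."""
    return np.abs(frf_damaged - frf_intact) / (np.abs(frf_intact) + 1e-12)
```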
Abstract:
Conventional planning and decision making, with their sectoral and territorial emphasis and flat-map-based processes, are no longer adequate or appropriate for the increased complexity confronting airport/city interfaces. These crowded and often contested governance spaces demand a more iterative and relational planning and decision-making approach. Emergent GIS-based planning and decision-making tools provide a mechanism that integrates and visually displays an array of complex data, frameworks and scenarios/expectations, often in ‘real time’ computations. In so doing, these mechanisms provide a common ground for decision making and facilitate a more ‘joined-up’ approach to airport/city planning. This paper analyses the contribution of the Airport Metropolis Planning Support System (PSS) to sub-regional planning in the Brisbane Airport case environment.
Abstract:
Motivation is a major driver of project performance. Even where team members have the ability to deliver successful project outcomes, if they are not positively motivated to pursue joint project goals then performance will be constrained. One approach to improving the motivation of project organizations is to offer a financial reward for the achievement of set performance standards above a minimum required level. However, little investigation has been undertaken into the features of successful incentive systems as part of an overall delivery strategy. With input from the organizational management literature, and drawing on the literature covering psychological and economic theories of motivation, this paper presents an integrated framework that can be used by project organizations to assess the impact of financial reward systems on motivation in construction projects. The integrated framework offers four motivation indicators which reflect key theoretical concepts across both psychological and economic disciplines. The indicators are: (1) Goal Commitment, (2) Distributive Justice, (3) Procedural Justice, and (4) Reciprocity. The paper also interprets the integrated framework against the results of a successful Australian social infrastructure project case study and identifies key learnings for project organizations to consider when designing financial reward systems. Case study results suggest that motivation directed towards the achievement of incentive goals is influenced not only by the value placed on the financial reward for commercial benefit, but also by the strength of the project initiatives that encourage just and fair dealings, supporting the establishment of trust and positive reciprocal behavior across a project team. The strength of the project relationships was found to be influenced by how attractive the achievement of the goal is to the incentive recipient and how likely they are to push for its achievement. Interestingly, the findings also suggested that contractor motivation is influenced by the fairness of the performance measurement process and by the contractor's perception of the trustworthiness and transparency of the client. These findings provide the basis for future research on the impact of financial reward systems on motivation in construction projects. It is anticipated that such research will shed new light on this complex topic and further define how reward systems should be designed to promote project team motivation. Due to the unique nature of construction projects, with high levels of task complexity and interdependence, results are expected to vary in comparison to previous studies based on individuals or single-entity organizations.
Abstract:
This research explores music in space, as experienced through performing and music-making with interactive systems. It explores how musical parameters may be presented spatially and displayed visually with a view to their exploration by a musician during performance. Spatial arrangements of musical components, especially pitches and harmonies, have been widely studied in the literature, but the current capabilities of interactive systems allow the improvisational exploration of these musical spaces as part of a performance practice. This research focuses on quantised spatial organisation of musical parameters that can be categorised as grid music systems (GMSs), and interactive music systems based on them. The research explores and surveys existing and historical uses of GMSs, and develops and demonstrates the use of a novel grid music system designed for whole-body interaction. Grid music systems allow spatialised input to be plotted on a two-dimensional grid layout to construct patterned music. GMSs are navigated to construct a sequence of parametric steps, for example a series of pitches, rhythmic values, a chord sequence, or terraced dynamic steps. While they are conceptually simple when only controlling one musical dimension, grid systems may be layered to enable complex and satisfying musical results. These systems have proved a viable, effective, accessible and engaging means of music-making for the general user as well as the musician. GMSs have been widely used in electronic and digital music technologies, where they have generally been applied to small portable devices and software systems such as step sequencers and drum machines. This research shows that by scaling up a grid music system, music-making and musical improvisation are enhanced, gaining several advantages: (1) Full-body location becomes the spatial input to the grid. The system becomes a partially immersive one in four related ways: spatially, graphically, sonically and musically. (2) Detection of body location by tracking enables hands-free operation, thereby allowing the playing of a musical instrument in addition to “playing” the grid system. (3) Visual information regarding musical parameters may be enhanced so that the performer may fully engage with existing spatial knowledge of musical materials. The result is that existing spatial knowledge is overlaid on, and combined with, music-space. Music-space is a new concept produced by the research, and is similar to notions of other musical spaces including soundscape, acoustic space, Smalley's “circumspace” and “immersive space” (2007, 48-52), and Lotis's “ambiophony” (2003), but is rather more textural and “alive”, and therefore very conducive to interaction. Music-space is that space occupied by music, set within normal space, which may be perceived by a person located within, or moving around in, that space. Music-space has a perceivable “texture” made of tensions and relaxations, and contains spatial patterns of these formed by musical elements such as notes, harmonies, and sounds, changing over time. The music may be performed by live musicians, created electronically, or be prerecorded. Large-scale GMSs have the capability not only to interactively display musical information as music-representative space, but to allow music-space to co-exist with it. Moving around the grid, the performer may interact in real time with musical materials in music-space, as they form over squares or move in paths.
Additionally, the performer may sense the textural matrix of the music-space while being immersed in surround sound covering the grid. The HarmonyGrid is a new computer-based interactive performance system developed during this research that provides a generative music-making system intended to accompany, or play along with, an improvising musician. This large-scale GMS employs full-body motion tracking over a projected grid. Playing with the system creates an enhanced performance employing live interactive music, along with graphical and spatial activity. Although one other experimental system provides certain aspects of immersive music-making, currently only the HarmonyGrid provides an environment in which to explore and experience music-space in a GMS.
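The abstract does not specify the HarmonyGrid's mapping, but a minimal hypothetical sketch of full-body grid input looks like this: the tracked position is quantised to a cell, with the column giving the sequencer step and the row the pitch.

```python
GRID_COLS, GRID_ROWS = 8, 8
SCALE = [0, 2, 4, 5, 7, 9, 11, 12]   # C major degrees in semitones (assumed)

def cell_from_position(x, y, width, height):
    """Quantise a tracked body position (x, y) to a grid cell."""
    col = min(int(x / width * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / height * GRID_ROWS), GRID_ROWS - 1)
    return col, row

def step_and_note(col, row, base_midi=60):
    """Column = step in the sequence; row = pitch drawn from the scale."""
    return col, base_midi + SCALE[row % len(SCALE)]
```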
Abstract:
An analytical solution is presented in this paper for the vibration response of a ribbed plate clamped on all its boundary edges by employing a travelling wave solution. A clamped ribbed plate test rig is also assembled in this study for the experimental investigation of the ribbed plate response and to provide verification results for the analytical solution. The dynamic characteristics and mode shapes of the ribbed plate are measured and compared to those obtained from the analytical solution and from finite element analysis (FEA). Generally good agreement is found between the results. Discrepancies between the computational and experimental results at low and high frequencies are also discussed. Explanations are offered in the study to disclose the mechanism causing the discrepancies. The dependency of the dynamic response of the ribbed plate on the distance between the excitation force and the rib is also investigated experimentally. It confirms the findings disclosed in a previous analytical study [T. R. Lin and J. Pan, A closed form solution for the dynamic response of finite ribbed plates. Journal of the Acoustical Society of America 119 (2006) 917-925] that the vibration response of a clamped ribbed plate due to a point force excitation is controlled by the plate stiffness when the source is more than a quarter plate bending wavelength away from the rib and from the plate boundary. The response is largely affected by the rib stiffness when the source location is less than a quarter bending wavelength away from the rib.
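The quarter-wavelength criterion quoted above is easy to evaluate from thin-plate theory; the sketch below uses illustrative steel properties, not the dimensions of the actual test rig.

```python
import numpy as np

E, rho, nu, h = 2.1e11, 7800.0, 0.3, 0.003   # steel plate, 3 mm thick (assumed)
D = E * h**3 / (12 * (1 - nu**2))            # plate bending stiffness

def bending_wavelength(freq_hz):
    """Thin-plate bending wavelength: lambda = 2*pi / kB,
    with bending wavenumber kB = (omega^2 * rho * h / D)**0.25."""
    omega = 2 * np.pi * freq_hz
    kB = (omega**2 * rho * h / D) ** 0.25
    return 2 * np.pi / kB

print(bending_wavelength(500.0) / 4)   # quarter bending wavelength at 500 Hz (m)
```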
Abstract:
This issue outlines the goals and potential impacts of QUT's corporate Blueprint3 framework, under which the university has made significant investments in physical infrastructure and in improving staff profiles, particularly in relation to science, technology, engineering, and mathematics (STEM) disciplines. The most significant physical change to the Faculty's infrastructure is the new workshop, teaching and research spaces in the Science and Technology Precinct, currently under construction. The issue also includes alumni news, a discussion of input and output numbers in Spatial Science, Work Integrated Learning (WIL) in 2011, and some key teaching administrative dates for 2011.
Abstract:
Increasing global competitiveness worldwide has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost. In order to reach these goals, they need good-quality components from suppliers at optimum price and lead time. This has forced companies to adopt different improvement practices such as lean manufacturing, Just in Time (JIT) and effective supply chain management. Applying new improvement techniques and tools causes higher establishment costs and more Information Delay (ID). On the other hand, these new techniques may reduce the risk of stock-outs and improve supply chain flexibility, giving better overall performance. But industry practitioners are unable to measure the overall effects of those improvement techniques with a standard evaluation model. So an effective overall supply chain performance evaluation model is essential for suppliers as well as manufacturers to assess their companies under different supply chain strategies. However, the literature on lean supply chain performance evaluation is comparatively limited. Moreover, most of the models assume random values for performance variables. The purpose of this paper is to propose an effective supply chain performance evaluation model using triangular linguistic fuzzy numbers and to recommend optimum ranges for performance variables for lean implementation. The model initially considers all the supply chain performance criteria (input, output and flexibility), converts the values to triangular linguistic fuzzy numbers and evaluates overall supply chain performance under different situations. Results show that with the proposed performance measurement model, the improvement area for each variable can be accurately identified.
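A small sketch of the arithmetic behind triangular linguistic fuzzy evaluation, with an assumed linguistic scale; the paper's scale and aggregation weights are not given in the abstract.

```python
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    """Triangular fuzzy number (l, m, u): lower, modal, and upper values."""
    l: float
    m: float
    u: float

    def __add__(self, other):
        return TriFuzzy(self.l + other.l, self.m + other.m, self.u + other.u)

    def scale(self, k):
        return TriFuzzy(k * self.l, k * self.m, k * self.u)

    def defuzzify(self):
        """Centroid defuzzification to a crisp performance score."""
        return (self.l + self.m + self.u) / 3

# e.g. "Good" and "Fair" ratings on an assumed 0-1 linguistic scale,
# aggregated with equal weights and then defuzzified:
good, fair = TriFuzzy(0.5, 0.7, 0.9), TriFuzzy(0.3, 0.5, 0.7)
print((good + fair).scale(0.5).defuzzify())   # -> 0.6
```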
Abstract:
With the release of the Nintendo Wii in 2006, the use of haptic force gestures has become a very popular form of input for interactive entertainment. However, current gesture recognition techniques utilised in Nintendo Wii games suffer from a lack of control when it comes to recognising simple gestures. This paper presents a simple gesture recognition technique called Peak Testing which gives greater control over gesture interaction. This recognition technique locates force peaks in continuous force data (provided by a gesture device such as the Wiimote) and then cancels any peaks which are not meant for input. Peak Testing is therefore able to identify movements in any direction. This paper applies the technique to the control of virtual instruments and investigates how users respond to this interaction. The technique is then explored as the basis for a robust way to navigate menus with a simple flick of the wrist. We propose that this flick form of interaction could be a very intuitive way to navigate Nintendo Wii menus instead of the pointer techniques currently implemented.
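A minimal sketch of Peak Testing as described above: local maxima in the force magnitude are located, and peaks arriving too soon after a previous one are cancelled as unintended rebounds. The threshold and refractory window are assumptions, not the paper's tuned values.

```python
import numpy as np

def find_force_peaks(samples, threshold=1.5, refractory=5):
    """Return indices of force peaks intended as input gestures."""
    mag = np.abs(np.asarray(samples, dtype=float))
    peaks, last = [], -refractory
    for i in range(1, len(mag) - 1):
        is_peak = mag[i] >= threshold and mag[i] > mag[i - 1] and mag[i] >= mag[i + 1]
        if is_peak and i - last >= refractory:   # cancel peaks not meant for input
            peaks.append(i)
            last = i
    return peaks
```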