Abstract:
Texture analysis and textural cues have been applied to image classification, segmentation and pattern recognition. Dominant texture descriptors include directionality, coarseness and line-likeness. In this dissertation a class of textures known as particulate textures is defined; these are predominantly coarse or blob-like. The set of features that characterises particulate textures differs from that of classical textures: micro-texture, macro-texture, size, shape and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to the more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry and road surface texture analysis. A new framework for the computation of particulate shape is proposed. A granulometric approach for particle size estimation based on edge detection is developed, which can be adapted to the gray level of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, thus making it possible to apply monocular imaging techniques to road surface texture analysis.
Results from the application of the developed algorithm to road surface macro-texture are compared with results based on Fourier spectra, the autocorrelation function and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions, such as illumination and camera angle, on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane, and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination R² exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
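The headline figure, the coefficient of determination between the image-based coarseness estimate and the laser-measured SMTD, is the squared Pearson correlation of the two paired series. A minimal sketch in Python (function name and data are illustrative, not from the thesis):

```python
import math

def r_squared(estimates, measurements):
    # Squared Pearson correlation between paired series,
    # e.g. image-based coarseness vs. laser profilometer SMTD.
    n = len(estimates)
    mx = sum(estimates) / n
    my = sum(measurements) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(estimates, measurements))
    sx = math.sqrt(sum((a - mx) ** 2 for a in estimates))
    sy = math.sqrt(sum((b - my) ** 2 for b in measurements))
    return (cov / (sx * sy)) ** 2
```

A perfectly linear relationship between the two series gives R² = 1.0; an R² above 0.9, as reported, means the imaging estimate explains over 90% of the variance in the laser measurement.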
Abstract:
Distributed Denial-of-Service (DDoS) attacks continue to be one of the most pernicious threats to the delivery of services over the Internet. Not only are DDoS attacks present in many guises, they are also continuously evolving as new vulnerabilities are exploited. Hence, accurate detection of these attacks remains a challenging problem and a necessity for ensuring high-end network security. An intrinsic challenge in addressing this problem is to effectively distinguish these Denial-of-Service attacks from similar-looking Flash Events (FEs) created by legitimate clients. A considerable overlap between the general characteristics of FEs and DDoS attacks makes it difficult to precisely separate these two classes of Internet activity. In this paper we propose parameters which can be used to explicitly distinguish FEs from DDoS attacks, and analyse two real-world publicly available datasets to validate our proposal. Our analysis shows that even though FEs appear very similar to DDoS attacks, there are several subtle dissimilarities which can be exploited to separate these two classes of events.
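The abstract does not enumerate the proposed parameters. As an illustration only, one discriminator often suggested in this literature is the fraction of previously unseen source IPs in a traffic window: flash crowds are dominated by returning legitimate clients, while botnet floods arrive from a surge of new sources. A hypothetical sketch (the function and threshold semantics are assumptions, not the paper's method):

```python
def new_source_fraction(window_ips, known_ips):
    # Fraction of distinct source IPs in the current traffic window
    # that have never been seen before. A value near 1.0 is more
    # consistent with a DDoS flood than with a flash event.
    distinct = set(window_ips)
    if not distinct:
        return 0.0
    new = sum(1 for ip in distinct if ip not in known_ips)
    return new / len(distinct)
```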
Abstract:
With the advent of live cell imaging microscopy, new types of mathematical analyses and measurements are possible. Many of the real-time movies of cellular processes are visually very compelling, but elementary analysis of changes over time in quantities such as surface area and volume often shows that there is more to the data than meets the eye. This unit outlines a geometric modeling methodology and applies it to the tubulation of vesicles during endocytosis. Using these principles, it has been possible to build better qualitative and quantitative understandings of the systems observed, as well as to make predictions about quantities such as ligand or solute concentration, vesicle pH, and the amount of membrane trafficked. The purpose is to outline a methodology for analyzing real-time movies that has led to a greater appreciation of the changes occurring during the time frame of real-time video microscopy, and of how additional quantitative measurements allow further hypotheses to be generated and tested.
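The point that surface area and volume tell a different story than the eye can be made concrete with simple geometric primitives: a sphere and a tubule (idealised as a cylinder) of equal membrane area enclose very different volumes. A sketch under these spherical/cylindrical idealisations (illustrative, not the unit's actual model):

```python
import math

def sphere_area_volume(r):
    # Surface area and volume of a spherical vesicle of radius r.
    return 4 * math.pi * r ** 2, (4 / 3) * math.pi * r ** 3

def tubule_area_volume(r, length):
    # Lateral membrane area and enclosed volume of a cylindrical
    # tubule of radius r (end caps neglected).
    return 2 * math.pi * r * length, math.pi * r ** 2 * length
```

For example, a vesicle of radius 1 that tubulates into a tube of radius 0.1 and length 20 keeps the same membrane area (4π) but encloses far less volume, so an apparent volume loss during tubulation need not mean membrane loss.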
Abstract:
Trees, shrubs and other vegetation are of continued importance to the environment and our daily life. They provide shade around our roads and houses, offer a habitat for birds and wildlife, and absorb air pollutants. However, vegetation touching power lines is a risk to public safety and the environment, and one of the main causes of power supply problems. Vegetation management, which includes tree trimming and vegetation control, is a significant cost component of the maintenance of electrical infrastructure. For example, Ergon Energy, the Australian energy distributor with the largest geographic footprint, currently spends over $80 million a year inspecting and managing vegetation that encroaches on power line assets. Currently, most vegetation management programs for distribution systems are calendar-based ground patrols. However, calendar-based inspection by linesmen is labour-intensive, time-consuming and expensive. It also results in some zones being trimmed more frequently than needed and others not cut often enough. Moreover, it is seldom practicable to measure all the plants around power line corridors by field methods. Remote sensing data captured from airborne sensors have great potential to assist vegetation management in power line corridors. This thesis presents a comprehensive study of using spiking neural networks in a specific image analysis application: power line corridor monitoring. Theoretically, the thesis focuses on a biologically inspired spiking cortical model: the pulse coupled neural network (PCNN). The original PCNN model was simplified in order to better analyse the pulse dynamics and control the performance. New and effective algorithms were developed based on the proposed spiking cortical model for object detection, image segmentation and invariant feature extraction. The developed algorithms were evaluated in a number of experiments using real image data collected from our flight trials.
The experimental results demonstrated the effectiveness and advantages of spiking neural networks in image processing tasks. Operationally, the knowledge gained from this research project offers a good reference to our industry partner (i.e. Ergon Energy) and other energy utilities that want to improve their vegetation management activities. The novel approaches described in this thesis show the potential of using cutting-edge sensor technologies and intelligent computing techniques to improve power line corridor monitoring. The lessons learnt from this project are also expected to increase the confidence of energy companies to move from a traditional vegetation management strategy to a more automated, accurate and cost-effective solution using aerial remote sensing techniques.
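The PCNN family of models shares a common pattern: each neuron leaks, integrates its pixel stimulus plus pulses from its neighbours, fires when its internal activity crosses a dynamic threshold, and firing raises that threshold, producing refractory, wave-like pulses that group similar pixels. A minimal pure-Python sketch of this pattern with illustrative constants (not the thesis's simplified model or its parameter values):

```python
def simplified_pcnn(image, steps=5, decay=0.5, theta_decay=0.7, theta_jump=20.0):
    # image: 2-D list of non-negative stimulus values (pixel intensities).
    # Returns the binary firing map produced at each iteration.
    rows, cols = len(image), len(image[0])
    U = [[0.0] * cols for _ in range(rows)]      # internal activity
    Y = [[0] * cols for _ in range(rows)]        # pulses from last step
    theta = [[1.0] * cols for _ in range(rows)]  # dynamic thresholds
    history = []
    for _ in range(steps):
        newY = [[0] * cols for _ in range(rows)]
        for i in range(rows):
            for j in range(cols):
                # linking input: pulses from the 8-neighbourhood
                link = sum(
                    Y[i + di][j + dj]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di or dj)
                    and 0 <= i + di < rows and 0 <= j + dj < cols)
                # leaky integration, modulated by neighbouring pulses
                U[i][j] = decay * U[i][j] + image[i][j] * (1 + 0.1 * link)
                newY[i][j] = 1 if U[i][j] > theta[i][j] else 0
                # firing raises the threshold, which then decays
                theta[i][j] = theta_decay * theta[i][j] + theta_jump * newY[i][j]
        Y = newY
        history.append([row[:] for row in Y])
    return history
```

Bright pixels fire early and then fall silent while their raised thresholds decay; the timing and grouping of these pulses is what segmentation and feature-extraction schemes built on PCNNs exploit.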
Abstract:
In this paper, we present a method for the recovery of position and absolute attitude (including pitch, roll and yaw) using a novel fusion of monocular Visual Odometry and GPS measurements, in a manner similar to a classic loosely-coupled GPS/INS error-state navigation filter. The proposed filter does not require additional restrictions or assumptions such as platform-specific dynamics, map-matching, feature-tracking, visual loop-closing, a gravity vector, or additional sensors such as an IMU or magnetic compass. An observability analysis of the proposed filter is performed, showing that the scale factor, position and attitude errors are fully observable under acceleration that is non-parallel to the velocity vector in the navigation frame. The observability properties of the proposed filter are demonstrated using numerical simulations. We conclude the article with an implementation of the proposed filter using real flight data collected from a Cessna 172 equipped with a downwards-looking camera and GPS, showing the feasibility of the algorithm in real-world conditions.
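One way to see why GPS renders monocular scale observable: visual odometry reports displacements that are correct only up to an unknown scale factor, and fitting that factor against GPS-derived displacements is a one-parameter least-squares problem. A sketch of this idea in isolation (not the paper's actual error-state filter, which estimates the scale jointly with position and attitude errors):

```python
def estimate_vo_scale(vo_disp, gps_disp):
    # Least-squares scale s minimising sum((s * v - g)^2) over paired
    # per-epoch displacement magnitudes from visual odometry (v) and
    # GPS (g). Closed form: s = <v, g> / <v, v>.
    num = sum(v * g for v, g in zip(vo_disp, gps_disp))
    den = sum(v * v for v in vo_disp)
    return num / den
```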
Abstract:
Real estate, or property development, is considered one of the pillar industries of the Chinese economy. As a result of the opening up of the economy, as well as the "macro-control" policy of the Central Chinese Government to moderate the frenetic pace of economic growth, the real estate industry has faced fierce competition and ongoing change. Real estate firms in China must improve their competitiveness in order to maintain market share, or even survive, in this brutally competitive environment. This study developed a methodology to evaluate the competitiveness of real estate developers in China and then used a case study to illustrate the effectiveness of the evaluation method. Four steps were taken to achieve this. The first step was to conduct a thorough literature review, which included a review of the characteristics of the real estate industry, theories about competitiveness, and the competitive characteristics of real estate developers. Following this literature review, the competitive model was developed based on seven key competitive factors (the 'level 1') identified in the literature: (1) financial competency; (2) market share; (3) management competency; (4) social responsibility; (5) organisational competency; (6) technological capabilities; and (7) regional competitiveness. In the next step of the research, the competitive evaluation criteria (the 'level 2') under each of the competitive factors (the 'level 1') were evaluated. Additionally, a set of competitive attributes (the 'level 3') was identified under each competitive criterion (the 'level 2'). These attributes were initially recognised during the literature review and then expanded upon through interviews with multidisciplinary experts and practitioners in various real estate-related industries. The final step in this research was to undertake a case study using the proposed evaluation method and attributes.
Through the study of an actual real estate development company, the procedures and effectiveness of the evaluation method were illustrated and validated. Through these steps, this research investigates and develops an analytical system for determining the corporate competitiveness of real estate developers in China. The analytical system is formulated to evaluate the "state of health" of the business from different competitive perspectives. The result of the empirical study illustrates that a systematic and structured evaluation can effectively assist developers in identifying their strengths and highlighting potential problems. This is very important for the development of an overall corporate strategy and for supporting key strategic decisions. This study also provides insights, analysis and suggestions for improving the competitiveness of real estate developers in China from different perspectives, including management competency, organisational competency, technological capabilities, financial competency, market share, social responsibility and regional competitiveness. In the case study, problems were found in each of these areas, and they appear to be common in the industry. To address these problems and improve the competitiveness and effectiveness of Chinese real estate developers, a variety of suggestions are proposed. The findings of this research provide an insight into the factors that influence competitiveness in the Chinese real estate industry while also assisting practitioners to formulate strategies to improve their competitiveness. References for studying the competitiveness of real estate developers in other countries are also provided.
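The three-level factor/criterion/attribute structure lends itself to bottom-up weighted aggregation: each node's score is the weighted sum of its children, down to raw attribute scores at level 3. A minimal sketch (the weights and scores here are invented for illustration; the study's own weighting scheme is not given in this abstract):

```python
def score(node):
    # node is either a numeric attribute score (a leaf, level 3) or a
    # list of (weight, subnode) pairs whose weights sum to 1 at that
    # level (criteria at level 2, factors at level 1).
    if isinstance(node, (int, float)):
        return node
    return sum(weight * score(sub) for weight, sub in node)

# Hypothetical developer: two factors, one of which has two criteria.
competitiveness = score([
    (0.3, [(0.5, 80), (0.5, 60)]),  # e.g. financial competency
    (0.7, 90),                      # e.g. management competency
])
```

Because the aggregation is recursive, the same function handles any depth, so the level-3 attributes, level-2 criteria and level-1 factors all compose into a single "state of health" score.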
Abstract:
PURPOSE: The purpose of this study is to identify risk factors for developing complications following treatment of refractory glaucoma with transscleral diode laser cyclophotocoagulation (cyclodiode), in order to improve the safety profile of this treatment modality. METHOD: A retrospective analysis of 72 eyes from 70 patients who were treated with cyclodiode. RESULTS: The mean pre-treatment intraocular pressure (IOP) was 37.0 mmHg (SD 11.0), with a mean post-treatment reduction in IOP of 19.8 mmHg and a mean IOP at last follow-up of 17.1 mmHg (SD 9.7). The mean total power delivered during treatment was 156.8 Joules (SD 82.7) over a mean of 1.3 treatments (SD 0.6). Sixteen eyes (22.2% of patients) developed complications from the treatment, the most common being hypotony, which occurred in 6 patients, including 4 with neovascular glaucoma. A higher pre-treatment IOP and a higher mean total power delivery were also associated with higher complication rates. CONCLUSIONS: Cyclodiode is an effective treatment option for glaucoma that is refractory to other treatment options. By identifying risk factors for potential complications, cyclodiode treatment can be modified accordingly for each patient to improve safety and efficacy.
Abstract:
Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the Internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties and the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity. The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. Those real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity of the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network, because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. We then adopt the iterative scoring method to generate weighted PPI networks for five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana.
Using the random sequential box-covering algorithm, we calculate the fractal dimensions of both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This implication will be useful for our treatment of the networks in the third part of the thesis. The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure which repeats on an ever-reduced scale, and fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species. Our main finding is the existence of multifractality in scale-free networks and PPI networks, while multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks.
This multifractal analysis thus provides a potentially useful tool for gene clustering and identification. The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterizing complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent work indicates that complex network theory can be a powerful tool for analysing time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length in the time series, and the weight of the edge between any two nodes as the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence a larger Hurst exponent, tend to have smaller fractal dimension, hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions, as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest.
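The random sequential box covering used throughout this work can be sketched for an unweighted graph: repeatedly pick an uncovered node as a box centre and cover everything within a given graph distance of it; the fractal (box) dimension dB then comes from how the box count N(l) scales with box size l, N(l) ~ l^(-dB). A sketch using ball-style boxes, one of the common variants (details of the thesis's algorithm may differ):

```python
import random
from collections import deque

def box_count(adj, box_size, seed=0):
    # adj: dict node -> list of neighbours. Each box is an uncovered
    # centre plus every uncovered node within graph distance
    # < box_size of it. Returns the number of boxes needed; repeating
    # over box sizes gives the scaling N(l) ~ l^(-dB).
    rng = random.Random(seed)
    uncovered = set(adj)
    boxes = 0
    while uncovered:
        centre = rng.choice(sorted(uncovered))
        # BFS outwards from the centre, up to distance box_size - 1
        dist = {centre: 0}
        queue = deque([centre])
        while queue:
            node = queue.popleft()
            if dist[node] + 1 >= box_size:
                continue
            for nbr in adj[node]:
                if nbr not in dist:
                    dist[nbr] = dist[node] + 1
                    queue.append(nbr)
        uncovered -= set(dist)
        boxes += 1
    return boxes
```

Because the centres are chosen randomly, the covering is usually repeated over several random orderings and the minimum (or mean) box count taken at each size.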
As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., it follows a power law), meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts, while for HVG networks of fractional Brownian motions the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
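The horizontal visibility graph maps a time series to a network by a simple rule: time points i and j are linked iff every point strictly between them is lower than both. A minimal O(n²) sketch (faster constructions exist):

```python
def horizontal_visibility_graph(series):
    # Nodes are time indices; (i, j) is an edge iff all intermediate
    # values lie strictly below both endpoints. Adjacent points are
    # always linked (the condition is vacuously true).
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            bar = min(series[i], series[j])
            if all(series[k] < bar for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

For example, in the series [1, 3, 2] the middle peak blocks the horizontal line between the first and last points, so only the two adjacent pairs are linked.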
Abstract:
Sourcing appropriate funding for the provision of new urban infrastructure has been a policy dilemma for governments around the world for decades. This is particularly relevant in high-growth areas where new services are required to support swelling populations. The Australian infrastructure funding policy dilemmas are reflective of similar matters in many countries, particularly the United States of America, where infrastructure cost recovery policies have been in place since the 1970s. There is an extensive body of both theoretical and empirical literature from these countries that discusses the passing on (to home buyers) of these infrastructure charges, and the corresponding impact on housing prices. The theoretical evidence is consistent in its findings that infrastructure charges are passed on to home buyers by way of higher house prices. The empirical evidence is also consistent in its findings, with "overshifting" of these charges evident in all models since the 1980s, i.e. a $1 infrastructure charge results in a greater than $1 increase in house prices. However, despite over a dozen separate studies on this topic over two decades in the US, no empirical work has been carried out in Australia to test whether similar shifting or overshifting occurs here. The purpose of this research is to conduct a preliminary analysis of the more recent models used in these US empirical studies in order to identify the key study area selection criteria and success factors. The paper concludes that many of the study area selection criteria are implicit rather than explicit. By collecting data across the models, some implicit criteria become apparent, whilst others remain elusive. These data will inform future research on whether an existing model can be adopted or adapted for use in Australia.
Abstract:
The purpose of this paper is to explore the trends in Purpose Built Office (PBO) supply and occupancy in Malaysia. To achieve this, the number of PBOs supplied by the private sector is compared with that of the government sector to gain an understanding of the current emerging market for PBOs. There have been limited studies in Malaysia comparing the trends in supply and occupancy of PBOs across both sectors. The outcome of this paper illustrates the need for public sector asset management in Malaysia, particularly for PBOs. An analytical framework is developed using time series to measure the level of supply and occupancy of PBOs in both sectors, indicating the percentage of government PBOs relative to the total number of PBOs in the market from 2004 to 2010.
Abstract:
Rural property in Australia has seen significant market resurgence over the past three years, with improved seasonal conditions in a number of states, improved commodity prices, and greater interest in, and purchase of, rural land by major international corporations and investment institutions. Much of this change in perspective on rural property as an asset class can be linked to the food shortage of 2007 and the subsequent interest of many countries in food security. This paper addresses the total and capital return performance of a major agricultural area and compares these returns on the basis of both location of land and land use. The comparison is used to determine whether location or actual land use has a greater influence on rural property capital and income returns. This performance analysis is based on over 40,000 rural sales transactions, covering all market-based rural property transactions in New South Wales, Australia for the period January 1990 to December 2010. Correlation analysis and investment performance analysis have also been carried out to determine the possible relationships between location, land use and subsequent changes in rural land capital values.
Abstract:
A novel method for genotyping the clustered regularly interspaced short palindromic repeat (CRISPR) locus of Campylobacter jejuni is described. Following real-time PCR, CRISPR products were subjected to high-resolution melt (HRM) analysis, a new technology that allows precise determination of the melt profile of amplicons. This investigation shows that the CRISPR HRM assay provides a powerful addition to existing C. jejuni genotyping methods and emphasizes the potential of HRM for genotyping short sequence repeats in other species.
Abstract:
In 1999, Richards compared the accuracy of commercially available motion capture systems commonly used in biomechanics. Richards identified that in static tests the optical motion capture systems generally produced RMS errors of less than 1.0 mm. During dynamic tests, the RMS error increased to up to 4.2 mm in some systems. In the last 12 years, motion capture systems have continued to evolve and now include high-resolution CCD or CMOS image sensors, wireless communication, and high full-frame sampling frequencies. In addition to hardware advances, there have also been a number of advances in software, including improved calibration and tracking algorithms, real-time data streaming, and the introduction of the c3d standard. These advances have allowed the system manufacturers to maintain a high retail price in the name of advancement. In areas such as gait analysis and ergonomics, many of the advanced features, such as high-resolution image sensors and high sampling frequencies, are not required due to the nature of the tasks typically investigated. Recently, Natural Point introduced low-cost cameras which on face value appear suitable as, at the very least, a high-quality teaching tool in biomechanics, and possibly even a research tool when coupled with the correct calibration and tracking software. The aim of this study was therefore to compare both the linear accuracy and the quality of angular kinematics from a typical high-end motion capture system and a low-cost system during a simple task.
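The accuracy metric in Richards-style comparisons is the root-mean-square error between tracked marker quantities (positions or inter-marker distances) and their known reference values. A minimal sketch:

```python
import math

def rms_error(measured, reference):
    # Root-mean-square error between measured values (e.g. tracked
    # inter-marker distances) and known reference values, in the same
    # units (e.g. millimetres).
    diffs = [(m - r) ** 2 for m, r in zip(measured, reference)]
    return math.sqrt(sum(diffs) / len(diffs))
```

Comparing this value across systems, under both static and dynamic conditions, is how figures such as "less than 1.0 mm static, up to 4.2 mm dynamic" are obtained.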
Abstract:
This paper studies the missing covariate problem, which is often encountered in survival analysis. Three covariate imputation methods are employed in the study, and the effectiveness of each method is evaluated within the hazard prediction framework. Data from a typical engineering asset are used in the case study. Covariate values in some time steps are deliberately discarded to generate an incomplete covariate set. It is found that although the mean imputation method is simpler than the others for solving missing covariate problems, the results it produces can differ greatly from the real values of the missing covariates. This study also shows that, in general, results obtained from the regression method are more accurate than those of the mean imputation method, but at the cost of higher computational expense. The Gaussian Mixture Model (GMM) method is found to be the most effective of the three in terms of both computational efficiency and prediction accuracy.
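Two of the three compared methods are easy to sketch: mean imputation fills a gap with the average of the observed values, while regression imputation predicts the missing value from a fully observed companion covariate. A sketch with missing entries encoded as None (illustrative, not the study's implementation; the GMM method is omitted):

```python
def mean_impute(values):
    # Replace each missing entry (None) with the mean of the observed
    # entries -- simple, but ignores covariate structure.
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def regression_impute(x, y):
    # Impute missing y values from a fully observed covariate x via a
    # least-squares fit y ~ a + b*x on the complete pairs.
    pairs = [(xi, yi) for xi, yi in zip(x, y) if yi is not None]
    n = len(pairs)
    mx = sum(xi for xi, _ in pairs) / n
    my = sum(yi for _, yi in pairs) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in pairs)
         / sum((xi - mx) ** 2 for xi, _ in pairs))
    a = my - b * mx
    return [a + b * xi if yi is None else yi for xi, yi in zip(x, y)]
```

When the covariate trends over time (as asset condition data typically do), the regression fill tracks the trend while the mean fill flattens it, which is one way to see why the paper finds regression more accurate than mean imputation.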
Abstract:
The Social Web is a torrent of real-time information, and an emerging discipline is now focussed on harnessing this information flow for analysis of themes, opinions and sentiment. This short paper reports on early work on designing better user interfaces that let end users manipulate the outcomes from these analysis engines.