845 results for LARGE-SAMPLE
Abstract:
In recent years a large body of research has investigated the various factors affecting child development and the consequent impact of child development on future educational and labour market outcomes. In this article we contribute to this literature by investigating the effect of handedness on child development, given recent research demonstrating that child development strongly affects adult outcomes. Using a large nationally representative sample of young children, we find that the probability of a child being left-handed is not significantly related to child health at birth, family composition, parental employment or household income. We also find robust evidence that left-handed (and mixed-handed) children perform significantly worse than right-handed children on nearly all measures of development, with the relative disadvantage being larger for boys than girls. Importantly, these differentials cannot be explained by different socioeconomic characteristics of the household, parental attitudes or investments in learning resources.
Abstract:
Recent studies have shown that delusion-like experiences (DLEs) are common among general populations. This study investigates whether the prevalence of these experiences is linked to the embracing of New Age thought. Logistic regression analyses were performed using data derived from a large community sample of young adults (N = 3777). Belief in a spiritual or higher power other than God was found to be significantly associated with endorsement of 16 of 19 items from the Peters et al. (1999b) Delusional Inventory following adjustment for a range of potential confounders, while belief in God was associated with endorsement of four items. A New Age conception of the divine appears to be strongly associated with a wide range of DLEs. Further research is needed to determine whether there is a causal link between New Age philosophy and DLEs (e.g. thought disturbance, suspiciousness, and delusions of grandeur).
Abstract:
Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. Very high resolution (VHR) imagery from digital airborne sources can greatly facilitate data acquisition, collection and updating if road details can be automatically extracted from the aerial images. In this paper, we propose an effective approach to detect road lane information from aerial images using object-oriented image analysis. Our algorithm starts by constructing the DSM and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects have similar spectral and geometrical attributes, the extracted road lanes are filtered against the road surface obtained from a progressive two-class decision classifier. The generated road network is evaluated using the datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values that range between 76% and 98% and correctness values that range between 82% and 97%.
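For reference, the completeness and correctness figures quoted above follow the standard length-based evaluation measures used for extracted road networks. The minimal sketch below only illustrates how such ratios are computed; the inputs are hypothetical and this is not the paper's exact matching procedure.

```python
# Hedged sketch of length-based road-extraction evaluation measures.
# Inputs (matched/total lengths) are assumed to come from a separate
# buffer-matching step against a reference network.

def completeness(matched_reference_length, total_reference_length):
    """Fraction of the reference road network recovered by the extraction."""
    return matched_reference_length / total_reference_length

def correctness(matched_extracted_length, total_extracted_length):
    """Fraction of the extracted network that matches the reference roads."""
    return matched_extracted_length / total_extracted_length

# Example with illustrative numbers: 9.8 km of a 10 km reference matched,
# 9.7 km of a 10 km extraction confirmed correct.
print(completeness(9.8, 10.0), correctness(9.7, 10.0))  # 0.98 0.97
```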
Abstract:
The automatic extraction of road features from remotely sensed images has been a topic of great interest within the photogrammetric and remote sensing communities for over three decades. Although various techniques have been reported in the literature, it remains challenging to extract road details efficiently as image resolution increases and as the demand for accurate, up-to-date road data grows. In this paper, we focus on the automatic detection of road lane markings, which are crucial for many applications, including lane-level navigation and lane departure warning. The approach consists of four steps: i) data preprocessing, ii) image segmentation and road surface detection, iii) road lane marking extraction based on the generated road surface, and iv) testing and system evaluation. The proposed approach uses the unsupervised ISODATA image segmentation algorithm, which segments the image into vegetation regions and road surface based only on the Cb component of the YCbCr color space. A shadow detection method based on the YCbCr color space is also employed to detect and recover shadows cast on the road surface by vehicles and trees. Finally, the lane marking features are detected from the road surface using histogram clustering. Experiments applying the proposed method to the aerial imagery dataset of Gympie, Queensland demonstrate the efficiency of the approach.
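As a rough illustration of the Cb-channel road-surface segmentation step, the sketch below clusters pixels on the Cb component of YCbCr. It is not the authors' implementation: k-means stands in for the ISODATA algorithm named in the abstract, and the file name, cluster count and the assumption that the lower-Cb cluster is road surface are all illustrative.

```python
# Minimal sketch: segment road surface from vegetation using only the Cb channel.
# k-means is used here as a stand-in for unsupervised ISODATA clustering.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def segment_road_surface(image_path, n_clusters=2):
    """Return a binary mask (uint8) of the cluster assumed to be road surface."""
    bgr = cv2.imread(image_path)
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)   # OpenCV channel order: Y, Cr, Cb
    cb = ycrcb[:, :, 2].reshape(-1, 1).astype(np.float32)

    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(cb)
    labels = labels.reshape(ycrcb.shape[:2])

    # Illustrative assumption (not a claim from the paper): the cluster with the
    # lower mean Cb value corresponds to the road surface.
    road_cluster = min(range(n_clusters), key=lambda k: cb[labels.ravel() == k].mean())
    return (labels == road_cluster).astype(np.uint8)

# mask = segment_road_surface("aerial_tile.tif")  # hypothetical input tile
```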
Abstract:
Aims: To determine the effect of nutritional status on the presence and severity of pressure ulcers in statewide public healthcare facilities in Queensland, Australia. Research Methods: A multicentre, cross-sectional audit of the nutritional status of a convenience sample of subjects was carried out as part of a large audit of pressure ulcers in a sample of state-based public healthcare facilities in 2002 and 2003. Dietitians in 20 hospitals and six residential aged care facilities conducted single-day nutritional status audits of 2208 acute and 839 aged care subjects using the Subjective Global Assessment. The effect of nutritional status on the presence, highest stage and number of pressure ulcers was determined by logistic regression in a model controlling for age, gender, medical specialty and facility location. The potential clustering effect of facility was accounted for in the model using an analysis-of-correlated-data approach. Results: Subjects with malnutrition had an adjusted odds ratio of 2.6 (95% CI 1.8-3.5, p<0.001) for having a pressure ulcer in acute facilities and 2.0 (95% CI 1.5-2.7, p<0.001) in residential aged care facilities. The odds of having a pressure ulcer, of having a higher-stage pressure ulcer and of having a larger number of pressure ulcers also increased with the severity of malnutrition. Conclusion: Malnutrition was associated with at least twice the odds of having a pressure ulcer in public healthcare facilities in Queensland. Action must be taken to identify, prevent and treat malnutrition, especially in patients at risk of pressure ulcers.
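A minimal sketch of the kind of cluster-adjusted logistic model described above (adjusted odds ratios with facility as the clustering unit). This is not the authors' code: the data file and column names are assumptions, and generalised estimating equations stand in for the unspecified analysis-of-correlated-data approach.

```python
# Hedged sketch: logistic GEE of pressure-ulcer presence on nutritional status,
# adjusted for age, gender, specialty and location, clustered by facility.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("pressure_ulcer_audit.csv")  # hypothetical audit extract

model = smf.gee(
    "pressure_ulcer ~ C(nutritional_status) + age + C(gender) + C(specialty) + C(location)",
    groups="facility_id",
    data=df,
    family=sm.families.Binomial(),
)
result = model.fit()

# Adjusted odds ratios and 95% confidence intervals
print(np.exp(result.params))
print(np.exp(result.conf_int()))
```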
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay particular attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX).
The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracy. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based only on the second moment, seem to underestimate the long memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
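Since MF-DFA is the central detection tool in Parts I and IV, the compact sketch below shows the standard procedure (cumulative profile, windowed polynomial detrending, q-th-order fluctuation functions). It is a generic textbook-style implementation under illustrative scales and q values, not the thesis' own code, and it omits refinements such as scanning segments from both ends of the series.

```python
# Compact sketch of multifractal detrended fluctuation analysis (MF-DFA).
import numpy as np

def mfdfa(x, scales, q_list, order=1):
    """Return F_q(s) for each q in q_list and each window size s in scales."""
    profile = np.cumsum(x - np.mean(x))          # integrated, mean-centred profile
    fq = np.zeros((len(q_list), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        f2 = []                                   # squared fluctuation per segment
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, order)    # local polynomial detrending
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        f2 = np.asarray(f2)
        for i, q in enumerate(q_list):
            if q == 0:
                fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    return fq

# The generalised Hurst exponent h(q) is the slope of log F_q(s) vs log s.
```

The slope h(2) plays the role of the classical Hurst exponent: values near 0.5 indicate short memory and values above 0.5 indicate long memory, which is how the abstract distinguishes the AMEX stock prices from the exchange-rate and electricity-price series.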
Abstract:
This is an experimental study into the permeability and compressibility properties of bagasse pulp pads. Three experimental rigs were custom-built for this project. The experimental work is complemented by modelling work. Both the steady-state and dynamic behaviour of pulp pads are evaluated in the experimental and modelling components of this project. Bagasse, the fibrous residue that remains after sugar is extracted from sugarcane, is normally burnt in Australia to generate steam and electricity for the sugar factory. A study into bagasse pulp was motivated by the possibility of making highly value-added pulp products from bagasse for the financial benefit of sugarcane millers and growers. The bagasse pulp and paper industry is a multibillion-dollar industry (1). Bagasse pulp could replace eucalypt pulp, which is more widely used in the local production of paper products. An opportunity exists for replacing the large quantity of mainly generic paper products imported to Australia. This includes 949,000 tonnes of generic photocopier papers (2). The use of bagasse pulp for paper manufacture is the main application area of interest for this study. Bagasse contains a large quantity of short parenchyma cells called ‘pith’. Around 30% of the shortest fibres are removed from bagasse prior to pulping. Despite the ‘depithing’ operations in conventional bagasse pulp mills, a large amount of pith remains in the pulp. Amongst Australian paper producers there is a perception that the high quantity of short fibres in bagasse pulp leads to poor filtration behaviour at the wet-end of a paper machine. Bagasse pulp’s poor filtration behaviour reduces paper production rates, and consequently revenue, when compared to paper production using locally made eucalypt pulp. Pulp filtration can be characterised by two interacting factors: permeability and compressibility. Surprisingly, there has previously been very little rigorous investigation into either bagasse pulp permeability or compressibility. Only freeness testing of bagasse pulp has been published in the open literature. As a result, this study has focussed on a detailed investigation of the filtration properties of bagasse pulp pads. As part of this investigation, three options for improving the permeability and compressibility properties of Australian bagasse pulp pads were examined. Two options for further pre-treating depithed bagasse prior to pulping were considered. Firstly, bagasse was fractionated based on size, producing ‘coarse’ and ‘medium’ bagasse fractions. Secondly, bagasse was collected after being processed on two types of juice extraction technology, i.e. from a sugar mill and from a sugar diffuser. Finally, one method of post-treating the bagasse pulp was investigated: chemical additives, which are known to improve freeness, were assessed for their effect on pulp pad permeability and compressibility. Pre-treated Australian bagasse pulp samples were compared with several benchmark pulp samples. A sample of commonly used kraft Eucalyptus globulus pulp was obtained. A sample of depithed Argentinean bagasse, which is used for commercial paper production, was also obtained. A sample of Australian bagasse depithed as per typical factory operations was also produced for benchmarking purposes. The steady-state pulp pad permeability and compressibility parameters were determined experimentally using two purpose-built experimental rigs.
In reality, steady-state conditions do not exist on a paper machine; the permeability changes as the sheet compresses over time. Hence, a dynamic model was developed which uses the experimentally determined steady-state permeability and compressibility parameters as inputs. The filtration model was developed with a view to designing pulp processing equipment suited specifically to bagasse pulp. The predicted results of the dynamic model were compared to experimental data. The effectiveness of polymeric and microparticle chemical additives for improving the retention of short fibres and increasing the drainage rate of a bagasse pulp slurry was determined in a third purpose-built rig: a modified Dynamic Drainage Jar (DDJ). These chemical additives were then used in the making of a pulp pad, and their effect on the steady-state and dynamic permeability and compressibility of bagasse pulp pads was determined. The most important finding from this investigation was that Australian bagasse pulp was produced with higher permeability than eucalypt pulp, despite a higher overall content of short fibres. It is thought this research outcome could enable Australian paper producers to switch from eucalypt pulp to bagasse pulp without sacrificing paper machine productivity. It is thought that two factors contributed to the high permeability of the bagasse pulp pad. Firstly, the thicker cell walls of the bagasse pulp fibres resulted in high fibre stiffness. Secondly, the bagasse pulp had a large proportion of fibres longer than 1.3 mm. These attributes helped to reinforce the pulp pad matrix. The steady-state permeability and compressibility parameters for the eucalypt pulp were consistent with those found by previous workers. It was also found that Australian pulp derived from the ‘coarse’ bagasse fraction had higher steady-state permeability than that from the ‘medium’ fraction. However, there was no difference between bagasse pulp originating from a diffuser and that from a mill. The bagasse pre-treatment options investigated in this study were not found to affect the steady-state compressibility parameters of a pulp pad. The dynamic filtration model was found to give predictions in good agreement with experimental data for pads made from samples of pre-treated bagasse pulp, provided at least some pith was removed prior to pulping. Applying vacuum to a pulp slurry in the modified DDJ dramatically reduced the drainage time. At any level of vacuum, bagasse pulp benefitted from chemical additives, as quantified by reduced drainage time and increased retention of short fibres. Using the modified DDJ, it was observed that under specific conditions a benchmark depithed bagasse pulp drained more rapidly than the ‘coarse’ bagasse pulp. In steady-state permeability and compressibility experiments, the addition of chemical additives improved the pad permeability and compressibility of a benchmark bagasse pulp with a high quantity of short fibres. Importantly, this effect was not observed for the ‘coarse’ bagasse pulp. However, dynamic filtration experiments showed a small observable improvement in filtration for the ‘medium’ bagasse pulp as well. The mechanism of bagasse pulp pad consolidation appears to be fibre realignment, with chemical additives assisting by lubricating the consolidation process. This study was complemented by pulp physical and chemical property testing and a microscopy study.
In addition to its high pulp pad permeability, ‘coarse’ bagasse pulp often (but not always) had superior physical properties to a benchmark depithed bagasse pulp.
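To indicate how steady-state permeability and compressibility parameters feed a dynamic drainage prediction, the sketch below uses the classical constant-pressure cake filtration relation with a compressible-cake specific resistance. This is generic textbook filtration theory offered purely for orientation, not the thesis' pad model, and every parameter value is an illustrative assumption.

```python
# Hedged sketch: constant-pressure filtration with a compressible cake,
# alpha = alpha0 * dP**n (Darcy's law integrated to t = a*V**2 + b*V).
import numpy as np

def filtrate_volume(t, dP, A, mu, c, alpha0, n, Rm):
    """Cumulative filtrate volume V(t) for constant applied pressure dP."""
    alpha = alpha0 * dP ** n                    # compressibility of the cake
    a = mu * alpha * c / (2.0 * A ** 2 * dP)    # quadratic coefficient
    b = mu * Rm / (A * dP)                      # medium-resistance term
    return (-b + np.sqrt(b ** 2 + 4.0 * a * t)) / (2.0 * a)

# Illustrative numbers only (SI units): 10 kPa vacuum over a 0.01 m^2 pad.
t = np.linspace(0.0, 60.0, 7)
print(filtrate_volume(t, dP=1e4, A=0.01, mu=1e-3, c=5.0, alpha0=1e10, n=0.3, Rm=1e9))
```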
Abstract:
Conventional clinical therapies are unable to resolve osteochondral defects adequately, hence tissue engineering solutions are sought to address the challenge. A biphasic implant seeded with Mesenchymal Stem Cells (MSCs) and coupled with an electrospun membrane was evaluated as an alternative. This dual-phase construct comprised a Polycaprolactone (PCL) cartilage scaffold and a Polycaprolactone-Tricalcium Phosphate (PCL-TCP) osseous matrix. Autologous MSCs were seeded into the entire implant via fibrin and the construct was inserted into critically sized osteochondral defects located at the medial condyle and patellar groove of pigs. The defect was resurfaced with a PCL-collagen electrospun mesh that served as a substitute for a periosteal flap in preventing cell leakage. Controls without either the implanted MSCs or the resurfacing membrane were included. After 6 months, cartilaginous repair was observed with a low occurrence of fibrocartilage at the medial condyle. Osteochondral repair was promoted and host cartilage degeneration was arrested, as shown by the superior Glycosaminoglycan (GAG) maintenance. This positive morphological outcome was supported by a higher relative Young's modulus, which indicated functional cartilage restoration. Bone ingrowth and remodeling occurred in all groups, with a higher degree of mineralization in the experimental group. Tissue repair was compromised in the absence of the implanted cells or the resurfacing membrane. Moreover, healing was inferior at the patellar groove compared to the medial condyle, which was attributed to the native biomechanical features.
Abstract:
In this paper, we present a finite sample analysis of the sample minimum-variance frontier under the assumption that the returns are independent and multivariate normally distributed. We show that the sample minimum-variance frontier is a highly biased estimator of the population frontier, and we propose an improved estimator of the population frontier. In addition, we provide the exact distribution of the out-of-sample mean and variance of sample minimum-variance portfolios. This allows us to understand the impact of estimation error on the performance of in-sample optimal portfolios. Key Words: minimum-variance frontier; efficiency set constants; finite sample distribution
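For context, the sample minimum-variance frontier referred to above is conventionally computed from the efficiency-set constants. The sketch below shows that standard computation from a matrix of sample returns; it is a generic illustration and does not reproduce the paper's improved estimator or its finite-sample distribution results.

```python
# Hedged sketch: sample minimum-variance frontier via efficiency-set constants.
import numpy as np

def minimum_variance_frontier(returns, target_means):
    """returns: T x N matrix of asset returns; target_means: grid of portfolio means.
    Returns the frontier variance for each target mean."""
    mu = returns.mean(axis=0)
    sigma_inv = np.linalg.inv(np.cov(returns, rowvar=False))
    ones = np.ones_like(mu)

    # Efficiency-set constants
    a = ones @ sigma_inv @ ones
    b = ones @ sigma_inv @ mu
    c = mu @ sigma_inv @ mu
    d = a * c - b ** 2

    # Frontier: var(m) = (a*m**2 - 2*b*m + c) / d
    return (a * target_means ** 2 - 2 * b * target_means + c) / d

# Example with simulated returns (illustrative only)
rng = np.random.default_rng(0)
R = rng.normal(0.01, 0.05, size=(120, 5))
print(minimum_variance_frontier(R, np.linspace(0.0, 0.02, 5)))
```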
Abstract:
The purpose of this article is to highlight the conflict in the policy objectives of subs 46(1) and subs 46(1AA) of the Trade Practices Act 1974 (Cth) (TPA). The policy objective of subs 46(1) is to promote competition and efficient markets for the benefit of consumers (the consumer welfare standard). It does not prohibit corporations with substantial market power from using cost savings arising from efficiencies, such as economies of scale or scope, to undercut small business competitors. The policy objective of subs 46(1AA), on the other hand, is to protect small business operators from price discounting by their larger competitors. Unlike subs 46(1), it does not contain a ‘taking advantage’ element. It is argued that subs 46(1AA) may harm consumer welfare by having a chilling effect on price competition where such competition would harm small business competitors.
Abstract:
While a number of factors have been highlighted in the innovation adoption literature, little is known about the key triggers of innovation adoption across differently sized firms. This study compares case studies of small, medium and large manufacturing firms that recently decided to adopt a process innovation. We also employ organizational surveys of 134 firms to investigate the factors that influence innovation adoption. The quantitative results support the qualitative findings that external pressures are a trigger among small to medium firms, while the larger firm’s adoption was associated with internal causes.