18 results for residuals

in Deakin Research Online - Australia


Relevance: 10.00%

Abstract:

Mass and length growth models were determined for male (n = 69) and female (n = 163) Australian fur seals (Arctocephalus pusillus doriferus) collected at a breeding colony on Seal Rocks (38˚31′S, 145˚06′E), Bass Strait, in south-east Australia, between February and November during 1970–72. Growth was best described by the logistic model in males and the von Bertalanffy model in females. Asymptotic mass and length were 229 kg and 221 cm for males, and 85 kg and 163 cm for females. In all, 95% of asymptotic mass and length were attained by 11 years and 11 years, respectively, in males compared with 9 years and 5 years, respectively, in females. Males grew in length faster than females and experienced a growth spurt in mass coinciding with the onset of puberty (4–5 years). The onset of puberty in females occurs when approximately 86% of asymptotic length is attained. The rate of growth and sexual development in Australian fur seals is similar to (if not faster than) that in the conspecific Cape fur seal (A. p. pusillus), which inhabits the nutrient-rich Benguela current. This suggests that the low marine productivity of Bass Strait may not be the cause of the slow rate of recovery of the Australian fur seal population following the severe over-exploitation of the commercial sealing era. Sternal blubber depth was positively correlated in adult animals with a body condition index derived from the residuals of the mass–length relationship (males: r2 = 0.38, n = 19, P < 0.001; females: r2 = 0.22, n = 92, P < 0.001), confirming the validity of using such indices on otariids. Sternal blubber depth varied significantly with season in adult animals. In males it was lowest in winter and increased during spring prior to the breeding season (r2 = 0.39, n = 19, P < 0.03) whereas in females it was greatest during winter (r2 = 0.05, n = 122, P < 0.05).
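The body condition index described in this abstract is derived from the residuals of the mass–length relationship. A minimal sketch of the idea, using made-up mass and length values rather than the study's measurements:

```python
import numpy as np

# Hypothetical mass (kg) and length (cm) values, for illustration only
length = np.array([150.0, 160.0, 170.0, 180.0, 190.0, 200.0])
mass = np.array([60.0, 75.0, 88.0, 110.0, 128.0, 150.0])

# Fit an allometric mass-length relationship: log(mass) = a + b*log(length)
X = np.column_stack([np.ones_like(length), np.log(length)])
coef, *_ = np.linalg.lstsq(X, np.log(mass), rcond=None)

# Condition index = residuals from the fitted relationship; a positive
# value marks an animal heavier than expected for its length
condition_index = np.log(mass) - X @ coef
```

The abstract reports that an index of this kind correlates positively with sternal blubber depth, supporting its use as a condition measure in otariids.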

Relevance: 10.00%

Abstract:

We consider the use of Ordered Weighted Averaging (OWA) in linear regression. Our goal is to replace the traditional least squares, least absolute deviation, and maximum likelihood criteria with an OWA function of the residuals. We obtain several high-breakdown robust regression methods as special cases (least median, least trimmed squares, trimmed likelihood methods). We also present new formulations of the regression problem. OWA-based regression is particularly useful in the presence of outliers.

Relevance: 10.00%

Abstract:

We consider an application of fuzzy logic connectives to statistical regression. We replace the standard least squares, least absolute deviation, and maximum likelihood criteria with an ordered weighted averaging (OWA) function of the residuals. Depending on the choice of the weights, we obtain the standard regression problems, high-breakdown robust methods (least median, least trimmed squares, and trimmed likelihood methods), as well as new formulations. We present various approaches to numerical solution of such regression problems. OWA-based regression is particularly useful in the presence of outliers, and we illustrate the performance of the new methods on several instances of linear regression problems with multiple outliers.
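As a concrete illustration of one OWA special case named above: least trimmed squares corresponds to OWA weights of 1 on the h smallest sorted squared residuals and 0 elsewhere. The sketch below solves it on synthetic data with the standard concentration-step ("C-step") heuristic; this is an illustrative implementation, not the authors' numerical method:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 40)
y[:5] += 50.0   # gross outliers

def ols(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Least trimmed squares via concentration steps: repeatedly refit OLS
# on the h observations with the smallest squared residuals. This is
# the OWA criterion whose sorted-residual weights are 1 for the h
# smallest squared residuals and 0 elsewhere.
h = 30
beta = ols(x, y)
for _ in range(20):
    r2 = (y - beta[0] - beta[1] * x) ** 2
    keep = np.argsort(r2)[:h]
    beta = ols(x[keep], y[keep])

intercept, slope = beta
```

Despite five gross outliers, the trimmed criterion recovers the underlying line, whereas plain least squares would be pulled far off it.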

Relevance: 10.00%

Abstract:

Using Rasch analysis, the psychometric properties of a newly developed 35-item parent-proxy instrument, the Caregiver Assessment of Movement Participation (CAMP), designed to measure movement participation problems in children with Developmental Coordination Disorder, were examined. The CAMP was administered to 465 school children aged 5–10 years. Thirty of the 35 items were retained as they had acceptable infit and outfit statistics. Item separation (7.48) and child separation (3.16) were good; moreover, the CAMP had excellent reliability (Reliability Index for item = 0.98; Person = 0.91). Principal components analysis of item residuals confirmed the unidimensionality of the instrument. Based on category probability statistics, the original five-point scale was collapsed into a four-point scale. The item threshold calibration of the CAMP with the Movement Assessment Battery for Children Test was computed. The results indicated that a CAMP total score of 75 is the optimal cut-off point for identifying children at risk of movement problems.
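The CAMP was calibrated with a polytomous Rasch model; the full rating-scale machinery is beyond a short sketch, but the dichotomous Rasch model below illustrates the core idea the family shares: the probability of endorsing an item is a logistic function of the gap between person ability and item difficulty (the symbols here are generic, not CAMP parameters):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability that a person with ability
    theta endorses an item of difficulty b (both on the same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When ability equals difficulty the probability is exactly 0.5; fit statistics such as infit and outfit, used above to retain 30 of the 35 items, summarise residuals between observed responses and these model probabilities.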

Relevance: 10.00%

Abstract:

Detection of the lane boundaries of a road, based on images or video taken by a video capture device in a suburban environment, is a challenging task. In this paper, a novel lane detection algorithm that does not rely on camera parameters is proposed; it robustly detects lane boundaries in real time, especially on suburban roads. Initially, the proposed method fits the CIE L*a*b*-transformed road chromaticity values (that is, the a* and b* values) to a bivariate Gaussian model, followed by classification of the road area based on Mahalanobis distance. Secondly, the classified road area acts as an arbitrarily shaped region of interest (AROI) for extracting blobs from the image filtered by a two-dimensional Gabor filter; this constitutes the first image cue. Thirdly, a second image cue is employed to obtain an entropy image. The results from the colour-based cue and the entropy cue are then integrated, followed by an outlier-removal process. Finally, the correct road lane points are fitted with Bezier splines, whose control points can form arbitrary shapes. The algorithm was implemented and experiments were carried out on suburban roads. The results show the effectiveness of the algorithm in producing more accurate lane boundaries on curved sections and in the presence of other objects on the road.
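The first stage, classifying road pixels from a bivariate Gaussian model of a* and b* chromaticity via Mahalanobis distance, can be sketched as follows (the chromaticity samples and the threshold are illustrative assumptions, not the paper's values):

```python
import numpy as np

# Hypothetical (a*, b*) chromaticity samples from a seed road region
rng = np.random.default_rng(1)
road_ab = rng.normal(loc=[2.0, 12.0], scale=[1.0, 1.5], size=(500, 2))

# Fit the bivariate Gaussian road-colour model
mu = road_ab.mean(axis=0)
cov = np.cov(road_ab, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis2(pixels_ab):
    """Squared Mahalanobis distance of each (a*, b*) pixel to the road model."""
    d = pixels_ab - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

def is_road(pixels_ab, threshold=9.21):
    """Classify pixels as road within a chi-square-style threshold
    (9.21 ~ 99% quantile of chi2 with 2 dof; an illustrative choice)."""
    return mahalanobis2(pixels_ab) <= threshold
```

Pixels close to the fitted road colour distribution pass the test and form the arbitrarily shaped region of interest for the later cues.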

Relevance: 10.00%

Abstract:

Nearly all drinking water distribution systems experience a "natural" reduction of disinfection residuals. The most frequently used disinfectant is chlorine, which can decay due to reactions with organic and inorganic compounds in the water and by liquid/solids reaction with the biofilm, pipe walls and sediments. Usually levels of 0.2-0.5 mg/L of free chlorine are required at the point of consumption to maintain bacteriological safety. Higher concentrations are not desirable as they present the problems of taste and odour and increase formation of disinfection by-products. It is usually a considerable concern for the operators of drinking water distribution systems to manage chlorine residuals at the "optimum level", considering all these issues. This paper describes how the chlorine profile in a drinking water distribution system can be modelled and optimised on the basis of readily and inexpensively available laboratory data. Methods are presented for deriving the laboratory data, fitting a chlorine decay model of bulk water to the data and applying the model, in conjunction with a simplified hydraulic model, to obtain the chlorine profile in a distribution system at steady flow conditions. Two case studies are used to demonstrate the utility of the technique. Melbourne's Greenvale-Sydenham distribution system is unfiltered and uses chlorination as its only treatment. The chlorine model developed from laboratory data was applied to the whole system and the chlorine profile was shown to be accurately simulated. Biofilm was not found to critically affect chlorine decay. In the other case study, Sydney Water's Nepean system was modelled from limited hydraulic data. Chlorine decay and trihalomethane (THM) formation in raw and treated water were measured in a laboratory, and a chlorine decay and THM model was derived on the basis of these data. Simulated chlorine and THM profiles agree well with the measured values available. 
Various applications of this modelling approach are also briefly discussed.
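Fitting a bulk chlorine decay model to laboratory data, the first step described above, might look like the following sketch, assuming a simple first-order decay law and hypothetical bottle-test data (the paper's actual decay model may be more elaborate):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical laboratory decay-test data: time (h), free chlorine (mg/L)
t = np.array([0.0, 2.0, 5.0, 10.0, 24.0, 48.0])
c = np.array([1.00, 0.92, 0.81, 0.66, 0.37, 0.14])

def first_order(t, c0, k):
    # Simple single-coefficient bulk decay: dC/dt = -k*C
    return c0 * np.exp(-k * t)

(c0_hat, k_hat), _ = curve_fit(first_order, t, c, p0=[1.0, 0.05])
```

Combined with travel times from a (possibly simplified) hydraulic model, the fitted curve evaluated at each node's water age then yields the chlorine profile of the system under steady flow.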


Relevance: 10.00%

Abstract:

A simple biofilm model was developed to describe the growth of bacteria in drinking water biofilms and their subsequent interactions with disinfectant residuals, incorporating the important processes: attachment of free bacteria to the biofilm on a wall surface, detachment of bacteria from the biofilm, growth of biofilm bacteria with chloramine inhibition, chloramine decay in the bulk water phase, and chloramine decay due to biofilm bacteria and wall surfaces. The model is useful in evaluating the biological stability of different waters, as it can predict the concentration of organic substances in the water. In addition, the model can be used to predict bacterial growth and biofilm decay in distribution systems. A model of this kind is a useful tool in developing system management strategies to ultimately improve drinking water quality.
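A minimal sketch of a model of this kind couples chloramine decay to biofilm growth with disinfectant inhibition; the equation structure and every parameter value below are illustrative assumptions, not the paper's calibrated model:

```python
from scipy.integrate import solve_ivp

# All values are illustrative assumptions, not calibrated parameters
k_bulk = 0.01    # 1/h, chloramine decay in the bulk water phase
k_wall = 0.002   # 1/h per unit biomass, decay via biofilm and wall surfaces
mu_max = 0.2     # 1/h, maximum biofilm growth rate
B_max = 10.0     # carrying capacity of the biofilm
K_inhib = 0.5    # mg/L, chloramine level that halves biofilm growth
k_det = 0.05     # 1/h, detachment of bacteria from the biofilm

def rhs(t, y):
    chloramine, biofilm = y
    # logistic biofilm growth, inhibited by the chloramine residual
    growth = mu_max * biofilm * (1.0 - biofilm / B_max) / (1.0 + chloramine / K_inhib)
    d_chloramine = -k_bulk * chloramine - k_wall * biofilm * chloramine
    d_biofilm = growth - k_det * biofilm
    return [d_chloramine, d_biofilm]

# 100 h simulation from 1.5 mg/L chloramine and a small initial biofilm
sol = solve_ivp(rhs, (0.0, 100.0), [1.5, 1.0])
chloramine_end, biofilm_end = sol.y[:, -1]
```

The qualitative behaviour the abstract describes emerges directly: the chloramine residual decays monotonically, and as inhibition weakens the biofilm grows towards a steady level.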

Relevance: 10.00%

Abstract:

Regression lies at the heart of statistics; it is one of the most important of the multivariate techniques available for extracting knowledge in almost every field of study and research. Nowadays it has drawn huge interest for performing tasks in fields such as machine learning, pattern recognition and data mining. Investigating outliers (exceptional observations) is a century-long problem for data analysts and researchers. Blind application of data can have dangerous consequences, leading to the discovery of meaningless patterns and to imperfect knowledge. As a result of the digital revolution and the growth of the Internet and intranets, data continue to accumulate at an exponential rate, so the importance of detecting outliers, and of studying their costs and benefits as a tool for reliable knowledge discovery, demands close attention. Investigating outliers in regression has received great attention over the last few decades within two schools of thought: robust regression and regression diagnostics. Robust regression first fits a regression to the majority of the data and then discovers outliers as those points that possess large residuals from the robust fit, whereas in regression diagnostics one first finds the outliers, deletes or corrects them, and then fits the regular data by classical (usual) methods. At the beginning there was much confusion, but researchers have now reached a consensus: robustness and diagnostics are two complementary approaches to the analysis of data, and neither alone is good enough. In this chapter, we discuss both of them under the unified spectrum of regression diagnostics. The chapter explains the necessity and views of regression diagnostics and presents several contemporary methods through numerical examples in linear regression within each of the aforesaid categories, together with current challenges and possible future research directions. 
Our aim is to make the chapter self-contained while maintaining its general accessibility.
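The diagnostics workflow sketched in this abstract (find outliers, delete them, refit by classical methods) can be illustrated with internally studentized residuals; the 2.5 cut-off below is a conventional illustrative choice, not a prescription from the chapter:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 30)
y = 3.0 * x + rng.normal(0.0, 0.05, 30)
y[7] += 2.0   # plant a single gross outlier

X = np.column_stack([np.ones_like(x), x])

def ols_fit(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# -- Diagnostic step: internally studentized residuals --
beta = ols_fit(X, y)
resid = y - X @ beta
H = X @ np.linalg.inv(X.T @ X) @ X.T            # hat matrix
s2 = resid @ resid / (len(y) - X.shape[1])      # residual variance
studentized = resid / np.sqrt(s2 * (1.0 - np.diag(H)))

# -- Delete flagged points, then refit by the classical method --
flagged = np.abs(studentized) > 2.5
beta_clean = ols_fit(X[~flagged], y[~flagged])
```

The robust-regression school would instead fit a high-breakdown estimator first and read the outliers off its residuals; as the chapter argues, the two routes are complementary.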

Relevance: 10.00%

Abstract:

Compressed sensing (CS) is a new information sampling theory for acquiring sparse or compressible data with far fewer measurements than otherwise required by the Nyquist/Shannon counterpart. This is particularly important for some imaging applications such as magnetic resonance imaging or astronomy. However, in the existing CS formulation, the use of the ℓ2 norm on the residuals is not particularly efficient when the noise is impulsive. This could lead to an increase in the upper bound of the recovery error. To address this problem, we consider a robust formulation for CS to suppress outliers in the residuals. We propose an iterative algorithm for solving the robust CS problem that exploits the power of existing CS solvers. We also show that the upper bound on the recovery error in the case of non-Gaussian noise is reduced, and then demonstrate the efficacy of the method through numerical studies.
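One way to suppress impulsive residuals, sketched below, is to replace the ℓ2 data term of an iterative soft-thresholding (ISTA) sparse solver with a Huber loss, whose gradient clips large residuals. This is a generic illustration of robust CS under that substitution, not necessarily the authors' algorithm, and all problem sizes and constants are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 40, 100
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
x_true = np.zeros(n)
x_true[[5, 37, 80]] = [1.0, -1.5, 2.0]     # sparse signal
y = A @ x_true
y[3] += 10.0                                # impulsive measurement noise
y[17] -= 8.0

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def robust_ista(A, y, lam=0.1, delta=0.1, step=0.1, iters=5000):
    """ISTA with the l2 data term replaced by a Huber loss, so that
    impulsive residuals contribute a clipped (bounded) gradient."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        r = A @ x - y
        psi = np.clip(r, -delta, delta)          # Huber loss gradient in r
        x = soft_threshold(x - step * (A.T @ psi), step * lam)
    return x

x_hat = robust_ista(A, y)
```

Because the composite objective (Huber data term plus ℓ1 penalty) is convex, the iteration converges globally, and the clipped gradient stops the two impulses from dominating the recovery.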

Relevance: 10.00%

Abstract:

We present an approach to computing high-breakdown regression estimators in parallel on graphics processing units (GPUs). We show that sorting the residuals is not necessary and can be substituted by calculating the median. We present and compare various methods to calculate the median and order statistics on GPUs. We introduce an alternative method based on the optimization of a convex function, and show its numerical superiority when calculating the order statistics of very large arrays on GPUs.
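The convex-optimisation view of the median referred to here is that the median minimises the sum of absolute residuals, so no sort is required. A serial sketch of that idea (the paper's contribution is a GPU-parallel method, which this does not attempt to reproduce):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
data = rng.normal(size=10001)   # odd length, so the sample median is unique

# The median minimises the convex function f(m) = sum(|x_i - m|),
# so it can be found by one-dimensional convex optimisation
def abs_deviation(m):
    return np.abs(data - m).sum()

res = minimize_scalar(abs_deviation, bounds=(data.min(), data.max()),
                      method="bounded")
median_opt = res.x
```

The evaluation of f(m) is a single reduction over the array, which is exactly the kind of operation that parallelises well, unlike a full sort.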

Relevance: 10.00%

Abstract:

In this note, we examine the size and power properties and the break date estimation accuracy of the Lee and Strazicich (LS, 2003) two-break endogenous unit root test, based on two different break date selection methods: minimising the test statistic and minimising the sum of squared residuals (SSR). Our results show that the performance of both Models A and C of the LS test is superior when one uses the minimising-SSR procedure.
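The minimising-SSR selection criterion can be illustrated with a simple level-shift regression searched over candidate break dates. The actual LS test involves unit-root test regressions with one or two breaks in level and/or trend, so this is only a sketch of the selection step on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 200
true_break = 120
y = np.r_[rng.normal(0.0, 1.0, true_break),
          rng.normal(3.0, 1.0, T - true_break)]   # mean shift at the break

def ssr_for_break(y, tb):
    """SSR of a simple level-shift regression with a break at date tb."""
    dummy = (np.arange(len(y)) >= tb).astype(float)
    X = np.column_stack([np.ones(len(y)), dummy])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

# Minimising-SSR selection: the break date whose regression fits best,
# with the sample ends trimmed as is conventional
candidates = range(10, T - 10)
best_tb = min(candidates, key=lambda tb: ssr_for_break(y, tb))
```

The alternative method examined in the note would instead choose the break date at which the unit root test statistic itself is minimised.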

Relevance: 10.00%

Abstract:

The detection of lane boundaries on suburban streets using images obtained from video constitutes a challenging task. This is mainly due to the difficulties associated with estimating the complex geometric structure of lane boundaries, the quality of lane markings as a result of wear, occlusions by traffic, and shadows cast by road-side trees and structures. Most of the existing techniques for lane boundary detection employ a single visual cue and will only work under certain conditions, where there are clear lane markings; better results are also achieved when no other on-road objects are present. This paper extends our previous work and discusses a novel lane boundary detection algorithm specifically addressing the above-mentioned issues through the integration of two visual cues. The first visual cue is based on stripe-like features found on lane lines, extracted using a two-dimensional symmetric Gabor filter. The second visual cue is based on a texture characteristic determined using the entropy measure of a predefined neighbourhood around a lane boundary line. The visual cues are then integrated using a rule-based classifier which incorporates a modified sequential covering algorithm to improve robustness. To separate lane boundary lines from other similar features, a road mask is generated using road chromaticity values estimated from a CIE L*a*b* colour transformation. Extraneous points around lane boundary lines are then removed by an outlier removal procedure based on studentized residuals. The lane boundary lines are then modelled with Bezier spline curves. To validate the algorithm, extensive experimental evaluation was carried out on suburban streets and the results are presented.
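The entropy-based texture cue can be illustrated by computing the Shannon entropy of the grey-level histogram in a small neighbourhood; the window size and bin count below are illustrative choices, not the paper's parameters:

```python
import numpy as np

def window_entropy(window, bins=16):
    """Shannon entropy (bits) of the grey-level histogram of a window,
    with intensities assumed to lie in [0, 1]."""
    hist, _ = np.histogram(window, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(6)
uniform_patch = np.full((9, 9), 0.5)     # smooth, featureless surface
textured_patch = rng.random((9, 9))      # cluttered, textured region
```

A flat patch has zero entropy while a textured one scores high, so thresholding the entropy image separates smooth road surface from textured regions around candidate lane lines.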

Relevance: 10.00%

Abstract:

Species distribution models have come under criticism for being too simplistic to make robust future forecasts, partly because they assume that climate is the main determinant of geographical range at large spatial extents and coarse resolutions, with non-climate predictors being important only at finer scales. We suggest that this paradigm might be obscured by species movement patterns. To explore this we used contrasting kangaroo (family Macropodidae) case studies: two species with relatively small, stable home ranges (Macropus giganteus and M. robustus) and three species with more extensive, adaptive ranging behaviour (M. antilopinus, M. fuliginosus and M. rufus). We predicted that non-climate predictors would be most influential to model fit and predictive performance at local spatial resolution for the former species and at landscape resolution for the latter. We compared residuals autocovariate boosted regression tree (RAC-BRT) model statistics with and without species-specific non-climate predictors (habitat, soil, fire, water and topography), at local and landscape spatial resolutions (5 and 50 km). As predicted, the influence of non-climate predictors on model fit and predictive performance (compared with climate-only models) was greater at 50 km than at 5 km resolution for M. rufus and M. fuliginosus, and the opposite trend was observed for M. giganteus. The results for M. robustus and M. antilopinus were inconclusive. Also notable was the difference in inter-scale importance of climate predictors in the presence of non-climate predictors. In conclusion, differences in autecology, particularly relating to space use, may contribute to the importance of non-climate predictors at a given scale, not model scale per se. Further exploration of this concept across a range of species is encouraged, and the findings may contribute to more effective conservation and management of species at ecologically meaningful scales. © 2014 Ecological Society of Australia.
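A residuals autocovariate (the RAC in RAC-BRT) augments a distribution model with a distance-weighted summary of neighbouring residuals to absorb spatial autocorrelation. A schematic sketch with synthetic site data, using a plain linear model in place of a boosted regression tree and an arbitrary 10 km neighbourhood:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100
coords = rng.random((n, 2)) * 50.0        # site locations (km), synthetic
env = rng.normal(size=n)                  # one environmental predictor
occ = 0.8 * env + rng.normal(0.0, 0.5, n) # response (e.g. abundance index)

# Step 1: residuals from the environment-only model
X = np.column_stack([np.ones(n), env])
beta, *_ = np.linalg.lstsq(X, occ, rcond=None)
resid = occ - X @ beta

# Step 2: residuals autocovariate = inverse-distance-weighted mean of
# neighbouring residuals within the chosen radius
def residuals_autocovariate(coords, resid, radius=10.0):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = np.where((d > 0) & (d <= radius), 1.0 / np.where(d > 0, d, 1.0), 0.0)
    sums = w.sum(axis=1)
    weighted = (w * resid).sum(axis=1)
    return np.where(sums > 0, weighted / np.where(sums > 0, sums, 1.0), 0.0)

rac = residuals_autocovariate(coords, resid)
# Step 3 (in the paper): refit the model - there, a boosted regression
# tree - with `rac` added as an extra predictor.
```

Comparing model statistics with and without the extra non-climate predictors, at 5 and 50 km resolutions, is then a matter of refitting the augmented model at each scale.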

Relevance: 10.00%

Abstract:

Phylogenetic generalised least squares (PGLS) is one of the most commonly employed phylogenetic comparative methods. The technique, a modification of generalised least squares, uses knowledge of phylogenetic relationships to produce an estimate of expected covariance in cross-species data. Closely related species are assumed to have more similar traits because of their shared ancestry and hence produce more similar residuals from the least squares regression line. By taking into account the expected covariance structure of these residuals, modified slope and intercept estimates are generated that can account for interspecific autocorrelation due to phylogeny. Here, we provide a basic conceptual background to PGLS for those unfamiliar with the approach. We describe the requirements for a PGLS analysis and highlight the packages that can be used to implement the method. We show how phylogeny is used to calculate the expected covariance structure in the data and how this is applied to the generalised least squares regression equation. We demonstrate how PGLS can incorporate information about phylogenetic signal, the extent to which closely related species truly are similar, and how it controls for this signal appropriately, thereby negating concerns about unnecessarily ‘correcting’ for phylogeny. In addition to discussing the appropriate way to present the results of PGLS analyses, we highlight some common misconceptions about the approach and commonly encountered problems with the method. These include misunderstandings about what phylogenetic signal refers to in the context of PGLS (residual errors, not the traits themselves), and issues associated with unknown or uncertain phylogeny.
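The PGLS estimator itself is ordinary generalised least squares with the expected covariance matrix built from the phylogeny; under a Brownian-motion model, the covariance of two species is the branch length they share from the root. A toy numerical sketch with an invented four-species tree and invented trait values:

```python
import numpy as np

# Toy phylogenetic covariance under Brownian motion: V[i, j] is the
# shared root-to-common-ancestor branch length of species i and j
# (values are illustrative, not from a real tree)
V = np.array([
    [1.0, 0.7, 0.2, 0.2],
    [0.7, 1.0, 0.2, 0.2],
    [0.2, 0.2, 1.0, 0.6],
    [0.2, 0.2, 0.6, 1.0],
])

x = np.array([1.0, 1.2, 3.0, 3.3])   # predictor trait (hypothetical)
y = np.array([2.1, 2.4, 5.9, 6.4])   # response trait (hypothetical)
X = np.column_stack([np.ones_like(x), x])

# PGLS = generalised least squares with phylogenetic covariance V:
#   beta = (X' V^-1 X)^-1 X' V^-1 y
Vinv = np.linalg.inv(V)
beta_pgls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
intercept, slope = beta_pgls
```

Setting V to the identity recovers ordinary least squares, which is how PGLS interpolates between "no correction" and full phylogenetic weighting when a signal parameter scales the off-diagonal elements.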