135 results for Minkowski sum
Abstract:
We consider complexity penalization methods for model selection. These methods aim to choose a model to optimally trade off estimation and approximation errors by minimizing the sum of an empirical risk term and a complexity penalty. It is well known that if we use a bound on the maximal deviation between empirical and true risks as a complexity penalty, then the risk of our choice is no more than the approximation error plus twice the complexity penalty. There are many cases, however, where complexity penalties like this give loose upper bounds on the estimation error. In particular, if we choose a function from a suitably simple convex function class with a strictly convex loss function, then the estimation error (the difference between the risk of the empirical risk minimizer and the minimal risk in the class) approaches zero at a faster rate than the maximal deviation between empirical and true risks. In this paper, we address the question of whether it is possible to design a complexity penalized model selection method for these situations. We show that, provided the sequence of models is ordered by inclusion, in these cases we can use tight upper bounds on estimation error as a complexity penalty. Surprisingly, this is the case even in situations when the difference between the empirical risk and true risk (and indeed the error of any estimate of the approximation error) decreases much more slowly than the complexity penalty. We give an oracle inequality showing that the resulting model selection method chooses a function with risk no more than the approximation error plus a constant times the complexity penalty.
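The selection rule the abstract describes can be written schematically as follows (the notation here is a generic sketch of penalized empirical risk minimization, not the paper's exact statement):

```latex
% Within each class \mathcal{F}_k, minimize the empirical risk \hat{R}_n;
% then pick the class by adding a complexity penalty \mathrm{pen}_n(k):
\hat{f}_k = \operatorname*{arg\,min}_{f \in \mathcal{F}_k} \hat{R}_n(f),
\qquad
\hat{k} = \operatorname*{arg\,min}_{k} \Bigl\{ \hat{R}_n(\hat{f}_k) + \mathrm{pen}_n(k) \Bigr\}.
% An oracle inequality of the kind described bounds the excess risk by the
% best trade-off of approximation error and penalty over the model sequence:
R(\hat{f}_{\hat{k}}) - R^*
\;\le\; \inf_{k} \Bigl\{ \inf_{f \in \mathcal{F}_k} R(f) - R^* + C \,\mathrm{pen}_n(k) \Bigr\}.
```

The paper's contribution is that, for nested models with strictly convex losses, $\mathrm{pen}_n(k)$ can be a tight bound on the estimation error rather than the (slower-decaying) maximal deviation between empirical and true risks.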
Abstract:
The explanation of social inequalities in education is still a debated issue in economics. Recent empirical studies tend to downplay the potential role of credit constraint. This article tests a different potential explanation of social inequalities in education, specifically that social differences in aspiration level result in different educational choices. Having existed for a long time in the sociology of education, this explanation can be justified if aspiration levels are seen as reference points in a prospect theory framework. In order to test this explanation, this article applies the method of experimental economics to the issue of education choice and behaviour. One hundred and twenty-nine individuals participated in an experiment in which they had to perform a task over 15 stages grouped in three blocks or levels. In order to continue through the experiment, a minimum level of success was required at the end of each level. Rewards were dependent on the final level successfully reached. At the end of each level, participants could either choose to stop and take their reward or to pay a cost to continue further in order to possibly receive higher rewards. To test the impact of aspiration levels, outcomes were either presented as gains or losses relative to an initial sum. In accordance with the theoretical predictions, participants in the loss framing group chose to go further in the experiment. There was also a significant and interesting gender effect in the loss framing treatment, such that males performed better and reached higher levels.
Abstract:
In previous research (Chung et al., 2009), the potential of the continuous risk profile (CRP) to proactively detect the systematic deterioration of freeway safety levels was presented. In this paper, this potential is investigated further, and an algorithm is proposed for proactively detecting sites where the collision rate is not sufficiently high to be classified as a high collision concentration location but where a systematic deterioration of safety level is observed. The approach proposed compares the weighted CRP across different years and uses the cumulative sum (CUSUM) algorithm to detect the sites where changes in collision rate are observed. The CRPs of the detected sites are then compared for reproducibility. When high reproducibility is observed, a growth factor is used for sequential hypothesis testing to determine if the collision profiles are increasing over time. Findings from applying the proposed method using empirical data are documented in the paper together with a detailed description of the method.
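The CUSUM step in the pipeline described above can be illustrated with a minimal one-sided detector; the function name and the slack/threshold parameters below are illustrative assumptions, not the paper's implementation:

```python
def cusum_detect(samples, target_mean, slack, threshold):
    """One-sided CUSUM: flag an upward shift in the mean of `samples`.

    Accumulates excesses over (target_mean + slack); the statistic resets
    to zero while observations look in-control, and a change is declared
    once the accumulated excess crosses `threshold`.
    """
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return i  # index at which the change is declared
    return None  # no change detected
```

Applied to a collision-rate series, `target_mean` would be the in-control rate; the slack parameter trades off sensitivity against false alarms.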
Abstract:
Endocytosis is the process by which cells internalise molecules including nutrient proteins from the extracellular medium. In one form, macropinocytosis, the membrane at the cell surface ruffles and folds over to give rise to an internalised vesicle. Negatively charged phospholipids within the membrane called phosphoinositides then undergo a series of transformations that are critical for the correct trafficking of the vesicle within the cell, and which are often pirated by pathogens such as Salmonella. Advanced fluorescent video microscopy imaging now allows the detailed observation and quantification of these events in live cells over time. Here we use these observations as a basis for building differential equation models of the transformations. In an initial investigation, these interactions were modelled with reaction rates proportional to the sum of the concentrations of the individual constituents. This yields a first-order linear system for the concentrations. The structure of the system enables analytical expressions to be obtained and the problem becomes one of determining the reaction rates which generate the observed data plots. We present results with reaction rates which capture the general behaviour of the reactions so that we now have a complete mathematical model of phosphoinositide transformations that fits the experimental observations. Some excellent fits are obtained with modulated exponential functions; however, these are not solutions of the linear system. The question arises as to how the model may be modified to obtain a system whose solution provides a more accurate fit.
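As a minimal illustration of such a first-order linear system, consider a hypothetical two-pool conversion A → B with rate constant k (a drastic simplification of the phosphoinositide network, with invented names and rates); its closed-form solution can be written directly:

```python
import math

def two_pool_model(a0, k, t):
    """Closed-form solution of dA/dt = -k*A, dB/dt = k*A with A(0)=a0, B(0)=0.

    Total mass is conserved, so B is whatever A has lost by time t.
    """
    a = a0 * math.exp(-k * t)  # exponential decay of the source pool
    b = a0 - a                 # accumulated product pool
    return a, b
```

Fitting such a model amounts to choosing the rate constants so that the exponential solutions track the measured fluorescence curves; the abstract's point is that modulated exponentials fit better but are not solutions of a purely linear system.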
Abstract:
Magnetic Resonance Imaging was used to study changes in the crystalline lens and ciliary body with accommodation and aging. Monocular images were obtained in 15 young (19-29 years) and 15 older (60-70 years) emmetropes when viewing at far (6m) and at individual near points (14.5 to 20.9 cm) in the younger group. With accommodation, lens thickness increased (mean±95% CI: 0.33±0.06mm) by a similar magnitude to the decrease in anterior chamber depth (0.31±0.07mm) and equatorial diameter (0.32±0.04mm), with a decrease in the radius of curvature of the posterior lens surface (0.58±0.30mm). Anterior lens surface shape could not be determined due to the overlapping region with the iris. Ciliary ring diameter decreased (0.44±0.17mm) with no decrease in circumlental space or forward ciliary body movement. With aging, lens thickness increased (mean±95% CI: 0.97±0.24mm) by a magnitude similar to the sum of the decrease in anterior chamber depth (0.45±0.21mm) and increase in anterior segment depth (0.52±0.23mm). Equatorial lens diameter increased (0.28±0.23mm) with no change in the posterior lens surface radius of curvature. Ciliary ring diameter decreased (0.57±0.41mm) with reduced circumlental space (0.43±0.15mm) and no forward ciliary body movement. Accommodative changes support the Helmholtz theory of accommodation including an increase in posterior lens surface curvature. Certain aspects of aging changes mimic accommodation.
Abstract:
Background Although physical activity is associated with health-related quality of life (HRQL), the nature of the dose-response relationship remains unclear. This study examined the concurrent and prospective dose-response relationships between total physical activity (TPA) and (only) walking with HRQL in two age cohorts of women. Methods Participants were 10,698 women born in 1946-1951 and 7,646 born in 1921-1926, who completed three mailed surveys for the Australian Longitudinal Study on Women's Health. They reported weekly TPA minutes (sum of walking, moderate, and vigorous minutes). HRQL was measured with the Medical Outcomes Study Short-Form 36 Health Status Survey (SF-36). Linear mixed models, adjusted for socio-demographic and health-related variables, were used to examine associations between TPA level (none, very low, low, intermediate, sufficient, high, and very high) and SF-36 scores. For women who reported walking as their only physical activity, associations between walking and SF-36 scores were also examined. Results Curvilinear trends were observed between TPA and walking with SF-36 scores. Concurrently, HRQL scores increased significantly with increasing TPA and walking, in both cohorts, with increases less marked above sufficient activity levels. Prospectively, associations were attenuated although significant and meaningful improvements in physical functioning and vitality were observed across most TPA and walking categories above the low category. Conclusion For women in their 50s-80s without clinical depression, greater amounts of TPA are associated with better current and future HRQL, particularly physical functioning and vitality. Even if walking is their only activity, women, particularly those in their 70s-80s, have better health-related quality of life.
Abstract:
Language-use has proven to be the most complex and complicating of all Internet features, yet people and institutions invest enormously in language and cross-language features because they are fundamental to the success of the Internet’s past, present and future. The thesis focuses on the development of the latter – features that facilitate and signify linking between or across languages – both in their historical and current contexts. In the theoretical analysis, the conceptual platform of inter-language linking is developed both to accommodate efforts towards a new social complexity model for the co-evolution of languages and language content, and to create an open analytical space for language and cross-language related features of the Internet and beyond. The practiced uses of inter-language linking have changed over the last decades. Before and during the first years of the WWW, mechanisms of inter-language linking were at best important elements used to create new institutional or content arrangements, but at scale they were insignificant. This has changed with the emergence of the WWW and its development into a web in which content in different languages co-evolves. The thesis traces the inter-language linking mechanisms that facilitated these dynamic changes by analysing what these linking mechanisms are, how their historical as well as current contexts can be understood and what kinds of cultural-economic innovation they enable and impede. The study discusses this alongside four empirical cases of bilingual or multilingual media use, ranging from television and web services for languages of smaller populations to large-scale web ventures involving multiple languages by the British Broadcasting Corporation, the Special Broadcasting Service Australia, Wikipedia and Google.
To sum up, the thesis introduces the concepts of ‘inter-language linking’ and the ‘lateral web’ to model the social complexity and co-evolution of languages online. The resulting model reconsiders existing social complexity models in that it is the first that can explain the emergence of large-scale, networked co-evolution of languages and language content facilitated by the Internet and the WWW. Finally, the thesis argues that the Internet enables an open space for language and cross-language related features and investigates how far this process is facilitated by (1) amateurs and (2) human-algorithmic interaction cultures.
Abstract:
Public interest in volunteering in Australia has markedly escalated over the past five years, reflected in a number of publications in the popular, professional and academic press. This interest is welcome, and in many ways, is long overdue. Volunteering or, to employ a term we find more useful, voluntarism, is important for a number of reasons, not least of which is its structural role in the social institutions we have developed to support people, manage dependencies and facilitate a range of developmental activities across the life span. Voluntarism is an extremely complex social phenomenon. Conceptually, it transcends the sum of its parts, in that it is more than a simple aggregation of instances of individual behaviours. Our core argument here is that this complexity is such that equally intricate and multi-faceted perspectives and models need to be employed to further our understanding. In academic speak, this means that we need to develop analytical frameworks that draw on the breadth and depth of the social sciences...
Abstract:
The quick detection of abrupt (unknown) parameter changes in an observed hidden Markov model (HMM) is important in several applications. Motivated by the recent application of relative entropy concepts in the robust sequential change detection problem (and the related model selection problem), this paper proposes a sequential unknown change detection algorithm based on a relative entropy based HMM parameter estimator. Our proposed approach is able to overcome the lack of knowledge of post-change parameters, and is illustrated to have similar performance to the popular cumulative sum (CUSUM) algorithm (which requires knowledge of the post-change parameter values) when examined, on both simulated and real data, in a vision-based aircraft manoeuvre detection problem.
Abstract:
Background The adverse consequences of lymphedema following breast cancer in relation to physical function and quality of life are clear; however, its potential relationship with survival has not been investigated. Our purpose was to determine the prevalence of lymphedema and associated upper-body symptoms at 6 years following breast cancer and to examine the prognostic significance of lymphedema with respect to overall 6-year survival (OS). Methods and Results A population-based sample of Australian women (n=287) diagnosed with invasive, unilateral breast cancer was followed for a median of 6.6 years and prospectively assessed for lymphedema (using bioimpedance spectroscopy [BIS], sum of arm circumferences [SOAC], and self-reported arm swelling), a range of upper-body symptoms, and vital status. OS was measured from date of diagnosis to date of death or last follow-up. Kaplan-Meier methods were used to calculate OS and Cox proportional hazards models quantified the risk associated with lymphedema. Approximately 45% of women had reported at least one moderate to extreme symptom at 6.6 years postdiagnosis, while 34% had shown clinical evidence of lymphedema, and 48% reported arm swelling at least once since baseline assessment. A total of 27 (9.4%) women died during the follow-up period, and lymphedema, diagnosed by BIS or SOAC between 6–18 months postdiagnosis, predicted mortality (BIS: HR=2.5; 95% CI: 0.9, 6.8, p=0.08; SOAC: 3.0; 95% CI: 1.1, 8.7, p=0.04). There was no association (HR=1.2; 95% CI: 0.5, 2.6, p=0.68) between self-reported arm swelling and OS. Conclusions These findings suggest that lymphedema may influence survival following breast cancer treatment and warrant further investigation in other cancer cohorts and explication of a potential underlying biology.
Abstract:
The lymphedema diagnostic method used in descriptive or intervention studies may influence results found. The purposes of this work were to compare baseline lymphedema prevalence in the physical activity and lymphedema (PAL) trial cohort and to subsequently compare the effect of the weight-lifting intervention on lymphedema, according to four standard diagnostic methods. The PAL trial was a randomized controlled intervention study, involving 295 women who had previously been treated for breast cancer, and evaluated the effect of 12 months of weight lifting on lymphedema status. Four diagnostic methods were used to evaluate lymphedema outcomes: (i) interlimb volume difference through water displacement, (ii) interlimb size difference through sum of arm circumferences, (iii) interlimb impedance ratio using bioimpedance spectroscopy, and (iv) a validated self-report survey. Of the 295 women who participated in the PAL trial, between 22 and 52% were considered to have lymphedema at baseline according to the four diagnostic criteria used. No between-group differences were noted in the proportion of women who had a change in interlimb volume, interlimb size, interlimb ratio, or survey score of ≥5, ≥5, ≥10%, and 1 unit, respectively (cumulative incidence ratio at study end for each measure ranged between 0.6 and 0.8, with confidence intervals spanning 1.0). The variation in proportions of women within the PAL trial considered to have lymphedema at baseline highlights the potential impact of the diagnostic criteria on population surveillance regarding prevalence of this common morbidity of treatment. Importantly though, progressive weight lifting was shown to be safe for women following breast cancer, even for those at risk or with lymphedema, irrespective of the diagnostic criteria used.
Abstract:
The purpose of this paper is to investigate the essential elements of sport management in Australia in the 1990s. The essential purpose is to view these elements from a legal perspective. In the past 12 months there have been at least three conferences in the sports law area. The majority of this paper has been allocated to the area of legal liability, especially the legal relationships evolving between the player and his co-participant, the player and his club, the player and his coach, and the duties and liabilities of the coach and the club. The area of insurance will also be discussed as it is a vital element in protecting the players, coaches and clubs in the event of any litigation. A well-publicised case was that of Rogers v Bugden where the plaintiff Steven Rogers, who was a first grade rugby league football player for Cronulla, suffered a broken jaw and sued his co-participant Mark Bugden and Bugden's employer Canterbury/Bankstown District Rugby League Football Club. It was held that there was a contract of employment and Canterbury/Bankstown was found to be vicariously liable and was ordered to pay Rogers the sum of $68,154.00. Legal actions in tort and negligence are increasing. Sports managers will need to investigate thoroughly the protection available for their clients.
Abstract:
Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early stage forecasts are characterized by the minimal amount of information available concerning the new (target) project, to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli’s law of large numbers implies that this base group should be as large as possible. However, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base group projects is known as the homogeneity or pooling problem. A method of solving the homogeneity problem is described that uses closed-form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, where a series of targets are extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects (with different project types: Residential, Commercial centre, Car parking, Social community centre, School, Office, Hotel, Industrial, University and Hospital) clustered into base groups according to their type and size.
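The cross-validation idea, forecasting each target project's cost as the mean contract sum of the remaining base-group projects, can be sketched in a leave-one-out form (a simplification of the paper's sampling arrangements; the function name and error convention are ours):

```python
def loo_mean_forecast_errors(costs):
    """Leave-one-out cross-validation of a base-group mean forecast.

    Each project in turn plays the target: its forecast is the mean
    contract sum of all the *other* projects, and the returned list
    holds the forecast errors (actual minus forecast).
    """
    n = len(costs)
    total = sum(costs)
    return [c - (total - c) / (n - 1) for c in costs]
```

Aggregating such errors (for example their mean square) over base groups of different sizes is one way to expose the homogeneity trade-off: larger groups reduce sampling noise but admit less similar projects.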
Abstract:
Traditional area-based matching techniques make use of similarity metrics such as the Sum of Absolute Differences (SAD), Sum of Squared Differences (SSD) and Normalised Cross Correlation (NCC). Non-parametric matching algorithms such as the rank and census rely on the relative ordering of pixel values rather than the pixels themselves as a similarity measure. Both traditional area-based and non-parametric stereo matching techniques have an algorithmic structure which is amenable to fast hardware realisation. This investigation undertakes a performance assessment of these two families of algorithms for robustness to radiometric distortion and random noise. A generic implementation framework is presented for the stereo matching problem, and the relative hardware requirements of the various metrics are investigated.
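The three area-based similarity metrics named above can be sketched on 1-D windows as follows (a minimal illustration; real stereo matchers apply them to 2-D image patches along the epipolar line):

```python
import math

def sad(a, b):
    """Sum of Absolute Differences: lower is more similar."""
    return sum(abs(x - y) for x, y in zip(a, b))

def ssd(a, b):
    """Sum of Squared Differences: lower is more similar."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def ncc(a, b):
    """Normalised Cross Correlation: 1.0 means perfectly correlated."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den
```

NCC's mean subtraction and variance normalisation make it invariant to affine radiometric changes (gain and offset) between the two images, which SAD and SSD are not; this is the kind of robustness difference the assessment examines.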
Abstract:
This thesis is concerned with creating and evaluating interactive art systems that facilitate emergent participant experiences. For the purposes of this research, interactive art is the computer based arts involving physical participation from the audience, while emergence is when a new form or concept appears that was not directly implied by the context from which it arose. This emergent ‘whole’ is more than a simple sum of its parts. The research aims to develop understanding of the nature of emergent experiences that might arise during participant interaction with interactive art systems. It also aims to understand the design issues surrounding the creation of these systems. The approach used is Practice-based, integrating practice, evaluation and theoretical research. Practice used methods from Reflection-in-action and Iterative design to create two interactive art systems: Glass Pond and +-now. Creation of +-now resulted in a novel method for instantiating emergent shapes. Both art works were also evaluated in exploratory studies. In addition, a main study with 30 participants was conducted on participant interaction with +-now. These sessions were video recorded and participants were interviewed about their experience. Recordings were transcribed and analysed using Grounded theory methods. Emergent participant experiences were identified and classified using a taxonomy of emergence in interactive art. This taxonomy draws on theoretical research. The outcomes of this Practice-based research are summarised as follows. Two interactive art systems, where the second work clearly facilitates emergent interaction, were created. Their creation involved the development of a novel method for instantiating emergent shapes and it informed aesthetic and design issues surrounding interactive art systems for emergence. A taxonomy of emergence in interactive art was also created. 
Other outcomes are the evaluation findings about participant experiences, including different types of emergence experienced and the coding schemes produced during data analysis.