949 results for CHANGE-POINT
Abstract:
Introduction: Standing radiographs are the ‘gold standard’ for clinical assessment of adolescent idiopathic scoliosis (AIS), with the Cobb Angle used to measure the severity and progression of the scoliotic curve. Supine imaging modalities can provide valuable 3D information on scoliotic anatomy; however, due to changes in gravitational loading direction, the geometry of the spine alters between the supine and standing positions, which in turn affects the Cobb Angle measurement. Previous studies have consistently reported a 7-10° [1-3] Cobb Angle increase from supine to standing; however, none have reported the effect of endplate pre-selection or which (if any) curve parameters affect the supine to standing Cobb Angle difference. Methods: Female AIS patients with right-sided thoracic major curves were included in this retrospective study. Clinically measured Cobb Angles from existing standing coronal radiographs and fulcrum bending radiographs [4] were compared to existing low-dose supine CT scans taken within 3 months of the reference radiograph. Reformatted coronal CT images were used to measure Cobb Angle variability with and without endplate pre-selection (the endplates selected on the radiographs were used on the CT images). Inter- and intra-observer measurement variability was assessed. Multi-linear regression was used to investigate whether there was a relationship between supine to standing Cobb Angle change and patient characteristics (SPSS, v.21, IBM, USA). Results: Fifty-two patients were included, with a mean age of 14.6 (SD 1.8) years; all curves were Lenke Type 1, with a mean Cobb Angle of 42° (SD 6.4°) on supine CT and 52° (SD 6.7°) on standing radiographs. The mean fulcrum bending Cobb Angle for the group was 22.6° (SD 7.5°). The 10° increase from supine to standing is consistent with existing literature. Pre-selecting vertebral endplates was found to increase the Cobb Angle difference by a mean of 2° (range 0-9°). Multi-linear regression revealed a statistically significant relationship between supine to standing Cobb Angle change and: fulcrum flexibility (p=0.001), age (p=0.027) and standing Cobb Angle (p<0.001). In patients with high fulcrum flexibility scores, the supine to standing Cobb Angle change was as great as 20°. The 95% confidence intervals for intra-observer and inter-observer measurement variability were 3.1° and 3.6°, respectively. Conclusion: There is a statistically significant relationship between supine to standing Cobb Angle change and fulcrum flexibility. Therefore, this difference can be considered a measure of spinal flexibility. Pre-selecting vertebral endplates causes only minor changes.
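The regression step described in the Methods could be sketched as an ordinary multiple linear regression. The snippet below is a minimal illustration with hypothetical column names (cobb_change, fulcrum_flexibility, age, standing_cobb) and synthetic placeholder data; it is not the authors' SPSS analysis.

```python
# Minimal sketch of the multi-linear regression described above, with hypothetical
# column names and synthetic placeholder data; not the authors' SPSS analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 52                                              # same sample size as the study
df = pd.DataFrame({
    "fulcrum_flexibility": rng.uniform(20, 80, n),  # placeholder flexibility scores
    "age": rng.uniform(11, 18, n),                  # years
    "standing_cobb": rng.normal(52, 6.7, n),        # degrees
})
# Placeholder outcome: supine-to-standing Cobb Angle change, in degrees.
df["cobb_change"] = 0.1 * df["fulcrum_flexibility"] + rng.normal(0, 2, n)

model = smf.ols("cobb_change ~ fulcrum_flexibility + age + standing_cobb", data=df).fit()
print(model.summary())                              # coefficients and p-values per predictor
```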
Abstract:
This paper evaluates the performance of prediction intervals generated from alternative time series models, in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey’s structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals, except for the autoregressive model. In particular, those based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.
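As an illustration of the kind of interval construction being compared, the sketch below fits an AR model and builds bootstrap prediction intervals by resampling residuals, then reports interval widths per horizon. It is a generic sketch on placeholder data, not the paper's procedure (which additionally applies a bias correction to the AR coefficients).

```python
# Generic sketch: bootstrap prediction intervals for an AR(p) model and their
# widths; not the paper's exact bias-corrected procedure.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=300)) + 10          # placeholder series standing in for arrivals data

p, h, B = 2, 12, 500                              # AR order, forecast horizon, bootstrap replicates
fit = AutoReg(y, lags=p).fit()
params = np.asarray(fit.params)                   # [const, phi_1, ..., phi_p]
resid = np.asarray(fit.resid)

paths = np.empty((B, h))
for b in range(B):
    path = list(y[-p:])                           # condition on the last p observations
    for _ in range(h):
        eps = rng.choice(resid)                   # resample model residuals
        nxt = params[0] + params[1:] @ np.array(path[-p:][::-1]) + eps
        path.append(nxt)
    paths[b] = path[p:]

lower, upper = np.percentile(paths, [2.5, 97.5], axis=0)   # nominal 95% interval per horizon
print(upper - lower)                              # interval widths, one per forecast step
```

Coverage rates, as evaluated in the paper, would then be estimated by checking how often the realised future values fall inside such intervals across many series and forecast origins.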
Abstract:
The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. Therefore, it is very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study has been undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) hydrodynamic depth averaged model was configured for the whole coastline of Australia using the Danish Hydraulic Institute’s Mike21 modelling suite of tools. The model was forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Centers for Environmental Prediction’s global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output was validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms. However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australia region, with characteristics based on the observed tropical cyclones over the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present day extreme water level probabilities around the whole coastline of Australia.
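A minimal sketch of the extreme value step described above (fitting annual maxima and reading off a return level) is shown below, assuming a GEV fit to hypothetical annual maxima; the study's actual distribution family and fitting method are not stated in the abstract.

```python
# Minimal sketch, assuming a GEV fit to annual maximum water levels; the study's
# exact distribution family and fitting method are not given in the abstract.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_maxima = 1.5 + 0.3 * rng.gumbel(size=61)    # placeholder for 61 years of hindcast maxima (m)

shape, loc, scale = genextreme.fit(annual_maxima)  # maximum-likelihood GEV fit

# Water level exceeded on average once every 100 years (1% annual exceedance probability).
level_100yr = genextreme.ppf(1 - 1.0 / 100, shape, loc=loc, scale=scale)
print(f"estimated 100-year return level: {level_100yr:.2f} m")
```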
Abstract:
Transportation construction is substantially different from other construction fields due to the widespread use of unit price bidding and competitive contract awarding. Thus, the potential for change orders, which can be described as substantial increases in work quantity or reasonable changes to the initial design provided by the State Highway Agencies (SHAs), has been the main source of unbalanced bidding for contractors. It is important to understand the causes of change orders, as cost-related issues are the main reason for contract disputes. We have analyzed a large dataset from a major SHA to identify project-related and environmental factors that affect change order costs. The results of the study can be instrumental in assessing the increased costs associated with change orders, so that better management measures can be taken to mitigate their effects.
Abstract:
The quick detection of an abrupt unknown change in the conditional distribution of a dependent stochastic process has numerous applications. In this paper, we pose a minimax robust quickest change detection problem for cases where there is uncertainty about the post-change conditional distribution. Our minimax robust formulation is based on the popular Lorden criterion of optimal quickest change detection. Under a condition on the set of possible post-change distributions, we show that the widely known cumulative sum (CUSUM) rule is asymptotically minimax robust under our Lorden minimax robust formulation as the false alarm constraint becomes more strict. We also establish general asymptotic bounds on the detection delay of misspecified CUSUM rules (i.e. CUSUM rules that are designed with post-change distributions that differ from those of the observed sequence). We exploit these bounds to compare the delay performance of asymptotically minimax robust, asymptotically optimal, and other misspecified CUSUM rules. In simulation examples, we illustrate that asymptotically minimax robust CUSUM rules can provide better detection delay performance at greatly reduced computational effort compared to competing generalised likelihood ratio procedures.
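For readers unfamiliar with the CUSUM rule referenced above, the sketch below implements the standard log-likelihood-ratio CUSUM recursion for a known Gaussian mean shift and raises an alarm when the statistic crosses a threshold. It is a textbook illustration only, not the robust or misspecified variants analysed in the paper.

```python
# Textbook CUSUM sketch for a known Gaussian mean shift; illustrates the recursion
# behind the rules discussed above, not the paper's robust or misspecified variants.
import numpy as np

def cusum_gaussian(x, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5.0):
    """Return the first index at which the CUSUM statistic crosses `threshold`."""
    s = 0.0
    for k, xk in enumerate(x):
        # Log-likelihood ratio of one observation under post- vs pre-change mean.
        llr = (mu1 - mu0) * (xk - (mu0 + mu1) / 2.0) / sigma**2
        s = max(0.0, s + llr)          # CUSUM recursion
        if s >= threshold:
            return k                   # alarm time (declared change point)
    return None                        # no alarm raised

rng = np.random.default_rng(2)
pre = rng.normal(0.0, 1.0, 200)        # pre-change samples
post = rng.normal(1.0, 1.0, 100)       # post-change samples (mean shift of 1)
print(cusum_gaussian(np.concatenate([pre, post])))
```

A misspecified rule, in the paper's sense, would run the same recursion but with a design value of mu1 that differs from the true post-change mean of the observed sequence.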
Abstract:
Commentators have predicted bureaucratic organisations would undergo substantial change as a result of social and economic pressures. We ask whether reforms to the Australian public service over the 1983–93 period exemplify this process. We use the methods of organisational analysis to characterise the direction of change, basing our assessment on the standard structural variables of complexity, formalisation and centralisation, together with a cultural variable. We find evidence that, overall, departments of state in the APS were becoming less bureaucratic in their structure, culture and internal function in the 1983–93 period. However, the effect was not uniform across departments, or unambiguous — formalisation, for example, increased in some respects and decreased in others. Centralisation increased overall, despite devolution of some decision-making.
Abstract:
Problem, research strategy, and findings: There is a conflict between recent creative placemaking policies intended to promote positive neighborhood development through the arts and the fact that the arts have long been cited as contributing to gentrification and the displacement of lower-income residents. Unfortunately, we do not have data to demonstrate widespread evidence of either outcome. We address the dearth of comprehensive research and inform neighborhood planning efforts by statistically testing how two different groups of arts activities—the fine arts and commercial arts industries—are associated with conditions indicative of revitalization and gentrification in 100 large U.S. metropolitan areas. We find that different arts activities are associated with different types and levels of neighborhood change. Commercial arts industries show the strongest association with gentrification in rapidly changing areas, while the fine arts are associated with stable, slow-growth neighborhoods. Takeaway for practice: This research can help planners to more effectively incorporate the arts into neighborhood planning efforts and to anticipate the potential for different outcomes in their arts development strategies, including gentrification-related displacement.
Abstract:
This article takes as its starting point the observation that neoliberalism is a concept that is ‘oft-invoked but ill-defined’. It provides a taxonomy of uses of the term neoliberalism to include: (1) an all-purpose denunciatory category; (2) ‘the way things are’; (3) an institutional framework characterizing particular forms of national capitalism, most notably the Anglo-American ones; (4) a dominant ideology of global capitalism; (5) a form of governmentality and hegemony; and (6) a variant within the broad framework of liberalism as both theory and policy discourse. It is argued that this sprawling set of definitions is not mutually compatible, and that uses of the term need to be dramatically narrowed from its current association with anything and everything that a particular author may find objectionable. In particular, it is argued that the uses of the term by Michel Foucault in his 1978–9 lectures, found in The Birth of Biopolitics, are not particularly compatible with its more recent status as a variant of dominant ideology or hegemony theories. It instead proposes understanding neoliberalism in terms of historical institutionalism, with Foucault’s account of historical change complementing Max Weber’s work identifying the distinctive economic sociology of national capitalisms.
Abstract:
Sleep disruption strongly influences daytime functioning; the resultant sleepiness is recognised as a contributing risk factor for individuals performing critical and dangerous tasks. While the relationship between sleep and sleepiness has been heavily investigated in the vulnerable sub-populations of shift workers and patients with sleep disorders, postpartum women have been comparatively overlooked. Thirty-three healthy postpartum women recorded every episode of sleep and wake each day during postpartum weeks 6, 12 and 18. Although repeated measures analysis revealed no significant difference in the amount of nocturnal sleep or the frequency of night-time wakings, there was a significant reduction in sleep disruption, due to fewer minutes of wake after sleep onset. Subjective sleepiness was measured each day using the Karolinska Sleepiness Scale; at the two earlier time points this was significantly correlated with sleep quality but not with sleep quantity. Epworth Sleepiness Scores reduced significantly over time; however, during week 18 over 50% of participants were still experiencing excessive daytime sleepiness (Epworth Sleepiness Score ≥12). The results have implications for health care providers and policy makers. Health care providers designing interventions to address sleepiness in new mothers should take into account the dynamic changes to sleep and sleepiness during this initial postpartum period. Policy makers developing regulations for parental leave entitlements should take into consideration the high prevalence of excessive daytime sleepiness experienced by new mothers, ensuring enough opportunity for daytime sleepiness to diminish to a manageable level prior to re-engagement in the workforce.
Abstract:
We have developed a Hierarchical Look-Ahead Trajectory Model (HiLAM) that incorporates the firing pattern of medial entorhinal grid cells in a planning circuit that includes interactions with hippocampus and prefrontal cortex. We show the model’s flexibility in representing large real-world environments using odometry information obtained from challenging video sequences. We acquire the visual data from a camera mounted on a small tele-operated vehicle. The camera has a panoramic field of view with its focal point approximately 5 cm above the ground level, similar to what would be expected from a rat’s point of view. Using established algorithms for calculating perceptual speed from the apparent rate of visual change over time, we generate raw dead reckoning information, which loses spatial fidelity over time due to error accumulation. We rectify the loss of fidelity by exploiting the loop-closure detection ability of a biologically inspired robot navigation model termed RatSLAM. The rectified motion information serves as a velocity input to the HiLAM to encode the environment in the form of grid cell and place cell maps. Finally, we show goal-directed path planning results of HiLAM in two different environments: an indoor square maze used in rodent experiments and an outdoor arena more than two orders of magnitude larger than the indoor maze. Together these results bridge for the first time the gap between higher-fidelity bio-inspired navigation models (HiLAM) and more abstracted but highly functional bio-inspired robotic mapping systems (RatSLAM), and move from simulated environments into real-world studies in rodent-sized arenas and beyond.
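As a rough illustration of the dead-reckoning step described above (integrating perceptual speed and rotation over time, with position error accumulating until loop closure corrects it), here is a minimal 2D path-integration sketch; the actual HiLAM/RatSLAM pipelines are considerably more involved.

```python
# Minimal 2D dead-reckoning sketch: integrating noisy speed and heading estimates
# accumulates position error over time, which is what loop-closure correction
# (as in RatSLAM) repairs. Purely illustrative, not the HiLAM/RatSLAM code.
import numpy as np

rng = np.random.default_rng(3)
dt = 0.1                                   # assumed seconds per video frame
true_speed, true_turn = 0.2, 0.05          # m/s and rad/s for a gentle arc

x = y = heading = 0.0
true_x = true_y = true_heading = 0.0
drift = []
for _ in range(1000):
    # Noisy perceptual estimates standing in for visual speed/rotation cues.
    speed = true_speed + rng.normal(0, 0.02)
    turn = true_turn + rng.normal(0, 0.01)

    heading += turn * dt
    x += speed * np.cos(heading) * dt
    y += speed * np.sin(heading) * dt

    true_heading += true_turn * dt
    true_x += true_speed * np.cos(true_heading) * dt
    true_y += true_speed * np.sin(true_heading) * dt

    drift.append(np.hypot(x - true_x, y - true_y))

print(f"position error after {len(drift)} steps: {drift[-1]:.2f} m")  # grows with time
```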
Abstract:
This study reports on the utilisation of the Manchester Driver Behaviour Questionnaire (DBQ) to examine the self-reported driving behaviours of a large sample of Australian fleet drivers (N = 3414). Surveys were completed by employees before they commenced a one-day safety workshop intervention. Factor analysis techniques identified a three-factor solution similar to previous research, comprising: (a) errors, (b) highway-code violations and (c) aggressive driving violations. Two items traditionally associated with highway-code violations were found to be associated with aggressive driving behaviours in the current sample. Multivariate analyses revealed that exposure to the road, errors and self-reported offences predicted crashes at work in the last 12 months, while gender, highway violations and crashes predicted offences incurred while at work. Importantly, those who received more fines at work were at an increased risk of crashing the work vehicle. However, overall, the DBQ demonstrated limited efficacy in predicting these two outcomes. This paper outlines the major findings of the study in regard to identifying and predicting aberrant driving behaviours and also highlights implications for the future utilisation of the DBQ within fleet settings.
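An exploratory factor analysis of the kind mentioned above could be run on a comparable item-level dataset along these lines; the sketch uses hypothetical item columns, synthetic data and a plain (unrotated) scikit-learn estimator, and is not the study's own analysis.

```python
# Generic exploratory factor analysis sketch for DBQ-style item data; hypothetical
# columns and synthetic responses, not the study's analysis.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical item-level responses: one row per driver, one column per DBQ item.
rng = np.random.default_rng(4)
items = pd.DataFrame(rng.integers(0, 6, size=(3414, 20)),
                     columns=[f"dbq_item_{i+1}" for i in range(20)])

fa = FactorAnalysis(n_components=3, random_state=0).fit(items)
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=["factor_1", "factor_2", "factor_3"])
print(loadings.round(2))   # inspect which items load on which factor
```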
Abstract:
Organisations have recently looked to design to become more customer oriented and to co-create a new kind of value and service offering. This requires changes in the organisational mindset, involving the entire company, its innovation processes and often its business model. One tool that has been successful in facilitating this has been Osterwalder and Pigneur’s (2010) ‘Business Model Canvas’ and, more importantly, the design process that supports the use of this tool. The aim of this paper is to explore the role design tools play in the process of translating and facilitating innovation in firms. Six ‘Design Innovation Catalysts’ (Wrigley, 2013) were interviewed regarding their approach to, and use of, design tools to better facilitate innovation. Results highlight that the value of these tools extends beyond their intended use to include: facilitation of communication, permission to think creatively, and learning and teaching through visualisation. Findings from this research build upon the role of the Design Innovation Catalyst and provide additional implications for organisations.
Abstract:
Land use and agricultural practices can make important contributions to the global source strength of atmospheric nitrous oxide (N2O) and methane (CH4). However, knowledge of gas fluxes from irrigated agriculture is very limited. From April 2005 to October 2006, a study was conducted in the Aral Sea Basin, Uzbekistan, to quantify and compare emissions of N2O and CH4 in various annual and perennial land-use systems: irrigated cotton, winter wheat and rice crops, a poplar plantation and a natural Tugai (floodplain) forest. In the annual systems, average N2O emissions ranged from 10 to 150 μg N2O-N m−2 h−1, with the highest N2O emissions in the cotton fields, covering a similar range to previous studies of irrigated cropping systems. Emission factors (uncorrected for background emission), used to determine the fertilizer-induced N2O emission as a percentage of N fertilizer applied, ranged from 0.2% to 2.6%. Seasonal variations in N2O emissions were principally controlled by fertilization and irrigation management. Pulses of N2O emissions occurred after concomitant N-fertilizer application and irrigation. The unfertilized poplar plantation showed high N2O emissions over the entire study period (30 μg N2O-N m−2 h−1), whereas only negligible fluxes of N2O (<2 μg N2O-N m−2 h−1) occurred in the Tugai. Significant CH4 fluxes were determined only from the flooded rice field: fluxes were low, with a mean flux rate of 32 mg CH4 m−2 day−1 and a low seasonal total of 35.2 kg CH4 ha−1. The global warming potential (GWP) of the N2O and CH4 fluxes was highest under rice and cotton, with seasonal values between 500 and 3000 kg CO2 eq. ha−1. The biennial cotton–wheat–rice crop rotation commonly practiced in the region would average a GWP of 2500 kg CO2 eq. ha−1 yr−1. The analyses point to opportunities for reducing the GWP of these irrigated agricultural systems by (i) optimization of fertilization and irrigation practices and (ii) conversion of annual cropping systems into perennial forest plantations, especially on less profitable, marginal lands.
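To make the CO2-equivalent figures above concrete, a back-of-the-envelope conversion from the reported fluxes looks like the sketch below. It assumes 100-year GWP factors of 25 for CH4 and 298 for N2O (as in IPCC AR4) and a hypothetical 150-day season for the N2O example; the study's exact conversion factors and season lengths are not stated in the abstract.

```python
# Back-of-the-envelope CO2-equivalent arithmetic for the fluxes reported above,
# assuming IPCC AR4 100-year GWP factors (CH4 = 25, N2O = 298); the study's exact
# conversion factors and season lengths are not given in the abstract.
GWP_CH4, GWP_N2O = 25, 298

ch4_seasonal = 35.2                 # kg CH4 ha-1 per rice season (from the abstract)
ch4_co2eq = ch4_seasonal * GWP_CH4  # = 880 kg CO2 eq. ha-1

# Example N2O conversion: a flux of 150 ug N2O-N m-2 h-1 sustained over a
# hypothetical 150-day season.
n2o_n_flux = 150e-9                 # kg N2O-N m-2 h-1
hours, m2_per_ha = 150 * 24, 10_000
n2o_mass = n2o_n_flux * hours * m2_per_ha * (44 / 28)  # convert N2O-N to N2O mass
n2o_co2eq = n2o_mass * GWP_N2O      # approx. 2,500 kg CO2 eq. ha-1

print(round(ch4_co2eq), round(n2o_co2eq))
```

Both figures fall within the 500 to 3000 kg CO2 eq. ha−1 seasonal range reported in the abstract, which is the point of the illustration.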
Abstract:
Effective response by government and individuals to the risk of land degradation requires an understanding of regional climate variations and the impacts of climate and management on the condition and productivity of land and vegetation resources. Analysis of past land degradation and climate variability provides some understanding of vulnerability to current and future climate changes and of the information needs for more sustainable management. We describe experience in providing climate risk assessment information for managing the risk of land degradation in north-eastern Australian arid and semi-arid regions used for extensive grazing. However, we note that information based on historical climate variability, which has been relied on in the past, will now also have to factor in the influence of human-induced climate change. Examples illustrate trends in climate for Australia over the past decade and their impacts on indicators of resource condition. The analysis highlights the benefits of insights into past trends and variability in rainfall and other climate variables based on extended historical databases. This understanding in turn supports more reliable regional climate projections and decision support information for governments and land managers to better manage the risk of land degradation now and in the future.
Abstract:
The emergence of the Internet is one of the most significant leaps in the history of humanity. Information, knowledge and culture are exchanged among masses of people through interconnected information platforms. These platforms enable our culture to be analysed and rewritten, and fundamentally open our perceptions to a wide variety of concepts and beliefs. The connected networks of the Internet have shaped a virtual — but communicative — space where people can cross borders freely within a realm characterised by the ability to go anywhere, see anything, learn, compare and understand. This chapter focuses on the Libyan experience with social networking platforms in actualising democratic change in the uprising of 17 February 2011. After briefly outlining the political and economic situation under the regime of Colonel Muammar Ghaddafi, the chapter discusses the role that social networking platforms played during the struggle of the Libyan people for democratic change. Finally, it points out the positive changes that resulted from the uprising and the potential role that social media might play in the ongoing democratisation and development of Libyan society.