Abstract:
To recognize faces in video, face appearance has been widely modeled with piecewise local linear models that linearly approximate the smooth yet non-linear, low-dimensional face appearance manifolds. The choice of representation for the local models is crucial. Most existing methods learn each local model individually, meaning that they anticipate only the variations within each class. In this work, we propose to represent the local models as Gaussian distributions which are learned simultaneously using heteroscedastic probabilistic linear discriminant analysis (PLDA). Each gallery video is therefore represented as a collection of such distributions. With PLDA, not only are the within-class variations estimated during training, but the separability between classes is also maximized, leading to improved discrimination. The heteroscedastic PLDA itself is adapted from the standard PLDA to approximate face appearance manifolds more accurately. Instead of assuming a single global within-class covariance, the heteroscedastic PLDA learns different within-class covariances specific to each local model. In the recognition phase, a probe video is matched against gallery samples through the fusion of point-to-model distances. Experiments on the Honda and MoBo datasets have shown the merit of the proposed method, which achieves better performance than the state-of-the-art technique.
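The matching step lends itself to a compact illustration. Below is a minimal sketch of point-to-model matching, assuming each local model is summarized by a mean vector and its own within-class covariance (the heteroscedastic part); the Mahalanobis distance and the min/mean fusion rule are illustrative stand-ins, not necessarily the exact scheme used in the paper.

```python
import numpy as np

def point_to_model_distance(x, mean, cov):
    """Mahalanobis distance from a probe frame x to one local Gaussian model."""
    diff = x - mean
    return float(diff @ np.linalg.solve(cov, diff))

def match_probe(probe_frames, gallery_models):
    """Fuse point-to-model distances: score each probe frame against every
    local Gaussian of a gallery video, then average the per-frame minima
    into a single probe-to-gallery distance (illustrative fusion rule)."""
    per_frame = [
        min(point_to_model_distance(x, mean, cov) for mean, cov in gallery_models)
        for x in probe_frames
    ]
    return float(np.mean(per_frame))
```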
Abstract:
The Queensland University of Technology (QUT) Library, like many other academic and research institution libraries in Australia, has been collaborating with a range of academic and service provider partners to develop research data management services and collections. Three main strategies are being employed, and an overview of the process, infrastructure, usage and benefits of each of these service aspects is provided. The development of processes and infrastructure to facilitate the strategic identification and management of QUT-developed datasets has been a major focus. A number of Australian National Data Service (ANDS) sponsored projects - including Seeding the Commons; Metadata Hub / Store; Data Capture and Gold Standard Record Exemplars - have provided, or will provide, QUT with a data registry system, linkages to storage, processes for identifying and describing datasets, and a degree of academic awareness. QUT supports open access and has established a culture for making its research outputs available via the QUT ePrints institutional repository. Incorporating open access research datasets into the library collections is an equally important aspect of facilitating the adoption of data-centric eresearch methods. Some datasets are available commercially, and the library has collaborated with QUT researchers, especially in the QUT Business School, to identify and procure a rapidly growing range of financial datasets to support research. The library undertakes the licensing and uses the Library Resource Allocation to pay for the subscriptions. It is a new area of collection development, with much to be learned. The final strategy discussed is the library acting as “data broker”. QUT Library has been working with researchers to identify these datasets and to undertake the licensing, payment and access arrangements as a centrally supported service on behalf of researchers.
Abstract:
This project sought to investigate parameters of residual soil materials located in South East Queensland (SEQ), as determined from a large number of historical site investigation records. This was undertaken to quantify material parameter variability and to assess the validity of using commonly adopted correlations to estimate "typical" soil parameters for this region. A dataset of in situ and laboratory derived residual soil parameters was constructed and analysed to identify potential correlations that related either to the entire area considered, or to specific residual soils derived from a common parent material. The variability of SEQ soil parameters was generally found to be greater than reported by equivalent studies that analysed datasets dominated by transported soils. Noteworthy differences in material properties also became evident when residual soils weathered from different parent materials were considered independently. Large variation between the correlations developed for specific soil types was found, which highlighted both the heterogeneity of the studied materials and the unsuitability of generic correlations for the residual soils present in SEQ. Region- and parent-material-specific correlations that estimate shear strength from in situ penetration tests have been proposed for the various residual soil types considered.
Abstract:
The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, or on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design and distributed lag non-linear models be combined? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects on mortality of temperature changes between neighbouring days be assessed? 4. Is there any change in temperature effects on mortality over time? To combine the case-crossover design and the distributed lag non-linear model, datasets of deaths, weather conditions (minimum temperature, mean temperature, maximum temperature, and relative humidity) and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days, and persisted for 10 days. Hot effects were acute and lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature. It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk compared with time series models that use a single site’s temperature or averaged temperature from a network of sites. Daily mortality data were obtained from 163 locations across Brisbane city, Australia from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects of a single site’s temperature, and of averaged temperature from 3 monitoring sites, on mortality. Squared Pearson scaled residuals were used to check the model fit. The results of this study show that even though spatiotemporal models gave a better model fit than time series models, the two gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average temperature of multiple sites, were as good at estimating the association between temperature and mortality as a spatiotemporal model. A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000.
Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C in temperature between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality, even after controlling for mean temperature. I examined the variation in the effects of high temperatures on elderly mortality (age ≥ 75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into: a "main effect" due to high temperatures, using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions, and overall effects at both regional and national levels. The effects of high temperature (both main and added effects) on elderly mortality varied greatly by year, city and region. Years with higher heat-related mortality were often followed by years with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems. In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin; this allows the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. Time series models using a single site’s temperature, or temperature averaged across several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a marked drop or a marked increase, increases the risk of mortality. The high temperature effect on mortality is highly variable from year to year.
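The temperature-change analysis is simple enough to illustrate. Below is a minimal sketch of such a time series Poisson regression, assuming a daily data frame with death counts and mean temperature; the thesis's actual models also controlled for seasonality, humidity and air pollution (typically with smooth functions of time), so this is a skeleton, not the analysis itself.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def temperature_change_model(df: pd.DataFrame):
    """Sketch: Poisson regression of daily deaths on indicators for a
    temperature drop/rise of more than 3 degrees C between neighbouring
    days, adjusted for mean temperature (the joint-effects idea)."""
    df = df.copy()
    # Temperature change: today's mean temperature minus yesterday's
    df["temp_change"] = df["mean_temp"].diff()
    df["big_drop"] = (df["temp_change"] < -3).astype(int)
    df["big_rise"] = (df["temp_change"] > 3).astype(int)
    df = df.dropna()
    X = sm.add_constant(df[["big_drop", "big_rise", "mean_temp"]])
    fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
    # exp(beta) approximates the relative risk for each indicator
    return np.exp(fit.params)
```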
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results than the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results, however, often require orders of magnitude more calculation time to attain high precision, reducing the utility of Monte Carlo simulation within the clinical environment. This work aims to improve that utility by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high performance computing environments and of simpler, alternative yet equivalent representations of complex geometries. First, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with n cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalents in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to 3 orders of magnitude performance improvement through the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry and patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and present a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan and representing them in a mesh-based form similar to those used in computer aided design, the above-mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh-based representation allows for direct manipulation of the geometry, enabling motion augmentation for time-dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time-dependent Monte Carlo simulation, like that made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
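Part of why tetrahedral meshes navigate faster than triangular surface meshes is that containment in a tetrahedron is a constant-time geometric test, so a particle can be handed from cell to neighbouring cell without ray-casting against a large surface. The sketch below shows such a test; it illustrates the geometric idea only and is not the navigation code actually implemented in GEANT4.

```python
import numpy as np

def same_side(v0, v1, v2, v3, p):
    """True if p lies on the same side of the plane through v0, v1, v2
    as the opposite vertex v3 (points on the boundary count as inside)."""
    normal = np.cross(v1 - v0, v2 - v0)
    return np.dot(normal, v3 - v0) * np.dot(normal, p - v0) >= 0

def point_in_tetrahedron(p, tet):
    """Constant-time containment test: p is inside the tetrahedron if it
    lies on the inner side of all four faces."""
    v0, v1, v2, v3 = tet
    return (same_side(v0, v1, v2, v3, p) and
            same_side(v1, v2, v3, v0, p) and
            same_side(v2, v3, v0, v1, p) and
            same_side(v3, v0, v1, v2, p))
```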
Abstract:
There are no population studies of the prevalence or incidence of child maltreatment in Australia. Child protection data gives some understanding but is restricted by system capacity and definitional issues across jurisdictions. Child protection data currently suggests that the number of reports is increasing yearly; the child protection system then becomes focussed on investigating all reports, diluting the resources available for those children who are most in need of intervention. A public health response across multiple agencies enables responses to child safety across the entire population. All families are targeted at the primary level; examples include ensuring all parents know the dangers of shaking a baby, or teaching children to say no if a situation makes them uncomfortable. The secondary level of prevention targets families with a number of risk factors, for example subsidised child care so children aren't left unsupervised after school when both parents have to be at work, or home visiting for drug-addicted parents to ensure children are cared for. The tertiary response then becomes the responsibility of the child protection system and is reserved for those children where abuse and neglect are identified. This model requires that child safety is seen in a broader context than just the child protection system, and increasingly health professionals are being identified as an important component of the public health framework. If all injury is viewed as preventable and considered along a continuum from 'accidental' through to 'inflicted', it becomes possible to conceptualise child maltreatment in an injury context. Parental intent may not be to cause harm to the child, but through lack of insight or concern about risk, the potential for injury is high. The mechanisms for unintentional and intentional injury overlap, and some suggest that by segregating child abuse (with the possible exception of sexual abuse) from unintentional injury, child abuse is excluded from the broader injury prevention initiative that is gaining momentum in the community. This research uses a public health perspective, specifically that of injury prevention, to consider the problem of child abuse. The study employed a mixed method design incorporating secondary data analysis, data linkage and structured interviews of different professional groups. Datasets from the Queensland Injury Surveillance Unit (QISU) and the Department of Child Safety (DCS) were evaluated. Coded injury data was grouped by intent of injury: records with a code indicating the ED presentation was due to child abuse, records with a code indicating that the injury was possibly due to abuse and, in the third group, records whose intent code indicated that the injury was unintentional and not due to abuse. Primary data collection from ED records was undertaken and the information recoded to assess reliability and completeness. Emergency department data (QISU) was linked to Department of Child Safety data to examine concordance and data quality. Factors influencing the collection and collation of these data were identified through structured interviews and analysed using qualitative methods. Secondary analysis of QISU data indicated that records lacking specific information on the injury event were more likely to have an intent code indicating abuse than records with specific information on the injury event. Codes for abuse appeared in only 1.2% of the 84,765 records analysed.
Unintentional injury was the most commonly coded intent (95.3%). In the group with a definite abuse code assigned at triage, 83% linked to a record with DCS, and cases where documentation indicated police involvement were significantly more likely to be associated with a DCS record than those without such documentation. Of those coded with an unintentional injury code, 22% linked to a DCS record; cases assigned an urgent triage category were more likely to link than those with a triage category of resuscitation, and children who presented to regional or remote hospitals were more likely to link to a DCS record than those presenting to urban hospitals. Twenty-nine per cent of cases with a code indicating possible abuse linked to a DCS record. Documentation indicating police involvement in the case, a code for unspecified activity (compared with a code indicating involvement in a sporting activity), and age under 12 months (compared with the 13-17 year age group) were all significantly associated with linkage to a DCS record. Only 13% of records contained documentation indicating that child abuse and neglect were considered in the diagnosis of the injury, despite almost half of the sample having a code of abuse or possible abuse. Doctors and nurses were confident in their knowledge of the process of reporting child maltreatment, but less confident about identifying child abuse and neglect and about what should be reported. Many were concerned about the implications of reporting, for the child and family and for themselves. A number were concerned about the implications of not reporting, mostly for the wellbeing of the child and, for a few, in terms of their legal obligations as mandatory reporters. The outcomes of this research will help improve knowledge of the barriers to effective surveillance of child abuse in emergency departments. This will, in turn, support better identification and reporting practices, more reliable official statistical collections, and the flagging of high-risk cases to ensure adequate departmental responses are initiated.
Abstract:
More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.
Abstract:
In this paper we demonstrate passive vision-based localization in environments more than two orders of magnitude darker than the current benchmark, using a $100 webcam and a $500 camera. Our approach uses the camera’s maximum exposure duration and sensor gain to achieve appropriately exposed images even in unlit night-time environments, albeit with extreme levels of motion blur. Using the SeqSLAM algorithm, we first evaluate the effect of variable motion blur caused by simulated exposures of 132 ms to 10000 ms duration on localization performance. We then use actual long exposure camera datasets to demonstrate day-night localization in two different environments. Finally we perform a statistical analysis that compares the baseline performance of matching unprocessed greyscale images to using patch normalization and local neighbourhood normalization – the two key SeqSLAM components. Our results and analysis show for the first time why the SeqSLAM algorithm is effective, and demonstrate the potential for cheap camera-based localization systems that function across extreme perceptual change.
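Of the two SeqSLAM components compared here, patch normalization is simple enough to sketch. The version below standardizes each image patch to zero mean and unit variance so that matching depends on local structure rather than absolute brightness; the patch size and the small epsilon guarding against flat patches are illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def patch_normalize(img: np.ndarray, patch: int = 8) -> np.ndarray:
    """Normalize each patch x patch block of a greyscale image to zero
    mean and unit variance (a sketch of SeqSLAM-style patch
    normalization)."""
    img = img.astype(np.float64)
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            block = img[y:y + patch, x:x + patch]
            out[y:y + patch, x:x + patch] = (
                (block - block.mean()) / (block.std() + 1e-9)
            )
    return out
```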
Abstract:
Food is inherently cultural, yet traditionally overlooked in many disciplines as a topic worthy of serious investigation. This thesis investigates how food, as a topic of interest, is thriving in an online environment through recipe sharing on food blogs. It applies an ethnographic approach to online community studies, providing a rich description of the food blogging community. The thesis demonstrates how food blogging can be seen as a community. Through a case study focusing on one recipe shared across many blogs, it also examines the community in action. As the community has grown, it has become more complex, structured and diverse. The thesis examines its evolution and the response of food-related media and other industries to food blogging. The nature of the food blogging community reflects the cultural and social nature of food and the ongoing evolution of recipe sharing through food-related media. Food blogs provide an insight into the eating habits of ‘ordinary’ people, in a more broad-based manner than traditional food-related media such as cookbooks. Beyond this, food blogs are part of wider cultural trends towards DIY, and provide a useful example of the ongoing transformation of food-related media, food culture, and indeed, culture more broadly.
Abstract:
This paper investigates engaging experienced birders, as volunteer citizen scientists, to analyze large recorded audio datasets gathered through environmental acoustic monitoring. Although audio data is straightforward to gather, automated analysis remains a challenging task; the existing expertise, local knowledge and motivation of the birder community can complement computational approaches and provide distinct benefits. We explored both the culture and practice of birders, and paradigms for interacting with recorded audio data. A variety of candidate design elements were tested with birders. This study contributes an understanding of how virtual interactions and practices can be developed to complement the existing practices of experienced birders in the physical world. In so doing, this study contributes a new approach to engagement in e-science. Whereas most citizen science projects task lay participants with discrete real-world or artificial activities, sometimes using extrinsic motivators, this approach builds on existing intrinsically satisfying practices.
Abstract:
The Council of Australian Governments (COAG) in 2003 gave in-principle approval to a best-practice report recommending a holistic approach to managing natural disasters in Australia, incorporating a move from a traditional response-centric approach to a greater focus on mitigation, recovery and resilience, with community well-being at the core. Since that time, there have been a range of complementary developments that have supported the COAG-recommended approach. Developments have been administrative, legislative and technological, both in reaction to the COAG initiative and resulting from regular natural disasters. This paper reviews the characteristics of the spatial data that is becoming increasingly available in Federal, state and regional jurisdictions with respect to its fitness for the purposes of disaster planning and mitigation and of strengthening community resilience. In particular, Queensland foundation spatial data, which is increasingly accessible by the public under the provisions of the Right to Information Act 2009, the Information Privacy Act 2009, and recent open data reform initiatives, is evaluated. The Fitzroy River catchment and floodplain is used as a case study for the review undertaken. The catchment covers an area of 142,545 km², the largest river catchment flowing to the eastern coast of Australia. The Fitzroy River basin experienced extensive flooding during the 2010–2011 Queensland floods. The basin is an area of important economic, environmental and heritage values, and contains significant infrastructure critical for the mining and agricultural sectors, the two most important economic sectors for the State of Queensland. Consequently, the spatial datasets for this area play a critical role in disaster management and in protecting critical infrastructure essential for economic and community well-being. The foundation spatial datasets are assessed for disaster planning and mitigation purposes using data quality indicators such as resolution, accuracy, integrity, validity and audit trail.
Abstract:
This paper presents two algorithms to automate the detection of marine species in aerial imagery. An algorithm from an initial pilot study is presented, in which morphology operations and colour analysis form the basis of the working principle. A second approach is presented in which the saturation channel and histogram-based shape profiling are used. We report on the performance of both algorithms using datasets collected from an unmanned aerial system at an altitude of 1000 ft. Early results have demonstrated recall values of 48.57% and 51.4%, and precision values of 4.01% and 4.97%.
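As a rough illustration of the first stage of the second approach, the sketch below thresholds the HSV saturation channel and keeps sufficiently large connected blobs as candidate detections. The threshold, minimum area and overall pipeline are assumptions for illustration, not the authors' implementation, and the histogram-based shape profiling stage that would follow is omitted.

```python
import cv2

def detect_candidates(bgr_image, sat_thresh=60, min_area=50):
    """Sketch: candidate marine-animal detections from the saturation
    channel of an aerial image (illustrative parameters)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    saturation = hsv[:, :, 1]
    _, mask = cv2.threshold(saturation, sat_thresh, 255, cv2.THRESH_BINARY)
    # OpenCV 4 signature: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep blobs large enough to plausibly be an animal at this altitude
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```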
Abstract:
In this paper we use the SeqSLAM algorithm to address the question: how little visual information, and of what quality, is needed to localize along a familiar route? We conduct a comprehensive investigation of place recognition performance on seven datasets while varying image resolution (primarily 1 to 512 pixel images), pixel bit depth, field of view, motion blur, image compression and matching sequence length. Results confirm that place recognition using single images or short image sequences is poor, but improves to match or exceed current benchmarks as the matching sequence length increases. We then present place recognition results from two experiments where low-quality imagery is directly caused by sensor limitations; in one, place recognition is achieved along an unlit mountain road using noisy, long-exposure blurred images, and in the other, two single-pixel light sensors are used to localize in an indoor environment. We also show failure modes caused by pose variance and sequence aliasing, and discuss ways in which they may be overcome. By showing how place recognition along a route is feasible even with severely degraded image sequences, we hope to provoke a re-examination of how we develop and test future localization and mapping systems.
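The reason longer matching sequences recover performance is that SeqSLAM scores whole aligned sequences rather than single frames. Below is a minimal sketch of that idea, assuming a precomputed probe-versus-reference image difference matrix and constant relative speed; real SeqSLAM searches over a range of velocities and applies local contrast enhancement, which this sketch omits.

```python
import numpy as np

def sequence_score(diff, probe_start, ref_start, length):
    """Sum frame-difference scores along an aligned diagonal of the
    probe-vs-reference difference matrix (lower = more similar)."""
    return sum(diff[probe_start + k, ref_start + k] for k in range(length))

def best_match(diff, probe_start, length):
    """Return the reference index whose aligned sequence best matches
    the probe sequence starting at probe_start."""
    candidates = range(diff.shape[1] - length)
    scores = [sequence_score(diff, probe_start, r, length) for r in candidates]
    return int(np.argmin(scores))
```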
Abstract:
During the last several decades, the quality of natural resources and their services has been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, the development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to support the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of their populations for present and future generations and to maintain the sustainability of their ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches for planning sustainable cities is ‘ecological planning’. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p.4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Ecological planning is therefore a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy- and decision-makers improve their actions towards sustainable urban development. Several methods are used in urban ecosystem sustainability assessment, among which sustainability indicators and composite indices are the most commonly used tools for assessing progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local and micro-level sustainability, and that no benchmark value exists for most of the indicators due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) makes this point by stating that "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison produces biased assessments, as data exist only for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need to develop an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that utilises indicators for collecting data, designates certain threshold values or ranges, performs a comparative sustainability assessment via indices at the micro-level, and aggregates these assessment findings to the local level.
Through this approach and model, it is thereby possible to produce sufficient and reliable data to enable comparison at the local level, and to provide useful results to inform the local planning, conservation and development decision-making process, securing sustainable ecosystems and urban futures. To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, this study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the ‘Micro-level Urban-ecosystem Sustainability IndeX’ (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to support policy-making concerning environmental issues. A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop a theoretical framework and select indicators. Afterwards, quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results captured the sustainability performance of current urban settings with reference to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design, and (6) efficiency. For each category, a set of core indicators was assigned, intended to: (1) benchmark the current situation, strengths and weaknesses; (2) evaluate the efficiency of implemented plans; and (3) measure progress towards sustainable development. While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best practice development solutions.
These strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management, in order to preserve the Earth’s water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management, in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through developing pollution prevention regulations and policies, in order to promote high quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods, in order to promote safe environments and healthy communities;
• Sustainable design of the urban environment through climate-responsive design, in order to increase the efficient use of solar energy for thermal comfort; and
• Use of renewable resources through creating efficient communities, in order to provide long-term management of natural resources for the sustainability of future generations.
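The indicator-to-index aggregation that MUSIX-style models perform can be sketched generically. In the sketch below, each raw indicator value is normalized against an assumed benchmark range and the normalized scores are combined by a weighted average; the normalization scheme, clipping and equal default weights are illustrative assumptions, not the MUSIX specification.

```python
import numpy as np

def composite_index(values, benchmark_ranges, weights=None):
    """Sketch: normalize indicator values against benchmark ranges and
    aggregate them into a single sustainability index in [0, 1]."""
    values = np.asarray(values, dtype=float)
    lo, hi = np.asarray(benchmark_ranges, dtype=float).T
    normalized = np.clip((values - lo) / (hi - lo), 0.0, 1.0)
    if weights is None:
        weights = np.full(len(values), 1.0 / len(values))  # equal weights
    return float(np.dot(weights, normalized))

# Example: three parcel-scale indicators scored against assumed benchmark
# ranges; a neighbourhood score could then average the parcel indices.
score = composite_index([0.4, 12.0, 75.0],
                        [(0.0, 1.0), (0.0, 20.0), (0.0, 100.0)])
```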
Abstract:
Distal radius fractures stabilized by open reduction internal fixation (ORIF) have become increasingly common. There is currently no consensus on the optimal time to commence range of motion (ROM) exercises post-ORIF. A retrospective cohort review was conducted over a five-year period to compare wrist and forearm range of motion outcomes and the number of therapy sessions between patients who commenced active ROM exercises within the first seven days and those who commenced from day eight onward following ORIF of distal radius fractures. One hundred and twenty-one patient cases were identified. Clinical data, active ROM at initial and discharge therapy assessments, fracture type, surgical approaches, and number of therapy sessions attended were recorded. One hundred and seven (88.4%) cases had complete datasets. The early active ROM group (n = 37) commenced ROM exercises a mean (SD) of 4.27 (1.8) days post-ORIF. The comparator group (n = 70) commenced ROM exercises 24.3 (13.6) days post-ORIF. No significant differences were identified between groups in ROM at initial or discharge assessments, or in the number of therapy sessions attended. The results of this study indicate that patients who commenced active ROM exercises an average of 24 days after surgery achieved ROM outcomes, with a similar number of therapy sessions, comparable to those who commenced ROM exercises within the first week.