Abstract:
Bargara Pasturage Reserve: Future Visions. This exhibition showcases the work of Postgraduate Landscape Architecture and final-year Undergraduate Civil and Environmental Engineering students in response to issues of sustainability in a coastal wetland known as the Bargara Pasturage Reserve, an exemplar of the many issues facing sensitive coastal places in Queensland today. The 312 ha Pasturage Reserve at Bargara is the only biofilter between the pressures of Bargara’s urban and tourism expansion and surrounding sugarcane farming, and the Great Sandy Marine Park, which includes the largest concentration of nesting marine turtles on the eastern Australian mainland. This ephemeral wetland, while struggling to fulfil its coastal biofiltration function, is also in high demand for passive recreation, and the project partners’ priorities were to meet both of these challenges. The students were required to plan and design for the best balance possible amongst, but not limited to: wetland and coastal ecological health, enhancement of cultural heritage and values, sustainable urban development, and local economic health. To understand these challenges, QUT staff and students met with partners, visited and analysed the Pasturage Reserve, and spent time in and around Bargara talking to locals and inviting dialogue with Indigenous representatives and the South Sea Islander community. We then returned to Brisbane to undertake theoretical and technical research, and worked to produce 11 Strategic Plans, 2 Environmental Management Plans and 33 Detailed Designs. One group of students analysed the Bargara coastal landscape as an historical and ongoing series of conversations between ecological systems, cultural heritage, community and stakeholders. Another group identified the landscape as neither ‘urban’, ‘rural’, nor ‘natural’, instead identifying it metaphorically as a series of layered thematic ‘fields’, such as water, conservation, reconciliation, and educational fields. These landscape analyses became the organising mechanisms for strategic planning. An outstanding Strategic Plan was produced by Zhang, Lemberg and Jensen, entitled Metanoia, which means to ‘make a change as the result of reflection on values’. Its three implementation phases of “flow”, “flux” and “flex” span twenty-five years and present a vision of a coastal and marine research and conservation hub, with a focus on coastal wetland function, turtle habitat and coral reef conservation. An Environmental Management Plan by Brand and Stickland focuses on protecting and improving wetland biodiversity and habitat quality, and on increasing hydrological and water quality function, vital in a coastal area of such high conservation value. After the planning phase, students individually developed detailed design proposals responsive to their plans. From Metanoia, Zhang concentrated on wetland access and interpretation, proposing four focal places to form the nucleus of a wider pattern of connectivity and to encourage community engagement with coastal environmental management and education. Jensen tackled the thorny issue of coastal urban development, proposing a sensitively staged eco-village model which maintains both ecological and recreational connectivity between the wetland and the marine environment. This project offered QUT’s partners many innovative options to inform their future planning. BSC, BMRG and Oceanwatch Australia are currently investigating on-ground opportunities drawing on these options.
Abstract:
In a recent paper, Gordon, Muratov, and Shvartsman studied a partial differential equation (PDE) model describing radially symmetric diffusion and degradation in two and three dimensions. They paid particular attention to the local accumulation time (LAT), also known in the literature as the mean action time, which is a spatially dependent timescale that can be used to provide an estimate of the time required for the transient solution to effectively reach steady state. They presented exact results for three-dimensional applications and gave approximate results for the two-dimensional analogue. Here we make two generalizations of Gordon, Muratov, and Shvartsman’s work: (i) we present an exact expression for the LAT in any dimension and (ii) we present an exact expression for the variance of the distribution. The variance provides useful information regarding the spread about the mean that is not captured by the LAT. We conclude by describing further extensions of the model that were not considered by Gordon, Muratov, and Shvartsman. We have found that exact expressions for the LAT can also be derived for these important extensions...
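For readers unfamiliar with the quantity, a common way of defining the LAT and its variance in this literature is sketched below in generic notation (the paper's own notation may differ): the approach to steady state is treated as a cumulative distribution in time, and the moments follow by integration by parts. With C(x,t) the transient solution and C_s(x) the steady state,

\[
R(x,t) = 1 - \frac{C(x,t)}{C_s(x)}, \qquad
\tau(x) = \int_0^\infty R(x,t)\,\mathrm{d}t, \qquad
\sigma^2(x) = 2\int_0^\infty t\,R(x,t)\,\mathrm{d}t - \tau(x)^2,
\]

where \(\tau(x)\) is the LAT (the mean of the distribution with density \(-\partial R/\partial t\)) and \(\sigma^2(x)\) is the variance discussed above.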
Abstract:
Fire safety design of building structures has received greater attention in recent times due to continuing losses of properties and lives in fires. However, the structural behaviour of thin-walled cold-formed steel columns under fire conditions is not well understood despite the increasing use of light gauge steels in building construction. Cold-formed steel columns are often subject to local buckling effects. Therefore, a series of laboratory tests of lipped and unlipped channel columns made of varying steel thicknesses and grades was undertaken at uniform elevated temperatures up to 700°C under steady state conditions. Finite element models of the tested columns were also developed, and their elastic buckling and nonlinear analysis results were compared with test results at elevated temperatures. Effects of the degradation of mechanical properties of steel with temperature were included in the finite element analyses. The use of accurately measured yield stress, elastic modulus and stress-strain curves at elevated temperatures resulted in good agreement between the ultimate loads and load-deflection curves from tests and finite element analyses. The commonly used effective width design rules and the direct strength method at ambient temperature were then used to predict the ultimate loads at elevated temperatures by using the reduced mechanical properties. By comparing these predicted ultimate loads with those from tests and finite element analyses, the accuracy of using this design approach was evaluated.
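The design check described, applying the ambient-temperature direct strength method (DSM) column rules with temperature-reduced mechanical properties, can be sketched as follows. The DSM equations are the standard published column curves; the section properties and reduction factors below are hypothetical placeholders, not measured values from these tests.

```python
# Sketch: ambient-temperature DSM column rules evaluated with
# temperature-reduced mechanical properties (hypothetical inputs).

def dsm_column_strength(A, fy, P_cre, P_crl):
    """Nominal axial capacity per the standard DSM column curves.

    A      gross cross-sectional area (mm^2)
    fy     yield stress (MPa), already reduced for temperature
    P_cre  elastic global (flexural) buckling load (N)
    P_crl  elastic local buckling load (N)
    """
    P_y = A * fy
    # Global buckling strength P_ne
    lam_c = (P_y / P_cre) ** 0.5
    if lam_c <= 1.5:
        P_ne = 0.658 ** (lam_c ** 2) * P_y
    else:
        P_ne = 0.877 / lam_c ** 2 * P_y
    # Local-global interaction strength P_nl
    lam_l = (P_ne / P_crl) ** 0.5
    if lam_l <= 0.776:
        return P_ne
    r = (P_crl / P_ne) ** 0.4
    return (1.0 - 0.15 * r) * r * P_ne

# Hypothetical lipped channel at 500 degC: elastic buckling loads scale
# with the stiffness reduction factor k_E, the yield load with k_y.
k_E, k_y = 0.6, 0.55                     # placeholder reduction factors
P_500 = dsm_column_strength(A=500.0,     # mm^2 (placeholder section)
                            fy=450.0 * k_y,
                            P_cre=k_E * 300e3,  # N, ambient value * k_E
                            P_crl=k_E * 250e3)
print(f"DSM capacity at 500 degC: {P_500 / 1e3:.1f} kN")
```

The key idea, as in the abstract, is that only the material inputs change with temperature; the strength curves themselves are kept in their ambient-temperature form.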
Abstract:
The decentralisation reform in Indonesia has mandated the Central Government to transfer some functions and responsibilities to local governments, including the transfer of human resources, assets and budgets. Local governments became giant asset holders almost overnight, and most were ill prepared to handle this transformation. Assets were transferred without analysing local government need, ability or capability to manage them, and no local government was provided with an asset management framework. Therefore, the aim of this research is to develop a Public Asset Management Framework for provincial governments in Indonesia, especially for infrastructure and real property assets. This framework will enable provincial governments to develop integrated asset management procedures throughout an asset's lifecycle. Achieving the research aim means answering the following three research questions: 1) How do provincial governments in Indonesia currently manage their public assets? 2) What factors influence the provincial governments in managing these public assets? 3) How can a Public Asset Management Framework be developed that is specific to the Indonesian provincial governments' situation? This research applied a case study approach; after a literature review, document retrieval, interviews and observations were collated. Data was collected in June 2009 (preliminary data collection) and from January to July 2010 in the major eastern Indonesian provinces. Once the public asset management framework was developed, a focus group was used to verify it. Results are threefold and indicate, first, that Indonesian provincial governments need to improve the effectiveness and efficiency of current public asset management practice in order to improve public service quality. The second result shows that the five major concerns influencing local government public asset management processes are asset identification and inventory systems; public asset holding; asset guidance and legal arrangements; asset management efficiency and effectiveness; and human resources and their organisational arrangements. Third, the framework was applied to assets already transferred to local governments and so included a system of asset identification and a needs analysis to classify the importance of these assets to local governments, their functions and their responsibilities in delivering public services. Assets that support local government functions and responsibilities will then be managed using suitable asset lifecycle processes, while those categorised as surplus should be disposed of. Additionally, functions and responsibilities that do not need an asset solution should be performed directly by local governments. These processes must be measured using performance indicators, and all stages should be guided and regulated by sufficient laws and regulations. Constant improvement in the quality and quantity of human resources plays an important role in successful public asset management. This research focuses on developing countries and contributes to knowledge of Public Asset Management Frameworks at the local government level, particularly in Indonesia. The framework provides local governments with a foundation to improve their effectiveness and efficiency in managing public assets, which could lead to improved public service quality. It will ensure that sound decisions are made throughout asset ownership and provide a better asset lifecycle process, leading to selection of the most appropriate assets, improved acquisition and delivery processes, optimised asset performance, and an appropriate disposal program.
Abstract:
Current literature warns organisations about a global ageing phenomenon. Workplace ageing is shrinking the available labour pool, with consequences for workforce sustainability in the future. This phenomenon continues to impact local government councils in Australia. Australia has one of the world's most rapidly ageing populations, and there is evidence that the workforce of Australian local government councils is already becoming unsustainable. Consequently, this research program investigated the role of older workers in the Queensland local government workplace, and how they can be enabled to extend their working lives towards transitional employment and a sustainable future workforce. Transitional Employment (TE) is intended as a strategy for enabling individuals to have greater control over their employment options and their employability during the period leading to their final exit from the workforce. There was no evidence of corporate support for older workers in Queensland local government councils other than tokenistic government campaigns encouraging organisations to "better value their older workers" (Queensland Government, 2007d, p. 6). TE is investigated as a possible intervention for older workers in the future. The international and national literature review reflected a range of matters impacting current older workers in the workforce and barriers preventing them from accessing services to extend their employment beyond the traditional retirement age (60 years) as defined by the Australian Government, the age at which individuals can access their superannuation. Learning and development services were identified as one of those barriers. There was little evidence of organisational investment in, or consistent approaches to, supporting older workers. Learning and development services appeared at best ad hoc and reactive to corporate productivity and outputs, with little recognition of the ageing phenomenon (OECD, 2006, p. 23) and looming skills and labour shortages (ALGA, 2006, p. 19). Themes from the literature review led to the establishment of three key research questions: 1. What are the current local government workforce issues impacting skills and labour retention? 2. What are the perceptions about the current workplace environment? 3. What are the expectations about learning and development towards extending the employability of older workers within the local government sector? The research questions were explored through three qualitative empirical studies, using some numerical data for reporting and comparative analysis. Empirical Study One investigated common themes for accessing transitional employment and comprised two phases. The literature review and Study One data analysis enabled the construction of an initial Transitional Employment Model (TEM) incorporating the most frequent themes. Empirical Study Two comprised focus groups to further consider those themes. This led to the identification of the issues impacting most on older workers' access to learning and development, and to a revised TEM. Findings showed majority support for transitional employment as a strategy for supporting older workers to work beyond their traditional retirement age. Those findings are presented as significant issues impacting access to transitional employment within the final three-dimensional TEM. The model is intended as a guide for local government councils responding to an ageing workforce in the future. This study argued for increased and improved corporate support, particularly learning and development services for older workers. Such support will enable older workers to maintain their employability and extend their working lives: a sustainable workforce in the future.
Abstract:
Ramp metering is an effective motorway control tool that benefits mainline traffic, but the long on-ramp queues it creates interfere profoundly with surface traffic. This study deals with the conflict between mainline benefits and the costs to on-ramp and surface traffic. A novel local on-ramp queue management strategy with mainline speed recovery is proposed. Microscopic simulation is used to test the new strategy and compare it with other strategies. Simulation results reveal that ramp metering with the queue management strategy provides a good balance between mainline and on-ramp performance.
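The abstract does not detail the proposed strategy; for orientation, below is a minimal sketch of the classic ALINEA local metering law combined with a simple queue override, a common baseline in this literature. The gains, bounds and override form are illustrative placeholders, not the paper's algorithm; occupancies are in percent, as is conventional for ALINEA.

```python
# Sketch: ALINEA local ramp metering with a queue override.
# All gains, limits and the override form are illustrative.

def metering_rate(r_prev, occ_meas, occ_target, queue, queue_max,
                  demand, T=30 / 3600, K_R=70.0,
                  r_min=200.0, r_max=1800.0):
    """Metering rate (veh/h) for the next control interval.

    r_prev     previous metering rate (veh/h)
    occ_meas   measured downstream mainline occupancy (%)
    occ_target target occupancy near critical occupancy (%)
    queue      current on-ramp queue (veh)
    queue_max  maximum tolerable on-ramp queue (veh)
    demand     on-ramp arrival demand (veh/h)
    T          control interval (h)
    """
    # ALINEA feedback law: push mainline occupancy toward its target.
    r_alinea = r_prev + K_R * (occ_target - occ_meas)
    # Queue override: release enough vehicles to drain any excess
    # queue within one interval, protecting the surface streets.
    r_queue = demand + max(0.0, queue - queue_max) / T
    r = max(r_alinea, r_queue)
    return min(max(r, r_min), r_max)

# Example step: congested mainline, but the queue is over its limit,
# so the override wins and the ramp is released faster.
r = metering_rate(r_prev=900.0, occ_meas=32.0, occ_target=25.0,
                  queue=48.0, queue_max=40.0, demand=600.0)
print(f"metering rate: {r:.0f} veh/h")
```

The `max(r_alinea, r_queue)` step is where the mainline/on-ramp trade-off the study addresses shows up: the override sacrifices mainline benefit whenever the queue cost becomes unacceptable.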
Abstract:
Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, a randomization stage that introduces non-invertibility and compression, and quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security aspects required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, an essential requirement for non-invertibility, and is also designed to produce features more suited for quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
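The generic pipeline the abstract describes (feature extraction, linear randomization, quantization against learned thresholds, binary encoding) can be illustrated with a toy sketch; the key-seeded random projection, median thresholds and sizes below are illustrative choices, not the dissertation's algorithms.

```python
# Toy robust-hash pipeline: key-seeded random projection of a feature
# vector, then quantization at thresholds learned from training data.
import numpy as np

def train_quantizer(train_features, key, n_bits=64):
    """Learn per-bit quantization thresholds (medians here)."""
    rng = np.random.default_rng(key)
    # Linear randomization stage (note: this linearity is exactly
    # what the dissertation identifies as a security weakness).
    proj = rng.standard_normal((train_features.shape[1], n_bits))
    projected = train_features @ proj
    return proj, np.median(projected, axis=0)

def robust_hash(features, proj, thresholds):
    """Binary hash: 1 where a projected feature exceeds its threshold."""
    return (features @ proj > thresholds).astype(np.uint8)

# Illustrative use with random stand-in "image features".
train = np.random.default_rng(0).standard_normal((500, 128))
proj, thr = train_quantizer(train, key=42)
img = np.random.default_rng(1).standard_normal(128)
near_dup = img + 0.05 * np.random.default_rng(2).standard_normal(128)
h1 = robust_hash(img, proj, thr)
h2 = robust_hash(near_dup, proj, thr)
# Minor input changes should yield a small Hamming distance.
print("Hamming distance:", int(np.sum(h1 != h2)), "of", h1.size)
```

The learned thresholds (`thr`) are the information-leakage point the dissertation highlights: anyone holding them knows where each feature was cut, which constrains the original data.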
Abstract:
In this letter the core-core-valence Auger transitions of an atomic impurity, either in the bulk or adsorbed on a jellium-like surface, are computed within a DFT framework. The Auger rates calculated by the Fermi golden rule are compared with those determined by an approximate and simpler expression based on the local density of states (LDOS), with a core hole present, in a region around the impurity nucleus. Different atoms (Na and Mg), solids (Al and Ag), and several impurity locations are considered. We obtain excellent agreement between the KL1V and KL23V rates worked out with the two approaches. The radius of the sphere in which the LDOS is calculated is the relevant parameter of the simpler approach; its value depends only on the atomic species, regardless of the location of the impurity and the type of substrate.
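For reference, the golden-rule transition rate has the textbook form (sketched generically here; the matrix elements and the LDOS-based approximation are specific to the letter's DFT setup):

\[
\Gamma_{i \to f} = \frac{2\pi}{\hbar}\, \bigl|\langle f | \hat{V} | i \rangle\bigr|^{2}\, \rho(E_f),
\]

where \(\rho(E_f)\) is the density of final states. The simpler estimate in the letter replaces the full matrix-element calculation with the LDOS, computed with a core hole present, inside a sphere around the impurity nucleus.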
Abstract:
In order to meet the land use and infrastructure needs of the community under the additional challenges posed by climate change and a global recession, it is essential that Queensland local governments test their proposed integrated land use and infrastructure plans to ensure the maximum achievement of triple-bottom-line sustainability goals. Extensive regulatory impact assessment systems are in place at the Australian and state government levels to substantiate and test policy and legislative proposals; however, no such requirement has been extended to the local government level. This paper contends that, with the devolution of responsibility to local government and the growing impacts of local government planning and development assessment activities, impact assessment of regulatory planning instruments is appropriate and overdue. This is particularly so in the Queensland context, where local governments manage metropolitan and regional scale responsibilities and their planning schemes under the Sustainable Planning Act 2009 integrate land use and infrastructure planning to direct development rights, the spatial allocation of land, and infrastructure investment. It is critical that urban planners have access to fit-for-purpose impact assessment frameworks which support this challenging task and address the important relationship between local planning and sustainable urban development. This paper uses two examples of sustainability impact assessment and a case study from the Queensland local urban planning context to build an argument and a potential starting point for impact assessment in local planning processes.
Abstract:
Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. This paper contains a comprehensive set of analytical benchmark solutions for steel frames comprising non-compact sections, which can be used to verify the accuracy of simplified concentrated plasticity methods of advanced analysis. The analytical benchmark solutions were obtained using a distributed plasticity shell finite element model that explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. A brief description and verification of the shell finite element model is provided in this paper.
Abstract:
Text categorisation is challenging due to the complex structure and heterogeneous, changing topics in documents. The performance of text categorisation relies on the quality of samples, the effectiveness of document features, and the topic coverage of categories, depending on the strategies employed: supervised or unsupervised, single-labelled or multi-labelled. To address these reliability issues in text categorisation, we propose an unsupervised multi-labelled text categorisation approach that maps the local knowledge in documents to global knowledge in a world ontology to optimise the categorisation result. The conceptual framework of the approach consists of three modules: pattern mining for feature extraction; feature-subject mapping for categorisation; and concept generalisation for optimised categorisation. The approach has been evaluated promisingly by comparison with typical text categorisation methods, based on ground truth encoded by human experts.
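As a toy illustration of mapping local document knowledge to global ontology knowledge, the sketch below uses an invented term-to-subject mapping and a tiny hand-made ontology with one level of generalisation; none of it is the paper's pattern-mining or generalisation machinery.

```python
# Toy sketch: map frequent document terms ("local knowledge") to
# subjects in a small hand-made ontology, then generalise the labels
# upward to parent concepts. Everything here is illustrative.
from collections import Counter

ONTOLOGY_PARENT = {            # child subject -> parent concept
    "machine learning": "computer science",
    "databases": "computer science",
    "genetics": "biology",
}
TERM_TO_SUBJECT = {            # document feature -> ontology subject
    "classifier": "machine learning",
    "query": "databases",
    "genome": "genetics",
}

def categorise(doc_terms, min_count=2):
    """Multi-label categorisation: subjects hit by frequent terms,
    generalised to their parent concepts."""
    counts = Counter(doc_terms)
    subjects = {TERM_TO_SUBJECT[t] for t, c in counts.items()
                if c >= min_count and t in TERM_TO_SUBJECT}
    # Concept generalisation: add parents for broader topic coverage.
    return subjects | {ONTOLOGY_PARENT[s] for s in subjects}

doc = ["classifier", "classifier", "query", "query", "genome"]
print(categorise(doc))
# -> {'machine learning', 'databases', 'computer science'}
```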
Abstract:
Two recent decisions of the Supreme Court of New South Wales in the context of obstetric management have highlighted, firstly, the importance of keeping legible, accurate and detailed medical records and, secondly, the challenges faced by those seeking to establish causation, particularly where epidemiological evidence is relied upon...
Abstract:
In a classification problem we typically face two challenging issues: the diverse characteristics of negative documents, and the fact that many negative documents are close to positive documents. It is therefore hard for a single classifier to clearly classify incoming documents into classes. This paper proposes a novel gradual problem-solving approach to create a two-stage classifier. The first stage identifies reliable negatives (negative documents with weak positive characteristics) and concentrates on minimizing the number of false negative documents (recall-oriented). We use Rocchio, an existing recall-based classifier, for this stage. The second stage is a precision-oriented "fine tuning" which concentrates on minimizing the number of false positive documents by applying pattern (statistical phrase) mining techniques. In this stage, pattern-based scoring is followed by threshold setting (thresholding). Experiments show that our statistical-phrase-based two-stage classifier is promising.
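A minimal sketch of the two-stage idea follows: a Rocchio-style centroid score for the recall-oriented first stage, then a simple phrase-pattern score with a threshold for the precision-oriented second stage. The toy data, weights and thresholds are illustrative, not the paper's actual features or method.

```python
# Toy two-stage classifier: stage 1 removes "reliable negatives" with
# a Rocchio-style centroid score; stage 2 re-ranks the survivors with
# a simple pattern (phrase) score and a threshold.
import numpy as np

def rocchio_scores(X, X_pos, X_neg, beta=16.0, gamma=4.0):
    """Stage 1: cosine similarity to the Rocchio positive prototype."""
    proto = beta * X_pos.mean(axis=0) - gamma * X_neg.mean(axis=0)
    proto /= np.linalg.norm(proto) + 1e-12
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    return Xn @ proto

def pattern_scores(docs, patterns):
    """Stage 2: weighted count of discriminative phrases per document."""
    return np.array([sum(w for p, w in patterns.items() if p in d)
                     for d in docs])

# Illustrative data: tiny term-frequency vectors plus the raw text.
X_pos = np.array([[3., 1., 0.], [2., 2., 0.]])   # positive training docs
X_neg = np.array([[0., 1., 3.], [0., 0., 4.]])   # negative training docs
docs = ["local binary pattern method", "stock market report"]
X = np.array([[2., 1., 0.], [0., 1., 3.]])       # incoming documents

stage1 = rocchio_scores(X, X_pos, X_neg)
survivors = stage1 > 0.0                  # drop reliable negatives
stage2 = pattern_scores(docs, {"binary pattern": 2.0, "method": 1.0})
predicted_pos = survivors & (stage2 >= 2.0)   # precision-oriented cut
print(predicted_pos)                      # [ True False]
```

The division of labour mirrors the abstract: stage 1 errs toward keeping borderline documents (protecting recall), and stage 2's threshold discards them unless strong positive patterns are present (protecting precision).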
Abstract:
Atherosclerotic cardiovascular disease remains the leading cause of morbidity and mortality in industrialized societies. The lack of metabolite biomarkers has so far impeded the clinical diagnosis of atherosclerosis. In this study, stable atherosclerosis patients (n=16) and age- and sex-matched non-atherosclerotic healthy subjects (n=28) were recruited from the local community (Harbin, P. R. China). Plasma was collected from each study subject and subjected to metabolomic analysis by GC/MS. Pattern recognition analyses (principal components analysis, orthogonal partial least-squares discriminant analysis, and hierarchical clustering analysis) consistently demonstrated that the plasma metabolome differed significantly between atherosclerotic and non-atherosclerotic subjects. Atherosclerosis-induced metabolic perturbations of fatty acids, such as palmitate, stearate, and 1-monolinoleoylglycerol, were confirmed, consistent with previous publications showing that palmitate contributes significantly to atherosclerosis development via apoptosis and inflammation pathways. Altogether, this study demonstrated that the development of atherosclerosis directly perturbs fatty acid metabolism, especially that of palmitate, which was confirmed as a phenotypic biomarker for the clinical diagnosis of atherosclerosis.
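The pattern-recognition step reported here can be sketched with standard tooling. The metabolite matrix below is random stand-in data, and PCA followed by hierarchical clustering stands in for the study's fuller PCA/OPLS-DA/HCA workflow (OPLS-DA is omitted, as it is not part of the standard scientific Python stack).

```python
# Sketch: unsupervised pattern recognition on a metabolite table
# (samples x metabolites). Random stand-in data, not study data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# 16 "patients" and 28 "controls" over 50 metabolites; shift a few
# fatty-acid-like variables in the patient group to mimic a signal.
controls = rng.normal(size=(28, 50))
patients = rng.normal(size=(16, 50))
patients[:, :3] += 1.5        # e.g. palmitate-like features elevated

X = StandardScaler().fit_transform(np.vstack([patients, controls]))
scores = PCA(n_components=2).fit_transform(X)

# Hierarchical clustering (Ward linkage) on the PCA scores.
labels = fcluster(linkage(scores, method="ward"),
                  t=2, criterion="maxclust")
print("first two PCs:\n", scores[:3].round(2))
print("cluster sizes:", np.bincount(labels)[1:])
```

If the group separation is real, the two recovered clusters should largely track the patient/control split, which is the qualitative claim the study's pattern-recognition analyses support.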
Abstract:
BACKGROUND & AIMS: Metabolomics is the comprehensive analysis of low-molecular-weight endogenous metabolites in a biological sample. It can map perturbations of early biochemical changes in diseases and hence provides an opportunity to develop predictive biomarkers that could yield valuable insights into disease mechanisms. The aim of this study was to elucidate the changes in endogenous metabolites and to phenotype the metabolic profile of d-galactosamine (GalN)-induced acute hepatitis in rats by UPLC-ESI MS. METHODS: The systemic biochemical actions of GalN administration (ip, 400 mg/kg) were investigated in male Wistar rats using conventional clinical chemistry, liver histopathology and metabolomic analysis of urine by UPLC-ESI MS. Urine was collected predose (-24 to 0 h) and 0-24, 24-48, 48-72 and 72-96 h post-dose. Mass spectra of the urine were analysed visually and in conjunction with multivariate data analysis. RESULTS: The results demonstrated a time-dependent biochemical effect of GalN dosing on the levels of a range of low-molecular-weight metabolites in urine, which correlated with the developing phase of GalN-induced acute hepatitis. Urinary excretion of beta-hydroxybutanoic acid and citric acid decreased following GalN dosing, whereas excretion of glycocholic acid, indole-3-acetic acid, sphinganine, N-acetyl-L-phenylalanine, cholic acid and creatinine increased, suggesting that several key metabolic pathways, such as energy metabolism, lipid metabolism and amino acid metabolism, were perturbed by GalN. CONCLUSION: This metabolomic investigation demonstrates that this robust non-invasive tool offers insight into the metabolic states of diseases.