986 results for Minimum local
Abstract:
This paper investigates a wireless sensor network deployment for monitoring water quality, e.g. salinity and the level of the underground water table, in a remote tropical area of northern Australia. Our goal is to collect real-time water quality measurements together with the amount of water being pumped out of the area, and to investigate the impacts of current irrigation practice on the environment, in particular underground water salination. This is a challenging task featuring wide geographic coverage (the mean transmission range between nodes is more than 800 meters), highly variable radio propagation, high end-to-end packet delivery rate requirements, and hostile deployment environments. We have designed, implemented and deployed a sensor network system, which has been collecting water quality and flow measurements, e.g., water flow rate and water flow ticks, for over one month. The preliminary results show that sensor networks are a promising solution for deploying a sustainable irrigation system, e.g., maximizing the amount of water pumped out of an area with minimum impact on water quality.
Abstract:
Large deformation analysis is one of the major challenges in numerical modelling and simulation of metal forming. Because no mesh is used, meshfree methods show good potential for large deformation analysis. In this paper, a local meshfree formulation, based on local weak forms and the updated Lagrangian (UL) approach, is developed for large deformation analysis. To fully exploit the advantages of meshfree methods, a simple and effective adaptive technique is proposed; this procedure is much easier than re-meshing in FEM. Numerical examples of large deformation analysis are presented to demonstrate the effectiveness of the newly developed nonlinear meshfree approach. It has been found that the developed meshfree technique provides superior performance to the conventional FEM in dealing with large deformation problems in metal forming.
Abstract:
In public venues, crowd size is a key indicator of crowd safety and stability. In this paper we propose a crowd counting algorithm that uses tracking and local features to count the number of people in each group as represented by a foreground blob segment, so that the total crowd estimate is the sum of the group sizes. Tracking is employed to improve the robustness of the estimate, by analysing the history of each group, including splitting and merging events. A simplified ground truth annotation strategy results in an approach with minimal setup requirements that is highly accurate.
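The counting idea above can be made concrete with a small sketch. This is an illustrative Python example, not the authors' implementation: estimate_group_size stands in for whatever regression from local blob features to a person count is used, split/merge handling is omitted, and the track history is simply averaged to stabilise each group's estimate.

```python
# Minimal sketch of blob-level crowd counting with track-history smoothing.
# Assumptions (not from the paper): groups arrive as foreground blobs with
# extracted local features, and `estimate_group_size` is a hypothetical
# regressor mapping blob features (e.g. area) to a person count.
from dataclasses import dataclass, field

@dataclass
class GroupTrack:
    history: list = field(default_factory=list)  # per-frame size estimates

    def update(self, size_estimate: float) -> None:
        self.history.append(size_estimate)

    def smoothed_size(self, window: int = 10) -> float:
        # Average the recent history to damp per-frame segmentation noise.
        recent = self.history[-window:]
        return sum(recent) / len(recent)

def estimate_group_size(blob_features: dict) -> float:
    # Hypothetical stand-in for a trained regression on local features.
    return blob_features["area"] / blob_features.get("pixels_per_person", 450.0)

def frame_crowd_count(tracks: dict, blobs: dict) -> float:
    """Total crowd estimate = sum of smoothed per-group estimates."""
    for blob_id, features in blobs.items():
        track = tracks.setdefault(blob_id, GroupTrack())
        track.update(estimate_group_size(features))
    return sum(t.smoothed_size() for t in tracks.values())

# Example: two blobs tracked over one frame.
tracks = {}
blobs = {"blob_0": {"area": 2700.0}, "blob_1": {"area": 900.0}}
print(frame_crowd_count(tracks, blobs))  # 8.0 people
```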
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In the lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
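As an illustration of the distribution-modelling step, the sketch below fits a generalized Gaussian shape parameter to a set of subband coefficients. The thesis formulates a least-squares estimator on a nonlinear function of the shape parameter; this example instead uses the classical moment-ratio equation, purely to show the kind of scalar nonlinear equation in the shape parameter that has to be solved, and should not be read as the author's exact procedure.

```python
# Illustrative fit of a generalized Gaussian model to wavelet subband
# coefficients via the moment-ratio method: solve r(beta) = E[|x|]^2 / E[x^2].
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_ratio(beta: float) -> float:
    # r(beta) = Gamma(2/beta)^2 / (Gamma(1/beta) * Gamma(3/beta))
    return gamma(2.0 / beta) ** 2 / (gamma(1.0 / beta) * gamma(3.0 / beta))

def fit_ggd_shape(coeffs: np.ndarray) -> float:
    """Estimate the GGD shape parameter from subband coefficients."""
    m1 = np.mean(np.abs(coeffs))
    m2 = np.mean(coeffs ** 2)
    target = m1 ** 2 / m2
    # Solve the scalar nonlinear equation r(beta) = target by bracketing.
    return brentq(lambda b: ggd_ratio(b) - target, 0.1, 5.0)

# Example: coefficients drawn from a Laplacian (a GGD with beta = 1).
rng = np.random.default_rng(0)
coeffs = rng.laplace(scale=1.0, size=100_000)
print(fit_ggd_shape(coeffs))  # close to 1.0
```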
Abstract:
This work is focussed on developing a commissioning procedure so that a Monte Carlo model, which uses BEAMnrc’s standard VARMLC component module, can be adapted to match a specific BrainLAB m3 micro-multileaf collimator (μMLC). A set of measurements is recommended for use as a reference against which the model can be tested and optimised. These include radiochromic film measurements of dose from small and offset fields, as well as measurements of μMLC transmission and interleaf leakage. Simulations and measurements to obtain μMLC scatter factors are shown to be insensitive to the relevant model parameters and are therefore not recommended, unless the output of the linear accelerator model is in doubt. Ultimately, this note provides detailed instructions for those intending to optimise a VARMLC model to match the dose delivered by their local BrainLAB m3 μMLC device.
Abstract:
Policymakers often propose strict enforcement strategies to fight the shadow economy and to increase tax morale. However, there is an alternative bottom-up approach that decentralises political power to those who are close to the problems. This paper analyses the relationship between local autonomy and tax compliance. We use data on tax morale at the individual level and macro data on the size of the shadow economy to analyse the relevance of local autonomy for compliance in Switzerland. The findings suggest that there is a positive (negative) relationship between local autonomy and tax morale (the size of the shadow economy).
Abstract:
There is widespread recognition of the need to better manage municipal property in most cities in the world. Structural problems across regional, state, and territorial governments that have legal powers to own and maintain real property are similar regardless of each country's level of development, starting from a very basic level of property inventory records. The need to better manage local-government-owned property is the result of widespread decentralisation initiatives that have often devolved huge property portfolios from central to local governments almost “overnight”. At the same time, municipal and regional governments were, and continue to be, unprepared to deal with the multiple issues involved in the role of property owner and manager. The lack of discussion of public asset management, especially of the elements that should be incorporated in a framework, creates an important challenge for studying the discipline of public asset management further. The aim of this paper is to study the practices of public asset management in developed countries, especially the elements of a public asset management framework, and their transferability to developing countries. A case study, involving interviews and a focus group, was conducted to achieve this aim. The study found that proper asset identification, public asset needs analysis, asset life cycle and performance measurement are important elements that should be incorporated in a public asset management framework. Those elements are transferable and applicable to developing countries' local governments. Finally, findings from this study provide useful input for local government policy makers, scholars and asset management practitioners to establish a public asset management framework toward more efficient and effective local governments in managing their assets, as well as increasing the quality of public services.
Abstract:
Robustness of the track allocation problem is rarely addressed in the literature, and the resulting track allocation schemes (TAS) contain bottlenecks. Therefore, an approach to detect bottlenecks is needed to support local optimization. First, a TAS is transformed into an executable model using Petri nets. Then, disturbance analysis is performed on the model: each train in turn is subjected to a disturbance, and the total departure delay of all trains is collected as the indicator used to detect bottlenecks. Finally, tests based on a rail hub linking six lines and a TAS spanning about thirty minutes show that the minimum buffer time is 21 seconds and that there are two bottlenecks where the buffer times are 57 and 44 seconds respectively, indicating that bottlenecks are not necessarily located where the buffer time is minimal. The proposed approach can further support selection among multiple schemes and robustness optimization.
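A toy version of the disturbance analysis clarifies what the bottleneck indicator measures. The sketch below is not the Petri-net model from the paper; it assumes a TAS given as a list of scheduled track-section occupations and a simple first-come rule for how a delayed train pushes back its successors on the same section. The field names and the delay-propagation rule are assumptions for illustration only.

```python
# Perturb each train in turn and measure the total departure delay that
# propagates through the schedule; trains whose disturbance causes large
# total delays point to bottlenecks in the track allocation scheme.
from dataclasses import dataclass

@dataclass
class Occupation:
    train: str
    resource: str       # track section or platform (assumed identifier)
    start: int          # scheduled entry time (seconds)
    end: int            # scheduled exit time (seconds)

def propagate(schedule: list[Occupation], disturbed: str, delay: int) -> int:
    """Total departure delay (seconds) over all trains when `disturbed` is delayed."""
    delays = {occ.train: (delay if occ.train == disturbed else 0) for occ in schedule}
    free_at: dict[str, int] = {}
    for occ in sorted(schedule, key=lambda o: o.start):
        entry = occ.start + delays[occ.train]
        entry = max(entry, free_at.get(occ.resource, 0))   # wait for the section
        delays[occ.train] = max(delays[occ.train], entry - occ.start)
        free_at[occ.resource] = entry + (occ.end - occ.start)
    return sum(delays.values())

# Example: train B follows A on the same section with a 20-second buffer.
tas = [Occupation("A", "S1", 0, 60), Occupation("B", "S1", 80, 140)]
print(propagate(tas, "A", 30))  # 40: part of A's delay propagates to B
```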
Abstract:
Various countries have been introducing sustainability assessment tools for real estate design so that sustainability components are integrated not just for the building, but also for the landscape component of a development. This paper aims to present a comparison between international and local assessment tools for the landscape design of housing estate developments in the Bangkok Metropolitan Region (BMR), Thailand. The methodology is to review the tools, then compare them and identify discrepancies among their indicators. The paper examines four international tools, LEED for Neighbourhood Development (LEED-ND) of the United States of America (USA), the EnviroDevelopment standards of Australia, Residential Landscape Sustainability of the United Kingdom (UK) and Green Mark for Infrastructure of Singapore, and three of BMR’s existing tools, the Land Subdivision Act B.E. 2543, the Environmental Impact Assessment Monitoring Awards (EIA-MA) and Thai’s Rating for Energy and Environmental Sustainability of New construction and major renovation (TREES-NC). The findings show that there are twenty-two elements across three categories: neighbourhood design, community management, and environmental condition. Moreover, only one element of neighbourhood design differs between the international and local tools. Sustainability assessment tools already exist in BMR, but no single tool covers all of the elements. Thus, the development of a new comprehensive assessment tool will be necessary in BMR; it should, however, meet the specific environmental and climatic conditions of housing estate development in BMR.
Abstract:
This study investigates the application of local search methods to the railway junction traffic conflict-resolution problem, with the objective of attaining a quick and reasonable solution. A procedure based on local search relies on finding a better solution than the current one by searching in the neighbourhood of the current solution. The structure of the neighbourhood is therefore very important for an efficient local search procedure. In this paper, the formulation of the structure of the solution, which is the right-of-way sequence assignment, is first described. Two new neighbourhood definitions are then proposed, and the performance of the corresponding local search procedures is evaluated by simulation. It is shown that they provide similar results but can be used to handle different traffic conditions and system requirements.
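A small sketch can make the role of the neighbourhood definition concrete. The cost function and the two neighbourhoods below (adjacent swaps of the right-of-way sequence versus swaps of any pair of trains) are illustrative assumptions, not the definitions proposed in the paper.

```python
# Hill-climbing local search over a right-of-way sequence at a junction,
# comparing two neighbourhood definitions on a toy delay cost.
import itertools, random

def total_delay(sequence, arrival, headway=120):
    """Hypothetical cost: total delay when trains cross the junction in order."""
    t, cost = 0, 0
    for train in sequence:
        t = max(t, arrival[train])   # cannot cross before arriving
        cost += t - arrival[train]
        t += headway                 # junction blocked for the headway time
    return cost

def adjacent_swaps(seq):
    # Neighbourhood 1: swap two consecutive trains in the sequence.
    for i in range(len(seq) - 1):
        s = list(seq); s[i], s[i + 1] = s[i + 1], s[i]
        yield tuple(s)

def all_pair_swaps(seq):
    # Neighbourhood 2: swap any pair of trains in the sequence.
    for i, j in itertools.combinations(range(len(seq)), 2):
        s = list(seq); s[i], s[j] = s[j], s[i]
        yield tuple(s)

def local_search(seq, arrival, neighbourhood):
    best, best_cost = seq, total_delay(seq, arrival)
    improved = True
    while improved:
        improved = False
        for cand in neighbourhood(best):
            c = total_delay(cand, arrival)
            if c < best_cost:
                best, best_cost, improved = cand, c, True
                break                # first-improvement move, then restart
    return best, best_cost

arrival = {"T1": 0, "T2": 30, "T3": 45, "T4": 200}
start = tuple(random.sample(list(arrival), k=4))
for nb in (adjacent_swaps, all_pair_swaps):
    print(nb.__name__, *local_search(start, arrival, nb))
```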
Abstract:
This paper investigates how contemporary works of women’s travel writing are reworking canonical formations of environmental literature by presenting imaginative accounts of travel writing that are both literal and metaphorical. In this context, the paper considers how women who travel/write may intersect the spatial hybridities of travel writing and nature writing, and in doing so, create a new genre of environmental literature that is not only ecologically sensitive but gendered. As the role of female travel writers in generating this knowledge is immense but largely unexamined, this paper will investigate how a feminist geography can be applied, both critically and creatively, to local accounts of travel. It will draw on my own travels around Queensland in an attempt to explore how many female storytellers situate themselves, in and against, various discourses of mobility and morality.