975 results for Random close packing
Abstract:
In this paper, we consider a time-space fractional diffusion equation of distributed order (TSFDEDO). The TSFDEDO is obtained from the standard advection-dispersion equation by replacing the first-order time derivative by the Caputo fractional derivative of order α∈(0,1], and the first-order and second-order space derivatives by the Riesz fractional derivatives of orders β₁∈(0,1) and β₂∈(1,2], respectively. We derive the fundamental solution for the TSFDEDO with an initial condition (TSFDEDO-IC). The fundamental solution can be interpreted as a spatial probability density function evolving in time. We also investigate a discrete random walk model based on an explicit finite difference approximation for the TSFDEDO-IC.
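The abstract only describes the construction in words; as a hedged sketch of the single-order form (the generic coefficients a and b are assumptions here, and the distributed-order version additionally integrates the time-fractional term against a weight function of α), the equation reads:

```latex
% Sketch of a time-space fractional advection-dispersion equation built as described:
% Caputo time derivative of order alpha, Riesz space derivatives of orders beta_1, beta_2.
\[
  {}^{C}\!D_{t}^{\alpha}\, u(x,t)
    = -a\,\frac{\partial^{\beta_1} u(x,t)}{\partial |x|^{\beta_1}}
      + b\,\frac{\partial^{\beta_2} u(x,t)}{\partial |x|^{\beta_2}},
  \qquad \alpha \in (0,1],\ \beta_1 \in (0,1),\ \beta_2 \in (1,2].
\]
```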
Abstract:
Prawns are a substantial Australian resource but presently are processed in a very labour-intensive manner. A prototype system has been developed for automatically grading and packing prawns into single-layer 'consumer packs' in which each prawn is approximately straight and has the same orientation. The novel technology includes a machine vision system that has been specially programmed to calculate relevant parameters at high speed and a gripper mechanism that can acquire, straighten and place prawns of various sizes. The system can be implemented on board a trawler or in an onshore processing facility. © 1993.
Abstract:
Objectives: To evaluate the validity, reliability and responsiveness of electronic data capture (EDC) using the WOMAC® NRS 3.1 Index on Motorola V3 mobile phones. ---------- Methods: Patients with osteoarthritis (OA) undergoing primary unilateral hip or knee joint replacement surgery were assessed pre-operatively and 3-4 months post-operatively. Patients completed the WOMAC® Index in paper (p-WOMAC®) and electronic (m-WOMAC®) format in random order. ---------- Results: 24 men and 38 women with hip and knee OA participated and successfully completed the m-WOMAC® questionnaire. Pearson correlations between the summated total index scores for the p-WOMAC® and m-WOMAC® pre- and post-surgery were 0.98 and 0.99 (p<0.0001). There was no clinically important or statistically significant between-method difference in the adjusted total summated scores, pre- and post-surgery (adjusted mean difference = 4.44, p = 0.474 and 1.73, p = 0.781). Internal consistency estimates of m-WOMAC® reliability were 0.87 – 0.98. The m-WOMAC® detected clinically important, statistically significant (p<0.0001) improvements in pain, stiffness, function and total index score. ---------- Conclusions: Sixty-two patients with hip and knee OA successfully completed EDC by Motorola V3 mobile phone using the m-WOMAC® NRS 3.1 Index, with completion times averaging only 1-1.5 minutes longer than the p-WOMAC® Index. Data were successfully and securely transmitted from patients in Australia to a server in the USA. There was close agreement and no significant differences between m-WOMAC® and p-WOMAC® scores. This study confirms the validity, reliability and responsiveness of the Exco InTouch engineered, Java-based m-WOMAC® Index application. EDC with the m-WOMAC® Index provides unique opportunities for using quantitative measurement in clinical research and practice.
Abstract:
This paper describes the approach taken to the clustering task at INEX 2009 by a group at the Queensland University of Technology. The Random Indexing (RI) K-tree has been used with a representation that is based on the semantic markup available in the INEX 2009 Wikipedia collection. The RI K-tree is a scalable approach to clustering large document collections. This approach has produced quality clustering when evaluated using two different methodologies.
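The abstract does not detail the Random Indexing representation; as a minimal sketch of the standard RI technique it refers to (the dimensions, sparsity and function names below are illustrative assumptions, not taken from the paper), each term receives a sparse random index vector and a document vector is the sum of the index vectors of its terms:

```python
import numpy as np

def index_vector(dim=1000, nonzeros=10, rng=None):
    """Sparse ternary random index vector: a few +1/-1 entries, rest zero."""
    rng = rng or np.random.default_rng()
    v = np.zeros(dim)
    positions = rng.choice(dim, size=nonzeros, replace=False)
    v[positions] = rng.choice([-1.0, 1.0], size=nonzeros)
    return v

def document_vector(terms, term_index, dim=1000, rng=None):
    """Represent a document as the sum of its terms' random index vectors."""
    rng = rng or np.random.default_rng()
    doc = np.zeros(dim)
    for term in terms:
        if term not in term_index:
            term_index[term] = index_vector(dim, rng=rng)
        doc += term_index[term]
    return doc

# Reduced-dimension document vectors that could then be clustered (e.g. with a K-tree).
term_index = {}
docs = [document_vector(text.split(), term_index)
        for text in ["random indexing k tree", "clustering the wikipedia collection"]]
```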
Abstract:
In cloud computing, resource allocation and scheduling of multiple composite web services is an important challenge. This is especially so in a hybrid cloud, where there may be some free resources available from private clouds but some fee-paying resources from public clouds. Meeting this challenge involves two classical computational problems. One is assigning resources to each of the tasks in the composite web service. The other is scheduling the allocated resources when each resource may be used by more than one task and may be needed at different points of time. In addition, we must consider Quality-of-Service issues, such as execution time and running costs. Existing approaches to resource allocation and scheduling in public clouds and grid computing are not applicable to this new problem. This paper presents a random-key genetic algorithm that solves this new resource allocation and scheduling problem. Experimental results demonstrate the effectiveness and scalability of the algorithm.
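The abstract does not specify the chromosome encoding; the following is a minimal sketch of how a random-key chromosome is commonly decoded into a task-to-resource assignment and a task ordering (all names and the simple decoding rule are illustrative assumptions, not the paper's method):

```python
import random

def decode(keys, num_tasks, num_resources):
    """Decode a random-key chromosome of length 2 * num_tasks:
    the first half maps each task to a resource, the second half
    gives task priorities (smaller key = scheduled earlier)."""
    assignment = [min(int(k * num_resources), num_resources - 1) for k in keys[:num_tasks]]
    order = sorted(range(num_tasks), key=lambda i: keys[num_tasks + i])
    return assignment, order

def random_chromosome(num_tasks):
    """Initial individual: uniform random keys in [0, 1)."""
    return [random.random() for _ in range(2 * num_tasks)]

# A GA would evolve such key vectors with crossover and mutation, scoring each
# decoded assignment/order against QoS measures such as execution time and cost.
chromosome = random_chromosome(num_tasks=5)
assignment, order = decode(chromosome, num_tasks=5, num_resources=3)
```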
Abstract:
This thesis is devoted to the study of linear relationships in symmetric block ciphers. A block cipher is designed so that the ciphertext is produced as a nonlinear function of the plaintext and secret master key. However, linear relationships within the cipher can still exist if the texts and components of the cipher are manipulated in a number of ways, as shown in this thesis. There are four main contributions of this thesis. The first contribution is the extension of the applicability of integral attacks from word-based to bit-based block ciphers. Integral attacks exploit the linear relationship between texts at intermediate stages of encryption. This relationship can be used to recover subkey bits in a key recovery attack. In principle, integral attacks can be applied to bit-based block ciphers. However, specific tools to define the attack on these ciphers are not available. This problem is addressed in this thesis by introducing a refined set of notations to describe the attack. The bit pattern-based integral attack is successfully demonstrated on reduced-round variants of the block ciphers Noekeon, Present and Serpent. The second contribution is the discovery of a very small system of equations that describes the LEX-AES stream cipher. LEX-AES is based heavily on the 128-bit-key (16-byte) Advanced Encryption Standard (AES) block cipher. In one instance, the system contains 21 equations and 17 unknown bytes. This is very close to the upper limit for an exhaustive key search, which is 16 bytes. Only 36 bytes of keystream need to be acquired to generate the equations. Therefore, the security of this cipher depends on the difficulty of solving this small system of equations. The third contribution is the proposal of an alternative method to measure diffusion in the linear transformation of Substitution-Permutation-Network (SPN) block ciphers. Currently, the branch number is widely used for this purpose. It is useful for estimating the possible success of differential and linear attacks on a particular SPN cipher. However, the measure does not give information on the number of input bits that are left unchanged by the transformation when producing the output bits. The new measure introduced in this thesis is intended to complement the current branch number technique. The measure is based on fixed points and simple linear relationships between the input and output words of the linear transformation. The measure represents the average fraction of input words to a linear diffusion transformation that are not effectively changed by the transformation. This measure is applied to the block ciphers AES, ARIA, Serpent and Present. It is shown that except for Serpent, the linear transformations used in the block ciphers examined do not behave as expected for a random linear transformation. The fourth contribution is the identification of linear paths in the nonlinear round function of the SMS4 block cipher. The SMS4 block cipher is used as a standard in the Chinese WLAN Authentication and Privacy Infrastructure (WAPI) and hence, the round function should exhibit a high level of nonlinearity. However, the findings in this thesis on the existence of linear relationships show that this is not the case. It is shown that in some exceptional cases, the first four rounds of SMS4 are effectively linear. In these cases, the effective number of rounds for SMS4 is reduced by four, from 32 to 28.
The findings raise questions about the security provided by SMS4, and might provide clues on the existence of a flaw in the design of the cipher.
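As a hedged illustration of the fixed-point-based diffusion measure from the third contribution (the toy transformation, word size and counting rule below are assumptions, not the thesis's exact definition), one can estimate the average fraction of input words that a linear layer leaves unchanged:

```python
import random

def unchanged_word_fraction(transform, num_words, trials=10000, word_bits=8):
    """Average fraction of input words that a linear layer leaves unchanged,
    estimated over random inputs (a simple fixed-point-style diffusion indicator)."""
    mask = (1 << word_bits) - 1
    total = 0
    for _ in range(trials):
        state = [random.randint(0, mask) for _ in range(num_words)]
        out = transform(state)
        total += sum(1 for a, b in zip(state, out) if a == b)
    return total / (trials * num_words)

# Toy 4-word XOR-based mixing layer: word 0 passes through untouched,
# so the measure stays close to 0.25 rather than near zero.
def toy_linear_layer(w):
    return [w[0], w[1] ^ w[2], w[2] ^ w[3], w[3] ^ w[0]]

print(unchanged_word_fraction(toy_linear_layer, num_words=4))
```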
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of information packing performance of several decompositions, two-dimensional power spectral density, effect of each frequency band on the reconstructed image, and the human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criteria as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, the lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters known as truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project these on the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multiquantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images.
For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, their objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for a reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
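The coefficient model mentioned in this abstract is the generalized Gaussian distribution; its standard density (standard symbols, not necessarily the thesis's notation; β = 2 recovers the Gaussian and β = 1 the Laplacian) is:

```latex
% Generalized Gaussian density with scale alpha > 0 and shape beta > 0,
% commonly used to model wavelet subband coefficients.
\[
  f(x) = \frac{\beta}{2\alpha\,\Gamma(1/\beta)}
         \exp\!\left(-\left(\frac{|x|}{\alpha}\right)^{\beta}\right),
  \qquad \alpha > 0,\ \beta > 0 .
\]
```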
Abstract:
Extending Deleuze’s later writing on the cinema and engaging both film and built work, the article explores what I call the "close-up," an immanent subjectivity of architectural encounter whereby the architectural surface aggressively colonizes the subject at close range through touch or, by another mechanism I describe as the "withdrawn effect," assimilates the subject.
Abstract:
The structure-building phenomena within clay aggregates are governed by forces acting between clay particles. The nature of such forces is important to understand in order to manipulate the aggregate structure for applications such as settling and dewatering. A parallel particle orientation is required when conducting measurements of the forces acting between the basal planes of clay mineral platelets using atomic force microscopy (AFM). In order to prepare a film of clay particles with the optimal orientation for conducting AFM measurements, the influences of particle concentration in suspension, suspension pH and particle size on the clay platelet orientation were investigated using scanning electron microscopy (SEM) and X-ray diffraction (XRD) methods. From these investigations, we conclude that high clay (dry mass) concentrations and larger particle diameters (up to 5 µm) in suspension result in random orientation of platelets on the substrate. The best laminar orientation in the dried clay film, as represented by an XRD 001/020 intensity ratio of more than 150 and by SEM micrograph assessment, was obtained by drying thin layers of 0.2 wt%, −5 µm clay suspensions at pH 10.5. These dried films are stable and suitable for close-approach AFM studies in solution.
Abstract:
Quantitative studies of nascent entrepreneurs such as GEM and PSED are required to generate their samples by screening the adult population, usually by phone in developed economies. Phone survey research has recently been challenged by shifting patterns of ownership and response rates of landline versus mobile (cell) phones, particularly for younger respondents. This challenge is particularly acute for entrepreneurship, which is a strongly age-dependent phenomenon. Although shifting ownership rates have received some attention, shifting response rates have remained largely unexplored. For the Australian GEM 2010 adult population study we conducted a dual-frame approach that allows comparison between samples of mobile and landline phones. We find a substantial response bias towards younger, male and metropolitan respondents for mobile phones – far greater than explained by ownership rates. We also find that these response rate differences significantly bias the estimates of the prevalence of early-stage entrepreneurship in both samples, even when each sample is weighted to match the Australian population.
Abstract:
Objective: The global implementation of oral random roadside drug testing is relatively limited, and correspondingly, the literature that focuses on the effectiveness of this intervention is scant. This study aims to provide a preliminary indication of the impact of roadside drug testing in Queensland. Methods: A sample of Queensland motorists (N = 922) completed a self-report questionnaire to investigate their drug driving behaviour, as well as to examine the perceived effect of legal sanctions (certainty, severity and swiftness) and knowledge of the countermeasure on their subsequent offending behaviour. Results: Analysis of the collected data revealed that approximately 20% of participants reported drug driving at least once in the last six months. Overall, there was considerable variability in respondents' perceptions regarding the certainty, severity and swiftness of legal sanctions associated with the testing regime, and a considerable proportion remained unaware of testing practices. With regard to predicting those who intended to drug drive again in the future, perceptions of apprehension certainty, more specifically low certainty of apprehension, were significantly associated with self-reported intentions to offend. Additionally, self-reported recent drug driving activity and frequent drug consumption were also identified as significant predictors, which indicates that in the current context, past behaviour is a prominent predictor of future behaviour. To a lesser extent, awareness of testing practices was a significant predictor of intending not to drug drive in the future. Conclusion: The results indicate that drug driving is relatively prevalent on Queensland roads, and a number of factors may influence such behaviour. Additionally, while the roadside testing initiative is beginning to have a deterrent impact, its success will likely be linked with targeted, intelligence-led implementation in order to increase apprehension levels as well as the general deterrent effect.