68 results for 3.5G EUL Techniques


Relevance:

30.00%

Publisher:

Abstract:

Bat researchers currently use a variety of techniques that transform echolocation calls into audible frequencies and allow the spectral content of a signal to be viewed and analyzed. All techniques have limitations, and an understanding of how each works and how it affects the signal being analyzed is vital for correct interpretation. The three most commonly used techniques for transforming the frequencies of a call are heterodyne, frequency division, and time expansion. Three techniques for viewing the spectral content of a signal are zero-crossing, Fourier analysis, and instantaneous frequency analysis. It is important for bat researchers to be familiar with the advantages and disadvantages of each technique.
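
As a rough illustration of one of the techniques named here, the sketch below estimates frequency from successive zero crossings of a call; the synthetic 40 kHz tone, the 500 kHz sample rate and the function name are illustrative assumptions, not taken from the study.

```python
import numpy as np

def zero_crossing_frequencies(signal, sample_rate):
    """Estimate frequency over time from successive zero crossings.

    Each pair of consecutive upward zero crossings spans one period,
    so the local frequency is the reciprocal of that interval.
    """
    # Indices where the signal crosses zero going upward
    crossings = np.where((signal[:-1] < 0) & (signal[1:] >= 0))[0]
    # Time (in seconds) of each upward crossing
    times = crossings / sample_rate
    # One frequency estimate per full period between crossings
    freqs = 1.0 / np.diff(times)
    return times[1:], freqs

# Example: a synthetic 40 kHz tone sampled at 500 kHz
fs = 500_000
t = np.arange(0, 0.01, 1 / fs)
call = np.sin(2 * np.pi * 40_000 * t)
times, freqs = zero_crossing_frequencies(call, fs)
print(freqs.mean())  # approximately 40000 Hz
```

Zero-crossing analysis of this kind only tracks the strongest frequency component at any instant, which is one example of the per-technique limitations the abstract refers to.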

Relevance:

30.00%

Publisher:

Abstract:

The competition to select a new secure hash function standard, SHA-3, was initiated in response to the surprising progress in the cryptanalysis of existing hash function constructions that began in 2004. In this report we survey the design and cryptanalytic results for the 14 candidates that remain in the competition, about 1.5 years after it started with the initial submission of the candidates in October 2008. Implementation considerations are not in the scope of this report. The diversity of designs is also reflected in the great variety of cryptanalytic techniques and results that were applied and found during this time, and this report gives an account of those techniques and results.

Relevance:

30.00%

Publisher:

Abstract:

Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on such data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) in order to identify the optimum technique for PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and Silhouette width validation values, and the K-means technique was found to perform best, with five clusters being the optimum; five clusters were therefore identified within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. These two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
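
As a minimal sketch of the kind of comparison described here, the snippet below runs K-means for several candidate cluster counts and scores each with the average silhouette width; the synthetic spectra matrix is a stand-in for parameterised PNSD data, and the GAM/B-spline parameterisation and Dunn index used in the study are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Stand-in for parameterised PNSD spectra: rows are hourly size
# distributions, columns are (here, synthetic) size-bin features.
spectra = rng.normal(size=(500, 16))

# Compare candidate cluster counts with the average silhouette width.
best_k, best_score = None, -1.0
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(spectra)
    score = silhouette_score(spectra, labels)
    if score > best_score:
        best_k, best_score = k, score

print(best_k, round(best_score, 3))
```

The same loop could be repeated with PAM, CLARA or SOM implementations to reproduce the style of cross-technique comparison the abstract describes.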

Relevance:

30.00%

Publisher:

Abstract:

Railway capacity determination and expansion are very important topics. In prior research, however, the competition between different entities, such as train services and train types, on different network corridors has been ignored, poorly modelled, or else assumed to be static. In response, a comprehensive set of multi-objective models has been formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives. The models also perform a sensitivity analysis of capacity with respect to those competing objectives. The models have been extensively tested on a case study and their significant worth is shown. The models were solved using a variety of techniques; an adaptive ε-constraint method proved the most effective. In order to identify a single best solution, a Simulated Annealing meta-heuristic was implemented and tested. However, a linearization technique based upon separable programming was also developed and was shown to be superior in terms of solution quality, though far less so in terms of computational time.
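
A toy sketch of the ε-constraint idea (not the paper's adaptive variant, nor its railway model) is given below: one objective is optimised while the other is turned into a constraint whose bound is swept to trace the trade-off. All numbers, variable names and the shared-corridor interpretation are hypothetical.

```python
from scipy.optimize import linprog

# Toy bi-objective capacity model: x1 = freight paths, x2 = passenger
# paths on a shared corridor (hypothetical numbers).
#   maximise f1 = x2  and  f2 = x1
#   subject to x1 + x2 <= 10 (shared corridor), x1 <= 8, x2 <= 8.
A_ub = [[1.0, 1.0]]
b_ub = [10.0]
bounds = [(0, 8), (0, 8)]

def solve_eps_constraint(eps):
    """Maximise f1 = x2 while forcing f2 = x1 >= eps (epsilon-constraint)."""
    # linprog minimises, so the objective is -x2; x1 >= eps becomes -x1 <= -eps.
    return linprog(c=[0.0, -1.0],
                   A_ub=A_ub + [[-1.0, 0.0]],
                   b_ub=b_ub + [-eps],
                   bounds=bounds)

# Sweeping epsilon traces the Pareto front of the capacity trade-off.
for eps in range(0, 9, 2):
    res = solve_eps_constraint(eps)
    if res.success:
        x1, x2 = res.x
        print(f"freight >= {eps}: freight={x1:.1f}, passenger={x2:.1f}")
```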

Relevance:

30.00%

Publisher:

Abstract:

Organizations executing similar business processes need to understand the differences and similarities in activities performed across work environments. Presently, research interest is directed towards the potential of visualization for the display of process models, to support users in their analysis tasks. Although the recent literature on process mining and comparison provides several methods and algorithms for process and log comparison, few contributions explore novel visualization approaches. This paper analyses process comparison from a design perspective, providing practical visualization techniques as analysis solutions to support process analysis. The design of the visual comparison has been tackled from three different points of view: the general model, the projected model and the side-by-side comparison, in order to support the needs of business analysts. A case study is presented showing the application of process mining and visualization techniques to patient treatment across two Australian hospitals.

Relevance:

30.00%

Publisher:

Abstract:

Resource assignment and scheduling is a difficult task when job processing times are stochastic and resources are to be used for both known and unknown demand. To operate effectively in such an environment, several novel strategies are investigated. The first focuses upon the creation of a robust schedule and utilises the concept of strategically placed idle time (i.e. buffering). The second approach introduces the idea of maintaining a number of free resources at each point in time, and culminates in another form of strategically placed buffering. The attraction of these approaches is that they are easy to grasp conceptually and mimic what practitioners already do in practice. Our extensive numerical testing has shown that these techniques ensure more prompt job processing and reduce job cancellations and waiting time. They are effective in the considered setting and could easily be adapted for many real-life problems, for instance those in health care. More importantly, this article demonstrates that integrating the two approaches is a better strategy and provides an effective stochastic scheduling approach.
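
A hypothetical, minimal sketch of the first strategy (strategically placed idle time) follows; the job data, the proportional buffer rule and the lateness metric are assumptions made for illustration only and do not reproduce the article's model.

```python
import random

def build_buffered_schedule(jobs, buffer_factor=0.25):
    """Sequence jobs with strategically placed idle time (buffers).

    jobs: list of (name, expected_duration); a buffer proportional to the
    expected duration is inserted after each job to absorb overruns.
    """
    schedule, t = [], 0.0
    for name, expected in jobs:
        schedule.append((name, t, t + expected))
        t += expected + buffer_factor * expected  # planned idle time
    return schedule

def simulate(schedule, noise=0.4, trials=1000, seed=1):
    """Estimate how often a job starts later than its planned start time."""
    rng = random.Random(seed)
    late = 0
    for _ in range(trials):
        finish = 0.0
        for name, start, end in schedule:
            actual = (end - start) * (1 + rng.uniform(0, noise))
            begin = max(start, finish)   # cannot start before predecessor ends
            late += begin > start
            finish = begin + actual
    return late / (trials * len(schedule))

jobs = [("A", 4), ("B", 6), ("C", 3), ("D", 5)]
print(simulate(build_buffered_schedule(jobs, buffer_factor=0.0)))   # no buffering
print(simulate(build_buffered_schedule(jobs, buffer_factor=0.25)))  # with buffers
```

Comparing the two printed lateness rates shows the basic effect the abstract describes: planned idle time trades some schedule length for more prompt job starts.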

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this article is to show the applicability and benefits of design-of-experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems; their characteristics include constant evolution driven by the occurrence of discrete events over time. In this study, a production system designed under the JIT (Just in Time) business philosophy is used, which seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques and discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the methodologies found was improved with a new statistical consideration.
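
The snippet below sketches, with assumed factors and a toy stand-in simulation, how a small full-factorial design can be run against a stochastic simulation and its main effects computed; it is not the KANBAN/CONWIP model or the statistical analysis used in the article.

```python
import itertools
import random

def simulate_throughput(num_kanbans, container_size, seed=0, hours=200):
    """Toy stand-in for a pull-controlled production simulation.

    Only here to show how a factorial design drives a stochastic model;
    it is not the article's KANBAN/CONWIP simulation.
    """
    rng = random.Random(seed)
    wip_limit = num_kanbans * container_size
    produced, wip = 0, 0
    for _ in range(hours):
        wip = min(wip_limit, wip + rng.randint(0, container_size))  # releases
        done = min(wip, rng.randint(1, 6))                          # stochastic capacity
        wip -= done
        produced += done
    return produced

# 2x2 full-factorial design with replications, the simplest DOE setting.
levels = {"num_kanbans": [2, 6], "container_size": [1, 4]}
replications = 5
results = {}
for nk, cs in itertools.product(levels["num_kanbans"], levels["container_size"]):
    runs = [simulate_throughput(nk, cs, seed=r) for r in range(replications)]
    results[(nk, cs)] = sum(runs) / replications

# Main effect of each factor = mean response at its high level minus at its low level.
effect_kanbans = (results[(6, 1)] + results[(6, 4)]) / 2 - (results[(2, 1)] + results[(2, 4)]) / 2
effect_container = (results[(2, 4)] + results[(6, 4)]) / 2 - (results[(2, 1)] + results[(6, 1)]) / 2
print(results)
print("main effects:", effect_kanbans, effect_container)
```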

Relevance:

20.00%

Publisher:

Abstract:

A low-temperature synthesis method based on the decomposition of urea at 90°C in water has been developed to synthesise fraipontite. This material is characterised by a basal (001) reflection at 7.44 Å. The trioctahedral nature of the fraipontite is shown by the presence of a (06l) band around 1.54 Å, while a minor band around 1.51 Å indicates some cation ordering between Zn and Al, resulting in Al-rich areas with a more dioctahedral nature. TEM and IR indicate that no separate kaolinite phase is present. An increase in the Al content did, however, result in the formation of some SiO2 in the form of quartz. Minor impurities of carbonate salts were observed during the synthesis, caused by the formation of CO3²⁻ during the decomposition of urea.

Relevance:

20.00%

Publisher:

Abstract:

The information on climate variations is essential for research into many subjects, such as the performance of buildings and agricultural production. However, recorded meteorological data are often incomplete. There may be a limited number of locations recorded, while the number of recorded climatic variables and the time intervals can also be inadequate. Therefore, the hourly data for key weather parameters required by many building simulation programmes are typically not readily available. To overcome this gap in measured information, several empirical methods and weather data generators have been developed. They generally employ statistical analysis techniques to model the variations of individual climatic variables, while the possible interactions between different weather parameters are largely ignored. Based on a statistical analysis of 10 years of historical hourly climatic data for all capital cities in Australia, this paper reports the finding of strong correlations between several specific weather variables. It is found that there are strong linear correlations between the hourly variations of global solar irradiation (GSI) and dry bulb temperature (DBT), and between the hourly variations of DBT and relative humidity (RH). With an increase in GSI, DBT generally increases, while RH tends to decrease. However, no such clear correlation can be found between DBT and atmospheric pressure (P), or between DBT and wind speed. These findings will be useful for research and practice in building performance simulation.
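
A small sketch of the kind of analysis described, using synthetic stand-in data rather than the Australian records, is shown below: the hourly first differences of GSI, DBT and RH are correlated with Pearson's coefficient.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for recorded hourly weather data (the study itself used
# 10 years of observations for Australian capital cities).
rng = np.random.default_rng(0)
hours = pd.date_range("2020-01-01", periods=24 * 365, freq="h")
gsi = np.clip(np.sin(np.linspace(0, 2 * np.pi * 365, hours.size)) * 600, 0, None)
dbt = 18 + 0.01 * gsi + rng.normal(0, 1, hours.size)       # DBT rises with GSI
rh = 70 - 0.8 * (dbt - 18) + rng.normal(0, 2, hours.size)  # RH falls as DBT rises

weather = pd.DataFrame({"GSI": gsi, "DBT": dbt, "RH": rh}, index=hours)

# The reported relationships concern hourly *variations*, i.e. hour-to-hour
# differences, so correlate the first differences rather than the raw levels.
hourly_change = weather.diff().dropna()
print(hourly_change.corr(method="pearson"))
```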