887 results for 3.5G EUL Techniques
Abstract:
Bat researchers currently use a variety of techniques that transform echolocation calls into audible frequencies and allow the spectral content of a signal to be viewed and analyzed. All techniques have limitations, and an understanding of how each works and how it affects the signal being analyzed is vital for correct interpretation. The three most commonly used techniques for transforming the frequencies of a call are heterodyne, frequency division, and time expansion. Three techniques for viewing the spectral content of a signal are zero-crossing, Fourier analysis, and instantaneous frequency analysis. It is important for bat researchers to be familiar with the advantages and disadvantages of each technique.
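To make the distinction between two of the viewing techniques concrete, the sketch below (a purely illustrative Python example, not taken from the paper; the sampling rate and chirp parameters are assumed values) applies Fourier analysis via a spectrogram and a zero-crossing frequency estimate to a synthetic bat-like FM sweep:

import numpy as np
from scipy.signal import spectrogram

fs = 250_000                            # assumed 250 kHz sampling rate
t = np.arange(0, 0.005, 1 / fs)         # a 5 ms synthetic call
call = np.sin(2 * np.pi * (80_000 * t - 4_000_000 * t ** 2))  # 80 -> 40 kHz sweep

# Fourier analysis: the short-time FFT shows the full time-frequency structure
f, seg_t, Sxx = spectrogram(call, fs=fs, nperseg=256, noverlap=192)
dominant = f[Sxx.mean(axis=1).argmax()]

# Zero-crossing analysis: recovers only the dominant frequency, from the
# spacing of sign changes, but is far cheaper to compute
signs = np.signbit(call).astype(np.int8)
crossings = np.where(np.diff(signs) != 0)[0]
zc_freq = fs / (2 * np.mean(np.diff(crossings)))

print(f"spectrogram peak ~ {dominant / 1000:.0f} kHz, "
      f"zero-crossing estimate ~ {zc_freq / 1000:.0f} kHz")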
Abstract:
The competition to select a new secure hash function standard, SHA-3, was initiated in response to the surprising progress in the cryptanalysis of existing hash function constructions that began in 2004. In this report we survey the design and cryptanalytic results of the 14 candidates that remain in the competition, about 1.5 years after it started with the initial submission of the candidates in October 2008. Implementation considerations are outside the scope of this report. The diversity of designs is also reflected in the great variety of cryptanalytic techniques and results that were applied and found during this time. This report gives an account of those techniques and results.
Abstract:
Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study aims to apply several clustering techniques (K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values, and the K-means technique was found to perform best, with five clusters being the optimum. Therefore, five clusters were identified within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster, in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.
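A minimal sketch of the clustering-plus-validation step described above, assuming a hypothetical PNSD matrix (one row per measured spectrum, one column per size bin); it uses scikit-learn's K-means and silhouette width, one of the two validation measures the study employs:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Hypothetical PNSD matrix: one row per hourly spectrum, one column per size bin
pnsd = rng.lognormal(mean=2.0, sigma=0.5, size=(1000, 64))
pnsd = pnsd / pnsd.sum(axis=1, keepdims=True)   # normalise each spectrum

best_k, best_score = None, -1.0
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pnsd)
    score = silhouette_score(pnsd, labels)      # silhouette width for this k
    if score > best_score:
        best_k, best_score = k, score
print(f"Best k by silhouette width: {best_k} (score {best_score:.2f})")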
Abstract:
Railway capacity determination and expansion are very important topics. In prior research, however, the competition between different entities, such as train services and train types, on different network corridors has been ignored, poorly modelled, or else assumed to be static. In response, a comprehensive set of multi-objective models has been formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives. The models also perform a sensitivity analysis of capacity with respect to those competing objectives. The models have been extensively tested on a case study and their significant worth is shown. The models were solved using a variety of techniques, of which an adaptive ε-constraint method proved superior. In order to identify only the best solution, a simulated annealing meta-heuristic was implemented and tested. A linearisation technique based upon separable programming was also developed and shown to be superior in terms of solution quality, though far less so in terms of computational time.
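The adaptive ε-constraint variant used in the article is not reproduced here, but the basic ε-constraint idea it builds on can be shown in a few lines: optimise one objective while constraining the other to a level ε, then sweep ε to trace the trade-off frontier. The corridor capacity, bounds and train types below are hypothetical:

from scipy.optimize import linprog

# Hypothetical corridor: x0 = passenger paths, x1 = freight paths per day,
# sharing track capacity x0 + 2*x1 <= 10.
pareto = []
for eps in range(0, 5):                      # required freight paths (the epsilon level)
    res = linprog(c=[-1, 0],                 # maximise passenger paths (minimise -x0)
                  A_ub=[[1, 2]], b_ub=[10],
                  bounds=[(0, 8), (eps, 4)]) # lower bound x1 >= eps enforces the constraint
    if res.success:
        pareto.append((res.x[0], res.x[1]))
print(pareto)   # passenger capacity achievable at each freight level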
Abstract:
Organizations executing similar business processes need to understand the differences and similarities in activities performed across work environments. Presently, research interest is directed towards the potential of visualization for the display of process models, to support users in their analysis tasks. Although recent literature in process mining and comparison provides several methods and algorithms for process and log comparison, few contributions explore novel visualization approaches. This paper analyses process comparison from a design perspective, providing practical visualization techniques to support process analysis. The design of the visual comparison has been tackled from three different points of view: the general model, the projected model and the side-by-side comparison, in order to support the needs of business analysts. A case study is presented showing the application of process mining and visualization techniques to patient treatment across two Australian hospitals.
Abstract:
Resource assignment and scheduling is a difficult task when job processing times are stochastic and resources are to be used for both known and unknown demand. To operate effectively within such an environment, several novel strategies are investigated. The first focuses upon the creation of a robust schedule and utilises the concept of strategically placed idle time (i.e. buffering). The second approach introduces the idea of maintaining a number of free resources at each point in time, and culminates in another form of strategically placed buffering. The attraction of these approaches is that they are easy to grasp conceptually and mimic what practitioners already do. Our extensive numerical testing has shown that these techniques ensure more prompt job processing and reduce job cancellations and waiting time. They are effective in the considered setting and could easily be adapted for many real-life problems, for instance those in health care. More importantly, this article demonstrates that integrating the two approaches is a better strategy and provides an effective stochastic scheduling approach.
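A toy simulation (assumed job-duration distribution and schedule layout, not the article's model) illustrating why strategically placed idle time helps: with no buffer, every overrun propagates to later jobs, while a small buffer between jobs absorbs much of the delay:

import random

random.seed(1)

def simulate(buffer, n_jobs=20, runs=2000):
    """Average total lateness when jobs are planned mean-duration + buffer apart."""
    total_late = 0.0
    for _ in range(runs):
        clock, late = 0.0, 0.0
        for j in range(n_jobs):
            planned_start = j * (10.0 + buffer)           # planned mean duration is 10
            start = max(clock, planned_start)
            late += start - planned_start                 # delay inherited from earlier jobs
            clock = start + random.expovariate(1 / 10.0)  # stochastic actual duration
        total_late += late
    return total_late / runs

for b in (0.0, 1.0, 2.0):
    print(f"buffer={b}: average total lateness {simulate(b):.1f}")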
Abstract:
The rectangular dielectric waveguide is the most commonly used structure in integrated optics, especially in semiconductor diode lasers. Demands for new applications such as high-speed data backplanes in integrated electronics, waveguide filters, optical multiplexers and optical switches are driving technology toward better materials and processing techniques for planar waveguide structures. The familiar infinite slab and circular waveguides are not practical for use on a substrate: the slab waveguide has no lateral confinement, and the circular fiber is not compatible with the planar processing technology used to make planar structures. The rectangular waveguide is the natural structure. In this review, we discuss several analytical methods for analyzing the mode structure of rectangular waveguides, beginning with a wave analysis based on the pioneering work of Marcatili. We study three basic techniques, with examples, to compare their performance: the analytical approach developed by Marcatili, the perturbation techniques that improve on the analytical solutions, and the effective index method.
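As a sketch of the effective index method under stated assumptions (a symmetric step-index structure, fundamental TE-like mode only, assumed dimensions and refractive indices), the rectangular problem is reduced to two slab problems, with the effective index of the first slab feeding the second as its core index:

import numpy as np
from scipy.optimize import brentq

k0 = 2 * np.pi / 1.55e-6          # free-space wavenumber at an assumed 1.55 um

def slab_neff(n_core, n_clad, d):
    """Effective index of the fundamental even TE mode of a symmetric slab."""
    def dispersion(neff):
        kappa = k0 * np.sqrt(n_core**2 - neff**2)   # transverse wavenumber in the core
        gamma = k0 * np.sqrt(neff**2 - n_clad**2)   # decay constant in the cladding
        return np.tan(kappa * d / 2) - gamma / kappa
    # Fundamental even mode satisfies kappa*d/2 < pi/2, giving this lower bracket
    lo = np.sqrt(max(n_clad**2, n_core**2 - (np.pi / (k0 * d))**2))
    return brentq(dispersion, lo + 1e-9, n_core - 1e-9)

# Hypothetical 2 um x 1 um core, n = 3.5 in n = 3.2 cladding (semiconductor-like)
n_vert = slab_neff(3.5, 3.2, 1.0e-6)     # step 1: vertical slab problem
n_eff = slab_neff(n_vert, 3.2, 2.0e-6)   # step 2: horizontal slab with n_vert as core
print(f"effective index ~ {n_eff:.4f}")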
Abstract:
The purpose of this article is to show the applicability and benefits of design-of-experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems; their characteristic is a constant evolution driven by the occurrence of discrete events over time. In this study, a production system designed with the JIT (Just in Time) business philosophy is used, which seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques and discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the methodologies found was improved with a new statistical consideration.
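A hedged sketch of the experimental-design step (a generic two-level full factorial with a toy stand-in response, not the paper's KANBAN/CONWIP model): each factor is run at a low and a high level, replicated simulation runs give the response, and main effects are estimated as mean(high) minus mean(low):

import itertools, random

random.seed(42)

def simulate(kanbans, containers, demand_rate):
    """Toy stand-in for one stochastic simulation run (hypothetical model)."""
    base = 5.0 * kanbans + 2.0 * containers - 1.5 * demand_rate
    return base + random.gauss(0, 1.0)           # noisy throughput response

levels = {"kanbans": (2, 6), "containers": (1, 4), "demand_rate": (3, 9)}
names = list(levels)

# Full 2^3 factorial design with several replicates per design point
results = []
for signs in itertools.product((-1, +1), repeat=3):
    setting = {n: levels[n][(s + 1) // 2] for n, s in zip(names, signs)}
    y = sum(simulate(**setting) for _ in range(10)) / 10
    results.append((signs, y))

# Main effect of each factor: mean response at high level minus at low level
for i, n in enumerate(names):
    hi = [y for s, y in results if s[i] == +1]
    lo = [y for s, y in results if s[i] == -1]
    print(f"{n:12s} main effect: {sum(hi) / len(hi) - sum(lo) / len(lo):+.2f}")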
Abstract:
Some whole-leaf clearing and staining techniques are described for the microscopic observation of the origin of powdery mildew conidiophores, whether from external mycelium or from internal mycelium emerging through stomata. These techniques enable separation of the two genera, Oidiopsis and Streptopodium, in the Erysiphaceae.
Abstract:
Navua sedge, a member of the Cyperaceae family, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomon Islands and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with an annual rainfall exceeding 2500 mm, Navua sedge is capable of forming dense stands smothering many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr and MSMA proving the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity and movement off-site may occur when using some herbicides at the rates predicted for LC90 control. A seasonality trial using halosulfuron (97.5 g ai/ha) gave better Navua sedge control when sprayed from March to September (84%) than at other times (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application of glyphosate (36%), though loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) using either a rope wick, ground foliar spraying or a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoe (3%), slashing (-13%) or crushing (-30%)) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed being added to the soil seed bank. Treatments that result in seed burial, for example discing, are likely to prolong seed persistence and should be avoided. The sprouting activity of vegetative propagules and root fragmentation also need to be considered when selecting control options.
Abstract:
Multi-access techniques are widely used in computer networking and distributed multiprocessor systems. On-the-fly arbitration schemes permit one of the many contenders to access the medium without collisions. Serial arbitration is cost effective but is slow and hence unsuitable for high-speed multiprocessor environments supporting very high data transfer rates. A fully parallel arbitration scheme takes less time but is not practically realisable for large numbers of contenders. In this paper, a generalised parallel-serial scheme is proposed which significantly reduces the arbitration time and is practically realisable.
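A functional sketch of the generalised parallel-serial idea (an illustration of the scheme's logic, not the paper's circuit): contenders are split into groups whose local winners are found simultaneously, and only the group winners are then resolved serially, so the serial chain length drops from n to roughly n/g:

def parallel_serial_arbiter(requests, group_size):
    """requests: list of 0/1 request lines; returns the index granted access."""
    groups = [requests[i:i + group_size]
              for i in range(0, len(requests), group_size)]
    # Parallel stage: every group resolves its local fixed-priority winner at once
    local = [next((j for j, r in enumerate(g) if r), None) for g in groups]
    # Serial stage: walk only the group winners, like a short daisy chain
    for gi, win in enumerate(local):
        if win is not None:
            return gi * group_size + win
    return None

reqs = [0, 0, 0, 1, 0, 1, 0, 0]           # contenders 3 and 5 request the medium
print(parallel_serial_arbiter(reqs, 4))   # -> 3 (fixed-priority grant)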
Latent TGF-β binding proteins -3 and -4: transcriptional control and extracellular matrix targeting
Abstract:
The extracellular matrix (ECM) is a complex network of proteins and proteoglycans that provides tissues with structural strength and resilience. By harvesting signaling molecules such as growth factors, the ECM has the capacity to control cellular functions including proliferation, differentiation and cell survival. Latent transforming growth factor β (TGF-β) binding proteins (LTBPs) associate with fibrillar structures of the ECM and mediate the efficient secretion and ECM deposition of latent TGF-β. The current work was conducted to determine the regulatory regions of the LTBP-3 and -4 genes to gain insight into their tissue-specific expression, which also has an impact on TGF-β biology. Furthermore, the current research aimed at defining the ECM targeting of the N-terminal variants of LTBP-4 (LTBP-4S and -4L), which is required to understand their functions in tissues and to gain insight into the conditions in which TGF-β is activated. To characterize the regulatory regions of the LTBP-3 and -4 genes, in silico and functional promoter analysis techniques were employed. It was found that the expression of LTBP-4S and -4L is under the control of two independent promoters. This finding was in accordance with the observed expression patterns of LTBP-4S and -4L in human tissues. All promoter regions characterized in this study were TATA-less, GC-rich and highly conserved between human and mouse. Putative binding sites for the Sp1 and GATA families of transcription factors were recognized in all of these regulatory regions. It is possible that these transcription factors control the basal expression of the LTBP-3 and -4 genes. A Smad binding element was found within the LTBP-3 and -4S promoter regions, but it was not present in the LTBP-4L promoter. Although this element, important for TGF-β signaling, was present in the LTBP-4S promoter, TGF-β did not induce its transcriptional activity. LTBP-3 promoter activity and mRNA expression, in contrast, were stimulated by TGF-β1 in osteosarcoma cells. The stimulatory effect of TGF-β was found to be mediated by the Smad and Erk MAPK signaling pathways. The current work also explored the ECM targeting of LTBP-4S and identified binding partners of this protein. It was found that the N-terminal end of LTBP-4S possesses fibronectin (FN) binding sites which are critical for its ECM targeting. FN-deficient fibroblasts incorporated LTBP-4S into their ECM only after the addition of exogenous FN. Furthermore, LTBP-4S was found to have heparin binding regions, of which the C-terminal binding site mediated fibroblast adhesion. Soluble heparin prevented the ECM association of LTBP-4S in fibroblast cultures. In the current work it was observed that there are significant differences in the secretion, processing and ECM targeting of LTBP-4S and -4L. Interestingly, most of the secreted LTBP-4L was associated with latent TGF-β1, whereas LTBP-4S was mainly secreted as a free form from CHO cells. This thesis provides information on the transcriptional regulation of the LTBP-3 and -4 genes, which is required for a deeper understanding of their tissue-specific functions. Further, the current work elucidates the structural variability of LTBPs, which appears to affect the secretion and ECM targeting of TGF-β. These findings may advance understanding of the abnormal activation of TGF-β that is associated with connective tissue disorders and cancer.
Development of Sample Pretreatment and Liquid Chromatographic Techniques for Antioxidative Compounds
Abstract:
In this study, novel methodologies for the determination of antioxidative compounds in herbs and beverages were developed. Antioxidants are compounds that can reduce, delay or inhibit oxidative events. They are a part of the human defense system and are obtained through the diet. Antioxidants are naturally present in several types of foods, e.g. in fruits, beverages, vegetables and herbs. Antioxidants can also be added to foods during manufacturing to suppress lipid oxidation and the formation of free radicals under conditions of cooking or storage, and to reduce the concentration of free radicals in vivo after food ingestion. There is growing interest in natural antioxidants, and effective compounds have already been identified in antioxidant classes such as carotenoids, essential oils, flavonoids and phenolic acids. The wide variety of sample matrices and analytes presents quite a challenge for the development of analytical techniques, and growing demands have been placed on sample pretreatment. In this study, three novel extraction techniques, namely supercritical fluid extraction (SFE), pressurised hot water extraction (PHWE) and dynamic sonication-assisted extraction (DSAE), were studied. SFE was used for the extraction of lycopene from tomato skins, and PHWE was used in the extraction of phenolic compounds from sage. DSAE was applied to the extraction of phenolic acids from Lamiaceae herbs. In the development of the extraction methodologies, the main parameters of the extraction were studied and the recoveries were compared to those achieved by conventional extraction techniques. In addition, the stability of lycopene was followed under different storage conditions. For the separation of the antioxidative compounds in the extracts, liquid chromatographic (LC) methods were utilised. Two novel LC techniques, namely ultra performance liquid chromatography (UPLC) and comprehensive two-dimensional liquid chromatography (LCxLC), were studied and compared with conventional high performance liquid chromatography (HPLC) for the separation of antioxidants in beverages and Lamiaceae herbs. In LCxLC, the selection of LC mode, column dimensions and flow rates was studied and optimised to obtain efficient separation of the target compounds. In addition, the separation powers of HPLC, UPLC, HPLCxHPLC and HPLCxUPLC were compared. To exploit the benefits of an integrated system, in which sample preparation and final separation are performed in a closed unit, dynamic sonication-assisted extraction was coupled on-line to a liquid chromatograph via a solid-phase trap. The increased sensitivity was utilised in the extraction of phenolic acids from Lamiaceae herbs, and the results were compared to those achieved by the LCxLC system.
Abstract:
Efficient ways to re-establish pastures are needed on land that requires a rotation between pastures and crops. We conducted trials in southern inland Queensland with a range of tropical perennial grasses sown into wheat stubble that was modified in various ways. The differing seedbed preparations involved cultivation or herbicide sprays, with or without fertilizer at sowing. Seed was broadcast, and sowing times ranged from spring through to autumn on three different soil types. Seed quality and post-sowing rainfall were the major determinants of the density of sown grass plants in the first year. Light cultivation sometimes enhanced establishment compared with herbicide spraying of standing stubble, most often on harder-setting soils. A nitrogen + phosphorus mixed fertilizer rarely produced any improvement in sown grass establishment and sometimes increased weed competition. The effects were similar for all types of grass seed, from hairy fascicles to large, smooth panicoid seeds and minute Eragrostis seeds. There was a strong inverse relationship between the initial density of sown grass established and the level of weed competition.