902 results for broadband tuning
Abstract:
Here we present a sequential Monte Carlo approach to Bayesian sequential design that incorporates model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally if one has different experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel) and provides a convenient estimate of the marginal likelihood of each model given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is also computationally efficient when compared to full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
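As a rough illustration of the evidence estimate such an SMC scheme provides, the following is a minimal sketch, assuming a toy conjugate-normal model with sequentially observed data; the model, particle count and all names are illustrative rather than the paper's actual implementation, and a full SMC sampler would also include a particle move step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: y_t ~ N(theta, 1) with prior theta ~ N(0, 1).
# The marginal likelihood is estimated as the product, over time,
# of the average incremental particle weights.
data = rng.normal(0.5, 1.0, size=20)
N = 5000
particles = rng.normal(0.0, 1.0, size=N)          # draws from the prior
log_evidence = 0.0

for y in data:
    # Incremental weight: likelihood of the new datum under each particle.
    logw = -0.5 * (y - particles) ** 2 - 0.5 * np.log(2 * np.pi)
    log_evidence += np.log(np.mean(np.exp(logw)))
    # Multinomial resampling refocuses the particles on the posterior.
    w = np.exp(logw - logw.max())
    particles = rng.choice(particles, size=N, p=w / w.sum())
    # (A real implementation would also move particles, e.g. via an MCMC kernel.)

print("log marginal likelihood estimate:", log_evidence)
```

Running one such sampler per rival model yields the marginal likelihoods that feed model comparison and the discrimination utilities.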
Abstract:
This paper focuses on the implementation of a damping controller for the doubly fed induction generator (DFIG) system. Coordinated tuning of the damping controller to enhance the damping of the oscillatory modes is carried out using the bacterial foraging technique. The effect of the tuned damping controller on the converter ratings of the DFIG system is also investigated. The results of both eigenvalue analysis and time-domain simulation studies are presented to elucidate the effectiveness of the tuned damping controller in the DFIG system. The improvement of the fault ride-through capability of the system is also demonstrated.
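For readers unfamiliar with the optimiser, below is a minimal, chemotaxis-only sketch of bacterial foraging on a stand-in quadratic cost; the actual controller-tuning objective (for example, an eigenvalue-based damping criterion) and the reproduction and elimination-dispersal phases of the full algorithm are omitted, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    # Stand-in objective; in damping-controller tuning this would be
    # replaced by, e.g., a damping-ratio criterion on the system eigenvalues.
    return float(np.sum(x ** 2))

# Chemotaxis: each bacterium tumbles to a random direction and keeps
# swimming in that direction while the cost keeps improving.
S, dim, steps, swim_len, step_size = 20, 2, 50, 4, 0.1
bacteria = rng.uniform(-5, 5, size=(S, dim))

for _ in range(steps):
    for i in range(S):
        direction = rng.normal(size=dim)
        direction /= np.linalg.norm(direction)
        for _ in range(swim_len):
            trial = bacteria[i] + step_size * direction
            if cost(trial) < cost(bacteria[i]):
                bacteria[i] = trial       # keep swimming while improving
            else:
                break                     # tumble again next chemotactic step

best = min(bacteria, key=cost)
print("best parameters:", best, "cost:", cost(best))
```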
Abstract:
The striking color patterns of butterflies and birds have long interested biologists, but how these animals see color is less well understood. Opsins are the protein components of the visual pigments of the eye. Color vision has evolved in butterflies through opsin gene duplications, through positive selection at individual opsin loci, and by the use of filtering pigments. By contrast, birds have retained the same opsin complement present in early jawed vertebrates, and their visual system has diversified primarily through tuning of the short-wavelength-sensitive photoreceptors rather than by opsin duplication or the use of filtering elements. Butterflies and birds have evolved photoreceptors that might use some of the same amino acid sites for generating similar spectral phenotypes across the approximately 540 million years of evolution since rhabdomeric and ciliary-type opsins radiated in the early Cambrian period. Considering the similarities between the two taxa, it is surprising that the eyes of birds are not more diverse. Additional taxonomic sampling of birds may help clarify this mystery.
Abstract:
In the past few years, remarkable progress has been made in unveiling the novel and unique optical properties of strongly coupled plasmonic nanostructures. However, the application of such plasmonic nanostructures in biomedicine remains challenging due to the lack of facile and robust assembly methods for producing stable nanostructures. Previous attempts to achieve plasmonic nano-assemblies using molecular ligands were limited by the lack of flexibility that could be exercised in forming them. Here, we report the utilization of tailor-made hyperbranched polymers (HBPs) as linkers to assemble gold nanoparticles (NPs) into nano-assemblies. The ease and flexibility of tuning the particle size and the number of branch ends of an HBP make it an ideal candidate as a linker, as opposed to DNA, small organic molecules, and linear or dendrimeric polymers. We report a strong correlation between HBP concentration and both the size of the hybrid nano-assemblies and their “hot-spot” density. We have shown that such solutions of stable HBP-gold nano-assemblies can be barcoded with various Raman tags to provide improved surface-enhanced Raman scattering (SERS) compared with non-aggregated NP systems. These Raman-barcoded hybrid nano-assemblies, with further optimization of NP shape, size and “hot-spot” density, may find application as diagnostic tools in nanomedicine.
Abstract:
This article clarifies the interdependence between high-speed broadband and e-Learning. It does this by identifying the importance of the internet for Australia’s education future and the importance of education for the future of the internet. It concludes by confirming the role the NBN will play as an enabler of both, and the need to ensure access for all to appropriate skills as well as to services.
Abstract:
The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non-commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned the relationship between PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Has the case for PSM “exceptionalism” changed with Web-based services, catch-up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross-media environments? Do the traditional assumptions about non-commercialism still hold as the basis for different forms of PSM governance and accountability? This paper considers the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011-2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.
Abstract:
Our understanding of the mechanisms of action of GH and its receptor, the GHR, has advanced significantly in the last decade and has provided some important surprises. It is now clear that the GH-GHR axis activates a number of inter-related signalling pathways, not all of which are dependent on the intracellular tyrosine kinase JAK2, as originally postulated. JAK2-independent pathways mediated via the Src family kinases, together with a number of negative regulators of GH signalling and emerging cross-talk mechanisms with other growth factor receptors, provide a complex array of mechanisms that are capable of fine-tuning responses to GH in a cell-context-dependent manner. Additionally, it is now clear that GH and the GHR can translocate to the nucleus of target cells and initiate nuclear responses that are as yet not well defined. Continued emphasis on the elucidation of these complex mechanisms is critical to provide further insights into the diverse physiological and pathophysiological effects of GH.
Abstract:
Linear adaptive channel equalization using the least mean square (LMS) algorithm and the recursive least-squares (RLS) algorithm for an innovative multi-user (MU) MIMO-OFDM wireless broadband communications system is proposed. The proposed equalization method adaptively compensates for the channel impairments caused by frequency selectivity in the propagation environment. Simulations for the proposed adaptive equalizer are conducted using a training sequence method to determine optimal performance through a comparative analysis. Results show an improvement of 0.15 in BER (at an SNR of 16 dB) when using adaptive equalization with the RLS algorithm compared to the case in which no equalization is employed. In general, adaptive equalization using the LMS and RLS algorithms proved significantly beneficial for MU-MIMO-OFDM systems.
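A minimal sketch of the training-sequence LMS update is given below for a single-carrier BPSK toy channel rather than the full MU-MIMO-OFDM chain; the channel taps, step size and filter length are illustrative assumptions. The RLS variant has the same structure but replaces the fixed step size with a recursively updated gain matrix, trading higher per-sample cost for faster convergence.

```python
import numpy as np

rng = np.random.default_rng(2)

# Known training symbols pass through a frequency-selective channel;
# the LMS equalizer adapts its taps to undo the distortion.
symbols = rng.choice([-1.0, 1.0], size=2000)          # BPSK training sequence
channel = np.array([1.0, 0.5, 0.2])                   # toy multipath channel
received = np.convolve(symbols, channel)[: len(symbols)]
received += 0.05 * rng.normal(size=len(symbols))      # additive noise

taps, mu = 7, 0.01
w = np.zeros(taps)
delay = taps // 2                                     # decision delay

for n in range(taps, len(symbols)):
    x = received[n - taps:n][::-1]                    # most recent sample first
    d = symbols[n - delay]                            # desired training symbol
    e = d - w @ x                                     # a-priori error
    w += mu * e * x                                   # LMS tap update

print("final equalizer taps:", np.round(w, 3))
```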
Abstract:
Preparing valuations is a time-consuming process involving site inspections, research and report formulation. The ease of access to the internet has changed how and where valuations may be undertaken. No longer is it necessary to return to the office to finalise reports, or to leave your desk in order to undertake research. This enables more streamlined service delivery and is viewed as a positive development. However, it is not without negative impacts. This paper seeks to inform practitioners of the work-environment changes flowing from increased access to the internet. It identifies how increased accessibility to, and use of, technology and the internet has affected, and will continue to affect, the provision of valuation services.
Abstract:
Advances in algorithms for approximate sampling from a multivariate target distribution have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation through repeated sampling of data from the model and comparison of observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference.

Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty poses computational obstacles when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information and make better decisions about future design points. This is of particular interest if the data can be collected sequentially; in a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty.

Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively destroys the ability of motor neurons to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to assess underlying motor unit loss directly, rather than through indirect techniques such as muscle strength assessment, which is generally unable to detect progression because of the body’s natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique based on a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
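The likelihood-free mechanism that Part I refines can be sketched as plain ABC rejection; the exponential model, prior, summaries and tolerance below are toy assumptions, and the SMC-ABC algorithms developed in the thesis improve on this by propagating particles through a decreasing sequence of tolerances rather than drawing every proposal from the prior.

```python
import numpy as np

rng = np.random.default_rng(3)

# Pretend the exponential likelihood is intractable and infer its scale
# by matching summary statistics of simulated and observed data.
observed = rng.exponential(scale=2.0, size=100)
s_obs = np.array([observed.mean(), observed.std()])

accepted = []
eps = 0.2                                             # tolerance
for _ in range(20000):
    theta = rng.uniform(0.1, 10.0)                    # draw from the prior
    sim = rng.exponential(scale=theta, size=100)      # simulate from the model
    s_sim = np.array([sim.mean(), sim.std()])
    if np.linalg.norm(s_sim - s_obs) < eps:           # compare summaries
        accepted.append(theta)

print(len(accepted), "accepted; posterior mean estimate:", np.mean(accepted))
```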
Abstract:
This paper focuses on the super/sub-synchronous operation of the doubly fed induction generator (DFIG) system. The impact of a damping controller on the different modes of operation of the DFIG-based wind generation system is investigated. The coordinated tuning of the damping controller to enhance the damping of the oscillatory modes using the bacterial foraging (BF) technique is presented. The results of eigenvalue analysis are presented to elucidate the effectiveness of the tuned damping controller in the DFIG system. The robustness of the damping controller is also investigated.
Abstract:
RatSLAM is a navigation system based on the neural processes underlying navigation in the rodent brain, capable of operating with low-resolution monocular image data. Seminal experiments using RatSLAM include mapping an entire suburb with a web camera and a long-term robot delivery trial. This paper describes OpenRatSLAM, an open-source version of RatSLAM with bindings to the Robot Operating System (ROS) framework to leverage advantages such as robot and sensor abstraction, networking, data playback, and visualization. OpenRatSLAM comprises connected ROS nodes representing RatSLAM’s pose cells, experience map, and local view cells, as well as a fourth node that provides visual odometry estimates. The nodes are described with reference to the RatSLAM model, along with salient details of the ROS implementation such as topics, messages, parameters, class diagrams, sequence diagrams, and parameter tuning strategies. The performance of the system is demonstrated on three publicly available open-source datasets.
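The glue between the OpenRatSLAM nodes is standard ROS topic-based messaging, sketched below with a generic rospy subscriber; the node name, topic and message type are generic ROS examples rather than OpenRatSLAM's actual interface.

```python
#!/usr/bin/env python
import rospy
from nav_msgs.msg import Odometry

# Each ROS node subscribes to topics published by the others; here a
# listener consumes odometry messages of the kind a visual odometry
# node might publish. The topic name "/odom" is a generic example.

def odom_callback(msg):
    v = msg.twist.twist.linear.x
    w = msg.twist.twist.angular.z
    rospy.loginfo("odometry: v=%.3f m/s, w=%.3f rad/s", v, w)

if __name__ == "__main__":
    rospy.init_node("odometry_listener")
    rospy.Subscriber("/odom", Odometry, odom_callback)
    rospy.spin()  # hand control to ROS; the callback fires per message
```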
Abstract:
Over the last two decades, the internet and e-commerce have reshaped the way we communicate, interact and transact. In the converged environment enabled by high-speed broadband, Web 2.0, social media, virtual worlds, user-generated content, cloud computing, VoIP, open-source software and open content have rapidly become established features of our online experience. Business and government alike are increasingly using the internet as the preferred platform for delivering their goods and services and for engaging effectively with their clients. New ways of doing things online, and challenges to existing business, government and social activities, have tested current laws and often demand new policies and laws adapted to the new realities. The focus of this book is the regulation of social, cultural and commercial activity on the World Wide Web. It considers developments in the law that have been, and continue to be, brought about by the emergence of the internet and e-commerce. It analyses how the law is applied to define rights and obligations in relation to online infrastructure, content and practices.
Abstract:
Internet chatrooms are a common means of interaction and communication, and they carry valuable information about the formal or ad-hoc formation of groups with diverse objectives. This work presents a fully automated surveillance system for data collection and analysis in Internet chatrooms. The system has two components. The first is an eavesdropping tool that collects statistics on individual (chatter) and chatroom behavior; this data can be used to profile a chatroom and its chatters. The second is a computational discovery algorithm based on singular value decomposition (SVD) that locates hidden communities and communication patterns within a chatroom. The eavesdropping tool is used for fine-tuning the SVD-based discovery algorithm, which can be deployed in real time and requires no semantic information processing. The evaluation of the system on real data shows that (i) the statistical properties of different chatrooms vary significantly, so profiling is possible, and (ii) the SVD-based algorithm discovers groups of chatters with up to 70-80% accuracy.
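A toy sketch of the SVD step follows: chatters active in the same time bins load on the same leading singular vectors, so a sign split on the first left singular vector recovers two planted groups. The matrix construction, group structure and threshold here are illustrative, not the paper's actual features.

```python
import numpy as np

rng = np.random.default_rng(4)

# Rows are chatters, columns are time bins, entries count messages.
# Two planted groups are active in different halves of the session.
A = rng.poisson(0.2, size=(20, 40)).astype(float)
A[:10, :20] += rng.poisson(2.0, size=(10, 20))   # group 1: early activity
A[10:, 20:] += rng.poisson(2.0, size=(10, 20))   # group 2: late activity

# The leading singular vectors capture the dominant co-activity pattern.
U, s, Vt = np.linalg.svd(A - A.mean(), full_matrices=False)
labels = (U[:, 0] > 0).astype(int)               # sign split on first left vector

print("recovered group labels per chatter:", labels)
```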
Abstract:
Before e-Technology’s effects on users can be accurately measured, those users must be fully engaged with the relevant systems and services. That is, they must be able to function as part of the digital economy. The paper refers to this ‘user functionality’ as t-Engagement. Not all users are t-Engaged, and in many instances achieving t-Engagement will require assistance from external sources. This paper identifies the current state of Australia’s regional digital economy readiness and highlights the role of Local Government Authorities (‘LGAs’) in enabling t-Engagement. The paper analyses responses to the 2012 BTA, NBN and Digital Economy Survey by LGAs and other regional organisations within Australia. The paper’s particular focus is on the level of use by LGAs of federal, state and other programs designed to enable t-Engagement. The analysis confirms the role of LGAs in enabling t-Engagement and in promoting Australia’s digital economy. The paper concludes by reinforcing the need to ensure ongoing, meaningful federal and state support of regional initiatives, and by identifying issues requiring specific attention.