996 results for adaptive sampling
Abstract:
A small array composed of three monopole elements with very small element spacing on the order of λ/6 to λ/20 is considered for application in adaptive beamforming. The properties of this 3-port array are governed by strong mutual coupling. It is shown that for signal-to-noise maximization, it is not sufficient to adjust the weights to compensate for the effects of mutual coupling. The necessity for a RF-decoupling network (RF-DN) and its simple realization are shown. The array with closely spaced elements together with the RF-DN represents a superdirective antenna with a directivity of more than 10 dBi. It is shown that the required fractional frequency bandwidth and the available unloaded Q of the antenna and RF-DN structure determine the lower limit for the element spacing.
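As a numerical illustration of the superdirectivity claim (a minimal sketch with assumed values, not taken from the paper: isotropic radiators in place of monopoles, a spacing of λ/10, and the classical maximum-directivity weights w = B⁻¹a), the following snippet computes the achievable endfire directivity of a closely spaced three-element line array:

```python
# A minimal numerical sketch (not from the paper) of superdirectivity for a
# 3-element line array of isotropic radiators with small spacing d.
# Classical result: max directivity D = a(theta0)^H B^{-1} a(theta0), where
# B_mn = sinc(k |x_m - x_n|) for isotropic elements (sinc(x) = sin(x)/x).
import numpy as np

wavelength = 1.0
k = 2 * np.pi / wavelength
d = wavelength / 10.0                    # assumed element spacing (lambda/10)
x = np.array([0.0, d, 2 * d])            # element positions on a line

# steering vector towards endfire (along the array axis)
a = np.exp(1j * k * x)

# B matrix for isotropic elements: B_mn = sinc(k * |x_m - x_n|)
diff = np.abs(x[:, None] - x[None, :])
B = np.sinc(k * diff / np.pi)            # np.sinc(t) = sin(pi t)/(pi t)

w_opt = np.linalg.solve(B, a)            # maximum-directivity weights B^{-1} a
D = np.real(np.conj(a) @ w_opt)          # D_max = a^H B^{-1} a
print(f"spacing = {d/wavelength:.2f} wavelengths, "
      f"max directivity = {10*np.log10(D):.1f} dBi")
# Uzkov's classical limit: D -> N^2 = 9 (about 9.5 dBi) as d -> 0; monopoles
# over a ground plane add roughly 3 dB, consistent with the >10 dBi quoted above.
```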
Abstract:
To ensure infrastructure assets are procured and maintained by government on behalf of citizens, appropriate policy and institutional architecture is needed, particularly if a fundamental shift to more sustainable infrastructure is the goal. The shift in recent years from competitive and resource-intensive procurement to more collaborative and sustainable approaches to infrastructure governance is considered a major transition in infrastructure procurement systems. In order to better understand this transition in infrastructure procurement arrangements, the concept of emergence from Complex Adaptive Systems (CAS) theory is offered as a key construct. Emergence holds that micro interactions can result in emergent macro order. Applying the concept of emergence to infrastructure procurement, this research examines how interaction of agents in individual projects can result in different industry structural characteristics. The paper concludes that CAS theory, and particularly the concept of ‘emergence’, provides a useful construct to understand infrastructure procurement dynamics and progress towards sustainability.
Abstract:
The current regulatory approach to coal seam gas projects in Queensland is based on the philosophy of adaptive environmental management. This method of “learning by doing” is implemented in Queensland primarily through the imposition of layered monitoring and reporting duties on the coal seam gas operator alongside obligations to compensate and “make good” harm caused. The purpose of this article is to provide a critical review of the Queensland regulatory approach to the approval and minimisation of adverse impacts from coal seam gas activities. Following an overview of the hallmarks of an effective adaptive management approach, this article begins by addressing the mosaic of approval processes and impact assessment regimes that may apply to coal seam gas projects. This includes recent Strategic Cropping Land reforms. This article then turns to consider the preconditions for land access in Queensland and the emerging issues for landholders relating to the negotiation of access and compensation agreements. This article then undertakes a critical review of the environmental duties imposed on coal seam gas operators relating to hydraulic fracturing, well head leaks, groundwater management and the disposal and beneficial use of produced water. Finally, conclusions are drawn regarding the overall effectiveness of the Queensland framework and the lessons that may be drawn from Queensland’s adaptive environmental management approach.
Abstract:
Purpose of review: This review provides an overview of the importance of characterising and considering insect distribution information for designing stored commodity sampling protocols. Findings: Sampling protocols are influenced by a number of factors including government regulations, management practices, new technology and current perceptions of the status of insect pest damage. The spatial distribution of insects in stored commodities influences the efficiency of sampling protocols; these distributions can vary in response to season, treatment and other factors. It is important to use sampling designs based on robust statistics suitable for the purpose. Future research: The development of sampling protocols based on flexible, robust statistics allows for accuracy across a range of spatial distributions. Additionally, power can be added to sampling protocols through the integration of external information such as treatment history and climate. Bayesian analysis provides a coherent and well-understood means to achieve this.
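As one concrete illustration of the Bayesian integration mentioned above (a minimal sketch with hypothetical numbers, not a protocol from the review), a Beta prior on the per-unit infestation probability can encode treatment history and be updated with inspection counts:

```python
# A minimal sketch (not from the review) of how Bayesian updating can fold
# external information, such as treatment history, into a sampling protocol.
# A Beta prior on the per-kernel infestation probability p is chosen from the
# treatment history, then updated with the result of inspecting n kernels.
from scipy import stats

# hypothetical priors: a recently fumigated lot vs. an untreated lot
priors = {"recently_treated": (1, 199),   # prior mean ~0.5% infested
          "untreated":        (2, 98)}    # prior mean ~2% infested

n_sampled, n_infested = 500, 4            # hypothetical inspection result

for history, (a, b) in priors.items():
    post = stats.beta(a + n_infested, b + n_sampled - n_infested)
    # posterior probability that infestation exceeds a 1% action threshold
    p_exceed = post.sf(0.01)
    print(f"{history:16s}: P(p > 1%) = {p_exceed:.2f}")
```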
Abstract:
Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model, which is essentially a function of importance sampling weights. Other methods for this task, such as quadrature, often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
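The core computational idea, estimating each model's evidence from importance sampling weights as observations arrive, can be sketched as follows (an illustrative sequential importance sampling toy with two hypothetical Bernoulli models, not the authors' code; the resampling and move steps of a full SMC are omitted):

```python
# A minimal sketch (not the authors' code) of the key identity: for each model
# m, a particle approximation of the posterior gives the evidence update
# p(y_t | y_{1:t-1}, m) ~ sum_i W_i * p(y_t | theta_i, m), where W_i are the
# normalised importance weights before assimilating y_t. Running products of
# these increments give posterior model probabilities for model discrimination.
import numpy as np

rng = np.random.default_rng(0)
N = 2000                                   # particles per model

# two hypothetical models for a binary response at "design point" x:
# M0: constant rate, M1: logistic in x (priors here are illustrative assumptions)
def prior_sample(model):
    return rng.normal(0.0, 1.5, size=(N, 2 if model == 1 else 1))

def lik(theta, y, x, model):
    eta = theta[:, 0] if model == 0 else theta[:, 0] + theta[:, 1] * x
    p = 1.0 / (1.0 + np.exp(-eta))
    return p**y * (1 - p)**(1 - y)

particles = {m: prior_sample(m) for m in (0, 1)}
logw = {m: np.zeros(N) for m in (0, 1)}
log_evidence = {m: 0.0 for m in (0, 1)}

data = [(0.5, 1), (1.0, 1), (-1.0, 0), (2.0, 1)]   # hypothetical (x, y) pairs
for x, y in data:
    for m in (0, 1):
        W = np.exp(logw[m] - logw[m].max())
        W /= W.sum()                                   # normalised weights
        inc = np.sum(W * lik(particles[m], y, x, m))   # p(y | past data, m)
        log_evidence[m] += np.log(inc)
        logw[m] += np.log(lik(particles[m], y, x, m) + 1e-300)
    # (a full SMC would also resample and move the particles here)

z = np.array([log_evidence[0], log_evidence[1]])
post = np.exp(z - z.max()); post /= post.sum()
print("posterior model probabilities:", post)
```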
Abstract:
Vernier acuity, a form of visual hyperacuity, is amongst the most precise forms of spatial vision. Under optimal conditions, Vernier thresholds are much finer than the inter-photoreceptor distance. Achievement of such high precision is based substantially on cortical computations, most likely in the primary visual cortex. Using stimuli with added positional noise, we show that Vernier processing is reduced with advancing age across a wide range of noise levels. Using an ideal observer model, we are able to characterize the mechanisms underlying this age-related loss, and show that the reduction in Vernier acuity can be attributed mainly to a reduction in sampling efficiency, with no significant change in the level of internal position noise or spatial distortion in the visual system.
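One way to see how efficiency and internal noise can be separated (a sketch assuming a generic equivalent-noise formulation with simulated data, not necessarily the authors' exact ideal-observer model) is to fit thresholds measured at several external noise levels:

```python
# A minimal sketch assuming a generic equivalent-noise model: Vernier
# threshold T at external positional noise sigma_ext is modelled as
# T^2 = (sigma_int^2 + sigma_ext^2) / E, where sigma_int is internal position
# noise and E is sampling efficiency. Fitting E and sigma_int separately shows
# whether a loss is due to lower efficiency, higher internal noise, or both.
import numpy as np
from scipy.optimize import curve_fit

def threshold(sigma_ext, sigma_int, efficiency):
    return np.sqrt((sigma_int**2 + sigma_ext**2) / efficiency)

sigma_ext = np.array([0.0, 1.0, 2.0, 4.0, 8.0])          # arcmin, hypothetical
t_young = threshold(sigma_ext, 0.8, 0.50) * (1 + 0.05 * np.random.default_rng(1).standard_normal(5))
t_older = threshold(sigma_ext, 0.8, 0.20) * (1 + 0.05 * np.random.default_rng(2).standard_normal(5))

for label, t in [("young", t_young), ("older", t_older)]:
    (s_int, eff), _ = curve_fit(threshold, sigma_ext, t, p0=(1.0, 0.3))
    print(f"{label}: internal noise ~ {s_int:.2f} arcmin, efficiency ~ {eff:.2f}")
# In this simulated example only efficiency differs between the groups,
# mirroring the conclusion that sampling efficiency, not internal noise, declines.
```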
Abstract:
This paper proposes a novel approach to video deblocking which performs perceptually adaptive bilateral filtering by considering color, intensity, and motion features in a holistic manner. The method is based on the bilateral filter, an effective smoothing filter that preserves edges. The bilateral filter parameters are adaptive, avoiding over-blurring of texture regions while eliminating blocking artefacts in smooth regions and areas of slow-motion content. This is achieved by using a saliency map to control the strength of the filter at each individual point in the image based on its perceptual importance. The experimental results demonstrate that the proposed algorithm is effective in deblocking highly compressed video sequences while avoiding over-blurring of edges and textures in salient regions of the image.
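A simplified grayscale re-creation of the idea (an illustrative sketch, not the authors' implementation; the saliency map and parameter ranges are assumptions) modulates the bilateral filter's range parameter per pixel according to saliency:

```python
# A minimal grayscale sketch of bilateral filtering whose range parameter is
# modulated per pixel by a saliency map: salient pixels get a weaker filter
# (edges and texture preserved), non-salient pixels a stronger one (blocking
# artefacts smoothed).
import numpy as np

def saliency_adaptive_bilateral(img, saliency, radius=3,
                                sigma_s=2.0, sigma_r_lo=5.0, sigma_r_hi=25.0):
    """img, saliency: 2-D float arrays; saliency scaled to [0, 1]."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # high saliency -> small sigma_r (gentle filter); low -> large sigma_r
            sigma_r = sigma_r_hi - (sigma_r_hi - sigma_r_lo) * saliency[i, j]
            rng_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng_w
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out

# usage on a hypothetical decoded frame:
# deblocked = saliency_adaptive_bilateral(frame.astype(float), saliency_map)
```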
Abstract:
Exploiting wind energy is one possible way to extend flight duration for Unmanned Aerial Vehicles. Wind energy can also be used to minimise energy consumption along a planned path. In this paper, we consider uncertain time-varying wind fields and plan a path through them. A Gaussian distribution is used to model the uncertainty in the time-varying wind fields. We use a Markov Decision Process to plan a path based upon this Gaussian uncertainty. Simulation results are presented that compare the direct line of flight between the start and target points with our planned path in terms of energy consumption and time of travel. The result is a robust path built from the most visited cells when sampling the Gaussian distribution of the wind field in each cell.
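The sampling-based idea can be sketched as follows (a deliberately simplified toy: a small grid, moves restricted to right/down, and a deterministic dynamic program per sampled wind realisation instead of full value iteration over the MDP; all numbers are hypothetical):

```python
# A simplified sketch (illustrative assumptions, not the paper's planner):
# the along-track wind in each grid cell is modelled as Gaussian; for many
# sampled wind-field realisations a minimum-energy path is computed, and the
# cells most often visited across realisations indicate the robust path.
import numpy as np

rng = np.random.default_rng(0)
n = 6
mu = rng.uniform(-1.0, 1.0, size=(n, n))      # mean tailwind per cell
sigma = 0.4                                    # wind uncertainty (std dev)
base_cost = 2.0                                # energy per step in still air

def min_energy_path(cost):
    """DP over right/down moves; returns the list of cells on the optimal path."""
    val = np.full((n, n), np.inf); val[-1, -1] = cost[-1, -1]
    for i in range(n - 1, -1, -1):
        for j in range(n - 1, -1, -1):
            if (i, j) == (n - 1, n - 1):
                continue
            best = min(val[i + 1, j] if i + 1 < n else np.inf,
                       val[i, j + 1] if j + 1 < n else np.inf)
            val[i, j] = cost[i, j] + best
    path, (i, j) = [(0, 0)], (0, 0)
    while (i, j) != (n - 1, n - 1):
        down = val[i + 1, j] if i + 1 < n else np.inf
        right = val[i, j + 1] if j + 1 < n else np.inf
        i, j = (i + 1, j) if down <= right else (i, j + 1)
        path.append((i, j))
    return path

visits = np.zeros((n, n))
for _ in range(500):                           # sample wind-field realisations
    wind = rng.normal(mu, sigma)
    cost = np.maximum(base_cost - wind, 0.1)   # tailwind reduces energy cost
    for cell in min_energy_path(cost):
        visits[cell] += 1

print("visit counts (the robust path follows the most-visited cells):")
print(visits.astype(int))
```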
Abstract:
Fusion techniques have received considerable attention for achieving performance improvement with biometrics. While a multi-sample fusion architecture reduces false rejects, it also increases false accepts. This impact on performance also depends on the nature of subsequent attempts, i.e., random or adaptive. Expressions for error rates are presented and experimentally evaluated in this work by considering the multi-sample fusion architecture for text-dependent speaker verification using HMM-based digit-dependent speaker models. Analysis incorporating correlation modeling demonstrates that the use of adaptive samples improves overall fusion performance compared to randomly repeated samples. For a text-dependent speaker verification system using digit strings, sequential decision fusion of seven instances with three random samples is shown to reduce the overall error of the verification system by 26%, which can be further reduced by 6% for adaptive samples. This analysis, novel in its treatment of random and adaptive multiple presentations within a sequential fused decision architecture, is also applicable to other biometric modalities such as fingerprints and handwriting samples.
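The effect of attempt correlation on an accept-if-any-attempt-passes fusion rule can be illustrated with a small Monte Carlo (an illustrative stand-in with equicorrelated Gaussian scores, not the paper's HMM-based system or its correlation model; reading "adaptive" resubmission as less correlated genuine attempts is an assumption made here for the example):

```python
# A minimal Monte Carlo sketch of an up-to-k-attempts fusion rule: the claimant
# is accepted if any attempt's score passes the threshold. Attempt scores are
# equicorrelated Gaussians; less correlated genuine attempts (one illustrative
# reading of "adaptive" resubmission) yield a lower fused false-reject rate,
# while the fused false-accept rate grows with the number of attempts.
import numpy as np

rng = np.random.default_rng(0)

def fused_error_rates(k, rho, n=200_000, thr=1.5):
    """False-reject / false-accept rates for accept-if-any-of-k-attempts."""
    def correlated_scores(mean):
        common = rng.standard_normal((n, 1))
        unique = rng.standard_normal((n, k))
        return mean + np.sqrt(rho) * common + np.sqrt(1 - rho) * unique
    genuine = correlated_scores(mean=3.0)      # hypothetical genuine score model
    impostor = correlated_scores(mean=0.0)     # hypothetical impostor score model
    frr = np.mean((genuine >= thr).sum(axis=1) == 0)
    far = np.mean((impostor >= thr).sum(axis=1) > 0)
    return frr, far

for rho in (0.9, 0.3):                         # repetitive vs. adaptive attempts
    frr, far = fused_error_rates(k=3, rho=rho)
    print(f"rho={rho:.1f}: FRR={frr:.4f}, FAR={far:.4f}")
```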
Abstract:
Interleukin (IL)-18 is a pleiotropic cytokine with functions in immune modulation, angiogenesis and bone metabolism. In this study, the potential of IL-18 as an immunotherapy for prostate cancer (PCa) was examined using the murine model of prostate carcinoma, RM1, and a bone metastatic variant, RM1(BM)/B4H7-luc. RM1 and RM1(BM)/B4H7-luc cells were stably transfected to express bioactive IL-18. These cells were implanted into syngeneic immunocompetent mice, with or without an IL-18-neutralising antibody (αIL-18, SK113AE4). IL-18 significantly inhibited the growth of both subcutaneous and orthotopic RM1 tumors, and the IL-18-neutralising antibody abrogated this tumor growth inhibition. In vivo neutralisation of interferon-gamma (IFN-γ) completely eliminated the anti-tumor effects of IL-18, confirming an essential role of IFN-γ as a downstream mediator of the anti-tumor activity of IL-18. Tumors from mice in which IL-18 and/or IFN-γ was neutralised contained significantly fewer CD4+ and CD8+ T cells than those with functional IL-18. The essential role of adaptive immunity was demonstrated as tumors grew more rapidly in RAG1−/− mice or in mice depleted of CD4+ and/or CD8+ cells than in normal mice. The tumors in RAG1−/− mice were also significantly smaller when IL-18 was present, indicating that innate immune mechanisms are involved. IL-18 also induced an increase in tumor infiltration of macrophages and neutrophils but not NK cells. In other experiments, direct injection of recombinant IL-18 into established tumors also inhibited tumor growth, which was associated with an increase in intratumoral macrophages, but not T cells. These results suggest that local IL-18 in the tumor environment can significantly potentiate anti-tumor immunity in the prostate and clearly demonstrate that this effect is mediated by both innate and adaptive immune mechanisms.
Abstract:
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, time intervals for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determining sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The method can be used to determine sampling windows for any nonlinear mixed effects model, although our work focuses on an application to population pharmacokinetic models.
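The flavour of such an approach can be conveyed with a toy fixed-effects analogue (an assumption-laden sketch, not the paper's nonlinear mixed-effects machinery): treat the D-efficiency of candidate sampling times as an unnormalised target and run Metropolis-Hastings over the times, reading windows off quantiles of the draws:

```python
# A toy sketch with a fixed-effects one-compartment model and hypothetical
# parameter values: the determinant of the Fisher information over a pair of
# sampling times serves as an unnormalised target density for a random-walk
# Metropolis sampler; quantiles of the draws give windows around the optimum.
import numpy as np

k_el, V, dose, sigma = 0.2, 10.0, 100.0, 1.0      # hypothetical 1-compartment PK values

def fisher_det(times):
    """det of the Fisher information for (k_el, V) in C(t) = (dose/V) * exp(-k_el * t)."""
    t = np.asarray(times, dtype=float)
    c = (dose / V) * np.exp(-k_el * t)
    J = np.column_stack([-t * c, -c / V])         # dC/dk_el and dC/dV at each time
    return np.linalg.det(J.T @ J / sigma**2)

rng = np.random.default_rng(0)
times = np.array([1.0, 8.0])                      # starting sampling times (h)
draws = []
for _ in range(20_000):
    prop = times + rng.normal(0.0, 0.5, size=2)   # symmetric random-walk proposal
    in_bounds = np.all((prop > 0.1) & (prop < 24.0))
    # Metropolis acceptance, target density proportional to D-efficiency^4 on [0.1, 24] h
    if in_bounds and rng.random() < (fisher_det(prop) / fisher_det(times)) ** 4:
        times = prop
    draws.append(np.sort(times))
draws = np.array(draws[5000:])                    # discard burn-in
for i in range(2):
    lo, hi = np.percentile(draws[:, i], [5, 95])
    print(f"sampling window {i + 1}: {lo:.1f}-{hi:.1f} h")
```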
Abstract:
Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection, Integrated Pest Management (IPM), determining grain quality and satisfying importing nations' biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Primarily, research has focussed on developing statistical methodologies for in-storage sampling strategies concentrating on the detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to both fields of research. This review aims to consider the development of statistically based in-storage sampling and surveillance strategies and to identify methods that may be useful for both surveillance and in-storage sampling. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
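One simple probabilistic calculation shared by in-storage sampling and area-wide surveillance (standard detection-probability arithmetic under an independence assumption, not a specific protocol from the review) is the number of sample units needed to detect a given design prevalence:

```python
# A minimal sketch of a calculation common to in-storage sampling and
# surveillance: assuming independent sample units, the probability of detecting
# an infestation present at prevalence p with n units is 1 - (1 - p)^n, so the
# number of units for a target confidence c is n = ln(1 - c) / ln(1 - p).
import math

def units_required(prevalence, confidence=0.95):
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

for p in (0.05, 0.01, 0.001):     # design prevalence of infested units
    print(f"prevalence {p:.1%}: {units_required(p)} units for 95% confidence")
```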
Abstract:
Vehicular safety applications, such as cooperative collision warning systems, rely on beaconing to provide the situational awareness needed to predict, and therefore to avoid, possible collisions. Beaconing is the continual exchange of vehicle motion-state information, such as position, speed, and heading, which enables each vehicle to track its neighboring vehicles in real time. This work presents a context-aware adaptive beaconing scheme that dynamically adapts the beaconing repetition rate based on an estimated channel load and the danger severity of the interactions among vehicles. The safety, efficiency, and scalability of the new scheme are evaluated by simulating vehicle collisions caused by inattentive drivers under various road traffic densities. Simulation results show that the new scheme is more efficient and scalable, and improves safety more than the existing non-adaptive and adaptive-rate schemes.
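A minimal controller in this spirit (an illustrative sketch with assumed rate bounds and target load, not the paper's exact adaptation rule) scales the beacon rate with danger severity and throttles it under channel congestion:

```python
# A minimal sketch of context-aware beacon-rate adaptation: the rate scales up
# with the danger severity of nearby interactions and is throttled back when
# the estimated channel busy ratio exceeds a target load. Bounds and the target
# load are assumed values for illustration.
RATE_MIN, RATE_MAX = 1.0, 10.0      # beacons per second
TARGET_LOAD = 0.6                   # target channel busy ratio

def beacon_rate(danger_severity, channel_load):
    """danger_severity and channel_load are both expected in [0, 1]."""
    # start from a severity-driven rate between the minimum and maximum
    rate = RATE_MIN + danger_severity * (RATE_MAX - RATE_MIN)
    # throttle proportionally once the estimated channel load exceeds the target
    if channel_load > TARGET_LOAD:
        rate *= TARGET_LOAD / channel_load
    return max(RATE_MIN, min(RATE_MAX, rate))

# e.g. a high-risk interaction on a congested channel:
print(beacon_rate(danger_severity=0.9, channel_load=0.8))   # about 6.8 Hz
```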