977 results for "incorporate probabilistic techniques"
Abstract:
In this article, several basic swarming laws for Unmanned Aerial Vehicles (UAVs) are developed for both the two-dimensional (2D) plane and three-dimensional (3D) space. The effects of these basic laws on the group behaviour of swarms of UAVs are studied. It is shown that when the cohesion rule is applied, an equilibrium condition is reached in which all the UAVs settle at the same altitude on a circle of constant radius. It is also proved analytically that this equilibrium condition is stable for all values of velocity and acceleration. A decentralised autonomous decision-making approach that achieves collision avoidance without any central authority is also proposed in this article. With the help of these swarming laws, algorithms are developed for two types of collision avoidance, Group-wise and Individual, in the 2D plane and 3D space. The effects of various parameters on both types of collision avoidance schemes are studied through extensive simulations.
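A cohesion rule of this kind lends itself to a compact simulation. Below is a minimal 2D sketch in which each UAV accelerates towards the centroid of the swarm; the gain, speed limit, time step and integration scheme are assumptions made for illustration, not the laws derived in the article.

```python
import numpy as np

# Illustrative cohesion rule: each UAV accelerates towards the centroid of
# the swarm.  Gains, limits and the integration scheme are assumptions for
# this sketch, not the article's actual swarming laws.
N, DT, STEPS = 10, 0.1, 2000
K_COHESION = 0.5          # acceleration gain towards the swarm centroid
V_MAX = 2.0               # speed limit

rng = np.random.default_rng(0)
pos = rng.uniform(-50, 50, size=(N, 2))   # 2D positions
vel = rng.uniform(-1, 1, size=(N, 2))     # 2D velocities

for _ in range(STEPS):
    centroid = pos.mean(axis=0)
    acc = K_COHESION * (centroid - pos)   # cohesion acceleration
    vel += acc * DT
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > V_MAX,
                   vel * (V_MAX / np.maximum(speed, 1e-12)), vel)  # cap speed
    pos += vel * DT

# Inspect the final geometry; the article proves that under its cohesion law
# the UAVs settle at a common altitude on a circle of constant radius.
radii = np.linalg.norm(pos - pos.mean(axis=0), axis=1)
print("distance from centroid:", radii.round(2))
```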
Abstract:
Lateral or transaxial truncation of cone-beam data can occur either due to the field-of-view limitation of the scanning apparatus or in region-of-interest tomography. In this paper, we suggest two new methods to handle lateral truncation in helical scan CT. It is seen that reconstruction with laterally truncated projection data, assuming it to be complete, gives severe artifacts which even penetrate into the field of view. A row-by-row data completion approach using linear prediction is introduced for helical scan truncated data. An extension of this technique, known as the windowed linear prediction approach, is also introduced. The efficacy of the two techniques is shown using simulations with standard phantoms. A quantitative image quality measure of the resulting reconstructed images is used to evaluate the performance of the proposed methods against an extension of a standard existing technique.
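The row-by-row completion idea can be sketched as follows: fit an autoregressive (linear) predictor to the last measured samples of a truncated detector row and extrapolate past the truncation edge. The model order, fit length and toy data below are assumptions; the windowed variant described in the paper is not shown.

```python
import numpy as np

def complete_row(row, n_extrap, order=5, fit_len=60):
    """Extend a laterally truncated projection row by linear prediction.

    Fits an order-`order` autoregressive predictor to the last `fit_len`
    measured samples (ordinary least squares) and extrapolates `n_extrap`
    samples beyond the truncation edge.  All parameters are illustrative.
    """
    seg = np.asarray(row[-fit_len:], dtype=float)
    # Regression: seg[k] ~ a1*seg[k-1] + ... + a_order*seg[k-order]
    X = np.column_stack([seg[order - j - 1:len(seg) - j - 1] for j in range(order)])
    y = seg[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

    extended = list(row)
    for _ in range(n_extrap):
        prev = np.array(extended[-order:][::-1])   # most recent sample first
        extended.append(float(coeffs @ prev))
    return np.array(extended)

# Toy example: a smooth projection row cut off mid-object.
t = np.linspace(0, np.pi, 200)
full = np.sin(t) ** 2
truncated = full[:150]                      # lateral truncation
completed = complete_row(truncated, n_extrap=50)
print(completed.shape)                      # (200,)
```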
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and the corresponding confidence limits as given in the anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with that of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed, showing that a negative binomial distribution describes pre-treatment helminth egg counts in faeces more appropriately than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions, and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied: the first as advocated in the SCA guidelines; the second similar to the first but basing the variance estimates on negative binomial distributions; and the third using Wadley's method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method, recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on geometric means. A wide selection of parameters was investigated, and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods can be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper level of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
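In the same spirit, the simulation step can be sketched as follows: draw negative binomial egg counts for control and treated groups, estimate the arithmetic-mean percent reduction, and check how often a confidence interval covers the true reduction. The group size, mean count, dispersion and the percentile-bootstrap interval below are illustrative stand-ins, not the SCA, Wadley or VICH procedures compared in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_fecrt(n_animals=10, control_mean=300.0, k=0.5,
                   true_efficacy=0.90, n_sims=200, n_boot=500):
    """Illustrative FECRT simulation with negative binomial egg counts."""
    covered = 0
    true_reduction = 100.0 * true_efficacy
    for _ in range(n_sims):
        # numpy parameterises NB by (n, p); mean = n*(1-p)/p, so p = k/(k+mu)
        control = rng.negative_binomial(k, k / (k + control_mean), n_animals)
        treated_mean = control_mean * (1.0 - true_efficacy)
        treated = rng.negative_binomial(k, k / (k + treated_mean), n_animals)

        # Percentile bootstrap interval for the arithmetic-mean reduction
        reductions = []
        for _ in range(n_boot):
            c = rng.choice(control, n_animals, replace=True).mean()
            t = rng.choice(treated, n_animals, replace=True).mean()
            reductions.append(100.0 * (1.0 - t / max(c, 1e-9)))
        low, high = np.percentile(reductions, [2.5, 97.5])
        covered += low <= true_reduction <= high
    return covered / n_sims

print("empirical coverage:", simulate_fecrt())
```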
Abstract:
Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanisms and their statistical manifestations have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values) that can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS) or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting some arbitrarily chosen significance level such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is based solely on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually, such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
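The non-parametric route described above reduces, in its simplest form, to something like the following, where rainfall totals grouped by SOI phase are compared and a p-value returned. The rainfall data and the grouping below are synthetic stand-ins for illustration.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(2)

# Synthetic seasonal rainfall totals (mm) grouped by the 5 SOI phases.
# Real use would group observed station or gridded rainfall by the phase
# of the preceding Southern Oscillation Index.
phases = [rng.gamma(shape=2.0, scale=60.0 + 15.0 * i, size=40) for i in range(5)]

stat, p_value = kruskal(*phases)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p_value:.4f}")
# A small p-value is evidence that the rainfall distributions differ between
# phases, i.e. that the classification carries forecast information.
```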
Abstract:
In this paper we study two problems in feedback stabilization. The first is the simultaneous stabilization problem, which can be stated as follows: given plants G_0, G_1, ..., G_l, does there exist a single compensator C that stabilizes all of them? The second is that of stabilization by a stable compensator, or more generally, a "least unstable" compensator: given a plant G, we would like to know whether or not there exists a stable compensator C that stabilizes G; if not, what is the smallest number of right half-plane poles (counted according to their McMillan degree) that any stabilizing compensator must have? We show that the two problems are equivalent in the following sense. The problem of simultaneously stabilizing l + 1 plants can be reduced to the problem of simultaneously stabilizing l plants using a stable compensator, which in turn can be stated as the following purely algebraic problem: given 2l matrices A_1, ..., A_l, B_1, ..., B_l, where A_i, B_i are right-coprime for all i, does there exist a matrix M such that A_i + M B_i is unimodular for all i? Conversely, the problem of simultaneously stabilizing l plants using a stable compensator can be formulated as one of simultaneously stabilizing l + 1 plants. The problem of determining whether or not there exists an M such that A + BM is unimodular, given a right-coprime pair (A, B), turns out to be a special case of a question concerning a matrix division algorithm in a proper Euclidean domain. We give an answer to this question, and we believe this result might be of some independent interest. We show that, given two n x m plants G_0 and G_1, we can generically stabilize them simultaneously provided either n or m is greater than one. In contrast, simultaneous stabilizability of two single-input-single-output plants, g_0 and g_1, is not generic.
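Written out in display form, the algebraic core of the reduction is the following question (a restatement for readability using the abstract's own symbols; amsmath is assumed for the notation):

```latex
% Given right-coprime pairs (A_i, B_i), i = 1, ..., l, decide whether
\[
  \exists\, M \quad \text{such that} \quad
  A_i + M B_i \ \text{is unimodular for all } i = 1, \dots, l .
\]
```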
Abstract:
In this paper we develop compilation techniques for the realization of applications described in a High Level Language (HLL) onto a Runtime Reconfigurable Architecture. The compiler determines Hyper Operations (HyperOps), which are subgraphs of the data flow graph of an application and comprise elementary operations that have a strong producer-consumer relationship. These HyperOps are hosted on computation structures that are provisioned on demand at runtime. We also report compiler optimizations that collectively reduce the overheads of data-driven computations in runtime reconfigurable architectures. On average, HyperOps offer a 44% reduction in total execution time and an 18% reduction in management overheads compared to using basic blocks as coarse-grained operations. We show that HyperOps formed using our compiler are suitable to support data flow software pipelining.
Abstract:
This paper presents the site classification of the Bangalore Mahanagar Palike (BMP) area using geophysical data and the evaluation of spectral acceleration at ground level using a probabilistic approach. Site classification has been carried out using experimental data from the shallow geophysical method of Multichannel Analysis of Surface Waves (MASW). One-dimensional (1-D) MASW surveys have been carried out at 58 locations and the respective velocity profiles obtained. The average shear wave velocity for 30 m depth (Vs(30)) has been calculated and is used for the site classification of the BMP area as per NEHRP (National Earthquake Hazards Reduction Program). Based on the Vs(30) values, the major part of the BMP area can be classified as "site class D" and "site class C". A smaller portion of the study area, in and around Lalbagh Park, is classified as "site class B". Further, probabilistic seismic hazard analysis has been carried out to map the seismic hazard in terms of spectral acceleration (Sa) at rock and ground level, considering the site classes and the six seismogenic sources identified. The mean annual rate of exceedance and cumulative probability hazard curves for Sa have been generated. The quantified hazard values in terms of spectral acceleration for short and long periods are mapped for rock, site class C and site class D with 10% probability of exceedance in 50 years on a grid size of 0.5 km. In addition, the Uniform Hazard Response Spectrum (UHRS) at surface level has been developed for 5% damping and 10% probability of exceedance in 50 years for rock, site class C and site class D. These spectral accelerations and uniform hazard spectra can be used to assess the design force for important structures and also to develop the design spectrum.
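For reference, Vs(30) is the travel-time-weighted average of the shear wave velocity over the top 30 m, Vs30 = 30 / sum(d_i / v_i). A small sketch of that calculation from a layered MASW profile follows; the layer thicknesses and velocities below are invented, not measured BMP profiles.

```python
# Vs30 = 30 / sum(d_i / v_i): travel-time average of shear wave velocity
# over the top 30 m, the quantity used for NEHRP site classification.
def vs30(thicknesses_m, velocities_mps):
    depth, travel_time = 0.0, 0.0
    for d, v in zip(thicknesses_m, velocities_mps):
        take = min(d, 30.0 - depth)          # only the part above 30 m counts
        if take <= 0:
            break
        travel_time += take / v
        depth += take
    return 30.0 / travel_time

# Hypothetical 1-D profile from an MASW survey (not a measured BMP profile).
layers = [5.0, 10.0, 15.0, 20.0]             # thickness of each layer, m
vels = [180.0, 260.0, 420.0, 760.0]          # shear wave velocity, m/s

v = vs30(layers, vels)
print(f"Vs30 = {v:.0f} m/s")   # NEHRP: 180-360 m/s -> class D, 360-760 m/s -> class C
```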
Abstract:
Climate variability and change are risk factors for climate-sensitive activities such as agriculture. Managing these risks requires "climate knowledge", i.e. a sound understanding of the causes and consequences of climate variability and knowledge of potential management options that are suitable in light of the climatic risks posed. Often such information about prognostic variables (e.g. yield, rainfall, run-off) is provided in probabilistic terms (e.g. via cumulative distribution functions, CDFs), whereby the quantitative assessment of alternative management options is based on such CDFs. Sound statistical approaches are needed in order to assess whether differences between such CDFs are intrinsic features of system dynamics or chance events (i.e. quantifying evidence against an appropriate null hypothesis). Statistical procedures that rely on such a hypothesis-testing framework are referred to as "inferential statistics", in contrast to descriptive statistics (e.g. mean, median, variance of population samples, skill scores). Here we report on extensions of some of the existing inferential techniques that provide more relevant and adequate information for decision-making under uncertainty.
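As a concrete illustration of the kind of inferential test meant here, the simulated outcome distributions (CDFs) of two management options can be compared with a two-sample Kolmogorov-Smirnov test. The yield samples below are fabricated for the example.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)

# Fabricated simulated yields (t/ha) for two management options.
option_a = rng.normal(loc=3.2, scale=0.8, size=200)
option_b = rng.normal(loc=3.5, scale=0.8, size=200)

stat, p_value = ks_2samp(option_a, option_b)
print(f"KS statistic = {stat:.3f}, p = {p_value:.4f}")
# A small p-value suggests the difference between the two CDFs reflects
# system dynamics rather than chance (sampling noise).
```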
Abstract:
The amount and timing of early wet-season rainfall are important for the management of many agricultural industries in north Australia. With this in mind, a wet-season onset date is defined based on the accumulation of rainfall to a predefined threshold, starting from 1 September, for each square of a 1° gridded analysis of daily rainfall across the region. Consistent with earlier studies, the interannual variability of the onset dates is shown to be well related to the immediately preceding July-August Southern Oscillation index (SOI). Based on this relationship, a forecast method using logistic regression is developed to predict the probability that onset will occur later than the climatological mean date. This method is expanded to also predict the probabilities that onset will be later than any of a range of threshold dates around the climatological mean. When assessed using cross-validated hindcasts, the skill of the predictions exceeds that of climatological forecasts in the majority of locations in north Australia, especially in the Top End region, Cape York, and central Queensland. At times of strong anomalies in the July-August SOI, the forecasts are reliably emphatic. Furthermore, predictions using tropical Pacific sea surface temperatures (SSTs) as the predictor are also tested. While short-lead (July-August predictor) forecasts are more skillful using the SOI, long-lead (May-June predictor) forecasts are more skillful using Pacific SSTs, indicative of the longer-term memory present in the ocean.
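The forecasting step described above is essentially a one-predictor logistic regression. A minimal sketch with synthetic data follows; the real model is fitted to gridded onset dates and observed July-August SOI values, and the coefficients and data below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

# Synthetic training data: July-August SOI vs. a binary "onset later than
# the climatological mean date" indicator (negative SOI makes late onset
# more likely, mimicking the relationship described in the abstract).
soi = rng.normal(0.0, 10.0, size=60)
p_late = 1.0 / (1.0 + np.exp(0.15 * soi))          # assumed true relationship
late_onset = rng.random(60) < p_late

model = LogisticRegression().fit(soi.reshape(-1, 1), late_onset)

# Probability that onset will be later than the climatological mean date,
# given a strongly negative July-August SOI of -15.
print(model.predict_proba([[-15.0]])[0, 1])
```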
Abstract:
Some whole-leaf clearing and staining techniques are described for the microscopic observation of the origin of powdery mildew conidiophores, whether from external mycelium or from internal mycelium emerging through stomata. These techniques enable separation of the two genera, Oidiopsis and Streptopodium, in the Erysiphaceae.
Abstract:
In this work an attempt has been made to evaluate the seismic hazard of South India (8.0°N-20°N; 72°E-88°E) based on probabilistic seismic hazard analysis (PSHA). The earthquake data obtained from different sources were declustered to remove dependent events. A total of 598 earthquakes of moment magnitude 4 and above were obtained for the study area after declustering and were considered for further hazard analysis. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones in the study area that are associated with earthquakes of magnitude 4 and above. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.1° x 0.1°, and the hazard parameters were calculated at the centre of each of these grid cells by considering all the seismic sources within a radius of 300 km. Rock-level peak horizontal acceleration (PHA) and spectral acceleration (SA) values at a period of 1 s, corresponding to 10% and 2% probability of exceedance in 50 years, have been calculated for all the grid points. Contour maps showing the spatial variation of these values are presented here. Uniform hazard response spectra (UHRS) at rock level for 5% damping and 10% and 2% probability of exceedance in 50 years were also developed for all the grid points. The peak ground acceleration (PGA) at surface level was calculated for the whole of South India for four different site classes. These values can be used to find the PGA values at any site in South India based on the site class at that location. Thus, this method can be viewed as a simplified method to evaluate the PGA values at any site in the study area.
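For orientation, the two hazard levels quoted (10% and 2% probability of exceedance in 50 years) correspond, under the usual Poisson occurrence assumption, to return periods of about 475 and 2475 years. A one-line check:

```python
import math

# Poisson occurrence model: P(exceedance in T years) = 1 - exp(-T / T_R),
# so the return period is T_R = -T / ln(1 - P).
for p in (0.10, 0.02):
    t_r = -50.0 / math.log(1.0 - p)
    print(f"{p:.0%} in 50 years  ->  return period ~ {t_r:.0f} years")
```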
Abstract:
Navua sedge, a member of the Cyperaceae family, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomons and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with an annual rainfall exceeding 2500 mm, Navua sedge is capable of forming dense stands that smother many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr and MSMA being the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity and movement off-site may occur when some herbicides are used at the rates predicted for the LC90 control level. A seasonality trial using halosulfuron (97.5 g ai/ha) gave better Navua sedge control (84%) when sprayed from March to September than when sprayed at other times (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application of glyphosate (36%), though loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) using either a rope wick, ground foliar spraying or a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoe (3%), slashing (-13%) or crushing (-30%)) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed being added to the soil seed bank. Treatments that result in seed burial, for example discing, are likely to prolong seed persistence and should be avoided. The sprouting activity of vegetative propagules and of root fragments also needs to be considered when selecting control options.
Abstract:
Multi-access techniques are widely used in computer networking and distributed multiprocessor systems. On-the-fly arbitration schemes permit one of many contenders to access the medium without collisions. Serial arbitration is cost-effective but slow, and hence unsuitable for high-speed multiprocessor environments supporting very high data transfer rates. A fully parallel arbitration scheme takes less time but is not practically realisable for large numbers of contenders. In this paper, a generalised parallel-serial scheme is proposed which significantly reduces the arbitration time and is practically realisable.
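A toy cost model conveys the trade-off being exploited. It is a deliberate simplification for illustration, not the paper's circuit-level scheme: contenders are split into groups, each group is arbitrated in a single parallel step, and the group winners are then resolved serially, so both the number of arbitration steps and the width of the parallel hardware stay moderate, unlike the purely serial and purely parallel extremes.

```python
import math

def parallel_serial_cost(n_contenders, group_size):
    """Toy cost model (an assumption for illustration, not the paper's
    circuit analysis): each group of `group_size` contenders is arbitrated
    in one parallel step, and the group winners are then resolved serially.
    Returns (arbitration_steps, widest_parallel_stage)."""
    n_groups = math.ceil(n_contenders / group_size)
    steps = 1 + n_groups            # one parallel pass + serial pass over groups
    width = group_size              # size of the parallel arbitration stage
    return steps, width

n = 256
for g in (1, 16, 256):              # serial, parallel-serial, fully parallel
    steps, width = parallel_serial_cost(n, g)
    print(f"group size {g:3d}: ~{steps:3d} steps, parallel stage width {width}")
```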
Abstract:
The behavior of pile foundations in non-liquefiable soil under seismic loading is considerably influenced by the variability in the soil and seismic design parameters. Hence, probabilistic models for the assessment of seismic pile design are necessary. Deformation of a pile foundation in non-liquefiable soil is dominated by the inertial force from the superstructure. The present study considers a pseudo-static approach based on code-specified design response spectra. The response of the pile is determined by the equivalent cantilever approach. The soil medium is modeled as a one-dimensional random field along the depth. The variability associated with undrained shear strength, design response spectrum ordinate and superstructure mass is taken into consideration. A Monte Carlo simulation technique is adopted to determine the probability of failure and reliability indices based on the pile failure modes, namely exceedance of the lateral displacement limit and of the moment capacity. A reliability-based design approach for the free-head pile under seismic force is suggested that enables a rational choice of pile design parameters.
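The Monte Carlo step can be condensed into a short sketch: sample the uncertain inputs (undrained shear strength, spectral ordinate, superstructure mass), push each sample through a pile response model, and estimate the probability of failure and the corresponding reliability index. The distributions, the placeholder response expression and the displacement limit below are invented for illustration and cover only the displacement failure mode.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
N = 100_000

# Assumed (illustrative) variability of the three random inputs.
s_u  = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=N)   # undrained shear strength, kPa
sa   = rng.lognormal(mean=np.log(0.25), sigma=0.4, size=N)   # spectral ordinate, g
mass = rng.normal(120.0, 12.0, size=N)                       # superstructure mass, t

# Placeholder pseudo-static response: head deflection grows with the inertial
# force (mass * Sa) and falls with soil strength.  Purely illustrative.
deflection_mm = 130.0 * mass * sa / s_u

LIMIT_MM = 150.0                                   # assumed displacement limit
p_f = np.mean(deflection_mm > LIMIT_MM)            # probability of failure
beta = -norm.ppf(p_f)                              # reliability index
print(f"P_f = {p_f:.4f}, beta = {beta:.2f}")
```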
Abstract:
This Sheep CRC2 project is directed towards the regional development, support, commercialisation and targeted adoption of new integrated management techniques that will lift productivity and better match product with market specifications. The outputs and benefits will help support and improve the productivity and profitability of producers remaining in the industry, and will support the industry in its current move towards one in which both meat and wool are joint drivers of productivity.