922 results for Kalman filters
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
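The covering relation at the heart of these routing tables can be sketched with a toy content model. The interval-based filter representation and names below are illustrative assumptions, not the thesis's actual data structures:

```python
# Toy sketch: a subscription filter is a dict mapping attribute names to
# (low, high) value intervals. Filter f1 covers f2 if every event matched
# by f2 is also matched by f1, i.e. f1 constrains no attribute more
# strictly than f2 does.

def covers(f1, f2):
    """Return True if filter f1 covers filter f2."""
    for attr, (lo1, hi1) in f1.items():
        if attr not in f2:
            return False            # f2 leaves this attribute unconstrained
        lo2, hi2 = f2[attr]
        if not (lo1 <= lo2 and hi2 <= hi1):
            return False            # f1's interval does not contain f2's
    return True

# A covering filter can stand in for the covered one, so only the more
# general filter needs to be forwarded to neighbouring routers.
broad  = {"price": (0, 100)}
narrow = {"price": (10, 20), "volume": (1, 5)}
print(covers(broad, narrow))   # True: broad admits everything narrow does
print(covers(narrow, broad))   # False
```

This pairwise test is what a covering-based routing table exploits to prune redundant entries before forwarding.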
Abstract:
This study examined the physical and chemical properties of a novel, fully-recirculated prawn and polychaete production system that incorporated polychaete-assisted sand filters (PASF). The aims were to assess and demonstrate the potential of this system for industrialisation, and to provide optimisations for wastewater treatment by PASF. Two successive seasons were studied at commercially-relevant scales in a prototype system constructed at the Bribie Island Research Centre in Southeast Queensland. The project produced over 5.4 tonnes of high quality black tiger prawns at rates up to 9.9 tonnes per hectare, with feed conversion of up to 1.1. Additionally, the project produced about 930 kg of high value polychaete biomass at rates up to 1.5 kg per square metre of PASF, with the worms feeding predominantly on waste nutrients. Importantly, this closed production system demonstrated rapid growth of healthy prawns at commercially relevant production levels, using methods that appear feasible for application at large scale. Deeper (23 cm) PASF beds provided similar but more reliable wastewater treatment efficacies compared with shallower (13 cm) beds, but did not demonstrate significantly greater polychaete productivity than (easier to harvest) shallow beds. The nutrient dynamics associated with seasonal and tidal operations of the system were studied in detail, providing technical and practical insights into how PASF could be optimised for the mitigation of nutrient discharge. The study also highlighted some of the other important advantages of this integrated system, including low sludge production, no water discharge during the culture phase, high ecosystem health, good prospects for biosecurity controls, and the sustainable production of a fishery-limited resource (polychaetes) that may be essential for the expansion of prawn farming industries throughout the world. 
Regarding nutrient discharge from this prototype mariculture system, when PASF was operating correctly it proved feasible to have no water (or nutrient) discharge during the entire prawn growing season. However, the final drain harvest and emptying of ponds that is necessary at the end of the prawn farming season released 58.4 kg ha⁻¹ of nitrogen and 6 kg ha⁻¹ of phosphorus (in Season 2). Whilst this is well below (i.e., one-third to one-half of) the current load-based licensing conditions for many prawn farms in Australia, the levels of nitrogen and chlorophyll a in the ponds remained higher than the more-stringent maximum limits at the Bribie Island study site. Zero-net-nutrient discharge was not achieved, but waste nutrients were low: 5.91 kg of nitrogen and 0.61 kg of phosphorus were discharged per tonne of prawns produced. This was from a system that deployed PASF at 14.4% of total ponded farm area, which treated an average of 5.8% of pond water daily and did not use settlement ponds or other natural or artificial water remediation systems. Four supplemental appendices complement this research by studying several additional aspects that are central to the industrialisation of PASF. The first details an economic model and decision tool which allows potential users to interactively assess construction and operational variables of PASF at different scales. The second provides the qualitative results of a prawn maturation trial conducted collaboratively with the Commonwealth Scientific and Industrial Research Organisation (CSIRO) to assess dietary inclusions of PASF-produced worms. The third provides the reproductive results from industry-based assessments of prawn broodstock produced using PASF. And the fourth appendix provides detailed elemental and nutritional analyses of bacterial biofilm produced by PASF and assesses its potential to improve the growth of prawns in recirculated culture systems.
Abstract:
Background: Project archives are becoming increasingly large and complex. On construction projects in particular, the increasing amount of information and the increasing complexity of its structure make searching and exploring information in the project archive challenging and time-consuming. Methods: This research investigates a query-driven approach that represents new forms of contextual information to help users understand the set of documents resulting from queries of construction project archives. Specifically, this research extends query-driven interface research by representing three types of contextual information: (1) the temporal context is represented in the form of a timeline to show when each document was created; (2) the search-relevance context shows exactly which of the entered keywords matched each document; and (3) the usage context shows which project participants have accessed or modified a file. Results: We implemented and tested these ideas within a prototype query-driven interface we call VisArchive. VisArchive employs a combination of multi-scale and multi-dimensional timelines, color-coded stacked bar charts, additional supporting visual cues and filters to support searching and exploring historical project archives. The timeline-based interface integrates three interactive timelines as focus + context visualizations. Conclusions: The feasibility of using these visual design principles is tested in two types of project archives: searching construction project archives of an educational building project and tracking of software defects in the Mozilla Thunderbird project. These case studies demonstrate the applicability, usefulness and generality of the design principles implemented.
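The "search-relevance context" described above, showing exactly which entered keywords matched each document, can be sketched as follows. The function name, document structure, and file names are hypothetical, not taken from VisArchive:

```python
# Toy sketch: for each document returned by a query, record exactly which
# of the entered keywords it matched, so an interface could colour-code
# the matches per document.

def relevance_context(keywords, documents):
    """Map each doc id to the subset of query keywords found in its text."""
    result = {}
    for doc_id, text in documents.items():
        words = set(text.lower().split())
        matched = [k for k in keywords if k.lower() in words]
        if matched:
            result[doc_id] = matched
    return result

docs = {
    "rfi-041.txt": "steel beam shop drawing revision",
    "memo-112.txt": "concrete pour schedule for level two",
}
print(relevance_context(["beam", "schedule"], docs))
# {'rfi-041.txt': ['beam'], 'memo-112.txt': ['schedule']}
```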
Abstract:
Early detection of (pre-)signs of ulceration on a diabetic foot is valuable for clinical practice. Hyperspectral imaging is a promising technique for detection and classification of such (pre-)signs. However, the number of spectral bands should be limited to avoid overfitting, which is critical for pixel classification with hyperspectral image data. The goal was to design a detector/classifier based on spectral imaging (SI) with a small number of optical bandpass filters. The performance and stability of the design were also investigated. The selection of the bandpass filters boils down to a feature selection problem. A dataset was built, containing reflectance spectra of 227 skin spots from 64 patients, measured with a spectrometer. Each skin spot was annotated manually by clinicians as "healthy" or a specific (pre-)sign of ulceration. Statistical analysis on the dataset showed the number of required filters is between 3 and 7, depending on additional constraints on the filter set. The stability analysis revealed that shot noise was the most critical factor affecting the classification performance. It indicated that this impact could be avoided in future SI systems with a camera sensor whose saturation level is higher than 10^6, or by post-image processing.
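The filter-selection step reduces to feature selection, as noted above. A hedged sketch of one common approach, greedy forward selection, on synthetic two-class spectra; the separation score, data, band count, and which bands carry signal are all illustrative assumptions, not the paper's:

```python
import numpy as np

# Synthetic two-class reflectance data: 12 candidate bands, of which
# three (indices 2, 7, 9) carry a class difference.
rng = np.random.default_rng(0)
n_bands = 12
healthy = rng.normal(0.5, 0.05, size=(40, n_bands))
lesion  = rng.normal(0.5, 0.05, size=(40, n_bands))
lesion[:, [2, 7, 9]] += 0.3          # informative bands

def separation(bands):
    """Fisher-style score: squared mean gap over pooled variance, summed."""
    d = healthy[:, bands].mean(0) - lesion[:, bands].mean(0)
    v = healthy[:, bands].var(0) + lesion[:, bands].var(0)
    return float((d**2 / v).sum())

# Greedy forward selection: repeatedly add the band that most improves
# the score, until the filter budget (here 3) is spent.
selected = []
for _ in range(3):
    best = max((b for b in range(n_bands) if b not in selected),
               key=lambda b: separation(selected + [b]))
    selected.append(best)

print(sorted(selected))   # recovers the informative bands with this seed
```

The paper's own analysis additionally constrains the candidate set to realizable optical bandpass filters, which is why the required count varies between 3 and 7.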
Abstract:
Purification of drinking water is routinely achieved by use of conventional coagulants and disinfection procedures. However, there are instances, such as flood events, when the level of turbidity reaches extreme levels, while natural organic matter (NOM) may be an issue throughout the year. Consequently, there is a need to develop technologies which can effectively treat water of high turbidity during flood events and NOM content year round. It was our hypothesis that pebble matrix filtration potentially offered a relatively cheap, simple and reliable means to clarify such challenging water samples. Therefore, a laboratory-scale pebble matrix filter (PMF) column was used to evaluate the turbidity and NOM pre-treatment performance in relation to 2013 Brisbane River flood water. Since the high turbidity was only a seasonal and short-term problem, the general applicability of pebble matrix filters for NOM removal was also investigated. A 1.0 m deep bed of pebbles (the matrix) partly in-filled with either sand or crushed glass was tested, upon which was situated a layer of granular activated carbon (GAC). Turbidity was measured as a surrogate for suspended solids (SS), whereas total organic carbon (TOC) and UV absorbance at 254 nm were measured as surrogate parameters for NOM. Experiments using natural flood water showed that without the addition of any chemical coagulants, PMF columns achieved at least 50% turbidity reduction when the source water contained moderate hardness levels. For harder water samples, above 85% turbidity reduction was obtained. The ability to remove 50% of turbidity without chemical coagulants may represent significant cost savings to water treatment plants, and added environmental benefits accrue due to less sludge formation. A TOC reduction of 35-47% and a UV-254 nm reduction of 24-38% were also observed.
In addition to turbidity removal during flood periods, the ability to remove NOM using the pebble matrix filter throughout the year may have the benefit of reducing disinfection by-product (DBP) formation potential and coagulant demand at water treatment plants. Final head losses were remarkably low, reaching only 11 cm at a filtration velocity of 0.70 m/h.
Abstract:
This paper presents an analysis of an optimal linear filter in the presence of constraints on the mean squared values of the estimates, from the viewpoint of singular optimal control. The singular arc has been shown to satisfy the generalized Legendre-Clebsch condition and Jacobson's condition. Both the cases of white measurement noise and coloured measurement noise are considered. The constrained estimate is shown to be a linear transformation of the unconstrained Kalman estimate.
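The closing result, that the constrained estimate is a linear transformation of the unconstrained Kalman estimate, can be illustrated in a toy discrete scalar setting. This setup is an assumption for illustration; the paper works in continuous time via singular optimal control:

```python
import numpy as np

# Scalar Kalman filter estimating a constant from noisy measurements,
# followed by a simple linear scaling that enforces a bound on the
# squared value of the estimate.
rng = np.random.default_rng(1)
x_true, r = 2.0, 0.5**2             # true state, measurement-noise variance
x_hat, p = 0.0, 10.0                # prior mean and variance

for _ in range(200):
    z = x_true + rng.normal(0, 0.5)
    k = p / (p + r)                 # Kalman gain
    x_hat += k * (z - x_hat)        # measurement update
    p *= (1 - k)                    # variance update

cap = 1.5                           # bound on the squared estimate
# Constrained estimate = linear transformation (here, a scaling) of x_hat:
x_con = x_hat * min(1.0, np.sqrt(cap) / abs(x_hat))
print(round(x_hat, 2), round(x_con, 2))
```

The unconstrained estimate converges near 2.0, and the constrained one is clipped to satisfy the mean-square bound while remaining a scalar multiple of the Kalman estimate.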
Abstract:
A nonlinear control design approach is presented in this paper for a challenging application problem: ensuring robust performance of an air-breathing engine operating at supersonic speed. The primary objective of the control design is to ensure that the engine produces the required thrust, tracking the commanded thrust as closely as possible by appropriate regulation of the fuel flow rate. However, since the engine operates in the supersonic range, an important secondary objective is to ensure an optimal location of the shock in the intake for maximum pressure recovery with a sufficient margin. This is manipulated by varying the throat area of the nozzle. The nonlinear dynamic inversion technique has been successfully used to achieve both of the above objectives. In this problem, since the process is faster than the actuators, independent control designs have also been carried out for the actuators to assure satisfactory performance of the system. Moreover, an extended Kalman filter based state estimation design has been carried out, both to filter out the process and sensor noise and to let the control design operate on output feedback. Promising simulation results indicate that the proposed control design approach is quite successful in obtaining robust performance of the air-breathing system.
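The dynamic inversion idea can be sketched in its generic first-order form. The plant functions, gain, and numbers below are illustrative assumptions, not the engine model from the paper:

```python
# Dynamic inversion for xdot = f(x) + g(x)*u: choose
#   u = (xdot_des - f(x)) / g(x),  with  xdot_des = kp*(x_ref - x),
# so the closed-loop tracking error decays linearly.

def f(x): return -0.5 * x + 0.1 * x**2   # illustrative plant nonlinearity
def g(x): return 1.0 + 0.2 * x           # illustrative control effectiveness

def di_control(x, x_ref, kp=4.0):
    """Dynamic-inversion control law for xdot = f(x) + g(x)*u."""
    xdot_des = kp * (x_ref - x)
    return (xdot_des - f(x)) / g(x)

# Forward-Euler simulation: a thrust-like state tracking a commanded value.
x, x_ref, dt = 0.0, 1.0, 0.01
for _ in range(500):
    u = di_control(x, x_ref)
    x += dt * (f(x) + g(x) * u)
print(round(x, 3))   # converges to the command, 1.0
```

With exact model knowledge the nonlinearity is cancelled entirely; in practice model error and the actuator lag mentioned in the abstract are what motivate the separate actuator loops and the EKF.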
Abstract:
The paper deals with a method for the evaluation of exhaust mufflers with mean flow. A new set of variables, convective pressure and convective mass velocity, has been defined to replace the acoustic variables. An expression for the attenuation (insertion loss) of a muffler has been proposed in terms of convective terminal impedances and a velocity ratio, on the lines of the one existing for acoustic filters. In order to evaluate the velocity ratio in terms of convective variables, transfer matrices for various muffler elements have been derived from the basic relations of energy, mass and momentum. Finally, the velocity-ratio-cum-transfer-matrix method is illustrated for a typical straight-through muffler.
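The transfer-matrix machinery can be illustrated for the classical zero-mean-flow case of a single expansion chamber; the paper's convective variables generalize this. Dimensions and frequency below are arbitrary illustrative values:

```python
import numpy as np

# Each element maps (pressure, mass velocity) at its outlet to its inlet
# via a 2x2 matrix; cascading elements is matrix multiplication.

def pipe(k, L, Y):
    """Uniform pipe: length L, characteristic impedance Y, wavenumber k."""
    return np.array([[np.cos(k*L),      1j*Y*np.sin(k*L)],
                     [1j*np.sin(k*L)/Y, np.cos(k*L)]])

c = 343.0                          # speed of sound, m/s
S_pipe, S_cham, L = 1e-3, 9e-3, 0.3   # areas (m^2) and chamber length (m)
f = 250.0                          # Hz
k = 2*np.pi*f/c
Y1, Y2 = c/S_pipe, c/S_cham        # characteristic impedances

T = pipe(k, L, Y2)                 # expansion chamber as a wide pipe
# Transmission loss for equal inlet/outlet pipes of impedance Y1:
tl = 20*np.log10(abs(T[0, 0] + T[0, 1]/Y1 + T[1, 0]*Y1 + T[1, 1]) / 2)
print(round(tl, 1))                # ~13 dB for these dimensions
```

This matches the textbook closed form TL = 10 log10(1 + 0.25 (m - 1/m)^2 sin^2(kL)) with area ratio m = 9, which is the kind of result the convective formulation reduces to when the mean flow vanishes.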
Abstract:
The deviation in the performance of active networks due to practical operational amplifiers (OA) is mainly because of the finite gain bandwidth product B and nonzero output resistance R_0. The effects of B and R_0 on two-OA impedances and on single- and multi-OA filters are discussed. In filters, the effect of R_0 is to add zeros to the transfer function, often making it nonminimum phase. A simple method of analysis has been suggested for 3-OA biquad and coupled-biquad circuits. A general method of noise minimization for the generalized impedance converter (GIC), while operating OAs within the prescribed voltage and current limits, is also discussed. The 3-OA biquadratic sections analyzed also exhibit noise behavior and signal handling capacity similar to the GIC. The GIC-based structures are found to be better than other configurations, both in biquadratic sections and in direct realizations of higher order transfer functions.
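The effect of the finite gain bandwidth product B can be sketched with a one-pole op-amp model applied to an inverting amplifier. Component values are illustrative, and the R_0 (output-resistance) effect is omitted for brevity:

```python
import numpy as np

# One-pole op-amp model A(s) = B/s applied to an inverting amplifier of
# nominal gain -R2/R1; the finite gain bandwidth product B makes the
# closed-loop gain roll off well below B itself.

B = 2*np.pi*1e6        # 1 MHz gain bandwidth product, in rad/s
R1, R2 = 1e3, 10e3     # nominal gain -10

def gain(freq_hz):
    s = 2j*np.pi*freq_hz
    A = B/s                                    # op-amp open-loop gain
    beta = R1/(R1 + R2)                        # feedback factor
    return (-R2/R1) * (A*beta/(1 + A*beta))    # closed-loop gain

print(round(abs(gain(1e3)), 3))     # near the ideal 10 at low frequency
print(round(abs(gain(100e3)), 3))   # noticeably below 10 near B/(1+R2/R1)
```

The closed-loop bandwidth is roughly B·beta (about 91 kHz here), which is why the gain error is already large at 100 kHz.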
Abstract:
We present robust joint nonlinear transceiver designs for multiuser multiple-input multiple-output (MIMO) downlink in the presence of imperfections in the channel state information at the transmitter (CSIT). The base station (BS) is equipped with multiple transmit antennas, and each user terminal is equipped with one or more receive antennas. The BS employs Tomlinson-Harashima precoding (THP) for interuser interference precancellation at the transmitter. We consider robust transceiver designs that jointly optimize the transmit THP filters and receive filter for two models of CSIT errors. The first model is a stochastic error (SE) model, where the CSIT error is Gaussian-distributed. This model is applicable when the CSIT error is dominated by channel estimation error. In this case, the proposed robust transceiver design seeks to minimize a stochastic function of the sum mean square error (SMSE) under a constraint on the total BS transmit power. We propose an iterative algorithm to solve this problem. The other model we consider is a norm-bounded error (NBE) model, where the CSIT error can be specified by an uncertainty set. This model is applicable when the CSIT error is dominated by quantization errors. In this case, we consider a worst-case design. For this model, we consider robust (i) minimum SMSE, (ii) MSE-constrained, and (iii) MSE-balancing transceiver designs. We propose iterative algorithms to solve these problems, wherein each iteration involves a pair of semidefinite programs (SDPs). Further, we consider an extension of the proposed algorithm to the case with per-antenna power constraints. We evaluate the robustness of the proposed algorithms to imperfections in CSIT through simulation, and show that the proposed robust designs outperform nonrobust designs as well as robust linear transceiver designs reported in the recent literature.
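As a point of reference for the designs above, the non-robust linear MMSE receive filter for perfectly known H can be written down in closed form. Dimensions and noise level are illustrative; the robust THP designs themselves involve iterative semidefinite programs and are not reproduced here:

```python
import numpy as np

# Linear MMSE receive filter for y = H x + n with white x and n:
#   W = H^T (H H^T + sigma2 I)^{-1}
rng = np.random.default_rng(2)
nt, nr, sigma2 = 4, 6, 0.1
H = rng.normal(size=(nr, nt))

W = H.T @ np.linalg.inv(H @ H.T + sigma2 * np.eye(nr))

# Orthogonality principle: the error x - W y is uncorrelated with y, i.e.
#   E[x y^T] - W E[y y^T] = H^T - W (H H^T + sigma2 I) = 0,
# which holds by construction of W.
resid = H.T - W @ (H @ H.T + sigma2 * np.eye(nr))
print(np.abs(resid).max() < 1e-10)   # True
```

The robust designs replace the single known H with either a Gaussian error model (SE) or an uncertainty set (NBE) around the CSIT estimate, and optimize the THP and receive filters jointly.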
Abstract:
One of the most important applications of adaptive systems is in noise cancellation using adaptive filters. In this paper, we propose adaptive noise cancellation schemes for the enhancement of EEG signals in the presence of EOG artifacts. The effect of two reference inputs is studied on simulated as well as recorded EEG signals, and it is found that one reference input is enough to obtain sufficient minimization of EOG artifacts. This has also been verified through correlation analysis. We use signal-to-noise ratio and linear prediction spectra, along with time plots, to compare the performance of the proposed schemes in minimizing EOG artifacts from contaminated EEG signals. Results show that the proposed schemes are very effective (especially the one which employs Newton's method) in minimizing the EOG artifacts from contaminated EEG signals.
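The classical LMS form of such a single-reference noise canceller can be sketched on synthetic signals. The paper's Newton-method variant converges faster; the signals, filter length, and step size here are illustrative assumptions:

```python
import numpy as np

# Synthetic primary channel: a fast "EEG" sinusoid contaminated by a slow
# "EOG" artifact that is also available as a clean reference input.
rng = np.random.default_rng(3)
n = 5000
t = np.arange(n)
eeg = np.sin(2*np.pi*0.05*t)             # desired signal
eog = np.sin(2*np.pi*0.003*t)            # artifact reference input
primary = eeg + 2.0*eog                  # contaminated recording

mu, taps = 0.01, 8
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = eog[i-taps:i]                    # reference input vector
    e = primary[i] - w @ x               # error = cleaned EEG estimate
    w += 2*mu*e*x                        # LMS weight update
    out[i] = e

# After adaptation the residual artifact is small: the raw artifact power
# is 2.0, and the remaining error power is well below that.
err = out[3000:] - eeg[3000:]
print(float(np.mean(err**2)))
```

The error output itself is the cleaned EEG, which is what makes the adaptive canceller attractive: no artifact template is needed beyond the reference channel.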
Abstract:
We obtained images of the eastern part of the solar corona in the Fe XIV 530.3 nm (green) and Fe X 637.4 nm (red) coronal emission lines during the total solar eclipse of 29 March 2006 at Manavgat, Antalya, Turkey. The images were obtained using a 35 cm Meade telescope equipped with a Peltier-cooled 2k x 2k CCD and 0.3 nm pass-band interference filters, at cadences of 2.95 s (exposure time 100 ms) and 2.0 s (exposure time 300 ms) in the Fe XIV and Fe X emission lines, respectively. The analysis of the data indicates intensity variations at some locations, with the period of strongest power around 27 s for the green line and 20 s for the red line. These results confirm earlier findings of variations in the continuum intensity with periods in the range of 5 to 56 s by Singh et al. (Solar Phys. 170, 235, 1997). Wavelet analysis has been used to identify significant intensity oscillations at all pixels within our field of view. Significant oscillations with high probability estimates were detected for some locations only. These locations seem to follow the boundary of an active region and its neighborhood, rather than lying within the loops themselves. These intensity oscillations may be caused by fast magneto-sonic waves in the solar corona and may partly account for the heating of the plasma in the corona.
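Period detection of this kind can be sketched with a simple FFT periodogram on a synthetic light curve; the paper uses wavelet analysis, which additionally localizes the oscillations in time. The cadence and period below mimic the green-line observations, but the data are synthetic:

```python
import numpy as np

# Evenly sampled synthetic light curve: a 27 s oscillation on a constant
# background plus white noise, sampled at the green-line cadence.
dt, n = 2.95, 512                        # s per frame, number of frames
t = dt * np.arange(n)
rng = np.random.default_rng(4)
signal = 1.0 + 0.05*np.sin(2*np.pi*t/27.0) + 0.02*rng.normal(size=n)

# FFT periodogram: locate the strongest non-DC frequency bin.
power = np.abs(np.fft.rfft(signal - signal.mean()))**2
freqs = np.fft.rfftfreq(n, dt)
peak = freqs[np.argmax(power[1:]) + 1]   # skip the zero-frequency bin
print(round(1.0/peak, 1))                # recovered period in seconds
```

A wavelet transform would replace the single global spectrum with a time-frequency map, which is what allows the paper to tie oscillations to specific locations and intervals.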
Abstract:
This paper compares the closed-loop performance of seeker-based and radar-based estimators for surface-to-air interception through 6-degree-of-freedom simulation using proportional navigation guidance. Ground radar measurements are evader range, azimuth and elevation angles contaminated by Gaussian noise. Onboard seeker measurements are pursuer-evader relative range and range rate, also contaminated by Gaussian noise. The gimbal angles and line-of-sight rates in the gimbal frame, contaminated by time-correlated non-Gaussian noise with realistic numerical values, are also available as measurements. In both applications, an extended Kalman filter with a Gaussian noise assumption is used for state estimation. For a typical engagement, it is found, based on Monte Carlo studies, that the seeker estimator outperforms the radar estimator in terms of autopilot demand and reduces the miss distance. Thus, a seeker estimator with a white Gaussian assumption is found to be adequate to handle the measurements even in the presence of non-Gaussian correlated noise. This paper uses realistic numerical values for all noise parameters.
Abstract:
Filters and other devices using photonic bandgap (PBG) theory are typically implemented in microstrip lines by etching periodic holes in the ground plane of the microstrip. The period of these holes corresponds to nearly half the guided wavelength of the transmission line. In this paper we study the effects of miniaturizing the PBG device by meandering the microstrip line about one single hole in the ground plane. A comparison of the S-parameters and dispersion behavior of the modified geometry and of a conventional PBG device with a straight microstrip line shows that these devices have similar behaviors.
Abstract:
The BeiDou system is the first global navigation satellite system in which all satellites transmit triple-frequency signals, which can provide positioning, navigation, and timing independently. A benefit of triple-frequency signals is that more useful combinations can be formed, including some extrawide-lane combinations whose ambiguities can generally be fixed instantaneously without distance restriction, although narrow-lane ambiguity resolution (NL AR) still depends on the interreceiver distance or requires a long time to achieve. In this paper, we synthetically study decimeter and centimeter kinematic positioning using BeiDou triple-frequency signals. It starts with AR of two extrawide-lane signals based on the ionosphere-free or ionosphere-reduced geometry-free model. For decimeter positioning, one can immediately use two ambiguity-fixed extrawide-lane observations without pursuing NL AR. To achieve higher accuracy, NL AR is the necessary next step. Despite the fact that long-baseline NL AR is still challenging, some NL ambiguities can indeed be fixed with high reliability. Partial AR for NL signals is acceptable because, as long as some ambiguities for NL signals are fixed, positioning accuracy will certainly be improved. With the accumulation of observations, more and more NL ambiguities are fixed and the positioning accuracy continues to improve. An efficient Kalman-filtering system is established to implement the whole process. The formulated system is flexible, since additional constraints can be easily applied to enhance the model's strength.
Numerical results from a set of real triple-frequency BeiDou data on a 50 km baseline show that decimeter positioning is achievable instantaneously. With only five data epochs, 84% of NL ambiguities can be fixed, so that the real-time kinematic accuracies are 4.5, 2.5, and 16 cm for the north, east, and height components (respectively), while with 10 data epochs more than 90% of NL ambiguities are fixed, and the real-time kinematic solutions are improved to centimeter level for all three coordinate components.
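The appeal of the (extra)wide-lane combinations is their long effective wavelength, c/(f1 - f2): the longer the wavelength, the easier the integer ambiguity is to fix. A quick calculation with the BeiDou B1I/B2I/B3I carrier frequencies:

```python
# Effective wavelengths of between-frequency phase combinations for the
# BeiDou B1I/B2I/B3I carriers. Differencing phases at two frequencies
# gives a combination with wavelength c/(f1 - f2); summing gives the
# shorter narrow-lane wavelength c/(f1 + f2).

C = 299_792_458.0                        # speed of light, m/s
f = {"B1": 1561.098e6, "B2": 1207.140e6, "B3": 1268.520e6}

def lane_wavelength(fa, fb):
    """Wavelength of the between-frequency difference combination."""
    return C / abs(fa - fb)

ewl = lane_wavelength(f["B3"], f["B2"])  # extrawide lane (B3 - B2)
wl  = lane_wavelength(f["B1"], f["B2"])  # wide lane (B1 - B2)
nl  = C / (f["B1"] + f["B2"])            # narrow lane (B1 + B2)
print(round(ewl, 2), round(wl, 2), round(nl, 3))  # 4.88 0.85 0.108
```

The ~4.9 m extrawide-lane wavelength is why those ambiguities fix instantaneously and without distance restriction, while the ~0.11 m narrow lane is the hard step that the paper's partial-AR Kalman-filtering scheme addresses.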