279 results for Packet Filtering
Abstract:
In this paper, we propose a highly reliable fault diagnosis scheme for incipient low-speed rolling element bearing failures. The scheme consists of fault feature calculation, discriminative fault feature analysis, and fault classification. The proposed approach first computes wavelet-based fault features, namely the relative energy and entropy of each wavelet packet node, by applying a wavelet packet transform to an incoming acoustic emission signal. The most discriminative fault features are then selected from the originally produced feature vector using discriminative fault feature analysis based on a binary bat algorithm (BBA). Finally, the proposed approach employs one-against-all multiclass support vector machines to identify multiple low-speed rolling element bearing defects. This study compares the proposed BBA-based dimensionality reduction scheme with four other dimensionality reduction methodologies in terms of classification performance. Experimental results show that the proposed methodology is superior to the other dimensionality reduction approaches, yielding average classification accuracies of 94.9%, 95.8%, and 98.4% at bearing rotational speeds of 20 revolutions per minute (RPM), 80 RPM, and 140 RPM, respectively.
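A minimal sketch of the wavelet packet feature computation described above, using the PyWavelets library. The wavelet family ("db4"), the decomposition level, and the Shannon entropy definition are illustrative assumptions; the abstract does not give the paper's exact settings.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_packet_features(signal, wavelet="db4", level=3):
    """Relative wavelet packet node energies and node entropies.

    The wavelet and level are hypothetical choices; the abstract
    does not specify the paper's exact settings.
    """
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(node.data ** 2) for node in nodes])
    rel_energy = energies / energies.sum()  # relative node energy
    entropies = []
    for node in nodes:
        p = node.data ** 2
        p = p / p.sum()  # normalized coefficient energies
        entropies.append(-np.sum(p * np.log2(p + 1e-12)))  # Shannon entropy
    return np.concatenate([rel_energy, np.array(entropies)])

# Example: feature vector for a synthetic acoustic-emission-like signal
sig = np.random.randn(4096)
features = wavelet_packet_features(sig)
print(features.shape)  # 2 * 2**level features per signal
```

In the full scheme, this feature vector would then be pruned by the BBA-based selection stage before being passed to the one-against-all SVMs.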
Abstract:
In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, owing to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS)-tracked drifters were used to obtain lower frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature, and other physicochemical parameters. Thorough quality control was carried out by applying relevant error removal filters to the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance, and 'true' turbulence in the flow field. The time series of mean flow measurements for both the ADCP and the drifters were consistent with those of the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to tidal inertial currents. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant proportion related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell within the range of -0.5 to +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations in the high frequency data sampled within the duration of a few tidal cycles. The study characterises the ranges of fluctuation required for accurate modelling of shallow water dispersion and mixing in a sub-tropical estuary.
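To make the triple decomposition concrete, the sketch below separates a velocity record into tidal, slow-fluctuation, and residual turbulent components using two moving-average cutoffs. The cutoff periods and the use of simple centered moving averages are assumptions for illustration; the paper's TD technique is not specified in this abstract.

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average; window given in samples."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def triple_decompose(u, fs, tidal_cutoff_s=3600.0, slow_cutoff_s=60.0):
    """Split a velocity series u (sampled at fs Hz) into tidal mean,
    slow fluctuation, and residual 'true' turbulence components.
    The cutoff periods are illustrative assumptions."""
    w_tidal = int(tidal_cutoff_s * fs) | 1  # force odd window length
    w_slow = int(slow_cutoff_s * fs) | 1
    u_tidal = moving_average(u, w_tidal)           # tidal-scale component
    u_slow = moving_average(u, w_slow) - u_tidal   # slow fluctuations
    u_turb = u - u_tidal - u_slow                  # residual turbulence
    return u_tidal, u_slow, u_turb

# Example with a synthetic 50 Hz record spanning two hours
fs = 50.0
t = np.arange(0, 2 * 3600, 1 / fs)
u = 0.3 * np.sin(2 * np.pi * t / (12.42 * 3600)) + 0.02 * np.random.randn(t.size)
u_tidal, u_slow, u_turb = triple_decompose(u, fs)
```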
Abstract:
We argue that safeguards are necessary to ensure human rights are adequately protected. All systems of blocking access to online content necessarily raise difficult and problematic issues of infringement of freedom of speech and access to information. Given the importance of access to information across the breadth of modern life, great care must be taken to ensure that any measures designed to protect copyright by blocking access to online locations are proportionate. Any measures to block access to online content must be carefully tailored to avoid serious and disproportionate impacts on human rights. This means, first, that the measures must be effective and adapted to achieve a legitimate purpose. The experience of foreign jurisdictions suggests that this legislation is unlikely to be effective. Unless and until there is clear evidence that the proposed scheme is likely to increase effective returns to Australian creators, this legislation should not be introduced. Second, the principle of proportionality requires ensuring that the proposed legislation does not unnecessarily burden legitimate speech or access to information. As currently worded, the draft legislation may result in online locations being blocked even though they would not, if operated in Australia, contravene Australian law. This is unacceptable, and if introduced, the law should be drafted so that it is clearly limited to foreign locations where there is clear and compelling evidence that the location would authorise copyright infringement if it were in Australia. Third, proportionality requires that measures be reasonable and strike an appropriate balance between competing interests. This draft legislation provides few safeguards for the public interest or the interests of private actors who would access legitimate information. New safeguards should be introduced to ensure that the public interest is well represented both at the stage of the primary application and at any applications to rescind or vary injunctions. We recommend that: (1) the legislation not be introduced unless and until there is compelling evidence that it will have a real and significant positive impact on the effective incomes of Australian creators; (2) the 'facilitates an infringement' test in s 115A(1)(b) be replaced with 'authorises infringement'; (3) the 'primary purpose' test in s 115A(1)(c) be replaced with 'the online location has no substantial non-infringing uses'; (4) an explicit role for public interest groups as amici curiae be introduced; (5) costs of successful applications be borne by applicants; (6) injunctions be valid only for renewable two-year terms; (7) s 115A(5) be clarified, and cl (b) and (c) be removed; and (8) the effectiveness of the scheme be evaluated in two years.
Abstract:
In this paper, a novel 2×2 multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) testbed based on an Analog Devices AD9361 highly integrated radio frequency (RF) agile transceiver was implemented specifically to estimate and analyze MIMO-OFDM channel capacity in vehicle-to-infrastructure (V2I) environments using the 920 MHz industrial, scientific, and medical (ISM) band. We implemented two-dimensional discrete cosine transform (DCT)-based filtering to reduce channel estimation errors and show its effectiveness on our measurement results. We also analyzed the effects of channel estimation error on the MIMO channel capacity by simulation. Three subcarrier spacing scenarios were investigated, corresponding to the IEEE 802.11p, Long-Term Evolution (LTE), and Digital Video Broadcasting Terrestrial (DVB-T) (2k) standards. An extensive MIMO-OFDM V2I channel measurement campaign was performed in a suburban environment. Analysis of the measured MIMO channel capacity as a function of the transmitter-to-receiver (TX-RX) separation distance up to 250 m shows that the variance of the MIMO channel capacity is larger for the near-range line-of-sight (LOS) scenarios than for the long-range non-LOS cases, using a fixed receiver signal-to-noise ratio (SNR) criterion. We observed that the largest capacity values were achieved under LOS propagation despite the common assumption of a degenerate MIMO channel in LOS. We attribute this to the large angular spacing between MIMO subchannels that occurs when the receiver vehicle's rooftop antennas pass by the fixed transmitter antennas at close range, causing the MIMO subchannels to become orthogonal. In addition, analysis of the effects of different subcarrier spacings on MIMO-OFDM channel capacity showed negligible differences in mean channel capacity over the subcarrier spacing range investigated. The measured channels described in this paper are available on request.
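A minimal sketch of two-dimensional DCT-based filtering of a noisy channel estimate, using SciPy. The grid size and the number of retained low-order coefficients are illustrative assumptions, not the testbed's actual parameters.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct2_denoise(H_est, keep_time=8, keep_freq=16):
    """Smooth a (symbol x subcarrier) channel estimate by keeping only
    low-order 2D DCT coefficients. The kept-coefficient counts are
    illustrative assumptions, not the testbed's settings."""
    C = dctn(H_est, norm="ortho")
    mask = np.zeros_like(C)
    mask[:keep_time, :keep_freq] = 1.0  # low-pass in time and frequency
    return idctn(C * mask, norm="ortho")

# Example: noisy estimate of a slowly varying channel magnitude
true_H = np.outer(np.hanning(64), np.hanning(128))  # 64 symbols x 128 subcarriers
H_noisy = true_H + 0.05 * np.random.randn(64, 128)
H_smooth = dct2_denoise(H_noisy)
print(np.mean((H_smooth - true_H) ** 2) < np.mean((H_noisy - true_H) ** 2))  # True
```

Because a V2I channel varies slowly across OFDM symbols and subcarriers, most of its energy concentrates in the low-order DCT coefficients, so discarding the rest suppresses estimation noise.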
Abstract:
We extended genetic linkage analysis, a method widely used in quantitative genetics, to 3D images to analyze single-gene effects on brain fiber architecture. We collected 4 Tesla diffusion tensor images (DTI) and genotype data from 258 healthy adult twins and their non-twin siblings. After high-dimensional fluid registration, at each voxel we estimated the genetic linkage between the single nucleotide polymorphism (SNP) Val66Met (dbSNP number rs6265) of the BDNF gene (brain-derived neurotrophic factor) and fractional anisotropy (FA) derived from each subject's DTI scan, by fitting structural equation models (SEM) from quantitative genetics. We also examined how image filtering affects the effect sizes for genetic linkage by examining how the overall significance of voxelwise effects varied with the full width at half maximum (FWHM) of the Gaussian smoothing applied to the FA images. Raw FA maps with no smoothing yielded the greatest sensitivity to detect gene effects when corrected for multiple comparisons using the false discovery rate (FDR) procedure. The BDNF polymorphism significantly contributed to the variation in FA in the posterior cingulate gyrus, where it accounted for around 90-95% of the total variance in FA. Our study generated the first maps to visualize the effect of the BDNF gene on brain fiber integrity, suggesting that common genetic variants may strongly determine white matter integrity.
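The multiple-comparisons step uses the standard false discovery rate procedure; below is a minimal sketch of the Benjamini-Hochberg threshold applied to a vector of voxelwise p-values. The SEM fitting that produces those p-values is omitted, and the example data are synthetic.

```python
import numpy as np

def fdr_threshold(pvals, q=0.05):
    """Benjamini-Hochberg FDR: return the largest p-value threshold at
    which the expected false discovery rate is controlled at level q."""
    p = np.sort(np.asarray(pvals))
    m = p.size
    below = p <= (np.arange(1, m + 1) / m) * q  # BH step-up criterion
    if not below.any():
        return None  # nothing survives correction
    return p[below.nonzero()[0].max()]

# Example: synthetic voxelwise p-values with one genuinely significant cluster
pvals = np.random.uniform(size=100_000)
pvals[:500] *= 1e-4
thr = fdr_threshold(pvals, q=0.05)
print("voxels surviving FDR:", 0 if thr is None else int((pvals <= thr).sum()))
```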
Abstract:
The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of groundwater monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper first describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. It then describes the backend database and the search, filtering, browsing, and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks, and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.
Abstract:
Antigen selection of B cells within the germinal center reaction generally leads to the accumulation of replacement mutations in the complementarity-determining regions (CDRs) of immunoglobulin genes. Studies of mutations in IgE-associated VDJ gene sequences have cast doubt on the role of antigen selection in the evolution of the human IgE response, and it may be that selection for high affinity antibodies is a feature of some but not all allergic diseases. The severity of IgE-mediated anaphylaxis is such that it might be expected to result from higher affinity IgE antibodies. We therefore investigated IGHV mutations in IgE-associated sequences derived from ten individuals with a history of anaphylactic reactions to bee or wasp venom or peanut allergens. IgG sequences, which more certainly experience antigen selection, served as a control dataset. A total of 6025 unique IgE and 5396 unique IgG sequences were generated using high throughput 454 pyrosequencing. The proportion of replacement mutations seen in the CDRs of the IgG dataset was significantly higher than that of the IgE dataset, and the IgE sequences showed little evidence of antigen selection. To exclude the possibility that 454 sequencing errors had compromised the analysis, rigorous filtering of the datasets was performed, yielding core datasets of 90 IgE sequences and 411 IgG sequences. These sequences were present as both forward and reverse reads, and so were most unlikely to include sequencing errors. The filtered datasets confirmed that antigen selection plays a greater role in the evolution of IgG sequences than of IgE sequences derived from the study participants.
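A minimal sketch of the kind of bidirectional-read filter described: retain only sequences observed both as a forward read and as the reverse complement of a reverse read, since agreement between independent orientations makes isolated sequencing errors very unlikely. The function names and read representation are illustrative assumptions.

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence (uppercase A/C/G/T/N)."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G", "N": "N"}
    return "".join(comp[b] for b in reversed(seq))

def filter_bidirectional(forward_reads, reverse_reads):
    """Keep only unique sequences seen both as a forward read and as the
    reverse complement of a reverse read; agreement between independent
    orientations makes isolated sequencing errors very unlikely."""
    fwd = set(forward_reads)
    rev = {revcomp(r) for r in reverse_reads}
    return fwd & rev

# Example with toy reads
fwd = ["ACGTAC", "TTGACA", "GGGCAT"]
rev = ["GTACGT", "ATGCCC"]  # reverse complements of ACGTAC and GGGCAT
print(sorted(filter_bidirectional(fwd, rev)))  # ['ACGTAC', 'GGGCAT']
```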
Abstract:
This paper presents an improved Phase-Locked Loop (PLL) for measuring the fundamental frequency and selective harmonic content of a distorted signal. This information can be used by grid-interfaced devices and harmonic compensators. The single-phase structure is based on the Synchronous Reference Frame (SRF) PLL. By incorporating Moving Average Filters (MAFs) to eliminate the undesired harmonic content at each stage, the proposed PLL needs only a limited number of harmonic stages. The frequency dependency of the MAFs' effectiveness in filtering undesired harmonics is also addressed by a proposed method for adapting to frequency variations of the input signal. The method is suitable for high sampling rates and a wide frequency measurement range. Furthermore, an extended model of this structure is proposed which includes the response to both frequency and phase angle variations. The proposed algorithm is simulated and verified using Hardware-in-the-Loop (HIL) testing.
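A minimal sketch of a frequency-adaptive moving average filter of the kind the PLL incorporates: the window is sized to one period of the component to be rejected and is re-derived from the current frequency estimate. The adaptation rule shown is an illustrative assumption, not the paper's exact method.

```python
import numpy as np

class AdaptiveMAF:
    """Moving average filter whose window spans one period of the
    component to be rejected; the window length is re-derived each step
    from the PLL's current frequency estimate (illustrative sketch)."""

    def __init__(self, fs):
        self.fs = fs   # sampling rate in Hz
        self.buf = []

    def step(self, x, f_est):
        n = max(1, int(round(self.fs / f_est)))  # samples per period
        self.buf.append(x)
        if len(self.buf) > n:
            self.buf = self.buf[-n:]  # keep exactly one period of samples
        return sum(self.buf) / len(self.buf)

# Example: a MAF tuned to 50 Hz removes a 50 Hz ripple around a DC level
fs = 10_000.0
t = np.arange(0.0, 0.2, 1 / fs)
sig = 1.0 + 0.5 * np.sin(2 * np.pi * 50.0 * t)
maf = AdaptiveMAF(fs)
out = [maf.step(s, f_est=50.0) for s in sig]
print(abs(out[-1] - 1.0) < 0.01)  # True: ripple largely removed
```

Averaging over exactly one period nulls that frequency and its integer harmonics, which is why the window must track the measured fundamental as the grid frequency drifts.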
Abstract:
Drivers behave in different ways, and these different behaviors are a cause of traffic disturbances. A key objective for simulation tools is to correctly reproduce this variability, in particular for car-following models. From data collection to the sampling of realistic behaviors, a chain of key issues must be addressed. This paper discusses data filtering, robustness of calibration, correlation between parameters, and sampling techniques for acceleration-time continuous car-following models. The robustness of calibration is systematically investigated with an objective function that allows confidence regions around the minimum to be obtained. Then, the correlation between sets of calibrated parameters and the validity of joint-distribution sampling techniques are discussed. This paper confirms the need for adapted calibration and sampling techniques to obtain realistic sets of car-following parameters, which can later be used for simulation purposes.
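As a concrete example of the calibration setup, the sketch below fits an illustrative car-following model (the Intelligent Driver Model, which the abstract does not name) to observed spacings by minimizing an RMSE objective; scanning this objective around its minimum yields the confidence regions discussed above. All parameter names and data-structure choices are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def idm_accel(v, dv, s, v0, T, a, b, s0):
    """Intelligent Driver Model acceleration; used here only as an
    illustrative car-following model (the abstract names none)."""
    s_star = s0 + v * T + v * dv / (2.0 * np.sqrt(a * b))
    return a * (1.0 - (v / v0) ** 4 - (s_star / s) ** 2)

def simulate_spacing(params, leader_x, leader_v, dt, x0, v_init):
    """Simulate the follower and return its spacing time series."""
    v0, T, a, b, s0 = params
    x, v = x0, v_init
    spacing = []
    for lx, lv in zip(leader_x, leader_v):
        s = max(lx - x, 0.1)
        acc = idm_accel(v, v - lv, s, v0, T, a, b, s0)
        v = max(v + acc * dt, 0.0)
        x += v * dt
        spacing.append(s)
    return np.array(spacing)

def objective(params, data):
    """Spacing RMSE; scanning this around its minimum gives the
    confidence regions discussed above."""
    sim = simulate_spacing(params, data["lx"], data["lv"],
                           data["dt"], data["x0"], data["v0"])
    return np.sqrt(np.mean((sim - data["spacing"]) ** 2))

# Calibration sketch ('data' would come from observed trajectories):
# res = minimize(objective, x0=[30.0, 1.5, 1.0, 2.0, 2.0], args=(data,),
#                method="Nelder-Mead")
```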
Abstract:
Objective: To illustrate a new method for simplifying patient recruitment for advanced prostate cancer clinical trials using natural language processing techniques. Background: The identification of eligible participants for clinical trials is a critical factor in increasing patient recruitment rates and an important issue for the discovery of new treatment interventions. The current practice of identifying eligible participants is highly constrained by manual processing of disparate sources of unstructured patient data. Informatics-based approaches can simplify the complex task of evaluating a patient's eligibility for clinical trials. We show that an ontology-based approach can address the challenge of matching patients to suitable clinical trials. Methods: The free-text descriptions of clinical trial criteria as well as patient data were analysed. A set of common inclusion and exclusion criteria was identified through consultations with expert clinical trial coordinators. A research prototype was developed using the Unstructured Information Management Architecture (UIMA) that identified SNOMED CT concepts in the patient data and clinical trial descriptions. The SNOMED CT concepts model the standard clinical terminology that can be used to represent and evaluate a patient's inclusion/exclusion criteria for a clinical trial. Results: Our experimental research prototype demonstrates a semi-automated method for filtering patient records using common clinical trial criteria. Our method simplified the patient recruitment process. Discussions with clinical trial coordinators indicated that the efficiency of the patient recruitment process, measured in terms of information processing time, could be improved by 25%. Conclusion: A UIMA-based approach can resolve complexities in patient recruitment for advanced prostate cancer clinical trials.
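The actual prototype is built on UIMA (a Java framework); the sketch below only illustrates the downstream matching logic once SNOMED CT concepts have been extracted from patient records and trial criteria. The concept codes are placeholders, not verified SNOMED CT identifiers.

```python
def eligible(patient_concepts, inclusion, exclusion):
    """A patient is a candidate when every inclusion concept appears in
    the record and no exclusion concept does. Concepts stand for
    SNOMED CT identifiers; the codes below are placeholders."""
    patient = set(patient_concepts)
    return inclusion <= patient and not (exclusion & patient)

# Hypothetical trial criteria expressed as concept sets
inclusion = {"C_PROSTATE_CANCER", "C_METASTATIC"}
exclusion = {"C_PRIOR_CHEMO"}

patients = {
    "p1": ["C_PROSTATE_CANCER", "C_METASTATIC"],
    "p2": ["C_PROSTATE_CANCER", "C_METASTATIC", "C_PRIOR_CHEMO"],
}
shortlist = [pid for pid, concepts in patients.items()
             if eligible(concepts, inclusion, exclusion)]
print(shortlist)  # ['p1']
```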
Abstract:
Avian species richness surveys, which measure the total number of unique avian species, can be conducted via remote acoustic sensors. An immense quantity of data can be collected, which, although rich in useful information, places a great workload on the scientists who manually inspect the audio. To deal with this big data problem, we calculated acoustic indices from audio data at a one-minute resolution and used them to classify one-minute recordings into five classes. By filtering out the non-avian minutes, we can reduce the amount of data by about 50% and improve the efficiency of determining avian species richness. The experimental results show that, given 60 one-minute samples, our approach enables ecologists to find about 10% more avian species.
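A minimal sketch of the filtering idea: compute an acoustic index per one-minute block and discard minutes the index marks as non-avian. Spectral entropy is used here as a single illustrative index, and a fixed threshold stands in for the five-class model trained in the paper.

```python
import numpy as np

def spectral_entropy(minute_audio, nfft=512):
    """Normalized entropy of the frame-averaged spectrum of a
    one-minute recording: one illustrative acoustic index (the paper
    computes a suite of indices and trains a five-class model)."""
    frames = minute_audio[: len(minute_audio) // nfft * nfft].reshape(-1, nfft)
    spec = np.mean(np.abs(np.fft.rfft(frames, axis=1)) ** 2, axis=0)
    p = spec / spec.sum()
    return -np.sum(p * np.log2(p + 1e-12)) / np.log2(p.size)

def keep_minute(minute_audio, threshold=0.8):
    """Filter stage: keep a minute only if its spectrum is structured
    (low entropy); a fixed threshold stands in for the classifier."""
    return spectral_entropy(minute_audio) < threshold

# Example: broadband noise (non-avian) is dropped, a tonal call is kept
fs = 22050
noise_minute = np.random.randn(fs * 60)
tone_minute = np.sin(2 * np.pi * 3000 * np.arange(fs * 60) / fs)
print(keep_minute(noise_minute), keep_minute(tone_minute))  # False True
```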
Abstract:
Surveying bird species richness is one of the most intriguing ecological topics in evaluating environmental health. Here, bird species richness denotes the number of unique bird species in a particular area. Factors affecting the investigation of bird species richness include weather, observation bias, and, most importantly, the prohibitive costs of conducting surveys at large spatiotemporal scales. Thanks to advances in recording techniques, these problems have been alleviated by deploying sensors for acoustic data collection. Although automated detection techniques have been introduced to identify various bird species, the innate complexity of bird vocalizations, the background noise present in recordings, and the escalating volume of acoustic data make determining bird species richness a challenging task. In this paper we propose a two-step computer-assisted sampling approach for determining bird species richness in one day of acoustic data. First, a classification model is built based on acoustic indices to filter out minutes that contain few bird species. Then the classified bird minutes are ordered by an acoustic index, and temporally redundant minutes are removed from the ranked minute sequence. The experimental results show that our method is more efficient than previous methods in directing experts to the determination of bird species.
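A minimal sketch of the second step: rank the classified bird minutes by an acoustic index and greedily drop minutes that are temporally close to ones already selected. The ranking index, gap criterion, and minute identifiers are illustrative assumptions.

```python
def select_minutes(minutes, index, min_gap=5):
    """Order bird minutes by descending acoustic index, then greedily
    drop any minute within min_gap minutes of one already selected
    (an illustrative notion of temporal redundancy)."""
    ranked = sorted(minutes, key=lambda m: index[m], reverse=True)
    chosen = []
    for m in ranked:
        if all(abs(m - c) >= min_gap for c in chosen):
            chosen.append(m)
    return chosen

# Example: minute-of-day identifiers with hypothetical index scores
index = {360: 0.91, 362: 0.90, 371: 0.75, 540: 0.60, 541: 0.58}
print(select_minutes(list(index), index))  # [360, 371, 540]
```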
Abstract:
This study examines and quantifies the effect of adding polyelectrolytes to cellulose nanofibre suspensions on the gel point of those suspensions, which is the lowest solids concentration at which the suspension forms a continuous network. The lower the gel point, the faster the drainage time to produce a sheet and the higher the porosity of the final sheet formed. Two new techniques were designed to measure the dynamic compressibility and the drainability of nanocellulose-polyelectrolyte suspensions. We developed a master curve which showed that the independent variable controlling the behaviour of nanocellulose suspensions and their composites is the structure of the flocculated suspension, which is best quantified as the gel point. This was independent of the type of polyelectrolyte used. At an addition level of 2 mg/g of nanofibre, a reduction in gel point of over 50% was achieved using either a high molecular weight (13 MDa) linear cationic polyacrylamide (CPAM, 40% charge), a dendrimer polyethylenimine of high molecular weight (750,000 Da; HPEI), or even one of low molecular weight (2000 Da; LPEI). There was no significant difference in the minimum gel point achieved, despite the differences in polyelectrolyte morphology and molecular weight. In this paper, we show that the gel point controls the flow through the fibre suspension, even when comparing fibre suspensions with solids contents above the gel point. A lower gel point makes it easier for water to drain through the fibre network, reducing the pressure required to achieve a given dewatering rate and reducing the filtering time required to form a wet laid sheet. We further show that the lower gel point partially controls the structure of the wet laid sheet after it is dried. Halving the gel point increased the air permeability of the dry sheet by 37%, 46%, and 25% when using CPAM, HPEI, and LPEI, respectively. The resistance to liquid flow was reduced by 74% and 90% when using CPAM and LPEI. Analysis of the paper formed shows that the sheet forming process and final sheet properties can be engineered and controlled by adding polyelectrolytes to the nanofibre suspension.
Abstract:
Automatic-dishwasher detergent is a common household substance that is extremely corrosive and potentially fatal if ingested. In this report, we discuss the implications of the ingestion of automatic-dishwasher detergent by 18 children over a three-year period. Ten of the 18 children gained access to the detergent from the dishwasher on completion of the washing cycle, while the remainder ingested the detergent directly from the packet. There was a poor correlation between the presenting signs and symptoms and the subsequent endoscopic findings in the 14 children who underwent endoscopy.
Abstract:
This entry discusses the origins and history of media content regulation, the reasons for content regulations, and their application to different media platforms. It discusses online content regulations and the concerns that have motivated such policies, with particular reference to debates about internet filtering. It is noted that, as the convergence of media content, platforms, devices, and services grows, the debates can be expected to shift from free speech and censorship on the internet and the social protection of internet users to wider issues of media policy reform, including cultural policy and industry development in the digital economy.