31 results for semi-automatic method


Relevance: 30.00%

Abstract:

In this paper we concentrate on direct semi-blind spatial equalizer design for MIMO systems with Rayleigh fading channels. Our aim is to develop an algorithm that outperforms the classical training-based method using the same training information, while avoiding the slow convergence and local minima that afflict purely blind methods. A general semi-blind cost function is first constructed which incorporates both the training information from the known data and higher-order statistics (HOS) of the unknown sequence. Then, based on this cost function, we propose two semi-blind iterative and adaptive algorithms to find the desired spatial equalizer. To further improve the performance and convergence speed of the proposed adaptive method, we propose a technique to find the optimal choice of step size. Simulation results demonstrate the performance of the proposed algorithms against comparable schemes.
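A semi-blind cost of this kind, a supervised error term on the known training symbols plus a higher-order-statistics term (here a constant-modulus penalty) on the unknown portion, can be sketched with a simple gradient-descent equalizer. The channel size, QPSK symbols, fixed step size and weighting factor below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2x2 flat Rayleigh MIMO channel carrying two QPSK streams (assumed setup).
n_train, n_blind = 20, 200
n_total = n_train + n_blind
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
qpsk = lambda n: (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
s = qpsk(n_total)                          # the stream we want to recover
x = H @ np.vstack([s, qpsk(n_total)])      # received signal on 2 antennas
x += 0.05 * (rng.standard_normal(x.shape) + 1j * rng.standard_normal(x.shape))

def cost(w, lam=0.5):
    y = w.conj() @ x
    j_train = np.mean(np.abs(y[:n_train] - s[:n_train]) ** 2)   # supervised part
    j_hos = np.mean((np.abs(y[n_train:]) ** 2 - 1.0) ** 2)      # constant-modulus (HOS) part
    return j_train + lam * j_hos

# Plain gradient descent on the combined cost (fixed step size for simplicity).
w = np.array([1.0 + 0j, 0.0 + 0j])
mu, lam = 0.02, 0.5
cost_init = cost(w, lam)
for _ in range(300):
    y = w.conj() @ x
    e = y[:n_train] - s[:n_train]
    g_train = x[:, :n_train] @ e.conj() / n_train               # Wirtinger gradient, MSE term
    yb = y[n_train:]
    g_hos = x[:, n_train:] @ (yb.conj() * (np.abs(yb) ** 2 - 1.0)) / n_blind
    w = w - mu * (g_train + lam * g_hos)
cost_final = cost(w, lam)
```

The blind term regularises the supervised fit over the long unknown block, which is what lets the semi-blind design beat training-only estimation at equal training length.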

Relevance: 30.00%

Abstract:

A novel, fast, automatic motion segmentation approach is presented. It differs from conventional pixel- or edge-based motion segmentation approaches in that it uses labelled regions (facets) to segment the various video objects from the background. Facets are clustered into objects based on their motion and proximity using Bayesian logic. Because the number of facets is usually much lower than the number of edges and points, using facets can greatly reduce the computational complexity of motion segmentation. The proposed method efficiently handles the complexity of video object motion tracking, and offers potential for real-time content-based video annotation.
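As a rough illustration of clustering labelled regions by motion and proximity (the Bayesian formulation itself is not reproduced here), the sketch below merges hypothetical facets whose centroids are close and whose motion vectors nearly agree; all coordinates, motion vectors and thresholds are invented for the example:

```python
import numpy as np

# Hypothetical facets: (centroid_x, centroid_y, motion_dx, motion_dy).
facets = np.array([
    [10.0, 10.0,  2.0, 0.0],
    [14.0, 12.0,  2.1, 0.1],   # moves with facet 0 -> same object
    [60.0, 60.0, -1.0, 3.0],
    [63.0, 58.0, -0.9, 3.1],   # moves with facet 2 -> same object
    [15.0, 60.0,  0.0, 0.0],   # static background facet
])

def cluster_facets(f, d_max=20.0, v_max=0.5):
    """Greedy union-find merge: facets join the same object when they are close
    in space and their motion vectors nearly agree (a crude stand-in for the
    paper's Bayesian motion/proximity clustering)."""
    n = len(f)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            near = np.linalg.norm(f[i, :2] - f[j, :2]) < d_max
            same_motion = np.linalg.norm(f[i, 2:] - f[j, 2:]) < v_max
            if near and same_motion:
                parent[find(j)] = find(i)
    return [find(i) for i in range(n)]

objects = cluster_facets(facets)   # object id per facet
```

Because there are only a handful of facets per frame, the pairwise pass stays cheap, which is the complexity advantage the abstract points to.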

Relevance: 30.00%

Abstract:

Index properties such as the liquid limit and plastic limit are widely used to evaluate certain geotechnical parameters of fine-grained soils. Measurement of the liquid limit is a mechanical process, and the possibility of errors occurring during measurement is not significant. However, this is not the case for plastic limit testing, despite the fact that the current method of measurement is embraced by many standards around the world. The method in question relies on a fairly crude procedure known widely as the 'thread rolling' test, which has been the subject of much criticism in recent years. It is essential that a new, more reliable method of measuring the plastic limit be developed, using a mechanical process that is both consistent and easily reproducible. The work reported in this paper concerns the development of a new device to measure the plastic limit, based on the existing falling cone apparatus. The force required for the test is equivalent to the application of a 54 N fast-static load acting on the existing cone used in liquid limit measurements. The test is complete when the water content of the soil specimen allows the cone to achieve a penetration of 20 mm. The new technique was used to measure the plastic limit of 16 different clays from around the world. The plastic limit measured using the new method identified reasonably well the water content at which the soil phase changes from the plastic to the semi-solid state. Further evaluation was undertaken by conducting plastic limit tests using the new method on selected samples and comparing the results with values reported by local site investigation laboratories; again, reasonable agreement was found.

Relevance: 30.00%

Abstract:

It is convenient and effective to solve nonlinear problems with a model that has a linear-in-the-parameters (LITP) structure. However, the nonlinear parameters (e.g. the width of a Gaussian function) of each model term need to be pre-determined, either from expert experience or through exhaustive search. An alternative approach is to optimize them by a gradient-based technique (e.g. Newton's method). Unfortunately, all of these methods remain computationally expensive. Recently, the extreme learning machine (ELM) has shown its advantages in terms of fast learning from data, but the sparsity of the constructed model cannot be guaranteed. This paper proposes a novel algorithm for the automatic construction of a nonlinear system model based on the extreme learning machine. This is achieved by effectively integrating the ELM and leave-one-out (LOO) cross-validation with our two-stage stepwise construction procedure [1]. The main objective is to improve the compactness and generalization capability of the model constructed by the ELM method. Numerical analysis shows that the proposed algorithm involves only about half the computation of the orthogonal least squares (OLS) based method. Simulation examples are included to confirm the efficacy and superiority of the proposed technique.
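The key ingredients, a random fixed hidden layer and a closed-form leave-one-out error, can be sketched as follows. This is not the paper's two-stage stepwise procedure; it is a minimal illustration, on an invented toy regression problem, of how the LOO (PRESS) error can select a compact ELM without any retraining:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy regression problem.
x = np.linspace(-1.0, 1.0, 100)[:, None]
y = np.sin(3.0 * x[:, 0]) + 0.05 * rng.standard_normal(100)

def elm_hidden(x, n_hidden, seed):
    """ELM hidden layer: random, fixed input weights with a tanh activation."""
    r = np.random.default_rng(seed)
    W = 3.0 * r.standard_normal((x.shape[1], n_hidden))
    b = r.standard_normal(n_hidden)
    return np.tanh(x @ W + b)

def loo_press(H, y, ridge=1e-8):
    """Closed-form leave-one-out error (PRESS): no model is ever retrained."""
    A = H.T @ H + ridge * np.eye(H.shape[1])
    hat = H @ np.linalg.solve(A, H.T)          # hat (smoother) matrix
    resid = y - hat @ y
    return np.mean((resid / (1.0 - np.diag(hat))) ** 2)

# Pick the most compact hidden layer that minimises the LOO error.
sizes = [2, 5, 10, 20]
errors = {n: loo_press(elm_hidden(x, n, seed=n), y) for n in sizes}
best = min(errors, key=errors.get)
```

Because the hidden layer is fixed, the output weights are a linear least-squares problem, which is what makes the LOO error available in closed form and the construction fast.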

Relevance: 30.00%

Abstract:

Raman spectroscopy with far-red excitation has been used to study seized, tableted samples of MDMA (N-methyl-3,4-methylenedioxyamphetamine) and related compounds (MDA, MDEA, MBDB, 2C-B and amphetamine sulfate), as well as pure standards of these drugs. We have found that with far-red (785 nm) excitation the level of fluorescence background, even in untreated seized samples, is sufficiently low that there is little difficulty in obtaining good-quality data with moderate data accumulation times of 2 min. The spectra can be used to distinguish between even chemically similar substances, such as the isomers MDEA and MBDB, and between different polymorphic/hydrated forms of the same drug. Moreover, these differences can be found even in directly recorded spectra of seized samples which have been bulked with other materials, giving a rapid and non-destructive method for drug identification. The spectra can be processed to give unambiguous identification of both drug and excipients (even when more than one compound has been used as the bulking agent), and the relative intensities of drug and excipient bands can be used for quantitative, or at least semi-quantitative, analysis. Finally, the simple nature of the measurements lends itself to automatic sample handling, so that throughputs of 20 samples per hour can be achieved with no real difficulty.

Relevance: 30.00%

Abstract:

An intelligent ink, previously shown to be capable of rapidly assessing photocatalytic activity, was simply applied via a felt pen onto a commercially available piece of Activ™ self-cleaning glass. The ink, comprising the redox dye resazurin and the sacrificial electron donor glycerol within an aqueous hydroxyethyl cellulose (HEC) polymer medium, was photocatalytically degraded in a two-step process. The key initial stage was the photo-reductive conversion of resazurin to resorufin, in which a colour change from blue to pink occurred. The latter stage was the subsequent photo-reduction of the resorufin, in which a slower change from pink to colourless was seen. Red and green components of the red-green-blue colour extracted from flat-bed scanner digital images of resazurin-ink-coated photocatalytic films at intervals during the photocatalysis reaction were inversely proportional to the changes seen via UV-visible absorption spectroscopy and indicative of the reaction kinetics. A 3 x 3 grid of intelligent ink was drawn onto a piece of Activ™ and a glass blank, and the photocatalysis reaction was monitored solely by flat-bed digital scanning. Red-green-blue values at the respective positions on the grid were extracted using a custom-built program entitled RGB Extractor©. The program is capable of extracting a number of 5 x 5 pixel averages of red-green-blue components simultaneously. Allocation of merely three coordinates allows the automatic generation of a grid, with scroll-bars controlling the number of positions to be extracted. No significant change in the red and green components was observed for any position on the glass blank; the Activ™ film, however, displayed a homogeneous photo-reduction of the dye, reaching maxima in red and minima in green components in 23 +/- 3 and 14 +/- 2 min, respectively.
A compositionally graded N-doped titania film, synthesised in house via a combinatorial APCVD reaction, was also photocatalytically tested by this method, with 247 positions on a 13 x 19 grid analysed simultaneously. The dramatic variation in photocatalysis observed was rapidly quantified for all positions (2-3 hours), allowing correlations to be made between the thicknesses and N : Ti% compositions obtained from Swanepoel and WDX analysis, respectively. N incorporation within this system was found to be detrimental to film activity for the photocatalytic degradation of the intelligent ink under 365 nm light.
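The grid-extraction step lends itself to a compact sketch: given three corner coordinates, generate the grid positions and average a 5 x 5 pixel window at each one. The synthetic image, spot colour and coordinates below are invented stand-ins for a real scanned plate (this is not the RGB Extractor program itself):

```python
import numpy as np

# Stand-in for a flat-bed scan: H x W x 3 uint8 image in which a 3 x 3 grid of
# ink spots has been drawn (here: synthetic pink squares on a white background).
img = np.full((300, 300, 3), 255, dtype=np.uint8)
for gy in range(3):
    for gx in range(3):
        cy, cx = 50 + 100 * gy, 50 + 100 * gx
        img[cy - 10:cy + 10, cx - 10:cx + 10] = (230, 60, 140)  # pink-ish RGB

def grid_from_corners(p0, p1, p2, ny, nx):
    """Generate grid coordinates from three corner points (top-left, top-right,
    bottom-left), mimicking the three-coordinate grid allocation described."""
    p0, p1, p2 = map(np.asarray, (p0, p1, p2))
    return [p0 + gy / (ny - 1) * (p2 - p0) + gx / (nx - 1) * (p1 - p0)
            for gy in range(ny) for gx in range(nx)]

def rgb_at(img, pt, half=2):
    """Mean RGB over a 5 x 5 pixel window centred on pt = (y, x)."""
    y, x = int(pt[0]), int(pt[1])
    return img[y - half:y + half + 1, x - half:x + half + 1].reshape(-1, 3).mean(axis=0)

points = grid_from_corners((50, 50), (50, 250), (250, 50), 3, 3)
reds = [rgb_at(img, p)[0] for p in points]   # red component per grid position
```

Tracking `reds` (and the green channel) frame by frame over a sequence of scans gives exactly the per-position kinetics curves the abstract describes.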

Relevance: 30.00%

Abstract:

Color segmentation of images usually requires manual selection and classification of samples to train the system. This paper presents an automatic system that performs these tasks without the need for lengthy training, providing a useful tool for detecting and identifying figures. In real situations, the training process must be repeated if the lighting conditions change, or if, within the same scenario, the colors of the figures or the background change, which makes a fast training method valuable. A direct application of this method is the detection and identification of football players.
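A minimal sketch of such an automatic training step, under the assumption that color samples can be grouped without manual labelling, is plain k-means over pixel colors; the pitch and shirt colors below are synthetic, and this is not the paper's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pitch" pixels: green background plus two hypothetical shirt colors.
green = np.array([40.0, 160.0, 60.0]) + rng.normal(0, 8, (500, 3))
red   = np.array([200.0, 30.0, 40.0]) + rng.normal(0, 8, (60, 3))
white = np.array([230.0, 230.0, 230.0]) + rng.normal(0, 8, (60, 3))
pixels = np.vstack([green, red, white])

def kmeans(x, k, iters=25, seed=0):
    """Plain k-means: an automatic, fast 'training' step that replaces the
    manual selection and labelling of color samples."""
    r = np.random.default_rng(seed)
    c = x[r.choice(len(x), k, replace=False)]
    for _ in range(iters):
        a = np.argmin(((x[:, None] - c) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(a == j):
                c[j] = x[a == j].mean(axis=0)
    return c, a

centers, assign = kmeans(pixels, 3)
# The dominant cluster is the pitch; the others are candidate player colors.
background = np.bincount(assign).argmax()
```

Re-running this clustering on a fresh frame is cheap, which is what makes re-training under changed lighting practical.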

Relevance: 30.00%

Abstract:

An evolution in theoretical models and methodological paradigms for investigating cognitive biases in the addictions is discussed. Anomalies in traditional cognitive perspectives, and problems with the self-report methods which underpin them, are highlighted. An emergent body of cognitive research, contextualized within the principles and paradigms of cognitive neuropsychology rather than social learning theory, is presented which, it is argued, addresses these anomalies and problems. Evidence is presented that biases in the processing of addiction-related stimuli, and in the network of propositions which motivate addictive behaviours, occur at automatic, implicit and pre-conscious levels of awareness. It is suggested that methods which assess such implicit cognitive biases (e.g. Stroop, memory, priming and reaction-time paradigms) yield findings which have better predictive utility for ongoing behaviour than those biases determined by self-report methods of introspection. The potential utility of these findings for understanding "loss of control" phenomena, and the desynchrony between reported beliefs and intentions and ongoing addictive behaviours, is discussed. Applications to the practice of cognitive therapy are considered.

Relevance: 30.00%

Abstract:

Background: Barrett's oesophagus (BO) is a well-recognized precursor of the majority of cases of oesophageal adenocarcinoma (OAC). Endoscopic surveillance of BO patients is frequently undertaken in an attempt to detect early OAC, high-grade dysplasia (HGD) or low-grade dysplasia (LGD). However, histological interpretation and grading of dysplasia is subjective and poorly reproducible. The alternative techniques of flow cytometry and cytology-preparation image cytometry require large amounts of tissue and specialist expertise, which are not widely available for frontline health care.
Methods: This study combined whole slide imaging with DNA image cytometry to provide a novel method for the detection and quantification of abnormal DNA content. 20 cases were evaluated, including 8 of Barrett's specialised intestinal metaplasia (SIM), 6 of LGD and 6 of HGD. Feulgen-stained oesophageal sections (1 µm thickness) were digitally scanned in their entirety and evaluated to select regions of interest and abnormalities. Barrett's mucosa was then interactively chosen for automatic nuclei segmentation, in which irrelevant cell types were ignored. The combined DNA content histogram for all selected image regions was then obtained. In addition, histogram measurements, including the 5c exceeding ratio (xER-5C), the 2c deviation index (2cDI) and the DNA grade of malignancy (DNA-MG), were computed.
Results: The histogram measurements xER-5C, 2cDI and DNA-MG were shown to be effective in differentiating SIM from HGD, SIM from LGD, and LGD from HGD. All three measurements discriminated SIM from HGD with statistical significance (xER-5C: p = 0.0041; 2cDI: p = 0.0151; DNA-MG: p = 0.0057). Statistical significance was also achieved in differentiating SIM from LGD (xER-5C: p = 0.0019; 2cDI: p = 0.0023; DNA-MG: p = 0.0030). Furthermore, the differences between LGD and HGD were statistically significant (xER-5C: p = 0.0289; 2cDI: p = 0.0486; DNA-MG: p = 0.0384).
Conclusion: Whole slide image cytometry is a novel and effective method for the detection and quantification of abnormal DNA content in BO. Compared with manual histological review, the proposed method is more objective and reproducible. Compared with flow cytometry and cytology-preparation image cytometry, it is low cost, simple to use and requires only a single 1 µm tissue section. Whole slide image cytometry could assist the routine clinical diagnosis of dysplasia in BO, which is relevant to the risk of future progression to OAC.
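Two of the histogram measurements have simple standard definitions and can be sketched directly (the DNA-MG formula is deliberately not reproduced here); the DNA-content values below are simulated, not patient data:

```python
import numpy as np

# Simulated DNA-content values (in units of c) for segmented nuclei: a
# near-diploid population plus a few aneuploid nuclei above 5c.
rng = np.random.default_rng(0)
dna_c = np.concatenate([rng.normal(2.0, 0.15, 200), [5.4, 6.1, 7.3]])

def xer_5c(c_values):
    """5c exceeding ratio: fraction of nuclei whose DNA content exceeds 5c."""
    return float(np.mean(c_values > 5.0))

def two_c_di(c_values):
    """2c deviation index: mean squared deviation of DNA content from the
    normal diploid (2c) value."""
    return float(np.mean((c_values - 2.0) ** 2))
```

A handful of nuclei beyond 5c barely moves the mode of the histogram but shows up immediately in both statistics, which is why they are sensitive markers of abnormal DNA content.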

Relevance: 30.00%

Abstract:

Anti-islanding protection is becoming increasingly important due to the rapid installation of distributed generation from renewable resources such as wind, tidal and wave, solar PV and bio-fuels, as well as from other resources such as diesel. Unintentional islanding presents a potential risk of damage to utility plant and to equipment connected on the demand side, as well as a safety risk to the public and to utility personnel. This paper investigates automatic islanding detection, achieved by deploying a statistical process control approach for fault detection using real-time data acquired through a wide-area measurement system based on Phasor Measurement Unit (PMU) technology. In particular, principal component analysis (PCA) is used to project the data into a principal component subspace and a residual space, and two statistics are used to detect the occurrence of a fault. A fault reconstruction method is then used to identify the fault and its development over time. The proposed scheme has been applied to a real system, and the results confirm that it can correctly identify the fault and the islanding site.
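The projection/residual split can be sketched with ordinary PCA on simulated measurement data; the squared prediction error (SPE) below is one of the two monitoring statistics mentioned (the other, Hotelling's T², monitors the retained subspace). The data dimensions, noise level and fault injection are all synthetic assumptions, not PMU records:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for PMU measurements: 8 correlated channels driven by 2 latent
# system modes, plus sensor noise (normal operating data).
n, m, k = 500, 8, 2
L = rng.standard_normal((m, k))
X = rng.standard_normal((n, k)) @ L.T + 0.1 * rng.standard_normal((n, m))

mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:k].T                                    # retained principal directions

def spe(x):
    """Squared prediction error: residual-space statistic for one sample."""
    z = (x - mu) / sd
    r = z - P @ (P.T @ z)
    return float(r @ r)

limit = np.quantile([spe(x) for x in X], 0.99)  # empirical 99% control limit
fault = X[0].copy()
fault[3] += 5 * sd[3]                           # step change on one channel
is_fault = spe(fault) > limit                   # alarm raised?
```

A disturbance that breaks the learned correlation structure, as an islanding event does, lands almost entirely in the residual space, so the SPE crosses its control limit even when every individual channel stays within its own normal range.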

Relevance: 30.00%

Abstract:

Freshwater and brackish microalgal toxins, such as microcystins, cylindrospermopsins, paralytic toxins, anatoxins and other neurotoxins, are produced during the overgrowth of certain phytoplankton and benthic cyanobacteria, including both prokaryotic and eukaryotic microalgae. Although further studies are necessary to define the biological role of these toxins, at least some of them are known to be poisonous to humans and wildlife because of their occurrence in these aquatic systems. The World Health Organization (WHO) has established a provisional recommended limit of 1 μg of microcystin-LR per liter of drinking water. In this work we present a microsphere-based multi-detection method for five classes of freshwater and brackish toxins: microcystin-LR (MC-LR), cylindrospermopsin (CYN), anatoxin-a (ANA-a), saxitoxin (STX) and domoic acid (DA). Five inhibition assays were developed using different binding proteins and microsphere classes coupled to a flow-cytometry Luminex system, and the assays were then combined into a single method for the simultaneous detection of the toxins. The IC50 values obtained with this method were 1.9 ± 0.1 μg L−1 for MC-LR, 1.3 ± 0.1 μg L−1 for CYN, 61 ± 4 μg L−1 for ANA-a, 5.4 ± 0.4 μg L−1 for STX and 4.9 ± 0.9 μg L−1 for DA. Lyophilized cyanobacterial culture samples were extracted using a simple procedure and analyzed both by the Luminex method and by UPLC–IT-TOF-MS. Similar quantification was obtained by both methods for all toxins except ANA-a, for which the estimated content was lower when using UPLC–IT-TOF-MS. This newly developed multiplexed detection method therefore provides a rapid, simple, semi-quantitative screening tool for the simultaneous detection of five environmentally important freshwater and brackish toxins in buffer and in cyanobacterial extracts.
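For each inhibition assay, the IC50 is read off a calibration curve. A minimal way to sketch this step, using log-linear interpolation rather than the full four-parameter logistic fit typically used for such assays, and with invented calibration points rather than the paper's data:

```python
import numpy as np

# Hypothetical competitive-inhibition calibration data for one toxin class.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # μg/L (assumed)
signal = np.array([0.98, 0.92, 0.71, 0.38, 0.15, 0.06])  # normalised signal, B/B0

def ic50_interp(conc, signal, level=0.5):
    """IC50 by log-linear interpolation between the two calibration points
    that bracket 50% binding (a rough stand-in for a 4PL curve fit)."""
    i = np.where(signal <= level)[0][0]            # first point at or below 50%
    x0, x1 = np.log10(conc[i - 1]), np.log10(conc[i])
    y0, y1 = signal[i - 1], signal[i]
    return 10 ** (x0 + (level - y0) * (x1 - x0) / (y1 - y0))

ic50 = ic50_interp(conc, signal)   # μg/L, between the 1.0 and 3.0 points
```

In the multiplexed format, one such curve is maintained per microsphere class, so a single flow-cytometry read yields all five semi-quantitative results at once.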

Relevance: 30.00%

Abstract:

Thermal comfort is defined as 'that condition of mind which expresses satisfaction with the thermal environment' [1][2]. Field studies have been completed in order to establish the governing conditions for thermal comfort [3]. These studies showed that the internal climate of a room was the strongest factor in establishing thermal comfort, so direct manipulation of the internal climate is necessary to retain an acceptable level of thermal comfort. For Building Energy Management System (BEMS) strategies to be utilised efficiently, it is necessary to be able to predict the effect that activating a heating/cooling source (radiators, windows and doors) will have on the room. Numerical modelling of the domain can be challenging due to the need to capture temperature stratification and/or different heat sources (radiators, computers and human beings). Computational fluid dynamics (CFD) models are usually employed for this purpose because they provide the level of detail required. Although they provide the necessary level of accuracy, these models tend to be highly computationally expensive, especially when transient behaviour needs to be analysed; consequently, they cannot be integrated into BEMS. This paper presents and describes the validation of a CFD-ROM method for real-time simulations of building thermal performance. The CFD-ROM method involves the automatic extraction and solution of reduced order models (ROMs) from validated CFD simulations. The test case used in this work is a room of the Environmental Research Institute (ERI) building at University College Cork (UCC). The ROMs have been shown to be sufficiently accurate, with a total error of less than 1%, while successfully retaining a satisfactory representation of the phenomena modelled. The number of zones in a ROM defines its size and complexity, and it has been observed that ROMs with a higher number of zones produce more accurate results.
As each ROM has a time to solution of less than 20 seconds, the ROMs can be integrated into the BEMS of a building, which opens the potential for real-time physics-based building energy modelling.
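A ROM of this kind can be pictured as a small zonal network that integrates in milliseconds. The sketch below is a generic lumped-capacitance two-zone model with invented parameter values, not a ROM extracted from CFD:

```python
import numpy as np

# Each zone is a lumped air node with a heat capacity, coupled to its
# neighbour and to the outside by thermal conductances (illustrative values).
C = np.array([5e5, 5e5])             # J/K, thermal capacity of each zone
K = np.array([[0.0, 50.0],           # W/K, inter-zone conductances
              [50.0, 0.0]])
K_out = np.array([20.0, 20.0])       # W/K, zone-to-outside conductances
T_out = 5.0                          # °C, outside temperature
Q_heat = np.array([1000.0, 0.0])     # W, radiator active in zone 1 only

T = np.array([15.0, 15.0])           # initial zone temperatures, °C
dt = 10.0                            # s, explicit Euler time step
for _ in range(int(3600 / dt)):      # simulate one hour of heating
    flux = K @ T - K.sum(axis=1) * T           # inter-zone heat exchange
    flux += K_out * (T_out - T) + Q_heat       # envelope loss + heat source
    T = T + dt * flux / C
```

With only a handful of zones the state vector is tiny, so a BEMS can evaluate many heating scenarios faster than real time, which is exactly the property that full CFD lacks.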

Relevance: 30.00%

Abstract:

This paper explores semi-qualitative probabilistic networks (SQPNs), which combine numeric and qualitative information. We first show that exact inference with SQPNs is NP^PP-complete. We then show that existing qualitative relations in SQPNs (plus probabilistic logic and imprecise assessments) can be dealt with effectively through multilinear programming. Finally, we discuss learning: we consider a maximum likelihood method that generates point estimates given an SQPN and empirical data, and we describe a Bayesian-minded method that employs the Imprecise Dirichlet Model to generate set-valued estimates.
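The Imprecise Dirichlet Model step admits a very small sketch: instead of a point estimate n_i/N, each category's probability is bracketed by shifting the prior mass s either entirely toward or entirely away from it. The counts and the choice s = 2 below are illustrative:

```python
def idm_interval(counts, s=2.0):
    """Imprecise Dirichlet Model: lower/upper probability per category.
    s is the equivalent-sample-size hyperparameter (commonly 1 or 2)."""
    n = sum(counts)
    return [(c / (n + s), (c + s) / (n + s)) for c in counts]

# e.g. 10 observations spread over three categories
intervals = idm_interval([6, 2, 2])
```

The resulting interval [n_i/(N+s), (n_i+s)/(N+s)] shrinks toward the relative frequency as N grows, which is what makes the estimates set-valued for small samples yet consistent in the limit.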

Relevance: 30.00%

Abstract:

Accurate modelling of the internal climate of buildings is essential if Building Energy Management Systems (BEMS) are to maintain adequate thermal comfort efficiently. Computational fluid dynamics (CFD) models are usually used to predict the internal climate. However, CFD models, although they provide the necessary level of accuracy, are highly computationally expensive and cannot practically be integrated into BEMS. This paper presents and describes the validation of a CFD-ROM method for real-time simulations of building thermal performance. The CFD-ROM method involves the automatic extraction and solution of reduced order models (ROMs) from validated CFD simulations. The ROMs are shown to be adequately accurate, with a total error below 5%, and to retain a satisfactory representation of the phenomena modelled. Each ROM has a time to solution under 20 seconds, which opens the potential for their integration with BEMS, giving real-time physics-based building energy modelling. A parameter study was conducted to investigate the applicability of an extracted ROM to initial boundary conditions different from those from which it was extracted. The results show that the ROMs retained satisfactory total errors when the initial conditions in the room were varied by ±5 °C. This allows the production of a finite number of ROMs with the ability to rapidly model many possible scenarios.

Relevance: 30.00%

Abstract:

One of the most popular techniques for generating classifier ensembles is stacking, which is based on a meta-learning approach. In this paper, we introduce an alternative to stacking based on cluster analysis. As in stacking, instances from a validation set are initially classified by all base classifiers, and the output of each classifier is considered as a new attribute of the instance. The validation set is then divided into clusters according to the new attributes together with a small subset of the original attributes of the instances. For each cluster, we find its centroid and calculate its class label, and the collection of centroids is taken as a meta-classifier. Experimental results show that the new method outperformed all benchmark methods, namely Majority Voting, Stacking J48, Stacking LR, AdaBoost J48, and Random Forest, on 12 out of 22 data sets. The proposed method has two advantageous properties: it is very robust to relatively small training sets, and it can be applied in semi-supervised learning problems. We also provide a theoretical investigation of the proposed method, which demonstrates that for the method to be successful, the base classifiers in the ensemble should have accuracy levels greater than 50%.
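A toy version of the pipeline, base-classifier outputs appended as attributes, k-means clusters formed on the augmented representation, and centroid labels used as the meta-classifier, might look as follows. The data, the threshold "classifiers" and the choice k = 4 are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented 1-D data with two classes; three weak base classifiers are threshold rules.
x = rng.uniform(0.0, 1.0, 300)
y = (x > 0.5).astype(int)
thresholds = [0.4, 0.5, 0.6]
meta = np.stack([(x > t).astype(float) for t in thresholds], axis=1)  # new attributes
features = np.column_stack([meta, x])      # base outputs + an original attribute

# Plain k-means over the augmented representation.
k = 4
centroids = features[rng.choice(len(features), k, replace=False)]
for _ in range(20):
    assign = np.argmin(((features[:, None] - centroids) ** 2).sum(-1), axis=1)
    for j in range(k):
        if np.any(assign == j):
            centroids[j] = features[assign == j].mean(axis=0)

# Each centroid takes the majority class of its cluster: this is the meta-classifier.
labels = np.array([np.bincount(y[assign == j], minlength=2).argmax() for j in range(k)])

def predict(v):
    m = np.array([float(v > t) for t in thresholds] + [v])
    return labels[np.argmin(((centroids - m) ** 2).sum(-1))]

accuracy = np.mean([predict(v) == y[i] for i, v in enumerate(x)])
```

Prediction reduces to a nearest-centroid lookup over k points, so the meta-classifier stays cheap regardless of how many base classifiers feed it, and unlabelled instances can still be clustered, which is where the semi-supervised applicability comes from.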