927 results for feature inspection method


Relevance:

20.00%

Publisher:

Abstract:

2,2′-Biphenols are a large and diverse group of compounds with exceptional properties both as ligands and bioactive agents. Traditional methods for their synthesis by oxidative dimerisation are often problematic and lead to mixtures of ortho- and para-connected regioisomers. To compound these issues, an intermolecular dimerisation strategy is often inappropriate for the synthesis of heterodimers. The ‘acetal method’ provides a solution for these problems: stepwise tethering of two monomeric phenols enables heterodimer synthesis, enforces ortho regioselectivity and allows relatively facile and selective intramolecular reactions to take place. The resulting dibenzo[1,3]dioxepines have been analysed by quantum chemical calculations to obtain information about the activation barrier for ring flip between the enantiomers. Hydrolytic removal of the dioxepine acetal unit revealed the 2,2′-biphenol target.


Musculoskeletal pain is commonly reported by police officers. A potential cause of officer discomfort is a mismatch between vehicle seats and the method used for carrying appointments. Twenty-five police officers rated their discomfort while seated in: (1) a standard police vehicle seat, and (2) a vehicle seat custom-designed for police use. Discomfort was recorded in both seats while wearing police appointments on: (1) a traditional appointments belt, and (2) a load-bearing vest/belt combination (LBV). Sitting in the standard vehicle seat and carrying appointments on a traditional appointments belt were both associated with significantly elevated discomfort. Four vehicle seat features were most implicated as contributing to discomfort: back rest bolster prominence, lumbar region support, seat cushion width, and seat cushion bolster depth. Authorising the carriage of appointments using an LBV is a lower-cost solution with the potential to reduce officer discomfort. Furthermore, the introduction of custom-designed vehicle seats should be considered.


A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage and to visualize wave propagation. In this technique, a simple signal processing algorithm constructs the WEF map as waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors are employed to improve inspection reliability. Various types of damage in aluminum and carbon fiber reinforced plastic laminated plates were evaluated experimentally and numerically to validate the technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map.
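The core of an energy-map approach can be sketched as follows: each scan point contributes the time-integrated squared amplitude of its signal, so a damaged region that attenuates or scatters waves shows up as an energy anomaly. This is a minimal illustration with synthetic signals, not the paper's laser-scanning algorithm.

```python
import numpy as np

def wave_energy_map(signals):
    """signals: array of shape (ny, nx, nt) -- one time trace per scan point.
    The energy at each point is the time-sum of squared amplitude."""
    return np.sum(signals ** 2, axis=-1)

# Synthetic example: a uniform wave field with attenuated traces over a
# hypothetical flaw (purely illustrative).
ny, nx, nt = 20, 20, 256
t = np.linspace(0, 1, nt)
signals = np.sin(2 * np.pi * 50 * t) * np.ones((ny, nx, 1))
signals[8:12, 8:12, :] *= 0.2   # weaker signal around the flaw

emap = wave_energy_map(signals)
print(emap.shape)                  # (20, 20)
print(emap[0, 0] > emap[10, 10])   # True: the flaw region carries less energy
```

In the real technique the traces come from the laser scan, and the low-energy footprint in the map outlines the shape and size of the damage.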


Spatially-explicit modelling of grassland classes is important for site-specific planning to improve grassland and environmental management over large areas. In this study, a climate-based grassland classification model, the Comprehensive and Sequential Classification System (CSCS), was integrated with spatially interpolated climate data to classify grassland in Gansu province, China. The study area is characterized by complex topographic features imposed by plateaus, high mountains, basins and deserts. To improve the quality of the interpolated climate data and of the spatial classification over this complex topography, three linear regression methods for interpolating climate variables were evaluated: an analytic method based on multiple regression and residues (AMMRR); a modification of AMMRR that adds the effects of slope and aspect to the interpolation analysis (M-AMMRR); and a method that replaces the IDW approach for residue interpolation in M-AMMRR with ordinary kriging (I-AMMRR). The outcomes from the best interpolation method were then used in the CSCS model to classify the grassland in the study area. The interpolated climate variables were annual cumulative temperature and annual total precipitation. The results indicated that the AMMRR and M-AMMRR methods generated acceptable climate surfaces, but the best model fit and cross-validation results were achieved by the I-AMMRR method. Twenty-six grassland classes were identified for the study area. The four grassland vegetation classes that together covered more than half of the total study area were "cool temperate-arid temperate zonal semi-desert", "cool temperate-humid forest steppe and deciduous broad-leaved forest", "temperate-extra-arid temperate zonal desert", and "frigid per-humid rain tundra and alpine meadow".
The vegetation classification map generated in this study provides spatial information on the locations and extents of the different grassland classes. This information can be used to facilitate government agencies' decision-making in land-use planning and environmental management, and for vegetation and biodiversity conservation. The information can also be used to assist land managers in the estimation of safe carrying capacities which will help to prevent overgrazing and land degradation.
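The regression-plus-residue-interpolation idea behind the AMMRR family can be sketched generically: fit a linear trend of the climate variable on terrain covariates, then interpolate the residuals spatially (here with IDW, as in AMMRR/M-AMMRR) and add the two parts back together. The stations, covariate choice, and coefficients below are synthetic illustrations, not the authors' formulation.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation of the regression residuals."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

def regress_then_interpolate(xy, covariates, obs, xy_q, cov_q):
    """Linear trend on terrain covariates plus IDW of the residuals."""
    X = np.column_stack([np.ones(len(obs)), covariates])
    beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
    resid = obs - X @ beta
    Xq = np.column_stack([np.ones(len(xy_q)), cov_q])
    return Xq @ beta + idw(xy, resid, xy_q)

# Synthetic stations: precipitation decreasing with elevation plus local noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(30, 2))
elev = rng.uniform(500, 3000, size=30)
precip = 600 - 0.1 * elev + rng.normal(0, 5, size=30)

xy_q = np.array([[50.0, 50.0]])
est = regress_then_interpolate(xy, elev[:, None], precip, xy_q, np.array([[1500.0]]))
print(est)   # roughly 600 - 0.1*1500 = 450 mm for this synthetic trend
```

I-AMMRR replaces the `idw` step with ordinary kriging, which models the spatial covariance of the residuals instead of using fixed distance weights.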


Disjoint top-view networked cameras are among the most commonly deployed camera networks. One of the open questions in their study is the computation of extrinsic parameters (positions and orientations), known as extrinsic calibration or camera localization. Current approaches either rely on strict assumptions about the object motion to achieve accurate results, or fail to provide high accuracy without such assumptions. To address these shortcomings, we present a location-constrained maximum a posteriori (LMAP) approach that exploits known locations in the surveillance area, some of which are passed by the object opportunistically. The LMAP approach formulates the problem as a joint inference of the extrinsic parameters and the object trajectory from the cameras' observations and the known locations. In addition, a new task-oriented evaluation metric, named MABR (the Maximum value of All image points' Back-projected localization errors' L2 norms Relative to the area of the field of view), is presented to assess the quality of calibration results in an indoor object tracking context. Finally, results demonstrate the superior performance of the proposed method over a state-of-the-art algorithm in terms of both MABR and a classical evaluation metric, in simulations and real experiments.


A sub-domain smoothed Galerkin method is proposed to integrate the advantages of the mesh-free Galerkin method and the FEM. Arbitrarily shaped sub-domains are predefined in the problem domain with mesh-free nodes. In each sub-domain, based on the mesh-free Galerkin weak formulation, the local discrete equation is obtained using moving Kriging interpolation, analogous to the discretization of high-order finite elements. A strain smoothing technique is then applied to the nodal integration of each sub-domain by dividing it into several smoothing cells. Moreover, condensation of degrees of freedom can be introduced into the local discrete equations to improve computational efficiency. The global governing equations are obtained, following the FEM scheme, by assembling the local discrete equations of all sub-domains, while the mesh-free properties of the Galerkin method are retained within each sub-domain. Several 2D elastic problems were solved with the newly proposed method to validate its computational performance. These numerical examples show that the sub-domain smoothed Galerkin method is a robust technique for solid mechanics problems, offering high computational efficiency, good accuracy, and good convergence.


Background: Cancer monitoring and prevention rely on timely notification of cancer cases. However, abstracting and classifying cancer from the free text of pathology reports and other relevant documents, such as death certificates, are complex and time-consuming activities. Aims: This paper investigates approaches for automatically detecting notifiable cancer cases as the cause of death from the free-text death certificates supplied to Cancer Registries. Method: A number of machine learning classifiers were studied. Features were extracted using natural language processing techniques and the Medtex toolkit, and included stemmed words, bi-grams, and concepts from the SNOMED CT medical terminology. The baseline was a keyword spotter using keywords extracted from the long descriptions of ICD-10 cancer-related codes. Results: Death certificates listing notifiable cancer as the cause of death can be effectively identified with the methods studied in this paper. A Support Vector Machine (SVM) classifier achieved the best performance, with an overall F-measure of 0.9866 when evaluated on a set of 5,000 free-text death certificates using the token stem feature set. The SNOMED CT concept plus token stem feature set reached the lowest variance (0.0032) and false negative rate (0.0297) while achieving an F-measure of 0.9864. The SVM classifier accounts for the first 18 of the top 40 evaluated runs and is the most robust classifier, with a variance of 0.001141, half that of the other classifiers. Conclusion: Feature selection had the greatest influence on classifier performance, although the type of classifier also affects performance; the feature weighting scheme had a negligible effect. Specifically, stemmed tokens, with or without SNOMED CT concepts, are the most effective features when combined with an SVM classifier.
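The token-stem-plus-SVM pipeline can be sketched with off-the-shelf components. Everything below is a toy stand-in: the crude suffix stripper replaces a real stemmer, the six example "certificates" and labels are invented, and the Medtex and SNOMED CT features are not reproduced.

```python
# Sketch of stemmed-token features feeding a linear SVM (toy data only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

def stem(token):
    # crude suffix stripping as a stand-in for a real stemmer
    for suf in ("ing", "oma", "s"):
        if token.endswith(suf) and len(token) > len(suf) + 2:
            return token[: -len(suf)]
    return token

docs = [
    "metastatic carcinoma of the lung",
    "malignant melanoma left arm",
    "acute myocardial infarction",
    "chronic obstructive pulmonary disease",
    "carcinoma of the prostate with metastases",
    "cerebrovascular accident stroke",
]
labels = [1, 1, 0, 0, 1, 0]   # 1 = notifiable cancer listed as cause of death

vec = TfidfVectorizer(tokenizer=lambda s: [stem(t) for t in s.split()],
                      token_pattern=None)
clf = LinearSVC().fit(vec.fit_transform(docs), labels)

pred = clf.predict(vec.transform(["carcinoma of the bowel",
                                  "pulmonary embolism"]))
print(pred)   # the cancer mention should be flagged as notifiable
```

In the study, the same shape of pipeline is scaled to 5,000 real certificates, with bi-gram and SNOMED CT concept features optionally concatenated to the token stems.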


In this paper, we propose a new blind steganalytic method to detect the presence of secret messages embedded in black-and-white images by steganographic techniques. We start by extracting several matrices, such as the run-length matrix, the gap-length matrix, and the pixel-difference matrix. We also apply a characteristic function to these matrices to enhance their discriminative capability. We then calculate statistics, including mean, variance, kurtosis and skewness, to form our feature sets. The empirical results demonstrate that the proposed method can effectively detect three different types of steganography, supporting its universality as a blind steganalysis method. In addition, the experimental results show that the proposed method is capable of detecting small amounts of embedded message.
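The feature-extraction step can be illustrated on a toy binary image: collect run lengths per row, then summarise the distribution with the four moment statistics named above. This is a minimal sketch, not the paper's full matrix set or characteristic-function step.

```python
import numpy as np

def run_lengths(row):
    """Lengths of consecutive runs of equal pixels in a 1-D binary row."""
    change = np.flatnonzero(np.diff(row)) + 1
    return np.diff(np.concatenate(([0], change, [len(row)])))

def moment_features(values):
    """Mean, variance, skewness, kurtosis of a set of run lengths."""
    v = np.asarray(values, dtype=float)
    m, s = v.mean(), v.std()
    z = (v - m) / s if s > 0 else np.zeros_like(v)
    return np.array([m, s ** 2, (z ** 3).mean(), (z ** 4).mean()])

img = np.array([[0, 0, 1, 1, 1, 0],
                [1, 1, 1, 1, 0, 0]])
lengths = np.concatenate([run_lengths(r) for r in img])
print(lengths)                          # [2 3 1 4 2]
print(moment_features(lengths).shape)   # (4,) -- one feature vector
```

Embedding tends to break long runs into shorter ones, which shifts exactly these distribution statistics; the classifier then separates cover from stego feature vectors.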


The increasing importance and use of infrastructure such as bridges demand more effective structural health monitoring (SHM) systems. SHM has addressed damage detection through several methods, including modal strain energy (MSE). Many available MSE methods, however, either have been validated only for a limited range of structures, such as beams, or perform unsatisfactorily; they therefore require further improvement and validation for different types of structures. In this study, an MSE method was mathematically improved to precisely quantify structural damage at an early stage of formation. First, the MSE equation was accurately formulated with the damaged stiffness taken into account, and this formulation was then used to derive a more accurate sensitivity matrix. The improved method was verified on two planar structures: a steel truss bridge model and a concrete frame bridge model, representative of short- and medium-span bridges. Two damage scenarios, single- and multiple-damage, were considered for each structure. For each structure, both intact and damaged, modal analysis was performed using STRAND7; the effects of up to 5 per cent noise were also included. The simulated mode shapes and natural frequencies were then imported into a MATLAB code. The results indicate that the improved method converges quickly, agrees well with the numerical assumptions within few computational cycles, and performs well in the presence of noise. The findings can be extended numerically to 2D infrastructure, particularly short- and medium-span bridges, to detect and quantify damage more accurately. The method can thus support an SHM regime that facilitates timely maintenance of bridges and minimises possible loss of life and property.
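The quantity at the heart of MSE methods is the elemental strain energy of a mode shape, 0.5·φᵀKₑφ; damage localisation compares this quantity between intact and damaged states. The spring-chain model below is a generic illustration of that idea, not the authors' improved formulation or their STRAND7 bridge models.

```python
import numpy as np

def element_stiffness(n_dof, e, k):
    """Global stiffness contribution of 1-D spring element e (nodes e, e+1)."""
    K = np.zeros((n_dof, n_dof))
    K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])
    return K

def modal_strain_energy(phi, K_elems):
    """Strain energy of each element for mode shape phi: 0.5 * phi^T K_e phi."""
    return np.array([0.5 * phi @ Ke @ phi for Ke in K_elems])

def first_mode(stiff, n_dof):
    K = sum(element_stiffness(n_dof, e, k) for e, k in enumerate(stiff))
    K = K[1:, 1:]                 # fix DOF 0 (cantilever support)
    _, v = np.linalg.eigh(K)      # unit masses assumed
    return np.concatenate(([0.0], v[:, 0]))

n_el, n_dof = 5, 6
stiff_intact = np.full(n_el, 1000.0)
stiff_damaged = stiff_intact.copy()
stiff_damaged[2] *= 0.7           # 30% stiffness loss in element 2

K_intact = [element_stiffness(n_dof, e, k) for e, k in enumerate(stiff_intact)]
mse_i = modal_strain_energy(first_mode(stiff_intact, n_dof), K_intact)
mse_d = modal_strain_energy(first_mode(stiff_damaged, n_dof), K_intact)
print(int(np.argmax(mse_d / mse_i)))   # 2 -- the damaged element stands out
```

The softened element deforms more in the damaged mode shape, so its strain energy (evaluated with the intact elemental stiffness) rises relative to the others; the paper's sensitivity matrix refines this comparison to quantify, not just locate, the damage.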


This study used a homogeneous water-equivalent model of an electronic portal imaging device (EPID), contoured as a structure in a radiotherapy treatment plan, to produce reference dose images for comparison with in vivo EPID dosimetry images. Head and neck treatments were chosen as the focus of this study, due to the heterogeneous anatomies involved and the consequent difficulty of rapidly obtaining reliable reference dose images by other means. A phantom approximating the size and heterogeneity of a typical neck, with a maximum radiological thickness of 8.5 cm, was constructed for use in this study. This phantom was CT scanned and a simple treatment including five square test fields and one off-axis IMRT field was planned. In order to allow the treatment planning system to calculate dose in a model EPID positioned a distance downstream from the phantom to achieve a source-to-detector distance (SDD) of 150 cm, the CT images were padded with air and the phantom’s “body” contour was extended to encompass the EPID contour. Comparison of dose images obtained from treatment planning calculations and experimental irradiations showed good agreement, with more than 90% of points in all fields passing a gamma evaluation at γ(3%, 3 mm). Similar agreement was achieved when the phantom was over-written with air in the treatment plan and removed from the experimental beam, suggesting that the water-equivalent EPID model at 150 cm SDD is capable of providing accurate reference images for comparison with clinical IMRT treatment images, for patient anatomies with radiological thicknesses ranging from 0 up to approximately 9 cm. This methodology therefore has the potential to be used for in vivo dosimetry during treatments to tissues in the neck as well as the oral and nasal cavities, in the head-and-neck region.
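The γ(3%, 3 mm) criterion used above combines a dose tolerance and a distance-to-agreement tolerance into a single pass/fail score per point. A one-dimensional, globally normalised sketch of the gamma evaluation (not the specific analysis software used in the study):

```python
import numpy as np

def gamma_pass_rate(ref, meas, x, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma evaluation: for each measured point, take the minimum over
    reference points of sqrt((dDose/dose_tol)^2 + (dx/dist_tol)^2); the point
    passes when that minimum is <= 1. Dose tolerance is relative to the
    maximum reference dose (global normalisation)."""
    d_norm = dose_tol * ref.max()
    dd = (meas[:, None] - ref[None, :]) / d_norm
    dx = (x[:, None] - x[None, :]) / dist_tol
    gamma = np.sqrt(dd ** 2 + dx ** 2).min(axis=1)
    return (gamma <= 1.0).mean()

x = np.arange(0, 100.0, 1.0)              # positions in mm
ref = np.exp(-((x - 50) / 20.0) ** 2)     # illustrative reference profile
meas = 1.02 * ref                         # a uniform 2% dose difference
print(gamma_pass_rate(ref, meas, x))      # 1.0 -- within 3%/3 mm everywhere
```

A 2% uniform dose offset passes the 3% dose tolerance outright; points failing the dose test alone can still pass if a reference point within 3 mm has a close enough dose, which is how the metric forgives small spatial misalignments in steep dose gradients.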


In this research, we introduce a new blind steganalysis method for detecting steganography in grayscale JPEG images. A features-pooling method is employed to extract the steganalytic features, and classification is performed using a neural network. Three different steganographic models are tested, and the classification results are compared with five state-of-the-art blind steganalysis methods.


Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint the domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. The method is then tested through a preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north-eastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for testing and reviewing the application of improved tools for governance risk assessment. © 2013 IOP Publishing Ltd.


Moving cell fronts are an essential feature of wound healing, development and disease. The rate at which a cell front moves is driven, in part, by the cell motility, quantified in terms of the cell diffusivity $D$, and the cell proliferation rate $\lambda$. Scratch assays are a commonly-reported procedure used to investigate the motion of cell fronts where an initial cell monolayer is scratched and the motion of the front is monitored over a short period of time, often less than 24 hours. The simplest way of quantifying a scratch assay is to monitor the progression of the leading edge. Leading edge data is very convenient since, unlike other methods, it is nondestructive and does not require labeling, tracking or counting individual cells amongst the population. In this work we study short time leading edge data in a scratch assay using a discrete mathematical model and automated image analysis with the aim of investigating whether such data allows us to reliably identify $D$ and $\lambda$. Using a naïve calibration approach where we simply scan the relevant region of the $(D, \lambda)$ parameter space, we show that there are many choices of $D$ and $\lambda$ for which our model produces indistinguishable short time leading edge data. Therefore, without due care, it is impossible to estimate $D$ and $\lambda$ from this kind of data. To address this, we present a modified approach accounting for the fact that cell motility occurs over a much shorter time scale than proliferation. Using this information we divide the duration of the experiment into two periods, and we estimate $D$ using data from the first period, while we estimate $\lambda$ using data from the second period. 
We confirm the accuracy of our approach using in silico data and a new set of in vitro data, which shows that our method recovers estimates of $D$ and $\lambda$ that are consistent with previously-reported values; moreover, our approach is fast, inexpensive, nondestructive and avoids the need for cell labeling and cell counting.
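The two-period idea can be illustrated numerically. This sketch is not the authors' discrete model: it treats the first period as diffusion-dominated, so the front position grows like $\sqrt{4Dt}$, and uses the classical Fisher-KPP travelling-wave speed $v = 2\sqrt{D\lambda}$ for the second period; the data are synthetic with typical orders of magnitude.

```python
import numpy as np

D_true, lam_true = 1000.0, 0.05              # um^2/h and 1/h (illustrative)

t1 = np.linspace(1, 12, 12)                  # first period (hours)
front1 = np.sqrt(4 * D_true * t1)            # synthetic early front positions
t2 = np.linspace(24, 48, 12)                 # second period
front2 = 2 * np.sqrt(D_true * lam_true) * t2 # synthetic late front positions

# Period one: least-squares fit of front1 = a*sqrt(t1), then D = a^2/4.
a = (front1 @ np.sqrt(t1)) / (np.sqrt(t1) @ np.sqrt(t1))
D_est = a ** 2 / 4

# Period two: fit the late-time front speed, then invert v = 2*sqrt(D*lambda).
v = np.polyfit(t2, front2, 1)[0]
lam_est = v ** 2 / (4 * D_est)

print(round(D_est), round(lam_est, 3))   # recovers ~1000 and ~0.05
```

With noise-free synthetic data both parameters are recovered exactly; the point is that each period constrains one parameter, breaking the $(D, \lambda)$ non-identifiability seen when only short-time leading edge data is used.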


This study presents an acoustic emission (AE) based fault diagnosis for low-speed bearings using a multi-class relevance vector machine (RVM). A low-speed test rig was developed to simulate various defects at shaft speeds as low as 10 rpm under several loading conditions. The data were acquired using an AE sensor with the test bearing operating at a constant loading of 5 kN and a speed range from 20 to 80 rpm. This study is aimed at finding a reliable method for fault diagnosis of low-speed machines based on the AE signal. Component analysis was performed to extract bearing features and to reduce the dimensionality of the original feature set. The results show that the multi-class RVM offers a promising approach for fault diagnosis of low-speed machines.
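The abstract does not specify which component-analysis variant was used; assuming principal component analysis, the dimensionality-reduction step for AE feature vectors might be sketched as follows (synthetic data, illustrative only):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project feature vectors onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
# 100 synthetic AE feature vectors of dimension 12; by construction only
# two latent directions carry signal, the rest is small noise.
latent = rng.normal(size=(100, 2))
mixing = rng.normal(size=(2, 12))
X = latent @ mixing + 0.01 * rng.normal(size=(100, 12))

Z = pca_reduce(X, 2)
print(Z.shape)   # (100, 2) -- reduced features fed to the classifier
var_kept = Z.var(axis=0).sum() / (X - X.mean(0)).var(axis=0).sum()
print(var_kept > 0.99)   # nearly all variance retained in 2 components
```

The reduced vectors `Z` would then be passed to the multi-class RVM (or any classifier) in place of the raw 12-dimensional features.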


Robust facial expression recognition (FER) under occluded face conditions is challenging. It requires robust feature extraction algorithms and investigation of the effects of different types of occlusion on recognition performance. Previous FER studies in this area have been limited: they have focused on recovery strategies for lost local texture information, with testing restricted to only a few types of occlusion and predominantly a matched train-test strategy. This paper proposes a robust approach that employs a Monte Carlo algorithm to extract a set of Gabor-based part-face templates from gallery images and converts these templates into template match distance features. The resulting feature vectors are robust to occlusion because occluded parts are covered by some, but not all, of the random templates. The method is evaluated using facial images with occluded regions around the eyes and the mouth, randomly placed occlusion patches of different sizes, and near-realistic occlusion of the eyes with clear and solid glasses. Both matched and mismatched train-test strategies are adopted to analyze the effects of such occlusion. Overall recognition performance and the performance for each facial expression are investigated. Experimental results on the Cohn-Kanade and JAFFE databases demonstrate the high robustness and fast processing speed of our approach, and provide useful insight into the effects of occlusion on FER. Parameter sensitivity results demonstrate a degree of robustness to changes in the orientation and scale of the Gabor filters, the size of the templates, and the occlusion ratio. Performance comparisons with previous approaches show that the proposed method is more robust to occlusion, with lower reductions in accuracy from occlusion of the eyes or mouth.
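The random part-face-template idea can be shown with a toy sketch: sample random patches from a gallery image and use each template's match distance on a probe image as one feature. An occlusion corrupts only the templates that overlap it, so most features survive. Raw pixel patches and a fixed-location matcher stand in here for the Gabor representation and full template matching of the paper.

```python
import numpy as np

def random_templates(gallery, n, size, rng):
    """Sample n random part-face patches (with their positions) from a gallery image."""
    h, w = gallery.shape
    out = []
    for _ in range(n):
        y, x = rng.integers(0, h - size), rng.integers(0, w - size)
        out.append(((y, x), gallery[y:y + size, x:x + size]))
    return out

def match_distance_features(probe, templates):
    """One feature per template: L2 distance at the template's own location.
    (A real matcher would search a neighbourhood; this keeps the sketch short.)"""
    feats = []
    for (y, x), t in templates:
        s = t.shape[0]
        feats.append(np.linalg.norm(probe[y:y + s, x:x + s] - t))
    return np.array(feats)

rng = np.random.default_rng(7)
gallery = rng.random((64, 64))          # stand-in for a gallery face image
occluded = gallery.copy()
occluded[20:40, 20:40] = 0.0            # occlusion patch over part of the face

templates = random_templates(gallery, 20, 8, rng)
f_clean = match_distance_features(gallery, templates)
f_occ = match_distance_features(occluded, templates)
print(f_clean.max() == 0.0)             # True: probe equals gallery everywhere
print((f_occ == 0.0).sum() > 0)         # True: many templates miss the occlusion
```

Because only the templates intersecting the occluded block change, the feature vector degrades gracefully, which is the robustness mechanism the abstract describes.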