120 results for Automatic term extraction


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a GPU implementation of normalized cuts for the road extraction problem using panchromatic satellite imagery. The roads are extracted in three stages, namely pre-processing, image segmentation and post-processing. Initially, the image is pre-processed to improve tolerance by reducing clutter (mostly buildings, vegetation and fallow regions). The road regions are then extracted using the normalized cuts algorithm. The normalized cuts algorithm is a graph-based partitioning approach whose focus lies in extracting the global impression (perceptual grouping) of an image rather than local features. For the segmented image, post-processing is carried out using morphological operations - erosion and dilation. Finally, the road-extracted image is overlaid on the original image. Here, a GPGPU (general-purpose computing on graphics processing units) approach has been adopted to implement the same algorithm on the GPU for fast processing. A performance comparison of the proposed GPU implementation of the normalized cuts algorithm with the earlier CPU implementation is presented. The results show that the computational advantage of the proposed GPU implementation grows with image size. A qualitative and quantitative assessment of the segmentation results is also presented.
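As a rough illustration of the segmentation and post-processing stages described above, the sketch below bipartitions a small grayscale image with a single normalized cut and then applies the erosion-dilation clean-up. It is a minimal CPU sketch assuming NumPy/SciPy; the affinity parameters, neighbourhood radius and median thresholding are illustrative choices, not the authors' GPU implementation.

```python
# Minimal normalized-cuts bipartition of a small grayscale image, followed by
# morphological opening (erosion then dilation). Illustrative sketch only.
import numpy as np
from scipy import ndimage, sparse
from scipy.sparse.linalg import eigsh

def normalized_cut_mask(img, sigma_i=0.1, sigma_x=2.0, radius=2):
    h, w = img.shape
    n = h * w
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.column_stack([ys.ravel(), xs.ravel()])
    inten = img.ravel().astype(float)
    rows, cols, vals = [], [], []
    for i in range(n):
        # connect each pixel to neighbours within `radius` (Chebyshev distance)
        d = np.abs(coords - coords[i]).max(axis=1)
        nbrs = np.where((d <= radius) & (np.arange(n) != i))[0]
        w_ij = np.exp(-((inten[nbrs] - inten[i]) ** 2) / sigma_i ** 2) \
             * np.exp(-np.sum((coords[nbrs] - coords[i]) ** 2, axis=1) / sigma_x ** 2)
        rows.extend([i] * len(nbrs)); cols.extend(nbrs); vals.extend(w_ij)
    W = sparse.csr_matrix((vals, (rows, cols)), shape=(n, n))
    d = np.asarray(W.sum(axis=1)).ravel()
    D_inv_sqrt = sparse.diags(1.0 / np.sqrt(d + 1e-12))
    A = D_inv_sqrt @ W @ D_inv_sqrt                 # normalized affinity matrix
    vals_e, vecs = eigsh(A, k=2, which='LA')        # two largest eigenpairs
    fiedler = vecs[:, np.argsort(vals_e)[-2]]       # second-largest gives the cut
    mask = (fiedler > np.median(fiedler)).reshape(h, w)
    # post-processing: erosion followed by dilation (morphological opening)
    return ndimage.binary_opening(mask, structure=np.ones((3, 3)))

img = np.indices((24, 24)).sum(axis=0) / 46.0       # toy gradient "image"
print(normalized_cut_mask(img).sum(), "pixels in one partition")
```

The affinity-matrix assembly and the eigen-solve dominate the cost as the image grows, which is the kind of work a GPU port would target.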

Relevance:

20.00%

Publisher:

Abstract:

There has been a continuous surge toward developing new biopolymers that exhibit better in vivo biocompatibility in terms of demonstrating a reduced foreign body response (FBR). One approach to mitigate the undesired FBR is to develop an implant capable of releasing anti-inflammatory molecules in a sustained manner over a long time period. Implants causing inflammation are also more susceptible to infection. In this article, the in vivo biocompatibility of a novel, biodegradable salicylic acid-releasing polyester (SAP) has been investigated by subcutaneous implantation in a mouse model. The tissue response to SAP was compared with that of a widely used biodegradable polymer, poly(lactic acid-co-glycolic acid) (PLGA), as a control over three time points: 2, 4, and 16 weeks post-implantation. A long-term in vitro study illustrates a continuous, linear (zero-order) release of salicylic acid with a cumulative mass percent release rate of 7.34 × 10⁻⁴ h⁻¹ over ~1.5-17 months. On the basis of physicochemical analysis, surface erosion for SAP and bulk erosion for PLGA have been confirmed as their dominant degradation modes in vivo. On the basis of histomorphometric analysis of inflammatory cell densities and collagen distribution, as well as quantification of proinflammatory cytokine levels (TNF-α and IL-1β), a reduced foreign body response toward SAP with respect to that generated by PLGA has been unambiguously established. The favorable in vivo tissue response to SAP, as manifested by the uniform and well-vascularized encapsulation around the implant, is consistent with the decrease in inflammatory cell density and increase in angiogenesis with time. These observations, together with the demonstration of long-term and sustained release of salicylic acid, establish the potential use of SAP in improved matrices for tissue engineering and chronic wound healing.

Relevance:

20.00%

Publisher:

Abstract:

3-D full-wave method of moments (MoM)-based electromagnetic analysis is a popular means of accurately solving Maxwell's equations. The time and memory bottlenecks associated with such a solution have been addressed over the last two decades by linear-complexity fast solver algorithms. However, solving the 3-D full-wave MoM system on an arbitrary mesh of a package-board structure does not guarantee accuracy, since the discretization may not be fine enough to capture spatial changes in the solution variable. At the same time, uniform over-meshing of the entire structure generates a large number of solution variables and therefore requires an unnecessarily large matrix solution. In this paper, different refinement criteria are studied within an adaptive mesh refinement platform. Consequently, the most suitable conductor mesh refinement criterion for MoM-based electromagnetic package-board extraction is identified, and the advantages of this adaptive strategy are demonstrated from both accuracy and speed perspectives. The results are also compared with those of the recently reported integral equation-based h-refinement strategy. Finally, a new methodology to expedite each adaptive refinement pass is proposed.
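For readers less familiar with adaptive refinement, the sketch below shows the generic refine-estimate-repeat loop into which such criteria plug, on a 1-D stand-in problem; the placeholder solution, the jump-based indicator and the tolerance are hypothetical illustrations, not the conductor-mesh criteria studied in the paper.

```python
# Generic adaptive h-refinement loop on a 1-D mesh: estimate a per-element
# error indicator, split the worst elements, repeat. Illustrative stand-in only.
import numpy as np

def solution(x):
    # placeholder for a field with sharp spatial variation
    return np.tanh(50.0 * (x - 0.5))

def adaptive_refine(n0=8, tol=0.05, max_passes=10):
    nodes = np.linspace(0.0, 1.0, n0 + 1)
    for _ in range(max_passes):
        u = solution(nodes)
        indicator = np.abs(np.diff(u))        # refinement criterion: solution jump
        to_split = indicator > tol
        if not to_split.any():
            break                             # converged: no element exceeds tol
        midpoints = 0.5 * (nodes[:-1] + nodes[1:])[to_split]
        nodes = np.sort(np.concatenate([nodes, midpoints]))
    return nodes

mesh = adaptive_refine()
print(len(mesh) - 1, "elements after refinement")
```

The choice of indicator is exactly what the paper compares for MoM package-board extraction; the loop structure itself stays the same.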

Relevance:

20.00%

Publisher:

Abstract:

Termites, herbivores and fire are recognized as major guilds that structure woody plant communities in African savanna and woodland ecosystems. An understanding of their interaction is crucial to designing appropriate management regimes. The aim of this study was to evaluate the long-term impacts of herbivore, fire and termite activities on the regeneration of trees. Permanent experimental quadrats were established in 1992 in the Sudanian woodland of Burkina Faso under livestock grazing, annual early fire and control treatments. Within the treatment quadrats, an inventory of the woody undergrowth community was conducted on termitaria occupied by Macrotermes subhyalinus, in the extended termitosphere (within a 5 m radius of the mound base) and in the adjacent area (beyond 5 m from the mound base). Hierarchical analysis was performed to determine significant differences in species richness, abundance and diversity indices among vegetation patches within fire and herbivory treatments. Grazed quadrats had significantly (P < 0.001) more species and higher stem density of woody undergrowth than non-grazed quadrats, but maintained a similar level of species richness and stem density of woody undergrowth on termitaria. There were no significant differences (P > 0.05) in species richness and stem density between burnt and unburnt quadrats. Termitaria supported a highly diverse woody undergrowth with higher stem density than either the extended termitosphere or the rest of the quadrats. The density of woody undergrowth was more strongly related to mature trees of selected species on termitaria (R² = 0.593; P < 0.001) than in the extended termitosphere (R² = 0.333; P < 0.001) or the adjacent area (R² = 0.197; P < 0.001). It can be concluded that termites facilitate the regeneration of woody species, while grazing and annual early fire play a minor role in the regeneration of woody species. The current policy that prohibits grazing should be revised to accommodate the interests of livestock herders. (C) 2014 Elsevier GmbH. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Background: Understanding channel structures that lead to active sites or traverse the molecule is important in the study of molecular functions such as ion, ligand, and small molecule transport. Efficient methods for extracting, storing, and analyzing protein channels are required to support such studies. Further, there is a need for an integrated framework that supports computation of the channels, interactive exploration of their structure, and detailed visual analysis of their properties. Results: We describe a method for molecular channel extraction based on the alpha complex representation. The method computes geometrically feasible channels, stores both the volume occupied by the channel and its centerline in a unified representation, and reports significant channels. The representation also supports efficient computation of channel profiles that help in understanding channel properties. We describe methods for effective visualization of the channels and their profiles. These methods and the visual analysis framework are implemented in a software tool, CHEXVIS. We apply the method to a number of known channel-containing proteins to extract pore features. Results from these experiments on several proteins show that the performance of CHEXVIS is comparable to, and in some cases better than, that of existing channel extraction techniques. Using several case studies, we demonstrate how CHEXVIS can be used to study channels, extract their properties and gain insights into molecular function. Conclusion: CHEXVIS supports the visual exploration of multiple channels together with their geometric and physico-chemical properties, thereby enabling an understanding of the basic biology of transport through protein channels. The CHEXVIS web-server is freely available at http://vgl.serc.iisc.ernet.in/chexvis/. The web-server is supported on all modern browsers with the latest Java plug-in.
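One of the per-channel quantities such frameworks profile is the clearance radius along the centerline. The short sketch below computes that profile for synthetic data; the random atoms, their radii, the straight centerline, and the nearest-atom-surface definition are assumptions for illustration, not CHEXVIS internals.

```python
# Radius profile along a channel centerline: for each centerline point, the
# distance to the nearest atom surface. Synthetic atoms and centerline only.
import numpy as np
from scipy.spatial import cKDTree

def channel_radius_profile(centerline, atom_xyz, atom_radii, k=8):
    tree = cKDTree(atom_xyz)
    # query several nearest centres, since the closest *surface* need not
    # belong to the closest centre when radii differ
    dists, idx = tree.query(centerline, k=k)
    return (dists - atom_radii[idx]).min(axis=1)

rng = np.random.default_rng(0)
atoms = rng.uniform(-10.0, 10.0, size=(800, 3))
atoms = atoms[np.hypot(atoms[:, 0], atoms[:, 1]) > 2.0]      # carve a crude channel along z
radii = rng.uniform(1.2, 1.9, size=len(atoms))               # rough van der Waals radii
centerline = np.column_stack([np.zeros(50), np.zeros(50), np.linspace(-8, 8, 50)])
profile = channel_radius_profile(centerline, atoms, radii)
print("bottleneck radius along the channel:", profile.min())
```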

Relevance:

20.00%

Publisher:

Abstract:

A new automatic algorithm for the assessment of mixed-mode crack growth rate characteristics is presented based on the concept of an equivalent crack. The residual ligament size approach is introduced to implement this algorithm for identifying the crack tip position on a curved path from the potential-drop signal. The automatic algorithm, accounting for the curvilinear crack trajectory and employing an electrical potential difference, was calibrated against optical measurements of the growing crack under cyclic mixed-mode loading conditions. The effectiveness of the proposed algorithm is confirmed by fatigue tests performed on ST3 steel compact tension-shear specimens over the full range of mode mixities, from pure mode I to pure mode II. (C) 2015 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The development of new implantable biomaterials requires bone-mimicking physical properties together with the desired biocompatibility. In continuation of our earlier published research establishing the composition-dependent multifunctional bone-like properties and cytocompatibility response of hydroxyapatite (HA)-BaTiO3 composites, a toxicological evaluation, both in vitro and in vivo, was conducted on HA-40 wt.% BaTiO3 and is reported in this work. In particular, this work reports the in vitro cytotoxicity toward mouse myoblast cells as well as the long-term in vivo tissue-nanoparticle interaction of intra-articularly injected HA-40 wt.% BaTiO3 and BaTiO3 up to a concentration of 25 mg/mL in physiological saline over 12 weeks in a mouse model. Careful analysis of the flow cytometry results did not reveal any statistically significant difference in terms of early/late apoptotic cells or necrotic cells over 8 days in culture. Extensive histological analysis did not record any signature of cellular-level toxicity or pronounced inflammatory response in vital organs or at the knee joints of Balb/c mice after 12 weeks. Taken together, this study establishes the nontoxic nature of HA-40 wt.% BaTiO3, which can therefore be used safely for various biomedical applications.

Relevance:

20.00%

Publisher:

Abstract:

In the present paper, we present the structure and composition of tropical evergreen and deciduous forests in the Western Ghats monitored under a long-term programme involving the Indian Institute of Science, Earthwatch and volunteer investigators from HSBC. Currently, there is limited evidence on the status and dynamics of tropical forests in the context of human disturbance and climate change. Observations made in this study show that the 'more disturbed' evergreen and one of the deciduous plots have low species diversity compared to the less-disturbed forests. There are also variations in the size class structure between the 'more disturbed' and 'less disturbed' forests at all locations. The variation is particularly noticeable in the 10-15 cm DBH size class. When biomass stock estimates are considered, there was no significant difference between evergreen and deciduous forests. The difference in biomass stocks between 'less disturbed' and 'more disturbed' forests within a forest type is also low. Thus, the biomass and carbon stock has not been impacted despite the dependence of communities on the forests. Periodic and long-term monitoring of the status and dynamics of the forests is necessary in the context of potentially increased human pressure and climate change. There is, therefore, a need to inform the communities of the impact of extraction and its effect on regeneration, so as to motivate them to adopt what may be termed 'adaptive resource management' and thereby sustain the flow of forest products.

Relevance:

20.00%

Publisher:

Abstract:

Speech polarity detection is a crucial first step in many speech processing techniques. In this paper, an algorithm is proposed that improves upon the existing technique based on the skewness of the voice source (VS) signal. Here, the integrated linear prediction residual (ILPR) is used as the VS estimate, which is obtained using linear prediction on long-term frames of the low-pass filtered speech signal. This excludes the unvoiced regions from analysis and also reduces the computation. Further, a modified skewness measure is proposed for the decision, which considers the magnitude of the skewness of the ILPR along with its sign. With the detection error rate (DER) as the performance metric, the algorithm is tested on 8 large databases and its performance (DER = 0.20%) is found to be comparable to that of the best technique (DER = 0.06%) on both clean and noisy speech. Further, the proposed method is found to be ten times faster than the best technique.
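A minimal sketch of the general recipe, assuming standard SciPy building blocks: low-pass filter the speech, estimate a linear-prediction residual over one long frame, integrate it, and read the polarity off the skewness. The cutoff, LP order, cumulative-sum integration and sign convention below are illustrative guesses, not the authors' exact settings.

```python
# Skewness-based polarity decision on an integrated LP residual. Illustrative only.
import numpy as np
from scipy.signal import butter, lfilter
from scipy.linalg import solve_toeplitz
from scipy.stats import skew

def lpc(x, order):
    # autocorrelation-method LPC; returns the prediction-error filter [1, -a1, ...]
    r = np.correlate(x, x, mode='full')[len(x) - 1:len(x) + order]
    a = solve_toeplitz((r[:-1], r[:-1]), r[1:])
    return np.concatenate([[1.0], -a])

def speech_polarity(speech, fs, order=12):
    b, a = butter(4, 1000.0 / (fs / 2.0), btype='low')   # low-pass around 1 kHz
    x = lfilter(b, a, speech)
    residual = lfilter(lpc(x, order), [1.0], x)          # LP residual (one long frame)
    ilpr = np.cumsum(residual)                           # crude "integrated" residual
    s = skew(ilpr - ilpr.mean())
    return +1 if s > 0 else -1                           # sign convention is a guess

fs = 8000
toy = np.tile(np.r_[1.0, np.zeros(79)], 100) + 0.01 * np.random.randn(8000)  # crude pulse train
print(speech_polarity(toy, fs))
```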

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the design and implementation of PolyMage, a domain-specific language and compiler for image processing pipelines. An image processing pipeline can be viewed as a graph of interconnected stages which process images successively. Each stage typically performs a point-wise, stencil, reduction or data-dependent operation on image pixels. Individual stages in a pipeline typically exhibit abundant data parallelism that can be exploited with relative ease. However, the stages also require high memory bandwidth, preventing effective utilization of the parallelism available on modern architectures. For applications that demand high performance, the traditional options are to use optimized libraries like OpenCV or to optimize manually. While using libraries precludes optimization across library routines, manual optimization accounting for both parallelism and locality is very tedious. The focus of our system, PolyMage, is on automatically generating high-performance implementations of image processing pipelines expressed in a high-level declarative language. Our optimization approach primarily relies on the transformation and code generation capabilities of the polyhedral compiler framework. To the best of our knowledge, this is the first model-driven compiler for image processing pipelines that performs complex fusion, tiling, and storage optimization automatically. Experimental results on a modern multicore system show that the performance achieved by our automatic approach is up to 1.81x better than that achieved through manual tuning in Halide, a state-of-the-art language and compiler for image processing pipelines. For a camera raw image processing pipeline, our performance is comparable to that of a hand-tuned implementation.
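To make the stage taxonomy concrete, here is a plain-NumPy toy pipeline with a point-wise stage, a stencil stage and a reduction. This is not PolyMage syntax: it is the kind of naive, stage-by-stage version that materializes every intermediate array, which is precisely the memory-bandwidth behaviour a fusing and tiling compiler such as PolyMage is designed to avoid.

```python
# A three-stage image pipeline written naively: each stage produces a full
# intermediate array before the next stage reads it. Illustrative only.
import numpy as np

def pipeline(img):
    # stage 1: point-wise gamma correction
    g = np.clip(img, 0.0, 1.0) ** 0.5
    # stage 2: 3x3 box-blur stencil (edges handled by wrap-around, for brevity)
    blurred = sum(np.roll(np.roll(g, dy, axis=0), dx, axis=1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    # stage 3: reduction to a global statistic
    return blurred[1:-1, 1:-1].mean()

print(pipeline(np.random.rand(64, 64)))
```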

Relevance:

20.00%

Publisher:

Abstract:

In this article, a Field Programmable Gate Array (FPGA)-based hardware accelerator for 3D electromagnetic extraction using the Method of Moments (MoM) is presented. As the number of nets or ports in a system increases, leading to a corresponding increase in the number of right-hand-side (RHS) vectors, the computational cost of multiple matrix-vector products presents a time bottleneck in a linear-complexity fast solver framework. In this work, an FPGA-based hardware implementation is proposed, built around a two-level parallelization scheme: (i) matrix-level parallelization for a single RHS and (ii) pipelining for multiple RHS vectors. The method is applied to accelerate electrostatic parasitic capacitance extraction of multiple nets in a Ball Grid Array (BGA) package. The acceleration is shown to be linearly scalable with FPGA resources, and speed-ups of over 10x against an equivalent software implementation on a 2.4 GHz Intel Core i5 processor are achieved using a Virtex-6 XC6VLX240T FPGA on Xilinx's ML605 board, with the implemented design operating at a 200 MHz clock frequency. (c) 2016 Wiley Periodicals, Inc. Microwave Opt Technol Lett 58:776-783, 2016
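The multiple-RHS bottleneck the accelerator targets is easy to see in a few lines of NumPy: the same system matrix is applied to one excitation vector per port, which batches naturally into a single matrix-matrix product. The dense random matrix and port count below are stand-ins, not the BGA test case or the solver's actual data layout.

```python
# Many right-hand sides against one system matrix: looped matvecs vs. one
# batched product. Stand-in data, illustrative only.
import numpy as np

n_unknowns, n_ports = 2000, 64
Z = np.random.rand(n_unknowns, n_unknowns)      # dense stand-in for the MoM matrix
rhs = np.random.rand(n_unknowns, n_ports)       # one column per net/port excitation

looped = np.column_stack([Z @ rhs[:, k] for k in range(n_ports)])   # one RHS at a time
batched = Z @ rhs                                                    # all RHS at once
assert np.allclose(looped, batched)
```

The FPGA design exploits the same structure in hardware: parallelism across the matrix for a single RHS, and pipelining across the RHS columns.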

Relevance:

20.00%

Publisher:

Abstract:

Imaging flow cytometry is an emerging technology that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy. It allows high-throughput imaging of cells with good spatial resolution while they are in flow. This paper proposes a general framework for the processing and classification of cells imaged using an imaging flow cytometer. Each cell is localized by finding an accurate cell contour. Then, features reflecting cell size, circularity and complexity are extracted for classification using an SVM. Unlike conventional iterative, semi-automatic segmentation algorithms such as active contours, we propose a non-iterative, fully automatic graph-based cell localization. In order to evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a custom-fabricated, cost-effective microfluidics-based imaging flow cytometer. The proposed system is a significant development toward building a cost-effective cell analysis platform that would facilitate affordable mass screening camps looking at cellular morphology for disease diagnosis.

Lay description: In this article, we propose a novel framework for processing the raw data generated by microfluidics-based imaging flow cytometers. Microfluidics microscopy, or microfluidics-based imaging flow cytometry (mIFC), is a recent microscopy paradigm that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy, allowing cells to be imaged while they are in flow. In comparison to conventional slide-based imaging systems, mIFC is a nascent technology enabling high-throughput imaging of cells and is yet to take the form of a clinical diagnostic tool. The proposed framework processes the raw data generated by mIFC systems. The framework incorporates several steps: pre-processing of the raw video frames to enhance the contents of the cell, localising the cell by a novel, fully automatic, non-iterative graph-based algorithm, extraction of different quantitative morphological parameters, and subsequent classification of cells. In order to evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a cost-effective microfluidics-based imaging flow cytometer. The cell lines HL60, K562 and MOLT were obtained from the ATCC (American Type Culture Collection) and were cultured separately in the lab; thus, each culture contains cells from its own category alone and thereby provides the ground truth. Each cell is localised by finding a closed cell contour, defined via a directed, weighted graph built from the Canny edge image of the cell, such that the closed contour lies along the shortest weighted path surrounding the centroid of the cell from a starting point on a good curve segment to an immediate endpoint. Once the cell is localised, morphological features reflecting the size, shape and complexity of the cells are extracted and used to develop a support vector machine-based classification system. We could classify the cell lines with good accuracy, and the results were quite consistent across different cross-validation experiments.
We hope that imaging flow cytometers equipped with the proposed framework for image processing would enable cost-effective, automated and reliable disease screening in overloaded facilities that cannot afford to hire skilled personnel in large numbers. Such platforms could potentially facilitate screening camps in low-income countries, thereby transforming current health care paradigms by enabling rapid, automated diagnosis of diseases like cancer.
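A minimal sketch of the feature-extraction and SVM stages on synthetic binary masks, assuming scikit-image and scikit-learn; the area/circularity/eccentricity feature set and the toy disc/ellipse data are illustrative, not the exact features or cytometer frames used in the paper.

```python
# Size/shape features from a binary cell mask, fed to an SVM. Synthetic masks only.
import numpy as np
from skimage.measure import label, regionprops
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def cell_features(mask):
    props = regionprops(label(mask.astype(int)))[0]
    area, perim = props.area, props.perimeter
    circularity = 4.0 * np.pi * area / (perim ** 2 + 1e-9)
    return [area, circularity, props.eccentricity]

def synthetic_mask(rx, ry, size=64):
    yy, xx = np.mgrid[0:size, 0:size] - size // 2
    return (xx / rx) ** 2 + (yy / ry) ** 2 <= 1.0

# two toy "cell lines": round vs. elongated
X = [cell_features(synthetic_mask(r, r)) for r in range(8, 20)] + \
    [cell_features(synthetic_mask(r, r // 2)) for r in range(8, 20)]
y = [0] * 12 + [1] * 12
clf = make_pipeline(StandardScaler(), SVC(kernel='rbf')).fit(X, y)
print(clf.predict([cell_features(synthetic_mask(15, 15))]))   # expected: class 0 (round)
```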

Relevance:

20.00%

Publisher:

Abstract:

Algorithms for extracting epochs or glottal closure instants (GCIs) from voiced speech typically fall into two categories: (i) those that operate on the linear prediction residual (LPR) and (ii) those that operate directly on the speech signal. While the former class of algorithms (such as YAGA and DPI) tends to be more accurate, the latter (such as ZFR and SEDREAMS) tends to be more noise-robust. In this letter, a temporal measure termed the cumulative impulse strength is proposed for locating the impulses in a quasi-periodic impulse sequence embedded in noise. Subsequently, it is applied to detecting the GCIs from the inverted integrated LPR using a recursive algorithm. Experiments on two large corpora of speech with simultaneous electroglottographic recordings demonstrate that the proposed method is more robust to additive noise than the state-of-the-art algorithms, despite operating on the LPR.
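Purely for intuition, one plausible reading of such a cumulative measure is a leaky accumulation of impulse evidence across successive pitch periods, so that quasi-periodic impulses stand out against noise; the recursion, period handling and parameters below are assumptions and may well differ from the paper's exact definition.

```python
# Accumulate impulse evidence at a fixed lag so periodic impulses reinforce
# while noise does not. Toy illustration of the idea, not the paper's measure.
import numpy as np

def cumulative_strength(x, period, alpha=0.8):
    c = np.zeros(len(x))
    for n in range(len(x)):
        past = c[n - period] if n >= period else 0.0
        c[n] = x[n] + alpha * past          # add a decayed copy from one period earlier
    return c

rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.5, 800)
x[40::80] += 1.0                            # quasi-periodic impulses, period ~80 samples
c = cumulative_strength(np.abs(x), period=80)
peaks = np.where(c > np.percentile(c, 95))[0]
print(peaks[:10])
```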

Relevance:

20.00%

Publisher:

Abstract:

Solar geoengineering has been proposed as a potential means to counteract anthropogenic climate change, yet it is unknown how such climate intervention might affect the Earth's climate on the millennial time scale. Here we use the HadCM3L model to conduct a 1000-year sunshade geoengineering simulation in which solar irradiance is uniformly reduced by 4% to approximately offset the global mean warming from an abrupt quadrupling of atmospheric CO2. During the 1000-year period, the modeled global climate of the high-CO2 simulation, including temperature, the hydrological cycle, and ocean circulation, departs substantially from that of the control preindustrial simulation, whereas the climate of the geoengineering simulation remains much closer to the preindustrial state with little drift. The results of our study do not support the hypothesis that nonlinearities in the climate system would cause substantial drift if solar geoengineering were to be deployed on the timescale of a millennium.
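A back-of-envelope check with generic textbook values (S0 ≈ 1361 W m⁻², planetary albedo α ≈ 0.3, roughly 3.7 W m⁻² of forcing per CO2 doubling; these are not the HadCM3L numbers) shows why a uniform solar reduction of a few percent is the right order of magnitude to balance an abrupt 4×CO2 forcing:

$$
\Delta F_{\mathrm{solar}} \approx 0.04 \times \frac{S_0}{4}\,(1-\alpha) \approx 0.04 \times 340 \times 0.7 \approx 9.5\ \mathrm{W\,m^{-2}},
\qquad
\Delta F_{4\times\mathrm{CO_2}} \approx 2 \times 3.7 \approx 7.4\ \mathrm{W\,m^{-2}}.
$$

The two are of comparable magnitude, which is why a reduction near 4% approximately cancels the global mean warming in such experiments.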

Relevance:

20.00%

Publisher:

Abstract:

We propose a Monte Carlo filter for recursive estimation of diffusive processes that modulate the instantaneous rates of Poisson measurements. A key aspect is the additive update, through a gain-like correction term, empirically approximated from the innovation integral in the time-discretized Kushner-Stratonovich equation. The additive filter-update scheme eliminates the problem of particle collapse encountered in many conventional particle filters. Through a few numerical demonstrations, the versatility of the proposed filter is brought forth.
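A minimal sketch of the flavour of filter described, assuming an Ornstein-Uhlenbeck hidden state, an exponential intensity and a simple ensemble-covariance gain; the paper's actual correction, derived from the innovation integral of the time-discretized Kushner-Stratonovich equation, differs from this toy construction.

```python
# Particles for a diffusive state that modulates a Poisson rate, corrected by
# an additive, gain-weighted innovation (no resampling). Toy sketch only.
import numpy as np

rng = np.random.default_rng(2)
dt, T, n_part = 0.01, 2000, 500
theta, sigma = 0.5, 0.3                        # OU drift and diffusion of the hidden state
lam = lambda x: 5.0 * np.exp(x)                # Poisson intensity modulated by the state

# simulate the truth and the Poisson counts
x_true, counts = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x_true[t] = x_true[t-1] - theta * x_true[t-1] * dt + sigma * np.sqrt(dt) * rng.normal()
    counts[t] = rng.poisson(lam(x_true[t]) * dt)

# filter: predict with the same SDE, then apply the additive gain correction
particles = rng.normal(0.0, 0.5, n_part)
est = np.zeros(T)
for t in range(1, T):
    particles += -theta * particles * dt + sigma * np.sqrt(dt) * rng.normal(size=n_part)
    rates = lam(particles)
    innovation = counts[t] - rates.mean() * dt                 # dN - E[lambda] dt
    gain = np.cov(particles, rates)[0, 1] / (rates.mean() + 1e-9)
    particles += gain * innovation                             # additive update
    est[t] = particles.mean()

print("RMSE of the state estimate:", np.sqrt(np.mean((est - x_true) ** 2)))
```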