967 results for Data matrix


Relevance:

30.00%

Publisher:

Abstract:

* Work supported by the Lithuanian State Science and Studies Foundation.

Relevance:

30.00%

Publisher:

Abstract:

* Work is partially supported by the Lithuanian State Science and Studies Foundation.

Relevance:

30.00%

Publisher:

Abstract:

As the volume of image data and the need to use it in various applications grow significantly, retrieval efficiency and effectiveness become essential. Unfortunately, existing indexing methods are not applicable to a wide range of problem-oriented fields because of their running-time limitations and their strong dependence on the traditional descriptors extracted from the image. To meet these higher requirements, a novel distance-based indexing method for region-based image retrieval has been proposed and investigated. The method makes it possible to consider embedded partitions of an image and thus to carry out the search at different levels of refinement or coarsening, so as to target the image's meaningful content.

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62J12, 62K15, 91B42, 62H99.

Relevance:

30.00%

Publisher:

Abstract:

As massive data sets become increasingly available, people face the problem of how to process and understand these data effectively. Traditional sequential computing models are giving way to parallel and distributed computing models such as MapReduce, both because of the large size of the data sets and because of their high dimensionality. This dissertation, in line with other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for handling data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
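The MapReduce-style matrix multiplication mentioned above can be sketched in plain Python (an illustrative single-process simulation of the map/shuffle/reduce pattern, not the dissertation's implementation; all names are hypothetical):

```python
from collections import defaultdict

def mapreduce_matmul(a_triples, b_triples):
    """Compute C = A @ B from sparse (row, col, value) triples, mimicking
    the classic MapReduce join: shuffle groups entries by the shared
    inner index k, then reducers emit and sum partial products."""
    # Map + shuffle: group A entries and B entries by their shared index k.
    by_k = defaultdict(lambda: ([], []))
    for i, k, v in a_triples:
        by_k[k][0].append((i, v))
    for k, j, v in b_triples:
        by_k[k][1].append((j, v))

    # Reduce: emit partial products keyed by output cell (i, j) and sum them.
    partials = defaultdict(float)
    for a_list, b_list in by_k.values():
        for i, av in a_list:
            for j, bv in b_list:
                partials[(i, j)] += av * bv
    return dict(partials)

# A = [[1, 2], [3, 4]] and B = [[5, 6], [7, 8]] as sparse triples.
A = [(0, 0, 1.0), (0, 1, 2.0), (1, 0, 3.0), (1, 1, 4.0)]
B = [(0, 0, 5.0), (0, 1, 6.0), (1, 0, 7.0), (1, 1, 8.0)]
C = mapreduce_matmul(A, B)   # C[(0, 0)] = 1*5 + 2*7 = 19.0
```

In a real MapReduce job each `by_k` group would go to a separate reducer, which is what makes the multiplication scale to matrices with millions of rows.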

Relevance:

30.00%

Publisher:

Abstract:

Flow cytometry analyzers have become trusted companions due to their ability to perform fast and accurate analyses of human blood. The aim of these analyses is to determine the possible existence of abnormalities in the blood that have been correlated with serious disease states, such as infectious mononucleosis, leukemia, and various cancers. Though these analyzers provide important feedback, it is always desirable to improve the accuracy of the results, as evidenced by the misclassifications reported by some users of these devices. It is advantageous to provide a pattern interpretation framework able to deliver better classification than is currently available. Toward this end, the purpose of this dissertation was to establish a feature extraction and pattern classification framework capable of providing improved accuracy for detecting specific hematological abnormalities in flow cytometric blood data. This involved extracting a unique and powerful set of shift-invariant statistical features from the multi-dimensional flow cytometry data and then using these features as inputs to a pattern classification engine composed of an artificial neural network (ANN). The contribution of this method consisted of developing a descriptor matrix that can be used to reliably assess whether a donor's blood pattern exhibits a clinically abnormal level of variant lymphocytes, which are blood cells potentially indicative of disorders such as leukemia and infectious mononucleosis. This study showed that the set of shift-and-rotation-invariant statistical features extracted from the eigensystem of the flow cytometric data pattern performs better than other commonly used features in this type of disease detection, exhibiting an accuracy of 80.7%, a sensitivity of 72.3%, and a specificity of 89.2%. This performance represents a major improvement for this type of hematological classifier, which has historically been plagued by poor performance, with accuracies as low as 60% in some cases. This research ultimately shows that an improved feature space was developed that can deliver improved performance for the detection of variant lymphocytes in human blood, thus providing significant utility in the realm of suspect-flagging algorithms for the detection of blood-related diseases.
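The dissertation's exact descriptor is not reproduced here, but one standard way to obtain shift-and-rotation-invariant statistical features from an eigensystem is to take the eigenvalues of the data's covariance matrix: translating the point cloud of cytometry events leaves the covariance unchanged, and rotating it changes only the eigenvectors. A minimal illustrative sketch on synthetic data:

```python
import numpy as np

def eigen_features(events):
    """events: (n_cells, n_channels) array of cytometry measurements;
    returns the sorted eigenvalues of the sample covariance matrix."""
    cov = np.cov(events, rowvar=False)
    return np.sort(np.linalg.eigvalsh(cov))

rng = np.random.default_rng(0)
cloud = rng.normal(size=(500, 3))          # synthetic 3-channel events

f0 = eigen_features(cloud)
f_shift = eigen_features(cloud + np.array([10.0, -4.0, 2.5]))  # translated cloud
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
f_rot = eigen_features(cloud @ rot.T)      # rotated cloud
# f0, f_shift and f_rot agree: the features are shift- and rotation-invariant.
```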

Relevance:

30.00%

Publisher:

Abstract:

The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated following DNA typing processes. Although the chemistries used during the procedures have been enhanced to mitigate the effects of these deleterious compounds, some challenges remain. Inhibitors can be components of the samples themselves, of the substrate where samples were deposited, or of the chemicals associated with the DNA purification step. Therefore, a thorough understanding of the extraction processes and their ability to handle the various types of inhibitory substances can help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input in the sample, and the overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be ineffective for DNA typing, due to large amounts of inhibitory substances and/or environmental degradation, was tested. This included generating data associated with microbial peak signatures to identify the locations of clandestine human graves. Results demonstrate that the current methods for assessing inhibition are not necessarily accurate, as samples that appear inhibited in the quantification process can yield full DNA profiles, while those that do not indicate inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested were able to remove >90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effect on downstream applications, analysts should exercise caution when selecting the best extraction method for particular samples, as casework DNA samples are often present in small quantities and can contain an overwhelming amount of inhibitory substances.

Relevance:

30.00%

Publisher:

Abstract:

An object-based image analysis (OBIA) approach was used to create a habitat map of the Lizard Reef. Briefly, georeferenced dive and snorkel photo-transect surveys were conducted at different locations surrounding Lizard Island, Australia. For the surveys, a snorkeler or diver swam over the bottom at a depth of 1-2 m in the lagoon, One Tree Beach and Research Station areas, and at 7 m depth in Watson's Bay, taking photos of the benthos at a set height with a standard digital camera while towing a surface-float GPS that logged its track every five seconds. The camera lens provided a 1.0 m x 1.0 m footprint at 0.5 m height above the benthos. Horizontal distance between photos was estimated by fin kicks and corresponded to a surface distance of approximately 2.0-4.0 m. The coordinates of each benthic photo were approximated from the photo timestamp and the GPS coordinate timestamps using GPS Photo Link software (www.geospatialexperts.com): the coordinates of each photo were interpolated from the GPS fixes logged a set time before and after the photo was captured. Dominant benthic or substrate cover type was assigned to each photo by placing 24 random points over each image using the Coral Point Count with Excel extensions program (CPCe; Kohler and Gill, 2006). Each point was then assigned a dominant cover type using a benthic cover type classification scheme containing nine first-level categories: seagrass high (>=70%), seagrass moderate (40-70%), seagrass low (<=30%), coral, reef matrix, algae, rubble, rock and sand. Benthic cover composition summaries of each photo were generated automatically in CPCe. The resulting benthic cover data for each photo were linked to the GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 56 South. The OBIA class assignment followed a hierarchical scheme based on membership rules, with levels for "reef", "geomorphic zone" and "benthic community" (above).
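The timestamp-based interpolation performed by GPS Photo Link can be illustrated with a short sketch (a hypothetical reimplementation of the idea, not the actual software; the track coordinates below are made up):

```python
def interpolate_position(track, photo_time):
    """track: list of (time_s, lat, lon) GPS fixes sorted by time;
    returns the photo position linearly interpolated between the two
    fixes that bracket photo_time."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= photo_time <= t1:
            w = (photo_time - t0) / (t1 - t0)
            return lat0 + w * (lat1 - lat0), lon0 + w * (lon1 - lon0)
    raise ValueError("photo timestamp outside GPS track")

# Hypothetical five-second track and a photo taken at t = 7.5 s:
track = [(0, -14.6870, 145.4470),
         (5, -14.6871, 145.4472),
         (10, -14.6873, 145.4475)]
lat, lon = interpolate_position(track, 7.5)   # halfway between the last two fixes
```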

Relevance:

30.00%

Publisher:

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69-104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of the RMESV-ALM model, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. Volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
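Why the matrix-exponential transformation guarantees positive definiteness can be seen in a few lines: the exponential of any symmetric matrix has strictly positive eigenvalues, so an unconstrained symmetric matrix always maps to a valid covariance matrix. A minimal sketch (illustrative numbers, not the RMESV-ALM estimation):

```python
import numpy as np

def expm_sym(a):
    """Matrix exponential of a symmetric matrix via its eigendecomposition:
    if A = Q diag(d) Q', then exp(A) = Q diag(exp(d)) Q'."""
    d, q = np.linalg.eigh(a)
    return q @ np.diag(np.exp(d)) @ q.T

# An unconstrained symmetric (indefinite) "log-covariance" matrix ...
log_cov = np.array([[0.2, -1.3],
                    [-1.3, -0.5]])
# ... maps to a symmetric positive definite covariance matrix:
cov = expm_sym(log_cov)
eigvals = np.linalg.eigvalsh(cov)   # all strictly positive
```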

Relevance:

30.00%

Publisher:

Abstract:

The role of the extracellular matrix (ECM) and mechanotransduction as an important signaling factor in the human uterus is just beginning to be appreciated. The ECM is not only the substance that surrounds cells: depending on the amount of collagen, cross-linking, and hydration, as well as other ECM components, ECM stiffness will either compress or stretch cells, producing signals that are converted into chemical changes within the cell. In this review we present evidence that the stiffness of fibroid tissue has a direct effect on the growth of the tumor through the induction of fibrosis. Fibrosis has two characteristics: (1) resistance to apoptosis, leading to the persistence of cells, and (2) secretion by those cells of collagen and other ECM components such as proteoglycans, leading to abundant deposition of highly cross-linked, disoriented, and often widely dispersed collagen fibrils. Fibrosis affects cell growth by mechanotransduction, the dynamic signaling system whereby mechanical forces initiate chemical signaling in cells. Data indicate that the structurally disordered and abnormally formed ECM of uterine fibroids contributes to fibroid formation and growth. An appreciation of the critical role of ECM stiffness in fibroid growth may lead to new strategies for treatment of this common disease.

Relevance:

30.00%

Publisher:

Abstract:

Site 1103 was one of a transect of three sites drilled across the Antarctic Peninsula continental shelf during Leg 178. The aim of drilling on the shelf was to determine the age of the sedimentary sequences and to ground-truth previous interpretations of the depositional environment (i.e., topsets and foresets) of progradational seismostratigraphic sequences S1, S2, S3, and S4. The ultimate objective was to obtain a better understanding of the history of glacial advances and retreats on this west Antarctic margin. Drilling the topsets of the progradational wedge (0-247 m below seafloor [mbsf]), which consist of unsorted and unconsolidated materials of seismic Unit S1, was very unfavorable, resulting in very low (2.3%) core recovery. Recovery improved (34%) below 247 mbsf, corresponding to sediments of seismic Unit S3, which have a consolidated matrix. Logs were only obtained from the interval between 75 and 244 mbsf, and inconsistencies in the automatic analog picking of the signals received from the sonic log at the array and at the two other receivers prevented accurate shipboard time-depth conversions. This, in turn, limited the capacity for making seismic stratigraphic interpretations at this site and regionally. This study attempts to compile all available data sources, perform quality checks, and introduce nonstandard processing techniques for the logging data in order to arrive at a reliable and continuous depth-versus-velocity profile. We defined 13 data categories using differential traveltime information. Polynomial exclusion techniques of various orders and low-pass filtering reduced the noise of the initial data pool and produced a definitive velocity-depth profile that is synchronous with the resistivity logging data. A comparison of the velocity profile with various other logs from Site 1103 further validates the presented data. All major logging units are expressed within the new velocity data. A depth-migrated section based on the new velocity data is presented together with the original time section and the initial depth estimates published in the Leg 178 Initial Reports volume. The presented data confirm the location of the shelf unconformity at 222 ms two-way traveltime (TWT), or 243 mbsf, and allow its seismic identification as a strong negative and subsequent positive reflection.
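The time-depth conversion that a continuous velocity profile enables can be sketched as follows (illustrative values, not Site 1103 data): with interval velocities over two-way-traveltime windows, depth is the cumulative sum of v·Δt/2, since each leg of the two-way path crosses the interval once.

```python
def twt_to_depth(intervals):
    """intervals: list of (interval_velocity_m_per_s, twt_window_s);
    returns depth in metres, converting two-way time with v * dt / 2."""
    return sum(v * dt / 2.0 for v, dt in intervals)

# Hypothetical profile: 150 ms TWT at 1800 m/s, then 72 ms TWT at 2000 m/s.
depth = twt_to_depth([(1800.0, 0.150), (2000.0, 0.072)])  # 135 m + 72 m = 207 m
```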

Relevance:

30.00%

Publisher:

Abstract:

Spectral unmixing (SU) is a technique for characterizing the mixed pixels of hyperspectral images measured by remote sensors. Most existing spectral unmixing algorithms are developed using linear mixing models. Since the number of endmembers/materials present in each mixed pixel is normally small compared with the total number of endmembers (the dimension of the spectral library), the problem is sparse. This thesis introduces sparse hyperspectral unmixing methods for the linear mixing model in two different scenarios. In the first scenario, the library of spectral signatures is assumed to be known, and the main problem is to find the minimum number of endmembers subject to a reasonably small approximation error. Mathematically, the corresponding problem is the $\ell_0$-norm problem, which is NP-hard. The main goal of the first part of the thesis is to find more accurate and reliable approximations of the $\ell_0$-norm term and to propose sparse unmixing methods via such approximations. The resulting methods are shown to yield considerable improvements in reconstructing the fractional abundances of endmembers in comparison with state-of-the-art methods, including lower reconstruction errors. In the second part of the thesis, the first scenario (i.e., the dictionary-aided semiblind unmixing scheme) is generalized to the blind unmixing scenario, in which the library of spectral signatures is also estimated. We apply the nonnegative matrix factorization (NMF) method to propose new unmixing methods, owing to its notable advantages, such as enforcing nonnegativity constraints on the two decomposed matrices. Furthermore, we introduce new cost functions based on statistical and physical features of the spectral signatures of materials (SSoM) and of hyperspectral pixels, such as the collaborative property of hyperspectral pixels and the mathematical representation of the concentrated energy of SSoM in the first few subbands. Finally, we introduce sparse unmixing methods for the blind scenario and evaluate the efficiency of the proposed methods via simulations over synthetic and real hyperspectral data sets. The results show considerable improvements in estimating the spectral library of materials and their fractional abundances, including smaller values of the spectral angle distance (SAD) and the abundance angle distance (AAD).
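The linear mixing model underlying both scenarios can be illustrated with a small sketch (a generic projected-gradient nonnegative least squares, not the thesis's proposed methods; all data are synthetic): a mixed pixel y is modelled as y = A x, where the columns of A are library signatures and the nonnegative abundance vector x is sparse.

```python
import numpy as np

def nnls_pg(a, y, steps=5000, lr=None):
    """Nonnegative least squares min ||a @ x - y||^2 s.t. x >= 0,
    solved by gradient steps followed by projection onto x >= 0."""
    lr = lr or 1.0 / np.linalg.norm(a.T @ a, 2)   # 1 / Lipschitz constant
    x = np.zeros(a.shape[1])
    for _ in range(steps):
        x = np.maximum(0.0, x - lr * (a.T @ (a @ x - y)))
    return x

rng = np.random.default_rng(1)
A = rng.random((50, 10))            # 50 spectral bands, 10 library signatures
x_true = np.zeros(10)
x_true[[2, 7]] = [0.6, 0.4]         # only two endmembers actually present
y = A @ x_true                      # noiseless mixed pixel
x_hat = nnls_pg(A, y)               # recovers the sparse abundances
```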

Relevance:

30.00%

Publisher:

Abstract:

In asymptomatic subjects, B-type natriuretic peptide (BNP) is associated with adverse cardiovascular outcomes even at levels well below contemporary thresholds used for the diagnosis of heart failure. The mechanisms behind these observations are unclear. We examined the hypothesis that, in an asymptomatic hypertensive population, BNP would be associated with sub-clinical evidence of cardiac remodeling, inflammation and extracellular matrix (ECM) alterations. We performed transthoracic echocardiography and sampled coronary sinus (CS) and peripheral serum from patients with low (n = 14) and high BNP (n = 27). Peripheral BNP was closely associated with CS levels (r = 0.92, p<0.001). CS BNP correlated significantly with CS levels of markers of collagen type I and III turnover, including PINP (r = 0.44, p = 0.008), CITP (r = 0.35, p = 0.03) and PIIINP (r = 0.35, p = 0.001), and with CS levels of inflammatory cytokines, including TNF-α (r = 0.49, p = 0.002), IL-6 (r = 0.35, p = 0.04), and IL-8 (r = 0.54, p<0.001). The high BNP group had greater CS expression of fibro-inflammatory biomarkers, including CITP (3.8±0.7 versus 5.1±1.9, p = 0.007), TNF-α (3.2±0.5 versus 3.7±1.1, p = 0.003), IL-6 (1.9±1.3 versus 3.4±2.7, p = 0.02) and hsCRP (1.2±1.1 versus 2.4±1.1, p = 0.04), as well as greater left ventricular mass index (97±20 versus 118±26 g/m(2), p = 0.03) and left atrial volume index (18±2 versus 21±4, p = 0.008). Our data provide insight into the mechanisms behind the observed negative prognostic impact of modest elevations in BNP and suggest that, in an asymptomatic hypertensive cohort, a peripheral BNP measurement may be a useful marker of an early, sub-clinical pathological process characterized by cardiac remodeling, inflammation and ECM alterations.

Relevance:

30.00%

Publisher:

Abstract:

This paper is part of a special issue of Applied Geochemistry focusing on reliable applications of compositional multivariate statistical methods. The study outlines the application of compositional data analysis (CoDa) to the calibration of geochemical data and to multivariate statistical modelling of geochemistry and grain-size data from a set of Holocene sedimentary cores from the Ganges-Brahmaputra (G-B) delta. Over the last two decades, understanding near-continuous records of sedimentary sequences has required the use of core-scanning X-ray fluorescence (XRF) spectrometry for both terrestrial and marine sedimentary sequences. Initial XRF data are generally unusable in 'raw' format, requiring data processing to remove instrument bias, as well as informed sequence interpretation. The applicability of conventional calibration equations to core-scanning XRF data is further limited by the constraints posed by unknown measurement geometry and specimen homogeneity, as well as by matrix effects. Log-ratio-based calibration schemes have been developed and applied to clastic sedimentary sequences, focusing mainly on energy-dispersive XRF (ED-XRF) core-scanning. This study applied high-resolution core-scanning XRF to Holocene sedimentary sequences from the tide-dominated Indian Sundarbans (Ganges-Brahmaputra delta plain). The Log-Ratio Calibration Equation (LRCE) was applied to a subset of core-scan and conventional ED-XRF data to quantify elemental composition, providing a robust calibration scheme using reduced major axis regression of log-ratio-transformed geochemical data. Through partial least squares (PLS) modelling of the geochemical and grain-size data, it is possible to derive robust proxy information for the Sundarbans depositional environment. The application of these techniques to Holocene sedimentary data offers an improved methodological framework for unravelling Holocene sedimentation patterns.
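The log-ratio idea behind such calibration schemes can be illustrated with the centred log-ratio (clr) transform (a generic CoDa operation, not the paper's LRCE fit): dividing each part by the composition's geometric mean before taking logs removes the constant-sum and overall-scaling constraints, after which standard regression applies.

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean of all parts; the result always sums to zero."""
    comp = np.asarray(composition, dtype=float)
    gmean = np.exp(np.mean(np.log(comp)))
    return np.log(comp / gmean)

counts = np.array([120.0, 30.0, 50.0])   # hypothetical raw XRF intensities
z = clr(counts)
# clr is invariant to overall scaling: clr(7 * counts) equals clr(counts),
# which is why instrument-level scaling drops out of log-ratio calibration.
```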

Relevance:

30.00%

Publisher:

Abstract:

A matrix-type silicone elastomer vaginal ring providing 28-day continuous release of dapivirine (DPV) - a lead candidate human immunodeficiency virus type 1 (HIV-1) microbicide compound - has recently demonstrated moderate levels of protection in two Phase III clinical studies. Here, next-generation matrix- and reservoir-type silicone elastomer vaginal rings are reported for the first time, offering simultaneous and continuous in vitro release of DPV and the contraceptive progestin levonorgestrel (LNG) over a period of between 60 and 180 days. For matrix-type vaginal rings comprising initial drug loadings of 100, 150 or 200 mg DPV and 0, 16 or 32 mg LNG, Day 1 daily DPV release values were between 4132 and 6113 μg, while Day 60 values ranged from 284 to 454 μg. Daily LNG release ranged from 129 to 684 μg on Day 1 and 2-91 μg on Day 60. Core-type rings comprising one or two drug-loaded cores provided an extended duration of in vitro release out to 180 days and maintained daily drug release rates within much narrower windows (either 75-131 μg/day or 37-66 μg/day for DPV, and either 96-150 μg/day or 37-57 μg/day for LNG, depending on core ring configuration and ignoring the initial lag-release effect for LNG) compared with matrix-type rings. The data support the continued development of these devices as multipurpose prevention technologies (MPTs) for HIV prevention and long-acting contraception.
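The contrast between the two ring types can be illustrated with a textbook Higuchi-type release model (an assumption for illustration only, not the paper's analysis; the constant k is hypothetical and chosen merely to echo the Day 1 magnitude reported above): matrix devices release cumulatively in proportion to √t, so the daily rate falls steeply, whereas membrane-controlled reservoir devices approach a near-constant rate.

```python
import math

def matrix_daily_release(day, k=4500.0):
    """Daily release (ug) on integer `day` under a Higuchi profile
    Q(t) = k * sqrt(t); k is a hypothetical illustrative constant."""
    return k * (math.sqrt(day) - math.sqrt(day - 1))

day1 = matrix_daily_release(1)    # initial burst of k micrograms
day60 = matrix_daily_release(60)  # over an order of magnitude lower
```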