937 results for rank-based procedure


Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref.
[37] is also of the MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a logarithmic law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of a lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices.
The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
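To make the projection-and-extreme step concrete, the following is a minimal NumPy sketch of a VCA-style pure-pixel extraction: at each iteration a random test direction is made orthogonal to the signatures already found, and the pixel at the extreme of the projection along that direction is kept as the next endmember. This is an illustrative simplification, not the authors' VCA implementation; the function name, the pseudoinverse projector, and the synthetic usage data are assumptions.

```python
import numpy as np

def vca_style_extraction(Y, p, seed=0):
    """Pick p candidate endmember pixels from Y (bands x pixels)."""
    rng = np.random.default_rng(seed)
    L, N = Y.shape
    E = np.zeros((L, p))      # signatures of the endmembers found so far
    picked = []
    for k in range(p):
        # Random test direction, made orthogonal to the subspace already
        # spanned by the extracted endmember signatures.
        w = rng.standard_normal(L)
        if k > 0:
            A = E[:, :k]
            w = w - A @ np.linalg.pinv(A) @ w
        f = w / np.linalg.norm(w)
        # The pixel at the extreme of the projection onto f is taken as
        # the next endmember (a vertex of the data simplex).
        idx = int(np.argmax(np.abs(f @ Y)))
        E[:, k] = Y[:, idx]
        picked.append(idx)
    return picked, E

# Toy usage: 4 random "endmember" spectra mixed with random abundances
# (note that truly pure pixels are not guaranteed in this synthetic mixture).
rng = np.random.default_rng(1)
M = rng.random((50, 4))                       # 50 bands, 4 endmembers
S = rng.dirichlet(np.ones(4), size=1000).T    # abundances sum to one
indices, _ = vca_style_extraction(M @ S, p=4)
print(indices)
```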

Relevance:

30.00%

Publisher:

Abstract:

The wide use of antibiotics in aquaculture has led to the emergence of resistant microbial species. This should be avoided or minimized by controlling the amount of drug employed in fish farming. For this purpose, the present work proposes test-strip papers aiming at the detection/semi-quantitative determination of organic drugs by visual comparison of color changes, in an analytical procedure similar to that of pH monitoring with universal pH paper. This is done by establishing suitable chemical changes upon the cellulose, giving the paper the ability to react with the organic drug and to produce a color change. Quantitative data are also enabled by taking a picture and applying a suitable mathematical treatment to the color coordinates given by the HSL system used by Windows. As a proof of concept, this approach was applied to oxytetracycline (OXY), one of the antibiotics frequently used in aquaculture. A bottom-up modification of the paper was established, starting with the reaction of the glucose moieties on the paper with 3-triethoxysilylpropylamine (APTES). The so-formed amine layer allowed binding of a metal ion by coordination chemistry, while the metal ion subsequently reacted with the drug to produce a colored compound. The most suitable metals to carry out this modification were selected by bulk studies, and the several stages of the paper modification were optimized to produce an intense color change against the concentration of the drug. The paper strips were applied to the analysis of spiked environmental water, allowing a quantitative determination for OXY concentrations as low as 30 ng/mL. In general, this work provided a simple method to screen and discriminate tetracycline drugs in aquaculture, being a promising tool for local, quick and cheap monitoring of drugs.
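As an illustration of the kind of color-coordinate treatment mentioned above, the short Python sketch below converts averaged RGB readings of test strips to HSL-type coordinates and fits the hue against the logarithm of the drug concentration. The RGB values, the choice of hue as the analytical signal, and the semi-logarithmic fit are assumptions made for the example; the abstract does not prescribe this exact treatment.

```python
import colorsys
import numpy as np

# Hypothetical calibration: average RGB of each test-strip photo and the
# spiked OXY concentration (ng/mL) it corresponds to (illustrative values).
rgb_means = [(200, 180, 60), (185, 160, 70), (160, 130, 85), (130, 100, 95)]
concentrations = [30, 100, 300, 1000]

# Convert each RGB reading to HSL-type coordinates (colorsys uses HLS order).
hues = []
for r, g, b in rgb_means:
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    hues.append(h)

# Fit hue against log10(concentration) as a simple semi-quantitative curve.
slope, intercept = np.polyfit(np.log10(concentrations), hues, 1)
print("slope, intercept:", slope, intercept)

# Estimate the concentration of an unknown strip from its hue.
h_unknown = 0.12
log_c = (h_unknown - intercept) / slope
print("estimated OXY concentration (ng/mL):", 10 ** log_c)
```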

Relevance:

30.00%

Publisher:

Abstract:

This work presents a novel surface Smart Polymer Antibody Material (SPAM) for carnitine (CRT, a potential biomarker of ovarian cancer), tested for the first time as an ionophore in potentiometric electrodes of unconventional configuration. The SPAM material consisted of a 3D polymeric network created by surface imprinting on graphene layers. The polymer was obtained by radical polymerization of (vinylbenzyl)trimethylammonium chloride and 4-styrenesulfonic acid (signaling the binding sites), and vinyl pivalate and ethylene glycol dimethacrylate (surroundings). A non-imprinted material (NIM) was prepared as a control by excluding the template from the procedure. These materials were then used to produce several plasticized PVC membranes, testing the relevance of including the SPAM as ionophore and the need for a charged lipophilic additive. The membranes were cast over solid conductive supports of graphite or ITO/FTO. The effect of pH upon the potentiometric response was evaluated over the pH range 2-9 with different buffer compositions. Overall, the best performance was achieved for membranes with the SPAM ionophore, having a cationic lipophilic additive and tested in HEPES (4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid) buffer, pH 5.1. Better slopes were achieved when the membrane was cast on conductive glass (-57.4 mV/decade), while the best detection limits were obtained for graphite-based conductive supports (3.6 × 10⁻⁵ mol/L). Good selectivity was observed against BSA, ascorbic acid, glucose, creatinine and urea, tested at concentrations up to their normal physiological levels in urine. The application of the devices to the analysis of spiked samples showed recoveries ranging from 91% (± 6.8%) to 118% (± 11.2%). Overall, the combination of the SPAM sensory material with a suitable selective membrane composition and electrode design has led to a promising tool for point-of-care applications.
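For reference, a slope reported in mV/decade is obtained by regressing the measured electrode potential against the logarithm of CRT concentration. Below is a minimal sketch of that calculation; the calibration readings are invented and chosen only to reproduce a slope of about -57 mV/decade, matching the sign convention reported above.

```python
import numpy as np

# Hypothetical calibration readings: CRT concentration (mol/L) and the
# potential (mV) measured with the SPAM-based membrane electrode.
conc = np.array([1e-5, 1e-4, 1e-3, 1e-2])
emf = np.array([152.0, 95.0, 38.0, -19.0])   # illustrative values only

# Slope of the electrode response in mV per decade of concentration,
# from a linear regression of potential against log10(concentration).
slope, intercept = np.polyfit(np.log10(conc), emf, 1)
print(f"slope: {slope:.1f} mV/decade")
```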

Relevance:

30.00%

Publisher:

Abstract:

III Jornadas de Electroquímica e Inovação (Electrochemistry and Nanomaterials), at the Universidade de Trás-os-Montes e Alto Douro, Vila Real, 16-17 September 2013

Relevance:

30.00%

Publisher:

Abstract:

Graduate Student Symposium on Molecular Imprinting 2013, at Queen's University, Belfast, United Kingdom, 15-17 August 2013

Relevance:

30.00%

Publisher:

Abstract:

Sectorization means dividing a set of basic units into sectors or parts, a procedure that occurs in several contexts, such as political, health and school districting, social networks, and sales territory or airspace assignment, to achieve some goal or to facilitate an activity. This presentation will focus on three main issues: measures, a new approach to sectorization problems, and an application in waste collection. When designing or comparing sectors, different characteristics are usually taken into account. Some are commonly used, and they are related to the concepts of contiguity, equilibrium and compactness. These fundamental characteristics will be addressed by defining new generic measures and by proposing a new measure, desirability, connected with the idea of preference. A new approach to sectorization inspired by Coulomb's Law, which establishes a relation of force between electrically charged points, will be proposed. A charged point represents a small region with specific characteristics/values, creating relations of attraction/repulsion with the others (two by two), proportional to the charges and inversely proportional to their distance. Finally, a real case about sectorization and vehicle routing in solid waste collection will be mentioned.
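A minimal sketch of this Coulomb-inspired assignment, under the rule stated above (attraction proportional to the product of the charges and inversely proportional to the distance), is given below. The coordinates, charges, and choice of sector seeds are illustrative assumptions, not data or algorithms from the presentation.

```python
import numpy as np

# Each basic unit is a charged point; attraction between two points is
# q_i * q_j / distance(i, j). Units are assigned to the seed that attracts
# them most, forming the sectors.
coords = np.array([[0.0, 0.0], [1.0, 0.2], [0.3, 2.1],
                   [2.5, 2.4], [2.8, 0.1], [1.4, 1.3]])
charges = np.array([3.0, 1.0, 2.0, 4.0, 1.5, 1.0])   # e.g. demand per unit
seed_idx = [0, 3]                                     # one seed per sector

def attraction(i, j):
    """Force between units i and j: q_i * q_j / distance(i, j)."""
    d = np.linalg.norm(coords[i] - coords[j])
    return charges[i] * charges[j] / max(d, 1e-9)

sectors = {s: [s] for s in seed_idx}
for u in range(len(coords)):
    if u in seed_idx:
        continue
    best = max(seed_idx, key=lambda s: attraction(u, s))
    sectors[best].append(u)

print(sectors)   # mapping: seed -> list of basic units in that sector
```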

Relevance:

30.00%

Publisher:

Abstract:

Background: Musicians are frequently affected by playing-related musculoskeletal disorders (PRMD). Common solutions used by Western medicine to treat musculoskeletal pain include rehabilitation programs and drugs, but their results are sometimes disappointing. Objective: To study the effects of self-administered exercises based on Tuina techniques on the pain intensity caused by PRMD in professional orchestra musicians, using a numeric visual scale (NVS). Design, setting, participants and interventions: We performed a prospective, controlled, single-blinded, randomized study with musicians suffering from PRMD. Participating musicians were randomly distributed into the experimental (n = 39) and the control (n = 30) groups. After an individual diagnostic assessment, specific Tuina self-administered exercises were developed and taught to the participants. Musicians were instructed to repeat the exercises every day for 3 weeks. Main outcome measures: Pain intensity was measured by NVS before the intervention and after 1, 3, 5, 10, 15 and 20 days of treatment. The procedure was the same for the control group; however, the Tuina exercises were executed at points away from the commonly used acupuncture points. Results: In the treatment group, but not the control group, pain intensity was significantly reduced on days 1, 3, 5, 10, 15 and 20. Conclusion: The results obtained are consistent with the hypothesis that self-administered exercises based on Tuina techniques could help professional musicians control the pain caused by PRMD. Although our results are very promising, further studies are needed, employing a larger sample size and double-blind designs.

Relevance:

30.00%

Publisher:

Abstract:

The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as the alternative to cope with this growing complexity. It is a holistic approach, in which systems are able to configure, heal, optimize, and protect themselves. Web-based applications are an example of systems where the complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. The anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and analyzes whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly. Upon the pinpointing of anomalies, SHõWA executes a recovery procedure. We also present a study about the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study. The results reveal that (1) SHõWA is able to detect and pinpoint anomalies while the number of end users affected is still low; (2) SHõWA was able to detect anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
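The workload-versus-anomaly decision described above can be sketched as a plain correlation check over a monitoring window: if response time rises but stays correlated with the workload, the change is attributed to load; if the correlation drops, it is flagged as a possible performance anomaly. The use of Pearson correlation, the thresholds, and the synthetic window data are assumptions for illustration and are not SHõWA's actual algorithms.

```python
import numpy as np

def analyze_window(requests_per_sec, response_time_ms,
                   corr_threshold=0.7, rt_increase_threshold=1.5):
    """Classify a monitoring window as normal, workload change, or anomaly."""
    half = len(response_time_ms) // 2
    baseline = np.median(response_time_ms[:half])
    current = np.median(response_time_ms[half:])
    if current < rt_increase_threshold * baseline:
        return "normal"                      # no relevant change detected
    # Correlation between workload and response time over the window.
    rho = np.corrcoef(requests_per_sec, response_time_ms)[0, 1]
    return "workload change" if rho >= corr_threshold else "performance anomaly"

# Synthetic example: response time grows while the workload stays flat,
# so the change should be flagged as a performance anomaly.
workload = np.full(60, 100.0) + np.random.default_rng(1).normal(0, 2, 60)
rt = np.concatenate([np.full(30, 40.0), np.linspace(40, 150, 30)])
print(analyze_window(workload, rt))
```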

Relevance:

30.00%

Publisher:

Abstract:

The diagnosis of historic masonry walls is an intricate and complex field and has been an object of research for many years. This paper aims to propose practical methodologies for the diagnosis of historic masonry walls, specifically based on their typological characteristics. In order to develop such procedures, information relating to historic masonry typologies in Portugal, classified as rural, urban and military, was gathered, and techniques for the assessment of historic masonry were studied. All information was integrated to develop a typology-oriented methodology. The developed methodology was tested and validated in a small diagnosis campaign carried out in the Guimarães Castle. The methodology proved to be advantageous and, although the study is limited to and focused on Portuguese architectural specificities, its classifications remain globally applicable, and it can therefore be useful for any diagnosis procedure of a historic masonry wall.

Relevance:

30.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: Primary: 37C85, 37A17, 37A45; Secondary: 11K36, 11L07.

Relevance:

30.00%

Publisher:

Abstract:

A novel framework for probabilistic-based structural assessment of existing structures, which combines model identification and reliability assessment procedures, considering different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications provided in the literature is initially given. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, to be considered in the convergence criterion of this algorithm. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model for the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm, explicitly addressing statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams, which were loaded up to failure in the laboratory.
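As a minimal illustration of the Bayesian updating step mentioned above, the sketch below applies a conjugate normal-normal update to a single resistance parameter as new measurements arrive. The prior, the measurement noise, and the data values are invented for the example and do not correspond to the paper's model.

```python
import numpy as np

def update_normal(prior_mean, prior_var, obs, obs_var):
    """Posterior of a normal-mean parameter after one normal observation."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Hypothetical prior: beam resistance ~ N(250 kN, 30^2), updated as new
# test data are acquired (measurement std assumed to be 15 kN).
mean, var = 250.0, 30.0 ** 2
for measurement in [230.0, 245.0, 238.0]:
    mean, var = update_normal(mean, var, measurement, obs_var=15.0 ** 2)
    print(f"posterior mean {mean:.1f} kN, std {np.sqrt(var):.1f} kN")
```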

Relevance:

30.00%

Publisher:

Abstract:

Integrated master's dissertation in Mechanical Engineering

Relevance:

30.00%

Publisher:

Abstract:

In this work, we present a 3D web-based interactive tool, built on a numerical modeling and simulation approach, for breast reduction surgery planning. It assists surgeons in planning all aspects related to breast reduction surgery before the actual procedure takes place, thereby avoiding unnecessary risks. In particular, it allows the modeling of the initial breast geometry, the definition of all aspects related to the surgery, and the visualization of the post-surgery breast shape in a realistic environment.

Relevance:

30.00%

Publisher:

Abstract:

This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has the properties of known asymptotic size and consistency. The test is implemented using the bootstrap, with the distribution under the null hypothesis being approximated using a dependent-sample bootstrap technique to approximate short-run dependence following fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments. The procedure is illustrated by applications to exchange rate volatility and dividend growth series.
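A minimal sketch of the log-periodogram (GPH-type) estimation underlying the test is given below: the log-periodogram at the first m Fourier frequencies is regressed on log(4 sin²(λ/2)), and the long memory parameter d is minus the slope. Skip-sampling simply retains every k-th observation before estimation; the aliasing correction and the bootstrap test itself are not reproduced here, and the bandwidth choice is an assumption.

```python
import numpy as np

def gph_estimate(y, m=None):
    """Log-periodogram (GPH-type) estimate of the long memory parameter d."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    if m is None:
        m = int(np.sqrt(n))              # common, but assumed, bandwidth choice
    # Periodogram at Fourier frequencies lambda_j = 2*pi*j/n, j = 1..m.
    fft = np.fft.fft(y - y.mean())
    periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    x = np.log(4 * np.sin(lam / 2) ** 2)
    slope, _ = np.polyfit(x, np.log(periodogram), 1)
    return -slope                        # d is minus the regression slope

# Skip-sampling keeps every k-th observation, e.g. gph_estimate(y[::3]).
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(1024)))   # white noise: d close to 0
```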

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND STUDY AIMS: Various screening methods for colorectal cancer (CRC) are promoted by professional societies; however, few data are available about the factors that determine patient participation in screening, which is crucial to the success of population-based programs. This study aimed (i) to identify factors that determine acceptance of screening and preference of screening method, and (ii) to evaluate procedure success, detection of colorectal neoplasia, and patient satisfaction with screening colonoscopy. PATIENTS AND METHODS: Following a public awareness campaign, the population aged 50-80 years was offered CRC screening in the form of annual fecal occult blood tests, flexible sigmoidoscopy, a combination of both, or colonoscopy. RESULTS: 2731 asymptomatic persons (12.0% of the target population) registered for and were eligible to take part in the screening program. Access to information and a positive attitude to screening were major determinants of participation. Colonoscopy was the method preferred by 74.8% of participants. Advanced colorectal neoplasia was present in 8.5%; its prevalence was higher in males and increased with age. Significant complications occurred in 0.5% of those undergoing colonoscopy and were associated with polypectomy or sedation. Most patients were satisfied with colonoscopy and over 90% would choose it again for CRC screening. CONCLUSIONS: In this population-based study, only a small proportion of the target population underwent CRC screening despite an extensive information campaign. Colonoscopy was the preferred method and was safe. The determinants of participation in screening and preference of screening method, together with the distribution of colorectal neoplasia in different demographic categories, provide a rationale for improving screening procedures.