914 results for Model transformation analysis


Relevance:

40.00%

Publisher:

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing are enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate.

Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is attained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33].

Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution.

Aiming at a lower computational complexity, some algorithms, such as the pixel purity index (PPI) [35] and N-FINDR [40], still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large set of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that, in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than the volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select the spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace, and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46].

In this chapter we develop a new algorithm, termed vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the purest pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49].
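For concreteness, here is a minimal numpy sketch of the linear mixing model underlying the discussion above; the dimensions, signatures, and noise level are illustrative assumptions, not values from the chapter. Note how the sum-to-one constraint on the abundance fractions introduces exactly the statistical dependence that limits ICA-based unmixing:

```python
import numpy as np

rng = np.random.default_rng(0)

L, p, N = 224, 3, 1000                 # bands, endmembers, pixels (assumed)
M = rng.uniform(0.0, 1.0, (L, p))      # endmember signatures (columns)

# Abundance fractions: nonnegative and summing to one per pixel --
# precisely the dependence that compromises ICA-based unmixing.
A = rng.dirichlet(np.ones(p), size=N).T     # shape (p, N)

noise = 0.01 * rng.standard_normal((L, N))
X = M @ A + noise                      # observed mixed pixels, one per column
```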
We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of this projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR, yet its computational complexity is between one and two orders of magnitude lower than that of N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
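As a rough illustration of the projection step just described, the following is a simplified sketch in the spirit of VCA, not the published implementation: initialization, signal-subspace projection, and noise handling are omitted or simplified, and it assumes at least one pure pixel per endmember:

```python
import numpy as np

def extract_endmembers(X, p, seed=0):
    """Iterative orthogonal-projection endmember extraction (simplified).

    X : (bands, pixels) data matrix, assumed already reduced to the
        p-dimensional signal subspace.
    p : number of endmembers to extract.
    """
    rng = np.random.default_rng(seed)
    bands, _ = X.shape
    E = np.zeros((bands, p))           # endmember signatures (columns)
    for i in range(p):
        # Projector onto the orthogonal complement of the endmembers
        # found so far (identity on the first iteration, since E = 0).
        P = np.eye(bands) - E @ np.linalg.pinv(E)
        f = P @ rng.standard_normal(bands)   # direction orthogonal to span(E)
        f /= np.linalg.norm(f)
        # The pixel with the extreme projection onto f becomes the new
        # endmember (pure-pixel assumption).
        idx = np.argmax(np.abs(f @ X))
        E[:, i] = X[:, idx]
    return E
```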

Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

40.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from NOVA – School of Business and Economics

Relevance:

40.00%

Publisher:

Abstract:

The benefits of long-term monitoring have drawn considerable attention in healthcare. Since the acquired data provide an important source of information to clinicians and researchers, long-term monitoring studies have become frequent. However, long-term monitoring can result in massive datasets, which makes the analysis of the acquired biosignals a challenge. In this setting, visualization, a key point in signal analysis, faces several limitations, and the handling of annotations, on which some machine learning algorithms depend, becomes a complex task. To overcome these problems, a novel web-based application for fast and user-friendly biosignal visualization and annotation was developed, made possible through the study and implementation of a visualization model. The main process of this model, the visualization process, comprised the definition of the domain problem, the abstraction design, the development of a multilevel visualization, and the selection of the visualization techniques that best communicate the information carried by the data (a sketch of the multilevel idea follows below). In a second process, the visual encoding variables were the study target. Finally, improved interaction and exploration techniques were implemented, among which annotation handling stands out. Three case studies are presented and discussed, and a usability study supports the reliability of the implemented work.
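One common building block of multilevel visualization for massive signals is min/max downsampling, which reduces a long recording to per-bin extremes so that peaks survive at screen resolution. The sketch below is illustrative and not taken from the thesis:

```python
import numpy as np

def minmax_downsample(signal, n_bins):
    """Reduce a long 1-D biosignal to (min, max) pairs per bin so a plot
    at screen resolution preserves peaks; coarser levels reuse the same
    idea with fewer bins."""
    bins = np.array_split(np.asarray(signal, dtype=float), n_bins)
    return np.array([(b.min(), b.max()) for b in bins])

# Example: one hour of a 1 kHz signal collapsed to 2000 drawable extremes.
x = np.sin(np.linspace(0, 200 * np.pi, 3_600_000))
envelope = minmax_downsample(x, 1000)   # shape (1000, 2)
```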

Relevance:

40.00%

Publisher:

Abstract:

Due to their toxicity, and especially their carcinogenic potential, polycyclic aromatic hydrocarbons (PAHs) have become priority pollutants in biomonitoring programmes and environmental policy, such as the European Water Framework Directive. The model substances tested in this study, namely benzo[b]fluoranthene (B[b]F), considered potentially carcinogenic to humans and a carcinogenic PAH effector to wildlife, and phenanthrene (Phe), deemed a non-carcinogenic PAH, are common PAHs in coastal waters, with distinct properties reflected in different, albeit overlapping, mechanisms of toxicity. Still, as for similar PAHs, their interaction effects remain largely unknown. In order to study the genotoxic effects caused by the interaction of carcinogenic and non-carcinogenic PAHs, and their relation to histopathological alterations, juvenile sea bass, Dicentrarchus labrax, a highly ecologically and economically relevant marine fish, were injected with different doses (5 and 10 μg.g-1 fish ww) of the two PAHs, isolated or in mixture, and incubated for 48 h. Individuals injected with B[b]F and the PAH mixture exhibited higher clastogenic/aneugenic effects and DNA strand breakage in blood cells, determined through the erythrocytic nuclear abnormalities (ENA) and Comet assays, respectively. Hepatic histopathological alterations were found in all animals, especially those injected with B[b]F and the PAH mixture, and were related mostly to inflammation. Still, Phe also exhibited genotoxic effects in sea bass, especially at higher doses, revealing a very significant acute effect that was consistent with the Microtox test performed in parallel. Overall, sea bass was more sensitive to B[b]F (a higher-molecular-weight PAH), likely due to efficient bioactivation of the pollutant (yielding genotoxic metabolites and reactive oxygen species), when compared to Phe, the latter revealing a more significant acute effect. The results indicate no significant additive effect between the substances under the current experimental conditions. The present study highlights the importance of understanding PAH interactions in aquatic organisms, since they are usually present in the aquatic environment in complex mixtures.

Relevance:

40.00%

Publisher:

Abstract:

Although the literature on the internationalization of services is scarce, we manage to apply both the Uppsala model and the Eclectic Theory to healthcare services. A cross-case analysis of three international hospitals is carried out in order to define an internationalization pattern and the conditions for a successful process. This is then applied to Associação Protectora dos Diabéticos de Portugal with the purpose of defining an internationalization strategy for the Association.

Relevance:

40.00%

Publisher:

Abstract:

Nowadays, a significant increase in chronic diseases is observed. Epidemiological studies have shown a consistent relationship between the consumption of fruits and vegetables and a reduced risk of certain chronic diseases, namely neurodegenerative disorders. One factor common to these diseases is oxidative stress, which is strongly associated with damage to proteins, lipids, carbohydrates and nucleic acids, leading to cellular dysfunction. Polyphenols, highly abundant in berries and associated products, have been described as having antioxidant properties, with beneficial effects in these pathologies. The aims of this study were to evaluate, by proteomic analyses, the effect of oxidative insult on a neuroblastoma cell line (SK-N-MC) and to understand the mechanisms involved in the neuroprotective effects of digested extracts from commercial and wild blackberry (R. vagabundus Samp.). The analysis of the total proteome by two-dimensional electrophoresis revealed that oxidative stress in SK-N-MC cells resulted in altered expression of 12 protein spots out of a total of 318. Regarding redox proteomics alterations, particularly protein carbonylation and glutathionylation, the changes in protein carbonyls during stress suggest that cells produce both an early and a late response; on the other hand, no glutathionylated polypeptides were detected. Regarding the incubation of SK-N-MC cells with digested berry extracts, commercial blackberry promoted more changes in the protein pattern of these cells than R. vagabundus. Of the 9 statistically different protein spots of cells incubated with commercial blackberry, only β-tubulin and GRP 78 have so far been identified by mass spectrometry. Further studies involving the selection of subproteomes will be necessary for a better understanding of the mechanisms underlying the neuroprotective effects of berries.

Relevance:

40.00%

Publisher:

Abstract:

According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. Digital media proliferation offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of societal-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists in the use of Information and Communication Technologies to mediate and transform the relations among citizens and governments towards increasing citizens' participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies on the achieved outcomes reveal that it has not yet been successfully incorporated into institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable added-value activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that BPM, as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value. Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, consisting of an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) Promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation and of their articulation that bridges the gap between technical and non-technical users; (2) Providing an organisational model that allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) Providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) Providing a representation usable in software development for business process automation, which allows advanced querying using a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place.
An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone to harness e-Participation value.
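To illustrate how such a lightweight ontology stack can be expressed in practice, here is a hypothetical rdflib sketch; the namespace, class and property names are invented for illustration and do not reproduce the actual ePOSM vocabulary:

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

# Hypothetical namespace standing in for the ePOSM vocabulary.
EP = Namespace("http://example.org/eposm#")
g = Graph()
g.bind("ep", EP)

# One OWL class per sub-ontology theme (illustrative names).
for cls in ("Strategy", "OrganisationalUnit", "Function", "Role"):
    g.add((EP[cls], RDF.type, OWL.Class))

# An invented property linking roles to the functions they fulfil,
# the kind of triple a reasoner could later query against.
g.add((EP.fulfils, RDF.type, OWL.ObjectProperty))
g.add((EP.fulfils, RDFS.domain, EP.Role))
g.add((EP.fulfils, RDFS.range, EP.Function))

print(g.serialize(format="turtle"))   # rdflib >= 6 returns a str
```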

Relevance:

40.00%

Publisher:

Abstract:

Magdeburg, Univ., Faculty of Computer Science, Diss., 2011

Relevance:

40.00%

Publisher:

Abstract:

This paper discusses the fitting of a Cobb-Douglas response curve Y_i = αX_i^β with additive error, Y_i = αX_i^β + e_i, instead of the usual multiplicative error, Y_i = αX_i^β(1 + e_i). The estimation of the parameters α and β is discussed. An example is given using both types of error.
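A brief sketch of the two fitting strategies on synthetic data (parameter values and noise level are assumed): the additive-error model calls for nonlinear least squares on the original scale, whereas the multiplicative-error model reduces to ordinary linear regression after a log transform, since ln Y_i = ln α + β ln X_i + ln(1 + e_i):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
y = 2.0 * x**0.5 + rng.normal(0.0, 0.1, x.size)   # additive-error data

# Additive error: nonlinear least squares on the original scale.
(alpha_add, beta_add), _ = curve_fit(lambda x, a, b: a * x**b,
                                     x, y, p0=(1.0, 1.0))

# Multiplicative error: log-transform and fit a straight line.
beta_mul, ln_alpha = np.polyfit(np.log(x), np.log(y), 1)
alpha_mul = np.exp(ln_alpha)

print(alpha_add, beta_add)   # estimates near the true (2.0, 0.5)
print(alpha_mul, beta_mul)
```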

Relevance:

40.00%

Publisher:

Abstract:

The objective of this study, which corresponds to a research project on functional loss and mortality in frail elderly people, is to build a predictive survival process that takes into account the functional and nutritional evolution of patients over time. In this study we face the joint analysis of survival data and repeated measures, but the usual statistical methods for handling these two types of data together are not appropriate in this case. As an alternative, we use multi-state survival models to evaluate the association between mortality and the recovery, or not, of functional and nutritional levels considered normal. After estimating the model and identifying the prognostic factors for mortality, it is possible to obtain a predictive process that yields survival predictions for patients as a function of their specific history up to a given moment. This allows a more precise prognosis for each group of patients, which can be very useful to health professionals when making clinical decisions.
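As a toy illustration of the multi-state idea (the states and transition probabilities below are invented, not estimated from the study's data), a discrete-time Markov chain makes the history-conditional survival prediction concrete: conditioning on a patient's history up to now reduces to starting the chain from their current state.

```python
import numpy as np

states = ["recovered", "impaired", "dead"]
P = np.array([[0.90, 0.08, 0.02],    # from "recovered"
              [0.15, 0.75, 0.10],    # from "impaired"
              [0.00, 0.00, 1.00]])   # "dead" is absorbing

def survival_curve(start, horizon):
    """P(alive) at each future step for a patient currently in `start`."""
    dist = np.eye(len(states))[states.index(start)]
    out = []
    for _ in range(horizon):
        dist = dist @ P              # propagate the state distribution
        out.append(1.0 - dist[2])    # probability of not being dead
    return np.array(out)

# A patient who regained normal function has a better predicted curve:
print(survival_curve("recovered", 12))
print(survival_curve("impaired", 12))
```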

Relevance:

40.00%

Publisher:

Abstract:

We are interested in coupled microscopic/macroscopic models describing the evolution of particles dispersed in a fluid. The system consists of a Vlasov-Fokker-Planck equation describing the microscopic motion of the particles, coupled to the Euler equations for a compressible fluid. We investigate dissipative quantities, equilibria and their stability properties, and the role of external forces. We also study some asymptotic problems, their equilibria and stability, and the derivation of macroscopic two-phase models.
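A representative form of such a kinetic-fluid coupling (an illustrative version with friction and Brownian terms; the precise coefficients, forces and scaling are not specified in the abstract and vary across the literature) relates the particle distribution f(t, x, v) to the fluid density ρ and velocity u:

```latex
% Particles: Vlasov-Fokker-Planck with drag (u - v) and diffusion in v.
% Fluid: compressible Euler with the opposite drag force on the right.
\begin{align*}
&\partial_t f + v \cdot \nabla_x f
   + \nabla_v \cdot \big((u - v)\,f - \nabla_v f\big) = 0,\\
&\partial_t \rho + \nabla_x \cdot (\rho u) = 0,\\
&\partial_t (\rho u) + \nabla_x \cdot (\rho u \otimes u) + \nabla_x p(\rho)
   = \int_{\mathbb{R}^3} (v - u)\, f \, dv .
\end{align*}
```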

Relevance:

40.00%

Publisher:

Abstract:

INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate increases sharply after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring the BMD of all women and one measuring the BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in the EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost-effective than "screen women at risk". For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratios of these two strategies compared with the reference were 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or than screening only women with at least one risk factor. Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on the cost-effectiveness ratios of the usual screening strategies for osteoporosis.
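The incremental cost-effectiveness ratio (ICER) behind these comparisons is simply the extra cost divided by the extra effect relative to the reference strategy. The sketch below uses placeholder costs and effects, not the EPIDOS/SEMOF/OFELY-based estimates from the study:

```python
# Placeholder per-woman costs (euros) and effects (years without hip
# fracture gained) over the 10-year horizon; invented for illustration.
strategies = {
    "no screening":         {"cost": 0.0,    "effect": 0.00},
    "screen women at risk": {"cost": 830.0,  "effect": 0.10},
    "screen all":           {"cost": 1060.0, "effect": 0.25},
}

ref = strategies["no screening"]
for name, s in strategies.items():
    if name == "no screening":
        continue
    # ICER = incremental cost / incremental effect vs. the reference.
    icer = (s["cost"] - ref["cost"]) / (s["effect"] - ref["effect"])
    print(f"{name}: {icer:.0f} euros per year gained without hip fracture")
```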