857 results for "clustering and QoS-aware routing"


Relevance:

100.00%

Publisher:

Abstract:

Image super-resolution is defined as a class of techniques that enhance the spatial resolution of images. Super-resolution methods can be subdivided into single- and multi-image methods. This thesis focuses on developing algorithms based on mathematical theories for single-image super-resolution problems. Indeed, in order to estimate an output image, we adopt a mixed approach: i.e., we use both a dictionary of patches with sparsity constraints (typical of learning-based methods) and regularization terms (typical of reconstruction-based methods). Although the existing methods already perform well, they do not take into account the geometry of the data when regularizing the solution, clustering data samples (samples are often clustered using algorithms with the Euclidean distance as a dissimilarity metric), or learning dictionaries (these are often learned using PCA or K-SVD). Thus, state-of-the-art methods still suffer from shortcomings. In this work, we propose three new methods to overcome these deficiencies. First, we developed SE-ASDS (a structure-tensor-based regularization term) in order to improve the sharpness of edges. SE-ASDS achieves much better results than many state-of-the-art algorithms. Then, we proposed the AGNN and GOC algorithms for determining a local subset of training samples from which a good local model can be computed for reconstructing a given input test sample, taking into account the underlying geometry of the data. The AGNN and GOC methods outperform spectral clustering, soft clustering, and geodesic-distance-based subset selection in most settings. Next, we proposed the aSOB strategy, which takes into account the geometry of the data and the dictionary size. The aSOB strategy outperforms both PCA and PGA methods. Finally, we combine all our methods in a single algorithm, named G2SR. Our proposed G2SR algorithm shows better visual and quantitative results than state-of-the-art methods.
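
The baseline criticized above, Euclidean-distance clustering of patches followed by per-cluster PCA sub-dictionaries, can be sketched in a few lines. This is an illustrative sketch under assumed patch sizes and cluster counts, not the thesis's SE-ASDS, AGNN/GOC, or aSOB methods.

# Baseline patch clustering with Euclidean distance, as criticised in the abstract.
# Illustrative sketch only: patch size, cluster count, and data are arbitrary.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def extract_patches(image, size=8, stride=4):
    """Slide a window over a 2-D image and return flattened patches."""
    h, w = image.shape
    patches = [image[i:i + size, j:j + size].ravel()
               for i in range(0, h - size + 1, stride)
               for j in range(0, w - size + 1, stride)]
    return np.asarray(patches)

rng = np.random.default_rng(0)
image = rng.random((64, 64))              # stand-in for a low-resolution input
patches = extract_patches(image)

# Euclidean k-means clustering of patches (the baseline the thesis improves on).
labels = KMeans(n_clusters=16, n_init=10, random_state=0).fit_predict(patches)

# Per-cluster PCA sub-dictionaries, the conventional alternative to the
# geometry-aware dictionaries proposed in the thesis.
dictionaries = {k: PCA(n_components=8).fit(patches[labels == k]).components_
                for k in range(16) if np.sum(labels == k) >= 8}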

Relevance:

100.00%

Publisher:

Abstract:

Continental margin sediments of SE South America originate from various terrestrial sources, each conveying specific magnetic and element signatures. Here, we aim to identify the sources and transport characteristics of shelf and slope sediments deposited between East Brazil and Patagonia (20°-48°S) using enviromagnetic, major element, and grain-size data. A set of five source-indicative parameters (i.e., chi-fd%, ARM/IRM, S0.3T, SIRM/Fe and Fe/K) of 25 surface samples (16-1805 m water depth) was analyzed by fuzzy c-means clustering and non-linear mapping to depict and unmix sediment-province characteristics. This multivariate approach yields three regionally coherent sediment provinces with petrologically and climatically distinct source regions. The southernmost province is entirely restricted to the slope off the Argentinean Pampas and has been identified as relict Andean-sourced sands with coarse unaltered magnetite. The direct transport to the slope was enabled by Rio Colorado and Rio Negro meltwaters during glacial and deglacial phases of low sea level. The adjacent shelf province consists of coastal loessoidal sands (highest hematite and goethite proportions) delivered from the Argentinean Pampas by wave erosion and westerly winds. The northernmost province includes the Plata mudbelt and Rio Grande Cone. It contains tropically weathered clayey silts from the La Plata Drainage Basin with pronounced proportions of fine magnetite, which were distributed up to ~24°S by the Brazilian Coastal Current and admixed to coarser relict sediments of Pampean loessoidal origin. Grain-size analyses of all samples showed that sediment fractionation during transport and deposition had little impact on magnetic and element source characteristics. This study corroborates the high potential of the chosen approach to assess sediment origin in regions with contrasting sediment sources, complex transport dynamics, and large grain-size variability.
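
The unmixing step described above rests on fuzzy c-means clustering of a small parameter matrix (25 samples by 5 magnetic and element proxies). A minimal NumPy sketch of fuzzy c-means follows; the synthetic standardized matrix stands in for the measured parameters and is not the study's data.

# Minimal fuzzy c-means sketch in the spirit of the sediment-province unmixing
# described above. The synthetic 25x5 matrix stands in for the five
# source-indicative parameters (chi-fd%, ARM/IRM, S0.3T, SIRM/Fe, Fe/K) of the
# 25 surface samples; it is illustrative data, not the study's measurements.
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, n_iter=100, seed=0):
    """Return cluster centres and a fuzzy membership matrix (n_samples x c)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # membership rows sum to one
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Standardise the five proxies so no single parameter dominates the distances.
X = np.random.default_rng(1).normal(size=(25, 5))
X = (X - X.mean(axis=0)) / X.std(axis=0)
centres, memberships = fuzzy_cmeans(X, c=3)    # three sediment provinces
province = memberships.argmax(axis=1)          # hard assignment for mapping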

Relevance:

100.00%

Publisher:

Abstract:

In Marxist frameworks “distributive justice” depends on extracting value through a centralized state. Many new social movements—peer to peer economy, maker activism, community agriculture, queer ecology, etc.—take the opposite approach, keeping value in its unalienated form and allowing it to freely circulate from the bottom up. Unlike Marxism, there is no general theory for bottom-up, unalienated value circulation. This paper examines the concept of “generative justice” through an historical contrast between Marx’s writings and the indigenous cultures that he drew upon. Marx erroneously concluded that while indigenous cultures had unalienated forms of production, only centralized value extraction could allow the productivity needed for a high quality of life. To the contrary, indigenous cultures now provide a robust model for the “gift economy” that underpins open source technological production, agroecology, and restorative approaches to civil rights. Expanding Marx’s concept of unalienated labor value to include unalienated ecological (nonhuman) value, as well as the domain of freedom in speech, sexual orientation, spirituality and other forms of “expressive” value, we arrive at an historically informed perspective for generative justice. 

Relevance:

100.00%

Publisher:

Abstract:

Most recent studies of Loyalism in Northern Ireland have focused on the nature and development of Loyalist paramilitaries and their methods, ideology and attitudes to the peace process. This article argues that the nature of Loyalist paramilitarism is primarily masculinist and that there is a perspective from women in Loyalist communities that has gone largely unheard. Using standpoint theory, evidence from interviews with women in Loyalist communities associated with Belfast is analysed; the picture that emerges suggests that there are gendered attitudes towards women who become involved in the conflict through paramilitary organisations, and that paramilitaries are not representative of their communities. It is concluded that researchers need to bear in mind the gender dimensions of their work and be aware of who is present and who is absent when research is being carried out.

Relevance:

100.00%

Publisher:

Abstract:

Resource Selection (or Query Routing) is an important step in P2P IR. Though analogous to document retrieval in the sense of choosing a relevant subset of resources, resource selection methods have evolved independently from those for document retrieval. Among the reasons for such divergence is that document retrieval targets scenarios where the underlying resources are semantically homogeneous, whereas peers typically manage diverse content. We observe that semantic heterogeneity is mitigated at the resource selection layer of the clustered 2-tier P2P IR architecture through the use of clustering, and posit that this warrants revisiting the applicability of document retrieval methods for resource selection within such a framework. This paper empirically benchmarks document retrieval models against state-of-the-art resource selection models for the problem of resource selection in the clustered P2P IR architecture, using classical IR evaluation metrics. Our benchmarking study illustrates that document retrieval models significantly outperform other methods for the task of resource selection in the clustered P2P IR architecture. This indicates that the clustered P2P IR framework can exploit advancements in document retrieval methods to deliver corresponding improvements in resource selection, pointing to a potential convergence of these fields for the clustered P2P IR architecture.
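
The core idea being benchmarked can be illustrated by treating each resource (a cluster of peers) as a single pseudo-document and ranking resources with an ordinary document retrieval model. The corpus, query, and choice of a TF-IDF/cosine model below are illustrative assumptions, not the paper's exact experimental setup.

# Sketch: rank resources (peer clusters) with a plain document retrieval model
# by treating each resource as one pseudo-document. Illustrative data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Each entry concatenates the documents held by one peer cluster (resource).
resources = {
    "cluster_sports": "football league match goal referee season",
    "cluster_finance": "stock market bond yield inflation central bank",
    "cluster_medicine": "clinical trial patient dosage symptom diagnosis",
}

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(resources.values()))

def rank_resources(query):
    """Score every resource against the query and return a ranked list."""
    scores = cosine_similarity(vectorizer.transform([query]), matrix).ravel()
    return sorted(zip(resources.keys(), scores), key=lambda kv: -kv[1])

print(rank_resources("interest rate and bond prices"))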

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the use of trajectory distance measures and clustering techniques to define normal and abnormal trajectories in the context of pedestrian tracking in public spaces. To detect abnormal trajectories, we first define what is meant by a normal trajectory in a given scene; every trajectory that deviates from this normality is then classified as abnormal. By combining Dynamic Time Warping and a modified K-Means algorithm for arbitrary-length data series, we have developed an algorithm for trajectory clustering and abnormality detection. The final system achieves overall accuracies of 83% and 75% when tested on two different standard datasets.
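
A minimal sketch of the two ingredients named above follows: a DTW distance for 2-D trajectories of different lengths, and a prototype-based assignment-and-flagging step. The abstract does not spell out the modified K-Means update for arbitrary-length series, so the "normal" prototypes here are simply given; the trajectories and threshold are illustrative.

# DTW distance plus nearest-prototype assignment and abnormality flagging.
# Not the paper's exact modified K-Means; an illustrative sketch only.
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two (len, 2) point sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def assign_and_flag(trajectories, prototypes, threshold):
    """Assign each trajectory to its nearest prototype; flag it abnormal when
    the DTW distance to every normal prototype exceeds the threshold."""
    labels, abnormal = [], []
    for t in trajectories:
        dists = [dtw(t, p) for p in prototypes]
        labels.append(int(np.argmin(dists)))
        abnormal.append(min(dists) > threshold)
    return labels, abnormal

rng = np.random.default_rng(0)
walk = lambda n: np.cumsum(rng.normal(size=(n, 2)), axis=0)   # toy trajectories
prototypes = [walk(30), walk(45)]                             # "normal" behaviour
labels, abnormal = assign_and_flag([walk(25), walk(60)], prototypes, threshold=50.0)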

Relevance:

100.00%

Publisher:

Abstract:

Mathematical models are increasingly used in environmental science, which increases the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model, the Operational Street Pollution Model (OSPM). To assess the predictive validity of the model, the data are split into an estimation and a prediction data set using two data-splitting approaches, and data preparation techniques (clustering and outlier detection) are analysed. The sensitivity analysis, being part of the identifiability analysis, showed that some model parameters were significantly more sensitive than others. The application of the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was also shown that the frequentist approach applied for the uncertainty calculations underestimated the parameter uncertainties. The model parameter uncertainty was qualitatively assessed to be significant, and reduction strategies were identified.
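
For intuition, a one-at-a-time local sensitivity check of the kind used in identifiability analysis can be sketched as follows. OSPM itself is not scriptable here, so the model function is a hypothetical stand-in, and the parameters and data are invented for illustration.

# One-at-a-time local sensitivity sketch. `model` is a hypothetical stand-in
# mapping parameters to predicted concentrations; it is not OSPM.
import numpy as np

def model(theta, x):
    """Placeholder dispersion-like model: concentration versus distance x."""
    emission, decay, background = theta
    return emission * np.exp(-decay * x) + background

def local_sensitivities(theta, x, rel_step=0.01):
    """Relative finite-difference sensitivity of the output to each parameter."""
    base = model(theta, x)
    sens = []
    for k in range(len(theta)):
        perturbed = np.array(theta, dtype=float)
        perturbed[k] *= 1.0 + rel_step
        delta = model(perturbed, x) - base
        # normalised sensitivity: (dY / Y) / (dtheta / theta)
        sens.append(np.mean(np.abs(delta / base)) / rel_step)
    return np.array(sens)

x = np.linspace(1.0, 100.0, 50)
theta = [10.0, 0.05, 1.0]                    # emission, decay, background
print(local_sensitivities(theta, x))         # larger value = more sensitive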

Relevance:

100.00%

Publisher:

Abstract:

In this paper we use concepts from graph theory and cellular biology, represented as ontologies, to carry out semantic mining tasks on signaling pathway networks. Specifically, the paper describes the semantic enrichment of signaling pathway networks. A cell signaling network describes the basic cellular activities and their interactions. The main contribution of this paper is in the signaling pathway research area: it proposes a new technique to analyze and understand how changes in these networks may affect the transmission and flow of information, which can produce diseases such as cancer and diabetes. Our approach is based on three concepts from graph theory (modularity, clustering and centrality) frequently used in social network analysis. The approach consists of two phases: the first uses the graph theory concepts to determine the cellular groups in the network, which we call communities; the second uses ontologies for the semantic enrichment of these cellular communities. The graph-theoretic measures allow us to determine the set of cells that are close (for example, in a disease) and the main cells in each community. We analyze our approach in two cases: TGF-β and Alzheimer's disease.
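
The three graph-theoretic ingredients named above (modularity-based communities, the clustering coefficient, and centrality) can be sketched on a toy interaction graph; the edge list below is illustrative and is not the TGF-β or Alzheimer's network analysed in the paper.

# Communities, clustering coefficient, and centrality on a toy graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [("A", "B"), ("B", "C"), ("A", "C"),        # one tightly knit group
         ("D", "E"), ("E", "F"), ("D", "F"),        # a second group
         ("C", "D")]                                # weak bridge between them
G = nx.Graph(edges)

# Phase 1: modularity maximisation yields the "communities" of the network.
communities = list(greedy_modularity_communities(G))

# Clustering coefficient: how close each node's neighbours are to a clique.
clustering = nx.clustering(G)

# Betweenness centrality highlights the "main" nodes that bridge communities
# and would therefore be candidates for semantic enrichment with ontologies.
centrality = nx.betweenness_centrality(G)

print(communities)
print(max(centrality, key=centrality.get))   # the bridging node ("C" or "D")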

Relevance:

100.00%

Publisher:

Abstract:

International audience

Relevance:

100.00%

Publisher:

Abstract:

Earlier histories of the Scottish parliament have been somewhat constitutional in emphasis and have been exceedingly critical of what was understood to be parliament's subservience to the crown. Estimates by constitutional historians of the extreme weakness of parliament rested on an assessment of the constitutional system. The argument was that many of its features were not consistent with a reasonably strong parliament. Because the 'constitution' is apparently fragmented, with active roles played by bodies such as the lords of articles, the general council and the convention of estates, each apparently suggesting that parliament was inadequate, historians have sometimes failed to appreciate the positive role played by the estates in the conduct of national affairs. The thesis begins with a discussion of the reliability of the printed text of APS and proceeds to an examination of selected aspects of the work of parliament in the period from c. 1424 to c. 1625. The belief of constitutional historians such as Rait that conditions in Scotland proved unfavourable to the interests and effectiveness of parliament in the fifteenth and sixteenth centuries is also examined. Chapter 1 concludes that APS is a less than reliable text, particularly for the reign of James I. Numerous statutes were excluded from the printed text and they are offered below for the first time. These statutes have been a useful addition to our understanding of the reign of James I. Chapter 2 analyses the motives behind the schemes for shire representation and concludes that neither constitutional theory nor political opportunism explains the support which James I and James VI gave to these measures. Both these monarchs were motivated by the realisation that their particular ambitions were dependent on winning the support of the estates, whose ranks should include representatives from the shires. Chapter 3 examines the method of electing the lords of articles, the composition of this committee, and some aspects of its operation. The conclusion is that in the main the estates were the deciding force in the choice of the lords of articles. The committee's composition was more a reflection of a desire for a balance between representatives from north and south of the Forth, and for the most important burghs and clergy to be selected, than an attempt at electing government favourites. The articles did exercise a significant control over the items which came before parliament, but this control was not absolute and applied to government as well as private legislation. Chapter 4 questions the traditional view that the general council and convention of estates were the same body. It is argued that they were two different institutions with different powers, but that they nevertheless worked within certain limits and were careful not to usurp the authority of parliament. Chapter 5 concedes that taxation was sometimes decided outside parliament; that the irregularity of taxation certainly weakened the bargaining power of the estates; and that the latter did not appear to capitalise on those occasions when taxation was an issue. But the tendency was to ensure that, whether in or out of parliament, the decision to impose taxation was taken by a large number of each estate. The infrequency of taxation was a direct consequence of an unwillingness among the estates to agree to regular taxation and their preference to ensure for the crown an alternative source of income. Moreover, taxation was one issue which, more than any other, would be subject to contentious opposition by the estates and could lead to the crown's defeat. Chapter 6 is concerned with ecclesiastical representation after the Reformation and the church's attitudes to the possibility of ministerial representation. Some ministers had doctrinal misgivings, but the majority came to believe that the church's absence from parliament had severely reduced the influence of the church. That no agreement was forthcoming on a system of ministerial representation, particularly after 1597, is attributable to the estates' unwillingness to compromise and not to the strength of opposition in the church. Chapter 7 examines the institutions which are sometimes seen as 'rivals' of parliament and concludes that institutions such as the privy council were generally very careful in matters which needed the approval of parliament, and seemed aware of the greater authority of parliament. Chapter 8, which illustrates how parliament had the right to be consulted in all important matters of state, brings together the main points of the earlier chapters and offers further illustrations of the essential role which parliament played in the conduct of national affairs. Whether or not the system can be regarded as constitutionally sound, the estates in Scotland could observe parliament's day-to-day operation with some satisfaction. All in all, there is little convincing evidence that parliament was as weak as some historians would have us believe.

Relevance:

100.00%

Publisher:

Abstract:

The Internet has grown in size at rapid rates since BGP records began, and continues to do so. This has raised concerns about the scalability of the current BGP routing system, as the routing state at each router in a shortest-path routing protocol grows supra-linearly as the network grows. The concerns are that the memory capacity of routers will not be able to keep up with demands, and that the growth of the Internet will become ever more constrained as more and more of the world seeks the benefits of being connected. Compact routing schemes, where the routing state grows only sub-linearly relative to the growth of the network, could solve this problem and ensure that router memory would not be a bottleneck to Internet growth. These schemes trade away shortest-path routing for scalable memory state, by allowing some paths to have a certain amount of bounded “stretch”. The most promising such scheme is Cowen Routing, which can provide scalable, compact routing state for Internet routing while still providing shortest-path routing to nearly all other nodes, with only slightly stretched paths to a very small subset of the network. Currently, there is no fully distributed form of Cowen Routing that would be practical for the Internet. This dissertation describes a fully distributed and compact protocol for Cowen Routing, using the k-core graph decomposition. Previous compact routing work showed that the k-core graph decomposition is useful for Cowen Routing on the Internet, but no distributed form existed. This dissertation gives a distributed k-core algorithm optimised to be efficient on dynamic graphs, along with proofs of its correctness. The performance and efficiency of this distributed k-core algorithm are evaluated on large Internet AS graphs, with excellent results. The dissertation then goes on to describe a fully distributed and compact Cowen Routing protocol. The protocol comprises a landmark selection process for Cowen Routing using the k-core algorithm, with mechanisms to ensure compact state at all times, including at bootstrap; a local cluster routing process, with mechanisms for policy application and control of cluster sizes, again ensuring that state remains compact at all times; and a landmark routing process with a prioritisation mechanism for announcements that ensures compact state at all times.
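
For intuition, a centralised k-core "peeling" decomposition can be sketched in a few lines; the dissertation's contribution is a distributed, dynamic version of this decomposition used to select Cowen Routing landmarks, which this sketch does not attempt to implement. The toy topology is invented for illustration.

# Centralised k-core peeling: repeatedly remove the lowest-degree node; its
# core number is the largest "current degree" seen so far.
from collections import defaultdict

def core_numbers(adj):
    """Return {node: core number} for an undirected graph given as adjacency sets."""
    degree = {v: len(adj[v]) for v in adj}
    remaining = set(adj)
    core = {}
    k = 0
    while remaining:
        v = min(remaining, key=lambda u: degree[u])   # peel lowest-degree node
        k = max(k, degree[v])
        core[v] = k
        remaining.remove(v)
        for u in adj[v]:
            if u in remaining:
                degree[u] -= 1
    return core

# Toy AS-like topology: a dense core (clique 1-2-3-4) with stub networks 5-7.
adj = defaultdict(set)
for a, b in [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4),
             (5, 1), (6, 2), (7, 3)]:
    adj[a].add(b)
    adj[b].add(a)

print(core_numbers(adj))   # clique members get core 3; stubs get core 1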

Relevance:

100.00%

Publisher:

Abstract:

Part 21: Mobility and Logistics

Relevance:

100.00%

Publisher:

Abstract:

Part 21: Mobility and Logistics

Relevance:

100.00%

Publisher:

Abstract:

Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
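
One of the combination strategies mentioned above, consensus clustering, can be sketched by accumulating a co-association matrix over several base clusterings and extracting a consensus partition from it. The random expression-like matrix and the choice of base algorithm below are illustrative assumptions, not ArrayMining.net's actual pipeline.

# Consensus clustering via a co-association matrix over several base runs.
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))             # 60 samples x 200 "genes" (synthetic)

# Base clusterings: same algorithm, different seeds (different algorithms or
# data sets could be combined in the same way).
n, runs, k = X.shape[0], 20, 3
coassoc = np.zeros((n, n))
for seed in range(runs):
    labels = KMeans(n_clusters=k, n_init=5, random_state=seed).fit_predict(X)
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= runs                            # fraction of runs agreeing on a pair

# Consensus partition: hierarchical clustering of the co-association distances.
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
consensus = fcluster(linkage(squareform(dist), method="average"), t=k,
                     criterion="maxclust")
print(np.bincount(consensus))              # consensus cluster sizes (index 0 unused)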

Relevance:

100.00%

Publisher:

Abstract:

Libraries have been in a process of constant change since their inception 4,000 years ago. Although change was slow for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. Firstly, a holistic structure and the required toolset to holistically assess academic libraries are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspective of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system, analyzing library performance and the costs incurred and resources consumed by library services. The second quadrant evaluates the external perspective of the library system: users' perception of service quality. The third quadrant analyses the external perspective of the library collection, evaluating the impact of the current library collection on its users. Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns followed to manipulate the library collection. With a complete framework for data collection in place, these data, which come from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. Secondly, a data warehousing approach is designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and used for different purposes, including the following: 1) Data visualization and reporting are proposed to allow library managers to publish library indicators in a simple and quick manner using online reporting tools. 2) Sophisticated data analysis is recommended through the use of data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These techniques have been applied to the case study in the following manner: predicting future investment in library development; finding clusters of users that share common interests and similar profiles but belong to different faculties; and predicting library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) As input for optimization models, early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study. Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model for the problem is defined with the objective of maximizing the usage of the digital collection across all library divisions subject to a single collection budget. By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers in making economic decisions based on an "as realistic as possible" perspective of the library situation.
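
The allocation model described above can be illustrated as a small linear program: maximize expected usage of the digital collection across divisions subject to one shared collection budget. The divisions, usage-per-euro coefficients, caps, and budget below are invented for illustration and are not the study's actual model or figures.

# Hedged sketch of a budget-constrained usage-maximisation allocation.
import numpy as np
from scipy.optimize import linprog

divisions = ["Engineering", "Medicine", "Humanities", "Law"]
usage_per_euro = np.array([4.0, 3.2, 2.5, 1.8])          # expected downloads per euro
max_spend = np.array([40_000, 35_000, 30_000, 20_000])   # per-division demand caps
budget = 80_000

# linprog minimises, so negate the objective to maximise total expected usage.
res = linprog(
    c=-usage_per_euro,
    A_ub=np.ones((1, len(divisions))), b_ub=[budget],    # single shared budget
    bounds=[(0, cap) for cap in max_spend],
    method="highs",
)
allocation = dict(zip(divisions, np.round(res.x, 2)))
print(allocation, "total expected usage:", -res.fun)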