874 results for Curricula representation and visualization
Abstract:
Image super-resolution is defined as a class of techniques that enhance the spatial resolution of images. Super-resolution methods can be subdivided into single- and multi-image methods. This thesis focuses on developing algorithms based on mathematical theories for single-image super-resolution problems. Indeed, in order to estimate an output image, we adopt a mixed approach: i.e., we use both a dictionary of patches with sparsity constraints (typical of learning-based methods) and regularization terms (typical of reconstruction-based methods). Although the existing methods already perform well, they do not take into account the geometry of the data to: regularize the solution, cluster data samples (samples are often clustered using algorithms with the Euclidean distance as a dissimilarity metric), or learn dictionaries (they are often learned using PCA or K-SVD). Thus, state-of-the-art methods still suffer from shortcomings. In this work, we propose three new methods to overcome these deficiencies. First, we developed SE-ASDS (a structure-tensor-based regularization term) in order to improve the sharpness of edges. SE-ASDS achieves much better results than many state-of-the-art algorithms. Then, we proposed the AGNN and GOC algorithms for determining a local subset of training samples from which a good local model can be computed for reconstructing a given input test sample, taking into account the underlying geometry of the data. The AGNN and GOC methods outperform spectral clustering, soft clustering, and geodesic-distance-based subset selection in most settings. Next, we proposed the aSOB strategy, which takes into account the geometry of the data and the dictionary size. The aSOB strategy outperforms both the PCA and PGA methods. Finally, we combine all our methods in a single algorithm, named G2SR. Our proposed G2SR algorithm shows better visual and quantitative results when compared to state-of-the-art methods.
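As an illustration of the generic sparse-coding building block mentioned above (a dictionary of patches with a sparsity constraint), and not of the thesis's SE-ASDS, AGNN/GOC, aSOB, or G2SR methods, the following minimal Python sketch encodes one vectorized patch over a dictionary with orthogonal matching pursuit; the dictionary D, the patch y, and the sparsity level are hypothetical.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Hypothetical dictionary of 128 atoms for 8x8 (=64-dim) patches.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 128))        # patch_dim x n_atoms
D /= np.linalg.norm(D, axis=0)        # unit-norm atoms

# Synthetic "observed" patch: a sparse combination of two atoms plus noise.
y = 0.8 * D[:, 3] + 0.5 * D[:, 40] + 0.01 * rng.normal(size=64)

# Sparse coding: approximate y with at most 5 atoms.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False)
omp.fit(D, y)
coef = omp.coef_                      # sparse coefficient vector
y_hat = D @ coef                      # reconstructed patch

print("selected atoms:", np.flatnonzero(coef))
print("reconstruction error:", np.linalg.norm(y - y_hat))
```

In a full super-resolution pipeline this per-patch sparse code would be combined with regularization terms over the whole image, as the abstract describes.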
Abstract:
In Ireland, the Constitution guarantees very strong rights to parents and the family, and there has been a long and unfortunate history of failures to adequately protect children at risk. As a result, there has been much discussion in recent years about the need to improve legal mechanisms designed to protect the rights of children. By comparison, little attention has been given to establishing whether the theoretically strong rights of parents translate into strongly protected rights in practice. This paper presents new empirical evidence on the manner in which child care proceedings in Ireland balance the rights and interests of children and parents, including the rates at which orders are granted, the frequency of and conditions in which legal representation is provided, and the extent to which parents are able to actively participate in proceedings. A number of systemic issues are identified that restrict the capacity of the system to emphasise parental rights and hear the voice of parents to the extent that would be expected when looking at the legal provisions in isolation.
Abstract:
This dissertation offers an investigation of the role of visual strategies, art, and representation in reconciling Indian Residential School history in Canada. This research builds upon theories of biopolitics, settler colonialism, and race to examine the project of redress and reconciliation as nation and identity building strategies engaged in the ongoing structural invasion of settler colonialism. It considers the key policy moments and expressions of the federal government—from RCAP to the IRSSA and subsequent apology—as well as the visual discourse of reconciliation as it works through archival photography, institutional branding, and commissioned works. These articulations are read alongside the creative and critical work of Indigenous artists and knowledge producers working within and outside of hegemonic structures on the topics of Indian Residential School history and redress. In particular the works of Jeff Thomas, Adrian Stimson, Krista Belle Stewart, Christi Belcourt, Luke Marston, Peter Morin, and Carey Newman are discussed in this dissertation. These works must be understood in relationship to the normative discourse of reconciliation as a legitimizing mechanism of settler colonial hegemony. Beyond the binary of cooptation and autonomous resistance, these works demonstrate the complexity of representing Indigeneity: as an ongoing site of settler colonial encounter and simultaneously the forum for the willful refusal of contingency or containment.
Abstract:
With the growing prevalence of smartphones and 4G LTE networks, the demand for faster, better, cheaper mobile services anytime and anywhere is ever increasing. The Dynamic Network Optimization (DNO) concept emerged as a solution that optimally and continuously tunes network settings in response to varying network conditions and subscriber needs. Yet the realization of DNO is still in its infancy, largely hindered by the bottleneck of lengthy optimization runtimes. This paper presents the design and prototype of a novel cloud-based parallel solution that further enhances the scalability of our prior work on parallel solutions that accelerate network optimization algorithms. The solution aims to deliver the high performance required by DNO, initially on a sub-hourly basis. The paper subsequently outlines a design and a full cycle of a DNO system. A set of potential solutions for large-network and real-time DNO is also proposed. Overall, this work constitutes a breakthrough towards the realization of DNO.
Abstract:
Archaeozoological mortality profiles have been used to infer site-specific subsistence strategies. There is, however, no common agreement on the best way to present these profiles and the confidence intervals around age-class proportions. To deal with these issues, we propose the use of the Dirichlet distribution and present a new approach to performing multivariate graphical comparisons of age-at-death data. We demonstrate the efficiency of this approach using domestic sheep/goat dental remains from 10 Cardial sites (Early Neolithic) located in southern France and the Iberian Peninsula. We show that the Dirichlet distribution in age-at-death analysis can be used: (i) to generate Bayesian credible intervals around each age class of a mortality profile, even when not all age classes are observed; and (ii) to create 95% kernel density contours around each age-at-death frequency distribution when multiple sites are compared using correspondence analysis. The statistical procedure we present is applicable to the analysis of any categorical count data and is particularly well suited to archaeological data (e.g. potsherds, arrowheads) where sample sizes are typically small.
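A minimal sketch of the Dirichlet-based credible intervals described above, using made-up age-class counts rather than the Cardial sheep/goat data: with a symmetric Dirichlet prior, the posterior over age-class proportions is Dirichlet(prior + counts), and percentile intervals can be read off Monte Carlo samples even for unobserved classes.

```python
import numpy as np

# Hypothetical counts per age class (a zero count is allowed).
counts = np.array([3, 7, 12, 5, 0, 2])

# Symmetric Dirichlet prior; alpha = 1 is uniform over the simplex.
alpha_prior = np.ones_like(counts, dtype=float)

# Posterior is Dirichlet(prior + counts); draw Monte Carlo samples.
rng = np.random.default_rng(0)
samples = rng.dirichlet(alpha_prior + counts, size=10_000)

# 95% credible interval for each age-class proportion.
lower, upper = np.percentile(samples, [2.5, 97.5], axis=0)
for i, (lo, hi) in enumerate(zip(lower, upper)):
    print(f"age class {i}: [{lo:.3f}, {hi:.3f}]")
```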
Abstract:
50 years after the birth of the Nouvelle Vague, the inheritance that the contemporary cinema receives from it is inevitable. Figures and visual motifs; stories, themes, faces or common places; aesthetic and language devices. The echoes appear in various ways, each affiliation involves a different relationship and therefore a dissimilar approximation to its analysis. And yet, both the academy and the film critics maintain their will to think the Nouvelle Vague as a whole, a universe, a stream or an aesthetic trend. However, does a Nouvelle Vague’s aesthetic exist? And if so: why and how to address their historical revision? Taking Deleuze’s thesis on the time-image and Serge Daney’s assertion according to which
Abstract:
We build a system to support search and visualization on heterogeneous information networks. We first build our system on a specialized heterogeneous information network: DBLP. The system aims to help people, especially computer science researchers, better understand and explore academic information networks. We then extend our system to the Web. Our results are much more intuitive and informative than the simple top-k blue links from traditional search engines, and they bring more meaningful structural results with correlated entities. We also investigate the ranking algorithm, and we show that personalized PageRank and the proposed Hetero-personalized PageRank outperform TF-IDF ranking and a mixture of TF-IDF and authority ranking. Our work opens several directions for future research.
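For the ranking component, the following is a minimal illustration of standard personalized PageRank on a toy bibliographic graph using networkx; it is not the Hetero-personalized PageRank proposed above, and the node names are invented.

```python
import networkx as nx

# Toy heterogeneous bibliographic graph: author, paper, and venue nodes.
G = nx.Graph()
G.add_edges_from([
    ("author:Alice", "paper:P1"), ("author:Bob", "paper:P1"),
    ("paper:P1", "venue:SIGMOD"), ("author:Alice", "paper:P2"),
    ("author:Carol", "paper:P2"), ("paper:P2", "venue:VLDB"),
])

# Personalization vector biased toward the query entity.
personalization = {n: 0.0 for n in G}
personalization["author:Alice"] = 1.0

scores = nx.pagerank(G, alpha=0.85, personalization=personalization)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {node}")
```

Entities closely connected to the query node receive higher scores, which is the intuition behind returning correlated entities rather than a flat list of links.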
Abstract:
Since the beginning of the Haitian theatrical tradition there has been an ineluctable dedication to the representation of Haitian history on stage. Given the rich theatrical archive about Haiti throughout the world, this study considers operas and plays written solely by Haitian playwrights. By delving into the works of Juste Chanlatte, Massillon Coicou, and Vendenesse Ducasse, it proposes a re-reading of Haitian theater that considers the stage as an innovative site for contesting negative and clichéd representations of the Haitian Revolution and its revolutionary leadership. For a genre long mired in accusations of mimicking European literary forms, this study proposes a reevaluation of Haitian theater and its literary origins.
Abstract:
Knowledge is the key to success. The treatment applied to data to generate knowledge can make a difference in projects, processes, and networks. Such treatment is the main goal of two important areas: knowledge representation and knowledge management. Our aim in this book is to collect some innovative ways of representing and managing knowledge, proposed by several Latin American researchers, under the premise of improving knowledge.
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are becoming increasingly important to deliver the user-desired Quality of Service (QoS). First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user’s data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running upon the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocation, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving resource usage by 20.83% compared to traditional peak-load-based resource allocation.
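As a point of reference for the query type that sksOpen indexes, here is a brute-force sketch of a top-k spatial boolean query: keep only objects containing all required keywords, then rank them by distance to the query location. The point data and keywords are hypothetical, and this does not reflect sksOpen's actual index structures.

```python
from math import hypot

# Hypothetical points of interest: (id, x, y, keyword set).
pois = [
    ("p1", 2.0, 3.0, {"museum", "free"}),
    ("p2", 1.0, 1.0, {"park"}),
    ("p3", 4.0, 2.5, {"museum", "cafe"}),
]

def topk_spatial_boolean(pois, qx, qy, required, k):
    """Brute-force top-k spatial boolean query: filter by keywords,
    then rank by Euclidean distance to the query point (qx, qy)."""
    hits = [p for p in pois if required <= p[3]]
    hits.sort(key=lambda p: hypot(p[1] - qx, p[2] - qy))
    return hits[:k]

print(topk_spatial_boolean(pois, 0.0, 0.0, {"museum"}, k=2))
```

An indexing engine such as sksOpen exists precisely to avoid this linear scan over all objects.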
Abstract:
Event extraction from texts aims to detect structured information such as what has happened, to whom, where, and when. Event extraction and visualization are typically considered two different tasks. In this paper, we propose a novel approach based on probabilistic modelling to jointly extract and visualize events from tweets, where both tasks benefit from each other. We model each event as a joint distribution over named entities, a date, a location, and event-related keywords. Moreover, both tweets and event instances are associated with coordinates in the visualization space. The manifold assumption that the intrinsic geometry of tweets is a low-rank, non-linear manifold within the high-dimensional space is incorporated into the learning framework through a regularization term. Experimental results show that the proposed approach can effectively deal with both event extraction and visualization and performs remarkably better than both a state-of-the-art event extraction method and a pipeline approach to event extraction and visualization.
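Below is a generic sketch of the kind of graph-Laplacian smoothness penalty commonly used to encode such a manifold assumption: nearby tweets should receive nearby visualization coordinates. The feature matrix X, coordinate matrix Y, and the Gaussian k-nearest-neighbour affinity are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def laplacian_regularizer(X, Y, k=5):
    """Penalty tr(Y^T L Y) = 1/2 * sum_ij W_ij ||y_i - y_j||^2, where W is a
    kNN Gaussian affinity over rows of X and L is the unnormalised Laplacian."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    sigma2 = np.median(d2[d2 > 0])                        # bandwidth heuristic
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    # Keep only each point's k strongest neighbours, then symmetrise.
    drop = np.argsort(-W, axis=1)[:, k:]
    for i in range(n):
        W[i, drop[i]] = 0.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(1)) - W                             # graph Laplacian
    return np.trace(Y.T @ L @ Y)

# Synthetic usage: 20 "tweets" with 50-dim features and 2-D coordinates.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))
Y = rng.normal(size=(20, 2))
print(laplacian_regularizer(X, Y))
```

In a joint model, a term like this would be added to the objective so that the learned coordinates respect the intrinsic geometry of the tweets.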
Abstract:
Extreme weather events related to deep convection are high-impact critical phenomena whose reliable numerical simulation is still challenging. High-resolution (convection-permitting) modeling setups allow switching off the physical parameterizations responsible for substantial errors in the representation of convection. A new convection-permitting reanalysis over Italy (SPHERA) has been produced at ARPAE to enhance the representation and understanding of extreme weather situations. SPHERA is obtained through a dynamical downscaling of the global reanalysis ERA5 using the non-hydrostatic model COSMO at 2.2 km grid spacing over 1995-2020. This thesis aims to verify the expectations placed on SPHERA by analyzing two weather phenomena that are particularly challenging to simulate: heavy rainfall and hail. A quantitative statistical analysis of daily and hourly precipitation over Italy during 2003-2017 is presented to compare the performance of SPHERA with that of its driver ERA5, using the national network of rain gauges as reference. Furthermore, two extreme precipitation events are investigated in depth. SPHERA shows a quantitative added skill over ERA5 for moderate to severe and rapid accumulations in terms of adherence to the observations, finer detail in the spatial fields, and more precise temporal matching. These results prompted the use of SPHERA for the investigation of hailstorms, for which the combination of multiple sources of information is crucial to reduce the substantial uncertainties permeating their understanding. A proxy for hail is developed by combining hail-favoring environmental numerical predictors with observations from ESWD hail reports and satellite overshooting-top detections. The procedure is applied to the extended summer season (April-October) of 2016-2018 over the whole SPHERA spatial domain. The results indicate maximum hail likelihood over pre-Alpine regions and the northern Adriatic Sea around 15 UTC in June-July, in agreement with recent European hail climatologies. The method demonstrates enhanced performance for severe hail occurrences and the ability to distinguish between ambient signatures depending on hail severity.
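The abstract does not specify how the environmental predictors and the hail observations are combined into the proxy, so the sketch below uses a plain logistic regression on synthetic predictor values purely to illustrate one generic way such a hail likelihood could be fitted; the predictor names, the data, and the classifier choice are all assumptions, not the thesis's actual procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training table: one row per (grid cell, hour), with
# hail-favouring predictors from a reanalysis and a binary label derived
# from hail reports / overshooting-top detections (all values synthetic).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))   # e.g. CAPE, deep-layer shear, freezing-level height
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

clf = LogisticRegression().fit(X, y)
hail_likelihood = clf.predict_proba(X)[:, 1]   # per-cell hail probability ("proxy")
print(hail_likelihood[:5])
```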