818 results for Functorial Embedding


Relevance: 10.00%

Publisher:

Abstract:

We show how to reconstruct a graded ample Hausdorff groupoid with topologically principal neutrally-graded component from the ring structure of its graded Steinberg algebra over any commutative integral domain with 1, together with the embedding of the canonical abelian subring of functions supported on the unit space. We deduce that diagonal-preserving ring isomorphism of Leavitt path algebras implies $C^*$-isomorphism of $C^*$-algebras for graphs $E$ and $F$ in which every cycle has an exit. This is joint work with Joan Bosa, Roozbeh Hazrat and Aidan Sims.
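Stated schematically, the second result reads as follows, writing $L_R(E)$ for the Leavitt path algebra of a graph $E$ over the ring $R$ and $C^*(E)$ for the graph $C^*$-algebra (this notation is assumed here, not fixed in the abstract): for graphs $E$ and $F$ in which every cycle has an exit,

$$ L_R(E) \cong L_R(F) \ \text{(diagonal-preserving, as rings)} \quad\Longrightarrow\quad C^*(E) \cong C^*(F). $$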

Relevance: 10.00%

Publisher:

Abstract:

The heart is a non-regenerating organ that gradually suffers a loss of cardiac cells and functionality. Given the scarcity of organ donors and the complications of existing medical implantation solutions, it is desirable to engineer a three-dimensional architecture that can control cardiac cells in vitro and yield true myocardial structures similar to the native heart. This thesis investigates the synthesis of a biocompatible gelatin methacrylate hydrogel to promote the growth of cardiac cells, combined with a biotechnology methodology, surface acoustic waves, to create cell sheets.

First, the synthesis of a photo-crosslinkable gelatin methacrylate (GelMA) hydrogel was investigated at different degrees of methacrylation. The porous matrix of the hydrogel should be biocompatible, allow cell-cell interaction and promote cell adhesion for growth throughout the porous network of the matrix. The rheological properties of the hydrogel, such as polymer concentration, ultraviolet exposure time, viscosity, elasticity and swelling characteristics, were investigated. In tissue engineering, hydrogels have been used for embedding cells to mimic native microenvironments while controlling the mechanical properties. Gelatin methacrylate hydrogels have the advantage of allowing such control of mechanical properties in addition to easy compatibility with lab-on-a-chip methodologies.

Second, standing surface acoustic waves were used to control the movement of cells in the hydrogel and to produce three-dimensional engineered scaffolds for in-vitro studies of cardiac muscle electrophysiology and cardiac tissue engineering therapies for myocardial infarction. The acoustic waves were characterized on a piezoelectric substrate, lithium niobate, micro-fabricated with slanted-finger interdigitated transducers to generate waves at multiple wavelengths. This characterization successfully created three-dimensional micro-patterning of cells in the constructs by means of one- and two-dimensional non-invasive forces. The micro-patterning was controlled by tuning different input frequencies, which allowed the cells to be manipulated spatially without any pre-treatment of cells, hydrogel or substrate. This resulted in a synchronous heartbeat being produced in the hydrogel construct. To complement these mechanical forces, work in dielectrophoresis was conducted, centred on a method to pattern micro-particles. Although manipulation of particles was demonstrated, the close proximity of particles and hydrogel to the microfabricated electrode arrays, the dependence on hydrogel conductivity, and the difficulty of manoeuvring the scaffold off the electrode surface precluded measurements on cardiac cells. In addition, COMSOL Multiphysics software was used to investigate the mechanical and electrical forces theoretically acting on the cells.

Third, cardiac electrophysiology was investigated using immunostaining techniques to visualize the growth of sarcomeres and gap junctions, which promote cell-cell interaction and excitation-contraction of heart muscle. The physiological beating response of co-cultured cardiomyocytes and cardiac fibroblasts was observed to be synchronous and simultaneous, closely mimicking native cardiac impulses. Further investigations were carried out by mechanically stimulating the cells in the three-dimensional hydrogel using standing surface acoustic waves and comparing the results with a traditional two-dimensional flat surface coated with fibronectin. The electrophysiological responses of the cells under mechanical stimulation yielded a higher magnitude of contractility, action potential and calcium transient.
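As a rough numerical illustration of the acoustic patterning (not taken from the thesis): cells collect at the pressure nodes of the standing wave, which sit half an acoustic wavelength apart, so the patterning pitch follows directly from the drive frequency. The SAW velocity below is an assumed, representative literature value for lithium niobate.

```python
# Illustrative only: node spacing of a standing surface acoustic wave.
# The SAW velocity is a typical literature value for 128° Y-X lithium
# niobate (~3990 m/s), assumed here for the sake of the example.

SAW_VELOCITY_M_PER_S = 3990.0  # assumed substrate SAW velocity


def node_spacing_um(frequency_hz: float) -> float:
    """Return the spacing between adjacent pressure nodes in micrometres.

    Cells in a standing-wave field collect at pressure nodes, which sit
    half a wavelength apart: spacing = (v / f) / 2.
    """
    wavelength_m = SAW_VELOCITY_M_PER_S / frequency_hz
    return (wavelength_m / 2.0) * 1e6


if __name__ == "__main__":
    # Sweeping the input frequency changes the patterning pitch, which is
    # how slanted-finger transducers allow tunable cell spacing.
    for f_mhz in (10, 20, 40):
        print(f"{f_mhz} MHz -> node spacing ≈ {node_spacing_um(f_mhz * 1e6):.1f} µm")
```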

Relevance: 10.00%

Publisher:

Abstract:

Image (video) retrieval is the problem of retrieving images (videos) similar to a query. Images (videos) are represented in an input (feature) space, and similar images (videos) are obtained by finding nearest neighbors in that representation space. Numerous input representations, in both real-valued and binary spaces, have been proposed for conducting faster retrieval. In this thesis, we present techniques that obtain improved input representations for retrieval in both supervised and unsupervised settings for images and videos.

Supervised retrieval is the well-known problem of retrieving images of the same class as the query. In the first part, we address the practical aspects of achieving faster retrieval with binary codes as input representations for the supervised setting, where binary codes are used as addresses into hash tables. In practice, using binary codes as addresses does not guarantee fast retrieval, as similar images are not mapped to the same binary code (address). We address this problem by presenting an efficient supervised hashing (binary encoding) method that aims to explicitly map all images of the same class, ideally, to a unique binary code. We refer to the binary codes of the images as `Semantic Binary Codes' and to the unique code for all images of a class as the `Class Binary Code'. We also propose a new class-based Hamming metric that dramatically reduces retrieval times for larger databases, as the Hamming distance is computed only to the class binary codes. We further propose a deep semantic binary code model, obtained by replacing the output layer of a popular convolutional neural network (AlexNet) with the class binary codes, and show that the hashing functions learned in this way outperform the state of the art while providing fast retrieval times.

In the second part, we address the problem of supervised retrieval by taking into account the relationships between classes. For a given query image, we want to retrieve images that preserve the relative order, i.e., we want to retrieve all same-class images first and then images of related classes, before images of different classes. We learn such relationship-aware binary codes by minimizing the discrepancy between the inner product of the binary codes and the similarity between the classes. We calculate the similarity between classes using output embedding vectors, which are vector representations of the classes. Our method deviates from other supervised binary encoding schemes, as it is the first to use output embeddings for learning hashing functions. We also introduce new performance metrics that take related-class retrieval results into account and show significant gains over the state of the art.

High-dimensional descriptors such as Fisher Vectors or Vectors of Locally Aggregated Descriptors have been shown to improve the performance of many computer vision applications, including retrieval. In the third part, we discuss an unsupervised technique for compressing high-dimensional vectors into high-dimensional binary codes, to reduce storage complexity. In this approach, we deviate from the traditional hyperplane hashing functions and instead learn hyperspherical hashing functions. The proposed method overcomes the computational challenges of directly applying the spherical hashing algorithm, which is intractable for compressing high-dimensional vectors. A practical hierarchical model is presented that uses divide-and-conquer techniques, via the Random Select and Adjust (RSA) procedure, to compress such high-dimensional vectors. We show that our proposed high-dimensional binary codes outperform the binary codes obtained using traditional hyperplane methods at higher compression ratios.

In the last part of the thesis, we propose a retrieval-based solution to the zero-shot event classification problem, a setting where no training videos are available for the event. To do this, we learn a generic set of concept detectors and represent both videos and query events in the concept space. We then compute the similarity between the query event and each video in the concept space, and videos similar to the query event are classified as belonging to that event. We show that we significantly boost performance using concept features from other modalities.
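The class-code idea above can be illustrated with a short sketch (not the thesis's implementation): hash the query to a binary code, then rank classes by Hamming distance to their class binary codes, so the comparison cost depends on the number of classes rather than the database size.

```python
import numpy as np

# Illustrative sketch of class-code retrieval (not the thesis's code).
# Assume each class has one binary "class code"; a query is hashed to a
# binary code and compared against the class codes by Hamming distance,
# so retrieval cost scales with the number of classes, not the database size.

rng = np.random.default_rng(0)
n_classes, n_bits = 10, 64

class_codes = rng.integers(0, 2, size=(n_classes, n_bits), dtype=np.uint8)


def hamming(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Hamming distance between a query code and each row of b."""
    return np.count_nonzero(a != b, axis=1)


query_code = rng.integers(0, 2, size=n_bits, dtype=np.uint8)
distances = hamming(query_code, class_codes)
ranked_classes = np.argsort(distances)  # nearest class codes first
print(ranked_classes[:3], distances[ranked_classes[:3]])
```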

Relevance: 10.00%

Publisher:

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants do not see each other's data; they see only the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker: the game is divided into rounds of local decision-making (e.g. bidding) and joint interaction (e.g. dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications, or they lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run MPC programs, leaving the potential for security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs (as far as we know, Wys* is the first language to provide verification capabilities for MPC programs); (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs while providing privacy guarantees similar to those of the monolithic versions.
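The "secret shares" mentioned above are most commonly realised by additive secret sharing over a finite field. The following minimal sketch shows that standard construction only to make the notion concrete; it is generic textbook MPC machinery, not the Wysteria or Wys* API.

```python
import secrets

# Generic additive secret sharing over a prime field: a standard MPC
# building block, shown here only to illustrate what a "secret share" is.
# This is not the Wysteria/Wys* API.

PRIME = 2**61 - 1  # arbitrary public prime modulus chosen for the example


def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (secret - sum(shares)) % PRIME
    return shares + [last]


def reconstruct(shares: list[int]) -> int:
    """Recombine all shares to recover the secret."""
    return sum(shares) % PRIME


if __name__ == "__main__":
    bids = [12, 30, 7]  # each party's private input
    # Each party shares its input; each party then adds the shares it holds,
    # so the joint sum is computed without revealing any individual bid.
    shared = [share(b, 3) for b in bids]
    sum_shares = [sum(col) % PRIME for col in zip(*shared)]
    assert reconstruct(sum_shares) == sum(bids) % PRIME
    print("joint sum:", reconstruct(sum_shares))
```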

Relevance: 10.00%

Publisher:

Abstract:

Working on the $d$-torus, we show that Besov spaces $B^s_p(L_p(\log L)_a)$ modelled on Zygmund spaces can be described in terms of classical Besov spaces. Several other properties of the spaces $B^s_p(L_p(\log L)_a)$ are also established. In particular, in the critical case $s = d/p$, we characterize the embedding of $B^{d/p}_p(L_p(\log L)_a)$ into the space of continuous functions.

Relevance: 10.00%

Publisher:

Abstract:

Faculty of Educational Studies (Wydział Studiów Edukacyjnych)

Relevance: 10.00%

Publisher:

Abstract:

Social network sites (SNS), such as Facebook, Google+ and Twitter, have attracted hundreds of millions of daily users since their appearance. Within SNS, users connect to each other, express their identity, disseminate information and form cooperative ties by interacting with their connected peers. The increasing popularity and ubiquity of SNS usage, together with the invaluable user behaviors and connections, have given rise to many applications and business models. We look into several important problems within the social network ecosystem. The first is the SNS advertisement allocation problem. The other two are related to trust mechanism design in the social network setting: local trust inference and global trust evaluation. In SNS advertising, we study the problem of advertisement allocation from the ad platform's perspective, and discuss how it differs from the advertising model in the search engine setting. By leveraging the connection between social networks and hyperbolic geometry, we propose to solve the problem approximately using hyperbolic embedding and convex optimization. A hyperbolic embedding method, \hcm, is designed for the SNS ad allocation problem, and several components are introduced to realize the optimization formulation. We show the advantages of our new approach in solving the problem compared to the baseline integer programming (IP) formulation. In studying trust mechanisms in social networks, we consider the existence of distrust (i.e., negative trust) relationships, and differentiate between the concepts of local trust and global trust in the social network setting. For local trust inference, we propose a 2-D trust model and, based on the model, develop a semiring-based trust inference framework. For global trust evaluation, we consider a general setting with conflicting opinions, and propose a consensus-based approach to solve this complex problem in signed trust networks.
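The abstract does not spell out the embedding itself; as background for the hyperbolic-geometry connection, the sketch below computes the standard Poincaré-disk distance, which is the usual metric underlying hyperbolic embeddings. It is generic background, not the thesis's \hcm method.

```python
import numpy as np

# Standard Poincaré-disk (hyperbolic) distance, given as background for
# hyperbolic embeddings of social networks; this is not the thesis's method.


def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Hyperbolic distance between two points inside the unit disk/ball."""
    squared_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * squared_diff / denom))


# Two points near the boundary are much farther apart hyperbolically than
# in Euclidean distance, which is what lets tree-like social-network
# structure embed with low distortion.
u = np.array([0.0, 0.5])
v = np.array([0.0, 0.9])
print(poincare_distance(u, v), np.linalg.norm(u - v))
```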

Relevance: 10.00%

Publisher:

Abstract:

The development of robots has proven to be a very complex interdisciplinary research field. The predominant procedure in these developments over the last decades has been based on the assumption that each robot is a fully personalized project, with hardware and software technologies embedded directly in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has, on the other hand, imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to the expertise of local groups. Large advances might be achieved, for example, if the physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization in all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
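The TORP specification itself is not reproduced in the abstract. Purely as a hypothetical illustration of the stated dissociation principle (robot vs. parts vs. technologies), a part can be modelled as an abstract interface that any vendor technology implements; none of the names below come from TORP.

```python
from abc import ABC, abstractmethod

# Hypothetical illustration of the "robot / part / technology" dissociation
# described in the abstract. None of these names come from the TORP spec.


class JointActuator(ABC):
    """Abstract robot part: robot-level code depends only on this interface."""

    @abstractmethod
    def set_angle(self, radians: float) -> None: ...

    @abstractmethod
    def read_angle(self) -> float: ...


class HypotheticalServoActuator(JointActuator):
    """One possible technology behind the part; swappable without changing
    any robot-level code."""

    def __init__(self) -> None:
        self._angle = 0.0

    def set_angle(self, radians: float) -> None:
        self._angle = radians  # a real driver would command hardware here

    def read_angle(self) -> float:
        return self._angle


def wave(arm_joint: JointActuator) -> None:
    """Robot-level behaviour written against the part interface only."""
    for angle in (0.3, -0.3, 0.0):
        arm_joint.set_angle(angle)


wave(HypotheticalServoActuator())
```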

Relevance: 10.00%

Publisher:

Abstract:

Background: The ageing population, with a concomitant increase in chronic conditions, is increasing the presence of older people with complex needs in hospital. People with dementia are one of these complex populations and are particularly vulnerable to complications in hospital. Registered nurses can offer simultaneous assessment and intervention to prevent or mitigate hospital-acquired complications through their skilled brokerage between patient needs and hospital functions. A range of patient outcome measures that are sensitive to nursing care has been tested in nursing work environments across the world. However, none of these measures has focused on hospitalised older patients.

Method: This thesis explores nursing-sensitive complications for older patients with and without dementia using an internationally recognised, risk-adjusted patient outcome approach. Specifically explored are: the differences between rates of complications; the costs of complications; and cost comparisons of patient complexity. A retrospective cohort study of an Australian state's 2006–07 public hospital discharge data was used to identify patient episodes for people over age 50 (N=222,440) in which dementia was identified as a primary or secondary diagnosis (N=44,422). Extra costs for patient episodes were estimated based on length of stay (LOS) above the average for each patient's Diagnosis Related Group (DRG) (N=157,178) and were modelled using linear regression analysis to establish the strongest patient-complexity predictors of cost.

Results: Hospitalised patients with a primary or secondary diagnosis of dementia had higher rates of complications than their same-age peers. The highest rates and relative risk for people with dementia were found in four key complications: urinary tract infections, pressure injuries, pneumonia and delirium. While 21.9% of dementia patients (9,751/44,488, p<0.0001) suffered a complication, only 8.8% of non-dementia patients did so (33,501/381,788, p<0.0001), giving dementia patients a relative risk of 2.5 of acquiring a complication (p<0.0001). These four key complications in patients over 50, both with and without dementia, were associated with an eightfold increase in length of stay (813%, or 3.6 days/0.4 days) and a doubling of the estimated mean episode cost (199%, or A$16,403/A$8,240). These four complications were associated with 24.7% of the estimated cost of additional days spent in hospital in 2006–07 in NSW (A$226 million/A$914 million). Dementia patients accounted for 22.0% of these costs (A$49 million/A$226 million) even though they were only 10.4% of the population (44,488/426,276 episodes). Hospital-acquired complications, particularly for people with a comorbidity of dementia, cost more than other kinds of inpatient complexity, but admission severity was a better predictor of excess cost.

Discussion: The four key complications occur more often in older patients with dementia, and their high rate makes them expensive. These complications are potentially preventable. However, the care that can prevent them (such as mobility, hydration, nutrition and communication) is known to be rationed or left unfinished by nurses. Older hospitalised people who have complex needs, such as those with dementia, are more likely to experience care rationing as their care tends to take longer and to be less predictable and less curative in nature. This thesis offers the theoretical proposition that evidence-based nursing practices are rationed for complex older patients and that this rationed care contributes to functional and cognitive decline during hospitalisation, which in turn contributes to the high rates of complications observed. Thus the four key complications can be seen as a ‘Failure to Maintain’ complex older people in hospital. ‘Failure to Maintain’ is the inadequate delivery of essential functional and cognitive care for a complex older person in hospital resulting in a complication, and is recommended as a useful indicator of hospital quality.

Conclusions: When examining extra length of stay in hospital, complications and comorbid dementia are costly. Complications are potentially preventable, and dementia care in hospitals can be improved. Hospitals and governments looking to decrease costs can engage in risk-reduction strategies for common nurse-sensitive complications, such as healthy nursing work environments that minimise nurses’ rationing of functional and cognitive care. The conceptualisation of complex older patients as ‘business as usual’ rather than a ‘burden’ is likely necessary for sustainable health care services of the future. The use of the ‘Failure to Maintain’ indicators at institution and state levels may aid in embedding this approach to complex older patients in health organisations. Ongoing investigation is warranted into the relationships between the largest health services expense (hospitals), the largest hospital population (complex older patients), and the largest hospital expense (nurses). The ‘Failure to Maintain’ quality indicator makes a useful and substantive contribution to further clinical, administrative and research developments.
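As a quick arithmetic check, the headline relative risk follows directly from the complication counts quoted in the abstract:

```python
# Worked check of the relative risk reported in the abstract, using the
# counts quoted there for dementia and non-dementia patient episodes.

dementia_with_complication = 9_751
dementia_episodes = 44_488
other_with_complication = 33_501
other_episodes = 381_788

rate_dementia = dementia_with_complication / dementia_episodes  # ≈ 0.219
rate_other = other_with_complication / other_episodes           # ≈ 0.088

relative_risk = rate_dementia / rate_other
print(f"{rate_dementia:.1%} vs {rate_other:.1%} -> RR ≈ {relative_risk:.1f}")
# Prints roughly "21.9% vs 8.8% -> RR ≈ 2.5", matching the abstract.
```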

Relevance: 10.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Matemática, 2016.

Relevance: 10.00%

Publisher:

Abstract:

Given a 2-manifold triangular mesh \(M \subset {\mathbb {R}}^3\) with border, a parameterization of \(M\) is a FACE or trimmed surface \(F=\{S,L_0,\ldots,L_m\}\). \(F\) is a connected subset or region of a parametric surface \(S\), bounded by a set of LOOPs \(L_0,\ldots,L_m\) such that each \(L_i \subset S\) is a closed 1-manifold having no intersection with the other \(L_j\) LOOPs. The parametric surface \(S\) is a statistical fit of the mesh \(M\). \(L_0\) is the outermost LOOP bounding \(F\) and \(L_i\) is the LOOP of the i-th hole in \(F\) (if any). The problem of parameterizing triangular meshes is relevant for reverse engineering, tool path planning, feature detection, redesign, etc. State-of-the-art mesh procedures parameterize a rectangular mesh \(M\). To improve on such procedures, we report here the implementation of an algorithm which parameterizes meshes \(M\) presenting holes and concavities. We synthesize a parametric surface \(S \subset {\mathbb {R}}^3\) which approximates a superset of the mesh \(M\). Then, we compute a set of LOOPs trimming \(S\), and therefore completing the FACE \(F=\{S,L_0,\ldots,L_m\}\). Our algorithm gives satisfactory results for \(M\) having low Gaussian curvature (i.e., \(M\) being quasi-developable or developable). This assumption is a reasonable one, since \(M\) is the product of manifold segmentation pre-processing. Our algorithm computes: (1) a manifold learning mapping \(\phi : M \rightarrow U \subset {\mathbb {R}}^2\), (2) an inverse mapping \(S: W \subset {\mathbb {R}}^2 \rightarrow {\mathbb {R}}^3\), with \(W\) being a rectangular grid containing and surpassing \(U\). To compute \(\phi\) we test IsoMap, Laplacian Eigenmaps and Hessian locally linear embedding (best results with HLLE). For the back mapping (NURBS) \(S\), the crucial step is to find a control polyhedron \(P\), which is an extrapolation of \(M\). We calculate \(P\) by extrapolating radial basis functions that interpolate points inside \(\phi(M)\). We successfully test our implementation with several datasets that present concavities and holes and are extremely non-developable. Ongoing work is being devoted to manifold segmentation, which facilitates mesh parameterization.
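A minimal sketch of the two-step pipeline described above (flatten the mesh with manifold learning, then fit a smooth back-mapping), using off-the-shelf Isomap and radial basis function interpolation in place of the paper's HLLE and NURBS machinery, with a synthetic point cloud standing in for a segmented mesh:

```python
import numpy as np
from sklearn.manifold import Isomap
from scipy.interpolate import RBFInterpolator

# Sketch of the two-step parameterization, with substitutions: Isomap
# instead of HLLE for the flattening phi, and an RBF interpolant instead
# of a NURBS surface for the back mapping S. A synthetic bumpy patch
# stands in for the segmented mesh M.

rng = np.random.default_rng(1)
uv = rng.uniform(-1.0, 1.0, size=(400, 2))
mesh_points = np.column_stack(
    [uv[:, 0], uv[:, 1], 0.2 * np.sin(2.0 * uv[:, 0]) * np.cos(uv[:, 1])]
)

# (1) phi: M -> U in R^2, a manifold-learning flattening of the mesh.
phi = Isomap(n_neighbors=10, n_components=2)
U = phi.fit_transform(mesh_points)

# (2) S: W in R^2 -> R^3, a smooth map fitted on the flattened points and
# evaluated on a rectangular grid W containing and slightly surpassing U.
S = RBFInterpolator(U, mesh_points, kernel="thin_plate_spline", smoothing=1e-6)

pad = 0.1
w0 = np.linspace(U[:, 0].min() - pad, U[:, 0].max() + pad, 50)
w1 = np.linspace(U[:, 1].min() - pad, U[:, 1].max() + pad, 50)
W = np.column_stack([g.ravel() for g in np.meshgrid(w0, w1)])
surface_points = S(W)  # (2500, 3) points approximating (a superset of) M
print(surface_points.shape)
```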

Relevance: 10.00%

Publisher:

Abstract:

In this report, we survey results on distance magic graphs and some closely related graphs. A distance magic labeling of a graph G with magic constant k is a bijection l from the vertex set to {1, 2, . . . , n}, such that for every vertex x, \(\sum_{y \in N_G(x)} l(y) = k\), where \(N_G(x)\) is the set of vertices of G adjacent to x. If the graph G has a distance magic labeling we say that G is a distance magic graph. In Chapter 1, we explore the background of distance magic graphs by introducing examples of magic squares, magic graphs, and distance magic graphs. In Chapter 2, we begin by examining some basic results on distance magic graphs. We next look at results on different graph structures including regular graphs, multipartite graphs, graph products, join graphs, and splitting graphs. We conclude with other perspectives on distance magic graphs including embedding theorems, the matrix representation of distance magic graphs, lifted magic rectangles, and distance magic constants. In Chapter 3, we study graph labelings that retain the same labels as distance magic labelings, but alter the definition in some other way. These labelings include balanced distance magic labelings, closed distance magic labelings, D-distance magic labelings, and distance antimagic labelings. In Chapter 4, we examine results on neighborhood magic labelings, group distance magic labelings, and group distance antimagic labelings. These graph labelings change the label set, but are otherwise similar to distance magic graphs. In Chapter 5, we examine some applications of distance magic and distance antimagic labeling to the fair scheduling of tournaments. In Chapter 6, we conclude with some open problems.
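To make the definition concrete, here is a small sketch (not taken from the report) that checks whether a given labeling of a graph is distance magic; the 4-cycle below, with opposite vertices labeled to sum to 5, is a standard example with magic constant 5.

```python
# Small checker for the distance magic condition: every vertex's open
# neighborhood must have the same label sum k, and the labels must be a
# bijection onto {1, ..., n}. The example graph/labeling is illustrative.


def is_distance_magic(adjacency: dict[int, set[int]], label: dict[int, int]) -> bool:
    n = len(adjacency)
    if sorted(label.values()) != list(range(1, n + 1)):
        return False  # labels must be a bijection onto {1, ..., n}
    sums = {sum(label[y] for y in adjacency[x]) for x in adjacency}
    return len(sums) == 1  # one common magic constant k


# 4-cycle 0-1-2-3-0, labeled so that opposite vertices pair to 5.
c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
labels = {0: 1, 1: 2, 2: 4, 3: 3}
print(is_distance_magic(c4, labels))  # True, magic constant k = 5
```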

Relevance: 10.00%

Publisher:

Abstract:

Peripheral nerves have demonstrated the ability to bridge gaps of up to 6 mm. Peripheral nervous system injury sites beyond this range require autograft or allograft surgery. Central nervous system cells do not regenerate spontaneously due to intrinsic environmental inhibition. Although stem cell therapy seems to be a promising approach towards nerve repair, it is essential to use the distinct three-dimensional architecture of a cell scaffold, with proper biomolecule embedding, to ensure that the local environment can be controlled well enough for growth and survival. Many approaches have been developed for the fabrication of 3D scaffolds, and more recently, fiber-based scaffolds produced via electrospinning have been garnering increasing interest, as electrospinning offers control over fiber composition as well as fiber mesh porosity with a relatively simple experimental setup. All these attributes make electrospun fibers a new class of promising scaffolds for neural tissue engineering. Therefore, the purpose of this doctoral study is to investigate the use of the novel material PGD and its derivative PGDF for obtaining fiber scaffolds by electrospinning. The performance of these scaffolds, combined with neural lineage cells derived from ESCs, was evaluated by dissolvability testing, Raman spectroscopy, cell viability assays, real-time PCR, immunocytochemistry, extracellular electrophysiology, etc. The newly designed collector makes it possible to easily obtain fibers with adequate length and integrity. The use of solvents such as ethanol and water for electrospinning fibrous scaffolds provides a potentially less toxic and more biocompatible fabrication method. Cell viability testing demonstrated that the addition of gelatin leads to significant improvement of cell proliferation on the scaffolds. Both real-time PCR and immunocytochemistry analyses indicated that motor neuron differentiation was achieved, with high motor neuron gene expression, using the metabolites approach. The addition of fumaric acid into the fiber scaffolds further promoted differentiation. Based on these results, this newly fabricated electrospun fiber scaffold, combined with neural lineage cells, provides a potential alternative strategy for nerve injury repair.

Relevance: 10.00%

Publisher:

Abstract:

Legumes are bee-pollinated, but to differing extents. The importance of the plant–pollinator interplay (PPI) in flowering crops such as legumes lies in a combination of the importance of pollination for the production service and for breeding strategies, plus the increasing urgency of mitigating the decline of pollinators through the development and implementation of conservation measures. To realize the full potential of the PPI, a multidisciplinary approach is required. This article assembles an international team of genebank managers, geneticists, plant breeders, and experts on environmental governance and agro-ecology, and comprises several sections. The contributions in these sections outline both the state of the art of knowledge in the field and the novel aspects under development, and encompass a range of reviews, opinions and perspectives. The first three sections explore the role of PPI in legume breeding strategies. PPI-based approaches to crop improvement can make it possible to adapt and re-design breeding strategies to meet two goals: (1) optimal productivity, based on an efficient use of pollinators, and (2) biodiversity conservation. The next section deals with entomological aspects and focuses on the protection of the “pest control service” and of pollinators in legume crops. The final section addresses general approaches to encourage the synergy between food production and pollination services at the farmer field level. Two basic approaches are proposed: (a) Farming with Alternative Pollinators and (b) Crop Design System.

Relevance: 10.00%

Publisher:

Abstract:

The increasing emphasis on aid effectiveness, accountability and impact measurement in international development and humanitarian work has generated a requirement for high-quality internal systems for the management of programmes. To help address this requirement, Trócaire adopted Results Based Management (RBM) in the 20 countries in which it works. This paper provides an overview of Trócaire’s RBM journey, including the process of embedding the new approach in the organisation, lessons learnt from this process, the subsequent benefits that are emerging at field programme level, and the challenges going forward.