1000 results for "Redistricting problems"


Relevance: 20.00%

Abstract:

The analysis of sequential data is required in many diverse areas such as telecommunications, stock market analysis, and bioinformatics. A basic problem related to the analysis of sequential data is the sequence segmentation problem. A sequence segmentation is a partition of the sequence into a number of non-overlapping segments that cover all data points, such that each segment is as homogeneous as possible. This problem can be solved optimally using a standard dynamic programming algorithm. In the first part of the thesis, we present a new approximation algorithm for the sequence segmentation problem. This algorithm has a smaller running time than the optimal dynamic programming algorithm, while providing a bounded approximation ratio. The basic idea is to divide the input sequence into subsequences, solve the problem optimally in each subsequence, and then appropriately combine the solutions to the subproblems into one final solution. In the second part of the thesis, we study alternative segmentation models that are devised to better fit the data. More specifically, we focus on clustered segmentations and segmentations with rearrangements. While in the standard segmentation of a multidimensional sequence all dimensions share the same segment boundaries, in a clustered segmentation the multidimensional sequence is segmented in such a way that dimensions are allowed to form clusters. Each cluster of dimensions is then segmented separately. We formally define the problem of clustered segmentations and we experimentally show that segmenting sequences using this segmentation model leads to solutions with smaller error for the same model cost. Segmentation with rearrangements is a novel variation of the segmentation problem: in addition to partitioning the sequence, we also seek to apply a limited amount of reordering so that the overall representation error is minimized. We formulate the problem of segmentation with rearrangements and we show that it is NP-hard to solve or even to approximate. We devise effective algorithms for the proposed problem, combining ideas from dynamic programming and from outlier detection in sequences. In the final part of the thesis, we discuss the problem of aggregating the results of segmentation algorithms on the same set of data points. In this case, we are interested in producing a partitioning of the data that agrees as much as possible with the input partitions. We show that this problem can be solved optimally in polynomial time using dynamic programming. Furthermore, we show that not all data points are candidates for segment boundaries in the optimal solution.
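
For reference, the optimal baseline mentioned above can be sketched as follows. This is a minimal illustration (illustrative names and data, not the thesis's code) of the standard O(n^2 k) dynamic program that partitions a one-dimensional sequence into k contiguous segments under squared error, with each segment represented by its mean.

```python
# Optimal k-segmentation of a 1-D sequence by dynamic programming (O(n^2 * k)).
import numpy as np

def segment(x, k):
    """Partition x into k contiguous segments minimising total squared error."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Prefix sums allow the error of any segment x[i:j] to be evaluated in O(1).
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def seg_err(i, j):  # squared error of segment x[i:j] (j exclusive)
        m, tot, tot_sq = j - i, s1[j] - s1[i], s2[j] - s2[i]
        return tot_sq - tot * tot / m

    err = np.full((k + 1, n + 1), np.inf)  # err[p, j]: best error for x[:j] with p segments
    cut = np.zeros((k + 1, n + 1), dtype=int)
    err[0, 0] = 0.0
    for p in range(1, k + 1):
        for j in range(p, n + 1):
            for i in range(p - 1, j):  # the last segment is x[i:j]
                cand = err[p - 1, i] + seg_err(i, j)
                if cand < err[p, j]:
                    err[p, j], cut[p, j] = cand, i
    # Recover the segment boundaries by backtracking through the cut table.
    bounds, j = [], n
    for p in range(k, 0, -1):
        i = int(cut[p, j])
        bounds.append((i, j))
        j = i
    return float(err[k, n]), bounds[::-1]

print(segment([1, 1, 1, 5, 5, 5, 9, 9], k=3))  # -> (0.0, [(0, 3), (3, 6), (6, 8)])
```

The approximation algorithm presented in the thesis avoids the full quadratic cost by running such an optimal routine only on subsequences of the input and then combining the partial solutions.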

Relevance: 20.00%

Abstract:

This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network, simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating-dominating codes are more appropriate. This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating-dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs; these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating-dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
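
As a point of reference for the first task, a max-min linear program of the kind that arises in lifetime maximisation is commonly written in the following form (a standard formulation from the literature, not necessarily the thesis's exact notation):

```latex
\begin{aligned}
\text{maximise}   \quad & \omega \\
\text{subject to} \quad & A x \le \mathbf{1}, \\
                        & C x \ge \omega \mathbf{1}, \\
                        & x \ge 0,
\end{aligned}
```

where A and C are non-negative matrices: the rows of A encode resource (e.g. energy) budgets that the variables x must respect, and the rows of C encode the utilities whose minimum value ω is being maximised.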

Relevance: 20.00%

Abstract:

The idea of ‘wicked’ problems has made a valuable contribution to recognising the complexity and challenges of contemporary planning. However, some wicked policy problems are further complicated by a significant moral, psychological, religious or cultural dimension. This is particularly the case for problems that possess strong elements of abjection and symbolic pollution and high degrees of psychosocial sensitivity. Because this affects the way these problems are framed and discussed, they are also characterised by high levels of verbal proscription. As a result, they are not discussed in the rational and emotion-free way that conventional planning demands and can become obscured or inadequately acknowledged in planning processes. This further contributes to their wickedness and intractability. Through paradigmatic urban planning examples, we argue that placing their unspeakable nature at the forefront of enquiry will enable planners to advocate for a more contextually and culturally situated approach to planning, which accommodates both emotional and embodied talk alongside more technical policy contributions. Re-imagining wicked problems in this way has the potential to enhance policy and plan-making, to disrupt norms and expose their contingency, and to open new ways of planning for both the unspeakable and the merely wicked.

Relevance: 20.00%

Abstract:

The aim of the present study is to analyze Confucian understandings of the Christian doctrine of salvation in order to identify the basic problems in the Confucian-Christian dialogue. I will approach the task via a systematic theological analysis of four issues, in order to limit the thesis to an appropriate size. They are analyzed in three chapters as follows: 1. The Confucian concept concerning the existence of God; here I discuss mainly the assimilation of the Christian concept of God to the concepts of Sovereign on High (Shangdi) and Heaven (Tian) in Confucianism. 2. The Confucian understanding of the object of salvation and its status in Christianity. 3. The Confucian understanding of the means of salvation in Christianity. Before beginning this analysis, it is necessary to clarify the vast variety of controversies, arguments, ideas, opinions and comments expressed in the name of Confucianism; thus, clear distinctions among the different schools of Confucianism are given in chapter 2. In the last chapter I discuss the results of my research by pointing out the basic problems that emerge from the analysis. The results of the present study provide conclusions in three related areas: the tacit differences in the ways of thinking between Confucians and Christians, the basic problems of the Confucian-Christian dialogue, and the affirmative elements in the dialogue. In addition to a summary, a bibliography and an index, there are eight appendices that provide important background information for understanding the present study.

Relevance: 20.00%

Abstract:

In this 'Summary Guidance for Daily Practice', we describe the basic principles of prevention and management of foot problems in persons with diabetes. This summary is based on the International Working Group on the Diabetic Foot (IWGDF) Guidance 2015. There are five key elements that underpin prevention of foot problems: (1) identification of the at-risk foot; (2) regular inspection and examination of the at-risk foot; (3) education of the patient, family and healthcare providers; (4) routine wearing of appropriate footwear; and (5) treatment of pre-ulcerative signs. Healthcare providers should follow a standardized and consistent strategy for evaluating a foot wound, as this will guide further evaluation and therapy. The following items must be addressed: type, cause, site and depth, and signs of infection. There are seven key elements that underpin ulcer treatment: (1) relief of pressure and protection of the ulcer; (2) restoration of skin perfusion; (3) treatment of infection; (4) metabolic control and treatment of co-morbidity; (5) local wound care; (6) education of the patient and relatives; and (7) prevention of recurrence. Finally, successful efforts to prevent and manage foot problems in diabetes depend upon a well-organized team that uses a holistic approach, in which the ulcer is seen as a sign of multi-organ disease, and that integrates the various disciplines involved.

Relevance: 20.00%

Abstract:

Foot problems complicating diabetes are a source of major patient suffering and societal costs. Investing in evidence-based, internationally appropriate diabetic foot care guidance is likely among the most cost-effective forms of healthcare expenditure, provided it is goal-focused and properly implemented. The International Working Group on the Diabetic Foot (IWGDF) has been publishing and updating international Practical Guidelines since 1999. The 2015 updates are based on systematic reviews of the literature, and recommendations are formulated using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system. Accordingly, we changed the name from 'Practical Guidelines' to 'Guidance'. In this article we describe the development of the 2015 IWGDF Guidance documents on prevention and management of foot problems in diabetes. This Guidance consists of five documents, prepared by five working groups of international experts. These documents provide guidance related to foot complications in persons with diabetes on: prevention; footwear and offloading; peripheral artery disease; infections; and wound healing interventions. Based on these five documents, the IWGDF Editorial Board produced a summary guidance for daily practice. The result of this process, after review of all documents by the Editorial Board and by international IWGDF members, is an evidence-based global consensus on prevention and management of foot problems in diabetes. Plans are already under way to implement this Guidance. We believe that following the recommendations of the 2015 IWGDF Guidance will almost certainly result in improved management of foot problems in persons with diabetes and a subsequent worldwide reduction in the tragedies caused by these foot problems.

Relevance: 20.00%

Abstract:

An error-free computational approach is employed for finding the integer solution to a system of linear equations, using finite-field arithmetic. This approach is also extended to find the optimum solution for linear inequalities, such as those arising in interval linear programming problems.
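
As a small illustration of the idea of error-free arithmetic (a hypothetical example, not the paper's actual algorithm), the sketch below solves a linear system exactly over the finite field GF(p), so no floating-point round-off can occur; recovering the true integer or rational solution from results modulo several primes (e.g. via the Chinese Remainder Theorem) is omitted.

```python
# Exact solution of A x = b with all arithmetic in GF(p).
P = 10**9 + 7  # an arbitrary prime modulus (hypothetical choice)

def solve_mod_p(A, b, p=P):
    """Gauss-Jordan elimination on the augmented matrix [A | b], modulo p."""
    n = len(A)
    M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
    for col in range(n):
        # Choose a pivot row with a nonzero entry in this column and move it up.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], -1, p)  # modular inverse of the pivot (Python 3.8+)
        M[col] = [(v * inv) % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(vr - f * vc) % p for vr, vc in zip(M[r], M[col])]
    return [row[n] for row in M]  # the solution x, reduced modulo p

# Example: the integer solution of [[2, 1], [1, 3]] x = [5, 10] is x = [1, 3].
print(solve_mod_p([[2, 1], [1, 3]], [5, 10]))  # -> [1, 3]
```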

Relevance: 20.00%

Abstract:

The remarkable advances made in recombinant DNA technology over the last two decades have paved the way for the use of gene transfer to treat human diseases. Several protocols have been developed for the introduction and expression of genes in humans, but clinical efficacy has not been conclusively demonstrated in any of them. The eventual success of gene therapy for genetic and acquired disorders depends on the development of better gene transfer vectors for sustained, long-term expression of foreign genes, as well as a better understanding of the pathophysiology of human diseases. It is heartening to note that some of the gene therapy protocols have found other applications, such as genetic immunization or DNA vaccines, which are being heralded as the third vaccine revolution. Gene therapy is yet to become a dream come true, but there is light at the end of the tunnel.

Relevance: 20.00%

Abstract:

Site-specific geotechnical data are always random and variable in space. In the present study, a procedure for quantifying the variability in geotechnical characterization and design parameters is discussed, using site-specific cone tip resistance data (qc) obtained from the static cone penetration test (SCPT). The parameters for spatial variability modeling of geotechnical parameters, namely (i) the existing trend function in the in situ qc data; (ii) second-moment statistics, i.e. the mean, variance, and autocorrelation structure of the soil strength and stiffness parameters; and (iii) inputs from the spatial correlation analysis, are utilized in the numerical modeling procedures using the finite difference code FLAC 5.0. The influence of considering spatially variable soil parameters on reliability-based geotechnical design is studied for two cases: (a) bearing capacity analysis of a shallow foundation resting on a clayey soil, and (b) analysis of the stability and deformation pattern of a cohesive-frictional soil slope. The study highlights the procedure for conducting a site-specific study using field test data such as SCPT in geotechnical analysis and demonstrates that a few additional computations involving soil variability provide better insight into the role of variability in designs.
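
The following is a minimal sketch of the kind of second-moment characterisation described above, using hypothetical data and illustrative names rather than the study's actual SCPT records: the qc profile is detrended with depth, and the residual's mean, variance and sample autocorrelation are computed, which are typical inputs to a spatial-variability model.

```python
# Detrending a cone tip resistance profile and estimating second-moment statistics.
import numpy as np

def second_moment_stats(depth, qc, max_lag=20):
    """Return detrended residuals with their mean, variance and autocorrelation."""
    # (i) linear trend of q_c with depth, removed to obtain a (nearly) stationary residual
    slope, intercept = np.polyfit(depth, qc, deg=1)
    resid = qc - (slope * depth + intercept)
    # (ii) second-moment statistics of the residual
    mean, var = resid.mean(), resid.var()
    # (iii) sample autocorrelation for the first `max_lag` separation lags
    acf = np.array([np.corrcoef(resid[:-k], resid[k:])[0, 1]
                    for k in range(1, max_lag + 1)])
    return resid, mean, var, acf

# Hypothetical profile: 100 readings at 0.1 m spacing, linear trend plus noise.
rng = np.random.default_rng(0)
depth = np.arange(100) * 0.1
qc = 2.0 + 1.5 * depth + rng.normal(0.0, 0.4, size=100)
_, mean, var, acf = second_moment_stats(depth, qc)
print(round(var, 3), acf[:3].round(2))
```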

Relevance: 20.00%

Abstract:

This note points out the fallacy in the method given by Sharma and Swarup, in their paper on the time-minimising transportation problem, for determining the set S_{hk} of all nonbasic cells which, when introduced into the basis, would either eliminate a given basic cell (h, k) from the basis or reduce the amount x_{hk}.

Relevance: 20.00%

Abstract:

In the spectral stochastic finite element method for analyzing an uncertain system, the uncertainty is represented by a set of random variables, and a quantity of interest such as the system response is considered as a function of these random variables. Consequently, the underlying Galerkin projection yields a block system of deterministic equations where the blocks are sparse but coupled. The solution of this algebraic system of equations rapidly becomes challenging when the size of the physical system and/or the level of uncertainty is increased. This paper addresses this challenge by presenting a preconditioned conjugate gradient method for such block systems, where the preconditioning step is based on the dual-primal finite element tearing and interconnecting (FETI-DP) method equipped with a Krylov subspace reuse technique for accelerating the iterative solution of systems with multiple and repeated right-hand sides. Preliminary performance results on a Linux cluster suggest that the proposed solution method is numerically scalable and demonstrate its potential for making the uncertainty quantification of realistic systems tractable.
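
For orientation, the sketch below shows a generic preconditioned conjugate gradient iteration. The paper's preconditioner is FETI-DP based; here a simple Jacobi (diagonal) preconditioner and a small illustrative system stand in.

```python
# Generic preconditioned conjugate gradient (PCG) iteration.
import numpy as np

def pcg(A, b, M_solve, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive definite A; M_solve(r) applies the
    preconditioner inverse to a residual vector r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small illustrative SPD system with a Jacobi (diagonal) preconditioner.
A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = pcg(A, b, M_solve=lambda r: r / np.diag(A))
print(np.allclose(A @ x, b))  # -> True
```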

Relevance: 20.00%

Abstract:

We propose a self-regularized pseudo-time marching strategy for ill-posed, nonlinear inverse problems involving the recovery of system parameters given partial and noisy measurements of the system response. While various regularized Newton methods are popularly employed to solve these problems, the resulting solutions are known to depend sensitively upon the noise intensity in the data and on the regularization parameters, an optimal choice for which remains a tricky issue. Through limited numerical experiments on a couple of parameter reconstruction problems, one involving the identification of a truss bridge and the other related to imaging soft-tissue organs for early detection of cancer, we demonstrate the superior features of the pseudo-time marching schemes.
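
A toy illustration of the general idea of pseudo-time marching (invented model, names and data; not the authors' scheme): parameters theta of a hypothetical forward model G are recovered from noisy data d by marching the misfit gradient in pseudo-time, theta <- theta + dt * J^T (d - G(theta)), instead of solving a regularized Newton system at every step.

```python
# Pseudo-time (gradient-flow) recovery of parameters for a toy forward model.
import numpy as np

def G(theta, t):
    """Hypothetical forward model: a damped oscillation with unknowns (a, w)."""
    a, w = theta
    return a * np.exp(-t) * np.sin(w * t)

def jacobian(theta, t):
    a, w = theta
    return np.stack([np.exp(-t) * np.sin(w * t),            # dG/da
                     a * t * np.exp(-t) * np.cos(w * t)],   # dG/dw
                    axis=1)

def pseudo_time_march(d, t, theta0, dt=0.02, steps=3000):
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        r = d - G(theta, t)                          # data misfit
        theta += dt * jacobian(theta, t).T @ r       # explicit pseudo-time step
    return theta

t = np.linspace(0.0, 4.0, 200)
rng = np.random.default_rng(1)
d = G(np.array([2.0, 3.0]), t) + rng.normal(0.0, 0.01, t.size)  # noisy measurements
print(pseudo_time_march(d, t, theta0=[1.8, 2.8]).round(2))      # approximately [2. 3.]
```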