927 results for Algorithmic Complexity


Relevance: 20.00%

Abstract:

In this work we consider several instances of the following problem: "how complicated can the isomorphism relation for countable models be?" Using the Borel reducibility framework, we investigate this question with regard to the space of countable models of particular complete first-order theories. We also investigate to what extent this complexity is mirrored in the number of back-and-forth inequivalent models of the theory. We consider this question for two large and related classes of theories. First, we consider o-minimal theories, showing that if T is o-minimal, then the isomorphism relation is either Borel complete or Borel. Further, if it is Borel, we characterize exactly which values can occur, and when they occur. In all cases Borel completeness implies lambda-Borel completeness for all lambda. Second, we consider colored linear orders, which are (complete theories of) linear orders expanded by countably many unary predicates. We discover the same characterization as with o-minimal theories, taking the same values, with the exception that all finite values except two are possible. We characterize exactly when each possibility occurs, which is similar to the o-minimal case. Additionally, we extend Schirrman's theorem, showing that if the language is finite, then T is countably categorical or Borel complete. As before, in all cases Borel completeness implies lambda-Borel completeness for all lambda.
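For background, the Borel reducibility framework referred to above compares equivalence relations as follows (this is the standard definition, stated here as context rather than taken from the thesis): an equivalence relation E on a standard Borel space X Borel-reduces to F on Y when

```latex
E \le_B F
\iff
\exists\, f\colon X \to Y \text{ Borel such that }
\forall x_1, x_2 \in X:\;
x_1 \mathrel{E} x_2 \Leftrightarrow f(x_1) \mathrel{F} f(x_2).
```

An isomorphism relation is then Borel complete if the isomorphism relation of every class of countable structures Borel-reduces to it.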


This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: an approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible. The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below. We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes.
We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications. Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink. We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. 
Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and give the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and for other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
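To make the connectivity notion and the density measure concrete, here is a minimal Python sketch (the graphs, costs, and function names are illustrative inventions, not the thesis's algorithms; for k = 2 the edge-connectivity condition is checked naively by deleting each edge in turn):

```python
from collections import deque

def connected(adj, u, v, removed=frozenset()):
    """BFS from u to v in adjacency-list graph `adj`, skipping edges in `removed`."""
    seen, queue = {u}, deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return True
        for y in adj[x]:
            if frozenset((x, y)) not in removed and y not in seen:
                seen.add(y)
                queue.append(y)
    return False

def two_edge_connected(adj, u, v):
    """u,v are 2-edge-connected iff they stay connected after deleting any one edge."""
    edges = {frozenset((x, y)) for x in adj for y in adj[x]}
    return all(connected(adj, u, v, removed={e}) for e in edges)

def density(total_cost, nodes):
    """Density of a subgraph: ratio of its cost to the number of vertices."""
    return total_cost / len(nodes)

# A 4-cycle: every pair of vertices is 2-edge-connected.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
# A path 0-1-2: removing edge (0,1) disconnects 0 from 2.
path = {0: [1], 1: [0, 2], 2: [1]}
```

The quadratic delete-and-recheck loop is only a teaching device; real implementations would use max-flow or cut-based routines.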


The farm-gate value of extensive beef production from the northern Gulf region of Queensland, Australia, is ~$150 million annually. Poor profitability and declining equity are common issues for most beef businesses in the region. The beef industry relies primarily on native pasture systems, and studies continue to report a decline in the condition and productivity of important land types in the region. Governments and Natural Resource Management groups are investing significant resources to restore landscape health and productivity. Fundamental community expectations also include broader environmental outcomes, such as reducing beef-industry greenhouse gas emissions. Whole-of-business analysis results are presented from 18 extensive beef businesses (producers) to highlight the complex social and economic drivers of management decisions that impact on natural resources and the environment. Business analysis activities also focussed on improving enterprise performance. Profitability, herd performance and greenhouse emission benchmarks are documented and discussed.


Nitrogen (N) is an essential plant nutrient in maize production, and, considering only natural sources, is often the factor limiting grain yield world-wide. For this reason, many farmers around the world supplement available soil N with synthetic forms. Years of over-application of N fertilizer have led to increased N in groundwater and streams due to leaching and run-off from agricultural sites. In the Midwest Corn Belt much of this excess N eventually makes its way to the Gulf of Mexico, leading to eutrophication (increase of phytoplankton) and a hypoxic (reduced-oxygen) dead zone. Growing concerns about these types of problems and a desire for greater input use efficiency have led to demand for crops with improved N use efficiency (NUE), to allow reduced N fertilizer application rates and subsequently lower N pollution. It is well known that roots are responsible for N uptake by plants, but it is relatively unknown how root architecture affects this ability. This research was conducted to better understand the influence of root complexity (RC) in maize on a plant’s response to N stress, as well as the influence of RC on other above-ground plant traits. Thirty-one above-ground plant traits were measured for 64 recombinant inbred lines (RILs) from the intermated B73 × Mo17 (IBM) population and their backcrosses (BCs) to either parent, B73 and Mo17, under normal (182 kg N ha-1) and N-deficient (0 kg N ha-1) conditions. The RILs were selected based on results from an earlier experiment by Novais et al. (2011), which screened 232 RILs from the IBM to obtain their root complexity measurements. The 64 selected RILs comprised 31 of the lowest-complexity RILs (RC1) and 33 of the highest-complexity RILs (RC2) in terms of root architecture (characterized as fractal dimensions).
The use of the parental BCs classifies the experiment as Design III, an experimental design developed by Comstock and Robinson (1952) which allows for estimation of dominance significance and level. Of the 31 traits measured, 12 were whole plant traits chosen due to their documented response to N stress. The other 19 traits were ear traits commonly measured for their influence on yield. Results showed that genotypes from RC1 and RC2 significantly differ for several above-ground phenotypes. We also observed a difference in the number and magnitude of N treatment responses between the two RC classes. Differences in phenotypic trait correlations and their change in response to N were also observed between the RC classes. RC did not seem to have a strong correlation with calculated NUE (ΔYield/ΔN). Quantitative genetic analysis utilizing the Design III experimental design revealed significant dominance effects acting on several traits as well as changes in significance and dominance level between N treatments. Several QTL were mapped for 26 of the 31 traits and significant N effects were observed across the majority of the genome for some N stress indicative traits (e.g. stay-green). This research and related projects are essential to a better understanding of plant N uptake and metabolism. Understanding these processes is a necessary step in the progress towards the goal of breeding for better NUE crops.
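The NUE measure mentioned above (ΔYield/ΔN) is a simple difference quotient; a toy calculation makes it concrete (the yield figures below are invented for illustration, not data from this study):

```python
def nitrogen_use_efficiency(yield_high_n, yield_zero_n, n_rate_high, n_rate_zero=0.0):
    """NUE as the yield response per unit of applied N: ΔYield / ΔN."""
    return (yield_high_n - yield_zero_n) / (n_rate_high - n_rate_zero)

# Hypothetical plot yields (kg grain / ha) under 182 and 0 kg N / ha.
nue = nitrogen_use_efficiency(yield_high_n=9500.0, yield_zero_n=6200.0,
                              n_rate_high=182.0)
# nue is about 18.1 kg grain per kg of N applied
```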


Purpose: Current thinking about ‘patient safety’ emphasises the causal relationship between the work environment and the delivery of clinical care. This research draws on the theory of Normal Accidents to extend this analysis and better understand the ‘organisational factors’ that threaten safety. Methods: Ethnographic research methods were used, with observations of the operating department setting for 18 months and interviews with 80 members of hospital staff. The setting for the study was the Operating Department of a large teaching hospital in the North-West of England. Results: The work of the operating department is determined by interdependent, ‘tightly coupled’ organisational relationships between hospital departments based upon the timely exchange of information, services and resources required for the delivery of care. Failures within these processes, manifest as ‘breakdowns’ within inter-departmental relationships, lead to situations of constraint, rapid change and uncertainty in the work of the operating department that require staff to break with established routines and work under increased time and emotional pressures. This means that staff focus on working quickly, as opposed to working safely. Conclusion: Analysis of safety needs to move beyond a focus on the immediate work environment and individual practice, to consider the more complex and deeply structured organisational systems of hospital activity. For departmental managers the scope for service planning to control for safety may be limited, as the structured ‘real world’ situation of service delivery is shaped by inter-departmental and organisational factors that are perhaps beyond the scope of departmental management.


Completed under joint supervision (cotutelle) with the École normale supérieure de Cachan – Université Paris-Saclay




In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. At both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity significantly decreased with increasing methane concentrations, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid-fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation of consumers and the structuring role of competitive interactions, leading to an overall reduction of food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. 
According to their biological activities, foundation species may have the potential to partly relieve the competitive pressure within communities of low fluid-flux habitats. Finally, ecosystem functioning in vents and seeps was highly similar despite environmental differences (e.g. physico-chemistry, dominant basal sources), suggesting that ecological niches are not specifically linked to the nature of fluids. This comparison of seep and vent functioning in the Guaymas Basin thus provides further support for the hypothesis of continuity among deep-sea chemosynthetic ecosystems.


This article deals with climate change from a linguistic perspective. Climate change is an extremely complex issue that has exercised the minds of experts and policy makers with renewed urgency in recent years. It has prompted an explosion of writing in the media, on the internet and in the domain of popular science and literature, as well as a proliferation of new compounds around the word ‘carbon’ as a hub, such as ‘carbon indulgence’, a new compound that will be studied in this article. Through a linguistic analysis of lexical and discourse formations around such ‘carbon compounds’ we aim to contribute to a broader understanding of the meaning of climate change. Lexical carbon compounds are used here as indicators for observing how human symbolic cultures change and adapt in response to environmental threats and how symbolic innovation and transmission occurs.


To tackle challenges in circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms for finding efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first transceiver-insertion algorithmic framework for optical interconnects. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart-home area for improving health, wellbeing and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy-consumption scheduling for smart homes: the extended scheme schedules household appliances so as to minimize a customer's monetary expense under a time-varying pricing model.
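The appliance-scheduling idea in the last sentence can be sketched as a toy greedy procedure; the tariff, appliance parameters, and simplifying assumptions below are invented for illustration, and the dissertation's actual scheme handles far richer constraints:

```python
def schedule_appliances(prices, appliances):
    """Greedy sketch: run each appliance in its cheapest hours.

    prices     -- price per kWh for each hour of the day (length 24)
    appliances -- dict name -> (kwh_per_hour, hours_needed)
    Returns (schedule, total_cost). Assumes appliances may run in any
    hours and do not contend for capacity -- a deliberate simplification.
    """
    order = sorted(range(len(prices)), key=lambda h: prices[h])
    schedule, total = {}, 0.0
    for name, (kwh, hours_needed) in appliances.items():
        hours = order[:hours_needed]              # cheapest hours first
        schedule[name] = sorted(hours)
        total += sum(prices[h] * kwh for h in hours)
    return schedule, total

# Hypothetical valley tariff: cheap between 00:00 and 06:00.
prices = [0.10] * 6 + [0.30] * 18
appliances = {"dishwasher": (1.5, 2), "washer": (0.8, 1)}
plan, cost = schedule_appliances(prices, appliances)
```

Under this tariff the greedy rule places every run in the cheap pre-dawn window; a real scheduler would also model deadlines, task precedence, and household capacity limits.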


In this paper, we consider Preference Inference based on a generalised form of Pareto order. Preference Inference aims at reasoning over an incomplete specification of user preferences. We focus on two problems. The Preference Deduction Problem (PDP) asks if another preference statement can be deduced (with certainty) from a set of given preference statements. The Preference Consistency Problem (PCP) asks if a set of given preference statements is consistent, i.e., the statements do not contradict each other. Here, preference statements are direct comparisons between alternatives (strict and non-strict). It is assumed that a set of evaluation functions is known by which all alternatives can be rated. We consider Pareto models, which induce order relations on the set of alternatives in a Pareto manner, i.e., one alternative is preferred to another only if it is preferred on every component of the model. We describe characterisations for deduction and consistency based on an analysis of the set of evaluation functions, and present algorithmic solutions and complexity results for PDP and PCP, based on Pareto models in general and for a special case. Furthermore, a comparison shows that inference based on Pareto models is less cautious than some other well-known types of preference model.
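The Pareto order described above ("preferred on every component") can be sketched in a few lines; the evaluation functions and alternatives below are invented, and larger values are taken to be better:

```python
def pareto_prefers(a, b, evals, strict=True):
    """a is Pareto-preferred to b iff a is at least as good as b under
    every evaluation function, and, for a strict preference, strictly
    better under at least one."""
    if not all(f(a) >= f(b) for f in evals):
        return False
    return any(f(a) > f(b) for f in evals) if strict else True

# Hypothetical alternatives rated by two evaluation functions.
evals = [lambda x: x["speed"], lambda x: x["comfort"]]
car  = {"speed": 8, "comfort": 6}
bike = {"speed": 4, "comfort": 6}
bus  = {"speed": 5, "comfort": 9}
```

Here car is preferred to bike (at least as good everywhere, strictly faster), while car and bus are Pareto-incomparable, since each beats the other on one component.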


Background: This study is part of an interactive improvement intervention aimed at facilitating empowerment-based chronic kidney care, using data from persons with CKD and their family members. There are many challenges to implementing empowerment-based care, and it is therefore necessary to study the implementation process. The aim of this study was to generate knowledge regarding the implementation process of an improvement intervention of empowerment for those who require chronic kidney care. Methods: A prospective single qualitative case study was chosen to follow the process of the implementation over a two-year period. Twelve healthcare professionals were selected based on their various roles in the implementation of the improvement intervention. Data collection comprised digitally recorded project-group meetings, field notes of the meetings, and individual interviews before and after the improvement project. These data were analyzed using qualitative latent content analysis. Results: Two facilitator themes emerged: Moving spirit and Encouragement. The healthcare professionals described a willingness to individualize care and to increase their professional development in the field of chronic kidney care. The implementation process was strongly reinforced both by the researchers working interactively with the staff and by the project group. One theme emerged as a barrier: the Limitations of the organization. Changes in the organization hindered the implementation of the intervention throughout the study period, and the lack of interplay in the organization most impeded the process. Conclusions: The findings indicated the complexity of maintaining a sustainable and lasting implementation over a period of two years. Implementing empowerment-based care was found to be facilitated by the cooperation between all involved healthcare professionals.
Furthermore, long-term improvement interventions need strong encouragement from all levels of the organization to maintain engagement, even when it is initiated by the health care professionals themselves.


Research poster about indexing theory


Self-replication and compartmentalization are two central properties thought to be essential for minimal life, and understanding how such processes interact in the emergence of complex reaction networks is crucial to exploring the development of complexity in chemistry and biology. Autocatalysis can emerge from multiple different mechanisms such as formation of an initiator, template self-replication and physical autocatalysis (where micelles formed from the reaction product solubilize the reactants, leading to higher local concentrations and therefore higher rates). Amphiphiles are also used in artificial life studies to create protocell models such as micelles, vesicles and oil-in-water droplets, and can increase reaction rates by encapsulation of reactants. So far, no template self-replicator exists which is capable of compartmentalization, or transferring this molecular scale phenomenon to micro or macro-scale assemblies. Here a system is demonstrated where an amphiphilic imine catalyses its own formation by joining a non-polar alkyl tail group with a polar carboxylic acid head group to form a template, which was shown to form reverse micelles by Dynamic Light Scattering (DLS). The kinetics of this system were investigated by 1H NMR spectroscopy, showing clearly that a template self-replication mechanism operates, though there was no evidence that the reverse micelles participated in physical autocatalysis. Active oil droplets, composed from a mixture of insoluble organic compounds in an aqueous sub-phase, can undergo processes such as division, self-propulsion and chemotaxis, and are studied as models for minimal cells, or protocells. Although in most cases the Marangoni effect is responsible for the forces on the droplet, the behaviour of the droplet depends heavily on the exact composition. 
Though theoretical models are able to calculate the forces on a droplet, to model a mixture of oils on an aqueous surface where compounds from the oil phase are dissolving and diffusing through the aqueous phase is beyond current computational capability. The behaviour of a droplet in an aqueous phase can only be discovered through experiment, though it is determined by the droplet's composition. By using an evolutionary algorithm and a liquid handling robot to conduct droplet experiments and decide which compositions to test next, entirely autonomously, the composition of the droplet becomes a chemical genome capable of evolution. The selection is carried out according to a fitness function, which ranks the formulation based on how well it conforms to the chosen fitness criteria (e.g. movement or division). Over successive generations, significant increases in fitness are achieved, and this increase is higher with more components (i.e. greater complexity). Other chemical processes such as chemiluminescence and gelation were investigated in active oil droplets, demonstrating the possibility of controlling chemical reactions by selective droplet fusion. Potential future applications for this might include combinatorial chemistry, or additional fitness goals for the genetic algorithm. Combining the self-replication and the droplet protocells research, it was demonstrated that the presence of the amphiphilic replicator lowers the interfacial tension between droplets of a reaction mixture in organic solution and the alkaline aqueous phase, causing them to divide. Periodic sampling by a liquid handling robot revealed that the extent of droplet fission increased as the reaction progressed, producing more individual protocells with increased self-replication. This demonstrates coupling of the molecular scale phenomenon of template self-replication to a macroscale physicochemical effect.
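The closed-loop evolution described above can be caricatured in a few lines of Python. The fitness function below is a stand-in for a robot-measured droplet behaviour (in the real system each evaluation is a physical experiment), and all parameters are invented; with elitist selection the best fitness is non-decreasing across generations by construction:

```python
import random

def normalise(comp):
    """Scale a composition so its component fractions sum to 1."""
    s = sum(comp)
    return [c / s for c in comp]

def fitness(comp):
    """Stand-in for a measured behaviour (e.g. droplet movement);
    the real value would come from a robot-run experiment."""
    a, b, c, d = comp
    return a * b + 0.5 * c - (d - 0.2) ** 2

def evolve(pop_size=20, n_components=4, generations=15, seed=1):
    rng = random.Random(seed)
    pop = [normalise([rng.random() for _ in range(n_components)])
           for _ in range(pop_size)]
    best_per_gen = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        best_per_gen.append(fitness(pop[0]))
        parents = pop[: pop_size // 2]            # truncation selection
        children = [normalise([max(1e-6, x + rng.gauss(0, 0.05)) for x in p])
                    for p in parents]             # Gaussian mutation
        pop = parents + children                  # elitism: parents survive
    return best_per_gen

history = evolve()
```

Because the fittest formulations are carried into the next generation unchanged, the recorded best fitness can only stay level or improve, mirroring the fitness increases reported over successive generations above.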