831 results for computational complexity


Relevance:

20.00%

Publisher:

Abstract:

Cross-Flow, Radial Jets Mixing, Temperature Homogenization, Optimization, Combustion Chamber, CFD

Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Mathematics, Dissertation, 2015

Relevance:

20.00%

Publisher:

Abstract:

Despite the huge increase in processor and interprocessor network performance, many computational problems remain unsolved due to the lack of critical resources such as sustained floating-point performance and memory bandwidth. Examples of these problems are found in areas such as climate research, biology, astrophysics, high energy physics (Monte Carlo simulations) and artificial intelligence, among others. For some of these problems, the computing resources of a single supercomputing facility can fall one or two orders of magnitude short of the resources needed to solve them. Supercomputer centers face an increasing demand for processing performance, with the direct consequence of a growing number of processors and systems, resulting in more difficult administration of HPC resources and the need for more physical space, higher electrical power consumption and improved air conditioning, among other problems. Some of these problems cannot easily be solved, so grid computing, understood as a technology enabling the aggregation and consolidation of computing power, can help in solving large-scale supercomputing problems. In this document, we describe how two supercomputing facilities in Spain joined their resources to solve a problem of this kind. The objectives of this experience were, among others, to demonstrate that such cooperation can enable the solution of larger problems and to measure the efficiency that could be achieved. We present some preliminary results of this experience and discuss to what extent these objectives were achieved.

Relevance:

20.00%

Publisher:

Abstract:

We say the endomorphism problem is solvable for an element W in a free group F if it can be decided effectively whether, given U in F, there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. We prove that, when W is a two-generator word, the approach yields an algorithm that solves the endomorphism problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other side.
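
As a concrete illustration (not the Edmunds-Sims algorithm itself), the sketch below shows the verification half of the problem: given candidate images for the generators, it applies the induced endomorphism to W by substitution and free reduction and checks whether the result equals U. The integer word encoding and the helper names are assumptions made only for this example.

# Illustrative sketch only: applying a candidate endomorphism of a free group
# and checking whether it sends W to U. Words are tuples of nonzero ints,
# where k stands for the k-th generator and -k for its inverse (assumed encoding).

def free_reduce(word):
    """Cancel adjacent inverse pairs until the word is freely reduced."""
    out = []
    for letter in word:
        if out and out[-1] == -letter:
            out.pop()          # x x^-1 cancels
        else:
            out.append(letter)
    return tuple(out)

def apply_endomorphism(images, word):
    """Substitute each generator by its image and freely reduce the result."""
    result = []
    for letter in word:
        img = images[abs(letter)]
        result.extend(img if letter > 0 else [-x for x in reversed(img)])
    return free_reduce(result)

def sends_w_to_u(images, w, u):
    """Check whether the endomorphism defined by `images` maps w to u."""
    return apply_endomorphism(images, w) == free_reduce(u)

# Example in F_2 = <a, b>: the endomorphism a -> ab, b -> b sends a b^-1 to a.
images = {1: (1, 2), 2: (2,)}                 # a -> ab, b -> b
print(sends_w_to_u(images, (1, -2), (1,)))    # True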

Relevance:

20.00%

Publisher:

Abstract:

Here we describe the results of some computational explorations in Thompson's group F. We describe experiments to estimate the cogrowth of F with respect to its standard finite generating set, designed to address the subtle and difficult question of whether or not Thompson's group is amenable. We also describe experiments to estimate the exponential growth rate of F and the rate of escape of symmetric random walks with respect to the standard generating set.
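
To make the rate-of-escape experiments concrete, here is a minimal Monte Carlo sketch under simplifying assumptions: it estimates E[|X_n|]/n for a symmetric random walk, using free-group word length (via free reduction) as a stand-in length oracle. The experiments described in the abstract require word length in Thompson's group F itself, which is not implemented here.

import random

# Minimal Monte Carlo sketch of the rate of escape E[|X_n|]/n of a symmetric
# random walk. Word length here is free-group word length, a stand-in oracle,
# not word length in Thompson's group F.

def free_reduce_length(letters):
    """Length of the freely reduced word over a free generating set."""
    stack = []
    for x in letters:
        if stack and stack[-1] == -x:
            stack.pop()
        else:
            stack.append(x)
    return len(stack)

def estimate_rate_of_escape(rank=2, steps=1000, samples=200, seed=0):
    """Average |X_n|/n over independent walks of `steps` uniform generator moves."""
    rng = random.Random(seed)
    alphabet = [g for k in range(1, rank + 1) for g in (k, -k)]
    total = 0.0
    for _ in range(samples):
        walk = [rng.choice(alphabet) for _ in range(steps)]
        total += free_reduce_length(walk) / steps
    return total / samples

print(estimate_rate_of_escape())   # close to 0.5 for the free group of rank 2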

Relevance:

20.00%

Publisher:

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Publisher:

Abstract:

The Whitehead minimization problem consists in finding a minimum size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a finite rank free group. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependency in the rank of the free group. It follows that the primitivity problem – to decide whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.
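
As an illustration of the general length-descent strategy only (a simplified sketch, not the fully polynomial algorithm of the abstract, which relies on the full set of Whitehead automorphisms), the code below greedily applies elementary Nielsen automorphisms to a word and keeps any substitution that shortens it.

# Simplified sketch of greedy length descent under elementary Nielsen
# automorphisms (x_i -> x_i^-1, x_i -> x_i x_j^±1, x_i -> x_j^±1 x_i).
# Illustrative only; this is not the Whitehead-automorphism procedure referred
# to in the abstract. Words are tuples of nonzero ints: k is the k-th
# generator, -k its inverse (assumed encoding).

def free_reduce(word):
    out = []
    for x in word:
        if out and out[-1] == -x:
            out.pop()
        else:
            out.append(x)
    return tuple(out)

def apply_auto(images, word):
    """Apply the automorphism given by generator images to a word."""
    res = []
    for x in word:
        img = images[abs(x)]
        res.extend(img if x > 0 else [-g for g in reversed(img)])
    return free_reduce(res)

def elementary_autos(rank):
    """Yield generator-image dicts for elementary Nielsen automorphisms."""
    identity = {k: (k,) for k in range(1, rank + 1)}
    for i in range(1, rank + 1):
        inv = dict(identity); inv[i] = (-i,)
        yield inv
        for j in range(1, rank + 1):
            if i == j:
                continue
            for e in (j, -j):
                right = dict(identity); right[i] = (i, e)
                yield right
                left = dict(identity); left[i] = (e, i)
                yield left

def greedy_minimize(word, rank):
    """Repeatedly apply any elementary automorphism that shortens the word."""
    word = free_reduce(word)
    improved = True
    while improved:
        improved = False
        for images in elementary_autos(rank):
            candidate = apply_auto(images, word)
            if len(candidate) < len(word):
                word, improved = candidate, True
                break
    return word

print(greedy_minimize((1, 2, 1, 2), 2))   # shortens abab to (1, 1), i.e. a^2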

Relevance:

20.00%

Publisher:

Abstract:

Neuroblastoma (NB) is a neural crest-derived childhood tumor characterized by a remarkable phenotypic diversity, ranging from spontaneous regression to fatal metastatic disease. Although the cancer stem cell (CSC) model provides a trail to characterize the cells responsible for tumor onset, the NB tumor-initiating cell (TIC) has not been identified. In this study, the relevance of the CSC model in NB was investigated by taking advantage of typical functional stem cell characteristics. A predictive association was established between self-renewal, as assessed by serial sphere formation, and clinical aggressiveness in primary tumors. Moreover, cell subsets gradually selected during serial sphere culture harbored increased in vivo tumorigenicity, which was only highlighted in an orthotopic microenvironment. A microarray time-course analysis of serial sphere passages from metastatic cells allowed us to specifically "profile" the NB stem cell-like phenotype and to identify CD133, ABC transporter, and WNT and NOTCH genes as sphere markers. On the basis of combined sphere marker expression, at least two distinct tumorigenic cell subpopulations were identified, which were also shown to preexist in primary NB. However, sphere marker-mediated cell sorting of the parental tumor failed to recapitulate the TIC phenotype in the orthotopic model, highlighting the complexity of the CSC model. Our data support the NB stem-like cells as a dynamic and heterogeneous cell population strongly dependent on microenvironmental signals and add novel candidate genes as potential therapeutic targets in the control of high-risk NB.

Relevance:

20.00%

Publisher:

Abstract:

The smart cannula concept allows for collapsed cannula insertion and self-expansion within a vein of the body. (A) Computational fluid dynamics and (B) bovine experiments (76±3.8 kg) were performed for comparative analyses, prior to (C) the first clinical application. For an 18F access, a given flow of 4 l/min (A) resulted in a pressure drop of 49 mmHg for the smart cannula versus 140 mmHg for the control. The corresponding Reynolds numbers are 680 versus 1170, respectively. (B) For a 28F access, the maximal flow for the smart cannula was 5.8±0.5 l/min versus 4.0±0.1 l/min for the standard cannula (P<0.0001); for 24F, 5.5±0.6 l/min versus 3.2±0.4 l/min (P<0.0001); and for 20F, 4.1±0.3 l/min versus 1.6±0.3 l/min (P<0.0001). The flow obtained with the smart cannula was 270±45% (20F), 172±26% (24F), and 134±13% (28F) of standard (one-way ANOVA, P=0.014). (C) The first clinical application (1.42 m²) with a smart cannula showed 3.55 l/min (100% of predicted) without additional fluids. All three assessment steps confirm the superior performance of the smart cannula design.
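
For reference, a plausible reading of the quoted Reynolds numbers is the standard pipe-flow definition below, written in terms of the volumetric flow rate Q and an effective inner diameter D; the blood density ρ and dynamic viscosity μ, and the exact cannula geometry, are not given in the abstract, so this is only the generic relation, not a reconstruction of the study's values.

% Pipe-flow Reynolds number expressed via the volumetric flow rate Q
% (D, \rho, \mu are assumptions not specified in the abstract)
\[
  \mathrm{Re} = \frac{\rho\,\bar{v}\,D}{\mu}
              = \frac{4\,\rho\,Q}{\pi\,\mu\,D},
  \qquad \bar{v} = \frac{Q}{A}, \quad A = \frac{\pi D^{2}}{4}.
\]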

Relevance:

20.00%

Publisher:

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Publisher:

Abstract:

I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from revealed preferences through her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model therefore explains why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, with dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
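
The payoff function, cost parameter, and stopping rule below are all hypothetical; the sketch only illustrates the qualitative mechanism (search one step deeper whenever the marginal benefit exceeds the marginal cost), not the paper's model or its identification strategy.

# Toy sketch of endogenous search depth: the agent keeps thinking one step
# ahead as long as the marginal gain from deeper search exceeds its cost.
# Payoff function and cost values are hypothetical.

def chosen_depth(payoff_at_depth, cost_per_step, max_depth):
    """Search one step deeper while the marginal gain exceeds the marginal cost."""
    depth = 0
    while depth < max_depth:
        gain = payoff_at_depth(depth + 1) - payoff_at_depth(depth)
        if gain <= cost_per_step:
            break
        depth += 1
    return depth

# Hypothetical decision problem: lookahead of d steps yields value 1 - 0.5**d
# (diminishing returns to deeper search).
value = lambda d: 1.0 - 0.5 ** d

for cost in (0.30, 0.10, 0.02):
    d = chosen_depth(value, cost, max_depth=20)
    print(f"cost per step {cost:.2f} -> search depth {d}")
# Lower search costs (or larger benefits of deeper search) induce deeper search
# and hence choices closer to those of a fully rational agent.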

Relevance:

20.00%

Publisher:

Abstract:

This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity from operational complexity. It then considers the consequences and costs of complexity and examines the rationale for measuring it. Finally, it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: To document biopsychosocial profiles of patients with rheumatoid arthritis (RA) by means of the INTERMED and to correlate the results with conventional methods of disease assessment and health care utilization. METHODS: Patients with RA (n = 75) were evaluated with the INTERMED, an instrument for assessing case complexity and care needs. Based on their INTERMED scores, patients were compared with regard to severity of illness, functional status, and health care utilization. RESULTS: In cluster analysis, a 2-cluster solution emerged, with about half of the patients characterized as complex. Complex patients scoring especially high in the psychosocial domain of the INTERMED were disabled significantly more often and took more psychotropic drugs. Although the 2 patient groups did not differ in severity of illness and functional status, complex patients rated their illness as more severe on subjective measures and on most items of the Medical Outcomes Study Short Form 36. Complex patients showed increased health care utilization despite a similar biologic profile. CONCLUSIONS: The INTERMED identified complex patients with increased health care utilization, provided meaningful and comprehensive patient information, and proved to be easy to implement and advantageous compared with conventional methods of disease assessment. Intervention studies will have to demonstrate whether management strategies based on INTERMED profiles can improve treatment response and outcome of complex patients.
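
To illustrate the kind of two-group split described in the results (not the study's actual analysis, which used the full multidimensional INTERMED profile of 75 patients), here is a toy one-dimensional k-means run on hypothetical total scores.

import random

# Toy sketch of a 2-cluster split on hypothetical total INTERMED scores.
# Illustrative only; the scores below are invented for this example.

def two_means(scores, iters=50, seed=0):
    """Plain 1-D k-means with k = 2; returns (centers, labels)."""
    rng = random.Random(seed)
    centers = rng.sample(scores, 2)
    labels = [0] * len(scores)
    for _ in range(iters):
        labels = [0 if abs(s - centers[0]) <= abs(s - centers[1]) else 1
                  for s in scores]
        for k in (0, 1):
            members = [s for s, l in zip(scores, labels) if l == k]
            if members:
                centers[k] = sum(members) / len(members)
    return centers, labels

scores = [12, 14, 15, 16, 18, 19, 27, 29, 30, 31, 33, 35]   # hypothetical values
centers, labels = two_means(scores, seed=1)
complex_group = [s for s, l in zip(scores, labels) if centers[l] == max(centers)]
print(sorted(complex_group))   # the higher-scoring ("complex") patients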