964 results for Near-optimal solutions


Relevance: 20.00%

Abstract:

The asymptotic speed problem of front solutions to hyperbolic reaction-diffusion (HRD) equations is studied in detail. We perform linear and variational analyses to obtain bounds for the speed. In contrast to previous work, we derive upper bounds in addition to lower ones, so that together they yield improved estimates. For some functions it is possible to determine the speed without any uncertainty. This is also achieved for some systems of HRD (i.e., time-delayed Lotka-Volterra) equations that take into account the interaction among different species. An analytical analysis is performed for several systems of biological interest, and we find good agreement with the results of numerical simulations as well as with available observations for a system discussed recently.
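A minimal numerical sketch of the kind of speed bound the abstract refers to, restricted to the classical parabolic (zero-delay) limit rather than the HRD equations themselves: for a pulled front of u_t = D u_xx + f(u), marginal-stability analysis gives the lower bound v* = 2*sqrt(D*f'(0)). All parameters below are illustrative, not taken from the paper.

```python
import numpy as np

# Linear (marginal-stability) lower bound for the parabolic limit,
# with Fisher's reaction term f(u) = r u (1 - u), so f'(0) = r.
D, r = 1.0, 1.0
v_linear = 2.0 * np.sqrt(D * r)

# Crude explicit finite-difference check of the front speed.
dx, dt = 0.5, 0.05                 # dt < dx**2 / (2 D) for stability
x = np.arange(0.0, 400.0, dx)
u = (x < 20.0).astype(float)       # step initial condition

def front_position(u, x):
    """Leftmost x where u first drops below 0.5."""
    return x[np.argmax(u < 0.5)]

positions, times = [], []
t = 0.0
for _ in range(int(150.0 / dt)):
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    lap[0] = lap[-1] = 0.0         # crude no-flux ends, avoid wraparound
    u = u + dt * (D * lap + r * u * (1 - u))
    t += dt
    if t > 50.0:                   # skip the initial transient
        positions.append(front_position(u, x))
        times.append(t)

v_measured = np.polyfit(times, positions, 1)[0]
print(f"linear bound {v_linear:.2f}, measured {v_measured:.2f}")
```

The measured speed approaches the linear bound from below, which is the behavior the variational upper bounds in the paper are designed to pin down from the other side.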

Relevance: 20.00%

Abstract:

In a thermally fluctuating long linear polymeric chain in a solution, the ends, from time to time, approach each other. At such instants the chain can be regarded as closed and thus forms a knot, or rather a virtual knot. Several earlier studies of random knotting demonstrated that simpler knots reach a higher occurrence for shorter random walks than do more complex knots. However, up to now there have been no rules that could be used to predict the optimal length of a random walk, i.e., the length for which a given knot reaches its highest occurrence. Using numerical simulations, we show here that a power law accurately describes the relation between the optimal lengths of random walks leading to the formation of different knots and the previously characterized lengths of ideal knots of the corresponding type.
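The power-law claim above can be checked by the standard log-log regression trick. The numbers below are hypothetical, chosen only to mimic the qualitative trend; the paper's actual knot data are not reproduced here.

```python
import numpy as np

# Hypothetical data: ideal-knot lengths L_ideal for a few knot types and
# the random-walk lengths N_opt at which each knot's occurrence peaked.
L_ideal = np.array([16.3, 20.0, 21.0, 23.5, 24.7])
N_opt   = np.array([120., 195., 215., 290., 330.])

# A power law N_opt = a * L_ideal**b is linear in log-log coordinates:
# log N = log a + b * log L, so ordinary least squares recovers (b, log a).
b, log_a = np.polyfit(np.log(L_ideal), np.log(N_opt), 1)
a = np.exp(log_a)
print(f"fitted exponent b = {b:.2f}, prefactor a = {a:.3f}")
```

If the fitted line describes the data well across knot types, the optimal walk length for a new knot can be predicted from its ideal-knot length alone.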

Relevance: 20.00%

Abstract:

Deduction allows us to draw consequences from previous knowledge. Deductive reasoning can be applied to several types of problem, for example conditional, syllogistic, and relational. It has been assumed that the same cognitive operations underlie solutions to them all; however, this hypothesis remains to be tested empirically. We used event-related fMRI, in the same group of subjects, to compare reasoning-related activity associated with conditional and syllogistic deductive problems. Furthermore, we assessed reasoning-related activity for the two main stages of deduction, namely encoding of premises and their integration. Encoding syllogistic premises for reasoning was associated with activation of BA 44/45 more than encoding them for literal recall. During integration, left fronto-lateral cortex (BA 44/45, 6) and basal ganglia activated with both conditional and syllogistic reasoning. In addition, integration of syllogistic problems was associated with activation of left parietal (BA 7) and left ventro-lateral frontal cortex (BA 47). This difference suggests a dissociation between conditional and syllogistic reasoning at the integration stage. Our findings indicate that the integration of conditional and syllogistic reasoning is carried out by means of different, but partly overlapping, sets of anatomical regions and, by inference, by different cognitive processes. The involvement of BA 44/45 during both encoding (syllogisms) and premise integration (syllogisms and conditionals) suggests a central role in deductive reasoning for syntactic manipulations and formal/linguistic representations.

Relevance: 20.00%

Abstract:

Is it important to negotiate on proportions rather than on numbers? To answer this question, we analyze the behavior of well-known bargaining solutions and the claims rules they induce when they are applied to a "proportionally transformed" bargaining set SP, the so-called bargaining-in-proportions set. The idea of applying bargaining solutions to claims problems was already developed in Dagan and Volij (1993). They apply the bargaining solutions over a bargaining set defined by the claims and the endowment. A comparison between our results and theirs is provided. Keywords: Bargaining problem, Claims problem, Proportional, Constrained Equal Awards, Constrained Equal Losses, Nash bargaining solution. JEL classification: C71, D63, D71.
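Two of the claims rules named in the keywords have simple algorithmic definitions, sketched below on a made-up claims problem (the numbers are illustrative, not from the paper). Constrained Equal Awards (CEA) gives everyone the same award capped by their claim; Constrained Equal Losses (CEL) imposes the same loss floored at zero.

```python
def cea(claims, endowment, tol=1e-9):
    """Constrained Equal Awards: pay min(c_i, lam), with awards summing to E."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:                 # bisect on the common award lam
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < endowment:
            lo = lam
        else:
            hi = lam
    return [min(c, lo) for c in claims]

def cel(claims, endowment, tol=1e-9):
    """Constrained Equal Losses: pay max(c_i - lam, 0), summing to E."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:                 # bisect on the common loss lam
        lam = (lo + hi) / 2
        if sum(max(c - lam, 0.0) for c in claims) > endowment:
            lo = lam
        else:
            hi = lam
    return [max(c - lo, 0.0) for c in claims]

claims, E = [100.0, 200.0, 300.0], 300.0
print(cea(claims, E))   # ~[100, 100, 100]: equal awards capped by claims
print(cel(claims, E))   # ~[0, 100, 200]: equal losses floored at zero
```

CEA favors small claimants and CEL large ones, which is exactly the kind of distributional contrast that changes when the rules are applied over a proportionally transformed set.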

Relevance: 20.00%

Abstract:

Most leadership and management researchers ignore one key design and estimation problem that renders parameter estimates uninterpretable: endogeneity. We discuss the problem of endogeneity in depth and explain the conditions that engender it, using examples grounded in the leadership literature. We show how consistent causal estimates can be derived from the randomized experiment, where endogeneity is eliminated by experimental design. We then review the reasons why estimates may become biased (i.e., inconsistent) in non-experimental designs and present a number of useful remedies for examining causal relations with non-experimental data. We write in intuitive terms, using nontechnical language, to make this chapter accessible to a large audience.
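One widely used remedy for endogeneity in non-experimental data is instrumental-variables estimation; it is chosen here purely for illustration, and the chapter's own list of remedies may differ. The simulation below (all variables hypothetical) shows an unobserved confounder biasing OLS while two-stage least squares recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
true_beta = 1.0

u = rng.normal(size=n)             # unobserved confounder
z = rng.normal(size=n)             # instrument: moves x, not y directly
x = z + u + rng.normal(size=n)     # endogenous regressor, correlated with u
y = true_beta * x + u + rng.normal(size=n)

def ols_slope(x, y):
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

beta_ols = ols_slope(x, y)         # biased upward, since cov(x, u) > 0

# Stage 1: project x onto the instrument. Stage 2: regress y on the fit.
b1 = ols_slope(z, x)
x_hat = b1 * z + np.mean(x) - b1 * np.mean(z)
beta_2sls = ols_slope(x_hat, y)

print(f"OLS {beta_ols:.3f}  2SLS {beta_2sls:.3f}  true {true_beta}")
```

The OLS slope lands near 1.33 here (1 + cov(x,u)/var(x)), while 2SLS lands near the true value of 1, illustrating why design matters more than sample size for causal claims.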

Relevance: 20.00%

Abstract:

Optimal Usage of De-Icing Chemicals when Scraping Ice, Final Report of Project HR 391

Relevance: 20.00%

Abstract:

There is a nationwide need for a safe, efficient, and cost-effective transportation system. An essential component of this system is its bridges. Local agencies perhaps have an even greater task than federal and state agencies in maintaining the low volume road (LVR) bridge system, due to a lack of sufficient resources and funding. The primary focus of this study was to review the various aspects of off-system bridge design, rehabilitation, and replacement. Specifically, a reference report was developed to address common problems in LVR bridges. The sources of information included both Iowa and national agencies. This report is intended to be a "user manual" or "tool box" of information, procedures, and choices for county engineers to employ in the management of their bridge inventory, and to identify areas and problems that need further research.

Relevance: 20.00%

Abstract:

Background: Recent advances in high-throughput technologies have produced a vast amount of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has impelled the production of many strategies to build protein structures from sequence, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, by knowledge-based potentials, or by a combination of both.

Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores that predict near-native structures. Our strategy circumvents the problem of defining the reference state. We give the proof for a simple linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score, we obtained predictions similar to those of state-of-the-art methods. Moreover, our approach has the advantage of identifying the terms most relevant to the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors.

Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores detect near-native structures as accurately as state-of-the-art methods and successfully identify wrongly modeled regions of many near-native conformations.
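The general mechanics of a linear composite Z-score can be sketched in a few lines. The decoy energies and term names below are made up, and the paper's actual splitting of the potential is not reproduced; this only shows how per-term Z-scores combine into a single ranking criterion.

```python
import numpy as np

rng = np.random.default_rng(1)

# Energies of 100 hypothetical decoy models under three scoring terms.
terms = {
    "pair":    rng.normal(0.0, 2.0, 100),
    "burial":  rng.normal(5.0, 1.0, 100),
    "contact": rng.normal(-3.0, 4.0, 100),
}

def zscore(e):
    """Standardize one term's energies across the decoy set."""
    return (e - e.mean()) / e.std()

# Equal-weight linear combination of per-term Z-scores; lower is better.
composite = sum(zscore(e) for e in terms.values())
best_model = int(np.argmin(composite))
print("top-ranked decoy:", best_model)
```

Standardizing each term first puts terms with very different energy scales on an equal footing, which is what makes the weights of the combination meaningful to optimize.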

Relevance: 20.00%

Abstract:

In the context of fading channels it is well established that, with a constrained transmit power, the bit rates achievable by signals that are not peaky vanish as the bandwidth grows without bound. Stepping back from the limit, we characterize the highest bit rate achievable by such non-peaky signals and the approximate bandwidth where that apex occurs. As it turns out, the gap between the highest rate achievable without peakedness and the infinite-bandwidth capacity (with unconstrained peakedness) is small for virtually all settings of interest to wireless communications. Thus, although strictly achieving capacity in wideband fading channels does require signal peakedness, bit rates not far from capacity can be achieved with conventional signaling formats that do not exhibit the serious practical drawbacks associated with peakedness. In addition, we show that the asymptotic decay of bit rate in the absence of peakedness usually takes hold at bandwidths so large that wideband fading models are called into question. Rather, ultrawideband models ought to be used.
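For context, the non-fading AWGN baseline behind this bandwidth discussion is the textbook relation C(B) = B log2(1 + P/(N0 B)), which increases with bandwidth B toward the infinite-bandwidth limit P/(N0 ln 2). This sketch uses illustrative power and noise values and does not reproduce the fading-channel analysis of the paper itself.

```python
import numpy as np

P, N0 = 1e-6, 1e-12            # illustrative transmit power (W), noise (W/Hz)

def capacity(B):
    """AWGN capacity in bit/s at bandwidth B (Hz)."""
    return B * np.log2(1.0 + P / (N0 * B))

c_limit = P / (N0 * np.log(2.0))   # infinite-bandwidth limit, bit/s
for B in [1e4, 1e6, 1e8]:
    print(f"B={B:.0e} Hz: C={capacity(B):.3e} bit/s (limit {c_limit:.3e})")
```

Most of the limit is already attained at finite bandwidth, which is the non-fading analogue of the paper's point that non-peaky signaling gets close to capacity well before the asymptotic regime matters.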

Relevance: 20.00%

Abstract:

Two important challenges that teachers currently face are the sharing and the collaborative authoring of their learning design solutions, such as didactical units and learning materials. On the one hand, there are tools that can be used for the creation of design solutions, and only some of them facilitate co-editing. However, they do not incorporate mechanisms that support the sharing of the designs between teachers. On the other hand, there are tools that serve as repositories of educational resources, but they do not enable the authoring of the designs. In this paper we present LdShake, a web tool whose novelty is its combined support for the social sharing and co-editing of learning design solutions within communities of teachers. Teachers can create and share learning designs with other teachers using different access rights so that they can read, comment on, or co-edit the designs. Therefore, each design solution is associated with a group of teachers able to work on its definition, and another group that can only see the design. The tool is generic in that it allows the creation of designs based on any pedagogical approach. However, it can be particularized in instances providing pre-formatted designs structured according to a specific didactic method (such as Problem-Based Learning, PBL). A particularized LdShake instance has been used in the context of Human Biology studies, where teams of teachers are required to work together in the design of PBL solutions. A controlled user study that compares the use of a generic LdShake and a Moodle system, configured to enable the creation and sharing of designs, was also carried out. The combined results of the real and controlled studies show that the social structure and the commenting, co-editing, and publishing features of LdShake provide a useful, effective, and usable approach for facilitating teachers' teamwork.
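The access model described above, one group that can co-edit each design and another that can only view it, can be sketched as a small data structure. This is our own illustration, not LdShake's actual code, and the names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class LearningDesign:
    title: str
    editors: set = field(default_factory=set)   # can read, comment, co-edit
    viewers: set = field(default_factory=set)   # can only see the design

    def can_edit(self, teacher):
        return teacher in self.editors

    def can_view(self, teacher):
        return teacher in self.editors or teacher in self.viewers

design = LearningDesign("PBL unit on Human Biology",
                        editors={"alice"}, viewers={"bob"})
print(design.can_edit("alice"), design.can_edit("bob"), design.can_view("bob"))
```

Keeping the two groups separate per design is what lets the same teacher co-edit one design while only viewing another.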

Relevance: 20.00%

Abstract:

We design powerful low-density parity-check (LDPC) codes with iterative decoding for the block-fading channel. We first study the case of maximum-likelihood decoding, and show that the design criterion is rather straightforward. Since optimal constructions for maximum-likelihood decoding do not perform well under iterative decoding, we introduce a new family of full-diversity LDPC codes that exhibit near-outage-limit performance under iterative decoding for all block-lengths. This family competes favorably with multiplexed parallel turbo codes for nonergodic channels.
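The outage limit the codes are measured against is a standard information-theoretic benchmark for nonergodic channels: the probability that the mutual information averaged over the F fading blocks falls below the target rate. A Monte Carlo sketch for Rayleigh block fading (illustrative parameters, not the paper's simulation setup):

```python
import numpy as np

rng = np.random.default_rng(2)

def outage_prob(snr_db, rate=1.0, blocks=2, trials=200_000):
    """Outage probability of an F-block Rayleigh fading channel."""
    snr = 10.0 ** (snr_db / 10.0)
    h2 = rng.exponential(size=(trials, blocks))          # |h|^2 per block
    mutual_info = np.mean(np.log2(1.0 + snr * h2), axis=1)
    return np.mean(mutual_info < rate)

for snr_db in [0.0, 10.0, 20.0]:
    print(f"{snr_db:5.1f} dB: P_out = {outage_prob(snr_db):.4f}")
```

On a log-log plot against SNR, the outage curve's slope reflects the number of fading blocks; a full-diversity code is one whose error rate decays with that same slope.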

Relevance: 20.00%

Abstract:

Concrete curing is closely related to cement hydration, microstructure development, and concrete performance. Application of a liquid membrane-forming curing compound is among the most widely used curing methods for concrete pavements and bridge decks. Curing compounds are economical, easy to apply, and maintenance free. However, limited research has been done to investigate the effectiveness of different curing compounds and their application technologies. No reliable standard testing method is available to evaluate the effectiveness of curing, especially of field concrete curing. The present research investigates the effects of curing compound materials and application technologies on concrete properties, especially on the properties of surface concrete. This report presents a literature review of curing technology, with an emphasis on curing compounds, and the experimental results from the first part of this research: the lab investigation. In the lab investigation, three curing compounds were selected and applied to mortar specimens at three different times after casting. Two application methods, single- and double-layer applications, were employed. Moisture content, conductivity, sorptivity, and degree of hydration were measured at different depths of the specimens. Flexural and compressive strength of the specimens were also tested. Statistical analysis was conducted to examine the relationships between these material properties. The research results indicate that application of a curing compound significantly increased the moisture content and degree of cement hydration and reduced the sorptivity of the near-surface-area concrete. For given concrete materials and mix proportions, the optimal application time of curing compounds depended primarily upon the weather conditions. If a sufficient amount of a high-efficiency-index curing compound was uniformly applied, no double-layer application was necessary.

Among all test methods applied, the sorptivity test is the most sensitive one, providing a good indication of the subtle changes in microstructure of the near-surface-area concrete caused by different curing materials and application methods. Sorptivity measurement has a close relation with moisture content and degree of hydration. The research results have established a baseline for and provided insight into the further development of testing procedures for the evaluation of curing compounds in the field. Recommendations are provided for further field study.
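For readers unfamiliar with the sorptivity test singled out above, a sorptivity value is conventionally extracted as the slope of cumulative absorption against the square root of elapsed time, i = S*sqrt(t). The readings below are hypothetical; the report's own test data are not reproduced.

```python
import numpy as np

# Hypothetical water-absorption readings from a mortar specimen.
t_min = np.array([1.0, 5.0, 10.0, 20.0, 30.0, 60.0])     # elapsed time, min
i_mm  = np.array([0.21, 0.45, 0.62, 0.90, 1.08, 1.56])   # absorption, mm

# i = S * sqrt(t), so sorptivity S is the slope of i against sqrt(t).
S, intercept = np.polyfit(np.sqrt(t_min), i_mm, 1)
print(f"sorptivity S = {S:.3f} mm/min^0.5")
```

A lower fitted S after curing corresponds to the reduced near-surface sorptivity the report attributes to effective curing-compound application.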

Relevance: 20.00%

Abstract:

We study optimal public rationing of an indivisible good and private sector price responses. Consumers differ in their wealth and costs of provision. Due to a limited budget, some consumers must be rationed. Public rationing determines the characteristics of the consumers who seek supply from the private sector, where a firm sets prices based on consumers' cost information and in response to the rationing rule. We consider two information regimes. In the first, the public supplier rations consumers according to their wealth information. In equilibrium, the public supplier must ration both rich and poor consumers. Supplying all poor consumers would leave only rich consumers in the private market, and the firm would react by setting a high price. Rationing some poor consumers is optimal, and induces a price reduction in the private market. In the second information regime, the public supplier rations consumers according to their wealth and cost information. In equilibrium, consumers are allocated the good if and only if their costs are below a threshold. Wealth information is not used. Rationing based on cost results in higher equilibrium total consumer surplus than rationing based on wealth. [Authors]
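The pricing channel in the first regime can be illustrated with a toy monopoly model of our own (not the paper's model, and the willingness-to-pay numbers are hypothetical): the composition of the consumers left to the private market determines the profit-maximizing price.

```python
def best_price(buyers):
    """Profit-maximizing single price over buyers' willingness to pay,
    assuming zero marginal cost: at price p, everyone with wtp >= p buys."""
    candidates = sorted(set(buyers))
    return max(candidates, key=lambda p: p * sum(w >= p for w in buyers))

rich, poor = 10.0, 4.0          # hypothetical willingness to pay

# If public rationing leaves only rich consumers to the private firm,
# the firm charges the high price:
print(best_price([rich] * 10))

# If enough poor consumers are also rationed into the private market,
# serving everyone at the low price becomes more profitable:
print(best_price([rich] * 10 + [poor] * 30))
```

This is the mechanism the abstract describes: keeping some poor consumers in the residual private demand disciplines the firm's price.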