925 results for algorithm optimization


Relevance:

20.00%

Publisher:

Abstract:

We characterize the capacity-achieving input covariance for multi-antenna channels known instantaneously at the receiver and in distribution at the transmitter. Our characterization, valid for arbitrary numbers of antennas, encompasses both the eigenvectors and the eigenvalues. The eigenvectors are found for zero-mean channels with arbitrary fading profiles and a wide range of correlation and keyhole structures. For the eigenvalues, in turn, we present necessary and sufficient conditions as well as an iterative algorithm that exhibits remarkable properties: universal applicability, robustness and rapid convergence. In addition, we identify channel structures for which an isotropic input achieves capacity.
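
The paper's iterative algorithm targets statistical CSI and is not reproduced here; as a point of reference for the eigenvalue (power-allocation) problem the abstract describes, the sketch below implements classical water-filling for the deterministic special case where the eigenmode gains are known. The function name and toy gains are illustrative.

```python
import numpy as np

def water_filling(gains, power):
    """Classical water-filling over eigenmode gains g_i: maximizes
    sum_i log(1 + g_i * p_i) subject to sum_i p_i = power, p_i >= 0."""
    g = np.asarray(gains, dtype=float)
    order = np.argsort(-g)                  # strongest modes first
    g_sorted = g[order]
    for k in range(len(g), 0, -1):          # try k active modes
        # Water level from the KKT conditions with k active modes.
        mu = (power + np.sum(1.0 / g_sorted[:k])) / k
        p = mu - 1.0 / g_sorted[:k]
        if p[-1] >= 0:                      # weakest active mode feasible
            break
    powers = np.zeros_like(g)
    powers[order[:k]] = p
    return powers

print(water_filling([2.0, 1.0, 0.2], power=1.0))  # [0.75 0.25 0.  ]
```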

Relevance:

20.00%

Publisher:

Abstract:

A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detection are performed in the same array, which makes the design hardware-efficient. The all-swap lattice reduction (ASLR) algorithm is considered for the systolic design. ASLR is a variant of the LLL algorithm that processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms shows very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
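
ASLR itself is an all-swap, systolic-friendly variant and is not reproduced here; for orientation, the sketch below is a minimal textbook LLL reduction, with the usual delta = 0.75 parameter and Gram-Schmidt naively recomputed for clarity rather than updated incrementally.

```python
import numpy as np

def lll_reduce(B, delta=0.75):
    """Textbook LLL reduction (rows of B are basis vectors).
    Plain sequential swaps, unlike ASLR's all-swap passes."""
    B = np.array(B, dtype=float)
    n = B.shape[0]

    def gram_schmidt():
        Bs = np.zeros_like(B)
        mu = np.zeros((n, n))
        for i in range(n):
            Bs[i] = B[i]
            for j in range(i):
                mu[i, j] = B[i] @ Bs[j] / (Bs[j] @ Bs[j])
                Bs[i] -= mu[i, j] * Bs[j]
        return Bs, mu

    k = 1
    while k < n:
        # Size-reduce b_k against b_{k-1}, ..., b_0.
        for j in range(k - 1, -1, -1):
            _, mu = gram_schmidt()
            q = int(round(mu[k, j]))
            if q:
                B[k] -= q * B[j]
        Bs, mu = gram_schmidt()
        # Lovasz condition: advance if satisfied, else swap and back up.
        if Bs[k] @ Bs[k] >= (delta - mu[k, k - 1] ** 2) * (Bs[k - 1] @ Bs[k - 1]):
            k += 1
        else:
            B[[k - 1, k]] = B[[k, k - 1]]
            k = max(k - 1, 1)
    return B

print(lll_reduce([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))
```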

Relevance:

20.00%

Publisher:

Abstract:

America’s roadways are in serious need of repair. According to the American Society of Civil Engineers (ASCE), one-third of the nation’s roads are in poor or mediocre condition (1). ASCE has estimated that under these circumstances American drivers sustained $5.8 billion in costs and as many as 13,800 fatalities a year from 1999 to 2001 (1). A large factor in the deterioration of these roads is how well the steel reinforcement transfers loads across the concrete slabs. Fabricating this reinforcement in a shape conducive to transferring these loads will help minimize roadway damage. Load transfer within a series of concrete slabs takes place across the joints. For a typical concrete-paved road, these joints are approximately 1/8-inch gaps between two adjacent slabs. Dowel bars are located at these joints and are used to transfer load from one slab to its adjacent slabs. As long as the dowel bar is completely surrounded by concrete, no problems occur. However, when the hole around the dowel starts to elongate, a void space is created and difficulties can arise. This void space forms due to a stress concentration where the dowel contacts the concrete. Over time, the repeated passage of traffic over the joint crushes the concrete surrounding the dowel bar and causes a void in the concrete. This void inhibits the dowel’s ability to effectively transfer load across the joint. Furthermore, the void gives water and other particles a place to collect, which will eventually corrode the dowel and potentially bind or lock the joint so that no thermal expansion is allowed. Once load is no longer transferred across the joint, it passes into the foundation, and differential settlement of the adjacent slabs will occur.

Relevance:

20.00%

Publisher:

Abstract:

The chemistry of today’s concrete mixture designs is complicated by many variables, including multiple sources of aggregates and cements and a plethora of sometimes incompatible mineral and chemical admixtures. Concrete paving has undergone significant changes in recent years as new materials have been introduced into concrete mixtures. Supplementary cementitious materials such as fly ash and ground granulated blast-furnace slag are now regularly used. In addition, many admixtures that were not even available a few years ago are now in widespread use. Adding to the complexity are construction variables such as weather, mix delivery times, finishing practices, and pavement opening schedules. Mixture materials, mix design, and pavement construction are not isolated steps in the concrete paving process; each affects and is affected by the others in ways that determine overall pavement quality and long-term performance. The equipment and procedures commonly used to test concrete materials and concrete pavements have not changed in decades, leaving serious gaps in our ability to understand and control the factors that determine concrete durability. The concrete paving community needs tests that will adequately characterize the materials, predict interactions, and monitor the properties of the concrete.

Relevance:

20.00%

Publisher:

Abstract:

Tumor Endothelial Marker-1 (TEM1/CD248) is a tumor vascular marker with high therapeutic and diagnostic potential. Immuno-imaging with TEM1-specific antibodies can help to detect cancerous lesions, monitor tumor responses, and select the patients most likely to benefit from TEM1-targeted therapies. In particular, near-infrared (NIR) optical imaging with biomarker-specific antibodies can provide real-time, tomographic information without exposing the subjects to radioactivity. To maximize the theranostic potential of TEM1, we developed a panel of fully human, multivalent Fc-fusion proteins based on a previously identified single chain antibody (scFv78) that recognizes both human and mouse TEM1. By characterizing avidity, stability, and pharmacokinetics, we identified one fusion protein, 78Fc, with desirable characteristics for immuno-imaging applications. The biodistribution of radiolabeled 78Fc showed that this antibody had minimal binding to normal organs, which have low expression of TEM1. Next, we developed a 78Fc-based tracer and tested its performance in different TEM1-expressing mouse models. The NIR imaging and tomography results suggest that the 78Fc-NIR tracer performs well in distinguishing mouse- or human-TEM1-expressing tumor grafts from normal organs and control grafts in vivo. From these results we conclude that further development and optimization of 78Fc as a TEM1-targeted imaging agent for use in clinical settings is warranted.

Relevance:

20.00%

Publisher:

Abstract:

The development of CT applications might become a public health problem if no effort is made to justify and optimise the examinations. This paper presents some guidance to ensure that the risk-benefit balance remains in favour of the patient, especially for examinations of young patients. In this context, particular attention has to be paid to the justification of the examination. When performing the acquisition, one needs to optimise the extent of the volume investigated together with the number of acquisition sequences used. Finally, the use of automatic exposure systems, now available on all units, together with Diagnostic Reference Levels (DRLs), should help radiologists control the exposure of their patients.

Relevance:

20.00%

Publisher:

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance-criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution that generates different alternative starting solutions of similar quality, attained by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic by simply incorporating our biased randomization process with a high-quality pseudo-random number generator.
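
To make the ILS control flow concrete, here is a minimal, self-contained sketch for the PFSP: a standard makespan evaluation, an insertion-based local search, a two-swap perturbation, and an improvement-only acceptance rule. These are generic textbook choices for illustration, not the paper's exact ESP operators or its biased-randomized starting heuristic.

```python
import random

def makespan(perm, p):
    """Completion time of the last job on the last machine (PFSP objective).
    p[j][m] is the processing time of job j on machine m."""
    m = len(p[0])
    c = [0] * m
    for j in perm:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def local_search(perm, p):
    """First-improvement search over the insertion neighborhood."""
    best = makespan(perm, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(len(perm)):
                if i == j:
                    continue
                cand = perm[:i] + perm[i + 1:]
                cand.insert(j, perm[i])
                c = makespan(cand, p)
                if c < best:
                    perm, best, improved = cand, c, True
    return perm

def ils(p, iters=200, seed=0):
    """ILS loop: perturb (two random swaps), re-optimize, keep improvements."""
    rng = random.Random(seed)
    n = len(p)
    s = local_search(list(range(n)), p)
    for _ in range(iters):
        cand = s[:]
        for _ in range(2):  # simple perturbation: two random swaps
            a, b = rng.randrange(n), rng.randrange(n)
            cand[a], cand[b] = cand[b], cand[a]
        cand = local_search(cand, p)
        if makespan(cand, p) <= makespan(s, p):
            s = cand
    return s, makespan(s, p)

# toy instance: 5 jobs x 3 machines
p = [[3, 2, 4], [2, 5, 1], [4, 1, 3], [1, 3, 2], [5, 2, 2]]
print(ils(p))
```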

Relevance:

20.00%

Publisher:

Abstract:

We present a polyhedral framework for establishing general structural properties of optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
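
The adaptive-greedy pattern can be sketched compactly. The version below follows the generalized-conservation-laws recursion of Bertsimas and Niño-Mora (1996), which this paper extends: fix one dual variable per nested active set, assign the minimizing class the lowest remaining priority, and accumulate the duals into indices. The coefficient function A(S, i), the cost dict, and all names are illustrative model inputs; the paper's greedoid-restricted variant adds feasibility checks not shown here.

```python
def adaptive_greedy(A, c, classes):
    """Adaptive-greedy index computation in the spirit of Klimov's algorithm.

    A(S, i): conservation-law coefficient of class i under active set S
             (assumed positive; supplied by the specific model).
    c[i]:    holding-cost rate of class i.
    Returns (gamma, order): gamma[i] is the priority index of class i
    (higher index = higher service priority); order lists classes in
    increasing priority.
    """
    S = set(classes)
    y = {}            # dual variables, keyed by nested active sets
    order = []        # classes in increasing priority
    gamma = {}
    total = 0.0
    while S:
        fs = frozenset(S)
        # Residual cost of each class, net of the duals fixed so far.
        ratio = {i: (c[i] - sum(A(T, i) * y[T] for T in y)) / A(fs, i)
                 for i in S}
        i_min = min(ratio, key=ratio.get)   # cheapest class to delay
        y[fs] = ratio[i_min]
        total += y[fs]
        gamma[i_min] = total                # indices telescope over duals
        order.append(i_min)
        S.remove(i_min)
    return gamma, order

# No-feedback M/G/1 special case: A(S, i) = 1/mu[i] recovers gamma = c*mu.
mu = {1: 2.0, 2: 1.0}
c = {1: 1.0, 2: 3.0}
print(adaptive_greedy(lambda S, i: 1.0 / mu[i], c, [1, 2]))
```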

Relevance:

20.00%

Publisher:

Abstract:

Severe environmental conditions, coupled with the routine use of deicing chemicals and increasing traffic volume, tend to place extreme demands on portland cement concrete (PCC) pavements. In most instances, engineers have been able to specify and build PCC pavements that met these challenges. However, there have also been reports of premature deterioration that could not be specifically attributed to a single cause. Modern concrete mixtures have evolved to become very complex chemical systems. The complexity can be attributed to both the number of ingredients used in any given mixture and the various types and sources of the ingredients supplied to any given project. Local environmental conditions can also influence the outcome of paving projects. This research project investigated important variables that impact the homogeneity and rheology of concrete mixtures. The project consisted of a field study and a laboratory study. The field study collected information from six different projects in Iowa, documenting cementitious material properties, plastic concrete properties, and hardened concrete properties. The laboratory study was used to develop baseline mixture variability information for the field study. It also investigated plastic concrete properties using various new devices to evaluate rheology and mixing efficiency. In addition, the lab study evaluated a strategy for the optimization of mortar and concrete mixtures containing supplementary cementitious materials. The results of the field studies indicated that the quality management concrete (QMC) mixtures being placed in the state generally exhibited good uniformity and good to excellent workability. Hardened concrete properties (compressive strength and hardened air content) were also satisfactory. The uniformity of the raw cementitious materials used on the projects could not be monitored as closely as the investigators desired; however, the information that was gathered indicated that the bulk chemical composition of most material streams was reasonably uniform. Specific mineral phases in the cementitious materials were less uniform than the bulk chemical composition. The results of the laboratory study indicated that ternary mixtures show significant promise for improving the performance of concrete mixtures. The lab study also verified results from prior projects indicating that bassanite is typically the major sulfate phase present in Iowa cements, which causes the cements to exhibit premature stiffening problems (false set) in laboratory testing. Fly ash helps to reduce the impact of premature stiffening because it behaves like a low-range water reducer in most instances. The premature stiffening problem can also be alleviated by increasing the water–cement ratio of the mixture and providing a remix cycle for the mixture.

Relevance:

20.00%

Publisher:

Abstract:

The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine, which can handle only one job at a time, minimizing the maximum lateness. Each job is available for processing at its release date, requires a known processing time, and, after its processing finishes, is delivered after a certain time. There can also exist precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem assigns a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
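
The preemptive Longest Tail rule that the algorithm calls repeatedly is easy to state: at every instant, run the released, unfinished job with the largest tail (delivery time), preempting when a better job arrives. Below is a minimal event-driven sketch with illustrative job data; the paper's chain-form time-lag wrapper around it is not reproduced.

```python
import heapq

def preemptive_longest_tail(jobs):
    """Preemptive Longest Tail rule on one machine.

    jobs: list of (release r, processing p, tail q). Returns the makespan
    max(completion + q) of the preemptive schedule, a classic lower bound
    for the non-preemptive problem with release dates and delivery times.
    """
    events = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
    remaining = [p for _, p, _ in jobs]
    ready = []                      # max-heap on tail q: store (-q, job)
    t, k, best, done = 0, 0, 0, 0
    n = len(jobs)
    while done < n:
        # Release every job whose release date has passed.
        while k < n and jobs[events[k]][0] <= t:
            j = events[k]
            heapq.heappush(ready, (-jobs[j][2], j))
            k += 1
        if not ready:               # idle until the next release
            t = jobs[events[k]][0]
            continue
        _, j = ready[0]
        # Run job j until it finishes or the next release, whichever first.
        nxt = jobs[events[k]][0] if k < n else float("inf")
        run = min(remaining[j], nxt - t)
        t += run
        remaining[j] -= run
        if remaining[j] == 0:
            heapq.heappop(ready)
            best = max(best, t + jobs[j][2])
            done += 1
    return best

# three jobs: (release, processing, tail)
print(preemptive_longest_tail([(0, 4, 5), (1, 2, 8), (3, 3, 1)]))  # 11
```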

Relevance:

20.00%

Publisher:

Abstract:

We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
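
A minimal sketch of the index rule itself, in the no-feedback case where the Klimov index reduces to $c_i \mu_i$: compute one static index per class and always serve the nonempty class with the largest index. The class data and function names below are illustrative.

```python
def cmu_priority(classes):
    """The c-mu rule: serve (preemptively) the nonempty class with the
    largest index c_i * mu_i.

    classes: dict i -> (c_i holding-cost rate, mu_i service rate)
    """
    index = {i: c * mu for i, (c, mu) in classes.items()}

    def pick(queue):
        """queue: dict i -> number of class-i customers waiting."""
        eligible = [i for i, n in queue.items() if n > 0]
        return max(eligible, key=index.get) if eligible else None

    return index, pick

classes = {1: (1.0, 3.0), 2: (2.0, 1.0), 3: (0.5, 10.0)}
index, pick = cmu_priority(classes)
print(index)                      # {1: 3.0, 2: 2.0, 3: 5.0}
print(pick({1: 2, 2: 1, 3: 0}))  # class 1: index 3.0 beats 2.0
```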

Relevance:

20.00%

Publisher:

Abstract:

Following the introduction of single-metal deposition (SMD), a simplified fingermark detection technique based on multi-metal deposition (MMD), optimization studies were conducted. The different parameters of the original formula were tested, and the results were evaluated on the basis of the contrast and overall aspect of the enhanced fingermarks. A new SMD formula was established from the optimized parameters. Interestingly, even substantial deviations from the base parameters did not significantly affect the outcome of the enhancement, demonstrating that SMD is a very robust technique. Finally, a comparison of the optimized SMD with MMD was carried out on different surfaces. SMD was shown to produce results comparable to MMD, thus validating the technique.
