7 results for Maximizing

in Digital Commons - Michigan Tech


Abstract:

Linear programs, or LPs, are often used in optimization problems, such as improving manufacturing efficiency or maximizing the yield from limited resources. The most common method for solving LPs is the Simplex Method, which, if a solution exists, yields one over the real numbers. From a purely numerical standpoint this solution is optimal, but quite often we desire an optimal integer solution. A linear program in which the variables are also constrained to be integers is called an integer linear program, or ILP. The focus of this report is to present a parallel algorithm for solving ILPs. We discuss a serial algorithm that uses a breadth-first branch-and-bound search to check the feasible solution space, and then extend it into a parallel algorithm using a client-server model. In the parallel version, the search may not be truly breadth-first, depending on the solution time for each node in the solution tree. Our search takes advantage of pruning, often resulting in super-linear improvements in solution time. Finally, we present results from sample ILPs, describe a few modifications to enhance the algorithm and improve solution time, and offer suggestions for future work.
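The breadth-first branch-and-bound idea behind the serial algorithm can be sketched in a few lines. This is a minimal illustration, not the report's implementation: it uses a naive optimistic bound (each unfixed variable takes its best possible value) instead of a Simplex LP-relaxation bound, and the names `bnb_ilp`, `A`, `b`, and `ub` are assumptions for the example.

```python
from collections import deque

def bnb_ilp(c, A, b, ub):
    """Breadth-first branch-and-bound sketch for:
    maximize c.x subject to A x <= b, 0 <= x_i <= ub_i, x integer.
    The bound is a naive optimistic one, not an LP relaxation."""
    n = len(c)
    best_val, best_x = None, None
    queue = deque([()])            # each node = tuple of fixed variable values
    while queue:
        fixed = queue.popleft()
        k = len(fixed)
        # optimistic bound: value of fixed part + best case for free variables
        bound = sum(ci * xi for ci, xi in zip(c, fixed))
        bound += sum(max(0, c[i]) * ub[i] for i in range(k, n))
        if best_val is not None and bound <= best_val:
            continue               # prune: this subtree cannot beat the incumbent
        if k == n:
            # leaf: check feasibility of the complete integer assignment
            if all(sum(aij * xj for aij, xj in zip(row, fixed)) <= bi
                   for row, bi in zip(A, b)):
                best_val, best_x = bound, fixed
            continue
        for v in range(ub[k] + 1):  # branch on the next variable
            queue.append(fixed + (v,))
    return best_val, best_x
```

For example, maximizing 3x + 2y subject to x + y <= 3 with x, y in {0,...,3} returns the value 9 at (3, 0); the pruning step discards every subtree whose optimistic bound cannot beat the incumbent, which is the source of the super-linear speedups mentioned above.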

Abstract:

Today the use of concrete ties is on the rise in North America as they become an economically competitive alternative to the historical industry standard, wood ties, while providing performance that exceeds the competition in terms of durability and capacity. Similarly, in response to rising energy costs, there is increased demand for efficient and sustainable transportation of people and goods. One source of such transportation is the railroad. To accommodate the increased demand, railroads are constructing new track and upgrading existing track. This update to the track system will increase its capacity while making it a more reliable means of transportation compared to other alternatives. In addition to increasing the track system capacity, railroads are considering an increase in the size of the typical freight rail car to allow larger tonnage. An increase in rail car loads will in turn affect the performance requirements of the track. Due to the increased loads that heavy haul railroads are considering applying to their tracks, current designs of prestressed concrete railroad ties for heavy haul applications may be undersized. In an effort to maximize tie capacity while maintaining tie geometry, fastening systems, and installation equipment, a parametric study to optimize the existing designs was completed. The optimization focused on maximizing the capacity of an existing tie design through an investigation of prestressing quantity, configuration, stress levels, and other material properties. The results of the parametric optimization indicate that the capacity of an existing tie can be increased most efficiently by increasing the diameter of the prestressing and the concrete strength. However, researchers also found that current design specifications and procedures do not consider tie behavior beyond the current tie capacity limit of cracking to the first layer of prestressing.
In addition to limiting analysis to the cracking limit, failure mechanisms such as shear in deep beams at the rail seat or pullout failure of the prestressing due to lack of development length were absent from specified design procedures, but are discussed in this project.

Abstract:

Small-scale village woodlots of less than 0.5 ha are the preferred use of land for local farmers with extra land in the village of Isangati, a small community located in the southern highlands of Tanzania. Farmers view woodlots as lucrative investments that do not involve intensive labor or time. The climate is ideal for the types of trees grown, and the risks are minimal, with no serious threats from insects, fires, thieves, or grazing livestock. It was hypothesized that small-scale village woodlot owners were not maximizing timber outputs with their current timber stand management and harvesting techniques. Personal interviews were conducted over a five-month period, and field data were collected at each farmer's woodlots over a seven-month period. Woodlot field data included woodlot size, number of trees, tree species, tree height, dbh, age, and spacing. The results indicated that the lack of proper woodlot management techniques results in failure to fully capitalize on the investment in woodlots. While farmers should continue with their current harvesting rotations, some of the reasons for not maximizing tree growth include close spacing (2m x 2m), no tree thinning, extreme pruning (60% of the tree), and little to no weeding. Through education and hands-on woodlot management workshops, the farmers could increase their timber output and the value of their woodlots.

Abstract:

Focusing optical beams on a target through random propagation media is very important in many applications such as free space optical communications and laser weapons. Random media effects such as beam spread and scintillation can severely degrade the optical system's performance. Compensation schemes are needed in these applications to overcome these random media effects. In this research, we investigated the optimal beams for two different optimization criteria: one is to maximize the concentrated received intensity, and the other is to minimize the scintillation index at the target plane. In the study of the optimal beam to maximize the weighted integrated intensity, we derive a similarity relationship between the pupil-plane phase screen and extended Huygens-Fresnel models, and demonstrate the limited utility of maximizing the average integrated intensity. In the study of the optimal beam to minimize the scintillation index, we derive the first- and second-order moments for the integrated intensity of multiple coherent modes. Hermite-Gaussian and Laguerre-Gaussian modes are used as the coherent modes to synthesize an optimal partially coherent beam. The optimal beams demonstrate evident reduction of the scintillation index and prove to be insensitive to the aperture averaging effect.
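The scintillation index minimized here is the standard normalized intensity variance, sigma_I^2 = <I^2>/<I>^2 - 1. A minimal sketch of its sample estimate is below; the function name is an assumption, and the thesis works with the integrated intensity of multiple coherent modes, which this sketch does not model.

```python
def scintillation_index(samples):
    """Sample estimate of the scintillation index
    sigma_I^2 = <I^2> / <I>^2 - 1 from intensity samples.
    A perfectly steady (non-fluctuating) beam gives 0."""
    n = len(samples)
    mean_i = sum(samples) / n                      # <I>
    mean_i2 = sum(i * i for i in samples) / n      # <I^2>
    return mean_i2 / mean_i ** 2 - 1.0
```

A constant intensity record gives an index of 0, and stronger fluctuations push the index up, which is why reducing it (and exploiting aperture averaging at the receiver) improves link reliability.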

Abstract:

Space Based Solar Power satellites use solar arrays to generate clean, green, and renewable electricity in space and transmit it to earth via microwave, radiowave, or laser beams to corresponding receivers (ground stations). These traditionally are large structures orbiting around earth at the geo-synchronous altitude. This thesis introduces a new architecture for a Space Based Solar Power satellite constellation. The proposed concept reduces the high cost involved in the construction of the space satellite and in the multiple launches to the geo-synchronous altitude. The proposed concept is a constellation of Low Earth Orbit satellites that are smaller in size than the conventional system. For this application a Repeated Sun-Synchronous Track Circular Orbit (RSSTO) is considered. In these orbits, the spacecraft re-visits the same locations on earth periodically every given desired number of days, with the line of nodes of the spacecraft's orbit fixed relative to the Sun. A wide range of solutions are studied, and, in this thesis, a two-orbit constellation design is chosen and simulated. The number of satellites is chosen based on the electric power demands in a given set of global cities. The orbits of the satellites are designed such that their ground tracks visit a maximum number of ground stations during the revisit period. In the simulation, the locations of the ground stations are chosen close to big cities, in the USA and worldwide, so that the space power constellation beams down power directly to locations of high electric power demand. The J2 perturbations are included in the mathematical model used in orbit design. The coverage time of each spacecraft over a ground site and the gap time between two consecutive spacecraft visiting a ground site are simulated in order to evaluate the coverage continuity of the proposed solar power constellation.
It has been observed from simulations that there are always periods in which a spacecraft does not communicate with any ground station. For this reason, it is suggested that each satellite in the constellation be equipped with power storage components so that it can store power for later transmission. This thesis presents a method for designing the solar power constellation orbits such that the number of ground stations visited during the given revisit period is maximized. This leads to maximizing the power transmission to ground stations.
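The sun-synchronous condition behind such an orbit can be illustrated with the textbook J2 secular drift formula: the ascending node drifts at a rate of -(3/2) n J2 (Re/p)^2 cos(i), and a sun-synchronous design picks the inclination i so that this drift matches the Sun's apparent mean motion of about 0.9856 degrees per day. The sketch below uses these standard formulas only; the constants and function names are assumptions for the example, not the thesis's orbit-design code, which also handles the repeated-ground-track constraint.

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
RE = 6378137.0        # Earth equatorial radius, m
J2 = 1.08262668e-3    # Earth J2 zonal harmonic (dimensionless)

def raan_drift(a, inc, ecc=0.0):
    """Secular drift of the right ascension of the ascending node (rad/s)
    due to J2, for semi-major axis a (m) and inclination inc (rad)."""
    n = math.sqrt(MU / a ** 3)          # mean motion
    p = a * (1.0 - ecc ** 2)            # semi-latus rectum
    return -1.5 * n * J2 * (RE / p) ** 2 * math.cos(inc)

def sun_sync_inclination(a, ecc=0.0):
    """Inclination (rad) at which the node drift equals the Sun's apparent
    mean motion, so the line of nodes stays fixed relative to the Sun."""
    target = 2.0 * math.pi / (365.2422 * 86400.0)   # rad/s, ~0.9856 deg/day
    n = math.sqrt(MU / a ** 3)
    p = a * (1.0 - ecc ** 2)
    cos_i = -target / (1.5 * n * J2 * (RE / p) ** 2)
    return math.acos(cos_i)
```

For a circular LEO near 700 km altitude (a of roughly 7078 km), this yields the familiar retrograde inclination of about 98 degrees, which is why sun-synchronous constellations sit in near-polar orbits.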

Abstract:

Civil infrastructure provides essential services for the development of both society and the economy. It is very important to manage systems efficiently to ensure sound performance. However, there are challenges in extracting information from available data, which also necessitates the establishment of methodologies and frameworks to assist stakeholders in the decision-making process. This research proposes methodologies to evaluate system performance by maximizing the use of available information, in an effort to build and maintain sustainable systems. Under the guidance of problem formulation from a holistic view proposed by Mukherjee and Muga, this research specifically investigates problem-solving methods that measure and analyze metrics to support decision making. Failures are inevitable in system management. A methodology is developed to describe the arrival pattern of failures in order to assist engineers in failure rescues and budget prioritization, especially when funding is limited. It reveals that blockage arrivals are not totally random: smaller meaningful subsets show good random behavior. The failure rate over time is further analyzed by applying existing reliability models and non-parametric approaches. A scheme is then proposed to depict rates over the lifetime of a given facility system. Further analysis of sub-data sets is also performed, with a discussion of context reduction. Infrastructure condition is another important indicator of system performance. The challenges in predicting facility condition are the transition probability estimates and model sensitivity analysis. Methods are proposed to estimate transition probabilities by investigating the long-term behavior of the model and the relationship between transition rates and probabilities. To integrate heterogeneities, a model sensitivity analysis is performed for the application of a non-homogeneous Markov chain model.
Scenarios are investigated by assuming transition probabilities follow a Weibull regressed function and fall within an interval estimate. For each scenario, multiple cases are simulated using Monte Carlo simulation. Results show that variations in the outputs are sensitive to the probability regression, while for the interval estimate, the outputs have variations similar to the inputs. Life cycle cost analysis and life cycle assessment of a sewer system are performed comparing three different pipe types: reinforced concrete pipe (RCP), non-reinforced concrete pipe (NRCP), and vitrified clay pipe (VCP). Life cycle cost analysis is performed for the material extraction, construction, and rehabilitation phases. In the rehabilitation phase, a Markov chain model is applied to support the rehabilitation strategy. In the life cycle assessment, the Economic Input-Output Life Cycle Assessment (EIO-LCA) tools are used to estimate environmental emissions for all three phases. Emissions are then compared quantitatively among the alternatives to support decision making.
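A Monte Carlo simulation over a Markov chain condition model, as used in the scenarios above, can be sketched as follows. The 3-state deterioration matrix here (good, fair, poor, with poor absorbing) and all its probabilities are purely illustrative, not the thesis's Weibull-regressed values.

```python
import random

def simulate_condition(P, start, years, rng):
    """One Monte Carlo trajectory through condition states 0..len(P)-1
    under the one-step (row-stochastic) transition matrix P."""
    state = start
    for _ in range(years):
        u, acc = rng.random(), 0.0
        for nxt, p in enumerate(P[state]):
            acc += p
            if u < acc:
                state = nxt
                break
    return state

# Hypothetical 3-state model: 0 = good, 1 = fair, 2 = poor (absorbing).
P = [[0.8, 0.2, 0.0],
     [0.0, 0.7, 0.3],
     [0.0, 0.0, 1.0]]

rng = random.Random(42)                 # fixed seed for reproducibility
finals = [simulate_condition(P, 0, 30, rng) for _ in range(5000)]
frac_failed = finals.count(2) / len(finals)   # share of runs ending poor
```

Repeating such runs for each scenario (each candidate transition matrix) gives the output distributions whose sensitivity to the probability regression is discussed above, and the resulting state probabilities feed the rehabilitation timing in the life cycle cost analysis.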

Abstract:

Fuzzy community detection aims to identify fuzzy communities in a network: groups of vertices such that the membership of a vertex in one community lies in [0,1] and the sum of its memberships over all communities equals 1. Fuzzy communities are pervasive in social networks, but only a few works have been done on fuzzy community detection. Recently, a one-step extension of Newman's Modularity, the most popular quality function for disjoint community detection, resulted in the Generalized Modularity (GM), which demonstrates good performance in finding well-known fuzzy communities. Thus, GM is chosen as the quality function in our research. We first propose a generalized fuzzy t-norm modularity to investigate the effect of different fuzzy intersection operators on fuzzy community detection, since the introduction of a fuzzy intersection operation is made feasible by GM. The experimental results show that the Yager operator with a proper parameter value performs better than the product operator in revealing community structure. Then, we focus on how to find optimal fuzzy communities in a network by directly maximizing GM, which we call the Fuzzy Modularity Maximization (FMM) problem. The effort on the FMM problem results in the major contribution of this thesis: an efficient and effective GM-based fuzzy community detection method that automatically discovers a fuzzy partition of a network when appropriate, which is much better than the fuzzy partitions found by existing fuzzy community detection methods, and a crisp partition when appropriate, which is competitive with the partitions produced by the best disjoint community detection methods to date. We address the FMM problem by iteratively solving a sub-problem called One-Step Modularity Maximization (OSMM).
We present two approaches for solving this iterative procedure: a tree-based global optimizer called Find Best Leaf Node (FBLN) and a heuristic-based local optimizer. The OSMM problem is based on a simplified quadratic knapsack problem that can be solved in linear time; thus, a solution of OSMM can be found in linear time. Since the OSMM algorithm is called recursively within FBLN and the structure of the search tree is non-deterministic, the FMM/FBLN algorithm runs in a time complexity of at least O(n^2). We therefore also propose several highly efficient and very effective heuristic algorithms, namely the FMM/H algorithms. We compared our proposed FMM/H algorithms with two state-of-the-art community detection methods, modified MULTICUT Spectral Fuzzy c-Means (MSFCM) and a Genetic Algorithm with a Local Search strategy (GALS), on 10 real-world data sets. The experimental results suggest that the H2 variant of FMM/H is the best-performing version. The H2 algorithm is very competitive with GALS in producing maximum-modularity partitions and performs much better than MSFCM. On all 10 data sets, H2 is also 2-3 orders of magnitude faster than GALS. Furthermore, by adopting a slightly modified version of the H2 algorithm as a mutation operator, we designed a genetic algorithm for fuzzy community detection, namely GAFCD, where elite selection and early termination are applied. The crossover operator is designed to make GAFCD converge quickly and to enhance GAFCD's ability to jump out of local minima. Experimental results on all the data sets show that GAFCD uncovers better community structure than GALS.
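The generalized modularity extends Newman's modularity by weighting each vertex pair by its shared community membership, Q = (1/2m) * sum over i, j, c of u_ic * u_jc * (A_ij - k_i * k_j / 2m). A naive O(n^2 * c) sketch of this quality function is below; the function name and the dense membership-matrix representation U are assumptions for the example, and with 0/1 memberships it reduces to crisp Newman modularity.

```python
def generalized_modularity(A, U):
    """Generalized (fuzzy) modularity of a membership matrix U over an
    undirected graph with adjacency matrix A:
        Q = (1/2m) * sum_{i,j} (sum_c u_ic * u_jc) * (A_ij - k_i*k_j/(2m)).
    With crisp 0/1 memberships this is Newman's modularity."""
    n = len(A)
    k = [sum(row) for row in A]    # vertex degrees
    two_m = sum(k)                 # 2m = total degree (twice the edge count)
    q = 0.0
    for i in range(n):
        for j in range(n):
            null = k[i] * k[j] / two_m                         # null model term
            shared = sum(U[i][c] * U[j][c] for c in range(len(U[0])))
            q += shared * (A[i][j] - null)
    return q / two_m
```

For instance, a 4-vertex graph made of two disjoint edges with the crisp partition {0,1}, {2,3} scores Q = 0.5, the maximum for that graph; maximizing this quantity over all membership matrices U is exactly the FMM problem described above.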