961 results for Branch banks
Abstract:
http://digitalcommons.fiu.edu/fce_lter_photos/1279/thumbnail.jpg
Abstract:
http://digitalcommons.fiu.edu/fce_lter_photos/1278/thumbnail.jpg
Abstract:
The southwestern part of the subpolar North Atlantic east of the Grand Banks of Newfoundland and Flemish Cap is a crucial area for the Atlantic Meridional Overturning Circulation. Here the exchange between the subpolar and subtropical gyres takes place: southward-flowing cold, fresh water is replaced by northward-flowing warm, salty water within the North Atlantic Current (NAC). As part of a long-term experiment, the circulation east of Flemish Cap has been studied with seven repeat hydrographic sections (2003-2011), a 2 year time series of current velocities at the continental slope (2009-2011), 19 years of sea surface height, and 47 years of output from an eddy-resolving ocean circulation model. The structure of the flow field in the measurements and the model shows a deep-reaching NAC with adjacent recirculation and two distinct cores of southward flow in the Deep Western Boundary Current (DWBC): one core above the continental slope with maximum velocities at mid-depth, and a second farther east with bottom-intensified velocities. The western core of the DWBC is rather stable, while the offshore core shows high temporal variability that in the model is correlated with the NAC strength. About 30 Sv of deep water flow southward in the DWBC below a density of sigma-theta = 27.68 kg/m³. The NAC transports about 110 Sv northward: approximately 15 Sv originate from the DWBC and 75 Sv recirculate locally east of the NAC, leaving 20 Sv to be supplied by the NAC from the south.
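The transports quoted above come from integrating measured or modeled velocities over a cross-section, and the NAC budget must close. A minimal sketch, with entirely made-up grid sizes and velocities (none of these numbers are from the study), of how such a volume transport and budget check are computed:

```python
import numpy as np

# 1 Sv = 1e6 m^3/s. Toy grid and velocity chosen so the integral comes
# out near the ~30 Sv DWBC figure; all values here are illustrative.

def section_transport(v, dx, dz):
    """Volume transport (Sv) of velocity v (m/s) on a (depth, distance) grid."""
    return np.sum(v) * dx * dz / 1e6

# 20 depth cells x 30 horizontal cells of uniform southward (negative) flow
v = np.full((20, 30), -0.10)                              # m/s
transport = section_transport(v, dx=2_500.0, dz=200.0)    # 2.5 km x 200 m cells

# Budget quoted in the abstract: 110 Sv NAC = 15 (from the DWBC)
# + 75 (local recirculation) + 20 (supplied from the south)
nac_total, from_dwbc, recirc, from_south = 110, 15, 75, 20
assert nac_total == from_dwbc + recirc + from_south
```

The sketch yields 30 Sv southward (negative) for the toy section; real estimates additionally weight each cell by its true geometry and mask topography.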
Abstract:
Current interest in measuring quality of life is driving the construction of computerized adaptive tests (CATs) with Likert-type items. Calibrating an item bank for use in CAT requires collecting responses to a large number of candidate items, usually too many to administer to each subject in the calibration sample. The concurrent anchor-item design solves this problem by splitting the items into separate subtests, with some common items across subtests; then administering each subtest to a different sample; and finally running the estimation algorithms once on the aggregated data array, from which a substantial number of responses are then missing. Although anchor-item designs are widely used, the consequences of several configuration decisions for the accuracy of parameter estimates have never been studied in the polytomous case. The present study addresses this question by simulation, comparing the outcomes of several alternative configurations of the anchor-item design. The factors defining the variants are (a) subtest size, (b) balance of common and unique items per subtest, (c) characteristics of the common items, and (d) criteria for the distribution of unique items across subtests. The results indicate that maximizing accuracy in item parameter recovery requires subtests with the largest possible number of items and the smallest possible number of common items; the characteristics of the common items and the criterion for distributing unique items do not affect accuracy.
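The design described above produces a single aggregated data array with structured missingness. A minimal sketch of how such an array is assembled (bank size, number of subtests, sample sizes, and the 5-category responses are all illustrative, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Concurrent anchor-item design: a bank of items is split into subtests that
# share a block of common (anchor) items; each subsample answers only its
# own subtest, so the aggregated array has blocks of missing responses.
n_items, n_common, n_subtests = 30, 6, 3
per_group = 100                                  # respondents per subtest sample

common = list(range(n_common))                   # anchor items, in every subtest
unique = np.array_split(np.arange(n_common, n_items), n_subtests)
subtests = [common + list(u) for u in unique]

data = np.full((n_subtests * per_group, n_items), np.nan)
for g, items in enumerate(subtests):
    rows = slice(g * per_group, (g + 1) * per_group)
    # Simulated 5-category Likert responses (1..5) for this subtest only
    data[rows, items] = rng.integers(1, 6, size=(per_group, len(items)))

# Anchor items are observed for everyone; unique items only within one group
assert not np.isnan(data[:, common]).any()
```

Estimation then runs once on `data`, treating the NaN blocks as missing by design, which is what links the separately sampled subtests onto a common scale.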
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
Supply chain operations directly affect service levels. Decisions on amending facilities are generally based on overall cost, leaving out the efficiency of each unit. By decomposing the supply chain superstructure, an efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can easily be implemented. With the proposed algorithm, facility selection is based on service-level maximization and not just cost minimization, as the analysis filters all feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are selected, leading to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm: a Branch and Efficiency (B&E) algorithm is deployed for its solution. In this DEA approach, each solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is thus characterized by inputs and outputs. Through additional constraints named "efficiency cuts", the algorithm selects only efficient solutions, providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
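The core subroutine behind such efficiency cuts is scoring each candidate facility as a Decision Making Unit. A minimal sketch using the standard input-oriented CCR multiplier model solved as a linear program (the abstract does not specify which DEA model is used, so CCR is an assumption, and the three-warehouse data is invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """CCR efficiency of DMU o, given inputs X (n, m) and outputs Y (n, s).

    Multiplier form: max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0,  u, v >= 0.
    Decision variables are stacked as [u (s outputs), v (m inputs)].
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])            # linprog minimizes
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]    # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
    return -res.fun

# Hypothetical candidate warehouses: input = [operating cost], output = [service level]
X = np.array([[4.0], [2.0], [5.0]])
Y = np.array([[8.0], [6.0], [7.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
efficient = [o for o, e in enumerate(scores) if e >= 1 - 1e-6]
```

Only DMUs with a score of 1 survive the filter; an efficiency cut would then exclude configurations built on the inefficient candidates before the branch-and-bound search continues.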
Abstract:
Many problems in transportation, telecommunications, and logistics can be modeled as network design problems. The classical problem consists of routing a flow (data, people, goods, etc.) over a network subject to a number of constraints so as to satisfy demand while minimizing costs. In this thesis, we study the single-commodity fixed-charge capacitated network design problem, which we transform into an equivalent multicommodity problem so as to improve the lower bound obtained from the continuous relaxation of the model. The method we present to solve this problem is an exact branch-and-price-and-cut method with a stopping condition, in which we exploit column generation, cut generation, and the branch-and-bound algorithm, which are among the most widely used techniques in integer linear programming. We test our method on two groups of instances of different sizes (large and very large) and compare it with the results given by CPLEX, one of the best solvers for mathematical optimization problems, as well as with a branch-and-cut method. Our method proves promising and can yield good results, particularly on very large instances.
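The reason the multicommodity reformulation improves the relaxation bound can be seen on a single arc. A toy numerical sketch (the instance is invented, not one of the thesis's test instances): with aggregated flow, the capacity constraint lets the design variable stay fractional, while the per-commodity forcing constraints push it to its integer value.

```python
# One arc with capacity 10 and fixed cost 5, carrying one unit of demand to
# each of two destinations. Design variable y in [0, 1] pays the fixed cost.
capacity, fixed_cost = 10.0, 5.0
demands = [1.0, 1.0]                    # one commodity per destination
flows = list(demands)                   # each demand fully routed on the arc

# Aggregated (single-commodity) LP: only  sum_k x_k <= capacity * y,
# so y can stay as low as total flow / capacity = 0.2
y_weak = sum(flows) / capacity
bound_weak = fixed_cost * y_weak

# Disaggregated (multicommodity) LP adds the forcing constraints
# x_k <= d_k * y, so every fully served commodity pushes y up to 1
y_strong = max(x / d for x, d in zip(flows, demands))
bound_strong = fixed_cost * y_strong
```

Here the weak relaxation bounds the cost at 1.0 while the disaggregated relaxation already reaches the integer optimum of 5.0; on real networks the gap closes only partially, which is why the stronger bound pays off inside branch-and-price-and-cut.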
Abstract:
This material is based upon work supported by the National Science Foundation through the Florida Coastal Everglades Long-Term Ecological Research program under Cooperative Agreements #DBI-0620409 and #DEB-9910514. This image is made available for non-commercial or educational use only.