67 results for "Railroad large scale apparatus"


Relevance: 100.00%

Publisher:

Abstract:

Understanding natural climate variability and its driving factors is crucial to assessing future climate change. Therefore, comparing proxy-based climate reconstructions with forcing factors as well as comparing these with paleoclimate model simulations is key to gaining insights into the relative roles of internal versus forced variability. A review of the state of modelling of the climate of the last millennium prior to the CMIP5–PMIP3 (Coupled Model Intercomparison Project Phase 5–Paleoclimate Modelling Intercomparison Project Phase 3) coordinated effort is presented and compared to the available temperature reconstructions. Simulations and reconstructions broadly agree on reproducing the major temperature changes and suggest an overall linear response to external forcing on multidecadal or longer timescales. Internal variability is found to have an important influence at hemispheric and global scales. The spatial distribution of simulated temperature changes during the transition from the Medieval Climate Anomaly to the Little Ice Age disagrees with that found in the reconstructions. Thus, either internal variability is a possible major player in shaping temperature changes through the millennium or the model simulations have problems realistically representing the response pattern to external forcing. A last millennium transient climate response (LMTCR) is defined to provide a quantitative framework for analysing the consistency between simulated and reconstructed climate. Beyond an overall agreement between simulated and reconstructed LMTCR ranges, this analysis is able to single out specific discrepancies between some reconstructions and the ensemble of simulations. The disagreement is found in the cases where the reconstructions show reduced covariability with external forcings or when they present high rates of temperature change.
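The last millennium transient climate response (LMTCR) introduced above can be illustrated with a minimal sketch. Assuming it is estimated as a least-squares regression slope of temperature anomalies onto external forcing (the regression form, function names, and all data below are illustrative stand-ins, not the paper's actual definition):

```python
# Minimal sketch: estimating a transient-response-style quantity as the
# least-squares slope of temperature anomalies onto external forcing.
# The regression form is an assumption; the series below are synthetic.

def lmtcr_slope(forcing, temperature):
    """Ordinary least-squares slope of temperature (K) on forcing (W m^-2)."""
    n = len(forcing)
    mf = sum(forcing) / n
    mt = sum(temperature) / n
    cov = sum((f - mf) * (t - mt) for f, t in zip(forcing, temperature))
    var = sum((f - mf) ** 2 for f in forcing)
    return cov / var

# Synthetic example: temperature responds linearly to forcing with slope 0.5.
forcing = [0.0, 0.2, -0.4, 0.6, 1.0, -0.2]
temps = [0.5 * f for f in forcing]
slope = lmtcr_slope(forcing, temps)   # recovers 0.5 K per W m^-2
tcr_like = slope * 3.7                # scaled by an assumed CO2-doubling forcing
```

Scaling the slope (K per W m^-2) by the forcing of a CO2 doubling (roughly 3.7 W m^-2) expresses it in the familiar K-per-doubling units of the standard transient climate response.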

Relevance: 100.00%

Publisher:

Abstract:

Changes in Greenland accumulation and the stability in the relationship between accumulation variability and large-scale circulation are assessed by performing time-slice simulations for the present day, the preindustrial era, the early Holocene, and the Last Glacial Maximum (LGM) with a comprehensive climate model. The stability issue is an important prerequisite for reconstructions of Northern Hemisphere atmospheric circulation variability based on accumulation or precipitation proxy records from Greenland ice cores. The analysis reveals that the relationship between accumulation variability and large-scale circulation undergoes a significant seasonal cycle. As the contributions of the individual seasons to the annual signal change, annual mean accumulation variability is not necessarily related to the same atmospheric circulation patterns during the different climate states. Interestingly, within a season, local Greenland accumulation variability is indeed linked to a consistent circulation pattern, which is observed for all studied climate periods, even for the LGM. Hence, it would be possible to deduce a reliable reconstruction of seasonal atmospheric variability (e.g., for North Atlantic winters) if an accumulation or precipitation proxy were available that resolves single seasons. We further show that the simulated impacts of orbital forcing and changes in the ice sheet topography on Greenland accumulation exhibit strong spatial differences, emphasizing that accumulation records from different ice core sites regarding both interannual and long-term (centennial to millennial) variability cannot be expected to look alike since they include a distinct local signature. The only uniform signal to external forcing is the strong decrease in Greenland accumulation during glacial (LGM) conditions and an increase associated with the recent rise in greenhouse gas concentrations.
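The stability question above amounts to asking whether, within one season, the correlation between accumulation and a circulation index holds across climate states. A minimal sketch (the period names, the circulation index, and all numbers are invented stand-ins, not model output):

```python
# Illustrative check of the stability of an accumulation-circulation link:
# Pearson correlation between a winter accumulation series and a circulation
# index, computed separately for each simulated climate state.

import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One (accumulation, circulation-index) pair per time slice, winter only.
periods = {
    "present_day":   ([1.0, 1.4, 0.8, 1.2], [0.5, 1.1, 0.2, 0.9]),
    "preindustrial": ([0.9, 1.3, 0.7, 1.1], [0.4, 1.0, 0.1, 0.8]),
}
winter_links = {name: pearson(acc, idx) for name, (acc, idx) in periods.items()}
```

A consistent sign and magnitude of the correlation across all periods is what would justify a single-season circulation reconstruction from a seasonally resolved proxy.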

Relevance: 100.00%

Publisher:

Abstract:

High-brightness electron sources are of great importance for the operation of hard X-ray free-electron lasers. Field emission cathodes based on double-gate metallic field emitter arrays (FEAs) can potentially offer higher brightness than the currently used sources. We report on the successful application of electron-beam lithography to the fabrication of large-scale single-gate as well as double-gate FEAs. We demonstrate operational high-density single-gate FEAs with sub-micron pitch and up to 10^6 tips, as well as large-scale double-gate FEAs with large collimation gate apertures. The details of the design, the fabrication procedure, and measurements of the emission current from the single- and double-gate cathodes are presented.
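Emission from metallic field-emitter tips is commonly described by the Fowler-Nordheim relation; a minimal sketch of the elementary form, purely for orientation (this relation and the numbers below are textbook illustrations, not measurements from the cathodes reported here):

```python
import math

# Sketch: elementary Fowler-Nordheim current density for a metallic emitter,
# J = (A/phi) * F^2 * exp(-B * phi^1.5 / F), with the standard FN constants.
# Field and work-function values below are illustrative only.

A_FN = 1.541434e-6   # A eV V^-2   (first Fowler-Nordheim constant)
B_FN = 6.830890e9    # eV^-3/2 V m^-1 (second Fowler-Nordheim constant)

def fn_current_density(field_v_per_m, work_function_ev):
    """Elementary FN current density (A/m^2) for local field F and phi."""
    f, phi = field_v_per_m, work_function_ev
    return (A_FN / phi) * f ** 2 * math.exp(-B_FN * phi ** 1.5 / f)

# Example: a molybdenum-like work function of ~4.5 eV at a 5 GV/m local field.
j = fn_current_density(5e9, 4.5)
```

The steep exponential dependence on the local field is why gate electrodes, which shape and enhance that field at each tip, dominate emitter design.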

Relevance: 100.00%

Publisher:

Abstract:

Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
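The three phases can be sketched structurally as follows, with a greedy earliest-free-unit rule standing in for the MILP of phase 2 and a toy critical-path move for phase 3 (all operation data are invented; a real implementation would call an MILP solver and enforce the technological constraints listed above):

```python
# Structural sketch of the three-phase hybrid approach described above.
# Phase 2's MILP is replaced by a greedy stand-in; phase 3's improvement
# step is reduced to re-placing the single operation ending the schedule.

def partition(ops, chunk):
    """Phase 1: split the operation list into subsets of bounded size."""
    return [ops[i:i + chunk] for i in range(0, len(ops), chunk)]

def schedule_subset(subset, unit_free):
    """Phase 2 stand-in: assign each operation to the earliest-free unit."""
    placed = []
    for name, dur in subset:
        u = min(unit_free, key=unit_free.get)
        start = unit_free[u]
        unit_free[u] = start + dur
        placed.append((name, u, start, start + dur))
    return placed

def improve(schedule, unit_free):
    """Phase 3 sketch: re-place the operation that ends the critical path.

    Simplification: assumes that operation is the last one on its unit.
    """
    last = max(schedule, key=lambda rec: rec[3])
    name, u, start, end = last
    unit_free[u] = start                      # free its slot on the old unit
    best = min(unit_free, key=unit_free.get)  # earliest-free unit now
    s = unit_free[best]
    unit_free[best] = s + (end - start)
    schedule[schedule.index(last)] = (name, best, s, s + (end - start))
    return schedule

ops = [("op%d" % i, d) for i, d in enumerate([3, 2, 4, 1, 2, 5])]
units = {"U1": 0, "U2": 0}
sched = []
for subset in partition(ops, 3):              # phases 1 and 2, iteratively
    sched.extend(schedule_subset(subset, units))
sched = improve(sched, units)                 # phase 3
makespan = max(rec[3] for rec in sched)
```

The design point this mirrors is that the MILP is only ever asked to schedule a small subset at a time, which is what keeps instances with 500-1500 operations tractable.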

Relevance: 100.00%

Publisher:

Abstract:

Well-known data mining algorithms rely on inputs in the form of pairwise similarities between objects. For large datasets it is computationally impossible to perform all pairwise comparisons. We therefore propose a novel approach that uses approximate Principal Component Analysis to efficiently identify groups of similar objects. The effectiveness of the approach is demonstrated in the context of binary classification using the supervised normalized cut as a classifier. For large datasets from the UCI repository, the approach significantly improves run times with minimal loss in accuracy.
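The idea can be sketched in pure Python: approximate the leading principal component by power iteration, project all objects onto it, and generate candidate pairs only among neighbours in projection order (the data, window size, and function names are illustrative, not the paper's algorithm):

```python
# Sketch: project objects onto an approximate leading principal component
# (power iteration) and compare only objects whose projections fall close
# together, avoiding the full all-pairs similarity computation.

import math

def top_component(data, iters=100):
    """Approximate first principal component via power iteration."""
    dim = len(data[0])
    means = [sum(row[j] for row in data) / len(data) for j in range(dim)]
    centred = [[row[j] - means[j] for j in range(dim)] for row in data]
    v = [1.0] * dim
    for _ in range(iters):
        # w = C v computed as X^T (X v), never forming the covariance matrix
        xv = [sum(r[j] * v[j] for j in range(dim)) for r in centred]
        w = [sum(centred[i][j] * xv[i] for i in range(len(centred)))
             for j in range(dim)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v, means

def candidate_pairs(data, window=1):
    """Only compare objects adjacent in projection order (window neighbours)."""
    v, means = top_component(data)
    proj = sorted(
        (sum((row[j] - means[j]) * v[j] for j in range(len(v))), i)
        for i, row in enumerate(data))
    pairs = set()
    for k in range(len(proj)):
        for m in range(k + 1, min(k + 1 + window, len(proj))):
            pairs.add(tuple(sorted((proj[k][1], proj[m][1]))))
    return pairs

# Two well-separated groups: the two members of each group end up adjacent.
data = [[0.0, 0.1], [0.1, 0.0], [5.0, 5.1], [5.1, 5.0]]
pairs = candidate_pairs(data)   # 3 candidate pairs instead of all 6
```

The saving grows with dataset size: sorting projections costs O(n log n), whereas all-pairs comparison costs O(n^2) similarity evaluations.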