882 results for large scale data gathering
Abstract:
The quality of education should remain stable or even permanently increase, even if the number of students rises. Quality of education is often related to opportunities for active learning and individual facilitation. This paper addresses the question of how high-quality learning can be enabled within oversized courses and presents the e-flashcard approach, which enables active learning and individual facilitation within large-scale university courses.
Abstract:
Understanding natural climate variability and its driving factors is crucial to assessing future climate change. Therefore, comparing proxy-based climate reconstructions with forcing factors as well as comparing these with paleoclimate model simulations is key to gaining insights into the relative roles of internal versus forced variability. A review of the state of modelling of the climate of the last millennium prior to the CMIP5–PMIP3 (Coupled Model Intercomparison Project Phase 5–Paleoclimate Modelling Intercomparison Project Phase 3) coordinated effort is presented and compared to the available temperature reconstructions. Simulations and reconstructions broadly agree on reproducing the major temperature changes and suggest an overall linear response to external forcing on multidecadal or longer timescales. Internal variability is found to have an important influence at hemispheric and global scales. The spatial distribution of simulated temperature changes during the transition from the Medieval Climate Anomaly to the Little Ice Age disagrees with that found in the reconstructions. Thus, either internal variability is a possible major player in shaping temperature changes through the millennium or the model simulations have problems realistically representing the response pattern to external forcing. A last millennium transient climate response (LMTCR) is defined to provide a quantitative framework for analysing the consistency between simulated and reconstructed climate. Beyond an overall agreement between simulated and reconstructed LMTCR ranges, this analysis is able to single out specific discrepancies between some reconstructions and the ensemble of simulations. The disagreement is found in the cases where the reconstructions show reduced covariability with external forcings or when they present high rates of temperature change.
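The abstract does not state how the LMTCR is computed; a minimal, hypothetical reading is a regression of multidecadally smoothed temperature on total external forcing, rescaled to the forcing of a CO2 doubling (assumed here to be 3.7 W m^-2). The series, smoothing window, and scaling in the Python sketch below are assumptions for illustration, not the paper's definition.

```python
import numpy as np

# Illustrative only: synthetic last-millennium series standing in for a
# reconstruction (or simulation) and its total external forcing.
rng = np.random.default_rng(0)
years = np.arange(850, 1850)                                                   # annual steps
forcing = -0.3 * np.sin(2 * np.pi * years / 400) + rng.normal(0, 0.1, years.size)  # W m^-2
temp = 0.4 * forcing + rng.normal(0, 0.15, years.size)                             # K

def lmtcr_like(temperature, forcing, f2x=3.7, window=31):
    """Regression-based transient-response estimate (assumed form, not the
    paper's exact LMTCR definition): K of temperature change per
    CO2-doubling equivalent of forcing, on multidecadal timescales."""
    kernel = np.ones(window) / window
    t_smooth = np.convolve(temperature, kernel, mode="valid")  # multidecadal smoothing
    f_smooth = np.convolve(forcing, kernel, mode="valid")
    slope = np.polyfit(f_smooth, t_smooth, 1)[0]               # K per (W m^-2)
    return slope * f2x                                         # K per CO2 doubling

print(f"LMTCR-like estimate: {lmtcr_like(temp, forcing):.2f} K per CO2 doubling")
```

Comparing such slope estimates from an ensemble of simulations against those from individual reconstructions is one way a consistency check of this kind could flag reconstructions with reduced covariability with the external forcing.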
Abstract:
Changes in Greenland accumulation and the stability in the relationship between accumulation variability and large-scale circulation are assessed by performing time-slice simulations for the present day, the preindustrial era, the early Holocene, and the Last Glacial Maximum (LGM) with a comprehensive climate model. The stability issue is an important prerequisite for reconstructions of Northern Hemisphere atmospheric circulation variability based on accumulation or precipitation proxy records from Greenland ice cores. The analysis reveals that the relationship between accumulation variability and large-scale circulation undergoes a significant seasonal cycle. As the contributions of the individual seasons to the annual signal change, annual mean accumulation variability is not necessarily related to the same atmospheric circulation patterns during the different climate states. Interestingly, within a season, local Greenland accumulation variability is indeed linked to a consistent circulation pattern, which is observed for all studied climate periods, even for the LGM. Hence, it would be possible to deduce a reliable reconstruction of seasonal atmospheric variability (e.g., for North Atlantic winters) if an accumulation or precipitation proxy were available that resolves single seasons. We further show that the simulated impacts of orbital forcing and changes in the ice sheet topography on Greenland accumulation exhibit strong spatial differences, emphasizing that accumulation records from different ice core sites regarding both interannual and long-term (centennial to millennial) variability cannot be expected to look alike since they include a distinct local signature. The only uniform signal to external forcing is the strong decrease in Greenland accumulation during glacial (LGM) conditions and an increase associated with the recent rise in greenhouse gas concentrations.
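As a hedged illustration of why a seasonally resolving proxy would help, the sketch below correlates synthetic seasonal accumulation series with a synthetic circulation index for each season and for the annual mean; the coupling strengths and the use of a plain Pearson correlation are assumptions for illustration, not the paper's actual diagnostics.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 200

# Synthetic stand-ins: one circulation index per season (e.g. an NAO-like
# index for DJF) and a seasonal accumulation signal tied to it with a
# season-dependent strength -- purely illustrative numbers.
seasons = ["DJF", "MAM", "JJA", "SON"]
coupling = {"DJF": 0.8, "MAM": 0.4, "JJA": 0.1, "SON": 0.5}

circulation = {s: rng.standard_normal(n_years) for s in seasons}
accumulation = {s: coupling[s] * circulation[s] + rng.standard_normal(n_years)
                for s in seasons}

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

# Within each season, accumulation and circulation are clearly linked ...
for s in seasons:
    print(f"{s}: r = {pearson(accumulation[s], circulation[s]):+.2f}")

# ... but the annual mean mixes all seasonal contributions, so its
# correlation with any single season's circulation is weaker.
annual_acc = np.mean([accumulation[s] for s in seasons], axis=0)
print(f"annual vs DJF circulation: r = {pearson(annual_acc, circulation['DJF']):+.2f}")
```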
Abstract:
BACKGROUND: Enterococcus faecalis has emerged as a major hospital pathogen. To explore its diversity, we sequenced E. faecalis strain OG1RF, which is commonly used for molecular manipulation and virulence studies. RESULTS: The 2,739,625 base pair chromosome of OG1RF was found to contain approximately 232 kilobases unique to this strain compared to V583, the only publicly available sequenced strain. Almost no mobile genetic elements were found in OG1RF. The 64 areas of divergence were classified into three categories. First, OG1RF carries 39 unique regions, including 2 CRISPR loci and a new WxL locus. Second, we found nine replacements where a sequence specific to V583 was substituted by a sequence specific to OG1RF. For example, the iol operon of OG1RF replaces a possible prophage and the vanB transposon in V583. Finally, we found 16 regions that were present in V583 but missing from OG1RF, including the proposed pathogenicity island, several probable prophages, and the cpsCDEFGHIJK capsular polysaccharide operon. OG1RF was more rapidly but less frequently lethal than V583 in the mouse peritonitis model and considerably outcompeted V583 in a murine model of urinary tract infections. CONCLUSION: E. faecalis OG1RF carries a number of unique loci compared to V583, but the almost complete lack of mobile genetic elements demonstrates that this is not a defining feature of the species. Additionally, OG1RF's effects in experimental models suggest that mediators of virulence may be diverse between different E. faecalis strains and that virulence is not dependent on the presence of mobile genetic elements.
Abstract:
High brightness electron sources are of great importance for the operation of hard X-ray free-electron lasers. Field emission cathodes based on double-gate metallic field emitter arrays (FEAs) can potentially offer higher brightness than the sources currently in use. We report on the successful application of electron-beam lithography to the fabrication of large-scale single-gate as well as double-gate FEAs. We demonstrate operational high-density single-gate FEAs with sub-micron pitch and up to 10^6 tips, as well as large-scale double-gate FEAs with large collimation gate apertures. The details of the design, the fabrication procedure, and successful measurements of the emission current from the single- and double-gate cathodes are presented.
Abstract:
Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
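A minimal Python sketch of the three-phase structure described above follows, with strong simplifications: the subset decomposition is a plain fixed-size split, the subset-scheduling step is a greedy single-machine placeholder standing in for the paper's generic MILP formulation, and the improvement phase is reduced to a precedence-only left shift. The Operation class, the toy instance, and all rules are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Operation:
    op_id: int
    duration: int
    predecessors: list[int] = field(default_factory=list)  # ids that must finish first
    start: int = 0

def schedule_subset(subset, schedule, machine_free):
    """Phase 2 placeholder: the paper solves a generic, flexible MILP here;
    this toy rule just queues the subset on one machine after its predecessors."""
    for op in subset:
        ready = max((schedule[p].start + schedule[p].duration for p in op.predecessors),
                    default=0)
        op.start = max(ready, machine_free)
        machine_free = op.start + op.duration
        schedule[op.op_id] = op
    return machine_free

def improve(schedule):
    """Phase 3 placeholder for the critical-path-based improvement procedure:
    left-shift every operation as early as precedence allows (the capacity and
    changeover constraints of the real problem are ignored in this toy)."""
    for op in sorted(schedule.values(), key=lambda o: o.start):
        op.start = max((schedule[p].start + schedule[p].duration for p in op.predecessors),
                       default=0)

def hybrid_schedule(operations, subset_size=3):
    schedule, machine_free = {}, 0
    # Phase 1: divide the set of operations into subsets.
    subsets = [operations[i:i + subset_size] for i in range(0, len(operations), subset_size)]
    # Phase 2: schedule the subsets iteratively.
    for subset in subsets:
        machine_free = schedule_subset(subset, schedule, machine_free)
    # Phase 3: improvement pass on the resulting schedule.
    improve(schedule)
    return schedule, max(o.start + o.duration for o in schedule.values())

# Tiny hypothetical instance with precedences 1 -> 3 and 2 -> 4.
ops = [Operation(1, 4), Operation(2, 2), Operation(3, 3, [1]), Operation(4, 5, [2])]
schedule, makespan = hybrid_schedule(ops)
print({o.op_id: o.start for o in schedule.values()}, "makespan:", makespan)
```

In this toy run the left-shift pass shortens the makespan produced by the greedy phase; in the paper's method that role is played by the novel critical path-based improvement procedure applied to the MILP-built schedule.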