868 results for temperature-based models
Abstract:
We found a significant positive correlation between local summer air temperature (May-September) and the annual sediment mass accumulation rate (MAR) in Lake Silvaplana (46°N, 9°E, 1800 m a.s.l.) during the twentieth century (r = 0.69, p < 0.001 for decadally smoothed series). Sediment trap data (2001-2005) confirm this relation, with exceptionally high particle yields during the hottest summer of the last 140 years in 2003. On this basis we developed a decadal-scale summer temperature reconstruction back to AD 1580. Surprisingly, comparison of our reconstruction with two other independent regional summer temperature reconstructions (based on tree-rings and documentary data) revealed a significant negative correlation for the pre-1900 data (i.e., the late 'Little Ice Age'). This demonstrates that the correlation between MAR and summer temperature is not stable in time and that the actualistic principle does not apply in this case. We suggest that different climatic regimes (modern/'Little Ice Age') lead to changing state conditions in the catchment and thus to considerably different sediment transport mechanisms. Therefore, we calibrated our MAR data with gridded early-instrumental temperature series from AD 1760-1880 (r = -0.48, p < 0.01 for decadally smoothed series) to properly reconstruct the late-LIA climatic conditions. We found exceptionally low temperatures between AD 1580 and 1610 (0.75°C below the twentieth-century mean) and during the late Maunder Minimum from AD 1680 to 1710 (0.5°C below the twentieth-century mean). In general, summer temperatures did not experience major negative departures from the twentieth-century mean during the late 'Little Ice Age'. This compares well with the two existing independent regional reconstructions, suggesting that the LIA in the Alps was mainly a phenomenon of the cold season.
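The calibration described here lends itself to a compact illustration. Below is a minimal Python sketch, on synthetic data, of the general approach: decadally smooth the MAR and temperature series, correlate them, and fit a linear model that can then be inverted over the pre-instrumental MAR record. All names and numbers are illustrative, not the authors' actual data or code.

```python
# Minimal sketch (synthetic data, not the Lake Silvaplana series) of the
# calibration approach: decadally smooth MAR and temperature, correlate
# them, and fit a linear model that can be inverted over older MAR data.
import numpy as np

def decadal_smooth(x, window=10):
    """Centred running mean as a simple stand-in for decadal smoothing."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
temp = 15 + 0.01 * (years - 1900) + rng.normal(0, 0.3, years.size)   # degC
mar = 50 + 20 * (temp - temp.mean()) + rng.normal(0, 2, years.size)  # g/m2/yr

mar_s, temp_s = decadal_smooth(mar), decadal_smooth(temp)
r = np.corrcoef(mar_s, temp_s)[0, 1]
a, b = np.polyfit(mar_s, temp_s, deg=1)   # temp ~ a * MAR + b
reconstruction = a * mar_s + b            # applied to pre-1900 MAR in practice
print(f"r = {r:.2f}")
```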
Abstract:
Relatively little is known about past cold-season temperature variability in high-Alpine regions because of a lack of natural cold-season temperature proxies as well as under-representation of high-altitude sites in meteorological, early-instrumental and documentary data sources. Recent studies have shown that chrysophyte stomatocysts, or simply cysts (sub-fossil algal remains of Chrysophyceae and Synurophyceae), are among the very few natural proxies that can be used to reconstruct cold-season temperatures. This study presents a quantitative, high-resolution (5-year), cold-season (Oct–May) temperature reconstruction based on sub-fossil chrysophyte stomatocysts in the annually laminated (varved) sediments of high-Alpine Lake Silvaplana, SE Switzerland (1,789 m a.s.l.), since AD 1500. We first explore the method used to translate an ecologically meaningful variable based on a biological proxy into a simple climate variable. A transfer function was applied to reconstruct the ‘date of spring mixing’ from cyst assemblages. Next, statistical regression models were tested to convert the reconstructed ‘dates of spring mixing’ into cold-season surface air temperatures with associated errors. The strengths and weaknesses of this approach are thoroughly tested. One much-debated, basic assumption for reconstructions (‘stationarity’), which states that only the environmental variable of interest has influenced cyst assemblages and the influence of confounding variables is negligible over time, is addressed in detail. Our inferences show that past cold-season air-temperature fluctuations were substantial and larger than those of other temperature reconstructions for Europe and the Alpine region. Interestingly, in this study, recent cold-season temperatures only just exceed those of previous, multi-decadal warm phases since AD 1500. These findings highlight the importance of local studies to assess natural climate variability at high altitudes.
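The abstract does not spell out the transfer-function form, but weighted averaging (WA) is a common choice for assemblage-based transfer functions; the following Python sketch shows the general idea on toy data (hypothetical taxa and dates, not the study's calibration set).

```python
# Weighted-averaging (WA) transfer-function sketch on toy data; taxa and
# dates are hypothetical, and the study's actual model may differ.
import numpy as np

def wa_optima(abund, env):
    """Taxon optimum = abundance-weighted mean of training env values.
    abund: (n_samples, n_taxa) relative abundances; env: (n_samples,)."""
    return (abund.T @ env) / abund.sum(axis=0)

def wa_reconstruct(abund, optima):
    """Inferred env value = abundance-weighted mean of taxon optima."""
    return (abund @ optima) / abund.sum(axis=1)

# Toy training set: 5 samples, 3 cyst taxa, known spring-mixing dates.
train = np.array([[0.7, 0.2, 0.1],
                  [0.5, 0.3, 0.2],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5],
                  [0.0, 0.3, 0.7]])
mixing_doy = np.array([110, 120, 135, 150, 160])   # day of year

optima = wa_optima(train, mixing_doy)
fossil = np.array([[0.3, 0.4, 0.3]])               # one down-core assemblage
print(wa_reconstruct(fossil, optima))              # inferred spring-mixing date
```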
Abstract:
The mid-Holocene (6 kyr BP; thousand years before present) is a key period for studying the consistency between model results and proxy-based reconstructions, as it is a standard test case for models and a reasonable number of proxy-based records is available. Taking advantage of this relatively large amount of information, we have compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern, but that the models underestimate the magnitude of some observed changes and that large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation using a data-assimilation method based on a particle filter. In one simulation, all 50 proxy-based records are used, while in the other two only the continental or only the oceanic proxy-based records constrain the model results. As expected, data assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the mid-latitude westerlies that warms northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxy-based paleoclimate records whose reconstructed signal is incompatible either with the signal recorded by some other proxy-based records or with model physics.
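As a rough illustration of the particle-filter assimilation step described above, the following Python sketch weights an ensemble of simulated anomalies by their fit to proxy values and resamples accordingly; the likelihood form, names and numbers are assumptions, not LOVECLIM's actual implementation.

```python
# Hypothetical bootstrap particle-filter step: weight ensemble members
# ("particles") by their fit to the proxy reconstructions, then resample
# so that well-fitting members are duplicated and poor ones discarded.
import numpy as np

def particle_filter_step(particles, proxies, obs_error):
    """particles: (n_particles, n_sites) simulated anomalies;
    proxies: (n_sites,) reconstructed anomalies at the proxy sites."""
    # Gaussian likelihood of each particle given the proxy values
    misfit = ((particles - proxies) ** 2).sum(axis=1)
    weights = np.exp(-0.5 * misfit / obs_error**2)
    weights /= weights.sum()
    # Resample particle indices in proportion to their weights
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

rng = np.random.default_rng(0)
ensemble = rng.normal(0, 1, size=(100, 50))   # 100 particles, 50 proxy sites
proxy_values = rng.normal(0.5, 0.5, size=50)
ensemble = particle_filter_step(ensemble, proxy_values, obs_error=1.0)
```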
Abstract:
Sea surface temperatures and sea-ice extent are the most critical variables for evaluating the paleoceanographic evolution of the Southern Ocean in relation to the development of the global carbon cycle, atmospheric CO2 variability and ocean-atmosphere circulation. In contrast to the Atlantic and Indian sectors, the Pacific sector of the Southern Ocean has so far been insufficiently investigated. To fill this gap, we present diatom-based estimates of summer sea surface temperature (SSST) and winter sea-ice concentration (WSI) from 17 sites in the polar South Pacific to study the Last Glacial Maximum (LGM) at the EPILOG time slice (19,000-23,000 cal. years BP). We applied the Imbrie and Kipp Method (IKM) to estimate temperature and the Modern Analog Technique (MAT) to estimate sea-ice concentration. Our data display a distinct LGM east-west differentiation in SSST and WSI, with steeper latitudinal temperature gradients and a winter sea-ice edge located consistently north of the Pacific-Antarctic Ridge in the Ross Sea sector. In the eastern sector of our study area, which is governed by the Amundsen Abyssal Plain, the estimates yield weaker latitudinal SSST gradients together with a variable, extended winter sea-ice field. In this sector, sea ice may sporadically have reached the area of the present Subantarctic Front at its maximum LGM expansion. This pattern points to topographic forcing as the major control on frontal-system location and sea-ice extent in the western Pacific sector, whereas atmospheric conditions such as the Southern Annular Mode and ENSO affected the oceanographic conditions in the eastern Pacific sector. Although it is difficult to determine the location and physical nature of the frontal systems separating the glacial Southern Ocean water masses into different zones, we found a distinct temperature gradient in the latitudes straddled by the modern Southern Subtropical Front. Considering that glacial temperatures north of this zone are similar to modern ones, we suggest that this gradient represents the Glacial Southern Subtropical Front (GSSTF), which delimits the zone of strongest glacial SSST cooling (>4 K) to its north. The southern boundary of the zone of maximum cooling is close to the glacial 4°C isotherm. This isotherm, which is in the range of SSST at the modern Antarctic Polar Front (APF), is a circum-Antarctic feature and marks the northern edge of the glacial Antarctic Circumpolar Current (ACC). We also assume that a glacial front was established at the average northern winter sea-ice edge, comparable to the modern Southern Antarctic Circumpolar Current Front (SACCF). During the glacial, this front would have been located in the area of the modern APF. The northward deflection of colder-than-modern surface waters along the South American continent led to significant cooling of the glacial Humboldt Current surface waters (4-8 K), which affected temperature regimes as far north as tropical latitudes. The glacial reduction of ACC temperatures may also have caused significant cooling in the Atlantic and Indian sectors of the Southern Ocean, thus enhancing the thermal differentiation of the Southern Ocean and Antarctic continental cooling. Comparison with numerical simulations of last-glacial temperature and sea-ice cover shows that the majority of modern models overestimate summer and winter sea-ice cover and that few models reproduce our temperature data well.
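For readers unfamiliar with MAT, the following Python sketch shows its core idea on synthetic data: measure dissimilarity between a fossil assemblage and modern core-top assemblages (squared chord distance is customary for microfossil counts) and average the environmental values of the closest analogs. Details of the study's actual implementation may differ.

```python
# Modern Analog Technique (MAT) sketch: for a fossil diatom assemblage,
# find the k most similar modern assemblages and average their observed
# winter sea-ice values. All data below are synthetic and illustrative.
import numpy as np

def squared_chord(a, b):
    return ((np.sqrt(a) - np.sqrt(b)) ** 2).sum(axis=-1)

def mat_estimate(fossil, modern, modern_env, k=5):
    """fossil: (n_taxa,), modern: (n_sites, n_taxa), modern_env: (n_sites,)."""
    d = squared_chord(modern, fossil)
    nearest = np.argsort(d)[:k]
    return modern_env[nearest].mean()

rng = np.random.default_rng(1)
modern = rng.dirichlet(np.ones(20), size=200)   # 200 core-top assemblages
wsi = rng.uniform(0, 100, size=200)             # winter sea-ice concentration (%)
fossil = rng.dirichlet(np.ones(20))
print(mat_estimate(fossil, modern, wsi))
```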
Abstract:
Impact response surfaces (IRSs) depict the response of an impact variable to changes in two explanatory variables as a plotted surface. Here, IRSs of spring and winter wheat yields were constructed from a 25-member ensemble of process-based crop simulation models. Twenty-one models were calibrated by different groups using a common set of calibration data, with calibrations applied independently to the same models in three cases. The sensitivity of modelled yield to changes in temperature and precipitation was tested by systematically modifying values of 1981-2010 baseline weather data to span the range of changes projected for the late 21st century at three locations in Europe.
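Conceptually, an IRS is just model output evaluated over a two-dimensional grid of perturbations. The Python sketch below builds such a grid with a stand-in yield_model function (purely hypothetical, not one of the 25 ensemble members).

```python
# Sketch of how an impact response surface (IRS) can be built: run a crop
# model (here a hypothetical yield_model stand-in) over a grid of
# temperature and precipitation perturbations applied to baseline weather.
import numpy as np

def yield_model(d_temp, d_precip):
    """Placeholder for a process-based crop model; returns yield (t/ha)."""
    return 6.0 - 0.15 * d_temp**2 + 0.03 * d_precip - 0.0005 * d_precip**2

d_temps = np.arange(-2, 9)            # temperature changes, degC
d_precips = np.arange(-50, 51, 10)    # precipitation changes, %

# The IRS is modelled yield evaluated on the perturbation grid; plotting
# it as a contour surface gives figures of the kind the study describes.
irs = np.array([[yield_model(dt, dp) for dp in d_precips] for dt in d_temps])
print(irs.shape)   # (n_temperature_changes, n_precipitation_changes)
```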
Abstract:
The next phase envisioned for the World Wide Web is automated ad-hoc interaction between intelligent agents, web services, databases and semantic-web-enabled applications. Although at present this appears to be a distant objective, there are practical steps that can be taken to advance the vision. We propose an extension to classical conceptual models that allows application components to be defined in terms of public standards and explicit semantics, thus building into web-based applications the foundation for shared understanding and interoperability. The use of external definitions, and the need to store outsourced type information internally, bring to light the issue of object identity in a global environment where object instances may be identified by multiple externally controlled identification schemes. We illustrate how traditional conceptual models may be augmented to recognise and deal with multiple identities.
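One simple way to picture such multiple-identity handling: each instance carries a set of (identification scheme, value) pairs, and instances are unified when any pair matches. The Python sketch below is a hypothetical illustration, not the paper's proposed conceptual-model notation.

```python
# Hypothetical sketch: objects carry several externally controlled
# identifiers; two instances denote the same entity if any pair matches.
from dataclasses import dataclass, field

@dataclass
class ExternalId:
    scheme: str    # e.g. an identification-scheme URI
    value: str

@dataclass
class Entity:
    ids: list[ExternalId] = field(default_factory=list)

    def same_as(self, other: "Entity") -> bool:
        mine = {(i.scheme, i.value) for i in self.ids}
        theirs = {(i.scheme, i.value) for i in other.ids}
        return bool(mine & theirs)

a = Entity([ExternalId("urn:isbn", "978-3-16-148410-0")])
b = Entity([ExternalId("urn:isbn", "978-3-16-148410-0"),
            ExternalId("http://example.org/catalog", "42")])
print(a.same_as(b))   # True: one scheme/value pair is shared
```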
Abstract:
Australia needs highly skilled workers to sustain a healthy economy. Current employment-based training models have limitations in meeting the demand for a highly skilled labour supply. This research explored current and emerging models of employment-based training in order to propose more effective models at higher-level VET qualifications that can maintain a balance between institution-based and work-based learning.
Abstract:
Childcare workers play a significant role in the learning and development of children in their care. This has major implications for the training of workers. Under new reforms of the childcare industry, the Australian government now requires all workers to obtain qualifications from a vocational education and training provider (e.g., Technical and Further Education) or a university. Effective models of employment-based training are critical for producing highly competent workers. This paper presents findings from a study that examined current and emerging models of employment-based training (EBT) in the childcare sector, particularly at the Diploma level. Semi-structured interviews were conducted with a sample of 16 participants representing childcare directors, employers and workers located in childcare services in urban, regional and remote locations in the State of Queensland. The study proposes a 'best-fit' employment-based training approach characterised by a compendium of five models rather than a 'one-size-fits-all' solution. Issues with the successful implementation of the EBT models are also discussed.
Abstract:
A pragmatic method is proposed for assessing the accuracy and precision of a given processing pipeline for converting computed tomography (CT) image data of bones into representative three-dimensional (3D) models of bone shapes. The method is based on coprocessing a control object of known geometry, which enables assessment of the quality of the resulting 3D models. Distance measurements were obtained and statistically evaluated at three stages of the conversion process. For this study, 31 CT datasets were processed. The final 3D model of the control object showed an average deviation from reference values of −1.07±0.52 mm standard deviation (SD) for edge distances and −0.647±0.43 mm SD for parallel side distances of the control object. Coprocessing a reference object enables the assessment of the accuracy and precision of a given processing pipeline for creating CT-based 3D bone models and is suitable for detecting most systematic or human errors in processing a CT scan. Typical errors are of about the same size as the scan resolution.
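The statistical evaluation reduces to comparing measured distances on the reconstructed model against the control object's known geometry. A minimal Python sketch, with illustrative numbers only:

```python
# Sketch of the evaluation step: compare distances measured on the
# reconstructed control object against its known reference geometry and
# report the mean deviation and standard deviation, as quoted above.
import numpy as np

def deviation_stats(measured, reference):
    dev = np.asarray(measured) - reference
    return dev.mean(), dev.std(ddof=1)

rng = np.random.default_rng(0)
reference_edge = 50.0                             # mm, known geometry
measured_edges = rng.normal(48.9, 0.5, size=31)   # mm, illustrative only
mean_dev, sd = deviation_stats(measured_edges, reference_edge)
print(f"{mean_dev:+.2f} ± {sd:.2f} mm SD")
```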
Abstract:
Many industrial processes and systems can be modelled mathematically by a set of partial differential equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation and process-control purposes. However, major difficulties appear when solving PDEs with singularities. Traditional numerical methods, such as finite difference, finite element and polynomial-based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computational power, owing to the large number of elements or mesh points needed to accommodate sharp variations. To tackle this challenging problem, wavelet-based approaches and high-resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet-based approaches and high-resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter highlights the successful application of these new methods to solving complex models of simulated-moving-bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, exhibit wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet-based approaches and high-resolution methods, a single-column chromatographic process modelled by a Transport-Dispersive-Equilibrium linear model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation and high-resolution methods are evaluated by quantitative comparison with the analytical solution for a range of Peclet numbers. After that, the advantages of the wavelet-based approaches and high-resolution methods are further demonstrated through application to a dynamic SMB model for an enantiomer separation process. This research has revealed that for a PDE system with a low Peclet number all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of accuracy. The high-resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet-collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet-based approaches and the high-resolution methods are good candidates in terms of computational demand and prediction accuracy at the steep front. The high-resolution methods showed better stability in reaching steady state in the specific case studied in this chapter.
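As a point of reference for the comparisons mentioned above, the following Python sketch implements the upwind-1 finite-difference baseline for a 1-D transport-dispersive equation ∂c/∂t = -u ∂c/∂x + D ∂²c/∂x² with D = uL/Pe; parameter values are illustrative. At high Peclet numbers this scheme needs very fine grids to resolve the steep front, which is precisely where wavelet-collocation and high-resolution schemes are advantageous.

```python
# Upwind-1 finite-difference baseline for a 1-D transport-dispersive
# equation (illustrative parameters, not the chapter's SMB model).
import numpy as np

def upwind_step(c, u, D, dx, dt):
    adv = -u * (c[1:-1] - c[:-2]) / dx                  # first-order upwind
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # central diffusion
    c_new = c.copy()
    c_new[1:-1] += dt * (adv + dif)                     # ends act as boundaries
    return c_new

L, u, Pe = 1.0, 1.0, 100.0
D = u * L / Pe
nx = 200
dx = L / (nx - 1)
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stability-limited time step
c = np.zeros(nx)
for _ in range(int(0.5 / dt)):            # inject a step at the inlet
    c[0] = 1.0
    c = upwind_step(c, u, D, dx, dt)
# The computed front at x ~ 0.5 is visibly smeared by numerical diffusion,
# the weakness the wavelet and high-resolution methods address.
```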
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. While of great value, such repositories incur high management costs, so it becomes essential to keep track of the various model versions as they overlap, supersede one another and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions and to handle change propagation automatically. The focal point of this technique is to version single process model fragments rather than entire process models. Indeed, empirical evidence shows that real-life process model repositories contain numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
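The fragment-level sharing can be pictured as a content-addressed store: identical fragments hash to the same key and are stored once, while a model version is merely a list of fragment keys. The Python sketch below is a simplified illustration of this storage idea, not the paper's actual structure.

```python
# Simplified content-addressed fragment store: each process-model fragment
# is stored once under a content hash, so identical fragments are shared
# across model versions; a version is just a list of fragment keys.
import hashlib

class FragmentStore:
    def __init__(self):
        self.fragments = {}   # content hash -> fragment body
        self.versions = {}    # (model, version) -> list of hashes

    def put_fragment(self, body: str) -> str:
        key = hashlib.sha1(body.encode()).hexdigest()
        self.fragments[key] = body   # deduplicated automatically
        return key

    def commit(self, model: str, version: int, bodies: list[str]):
        self.versions[(model, version)] = [self.put_fragment(b) for b in bodies]

store = FragmentStore()
store.commit("claims", 1, ["check-id", "assess", "pay-out"])
store.commit("claims", 2, ["check-id", "assess-fast", "pay-out"])
# Versions 1 and 2 share two fragments; only "assess-fast" is stored anew.
print(len(store.fragments))   # 4 distinct fragments, not 6
```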
Abstract:
The health of tollbooth workers is seriously threatened by long-term exposure to air polluted by vehicle exhaust. Using traffic data collected at a toll plaza, vehicle movements were simulated with a system dynamics model under different traffic volumes and toll collection procedures, which allowed the average travel time of vehicles to be calculated. A three-dimensional Computational Fluid Dynamics (CFD) model with a k–ε turbulence model was then used to simulate pollutant dispersion at the toll plaza for the different traffic volumes and toll collection procedures. The simulations show that pollutant concentration around tollbooths increases as traffic volume increases. Whether traffic volume is low or high (1500 vehicles/h or 2500 vehicles/h), pollutant concentration decreases if electronic toll collection (ETC) is adopted, and it decreases further as the proportion of ETC-equipped vehicles increases. However, if the proportion of ETC-equipped vehicles is very low and the traffic volume is not heavy, pollutant concentration increases as the number of ETC lanes increases.
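The mechanism behind the ETC result is essentially dwell time: ETC lanes service vehicles much faster than manual booths, so vehicles idle near the booths for less time. A back-of-envelope Python sketch (service times are assumed for illustration, not taken from the study):

```python
# Hypothetical dwell-time illustration: average service time at the plaza
# as a mix of manual and electronic toll collection (ETC); exhaust exposure
# near booths scales with the time vehicles spend queuing and idling there.
def avg_service_time(etc_share, t_etc=3.0, t_manual=14.0):
    """Weighted mean service time per vehicle, seconds (illustrative values)."""
    return etc_share * t_etc + (1 - etc_share) * t_manual

for share in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"ETC share {share:.0%}: {avg_service_time(share):.1f} s per vehicle")
```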
Abstract:
In this paper, two ideal formation models of serrated chips, the symmetric formation model and the unilateral right-angle formation model, are established for the first time. Based on these ideal models and the related adiabatic shear theory of serrated chip formation, the theoretical relationship among average tooth pitch, average tooth height and chip thickness is obtained. Furthermore, the theoretical relation between the passivation coefficient of the chip's sawtooth and the chip thickness compression ratio is deduced. The comparison between the theoretical prediction curves and experimental data shows good agreement, which validates the robustness of the ideal chip formation models and the correctness of the theoretical derivation. The proposed ideal models may provide a simple but effective theoretical basis for subsequent research on serrated chip morphology. Finally, the influences of the principal cutting factors on serrated chip formation are discussed on the basis of a series of finite element simulation results, yielding practical advice for controlling serrated chips in engineering applications.