81 results for Sonar Simulations
Abstract:
It is often claimed that scientists can obtain new knowledge about nature by running computer simulations. How is this possible? I answer this question by arguing that computer simulations are arguments. This view parallels Norton’s argument view of thought experiments. I show that computer simulations can be reconstructed as arguments that fully capture the epistemic power of the simulations. Assuming the extended mind hypothesis, I furthermore argue that to run a computer simulation is to execute the reconstructing argument. I discuss some objections and reject the view that computer simulations produce knowledge because they are experiments. I conclude by comparing thought experiments and computer simulations, assuming that both are arguments.
Abstract:
Ultrasound (US) has become a useful tool in the detection of early disease, differential diagnosis, guidance of treatment decisions and treatment monitoring of rheumatoid arthritis (RA). In 2008, the Swiss Sonography in Arthritis and Rheumatism (SONAR) group was established to promote the use of US in inflammatory arthritis in clinical practice. A scoring system was developed and taught to a large number of Swiss rheumatologists who already contributed to the Swiss Clinical Quality Management (SCQM) database, a national patient register. This paper intends to give a Swiss consensus about best clinical practice recommendations for the use of US in RA on the basis of the current literature and experience with the Swiss SONAR score. A literature search was performed to collect data on current evidence. The results were discussed among specialists of the Swiss university centres and private practice, following a structured procedure. Musculoskeletal US was found to be very helpful in establishing the diagnosis and monitoring the evolution of RA, and to be a reliable tool when used by experienced examiners. It influences treatment decisions such as continuing, intensifying or stepping down therapy. The definitive modalities of integrating US into the diagnosis and monitoring of RA treatments will be defined within a few years. There are, however, strong arguments for using US findings in daily clinical care as of today. Some practical recommendations about the use of US in RA, focusing on the diagnosis and the use of the SONAR score, are proposed.
Abstract:
Correct estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constraining the firn depth evolution in Antarctica over the last deglaciation are presented: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in ice cores, assuming that δ15N is only affected by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict the TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the glacial δ15N levels measured at JRI and EDML – a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial firn model–δ15N data mismatch for this site. While we could not conduct an in-depth study of the influence of snow impurities on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of accumulation rate may have been underestimated in the current description of firnification models.
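The assumption that δ15N reflects only gravitational fractionation corresponds to the standard barometric relation for a static diffusive firn column. The following is a minimal sketch of that relation; the column depth and temperature used are illustrative assumptions, not values from the study.

```python
# Illustrative sketch (not from the paper): gravitational enrichment of d15N
# in a static firn column, using the standard barometric relation
#   d15N_grav [permil] = (exp(dm * g * z / (R * T)) - 1) * 1000
# where dm is the molar mass difference between 15N14N and 14N14N (kg/mol),
# z the diffusive column height (m), and T the mean firn temperature (K).
import math

def d15n_grav(z_m, temp_k, dm=1.0e-3, g=9.81, r_gas=8.314):
    """Gravitational d15N (permil) for a diffusive column of depth z_m."""
    return (math.exp(dm * g * z_m / (r_gas * temp_k)) - 1.0) * 1000.0

if __name__ == "__main__":
    # Assumed example values: a 70 m diffusive column at -30 degC yields
    # roughly 0.34 permil of gravitational enrichment.
    print(f"{d15n_grav(70.0, 243.15):.3f} permil")
```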
Abstract:
Understanding natural climate variability and its driving factors is crucial to assessing future climate change. Therefore, comparing proxy-based climate reconstructions with forcing factors as well as comparing these with paleoclimate model simulations is key to gaining insights into the relative roles of internal versus forced variability. A review of the state of modelling of the climate of the last millennium prior to the CMIP5–PMIP3 (Coupled Model Intercomparison Project Phase 5–Paleoclimate Modelling Intercomparison Project Phase 3) coordinated effort is presented and compared to the available temperature reconstructions. Simulations and reconstructions broadly agree on reproducing the major temperature changes and suggest an overall linear response to external forcing on multidecadal or longer timescales. Internal variability is found to have an important influence at hemispheric and global scales. The spatial distribution of simulated temperature changes during the transition from the Medieval Climate Anomaly to the Little Ice Age disagrees with that found in the reconstructions. Thus, either internal variability is a possible major player in shaping temperature changes through the millennium or the model simulations have problems realistically representing the response pattern to external forcing. A last millennium transient climate response (LMTCR) is defined to provide a quantitative framework for analysing the consistency between simulated and reconstructed climate. Beyond an overall agreement between simulated and reconstructed LMTCR ranges, this analysis is able to single out specific discrepancies between some reconstructions and the ensemble of simulations. The disagreement is found in the cases where the reconstructions show reduced covariability with external forcings or when they present high rates of temperature change.
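The abstract does not spell out how the LMTCR is computed; as a hedged illustration only, the sketch below assumes it can be approximated by the regression slope of smoothed hemispheric temperature on total external forcing, scaled to the forcing of a CO2 doubling (~3.7 W m−2). The data and all parameter values are synthetic assumptions.

```python
# Hedged sketch (an assumption, not the paper's exact definition): estimate a
# transient-response-like quantity as the regression slope of smoothed
# temperature anomalies on total external forcing, scaled to the radiative
# forcing of a CO2 doubling (~3.7 W m-2).
import numpy as np

def transient_response(temp_anom, forcing, f2x=3.7):
    """Slope of temperature vs. forcing (K per W m-2), scaled by f2x (K)."""
    slope, _intercept = np.polyfit(forcing, temp_anom, 1)
    return slope * f2x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    forcing = np.linspace(-0.5, 0.5, 100)                     # W m-2, synthetic
    temp = 0.5 * forcing + 0.05 * rng.standard_normal(100)    # K, synthetic
    print(f"{transient_response(temp, forcing):.2f} K per CO2 doubling")
```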
Abstract:
The development of northern high-latitude peatlands has played an important role in the carbon (C) balance of the land biosphere since the Last Glacial Maximum (LGM). At present, carbon storage in northern peatlands is substantial and estimated to be 500 ± 100 Pg C (1 Pg C = 10^15 g C). Here, we develop and apply a peatland module embedded in a dynamic global vegetation and land surface process model (LPX-Bern 1.0). The peatland module features a dynamic nitrogen cycle, a dynamic C transfer between the peatland acrotelm (upper oxic layer) and catotelm (deep anoxic layer), hydrology- and temperature-dependent respiration rates, and peatland-specific plant functional types. Nitrogen limitation down-regulates average modern net primary productivity over peatlands by about half. Decadal acrotelm-to-catotelm C fluxes vary between −20 and +50 g C m−2 yr−1 over the Holocene. Key model parameters are calibrated with reconstructed peat accumulation rates from peat-core data. The model reproduces the major features of the peat-core data and of the observation-based modern circumpolar soil carbon distribution. Results from a set of simulations for possible evolutions of northern peat development and areal extent show that soil C stocks in modern peatlands increased by 365–550 Pg C since the LGM, of which 175–272 Pg C accumulated between 11 and 5 kyr BP. Furthermore, our simulations suggest a persistent C sequestration rate of 35–50 Pg C per 1000 yr in present-day peatlands under current climate conditions, and that this C sink could either be sustained or turn into a source by 2100 AD, depending on the climate trajectories projected for the different representative greenhouse gas concentration pathways.
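As an illustration of the acrotelm–catotelm structure described above, here is a minimal two-pool toy model, not the LPX-Bern code: constant litter input, first-order decomposition in each layer, and a prescribed downward transfer flux. All rate constants are assumed values chosen only to show how such a scheme behaves.

```python
# Toy two-layer peat column (illustrative only): an upper oxic acrotelm and a
# deep anoxic catotelm, constant NPP input, first-order decomposition in each
# layer, and a fixed fraction of acrotelm carbon transferred downward per year.
def run_peat(years=10_000, npp=0.2, k_acro=0.05, k_cato=1e-4, transfer=0.01):
    """Integrate acrotelm/catotelm carbon pools (kg C m-2) with yearly steps."""
    acro, cato = 0.0, 0.0
    for _ in range(years):
        flux_down = transfer * acro                  # acrotelm -> catotelm
        acro += npp - k_acro * acro - flux_down      # fast oxic decomposition
        cato += flux_down - k_cato * cato            # slow anoxic decomposition
    return acro, cato

if __name__ == "__main__":
    acro, cato = run_peat()
    print(f"acrotelm: {acro:.1f} kg C m-2, catotelm: {cato:.1f} kg C m-2")
```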
Abstract:
Changes in Greenland accumulation and the stability of the relationship between accumulation variability and large-scale circulation are assessed by performing time-slice simulations for the present day, the preindustrial era, the early Holocene, and the Last Glacial Maximum (LGM) with a comprehensive climate model. The stability issue is an important prerequisite for reconstructions of Northern Hemisphere atmospheric circulation variability based on accumulation or precipitation proxy records from Greenland ice cores. The analysis reveals that the relationship between accumulation variability and large-scale circulation undergoes a significant seasonal cycle. As the contributions of the individual seasons to the annual signal change, annual mean accumulation variability is not necessarily related to the same atmospheric circulation patterns during the different climate states. Interestingly, within a season, local Greenland accumulation variability is indeed linked to a consistent circulation pattern, which is observed for all studied climate periods, even for the LGM. Hence, it would be possible to deduce a reliable reconstruction of seasonal atmospheric variability (e.g., for North Atlantic winters) if an accumulation or precipitation proxy were available that resolves single seasons. We further show that the simulated impacts of orbital forcing and of changes in ice sheet topography on Greenland accumulation exhibit strong spatial differences, emphasizing that accumulation records from different ice core sites cannot be expected to look alike, with respect to both interannual and long-term (centennial to millennial) variability, since they include a distinct local signature. The only uniform response to external forcing is the strong decrease in Greenland accumulation during glacial (LGM) conditions and the increase associated with the recent rise in greenhouse gas concentrations.
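A minimal sketch of the kind of season-by-season analysis described above, assuming the link is tested by correlating a seasonal Greenland accumulation series with a seasonal circulation index (an NAO-like winter index is used here as an assumption); the data are synthetic, not model output from the study.

```python
# Hedged sketch with synthetic data: correlate a winter-mean accumulation
# series with a winter circulation index to test the seasonal relationship.
import numpy as np

def seasonal_correlation(accum, index):
    """Pearson correlation between two seasonal-mean time series."""
    return np.corrcoef(accum, index)[0, 1]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    nao_djf = rng.standard_normal(200)                         # synthetic winter index
    accum_djf = 0.6 * nao_djf + 0.8 * rng.standard_normal(200) # synthetic accumulation
    print(f"winter r = {seasonal_correlation(accum_djf, nao_djf):.2f}")
```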
Abstract:
Monte Carlo simulations arrive at their results by introducing randomness, sometimes derived from a physical randomizing device. Nonetheless, we argue, they open no new epistemic channel beyond the one already employed by traditional simulations: the inference, by ordinary argumentation, of conclusions from assumptions built into the simulations. We show that Monte Carlo simulations cannot produce knowledge other than by inference, and that they resemble other computer simulations in the manner in which they derive their conclusions. Simple examples of Monte Carlo simulations are analysed to identify the underlying inferences.
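A minimal example of the sort of simple Monte Carlo simulation at issue: estimating π by uniform sampling. The conclusion follows by ordinary statistical inference (the law of large numbers) from the assumptions built into the simulation, not through any new epistemic channel.

```python
# Simple Monte Carlo example: estimate pi by sampling points uniformly in the
# unit square and counting how many fall inside the quarter circle.
import random

def estimate_pi(n_samples=1_000_000, seed=42):
    """4 * (fraction of points inside the quarter circle) converges to pi."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples

if __name__ == "__main__":
    print(f"pi is approximately {estimate_pi():.4f}")
```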
Abstract:
Monte Carlo simulation is a powerful method in many natural and social sciences. But what sort of method is it? And where does its power come from? Are Monte Carlo simulations experiments, theories or something else? The aim of this talk is to answer these questions and to explain the power of Monte Carlo simulations. I provide a classification of Monte Carlo techniques and defend the claim that Monte Carlo simulation is a sort of inference.
Abstract:
We present a conceptual prototype model of a focal plane array unit for the STEAMR instrument, highlighting the challenges presented by the instrument's required high relative beam proximity and focusing on how edge-diffraction effects contribute to the array's performance. The analysis was carried out as a comparative process using both PO & PTD and MoM techniques. We first highlight general differences between these computational techniques, with the discussion focusing on diffractive edge effects for near-field imaging reflectors with high truncation. We then present the results of in-depth modeling analyses of the STEAMR focal plane array, followed by near-field antenna measurements of a breadboard model of the array. The results of these near-field measurements agree well with both simulation techniques, although MoM shows slightly higher complex beam coupling to the measurements than PO & PTD.
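A hedged sketch of how complex beam coupling between a measured and a simulated beam can be evaluated, assuming it is computed as the normalised overlap integral of the two complex field maps; the Gaussian test fields and grid below are illustrative assumptions, not STEAMR data.

```python
# Hedged sketch (assumed fields, not the STEAMR analysis code): complex beam
# coupling as the normalized overlap integral of two complex near-field maps.
import numpy as np

def beam_coupling(e1, e2, dx, dy):
    """Complex coupling coefficient; |coupling|**2 is the power coupling."""
    overlap = np.sum(e1 * np.conj(e2)) * dx * dy
    norm = np.sqrt(np.sum(np.abs(e1) ** 2) * np.sum(np.abs(e2) ** 2)) * dx * dy
    return overlap / norm

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 201)
    xx, yy = np.meshgrid(x, x)
    dx = dy = x[1] - x[0]
    reference = np.exp(-(xx ** 2 + yy ** 2) / 2.0)             # reference beam
    measured = np.exp(-((xx - 0.5) ** 2 + yy ** 2) / 2.0)      # slightly offset beam
    c = beam_coupling(reference, measured, dx, dy)
    print(f"power coupling = {abs(c) ** 2:.3f}")
```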
Abstract:
Peptide dendrimers are synthetic tree-like molecules composed of amino acids. These molecules exhibit at least two kinds of preferential structural behavior, acquiring either compact or noncompact shapes. However, the key structural determinants of such behavior have, until now, remained unstudied. Herein, we conduct a comprehensive investigation of the structural determinants of peptide dendrimers by employing long molecular dynamics simulations to characterize an extended set of third-generation dendrimers. Our results clearly show that a trade-off between electrostatic effects and hydrogen bond formation controls structure acquisition in these systems. Moreover, by selectively changing the dendrimers' charge we are able to manipulate the exhibited compactness. In contrast, the length of the branching residues does not seem to be a major structural determinant. Our results are in accordance with the most recent experimental evidence and shed some light on the key molecular-level interactions controlling structure acquisition in these systems. Thus, the results presented constitute valuable insights that can contribute to the development of truly tailor-made dendritic systems.
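A minimal sketch of one common compactness measure used in such MD analyses, the radius of gyration, computed here with unit masses on synthetic coordinates; this is an illustrative assumption, not the study's analysis pipeline.

```python
# Hedged sketch with synthetic coordinates: the radius of gyration as a simple
# measure of how compact a conformation is (equal masses assumed for brevity).
import numpy as np

def radius_of_gyration(coords):
    """Rg = sqrt(mean squared distance of atoms from their centroid)."""
    centroid = coords.mean(axis=0)
    return np.sqrt(((coords - centroid) ** 2).sum(axis=1).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    snapshot = rng.normal(scale=8.0, size=(500, 3))   # synthetic coordinates (Angstrom)
    print(f"Rg = {radius_of_gyration(snapshot):.1f} Angstrom")
```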
Abstract:
This study aims to evaluate the direct effects of anthropogenic deforestation on simulated climate at two contrasting periods in the Holocene, ~6 and ~0.2 k BP, in Europe. We apply the Rossby Centre regional climate model RCA3, with 50 km spatial resolution, for both time periods, considering three alternative descriptions of the past vegetation: (i) potential natural vegetation (V) simulated by the dynamic vegetation model LPJ-GUESS, (ii) potential vegetation with anthropogenic land use (deforestation) from the HYDE3.1 (History Database of the Global Environment) scenario (V + H3.1), and (iii) potential vegetation with anthropogenic land use from the KK10 scenario (V + KK10). The climate model results show that the simulated effects of deforestation depend on both the local/regional climate and the vegetation characteristics. At ~6 k BP the extent of simulated deforestation in Europe is generally small, but there are areas where deforestation is large enough to produce significant differences in summer temperatures of 0.5–1 °C. At ~0.2 k BP, extensive deforestation, particularly according to the KK10 model, leads to significant temperature differences in large parts of Europe in both winter and summer. In winter, deforestation leads to lower temperatures because of the albedo difference between forested and unforested areas, particularly in snow-covered regions. In summer, deforestation leads to higher temperatures in central and eastern Europe because evapotranspiration from unforested areas is lower than from forests. Summer evaporation is already limited in the southernmost parts of Europe under potential vegetation conditions and therefore cannot become much lower. Accordingly, the albedo effect dominates in southern Europe also in summer, which implies that deforestation causes a decrease in temperatures there. Differences in summer temperature due to deforestation range from −1 °C in south-western Europe to +1 °C in eastern Europe. The choice of anthropogenic land-cover scenario has a significant influence on the simulated climate, but uncertainties in the palaeoclimate proxy data for the two time periods do not allow a definitive discrimination among the climate model results.
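To make the seasonal sign reversal concrete, the toy calculation below compares the change in available surface energy after deforestation under assumed winter (snow-covered, large albedo contrast) and summer (large evapotranspiration drop) conditions; all numbers are illustrative assumptions, not RCA3 output.

```python
# Toy surface energy budget (assumed values, not RCA3 output): deforestation
# changes absorbed shortwave (via albedo) and evaporative cooling (via reduced
# evapotranspiration). Positive values mean more energy available for warming.
def deforestation_energy_change(sw_down, albedo_forest, albedo_open, d_latent):
    """Change in available surface energy (W m-2) after deforestation."""
    d_absorbed = sw_down * (albedo_forest - albedo_open)  # less absorbed if open land is brighter
    return d_absorbed + d_latent                          # d_latent > 0: less evaporative cooling

if __name__ == "__main__":
    # Winter over snow: large albedo contrast, little evapotranspiration change -> cooling.
    print("winter:", deforestation_energy_change(100, 0.35, 0.70, 2), "W m-2")
    # Summer: small albedo contrast, large drop in evapotranspiration -> warming.
    print("summer:", deforestation_energy_change(250, 0.15, 0.20, 30), "W m-2")
```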