869 results for artifacts
Abstract:
The application of forecast ensembles to probabilistic weather prediction has spurred considerable interest in their evaluation. Such ensembles are commonly interpreted as Monte Carlo ensembles meaning that the ensemble members are perceived as random draws from a distribution. Under this interpretation, a reasonable property to ask for is statistical consistency, which demands that the ensemble members and the verification behave like draws from the same distribution. A widely used technique to assess statistical consistency of a historical dataset is the rank histogram, which uses as a criterion the number of times that the verification falls between pairs of members of the ordered ensemble. Ensemble evaluation is rendered more specific by stratification, which means that ensembles that satisfy a certain condition (e.g., a certain meteorological regime) are evaluated separately. Fundamental relationships between Monte Carlo ensembles, their rank histograms, and random sampling from the probability simplex according to the Dirichlet distribution are pointed out. Furthermore, the possible benefits and complications of ensemble stratification are discussed. The main conclusion is that a stratified Monte Carlo ensemble might appear inconsistent with the verification even though the original (unstratified) ensemble is consistent. The apparent inconsistency is merely a result of stratification. Stratified rank histograms are thus not necessarily flat. This result is demonstrated by perfect ensemble simulations and supplemented by mathematical arguments. Possible methods to avoid or remove artifacts that stratification induces in the rank histogram are suggested.
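The rank-histogram criterion described above can be sketched in a few lines: for each forecast case, count how many ordered ensemble members lie below the verification, then histogram those ranks over a historical dataset; for a consistent ensemble each rank is equally likely, so the histogram is approximately flat. A minimal illustration in pure Python (the Gaussian toy ensemble and all names are ours, not the paper's):

```python
import random
from collections import Counter

def verification_rank(ensemble, obs):
    """Rank of the verification among the ordered ensemble members.

    Rank r means the verification fell between the r-th and (r+1)-th
    ordered member (0 = below all members, m = above all members).
    Ties are broken at random so tied draws remain exchangeable.
    """
    below = sum(1 for x in ensemble if x < obs)
    ties = sum(1 for x in ensemble if x == obs)
    return below + random.randint(0, ties)

def rank_histogram(cases):
    """cases: iterable of (ensemble_members, verification) pairs."""
    return Counter(verification_rank(ens, obs) for ens, obs in cases)

# A consistent ("perfect") ensemble: members and verification are i.i.d.
random.seed(1)
m = 4
cases = []
for _ in range(20000):
    draws = [random.gauss(0.0, 1.0) for _ in range(m + 1)]
    cases.append((draws[:m], draws[m]))

hist = rank_histogram(cases)
# Each of the m+1 = 5 ranks should occur roughly 4000 times.
```

Stratifying such cases by a condition that depends on the ensemble itself would change the rank distribution within each stratum, which is exactly the artifact the abstract warns about.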
Abstract:
Terahertz pulse imaging (TPI) is a novel noncontact, nondestructive technique for the examination of cultural heritage artifacts. It has the advantages of broadband spectral range, time-of-flight depth resolution, and penetration through optically opaque materials. Fiber-coupled, portable, time-domain terahertz systems have enabled this technique to move out of the laboratory and into the field. Much like the rings of a tree, stratified architectural materials give the chronology of their environmental and aesthetic history. This work concentrates on laboratory models of stratified mosaics and fresco paintings, specimens extracted from a neolithic excavation site in Çatalhöyük, Turkey, and specimens measured at the medieval Église de Saint-Jean-Baptiste in Vif, France. Preparatory spectroscopic studies of various composite materials, including lime, gypsum, and clay plasters, are presented to enhance the interpretation of results and with the intent to aid future computer simulations of the TPI of stratified architectural material. The breadth of the sample range is a demonstration of the cultural demand and public interest in the life history of buildings. The results are an illustration of the potential role of TPI in providing both a chronological history of buildings and a visualization of obscured wall paintings and mosaics.
Abstract:
Climate-model simulations of the large-scale temperature responses to increased radiative forcing include enhanced land-sea contrast, stronger response at higher latitudes than in the tropics, and differential responses in warm and cool season climates to uniform forcing. Here we show that these patterns are also characteristic of model simulations of past climates. The differences in the responses over land as opposed to over the ocean, between high and low latitudes, and between summer and winter are remarkably consistent (proportional and nearly linear) across simulations of both cold and warm climates. Similar patterns also appear in historical observations and paleoclimatic reconstructions, implying that such responses are characteristic features of the climate system, and not simple model artifacts, thereby increasing our confidence in the ability of climate models to correctly simulate different climatic states.
Abstract:
This paper considers ways that experimental design can affect judgments about informally presented context shifting experiments. Reasons are given to think that judgments about informal context shifting experiments are affected by an exclusive reliance on binary truth value judgments and by experimenter bias. Exclusive reliance on binary truth value judgments may produce experimental artifacts by obscuring important differences of degree between the phenomena being investigated. Experimenter bias is an effect generated when, for example, experimenters disclose (even unconsciously) their own beliefs about the outcome of an experiment. Eliminating experimenter bias from context shifting experiments makes it far less obvious what the “intuitive” responses to those experiments are. After it is shown how those different kinds of bias can affect judgments about informal context shifting experiments, those experiments are revised to control for those forms of bias. The upshot of these investigations is that participants in the contextualist debate who employ informal experiments should pay just as much attention to the design of their experiments as those who employ more formal experimental techniques, if they want to avoid obscuring the phenomena they aim to uncover.
Abstract:
The work presented in this article was performed at the Oriental Institute at the University of Chicago, on objects from their permanent collection: an ancient Egyptian bird mummy and three ancient Sumerian corroded copper-alloy objects. We used a portable, fiber-coupled terahertz time-domain spectroscopic imaging system, which allowed us to measure specimens in both transmission and reflection geometry, and present time- and frequency-based image modes. The results confirm earlier evidence that terahertz imaging can provide complementary information to that obtainable from x-ray CT scans of mummies, giving better visualisation of low density regions. In addition, we demonstrate that terahertz imaging can distinguish mineralized layers in metal artifacts.
Implication of methodological uncertainties for mid-Holocene sea surface temperature reconstructions
Abstract:
We present and examine a multi-sensor global compilation of mid-Holocene (MH) sea surface temperatures (SST), based on Mg/Ca and alkenone palaeothermometry and reconstructions obtained using planktonic foraminifera and organic-walled dinoflagellate cyst census counts. We assess the uncertainties originating from using different methodologies and evaluate the potential of MH SST reconstructions as a benchmark for climate-model simulations. The comparison between different analytical approaches (time frame, baseline climate) shows that the choice of time window for the MH has a negligible effect on the reconstructed SST pattern, but the choice of baseline climate affects both the magnitude and spatial pattern of the reconstructed SSTs. Comparison of the SST reconstructions made using different sensors shows significant discrepancies at a regional scale, with uncertainties often exceeding the reconstructed SST anomaly. Apparent patterns in SST may largely be a reflection of the use of different sensors in different regions. Overall, the uncertainties associated with the SST reconstructions are generally larger than the MH anomalies. Thus, the SST data currently available cannot serve as a target for benchmarking model simulations. Further evaluation of potential subsurface and/or seasonal artifacts that may obscure the MH SST reconstructions is urgently needed to provide reliable benchmarks for model evaluations.
Abstract:
Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals. There is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false positive corrections.
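The study's own metrics are not reproduced here, but the kind of quantitative scoring it argues for can be illustrated with two standard quantities: the SNR of an artifact-contaminated trace and the RMSE of a cleaned trace against a known reference. A minimal sketch on synthetic data (all names and the blink-shaped artifact are illustrative assumptions, not from the paper):

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio in dB, from mean sample power."""
    p_sig = sum(x * x for x in signal) / len(signal)
    p_noise = sum(x * x for x in noise) / len(noise)
    return 10.0 * math.log10(p_sig / p_noise)

def rmse(a, b):
    """Root-mean-square error between a cleaned trace and its reference."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Synthetic example: a 10-cycle sinusoidal "EEG" trace plus a large,
# slow, blink-like Gaussian bump centered mid-record.
n = 512
clean = [math.sin(2 * math.pi * 10 * t / n) for t in range(n)]
artifact = [3.0 * math.exp(-((t - 256) / 20.0) ** 2) for t in range(n)]
contaminated = [c + a for c, a in zip(clean, artifact)]

# Any removal method can now be scored by rmse(method_output, clean);
# a perfect method would drive the RMSE to zero.
input_snr = snr_db(clean, artifact)
input_error = rmse(contaminated, clean)
```

Scoring several methods on the same contaminated trace with a shared metric is what allows the kind of ranking (BSS at high SNR, MSSA at low SNR) the abstract reports.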
Abstract:
A fully automated and online artifact removal method for the electroencephalogram (EEG) is developed for use in brain-computer interfacing. The method (FORCe) is based upon a novel combination of wavelet decomposition, independent component analysis, and thresholding. FORCe is able to operate on a small channel set during online EEG acquisition and does not require additional signals (e.g. electrooculogram signals). Evaluation of FORCe is performed offline on EEG recorded from 13 BCI participants with cerebral palsy (CP) and online with three healthy participants. The method outperforms the state-of-the-art automated artifact removal methods lagged auto-mutual information clustering (LAMIC) and fully automated statistical thresholding (FASTER), and is able to remove a wide range of artifact types, including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts.
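FORCe itself combines wavelet decomposition, ICA, and thresholding; as a much-reduced illustration of just the wavelet/thresholding ingredient, the sketch below runs a one-level Haar transform and soft-thresholds the detail coefficients so that a sharp transient (a crude stand-in for a blink) is attenuated. This is our own toy, not the published method:

```python
def haar_step(x):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    s = 2 ** 0.5
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert haar_step exactly."""
    s = 2 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

def soft_threshold(coeffs, thr):
    """Shrink each coefficient toward zero by thr (soft thresholding)."""
    return [max(abs(c) - thr, 0.0) * (1.0 if c > 0 else -1.0) for c in coeffs]

# A flat trace with one sharp spike standing in for an artifact.
x = [1.0, 1.0, 1.0, 9.0, 1.0, 1.0, 1.0, 1.0]
a, d = haar_step(x)
cleaned = haar_inverse(a, soft_threshold(d, 2.0))
# The spike at x[3] is reduced; samples outside the spike's sample pair
# are untouched, because their detail coefficients are already zero.
```

Real methods such as FORCe apply data-driven thresholds to components from a multi-level decomposition and ICA rather than a fixed threshold to one Haar level, but the shrink-and-reconstruct pattern is the same.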
Abstract:
Fluvial redeposition of stone artifacts is a major complicating factor in the interpretation of Lower Palaeolithic open-air archaeological sites. However, the microscopic examination of lithic surfaces may provide valuable background information on the transport history of artifacts, particularly in low-energy settings. Replica flint artifacts were therefore abraded in an annular flume and examined with a scanning electron microscope. Results showed that abrasion time, sediment size, and artifact transport mode were very sensitive predictors of microscopic surface abrasion, ridge width, and edge damage (p < 0.001). These results suggest that patterns of micro-abrasion of stone artifacts may enhance understanding of archaeological assemblage formation in fluvial contexts.
Abstract:
This special issue is focused on the assessment of algorithms for the observation of Earth’s climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth’s climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
Abstract:
This paper explores the role of digital media and creativity in the processes of learning that occur in groups of urban skateboarders. In particular, it examines how the production and consumption of amateur videos contribute to both skaters’ mastery of the techniques of the sport and their integration into the culture of the sport. The data come from an ethnographic study of skateboarders in Hong Kong, which included in-depth interviews, participant observation and the collection of texts and artifacts like magazines, blog entries and amateur skating videos. Skateboarders use video in a number of ways that significantly impact their learning and integration into their communities. They use it to analyze tricks and techniques, to document the stages of their learning and socialization into the group, to set community standards, to build a sense of belonging with their ‘crews’ and to imagine ‘idealized futures’ for themselves and their communities. Understanding the value and function of such ‘semiotic mediation’ in learning and socialization into sport cultures, I suggest, can contribute to helping physical educators design tasks that integrate training in physical skills with opportunities for students to make meaning around their experiences of sport and physical education.
Abstract:
Visual representations of isosurfaces are ubiquitous in the scientific and engineering literature. In this paper, we present techniques to assess the behavior of isosurface extraction codes. Where applicable, these techniques allow us to distinguish whether anomalies in isosurface features can be attributed to the underlying physical process or to artifacts from the extraction process. Such scientific scrutiny is at the heart of verifiable visualization - subjecting visualization algorithms to the same verification process that is used in other components of the scientific pipeline. More concretely, we derive formulas for the expected order of accuracy (or convergence rate) of several isosurface features, and compare them to experimentally observed results in the selected codes. This technique is practical: in two cases, it exposed actual problems in implementations. We provide the reader with the range of responses they can expect to encounter with isosurface techniques, both under "normal operating conditions" and also under adverse conditions. Armed with this information - the results of the verification process - practitioners can judiciously select the isosurface extraction technique appropriate for their problem of interest, and have confidence in its behavior.
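The observed order of accuracy mentioned above is obtained by comparing errors at two mesh resolutions: if the error of a feature behaves like C·h^p, then two (h, e) pairs determine p, which can be checked against the derived expected order. A minimal sketch (the error values are invented for illustration, not taken from the paper's experiments):

```python
import math

def observed_order(h1, e1, h2, e2):
    """Observed order of accuracy from errors at two mesh spacings.

    Assuming e(h) ~ C * h**p, taking the ratio of the two error
    estimates eliminates C and gives p = log(e1/e2) / log(h1/h2).
    """
    return math.log(e1 / e2) / math.log(h1 / h2)

# Synthetic check: an error that truly behaves like O(h^2).
# Halving h from 0.1 to 0.05 quarters the error, so p should be 2.
p = observed_order(0.1, 4e-4, 0.05, 1e-4)
```

In a verification setting, a mismatch between this observed p and the expected order derived analytically is exactly the kind of signal that exposed the implementation problems the abstract reports.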
Abstract:
Mutation testing has been used to assess the quality of test case suites by analyzing their ability to distinguish the artifact under testing from a set of alternative artifacts, the so-called mutants. The mutants are generated from the artifact under testing by applying a set of mutant operators, which produce artifacts with simple syntactical differences. The mutant operators are usually based on typical errors that occur during software development and can be related to a fault model. In this paper, we propose a language, named MuDeL (MUtant DEfinition Language), for the definition of mutant operators, aiming not only at automating mutant generation, but also at providing precision and formality to the operator definition. The proposed language is based on concepts from the transformational and logical programming paradigms, as well as from context-free grammar theory. The denotational semantics formal framework is employed to define the semantics of the MuDeL language. We also describe a system, named mudelgen, developed to support the use of this language. An executable representation of the denotational semantics of the language is used to check the correctness of the implementation of mudelgen. At the very end, a mutant generator module is produced, which can be incorporated into a specific mutant tool/environment. (C) 2008 Elsevier Ltd. All rights reserved.
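MuDeL itself defines operators declaratively, which is beyond this sketch, but the core idea of a mutant operator as a small syntactic transformation can be illustrated with Python's ast module. The toy operator below (ours, not one of the paper's) replaces every `+` with `-`; a test suite that never notices the change fails to "kill" the mutant:

```python
import ast

class AddToSub(ast.NodeTransformer):
    """A toy mutant operator: replace every binary `+` with `-`."""

    def visit_BinOp(self, node):
        self.generic_visit(node)  # mutate nested expressions first
        if isinstance(node.op, ast.Add):
            node.op = ast.Sub()
        return node

def mutate(source):
    """Apply the operator to a source string; return mutant source."""
    tree = ast.parse(source)
    tree = AddToSub().visit(tree)
    ast.fix_missing_locations(tree)
    return ast.unparse(tree)  # requires Python 3.9+

original = "def total(a, b):\n    return a + b\n"
mutant = mutate(original)
# mutant now computes a - b; any test asserting total(2, 2) == 4 kills it.
```

A mutant-generation system like mudelgen plays the role of `mutate` here, but driven by formally defined operators rather than a hard-coded transformer.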
Abstract:
The concentrations of the water-soluble inorganic aerosol species ammonium (NH4+), nitrate (NO3-), chloride (Cl-), and sulfate (SO42-) were measured from September to November 2002 at a pasture site in the Amazon Basin (Rondônia, Brazil) (LBA-SMOCC). Measurements were conducted using a semi-continuous technique (Wet-Annular Denuder/Steam-Jet Aerosol Collector: WAD/SJAC) and three integrating filter-based methods, namely (1) a denuder-filter pack (DFP: Teflon and impregnated Whatman filters), (2) a stacked-filter unit (SFU: polycarbonate filters), and (3) a high-volume dichotomous sampler (HiVol: quartz fiber filters). Measurements covered the late dry season (biomass burning), a transition period, and the onset of the wet season (clean conditions). Analyses of the particles collected on filters were performed using ion chromatography (IC) and particle-induced X-ray emission spectrometry (PIXE). Season-dependent discrepancies were observed between the WAD/SJAC system and the filter-based samplers. During the dry season, when PM2.5 (Dp ≤ 2.5 µm) concentrations were ~100 µg m⁻³, aerosol NH4+ and SO42- measured by the filter-based samplers were on average two times higher than those determined by the WAD/SJAC. Concentrations of aerosol NO3- and Cl- measured with the HiVol during daytime, and with the DFP during day- and nighttime, also exceeded those of the WAD/SJAC by a factor of two. In contrast, aerosol NO3- and Cl- measured with the SFU during the dry season were nearly two times lower than those measured by the WAD/SJAC. These differences declined markedly during the transition period and towards the cleaner conditions at the onset of the wet season (PM2.5 ~5 µg m⁻³), when filter-based samplers measured on average 40-90% less than the WAD/SJAC.
The differences were not due to consistent systematic biases of the analytical techniques, but were apparently a result of prevailing environmental conditions and different sampling procedures. For the transition period and wet season, the significance of our results is reduced by a low number of data points. We argue that the observed differences are mainly attributable to (a) positive and negative filter sampling artifacts, (b) presence of organic compounds and organosulfates on filter substrates, and (c) a SJAC sampling efficiency of less than 100%.
Abstract:
A funerary gold mask from the Museum of Sicán, Ferreñafe, Peru, was analyzed in 30 different areas using portable energy-dispersive X-ray fluorescence equipment. It was deduced from the measurements that the main sheet of the mask and the majority of the pendants have a similar composition and are made of tumbaga, i.e. a poor gold alloy enriched at the surface by depletion gilding, and have a similar 'equivalent' gilding thickness of about 5 µm. The nose, also of tumbaga, has a different composition and a thickness of about 8 µm. The clamps are of gilded or silvered copper. The red pigment dispersed on the surface of the mask is cinnabar. Copyright (C) 2009 John Wiley & Sons, Ltd.