21 results for artifacts
Abstract:
Karaoke singing is a popular form of entertainment in many parts of the world. Since this genre of performance attracts amateurs, the singing often contains artifacts related to scale, tempo, and synchrony. We have developed an approach to correcting these artifacts using cross-modal multimedia stream information. We first perform adaptive sampling on the user's rendition and then use the original singer's rendition, as well as the video caption highlighting information, to correct the pitch, tempo, and loudness. A method of analogies is employed to perform this correction. The basic idea is to manipulate the user's rendition so as to make it as similar as possible to the original singing. A pre-processing step that removes noise due to feedback and huffing also helps improve the quality of the user's audio. The results described in the paper show the effectiveness of this multimedia approach.
Abstract:
Adaptability to changing circumstances is a key feature of living creatures. Understanding such adaptive processes is central to developing successful autonomous artifacts. In this paper, two perspectives are brought to bear on the issue of adaptability. The first is a short-term perspective, which looks at adaptability in terms of the interactions between the agent and the environment. The second involves a hierarchical evolutionary model, which seeks to identify higher-order forms of adaptability based on the concept of adaptive meta-constructs. Task-oriented and agent-centered models of adaptive processes in artifacts are considered from these two perspectives. The former is represented by the fitness-function approach found in evolutionary learning, and the latter by the concepts of empowerment and homeokinesis found in models derived from the self-organizing systems approach. A meta-construct approach to adaptability based on the identification of higher-level meta-metrics is also outlined.
Abstract:
Radiocarbon dating has rarely been used for chronological problems relating to the Anglo-Saxon period. The "flatness" of the calibration curve and the resultant wide range of calendrical dates provide little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology have, however, created the possibility of refining and checking the established chronologies, based on the typology of artifacts, against 14C dates. The calibration process, within such a confined age range, relies heavily on the structural accuracy of the calibration curve. We have therefore re-measured, at decadal intervals, a section of the Irish oak chronology for the period AD 495–725. These measurements have been included in IntCal04.
Abstract:
KLL dielectronic recombination resonances, in which a free electron is captured into the L shell while a K-shell electron is simultaneously excited into the L shell, have been measured for open-shell iodine ions by recording the yield of escaping ions of various charge states and modeling the charge balance in an electron beam ion trap. The modeling accounted for escape from the trap and multiple charge exchange. Extracted ions were used to measure the charge balance in the trap. The different charge states were clearly separated, which, along with the correction for artifacts connected with ion escape and multiple charge exchange, made open-shell highly charged ion measurements of this type possible for the first time. Resonant strengths were obtained from the measured spectra. The results were 4.27(39)×10⁻¹⁹ cm² eV, 2.91(26)×10⁻¹⁹ cm² eV, 2.39(22)×10⁻¹⁹ cm² eV, 1.49(14)×10⁻¹⁹ cm² eV, and 7.64(76)×10⁻²⁰ cm² eV for the iodine ions from He-like to C-like, respectively.
Abstract:
We present the first marine reservoir age and ΔR determination for the island of St. Helena, using marine mollusk radiocarbon dates obtained from an historical context of known age. This represents the first marine reservoir age and ΔR determination in the southern Atlantic Ocean within thousands of kilometers of the island. The depletion of C-14 in the shells indicates a rather larger reservoir age for that portion of the surface Atlantic than models indicate. The implication is that upwelling old water along the Namibian coast is transported for a considerable distance, although transport is likely to be variable on a decadal timescale. An artilleryman's button, together with other artifacts found in a midden, demonstrates the association of the mollusk shells with a narrow historic period, AD 1815–1835.
Abstract:
Radiocarbon dating has been used infrequently as a chronological tool in Anglo-Saxon archaeology, primarily because the uncertainty of calibrated dates provides little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology, in conjunction with high-precision 14C dating, have, however, created the possibility of both testing and refining the established Anglo-Saxon chronologies based on the typology of artifacts. The calibration process within such a confined age range relies heavily on the structural accuracy of the calibration curve. We have previously reported decadal measurements on a section of the Irish oak chronology for the period AD 495–725 (McCormac et al. 2004). In this paper, we present decadal measurements for the periods AD 395–485 and AD 735–805, which extend the original calibration set.
Abstract:
For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the specification but also in a series of modifications of all existing artifacts produced during development. It is therefore necessary to provide effective and flexible management of requirements changes. In this paper, we present an approach to managing requirements changes based on Booth's negotiation-style framework for belief revision. Informally, we consider the current requirements specification a belief set about the system-to-be. A requirements change request is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including the setting in which the change request is fully accepted, the setting in which the current requirements specification is fully preserved, and that in which the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
Abstract:
Studies concerning the physiological significance of Ca2+ sparks often depend on the detection and measurement of large populations of events in noisy microscopy images. Automated detection methods have been developed to quickly and objectively distinguish potential sparks from noise artifacts. However, previously described algorithms are not suited to the reliable detection of sparks in images where the local baseline fluorescence and noise properties can vary significantly, and risk introducing additional bias when applied to such data sets. Here, we describe a new, conceptually straightforward approach to spark detection in linescans that addresses this issue by combining variance stabilization with local baseline subtraction. We also show that in addition to greatly increasing the range of images in which sparks can be automatically detected, the use of a more accurate noise model enables our algorithm to achieve similar detection sensitivities with fewer false positives than previous approaches when applied both to synthetic and experimental data sets. We propose, therefore, that it might be a useful tool for improving the reliability and objectivity of spark analysis in general, and describe how it might be further optimized for specific applications.
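The abstract names its two key ingredients (variance stabilization and local baseline subtraction) without spelling out the details. A minimal sketch of that combination follows; the Anscombe transform, the running-median window, and the 4-sigma threshold are illustrative assumptions of ours, not the authors' published algorithm or parameters:

```python
import numpy as np

def detect_sparks(trace, win=101, k=4.0):
    # Variance stabilization: the Anscombe transform, a common choice when
    # noise is Poisson-dominated (photon shot noise); the transformed noise
    # then has approximately unit variance regardless of the local baseline.
    y = 2.0 * np.sqrt(np.maximum(trace, 0.0) + 3.0 / 8.0)
    # Local baseline: a running median, robust to the sparks themselves.
    half = win // 2
    pad = np.pad(y, half, mode="edge")
    base = np.array([np.median(pad[i:i + win]) for i in range(y.size)])
    resid = y - base
    # Robust noise scale from the median absolute deviation.
    sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))
    return np.flatnonzero(resid > k * sigma)

# Synthetic linescan: drifting baseline, Poisson noise, one spark at t = 1200.
rng = np.random.default_rng(0)
t = np.arange(2000)
baseline = 100 + 20 * np.sin(2 * np.pi * t / 2000)
spark = 80 * np.exp(-0.5 * ((t - 1200) / 3.0) ** 2)
trace = rng.poisson(baseline + spark).astype(float)
hits = detect_sparks(trace)
```

Because the baseline is estimated locally and the variance is stabilized first, the same fixed threshold works across regions with different brightness and noise, which is the property the abstract emphasizes.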
Abstract:
We consider the stimulated Raman transition between two long-lived states via multiple intermediate states, such as between hyperfine ground states in the alkali-metal atoms. We present a concise treatment of the general, multilevel, off-resonant case, and we show how the lightshift emerges naturally in this approach. We illustrate our results by application to alkali-metal atoms and we make specific reference to cesium. We comment on some artifacts, due solely to the geometrical overlap of states, which are relevant to existing experiments.
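For reference, the effective two-photon coupling obtained by adiabatically eliminating the intermediate states can be written in a standard form; the notation and sign conventions below are ours and not necessarily those of the paper:

```latex
% Effective Raman Rabi frequency via intermediate states j,
% and the differential light shift of the two long-lived states.
\Omega_{\mathrm{eff}} = \sum_{j} \frac{\Omega_{1j}\,\Omega_{2j}^{*}}{2\Delta_{j}},
\qquad
\delta_{\mathrm{LS}} = \sum_{j} \frac{|\Omega_{1j}|^{2} - |\Omega_{2j}|^{2}}{4\Delta_{j}}
```

Here \(\Omega_{kj}\) is the single-photon Rabi frequency coupling long-lived state \(k\) to intermediate state \(j\), and \(\Delta_{j}\) is the (large) detuning from that state. Written this way, the light shift arises from the same sum over intermediate states that produces \(\Omega_{\mathrm{eff}}\), which is the sense in which it emerges naturally in a multilevel treatment.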
Abstract:
Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology moves from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. At present, determining the relative efficacy and performance of the many available artifact removal techniques on real-world data can be problematic, owing to incomplete information about the uncorrupted desired signal. Most techniques are evaluated using simulated data, so the quality of the conclusions is contingent on the fidelity of the model used. Consequently, the biomedical signal processing community devotes considerable attention to generating and validating appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models that capture suitable approximations to the signal dynamics or underlying physiology and therefore introduce some uncertainty into subsequent predictions of algorithm performance. This paper describes a more empirical approach to modeling the desired signal, demonstrated for functional brain monitoring tasks, which allows the procurement of a ground truth signal highly correlated with the true desired signal that has been contaminated with artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform.
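The point of having a ground truth is that artifact removal can then be scored quantitatively. A minimal sketch of such scoring is below; the signals, the 0.2 attenuation factor standing in for an algorithm's output, and the two metrics (SNR gain and correlation) are our illustrative assumptions, not the paper's test platform:

```python
import numpy as np

def removal_score(ground_truth, corrupted, cleaned):
    # Score an artifact-removal result against a known ground truth:
    # SNR gain in dB, and correlation of the cleaned signal with the truth.
    def snr_db(ref, est):
        noise = est - ref
        return 10 * np.log10(np.sum(ref ** 2) / np.sum(noise ** 2))
    gain = snr_db(ground_truth, cleaned) - snr_db(ground_truth, corrupted)
    corr = np.corrcoef(ground_truth, cleaned)[0, 1]
    return gain, corr

# Hypothetical example: a slow signal, a motion-artifact burst, and a
# crude "cleaned" version in which the burst is attenuated 5x.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 1000)
truth = np.sin(2 * np.pi * 0.3 * t)
artifact = np.zeros_like(t)
artifact[400:450] = 5 * rng.standard_normal(50)
corrupted = truth + artifact
cleaned = truth + 0.2 * artifact  # stand-in for a removal algorithm's output

gain, corr = removal_score(truth, corrupted, cleaned)
```

Attenuating the residual artifact by a factor of 5 corresponds to a 20·log10(5) ≈ 14 dB SNR gain, which the function reports without needing any model of the physiology, only the ground truth itself.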
Abstract:
Particle-in-cell (PIC) simulations of relativistic shocks are in principle capable of predicting the spectra of photons that are radiated incoherently by the accelerated particles. The most direct method evaluates the spectrum using the fields given by the Liénard-Wiechert potentials. However, for relativistic particles this procedure is computationally expensive. Here we present an alternative method that uses the concept of the photon formation length. The algorithm is suitable for evaluating spectra either from particles moving in a specific realization of a turbulent electromagnetic field or from trajectories given as a finite, discrete time series by a PIC simulation. The main advantage of the method is that it identifies the intrinsic spectral features and filters out those that are artifacts of the limited time resolution and finite duration of the input trajectories.
Abstract:
Biosignal measurement and processing is increasingly being deployed in ambulatory situations, particularly in connected health applications. Such an environment dramatically increases the likelihood of artifacts, which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, a considerable range of methods currently exists to suppress or, in some cases, remove the distorting effect of such artifacts. Far fewer techniques are available when only a single-channel measurement exists, yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique, known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal. The CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques using both electroencephalography and functional near-infrared spectroscopy data, and is shown to produce significantly improved results.
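The decompose-then-separate pipeline described above can be sketched compactly. A full EEMD implementation is involved, so the sketch below substitutes a simple moving-average multiscale split purely to make the example self-contained; the CCA step (against a one-sample-delayed copy, ordering sources by canonical correlation and dropping the least autocorrelated one) follows the standard BSS-CCA recipe, not necessarily the authors' exact implementation:

```python
import numpy as np

def multiscale_decompose(x, widths=(5, 25)):
    # Stand-in for EEMD, used here only so the sketch is self-contained:
    # split x into fast-to-slow components with moving averages. The
    # components sum back to x exactly, as EEMD's modes also do.
    comps, rest = [], x.astype(float).copy()
    for w in widths:
        smooth = np.convolve(rest, np.ones(w) / w, mode="same")
        comps.append(rest - smooth)
        rest = smooth
    comps.append(rest)
    return np.vstack(comps)               # shape (n_components, n_samples)

def cca_vs_delay(X):
    # CCA between X and its one-sample-delayed copy, using second-order
    # statistics only. The SVD returns canonical correlations in
    # descending order, so the last source is the least autocorrelated,
    # i.e. the most artifact-like under the BSS-CCA assumption.
    A = X[:, 1:] - X[:, 1:].mean(axis=1, keepdims=True)
    B = X[:, :-1] - X[:, :-1].mean(axis=1, keepdims=True)
    def inv_sqrt(C):
        U, s, _ = np.linalg.svd(C)
        return U @ np.diag(1.0 / np.sqrt(s)) @ U.T
    Wa, Wb = inv_sqrt(A @ A.T), inv_sqrt(B @ B.T)
    U, corrs, _ = np.linalg.svd(Wa @ (A @ B.T) @ Wb)
    return U.T @ Wa, corrs                # demixing matrix, correlations

# Toy single-channel signal: slow sine plus a broadband artifact burst.
rng = np.random.default_rng(2)
n = 1000
t = np.arange(n)
clean = np.sin(2 * np.pi * t / 100)
artifact = np.zeros(n)
artifact[300:400] = rng.standard_normal(100)
x = clean + artifact

X = multiscale_decompose(x)
mean = X.mean(axis=1, keepdims=True)
W, corrs = cca_vs_delay(X)
S = W @ (X - mean)
S[-1] = 0.0                               # drop least autocorrelated source
cleaned = (np.linalg.inv(W) @ S + mean).sum(axis=0)
```

The key design point is that the broadband burst ends up concentrated in the low-autocorrelation canonical source, so zeroing that single source suppresses the artifact while leaving the slowly varying physiology largely intact.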
Abstract:
There is extensive debate concerning the cognitive and behavioral adaptation of Neanderthals, especially in the period when the earliest anatomically modern humans dispersed into Western Europe, around 35,000–40,000 B.P. The site of the Grotte du Renne (at Arcy-sur-Cure) is of great importance because it provides the most persuasive evidence for behavioral complexity among Neanderthals. A range of ornaments and tools usually associated with modern human industries, such as the Aurignacian, were excavated from three of the Châtelperronian levels at the site, along with Neanderthal fossil remains (mainly teeth). This extremely rare occurrence has been taken to suggest that Neanderthals were the creators of these items. Whether Neanderthals independently achieved this level of behavioral complexity and whether this was culturally transmitted or mimicked via incoming modern humans has been contentious. At the heart of this discussion lies an assumption regarding the integrity of the excavated remains. One means of testing this is by radiocarbon dating; however, until recently, our ability to generate both accurate and precise results for this period has been compromised. A series of 31 accelerator mass spectrometry ultrafiltered dates on bones, antlers, artifacts, and teeth from six key archaeological levels shows an unexpected degree of variation. This suggests that some mixing of material may have occurred, which implies a more complex depositional history at the site and makes it difficult to be confident about the association of artifacts with human remains in the Châtelperronian levels.
Abstract:
In finite difference time domain simulation of room acoustics, source functions are subject to various constraints. These depend on the way sources are injected into the grid and on the chosen parameters of the numerical scheme being used. This paper addresses the issue of selecting and designing sources for finite difference simulation, by first reviewing associated aims and constraints, and evaluating existing source models against these criteria. The process of exciting a model is generalized by introducing a system of three cascaded filters, respectively, characterizing the driving pulse, the source mechanics, and the injection of the resulting source function into the grid. It is shown that hard, soft, and transparent sources can be seen as special cases within this unified approach. Starting from the mechanics of a small pulsating sphere, a parametric source model is formulated by specifying suitable filters. This physically constrained source model is numerically consistent, does not scatter incoming waves, and is free from zero- and low-frequency artifacts. Simulation results are employed for comparison with existing source formulations in terms of meeting the spectral and temporal requirements on the outward propagating wave.
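The hard/soft source distinction the abstract builds on is easy to see in one dimension. The sketch below is our illustration, not the paper's parametric model: a 1-D wave-equation FDTD scheme with a Gaussian driving pulse injected either by overwriting the grid value (hard source, which scatters incoming waves) or by adding to it (soft source, which does not):

```python
import numpy as np

def fdtd_1d(n_cells=400, n_steps=300, src=50, hard=False):
    # 1-D second-order wave-equation FDTD at Courant number 1, where the
    # update reduces to u_next = u_left + u_right - u_prev; rigid (u = 0)
    # boundaries at both ends.
    u_prev = np.zeros(n_cells)
    u = np.zeros(n_cells)
    for n in range(n_steps):
        pulse = np.exp(-0.5 * ((n - 40) / 10.0) ** 2)   # Gaussian driver
        u_next = np.zeros(n_cells)
        u_next[1:-1] = u[2:] + u[:-2] - u_prev[1:-1]
        if hard:
            u_next[src] = pulse    # hard source: overwrites the field
        else:
            u_next[src] += pulse   # soft source: adds to the field
        u_prev, u = u, u_next
    return u

field = fdtd_1d()                  # soft-source run
```

In the soft case the source cell obeys the same update as every other cell plus a forcing term, so a wave passing through it is unaffected; a hard source pins the cell's value and therefore reflects incident waves, which is one of the artifacts the paper's unified filter-cascade formulation is designed to avoid.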