934 results for PROBABILISTIC TELEPORTATION
Abstract:
A study was conducted to assess the status of ecological condition and potential human-health risks in subtidal estuarine waters throughout the North Carolina National Estuarine Research Reserve System (NERRS) (Currituck Sound, Rachel Carson, Masonboro Island, and Zeke’s Island). Field work was conducted in September 2006 and incorporated multiple indicators of ecosystem condition, including measures of water quality (dissolved oxygen, salinity, temperature, pH, nutrients and chlorophyll, suspended solids), sediment quality (granulometry, organic matter content, chemical contaminant concentrations), biological condition (diversity and abundances of benthic fauna, fish contaminant levels and pathologies), and human dimensions (fish-tissue contaminant levels relative to human-health consumption limits, various aesthetic properties). A probabilistic sampling design permitted statistical estimation of the spatial extent of degraded versus non-degraded condition across these estuaries relative to specified threshold levels of the various indicators (where possible). With some exceptions, these reserves appeared to be in relatively good to fair ecological condition overall, with the majority of the area (about 54%) having various water quality, sediment quality, and biological (benthic) condition indicators rated in the healthy to intermediate range of corresponding guideline thresholds. Only three stations, representing 10.5% of the area, had one or more of these indicators rated as poor/degraded in all three categories. While such a conclusion is encouraging from a coastal management perspective, it should be viewed with some caution. For example, although co-occurrences of adverse biological and abiotic environmental conditions were limited, at least one indicator of ecological condition rated in the poor/degraded range was observed over a broader area (35.5%) represented by 11 of the 30 stations sampled. In addition, the fish-tissue contaminant data were not included in these overall spatial estimates; however, the majority of samples (77% of fish analyzed, from 79% of stations where fish were caught) contained inorganic arsenic above the consumption limits for human cancer risks, though most likely derived from natural sources. Similarly, aesthetic indicators were not reflected in these spatial estimates of ecological condition, though there was evidence of noxious odors in sediments at many of the stations. Such symptoms reflect a growing realization that North Carolina estuaries are under multiple pressures from a variety of natural and human influences. These data also suggest that, while the current status of overall ecological condition appears to be good to fair, long-term monitoring is warranted to track potential changes in the future. This study establishes an important baseline of overall ecological condition within NC NERRS that can be used to evaluate any such future changes and to trigger appropriate management actions in this rapidly evolving coastal environment. (PDF contains 76 pages)
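A note on method: under a probabilistic station design of this kind, each station represents a known share of the target area, so the percent area in a given condition class is estimated from the area-weighted proportion of stations in that class, with a binomial-type confidence bound. A minimal Python sketch of that calculation follows; the station weights and ratings in it are illustrative placeholders, not the survey data.

    # Sketch: percent-area estimation from a probabilistic station design.
    # Weights (area represented by each station) and poor/degraded flags
    # below are hypothetical, not the NERRS survey data.
    import math

    def percent_area(weights, flags):
        """Area-weighted percentage of the region whose stations are flagged."""
        hit = sum(w for w, f in zip(weights, flags) if f)
        return 100.0 * hit / sum(weights)

    weights = [1.0 + 0.1 * (i % 5) for i in range(30)]   # 30 stations
    flags = [i % 10 == 0 for i in range(30)]             # rated poor/degraded
    p = percent_area(weights, flags)

    # Rough 95% interval, treating stations as equal-weight random draws
    n, q = len(weights), p / 100.0
    half = 1.96 * 100.0 * math.sqrt(q * (1 - q) / n)
    print(f"degraded area: {p:.1f}% +/- {half:.1f}%")

Unequal station weights are also why the reported area percentages (10.5%, 35.5%) need not equal the raw station fractions (3/30, 11/30).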
Abstract:
Summary: This cruise report is a summary of a field survey conducted within the Stellwagen Bank National Marine Sanctuary (SBNMS), located between Cape Cod and Cape Ann at the mouth of Massachusetts Bay. The survey was conducted June 14 – June 21, 2008 on NOAA Ship NANCY FOSTER Cruise NF-08-09-CCEHBR. Multiple indicators of ecological condition and human dimensions were sampled synoptically at each of 30 stations throughout SBNMS using a random probabilistic sampling design. Samples were collected for the analysis of benthic community structure and composition; concentrations of chemical contaminants (metals, pesticides, PAHs, PCBs, PBDEs) in sediments and target demersal biota; nutrient and chlorophyll levels in the water column; and other basic habitat characteristics such as depth, salinity, temperature, dissolved oxygen, turbidity, pH, sediment grain size, and organic carbon content. In addition to the fish samples that were collected for analysis of chemical contaminants relative to human-health consumption limits, other human-dimension indicators were sampled as well, including the presence or absence of fishing gear, vessels, surface trash, marine mammals, and noxious sediment odors. The overall purpose of the survey was to collect data to assess the status of ecosystem condition and potential stressor impacts throughout SBNMS, based on these various indicators and corresponding management thresholds, and to provide this information as a baseline for determining how such conditions may be changing with time. While sample analysis is still ongoing, a few preliminary results and observations are reported here. A final report will be completed once all data have been processed. The results are anticipated to be of value in supporting goals of the SBNMS and National Marine Sanctuary Program aimed at the characterization, protection, and management of sanctuary resources (pursuant to the National Marine Sanctuary Reauthorization Act), as well as a new priority of NCCOS and NOAA to apply ecosystem-based management (EBM) approaches to coastal resources through Integrated Ecosystem Assessments (IEAs) conducted in various coastal regions of the U.S., including the Northeast Atlantic continental shelf. This was a multi-disciplinary partnership effort made possible by scientists from the following organizations: NOAA, National Ocean Service (NOS), National Centers for Coastal Ocean Science (NCCOS), Center for Coastal Environmental Health and Biomolecular Research (CCEHBR), Charleston, SC; U.S. Environmental Protection Agency (EPA), National Health and Environmental Effects Research Laboratory (NHEERL), Atlantic Ecology Division (AED), Narragansett, RI; U.S. Environmental Protection Agency (EPA), National Health and Environmental Effects Research Laboratory (NHEERL), Gulf Ecology Division (GED), Gulf Breeze, FL; U.S. Geological Survey (USGS), National Wetlands Research Center, Gulf Breeze Project Office, Gulf Breeze, FL; and NOAA, Office of Marine and Aviation Operations (OMAO), NOAA Ship NANCY FOSTER. (31pp) (PDF contains 58 pages)
Abstract:
This cruise report is a summary of a field survey conducted in coastal-ocean waters off Florida from Anclote Key to West Palm Beach and from approximately 1 nautical mile (nm) offshore seaward to the shelf break (100 m). The survey was conducted May 15 - May 28, 2007 on NOAA Ship NANCY FOSTER Cruise NF-07-08-NCCOS. Multiple indicators of ecological condition were sampled synoptically at each of 50 stations throughout the region, including 10 stations within the Florida Keys National Marine Sanctuary (FKNMS), using a random probabilistic sampling design. Samples were collected for the analysis of benthic community structure and composition; concentrations of chemical contaminants (metals, pesticides, PAHs, PCBs, PBDEs) in sediments and target demersal biota; nutrient and chlorophyll levels in the water column; and other basic habitat characteristics such as depth, salinity, temperature, dissolved oxygen, pH, sediment grain size, and organic carbon content. The overall purpose of the survey was to collect data to assess the status of ecological condition in coastal-ocean waters of the region, based on these various indicators, and to provide this information as a baseline for determining how environmental conditions may be changing with time. The results will be of value in helping to broaden our understanding of the status of ecological resources and their controlling factors, including impacts of potential ecosystem stressors, in such strategic coastal areas. (PDF contains 34 pages)
Abstract:
This cruise report is a summary of a field survey conducted in coastal-ocean waters of the Mid-Atlantic Bight from Nags Head, North Carolina to Cape Cod, Massachusetts and from approximately 1 nautical mile (nm) offshore seaward to the shelf break (100 m). The survey was conducted May 12 - May 21, 2006 on NOAA Ship NANCY FOSTER Cruise NF-06-06-NCCOS. Multiple indicators of ecological condition were sampled synoptically at each of 49 stations throughout the region using a random probabilistic sampling design. Samples were collected for the analysis of benthic community structure and composition; concentrations of chemical contaminants (metals, pesticides, PAHs, PCBs, PBDEs) in sediments and target demersal biota; nutrient and chlorophyll levels in the water column; and other basic habitat characteristics such as depth, salinity, temperature, dissolved oxygen, pH, sediment grain size, and organic carbon content. The overall purpose of the survey was to collect data to assess the status of ecological condition in coastal-ocean waters of the region, based on these various indicators, and to provide this information as a baseline for determining how environmental conditions may be changing with time. The results will be of value in helping to broaden our understanding of the status of ecological resources and their controlling factors, including impacts of potential ecosystem stressors, in such strategic coastal areas. (18pp.) (PDF contains 24 pages)
Abstract:
Fundación Zain is developing new built heritage assessment protocols. The goal is to objectivize and standardize the analysis and decision process that leads to determining the degree of protection of built heritage in the Basque Country. The ultimate step in this objectivization and standardization effort will be the development of an information and communication technology (ICT) tool for the assessment of built heritage. This paper presents the groundwork carried out to make this tool possible: the automatic, image-based delineation of stone masonry. This is a necessary first step in the development of the tool, as the built heritage to be assessed consists of stone masonry construction, and many of the features analyzed can be characterized according to the geometry and arrangement of the stones. Much of the assessment is carried out through visual inspection; this process will therefore be automated by applying image processing to digital images of the elements under inspection. The principal contribution of this paper is the proposed framework for automatic delineation. The other contribution is the performance evaluation of this delineation as the input to a classifier for a geometrically characterized feature of a built heritage object. The element chosen for this evaluation is the stone arrangement of masonry walls. The validity of the proposed framework is assessed on real images of masonry walls.
Abstract:
We study quantum state tomography, entanglement detection and channel noise reconstruction of propagating quantum microwaves via dual-path methods. The presented schemes make use of the following key elements: propagation channels, beam splitters, linear amplifiers and field quadrature detectors. Remarkably, our methods are tolerant to the ubiquitous noise added to the signals by phase-insensitive microwave amplifiers. Furthermore, we analyse our techniques with numerical examples and experimental data, and compare them with the single-path scheme developed in Eichler et al. (2011 Phys. Rev. Lett. 106 220503; 2011 Phys. Rev. Lett. 107 113601). Our methods provide key toolbox components that may pave the way towards quantum microwave teleportation and communication protocols.
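To make the noise tolerance concrete, here is a schematic of the dual-path idea in generic notation (the symbols a, v, G and h_i are introduced here for illustration and are not necessarily those used in the paper). A 50:50 beam splitter distributes the signal mode a over two paths, with vacuum v at the second input port, and each path is read out through a phase-insensitive amplifier of gain G that adds an independent idler mode:

    c = \frac{a + v}{\sqrt{2}}, \qquad
    d = \frac{a - v}{\sqrt{2}}, \qquad
    C = \sqrt{G}\, c + \sqrt{G-1}\, h_1^{\dagger}, \qquad
    D = \sqrt{G}\, d + \sqrt{G-1}\, h_2^{\dagger}.

Because h_1 and h_2 are mutually uncorrelated and uncorrelated with a, cross-correlations between the two paths, e.g. \langle C D \rangle = (G/2)\langle a^2 \rangle for vacuum v, carry no amplifier-noise contribution; this is why signal moments can be reconstructed despite the added noise.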
Abstract:
This paper describes Mateda-2.0, a MATLAB package for estimation of distribution algorithms (EDAs). This package can be used to solve single and multi-objective discrete and continuous optimization problems using EDAs based on undirected and directed probabilistic graphical models. The implementation contains several methods commonly employed by EDAs. It is also conceived as an open package to allow users to incorporate different combinations of selection, learning, sampling, and local search procedures. Additionally, it includes methods to extract, process and visualize the structures learned by the probabilistic models. This way, it can unveil previously unknown information about the optimization problem domain. Mateda-2.0 also incorporates a module for creating and validating function models based on the probabilistic models learned by EDAs.
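As a concrete illustration of the select/learn/sample loop that EDAs implement, and that Mateda-2.0 generalizes to directed and undirected probabilistic graphical models, here is a minimal univariate EDA. This is a Python sketch of the generic algorithm, not Mateda's MATLAB API; all names and parameters are illustrative.

    # Minimal univariate EDA (UMDA-style): select the best individuals,
    # learn independent per-bit marginals, and sample a new population.
    import random

    def onemax(x):                        # toy fitness: count of ones
        return sum(x)

    def umda(n_bits=20, pop=100, top=50, gens=50):
        population = [[random.randint(0, 1) for _ in range(n_bits)]
                      for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=onemax, reverse=True)
            selected = population[:top]                      # selection
            probs = [sum(ind[i] for ind in selected) / top   # learning
                     for i in range(n_bits)]
            population = [[1 if random.random() < p else 0   # sampling
                           for p in probs] for _ in range(pop)]
        return max(population, key=onemax)

    best = umda()
    print(best, onemax(best))

Richer EDAs replace the independent marginals with a learned graphical model, which is exactly the layer Mateda-2.0 makes pluggable.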
Abstract:
Sociodemographic changes and the increase in life expectancy have led to a rise in certain diseases, including Alzheimer's disease. Alzheimer's disease affects not only the person suffering from it but also the family. Family caregivers are, in most cases, the ones who take charge of caring for these patients with a 24-hour commitment, which entails changes in their lifestyles. The objectives of this study are to describe the sociodemographic characteristics of informal caregivers, to determine their burden, and to evaluate their sleep quality. A cross-sectional study will be carried out including 40 caregivers of Alzheimer's patients, selected by non-probabilistic quota sampling. Participants will be informal caregivers of patients with Alzheimer's disease at stage III or IV of the disease. Our dependent variable will be sleep, and the independent variable will be burden. The study will be conducted at the Alzheimer's family association of Bilbao (A.F.A - Bizkaia), where the study sample will be recruited and the relevant questionnaires administered. Participation in the study requires signing an informed consent form. The instruments to be used are the Pittsburgh questionnaire, which evaluates sleep quality, and the Zarit burden scale. Data analysis will be performed with SPSS 15.0. Keywords: Alzheimer's disease, caregivers, family caregivers, dementia, burden, sleep.
Abstract:
This thesis presents a novel framework for state estimation in the context of robotic grasping and manipulation. The overall estimation approach is based on fusing various visual cues for manipulator tracking, namely appearance- and feature-based, shape-based, and silhouette-based visual cues. Similarly, a framework is developed to fuse these visual cues with kinesthetic cues, such as force-torque and tactile measurements, for in-hand object pose estimation. The cues are extracted from multiple sensor modalities and are fused in a variety of Kalman filters.
A hybrid estimator is developed to estimate both a continuous state (robot and object states) and discrete states, called contact modes, which specify how each finger contacts a particular object surface. A static multiple-model estimator is used to compute and maintain the probability of each contact mode. The thesis also develops an estimation framework for estimating model parameters associated with object grasping. Dual and joint state-parameter estimation is explored for estimating a grasped object's mass and center of mass. Experimental results demonstrate simultaneous object localization and center-of-mass estimation.
Dual-arm estimation is developed for two-arm robotic manipulation tasks. Two types of filters are explored: the first is an augmented filter that contains both arms in the state vector, while the second runs two filters in parallel, one for each arm. These two frameworks and their performance are compared in a dual-arm task of removing a wheel from a hub.
This thesis also presents a new method for action selection involving touch. This next-best-touch method selects, from the available actions for interacting with an object, the one expected to gain the most information. The algorithm employs information theory to compute an information-gain metric based on a probabilistic belief suitable for the task. An estimation framework is used to maintain this belief over time. Kinesthetic measurements, such as contact and tactile measurements, are used to update the state belief after every interactive action. Simulation and experimental results demonstrate next-best-touch for object localization, specifically of a door handle on a door. The next-best-touch theory is then extended to model parameter determination. Since many objects within a particular object category share the same rough shape, principal component analysis may be used to parametrize the object mesh models. These parameters can be estimated with the same action selection technique, choosing the touching action that best both localizes the object and estimates these parameters. Simulation results are then presented for localizing and determining a parameter of a screwdriver.
Lastly, the next best touch theory is further extended to model classes. Instead of estimating parameters, object class determination is incorporated into the information gain metric calculation. The best touching action is selected in order to best discern between the possible model classes. Simulation results are presented to validate the theory.
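The information-gain computation running through these chapters can be sketched compactly: maintain a discrete belief over hypotheses (poses, parameters, or classes) and, for each candidate touch, score the expected reduction in belief entropy over the possible observations. The Python below is a hedged illustration of that scoring; the states, actions, and likelihood table are hypothetical stand-ins for the thesis's estimation framework.

    import math

    def entropy(belief):
        return -sum(p * math.log2(p) for p in belief.values() if p > 0)

    def next_best_touch(belief, lik, observations):
        """lik[(obs, state, action)] = P(obs | state, action)."""
        h0, gain = entropy(belief), {}
        for a in {a for (_, _, a) in lik}:
            g = 0.0
            for o in observations:
                # predictive probability of observing o under action a
                p_o = sum(lik.get((o, s, a), 0.0) * p for s, p in belief.items())
                if p_o == 0.0:
                    continue
                # Bayes update of the belief for outcome (a, o)
                post = {s: lik.get((o, s, a), 0.0) * p / p_o
                        for s, p in belief.items()}
                g += p_o * (h0 - entropy(post))
            gain[a] = g
        return max(gain, key=gain.get), gain

    # Toy usage: two pose hypotheses, one informative and one useless touch
    belief = {"pose_A": 0.5, "pose_B": 0.5}
    lik = {}
    for s, p_hit in [("pose_A", 0.9), ("pose_B", 0.1)]:
        lik[("contact", s, "touch_left")] = p_hit
        lik[("no_contact", s, "touch_left")] = 1 - p_hit
    for s in belief:
        lik[("contact", s, "touch_right")] = 0.5
        lik[("no_contact", s, "touch_right")] = 0.5
    print(next_best_touch(belief, lik, ["contact", "no_contact"])[0])  # touch_left

Letting the hypothesis variable range over (pose, parameter) or (pose, class) pairs yields the parameter-estimation and class-discrimination variants described above.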
Abstract:
The brain is perhaps the most complex system to have ever been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.
It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new, extremely general optimization algorithm is proposed, called Relaxation Expectation Maximization (REM), that may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques, the quality of fits may be further improved, while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.
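For readers unfamiliar with the baseline that REM modifies, the sketch below shows plain expectation maximization for a two-component 1-D Gaussian mixture. It is only the standard E-step/M-step loop; REM's relaxation schedule and model-size selection are the thesis's contributions and are not reproduced here.

    # Plain EM for a 1-D mixture of two Gaussians (Python sketch)
    import math, random

    def em(data, iters=100):
        mu = [min(data), max(data)]            # crude initialization
        var, w = [1.0, 1.0], [0.5, 0.5]
        for _ in range(iters):
            # E-step: posterior responsibility of each component per point
            resp = []
            for x in data:
                p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                     / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
                z = sum(p)
                resp.append([pk / z for pk in p])
            # M-step: re-estimate weights, means, and variances
            for k in range(2):
                nk = sum(r[k] for r in resp)
                w[k] = nk / len(data)
                mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
                var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                       for r, x in zip(resp, data)) / nk)
        return w, mu, var

    data = [random.gauss(0, 1) for _ in range(200)] + \
           [random.gauss(5, 1) for _ in range(200)]
    print(em(data))

Depending on the initialization, such a run may stall in a sub-optimal local maximum of the likelihood, which is precisely the failure mode REM is designed to alleviate.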
The second part brings the technology of the first part to bear on two important problems in experimental neuroscience. The first is known as spike sorting: the problem of separating the spikes from different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed. Inference and learning in this model lead to new principled algorithms for smoothing and clustering of spike data.
Abstract:
We investigate the 2d O(3) model with the standard action by Monte Carlo simulation at couplings β up to 2.10. We measure the energy density, mass gap and susceptibility of the model, and gather high statistics on lattices of size L ≤ 1024 using the Floating Point Systems T-series vector hypercube and the Thinking Machines Corp.'s Connection Machine 2. Asymptotic scaling does not appear to set in for this action, even at β = 2.10, where the correlation length is 420. We observe a 20% difference between our estimate m/Λ_(MS-bar) = 3.52(6) at this β and the recent exact analytical result m/Λ_(MS-bar) = 2.943. We use the overrelaxation algorithm interleaved with Metropolis updates and show that decorrelation time scales with the correlation length and the number of overrelaxation steps per sweep. We determine its effective dynamical critical exponent to be z' = 1.079(10); thus critical slowing down is reduced significantly for this local algorithm, which is vectorizable and parallelizable.
We also use cluster Monte Carlo algorithms, non-local update schemes that can greatly increase the efficiency of computer simulations of spin models. The major computational task in these algorithms is connected component labeling, which identifies clusters of connected sites on a lattice. We have devised some new SIMD component labeling algorithms and implemented them on the Connection Machine. We investigate their performance when applied to the cluster update of the two-dimensional Ising spin model.
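What connected component labeling computes can be illustrated with the standard sequential union-find approach (the SIMD algorithms devised here for the Connection Machine are structured differently; this Python sketch shows only the task itself).

    # Sequential connected-component labeling via union-find.
    def label_clusters(bonds, n_sites):
        """bonds: (i, j) pairs of connected lattice sites."""
        parent = list(range(n_sites))

        def find(i):                  # root lookup with path compression
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i, j in bonds:            # merge the two sites' clusters
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj

        return [find(i) for i in range(n_sites)]

    # Four sites with bonds 0-1 and 2-3 form two clusters:
    print(label_clusters([(0, 1), (2, 3)], 4))

In a cluster update, bonds are laid down stochastically between like spins, the resulting clusters are labeled, and each cluster is then flipped as a unit.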
Finally, we use a Monte Carlo Renormalization Group method to directly measure the couplings of block Hamiltonians at different blocking levels. For the usual averaging block transformation we confirm the renormalized trajectory (RT) observed by Okawa. For another, improved probabilistic block transformation we find the RT, showing that it is much closer to the standard action. We then use this block transformation to obtain the discrete β-function of the model, which we compare to the perturbative result. We do not see convergence, except when using a rescaled coupling β_E to effectively resum the series. In the latter case we see agreement for m/Λ_(MS-bar) at β = 2.14, 2.26, 2.38 and 2.50. To three loops, m/Λ_(MS-bar) = 3.047(35) at β = 2.50, which is very close to the exact value m/Λ_(MS-bar) = 2.943. Our last point, at β = 2.62, however, disagrees with this estimate.
Abstract:
We propose an experimentally feasible scheme to generate various types of entangled states of light fields using beam splitters and single-photon detectors. Two beams of light are incident on two beam splitters, one on each, and each beam is asymmetrically split into two parts, one of which is so weak that it contains at most one photon. We let the two weak output modes interfere at a third beam splitter. A conditional joint measurement on both weak output modes may then result in entanglement between the other two output modes. The conditions for maximal entanglement are discussed based on the concurrence. Several specific examples are also examined.
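Two standard ingredients behind such schemes can be written out in generic notation (not necessarily the paper's conventions). A 50:50 beam splitter delocalizes a single photon over its two output modes, and the concurrence of a pure two-qubit state quantifies the heralded entanglement:

    \lvert 1,0 \rangle \;\longmapsto\; \frac{1}{\sqrt{2}} \bigl( \lvert 1,0 \rangle + \lvert 0,1 \rangle \bigr)
    \quad \text{(up to a phase convention)},

    C(\psi) = 2\,\lvert ad - bc \rvert
    \quad \text{for} \quad
    \lvert \psi \rangle = a \lvert 00 \rangle + b \lvert 01 \rangle + c \lvert 10 \rangle + d \lvert 11 \rangle,

with C = 1 attained exactly by maximally entangled states, the criterion against which the maximal-entanglement conditions are phrased.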
Abstract:
A classical question in combinatorics is the following: given a partial Latin square P, when can we complete P to a Latin square L? In this paper, we investigate the class of ε-dense partial Latin squares: partial Latin squares in which each symbol, row, and column contains no more than εn nonblank cells. Based on a conjecture of Nash-Williams, Daykin and Häggkvist conjectured that all 1/4-dense partial Latin squares are completable. In this paper, we will discuss the proof methods and results used in previous attempts to resolve this conjecture, introduce a novel technique derived from a paper by Jacobson and Matthews on generating random Latin squares, and use this novel technique to study ε-dense partial Latin squares that contain no more than δn^2 filled cells in total.
In Chapter 2, we construct completions for all ε-dense partial Latin squares containing no more than δn^2 filled cells in total, given that ε < 1/12 and δ < (1 - 12ε)^2/10409. In particular, we show that all 9.8 · 10^-5-dense partial Latin squares are completable. In Chapter 4, we augment these results by roughly a factor of two using some probabilistic techniques. These results improve prior work by Gustavsson, which required ε = δ ≤ 10^-7, as well as Chetwynd and Häggkvist, which required ε = δ = 10^-5, n even and greater than 10^7.
If we omit the probabilistic techniques noted above, we further show that such completions can always be found in polynomial time. This contrasts with a result of Colbourn, which states that completing arbitrary partial Latin squares is an NP-complete task. In Chapter 3, we strengthen Colbourn's result to the claim that completing an arbitrary (1/2 + ε)-dense partial Latin square is NP-complete, for any ε > 0.
Colbourn's result hinges heavily on a connection between triangulations of tripartite graphs and Latin squares. Motivated by this, we use our results on Latin squares to prove that any tripartite graph G = (V_1, V_2, V_3) such that (i) |V_1| = |V_2| = |V_3| = n, (ii) every vertex v ∈ V_i satisfies deg_+(v) = deg_-(v) ≥ (1 - ε)n, and (iii) |E(G)| > (1 - δ) · 3n^2, admits a triangulation if ε < 1/132 and δ < (1 - 132ε)^2/83272. In particular, this holds when ε = δ = 1.197 · 10^-5.
This strengthens results of Gustavsson, which require ε = δ = 10^-7.
In an unrelated vein, Chapter 6 explores the class of quasirandom graphs, a notion first introduced by Chung, Graham and Wilson (1989). Roughly speaking, a sequence of graphs is called "quasirandom" if it has a number of properties possessed by the random graph, all of which turn out to be equivalent. In this chapter, we study possible extensions of these results to random k-edge colorings, and create an analogue of Chung, Graham and Wilson's result for such colorings.
Abstract:
Changes in internal states, such as fear, hunger and sleep, affect behavioral responses in animals. In most cases, these state-dependent influences are "pleiotropic": one state affects multiple sensory modalities and behaviors; "scalable": the strength and choice of such modulations differ depending on the imminence of demands; and "persistent": once the state is switched on, the effects last even after the internal demands are off. These prominent features of state control enable animals to adjust their behavioral responses to their internal demands. Here, we studied the neuronal mechanisms of state control by investigating the energy-deprived (hunger) state and the socially deprived state of the fruit fly, Drosophila melanogaster, as prototypic models. To approach these questions, we developed two novel methods: a genetically based method to map sites of neuromodulation in the brain, and optogenetic tools in Drosophila.
These methods, together with genetic perturbations, reveal that the effect of hunger in altering behavioral sensitivity to gustatory cues is mediated by two distinct neuromodulatory pathways. The neuropeptide F (NPF)-dopamine (DA) pathway increases sugar sensitivity under mild starvation, while the adipokinetic hormone (AKH)-short neuropeptide F (sNPF) pathway decreases bitter sensitivity under severe starvation. These two pathways are recruited under different levels of energy demand without any cross interaction. The effects of both pathways are mediated by modulation of the gustatory sensory neurons, which reinforces the concept that sensory neurons constitute an important locus for state-dependent control of behavior. Our data suggest that multiple independent neuromodulatory pathways underlie the pleiotropic and scalable effects of the hunger state.
In addition, using optogenetic tools, we show that the neural control of male courtship song can be separated into probabilistic/biasing and deterministic/command-like components. The former, but not the latter, neurons are subject to functional modulation by social experience, supporting the idea that they constitute a locus of state-dependent influence. Moreover, brief activation of the former, but not the latter, neurons triggers a persistent behavioral response lasting more than 10 min. Altogether, these findings and the new tools described in this dissertation offer new entry points for future researchers seeking to understand the neuronal mechanisms of state control.
Abstract:
This study addresses the problem of obtaining reliable velocities and displacements from accelerograms, a concern which often arises in earthquake engineering. A closed-form acceleration expression with random parameters is developed to test any strong-motion accelerogram processing method. Integration of this analytical time history yields the exact velocities, displacements and Fourier spectra. Noise and truncation can also be added. A two-step testing procedure is proposed and the original Volume II routine is used as an illustration. The main sources of error are identified and discussed. Although these errors may be reduced, it is impossible to extract the true time histories from an analog or digital accelerogram because of the uncertain noise level and missing data. Based on these uncertainties, a probabilistic approach is proposed as a new accelerogram processing method. A most probable record is presented as well as a reliability interval which reflects the level of error-uncertainty introduced by the recording and digitization process. The data is processed in the frequency domain, under assumptions governing either the initial value or the temporal mean of the time histories. This new processing approach is tested on synthetic records. It induces little error and the digitization noise is adequately bounded. Filtering is intended to be kept to a minimum and two optimal error-reduction methods are proposed. The "noise filters" reduce the noise level at each harmonic of the spectrum as a function of the signal-to-noise ratio. However, the correction at low frequencies is not sufficient to significantly reduce the drifts in the integrated time histories. The "spectral substitution method" uses optimization techniques to fit spectral models of near-field, far-field or structural motions to the amplitude spectrum of the measured data. The extremes of the spectrum of the recorded data where noise and error prevail are then partly altered, but not removed, and statistical criteria provide the choice of the appropriate cutoff frequencies. This correction method has been applied to existing strong-motion far-field, near-field and structural data with promising results. Since this correction method maintains the whole frequency range of the record, it should prove to be very useful in studying the long-period dynamics of local geology and structures.
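The frequency-domain integration step underlying such processing can be sketched briefly: divide the acceleration spectrum by iω once for velocity and twice for displacement, with the DC term zeroed, which corresponds to fixing the temporal mean (one of the two assumptions mentioned above). The Python below shows only this core transform with illustrative names; the probabilistic error bounds, noise filters, and spectral substitution method of the study are not reproduced.

    # Frequency-domain integration of an accelerogram (sketch).
    # Zeroing the DC bin enforces a zero temporal mean for the outputs.
    import numpy as np

    def integrate_fd(acc, dt):
        n = len(acc)
        omega = 2 * np.pi * np.fft.rfftfreq(n, dt)
        A = np.fft.rfft(acc)
        with np.errstate(divide="ignore", invalid="ignore"):
            V = np.where(omega > 0, A / (1j * omega), 0.0)   # velocity
            D = np.where(omega > 0, -A / omega ** 2, 0.0)    # displacement
        return np.fft.irfft(V, n), np.fft.irfft(D, n)

    # Synthetic check: a 1 Hz sinusoidal acceleration
    dt = 0.01
    t = np.arange(0, 10, dt)
    vel, disp = integrate_fd(np.sin(2 * np.pi * t), dt)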