58 results for SAMPLING


Relevance:

20.00%

Publisher:

Abstract:

Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains, on a detailed mathematical level, the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate-and-fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised towards threshold. Thus, the conceptually simpler leaky integrate-and-fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, conducted for input currents of varying spectral composition. Moreover, we provide analytical estimates of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be used by any algorithmic procedure that is based on random sampling, such as Markov chain Monte Carlo or message-passing methods.
Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
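The sampling mechanism described above can be sketched with a minimal simulation: an EIF neuron driven by a noisy suprathreshold current, with each interspike interval recorded as one sample. All parameter values below are illustrative placeholders, not those of the paper.

```python
import numpy as np

def eif_isis(T=5.0, dt=1e-4, tau=0.02, E_L=-65.0, V_T=-50.0,
             delta_T=2.0, V_reset=-60.0, V_cut=-30.0,
             drive=16.0, sigma=2.0, seed=0):
    """Euler-Maruyama simulation of an exponential integrate-and-fire
    neuron with a noisy drive (units: mV, s); returns the interspike
    intervals. Parameter values are illustrative only."""
    rng = np.random.default_rng(seed)
    V, last_spike, isis = E_L, None, []
    for k in range(int(T / dt)):
        # EIF dynamics: leak + exponential sodium boost + constant drive
        dV = (-(V - E_L) + delta_T * np.exp((V - V_T) / delta_T) + drive) / tau
        V += dV * dt + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if V >= V_cut:                      # spike detected: log ISI, reset
            t = k * dt
            if last_spike is not None:
                isis.append(t - last_spike)
            last_spike = t
            V = V_reset
    return np.array(isis)

isis = eif_isis()
print(isis.size, float(isis.mean()))
```

The recorded ISIs form the sample set that, per the theory above, approximately reflects the input current's distribution.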

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES Respondent-driven sampling (RDS) is a new data collection methodology used to estimate characteristics of hard-to-reach groups, such as the HIV prevalence in drug users. Many national public health systems and international organizations rely on RDS data. However, RDS reporting quality and available reporting guidelines are inadequate. We carried out a systematic review of RDS studies and present Strengthening the Reporting of Observational Studies in Epidemiology for RDS Studies (STROBE-RDS), a checklist of essential items to present in RDS publications, justified by an explanation and elaboration document. STUDY DESIGN AND SETTING We searched the MEDLINE (1970-2013), EMBASE (1974-2013), and Global Health (1910-2013) databases to assess the number and geographical distribution of published RDS studies. STROBE-RDS was developed based on the STROBE guidelines, following the Guidance for Developers of Health Research Reporting Guidelines. RESULTS RDS has been used in over 460 studies from 69 countries, including the USA (151 studies), China (70), and India (32). STROBE-RDS includes modifications to 12 of the 22 items on the STROBE checklist. The two key areas that required modification concerned the selection of participants and the statistical analysis of the sample. CONCLUSION STROBE-RDS seeks to enhance the transparency and utility of research using RDS. If widely adopted, STROBE-RDS should improve public health decision making on infectious diseases globally.
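The statistical analysis flagged above is RDS-specific because referral chains oversample well-connected respondents. As a hedged illustration (this is the standard RDS-II, or Volz-Heckathorn, estimator, not something defined by STROBE-RDS), prevalence is typically estimated by weighting each respondent by the inverse of their reported network degree:

```python
def rds_ii_prevalence(degrees, infected):
    """RDS-II (Volz-Heckathorn) estimator: inverse-degree-weighted
    prevalence. degrees[i] is respondent i's reported network size,
    infected[i] is 1 if the trait is present. Illustrative sketch only."""
    weights = [1.0 / d for d in degrees]
    return sum(w for w, y in zip(weights, infected) if y) / sum(weights)

# Toy data: high-degree respondents are over-recruited by referral chains,
# so their observations are down-weighted.
print(rds_ii_prevalence([10, 2, 5, 2, 20], [1, 0, 1, 0, 1]))
```

Note how the naive proportion here would be 3/5, while the weighted estimate is substantially lower because the trait happens to cluster in high-degree respondents.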

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND The objective of the study was to evaluate the implications of different classifications of rheumatic heart disease (RHD) on estimated prevalence, and to systematically assess the importance of incidental findings from echocardiographic screening among schoolchildren in Peru. METHODS We performed a cluster randomized observational survey using portable echocardiography among schoolchildren aged 5 to 16 years from randomly selected public and private schools in Arequipa, Peru. Rheumatic heart disease was defined according to the modified World Health Organization (WHO) criteria and the World Heart Federation (WHF) criteria. FINDINGS Among 1395 eligible students from 40 classes and 20 schools, 1023 (73%) participated in the present survey. The median age of the children was 11 years (interquartile range [IQR] 8-13 years) and 50% were girls. The prevalence of possible, probable, and definite rheumatic heart disease according to the modified WHO criteria amounted to 19.7/1000 children and ranged from 10.2/1000 among children 5 to 8 years of age to 39.8/1000 among children 13 to 16 years of age; the prevalence of borderline/definite rheumatic heart disease according to the WHF criteria was 3.9/1000 children. Twenty-one children (2.1%) were found to have congenital heart disease, of whom 8 were referred for percutaneous or surgical intervention. CONCLUSIONS The prevalence of RHD in Peru was considerably lower than in endemic regions of sub-Saharan Africa, Southeast Asia, and Oceania, and was paralleled by a comparable number of previously undetected cases of congenital heart disease. Strategies to address collateral findings from echocardiographic screening are necessary in the setup of active surveillance programs for RHD. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT02353663.

Relevance:

20.00%

Publisher:

Abstract:

Research on open source software (OSS) projects often focuses on the SourceForge collaboration platform. We argue that a GNU/Linux distribution, such as Debian, is better suited for the sampling of projects because it avoids biases and contains unique information only available in an integrated environment. In particular, research on the reuse of components can build on the dependency information inherent in the Debian GNU/Linux packaging system. This paper therefore contributes to the practice of sampling methods in OSS research and provides empirical data on reuse dependencies in Debian.
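The dependency information mentioned above can be harvested directly from a Debian "Packages" index, whose stanzas are simple RFC-822-style field lists. The following sketch (version constraints and alternatives are simplified away; this is not the authors' tooling) extracts a package-to-dependencies map:

```python
import re

def parse_depends(packages_text):
    """Extract reuse dependencies from Debian 'Packages' index stanzas.
    Returns {package: [dependency, ...]}. Version constraints like
    '(>= 2.17)' and alternatives ('a | b') are reduced to the first
    bare package name; sketch only."""
    deps = {}
    for stanza in packages_text.strip().split("\n\n"):
        fields = dict(re.findall(r"^(\w[\w-]*): (.*)$", stanza, re.M))
        name = fields.get("Package")
        raw = fields.get("Depends", "")
        deps[name] = [re.sub(r"\s*\(.*?\)", "", alt.split("|")[0]).strip()
                      for alt in raw.split(",") if alt.strip()]
    return deps

sample = """Package: foo
Depends: libc6 (>= 2.17), libbar1 | libbaz1

Package: libbar1
Depends: libc6 (>= 2.17)"""
print(parse_depends(sample))
```

A map like this is exactly the kind of raw material a reuse study can aggregate into, for example, in-degree counts per library.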

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES To examine whether circulating levels of matrix metalloproteinase 9 (MMP-9) were associated with ultrasound-assessed intima-media thickness (IMT) and echolucent plaques in the carotid and femoral arteries, and to examine preanalytical sources of variability in MMP-9 concentrations related to sampling procedures. SUBJECTS AND DESIGN Plasma and serum MMP-9 levels were compared with ultrasound-assessed measures of femoral and carotid atherosclerosis in a cross-sectional study of 61-year-old men (n = 473). Preanalytical sources of variability in MMP-9 levels were examined in 10 healthy subjects. Main outcome measures were circulating levels of MMP-9 in serum and plasma, IMT of the carotid and femoral arteries, and plaque status based on size and echolucency. SETTING Research unit at a university hospital. RESULTS Plasma concentrations of total and active MMP-9 were associated with femoral artery IMT independently of traditional cardiovascular risk factors, and were higher in subjects with moderate to large femoral plaques. Plasma MMP-9 concentration was higher in men with echolucent femoral plaques (P = 0.006) than in subjects without femoral plaques. No similar associations were found for carotid plaques. MMP-9 concentrations were higher in serum than in plasma, and higher when sampling was performed with a Vacutainer than with a syringe. MMP-9 levels in serum were more strongly associated with peripheral neutrophil count than were MMP-9 levels in plasma. CONCLUSIONS Plasma MMP-9 levels were associated with atherosclerosis in the femoral artery, and total MMP-9 concentration was higher in men with echolucent femoral plaques. The choice of sample material and sampling method affects measurements of circulating MMP-9 levels.

Relevance:

20.00%

Publisher:

Abstract:

During acts of physical aggression, offenders frequently come into contact with the victim's clothes, leaving traces of DNA-bearing biological material on the garments. Since tape-lifting and swabbing, the currently established methods for non-destructive trace DNA sampling from clothing, both have shortcomings in collection efficiency and handling, we sought a new collection method for these challenging samples. Testing two readily available electrostatic devices for their potential to sample biological material from garments made of different fabrics, we found one of them, the electrostatic dust print lifter (DPL), to perform comparably to well-established sampling with wet cotton swabs. In simulated aggression scenarios, we had the same success rate for establishing single aggressor profiles suitable for database submission with both the DPL and wet swabbing. However, we lost a substantial amount of information with electrostatic sampling: compared with conventional swabbing, almost no mixed aggressor-victim profiles suitable for database entry could be established. This study serves as a proof of principle for electrostatic DNA sampling from items of clothing. The technique still requires optimization before it can be used in real casework, but we are confident that in the future it could be an efficient and convenient addition to the toolbox of forensic practitioners.

Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo integration is firmly established as the basis for most practical realistic image synthesis algorithms because of its flexibility and generality. However, the visual quality of rendered images often suffers from estimator variance, which appears as visually distracting noise. Adaptive sampling and reconstruction algorithms reduce variance by controlling the sampling density and aggregating samples in a reconstruction step, possibly over large image regions. In this paper we survey recent advances in this area. We distinguish between “a priori” methods that analyze the light transport equations and derive sampling rates and reconstruction filters from this analysis, and “a posteriori” methods that apply statistical techniques to sets of samples to drive the adaptive sampling and reconstruction process. They typically estimate the errors of several reconstruction filters, and select the best filter locally to minimize error. We discuss advantages and disadvantages of recent state-of-the-art techniques, and provide visual and quantitative comparisons. Some of these techniques are proving useful in real-world applications, and we aim to provide an overview for practitioners and researchers to assess these approaches. In addition, we discuss directions for potential further improvements.
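The "a posteriori" strategy described above can be illustrated in one dimension: split the per-pixel samples into two independent half-buffers, apply several candidate reconstruction filters, estimate each filter's error from the disagreement between the filtered half-buffers, and keep the per-pixel winner. This is a toy sketch of the general idea, not any specific published technique; the filter bank and error proxy are assumptions for illustration.

```python
import numpy as np

def a_posteriori_reconstruct(samples, widths=(1, 3, 7)):
    """Toy 'a posteriori' reconstruction: for each pixel, try several
    box filters and keep the one whose error, estimated from two
    independent half-buffers, is smallest. samples: (n, pixels) array
    of per-pixel Monte Carlo samples."""
    half_a = samples[0::2].mean(axis=0)      # two independent estimates
    half_b = samples[1::2].mean(axis=0)
    mean = samples.mean(axis=0)
    out = np.empty_like(mean)
    best_err = np.full(mean.shape, np.inf)
    for w in widths:
        kernel = np.ones(w) / w
        fa = np.convolve(half_a, kernel, mode="same")
        fb = np.convolve(half_b, kernel, mode="same")
        fm = np.convolve(mean, kernel, mode="same")
        err = (fa - fb) ** 2                 # cross-buffer error proxy
        better = err < best_err
        out[better], best_err[better] = fm[better], err[better]
    return out

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 6, 200))       # smooth 1D "image"
samples = truth + rng.normal(0.0, 0.5, size=(64, 200))
recon = a_posteriori_reconstruct(samples)
print(float(np.mean((recon - truth) ** 2)),
      float(np.mean((samples.mean(0) - truth) ** 2)))
```

On smooth regions the wide filter wins; near edges or detail its half-buffer disagreement grows and the selection falls back to narrower filters, which is the core trade-off these methods manage.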

Relevance:

20.00%

Publisher:

Abstract:

With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches leverage advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.

Relevance:

20.00%

Publisher:

Abstract:

The Interstellar Boundary Explorer (IBEX) has been directly observing neutral atoms from the local interstellar medium for the last six years (2009–2014). This paper ties together the 14 studies in this Astrophysical Journal Supplement Series Special Issue, which collectively describe the IBEX interstellar neutral results from this epoch and provide a number of other relevant theoretical and observational results. Interstellar neutrals interact with each other and with the ionized portion of the interstellar population in the “pristine” interstellar medium ahead of the heliosphere. Then, in the heliosphere's close vicinity, the interstellar medium begins to interact with escaping heliospheric neutrals. In this study, we compare the results from two major analysis approaches led by IBEX groups in New Hampshire and Warsaw. We also directly address the question of the distance upstream to the pristine interstellar medium and adjust both sets of results to a common distance of ~1000 AU. The two analysis approaches are quite different, but yield fully consistent measurements of the interstellar He flow properties, further validating our findings. While detailed error bars are given for both approaches, we recommend that for most purposes, the community use “working values” of ~25.4 km s⁻¹, ~75.7° ecliptic inflow longitude, ~−5.1° ecliptic inflow latitude, and ~7500 K temperature at ~1000 AU upstream. Finally, we briefly address future opportunities for even better interstellar neutral observations to be provided by the Interstellar Mapping and Acceleration Probe mission, which was recommended as the next major Heliophysics mission by the NRC's 2013 Decadal Survey.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we simulate numerically the catastrophic disruption of a large asteroid as a result of a collision with a smaller projectile and the subsequent reaccumulation of fragments as a result of their mutual gravitational attractions. We then investigate the original location within the parent body of the small pieces that eventually reaccumulate to form the largest offspring of the disruption as a function of the internal structure of the parent body. We consider four cases that may represent the internal structure of such a body (whose diameter is fixed at 250 km) in various early stages of the Solar System evolution: fully molten, half molten (i.e., a 26 km-deep outer layer of melt containing half of the mass), solid except a thin molten layer (8 km thick) centered at 10 km depth, and fully solid. The solid material has properties of basalt. We then focus on the three largest offspring that have enough reaccumulated pieces to consider. Our results indicate that the particles that eventually reaccumulate to form the largest reaccumulated bodies retain a memory of their original locations in the parent body. Most particles in each reaccumulated body are clustered from the same original region, even if their reaccumulations take place far away. The extent of the original region varies considerably depending on the internal structure of the parent. It seems to shrink with the solidity of the body. The fraction of particles coming from a given depth is computed for the four cases, which can give constraints on the internal structure of parent bodies of some meteorites. As one example, we consider the ureilites, which in some petrogenetic models are inferred to have formed at particular depths within their parent body. (C) 2014 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

wgttest performs a test proposed by DuMouchel and Duncan (1983) to evaluate whether the weighted and unweighted estimates of a regression model are significantly different.
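As a rough illustration of the idea behind such a test (a numpy sketch, not the wgttest implementation): augment the unweighted regression with weight-by-covariate interaction terms and jointly F-test them; a significant F statistic indicates that the weighted and unweighted estimates differ.

```python
import numpy as np

def weight_relevance_test(y, X, w):
    """DuMouchel-Duncan-style test sketch: augment the unweighted
    regression with weight-by-covariate interactions and F-test their
    joint significance. X must include an intercept column.
    Returns (F, df1, df2). Minimal illustration, not wgttest itself."""
    def rss(M):
        beta, *_ = np.linalg.lstsq(M, y, rcond=None)
        r = y - M @ beta
        return r @ r
    Xw = np.column_stack([X, X * w[:, None]])   # interactions incl. w itself
    rss0, rss1 = rss(X), rss(Xw)
    df1 = Xw.shape[1] - X.shape[1]
    df2 = len(y) - Xw.shape[1]
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    return F, df1, df2

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
w = rng.uniform(1, 3, size=n)                   # hypothetical sampling weights
y = 1.0 + 2.0 * x + rng.normal(size=n)          # outcome independent of w
X = np.column_stack([np.ones(n), x])
F, df1, df2 = weight_relevance_test(y, X, w)
print(F, df1, df2)
```

Here the data-generating process ignores the weights, so the F statistic should be unremarkable under its null F(df1, df2) distribution; with weight-dependent data it would grow large.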