938 results for Sampling Theorem
Abstract:
OBJECTIVES Respondent-driven sampling (RDS) is a new data collection methodology used to estimate characteristics of hard-to-reach groups, such as the HIV prevalence in drug users. Many national public health systems and international organizations rely on RDS data. However, RDS reporting quality and available reporting guidelines are inadequate. We carried out a systematic review of RDS studies and present Strengthening the Reporting of Observational Studies in Epidemiology for RDS Studies (STROBE-RDS), a checklist of essential items to present in RDS publications, justified by an explanation and elaboration document. STUDY DESIGN AND SETTING We searched the MEDLINE (1970-2013), EMBASE (1974-2013), and Global Health (1910-2013) databases to assess the number and geographical distribution of published RDS studies. STROBE-RDS was developed based on STROBE guidelines, following Guidance for Developers of Health Research Reporting Guidelines. RESULTS RDS has been used in over 460 studies from 69 countries, including the USA (151 studies), China (70), and India (32). STROBE-RDS includes modifications to 12 of the 22 items on the STROBE checklist. The two key areas that required modification concerned the selection of participants and statistical analysis of the sample. CONCLUSION STROBE-RDS seeks to enhance the transparency and utility of research using RDS. If widely adopted, STROBE-RDS should improve global infectious diseases public health decision making.
Abstract:
BACKGROUND The objective of the study was to evaluate the implications of different classifications of rheumatic heart disease on estimated prevalence, and to systematically assess the importance of incidental findings from echocardiographic screening among schoolchildren in Peru. METHODS We performed a cluster randomized observational survey using portable echocardiography among schoolchildren aged 5 to 16 years from randomly selected public and private schools in Arequipa, Peru. Rheumatic heart disease was defined according to the modified World Health Organization (WHO) criteria and the World Heart Federation (WHF) criteria. FINDINGS Among 1395 eligible students from 40 classes and 20 schools, 1023 (73%) participated in the present survey. The median age of the children was 11 years (interquartile range [IQR] 8-13 years) and 50% were girls. Prevalence of possible, probable and definite rheumatic heart disease according to the modified WHO criteria amounted to 19.7/1000 children and ranged from 10.2/1000 among children 5 to 8 years of age to 39.8/1000 among children 13 to 16 years of age; the prevalence of borderline/definite rheumatic heart disease according to the WHF criteria was 3.9/1000 children. Twenty-one children (2.1%) were found to have congenital heart disease, 8 of whom were referred for percutaneous or surgical intervention. CONCLUSIONS Prevalence of RHD in Peru was considerably lower than in endemic regions of sub-Saharan Africa, southeast Asia, and Oceania, and was paralleled by a comparable number of cases of undetected congenital heart disease. Strategies to address collateral findings from echocardiographic screening are necessary in the setup of active surveillance programs for RHD. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT02353663.
Abstract:
Research on open source software (OSS) projects often focuses on the SourceForge collaboration platform. We argue that a GNU/Linux distribution, such as Debian, is better suited for the sampling of projects because it avoids biases and contains unique information only available in an integrated environment. In particular, research on the reuse of components can build on the dependency information inherent in the Debian GNU/Linux packaging system. This paper therefore contributes to the practice of sampling methods in OSS research and provides empirical data on reuse dependencies in Debian.
Abstract:
OBJECTIVES To examine whether circulating levels of matrix metalloproteinase 9 (MMP-9) were associated with ultrasound-assessed intima-media thickness (IMT) and echolucent plaques in the carotid and femoral arteries. To examine preanalytical sources of variability in MMP-9 concentrations related to sampling procedures. SUBJECTS AND DESIGN Plasma and serum MMP-9 levels were compared with ultrasound assessed measures of femoral and carotid atherosclerosis, in a cross-sectional study of 61-year-old men (n = 473). Preanalytical sources of variability in MMP-9 levels were examined in 10 healthy subjects. Main outcome measures were circulating levels of MMP-9 in serum and plasma, IMT of the carotid and femoral arteries, and plaque status based on size and echolucency. SETTING Research unit at university hospital. RESULTS Plasma concentrations of total and active MMP-9 were associated with femoral artery IMT independently of traditional cardiovascular risk factors, and were higher in subjects with moderate to large femoral plaques. Plasma MMP-9 concentration was higher in men with echolucent femoral plaques (P = 0.006) compared with subjects without femoral plaques. No similar associations were found for carotid plaques. MMP-9 concentrations were higher in serum than in plasma, and higher when sampling was performed with Vacutainer than with syringe. MMP-9 levels in serum were more strongly associated with peripheral neutrophil count compared with MMP-9 levels in plasma. CONCLUSIONS Plasma MMP-9 levels were associated with atherosclerosis in the femoral artery, and total MMP-9 concentration was higher in men with echolucent femoral plaques. The choice of sample material and sampling method affect the measurements of circulating MMP-9 levels.
Abstract:
During acts of physical aggression, offenders frequently come into contact with the victim's clothes, thereby leaving traces of DNA-bearing biological material on the garments. Since tape-lifting and swabbing, the currently established methods for non-destructive trace DNA sampling from clothing, both have shortcomings in collection efficiency and handling, we explored a new collection method for these challenging samples. Testing two readily available electrostatic devices for their potential to sample biological material from garments made of different fabrics, we found one of them, the electrostatic dust print lifter (DPL), to perform comparably to well-established sampling with wet cotton swabs. In simulated aggression scenarios, we had the same success rate for the establishment of single aggressor profiles, suitable for database submission, with both the DPL and wet swabbing. However, we lost a substantial amount of information with electrostatic sampling, since almost no mixed aggressor-victim profiles suitable for database entry could be established, compared to conventional swabbing. This study serves as a proof of principle for electrostatic DNA sampling from items of clothing. The technique still requires optimization before it can be used in real casework, but we are confident that in the future it could be an efficient and convenient addition to the toolbox of forensic practitioners.
Abstract:
Monte Carlo integration is firmly established as the basis for most practical realistic image synthesis algorithms because of its flexibility and generality. However, the visual quality of rendered images often suffers from estimator variance, which appears as visually distracting noise. Adaptive sampling and reconstruction algorithms reduce variance by controlling the sampling density and aggregating samples in a reconstruction step, possibly over large image regions. In this paper we survey recent advances in this area. We distinguish between “a priori” methods that analyze the light transport equations and derive sampling rates and reconstruction filters from this analysis, and “a posteriori” methods that apply statistical techniques to sets of samples to drive the adaptive sampling and reconstruction process. They typically estimate the errors of several reconstruction filters, and select the best filter locally to minimize error. We discuss advantages and disadvantages of recent state-of-the-art techniques, and provide visual and quantitative comparisons. Some of these techniques are proving useful in real-world applications, and we aim to provide an overview for practitioners and researchers to assess these approaches. In addition, we discuss directions for potential further improvements.
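As a toy illustration of the "a posteriori" idea the survey describes (statistics gathered from an initial set of samples drive where further samples are placed), here is a minimal sketch in Python. It is a 1D stratified integrator, not a renderer, and all names and parameters are illustrative assumptions, not from the surveyed papers:

```python
import math
import random

def adaptive_mc_integrate(f, a, b, n_strata=8, pilot=32, budget=2048, seed=0):
    """A posteriori adaptive sampling sketch: take a pilot pass per stratum,
    estimate each stratum's variance, then spend the remaining sample budget
    proportionally to the estimated standard deviations."""
    rng = random.Random(seed)
    width = (b - a) / n_strata
    samples = [[] for _ in range(n_strata)]

    # Pilot pass: uniform sampling in every stratum.
    for i in range(n_strata):
        lo = a + i * width
        for _ in range(pilot):
            samples[i].append(f(lo + rng.random() * width))

    # Estimate per-stratum standard deviation from the pilot samples.
    def std(xs):
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

    sigmas = [std(s) for s in samples]
    total = sum(sigmas) or 1.0  # guard against an all-constant integrand

    # Adaptive pass: place extra samples where estimated variance is highest.
    remaining = budget - n_strata * pilot
    for i in range(n_strata):
        lo = a + i * width
        for _ in range(int(remaining * sigmas[i] / total)):
            samples[i].append(f(lo + rng.random() * width))

    # Stratified estimator: per-stratum mean, weighted by stratum width.
    return sum(width * sum(s) / len(s) for s in samples)

# Example: integrate sin(x) on [0, pi]; the exact value is 2.
est = adaptive_mc_integrate(math.sin, 0.0, math.pi)
```

The same two-phase pattern (estimate error, then refine where the estimate is worst) is what the image-space methods apply per pixel, with reconstruction filters in place of the simple stratum means.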
Abstract:
With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches leverage advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.
Abstract:
The aim of this note is to characterize all pairs of sufficiently smooth functions for which the mean value in the Cauchy mean value theorem is taken at a point which has a well-determined position in the interval. As an application of this result, a partial answer is given to a question posed by Sahoo and Riedel.
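For reference, the Cauchy mean value theorem in question states: for $f, g$ continuous on $[a,b]$, differentiable on $(a,b)$, with $g'$ nonvanishing on $(a,b)$, there exists $c \in (a,b)$ such that

```latex
\frac{f(b)-f(a)}{g(b)-g(a)} = \frac{f'(c)}{g'(c)}.
```

The note's question is for which pairs $(f,g)$ the position of $c$ within $(a,b)$ is well determined.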
Abstract:
The Interstellar Boundary Explorer (IBEX) has been directly observing neutral atoms from the local interstellar medium for the last six years (2009–2014). This paper ties together the 14 studies in this Astrophysical Journal Supplement Series Special Issue, which collectively describe the IBEX interstellar neutral results from this epoch and provide a number of other relevant theoretical and observational results. Interstellar neutrals interact with each other and with the ionized portion of the interstellar population in the “pristine” interstellar medium ahead of the heliosphere. Then, in the heliosphere's close vicinity, the interstellar medium begins to interact with escaping heliospheric neutrals. In this study, we compare the results from two major analysis approaches led by IBEX groups in New Hampshire and Warsaw. We also directly address the question of the distance upstream to the pristine interstellar medium and adjust both sets of results to a common distance of ~1000 AU. The two analysis approaches are quite different, but yield fully consistent measurements of the interstellar He flow properties, further validating our findings. While detailed error bars are given for both approaches, we recommend that for most purposes, the community use “working values” of ~25.4 km s⁻¹ flow speed, ~75.7° ecliptic inflow longitude, ~−5.1° ecliptic inflow latitude, and ~7500 K temperature at ~1000 AU upstream. Finally, we briefly address future opportunities for even better interstellar neutral observations to be provided by the Interstellar Mapping and Acceleration Probe mission, which was recommended as the next major Heliophysics mission by the NRC's 2013 Decadal Survey.
Abstract:
In this paper, we simulate numerically the catastrophic disruption of a large asteroid as a result of a collision with a smaller projectile and the subsequent reaccumulation of fragments as a result of their mutual gravitational attractions. We then investigate the original location within the parent body of the small pieces that eventually reaccumulate to form the largest offspring of the disruption as a function of the internal structure of the parent body. We consider four cases that may represent the internal structure of such a body (whose diameter is fixed at 250 km) in various early stages of the Solar System evolution: fully molten, half molten (i.e., a 26 km-deep outer layer of melt containing half of the mass), solid except a thin molten layer (8 km thick) centered at 10 km depth, and fully solid. The solid material has properties of basalt. We then focus on the three largest offspring that have enough reaccumulated pieces to consider. Our results indicate that the particles that eventually reaccumulate to form the largest reaccumulated bodies retain a memory of their original locations in the parent body. Most particles in each reaccumulated body are clustered from the same original region, even if their reaccumulations take place far away. The extent of the original region varies considerably depending on the internal structure of the parent. It seems to shrink with the solidity of the body. The fraction of particles coming from a given depth is computed for the four cases, which can give constraints on the internal structure of parent bodies of some meteorites. As one example, we consider the ureilites, which in some petrogenetic models are inferred to have formed at particular depths within their parent body.
Abstract:
Indoor and ambient air organic pollutants have been gaining attention because they have been measured at levels with possible health effects. Studies have shown that most airborne polychlorinated biphenyls (PCBs), pesticides and many polycyclic aromatic hydrocarbons (PAHs) are present in the free vapor state. The purpose of this research was to extend recent investigative work with polyurethane foam (PUF) as a collection medium for semivolatile compounds. Open-porous flexible PUFs with different chemical makeup and physical properties were evaluated as to their collection affinities/efficiencies for various classes of compounds and the degree of sample recovery. Filtered air samples were pulled through plugs of PUF spiked with various semivolatiles under different simulated environmental conditions (temperature and humidity) and sampling parameters (flow rate and sample volume) in order to measure their effects on sample breakthrough volume (V_B). PUF was also evaluated in the passive mode using organo-phosphorus pesticides. Another major goal was to improve the overall analytical methodology; PUF is inexpensive, easy to handle in the field and has excellent airflow characteristics (low pressure drop). It was confirmed that the PUF collection apparatus behaves as if it were a gas-solid chromatographic system, in that V_B was related to temperature and sample volume. Breakthrough volumes were essentially the same using both polyether- and polyester-type PUF. Also, little change was observed in the V_B values after coating PUF with common chromatographic liquid phases. Open-cell (reticulated) foams gave better recoveries than closed-cell foams. There was a slight increase in V_B with an increase in the number of cells/pores per inch. The high-density polyester PUF was found to be an excellent passive and active collection adsorbent. Good recoveries could be obtained using just solvent elution.
A gas chromatograph equipped with a photoionization detector gave excellent sensitivities and selectivities for the various classes of compounds investigated.
Abstract:
The Hasse-Minkowski theorem concerns the classification of quadratic forms over global fields (i.e., finite extensions of Q and rational function fields with a finite constant field). Hasse proved the theorem over the rational numbers in his Ph.D. thesis in 1921. He extended the research of his thesis to quadratic forms over all number fields in 1924. Historically, the Hasse-Minkowski theorem was the first notable application of p-adic fields that caught the attention of a wide mathematical audience. The goal of this thesis is to discuss the Hasse-Minkowski theorem over the rational numbers and over the rational function fields with a finite constant field of odd characteristic. Our treatments of quadratic forms and local fields, though, are more general than what is strictly necessary for our proofs of the Hasse-Minkowski theorem over Q and its analogue over rational function fields (of odd characteristic). Our discussion concludes with some applications of the Hasse-Minkowski theorem.
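For reference, the theorem under discussion, stated here in its local-global form over the rational numbers: a nondegenerate quadratic form $q$ over $\mathbb{Q}$ represents zero nontrivially if and only if it does so over every completion of $\mathbb{Q}$, i.e.

```latex
q(x_1,\dots,x_n) = 0 \ \text{has a nontrivial solution over } \mathbb{Q}
\iff
\text{it has one over } \mathbb{R} \text{ and over } \mathbb{Q}_p \text{ for every prime } p.
```

The function-field analogue treated in the thesis replaces $\mathbb{Q}$ by a rational function field of odd characteristic and the completions $\mathbb{R}$, $\mathbb{Q}_p$ by the completions at its places.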
Abstract:
We analyze a model of 'postelection politics', in which (unlike in the more common Downsian models of 'preelection politics') politicians cannot make binding commitments prior to elections. The game begins with an incumbent politician in office, and voters adopt reelection strategies that are contingent on the policies implemented by the incumbent. We generalize previous models of this type by introducing heterogeneity in voters' ideological preferences, and analyze how voters' reelection strategies constrain the policies chosen by a rent-maximizing incumbent. We first show that virtually any policy (and any feasible level of rent for the incumbent) can be sustained in a Nash equilibrium. Then, we derive a 'median voter theorem': the ideal point of the median voter, and the minimum feasible level of rent, are the unique outcomes in any strong Nash equilibrium. We then introduce alternative refinements that are less restrictive. In particular, Ideologically Loyal Coalition-proof equilibrium also leads uniquely to the median outcome.
Abstract:
De Moivre's theorem is of great utility in some parts of physical chemistry and is re-introduced here.
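For reference, De Moivre's theorem states that for any integer $n$

```latex
(\cos\theta + i\sin\theta)^n = \cos n\theta + i\sin n\theta.
```

Expanding the left-hand side by the binomial theorem and equating real and imaginary parts yields the multiple-angle formulas, e.g. $\cos 3\theta = 4\cos^3\theta - 3\cos\theta$, which is the usual route to its utility in applications.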