920 results for Variable sampling intervals (VSI)


Relevance: 40.00%

Publisher:

Abstract:

We propose a nonparametric variance estimator for cases in which ranked set sampling (RSS) and judgment post-stratification (JPS) are applied by measuring a concomitant variable. The proposed estimator is obtained by conditioning on the observed concomitant values and using nonparametric kernel regression.
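
The conditioning step can be sketched with a generic Nadaraya-Watson kernel regression estimator; this is a standard textbook construction, not the authors' exact estimator, and the data below are invented for illustration.

```python
import math

def nw_regression(x0, xs, ys, h):
    """Nadaraya-Watson kernel estimate of E[Y | X = x0]
    using a Gaussian kernel with bandwidth h."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# toy concomitant/response pairs (made up): y = 2x exactly
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 2.0, 4.0, 6.0, 8.0]
```

With a perfectly linear toy relationship, the estimate at an interior point recovers the true conditional mean; in practice the bandwidth h would be chosen by cross-validation or a plug-in rule.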

Relevance: 40.00%

Publisher:

Abstract:

Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, surprisingly few quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains, at a detailed mathematical level, the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate-and-fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced at values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and its input current. The approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is raised towards threshold. Thus, the conceptually simpler leaky integrate-and-fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations conducted for input currents of varying spectral composition. Moreover, we provide analytical estimates of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be used by any algorithmic procedure that is based on random sampling, such as Markov chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
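
As a rough illustration of the sampling mechanism, the sketch below Euler-integrates a noisy EIF neuron and collects its ISIs; the parameter values are generic textbook choices (not those of the paper), and a stronger input should yield shorter ISIs.

```python
import math, random

def eif_isis(I, T=2.0, dt=1e-4, tau=0.02, v_rest=-0.065,
             v_T=-0.050, delta_T=0.002, v_reset=-0.065,
             v_spike=0.0, sigma=0.002, seed=0):
    """Euler-Maruyama integration of a noisy exponential
    integrate-and-fire neuron; returns the list of ISIs (s).
    Each spike is one 'sample' whose analog value is the
    preceding ISI."""
    rng = random.Random(seed)
    v, t, t_last, isis = v_rest, 0.0, 0.0, []
    while t < T:
        drift = (-(v - v_rest)
                 + delta_T * math.exp((v - v_T) / delta_T)
                 + I) / tau
        v += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
        if v >= v_spike:                 # spike: record ISI and reset
            isis.append(t - t_last)
            t_last, v = t, v_reset
    return isis
```

The exponential term is negligible far below v_T and dominates near it, which is the "current boost" the abstract contrasts with the leaky integrate-and-fire neuron.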

Relevance: 40.00%

Publisher:

Abstract:

The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m schooner, collecting environmental data and organisms, from viruses to planktonic metazoans, for later analyses using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological, and functional diversity of plankton. The present data set provides continuous measurements made with a fast repetition rate fluorometry (FRRF) instrument operating in flow-through mode during the 2009-2012 part of the expedition. The instrument works by exciting chlorophyll fluorescence with a series of short flashes of controlled energy and time intervals (Kolber et al., 1998). The fluorescence transients produced by this excitation signal were analysed in real time to provide estimates of the abundance of photosynthetic pigments, the photosynthetic yields (Fv/Fm), the functional absorption cross section (a proxy for the efficiency of photosynthetic energy acquisition), the kinetics of photosynthetic electron transport between Photosystem II and Photosystem I, and the size of the PQ pool. These parameters were measured at excitation wavelengths of 445 nm, 470 nm, 505 nm, and 535 nm, allowing the presence and photosynthetic performance of different phytoplankton taxa to be assessed from the spectral composition of their light-harvesting pigments. The FRRF-derived photosynthetic characteristics were used to calculate the initial slope, the half saturation, and the maximum level of the photosynthesis vs. irradiance relationship. FRRF data were acquired continuously at 1-minute intervals.
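
The three photosynthesis-irradiance quantities mentioned (initial slope, half saturation, maximum level) can be related through a standard saturating model. The sketch below uses the exponential form of Webb et al. (1974), which is one common choice and not necessarily the one applied to this data set.

```python
import math

def webb_pi(irr, p_max, alpha):
    """Webb-type photosynthesis-irradiance curve:
    P = Pmax * (1 - exp(-alpha * I / Pmax)),
    where alpha is the initial slope and Pmax the maximum level."""
    return p_max * (1.0 - math.exp(-alpha * irr / p_max))

def half_saturation_irradiance(p_max, alpha):
    """Irradiance at which P reaches Pmax / 2 under the Webb model."""
    return (p_max / alpha) * math.log(2.0)
```

At zero irradiance the curve rises with slope alpha, and it saturates at Pmax for large irradiance; the half-saturation point follows analytically from the model.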

Relevance: 40.00%

Publisher:

Abstract:

Monte Carlo techniques, which require the generation of samples from some target density, are often the only alternative for performing Bayesian inference. Two classic techniques for drawing independent samples are the ratio of uniforms (RoU) and rejection sampling (RS). We propose an efficient sampling algorithm that combines the RoU and polar RS (i.e., RS inside a sector of a circle, using polar coordinates). Its efficiency is demonstrated by drawing samples from truncated Cauchy and Gaussian random variables, which have many important applications in signal processing and communications. [Spanish summary, translated:] An efficient method for generating some random variables in common use in signal processing and communications (e.g., truncated Gaussian or Cauchy variables) by combining two techniques: ratio of uniforms and rejection sampling.
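
For the standard Cauchy the geometric idea can be sketched exactly: the RoU acceptance region of f(x) ∝ 1/(1+x²) is the half-disk u² + v² ≤ 1 (u > 0), truncation to [a, b] carves out a circular sector, and that sector can be sampled directly in polar coordinates with no rejections at all. This degenerate special case is only an illustration; the paper's algorithm addresses cases (e.g. the Gaussian) where a genuine polar rejection step is needed.

```python
import math, random

def truncated_cauchy(a, b, n, seed=0):
    """Samples from a standard Cauchy truncated to [a, b].

    The RoU region of f(x) = 1/(1+x^2) is {(u,v): u^2+v^2 <= 1, u > 0};
    restricting x = v/u to [a, b] leaves the sector with polar angle
    theta in (atan(a), atan(b)). Uniform points in that sector have a
    uniformly distributed angle, so drawing theta uniformly and
    returning tan(theta) is exact -- every draw is accepted."""
    rng = random.Random(seed)
    lo, hi = math.atan(a), math.atan(b)
    return [math.tan(rng.uniform(lo, hi)) for _ in range(n)]
```

The fraction of samples falling below any cut point x is (atan(x) - atan(a)) / (atan(b) - atan(a)), which gives a quick sanity check of the generator.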

Relevance: 30.00%

Publisher:

Abstract:

Some factors complicate comparisons between linkage maps from different studies. This problem can be resolved if measures of precision, such as confidence intervals and frequency distributions, are associated with the markers. We examined the precision of distances and ordering of microsatellite markers in the consensus linkage maps of chromosomes 1, 3 and 4 from two reciprocal F2 Brazilian chicken populations, using bootstrap sampling. Single and consensus maps were constructed. The consensus map was compared with the International Consensus Linkage Map and with the whole-genome sequence. Some loci showed segregation distortion and missing data, but this did not negatively affect the analyses. Several inversions and position shifts were detected, based on 95% confidence intervals and frequency distributions of loci. Some discrepancies in distances between loci and in ordering were due to chance, whereas others could be attributed to other effects, including reciprocal crosses, sampling error of the founder animals from the two populations, F2 population structure, the number of and distance between microsatellite markers, the number of informative meioses, loci segregation patterns, and sex. In the Brazilian consensus map of GGA1, locus LEI1038 was in a position closer to the true genome sequence than in the International Consensus Map, whereas for GGA3 and GGA4 no such differences were found. Extending these analyses to the remaining chromosomes should facilitate comparisons and the integration of the several available genetic maps, allowing meta-analyses for map construction and quantitative trait loci (QTL) mapping. The precision of the estimates of QTL positions and their effects would be increased with such information.
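
Bootstrap confidence intervals of the kind attached to the markers above can be sketched generically with the percentile method; the "map distance" values below are invented for illustration.

```python
import random, statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000,
                 alpha=0.05, seed=0):
    """Percentile-bootstrap (1 - alpha) confidence interval
    for a statistic of the data."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    return (reps[int(alpha / 2 * n_boot)],
            reps[int((1 - alpha / 2) * n_boot) - 1])

# hypothetical estimates of one inter-marker distance (cM)
distances = [12.1, 11.4, 13.0, 12.7, 11.9, 12.3, 12.8, 11.6]
ci_lo, ci_hi = bootstrap_ci(distances)
```

The same resampling scheme also yields frequency distributions of marker orderings: record the order obtained in each bootstrap replicate and tabulate how often each ordering occurs.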

Relevance: 30.00%

Publisher:

Abstract:

The objective was to develop and test a procedure for applying variable rates of fertilizer and to evaluate the yield response of coffee (Coffea arabica L.) to the application of phosphorus and potassium. The work was conducted during the 2004 season in a 6.4 ha field located in central Sao Paulo state. Two treatments were applied in alternating strips of fixed and variable rates throughout the season: one following the locally recommended fertilizing procedures, and the other based on grid soil sampling. A prototype pneumatic fertilizer applicator was used, carrying two conveyor belts, one for each row. Harvesting was done with a commercial harvester equipped with a customized volumetric yield monitor, keeping the two treatments separate. Data were analyzed using geostatistics, correlations and regressions. The procedure proved feasible and effective. The area that received fertilizer at a variable rate showed a 34% yield increase compared to the area that received a fixed rate. Variable rate application resulted in savings of 23% in phosphate fertilizer and a 13% increase in potassium fertilizer, compared to the fixed rate. Yield in 2005, the year after the variable rate treatments, still showed a residual effect from the treatments carried out during the previous cycle.

Relevance: 30.00%

Publisher:

Abstract:

The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of the data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference, it is useful to ask whether inferences from a probit model are sensitive to the choice between Bayesian and sampling-theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain the choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative for the coefficients, and an inequality-restricted prior on the signs of the coefficients. We often know a priori whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
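
A minimal sketch of the Bayesian side of this comparison, assuming simulated data and a one-coefficient probit with a flat prior optionally truncated to beta >= 0 (random-walk Metropolis; this is a toy illustration, not the authors' implementation):

```python
import math, random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def log_lik(beta, xs, ys):
    """Probit log-likelihood: P(y = 1 | x) = Phi(beta * x)."""
    ll = 0.0
    for x, y in zip(xs, ys):
        p = min(max(phi(beta * x), 1e-12), 1.0 - 1e-12)
        ll += math.log(p if y else 1.0 - p)
    return ll

def mh_probit(xs, ys, n_iter=4000, step=0.3, truncate=True, seed=1):
    """Random-walk Metropolis draws for beta.  truncate=True uses a
    flat prior on beta >= 0 (an inequality-restricted prior)."""
    rng = random.Random(seed)
    beta, ll = 0.5, log_lik(0.5, xs, ys)
    draws = []
    for _ in range(n_iter):
        prop = beta + rng.gauss(0.0, step)
        if truncate and prop < 0.0:
            draws.append(beta)          # prior density is zero: reject
            continue
        ll_prop = log_lik(prop, xs, ys)
        if math.log(rng.random()) < ll_prop - ll:
            beta, ll = prop, ll_prop
        draws.append(beta)
    return draws

# simulated choice data with true beta = 1
rng = random.Random(0)
xs = [rng.uniform(-2.0, 2.0) for _ in range(200)]
ys = [1 if rng.random() < phi(x) else 0 for x in xs]
post = mh_probit(xs, ys)
post_mean = sum(post[1000:]) / len(post[1000:])
```

With a flat untruncated prior, the posterior mode coincides with the ML estimate; truncation matters most when the likelihood places appreciable mass on the "wrong" sign.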

Relevance: 30.00%

Publisher:

Abstract:

A variable that appears to affect preference development is exposure to a variety of options. Providing opportunities for systematically sampling different options is one procedure that can facilitate the development of preference, which is indicated by the consistency of selections. The purpose of this study was to evaluate the effects of providing sampling opportunities on preference development in two adults with severe disabilities. Opportunities for sampling a variety of drink items were presented, followed by choice opportunities at the site where sampling occurred and at a non-sampling site (a grocery store). Results show that the participants developed a definite response consistency in their selections at both sites. Implications for sampling practices are discussed.

Relevance: 30.00%

Publisher:

Abstract:

The aim of this paper is to predict time series of SO2 concentrations emitted by coal-fired power stations, in order to estimate emission episodes in advance and to analyze the influence of some meteorological variables on the prediction. An emission episode is said to occur when the series of bi-hourly means of SO2 is greater than a specific level. For coal-fired power stations it is essential to predict emission episodes sufficiently in advance so that appropriate preventive measures can be taken. We propose a methodology to predict SO2 emission episodes based on an additive model and an algorithm for variable selection. The methodology was applied to the estimation of SO2 emissions registered at sampling locations near a coal-fired power station in Northern Spain. The results indicate good performance of the model using only two terms of the time series, and that the inclusion of the meteorological variables in the model is not significant.
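
The episode definition itself can be sketched directly; the threshold and the hourly series below are invented for illustration.

```python
def bihourly_means(hourly):
    """Collapse an hourly SO2 series (ug/m3) into bi-hourly means."""
    return [sum(hourly[i:i + 2]) / 2.0
            for i in range(0, len(hourly) - 1, 2)]

def emission_episodes(hourly, level):
    """Indices of bi-hourly periods whose mean exceeds the level."""
    return [i for i, m in enumerate(bihourly_means(hourly)) if m > level]

series = [40, 60, 180, 260, 90, 70, 300, 420]   # hypothetical ug/m3
```

The prediction problem the paper addresses is then to forecast these bi-hourly means far enough ahead that an exceedance can be flagged before it occurs.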

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: To assess the effect of oscillatory breathing on the variability of RR intervals (V-RR) and on prognosis after a one-year follow-up in subjects with left ventricular global systolic dysfunction. METHODS: We studied 76 subjects aged 40 to 80 years, paired for age and gender and divided into two groups: group I, 34 healthy subjects; group II, 42 subjects with left ventricular global systolic dysfunction (ejection fraction < 0.40). ECG signals were acquired for 600 s in the supine position, and the variation of thoracic amplitude and the V-RR were analyzed. Clinical and V-RR variables were entered into a multivariate logistic model to predict survival after one year of follow-up. RESULTS: Oscillatory breathing was detected in 35.7% of the subjects of group II in the waking state, with a concentration of spectral power in the very low frequency band, and was independent of the presence of diabetes, functional class, ejection fraction, cause of ventricular dysfunction, and survival after one year of follow-up. In the logistic regression model, ejection fraction was the only independent variable predicting survival. CONCLUSION: 1) An oscillatory breathing pattern is frequent during wakefulness in left ventricular global systolic dysfunction and concentrates spectral power in the very low frequency band of the V-RR; 2) it is not related to the severity or cause of left ventricular dysfunction; 3) ejection fraction is the only independent predictive variable for survival in this group of subjects.

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE: In Switzerland, nationwide large-scale radon surveys have been conducted since the early 1980s to establish the distribution of indoor radon concentrations (IRC). The aim of this work was to study the factors influencing IRC in Switzerland using univariate analyses that take into account biases caused by spatially irregular sampling. METHODS: About 212,000 IRC measurements carried out in more than 136,000 dwellings were available for this study. A probability map of the risk of exceeding an IRC of 300 Bq/m³ was produced using basic geostatistical techniques. Univariate analyses of IRC were performed for different variables, namely the type of radon detector, various building characteristics such as foundation type, year of construction and building type, as well as altitude, the average outdoor temperature during measurement, and the lithology, comparing 95% confidence intervals among the classes of each variable. Furthermore, a map showing the spatial aggregation of the number of measurements was generated for each class of each variable in order to assess biases due to spatially irregular sampling. RESULTS: IRC measurements carried out with electret detectors were 35% higher than measurements performed with track detectors. Regarding building characteristics, the IRC of apartments is significantly lower than that of individual houses. Furthermore, buildings with concrete foundations have the lowest IRC. A significant decrease in IRC was found in buildings constructed after 1900 and again after 1970. Moreover, IRC decreases at higher outdoor temperatures, and there is a tendency towards higher IRC at higher altitudes. Regarding lithology, carbonate rock in the Jura Mountains produces significantly higher IRC, by almost a factor of 2, than carbonate rock in the Alps. Sedimentary rock and sediment produce the lowest IRC, while carbonate rock from the Jura Mountains and igneous rock produce the highest. Potential biases due to spatially unbalanced sampling of measurements were identified for several influencing factors. CONCLUSIONS: Significant associations were found between IRC and all variables under study. However, we showed that the spatial distribution of samples strongly affected the relevance of those associations. Therefore, future methods to estimate local radon hazard should take the multidimensionality of the process behind IRC into account.

Relevance: 30.00%

Publisher:

Abstract:

Objective: To determine the values of, and study the relationships among, central corneal thickness (CCT), intraocular pressure (IOP), and degree of myopia (DM) in an adult myopic population aged 20 to 40 years in Almeria (southeast Spain). To our knowledge, this is the first study of its kind in this region. Methods: An observational, descriptive, cross-sectional study was conducted on a sample of 310 myopic patients (620 eyes) aged 20 to 40 years, selected by gender- and age-stratified sampling proportionally fixed to the size of the population strata, assuming a 20% prevalence of myopia, a 5% margin of error, and a 95% confidence level. We studied IOP, CCT, and DM and their relationships by calculating, for each variable, the mean, standard deviation, 95% confidence interval for the mean, median, Fisher's asymmetry coefficient, range (maximum, minimum), and the Brown-Forsythe robust test. Results: In the adult myopic population of Almeria aged 20 to 40 years (mean age 29.8 years), the mean overall CCT was 550.12 μm. The corneas of men were thicker than those of women (P = 0.014). CCT was stable, as no significant differences in CCT values were seen across the 20- to 40-year-old subjects. The mean overall IOP was 13.60 mmHg. Men had a higher IOP than women (P = 0.002). Subjects over 30 years had a higher IOP (13.83) than those under 30 (13.38) (P = 0.04). The mean overall DM was −4.18 diopters. Men had less myopia than women (P < 0.001). Myopia was stable in the 20- to 40-year-old study population (P = 0.089). A linear relationship was found between CCT and IOP (R² = 0.152, P ≤ 0.001): CCT accounted for 15.2% of the variation in IOP. However, no linear relationship was found between DM and IOP, or between CCT and DM. Conclusions: CCT was similar to that reported in studies of other populations. IOP tends to increase after the age of 30, and this increase is not accounted for by alterations in CCT values.

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required, owing to the low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital, and a hypothetical screen for very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random sampling and VBS, with and without constraints on the size of the population to be screened. RESULTS: For sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce the sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimum sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the sample size required for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
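
The order of magnitude of these sample sizes follows from a standard Wald-type calculation; this simplification (not the authors' optimization) shows why rare events force the screened population to be very large under simple random sampling.

```python
import math

def srs_sample_size(sensitivity, half_width, prevalence, z=1.96):
    """Events that must be verified to estimate sensitivity with the
    given 95% CI half-width (Wald approximation), and the number of
    discharges to screen under simple random sampling so that the
    expected number of events is sufficient."""
    events = math.ceil(z * z * sensitivity * (1.0 - sensitivity)
                       / (half_width * half_width))
    return events, math.ceil(events / prevalence)
```

For instance, estimating a sensitivity near 0.5 to within ±0.05 needs a few hundred verified events; at a prevalence of 2%, that already means screening tens of thousands of discharges, which is the burden VBS is designed to reduce.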

Relevance: 30.00%

Publisher:

Abstract:

Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when people need help, they will assess the probability of getting it (DePaulo, 1982, cited in Flynn & Lake, 2008), will tend to estimate it as lower than the real chance, and so may not even consider it worth asking. Existing explanations for this phenomenon attribute it to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of saying "yes" while ignoring that the potential helper also has to take into account the social cost of saying "no". And the truth is that, especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that it may be more costly to refuse a help request than to accept it. A similar effect has been observed in estimates of people's trustworthiness. Fetchenhauer and Dunning (2010) showed that people tend to underestimate it as well. This bias is reduced when, instead of asymmetric feedback (feedback only when one decides to trust the other person), symmetric feedback (always given) is provided. The same cause could apply to help seeking, as people only receive feedback when they actually make a request, but not otherwise. Fazio, Shook, and Eiser (2004) studied something that could be reinforcing these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from "information gain being contingent on approach behavior" (p. 293), which can be identified with what Fetchenhauer and Dunning call asymmetric feedback, and hence also with help requests. Fazio et al. also found a generalization asymmetry in favor of negative attitudes over positive ones. They attributed it to a negativity bias that "weights resemblance to a known negative more heavily than resemblance to a positive" (p. 300). Applied to help-seeking scenarios, this means that when facing an unknown situation, people will tend to generalize and infer that a negative outcome is more likely than a positive one; so, together with the points above, people will be more inclined to think that they will get a "no" when requesting help. Denrell and Le Mens (2011) present a different perspective on judgment biases in general. They deviate from the classical inappropriate-information-processing account (described, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974) and explain such biases in terms of adaptive sampling. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the previously observed values of the variable of interest (Thompson, 2011). Sampling adaptively allows individuals to safeguard themselves against experiences that once turned out badly. However, it also prevents them from giving those experiences a second chance to produce an updated outcome that might be positive, more positive, or simply regress to the mean, in whatever direction that implies. That, as Denrell and Le Mens (2011) explained, makes sense: if you go to a restaurant and you do not like the food, you do not choose that restaurant again. This is what we think could be happening when asking for help: when we get a "no", we stop asking. Here we want to provide a complementary explanation, based on adaptive sampling, for the underestimation of the probability that others comply with our direct help requests. First, we develop and explain a model that represents the theory. We then test it empirically by means of experiments and elaborate on the analysis of the results.
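
The "hot stove" flavour of adaptive sampling can be illustrated with a toy simulation (all numbers invented): agents who stop asking after their first "no" end up with an average estimate of the compliance rate well below its true value, while agents who always complete a fixed number of requests estimate it without bias.

```python
import random

def mean_estimate(p_yes, n_agents=2000, max_asks=20,
                  adaptive=True, seed=0):
    """Average estimate of p_yes across agents.  Adaptive agents
    stop sampling after the first 'no'; non-adaptive agents always
    complete max_asks requests."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_agents):
        yes = no = 0
        for _ in range(max_asks):
            if rng.random() < p_yes:
                yes += 1
            else:
                no += 1
                if adaptive:
                    break        # a refusal ends sampling
        total += yes / (yes + no)
    return total / n_agents
```

The bias arises purely from the stopping rule, not from any distorted processing of the observations, which is exactly the point of the adaptive-sampling account.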

Relevance: 30.00%

Publisher:

Abstract:

Background: As imatinib pharmacokinetics are highly variable, plasma levels differ largely between patients under the same dosage. Retrospective studies in chronic myeloid leukemia (CML) patients showed significant correlations between low levels and suboptimal response, and between high levels and poor tolerability. Monitoring of plasma levels is thus increasingly advised, targeting trough concentrations of 1000 μg/L and above. Objectives: Our study was launched to assess the clinical usefulness of systematic imatinib TDM (therapeutic drug monitoring) in CML patients. The present preliminary evaluation questions the appropriateness of dosage adjustment following plasma level measurement to reach the recommended trough level, while allowing an interval of 4-24 h after the last drug intake for blood sampling. Methods: Initial blood samples from the first 9 patients in the intervention arm were obtained 4-25 h after the last dose. Based on a Bayesian approach using a population pharmacokinetic model, trough levels in 7 patients were predicted to be significantly away from the target (6 below 750 μg/L, and 1 above 1500 μg/L with poor tolerance). Individual dosage adjustments were undertaken in 5 patients, who had a control measurement 1-4 weeks after the dosage change. The predicted trough levels were compared with the earlier model-based extrapolations. Results: Before dosage adjustment, observed concentrations extrapolated to trough ranged from 359 to 1832 μg/L (median 710; mean 804; CV 53%) in the 9 patients. After dosage adjustment, they were expected to lie between 720 and 1090 μg/L (median 878; mean 872; CV 13%). Observed levels from the 5 recheck measurements, extrapolated to trough, actually ranged from 710 to 1069 μg/L (median 1015; mean 950; CV 16%), with absolute differences of 21 to 241 μg/L from the model-based predictions (median 175; mean 157; CV 52%). Differences between observed and predicted trough levels were larger when the interval between the last drug intake and sampling was very short (~4 h). Conclusion: These preliminary results suggest that TDM of imatinib using Bayesian interpretation can bring trough levels closer to 1000 μg/L (with the CV decreasing from 53% to 16%). While this may simplify blood collection in daily practice, as samples do not have to be drawn exactly at trough, the largest possible interval since the last drug intake remains preferable. This encourages evaluation of the clinical benefit of routine TDM intervention in CML patients, which is the aim of the randomized Swiss I-COME study.
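
The trough extrapolation itself can be sketched with a naive mono-exponential decay model; the study uses a Bayesian population-pharmacokinetic model instead, and the 18 h half-life below is a typical literature value for imatinib, assumed here purely for illustration.

```python
import math

def extrapolate_trough(c_obs, t_obs_h, tau_h=24.0, half_life_h=18.0):
    """Project a concentration (ug/L) observed t_obs_h hours after
    the last dose forward to the end of the tau_h dosing interval,
    assuming mono-exponential decline with the given half-life."""
    k = math.log(2.0) / half_life_h
    return c_obs * math.exp(-k * (tau_h - t_obs_h))
```

Under these assumptions, a sample drawn 6 h post-dose on once-daily dosing is one half-life away from the 24 h trough, so its extrapolated trough is half the observed value; a full population model additionally accounts for absorption, covariates, and residual variability.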