869 results for Interval sampling


Relevance: 20.00%

Abstract:

Study Objective: To evaluate the diagnostic accuracy of transvaginal ultrasound and office hysteroscopy in differentiating endometrial polyps from endometrial adenocarcinoma. Design: A prospective longitudinal study of 100 women, aged 24 to 80 years, submitted to hysteroscopic polypectomy (n = 80) or to surgery for endometrial adenocarcinoma (n = 20) from January 2010 to December 2011. Clinical, ultrasonographic, and hysteroscopic parameters were analyzed and compared with the histopathologic findings. Statistical analyses were performed using the Tukey, Kruskal-Wallis, Dunn, and Mann-Whitney tests, with a 95% confidence interval and p < 0.05 considered statistically significant. Setting: Botucatu Medical School. Intervention: Clinical, ultrasonographic, and hysteroscopic parameters were analyzed prospectively in patients with a suspected diagnosis of endometrial polyp or endometrial adenocarcinoma. According to the diagnosis, hysteroscopic polypectomy or panhysterectomy with lymph node sampling was performed. After surgery and histopathological study, the parameters were analyzed statistically and the results were compared between groups. The study was approved by the Research Ethics Committee. Measurements and Main Results: There were no differences in age, BMI, menopausal status, hormone therapy use, or associated diseases among the groups. The main symptom of endometrial cancer was postmenopausal bleeding, affecting 84.2% of women versus 34.8% in the polypectomy group. The majority of women with endometrial polyps were asymptomatic. Transvaginal ultrasonography was unable to differentiate endometrial cancer from endometrial polyps on the basis of endometrial thickness or blood flow on color Doppler. Office hysteroscopy showed significant changes in 75% of the adenocarcinoma cases, especially the presence of diffuse hypervascularity with atypical vessels.
Conclusion: Clinical parameters and ultrasound imaging still cannot reliably differentiate endometrial polyps from cancer of the endometrium. Attention should be given to hysteroscopic exams showing diffuse endometrial hypervascularization with architectural distortion of the vessels. The recommendation of our service remains the systematic removal of all endometrial polyps.

Relevance: 20.00%

Abstract:

Graduate Program in Collective Health (Saúde Coletiva) - FMB

Relevance: 20.00%

Abstract:

The conditioned rewarding effects of novelty compete with those of cocaine for control over choice behavior in a place conditioning task. The purpose of the present study was to use multiple doses of cocaine to determine the extent of this competition and to determine whether novelty's impact on cocaine reward was maintained over an abstinence period. In Experiment 1, rats were conditioned with cocaine (7.5, 20, or 30 mg/kg ip) to prefer one side of an unbiased place conditioning apparatus relative to the other. In a subsequent phase, all rats received alternating daily confinements to the previously cocaine-paired and unpaired sides of the apparatus. During this phase, half the rats had access to a novel object on their initially unpaired side; the remaining rats did not receive objects. The ability of novelty to compete with cocaine in drug-free and cocaine-challenge tests was sensitive to cocaine dose. In Experiment 2, a place preference was established with 10 mg/kg cocaine and testing occurred after 1-, 14-, or 28-day retention intervals. Findings indicate that choice behaviors mediated by cocaine conditioning are reduced with the passage of time. Taken together, competition between cocaine and novelty conditioned rewards is sensitive to drug dose and retention interval.

Relevance: 20.00%

Abstract:

Killer whale (Orcinus orca Linnaeus, 1758) abundance in the North Pacific is known only for a few populations for which extensive longitudinal data are available, with little quantitative data from more remote regions. Line-transect ship surveys were conducted in July and August of 2001–2003 in coastal waters of the western Gulf of Alaska and the Aleutian Islands. Conventional and Multiple Covariate Distance Sampling methods were used to estimate the abundance of different killer whale ecotypes, which were distinguished based upon morphological and genetic data. Abundance was calculated separately for two data sets that differed in the method by which killer whale group size data were obtained. Initial group size (IGS) data corresponded to estimates of group size at the time of first sighting, and post-encounter group size (PEGS) corresponded to estimates made after closely approaching sighted groups.

Relevance: 20.00%

Abstract:

Classical sampling methods can be used to estimate the mean of a finite or infinite population. Block kriging also estimates the mean, but of an infinite population in a continuous spatial domain. In this paper, I consider a finite population version of block kriging (FPBK) for plot-based sampling. The data are assumed to come from a spatial stochastic process. Minimizing mean-squared-prediction errors yields best linear unbiased predictions that are a finite population version of block kriging. FPBK has versions comparable to simple random sampling and stratified sampling, and includes the general linear model. This method has been tested for several years for moose surveys in Alaska, and an example is given where results are compared to stratified random sampling. In general, assuming a spatial model gives three main advantages over classical sampling: (1) FPBK is usually more precise than simple or stratified random sampling, (2) FPBK allows small area estimation, and (3) FPBK allows nonrandom sampling designs.
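
The prediction step described above (minimizing mean-squared prediction error over the unsampled plots and adding the observed values) can be sketched numerically. The code below is a minimal illustration, not the authors' implementation: the plot coordinates, exponential covariance model, and all parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical finite population of N plots; assumed exponential covariance
N, n = 100, 30
coords = rng.uniform(0, 10, size=(N, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
sigma2, range_par = 1.0, 3.0
Sigma = sigma2 * np.exp(-dist / range_par)

# Simulate the finite population and sample n plots at random
z = rng.multivariate_normal(np.full(N, 5.0), Sigma)
s = rng.choice(N, size=n, replace=False)          # sampled plot indices
u = np.setdiff1d(np.arange(N), s)                 # unsampled plot indices

# GLS estimate of the mean from the sampled plots
Sss_inv = np.linalg.inv(Sigma[np.ix_(s, s)])
one = np.ones(n)
mu_hat = (one @ Sss_inv @ z[s]) / (one @ Sss_inv @ one)

# Kriging prediction of the unsampled plots, then of the population total:
# observed values are kept as-is, only the unobserved part is predicted
z_u_hat = mu_hat + Sigma[np.ix_(u, s)] @ Sss_inv @ (z[s] - mu_hat)
total_hat = z[s].sum() + z_u_hat.sum()
print(total_hat, z.sum())
```

Because the population is finite, the predictor mixes exact observed values with kriged predictions, which is what distinguishes FPBK from ordinary block kriging over a continuous domain.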

Relevance: 20.00%

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat-modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. 
In step with theoretical developments, state-of-the-art software that implements these methods is described, making them accessible to practicing ecologists.
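
As a rough illustration of the core calculation behind the conventional distance sampling engine (this is not code from the Distance software itself), a half-normal detection function can be fitted to perpendicular distances by maximum likelihood and converted into a density estimate. The transect length, detection scale, and simulated data below are invented for the sketch.

```python
import numpy as np

def halfnormal_density_estimate(distances, total_line_length):
    """Conventional distance sampling with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)), assuming certain
    detection on the line (g(0) = 1).
    """
    distances = np.asarray(distances, dtype=float)
    n = distances.size
    # Closed-form MLE of sigma^2 for the half-normal model
    sigma2 = np.sum(distances**2) / n
    # Effective strip half-width: integral of g(x) from 0 to infinity
    esw = np.sqrt(np.pi * sigma2 / 2.0)
    # Density: n objects in an effective strip of width 2 * esw
    return n / (2.0 * total_line_length * esw)

rng = np.random.default_rng(0)
# Simulated perpendicular distances from a half-normal with sigma = 10 m
d = np.abs(rng.normal(0.0, 10.0, size=200))
print(halfnormal_density_estimate(d, total_line_length=5000.0))
```

The multiple-covariate and mark-recapture engines generalize this same calculation by letting sigma depend on covariates and by relaxing g(0) = 1, respectively.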

Relevance: 20.00%

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and analysis of the Dubbo weed data set. In addition, a simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike’s information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
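
The thinned-point-process view of distance sampling data can be illustrated with a small simulation: a complete Poisson process is generated in a strip around the transect, and each point survives with probability given by a detection function. The strip dimensions, intensity, and half-normal detection scale below are arbitrary choices for the sketch, not values from the Dubbo analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Region: strip of half-width w around a transect line at x = 0
w, length = 50.0, 1000.0            # metres (hypothetical)
intensity = 0.01                    # expected objects per square metre
sigma = 15.0                        # half-normal detection scale

# 1. Simulate the complete (unobserved) homogeneous Poisson process
n_total = rng.poisson(intensity * 2 * w * length)
x = rng.uniform(-w, w, n_total)     # perpendicular offsets from the line
y = rng.uniform(0, length, n_total)

# 2. Thin the process: each object is detected with probability g(|x|)
g = np.exp(-x**2 / (2 * sigma**2))
detected = rng.uniform(size=n_total) < g

print(n_total, detected.sum())
```

In the model-based approach, the likelihood of the detected pattern is that of a thinned spatial Poisson process, so intensity parameters (possibly depending on covariates) and detection parameters can be estimated jointly.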

Relevance: 20.00%

Abstract:

"How large a sample is needed to survey the bird damage to corn in a county in Ohio or New Jersey or South Dakota?" Like those in the Bureau of Sport Fisheries and Wildlife and the U.S.D.A. who have been faced with a question of this sort, we found only meager information on which to base an answer, whether the problem related to a county in Ohio, New Jersey, or elsewhere. Many sampling methods and rates of sampling did yield reliable estimates, but the judgment was often intuitive or based on the reasonableness of the resulting data. Later, when planning the next study or survey, little additional information was available on whether 40 samples of 5 ears each or 5 samples of 200 ears should be examined, i.e., a large number of small samples or a small number of large samples. What information is needed to make a reliable decision? Those of us involved with the Agricultural Experiment Station regional project concerned with the problems of bird damage to crops, known as NE-49, thought we might supply an answer if we had a corn field in which all the damage was measured. If all the damage were known, we could then sample this field in various ways, compare the estimates from these samplings to the actual damage, and pinpoint the best and most accurate sampling procedure. Eventually investigators in four states became involved in this work, and instead of one field we were able to broaden the geographical base by examining all the corn ears in 2 half-acre sections of fields in each state, 8 sections in all. When the corn had matured well past the dough stage, damage on each corn ear was assessed, without removing the ear from the stalk, by visually estimating the percent of the kernel surface which had been destroyed and rating it in one of 5 damage categories.
Measurements (by row-centimeters) of the rows of kernels pecked by birds were also made on selected ears representing all categories and all parts of each field section. These measurements provided conversion factors that, when fed into a computer, were applied to the more than 72,000 visually assessed ears. The machine now had in its memory, and could supply on demand, a map showing each ear, its location, and the intensity of the damage.
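
The "many small samples versus few large samples" question posed above can be explored by simulation once a fully measured field exists. The sketch below uses a synthetic field with patchy damage; all numbers are invented, and it deliberately compares the two schemes at equal total effort (40 samples of 5 ears versus 5 samples of 40 ears, 200 ears either way) rather than the unequal efforts quoted in the text.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical field: 72,000 ears with patchy (clustered) damage.
n_ears = 72_000
damage = rng.gamma(0.5, 2.0, n_ears)           # baseline percent damage
patches = rng.choice(n_ears, size=5, replace=False)
for p in patches:                              # contiguous heavily damaged runs
    span = min(2000, n_ears - p)
    damage[p:p + span] += rng.gamma(2.0, 5.0, span)
true_mean = damage.mean()

def estimate(n_samples, ears_per_sample, reps=2000):
    """Mean-damage estimates from n_samples contiguous runs of ears."""
    ests = np.empty(reps)
    for r in range(reps):
        starts = rng.integers(0, n_ears - ears_per_sample, n_samples)
        ests[r] = np.mean([damage[s:s + ears_per_sample].mean()
                           for s in starts])
    return ests

many_small = estimate(40, 5)    # 40 samples of 5 ears  = 200 ears
few_large = estimate(5, 40)     # 5 samples of 40 ears  = 200 ears
print(true_mean, many_small.std(), few_large.std())
```

With damage clustered in patches, many dispersed small samples average over more independent locations than a few large ones, so their estimates spread less around the true mean; this is the kind of comparison the fully mapped field makes possible.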

Relevance: 20.00%

Abstract:

Graduate Program in Collective Health (Saúde Coletiva) - FMB

Relevance: 20.00%

Abstract:

Contamination by butyltin compounds (BTs) has been reported in estuarine environments worldwide, with serious impacts on the biota of these areas. Considering that BTs can be degraded by varying environmental conditions such as incident light and salinity, short-term variations in such factors may lead to inaccurate estimates of BT concentrations in nature. Therefore, the present study aimed to evaluate whether measurements of BTs in estuarine sediments are influenced by different sampling conditions, including period of the day (day or night), tidal zone (intertidal or subtidal), and tide (high or low). The study area is located on the Brazilian southeastern coast, in the Sao Vicente Estuary at Pescadores Beach, where BT contamination was previously detected. Three replicate samples of surface sediment were collected randomly in each combination of period of the day, tidal zone, and tide condition from three subareas along the beach, totaling 72 samples. BTs were analyzed by GC-PFPD using a tin filter and a VF-5 column, by means of a validated method. The concentrations of tributyltin (TBT), dibutyltin (DBT), and monobutyltin (MBT) ranged from undetectable to 161 ng Sn g(-1) (d.w.). In most samples (71%), only MBT was quantifiable, whereas TBT was measurable in only 14 samples, suggesting either old contamination or rapid degradation processes. DBT was found in 27 samples but could be quantified in only one. MBT concentrations did not differ significantly with period of the day, tidal zone, or tide condition. DBT and TBT could not be compared across all these environmental conditions because only a few samples were above the quantification limit. Pooled TBT samples did not reveal any difference between day and night. These results indicate that, in assessing contamination by butyltin compounds, surface-sediment samples can be collected under any of these environmental conditions. However, the wide variation of BT concentrations in the study area, i.e., over a very small geographic scale, illustrates the need for representative hierarchical and composite sampling designs compatible with the multiscalar temporal and spatial variability common to most marine systems. Such sampling designs will be necessary for future attempts to quantitatively evaluate and monitor the occurrence and impact of these compounds in nature.

Relevance: 20.00%

Abstract:

Electrical impedance tomography (EIT) is an imaging technique that attempts to reconstruct the impedance distribution inside an object from the impedance measured between electrodes placed on the object's surface. The EIT reconstruction problem can be approached as a nonlinear, nonconvex optimization problem in which one tries to maximize the match between a simulated impedance problem and the observed data. This nonlinear optimization problem is often ill-posed and poorly suited to methods that evaluate derivatives of the objective function. It may be approached by simulated annealing (SA), but at a large computational cost, because evaluating the objective function requires a full simulation of the impedance problem at each iteration. A variation of SA is proposed in which the objective function is evaluated only partially, while ensuring bounds on the behavior of the modified algorithm.
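
The partial-evaluation idea can be illustrated on a toy problem. The sketch below replaces the expensive EIT forward simulation with cheap quadratic per-measurement terms, so it shows only the mechanics of subset-based acceptance inside simulated annealing, not the proposed algorithm itself; all settings (term count, step size, cooling rate) are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for the EIT misfit: a sum of many per-measurement terms.
# In EIT each term would require a forward simulation; here each term
# is a cheap quadratic so the sketch stays self-contained.
target = rng.uniform(-1, 1, size=50)

def term(x, i):                      # misfit contribution of term i
    return (x[i] - target[i]) ** 2

def partial_cost(x, idx):            # evaluate only a subset of terms,
    return len(target) * np.mean([term(x, i) for i in idx])  # rescaled

x = np.zeros_like(target)
temperature = 1.0
for step in range(20_000):
    cand = x.copy()
    j = rng.integers(len(x))
    cand[j] += rng.normal(0, 0.1)
    # Partial evaluation: score the move on a random subset of terms,
    # always including the coordinate that changed.
    idx = np.append(rng.choice(len(target), size=9, replace=False), j)
    delta = partial_cost(cand, idx) - partial_cost(x, idx)
    if delta < 0 or rng.uniform() < np.exp(-delta / temperature):
        x = cand
    temperature *= 0.9997            # geometric cooling schedule

full_cost = sum(term(x, i) for i in range(len(target)))
print(full_cost)
```

The appeal of the scheme is that each acceptance decision costs a fraction of a full objective evaluation; the challenge, addressed in the paper, is bounding how the noisy partial estimates change the algorithm's behavior.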

Relevance: 20.00%

Abstract:

Purpose: To compare two modalities of exercise training (Endurance Training [ET] and High-Intensity Interval Training [HIT]) on health-related parameters in obese children aged between 8 and 12 years. Methods: Thirty obese children were randomly allocated to either the ET or the HIT group. The ET group performed 30 to 60 minutes of continuous exercise at 80% of the peak heart rate (HR). The HIT group performed 3 to 6 sets of 60-s sprints at 100% of the peak velocity interspersed with 3-min active recovery periods at 50% of the exercise velocity. HIT sessions lasted approximately 70% less than ET sessions. At baseline and after 12 weeks of intervention, aerobic fitness, body composition, and metabolic parameters were assessed. Results: Both the absolute (ET: 26.0%; HIT: 19.0%) and the relative VO2 peak (ET: 13.1%; HIT: 14.6%) increased significantly in both groups after the intervention. Additionally, the total exercise time (ET: 19.5%; HIT: 16.4%) and the peak velocity during the maximal graded cardiorespiratory test (ET: 16.9%; HIT: 13.4%) improved significantly across interventions. Insulinemia (ET: 29.4%; HIT: 30.5%) and the HOMA index (ET: 42.8%; HIT: 37.0%) were significantly lower in both groups at POST compared with PRE. Body mass was significantly reduced in the HIT group (2.6%), but not in the ET group (1.2%). A significant reduction in BMI was observed in both groups after the intervention (ET: 3.0%; HIT: 5.0%). The responsiveness analysis revealed a very similar pattern of the most responsive variables between groups. Conclusion: HIT and ET were equally effective in improving important health-related parameters in obese youth.

Relevance: 20.00%

Abstract:

Background: We aimed to investigate the effect of the rest interval between successive contractions on muscular fatigue. Methods: Eighteen subjects performed elbow flexion and extension (30 repetitions) on an isokinetic dynamometer through 80 degrees of range of motion. The flexion velocity was 120 degrees/s, while for elbow extension we used 5 different velocities (30, 75, 120, 240, and 360 degrees/s), producing 5 different rest intervals (2.89, 1.28, 0.85, 0.57, and 0.54 s). Results: When the rest interval was 2.89 s, fatigue was reduced; when it was 0.54 s, fatigue was increased. Conclusions: With the shortest rest interval (0.54 s), the decline of work in the flexor muscle group was greater than with the longer rest intervals.

Relevance: 20.00%

Abstract:

Recently, research has shown that the performance of metaheuristics can be affected by population initialization. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution algorithm through population initialization and different search strategies. In a different approach to achieve similar results, this paper presents a technique to discover promising regions in the continuous search space of an optimization problem. Using machine-learning techniques, the algorithm, named Smart Sampling (SS), finds regions with a high probability of containing a global optimum. A metaheuristic can then be initialized inside each region to find that optimum. SS and DE were combined (originating the SSDE algorithm) to evaluate our approach, and experiments were conducted on the same set of benchmark functions used by the ODE, QODE, and UQODE authors. Results show that the total number of function evaluations required by DE to reach the global optimum can be significantly reduced, and that the success rate improves, if SS is employed first. These results are also in consonance with the literature on the importance of an adequate starting population. Moreover, SS is more effective than the other three algorithms that employ oppositional learning at finding initial populations of superior quality. Finally, and most importantly, the performance of SS in finding promising regions is independent of the metaheuristic with which it is combined, making SS suitable for improving the performance of a large variety of optimization techniques.
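
A much-simplified stand-in for the region-discovery step might look like the sketch below. It uses order statistics (keep the best fraction of a uniform sample and box them in) rather than the machine-learning techniques of Smart Sampling proper, and the test function, shift vector, and all settings are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical test problem: shifted sphere with optimum at `opt`
dim, opt = 2, np.array([1.5, -2.0])
def f(X):
    return np.sum((X - opt) ** 2, axis=-1)

# 1. Evaluate a uniform sample of the full search space
lo, hi = -5.0, 5.0
X = rng.uniform(lo, hi, size=(2000, dim))
y = f(X)

# 2. Keep the best fraction and box them in: the "promising region"
best = X[np.argsort(y)[:200]]
region_lo, region_hi = best.min(axis=0), best.max(axis=0)
print(region_lo, region_hi)

# 3. A metaheuristic such as differential evolution would now be
#    initialized inside [region_lo, region_hi] instead of [lo, hi].
```

The point of the design, as in the paper, is that the discovery step is decoupled from the metaheuristic: any population-based optimizer can be seeded inside the discovered region.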

Relevance: 20.00%

Abstract:

Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are objects of an average-cost-per-item minimization. Adjustments are based on the location of the measurement of a quality characteristic of interest in one of three decision zones. Numerical examples illustrate the proposal.
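
The three-zone decision rule can be sketched as a simple classifier. The zone limits and action names below are hypothetical placeholders: in the paper the limits come out of the average-cost minimization, and the actions attached to each zone follow its integrated control-and-maintenance policy.

```python
def adjustment_decision(x, central, warning):
    """Classify a quality-characteristic measurement into one of three
    decision zones (hypothetical symmetric limits about a target of 0):
      - inside the central zone: keep producing;
      - in the intermediate zone: keep monitoring the process;
      - beyond the warning limit: stop and apply corrective adjustment.
    """
    if abs(x) <= central:
        return "continue"
    if abs(x) <= warning:
        return "monitor"
    return "adjust"

print(adjustment_decision(0.3, central=1.0, warning=2.0))  # prints "continue"
```

In the integrated model, the costs of sampling, false alarms, and corrective maintenance all feed into choosing `central` and `warning` so that the long-run average cost per item is minimized.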