996 results for Chorionic Villus Sampling


Abstract:

The objective of the current study was to investigate the mechanism by which the corpus luteum (CL) of the monkey undergoes desensitization to luteinizing hormone following exposure to increasing concentrations of human chorionic gonadotrophin (hCG), as occurs in pregnancy. Female bonnet monkeys were injected intramuscularly with increasing doses of hCG or deglycosylated hCG (dghCG) beginning on day 6 or day 12 of the luteal phase, for 2, 4 or 10 days. The day of the oestrogen surge was taken as day 0 of the luteal phase. Luteal cells obtained from the CL of these animals were incubated with hCG (2 and 200 pg/ml) or dbcAMP (2.5, 25 and 100 μM) for 3 h at 37 degrees C, and the progesterone secreted was estimated. Corpora lutea of normally cycling monkeys on day 10, 16 or 22 of the luteal phase were used as controls. In addition, the in vivo responses to hCG and dghCG were assessed by determining serum steroid profiles following their administration. hCG (15-90 IU), but not dghCG (15-90 IU), significantly (P < 0.05) elevated serum progesterone and oestradiol levels in vivo. Serum progesterone, however, could not be maintained at an elevated level by continuous treatment with hCG (days 6-15); the progesterone level declined beyond day 13 of the luteal phase. Administering low doses of hCG (15-90 IU/day) from days 6-9, or high doses (600 IU/day) on days 8 and 9 of the luteal phase, resulted in a significant increase (about 10-fold over the corresponding incubated controls, P < 0.005) in the ability of luteal cells to synthesize progesterone in vitro. The luteal cells of the treated animals responded to dbcAMP (P < 0.05) but not to hCG added in vitro. The in vitro response of luteal cells to added hCG was inhibited by 0, 50 and 100% when the animals were injected with low (15-90 IU) or medium (100 IU) doses of dghCG between days 6 and 9 of the luteal phase, or with high doses (600 IU on days 8 and 9), respectively; such treatment had no effect on the responsiveness of the cells to dbcAMP. Luteal cell responsiveness to dbcAMP in vitro was also blocked when hCG was administered for 10 days beginning on day 6 of the luteal phase. Although short-term hCG treatment during the late luteal phase (days 12-15) had no effect on luteal function, 10-day treatment beginning on day 12 of the luteal phase restored in vitro responsiveness to both hCG (P < 0.05) and dbcAMP (P < 0.05), suggesting that luteal rescue can occur even at this late stage. In conclusion, desensitization of the CL to hCG appears to be governed by the dose of hCG/dghCG and the period of exposure. That desensitization is due to receptor occupancy is indicated by the facts that (i) it can be achieved by giving a larger dose of hCG over a 2-day period instead of a lower dose of the hormone over a longer (4- to 10-day) period, and (ii) the effect can largely be reproduced by using dghCG instead of hCG to block the receptor sites. It appears that, to achieve desensitization to dbcAMP as well, it is necessary to expose the luteal cells to a relatively high dose of hCG for more than 4 days.

Abstract:

We present a technique for an all-digital on-chip delay measurement system to measure the skews in a clock distribution network. It uses the principle of sub-sampling. Measurements from a prototype fabricated in a 65 nm industrial process indicate the ability to measure delays with a resolution of 0.5 ps and a DNL of 1.2 ps.
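
The abstract gives no implementation details, but the sub-sampling principle it relies on can be illustrated in simulation: sampling a periodic clock with a period slightly longer than its own stretches the waveform in equivalent time, so a picosecond-scale skew becomes a difference of whole sample counts. The clock period, skew and 0.5 ps step below are assumed values chosen only to mirror the quoted resolution; this is a sketch of the principle, not the authors' circuit.

```python
# Simulated illustration of sub-sampling: sample a clock with a period slightly
# longer than its own, so successive samples advance by dT through the clock
# period and a picosecond skew appears as a difference of whole sample counts.
# All values below are assumed for illustration, not taken from the paper.
import numpy as np

T = 1.0e-9               # clock period (1 GHz), assumed
dT = 0.5e-12             # sub-sampling step = measurement resolution
Ts = T + dT              # sampler period, slightly longer than the clock period
skew = 3.7e-12           # true skew between the two observed clock nodes
delay_a = 0.13e-12       # small common-mode delay (keeps edges off sample phases)
n = int(round(T / dT))   # samples needed to sweep one full clock period

t = np.arange(n) * Ts

def clk(t, delay):
    """Ideal 50% duty-cycle clock waveform observed at the given delay."""
    return (((t - delay) % T) < (T / 2)).astype(int)

a = clk(t, delay_a)          # reference node
b = clk(t, delay_a + skew)   # skewed node

# In equivalent time, sample k corresponds to phase k*dT within the clock period,
# so the edge-position difference (in samples) times dT is the measured skew.
edge_a = np.argmax(np.diff(a) == -1)   # falling edge of the reference waveform
edge_b = np.argmax(np.diff(b) == -1)   # falling edge of the skewed waveform
print("measured skew:", (edge_b - edge_a) * dT)   # ~3.5e-12 s, within one 0.5 ps step
```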

Abstract:

We consider estimating the total pollutant load from frequently measured flow data and less frequently measured concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and makes it impossible to assess trends or determine optimal sampling regimes. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the first flush, the location of the event on the hydrograph (e.g. rising or falling limb) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. This method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One data set, from the Burdekin River, consists of total suspended sediment (TSS), nitrogen oxides (NO(x)) and gauged flow for 1997. The other, from the Tully River, covers the period July 2000 to June 2008. For NO(x) in the Burdekin, the new estimates are very similar to the ratio estimates even when there is no relationship between concentration and flow. However, for the Tully data set, incorporating the additional predictive variables, namely the discounted flow and the flow phase (rising or receding), substantially improved the model fit and thus the certainty with which the load is estimated.
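
As a rough illustration of the rating-curve idea described above (not the authors' implementation), the sketch below regresses log concentration on log flow plus an assumed exponentially discounted flow term, predicts concentration over the full flow record, and integrates the load; all data, variable names and the form of the discounted-flow predictor are invented for illustration.

```python
# Sketch of a rating-curve load estimate with an assumed discounted-flow predictor.
# Data, names and model form are invented; this is not the authors' implementation.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly flow record (m^3/s) and 40 sparse concentration samples (mg/L)
flow = np.exp(rng.normal(2.0, 0.8, size=500))
sampled = rng.choice(500, size=40, replace=False)
conc = 5.0 * flow[sampled] ** 0.4 * np.exp(rng.normal(0.0, 0.2, size=40))

def discounted_flow(q, alpha=0.95):
    """Exponentially discounted cumulative flow: a crude proxy for exhaustion."""
    d = np.zeros_like(q)
    for t in range(1, len(q)):
        d[t] = alpha * d[t - 1] + q[t - 1]
    return d

dflow = discounted_flow(flow)

# Rating curve: log concentration on log flow and log discounted flow
X = np.column_stack([np.ones(40), np.log(flow[sampled]), np.log1p(dflow[sampled])])
y = np.log(conc)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict concentration over the whole record (with a log-normal bias correction)
# and integrate the load over time.
X_all = np.column_stack([np.ones(500), np.log(flow), np.log1p(dflow)])
resid_var = np.var(y - X @ beta)
conc_hat = np.exp(X_all @ beta + 0.5 * resid_var)          # mg/L
dt = 3600.0                                                 # seconds per time step
load_kg = np.sum(conc_hat * flow * dt) * 1e-3               # mg/L * m^3 = g -> kg
print(f"estimated load: {load_kg:.0f} kg")
```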

Abstract:

Sampling strategies based on the idea of ranked set sampling (RSS) are developed to increase efficiency and therefore reduce the cost of sampling in fishery research. RSS incorporates information on concomitant variables that are correlated with the variable of interest into the selection of samples. For example, estimating a monitoring survey abundance index would be more efficient if the sampling sites were selected using information from previous surveys or catch rates of the fishery. We use two practical fishery examples to demonstrate the approach: site selection for a fishery-independent monitoring survey in the Australian northern prawn fishery (NPF), and fish age prediction by simple linear regression modelling for a short-lived tropical clupeoid. The relative efficiencies of the new designs were derived analytically and compared with traditional simple random sampling (SRS). Optimal sampling schemes were identified under different optimality criteria. For the NPF monitoring survey, the efficiency, in terms of the variance or mean squared error of the estimated mean abundance index, ranged from 114 to 199% relative to SRS. In the case of a fish ageing study for Tenualosa ilisha in Bangladesh, the efficiency of age prediction from fish body weight reached 140%.
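
The core RSS idea, ranking cheap-to-observe units on a concomitant variable and measuring only the unit holding a target rank, can be sketched with simulated data as below; the population, its correlation with the auxiliary variable and the set size are assumptions, not the survey data used in the paper.

```python
# Sketch of RSS with a concomitant variable on a simulated population: each set
# is ranked by the auxiliary value only, and just the unit holding the target
# rank is measured. Population and correlation are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def rss_mean(y_pop, x_pop, set_size, cycles, rng):
    """One RSS estimate of the mean of y, ranking each set by the concomitant x."""
    measured = []
    for _ in range(cycles):
        for rank in range(set_size):
            idx = rng.choice(len(y_pop), size=set_size, replace=False)
            order = idx[np.argsort(x_pop[idx])]    # rank the set by x only
            measured.append(y_pop[order[rank]])    # measure the rank-th unit
    return np.mean(measured)

# Hypothetical population: abundance y correlated with an auxiliary x
n_pop = 2000
x = rng.normal(size=n_pop)
y = 10.0 + 2.0 * x + rng.normal(scale=1.5, size=n_pop)

set_size, cycles = 4, 5     # 20 measured units in either design
rss_est = [rss_mean(y, x, set_size, cycles, rng) for _ in range(2000)]
srs_est = [np.mean(rng.choice(y, size=set_size * cycles)) for _ in range(2000)]
print("relative efficiency (var SRS / var RSS):", np.var(srs_est) / np.var(rss_est))
```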

Abstract:

In treatment comparison experiments, the treatment responses are often correlated with some concomitant variables which can be measured before or at the beginning of the experiments. In this article, we propose schemes for the assignment of experimental units that may greatly improve the efficiency of the comparison in such situations. The proposed schemes are based on general ranked set sampling. The relative efficiency and cost-effectiveness of the proposed schemes are studied and compared. It is found that some proposed schemes are always more efficient than the traditional simple random assignment scheme when the total cost is the same. Numerical studies show promising results using the proposed schemes.
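
A hedged sketch of how a ranked-set assignment might work in this spirit (not necessarily the article's exact schemes): units are ranked within small sets on the pre-measured covariate and each treatment arm receives every within-set rank, which balances the arms on the covariate and reduces the variance of the estimated treatment difference relative to completely random assignment.

```python
# Sketch of a ranked-set assignment on simulated data: each treatment arm receives
# one unit of every within-set covariate rank, so the arms are balanced on the
# concomitant variable. Response model and effect size are assumptions.
import numpy as np

rng = np.random.default_rng(11)
n_pool = 2000
x = rng.normal(size=n_pool)          # covariate measured before the experiment

def response(unit, arm, rng):
    effect = 1.0 if arm == "A" else 0.0
    return 10.0 + 3.0 * x[unit] + effect + rng.normal(scale=1.0)

def ranked_set_estimate(cycles, rng):
    """A-minus-B estimate when each arm gets one unit of each within-set rank."""
    resp = {"A": [], "B": []}
    for _ in range(cycles):
        for rank in (0, 1):
            for arm in ("A", "B"):
                idx = rng.choice(n_pool, size=2, replace=False)
                unit = idx[np.argsort(x[idx])][rank]
                resp[arm].append(response(unit, arm, rng))
    return np.mean(resp["A"]) - np.mean(resp["B"])

def random_estimate(n_per_arm, rng):
    a = [response(u, "A", rng) for u in rng.choice(n_pool, size=n_per_arm, replace=False)]
    b = [response(u, "B", rng) for u in rng.choice(n_pool, size=n_per_arm, replace=False)]
    return np.mean(a) - np.mean(b)

cycles = 5                            # 10 units per arm in both designs
rss = [ranked_set_estimate(cycles, rng) for _ in range(2000)]
srs = [random_estimate(2 * cycles, rng) for _ in range(2000)]
print("relative efficiency (var random / var ranked-set):", np.var(srs) / np.var(rss))
```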

Abstract:

Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971) considered the optimal set size for ranked set sampling (RSS) with fixed operational costs. This framework can be very useful in practice for determining whether RSS is beneficial and for obtaining the optimal set size that minimizes the variance of the population estimator for a fixed total cost. In this article, we propose a scheme of general RSS in which more than one observation can be taken from each ranked set. This is shown to be more cost-effective in some cases when the cost of ranking is not negligible. Using the example in Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971), we demonstrate that taking two or more observations from each set, even with the optimal set size from the RSS design, can be more beneficial.
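
The cost trade-off can be sketched by simulation: if ranking a set carries a fixed cost, measuring two units per ranked set buys more measurements for the same budget than the standard one-per-set design. The population, costs and budget below are invented for illustration and do not reproduce the article's example.

```python
# Sketch of the cost trade-off on a simulated population: with a fixed budget and
# a non-trivial ranking cost per set, measuring two units per ranked set affords
# more measurements than measuring one. Costs, budget and population are assumed.
import numpy as np

rng = np.random.default_rng(5)
pop = rng.lognormal(mean=1.0, sigma=0.5, size=2000)     # hypothetical population
x = pop + rng.normal(scale=0.3, size=pop.size)          # ranking variable (with error)

set_size = 4
c_rank, c_meas, budget = 2.0, 1.0, 120.0                # assumed unit costs

def grss_mean(ranks_per_set, rng):
    """Mean estimate when each ranked set contributes the given number of ranks,
    cycled so every rank 1..set_size is measured equally often (keeps it unbiased)."""
    measured, cost, offset = [], 0.0, 0
    set_cost = c_rank + c_meas * len(ranks_per_set)
    while cost + set_cost <= budget:
        idx = rng.choice(pop.size, size=set_size, replace=False)
        order = idx[np.argsort(x[idx])]
        for r in ranks_per_set:
            measured.append(pop[order[(r + offset) % set_size]])
        offset = (offset + len(ranks_per_set)) % set_size
        cost += set_cost
    return np.mean(measured)

standard = [grss_mean([0], rng) for _ in range(1000)]      # one measurement per set
general = [grss_mean([0, 1], rng) for _ in range(1000)]    # two measurements per set
print("variance, standard RSS:", np.var(standard))
print("variance, general RSS :", np.var(general))
```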

Abstract:

A new technique, the reef resource inventory (RRI), was developed to map the distribution and abundance of benthos and substratum on reefs. This rapid field sampling technique uses divers to visually estimate the percentage cover of categories of benthos and substratum along 2 x 20 m plotless strip-transects positioned randomly over the tops of reefs and systematically along their edges. The purpose of this study was to compare the relative sampling accuracy of the RRI against the line intercept transect technique (LIT), an international standard for sampling reef benthos and substratum. Analysis of paired sampling with LIT and RRI at 51 sites indicated that sampling accuracy was not different (P > 0.05) for 8 of the 12 benthos and substratum categories used in the study. Significant differences were attributed to: small-scale patchiness and cryptic coloration of some benthos; effects associated with sampling a sparsely distributed animal along a line versus over an area; difficulties in discriminating some of the benthos and substratum categories; and differences in visual acuity, since LIT measurements were taken by divers close to the seabed whereas RRI measurements were taken by divers higher in the water column. The relative cost efficiency of the RRI technique was at least three times that of LIT for all benthos and substratum categories, and as much as 10 times higher for two categories. These results suggest that the RRI can be used to obtain reliable and accurate estimates of the relative abundance of broad categories of reef benthos and substratum.

Abstract:

This article is motivated by a lung cancer study in which a regression model is involved and the response variable is too expensive to measure, but the predictor variable can be measured easily at relatively negligible cost. This situation occurs quite often in medical studies, quantitative genetics, and ecological and environmental studies. In this article, using the idea of ranked-set sampling (RSS), we develop sampling strategies that can reduce cost and increase the efficiency of regression analysis in the above-mentioned situation. The developed method is applied retrospectively to a lung cancer study in which the interest is to investigate the association between smoking status and three biomarkers: polyphenol DNA adducts, micronuclei, and sister chromatid exchanges. Optimal sampling schemes with different optimality criteria, such as A-, D-, and integrated mean square error (IMSE)-optimality, are considered in the application. With a set size of 10 in RSS, the improvement of the optimal schemes over simple random sampling (SRS) is substantial. For instance, using the optimal scheme with IMSE-optimality, the IMSEs of the estimated regression functions for the three biomarkers are reduced to about half of those incurred using SRS.
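
A simulated sketch of the general idea (not the study's optimal schemes): because only the predictor is cheap, candidates are ranked on it and the expensive response is measured for selected ranks; measuring the extreme ranks spreads the design points and tightens the regression fit relative to SRS.

```python
# Sketch on simulated data: rank 10 candidates on the cheap predictor and measure
# the expensive response only for the extreme ranks, which spreads the design
# points and tightens the regression fit relative to SRS. This illustrative
# allocation is not the article's optimal scheme.
import numpy as np

rng = np.random.default_rng(2)

def respond(xs, rng):
    """Hypothetical biomarker model: linear in the predictor plus noise."""
    return 2.0 + 0.8 * xs + rng.normal(scale=1.0, size=len(xs))

def slope(xs, ys):
    return np.polyfit(xs, ys, 1)[0]

def rss_extreme_design(n_measured, set_size, rng):
    """Measure the response only for the lowest or highest ranked unit of each set."""
    xs = []
    for i in range(n_measured):
        candidates = rng.normal(size=set_size)       # cheap predictor values
        xs.append(candidates.min() if i % 2 == 0 else candidates.max())
    xs = np.array(xs)
    return slope(xs, respond(xs, rng))

def srs_design(n_measured, rng):
    xs = rng.normal(size=n_measured)
    return slope(xs, respond(xs, rng))

n = 30
rss_slopes = [rss_extreme_design(n, 10, rng) for _ in range(2000)]
srs_slopes = [srs_design(n, rng) for _ in range(2000)]
print("slope variance, SRS:", np.var(srs_slopes))
print("slope variance, RSS:", np.var(rss_slopes))
```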

Abstract:

The efficiency with which a small beam trawl (1 x 0.5 m mouth) sampled postlarvae and juveniles of the tiger prawns Penaeus esculentus and P. semisulcatus at night was estimated in 3 tropical seagrass communities (dominated by Thalassia hemprichii, Syringodium isoetifolium and Enhalus acoroides, respectively) in the shallow waters of the Gulf of Carpentaria in northern Australia. An area of seagrass (40 x 3 m) was enclosed by a net and the beam trawl was repeatedly hand-hauled over the substrate. Net efficiency (q) was calculated using 4 methods: the unweighted Leslie, weighted Leslie, DeLury and maximum-likelihood (ML) methods. The maximum-likelihood method is preferred for estimating efficiency because it makes the fewest assumptions and is not affected by zero catches. The major difference in net efficiency was between postlarvae (mean ML q +/- 95% confidence limits = 0.66 +/- 0.16) and juveniles of both species (mean q for juveniles in water less than or equal to 1.0 m deep = 0.47 +/- 0.05); i.e. the beam trawl was more efficient at capturing postlarvae than juveniles. There was little difference in net efficiency for P. esculentus between seagrass types (T. hemprichii versus S. isoetifolium), even though the biomass and morphology of the seagrass in these communities differed greatly (biomasses were 54 and 204 g m(-2), respectively). The efficiency of the net appeared to be the same for juveniles of the 2 species in shallow water, but was lower (0.35 +/- 0.08) for juvenile P. semisulcatus at high tide when the water was deeper (1.6 to 1.9 m). The lower efficiency near the time of high tide is possibly because the prawns are more active at high tide than at low tide, and can also escape above the net. Factors affecting net efficiency and alternative methods of estimating net efficiency are discussed.
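
Of the four estimators named above, the Leslie method is the simplest to illustrate: catch per haul declines linearly with the cumulative prior catch, so a regression recovers the efficiency q and the initial population. The sketch below uses simulated depletion data with an assumed q, not the field measurements.

```python
# Sketch of the Leslie depletion estimator on simulated hauls: catch per haul
# declines linearly with the cumulative prior catch, the slope estimates q and
# the intercept/slope ratio estimates the initial number. Values are assumed.
import numpy as np

rng = np.random.default_rng(4)
N0, q_true, hauls = 200, 0.5, 6       # assumed initial number and true efficiency

remaining, catches = N0, []
for _ in range(hauls):                # repeated hand-hauls over the enclosed plot
    c = rng.binomial(remaining, q_true)
    catches.append(c)
    remaining -= c

catches = np.array(catches)
prior_cum = np.concatenate(([0], np.cumsum(catches)[:-1]))    # K_i before haul i

# Leslie regression: C_i = q*N0 - q*K_i
slope, intercept = np.polyfit(prior_cum, catches, 1)
q_hat = -slope
print(f"estimated efficiency q = {q_hat:.2f}, initial number = {intercept / q_hat:.0f}")
```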

Abstract:

Traditional comparisons between the capture efficiency of sampling devices have generally looked at the absolute differences between devices. We recommend that the signal-to-noise ratio be used instead when comparing the capture efficiency of benthic sampling devices. Using the signal-to-noise ratio rather than the absolute difference has several advantages: the variance is taken into account when determining how important the difference is; the hypothesis and minimum detectable difference can be made identical for all taxa; the measure is independent of the units of measurement; and the sample-size calculation is independent of the variance. This new technique is illustrated by comparing the capture efficiency of a 0.05 m(2) van Veen grab and an airlift suction device, using samples taken from Heron and One Tree lagoons, Australia.
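
A brief sketch of the recommended comparison on hypothetical count data: the device difference is expressed as a signal-to-noise ratio (difference in means over the pooled standard deviation), and the sample-size formula then depends only on the target ratio, significance and power, not on the taxon's variance. The counts, target ratio and power below are assumptions.

```python
# Sketch of the signal-to-noise comparison on hypothetical counts: the device
# difference in mean abundance divided by the pooled standard deviation, and the
# sample size needed to detect a target ratio, which is the same for every taxon.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
grab = rng.poisson(12, size=30)       # counts of one taxon from the van Veen grab
airlift = rng.poisson(15, size=30)    # counts of the same taxon from the airlift

pooled_sd = np.sqrt((np.var(grab, ddof=1) + np.var(airlift, ddof=1)) / 2)
snr = (airlift.mean() - grab.mean()) / pooled_sd
print(f"signal-to-noise ratio: {snr:.2f}")

# Samples per device to detect a signal-to-noise ratio of delta at 5% significance
# and 80% power (assumed targets); the variance has dropped out of the formula.
delta, alpha, power = 1.0, 0.05, 0.80
n = 2 * (norm.ppf(1 - alpha / 2) + norm.ppf(power)) ** 2 / delta ** 2
print(f"samples per device: {int(np.ceil(n))}")
```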

Abstract:

Two-spotted mite, Tetranychus urticae Koch, was until recently regarded as a minor and infrequent pest of papaya in Queensland through the dry late winter/early summer months. The situation has changed over the past 4-5 years, so that now some growers consider spider mites significant pests all year round. This altered pest status corresponded with a substantial increase in the use of fungicides to control black spot (Asperisporium caricae). A project was initiated in 1998 to examine the potential reasons for escalating mite problems in commercially-grown papaya, which included regular sampling over a 2 year period for mites, mite damage and beneficial arthropods on a number of farms on the wet tropical coast and drier Atherton Tableland. Differences in soil type, papaya variety, chemical use and some agronomic practices were included in this assessment. Monthly visits were made to each site where 20 randomly-selected plants from each of 2 papaya lines (yellow and red types) were surveyed. Three leaves were selected from each plant, one from each of the bottom, middle and top strata of leaves. The numbers of mobile predators were recorded, along with visual estimates of the percentage and age of mite damage on each leaf. Leaves were then sprayed with hairspray to fix the mites and immature predators to the leaf surface. Four leaf disks, 25 mm in diameter, were then punched from each leaf into a 50 ml storage container with a purpose-built disk-cutting tool. Disks from each leaf position were separated by tissue paper, within the container. On return to the laboratory, each leaf disk was scrutinised under a binocular microscope to determine the numbers of two-spotted mites and eggs, predatory mites and eggs, and the immature stages of predatory insects (mainly Stethorus, Halmus and lacewings). A total of 2160 leaf disks have been examined each month. All data have been entered into an Access database to facilitate comparisons between sites.

Abstract:

Experimental cattle are often restrained for repeated blood collection and faecal sampling and may baulk at entering the crush, possibly from learning that crush entry is followed by an unpleasant experience. We asked whether repeated sampling affects temperament. One measure of temperament is flight speed, which is the time, measured electronically, for an animal to cover a set distance on release from a weighing crate (Burrow et al. 1988). 22nd Biennial Conference.

Abstract:

Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as the population of models (POM) approach, allows this exploration to take place by building a mathematical model with multiple parameter sets calibrated against experimental data. However, finding such sets within the high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation of the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency with SMC and that it produces responses similar to those of LHS when making out-of-sample predictions in the presence of a simulated drug block.
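
A minimal sketch of the POM construction via the LHS route mentioned above (the SMC algorithm itself is more involved): candidate parameter sets are drawn with a Latin hypercube and accepted if the model output falls within experimental ranges. The two-parameter toy model standing in for the Beeler-Reuter model, the bounds, the acceptance range and the drug-block scaling are all assumptions for illustration.

```python
# Sketch of a population of models built by LHS plus rejection against assumed
# experimental ranges; a two-parameter toy model stands in for Beeler-Reuter.
import numpy as np
from scipy.stats import qmc

def toy_model(g_k, g_ca):
    """Toy surrogate returning a pretend action-potential duration (ms)."""
    return 300.0 * g_ca / (g_k + 0.5)

# Latin hypercube sample of candidate parameter sets within assumed bounds
sampler = qmc.LatinHypercube(d=2, seed=42)
params = qmc.scale(sampler.random(n=5000), l_bounds=[0.5, 0.5], u_bounds=[2.0, 2.0])

# Accept parameter sets whose output lies inside an assumed experimental range
apd = np.array([toy_model(gk, gca) for gk, gca in params])
population = params[(apd > 250.0) & (apd < 350.0)]
print(f"accepted {len(population)} of {len(params)} candidate models")

# A simulated drug block scales one conductance; the accepted population then
# gives a spread of responses reflecting between-model variability.
blocked = np.array([toy_model(gk, 0.5 * gca) for gk, gca in population])
print(f"APD under 50% block: {blocked.mean():.0f} +/- {blocked.std():.0f} ms")
```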

Abstract:

Data on seasonal population abundance of Bemisia tabaci biotype B (silverleaf whitefly (SLW)) in Australian cotton fields collected over four consecutive growing seasons (2002/2003-2005/2006) were used to develop and validate a multiple-threshold-based management and sampling plan. Non-linear growth trajectories estimated from the field sampling data were used as benchmarks to classify adult SLW field populations into six density-based management zones with associated control recommendations in the context of peak flowering and open boll crop growth stages. Control options based on application of insect growth regulators (IGRs) are recommended for high-density populations (>2 adults/leaf) whereas conventional (non-IGR) products are recommended for the control of low to moderate population densities. A computerised re-sampling program was used to develop and test a binomial sampling plan. Binomial models with thresholds of T=1, 2 and 3 adults/leaf were tested using the field abundance data. A binomial plan based on a tally threshold of T=2 adults/leaf and a minimum sample of 20 leaves at nodes 3, 4 or 5 below the terminal is recommended as the most parsimonious and practical sampling protocol for Australian cotton fields. A decision support guide with management zone boundaries expressed as binomial counts and control options appropriate for various SLW density situations is presented. Appropriate use of chemical insecticides and tactics for successful field control of whiteflies are discussed.
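
The binomial (presence-absence) decision rule can be sketched as follows on simulated counts: each of 20 leaves is scored only as above or below the tally threshold of 2 adults per leaf, and the proportion of infested leaves drives the decision. The negative binomial count model, its overdispersion and the 50% action cutoff are assumptions for illustration, not the paper's fitted models.

```python
# Sketch of a tally-threshold binomial decision on simulated counts: 20 leaves are
# scored only as above/below 2 adults per leaf and the proportion of infested
# leaves drives the decision. Count model and action cutoff are assumptions.
import numpy as np

rng = np.random.default_rng(3)

def classify_field(mean_density, n_leaves=20, tally_threshold=2, action_prop=0.5, k=1.5):
    """Sample n_leaves, score each against the tally threshold, and decide."""
    p = k / (k + mean_density)                       # negative binomial with the given mean
    counts = rng.negative_binomial(k, p, size=n_leaves)
    prop_infested = np.mean(counts > tally_threshold)
    return "treat" if prop_infested > action_prop else "no action"

for density in (0.5, 2.0, 5.0):
    print(f"mean {density} adults/leaf -> {classify_field(density)}")
```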