157 results for "catch rate"


Relevance: 20.00%

Abstract:

Objective: To compare measurements of sleeping metabolic rate (SMR) in infancy with predicted basal metabolic rate (BMR) estimated by the equations of Schofield. Methods: Some 104 serial measurements of SMR by indirect calorimetry were performed in 43 healthy infants at 1.5, 3, 6, 9 and 12 months of age. Predicted BMR was calculated using the weight-only (BMR-wo) and weight-and-height (BMR-wh) equations of Schofield for 0-3-y-olds. Measured SMR values were compared with both predictive values by means of the Bland-Altman statistical test. Results: The mean measured SMR was 1.48 MJ/day. The mean predicted BMR values were 1.66 and 1.47 MJ/day for the weight-only and weight-and-height equations, respectively. The Bland-Altman analysis showed that the BMR-wo equation on average overestimated SMR by 0.18 MJ/day (11%) and the BMR-wh equation underestimated SMR by 0.01 MJ/day (1%). However, the 95% limits of agreement were wide: -0.64 to +0.28 MJ/day (28%) for the former equation and -0.39 to +0.41 MJ/day (27%) for the latter. Moreover, there was a significant correlation between the mean of the measured and predicted metabolic rates and the difference between them. Conclusions: The wide variation in the difference between measured and predicted metabolic rate, and the bias, which probably varies with age, indicate that actual metabolic rate needs to be measured for individual clinical care in this age group.
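A minimal sketch of the Bland-Altman comparison described above is given below, assuming paired arrays of measured SMR and Schofield-predicted BMR values; the numbers are illustrative, not the study data.

```python
# Minimal Bland-Altman sketch: bias and 95% limits of agreement between
# measured SMR and predicted BMR (hypothetical example values, MJ/day).
import numpy as np

def bland_altman(measured, predicted):
    """Return mean bias, 95% limits of agreement and mean-difference correlation."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    diff = measured - predicted          # per-infant difference (MJ/day)
    mean = (measured + predicted) / 2.0  # per-infant average (MJ/day)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)
    # Correlation between the pairwise mean and the difference; a non-zero
    # value indicates proportional bias, as reported in the abstract.
    r = np.corrcoef(mean, diff)[0, 1]
    return bias, loa, r

# Illustrative (made-up) values in MJ/day:
smr = [1.32, 1.45, 1.58, 1.41, 1.60, 1.52]
bmr_wh = [1.30, 1.50, 1.49, 1.44, 1.55, 1.47]
bias, loa, r = bland_altman(smr, bmr_wh)
print(f"bias={bias:.3f} MJ/day, 95% LoA=({loa[0]:.3f}, {loa[1]:.3f}), r={r:.2f}")
```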

Relevance: 20.00%

Abstract:

Background: Unlike other indicators of cardiac function, such as ejection fraction and transmitral early diastolic velocity, myocardial strain shows promise for capturing subtle alterations that result from early diseases of the myocardium. To extract the left ventricle (LV) myocardial strain and strain rate from cardiac cine-MRI, a modified hierarchical transformation model was proposed. Methods: A hierarchical transformation model comprising global and local LV deformations was employed to analyse the strain and strain rate of the left ventricle by cine-MRI image registration. Endocardial and epicardial contour information was introduced to enhance registration accuracy by combining the original hierarchical algorithm with an Iterative Closest Points using Invariant Features algorithm. The hierarchical model was first validated on a normal volunteer and then applied to two clinical cases (the same volunteer and a diabetic patient) to evaluate their respective cardiac function. Results: In the two clinical cases, comparing the displacement fields of two selected landmarks in the normal volunteer showed that the proposed method performed better than the original, unmodified model. Meanwhile, the comparison of radial strain between the volunteer and the patient demonstrated their apparent functional difference. Conclusions: The present method can be used to estimate LV myocardial strain and strain rate during a cardiac cycle and thus to quantify LV motion function.
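The registration pipeline itself is not reproduced here, but as a hedged sketch, the code below shows how Green-Lagrange strain can be computed once registration has produced a displacement field on a regular 2-D grid; the variable names and grid spacing are assumptions.

```python
# Sketch: Green-Lagrange strain from a 2D displacement field on a regular grid.
# This is not the registration method itself; it only illustrates the strain
# computation step once image registration has produced displacements.
import numpy as np

def green_lagrange_strain(ux, uy, dx=1.0, dy=1.0):
    """ux, uy: 2D arrays of x- and y-displacement on a grid with spacing dx, dy.
    Returns the Exx, Eyy, Exy components of the Green-Lagrange strain tensor."""
    # Displacement gradients (np.gradient returns d/d(row), d/d(column)).
    dux_dy, dux_dx = np.gradient(ux, dy, dx)
    duy_dy, duy_dx = np.gradient(uy, dy, dx)
    # Deformation gradient F = I + grad(u); E = 0.5 * (F^T F - I).
    Exx = dux_dx + 0.5 * (dux_dx**2 + duy_dx**2)
    Eyy = duy_dy + 0.5 * (dux_dy**2 + duy_dy**2)
    Exy = 0.5 * (dux_dy + duy_dx + dux_dx * dux_dy + duy_dx * duy_dy)
    return Exx, Eyy, Exy

# Strain rate can then be approximated by differencing the strain between
# consecutive cine frames and dividing by the frame interval.
```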

Relevance: 20.00%

Abstract:

In this paper, we present a new approach for velocity vector imaging and time-resolved measurement of strain rates in the wall of human arteries using MRI, and we demonstrate its feasibility with two examples: in vitro on a phantom and in vivo on the carotid artery of a human subject. The results highlight the promising potential of this approach for investigating the mechanics of arterial tissues in vivo.

Relevance: 20.00%

Abstract:

Commercial environments may receive only a fraction of the genetic gains for growth rate predicted from the selection environment. This fraction results from undesirable genotype-by-environment interactions (G x E) and is measured by the genetic correlation (r(g)) of growth between environments. Genetic correlations are notoriously difficult to estimate with precision within a single generation. A new design is proposed in which genetic correlations can be estimated in one generation by utilising artificial mating from cryopreserved semen and unfertilised eggs stripped from a single female. We compare a traditional phenotype analysis of growth to a threshold model in which only the largest fish are genotyped for sire identification. The threshold model was robust to family mortality rates differing by up to 30%. The design is unique in that it negates potential re-ranking of families caused by an interaction between common maternal environmental effects and the growing environment. The design is suitable for rapid assessment of G x E over one generation, with a true genetic correlation of 0.70 yielding standard errors as low as 0.07. Different design scenarios were tested for bias and accuracy across a range of heritability values, numbers of half-sib families created, numbers of progeny within each full-sib family, numbers of fish genotyped, numbers of fish stocked, differing family survival rates and various simulated genetic correlation levels.
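As a hedged illustration of the simulation approach, the sketch below simulates a balanced paternal half-sib experiment grown in two environments and estimates the genetic correlation from cross-environment sire covariances; the family structure, heritability and sample sizes are assumed and are simpler than the proposed design.

```python
# Minimal sketch (not the paper's design): simulate a balanced paternal
# half-sib experiment in two environments and estimate the genetic
# correlation of growth between them.
import numpy as np

rng = np.random.default_rng(1)

def simulate_rg_estimate(n_sires=100, n_per_env=30, h2=0.3, rg_true=0.7):
    # Sire breeding values in the two environments, with additive genetic
    # variance h2 (phenotypic variance scaled to 1 in each environment).
    cov = h2 * np.array([[1.0, rg_true], [rg_true, 1.0]])
    bv = rng.multivariate_normal([0.0, 0.0], cov, size=n_sires)
    # Half-sib progeny receive half the sire breeding value; the remainder
    # is Mendelian sampling plus environmental noise.
    resid_sd = np.sqrt(1.0 - 0.25 * h2)
    y1 = 0.5 * bv[:, 0][:, None] + rng.normal(0, resid_sd, (n_sires, n_per_env))
    y2 = 0.5 * bv[:, 1][:, None] + rng.normal(0, resid_sd, (n_sires, n_per_env))
    m1, m2 = y1.mean(axis=1), y2.mean(axis=1)

    def sire_var(y):
        # One-way ANOVA estimate of the sire variance component.
        between = y.mean(axis=1).var(ddof=1)
        within = y.var(axis=1, ddof=1).mean()
        return between - within / y.shape[1]

    s1, s2 = sire_var(y1), sire_var(y2)
    # Covariance of sire family means across environments estimates the
    # cross-environment sire covariance directly (residuals are independent).
    s12 = np.cov(m1, m2, ddof=1)[0, 1]
    return s12 / np.sqrt(s1 * s2)

est = [simulate_rg_estimate() for _ in range(200)]
print(f"mean r_g estimate: {np.mean(est):.2f}, SE: {np.std(est, ddof=1):.2f}")
```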

Relevance: 20.00%

Abstract:

Gulland's [Gulland, J.A., 1965. Estimation of mortality rates. Annex to Arctic Fisheries Working Group Report (meeting in Hamburg, January 1965). ICES. C.M. 1965, Doc. No. 3 (mimeographed)] virtual population analysis (VPA) is commonly used for studying the dynamics of harvested fish populations. However, it requires solving a nonlinear equation for the instantaneous rate of fishing mortality of the fish in a population. Pope [Pope, J.G., 1972. An investigation of the accuracy of Virtual Population Analysis using cohort analysis. ICNAF Res. Bull. 9, 65-74. Also available in D.H. Cushing (ed.) (1983), Key Papers on Fish Populations, p. 291-301, IRL Press, Oxford, 405 p.] eliminated this necessity in his cohort analysis by approximating its underlying age- and time-dependent population model. His approximation has since become one of the most commonly used age- and time-dependent fish population models in fisheries science. However, some of its properties are not well understood. For example, many assert that it describes the dynamics of a fish population from which the catch is taken instantaneously in the middle of the year. Such an assertion has never been proven, nor has its implied instantaneous rate of fishing mortality of the fish of a particular age at a particular time been examined, nor has its implied catch equation been derived from a general catch equation. In this paper, we prove this assertion, examine its implied instantaneous rate of fishing mortality of the fish of a particular age at a particular time, derive its implied catch equation from a general catch equation, and comment on how to structure an age- and time-dependent population model to ensure its internal consistency. This work shows that Gulland's (1965) virtual population analysis and Pope's (1972) cohort analysis lie at opposite ends of a continuous spectrum of a general model for a seasonally occurring fishery; that Pope's (1972) approximation implies an infinitely large instantaneous rate of fishing mortality of the fish of a particular age at a particular time in a fishing season of zero length; and that its implied catch equation has an undefined instantaneous rate of fishing mortality of the fish in a population, but a well-defined cumulative instantaneous rate of fishing mortality of the fish in the population. This work also highlights the need for a more careful treatment of the times of the start and end of a fishing season in fish population models.
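The contrast between the exact VPA back-calculation and Pope's mid-year approximation can be illustrated with a short numerical sketch; the catch, abundance and mortality values below are illustrative only, not drawn from any fishery.

```python
# Sketch contrasting Pope's (1972) cohort-analysis approximation with the
# exact (Gulland-style) VPA back-calculation, which requires solving the
# Baranov catch equation numerically for F. Illustrative values only.
import numpy as np
from scipy.optimize import brentq

M = 0.2        # instantaneous natural mortality rate (per year)
C_t = 800.0    # catch-at-age during year t (numbers)
N_t1 = 2000.0  # cohort abundance at the start of year t+1

# Pope's approximation: the year's catch is taken instantaneously mid-year.
N_t_pope = N_t1 * np.exp(M) + C_t * np.exp(M / 2.0)

# Exact back-calculation: find F such that the Baranov catch equation and
# exponential survival are simultaneously satisfied.
def catch_residual(F):
    Z = F + M
    N_t = N_t1 * np.exp(Z)                         # abundance at start of year t
    predicted_catch = N_t * (F / Z) * (1.0 - np.exp(-Z))
    return predicted_catch - C_t

F_exact = brentq(catch_residual, 1e-8, 5.0)
N_t_exact = N_t1 * np.exp(F_exact + M)

print(f"Pope approximation: N_t = {N_t_pope:.1f}")
print(f"Exact VPA solution: N_t = {N_t_exact:.1f}  (F = {F_exact:.3f})")
```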

Relevance: 20.00%

Abstract:

So far, most Phase II trials have been designed and analysed under a frequentist framework. Under this framework, a trial is designed so that the overall Type I and Type II errors of the trial are controlled at some desired levels. Recently, a number of articles have advocated the use of Bayesian designs in practice. Under a Bayesian framework, a trial is designed so that it stops when the posterior probability of treatment efficacy is within certain prespecified thresholds. In this article, we argue that trials under a Bayesian framework can also be designed to control frequentist error rates. We introduce a Bayesian version of Simon's well-known two-stage design to achieve this goal. We also consider two other errors, which we call Bayesian errors in this article because of their similarities to posterior probabilities. We show that our method can also control these Bayesian-type errors. We compare our method with other recent Bayesian designs in a numerical study and discuss the implications of the different designs for error rates. An example of a clinical trial for patients with nasopharyngeal carcinoma is used to illustrate the differences among the designs.
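As a hedged illustration of the frequentist operating characteristics discussed above, the sketch below computes the Type I error and power of a generic Simon-style two-stage design from binomial probabilities; the design parameters and response rates are illustrative and this is not the Bayesian design proposed in the article.

```python
# Sketch: frequentist operating characteristics (Type I error and power) of a
# generic Simon-style two-stage design, computed from binomial probabilities.
# The design parameters below are illustrative, not taken from the article.
from scipy.stats import binom

def two_stage_oc(p, r1, n1, r, n):
    """P(reject H0) for response rate p in a two-stage design:
    stop for futility after stage 1 if responses <= r1 (out of n1);
    otherwise continue to n patients and reject H0 if total responses > r."""
    prob_reject = 0.0
    for x1 in range(r1 + 1, n1 + 1):                 # continue to stage 2
        p_x1 = binom.pmf(x1, n1, p)
        # need more than (r - x1) responses among the remaining n - n1 patients
        p_stage2 = binom.sf(r - x1, n - n1, p)
        prob_reject += p_x1 * p_stage2
    return prob_reject

# Illustrative design: r1=3, n1=19, r=12, n=54, with p0=0.20 and p1=0.35.
p0, p1 = 0.20, 0.35
alpha = two_stage_oc(p0, 3, 19, 12, 54)   # Type I error under H0
power = two_stage_oc(p1, 3, 19, 12, 54)   # power under H1
print(f"Type I error = {alpha:.3f}, power = {power:.3f}")
```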

Relevance: 20.00%

Abstract:

The aims of this study were to investigate outcome and to evaluate areas of potential ongoing concern after orthotopic liver transplantation (OLT) in children. Actuarial survival in relation to age and degree of undernutrition at the time of OLT was evaluated in 53 children (age 0.58-14.2 years) undergoing OLT for end-stage liver disease. Follow-up studies of growth and quality of life were undertaken in those with a minimum follow-up period of 12 months (n = 26). The overall 3-year actuarial survival was 70%. Survival rates did not differ between age groups (actuarial 2-year survival for ages <1, 1-5 and >5 years was 70, 70 and 69% respectively) but did differ according to nutritional status at OLT (actuarial 2-year survival for children with Z scores for weight <-1 was 57%, and >-1 was 95%; P = 0.004). Significant catch-up weight gain was observed by 18 months post-transplant, while height improved less rapidly. Quality of life (assessed by Vineland Adaptive Behaviour Scales incorporating socialization, daily living skills, communication and motor skills) was good (mean composite score 91 ± 19). All school-aged children except one were attending normal school. Two children had mild to moderate intellectual handicap related to post-operative intracerebral complications. Satisfactory long-term survival can be achieved after OLT in children regardless of age, but the importance of pre-operative nutrition is emphasized. Survivors have an excellent chance of a good quality of life and of satisfactory catch-up weight gain and growth.

Relevance: 20.00%

Abstract:

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of such factors as the biological characteristics of the animals, some aspects of the fleet dynamics, and the changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the outcomes of the standardised fishing effort or the relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions from simpler statistical models. The random-effects models also yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from different models differed, suggesting that different methods have different statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet and the fleet dynamics.
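A minimal sketch of how the four model classes might be fitted to catch-and-effort records is given below, using the statsmodels library on simulated data; the column names (cpue, year, vessel, hull_units) and the data themselves are hypothetical, not the NPF data analysed in the study.

```python
# Sketch: fitting the four model classes compared in the abstract to
# hypothetical catch-and-effort records.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "year": rng.integers(1990, 2000, n).astype(str),
    "vessel": rng.integers(1, 40, n).astype(str),
    "hull_units": rng.normal(50, 10, n),   # stand-in vessel covariate
})
df["cpue"] = rng.lognormal(mean=0.01 * df["hull_units"].to_numpy(), sigma=0.5)
df["log_cpue"] = np.log(df["cpue"])

# 1. Linear model on log(CPUE) with year effects and a vessel covariate.
lm = smf.ols("log_cpue ~ C(year) + hull_units", data=df).fit()

# 2. Mixed model: vessel as a random effect.
mm = smf.mixedlm("log_cpue ~ C(year) + hull_units", data=df,
                 groups=df["vessel"]).fit()

# 3. GEE: vessel as the grouping unit with an exchangeable correlation.
gee = smf.gee("log_cpue ~ C(year) + hull_units", groups="vessel", data=df,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()

# 4. GLM with a log link on the original CPUE scale.
glm = smf.glm("cpue ~ C(year) + hull_units", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()

# The estimated year effects serve as the relative abundance index; the point
# of the comparison is how much they (and their standard errors) differ.
print(lm.params.filter(like="year").head())
```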

Relevance: 20.00%

Abstract:

Although subsampling is a common method for describing the composition of large and diverse trawl catches, the accuracy of these techniques is often unknown. We determined the sampling errors generated when estimating the percentage of the total number of species recorded in catches, as well as the abundance of each species, at each increase in the proportion of the catch sorted. We completely partitioned twenty prawn trawl catches from tropical northern Australia into subsamples of about 10 kg each. All subsamples were then sorted, and species numbers recorded. Catch weights ranged from 71 to 445 kg; the number of fish species per trawl ranged from 60 to 138, and the number of invertebrate species from 18 to 63. Almost 70% of the species recorded in catches were "rare" in subsamples (less than one individual per 10 kg subsample, or less than one in every 389 individuals). A matrix was used to show the increase in the total number of species recorded in each catch as the percentage of the catch sorted increased. Simulation modelling showed that sorting small subsamples (about 10% of catch weight) identified about 50% of the total number of species caught in a trawl. Larger subsamples (50% of catch weight on average) identified about 80% of the total species caught in a trawl. The accuracy of estimating the abundance of each species also increased with increasing subsample size. For the "rare" species, sampling error was around 80% after sorting 10% of catch weight and just under 50% after 40% of catch weight had been sorted. For the "abundant" species (five or more individuals per 10 kg subsample, or five or more in every 389 individuals), sampling error was around 25% after sorting 10% of catch weight, but was reduced to around 10% after 40% of catch weight had been sorted.
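The kind of simulation modelling described above can be sketched as a small Monte Carlo exercise; the species-abundance composition below is hypothetical and the detection curve it produces is only illustrative.

```python
# Sketch: Monte Carlo subsampling of a sorted catch, assuming a hypothetical
# species-abundance list; it estimates how the fraction of species detected
# grows with the proportion of the catch that is sorted.
import numpy as np

rng = np.random.default_rng(42)

def species_detected(abundances, n_subsamples, frac_sorted):
    """abundances: individuals per species in the whole catch.
    Individuals are spread at random over equal-weight subsample units, and
    species found in the first `frac_sorted` fraction of units are counted."""
    n_sorted = max(1, int(round(frac_sorted * n_subsamples)))
    detected = 0
    for count in abundances:
        # Each individual lands in one subsample unit at random; the species
        # is detected if any of its individuals falls in a sorted unit.
        units = rng.integers(0, n_subsamples, size=count)
        if np.any(units < n_sorted):
            detected += 1
    return detected / len(abundances)

# Hypothetical catch: many rare species, a few abundant ones.
abundances = np.concatenate([
    rng.integers(1, 4, size=70),      # "rare" species
    rng.integers(20, 200, size=30),   # more abundant species
])
for frac in (0.1, 0.25, 0.5, 1.0):
    props = [species_detected(abundances, n_subsamples=30, frac_sorted=frac)
             for _ in range(50)]
    print(f"sorted {frac:>4.0%} of catch -> "
          f"{np.mean(props):.0%} of species detected on average")
```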

Relevance: 20.00%

Abstract:

Subsampling is a common method for estimating the abundance of species in trawl catches. However, the accuracy of subsampling in representing the total catch has not been assessed. To estimate one possible source of bias due to subsampling, we tested whether the position on trawler sorting trays from which subsamples were taken affected their ability to represent species in catches. This was done by sorting catches into 10 kg subsamples and comparing subsamples taken from different positions on the sorting tray. Comparisons were made after species were grouped into three categories of abundance: 'rare', 'common' or 'abundant'. A generalised linear model analysis showed that taking subsamples from different positions on the sorting tray had no major effect on estimates of the total numbers or weights of fish or invertebrates, or the total number of fish or invertebrate taxa, recorded in each position. Some individual taxa showed differences between positions on the sorting tray (11.5% of taxa in a three-position design; 25% in a five-position design), but consistent and meaningful patterns in the position of these taxa on the sorting tray could be seen only for the ponyfish Leiognathus moretoniensis and the saucer scallop Amusium pleuronectes. Because most bycatch taxa are well mixed throughout the catch, subsamples can be taken from any position on trawler sorting trays without introducing bias.

Relevance: 20.00%

Abstract:

A simple stochastic model of a fish population subject to natural and fishing mortalities is described. The fishing effort is assumed to vary over different periods but to be constant within each period. A maximum-likelihood approach is developed for estimating natural mortality (M) and the catchability coefficient (q) simultaneously from catch-and-effort data. If there is not enough contrast in the data to provide reliable estimates of both M and q, as is often the case in practice, the method can be used to obtain the best possible values of q for a range of possible values of M. These techniques are illustrated with tiger prawn (Penaeus semisulcatus) data from the Northern Prawn Fishery of Australia.
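A minimal sketch of the profiling idea, obtaining a best-fitting q for each of several fixed values of M, is given below under simplifying assumptions (a deterministic population, lognormal catch errors, and a fixed initial population size N0); the catch and effort series are made up, not the tiger prawn data.

```python
# Sketch: profile-style estimation of the catchability coefficient q for a
# range of fixed natural mortality values M, from yearly catch and effort.
import numpy as np
from scipy.optimize import minimize_scalar

effort = np.array([3.0, 4.5, 5.0, 6.5, 7.0, 6.0, 5.5])      # fishing effort
catch = np.array([310.0, 390.0, 370.0, 400.0, 360.0, 290.0, 250.0])
N0 = 3000.0   # assumed initial population size (held fixed for simplicity)

def neg_log_lik(q, M):
    """Concentrated lognormal negative log-likelihood of the catches,
    up to additive constants, given q and M."""
    if q <= 0:
        return np.inf
    N = N0
    resid = []
    for E, C in zip(effort, catch):
        F = q * E
        Z = F + M
        C_pred = N * (F / Z) * (1.0 - np.exp(-Z))   # Baranov catch equation
        resid.append(np.log(C) - np.log(C_pred))
        N = N * np.exp(-Z)                          # survivors to next period
    resid = np.asarray(resid)
    sigma2 = np.mean(resid**2)
    return 0.5 * len(resid) * np.log(sigma2)

for M in (0.2, 0.4, 0.6):
    fit = minimize_scalar(neg_log_lik, bounds=(1e-4, 1.0), args=(M,),
                          method="bounded")
    print(f"M = {M:.1f}: q_hat = {fit.x:.3f}")
```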

Relevance: 20.00%

Abstract:

Records of shrimp growth and water quality made during 12 crops from each of 48 ponds, over a period of 6.5 years, were provided by a commercial shrimp farm in Queensland, Australia. These data were analysed with a new growth model derived from the Gompertz model. The results indicate that water temperature, mortality and pond age significantly affect growth rates. After 180 days, shrimp reach 34 g at a constant 30 degrees C, but only 15 g after the same amount of time at 20 degrees C. Mortality, by thinning the density of shrimp in the ponds, increased the growth rate, but the effect was small. With continual production, growth rates at first remained steady, then appeared to decrease for the sixth and seventh crops, after which they increased steadily with each crop. It appears that conservative pond management, together with a gradual improvement in husbandry techniques, particularly feed management, brought about this change. This has encouraging implications for the long-term sustainability of the farming methods used. The growth model can be used to predict the productivity, and hence profitability, of new aquaculture locations or new production strategies.
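For illustration, the sketch below fits a standard Gompertz curve to hypothetical weight-at-age data; the paper's model is a modified Gompertz incorporating temperature, mortality and pond-age effects, which is not reproduced here.

```python
# Sketch: fitting a standard Gompertz growth curve to shrimp weight-at-age
# data with scipy. Data and starting values are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, w_inf, b, k):
    """Weight (g) at age t (days): w_inf * exp(-b * exp(-k * t))."""
    return w_inf * np.exp(-b * np.exp(-k * t))

# Made-up weight-at-age observations (days, grams).
t_obs = np.array([30, 60, 90, 120, 150, 180], dtype=float)
w_obs = np.array([2.1, 7.5, 15.0, 23.0, 29.5, 33.5])

popt, pcov = curve_fit(gompertz, t_obs, w_obs, p0=[35.0, 4.0, 0.02])
w_inf, b, k = popt
print(f"w_inf = {w_inf:.1f} g, b = {b:.2f}, k = {k:.4f} per day")
print(f"predicted weight at 180 days: {gompertz(180.0, *popt):.1f} g")
```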

Relevance: 20.00%

Abstract:

This report describes the development and simulation of a variable rate controller for a 6-degree-of-freedom nonlinear model. The variable rate simulation model represents an off-the-shelf autopilot. Flight experiments involve risk and can be expensive; therefore, a dynamic model is important for understanding the performance characteristics of the UAS in mission simulation before an actual flight test and for obtaining the parameters needed for flight. The control and guidance are implemented in Simulink. The report tests the use of the model for air-search and air-sampling path planning. A GUI is presented in which a set of mission scenarios can be defined and two experts (a mission expert, i.e. air sampling or air search, and a UAV expert) interact, showing the benefits of the method.
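The abstract does not detail the controller structure, so the sketch below shows only a generic angular-rate (inner-loop) proportional controller wrapped around the rotational part of a rigid-body model; the inertia values and gains are assumed for illustration and are not the report's design.

```python
# Sketch: a generic angular-rate controller around rigid-body rotational
# dynamics. Inertia, gains and commands are illustrative assumptions only.
import numpy as np

I = np.diag([0.02, 0.03, 0.04])          # inertia matrix (kg m^2), assumed
I_inv = np.linalg.inv(I)
Kp = np.array([0.4, 0.5, 0.3])           # proportional gains per body axis

def rate_controller(omega_cmd, omega):
    """Commanded body torque from the angular-rate error (P control)."""
    return Kp * (omega_cmd - omega)

def omega_dot(omega, torque):
    """Euler's rotational equations: I*domega/dt = torque - omega x (I*omega)."""
    return I_inv @ (torque - np.cross(omega, I @ omega))

# Simulate a step in commanded roll rate with simple forward-Euler integration.
dt, t_end = 0.002, 2.0
omega = np.zeros(3)                       # body rates p, q, r (rad/s)
omega_cmd = np.array([0.5, 0.0, 0.0])     # commanded rates (rad/s)
for _ in range(int(t_end / dt)):
    torque = rate_controller(omega_cmd, omega)
    omega = omega + omega_dot(omega, torque) * dt

print(f"body rates after {t_end} s: p={omega[0]:.3f}, q={omega[1]:.3f}, "
      f"r={omega[2]:.3f} rad/s")
```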

Relevance: 20.00%

Abstract:

Background: Standard methods for quantifying IncuCyte ZOOM™ assays involve measurements that quantify how rapidly the initially vacant area becomes re-colonised with cells as a function of time. Unfortunately, these measurements give no insight into the details of the cellular-level mechanisms acting to close the initially vacant area. We provide an alternative method enabling us to quantify the roles of cell motility and cell proliferation separately. To achieve this we calibrate standard data available from IncuCyte ZOOM™ images to the solution of the Fisher-Kolmogorov model. Results: The Fisher-Kolmogorov model is a reaction-diffusion equation that has been used to describe collective cell spreading driven by cell migration, characterised by a cell diffusivity, D, and carrying-capacity-limited proliferation with proliferation rate, λ, and carrying capacity density, K. By analysing temporal changes in cell density in several subregions located well behind the initial position of the leading edge we estimate λ and K. Given these estimates, we then apply automatic leading edge detection algorithms to the images produced by the IncuCyte ZOOM™ assay and match these data with a numerical solution of the Fisher-Kolmogorov equation to provide an estimate of D. We demonstrate this method by applying it to interpret a suite of IncuCyte ZOOM™ assays using PC-3 prostate cancer cells and obtain estimates of D, λ and K. Comparing estimates of D, λ and K for a control assay with estimates for assays where epidermal growth factor (EGF) is applied in varying concentrations confirms that EGF enhances the rate of scratch closure and that this stimulation is driven by an increase in D and λ, whereas K is relatively unaffected by EGF. Conclusions: Our approach for estimating D, λ and K from an IncuCyte ZOOM™ assay provides more detail about cellular-level behaviour than standard methods for analysing these assays. In particular, our approach can be used to quantify the balance of cell migration and cell proliferation and, as we demonstrate, allows us to quantify how the addition of growth factors affects these processes individually.
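A minimal sketch of the two estimation steps is given below: a logistic fit for λ and K from density data well behind the leading edge, followed by an explicit finite-difference solution of the 1-D Fisher-Kolmogorov equation for a trial value of D; all values and the initial condition are illustrative, not IncuCyte ZOOM™ data.

```python
# Sketch: (i) estimate lambda and K by fitting logistic growth to cell density
# well behind the leading edge; (ii) solve the 1-D Fisher-Kolmogorov equation
# dC/dt = D d2C/dx2 + lam*C*(1 - C/K) with an explicit finite-difference scheme.
import numpy as np
from scipy.optimize import curve_fit

# (i) logistic fit for lambda and K from density-versus-time data.
def logistic(t, c0, lam, K):
    return K * c0 * np.exp(lam * t) / (K + c0 * (np.exp(lam * t) - 1.0))

t_obs = np.array([0.0, 12.0, 24.0, 36.0, 48.0])          # hours
c_obs = np.array([4e-4, 7e-4, 1.1e-3, 1.4e-3, 1.6e-3])   # density (cells/um^2)
(c0, lam, K), _ = curve_fit(logistic, t_obs, c_obs, p0=[4e-4, 0.05, 1.7e-3])

# (ii) explicit scheme for a trial diffusivity D.
D = 500.0                      # um^2/h, assumed trial value
dx, dt = 20.0, 0.05            # grid spacing (um) and time step (h)
assert dt <= dx**2 / (2.0 * D), "explicit scheme stability condition"
x = np.arange(0.0, 2000.0 + dx, dx)
C = np.where(x < 500.0, K, 0.0)          # cells initially occupy x < 500 um

for _ in range(int(48.0 / dt)):          # simulate 48 hours
    lap = np.zeros_like(C)
    lap[1:-1] = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dx**2
    lap[0] = 2.0 * (C[1] - C[0]) / dx**2       # zero-flux boundaries
    lap[-1] = 2.0 * (C[-2] - C[-1]) / dx**2
    C = C + dt * (D * lap + lam * C * (1.0 - C / K))

# The position where C crosses K/2 tracks the leading edge; matching its
# motion to the detected edges in the images gives the estimate of D.
edge = x[np.argmin(np.abs(C - K / 2.0))]
print(f"lambda = {lam:.3f} /h, K = {K:.2e}, simulated edge at {edge:.0f} um")
```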

Relevance: 20.00%

Abstract:

Aim: An effective catch in sculling is a critical determinant of boat velocity. This study used rowers' performance-based judgements to compare three measures of catch slip efficiency. Two questions were addressed: (1) would rower-judged Yes strokes be faster than No strokes? and (2) which method of quantifying catch slip best reflected these judgements? Methods: Eight single scullers performed two 10-min blocks of submaximal on-water rowing at 20 strokes per minute. Every 30 s, rowers reported either Yes or No on the quality of their stroke at the catch. Results: Yes strokes identified by rowers had, on average, a moderate advantage over No strokes, with a standardised effect size of 0.43. In addition, a quicker time to positive acceleration best reflected the change in performance: its standardised mean difference of 0.57 was larger than the scores of 0.47 for the time to PowerLine force and 0.35 for the time to 30% peak pin force catch slip measures. For all eight rowers, the time to positive acceleration occurred earlier in Yes strokes than in No strokes. Conclusion: Rower judgements about successful strokes were linked to achieving a quicker time to positive acceleration, which may be of most value in achieving a higher average boat velocity.
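As a hedged sketch of the quantities compared above, the code below computes a per-stroke time to positive acceleration from a boat acceleration trace and the standardised mean difference (Cohen's d) between rower-judged Yes and No strokes; the traces and values are synthetic, not the study's instrumentation or data.

```python
# Sketch: "time to positive acceleration" from a boat acceleration trace, and
# the standardised mean difference between Yes and No strokes. Synthetic data.
import numpy as np

def time_to_positive_acceleration(accel, t, catch_time):
    """Time (s) from the catch until boat acceleration first becomes positive."""
    after = t >= catch_time
    idx = np.argmax(accel[after] > 0.0)        # first positive sample post-catch
    return t[after][idx] - catch_time

def cohens_d(a, b):
    """Standardised mean difference with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) +
                      (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

# Toy acceleration trace for one stroke, sampled at 200 Hz, catch at t = 0.1 s.
t = np.linspace(0.0, 1.0, 201)
accel = -1.0 + 4.0 * (t - 0.1)
print(f"time to positive acceleration: "
      f"{time_to_positive_acceleration(accel, t, catch_time=0.1):.2f} s")

# Synthetic per-stroke times to positive acceleration (s) for the two groups.
yes_strokes = np.array([0.28, 0.25, 0.30, 0.27, 0.26])
no_strokes = np.array([0.33, 0.31, 0.35, 0.30, 0.34])
print(f"Cohen's d (No minus Yes): {cohens_d(no_strokes, yes_strokes):.2f}")
```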