996 results for meal size


Relevance: 20.00%

Abstract:

Atheromatous plaque rupture is the cause of the majority of strokes and heart attacks in the developed world. The role of calcium deposits and their contribution to plaque vulnerability are controversial: some studies have suggested that calcified plaque tends to be more stable, whereas others have suggested the opposite. This study uses a finite element model to evaluate the effect of calcium deposits on the stress within the fibrous cap by varying their location and size. The plaque fibrous cap, lipid pool and calcification were modeled as hyperelastic, isotropic, (nearly) incompressible materials with different properties for large-deformation analysis, with time-dependent pressure loading applied to the lumen wall. The stress and strain contours were illustrated for each condition for comparison. Von Mises stress increases by at most 1.5% when the location of a calcification is varied within the lipid pool distant from the fibrous cap. Calcification in the fibrous cap leads to a 43% increase in von Mises stress compared with calcification in the lipid pool, and doubling the calcification area leads to a 15% stress increase in the fibrous cap. Calcification in the lipid pool therefore does not increase fibrous cap stress when it is distant from the cap, whilst large areas of calcification close to or in the fibrous cap may lead to a high stress concentration within the cap, which may cause plaque rupture. This study highlights the application of a computational model to the simulation of clinical problems, and it may provide insights into the mechanism of plaque rupture.
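
The comparison metric throughout is the von Mises (equivalent) stress. For orientation, a minimal sketch of how it is obtained from the principal stresses; the numerical values below are hypothetical, not taken from the study:

```python
import numpy as np

def von_mises_stress(s1, s2, s3):
    """Von Mises (equivalent) stress from the three principal stresses."""
    return np.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))

# Hypothetical principal stresses (kPa) in a fibrous-cap element
baseline = von_mises_stress(150.0, 40.0, 10.0)
elevated = 1.43 * baseline  # the 43% increase reported for cap calcification
print(f"baseline {baseline:.0f} kPa -> with cap calcification {elevated:.0f} kPa")
```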

Relevance: 20.00%

Abstract:

We derive a new method for determining size-transition matrices (STMs) that eliminates probabilities of negative growth and accounts for individual variability. STMs are an important part of size-structured models, which are used in the stock assessment of aquatic species. The elements of an STM represent the probability of growth from one size class to another over a given time step. The growth increment over this time step can be modelled with a variety of methods, but when a population-level construct is assumed for the underlying growth model, the resulting STM may contain entries that predict negative growth. To solve this problem, we use a maximum likelihood method that incorporates individual variability in the asymptotic length, relative age at tagging, and measurement error to obtain von Bertalanffy growth model parameter estimates. The statistical moments of the future length, given an individual's previous length measurement and time at liberty, are then derived. We moment-match the true conditional distributions with skew-normal distributions and use these to accurately estimate the elements of the STMs. The method is investigated with simulated tag-recapture data and with tag-recapture data gathered from the Australian eastern king prawn (Melicertus plebejus).
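
As a sketch of the basic construction, under strong simplifying assumptions (a population-level von Bertalanffy mean increment and Gaussian rather than the paper's skew-normal future-length distributions, with negative growth crudely truncated instead of eliminated by the likelihood machinery described above):

```python
import numpy as np
from scipy.stats import norm

def stm_normal(bins, L_inf, k, dt, sd):
    """Size-transition matrix P[i, j] = Pr(class j after dt | class i now).

    bins : size-class edges; L_inf, k : von Bertalanffy parameters;
    sd   : std dev of future length (collapses all variability sources).
    """
    mids = 0.5 * (bins[:-1] + bins[1:])
    # Expected future length from the von Bertalanffy increment equation
    mu = mids + (L_inf - mids) * (1.0 - np.exp(-k * dt))
    P = np.diff(norm.cdf(bins[None, :], loc=mu[:, None], scale=sd), axis=1)
    P[np.tril_indices_from(P, k=-1)] = 0.0  # crude fix: forbid negative growth
    return P / P.sum(axis=1, keepdims=True)

bins = np.arange(10.0, 61.0, 5.0)  # hypothetical 5 mm size classes
print(stm_normal(bins, L_inf=55.0, k=0.8, dt=0.25, sd=2.0).round(2))
```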

Relevance: 20.00%

Abstract:

Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing mean values may become statistically inappropriate, and even invalid, when a substantial proportion of the response values are below detection limits (censored), because strong distributional assumptions must then be made about the censored observations. In this paper, we propose a quantile methodology that is robust to outliers and can handle data with a substantial proportion of below-detection-limit observations without the need to impute the censored values. As a demonstration, we applied the method to a nutrient monitoring project that forms part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is in fact smaller than that required by the traditional t-test, illustrating the merit of our method.
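
A quantile-based test needs no imputation because its statistic depends only on whether each observation exceeds the quantile of interest. A minimal simulation sketch of the resulting power and sample-size calculation; the distribution, compliance limit and quantile below are hypothetical, not taken from the Perth program:

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(1)

def power_quantile_test(n, q0, tau, sampler, alpha=0.05, nsim=2000):
    """Simulated power of a one-sample test that the tau-quantile exceeds q0.

    The statistic is the count of observations above q0, so values censored
    below a detection limit DL < q0 need no imputation.
    """
    rejections = 0
    for _ in range(nsim):
        x = sampler(n)
        k = int(np.sum(x > q0))
        # Under H0 (tau-quantile equals q0) the count is Binomial(n, 1 - tau)
        if binomtest(k, n, 1 - tau, alternative="greater").pvalue < alpha:
            rejections += 1
    return rejections / nsim

# Hypothetical lognormal nutrient concentrations whose 80th percentile
# sits above the compliance limit q0 = 2.0
sampler = lambda n: rng.lognormal(mean=0.5, sigma=0.6, size=n)
for n in (30, 60, 120):
    print(n, power_quantile_test(n, q0=2.0, tau=0.8, sampler=sampler))
```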

Relevance: 20.00%

Abstract:

We propose a new model for estimating the size of a population from successive catches taken during a removal experiment. The data from these experiments often show excessive variation, known as overdispersion, compared with that predicted by the multinomial model. The new model allows catchability to vary randomly among samplings, which accounts for the overdispersion. When the catchability is assumed to have a beta distribution, the likelihood function, referred to as beta-multinomial, is derived, and hence maximum likelihood estimates can be evaluated. Simulations show that, in the presence of extra variation in the data, the confidence intervals produced by previous models (Leslie-DeLury, Moran) are substantially too narrow, and that the new model provides more reliable confidence intervals. The performance of these methods is also demonstrated using two real data sets: one with overdispersion, from smallmouth bass (Micropterus dolomieu), and one without overdispersion, from rat (Rattus rattus).
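
For orientation, a sketch of the ordinary multinomial removal likelihood that the beta-multinomial generalises (constant catchability p; the paper's extension would instead draw p from a beta distribution at each sampling). The catch vector below is hypothetical:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_log_lik(params, catches):
    """Negative log-likelihood of the constant-catchability removal model.
    P(caught in pass i) = p (1-p)^(i-1); constants in c_i! are dropped."""
    logit_p, log_extra = params
    p = 1.0 / (1.0 + np.exp(-logit_p))
    T = catches.sum()
    N = T + np.exp(log_extra)                # N >= total catch, continuous
    s = len(catches)
    probs = p * (1.0 - p) ** np.arange(s)    # per-pass capture probabilities
    ll = (gammaln(N + 1) - gammaln(N - T + 1)   # log N!/(N-T)!
          + np.sum(catches * np.log(probs))
          + (N - T) * s * np.log(1.0 - p))      # never-caught animals
    return -ll

catches = np.array([112, 67, 41, 25])        # hypothetical removal data
res = minimize(neg_log_lik, x0=[0.0, np.log(50.0)], args=(catches,),
               method="Nelder-Mead")
p_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
N_hat = catches.sum() + np.exp(res.x[1])
print(f"p = {p_hat:.3f}, N = {N_hat:.1f}")
```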

Relevance: 20.00%

Abstract:

Natural mortality of marine invertebrates is often very high in the early life-history stages and decreases in later stages. Possible size-dependent mortality of juvenile banana prawns, P. merguiensis (2-15 mm carapace length), in the Gulf of Carpentaria was investigated. The analysis was based on data collected at two-week intervals by beam trawl at four sites over a period of six years (September 1986 to March 1992). Mortality was assumed to be a parametric function of size rather than a constant. A further complication in estimating mortality for juvenile banana prawns is that a significant proportion of the population emigrates from the study area each year; this was accounted for by incorporating the size-frequency pattern of the emigrants in the analysis. Both the extra parameter required to describe the size dependence of mortality and the parameter used to account for emigration were found to be significantly different from zero, and the estimated instantaneous mortality rate declined from 0.89 week⁻¹ for 2 mm prawns to 0.02 week⁻¹ for 15 mm prawns.
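
The abstract reports the two endpoint mortality rates but not the fitted functional form. Purely for illustration, assuming a power law M(L) = a L^(-b) and solving it through the two reported values shows how steeply mortality declines with size:

```python
import numpy as np

# Reported endpoints: M(2 mm) = 0.89 and M(15 mm) = 0.02 per week.
# The power-law form below is an assumption, not the study's model.
L1, M1, L2, M2 = 2.0, 0.89, 15.0, 0.02
b = np.log(M1 / M2) / np.log(L2 / L1)   # ~1.88
a = M1 * L1 ** b
for L in (2, 5, 10, 15):
    print(f"L = {L:2d} mm : M = {a * L ** -b:.3f} per week")
```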

Relevance: 20.00%

Abstract:

The bentiromide test was evaluated, using plasma p-aminobenzoic acid, as an indirect test of pancreatic insufficiency in young children between 2 months and 4 years of age. To determine the optimal test method, the following were examined: (a) the best dose of bentiromide (15 mg/kg or 30 mg/kg); (b) the optimal sampling time for plasma p-aminobenzoic acid; and (c) the effect of coadministration of a liquid meal. Sixty-nine children (1.6 ± 1.0 years) were studied, including 34 controls with normal fat absorption and 35 patients (34 with cystic fibrosis) with fat maldigestion due to pancreatic insufficiency. Control and pancreatic-insufficient subjects were studied in three age-matched groups: (a) low-dose bentiromide (15 mg/kg) with clear fluids; (b) high-dose bentiromide (30 mg/kg) with clear fluids; and (c) high-dose bentiromide with a liquid meal. Plasma p-aminobenzoic acid was determined at 0, 30, 60, and 90 minutes, and then hourly for 6 hours. The dose effect of bentiromide with clear liquids was evaluated. High-dose bentiromide best discriminated control and pancreatic-insufficient subjects, owing to a higher peak plasma p-aminobenzoic acid level in controls, but sensitivity and specificity remained poor. High-dose bentiromide with a liquid meal produced a delayed increase in plasma p-aminobenzoic acid in the control subjects, probably caused by retarded gastric emptying. In the pancreatic-insufficient subjects, however, use of a liquid meal resulted in significantly lower plasma p-aminobenzoic acid levels at all time points; plasma p-aminobenzoic acid at 2 and 3 hours completely discriminated between control and pancreatic-insufficient patients. Evaluation of the data by area under the time-concentration curve failed to improve test results. In conclusion, the bentiromide test is a simple, clinically useful means of detecting pancreatic insufficiency in young children, but the higher dose administered with a liquid meal is recommended.
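
The area-under-the-curve evaluation mentioned above is a standard trapezoidal computation over the sampling schedule. A minimal sketch with hypothetical PABA concentrations (not the study's data):

```python
import numpy as np

# Sampling schedule from the protocol: 0, 30, 60, 90 min, then hourly to 6 h
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 6.0])     # hours
# Hypothetical plasma p-aminobenzoic acid concentrations (ug/mL)
paba = np.array([0.0, 0.8, 2.1, 3.0, 3.4, 2.9, 2.0, 1.2, 0.7])

# Trapezoidal area under the time-concentration curve
auc = np.sum(0.5 * (paba[1:] + paba[:-1]) * np.diff(t))
print(f"AUC(0-6 h) = {auc:.2f} ug*h/mL")
```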

Relevance: 20.00%

Abstract:

Although subsampling is a common method for describing the composition of large and diverse trawl catches, the accuracy of these techniques is often unknown. We determined the sampling errors generated in estimating the percentage of the total number of species recorded in catches, as well as the abundance of each species, at each increase in the proportion of the catch sorted. We completely partitioned twenty prawn trawl catches from tropical northern Australia into subsamples of about 10 kg each. All subsamples were then sorted, and species numbers recorded. Catch weights ranged from 71 to 445 kg; the number of fish species per trawl ranged from 60 to 138, and the number of invertebrate species from 18 to 63. Almost 70% of the species recorded in catches were "rare" in subsamples (less than one individual per 10 kg subsample, or less than one in every 389 individuals). A matrix was used to show the increase in the total number of species recorded in each catch as the percentage of the catch sorted increased. Simulation modelling showed that sorting small subsamples (about 10% of catch weight) identified about 50% of the total number of species caught in a trawl; larger subsamples (50% of catch weight on average) identified about 80%. The accuracy of estimating the abundance of each species also increased with increasing subsample size. For the "rare" species, sampling error was around 80% after sorting 10% of catch weight and just under 50% after 40% of catch weight had been sorted. For the "abundant" species (five or more individuals per 10 kg subsample, or five or more in every 389 individuals), sampling error was around 25% after sorting 10% of catch weight, reduced to around 10% after 40% of catch weight had been sorted.
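
The simulation-modelling step can be sketched as repeated random subsampling of a catch. The community below is hypothetical (a few abundant species and a long tail of rare ones), not the trawl data:

```python
import numpy as np

rng = np.random.default_rng(7)

def pct_species_detected(counts, fraction, nsim=500):
    """Mean percentage of species found when a random fraction of the
    individuals in a catch is sorted."""
    labels = np.repeat(np.arange(len(counts)), counts)  # one entry per animal
    k = int(fraction * len(labels))
    found = [np.unique(rng.choice(labels, size=k, replace=False)).size
             for _ in range(nsim)]
    return 100.0 * np.mean(found) / len(counts)

# Hypothetical abundances for 100 species
counts = rng.geometric(0.05, size=100)
for f in (0.1, 0.25, 0.5):
    print(f"sorted {f:.0%} of catch -> "
          f"{pct_species_detected(counts, f):.0f}% of species detected")
```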

Relevance: 20.00%

Abstract:

Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain, or the total gain over a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to phase III study. We suggest maximizing the rate of gain instead; the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to phase III study.
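
A toy numerical illustration of the distinction (all figures hypothetical): the rate of gain divides the per-trial gain by the expected time a trial ties up, which grows with the proportion of treatments forwarded to the lengthier phase III stage.

```python
# A design that maximizes expected gain per phase II trial need not
# maximize the rate of gain over time. All numbers are hypothetical.
def rate_of_gain(expected_gain, p_forward, t_phase2=1.0, t_phase3=3.0):
    """Expected gain divided by expected elapsed time per phase II trial."""
    return expected_gain / (t_phase2 + p_forward * t_phase3)

greedy = rate_of_gain(expected_gain=10.0, p_forward=0.65)  # gain-per-trial
strict = rate_of_gain(expected_gain=6.0,  p_forward=0.12)  # tighter control
print(f"greedy: {greedy:.2f} gain/yr, strict: {strict:.2f} gain/yr")
```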

Relevance: 20.00%

Abstract:

Multi-objective optimization is an active field of research with broad applicability in aeronautics. This report details a variant of the original NSGA-II software intended to improve the performance of this widely used genetic algorithm in finding the optimal Pareto front of a multi-objective optimization problem, for use in UAV and aircraft design and optimization. The original NSGA-II works on a population of predetermined, constant size, and its computational cost to evaluate one generation is O(mn²), where m is the number of objective functions and n the population size. The basic idea motivating this work is to reduce the computational cost of the NSGA-II algorithm by making it work on a population of variable size, in order to obtain better convergence towards the Pareto front in less time. In this work, several test functions are run with both the original NSGA-II and the VPNSGA-II algorithm; each test is timed in order to measure the computational cost of each trial, and the results are compared.
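
The O(mn²) term comes from NSGA-II's non-dominated sorting step. A compact sketch of Deb's fast non-dominated sort (minimization convention; the points are illustrative):

```python
import numpy as np

def fast_nondominated_sort(F):
    """Deb's fast non-dominated sort, the O(m n^2) step of NSGA-II.
    F is an (n, m) array of objective values to be minimized."""
    n = len(F)
    S = [[] for _ in range(n)]        # solutions dominated by i
    counts = np.zeros(n, dtype=int)   # number of solutions dominating i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                S[i].append(j)        # i dominates j
            elif np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                counts[i] += 1        # j dominates i
        if counts[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

pts = np.array([[1, 5], [2, 2], [5, 1], [3, 4], [4, 4]])
print(fast_nondominated_sort(pts))    # [[0, 1, 2], [3], [4]]
```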

Relevance: 20.00%

Abstract:

The thermodynamic model first published in 1909 has been used extensively to understand the size-dependent melting of nanoparticles. Pawlow deduced an expression for the size-dependent melting temperature of small particles based on this thermodynamic model; it was subsequently modified and applied to other nanostructures such as nanowires and prism-shaped nanoparticles. The model has also been adapted to describe the melting of supported nanoparticles and the superheating of embedded nanoparticles. In this article, we review the melting behaviour of nanostructures reported in the literature since 1909.
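
The abstract does not reproduce Pawlow's expression. For orientation, a commonly quoted Gibbs-Thomson-type form of the size-dependent melting temperature of a free spherical particle (symbols generic, not taken from the article):

```latex
% T_mb: bulk melting temperature, sigma_sl: solid-liquid interfacial
% energy, H_f: latent heat of fusion per unit volume, d: particle diameter
\[
  T_m(d) = T_{mb}\left(1 - \frac{4\,\sigma_{sl}}{H_f\, d}\right)
\]
```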

Relevance: 20.00%

Abstract:

Consumer risk assessment is a crucial step in the regulatory approval of pesticide use on food crops. Recently, an additional hurdle has been added to the formal consumer risk assessment process with the introduction of short-term intake or exposure assessment and a comparable short-term toxicity reference, the acute reference dose. Exposure to residues during one meal or over one day is important for short-term or acute intake. Exposure in the short term can be substantially higher than average because the consumption of a food on a single occasion can be very large compared with typical long-term or mean consumption, and the food may have a much larger residue than average. Furthermore, the residue level in a single unit of a fruit or vegetable may be higher by a factor (defined as the variability factor, which we have shown to be typically ×3 for the 97.5th percentile unit) than the average residue in the lot. Available marketplace data and supervised residue trial data are examined in an investigation of the variability of residues in units of fruit and vegetables. A method is described for estimating the 97.5th percentile value from sets of unit residue data. Variability appears to be generally independent of the pesticide, the crop, crop unit size and the residue level. The deposition of pesticide on the individual unit during application is probably the most significant factor. The diets used in the calculations ideally come from individual and household surveys with enough consumers of each specific food to determine large portion sizes. The diets should distinguish the different forms of a food consumed, e.g. canned, frozen or fresh, because the residue levels associated with the different forms may be quite different. Dietary intakes may be calculated by a deterministic method or a probabilistic method. In the deterministic method, the intake is estimated with the assumptions of large portion consumption of a ‘high residue’ food (high residue in the sense that the pesticide was used at the highest recommended label rate, the crop was harvested at the smallest interval after treatment and the residue in the edible portion was the highest found in any of the supervised trials in line with these use conditions). The deterministic calculation also includes a variability factor for those foods consumed as units (e.g. apples, carrots) to allow for the elevated residue in some single units, which may not be seen in composited samples. In the probabilistic method, the distribution of dietary consumption and the distribution of possible residues are combined in repeated probabilistic calculations to yield a distribution of possible residue intakes. Additional information, such as the percentage of the commodity treated and the combination of residues from multiple commodities, may be incorporated into probabilistic calculations. The IUPAC Advisory Committee on Crop Protection Chemistry has made 11 recommendations relating to acute dietary exposure.
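
A minimal sketch of the deterministic calculation described above, in one common form for foods eaten as discrete units (one unit carries ν times the highest residue, the remainder of the large portion carries the highest residue). The function shape and all parameter values are illustrative assumptions, not the IUPAC recommendations:

```python
# Deterministic short-term intake for a food eaten as discrete units.
# All numbers below are hypothetical.
def short_term_intake(LP, U, HR, nu, bw):
    """mg/kg bw per day: one unit at nu * HR, rest of the portion at HR.

    LP : large portion (kg food)     U  : unit weight (kg)
    HR : highest residue (mg/kg)     nu : variability factor (~3 at P97.5)
    bw : consumer body weight (kg)
    """
    return (U * HR * nu + max(LP - U, 0.0) * HR) / bw

# Hypothetical example: 0.5 kg portion of a 0.2 kg unit fruit,
# HR = 1 mg/kg, nu = 3, 20 kg child
intake = short_term_intake(LP=0.5, U=0.2, HR=1.0, nu=3, bw=20.0)
print(f"{intake:.3f} mg/kg bw")  # compared with the acute reference dose
```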

Relevance: 20.00%

Abstract:

Analyses of diffusion and dislocation creep in nanocrystals need to take into account the low temperatures, high stresses and very fine grain sizes generally involved. In nanocrystals, diffusion creep may be associated with a nonlinear stress dependence, and dislocation creep may involve a grain size dependence.
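
In terms of the generic creep rate equation (given here for orientation, not reproduced from the article), this amounts to a stress exponent n > 1 in diffusion creep, which classically has n = 1, and a nonzero grain-size exponent p in dislocation creep, which classically has p = 0:

```latex
% Generic constitutive form for comparing creep mechanisms:
% sigma: stress, d: grain size, Q: activation energy, T: temperature
\[
  \dot{\varepsilon} = A\,\frac{\sigma^{n}}{d^{\,p}}
  \exp\!\left(-\frac{Q}{RT}\right)
\]
% Classical diffusion creep: n = 1 (p = 2 Nabarro-Herring, p = 3 Coble).
```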

Relevance: 20.00%

Abstract:

Rail track undergoes more complex loading patterns under moving traffic than roads do, owing to its multi-layered structure of continuous and discontinuous components: rail, sleepers, ballast, sub-ballast, and subgrade. The particle size distributions (PSDs) of the ballast, sub-ballast, and subgrade layers can be critical to the cyclic plastic deformation of rail track under moving traffic, and hence to frequent track degradation, especially at bridge transition zones. Conventional test approaches (static shear and cyclic single-point load tests) are, however, unable to replicate the actual loading patterns of a moving train. A multi-ring shear apparatus, a new type of torsional simple shear apparatus that can reproduce moving traffic conditions, was used in this study to investigate the influence of the particle size distribution of the rail track layers on cyclic plastic deformation. Three particle size distributions, prepared using glass beads, were examined under different loading patterns (cyclic single-point load and cyclic moving wheel load) to evaluate the cyclic plastic deformation of rail track under the different loading methods. The results of these tests suggest that the particle size distributions of the rail track structural layers have significant impacts on cyclic plastic deformation under moving train load. Further, the limitations of the conventional laboratory test methods lead to underestimation of the plastic deformation of rail tracks.

Relevance: 20.00%

Abstract:

In this study, 120–144 commercial varieties and breeding lines were assessed for grain size attributes, including plump grain (>2.8 mm) and retention (>2.5 mm plus >2.8 mm). Grain samples were produced from replicated trials at 25 sites across four years. Climatic conditions varied between years as well as between sites. Several of the trial sites were irrigated, while the remainder were grown under dryland conditions; a number of the dryland sites suffered severe drought stress. The grain size data were analysed for genetic (G), environmental (E) and genotype-by-environment (G×E) effects, with maturity included as a covariate in all analyses. The genetic effect on grain size was greater than the environmental or maturity effects, despite some sites suffering terminal moisture stress. The model was used to calculate heritability values for each site: these ranged from 89 to 98% for plump grain and from 88 to 96% for retention. The results demonstrate that removing sources of non-heritable variation, such as maturity and field effects, can improve genetic estimates of the retention and plump grain fractions. By partitioning all variance components, and thereby obtaining more robust estimates of genetic differences, plant breeders can have greater confidence in selecting barley genotypes that maintain large, stable grain size across a range of environments.
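
The per-site heritabilities quoted above are of the entry-mean kind. A minimal sketch of that calculation from variance components; the component values and replicate number below are hypothetical:

```python
# Broad-sense heritability on an entry-mean basis within a single site:
# H^2 = var_g / (var_g + var_e / r), with r replicates per line.
def entry_mean_heritability(var_g, var_e, n_reps):
    return var_g / (var_g + var_e / n_reps)

# Hypothetical variance components for plump grain at one site
H2 = entry_mean_heritability(var_g=4.0, var_e=1.2, n_reps=3)
print(f"H^2 = {H2:.2f}")  # -> 0.91, within the range reported above
```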