130 results for Sequential Modeling


Relevance: 20.00%

Abstract:

Radar images and numerical simulations of three shallow convective precipitation events over the Coastal Range in western Oregon are presented. In one of these events, unusually well-defined quasi-stationary banded formations produced large precipitation enhancements in favored locations, while varying degrees of band organization and lighter precipitation accumulations occurred in the other two cases. The difference between the more banded and cellular cases appeared to depend on the vertical shear within the orographic cap cloud and the susceptibility of the flow to convection upstream of the mountain. Numerical simulations showed that the rainbands, which appeared to be shear-parallel convective roll circulations that formed within the unstable orographic cap cloud, developed even over smooth mountains. However, these banded structures were better organized, more stationary, and produced greater precipitation enhancement over mountains with small-scale topographic obstacles. Low-amplitude random topographic roughness elements were found to be just as effective as more prominent subrange-scale peaks at organizing and fixing the location of the orographic rainbands.
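The role of small-scale roughness can be illustrated with a toy terrain profile of the kind used to initialize such simulations. The ridge height, half-width, and roughness amplitude below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Cross-ridge coordinate (m): a 100 km domain at ~200 m spacing
x = np.linspace(-50e3, 50e3, 513)

# Smooth bell-shaped ridge, 1500 m high with a 15 km half-width (illustrative)
ridge = 1500.0 * np.exp(-(x / 15e3) ** 2)

# Low-amplitude random roughness elements (~100 m standard deviation), the
# kind of perturbation found to fix the location of the orographic rainbands
roughness = 100.0 * rng.standard_normal(x.size)

rough_terrain = ridge + roughness
```

In the simulations described above, banded structures were better organized and more stationary over terrain like `rough_terrain` than over the smooth `ridge` alone.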

Relevance: 20.00%

Abstract:

Sensitivity, specificity, and reproducibility are vital to interpret neuroscientific results from functional magnetic resonance imaging (fMRI) experiments. Here we examine the scan–rescan reliability of the percent signal change (PSC) and parameters estimated using Dynamic Causal Modeling (DCM) in scans taken in the same scan session, less than 5 min apart. We find fair to good reliability of PSC in regions that are involved with the task, and fair to excellent reliability with DCM. Also, the DCM analysis uncovers group differences that were not present in the analysis of PSC, which implies that DCM may be more sensitive to the nuances of signal changes in fMRI data.
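Scan–rescan reliability of this kind is commonly summarised with an intraclass correlation coefficient, whose conventional bands (roughly: below 0.40 poor, 0.40–0.59 fair, 0.60–0.74 good, 0.75 and above excellent) match the paper's "fair/good/excellent" labels. A minimal sketch of a single-measure, consistency ICC(3,1); whether this is the exact ICC variant used in the study is an assumption here:

```python
import numpy as np

def icc_3_1(scan1, scan2):
    """Single-measure, consistency ICC(3,1) for two repeated scans.

    Rows are subjects (or regions), columns are the two scan sessions.
    """
    x = np.column_stack([scan1, scan2]).astype(float)
    n, k = x.shape
    grand = x.mean()
    ss_subj = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between-subject SS
    ss_sess = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between-session SS
    ss_err = ((x - grand) ** 2).sum() - ss_subj - ss_sess # residual SS
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)
```

Applied to, say, per-region PSC values from the two scans, identical sessions give an ICC of 1 and session-to-session noise pulls the value down.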

Relevance: 20.00%

Abstract:

Understanding how multiple signals are integrated in living cells to produce a balanced response is a major challenge in biology. Two-component signal transduction pathways, such as bacterial chemotaxis, comprise histidine protein kinases (HPKs) and response regulators (RRs). These are used to sense and respond to changes in the environment. Rhodobacter sphaeroides has a complex chemosensory network with two signaling clusters, each containing a HPK, CheA. Here we demonstrate, using a mathematical model, how the outputs of the two signaling clusters may be integrated. We use our mathematical model supported by experimental data to predict that: (1) the main RR controlling flagellar rotation, CheY6, aided by its specific phosphatase, the bifunctional kinase CheA3, acts as a phosphate sink for the other RRs; and (2) a phosphorelay pathway involving CheB2 connects the cytoplasmic cluster kinase CheA3 with the polar localised kinase CheA2, and allows CheA3-P to phosphorylate non-cognate chemotaxis RRs. These two mechanisms enable the bifunctional kinase/phosphatase activity of CheA3 to integrate and tune the sensory output of each signaling cluster to produce a balanced response. The signal integration mechanisms identified here may be widely used by other bacteria, since like R. sphaeroides, over 50% of chemotactic bacteria have multiple cheA homologues and need to integrate signals from different sources.
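The phosphate-sink idea can be caricatured with a two-variable kinetic sketch: a generic response regulator is phosphorylated by its kinase, while the sink regulator (CheY6 in the paper, aided by the CheA3 phosphatase) is dephosphorylated rapidly, so phosphoryl flux drains through it. All rate constants below are made-up illustrative values, not fitted parameters from the model:

```python
def simulate_sink(k_in=1.0, k_transfer=2.0, k_sink=5.0,
                  dt=1e-3, steps=50_000):
    """Forward-Euler sketch of a phosphate sink.

    rr_p   : phospho-level of a generic response regulator
    sink_p : phospho-level of the sink regulator (rapidly dephosphorylated)
    """
    rr_p, sink_p = 0.0, 0.0
    for _ in range(steps):
        d_rr = k_in - k_transfer * rr_p                    # gain from kinase,
        d_sink = k_in + k_transfer * rr_p - k_sink * sink_p  # loss to the sink
        rr_p += dt * d_rr
        sink_p += dt * d_sink
    return rr_p, sink_p
```

Because the sink is dephosphorylated quickly, both phospho-levels settle at low steady states; this draining role is the balancing function the paper attributes to CheY6 and the bifunctional kinase/phosphatase CheA3.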

Relevance: 20.00%

Abstract:

It is known that germin, which is a marker of the onset of growth in germinating wheat, is an oxalate oxidase, and also that germins possess sequence similarity with legumin and vicilin seed storage proteins. These two pieces of information have been combined in order to generate a 3D model of germin based on the structure of vicilin and to examine the model with regard to a potential oxalate oxidase active site. A cluster of three histidine residues has been located within the conserved beta-barrel structure. While there is a relatively low level of overall sequence similarity between the model and the vicilin structures, the conservation of amino acids important in maintaining the scaffold of the beta-barrel lends confidence to the juxtaposition of the histidine residues. The cluster is similar structurally to those found in copper amine oxidase and other proteins, leading to the suggestion that it defines a metal-binding location within the oxalate oxidase active site. It is also proposed that the structural elements involved in intermolecular interactions in vicilins may play a role in oligomer formation in germin/oxalate oxidase.

Relevance: 20.00%

Abstract:

Smooth flow of production in construction is hampered by the disparity between individual trade teams' goals and the goal of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) addresses some of these issues by providing a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.
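Pull flow of the kind KanBIM aims at can be sketched as a kanban-style board on which a trade team pulls the next work package only when its preconditions are met and the team is under its work-in-progress limit. The class below is a generic illustration, not part of the KanBIM system:

```python
from collections import deque

class PullFlowBoard:
    """Minimal kanban-style pull flow for trade-team work packages."""

    def __init__(self, wip_limit=2):
        self.backlog = deque()   # (package, ready) pairs awaiting pull
        self.in_progress = []
        self.done = []
        self.wip_limit = wip_limit

    def add(self, package, ready=True):
        """Register a work package; `ready` marks preconditions as met."""
        self.backlog.append((package, ready))

    def pull(self):
        """Pull the next ready package, or None if blocked by the WIP limit."""
        if len(self.in_progress) >= self.wip_limit:
            return None
        for i, (pkg, ready) in enumerate(self.backlog):
            if ready:
                del self.backlog[i]
                self.in_progress.append(pkg)
                return pkg
        return None

    def complete(self, package):
        self.in_progress.move = None  # no-op attribute removed below
```

The WIP limit is what makes this "pull" rather than "push": work is admitted only when the team has capacity, which is the stability property the requirements above target.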

Relevance: 20.00%

Abstract:

Oral nutrition supplements (ONS) are routinely prescribed to those with, or at risk of, malnutrition. Previous research identified poor compliance due to taste and sweetness. This paper investigates taste and hedonic liking of ONS, of varying sweetness and metallic levels, over consumption volume; an important consideration as patients are prescribed large volumes of ONS daily. A sequential descriptive profile was developed to determine the perception of sensory attributes over repeat consumption of ONS. Changes in liking of ONS following repeat consumption were characterised by a boredom test. Certain flavour (metallic taste, soya milk flavour) and mouthfeel (mouthdrying, mouthcoating) attributes built up over increased consumption volume (p < 0.002). Hedonic liking data from two cohorts, healthy older volunteers (n = 32, median age 73) and patients (n = 28, median age 85), suggested such build-up was disliked. Efforts made to improve the palatability of ONS must take account of the build-up of taste and mouthfeel characteristics over increased consumption volume.

Relevance: 20.00%

Abstract:

A modeling study was carried out into pea-barley intercropping in northern Europe. The two objectives were (a) to compare pea-barley intercropping to sole cropping in terms of grain and nitrogen yield amounts and stability, and (b) to explore options for managing pea-barley intercropping systems in order to maximize the biomass produced and the grain and nitrogen yields according to the available resources, such as light, water and nitrogen. The study consisted of simulations taking into account soil and weather variability among three sites located in northern European countries (Denmark, United Kingdom and France), and using 10 years of weather records. A preliminary stage evaluated the STICS intercrop model's ability to predict grain and nitrogen yields of the two species, using a 2-year dataset from trials conducted at the three sites. The work was carried out in two phases: (a) the model was run to investigate the potentialities of intercrops as compared to sole crops, and (b) the model was run to explore options for managing pea-barley intercropping, asking the following three questions: (i) In order to increase light capture, would it be worth delaying the sowing dates of one species? (ii) How should the sowing density and seed proportion of each species in the intercrop be managed to improve total grain yield and N use efficiency? (iii) How can the use of nitrogen resources be optimized by choosing the most suitable preceding crop and/or the most appropriate soil?
It was found that (1) intercropping made better use of environmental resources as regards yield amount and stability than sole cropping, with a noticeable site effect, (2) pea growth in intercrops was strongly linked to soil moisture, and barley yield was determined by nitrogen uptake and light interception due to its height relative to pea, (3) sowing barley before pea led to a relative grain yield reduction averaged over all three sites, but sowing strategy must be adapted to the location, being dependent on temperature and thus latitude, (4) density and species proportions had a small effect on total grain yield, underlining the interspecific offset in the use of environmental growth resources which led to similar total grain yields whatever the pea-barley design, and (5) long-term strategies including mineralization management through organic residue supply and rotation management were very valuable, always favoring intercrop total grain yield and N accumulation. (C) 2009 Elsevier B.V. All rights reserved.
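Yield amount and stability comparisons of the kind summarised above are often expressed as a mean and coefficient of variation across the simulated seasons. The yield series below are invented placeholders, not STICS outputs:

```python
import statistics

def mean_and_cv(yields):
    """Mean yield (t/ha) and coefficient of variation (%) across seasons."""
    m = statistics.mean(yields)
    return m, 100.0 * statistics.stdev(yields) / m

# Hypothetical 10-season grain yields (t/ha) at one site
sole_pea = [3.1, 2.2, 3.5, 2.8, 1.9, 3.3, 2.5, 3.0, 2.1, 3.4]
intercrop = [3.0, 2.8, 3.2, 2.9, 2.7, 3.1, 2.9, 3.0, 2.8, 3.1]

pea_mean, pea_cv = mean_and_cv(sole_pea)
ic_mean, ic_cv = mean_and_cv(intercrop)
# A lower CV for the intercrop series indicates the more stable yield
```

In this illustration the intercrop's lower coefficient of variation is the kind of stability advantage the simulations found, though the actual values depend on site and season.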

Relevance: 20.00%

Abstract:

Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search of correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 until 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the first node detectable growth stage. For sites above the disease threshold, a quantitative model predicted severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses in which the search program was rerun using weather data from previous years, therefore uncorrelated with the disease data, to investigate how likely correlations such as the ones found in our models would have been in the absence of genuine relationships.
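The two-stage structure of the models can be sketched as a function that first makes the qualitative presence/absence call and then, only above the threshold, applies the quantitative severity prediction. Every threshold and regression value below, and the sign of each weather effect, is a hypothetical placeholder rather than a fitted value from the paper:

```python
def predict_septoria(mean_winter_temp, wind_speed, rain_stem_elongation,
                     temp_threshold=4.0, wind_threshold=4.5,
                     slope=0.05, intercept=1.0):
    """Two-stage sketch of the model structure (all parameters hypothetical).

    Stage 1 (qualitative): winter temperature and wind speed decide whether
    severity will exceed the 5% threshold in the top three leaf layers.
    Stage 2 (quantitative): for sites over the threshold, severity (%) is
    predicted linearly from rainfall (mm) during stem elongation.
    """
    if mean_winter_temp < temp_threshold or wind_speed < wind_threshold:
        return 0.0  # predicted to stay below the 5% severity threshold
    return intercept + slope * rain_stem_elongation
```

The bootstrap check described above would amount to re-fitting `temp_threshold`, `wind_threshold`, `slope`, and `intercept` against weather records from unrelated years and asking how often comparably strong correlations arise by chance.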

Relevance: 20.00%

Abstract:

Milk supply from Mexican dairy farms does not meet demand and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, which had objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, rye-grass, and corn silage would meet nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize higher demand for nutrients with the period of high forage availability.
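One of the component linear programs — maximizing metabolizable energy supply subject to intake and feed-budget constraints — can be sketched with `scipy.optimize.linprog`. The feed names, ME densities, prices, and limits are invented for illustration, not data from the study:

```python
from scipy.optimize import linprog

# Decision variables: kg DM/day of alfalfa hay and corn silage (illustrative)
me = [11.8, 10.9]     # metabolizable energy, MJ/kg DM (assumed values)
cost = [0.20, 0.10]   # feed price, $/kg DM (assumed values)

res = linprog(
    c=[-m for m in me],          # linprog minimizes, so negate ME to maximize
    A_ub=[[1.0, 1.0], cost],     # total intake limit and daily feed budget
    b_ub=[10.0, 1.60],           # <= 10 kg DM/day, <= $1.60/day
    bounds=[(0, None), (0, None)],
)
alfalfa, silage = res.x          # optimal kg DM/day of each feed
```

With these numbers the optimum sits at the intersection of the two constraints (6 kg alfalfa, 4 kg silage). Goal and compromise programming extend this by trading off several such objectives (energy, protein, margin, purchased feed) at once rather than optimizing one at a time.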

Relevance: 20.00%

Abstract:

In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of correlation from data collected as part of the trial. An adaptive approach is proposed and evaluated that makes use of these formulas and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
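The adaptive idea — estimating ρ from accumulating trial data rather than fixing its lowest plausible value — can be sketched by computing the sample correlation of the paired efficacy and safety responses, which for normal data is the natural estimate of the correlation between the two monitored test statistics. The true ρ and interim sample size below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

rho_true = 0.6      # unknown in practice; fixed here only to generate data
n_interim = 200     # patients with both responses at the interim analysis

# Paired (efficacy, safety) responses, standardized bivariate normal
responses = rng.multivariate_normal(
    mean=[0.0, 0.0],
    cov=[[1.0, rho_true], [rho_true, 1.0]],
    size=n_interim,
)

# Interim estimate of rho, to be plugged into the boundary computation
rho_hat = np.corrcoef(responses[:, 0], responses[:, 1])[0, 1]
```

Using `rho_hat` instead of a pessimistic lower bound is what lets the adaptive design preserve the type I error rate without the loss of efficiency from assuming the least favorable correlation.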

Relevance: 20.00%

Abstract:

A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
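The selection bias of the unadjusted maximum-likelihood estimate is easy to demonstrate by simulation: when several equally effective experimental treatments are compared and the best-looking one is selected, the naive estimate of its advantage is biased upwards. The number of arms and unit standard error below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

k = 4                  # experimental treatments compared at stage 1
n_sim = 20_000         # simulated trials
true_advantage = 0.0   # every treatment is exactly as good as control

# Stage-1 estimates of each treatment's advantage (unit standard error)
stage1 = rng.normal(true_advantage, 1.0, size=(n_sim, k))

# Select the most promising treatment and keep its naive estimate
naive_selected = stage1.max(axis=1)

bias = naive_selected.mean() - true_advantage
# The expected maximum of 4 standard normals is about 1.03, so the naive
# estimate overstates the selected advantage by roughly one standard error
```

This is the bias that the paper's adjusted estimate corrects, and the reason the standard confidence interval around the naive estimate has inaccurate coverage.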

Relevance: 20.00%

Abstract:

Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared to the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling the determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
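For a binary endpoint, the efficient score and its information take a simple closed form in the Whitehead-style log-odds-ratio parameterisation commonly used for sequential monitoring; whether this is the exact form used in the paper is an assumption here. A sketch:

```python
def efficient_score_binary(s_e, n_e, s_c, n_c):
    """Efficient score Z and observed information V for a binary endpoint.

    Compares an experimental arm (s_e successes out of n_e patients) with a
    control arm (s_c out of n_c) for the log odds ratio; Z / sqrt(V) is the
    standardized statistic monitored against the stopping boundaries.
    """
    n = n_e + n_c
    s = s_e + s_c
    z = (n_c * s_e - n_e * s_c) / n          # observed minus expected form
    v = n_e * n_c * s * (n - s) / n**3       # information under the null
    return z, v
```

For example, 30/50 successes against 20/50 gives Z = 5.0 and V = 6.25, so a standardized statistic of 2.0. The appeal of this formulation is that normal and failure-time endpoints slot into the same (Z, V) boundary machinery with only the score and information formulas changed.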