992 results for Recursive real numbers


Relevance: 30.00%

Publisher:

Abstract:

DNA extraction is a critical step in Genetically Modified Organism analysis based on real-time PCR. In this study, the CTAB and DNeasy methods provided good quality and quantity of DNA from the texturized soy protein, infant formula, and soy milk samples. Concerning the Certified Reference Material consisting of 5% Roundup Ready® soybean, neither method yielded DNA of good quality. However, the dilution test applied to the CTAB extracts showed no interference from inhibitory substances. The PCR efficiencies of lectin target amplification were not statistically different, and the coefficients of correlation (R²) demonstrated a high degree of correlation between the copy numbers and the threshold cycle (Ct) values. ANOVA showed a suitable fit of the regression and the absence of significant linear deviations. The efficiencies of the p35S amplification were not statistically different, and all R² values using DNeasy extracts were above 0.98 with no significant linear deviations. Two out of three R² values using CTAB extracts were lower than 0.98, corresponding to a lower degree of correlation, and the lack-of-fit test showed a significant linear deviation in one run. The comparative analysis of the Ct values for the p35S and lectin targets demonstrated no statistically significant differences between the analytical curves of each target.
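
As an illustration of how the quantities above are usually obtained, the sketch below fits a standard curve of Ct against log copy number and derives the amplification efficiency and R². The dilution series is invented for the example and is not data from the study.

```python
import numpy as np

# Illustrative dilution series (not data from the study): copy numbers of a target
# and the Ct values measured for each dilution.
copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
ct = np.array([18.1, 21.5, 24.9, 28.3, 31.8])

# Standard curve: linear regression of Ct against log10(copy number).
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# Amplification efficiency and coefficient of correlation (R²).
efficiency = 10 ** (-1.0 / slope) - 1.0      # 1.0 corresponds to 100% efficiency
r2 = np.corrcoef(np.log10(copies), ct)[0, 1] ** 2

print(f"slope = {slope:.3f}, efficiency = {efficiency:.1%}, R² = {r2:.4f}")
```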

Relevance: 30.00%

Publisher:

Abstract:

This dissertation describes an approach for developing a real-time simulation of working mobile vehicles based on multibody modeling. The use of multibody modeling allows a comprehensive description of the constrained motion of the mechanical systems involved and permits the equations of motion to be solved in real time. By carefully selecting the multibody formulation method to be used, it is possible to increase the accuracy of the multibody model while still solving the equations of motion in real time. In this study, a multibody procedure based on semi-recursive and augmented Lagrangian methods for real-time dynamic simulation applications is studied in detail. In the semi-recursive approach, a velocity transformation matrix is introduced to map the dependent coordinates into relative (joint) coordinates, which reduces the number of generalized coordinates. The augmented Lagrangian method is based on the use of global coordinates and, in that method, constraints are accounted for using an iterative process. A multibody system can be modelled using either rigid or flexible bodies. When using flexible bodies, the system can be described using a floating frame of reference formulation. In this method, the deformation modes needed can be obtained from a finite element model. As the finite element model typically involves a large number of degrees of freedom, a reduced set of deformation modes can be obtained by employing model order reduction methods such as Guyan reduction, the Craig-Bampton method and Krylov subspaces, as shown in this study. The constrained motion of working mobile vehicles is actuated by the forces from the hydraulic actuators. In this study, the hydraulic system is modeled using lumped fluid theory, in which the hydraulic circuit is divided into volumes. In this approach, the pressure wave propagation in the hoses and pipes is neglected. Contact modeling is divided into two stages: contact detection and contact response. Contact detection determines when and where contact occurs, and contact response provides the force acting at the collision point. The friction between tire and ground is modelled using the LuGre friction model, which describes the frictional force between two surfaces. Typically, the equations of motion are solved in full-matrix form, where the sparsity of the matrices is not exploited. Increasing the number of bodies and constraint equations leads to system matrices that are large and sparse in structure. To increase the computational efficiency, a technique for the solution of sparse matrices is proposed in this dissertation and its implementation is demonstrated. To assess the computing efficiency, the augmented Lagrangian and semi-recursive methods are implemented employing the sparse matrix technique. The results of the numerical example show that the proposed approach is applicable and produces appropriate results within the real-time period.
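
The dissertation's own sparse-matrix formulation is not reproduced here. As a minimal sketch of the underlying idea, the example below assembles a generic constrained equation of motion in saddle-point form and solves it with SciPy's sparse LU solver instead of a dense solve; the matrices M, J, Q and gamma are random placeholders standing in for a real vehicle model.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Illustrative sizes only; a real multibody model supplies M, J, Q and gamma.
n, m = 200, 40                                      # generalized coordinates, constraints
rng = np.random.default_rng(0)

M = sp.diags(rng.uniform(1.0, 10.0, n))             # lumped (diagonal) mass matrix
J = sp.random(m, n, density=0.05, random_state=0)   # sparse constraint Jacobian
Q = rng.standard_normal(n)                          # applied / velocity-dependent forces
gamma = rng.standard_normal(m)                      # rhs of the acceleration-level constraints

# Assemble the sparse saddle-point system
#   [ M  J^T ] [qdd]   [ Q     ]
#   [ J   0  ] [lam] = [ gamma ]
K = sp.bmat([[M, J.T], [J, None]], format="csc")
rhs = np.concatenate([Q, gamma])

x = spsolve(K, rhs)                                 # sparse factorization instead of a dense solve
qdd, lam = x[:n], x[n:]
print(qdd[:3], lam[:3])
```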

Relevance: 30.00%

Publisher:

Abstract:

Aims: All members of the ruminal Butyrivibrio group convert linoleic acid (cis-9,cis-12-18 : 2) via conjugated 18 : 2 metabolites (mainly cis-9,trans-11-18 : 2, conjugated linoleic acid) to vaccenic acid (trans-11-18 : 1), but only members of a small branch of this heterogeneous group, which includes Clostridium proteoclasticum, further reduce vaccenic acid to stearic acid (18 : 0, SA). The aims of this study were to develop a real-time polymerase chain reaction (PCR) assay that would detect and quantify these key SA producers and to use this method to detect diet-associated changes in their populations in ruminal digesta of lactating cows. Methods and Results: The use of primers targeting the 16S rRNA gene of Cl. proteoclasticum was not sufficiently specific when only binding dyes were used for detection in real-time PCR, because their sequences were too similar to those of some nonproducing strains. A molecular beacon probe was therefore designed to detect and quantify the 16S rRNA genes of the Cl. proteoclasticum subgroup specifically. The probe was characterized by its melting curve and validated using five SA-producing and ten nonproducing Butyrivibrio-like strains and 13 other common ruminal bacteria. Analysis of ruminal digesta collected from dairy cows fed different proportions of starch and fibre indicated a Cl. proteoclasticum population of 2-9% of the eubacterial community. The influence of diet on the numbers of these bacteria was smaller than the variation between individual cows. Conclusion: A molecular beacon approach in qPCR enables the detection of Cl. proteoclasticum in ruminal digesta. Their numbers are highly variable between individual animals. Significance and Impact of the Study: SA producers are fundamental to the flow of polyunsaturated fatty acids and vaccenic acid from the rumen. The method described here enabled preliminary information to be obtained about the size of this population. Further application of the method to digesta samples from cows fed diets of more variable composition should enable us to understand how to control these bacteria in order to enhance the nutritional characteristics of ruminant-derived foods, including milk and beef.
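
The study's own calibration is not reproduced here. Purely as an illustration of expressing a target population as a percentage of the eubacterial community, the sketch below uses a ΔCt calculation with invented Ct values and assumes both assays amplify with (near-)identical efficiency.

```python
# Illustrative only: express a target 16S rRNA gene count as a percentage of the total
# eubacterial 16S rRNA gene count from quantification cycle (Ct) values, assuming both
# assays have the same efficiency E (E = 1.0 means perfect doubling per cycle).
def percent_of_community(ct_target, ct_total, efficiency=1.0):
    base = 1.0 + efficiency
    return 100.0 * base ** (ct_total - ct_target)

# Hypothetical Ct values for one digesta sample.
print(f"{percent_of_community(ct_target=24.6, ct_total=19.8):.1f}% of the eubacterial community")
```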

Relevance: 30.00%

Publisher:

Abstract:

Mirror lightpipes are useful for providing healthy and energy-efficient daylight in buildings where windows and skylights are unsuitable, insufficient or generate too much heat gain. Lightpipes have been installed in dozens of buildings in the UK. Field monitoring has been carried out to assess their performance in four different buildings: the headquarters of a major insurance company, a health clinic, a residential building and a college dining hall. In those cases where lightpipes with moderate aspect ratios were installed, good illuminance of up to 450 lux was obtained, with internal/external illuminance ratios of around 1%. When long and narrow lightpipes with many bends were used, however, the ratio fell to around 0.1%. These results show that lightpipes can be effective daylighting devices provided that excessive aspect ratios and numbers of bends are avoided. Lightpipes with larger diameters should be used whenever possible. The lightpipes often significantly improved the visual quality of the interior environment, and high user satisfaction was found even in buildings where a relatively low level of daylight was admitted through the lightpipes.
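
As a small numerical note on the internal/external illuminance ratio quoted above, the figures below are invented (they are not the monitored data) and only show how such a ratio is computed.

```python
# Illustrative only: the internal/external illuminance ratio is the interior illuminance
# divided by the simultaneous exterior illuminance.
def illuminance_ratio(internal_lux, external_lux):
    return internal_lux / external_lux

external = 45_000                                   # assumed overcast-sky exterior illuminance (lux)
print(f"{illuminance_ratio(450, external):.1%}")    # a short, wide lightpipe: ~1%
print(f"{illuminance_ratio(45, external):.2%}")     # a long, narrow pipe with several bends: ~0.1%
```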

Relevance: 30.00%

Publisher:

Abstract:

The reduction of portfolio risk is important to all investors but is particularly important to real estate investors, as most property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market, or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of under-performance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk is diminished with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
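
A minimal Monte Carlo sketch in the spirit of this approach (the property returns are synthetic, not the IPD data, and the 2% margin is arbitrary): sample equally weighted random portfolios of increasing size and compare the chances of under- and out-performing the universe mean by that margin.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical universe of individual property annual returns (mean 9%, sd 12%).
universe = rng.normal(0.09, 0.12, size=1_728)
benchmark = universe.mean()
n_sims = 10_000

for size in (1, 5, 10, 50, 200):
    # Equally weighted random portfolios of `size` properties.
    picks = rng.integers(0, universe.size, size=(n_sims, size))
    port = universe[picks].mean(axis=1)
    downside = (port < benchmark - 0.02).mean()   # under-perform the benchmark by more than 2%
    upside = (port > benchmark + 0.02).mean()     # out-perform the benchmark by more than 2%
    print(f"{size:>4} properties: P(under by >2%) = {downside:.2f}, P(out by >2%) = {upside:.2f}")
```

As the portfolio grows, both tail probabilities shrink, which is the down-side/up-side trade-off described above.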

Relevance: 30.00%

Publisher:

Abstract:

This paper examines the short- and long-term persistence of tax-exempt real estate funds in the UK through the use of winner-loser contingency table methodology. The persistence tests are applied to a database of varying numbers of funds, from a low of 16 to a high of 27, using quarterly returns over the 12 years from 1990 Q1 to 2001 Q4. The overall conclusion is that real estate funds in the UK show little evidence of persistence in the short term (quarterly and semi-annual data) or over considerable lengths of time (bi-annual to six-yearly intervals). In contrast, the results are better for annual data, with evidence of significant performance persistence. Thus, at this stage, it seems that an annual evaluation period provides the best discrimination of the winner and loser phenomenon in the real estate market. This result is different from equity and bond studies, where it seems that the repeat-winner phenomenon is stronger over shorter periods of evaluation. These results require careful interpretation, however, as they show, first, that when only small samples are used significant adjustments must be made to correct for small-sample bias and, second, that the conclusions are sensitive to the length of the evaluation period and the specific test used. Nonetheless, it seems that persistence in the performance of real estate funds in the UK does exist, at least for the annual data, and it appears to be a guide to beating the pack in the long run. Furthermore, although the evidence of persistence in performance for the overall sample of funds is limited, we have found evidence that two funds were consistent winners over this period, whereas no one fund could be said to be a consistent loser.
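
The paper's exact test statistics and small-sample corrections are not reproduced here. The sketch below shows one standard winner-loser contingency-table test, the log cross-product (odds) ratio with an approximately normal Z statistic, applied to hypothetical counts.

```python
import numpy as np

def persistence_z(ww, wl, lw, ll):
    """Log cross-product (odds) ratio test for repeat winners/losers.

    ww: funds that are winners in period t and winners again in t+1, and so on.
    Returns the odds ratio and an approximately standard-normal Z statistic.
    """
    odds_ratio = (ww * ll) / (wl * lw)
    se = np.sqrt(1/ww + 1/wl + 1/lw + 1/ll)
    return odds_ratio, np.log(odds_ratio) / se

# Hypothetical annual counts for a small sample of funds.
print(persistence_z(ww=9, wl=4, lw=4, ll=9))
```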

Relevance: 30.00%

Publisher:

Abstract:

The IPD Annual Index is the largest and most comprehensive real estate market index available in the UK. Such coverage, however, inevitably leads to delays in publication. In contrast, there are a number of quarterly and monthly indices which are published within days of the year end but which lack the coverage in terms of size and numbers of properties. This paper analyses these smaller but more timely indices to see whether they can be used to predict the performance of the IPD Annual Index. Using a number of measures of forecasting accuracy, it is shown that the smaller indices provide unbiased and efficient predictions of the IPD Annual Index. Such indices also significantly outperform a naive no-change model, although no one index performs significantly better than the others. The more timely indices, however, do not perfectly track the IPD Annual Index. As a result, any short-run predictions of performance will be subject to a degree of error. Nevertheless, the more timely indices, although lacking authoritative coverage, provide a valuable service to investors, giving good estimates of real estate performance well before the publication of the IPD Annual Index.
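
The paper's data are not reproduced; the sketch below only illustrates the kind of accuracy measures such a comparison typically relies on: error measures, a Theil-style ratio against a naive no-change forecast, and a Mincer-Zarnowitz-type unbiasedness regression, applied to invented numbers.

```python
import numpy as np

actual   = np.array([10.2, 3.5, -5.6, 11.8, 18.3, 6.7])   # annual index returns (%), illustrative
forecast = np.array([ 9.8, 4.1, -4.9, 12.5, 17.6, 7.2])   # prediction from a more timely index

err = forecast - actual
mae = np.mean(np.abs(err))
rmse = np.sqrt(np.mean(err ** 2))

# Naive no-change model: this year's return equals last year's actual return.
naive_err = actual[1:] - actual[:-1]
theil_u = np.sqrt(np.mean(err[1:] ** 2)) / np.sqrt(np.mean(naive_err ** 2))

# Mincer-Zarnowitz style unbiasedness check: regress actual on forecast;
# an unbiased, efficient forecast has intercept a close to 0 and slope b close to 1.
b, a = np.polyfit(forecast, actual, 1)

print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  Theil U vs naive={theil_u:.2f}  a={a:.2f}  b={b:.2f}")
```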

Relevance: 30.00%

Publisher:

Abstract:

Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid's infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
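
Neither the MPC nor the SRHC formulation is reproduced here. The sketch below only illustrates the perfect-forecast benchmark idea: bisect on a demand threshold and apply a greedy charge-below / discharge-above policy subject to assumed power and energy limits; the demand profile and device ratings are invented.

```python
import numpy as np

def shave(demand, threshold, p_max, e_max, dt=0.5):
    """Greedy charge-below / discharge-above-threshold policy.
    Returns the net demand profile, or None if the battery cannot hold the threshold."""
    soc, net = 0.0, []
    for d in demand:
        if d > threshold:                              # discharge to hold the threshold
            p = min(p_max, d - threshold, soc / dt)
            if d - p > threshold + 1e-9:
                return None                            # infeasible at this threshold
            soc -= p * dt
        else:                                          # recharge while demand is low
            p = -min(p_max, threshold - d, (e_max - soc) / dt)
            soc -= p * dt
        net.append(d - p)
    return np.array(net)

def min_peak(demand, p_max, e_max, iters=40):
    lo, hi = 0.0, max(demand)
    for _ in range(iters):                             # bisection on the achievable peak
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if shave(demand, mid, p_max, e_max) is not None else (mid, hi)
    return hi

demand = np.array([20, 22, 25, 30, 42, 55, 48, 35, 28, 24])   # half-hourly kW, illustrative
print(f"peak without storage: {demand.max():.1f} kW, "
      f"with storage: {min_peak(demand, p_max=10, e_max=15):.1f} kW")
```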

Relevance: 30.00%

Publisher:

Abstract:

In this paper, we develop a novel constrained recursive least squares algorithm for adaptively combining a set of given multiple models. With data available in an online fashion, the linear combination coefficients of the submodels are adapted via the proposed algorithm. We propose to minimize the mean square error with a forgetting factor and apply a sum-to-one constraint to the combination parameters. Moreover, an l1-norm constraint on the combination parameters is also applied, with the aim of achieving sparsity over the multiple models so that only a subset of models may be selected into the final model. A weighted l2-norm is then applied as an approximation to the l1-norm term. As such, at each time step a closed-form solution for the model combination parameters is available. The contribution of this paper is to derive the proposed constrained recursive least squares algorithm, which is computationally efficient by exploiting matrix theory. The effectiveness of the approach has been demonstrated using both simulated and real time series examples.
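
The paper's full algorithm, including the l1 / weighted-l2 sparsity term, is not reproduced here. The sketch below combines a standard RLS recursion with a forgetting factor and the closed-form correction that enforces the sum-to-one constraint at each step; the class name and toy data are illustrative.

```python
import numpy as np

class SumToOneRLS:
    """RLS with forgetting factor; combination weights constrained to sum to one.

    The sparsity-inducing l1 / weighted-l2 term of the paper is not included here.
    """
    def __init__(self, n_models, forgetting=0.98, delta=100.0):
        self.lam = forgetting
        self.P = delta * np.eye(n_models)               # inverse (weighted) Gram matrix
        self.w_u = np.full(n_models, 1.0 / n_models)    # unconstrained estimate

    def update(self, x, y):
        """x: vector of sub-model predictions, y: observed target."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)                    # gain vector
        self.w_u = self.w_u + k * (y - x @ self.w_u)
        self.P = (self.P - np.outer(k, Px)) / self.lam

        # Closed-form correction enforcing sum(w) == 1 at each step.
        ones = np.ones_like(self.w_u)
        Pe = self.P @ ones
        return self.w_u + Pe * (1.0 - ones @ self.w_u) / (ones @ Pe)

rls = SumToOneRLS(n_models=3)
rng = np.random.default_rng(2)
true_w = np.array([0.2, 0.5, 0.3])
for _ in range(500):
    x = rng.standard_normal(3)
    y = true_w @ x + 0.01 * rng.standard_normal()
    w = rls.update(x, y)
print(np.round(w, 3))                                   # close to [0.2, 0.5, 0.3]
```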

Relevance: 30.00%

Publisher:

Abstract:

In this paper we describe our system for automatically extracting "correct" programs from proofs using a development of the Curry-Howard process. Although program extraction has been developed by many authors, our system has a number of novel features designed to make it very easy to use and as close as possible to ordinary mathematical terminology and practice. These features include: 1. the use of Henkin's technique to reduce higher-order logic to many-sorted (first-order) logic; 2. the free use of new rules for induction subject to certain conditions; 3. the extensive use of previously programmed (total, recursive) functions; 4. the use of templates to make the reasoning much closer to normal mathematical proofs; and 5. a conceptual distinction between the computational type theory (for representing programs) and the logical type theory (for reasoning about programs). As an example of our system, we give a constructive proof of the well-known theorem that every graph of even parity which is non-trivial, in the sense that it does not consist only of isolated vertices, has a cycle. Given such a graph as input, the extracted program produces a cycle as promised.
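
The extracted program itself is not shown in the abstract. The sketch below is only a hand-written rendering of the constructive argument (every vertex reached by the walk has even, hence at least 2, degree, so the walk never gets stuck and a vertex must eventually repeat), assuming a simple adjacency-list representation.

```python
def find_cycle(adj):
    """adj: dict mapping each vertex to a list of neighbours; every degree is even
    and at least one vertex is not isolated. Returns a cycle as a list of vertices."""
    start = next(v for v, nbrs in adj.items() if nbrs)   # a non-isolated vertex
    path, seen = [start], {start: 0}
    prev = None
    while True:
        here = path[-1]
        # Even degree >= 2, so there is always a neighbour other than the one we came from.
        nxt = next(n for n in adj[here] if n != prev)
        if nxt in seen:                                  # first repeated vertex closes a cycle
            return path[seen[nxt]:] + [nxt]
        seen[nxt] = len(path)
        path.append(nxt)
        prev = here

# Example: a triangle plus an isolated vertex (all degrees are even).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: []}
print(find_cycle(graph))     # [0, 1, 2, 0]
```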

Relevance: 30.00%

Publisher:

Abstract:

A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model based on the use of grey numbers is presented. Using the grey numbers technique, the uncertainty is characterized by an interval; once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated in such a way that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values and is at the same time as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced by using a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual grey rainfall-runoff model is calibrated, and the uncertainty bands obtained downstream of both the calibration and the validation processes are compared with those obtained using a well-established approach for characterizing uncertainty, such as GLUE. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
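
The paper's conceptual rainfall-runoff model is not reproduced. The sketch below only illustrates how grey (interval) arithmetic turns a crisp simulation step into an interval one, using a single linear-reservoir step with an invented grey recession parameter.

```python
# Minimal grey-number (interval) arithmetic and a single linear-reservoir step,
# Q = k * S, with the recession parameter k expressed as a grey number [k_lo, k_hi].
class Grey:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Grey(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo:.2f}, {self.hi:.2f}]"

k = Grey(0.08, 0.12)          # grey recession parameter (1/h), illustrative
storage = Grey(45.0, 55.0)    # grey storage state (mm)
rain = Grey(3.0, 3.0)         # crisp rainfall treated as a degenerate interval

discharge = k * storage                                  # grey simulated discharge (mm/h)
storage = storage + rain + (Grey(-1, -1) * discharge)    # grey water-balance update
print("Q =", discharge, " S =", storage)
```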

Relevance: 30.00%

Publisher:

Abstract:

This paper uses dynamic programming to study the time consistency of optimal macroeconomic policy in economies with recurring public deficits. To this end, a general equilibrium recursive model introduced in Chang (1998) is extended to include government bonds and production. The original model presents a Sidrauski economy with money and transfers only, implying that the need for government financing through the inflation tax is minimal. The extended model introduces government expenditures and a deficit-financing scheme, analyzing the Sargent-Wallace (1981) problem: recurring deficits may lead the government to default on part of its public debt through inflation. The methodology allows for the computation of the set of all sustainable stabilization plans even when the government cannot pre-commit to an optimal inflation path. This is done through value function iterations, which can be performed on a computer. The parameters of the extended model are calibrated with Brazilian data, using as case studies three Brazilian stabilization attempts: the Cruzado (1986), Collor (1990) and Real (1994) plans. The calibration of the parameters of the extended model is straightforward, but its numerical solution proves infeasible due to a dimensionality problem in the algorithm arising from limitations of available computer technology. However, a numerical solution using the original algorithm and some calibrated parameters is obtained. The results indicate that, in the absence of government bonds or production, only the Real Plan is sustainable in the long run. The numerical solution of the extended algorithm is left for future research.
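
Chang's recursive monetary model and the extended model are not reproduced here. As a minimal sketch of the computational tool the paper relies on, the example below runs value function iteration on a grid for a generic deterministic growth problem; all parameter values are illustrative.

```python
import numpy as np

# Generic value function iteration on a grid; this is only meant to illustrate the
# computational approach, not the monetary model with bonds and production.
beta, alpha = 0.96, 0.36
k_grid = np.linspace(0.05, 0.5, 200)

# One-period utility for every (k, k') pair; infeasible choices get -inf.
c = k_grid[:, None] ** alpha - k_grid[None, :]
util = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros_like(k_grid)
for _ in range(2_000):                          # Bellman iteration until convergence
    V_new = np.max(util + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = k_grid[np.argmax(util + beta * V[None, :], axis=1)]
print(policy[:5])                               # optimal next-period capital at the lowest grid points
```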

Relevance: 30.00%

Publisher:

Abstract:

Several mobile robots show non-linear behavior, mainly due to friction phenomena between the mechanical parts of the robot or between the robot and the ground. Linear models are efficient in some cases, but it is necessary to take the robot's non-linearity into consideration when precise displacement and positioning are desired. In this work, a parametric model identification procedure for a mobile robot with differential drive that considers the dead-zone in the robot's actuators is proposed. The method consists in dividing the system into Hammerstein subsystems and then using the key-term separation principle to express the input-output relations, which expose the parameters of both the linear and non-linear blocks. The parameters are then simultaneously estimated through a recursive least squares algorithm. The results show that it is possible to identify the dead-zone thresholds together with the linear parameters.
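
The dead-zone regressor built by key-term separation is specific to the paper and is not reproduced here. The sketch below only shows the recursive least squares recursion that such an identification relies on, applied to a generic linear-in-the-parameters regression with invented data.

```python
import numpy as np

def rls_step(theta, P, phi, y, forgetting=1.0):
    """One recursive least squares update.

    theta: current parameter estimate, P: covariance-like matrix,
    phi: regressor vector (in the paper, built via key-term separation so that the
    linear-block and dead-zone parameters appear linearly), y: measured output.
    """
    Pphi = P @ phi
    k = Pphi / (forgetting + phi @ Pphi)          # gain
    theta = theta + k * (y - phi @ theta)         # parameter update
    P = (P - np.outer(k, Pphi)) / forgetting      # covariance update
    return theta, P

# Toy usage on a generic linear regression (illustrative only).
rng = np.random.default_rng(3)
true_theta = np.array([0.8, -0.3, 1.5])
theta, P = np.zeros(3), 1e3 * np.eye(3)
for _ in range(300):
    phi = rng.standard_normal(3)
    y = true_theta @ phi + 0.01 * rng.standard_normal()
    theta, P = rls_step(theta, P, phi, y)
print(np.round(theta, 3))       # approaches [0.8, -0.3, 1.5]
```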

Relevance: 30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Publisher:

Abstract:

We study the use of para-orthogonal polynomials in solving the frequency analysis problem. Through a transformation of Delsarte and Genin, we present an approach to frequency analysis that uses the zeros and Christoffel numbers of polynomials orthogonal on the real line. This leads to a simple and fast algorithm for the estimation of frequencies. We also provide a new method, faster than the Levinson algorithm, for determining the reflection coefficients of the corresponding real Szego polynomials from the given moments.
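
The paper's faster Delsarte-Genin-based method is not reproduced here. For reference, the sketch below implements the classical Levinson-Durbin recursion it is compared against, which obtains reflection coefficients from a given sequence of moments (autocorrelations); the moment sequence is invented.

```python
import numpy as np

def levinson_reflection_coefficients(r):
    """Classical Levinson-Durbin recursion.

    r: moments (autocorrelations) r[0..n]; returns the reflection coefficients
    k[1..n] associated with the corresponding real predictor polynomials.
    """
    n = len(r) - 1
    a = np.zeros(n + 1)
    a[0] = 1.0
    err = r[0]
    ks = []
    for m in range(1, n + 1):
        acc = r[m] + a[1:m] @ r[1:m][::-1]
        k = -acc / err
        a[1:m + 1] = a[1:m + 1] + k * a[m - 1::-1]   # update predictor coefficients
        err *= (1.0 - k * k)
        ks.append(k)
    return ks

# Moments of a single sinusoid plus a white-noise floor at lag 0 (illustrative).
w = 0.9
r = np.array([1.2] + [np.cos(l * w) for l in range(1, 5)])
print(np.round(levinson_reflection_coefficients(r), 3))
```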