890 results for real option analysis
Abstract:
Aims: All members of the ruminal Butyrivibrio group convert linoleic acid (cis-9,cis-12-18:2) via conjugated 18:2 metabolites (mainly cis-9,trans-11-18:2, conjugated linoleic acid) to vaccenic acid (trans-11-18:1), but only members of a small branch of this heterogeneous group, which includes Clostridium proteoclasticum, further reduce vaccenic acid to stearic acid (18:0, SA). The aims of this study were to develop a real-time polymerase chain reaction (PCR) assay that would detect and quantify these key SA producers and to use this method to detect diet-associated changes in their populations in ruminal digesta of lactating cows. Methods and Results: Primers targeting the 16S rRNA gene of Cl. proteoclasticum were not sufficiently specific when only DNA-binding dyes were used for detection in real-time PCR, because their sequences were too similar to those of some non-producing strains. A molecular beacon probe was therefore designed to detect and quantify the 16S rRNA genes of the Cl. proteoclasticum subgroup specifically. The probe was characterized by its melting curve and validated using five SA-producing and ten non-producing Butyrivibrio-like strains and 13 other common ruminal bacteria. Analysis of ruminal digesta collected from dairy cows fed different proportions of starch and fibre indicated a Cl. proteoclasticum population of 2-9% of the eubacterial community. The influence of diet on the numbers of these bacteria was less than the variation between individual cows. Conclusions: A molecular beacon approach in real-time PCR enables the detection of Cl. proteoclasticum in ruminal digesta. Their numbers are highly variable between individual animals. Significance and Impact of the Study: SA producers are fundamental to the flow of polyunsaturated fatty acids and vaccenic acid from the rumen. The method described here enabled preliminary information to be obtained about the size of this population. Further application of the method to digesta samples from cows fed diets of more variable composition should enable us to understand how to control these bacteria in order to enhance the nutritional characteristics of ruminant-derived foods, including milk and beef.
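As a rough illustration of how a target group can be expressed as a fraction of the total eubacterial community from real-time PCR data, the sketch below uses a simple delta-Ct relative quantification with equal amplification efficiencies assumed. The function name and Ct values are hypothetical; the study itself may have used standard curves rather than this shortcut.

```python
# Illustrative only: relative abundance of a target 16S rRNA gene versus total
# eubacterial 16S rRNA genes, from real-time PCR threshold cycles (Ct).
# Assumes equal (100%) amplification efficiencies for both assays; treat this
# as a sketch of the idea, not the paper's quantification method.

def relative_abundance(ct_target: float, ct_total: float, efficiency: float = 2.0) -> float:
    """Fraction of the total 16S rRNA gene pool attributed to the target group."""
    delta_ct = ct_target - ct_total        # a later Ct means fewer target copies
    return efficiency ** (-delta_ct)       # e.g. 2**-(delta Ct) for perfect doubling

if __name__ == "__main__":
    # Hypothetical Ct values for one digesta sample
    frac = relative_abundance(ct_target=24.8, ct_total=20.1)
    print(f"Cl. proteoclasticum subgroup ~ {frac * 100:.1f}% of eubacterial 16S rRNA genes")
```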
Abstract:
Background: Hexaploid wheat is one of the most important cereal crops for human nutrition. Molecular understanding of the biology of the developing grain will assist the improvement of yield and quality traits for different environments. High-quality transcriptomics is a powerful method to increase this understanding. Results: The transcriptome of developing caryopses from hexaploid wheat (Triticum aestivum, cv. Hereward) was determined using Affymetrix wheat GeneChip® oligonucleotide arrays, which have probes for 55,052 transcripts. Of these, 14,550 showed significant differential regulation in the period between 6 and 42 days after anthesis (daa). Large changes in transcript abundance were observed, which were categorised into distinct phases of differentiation (6-10 daa), grain fill (12-21 daa) and desiccation/maturation (28-42 daa) and were associated with specific tissues and processes. A similar experiment on developing caryopses grown with dry and/or hot environmental treatments was also analysed; the profiles established in the first experiment showed that most effects of the environmental treatments on transcription were due to acceleration of development, but that a few transcripts were specifically affected. Transcript abundance profiles in both experiments for nine selected known and putative wheat transcription factors were independently confirmed by real-time RT-PCR. These expression profiles confirm or extend our knowledge of the roles of the known transcription factors and suggest roles for the unknown ones. Conclusion: This transcriptome data set will provide a valuable resource for molecular studies on wheat grain. We have demonstrated how it can be used to distinguish general developmental shifts from specific effects of treatments on gene expression and to diagnose the probable tissue specificity and roles of transcription factors.
Abstract:
Bovine tuberculosis (TB) is an important economic disease. Badgers (Meles meles) are the wildlife source implicated in many cattle outbreaks of TB in Britain, and extensive badger control is a controversial option to reduce the disease. A badger and cattle population model was developed, simulating TB epidemiology; badger ecology, including post-cull social perturbation; and TB-related farm management. An economic cost-benefit module was integrated into the model to assess whether badger control offers economic benefits. Model results strongly indicate that, although extensive badger culling could reduce TB rates in cattle if perturbation were restricted, an economic loss would overall be more likely than a benefit. Perturbation of the badger population was a key factor determining the success or failure of control. The model highlighted some important knowledge gaps regarding both the spatial and temporal characteristics of perturbation that warrant further research.
Abstract:
It is now possible to assay a large number of genetic markers from patients in clinical trials in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems that arise from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and each individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We then extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible, as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs.
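A minimal sketch of the fixed-sample-size version of such an omnibus test is given below. It assumes a continuous response, ordinary least-squares F-tests for the treatment-by-SNP interaction, a max-type combination of the per-SNP F-statistics, and permutation of treatment labels for the global p-value; the combination rule, data layout and variable names are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np

# Sketch of an omnibus gene-drug interaction test (fixed sample size), assuming:
#   y          - continuous response, shape (n,)
#   treatment  - 0/1 treatment indicator, shape (n,)
#   snps       - genotype dosages coded 0/1/2, shape (n, n_snps)
# Per-SNP interaction F-statistics are combined by their maximum, and the global
# p-value comes from permuting treatment labels, which preserves the correlation
# structure between SNPs. Illustrative reconstruction, not the authors' exact test.

def interaction_f(y, treatment, snp):
    """F-statistic comparing y ~ 1 + trt + snp against y ~ 1 + trt + snp + trt*snp."""
    n = len(y)
    X0 = np.column_stack([np.ones(n), treatment, snp])
    X1 = np.column_stack([X0, treatment * snp])
    rss0 = np.sum((y - X0 @ np.linalg.lstsq(X0, y, rcond=None)[0]) ** 2)
    rss1 = np.sum((y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]) ** 2)
    df1, df2 = 1, n - X1.shape[1]
    return (rss0 - rss1) / df1 / (rss1 / df2)

def omnibus_statistic(y, treatment, snps):
    """Max-type combination of the per-SNP interaction F-statistics."""
    return max(interaction_f(y, treatment, snps[:, j]) for j in range(snps.shape[1]))

def permutation_p_value(y, treatment, snps, n_perm=1000, seed=0):
    """Global p-value from permuting treatment labels across the whole SNP panel."""
    rng = np.random.default_rng(seed)
    observed = omnibus_statistic(y, treatment, snps)
    exceed = sum(
        omnibus_statistic(y, rng.permutation(treatment), snps) >= observed
        for _ in range(n_perm)
    )
    return (exceed + 1) / (n_perm + 1)
```

The sequential extension described in the abstract would apply the same permutation machinery at interim looks to derive stopping boundaries; that part is not sketched here.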
Abstract:
This article introduces a quantitative approach to e-commerce system evaluation based on the theory of process simulation. The general concept of e-commerce system simulation is presented in response to limitations of current e-commerce system development, such as the large initial investment of time and money and the long period from business planning through system development, testing and operation to eventual return; in other words, currently used system analysis and development methods cannot tell investors how well their e-commerce system is likely to perform, what return on investment they can expect, or which areas of the initial business plan they should improve. In order to examine the value and potential effects of an e-commerce business plan, a quantitative evaluation approach is necessary, and the authors of this article believe that process simulation is an appropriate option. The overall objective of this article is to apply the theory of process simulation to e-commerce system evaluation, which the authors achieve through an experimental study of a business plan for an online construction and demolition waste exchange. The methodologies adopted in this article include literature review, system analysis and development, simulation modelling and analysis, and a case study. The results from this article include the concept of e-commerce system simulation, a comprehensive review of simulation methods adopted in e-commerce system evaluation, and a real case study applying simulation to e-commerce system evaluation. Furthermore, the authors hope that the adoption and implementation of the process simulation approach can effectively support business decision-making and improve the efficiency of e-commerce systems.
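To illustrate the general idea of evaluating a business plan through process simulation before any system is built, the sketch below uses the SimPy discrete-event simulation library to model a hypothetical online waste exchange: sellers post listings, buyers take them, and the simulation reports throughput and time-to-match. All rates, durations and process names are invented for illustration; this is not the model from the article.

```python
import random
import simpy

# Minimal process-simulation sketch (not the article's model): sellers post
# construction & demolition waste listings to a hypothetical online exchange and
# buyers arrive to take them; we track how long listings wait before being matched.
# Arrival rates and the exchange structure are illustrative assumptions.

RANDOM_SEED = 42
SIM_DAYS = 90

def seller(env, listings):
    """Generate new waste listings with exponentially distributed inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(1 / 2.0))   # on average one listing every 2 days
        listings.append(env.now)

def buyer(env, listings, match_delays):
    """Buyers arrive and take the oldest open listing, if any."""
    while True:
        yield env.timeout(random.expovariate(1 / 3.0))   # on average one buyer every 3 days
        if listings:
            posted = listings.pop(0)
            match_delays.append(env.now - posted)

random.seed(RANDOM_SEED)
env = simpy.Environment()
open_listings, delays = [], []
env.process(seller(env, open_listings))
env.process(buyer(env, open_listings, delays))
env.run(until=SIM_DAYS)

matched = len(delays)
print(f"Listings matched in {SIM_DAYS} days: {matched}")
if delays:
    print(f"Mean days from posting to match: {sum(delays) / matched:.1f}")
```

Outputs of this kind (throughput, waiting time, unmatched stock) are the sort of quantities a process-simulation evaluation can put in front of investors before committing to full system development.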
Abstract:
A wind catcher/tower natural ventilation system was installed in a seminar room in the building of the School of Construction Management and Engineering, University of Reading, UK. Performance was analysed by means of tracer-gas ventilation measurements, indoor climate measurements (temperature, humidity, CO2) and occupant surveys. In addition, the potential of simple design tools was evaluated by comparing observed ventilation results with those predicted by an explicit ventilation model and the AIDA implicit ventilation model. To support this analysis, external climate parameters (wind speed and direction, solar radiation, external temperature and humidity) were also monitored. The results showed that the chosen ventilation design provided a substantially greater ventilation rate than an equivalent area of openable window. Air quality parameters also stayed within accepted norms, while occupants expressed general satisfaction with the system and with comfort conditions. Night cooling was maximised by using the system in combination with openable windows. Comparison of the calculations with the ventilation rate measurements showed that, while AIDA results correlated reasonably well with the monitored performance, the widely used industry explicit model overestimated the monitored ventilation rate.
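The abstract does not reproduce the explicit model's equations. As a rough illustration of the kind of explicit, single-opening estimate such design tools make, the sketch below applies the standard orifice-flow relation for wind-driven flow; the discharge coefficient, pressure-coefficient difference, opening area and room volume are hypothetical, and this is neither the specific industry explicit model nor AIDA.

```python
import math

# Rough illustration of an explicit, wind-driven ventilation estimate using the
# standard orifice-flow relation; NOT the industry explicit model or the AIDA
# implicit model compared in the study. All parameter values are hypothetical.

def wind_driven_flow(area_m2: float, wind_speed_ms: float, delta_cp: float,
                     discharge_coeff: float = 0.61) -> float:
    """Volume flow rate (m^3/s) through an opening driven by a wind pressure difference.

    dp = 0.5 * rho * U^2 * dCp and Q = Cd * A * sqrt(2 * dp / rho)
    combine to Q = Cd * A * U * sqrt(dCp).
    """
    return discharge_coeff * area_m2 * wind_speed_ms * math.sqrt(delta_cp)

def air_changes_per_hour(flow_m3s: float, room_volume_m3: float) -> float:
    """Convert a volume flow rate into air changes per hour for a given room."""
    return flow_m3s * 3600.0 / room_volume_m3

if __name__ == "__main__":
    q = wind_driven_flow(area_m2=0.25, wind_speed_ms=3.0, delta_cp=0.4)
    print(f"Flow: {q:.3f} m^3/s")
    print(f"ACH for a 150 m^3 seminar room: {air_changes_per_hour(q, 150.0):.1f}")
```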
Abstract:
The relationship between speed and crashes has been well established in the literature, with the consequence that speed reduction through enforcement or other means should lead to a reduction in crashes. The extent to which the public regard speeding as a problem that requires enforcement is less clear. Analysis was conducted on public perceptions of antisocial behaviors, including speeding traffic. The data were collected as part of the British Crime Survey, a face-to-face interview with UK residents on issues relating to crime. The antisocial behavior section required participants to state the degree to which they perceived 16 antisocial behaviors to be a problem in their area. Results revealed that speeding traffic was perceived as the greatest problem in local communities, regardless of whether respondents were male or female, or young, middle-aged, or old. The rating of speeding traffic as the greatest problem in the community was replicated in a second, smaller postal survey, in which respondents also provided strong support for enforcement on residential roads and indicated that traveling immediately above the speed limit on residential roads was unacceptable. Results are discussed in relation to practical implications for speed enforcement and the prioritization of limited police resources.
Abstract:
An increasing number of neuroscience experiments are using virtual reality to provide a more immersive and less artificial experimental environment. This is particularly useful for navigation and three-dimensional scene perception experiments. Such experiments require accurate real-time tracking of the observer's head in order to render the virtual scene. Here, we present data on the accuracy of a commonly used six-degrees-of-freedom tracker (Intersense IS900) when it is moved in ways typical of virtual reality applications. We compared the reported location of the tracker with its location computed by an optical tracking method. When the tracker was stationary, the root mean square error in spatial accuracy was 0.64 mm. However, we found that errors increased more than ten-fold (up to 17 mm) when the tracker moved at speeds common in virtual reality applications. We demonstrate that the errors we report here are predominantly due to inaccuracies of the IS900 system rather than of the optical tracking against which it was compared.
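A small sketch of the error metric described above is given below: the root-mean-square Euclidean distance between positions reported by the tracker under test and a time-aligned optical reference. The array layout, units and the synthetic data in the demo are assumptions; frame alignment between the two tracking systems is taken as already done.

```python
import numpy as np

# Sketch of the accuracy metric described above: RMS distance between positions
# reported by the tracker under test and a time-aligned optical reference.
# Assumes both are (n_samples, 3) arrays in millimetres, already expressed in a
# common coordinate frame.

def rms_position_error(tracker_xyz: np.ndarray, reference_xyz: np.ndarray) -> float:
    """RMS Euclidean distance (same units as the inputs) between paired samples."""
    distances = np.linalg.norm(tracker_xyz - reference_xyz, axis=1)
    return float(np.sqrt(np.mean(distances ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    reference = rng.uniform(-500, 500, size=(1000, 3))             # hypothetical head path (mm)
    tracker = reference + rng.normal(scale=1.0, size=(1000, 3))    # hypothetical per-axis noise
    print(f"RMS error: {rms_position_error(tracker, reference):.2f} mm")
```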
Abstract:
We introduce transreal analysis as a generalisation of real analysis. We find that the generalisation of the real exponential and logarithmic functions is well defined for all transreal numbers. Hence, we derive well defined values of all transreal powers of all non-negative transreal numbers. In particular, we find a well defined value for zero to the power of zero. We also note that the computation of products via the transreal logarithm is identical to the transreal product, as expected. We then generalise all of the common, real, trigonometric functions to transreal functions and show that transreal (sin x)/x is well defined everywhere. This raises the possibility that transreal analysis is total, in other words, that every function and every limit is everywhere well defined. If so, transreal analysis should be an adequate mathematical basis for analysing the perspex machine - a theoretical, super-Turing machine that operates on a total geometry. We go on to dispel all of the standard counter "proofs" that purport to show that division by zero is impossible. This is done simply by carrying the proof through in transreal arithmetic or transreal analysis. We find either that the supposed counter proof has no content or that it supports the contention that division by zero is possible. The supposed counter proofs rely on extending the standard systems in arbitrary and inconsistent ways and then showing, tautologously, that the chosen extensions are not consistent. This shows only that the chosen extensions are inconsistent and does not bear on the question of whether division by zero is logically possible. By contrast, transreal arithmetic is total and consistent, so it defeats any possible "straw man" argument. Finally, we show how to arrange that a function has finite or else unmeasurable (nullity) values, but no infinite values. This arithmetical arrangement might prove useful in mathematical physics because it outlaws naked singularities in all equations.
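For readers unfamiliar with the arithmetic the abstract appeals to, the sketch below illustrates transreal division following the convention published in the transreal arithmetic literature: k/0 is positive infinity for k > 0, negative infinity for k < 0, and 0/0 is the distinct, unordered number nullity. Representing nullity by a floating-point NaN is an implementation convenience here, not part of the theory, and this is not the paper's perspex-machine implementation.

```python
import math

# Small sketch of transreal division: k/0 = +inf for k > 0, k/0 = -inf for k < 0,
# and 0/0 = nullity (a distinct, unordered transreal number, represented here by NaN).
# Illustrates the arithmetic the abstract relies on; not the perspex machine itself.

NULLITY = float("nan")      # stand-in for the transreal nullity
INF = float("inf")

def transreal_div(a: float, b: float) -> float:
    if math.isnan(a) or math.isnan(b):
        return NULLITY                      # nullity propagates through every operation
    if b == 0.0:
        if a > 0.0:
            return INF
        if a < 0.0:
            return -INF
        return NULLITY                      # 0 / 0 = nullity
    return a / b                            # otherwise ordinary (trans)real division

if __name__ == "__main__":
    print(transreal_div(1.0, 0.0))     # inf
    print(transreal_div(-2.5, 0.0))    # -inf
    print(transreal_div(0.0, 0.0))     # nan, standing in for nullity
```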
Abstract:
The major technical objectives of the RC-NSPES are to provide a framework for the concurrent operation of reactive and proactive security functions, delivering efficient and optimised intrusion detection schemes as well as enhanced, highly correlated rule sets for more effective alert management and root-cause analysis. The design and implementation of the RC-NSPES solution includes a number of innovative features, both in the real-time programmable embedded hardware (FPGA) deployment and in the integrated management station. These have been devised to deliver enhanced detection of attacks and contextualised alerts against threats arising from both network-layer and application-layer protocols. The resulting architecture represents an efficient and effective framework for the future deployment of network security systems.
Abstract:
A desktop tool for the replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE™-like environments, creating a multiparty collaborative virtual space in which avatars are animated with 3D gaze as well as head and hand motions in real time. Log files are recorded for subsequent playback and analysis using the proposed software tool. During replay, the user can rotate the viewpoint and navigate in the simulated 3D scene. The playback mechanism relies on multiple distributed log files captured at every site. This structure enables an observer to experience the latencies of movement and information transfer for every site, which is important for conversation analysis. Playback uses an event-replay algorithm, modified to allow fast traversal of the scene by selective rendering of nodes and to simulate fast random access. The tool's analysis module can show each participant's 3D gaze points and areas where gaze has been concentrated.
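As a minimal illustration of the event-replay idea described above, the sketch below merges per-site log files into one time-ordered event stream and builds a coarse per-second index to approximate fast random access. The event fields, log format and index granularity are hypothetical assumptions, not the tool's actual schema, and selective rendering of scene nodes is omitted.

```python
import heapq
from dataclasses import dataclass
from typing import Dict, Iterable, List

# Illustrative sketch: events logged separately at each site are merged into one
# time-ordered stream, and a per-second index allows approximate random access for
# fast traversal. Field names and the log layout are hypothetical.

@dataclass(frozen=True)
class Event:
    timestamp: float     # seconds since session start, as recorded at the logging site
    site: str            # which linked environment recorded the event
    avatar: str
    kind: str            # e.g. "head", "hand", "gaze"
    payload: tuple       # position / orientation data

def merge_site_logs(site_logs: List[List[Event]]) -> List[Event]:
    """Merge per-site logs (each already time-ordered) into one ordered stream."""
    return list(heapq.merge(*site_logs, key=lambda e: e.timestamp))

def build_second_index(events: List[Event]) -> Dict[int, int]:
    """Map whole seconds to the first event index at or after that second."""
    index: Dict[int, int] = {}
    for i, e in enumerate(events):
        index.setdefault(int(e.timestamp), i)
    return index

def replay_from(events: List[Event], index: Dict[int, int], start_second: int) -> Iterable[Event]:
    """Approximate random access: jump to a given second, then replay in order."""
    start = index.get(start_second, len(events))
    yield from events[start:]
```

Because the merged stream preserves each site's own timestamps, stepping through it site by site also exposes the per-site latencies that the abstract notes are important for conversation analysis.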
Abstract:
A simple and practical technique is described for assessing the risks, that is, the potential for error and consequent loss, that a software system development acquires during the requirements engineering phase. The technique uses a goal-based requirements analysis as a framework to identify and rate a set of key issues in order to arrive at estimates of the feasibility and adequacy of the requirements. The technique is illustrated by showing how it has been applied to a real systems development project, and how problems in this project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
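Purely as a hypothetical illustration of rating key requirements issues to arrive at an overall estimate, the sketch below scores a handful of invented issues on a 1-5 scale and combines them with a weighted average. The issue names, scale, weights and threshold are all assumptions for illustration, not the published technique.

```python
# Hypothetical illustration of rating key requirements issues to estimate overall risk.
# The issue names, the 1-5 scale (1 = no concern, 5 = severe concern), the weights and
# the decision threshold are illustrative assumptions, not the published technique.

ISSUE_RATINGS = {
    # issue: (rating 1-5, weight)
    "goals traceable to stakeholders": (2, 1.0),
    "requirements stable": (4, 1.5),
    "non-functional requirements quantified": (3, 1.0),
    "domain assumptions validated": (5, 2.0),
}

def weighted_risk(ratings: dict) -> float:
    """Weighted average of the issue ratings."""
    total_weight = sum(w for _, w in ratings.values())
    return sum(r * w for r, w in ratings.values()) / total_weight

if __name__ == "__main__":
    score = weighted_risk(ISSUE_RATINGS)
    print(f"Overall requirements risk: {score:.2f} / 5")
    if score >= 3.5:
        print("High risk: revisit the requirements before committing to development.")
```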
Abstract:
Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of bandpass filtering following the same pre-processing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of the statistical significance of phase-locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer-range phase-locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking that occurs just prior to movement, coinciding with a reduction in power (ERD), may result from selection of the neural assembly relevant to the particular movement.
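A minimal sketch of the phase-locking computation that EMDPL builds on is given below, using the PyEMD package for empirical mode decomposition and a Hilbert-transform phase. The spline Laplacian, ICA pre-processing, surrogate statistics and adaptive significance assessment described in the abstract are deliberately omitted, and pairing IMFs by index across channels is a naive assumption made only for illustration.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

# Minimal sketch of the phase-locking idea underlying EMDPL: decompose each channel
# into intrinsic mode functions (IMFs), extract instantaneous phase via the Hilbert
# transform, and measure phase locking between matched IMFs. Pre-processing and
# adaptive significance testing are omitted; matching IMFs by index is naive.

def imf_phases(signal: np.ndarray) -> np.ndarray:
    """Instantaneous phase of each IMF of a 1-D signal (shape: n_imfs x n_samples)."""
    imfs = EMD().emd(signal)
    return np.angle(hilbert(imfs, axis=1))

def phase_locking_value(phase_a: np.ndarray, phase_b: np.ndarray) -> float:
    """PLV in [0, 1]: 1 means a perfectly constant phase difference."""
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))

if __name__ == "__main__":
    t = np.arange(0, 4, 1 / 256)                     # 4 s of synthetic data at 256 Hz
    rng = np.random.default_rng(0)
    common = np.sin(2 * np.pi * 10 * t)              # shared 10 Hz rhythm
    ch1 = common + 0.3 * rng.standard_normal(t.size)
    ch2 = common + 0.3 * rng.standard_normal(t.size)
    p1, p2 = imf_phases(ch1), imf_phases(ch2)
    for k in range(min(len(p1), len(p2))):
        print(f"IMF {k}: PLV = {phase_locking_value(p1[k], p2[k]):.2f}")
```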
Abstract:
An analysis of averaging procedures is presented for an approximate Riemann solver for the equations governing the compressible flow of a real gas. This study extends earlier work for the Euler equations with ideal gases.
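The abstract gives no formulas. For context, the classical Roe averages for the ideal-gas Euler equations, the baseline that the real-gas analysis extends, are recalled below; these are the standard ideal-gas expressions, not the real-gas averaging procedures analysed in the paper.

```latex
% Classical Roe averages for the ideal-gas Euler equations (the baseline the
% real-gas analysis extends); \rho = density, u = velocity, H = total enthalpy,
% a = sound speed, with L/R denoting the left and right states.
\begin{align*}
  \tilde{u} &= \frac{\sqrt{\rho_L}\,u_L + \sqrt{\rho_R}\,u_R}
                   {\sqrt{\rho_L} + \sqrt{\rho_R}}, &
  \tilde{H} &= \frac{\sqrt{\rho_L}\,H_L + \sqrt{\rho_R}\,H_R}
                   {\sqrt{\rho_L} + \sqrt{\rho_R}}, \\
  \tilde{\rho} &= \sqrt{\rho_L\,\rho_R}, &
  \tilde{a}^2 &= (\gamma - 1)\left(\tilde{H} - \tfrac{1}{2}\tilde{u}^2\right).
\end{align*}
```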
Abstract:
The overall operation and internal complexity of a piece of production machinery can be depicted in terms of clusters of multidimensional points that describe the process states, with the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, illustrating how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The clustering algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis provides a promising step forward in the field of multivariable control of manufacturing systems.
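The mean-tracking algorithm itself is not specified in the abstract. Below is a small mean-shift-style sketch that conveys the general idea of tracking a local mean through a noisy N-dimensional data space until it settles on a cluster centre; it is a generic stand-in (flat-window mean shift), explicitly not the authors' algorithm, and the synthetic "machine state" data are invented.

```python
import numpy as np

# Mean-shift-style sketch of tracking a local mean through noisy N-dimensional data
# until it settles on a cluster centre. A generic stand-in (flat-window mean shift),
# NOT the paper's mean-tracking algorithm, whose details the abstract omits.

def track_mean(data: np.ndarray, start: np.ndarray, radius: float,
               tol: float = 1e-4, max_iter: int = 100) -> np.ndarray:
    """Iteratively move a candidate centre to the mean of the points within `radius`."""
    centre = start.astype(float).copy()
    for _ in range(max_iter):
        nearby = data[np.linalg.norm(data - centre, axis=1) <= radius]
        if len(nearby) == 0:
            break
        new_centre = nearby.mean(axis=0)
        if np.linalg.norm(new_centre - centre) < tol:
            break
        centre = new_centre
    return centre

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Two hypothetical "machine state" clusters plus uniform background noise
    cluster_a = rng.normal(loc=[0.0, 0.0], scale=0.4, size=(200, 2))   # e.g. error-free region
    cluster_b = rng.normal(loc=[5.0, 5.0], scale=0.4, size=(200, 2))   # e.g. error region
    noise = rng.uniform(-2.0, 7.0, size=(100, 2))
    data = np.vstack([cluster_a, cluster_b, noise])
    print("Converged centre:", track_mean(data, start=np.array([1.0, 1.0]), radius=1.5))
```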