963 results for Statistical models


Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which had previously defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site conditions. Ranges of cost, time and performance data were represented by probability density functions defined by constant, uniform, normal and beta distributions. These variables, together with a network of the interrelationships between services components, provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
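As a rough illustration of the Monte Carlo step described above, the sketch below (Python, using numpy) samples a handful of hypothetical services cost elements from constant, uniform, normal and beta distributions and summarises the total as relative and cumulative frequencies; the element names, ranges and distribution parameters are invented for the example and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo trials

def sample_total_cost(n):
    # Hypothetical cost elements (GBP), each described by a probability distribution
    heating = rng.normal(loc=120_000, scale=15_000, size=n)        # normal
    electrical = rng.uniform(low=80_000, high=110_000, size=n)     # uniform
    lifts = 60_000 + 40_000 * rng.beta(a=2.0, b=5.0, size=n)       # scaled beta
    fire_protection = np.full(n, 25_000.0)                         # constant
    return heating + electrical + lifts + fire_protection

totals = sample_total_cost(N)

# Relative and cumulative frequency distribution of the total services cost
hist, edges = np.histogram(totals, bins=30)
relative = hist / N
cumulative = np.cumsum(relative)

print(f"mean total cost:  {totals.mean():,.0f}")
print(f"90th percentile:  {np.percentile(totals, 90):,.0f}")
```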

Relevance:

30.00%

Publisher:

Abstract:

Microfluidics has recently emerged as a new method of manufacturing liposomes, allowing reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters of a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used to increase process understanding and to develop predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, with a strong correlation with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidics process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential for high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate while maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and modelled using predictive modelling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics.
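The correlation between FRR and vesicle size lends itself to a simple regression sketch. The snippet below (Python, statsmodels) fits an ordinary least squares model of size on FRR and TFR over a small design-of-experiments table; the data, column names and factor levels are hypothetical stand-ins, not the study's actual DoE or multivariate models.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical DoE table: flow rate ratio (aqueous:solvent), total flow rate
# (mL/min) and measured vesicle size (nm). All values are invented.
doe = pd.DataFrame({
    "FRR":  [1, 1, 3, 3, 5, 5, 1, 3, 5],
    "TFR":  [0.5, 2.0, 0.5, 2.0, 0.5, 2.0, 1.0, 1.0, 1.0],
    "size": [190, 180, 95, 90, 55, 52, 185, 92, 54],
})

# Simple linear model relating vesicle size to the two process parameters
model = smf.ols("size ~ FRR + TFR", data=doe).fit()
print(model.summary())

# Predict the size expected at an untested setting
print(model.predict(pd.DataFrame({"FRR": [4], "TFR": [1.5]})))
```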

Relevance:

30.00%

Publisher:

Abstract:

Recently, the temporal and statistical properties of quasi-CW fiber lasers have attracted great attention. In particular, the properties of Raman fiber lasers (RFLs) have been studied both numerically and experimentally [1,2]. Experimental investigation is more challenging, as the full generation optical bandwidth (typically hundreds of GHz for RFLs) is much larger than the real-time bandwidth of oscilloscopes (up to 60 GHz for the newest models). Experimentally measured time dynamics are therefore highly bandwidth-averaged and do not provide precise information about the overall statistical properties. To overcome this, one can use spectral filtering to study temporal and statistical properties within an optical bandwidth comparable to the measurement bandwidth [3], or rely on indirect measurements [4]. Ytterbium-doped fiber lasers (YDFLs) are more suitable for experimental investigation, as their generation spectrum is usually around ten times narrower. Moreover, ultra-narrow-band generation has recently been demonstrated in a YDFL [5], which in principle makes it possible to measure time dynamics and statistics in real time using conventional oscilloscopes. © 2013 IEEE.
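As a toy illustration of the bandwidth-averaging problem, the sketch below (Python, numpy) low-pass filters the intensity of a synthetic broadband field at 60 GHz and compares the relative fluctuations before and after; it is not a model of a real Raman or Ytterbium fiber laser, and all numbers are assumptions chosen only to show the effect.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic intensity trace sampled at 400 GS/s from a broadband complex field
fs = 400e9                  # sampling rate, Hz
n = 2 ** 16
field = rng.normal(size=n) + 1j * rng.normal(size=n)   # white (broadband) field
intensity = np.abs(field) ** 2

# Emulate a 60 GHz real-time oscilloscope by low-pass filtering in the frequency domain
freqs = np.fft.rfftfreq(n, d=1 / fs)
spectrum = np.fft.rfft(intensity)
spectrum[freqs > 60e9] = 0.0
measured = np.fft.irfft(spectrum, n)

# Bandwidth averaging suppresses the fluctuations the oscilloscope can resolve
print(f"true std/mean:     {intensity.std() / intensity.mean():.2f}")
print(f"measured std/mean: {measured.std() / measured.mean():.2f}")
```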

Relevance:

30.00%

Publisher:

Abstract:

Around 80% of the 63 million people in the UK live in urban areas, where demand for affordable housing is highest. The supply of new dwellings falls a long way short of demand, and with an average annual replacement rate of 0.5%, more than 80% of the existing residential housing stock will still be in use by 2050. A high proportion of owner-occupiers, a weak private rental sector and a lack of sustainable financing models render England's housing market one of the least responsive in the developed world. As exploratory research, the purpose of this paper is to examine the provision of social housing in the United Kingdom, with a particular focus on England, and to set out implications for housing associations delivering sustainable community development. The paper is based on an analysis of historical data series (Census data), current macro-economic data and population projections to 2033. It identifies a chronic undersupply of affordable housing in England, which is likely to be exacerbated by demographic developments, changes in household composition and the reduced availability of finance to develop new homes. Based on the housing market trends analysed in this paper, opportunities are identified for policy makers to remove barriers to the delivery of new affordable homes and for social housing providers to evolve their business models by taking a wider role in sustainable community development.

Relevance:

30.00%

Publisher:

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide−MHC binding affinity. The ISC−PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide−MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method was applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek), including peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q2, SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r2 and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
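A minimal sketch of the PLS-with-leave-one-out core of such a model is given below (Python, scikit-learn), computing q2 and SEP on synthetic indicator data; the iterative self-consistent reselection of peptide subsequences that defines the ISC-PLS method itself is not reproduced here, and the data and number of components are invented.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in for the additive peptide description:
# rows = peptides, columns = amino-acid-at-position indicator variables
X = rng.integers(0, 2, size=(60, 40)).astype(float)
true_w = rng.normal(size=40)
y = X @ true_w + rng.normal(scale=0.5, size=60)   # pIC50-like affinities

pls = PLSRegression(n_components=5)               # NC, chosen by cross-validation in practice
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()

# Cross-validated q2 and standard error of prediction (SEP)
press = np.sum((y - y_loo) ** 2)
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
sep = np.sqrt(press / len(y))
print(f"q2 = {q2:.3f}, SEP = {sep:.3f}")
```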

Relevance:

30.00%

Publisher:

Abstract:

Mathematics Subject Classification 2010: 26A33, 33E99, 15A52, 62E15.

Relevance:

30.00%

Publisher:

Abstract:

MSC 2010: 15A15, 15A52, 33C60, 33E12, 44A20, 62E15. Dedicated to Professor R. Gorenflo on the occasion of his 80th birthday.

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62G08, 62P30.

Relevance:

30.00%

Publisher:

Abstract:

We build the conditional least squares estimator of the parameter θ0 based on the observation of a single trajectory of {Zk, Ck}k, and give conditions ensuring its strong consistency. The particular case of general linear models in θ0, and among them regenerative processes, is studied in more detail. In this framework, the consistency of the estimator may also be proved for the component of θ0 that belongs to an asymptotically negligible part of the model, and the asymptotic law of the estimator may also be calculated.
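For a concrete, much simpler instance of conditional least squares on a single trajectory, the sketch below (Python) estimates the offspring mean m of a Galton-Watson branching process from E[Z_k | Z_{k-1}] = m·Z_{k-1}; this is only an illustrative special case under assumed Poisson offspring, not the general model of the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_branching(m, n_gen, z0=10):
    """One trajectory of a Galton-Watson process with Poisson(m) offspring."""
    z = [z0]
    for _ in range(n_gen):
        z.append(rng.poisson(m, size=z[-1]).sum() if z[-1] > 0 else 0)
    return np.array(z, dtype=float)

z = simulate_branching(m=1.05, n_gen=100)

# Conditional least squares: E[Z_k | Z_{k-1}] = m * Z_{k-1}, so minimising
# sum_k (Z_k - m Z_{k-1})^2 gives m_hat = sum Z_k Z_{k-1} / sum Z_{k-1}^2
prev, curr = z[:-1], z[1:]
m_hat = np.sum(curr * prev) / np.sum(prev ** 2)
print(f"CLS estimate of the offspring mean: {m_hat:.4f}")
```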

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62P10, 92C20

Relevance:

30.00%

Publisher:

Abstract:

Most financial models assume that markets are efficient. As a result, numerous scientific studies have focused on testing the efficient market hypothesis and have tried to prove or disprove it; so far, all of these attempts have been unsuccessful. These analyses were traditionally based on the price evolution of a given asset, through which returns were examined. In recent years the research interest has shifted: instead of analyzing returns, a more elementary factor has come into focus, namely the limit order book. The reason is that on order-driven markets the prices and order sizes in the limit order book ultimately determine the price evolution on the market. Since a notable number of stock markets operate as order-driven markets, researchers considered it worthwhile to analyze the statistical properties of the limit order book, in the hope of getting closer to a proof or refutation of the efficient market hypothesis. The purpose of this study is to summarize the basic statistical properties of the limit order book based on the scientific works published so far, and to highlight whether these studies have in fact contributed to proving or disproving the efficient market hypothesis.

Relevance:

30.00%

Publisher:

Abstract:

Following the introduction of the Basel 2 capital agreement, banks and lenders in Hungary also began to build up their internal rating systems, whose maintenance and development are a continuing task. The author explores whether it is possible to increase the predictive capacity of business-failure forecasting models by traditional mathematical and statistical means in such a way that the models also incorporate the measure of change in the financial indicators over time. Empirical findings suggest that the temporal development of the financial indicators of firms in Hungary carries important information about future ability to pay, since the predictive capacity of bankruptcy forecasting models is greatly increased by using such indicators. The author also examines whether the classification performance of the models can be improved by correcting for extremely high or low values before modelling.
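The idea of adding change-over-time features alongside ratio levels can be sketched as follows (Python, scikit-learn) on synthetic firm data: a logistic model using only the level of a liquidity ratio is compared against one that also includes its year-on-year change. Variable names, data and effect sizes are invented and only illustrate the feature construction, not the author's models.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 500

# Synthetic firm data: a liquidity ratio in two consecutive years and a failure flag
liq_t0 = rng.normal(1.5, 0.4, n)
liq_t1 = liq_t0 + rng.normal(-0.05, 0.2, n)
failed = (rng.random(n) <
          1 / (1 + np.exp(3 * (liq_t1 - 1.2) + 4 * (liq_t1 - liq_t0)))).astype(int)

df = pd.DataFrame({
    "liquidity": liq_t1,
    "liquidity_change": liq_t1 - liq_t0,   # the dynamic (change-over-time) feature
    "failed": failed,
})

# Static model (levels only) versus dynamic model (levels + changes)
static_auc = cross_val_score(LogisticRegression(), df[["liquidity"]], df["failed"],
                             cv=5, scoring="roc_auc").mean()
dynamic_auc = cross_val_score(LogisticRegression(),
                              df[["liquidity", "liquidity_change"]], df["failed"],
                              cv=5, scoring="roc_auc").mean()
print(f"AUC, levels only: {static_auc:.3f}   AUC with change features: {dynamic_auc:.3f}")
```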

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the present dissertation was to evaluate the internal validity of symptoms of four common anxiety disorders included in the Diagnostic and Statistical Manual of Mental Disorders, fourth edition, text revision (DSM-IV-TR; American Psychiatric Association, 2000), namely separation anxiety disorder (SAD), social phobia (SOP), specific phobia (SP), and generalized anxiety disorder (GAD), in a sample of 625 youth (ages 6 to 17 years) referred to an anxiety disorders clinic and 479 parents. Confirmatory factor analyses (CFAs) were conducted on the dichotomous items of the SAD, SOP, SP, and GAD sections of the youth and parent versions of the Anxiety Disorders Interview Schedule for DSM-IV (ADIS-IV: C/P; Silverman & Albano, 1996) to test and compare a number of factor models, including a factor model based on the DSM. Contrary to predictions, findings from the CFAs showed that a correlated model with five factors (SAD, SOP, SP, GAD worry, and GAD somatic distress) provided the best fit to both the youth data and the parent data. Multiple-group CFAs supported the metric invariance of the correlated five-factor model across boys and girls. Thus, the present study's findings support the internal validity of DSM-IV SAD, SOP, and SP, but raise doubt regarding the internal validity of GAD.

Relevance:

30.00%

Publisher:

Abstract:

A pre-test, post-test, quasi-experimental design was used to examine the effects of student-centered and traditional models of reading instruction on outcomes of literal comprehension and critical thinking skills. The sample for this study consisted of 101 adult students enrolled in a high-level developmental reading course at a large, urban community college in the Southeastern United States. The experimental group consisted of 48 students, and the control group consisted of 53 students. Students in the experimental group spent limited time reading a basic-skills course text, with instructors using supplemental materials such as poems, news articles, and novels. Discussions, the reading-writing connection, and student choice in material selection were also part of the student-centered curriculum. Students in the control group relied heavily on a course text and vocabulary text for reading material, with great focus placed on basic skills. Activities consisted primarily of multiple-choice questioning and quizzes. The instrument used to collect pre-test data was the Descriptive Tests of Language Skills in Reading Comprehension; post-test data were taken from the Florida College Basic Skills Exit Test. A MANCOVA was used as the statistical method to determine whether either model of instruction led to significantly higher gains in literal comprehension or critical thinking skills. A paired-samples t-test was also used to compare pre-test and post-test means. The results of the MANCOVA indicated no significant difference between instructional models on scores of literal comprehension and critical thinking, nor was there any significant difference in scores between subgroups of age (under 25, and 25 and older) or language background (native English speakers and second-language learners). The results of the t-test indicated, however, that students taught under both instructional models made significant gains on both literal comprehension and critical thinking skills from pre-test to post-test.
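A schematic of the analysis described above might look like the following (Python, statsmodels and scipy): a MANCOVA over the two post-test outcomes with group and pre-test score as predictors, followed by a paired-samples t-test. The file and column names are hypothetical placeholders, not the study's actual data.

```python
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

# df is assumed to hold one row per student with these (hypothetical) columns:
# group ('experimental'/'control'), pretest, post_literal, post_critical
df = pd.read_csv("reading_outcomes.csv")  # placeholder file name

# MANCOVA: both post-test outcomes modelled jointly, adjusting for pre-test scores
mancova = MANOVA.from_formula("post_literal + post_critical ~ group + pretest", data=df)
print(mancova.mv_test())

# Paired-samples t-test comparing pre- and post-test means on the literal-comprehension scale
t, p = stats.ttest_rel(df["post_literal"], df["pretest"])
print(f"paired t = {t:.2f}, p = {p:.4f}")
```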

Relevance:

30.00%

Publisher:

Abstract:

We would like to thank the study participants and the clinical and research staff at the Queen Elizabeth National Spinal Injury Unit, as without them this study would not have been possible. We are grateful for the funding received from Glasgow Research Partnership in Engineering for the employment of SC during data collection for this study. We would like to thank the Royal Society of Edinburgh's Scottish Crucible scheme for providing the opportunity for this collaboration to occur. We are also indebted to Maria Dumitrascuta for her time and effort in producing inter-repeatability results for the shape models.