858 results for [JEL:C5] Mathematical and Quantitative Methods - Econometric Modeling


Relevance: 100.00%

Abstract:

The objective of this research was to develop a high-fidelity dynamic model of a parafoil-payload system for its application to the Ship Launched Aerial Delivery System (SLADS). SLADS is a concept in which cargo can be transferred from ship to shore using a parafoil-payload system. It is accomplished in two phases: an initial towing phase, when the glider follows the towing vessel in a passive lift mode, and an autonomous gliding phase, when the system is guided to the desired point. While many previous researchers have analyzed the parafoil-payload system when it is released from another airborne vehicle, limited work has been done in the area of towing the system up from ground or sea. One of the main contributions of this research was the development of a nonlinear dynamic model of a towed parafoil-payload system. After an extensive literature review of existing methods of modeling a parafoil-payload system, a five degree-of-freedom model was developed. The inertial and geometric properties of the system were investigated to predict accurate results in the simulation environment. Since extensive research has been done in determining the aerodynamic characteristics of a paraglider, an existing aerodynamic model was chosen to incorporate the effects of air flow around the flexible paraglider wing. During the towing phase, it is essential that the parafoil-payload system follow the line of the towing vessel path to prevent an unstable flight condition called ‘lockout’. A detailed study of the causes of lockout, its mathematical representation, and the flight conditions and parameters related to it constitutes another contribution of this work. A linearized model of the parafoil-payload system was developed and used to analyze the stability of the system about equilibrium conditions, and the relationship between the control surface inputs and stability was investigated. In addition to stability of flight, another important objective of SLADS is to tow the parafoil-payload system up as fast as possible. The tension in the tow cable is directly proportional to the rate of ascent of the parafoil-payload system, and lockout is more likely to occur when tow tensions are large. Thus there is a tradeoff between susceptibility to lockout and rapid deployment. Control strategies were also developed for optimal tow-up and to maintain stability in the event of disturbances.
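The thesis's five degree-of-freedom equations are not reproduced in this abstract, so the sketch below uses a hypothetical stand-in for the dynamics (a damped pendulum) and only illustrates the linearise-about-equilibrium-and-check-eigenvalues step described above; `nonlinear_dynamics`, `x_eq` and `u_eq` are assumed placeholders, not the thesis's model.

```python
import numpy as np

def nonlinear_dynamics(x, u):
    """Hypothetical stand-in for the towed parafoil-payload equations of motion
    dx/dt = f(x, u); the thesis's five degree-of-freedom model would go here."""
    return np.array([x[1], -0.5 * x[1] - np.sin(x[0]) + u[0]])

def numerical_jacobian(f, x_eq, u_eq, eps=1e-6):
    """Finite-difference linearisation A = df/dx evaluated at an equilibrium."""
    f0 = f(x_eq, u_eq)
    A = np.zeros((len(f0), len(x_eq)))
    for i in range(len(x_eq)):
        dx = np.zeros(len(x_eq))
        dx[i] = eps
        A[:, i] = (f(x_eq + dx, u_eq) - f0) / eps
    return A

# Linearise about an equilibrium and check local stability via the eigenvalues.
x_eq, u_eq = np.array([0.0, 0.0]), np.array([0.0])
A = numerical_jacobian(nonlinear_dynamics, x_eq, u_eq)
print("Locally stable:", bool(np.all(np.real(np.linalg.eigvals(A)) < 0)))
```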

Relevance: 100.00%

Abstract:

Empirical evidence and theoretical studies suggest that the phenotype, i.e., cellular- and molecular-scale dynamics, including proliferation rate and adhesiveness due to microenvironmental factors and gene expression that govern tumor growth and invasiveness, also determines gross tumor-scale morphology. It has been difficult to quantify the relative effect of these links on disease progression and prognosis using conventional clinical and experimental methods and observables. As a result, successful individualized treatment of highly malignant and invasive cancers, such as glioblastoma, via surgical resection and chemotherapy cannot be offered, and outcomes are generally poor. What is needed is a deterministic, quantifiable method to enable understanding of the connections between phenotype and tumor morphology. Here, we critically assess advantages and disadvantages of recent computational modeling efforts (e.g., continuum, discrete, and cellular automata models) that have pursued this understanding. Based on this assessment, we review a multiscale (i.e., from the molecular to the gross tumor scale) mathematical and computational "first-principles" approach based on mass conservation and other physical laws, such as employed in reaction-diffusion systems. Model variables describe known characteristics of tumor behavior, and parameters and functional relationships across scales are informed from in vitro, in vivo and ex vivo biology. We review the feasibility of this methodology that, once coupled to tumor imaging and tumor biopsy or cell culture data, should enable prediction of tumor growth and therapy outcome through quantification of the relation between the underlying dynamics and morphological characteristics. In particular, morphologic stability analysis of this mathematical model reveals that tumor cell patterning at the tumor-host interface is regulated by cell proliferation, adhesion and other phenotypic characteristics: histopathologic information on the tumor boundary can be input to the mathematical model and used as a phenotype-diagnostic tool to predict collective and individual tumor cell invasion of surrounding tissue. This approach further provides a means to deterministically test the effects of novel and hypothetical therapy strategies on tumor behavior.
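The reviewed models are not given in closed form here, so the following is only a minimal illustration of the reaction-diffusion ingredient mentioned above: a 1D Fisher-KPP-type equation for a normalized tumor cell density, with placeholder motility and proliferation parameters chosen for the sketch rather than taken from the literature.

```python
import numpy as np

# Minimal 1D reaction-diffusion sketch (Fisher-KPP form) for a normalized tumor
# cell density u(x, t):  du/dt = D * d2u/dx2 + rho * u * (1 - u).
# D (motility) and rho (proliferation rate) are illustrative placeholders.
D, rho = 1e-3, 0.5
nx, dx, dt, steps = 200, 0.05, 0.01, 2000

u = np.zeros(nx)
u[:10] = 1.0  # initial tumor core at the left edge of the domain

for _ in range(steps):
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    lap[0] = lap[-1] = 0.0          # crude no-flux treatment of the boundaries
    u += dt * (D * lap + rho * u * (1 - u))

print("invaded fraction of the domain:", float(np.mean(u > 0.5)))
```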

Relevance: 100.00%

Abstract:

My dissertation focuses on developing methods for detecting gene-gene/environment interactions and imprinting effects for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the Natural and Orthogonal Interaction (NOIA) model, a coding technique originally developed for gene-gene (GxG) interaction, to reduced models; (2) developing a novel statistical approach that allows for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has become a central topic in the search for the complex network of multiple genes and environmental exposures contributing to these outcomes. Epistasis, also known as gene-gene interaction, is the departure from additivity of the genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects under different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies with interaction for human complex traits and diseases. We compare the performance of the new statistical models we developed with that of the usual functional model by both simulation study and real data analysis. Both simulation and real data analysis revealed higher power for the NOIA GxG interaction model in detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3 and 9p21.3, and we identified potential interactions with these significant regions that contribute to melanoma risk. Based on the NOIA model, we developed a novel statistical approach that allows us to model the effects of a genetic factor and a binary environmental exposure that jointly influence disease risk. Both simulation and real data analyses revealed higher power for the NOIA model in detecting both main genetic effects and interaction effects for both quantitative and binary traits. We also found that estimates of the parameters from logistic regression for binary traits are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed four SNPs in the 5p15 and 15q25 regions to be significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968 and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region interact significantly with smoking in the Caucasian population, and our approach identified a potential interaction of SNP rs2256543 in 6p21 with smoking in contributing to lung cancer risk. Genetic imprinting is the best-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity, and has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon.
In this study, we propose a NOIA framework for a single-locus association study that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models, and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs. Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of the variance components if Hardy-Weinberg equilibrium (HWE) holds or if the minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
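The NOIA parameterization itself is not given in the abstract, so the sketch below shows only the conventional functional-coding analogue of a GxE interaction test (additive genotype coding, binary exposure) fitted by logistic regression on simulated data; all variable names and effect sizes are illustrative assumptions, not the dissertation's models or datasets.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate genotype (minor-allele count), a binary exposure, and a binary trait
# with a true GxE interaction, then test the interaction term.
rng = np.random.default_rng(0)
n = 5000
g = rng.binomial(2, 0.3, n)                     # genotype: 0, 1 or 2 minor alleles
e = rng.binomial(1, 0.5, n)                     # binary exposure (e.g. smoking)
logit = -1.0 + 0.2 * g + 0.4 * e + 0.3 * g * e  # illustrative effect sizes
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"y": y, "g": g, "e": e})
fit = smf.logit("y ~ g * e", data=df).fit(disp=0)
print(fit.params["g:e"], fit.pvalues["g:e"])    # estimate and p-value of the GxE term
```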

Relevance: 100.00%

Abstract:

A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
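The book's scope is only described here, but the naive mean-field idea it starts from can be shown in a few lines: for an Ising-type model, each marginal magnetisation solves m_i = tanh(h_i + sum_j J_ij m_j), which the sketch below finds by damped fixed-point iteration on random placeholder couplings (this is a textbook-style illustration, not material from the book itself).

```python
import numpy as np

# Naive mean-field fixed-point iteration for an Ising-type model with couplings J
# and fields h: each marginal magnetisation m_i solves m_i = tanh(h_i + sum_j J_ij m_j).
# J and h are random placeholders; the TAP approach would subtract an Onsager
# reaction term m_i * sum_j J_ij**2 * (1 - m_j**2) inside the tanh.
rng = np.random.default_rng(1)
n = 20
J = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
J = 0.5 * (J + J.T)
np.fill_diagonal(J, 0.0)
h = rng.normal(0.0, 0.5, n)

m = np.zeros(n)
for _ in range(500):
    m_new = np.tanh(h + J @ m)
    if np.max(np.abs(m_new - m)) < 1e-8:
        break
    m = 0.5 * m + 0.5 * m_new    # damping improves convergence of the iteration

print("approximate marginal magnetisations:", np.round(m[:5], 3))
```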

Relevance: 100.00%

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated from the same distribution as historically occurring events or from an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations were calculated from repeated realisations, and the number of times a user-defined concentration magnitude was exceeded was quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and was automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionalities of the method, as well as its sensitivities to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps. Varying the model grid sizes indicated no significant effect on the simulated groundwater head, and the simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The contaminant plume also migrates faster with coarse grid sizes than with finer grid sizes. The number of daily contaminant source terms generated, and consequently the total contaminant mass within the aquifer, increases nonlinearly with the frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method and a significant advantage over contemporary risk and vulnerability methods.
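The actual method couples MODFLOW-2000 and MT3DMS with a FORTRAN risk model; the sketch below only illustrates the Monte Carlo exceedance logic described above, with `simulate_concentration` standing in as a hypothetical placeholder for the flow and transport simulation, and all probabilities and thresholds chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_concentration(source_terms):
    """Placeholder for the coupled flow/transport simulation (MODFLOW-2000 + MT3DMS):
    here, just an exponential-decay convolution of the daily source terms."""
    kernel = np.exp(-np.arange(30) / 10.0)
    return 0.8 * np.convolve(source_terms, kernel)[: len(source_terms)]

p_event, days, n_realisations, threshold = 0.05, 365, 500, 2.0

exceedances = 0
for _ in range(n_realisations):
    events = rng.random(days) < p_event                 # daily pollution incidents
    sources = events * rng.lognormal(0.0, 1.0, days)    # synthetic source-term magnitudes
    conc = simulate_concentration(sources)
    exceedances += np.any(conc > threshold)             # user-defined concentration exceeded?

print(f"estimated exceedance risk: {exceedances / n_realisations:.2f}")
```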

Relevance: 100.00%

Abstract:

The purpose of the work described here has been to seek methods of narrowing the present gap between currently realised heat pump performance and the theoretical limit. The single most important prerequisite for this objective is the identification and quantitative assessment of the various non-idealities and degradative phenomena responsible for the present shortfall. The use of availability analysis has been introduced as a diagnostic tool and applied to a few very simple, highly idealised Rankine cycle optimisation problems. From this work, it has been demonstrated that the scope for improvement through optimisation is small in comparison with the extensive potential for improvement from reducing the compressor's losses. A fully instrumented heat pump was assembled and extensively tested. This furnished performance data and led to an improved understanding of the system's behaviour. A very simple analysis of the resulting compressor performance data confirmed the compressor's low efficiency. In addition, in order to obtain experimental data concerning specific details of the heat pump's operation, several novel experiments were performed. The experimental work was concluded with a set of tests which attempted to obtain definitive performance data for a small set of discrete operating conditions. These tests included an investigation of the effect of two compressor modifications. The resulting performance data were analysed by a sophisticated calculation which used the measurements to quantify each degradative phenomenon occurring in the compressor, and so indicate where the greatest potential for improvement lies. Finally, in the light of everything that was learnt, specific technical suggestions have been made to reduce the losses associated with both the refrigerant circuit and the compressor.
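As a simple illustration of availability-style reasoning (not the thesis's own calculation), the snippet below compares a measured coefficient of performance with the Carnot limit between assumed source and sink temperatures and expresses the shortfall as lost work per unit of heat delivered; all numbers are illustrative.

```python
# Compare a measured heat pump COP with the Carnot (reversible) limit and express
# the shortfall as lost work per unit of heat delivered. All values are illustrative.
T_sink, T_source = 328.15, 278.15           # condenser and evaporator temperatures, K
cop_measured = 2.6                          # illustrative measured coefficient of performance
cop_carnot = T_sink / (T_sink - T_source)   # theoretical limit for these temperatures

second_law_efficiency = cop_measured / cop_carnot
lost_work_per_unit_heat = 1.0 / cop_measured - 1.0 / cop_carnot
print(f"second-law efficiency: {second_law_efficiency:.2f}")
print(f"lost work per unit of heat delivered: {lost_work_per_unit_heat:.2f}")
```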

Relevance: 100.00%

Abstract:

This chapter explores ways in which rigorous mathematical techniques, termed formal methods, can be employed to improve the predictability and dependability of autonomic computing. Model checking, formal specification, and quantitative verification are presented in the contexts of conflict detection in autonomic computing policies and of the implementation of goal and utility-function policies in autonomic IT systems. Each of these techniques is illustrated using a detailed case study, and analysed to establish its merits and limitations. The analysis is then used as a basis for discussing the challenges and opportunities of the endeavour to transition the development of autonomic IT systems from the current practice of using ad hoc methods and heuristics towards a more principled approach. © 2012, IGI Global.
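As a toy illustration of the policy-conflict problem mentioned above (not the chapter's formal-methods encoding, which casts such questions as model-checking problems), the sketch below flags two hypothetical policies that can be triggered by the same state yet prescribe opposing actions on the same resource.

```python
from itertools import combinations

# Two hypothetical policies acting on the same resource ("servers"); a conflict is
# reported when both guards hold in the same state but the prescribed actions oppose.
policies = [
    {"name": "ScaleUpOnLoad", "guard": lambda s: s["load"] > 0.8, "action": ("servers", "+1")},
    {"name": "ScaleDownOnCost", "guard": lambda s: s["cost"] > 100, "action": ("servers", "-1")},
]

def find_conflicts(policies, sample_states):
    conflicts = []
    for a, b in combinations(policies, 2):
        for state in sample_states:
            if (a["guard"](state) and b["guard"](state)
                    and a["action"][0] == b["action"][0]
                    and a["action"][1] != b["action"][1]):
                conflicts.append((a["name"], b["name"], state))
    return conflicts

states = [{"load": 0.9, "cost": 120}, {"load": 0.5, "cost": 150}]
print(find_conflicts(policies, states))
```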

Relevance: 100.00%

Abstract:

Since the emergence of software engineering in the late 1960s as a response to the software crisis, researchers throughout the world have been trying to give theoretical support to this discipline. Several points of view have to be reviewed in order to complete this task. In the mid-1970s, Frederick Brooks Jr. coined the term "silver bullet" in reference to the solution of several problems related to software engineering, and we therefore adopted this metaphor as a symbol for this book. Methods, modeling, and teaching are the themes reviewed in this book. Some work related to these topics is presented by software engineering researchers, led by Ivar Jacobson, one of the most remarkable researchers in this area. We hope our work will contribute to providing the theoretical support that software engineering needs.

Relevance: 100.00%

Abstract:

This paper is a continuation of the paper titled “Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy”, with emphasis on model updating and verification for the developed concurrent multi-scale model. A sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters and the updating procedure for the multi-scale model were investigated based on a sensitivity analysis of the selected model parameters. Experimental modal data, as well as static responses in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic-response- and static-response-oriented model updating, respectively. The updated multi-scale model was further verified to act as the baseline model, which is assumed to be the finite-element model closest to the real condition of the structure and available for subsequent numerical simulation. The comparison of the dynamic and static responses calculated by the final model with the measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification were finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as a practical demonstration of the proposed procedures.
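The paper's own workflow couples a finite-element model with measured modal and stress data; the sketch below only illustrates the generic sensitivity-based (Gauss-Newton-style) parameter update such methods rely on, with `model_response` a hypothetical stand-in for the multi-scale model and the "measured" values fabricated to be consistent with a known parameter set.

```python
import numpy as np

def model_response(theta):
    """Hypothetical stand-in for the multi-scale finite-element model: maps model
    parameters to predicted responses (e.g. modal frequencies, hot-spot stresses)."""
    return np.array([theta[0] + 0.5 * theta[1], 2.0 * theta[0] * theta[1], theta[1] ** 2])

def sensitivity_matrix(theta, eps=1e-6):
    """Finite-difference sensitivities of the responses with respect to the parameters."""
    r0 = model_response(theta)
    S = np.zeros((len(r0), len(theta)))
    for j in range(len(theta)):
        d = np.zeros_like(theta)
        d[j] = eps
        S[:, j] = (model_response(theta + d) - r0) / eps
    return S

measured = np.array([2.0, 4.0, 4.0])   # fabricated "measurements" (consistent with theta = [1, 2])
theta = np.array([0.8, 1.5])           # initial parameter estimates
for _ in range(20):                    # Gauss-Newton-style updating loop
    theta = theta + np.linalg.pinv(sensitivity_matrix(theta)) @ (measured - model_response(theta))

print("updated parameters:", np.round(theta, 4))
```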

Relevance: 100.00%

Abstract:

In this paper, we propose a multivariate GARCH model with a time-varying conditional correlation structure. The new double smooth transition conditional correlation (DSTCC) GARCH model extends the smooth transition conditional correlation (STCC) GARCH model of Silvennoinen and Teräsvirta (2005) by including a second variable according to which the correlations change smoothly between states of constant correlations. A Lagrange multiplier test is derived to test the constancy of correlations against the DSTCC-GARCH model, and a further test is derived for an additional transition in the STCC-GARCH framework. In addition, other specification tests, aimed at aiding the model-building procedure, are considered. Analytical expressions for the test statistics and the required derivatives are provided. Applying the model to stock and bond futures data, we find that the correlation pattern between them changed dramatically around the turn of the century. The model is also applied to a selection of world stock indices, and we find evidence of an increasing degree of integration in the capital markets.
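The STCC/DSTCC correlation dynamics are built around a logistic transition function of an observable transition variable; the sketch below illustrates that single-transition building block with purely illustrative parameter values (the DSTCC model combines two such transitions, one per transition variable).

```python
import numpy as np

def transition(s, gamma, c):
    """Logistic transition function G(s; gamma, c) taking values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

rho_state1, rho_state2 = 0.2, 0.7     # the two constant-correlation states
gamma, c = 10.0, 0.5                  # slope and location of the transition (illustrative)

t = np.linspace(0.0, 1.0, 250)        # transition variable, e.g. rescaled calendar time
G = transition(t, gamma, c)
rho_t = (1.0 - G) * rho_state1 + G * rho_state2   # smoothly time-varying correlation

# The DSTCC extension combines two such transition functions, one per transition
# variable, so that the correlation interpolates among four constant states.
print(rho_t[0], rho_t[-1])
```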

Relevance: 100.00%

Abstract:

With the advent of live cell imaging microscopy, new types of mathematical analyses and measurements are possible. Many of the real-time movies of cellular processes are visually very compelling, but elementary analysis of changes over time in quantities such as surface area and volume often shows that there is more to the data than meets the eye. This unit outlines a geometric modeling methodology and applies it to tubulation of vesicles during endocytosis. Using these principles, it has been possible to build better qualitative and quantitative understanding of the systems observed, as well as to make predictions about quantities such as ligand or solute concentration, vesicle pH, and the amount of membrane trafficked. The purpose is to outline a methodology for analyzing real-time movies, one that has led to a greater appreciation of the changes occurring during the time frame of real-time video microscopy and of how additional quantitative measurements allow further hypotheses to be generated and tested.
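As a minimal sketch of the kind of geometric bookkeeping described above (not the unit's own protocol), the snippet below tracks surface area and enclosed volume as a cylindrical tubule of fixed radius grows out of a sphere of conserved membrane area; all dimensions are illustrative.

```python
import numpy as np

A_total = 4.0 * np.pi * 0.5**2   # membrane area of an initial sphere of radius 0.5 um
r_tube = 0.02                    # tubule radius, um

def geometry(tube_length):
    """Sphere radius and total enclosed volume once a tubule of the given length
    has drawn membrane out of the sphere (total membrane area conserved)."""
    A_tube = 2.0 * np.pi * r_tube * tube_length
    r_sphere = np.sqrt((A_total - A_tube) / (4.0 * np.pi))
    volume = (4.0 / 3.0) * np.pi * r_sphere**3 + np.pi * r_tube**2 * tube_length
    return r_sphere, volume

for L in (0.0, 1.0, 2.0):
    r, V = geometry(L)
    print(f"tube length {L:.1f} um: sphere radius {r:.3f} um, volume {V:.4f} um^3")
```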

Relevance: 100.00%

Abstract:

The psychological contract is a key analytical device utilised by both academics and practitioners to conceptualise and explore the dynamics of the employment relationship. However, despite the recognised import of the construct, some authors suggest that its empirical investigation has fallen into a 'methodological rut' [Conway & Briner, 2005, p. 89] and is neglecting to assess key tenets of the concept, such as its temporal and dynamic nature. This paper describes the research design of a longitudinal, mixed methods study which draws upon the strengths of both qualitative and quantitative modes of inquiry in order to explore the development of, and changes in, the psychological contract. Underpinned by a critical realist philosophy, the paper seeks to offer a research design suitable for exploring the process of change not only within the psychological contract domain, but also for similar constructs in the human resource management and broader organisational behaviour fields.

Relevance: 100.00%

Abstract:

Background: Depression is a major public health problem worldwide and is currently ranked second to heart disease for years lost due to disability. For many decades, international research has found that depressive symptoms occur more frequently among individuals of low socioeconomic status (SES) than among their more-advantaged peers. However, the reasons why those in low socioeconomic groups suffer more depressive symptoms are not well understood. Studies investigating the prevalence of depression and its association with SES emanate largely from developed countries, with little research in developing countries. In particular, there is a serious dearth of research on depression, and no investigation of its association with SES, in Vietnam. The aims of the research presented in this Thesis are to estimate the prevalence of depressive symptoms among Vietnamese adults, to examine the nature and extent of the association between SES and depression, and to elucidate causal pathways linking SES to depressive symptoms.

Methods: The research was conducted between September 2008 and November 2009 in Hue city in central Vietnam and used a combination of qualitative (in-depth interviews) and quantitative (survey) data collection methods. The qualitative study contributed to the development of the theoretical model and to the refinement of culturally appropriate data collection instruments for the quantitative study. The main survey comprised a cross-sectional population-based survey with randomised cluster sampling. A sample of 1976 respondents aged 25-55 years from ten randomly selected residential zones (quarters) of Hue city completed the questionnaire (response rate 95.5%).

Measures: SES was classified using three indicators: education, occupation and income. The Center for Epidemiologic Studies-Depression (CES-D) scale was used to measure depressive symptoms (range 0-51, mean = 11.0, SD = 8.5). Three cut-off points for the CES-D scores were applied: ‘at risk for clinical depression’ (16 or above), ‘depressive symptoms’ (above 21) and ‘depression’ (above 25). Six psychosocial indicators (lifetime trauma, chronic stress, recent life events, social support, self-esteem and mastery) were hypothesised to mediate the association between SES and depressive symptoms.

Analyses: The prevalence of depressive symptoms was analysed using bivariate analyses. The multivariable analytic phase comprised ordinary least squares regression, in accordance with Baron and Kenny’s three-step framework for mediation modeling. All analyses were adjusted for a range of confounders, including age, marital status, smoking, drinking and chronic disease, and the mediation models were stratified by gender.

Results: Among these Vietnamese adults, 24.3% were at or above the cut-off for being ‘at risk for clinical depression’, 11.9% were classified as having depressive symptoms and 6.8% were categorised as having depression. SES was inversely related to depressive symptoms: the least educated, those with low occupational status and those with the lowest incomes reported more depressive symptoms. Socioeconomically disadvantaged individuals were more likely to report experiencing stress (lifetime trauma, chronic stress or recent life events), perceived less social support and reported fewer personal resources (self-esteem and mastery) than their more-advantaged counterparts. These psychosocial resources were all significantly associated with depressive symptoms independent of SES. Each psychosocial factor showed a significant mediating effect on the association between SES and depressive symptoms. This was found for all measures of SES, and for both males and females. In particular, personal resources (mastery, self-esteem) and chronic stress accounted for a substantial proportion of the variation in depressive symptoms between socioeconomic groups. Social support and recent life events contributed modestly to socioeconomic differences in depressive symptoms, whereas lifetime trauma contributed the least to these inequalities.

Conclusion: This is the first known study in Vietnam or any developing country to systematically examine the extent to which psychosocial factors mediate the relationship between SES and depression. The study contributes new evidence regarding the burden of depression in Vietnam. The findings have practical relevance for advocacy, for mental health promotion and for health-care services, and point to the need for programs that focus on building a sense of personal mastery and self-esteem. More broadly, the work presented in this Thesis contributes to the international scientific literature on the social determinants of depression.
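The mediation analyses follow Baron and Kenny's three-step framework with OLS regression; the sketch below shows that framework on simulated data (SES, a single mediator, and CES-D-like scores all fabricated for illustration), not on the survey data described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: higher SES raises mastery and lowers depressive symptoms, and
# mastery itself lowers symptoms, so part of the SES effect is mediated.
rng = np.random.default_rng(7)
n = 2000
ses = rng.normal(size=n)                                  # higher = more advantaged
mastery = 0.4 * ses + rng.normal(size=n)                  # candidate mediator
cesd = -0.3 * ses - 0.5 * mastery + rng.normal(size=n)    # CES-D-like score
df = pd.DataFrame({"ses": ses, "mastery": mastery, "cesd": cesd})

step1 = smf.ols("cesd ~ ses", df).fit()            # step 1: total effect of SES
step2 = smf.ols("mastery ~ ses", df).fit()         # step 2: SES predicts the mediator
step3 = smf.ols("cesd ~ ses + mastery", df).fit()  # step 3: direct effect, adjusting for mediator
print(step1.params["ses"], step2.params["ses"], step3.params["ses"])  # attenuation => mediation
```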

Relevance: 100.00%

Abstract:

Previous studies have shown that users’ cognitive styles play an important role during Web searching. However, only a limited number of studies have examined the relationship between cognitive styles and Web search behavior and, most importantly, it is not clear which components of Web search behavior are influenced by cognitive styles. This paper examines the relationships between users’ cognitive styles and their Web searching and develops a model that portrays these relationships. The study uses qualitative and quantitative analyses of data gathered from 50 participants. A questionnaire was utilised to collect participants’ demographic information, and Riding’s (1991) Cognitive Style Analysis (CSA) test was used to assess their cognitive styles. Results show that users’ cognitive styles influenced their information searching strategies, query reformulation behaviour, Web navigational styles and information processing approaches. The user model developed in this study depicts the fundamental relationships between users’ Web search behavior and their cognitive styles. Modeling Web search behavior with a greater understanding of users’ cognitive styles can help information science researchers and information systems designers to bridge the semantic gap between the user and the system. Implications of the research for theory and practice, as well as future work, are discussed.