971 results for Forward looking models
Abstract:
This Licentiate Thesis presents and discusses some new contributions in applied mathematics directed towards scientific computing in sports engineering. It considers inverse problems of biomechanical simulation with rigid-body musculoskeletal systems, especially in cross-country skiing. This contrasts with the main body of research on cross-country skiing biomechanics, which is based mainly on experimental testing alone. The thesis consists of an introduction and five papers. The introduction motivates the context of the papers and places them in a more general framework. Two papers (D and E) consider real questions in cross-country skiing, which are modelled and simulated. The results give some interesting indications concerning these challenging questions, which can be used as a basis for further research; however, the measurements are not accurate enough to give final answers. Paper C is a simulation study, more extensive than papers D and E, which is compared to electromyography measurements from the literature. Validation in biomechanical simulation is difficult, and reducing mathematical errors is one way of coming closer to realistic results. Paper A examines well-posedness for forward dynamics with full muscle dynamics. Paper B is a technical report that describes the problem formulation, mathematical models, and simulations from paper A in more detail. Our new modelling, together with the simulations, enables new possibilities. As with simulations in other engineering fields, these must be handled with care in order to achieve reliable results. The results in this thesis indicate that mathematical modelling and numerical simulation can be very useful for describing cross-country skiing biomechanics. Hence, this thesis contributes to the possibility of beginning to use and develop such modelling and simulation techniques in this context as well.
Abstract:
This paper presents a methodology to explore the impact of public spending on education on poverty. The methodology consists of two approaches: Benefit Incidence Analysis (BIA) and a behavioral approach. BIA considers the cost and use of the educational service and the distribution of the benefits among income groups. Regarding the behavioral approach, we use a Probit model of school attendance in order to determine the influence of public spending on the probability that the poor attend school. As a complement, a measurement of targeting errors in the allocation of public spending is included in the methodology.
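The behavioral approach described above can be sketched in code. The following is a minimal illustration of a Probit attendance model fitted by maximum likelihood on simulated data; the variable names, coefficient values, and the single spending covariate are assumptions for illustration, not the paper's actual specification:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated data: does a child attend school (1/0)?
n = 5000
spending = rng.normal(0.0, 1.0, n)   # standardized public spending on education
income = rng.normal(0.0, 1.0, n)     # standardized household income
X = np.column_stack([np.ones(n), spending, income])
beta_true = np.array([0.2, 0.8, 0.5])
attend = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

def neg_loglik(beta):
    # Probit: P(attend = 1 | x) = Phi(x'beta)
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -np.sum(attend * np.log(p) + (1 - attend) * np.log(1 - p))

res = minimize(neg_loglik, x0=np.zeros(3), method="BFGS")
beta_hat = res.x
# Average marginal effect of spending on the attendance probability
ame_spending = np.mean(norm.pdf(X @ beta_hat) * beta_hat[1])
```

The average marginal effect translates the latent-index coefficient into a change in attendance probability, which is the quantity of policy interest here.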
Abstract:
Over the past few years, many studies have been published on the costs and economic benefits of journal business models. Early studies considered only the costs incurred in publishing traditional journals made available for purchase under a subscription or licensing business model. As the open access business model became available, some studies also covered the cost of making research articles available in open access journals. More recent studies have taken a broader perspective, looking at the position of journal publishers in the market and at their business models in the context of the economic benefits of research dissemination. This briefing paper also looks at the outcomes of the widely cited RIN study and the various national studies performed by John Houghton. All links provided in footnotes in this briefing paper are to studies available in open access.
Abstract:
We consider an LTE network in which a secondary user acts as a relay, transmitting data to the primary user using a decode-and-forward mechanism that is transparent to the base station (eNodeB). Clearly, the relay can decode symbols more reliably if the employed precoder matrix indicators (PMIs) are known. However, for the closed-loop spatial multiplexing (CLSM) transmit mode, this information is not always embedded in the downlink signal, leading to a need for effective methods to determine the PMI. In this thesis, we consider 2x2 and 4x4 MIMO downlink channels corresponding to CLSM and formulate two techniques to estimate the PMI at the relay using a hypothesis-testing framework. We evaluate their performance via simulations for various ITU channel models over a range of SNRs and for different channel quality indicators (CQIs). We compare them to the case where the true PMI is known at the relay and show that the performance of the proposed schemes is within 2 dB at 10% block error rate (BLER) in almost all scenarios. Furthermore, the techniques add minimal computational overhead to the existing receiver structure. Finally, we identify scenarios in which using the proposed precoder detection algorithms in conjunction with the cooperative decode-and-forward relaying mechanism benefits the PUE and improves its BLER performance. We conclude that the proposed algorithms, as well as the cooperative relaying mechanism at the CMR, can be gainfully employed in a variety of real-life scenarios in LTE networks.
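The hypothesis-testing idea for PMI estimation can be sketched as follows. This is a toy illustration assuming known pilot symbols and a small illustrative codebook (not the actual LTE CLSM codebook): each candidate precoder is one hypothesis, and the one minimizing the residual energy at the receiver is selected:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2x2 precoder "codebook" (illustrative entries, not the LTE codebook)
codebook = [
    np.array([[1, 0], [0, 1]]) / np.sqrt(2),
    np.array([[1, 1], [1, -1]]) / 2,
    np.array([[1, 1], [1j, -1j]]) / 2,
]

# Random 2x2 MIMO channel and QPSK pilot symbols
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
true_pmi = 2
n_sym = 64
s = (rng.choice([-1, 1], (2, n_sym)) + 1j * rng.choice([-1, 1], (2, n_sym))) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=(2, n_sym)) + 1j * rng.normal(size=(2, n_sym)))
y = H @ codebook[true_pmi] @ s + noise  # received signal

def detect_pmi(y, H, s, codebook):
    """Hypothesis test: pick the precoder that minimizes the residual energy."""
    errs = [np.linalg.norm(y - H @ W @ s) for W in codebook]
    return int(np.argmin(errs))

pmi_hat = detect_pmi(y, H, s, codebook)
```

Under Gaussian noise, minimizing the residual energy is the maximum-likelihood decision among the candidate precoders, which is why the hypothesis-testing view maps naturally onto this argmin.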
Abstract:
Given the significance of econometric models in the foreign exchange market, the purpose of this research is to examine more closely some important issues in this area. The research covers exchange rate pass-through into import prices, liquidity risk and expected returns in the currency market, and the common risk factors in currency markets. Firstly, given the importance of exchange rate pass-through in financial economics, the first empirical chapter studies the degree of exchange rate pass-through into import prices in emerging economies and developed countries, using panel evidence for comparison and covering the period 1970-2009. Pooled mean group estimation (PMGE) is used to investigate the short-run coefficients and error variance. In general, the results show that import prices are affected positively, though incompletely, by the exchange rate. Secondly, the following study addresses the question of whether there is a relationship between cross-sectional differences in foreign exchange returns and the sensitivities of those returns to fluctuations in liquidity, known as liquidity betas, using a unique dataset of weekly order flow. Finally, the last study follows Lustig, Roussanov and Verdelhan (2011), who show that the large co-movement among exchange rates of different currencies supports a risk-based view of exchange rate determination, and explores the identification of a slope factor in exchange rate changes. The study first constructs monthly portfolios of currencies sorted on the basis of their forward discounts: the lowest interest rate currencies are contained in the first portfolio and the highest interest rate currencies in the last. The results show that, comparing the first and last portfolios, portfolios with higher forward discounts tend to contain currencies with higher real interest rates overall, though fluctuations occur.
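The portfolio-sorting procedure described in the last study can be sketched on illustrative data. The currencies, forward discounts, and the built-in carry relation below are fabricated for the example, not actual market data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative cross-section: forward discounts and next-month excess
# returns for 30 hypothetical currencies
n_ccy, n_port = 30, 5
fwd_discount = rng.normal(0.002, 0.01, n_ccy)
# Stylized carry fact baked in for illustration: returns rise with the discount
excess_ret = 0.5 * fwd_discount + rng.normal(0, 0.002, n_ccy)

order = np.argsort(fwd_discount)            # lowest discount (low interest rate) first
portfolios = np.array_split(order, n_port)  # portfolio 1 = lowest, portfolio 5 = highest
port_ret = np.array([excess_ret[idx].mean() for idx in portfolios])
```

Repeating this sort every month and tracking the spread between the last and first portfolios yields the slope factor (high-minus-low carry return) referred to in the abstract.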
Abstract:
The present thesis is a study of movie review entertainment (MRE), a contemporary Internet-based genre of texts. MRE are movie reviews in video form which are published online, usually as episodes of an MRE web show. Characteristic of MRE is the combination of humor and honest opinions in varying degrees, as well as the use of subject materials, i.e. clips of the movies, as part of the review. The study approached MRE from a linguistic perspective, aiming to discover 1) whether MRE is primarily text- or image-based and what the primary functions of the modes are, 2) how a reviewer linguistically ties subject footage to her/his commentary, 3) whether there is any internal variation in MRE regarding the aforementioned questions, and 4) how suitable the selected models and theories are for the analysis of this type of contemporary multimodal data. To answer these questions, the multimodal system of image-text relations by Martinec and Salway (2005), in combination with the categories of cohesion by Halliday and Hasan (1976), was applied to four full MRE videos, which were transcribed in their entirety for the study. The primary data represent varying types of MRE: a current movie review, an analytic essay, a riff review, and a humorous essay. The results demonstrated that the prioritization of image versus text can vary between reviews and also within a review. The current movie review and the two essays were primarily commentary-focused, whereas the riff review was significantly more dependent on the use of imagery, as the clips are a major source of humor, a prominent value in that type of review. In addition to humor, clips are used to exemplify the commentary. A reviewer also relates new information to the imagery and uses the two modes together to present the information in a review. Linguistically, the most frequent case was that the reviewer names participants and processes lexically in the commentary. Grammatical relations (reference items such as pronouns and adverbs, and conjunctive items in the riff review) were also encountered. There was internal variation to a considerable degree. The methods chosen were deemed appropriate to answer the research questions. Further study could go beyond linguistics to include, for instance, genre and media studies.
Abstract:
Spinal cord injury (SCI) is a devastating condition which results from trauma to the cord: a primary injury response leads to a secondary injury cascade, causing damage to both glial and neuronal cells. Following trauma, the central nervous system (CNS) fails to regenerate due to a plethora of both intrinsic and extrinsic factors. Unfortunately, these events lead to loss of both motor and sensory function and to lifelong disability and care for sufferers of SCI. Tremendous advances have been made in our understanding of the mechanisms behind axonal regeneration and remyelination of the damaged cord, and these have provided many promising therapeutic targets. However, very few have made it to clinical application, potentially due to inadequate understanding of compound mechanisms of action and reliance on poor SCI models. This thesis describes the use of an established neural cell co-culture model of SCI as a medium-throughput screen for compounds with potential therapeutic properties. A number of compounds were screened, which resulted in one family of compounds, modified heparins, being taken forward for more intensive investigation. Modified heparins (mHeps) consist of the core heparin disaccharide unit with variable sulphation groups on the iduronic acid and glucosamine residues: 2-O-sulphate (C2), 6-O-sulphate (C6) and N-sulphate (N). The 2-O-sulphated (mHep6) and N-sulphated (mHep7) heparin isomers were shown to promote both neurite outgrowth and myelination in the SCI model. Both mHeps decreased oligodendrocyte precursor cell (OPC) proliferation and increased oligodendrocyte (OL) numbers adjacent to the lesion. However, the direct effects on the OL differ between the mHeps: mHep6 increased myelin internode length, whereas mHep7 increased overall cell size. It was further shown that these isoforms interact with and mediate both Wnt and FGF signalling.
In OPC monoculture experiments, FGF2-treated OPCs displayed increased proliferation, but this effect was abolished by co-treatment with the mHeps, suggesting that the mHeps interact with the ligand and inhibit FGF2 signalling. Additionally, both mHeps appear to partially mediate their effects through the Wnt pathway: mHep effects on both myelination and neurite outgrowth were abolished by co-treatment with a Wnt signalling inhibitor, suggesting signalling mediation by ligand immobilisation and signalling activation as a mechanism of action for the mHeps. However, the initial methods employed in this thesis were not sufficient for a more detailed study of the effects of the mHeps on neurite outgrowth. This led to the design and development of a novel microfluidic device (MFD), which provides a platform for the study of axonal injury. The device has three chambers, with two chambers converging onto a central open-access chamber. This design allows axons from two points of origin to enter a chamber that can be subjected to injury, providing a platform in which targeted axonal injury and studies of the regenerative capacity of a compound can be performed. In conclusion, this thesis contributes to and advances the study of SCI in two ways: 1) the identification and investigation of a novel set of compounds with therapeutic potential, i.e. desulphated modified heparins; these compounds have multiple therapeutic properties and could both revolutionise the understanding of the basic pathological mechanisms underlying SCI and serve as a powerful therapeutic option; and 2) the development of a novel microfluidic device to study axonal biology in greater detail, specifically targeted axonal injury and treatment, providing a more representative model of SCI than standard in vitro models. The MFD could therefore lead to advances in the identification of factors and compounds relating to axonal regeneration.
Abstract:
Semantic relations are an important element in the construction of ontologies and models of problem domains. Nevertheless, they often remain fuzzy or under-specified. This is a pervasive problem in software engineering and artificial intelligence: we find semantic links that admit multiple interpretations in wide-coverage ontologies, semantic data models whose abstractions are not sufficient to capture the relational richness of problem domains, and improperly structured taxonomies. If relations are provided with precise semantics, some of these problems can be avoided, and meaningful operations can be performed on them. In this paper we present some insights into the modeling, representation and usage of relations, including the available taxonomy-structuring methodologies as well as initiatives aiming to provide relations with precise semantics. Moreover, we explain and propose the control of relations as a key issue for the coherent construction of ontologies.
Abstract:
Understanding how virus strains offer protection against closely related emerging strains is vital for creating effective vaccines. For many viruses, including Foot-and-Mouth Disease Virus (FMDV) and the Influenza virus, where multiple serotypes often co-circulate, in vitro testing of large numbers of vaccines can be infeasible. Therefore, the development of an in silico predictor of cross-protection between strains is important to help optimise vaccine choice. Vaccines will offer cross-protection against closely related strains, but not against those that are antigenically distinct. To be able to predict cross-protection, we must understand the antigenic variability within a virus serotype and within distinct lineages of a virus, and identify the antigenic residues and evolutionary changes that cause the variability. In this thesis we present a family of sparse hierarchical Bayesian models for detecting relevant antigenic sites in virus evolution (SABRE), as well as an extended version of the method, the extended SABRE (eSABRE) method, which better takes into account the data collection process. The SABRE methods are sparse Bayesian hierarchical models that use spike-and-slab priors to identify sites in the viral protein which are important for the neutralisation of the virus. We demonstrate how the SABRE methods can be used to identify antigenic residues within different serotypes, and show how the SABRE method outperforms established methods, namely mixed-effects models based on forward variable selection or L1 regularisation, on both synthetic and viral datasets. In addition, we test a number of different versions of the SABRE method, comparing conjugate and semi-conjugate prior specifications as well as an alternative to the spike-and-slab prior, the binary mask model. We also propose novel proposal mechanisms for the Markov chain Monte Carlo (MCMC) simulations, which improve mixing and convergence over the established component-wise Gibbs sampler.
The SABRE method is then applied to datasets from FMDV and the Influenza virus in order to identify a number of known antigenic residues and to provide hypotheses about other potentially antigenic residues. We also demonstrate how the SABRE methods can be used to create accurate predictions of the important evolutionary changes of the FMDV serotypes. In this thesis we provide an extended version of the SABRE method, the eSABRE method, based on a latent variable model. The eSABRE method further takes into account the structure of the FMDV and Influenza datasets through the latent variable model and improves the modelling of the error. We show how the eSABRE method outperforms the SABRE methods in simulation studies, and we propose a new information criterion for selecting the random effects factors that should be included in the eSABRE method: the block integrated Widely Applicable Information Criterion (biWAIC). We demonstrate that biWAIC performs on par with two other methods for selecting the random effects factors, and we combine it with the eSABRE method to apply it to two large Influenza datasets. Inference in these large datasets is computationally infeasible with the SABRE methods, but as a result of the improved structure of the likelihood, the eSABRE method offers a computational improvement that allows it to be used on these datasets. The results show that the eSABRE method can be used in a fully automatic manner to identify a large number of antigenic residues at a variety of the antigenic sites of two Influenza serotypes, as well as to predict a number of nearby sites that may also be antigenic and are worthy of further experimental investigation.
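A minimal sketch of the spike-and-slab idea underlying the SABRE models, using a component-wise Gibbs sampler on simulated data. The fixed hyperparameters, point-mass spike, and plain linear model are simplifying assumptions for illustration; the actual SABRE models are hierarchical and considerably richer:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated data: 20 candidate "sites", only the first 3 truly relevant
n, p = 200, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[0, 1, 2]] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.normal(scale=0.5, size=n)

sigma2, tau2, pi_incl = 0.25, 1.0, 0.2   # assumed fixed hyperparameters
n_iter, burn = 1500, 500
beta = np.zeros(p)
incl_count = np.zeros(p)

for it in range(n_iter):
    for j in range(p):
        r = y - X @ beta + X[:, j] * beta[j]    # residual excluding coefficient j
        v_j = 1.0 / (X[:, j] @ X[:, j] / sigma2 + 1.0 / tau2)
        mu_j = v_j * (X[:, j] @ r) / sigma2
        # Posterior log-odds of inclusion (point-mass spike, Gaussian slab)
        log_odds = (np.log(pi_incl / (1 - pi_incl))
                    + 0.5 * np.log(v_j / tau2) + 0.5 * mu_j**2 / v_j)
        if rng.random() < 1.0 / (1.0 + np.exp(-log_odds)):
            beta[j] = rng.normal(mu_j, np.sqrt(v_j))   # slab: draw coefficient
        else:
            beta[j] = 0.0                               # spike: exclude the site
    if it >= burn:
        incl_count += beta != 0

post_incl = incl_count / (n_iter - burn)  # posterior inclusion probabilities
```

The posterior inclusion probabilities play the role that the spike-and-slab indicators play in SABRE: sites with probability near one are flagged as relevant to neutralisation.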
Abstract:
Until recently, the dynamical evolution of the interstellar medium (ISM) was simulated using collisional ionization equilibrium (CIE) conditions. However, the ISM is a dynamical system in which the plasma is naturally driven out of equilibrium by atomic and dynamic processes operating on different timescales. A step forward in the field is a multi-fluid approach taking into account the joint thermal and dynamical evolution of the ISM gas.
Abstract:
Imaging technologies are widely used in fields such as the natural sciences, engineering, medicine, and the life sciences. A broad class of imaging problems reduces to solving ill-posed inverse problems (IPs). Traditional strategies for solving these ill-posed IPs rely on variational regularization methods, which are based on the minimization of suitable energies; they make use of knowledge about the image formation model (the forward operator) and prior knowledge about the solution, but cannot easily incorporate knowledge directly from data. On the other hand, more recent learned approaches can learn the intricate statistics of images from a large set of data, but lack a systematic way to incorporate prior knowledge about the image formation model. The main purpose of this thesis is to discuss data-driven image reconstruction methods which combine the benefits of these two reconstruction strategies for the solution of highly nonlinear ill-posed inverse problems. Mathematical formulations and numerical approaches for imaging IPs, including linear as well as strongly nonlinear problems, are described. More specifically, we address the Electrical Impedance Tomography (EIT) reconstruction problem by unrolling the regularized Gauss-Newton method and integrating a regularizer learned by a data-adaptive neural network. Furthermore, we investigate the solution of nonlinear ill-posed IPs by introducing a deep plug-and-play (PnP) framework that integrates a graph convolutional denoiser into the proximal Gauss-Newton method, with a practical application to EIT, a recently introduced and promising imaging technique. Efficient algorithms are then applied to the solution of the limited-electrodes problem in EIT, combining compressive sensing techniques and deep learning strategies. Finally, a transformer-based neural network architecture is adapted to restore the noisy solution of the computed tomography problem recovered using the filtered back-projection method.
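The regularized Gauss-Newton iteration that the unrolled method builds on can be sketched for a generic nonlinear least-squares problem. In the learned variant described above, the fixed Tikhonov term would be replaced by a data-adaptive neural regularizer; the toy exponential model below is an assumption for illustration, not an EIT forward operator:

```python
import numpy as np

def gauss_newton_tikhonov(residual, jacobian, x0, alpha=1e-2, n_iter=20):
    """Regularized Gauss-Newton: at each step solve the Tikhonov-damped
    normal equation (J'J + alpha*I) dx = -J'r and update x."""
    x = x0.astype(float)
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        dx = np.linalg.solve(J.T @ J + alpha * np.eye(x.size), -J.T @ r)
        x = x + dx
    return x

# Toy nonlinear problem: recover (a, b) from y = a * exp(b * t) + noise
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 50)
a_true, b_true = 2.0, -1.3
y = a_true * np.exp(b_true * t) + 0.01 * rng.normal(size=t.size)

res = lambda x: x[0] * np.exp(x[1] * t) - y
jac = lambda x: np.column_stack([np.exp(x[1] * t), x[0] * t * np.exp(x[1] * t)])
x_hat = gauss_newton_tikhonov(res, jac, np.array([1.0, 0.0]))
```

Unrolling fixes the number of iterations and makes each step a layer of a network, so the regularization term (here a constant alpha*I) can be learned from training pairs instead of being hand-tuned.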
Abstract:
Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques providing very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also being used to develop intelligent systems. Their success is grounded in complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the field of Explainable AI (XAI) has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques that can be applied to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and reliability of explanations. We subsequently put forward GLEAMS, a model-agnostic interpretable surrogate model which needs to be trained only once, while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios, from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, being used more and more to train complex models instead of the original data. To be able to explain the outcomes of such models, we must guarantee that synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end, we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
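The instability that the Stability Indices are designed to quantify can be illustrated with a LIME-style local surrogate. This sketch is not the thesis's actual indices, nor the LIME library: it is just a weighted local linear fit around one point, repeated under different random perturbation samples, with the spread of the resulting coefficients as a crude instability measure:

```python
import numpy as np

rng = np.random.default_rng(5)

# Black-box model (illustrative): a nonlinear scoring function on 3 features
black_box = lambda X: 1 / (1 + np.exp(-(X[:, 0] * X[:, 1] + 0.5 * X[:, 2])))

def local_surrogate(x0, n_samples=500, width=0.5, seed=None):
    """LIME-style local explanation: perturb around x0, fit a proximity-weighted
    linear surrogate to the black-box output, return its feature coefficients."""
    r = np.random.default_rng(seed)
    Z = x0 + width * r.normal(size=(n_samples, x0.size))
    w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / (2 * width**2))  # proximity weights
    A = np.column_stack([np.ones(n_samples), Z])                 # intercept + features
    W = np.diag(w)
    coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ black_box(Z))  # weighted least squares
    return coef[1:]

x0 = np.array([0.5, -1.0, 2.0])
runs = np.array([local_surrogate(x0, seed=s) for s in range(20)])
# Instability across reruns: per-feature spread of the surrogate coefficients
instability = runs.std(axis=0)
```

Because each rerun draws a fresh perturbation sample, the coefficients vary from run to run; a stability index summarizes how much, which is the concern OptiLIME trades off against reliability.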
Abstract:
Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, due to the possible generation of cascading events resulting in severe technological accidents (Natech scenarios). Natural events may damage storage and process equipment containing hazardous substances, which may be released, leading to major accident scenarios called Natech events. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to quantify Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models that estimate the damage probability of equipment or pipeline segments as a result of the impact of the natural event. Therefore, the first aim of the PhD project was to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes, and wind. The project also aimed at the development of new vulnerability models to fill gaps in the literature; in particular, vulnerability models for vertical equipment subject to wind and to flooding were developed. Finally, in order to improve the calculation of Natech risk for linear sources, an original quantitative risk assessment methodology was developed for pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for the assessment of linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
Abstract:
Prosopis rubriflora and Prosopis ruscifolia are important species in the Chaquenian regions of Brazil. Because of the restriction and frequency of their physiognomy, they are excellent models for conservation genetics studies. The use of microsatellite markers (Simple Sequence Repeats, SSRs) has become increasingly important in recent years, and such markers have proven to be a powerful tool for both ecological and molecular studies. In this study, we present the development and characterization of 10 new markers for P. rubriflora and 13 new markers for P. ruscifolia. Genotyping was performed using 40 P. rubriflora samples and 48 P. ruscifolia samples from the Chaquenian remnants in Brazil. The polymorphism information content (PIC) of the P. rubriflora markers ranged from 0.073 to 0.791, and no null alleles or deviations from Hardy-Weinberg equilibrium (HW) were detected. The PIC values for the P. ruscifolia markers ranged from 0.289 to 0.883, but departures from HW and null alleles were detected for certain loci; these departures may have resulted from anthropogenic activities, such as the presence of livestock, which is very common in the remnant areas. In this study, we describe novel polymorphic SSR markers that may be helpful in future genetic studies of P. rubriflora and P. ruscifolia.
Abstract:
In acquired immunodeficiency syndrome (AIDS) studies, it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits, depending on the quantification assay. A complication arises when these continuous repeated measures exhibit heavy-tailed behavior. For such data structures, we propose a robust censored linear model based on the multivariate Student's t-distribution. To accommodate the autocorrelation among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation-maximization (EM) type algorithm is developed for computing the maximum likelihood estimates, obtaining as by-products the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus-AIDS (HIV-AIDS) study and several simulation studies.
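The damped exponential correlation (DEC) structure mentioned above has a simple closed form, corr(y_i, y_j) = phi ** (|t_i - t_j| ** theta), with 0 < phi < 1 and damping parameter theta. A minimal sketch of building the correlation matrix for irregular measurement times (the times and parameter values below are illustrative assumptions):

```python
import numpy as np

def dec_correlation(times, phi, theta):
    """Damped exponential correlation: corr(y_i, y_j) = phi ** (|t_i - t_j| ** theta).
    theta = 1 recovers a continuous-time AR(1); theta = 0 gives compound symmetry."""
    d = np.abs(np.subtract.outer(times, times))
    R = phi ** (d ** theta)
    np.fill_diagonal(R, 1.0)   # 0 ** 0 = 1 in NumPy, so fix the diagonal explicitly
    return R

# Irregularly spaced viral-load measurement times (illustrative values, in months)
times = np.array([0.0, 0.5, 1.2, 3.0, 6.5])
R = dec_correlation(times, phi=0.6, theta=0.8)
```

Because the correlation depends only on the time gap, the same matrix-building rule applies to any subject regardless of how irregular the visit schedule is, which is what makes DEC convenient for this kind of data.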