876 results for 2447: modelling and forecasting


Relevance:

100.00%

Publisher:

Abstract:

Cells of epithelial origin, e.g. from breast and prostate cancers, effectively differentiate into complex multicellular structures when cultured in three dimensions (3D) instead of on conventional two-dimensional (2D) adherent surfaces. The spectrum of different organotypic morphologies is highly dependent on the culture environment, which can be either non-adherent or scaffold-based. When embedded in physiological extracellular matrices (ECMs), such as laminin-rich basement membrane extracts, normal epithelial cells differentiate into acinar spheroids reminiscent of glandular ductal structures. Transformed cancer cells, in contrast, typically fail to undergo acinar morphogenesis, forming poorly differentiated or invasive multicellular structures. 3D cancer spheroids are widely accepted to better recapitulate various tumorigenic processes and drug responses. So far, however, 3D models have been employed predominantly in academia, whereas the pharmaceutical industry has yet to adopt them for wider and more routine use. This is mainly due to the poor characterisation of cell models and the lack of standardised workflows, high-throughput cell culture platforms, and proper readout and quantification tools. In this thesis, a complete workflow has been established, entailing well-characterised 3D cell culture models for prostate cancer, a standardised 3D cell culture routine based on a high-throughput-ready platform, automated image acquisition with concomitant morphometric image analysis, and data visualisation, in order to enable large-scale high-content screens. Our integrated suite of software and statistical analysis tools was optimised and validated using a comprehensive panel of prostate cancer cell lines and 3D models. The tools quantify multiple key cancer-relevant morphological features, ranging from cancer cell invasion through multicellular differentiation to growth, and detect dynamic changes in both morphology and function, such as cell death and apoptosis, in response to experimental perturbations including RNA interference and small molecule inhibitors. Our panel of cell lines included many non-transformed lines and most currently available classic prostate cancer cell lines, which were characterised for their morphogenetic properties in 3D laminin-rich ECM. The phenotypes and gene expression profiles were evaluated for their relevance to pre-clinical drug discovery, disease modelling and basic research. In addition, a spontaneous model for invasive transformation was discovered, displaying a high degree of epithelial plasticity. This plasticity is mediated by an abundant bioactive serum lipid, lysophosphatidic acid (LPA), and its receptor LPAR1. The invasive transformation was caused by abrupt cytoskeletal rearrangement through impaired G protein alpha 12/13 and RhoA/ROCK signalling, and mediated by upregulated adenylyl cyclase/cyclic AMP (cAMP)/protein kinase A and Rac/PAK pathways. The spontaneous invasion model tangibly exemplifies the biological relevance of organotypic cell culture models. Overall, this thesis work underlines the power of novel morphometric screening tools in drug discovery.
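As a rough illustration of the morphometric readouts described above, the sketch below quantifies shape features (area, roundness, solidity) of segmented structures in a 2D image with scikit-image; the function name and feature set are illustrative assumptions, not the thesis's actual software suite.

```python
# Minimal sketch of morphometric feature extraction for 3D-culture images.
# Assumes a 2D projection of a spheroid already segmented into a binary mask;
# the feature set below illustrates the kind of invasion/differentiation
# readouts described, not the thesis's actual analysis suite.
import numpy as np
from skimage import measure

def spheroid_features(mask: np.ndarray) -> list[dict]:
    """Quantify each labelled structure in a binary segmentation mask."""
    labels = measure.label(mask)
    features = []
    for region in measure.regionprops(labels):
        perimeter = region.perimeter or 1.0
        features.append({
            "area": region.area,
            # 4*pi*A/P^2 == 1 for a perfect circle; invasive shapes score lower
            "roundness": 4 * np.pi * region.area / perimeter ** 2,
            # solidity drops when branches/invasive protrusions appear
            "solidity": region.solidity,
        })
    return features

# Toy example: one filled disc (acinus-like, round) in a 64x64 frame
yy, xx = np.mgrid[:64, :64]
mask = (yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2
print(spheroid_features(mask))
```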

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this thesis was to study the design of demand forecasting processes. A literature review in the field of forecasting was conducted, covering general forecasting process design, forecasting methods and techniques, the role of human judgment in forecasting, and forecasting performance measurement. The purpose of the literature review was to identify the important design choices that an organization aiming to design or re-design its demand forecasting process would have to make. In the empirical part of the study, these choices and the existing knowledge behind them were assessed in a case study where a demand forecasting process was re-designed for a company in the fast-moving consumer goods business. The new target process is described, as well as the reasoning behind the design choices made during the re-design. As a result, the most important design choices are highlighted, as well as their immediate effects on other processes directly tied to the demand forecasting process. Additionally, some new insights into the organizational aspects of demand forecasting processes are explored. The preliminary results indicate that in this case the new process did improve forecasting accuracy, although organizational issues related to the process proved to be more challenging than anticipated.
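Forecasting accuracy, as mentioned in the preliminary results, is commonly quantified with error metrics such as those in the minimal sketch below; the thesis does not specify which measures the case company adopted, so these are generic examples.

```python
# Standard demand-forecast accuracy metrics (MAE, MAPE, bias); a generic
# sketch, since the measures actually used by the case company are not stated.
import numpy as np

def forecast_accuracy(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    error = forecast - actual
    return {
        "MAE": np.mean(np.abs(error)),                  # average size of misses
        "MAPE": np.mean(np.abs(error) / actual) * 100,  # relative error, %
        "bias": np.mean(error),                         # systematic over/under-forecast
    }

print(forecast_accuracy(actual=[100, 120, 90], forecast=[110, 115, 95]))
```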

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a Petri net approach is introduced for the modelling and simulation of control strategies in intelligent buildings. In this context, it is claimed that integration with other building systems can be achieved in a more systematic way by adopting a mechatronic approach (i.e. multidisciplinary concepts applied to the development of systems). The case study is the Ambulatory Building of the Medical School Hospital of the University of São Paulo. In particular, the developed methodology is applied to the elevator system and to the HVAC (Heating, Ventilation and Air Conditioning) system. It is shown that, using this approach, the control systems can be integrated, improving performance.
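As a hedged sketch of the approach, the following minimal place/transition "token game" shows how a Petri net control strategy can be simulated; the places and transitions are invented for illustration and do not reproduce the paper's elevator or HVAC models.

```python
# Minimal place/transition Petri net "token game" for simulating a control
# strategy; the HVAC-like interlock below is illustrative, not the paper's model.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy interlock: ventilation may start only while the zone is occupied
net = PetriNet({"zone_occupied": 1, "fan_idle": 1})
net.add_transition("start_fan",
                   inputs={"zone_occupied": 1, "fan_idle": 1},
                   outputs={"zone_occupied": 1, "fan_running": 1})
net.fire("start_fan")
print(net.marking)   # {'zone_occupied': 1, 'fan_idle': 0, 'fan_running': 1}
```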

Relevance:

100.00%

Publisher:

Abstract:

In this doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security and power quality are evaluated by computational modelling and by measurements on an LVDC research platform. Computational models for LVDC network analysis are developed. Time-domain simulation models are implemented in the PSCAD/EMTDC simulation environment and applied to transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of the network and component power losses; it integrates analytical equations that describe the power loss mechanisms of the network components with power flow calculations. For the LVDC network research platform, a monitoring and control software solution is developed and used to deliver measurement data for verification of the developed models and for analysis of the modelling results. In the work, the power loss mechanisms of the LVDC network components and their main dependencies are described, and the energy loss distribution of the LVDC network components is presented. Power quality measurements and current spectra are provided, and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations. DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be the reduction of no-load losses and the improvement of converter efficiency at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality at the LVDC network point of common coupling (PCC) is discussed, and the LVDC network is shown to meet power quality standard requirements. The network behaviour during transients is analysed by time-domain simulations; the network is shown to remain stable during large-scale disturbances, and measurement results from the LVDC research platform confirming this are presented.
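The loss structure behind this conclusion can be sketched as a fixed no-load term plus load-dependent terms; the coefficients below are illustrative assumptions, not values identified on the research platform.

```python
# Sketch of the converter loss structure the dissertation optimises against:
# a fixed no-load term plus linear (switching) and quadratic (conduction)
# load-dependent terms. Coefficients are illustrative, not platform values.
def converter_losses(p_out_kw, p_noload_kw=0.15, k1=0.01, k2=0.005):
    """Total loss = no-load + linear + quadratic load-dependent terms (kW)."""
    return p_noload_kw + k1 * p_out_kw + k2 * p_out_kw ** 2

for load in (1.0, 5.0, 10.0):                      # kW, partial to full load
    loss = converter_losses(load)
    eff = load / (load + loss)
    print(f"{load:5.1f} kW -> loss {loss:.3f} kW, efficiency {eff:.1%}")
```

With a fixed no-load term, efficiency falls sharply at partial loads, which is why no-load losses are singled out as an optimisation target.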

Relevance:

100.00%

Publisher:

Abstract:

With a Sales and Operations Planning (S&OP) process, a company aims to manage demand and supply through planning and forecasting. The studied company uses an integrated S&OP process to improve its operations. The aim of this thesis is to develop this business process by finding the best possible way to manage soft information in S&OP, while also understanding the importance and types (assumptions, risks and opportunities) of soft information in S&OP. Soft information in S&OP helps to refine future S&OP planning, taking into account the uncertainties that affect the balance of long-term demand and supply (typically 12-18 months). A literature review was used to create a framework for a soft information management process in S&OP. No concrete way to manage soft information was found in the existing literature. Because of this gap, the knowledge management literature, which deals with the same type of information management, was used as the basis for creating the framework. The framework defines a four-stage process for managing soft information in S&OP, including the required support systems. The first phase is collecting and acquiring soft information, which also includes categorization. The categorization is the cornerstone for identifying the different requirements that need to be taken into consideration when managing soft information in the S&OP process. The next phase focuses on storing the data; its purpose is to ensure that soft information is managed in a common support system so that the following phase, sharing and application, can make it available to the S&OP users who need it. The goal of the last phase is to use the soft information to understand the assumptions and thoughts behind the numbers in the S&OP plans. In this soft information management process, the support system has a key role: a support system such as an S&OP tool ensures that soft information is stored in the right places, kept up to date and kept relevant. The soft information management process in S&OP strives to improve the documentation of the relevant soft information behind the S&OP plans in the S&OP support system. The process gives individuals the opportunity to review, comment on and evaluate soft information in S&OP, whether created by themselves or by others. In the case company, it was observed that soft information that was not properly documented and distributed caused mistrust towards the planning.
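A minimal sketch of how a support system might represent a categorized soft-information item through the collect, store, share and use stages is given below; all field and type names are illustrative assumptions, not the case company's S&OP tool.

```python
# Sketch of a soft-information record for an S&OP support system, following
# the thesis's categories (assumption / risk / opportunity) and its
# collect -> store -> share -> use stages. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Category(Enum):
    ASSUMPTION = "assumption"
    RISK = "risk"
    OPPORTUNITY = "opportunity"

@dataclass
class SoftInfoItem:
    category: Category
    text: str                       # the reasoning behind the numbers
    author: str
    product_family: str
    horizon_month: str              # which planning bucket it affects
    created: date = field(default_factory=date.today)
    reviews: list[str] = field(default_factory=list)   # comments by other users

    def add_review(self, comment: str) -> None:
        """Sharing/using stage: others review and comment on the item."""
        self.reviews.append(comment)

item = SoftInfoItem(Category.RISK, "Key customer tender may be lost in Q3",
                    author="demand.planner", product_family="beverages",
                    horizon_month="2015-09")
item.add_review("Sales: probability now estimated below 20 %")
print(item.category.value, item.reviews)
```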

Relevance:

100.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is one of the most established quantitative tools for the environmental impact assessment of products. To be able to support environmentally aware decision makers regarding the environmental impacts of biomass value chains, the scope of LCA methodology needs to be augmented to cover land-use-related environmental impacts. This dissertation focuses on analysing and discussing potential impact assessment methods, conceptual models and environmental indicators that have been proposed for implementation in the LCA framework for impacts of land use. The applicability of the proposed indicators and impact assessment frameworks is tested from the practitioner's perspective, focusing especially on forest biomass value chains. The impacts of land use on biodiversity, resource depletion, climate change and other ecosystem services are analysed and discussed, and the interplay between value choices in LCA modelling and the decision-making situations to be supported is critically discussed. It was found that land-use impact indicators are necessary in LCA for highlighting differences in impacts between distinct land-use classes. However, many open questions remain on the certainty of capturing actual impacts of land use, especially regarding the impacts of managed forest land use on biodiversity and on ecosystem services such as water regulation and purification. The climate impact of the energy use of boreal stemwood was found to be higher in the short term and lower in the long term than that of fossil fuels emitting an identical amount of CO2 in combustion, due to the changes induced in forest carbon stocks. The climate impacts of the energy use of boreal stemwood were also found to be higher than previous estimates for forest residues and stumps suggest. The product lifetime was found to have a much greater influence on the climate impacts of wood-based value chains than the origin of the stemwood in either thinnings or final fellings. Climate neutrality seems likely only when almost all the carbon of the harvested wood is stored in long-lived wooden products. In their current form, land-use impacts cannot be modelled with a high degree of certainty nor communicated with an adequate level of clarity to decision makers. Academia needs to keep improving the modelling framework and, more importantly, clearly communicate to decision makers the limited certainty on whether land-use-intensive activities can help in meeting the strict mitigation targets we are globally facing.
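The short-term/long-term asymmetry for stemwood can be illustrated with a toy calculation: combustion releases a biogenic CO2 pulse immediately, while regrowth removes it over decades. The emission factor and regrowth curve below are invented for illustration and are not the dissertation's estimates.

```python
# Toy illustration of the short-term vs long-term climate impact of stemwood
# energy: combustion releases a biogenic CO2 pulse at once, while forest
# regrowth removes it over decades. All numbers are illustrative assumptions,
# not estimates from the dissertation.
FOSSIL_PULSE = 1.00   # normalised CO2 pulse of the fossil reference
WOOD_PULSE = 1.10     # wood releases somewhat more CO2 per unit of energy
T_HALF = 80.0         # years until regrowth has recaptured half the pulse

def net_wood_emissions(t_years: float) -> float:
    """Remaining net CO2 from wood energy after sigmoidal regrowth uptake."""
    uptake = WOOD_PULSE * t_years**2 / (t_years**2 + T_HALF**2)
    return WOOD_PULSE - uptake

for t in (0, 20, 50, 100):
    w = net_wood_emissions(t)
    verdict = "higher" if w > FOSSIL_PULSE else "lower"
    print(f"year {t:3d}: wood {w:.2f} vs fossil {FOSSIL_PULSE:.2f} ({verdict})")
```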

Relevance:

100.00%

Publisher:

Abstract:

This research concerns different statistical methods that help increase the demand forecasting accuracy of company X's forecasting model. The current forecasting process was analysed in detail and, as a result, a graphical scheme of its logical algorithm was developed. Based on the analysis of the algorithm and of the forecasting errors, all potential directions for future improvements to the model's accuracy were gathered into a complete list. Three improvement directions were chosen for further practical research; on their basis, three test models were created and verified. The novelty of this work lies in the methodological approach of the original analysis of the model, which identified its critical points, as well as in the uniqueness of the developed test models. The results of the study formed the basis of a grant from the Government of St. Petersburg.
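Verification of test models of this kind is typically done with a rolling-origin backtest; the sketch below compares two stand-in candidate models on synthetic demand, since the company's actual models and data are not described here.

```python
# Sketch of a rolling-origin backtest for comparing candidate forecasting
# models. The candidates (naive, moving average) are illustrative stand-ins,
# not the three test models developed in the research.
import numpy as np

def backtest(series, model, window=12):
    """One-step-ahead errors: fit on history, forecast the next point."""
    errors = []
    for t in range(window, len(series)):
        history = series[:t]
        errors.append(series[t] - model(history))
    return np.mean(np.abs(errors))          # MAE over all forecast origins

naive = lambda h: h[-1]                      # last observed value
moving_avg = lambda h: np.mean(h[-3:])       # 3-period moving average

rng = np.random.default_rng(0)
demand = 100 + 10 * np.sin(np.arange(48) / 6) + rng.normal(0, 3, 48)
for name, model in [("naive", naive), ("moving_avg", moving_avg)]:
    print(f"{name:10s} MAE = {backtest(demand, model):.2f}")
```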

Relevance:

100.00%

Publisher:

Abstract:

Process management refers to improving the key functions of a company. The main functions of the case company (project management, procurement, finance, and human resources) use their own separate systems. The case company is in the process of changing its software, after which the different functions will use the same system. This software change alters some of the company's processes, one of which is the project cash flow forecasting process. Cash flow forecasting ensures the sufficiency of funds and prepares the company for possible future changes, which helps to ensure its viability. The purpose of the research is to describe a new project cash flow forecasting process. In addition, the aim is to analyze, through process measurement, the impacts of the process change on the project control department's workload and resources, and how these impacts should be taken into account in the department's future operations. The research is based on process management. Processes, their descriptions, and the way process management uses this information are discussed in the theoretical part of the research, which is based on literature and articles. Project cash flow and the benefits of forecasting are also discussed. After this, the as-is and to-be project cash flow forecasting processes are described, utilizing information from the theoretical part as well as the know-how of the project control department's personnel. Written descriptions and cross-functional flowcharts are used for the descriptions. Process measurement is based on interviews with the personnel, mainly cost controllers and department managers. The process change and the integration of the two processes will free up work time for other tasks, for example the analysis of costs. In addition, the quality of the cash flow information will improve compared to the as-is process. When analyzing the department's other main processes, the department's roles and responsibilities should be reviewed and redesigned; this way, the best possible efficiency and cost savings can be achieved.

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this thesis was to study whether quantitative sales forecasting methods enhance the accuracy of the sales forecast in comparison to a qualitative sales forecasting method. A literature review in the field of forecasting was conducted, covering the general sales forecasting process, forecasting methods and techniques, and forecasting accuracy measurement. In the empirical part of the study, the accuracy of the forecasts provided by both qualitative and quantitative methods is studied and compared for short-, medium- and long-term forecasts. The SAS® Forecast Server tool was used to create the quantitative forecasts.
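As a sketch of the quantitative side of the comparison, an open-source analogue of a statistical forecasting run (Holt-Winters exponential smoothing in statsmodels) is shown below on synthetic data; this is a stand-in, not the SAS® Forecast Server workflow or the case data.

```python
# A quantitative forecast in the spirit of the thesis's statistical runs,
# sketched with an open-source analogue (statsmodels Holt-Winters) on
# synthetic monthly sales; a stand-in, not the tool or data used in the study.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
months = np.arange(48)
sales = (200 + 2 * months                          # upward trend
         + 30 * np.sin(2 * np.pi * months / 12)    # yearly seasonality
         + rng.normal(0, 10, 48))                  # noise

model = ExponentialSmoothing(sales, trend="add",
                             seasonal="add", seasonal_periods=12).fit()
print(model.forecast(6))    # statistical forecast for the next six months
```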

Relevance:

100.00%

Publisher:

Abstract:

In today's complex and volatile business environment, companies that are able to turn the operational data they produce into knowledge repositories can achieve a significant competitive advantage. Using predictive analytics to anticipate future trends enables companies to identify the key factors that differentiate them from their competitors. Incorporating predictive analytics into the decision-making process enables more agile, real-time decision making. The purpose of this Master's thesis is to assemble a theoretical framework for analytics modelling from the business end user's perspective and to apply this modelling process to the thesis's case company. The theoretical model was used for customer modelling and for identifying predictive factors for sales forecasting. The work was carried out for a Finnish wholesaler of industrial filters with operations in Finland, Russia and the Baltics. This study is a quantitative case study in which the case company's transaction data was the most important data source. The data was obtained from the company's ERP system.
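A minimal sketch of the kind of predictive-factor identification described: a regression of sales on candidate transaction-level features. The features, coefficients and data below are invented placeholders, not the wholesaler's ERP data.

```python
# Sketch of identifying predictive factors for sales from transaction data;
# features and data are invented placeholders, not the case company's ERP data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([
    rng.poisson(5, n),        # orders last month
    rng.uniform(0, 1, n),     # share of repeat customers
    rng.normal(0, 1, n),      # price index change
])
y = 50 + 8 * X[:, 0] + 30 * X[:, 1] - 12 * X[:, 2] + rng.normal(0, 5, n)

model = LinearRegression().fit(X, y)
for name, coef in zip(["orders", "repeat_share", "price_change"], model.coef_):
    print(f"{name:13s} effect on sales: {coef:+.1f}")
```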

Relevance:

100.00%

Publisher:

Abstract:

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was always done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage is an indicator of the high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that let us prove its correctness, together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms, and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity compared to creating the models from written specifications. We also consider the decoding part of a media distribution system by showing how video decoding can be done in parallel, based on formally defined dependencies between frames and blocks in a video sequence; this step, too, can be performed in a way that is mathematically proven correct. The modelling and proving in this thesis is largely tool-based, which demonstrates the maturity of formal methods as well as their increased reliability, and thus advocates for their more widespread usage in the future.
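To make the streaming adaptation concrete, the sketch below shows one plausible piece-selection policy: rarest-first restricted to a sliding window ahead of the playback position. The thesis models this in Event-B; this Python rendering, including the window size and tie-breaking rule, is an illustrative assumption.

```python
# Sketch of BitTorrent piece selection adapted for on-demand streaming:
# rarest-first, but restricted to a sliding window ahead of the playback
# position so pieces arrive roughly in order. Window size and tie-breaking
# are illustrative choices, not the thesis's Event-B model.
def select_piece(playback_pos, have, availability, window=8):
    """Pick the rarest missing piece within the playback window.

    availability[i] = number of peers that have piece i.
    """
    window_pieces = [i for i in range(playback_pos, playback_pos + window)
                     if i < len(availability) and i not in have]
    if not window_pieces:
        return None
    # rarest first; the earlier piece wins ties to protect the playback deadline
    return min(window_pieces, key=lambda i: (availability[i], i))

have = {0, 1, 3}
availability = [5, 4, 1, 2, 1, 3, 6, 2, 4, 5]
print(select_piece(playback_pos=2, have=have, availability=availability))  # -> 2
```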

Relevance:

100.00%

Publisher:

Abstract:

This study examined the effect of explicitly instructing students to use a repertoire of reading comprehension strategies. Specifically, this study examined whether providing students with a "predictive story-frame" which combined the use of prediction and summarization strategies improved their reading comprehension relative to providing students with generic instruction on prediction and summarization. Results were examined in terms of instructional condition and reading ability. Students from 2 grade 4 classes participated in this study. The reading component of the Canadian Achievement Tests, Second Edition (CAT/2) was used to identify students as either "average or above average" or "below average" readers. Students received either strategic prediction and summarization instruction (story-frame) or generic prediction and summarization instruction (notepad). Students were provided with new but comparable stories for each session. For both groups, the researcher modelled the strategic tools and provided guided practice, independent practice, and independent reading sessions. Comprehension was measured with an immediate and 1-week delayed comprehension test for each of the 4 stories. In addition, students participated in a 1-week delayed interview, where they were asked to retell the story and to answer questions about the central elements (character, setting, problem, solution, beginning, middle, and ending events) of each story. There were significant differences, with medium to large effect sizes, in comprehension and recall scores as a function of both instructional condition and reading ability. Students in the story-frame condition outperformed students in the notepad condition, and average to above average readers performed better than below average readers. Students in the story-frame condition outperformed students in the notepad condition on the comprehension tests and on the oral retellings when teacher modelling and guidance were present. In the cued recall sessions, students in the story-frame instructional condition recalled more correct information and generated fewer errors than students in the notepad condition. Average to above average readers performed better than below average readers across comprehension and retelling measures. The majority of students in both instructional conditions reported that they would use their strategic tool again.

Relevance:

100.00%

Publisher:

Abstract:

For the past 20 years, researchers have applied the Kalman filter to modeling and forecasting the term structure of interest rates. Despite its impressive performance in in-sample fitting of yield curves, little research has focused on out-of-sample forecasting of yield curves using the Kalman filter. The goal of this thesis is to develop a unified dynamic model based on Diebold and Li (2006) and Nelson and Siegel's (1987) three-factor model, and to estimate this dynamic model using the Kalman filter. We compare both the in-sample and out-of-sample performance of our dynamic methods with various other models in the literature. We find that our dynamic model dominates existing models in medium- and long-horizon yield curve predictions. However, the dynamic model should be used with caution when forecasting short-maturity yields.
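A minimal sketch of the dynamic model's building blocks: Nelson-Siegel factor loadings and one Kalman predict/update step for the three latent factors (level, slope, curvature). The transition and noise parameters are illustrative assumptions, not the thesis's estimates; the loading parameter lambda = 0.0609 follows Diebold and Li's choice for maturities in months.

```python
# Sketch of the dynamic Nelson-Siegel setup estimated with a Kalman filter,
# in the spirit of Diebold and Li (2006): three latent factors follow a VAR(1)
# and yields load on them through the Nelson-Siegel loadings. Parameter
# values below are illustrative, not estimates from the thesis.
import numpy as np

def ns_loadings(taus, lam=0.0609):
    """Nelson-Siegel factor loadings for maturities taus (in months)."""
    taus = np.asarray(taus, float)
    decay = (1 - np.exp(-lam * taus)) / (lam * taus)
    return np.column_stack([np.ones_like(taus),           # level
                            decay,                        # slope
                            decay - np.exp(-lam * taus)]) # curvature

def kalman_step(x, P, y, Z, T, Q, H):
    """One predict/update step of the Kalman filter."""
    x_pred = T @ x                       # factor prediction
    P_pred = T @ P @ T.T + Q
    v = y - Z @ x_pred                   # yield-curve prediction error
    S = Z @ P_pred @ Z.T + H
    K = P_pred @ Z.T @ np.linalg.inv(S)  # Kalman gain
    return x_pred + K @ v, (np.eye(len(x)) - K @ Z) @ P_pred

taus = [3, 12, 36, 60, 120]              # maturities in months
Z = ns_loadings(taus)
T = 0.95 * np.eye(3)                     # persistent factor dynamics (assumed)
Q, H = 0.01 * np.eye(3), 0.0001 * np.eye(len(taus))
x, P = np.array([6.0, -1.0, 0.5]), np.eye(3)

y_obs = Z @ np.array([5.8, -0.8, 0.4])   # a synthetic observed yield curve
x, P = kalman_step(x, P, y_obs, Z, T, Q, H)
print(np.round(x, 3))                    # updated level, slope, curvature
```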

Relevance:

100.00%

Publisher:

Abstract:

My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first article, we develop a computationally efficient state smoothing procedure for linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyse the computational efficiency of methods based on the Kalman filter, the Cholesky factor algorithm, and our new method, using operation counts and computational experiments. We show that for many important cases our method is more efficient. The gains are particularly large when the dimension of the observed variables is large or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, used to analyse transaction count data in financial markets.

In the second chapter, we propose a new technique for analysing multivariate stochastic volatility models. The proposed method is based on efficient draws of the volatility from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginals with specific degrees of freedom to capture the heterogeneity of the returns. We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and for two multivariate models.

In the third chapter, we evaluate the information contributed by variations in realised volatility to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic volatility models. We take the viewpoint of an investor for whom volatility is an unknown latent variable and realised volatility is a sample quantity that contains information about it. We employ Bayesian Markov chain Monte Carlo methods to estimate the models, which allow the formulation not only of posterior densities of the volatility but also of predictive densities of future volatility. We compare volatility forecasts, and the hit rates of forecasts, that do and do not use the information contained in realised volatility. This approach differs from existing ones in the empirical literature, which are mostly limited to documenting the ability of realised volatility to forecast itself. We present empirical applications using daily returns of indices and exchange rates. The competing models are applied to the second half of 2008, a defining period in the recent financial crisis.
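The efficiency argument of the first chapter rests on the banded structure of the states' posterior precision. The sketch below illustrates the idea for a univariate Gaussian AR(1) state-space model, using a dense Cholesky factor for clarity where the thesis's method would exploit bandedness; the model and parameter values are illustrative assumptions.

```python
# Minimal sketch of precision-based state sampling in a linear Gaussian
# state-space model: the posterior precision of the states is tridiagonal,
# so its Cholesky factor is cheap. Illustrative model:
#   a_t = phi * a_{t-1} + eta_t,   y_t = a_t + eps_t.
import numpy as np

def sample_states(y, phi=0.95, sig_eta=0.2, sig_eps=0.5, rng=None):
    """Draw the state path a_{1:n} from its posterior given y_{1:n}."""
    rng = rng or np.random.default_rng()
    n = len(y)
    # Tridiagonal prior precision (stationary AR(1) initial condition)
    omega = np.diag(np.r_[1.0, np.full(n - 2, 1 + phi**2), 1.0] / sig_eta**2)
    idx = np.arange(n - 1)
    omega[idx, idx + 1] = omega[idx + 1, idx] = -phi / sig_eta**2
    # Add the observation information: posterior precision and its mean
    omega += np.eye(n) / sig_eps**2
    mean = np.linalg.solve(omega, np.asarray(y) / sig_eps**2)
    L = np.linalg.cholesky(omega)        # in practice: banded Cholesky, O(n)
    # x = L^{-T} z has covariance (L L^T)^{-1} = omega^{-1}
    return mean + np.linalg.solve(L.T, rng.standard_normal(n))

y = np.cumsum(np.random.default_rng(3).normal(0, 0.3, 50))
print(sample_states(y)[:5])
```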

Relevance:

100.00%

Publisher:

Abstract:

Subdivision surfaces provide a promising alternative method in geometric modelling and have advantages over the classical trimmed-NURBS representation, in particular for modelling piecewise smooth surfaces. In this Master's thesis, we consider the problem of geometric operations on subdivision surfaces under the strict requirement of correct topological form. Since this problem can be ill-conditioned, we propose an approach for managing the uncertainty that exists in geometric computation. We require the correctness of topological information when considering the robustness of geometric operations on solid models, and it becomes clear that the problem can be ill-conditioned in the presence of the uncertainty that is ubiquitous in the data. We therefore propose an interactive approach for managing the uncertainty of geometric operations, within a computational framework based on IEEE standard arithmetic and subdivision-surface modelling. An algorithm for the planar-cut problem is then presented, whose goal is to satisfy the topological requirement mentioned above.
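One way to make the uncertainty management concrete: classify mesh vertices against the cutting plane with a tolerance band, returning an explicit "uncertain" verdict instead of forcing a sign from noisy floating-point data. The sketch below, including the tolerance value and the three-way outcome, is an illustrative device, not the thesis's algorithm.

```python
# Sketch of tolerance-aware vertex classification against a cutting plane,
# the kind of uncertainty management a robust planar-cut needs: points whose
# signed distance falls inside an epsilon band are flagged as uncertain
# instead of being forced to one side. Details are illustrative assumptions.
import numpy as np

def classify(points, plane_normal, plane_d, eps=1e-9):
    """Return +1 (above), -1 (below) or 0 (uncertain) for each point."""
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)                 # unit normal -> true distances
    dist = np.asarray(points, float) @ n - plane_d
    return np.where(dist > eps, 1, np.where(dist < -eps, -1, 0))

pts = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0), (0.0, 0.0, 1e-12)]
print(classify(pts, plane_normal=(0, 0, 1), plane_d=0.0))   # [ 1 -1  0]
```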