922 results for Empirical Flow Models


Relevance: 30.00%

Abstract:

The reduction of greenhouse gas emissions in the European Union promotes the combustion of biomass rather than fossil fuels in energy production. Circulating fluidized bed (CFB) combustion offers a simple, flexible and efficient way to utilize untreated biomass on a large scale. CFB furnaces are modeled in order to understand their operation better and to help in the design of new furnaces. Therefore, physically accurate models are needed to describe the heavily coupled multiphase flow, reactions and heat transfer inside the furnace. This thesis presents a new model for the fuel flow inside the CFB furnace, which acknowledges the physical properties of the fuel and the multiphase flow phenomena inside the furnace. The model is applied with particular attention to the firing of untreated biomass. An experimental method is utilized to characterize gas-fuel drag force relations. This characteristic drag force approach is developed into a gas-fuel drag force model suitable for irregular, non-spherical biomass particles and applied together with the new fuel flow model in the modeling of a large-scale CFB furnace. The model results are physically valid and correspond very well with measurements from a large-scale CFB furnace firing biomass. With the methods and models presented in this work, the fuel flow field inside a circulating fluidized bed furnace can be modeled more accurately and more efficiently than in previous studies with a three-dimensional holistic model frame.
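The thesis' characteristic drag-force model itself is not reproduced in the abstract, but the kind of empirical correlation such models build on can be sketched. The snippet below implements the widely used Haider-Levenspiel (1989) drag correlation for non-spherical particles, parameterized by sphericity; the coefficient values are those commonly quoted for this correlation and should be verified against the original paper before use.

```python
import math

def drag_coefficient(re, phi):
    """Single-particle drag coefficient from the Haider-Levenspiel (1989)
    correlation for non-spherical particles. `re` is the particle Reynolds
    number and `phi` the sphericity (1.0 for a sphere). This is a generic
    empirical fit, not the characteristic drag model developed in the thesis."""
    a = math.exp(2.3288 - 6.4581 * phi + 2.4486 * phi ** 2)
    b = 0.0964 + 0.5565 * phi
    c = math.exp(4.905 - 13.8944 * phi + 18.4222 * phi ** 2 - 10.2599 * phi ** 3)
    d = math.exp(1.4681 + 12.2584 * phi - 20.7322 * phi ** 2 + 15.8855 * phi ** 3)
    # Stokes-like term, intermediate-regime correction, and Newton-regime term
    return 24.0 / re * (1.0 + a * re ** b) + c / (1.0 + d / re)
```

For a sphere the correlation recovers Stokes drag (CD ≈ 24/Re) at low Reynolds numbers, while an irregular biomass particle (lower sphericity) shows markedly higher drag, which is exactly why sphericity-aware models matter for untreated biomass.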

Relevance: 30.00%

Abstract:

In this work the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide has been investigated using a process design methodology. First the separation task is defined; then phase equilibria experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, K_i values, and separation factors α_ij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. Separation analyses using graphical methods are performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Extraction experiments at laboratory scale are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for additional phase equilibria experiments, needed to determine the dependence of separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (palm fatty acid distillate) was used as a case study.
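The K_i values and separation factors mentioned above follow directly from measured phase compositions. A minimal sketch in Python, with purely illustrative numbers standing in for real PFAD equilibrium data:

```python
def k_values(extract, raffinate):
    """Distribution coefficients K_i = y_i / x_i for each component i,
    from phase-equilibrium data. `extract` and `raffinate` map component
    names to fractions in the CO2-rich phase and the liquid phase."""
    return {c: extract[c] / raffinate[c] for c in extract}

def separation_factor(k, i, j):
    """alpha_ij = K_i / K_j: the further above 1, the easier the i/j split."""
    return k[i] / k[j]

# Hypothetical compositions, for illustration only (not measured data):
k = k_values({"palmitic": 0.012, "oleic": 0.004},
             {"palmitic": 0.30, "oleic": 0.40})
alpha = separation_factor(k, "palmitic", "oleic")
```

With these made-up fractions, K_palmitic = 0.04, K_oleic = 0.01 and α = 4, the sort of magnitude that would make a counter-current split feasible at the chosen conditions.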

Relevance: 30.00%

Abstract:

Software quality has become an important research subject, not only in the Information and Communication Technology spheres, but also in other industries at large where software is applied. Software quality does not arise by happenstance; it is defined, planned and built into the software product throughout the Software Development Life Cycle. The research objective of this study is to investigate the roles of human and organizational factors that influence software quality construction. The study employs the Straussian grounded theory. The empirical data were collected from 13 software companies and comprise 40 interviews. The results of the study suggest that tools, infrastructure and other resources have a positive impact on software quality, but the human factors involved in the software development processes determine the quality of the products developed. Development methods, on the other hand, were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process in which organizational structures, mode of operation, and information flow within the company variably affect software quality. The results also suggest that software development managers influence the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction undertakings.

Relevance: 30.00%

Abstract:

The costs of health care are rising in many countries. In order to provide affordable and effective health care solutions, new technologies and approaches are constantly being developed. In this research, video games are presented as one possible solution. Video games are fun, and nowadays most people like to spend time on them. In addition, recent studies have pointed out that video games can have notable health benefits. Health games have already been developed, used in practice, and researched. However, the bulk of health game studies have been concerned with the design or the effectiveness of the games; no actual business studies have been conducted on the subject, even though health games often lack commercial success despite their health benefits. This thesis seeks to fill that gap. The specific aim of this thesis is to develop a conceptual business model framework and to apply it empirically in exploratory research on medical game business models. In the first stage of this research, a literature review was conducted and the existing literature analyzed and synthesized into a conceptual business model framework consisting of six dimensions. The motivation behind the synthesis is the ongoing ambiguity around the business model concept. In the second stage, 22 semi-structured interviews were conducted with different professionals within the value network for medical games. The business model framework was present at every stage of the empirical research: in the data collection stage, the framework acted as a guiding instrument, focusing the interview process; the interviews were then coded and analyzed using the framework as a structure; and the results were reported following the structure of the framework. In the results, the interviewees highlighted several important considerations and issues for medical games concerning the six dimensions of the business model framework.
Based on the key findings of this research, several key components of business models for medical games were identified and illustrated in a single figure. Furthermore, five notable challenges for business models for medical games were presented, and possible solutions for these challenges were proposed. Theoretically, the findings provide pioneering information on the hitherto unexplored subject of business models for medical games. Moreover, the conceptual business model framework and its use in the novel context of medical games contribute to the business model literature. Regarding practice, this thesis further accentuates that medical games can offer notable benefits to several stakeholder groups and offers advice to companies seeking to commercialize these games.

Relevance: 30.00%

Abstract:

Studying the testis is complex, because the tissue has a very heterogeneous cell composition and its structure changes dynamically during development. In the reproductive field, cell composition is traditionally studied by morphometric methods such as immunohistochemistry and immunofluorescence. These techniques provide accurate quantitative information about cell composition, cell-cell associations and the localization of the cells of interest. However, sample preparation, processing, staining and data analysis are laborious and may take several working days. Flow cytometry protocols coupled with DNA stains have played an important role in providing quantitative information on testicular cell populations in ex vivo and in vitro studies. Nevertheless, the addition of specific cell markers, such as intracellular antibodies, would allow more specific identification of cells of crucial interest during spermatogenesis. For this study, adult Sprague-Dawley rats were used to optimize the flow cytometry protocol. Specific steps within the protocol were optimized to obtain a single-cell suspension representative of the cell composition of the starting material. The fixation and permeabilization procedures were optimized to be compatible with DNA stains and fluorescent intracellular antibodies. Optimization was achieved by quantitative analysis of specific parameters, such as the recovery of meiotic cells and the amount of debris, and by comparison of the proportions of the various cell populations with already published data. As a result, a new and fast flow cytometry method coupled with DNA staining and intracellular antigen detection was developed. This new technique is suitable for analysis of population behavior and of specific cells during postnatal testis development and spermatogenesis in rodents. The rapid protocol recapitulated the known vimentin and γH2AX protein expression patterns during rodent testis ontogenesis. Moreover, the assay was applicable to phenotype characterization of the SCRbKO and E2F1KO mouse models.
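The population quantification that DNA-stain flow cytometry provides boils down to binning events by DNA content relative to the haploid (1C) peak. A minimal sketch of that readout; the window boundaries are illustrative placeholders, not calibrated values from the study:

```python
def classify_ploidy(dna_intensity, haploid_peak):
    """Assign a testicular cell event to a ploidy class from its DNA-stain
    intensity, given the position of the haploid (1C) peak. The window
    boundaries below are hypothetical, for illustration only."""
    c = dna_intensity / haploid_peak
    if c < 1.5:
        return "1C"   # round/elongating spermatids
    elif c < 3.0:
        return "2C"   # somatic cells, spermatogonia, secondary spermatocytes
    else:
        return "4C"   # primary spermatocytes, G2/M cells

def population_fractions(intensities, haploid_peak):
    """Fraction of events per ploidy class -- the kind of readout the
    protocol compares against published morphometric data."""
    counts = {}
    for v in intensities:
        label = classify_ploidy(v, haploid_peak)
        counts[label] = counts.get(label, 0) + 1
    n = len(intensities)
    return {k: c / n for k, c in counts.items()}
```

In the real protocol, the intracellular antibodies (e.g. against vimentin or γH2AX) add a second fluorescence axis on top of this DNA-content axis, which is what lets specific cell types be resolved within each ploidy class.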

Relevance: 30.00%

Abstract:

This study concentrates on developing a suitable business model for Finnish biobanks, with particular emphasis on value creation for stakeholders. The sub-objectives of this thesis are to map the commercial possibilities of biobanks and the potential barriers to business development. The study approaches the subject from the biobanks' as well as the stakeholders' point of view, integrating their hopes and needs regarding current and future co-operation into the findings. In 2013 the Biobank Act came into effect, after which six biobanks have been established and several other biobank projects are pending. There is relatively little research on the commercial opportunities of this newcomer to the biomedical industry, particularly in the Finnish market. Therefore, the aim of this study is to partially fill the research gap on the commercial potential of biobanks and, in particular, to outline the problematic elements in developing business. The theoretical framework consists of a few selected theories depicting the business modeling and value creation of organizations. The theories are combined to form a synthesis, which best adapts to biobanks and acts as a backbone for the interviews. The empirical part of the study was conducted mainly through seven face-to-face interviews, complemented by two phone interviews and an e-mail questionnaire with four responses. The findings consist mainly of the participants' reflections on the potential products and services enabled by consumer genomics, as well as perceptions of different obstacles to biobanks' business development. The nature of the study is tentative, as biobanks are relatively new organizations in Finland, and their operating models and activities are still taking shape. The aim is to bring to the surface the hopes and concerns of biobanks' representatives, as well as the representatives of stakeholders, in order to transparently discuss the current situation and suggestions for further development. The study concludes that, in principle, the interviewees agree on the need for development so as not to waste the potential of biobanks; however, the participants emphasize different aspects and subsequently lean on differing methods.

Relevance: 30.00%

Abstract:

Fluctuating commodity prices, foreign exchange rates and interest rates cause changes in cash flows, market value and companies' profits. Most commodities are quoted in US dollars, so companies with non-dollar accounting face a double risk in the form of commodity price risk and foreign exchange risk. The objective of this Master's thesis is to find out how companies exposed to commodity price risk should manage their foreign exchange exposure. The theoretical literature covers foreign exchange risk, commodity risk and foreign exchange exposure management. The empirical research is done using constructive modelling of a case company in the oil industry. The exposure is modeled with foreign exchange net cash flow and with net working capital. First, the factors affecting foreign exchange exposure in the case company are analyzed; then models of the foreign exchange exposure are created. Finally, the models are compared and the most suitable method is defined. According to the literature, foreign exchange exposure is the foreign exchange net cash flow. However, the results of the study show that foreign exchange risk can also be managed with net working capital. When purchases, sales and storage are all under foreign exchange risk, the best way to manage foreign exchange exposure is a combined net cash flow and net working capital method. The foreign exchange risk policy of the company defines the appropriate way to manage foreign exchange risk.
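The two exposure measures compared in the thesis can be sketched in a few lines. This is a naive illustration of the idea only, not the case company's actual model; in particular, a real combined method must specify how to avoid double-counting forecast cash flows that are already booked as working-capital items.

```python
def fx_net_cash_flow(inflows, outflows):
    """Transaction exposure: forecast foreign-currency (e.g. USD) cash
    inflows minus outflows over the hedging horizon."""
    return sum(inflows) - sum(outflows)

def fx_net_working_capital(receivables, inventory, payables):
    """Balance-sheet exposure: USD-denominated (or USD-priced) receivables
    and inventory, net of USD payables."""
    return receivables + inventory - payables

def combined_exposure(inflows, outflows, receivables, inventory, payables):
    """Combined net cash flow + net working capital exposure, sketched as
    a simple sum of the two measures (illustrative only)."""
    return (fx_net_cash_flow(inflows, outflows)
            + fx_net_working_capital(receivables, inventory, payables))
```

The combined figure is the amount a treasury desk would consider hedging with forwards or options under such a policy.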

Relevance: 30.00%

Abstract:

The aim of this Master's thesis is to examine how the value of a small company is determined in an acquisition. In addition, the study seeks to identify the special characteristics involved in valuing a small company and how different valuation methods suit small-company valuation. The empirical research consists of interviews with three experts specializing in acquisitions and valuation. The interviews aim to deepen understanding of small-company valuation, its special characteristics, the applicable valuation methods, and how the final value is formed. The results show that valuing a small company is a demanding process in which many special characteristics must be taken into account. There is no single correct valuation method; instead, several methods are used in parallel to arrive at a reliable value. The most important characteristic to emerge was the difficulty of forecasting a company's future, which is why the future is given little weight in small-company valuation. This materially affects the valuation methods used and their reliability. The theoretically most rigorous valuation method, the free cash flow model, is according to the results not the most suitable model for all small companies. The reason is that a large part of a company's value derives from future cash flows, which are difficult to forecast for a small company. All the experts used market-based multiples in every small-company valuation, but this model, too, has its own challenges and requires deep knowledge of the company's industry from the valuator. Beyond the valuation itself, the nature of a small company's operations adds special characteristics that are taken into account as value-reducing risk factors. A small company's operations often hinge on the entrepreneur's expertise, which is perceived as a major risk in an acquisition and thus as a factor lowering the final value. It can therefore be concluded that for a small company, valuation establishes the basis and the bounds for the price paid for the target, but the final purchase price is the outcome of negotiations.
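The experts' preference for market-based multiples, adjusted downward for owner-dependence risk, can be illustrated with a minimal sketch. All figures and the discount rate below are hypothetical, for illustration only:

```python
def multiples_value(ebitda, peer_multiple, net_debt, risk_discount=0.0):
    """Market-multiples valuation of a small company: enterprise value from
    a peer EV/EBITDA multiple, less net debt, with an optional percentage
    discount for value-reducing risk factors such as dependence on the
    entrepreneur. All inputs are illustrative assumptions."""
    enterprise_value = ebitda * peer_multiple
    equity_value = enterprise_value - net_debt
    return equity_value * (1.0 - risk_discount)

# Hypothetical example: EUR 200k EBITDA, peer multiple 5x, EUR 100k net
# debt, 20% discount for owner-dependence risk.
value = multiples_value(200_000, 5.0, 100_000, risk_discount=0.2)
```

The resulting figure would serve as the basis and bounds for negotiation, as the abstract concludes, rather than as the final purchase price.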

Relevance: 30.00%

Abstract:

Bedrock channels have been considered challenging geomorphic settings for the application of numerical models. Bedrock fluvial systems exhibit boundaries that are typically less mobile than those of alluvial systems, yet they are still dynamic systems with a high degree of spatial and temporal variability. To understand the variability of fluvial systems, numerical models have been developed to quantify flow magnitudes and patterns as the driving force for geomorphic change. Two types of numerical model were assessed for their efficacy in examining a bedrock channel system consisting of a high-gradient portion of Twenty Mile Creek in the Niagara Region of Ontario, Canada. A one-dimensional (1-D) flow model that utilizes energy equations, HEC-RAS, was used to determine velocity distributions through the study reach for the mean annual flood (MAF), the 100-year return flood and the 1,000-year return flood. A two-dimensional (2-D) flow model that makes use of the Navier-Stokes equations, RMA2, was created with the same objectives. The 2-D modeling effort was not successful due to the spatial complexity of the system (high slope and high variance). The successful 1-D model runs were further extended using very high resolution geospatial interpolations inherent to the HEC-RAS extension, HEC-GeoRAS. The modeled velocity data then formed the basis for a geomorphological analysis that focused upon large particles (boulders) and the forces needed to mobilize them. Several existing boulders were examined by collecting detailed measurements to derive three-dimensional physical models for the application of fluid and solid mechanics to predict movement in the study reach. An imaginary unit cuboid (1 metre by 1 metre by 1 metre) boulder was also envisioned to determine the general propensity for the movement of such a boulder through the bedrock system. The efforts and findings of this study provide a standardized means for assessing large particle movement in a bedrock fluvial system. Further efforts may expand upon this standardization by modeling differing boulder configurations (platy boulders, etc.) at a high level of resolution.
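The force-balance reasoning behind the boulder-mobilization analysis can be illustrated with a much-simplified sliding criterion: the boulder moves when fluid drag exceeds Coulomb friction on its submerged weight. The drag coefficient, friction coefficient and rock density below are assumed round values, and lift, particle packing and bed irregularity (all relevant in the study's detailed 3-D models) are neglected.

```python
RHO_W = 1000.0   # water density, kg/m^3
RHO_S = 2650.0   # typical rock density, kg/m^3 (assumed)
G = 9.81         # gravitational acceleration, m/s^2

def sliding_threshold_velocity(height, width, length, cd=1.05, mu=0.6):
    """Flow velocity (m/s) needed to slide a fully submerged cuboid boulder,
    from a drag-vs-friction force balance. `cd` and `mu` are assumed values
    for a blunt cuboid resting on a rock bed."""
    volume = height * width * length
    frontal_area = height * width              # face normal to the flow
    submerged_weight = (RHO_S - RHO_W) * G * volume
    resisting_force = mu * submerged_weight    # Coulomb friction
    # 0.5 * rho_w * cd * A * v^2 = resisting_force  =>  solve for v
    return (resisting_force / (0.5 * RHO_W * cd * frontal_area)) ** 0.5
```

For the unit cuboid this gives a threshold on the order of a few metres per second, which is the kind of figure that would then be compared against the HEC-RAS velocity distributions for the MAF and the larger return floods.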

Relevance: 30.00%

Abstract:

We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN has a positive association with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, whereby a negative liquidity shock boosts VPIN, which, in turn, leads to a further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers when facing toxic information in the high-frequency trading world.
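The Bulk Volume VPIN metric referred to above splits each equal-volume bucket into buy and sell volume using the standardized price change, then averages the absolute order imbalance across buckets. A minimal sketch of that definition (the bucket construction from raw trades is assumed to have been done already):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bulk_volume_vpin(bucket_price_changes, bucket_volume, sigma):
    """Bulk Volume VPIN: within each equal-volume bucket, volume is
    classified as buy or sell in proportion to the standardized price
    change over the bucket; VPIN is the mean absolute buy-sell imbalance
    relative to bucket volume. `sigma` is the std. dev. of bucket price
    changes; all inputs here are illustrative."""
    imbalance = 0.0
    for dp in bucket_price_changes:
        v_buy = bucket_volume * normal_cdf(dp / sigma)
        v_sell = bucket_volume - v_buy
        imbalance += abs(v_buy - v_sell)
    return imbalance / (len(bucket_price_changes) * bucket_volume)
```

Flat buckets (no price change) give a VPIN near zero, while persistent one-sided price moves push it toward one, which is why the metric rises ahead of toxicity-induced volatility.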

Relevance: 30.00%

Abstract:

This paper tests the predictions of the Barro-Gordon model using US data on inflation and unemployment. To that end, it constructs a general game-theoretical model with asymmetric preferences that nests the Barro-Gordon model and a version of Cukierman's model as special cases. Likelihood ratio tests indicate that the restriction imposed by the Barro-Gordon model is rejected by the data, but the one imposed by the version of Cukierman's model is not. Reduced-form estimates are consistent with the view that the Federal Reserve weights positive unemployment deviations from the expected natural rate more heavily than negative ones.
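The asymmetry the reduced-form estimates point to is commonly captured in this literature with a linex loss function over the unemployment gap; a small sketch (the gamma value is illustrative, not an estimate from the paper):

```python
import math

def linex_loss(u_gap, gamma=0.5):
    """Linex loss over the unemployment gap (actual minus expected natural
    rate). For gamma > 0, positive gaps are penalized more than equal-sized
    negative gaps; as gamma -> 0 the loss converges to the symmetric
    quadratic u_gap**2 / 2 underlying Barro-Gordon."""
    return (math.exp(gamma * u_gap) - gamma * u_gap - 1.0) / gamma ** 2
```

Testing the Barro-Gordon restriction then amounts to testing whether the asymmetry parameter is zero, which is the kind of restriction the likelihood ratio tests above reject.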

Relevance: 30.00%

Abstract:

In a recent paper, Bai and Perron (1998) considered theoretical issues related to the limiting distribution of estimators and test statistics in the linear model with multiple structural changes. In this companion paper, we consider practical issues for the empirical application of the procedures. We first address the problem of estimating the break dates and present an efficient algorithm to obtain global minimizers of the sum of squared residuals. This algorithm is based on the principle of dynamic programming and requires at most a number of least-squares operations of order O(T²) for any number of breaks. Our method can be applied to both pure and partial structural-change models. Second, we consider the problem of forming confidence intervals for the break dates under various hypotheses about the structure of the data and the errors across segments. Third, we address the issue of testing for structural changes under very general conditions on the data and the errors. Fourth, we address the issue of estimating the number of breaks. We present simulation results pertaining to the behavior of the estimators and tests in finite samples. Finally, a few empirical applications are presented to illustrate the usefulness of the procedures. All methods discussed are implemented in a GAUSS program available upon request for non-profit academic use.
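The dynamic-programming principle behind the break-date algorithm can be sketched for the simplest pure structural-change case, a mean-shift model with only an intercept: precompute segment SSRs from prefix sums, then recursively combine them. This is an illustration of the idea, not the authors' GAUSS implementation.

```python
def break_dates(y, m, h=2):
    """Global minimizers of the sum of squared residuals for m breaks in a
    mean-shift model, by dynamic programming (intercept-only sketch of the
    Bai-Perron principle; h is the minimum admissible segment length).
    Returns the last index of each of the first m segments."""
    n = len(y)
    s1 = [0.0] * (n + 1)   # prefix sums of y
    s2 = [0.0] * (n + 1)   # prefix sums of y^2
    for i, v in enumerate(y):
        s1[i + 1] = s1[i] + v
        s2[i + 1] = s2[i] + v * v

    def ssr(i, j):  # SSR of fitting a mean on y[i..j], inclusive, in O(1)
        k = j - i + 1
        s = s1[j + 1] - s1[i]
        return (s2[j + 1] - s2[i]) - s * s / k

    INF = float("inf")
    # cost[k][t]: minimal SSR for y[0..t] split into k+1 segments
    cost = [[INF] * n for _ in range(m + 1)]
    back = [[-1] * n for _ in range(m + 1)]
    for t in range(n):
        if t + 1 >= h:
            cost[0][t] = ssr(0, t)
    for k in range(1, m + 1):
        for t in range(n):
            for s in range(t):          # s = last index before final segment
                if cost[k - 1][s] < INF and t - s >= h:
                    c = cost[k - 1][s] + ssr(s + 1, t)
                    if c < cost[k][t]:
                        cost[k][t] = c
                        back[k][t] = s
    breaks, t = [], n - 1               # recover break dates by backtracking
    for k in range(m, 0, -1):
        t = back[k][t]
        breaks.append(t)
    return sorted(breaks)
```

Because each segment SSR is available in constant time from the prefix sums, the whole computation is dominated by the O(T²) table fill, matching the operation count quoted in the abstract for any number of breaks.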

Relevance: 30.00%

Abstract:

In this paper, we propose several finite-sample specification tests for multivariate linear regressions (MLR) with applications to asset pricing models. We focus on departures from the assumption of i.i.d. errors, at the univariate and multivariate levels, with Gaussian and non-Gaussian (including Student t) errors. The univariate tests studied extend existing exact procedures by allowing for unspecified parameters in the error distributions (e.g., the degrees of freedom in the case of the Student t distribution). The multivariate tests are based on properly standardized multivariate residuals to ensure invariance to MLR coefficients and error covariances. We consider tests for serial correlation, tests for multivariate GARCH, and sign-type tests against general dependencies and asymmetries. The procedures proposed provide exact versions of those applied in Shanken (1990), which consist of combining univariate specification tests. Specifically, we combine tests across equations using the MC test procedure to avoid Bonferroni-type bounds. Since non-Gaussian based tests are not pivotal, we apply the "maximized MC" (MMC) test method [Dufour (2002)], where the MC p-value for the tested hypothesis (which depends on nuisance parameters) is maximized with respect to these nuisance parameters to control the test's significance level. The tests proposed are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995. Our empirical results reveal the following. Whereas univariate exact tests indicate significant serial correlation, asymmetries and GARCH in some equations, such effects are much less prevalent once error cross-equation covariances are accounted for. In addition, significant departures from the i.i.d. hypothesis are less evident once we allow for non-Gaussian errors.
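The MC test procedure that underlies the combination across equations rests on a simple simulation idea: draw replications of the test statistic under the null and locate the observed value among them. A minimal sketch (the statistic simulator is a user-supplied stand-in for drawing artificial samples and recomputing the statistic):

```python
import random

def mc_pvalue(observed_stat, simulate_stat, n_rep=99, seed=0):
    """Monte Carlo test p-value in the spirit of Dufour's MC tests:
    simulate the statistic under the null n_rep times and rank the
    observed value among the replications. With a pivotal statistic the
    resulting test is exact in finite samples at level alpha whenever
    (n_rep + 1) * alpha is an integer."""
    rng = random.Random(seed)
    sims = [simulate_stat(rng) for _ in range(n_rep)]
    ge = sum(1 for s in sims if s >= observed_stat)
    return (ge + 1) / (n_rep + 1)
```

The maximized MC (MMC) variant then maximizes this p-value over the nuisance parameters of the error distribution, which is what keeps the level controlled when the statistic is not pivotal.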

Relevance: 30.00%

Abstract:

We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to simulation-based estimates of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture, and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995.
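The empirical multivariate skewness and kurtosis criteria are Mardia-type statistics computed from standardized residuals. A minimal sketch of their computation, applied here to raw observations rather than to the paper's standardized MLR residuals:

```python
import numpy as np

def mardia(x):
    """Empirical multivariate skewness b_{1,p} and kurtosis b_{2,p}
    (Mardia's statistics) for an n-by-p data matrix. Under multivariate
    normality, b_{1,p} is near 0 and b_{2,p} is near p*(p+2); in the
    paper these sample values are compared to simulation-based null
    expectations rather than to the asymptotic ones."""
    x = np.asarray(x, dtype=float)
    n, p = x.shape
    z = x - x.mean(axis=0)
    s_inv = np.linalg.inv(z.T @ z / n)   # inverse of the MLE covariance
    d = z @ s_inv @ z.T                  # matrix of Mahalanobis cross-products
    b1 = (d ** 3).sum() / n ** 2         # multivariate skewness
    b2 = (np.diag(d) ** 2).sum() / n     # multivariate kurtosis
    return b1, b2
```

Because both statistics are functions of Mahalanobis-type quantities, they inherit the invariance to the unknown error covariance matrix that the residual standardization is designed to deliver.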