53 results for depth estimation


Relevance:

20.00%

Publisher:

Abstract:

Gas-liquid mass transfer is an important issue in the design and operation of many chemical unit operations. Despite its importance, the evaluation of gas-liquid mass transfer is not straightforward due to the complex nature of the phenomena involved. In this thesis, gas-liquid mass transfer was evaluated in three different gas-liquid reactors in the traditional way, by measuring the volumetric mass transfer coefficient (kLa). The studied reactors were a bubble column with a T-junction two-phase nozzle for gas dispersion, an industrial-scale bubble column reactor for the oxidation of tetrahydroanthrahydroquinone, and a concurrent downflow structured bed. The main drawback of this approach is that the obtained correlations give only the average volumetric mass transfer coefficient, which depends on average conditions. Moreover, the obtained correlations are valid only for the studied geometry and for the chemical system used in the measurements. In principle, a more fundamental approach is to estimate the interfacial area available for mass transfer from bubble size distributions obtained by solving population balance equations. This approach was used in this thesis by developing a population balance model for a bubble column together with phenomenological models for bubble breakage and coalescence. The parameters of the bubble breakage rate and coalescence rate models were estimated by comparing the measured and calculated bubble sizes. The coalescence models always have at least one experimental parameter, because bubble coalescence depends on liquid composition in a way that is difficult to evaluate using known physical properties. The coalescence properties of some model solutions were evaluated by measuring the time that a bubble rests at the free liquid-gas interface before coalescing (the so-called persistence time or rest time). The measured persistence times range from 10 ms up to 15 s depending on the solution. Coalescence was never found to be instantaneous: the bubble oscillates up and down at the interface at least a couple of times before coalescence takes place. The measured persistence times were compared to coalescence times obtained by parameter fitting using measured bubble size distributions in a bubble column and a bubble column population balance model. For short persistence times, the persistence and coalescence times are in good agreement. For longer persistence times, however, the persistence times are at least an order of magnitude longer than the corresponding coalescence times from parameter fitting. This discrepancy may be attributed to uncertainties in the estimation of the energy dissipation rates, collision rates and mechanisms, and contact times of the bubbles.
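
The abstract does not detail how kLa is extracted from the measurements; as an illustration only, the sketch below assumes a dynamic gassing-in experiment in which dissolved oxygen follows dC/dt = kLa (C* - C), and fits kLa to synthetic data. All numbers and names are hypothetical, not taken from the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Dynamic gassing-in model: dC/dt = kLa * (C_sat - C)
# => C(t) = C_sat - (C_sat - C0) * exp(-kLa * t)
def oxygen_response(t, kLa, C_sat, C0):
    return C_sat - (C_sat - C0) * np.exp(-kLa * t)

# Hypothetical measured dissolved-oxygen data (t in s, C in mg/L)
rng = np.random.default_rng(0)
t = np.linspace(0, 120, 25)
C_meas = oxygen_response(t, 0.035, 8.0, 0.5) + rng.normal(0, 0.1, t.size)

# Fit kLa together with the saturation and initial concentrations
popt, pcov = curve_fit(oxygen_response, t, C_meas, p0=[0.01, 8.0, 0.0])
print(f"estimated kLa = {popt[0]:.4f} 1/s  (std {np.sqrt(pcov[0, 0]):.4f})")
```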

Relevance:

20.00%

Publisher:

Abstract:

Software engineering is criticized as not being a well-developed engineering discipline or science at all. Software engineers often do not know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. These predictions (i.e., estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes, like cost and schedule, and on product attributes, like size and quality. Effort estimation can be used for several purposes; in this thesis, only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has improved significantly after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but the newly proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. Developing an effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported, and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
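
The thesis's own model is not reproduced in the abstract; as a generic illustration of calibrating an effort model from history data, the sketch below fits the common parametric form effort = a * size^b by linear regression in log-log space. All project data are hypothetical.

```python
import numpy as np

# Hypothetical history data: size (e.g., function points) vs. actual effort (person-hours)
size = np.array([120, 250, 400, 560, 800, 1100])
effort = np.array([300, 700, 1250, 1900, 2900, 4300])

# Calibrate effort = a * size^b via regression in log-log space
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)

def estimate_effort(new_size):
    return a * new_size ** b

print(f"calibrated model: effort = {a:.2f} * size^{b:.2f}")
print(f"estimate for a 650 FP project: {estimate_effort(650):.0f} person-hours")
```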

Relevance:

20.00%

Publisher:

Abstract:

The objective of this work is to create a general information infrastructure for estimating manufacturing costs in the automotive industry. Today, such cost estimation is a widely used practice: it makes it possible to manage product costs, which increases the competitiveness of car manufacturers. Cost estimation requires high-quality information, but the study revealed that several factors hamper this estimation. In particular, scarce resources, information acquisition, and verifying the reliability of information cause problems. These factors have led to the extensive use of experience-based expertise, which makes it difficult, especially for inexperienced cost estimators, to understand the information requirements of cost estimates. For this reason, the study identifies the information and information sources used by experienced cost estimators, with the aim of improving the understanding of cost estimates. Based on the results of the study, an information infrastructure was built that contains the information needed to produce sound and reliable cost estimates. The infrastructure defines the required cost information and its possible sources. In addition, it explains why the information is needed and how its correctness should be verified. The infrastructure is used together with a general cost estimation process model. This integration leads to more accurate and clearer cost estimates in the automotive industry.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this work was to develop the cost estimation process for the projects of the engineering unit under study, so that the unit's management would in future have more accurate cost information at its disposal. To make this possible, the unit's working practices, the cost structures of its projects, and the cost attributes first had to be identified. This was made possible by examining the cost history data of past projects and by interviewing experts. The result of the work is a cost estimation process and model compatible with the unit's other processes. The estimation method and model are based on cost attributes, which are defined separately for the environment under study. The cost attributes are found by examining history data, i.e., by analyzing completed projects, their cost structures, and the factors that have driven the costs. After this, weights and weight ranges must be defined for the cost attributes. The accuracy of the estimation model can be improved by calibrating the model. I used the Goal - Question - Metric (GQM) method as the framework of the study.
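
The actual attributes and weights are defined per environment and are not given in the abstract; as a rough illustration of the described attribute-based approach, the sketch below combines hypothetical weighted cost attributes into an estimate and derives a calibration factor from one completed project.

```python
# Minimal sketch of an attribute-based cost estimate; all attribute names,
# weights, and values here are hypothetical stand-ins.
cost_attributes = {          # attribute: (weight, value for the new project)
    "design_hours":   (95.0, 420),     # EUR per hour
    "complexity":     (2500.0, 3),     # ordinal scale 1..5
    "subcontracting": (1.1, 15000),    # EUR
}

estimate = sum(weight * value for weight, value in cost_attributes.values())
print(f"cost estimate: {estimate:.0f} EUR")

# Calibration: compare the model estimate with the actual cost of a completed
# project and scale the model accordingly.
actual = 68000
calibration_factor = actual / estimate
print(f"calibration factor from one completed project: {calibration_factor:.2f}")
```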

Relevance:

20.00%

Publisher:

Abstract:

The objective of my thesis is to assess the mechanisms of ecological community control in macroalgal communities in the Baltic Sea. In the top-down model, predatory fish feed on invertebrate mesograzers, partly releasing algae from grazing pressure; such a reciprocal relationship is called a trophic cascade. In the bottom-up model, nutrients increase biomass throughout the food chain: the nutrients are first assimilated by algae and, via the food chain, also increase the abundance of grazers and predators. Previous studies on oceanic shores have described these two regulative mechanisms in the grazer - alga link, but how they interact in the trophic cascades from fish to algae is still inadequately known. Because the top-down and bottom-up mechanisms are predicted to depend on environmental disturbances, such as wave stress and light, I studied these models at two distinct water depths. The thesis is based on five factorial field experiments, all conducted in the Finnish Archipelago Sea. In all the experiments, I studied macroalgal colonization - density, filament length, or biomass - on submerged colonization substrates. By excluding predatory fish and mesograzers from the algal communities, the studies compared the strength of top-down control against natural algal communities. In addition, part of the experimental units were exposed to enriched nitrogen and phosphorus concentrations, which enabled testing of bottom-up control. These two models of community control were further investigated in shallow (<1 m) and deep (ca. 3 m) water. Moreover, the control mechanisms were also expected to depend on the grazer species; therefore, different grazer species were enclosed in experimental units and their impacts on macroalgal communities were followed separately. Community control on Baltic rocky shores was found to follow the theoretical predictions, which had not been confirmed by field studies before. Predatory fish limited the grazing impact, which was seen as denser algal communities and longer algal filaments. Nutrient enrichment increased the density and filament length of annual algae and thus changed the species composition of the algal community. The perennial alga Fucus vesiculosus and the red alga Ceramium tenuicorne suffered from the increased nutrient availability. The enriched nutrient conditions led to a denser grazer fauna, thereby causing strong top-down control over both the annual and perennial macroalgae. The strength of the top-down control seemed to depend on the density and diversity of grazers and predators as well as on the species composition of the macroalgal assemblages. However, nutrient enrichment weakened the limiting impact of predatory fish on the grazer fauna, because fish stocks did not respond as quickly to the enhanced resources in the environment as the invertebrate fauna did. According to the environmental stress model, environmental disturbances weaken top-down control; for example, on a wave-exposed shore, wave stress affects animals close to the surface more than those deeper on the shore. Mesograzers were efficient consumers at both depths, while predation by fish was weaker in shallow water. Thus, the results supported the environmental stress model, which predicts that the impact of environmental disturbance is stronger the higher a species is in the food chain. This thesis assessed the mechanisms of community control in three-level food chains and did not take higher predators into account. Such predators in the Baltic Sea include, for example, cormorants, seals, the white-tailed sea eagle, cod, and salmon. All these predatory species were until recently, or still are, subject to intensive fishing, hunting, and persecution, and their stocks have only recently increased in the region. It is therefore possible that future densities of top predators may yet alter the strengths of the controlling mechanisms in the Baltic littoral zone.

Relevance:

20.00%

Publisher:

Abstract:

In mathematical modeling, the estimation of the model parameters is one of the most common problems: the goal is to find parameters that fit the measurements as well as possible. There is always error in the measurements, which implies uncertainty in the model estimates. In Bayesian statistics, all unknown quantities are represented as probability distributions. If there is prior knowledge about the parameters, it can be formulated as a prior distribution. Bayes' rule combines the prior and the measurements into the posterior distribution. Mathematical models are typically nonlinear, and producing statistics for them requires efficient sampling algorithms. In this thesis, the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms as well as Gibbs sampling are introduced. Different ways to specify prior distributions are also presented. The main issue is measurement error estimation and how to obtain prior knowledge of the variance or covariance. Variance and covariance sampling is combined with the algorithms above. The hyperprior models are applied, as examples, to the estimation of model parameters and error in a case with outliers.
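
As an illustration of the Metropolis-Hastings algorithm mentioned above, the following is a minimal random-walk MH sampler for a stand-in one-dimensional posterior; the target density and tuning values are hypothetical, not from the thesis.

```python
import numpy as np

def log_posterior(theta):
    # Stand-in target: Gaussian posterior with mean 2.0 and std 0.5
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

def metropolis_hastings(log_post, theta0, n_samples, proposal_std=0.3):
    rng = np.random.default_rng(0)
    chain = np.empty(n_samples)
    theta, lp = theta0, log_post(theta0)
    for i in range(n_samples):
        proposal = theta + rng.normal(0.0, proposal_std)  # symmetric random walk
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
        chain[i] = theta
    return chain

chain = metropolis_hastings(log_posterior, theta0=0.0, n_samples=5000)
print(f"posterior mean ~ {chain[1000:].mean():.2f}, std ~ {chain[1000:].std():.2f}")
```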

Relevance:

20.00%

Publisher:

Abstract:

Cost estimation is an important but challenging process when designing a new product or a feature of it, verifying the product prices given by suppliers, or planning cost-saving actions for existing products. It is even more challenging when the product is highly modular rather than a bulk product. In general, cost estimation techniques can be divided into two main groups - qualitative and quantitative techniques - which can be further classified into more detailed methods. Generally, qualitative techniques are preferable when comparing alternatives, and quantitative techniques when cost relationships can be found. The main objective of this thesis was to develop a method for estimating the costs of internally manufactured and commercial elevator landing doors. Because of the challenging product structure, the proposed cost estimation framework operates on three different levels, depending on the past cost information available, and combines features from both qualitative and quantitative cost estimation techniques. The starting point for the whole cost estimation process is an unambiguous, hierarchical product structure, so that the product can be divided into controllable parts and is then easier to handle. Those controllable parts can then be compared with existing cost knowledge of similar parts, producing cost estimates that are as accurate as possible.
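
As an illustration of estimating over a hierarchical product structure, the sketch below rolls leaf-level cost estimates up a toy product tree; the part names and costs are hypothetical and not taken from the thesis.

```python
# Hypothetical hierarchical product structure: each node is either a part
# with a cost estimate (leaf) or an assembly of child nodes.
door = {
    "panel assembly": {
        "door panel": 180.0,
        "reinforcements": 45.0,
    },
    "mechanism": {
        "hanger rollers": 60.0,
        "coupling device": 120.0,
    },
    "sill": 75.0,
}

def rollup(node):
    """Sum leaf-level cost estimates up the product structure."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node

print(f"landing door cost estimate: {rollup(door):.2f} EUR")
```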

Relevance:

20.00%

Publisher:

Abstract:

Sensor-based robot control allows manipulation in dynamic environments with uncertainties. Vision is a versatile, low-cost sensory modality, but its low sample rate, high sensor delay, and uncertain measurements limit its usability, especially in strongly dynamic environments. Force is a complementary sensory modality that allows accurate measurements of local object shape when a tooltip is in contact with the object. In multimodal sensor fusion, several sensors measuring different modalities are combined to give a more accurate estimate of the environment. As force and vision are fundamentally different sensory modalities that do not share a common representation, combining the information from these sensors is not straightforward. In this thesis, methods for fusing proprioception, force, and vision are proposed. By making assumptions about the object shape and modeling the uncertainties of the sensors, the measurements can be fused in an extended Kalman filter. The fusion of force and visual measurements makes it possible to estimate the pose of a moving target with an end-effector-mounted moving camera at high rate and accuracy. The proposed approach takes the latency of the vision system into account explicitly in order to provide high-sample-rate estimates. The estimates also allow a smooth transition from vision-based motion control to force control. The velocity of the end-effector can be controlled by estimating the distance to the target by vision and determining a velocity profile that gives a rapid approach and minimal force overshoot. Experiments with a 5-degree-of-freedom parallel hydraulic manipulator and a 6-degree-of-freedom serial manipulator show that integrating several sensor modalities can increase the accuracy of the measurements significantly.
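
The thesis's extended Kalman filter and latency handling are not reproduced here; the sketch below only illustrates the underlying fusion idea with a plain linear Kalman filter that combines a fast, accurate contact-based position measurement with a slower, noisier vision measurement of a 1-D target. All models, rates, and noise values are hypothetical.

```python
import numpy as np

dt = 0.01                                # 100 Hz estimation loop
F = np.array([[1, dt], [0, 1]])          # constant-velocity state model [pos, vel]
Q = np.diag([1e-6, 1e-4])                # process noise
H = np.array([[1.0, 0.0]])               # both sensors measure position
R_force, R_vision = 1e-4, 1e-2           # force accurate, vision noisy (variances)

x, P = np.array([0.0, 0.0]), np.eye(2)
rng = np.random.default_rng(1)

def kf_update(x, P, z, R):
    S = H @ P @ H.T + R                  # innovation covariance (scalar here)
    K = P @ H.T / S                      # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(100):
    x, P = F @ x, F @ P @ F.T + Q        # predict
    true_pos = 0.05 * k * dt             # target moving at 0.05 m/s
    x, P = kf_update(x, P, true_pos + rng.normal(0, 0.01), R_force)
    if k % 10 == 0:                      # vision arrives only at 10 Hz
        x, P = kf_update(x, P, true_pos + rng.normal(0, 0.1), R_vision)

print(f"fused position estimate: {x[0]:.3f} m, velocity: {x[1]:.3f} m/s")
```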

Relevance:

20.00%

Publisher:

Abstract:

In the current economic situation, companies try to reduce their expenses, and one of the solutions is to improve the energy efficiency of their processes. It is known that pumping applications account for 20% to 50% of the energy usage in certain industrial plant operations, and some studies have shown that 30% to 50% of the energy consumed by pump systems could be saved by changing the pump or the flow control method. The aim of this thesis is to create a mobile measurement system that can calculate the working point of a pump drive. This information can be used to determine the efficiency of the pump drive operation and to develop a solution that brings the pump's efficiency as close as possible to its maximum value, which can greatly reduce the pump drive's life cycle cost. The first part of the thesis gives a brief introduction to the details of pump drive operation, presents methods that can be used in the project, and reviews the platforms available for implementing it. The second part presents the components of the project, with a detailed description of each component created. Finally, the results of laboratory tests are presented, compared, and analyzed; the operation of the created system is evaluated and suggestions for future development are given.
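
The efficiency at a working point follows from the standard hydraulic power relation P_hyd = ρ g Q H divided by the shaft power; a minimal sketch, with a hypothetical working point:

```python
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def pump_efficiency(flow_m3s, head_m, shaft_power_w):
    """Efficiency = hydraulic power / shaft power."""
    hydraulic_power = RHO * G * flow_m3s * head_m
    return hydraulic_power / shaft_power_w

# Hypothetical working point: 72 m3/h flow, 28 m head, 8.1 kW shaft power
eta = pump_efficiency(72 / 3600, 28.0, 8100.0)
print(f"pump efficiency at the working point: {eta:.1%}")
```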

Relevance:

20.00%

Publisher:

Abstract:

The objective of this case study is to provide a Finnish solution-provider company with an objective, in-depth analysis of its project-based business, and especially of its project estimation accuracy. A project and customer profitability analysis is conducted as a complementary addition to describe the profitability of the Case Company's core division. The theoretical framework is constructed on project profitability and customer profitability analysis. Project profitability is approached starting from managing projects, continuing to the project pricing process, and concluding with project success. The empirical part of this study describes the Case Company's project portfolio and, by means of quantitative analysis, shows how the characteristics of a project affect its profitability. The findings indicate that the installation methods and technical specifications of a project make a real difference between the portfolio's estimated and actual profitability. The implications for profitability are gathered into a proposed risk assessment tool.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this master's thesis is to develop an algorithm for calculating the cable network of the CHGRES heat and power station. The algorithm covers the important aspects that influence cable network reliability. Moreover, based on the developed algorithm, the optimal solution for modernizing the cable system from an economic and technical point of view was obtained. The condition of the existing cable lines shows that replacement is necessary; otherwise faults would occur, and the company would lose not only money but also its prestige. As a solution, XLPE single-core cables are more profitable than the other cable types considered in this work. Moreover, the dependence of the short-circuit current on the number of 10/110 kV transformers connected in parallel between the main grid and the considered 10 kV busbar is presented, together with how it affects the final decision. Furthermore, the company's losses in the power (capacity) market due to fault situations are presented; these losses are comparable to the investment needed to replace the existing cable system.
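
As an illustration of how the short-circuit current on the 10 kV busbar depends on the number of parallel supply transformers, the sketch below uses simplified IEC 60909-style impedance magnitudes; all ratings are hypothetical and not taken from the thesis.

```python
import math

U_N = 10e3          # busbar nominal voltage, V
S_GRID = 1500e6     # hypothetical short-circuit power of the feeding grid, VA
S_T = 40e6          # hypothetical rating of one 10/110 kV transformer, VA
U_K = 0.105         # transformer short-circuit impedance, p.u.
C = 1.1             # IEC 60909 voltage factor for maximum short-circuit current

z_grid = C * U_N**2 / S_GRID            # grid impedance referred to 10 kV, ohm
z_tr = U_K * U_N**2 / S_T               # impedance of one transformer, ohm

for n in range(1, 5):
    z_total = z_grid + z_tr / n         # n transformers in parallel
    i_k = C * U_N / (math.sqrt(3) * z_total)
    print(f"{n} transformer(s) in parallel: Ik'' = {i_k / 1e3:.1f} kA")
```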

Relevance:

20.00%

Publisher:

Abstract:

Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool for quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters have been estimated, the model can be used for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. Novel ways to perform these tasks, based on the output of MCMC parameter estimation, are developed in this thesis. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing state of the model instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the most recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, combining elements from deterministic and random sampling methods.
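
As an illustration of model-based optimization using MCMC output, the sketch below propagates hypothetical posterior samples through a toy reaction model and selects the control value that maximizes the expected yield under parameter uncertainty; the model and all numbers are stand-ins, not the thesis's methods.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical posterior samples for two rate parameters (as produced by MCMC)
theta_samples = rng.normal([0.8, 0.3], [0.05, 0.03], size=(2000, 2))

def model_yield(control, theta):
    """Toy reaction model: yield as a function of one control variable."""
    k1, k2 = theta[..., 0], theta[..., 1]
    return k1 * control * np.exp(-k2 * control)

# Model-based optimization under uncertainty: maximize the expected yield
controls = np.linspace(0.5, 6.0, 56)
expected = np.array([model_yield(c, theta_samples).mean() for c in controls])
best = controls[np.argmax(expected)]
print(f"control maximizing expected yield: {best:.2f} (expected yield {expected.max():.3f})")
```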