895 results for calibrated cameras
Redesign of the staff magazine Energitrycket and adaptation for copying and publication on the web
Abstract:
This degree project was carried out on commission from AB Borlänge Energi. Its purpose was to redesign the magazine Energitrycket, which informs readers about current events at the company. The work consisted of redesigning the existing magazine and adapting it for copying and for publication on the web. The project also includes a study of how paper and colour affect print results. It shows that maximum contrast and sharp image reproduction are best achieved on woodfree, coated, calendered paper with high whiteness; an uncoated, light yellow paper is, however, recommended for printed matter containing a lot of text. It also shows that every production unit reproduces colours differently. To obtain good colour reproduction and good communication between these units, they must be calibrated and well-functioning ICC profiles must be created.
Abstract:
The motivation for this thesis work is the need to improve equipment reliability and quality of service for railway passengers, together with the requirement for cost-effective and efficient condition-maintenance management in rail transportation. The thesis develops a fusion of several machine vision analysis methods to achieve high performance in the automation of wooden rail track inspection. Condition monitoring in rail transport is done manually by a human operator, relying on inference and assumptions to draw conclusions. Condition monitoring allows maintenance to be scheduled, or other actions to be taken, to avoid the consequences of failure before it occurs. Manual or automated condition monitoring of materials in public transportation fields such as railways, aerial navigation and traffic safety, where safety is of primary importance, requires non-destructive testing (NDT). In general, wooden railway sleeper inspection is performed manually by a human operator who moves along the track and gathers information by visual and sound analysis to detect the presence of cracks. Inspectors working on the lines visually judge the quality of each sleeper. In this project a machine vision system is developed based on this manual visual analysis, using digital cameras and image processing software to perform similar inspections. Manual inspection requires much effort, is at times error prone, and frequent changes in the inspected material make discrimination difficult even for a human operator. The machine vision system developed here classifies the condition of the material by examining individual pixels of images, processing them, and drawing conclusions with the assistance of knowledge bases and extracted features. A pattern recognition approach is developed based on the methodological knowledge of the manual procedure.
The pattern recognition approach in this thesis uses a non-destructive testing method to identify the flaws targeted by manual condition monitoring of sleepers. A test vehicle was designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the captured images of the wooden sleepers provide the raw data for the pattern recognition approach. The data from the NDT method were further processed and appropriate features were extracted. The aim of this data collection is high accuracy and reliable classification results. A key idea is to use an unsupervised classifier, based on the extracted features, to discriminate the condition of wooden sleepers into either good or bad; a self-organising map is used as the classifier. To achieve greater integration, the data collected by the machine vision system were combined through a strategy called fusion, examined at two levels: sensor-level fusion and feature-level fusion. As the goal was to reduce human error in classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared with the actual classification, were satisfactory.
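The self-organising map classifier mentioned above can be sketched in a few lines. This is a minimal illustrative SOM, not the thesis implementation; synthetic two-dimensional feature vectors stand in for the extracted sleeper features, and all parameter values are assumptions.

```python
import numpy as np

def train_som(data, grid=(2, 1), iters=1000, lr0=0.5, sigma0=0.5, seed=0):
    """Train a tiny self-organising map; returns the unit weight vectors."""
    rng = np.random.default_rng(seed)
    units = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    w = rng.normal(size=(len(units), data.shape[1]))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))      # best-matching unit
        lr = lr0 * (1 - t / iters)                       # decaying learning rate
        sigma = sigma0 * (1 - t / iters) + 1e-3          # shrinking neighbourhood
        d2 = ((units - units[bmu]) ** 2).sum(axis=1)     # grid distance to the BMU
        h = np.exp(-d2 / (2 * sigma ** 2))               # neighbourhood function
        w += lr * h[:, None] * (x - w)
    return w

def classify(w, x):
    """Assign a sample to the index of its best-matching unit."""
    return int(np.argmin(((w - x) ** 2).sum(axis=1)))
```

With two map units and well-separated "good" and "bad" feature clusters, each unit specialises on one cluster, so the BMU index acts as the good/bad label.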
Abstract:
The question formulated in this essay is how Clas Ohlson works today with its graphical production flow and whether that way of working leads to the desired result in press. After a dialogue with Clas Ohlson it emerged that they had certain problems with the material that came out of the printing presses: certain colours were not reproduced according to the stated target values. In March 2008 an analysis of the current state of the graphical production flow was carried out at Clas Ohlson to find out how they worked with their production of print materials. The analysis examined their handling of colour spaces, displays, ICC profiles, PDF exports, images and printers. Interviews were also conducted with responsible personnel about how they worked with these different aspects. Deficiencies were found in almost every examined part of the graphical production flow. Wrong colour spaces and CMYK profiles were used to a large extent, the displays were uncalibrated or incorrectly calibrated, wrong PDF presets were used frequently, and it also emerged that six different suppliers were used for the printed materials. Solutions relevant to these problems and defects were then presented as proposed measures.
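The deviation of printed colours from stated target values, as found in the analysis, is conventionally quantified with a colour difference such as CIE76 ΔE*ab between target and measured CIELAB values. The essay does not specify its measure, so the formula and the tolerance below are illustrative assumptions.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two CIELAB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def within_tolerance(target_lab, printed_lab, tol=5.0):
    """Flag a printed patch as acceptable if its deviation stays below a
    hypothetical press tolerance tol (in ΔE units)."""
    return delta_e_cie76(target_lab, printed_lab) <= tol
```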
Abstract:
This thesis project is part of the overall automation of the production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of this promising concentrating solar system. The aims of the project are to design, assemble, calibrate and put into operation an automatic measurement system intended to evaluate the shape of concentrating parabolic reflectors. Based on the requirements of the company's administration and the needs of the real production process, the operating conditions for the laser testing rig were formulated, and the basic concept of using laser radiation was defined. As a first step, the overall design of the whole system was made and its division into parts was defined. After preliminary simulations, the function and operating conditions of all the parts were formulated. Next, the detailed design of all the parts was carried out. Most components were ordered from the respective companies, and some of the mechanical components were made in the company's workshop. All parts of the laser testing rig were assembled and tested. The software that controls the rig was created in LabVIEW, and a special simulator was designed and assembled to tune and test it. When all parts were assembled into the complete system, the laser testing rig was tested, calibrated and tuned. Trial measurements were conducted in the Absolicon workshop, and the rig was installed in the production line at the plant in Sollefteå.
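The geometric principle behind evaluating a parabolic reflector with a laser can be sketched as follows: for an ideal parabola y = x²/(4f), a vertical ray reflected at any point passes through the focus (0, f), so a measured crossing point away from the focus indicates a shape error. This two-dimensional check is purely illustrative and is not the rig's actual software.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d about the unit normal n."""
    return d - 2 * np.dot(d, n) * n

def focus_hit(x, f):
    """Reflect a vertical ray hitting the parabola y = x^2/(4f) at abscissa x
    (x != 0) and return the height at which the reflected ray crosses the
    optical axis; for a perfect parabola this equals f."""
    y = x ** 2 / (4 * f)                                 # surface height
    slope = x / (2 * f)                                  # dy/dx at the hit point
    n = np.array([-slope, 1.0]) / np.hypot(slope, 1.0)   # upward unit normal
    r = reflect(np.array([0.0, -1.0]), n)                # reflected direction
    t = -x / r[0]                                        # parameter where ray crosses x = 0
    return y + t * r[1]
```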
Abstract:
This thesis project is part of research conducted in the solar industry. ABSOLICON Solar Concentrator AB has invented and started production of the promising concentrating solar system Absolicon X10. The aims of the project are to design, assemble, calibrate and put into operation an automatic measurement system intended to evaluate the distribution of solar radiation density along the focal line of concentrating parabolic reflectors, and to measure radiation from an artificial light source serving as a calibration and testing tool. Based on the requirements of the company's administration and the needs of designing the concentrating reflectors, the operating conditions for the Sun-Walker were formulated. As a first step, the overall design of the whole system was made and its division into parts was specified. After preliminary simulations, the functions and operating conditions of all the parts were formulated. Next, the detailed design of all the parts was made. Most components were ordered from the respective companies, and some of the mechanical components were made in the company's workshop. All parts of the Sun-Walker were assembled and tested. The software that controls the Sun-Walker and conducts the measurements of solar irradiation was created in LabVIEW; a special simulator was designed and assembled to tune and test it. When all parts were assembled into the complete system, the Sun-Walker was tested, calibrated and tuned.
Abstract:
A literature survey and a theoretical study were performed to characterize residential chimney conditions for flue gas flow measurements. The focus is on Pitot-static probes, to give a sufficient basis for the development and calibration of a velocity-pressure averaging probe suitable for continuous dynamic (i.e. non-steady-state) measurement of the low flow velocities present in residential chimneys. The flow conditions do not meet the requirements set in ISO 10780 and ISO 3966 for Pitot-static probe measurements, so those methods and their stated uncertainties are not valid. The flow velocities in residential chimneys from a heating boiler under normal operating conditions are shown to be so low that in some conditions they void the assumption of inviscid flow that justifies the use of the quadratic Bernoulli equation. A non-linear, Reynolds-number-dependent calibration coefficient that corrects for viscous effects is needed to avoid significant measurement errors. The wide range of flow velocity during normal boiler operation also means that the flow changes from laminar, across the laminar-to-turbulent transition region, to fully turbulent, resulting in significant changes of the velocity profile during dynamic measurements. In addition, the short duct lengths (and changes of flow direction and duct shape) used in practice mean that measurements are made in the hydrodynamic entrance region, where the velocity profiles are most likely neither symmetrical nor fully developed. A measurement method insensitive to velocity profile changes is thus needed if the velocity profile cannot otherwise be determined or predicted with reasonable accuracy over the whole measurement range.
Because of particulate matter and condensing fluids in the flue gas, it is beneficial if the probe can be constructed so that it is easily taken out for cleaning, and equipped with a locking mechanism that always ensures the same alignment in the duct without affecting the calibration. The literature implies that there may be a significant time lag when measuring low flow rates, due to viscous effects in the internal impact-pressure passages of Pitot probes; its significance in the discussed application should be studied experimentally. The differential pressures measured with Pitot-static probes in residential chimney flows are so low that the calibration and stated uncertainties of commercially available pressure transducers are not adequate. The pressure transducers should be calibrated specifically for the application, preferably together with the probe, and the significance of all error sources should be investigated carefully. Care should also be taken with the temperature measurement, e.g. by averaging several sensors, as significant temperature gradients may be present in flue gas ducts.
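The Reynolds-number-dependent calibration described above leads to an implicit equation, since the Reynolds number itself depends on the unknown velocity; a fixed-point iteration resolves it. The correction curve C(Re) and all parameter values below are hypothetical, for illustration only.

```python
import math

def pitot_velocity(dp, rho=1.2, mu=1.8e-5, d=0.008, a=30.0, iters=50):
    """Velocity from Pitot differential pressure dp (Pa) with a viscous
    correction: v = C(Re) * sqrt(2*dp/rho), Re = rho*v*d/mu.
    C(Re) = 1/sqrt(1 + a/Re) is a hypothetical calibration curve; the
    fixed point is found by simple iteration from the ideal Bernoulli guess."""
    v = math.sqrt(2 * dp / rho)              # ideal (inviscid) first guess
    for _ in range(iters):
        re = max(rho * v * d / mu, 1e-9)     # Reynolds number at current estimate
        c = 1.0 / math.sqrt(1.0 + a / re)    # viscous correction, -> 1 at high Re
        v = c * math.sqrt(2 * dp / rho)
    return v
```

At high differential pressure the coefficient approaches one and the quadratic Bernoulli result is recovered; at the low velocities discussed in the abstract the correction becomes substantial.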
Abstract:
Renewable energy production is a basic supplement to meet rapidly increasing global energy demand and rising energy prices, and to balance fluctuations in supply from non-renewable sources at electrical grid hubs. European energy traders, government and private energy providers, and other stakeholders have recently become major beneficiaries and customers of hydropower simulation solutions. The relationship between rainfall-runoff model outputs and the energy production of hydropower plants has not been clearly studied. In this research, the association of rainfall, catchment characteristics, river network and runoff with the energy production of a particular hydropower station is examined. The essence of the study is to establish the correspondence between runoff extracted from a calibrated catchment and the energy production of a hydropower plant located at the catchment outlet; to employ a technique for converting runoff to energy based on statistical and graphical trend analysis of the two; and to provide an environment for energy forecasting. For rainfall-runoff model setup and calibration the MIKE 11 NAM model is applied, while the MIKE 11 SO model is used to track, adopt and set a control strategy at the hydropower location for the runoff-energy correlation. The model is tested at two selected micro run-of-river hydropower plants in southern Germany. Two consecutive calibrations are performed to test the model: one for the rainfall-runoff model and one for the energy simulation. Calibration results and supporting verification plots for the two case studies indicate that simulated discharge and energy production are comparable with the measured discharge and energy production, respectively.
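While the study relates runoff to energy by statistical and graphical trend analysis, the underlying physical conversion from discharge to power is the standard relation P = η·ρ·g·Q·H. The sketch below uses that relation with hypothetical head and efficiency values; it is not the study's conversion technique.

```python
def hydro_power_kw(q_m3s, head_m, efficiency=0.85, rho=1000.0, g=9.81):
    """Electric power (kW) from discharge Q (m^3/s) and head H (m):
    P = eta * rho * g * Q * H. Efficiency and head are hypothetical."""
    return efficiency * rho * g * q_m3s * head_m / 1000.0

def energy_series_kwh(discharges, head_m, dt_hours=1.0, **kw):
    """Convert an hourly simulated-discharge series into produced energy (kWh)."""
    return [hydro_power_kw(q, head_m, **kw) * dt_hours for q in discharges]
```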
Abstract:
Demands are among the most uncertain parameters in a water distribution network model, and a good calibration of the model demands leads to better solutions whenever the model is used. A demand pattern calibration methodology that uses a priori information has been developed for calibrating the behaviour of demand groups. Generally, the behaviours of demands in cities are mixed all over the network, in contrast to smaller villages where demands are clearly sectorized into residential neighbourhoods, commercial zones and industrial sectors. Demand pattern calibration is ultimately used for leakage detection and isolation: detecting a leakage in a pattern that covers nodes spread all over the network makes isolation unfeasible. Besides, demands in the same zone may be similar due to the common pressure of the area rather than to the type of contract. For this reason, the demand pattern calibration methodology is applied to a real network with synthetic non-geographic demands in order to calibrate geographic demand patterns. The results are compared with previous work in which the calibrated patterns were also non-geographic.
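A minimal sketch of demand pattern calibration with a priori information: if each flow meter sees a known sum of base demands per demand group, the group pattern multipliers at each time step can be estimated from the measured flows by least squares. This is an illustrative formulation under those assumptions, not the methodology of the paper.

```python
import numpy as np

def calibrate_patterns(meter_flows, base_by_meter_group):
    """Least-squares estimate of group pattern multipliers from meter flows.

    meter_flows:         (n_meters, n_steps) measured flows
    base_by_meter_group: (n_meters, n_groups) summed base demand of each
                         group seen by each meter (the a priori information)
    Returns a (n_groups, n_steps) array of pattern multipliers."""
    p, *_ = np.linalg.lstsq(base_by_meter_group, meter_flows, rcond=None)
    return p
```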
Abstract:
Jakarta is vulnerable to flooding, mainly caused by prolonged and heavy rainfall, so robust hydrological modeling is called for. Good-quality spatial precipitation data are therefore desired so that a good hydrological model can be achieved. Two sources of rainfall data are available: satellite products and gauge station observations. At-site gauge rainfall is considered reliable and accurate, but the limited number of stations makes spatial interpolation unappealing. Gridded satellite rainfall, on the other hand, nowadays offers high spatial resolution and improved accuracy, but is still less accurate than gauge data. To achieve a better precipitation data set, this study proposes the cokriging method, a blending algorithm, to yield a blended satellite-gauge gridded rainfall at approximately 10 km resolution. The Global Satellite Mapping of Precipitation (GSMaP, 0.1°×0.1°) product and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by cross-validation. The blended product is then used to re-calibrate the hydrological model. Several scenarios are simulated with hydrological models calibrated with gauge observations alone and with the blended product, and the performance of the two calibrated models is assessed and compared on the basis of simulated and observed runoff.
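Cokriging requires fitted variograms and cross-variograms, which is too much machinery for a short sketch. As a much simpler illustration of satellite-gauge blending, the code below adds inverse-distance-weighted gauge-minus-satellite residuals to the satellite grid; it is a stand-in for, not an implementation of, the cokriging used in the study.

```python
import numpy as np

def blend_rainfall(sat_grid, grid_xy, gauge_xy, gauge_val, sat_at_gauge, power=2.0):
    """Residual blending: interpolate gauge-minus-satellite residuals by
    inverse distance weighting and add them to the (flattened) satellite grid.

    sat_grid: (n_cells,) satellite rainfall; grid_xy: (n_cells, 2) coordinates
    gauge_xy: (n_gauges, 2); gauge_val / sat_at_gauge: (n_gauges,)"""
    resid = gauge_val - sat_at_gauge                  # satellite bias at each gauge
    out = sat_grid.astype(float).copy()
    for k, (x, y) in enumerate(grid_xy):
        d = np.hypot(gauge_xy[:, 0] - x, gauge_xy[:, 1] - y)
        if np.any(d < 1e-12):
            out[k] = gauge_val[np.argmin(d)]          # honour the gauge exactly
        else:
            w = 1.0 / d ** power
            out[k] += np.sum(w * resid) / np.sum(w)   # weighted residual correction
    return out
```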
Abstract:
A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model, based on grey numbers, is presented. With the grey numbers technique the uncertainty is characterized by an interval: once the parameters of the rainfall-runoff model have been defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers, whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of the observed discharge values while remaining as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. These times can, however, be significantly reduced by a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual grey rainfall-runoff model is calibrated, and the uncertainty bands obtained after calibration and after validation are compared with those obtained by a well-established approach for characterizing uncertainty, GLUE. The comparison shows that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
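Grey numbers and their propagation can be sketched with interval arithmetic. The linear reservoir below is a toy stand-in for the paper's conceptual rainfall-runoff model: a grey recession parameter yields a discharge band that, by inclusion monotonicity of interval arithmetic, contains any crisp simulation whose parameter lies inside the grey interval.

```python
class Grey:
    """A grey number represented as a closed interval [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = min(lo, hi), max(lo, hi)
    def __add__(self, o):
        o = o if isinstance(o, Grey) else Grey(o, o)
        return Grey(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        o = o if isinstance(o, Grey) else Grey(o, o)
        return Grey(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        o = o if isinstance(o, Grey) else Grey(o, o)
        ps = [a * b for a in (self.lo, self.hi) for b in (o.lo, o.hi)]
        return Grey(min(ps), max(ps))
    def contains(self, x):
        return self.lo <= x <= self.hi

def grey_linear_reservoir(rain, k, s0=0.0):
    """Toy grey model: Q_t = k * S_t with grey recession k,
    storage update S_{t+1} = S_t + P_t - Q_t. Returns the grey discharge band."""
    s, band = Grey(s0, s0), []
    for p in rain:
        q = k * s
        band.append(q)
        s = s + Grey(p, p) - q
    return band
```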
Abstract:
In recent years extreme hydrometeorological phenomena have increased in number and intensity, affecting the inhabitants of various regions. An example is the central basins of the Gulf of Mexico (CBGM), 55.2% of which have been affected by floods, especially in the state of Veracruz (1999-2013), leaving economic, social and environmental losses. Mexico currently lacks sufficient hydrological studies for measuring river volumes, so it is convenient to create a hydrological model (HM) suited to the quality and quantity of the available geographic and climatic information that is both reliable and affordable. This research therefore compares a semi-distributed hydrological model (SHM) and a global (lumped) hydrological model (GHM) with respect to runoff volumes and the prediction of flood areas. Extreme hydrometeorological phenomena in the CBGM were analyzed by modeling with the Hydrologic Modeling System (HEC-HMS), an SHM, and the Modèle Hydrologique Simplifié à l'Extrême (MOHYSE), a GHM, to evaluate the results and determine which model is suitable for tropical conditions, in order to propose public policies for integrated basin management and flood prevention. The temporal and spatial framework of the analyzed basins was determined according to hurricanes and floods. The SHM and GHM models were developed, calibrated and validated, and their results were compared to identify their sensitivity to the real system. It was concluded that both models conform to the tropical conditions of the CBGM, with MOHYSE approximating the real system more closely. It is worth mentioning that in Mexico there is not enough information and there are no records of MOHYSE being used, so it can be a useful tool for determining runoff volumes. Finally, climate change scenarios were generated with the SHM and the GHM to develop risk studies, creating a risk map for urban, agro-hydrological and territorial planning.
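The abstract does not name its calibration criterion; the Nash-Sutcliffe efficiency (NSE) is the standard measure for comparing simulated and observed runoff in such model comparisons, so a sketch of it is given here as an assumed, representative metric.

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var
```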
Abstract:
This article studies the impact of longevity and taxation on life-cycle decisions and long-run income. Individuals optimally allocate their total lifetime between education, work and retirement. They also decide at each moment how much of their income to save or consume and, after entering the labor market, how to divide their time between labor and leisure. The model incorporates experience-earnings profiles and a return-to-education function that follows evidence from the labor literature. In this setup, increases in longevity raise the investment in education (time in school) and retirement. The model is calibrated to the U.S. and is able to reproduce observed schooling levels and the increase in retirement, as the evidence shows. Simulations show that a country identical to the U.S. but with 20% lower longevity would be 25% poorer. In this economy, labor taxes have a strong impact on per capita income, as they decrease labor effort, time at school and the retirement age, in addition to the general-equilibrium impact on physical capital. We conclude that life-cycle effects are relevant in analyzing the aggregate outcome of taxation.
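The mechanism by which longevity raises schooling can be illustrated with a drastically simplified stand-in for the model (Mincer-style returns e^{θs}, no leisure, saving or taxes; θ hypothetical): maximizing lifetime income e^{θs}·(T − s) over schooling s gives the first-order condition θ(T − s) = 1, hence s* = T − 1/θ, so a shorter horizon T lowers both schooling and income. The quantitative magnitudes of this toy differ from the calibrated model's 25% figure.

```python
import math

def optimal_schooling(T, theta=0.10):
    """Maximize e^{theta*s} * (T - s): the FOC gives s* = T - 1/theta (>= 0)."""
    return max(T - 1.0 / theta, 0.0)

def lifetime_income(T, theta=0.10):
    """Lifetime income at the optimal schooling choice."""
    s = optimal_schooling(T, theta)
    return math.exp(theta * s) * (T - s)
```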
Abstract:
The aim of this thesis is to analyze issues concerning the coordination between monetary and fiscal policy in Brazil after the adoption of the inflation-targeting regime. An inflation-targeting model for a small open economy is used, extended with a block of equations describing the dynamics of the fiscal variables. Following the concepts of Leeper (1991), both entities, the Central Bank and the National Treasury, may act actively or passively, and it is this strategic behaviour that determines the efficiency of monetary policy. The parameters that calibrate the model were estimated, and simulations were run for some of the shocks that have hit the Brazilian economy in recent years. The results show that in arrangements where the fiscal authority responds to increases in public debt with changes in the primary surplus, the adjustment path of the variables after shocks tends, in most cases, to be less volatile, allowing the Central Bank to act more efficiently. In these arrangements, the Central Bank does not need to take on functions that properly belong to the Treasury. Variations in the behaviour of the Central Bank and the National Treasury under different compositions of the public debt are also analyzed. The results show that the structure of public debt will be beneficial, or not, to the conduct of monetary and fiscal policy depending on the type of shock faced. The first, introductory chapter contextualizes the Brazilian inflation-targeting regime and briefly describes the evolution of the Brazilian economy since its adoption. The second chapter analyzes the theoretical foundations of the inflation-targeting regime, its origin and main components; it then presents the fiscal policy rules required for price stability and the problem of fiscal dominance in the context of the Brazilian economy.
The third chapter presents the incorporation of the fiscal block of equations into the open-economy inflation-targeting model proposed by Svensson (2000), and the estimation and calibration of its parameters for the Brazilian economy. The fourth chapter discusses the different forms of coordination between the monetary and fiscal authorities and the optimal behaviour of the Central Bank. The fifth chapter builds on the most efficient form of coordination obtained in the previous chapter to analyze changes in the behaviour of the monetary and fiscal authorities under different structures of maturities and indexation of the public debt, which affect its elasticities, interest rates, inflation and the exchange rate.
Abstract:
It is a consensus in antitrust analysis that mergers between firms with significant market shares must be scrutinized before approval, owing to the harmful effects they may have on competition in the industry. Competition is always desirable because it promotes higher levels of economic welfare. In light of the economic investigations carried out by competition authorities, this work analyzes measurements from the simulation of unilateral effects of horizontal mergers. The evaluations test the PC-AIDS (Proportionally Calibrated AIDS) model of Epstein and Rubinfeld (2002). Among the conclusions drawn from the use of the model: (i) in markets with low economic concentration, the model, evaluated over an interval in the neighbourhood of the estimated own-price elasticity, yields robust measurements; and (ii) in markets with high economic concentration, closer attention must be paid to the correspondence between the calibrated and estimated values of the own-price elasticities, so that the unilateral effects of the merger are neither under- nor overestimated. This result is evaluated in the Nestlé/Garoto case.
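Merger simulation with PC-AIDS requires calibrated demand elasticities, which are beyond a short sketch. As a simpler companion screen commonly used in antitrust practice alongside such simulations, the pre- and post-merger Herfindahl-Hirschman index (HHI) can be computed as below; the shares are illustrative, not the Nestlé/Garoto data.

```python
def hhi(shares):
    """Herfindahl-Hirschman index from market shares given in percent."""
    return sum(s ** 2 for s in shares)

def merger_screen(shares, i, j):
    """Pre-merger HHI, post-merger HHI and the delta for a merger of firms
    i and j (delta equals 2 * s_i * s_j for a simple share combination)."""
    pre = hhi(shares)
    merged = [s for k, s in enumerate(shares) if k not in (i, j)]
    merged.append(shares[i] + shares[j])
    post = hhi(merged)
    return pre, post, post - pre
```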
Abstract:
This thesis consists of three articles presenting extensions and applications of Real Options Theory, all of interest to economic policymakers in Brazil. The first provides an original analysis of bioprospecting, the exploration of biological diversity for economic purposes. Two alternative structures for the design of the concession mechanism, aimed at the sustainable use of Brazilian biodiversity, are suggested: (i) a model of R&D projects with uncertain maturity, in which the intensity of the Poisson process governing the time to maturity depends explicitly on the level of biodiversity at the conceded site; (ii) a Principal-Agent model, in which the State delegates the exercise of the investment option to the biotechnology research firm. The second article advances the analogy between put options and import quotas. The parameters relevant for pricing the licenses are now obtained endogenously, from the interaction between the importing firm and domestic producers. Finally, the third article offers a pioneering analysis of the secondary market for precatório bonds (court-ordered government debt) in Brazil. A valuation model for such bonds is built and proposed, based on the existing institutional framework on the subject at the central government as well as at state and municipal levels.
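The put-option analogy for import quotas in the second article can be illustrated with the standard Black-Scholes European put value. The parameters below are hypothetical inputs; the thesis obtains the relevant pricing parameters endogenously rather than taking them as given.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_put(s, k, r, sigma, t):
    """Black-Scholes value of a European put with spot s, strike k,
    risk-free rate r, volatility sigma and maturity t (years)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return k * math.exp(-r * t) * norm_cdf(-d2) - s * norm_cdf(-d1)
```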