951 results for Building demand estimation model


Relevance:

100.00%

Publisher:

Abstract:

This study analyzes the Brazilian pulp sector and seeks to answer two main questions: what the scope of the relevant market is, and whether the firms operating in the sector hold market power. The product dimension of the relevant market was defined on the basis of qualitative data. Because the data needed for a more refined analysis were unavailable, the choice fell on eucalyptus short-fibre pulp, the sector's most important product, both because of Brazil's technological position and because of its weight in the country's exports. For the geographic dimension, the procedure followed Forni (2004), who uses unit root tests to delineate the market. With the available data, the market for this product can be considered international, not only because of the test results but also because of the way this market operates. With the product and geographic market defined, a market power test was carried out, since Aracruz is the world leader in this niche. The test was based on the residual demand approach described by Mayo, Kaserman and Kahai (1996) and estimated following Motta (2004). The conclusion is that, although Aracruz holds a high market share in the sector, it does not have market power.
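
The geographic-dimension step lends itself to a short illustration. Below is a minimal sketch, under assumed data, of a Forni (2004)-style delineation test: if the log price ratio between two candidate regions is stationary (the unit root is rejected), the regions are treated as parts of a single relevant market. The file name, column names and regions are hypothetical, not data from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

# hypothetical monthly pulp price series for two candidate regions
prices = pd.read_csv("pulp_prices.csv", parse_dates=["month"], index_col="month")
log_ratio = np.log(prices["price_brazil"] / prices["price_europe"])

stat, pvalue, *_ = adfuller(log_ratio.dropna(), autolag="AIC")
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
if pvalue < 0.05:
    print("Stationary price ratio: evidence the two regions form one relevant market.")
else:
    print("Unit root not rejected: prices may drift apart; markets may be separate.")
```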

Relevance:

100.00%

Publisher:

Abstract:

O objetivo desta dissertação foi estimar a demanda de tratores agrícolas para o mercado brasileiro no triênio 2016-2018, utilizando-se para isto de técnicas de econometria de séries temporais, neste caso, modelos univariados da classe ARIMA e SARIMA e ou multivariados SARIMAX. Justifica-se esta pesquisa quando se observa a indústria de máquinas agrícolas no Brasil, dados os ciclos econômicos e outros fatores exógenos aos fundamentos econômicos da demanda, onde esta enfrenta muitos desafios. Dentre estes, a estimação de demanda se destaca, pois exerce forte impacto, por exemplo, no planejamento e custo de produção de curto e médio prazo, níveis de inventários, na relação com fornecedores de materiais e de mão de obra local, e por consequência na geração de valor para o acionista. Durante a fase de revisão bibliográfica foram encontrados vários trabalhos científicos que abordam o agronegócio e suas diversas áreas de atuação, porém, não foram encontrados trabalhos científicos publicados no Brasil que abordassem a previsão da demanda de tratores agrícolas no Brasil, o que serviu de motivação para agregar conhecimento à academia e valor ao mercado através deste. Concluiu-se, após testes realizados com diversos modelos que estão dispostos no texto e apêndices, que o modelo univariado SARIMA (15,1,1) (1,1,1) cumpriu as premissas estabelecidas nos objetivos específicos para escolha do modelo que melhor se ajusta aos dados, e foi escolhido então, como o modelo para estimação da demanda de tratores agrícolas no Brasil. Os resultados desta pesquisa apontam para uma demanda de tratores agrícolas no Brasil oscilando entre 46.000 e 49.000 unidades ano entre os anos de 2016 e 2018.
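
For reference, the kind of model fit described above can be sketched with statsmodels. The specification below deliberately uses a small SARIMA(1,1,1)(1,1,1)12 order rather than the dissertation's SARIMA(15,1,1)(1,1,1), and the file and column names are hypothetical.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# hypothetical monthly tractor sales series
sales = pd.read_csv("tractor_sales.csv", parse_dates=["month"], index_col="month")["units"]

model = SARIMAX(sales, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12),
                enforce_stationarity=False, enforce_invertibility=False)
fit = model.fit(disp=False)
print(fit.summary())

# forecast the next 36 months (e.g. 2016-2018 when the sample ends in 2015)
forecast = fit.get_forecast(steps=36).predicted_mean
print(forecast.groupby(forecast.index.year).sum())   # rough annual totals
```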

Relevance:

100.00%

Publisher:

Abstract:

Studies have been carried out on heat transfer in a packed bed of glass beads percolated by air at moderate flow rates. A rigorous statistical analysis of the experimental data was carried out, and the traditional two-parameter model was used to represent them. The parameters estimated by the least squares method were the effective radial thermal conductivity, k, and the wall coefficient, h. The results were evaluated with respect to the bed inlet boundary temperature, T-o, the number of terms of the solution series, and the number of experimental points used in the estimation. The results indicated that a small difference in T-o was sufficient to cause large changes in the estimated parameters and in the statistical properties of the model. The use of replicates at points of high parametric information improved the results, although residual analysis led to the rejection of this alternative. To evaluate the non-linearity of the model, the Bates and Watts (1988) curvature measures and the Box (1971) biases of the coefficients were calculated. The intrinsic curvatures of the model (IN) tend to be concentrated at low bed heights, while those due to parameter effects (PE) are spread over the whole bed. The Box biases indicated both parameters as responsible for the PE curvatures, h being somewhat more problematic. (C) 2000 Elsevier B.V. Ltd. All rights reserved.
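
To make the estimation step concrete, the sketch below fits k and h by nonlinear least squares. It keeps only the first term of the classical series solution of the two-parameter pseudo-homogeneous model (the paper retains more terms and a rigorous statistical treatment), and the bed geometry, air flux and measurement values are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import brentq, curve_fit
from scipy.special import j0, j1

R = 0.05       # bed radius [m] (assumed)
G_CP = 1.2e3   # superficial mass flux times air heat capacity, G*cp [W/(m^2 K)] (assumed)

def first_eigenvalue(bi):
    """First root of z*J1(z) = Bi*J0(z), which lies below the first zero of J0."""
    return brentq(lambda z: z * j1(z) - bi * j0(z), 1e-6, 2.404)

def theta_model(rz, k_r, h_w):
    """One-term dimensionless temperature (T - Tw)/(T0 - Tw) at radius r, height z."""
    r, z = rz
    bi = h_w * R / k_r
    lam = first_eigenvalue(bi)
    c1 = (2.0 / lam) * j1(lam) / (j0(lam) ** 2 + j1(lam) ** 2)
    return c1 * j0(lam * r / R) * np.exp(-(lam ** 2) * k_r * z / (G_CP * R ** 2))

# hypothetical measurements: radial position [m], bed height [m], dimensionless temperature
r_obs = np.array([0.00, 0.02, 0.04, 0.00, 0.02, 0.04])
z_obs = np.array([0.10, 0.10, 0.10, 0.20, 0.20, 0.20])
theta_obs = np.array([0.82, 0.74, 0.55, 0.68, 0.60, 0.43])

(k_r_hat, h_w_hat), _ = curve_fit(theta_model, (r_obs, z_obs), theta_obs,
                                  p0=[0.3, 50.0], bounds=([0.05, 1.0], [5.0, 500.0]))
print(f"k_r = {k_r_hat:.3f} W/(m K), h_w = {h_w_hat:.1f} W/(m^2 K)")
```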

Relevance:

100.00%

Publisher:

Abstract:

This is a study evaluating the HIV/AIDS Program at a referral hospital from the users' perspective. The general objective of the research was to describe how users of the inpatient and outpatient units perceive the HIV/AIDS program at a university referral hospital in the city of Belém, Pará. The specific objectives were to present the users' suggestions for improving the program's activities and to provide the minimum tools for building an evaluation model for the program. The theoretical framework covered the panorama of the HIV/AIDS epidemic, health policies in Brazil and AIDS, and the evaluation of health programs and services. The method was a qualitative, descriptive case study, using a semi-structured interview guide on the topic. The research was carried out at the HUJBB, specifically in the HIV/AIDS program services, where users of the SAE outpatient service and patients admitted to the infectious and parasitic diseases ward were interviewed. Through data analysis, the interviews were divided into four categories: characterization of users' access to the service; users' motivation related to service quality; difficulties with the service's activities and guidance; and expectations for improving the program. The results clarify the program's limitations and strengths; users' dissatisfaction with access to the service; both a high degree of motivation and its absence; the main difficulties users face in adhering to the program's guidance and activities, such as fear of facing death, lack of resources to reach the service, the prejudice experienced by people living with HIV/AIDS, and sexual promiscuity; and users' expectations for the program, such as faster care and better infrastructure. The study concludes with improvements that could contribute to users' quality of life, support a sufficient supply of professionals to staff the program, and offer complete treatment to the people living with HIV/AIDS served by the HUJBB program.

Relevance:

100.00%

Publisher:

Abstract:

The present study carried out an analysis of rural landscape changes. In particular, it focuses on understanding the driving forces acting on the rural built environment, using a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for conscious decision making in land planning. A literature review reveals a general lack of studies dealing with the modelling of the rural built environment, and hence a theoretical modelling approach for this purpose is needed. Advances in technology and modern building construction and agriculture have gradually changed the rural built environment. In addition, urbanization has determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main types of transformation dynamics affecting the rural built environment can be observed: the conversion of rural buildings and the increase in the number of buildings. The specific aim of the present study is to propose a methodology for developing a spatial model that allows the identification of the driving forces that acted on building allocation. Indeed, one of the most concerning dynamics nowadays is the irrational expansion of building sprawl across the landscape. The proposed methodology consists of several conceptual steps covering different aspects of the development of a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the most suitable algorithm to adopt given the statistical theory and methods used, the calibration process, and the evaluation of the model. Different combinations of factors in various parts of the territory generated conditions that were more or less favourable for building allocation, and the existence of buildings is the evidence of such an optimum; conversely, the absence of buildings expresses a combination of agents that is not suitable for building allocation. The presence or absence of buildings can therefore be adopted as an indicator of these driving conditions, since it expresses the action of driving forces in the land-suitability sorting process. The existence of correlation between site selection and hypothetical driving forces, evaluated by means of modelling techniques, provides evidence of which driving forces are involved in the allocation dynamic and an insight into their level of influence on the process. GIS spatial analysis tools allow the concepts of presence and absence to be associated with point features, generating a point process. The presence or absence of buildings at given locations expresses the interaction of these driving factors: presence points represent the locations of real existing buildings, while absence points represent locations where buildings do not exist and are therefore generated by a stochastic mechanism. Possible driving forces are selected, and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for explanatory variable analysis and for the identification of the key driving variables behind the site selection process for new building allocation.
The model developed by following this methodology is applied to a case study to test the validity of the methodology. The study area used for the test is the New District of Imola, characterized by a prevailing agricultural production vocation and by intensive transformation dynamics. The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The model was calibrated with spatial data on the peri-urban and rural parts of the study area for the 1975-2005 period by means of a generalized linear model. The output of the fitted model is a continuous grid surface whose cells take values between 0 and 1, representing the probability of building occurrence across the rural and peri-urban area. The response variable thus captures the changes in the rural built environment that occurred in this time interval and is related to the selected explanatory variables through a generalized linear model with logistic regression. Comparing the probability map obtained from the model with the actual distribution of rural buildings in 2005 allows the interpretative capability of the model to be evaluated. The proposed model can also be applied to interpret trends in other study areas and over different time intervals, depending on data availability. The use of suitable data in terms of time, information content and spatial resolution, and the costs related to data acquisition, pre-processing and survey, are among the most critical aspects of model implementation. Future in-depth studies could use the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area. Predicting future scenarios requires assuming that the driving forces do not change and that their levels of influence within the model remain close to those assessed for the calibration time interval.
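
A minimal sketch of the calibration step, under assumed column names and files, is given below: a binomial GLM with a logistic link relates sampled presence/pseudo-absence points to candidate driving-force covariates extracted in a GIS, and the fitted model is then evaluated on a prediction grid to produce the 0-1 probability surface. The predictors shown are hypothetical stand-ins for the study's geomorphologic, socio-economic, structural and infrastructural variables.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# one row per sampled point: 1 = building present, 0 = pseudo-absence
points = pd.read_csv("sampled_points.csv")

model = smf.glm(
    "presence ~ slope + dist_road + dist_town + parcel_size",
    data=points,
    family=sm.families.Binomial(),
).fit()
print(model.summary())

# probability of building allocation for every cell of a prediction grid
grid = pd.read_csv("prediction_grid.csv")   # same covariates, one row per cell
grid["p_building"] = model.predict(grid)
grid.to_csv("building_probability_surface.csv", index=False)
```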

Relevance:

100.00%

Publisher:

Abstract:

This paper examines the accuracy of software-based on-line energy estimation techniques. It evaluates today's most widespread energy estimation model in order to investigate whether the current methodology of pure software-based energy estimation running on a sensor node itself can indeed reliably and accurately determine the node's energy consumption, independent of the particular node instance, the traffic load the node is exposed to, or the MAC protocol the node is running. The paper enhances today's widely used energy estimation model by integrating radio transceiver switches into the model, and proposes a methodology to find the optimal estimation model parameters. Statistical validation with experimental data shows that the proposed model enhancement and parameter calibration methodology significantly increase estimation accuracy.
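
A hedged sketch of what such a calibration can look like in practice is shown below: energy over an accounting interval is modelled as a linear combination of the time spent in each node state plus a per-event cost for radio transceiver switches, with coefficients fitted by least squares against measured consumption. The state names, file name and numbers are assumptions, not the paper's setup.

```python
import numpy as np
import pandas as pd

runs = pd.read_csv("calibration_runs.csv")
# columns: seconds spent in each state, number of radio switches, measured energy [J]
features = ["t_cpu_active", "t_cpu_sleep", "t_rx", "t_tx", "n_radio_switches"]
A = runs[features].to_numpy()
e_measured = runs["energy_joule"].to_numpy()

# per-state power draws [W] and per-switch energy [J], obtained in one least-squares fit
coeffs, *_ = np.linalg.lstsq(A, e_measured, rcond=None)
for name, c in zip(features, coeffs):
    print(f"{name}: {c:.4g}")

# on-line estimate for a new accounting interval (hypothetical counter values)
interval = np.array([1.2, 58.8, 0.9, 0.3, 42.0])
print("estimated energy [J]:", float(interval @ coeffs))
```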

Relevance:

100.00%

Publisher:

Abstract:

Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models due to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether or not latent class regression models adhere to standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov Chain Monte Carlo estimation procedure. Unlike standard maximum likelihood implementations for latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any functions of parameters. It is this convenience that allows us to provide the diagnostic methods that we introduce. As a motivating example we present an analysis focusing on the association between depression and socioeconomic status, using data from the Epidemiologic Catchment Area study. We consider a latent class regression analysis investigating the association between depression and socioeconomic status measures, where the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid due to the violation of model assumptions. The violation of these assumptions is clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
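
As an illustration of how MCMC output can support assumption checking, the sketch below implements one possible diagnostic (a hypothetical stand-in, not the paper's graphical method): a posterior-predictive comparison of observed pairwise log odds ratios between binary indicators with those implied by data replicated from the fitted latent class model; systematic discrepancies point to violations of conditional independence. The array shapes and names are assumptions about an existing MCMC run.

```python
import numpy as np

def pairwise_log_odds_ratio(y, j, k):
    """Log odds ratio between binary indicators j and k (0.5 continuity correction)."""
    a = np.sum((y[:, j] == 1) & (y[:, k] == 1)) + 0.5
    b = np.sum((y[:, j] == 1) & (y[:, k] == 0)) + 0.5
    c = np.sum((y[:, j] == 0) & (y[:, k] == 1)) + 0.5
    d = np.sum((y[:, j] == 0) & (y[:, k] == 0)) + 0.5
    return np.log(a * d / (b * c))

def conditional_independence_check(y, pi_draws, p_draws, j, k, seed=0):
    """Posterior-predictive p-value for the (j, k) pairwise association.

    y        : observed N x J binary indicator matrix
    pi_draws : S x K posterior draws of class prevalences
    p_draws  : S x K x J posterior draws of class-conditional item probabilities
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    replicated = []
    for pi_s, p_s in zip(pi_draws, p_draws):
        classes = rng.choice(len(pi_s), size=y.shape[0], p=pi_s)
        y_rep = rng.binomial(1, p_s[classes, :])       # replicated N x J data set
        replicated.append(pairwise_log_odds_ratio(y_rep, j, k))
    observed = pairwise_log_odds_ratio(y, j, k)
    replicated = np.asarray(replicated)
    return observed, replicated, float(np.mean(replicated >= observed))
```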

Relevance:

100.00%

Publisher:

Abstract:

The demand for power generation from non-renewable resources, and the associated production costs, are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to minimize this increase. To date, the utilization of solar energy has been concentrated mainly on heating applications. Using solar energy for cooling systems in buildings would contribute greatly to the goal of minimizing non-renewable energy use. The approaches of solar heating system research conducted by institutions such as the University of Wisconsin-Madison, and the building heat flow model research conducted by Oklahoma State University, can be used to develop and optimize a solar cooling building system. This research uses these two approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a number of litmus tests to verify its integrity. The tests were conducted on various building cooling system data sets from similar applications around the world, and the output obtained from the software agreed with the established experimental results for those data sets. Software developed in other research efforts caters to advanced users; the software developed here is reliable not only in its code integrity but also in its integrated approach, which caters to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
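
For orientation only, a toy steady-state balance in the spirit of the integrated model is sketched below: collector gain times an assumed absorption-chiller COP gives the cooling that solar input can cover, compared against a building load. All values and names are illustrative assumptions; the dissertation's software couples far more detailed collector, chiller and building heat-flow models.

```python
def solar_cooling_capacity(irradiance_w_m2, collector_area_m2,
                           collector_efficiency=0.55, chiller_cop=0.7):
    """Cooling delivered by a solar-driven single-effect absorption chiller [W]."""
    q_solar = irradiance_w_m2 * collector_area_m2 * collector_efficiency
    return q_solar * chiller_cop

building_cooling_load_w = 35_000          # assumed design cooling load
capacity_w = solar_cooling_capacity(irradiance_w_m2=800, collector_area_m2=120)
solar_fraction = min(1.0, capacity_w / building_cooling_load_w)
print(f"cooling capacity: {capacity_w / 1000:.1f} kW, solar fraction: {solar_fraction:.0%}")
```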

Relevance:

100.00%

Publisher:

Abstract:

This dissertation, whose research has been conducted at the Group of Electronic and Microelectronic Design (GDEM) within the framework of the project Power Consumption Control in Multimedia Terminals (PCCMUTE), focuses on the development of an energy estimation model for a battery-powered embedded processor board. The main objectives and contributions of the work are summarized as follows. A model is proposed to obtain accurate energy estimates based on the linear correlation between performance monitoring counters (PMCs) and energy consumption. Given that the appropriate PMCs are unique to each system, the modeling methodology is improved so that it yields stable accuracy with only slight variations across multiple scenarios and is repeatable on other systems. It includes two steps: first, a PMC filter, which identifies the most suitable subset among the PMCs available on a system; and second, k-fold cross validation, which avoids bias during the model training stage. The methodology is implemented on a commercial embedded board running the 2.6.34 Linux kernel and PAPI, a cross-platform interface for configuring and accessing PMCs. The results show that the methodology maintains good stability across different scenarios and provides robust estimates, with an average relative error below 5%.
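
A minimal sketch of the two-step methodology, under assumed data and names, could look as follows: a simple correlation ranking stands in for the dissertation's PMC filter, and k-fold cross validation assesses the linear PMC-to-energy model (the MAPE scorer used here requires a recent scikit-learn version).

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

samples = pd.read_csv("pmc_energy_samples.csv")     # PMC readings plus measured energy
energy = samples["energy_joule"]
pmcs = samples.drop(columns=["energy_joule"])

# step 1: keep the PMCs most correlated with measured energy (assumed filter criterion)
top_pmcs = pmcs.corrwith(energy).abs().sort_values(ascending=False).head(4).index
X = pmcs[top_pmcs]

# step 2: k-fold cross validation of the linear model to avoid training bias
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(LinearRegression(), X, energy, cv=cv,
                         scoring="neg_mean_absolute_percentage_error")
print("selected PMCs:", list(top_pmcs))
print(f"mean relative error: {-scores.mean():.1%}")
```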

Relevance:

100.00%

Publisher:

Abstract:

Core competencies form the basis of an organization's skills and are the basic element of successful strategic execution. Identifying and strengthening core competencies enhances flexibility, strategically positioning a firm to respond to competition in a dynamic marketplace, and can be the difference in quality among firms that follow the same business model. A correct understanding of the concept of business models, employing the right core competencies, organizing them effectively, and building the business model around competencies that are constantly gained and assimilated can result in enhanced business performance, with implications for firms that want to innovate their business models. Flexibility can be understood as a firm's agility in shifting focus in response to external factors such as changing markets, new technologies or competition, and a firm's success can be gauged by the ability it displays in this transition. Although industry transformations generally emanate from technological changes, recent examples suggest they may also be due to the introduction of new business models, and nowhere is this more relevant than in the airline industry. An analysis of the business model flexibility of 17 airlines from Asia, Europe and Oceania, carried out with core competence as the indicator, reveals a picture of inconsistencies in the core competence strategy of certain airlines and a corresponding reduction in business performance. The performance variations are explained by a service-oriented core competence strategy that ultimately gives airlines a flexible business model, which not only increases business performance but also helps reduce uncertainties in the internal and external operating environments. This is all the more relevant in the airline industry, where the product (the air transportation of passengers) minus the service competence is essentially the same across firms.

Relevance:

100.00%

Publisher:

Abstract:

This study was motivated by the need to densify Global Horizontal Irradiance (GHI) observations by increasing the number of surface weather stations observing it with sub-hourly sensors, and by the need to examine methods for spatially estimating GHI (by interpolation) at that periodicity in other locations. The aim of the present research project is to analyze the accuracy of 15-minute spatial GHI estimates produced by five methods across the territory of Spain (three geostatistical interpolation methods, one deterministic method, and the HelioSat2 method, which is based on satellite images). The research concludes that, when the work area has adequate station density, the best method for estimating GHI every 15 minutes is regression kriging using GHI estimated from satellite images as one of the input variables. Conversely, when station density is low, the best method is to estimate GHI directly from satellite images. A comparison between the GHI observed by volunteer stations and the selected estimation model shows that 67% of the volunteer stations analyzed present values within the margin of error (±2 standard deviations on average).
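
The regression-kriging idea the study favours can be sketched in two steps: regress station GHI on the satellite-derived GHI, then spatially interpolate the residuals and add them back to the regression trend. In the illustrative sketch below, plain inverse-distance weighting stands in for kriging of the residuals to keep the example dependency-free; the coordinates and values are made up.

```python
import numpy as np

def idw(xy_known, values, xy_target, power=2.0, eps=1e-9):
    """Inverse-distance-weighted interpolation of values at the target points."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_target[:, None, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# station coordinates [km], 15-min station GHI and collocated satellite GHI [W/m^2] (assumed)
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
ghi_station = np.array([510.0, 545.0, 498.0, 560.0])
ghi_satellite = np.array([500.0, 540.0, 505.0, 550.0])

# step 1: linear regression on the satellite covariate
slope, intercept = np.polyfit(ghi_satellite, ghi_station, deg=1)

# step 2: interpolate the residuals and add them to the regression trend
residuals = ghi_station - (slope * ghi_satellite + intercept)
xy_new = np.array([[5.0, 5.0]])
ghi_satellite_new = np.array([522.0])
estimate = slope * ghi_satellite_new + intercept + idw(xy, residuals, xy_new)
print(f"estimated 15-min GHI at target: {estimate[0]:.1f} W/m^2")
```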

Relevance:

100.00%

Publisher:

Abstract:

Building an interest model is key to personalized text recommendation. Previous interest models neglect the fact that a user may have multiple angles of interest, and different angles impose different requests and criteria on text recommendation. This paper proposes an interest model that consists of two kinds of angles, persistence and pattern, which can be combined to form complex angles. The model uses a new method to represent long-term and short-term interest, and distinguishes interest in an object from interest in the link structure of objects. Experiments with news text data show that interest in objects and interest in link structure correspond to real user requirements, and that recommending texts according to these angles is effective.
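
One plausible, hypothetical realisation of the persistence angle described above is to maintain two decayed profiles over the same interaction stream, a fast-decaying short-term profile and a slow-decaying long-term one, and to score candidate texts against whichever is requested. The half-lives and the update rule below are assumptions, not the paper's formulation.

```python
import math
import time
from collections import defaultdict

class InterestProfile:
    def __init__(self, half_life_days):
        self.decay = math.log(2) / (half_life_days * 86400.0)
        self.weights = defaultdict(float)   # topic/object id -> interest weight
        self.updated = defaultdict(float)   # topic/object id -> last update time

    def observe(self, topic, strength=1.0, now=None):
        """Decay the stored weight to the present and add the new evidence."""
        now = now or time.time()
        age = now - self.updated[topic]
        self.weights[topic] = self.weights[topic] * math.exp(-self.decay * age) + strength
        self.updated[topic] = now

    def score(self, topics, now=None):
        """Score a candidate text by the decayed interest in the topics it covers."""
        now = now or time.time()
        return sum(self.weights[t] * math.exp(-self.decay * (now - self.updated[t]))
                   for t in topics if t in self.weights)

short_term = InterestProfile(half_life_days=3)    # reacts to recent clicks
long_term = InterestProfile(half_life_days=90)    # captures persistent interests
```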

Relevance:

100.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common, and managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes.

Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support the effective use of DTA to model managed lane operations.

Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.

With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibrating managed lanes. Extensive and careful processing of demand, traffic, and toll data, together with a proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and responds reasonably to perturbations in network and demand properties.
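
As a toy illustration of the route-choice component that both static and dynamic assignment rely on, the sketch below computes a binary logit split between a tolled managed lane and the general-purpose lanes from travel-time savings and toll. The value of time and scale coefficient are illustrative assumptions, not calibrated values from this study.

```python
import math

def managed_lane_share(gp_time_min, ml_time_min, toll_usd,
                       value_of_time_usd_per_hr=18.0, scale=0.4):
    """Probability that a traveler chooses the managed lane (binary logit)."""
    saving_usd = (gp_time_min - ml_time_min) / 60.0 * value_of_time_usd_per_hr
    utility_diff = scale * (saving_usd - toll_usd)
    return 1.0 / (1.0 + math.exp(-utility_diff))

# near-capacity conditions: 12 minutes saved for a $3.50 toll
share = managed_lane_share(gp_time_min=34, ml_time_min=22, toll_usd=3.50)
print(f"predicted managed-lane share: {share:.0%}")
```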

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

Mathematical modelling of wastewater treatment plants (WWTPs) has been an extremely useful tool in both the design and operation phases of these treatment facilities. The main objective of this study was to build a mathematical model of the Bragança WWTP (ETAR), in particular of its activated sludge biological treatment, in order to evaluate, understand and optimize its performance. The model was built in the WRc STOAT 5.0 simulation environment, and the activated sludge process was described by the ASAL3 reference model. The model was calibrated and validated with experimental data from 2015, obtained within the plant's analytical control programme. The model was also used to evaluate effluent quality in response to changes in influent flow and composition, to changes in operating conditions, and to alternative treatment configurations. The model proved quite adequate in describing the monthly evolution of the final effluent quality with respect to Total Suspended Solids (TSS) and Biochemical Oxygen Demand (BOD5), although it tends to underestimate them by 1.5 and 3.5 mg/L, respectively. For total nitrogen, the simulated values approached the observed ones when the internal recirculation rates were increased to 400%, a factor of about four. The model results and the scenarios analysed show and reinforce the good performance and optimized operation of the WWTP with respect to TSS and BOD5 removal. Regarding total nitrogen, the plant does not systematically ensure high removal efficiency, but it performs well given what the model can explain for the same operating conditions. Scenario studies sought efficient and viable treatment alternatives for total nitrogen removal, but no solution was identified that would guarantee nitrogen discharges below the legal limits. The best results achieved for the removal of this contaminant are associated with increasing the internal recirculation rates of the existing pre-anoxic system and with a four-stage Bardenpho configuration with the feed split equally between the two anoxic stages. Other solutions involving different technologies can and should be considered in future projects aimed at improving the plant's nitrogen removal efficiency.
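
A small sketch of the mass-balance reasoning behind the recirculation scenario: in a pre-anoxic (MLE-type) configuration, the fraction of nitrified nitrate that can be returned to the anoxic zone and denitrified is commonly approximated by the textbook bound (IR + RAS)/(1 + IR + RAS), with the internal recycle IR and sludge return RAS expressed as multiples of the influent flow. This relation is not taken from the study itself, and the values below are illustrative only.

```python
def max_nitrate_removal(internal_recycle_ratio, ras_ratio):
    """Upper bound on the fraction of nitrified NO3-N that reaches the anoxic zone."""
    total = internal_recycle_ratio + ras_ratio
    return total / (1.0 + total)

for ir in (1.0, 2.0, 4.0):          # 100%, 200%, 400% internal recirculation
    eff = max_nitrate_removal(ir, ras_ratio=1.0)
    print(f"IR = {ir:.0%} of influent -> max NO3-N removal ~ {eff:.0%}")
```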