977 results for cost estimation
Abstract:
The Keystone XL pipeline plays a major role in transporting Canadian oil to the USA. The pipeline is intended to reduce the dependency of the American oil industry on other countries and to help limit external debt. The proposed pipeline seeks the most suitable route, one that does not damage agricultural land or natural water resources such as the Ogallala Aquifer. Using Geographic Information System (GIS) techniques, the path suggested in this study achieved highly accurate results that support the use of least-cost analysis in similar future studies. The route analysis comprises several weighted overlay surfaces, each influenced by different criteria (slope, geology, population and land use). The resulting least-cost path route for each weighted overlay surface was compared with the original proposed pipeline, and each was more effective than the proposed Keystone XL pipeline.
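As an illustration of the weighted-overlay and least-cost path technique described above, the following sketch combines hypothetical criterion rasters into a single cost surface and traces the cheapest route across it with Dijkstra's algorithm; the grids, weights and function names are illustrative assumptions, not the study's data.

```python
import heapq
import numpy as np

def weighted_overlay(layers, weights):
    """Combine reclassified criterion rasters into one cost surface."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * layer for wi, layer in zip(w, layers))

def least_cost_path(cost, start, goal):
    """Dijkstra over an 8-connected grid; step cost is the mean of the two endpoint cells."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    step = np.hypot(dr, dc) * 0.5 * (cost[r, c] + cost[nr, nc])
                    nd = d + step
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# Illustrative 50x50 criterion rasters, each already reclassified to 1 (suitable) .. 10 (avoid).
rng = np.random.default_rng(0)
slope, geology, population, land_use = (rng.integers(1, 11, (50, 50)).astype(float) for _ in range(4))
surface = weighted_overlay([slope, geology, population, land_use], weights=[0.4, 0.2, 0.2, 0.2])
route, total_cost = least_cost_path(surface, start=(0, 0), goal=(49, 49))
print(len(route), round(total_cost, 1))
```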
Abstract:
Since the invention of photography, humans have been using images to capture, store and analyse what they are interested in. With developments in this field, assisted by better computers, it is possible to use image processing technology as an accurate method of analysis and measurement. Image processing's principal qualities are flexibility, adaptability and the ability to process a large amount of information easily and quickly. Successful examples of applications can be seen in several areas of human life, such as biomedicine, industry, surveillance, the military and mapping; indeed, several Nobel Prizes are related to imaging. The accurate measurement of deformations, displacements, strain fields and surface defects is challenging in many material tests in Civil Engineering, because traditionally these measurements require complex and expensive equipment plus time-consuming calibration. Image processing can be an inexpensive and effective tool for load displacement measurements. Using an adequate image acquisition system and taking advantage of the computational power of modern computers, it is possible to measure very small displacements with high precision. Several commercial software packages already exist on the market; however, they are sold at high cost. In this work, block-matching algorithms are used to compare the results from image processing with the data obtained from physical transducers during laboratory load tests. To test the proposed solutions, several load tests were carried out in partnership with researchers from the Civil Engineering Department at Universidade Nova de Lisboa (UNL).
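The following is a minimal sketch of the block-matching idea mentioned above: an integer-pixel, sum-of-squared-differences search that recovers the displacement of a block between two frames. The array names, block size and search radius are illustrative assumptions, not the algorithms or parameters used in the work.

```python
import numpy as np

def block_match(ref, cur, top_left, block=32, search=10):
    """Locate a reference block in the current frame by exhaustive SSD search.
    Returns the (dy, dx) integer displacement of the block between frames."""
    r, c = top_left
    template = ref[r:r + block, c:c + block].astype(float)
    best, best_dy, best_dx = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rr, cc = r + dy, c + dx
            if rr < 0 or cc < 0 or rr + block > cur.shape[0] or cc + block > cur.shape[1]:
                continue
            candidate = cur[rr:rr + block, cc:cc + block].astype(float)
            ssd = np.sum((template - candidate) ** 2)
            if ssd < best:
                best, best_dy, best_dx = ssd, dy, dx
    return best_dy, best_dx

# Synthetic example with illustrative data: shift an image by (3, -2) pixels and recover it.
rng = np.random.default_rng(1)
frame0 = rng.random((200, 200))
frame1 = np.roll(frame0, shift=(3, -2), axis=(0, 1))
print(block_match(frame0, frame1, top_left=(80, 80)))   # expected (3, -2)
```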
Abstract:
Cash-in-advance models usually require agents to reallocate money and bonds at fixed periods (every month or quarter, for example). I show that fixed periods underestimate the welfare cost of inflation. I use a model in which agents choose how often they exchange bonds for money. In the benchmark specification, the welfare cost of 10 percent inflation instead of 0 increases from 0.1 percent of income with fixed periods to 1 percent with optimal periods. The results are robust to different preferences, to different compositions of income in bonds or money, and to the introduction of capital and labor.
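The mechanism can be illustrated with a much simpler Baumol-Tobin style calculation (not the paper's model): an agent pays a fixed cost per bond-to-money exchange and forgoes interest on average money holdings, so holding the number of exchanges fixed understates how costly higher inflation is once the trip frequency is chosen optimally. All numbers below are illustrative.

```python
import numpy as np

# Agent spends income Y evenly over the year, pays a fixed cost 'fee' per
# bond-to-money exchange, and forgoes the nominal rate i on average money
# holdings Y / (2 n) when making n exchanges. All values are illustrative.
def total_cost(n, Y=1.0, fee=0.005, i=0.10):
    return fee * n + i * Y / (2 * n)

def optimal_trips(Y=1.0, fee=0.005, i=0.10):
    return np.sqrt(i * Y / (2 * fee))   # first-order condition of total_cost in n

i_low, i_high = 0.03, 0.13              # nominal rates under 0% and 10% inflation (illustrative)
fixed_n = 12                            # "fixed periods": monthly reallocation

cost_fixed = total_cost(fixed_n, i=i_high) - total_cost(fixed_n, i=i_low)
cost_opt = (total_cost(optimal_trips(i=i_high), i=i_high)
            - total_cost(optimal_trips(i=i_low), i=i_low))
print(f"extra cost of 10% inflation, fixed monthly exchanges: {cost_fixed:.4f} of income")
print(f"extra cost of 10% inflation, optimal exchange choice: {cost_opt:.4f} of income")
```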
Abstract:
The objective of this work project is to analyse and discuss the importance of the "Cost to Serve" as a key differentiation factor, by assessing the cost to serve customers of a Portuguese subsidiary of a multinational company operating in the fast-moving consumer goods (FMCG) sector, Unilever Jerónimo Martins (UJM). I also suggest and quantify key proposals to decrease costs and increase customer value. Hence, the scope of this work project is focused on the logistics and distribution processes of the company's supply chain.
Abstract:
Introduction: Leprosy remains a relevant public health issue in Brazil. The objective of this study was to analyze the spatial distribution of new cases of leprosy and to detect areas with a higher risk of disease in the City of Vitória. Methods: This was an ecological study based on the spatial distribution of leprosy in the City of Vitória, State of Espírito Santo, between 2005 and 2009. The data came from the available records of the State Health Secretariat of Espírito Santo. Global and local empirical Bayesian methods were used in the spatial analysis to estimate leprosy risk and to smooth random fluctuation in the detection coefficients. Results: Thematic maps illustrate that leprosy is distributed heterogeneously among the neighborhoods and that it is possible to identify areas with a high risk of disease. The Pearson correlation coefficient of 0.926 (p = 0.001) for the local method indicated highly correlated coefficients. Moran's index was calculated to evaluate the correlation between the incidences of adjoining districts. Conclusions: We identified the spatial contexts with the highest incidence rates of leprosy in Vitória during the studied period. The results contribute to the knowledge of the spatial distribution of leprosy in the City of Vitória, which can help establish more cost-effective control strategies, because they indicate specific regions and priority planning activities that can interfere with the transmission chain.
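As an illustration of the rate smoothing mentioned above, the sketch below applies a method-of-moments global empirical Bayes estimator (in the spirit of Marshall, 1991) to hypothetical neighborhood counts; the data and variable names are illustrative, not the study's.

```python
import numpy as np

# Global empirical Bayes smoothing of small-area detection rates. Data are illustrative.
cases = np.array([2, 0, 5, 1, 8, 3])                    # new cases per neighborhood
pop = np.array([1200, 400, 5000, 900, 7000, 2500.0])    # population at risk

raw_rate = cases / pop
m = cases.sum() / pop.sum()                              # global mean rate (prior mean)
s2 = np.sum(pop * (raw_rate - m) ** 2) / pop.sum()       # weighted rate variance
A = max(s2 - m / pop.mean(), 0.0)                        # between-area variance estimate
w = A / (A + m / pop)                                    # shrinkage weight per area
eb_rate = w * raw_rate + (1 - w) * m                     # smoothed (empirical Bayes) rate

for i, (r, e) in enumerate(zip(raw_rate, eb_rate)):
    print(f"area {i}: raw {1e4 * r:5.1f}  smoothed {1e4 * e:5.1f}  (per 10,000)")
```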
Abstract:
Tourism in both Oporto and the North Region of Portugal has evolved significantly during the past decade. In this evolution, it is relevant to highlight the arrival of Low Cost Carriers (LCCs) at Francisco Sá Carneiro Airport, which contributed to a rapid expansion of this region as a tourism destination. Hence, this work project aims to assess the touristic and economic impact of the entry of LCCs in Oporto and the North Region of Portugal, and to understand whether this event was in fact an asset in the development of the aforementioned tourism destinations.
Abstract:
The main purpose of this Work Project is to perform a practical Cost-Benefit Analysis, from a social perspective, of two noise reduction projects in industrial sites that aim at complying with the existing regulation. By doing so, one may expect a more comprehensive view of the benefits and costs of both projects, as well as relevant insight into how noise exposure regulation should be optimally defined in Portugal and within the EU.
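A minimal sketch of the kind of calculation underlying such a social cost-benefit analysis: discounting yearly costs and monetised benefits of a noise reduction measure to a net present value and a benefit-cost ratio. All figures and the discount rate are illustrative placeholders, not the Work Project's data.

```python
def npv(flows, rate):
    """Net present value of yearly flows, year 0 first."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

years = 15
investment = 250_000.0            # year-0 cost of the noise reduction measure (illustrative)
maintenance = 5_000.0             # yearly maintenance cost (illustrative)
yearly_benefit = 40_000.0         # monetised social benefit per year, e.g. avoided annoyance (illustrative)
social_discount_rate = 0.035

costs = [investment] + [maintenance] * years
benefits = [0.0] + [yearly_benefit] * years
net = npv(benefits, social_discount_rate) - npv(costs, social_discount_rate)
bcr = npv(benefits, social_discount_rate) / npv(costs, social_discount_rate)
print(f"NPV = {net:,.0f} EUR, benefit-cost ratio = {bcr:.2f}")
```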
Abstract:
The aim of this work project is to analyze the current algorithm used by EDP to estimate their clients' electrical energy consumption, to create a new algorithm, and to compare the advantages and disadvantages of both. The new algorithm differs from the current one in that it incorporates effects from temperature variations. The comparison shows that the new algorithm with temperature variables performed better than the same algorithm without temperature variables, although there is still potential for further improvement of the current algorithm if the prediction model is estimated using a sample of daily data, as is the case with the current EDP algorithm.
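A minimal sketch of how temperature effects can be incorporated into a consumption estimation model, assuming an ordinary least squares regression with heating and cooling degree-days as extra regressors; the synthetic data, the 18 °C base temperature and the variable names are illustrative and not EDP's actual algorithm.

```python
import numpy as np

# Synthetic daily data and a simple OLS comparison: baseline model vs. model with
# temperature variables. All values and the 18 C base temperature are illustrative.
rng = np.random.default_rng(2)
days = 365
temp = 16 + 8 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
hdd = np.maximum(18 - temp, 0)                 # heating degree-days
cdd = np.maximum(temp - 18, 0)                 # cooling degree-days
consumption = 10 + 0.8 * hdd + 0.5 * cdd + rng.normal(0, 1, days)   # synthetic kWh/day

X_base = np.column_stack([np.ones(days)])                 # baseline: constant only
X_temp = np.column_stack([np.ones(days), hdd, cdd])       # with temperature variables

for name, X in [("baseline", X_base), ("with temperature", X_temp)]:
    beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
    resid = consumption - X @ beta
    rmse = np.sqrt(np.mean(resid ** 2))
    print(f"{name:17s} RMSE = {rmse:.2f} kWh/day")
```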
Abstract:
The hospital pharmacy in large and advanced institutions has evolved from a simple storage and distribution unit into a highly specialized manipulation and dispensation center, responsible for handling hundreds of clinical requests, many of them unique and not obtainable from commercial companies. It was therefore quite natural that, in many environments, a manufacturing service was gradually established to cater to both conventional and extraordinary demands of the medical staff. That was the case at Hospital das Clinicas, where multiple categories of drugs are routinely produced inside the pharmacy. However, cost-containment imperatives dictate that such activities be reassessed in light of their efficiency and essentiality. METHODS: In a prospective study, the output of the Manufacturing Service of the Central Pharmacy during a 12-month period was documented and classified into three types. Group I comprised drugs similar to commercially distributed products, Group II included exclusive formulations for routine consumption, and Group III dealt with special demands related to clinical investigations. RESULTS: The three categories represented 34.4%, 45.3%, and 20.3% of total manufacture orders, respectively. Costs of production were assessed and compared with market prices for Group I preparations, indicating savings of 63.5%. When applied to the other groups, for which a direct market-value equivalent did not exist, these results would suggest total yearly savings of over US$5,100,000. Even considering that these calculations leave out many components of cost, notably those concerning marketing and distribution, it might still be concluded that at least part of the savings achieved were real. CONCLUSIONS: The observed savings, together with the convenience and reliability with which the Central Pharmacy performed its obligations, support the contention that internal manufacture of pharmaceutical formulations was a cost-effective alternative in the described setting.
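Purely as an illustration of the extrapolation logic (with invented placeholder figures, not the study's data): the measured Group I margin against market prices implies, for a given yearly production cost, an equivalent market value and hence a yearly saving.

```python
# All figures below are invented placeholders, not the study's data.
savings_rate = 0.635                    # measured Group I savings vs. market prices
production_cost = 3_000_000.0           # hypothetical total yearly production cost (USD)

# market value implied by the measured margin: cost / (1 - savings_rate)
market_value = production_cost / (1 - savings_rate)
yearly_savings = market_value - production_cost
print(f"implied yearly savings: {yearly_savings:,.0f} USD")
```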
Abstract:
This work studies fuel retail firms' strategic behavior in a two-dimensional product differentiation framework. Following the mandatory provision of "low-cost" fuel, we consider that capacity constraints force firms to eliminate one of the previously offered qualities. Firms play a two-stage game, first choosing fuel qualities from three possibilities (low-cost, medium-quality and high-quality fuel) and then prices, given exogenous opposite locations. At the highest level of consumer heterogeneity, a subgame-perfect Nash equilibrium exists in which both firms choose minimum quality differentiation. Consumers are worse off if no differentiation occurs in the medium and high qualities. The effect of the mandatory "low-cost" fuel law on prices is ambiguous.
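The backward-induction structure of the quality stage can be sketched as follows: each firm picks one of the three qualities, taking as given the reduced-form profits from the second-stage price equilibrium, and pure-strategy Nash equilibria are found by checking unilateral deviations. The profit numbers are hypothetical placeholders, not the paper's results.

```python
from itertools import product

# First-stage quality game with hypothetical reduced-form second-stage profits.
qualities = ["low-cost", "medium", "high"]
profit = {  # profit[(q1, q2)] = (firm 1 profit, firm 2 profit); placeholder values
    ("low-cost", "low-cost"): (2, 2), ("low-cost", "medium"): (3, 4), ("low-cost", "high"): (3, 5),
    ("medium", "low-cost"): (4, 3),   ("medium", "medium"): (3, 3),   ("medium", "high"): (4, 4),
    ("high", "low-cost"): (5, 3),     ("high", "medium"): (4, 4),     ("high", "high"): (3, 3),
}

def is_nash(q1, q2):
    """No firm gains by unilaterally switching quality."""
    best1 = max(profit[(q, q2)][0] for q in qualities)
    best2 = max(profit[(q1, q)][1] for q in qualities)
    return profit[(q1, q2)][0] == best1 and profit[(q1, q2)][1] == best2

equilibria = [(q1, q2) for q1, q2 in product(qualities, qualities) if is_nash(q1, q2)]
print("pure-strategy quality equilibria:", equilibria)
```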
Abstract:
This work is devoted to broadband wireless transmission techniques, which are serious candidates for implementation in future broadband wireless and cellular systems, aiming to provide high and reliable data transmission together with high mobility. In order to cope with doubly selective channels, receiver structures based on OFDM and SC-FDE block transmission techniques are proposed, which allow cost-effective implementations using FFT-based signal processing. The first subject addressed is the impact of the number of multipath components and of the diversity order on the asymptotic performance of OFDM and SC-FDE, both uncoded and with different channel coding schemes. The results show that the number of relevant separable multipath components is a key element influencing the performance of OFDM and SC-FDE schemes. Then, improved estimation and detection performance for OFDM-based broadcasting systems employing SFN (Single Frequency Network) operation is introduced. An initial coarse channel estimate is obtained using low-power training sequences, and an iterative receiver with joint detection and channel estimation is presented. The achieved results show very good performance, close to that obtained with perfect channel estimation. The next topic also concerns SFN systems, devoting special attention to the time-distortion effects inherent to these networks. Typically, SFN broadcast wireless systems employ OFDM schemes to cope with severely time-dispersive channels; however, frequency errors due to CFO compromise the orthogonality between subcarriers. As an alternative approach, the possibility of using SC-FDE schemes (characterized by reduced envelope fluctuations and higher robustness to carrier frequency errors) is evaluated, and a technique employing joint CFO estimation and compensation under severe time-distortion effects is proposed. Finally, broadband mobile wireless systems are considered, in which the relative motion between transmitter and receiver induces a Doppler shift that differs for each propagation path, depending on the angle of incidence of that path with respect to the direction of travel. This represents a severe impairment in wireless digital communication systems, since multipath propagation combined with Doppler effects leads to drastic and unpredictable fluctuations of the envelope of the received signal, severely affecting detection performance. Channel variations due to this effect are very difficult to estimate and compensate. In this work we propose a set of SC-FDE iterative receivers implementing efficient estimation and tracking techniques. The performance results show that the proposed receivers perform very well, even in the presence of significant Doppler spread between the different groups of multipath components.
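As a minimal sketch of the FFT-based processing underlying SC-FDE receivers, the following simulates a QPSK block with a cyclic prefix over a known multipath channel and equalises it in the frequency domain with an MMSE equaliser; the channel taps, block size and SNR are illustrative, and perfect channel knowledge is assumed (no CFO or Doppler tracking).

```python
import numpy as np

# Single-carrier transmission with frequency-domain MMSE equalisation (SC-FDE).
# Channel taps, block size and SNR are illustrative; the channel is assumed known.
rng = np.random.default_rng(3)
N, cp = 256, 16                                   # block size and cyclic prefix length
h = np.array([0.8, 0.5 + 0.3j, 0.2])              # multipath channel impulse response
snr_db = 15

bits = rng.integers(0, 2, (N, 2))
sym = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)   # QPSK symbols

tx = np.concatenate([sym[-cp:], sym])             # add cyclic prefix
rx = np.convolve(tx, h)[:len(tx)]                 # multipath channel
noise_var = 10 ** (-snr_db / 10)
rx = rx + np.sqrt(noise_var / 2) * (rng.normal(size=rx.shape) + 1j * rng.normal(size=rx.shape))

y = rx[cp:cp + N]                                 # remove cyclic prefix
Y = np.fft.fft(y)
H = np.fft.fft(h, N)
W = np.conj(H) / (np.abs(H) ** 2 + noise_var)     # MMSE frequency-domain equaliser
s_hat = np.fft.ifft(W * Y)                        # back to the time domain for detection

detected = np.column_stack([(s_hat.real > 0), (s_hat.imag > 0)]).astype(int)
print("bit errors:", np.sum(detected != bits), "of", bits.size)
```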
Abstract:
Ship tracking systems allow Maritime Organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, the geographical coverage of ship tracking platforms has increased significantly in recent years, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach uses a Genetic Algorithm applied to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a Maritime Safety expert.
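A minimal sketch of the genetic-algorithm idea described above: a route is encoded as a fixed number of intermediate waypoints between two ports and evolved so that it stays short while passing close to historical ship positions. The data, fitness weights and GA settings are illustrative assumptions, not EMSA's or the prototype's actual method.

```python
import numpy as np

# Illustrative data and GA settings; coordinates are abstract plane positions.
rng = np.random.default_rng(4)
start, end = np.array([0.0, 0.0]), np.array([10.0, 4.0])
history = np.array([[2, 1.2], [4, 2.1], [6, 2.6], [8, 3.4]], dtype=float)  # past ship positions

K, POP, GEN = 4, 60, 200            # waypoints per route, population size, generations

def route(genome):
    return np.vstack([start, genome.reshape(K, 2), end])

def fitness(genome):
    pts = route(genome)
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    # mean distance from each historical position to the nearest route vertex
    miss = np.mean(np.min(np.linalg.norm(history[:, None] - pts[None], axis=2), axis=1))
    return length + 5.0 * miss       # lower is better

pop = rng.uniform([0, 0], [10, 4], size=(POP, K, 2)).reshape(POP, 2 * K)
for _ in range(GEN):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[np.argsort(scores)[:POP // 2]]                       # truncation selection
    parents = elite[rng.integers(0, len(elite), (POP, 2))]
    alpha = rng.random((POP, 1))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]   # blend crossover
    children += rng.normal(0, 0.1, children.shape)                   # Gaussian mutation
    children[0] = elite[0]                                           # elitism: keep the best route
    pop = children

print(np.round(route(pop[0]), 2))
```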
Abstract:
The moisture content in concrete structures has an important influence on their behavior and performance. Several validated numerical approaches adopt the governing equation for relative humidity fields proposed in Model Code 1990/2010. Nevertheless, there is no integrative study that addresses the choice of parameters for the simulation of the humidity diffusion phenomenon, particularly regarding the range of parameters put forward by Model Code 1990/2010. Software based on a finite difference method algorithm (1D and axisymmetric cases) is used to perform sensitivity analyses on the main parameters in a normal-strength concrete. Then, based on the conclusions of the sensitivity analyses, experimental results from nine different concrete compositions are analyzed. The software is used to identify the material parameters that best fit the experimental data. In general, the model was able to satisfactorily fit the experimental results, and new correlations were proposed, particularly focusing on the boundary transfer coefficient.
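As an illustration of such a finite-difference simulation, the sketch below solves the 1D relative-humidity diffusion equation with a humidity-dependent diffusivity of the Model Code 1990 type, using an explicit scheme; the parameter values, slab geometry and the simple Dirichlet surface boundary are illustrative assumptions (the actual study calibrates, among others, the boundary transfer coefficient).

```python
import numpy as np

# Explicit 1D finite-difference drying simulation with a Model Code 1990 type diffusivity:
# D(h) = D1 * (alpha + (1 - alpha) / (1 + ((1 - h) / (1 - hc)) ** n)).
# Parameter values, geometry and the surface boundary treatment are illustrative.
D1, alpha, hc, n_exp = 1.0e-10, 0.05, 0.80, 15.0       # m^2/s and typical MC90-style constants
L, nx = 0.10, 51                                       # 10 cm thick slab, grid points
dx = L / (nx - 1)
h = np.full(nx, 1.00)                                  # initially saturated (RH = 1.0)
h_env = 0.50                                           # environmental relative humidity

def D(h):
    return D1 * (alpha + (1 - alpha) / (1 + ((1 - h) / (1 - hc)) ** n_exp))

dt = 0.4 * dx ** 2 / D1                                # within the explicit stability limit
t_end = 90 * 24 * 3600.0                               # simulate 90 days of drying
t = 0.0
while t < t_end:
    Dm = D(0.5 * (h[1:] + h[:-1]))                     # diffusivity at cell interfaces
    flux = Dm * (h[1:] - h[:-1]) / dx
    h[1:-1] += dt * (flux[1:] - flux[:-1]) / dx
    h[0] = h_env                                       # drying surface (Dirichlet, for simplicity)
    h[-1] = h[-2]                                      # sealed face (zero flux)
    t += dt

print("RH profile every 1 cm:", np.round(h[::5], 3))
```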