937 results for price of houses
Abstract:
Uncertainty affects all aspects of the property market, but one area where its impact is particularly significant is feasibility analysis. Any development is affected by differences between market conditions at the conception of the project and the market realities at the time of completion. The feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcome for the sale price of the completed development, the construction costs and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time sensitive, and analysis needs to be undertaken to show the impact of time on the viability of the project. The future is uncertain, and a full feasibility analysis should be able to model the upside and downside risk across a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value, but they tend to be single-point analyses based upon a single set of “likely” inputs. In this paper we look at the practical impact of uncertainty in the input variables using a simulation model (Crystal Ball ©) applied to an actual case study of an urban redevelopment plan for an Italian municipality. This allows the appraiser to address the uncertainty involved and thus provide the decision maker with a better understanding of the risk of development. The technique is then refined using a “two-dimensional” approach that distinguishes between “uncertainty” and “variability”, creating a more robust model.
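As a rough illustration of the kind of simulation described above (the paper itself uses Crystal Ball ©), the Python sketch below runs a plain Monte Carlo residual valuation; the input distributions, parameter values and the residual-land-value formula are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Illustrative input distributions (assumed, not the paper's calibration)
sale_price = rng.normal(12_000_000, 1_500_000, n)            # gross development value
build_cost = rng.triangular(6_000_000, 7_000_000, 9_000_000, n)
months = rng.uniform(18, 30, n)                               # development period
annual_rate = 0.08                                            # finance rate (assumed)

# Residual land value: value less costs, less finance on costs, less developer's profit
finance = build_cost * ((1 + annual_rate) ** (months / 12) - 1)
profit_margin = 0.15 * sale_price                             # developer's profit (assumed)
residual_land_value = sale_price - build_cost - finance - profit_margin

print(f"mean residual value : {residual_land_value.mean():,.0f}")
print(f"5th-95th percentile : {np.percentile(residual_land_value, [5, 95])}")
print(f"P(residual < 0)     : {(residual_land_value < 0).mean():.1%}")
```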
Abstract:
This article expresses the price of a spread option as the sum of the prices of two compound options. One compound option is to exchange vanilla call options on the two underlying assets and the other is to exchange the corresponding put options. In this way we derive a new closed-form approximation for the price of a European spread option and a corresponding approximation for each of its price, volatility and correlation hedge ratios. Our approach has many advantages over existing analytical approximations, which have limited validity and an indeterminacy that renders them of little practical use. The compound exchange option approximation for European spread options is then extended to American spread options on assets that pay dividends or incur costs. Simulations quantify the accuracy of our approach; we also present an empirical application to the American crack spread options that are traded on NYMEX. For illustration, we compare our results with those obtained using the approximation attributed to Kirk (1996, Correlation in energy markets. In: V. Kaminski (Ed.), Managing Energy Price Risk, pp. 71–78 (London: Risk Publications)), which is commonly used by traders.
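For reference, here is a minimal Python sketch of the Kirk (1996) approximation mentioned above for a European call on the spread of two lognormal forwards; the parameter values are invented for illustration, and the paper's own compound-exchange-option formula is not reproduced here.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def kirk_spread_call(f1, f2, k, sigma1, sigma2, rho, t, r):
    """Kirk's approximation for a European call on the spread F1 - F2 with strike K."""
    N = NormalDist().cdf
    z = f2 / (f2 + k)                       # weight of the second forward plus strike
    sigma = sqrt(sigma1**2 - 2*rho*sigma1*sigma2*z + (sigma2*z)**2)
    d1 = (log(f1 / (f2 + k)) + 0.5 * sigma**2 * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return exp(-r * t) * (f1 * N(d1) - (f2 + k) * N(d2))

# Illustrative inputs (not from the paper): a crack-spread-like setup
print(kirk_spread_call(f1=100.0, f2=92.0, k=5.0,
                       sigma1=0.35, sigma2=0.30, rho=0.85, t=0.5, r=0.02))
```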
Abstract:
From 2001, the construction of flats and high-density developments increased in England while the building of houses declined. Does this indicate a change in taste, or is it a result of government planning policies? In this paper, an analysis is made of the long-term effects of the policy of constraint which has existed for the past 50 years, but the increase in density is identified as occurring primarily after new, revised planning guidance was issued in England in 2000 which discouraged low-density development. To substantiate this, it is pointed out that the change which occurred in England did not occur in Scotland, where guidance was not changed to encourage high-density residential development. The conclusion that the change is the result of planning policies and not of a change in taste is confirmed by surveys of the occupants of new high-rise developments in Leeds. The new flat-dwellers were predominantly young and childless and expressed the intention of moving out of the city centre and into houses in the near future, when they could. Following recent changes in guidance by the new coalition government, it is expected that the construction of flats in England will fall back to earlier levels over the next few years.
Abstract:
This paper analyses the 53 managerial sackings and resignations from 16 stock-exchange-listed English football clubs during the nine seasons between 2000/01 and 2008/09. The results demonstrate that, on average, a managerial sacking results in a post-announcement-day market-adjusted share price rise of 0.3%, whilst a resignation leads to a drop in share price of 1% that continues for a trading month thereafter, cumulating to a negative abnormal return of over 8% measured from the trading day before the event. These findings are intuitive, and suggest that sacking a poorly performing manager may be welcomed by the markets as a possible route to better future match performance, while losing a capable manager through resignation, who typically progresses to a superior job, will result in a drop in a club’s share price. The paper also reveals that while the impact of managerial departures on stock price volatilities is less clear-cut, speculation in the newspapers is rife in the build-up to such an event.
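A compressed sketch of the market-adjusted event-study arithmetic described above: abnormal returns are the club's returns minus a market index return, cumulated over an event window. The return series below are made-up placeholders, not the study's data.

```python
import numpy as np

# Hypothetical daily simple returns around an announcement (day 0 = announcement)
club_returns   = np.array([-0.004, -0.012, 0.003, 0.000, 0.007, 0.002])   # days -1..+4
market_returns = np.array([-0.001, -0.002, 0.001, 0.001, 0.002, 0.000])

abnormal = club_returns - market_returns      # market-adjusted abnormal returns
car = np.cumsum(abnormal)                     # cumulative abnormal return (CAR)

for day, value in zip(range(-1, 5), car):
    print(f"CAR through day {day:+d}: {value:+.3%}")
```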
Abstract:
Free range egg producers face continuing problems from injurious pecking (IP), which has financial consequences for farmers and poor welfare implications for birds. Beak trimming has been practised for many years to limit the damage caused by IP, but with the UK Government giving notification that it intends to ban beak trimming in 2016, considerable efforts have been made to devise feasible housing, range and management strategies to reduce IP. A recent research project investigated the efficacy of a range of IP-reducing management strategies, the mean costs of which came to around 5 pence per bird. Here, the results of the above project’s consumer survey are presented: consumers’ attitudes to free range egg production are detailed, showing that, whilst consumers had a very positive attitude towards free range eggs, they were largely uninformed about some aspects of free range egg production. The contingent valuation technique was used to estimate the price premium consumers would be prepared to pay to ensure that hens do not suffer from IP: this was calculated as just over 3% on top of the prevailing retail price of free range eggs. These findings reinforce other studies that have found that, whilst consumers are not generally well informed about certain specific welfare problems faced by animals under free range conditions, they are prepared to pay to improve animal welfare. Indeed, the study findings suggest that producers could obtain an additional price premium if they demonstrate the welfare provenance of their eggs, perhaps by marketing the eggs as coming from birds with intact beaks. This welfare provenance could usefully be assured to consumers through the introduction of a single, mandatory, accredited EU-wide welfare-standards labelling scheme.
Abstract:
We perform an analysis of the electroweak precision observables in the Lee-Wick Standard Model. The most stringent restrictions come from the S and T parameters, which receive important tree-level and one-loop contributions. In general the model predicts a large positive S and a negative T. To reproduce the electroweak data, if all the Lee-Wick masses are of the same order, the Lee-Wick scale is of order 5 TeV. We show that it is possible to find some regions in the parameter space with a fermionic state as light as 2.4-3.5 TeV, at the price of raising all the other masses above 5-8 TeV. To obtain a light Higgs with such heavy resonances, a fine-tuning of at least a few per cent is needed. We also propose a simple extension of the model including a fourth generation of Standard Model fermions with their Lee-Wick partners. We show that in this case it is possible to pass the electroweak constraints with Lee-Wick fermionic masses of order 0.4-1.5 TeV and Lee-Wick gauge masses of order 3 TeV.
Abstract:
I consider the case for genuinely anonymous web searching. Big data seems to have it in for privacy. The story is well known, particularly since the dawn of the web. Vastly more personal information, monumental and quotidian, is gathered than in the pre-digital days. Once gathered it can be aggregated and analyzed to produce rich portraits, which in turn permit unnerving prediction of our future behavior. The new information can then be shared widely, limiting prospects and threatening autonomy. How should we respond? Following Nissenbaum (2011) and Brunton and Nissenbaum (2011 and 2013), I will argue that the proposed solutions—consent, anonymity as conventionally practiced, corporate best practices, and law—fail to protect us against routine surveillance of our online behavior. Brunton and Nissenbaum rightly maintain that, given the power imbalance between data holders and data subjects, obfuscation of one’s online activities is justified. Obfuscation works by generating “misleading, false, or ambiguous data with the intention of confusing an adversary or simply adding to the time or cost of separating good data from bad,” thus decreasing the value of the data collected (Brunton and Nissenbaum, 2011). The phenomenon is as old as the hills. Natural selection evidently blundered upon the tactic long ago. Take a savory butterfly whose markings mimic those of a toxic cousin. From the point of view of a would-be predator the data conveyed by the pattern is ambiguous. Is the bug lunch or potential last meal? In the light of the steep costs of a mistake, the savvy predator goes hungry. Online obfuscation works similarly, attempting for instance to disguise the surfer’s identity (Tor) or the nature of her queries (Howe and Nissenbaum 2009). Yet online obfuscation comes with significant social costs. First, it implies free riding. If I’ve installed an effective obfuscating program, I’m enjoying the benefits of an apparently free internet without paying the costs of surveillance, which are shifted entirely onto non-obfuscators. Second, it permits sketchy actors, from child pornographers to fraudsters, to operate with near impunity. Third, online merchants could plausibly claim that, when we shop online, surveillance is the price we pay for convenience. If we don’t like it, we should take our business to the local brick-and-mortar and pay with cash. Brunton and Nissenbaum have not fully addressed the last two costs. Nevertheless, I think the strict defender of online anonymity can meet these objections. Regarding the third, the future doesn’t bode well for offline shopping. Consider music and books. Intrepid shoppers can still find most of what they want in a book or record store. Soon, though, this will probably not be the case. And then there are those who, for perfectly good reasons, are sensitive about doing some of their shopping in person, perhaps because of their weight or sexual tastes. I argue that consumers should not have to pay the price of surveillance every time they want to buy that catchy new hit, that New York Times bestseller, or a sex toy.
Abstract:
This thesis evaluates different sites for a weather measurement system and a suitable PV simulation for the University of Surabaya (UBAYA) in Indonesia (Java). The weather station is able to monitor all common weather phenomena, including solar insolation, and it is planned to use the data for scientific and educational purposes in renewable energy studies. During evaluation and installation it became apparent that official specifications from global meteorological organisations could not be met for some sensors because of the conditions on the UBAYA campus. After arranging the hardware, the weather at the site was monitored for a period of time. A comparison with different official sources of ground-based and satellite-based measurements showed differences in wind and solar radiation. In some cases the monthly average solar insolation deviated by 42 % from satellite-based measurements; for the ground-based measurements the deviation was less than 10 %. The average wind speed differed by 33 % from a source that evaluated wind power in Surabaya, and the wind direction showed instabilities towards the east compared with data from the local weather station at the airport. PSET has the opportunity to obtain investment to investigate photovoltaics on its own roof. Several simulations identify a suitable roof orientation and the yearly and monthly outputs. With a 7.7 kWpeak PV installation using the latest crystalline technology on the market, 8.82 MWh/year could be achieved with weather data from 2012; thin-film technology could increase this to 9.13 MWh/year. The roofs have enough area to install PV. Finally, the low price of electricity in Indonesia makes it unattractive to feed the energy into the public grid.
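As a back-of-the-envelope check on figures such as the 8.82 MWh/year quoted above, annual PV output is often estimated as installed peak power × average peak sun hours × 365 × performance ratio; the insolation and performance-ratio values below are assumptions, not the thesis's simulation inputs.

```python
def annual_pv_yield_kwh(peak_kw, peak_sun_hours_per_day, performance_ratio):
    """Rule-of-thumb annual yield estimate for a PV array."""
    return peak_kw * peak_sun_hours_per_day * 365 * performance_ratio

# Assumed values: ~4.5 kWh/m^2/day insolation for Surabaya and a 0.70 performance ratio
estimate = annual_pv_yield_kwh(peak_kw=7.7, peak_sun_hours_per_day=4.5, performance_ratio=0.70)
print(f"~{estimate / 1000:.2f} MWh/year for a 7.7 kWp array")
```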
Abstract:
To identify the relevant product markets for Swedish pharmaceuticals, a spatial econometrics approach is employed. First, we calculate Moran’s I for different market definitions, and then we use a spatial Durbin model to determine the effect of price changes on the quantities sold of own and competing products. As expected, the results show that competition is strongest between close substitutes; however, the relevant product markets for Swedish pharmaceuticals extend beyond close substitutes to products included in the same class at the four-digit level of the Anatomical Therapeutic Chemical system as defined by the World Health Organization. The spatial regression model further indicates that increases in the price of a product significantly lower the quantity sold of that product and at the same time increase the quantity sold of competing products. For close substitutes (products belonging to the same class at the seven-digit level of the Anatomical Therapeutic Chemical system), as well as for products that, without being close substitutes, belong to the same therapeutic/pharmacological/chemical subgroup (the same class at the five-digit level of the Anatomical Therapeutic Chemical system), a significant shift towards increased competition is also visible after 1 July 2009, when the latest policy changes with regard to pharmaceuticals were implemented in Sweden.
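A minimal, self-contained sketch of the global Moran's I statistic mentioned above, computed with plain NumPy; the toy price vector and the binary neighbourhood matrix (here a hypothetical "same ATC class" grouping) are illustrative, not the study's data or weighting scheme.

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under weight matrix w."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    s0 = w.sum()
    return (n / s0) * (z @ w @ z) / (z @ z)

# Toy example: prices of 5 products; "neighbours" = same (hypothetical) ATC class
prices = [10.0, 11.5, 10.8, 24.0, 25.5]
w = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 0, 0],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

print(f"Moran's I = {morans_i(prices, w):.3f}")   # positive value -> similar prices cluster
```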
Abstract:
Sweden, together with Norway, Finland and Denmark, has created a multi-national electricity market called NordPool. In this market, producers and retailers of electricity can buy and sell electricity, and the retailers then offer this electricity to end consumers such as households and industries. Previous studies have shown that pricing in the NordPool market functions quite well, but to my knowledge no other study has examined whether pricing in the retail market to consumers in Sweden is well functioning. If the market is well functioning, with competition and low transaction costs when changing electricity retailer, we would expect a homogeneous good such as electricity to be sold at approximately the same price, and price changes to be highly correlated, in this market. Thus, the aim of this study is to test whether the price of Vattenfall, the largest energy firm in the Swedish market, is highly correlated with the prices of other firms in the Swedish retail market for electricity. Descriptive statistics indicate that the price offered by Vattenfall is quite similar to the prices of other firms in the market. In addition, regression analysis shows that the correlation between the price of Vattenfall and the prices of other firms is as high as 0.98.
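A small sketch of the kind of comparison described above, correlating one retailer's price series with another's; the monthly price vectors are invented placeholders, not the Swedish market data.

```python
import numpy as np

# Hypothetical monthly retail electricity prices (öre/kWh) for two firms
vattenfall = np.array([95.0, 97.5, 102.0, 110.3, 108.1, 99.4, 96.2, 101.7])
competitor = np.array([93.8, 96.9, 101.1, 109.0, 107.5, 98.0, 95.5, 100.9])

corr = np.corrcoef(vattenfall, competitor)[0, 1]
slope, intercept = np.polyfit(vattenfall, competitor, deg=1)   # simple OLS fit

print(f"correlation = {corr:.3f}")
print(f"competitor price ~ {intercept:.2f} + {slope:.2f} * Vattenfall price")
```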
Abstract:
In a global economy, manufacturers compete mainly on the cost efficiency of production, as the prices of raw materials are similar worldwide. Heavy industry has two big issues to deal with. On the one hand there is a lot of data that needs to be analysed in an effective manner, and on the other hand making big improvements through investments in corporate structure or new machinery is neither economically nor physically viable. Machine learning offers a promising way for manufacturers to address both these problems, as they are in an excellent position to employ learning techniques on their massive resource of historical production data. However, choosing a modelling strategy in this setting is far from trivial, and this is the objective of this article. The article investigates the characteristics of the most popular classifiers used in industry today; Support Vector Machines, Multilayer Perceptrons, Decision Trees, Random Forests, and the meta-algorithms Bagging and Boosting are the main methods investigated in this work. Lessons from real-world implementations of these learners are also provided, together with guidance on when the different learners can be expected to perform well. The importance of feature selection, and relevant selection methods in an industrial setting, are further investigated. Performance metrics are also discussed for the sake of completeness.
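To make the comparison concrete, here is a minimal scikit-learn sketch that cross-validates several of the classifiers named above on a synthetic dataset; the data, feature count and scoring choice are placeholders for real production data, not the article's experiments.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for historical production data (real features would be sensor/process values)
X, y = make_classification(n_samples=2000, n_features=30, n_informative=10, random_state=0)

models = {
    "SVM":           make_pipeline(StandardScaler(), SVC()),
    "MLP":           make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000)),
    "Decision tree": DecisionTreeClassifier(),
    "Random forest": RandomForestClassifier(),
    "Bagging":       BaggingClassifier(),
    "Boosting":      AdaBoostClassifier(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)   # 5-fold cross-validated accuracy
    print(f"{name:14s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```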
Abstract:
This work focuses on the design of broiler houses from the standpoint of their thermal performance under Brazilian climatic conditions. It defines the margins of the thermal conditions for performance, using feed conversion as an indicator, and, through the performance-specification methodology, presents the characteristics of the houses, pointing out the main routes to solving hygrothermal problems. The work includes a literature review on poultry physiology with regard to the production and dissipation of body heat to the environment. Emphasis was placed on the analysis of heat transmission, with the aim of serving as a guide for studies by professionals in the Agricultural Sciences.
Abstract:
The objective of this work is to model the strategic behaviour of individuals in the face of a stochastic shock that displaces the price of a given financial asset from its initial equilibrium. We investigate the path of the market price towards the new equilibrium, driven by the successive trades of agents seeking opportunities for immediate profit. The traders, who are assumed to have risk-averse utility functions, must choose the optimal quantity to trade and how long to wait before executing their orders, given that the volatility of the asset price decreases as transactions follow one another after the shock. We seek to show that traders who accept higher risk trade more frequently, in larger volumes and at greater speed, earning higher expected profits than the others.
Abstract:
Using financial data on stocks traded on the São Paulo Stock Exchange (Bolsa de Valores de São Paulo), we test the validity of the present value model (PVM) with expected returns that are constant over time (Campbell & Shiller, 1987). This model relates the price of a stock to its expected dividend stream discounted to present value at a constant discount rate. Behind this model lie the rational expectations hypothesis and the hypothesis that the future price of the asset is predictable through the inclusion of the dividends expected in the following period. In this work a multivariate analysis is carried out in a time series framework, using the technique of Vector Autoregressions. The empirical results presented, although inconclusive, allow us only to conclude that the rational expectations hypothesis cannot be completely rejected for Brazilian assets.
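A minimal numerical sketch of the constant-discount-rate present value relation tested above, P_t = sum over i of E_t[D_{t+i}] / (1+r)^i; the dividend forecasts and discount rate are invented, and the VAR-based test of Campbell and Shiller (1987) is not reproduced here.

```python
def present_value_price(expected_dividends, r):
    """Model-implied price: expected dividends discounted at a constant rate r."""
    return sum(d / (1 + r) ** (i + 1) for i, d in enumerate(expected_dividends))

# Hypothetical expected dividend stream (per share), truncated at 50 periods
dividends = [1.00, 1.05, 1.10, 1.16, 1.22] + [1.28] * 45
price = present_value_price(dividends, r=0.08)
print(f"model-implied price ~ {price:.2f}")
```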
Abstract:
The study set out to identify, in the Morro do Vital Brazil community, references indicating a sense of collectivity and continuity. This community, as the study seeks to show, originated in the 1920s and 1930s around the Instituto de Hygiene, Sorotherapia e Veterinária do Estado do Rio de Janeiro, today the Instituto Vital Brazil. As a pharmaceutical factory, its production required labour, which, in the case studied, came to live on the hill behind the Institute, while still within its territory. From this origin arose a community marked by cooperation, unity and associativism. With the prosperity of the factory, the number of residents grew, giving rise to a set of households and families that also shared common traits. These identities made possible an encounter with the public authorities in the form of politicians and policies, such as the Family Doctor Programme (Programa Médico de Família). Through the latter, the initial interest of the research, the relationship between the researcher in question, a physician at the PMF Vital Brazil clinic, and the community came about, revealing performances by residents that indicated a distinctive sense of belonging and place of speech. The study identified characteristics of the community and of the actors who contributed to the creation of this collective, using oral history as its methodology.