794 results for Multi-sector New Keynesian DSGE models
Abstract:
Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. 
Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.
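One common way to summarise such a multi-model evaluation numerically is a multi-model mean field paired with a bias metric against observations. The following is a generic sketch of that idea; the specific metric choices and function names here are illustrative assumptions, not the diagnostics used in the study:

```python
import numpy as np

def multi_model_mean(models):
    """Mean field over an ensemble; `models` is shaped (n_models, n_sites)."""
    return np.mean(np.asarray(models, float), axis=0)

def normalised_mean_bias(model, obs):
    """Normalised mean bias of modelled vs observed values:
    (sum(model) - sum(obs)) / sum(obs); negative means the model is biased low."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    return (model.sum() - obs.sum()) / obs.sum()
```

Applied per season and per site, a negative bias of the mean model's accumulation-mode number concentration in winter would correspond to the behaviour described above.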
Abstract:
The UK house building sector faces dual pressures: to expand supply while delivering against tougher Building Regulations requirements, predominantly in the area of sustainability. A review of current literature highlights that the pressures the UK house building industry is currently under may be having a negative impact on build quality, causing an increase in defects. A review and synthesis of the current defect literature on new-build housing and the wider construction sector finds that the prevailing emphasis is limited to the classification, causes, pathology and statistical analysis of defects. There is thus a need to better understand the overall impact of individual defects on key stakeholders within the new-build housing defect detection and remediation process. As part of ongoing research to develop and verify a defect impact assessment rating system, this paper seeks to contribute to our understanding of the impact of individual defects from a key stakeholder perspective by undertaking the literature review and synthesis phase. The literature review identifies three distinct but interrelated dominant impact factors: cost, disruption, and health and safety. By pulling the strands of defect literature together, the theoretical lens and key stakeholder sampling strategy are formed as the basis for the subsequent impact weighting development phase.
Abstract:
In this article we assess the capabilities of a new electromagnetic (EM) system, the CMD Mini-Explorer, for prospecting archaeological features in Ireland and the UK. The Mini-Explorer is an EM probe primarily aimed at the environmental/geological prospecting market for the detection of pipes and geological features. It has long been evident from the use of other EM devices that such an instrument might be suitable for shallow soil studies and applicable to archaeological prospecting. Of particular interest to the archaeological surveyor is the fact that the Mini-Explorer simultaneously obtains both quadrature (‘conductivity’) and in-phase (related to ‘magnetic susceptibility’) data from three depth levels. As the maximum depth range is probably about 1.5 m, a comprehensive analysis of the subsoil within that range is possible. As with all EM devices, the measurements require no contact with the ground, thereby obviating the problem of high contact resistance that often besets earth resistance data during dry spells. The use of the CMD Mini-Explorer at a number of sites has demonstrated that it has the potential to detect a range of archaeological features and produces data comparable in quality to those obtained from standard earth resistance and magnetometer techniques. In theory, the ability to measure two phenomena at three depths suggests that this type of instrument could reduce the number of poor outcomes that result from single-measurement surveys. The high success rate reported here in the identification of buried archaeology using a multi-depth device that responds to the two most commonly mapped geophysical phenomena has implications for evaluation-style surveys. Copyright © 2013 John Wiley & Sons, Ltd.
Abstract:
Inspired by the commercial desires of global brands and retailers to access the lucrative green consumer market, carbon is increasingly being counted and made knowable at the mundane sites of everyday production and consumption, from the carbon footprint of a plastic kitchen fork to that of an online bank account. Despite the challenges of counting and making commensurable the global warming impact of a myriad of biophysical and societal activities, this desire to communicate a product or service's carbon footprint has sparked complicated carbon calculative practices and enrolled actors at literally every node of multi-scaled and vastly complex global supply chains. Against this landscape, this paper critically analyzes the counting practices that create the ‘e’ in ‘CO2e’. It is shown that central to these practices are a series of tools, models and databases which, building upon previous work (Eden, 2012; Star and Griesemer, 1989), we conceptualize here as ‘boundary objects’. By enrolling everyday actors from farmers to consumers, these objects abstract and stabilize greenhouse gas emissions from their messy material and social contexts into units of CO2e which can then be translated along a product's supply chain, thereby establishing a new currency of ‘everyday supply chain carbon’. However, in making all greenhouse gas-related practices commensurable, and in enrolling and stabilizing the transfer of information between multiple actors, these objects oversee a process of simplification reliant upon, and subject to, a multiplicity of approximations, assumptions, errors, discrepancies and/or omissions. Further, the outcomes of these tools are subject to the politicized and commercial agendas of the worlds they attempt to link, with each boundary actor inscribing different meanings to a product's carbon footprint in accordance with their specific subjectivities, commercial desires and epistemic framings.
It is therefore shown that how a boundary object transforms greenhouse gas emissions into units of CO2e is the outcome of distinct ideologies regarding ‘what’ a product's carbon footprint is and how it should be made legible. These politicized decisions, in turn, inform specific reduction activities and ultimately advance distinct, specific and increasingly durable transition pathways to a low-carbon society.
Abstract:
Rapid growth in the production of new homes in the UK is putting build quality under pressure, as evidenced by an increase in the number of defects. Housing associations (HAs) contribute approximately 20% of the UK’s new housing supply. HAs are currently experiencing central government funding cuts and rental revenue reductions. As part of HAs’ quest to ramp up supply despite tight budget conditions, they are reviewing how they learn from defects. Learning from defects is argued to be a means of reducing the persistent defect problem within the UK housebuilding industry, yet how HAs learn from defects is under-researched. The aim of this research is to better understand how HAs, in practice, learn from past defects to reduce the prevalence of defects in future new homes. The theoretical lens for this research is organizational learning. The results, drawn from 12 HA case studies, indicate that effective organizational learning has the potential to reduce defects within the housing sector. The results further identify that HAs are restricting their learning, focusing primarily on reducing defects through product and system adaptations. Focusing on product and system adaptations alone suppresses HAs’ ability to reduce defects in the future.
Abstract:
This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, which are all linear. With data arriving online, the performance of all candidate sub-models is monitored over the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are left unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window subject to a sum-to-one constraint on the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever performs better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
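The two computational ingredients described above — the RLS update for a selected sub-model and the closed-form, sum-to-one combination of the M sub-model predictions — can be sketched as follows. This is a minimal illustration of the generic techniques, not the paper's implementation; all names, the forgetting factor and the small ridge term are assumptions:

```python
import numpy as np

def rls_update(w, P, x, y, lam=0.99):
    """One recursive least squares (RLS) step for a linear sub-model
    y ~ w @ x, with forgetting factor lam."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    w = w + k * (y - w @ x)            # correct with the a priori error
    P = (P - np.outer(k, Px)) / lam    # update inverse-correlation matrix
    return w, P

def combine_weights(E, y):
    """Sum-to-one combination weights for M sub-model predictions.

    E is an (n_window x M) matrix of sub-model predictions over the recent
    data window, y the corresponding targets.  Minimising the window MSE
    subject to the weights summing to one has the closed form below
    (Lagrange multiplier); a tiny ridge keeps A invertible."""
    M = E.shape[1]
    A = E.T @ E + 1e-8 * np.eye(M)
    Ainv_b = np.linalg.solve(A, E.T @ y)
    Ainv_1 = np.linalg.solve(A, np.ones(M))
    mu = (1.0 - Ainv_b.sum()) / Ainv_1.sum()
    return Ainv_b + mu * Ainv_1
```

The closed form follows from setting the gradient of the constrained least-squares Lagrangian to zero, which is what makes the combination step cheap enough for online use.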
Abstract:
Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hinder the follow-up of the components over time and consequently contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool for analysing jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, varying exhaustively the quantities associated with the method. Our results show that even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within the 1 per cent level. Even for a non-precessing jet, our optimization method successfully identified the lack of precession.
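The cross-entropy method itself is simple to state: sample candidate parameter vectors from a Gaussian, keep the elite fraction with the best objective values, and move the sampling distribution towards the elite. A minimal sketch follows, with a toy quadratic objective standing in for the (far more involved) precession-model misfit; the function name, sample sizes and smoothing parameter are illustrative assumptions:

```python
import numpy as np

def cross_entropy_minimise(f, dim, n_samples=100, n_elite=10, iters=50,
                           alpha=0.7, rng=None):
    """Cross-entropy method for continuous multi-extremal minimisation.

    Repeatedly samples candidates from a diagonal Gaussian, keeps the
    n_elite best, and smoothly updates the sampling mean and spread
    towards the elite set (smoothing parameter alpha)."""
    rng = np.random.default_rng(rng)
    mu, sigma = np.zeros(dim), 2.0 * np.ones(dim)
    for _ in range(iters):
        X = rng.normal(mu, sigma, size=(n_samples, dim))
        scores = np.apply_along_axis(f, 1, X)
        elite = X[np.argsort(scores)[:n_elite]]
        mu = alpha * elite.mean(axis=0) + (1 - alpha) * mu
        sigma = alpha * elite.std(axis=0) + (1 - alpha) * sigma
    return mu
```

As the iterations proceed, sigma collapses around the best-fitting parameters, which is why the method can both recover precession parameters and, for a non-precessing jet, converge on a model with no precession.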
Abstract:
São Paulo is the most developed state in Brazil and contains few fragments of native ecosystems, generally surrounded by intensive agricultural land. Despite this, some areas still shelter large native animals. We aimed to understand how medium and large carnivores use a mosaic landscape of forest/savanna and agroecosystems, and how the species respond to different landscape parameters (percentage of land cover and edge density), in a multi-scale perspective. The response variables were: species richness, carnivore frequency and the frequency of the three most recorded species (Puma concolor, Chrysocyon brachyurus and Leopardus pardalis). We compared 11 competing models using Akaike's information criterion (AIC) and assessed model support using AIC weights. Concurrent models were combinations of land-cover type (native vegetation, "cerrado" formations, "cerrado" and eucalypt plantation), landscape feature (percentage of land cover and edge density) and spatial scale. Herein, spatial scale refers to the radius around a sampling point defining a circular landscape. The scales analyzed were 250 (fine), 1,000 (medium) and 2,000 m (coarse). The shape of the curves for the response variables (linear, exponential and power) was also assessed. Our results indicate that the species with high mobility, P. concolor and C. brachyurus, were best explained by the edge density of native vegetation at the coarse scale (2,000 m). The frequencies of P. concolor and C. brachyurus had a negative power-shaped response to the explanatory variables. This general trend was also observed for species richness and carnivore frequency. Species richness and P. concolor frequency were also well explained by a second concurrent model: edge density of cerrado at the fine (250 m) scale. A different response was recorded for L. pardalis, whose frequency was best explained by the amount of cerrado at the fine (250 m) scale; the response curve was linearly positive. The contrasting results (P. concolor and C. brachyurus vs L. pardalis) may be due to the much higher mobility of the first two species in comparison with the third. Moreover, L. pardalis requires higher-quality habitat than the other two species. This study highlights the importance of considering multiple spatial scales when evaluating species responses to different habitats. An important new finding was the prevalence of edge density over habitat extent in explaining overall carnivore distribution, key information for the planning and management of protected areas.
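The AIC-based model comparison used above follows a standard recipe: compute AIC from each model's maximised log-likelihood and parameter count, then convert AIC differences into normalised weights. A generic sketch (function name illustrative; this is the textbook formula, not code from the study):

```python
import numpy as np

def akaike_weights(log_likelihoods, n_params):
    """AIC values and Akaike weights for a set of candidate models.

    AIC = 2k - 2*logL.  Weights are exp(-delta_AIC/2), normalised to sum
    to one, and read as relative support for each model in the set."""
    aic = 2.0 * np.asarray(n_params, float) - 2.0 * np.asarray(log_likelihoods, float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return aic, w / w.sum()
```

Ranking 11 candidate models this way, at each of the three spatial scales, identifies the best-supported combination of land-cover type and landscape feature.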
Abstract:
Birnbaum-Saunders (BS) models have largely been applied in material fatigue studies and reliability analyses to relate the total time until failure to some type of cumulative damage. In many problems in the medical field, such as chronic cardiac diseases and different types of cancer, cumulative damage caused by several risk factors might cause degradation that leads to a fatigue process. In these cases, BS models can be suitable for describing the propagation lifetime. However, since the cumulative damage is assumed to be normally distributed in the BS distribution, the parameter estimates from this model can be sensitive to outlying observations. To attenuate this influence, we present BS models in which a Student-t distribution is assumed to explain the cumulative damage. In particular, we show that the maximum likelihood estimates of the Student-t log-BS models attribute smaller weights to outlying observations, producing robust parameter estimates. Some inferential results are also presented. In addition, a diagnostic analysis is derived based on local influence and on deviance-component and martingale-type residuals. Finally, a motivating example from the medical field is analyzed using log-BS regression models. Since the parameter estimates appear to be very sensitive to outlying and influential observations, the Student-t log-BS regression model should attenuate such influences. The model-checking methodologies developed in this paper are used to compare the fitted models.
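For reference, the standard stochastic representation of the BS law makes the role of the "cumulative damage" variable explicit; the Student-t variant described above simply changes that variable's distribution (ν below is the assumed degrees-of-freedom parameter):

```latex
T \;=\; \beta\left[\frac{\alpha Z}{2}
        + \sqrt{\left(\frac{\alpha Z}{2}\right)^{2} + 1}\,\right]^{2},
\qquad
\begin{cases}
Z \sim \mathrm{N}(0,1) & \text{classical BS model},\\
Z \sim t_{\nu} & \text{Student-}t\text{ BS model},
\end{cases}
```

where α is the shape and β the scale (median) parameter; the log-BS models act on Y = log T. Because t_ν has heavier tails than the normal, extreme damage values are less surprising under the Student-t model, which is what down-weights outliers in the likelihood.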
Abstract:
The main purpose of this work is to study the behaviour of Skovgaard's [Skovgaard, I.M., 2001. Likelihood asymptotics. Scandinavian Journal of Statistics 28, 3-32] adjusted likelihood ratio statistic in testing simple hypotheses in a new class of regression models proposed here. The proposed class of regression models considers Dirichlet-distributed observations, and the parameters that index the Dirichlet distributions are related to covariates and unknown regression coefficients. This class is useful for modelling data consisting of multivariate positive observations summing to one and generalizes the beta regression model described in Vasconcellos and Cribari-Neto [Vasconcellos, K.L.P., Cribari-Neto, F., 2005. Improved maximum likelihood estimation in a new class of beta regression models. Brazilian Journal of Probability and Statistics 19, 13-31]. We show that, for our model, Skovgaard's adjusted likelihood ratio statistic has a simple compact form that can be easily implemented in standard statistical software. The adjusted statistic is approximately chi-squared distributed with a high degree of accuracy. Numerical simulations show that the modified test is more reliable in finite samples than the usual likelihood ratio procedure. An empirical application is also presented and discussed. (C) 2009 Elsevier B.V. All rights reserved.
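The building blocks of such a test are the Dirichlet log-likelihood and the (unadjusted) likelihood ratio statistic; Skovgaard's adjustment then multiplies the latter by a data-dependent correction factor, which is omitted here. A minimal sketch under those assumptions (function names illustrative):

```python
import numpy as np
from scipy.special import gammaln

def dirichlet_loglik(alpha, Y):
    """Log-likelihood of compositional rows of Y (strictly positive,
    summing to one) under Dirichlet distributions whose concentration
    parameters alpha may vary by observation (same shape as Y)."""
    alpha = np.broadcast_to(np.asarray(alpha, float), Y.shape)
    return np.sum(gammaln(alpha.sum(axis=1))
                  - gammaln(alpha).sum(axis=1)
                  + ((alpha - 1.0) * np.log(Y)).sum(axis=1))

def lr_statistic(ll_full, ll_null):
    """Unadjusted likelihood ratio statistic, referred to a chi-squared
    law with df equal to the difference in parameter counts."""
    return 2.0 * (ll_full - ll_null)
```

In a regression setting, alpha would be linked to covariates through the estimated coefficients, with the full and null model log-likelihoods maximised separately before forming the statistic.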
Abstract:
This thesis consists of a summary and five self-contained papers addressing the dynamics of firms in the Swedish wholesale trade sector. Paper [1] focuses upon the determinants of new firm formation in the Swedish wholesale trade sector, using two definitions of firms' relevant markets: markets defined as administrative areas, and markets based on cost-minimizing behavior of retailers. The paper shows that newly entering firms tend to avoid regions with an already high concentration of other firms in the same branch of wholesaling, while right-of-the-center local government and quality of the infrastructure have positive impacts upon the entry of new firms. The signs of the estimated coefficients remain the same regardless of which definition of the relevant market is used, while the size of the coefficients is generally larger once relevant markets delineated on the cost-minimizing assumption of retailers are used. Paper [2] analyses the determinants of firm relocation, distinguishing between the role of factors in in-migration municipalities and out-migration municipalities. The results of the analysis indicate that firm-specific factors, such as the profits, age and size of the firm, are negatively related to the firm's decision to relocate. Furthermore, firms seem to avoid municipalities with an already high concentration of firms operating in the same industrial branch of wholesaling, and also to be more reluctant to leave municipalities governed by right-of-the-center parties. Lastly, firms seem to avoid moving to municipalities characterized by high population density. Paper [3] addresses the determinants of firm growth, adopting OLS and a quantile regression technique. The results of this paper indicate that very little of firm growth can be explained by the firm-, industry- and region-specific factors controlled for in the estimated models. Instead, firm growth seems to be driven by internal characteristics of firms, factors difficult to capture in conventional statistics. This result supports Penrose's (1959) suggestion that internal resources such as firm culture, brand loyalty, entrepreneurial skills, and so on, are important determinants of firm growth rates. Paper [4] formulates a forecasting model for firm entry into local markets and tests this model using data from the Swedish wholesale industry. The empirical analysis is based on directly estimating the profit function of wholesale firms and identifying low- and high-return local markets. The results indicate that 19 of the 30 estimated models show more net entry in high-return municipalities, but the estimated parameter is statistically significant at conventional levels in only one of the estimated models, and then with an unexpected negative sign. Paper [5] studies the effects of relocation on the profits of relocating firms, employing difference-in-differences propensity score matching. Propensity score matching balances the pre-relocation differences between relocating and non-relocating firms, while the difference-in-differences estimator controls for all time-invariant unobserved heterogeneity among firms. The results suggest that firms that relocate increase their profits significantly in comparison with what their profits would have been had they not relocated. This effect is estimated at between 3 and 11 percentage points, depending on the length of the analyzed period.
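The estimator in Paper [5] combines two standard steps: match each treated (relocating) firm to a similar control on the propensity score, then difference the pre/post outcome changes across matched pairs. A stripped-down sketch, assuming a pre-estimated propensity score and 1-nearest-neighbour matching with replacement (details the thesis may implement differently):

```python
import numpy as np

def did_psm(score, treated, y_pre, y_post):
    """Difference-in-differences over 1-nearest-neighbour
    propensity-score-matched pairs (matching with replacement).

    score   : estimated propensity scores, shape (n,)
    treated : boolean indicator of relocating firms, shape (n,)
    y_pre, y_post : outcome (e.g. profit) before/after the relocation window."""
    treated = np.asarray(treated, bool)
    t_idx = np.flatnonzero(treated)
    c_idx = np.flatnonzero(~treated)
    # nearest control on the propensity score for each treated firm
    gaps = np.abs(score[c_idx][None, :] - score[t_idx][:, None])
    match = c_idx[gaps.argmin(axis=1)]
    # average treated change minus average matched-control change
    return ((y_post[t_idx] - y_pre[t_idx]) - (y_post[match] - y_pre[match])).mean()
```

Differencing within firms removes time-invariant unobserved heterogeneity, while the matching step removes selection on the observables entering the score.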
Abstract:
A new test method based on multi-pass scratch testing has been developed for evaluating the mechanical and tribological properties of thin, hard coatings. The proposed method uses a pin-on-disc tribometer in which a Rockwell C diamond stylus serves as the “pin” and is loaded against the rotating coated sample. The influence of normal load on the number of cycles to coating damage is investigated, and the resulting coating damage mechanisms are evaluated by post-test scanning electron microscopy. The present study demonstrates the method by evaluating the performance of Ti0.86Si0.14N, Ti0.34Al0.66N, and (Al0.7Cr0.3)2O3 coatings deposited by cathodic arc evaporation on cemented carbide inserts. The results show that the test method is quick, simple, and reproducible and can be used to obtain relevant data concerning the fatigue, wear, chipping, and spalling characteristics of different coating-substrate composites. The method is virtually nondestructive and can, for example, be used to evaluate the fatigue and wear resistance as well as the cohesive and adhesive interfacial strength of coated cemented carbide inserts prior to cutting tests.
Abstract:
Is private money feasible and desirable? In its absence, is there a central bank policy that partially or fully substitutes for private money? In this paper, some recent modeling ideas about how to address these questions are reviewed and applied. The main ideas are that people cannot commit to future actions and that their histories are to some extent unknown - are not common knowledge. Under the additional assumption that the private monies issued by different people are distinct - a strong recognizability assumption - it is shown that there is a role for private money.
Abstract:
Multi-factor models constitute a useful tool for explaining cross-sectional covariance in equity returns. We propose the use of irregularly spaced returns in multi-factor model estimation and provide an empirical example with the 389 most liquid equities in the Brazilian market. The market index proves significant in explaining equity returns, while the US$/Brazilian Real exchange rate and the Brazilian standard interest rate do not. This example shows the usefulness of the estimation method for subsequently using the model to fill in missing values and to provide interval forecasts.
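A simple way to see the mechanics is to treat irregular spacing as per-asset gaps in an otherwise common date grid and fit each asset's factor loadings only on the dates where it traded. This is a deliberate simplification of the paper's estimation method, offered only as a sketch; names and shapes are assumptions:

```python
import numpy as np

def factor_betas(returns, factors):
    """Per-asset OLS factor loadings when return histories have gaps.

    returns : (n_dates x n_assets) array, NaN where an asset did not trade
    factors : (n_dates x n_factors) array of factor returns
    Returns an (n_assets x (1 + n_factors)) array: intercept, then loadings."""
    X = np.column_stack([np.ones(len(factors)), factors])  # intercept + factors
    betas = np.full((returns.shape[1], X.shape[1]), np.nan)
    for j in range(returns.shape[1]):
        mask = ~np.isnan(returns[:, j])       # dates where asset j traded
        if mask.sum() >= X.shape[1]:          # enough observations to fit
            betas[j], *_ = np.linalg.lstsq(X[mask], returns[mask, j], rcond=None)
    return betas
```

The fitted loadings can then be combined with the (fully observed) factor series to impute the missing return dates, the "fill in missing values" use mentioned above.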
Abstract:
Against the background of sustainable development and climate change mitigation, public policies are being designed to reverse the growing degradation of natural ecosystems, allowing new forms of cooperation at the global interface. Recent trends in governance indicate that the focus has shifted from intergovernmental activities to multi-sector initiatives, from national-level governance to governance at multiple international levels, and from formal, legalistic procedures to a more informal, participatory and integrated approach, with global public policy networks emerging as a possible component of this new structure. Brazilian actors are increasingly joining these global policy networks aimed at reducing climate change through their clean development projects and policies, indicating that structural and relational models of this kind can be considered viable instruments of global governance when the issue is minimizing the environmental risks that threaten the planet. Accordingly, the objective of this study was to verify the institutionalization of the global public policy network for climate change mitigation among the Brazilian actors involved in policies to reduce and/or offset greenhouse gas emissions. To this end, a bibliographic review of the topic was carried out, together with empirical research involving Brazilian actors from the public sector, the private sector and non-governmental organizations engaged in the global public policy network. The results showed that, of the elements analysed to verify the institutionalization of the network among Brazilian actors, only some pointed to the formation of this structure. An attempt to institutionalize the network was observed; however, much remains to be developed before it is fully institutionalized.