972 results for empirical economics


Relevance: 20.00%

Abstract:

EEG recordings are often contaminated with ocular artifacts such as eye blinks and eye movements. These artifacts may obscure underlying brain activity in the electroencephalogram (EEG) data and make analysis of the data difficult. In this paper, we explore the use of an empirical mode decomposition (EMD) based filtering technique to correct eye blink and eye movement artifacts in single-channel EEG data. In this method, the single-channel EEG data containing an ocular artifact is segmented such that the artifact in each segment can be treated as a slowly varying trend in the data, and EMD is used to remove the trend. The filtering is done by partial reconstruction from components of the decomposition. The method is completely data dependent and hence adaptive and nonlinear. Experimental results are provided to check the applicability of the method on real EEG data, and the results are quantified using power spectral density (PSD) as a measure. The method has given fairly good results and does not require any prior knowledge of the artifacts or of the EEG data used.
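The segment-and-detrend step can be sketched compactly. A real EMD needs a sifting implementation (the PyEMD package is one option); the numpy sketch below substitutes a moving-average trend estimate for the sum of the slowest IMFs, purely to illustrate the partial-reconstruction idea. The signal, blink shape and window length are invented for illustration.

```python
import numpy as np

def remove_slow_trend(x, win=101):
    """Stand-in for the EMD step: estimate the slowly varying trend (here a
    moving average; in the paper, the sum of the slowest IMFs) and keep only
    the fast components, i.e. a partial reconstruction of the signal."""
    kernel = np.ones(win) / win
    padded = np.pad(x, win // 2, mode="edge")   # pad to preserve length
    trend = np.convolve(padded, kernel, mode="valid")
    return x - trend, trend

# toy "EEG" segment: 12 Hz activity plus a slow, blink-like drift
t = np.linspace(0.0, 2.0, 1000)
fast = 0.5 * np.sin(2 * np.pi * 12 * t)
drift = 3.0 * np.exp(-((t - 1.0) ** 2) / 0.05)
cleaned, trend = remove_slow_trend(fast + drift)
```

The window length plays the role of the segment-dependent trend scale: it must be long relative to the oscillations of interest but short enough to track the artifact.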

Relevance: 20.00%

Abstract:

Inventory management (IM) has a decisive role in enhancing the competitiveness of the manufacturing industry. Therefore, major manufacturing industries follow IM practices with the intention of improving their performance. However, efforts to introduce IM in SMEs are very limited due to lack of initiative, expertise, and financial constraints. This paper aims to provide a guideline for entrepreneurs to enhance their IM performance, as it presents the results of a survey-based study of machine tool Small and Medium Enterprises (SMEs) in Bangalore. Having established the significance of inventory as an input, we probed the relationship between the IM performance and the economic performance of these SMEs. To the extent possible, all factors of production and performance indicators were deliberately considered in pure economic terms. All the economic performance indicators adopted show a positive and significant association with IM performance in SMEs. On the whole, we found that SMEs which are IM efficient are likely to perform better on the economic front as well and to experience higher returns to scale.

Relevance: 20.00%

Abstract:

The 17th Biennial Conference of the International Institute of Fisheries Economics and Trade (IIFET) was held in Brisbane in July 2014. IIFET is the principal international association for fisheries economics, and its biennial conference is an opportunity for the best fisheries economists in the world to meet and share their ideas. The conference was organised by CSIRO, QUT, UTAS, the University of Adelaide and KG Kailis Ltd, and this was the first time the conference had been held in Australia. The conference covered a wide range of topics of relevance to Australia: it included studies of fishery management systems around the world, identified key issues in aquaculture and marine biodiversity conservation, and provided a forum for presenting new modelling and theoretical approaches to analysing fisheries problems. The theme of the conference was Towards Ecosystem Based Management of Fisheries: What Role can Economics Play? Several sessions were dedicated to modelling socio-ecological systems, and two keynote speakers were invited to present the latest thinking in the area. In this report, the key features of the conference are outlined.

Relevance: 20.00%

Abstract:

The electrical conduction in insulating materials is a complex process, and several theories have been suggested in the literature. Many phenomenological empirical models are in use in the DC cable literature. However, the impact of using different models for cable insulation has not been investigated until now, beyond claims of relative accuracy. The steady-state electric field in DC cable insulation is known to be a strong function of the DC conductivity. The DC conductivity, in turn, is a complex function of electric field and temperature. As a result, under certain conditions, the stress at the cable screen is higher than that at the conductor boundary. This paper presents detailed investigations of different empirical conductivity models suggested in the literature for HV DC cable applications. It is expressly shown that certain models give rise to erroneous results in electric field and temperature computations. It is pointed out that the use of these models in the design or evaluation of cables will lead to errors.
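To see how a conductivity model of this kind produces stress inversion, consider one widely used empirical form, sigma(T, E) = sigma0 * exp(alpha*T) * exp(beta*|E|). The sketch below solves the steady-state field in a coaxial insulation by damped fixed-point iteration; all numerical values (geometry, voltage, temperatures, coefficients) are hypothetical and chosen only for illustration.

```python
import numpy as np

# All numerical values here are hypothetical, for illustration only.
r_in, r_out = 0.010, 0.020          # insulation inner/outer radius [m]
U = 150e3                           # applied DC voltage [V]
T_in, T_out = 70.0, 40.0            # conductor / screen temperature [C]
alpha, beta = 0.10, 1.0e-7          # temperature [1/C] and field [m/V] coefficients
sigma0 = 1e-16                      # reference conductivity [S/m]

r = np.linspace(r_in, r_out, 400)
# steady-state radial temperature profile across a cylindrical shell
T = T_out + (T_in - T_out) * np.log(r_out / r) / np.log(r_out / r_in)

def sigma(E, T):
    # an often-used empirical form: sigma0 * exp(alpha*T) * exp(beta*|E|)
    return sigma0 * np.exp(alpha * T) * np.exp(beta * np.abs(E))

# In steady state the leakage current per unit length is constant, so the
# current density falls off as 1/r and E(r) = J(r) / sigma(E(r), T(r)).
# Solve by damped fixed-point iteration, rescaling so that integral(E dr) = U.
E = np.full_like(r, U / (r_out - r_in))     # uniform initial guess
dr = r[1] - r[0]
for _ in range(300):
    E_new = (1.0 / r) / sigma(E, T)
    E_new *= U / (np.sum(0.5 * (E_new[:-1] + E_new[1:])) * dr)
    E = 0.7 * E + 0.3 * E_new               # damping for stability

# With this temperature drop the stress inverts: the field at the screen
# (r_out) exceeds the field at the conductor (r_in).
```

Swapping in a different empirical sigma(T, E) changes the computed profile directly, which is exactly why the choice of model matters for design calculations.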

Relevance: 20.00%

Abstract:

A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both the Gaussian and the Student's t distribution. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis, and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data-generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
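For contrast, the standard moment-based baseline that the score-based test is compared against can be illustrated in a few lines: split the sample and compare the two half-sample correlations through Fisher's z-transform. This is not the DCS score test itself, and the data below are synthetic.

```python
import math
import random

# Synthetic pair of series whose correlation shifts from 0.2 to 0.7 at
# mid-sample; a naive moment-based check compares the two half-sample
# correlations via Fisher's z-transform.
random.seed(3)
n = 2000
x, y = [], []
for t in range(n):
    rho = 0.2 if t < n // 2 else 0.7
    a, b = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    x.append(a)
    y.append(rho * a + math.sqrt(1.0 - rho * rho) * b)

def corr(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((p - mu) * (q - mv) for p, q in zip(u, v))
    su = math.sqrt(sum((p - mu) ** 2 for p in u))
    sv = math.sqrt(sum((q - mv) ** 2 for q in v))
    return cov / (su * sv)

h = n // 2
r1, r2 = corr(x[:h], y[:h]), corr(x[h:], y[h:])

def fisher(r):
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

# two-sample z-statistic for H0: correlation is constant across the halves
z = (fisher(r2) - fisher(r1)) / math.sqrt(2.0 / (h - 3))
```

A test of this kind ignores the level of correlation under the null and the volatility dynamics of the individual series, which is the information the score-based statistic exploits.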

Relevance: 20.00%

Abstract:

Supercritical processes have gained importance in recent years in food, environmental and pharmaceutical product processing. The design of any supercritical process needs accurate experimental data on the solubilities of solids in supercritical fluids (SCFs). Empirical equations are quite successful in correlating the solubilities of solid compounds in SCFs, both in the presence and in the absence of cosolvents. In this work, existing solvate complex models are discussed and a new set of empirical equations is proposed. These equations correlate the solubilities of solids in supercritical carbon dioxide (both in the presence and in the absence of cosolvents) as a function of temperature, the density of supercritical carbon dioxide and the mole fraction of cosolvent. The accuracy of the proposed models was evaluated by correlating 15 binary and 18 ternary systems. The proposed models provided the best overall correlations. © 2009 Elsevier B.V. All rights reserved.
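A density-based empirical correlation of this general kind can be illustrated with a Chrastil-type form, ln y = A + B/T + C*ln(rho) + D*x_cos. This is a stand-in for the paper's proposed equations, and the coefficients and data ranges below are synthetic.

```python
import numpy as np

# Synthetic illustration of a Chrastil-type density-based correlation:
#   ln y = A + B/T + C*ln(rho) + D*x_cos
# The "true" coefficients and the data ranges below are invented.
rng = np.random.default_rng(0)
n = 60
T = rng.uniform(308.0, 338.0, n)         # temperature [K]
rho = rng.uniform(400.0, 900.0, n)       # CO2 density [kg/m^3]
x_cos = rng.uniform(0.0, 0.05, n)        # cosolvent mole fraction
true = np.array([-30.0, 2000.0, 3.5, 20.0])
ln_y = true[0] + true[1] / T + true[2] * np.log(rho) + true[3] * x_cos
ln_y += rng.normal(0.0, 0.02, n)         # measurement noise

# The correlation is linear in its coefficients, so a least-squares fit
# recovers them directly from "experimental" solubility data.
X = np.column_stack([np.ones(n), 1.0 / T, np.log(rho), x_cos])
coef, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
# average absolute relative deviation of the correlated solubilities
aard = np.mean(np.abs(np.exp(X @ coef - ln_y) - 1.0))
```

The average absolute relative deviation (AARD) computed at the end is the usual figure of merit reported when comparing such correlations across binary and ternary systems.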

Relevance: 20.00%

Abstract:

This paper deals with the development of simplified semi-empirical relations for predicting the residual velocities of small-calibre projectiles impacting mild steel target plates, normally or at an angle, and the ballistic limits of such plates. It is shown, for several impact cases for which test results on the perforation of mild steel plates are available, that most of the existing semi-empirical relations, which are applicable only to normal projectile impact, do not yield satisfactory estimates of residual velocity. Furthermore, it is difficult to quantify some of the empirical parameters present in these relations for a given problem. With an eye towards simplicity and ease of use, two new regression-based relations employing standard material parameters are presented here for predicting residual velocity and ballistic limit for both normal and oblique impact. The two expressions differ in their use of quasi-static versus strain-rate-dependent average plate material strength. Residual velocities yielded by the present semi-empirical models compare well with the experimental results. Additionally, ballistic limits from these relations show close correlation with the corresponding finite element-based predictions.
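As an illustration of the general semi-empirical approach (not the paper's own regression relations), the classic Lambert-Jonas form for residual velocity can be written down directly; the obliquity correction and its exponent below are made-up assumptions for the sketch.

```python
import math

def residual_velocity(v_i, v_bl, a=1.0, p=2.0):
    """Lambert-Jonas-type semi-empirical relation (a standard form from the
    perforation literature, not the relations proposed in the paper):
    v_r = a * (v_i**p - v_bl**p)**(1/p) for v_i > v_bl, else 0."""
    if v_i <= v_bl:
        return 0.0
    return a * (v_i ** p - v_bl ** p) ** (1.0 / p)

def oblique_ballistic_limit(v_bl_normal, obliquity_deg, k=0.5):
    """Hypothetical obliquity correction: raise the normal-incidence
    ballistic limit as the line-of-sight plate thickness grows; the
    exponent k is an assumed illustrative value, not a fitted constant."""
    return v_bl_normal / math.cos(math.radians(obliquity_deg)) ** k

# example: plate with a 300 m/s normal ballistic limit, struck at 30 degrees
v_bl = oblique_ballistic_limit(300.0, 30.0)
v_r = residual_velocity(800.0, v_bl)
```

In relations of this family the constants a and p are fitted to perforation tests, which is where the difficulty of quantifying empirical parameters noted above arises.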

Relevance: 20.00%

Abstract:

This paper uses original survey data on victims of the Great East Japan earthquake disaster to examine their decision to apply for temporary housing as well as the timing of the application. We assess the effects of victims’ attachment to their locality as well as variation in victims’ information-seeking behaviour. We additionally consider various factors, such as income, age, employment and family structure, that are generally considered to affect the decision to choose temporary housing as a solution to displacement. Empirical results indicate that, ceteris paribus, as the degree of attachment increases, victims are more likely to apply for temporary housing, but attachment does not affect the timing of the application. On the other hand, victims who actively seek information and are able to collect higher-quality information are less likely to apply for temporary housing, and if they do apply, they apply relatively late.

Relevance: 20.00%

Abstract:

This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models of compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average error in prediction) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
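The modeling idea can be sketched end-to-end: measure a few flag configurations, fit a regression model with interaction terms, and then search the full flag space on the model. Plain least squares stands in for the splines and RBF networks evaluated in the paper, and the "measurements" below are a synthetic stand-in for actual simulation runs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_flags = 6
train = rng.integers(0, 2, size=(40, n_flags)).astype(float)

def measure(cfg):
    # invented stand-in for a simulation run; flags 0 and 2 interact
    # destructively in this toy runtime model (lower is better)
    gains = np.array([5.0, 3.0, 8.0, 1.0, 2.0, 4.0])
    return 100.0 - cfg @ gains + 6.0 * cfg[..., 0] * cfg[..., 2]

y = measure(train) + rng.normal(0.0, 0.1, len(train))

def features(cfg):
    # intercept, main effects, and all pairwise flag interactions
    pairs = [cfg[..., i] * cfg[..., j]
             for i in range(n_flags) for j in range(i + 1, n_flags)]
    return np.column_stack([np.ones(cfg.shape[0]), cfg, *pairs])

coef, *_ = np.linalg.lstsq(features(train), y, rcond=None)

# exhaustive model-based search over all 2^6 flag settings
grid = np.array([[(k >> i) & 1 for i in range(n_flags)]
                 for k in range(2 ** n_flags)], dtype=float)
best = grid[np.argmin(features(grid) @ coef)]
```

The fitted pairwise coefficients directly quantify flag interactions, which is the point of item b) above, and the grid search mirrors item c) at toy scale.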

Relevance: 20.00%

Abstract:

Objective(s): To describe how doctors define and use the terms “futility” and “futile treatment” in end-of-life care. Design, Setting, Participants: A qualitative study using semi-structured interviews with 96 doctors across a range of specialties who treat adults at the end of life. Doctors were recruited from three large Australian teaching hospitals and were interviewed from May to July 2013. Results: Doctors’ conceptions of futility focused on the quality and chance of patient benefit. Aspects of benefit included physiological effect, weighing benefits and burdens, and quantity and quality of life. Quality and length of life were linked, but many doctors discussed instances when benefit was determined by quality of life alone. Most doctors described the assessment of the chance of success in achieving patient benefit as a subjective exercise. Despite a broad conceptual consensus about what futility means, doctors noted variability in how the concept was applied in clinical decision-making. Over half the doctors also identified treatment that is futile but nevertheless justified, such as short-term treatment as part of supporting the family of a dying person. Conclusions: There is an overwhelming preference for a qualitative approach to assessing futility, which brings with it variation in clinical decision-making. “Patient benefit” is at the heart of doctors’ definitions of futility. Determining patient benefit requires discussions with patients and families about their values and goals as well as the burdens and benefits of further treatment.

Relevance: 20.00%

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. -- In Chapter 2 a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it. 
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
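The basic construction of a quantile residual is simple: evaluate the fitted model's CDF at each observation and map the result through the standard normal inverse CDF. A minimal sketch with a toy exponential model (the true model is used, so the residuals should be close to N(0, 1)):

```python
import math
import random
from statistics import NormalDist

def quantile_residuals(y, cdf):
    """Map each observation through the fitted model's CDF, then through the
    standard normal inverse CDF. Under a correctly specified model the
    residuals are approximately i.i.d. N(0, 1)."""
    nd = NormalDist()
    eps = 1e-12  # guard against u = 0 or u = 1
    return [nd.inv_cdf(min(max(cdf(v), eps), 1.0 - eps)) for v in y]

# toy check: data genuinely from Exp(1), evaluated under the true model CDF
random.seed(2)
y = [random.expovariate(1.0) for _ in range(2000)]
r = quantile_residuals(y, lambda v: 1.0 - math.exp(-v))
mean = sum(r) / len(r)
var = sum((x - mean) ** 2 for x in r) / len(r)
```

In the thesis the CDF comes from an estimated model (including mixture models, where Pearson's residuals fail), and the test statistics built on these residuals additionally account for the estimation uncertainty, which this sketch does not.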

Relevance: 20.00%

Abstract:

The thesis examines the intensification and characteristics of a policy emphasising economic competitiveness in Finland during the 1990s and early 2000s. This accentuation of economic objectives is studied at the level of national policy-making as well as at the regional level, through the policies and strategies of cities and three universities in the Helsinki region. By combining the analysis of state policies, urban strategies and university activities, the study illustrates the pervasiveness of the objective of economic competitiveness and growth across these levels and sheds light on the features and contradictions of these policies on a broad scale. The thesis is composed of five research articles and a summary article. At the level of national policies, the central focus of the thesis is on the growing role of science and technology policy as a state means to promote structural economic change, and on its transformation towards a broader, yet ambivalent, concept of innovation policy. This shift brings forward a tension between an increasing emphasis on economic aspects – innovations and competitiveness – and the expanding scope of issues across a wide range of policy sectors that are being subsumed under this market- and economy-oriented framework. Related to science and technology policy, attention is paid to adjustments in university policy, in which there has been increasing pressure for efficiency, rationalisation and commercialisation of academic activities. Furthermore, political efforts to build an information society through the application of information and communication technologies are analysed, with particular attention to the balance between economic and social objectives. Finally, changes in state regional policy priorities and the tendency towards competitiveness are addressed.
At the regional level, the focus of the thesis is on the policies of the cities in Finland’s capital region as well as the strategies of three universities operating in the region, namely the University of Helsinki, Helsinki University of Technology and the Helsinki School of Economics. As regards the urban level, the main focus is on the changes and characteristics of the urban economic development policy of the City of Helsinki. With respect to the universities, the thesis examines their attempts to commercialise research and thus bring academic research closer to economic interests, and pays particular attention to the contradictions of commercialisation. Related to the universities, the activities of three intermediary organisations that the universities have established in order to increase cooperation with industry are analysed. These organisations are the Helsinki Science Park, the Otaniemi International Innovation Centre and LTT Research Ltd. The summary article provides a synthesis of the material presented in the five original articles and relates their results to a broader discussion concerning the emergence of competition states and entrepreneurial cities and regions. The main points of reference are Bob Jessop’s and Neil Brenner’s theses on state and urban-regional restructuring. The empirical results and considerations from Finland and the Helsinki region are used to comment on, specify and criticise specific parts of the two theses.

Relevance: 20.00%

Abstract:

Agriculture’s contribution to climate change is controversial, as agriculture is a significant source of greenhouse gases but also a carbon sink. Hence its economic and technological potential to mitigate climate change has been argued to be noteworthy. However, the social profitability of emission mitigation depends not only on the emission reductions themselves but also on co-effects such as impacts on surface water quality and profits from production. Consequently, to value the overall results of agricultural climate emission mitigation practices, these environmental and economic co-effects should be taken into account. The objective of this thesis was to develop an integrated economic and ecological model to analyse the social welfare of crop cultivation in Finland under two distinct cultivation technologies, conventional tillage and conservation tillage (no-till). Further, we ask whether it would be privately or socially profitable to allocate some barley cultivation to alternative land uses, such as green set-aside or afforestation, when production costs, greenhouse gases and water quality impacts are taken into account. In the theoretical framework we depict the optimal input use and land allocation choices in terms of environmental impacts and profit from production, and derive the optimal tax and payment policies for climate- and water-quality-friendly land allocation. The empirical application of the model uses Finnish data on production costs, profit structure and environmental impacts. According to our results, the given emission mitigation practices are not self-evidently beneficial for farmers or society. On the contrary, in some cases alternative land allocation could even reduce social welfare, favouring conventional crop cultivation. This is the case for mineral soils such as clay and silt soils. On organic agricultural soils, climate mitigation practices, in this case afforestation and green fallow, give more promising results, decreasing climate emissions and nutrient runoff to water systems. 
No-till technology does not seem to benefit climate mitigation, although it does decrease other environmental impacts. Nevertheless, the data on how climate emission mitigation practices affect production and climate are limited and partly contradictory. More specific experimental studies on the interaction of emission mitigation practices and the environment would be needed. In further study, area-specific production and environmental factors, as well as food security, food safety and socio-economic impacts, should be taken into account.

Relevance: 20.00%

Abstract:

A careful comparison of the distribution in the (R, θ)-plane of all NH ... O hydrogen bonds with that for bonds between neutral NH and neutral C=O groups indicated that the latter has a larger mean R and a wider range of θ, and that the distribution is also broader than for the average case. Therefore, the potential function developed earlier for an average NH ... O hydrogen bond was modified to suit the peptide case. A three-parameter expression of the form Vhb = Vmin + p1△² + p3△³ + q1θ², with △ = R - Rmin, was found to be satisfactory. By comparing the theoretically expected distribution in R and θ with observed data (although limited), the best values were found to be p1 = 25, p3 = -2 and q1 = 1 × 10⁻³, with Rmin = 2·95 Å and Vmin = -4·5 kcal/mole. The procedure for obtaining a smooth transition from Vhb to the non-bonded potential Vnb for large R and θ is described, along with a flow chart useful for programming the formulae. Calculated values of ΔH, the enthalpy of formation of the hydrogen bond, using this function are in reasonable agreement with observation. When the atoms involved in the hydrogen bond occur in a five-membered ring (as in the sequence shown in the original figure), a different formula for the potential function is needed, which is of the form Vhb = Vmin + p1△² + q1x², where x = θ - 50° for θ ≥ 50°, with p1 = 15, q1 = 0·002, Rmin = 2· Å and Vmin = -2·5 kcal/mole. © 1971 Indian Academy of Sciences.
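Assuming the three-parameter form implied by the listed constants, Vhb = Vmin + p1Δ² + p3Δ³ + q1θ² (with Δ in Å, θ in degrees and V in kcal/mole; the exact form is inferred here, not quoted from the paper), the potential is straightforward to evaluate:

```python
# Sketch of the modified peptide NH...O hydrogen-bond potential, assuming
# the three-parameter form inferred from the quoted constants:
#   V_hb = V_min + p1*delta**2 + p3*delta**3 + q1*theta**2
p1, p3, q1 = 25.0, -2.0, 1e-3     # quoted best-fit parameters
R_min, V_min = 2.95, -4.5          # quoted minimum position and depth

def V_hb(R, theta):
    d = R - R_min
    return V_min + p1 * d ** 2 + p3 * d ** 3 + q1 * theta ** 2

# the minimum lies at R = R_min, theta = 0, where V_hb = V_min
```

The cubic term only slightly skews the well near the minimum; in a full program this form would be smoothly switched to the non-bonded potential at large R and θ, as described above.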

Relevance: 20.00%

Abstract:

Governance has been one of the most popular buzzwords in recent political science. As with any term shared by numerous fields of research, as well as everyday language, governance is encumbered by a jungle of definitions and applications. This work elaborates on the concept of network governance. Network governance refers to complex policy-making situations where a variety of public and private actors collaborate to produce and define policy; governance consists of processes in which autonomous, self-organizing networks of organizations exchange information and deliberate. Network governance is a theoretical concept that corresponds to an empirical phenomenon, and this phenomenon is often used to describe a historical development: governance frequently refers to changes in the political processes of Western societies since the 1980s. In this work, empirical governance networks are used as an organizing framework, and the concepts of autonomy, self-organization and network structure are developed as tools for the empirical analysis of any complex decision-making process. This work develops this framework and explores governance networks in the case of environmental policy-making in the City of Helsinki, Finland. The crafting of a local ecological sustainability programme required support and knowledge from all sectors of administration, a number of entrepreneurs and companies, and the inhabitants of Helsinki. The policy process relied explicitly on networking, with public and private actors collaborating to design policy instruments. Communication between individual organizations led to the development of network structures and patterns. This research analyses these patterns and their effects on policy choice by applying the methods of social network analysis. A variety of social network analysis methods are used to uncover different features of the networked process. 
Links between individual network positions, network subgroup structures and macro-level network patterns are compared to the types of organizations involved and final policy instruments chosen. By using governance concepts to depict a policy process, the work aims to assess whether they contribute to models of policy-making. The conclusion is that the governance literature sheds light on events that would otherwise go unnoticed, or whose conceptualization would remain atheoretical. The framework of network governance should be in the toolkit of the policy analyst.
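The kind of measures such an analysis relies on can be illustrated on a toy collaboration network; the actors and ties below are invented for illustration, not the Helsinki data.

```python
# Toy collaboration network illustrating two basic social-network-analysis
# measures: degree centrality and network density.
edges = [("city", "ngo"), ("city", "firm"), ("city", "university"),
         ("ngo", "residents"), ("firm", "university")]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

# degree centrality: share of the other actors an organization is tied to
centrality = {n: len(adj[n]) / (len(nodes) - 1) for n in nodes}
# density: realized ties as a fraction of all possible ties
density = 2 * len(edges) / (len(nodes) * (len(nodes) - 1))
```

Individual positions (centrality), subgroup structures and whole-network patterns (density and beyond) correspond to the three levels of comparison described above.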