18 results for VARIABLE SEPARATION APPROACH

in Aston University Research Archive


Relevance:

90.00%

Publisher:

Abstract:

This thesis is developed from a real-life application: performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. It presents two main methodological developments for evaluating the impact of dichotomous environmental variables on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981). The revised frontier separation approach uses nearest-neighbour propensity score matching to pair treated SMEs with their counterfactuals on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environmental variable while accounting for the self-selection problem of impact evaluation. Monte Carlo-style simulations were built to examine the effectiveness of these developments. The methodological developments are applied in empirical studies evaluating the impact of training programmes on the performance of food processing SMEs, and the impact of exporting on the technical efficiency of textile and garment SMEs, in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, it confirms the conclusion of the export literature that exporters self-select into the sector. The thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs; however, a large bias is eliminated by the proposed approach.
The results of the empirical studies contribute to understanding the impact of different environmental variables on the performance of SMEs, and help policy makers design proper policies to support the development of Vietnamese SMEs.
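The nearest-neighbour matching step described above can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: the propensity scores, the SME identifiers, and the choice of matching with replacement are all hypothetical.

```python
def nearest_neighbour_match(treated, controls):
    """Pair each treated unit with the control whose propensity
    score is closest (this sketch matches with replacement)."""
    pairs = []
    for t_id, t_score in treated.items():
        c_id = min(controls, key=lambda c: abs(controls[c] - t_score))
        pairs.append((t_id, c_id))
    return pairs

# Hypothetical propensity scores (e.g. from a logit of treatment
# status on observed SME characteristics).
treated = {"sme_1": 0.72, "sme_2": 0.31}
controls = {"sme_3": 0.70, "sme_4": 0.33, "sme_5": 0.90}

print(nearest_neighbour_match(treated, controls))
# → [('sme_1', 'sme_3'), ('sme_2', 'sme_4')]
```

Efficiency comparisons between treated SMEs and their matched counterfactuals can then be made on the paired sample rather than the full, selection-biased one.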

Relevance:

40.00%

Publisher:

Abstract:

The importance of informal institutions, and in particular culture, for entrepreneurship is a subject of ongoing interest. Past research has mostly concentrated on cross-national comparisons, cultural values, and the direct effects of culture on entrepreneurial behavior, but in the main has found inconsistent results. The present research adds a fresh perspective to this research stream by turning attention to community-level culture and cultural norms. We hypothesize indirect effects of cultural norms on venture emergence: specifically, that community-level cultural norms (performance-based culture and socially-supportive institutional norms) affect important supply-side variables (entrepreneurial self-efficacy and entrepreneurial motivation), which in turn influence nascent entrepreneurs' success in creating operational ventures (venture emergence). We test our predictions on a unique longitudinal data set (PSED II) tracking nascent entrepreneurs' venture creation efforts over a five-year span and find evidence supporting them. Our research contributes to a more fine-grained understanding of how culture, in particular perceptions of community cultural norms, influences venture emergence. It highlights the embeddedness of entrepreneurial behavior, and its immediate antecedent beliefs, in the local community context.

Relevance:

30.00%

Publisher:

Abstract:

Visualization has proven to be a powerful and widely applicable tool for the analysis and interpretation of data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space, it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and sub-clusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach first on a toy data set, and then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multi-phase flows in oil pipelines, and to data in 36 dimensions derived from satellite images.
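The drill-down idea (one top-level view, then per-cluster views at deeper levels) can be illustrated with a deliberately simplified stand-in: plain PCA projections instead of the paper's hierarchical mixture of latent variable models fitted by EM. The synthetic data and the crude one-dimensional split are hypothetical.

```python
import numpy as np

def pca_2d(X):
    """Project centred data onto its top two principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

rng = np.random.default_rng(0)
# Hypothetical 12-dimensional data containing two clusters.
X = np.vstack([rng.normal(0.0, 1.0, (50, 12)),
               rng.normal(4.0, 1.0, (50, 12))])

top = pca_2d(X)                        # top-level view of the whole set
labels = (top[:, 0] > 0).astype(int)   # crude split into two sub-clusters
subviews = [pca_2d(X[labels == k]) for k in (0, 1)]  # deeper-level views
```

Each sub-cluster gets its own projection, so structure hidden in the single top-level view can re-emerge at the lower level.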

Relevance:

30.00%

Publisher:

Abstract:

The Thouless-Anderson-Palmer (TAP) approach was originally developed for analysing the Sherrington-Kirkpatrick model in the study of spin glasses, and has since been employed mainly in the context of extensively connected systems, whereby each dynamical variable interacts weakly with the others. Recently, we extended this method to handle general intensively connected systems, where each variable has only O(1) connections characterised by strong couplings. However, the new formulation looks quite different from existing analyses, and it is natural to ask whether it actually reproduces known results for systems of extensive connectivity. In this chapter, we apply our formulation of the TAP approach to an extensively connected system, the Hopfield associative memory model, showing that it produces results identical to those obtained by the conventional formulation.
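As background for the Hopfield associative memory model mentioned above, here is a minimal sketch of Hebbian storage and synchronous recall. It is standard textbook material, not the chapter's TAP analysis; the pattern and network size are hypothetical.

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    P = np.array(patterns, dtype=float)
    N = P.shape[1]
    J = P.T @ P / N
    np.fill_diagonal(J, 0.0)
    return J

def recall(J, state, steps=5):
    """Synchronous sign-updates towards a stored pattern."""
    s = state.astype(float)
    for _ in range(steps):
        s = np.sign(J @ s)
    return s

xi = np.array([1, -1, 1, 1, -1, -1, 1, -1])  # one stored pattern
J = hopfield_weights([xi])
probe = xi.copy()
probe[0] = -probe[0]             # corrupt one bit
print(recall(J, probe))          # recovers the stored pattern xi
```

In the extensively connected regime each coupling J_ij is weak, of order 1/N, which is exactly the setting the conventional TAP formulation was built for.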

Relevance:

30.00%

Publisher:

Abstract:

In many Environmental Information Systems the actual observations arise from a discrete monitoring network which might be rather heterogeneous in both location and types of measurements made. In this paper we describe the architecture and infrastructure for a system, developed as part of the EU FP6 funded INTAMAP project, to provide a service-oriented solution that allows the construction of an interoperable, automatic interpolation system. This system will be based on the Open Geospatial Consortium's Web Feature Service (WFS) standard. The essence of our approach is to extend the GML3.1 observation feature to include information about the sensor using SensorML, and to further extend this to incorporate observation error characteristics. Our extended WFS will accept observations and store them in a database. The observations will be passed to our R-based interpolation server, which will use a range of methods, including a novel sparse, sequential kriging method (only briefly described here), to produce an internal representation of the interpolated field resulting from the observations currently uploaded to the system. The extended WFS will then accept queries such as 'What is the probability distribution of the desired variable at a given point?', 'What is the mean value over a given region?', or 'What is the probability of exceeding a certain threshold at a given location?'. To support information-rich transfer of complex and uncertain predictions we are developing schemas to represent probabilistic results in a GML3.1 (object-property) style. The system will also offer more easily accessible Web Map Service and Web Coverage Service interfaces, allowing users to access the system at the level of complexity they require for their specific application.
Such a system will offer a very valuable contribution to the next generation of Environmental Information Systems in the context of real time mapping for monitoring and security, particularly for systems that employ a service oriented architecture.
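For a Gaussian predictive distribution of the kind kriging produces, the 'probability of exceeding a threshold' query above reduces to a tail probability. A minimal sketch, not INTAMAP code; the mean, standard deviation, and threshold are hypothetical.

```python
import math

def exceedance_probability(mean, sd, threshold):
    """P(X > threshold) for a Gaussian predictive distribution,
    the kind of query an interpolation service might answer."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Hypothetical kriging prediction at one location.
p = exceedance_probability(mean=3.0, sd=1.0, threshold=4.0)
print(p)  # ≈ 0.1587, i.e. 1 - Phi(1)
```

The 'mean value over a region' query is answered analogously by integrating (in practice, averaging) the predictive means over the region.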

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new technique for the investigation of limited-dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with a modern method of classification, or discretisation, of data, can out-perform the more standard approaches employed in economics, such as a probit model. These approaches, and certain inductive decision tree methods, are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, especially, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
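The core VPRS construct, a β-lower approximation of a decision concept, can be sketched in a few lines. This is a hedged illustration only: the equivalence classes, the decision concept, and the β value are hypothetical, and a real application would first discretise the condition attributes.

```python
def vprs_lower_approximation(blocks, target, beta=0.8):
    """Union of equivalence classes whose overlap with the target
    concept is at least beta (classical rough sets use beta = 1)."""
    lower = set()
    for block in blocks:
        overlap = len(block & target) / len(block)
        if overlap >= beta:
            lower |= block
    return lower

# Hypothetical equivalence classes induced by condition attributes,
# and a hypothetical decision concept (e.g. 'merger blocked').
blocks = [{1, 2, 3}, {4, 5}, {6, 7, 8, 9}]
target = {1, 2, 3, 4, 6}
print(vprs_lower_approximation(blocks, target, beta=0.6))
# → {1, 2, 3}: only the first class is at least 60% inside the concept
```

Relaxing β below 1 is what lets VPRS tolerate the noisy, partially inconsistent cases typical of small economic samples.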

Relevance:

30.00%

Publisher:

Abstract:

A detailed study has been made of the feasibility of adsorptive purification of slack waxes from traces of aromatic compounds using type 13X molecular sieves, with the aim of achieving 0.01% aromatics in the product. The limited literature relating to the adsorption of high molecular weight aromatic compounds by zeolites is reviewed. Equilibrium isotherms were determined for typical individual aromatic compounds. Lower molecular weight, or more compact, molecules were preferentially adsorbed, and the number of molecules captured by one unit cell decreased with increasing molecular weight of the adsorbate. An increase in adsorption temperature resulted in a decrease in the adsorption value. The isosteric heat of adsorption of different types of aromatic compounds was determined from pairs of isotherms between 303 K and 343 K at specific coverages. The lowest heats of adsorption were for dodecylbenzene and phenanthrene. Kinetics of adsorption were studied for different aromatic compounds. The diffusivity decreased significantly when a long alkyl chain was attached to the benzene ring, e.g. in dodecylbenzene; molecules with small cross-sectional diameter, e.g. cumene, were adsorbed most rapidly. The sorption rate increased with temperature. Apparent activation energies increased with increasing polarity. In a study of the dynamic adsorption of selected aromatic compounds from binary solutions in isooctane or n-alkanes, naphthalene exhibited the best dynamic properties, followed by dibenzothiophene and finally dodecylbenzene. The dynamic adsorption of naphthalene from different n-alkane solvents increased with a decrease in solvent molecular weight. A tentative mathematical approach is proposed for the prediction of dynamic breakthrough curves from equilibrium isotherms and kinetic data. The dynamic properties of liquid-phase adsorption of aromatics from slack waxes were studied at different temperatures and concentrations. The optimum operating temperature was 543 K.
The best dynamic performance was achieved with feeds of low aromatic content. The studies with individual aromatic compounds demonstrated the affinity of type NaX molecular sieves for aromatics in the concentration range 3-5%. Wax purification by adsorption was considered promising, and extension of the experimental programme was recommended.
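The determination of isosteric heat from pairs of isotherms can be illustrated with the Clausius-Clapeyron relation at fixed coverage. The equilibrium pressures below are hypothetical, chosen only to show the calculation at the temperatures quoted in the study (303 K and 343 K); this is not the study's data.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def isosteric_heat(p1, T1, p2, T2):
    """Clausius-Clapeyron estimate of the isosteric heat of adsorption
    from equilibrium pressures at the same coverage at two temperatures:
    q_st = R * ln(p2/p1) / (1/T1 - 1/T2)."""
    return R * math.log(p2 / p1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical equilibrium pressures (kPa) at a fixed coverage.
q = isosteric_heat(p1=1.2, T1=303.0, p2=3.5, T2=343.0)
print(q)  # q in J/mol; roughly 2.3e4 J/mol for these numbers
```

At fixed coverage the equilibrium pressure rises with temperature, so q comes out positive, consistent with adsorption being exothermic.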

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study is to develop econometric models to better understand the economic factors affecting inbound tourist flows from each of six origin countries that contribute to Hong Kong's international tourism demand. To this end, we test alternative cointegration and error correction approaches to examine the economic determinants of tourist flows to Hong Kong, and to produce accurate econometric forecasts of inbound tourism demand. Our empirical findings show that permanent income is the most significant determinant of tourism demand in all models. The variables of own price, weighted substitute prices, trade volume, the share price index (as an indicator of changes in wealth in origin countries), and a dummy variable representing the Beijing incident (1989) are also found to be important determinants for some origin countries. The average long-run income and own-price elasticities were measured at 2.66 and -1.02, respectively. It was hypothesised that permanent income is a better explanatory variable of long-haul tourism demand than current income. A novel approach (a grid search process) has been used to empirically derive the weights to be attached to the lagged income variable for estimating permanent income. The results indicate that permanent income, estimated with empirically determined, relatively small weighting factors, produced better results than the current income variable in explaining long-haul tourism demand. This finding suggests that the use of current income in previous empirical tourism demand studies may have produced inaccurate results. The share price index, as a measure of wealth, was also found to be significant in two models. Studies of tourism demand rarely include wealth as an explanatory variable in forecasting long-haul tourism demand; however, finding a satisfactory proxy for wealth common to different countries is problematic.
This study indicates that error correction models (ECMs) based on the Engle-Granger (1987) approach produce more accurate forecasts than ECMs based on the Pesaran and Shin (1998) and Johansen (1988, 1991, 1995) approaches for all of the long-haul markets and Japan. Overall, ECMs produce better forecasts than the OLS, ARIMA and naïve models, indicating the superiority of a cointegration approach for tourism demand forecasting. The results show that permanent income is the most important explanatory variable for tourism demand from all countries, but there are substantial variations between countries, with the long-run elasticity ranging between 1.1 for the U.S. and 5.3 for the U.K. Price is the next most important variable, with long-run elasticities ranging between -0.8 for Japan and -1.3 for Germany, and short-run elasticities ranging between -0.14 for Germany and -0.7 for Taiwan. The fastest growing market is Mainland China. The findings have implications for policies and strategies on investment, marketing promotion and pricing.
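The two-step Engle-Granger procedure behind these ECMs can be sketched as follows: first estimate the long-run relationship by OLS, then regress the differenced series on the lagged residual (the error-correction term). This is a generic sketch on simulated, hypothetical series, not the study's models or data.

```python
import numpy as np

def engle_granger_ecm(y, x):
    """Two-step Engle-Granger sketch: (1) long-run OLS of y on x;
    (2) regress dy on dx and the lagged residual (error-correction term)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta                  # long-run equilibrium errors
    dy, dx = np.diff(y), np.diff(x)
    Z = np.column_stack([np.ones_like(dx), dx, resid[:-1]])
    gamma, *_ = np.linalg.lstsq(Z, dy, rcond=None)
    return beta, gamma  # gamma[2] is the error-correction coefficient

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(0.0, 1.0, 200))      # hypothetical income proxy (I(1))
y = 2.0 + 1.5 * x + rng.normal(0.0, 0.5, 200) # cointegrated demand series
beta, gamma = engle_granger_ecm(y, x)
```

With cointegrated series the error-correction coefficient comes out negative, pulling demand back towards its long-run relationship with income after a shock.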

Relevance:

30.00%

Publisher:

Abstract:

Formative measurement has seen increasing acceptance in organizational research since the turn of the 21st century. More recently, however, a number of criticisms of the formative approach have appeared. Such work argues that formatively-measured constructs are empirically ambiguous and thus flawed in a theory-testing context. The aim of the present paper is to examine the underpinnings of formative measurement theory in light of theories of causality and ontology in measurement in general. In doing so, a thesis is advanced which draws a distinction between reflective, formative, and causal theories of latent variables. This distinction is shown to be advantageous in that it clarifies the ontological status of each type of latent variable, and thus provides advice on appropriate conceptualization and application. The distinction also reconciles, in part, both recent supportive and critical perspectives on formative measurement. In light of this, advice is given on how most appropriately to model formative composites in theory-testing applications, placing the onus on the researcher to make clear their conceptualization and operationalization.

Relevance:

30.00%

Publisher:

Abstract:

In a Data Envelopment Analysis model, some of the weights used to compute the efficiency of a unit can have zero or negligible value despite the importance of the corresponding input or output. This paper offers an approach to preventing inputs and outputs from being ignored in the DEA assessment under the multiple-input, multiple-output VRS environment, building on an approach introduced in Allen and Thanassoulis (2004) for single-input, multiple-output CRS cases. The proposed method is based on the idea of introducing unobserved DMUs, created by adjusting the input and output levels of certain observed, relatively efficient DMUs in a manner which reflects a combination of technical information and the decision maker's value judgements. In contrast to many alternative techniques used to constrain weights and/or improve envelopment in DEA, this approach allows one to impose local information on production trade-offs which is in line with the general VRS technology. The suggested procedure is illustrated using real data.
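For intuition about what a DEA assessment measures, here is the degenerate single-input, single-output case, where CCR efficiency reduces to each unit's output/input ratio scaled by the best observed ratio. This is only a toy illustration with hypothetical DMUs, not the paper's VRS model with unobserved DMUs.

```python
def ccr_efficiency(units):
    """Single-input, single-output CCR efficiency: each DMU's
    output/input ratio divided by the best observed ratio."""
    ratios = {d: out / inp for d, (inp, out) in units.items()}
    best = max(ratios.values())
    return {d: r / best for d, r in ratios.items()}

# Hypothetical DMUs: (input, output)
units = {"A": (2.0, 4.0), "B": (3.0, 3.0), "C": (4.0, 8.0)}
print(ccr_efficiency(units))
# → {'A': 1.0, 'B': 0.5, 'C': 1.0}: A and C are efficient, B is not
```

With multiple inputs and outputs the ratios are weighted, and it is precisely those weights that can collapse to zero, which is the problem the paper's unobserved-DMU construction addresses.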

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider analytical and numerical solutions to the Dirichlet boundary-value problem for the biharmonic partial differential equation on a disc of finite radius in the plane. The physical interpretation of these solutions is that of the harmonic oscillations of a thin, clamped plate. For the linear, fourth-order, biharmonic partial differential equation in the plane, it is well known that the method of separation in polar coordinates is not, in general, possible. However, in this paper, for circular domains in the plane, it is shown that a method, here called quasi-separation of variables, does lead to solutions of the partial differential equation. These solutions are products of solutions of two ordinary linear differential equations: a fourth-order radial equation and a second-order angular differential equation. As is to be expected without complete separation of the polar variables, there is some restriction on the range of these solutions in comparison with the corresponding separated solutions of the second-order harmonic differential equation in the plane. Notwithstanding these restrictions, the quasi-separation method leads to solutions of the Dirichlet boundary-value problem on a disc with centre at the origin, with boundary conditions determined by the solution and its inward-drawn normal derivative taking the value 0 on the edge of the disc. One significant feature of these biharmonic boundary-value problems follows from the form of the biharmonic differential expression when represented in polar coordinates: the differential expression has a singularity at the origin in the radial variable. This singularity translates to a singularity at the origin of the fourth-order radial separated equation, which necessitates the application of a third boundary condition in order to determine a self-adjoint solution to the Dirichlet boundary-value problem.
The penultimate section of the paper reports on numerical solutions to the Dirichlet boundary-value problem; these results are also presented graphically. Two specific cases are studied in detail, and numerical values of the eigenvalues are compared with the results obtained in earlier studies.
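The separation obstacle discussed above can be made concrete. This is a hedged reconstruction of standard background, not the paper's derivation. In polar coordinates the biharmonic equation reads

```latex
\nabla^{4} u
  = \left( \frac{\partial^{2}}{\partial r^{2}}
         + \frac{1}{r}\,\frac{\partial}{\partial r}
         + \frac{1}{r^{2}}\,\frac{\partial^{2}}{\partial \theta^{2}} \right)^{2} u
  = 0 ,
```

and a product ansatz u(r, θ) = R(r)Θ(θ) does not reduce this to two uncoupled ordinary differential equations, because squaring the operator produces terms mixing r- and θ-derivatives. Quasi-separation retains the product form under restrictions on Θ. For the clamped plate on a disc of radius a, the Dirichlet conditions are u = ∂u/∂r = 0 at r = a, matching the abstract's description of the solution and its inward-drawn normal derivative vanishing on the edge.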

Relevance:

30.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) has proven to be an excellent data-oriented efficiency analysis method for comparing decision making units (DMUs) with multiple inputs and multiple outputs. In conventional DEA, it is assumed that the status of each measure is clearly known as either input or output. However, in some situations, a performance measure can play an input role for some DMUs and an output role for others. Cook and Zhu [Eur. J. Oper. Res. 180 (2007) 692–699] referred to these variables as flexible measures. This paper proposes an alternative model in which each flexible measure is treated as either an input or an output variable so as to maximize the technical efficiency of the DMU under evaluation. The main focus of the paper is on the impact that flexible measures have on the definition of the production possibility set (PPS) and the assessment of technical efficiency. An example in UK higher education institutions shows the applicability of the proposed approach.

Relevance:

30.00%

Publisher:

Abstract:

Projection of a high-dimensional dataset onto a two-dimensional space is a useful tool for visualising structures and relationships in the dataset. However, a single two-dimensional visualisation may not display all the intrinsic structure. Therefore, hierarchical/multi-level visualisation methods have been used to extract a more detailed understanding of the data. Here we propose a multi-level Gaussian process latent variable model (MLGPLVM). MLGPLVM works by segmenting data (with, e.g., K-means, a Gaussian mixture model or interactive clustering) in the visualisation space and then fitting a visualisation model to each subset. To measure the quality of multi-level visualisation (with respect to parent and child models), metrics such as trustworthiness, continuity, mean relative rank errors, visualisation distance distortion and the negative log-likelihood per point are used. We evaluate the MLGPLVM approach on the 'Oil Flow' dataset and a dataset of protein electrostatic potentials for the human 'Major Histocompatibility Complex (MHC) class I'. In both cases, visual observation and the quantitative quality measures show better visualisation at lower levels.

Relevance:

30.00%

Publisher:

Abstract:

Background - Modelling the interaction between potentially antigenic peptides and Major Histocompatibility Complex (MHC) molecules is a key step in identifying potential T-cell epitopes. For class II MHC alleles, the binding groove is open at both ends, causing ambiguity in the positional alignment between the groove and the peptide, as well as creating uncertainty as to which parts of the peptide interact with the MHC. Moreover, the antigenic peptides have variable lengths, making naive modelling methods difficult to apply. This paper introduces a kernel method that can handle variable-length peptides effectively by quantifying similarities between peptide sequences and integrating these into the kernel. Results - The kernel approach presented here shows increased prediction accuracy, with a significantly higher number of true positives and negatives, on multiple MHC class II alleles when tested on data sets from MHCPEP [1], MHCBN [2], and MHCBench [3]. Evaluation by cross-validation, when segregating binders and non-binders, produced an average AROC of 0.824 for the MHCBench data sets (up from 0.756), and an average AROC of 0.96 for multiple alleles of the MHCPEP database. Conclusion - The method improves on existing state-of-the-art methods of MHC class II peptide binding prediction by using a custom, knowledge-based representation of peptides. Similarity scores, in contrast to a fixed-length, pocket-specific representation of amino acids, provide a flexible and powerful way of modelling MHC binding, and can easily be applied to other dynamic sequence problems.
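The paper's kernel is built from knowledge-based similarity scores between peptides. As a hedged stand-in showing how a kernel can compare variable-length sequences at all, here is a minimal k-mer spectrum kernel; the peptide strings are hypothetical and the method is a generic one, not the paper's.

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count every length-k substring of the sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(a, b, k=3):
    """Similarity between variable-length peptides as the inner
    product of their k-mer count vectors."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    return sum(ca[m] * cb[m] for m in ca)

# Hypothetical peptides of different lengths.
print(spectrum_kernel("AKFVAAWTLKAAA", "AKFVAAWTLK"))
```

Because the kernel depends only on shared substrings, peptides of different lengths are compared without forcing them into a fixed-length, pocket-position encoding.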

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the horizontal deflection behaviour of streams of particles in paramagnetic fluids under a high-gradient superconducting magnetic field, continuing work on the exploration of particle magneto-Archimedes levitation. Building on previous work on the horizontal deflection of a single particle, a glass box and collector were designed to observe the movement of particle groups in paramagnetic fluids. To determine the exact separation efficiency, the 'sink-float' method, involving the high-density fluid polytungstate (dense medium separation), and MLA (Mineral Liberation Analyser) measurements were performed. It was found that the particles were deflected and settled at certain positions on the container floor due to the combined action of gravity and magneto-Archimedes forces, as well as a lateral buoyancy (displacement) force. Mineral particles with different densities and susceptibilities could be deflected to different positions, thus producing groups of similar types of particles. The work described here, although in its infancy, could form the basis of a new approach to separating particles based on a combination of susceptibility and density.