19 results for Instrumental-variable Methods

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The importance of informal institutions, and in particular culture, for entrepreneurship is a subject of ongoing interest. Past research has mostly concentrated on cross-national comparisons, cultural values, and the direct effects of culture on entrepreneurial behavior, but has in the main found inconsistent results. The present research adds a fresh perspective to this research stream by turning attention to community-level culture and cultural norms. We hypothesize indirect effects of cultural norms on venture emergence: specifically, that community-level cultural norms (performance-based culture and socially-supportive institutional norms) impact important supply-side variables (entrepreneurial self-efficacy and entrepreneurial motivation), which in turn influence nascent entrepreneurs' success in creating operational ventures (venture emergence). We test our predictions on a unique longitudinal data set (PSED II) tracking nascent entrepreneurs' venture creation efforts over a five-year time span and find evidence supporting them. Our research contributes to a more fine-grained understanding of how culture, in particular perceptions of community cultural norms, influences venture emergence. This research highlights the embeddedness of entrepreneurial behavior and its immediate antecedent beliefs in the local, community context.

Relevance:

100.00%

Publisher:

Abstract:

The literature discusses several methods to control for self-selection effects but provides little guidance on which method to use in a setting with a limited number of variables. The authors theoretically compare and empirically assess the performance of different matching methods and instrumental variable and control function methods in this type of setting by investigating the effect of online banking on product usage. Hybrid matching in combination with the Gaussian kernel algorithm outperforms the other methods with respect to predictive validity. The empirical finding of large self-selection effects indicates the importance of controlling for these effects when assessing the effectiveness of marketing activities.
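
As an illustration of the matching family compared above, here is a minimal sketch of Gaussian-kernel propensity-score matching for the average treatment effect on the treated. The column names, the treatment variable and the bandwidth are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def kernel_matching_att(df, treat_col, outcome_col, covariates, bandwidth=0.06):
    """ATT via Gaussian-kernel matching on estimated propensity scores."""
    X = df[covariates].to_numpy()
    t = df[treat_col].to_numpy().astype(bool)
    y = df[outcome_col].to_numpy()

    # Propensity scores from a simple logit of treatment on the covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    # For each treated unit, build a kernel-weighted counterfactual from the controls.
    att_terms = []
    for i in np.where(t)[0]:
        d = (ps[i] - ps[~t]) / bandwidth
        w = np.exp(-0.5 * d ** 2)            # Gaussian kernel weights
        if w.sum() > 0:
            att_terms.append(y[i] - np.average(y[~t], weights=w))
    return float(np.mean(att_terms))
```

A call such as kernel_matching_att(df, "online_banking", "product_usage", ["age", "income", "tenure"]) (all hypothetical names) would return the estimated effect of adoption on product usage for the adopters.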

Relevance:

80.00%

Publisher:

Abstract:

This paper analyses the association between the number of patenting manufacturing firms and the quantity and quality of relevant university research across UK postcode areas. We show that different measures of research 'power' and 'excellence' positively affect the patenting of small firms within the same postcode area. Patenting by large firms, in contrast, is unaffected by research undertaken in nearby universities. This confirms the commonly held view that location matters more for small firms than for large firms. We also investigate specific channels of technology transfer, finding that university-industry knowledge transfer occurs through both formal and informal channels. From a methodological point of view, we contribute to the existing literature by accounting for potential simultaneity between university research and the patenting of local firms by adopting an instrumental variable approach. Moreover, we allow the presence of universities in neighbouring postcode areas to influence firms' patenting activity by incorporating spatial neighbourhood effects.
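
Since the central methodological point is the instrumental-variable correction for simultaneity, a minimal two-stage least squares (2SLS) sketch may help fix ideas. The variable names (patent counts, local research, an instrument z) are placeholders, not the paper's actual measures or instruments.

```python
import numpy as np

def two_stage_least_squares(y, x_endog, z, exog=None):
    """2SLS: instrument the endogenous regressor, then regress y on its fitted values."""
    n = len(y)
    const = np.ones((n, 1))
    exog = const if exog is None else np.column_stack([const, exog])

    # First stage: project the endogenous regressor (e.g. university research)
    # on the instrument plus exogenous controls.
    Z = np.column_stack([exog, z])
    x_hat = Z @ np.linalg.lstsq(Z, x_endog, rcond=None)[0]

    # Second stage: regress the outcome (e.g. firm patent counts) on the
    # fitted values and the controls.
    X2 = np.column_stack([exog, x_hat])
    beta = np.linalg.lstsq(X2, y, rcond=None)[0]
    return beta  # beta[-1] is the IV estimate of the research effect
```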

Relevance:

30.00%

Publisher:

Abstract:

The performance of seven minimization algorithms is compared on five neural network problems. These include a variable-step-size algorithm, conjugate gradient, and several methods with explicit analytic or numerical approximations to the Hessian.
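
A minimal sketch of this kind of comparison, handing the training loss of a tiny network to SciPy's general-purpose minimizers; the XOR task, the network size and the particular set of methods are illustrative assumptions rather than the algorithms actually benchmarked in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: fit a 2-3-1 tanh network to XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
H = 3                              # hidden units
n_params = 2 * H + H + H + 1       # W1 (2xH), b1 (H), w2 (H), b2 (1)

def loss(p):
    W1 = p[:2 * H].reshape(2, H)
    b1 = p[2 * H:3 * H]
    w2 = p[3 * H:4 * H]
    b2 = p[-1]
    hidden = np.tanh(X @ W1 + b1)
    out = hidden @ w2 + b2
    return 0.5 * np.mean((out - y) ** 2)

rng = np.random.default_rng(0)
x0 = rng.normal(scale=0.5, size=n_params)

# Same starting point, different minimizers: conjugate gradient, quasi-Newton,
# and two derivative-free methods for comparison.
for method in ["CG", "BFGS", "Powell", "Nelder-Mead"]:
    res = minimize(loss, x0, method=method)
    print(f"{method:12s} final loss = {res.fun:.6f}  function evaluations = {res.nfev}")
```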

Relevance:

30.00%

Publisher:

Abstract:

Correlation and regression are two of the statistical procedures most widely used by optometrists. However, these tests are often misused or interpreted incorrectly, leading to erroneous conclusions from clinical experiments. This review examines the major statistical tests concerned with correlation and regression that are most likely to arise in clinical investigations in optometry. First, the use, interpretation and limitations of Pearson's product moment correlation coefficient are described. Second, the least squares method of fitting a linear regression to data, and methods for testing how well a regression line fits the data, are described. Third, the problems of using linear regression methods in observational studies, when there are errors associated with measuring the independent variable, and of predicting a new value of Y for a given X, are discussed. Finally, methods for testing whether a non-linear relationship provides a better fit to the data and for comparing two or more regression lines are considered.
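
As a minimal illustration of the first two procedures discussed (Pearson's correlation and least-squares line fitting), the following sketch uses SciPy; the clinical variables are simulated placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(20, 60, size=30)              # e.g. patient age (assumed example)
y = 0.05 * x + rng.normal(0, 0.3, size=30)    # e.g. some ocular measurement

r, p = stats.pearsonr(x, y)                   # strength of the *linear* association
fit = stats.linregress(x, y)                  # least-squares slope and intercept

print(f"Pearson r = {r:.3f} (p = {p:.4f}), r^2 = {r**2:.3f}")
print(f"y = {fit.intercept:.3f} + {fit.slope:.3f} * x")
```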

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new technique in the investigation of limited dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with the use of a modern method of classification, or discretisation of data, can outperform the more standard approaches employed in economics, such as a probit model. These approaches and certain inductive decision tree methods are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, particularly, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
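
For orientation, here is a minimal sketch of the probit benchmark and an inductive decision tree of the kind the VPRS model is compared against (VPRS itself has no widely available library implementation and is not shown); the data and predictors are simulated placeholders, not the Commission's actual decisions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                       # hypothetical case characteristics
y = (X @ [1.0, -0.5, 0.8] + rng.normal(size=n) > 0).astype(int)  # binary decision
train, test = slice(0, 150), slice(150, n)

# Probit benchmark (the "more standard approach" in the abstract).
probit = sm.Probit(y[train], sm.add_constant(X[train])).fit(disp=False)
probit_pred = (probit.predict(sm.add_constant(X[test])) > 0.5).astype(int)

# Inductive decision-tree comparator.
tree = DecisionTreeClassifier(max_depth=3).fit(X[train], y[train])
tree_pred = tree.predict(X[test])

print("probit out-of-sample accuracy:", np.mean(probit_pred == y[test]))
print("tree   out-of-sample accuracy:", np.mean(tree_pred == y[test]))
```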

Relevance:

30.00%

Publisher:

Abstract:

This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR as justified in terms of the business case, stakeholders are of great importance to corporations. In the UK the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed on the basis of an enlightened shareholder approach. Within this approach the importance of stakeholders other than shareholders is recognised as being instrumental in providing shareholder value. Given the importance of these other stakeholders, it is then important that corporate management measure and manage stakeholder performance. Two general approaches could be adopted to do this: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary measures. To consider these approaches further, the paper examines their possible use for two stakeholder groups: employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.

Relevance:

30.00%

Publisher:

Abstract:

With the extensive use of pulse modulation methods in telecommunications, much work has been done in the search for a better utilisation of the transmission channel. The present research is an extension of these investigations. A new modulation method, 'Variable Time-Scale Information Processing' (VTSIP), is proposed. The basic principles of this system have been established, and the main advantages and disadvantages investigated. With the proposed system, comparison circuits detect the instants at which the input signal voltage crosses predetermined amplitude levels. The time intervals between these occurrences are measured digitally and the results are temporarily stored before being transmitted. After reception, an inverse process enables the original signal to be reconstituted. The advantage of this system is that the irregularities in the rate of information contained in the input signal are smoothed out before transmission, allowing the use of a smaller transmission bandwidth. A disadvantage of the system is the time delay necessarily introduced by the storage process. Another disadvantage is a type of distortion caused by the finite store capacity. A simulation of the system has been made using a standard speech signal to make some assessment of this distortion. It is concluded that the new system should be an improvement on existing pulse transmission systems, allowing the use of a smaller transmission bandwidth, but introducing a time delay.
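
A minimal sketch of the level-crossing idea behind VTSIP, under assumed parameters: record the time intervals between the instants at which the signal crosses a set of predetermined amplitude levels, these intervals being the quantities the system would store and transmit. The test signal, sample rate and levels are illustrative only.

```python
import numpy as np

fs = 8000                                   # sample rate (Hz), assumed
t = np.arange(0, 0.05, 1 / fs)
signal = np.sin(2 * np.pi * 200 * t) * np.exp(-20 * t)   # toy speech-like burst
levels = np.linspace(-0.8, 0.8, 9)          # predetermined comparison levels

events = []                                 # (sample index, level crossed)
for level in levels:
    above = signal >= level
    crossings = np.where(np.diff(above.astype(int)) != 0)[0]
    events.extend((idx, level) for idx in crossings)

events.sort()                               # chronological order of crossings
intervals = np.diff([idx for idx, _ in events]) / fs   # the intervals to be stored/transmitted
print(f"{len(events)} crossings, mean interval = {intervals.mean() * 1e3:.2f} ms")
```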

Relevance:

30.00%

Publisher:

Abstract:

Visualization of high-dimensional data has always been a challenging task. Here we discuss and propose variants of non-linear data projection methods (Generative Topographic Mapping (GTM) and GTM with simultaneous feature saliency (GTM-FS)) that are adapted to be effective on very high-dimensional data. The adaptations use log-space values at certain steps of the Expectation Maximization (EM) algorithm and during the visualization process. We have tested the proposed algorithms by visualizing electrostatic potential data for Major Histocompatibility Complex (MHC) class-I proteins. The experiments show that the adapted versions of GTM and GTM-FS work successfully with data of more than 2000 dimensions, and we compare the results with other linear/non-linear projection methods: Principal Component Analysis (PCA), Neuroscale (NSC) and the Gaussian Process Latent Variable Model (GPLVM).
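
As a small illustration of the log-space adaptation mentioned above, the E-step responsibilities of a GTM-like Gaussian mixture can be computed with log-sum-exp so that likelihoods over thousands of dimensions do not underflow. The shapes, the inverse-variance parameter beta and the function name are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np
from scipy.special import logsumexp

def log_space_responsibilities(data, centres, beta):
    """E-step of a GTM-like mixture: data (N, D), latent-grid centres (K, D)."""
    # Squared distances between every data point and every mixture centre.
    sq_dist = ((data[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)  # (N, K)
    log_p = -0.5 * beta * sq_dist            # unnormalised Gaussian log-densities
    log_r = log_p - logsumexp(log_p, axis=1, keepdims=True)
    return np.exp(log_r)                     # responsibilities, rows sum to 1

# Example with a dimensionality where naively exponentiating the log-densities
# would underflow to zero.
rng = np.random.default_rng(0)
R = log_space_responsibilities(rng.normal(size=(50, 2000)),
                               rng.normal(size=(100, 2000)), beta=1.0)
print(R.shape, R.sum(axis=1)[:3])            # (50, 100), rows summing to 1
```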

Relevance:

30.00%

Publisher:

Abstract:

1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may only account for a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r. 2. The use of r assumes that a bivariate normal distribution is present, and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used. 3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables. 4. In studies of measurement error, there are problems in using r as a test of reliability and the 'intra-class correlation coefficient' should be used as an alternative. A correlation test provides only limited information as to the relationship between two variables. Fitting a regression line to the data using the method known as 'least squares' provides much more information, and the methods of regression and their application in optometry will be discussed in the next article.
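
A minimal sketch of points 1 and 2: report r together with r squared, and fall back to Spearman's rank correlation when bivariate normality is doubtful. The data are simulated placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = 0.15 * x + rng.normal(size=500)          # weak but "significant" association

r, p = stats.pearsonr(x, y)
rho, p_rho = stats.spearmanr(x, y)           # non-parametric alternative

# With n = 500 even a small r is significant, yet r^2 shows how little of the
# variance in Y the X variable actually accounts for.
print(f"Pearson r = {r:.2f}, p = {p:.3g}, r^2 = {r**2:.2f}")
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3g}")
```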

Relevance:

30.00%

Publisher:

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, as well as knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
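
A minimal sketch of the checks suggested above: fit a multiple regression, compare R squared with the 50% rule of thumb, and compare the sample size with the 5-10 subjects-per-variable guideline. The data and the number of predictors are simulated placeholders.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, k = 80, 6                                  # 80 subjects, 6 candidate predictors
X = rng.normal(size=(n, k))
y = X @ rng.normal(size=k) * 0.3 + rng.normal(size=n)

# Ordinary least squares with an intercept.
model = sm.OLS(y, sm.add_constant(X)).fit()
print(f"R^2 = {model.rsquared:.2f}  (suspect if below 0.50)")
print(f"subjects per variable = {n / k:.1f}  (guideline: at least 5-10)")
```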

Relevance:

30.00%

Publisher:

Abstract:

Contrary to previously held beliefs, it is now known that bacteria exist not only on the surface of the skin but are also distributed at varying depths beneath the skin surface. Hence, in order to sterilise the skin, antimicrobial agents are required to penetrate across the skin and eliminate the bacteria residing at all depths. Chlorhexidine is the antimicrobial agent most widely used for skin sterilisation. However, due to its poor permeation rate across the skin, sterilisation of the skin cannot be achieved and, therefore, the remaining bacteria can act as a source of infection during an operation or insertion of catheters. The underlying theme of this study is to enhance the permeation of this antimicrobial agent in the skin by employing chemical (enhancers and supersaturated systems) or physical (iontophoresis) techniques. The hydrochloride salt of chlorhexidine (CHX), a poorly soluble salt, was used throughout this study. The effect of ionisation on the in vitro permeation rate across excised human epidermis was investigated using Franz-type diffusion cells. Saturated solutions of CHX were used as donor and the variable studied was vehicle pH. Permeation rate increased with increasing vehicle pH. The pH effect was not related to the level of ionisation of the drug. The effect of donor vehicle was also studied using saturated solutions of CHX in 10% and 20% ethanol as the donor solutions. Permeation of CHX was enhanced by increasing the concentration of ethanol, which could be due to the higher concentration of CHX in the donor phase and the effect of ethanol itself on the membrane. The interplay between drug diffusion and enhancer pretreatment of the epidermis was studied. Pretreatment of the membrane with 10% Azone/PG demonstrated the highest diffusion rate, followed by 10% oleic acid/PG pretreatment, compared to other pretreatment regimens (ethanol, dimethyl sulfoxide (DMSO), propylene glycol (PG), sodium dodecyl sulphate (SDS) and dodecyl trimethyl ammonium bromide (DTAB)). Differential Scanning Calorimetry (DSC) was also employed to study the mode of action of these enhancers. The potential of supersaturated solutions in enhancing percutaneous absorption of CHX was investigated. Various anti-nucleating polymers were screened in order to establish the most effective agent. Polyvinylpyrrolidone (PVP, K30) was found to be a better candidate than its lower molecular weight counterpart (K25) and hydroxypropyl methylcellulose (HPMC). The permeation studies showed an increase in diffusion rate with increasing degree of saturation. Iontophoresis is a physical means of transdermal drug delivery enhancement that causes an increased penetration of molecules into or through the skin by the application of an electric field. This technique was employed in conjunction with chemical enhancers to assess the effect on CHX permeation across the human epidermis. An improved transport of CHX, which was pH dependent, was observed upon application of the current. Combined use of iontophoresis and chemical enhancers further increased CHX transport, indicating a synergistic effect. Pretreatment of the membrane with 10% Azone/PG demonstrated the greatest effect.

Relevance:

30.00%

Publisher:

Abstract:

This thesis set out to develop an objective analysis programme that correlates with subjective grades but has improved sensitivity and reliability in its measures, so that the possibility of early detection and reliable monitoring of changes in anterior ocular surfaces (bulbar hyperaemia, palpebral redness, palpebral roughness and corneal staining) could be increased. The sensitivity of the programme was 20x greater than subjective grading by optometrists. The reliability was found to be optimal (r=1.0), with subjective grading up to 144x more variable (r=0.08). Objective measures were used to create formulae for an overall 'objective' grade (per surface) equivalent to those displayed by the CCLRU or Efron scales. The correlation between the formulated objective versus subjective grades was high, with adjusted r2 up to 0.96. Baseline levels of objective grade were determined over four age groups (5-85 years, n=120) so that in practice a comparison against 'normal limits' could be made. Differences between the age groups were found for bulbar hyperaemia (p<0.001), and also for palpebral redness and roughness (p<0.001). The objective formulae were then applied to the investigation of diurnal variation in order to account for any change that may affect the baseline. Increases in bulbar hyperaemia and palpebral redness were found between examinations in the morning and evening, and correlation factors were recommended. The programme was then applied to clinical situations in the form of a contact lens trial and an investigation into iritis and keratoconus, where it successfully recognised various surface changes. This programme could become a valuable tool, greatly improving the chances of early detection of anterior ocular abnormalities and facilitating reliable monitoring of disease progression in clinical as well as research environments.

Relevance:

30.00%

Publisher:

Abstract:

There are several methods of providing series compensation for transmission lines using power electronic switches. Four methods of series compensation have been examined in this thesis: the thyristor controlled series capacitor, a voltage sourced inverter series compensator using a capacitor as the series element, a current sourced inverter series compensator, and a voltage sourced inverter using an inductor as the series element. All the compensators examined provide a continuously variable series voltage which is controlled by the switching of the electronic switches. Two of the circuits, the thyristor controlled series capacitor and the current sourced inverter series compensator, offer both capacitive and inductive compensation. The other two produce either capacitive or inductive series compensation. The thyristor controlled series capacitor offers the widest range of series compensation. However, there is a band of unavailable compensation between 0 and 1 pu capacitive compensation. Compared to the other compensators examined, the harmonic content of the compensating voltage is quite high. An algebraic analysis showed that there is more than one state in which the thyristor controlled series capacitor can operate; one of these states has the undesirable effect of introducing large losses. The voltage sourced inverter series compensator using a capacitor as the series element provides only capacitive compensation. It uses two capacitors, which increases the cost of the compensator significantly above the other three. This circuit has the advantage of very low harmonic distortion. The current sourced inverter series compensator provides both capacitive and inductive series compensation. The harmonic content of its compensating voltage is second only to that of the voltage sourced inverter series compensator using a capacitor as the series element. The voltage sourced inverter series compensator using an inductor as the series element provides only inductive compensation, and it is the least expensive compensator examined. Unfortunately, the harmonics introduced by this circuit are considerable.

Relevance:

30.00%

Publisher:

Abstract:

A methodology is presented which can be used to predict the level of electromagnetic interference, in the form of conducted and radiated emissions, from variable speed drives; the drive modelled was a Eurotherm 583 drive. The conducted emissions are predicted using an accurate circuit model of the drive and its associated equipment. The circuit model was constructed from a number of different areas: the power electronics of the drive, the line impedance stabilising network used during the experimental work to measure the conducted emissions, a model of an induction motor assuming near zero load, an accurate model of the shielded cable which connected the drive to the motor, and finally the parasitic capacitances present in the drive modelled. The conducted emissions were predicted with an error of +/-6dB over the frequency range 150kHz to 16MHz, which compares well with the limits set in the standards, which specify a frequency range of 150kHz to 30MHz. The conducted emissions model was also used to predict the current and voltage sources which were used to predict the radiated emissions from the drive. Two methods for the prediction of the radiated emissions from the drive were investigated, the first being two-dimensional finite element analysis and the second three-dimensional transmission line matrix modelling. The finite element model took account of the features of the drive considered to produce the majority of the radiation: the switching of the IGBTs in the inverter, the shielded cable which connected the drive to the motor, and some of the cables present in the drive. The model also took account of the structure of the test rig used to measure the radiated emissions. It was found that the majority of the radiation produced came from the shielded cable and the common mode currents flowing in the shield, and that it was feasible to model the radiation from the drive by modelling only the shielded cable. The radiated emissions were correctly predicted in the frequency range 30MHz to 200MHz with an error of +10dB/-6dB. The transmission line matrix method modelled the shielded cable which connected the drive to the motor and also took account of the architecture of the test rig. Only limited simulations were performed using the transmission line matrix model, as it was found to be a very slow method and not an ideal solution to the problem. However, the limited results obtained were comparable, to within 5%, with the results obtained using the finite element model.