867 results for Empirical studies


Relevance:

30.00%

Publisher:

Abstract:

An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
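The sample-size argument above can be illustrated with a small, hypothetical calculation (not the authors' code): for two independent AR(1) processes with coefficients `phi1` and `phi2`, the variance of the sample correlation is inflated by roughly (1 + φ₁φ₂)/(1 − φ₁φ₂), so detecting a small true correlation needs proportionally more samples.

```python
# Hypothetical sketch: how many samples are needed before a true
# cross-correlation r0 exceeds the ~95% chance level, when both signals
# are AR(1)-autocorrelated with coefficients phi1 and phi2.
import math

def required_samples(r0, phi1, phi2, z=1.96):
    """Approximate N such that a correlation r0 is significant at ~95%.

    Uses the classic variance inflation for the correlation between two
    AR(1) processes: var(r) ~ (1/N) * (1 + phi1*phi2) / (1 - phi1*phi2).
    """
    inflation = (1 + phi1 * phi2) / (1 - phi1 * phi2)
    return math.ceil((z / r0) ** 2 * inflation)

# White noise: no inflation.
n_white = required_samples(0.05, 0.0, 0.0)
# Strongly autocorrelated (e.g. narrow-band) signals need far more data.
n_auto = required_samples(0.05, 0.9, 0.9)
print(n_white, n_auto)  # the autocorrelated case needs ~9.5x more samples
```

With a typical sampling rate this order of magnitude (thousands of samples for r₀ = 0.05) is consistent with the tens of seconds of data quoted in the abstract.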

Relevance:

30.00%

Publisher:

Abstract:

This is the first published edition of John Sinclair, Susan Jones and Robert Daley's research on collocation undertaken in 1970. The unpublished report was circulated amongst a small group of academics and was enormously influential, sparking a growth of interest in collocation amongst researchers in linguistics. Collocation was first viewed as important in computational linguistics in the work of Harold Palmer in Japan. Later M.A.K. Halliday and John Sinclair published on collocation in the 1960s. English Collocation Studies is a report on empirical research into collocation, devised by Halliday with Sinclair acting as the Principal Investigator and editor of the resultant OSTI report. The present edition contains an introduction by Professor Wolfgang Teubert based on his interview with John Sinclair. The introduction assesses the extent to which the findings of the original research have developed in the intervening years, and how some of the techniques mentioned in the report were implemented in the COBUILD project at Birmingham University in the 1980s.

Relevance:

30.00%

Publisher:

Abstract:

The relationship between theory and practice has been discussed in the social sciences for generations. Academics from management and organization studies regularly lament the divide between theory and practice. They regret the insufficient academic knowledge of managerial problems and their solutions, and criticize the scholarly production of theories that are not relevant for organizational practice (Hambrick 1994). Despite the prevalence of this topic in academic discourse, we do not know much about what kind of academic knowledge would be useful to practice, how it would be produced and how the transfer of knowledge between theory and practice actually works. In short, we do not know how we can make academic work more relevant for practice or even whether this would be desirable. In this introduction to the Special Issue, we apply philosophical, theoretical and empirical perspectives to examine the challenges of studying the generation and use of academic knowledge. We then briefly describe the contribution of the seven papers that were selected for this Special Issue. Finally, we discuss issues that still need to be addressed, and make some proposals for future avenues of research.

Relevance:

30.00%

Publisher:

Abstract:

Plasma or "dry" etching is an essential process for the production of modern microelectronic circuits. However, despite intensive research, many aspects of the etch process are not fully understood. The results of studies of the plasma etching of Si and SiO2 in fluorine-containing discharges, and the complementary technique of plasma polymerisation, are presented in this thesis. Optical emission spectroscopy with argon actinometry was used as the principal plasma diagnostic. Statistical experimental design was used to model and compare Si and SiO2 etch rates in CF4 and SF6 discharges as a function of flow, pressure and power. Etch mechanisms in both systems, including the potential reduction of Si etch rates in CF4 due to fluorocarbon polymer formation, are discussed. Si etch rates in CF4/SF6 mixtures were successfully accounted for by the models produced. Si etch rates in CF4/C2F6 and CHF3 as a function of the addition of oxygen-containing additives (O2, N2O and CO2) are shown to be consistent with a simple competition between F, O and CFx species for Si surface sites. For the range of conditions studied, SiO2 etch rates were not dependent on F-atom concentration, but the presence of fluorine was essential in order to achieve significant etch rates. The influence of a wide range of electrode materials on the etch rate of Si and SiO2 in CF4 and CF4/O2 plasmas was studied. It was found that the Si etch rate in a CF4 plasma was considerably enhanced, relative to an anodised aluminium electrode, in the presence of soda glass or sodium- or potassium-"doped" quartz. The effect was even more pronounced in a CF4/O2 discharge. In the latter system, lead and copper electrodes also enhanced the Si etch rate. These results could not be accounted for by a corresponding rise in atomic fluorine concentration. Three possible etch enhancement mechanisms are discussed.
Fluorocarbon polymer deposition was studied, both because of its relevance to etch mechanisms and for its intrinsic interest, as a function of fluorocarbon source gas (CF4, C2F6, C3F8 and CHF3), process time, RF power and percentage hydrogen addition. Gas-phase concentrations of F, H and CF2 were measured by optical emission spectroscopy, and the resultant polymer structure was determined by X-ray photoelectron spectroscopy and infrared spectroscopy. Thermal and electrical properties were also measured. Hydrogen additions are shown to have a dominant role in determining deposition rate and polymer composition. A qualitative description of the polymer growth mechanism is presented which accounts for both changes in growth rate and structure, and leads to an empirical deposition rate model.

Relevance:

30.00%

Publisher:

Abstract:

The assessment of the reliability of systems which learn from data is a key issue to investigate thoroughly before the actual application of information processing techniques to real-world problems. In recent years Gaussian processes and Bayesian neural networks have come to the fore, and in this thesis their generalisation capabilities are analysed from theoretical and empirical perspectives. Upper and lower bounds on the learning curve of Gaussian processes are investigated in order to estimate the amount of data required to guarantee a certain level of generalisation performance. In this thesis we analyse the effects on the bounds and the learning curve induced by the smoothness of stochastic processes described by four different covariance functions. We also explain the early, linearly-decreasing behaviour of the curves and we investigate the asymptotic behaviour of the upper bounds. The effects of the noise and of the characteristic lengthscale of the stochastic process on the tightness of the bounds are also discussed. The analysis is supported by several numerical simulations. The generalisation error of a Gaussian process is affected by the dimension of the input vector and may be decreased by input-variable reduction techniques. In conventional approaches to Gaussian process regression, the positive definite matrix estimating the distance between input points is often taken to be diagonal. In this thesis we show that a general distance matrix is able to estimate the effective dimensionality of the regression problem as well as to discover the linear transformation from the manifest variables to the hidden-feature space, with a significant reduction of the input dimension.
Numerical simulations confirm the significant superiority of the general distance matrix with respect to the diagonal one. In the thesis we also present an empirical investigation of the generalisation errors of neural networks trained by two Bayesian algorithms, the Markov Chain Monte Carlo method and the evidence framework; the neural networks were trained on the task of labelling segmented outdoor images.
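The general-distance-matrix idea can be sketched as follows (an illustrative assumption about the kernel form, not the thesis code): a squared-exponential kernel with distance matrix M = WᵀW, where a low-rank W acts as a learned linear map from the manifest inputs to a lower-dimensional feature space.

```python
# Illustrative sketch: GP regression with a general (non-diagonal) distance
# matrix M = W.T @ W inside a squared-exponential kernel. All data here are
# synthetic; W is fixed rather than learned, purely to show the mechanism.
import numpy as np

def kernel(X1, X2, W):
    # Project inputs with W, then take squared distances in feature space;
    # this equals (x - x')^T M (x - x') with M = W.T @ W.
    Z1, Z2 = X1 @ W.T, X2 @ W.T
    d2 = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))                 # 5 manifest input variables
W = np.array([[1.0, -1.0, 0.0, 0.0, 0.0]])   # effective dimensionality = 1
y = np.sin((X @ W.T)[:, 0]) + 0.1 * rng.normal(size=40)

# Standard GP posterior mean at the training inputs, with noise variance 0.1^2.
K = kernel(X, X, W) + 0.1 ** 2 * np.eye(40)
mean = kernel(X, X, W) @ np.linalg.solve(K, y)
print(np.abs(mean - y).mean())  # small residual: the 1-D projection suffices
```

A rank-1 W here reduces a 5-dimensional regression problem to a single hidden feature, which is the kind of input-dimension reduction the abstract describes.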

Relevance:

30.00%

Publisher:

Abstract:

Measurements of the sea surface obtained by satellite borne radar altimetry are irregularly spaced and contaminated with various modelling and correction errors. The largest source of uncertainty for low Earth orbiting satellites such as ERS-1 and Geosat may be attributed to orbital modelling errors. The empirical correction of such errors is investigated by examination of single and dual satellite crossovers, with a view to identifying the extent of any signal aliasing: either by removal of long wavelength ocean signals or introduction of additional error signals. From these studies, it was concluded that sinusoidal approximation of the dominant one cycle per revolution orbit error over arc lengths of 11,500 km did not remove a significant mesoscale ocean signal. The use of TOPEX/Poseidon dual crossovers with ERS-1 was shown to substantially improve the radial accuracy of ERS-1, except for some absorption of small TOPEX/Poseidon errors. The extraction of marine geoid information is of great interest to the oceanographic community and was the subject of the second half of this thesis. Firstly through determination of regional mean sea surfaces using Geosat data, it was demonstrated that a dataset with 70cm orbit error contamination could produce a marine geoid map which compares to better than 12cm with an accurate regional high resolution gravimetric geoid. This study was then developed into Optimal Fourier Transform Interpolation, a technique capable of analysing complete altimeter datasets for the determination of consistent global high resolution geoid maps. This method exploits the regular nature of ascending and descending data subsets thus making possible the application of fast Fourier transform algorithms. Quantitative assessment of this method was limited by the lack of global ground truth gravity data, but qualitative results indicate good signal recovery from a single 35-day cycle.
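The sinusoidal orbit-error approximation mentioned above amounts to a linear least-squares fit. The sketch below is a hypothetical, simplified illustration (synthetic crossover data, an approximate nodal period): the dominant one cycle-per-revolution radial error is modelled as a bias plus a sinusoid and removed by ordinary least squares.

```python
# Hedged sketch (assumed implementation): fit a + b*cos(wt) + c*sin(wt),
# the 1 cycle-per-revolution orbit-error model, to crossover residuals.
import numpy as np

rng = np.random.default_rng(1)
period = 6035.0                        # approximate ERS-1 orbital period, s
w = 2 * np.pi / period
t = rng.uniform(0, period, size=200)   # crossover times along one arc

# Synthetic "orbit error" plus mesoscale noise (amplitudes are illustrative).
true = 0.10 + 0.50 * np.cos(w * t) + 0.30 * np.sin(w * t)
obs = true + 0.05 * rng.normal(size=t.size)

# Design matrix for the bias + 1 cpr sinusoid model, solved by least squares.
A = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
residual = obs - A @ coef
print(coef, residual.std())  # recovered amplitudes; residual ~ noise level
```

Because the model is linear in (a, b, c), the fit removes the orbit error without touching signals at unrelated wavelengths, which is the aliasing concern the abstract investigates.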

Relevance:

30.00%

Publisher:

Abstract:

Despite being on the business agenda for almost thirty years, stakeholder management is still an under-explored field in the public management context. The investigation presented in this doctoral thesis aims to establish whether stakeholder management is a useful technique able to raise issues of power and interest in public organisations' strategic management processes. Stakeholder theory is tested in an exploratory study carried out with English Local Authorities whose focus is placed on decision-making. The findings derive from two distinct and complementary studies: a cross-sectional survey undertaken with chief executives based on the quantitative approach, and a qualitative investigation based on cross-sectional case studies and in-depth validation interviews. While the first study aimed to produce a reliable and comprehensive list of stakeholders able to raise issues in decision-making, the second study aimed to depict the arena in which decision-making comes about. The findings indicate that local government decision-making is a multistakeholder process in which influences are exerted according to stakeholders' power and interest. The findings also indicate that local government managers should take these issues into account to avoid losing resources and legitimacy from their environmental supporters. Another issue raised by the investigation relates to the ethics upon which these types of relationships are based. According to the evidence gathered throughout the investigation, the formal model of accountability does not cover the whole set of stakeholders engaged in the process.

Relevance:

30.00%

Publisher:

Abstract:

The role of the production system as a key determinant of the competitive performance of business operations has long been the subject of industrial organization research, even predating the explicit conceptualisation of manufacturing strategy in the literature. Particular emergent production issues such as the globalisation of production, global supply chain management, the management of integrated manufacturing and a growing e-business environment are expected to critically influence the overall competitive performance, and therefore the strategic success, of the organization. More than ever, there is a critical need to configure and improve production system and operations competence in a strategic way so as to contribute to the long-term competitiveness of the organization. In order to operate competitively and profitably, manufacturing companies, no matter how well managed, all need a long-term 'strategic direction' for the development of operations competence in order to consistently produce more market value with less cost towards a leadership position. As to long-term competitiveness, it is more important to establish a dynamic 'strategic perspective' for continuous operational improvements in pursuit of this direction, as well as ongoing reviews of the direction in relation to the overall operating context. However, it is also clear that the existing paradigm of manufacturing strategy development is incapable of adequately responding to the increasing complexities and variations of contemporary business operations. This is reflected in the fact that many manufacturing companies are finding that the methodologies advocated in the existing paradigm for developing manufacturing strategy have very limited scale and scope for contextual contingency in empirical application.
More importantly, there has also emerged a deficiency in the multidimensional and integrative profile, from a theoretical perspective, when operationalising the underlying concept of strategic manufacturing management established in the literature. The point of departure for this study was a recognition of such contextual and unitary limitations in the existing paradigm of manufacturing strategy development when applied to contemporary industrial organizations in general, and Chinese State Owned Enterprises (SOEs) in particular. As China gradually becomes integrated into the world economy, the relevance of Western management theory and its paradigm becomes a practical matter as much as a theoretical issue. Since China markedly differs from Western countries in terms of culture, society, and political and economic systems, it presents promising grounds to test and refine existing management theories and paradigms with greater contextual contingency and a wider theoretical perspective. Under China's ongoing programmes of SOE reform, there has been an increased recognition that strategy development is the very essence of the management task for managers of manufacturing companies, in the same way as it is for their counterparts in Western economies. However, the Western paradigm often displays a rather naive and unitary perspective of the nature of strategic management decision-making, one which largely overlooks context-embedded factors and social/political influences on the development of manufacturing strategy. This thesis studies the successful experiences of developing manufacturing strategy at five high-performing large-scale SOEs within China's petrochemical industry. China's petrochemical industry constitutes a basic heavy industrial sector, which has always been a strategic focus for reform and development by the Chinese government.
Using a confirmation approach, the study has focused on exploring and conceptualising the empirical paradigm of manufacturing strategy development practised by management: that is, examining the 'empirical specifics' and surfacing the 'managerial perceptions' of content configuration, context of consideration, and process organization for developing a manufacturing strategy in practice. The research investigation adopts a qualitative exploratory case study methodology with a semi-structured front-end research design. Data collection follows a longitudinal and multiple-case design and triangulates case evidence from sources including qualitative interviews, direct observation, and a search of documentation and archival records. Data analysis follows an investigative progression from a within-case preliminary interpretation of facts to a cross-case search for patterns through theoretical comparison and analytical generalization. The underlying conceptions in both the literature of manufacturing strategy and related studies in business strategy were used to develop the theoretical framework and analytical templates applied during data collection and analysis. The thesis makes both empirical and theoretical contributions to our understanding of the contemporary management paradigm of manufacturing strategy development. First, it provides a valuable contextual contingency of the subject using the business setting of China's SOEs in the petrochemical industry. This has been unpacked into empirical configurations developed for its context of consideration, its content and its process respectively. Of special note, a lean paradigm of business operations and production management discovered at the case companies has significant implications as an emerging alternative for high-volume, capital-intensive state manufacturing in China.
Second, it provides a multidimensional and integrative theoretical profile of the subject based upon managerial perspectives conceptualised at the case companies when operationalising manufacturing strategy. This has been unpacked into conceptual frameworks developed for its context of consideration, its content constructs, and its process patterns respectively. Notably, a synergies perspective towards the operating context, competitive priorities and competence development of business operations and production management has significant implications for implementing a lean manufacturing paradigm. In so doing, the thesis establishes a theoretical platform for future refinement and development of context-specific methodologies for developing manufacturing strategy.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is concerned with the means by which the state in Britain has attempted to influence the technological development of private industry in the period 1945-1979. Particular emphasis is laid on assessing the abilities of technology policy measures to promote innovation. With that objective, the innovation literature is selectively reviewed to draw up an analytical framework to evaluate the innovation content of policy (Chapter 2). Technology policy is taken to consist of the specific measures utilised by government and its agents that affect the technological behaviour of firms. The broad sweep of policy during the period under consideration is described in Chapter 3 which concentrates on elucidating its institutional structure and the activities of the bodies involved. The empirical core of the thesis consists of three parallel case studies of policy toward the computer, machine tool and textile machinery industries (Chapters 4-6). The studies provide detailed historical accounts of the development and composition of policy, relating it to its specific institutional and industrial contexts. Each reveals a different pattern and level of state intervention. The thesis concludes with a comparative review of the findings of the case studies within a discussion centred on the arguments presented in Chapter 2. Topics arising include the state's differential support for the range of activities involved in innovation, the location of state-funded R&D, the encouragement of supplier-user contact, and the difficulties raised in adoption and diffusion.

Relevance:

30.00%

Publisher:

Abstract:

Corporate restructuring is perceived as a challenge to research. Prior studies do not provide conclusive evidence regarding the effects of restructuring. Given this lack of conclusive evidence, this research attempts to examine the effects of restructuring events amongst UK listed firms. The sample firms are listed on the LSE and the London AIM stock exchange. Only completed restructuring transactions are included in the study. The time horizon extends from 1999 to 2003. A three-year floating window is assigned to examine the sample firms. The key enquiry is to scrutinise the ex post effects of restructuring on performance and value measures of firms, in contrast to a matched-criteria non-restructured sample. A cross-sectional study employing a logit estimate is undertaken to examine the firm characteristics of the restructuring samples. Further, additional parameters, i.e. Conditional Volatility and Asymmetry, are generated under the GJR-GARCH estimate and reiterated in the logit models to capture time-varying heteroscedasticity of the samples. This research incorporates most forms of restructuring, while prior studies have examined only certain forms and, in particular, have made limited attempts to examine different restructuring events simultaneously. In addition to the logit analysis, an event study is adopted to evaluate the announcement effect of restructuring under both the OLS and GJR-GARCH estimates, supplementing our prior results. By engaging a composite empirical framework, our estimation method permits a full appreciation of the restructuring effect. The study provides evidence that restructurings indicate a non-trivial, significant positive effect. There is some evidence that the response differs according to the type of restructuring, particularly when the event study is applied. The results establish that performance measures, i.e.
Operating Profit Margin, Return on Equity, Return on Assets, Growth, Size, Profit Margin and Shareholders' Ownership, indicate a consistent and significant increase. However, Leverage and Asset Turnover suggest a reasonable influence on restructuring across the sample period. Similarly, value measures, i.e. Abnormal Returns, Return on Equity and Cash Flow Margin, suggest sizeable improvement. A notable characteristic seen coherently throughout the analysis is the decreasing proportion of Systematic Risk. Consistent with these findings, Conditional Volatility and Asymmetry exhibit a similar trend. The event study analysis suggests that, on average, the market perceives restructuring favourably and shareholders experience significant and systematic positive gains.
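The event-study logic referred to above can be sketched in miniature (hypothetical data; a plain OLS market model rather than the GJR-GARCH variant the thesis uses): abnormal return is the actual return minus the return predicted from an estimation window, and announcement effects are read off the cumulative abnormal return (CAR).

```python
# Minimal event-study sketch on synthetic daily returns.
import numpy as np

rng = np.random.default_rng(2)
market = 0.0005 + 0.01 * rng.normal(size=250)        # daily market returns
stock = 0.0002 + 1.2 * market + 0.005 * rng.normal(size=250)
stock[200] += 0.03               # simulated restructuring announcement jump

est = slice(0, 150)                                  # estimation window
beta, alpha = np.polyfit(market[est], stock[est], 1) # market-model fit (OLS)
expected = alpha + beta * market                     # normal return
abnormal = stock - expected                          # abnormal return series
car = abnormal[198:203].sum()                        # CAR over the event window
print(car)
```

Whether `car` is significantly positive would then be tested against its estimation-window variance; under GJR-GARCH the variance in that test is conditional and asymmetric rather than constant.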

Relevance:

30.00%

Publisher:

Abstract:

In previous sea-surface variability studies, researchers have failed to utilise the full ERS-1 mission due to the varying orbital characteristics in each mission phase, and most have simply ignored the Ice and Geodetic phases. This project aims to introduce a technique which allows the straightforward use of all orbital phases, regardless of orbit type. This technique is based upon single-satellite crossovers. Unfortunately, the ERS-1 orbital height is still poorly resolved (due to higher air drag and stronger gravitational effects) when compared with that of TOPEX/Poseidon (T/P), so, to make best use of the ERS-1 crossover data, corrections to the ERS-1 orbital heights are calculated by fitting a cubic spline to dual-crossover residuals with T/P. This correction is validated by comparison of dual-satellite crossovers with tide gauge data. The crossover processing technique is validated by comparing the extracted sea-surface variability information with that from T/P repeat-pass data. The two data sets are then combined into a single consistent data set for analysis of sea-surface variability patterns. These patterns are simplified by the use of an empirical orthogonal function decomposition which breaks the signals into spatial modes, which are then discussed separately. Further studies carried out on these data include an analysis of the characteristics of the annual signal, discussion of evidence for Rossby wave propagation on a global basis, and finally analysis of the evidence for global mean sea level rise.
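An empirical orthogonal function (EOF) decomposition of the kind mentioned above is conventionally computed with a singular value decomposition of the time-by-space anomaly matrix. The sketch below uses synthetic data (the real analysis uses the combined ERS-1/T/P data set) to show the mechanics.

```python
# Sketch of an EOF decomposition via the SVD, on a synthetic field
# dominated by one annual-cycle spatial mode plus noise.
import numpy as np

rng = np.random.default_rng(3)
n_time, n_space = 120, 300
t = np.arange(n_time)

annual = np.sin(2 * np.pi * t / 12.0)     # annual cycle, monthly sampling
pattern = rng.normal(size=n_space)        # its fixed spatial pattern
field = np.outer(annual, pattern) + 0.1 * rng.normal(size=(n_time, n_space))

anom = field - field.mean(axis=0)         # remove the temporal mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()       # variance fraction per mode
# Vt[0] is the leading spatial mode; U[:, 0] * s[0] is its amplitude series.
print(explained[0])                       # the annual mode dominates
```

Each row of `Vt` is one spatial mode and each column of `U` its amplitude time series, which is exactly the "spatial modes discussed separately" structure the abstract describes.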

Relevance:

30.00%

Publisher:

Abstract:

This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for a gradiometer balance of 1:10⁴ and a wire spacing error of 0.25mm the achievable calibration accuracy of gain is 0.3%, of position is 0.3mm and of orientation is 0.6°. Practical results with a 19-channel 2nd-order gradiometer based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step, an optimal progression for the filter time constant is proposed which improves upon fixed time constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
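Adaptive reference noise cancellation of the kind examined above is commonly built on an LMS filter: the coupling from a reference channel into the measurement channel is learned and subtracted. The sketch below is a deliberately simplified, hypothetical one-tap version (the thesis variant additionally schedules the time constant and includes reference-channel derivatives, which are not reproduced here).

```python
# One-tap LMS adaptive reference noise cancellation on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n = 5000
signal = np.sin(2 * np.pi * 7 * np.arange(n) / 1000.0)  # evoked-like signal
noise_ref = rng.normal(size=n)                           # reference channel
measured = signal + 0.8 * noise_ref   # measurement = signal + coupled noise

w, mu = 0.0, 0.01                     # coupling estimate and step size
cleaned = np.empty(n)
for i in range(n):
    e = measured[i] - w * noise_ref[i]  # error = current signal estimate
    w += mu * e * noise_ref[i]          # gradient step toward the coupling
    cleaned[i] = e

resid = cleaned[1000:] - signal[1000:]  # after convergence
print(resid.std())                      # well below the 0.8 coupled level
```

The step size `mu` plays the role of the filter time constant: a larger `mu` tracks non-stationary steps at epoch boundaries faster but leaves more misadjustment noise, which is the trade-off the proposed time-constant progression addresses.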

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to report on an investigation into the selection and evaluation of a suitable strategic positioning methodology for SMEs in Singapore. Design/methodology/approach – The research methodology is based on a critical review of the literature to identify the potentially most suitable strategic positioning methodology, evaluation and testing of the methodology within the context of SMEs in Singapore, and analysis to determine the strengths and weaknesses of the methodology and opportunities for further research. Findings – This paper illustrates a leading integrated strategic positioning decision-making process which has been found to be potentially suitable for SMEs in Singapore; the process is then applied and evaluated in two industrial case studies. Results in the form of strengths, weaknesses and opportunities are evaluated and discussed in detail, and further research to improve the process is identified. Practical implications – The paper offers a practical and integrated strategic supply chain positioning methodology for SMEs to define their own competitive space among other companies in the manufacturing supply chain, so as to maximize business competitiveness. Originality/value – This paper contributes to knowledge of the strategic positioning decision process and identifies further research to adapt the process for SMEs in Singapore.

Relevance:

30.00%

Publisher:

Abstract:

This paper draws upon the findings of an empirical study comparing the expectations and concerns of engineering students with those of students enrolled on business and management programs. It argues that whilst the two groups of students have very similar expectations, motivations and concerns before they start their studies, once at university engineering students are twice as likely to drop out as their counterparts in business studies. Drawing upon the study findings, recommendations are made as to what might be done to counteract this. The conclusion argues that there is a need for more in-depth research in this area in order to identify the reasons behind the differing attrition rates and to further enhance the engineering undergraduate experience.