976 results for empirical methods


Relevance: 30.00%

Abstract:

The starting point of this study is to direct more attention to the teacher and to the entrepreneurship education practices taking place in formal schooling, in order to find solutions for more effective promotion of entrepreneurship education. For this objective, the strategy-level aims of entrepreneurship education need to be operationalised into measurable and understandable teacher-level practices. Furthermore, to enable the effective development of entrepreneurship education at the basic and upper secondary levels, more knowledge is needed of the state of entrepreneurship education in teaching. The purpose of the study is to increase understanding of teachers' entrepreneurship education practices and, through this, to develop entrepreneurship education. The study builds on the entrepreneurship education literature, especially the elements referring to the aims, resources, benefits, methods, and practices of entrepreneurship education. It comprises five articles highlighting the teacher's role in entrepreneurship education. The first article considers the concept of entrepreneurship and the teacher's role in reflecting on his or her approach to entrepreneurship education. The second article provides a detailed analysis of the process of developing a measurement tool to depict teachers' activities in entrepreneurship education. The remaining three articles highlight the teachers' role in directing entrepreneurship education at the basic and upper secondary levels, and analyse the relationship between entrepreneurship education practices and teachers' background characteristics. The results suggest a wide range of conclusions and implications. First, despite the many outspoken aims attached to entrepreneurship education, teachers have not set any aims for themselves; additionally, aims and results seem to be conflated. However, it is possible to develop teachers' target orientation by supporting their reflection skills, and to increase their understanding of their own practices through measurement and evaluation. Second, a participatory action process makes it possible to operationalise teachers' entrepreneurship education practices. It is essential to include the practitioners' perspective in the development of measures to ensure that the concepts and aims of entrepreneurship education are understood. Third, teachers' demographic or tenure-related background characteristics do not affect their entrepreneurship education practices, but training related to entrepreneurship education, participation in school-level or regional planning, and teachers' own capabilities do support it. Fourth, a large number of methods are applied in entrepreneurship education; the most frequently used were various kinds of discussion, which seem to offer teachers an easy, low-threshold way to include entrepreneurship education regularly in their teaching. Field trips to business enterprises, or inviting entrepreneurs to present their work in schools, are used fairly seldom; interestingly, visits outside the school are more common than visitors invited to the school. In line with this, most entrepreneurship education practice takes place in the classroom. It therefore seems useful to encourage teachers toward more in-depth cooperation with companies (e.g. via joint projects) and toward systematic networking. Finally, plenty of resources are available for entrepreneurship education, such as ready-made materials, external stakeholders, support organisations, and learning games, but teachers have utilised them only marginally.

Relevance: 30.00%

Abstract:

Software quality has become an important research subject, not only in the information and communication technology sphere but also in other industries where software is applied. Software quality is not happenstance; it is defined, planned, and built into the software product throughout the software development life cycle. The research objective of this study is to investigate the roles of the human and organizational factors that influence software quality construction. The study employs Straussian grounded theory. The empirical data were collected from 13 software companies and include 40 interviews. The results suggest that tools, infrastructure, and other resources have a positive impact on software quality, but that the human factors involved in the development process determine the quality of the products developed. Development methods, on the other hand, were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process in which organizational structures, mode of operation, and information flow within the company variably affect software quality. The results also suggest that software development managers influence both the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction.

Relevance: 30.00%

Abstract:

The strongest wish of the customer concerning chemical pulp features is consistent, uniform quality. Variation can be controlled and reduced with statistical methods. However, studies addressing the application and benefits of statistical methods in the forest products sector are scarce; this customer wish is thus the root motivation behind this dissertation. The research problem addressed is that companies in the chemical forest products sector require new knowledge to improve their utilization of statistical methods. To gain this knowledge, the research problem is studied from five complementary viewpoints: challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated from these viewpoints are answered in four research papers, which are case studies based on empirical data collection. As a whole, this research complements the literature on the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining (CRISP-DM), and performance measurement methods in the context of chemical forest products manufacturing are brought to the knowledge of the scientific community, and the benefit of applying these methods is estimated or demonstrated. The purpose of the dissertation is to find pragmatic ideas that help companies in the chemical forest products sector improve their utilization of statistical methods. The main practical implications can be summarized in four points:
1. It is beneficial to reduce variation in chemical forest product manufacturing processes.
2. Statistical tools can be used to reduce this variation.
3. Problem solving in chemical forest product manufacturing processes can be intensified through the use of statistical methods.
4. There are certain success factors and challenges that need to be addressed when implementing statistical methods.
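The dissertation's own tools are not reproduced in the abstract, but the first two implications can be illustrated with a minimal statistical process control sketch in Python. The function names and the pulp-quality figures below are hypothetical; the constant 1.128 is the standard d2 factor for estimating short-term sigma from moving ranges of size two.

```python
import statistics

def control_limits(baseline, sigma_level=3.0):
    """Shewhart individuals-chart limits from an in-control baseline:
    centre line +/- sigma_level * estimated short-term sigma."""
    centre = statistics.fmean(baseline)
    moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    sigma = statistics.fmean(moving_ranges) / 1.128  # d2 for subgroups of 2
    return centre - sigma_level * sigma, centre, centre + sigma_level * sigma

def out_of_control(values, lcl, ucl):
    """Flag measurements outside the control limits."""
    return [x for x in values if x < lcl or x > ucl]
```

Limits are computed from a stable baseline period and then used to screen new measurements, which is the usual individuals-chart workflow for spotting variation worth investigating.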

Relevance: 30.00%

Abstract:

Pairs trading is an algorithmic trading strategy based on the historical co-movement of two separate assets, with trades executed according to the degree of relative mispricing. The purpose of this study is to explore a new, alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from stocks of large and medium-sized companies in the Finnish stock market, over a research period covering the years 2006-2015. All the methods prove profitable, and the Finnish stock market is found suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods; it seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
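The study's implementations are not included in the abstract, but the traditional distance method it compares against can be sketched roughly as follows. The thresholds, price series, and function names are illustrative only; a real backtest would also handle transaction costs, position sizing, and rolling estimation windows.

```python
import statistics

def zscore_spread(prices_a, prices_b):
    """Normalise both price series to a common start, then z-score the spread."""
    norm_a = [p / prices_a[0] for p in prices_a]
    norm_b = [p / prices_b[0] for p in prices_b]
    spread = [a - b for a, b in zip(norm_a, norm_b)]
    mu = statistics.fmean(spread)
    sd = statistics.stdev(spread)
    return [(s - mu) / sd for s in spread]

def signals(zs, entry_z=2.0, exit_z=0.5):
    """-1: short A / long B; +1: long A / short B; 0: flat."""
    position, out = 0, []
    for z in zs:
        if position == 0 and z > entry_z:
            position = -1
        elif position == 0 and z < -entry_z:
            position = 1
        elif position != 0 and abs(z) < exit_z:
            position = 0
        out.append(position)
    return out
```

A trade is opened when the normalised spread diverges beyond the entry threshold and closed when it reverts toward its mean, which is exactly the relative-mispricing logic the distance method formalises.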

Relevance: 30.00%

Abstract:

The future of paying in the age of digitalization is a topic that invites varied visions. This master's thesis explores images of the future of paying in the Single Euro Payments Area (SEPA) up to 2020 and 2025 through the views of experts specialized in payments. The study was commissioned by a credit management company in order to obtain more detailed information about the future of paying. Specifically, the thesis investigates which payment methods could be the most used in the future, what items could work as a medium of exchange in 2020, and how they will evolve towards the year 2025. Changing consumer behavior, trends connected to payment methods, and the security and privacy issues of new cashless payment methods were also part of the study. In the empirical part, the experts' ideas about probable and preferable future images of paying were investigated through a two-round Disaggregative Delphi method. The questionnaire included numeric statements and open questions. Three alternative future images were created with the help of cluster analysis: "Unsurprising Future", "Technology-Driven Future" and "The Age of the Customer". The plausible images had similarities and differences, which were reflected against previous studies in the literature review. The study's findings were formed on the basis of the similarities and differences between the images and the answers to the open questions in the questionnaire. The main conclusion was that the development of technology will both unify and diversify SEPA; the trend in 2020 seems to be towards more cashless payment methods, but their usage depends on countries' financial possibilities and customer preferences. Mobile payments, cards and cash will be the main payment methods, but banks will face competitors from outside the financial sector. Wearable payment methods and NFC technology are seen as strongly growing trends, but subcutaneous payment devices will likely keep their niche position until 2025. In the meantime, security and privacy issues are expected to increase because of identity theft and various frauds. Simultaneously, privacy will lose its meaning to younger consumers, who are used to sharing their transaction and personal data with third parties in order to gain access to attractive services. Easier access to consumers' transaction data will probably open the door for hackers and cause new risks in payment processes. There are many roads to the future, and this study is not an attempt to give complete answers, even if some plausible assumptions about the future's course are provided.
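The thesis's clustering step is not described in detail in the abstract. As a toy illustration of how cluster analysis can group expert responses into distinct future images, here is a minimal one-dimensional k-means on hypothetical agreement scores; the real study clustered multivariate questionnaire data.

```python
import statistics

def kmeans_1d(values, centroids, iterations=20):
    """Minimal 1-D k-means: assign each response to the nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for v in values:
            idx = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [statistics.fmean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

Each resulting cluster of like-minded responses would correspond to one candidate future image.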

Relevance: 30.00%

Abstract:

This paper assesses the empirical performance of an intertemporal option pricing model with latent variables which generalizes the Hull-White stochastic volatility formula. Using this generalized formula in an ad hoc fashion to extract two implicit parameters and forecast next-day S&P 500 option prices, we obtain pricing errors similar to those obtained with implied volatility alone, as in the Hull-White case. When we specialize this model to an equilibrium recursive utility model, we show through simulations that option prices are more informative than stock prices about the structural parameters of the model. We also show that a simple method of moments with a panel of option prices provides good estimates of the parameters of the model. This lays the ground for an empirical assessment of this equilibrium model with S&P 500 option prices in terms of pricing errors.
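The paper's latent-variable specification is not reproduced in the abstract, but the Hull-White idea it generalizes (an option price as an expectation over stochastic-variance paths independent of the stock) can be sketched with a simple Monte Carlo. All parameter values are illustrative, and the driftless lognormal variance process is an assumption of this sketch, not the paper's model.

```python
import math
import random

def hull_white_mc_call(s0, k, r, t, v0, vol_of_vol,
                       n_paths=10000, n_steps=25, seed=1):
    """Monte Carlo price of a European call when the variance follows a
    driftless lognormal process independent of the stock's own shocks."""
    rng = random.Random(seed)
    dt = t / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        s, v = s0, v0
        for _ in range(n_steps):
            # Lognormal variance step, independent of the stock shock.
            v *= math.exp(-0.5 * vol_of_vol ** 2 * dt
                          + vol_of_vol * math.sqrt(dt) * rng.gauss(0, 1))
            # Log-Euler step for the stock under the risk-neutral measure.
            s *= math.exp((r - 0.5 * v) * dt
                          + math.sqrt(v * dt) * rng.gauss(0, 1))
        payoff_sum += max(s - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths
```

With vol_of_vol = 0 the variance stays fixed and the price converges to the Black-Scholes value with volatility sqrt(v0), a convenient sanity check.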

Relevance: 30.00%

Abstract:

Adjustment is an ongoing process by which factors of production are reallocated to equalize their returns in different uses. Adjustment occurs through market mechanisms or the intrafirm reallocation of resources as a result of changes in terms of trade, government policies, resource availability, technological change, etc. These changes alter production opportunities as well as production, transaction and information costs, and consequently modify production functions, organizational design, etc. In this paper we define adjustment (section 2); review empirical estimates of the extent of adjustment in Canada and abroad (section 3); review selected features of the trade policy and adjustment context relevant for policy formulation, among them slow growth, a shift to services, a shift to the Pacific Rim, the internationalization of production, investment, distribution and communications, the growing use of NTBs, changes in foreign direct investment patterns, intrafirm and intraindustry trade, interregional trade flows, and differences in the microeconomic adjustment processes of subsidiaries and Canadian companies (section 4); examine methodologies and results of studies of the impact of trade liberalization on jobs (section 5); and review the R. Harris general equilibrium model (section 6). Our conclusion emphasizes the importance of harmonizing commercial and domestic policies dealing with adjustment (section 7). We close with a bibliography of relevant publications.

Relevance: 30.00%

Abstract:

Affiliation: Faculté de pharmacie, Université de Montréal

Relevance: 30.00%

Abstract:

The cattle feed industry is a major segment of the animal feed industry. It is gradually evolving into an organized sector, and feed manufacturers are increasingly using modern, sophisticated methods that seek to incorporate best global practices. The industry has high growth potential in India, given that the country is the world's leading producer of milk and that milk production is expected to grow at a compounded annual growth rate of 4 per cent. Besides, the concept of branded cattle feed as a packaged commodity is fast gaining popularity in rural India. Demand for cattle feed may change positively because of factors such as (i) the shrinkage of open land for cattle grazing, urbanization, and the resultant shortage of conventionally used cattle feeds, and (ii) the introduction of high-yield cattle, which require specialized feeds. Earlier research studies by the present authors have revealed the significant growth prospects of the branded cattle feed industry; the feed consumption pattern and the relatively high share of branded feeds; consumption patterns by product type (such as pellet and mash); the composition of the cattle feed market and the relatively large shares of the Kerala Feeds Ltd. (KFL) and Kerala Solvent Extractions Ltd. (KSE) brands; and the major factors influencing purchasing decisions. As a continuation of the earlier studies, this study takes a closer look at the significance of product types in buyer behavior, the level of brand awareness and its implications for purchasing decisions, and brand-shifting behavior and its determinants.

Relevance: 30.00%

Abstract:

Bossel's (2001) systems-based approach for deriving comprehensive indicator sets provides one of the most holistic frameworks for developing sustainability indicators. It ensures that indicators cover all important aspects of system viability, performance, and sustainability, and recognizes that a system cannot be assessed in isolation from the systems upon which it depends and which in turn depend upon it. In this reply, we show how Bossel's approach is part of a wider convergence toward integrating participatory and reductionist approaches to measure progress toward sustainable development. However, we also show that further integration of these approaches may be able to improve the accuracy and reliability of indicators to better stimulate community learning and action. Only through active community involvement can indicators facilitate progress toward sustainable development goals. To engage communities effectively in the application of indicators, these communities must be actively involved in developing, and even in proposing, indicators. The accuracy, reliability, and sensitivity of the indicators derived from local communities can be ensured through an iterative process of empirical and community evaluation. Communities are unlikely to invest in measuring sustainability indicators unless monitoring provides immediate and clear benefits. However, in the context of goals, targets, and/or baselines, sustainability indicators can more effectively contribute to a process of development that matches local priorities and engages the interests of local people.

Relevance: 30.00%

Abstract:

An alternative approach to research is described that has been developed through a succession of significant construction management research projects. The approach follows the principles of iterative grounded theory, whereby researchers iterate between alternative theoretical frameworks and emergent empirical data. Of particular importance is an orientation toward mixing methods, thereby overcoming the existing tendency to dichotomize quantitative and qualitative approaches. The approach is positioned against the existing contested literature on grounded theory, and the possibility of engaging with empirical data in a “theory free” manner is discounted. Emphasis instead is given to the way in which researchers must be theoretically sensitive as a result of being steeped in relevant literatures. Knowledge of existing literatures therefore shapes the initial research design; but emergent empirical findings cause fresh theoretical perspectives to be mobilized. The advocated approach is further aligned with notions of knowledge coproduction and the underlying principles of contextualist research. It is this unique combination of ideas which characterizes the paper's contribution to the research methodology literature within the field of construction management. Examples are provided and consideration is given to the extent to which the emergent findings are generalizable beyond the specific context from which they are derived.

Relevance: 30.00%

Abstract:

The paper draws from three case studies of regional construction firms operating in the UK. The case studies provide new insights into the ways in which such firms strive to remain competitive. Empirical data were derived from multiple interactions with senior personnel within each firm. Data collection methods included semi-structured interviews, informal interactions, archival research, and workshops. The initial research question was informed by existing resource-based theories of competitiveness and an extensive review of construction-specific literature. However, subsequent emergent empirical findings progressively pointed towards the need to mobilise alternative theoretical models that emphasise localised learning and embeddedness. The findings point towards the importance of de-centralised structures that enable multiple business units to become embedded within localised markets. A significant degree of autonomy is essential to facilitate entrepreneurial behaviour. In essence, sustained competitiveness was found to rest on the way de-centralised business units enact ongoing processes of localised learning. Once local business units have become embedded within localised markets, the essential challenge is how to encourage continued entrepreneurial behaviour while maintaining some degree of centralised control and coordination. This presents a number of tensions and challenges which play out differently across each of the three case studies.

Relevance: 30.00%

Abstract:

Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven approach removing the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including making a quantitative assessment of its ability to preserve phase content of signals, and proceed to develop a statistical framework with which to assess synchrony episodes. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL) with existing methods and show it to offer improved time-frequency localisation of synchrony.
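The EMDPL pipeline itself is not given in the abstract. Its core statistic, phase locking between two phase series, can be sketched as below; in the actual method the instantaneous phases come from the Hilbert transform of matched EMD modes, which this sketch assumes are already available.

```python
import cmath

def phase_locking_value(phases_a, phases_b):
    """Phase-locking value: the magnitude of the mean phase-difference
    vector on the unit circle. 1.0 means perfect locking; values near
    zero mean no consistent phase relation."""
    acc = sum(cmath.exp(1j * (pa - pb)) for pa, pb in zip(phases_a, phases_b))
    return abs(acc) / len(phases_a)
```

A constant phase offset still yields a value of 1.0, since locking concerns the consistency of the phase relationship, not its size.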

Relevance: 30.00%

Abstract:

Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and the preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of bandpass filtering following the same pre-processing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of the statistical significance of phase locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer-range phase locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement.
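As a rough illustration of surrogate-style significance testing for phase locking (the paper's adaptive statistical framework is more elaborate than this), one can compare the observed phase-locking value against circularly shifted surrogates that destroy any true temporal alignment:

```python
import cmath
import random

def plv(pa, pb):
    """Phase-locking value between two instantaneous-phase series."""
    return abs(sum(cmath.exp(1j * (a - b)) for a, b in zip(pa, pb))) / len(pa)

def plv_p_value(pa, pb, n_surrogates=200, seed=0):
    """Estimate how often circularly shifted surrogates match or exceed
    the observed PLV; a small value suggests genuine phase locking."""
    rng = random.Random(seed)
    observed = plv(pa, pb)
    n = len(pb)
    exceed = 0
    for _ in range(n_surrogates):
        shift = rng.randrange(1, n)
        surrogate = pb[shift:] + pb[:shift]  # circular shift
        if plv(pa, surrogate) >= observed:
            exceed += 1
    return (exceed + 1) / (n_surrogates + 1)
```

Circular shifting preserves each series' own phase dynamics while breaking their mutual alignment, which is a common surrogate construction.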

Relevance: 30.00%

Abstract:

The A-Train constellation of satellites provides a new capability to measure vertical cloud profiles that leads to more detailed information on ice-cloud microphysical properties than has been possible up to now. A variational radar–lidar ice-cloud retrieval algorithm (VarCloud) takes advantage of the complementary nature of the CloudSat radar and Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar to provide a seamless retrieval of ice water content, effective radius, and extinction coefficient from the thinnest cirrus (seen only by the lidar) to the thickest ice cloud (penetrated only by the radar). In this paper, several versions of the VarCloud retrieval are compared with the CloudSat standard ice-only retrieval of ice water content, two empirical formulas that derive ice water content from radar reflectivity and temperature, and retrievals of vertically integrated properties from the Moderate Resolution Imaging Spectroradiometer (MODIS) radiometer. The retrieved variables typically agree to within a factor of 2, on average, and most of the differences can be explained by the different microphysical assumptions. For example, the ice water content comparison illustrates the sensitivity of the retrievals to assumed ice particle shape. If ice particles are modeled as oblate spheroids rather than spheres for radar scattering then the retrieved ice water content is reduced by on average 50% in clouds with a reflectivity factor larger than 0 dBZ. VarCloud retrieves optical depths that are on average a factor-of-2 lower than those from MODIS, which can be explained by the different assumptions on particle mass and area; if VarCloud mimics the MODIS assumptions then better agreement is found in effective radius and optical depth is overestimated. 
However, MODIS predicts the mean vertically integrated ice water content to be around a factor of 3 lower than that from VarCloud for the same retrievals, because the MODIS algorithm assumes that its retrieved effective radius (which is mostly representative of cloud top) is constant throughout the depth of the cloud. These comparisons highlight the need to refine microphysical assumptions in all retrieval algorithms, and for future studies to compare not only mean values but also full probability density functions.
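The empirical IWC(Z, T) formulas compared in the paper are log-linear fits of the general shape sketched below; the coefficients used here are illustrative placeholders, not the published values.

```python
def iwc_from_z_t(z_dbz, t_celsius, a=0.06, b=-0.02, c=-1.7):
    """Empirical ice water content from radar reflectivity and temperature:
    log10(IWC / (g m^-3)) = a * Z + b * T + c.
    The coefficients a, b, c are hypothetical, for illustration only."""
    return 10.0 ** (a * z_dbz + b * t_celsius + c)
```

Fits of this form give higher IWC for higher reflectivity and, at a fixed reflectivity, for colder temperatures.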