853 results for Empirical Methods in NLP


Relevance:

100.00%

Publisher:

Abstract:

Efficient indirect office work confers a competitive advantage on companies in a rapidly changing business environment. Direct work methods on the factory floor have been developed for decades, but office work is an area where the potential to improve value add has not yet been studied and exploited systematically. The first objective of this thesis is to identify, from the literature, a useful method for identifying and managing value add; the usefulness of the method is validated in the case company's environment. The second objective is to understand what kind of effort is required to create more efficient target setting for white-collar employees, since operative-level targets should be linked more tightly to company strategy. Lean methods are selected as the improvement tool, since they are widely used across industries and are already familiar in other functions of the case company. Based on the literature review, suitable improvement methods are selected. The core of lean is to identify the value add for the customer and eliminate waste. Visual control, cross-functional work teams, flow office and continuous improvement are also applied. The methods are tested on one production line, and the results and feedback indicate that they are useful in the studied environment.

Abstract:

The purpose of the present study was to investigate differences in percentage body fat (%BF) across three Spanish dance disciplines and to compare skinfold and bioelectrical impedance predictions of %BF in the same sample. Seventy-six female dancers, divided into three groups, Classical (n=23), Spanish (n=29) and Flamenco (n=24), were measured using skinfolds at four sites (triceps, subscapular, biceps and iliac crest) and whole-body multi-frequency bioelectrical impedance analysis (BIA). The skinfold measures were used to predict %BF via Durnin and Womersley's equation, while the Segal, Sun and Yannakoulia equations were applied to the BIA data. Differences in percent fat mass between groups (Classical, Spanish and Flamenco) were tested using repeated-measures analysis of variance (ANOVA), and Pearson's product-moment correlations were computed between the %BF values obtained with the two methods. In addition, Bland-Altman plots were used to assess agreement between the anthropometric and BIA methods. The repeated-measures ANOVA did not find significant differences in %BF between disciplines (p > 0.05). Fat-percentage correlations ranged from r = 0.57 to r = 0.97 (all p < 0.001). Bland-Altman analysis, with BIA Yannakoulia as the reference method, revealed differences with BIA Segal (-0.35 ± 2.32%, 95% CI: -0.89 to 0.18, p = 0.38), BIA Sun (-0.73 ± 2.30%, 95% CI: -1.27 to -0.20, p = 0.014) and Durnin-Womersley (-2.65 ± 2.48%, 95% CI: -3.22 to -2.07, p < 0.0001). It was concluded that %BF estimates by BIA differed systematically from the skinfold method in young adult female dancers, with the Segal and Durnin-Womersley equations tending to underestimate relative to Yannakoulia as %BF increased; these methods are therefore not interchangeable.
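The Bland-Altman agreement statistics used above can be sketched in a few lines. This is a minimal illustration of the bias and 95% limits-of-agreement computation; the %BF values below are made up for the example, not the study's data.

```python
# Minimal Bland-Altman sketch: bias and 95% limits of agreement
# between two measurement methods. Sample values are illustrative only.
import statistics

def bland_altman(method_a, method_b):
    """Return the mean difference (bias) and the 95% limits of agreement."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical %BF estimates from two methods for five dancers
bia = [22.1, 24.3, 19.8, 26.0, 21.5]
skinfold = [21.0, 23.5, 18.9, 24.2, 20.8]
bias, (lo, hi) = bland_altman(bia, skinfold)
print(f"bias={bias:.2f}%  limits of agreement=({lo:.2f}, {hi:.2f})")
```

A systematic difference between methods shows up as a bias far from zero, with the limits of agreement bounding the expected disagreement for an individual measurement.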

Abstract:

Two passive methods and one active method for assessing intradomiciliary infestation by Rhodnius ecuadoriensis were tested: (i) the Gomes Núñez sensor box (GN), (ii) sheets of white typing paper, and (iii) a timed active manual capture method. The study was carried out in the Alto Chicama River Valley, Province of Gran Chimú, Department of La Libertad. The design consisted of an initial search for triatomines inside the domestic environment using the active manual-capture procedure (man/hour) in all the studied houses. Matched pairs of GN boxes and paper sheets were then installed simultaneously in the bedrooms of 207 households distributed across 19 localities. This comparative prospective trial of the passive detection devices was monitored at 2, 4 and, finally, 6 months of follow-up. Natural infection by Trypanosoma rangeli and/or T. cruzi was investigated in two houses with high levels of infestation by R. ecuadoriensis. Of the 207 households investigated by the initial active manual method, 16.9% were infested with R. ecuadoriensis, with the proportion of infested houses ranging from 6.2% to 55.5% among the 19 localities. T. rangeli natural infection was detected in R. ecuadoriensis specimens collected in two households; parasite rates in the bugs were 16.6% and 21.7%, respectively, and, most strikingly, average salivary-gland infection rates ranged from 7.4% to 8.3%. At the end of the six-month period, a cumulative incidence of 31.4% of positive GN boxes was recorded, against 15.9% for paper sheets. All three methods combined detected domestic infestation in 129 (62.3%) of the 207 houses studied in the 19 localities, with the proportion of infested houses ranging from 6.7% to 92.9%. In areas with low bug-density infestation rates, the methodology used in our studies seems to be the best choice for investigations of domestic R. ecuadoriensis populations.

Abstract:

We develop the energy-norm a-posteriori error estimation for hp-version discontinuous Galerkin (DG) discretizations of elliptic boundary-value problems on 1-irregularly, isotropically refined affine hexahedral meshes in three dimensions. We derive a reliable and efficient indicator for the errors measured in terms of the natural energy norm. The ratio of the efficiency and reliability constants is independent of the local mesh sizes and depends only weakly on the polynomial degrees. In our analysis we make use of an hp-version averaging operator in three dimensions, which we explicitly construct and analyze. We use our error indicator in an hp-adaptive refinement algorithm and illustrate its practical performance in a series of numerical examples. Our numerical results indicate that exponential rates of convergence are achieved for problems with smooth solutions, as well as for problems with isotropic corner singularities.
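In generic notation (my own symbols, not necessarily the paper's), reliability and efficiency of an energy-norm a-posteriori indicator take the standard form:

```latex
\|u - u_{hp}\|_{E} \;\le\; C_{\mathrm{rel}}\,\eta,
\qquad
\eta \;\le\; C_{\mathrm{eff}}\,\|u - u_{hp}\|_{E} + \mathrm{osc}(f),
```

where $u_{hp}$ is the DG solution, $\eta$ the computed indicator and $\mathrm{osc}(f)$ the data oscillation; the abstract's claim is that the ratio $C_{\mathrm{rel}} C_{\mathrm{eff}}$ is independent of the local mesh sizes and depends only weakly on the polynomial degrees.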

Abstract:

The purpose of this study is to analyze the existing literature on hospitality management across all research papers published in The International Journal of Hospitality Management (IJHM) between 2008 and 2014. The authors apply bibliometric methods, in particular author citation and co-citation analyses (ACA), to identify the main research lines within this scientific field; in other words, its ‘intellectual structure’. Social network analysis (SNA) is also used to visualize this structure. The results of the analysis allow us to define the different research lines, or fronts, which shape the intellectual structure of research on hospitality management.
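The core counting step of co-citation analysis can be sketched simply: two authors are co-cited whenever they appear together in a paper's reference list, and the counts become edge weights of the network visualized with SNA. The author names and reference lists below are illustrative, not data from the IJHM study.

```python
# Hedged sketch of the co-citation counting step in ACA.
# Each set is one paper's cited-author list (illustrative names only).
from itertools import combinations
from collections import Counter

reference_lists = [
    {"Smith", "Jones", "Lee"},
    {"Smith", "Jones"},
    {"Jones", "Lee"},
]

cocitations = Counter()
for refs in reference_lists:
    # Sorting makes each unordered pair a canonical key.
    for a, b in combinations(sorted(refs), 2):
        cocitations[(a, b)] += 1  # edge weight in the co-citation network

print(cocitations.most_common())
```

Clusters of heavily co-cited authors in the resulting network are what the study interprets as research lines or fronts.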

Abstract:

Adjuvants are substances that boost the protective immune response to vaccine antigens. The majority of known adjuvants have been identified through empirical approaches. Our aim was to identify novel adjuvants with well-defined cellular and molecular mechanisms by combining knowledge of immunoregulatory mechanisms with an in silico approach. CD4+CD25+FoxP3+ regulatory T cells (Tregs) inhibit the protective immune responses to vaccines by suppressing the activation of antigen-presenting cells such as dendritic cells (DCs). In this chapter, we describe the identification and functional validation of small-molecule antagonists of CCR4, a chemokine receptor expressed on Tregs. CCR4 binds the chemokines CCL22 and CCL17, which are produced in large amounts by activated innate cells, including DCs. The in silico-identified small-molecule CCR4 antagonists inhibited the migration of Tregs both in vitro and in vivo and, when combined with vaccine antigens, significantly enhanced protective immune responses in experimental models.

Abstract:

Nitrobenzoxadiazole (NBD)-labeled lipids are popular fluorescent membrane probes. However, the understanding of important aspects of the photophysics of NBD remains incomplete, including the observed shift of the emission spectrum of NBD-lipids to longer wavelengths upon excitation at the red edge of the absorption spectrum (red-edge excitation shift, or REES). REES of NBD-lipids in membrane environments has previously been interpreted as reflecting restricted mobility of the solvent surrounding the fluorophore. However, this requires a large change in the dipole moment (Δμ) of NBD upon excitation. Previous calculations of Δμ for NBD in the literature were carried out with outdated semi-empirical methods, leading to conflicting values. Using up-to-date density functional theory methods, we recalculated Δμ and verified that it is rather small (≈2 D). Fluorescence measurements confirmed a REES of ≈16 nm for 1,2-dioleoyl-sn-glycero-3-phospho-L-serine-N-(NBD) (NBD-PS) in dioleoylphosphatidylcholine vesicles. However, the observed shift is independent of both temperature and the presence of cholesterol, and is therefore insensitive to the mobility and hydration of the membrane. Moreover, red-edge excitation leads to an increased contribution of the decay component with the shorter lifetime, whereas time-resolved emission spectra (TRES) of NBD-PS displayed an atypical blue shift following excitation. This excludes restricted solvent relaxation as the cause of the measured REES and TRES of NBD, pointing instead to the heterogeneous transverse location of the probes as the origin of these effects. The latter hypothesis was confirmed by molecular dynamics simulations, in which the calculated heterogeneity of the hydration and location of NBD correlated with the measured fluorescence lifetimes and REES. Globally, our combination of theoretical and experiment-based techniques has led to a considerably improved understanding of the photophysics of NBD and, in particular, to a reinterpretation of its REES.

Abstract:

In practical terms, conceptual modeling is at the core of systems analysis and design. The plurality of available modeling methods has, however, been regarded as detrimental, and as a strong indication that a common view or theoretical grounding of modeling is wanting. Such a theoretical foundation must universally address all potential matters to be represented in a model, which consequently suggested ontology as the point of departure for theory development. The Bunge–Wand–Weber (BWW) ontology has become a widely accepted modeling theory. Its application has simultaneously led to the recognition that, although suitable as a meta-model, the BWW ontology needs to be enhanced regarding its expressiveness in empirical domains. In this paper, a first step in this direction is made by revisiting Bunge's ontology and by proposing the integration of a “hierarchy of systems” into the BWW ontology to accommodate domain-specific conceptualizations.

Abstract:

Designers need to develop good observational skills in order to conduct user studies that reveal the subtleties of human interactions and adequately inform design activity. In this paper we describe a game format that we have used, in concert with wiki-web technology, to engage our IT and Information Environments students in developing much sharper observational skills. The Video Card Game is a method of video analysis suited to design practitioners as well as researchers. It uses the familiar format of a card game, similar to "Happy Families", to help students develop themes of interactions from watching video clips. Students then post their interaction themes on wiki-web pages, which allows the teaching team and other students to edit and comment on them. We found that the tangible (cards), game, role-playing and sharing aspects of this method led to much more interaction and discussion between student groups and between students and the teaching team than we had achieved with our traditional teaching methods, while taking no more time on the part of the teaching staff. The quality of the resulting interaction themes indicates that this method fosters the development of observational skills. In the paper we describe the motivations, method and results in full. We also describe the research context in which we collected the videotape data, and how this method relates to state-of-the-art research methods in interaction design for ubiquitous computing technology.

Abstract:

The behavioral theory of “entrepreneurial bricolage” attempts to understand what entrepreneurs do when faced with resource constraints. Most research about bricolage, defined as “making do by applying combinations of the resources at hand to new problems and opportunities” (Baker & Nelson 2005: 333), has been qualitative and inductive (Garud & Karnoe, 2003). Although this has created a small body of rich descriptions and interesting insights, little deductive theory has been developed, and the relationship between bricolage and firm performance has not been systematically tested. In particular, prior research has suggested that bricolage can have both beneficial and harmful effects. Ciborra’s (1996) study of Olivetti suggested that bricolage helped Olivetti adapt, but simultaneously constrained firm effectiveness. Baker & Nelson (2005) suggested that bricolage may be harmful at very high levels, but more helpful if used judiciously. Other research suggests that firm innovativeness may play an important role in shaping the outcomes of bricolage (Anderson 2008). In this paper, we theorize about, and provide a preliminary test of, the bricolage-performance relationship and how it is affected by firm innovativeness.

Abstract:

Principal Topic: It is well known that most new ventures suffer from a significant lack of resources, which increases the risk of failure (Shepherd, Douglas and Shanley, 2000) and makes it difficult to attract stakeholders and financing for the venture (Bhide & Stevenson, 1999). The Resource-Based View (RBV) (Barney, 1991; Wernerfelt, 1984) is a dominant theoretical base increasingly drawn on within strategic management. While theoretical contributions applying RBV in the domain of entrepreneurship can arguably be traced back to Penrose (1959), there has been renewed attention recently (e.g. Alvarez & Busenitz, 2001; Alvarez & Barney, 2004). This said, empirical work is in its infancy, in part perhaps due to a lack of well-developed measuring instruments for testing ideas derived from RBV. The purpose of this study is to develop measurement scales that can assist such empirical investigations. In doing so, we try to overcome three deficiencies in current empirical measures used in applying RBV to the entrepreneurship arena. First, measures need to be developed for resource characteristics and configurations associated with typical competitive advantages found in entrepreneurial firms. These include alertness and industry knowledge (Kirzner, 1973), flexibility (Ebben & Johnson, 2005), strong networks (Lee et al., 2001) and, within knowledge-intensive contexts, unique technical expertise (Wiklund and Shepherd, 2003). Second, the RBV has the important limitations of being relatively static and modelled on large, established firms. Traditional RBV accordingly focuses on competitive advantages; however, newly established firms often face disadvantages, especially those associated with the liabilities of newness (Aldrich & Auster, 1986). It is therefore important in entrepreneurial contexts to expand the investigation, through an RBV lens, to responses to competitive disadvantage.
Conversely, recent research has suggested that resource constraints can actually have a positive effect on firm growth and performance under some circumstances (e.g., George, 2005; Katila & Shane, 2005; Mishina et al., 2004; Mosakowski, 2002; cf. also Baker & Nelson, 2005). Third, current empirical applications of RBV measure the levels or amounts of particular resources available to a firm and infer that these resources deliver competitive advantage by establishing a relationship between resource levels and performance (e.g. via regression on profitability). However, there is an opportunity to measure directly the characteristics of resource configurations that deliver competitive advantage, such as Barney's well-known VRIO (Valuable, Rare, Inimitable and Organized) framework (Barney, 1997). Key Propositions and Methods: The aim of our study is to develop and test scales for measuring resource advantages (and disadvantages) and inimitability for entrepreneurial firms. The study proceeds in three stages. The first stage developed our initial scales based on earlier literature, adapting scales from previous work where possible. The first block of scales related to the level of resource advantages and disadvantages: respondents were asked the degree to which each resource category represented an advantage or disadvantage relative to other businesses in their industry, on a 5-point response scale (Major Disadvantage, Slight Disadvantage, No Advantage or Disadvantage, Slight Advantage, Major Advantage). Items were developed as follows. Network capabilities (3 items) were adapted from Madsen, Alsos, Borch, Ljunggren & Brastad (2006). Knowledge resources for marketing expertise / customer service (3 items) and technical expertise (3 items) were adapted from Wiklund and Shepherd (2003). Flexibility (2 items) and costs (4 items) were adapted from JIBS B97. New scales were developed for industry knowledge / alertness (3 items) and product / service advantages.
The second block asked respondents to nominate the firm's most important resource advantage (and disadvantage). For the advantage, they were asked four questions on a 5-point Likert scale to determine how easy it would be for other firms to imitate and/or substitute this resource; for the disadvantage, they were asked corresponding questions about overcoming it. The second stage involved two pre-tests of the instrument to refine the scales: first an online convenience sample of 38 respondents, then telephone interviews with a random sample of 31 nascent firms and 47 young firms (< 3 years in operation) generated using a PSED method of randomly calling households (Gartner et al. 2004). Several items were dropped or reworded based on the pre-tests. The third stage (currently in progress) is part of Wave 1 of CAUSEE (nascent firms) and FEDP (young firms), a PSED-type study being conducted in Australia. The scales will be tested and analysed with random samples of approximately 700 nascent and young firms respectively, plus a judgement sample of approximately 100 high-potential businesses in each category. Findings and Implications: The results of the main study (stage 3; data collection currently in progress) will allow comparison of the level of resource advantage / disadvantage across various sub-groups of the population; of particular interest will be a comparison of the high-potential firms with the random sample. Based on the smaller pre-tests (N=38 and N=78), the factor structure of the items confirmed the distinctiveness of the constructs, and the reliabilities are within an acceptable range (Cronbach alpha from 0.701 to 0.927). The study will give researchers an opportunity to better operationalize RBV theory in studies within the domain of entrepreneurship. This is a fundamental requirement for testing hypotheses derived from RBV in systematic, large-scale research studies.
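The scale-reliability check reported above (Cronbach alpha between 0.701 and 0.927) can be sketched in a few lines. This is a minimal illustration of the alpha computation for a multi-item scale; the 5-point Likert responses below are made up for the example, not the study's data.

```python
# Hedged sketch: Cronbach's alpha for a multi-item scale.
# items is a list of per-item response lists over the same respondents.
import statistics

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = len(items)
    item_vars = [statistics.pvariance(it) for it in items]
    totals = [sum(vals) for vals in zip(*items)]   # per-respondent totals
    total_var = statistics.pvariance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Three hypothetical 5-point Likert items answered by five respondents
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the benchmark the reported range clears.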

Abstract:

This document reviews international and national practices in investment decision-support tools for road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies and criteria adopted by current tools, with emphasis on how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies supporting decision-making in road asset management, and the complexity of their applications shows significant differences across international practices. There is continuing discussion amongst practitioners and researchers as to which is more appropriate for supporting decision-making; it is suggested that the two approaches be regarded as complementary rather than competing. Multiple Criteria Analysis may be particularly helpful in the early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives. The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective, and an extension of the approach that includes social and environmental externalities is currently used to support Triple Bottom Line decision-making in the road sector. However, several issues in its application deserve attention. First, there is a need to reach a degree of commonality in considering social and environmental externalities, which may be achieved by aggregating best practices. The detail with which externalities are considered should differ at different decision-making levels, and it is intended to develop a generic framework to coordinate the range of existing practices. A standard framework would also help reduce the double counting that appears in some current practices.
Caution is also needed with the methods used to value social and environmental externalities. A number of methods, such as market price, resource costs and willingness to pay, are found in the review; the use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision-makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted from current practices for lack of information and credible models; it may be appropriate to consider these externalities in qualitative form within a Multiple Criteria Analysis. Consensus has been reached on considering noise and air pollution in international practice; however, Australian practices have generally omitted these externalities. Equity is an important consideration in road asset management, whether between regions or between social groups (income, age, gender, disability, etc.). In current practice there is no well-developed quantitative measure for equity issues, and more research is needed on this. Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for the choice of modelling methods and externalities; as a result, different analysts are unlikely to reach consistent conclusions about a policy measure. Some current practices favour methods that can prioritise alternatives, such as Goal Programming, the Goal Achievement Matrix and the Analytic Hierarchy Process; others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses, but the processes of assigning weights and scores have been criticised as highly arbitrary and subjective. It is essential that the process be as transparent as possible.
Obtaining weights and scores by consulting local communities is common practice, but is likely to bias results towards local interests. Interactive approaches have the advantage of helping decision-makers elaborate their preferences, though the computational burden may cause decision-makers to lose interest during the solution of a large-scale problem, such as a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities, and distorted valuations can occur when variables measured in physical units are converted to scales. For example, if decibels of noise are converted to a scale of -4 to +4 by a linear transformation, the difference between scores 3 and 4 represents a far greater increase in discomfort than the increase from 0 to 1; it is therefore suggested that different weights be assigned to individual scores. Owing to overlapping goals, the problem of double counting also appears in some Multiple Criteria Analyses; the situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have received scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
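The weighted-scoring step at the heart of most Multiple Criteria Analyses can be sketched in a few lines. The criteria, weights and -4..+4 scores below are illustrative only; note that, as the review warns, a linearly rescaled physical unit (such as decibels) hides the non-linear growth of real discomfort.

```python
# Hedged sketch: aggregating criterion scores with decision-maker weights,
# as in a simple Multiple Criteria Analysis. Values are illustrative only.

def weighted_score(scores, weights):
    """Aggregate criterion scores (on a -4..+4 scale) with weights summing to 1."""
    return sum(s * w for s, w in zip(scores, weights))

# Two hypothetical road projects scored on noise, safety and cost
weights = [0.2, 0.5, 0.3]          # noise, safety, cost
project_a = [-2, 3, 1]             # quieter corridor, strong safety gain
project_b = [1, 2, 3]              # noisier but cheaper
score_a = weighted_score(project_a, weights)
score_b = weighted_score(project_b, weights)
print(score_a, score_b)
```

The arbitrariness criticised in the review lives entirely in the `weights` and `scores` inputs; making both explicit, as here, is what keeps the process transparent.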