62 results for Representation Construction Approach


Relevance: 30.00%

Abstract:

Inventory control in complex manufacturing environments encounters various sources of uncertainty and imprecision. This paper presents a fuzzy knowledge-based approach to solving the problem of order quantity determination in the presence of uncertain demand, lead time and actual inventory level. Uncertain data are represented by fuzzy numbers, and vaguely defined relations between them are modeled by fuzzy if-then rules. The proposed representation and inference mechanism are verified using a large number of examples, and the results of three representative cases are summarized. Finally, a comparison between the developed fuzzy knowledge-based approach and traditional probabilistic approaches is discussed.
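
The following is a minimal, illustrative sketch of the kind of fuzzy if-then reasoning described above, using triangular fuzzy sets and simple height defuzzification; the membership ranges, the rule base and the output centres are assumptions made for illustration, not the paper's actual knowledge base.

    # A minimal sketch of fuzzy if-then rules for order quantity determination.
    # Membership ranges, rules and output centres are illustrative assumptions.

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def order_quantity(demand, inventory):
        # Fuzzify the crisp inputs (assumed universes of discourse).
        demand_high = tri(demand, 400, 800, 1200)
        demand_low = tri(demand, 0, 200, 600)
        inv_low = tri(inventory, 0, 100, 300)
        inv_high = tri(inventory, 200, 500, 800)

        # Rule 1: IF demand is high AND inventory is low THEN order is large (centre 900).
        w1 = min(demand_high, inv_low)
        # Rule 2: IF demand is low OR inventory is high THEN order is small (centre 150).
        w2 = max(demand_low, inv_high)

        # Height (weighted-average) defuzzification over the rule consequents.
        if w1 + w2 == 0:
            return 0.0
        return (w1 * 900 + w2 * 150) / (w1 + w2)

    print(round(order_quantity(demand=700, inventory=120)))  # high demand, low stock -> large order (900)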

Relevance: 30.00%

Abstract:

Increasingly, scholars are contesting the value of grand theories of leadership in favour of a social constructionist approach that posits the centrality of language for ‘doing’ leadership. This article investigates the extent to which the linguistic enactment of leadership is often gendered, which may have consequences for the career progression of women business leaders. Drawing on a UK-based study of three teams with different gender compositions (men-only, women-only and mixed-gender), I use an Interactional Sociolinguistic framework to compare what leadership ‘looks and sounds like’ during a competitive leadership task. My findings show that the linguistic construction of leadership varies considerably within each team, although not always in conventionally gendered ways. The study potentially provides linguistic insights into the business issue of why so few women progress from middle management to senior leadership roles.

Relevance: 30.00%

Abstract:

Purpose – The UK experienced a number of Extreme Weather Events (EWEs) during recent years, and a significant number of businesses were affected as a result. With the intensity and frequency of weather extremes predicted to increase in the future, enhancing the resilience of businesses, especially of Small and Medium-sized Enterprises (SMEs), which are considered highly vulnerable, has become a necessity. However, little research has been undertaken on how construction SMEs respond to the risk of EWEs. In seeking to help address this dearth of research, this investigation sought to identify how construction SMEs were being affected by EWEs and the coping strategies being used.
Design/methodology/approach – A mixed-methods research design was adopted to elicit information from construction SMEs, involving a questionnaire survey and a case study approach.
Findings – Results indicate a lack of coping strategies among the construction SMEs studied. Where coping strategies had been implemented, these were found to be extensions of existing risk management strategies rather than radical measures specifically addressing EWEs.
Research limitations/implications – The exploratory survey focused on the Greater London area and was limited to a relatively small sample size. This limitation is overcome by conducting detailed case studies of two SMEs whose projects were located in EWE-prone localities. The mixed-methods research design adopted benefits the research by producing more robust findings.
Practical implications – SMEs require a better way of integrating the potential impact of EWEs into the initial project planning stage. This could be achieved through a better risk assessment model supported by better EWE prediction data.
Originality/value – The paper provides an original contribution towards the overarching agenda of the resilience of SMEs and policy making in the area of EWE risk management. It informs both policy makers and practitioners on issues of planning and preparedness against EWEs.

Relevance: 30.00%

Abstract:

Determining an appropriate research methodology is considered an important element of a research study, especially a doctoral research study. It covers the approach to the entire process of the research, from theoretical underpinnings through data collection and analysis to developing solutions for the problems investigated. Research methodology is, in essence, focused on the problems to be investigated and therefore varies according to those problems. Thus, identifying the research methodology that best suits the research at hand is important, not only because it helps achieve the set objectives of the research, but also because it establishes the credibility of the work. Research philosophy, approach, strategy, choice and techniques are inherent components of the methodology. Research strategy provides the overall direction of the research, including the process by which the research is conducted. Case study, experiment, survey, action research, grounded theory and ethnography are examples of such research strategies. The case study is documented as an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between phenomenon and context are not clearly evident. Case study was adopted as the overarching research strategy in a doctoral study developed to investigate the resilience of construction Small and Medium-sized Enterprises (SMEs) in the UK to extreme weather events (EWEs). The research sought to investigate how construction SMEs are affected by EWEs, how they respond to the risk of EWEs, and the means of enhancing their resilience to future EWEs. By comparing and contrasting the case study strategy with the alternative strategies available, it is argued that utilising it will benefit the study in achieving its set objectives and answering the research questions raised. It is also claimed that the selected strategy will contribute towards addressing the call for improved methodological pluralism in construction management research, enhancing the understanding of the complex network of relationships pertinent to the industry and the phenomenon being studied.

Relevance: 30.00%

Abstract:

The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and its chairman in particular, over a period of about three years, both before and after the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT, and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted. The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p.16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p.17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and as serving the interests of this constructed and largely unknown group - the non-professional investor.

Relevance: 30.00%

Abstract:

Introduction: There is a growing public perception that serious medical error is commonplace and largely tolerated by the medical profession. The Government and medical establishment's response to this perceived epidemic of error has included tighter controls over practising doctors and individual stick-and-carrot reforms of medical practice.
Discussion: This paper critically reviews the literature on medical error, professional socialization and medical student education, and suggests that common themes such as uncertainty, necessary fallibility, exclusivity of professional judgement and extensive use of medical networks find their genesis, in part, in aspects of medical education and socialization into medicine. The nature and comparative failure of recent reforms of medical practice and the tension between the individualistic nature of the reforms and the collegiate nature of the medical profession are discussed.
Conclusion: A more theoretically informed and longitudinal approach to decreasing medical error might be to address the genesis of medical thinking about error through reforms to the aspects of medical education and professional socialization that help to create and perpetuate the existence of avoidable error, and reinforce medical collusion concerning error. Further changes in the curriculum to emphasize team working, communication skills, evidence-based practice and strategies for managing uncertainty are therefore potentially key components in helping tomorrow's doctors to discuss, cope with and commit fewer medical errors.

Relevance: 30.00%

Abstract:

The 1641 Depositions are testimonies collected from (mainly Protestant) witnesses documenting their experiences of the Irish uprising that began in October 1641. As news spread across Europe of the events unfolding in Ireland, reports of violence against women became central to the ideological construction of the barbarism of the Catholic rebels. Against a backdrop of women's subordination and firmly defined gender roles, this article investigates the representation of women in the Depositions, creating what we have termed "lexico-grammatical portraits" of particular categories of woman. In line with other research dealing with discursive constructions in seventeenth-century texts, a corpus-assisted discourse analytical approach is taken. Adopting the assumptions of Critical Discourse Analysis, the discussion is extended to what the findings reveal about representations of the roles of women, both in the reported events and in relation to the dehumanisation of the enemy in atrocity propaganda more generally. © John Benjamins Publishing Company.

Relevance: 30.00%

Abstract:

Research in social psychology has shown that public attitudes towards feminism are mostly based on stereotypical views linking feminism with leftist politics and lesbian orientation. It is claimed that such attitudes are due to the negative and sexualised media construction of feminism. Studies concerned with the media representation of feminism seem to confirm this tendency. While most of this research provides significant insights into the representation of feminism, the findings are often based on a small sample of texts. Also, most of the research was conducted in an Anglo-American setting. This study attempts to address some of the shortcomings of previous work by examining the discourse of feminism in a large corpus of German and British newspaper data. It does so by employing the tools of Corpus Linguistics. By investigating the collocation profiles of the search term feminism, we provide evidence of salient discourse patterns surrounding feminism in two different cultural contexts. © The Author(s) 2012.
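
Below is a minimal sketch of how a collocation profile for the node word "feminism" might be computed from a tokenised corpus; the toy corpus, the window size and the use of a raw frequency ranking (rather than an association statistic such as log-likelihood or mutual information) are illustrative assumptions rather than the study's actual corpus-linguistic procedure.

    # Count words co-occurring with a node word within a +/- window of tokens.
    from collections import Counter

    def collocation_profile(tokens, node="feminism", window=5):
        profile = Counter()
        for i, tok in enumerate(tokens):
            if tok.lower() == node:
                left = max(0, i - window)
                context = tokens[left:i] + tokens[i + 1:i + 1 + window]
                profile.update(w.lower() for w in context)
        return profile

    corpus = "critics of feminism argue that feminism is often linked with politics".split()
    print(collocation_profile(corpus).most_common(3))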

Relevance: 30.00%

Abstract:

In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context clustering based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and 6 out of 13 sub-tasks in the analogical reasoning task.
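
As a rough illustration of the initialization step described above: the paper encodes WordNet glosses with a convolutional neural network, but a much simpler stand-in, averaging pretrained word vectors over the gloss, conveys the idea. The tiny word_vectors table below is a made-up placeholder, and the NLTK WordNet corpus is assumed to be installed.

    # Seed a sense embedding from its WordNet gloss (a simple average of word
    # vectors stands in for the paper's convolutional gloss encoder).
    import numpy as np
    from nltk.corpus import wordnet as wn  # requires the NLTK WordNet data

    word_vectors = {                # placeholder pretrained vectors (normally ~300-d)
        "financial": np.array([0.9, 0.1]),
        "institution": np.array([0.8, 0.2]),
        "sloping": np.array([0.1, 0.9]),
        "land": np.array([0.2, 0.8]),
    }

    def init_sense_embedding(synset, dim=2):
        # Average the vectors of the gloss words we have vectors for.
        vecs = [word_vectors[w] for w in synset.definition().lower().split()
                if w in word_vectors]
        return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

    for synset in wn.synsets("bank")[:3]:
        print(synset.name(), init_sense_embedding(synset))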

Relevance: 30.00%

Abstract:

Short text messages, a.k.a. Microposts (e.g. Tweets), have proven to be an effective channel for revealing information about trends and events, ranging from those related to Disaster (e.g. hurricane Sandy) to those related to Violence (e.g. the Egyptian revolution). Being informed about such events as they occur could be extremely important to authorities and emergency professionals by allowing such parties to respond immediately. In this work we study the problem of topic classification (TC) of Microposts, which aims to automatically classify short messages based on the subject(s) discussed in them. The accurate TC of Microposts, however, is a challenging task, since the limited number of tokens in a post often implies a lack of sufficient contextual information. In order to provide contextual information to Microposts, we present and evaluate several graph structures surrounding concepts present in linked knowledge sources (KSs). Traditional TC techniques enrich the content of Microposts with features extracted only from the Micropost content. In contrast, our approach relies on the generation of different weighted semantic meta-graphs extracted from linked KSs. We introduce a new semantic graph, called the category meta-graph. This novel meta-graph provides a finer-grained categorisation of concepts, yielding a set of novel semantic features. Our findings show that such category meta-graph features effectively improve the performance of a topic classifier of Microposts. Furthermore, our goal is also to understand which semantic feature contributes to the performance of a topic classifier. For this reason we propose an approach for automatic estimation of the accuracy loss of a topic classifier on new, unseen Microposts. We introduce and evaluate novel topic similarity measures, which capture the similarity between the KS documents and Microposts at a conceptual level, considering the enriched representation of these documents. Extensive evaluation in the context of Emergency Response (ER) and Violence Detection (VD) revealed that our approach outperforms previous approaches using a single KS without linked data, and Twitter data only, by up to 31.4% in terms of F1 measure. Our main findings indicate that the new category graph contains useful information for TC and achieves comparable results to previously used semantic graphs. Furthermore, our results also indicate that the accuracy of a topic classifier can be accurately predicted using the enhanced text representation, outperforming previous approaches that consider content-based similarity measures. © 2014 Elsevier B.V. All rights reserved.
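
A minimal sketch of the general idea of enriching Microposts with knowledge-source categories before topic classification appears below; the concept-to-category mapping, the toy posts and the plain bag-of-words plus logistic regression pipeline are illustrative assumptions, not the paper's weighted semantic meta-graph features.

    # Enrich microposts with KS category labels, then train a simple classifier.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder: concept -> categories, as might be retrieved from a linked KS.
    concept_categories = {
        "hurricane": ["NaturalDisaster", "Weather"],
        "flood": ["NaturalDisaster"],
        "riot": ["Violence", "CivilUnrest"],
        "protest": ["CivilUnrest"],
    }

    def enrich(post):
        # Append category labels for any concept mentioned in the post.
        extra = [c for w in post.lower().split() for c in concept_categories.get(w, [])]
        return post + " " + " ".join(extra)

    posts = ["hurricane hits the coast", "flood warning issued",
             "riot breaks out downtown", "protest turns violent"]
    labels = ["emergency", "emergency", "violence", "violence"]

    clf = make_pipeline(CountVectorizer(), LogisticRegression())
    clf.fit([enrich(p) for p in posts], labels)
    print(clf.predict([enrich("massive flood in the city")]))  # -> ['emergency']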

Relevance: 30.00%

Abstract:

Operation sequencing is one of the crucial tasks in process planning. However, it is an intractable process to identify an optimized operation sequence with minimal machining cost in a vast search space constrained by manufacturing conditions. Also, the information represented by current process plan models for three-axis machining is not sufficient for five-axis machining owing to the two extra degrees of freedom and the difficulty of set-up planning. In this paper, a representation of process plans for five-axis machining is proposed, and the complicated operation sequencing process is modelled as a combinatorial optimization problem. A modern evolutionary algorithm, i.e. the particle swarm optimization (PSO) algorithm, has been employed and modified to solve it effectively. Initial process plan solutions are formed and encoded into particles of the PSO algorithm. The particles 'fly' intelligently in the search space to achieve the best sequence according to the optimization strategies of the PSO algorithm. Meanwhile, to explore the search space comprehensively and to avoid being trapped into local optima, several new operators have been developed to improve the particle movements to form a modified PSO algorithm. A case study used to verify the performance of the modified PSO algorithm shows that the developed PSO can generate satisfactory results in optimizing the process planning problem. © IMechE 2009.
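
A minimal sketch of one common way to apply particle swarm optimization to a sequencing problem is given below, using a random-key encoding in which each particle is a real vector whose argsort yields an operation order. The toy set-up-change cost function and the PSO coefficients are illustrative assumptions; the paper's own modified operators and machining-cost model are not reproduced.

    # Random-key PSO for a toy operation-sequencing problem.
    import numpy as np

    rng = np.random.default_rng(0)
    n_ops, n_particles, iters = 6, 20, 100

    setup_of = np.array([0, 0, 1, 1, 2, 2])      # set-up id of each operation
    def cost(order):
        # Penalise every set-up change between consecutive operations.
        return sum(setup_of[a] != setup_of[b] for a, b in zip(order[:-1], order[1:]))

    x = rng.random((n_particles, n_ops))         # positions (random keys)
    v = np.zeros_like(x)                         # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(np.argsort(p)) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()

    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration weights
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        costs = np.array([cost(np.argsort(p)) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    best = np.argsort(gbest)
    print("best sequence:", best, "set-up changes:", cost(best))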

Relevance: 30.00%

Abstract:

BACKGROUND: Standardised packaging (SP) of tobacco products is an innovative tobacco control measure opposed by transnational tobacco companies (TTCs) whose responses to the UK government's public consultation on SP argued that evidence was inadequate to support implementing the measure. The government's initial decision, announced 11 months after the consultation closed, was to wait for 'more evidence', but four months later a second 'independent review' was launched. In view of the centrality of evidence to debates over SP and TTCs' history of denying harms and manufacturing uncertainty about scientific evidence, we analysed their submissions to examine how they used evidence to oppose SP.
METHODS AND FINDINGS: We purposively selected and analysed two TTC submissions using a verification-oriented cross-documentary method to ascertain how published studies were used, and interpretive analysis with a constructivist grounded theory approach to examine the conceptual significance of TTC critiques. The companies' overall argument was that the SP evidence base was seriously flawed and did not warrant the introduction of SP. However, this argument was underpinned by three complementary techniques that misrepresented the evidence base. First, published studies were repeatedly misquoted, distorting the main messages. Second, 'mimicked scientific critique' was used to undermine evidence; this form of critique insisted on methodological perfection, rejected methodological pluralism, adopted a litigation (not scientific) model, and was not rigorous. Third, TTCs engaged in 'evidential landscaping', promoting a parallel evidence base to deflect attention from SP and excluding company-held evidence relevant to SP. The study's sample was limited to sub-sections of two out of four submissions, but leaked industry documents suggest at least one other company used a similar approach.
CONCLUSIONS: The TTCs' claim that SP will not lead to public health benefits is largely without foundation. The tools of Better Regulation, particularly stakeholder consultation, provide an opportunity for highly resourced corporations to slow, weaken, or prevent public health policies.

Relevance: 30.00%

Abstract:

One of the major challenges in measuring efficiency in terms of resources and outcomes is the assessment of the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied to time series datasets, DEA models, by construction, form the reference set for inefficient units (lambda values) based on their distance from the efficient frontier, that is, in a spatial manner. However, when dealing with temporal datasets, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among time periods of a unit that evolves. In this paper, we propose a two-stage spatiotemporal DEA approach, which captures both the spatial and the temporal dimension through a multi-objective programming model. In the first stage, DEA is solved iteratively, extracting for each unit only previous DMUs (decision-making units) as peers in its reference set. In the second stage, the lambda values derived from the first stage are fed to a multi-objective mixed integer linear programming model, which filters peers in the reference set based on weights assigned to the spatial and temporal dimensions. The approach is demonstrated on a real-world example drawn from software development.
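
For concreteness, here is a minimal sketch of the first-stage idea described above: an input-oriented CCR DEA score for a unit computed while admitting only observations from earlier time periods as potential peers. The toy single-input, single-output data and the plain CCR formulation are illustrative assumptions; the second-stage multi-objective peer filtering is not shown.

    # First-stage sketch: DEA efficiency per period, peers restricted to the past.
    import numpy as np
    from scipy.optimize import linprog

    # One evolving unit observed over 5 periods: inputs X (rows), outputs Y (rows).
    X = np.array([[10.0, 9.0, 9.5, 8.0, 8.5]])      # e.g. effort
    Y = np.array([[20.0, 22.0, 21.0, 26.0, 25.0]])  # e.g. delivered functionality

    def efficiency_at(t):
        """Input-oriented CCR score of period t against peers from periods 0..t."""
        Xp, Yp = X[:, :t + 1], Y[:, :t + 1]         # temporal peer restriction
        n = Xp.shape[1]
        # Decision variables: [theta, lambda_0 .. lambda_{n-1}]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_it <= 0
        A_in = np.hstack([-X[:, [t]], Xp])
        # Outputs: -sum_j lambda_j * y_rj <= -y_rt
        A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Yp])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, t]],
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        return res.fun                               # theta = 1.0 means efficient

    for t in range(5):
        print(f"period {t}: efficiency = {efficiency_at(t):.3f}")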

Relevance: 30.00%

Abstract:

In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context clustering based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and 6 out of 13 sub-tasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.

Relevance: 30.00%

Abstract:

Cooperative Greedy Pursuit Strategies are considered for approximating a signal partition subject to a global constraint on sparsity. The approach aims at producing a high-quality sparse approximation of the whole signal using highly coherent redundant dictionaries. The cooperation takes place by ranking the partition units for their sequential stepwise approximation, and is realized by means of (i) forward steps for the upgrading of an approximation and/or (ii) backward steps for the corresponding downgrading. The advantage of the strategy is illustrated by the approximation of music signals using redundant trigonometric dictionaries. In addition to rendering stunning improvements in sparsity with respect to the concomitant trigonometric basis, these dictionaries enable a fast implementation of the approach via the Fast Fourier Transform.
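
The following is a minimal sketch of a plain (non-cooperative) greedy pursuit over a redundant cosine dictionary, illustrating the kind of sparse approximation the abstract builds on; the 2x-redundant dictionary, the synthetic signal and the fixed number of forward steps are illustrative assumptions, and the cooperative ranking of partition units and the FFT-based implementation are not reproduced.

    # Matching pursuit over a redundant (2x) cosine dictionary.
    import numpy as np

    N, redundancy, steps = 64, 2, 5
    n = np.arange(N)

    # Redundant trigonometric dictionary with unit-norm columns (atoms).
    M = redundancy * N
    D = np.cos(np.outer(n + 0.5, np.arange(M) * np.pi / M))
    D /= np.linalg.norm(D, axis=0)

    # Synthetic signal: a sparse combination of three atoms.
    signal = D[:, [10, 40, 90]] @ np.array([3.0, -2.0, 1.5])

    residual, coeffs = signal.copy(), np.zeros(M)
    for _ in range(steps):                        # greedy forward steps
        corr = D.T @ residual
        atom = int(np.argmax(np.abs(corr)))       # best-matching atom
        coeffs[atom] += corr[atom]
        residual -= corr[atom] * D[:, atom]

    print("selected atoms:", np.nonzero(coeffs)[0])
    print("relative error:", np.linalg.norm(residual) / np.linalg.norm(signal))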