939 resultados para Statistical Language Model


Relevância:

30.00%

Publicador:

Resumo:

The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modeling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis consists of a demonstration of the value of the ontology in business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts: Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models. It defines what is meant by business models in this dissertation and how they are situated in the context of the firm. In addition, this chapter outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology. In effect, it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology into a prototype tool: the Business Model Modelling Language BM2L. This is an XML-based description language that makes it possible to capture and describe the business model of a firm and has a large potential for further applications. Chapter 7 is about the evaluation of the business model ontology. The evaluation builds on a literature review, a set of interviews with practitioners and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology. The main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, chapter 9 presents some conclusions.
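The abstract does not reproduce the BM2L schema, but the four pillars suggest how a business model description might be organised. The following is a hypothetical sketch, using Python's standard xml.etree.ElementTree: the element names product, customer_interface, infrastructure and finance are taken from the abstract, while everything else (the firm, its values, the nesting) is purely illustrative and not the actual BM2L format.

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch: element names below are illustrative, not the actual BM2L schema.
business_model = ET.Element("business_model", firm="ExampleCo")

product = ET.SubElement(business_model, "product")
ET.SubElement(product, "value_proposition").text = "Streaming music subscription"

customer = ET.SubElement(business_model, "customer_interface")
ET.SubElement(customer, "target_customer").text = "Young urban professionals"
ET.SubElement(customer, "relationship").text = "Self-service with personalised recommendations"

infrastructure = ET.SubElement(business_model, "infrastructure")
ET.SubElement(infrastructure, "partner").text = "Record labels"

finance = ET.SubElement(business_model, "finance")
ET.SubElement(finance, "revenue_stream").text = "Monthly subscription fee"
ET.SubElement(finance, "cost_structure").text = "Licensing and hosting"

print(ET.tostring(business_model, encoding="unicode"))
```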

Relevância:

30.00%

Publicador:

Resumo:

Data on the development of the solanaceous fruit crop Cape gooseberry were analyzed to evaluate how well a classical thermal time model could describe node appearance in different environments. The data used in the analysis were obtained from experiments conducted in Colombia under open-field and greenhouse conditions at two locations with different climates. An empirical, non-linear segmented model was used to estimate the base temperature and to parameterize the model for simulation of node appearance over time. The base temperature (Tb) used to calculate the thermal time (TT, °Cd) for node appearance was estimated to be 6.29 °C. The slope of the first linear segment was 0.023 nodes per °Cd and that of the second linear segment 0.008. The thermal time at which the slope of node appearance changed was 1039.5 °Cd after transplanting, determined from a statistical analysis of the model for the first segment. When these coefficients were used to predict node appearance at all locations, the model successfully fit the observed data (RMSE = 2.1), especially for the first segment, where node appearance was more homogeneous than in the second segment. More nodes were produced by plants grown under greenhouse conditions, and the minimum and maximum rates of node appearance were also higher there.
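A minimal sketch of the two-segment thermal-time model described above, using the reported parameter values (Tb = 6.29 °C, slopes of 0.023 and 0.008 nodes per °Cd, breakpoint at 1039.5 °Cd). The daily temperature series and the exact published form of the model are assumptions made for illustration only.

```python
# Two-segment thermal-time model for node appearance (illustrative sketch).
T_BASE = 6.29        # base temperature, deg C (from the abstract)
SLOPE_1 = 0.023      # nodes per deg C day, first segment (from the abstract)
SLOPE_2 = 0.008      # nodes per deg C day, second segment (from the abstract)
BREAKPOINT = 1039.5  # thermal time after transplanting at which the slope changes, deg C day

def thermal_time(daily_mean_temps):
    """Accumulate thermal time (deg C day) above the base temperature."""
    return sum(max(t - T_BASE, 0.0) for t in daily_mean_temps)

def nodes(tt):
    """Predicted number of nodes as a piecewise-linear function of thermal time."""
    if tt <= BREAKPOINT:
        return SLOPE_1 * tt
    return SLOPE_1 * BREAKPOINT + SLOPE_2 * (tt - BREAKPOINT)

# Example: a hypothetical 90-day series with a constant daily mean of 18 deg C.
tt = thermal_time([18.0] * 90)
print(f"TT = {tt:.1f} deg C day, predicted nodes = {nodes(tt):.1f}")
```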

Relevância:

30.00%

Publicador:

Resumo:

The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, to predict the runtime behaviour of the solving algorithms and let us choose the best algorithm to solve the problem. Although it seems that the obvious choice should be MAC, the experimental results obtained so far show that with large numbers of variables other algorithms perform much better, especially for hard problems in the transition phase.
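The abstract does not specify which instance parameters or which statistical model were used, so the following is only a hedged sketch of the general idea: fit one regression model per algorithm on simple instance features and pick the algorithm with the lowest predicted runtime. The feature set, the algorithms listed and all numbers are invented placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative features of a CSP instance; not the parameters actually used in the work.
# Each row: [number of variables, mean domain size, constraint density, mean constraint tightness]
X_train = np.array([
    [20, 10, 0.30, 0.40],
    [50, 10, 0.25, 0.55],
    [100, 8, 0.20, 0.60],
    [200, 8, 0.15, 0.65],
])
# Observed solving times (seconds) for two algorithms on the training instances (made-up numbers).
runtimes = {
    "MAC": np.array([0.4, 2.1, 35.0, 400.0]),
    "FC":  np.array([0.9, 3.0, 20.0, 150.0]),
}

# One regression model per algorithm, predicting runtime from instance features.
models = {name: RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y)
          for name, y in runtimes.items()}

def choose_algorithm(features):
    """Pick the algorithm with the lowest predicted runtime for a new instance."""
    preds = {name: float(m.predict([features])[0]) for name, m in models.items()}
    return min(preds, key=preds.get), preds

best, preds = choose_algorithm([150, 9, 0.18, 0.62])
print(best, preds)
```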

Relevância:

30.00%

Publicador:

Resumo:

In the present research we set forth a new, simple trade-off model that allows us to calculate how much debt and, by implication, how much equity a company should have, using easily available information and calculating the cost of debt dynamically on the basis of the effect that the company's capital structure has on the risk of bankruptcy. The proposed model has been applied to the companies that made up the Dow Jones Industrial Average (DJIA) in 2007. We used consolidated financial data from 1996 to 2006, published by Bloomberg. We used the simplex optimization method to find the debt level that maximizes firm value. We then compared the estimated debt with the real debt of the companies using the nonparametric Mann-Whitney test. The results indicate that 63% of the companies do not show a statistically significant difference between real and estimated debt.
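As a sketch of the final comparison step only: the nonparametric Mann-Whitney test named above is available in SciPy. The debt figures below are invented placeholders, not the Bloomberg data used in the study.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Placeholder figures (USD billions); the study's actual data are not reproduced here.
real_debt = np.array([12.4, 8.1, 30.2, 5.7, 18.9, 22.3, 9.8, 14.0])
estimated_debt = np.array([10.9, 9.5, 27.8, 6.3, 20.1, 19.7, 11.2, 13.5])

# Two-sided Mann-Whitney U test of whether the two samples come from the same distribution.
stat, p_value = mannwhitneyu(real_debt, estimated_debt, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No statistically significant difference between real and estimated debt.")
```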

Relevância:

30.00%

Publicador:

Resumo:

Background: How do listeners manage to recognize words in an unfamiliar language? The physical continuity of the signal, in which real silent pauses between words are lacking, makes it a difficult task. However, there are multiple cues that can be exploited to localize word boundaries and to segment the acoustic signal. In the present study, word-stress was manipulated together with statistical information and placed on different syllables within trisyllabic nonsense words to explore the effect of combining these cues in an online word segmentation task. Results: The behavioral results showed that words were segmented better when stress was placed on the final syllable than when it was placed on the middle or the first syllable. The electrophysiological results showed an increase in the amplitude of the P2 component, which seemed to be sensitive to word-stress and its location within words. Conclusion: The results demonstrated that listeners can integrate specific prosodic and distributional cues when segmenting speech. An ERP component related to word-stress cues was identified: stressed syllables elicited larger amplitudes in the P2 component than unstressed ones.

Relevância:

30.00%

Publicador:

Resumo:

The Annonaceae includes cultivated species of economic interest and represents an important source of information for better understanding the evolution of tropical rainforests. In phylogenetic analyses of DNA sequence data that are used to address evolutionary questions, it is imperative to use appropriate statistical models. The Annonaceae is a case in point: two sister clades, the subfamilies Annonoideae and Malmeoideae, contain the majority of Annonaceae species diversity. The Annonoideae generally show a greater degree of sequence divergence compared to the Malmeoideae, resulting in stark differences in branch lengths in phylogenetic trees. Uncertainty in how to interpret and analyse these differences has led to inconsistent results when estimating the ages of clades in Annonaceae using molecular dating techniques. We ask whether these differences may be attributed to inappropriate modelling assumptions in the phylogenetic analyses. Specifically, we test for (clade-specific) differences in the rates of non-synonymous and synonymous substitutions. A high ratio of non-synonymous to synonymous substitutions may lead to similarity of DNA sequences due to convergence instead of common ancestry, and as a result confound phylogenetic analyses. We use a dataset of three chloroplast genes (rbcL, matK, ndhF) for 129 species representative of the family. We find that the differences in branch lengths between the major clades are not attributable to different rates of non-synonymous and synonymous substitutions. The differences in evolutionary rate between the major clades of Annonaceae pose a challenge for current molecular dating techniques, and should be seen as a warning for the interpretation of such results in other organisms.
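For reference, the quantity at issue is conventionally summarised by the ratio of non-synonymous to synonymous substitution rates; the standard formulation below is general background, not an equation taken from the paper:

\omega = \frac{d_N}{d_S}

where d_N is the number of non-synonymous substitutions per non-synonymous site and d_S the number of synonymous substitutions per synonymous site. \omega \approx 1 indicates neutral evolution, \omega < 1 purifying selection and \omega > 1 positive selection; an elevated \omega can make sequences similar through convergence rather than common ancestry, which is the confounding effect the authors test for.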

Relevância:

30.00%

Publicador:

Resumo:

In this thesis a model for managing the product data in a product transfer project was created for ABB Machines. This model was then applied to an ongoing product transfer project during its planning phase. Detailed information about the demands and challenges in product transfer projects was acquired by analyzing previous product transfer projects in the participating organizations. This analysis and the ABB Gate Model were then used as the basis for the creation of the model for managing the product data in a product transfer project. The created model shows the main tasks during each phase of the project, their sub-tasks and their relations at a general level. Furthermore, the model emphasizes the need for a detailed analysis of the situation during the project planning phase. The created model for managing the product data in a product transfer project was applied to two main areas of the ongoing project: manufacturing instructions and production item data. The results showed that the greatest challenge for the product transfer project in these areas is the current state of the product data. Based on the findings, process and resource proposals were given for both the ongoing product transfer project and BU Machines. For the manufacturing instructions, it is necessary to create detailed process instructions in the receiving organization's own language for each department, so that the manufacturing instructions can be used as training material during the training in the sending organization. For the production item data, the English version of the bill of materials needs to be entirely in English. In addition, it needs to be ensured that the bill of materials is updated and that these changes are implemented before the training in the sending organization begins.

Relevância:

30.00%

Publicador:

Resumo:

In the very volatile high-technology industry it is of utmost importance to accurately forecast customer demand. However, statistical forecasting of sales, especially in the heavily competitive electronics product business, has always been a challenging task due to very high variation in demand and the very short life cycles of products. The purpose of this thesis is to validate whether statistical methods can be applied to forecasting sales of short-life-cycle electronics products and to provide a feasible framework for implementing statistical forecasting in the environment of the case company. Two different approaches have been developed, one for short- and medium-term and one for long-term forecasting horizons. Both approaches are based on decomposition models, but differ in the interpretation of the model residuals. For long-term horizons the residuals are assumed to represent white noise, whereas for short- and medium-term horizons the residuals are modeled using statistical forecasting methods. Both approaches are implemented in Matlab. The modeling results have shown that different markets exhibit different demand patterns and that different analytical approaches are therefore appropriate for modeling demand in these markets. Moreover, the outcomes of the modeling imply that statistical forecasting cannot be handled separately from judgmental forecasting, but should be perceived only as a basis for judgmental forecasting activities. Based on the modeling results, recommendations for further deployment of statistical methods in the sales forecasting of the case company are developed.
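The thesis implements its models in Matlab; purely as an illustration of the decomposition-plus-residual idea described above, the following Python sketch decomposes a synthetic monthly sales series and then fits a low-order model to the residuals for short/medium-term use. The series, the ARMA order and all parameter values are assumptions, not the case company's data or the thesis's actual models.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly sales series (trend + yearly seasonality + noise); not data from the thesis.
rng = np.random.default_rng(0)
t = np.arange(48)
sales = 100 + 2.0 * t + 15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, size=48)
series = pd.Series(sales, index=pd.date_range("2020-01-01", periods=48, freq="MS"))

# Decompose into trend, seasonal and residual components.
decomp = seasonal_decompose(series, model="additive", period=12)
resid = decomp.resid.dropna()

# Long-term view: treat residuals as white noise and forecast trend + seasonality only.
# Short/medium-term view: additionally model the residuals, e.g. with a low-order ARMA.
resid_model = ARIMA(resid, order=(1, 0, 1)).fit()
print(resid_model.forecast(steps=3))
```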

Relevância:

30.00%

Publicador:

Resumo:

We present a model for transport in multiply scattering media based on a three-dimensional generalization of the persistent random walk. The model assumes that photons move along directions that are parallel to the axes. Although this hypothesis is not realistic, it allows us to solve exactly the problem of multiple scattering propagation in a thin slab. Among other quantities, the transmission probability and the mean transmission time can be calculated exactly. Besides being completely solvable, the model could be used as a benchmark for approximation schemes for multiple light scattering.
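A minimal Monte Carlo sketch of the kind of walk described above: photons move only along the coordinate axes, persist in their direction between scattering events, and a walk ends when it leaves a slab of given thickness; the transmission probability and mean transmission time are then estimated from the exiting walkers. The step-length distribution, persistence parameter and slab thickness are illustrative assumptions (the paper itself solves the model exactly rather than by simulation).

```python
import numpy as np

rng = np.random.default_rng(1)

# Six allowed directions: photons move only along the coordinate axes, as in the model.
AXES = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)

def simulate_photon(slab_thickness, mean_free_path, persistence, max_steps=10_000):
    """Return (transmitted?, path time) for one photon entering the slab at z = 0 along +z."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    time = 0.0
    for _ in range(max_steps):
        step = rng.exponential(mean_free_path)   # assumed exponential free path
        pos += step * direction
        time += step                             # unit speed: time equals path length
        if pos[2] >= slab_thickness:
            return True, time                    # transmitted through the far face
        if pos[2] < 0.0:
            return False, time                   # reflected back out of the slab
        # Persistent walk: keep the current direction with probability `persistence`,
        # otherwise scatter uniformly into one of the six axis directions.
        if rng.random() >= persistence:
            direction = AXES[rng.integers(len(AXES))]
    return False, time

n_photons = 20_000
results = [simulate_photon(slab_thickness=5.0, mean_free_path=1.0, persistence=0.6)
           for _ in range(n_photons)]
transmitted = [t for ok, t in results if ok]
print(f"Transmission probability ~ {len(transmitted) / n_photons:.3f}")
print(f"Mean transmission time   ~ {np.mean(transmitted):.2f}")
```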

Relevância:

30.00%

Publicador:

Resumo:

Language diversity has become greatly endangered in the past centuries owing to processes of language shift from indigenous languages to other languages that are seen as socially and economically more advantageous, resulting in the death or doom of minority languages. In this paper, we define a new language competition model that can describe the historical decline of minority languages in competition with more advantageous languages. We then implement this non-spatial model as an interaction term in a reaction-diffusion system to model the evolution of the two competing languages. We use the results to estimate the speed at which the more advantageous language spreads geographically, resulting in the shrinkage of the area of dominance of the minority language. We compare the results from our model with the observed retreat in the area of influence of the Welsh language in the UK, obtaining good agreement between the model and the observed data.
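The abstract does not reproduce the model's equations; as a generic illustration only, a two-language reaction-diffusion system of the kind described typically takes the form

\frac{\partial u}{\partial t} = D \,\nabla^{2} u + f(u, v), \qquad \frac{\partial v}{\partial t} = D \,\nabla^{2} v + g(u, v),

where u and v are the local densities of speakers of the advantageous and the minority language, D is a diffusion coefficient describing population dispersal, and the interaction terms f and g come from the non-spatial competition model. For classical Fisher-type dynamics an invading population advances as a front whose asymptotic speed scales as c = 2\sqrt{D r}, with r the initial growth rate of the invader; the front speed estimated from the authors' own model is what is compared with the observed retreat of Welsh.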

Relevância:

30.00%

Publicador:

Resumo:

In the implementation of CLIL in higher education, apart from studies on students' language level and the availability of teaching staff, and from the development of interdisciplinary teaching materials, the current challenge is to get content lecturers from a wide range of disciplines actively involved in CLIL. This paper presents the basis of a model for a CLIL system using Newtonian dynamics. It may be an interesting and plausible model in a scientific and technological university context, where CLIL has so far been implemented only to a limited extent.

Relevância:

30.00%

Publicador:

Resumo:

Computed tomography (CT) is an imaging technique in which interest has grown rapidly since its introduction in the early 1970s. Today, it has become an extensively used modality because of its ability to produce accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionising radiation on the population. Among those negative effects, one of the major risks remaining is the development of cancers associated with exposure to diagnostic X-ray procedures. In order to ensure that the benefit-risk ratio remains in favour of the patient, it is necessary to make sure that the delivered dose leads to the proper diagnosis without producing unnecessarily high-quality images. This optimisation scheme is already an important concern for adult patients, but it must become an even greater priority when examinations are performed on children or young adults, in particular in follow-up studies which require several CT procedures over the patient's life. Indeed, children and young adults are more sensitive to radiation due to their faster metabolism. In addition, harmful consequences have a higher probability of occurring because of a younger patient's longer life expectancy. The recent introduction of iterative reconstruction algorithms, which were designed to substantially reduce dose, is certainly a major achievement in CT evolution, but it has also created difficulties in the quality assessment of the images produced using those algorithms. The goal of the present work was to propose a strategy to investigate the potential of iterative reconstructions to reduce dose without compromising the ability to answer the diagnostic questions. The major difficulty lies in having a clinically relevant way to estimate image quality. To ensure the choice of pertinent image quality criteria, this work was continuously performed in close collaboration with radiologists. The work began by tackling the way to characterise image quality when dealing with musculo-skeletal examinations. We focused, in particular, on the behaviour of image noise and spatial resolution when iterative image reconstruction was used. The analyses of these physical parameters allowed radiologists to adapt their image acquisition and reconstruction protocols while knowing what loss of image quality to expect. This work also dealt with the loss of low-contrast detectability associated with dose reduction, a major concern when dealing with patient dose reduction in abdominal investigations. Knowing that alternatives to the classical Fourier-space metrics had to be used to assess image quality, we focused on the use of mathematical model observers. Our experimental parameters determined the type of model to use. Ideal model observers were applied to characterise image quality when purely objective results about signal detectability were sought, whereas anthropomorphic model observers were used in a more clinical context, when the results had to be compared with the judgement of human observers, taking advantage of these models' incorporation of elements of the human visual system. This work confirmed that the use of model observers makes it possible to assess image quality using a task-based approach, which, in turn, establishes a bridge between medical physicists and radiologists. It also demonstrated that statistical iterative reconstructions have the potential to reduce the delivered dose without impairing the quality of the diagnosis. Among the different types of iterative reconstructions, model-based ones offer the greatest potential, since images produced using this modality can still lead to an accurate diagnosis even when acquired at very low dose. This work has clarified the role of medical physicists when dealing with CT imaging: the standard metrics used in the field remain quite important for assessing unit compliance with legal requirements, but the use of a model observer is the way to go when dealing with the optimisation of imaging protocols.
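As a sketch of the task-based approach mentioned above, the following computes a detectability index with a simple non-prewhitening (NPW) model observer on simulated signal-present and signal-absent images. It is an illustrative toy only, not the observers, phantoms or data used in the thesis (which relied on ideal and anthropomorphic model observers); the signal shape, noise model and image sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_images(n, size=64, signal_amplitude=0.0, noise_sigma=1.0):
    """Simulate n noisy images with an optional low-contrast Gaussian disc in the centre."""
    y, x = np.mgrid[:size, :size]
    signal = signal_amplitude * np.exp(-((x - size / 2) ** 2 + (y - size / 2) ** 2) / (2 * 6.0 ** 2))
    return signal + rng.normal(0.0, noise_sigma, size=(n, size, size))

signal_present = make_images(200, signal_amplitude=0.4)
signal_absent = make_images(200, signal_amplitude=0.0)

# Non-prewhitening observer: the template is the mean difference image,
# and the decision variable is the scalar product of the template with each image.
template = signal_present.mean(axis=0) - signal_absent.mean(axis=0)
t_present = np.tensordot(signal_present, template, axes=([1, 2], [0, 1]))
t_absent = np.tensordot(signal_absent, template, axes=([1, 2], [0, 1]))

# Detectability index d': separation of the two decision-variable distributions.
d_prime = (t_present.mean() - t_absent.mean()) / np.sqrt(0.5 * (t_present.var() + t_absent.var()))
print(f"d' = {d_prime:.2f}")
```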

Relevância:

30.00%

Publicador:

Resumo:

We use two coupled equations to analyze the space-time dynamics of two interacting languages. Firstly, we introduce a cohabitation model, which is more appropriate for human populations than classical (non-cohabitation) models. Secondly, using numerical simulations we find the front speed of a new language spreading into a region where another language was previously used. Thirdly, for a special case we derive an analytical formula that makes it possible to check the validity of our numerical simulations. Finally, as an example, we find that the observed front speed for the spread of the English language into Wales in the period 1961-1981 is consistent with the model predictions. We also find that the effects of linguistic parameters are much more important than those of parameters related to population dispersal and reproduction. If the initial population densities of both languages are similar, they have no effect on the front speed. We outline the potential of the new model to analyze relationships between language replacement and genetic replacement.

Relevância:

30.00%

Publicador:

Resumo:

The optimal design of a heat exchanger system is based on given model parameters together with given standard ranges for the machine design variables. The goals of minimizing the Life Cycle Cost (LCC) function, which represents the price of the saved energy, of maximizing the momentary heat recovery output with the given constraints satisfied, and of taking into account the uncertainty in the models were successfully achieved. The Nondominated Sorting Genetic Algorithm II (NSGA-II) for the design optimization of the system is presented and implemented in the Matlab environment. Markov Chain Monte Carlo (MCMC) methods are also used to take into account the uncertainty in the models. The results show that the price of saved energy can be optimized. A wet heat exchanger is found to be more efficient and beneficial than a dry heat exchanger, even though its construction is more expensive (160 EUR/m2) than that of a dry heat exchanger (50 EUR/m2). It was found that a longer lifetime favours higher CAPEX and lower OPEX, and vice versa, and the effect of the uncertainty in the models was identified in a simplified case of minimizing the area of a dry heat exchanger.
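A minimal sketch of the kind of two-objective NSGA-II setup described above, written with the pymoo library rather than Matlab. Only the 160 and 50 EUR/m2 unit costs come from the abstract; the design variables, heat-recovery expression and the simplified "price of saved energy" objective are invented placeholders, not the thesis's actual LCC model.

```python
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize

class HeatExchangerDesign(ElementwiseProblem):
    """Toy two-objective problem: minimize cost per unit of recovered heat, maximize recovery."""
    def __init__(self):
        # Design variables: exchanger area (m2) and air flow rate (arbitrary units).
        super().__init__(n_var=2, n_obj=2, xl=np.array([1.0, 0.1]), xu=np.array([100.0, 5.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        area, flow = x
        capex = 160.0 * area                                  # wet exchanger, EUR/m2 (from the abstract)
        recovered = 8.0 * flow * area / (1.0 + 0.05 * area)   # placeholder heat-recovery model, kW
        price_of_saved_energy = capex / recovered             # simplified stand-in for the LCC objective
        out["F"] = [price_of_saved_energy, -recovered]        # both objectives are minimized

res = minimize(HeatExchangerDesign(), NSGA2(pop_size=50), ("n_gen", 100), seed=1, verbose=False)
print(res.F[:5])   # a few points on the approximated Pareto front
```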

Relevância:

30.00%

Publicador:

Resumo:

The purpose of this comparative study is to profile second language learners by exploring the factors which have an impact on their learning. The subjects come from two different countries: one group comes from Milwaukee, US, and the other from Turku, Finland. The subjects attended bilingual classes from elementary school to senior high school in their respective countries. In the United States, the subjects (N = 57) started in one elementary school, from where they moved on to two high schools in the district. The Finnish subjects (N = 39) attended the same school from elementary to high school. The longitudinal study was conducted during 1994-2004 and combines both qualitative and quantitative research methods. A Pilot Study carried out in 1990-1991 preceded the two subsequent studies that form the core material of this research. The theoretical part of the study focuses first on language policies in the United States and Finland: special emphasis is given to the history, development and current state of bilingual education, and to the factors that have affected policy-making in the provision of language instruction. Current language learning theories and models form the theoretical foundation of the research and underpin the empirical studies. Cognitively-labeled theories are at the forefront, but sociocultural theory and the ecological approach are also accounted for. The research methods consist of questionnaires, compositions and interviews. A combination of statistical methods and content analysis was used in the analysis. The attitude of the bilingual learners toward L1 and L2 was generally positive: the subjects enjoyed learning through two languages and were motivated to learn both. Knowledge of L1 and parental support, along with early literacy in L1, facilitated the learning of L2. This was particularly evident in the American subject group. The American subjects' L2 learning was affected by the attitudes of the learners to the L1 culture and its speakers. Furthermore, the negative attitudes taken by L1 speakers toward L2 speakers and the lack of opportunities to engage in activities in the L1 culture affected the American subjects' learning of L2, English. The research showed that many American L2 learners were isolated from the L1 culture and were even afraid to use English in everyday communication situations. In light of the research results, the politically neutral linguistic environment which the Finnish subjects inhabited was seen to be more favorable for learning. The Finnish subjects were learning L2, English, in a neutral zone where their own attitudes and motivation dictated their learning. The role of L2 as a means of international communication in Finland, as opposed to a means of exercising linguistic power, provided a neutral atmosphere for learning English. In both the American and Finnish groups, the learning of other languages was facilitated when the learner had a good foundation in their L1 and the learning of L1 and L2 was in balance. Learning was also fostered when the learners drew positive experiences from their surroundings and were provided with opportunities to engage in activities where L2 was used.