904 results for Process-dissociation Framework


Relevance: 30.00%

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance: 30.00%

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance: 30.00%

Abstract:

Nowadays, the consumption of goods and services on the Internet is increasing steadily. Small and Medium Enterprises (SMEs), mostly from traditional industry sectors, usually do business in weak and fragile market sectors, where customized products and services prevail. To survive and compete in today's markets, they have to readjust their business strategies by creating new manufacturing processes and establishing new business networks through new technological approaches. To compete with big enterprises, these partnerships aim at sharing resources, knowledge and strategies to boost the sector's business consolidation through the creation of dynamic manufacturing networks. To meet this demand, the development of a centralized information system is proposed, allowing enterprises to select and create dynamic manufacturing networks capable of monitoring the entire manufacturing process, including the assembly, packaging and distribution phases. Even networking partners that come from the same area have multiple, heterogeneous representations of the same knowledge, denoting their own view of the domain. Thus, conceptually, semantically and, consequently, lexically diverse knowledge representations may occur in the network, causing non-transparent sharing of information and interoperability inconsistencies. A framework, supported by a tool, that flexibly enables the identification, classification and resolution of such semantic heterogeneities is therefore required. This tool will support the network in establishing semantic mappings, facilitating the integration of the various enterprises' information systems.
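The semantic-mapping support described above can be sketched minimally: each partner's local terms are mapped to a shared reference vocabulary, and terms with no agreed mapping are flagged as heterogeneities to be resolved. All names here (the partner vocabularies, the reference terms) are invented for illustration.

```python
# Minimal sketch of term-level semantic mapping between network partners.
# Vocabularies and mappings are illustrative, not from the described system.

REFERENCE_VOCABULARY = {"assembly", "packaging", "distribution"}

# Each partner's local term -> shared reference term (hand-curated mappings).
PARTNER_MAPPINGS = {
    "partner_a": {"montagem": "assembly", "embalagem": "packaging"},
    "partner_b": {"final-assembly": "assembly", "shipping": "distribution"},
}

def classify_terms(partner, terms):
    """Split a partner's terms into resolved mappings and unresolved
    heterogeneities that still need a mapping decision."""
    mapping = PARTNER_MAPPINGS.get(partner, {})
    resolved, unresolved = {}, []
    for term in terms:
        target = mapping.get(term)
        if target in REFERENCE_VOCABULARY:
            resolved[term] = target
        else:
            unresolved.append(term)
    return resolved, unresolved

resolved, unresolved = classify_terms("partner_a", ["montagem", "pintura"])
print(resolved)    # {'montagem': 'assembly'}
print(unresolved)  # ['pintura']
```

In a real deployment the mappings would be established with the support of the proposed tool rather than hand-coded, but the resolved/unresolved split captures the identification-and-classification step.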

Relevance: 30.00%

Abstract:

The present Work Project (WP) is the result of Sonae's concern with fraud risk, seeking to implement a method that formally describes and evaluates it in its various forms. In a context of limited human, capital, time and tooling resources, the company's Internal Audit (IA) department developed a framework to raise the awareness of top management and to identify which processes of its value chain present a higher level of exposure to fraud, with the purpose of redirecting attention to those processes and prioritizing the creation of new mechanisms to monitor the dynamics of their KPIs.
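A common way to rank processes by fraud exposure, which a framework of this kind might use, is a simple likelihood-by-impact score. The process names and scales below are invented for the example, not taken from Sonae's framework.

```python
# Illustrative likelihood x impact ranking of value-chain processes.
# Process names and the 1-5 scales are assumptions for the example.

def fraud_exposure(processes):
    """Rank processes by likelihood * impact, highest exposure first."""
    scored = [(name, likelihood * impact)
              for name, (likelihood, impact) in processes.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

processes = {
    "procurement": (4, 5),   # (likelihood, impact), each rated 1-5
    "payroll": (2, 4),
    "inventory": (3, 3),
}
print(fraud_exposure(processes))
# [('procurement', 20), ('inventory', 9), ('payroll', 8)]
```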

Relevance: 30.00%

Abstract:

The year is 2015 and the startup and tech business ecosphere has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding – the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail – but the general consensus amongst top tech executives is that "startups make products that no one wants" (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. It was in this book that he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today's fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customer's time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process, in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, for the purpose of developing a functional business model and business plan. The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.

Relevance: 30.00%

Abstract:

The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests, so as to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each testing typology, there is still a need for a global methodology that combines information from different sources and draws inferences upon that information in a decision process. In this scope, the present work presents the implementation of a probability-based framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework allowing for the structural assessment of existing timber elements, with the possibility of inference on, and updating of, their mechanical properties through Bayesian methods. The probabilistic framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is applied to a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
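The Bayesian updating step can be illustrated with the simplest conjugate case: a Normal prior on a strength property (e.g. derived from NDT correlations) updated with a few direct measurements of known precision. The distributional assumptions and all numbers below are illustrative, not the paper's actual model.

```python
import math

# Sketch of Bayesian updating of a timber strength property.
# Assumes a Normal prior and Normal measurements of known std (conjugate
# case); the prior and the observations are invented for the example.

def update_normal(prior_mean, prior_std, observations, meas_std):
    """Conjugate update of a Normal prior with Normal-likelihood data;
    returns the posterior mean and standard deviation."""
    n = len(observations)
    prior_prec = 1.0 / prior_std ** 2          # precision = 1 / variance
    data_prec = n / meas_std ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean
                            + data_prec * (sum(observations) / n))
    return post_mean, math.sqrt(post_var)

# Prior from an empirical NDT correlation; two semi-destructive results.
mean, std = update_normal(prior_mean=40.0, prior_std=8.0,
                          observations=[34.0, 36.0], meas_std=5.0)
print(round(mean, 2), round(std, 2))  # 35.82 3.23
```

Note how the posterior both shifts toward the measured values and tightens (posterior std 3.23 vs prior 8.0), which is exactly the inference-and-updating behaviour the framework relies on as new test data arrive.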

Relevance: 30.00%

Abstract:

This paper presents a framework of competences developed for Industrial Engineering and Management (IEM) that can be used as a tool for curriculum analysis and design, including the teaching and learning processes as well as the alignment of the curriculum with the professional profile. The framework was applied to the Industrial Engineering and Management program at the University of Minho (UMinho), Portugal, and provides an overview of the connection between IEM knowledge areas and the competences defined in its curriculum. The framework of competences was developed through a process of analysis using a combination of methods and sources for data collection, according to four main steps: 1) characterization of IEM knowledge areas; 2) definition of IEM competences; 3) survey; 4) application of the framework to the IEM curriculum. The findings showed that the framework is useful for building an integrated vision of the curriculum. The most visible aspect in the learning outcomes of the IEM program is the lack of balance between technical and transversal competences: there is almost no reference to transversal competences, and those that appear are fundamentally concentrated in Project-Based Learning courses. The framework presented in this paper contributes to the definition of the IEM professional profile through a set of competences that need to be explored further. In addition, it may be a relevant tool for IEM curriculum analysis and a contribution to bridging the gap between universities and companies.

Relevance: 30.00%

Abstract:

A novel framework for the probability-based structural assessment of existing structures, which combines model identification and reliability assessment procedures and considers different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications reported in the literature is given first. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in the convergence criterion of this algorithm. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model for the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm, explicitly addressing statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams, which were loaded up to failure in the laboratory.
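The reliability assessment step can be sketched with a crude Monte Carlo estimate of the failure probability for the textbook limit state g = R − S, from which a reliability index is back-calculated. The Normal models and all parameter values are invented for the example; the paper's actual probabilistic model is richer.

```python
import random
from statistics import NormalDist

# Monte Carlo sketch of a reliability assessment for the limit state
# g = R - S, with resistance R and load effect S as Normal variables.
# Distribution choices and numbers are assumptions for the example.

def reliability(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Estimate the failure probability P[g < 0] by sampling, then
    back-calculate the reliability index beta = -Phi^{-1}(P_f)."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0
    )
    pf = failures / n
    beta = -NormalDist().inv_cdf(pf) if pf > 0 else float("inf")
    return pf, beta

pf, beta = reliability(mu_r=40.0, sd_r=5.0, mu_s=20.0, sd_s=4.0)
print(round(beta, 2))  # analytic value: 20 / sqrt(5**2 + 4**2) ~ 3.12
```

In the framework described above, the probabilistic model feeding such an analysis would itself be updated by the Bayesian inference step as new data come in, so the reliability index is re-estimated over time rather than computed once.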

Relevance: 30.00%

Abstract:

Integrated Master's dissertation in Engenharia e Gestão de Sistemas de Informação (Information Systems Engineering and Management)

Relevance: 30.00%

Abstract:

Doctoral Thesis in Information Systems and Technologies, Area of Information Systems and Technology

Relevance: 30.00%

Abstract:

This research project seeks to systematize a post-foundationalist approach to political identities, with a view to constructing a theoretical-methodological framework for historical-political analysis. It starts from the growing relevance of the question of "identity" in the contemporary social sciences to examine the singularity of the theoretical framework adopted in its study. Post-foundationalism thus offers a theory of political subjectivities grounded in processes of identification that suppose a singular articulation between relative structurality and agency. The subject emerges in a context that is never fully sutured nor fully open, but through a knotting of different registers: real, symbolic and imaginary. This vision, we believe, is highly productive for generating meaningful conclusions in the fields of political science and comparative historical-political analysis.

Relevance: 30.00%

Abstract:

As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
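The kind of automated harness described above can be sketched as: run the algorithm under test over a suite of (input, ground-truth) pairs, score each result with an objective metric such as PSNR, and flag cases below a quality threshold. Images are plain lists of 8-bit pixel values here to keep the example dependency-free; the suite, the threshold, and the stand-in algorithm are all invented.

```python
import math

# Toy automated-evaluation harness for image processing algorithms.
# Test cases, threshold, and the identity "algorithm" are assumptions.

def psnr(reference, result, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length images."""
    mse = sum((a - b) ** 2 for a, b in zip(reference, result)) / len(reference)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

def evaluate(algorithm, suite, threshold_db=30.0):
    """Score every case and return ({case: psnr}, [failing cases])."""
    scores = {name: psnr(truth, algorithm(image))
              for name, (image, truth) in suite.items()}
    failures = [name for name, score in scores.items() if score < threshold_db]
    return scores, failures

identity = lambda image: image  # stand-in "algorithm" under test
suite = {
    "flat": ([128] * 64, [128] * 64),      # output matches ground truth
    "degraded": ([128] * 64, [100] * 64),  # output is 28 levels off
}
scores, failures = evaluate(identity, suite)
print(failures)  # ['degraded']
```

Hooking such a harness into a build pipeline is what shortens the development cycle: a regression in any test case is caught at the next run rather than at a late manual review.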

Relevance: 30.00%

Abstract:

Choline supplementation is assumed to improve memory functions in rodents by increasing the synthesis and release of acetylcholine in the brain. We have found that combined pre- and postnatal supplementation results in long-lasting facilitation of spatial memory in juvenile rats when training is conducted in the presence of a local salient cue. The present work was aimed at analysing the effects of peri- and postnatal choline supplementation on the spatial abilities of naive adult rats. Rats given perinatal choline supplementation were trained in various cued procedures of the Morris navigation task at the age of 5 months. The treatment had the specific effect of reducing the escape latency of the rats when the platform was at a fixed position in space and surrounded by a suspended cue. This effect was associated with an increased spatial bias when the cue and platform were removed. In this condition, the control rats showed impaired spatial discrimination following the removal of the target cue, most likely due to an overshadowing of the distant environmental cues. This impairment was not observed in the treated rats. Further training with the suspended cue at unpredictable places in the pool revealed longer escape latencies in the control than in the treated rats, suggesting that this procedure induced a selective perturbation of the normal but not of the treated rats. A special probe trial with the cue at an irrelevant position and no escape platform revealed a significant bias of the control rats toward the cue, and of the treated rats toward the uncued spatial escape position. This behavioural dissociation suggests that a salient cue associated with the target induces an alternative "non-spatial" guidance strategy in normal rats, with the risk of overshadowing the more distant spatial cues. In this condition, choline supplementation facilitates a spatial reliance on the cue, that is, an overall facilitation of learning a set of spatial relations between several visual cues. As a consequence, the improved escape in the presence of the cue is associated with a stronger memory of the spatial position following the disappearance of the cue. These and previous observations suggest that a specific spatial attention process relies on the buffering of highly salient visual cues to facilitate the integration of their relative position in the environment.

Relevance: 30.00%

Abstract:

Abstract: This article presents both a brief systemic intervention method (IBS) consisting of six sessions, developed in an ambulatory service for couples and families, and two research projects carried out in collaboration with the Institute for Psychotherapy of the University of Lausanne. The first project is quantitative and aims at evaluating the effectiveness of IBS. One of its main features is that outcomes are assessed at different levels of individual and family functioning: 1) symptoms and individual functioning; 2) quality of the marital relationship; 3) parental and co-parental relationships; 4) familial relationships. The second project is a qualitative case study of a marital therapy which identifies and analyses significant moments of the therapeutic process from the patients' perspective. The methodology was largely inspired by Daniel Stern's work on "moments of meeting" in psychotherapy. Results show that patients' theories about relationship and change are important elements that deepen our understanding of the change process in couple and family therapy. The interest of associating clinicians and researchers in the development and validation of a new clinical model is discussed.

Relevance: 30.00%

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
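The likelihood-ratio idea referred to above can be sketched numerically: a similarity score between two ink samples is evaluated under a within-source and a between-source score distribution, and their density ratio quantifies the support for the same-source proposition. Both distributions are taken as Normal here with invented parameters; in practice they would be estimated from an ink reference database, as the text describes.

```python
from statistics import NormalDist

# Sketch of likelihood-ratio evaluation of an ink similarity score.
# The Normal score models and their parameters are assumptions; real
# parameters would come from an ink reference database.

WITHIN = NormalDist(mu=0.92, sigma=0.04)   # scores when inks share a source
BETWEEN = NormalDist(mu=0.55, sigma=0.15)  # scores for different sources

def likelihood_ratio(score):
    """Ratio of score densities under the two propositions; LR > 1
    supports the same-source proposition, LR < 1 the different-source one."""
    return WITHIN.pdf(score) / BETWEEN.pdf(score)

print(likelihood_ratio(0.90) > 1.0)   # True: high similarity favours same source
print(likelihood_ratio(0.50) > 1.0)   # False: low similarity favours different
```

This is exactly why the reproducibility problem matters: if repeated HPTLC analyses of the same ink scatter widely, the within-source distribution broadens and the likelihood ratio loses its discriminating power.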