910 results for Process-based model (PBM)
Abstract:
Addresses the problem of selecting integrated systems, or Enterprise Resource Planning (ERP) systems, examining the process specifically from the standpoint of Decision Analysis. It analyzes the association between satisfaction, both with the selected ERP system and with the way the selection process itself was structured, and variables chosen specifically for this purpose, representing, among other things, groups of decision criteria and particular characteristics of the selection process, the latter related to issues such as how risk was treated and the possibly collective nature of the decision. It questions whether the ERP selection process can be modeled on the basis of the normative proposal offered by Utility Theory, and examines the supposed existence of a gap between that proposal and actual practice in the selection process. It proposes a generic mental model that seeks to explain how decision makers approach the ERP selection problem, and presents a dynamic model that would account for the aforementioned gap in terms of the generic mental model's inability to capture the full complexity inherent in the ERP selection problem.
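As a generic illustration of the normative proposal from Utility Theory discussed above (a sketch, not the thesis's own model), the snippet below scores ERP alternatives with a weighted multi-attribute utility function; all criteria, weights, and scores are hypothetical placeholders.

```python
# Generic multi-attribute utility scoring of ERP alternatives (illustrative only;
# criteria, weights, and scores are hypothetical placeholders).
def utility(scores, weights):
    """scores: criterion -> utility in [0, 1]; weights should sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

alternatives = {
    "ERP A": {"functional_fit": 0.8, "cost": 0.5, "vendor_support": 0.7, "risk": 0.6},
    "ERP B": {"functional_fit": 0.6, "cost": 0.8, "vendor_support": 0.6, "risk": 0.8},
}
weights = {"functional_fit": 0.4, "cost": 0.3, "vendor_support": 0.2, "risk": 0.1}

best = max(alternatives, key=lambda a: utility(alternatives[a], weights))
print(best, {a: round(utility(s, weights), 3) for a, s in alternatives.items()})
```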
Abstract:
Technological advances, especially those related to information and telecommunication technology, have transformed organizations and society, and are a source of corporate conflict because of the changes they produce in people's lives and in the ways work is done. This study analyzes the contribution of IT, particularly software products, to the debated shift from the Fordist to the post-Fordist paradigm. The adoption of these technologies can constitute a core competence and a competitive factor for the firm; however, the impacts of this new, IT-intensive business dynamic still require further study to determine whether it involves an effective process of flexibilization or merely a scheme for cost reduction. The work is based on a case study, through a survey of software product users in a large company in the electricity sector, to assess the contribution of these products to organizational flexibilization. The results indicate the existence of factors that facilitate flexibilization supported by the use of computerized systems. Software products significantly change the communication process and bring people at different hierarchical levels closer together. However, the technology is not yet widely used as a resource to make work relations more flexible, particularly with respect to performing activities at unconventional places and working hours. To operationalize a flexible management model with post-Fordist characteristics, a new profile of labor relations needs to be developed, since these relations still bear characteristics of the Fordist production model.
Abstract:
This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies that conclude that macroeconomic models can explain short-run exchange rates, as well as studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To that end, we use a sample of 14 currencies against the US dollar, which allows the generation of monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we include currencies of both developed and developing countries. Our results corroborate the study by Rogoff and Stavrakeva (2008), in that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that it is not possible to claim that the implemented model provides more accurate forecasts than a random walk, we assess whether the model is at least able to generate "rational", or "consistent", forecasts. For this purpose, we use the theoretical and empirical framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are "inconsistent". Finally, we run Granger causality tests to verify whether lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
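As a rough illustration of the forecasting exercise described above (a sketch under assumptions, not the study's implementation), the code below produces rolling one-month-ahead out-of-sample forecasts from a Taylor-rule-style regression and compares their mean squared error with that of a driftless random walk; the column names and window length are hypothetical.

```python
# Minimal sketch: rolling one-month-ahead out-of-sample forecasts of the log
# exchange-rate change from Taylor-rule fundamentals vs. a driftless random walk.
# Column names ('ds', 'infl_diff', 'gap_diff', 'rate_diff') are placeholders;
# `df` is assumed to be a pandas DataFrame indexed by month.
import numpy as np
import statsmodels.api as sm

def rolling_oos_mse(df, window=120):
    X = sm.add_constant(df[["infl_diff", "gap_diff", "rate_diff"]])
    y = df["ds"]                        # next-month log exchange-rate change
    preds, actuals = [], []
    for t in range(window, len(df)):
        fit = sm.OLS(y.iloc[t - window:t], X.iloc[t - window:t]).fit()
        preds.append(np.asarray(fit.predict(X.iloc[[t]]))[0])
        actuals.append(y.iloc[t])
    preds, actuals = np.array(preds), np.array(actuals)
    mse_model = np.mean((actuals - preds) ** 2)
    mse_rw = np.mean(actuals ** 2)      # the random walk forecasts zero change
    return mse_model, mse_rw
```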
Abstract:
The body of theory on Absorptive Capacity deals with the management of information. The literature shows that this field of study has evolved mainly on the basis of process-based models. To make the concepts derived from this theory easier to use with the variety of data analysis techniques available, we identified the need to propose a scale for its constructs in a variance model. Among the various constructs, we chose to operationalize Value Recognition, the first construct in the Absorptive Capacity body of theory. This study consists of three main chapters presented as academic articles: the first proposes a scale for the Value Recognition construct, the second analyzes its formation through its antecedents, and the third tests it in an integrated way with other Absorptive Capacity constructs. We expect this work to contribute to the theoretical understanding of Absorptive Capacity theory, to enable further research applying the construct developed here, and to facilitate the managerial process of adopting and managing procedures that effectively enable the firm to recognize value when faced with an opportunity.
Abstract:
This paper investigates the cognitive processes that operate in understanding narratives, in this case the novel Macunaíma, by Mário de Andrade. Our work belongs to the field of embodiment-based Cognitive Linguistics and, due to its interdisciplinary nature, it dialogues with theoretical and methodological frameworks from Psycholinguistics, Cognitive Psychology and the Neurosciences. We adopt an exploratory research design, applying adapted recall and cloze tests to postgraduate students, all native speakers of Brazilian Portuguese. The choice of Macunaíma as the novel and initial motivation for this proposal is due to the fact that it is a fantastic narrative, consisting of events, circumstances and characters that are clearly distant from what is experienced in everyday life. The novel thus provides adequate data for investigating the configuration of meaning within an understanding-based model. We therefore seek to answer questions that are still scarcely explored in the field of Cognitive Linguistics, such as: to what extent is the activation of mental models (schemas and frames) related to the process of understanding narratives? How are we able to build meaning even when words or phrases are not part of our linguistic repertoire? Why do we become emotionally involved when reading a text, even though it is fiction? To answer them, we assume the theoretical stance that meaning is not in the text; it is constructed through language, conceived as the result of the integration between the biological apparatus (which gives rise to abstract image schemas) and the sociocultural apparatus (which gives rise to frames). In this sense, perception, cognitive processing, and the reception and transmission of the information described are directly related to how language comprehension occurs. We believe that the results of our study may contribute to cognitive studies of language and to the development of language learning and teaching methodologies.
Abstract:
The development of strategies for structural health monitoring (SHM) has become increasingly important because of the need to prevent undesirable damage. This paper describes an approach to this problem using vibration data. It involves a three-stage process: reduction of the time-series data using principal component analysis (PCA); the fitting of a data-based auto-regressive moving average (ARMA) model to data from an undamaged structure; and classification of whether or not the structure is damaged using a fuzzy clustering approach. The approach is applied to data from a benchmark structure from Los Alamos National Laboratory, USA. Two fuzzy clustering algorithms are compared: the fuzzy c-means (FCM) and Gustafson-Kessel (GK) algorithms. It is shown that while both fuzzy clustering algorithms are effective, the GK algorithm marginally outperforms the FCM algorithm.
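A minimal sketch of the three-stage pipeline described above, assuming `signals` holds multi-sensor vibration records (n_sensors × n_samples); it uses scikit-learn's PCA, statsmodels' ARIMA with zero differencing (i.e. ARMA), and scikit-fuzzy's c-means, and omits the Gustafson-Kessel variant.

```python
# Sketch of PCA -> ARMA -> fuzzy clustering for vibration-based damage detection.
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.arima.model import ARIMA
import skfuzzy as fuzz

def damage_features(signals, n_components=1, ar=4, ma=2):
    # Stage 1: reduce the multi-sensor data to a few principal-component series.
    pcs = PCA(n_components=n_components).fit_transform(signals.T)  # (n_samples, k)
    # Stage 2: fit an ARMA model per component; its coefficients are the features.
    feats = []
    for k in range(n_components):
        fit = ARIMA(pcs[:, k], order=(ar, 0, ma)).fit()
        feats.append(np.r_[fit.arparams, fit.maparams])
    return np.concatenate(feats)

def classify(feature_matrix, n_clusters=2):
    # Stage 3: fuzzy c-means over feature vectors from many records; the two
    # clusters nominally correspond to "undamaged" and "damaged" conditions.
    cntr, u, *_ = fuzz.cluster.cmeans(
        feature_matrix.T, c=n_clusters, m=2.0, error=1e-5, maxiter=1000)
    return u.argmax(axis=0)   # hard label from the maximum membership
```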
Abstract:
Model-oriented strategies have been used to facilitate product customization in the software product line (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations: it is essentially graphical, it falls short of precisely describing the semantic representation of the system architecture, and it produces large models, which hampers the visualization and comprehension of system elements. In contrast, architecture description languages (ADLs) provide graphical and textual support for the structural representation of architectural elements, their constraints and their interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. The strategy is associated with a generic process whose systematic activities enable customized source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD) and SPL, enabling the explicit modeling as well as the modularization of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga Digital TV middleware as a case study. To evaluate the efficiency, applicability, expressiveness, and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out comparing the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process of the GingaForAll UML-based strategy. Both tools were used to configure the products of the Ginga SPL and to generate the product source code.
Abstract:
RePART (Reward/Punishment ART) is a neural model that constitutes a variation of the Fuzzy ARTMAP model. The network was proposed in order to minimize problems inherent in ARTMAP-based models, such as category proliferation and misclassification. RePART makes use of additional mechanisms, such as an instance counting parameter, a reward/punishment process and a variable vigilance parameter. The instance counting parameter, for instance, aims to minimize the misclassification problem, which is a consequence of the sensitivity to noise frequently present in ARTMAP-based models. The variable vigilance parameter, in turn, tries to smooth out the category proliferation problem, which is inherent in ARTMAP-based models, decreasing the complexity of the network. RePART was originally proposed to minimize the aforementioned problems and was shown to perform better (higher accuracy and lower complexity) than other ARTMAP-based models. This work investigates the performance of the RePART model in classifier ensembles. Different ensemble sizes, learning strategies and structures are used in this investigation. As a result, we aim to identify the main advantages and drawbacks of this model when used as a component in classifier ensembles. This can provide a broader foundation for the use of RePART in other pattern recognition applications.
Abstract:
This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual-views-based pointcuts, is motivated by the shortcomings of traditional approaches to pointcut definition, which generally refer directly to the software structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes what is known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of every aspect must be reviewed to ensure that they remain valid after the change. Our approach focuses on defining pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of the business model elements based on common characteristics, and of relationships between these views. Pointcut definitions are thus created over the conceptual model rather than by referencing the base model directly. Moreover, the conceptual model contains a set of relationships that makes it possible to verify automatically whether the classifications in the conceptual model remain valid after a software change. To this end, development using the conceptual-views-based pointcuts approach is supported by a conceptual framework called CrossMDA2 and by an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how the use of conceptual-views-based pointcuts helps detect and minimize pointcut fragility. To evaluate the proposal, the Goal/Question/Metric (GQM) technique is used, together with metrics for analyzing the efficiency of pointcut definition.
Abstract:
A great challenge of component-based development is the creation of mechanisms that facilitate finding reusable assets that fulfill the requirements of a particular system under development. Some component repositories have been proposed to meet this need. However, repositories need to represent the asset characteristics that consumers take into account when choosing the most adequate assets for their needs. In this context, the literature presents several models for describing asset characteristics, such as identification, classification, non-functional requirements, usage and deployment information, and component interfaces. Nevertheless, the set of characteristics represented by those models is insufficient to describe information used before, during and after asset acquisition, such as negotiation, certification, change history, the adopted development process, events, and exceptions. To overcome this gap, this work proposes an XML-based model to represent several characteristics, of different asset types, that may be employed in component-based development. Besides representing metadata used by consumers, useful for asset discovery, acquisition and usage, this model, called X-ARM, also focuses on supporting asset developers' activities. Since the proposed model represents an expressive amount of information, this work also presents a tool called X-Packager, developed to help describe assets with X-ARM.
Abstract:
Qualitative research guided by the theoretical and methodological frameworks of Symbolic Interactionism and Grounded Theory, aimed at understanding the planning and implementation process of the Systematization of Nursing Care (SAE) according to two sample groups, nurses and nursing auxiliaries/technicians at a university hospital, and at developing a synthesis of the theoretical models representing these experiences. Theoretical saturation was reached with the analysis of the 24th non-directive interview, covering 12 nurses and 12 nursing technicians working in inpatient units. Two theoretical models emerged from the analysis, and their synthesis gave rise to a third, entitled "Between success and frustration with the operationalization of the SAE: human resources as a determining component for the nurse's visibility in the work process". This model reveals the shortage of human resources, which drives nurses to carry out an illusory SAE and perpetuates a cyclical process of suffering as they experience the invisibility of their praxis in the work process.
Abstract:
We present a generic spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics of the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four biomass maps, and a set of alternative parameters based on the recent literature. The most important results are as follows: (a) Using different biomass maps leads to large differences in emission estimates; for the entire Brazilian Amazon in the last decade, emission estimates from primary forest deforestation range from 0.21 to 0.26 Pg C yr⁻¹. (b) Secondary vegetation growth has a small impact on the emission balance because of the short life span of secondary vegetation; on average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 × 10³ km² in 2004 to 7 × 10³ km² in 2010. The INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving toward areas of higher biomass. The decrease is slower than a non-process, instantaneous model would estimate, because the framework accounts for residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr⁻¹ in 2004 to 0.15 Pg C yr⁻¹ in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr⁻¹. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates are still the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+. The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests.
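To make the distinction between process-based (residual) and instantaneous emission accounting concrete, here is a toy bookkeeping sketch, not the INPE-EM code; the burn fraction, decay horizon, biomass density and clearing rates are illustrative assumptions only.

```python
# Toy contrast between an "instantaneous" emission model and a simple
# process-based bookkeeping model in which part of the cleared carbon is
# released over subsequent years (slash decay, wood products).
import numpy as np

def instantaneous_emissions(cleared_km2, biomass_tC_per_km2):
    # All carbon in the cleared biomass is assumed emitted in the clearing year.
    return np.asarray(cleared_km2, dtype=float) * biomass_tC_per_km2

def bookkeeping_emissions(cleared_km2, biomass_tC_per_km2,
                          burn_frac=0.5, decay_years=10):
    # A fraction burns immediately; the rest decays linearly over `decay_years`.
    cleared = np.asarray(cleared_km2, dtype=float)
    n = len(cleared)
    emissions = np.zeros(n + decay_years)
    for year, area in enumerate(cleared):
        carbon = area * biomass_tC_per_km2
        emissions[year] += burn_frac * carbon
        emissions[year + 1:year + 1 + decay_years] += (
            (1 - burn_frac) * carbon / decay_years)
    return emissions[:n]

# Example: declining clearing rates with a hypothetical mean biomass density.
rates = [27e3, 19e3, 14e3, 11e3, 12e3, 7e3]        # km2 per year (illustrative)
print(bookkeeping_emissions(rates, biomass_tC_per_km2=1.5e4))  # tC per year
```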
Abstract:
This work deals with noise removal using an edge-preserving method whose parameters are automatically estimated, for any application, simply by providing the standard deviation of the noise level we wish to eliminate. In a partial-differential-equation-based model, the desired noiseless image u(x) can be viewed as the solution of an evolutionary differential equation u_t(x) = F(u_xx, u_x, u, x, t), which means that the true solution is reached as t → ∞. In practical applications we must stop the time t at some moment during this evolutionary process. This work presents a sufficient condition, relating the time t to the standard deviation σ of the noise we wish to remove, which yields a constant T such that u(x, T) is a good approximation of u(x). The approach focuses on edge preservation during noise elimination as its main characteristic. The balance between edge points and interior points is controlled by a function g that depends on the initial noisy image u(x, t0), the standard deviation of the noise we want to eliminate, and a constant k. The estimation of the parameter k is also presented in this work, thus making the proposed model automatic. The model's feasibility and the choice of the optimal stopping time are demonstrated through the various experimental results.
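As a rough sketch of the kind of edge-preserving evolution described above (a Perona-Malik-type scheme used here for illustration, not the authors' exact model), the code below evolves the image in pseudo-time and stops once the removed component reaches the target noise standard deviation sigma, a simple stand-in for the stopping rule discussed in the abstract.

```python
# Edge-preserving (Perona-Malik-type) diffusion with a sigma-based stopping rule.
import numpy as np

def edge_preserving_denoise(img, sigma, k=10.0, dt=0.2, max_steps=500):
    u = img.astype(float).copy()
    for _ in range(max_steps):
        # Neighbor differences with replicated (Neumann) boundaries.
        p = np.pad(u, 1, mode="edge")
        dn, ds = p[:-2, 1:-1] - u, p[2:, 1:-1] - u
        de, dw = p[1:-1, 2:] - u, p[1:-1, :-2] - u
        # Edge-stopping function g: close to 0 near edges, close to 1 in flat regions.
        g = lambda d: 1.0 / (1.0 + (d / k) ** 2)
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        if np.std(img - u) >= sigma:   # stop when the removed part matches sigma
            break
    return u
```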