1000 results for Stochastic processes -- Mathematical models
Abstract:
This dissertation examines the process of capability accumulation for operational and innovation activities in process management, and the underlying learning mechanisms, in service firms, specifically in the banking industry, taking into account the specificities of the emerging-economy context. Over the last two decades there have been numerous studies on the accumulation of technological capabilities and the underlying learning mechanisms. However, empirical studies on the relationship between these two variables in the context of service firms, especially in the banking industry, are still scarce; this scarcity is particularly pronounced in Brazil. Therefore, seeking to broaden the understanding of the trajectory of technological capability accumulation and the underlying learning mechanisms in service firms, this dissertation evaluates the process-management technological function of a banking services firm, specifically the information and communication technology (ICT) area of Banco do Brasil S.A., over the period from 1982 to 2008. Based on first-hand qualitative and quantitative empirical evidence collected through extensive fieldwork, this dissertation found the following results: 1. From 1982 to 2008 the company's ICT area accumulated technological capabilities in process management through efforts in knowledge acquisition and conversion, moving over that period from the Basic Operational level, at which it was only able to execute basic banking operations, to the Intermediate Innovator level, at which it became able to implement advanced changes in the management of its internal processes. In addition, differences were found in the speed of capability accumulation over the period studied and between the two internal units of the ICT area, both as a function of the efforts devoted to building process-management capabilities.
It is worth noting that the company reached the 5th level on a 6-level scale, but did not reach the technological frontier. 2. Learning processes were essential sources for the accumulation of technological capabilities in process management. The processes of knowledge acquisition and conversion enabled the company to build the base needed to assimilate more advanced and complex knowledge. Despite the marked importance of the learning processes, other internal factors (organizational changes) and external factors (economic policies) were also found to influence the accumulation of technological capabilities in process management. What mattered, however, was not the number of these mechanisms but how they functioned over time. The results observed in this study support the conclusions that (i) the trajectory of technological capability accumulation is an intentional, continuous and cumulative process, resulting from integrated efforts and investments across all dimensions of technological capability, (ii) learning mechanisms influence the accumulation of technological capabilities, and (iii) the use of models adapted to the companies' reality yields a more faithful analysis of their behavior. This dissertation contributes to the understanding of the complexity involved in the process of technological capability accumulation, a decisive competitive factor for firms in emerging economies, and especially for the banking industry, where competitiveness requires high-quality internal processes that result in operational efficiency and improved economic and financial performance. It also highlights the importance of the organizational dimension in supporting the other dimensions of technological capability, through the organization of internal processes and corporate strategies.
Furthermore, it suggests to executives of Brazilian banking firms that intentionally creating a cyclical, continuous process for developing learning mechanisms, taking their key characteristics into account, helps the company along its trajectory of technological capability accumulation. It is therefore important that these executives regard investments in technological capability building as a way to sustain a competitive position in their markets, aligning managerial strategy with corporate strategy.
Abstract:
The main objective of this work is to describe the digital certification scheme to be implemented at the Municipality of Santos as part of a larger effort, the implementation of Digital Processes in that municipality, by examining and following the main challenges that the Municipal Government of Santos, through its Department of Management (Secretaria de Gestão), faced in contracting and deploying the public attestation (fé pública) required for the proper legal framing of the program to digitize the municipality's administrative processes. To this end, the work draws on legal sources, in particular the Decree of the Mayor of Santos and the Municipal Ordinance of the Department of Management that effectively created the obligation for all municipal civil servants to handle certain administrative processes exclusively in digital form. Provisional Measure MP 2001-02/2001, which deals with digital certification, is also discussed. Gathering information, from the basics, such as the necessary equipment, to the procurement model (electronic reverse auction, pregão eletrônico), so that other public bodies can pursue the digitization of their processes and the corresponding procurement of digital certification, is the challenge of this article.
Abstract:
With the substantial rise in competitiveness, interest has grown in discussing mechanisms that facilitate the adoption of new operations-management technologies in companies, perceived as a supporting foundation for organizations, necessary in a globalized world in which consumers' power of influence has gained undeniable relevance. However, reality has revealed difficulties in introducing these approaches in some Brazilian companies, especially smaller ones. One of the barriers is, indisputably, limited resources, combined with the few low-cost credit alternatives available in the domestic market for this segment of companies. Against this backdrop, independent consultants face the enormous challenge of internalizing in these organizations management models more consistent with the dynamics of today's economy, with the additional obstacle of natural cultural resistance to the adoption of some modern management philosophies, such as Lean and the Theory of Constraints, which question traditional concepts and hold singular views, but are equally capable of delivering effective results at low cost. This work aimed, on the one hand, to assess the context in which small businesses operate and, on the other, to propose a consulting support model for this segment of companies, in which consultants predominantly take on the roles of educator and coach, so as to increase the chances not only of implementing more modern operations-management techniques, but also of embedding them permanently in these organizations.
Abstract:
Nowadays, more than half of software development projects fail to meet the end users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proven to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is independent of the implementation platform and lacks the ability to save and propagate changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the aforementioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods to enable the automatic conversion of DEMO Ontological Models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models, with clear, unambiguous semantics, in order to facilitate the creation of processes that are nearly ready to be executed on workflow systems that support BPMN.
Abstract:
Oscillations present in control loops can cause damage in the petrochemical industry. Canceling, or even preventing, such oscillations could save large amounts of money. Studies have identified that one of the causes of these oscillations is the nonlinearities present in industrial process actuators. The objective of this study is to develop a methodology for removing the harmful effects of these nonlinearities. A parameter estimation method is proposed for the Hammerstein model, whose nonlinearity is represented by a dead zone or backlash. The estimated parameters are used to construct inverse compensation models. A simulated level system, in which the valve controlling the inflow has a nonlinearity, was used as a test platform. Results and describing-function analysis show an improvement in the system response.
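The inverse-compensation idea above can be sketched in a few lines. This is a minimal illustration, not the dissertation's actual method: it assumes a symmetric dead zone of width `d` with unit slope outside the band, so that cascading the inverse with the actuator nonlinearity recovers the identity.

```python
def dead_zone(u, d):
    """Dead-zone nonlinearity of half-width d (unit slope outside the band)."""
    if abs(u) <= d:
        return 0.0
    return u - d if u > 0 else u + d

def dead_zone_inverse(v, d):
    """Inverse compensator: pre-distorts v so dead_zone(dead_zone_inverse(v)) == v."""
    if v == 0:
        return 0.0
    return v + d if v > 0 else v - d

# Compensated actuator: the cascade behaves as an identity
for v in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(dead_zone(dead_zone_inverse(v, 0.3), 0.3) - v) < 1e-12
```

In practice the dead-zone width is not known exactly, which is why an estimation step (as in the study) must precede the construction of the inverse model.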
Abstract:
Optimization and control of a chemical process are strongly correlated with the amount of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere in the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that interfere in the bioprocess is crucial in order to act on certain variables of the system, keeping it under desirable operational conditions and under control. In general, during a fermentation process, the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures. These steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) the development of software providing a communication interface between the bioreactor and a computer, based on the acquisition and recording of process variables, namely pH, temperature, dissolved oxygen, level, foam level, agitation frequency and the input setpoints of the operational parameters of the bioreactor control unit; (ii) the development of an analytical method using near-infrared spectroscopy (NIRS) to enable the monitoring of substrate, product and cell concentrations during a fermentation process for ethanol production using the yeast Saccharomyces cerevisiae. Three fermentation runs (F1, F2 and F3) were conducted, monitored by NIRS with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, with pre-treatments, combined or not with smoothing filters, applied to the spectral data.
The most satisfactory results were obtained when the calibration models were constructed from real samples of culture medium taken from fermentation runs F1, F2 and F3, showing that the analytical method based on NIRS can be used as a fast and effective way to quantify cell, substrate and product concentrations, which enables the implementation of in-situ real-time monitoring of fermentation processes.
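As an illustration of the kind of smoothing pre-treatment mentioned above (a hypothetical example, not necessarily the filter used in the study), a simple centered moving-average filter applied to a 1-D spectrum might look like this:

```python
def moving_average(spectrum, window=5):
    """Centered moving-average smoothing of a 1-D spectrum.

    `window` must be odd; at the edges a shrinking window is used so the
    output has the same length as the input.
    """
    if window % 2 == 0:
        raise ValueError("window must be odd")
    half = window // 2
    smoothed = []
    for i in range(len(spectrum)):
        lo = max(0, i - half)
        hi = min(len(spectrum), i + half + 1)
        smoothed.append(sum(spectrum[lo:hi]) / (hi - lo))
    return smoothed

# A flat spectrum is left unchanged by the filter
assert moving_average([1.0] * 10, window=3) == [1.0] * 10
```

Such pre-treatments reduce instrument noise before the calibration model is fitted, at the cost of slightly broadening narrow absorption features.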
Abstract:
This paper investigates the cognitive processes that operate in understanding narratives, in this case the novel Macunaíma, by Mário de Andrade. Our work belongs to the field of embodiment-based Cognitive Linguistics and, due to its interdisciplinary nature, it dialogues with theoretical and methodological frameworks from Psycholinguistics, Cognitive Psychology and the Neurosciences. We adopt an exploratory research design with adapted recall and cloze tests, applied to postgraduate students, all native speakers of Brazilian Portuguese. The choice of Macunaíma as the novel and initial motivation for this proposal is due to the fact that it is a fantastic narrative, consisting of events, circumstances and characters clearly distant from what is experienced in everyday life. Thus, the novel provides adequate data to investigate the configuration of meaning within an understanding-based model. We therefore seek to answer questions that are still, in general, scarcely explored in the field of Cognitive Linguistics, such as: to what extent is the activation of mental models (schemas and frames) related to the process of understanding narratives? How are we able to build sense even when words or phrases are not part of our linguistic repertoire? Why do we get emotionally involved when reading a text, even though it is fiction? To answer them, we assume the theoretical stance that meaning is not in the text; it is constructed through language, conceived as the result of the integration between the biological apparatus (which gives rise to abstract image schemas) and the sociocultural apparatus (which gives rise to frames). In this sense, perception, cognitive processing, and the reception and transmission of the information described are directly related to how language comprehension occurs. We believe that the results of our study may contribute to the cognitive studies of language and to the development of language learning and teaching methodologies.
Abstract:
Galactic stellar clusters have a great variety of physical properties that make them valuable probes of stellar and galactic chemical evolution. Current studies show a discrepancy between standard evolutionary models and observations, mainly regarding the level of mixing and convective dilution of light elements, as well as the evolution of angular momentum. In order to better constrain some of these properties, we present a detailed spectroscopic analysis of 28 evolved stars, from the turn-off to the RGB, belonging to the open cluster M67. The observations were performed using UVES+FLAMES at VLT/UT2. We determined stellar parameters and metallicity from LTE analysis of Fe I and Fe II lines between 420 and 1100 nm. The Li abundance was obtained using the line at 6707.78 Å for the whole sample of stars. The Li abundances of the evolved stars of M67 decrease gradually with decreasing effective temperature. The Li dilution factor for giant stars of M67 with Teff ∼ 4350 K is at least 2300 times greater than that predicted by standard theory for single field giants. The Li abundance as a function of rotation shows a good correlation for the evolved stars of M67, with a much smaller dispersion than for evolved field stars. Mass and age seem to be among the parameters that influence this connection. We discovered a Li-rich subgiant star in M67 (S1242). It is a member of a spectroscopic binary system with high eccentricity. Its Li abundance is 2.7, the highest Li content ever measured for an evolved star in M67. Two possibilities could explain this anomalous Li content: (i) preservation of Li in the post-turn-off stage due to tidal effects, or (ii) an efficient dredge-up of Li, hidden below the convective zone by atomic diffusion, occurring in the post-turn-off stage. We also study the evolution of angular momentum for the evolved stars in M67.
The results are in agreement with previous studies dedicated to evolved stars of this cluster, in which stars in the same region of the CM diagram have quite similar rotations, but with values that indicate extra braking along the main sequence. Finally, we analyze the distributions of the average rotational velocity and of the average Li abundance as a function of age. Regarding the average Li abundances, cluster and field stars follow the same type of exponential decay law, t^(−β). Such decay is observed for ages less than 2 Gyr; beyond this age, the average Li abundance remains constant, in contrast to the rotation-age connection, where the average rotational velocity continues to decrease slowly with age.
Abstract:
Nowadays, the importance of using software processes is well established and considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge for the software industry. It involves adapting software process models to the reality of their projects, and it must also promote the reuse of past experience in the definition and development of software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the software systems produced. This work explored the use and adaptation of consolidated software product line techniques to support the management of variability in software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and finally (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. One study, a qualitative comparative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another, a quantitative comparative study, considered internal attributes of software process line specifications, such as modularity, size and complexity.
Finally, the last study, a controlled experiment, evaluated the effort to use and the understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
Abstract:
Software Repository Mining (MSR) is a research area that analyzes software repositories in order to derive information relevant to the research and practice of software engineering. The main goal of repository mining is to turn static information from repositories (e.g., a code repository or change request system) into valuable information that supports decision making in software projects. Another research area, Process Mining (PM), aims to uncover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have carried out several analyses using MSR and PM techniques: (i) to investigate the evolution of software projects; (ii) to understand the real underlying process of a project; and (iii) to create defect prediction models. However, few research works have focused on analyzing the contributions of software developers by means of MSR and PM techniques. In this context, this dissertation presents two empirical studies that assess the contribution of software developers to an open-source project and a commercial project using those techniques. The contributions of developers are assessed through three different perspectives: (i) buggy commits; (ii) the size of commits; and (iii) the most important bugs. For the open-source project, 12,827 commits and 8,410 bugs were analyzed, while 4,663 commits and 1,898 bugs were analyzed for the commercial project. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more fixes to important bugs, while for the commercial project the results could not indicate statistically significant differences between developer groups.
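A common MSR heuristic for the kind of bug-related perspectives listed above is to link commits to bugs by scanning commit messages for bug identifiers. The sketch below is a hypothetical illustration of that idea, not the dissertation's actual tooling; the regex pattern and the example commits are invented for demonstration.

```python
import re

# Hypothetical pattern: messages like "Fixes #8410" or "bug 123"
BUG_ID = re.compile(r"(?:bug|fix(?:es)?|closes)\s*#?(\d+)", re.IGNORECASE)

def link_buggy_commits(commits):
    """Flag commits whose message references a bug id (a common MSR heuristic)."""
    linked = {}
    for sha, message in commits:
        ids = BUG_ID.findall(message)
        if ids:
            linked[sha] = [int(i) for i in ids]
    return linked

commits = [
    ("a1", "Fixes #8410: null check in parser"),
    ("b2", "Refactor build scripts"),
    ("c3", "bug 123 workaround for login timeout"),
]
print(link_buggy_commits(commits))  # {'a1': [8410], 'c3': [123]}
```

Real studies typically complement this textual linking with issue-tracker metadata, since commit messages alone miss many bug fixes and produce false positives.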
Abstract:
One of the mechanisms responsible for anomalous diffusion is the existence of long-range temporal correlations, for example in Fractional Brownian Motion and in walk models with Elephant and Alzheimer memory profiles, where in the latter two cases the walker can always "remember" its first steps. The question to be elucidated, which was the main motivation of our work, is whether memory of the initial history is a necessary condition for observing anomalous diffusion (in this case, superdiffusion). We give a conclusive answer by studying a non-Markovian model in which the walker's memory of the past, at time t, is given by a Gaussian centered at time t/2 whose standard deviation grows linearly as the walker ages. For large widths of the Gaussian we find that the model behaves similarly to the Elephant model; in the opposite limit (width going to zero), although the walker forgets its early days, we observed results similar to those of the Alzheimer walk model, in particular the presence of amnestically induced persistence, characterized by certain log-periodic oscillations. We conclude that memory of early times is a necessary condition neither for generating superdiffusion nor for amnestically induced persistence, which can appear even with memory profiles that forget the initial steps, like the Gaussian memory profile investigated here.
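The model described above can be sketched as a short simulation. This is a minimal illustration under stated assumptions (the recalled step is repeated with probability p and reversed otherwise, the first step is taken to the right, and the memory index is clipped to the valid range), not the authors' exact formulation:

```python
import random

def gaussian_memory_walk(n_steps, p=0.8, width=0.2, seed=1):
    """Non-Markovian walk with a Gaussian memory profile.

    At time t the walker recalls one past step, drawn from a Gaussian
    centered at t/2 with standard deviation ~ width * t, and repeats it
    with probability p (reverses it otherwise). Returns the trajectory.
    """
    rng = random.Random(seed)
    steps = [1]   # first step, to the right by convention
    pos = [1]
    for t in range(1, n_steps):
        # index of the recalled step, clipped to the valid range [0, t-1]
        k = int(round(rng.gauss(t / 2, max(width * t, 1e-9))))
        k = min(max(k, 0), t - 1)
        s = steps[k] if rng.random() < p else -steps[k]
        steps.append(s)
        pos.append(pos[-1] + s)
    return pos
```

Measuring the growth of the mean squared displacement of many such trajectories is how one would distinguish normal diffusion from the superdiffusive regimes discussed in the abstract.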
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)