18 results for accounting model design

at Instituto Politécnico do Porto, Portugal


Relevance:

80.00%

Publisher:

Abstract:

Doctoral Thesis in Information Systems and Technologies, Area of Engineering and Management of Information Systems

Relevance:

30.00%

Publisher:

Abstract:

This article describes the main research results of a new methodology in which the stages and strategies of the technology-integration process are identified and described, and a set of principles and recommendations is presented. The MIPO model described in this paper results from an effort to understand the main success features of good practices in the web environment, integrated in the information systems/information technology context. The initial model was created based on experience and a literature review. It was then tested in information systems and technology course units in higher education and adapted over four cycles of action research combined with case-study research. The information, concepts and procedures presented here support teachers and instructors, instructional designers and planning teams – anyone who wants to develop effective b‐learning instruction.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a project consisting of the development of an Intelligent Tutoring System for training and support in the development of electrical installation projects, to be used by electrical engineers, technicians and students. One of the major goals of this project is to devise a teaching model based on Intelligent Tutoring techniques that considers not only academic knowledge but also other, more empirical types of knowledge, so as to successfully train electrical installation design.

Relevance:

30.00%

Publisher:

Abstract:

Involving groups in important management processes such as decision making has several advantages. By discussing and combining ideas, counter-ideas, critical opinions, identified constraints, and alternatives, a group of individuals can test potentially better solutions, sometimes in the form of new products, services, and plans. In the past few decades, operations research, AI, and computer science have had tremendous success creating software systems that can achieve optimal solutions, even for complex problems. The only drawback is that people don't always agree with these solutions. Sometimes this dissatisfaction is due to an incorrect parameterization of the problem. Nevertheless, the reasons people don't like a solution might not be quantifiable, because those reasons are often based on aspects such as emotion, mood, and personality. At the same time, monolithic individual decision-support systems centered on optimizing solutions are being replaced by collaborative systems and group decision-support systems (GDSSs) that focus more on establishing connections between people in organizations. These systems follow a kind of social paradigm. Combining optimization-centered and social-centered approaches is a topic of current research. However, even if such a hybrid approach can be developed, it will still miss an essential point: the emotional nature of group participants in decision-making tasks. We've developed a context-aware, emotion-based model to design intelligent agents for group decision-making processes. To evaluate this model, we've incorporated it in an agent-based simulator we developed called ABS4GD (Agent-Based Simulation for Group Decision). This multiagent simulator considers emotion- and argument-based factors while supporting group decision-making processes. Experiments show that agents endowed with emotional awareness reach agreements more quickly than those without such awareness. Hence, participant agents that integrate emotional factors in their judgments can be more successful because, in exchanging arguments with other agents, they consider the emotional nature of group decision making.
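The abstract does not describe ABS4GD's internals; purely as an illustration of how emotional awareness might bias an agent's evaluation of proposals, here is a minimal sketch in which every name and the linear weighting scheme are hypothetical, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    utility: float            # objective value of the alternative for this agent
    proposer_affinity: float  # emotional affinity toward the proposer, in [-1, 1]

def score(p: Proposal, emotional_weight: float = 0.3) -> float:
    """Blend objective utility with an emotional term; an emotion-unaware
    agent corresponds to emotional_weight = 0.0."""
    return (1 - emotional_weight) * p.utility + emotional_weight * p.proposer_affinity

def accepts(p: Proposal, threshold: float = 0.5, emotional_weight: float = 0.3) -> bool:
    """Accept the proposal when its blended score reaches the threshold."""
    return score(p, emotional_weight) >= threshold

# A positive affinity toward the proposer can tip a borderline proposal into
# acceptance, which is one way emotional awareness could shorten negotiation.
p = Proposal(utility=0.45, proposer_affinity=0.9)
print(accepts(p, emotional_weight=0.0))  # False: utility alone is below threshold
print(accepts(p, emotional_weight=0.3))  # True: 0.7*0.45 + 0.3*0.9 = 0.585
```

In this toy scheme an emotion-aware agent accepts slightly sub-threshold proposals from agents it is well disposed toward, while a purely utility-driven agent keeps arguing.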

Relevance:

30.00%

Publisher:

Abstract:

New learning solutions emerge constantly, the result of the ongoing technological evolution we face. These innovations make the transmission of knowledge between educator and learner ever simpler, faster and more efficient. Some of these advances aim at student-centred learning, through the delegation of tasks and the provision of content, investing in autonomy and self-learning so that each student creates their own study method and evolves gradually, with the guidance of a teacher or of an autonomous learning system. This research sets out to study learning methods over time up to the present day, listing some of the tools used in the learning process and indicating their benefits as well as their drawbacks. A case study based on one of these tools will also be analysed, describing its operation and the mode of interaction between the various participating entities, and presenting the results obtained. The case study will consist of creating a specific learning scenario in the health area, analysing it in different contexts and highlighting the characteristics and benefits of each analysed environment for the learning process. It will then be demonstrated how learning processes can be optimized by using tools that computerize and automate those processes, so as to make teaching faster and more effective, in a controllable environment and with the functionality that current technology allows.

Relevance:

30.00%

Publisher:

Abstract:

A Box–Behnken factorial design coupled with response surface methodology was used to evaluate the effects of temperature, pH and initial concentration on the Cu(II) sorption process onto the marine macroalga Ascophyllum nodosum. The effect of the operating variables on metal uptake capacity was studied in a batch system, and a mathematical model showing the influence of each variable and their interactions was obtained. Study ranges were 10–40 °C for temperature, 3.0–5.0 for pH and 50–150 mg L−1 for initial Cu(II) concentration. Within these ranges, the biosorption capacity is only slightly dependent on temperature but markedly increases with pH and initial Cu(II) concentration. The uptake capacities predicted by the model are in good agreement with the experimental values. The maximum biosorption capacity of Cu(II) by A. nodosum is 70 mg g−1 and corresponds to the following variable values: temperature = 40 °C, pH = 5.0 and initial Cu(II) concentration = 150 mg L−1.
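The response surface behind a Box–Behnken analysis is typically a full second-order polynomial in the coded factors. A minimal sketch of fitting such a model by ordinary least squares, using NumPy and synthetic responses that merely mimic the reported trends (not the paper's measurements):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Full second-order model matrix for three coded factors:
    intercept, linear, two-factor interaction, and squared terms."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([
        np.ones(len(X)),            # intercept
        x1, x2, x3,                 # linear terms
        x1 * x2, x1 * x3, x2 * x3,  # interactions
        x1**2, x2**2, x3**2,        # quadratic terms
    ])

# Box-Behnken points for 3 factors (coded -1/0/+1): the 12 edge midpoints
# of the cube plus centre-point replicates.
bb = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])

# Synthetic response: uptake rises strongly with pH (x2) and concentration
# (x3), weakly with temperature (x1), echoing the trend reported above.
y = 40 + 2 * bb[:, 0] + 10 * bb[:, 1] + 12 * bb[:, 2] - 3 * bb[:, 1] ** 2

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(bb), y, rcond=None)
print(np.round(coef, 2))
```

Because the Box–Behnken design supports estimation of the full quadratic, least squares recovers the generating coefficients exactly on this noise-free synthetic data.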

Relevance:

30.00%

Publisher:

Abstract:

In life cycle impact assessment (LCIA) models, the sorption of the ionic fraction of dissociating organic chemicals is not adequately modeled because conventional non-polar partitioning models are applied. High uncertainties are therefore expected when modeling the mobility of dissociating organic chemicals, as well as their bioavailability for uptake by exposed biota and their degradation. Alternative regressions that account for the ionized fraction of a molecule when estimating fate parameters were applied to the USEtox model. The model parameters most sensitive in the estimation of ecotoxicological characterization factors (CFs) of micropollutants were evaluated by Monte Carlo analysis in both the default USEtox model and the alternative approach. Negligible differences in CF values and 95% confidence limits between the two approaches were estimated for direct emissions to the freshwater compartment; however, for emissions to the agricultural soil compartment, the default USEtox model overestimates the CFs and 95% confidence limits of basic compounds by up to three and four orders of magnitude, respectively, relative to the alternative approach. For three emission scenarios, LCIA results show that the default USEtox model overestimates freshwater ecotoxicity impacts for emissions to agricultural soil by one order of magnitude, with larger confidence limits, relative to the alternative approach.
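The paper's specific regressions are not given in the abstract, but accounting for ionization usually starts from the neutral fraction of a monoprotic chemical at ambient pH (Henderson–Hasselbalch). A minimal sketch, in which the pH-weighted partition coefficient is only an illustrative way to combine neutral and ionic contributions:

```python
def neutral_fraction(pH: float, pKa: float, acid: bool = True) -> float:
    """Fraction of a monoprotic dissociating chemical present in neutral
    form (Henderson-Hasselbalch); acid=False treats it as a base."""
    exponent = (pH - pKa) if acid else (pKa - pH)
    return 1.0 / (1.0 + 10.0 ** exponent)

def effective_koc(koc_neutral: float, koc_ionic: float,
                  pH: float, pKa: float, acid: bool = True) -> float:
    """Illustrative pH-weighted organic-carbon partition coefficient:
    weight the neutral and ionic Koc by their respective fractions."""
    fn = neutral_fraction(pH, pKa, acid)
    return fn * koc_neutral + (1.0 - fn) * koc_ionic

# An acid with pKa 4.5 at soil pH 7 is >99% ionized; using the neutral-form
# Koc alone would greatly overestimate sorption and underestimate mobility.
print(f"{neutral_fraction(7.0, 4.5):.4f}")
```

This is the kind of correction that matters for emissions to soil: for a mostly ionized compound the neutral-species partition coefficient is the wrong descriptor of fate.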

Relevance:

30.00%

Publisher:

Abstract:

The content of a Learning Object is frequently characterized by metadata from several standards, such as LOM, SCORM and QTI. Specialized domains require new application profiles that further complicate the task of editing the metadata of learning objects, since their data models are not supported by existing authoring tools. To cope with this problem we designed a metadata editor supporting multiple metadata languages, each with its own data model. It is assumed that the supported languages have an XML binding, and we use RDF to create a common metadata representation, independent of the syntax of each metadata language. The combined data model supported by the editor is defined as an ontology. Thus, the process of extending the editor to support a new metadata language is twofold: first, the conversion from the XML binding of the metadata language to RDF and vice versa; second, the extension of the ontology to cover the new metadata model. In this paper we describe the general architecture of the editor, explain how a typical metadata language for learning objects is represented as an ontology, and show how this formalization captures all the data required to generate the editor's graphical user interface.
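The XML-to-RDF step can be pictured with a toy converter: flatten the leaf elements of a metadata record into subject–predicate–object triples. This is a sketch only; a real binding would follow the standard's schema, and the LOM-like element names and namespace below are hypothetical:

```python
import xml.etree.ElementTree as ET

def xml_to_triples(xml_text: str, subject: str, ns: str):
    """Map each non-empty leaf element of a metadata record to an
    RDF-style (subject, predicate, object) triple, with the predicate
    formed from a namespace prefix plus the element name."""
    root = ET.fromstring(xml_text)
    triples = []
    for elem in root.iter():
        if len(elem) == 0 and elem.text and elem.text.strip():
            triples.append((subject, ns + elem.tag, elem.text.strip()))
    return triples

lom = "<lom><title>Sorting algorithms</title><language>en</language></lom>"
for triple in xml_to_triples(lom, "urn:lo:42", "http://example.org/lom#"):
    print(triple)
```

The reverse direction (RDF back to the XML binding) would consult the ontology to decide where each predicate's value belongs in the document tree, which is exactly the information the editor's combined data model has to carry.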

Relevance:

30.00%

Publisher:

Abstract:

STRIPPING is a software application developed for the automatic design of a randomly packed column in which the transfer of volatile organic compounds (VOCs) from water to air can be performed, and for simulating its steady-state behaviour. The software eliminates the need for experimental work in the selection of the column diameter and allows the most convenient hydraulic regime for this type of operation to be chosen a priori. It also lets the operator choose the model used for the calculation of some parameters, namely between the Eckert/Robbins and Billet models for estimating the pressure drop of the gaseous phase, and between the Billet and Onda/Djebbar models for mass transfer. Illustrations of the graphical interface are presented.
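The correlations named above (Eckert/Robbins, Billet, Onda/Djebbar) are not reproduced in the abstract, but the design relation they feed is the standard one for counter-current stripping: column height = HTU × NTU, with the NTU from the Colburn equation. A minimal sketch of that generic relation, with illustrative numbers only:

```python
import math

def stripping_factor(H_dimensionless: float, G: float, L: float) -> float:
    """S = H * (G/L): dimensionless Henry's constant (Cgas/Cliquid at
    equilibrium) times the air-to-water volumetric flow ratio."""
    return H_dimensionless * G / L

def ntu(x_in: float, x_out: float, S: float) -> float:
    """Number of transfer units for a counter-current stripping column
    with clean inlet air (Colburn equation), valid for S != 1."""
    return S / (S - 1.0) * math.log((x_in / x_out) * (1.0 - 1.0 / S) + 1.0 / S)

def packing_height(x_in: float, x_out: float, S: float, htu: float) -> float:
    """Packed height = HTU * NTU; the HTU would come from the chosen
    mass-transfer model (e.g. Onda-type correlations)."""
    return htu * ntu(x_in, x_out, S)

# 99% removal of a VOC with stripping factor S = 5 and HTU = 0.6 m:
print(round(packing_height(100.0, 1.0, 5.0, 0.6), 2))
```

Software like the one described automates exactly this chain: pick a hydraulic regime, get the diameter and pressure drop from one correlation set, get the HTU from another, and stack the transfer units into a height.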

Relevance:

30.00%

Publisher:

Abstract:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behaviour of soil-related processes and properties and to generate new hypotheses for future experimentation. A good model and analysis of variations in soil properties, allowing suitable conclusions to be drawn and spatially correlated variables to be estimated at unsampled locations, clearly depend on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data-collection procedure and on capable laboratory analytical work. Following the standard soil-sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land colour, soil texture, land slope and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labour intensive and demands considerable manpower and equipment both in field work and in laboratory analysis. Moreover, the sampling scheme to be used for data collection in a forest field is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil at all is found, or if large trees prevent collection. Consequently, the proficient design of a soil-sampling campaign in a forest field is not always a simple process and sometimes represents a truly huge challenge. In this work we present some difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some physical-chemical soil properties.
Two different sampling protocols were considered for monitoring two types of forest soil located in NW Portugal: umbric regosol and lithosol. Two different pieces of sampling equipment were also used: a manual auger and a shovel. Both scenarios were analysed, and the results led us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data-collection procedure compatible with established protocols, but a pre-defined grid assumption often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.

Relevance:

30.00%

Publisher:

Abstract:

Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in randomly packed columns using counter-current flow between the phases. This work proposes a new column-design methodology, for any type of packing and contaminant, that avoids the need for an arbitrarily chosen diameter. It also avoids the usual graphical Eckert correlations for pressure drop; instead, the hydraulic features are chosen beforehand as a design criterion. The design procedure was translated into a convenient algorithm in the C++ language. A column was built in order to test the design and the theoretical steady-state and dynamic behaviour. The experiments were conducted using a solution of chloroform in distilled water. The results allowed a correction of the theoretical global mass-transfer coefficient previously estimated by the Onda correlations, which depend on several parameters that are not easy to control experimentally. To best describe the column behaviour in stationary and dynamic conditions, an original mathematical model was developed. It consists of a system of two nonlinear partial differential equations (distributed parameters). When the flows are steady the system becomes linear, although no evident analytical solution exists. In the steady state the resulting ODE can be solved by analytical methods, and in the dynamic state the discretization of the PDEs by finite differences overcomes this difficulty. A numerical algorithm was used to estimate the contaminant concentrations in both phases along the column. The high number of resulting algebraic equations and the impossibility of generating a recursive procedure did not allow the construction of a generalized program, but an iterative procedure developed in a spreadsheet allowed the simulation. The solution is stable only for similar discretization values: if different values are used for the time and space discretization parameters, the solution easily becomes unstable. The system's dynamic behaviour was simulated for the common liquid-phase perturbations: step, impulse, rectangular pulse and sinusoidal. The final results show no strange or unpredictable behaviour.
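The abstract gives neither the model equations nor the discretization, but the reported sensitivity to mismatched time/space steps is characteristic of explicit finite-difference schemes, whose stability is governed by a CFL-type condition u·Δt/Δz ≤ 1. As a loose illustration only, here is a sketch of an explicit upwind scheme for a generic counter-current two-phase column; all coefficients, the linear transfer term kLa·(x − y/H), and the boundary conditions are hypothetical:

```python
import numpy as np

def simulate(n=50, steps=4000, dt=1e-3, dz=0.02,
             uL=0.5, uG=1.0, kLa=0.8, H=5.0, x_in=1.0):
    """Explicit upwind finite-difference sketch of a counter-current
    stripping column: liquid (x) flows down (increasing z), gas (y)
    flows up, coupled by interphase transfer kLa*(x - y/H)."""
    # Stability of the explicit scheme requires the CFL condition for
    # both phases -- the analogue of the remark that mismatched time and
    # space discretizations make the solution unstable.
    assert uL * dt / dz <= 1 and uG * dt / dz <= 1, "CFL condition violated"
    x = np.zeros(n)  # liquid-phase concentration; index 0 is the column top
    y = np.zeros(n)  # gas-phase concentration
    for _ in range(steps):
        transfer = kLa * (x - y / H)
        xn, yn = x.copy(), y.copy()
        # liquid advects downward: upwind difference toward smaller z
        xn[1:] = x[1:] - uL * dt / dz * (x[1:] - x[:-1]) - dt * transfer[1:]
        xn[0] = x_in                  # contaminated water fed at the top
        # gas advects upward: upwind difference toward larger z
        yn[:-1] = y[:-1] + uG * dt / dz * (y[1:] - y[:-1]) + dt * transfer[:-1]
        yn[-1] = 0.0                  # clean air fed at the bottom
        x, y = xn, yn
    return x, y

x, y = simulate()
print(round(x[-1], 3))  # liquid outlet (bottom) concentration, below the inlet value
```

With the parameters above, uG·Δt/Δz = 0.05, comfortably inside the stability region; doubling Δt while halving Δz, for example, pushes the scheme toward the unstable regime the abstract describes.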

Relevance:

30.00%

Publisher:

Abstract:

Master's in Informatics Engineering - Area of Specialization in Graphic Systems and Multimedia

Relevance:

30.00%

Publisher:

Abstract:

Energy is currently regarded as a strategic vector in many organizations, so the management and rational use of energy are fundamental instruments for reducing the consumption associated with the production processes of the industrial sector. Energy-management actions should not stop at the design stage of the facilities and production equipment, but should accompany the company's activity. Energy management must be sustained by regular energy audits of the consuming facilities and implemented through action and investment plans whose main objective is to promote energy efficiency, thereby reducing consumption and, consequently, the energy bill. In this context, tools supporting energy management promote more rational energy consumption, i.e. energy efficiency, and it is here that this work fits. The present work was carried out at the company RAR Açúcar and had as its main objectives: the reformulation of the company's Energy Consumption Management System, the creation of a quantitative model allowing the Energy Manager to predict the refinery's annual consumption of water, fuel oil and electricity, and the preparation of a consumption plan for 2014 based on the model created. The reformulation of the Consumption Management System resulted from a series of steps. First, it was necessary to characterize and analyse the company's current system, composed of a set of seven Microsoft Excel© workbooks.
Once the analysis was complete, the relevant information selected and all improvements to the files proposed, the system was reformulated, reducing the set of workbooks to just two: one in which all records are entered and viewed, and another in which the calculations needed for the company's energy control are performed. The new Energy Consumption Management System will be implemented at the beginning of 2015. The changes proposed for the manual record sheets have already been implemented by the company. This practical application proved very efficient, yielding major process improvements, namely shorter filling times and shorter daily routes for the operators. The survey of the various meters made it possible to identify all the areas where meters need to be installed and all the faulty meters needing replacement, allowing a more precise accounting of all the company's consumption. With this restructuring, the Consumption Management System became more dynamic, clearer and, above all, more efficient. To create the company's consumption forecasting model, two years of historical data on water, electricity and fuel-oil consumption and on sugar production were collected. From these data, the daily specific consumptions of water, fuel oil and electricity were determined (for each week of the two years), and these consumptions were characterized by day type. An average specific consumption was then defined for each day type based on the two years, and the forecasting model was built from these day-type averages.
Finally, the model was verified by comparing the consumptions it predicts with the actual consumptions of each year. For 2012 the model shows a deviation of 6% in the water forecast, 12% in the electricity forecast and 6% in the fuel-oil forecast. For 2013, the model shows an error of 1% for water, 8% for fuel oil and 1% for electricity. This model will allow electricity-purchase contracts to be negotiated with greater rigour, leading to advantages in negotiation and, consequently, to lower procurement costs. It will also allow cash flows to be matched to the company's real needs, a financial gain resulting from a more rigorous forecasting model. A consumption plan for 2014 was also prepared from the model, based on the production forecast for that year; against that plan the model shows a deviation of 24% for water, 0% for electricity and 28% for fuel oil.
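The day-type forecasting approach described above (average specific consumption per day type, multiplied by planned production) can be sketched in a few lines; the day-type names and figures below are hypothetical, not RAR Açúcar's data:

```python
from collections import defaultdict

def fit_day_type_model(history):
    """history: iterable of (day_type, consumption, production) records.
    Returns the average specific consumption (consumption per unit of
    sugar produced) for each day type."""
    sums = defaultdict(lambda: [0.0, 0])
    for day_type, consumption, production in history:
        if production > 0:
            sums[day_type][0] += consumption / production
            sums[day_type][1] += 1
    return {dt: total / n for dt, (total, n) in sums.items()}

def predict(model, plan):
    """plan: iterable of (day_type, planned_production) pairs; the
    forecast is the specific-consumption average times the plan."""
    return sum(model[day_type] * production for day_type, production in plan)

# Two hypothetical day types: full-production days and cleaning days.
history = [("production", 120.0, 100.0), ("production", 130.0, 100.0),
           ("cleaning", 40.0, 50.0)]
model = fit_day_type_model(history)
print(model["production"])                      # 1.25 (mean specific consumption)
print(predict(model, [("production", 200.0)]))  # 250.0
```

Verification then amounts to running `predict` over a past year's actual day types and production and comparing against the recorded consumption, which is how the deviation percentages above would be computed.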

Relevance:

30.00%

Publisher:

Abstract:

Purpose: This exploratory research evaluates whether there is a relationship between the number of years since an organization achieved ISO 9001 certification and the highest level of recognition received by the same organization under the EFQM Business Excellence Model. Methodology/Approach: After a state-of-the-art review, a detailed comparison between both models was made. Fifty-two Portuguese organizations were considered, and Spearman's rho correlation coefficient was used to investigate the possible relationships. Findings: We conclude that there is indeed a moderate positive correlation between these two variables: the higher the number of years of ISO 9001 certification, the higher the results of the organization's EFQM model evaluation and recognition. This supports the assumption that the ISO 9001 International Standard, by incorporating many of the principles present in the EFQM Business Excellence Model, is consistent with that model and can be considered a step in its direction. Research Limitation/implication: Given the dynamic nature of these models, which may change over time, and the possible time delays between implementation and results, more in-depth studies such as an experimental design or a longitudinal quasi-experimental design could be used to confirm the results of this investigation. Originality/Value of paper: This research gives additional insights into joint studies of both models. The use of external evaluation results produced by independent EFQM assessors minimizes the possible bias of previous studies assessing the value of ISO 9001 certification.
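Spearman's rho is just the Pearson correlation computed on ranks, with ties assigned their average rank. A self-contained sketch (the five data points are invented for illustration, not the study's 52 organizations):

```python
def ranks(values):
    """Average ranks (1-based), handling ties as Spearman's rho requires."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: years since ISO 9001 certification vs. an ordinal
# coding of the EFQM recognition level reached.
years = [2, 5, 8, 11, 15]
efqm_level = [1, 1, 2, 3, 3]
print(round(spearman_rho(years, efqm_level), 3))  # 0.949
```

Rank-based correlation is the right choice here because the EFQM recognition level is ordinal, so only the ordering, not the spacing, of the levels is meaningful.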

Relevance:

30.00%

Publisher:

Abstract:

20th International Conference on Reliable Software Technologies - Ada-Europe 2015, 22-26 June 2015, Madrid, Spain.