881 results for Interaction modeling. Model-based development. Interaction evaluation.
Abstract:
Rural communities in Cuenca (Spain) are characterized by considerable social dislocation, mostly due to the low population density of these areas. Consequently, the existence of groups of citizens able to act as agents of their own development is a critical aspect of any community-based development process in this Spanish region. The Institute of Community Development of Cuenca (IDC) has been working with such groups for the last 30 years, focusing on the organizational empowerment of rural communities. The main tools in this process have been the empowerment evaluation approach and the critical friend role, helping the groups to achieve their objectives and reinforcing them. This chapter analyses the empowerment process and how the critical friend role is nourished by the facilitator figure.
Abstract:
Mg-chelation is found to be a prerequisite for directing protoporphyrin IX into the chlorophyll (Chl)-synthesizing branch of the tetrapyrrole pathway. The ATP-dependent insertion of magnesium into protoporphyrin IX is catalyzed by the enzyme Mg-chelatase, which consists of three protein subunits (CHL D, CHL I, and CHL H). We chose the Mg-chelatase from tobacco to obtain more information about the molecular mode of action of this complex enzyme by elucidating the in vitro and in vivo interactions between the central subunit CHL D and the subunits CHL I and CHL H. We dissected CHL D into defined peptide fragments and assayed for the part of CHL D essential for protein–protein interaction and enzyme activity. Surprisingly, only a small part of CHL D, i.e., 110 aa, was required for interaction with the partner subunits and maintenance of enzyme activity. In addition, CHL D was shown to be capable of forming homodimers. Moreover, it interacted with both CHL I and CHL H. Our data led to the outline of a two-step model based on the cooperation of the subunits during the chelation process.
Abstract:
The sensory patches in the ear of a vertebrate can be compared with the mechanosensory bristles of a fly. This comparison has led to the discovery that lateral inhibition mediated by the Notch cell–cell signaling pathway, first characterized in Drosophila and crucial for bristle development, also has a key role in controlling the pattern of sensory hair cells and supporting cells in the ear. We review the arguments for considering the sensory patches of the vertebrate ear and bristles of the insect to be homologous structures, evolved from a common ancestral mechanosensory organ, and we examine more closely the role of Notch signaling in each system. Using viral vectors to misexpress components of the Notch pathway in the chick ear, we show that a simple lateral-inhibition model based on feedback regulation of the Notch ligand Delta is inadequate for the ear just as it is for the fly bristle. The Notch ligand Serrate1, expressed in supporting cells in the ear, is regulated by lateral induction, not lateral inhibition; commitment to become a hair cell is not simply controlled by levels of expression of the Notch ligands Delta1, Serrate1, and Serrate2 in the neighbors of the nascent hair cell; and at least one factor, Numb, capable of blocking reception of lateral inhibition is concentrated in hair cells. These findings reinforce the parallels between the vertebrate ear and the fly bristle and show how study of the insect system can help us understand the vertebrate.
Abstract:
A model based on the nonlinear Poisson-Boltzmann equation is used to study the electrostatic contribution to the binding free energy of a simple intercalating ligand, 3,8-diamino-6-phenylphenanthridine, to DNA. We find that the nonlinear Poisson-Boltzmann model accurately describes both the absolute magnitude of the pKa shift of 3,8-diamino-6-phenylphenanthridine observed upon intercalation and its variation with bulk salt concentration. Since the pKa shift is directly related to the total electrostatic binding free energy of the charged and neutral forms of the ligand, the accuracy of the calculations implies that the electrostatic contributions to binding are accurately predicted as well. Based on our results, we have developed a general physical description of the electrostatic contribution to ligand-DNA binding in which the electrostatic binding free energy is described as a balance between the coulombic attraction of a ligand to DNA and the disruption of solvent upon binding. Long-range coulombic forces associated with highly charged nucleic acids provide a strong driving force for the interaction of cationic ligands with DNA. These favorable electrostatic interactions are, however, largely compensated for by unfavorable changes in the solvation of both the ligand and the DNA upon binding. The formation of a ligand-DNA complex removes both charged and polar groups at the binding interface from pure solvent while it displaces salt from around the nucleic acid. As a result, the total electrostatic binding free energy is quite small. Consequently, nonpolar interactions, such as tight packing and hydrophobic forces, must play a significant role in ligand-DNA stability.
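The abstract ties the measured pKa shift directly to the difference between the electrostatic binding free energies of the charged and neutral ligand forms. A minimal sketch of that standard thermodynamic-cycle relation, ΔpKa = (ΔG_neutral − ΔG_charged)/(RT ln 10); the function name and the numerical inputs are illustrative, not values from the paper:

```python
import math

def pka_shift(dG_bind_charged, dG_bind_neutral, T=298.15):
    """pKa shift of an intercalator upon DNA binding, from the thermodynamic
    cycle linking the bound and free protonation equilibria. Arguments are
    electrostatic binding free energies in kcal/mol; a more favourable
    (more negative) binding of the charged form raises the apparent pKa."""
    R = 1.987204e-3  # gas constant in kcal/(mol*K)
    return (dG_bind_neutral - dG_bind_charged) / (math.log(10.0) * R * T)

# Hypothetical inputs: the charged form binds 1.5 kcal/mol more favourably.
shift = pka_shift(dG_bind_charged=-3.0, dG_bind_neutral=-1.5)
```

With these assumed inputs the predicted shift is about +1.1 pKa units, the order of magnitude typically seen for cationic intercalators.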
Abstract:
Unripe banana flour (UBF) is produced from bananas that have not undergone the maturation process and is an interesting alternative for minimizing fruit losses related to inappropriate handling or fast ripening. UBF is considered a functional ingredient that improves glycemic and plasma insulin levels and has also shown efficacy in the control of satiety and insulin resistance. The aim of this work was to study the drying process of unripe banana slabs (Musa cavendishii, Nanicão) by developing a transient drying model through mathematical modeling with simultaneous moisture and heat transfer. After characterization of the raw material, the drying process was conducted at 40 °C, 50 °C, and 60 °C; the product temperature was recorded using thermocouples, and the air velocity inside the chamber was 4 m·s⁻¹. The experimental data made it possible to validate the diffusion model based on Fick's second law and Fourier's law. For this purpose, the sorption isotherms were measured and fitted to the GAB model, estimating the equilibrium moisture content (Xe) as 1.76 g H2O/100 g d.b. at 60 °C and 10% relative humidity (RH); the thermophysical properties (k, Cp, ρ) were also measured for use in the model. Five cases were considered: i) constant thermophysical properties; ii) variable properties; iii) estimation of the heat transfer coefficient (h), mass transfer coefficient (hm), and effective diffusivity (De): 134 W·m⁻²·K⁻¹, 4.91×10⁻⁵ m·s⁻¹, and 3.278×10⁻¹⁰ m²·s⁻¹ at 60 °C, respectively; iv) variable De, which presented third-order polynomial behavior as a function of moisture content; v) shrinkage, which affected the mathematical model especially in the first 3 hours of the process, when the thickness contracted by about (30.34 ± 1.29)% of its initial value; two decreasing drying rate periods (DDR I and DDR II) were found, with diffusivities of 3.28×10⁻¹⁰ m²·s⁻¹ and 1.77×10⁻¹⁰ m²·s⁻¹, respectively.
COMSOL Multiphysics simulations were performed using the heat and mass transfer coefficients estimated by the mathematical modeling.
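The diffusion model validated in this work rests on the classical series solution of Fick's second law for a slab. As a hedged illustration, the sketch below computes the moisture ratio of an infinite slab with uniform initial moisture and surface at equilibrium; the slab half-thickness and drying time are assumed for the example, and only the order of magnitude of the 60 °C diffusivity comes from the abstract:

```python
import math

def moisture_ratio(t, De, half_thickness, n_terms=50):
    """Series solution of Fick's second law for an infinite slab:
    MR = (X - Xe)/(X0 - Xe)
       = sum_n 8/((2n+1)^2 pi^2) * exp(-(2n+1)^2 pi^2 De t / (4 L^2)),
    with uniform initial moisture and the surface held at equilibrium."""
    L = half_thickness
    mr = 0.0
    for n in range(n_terms):
        k = 2 * n + 1
        mr += 8.0 / (k ** 2 * math.pi ** 2) \
              * math.exp(-(k ** 2) * math.pi ** 2 * De * t / (4.0 * L ** 2))
    return mr

# Assumed slab: 4 mm thick (2 mm half-thickness), dried for 1 h with a
# diffusivity of the magnitude reported at 60 C.
mr_1h = moisture_ratio(t=3600.0, De=3.28e-10, half_thickness=2e-3)
```

Fitting De is then a matter of matching this curve to the experimental MR-versus-time data at each temperature.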
Abstract:
Since the end of the 20th century and the beginning of the 21st, studies have analyzed the high rate of failure or dissatisfaction associated with Lean programs. This rate has proven to be exceedingly high, ranging between 66% and 90%. The effects of this failure include wasted time, money, and resources and, perhaps worst of all, the spread of fear among change agents about undertaking new change initiatives. Studies point to the lack of alignment between such projects and organizational culture as one of the fundamental causes of this failure. Starting from this research theme, this theoretical essay can be characterized as a qualitative approach to analyzing the problem, of a basic research nature, seeking to generate new and useful knowledge for organizations, with no practical application foreseen at this first stage of the research. The source of evidence to support the proposed model was a review of the case studies found in the literature, using both a Systematic Bibliographic Review (SBR) and an exploratory review, in order to establish the state of the art in the field. The theoretical foundation of the work is based on the literature of four major fields of study: (i) Strategy, (ii) Lean, (iii) Organizational Culture, and (iv) Change Management. The SBR focuses on the intersections of these major fields, aggregating 190 international works. In turn, the exploratory review brings in some of the main references of these fields, such as Edgar Schein, John Kotter, Kim Cameron, Robert Quinn, and David Mann, among others. In this way, this work studied the influence of organizational culture on transformation projects and, breaking with current theory, built and proposed a theoretical framework, entitled the "Sistemática de Transformação" (Transformation Framework, or simply "Sistemática T"), which proposes alignment between three dimensions: Strategy, Transformation Project, and Organizational Culture.
Using this framework, change agents are expected to plan more effectively the process of diagnosing, evaluating, and managing organizational culture in alignment with the organization's Strategy and Transformation Project, with emphasis on Lean programs. The proposal and use of this framework can both foster academic discussion on the topic in the Operations Management field and provide support for more effective practical applications.
Abstract:
Model-Based Testing (MBT) has emerged as a promising strategy for minimizing problems related to the lack of time and resources in software testing; it aims to verify whether the implementation under test conforms to its specification. Test cases are generated automatically from behavioral models produced during the software development cycle. Among the existing modeling techniques, Input/Output Transition Systems (IOTSs) are widely used in MBT because they are more expressive than Finite State Machines (FSMs). Despite the existing methods for generating tests from IOTSs, test case selection remains a difficult and important topic. The existing methods for IOTSs are non-deterministic, unlike the established theory for FSMs, which guarantees complete coverage with respect to a fault model. This thesis investigates the application of fault models in deterministic test generation methods for IOTSs. A method was proposed for generating test suites based on the W method for FSMs. The method generates test suites deterministically and satisfies sufficiency conditions for covering both the specification and all faults in the defined fault domain. Empirical studies evaluated the applicability and effectiveness of the proposed method: experimental results analyzing the cost of generating test suites from randomly generated IOTSs, together with a case study using industrial specifications, show the effectiveness of the generated suites compared with Tretmans' traditional method.
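The W method mentioned above builds a test suite by concatenating a transition cover of the specification machine with a characterization set W. A minimal sketch for a deterministic Mealy machine (the example machine, function names, and the one-sequence W are invented for illustration, and the setting is much simpler than the IOTS theory of the thesis):

```python
from collections import deque

def transition_cover(fsm, initial, inputs):
    """Shortest access sequence to every state (BFS), extended by every
    input symbol, so that every transition is exercised at least once."""
    access = {initial: ()}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        for a in inputs:
            nxt = fsm[s][a][0]
            if nxt not in access:
                access[nxt] = access[s] + (a,)
                queue.append(nxt)
    cover = {()}  # the transition cover conventionally includes epsilon
    for seq in access.values():
        for a in inputs:
            cover.add(seq + (a,))
    return cover

def w_method_suite(fsm, initial, inputs, W):
    """W-method test suite: transition cover concatenated with every
    sequence of the characterization set W."""
    return {p + w for p in transition_cover(fsm, initial, inputs) for w in W}

# Invented 2-state Mealy machine: fsm[state][input] = (next_state, output).
fsm = {
    "s0": {"a": ("s1", 0), "b": ("s0", 1)},
    "s1": {"a": ("s0", 1), "b": ("s1", 0)},
}
suite = w_method_suite(fsm, "s0", ["a", "b"], W=[("a",)])
```

Running each sequence of `suite` against an implementation and comparing outputs detects any faulty implementation within the corresponding fault domain.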
Abstract:
This work presents a discussion of the thermal and elastic effects arising from the supporting pressure in journal bearings. To this end, a mathematical model is proposed based on the short-bearing equations, considering the cavitation region and using the principle of mass continuity. The bearing equations are derived from the Reynolds and energy equations, applying a modified version of Ocvirk's solution, and are solved numerically by the Finite Difference Method. In addition to the fluid-mechanics treatment, the work discusses two thermal models for predicting the mean fluid temperature and its influence on the pressure field, presenting representative plots of the pressure and temperature fields as well as the differences between the models and their implications. To compute the structural deformation, a Finite Element Model is used for a given geometry, evaluating the variation of the pressure field and how much this difference affects the other fluid properties. Finally, with the complete model, it is assessed how closely this short-bearing formulation approaches solutions for finite bearings, based on results from the literature, reaching deviations almost eight times smaller than those reported in the literature. In addition, the scope of the model can be established, that is, the conditions under which its properties are valid and can be used for preliminary studies.
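Ocvirk's short-bearing solution, which the model above takes as its starting point, has a closed form for the pressure field. The sketch below uses the classical textbook formula with a half-Sommerfeld (Gümbel) cavitation condition; the operating-point numbers are invented for illustration and are not from the dissertation:

```python
import math

def ocvirk_pressure(theta, z, mu, omega, c, eps, L):
    """Classical Ocvirk short-bearing pressure at angle theta (measured from
    the position of maximum film thickness) and axial position z (from the
    bearing mid-plane). Negative (cavitated) pressures are set to zero."""
    p = (3.0 * mu * omega / c ** 2) * (L ** 2 / 4.0 - z ** 2) \
        * eps * math.sin(theta) / (1.0 + eps * math.cos(theta)) ** 3
    return max(p, 0.0)

# Illustrative operating point: viscosity 0.03 Pa*s, 100 rad/s, 50 um radial
# clearance, eccentricity ratio 0.6, 20 mm bearing length.
p_mid = ocvirk_pressure(theta=math.pi / 2, z=0.0,
                        mu=0.03, omega=100.0, c=5e-5, eps=0.6, L=0.02)
```

The pressure vanishes at the bearing edges (z = ±L/2) and throughout the diverging (cavitated) half of the film, which is where the mass-continuity treatment discussed above refines this simple boundary condition.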
Abstract:
Robotics is a field that presents a large number of problems because it depends on many disciplines, devices, technologies, and tasks. Its expansion from perfectly controlled industrial environments toward open and dynamic environments presents many new challenges, such as household robots or professional service robots. To facilitate the rapid development of robotic systems with low cost, code reusability, medium- and long-term maintainability, and robustness, novel approaches are required that provide generic models and software systems and develop paradigms capable of solving these problems. For this purpose, in this paper we propose a model based on multi-agent systems, inspired by the human nervous system, able to transfer the control characteristics of the biological system and to take advantage of the best properties of distributed software systems.
Empirical study on the maintainability of Web applications: Model-driven Engineering vs Code-centric
Abstract:
Model-driven Engineering (MDE) approaches are often acknowledged to improve the maintainability of the resulting applications. However, there is a scarcity of empirical evidence backing their claimed benefits and limitations with respect to code-centric approaches. The purpose of this paper is to compare the performance and satisfaction of junior software maintainers while executing maintainability tasks on Web applications built with two different development approaches, one being OOH4RIA, a model-driven approach, and the other being a code-centric approach based on Visual Studio .NET and the Agile Unified Process. We conducted a quasi-experiment with 27 graduate students from the University of Alicante. They were randomly divided into two groups, and each group was assigned a different Web application on which they performed a set of maintainability tasks. The results show that maintaining Web applications with OOH4RIA clearly improves the performance of subjects. It also tips the satisfaction balance in favor of OOH4RIA, although not significantly. Model-driven development methods seem to improve both the developers' objective performance and their subjective opinions on ease of use of the method. Nevertheless, further experimentation is needed before the results can be generalized to different populations, methods, languages and tools, domains, and application sizes.
Abstract:
Context: Today’s project managers have a myriad of methods to choose from for the development of software applications. However, they lack empirical data about the character of these methods in terms of usefulness, ease of use or compatibility, all of these being relevant variables to assess the developer’s intention to use them. Objective: To compare three methods, each following a different paradigm (Model-Driven, Model-Based and Code-Centric) with respect to their adoption potential by junior software developers engaged in the development of the business layer of a Web 2.0 application. Method: We have conducted a quasi-experiment with 26 graduate students of the University of Alicante. The application developed was a Social Network, which was organized around a fixed set of modules. Three of them, similar in complexity, were used for the experiment. Subjects were asked to use a different method for each module, and then to answer a questionnaire that gathered their perceptions during such use. Results: The results show that the Model-Driven method is regarded as the most useful, although it is also considered the least compatible with previous developers’ experiences. They also show that junior software developers feel comfortable with the use of models, and that they are likely to use them if the models are accompanied by a Model-Driven development environment. Conclusions: Despite their relatively low level of compatibility, Model-Driven development methods seem to show a great potential for adoption. That said, however, further experimentation is needed to make it possible to generalize the results to a different population, different methods, other languages and tools, different domains or different application sizes.
Abstract:
Many applications, including object reconstruction, robot guidance, and scene mapping, require the registration of multiple views of a scene to generate a complete geometric and appearance model of it. In real situations, the transformations between views are unknown and expert inference must be applied to estimate them. In the last few years, the emergence of low-cost depth-sensing cameras has strengthened research on this topic, motivating a plethora of new applications. Although these cameras have enough resolution and accuracy for many applications, some situations cannot be solved with general state-of-the-art registration methods because of the signal-to-noise ratio (SNR) and the resolution of the data provided. The problem of working with low-SNR data may, in general terms, appear in any 3D system, so novel solutions are necessary in this respect. In this paper, we propose a method, μ-MAR, able to both coarsely and finely register sets of 3D points provided by low-cost depth-sensing cameras (although it is not restricted to these sensors) into a common coordinate system. The method overcomes the noisy-data problem by means of a model-based solution for multiplane registration. Specifically, it iteratively registers 3D markers composed of multiple planes extracted from points of multiple views of the scene. As the markers and the object of interest are static in the scenario, the transformations obtained for the markers are applied to the object in order to reconstruct it. Experiments have been performed using synthetic and real data. The synthetic data allow a qualitative and quantitative evaluation by means of visual inspection and the Hausdorff distance, respectively. The real-data experiments show the performance of the proposal using data acquired by a Primesense Carmine RGB-D sensor. The method has been compared to several state-of-the-art methods.
The results show the good performance of μ-MAR in registering objects with high accuracy in the presence of noisy data, outperforming the existing methods.
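The core step of bringing point sets from several views into a common coordinate system is the least-squares rigid transform between corresponding points. The following is only a generic SVD-based (Kabsch) sketch of that step, not the μ-MAR multi-plane pipeline itself; the synthetic rotation and translation are invented for the check:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst via the
    SVD-based (Kabsch) solution; src and dst are (N, 3) corresponding points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic check: rotate and translate a random patch, then recover the motion.
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
dst = src @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_transform(src, dst)
```

In a marker-based scheme such as the one described, this estimate would be computed between plane features of the markers rather than raw noisy points, which is what makes the registration robust to low-SNR data.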
Abstract:
The cell concentration and size distribution of the microalga Nannochloropsis gaditana were studied over the whole growth process. Various samples were taken during the light and dark periods to which the algae were exposed. The distributions obtained exhibited positive skew, and no change in the type of distribution was observed during the growth process. The size distribution shifted to lower diameters in dark periods, while in light periods the opposite occurred. The overall trend during the growth process was that the size distribution shifted to larger cell diameters, with differences between the initial and final distributions of individual cycles becoming smaller. A model for cell concentration as a function of time, based on the logistic model and also taking into account cell respiration during dark periods and growth during light periods, was proposed and successfully applied. This model provides a picture that is closer to the real growth and evolution of the cultures, and reveals a clear effect of light and dark periods on the different ways in which cell concentration and diameter evolve with time.
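The logistic backbone of such a growth model is simple to state. The sketch below implements only the basic logistic curve N(t) = Nmax / (1 + (Nmax/N0 − 1)·e^(−μt)), without the paper's dark-period respiration term; the parameter values are illustrative, not fitted ones:

```python
import math

def logistic(t, n0, nmax, mu):
    """Logistic cell-concentration curve N(t): n0 is the initial
    concentration, nmax the carrying capacity, and mu the maximum
    specific growth rate."""
    return nmax / (1.0 + (nmax / n0 - 1.0) * math.exp(-mu * t))

# Illustrative parameters (cells/mL and 1/day), not values from the study.
n_day10 = logistic(10.0, n0=1e6, nmax=5e7, mu=0.5)
```

A light/dark-aware variant would alternate this growth law during light periods with a decay term for respiration during dark periods, which is the refinement the abstract describes.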
Abstract:
The aim of this work is to improve students' learning by designing a teaching model that seeks to increase student motivation to acquire new knowledge. To design the model, the methodology is based on the study of students' opinions on several aspects that we think significantly affect the quality of teaching (such as overcrowded classrooms, the time allotted to the subject, or the type of classroom where classes are taught), and on our experience in performing several experimental activities in the classroom (for instance, peer reviews and oral presentations). Besides the feedback from the students, it is essential to rely on the experience and reflections of lecturers who have taught the subject for several years. In this way we could detect several key aspects that, in our opinion, must be considered when designing a teaching proposal: motivation, assessment, progressiveness, and autonomy. As a result we have obtained a teaching model based on instructional design as well as on the principles of fractal geometry, in the sense that different levels of abstraction for the various training activities are presented and the activities are self-similar, that is, they are decomposed again and again. At each level, an activity decomposes into lower-level tasks and their corresponding evaluation. With this model, immediate feedback and student motivation are encouraged. We are convinced that greater motivation will lead to an increase in students' working time and performance. Although the study has been carried out on a single subject, the results are fully generalizable to other subjects.
Abstract:
Optimal currency area theory suggests that business cycle comovement is a sufficient condition for monetary union, particularly if there are low levels of labour mobility between potential members of the monetary union. Previous studies of the comovement of business cycle variables (mainly authored by Artis and Zhang in the late 1990s) found that there was a core of member states in the EU that could be grouped together as having similar business cycle comovements, but these studies always used Germany as the country against which to compare. In this study, the analysis of Artis and Zhang is extended and updated by correlating against both German and euro area macroeconomic aggregates and by using more recent techniques in cluster analysis, namely model-based clustering.
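Model-based clustering treats each cluster as a component of a finite mixture model fitted by EM, rather than relying on a distance heuristic. The sketch below is a minimal one-dimensional Gaussian-mixture EM on synthetic correlation-like data; it is a generic illustration, not the clustering software used in the study:

```python
import numpy as np

def gmm_em_1d(x, k=2, iters=200):
    """Minimal EM fit of a 1-D Gaussian mixture: the 'model' in model-based
    clustering, with one Gaussian component per cluster."""
    w = np.full(k, 1.0 / k)                  # mixing weights
    mu = np.linspace(x.min(), x.max(), k)    # spread out the initial means
    var = np.full(k, x.var())                # shared initial variance
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
                 / np.sqrt(2.0 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Two synthetic groups of cross-correlations, centred near 0.2 and 0.8,
# standing in for "core" and "periphery" comovement with the euro area.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.2, 0.05, 200), rng.normal(0.8, 0.05, 200)])
w, mu, var = gmm_em_1d(x)
```

In practice, model-based clustering packages also select the number of components and the covariance structure by an information criterion such as BIC, which this sketch omits.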