828 results for Input-output data
Abstract:
This research aimed to develop a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. This study views an enterprise as a system that creates value for its customers; developing the framework therefore made use of systems theory and IDEF methodologies. The study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks down an enterprise system into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. The framework is intended for identifying research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, captures the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic: it consists of a hierarchy of engineering activities presented in an IDEF0 model, where each activity is defined with its inputs, outputs, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
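The 4 x 4 structure of the classification scheme can be enumerated directly; this is a minimal sketch using only the element, facet, and process names given in the abstract:

```python
from itertools import product

# The four enterprise elements, four system facets, and the engineering
# process stages, as named in the proposed ESE classification scheme.
ELEMENTS = ["work", "resources", "decision", "information"]
FACETS = ["strategy", "competency", "capacity", "structure"]
PROCESS = ["specification", "analysis", "design", "implementation"]

# Every element-facet combination is subject to the full engineering
# process, giving a 4 x 4 grid of engineering concerns.
grid = [(e, f) for e, f in product(ELEMENTS, FACETS)]

assert len(grid) == 16  # 4 elements x 4 facets
```

Each of the 16 grid cells would then be walked through the four `PROCESS` stages, which is where the scheme is intended to expose research voids.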
Abstract:
Demand-side growth accounting studies the contributions of the aggregate demand components to Gross Domestic Product (GDP). Traditionally, international and national organizations use the traditional method for calculating such contributions. However, this method does not take into account the effect of the induction of imports by the various components of aggregate demand. As alternatives, other studies consider this effect: the alternative method proposed by Lara (2013); the attribution method proposed by Kranendonk and Verbruggen (2005) and Hoekstra and van der Helm (2010); and the Sraffian supermultiplier method of Freitas and Dweck (2013). A summary of these methods was made, demonstrating the similarities and differences between them. In addition, to contribute to the study of the subject, the "method of distribution of imports" was developed, which aims to distribute imports among the various components of aggregate demand using the information set forth in the input-output matrices and the supply and use tables. The contributions to the growth of the macroeconomic aggregates for Brazil from 2001 to 2009 were calculated using the distribution method and compared with the traditional method, explaining the reasons for the differences in contributions. Subsequently, comparisons were made across all the methods presented in this work between the calculated contributions to growth of the components of aggregate demand and of the domestic and external sectors. It was verified that the methods existing in the literature are not sufficient to deal with this question and, given the alternatives for growth contributions presented throughout this work, it is believed that the distribution method provides the best estimates of contributions by aggregate demand sector.
In particular, the main advantage of this method over the others is the breakdown of the contribution of imports by aggregate demand component, which allows the analysis of the contribution of each component to GDP growth. Thus, this type of analysis helps in studying the growth pattern of the Brazilian economy, not just from a theoretical point of view but also empirically, and provides a basis for economic policy decisions.
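The core idea of distributing imports by component can be sketched numerically. The component levels and import-content coefficients below are hypothetical stand-ins, not the Brazilian data; the real coefficients would come from the input-output matrices and supply and use tables. The sketch shows that netting each component of the imports it induces preserves the same total growth contribution as the traditional method:

```python
import numpy as np

# Hypothetical demand components, levels, and import-content coefficients.
components = ["consumption", "investment", "government", "exports"]
D_prev = np.array([600.0, 200.0, 150.0, 100.0])   # levels in year t-1
D_curr = np.array([630.0, 210.0, 155.0, 110.0])   # levels in year t
m = np.array([0.10, 0.25, 0.05, 0.15])            # import content per component

gdp_prev = D_prev.sum() - (m * D_prev).sum()

# Traditional method: imports enter as a single aggregate deduction.
dM = (m * D_curr).sum() - (m * D_prev).sum()
trad = (D_curr - D_prev) / gdp_prev               # gross component contributions
trad_total = trad.sum() - dM / gdp_prev

# Distribution method: each component's contribution is net of the
# imports it induces.
dist = ((D_curr - m * D_curr) - (D_prev - m * D_prev)) / gdp_prev

# Both methods account for the same total GDP growth.
assert np.isclose(dist.sum(), trad_total)
```

The per-component values in `dist` are what the traditional method cannot provide, since it only subtracts `dM` as one aggregate block.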
Abstract:
Wind-induced exposure is one of the major forces shaping the geomorphology and biota in coastal areas. The effect of wave exposure on littoral biota is well known in marine environments (Ekebon et al., 2003; Burrows et al., 2008). In the Cabrera Archipelago National Park, wave exposure has been demonstrated to have an effect on the spatial distribution of different stages of E. marginatus (Alvarez et al., 2010). Standardized average wave exposures during 2008 along the Cabrera Archipelago National Park coastline were calculated to be applied in studies of littoral species distribution within the archipelago. Average wave exposure (or apparent wave power) was calculated for points located 50 m apart along the coastline, following the EXA methodology (EXposure estimates for fragmented Archipelagos) (Ekebon et al., 2003). The average wave exposures were standardized from 1 to 100 (minimum and maximum in the area), showing coastal areas with different levels of mean wave exposure during the year. Input wind data (direction and intensity) from 2008 were registered at the Cabrera mooring located north of the Cabrera Archipelago. Data were provided by IMEDEA (CSIC-UIB, TMMOS http://www.imedea.uib-csic.es/tmoos/boyas/). This cartography has been developed under the framework of the project EPIMHAR, funded by the National Parks Network (Spanish Ministry of Environment, Maritime and Rural Affairs, reference: 012/2007). Part of this work has been developed under the research programs funded by "Fons de Garantia Agrària i Pesquera de les Illes Balears (FOGAIBA)".
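The 1-to-100 standardization described above is a linear min-max rescaling; a minimal sketch with hypothetical raw exposure values (the real inputs would be the EXA exposure estimates at the 50 m coastline points):

```python
def standardize(values, lo=1.0, hi=100.0):
    """Rescale raw exposure values linearly so that the minimum of the
    area maps to `lo` and the maximum maps to `hi`."""
    vmin, vmax = min(values), max(values)
    return [lo + (v - vmin) * (hi - lo) / (vmax - vmin) for v in values]

exposures = [0.2, 3.5, 7.9, 1.1]   # hypothetical raw exposure values
scaled = standardize(exposures)

# The area's minimum and maximum now sit at 1 and 100.
assert abs(min(scaled) - 1.0) < 1e-9
assert abs(max(scaled) - 100.0) < 1e-9
```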
Abstract:
Peer reviewed
Abstract:
We study a small circuit of coupled nonlinear elements to investigate general features of signal transmission through networks. The small circuit itself is perceived as a building block for larger networks. Individual dynamics and coupling are motivated by neuronal systems: we consider two types of dynamical modes for an individual element, regular spiking and chattering, and each individual element can receive excitatory and/or inhibitory inputs and is subjected to different feedback types (excitatory and inhibitory; forward and recurrent). Both deterministic and stochastic simulations are carried out to study the input-output relationships of these networks. Major results for regular spiking elements include frequency locking, spike rate amplification for strong synaptic coupling, and inhibition-induced spike rate control, which can be interpreted as an output frequency rectification. For chattering elements, spike rate amplification at low frequencies and silencing at large frequencies are characteristic.
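An input-output relationship of the kind studied here can be illustrated with a much simpler stand-in element than the regular-spiking or chattering dynamics of the abstract: the leaky integrate-and-fire (LIF) neuron, whose rate response to a constant input has a closed form. All parameter values below are illustrative assumptions:

```python
import math

def lif_rate(I, tau=0.02, v_th=1.0, v_reset=0.0, R=1.0):
    """Steady-state firing rate (Hz) of a leaky integrate-and-fire element
    driven by a constant input current I (analytical solution of
    tau*dv/dt = -v + R*I with threshold v_th and reset v_reset)."""
    if R * I <= v_th:
        return 0.0                       # subthreshold input: silent
    t_spike = tau * math.log((R * I - v_reset) / (R * I - v_th))
    return 1.0 / t_spike

# Input-output curve: silent below threshold, then monotonically increasing.
rates = [lif_rate(I) for I in (0.5, 1.2, 2.0, 4.0)]
assert rates[0] == 0.0
assert rates[1] < rates[2] < rates[3]
```

Rectification at threshold is visible directly in the curve: below `v_th / R` the output frequency is zero regardless of input.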
Abstract:
The real-time optimization of large-scale systems is a difficult problem due to the need for complex models involving uncertain parameters and the high computational cost of solving such problems by a centralized approach. Extremum-seeking control (ESC) is a model-free real-time optimization technique which can estimate unknown parameters and can optimize nonlinear time-varying systems using only a measurement of the cost function to be minimized. In this thesis, we develop a distributed version of extremum-seeking control which allows large-scale systems to be optimized without models and with minimal computing power. First, we develop a continuous-time distributed extremum-seeking controller. It has three main components: consensus, parameter estimation, and optimization. The consensus provides each local controller with an estimate of the cost to be minimized, allowing them to coordinate their actions. Using this cost estimate, parameters for a local input-output model are estimated, and the cost is minimized by following a gradient descent based on the estimate of the gradient. Next, a similar distributed extremum-seeking controller is developed in discrete-time. Finally, we consider an interesting application of distributed ESC: formation control of high-altitude balloons for high-speed wireless internet. These balloons must be steered into a favourable formation where they are spread out over the Earth and provide coverage to the entire planet. Distributed ESC is applied to this problem, and is shown to be effective for a system of 1200 balloons subjected to realistic wind currents. The approach does not require a wind model and uses a cost function based on a Voronoi partition of the sphere. Distributed ESC is able to steer balloons from a few initial launch sites into a formation which provides coverage to the entire Earth and can maintain a similar formation as the balloons move with the wind around the Earth.
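The core ESC mechanism, before any distribution or consensus is added, can be sketched in its classic single-loop, discrete-time form: perturb the input with a sinusoidal dither, measure only the cost, demodulate to estimate the gradient, and descend. The cost function and all gains below are illustrative assumptions, not the thesis's controller:

```python
import math

def cost(u):
    # Cost unknown to the controller; minimum at u* = 3.
    return (u - 3.0) ** 2 + 1.0

u_hat = 0.0                          # current estimate of the optimal input
a, omega, gamma = 0.2, 1.0, 0.001    # dither amplitude, frequency, step size

for k in range(4000):
    dither = a * math.sin(omega * k)
    y = cost(u_hat + dither)                        # only a cost measurement
    grad_est = (2.0 / a) * y * math.sin(omega * k)  # demodulated gradient estimate
    u_hat -= gamma * grad_est                       # descend on the estimate

assert abs(u_hat - 3.0) < 0.3        # converges near the unknown optimum
```

On average the demodulated term equals the true gradient (the zero-mean dither terms cancel over time), which is why the scheme needs no model of `cost`.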
Abstract:
An assessment of the sustainability of the Irish economy has been carried out using three methodologies, enabling comparison and evaluation of the advantages and disadvantages of each, and potential synergies among them. The three measures chosen were economy-wide Material Flow Analysis (MFA), environmentally extended input-output (EE-IO) analysis and the Ecological Footprint (EF). The research aims to assess the sustainability of the Irish economy using these methods and to draw conclusions on their effectiveness in policy making both individually and in combination. A theoretical description discusses the methods and their respective advantages and disadvantages and sets out a rationale for their combined application. The application of the methods in combination has provided insights into measuring the sustainability of a national economy and generated new knowledge on the collective application of these methods. The limitations of the research are acknowledged and opportunities to address these and build on and extend the research are identified. Building on previous research, it is concluded that a complete picture of sustainability cannot be provided by a single method and/or indicator.
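Of the three methods, EE-IO analysis has the most direct computational core: the Leontief quantity model extended with an environmental coefficient vector. A minimal sketch with a hypothetical 3-sector economy (real applications use national IO tables):

```python
import numpy as np

# Hypothetical technical (inter-industry) coefficient matrix A,
# emission intensities f, and final demand y for three sectors.
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.1],
              [0.1, 0.0, 0.2]])
f = np.array([0.5, 1.2, 0.3])      # emissions per unit of sectoral output
y = np.array([100.0, 50.0, 80.0])  # final demand

# Leontief quantity model: gross output x solves (I - A) x = y.
x = np.linalg.solve(np.eye(3) - A, y)

# Environmental extension: emissions embodied in final demand.
total_emissions = f @ x

# Gross output is at least final demand (intermediate use adds on top).
assert (x >= y).all()
```

MFA and the Ecological Footprint, by contrast, are accounting frameworks rather than model solutions, which is part of why the thesis argues the methods are complementary.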
Abstract:
In this work we explore optimising parameters of a physical circuit model relative to input/output measurements, using the Dallas Rangemaster Treble Booster as a case study. A hybrid metaheuristic/gradient descent algorithm is implemented, where the initial parameter sets for the optimisation are informed by nominal values from schematics and datasheets. Sensitivity analysis is used to screen parameters, which informs a study of the optimisation algorithm against model complexity by fixing parameters. The results of the optimisation show a significant increase in the accuracy of model behaviour, but also highlight several key issues regarding the recovery of parameters.
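The hybrid metaheuristic/gradient-descent idea can be sketched on a toy problem. The "circuit" below is a hypothetical static nonlinearity, not the Rangemaster model; a crude random search stands in for the metaheuristic stage (seeded over plausible ranges, as schematic nominals would be), and a finite-difference gradient descent polishes the best candidate:

```python
import random

# Toy stand-in "circuit": a static nonlinearity y = g*x / (1 + c*x).
# Synthetic measurements are generated from known "true" parameters.
g_true, c_true = 5.0, 0.8
xs = [0.1 * i for i in range(1, 21)]
ys = [g_true * x / (1 + c_true * x) for x in xs]

def loss(p):
    g, c = p
    return sum((g * x / (1 + c * x) - y) ** 2 for x, y in zip(xs, ys))

# Stage 1: random search over plausible parameter ranges.
random.seed(0)
candidates = [[random.uniform(1, 10), random.uniform(0.1, 2)] for _ in range(200)]
best = min(candidates, key=loss)

# Stage 2: finite-difference gradient descent polishes the best candidate.
h, lr = 1e-6, 0.003
for _ in range(20000):
    base = loss(best)
    grad = [(loss([best[0] + h, best[1]]) - base) / h,
            (loss([best[0], best[1] + h]) - base) / h]
    best = [best[0] - lr * grad[0], best[1] - lr * grad[1]]

assert loss(best) < 1e-3
```

The two-stage structure mirrors the paper's motivation: the global stage avoids poor local basins, and the local stage recovers precise parameter values, though with noisy real measurements parameter recovery is harder than this toy suggests.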
Abstract:
The transfer of the right information at the right time, together with high-quality work at every stage of a company's order-to-delivery chain, are key factors in fulfilling the value proposition and the quality promised to the customer. The goal of this Master's thesis is to develop tools for an SME for better information management and high-quality work in its enterprise resource planning (ERP) system. The research method was action research, in which the author participated in the target company's daily work for four months. Data were also collected through semi-structured interviews and a survey. The research approach is qualitative. The thesis consists of a theoretical part and an applied part, after which the results are summarized in the conclusions and summary. ERP systems collect and store the information entered by employees and by people working at the company's interfaces. It is therefore extremely important that the company has documented, uniform operating models for the processes used to store information in its systems. This thesis examines the SME's current practices for storing information in the ERP system, after which uniform instructions are developed for sales order contracts entered into the system. The theoretical part presents quality from different perspectives, explains what quality management systems are and how they are developed, and covers the principles of make-to-order production and the significance of an ERP system for the business. The theory lays the groundwork for the applied part, in which, after a problem analysis, a company-specific quality management system is developed, together with new working models for information exchange and storage. A further result is more efficient use of the ERP system, implemented by the software vendor: unnecessary item nomenclatures were removed from the program and its configuration was streamlined.
As a result, work instructions were produced for performing the core processes, along with a company-specific quality management system to support the company's core and support processes and its information management.
Abstract:
Dissertation (Master's), Universidade de Brasília, Faculdade Gama, Programa de Pós-Graduação em Engenharia Biomédica, 2015.
Abstract:
Common building energy modeling approaches do not account for the influence of the surrounding neighborhood on energy consumption patterns. This thesis develops a framework to quantify the neighborhood impact on a building's energy consumption based on the local wind flow. The airflow in the neighborhood is predicted using Computational Fluid Dynamics (CFD) for eight principal wind directions. The developed framework uses wind multipliers to adjust the wind velocity encountering the target building, and the adjusted wind velocities are passed to the building energy model through the input weather data. In a case study, the CFD method is validated by comparison with on-site temperature measurements, and the building energy model is calibrated using utility data. A comparison between the adjusted and original weather data shows that the building energy consumption and air-system heat gain decreased by 5% and 37%, respectively, while the cooling gain increased by 4% annually.
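The wind-multiplier adjustment amounts to scaling each weather-file wind speed by a direction-dependent factor. The multiplier values below are hypothetical placeholders for what the eight CFD runs would produce:

```python
# Hypothetical direction-dependent wind multipliers for the eight
# principal directions (the real values come from the CFD simulations).
SECTORS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
MULTIPLIERS = {"N": 0.65, "NE": 0.72, "E": 0.90, "SE": 1.05,
               "S": 0.80, "SW": 0.60, "W": 0.55, "NW": 0.70}

def adjust_wind(speed, direction_deg):
    """Scale a weather-file wind speed by the multiplier of the nearest
    principal direction before passing it to the energy model."""
    idx = int((direction_deg % 360) / 45.0 + 0.5) % 8
    return speed * MULTIPLIERS[SECTORS[idx]]

assert abs(adjust_wind(10.0, 0) - 6.5) < 1e-9     # due north -> N multiplier
assert abs(adjust_wind(10.0, 100) - 9.0) < 1e-9   # nearest sector: E (90 deg)
```

Applying this to every hourly record yields the "adjusted weather data" that drives the comparison reported above.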
Abstract:
Natural language processing has achieved great success in a wide range of applications, producing both commercial language services and open-source language tools. However, most methods take a static or batch approach, assuming that the model has all the information it needs and makes a one-time prediction. In this dissertation, we study dynamic problems where the input comes in a sequence instead of all at once, and the output must be produced while the input is arriving. In these problems, predictions are often made based only on partial information. We see this dynamic setting in many real-time, interactive applications. These problems usually involve a trade-off between the amount of input received (cost) and the quality of the output prediction (accuracy). Therefore, the evaluation considers both objectives (e.g., plotting a Pareto curve). Our goal is to develop a formal understanding of sequential prediction and decision-making problems in natural language processing and to propose efficient solutions. Toward this end, we present meta-algorithms that take an existing batch model and produce a dynamic model to handle sequential inputs and outputs. We build our framework upon the theory of Markov Decision Processes (MDPs), which allows learning to trade off competing objectives in a principled way. The main machine learning techniques we use are from imitation learning and reinforcement learning, and we advance current techniques to tackle problems arising in our settings. We evaluate our algorithms on a variety of applications, including dependency parsing, machine translation, and question answering. We show that our approach achieves a better cost-accuracy trade-off than the batch approach and heuristic-based decision-making approaches. We first propose a general framework for cost-sensitive prediction, where different parts of the input come at different costs.
We formulate a decision-making process that selects pieces of the input sequentially, and the selection is adaptive to each instance. Our approach is evaluated on both standard classification tasks and a structured prediction task (dependency parsing). We show that it achieves similar prediction quality to methods that use all of the input, while incurring a much smaller cost. Next, we extend the framework to problems where the input is revealed incrementally in a fixed order. We study two applications: simultaneous machine translation and quiz bowl (incremental text classification). We discuss challenges in this setting and show that adding domain knowledge eases the decision-making problem. A central theme throughout the chapters is an MDP formulation of a challenging problem with sequential input/output and trade-off decisions, accompanied by a learning algorithm that solves the MDP.
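The cost-accuracy trade-off in incremental prediction can be made concrete with a toy policy: read tokens one at a time and stop as soon as accumulated evidence crosses a confidence threshold. This is a deliberately simple threshold heuristic, not the dissertation's learned MDP policy, and the word lists are illustrative:

```python
# Toy incremental sentiment classifier with an early-stopping policy.
POSITIVE = {"great", "excellent", "good"}
NEGATIVE = {"awful", "bad", "terrible"}

def predict_incremental(tokens, threshold):
    """Return (label, tokens_consumed); label is +1/-1, or 0 if undecided."""
    score = 0
    for i, tok in enumerate(tokens, start=1):
        score += (tok in POSITIVE) - (tok in NEGATIVE)
        if abs(score) >= threshold:       # confident enough: stop early
            return (1 if score > 0 else -1), i
    return (1 if score > 0 else -1 if score < 0 else 0), len(tokens)

sent = "the food was great and the service was excellent".split()
# A low threshold answers early (cheap); a high one reads more input.
label_fast, cost_fast = predict_incremental(sent, threshold=1)
label_slow, cost_slow = predict_incremental(sent, threshold=2)
assert cost_fast < cost_slow and label_fast == label_slow == 1
```

Sweeping `threshold` traces out exactly the kind of cost-accuracy Pareto curve used for evaluation above; the MDP formulation replaces the fixed threshold with a learned, instance-adaptive stopping decision.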
Abstract:
Care for dependency is, without a doubt, one of the great future challenges of rapidly ageing societies, a challenge to which Spain and Andalusia are not only not immune but which they exhibit in amplified form. However, it is a phenomenon that suffers from significant information gaps, fundamentally the lack of an integral perspective connecting social reality, economic change, the need to transform welfare models in general and care models in particular, the expenditure required for its development, and the associated economic impact; this is the holistic approach adopted in this work. Within this framework, the main objective is to quantify the impact in terms of activity, employment and fiscal return, as well as the relationship between the benefit received and the characteristics of dependent persons, so as to advance the consideration of this social expenditure as an investment. The 2010 Input-Output Framework of Andalusia (MIOAN) was used to estimate the impacts. However, to calculate the income increases generated by the induced requirements, consumption demand was endogenized; this entailed constructing, specifically for this calculation, an augmented Leontief matrix and carrying out a specific estimation of the additional augmented row. Finally, to estimate the relationship between benefit and the characteristics of dependent persons, several binary logistic models and a multinomial model were built. With the research completed and the impact quantified, the fundamental conclusion is not only the strong pull effect of dependency expenditure on activity and employment, but also that it occurs within an irreversible change that gives the Long-Term Care associated with dependency a central role in the evolution of the Welfare State.
Specifically, it has been verified that:
• Andalusia spends heavily on dependency care, 1,314 million euros, of which 85% is public and 15% private, and this expenditure generates a strong impact on the Andalusian economy: its contribution to Andalusia's major macroeconomic aggregates is 0.9% of GDP, greater than the demand impulse that dependency-care expenditure itself represents.
• Dependency care has shown a high capacity to generate employment (more than 40,000 jobs), heavily concentrated among people who face difficulties entering the labour market: mature women with medium-low qualifications and little verifiable experience in the formal care sector.
• The relationship between users, expenditure and impact for transfers and services shows a staircase pattern characterizing a medium-cost universalization system with moderate impact, in which transfers account for 50% of those cared for, 30% of the expenditure and 20% of the impact.
• The fiscal return generated by dependency care is very high, almost 45%.
• The financial effort, both gross and net of the tax return obtained by each administration, is very asymmetric, which may disincentivize the development of a model based more on services than on transfers.
• The characteristics of dependent persons condition the benefit received, which appears to be adjusted to personal characteristics and degree of disability rather than to socioeconomic ones.
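The endogenization of consumption described above corresponds to the classic "type II" augmented Leontief matrix: an extra row for income generation and an extra column for induced consumption spending. A minimal 2-sector sketch with hypothetical coefficients (not the MIOAN 2010 values):

```python
import numpy as np

# Hypothetical 2-sector economy.
A = np.array([[0.15, 0.25],
              [0.20, 0.10]])          # inter-industry coefficients
wages = np.array([0.30, 0.40])       # household income per unit of output
cons = np.array([0.50, 0.35])        # consumption per unit of household income

# Augmented matrix: extra row = income generation, extra column = induced
# consumption spending by households.
A_aug = np.block([[A, cons.reshape(2, 1)],
                  [wages.reshape(1, 2), np.zeros((1, 1))]])

y = np.array([100.0, 80.0, 0.0])     # exogenous final demand (none on households)
x_aug = np.linalg.solve(np.eye(3) - A_aug, y)

# Endogenizing consumption raises total output versus the open model,
# since induced household spending feeds back into production.
x_open = np.linalg.solve(np.eye(2) - A, y[:2])
assert (x_aug[:2] > x_open).all()
```

The third entry of `x_aug` is the induced household income, which is the quantity the study's specifically estimated augmented row is designed to capture.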
Abstract:
Synthetic biology, by co-opting molecular machinery from existing organisms, can be used as a tool for building new genetic systems from scratch, for understanding natural networks through perturbation, or for hybrid circuits that piggy-back on existing cellular infrastructure. Although the toolbox for genetic circuits has greatly expanded in recent years, it is still difficult to separate the circuit function from its specific molecular implementation. In this thesis, we discuss the function-driven design of two synthetic circuit modules, and use mathematical models to understand the fundamental limits of circuit topology versus operating regimes as determined by the specific molecular implementation. First, we describe a protein concentration tracker circuit that sets the concentration of an output protein relative to the concentration of a reference protein. The functionality of this circuit relies on a single negative feedback loop that is implemented via small programmable protein scaffold domains. We build a mass-action model to understand the relevant timescales of the tracking behavior and how the input/output ratios and circuit gain might be tuned with circuit components. Second, we design an event detector circuit with permanent genetic memory that can record order and timing between two chemical events. This circuit was implemented using bacteriophage integrases that recombine specific segments of DNA in response to chemical inputs. We simulate expected population-level outcomes using a stochastic Markov-chain model, and investigate how inferences on past events can be made from differences between single-cell and population-level responses. Additionally, we present some preliminary investigations on spatial patterning using the event detector circuit as well as the design of stationary phase promoters for growth-phase dependent activation. 
These results advance our understanding of synthetic gene circuits, and contribute towards the use of circuit modules as building blocks for larger and more complex synthetic networks.
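The tracking behavior of the first circuit can be illustrated with a heavily simplified dynamical sketch: production of the output protein Y is driven by the reference R, and the scaffold-mediated negative feedback is collapsed into a single effective removal term. This is not the paper's mass-action model; rate constants are illustrative, and the steady-state ratio Y*/R = kp/kd stands in for the tunable circuit gain:

```python
# Euler integration of a simplified tracker: dY/dt = kp*R - kd*Y,
# whose steady state is Y* = (kp/kd) * R.
def simulate(R, kp=0.5, kd=0.25, dt=0.01, t_end=100.0):
    y, t = 0.0, 0.0
    while t < t_end:
        y += dt * (kp * R - kd * y)   # production driven by R, feedback removal
        t += dt
    return y

# The input/output ratio is set by kp/kd, analogous to tuning circuit gain.
assert abs(simulate(R=10.0) - 20.0) < 0.1   # kp/kd = 2 -> Y* = 2*R
```

The relevant timescale of tracking in this reduction is 1/kd, which is the kind of quantity the mass-action model is used to relate to actual circuit components.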