851 results for Computer input-output equipment.
Abstract:
In recent years there has been growing concern about the emission trade balances of countries. This is because countries with open economies are active players in international trade. Trade is not only a major factor in shaping a country's economic structure; it also moves embodied emissions across country borders. This issue is especially relevant from the perspectives of carbon accounting policy and domestic production, since the production-based principle is the one employed in the Kyoto agreement. The research described herein was designed to reveal the interdependence of countries on international trade and the corresponding embodied emissions, at both the national and the sectoral level, and to illustrate the significance of consumption-based emission accounting. It is shown to what extent consumption-based accounting would change the present system, which rests on production-based accounting and allocation. The relationship between CO2 emissions embodied in exports and those embodied in imports is analysed. International trade can blur the responsibility for the ecological effects of production and consumption and can lengthen the link between consumption and its consequences. Input-output models are used in the methodology, as they provide an appropriate framework for climate change accounting. The analysis comprises an international comparative study of four European countries (Germany, the United Kingdom, the Netherlands, and Hungary) with extensive trading activities and carbon emissions. Moving from a production-based approach in climate policy to a consumption-based principle and allocation would help to increase the efficiency of emission reductions and would force countries to rethink their trading activities in order to decrease the environmental load of production. The results of this study show that it is important to distinguish between the two emission accounting approaches, at both the global and the local level.
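As a minimal sketch of the consumption-based idea (not the study's own multi-regional model; the matrices and figures below are purely illustrative), territorial emissions can be reallocated by combining a direct emission-intensity vector with the Leontief inverse and splitting final demand into domestic use and exports:

```python
import numpy as np

# Illustrative 3-sector economy (values are made up, not from the study).
A = np.array([[0.10, 0.05, 0.02],          # technical coefficients (inputs per unit output)
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.08]])
f = np.array([0.9, 0.3, 0.1])              # direct CO2 per unit of sectoral output

y_domestic = np.array([50.0, 80.0, 120.0]) # final demand served at home
y_exports  = np.array([30.0, 10.0,   5.0]) # final demand served abroad

L = np.linalg.inv(np.eye(3) - A)           # Leontief inverse (I - A)^-1

# Production-based emissions: everything emitted on the territory.
production_based = f @ L @ (y_domestic + y_exports)

# Emissions embodied in exports leave the consumption-based account;
# emissions embodied in imports (computed analogously abroad) would be added.
embodied_in_exports = f @ L @ y_exports
consumption_based_partial = production_based - embodied_in_exports

print(f"production-based:              {production_based:.1f}")
print(f"embodied in exports:           {embodied_in_exports:.1f}")
print(f"consumption-based (ex imports): {consumption_based_partial:.1f}")
```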
Abstract:
Small errors can prove catastrophic. Our purpose is to remark that a very small cause which escapes our notice can determine a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. Small differences in the initial conditions produce very great ones in the final phenomena: a small error in the former produces an enormous error in the latter. When dealing with any kind of electrical device specification, it is important to note that a pair of test conditions defines a test: the forcing function and the limit. Forcing functions define the external operating constraints placed upon the device under test; the actual test determines how well the device responds to those constraints. Forcing inputs to threshold, for example, represents the most difficult testing because it places those inputs as close as possible to the actual switching critical points and guarantees that the device will meet its input-output specifications. Prediction becomes impossible by the classical analytical analysis bounded by Newton and Euclid. We have found that nonlinear dynamics is the natural state of all circuits and devices. Opportunities exist for effective error detection in a nonlinear dynamics and chaos environment. Nowadays a set of linear limits is established around every aspect of digital or analog circuits, outside of which devices are considered bad after failing the test. Deterministic chaos in circuits is a fact, not a possibility, as has been revealed by this Ph.D. research. In practice, for standard linear informational methodologies, this chaotic data product is usually undesirable, and we are trained to seek a more regular stream of output data. This Ph.D. research explored the possibility of taking the foundation of a well-known simulation and modeling methodology and introducing nonlinear dynamics and chaos precepts to produce a new error detector instrument able to bring together streams of data scattered in space and time, thereby mastering deterministic chaos and changing the bad reputation of chaotic data as a potential risk for practical system status determination.
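A toy illustration of the "small error in the former produces an enormous error in the latter" point (the logistic map is a stand-in here, not the circuits studied in the dissertation):

```python
# Two trajectories of the logistic map x_{n+1} = r*x*(1 - x) in its chaotic
# regime (r = 4), started a tiny distance apart, diverge to O(1) separation.
r = 4.0
x, x_perturbed = 0.2, 0.2 + 1e-10

for n in range(60):
    x = r * x * (1.0 - x)
    x_perturbed = r * x_perturbed * (1.0 - x_perturbed)
    if n % 10 == 9:
        print(f"step {n + 1:2d}: separation = {abs(x - x_perturbed):.3e}")
```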
Abstract:
This research aimed at developing a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. This study views an enterprise as a system that creates value for its customers; developing the framework therefore made use of systems theory and IDEF methodologies. The study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks an enterprise system down into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. The framework is intended for identifying research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, harnesses the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic: it consists of a hierarchy of engineering activities presented in an IDEF0 model, and each activity is defined with its inputs, outputs, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The proposed ESE process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
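One hypothetical way to encode the proposed 4 x 4 classification scheme, purely as an illustration of how each element-facet cell maps onto the shared engineering process and performance measures described above (the names mirror the abstract; the data structure itself is an assumption):

```python
from itertools import product

elements = ["work", "resources", "decision", "information"]
facets = ["strategy", "competency", "capacity", "structure"]
process_steps = ["specification", "analysis", "design", "implementation"]
performance = ["cost", "time", "quality", "benefit"]

# Each of the 16 element-facet combinations is engineered through the same
# process and judged against the same performance dimensions.
scheme = [
    {"element": e, "facet": f, "process": process_steps, "measures": performance}
    for e, f in product(elements, facets)
]

for cell in scheme:
    print(cell["element"], "/", cell["facet"])
```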
Abstract:
This dissertation comprises three individual chapters in an effort to examine different explanatory variables that affect firm performance. Chapter Two proposes an additional determinant of firm survival. Based on a detailed examination of firm survival in the British automobile industry between 1895 and 1970, we conclude that a firm's selection of submarket (defined by quality level) influenced survival. In contrast to findings for the US automobile industry, there is no evidence of first-mover advantage in the market as a whole. However, we do find evidence of first-mover advantage after conditioning on submarket choice. Chapter Three examines the effects of product line expansion on firm performance in terms of survival time. Based on a detailed examination of firm survival time in the British automobile industry between 1895 and 1970, we find that diversification exerts a positive effect on firm survival. Furthermore, our findings support the literature with respect to the impacts of submarket types, pre-entry experience, and timing of entry on firm survival time. Chapter Four examines corporate diversification in U.S. manufacturing and service firms. We develop measures of how related a firm's diverse activities are, using input-output data and the NAICS classification to construct indexes of "vertical relatedness" and "complementarity". Strong relationships between these two measures are found. We use profitability and excess value as the measures of firm performance. Econometric analysis reveals that there is no relationship between the degree of relatedness of diversification and firm performance for the study period.
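A hedged sketch of how a vertical-relatedness index can be built from input-output data (the exact index definition used in Chapter Four is not given in the abstract; this follows the common "input required from one industry per dollar of the other's output" construction, with invented numbers):

```python
import numpy as np

# Illustrative dollar-flow use matrix: flows[i, j] = value of industry i's
# output used as an input by industry j (not the study's actual data).
flows = np.array([[10.0,  5.0,  1.0],
                  [ 2.0, 20.0,  8.0],
                  [ 0.5,  3.0, 15.0]])
total_output = np.array([100.0, 150.0, 80.0])

# Coefficient v[i, j]: dollars of input from i needed per dollar of j's output.
v = flows / total_output[np.newaxis, :]

def vertical_relatedness(i: int, j: int) -> float:
    # Symmetric measure: average of "i supplies j" and "j supplies i".
    return 0.5 * (v[i, j] + v[j, i])

print(vertical_relatedness(0, 1))   # relatedness between industries 0 and 1
```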
Abstract:
Typically, hermetic feedthroughs for implantable devices, such as pacemakers, use an alumina ceramic insulator brazed to a platinum wire pin. This combination of materials has a long history in implantable devices and has been approved by the FDA for implantable hermetic feedthroughs. The growing demand for higher input/output (I/O) counts in hermetic feedthroughs for implantable neural stimulator applications could be addressed by developing a new cofired platinum/alumina multilayer ceramic technology in a configuration that supports 300-plus I/Os, which is not commercially available. Seven platinum powders with different particle sizes were used to develop different conductive cofire inks to control the densification mismatch between platinum and alumina. The firing profile (ramp rate, burn-out and holding times) and the firing atmosphere and concentrations (hydrogen (wet/dry), air, neutral, vacuum) were also optimized. Platinum and alumina exhibit an alloy-formation reaction in a reducing atmosphere; formation of such a compound can increase the bonding of the metal/ceramic interface, resulting in enhanced hermeticity. Feedthroughs fabricated in a reducing atmosphere demonstrated significantly superior performance to those fired in other atmospheres. A composite structure with tungsten/platinum ratios graded through the via (pure W, 50/50 W/Pt, 80/20 Pt/W, and pure Pt) exhibited the best performance in comparison with the other materials used for ink metallization. Studies on the high-temperature reaction of platinum and alumina, previously unreported, showed that, at low temperatures in a reducing atmosphere, Pt3Al or Pt8Al21 with a tetragonal structure forms. Cubic Pt3Al is formed upon heating the sample to temperatures above 1350 °C; this cubic structure is the equilibrium state of the Pt-Al alloy at high temperatures. The alumina dissolves into the platinum ink and is redeposited as a surface coating. This was observed both on cofired samples and on pure platinum thin films coated on 99.6 wt% alumina and fired at 1550 °C. Different mechanisms are proposed to describe this behavior based on the size of the platinum particles.
Abstract:
The purpose of this study is to explore the accuracy of the Input-Output model in quantifying the impacts of the 2007 economic crisis on a local tourism industry and economy. Although the model has been widely used in tourism impact analysis, its estimation accuracy is rarely verified empirically. The Metro Orlando area in Florida is investigated as an empirical case, and the negative change in visitor expenditure between 2007 and 2008 is taken as the direct shock. The total impacts are assessed in terms of output and employment and are compared with the actual data. The study finds surprisingly large discrepancies between the estimated and actual results, with the Input-Output model appearing to overestimate the negative impacts. By investigating the local economic activities during the study period, the study makes some exploratory efforts to explain these discrepancies. Theoretical and practical implications are then suggested.
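For reference, the mechanics being tested reduce to the standard open input-output impact calculation; the coefficients and shock below are placeholders, not the Metro Orlando data used in the study:

```python
import numpy as np

# Illustrative 2-sector coefficient matrix and employment intensities.
A = np.array([[0.15, 0.10],
              [0.25, 0.05]])
emp_per_output = np.array([8.0, 12.0])      # jobs per $1M of output (illustrative)

delta_demand = np.array([-120.0, -30.0])    # drop in visitor expenditure ($M)

L = np.linalg.inv(np.eye(2) - A)            # Leontief inverse
delta_output = L @ delta_demand             # total (direct + indirect) output impact
delta_jobs = emp_per_output * delta_output  # associated employment impact

print("output impact ($M):", delta_output.round(1))
print("employment impact (jobs):", delta_jobs.round(0))
```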
Abstract:
Demand-side growth accounting studies the contributions of the aggregate demand components to Gross Domestic Product (GDP) growth. Traditionally, international and national organizations use the traditional method for calculating such contributions. However, this method does not take into account the induction of imports by the various components of aggregate demand. As alternatives, other studies consider this effect: the alternative method proposed by Lara (2013); the attribution method proposed by Kranendonk and Verbruggen (2005) and Hoekstra and van der Helm (2010); and the Sraffian supermultiplier method of Freitas and Dweck (2013). A summary of these methods is given, demonstrating the similarities and differences between them. In addition, with the aim of contributing to the study of the subject, a "method of distribution of imports" was developed, which distributes imports across the various components of aggregate demand using information from the input-output matrices and the supply and use tables. The contributions to the growth of the macroeconomic aggregates for Brazil from 2001 to 2009 were calculated with the distribution method and compared with the traditional method, clarifying the reasons for the differences in the contributions. Comparisons were then made across all the methods presented in this work, between the calculated growth contributions of the aggregate demand components and of the domestic and external sectors. It was found that the methods existing in the literature are not sufficient to deal with this question, and, given the alternatives for growth contributions presented throughout this work, it is argued that the distribution method provides the best estimates of contributions by aggregate demand sector. In particular, the main advantage of this method over the others is the breakdown of the contribution of imports by aggregate demand component, which allows the contribution of each component to GDP growth to be analysed. This type of analysis thus helps in studying the growth pattern of the Brazilian economy, not only from a theoretical point of view but also empirically, and provides a basis for economic policy decisions.
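A simplified numerical sketch of the difference at stake (the figures and import shares are invented; the actual method derives them from input-output and supply and use tables): the traditional method credits each demand component with its gross change and treats imports as a single negative block, while an import-distribution approach nets out the import content attributed to each component.

```python
# Illustrative one-year change in demand components (billions, constant prices).
gdp_previous = 1000.0
delta = {"consumption": 30.0, "investment": 15.0, "government": 5.0, "exports": 10.0}
delta_imports = 20.0

# Hypothetical import content attributed to each component.
import_share = {"consumption": 0.45, "investment": 0.35, "government": 0.05, "exports": 0.15}

# Traditional method: gross contributions, imports as one undifferentiated block.
traditional = {k: v / gdp_previous for k, v in delta.items()}
traditional["imports"] = -delta_imports / gdp_previous

# Distribution method: each component's contribution is net of "its" imports.
distributed = {k: (delta[k] - import_share[k] * delta_imports) / gdp_previous for k in delta}

for k in delta:
    print(f"{k:12s} traditional {traditional[k]:+.2%}   distributed {distributed[k]:+.2%}")
```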
Abstract:
We study a small circuit of coupled nonlinear elements to investigate general features of signal transmission through networks; the small circuit itself is perceived as a building block for larger networks. Individual dynamics and coupling are motivated by neuronal systems: we consider two types of dynamical mode for an individual element, regular spiking and chattering, and each individual element can receive excitatory and/or inhibitory inputs and is subjected to different feedback types (excitatory and inhibitory; forward and recurrent). Both deterministic and stochastic simulations are carried out to study the input-output relationships of these networks. Major results for regular spiking elements include frequency locking, spike rate amplification for strong synaptic coupling, and inhibition-induced spike rate control, which can be interpreted as output frequency rectification. For chattering elements, spike rate amplification at low frequencies and silencing at high frequencies are characteristic.
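Regular spiking and chattering are the firing modes of the Izhikevich model; the abstract does not state which single-element model was actually used, so the following is only an assumed illustration of one regular-spiking element driven by a constant input, giving a crude input-output (current to firing rate) relationship:

```python
def izhikevich_spikes(I_input, T=1000.0, dt=0.5,
                      a=0.02, b=0.2, c=-65.0, d=8.0):
    """One Izhikevich element; defaults are regular-spiking parameters,
    chattering would use roughly c = -50, d = 2. Returns spike times in ms."""
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I_input)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike: record time and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

# Input-output relationship: constant input current vs. firing rate.
for I in (4.0, 8.0, 12.0):
    rate = len(izhikevich_spikes(I))  # spikes in a 1 s window = spikes/s
    print(f"I = {I:4.1f} -> {rate} spikes/s")
```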
Abstract:
The real-time optimization of large-scale systems is a difficult problem due to the need for complex models involving uncertain parameters and the high computational cost of solving such problems by a decentralized approach. Extremum-seeking control (ESC) is a model-free real-time optimization technique which can estimate unknown parameters and can optimize nonlinear time-varying systems using only a measurement of the cost function to be minimized. In this thesis, we develop a distributed version of extremum-seeking control which allows large-scale systems to be optimized without models and with minimal computing power. First, we develop a continuous-time distributed extremum-seeking controller. It has three main components: consensus, parameter estimation, and optimization. The consensus provides each local controller with an estimate of the cost to be minimized, allowing them to coordinate their actions. Using this cost estimate, parameters for a local input-output model are estimated, and the cost is minimized by following a gradient descent based on the estimate of the gradient. Next, a similar distributed extremum-seeking controller is developed in discrete time. Finally, we consider an interesting application of distributed ESC: formation control of high-altitude balloons for high-speed wireless internet. These balloons must be steered into a favourable formation in which they are spread out over the Earth and provide coverage to the entire planet. Distributed ESC is applied to this problem and is shown to be effective for a system of 1200 balloons subjected to realistic wind currents. The approach does not require a wind model and uses a cost function based on a Voronoi partition of the sphere. Distributed ESC is able to steer the balloons from a few initial launch sites into a formation which provides coverage of the entire Earth, and it can maintain a similar formation as the balloons move with the wind around the Earth.
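A minimal single-agent, perturbation-based extremum-seeking loop (a simplification of the thesis's distributed, consensus-based scheme; the quadratic cost and all gains below are illustrative assumptions):

```python
import math

def cost(theta):
    # Cost unknown to the controller, with its minimum at theta = 3.
    return (theta - 3.0) ** 2 + 1.0

theta_hat = 0.0         # current parameter estimate
a, omega = 0.2, 10.0    # dither amplitude and frequency
k = 1.0                 # adaptation gain
dt, tau = 0.01, 2.0     # step size and slow-filter time constant
y_avg = cost(theta_hat)

for step in range(30000):
    t = step * dt
    dither = a * math.sin(omega * t)
    y = cost(theta_hat + dither)                # only a scalar cost measurement is used
    y_avg += (dt / tau) * (y - y_avg)           # slow average; (y - y_avg) is high-passed
    grad_estimate = (y - y_avg) * math.sin(omega * t)  # demodulation ~ local gradient
    theta_hat -= k * grad_estimate * dt         # gradient descent on the estimate

print(f"estimated optimum: {theta_hat:.2f} (true optimum: 3.0)")
```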
Abstract:
An assessment of the sustainability of the Irish economy has been carried out using three methodologies, enabling comparison and evaluation of the advantages and disadvantages of each, and potential synergies among them. The three measures chosen were economy-wide Material Flow Analysis (MFA), environmentally extended input-output (EE-IO) analysis and the Ecological Footprint (EF). The research aims to assess the sustainability of the Irish economy using these methods and to draw conclusions on their effectiveness in policy making both individually and in combination. A theoretical description discusses the methods and their respective advantages and disadvantages and sets out a rationale for their combined application. The application of the methods in combination has provided insights into measuring the sustainability of a national economy and generated new knowledge on the collective application of these methods. The limitations of the research are acknowledged and opportunities to address these and build on and extend the research are identified. Building on previous research, it is concluded that a complete picture of sustainability cannot be provided by a single method and/or indicator.
Abstract:
In this work we explore optimising parameters of a physical circuit model relative to input/output measurements, using the Dallas Rangemaster Treble Booster as a case study. A hybrid metaheuristic/gradient descent algorithm is implemented, where the initial parameter sets for the optimisation are informed by nominal values from schematics and datasheets. Sensitivity analysis is used to screen parameters, which informs a study of the optimisation algorithm against model complexity by fixing parameters. The results of the optimisation show a significant increase in the accuracy of model behaviour, but also highlight several key issues regarding the recovery of parameters.
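A hedged sketch of the underlying parameter-recovery idea, using a generic first-order RC model and synthetic data rather than the Rangemaster circuit model, the hybrid metaheuristic, or the measurements from the paper:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "measurement": noisy step response of a first-order RC low-pass,
# standing in for real input/output measurements of the device.
t = np.linspace(0.0, 5e-3, 200)
R_true, C_true = 4.7e3, 100e-9
y_measured = 1.0 - np.exp(-t / (R_true * C_true)) + np.random.normal(0, 0.005, t.size)

def residuals(params):
    R, C = params
    y_model = 1.0 - np.exp(-t / (R * C))
    return y_model - y_measured

# Nominal schematic/datasheet values seed the optimisation, as in the paper.
nominal = np.array([5.6e3, 82e-9])
fit = least_squares(residuals, nominal, bounds=([1e2, 1e-9], [1e5, 1e-6]))

# Note: only the product R*C is identifiable from this response, so individual
# values may not be recovered uniquely, echoing the parameter-recovery issues
# highlighted in the paper.
print("recovered R, C:", fit.x)
```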