955 results for System-Level Models


Relevance: 80.00%

Abstract:

The behaviour of control functions in safety-critical software systems is typically bounded to prevent the occurrence of known system-level hazards. These bounds are usually derived through safety analyses and can be implemented through the use of the necessary design features. However, the unpredictability of real-world problems can result in changes in the operating context that may invalidate the behavioural bounds themselves, for example unexpected hazardous operating contexts arising from failures or degradation. For highly complex problems it may be infeasible to determine, prior to deployment, the precise desired behavioural bounds of a function that addresses or minimises the risk of hazardous operation. This paper presents an overview of the safety challenges associated with such problems and how they might be addressed. A self-management framework is proposed that performs on-line risk management. The features of the framework are shown in the context of employing intelligent adaptive controllers operating within complex and highly dynamic problem domains such as gas-turbine aero-engine control. The safety assurance arguments enabled by the framework, which are necessary for certification, are also outlined.
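As a purely illustrative sketch (not the framework proposed in the paper), the idea of bounding a control function and adjusting the permitted envelope from an on-line risk estimate can be expressed as follows; the variable names, limits, and linear tightening rule are hypothetical.

```python
# Hypothetical sketch of run-time behavioural bounding driven by an on-line risk estimate.
from dataclasses import dataclass

@dataclass
class BehaviouralBounds:
    lower: float   # minimum permitted actuator demand
    upper: float   # maximum permitted actuator demand

def adapt_bounds(nominal: BehaviouralBounds, risk_level: float) -> BehaviouralBounds:
    """Tighten the permitted envelope as the estimated operating risk rises.

    risk_level is assumed to be a normalised on-line estimate in [0, 1];
    the linear tightening rule is illustrative only.
    """
    margin = 0.5 * risk_level * (nominal.upper - nominal.lower)
    return BehaviouralBounds(nominal.lower + margin / 2, nominal.upper - margin / 2)

def bounded_demand(raw_demand: float, bounds: BehaviouralBounds) -> float:
    """Clamp the controller output so it never leaves the current safe envelope."""
    return min(max(raw_demand, bounds.lower), bounds.upper)

# Example: fuel-flow demand from an adaptive controller, clamped under raised risk.
nominal = BehaviouralBounds(lower=0.0, upper=100.0)
current = adapt_bounds(nominal, risk_level=0.4)
print(bounded_demand(97.0, current))   # clipped to the tightened upper bound (90.0)
```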

Relevance: 80.00%

Abstract:

The connectivity of the Internet at the Autonomous System level is influenced by the network operator policies implemented. These in turn impose a direction on the announcement of address advertisements and, consequently, on the paths that can be used to reach those destinations. We propose the use of directed graphs to properly represent how destinations propagate through the Internet, and the number of arc-disjoint paths to quantify the network's path diversity. Moreover, in order to understand the effects that policies have on the connectivity of the Internet, numerical analyses of the resulting directed graphs were conducted. The results demonstrate that, even after policies have been applied, there is still path diversity which the Border Gateway Protocol cannot currently exploit.
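By Menger's theorem, the number of arc-disjoint paths between two nodes of a directed graph equals the maximum flow when every arc has unit capacity. A minimal sketch of that calculation, using networkx on a small hypothetical AS-level digraph (not the authors' dataset):

```python
# Count arc-disjoint paths between two ASes in a policy-directed graph.
import networkx as nx

# Hypothetical AS-level digraph: arcs point in the direction in which an
# address advertisement propagates.
G = nx.DiGraph()
G.add_edges_from([
    ("AS1", "AS2"), ("AS1", "AS3"),
    ("AS2", "AS4"), ("AS3", "AS4"),
    ("AS2", "AS3"), ("AS4", "AS5"),
])

def arc_disjoint_paths(graph, src, dst):
    """Max-flow with unit arc capacities == number of arc-disjoint src->dst paths."""
    unit = nx.DiGraph()
    unit.add_edges_from(graph.edges, capacity=1)
    return nx.maximum_flow_value(unit, src, dst)

print(arc_disjoint_paths(G, "AS1", "AS4"))  # 2 arc-disjoint announcement paths
```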

Relevance: 80.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
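As a rough, hypothetical illustration of the general idea of data-driven, component-based assembly (not the Fluid project's actual format or API), a self-describing XML document can both declare components and drive their composition at runtime; the element names, component classes, and registry below are invented for the example.

```python
# Minimal sketch: instantiate and wire components from a self-describing XML document.
import xml.etree.ElementTree as ET

COMPONENT_XML = """
<application name="demo">
  <component id="source" type="Counter" start="0" step="2"/>
  <component id="sink" type="Printer" prefix="value:"/>
  <connection from="source" to="sink"/>
</application>
"""

class Counter:
    def __init__(self, start="0", step="1"):
        self.value, self.step = int(start), int(step)
    def tick(self):
        self.value += self.step
        return self.value

class Printer:
    def __init__(self, prefix=""):
        self.prefix = prefix
    def consume(self, value):
        print(self.prefix, value)

REGISTRY = {"Counter": Counter, "Printer": Printer}

def build(xml_text):
    """Turn the XML description into live component instances plus their wiring."""
    root = ET.fromstring(xml_text)
    comps = {}
    for node in root.findall("component"):
        attrs = dict(node.attrib)
        cls = REGISTRY[attrs.pop("type")]
        comps[attrs.pop("id")] = cls(**attrs)
    links = [(c.get("from"), c.get("to")) for c in root.findall("connection")]
    return comps, links

comps, links = build(COMPONENT_XML)
for src, dst in links:                 # the data drives the runtime behaviour
    comps[dst].consume(comps[src].tick())
```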

Relevance: 80.00%

Abstract:

This project has been undertaken for Hamworthy Hydraulics Limited. Its objective was to design and develop a controller package for a variable displacement, hydraulic pump for use mainly on mobile earth moving machinery. A survey was undertaken of control options used in practice and from this a design specification was formulated, the successful implementation of which would give Hamworthy an advantage over its competitors. Two different modes for the controller were envisaged. One consisted of using conventional hydro-mechanics and the other was based upon a microprocessor. To meet short term customer prototype requirements the first section of work was the realisation of the hydro-mechanical system. Mathematical models were made to evaluate controller stability and hence aid their design. The final package met the requirements of the specification and a single version could operate all sizes of variable displacement pumps in the Hamworthy range. The choice of controller options and combinations totalled twenty-four. The hydro-mechanical controller was complex and it was realised that a micro-processor system would allow all options to be implemented with just one design of hardware, thus greatly simplifying production. The final section of this project was to determine whether such a design was feasible. This entailed finding cheap, reliable transducers, using mathematical models to predict electro-hydraulic interface stability, testing such interfaces and finally incorporating a micro-processor in an interactive control loop. The study revealed that such a system was technically possible but it would cost 60% more than its hydro-mechanical counterpart. It was therefore concluded that, in the short term, for the markets considered, the hydro-mechanical design was the better solution. Regarding the micro-processor system the final conclusion was that, because the relative costs of the two systems are decreasing, the electro-hydraulic controller will gradually become more attractive and therefore Hamworthy should continue with its development.
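For illustration only, the kind of interactive control loop a microprocessor-based pump controller might run can be sketched as a discrete PI regulator trimming swash-plate displacement against a pressure setpoint; the gains, limits and first-order plant model below are invented and are not taken from the project.

```python
# Hypothetical discrete PI loop: regulate system pressure by trimming pump displacement.
def simulate(setpoint_bar=200.0, kp=0.004, ki=0.02, dt=0.01, steps=300):
    displacement = 0.5        # normalised swash-plate position, 0..1
    pressure = 150.0          # bar
    integral = 0.0
    for _ in range(steps):
        error = setpoint_bar - pressure
        integral += error * dt
        displacement = min(1.0, max(0.0, 0.5 + kp * error + ki * integral))
        # crude first-order plant: pressure follows displacement with a lag
        pressure += dt * (400.0 * displacement - pressure) * 2.0
    return pressure, displacement

print(simulate())   # settles near the 200 bar setpoint
```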

Relevance: 80.00%

Abstract:

Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm for segmenting non-textured regions; and the Granlund method for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate comparisons are drawn between different implementations. The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
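The Row-Column decomposition mentioned above computes a 2-D Fourier transform as a set of independent 1-D transforms, which is what makes it convenient to map onto an array of processors. A small numpy sketch of the decomposition itself (sequential Python, not the Occam/Transputer implementation):

```python
# Row-Column 2-D FFT: 1-D FFTs along the rows, then along the columns of the result.
import numpy as np

def fft2_row_column(image):
    rows_done = np.fft.fft(image, axis=1)      # independent 1-D FFT per row
    return np.fft.fft(rows_done, axis=0)       # independent 1-D FFT per column

rng = np.random.default_rng(0)
img = rng.random((8, 8))
assert np.allclose(fft2_row_column(img), np.fft.fft2(img))  # matches the library 2-D FFT
```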

Relevance: 80.00%

Abstract:

Congestion control is critical for the provisioning of quality of service (QoS) over dedicated short range communications (DSRC) vehicle networks for road safety applications. In this paper we propose a congestion control method for DSRC vehicle networks at road intersections, with the aim of providing high-availability, low-latency channels for high-priority emergency safety applications while maximizing channel utilization for low-priority routine safety applications. In this method an offline, simulation-based approach is used to find the best possible configurations of message rate and MAC layer backoff exponent (BE) for a given number of vehicles equipped with DSRC radios. The identified best configurations are then used online by a roadside access point (AP) during system operation. Simulation results demonstrate that this adaptive method significantly outperforms a fixed control method under a varying number of vehicles. The impact of errors in estimating the number of vehicles in the network on system-level performance is also investigated.
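As a hedged sketch of the offline-then-online structure described above (the table values are invented, not the paper's simulation results), the roadside AP can simply look up the pre-computed best (message rate, backoff exponent) pair for the nearest simulated vehicle count:

```python
# Hypothetical lookup table produced by offline simulation:
# vehicle count -> (message rate in Hz, MAC backoff exponent)
BEST_CONFIG = {
    10: (10.0, 3),
    20: (8.0, 4),
    40: (5.0, 5),
    80: (2.0, 6),
}

def select_config(estimated_vehicles: int):
    """Online step at the roadside AP: pick the configuration simulated for
    the vehicle count closest to the current estimate."""
    nearest = min(BEST_CONFIG, key=lambda n: abs(n - estimated_vehicles))
    return BEST_CONFIG[nearest]

rate_hz, backoff_exponent = select_config(33)
print(rate_hz, backoff_exponent)   # -> 5.0 5 (closest simulated point is 40 vehicles)
```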

Relevance: 80.00%

Abstract:

This thesis examined solar thermal collectors for use in alternative hybrid solar-biomass power plant applications in Gujarat, India. Following a preliminary review, the cost-effective selection and design of the solar thermal field were identified as critical factors underlying the success of hybrid plants. Consequently, the existing solar thermal technologies were reviewed and ranked for use in India by means of a multi-criteria decision-making method, the Analytical Hierarchy Process (AHP). Informed by the outcome of the AHP, the thesis went on to pursue the Linear Fresnel Reflector (LFR), the design of which was optimised with the help of ray-tracing. To further enhance collector performance, LFR concepts incorporating novel mirror spacing and drive mechanisms were evaluated. Subsequently, a new variant, termed the Elevation Linear Fresnel Reflector (ELFR) was designed, constructed and tested at Aston University, UK, therefore allowing theoretical models for the performance of a solar thermal field to be verified. Based on the resulting characteristics of the LFR, and data gathered for the other hybrid system components, models of hybrid LFR- and ELFR-biomass power plants were developed and analysed in TRNSYS®. The techno-economic and environmental consequences of varying the size of the solar field in relation to the total plant capacity were modelled for a series of case studies to evaluate different applications: tri-generation (electricity, ice and heat), electricity-only generation, and process heat. The case studies also encompassed varying site locations, capacities, operational conditions and financial situations. In the case of a hybrid tri-generation plant in Gujarat, it was recommended to use an LFR solar thermal field of 14,000 m2 aperture with a 3 tonne biomass boiler, generating 815 MWh per annum of electricity for nearby villages and 12,450 tonnes of ice per annum for local fisheries and food industries. However, at the expense of a 0.3 ¢/kWh increase in levelised energy costs, the ELFR increased saving of biomass (100 t/a) and land (9 ha/a). For solar thermal applications in areas with high land cost, the ELFR reduced levelised energy costs. It was determined that off-grid hybrid plants for tri-generation were the most feasible application in India. Whereas biomass-only plants were found to be more economically viable, it was concluded that hybrid systems will soon become cost competitive and can considerably improve current energy security and biomass supply chain issues in India.
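The AHP step can be illustrated with a small numeric sketch: a pairwise comparison matrix over hypothetical criteria is reduced to priority weights via its principal eigenvector, and a consistency ratio checks the judgements. The matrix values below are invented for illustration and are not the thesis's actual comparisons.

```python
# Minimal AHP sketch: priority weights and consistency ratio from a pairwise matrix.
import numpy as np

# Hypothetical pairwise comparisons of three criteria, e.g. cost, efficiency, maturity
# (Saaty scale: A[i, j] = importance of criterion i relative to criterion j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)           # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}               # Saaty's random indices
cr = ci / RI[n]                                # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```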

Relevance: 80.00%

Abstract:

The importance of interorganizational networks in supporting or hindering the achievement of organizational objectives is now widely acknowledged. Network research is directed at understanding network processes and structures, and their impact upon performance. A key process is learning. The concepts of individual, group and organizational learning are long established. This article argues that learning might also usefully be regarded as occurring at a fourth system level, the interorganizational network. The concept of network learning - learning by a group of organizations as a group - is presented, and differentiated from other types of learning, notably interorganizational learning (learning in interorganizational contexts). Four cases of network learning are identified and analysed to provide insights into network learning processes and outcomes. It is proposed that 'network learning episode' offers a suitable unit of analysis for the empirical research needed to develop our understanding of this potentially important concept.

Relevance: 80.00%

Abstract:

This paper presents an assessment of the technical and economic performance of thermal processes to generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of a wood chip feedstock at a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual fuel diesel engines. Net generating capacities of 1–20 MWe are evaluated. The techno-economic assessment is achieved through the development of a suite of models that are combined to give cost and performance data for the integrated system. The models include feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems) and diesel engine or turbine power generation. The models calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that all of the results are directly comparable. The electricity production costs have been calculated for 10th plant systems, indicating the costs that are achievable in the medium term after the high initial costs associated with novel technologies have reduced. The costs converge at the larger scale with the mean electricity price paid in the EU by a large consumer, and there is therefore potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or for on-site generation. However, competition will be fierce at all capacities since electricity production costs vary only slightly between the four biomass to electricity systems that are evaluated. Systems de-coupling is one way that the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies. Evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak load or seasonal power requirements would also benefit from de-coupling since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand. Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour, and low reliability. As such the more established combustion and steam cycle produces lower cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option but it also suffers from relatively low system efficiency particularly at high capacities. This low efficiency is the result of a low conversion efficiency of feed energy into the pyrolysis liquid, because of the energy in the char by-product. A sensitivity analysis has highlighted the high impact on electricity production costs of the fast pyrolysis liquids yield. 
The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption during feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis, which can tolerate a chipped feedstock. This has yet to be demonstrated at commercial scale. In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass to electricity system at small scale. This future viability can only be achieved through the construction of early plant that could, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market place and specific features of fast pyrolysis. These include:

• countries or regions with fiscal incentives for renewable energy such as premium electricity prices or capital grants;
• locations with high electricity prices so that electricity can be sold direct to large consumers or generated on-site by companies who wish to reduce their consumption from the grid;
• waste disposal opportunities where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
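A back-of-envelope form of the electricity production cost used in such techno-economic models (not the exact cost model of this paper) annualises capital with a capital recovery factor and adds operating and feedstock costs per unit of electricity generated; all numbers below are hypothetical.

```python
# Hypothetical levelised electricity production cost for a small bio-electricity plant.
def production_cost(capital, rate, life_years, om_per_year, feed_per_year, kwh_per_year):
    """Levelised cost = (annualised capital + annual O&M + annual feed) / annual kWh."""
    crf = rate * (1 + rate) ** life_years / ((1 + rate) ** life_years - 1)
    return (capital * crf + om_per_year + feed_per_year) / kwh_per_year

# e.g. a 1 MWe unit at 60% load factor -> ~5.26 GWh/yr (invented inputs throughout)
cost = production_cost(capital=3_000_000, rate=0.10, life_years=20,
                       om_per_year=250_000, feed_per_year=200_000,
                       kwh_per_year=1_000 * 8_760 * 0.60)
print(f"{cost:.3f} per kWh")
```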

Relevance: 80.00%

Abstract:

The basic concepts for constructing many-valued intelligent systems, adequate to the principal problems of human activity, are developed. The hybrid tools used with these many-valued intelligent systems are two-valued, but they simulate neural processes of spatial summation that differ in their level of action, in the inertial and threshold properties of neuron membranes, and in the frequency modulation of the transmitted message sequences. All of the enumerated properties and functions are, in essence, not only discrete in time but also many-valued.

Relevance: 80.00%

Abstract:

Performance analysis has become a vital part of the management practices in the banking industry. There are numerous applications using DEA models to estimate efficiency in banking, and most of them assume that inputs and outputs are known with absolute precision. Here, we propose new Fuzzy-DEA α-level models to assess underlying uncertainty. Further, bootstrap truncated regressions with fixed factors are used to measure the impact of each model on the efficiency scores and to identify the most relevant contextual variables on efficiency. The proposed models have been demonstrated using an application in Mozambican banks to handle the underlying uncertainty. Findings reveal that fuzziness is predominant over randomness in interpreting the results. In addition, fuzziness can be used by decision-makers to identify missing variables to help in interpreting the results. Price of labor, price of capital, and market-share were found to be the significant factors in measuring bank efficiency. Managerial implications are addressed.
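A hedged sketch of the α-level idea (one simplified reading, not the paper's exact formulation): triangular fuzzy inputs and outputs are cut to intervals at a chosen α, and a crisp input-oriented CCR model is solved at the optimistic and pessimistic bounds to bracket each bank's efficiency. The data and dimensions below are invented.

```python
# Fuzzy-DEA alpha-level sketch: efficiency intervals from triangular fuzzy data.
import numpy as np
from scipy.optimize import linprog

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def ccr_efficiency(X, Y, o):
    """Crisp input-oriented CCR efficiency of DMU o (X: n x p inputs, Y: n x q outputs)."""
    n, p = X.shape
    q = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta; vars = [theta, lambdas]
    A_ub, b_ub = [], []
    for i in range(p):                             # sum_j lam_j x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(q):                             # sum_j lam_j y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

# Invented triangular data for three banks, one input (cost) and one output (loans).
fuzzy_X = [[(90, 100, 110)], [(140, 150, 165)], [(70, 80, 95)]]
fuzzy_Y = [[(45, 50, 55)],   [(95, 100, 110)],  [(28, 30, 33)]]
alpha = 0.5

def cut(data, side):
    return np.array([[alpha_cut(t, alpha)[side] for t in row] for row in data])

X_lo, X_hi = cut(fuzzy_X, 0), cut(fuzzy_X, 1)
Y_lo, Y_hi = cut(fuzzy_Y, 0), cut(fuzzy_Y, 1)

for o in range(len(fuzzy_X)):
    # optimistic bound: bank o at its best, peers at their worst
    Xb, Yb = X_hi.copy(), Y_lo.copy()
    Xb[o], Yb[o] = X_lo[o], Y_hi[o]
    upper = ccr_efficiency(Xb, Yb, o)
    # pessimistic bound: bank o at its worst, peers at their best
    Xw, Yw = X_lo.copy(), Y_hi.copy()
    Xw[o], Yw[o] = X_hi[o], Y_lo[o]
    lower = ccr_efficiency(Xw, Yw, o)
    print(f"bank {o}: efficiency in [{lower:.3f}, {upper:.3f}] at alpha={alpha}")
```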

Relevance: 80.00%

Abstract:

Perhaps the most sought-after buzzword in today's IT world is cloud computing, or in Hungarian translation, "számítási felhő". The translation comes from the EU's Digital Agenda (Hungarian version, 2010). A detailed description of the cloud computing business model is given in (Bőgel, 2009): György Bőgel describes the emergence and economic advantages of the new, utility-like IT service, predicting a great future for cloud computing in the competition between business models. Besides the business benefits of cloud computing, the author of the present paper places greater emphasis on the factors inhibiting its rapid adoption, and on what the advantages and disadvantages mean for a business, IT or compliance manager. Without diminishing the economic significance of the cloud model, he considers it important to also address its problems and risks. He points out that there are substantial differences in the risks, particularly the security and data protection risks, between the European Economic Area and the rest of the world, e.g. the United States. The article highlights these differences, and the reader is also given an explanation of why cloud computing is expected to spread more slowly in Europe than in other parts of the world. The EU's efforts to promote the adoption of cloud computing in Europe are also presented, given the model's effect of increasing competitiveness. / === / One of the most popular concepts in recent web searches is cloud computing. Several authors present detailed descriptions of the new service model and its business benefits, and cite the optimistic prognoses of cloud experts regarding the competition between information system service models. The author analyses the operational benefits of cloud applications and gives a detailed description of the inhibitors of the fast expansion of the service model. He also analyses the pros and cons of the cloud for a business manager, an information officer and a compliance officer. When understanding the advantages of the cloud, it is equally important to review the problems and risks associated with the model. The paper gives a list of the expected cloud-specific risks. It also explains the differences in security and data protection approach between the European Economic Area and the rest of the world, including the USA. It explains why slower expansion of the cloud model is expected in Europe than in the rest of the world. The efforts of the EU Committee in helping to spread the cloud model are also presented, as the EU's officers consider the model an important element of competitiveness.

Relevance: 80.00%

Abstract:

Water management has altered both the natural timing and volume of freshwater delivered to Everglades National Park. This is especially true for Taylor Slough and the C-111 basin, as hypersaline events in Florida Bay have been linked to reduced freshwater flow in this area. In light of recent efforts to restore historical flows to the eastern Everglades, an understanding of the impact of this hydrologic shift is needed in order to predict the trajectory of restoration. I conducted a study to assess the importance of season, water chemistry, and hydrologic conditions on the exchange of nutrients in dwarf and fringe mangrove wetlands along Taylor Slough. I also performed mangrove leaf decomposition studies to determine the contribution of biotic and abiotic processes to mass loss, the effect of salinity and season on degradation rates, and the importance of this litter component as a rapid source of nutrients. Dwarf mangrove wetlands consistently imported total nutrients (C, N, and P) and released NO2− + NO3−, with enhanced release during the dry season. Ammonium flux shifted from uptake to release over the study period. Dissolved phosphate activity was difficult to discern in either wetland, as concentrations were often below detection limits. Fluxes of dissolved inorganic nitrogen in the fringe wetland were positively related to DIN concentrations. The opposite was found for total nitrogen in the fringe wetland. A dynamic budget revealed a net annual export of TN to Florida Bay that was highest during the wet season. Simulated increases and decreases in freshwater flow yielded reduced exports of TN to Florida Bay as a result of changes in subsystem and water flux characteristics. Finally, abiotic processes yielded substantial nutrient and mass losses from senesced leaves with little influence of salinity. Dwarf mangrove leaf litter appeared to be a considerable source of nutrients to the water column of this highly oligotrophic wetland. To summarize, nutrient dynamics at the subsystem level were sensitive to short-term changes in hydrologic and seasonal conditions. These findings suggest that increased freshwater flow has the potential to lead to long-term, system-level changes that may reach as far as eastern Florida Bay.

Relevance: 80.00%

Abstract:

This dissertation delivers a framework to diagnose the Bull-Whip Effect (BWE) in supply chains and then identify methods to minimize it. Such a framework is needed because, in spite of the significant amount of literature discussing the bull-whip effect, many companies continue to experience the wide variations in demand that are indicative of it. While the theory and knowledge of the bull-whip effect are well established, there is still no engineering framework and method to systematically identify the problem, diagnose its causes, and identify remedies. The present work seeks to fill this gap by providing a holistic, systems perspective on bull-whip identification and diagnosis. The framework employs the SCOR reference model to examine the supply chain processes with a baseline measure of demand amplification. Research into the supply chain's structural and behavioral features is then conducted by means of the system dynamics modeling method. The diagnostic framework, called the Demand Amplification Protocol (DAMP), relies not only on the improvement of existing methods but also contributes original developments introduced to accomplish successful diagnosis. DAMP contributes a comprehensive methodology that captures the dynamic complexities of supply chain processes. It also contributes a BWE measurement method that is suitable for actual supply chains because of its low data requirements, and introduces a BWE scorecard for relating established causes to a central BWE metric. In addition, the dissertation makes a methodological contribution to the analysis of system dynamics models with a technique for statistical screening called SS-Opt, which determines the inputs with the greatest impact on the bull-whip effect by means of perturbation analysis and subsequent multivariate optimization. The dissertation describes the implementation of the DAMP framework in an actual case study that exposes the approach, analysis, results and conclusions. The case study suggests that a solution balancing costs and demand amplification can better serve both the firms' and the supply chain's interests. Insights point to supplier network redesign, postponement in manufacturing operations, and collaborative forecasting agreements with main distributors.
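The dissertation's specific low-data BWE metric is not reproduced here; as a point of reference, the classic variance-amplification ratio that motivates such measures can be sketched with hypothetical data.

```python
# Classic bull-whip measure: variance of orders relative to variance of demand.
import numpy as np

def bullwhip_ratio(demand, orders):
    """Values greater than 1 indicate amplification of demand variability upstream."""
    return np.var(orders, ddof=1) / np.var(demand, ddof=1)

# Hypothetical weekly series observed at one supply-chain echelon.
demand = np.array([100, 104, 98, 102, 99, 101, 103, 97])
orders = np.array([100, 112, 85, 110, 92, 105, 115, 80])
print(f"bull-whip ratio: {bullwhip_ratio(demand, orders):.2f}")   # > 1 here
```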

Relevance: 80.00%

Abstract:

Compared to phosphorus (P), nitrogen (N) has received little attention across the Everglades landscape. Despite this lack of attention, N plays important roles in many Everglades systems: it is a significant pollutant in Florida Bay and the Gulf of Mexico, the limiting nutrient in highly P-impacted areas, and an important substrate for microbial metabolism. Storage and transport of N throughout the Everglades are dominated by organic forms, including peat soils and dissolved organic N in the water column. In general, N sources are highest in the northern areas; however, atmospheric deposition and active N2 fixation by the periphyton components are a significant N source throughout most systems. Many of the processes involved in the wetland N cycle remain unmeasured for most of the Everglades systems. In particular, the lack of in situ rates for N2 fixation and denitrification prevents the construction of system-level budgets, especially for the southern mangrove systems where N export into Florida Bay is critical. There is also the potential for several novel N processes (e.g., Anammox) whose importance for nitrogen cycling and the function of the Everglades ecosystem is as yet undetermined. Phosphorus loading alters the N cycle by stimulating organic N mineralization, with a resulting flux of ammonium and DON, and, at elevated P concentrations, by increasing rates of N2 fixation and N assimilation. Restoration of hydrology has the potential to significantly impact N cycling in the Everglades, both by affecting N transport and by altering aerobic-anaerobic transitions at the soil-water interface or in areas with seasonal drawdowns (e.g., marl prairies). Based on the authors' understanding of N processes, much more research is necessary to adequately predict potential impacts from hydrologic restoration, as well as the function of Everglades systems as sinks, sources, and transformers of N in the South Florida landscape.