971 results for Compact embedding


Relevance:

10.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms are challenging and error-prone engineering tasks. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the functional correctness of the system, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides the designers with powerful support for the development of complex systems. Moreover, the top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
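
To illustrate the quantitative side, the following is a minimal SimPy sketch (not the thesis framework itself): a single component alternates between operation and repair, and its long-run availability is estimated by discrete-event simulation. The failure and repair rates, the exponential distributions and the simulation horizon are illustrative assumptions.

```python
import random
import simpy

# Illustrative, assumed parameters (not taken from the thesis).
MTTF = 100.0       # mean time to failure
MTTR = 5.0         # mean time to repair
SIM_TIME = 100_000.0

def component(env, stats):
    """Alternate between working and being repaired, accumulating uptime."""
    while True:
        up_start = env.now
        yield env.timeout(random.expovariate(1.0 / MTTF))  # run until failure
        stats["up"] += env.now - up_start
        yield env.timeout(random.expovariate(1.0 / MTTR))  # repair time

random.seed(42)
stats = {"up": 0.0}
env = simpy.Environment()
env.process(component(env, stats))
env.run(until=SIM_TIME)
print(f"Estimated availability: {stats['up'] / SIM_TIME:.3f}")
```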

Relevance:

10.00%

Publisher:

Abstract:

The aim of this work was to study the changes induced by BG in the behaviour of wheat starch and to observe the influence of these variations on the quality of a basic white bread. The effect of four BG addition levels on the wheat flour functional characteristics (WAI, WSI and pasting properties) and bread quality (physical parameters, crumb grain structure, moisture and hardness) was investigated. The highest levels of BG (1% and 2%) decreased the peak viscosity and increased the stability and setback of the flour. This was due to a lower gelatinization of the starch granules, caused by competition for water between the hydrocolloid and the starch. These changes influenced the bread quality. The loaves with 1% and 2% added BG presented smaller alveoli, which resulted in more compact, harder and less airy crumbs. Nevertheless, the moisture of the samples with 1% and 2% added gum was higher than that of the control bread. The incorporation of BG at 0.5%, however, did not affect the pasting parameters or bread quality but did increase crumb moisture, so this concentration would be the most recommendable for baking, since the higher moisture could favour the shelf-life of the product.

Relevance:

10.00%

Publisher:

Abstract:

This master's thesis was carried out for the Järvenpää site of Valmet Technologies Oy. The goal of the work was to study how the 3D design of slitter-winders can be made more efficient by making optimal use of the features of a new 3D CAD system. The work consists of a theoretical part, an interview study and a practical part. The theoretical part examines the operation and structure of the slitter-winder, the theory of 3D design and the CATIA system, and also seeks new perspectives on 3D design. The interview study maps the current design process, the areas of design in need of improvement, and the design methods in use that have been found to work well. It covers the lead designers of slitter-winders working at the Järvenpää site of Valmet Technologies Oy and their supervisors; in addition, separate interviews are used to gather experiences of using the CATIA V6 software and of switching design software. The goal of the practical part is to estimate the time required to migrate and repair the parameterized model structures of a slitter-winder, in order to determine the resources needed for these operations. In the practical part, two parameterized subassemblies of the Valmet OptiWin Drum Compact slitter-winder are migrated to the new CAD system and the necessary repair operations are performed on them. Based on the results of the study, the lack of a common modelling methodology is the most significant target for improvement in design development. Finally, development proposals and an implementation plan were created, with which the 3D design of slitter-winders can be improved and the CATIA V6 software can be deployed effectively.

Relevance:

10.00%

Publisher:

Abstract:

There is a growing trend towards decentralized electricity and heat production throughout the world. Reciprocating engines and gas turbines play an essential role in the global decentralized energy markets, and any improvement in their electrical efficiency has a significant impact from the environmental and economic viewpoints. This paper introduces an intercooled and recuperated two-shaft microturbine in the 500 kW electric output range. The microturbine is optimized for a realistic combination of the turbine inlet temperature, the recuperation rate and the pressure ratio. The new microturbine design aims to achieve significantly increased performance within the range of microturbines and even to compete with the efficiencies achieved in large industrial gas turbines. The simulated electrical efficiency is 45%. Improving the efficiency of combined heat and power (CHP) systems will significantly decrease the emissions and operating costs of decentralized heat and electricity production. Cost-effective, compact and environmentally friendly micro- and small-scale CHP turbine systems with high electrical efficiency will have an opportunity to compete successfully against reciprocating engines, which today are used in heat and power generation all over the world and are manufactured in large production series. This paper presents a small-scale gas turbine process capable of competing with reciprocating engines in terms of electrical efficiency.
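
To make the cycle trade-off concrete, the following is a minimal Python sketch of a simple recuperated Brayton-cycle efficiency estimate. Intercooling, pressure losses and mechanical losses are omitted for brevity, and every parameter value is an illustrative assumption, not a design figure from the paper.

```python
# Simple recuperated Brayton cycle, ideal-gas air (all values assumed).
cp, gamma = 1005.0, 1.4        # specific heat [J/(kg K)], heat capacity ratio
T1 = 288.0                     # compressor inlet temperature [K]
TIT = 1223.0                   # turbine inlet temperature [K]
pr = 4.0                       # cycle pressure ratio
eta_c, eta_t = 0.80, 0.85      # compressor / turbine isentropic efficiencies
eps = 0.90                     # recuperator effectiveness
eta_gen = 0.95                 # generator and power-electronics efficiency

k = (gamma - 1.0) / gamma
T2 = T1 * (1.0 + (pr**k - 1.0) / eta_c)        # compressor outlet
T5 = TIT * (1.0 - eta_t * (1.0 - pr**(-k)))    # turbine outlet
T3 = T2 + eps * (T5 - T2)                      # air preheated by the recuperator
w_net = cp * ((TIT - T5) - (T2 - T1))          # specific net work [J/kg]
q_in = cp * (TIT - T3)                         # specific heat input [J/kg]
print(f"electrical efficiency ~ {eta_gen * w_net / q_in:.3f}")
```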

Relevance:

10.00%

Publisher:

Abstract:

The use of paperboard trays in the food industry has grown year by year, and consumers have embraced paperboard packaging for its ecological qualities, graphical appearance and safety. The manufacture of trays by mechanical pressing is developing continuously, and new technology makes it possible to build entirely new types of presses. The goal of this work is to study whether mechanical pressing can be implemented cost-effectively in a horizontal orientation by exploiting new technology. An additional goal is to determine which force generation method produces the force required of the press most economically. The design takes into account existing tooling, modularity, EU food legislation and other critical factors. The work applies systematic problem solving in accordance with methodical design (VDI 2221); in addition, a competing method, value analysis, is reviewed. The design and the component selections are analysed with SWOT and scoring analyses, and the strength properties of the resulting design solutions are examined with FEM analysis. The results show that the force generation for horizontal pressing is cheapest and simplest to implement with electromechanics, so that the desired properties can be achieved and the structure kept as compact as possible. To achieve cost-effectiveness, procurement should be carefully put out to competitive tender. At the end of the work, possible targets for further development that arose during this master's thesis are presented.

Relevance:

10.00%

Publisher:

Abstract:

The traditional business models and the traditionally successful development methods that were distinctive of the industrial era do not satisfy the needs of modern IT companies. Due to the fast-moving nature of IT markets, the uncertainty of a new innovation's success and the overwhelming competition with established companies, startups need to make quick decisions and eliminate wasted resources more effectively than ever before. There is a need for an empirical basis on which to build business models, as well as to evaluate the assumptions regarding value and profit. Less than ten years ago, the Lean software development principles and practices became widely known in academic circles. Those practices help startup entrepreneurs to validate their learning, test their assumptions and become increasingly dynamic and flexible. What is special about today's software startups is that they are increasingly individual. Quantitative research studies are available on the details of Lean startups. Broad research with hundreds of companies presented in a few charts is informative, but a detailed study of fewer examples gives an insight into the way software entrepreneurs see the Lean startup philosophy and how they describe it in their own words. This thesis focuses on the early phases of Lean software startups, namely Customer Discovery (discovering a valuable solution to a real problem) and Customer Validation (being in a good market with a product which satisfies that market). The thesis first offers a sufficiently compact introduction to the Lean software startup concept for a reader who is not previously familiar with the term. The Lean startup philosophy is then put to a real-life test, based on interviews with four Finnish Lean software startup entrepreneurs. The interviews reveal 1) whether the Lean startup philosophy is actually valuable for them, 2) how the theory can be practically implemented in real life and 3) whether theoretical Lean startup knowledge compensates for a lack of entrepreneurship experience. The reader becomes familiar with the key elements and tools of Lean startups, as well as their mutual connections. The thesis explains why Lean startups waste less time and money than many other startups. The thesis, especially its research sections, aims at providing data and analysis simultaneously.


Relevance:

10.00%

Publisher:

Abstract:

Two tills are readily identifiable in central Southern Ontario: a very stony, loose deposit of variable matrix (Dummer till) and a moderately stony, fissile and compact deposit that is more homogeneous (drumlinized till). The quantities of Precambrian, Paleozoic and Shadow Lake Formation (Paleozoic) rock types were determined and corresponding isopleth maps drawn. The changes in lithology content occurred in the direction of transport; therefore, compositional isopleths of till may be considered equipotential lines for the reconstruction of glacier flow paths. Areal gradations of drift lithology indicated that the prime agents of dispersal were ice and glacial meltwaters. The down-ice abundance trend of till components indicated a dispersal pattern in which the concentration of a given lithology type peaks within a few kilometres of the source, followed by a rapid decline and thereafter a more gradual decrease with increasing distance. Within the esker deposits, igneous rocks may form the major component and can extend further onto the limestone plain than in the adjacent till. Evidence is presented indicating that the "style" of dispersal was one in which glacial ice may have been strongly influenced by local bedrock topography and the regional structural trends. The ice tended to follow pre-existing valleys and lows, depositing till composed mainly of local bedrock. Gradations in Paleozoic clast content showed that the local bedrock lithology became the primary till component within 3 km of down-ice transport. Evidence is presented indicating that the last glaciation may have occurred as a relatively thin ice mass, followed by stagnation and recession. No evidence of a late-glacial re-advance was found within the study area. Because of the lack of a contact between the Dummer and drumlinized tills, and because of results showing gradation of the Dummer till into the drumlinized till (as indicated by lithology content and grain size), it is suggested that no re-advance occurred.

Relevance:

10.00%

Publisher:

Abstract:

The (n, k)-arrangement interconnection topology was first introduced in 1992. The (n, k)-arrangement graph is a class of generalized star graphs. Compared with the well-known n-star, the (n, k)-arrangement graph is more flexible in degree and diameter. However, few algorithms have been designed for the (n, k)-arrangement graph to date. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-arrangement graph and developing parallel algorithms that run on this network. The topological properties of the arrangement graph are studied first, including its cyclic properties. We then study the communication problems of broadcasting and routing, followed by embedding problems. These results are very useful for developing efficient algorithms on this network. We then study the (n, k)-arrangement network from the algorithmic point of view. Specifically, we investigate both fundamental and application algorithms, such as prefix sums computation, sorting, merging and a basic geometric computation: finding the convex hull on the (n, k)-arrangement graph. A literature review of the state of the art in relation to the (n, k)-arrangement network is also provided, as well as some open problems in this area.
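
For readers unfamiliar with the topology, the following is a minimal Python sketch that builds the adjacency structure of the arrangement graph under its usual definition: vertices are the ordered k-permutations of {1, ..., n}, and two vertices are adjacent iff they differ in exactly one position. The parameter values are illustrative.

```python
from itertools import permutations

def arrangement_graph(n, k):
    """Adjacency lists of A(n, k): k-permutations differing in one position."""
    vertices = list(permutations(range(1, n + 1), k))
    adj = {v: [] for v in vertices}
    for v in vertices:
        for i in range(k):                   # change one position at a time
            for x in range(1, n + 1):
                if x not in v:               # new symbol keeps entries distinct
                    adj[v].append(v[:i] + (x,) + v[i + 1:])
    return adj

g = arrangement_graph(4, 2)                  # 4!/2! = 12 vertices
print(len(g), len(g[(1, 2)]))                # 12 vertices, degree k(n - k) = 4
```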

Relevance:

10.00%

Publisher:

Abstract:

The hyper-star interconnection network was proposed in 2002 to overcome the drawbacks of the hypercube and its variations concerning the network cost, which is defined as the product of the degree and the diameter. Some properties of the graph, such as connectivity, symmetry and embedding properties, have been studied by other researchers, and routing and broadcasting algorithms have also been designed. This thesis studies the hyper-star graph from both the topological and the algorithmic point of view. For the topological properties, we try to establish relationships between hyper-star graphs and other known graphs. We also give a formal equation for the surface area of the graph. Another topological property we are interested in is the Hamiltonicity of this graph. For the algorithms, we design an all-port broadcasting algorithm and a single-port neighbourhood broadcasting algorithm for the regular form of the hyper-star graphs; both algorithms are time-optimal. Furthermore, we prove that the folded hyper-star, a variation of the hyper-star, is maximally fault-tolerant.
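
The following is a minimal Python sketch, assuming the common definition of the hyper-star graph HS(m, k): vertices are binary strings of length m with exactly k ones, and an edge joins two strings when one is obtained from the other by swapping the first bit with a later bit of opposite value. The regular form referred to above is HS(2n, n); the parameter values are illustrative.

```python
from itertools import combinations

def hyper_star(m, k):
    """Adjacency lists of HS(m, k) under the swap-with-first-bit definition."""
    vertices = []
    for ones in combinations(range(m), k):   # all weight-k binary strings
        s = ['0'] * m
        for i in ones:
            s[i] = '1'
        vertices.append(''.join(s))
    adj = {v: [] for v in vertices}
    for v in vertices:
        for i in range(1, m):
            if v[i] != v[0]:                 # swap first bit with a differing bit
                adj[v].append(v[i] + v[1:i] + v[0] + v[i + 1:])
    return adj

g = hyper_star(6, 3)                         # regular case HS(2n, n) with n = 3
print(len(g), len(g['101010']))              # C(6, 3) = 20 vertices, degree n = 3
```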

Relevance:

10.00%

Publisher:

Abstract:

Spatial data representation and compression have become a central issue in computer graphics and image processing applications. Quadtrees, as hierarchical data structures based on the principle of recursive decomposition of space, offer a compact and efficient representation of an image. For a given image, the choice of the quadtree root node plays an important role in its quadtree representation and in the final data compression. The goal of this thesis is to present a heuristic algorithm for finding the root node of a region quadtree that is able to reduce the number of leaf nodes compared with the standard quadtree decomposition. The empirical results indicate that the proposed algorithm improves both the quadtree representation and the data compression in comparison with the traditional method.
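
The following is a minimal Python sketch of the standard region quadtree decomposition that the thesis builds on (the proposed root-selection heuristic itself is not reproduced): a binary image embedded in a power-of-two root block is split recursively until every block is uniform, and the leaf count is reported for two hypothetical root placements to show why the choice of root matters. Image size, offsets and random content are illustrative assumptions.

```python
import numpy as np

def count_leaves(img):
    """Number of leaf nodes in the region quadtree of a square binary block."""
    if img.min() == img.max():               # uniform block -> single leaf
        return 1
    h = img.shape[0] // 2                    # split into four quadrants
    return (count_leaves(img[:h, :h]) + count_leaves(img[:h, h:]) +
            count_leaves(img[h:, :h]) + count_leaves(img[h:, h:]))

rng = np.random.default_rng(0)
image = (rng.random((13, 13)) > 0.5).astype(np.uint8)   # arbitrary 13x13 image

# Embed the image into a 16x16 root block at two different offsets; how the
# blocks align with the image changes the number of leaves, which is the
# effect a root-selection heuristic exploits.
for offset in [(0, 0), (2, 1)]:
    root = np.zeros((16, 16), dtype=np.uint8)
    r, c = offset
    root[r:r + 13, c:c + 13] = image
    print(offset, count_leaves(root))
```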