11 results for Performance (engineering)

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

Software performance is a holistic matter that every phase of the software life cycle affects. Performance problems often lead to project delays and cost overruns, and in some cases to complete project failure. Software performance engineering (SPE) is a software-oriented approach that offers techniques for developing performant software. This Master's thesis examines these techniques and selects from among them those suitable for solving performance problems in the development of two IT device management products. The end result of the work is an updated version of the current product development process that takes application performance challenges into account at the different stages of the products' life cycle.

Relevance: 30.00%

Abstract:

The objective of the thesis is to define the roles and functions of project controlling and risk management in German machinery and plant engineering companies. This is a qualitative study that relies heavily on descriptive methods. The material for the empirical part of the study was collected with a questionnaire. The questionnaire results were processed with Microsoft Office Access and analysed with Microsoft Office Excel using the Pivot Table tool. The results show that there are shortcomings in the appropriate use, and the frequency of use, of project controlling and risk management methods in German machinery and plant engineering companies. By intensifying and focusing more on project controlling and risk management methods and processes, the performance of both projects and companies would improve.

Relevance: 30.00%

Abstract:

The purpose of this work was to study how organizational capabilities can be measured in the engineering and consulting industry using a so-called capability audit method. The main motives for measuring intangible assets were identified on the basis of a literature review. The advantages and disadvantages of different methods were examined so that the challenges and requirements of conducting a capability audit could be identified. Building the capability audit required recognizing the special characteristics of the industry, which were found to be knowledge intensity and project orientation. The audit implementation process consisted of four parts, to the first three of which the case company made a significant contribution. After determining the critical success factors, the organizational capabilities affecting them could be identified and the assessment carried out. The assessments were collected from internal and external evaluators, and they formed the basis for an analysis that identified the company's development needs. The benefits of the capability audit included increased knowledge of the company's strengths and weaknesses, as well as the possibility to regularly monitor its overall performance and improve it.

Relevance: 30.00%

Abstract:

The objective of the thesis was to develop a model for monitoring competitors' financial performance for management reporting. The research consisted of selecting the comparison group and the performance meters, as well as the actual creation of the model. A brief analysis of the current situation was also made. The aim was to improve the quality of financial reporting in the case organization by adding observation of the external business environment to the management reports. The comparison group for the case company was selected to include five companies that were all involved in power equipment engineering and project-type business. The most limiting factor in the comparison group selection was the availability of quarterly financial reporting. The most suitable performance meters were defined to be the development of revenue, order backlog and EBITDA. These meters should be monitored systematically on a quarterly basis and reported to company management in a brief and informative way. The monitoring model was built as a spreadsheet, with usability, flexibility and simplicity as its key characteristics. The model acts as a centralized store for competitor financial information as well as a reporting tool. The current market situation is strongly affected by the economic boom of recent years, and future challenges can be clearly seen in declining order backlogs. The case company has succeeded well relative to its comparison group during the observation period, since its business volume and profitability have developed most favourably.
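The quarterly monitoring idea described above can be sketched in a few lines of code. The sketch below computes quarter-over-quarter development (percentage change) of the chosen meters for each comparison company; all company names and figures are invented for illustration, and the thesis's actual spreadsheet model is not reproduced here.

```python
# Hypothetical sketch of the quarterly competitor-monitoring idea:
# for each company, compute quarter-over-quarter development (% change)
# of the selected performance meters. All names and figures are invented.

def qoq_development(series):
    """Percent change between consecutive quarterly figures."""
    return [round(100.0 * (b - a) / a, 1) for a, b in zip(series, series[1:])]

# quarterly figures (MEUR), purely illustrative
competitors = {
    "Company A": {"revenue": [120, 130, 125, 140], "ebitda": [12, 14, 11, 15]},
    "Company B": {"revenue": [200, 195, 210, 220], "ebitda": [18, 17, 20, 22]},
}

for name, meters in competitors.items():
    for meter, values in meters.items():
        print(f"{name} {meter} development: {qoq_development(values)} %")
```

A declining order backlog would show up in such a report as a run of negative development figures, which matches the kind of early-warning use the thesis describes.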

Relevance: 30.00%

Abstract:

Web application performance testing is an emerging and important field of software engineering. As web applications become more commonplace and complex, the need for performance testing will only increase. This paper discusses common concepts, practices and tools that lie at the heart of web application performance testing. A pragmatic, hands-on approach is assumed where applicable; real-life examples of test tooling, execution and analysis are presented right next to the underpinning theory. At the client-side, web application performance is primarily driven by the amount of data transmitted over the wire. At the server-side, selection of programming language and platform, implementation complexity and configuration are the primary contributors to web application performance. Web application performance testing is an activity that requires delicate coordination between project stakeholders, developers, system administrators and testers in order to produce reliable and useful results. Proper test definition, execution, reporting and repeatable test results are of utmost importance. Open-source performance analysis tools such as Apache JMeter, Firebug and YSlow can be used to realise effective web application performance tests. A sample case study using these tools is presented in this paper. The sample application was found to perform poorly even under the moderate load incurred by the sample tests.
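The analysis step of a performance test run can be illustrated with a small sketch. The paper's actual tooling is Apache JMeter, Firebug and YSlow; the code below is only a stdlib stand-in showing the kind of summary statistics (mean, median, 95th percentile, maximum) that a performance test report typically contains. The sample latencies are invented.

```python
import statistics

def latency_summary(samples_ms):
    """Summarize response-time samples the way a load-test report would:
    mean, median, nearest-rank 95th percentile, and maximum."""
    ordered = sorted(samples_ms)
    p95_index = max(0, int(round(0.95 * len(ordered))) - 1)
    return {
        "mean": statistics.mean(ordered),
        "median": statistics.median(ordered),
        "p95": ordered[p95_index],
        "max": ordered[-1],
    }

# invented response times in milliseconds; note the one slow outlier
samples = [120, 95, 110, 300, 105, 98, 150, 102, 97, 2500]
print(latency_summary(samples))
```

Reporting percentiles alongside the mean matters because a single slow outlier (such as the 2500 ms sample above) inflates the mean while the median stays representative of typical user experience.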

Relevance: 30.00%

Abstract:

Segmentation has traditionally been a tool of consumer marketing in particular, but the shift from products to services has increased the need for segmentation in industrial markets as well. The objective of this study is to find clearly distinguishable customer groups based on case material provided by the Finnish management consulting company Synocus Group. Using k-means clustering, three potential market segments are found, based on which offering elements 105 selected customers in the Finnish machinery and metal products industry have named as most important. The first cluster consists of price-conscious customers who calculate unit costs. The second cluster consists of service-oriented customers who calculate hourly costs and maximize machine operating hours; technical services and maintenance contracts might be worth marketing to this target group. The third cluster consists of productivity-oriented customers who are interested in performance improvement and calculate per-ton costs. They aim at lower total costs through increased performance, longer service life and lower maintenance costs.
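The k-means procedure used in the study can be sketched in pure Python. The sketch below is not the study's implementation or data; the two invented features (price sensitivity, service interest) merely illustrate how the algorithm alternates between assigning points to the nearest centroid and moving each centroid to its cluster mean.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: returns centroids and the clustered points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[i].append(p)
        # update step: move each centroid to the mean of its cluster
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# invented customer profiles: (price sensitivity, service interest)
data = [(9, 1), (8, 2), (9, 2), (2, 9), (1, 8), (2, 8), (5, 5), (6, 4), (5, 4)]
centroids, clusters = kmeans(data, k=3)
print(centroids)
```

In the study, the input vectors would instead encode which offering elements each of the 105 customers ranked as most important, and the three resulting clusters would correspond to the price-, service- and productivity-oriented segments.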

Relevance: 30.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem, that of learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows.
First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the best-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using the approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
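The link between pairwise ranking and AUC mentioned above can be made concrete: AUC equals the fraction of positive-negative pairs that the scoring function orders correctly. The sketch below computes AUC directly from this pairwise definition; it is an illustration of the bipartite ranking criterion, not the RankRLS or RankSVM training algorithms themselves, and the scores and labels are invented.

```python
def pairwise_auc(scores, labels):
    """AUC as the fraction of positive-negative pairs ordered correctly,
    counting ties as half. labels are 1 (positive) or 0 (negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0]
print(pairwise_auc(scores, labels))  # 5 of the 6 pos-neg pairs are ordered correctly
```

Leave-pair-out cross-validation, which the thesis advocates for AUC estimation, applies the same pairwise indicator to held-out data: one positive-negative pair at a time is left out of training, and the estimate is the average of the indicator over all such held-out pairs.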

Relevance: 30.00%

Abstract:

This thesis investigated the contemporary phenomenon of detail engineering outsourcing. The case organization had pursued a new outsourcing approach with a trusted partner. The goal of this empirical study was to examine the impact of the resulting partnership outsourcing arrangement. In particular, the benefit of the arrangement was evaluated based on the underlying organizational routine and the long-term economic implications of its performance outcome. The case study was needed, as the unit will likely have to rely on such distance outsourcing arrangements more and more in the future, and understanding of the impact of such operations is needed. The main findings revealed that the new outsourcing arrangement is not currently a very attractive strategic option for organizing production. The benefits that stem from the emergent, unique engineering project routine are not significant enough to make the arrangement an advantageous one, especially since increasing partnering costs are being incurred. This conclusion was drawn via the extended transaction cost view. Benchmarking was done against the old arrangement from which the new approach had departed. The case study thus enlightened the engineering unit on the impact of its strategic manoeuvre by combining the routines-theory framework with contemporary methods of governance structure evaluation. Through this, it was shown that greater efforts are needed to make the new outsourcing approach more beneficial. However, the studied arrangement was seen to hold potential for better results. The findings can be used to capitalize on this.

Relevance: 30.00%

Abstract:

Logistics infrastructure and transportation services have for decades been the responsibility of countries and governments, or have been under strict regulation policies. Among the first branches opened to competition, in the EU as well as on other continents, were air transport (both passenger and freight operators) and road transport. This has resulted in lower costs, better connectivity and, in most cases, higher service quality. However, a rather large share of other logistics-related activities is still directly (or indirectly) under governmental influence, e.g. railway infrastructure, road infrastructure, railway operations, airports and sea ports. Owing to globalization, governmental influence is no longer that necessary in this sector, since transportation needs have increased at a much more significant pace than economic growth. Freight transportation needs also do not correlate with the passenger side, because only a small number of areas in the world have specialized in the production of particular goods. Therefore, in a number of cases public-private partnerships, or even privately owned companies operating in these sub-branches, have been identified as beneficial for countries, customers and further economic growth. The objective of this research work is to shed more light on these kinds of experiments, especially in relatively unknown sub-branches of logistics such as railways, airports and sea container transport. For this research we selected companies with publicly listed status on some stock exchange and with sufficient financial scale to be considered serious companies rather than start-up-phase ventures. Our research results show that railways and airports usually need high fixed investments, but have shown generally good financial performance over the last five years, both in terms of profitability and cash flow.
Contrary to the common belief of prosperity in globally growing container transport, sea vessel operators of containers have not shown such impressive financial performance. Margins in this business are generally thin, and profitability has been sacrificed in favour of high growth; this also concerns cash flow performance, which has been lower too. However, when we examine these three logistics sub-branches from the angle of shareholder value development over the period 2002-2007, we were surprised to find that all three outperformed general stock market indexes in this period. More surprising is the result that the financially somewhat weaker sea container transportation sector shows the highest shareholder value gain in the examination period. It should be remembered, though, that the analysis provides only a limited picture, since e.g. dividends were not taken into consideration in this research work. Therefore, e.g. US railway operators are at a disadvantage in the analysis, since they have been able to provide dividends for shareholders over a long period of time. Based on this research work we argue that investment in the transportation/logistics sector seems to be a safe alternative, yielding high gains with relatively low risk. Even if the global economy were to face a period of slower growth, this sector seems to provide opportunities in more demanding situations as well.

Relevance: 30.00%

Abstract:

The production of biodiesel through transesterification has created a surplus of glycerol on the international market. In a few years, glycerol has become an inexpensive and abundant raw material, subject to numerous plausible valorisation strategies. Glycerol hydrochlorination stands out as an economically attractive route to bio-based epichlorohydrin, an important raw material for the manufacturing of epoxy resins and plasticizers. Glycerol hydrochlorination using gaseous hydrogen chloride (HCl) was studied from a reaction engineering viewpoint. Firstly, a more general and rigorous kinetic model was derived based on a consistent reaction mechanism proposed in the literature. The model was validated with experimental data reported in the literature as well as with new data of our own. Semi-batch experiments were conducted in which the influence of stirring speed, HCl partial pressure, catalyst concentration and temperature was thoroughly analysed and discussed. Acetic acid was used as a homogeneous catalyst in the experiments. For the first time, it was demonstrated that the liquid-phase volume undergoes a significant increase due to the accumulation of HCl in the liquid phase. Novel and relevant features concerning hydrochlorination kinetics, HCl solubility and mass transfer were investigated. An extended reaction mechanism was proposed and a new kinetic model was derived. The model was tested against the experimental data by means of regression analysis, in which kinetic and mass transfer parameters were successfully estimated. A dimensionless number, called the Catalyst Modulus, was proposed as a tool for corroborating the kinetic model. Reactive flash distillation experiments were conducted to check the commonly accepted hypothesis that removal of water should enhance the glycerol hydrochlorination kinetics. The performance of the reactive flash distillation experiments was compared to the semi-batch data previously obtained.
An unforeseen effect was observed once water was allowed to be stripped out of the liquid phase, exposing a strong correlation between HCl liquid uptake and the presence of water in the system. Water was also revealed to play an important role in HCl dissociation: as water was removed, the dissociation of HCl diminished, which had a retarding effect on the reaction kinetics. In order to gain further insight into the influence of water on the hydrochlorination reaction, extra semi-batch experiments were conducted in which initial amounts of water and the desired product were added. This study revealed the possibility of using the desired product as an ideal "solvent" for the glycerol hydrochlorination process. A co-current bubble column was used to investigate the glycerol hydrochlorination process under continuous operation. The influence of liquid flow rate, gas flow rate, temperature and catalyst concentration on glycerol conversion and product distribution was studied. The fluid dynamics of the system showed remarkable behaviour, which was carefully investigated and described. High-speed camera images and residence time distribution experiments were used to collect relevant information about the flow conditions inside the tube. A model based on the axial dispersion concept was proposed and compared with the experimental data. The kinetic and solubility parameters estimated from the semi-batch experiments were successfully used in describing the mass transfer and fluid dynamics of the bubble column reactor. In light of the results of the present work, the glycerol hydrochlorination reaction mechanism has finally been clarified. It has been demonstrated that reactive distillation technology may hinder the glycerol hydrochlorination reaction rate under certain conditions.
Furthermore, continuous reactor technology showed a high selectivity towards monochlorohydrins, whilst semi-batch technology was demonstrated to be more efficient for the production of dichlorohydrins. Based on the novel and revealing discoveries of the present work, many insightful suggestions are made towards improving the production of αγ-dichlorohydrin on an industrial scale.
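The kind of semi-batch kinetic modelling described above can be illustrated with a deliberately simplified sketch. The rate law below (first order in glycerol, with the dissolved HCl concentration held constant at an assumed saturation value) and all parameter values are illustrative assumptions only; they are not the extended mechanism or the estimated parameters of the thesis.

```python
def semibatch_conversion(k=0.5, c_hcl=2.0, c_g0=1.0, t_end=5.0, dt=0.001):
    """Euler integration of the illustrative balance dC_G/dt = -k * C_HCl * C_G,
    with C_HCl assumed constant (gas fed continuously, liquid saturated).
    Returns the glycerol conversion at t_end."""
    c_g = c_g0
    t = 0.0
    while t < t_end:
        c_g -= k * c_hcl * c_g * dt  # explicit Euler step
        t += dt
    return 1.0 - c_g / c_g0

# close to the analytic value 1 - exp(-k * c_hcl * t_end) = 1 - exp(-5)
print(round(semibatch_conversion(), 3))
```

In the actual model, the HCl concentration would itself be a state variable governed by gas-liquid mass transfer and the growing liquid volume noted in the abstract, which is precisely what makes the thesis's regression problem non-trivial.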

Relevance: 30.00%

Abstract:

The mixing performance of three passive milli-scale reactors with different geometries was investigated at different Reynolds numbers. The effects of design and operating characteristics such as mixing channel shape and volume flow rate were examined. The main objective of this work was to demonstrate a process design method that uses Computational Fluid Dynamics (CFD) for modelling and Additive Manufacturing (AM) technology for fabrication. The reactors were designed and simulated using SolidWorks and Fluent 15.0 software, respectively. The devices were manufactured with an EOS M-series AM system. Step response experiments with distilled Millipore water and sodium hydroxide solution provided time-dependent concentration profiles. Villermaux-Dushman reaction experiments were also conducted for additional verification of the CFD results and for evaluating the mixing efficiency of the different geometries. The time-dependent concentration data and the reaction evaluation showed that the performance of the AM-manufactured reactors matched the CFD results reasonably well. The proposed design method allows the implementation of new and innovative solutions, especially in the process design phase, for industrial-scale reactor technologies. In addition, rapid implementation is another advantage, thanks to the virtual flow design and the fast manufacturing, which use the same geometric file formats.
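The residence time distribution analysis mentioned above has a simple computational core: the mean residence time is the first moment of the measured E(t) curve divided by its zeroth moment. The sketch below applies this to an invented, uniformly sampled tracer response; it is not the study's data or processing pipeline.

```python
def mean_residence_time(times, e_curve):
    """Mean residence time from a sampled E(t) curve (uniform time spacing
    assumed, so the spacing cancels): first moment / zeroth moment."""
    num = sum(t * e for t, e in zip(times, e_curve))
    den = sum(e_curve)
    return num / den

# invented tracer response sampled every 0.5 s
times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
e_curve = [0.0, 0.1, 0.6, 1.0, 0.6, 0.2, 0.0]
print(round(mean_residence_time(times, e_curve), 3))  # -> 1.54
```

Comparing the measured moments (and the spread of the curve) against those predicted by an axial dispersion model is the standard way such RTD experiments are used to characterize flow conditions inside a reactor channel.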