273 results for Computer industry
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Market segmentation first emerged as early as the 1950s and has since been one of the fundamental concepts of marketing. Most segmentation research, however, has focused on consumer markets, while the segmentation of business and industrial markets has received less attention. The aim of this study is to create a segmentation model for industrial markets from the perspective of a provider of information technology products and services. The purpose is to determine whether the case company's current customer databases enable effective segmentation, to identify suitable segmentation criteria, and to assess whether and how the databases should be developed to allow more effective segmentation. The intention is to create a single model shared by the different business units, so the objectives of the different units must be taken into account to avoid conflicts of interest. The research methodology is a case study. Both secondary sources and primary sources, such as the case company's own databases and interviews, were used. The starting point of the study was the research problem: can database-based segmentation be used for profitable customer relationship management in the SME sector? The goal is to create a segmentation model that makes use of the data available in the databases without compromising the requirements of effective and profitable segmentation. The theoretical part examines segmentation in general, with an emphasis on industrial market segmentation; the aim is to form a clear picture of the different approaches to the topic and to deepen the understanding of the most important theories. The analysis of the databases revealed clear deficiencies in the customer data: basic contact information is available, but the information needed for segmentation is very limited. The flow of information from resellers and wholesalers should be improved in order to obtain end-customer data. Segmentation based on the current data relies mainly on secondary information such as industry and company size, and even this information is not available for all companies in the databases.
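As a rough illustration of what segmentation on such secondary data could look like in practice (a hypothetical sketch, not the case company's actual model or data), the following Python snippet groups customer records by industry and company-size class:

```python
# Hypothetical sketch of database-based segmentation on secondary data
# (industry and company size); record fields and thresholds are illustrative.
from collections import defaultdict

customers = [
    {"name": "Acme Oy", "industry": "retail", "employees": 12},
    {"name": "Beta Ab", "industry": "manufacturing", "employees": 180},
    {"name": "Gamma Ltd", "industry": "retail", "employees": None},  # missing data
]

def size_class(employees):
    """Map employee count to a size class; missing data goes to 'unknown'."""
    if employees is None:
        return "unknown"
    if employees < 50:
        return "small"
    if employees < 250:
        return "medium"
    return "large"

segments = defaultdict(list)
for c in customers:
    segments[(c["industry"], size_class(c["employees"]))].append(c["name"])

for segment, names in segments.items():
    print(segment, names)
```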
Abstract:
The horse industry is in many ways still operating the same way as it did in the beginning of the 20th century. At the same time, the role of the horse has changed dramatically, from a beast of burden to a top athlete, a production animal or a beloved pet. A racehorse or an equestrian sport horse is trained and taken care of like any other athlete, but unlike its human counterpart, it might end up on our plate. According to European and many other countries' laws, a horse is a production animal. The medical history of a horse should be known if it is to be slaughtered, to ensure that the meat is safe for human consumption. Today this vital medical information should be noted in the horse's passport, but this paper-based system is not reliable: if a horse is sold, depending on the country's laws, the medical records might not be transferred to the new owner, the passport might get lost, and so on. Thus the system is not foolproof. It is not only horse owners who have to struggle with paperwork; veterinarians as well as other officials often spend much time on redundant paperwork. The main research question of this thesis is whether information systems (IS) could be used to help the different stakeholders within the horse industry. Veterinarians in particular, who travel to stables to treat horses, cannot always take their computers with them, since the somewhat unsanitary environment is not suitable for a sensitive technological device. Currently there is no common medical database developed for horses, although such a database with a support system could help with many problems, including vaccination and disease control, food safety, and export and import issues. The main stakeholders within the horse industry, including equine veterinarians and horse owners, were studied to find out their daily routines and needs for a possible support system. The research showed that there are different aspects within the horse industry where IS could be used to support the stakeholders' daily routines. Thus a support system with web and mobile accessibility for the main stakeholders is under development. Since veterinarians will be the main users of this support system, it is very important to make sure that they find it useful and beneficial in their daily work. To ensure the desired result, the research and development of the system has been done iteratively with the stakeholders, following the Action Design Research methodology.
Abstract:
As the world becomes more technologically advanced and economies become globalized, computer science is evolving faster than ever before. With this evolution and globalization comes the need for sustainable university curricula that adequately prepare graduates for life in the industry. Additionally, behavioural or "soft" skills have become just as important as technical abilities and knowledge, the "hard" skills. The objective of this study was to investigate the current skill gap between computer science university graduates and actual industry needs, as well as the sustainability of current computer science university curricula, by conducting a systematic literature review of existing publications on the subject and a survey of recently graduated computer science students and their work supervisors. A quantitative study was carried out with respondents from six countries, mainly Finland; 31 of the responses came from recently graduated computer science professionals and 18 from their employers. The observed trends suggest that a skill gap really does exist, particularly with "soft" skills, and that many companies are forced to provide additional training to newly graduated employees if they are to be successful at their jobs.
Abstract:
The computer game industry has grown steadily for years, and in revenues it can be compared to the music and film industries. The industry has been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industrial practitioners and the scientific community. The significance of the business model concept has increased in the scientific literature recently, although the concept is still much debated. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers and organization leaders in 11 computer game companies were interviewed. The data was analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. The analysis showed that human capital is crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, as are technical skills in performing various activities. Marketing and customer relationships are likewise considered key elements of the computer game business model. Financing and partners are important especially for startups, when the organization depends on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements are emphasized.
Abstract:
The suitable timing of capacity investments is an important issue, especially in capital-intensive industries. Despite its importance, fairly few studies have been published on the topic. In the present study, models for the timing of capacity change in capital-intensive industry are developed. The study considers mainly the optimal timing of single capacity changes. The review of earlier research describes connections between the cost, capacity and timing literature, and empirical examples are used to describe the starting point of the study and to test the developed models. The study includes four models, which describe the timing question from different perspectives. The first model, which minimizes unit costs, has been built for capacity expansion and replacement situations. It is shown that the optimal timing of an investment can be presented with the capacity and cost advantage ratios. After the unit cost minimization model, the view is extended towards profit maximization. The second model states that early investments are preferable if the change in fixed costs is small compared to the change in the contribution margin. The third model is a numerical discounted cash flow model, which emphasizes the roles of start-up time, capacity utilization rate and the value of waiting as drivers of the profitable timing of a project. The last model expands the view from the project level to the company level and connects the flexibility of assets and cost structures to the timing problem. The main results of the research are the solutions of the models and the analyses and simulations carried out with them. The relevance and applicability of the results are verified by evaluating the logic of the models and through numerical cases.
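As a rough, hypothetical illustration of the discounted cash flow perspective on investment timing described above (not the thesis's actual model, and with invented figures), the following Python sketch compares the net present value of investing immediately, with a start-up ramp-up, against delaying the investment by one year:

```python
# Hypothetical DCF sketch for capacity investment timing: invest now vs. wait a year.
# All figures (investment, margin, ramp-up, discount rate) are invented for illustration.

def npv(cash_flows, rate):
    """Net present value of cash flows received at the end of years 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

investment = 10_000_000       # capacity investment, paid when the project starts
annual_margin = 1_800_000     # added contribution margin at full utilization
ramp_up_factor = 0.5          # utilization during the start-up year
horizon = 10                  # planning horizon in years
rate = 0.08                   # discount rate

# Invest now: reduced margin in year 1 (start-up), full margin in years 2..horizon.
flows_now = [annual_margin * ramp_up_factor] + [annual_margin] * (horizon - 1)
npv_now = -investment + npv(flows_now, rate)

# Wait one year: the investment and all cash flows shift one year into the future,
# so one year of margin is lost but the outlay is discounted (the value of waiting).
flows_wait = [0.0, annual_margin * ramp_up_factor] + [annual_margin] * (horizon - 2)
npv_wait = -investment / (1 + rate) + npv(flows_wait, rate)

print(f"NPV if investing now:    {npv_now:12,.0f}")
print(f"NPV if waiting one year: {npv_wait:12,.0f}")
```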
Abstract:
Nanofiltration is a membrane separation method known for its special characteristic of rejecting multivalent ions and passing monovalent ions. Thus, it is commonly applied to dilute aqueous solutions in partial salt removal, as in drinking water production. The possibilities of nanofiltration have been studied and the technique applied in a wide range of industries, e.g. the pulp and paper, textile and chemical processing industries. However, most present applications, and most of the potential applications studied, involve dilute solutions, the permeating stream being generally water containing monovalent salts. In this study, nanofiltration is investigated more as a fractionation method. A well-known application in the dairy industry is concentration and partial salt removal from whey. Concentration and partial demineralization are beneficial for further processing of whey, as whey concentrates are used e.g. in baby foods. In the experiments of this study, nanofiltration effectively reduced the monovalent salts in the whey concentrate. The main concern in this application is lactose leakage into the permeate; with the nanofiltration membranes used, the lactose retentions were practically ≥ 99%. Another dairy application studied was the purification and reuse of cleaning solutions, an environmentally driven application. An 80% COD reduction by nanofiltration was observed for an alkaline cleaning-in-place solution. Nanofiltration is not as commonly applied in the sugar and sweeteners industry as in the dairy industry. In this study one potential application was investigated, namely xylose purification from hemicellulose hydrolyzate. Xylose is the raw material for xylitol production. Xylose separation from glucose was initially studied with xylose-glucose model solutions. The ability of nanofiltration to partially separate xylose into the permeate from rather concentrated xylose-glucose solutions (10 w-% and 30 w-%) became evident. The difference in size between xylose and glucose molecules is small by any size measure, e.g. the Stokes diameter of glucose is 0.73 nm compared to 0.65 nm for xylose. In further experiments, xylose was purified into the nanofiltration permeate from a hemicellulose hydrolyzate solution. The xylose content in the total solids was increased 1.4- to 1.7-fold, depending on temperature, pressure and feed composition.
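For reference, the retention figures quoted above are conventionally the observed retention, defined from the permeate and feed concentrations (a standard textbook definition, not specific to this thesis):

$R_{\mathrm{obs}} = 1 - c_{\mathrm{permeate}} / c_{\mathrm{feed}}$

so a lactose retention of 99% means the permeate lactose concentration is about 1% of that in the feed.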
Abstract:
Condition monitoring of electric motors has been researched widely for several decades. Research and development at universities and in industry has provided means for predictive condition monitoring, and many different devices and systems have been developed and are widely used in industry, transportation and civil engineering. In addition, many methods have been developed and reported in scientific forums to improve existing methods for the automatic analysis of faults. These methods, however, are not widely used as part of condition monitoring systems. The main reasons are, firstly, that many methods are presented in scientific papers without their performance in different conditions being evaluated, and secondly, that the methods include parameters so case-specific that implementing a system using them would be far from straightforward. In this thesis, some of these methods are evaluated theoretically and tested with simulations and with a drive in a laboratory, and a new automatic analysis method for bearing fault detection is introduced. The first part of this work explains the generation of the signal originating from a bearing fault and estimates its influence on the stator current both qualitatively and quantitatively. The feasibility of the stator current measurement as a bearing fault indicator is experimentally tested with a running 15 kW induction motor. The second part of this work concentrates on bearing fault analysis using the vibration measurement signal. The performance of a micromachined silicon accelerometer chip in conjunction with envelope spectrum analysis of the cyclic bearing fault is experimentally tested. Furthermore, different methods for creating feature extractors for bearing fault classification are researched, and an automatic fault classifier using multivariate statistical discrimination and fuzzy logic is introduced. It is often important that the on-line condition monitoring system is integrated with the industrial communications infrastructure. Two types of sensor solutions are tested in the thesis: the first is a sensor with calculation capacity, for example for producing the envelope spectra; the other collects the measurement data in memory so that another device can read it via a field bus. The data communications requirements depend strongly on the type of sensor solution selected: if the data is already analysed in the sensor, communications are needed only for the results, but otherwise all measurement data need to be transferred. The classification method can be complex if the data is analysed on a management-level computer, but if the analysis is made in the sensor itself, it must be simple due to the restricted calculation and memory capacity.
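As a rough sketch of envelope spectrum analysis, the standard vibration-based bearing diagnosis technique mentioned above (not the thesis's specific implementation; all signal parameters are invented), the following Python example band-pass filters a simulated vibration signal around a resonance, demodulates it with the Hilbert transform, and looks for the bearing defect frequency in the envelope spectrum:

```python
# Minimal envelope spectrum sketch for bearing fault detection (illustrative only).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 20_000                                   # sampling rate [Hz], hypothetical
t = np.arange(0, 1.0, 1 / fs)                 # 1 s of data
defect_freq = 87.0                            # assumed bearing defect frequency [Hz]
resonance = 3_000.0                           # assumed structural resonance [Hz]

# Simulated signal: each defect impact rings the resonance, plus measurement noise.
impacts = np.zeros_like(t)
impacts[::int(fs / defect_freq)] = 1.0
tau = np.arange(0, 0.005, 1 / fs)
ring = np.exp(-800 * tau) * np.sin(2 * np.pi * resonance * tau)
signal = np.convolve(impacts, ring, mode="same") + 0.05 * np.random.randn(len(t))

# Band-pass around the resonance, then take the Hilbert envelope.
b, a = butter(4, [2_000, 4_000], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, signal)))

# Envelope spectrum: a peak at the defect frequency (and harmonics) indicates a fault.
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
mask = (freqs > 5) & (freqs < 500)
peak = freqs[mask][np.argmax(spectrum[mask])]
print(f"Strongest envelope component: {peak:.1f} Hz (defect frequency {defect_freq} Hz)")
```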
Abstract:
The Occupational Safety and Health Act, reformed in 2003, increased companies' responsibility for identifying the risks in their own work environment. The act obliges companies to identify, investigate and assess the hazards and harmful factors arising from work and working conditions. Traditionally, companies have carried out risk assessment with various checklists, but the growing use of information technology has also made IT one of the tools of risk management. The aim of this work was to study the introduction of If Vahinkovakuutusyhtiö's new risk assessment software in a forest industry company, and to determine its effects on the management of safety measures and how well it met the usability requirements set for it in advance. The risk assessment software was studied at a pilot site, Stora Enso's Anjalankoski mills. Data was collected through feedback questionnaires from user training sessions, by interviewing the assessors and by participating in risk assessment rounds, among other means. The study followed, among other things, the number and quality of the improvement proposals generated with the software and how well the content designed for the software suited its use. Based on this study, it was found that a computer program intended for risk assessment can be designed to be easy to use and functional. The content of the program, intended to cover the different areas of occupational safety, was also found suitable for its purpose. The most problematic area of risk assessment, namely the planning and follow-up of safety measures, proved challenging to implement with the software. Developing a reporting system and a follow-up of measures that fit companies' varying risk assessment practices will remain an important area in the development of risk assessment software.