Abstract:
To determine if radiocontrast impairs vascular relaxation of the renal artery, segments (4-5 mm in length) of canine renal artery were suspended in vitro in organ chambers to measure isometric force (95% O2/5% CO2, at 37°C). Arterial segments with and without endothelium were placed at the optimal point of their length-tension relation and incubated with 10 µM indomethacin to prevent synthesis of endogenous prostanoids. The presence of nonionic radiocontrast (iohexol, Omnipaque 350, 1 ml in 25 ml control solution, 4% (v/v)) did not alter endothelium-dependent relaxation to acetylcholine in rings precontracted with either norepinephrine or prostaglandin F2alpha (N = 6). When the rings were precontracted with prostaglandin F2alpha, the presence of ionic contrast did not inhibit the relaxation of the arteries. However, in canine renal arteries contracted with norepinephrine, the presence of ionic radiocontrast (diatrizoate meglumine and diatrizoate sodium, MD-76, 1 ml in 25 ml control solution, 4% (v/v)) inhibited relaxation in response to acetylcholine, sodium nitroprusside (N = 6 in each group), and isoproterenol (N = 5; P < 0.05). Rings relaxed to less than 50% of the norepinephrine-induced contraction. Following removal of the contrast, vascular relaxation in response to the agonists returned to normal. These results indicate that ionic radiocontrast nonspecifically inhibits vasodilation (both cAMP-mediated and cGMP-mediated) of canine renal arteries contracted with norepinephrine. This reversible impairment of vasodilation could inhibit normal renal perfusion and act as a mechanism of renal failure following radiocontrast infusion. In the adopted experimental protocol, the isoproterenol-induced relaxation of renal arteries precontracted with norepinephrine was the most affected, suggesting a pivotal role of the cAMP system.
Abstract:
The deficiency of the A isoenzyme of β-hexosaminidase (Hex) produced by different mutations of the gene that codes for the alpha subunit (Tay-Sachs disease) has two variants with enzymological differences: the B variant consists of the absence of the Hex A isoenzyme, and the B1 variant produces a Hex A isoenzyme that is inactive towards the hydrolysis of the GM2 ganglioside and of negatively charged synthetic substrates. In contrast to the early-childhood form of the B variant, the B1 variant appears at a later clinical stage (3 to 7 years of age) with neurodegenerative symptoms leading to the death of the patient in the second decade of life. The most frequent mutation responsible for the GM2 gangliosidosis B1 variant is R178H, which has a widespread geographic and ethnic distribution. The highest incidence has been described in Portugal, which has been suggested as the point of origin of this mutation. Biochemical characterization of this lysosomal disease is carried out using negatively charged, alpha subunit-specific sulfated synthetic substrates, since Hex A isoenzyme heat-inactivation assays are not applicable. However, the determination of the apparent activation energy of Hex using the neutral substrate 3,3'-dichlorophenolsulfonphthaleinyl N-acetyl-β-D-glucosaminide may offer a valid alternative. The presence of an alpha subunit in the αβ heterodimer Hex A means that its activation energy (41.8 kJ/mol) is significantly lower than that of the ββ homodimer Hex B (75.1 kJ/mol); however, as the mutation inactivates the alpha subunit, the Hex A of the B1 variant presents an activation energy similar to that of the Hex B isoenzyme.
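The activation-energy comparison above rests on the Arrhenius relation; a minimal sketch of the two-temperature estimate follows. The rates and temperatures are invented for illustration (chosen so the estimate lands near the reported 41.8 kJ/mol for Hex A); they are not data from the study.

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def activation_energy(k1, T1, k2, T2):
    """Apparent activation energy (J/mol) from reaction rates measured
    at two absolute temperatures, via ln(k2/k1) = (Ea/R)*(1/T1 - 1/T2)."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical hydrolysis rates (arbitrary units) at 30 °C and 37 °C:
Ea = activation_energy(k1=1.000, T1=303.15, k2=1.454, T2=310.15)
print(round(Ea / 1000, 1), "kJ/mol")
```

In practice one would fit ln(rate) against 1/T over several temperatures rather than use only two points.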
Abstract:
Presentation of Jussi-Pekka Hakkarainen, held at the Emtacl15 conference on the 20th of April 2015 in Trondheim, Norway.
Abstract:
The growing population on earth, along with diminishing fossil deposits and the climate change debate, calls for better utilization of renewable, bio-based materials. In a biorefinery perspective, renewable biomass is converted into many different products such as fuels, chemicals, and materials, quite similar to the petroleum refinery industry. Since forests cover about one third of the land surface on earth, lignocellulosic biomass is the most abundant renewable resource available. The natural first step in a biorefinery is the separation and isolation of the different compounds of which the biomass is composed. The major components in wood are cellulose, hemicellulose, and lignin, all of which can be made into various end-products. Today, focus normally lies on utilizing only one component, e.g., the cellulose in the Kraft pulping process. It would be highly desirable to utilize all the different compounds, from both an economic and an environmental point of view. The separation process should therefore be optimized. Hemicelluloses can partly be extracted with hot water prior to pulping. Depending on the severity of the extraction, the hemicelluloses are degraded to various degrees. In order to be able to choose from a variety of different end-products, the hemicelluloses should be as intact as possible after the extraction. The main focus of this work has been on preserving the hemicellulose molar mass throughout the extraction at a high yield by actively controlling the extraction pH at the high temperatures used. Since it has not been possible to measure pH during an extraction due to the high temperatures, the extraction pH has remained a “black box”. Therefore, a high-temperature in-line pH measuring system was developed, validated, and tested for hot-water wood extractions. One crucial step in the measurements is calibration; therefore, extensive effort was put into developing a reliable calibration procedure.
Initial extractions with wood showed that the actual extraction pH was ~0.35 pH units higher than previously believed. The measuring system was also equipped with a controller connected to a pump. With this addition it was possible to control the extraction to any desired pH set point. When the pH dropped below the set point, the controller started pumping in alkali, and in this way the desired set point was maintained very accurately. Analyses of the extracted hemicelluloses showed that fewer hemicelluloses were extracted at higher pH, but with a higher molar mass. Monomer formation could, at a certain pH level, be completely inhibited. Increasing the temperature while maintaining a specific pH set point would speed up the extraction without degrading the molar mass of the hemicelluloses, thereby intensifying the extraction. The diffusion of the dissolved hemicelluloses out of the wood particle is a major part of the extraction process. Therefore, a particle size study ranging from 0.5 mm wood particles to industrial-size wood chips was conducted to investigate the internal mass transfer of the hemicelluloses. Unsurprisingly, it showed that hemicelluloses were extracted faster from smaller wood particles than from larger ones, although particle size did not seem to have a substantial effect on the average molar mass of the extracted hemicelluloses. However, smaller particle sizes require more energy to manufacture and thus increase the economic cost. Since bark comprises 10–15% of a tree, it is important to also consider it in a biorefinery concept. Spruce inner and outer bark were hot-water extracted separately to investigate the possibility of isolating the bark hemicelluloses. It was shown that the bark hemicelluloses consisted mostly of pectic material and differed considerably from the wood hemicelluloses. The bark hemicelluloses, or pectins, could be extracted at lower temperatures than the wood hemicelluloses.
A chemical characterization, done separately on inner and outer bark, showed that inner bark contained over 10% stilbene glucosides, which could be extracted with aqueous acetone at temperatures as low as 100 °C.
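The set-point control described above (pump in alkali whenever the measured pH drops below the target) can be sketched as a simple on/off loop. Everything below is illustrative: the probe and pump interfaces, the acid-release rate, and the dose size are invented stand-ins, not taken from the thesis.

```python
def control_extraction_ph(read_ph, dose_alkali, set_point, steps):
    """On/off control: whenever the in-line probe reads below the
    set point, trigger one alkali dose from the pump."""
    for _ in range(steps):
        if read_ph() < set_point:
            dose_alkali()

class ToyExtraction:
    """Toy stand-in for a hot-water extraction vessel: between readings
    the pH drifts down (acids released by deacetylation), and each
    alkali dose pushes it back up."""
    def __init__(self, ph=4.6):
        self.ph = ph
    def read_ph(self):
        self.ph -= 0.07          # acid release between readings
        return self.ph
    def dose_alkali(self):
        self.ph += 0.10          # effect of one alkali pulse

vessel = ToyExtraction()
control_extraction_ph(vessel.read_ph, vessel.dose_alkali, set_point=4.4, steps=50)
print(round(vessel.ph, 2))       # held near the 4.4 set point
```

A real controller would dose proportionally to the error rather than in fixed pulses, but the on/off scheme is enough to show why the set point is held tightly.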
Abstract:
Over the last 30 years, new technologies and globalization have radically changed the way in which marketing is conducted. However, whereas their effects on business in general have been widely discussed, their specific effects on marketing remain without clear recognition. Global research has been conducted to shed light on the issue, but it has largely concentrated on the views of executives as well as on consumer markets. In addition, a research gap exists in applying the concept of marketing change to a specific business-to-business (B2B) industry. Therefore, the main research question this study seeks to answer is: “How is contemporary marketing conducted in the high-technology industry?” In this research, the researcher considers the specific industry of high technology. However, as the industry comprises differing markets, the focus will be given to one of the industry’s prime sectors – the information technology (IT) markets, where companies offer other firms products or services manufactured with advanced technology. The growing IT market is considered of critical importance in the economies of technologically ready countries such as Finland, where this research is also conducted. Through multiple case studies, the researcher aims to describe how changes in technology, customer engagement, and future trends have shaped the way in which successful high-tech marketing is conducted in today’s marketplace. Then, results derived from the empirical research are presented to the reader with links to existing literature. As a conclusion, a generalized framework is constructed to depict an ideal marketer-customer relationship, with emphasis on dynamic, two-way communication and its supporting elements of customer analytics, change adaptation, strategic customer communication, and organizational support.
From a managerial point of view, the research may provide beneficial information, as contemporary marketing can yield profitable outcomes if managed correctly. As a new way to grasp competitive advantage, strategic marketing is much more data-driven and customer-focused than ever before. The study can also prove relevant for academic communities, as its results may inspire a new focus in the education of future marketers. This study was limited to the internal activities within the high-tech industry, leaving out considerations of co-marketing, marketing via business partners, and marketing in other B2B industries.
Abstract:
Permanent magnet synchronous machines (PMSM) have become widely used in applications because of their high efficiency compared to synchronous machines with excitation windings or to induction motors. This feature of the PMSM is achieved through the use of permanent magnets (PM) as the main excitation source. The magnetic properties of the PM have a significant influence on all PMSM characteristics. Recent observations of PM material properties in rotating machines revealed that the magnets do not necessarily operate in the second quadrant of the demagnetization curve, which makes them prone to hysteresis losses. Moreover, no good analytical approach has yet been derived for the magnetic flux density distribution along the PM during different short-circuit faults. The main task of this thesis is to derive a simple analytical tool which can predict the magnetic flux density distribution along a rotor-surface-mounted PM in two cases: during normal operation, and at the moment of a three-phase symmetrical short circuit that is worst from the PM’s point of view. Surface-mounted PMSMs were selected because of their prevalence and relatively simple construction. The proposed model is based on the combination of two theories: magnetic circuit theory and space vector theory. For normal operation, comparison of results obtained from finite element software with results calculated with the proposed model shows good accuracy in the parts of the PM most prone to hysteresis losses. The comparison of the results for the three-phase symmetrical short circuit revealed significant inaccuracy of the proposed model compared with the finite element results. An analysis of the reasons for this inaccuracy was provided. The impact on the model of the Carter factor theory and of the assumption that air has the same permeability as the PM were analyzed.
Propositions for further development of the model are presented.
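One half of the combined model named above, space vector theory, reduces the three phase currents to a single complex vector. A minimal sketch of that standard transform (illustrative only, not the thesis's actual model; the amplitude-invariant 2/3 scaling is one common convention):

```python
import cmath, math

def current_space_vector(ia, ib, ic):
    """Amplitude-invariant space vector of three-phase currents:
    i_s = (2/3) * (ia + a*ib + a^2*ic), with a = exp(j*2*pi/3)."""
    a = cmath.exp(1j * 2 * math.pi / 3)
    return (2.0 / 3.0) * (ia + a * ib + a * a * ic)

# Balanced three-phase currents at one instant (phase-current peak 10 A):
ia, ib, ic = 10.0, -5.0, -5.0    # ib = ic = 10*cos(+/-120 deg)
i_s = current_space_vector(ia, ib, ic)
print(abs(i_s))  # for a balanced set, the magnitude equals the peak value
```

As the rotor turns, this vector rotates in the complex plane, which is what lets the flux density along the magnet be evaluated at any instant, including the worst moment of the short circuit.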
Abstract:
The topic of the present doctoral dissertation is the analysis of the phonological and tonal structures of a previously largely undescribed language, namely Samue. It is a Gur language belonging to the Niger-Congo language phylum, and it is spoken in Burkina Faso. The data were collected during fieldwork in a Sama village; the data include 1800 lexical items, thousands of elicited sentences, and 30 oral texts. The data were first transcribed phonetically, and then the phonological and tonal analyses were conducted. The results show that the phonological system of Samue, with its phoneme inventory and phonological processes, has the same characteristics as other related Gur languages, although some particularities were found, such as the voicing and lenition of stop consonants in medial positions. Tonal analysis revealed three level tones, which have both lexical and grammatical functions. A particularity of the tonal system is the regressive Mid tone spreading in the verb phrase. The theoretical framework used in the study is Optimality Theory. Optimality Theory is rarely used in the analysis of an entire language system, and thus an objective was to see whether the theory was applicable to this type of work. Within the tonal analysis especially, some language-specific constraints had to be created, although the basic Optimality Theory principle is the universal nature of the constraints. These constraints define the well-formedness of the language structures, and they are ranked differently in different languages. This study gives new insights into typological phenomena in Gur languages. It is also a fundamental starting point for the Samue language in relation to the establishment of an orthography. From the theoretical point of view, the study shows that Optimality Theory is largely applicable to the analysis of an entire sound system.
Abstract:
The goal of the thesis was to investigate how much after-sales profits a crane sale generates over the life cycle of the crane and the effects of these after-sales profits on the overall profitability of the crane. The thesis utilizes theories about life cycle costing from an equipment and service supplier’s point of view. However, instead of costs, the thesis is focused on the life cycle after-sales profits from maintenance services and spare parts provided for the sold crane. The case study approach was chosen and a total of five cranes from three different segments were investigated. An eight-step life cycle profit calculation model was developed in order to analyze the chosen cases’ life cycle profits systematically. The results of the investigation suggest that the life cycle after-sales profits are significant in value. In the case analyses they accounted for between 20% and 44% of the overall life cycle profits of the case cranes. The after-sales profits should be taken into account already in the pricing when offering a crane to a customer.
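The kind of share the calculation model above produces (after-sales profit as a fraction of life-cycle profit) can be illustrated with a deliberately simplified computation. The figures and the flat, undiscounted annual service profit are hypothetical assumptions, not case data, and the actual eight-step model is certainly more detailed:

```python
def after_sales_share(sale_profit, annual_service_profit, years):
    """Share of the crane's life-cycle profit coming from after-sales
    (maintenance services and spare parts) over its service life.
    Simplification: service profit is flat and cash flows undiscounted."""
    after_sales = annual_service_profit * years
    total = sale_profit + after_sales
    return after_sales / total

# Hypothetical crane: profit on the sale itself plus 20 years of service.
share = after_sales_share(sale_profit=120_000, annual_service_profit=3_000, years=20)
print(f"{share:.0%}")
```

With these invented numbers the after-sales share comes out at one third, i.e. inside the 20-44% band the case analyses reported.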
Abstract:
In order to evaluate the performance of a 1-h 75-g oral glucose tolerance test (OGTT) for the diagnosis of gestational diabetes mellitus (GDM), a cohort of 4998 women, 20 years or older, without previous diabetes, being treated in prenatal care clinics in Brazil, answered a questionnaire and underwent a 75-g OGTT including fasting, 1-h and 2-h glucose measurements between their 24th and 28th gestational weeks. Pregnancy outcomes were transcribed from medical registries. GDM was defined according to WHO criteria (fasting: ≥126 mg/dL; 2-h value: ≥140 mg/dL) and macrosomia as a birth weight equal to or higher than 4000 g. Areas under the receiver operating characteristic curve (AUC) were compared and the diagnostic properties of various cut-off points were evaluated. The AUCs for the prediction of macrosomia were 0.606 (0.572-0.637) for the 1-h and 0.589 (0.557-0.622) for the 2-h plasma glucose test. Similar predictability was demonstrable regarding combined adverse outcomes: 0.582 (0.559-0.604) for the 1-h test and 0.572 (0.549-0.595) for the 2-h test. When the 1-h glucose test was evaluated against a diagnosis of GDM defined by the 2-h glucose test, the AUC was 0.903 (0.886-0.919). The cut-off point that maximized sensitivity (83%) and specificity (83%) was 141 mg/dL, identifying 21% of the women as positive. A cut-off point of 160 mg/dL, with lower sensitivity (62%), had higher specificity (94%), labeling 8.6% as positive. Detection of GDM can be done with a 1-h 75-g OGTT: the value of 160 mg/dL has the same diagnostic performance as the conventional 2-h value (140 mg/dL). The simplification of the test may improve the coverage and timing of the diagnosis of GDM.
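The cut-off evaluation above boils down to counting true/false positives and negatives at each threshold. A minimal sketch of the sensitivity/specificity computation, using invented 1-h glucose values and GDM labels (not the study's data):

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity and specificity of classifying value >= cutoff as
    positive, against reference labels (True = GDM by the 2-h criterion)."""
    tp = sum(v >= cutoff and l for v, l in zip(values, labels))
    fn = sum(v < cutoff and l for v, l in zip(values, labels))
    tn = sum(v < cutoff and not l for v, l in zip(values, labels))
    fp = sum(v >= cutoff and not l for v, l in zip(values, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Invented illustrative data: ten women, 1-h glucose in mg/dL.
glucose_1h = [120, 135, 150, 165, 180, 130, 145, 155, 110, 170]
gdm = [False, False, True, True, True, False, False, True, False, True]
sens, spec = sens_spec(glucose_1h, gdm, cutoff=141)
print(round(sens, 2), round(spec, 2))
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity traces the ROC curve whose area the study reports.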
Abstract:
A recent assessment of 4400 postgraduate courses in Brazil by CAPES (a federal government agency dedicated to the improvement of the quality of education and research at the postgraduate level) stimulated a large number of reactions in the press, scientific journals, and scientific congresses. This gigantic effort to classify 16,400 scientific journals in order to provide indicators for assessment proved to be puzzling and methodologically erroneous as a way of gauging institutions from a metric point of view. A simple algorithm is proposed here to weigh the scientometric indicators that should be considered in the assessment of a scientific institution. I conclude that the simple gauge of the total number of citations accounts for both the productivity of scientists and the impact of articles. The effort spent in this exercise is relatively small, and the sources of information are fully accessible. As an exercise to estimate the value of the methodology, 12 institutions of physics (10 from Brazil, one from the USA and one from Italy) have been evaluated.
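The proposed gauge reduces each institution to a single number, the total citations its articles have received, and compares institutions on that sum. A toy sketch with invented citation counts (the article-level aggregation is an assumption of this illustration, not the paper's exact procedure):

```python
def rank_by_total_citations(article_citations):
    """Rank institutions by the single gauge the author advocates: the
    total number of citations, summed over each institution's articles.
    This one number folds together productivity (many articles) and
    impact (many citations per article)."""
    totals = {inst: sum(cites) for inst, cites in article_citations.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Invented data: per-article citation counts for three institutions.
data = {
    "Inst A": [120, 45, 30, 5],     # few articles, high impact
    "Inst B": [10, 12, 9, 11, 8],   # many articles, modest impact
    "Inst C": [300, 150],           # very high impact
}
for inst, total in rank_by_total_citations(data):
    print(inst, total)
```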
Abstract:
Emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interplay and work towards shared goals and objectives. In this paper, I will draw a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, the genesis and consolidation period of the literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The ‘deluge’ of popular literature in the 1920s to 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language.
This was the beginning of a renaissance and period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation’s Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists as raw data for linguists. Our data are used for creating morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø, for instance. In order to reach these targets, we will produce not only the digitized materials but also development tools for supporting linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology. The mission is to improve the usage and usability of digitized content. During the project, we have advanced methods that will refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to be beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages out of a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded text (OCR) often contains too many mistakes to be used as such in research. The mistakes in OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. It was necessary to implement this tool, since these rare and peripheral prints often include characters that have since fallen out of use, which are sadly neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing tool is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface—that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view, there is a danger that the needs of linguists are not necessarily met. A further remarkable downside is the lack of a shared goal or social affinity. There is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
There has also been criticism that digital humanities makes the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more evident once you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or in need of revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to carefully identify the potential niches to complete the needed tasks. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be a different one. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools to draw resources from, their specific richness in skill is suited for the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in those fields of vocabulary where the researchers require more information. For instance, there is a lack of Hill Mari words and terminology in anatomy.
We have digitized books in medicine, and we could try to track the words related to human organs by assigning the citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay, where the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so that society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as a servant of ‘two masters’: research and society.
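Correcting OCR in this setup means rewriting the word strings stored inside ALTO XML pages. A minimal sketch of reading those strings out with the standard library (the sample document and the ALTO v2 namespace URI are illustrative; actual Fenno-Ugrica files may use another ALTO version or richer layout elements):

```python
import xml.etree.ElementTree as ET

ALTO_NS = "http://www.loc.gov/standards/alto/ns-v2#"

# Tiny hand-written ALTO page standing in for a real OCRed scan.
ALTO_SAMPLE = f"""<alto xmlns="{ALTO_NS}">
  <Layout><Page><PrintSpace><TextBlock><TextLine>
    <String CONTENT="kindred"/><SP/><String CONTENT="languages"/>
  </TextLine></TextBlock></PrintSpace></Page></Layout>
</alto>"""

def alto_words(xml_text):
    """Collect the OCRed word strings (String/@CONTENT) from one ALTO
    page -- the text a citizen-scientist editor would correct."""
    root = ET.fromstring(xml_text)
    return [s.attrib["CONTENT"] for s in root.iter(f"{{{ALTO_NS}}}String")]

print(alto_words(ALTO_SAMPLE))
```

An editor like the one described would write corrected values back into the same CONTENT attributes, so the enhanced file keeps its coordinates and layout intact for further research use.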
Abstract:
The purpose of this study is to clarify the industrial solutions purchasing process from the purchaser companies’ point of view. The customer’s view on value-generating aspects and difficulties in purchases is also discussed, as well as the different purchasing entities that customers have ended up with in their solution purchases. The current solution literature is mainly concentrated on supplier views, and the customer perspective has been left without adequate attention. However, knowledge of the customer and the identification of customer needs are at the core of a successful solution business. The focus of this thesis is on Finnish companies’ solution purchases that have been realized during the last five years. Industrial solutions in this case are factories or other large industrial plants. The industrial solutions’ purchasing process is opened up all the way from discovering the need until the start-up of the plant. Of interest are the customer experience of the success of the acquisition and the purchaser’s view on good practices allowing a successful procurement project.
Abstract:
The National Library of Finland is implementing the Digitization Project of Kindred Languages in 2012–16. Within the project we will digitize materials in the Uralic languages as well as develop tools to support linguistic research and citizen science. Through this project, researchers will gain access to new corpora to which all users will have open access regardless of their place of residence. Our objective is to make sure that the new corpora are made available for the open and interactive use of both the academic community and the language societies as a whole. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various Uralic languages. The digitization will be completed by early 2015, when the Fenno-Ugrica collection will contain around 200,000 pages of editable text. Researchers cannot spend enough time with the material to retrieve a satisfactory amount of edited words, so the participation of a crowd in the editing work is needed. Often the targets in crowdsourcing have been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view, there is a danger that the needs of linguistic research are not necessarily met. Also, the number of pages is too high to deal with. A remarkable downside is the lack of a shared goal or social affinity. There is no reward in traditional methods of crowdsourcing. Nichesourcing is a specific type of crowdsourcing where tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools to draw resources from, their specific richness in skill is suited for the complex tasks with high-quality product expectations found in nichesourcing.
Communities have a purpose and identity, and their regular interactions engender social trust and reputation. These communities can correspond to research more precisely. Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. Some selection must be made, since we are not aiming to correct all 200,000 pages that we have digitized, but to give citizen scientists assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in those fields of vocabulary where the researchers require more information. For instance, there is a lack of Hill Mari words in anatomy. We have digitized books in medicine, and we could try to track the words related to human organs by assigning the citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay, where the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so society can benefit too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as a servant of “two masters”: research and society.
Abstract:
This qualitative study started from an interest in examining how the reality of cross-cultural encounters is presented in the global business press. The research paper emphasizes different ways to classify culture and cross-cultural competency, from the point of view of both individuals and organizations. The analysis consists of public discourses, where cross-cultural realities are created through different persons, stories, and contexts. For data collection, a comprehensive database search was performed, and 10 articles from the widely known worldwide business magazine The Financial Times were chosen as the data for the study. To address the research questions, Thematic Content Analysis (TCA) and Discourse Analysis (DA) were utilized, supplemented with the constant comparison method of grounded theory in organizing the data. The academic references consist of literary works and articles presenting relevant concepts, creating a cross-cultural framework designed to assist the reader in navigating the topics of culture and cross-cultural competency. The repertoires were formed from the data as follows: the first repertoire is the contrast between the home and target cultures that the individual was able to discern. As a consequence of the first repertoire, companies then offer cultural training to their employees to prepare them for situations of increasing cultural variation. The third repertoire is increased awareness of other cultures, which is conveyed as a result of cultural training and contextual work experience. The fourth repertoire is globalization as an international business environment, in which the people in the articles perform their job functions. It is stated in the conclusions that the representations emphasize Western values and personal traits in leadership.
Abstract:
The aim of this Master's thesis is to create a pooling arrangement for the hire-purchase financing of new cars, secured by the vehicles themselves, which would enable a positive credit decision for more applicants than before. The thesis is written from the car dealer's point of view. The theoretical part of the work examines the use of similar arrangements in, among other fields, the insurance industry and supply chain management. Based on the collected observations, a framework is formed for an arrangement suitable for car retailing. In the empirical part, the pooling arrangement thus formed is tested with a case model of a single financed car. In addition, the potential of the arrangement is assessed from different perspectives. Based on the results obtained, it can be concluded that the model can be financially profitable for the car dealer and warrants further research.