61 results for discourse approaches
Abstract:
The marketing mix and relationship marketing are two major approaches that often form the basis for organizational marketing planning. The relative superiority of these approaches has long been debated without any definitive conclusion. Recent studies indicate that the two approaches are often used side by side in marketing planning, and some studies suggest that combining the marketing mix and relationship marketing approaches may even be possible. The aim of this thesis is to provide knowledge about the use of the marketing mix and relationship marketing approaches in organizations and about the possibilities of combining them. The study also aims to provide an account of the strengths, weaknesses and risks of such a combination. The objectives were met through a literature review and a case study, in which interviews were conducted to gain deeper knowledge about marketing planning in various organizations. Based on this study, combining the major marketing approaches is possible and even recommended, provided that a few aspects that might cause problems in the combining process are kept in mind.
Abstract:
The use of enantiopure intermediates for drug synthesis is a trend in the pharmaceutical industry. Different physiological effects are associated with the enantiomers of chiral molecules; thus, the safety profile of a drug based on an enantiopure active pharmaceutical ingredient is more reliable. Biocatalysis is an important tool for accessing enantiopure molecules: it combines the advantage of selectivity (chemo-, regio- and stereoselectivity) with the benefits of a green synthesis strategy. Chemoenzymatic syntheses of drug molecules, obtained by combining biocatalysis with modern chemical synthesis steps, usually consist of fewer reaction steps, produce less waste and show improved overall synthetic efficiency, both in yields and in enantio- and/or diastereoselectivities, compared with classical chemical synthesis. The experimental work, together with the literature review, clearly indicates that lipase catalysis is highly applicable in the synthesis of enantiopure intermediates of drug molecules as the basis for establishing the correct stereochemistry. By lipase catalysis, enantiopure secondary alcohols used as intermediates in the synthesis of Dorzolamide, an antiglaucoma drug, were obtained. Enantiopure β-hydroxy nitriles, potential intermediates for the synthesis of antidepressant drugs with a 1-aryl-3-methylaminopropan-1-ol structure, were also obtained with lipases. Kinetic resolution of racemates was the main biocatalytic approach applied. Candida antarctica lipase B, Burkholderia cepacia lipase and Thermomyces lanuginosus lipase were applied for the acylation of alcohols and the alcoholysis of their esters in organic solvents such as diisopropyl ether and tert-butyl methyl ether. Candida antarctica lipase B was also used under solvent-free conditions for the acylation of ethyl 3-hydroxybutanoate.
Abstract:
Many manufacturing companies have started to offer complete solutions to their customers' unique needs due to toughening competition and customer demand. The discourse on this kind of solution business is still developing; hence, there is no established definition for the concept of a solution. The aim of the study is to identify the concept of solution in depth and to understand how the industry's current views differ from the theoretical concepts. The describing dimensions are identified from 13 selected theoretical notions and from responses given by the employees of five different companies. The 32 interview transcripts are analyzed with thematic analysis and qualitative content analysis. According to the findings, the concept of solution is characterized by integration, customization, risk-sharing, value co-creation, long-term orientation, and desired outcomes. The industry's views differ with respect to all of these. The results illustrate that a solution is a bundle, and that the whole solution is customized to some degree for a client. A solution supplier needs to be customer-focused, of which value co-creation is only a part. The solution solves the customer's problem and improves both the customer's and the supplier's business. Neither long-term focus nor risk-sharing was directly employed to characterize the concept of solution. The differences are mainly due to the different approaches to the definitions and the inexperience of the companies.
Abstract:
The thesis analyses Norman Fairclough's theory of critical discourse analysis and the criticism directed at it. The aim is to reconcile these differing views and to offer solutions to a central problem of critical discourse analysis, namely the inadequacy of emancipation (the identification and resolution of social injustices). Possibilities arising from the theoretical part are applied to text analysis. The object of study is the text Rebuilding America's Defenses: Strategy, Forces and Resources For a New Century and, to some extent, the organization that produced it, the Project for the New American Century. These are examined above all as social phenomena and in relation to each other. The greatest problems of Fairclough's model turn out to be a traditional conception of language, according to which the abstract, internal relations of the language system are paramount, and ideological confrontation as the starting point of critique. The former leads to an unsatisfactory ability of linguistic findings to explain social observations, and the latter to political or worldview-based debate that does not allow for new insights. The thesis concludes that by focusing on content rather than linguistic structure, and by understanding the producer of a text as an individual, bounded social actor, the analysis can gain openness and precision. Critical discourse analysis needs such a perspective to support its linguistic analyses and to find a new kind of relevance.
Interactional Perspectives on Discourse. Proceedings from the Organization in Discourse 3 Conference
Abstract:
Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics, where formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind, and has a simple syntax compared to many other popular languages. 
The aim of using it in instruction is to address algorithms and their implementation in a way that allows focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
Abstract:
The main focus of the present thesis was on verbal episodic memory processes that are particularly vulnerable to preclinical and clinical Alzheimer's disease (AD). These processes were studied with a word learning paradigm, cutting across the domains of memory and language learning research. Moreover, the differentiation between normal aging, mild cognitive impairment (MCI) and AD was studied with the cognitive screening test CERAD. In study I, the aim was to examine how patients with amnestic MCI differ from healthy controls in the different CERAD subtests. The sensitivity and specificity of the CERAD screening test for MCI and AD were also examined, as previous studies on the sensitivity and specificity of the CERAD have not included MCI patients. The results indicated that MCI is characterized by an encoding deficit, as shown by the overall worse performance on the CERAD Wordlist learning test compared with controls. As a screening test, CERAD was not very sensitive to MCI. In study II, verbal learning and forgetting in amnestic MCI, AD and healthy elderly controls were investigated with an experimental word learning paradigm in which the names of 40 unfamiliar objects (mainly archaic tools) were trained with or without semantic support. The object names were trained during a 4-day period, and follow-ups were conducted one week, 4 weeks and 8 weeks after the training period. Manipulation of semantic support was included in the paradigm because it was hypothesized that semantic support might benefit the MCI group in particular in the present learning task, as semantic memory is quite well preserved in MCI in contrast to episodic memory. We found that word learning was significantly impaired in MCI and AD patients, whereas forgetting patterns were similar across groups.
Semantic support showed a beneficial effect on object name retrieval in the MCI group 8 weeks after training, indicating that the MCI patients' preserved semantic memory abilities compensated for their impaired episodic memory. The MCI group performed as well as the controls in the tasks tapping incidental learning and recognition memory, whereas the AD group showed impairment. Both the MCI and the AD group benefited less from phonological cueing than the controls. Our findings indicate that acquisition is compromised in both MCI and AD, whereas long-term retention is not affected to the same extent. Incidental learning and recognition memory seem to be well preserved in MCI. In studies III and IV, the neural correlates of naming newly learned objects were examined in healthy elderly subjects and in amnestic MCI patients by means of positron emission tomography (PET) right after the training period. The naming of newly learned objects by healthy elderly subjects recruited a left-lateralized network, including frontotemporal regions and the cerebellum, that was more extensive than the one related to the naming of familiar objects (study III). Semantic support showed no effects on the PET results for the healthy subjects. The observed activation increases may reflect lexical-semantic and lexical-phonological retrieval, as well as more general associative memory mechanisms. In study IV, compared to the controls, the MCI patients showed increased anterior cingulate activation when naming newly learned objects that had been learned without semantic support. This suggests the recruitment of additional executive and attentional resources in the MCI group.
Abstract:
By interpreting research results about textbooks, this study tries to answer the question: how should texts be formulated to optimize the learning of the reading pupil? Seven perspectives structure the research. History: the amount of information available in a society influences the learning offered by textbooks; a compressed description indicates that memorizing activities have given way to tendencies towards critical reading. Curriculum: the decentralization of curriculum development accentuates the importance of textbook authors as interpreters of significant information; because of the authority of textbooks, the way information is presented can function as an unintended curriculum. Use: different functions can be identified in the use of textbooks; textbooks have, for instance, an authoritarian, a cohesive and a disciplinary function. Level of difficulty: a text that optimally matches the skills of the readers in a class should both provide facilitating scaffolds for learning and at the same time challenge especially capable students by not being too obvious. Changing of preconceptions: the only possible starting point for teaching is the knowledge developed earlier by the student; in certain areas misconceptions are common, which motivates the use of conceptual change texts that enhance the transformation of earlier obtained knowledge. Coherence: well-structured texts can usually be considered beneficial for learning, as can the use of meta-discourse guiding the reader through the text; if the structuring is too obvious, however, the content may not be processed on a deep level, at least by capable students. Content: the analysis underlines the need for deep approaches in textbooks; though this is a relative statement, textbooks seem, from a learning perspective, to have been too superficial in their presentations. Ten principles were developed as a concluding interpretation of the good textbook.
Though these principles capture important tendencies for developing a good textbook, textbook writing has to be considered as an art that cannot be captured in a short formula. The paradoxes identified indicate that the results should be seen as a starting point for further research.
Abstract:
Today's software industry faces increasingly complicated challenges in a world where software is almost ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, yet at the same time affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one of the reasons why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the process improvement field there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight methods, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with one specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement methods. Third, knowledge dissemination within large companies is improved through methods that place collaboration at the centre. The movement behind agile working methods emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that most of the daily work consists of creating something new that has not existed before.
Every software developer must be an expert in her field and spends a large part of her working day creating solutions to problems she has never solved before. Although this has been a well-known fact for many decades, many software projects are still managed as if they were production lines in factories. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile working methods have proven to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, making it easier to develop and adapt the methods to the specific demands the context imposes. Second, resistance to change can be overcome more easily by introducing cultural changes carefully and by giving the target group direct first-hand contact with the new methods. Relevant product measurement methods can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive for decision-making when prioritizing the backlog of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support regarding so-called refactoring, i.e. continuous quality improvement of the program's code and design.
The decision to refactor can be difficult to make, especially for agile and Lean teams, since they are expected to be able to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed using the so-called model-driven paradigm, and we construct a way to integrate this metric into agile and Lean working methods. An important part of any process improvement initiative is spreading knowledge about the new software process. This applies regardless of the kind of process one tries to introduce, whether plan-driven or agile. We propose that methods based on collaboration in creating and evolving the process are a good way to support knowledge dissemination, and we give an overview of process authoring tools on the market with that proposal in mind.
Abstract:
The Department of French Studies of the University of Turku (Finland) organized an International Bilingual Conference on Crosscultural and Crosslinguistic Perspectives on Academic Discourse from 20 to 22 May 2005. The event hosted specialists on Academic Discourse from Belgium, Finland, France, Germany, Italy, Norway, Spain, and the USA. This book is the first volume in our series of publications on Academic Discourse (AD hereafter). The following pages are composed of selected papers from the conference and focus on different aspects and analytical frameworks of Academic Discourse. One of the motivations behind organizing the conference was to examine and expand research on AD in different languages. Another was to question to what extent academic genres are culture-bound and language-specific, or primarily field or domain specific. Research on AD has long been concerned mainly with the use of English in different academic settings – mainly written contexts – at the expense of other languages. Alternatively, the academic genre conventions of English and the English-speaking world have served as a basis for comparison with other languages and cultures. We consider this first volume a strong contribution to the dissemination of research on AD based on languages other than English, namely Finnish, French, Italian, Norwegian and Romanian in this book. All the following articles have a strong link with the French language: either French is constitutive of the AD corpora under examination or the article was written in French. The structure of the book suggests and provides evidence that the concept of AD is understood and tackled to varying degrees by different scholars. Our first volume opens up the discussion of what AD is and supports the dissemination, overlapping and expansion of current research questions and methodologies. The book is divided into three parts and contains four articles in English and six articles in French.
The papers in parts one and two cover what we call the prototypical genre of written AD, i.e. the research article. Part one follows up on issues linked to the Research Article (RA hereafter). Kjersti Fløttum asks whether a typical RA exists and concentrates on authors' voices in the RA (self and other dimensions), whereas Didriksen and Gjesdal's article focuses on individual variation of the author's voice in the RA. The last article in this section, by Nadine Rentel, deals with evaluation in the writing of RAs. Part two concentrates on the teaching and learning of AD within foreign language learning, another more or less canonical genre of AD. Two aspects of writing are covered in the first two articles: foreign students' representations of rhetorical traditions (Hidden) and a contrastive assessment of written exercises in French and Finnish in higher education (Suzanne). The last contribution in this section moves away from traditional written forms and looks at how argumentation is constructed in students' oral presentations (Dervin and Fauveau). The last part of the book continues the extension by featuring four articles written in French exploring institutional and scientific discourses. Institutional discourses under scrutiny include the European Bologna Process (Galatanu) and Romanian reform texts (Moilanen). As for scientific discourses, the next paper in this section deconstructs an ideological discourse on the didactics of French as a foreign language (Pescheux). Finally, the last paper in part three reflects on varied forms of AD at university (Defays). We hope that this book will add some fuel to the continuing discussion of diverse forms of and approaches to AD – in different languages and voices! Needless to say, with the current upsurge in academic mobility, reflection on crosscultural and crosslinguistic AD has only just begun.
Abstract:
The currently used forms of cancer therapy are associated with drug resistance and toxicity to healthy tissues. Thus, more efficient methods are needed for cancer-specific induction of growth arrest and programmed cell death, also known as apoptosis. Therapeutic forms of tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) are investigated in clinical trials due to the capability of TRAIL to trigger apoptosis specifically in cancer cells by activation of cell surface death receptors. Many tumors, however, have acquired resistance to TRAIL-induced apoptosis and sensitizing drugs for combinatorial treatments are, therefore, in high demand. This study demonstrates that lignans, natural polyphenols enriched in seeds and cereal, have a remarkable sensitizing effect on TRAIL-induced cell death at non-toxic lignan concentrations. In TRAIL-resistant and androgen-dependent prostate cancer cells we observe that lignans repress receptor tyrosine kinase (RTK) activity and downregulate cell survival signaling via the Akt pathway, which leads to increased TRAIL sensitivity. A structure-activity relationship analysis reveals that the γ-butyrolactone ring of the dibenzylbutyrolactone lignans is essential for the rapidly reversible TRAIL-sensitizing activity of these compounds. Furthermore, the lignan nortrachelogenin (NTG) is identified as the most efficient of the 27 tested lignans and norlignans in sensitization of androgen-deprived prostate cancer cells to TRAIL-induced apoptosis. While this combinatorial anticancer approach may leave normal cells unharmed, several efficient cancer drugs are too toxic, insoluble or unstable to be used in systemic therapy. To enable use of such drugs and to protect normal cells from cytotoxic effects, cancer-targeted drug delivery vehicles of nanometer scale have recently been generated. 
The newly developed nanoparticle system that we tested in vitro for cancer cell targeting combines the efficient drug-loading capacity of mesoporous silica with the versatile particle surface functionalization of hyperbranched poly(ethylene imine), PEI. The mesoporous hybrid silica nanoparticles (MSNs) were functionalized with folic acid to promote targeted internalization by folate receptor overexpressing cancer cells. The presented results demonstrate that the developed carrier system can be employed in vitro for cancer-selective delivery of adsorbed or covalently conjugated molecules and, furthermore, for the selective induction of apoptotic cell death in folate receptor expressing cancer cells. The tested carrier system displays potential for the simultaneous delivery of several anticancer agents specifically to cancer cells also in vivo.
Abstract:
Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All of these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on software is particularly significant when it comes to critical systems. Here the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable financial loss or the endangerment of lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional development. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs through the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish certain techniques for the evaluation of rigorous developments. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting.
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as on related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that these can facilitate the process of creating software systems, e.g. by controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide the metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perceptions of domain experts. It is our aspiration to promote measurement as an indispensable part of the quality control process and as a strategy towards quality improvement.
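The thesis's own metrics target specifications and models; as a simplified, hedged illustration of what a structural metric looks like in general (the function `branch_count` and its chosen node set are assumptions of this sketch, not the metrics defined in the thesis), one can count branching constructs in source code:

```python
# Illustrative sketch only: a crude cyclomatic-style structural metric
# computed over Python source. Real specification/model metrics would
# operate on the structure of formal artefacts instead.
import ast

def branch_count(source: str) -> int:
    """Return 1 plus the number of branching nodes in the parsed source."""
    tree = ast.parse(source)
    branches = sum(
        isinstance(node, (ast.If, ast.For, ast.While, ast.BoolOp, ast.Try))
        for node in ast.walk(tree)
    )
    return 1 + branches

example = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
# The elif is parsed as a nested If node, so this source contains two
# branching nodes and the metric evaluates to 3.
```

Tracking such a number over successive versions of an artefact is one simple way to record the kind of complexity evidence discussed above.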
Abstract:
Supply chain risk management has emerged as an increasingly important issue in logistics as disruptions in the supply chain have become critical issues for many companies. The scientific literature on the subject is developing and in many respects the understanding of it is still in its infancy. Thus, there is a need for more information in order for scholars and practitioners to understand the causalities and interrelations that characterise the phenomenon. The aim of this dissertation is to narrow this gap by exploring key aspects of supply chain risk management through two maritime supply chains in the immediate region of the Gulf of Finland. The study contributes to the field in three different ways. Firstly, it facilitates the identification of risks on different levels of the supply chain through a systematic analysis of the processes and actors, and of the cognitive barriers that limit the actors’ visibility and their understanding of the operations and the risks involved. There is a clear need to increase collaboration and information exchange in order to improve visibility in the chain. Risk management should be a collaborative effort among the individual actors, aimed at obtaining a holistic picture. Secondly, the study contributes to the literature on risk analysis through the use of systemic frameworks that illustrate the causalities and linkages in the system, thereby making it easier to perceive the vulnerabilities. Thirdly, the study enhances current knowledge of risk control in identifying actor roles, risk visibility and risk controllability as being among the key factors determining risk-management effectiveness against supply-chain vulnerability. This dissertation is divided into two parts. The first part gives a general overview of the relevant literature, the research design and the conclusions of the study, and the second part comprises six research publications. 
A case-study methodology with a systematic combining approach is used, with in-depth interviews, questionnaires and expert panel sessions as the main data collection methods. The study illustrates the current state of risk management in multimodal maritime supply chains and develops frameworks for further analysis. The results imply that there are major differences between organizations in their ability to execute supply chain risk management. Further collaboration should be considered in order to facilitate the development of systematic and effective management processes.
Abstract:
Customer satisfaction has been a widely studied concept due to its importance for business performance. Customer satisfaction should ideally lead to customer loyalty and have a positive effect on business profitability and growth. This study investigates customer satisfaction and loyalty in Do-It-Yourself retailing in the Russian market. The "K-rauta" retail chain was chosen as the focus company for this study. The goal of the study was to investigate what creates customer satisfaction in this market and what the roles of quality, trust and satisfaction are in creating customer loyalty. The role of the internet in the consumer purchasing process was also investigated. Furthermore, consumer preferences towards new marketing solutions, such as smartphone applications, were briefly examined.