Abstract:
The aim of the thesis was to identify the processes of the service organization of Larox Oy, Lappeenranta, and the interfaces between those processes. Processes, process development, and innovation were first examined on the basis of the literature. A simple methodology based on the mind mapping technique was developed for presenting the interfaces clearly. The state of the current processes and their interfaces were analyzed and documented by interviewing Larox Oy's employees and customers and by studying process descriptions and other relevant documents. Based on the results of the analysis, the biggest problem areas in the interfaces were identified and possible solutions to them were considered. Small process development initiatives were developed in cooperation with Larox Oy's employees. The thesis concludes by considering possible future operating models for Larox Oy's service organization.
Abstract:
The research problem was how knowledge management can support the product development process: what are the key factors, both in the knowledge environment and in the knowledge itself, that particularly affect value creation in the product development process and the development of its processes? The study is a qualitative case study. The research problems were first examined with the help of the literature, after which a theoretical framework was constructed to study the delimited problem area in the case company. The material of the empirical study consists mainly of data from personal thematic interviews. The findings on the most significant obstacles to knowledge utilization, as well as the improvement proposals, are classified according to the assumed factors presented in the theoretical framework. The answers obtained in the interviews support the view, derived from the literature and from a professional in the field, of the most important influencing factors. The most important measures and initiatives for improving knowledge creation concerned above all the external conditions of work rather than the knowledge creation process itself. The most significant obstacles were problems related to culture, physical and mental space, and human resources. Solutions to the problems were expected mainly from information technology, human resources, and the shaping of the knowledge itself. Classifying and interpreting the core knowledge flows and knowledge assets of the product development process with the help of the Learning Spiral, which describes knowledge creation, provided mainly theoretical indications of the means available for increasing and sharing knowledge by knowledge type. Based on the results, the case company should pay particular attention to documenting and sharing knowledge, especially knowledge that is held by only a few people in the organization and/or is highly tacit in nature.
Abstract:
The objective of the thesis was to explore the nature and characteristics of customer-related internal communication in a global industrial matrix organization during a specific customer relationship, and how it could be improved. The theoretical part of the study reviews the concepts of intra-organizational information and knowledge sharing. It also reviews the influence of internal communication on customer relationships, its problems, and the suggestions made in the literature for improving internal communication. The empirical part of the study was conducted with content analysis and social network analysis as research methods. The data were collected through interviews and a questionnaire. Internal communication was observed first generally within the organization from the point of view of a certain business, and secondly, during a specific customer relationship at the personal and departmental levels. The results of the study describe the nature and characteristics of internal communication in the organization and give 13 suggestions for improving it. Although the study was conducted in one specific organization, it also offers insights that other organizations and managers can use to improve their internal communication.
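As an illustration of the social network analysis step, the sketch below builds a directed communication graph from coded questionnaire ties and computes two standard centrality measures. The tie list, the unit names, and the use of the networkx library are assumptions made for illustration; the thesis does not specify its tooling.

```python
# Minimal social network analysis sketch, assuming questionnaire responses
# have been coded as directed "who communicates with whom" ties.
import networkx as nx

# Hypothetical ties: (sender, receiver, weekly contact frequency)
ties = [
    ("Sales", "ProjectMgmt", 5),
    ("ProjectMgmt", "Engineering", 8),
    ("Engineering", "Sales", 1),
    ("ProjectMgmt", "AfterSales", 2),
    ("AfterSales", "Sales", 4),
]

G = nx.DiGraph()
G.add_weighted_edges_from(ties)

# Degree centrality: how connected each unit is overall.
print(nx.degree_centrality(G))
# Betweenness centrality: which units broker communication between others.
print(nx.betweenness_centrality(G))
```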
Abstract:
BACKGROUND: Reducing the fraction of transmissions during recent human immunodeficiency virus (HIV) infection is essential for the population-level success of "treatment as prevention". METHODS: A phylogenetic tree was constructed with 19 604 Swiss sequences and 90 994 non-Swiss background sequences. Swiss transmission pairs were identified using 104 combinations of genetic distance (1%-2.5%) and bootstrap (50%-100%) thresholds, to examine the effect of those criteria. Monophyletic pairs were classified as recent or chronic transmission based on the time interval between estimated seroconversion dates. Logistic regression with adjustment for clinical and demographic characteristics was used to identify risk factors associated with transmission during recent or chronic infection. FINDINGS: Seroconversion dates were estimated for 4079 patients on the phylogeny, yielding between 71 (distance, 1%; bootstrap, 100%) and 378 (distance, 2.5%; bootstrap, 50%) transmission pairs. We found that 43.7% (range, 41%-56%) of the transmissions occurred during the first year of infection. A stricter phylogenetic definition of transmission pairs was associated with a higher recent-phase transmission fraction. Chronic-phase viral load area under the curve (adjusted odds ratio, 3; 95% confidence interval, 1.64-5.48) and time to antiretroviral therapy (ART) start (adjusted odds ratio, 1.4 per year; 95% confidence interval, 1.11-1.77) were associated with chronic-phase transmission as opposed to recent transmission. Importantly, at least 14% of the chronic-phase transmission events occurred after the transmitter had interrupted ART. CONCLUSIONS: We demonstrate a high fraction of transmission during recent HIV infection but also chronic-phase transmissions after interruption of ART in Switzerland. Both represent key issues for treatment as prevention and underline the importance of early diagnosis and of early and continuous treatment.
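To make the pair-classification logic concrete, here is a minimal sketch that applies one genetic distance/bootstrap threshold combination and labels each pair as recent- or chronic-phase by the seroconversion interval. The data records and the specific threshold values within the abstract's ranges are assumptions for illustration.

```python
# Toy classification of candidate transmission pairs, mirroring the
# thresholds described in the abstract. Data are hypothetical.
from datetime import date

pairs = [
    # (genetic distance, bootstrap support,
    #  transmitter seroconversion, recipient seroconversion)
    (0.008, 0.99, date(2010, 3, 1), date(2010, 9, 15)),
    (0.021, 0.72, date(2008, 5, 10), date(2012, 1, 20)),
]

DIST_MAX = 0.015   # one point in the abstract's 1%-2.5% distance range
BOOT_MIN = 0.90    # one point in the abstract's 50%-100% bootstrap range
RECENT_DAYS = 365  # transmission within the first year counts as recent

for dist, boot, t_sero, r_sero in pairs:
    if dist <= DIST_MAX and boot >= BOOT_MIN:
        phase = "recent" if (r_sero - t_sero).days <= RECENT_DAYS else "chronic"
        print(f"pair accepted, {phase}-phase (distance {dist}, bootstrap {boot})")
    else:
        print(f"pair rejected (distance {dist}, bootstrap {boot})")
```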
Abstract:
This thesis gathers knowledge about ongoing high-temperature reactor projects around the world. Methods for calculating coolant flow and heat transfer inside a pebble-bed reactor core are also developed. The thesis begins with an introduction to high-temperature reactors, including the current state of the technology. Process heat applications that could use the heat from a high-temperature reactor are also introduced. A suitable reactor design, with data available in the literature, is selected for the calculation part of the thesis. The commercial computational fluid dynamics software Fluent is used for the calculations. The pebble bed is approximated as a packed bed, which introduces sink terms into the momentum equations of the gas flowing through it. A position-dependent value is used for the packing fraction. Two different models are used to calculate heat transfer. First, local thermal equilibrium is assumed between the gas and solid phases and a single energy equation is used. In the second approach, separate energy equations are used for the two phases. The calculations yield information about steady-state flow behavior, pressure loss, and the temperature distribution in the core. The effect of the inlet mass flow rate on the pressure loss is also investigated. The results correspond quite well with data found in the literature, considering the number of simplifications made in the calculations. The models developed in this thesis can be used to solve coolant flow and heat transfer in a pebble-bed reactor, although further development and model validation are needed for better accuracy and reliability.
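The abstract does not name the correlation behind the momentum sink terms; a common choice for packed beds, shown here purely as an assumption, is the Ergun equation. With superficial velocity $u$, void fraction $\varepsilon$ (one minus the packing fraction), pebble diameter $d_p$, gas viscosity $\mu$, and density $\rho$, the pressure loss per unit length is

\[
\frac{\Delta p}{L} \;=\; 150\,\frac{\mu\,(1-\varepsilon)^{2}}{\varepsilon^{3} d_p^{2}}\,u \;+\; 1.75\,\frac{\rho\,(1-\varepsilon)}{\varepsilon^{3} d_p}\,u^{2}.
\]

With the position-dependent packing fraction mentioned in the abstract, $\varepsilon$ varies across the bed, so the sink-term coefficients vary with position as well.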
Abstract:
Nanoparticles offer an adjustable and expandable reactive surface area compared to the more traditional solid-phase forms utilized in bioaffinity assays, owing to their high surface-to-volume ratio. The versatility of nanoparticles is further improved by the ability to incorporate various molecular complexes, such as luminophores, into the core. Nanoparticle labels composed of polystyrene, silica, or inorganic crystals doped with a high number of luminophores, preferably lanthanide(III) complexes, are employed in bioaffinity assays. Other label species, such as semiconductor crystals (quantum dots) or colloidal gold clusters, are also utilized. The surface derivatization of such particles with biomolecules is crucial for their applicability to bioaffinity assays. The effectiveness of a coating depends on the biomolecule, the particle surface characteristics, and the selected coupling technique. The most critical aspects of particle labels in bioaffinity assays are their size-dependent features. For polystyrene, silica, and inorganic phosphor particles, these include the kinetics, specific activity, and colloidal stability. For quantum dots and gold colloids, the spectral properties also depend on particle size. This study reports the utilization of europium(III)-chelate-doped nanoparticle labels in the development of bioaffinity assays. The experimental part covers both heterogeneous and homogeneous assay formats, elucidating the wide applicability of the nanoparticles. The employment of europium(III) nanoparticles in heterogeneous assays for viral antigens, adenovirus hexon and hepatitis B surface antigen (HBsAg), resulted in a sensitivity improvement of 10-1000-fold compared to the reference methods. This improvement was attributed to the extreme specific activity and enhanced monovalent affinity of the nanoparticle conjugates. The applicability of europium(III)-chelate-doped nanoparticles to homogeneous assay formats was demonstrated in two completely different experimental settings: assays based on immunological recognition and assays based on proteolytic activity. It was shown that, in addition to small-molecule acceptors, particulate acceptors may also be employed, because the high specific activity of the particles promotes proximity-induced reabsorptive energy transfer in addition to non-radiative energy transfer. The proteolytic activity assay relied on a novel dual-step FRET concept, wherein streptavidin-derivatized europium(III)-chelate-doped nanoparticles were used as donors for peptide substrates modified with biotin, a primary acceptor compliant with the terminal europium emission, and a secondary quencher acceptor. The recorded sensitized emission was proportional to the enzyme activity, and the assay response to various inhibitor doses was in agreement with values found in the literature, showing the feasibility of the technique. Experiments regarding the impact of donor particle size on the extent of direct donor fluorescence and reabsorptive excitation interference in a FRET-based application were conducted with differently sized europium(III)-chelate-doped nanoparticles. It was shown that the size effect was minimal.
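For orientation, the non-radiative energy transfer exploited in these homogeneous assays follows the standard Förster distance dependence; with donor-acceptor separation $r$ and Förster radius $R_0$ (the separation at which the transfer efficiency is 50 %), the efficiency is

\[
E \;=\; \frac{1}{1 + (r/R_0)^{6}}.
\]

This is a textbook relation rather than a result of the thesis; the dual-step concept above chains two such transfer steps, and the reabsorptive (radiative) pathway it also exploits does not follow this distance law.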
Abstract:
Quality of life is increasingly becoming a concept researched empirically and theoretically in the field of economics. In urban economics in particular, this growing interest stems mainly from the fact that quality of life affects urban competitiveness and urban growth: research shows that when households and businesses decide where to locate, quality of life considerations can play a very important role. The purpose of the present paper is to examine the way economic literature, and urban economic literature in particular, has adopted quality of life considerations into economic thinking. Moreover, it presents the ways various studies have attempted to capture the multidimensional nature of the concept and quantify it for the purposes of empirical research. Additionally, we focus on the state of the art in Spain. Looking at the experience of recent years, we see very promising possibilities for developing new studies in the field.
Abstract:
The aim of the work is to study the analytical calculation procedures found in the literature for calculating the eddy-current losses in surface-mounted permanent magnets in a permanent magnet synchronous machine (PMSM) application. The most promising algorithms are implemented in MATLAB using the dimensional data of the LUT prototype machine. In addition, finite element analysis, performed with the Flux 2D software from Cedrat Ltd, is applied to calculate the eddy-current losses in the permanent magnets. The results obtained from the analytical methods are compared with the numerical results.
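As a point of reference for the analytical approach, the sketch below evaluates the classical, resistance-limited eddy-current loss estimate for a thin conducting slab in a sinusoidal field. This is a textbook first-order formula, not one of the thesis's algorithms, which account for magnet geometry and field harmonics; all numerical values are assumed.

```python
# Classical eddy-current loss density for a thin slab under a sinusoidal
# flux density, valid when skin effect is negligible.
import math

def eddy_loss_density(f_hz, b_peak_t, thickness_m, resistivity_ohm_m):
    """Average eddy-current loss per unit volume, W/m^3:
    P = pi^2 * f^2 * d^2 * B^2 / (6 * rho)."""
    return (math.pi ** 2 * f_hz ** 2 * thickness_m ** 2 * b_peak_t ** 2) \
        / (6.0 * resistivity_ohm_m)

# Illustrative NdFeB magnet segment: 5 mm thick, resistivity 1.4e-6 ohm*m,
# 0.05 T flux-density ripple at a 1 kHz slot-harmonic frequency.
print(eddy_loss_density(1000.0, 0.05, 5e-3, 1.4e-6), "W/m^3")
```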
Abstract:
Organic food products are highly susceptible to fraud. Currently, administrative controls are conducted to detect fraud, but an analytical tool able to verify the organic identity of food would be very supportive. The state of the art in food authentication relies on fingerprinting approaches that find characteristic analytical patterns to unequivocally identify authentic products. While extensive research on authentication has been conducted for other commodities, the authentication of organic chicken products is still in its infancy. Challenges include finding fingerprints that discriminate organic from conventional products, and recruiting sample sets that cover natural variability. Future research might be oriented towards developing new authentication models for organic feed, eggs, and chicken meat, keeping the models updated, and implementing them into regulations. Meanwhile, these models could strongly support administrative controls by directing inspections towards suspect samples.
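To illustrate what such a fingerprinting model can look like in practice, the sketch below trains a common chemometric pipeline (scaling, PCA compression, linear discriminant classification) on synthetic "spectral" data. The data, class shift, and pipeline are illustrative assumptions, not the models used in the cited work.

```python
# Toy organic-vs-conventional fingerprint classifier on synthetic spectra.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))    # 120 samples x 300 spectral variables
X[:60] += 0.3                      # small systematic shift for the "organic" class
y = np.array([1] * 60 + [0] * 60)  # 1 = organic, 0 = conventional

model = make_pipeline(StandardScaler(),
                      PCA(n_components=10),
                      LinearDiscriminantAnalysis())
# Cross-validated accuracy estimates how well the fingerprint separates classes.
print(cross_val_score(model, X, y, cv=5).mean())
```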
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for an increasing development cycle pace place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50 % of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to evaluate and develop validation and verification capability in display module development projects. The definition and background of validation and verification are studied in this research. Additionally, theories of project management, systems, organisational learning, and causality are reviewed. The framework and the key findings of this research are presented, and a feedback system based on the framework is defined and implemented at the case company. This research is divided into a theory part and an empirical part. The theory part is conducted as a literature review and the empirical part as a case study; constructive and design research methods are used. A framework for capability evaluation and development was defined and developed as a result of this research. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution. As additional results, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of this study. The most important one concerns the selected research method and the selected model itself: the final state can be normative, as the researcher may set the study results before the actual study and, in the initial state, describe expectations for it. Finally, the reliability and validity of this work are discussed.
Abstract:
Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often based on theoretical descriptions of the methods. An engineer, however, requires some evidence from experimental evaluations in order to make the appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyzed a set of representative state-of-the-art techniques for the problem we are dealing with, namely, the road passenger transportation problem. This is an optimization problem in which drivers must be assigned to transport services, fulfilling some constraints and minimizing some cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on our experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
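To make the problem structure concrete, the sketch below solves the unconstrained core of the driver-to-service assignment with SciPy's Hungarian-algorithm solver. Real road passenger transportation instances add constraints (driving-time rules, breaks, vehicle compatibility) that require the richer techniques surveyed in the paper; the cost matrix here is hypothetical.

```python
# Minimal driver-to-service assignment minimizing total cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i][j]: cost of assigning driver i to service j (hypothetical values)
cost = np.array([
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
])

rows, cols = linear_sum_assignment(cost)  # optimal one-to-one assignment
for d, s in zip(rows, cols):
    print(f"driver {d} -> service {s} (cost {cost[d, s]})")
print("total cost:", cost[rows, cols].sum())
```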
Abstract:
With this article we want to present the state of affairs of the didactics of art in our context, and at the same time to discuss its pluri- and interdisciplinary construction. We review the different disciplines that configure it and analyse the paradigm of artistic education as a discipline (DBAE) and its passage to post-modernity. This example focuses the discussion on the opportunity of adopting holistic educational models and the transition of current educational innovation towards skill-based models.
Abstract:
The size and complexity of software development projects are growing very fast. At the same time, the proportion of successful projects is still quite low according to previous research. Although almost every project team knows the main areas of responsibility that would help to finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for evaluating the chances of success that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus). The second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was done as an empirical study. A survey with structured forms and theme-based interviews were used as the data collection methods. The information gathering was done in two stages. In the first stage, the project manager or someone with similar responsibilities answered the questions over the Internet. In the second stage, the participant was interviewed, and his or her answers were discussed and refined. This made it possible to obtain accurate information about each project and to avoid errors. It was found that there are many problems in software development projects. These problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality, and budget. A comparison of two models of success prediction showed that The Standish Group model overestimates problems in projects. At the same time, McConnell's model can help to identify problems in time and avoid trouble in the future. A framework for evaluating the chances of success in distributed projects was suggested. The framework is similar to The Standish Group model but customized for distributed projects.
Abstract:
Specific combustion programs (Gaseq: Chemical Equilibria in Perfect Gases, by Chris Morley) are used to model dioxin and furan formation in the incineration of urban solid wastes. Thanks to these programs, it is possible to establish correlations with the formation mechanisms postulated in the literature on the subject. It was found that minimum oxygen quantities are required to obtain a significant formation of these compounds, and that more furans than dioxins are formed. Likewise, dioxin and furan formation is related to the presence of carbon monoxide, and the distribution of dioxins and furans among their different congeners depends on the relative composition of chlorine and hydrogen. This is because increased chlorine availability leads to the formation of compounds with a higher chlorine content (penta-, hexa-, hepta-, and octachlorides), whereas increased hydrogen availability leads to the formation of compounds with a lower chlorine number (mono-, di-, tri-, and tetrachlorides).
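For readers unfamiliar with such tools, the sketch below runs the same kind of fixed-temperature-and-pressure equilibrium computation that Gaseq-style programs perform, using the Cantera library with the GRI-3.0 mechanism. GRI-3.0 contains no chlorine chemistry, so this cannot reproduce dioxin or furan predictions; it only illustrates the Gibbs-minimization workflow, and the mixture and conditions are assumed.

```python
# Equilibrium of an oxygen-starved methane-air mixture at fixed T and p,
# analogous to the equilibrium calculations performed by Gaseq.
import cantera as ct

gas = ct.Solution("gri30.yaml")
# Sub-stoichiometric mixture (stoichiometry would need O2:2.0) at an
# incinerator-like temperature.
gas.TPX = 1100.0, ct.one_atm, "CH4:1.0, O2:1.5, N2:5.64"
gas.equilibrate("TP")  # minimize Gibbs energy at constant T and p

# CO appears at equilibrium when oxygen is limited, echoing the abstract's
# link between carbon monoxide and the conditions favoring these compounds.
print(f"CO mole fraction: {gas['CO'].X[0]:.4e}")
```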