934 results for Reality in literature


Relevance:

90.00%

Publisher:

Abstract:

The research problem was how knowledge management can support the product development process: which key factors, both in the knowledge environment and in the knowledge itself, are significant particularly for value creation in the product development process and for process development? The study is a qualitative case study. The research problems were first examined with the help of the literature, after which a theoretical framework was constructed for studying a delimited problem area in the case company. The material of the empirical study consists mainly of data from personal thematic interviews. The results on the most significant obstacles to knowledge utilization, as well as the improvement proposals, are categorized according to the assumed factors presented in the theoretical framework. The answers obtained in the interviews support the understanding of the most important influencing factors gained from the literature and from a professional in the field. The most important measures and initiatives for improving knowledge creation concerned above all the external conditions of work rather than the knowledge creation process itself. The most significant obstacles were problems related to culture, physical and mental space, and human resources. Solutions to the problems were expected mainly from information technology, human resources, and the reshaping of the knowledge itself. Classifying and interpreting the core knowledge flows and knowledge assets of the product development process with the Learning Spiral, which describes knowledge creation, mainly gave theoretical indications of the means available for increasing and sharing knowledge by knowledge type. Based on the results, the case company should pay particular attention to documenting and sharing knowledge, especially knowledge that is held by only a few people in the organization and/or is highly tacit in nature.

Relevance:

90.00%

Publisher:

Abstract:

The thesis examines the risk of losing the intellectual capital concentrated in the key personnel of small and medium-sized enterprises and seeks means of managing this risk. In SMEs, knowledge that is critical to the company is often concentrated in a few individuals. Losing the work contribution and competence of such key persons entirely may be fatal to the company's operations, and even a temporary loss of competence can hamper them. The risk of losing intellectual capital can be reduced either by transferring the key person's knowledge to other people or by ensuring that the key person's employment continues for as long as possible. The methods of knowledge transfer depend on the nature of the knowledge: whether it is tacit or explicit. The retention of a key person is affected by the company's personnel policy, under which employees are compensated for their work and commit to the company. Taking care of an employee's ability to work also affects whether the knowledge remains available to the company. The interviews at the example company reveal points of contact with the knowledge management problems discussed in the other parts of the study.

Relevance:

90.00%

Publisher:

Abstract:

The objective of the thesis was to explore the nature and characteristics of customer-related internal communication in a global industrial matrix organization during a specific customer relationship, and how it could be improved. The theoretical part of the study reviews the concepts of intra-organizational information and knowledge sharing. It also reviews the influence of internal communication on customer relationships, its problems, and the suggestions found in the literature for improving internal communication. The empirical part of the study used content analysis and social network analysis as research methods. The data was collected through interviews and a questionnaire. Internal communication was observed first generally within the organization, from the point of view of a certain business, and then during a specific customer relationship at the personal and departmental levels. The results describe the nature and characteristics of internal communication in the organization and give 13 suggestions for improving it. Although the study was done in one specific organization, it also offers insights that other organizations and managers can use to improve their internal communication.
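
As a rough illustration of the social network analysis method mentioned above, the sketch below computes degree centrality over a toy communication network with networkx; the departments and ties are hypothetical stand-ins, not data from the study.

```python
import networkx as nx

# Hypothetical ties: which departments report exchanging
# customer-related information with each other.
edges = [
    ("Sales", "Engineering"), ("Sales", "ProjectMgmt"),
    ("ProjectMgmt", "Engineering"), ("ProjectMgmt", "Service"),
    ("Service", "Sales"),
]
G = nx.Graph(edges)

# Degree centrality: the share of other units a department talks to.
# Units with low centrality are candidate communication bottlenecks.
for unit, c in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{unit}: {c:.2f}")
```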

Relevance:

90.00%

Publisher:

Abstract:

This thesis gathers knowledge about ongoing high-temperature reactor projects around the world, and methods for calculating coolant flow and heat transfer inside a pebble-bed reactor core are developed. The thesis begins with an introduction to high-temperature reactors, including the current state of the technology, and to process heat applications that could use the heat from a high-temperature reactor. A suitable reactor design, with data available in the literature, is selected for the calculation part of the thesis. The commercial computational fluid dynamics software Fluent is used for the calculations. The pebble bed is approximated as a packed bed, which introduces sink terms into the momentum equations of the gas flowing through it. A position-dependent value is used for the packing fraction. Two different models are used to calculate heat transfer: in the first, a local thermal equilibrium is assumed between the gas and solid phases and a single energy equation is used; in the second, separate energy equations are used for the two phases. The calculations yield information about steady-state flow behavior, pressure loss, and the temperature distribution in the core. The effect of the inlet mass flow rate on pressure loss is also investigated. The results correspond quite well with data found in the literature, considering the number of simplifications in the calculations. The models developed in this thesis can be used to solve coolant flow and heat transfer in a pebble-bed reactor, although additional development and model validation are needed for better accuracy and reliability.
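
As a sketch of how a packed-bed momentum sink can be modelled, the snippet below evaluates the Ergun correlation for the pressure gradient. The abstract does not state which correlation the thesis uses; Ergun is simply a common choice, the function name is mine, and all numeric values are illustrative assumptions.

```python
def ergun_pressure_gradient(u, d_p, eps, rho, mu):
    """Pressure gradient (Pa/m) across a packed bed via the Ergun
    correlation, one common form of the momentum sink term for a
    pebble bed. The thesis may use a different correlation.

    u    superficial gas velocity (m/s)
    d_p  pebble diameter (m)
    eps  local void fraction, 1 - packing fraction
    rho  gas density (kg/m^3)
    mu   gas dynamic viscosity (Pa*s)
    """
    viscous = 150.0 * mu * (1.0 - eps) ** 2 / (eps ** 3 * d_p ** 2) * u
    inertial = 1.75 * rho * (1.0 - eps) / (eps ** 3 * d_p) * u * abs(u)
    return viscous + inertial

# Illustrative values only: 6 cm pebbles and a helium-like coolant.
print(ergun_pressure_gradient(u=1.0, d_p=0.06, eps=0.39, rho=3.5, mu=4e-5))
```

A position-dependent packing fraction, as used in the thesis, would simply make eps a function of the radial distance from the core walls.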

Relevance:

90.00%

Publisher:

Abstract:

This paper is a literature review that describes the state of the art in the construction of permanent magnet generators and motors and discusses current and potential applications of these machines in industry. Permanent magnet machines are a well-known class of rotating and linear electric machines that have been used in industrial applications for many years. Interest in permanent magnet generators is particularly connected with wind turbines, which are becoming increasingly popular. Geared and direct-driven permanent magnet generators are described, and a classification of direct-driven permanent magnet generators is given. Design aspects of permanent magnet generators are presented, with emphasis on designs for wind turbines. Dynamics and vibration problems of permanent magnet generators covered in the literature are presented, and the application of the finite element method to the solution of mechanical problems in permanent magnet generators is discussed.

Relevance:

90.00%

Publisher:

Abstract:

Nanoparticles offer an adjustable and expandable reactive surface area compared to the more traditional solid-phase forms utilized in bioaffinity assays, owing to their high surface-to-volume ratio. The versatility of nanoparticles is further improved by the ability to incorporate various molecular complexes, such as luminophores, into the core. Nanoparticle labels composed of polystyrene, silica, or inorganic crystals doped with a high number of luminophores, preferably lanthanide(III) complexes, are employed in bioaffinity assays. Other label species, such as semiconductor crystals (quantum dots) or colloidal gold clusters, are also utilized. The surface derivatization of such particles with biomolecules is crucial for their applicability to bioaffinity assays. The effectiveness of a coating depends on the characteristics of the biomolecule and the particle surface and on the selected coupling technique. The most critical aspects of particle labels in bioaffinity assays are their size-dependent features. For polystyrene, silica, and inorganic phosphor particles, these include the kinetics, specific activity, and colloidal stability; for quantum dots and gold colloids, the spectral properties also depend on particle size. This study reports the utilization of europium(III)-chelate-doped nanoparticle labels in the development of bioaffinity assays. The experimental part covers both heterogeneous and homogeneous assay formats, elucidating the wide applicability of the nanoparticles. The employment of europium(III) nanoparticles in heterogeneous assays for viral antigens, adenovirus hexon and hepatitis B surface antigen (HBsAg), resulted in a 10-1000-fold sensitivity improvement compared to the reference methods. This improvement was attributed to the extreme specific activity and enhanced monovalent affinity of the nanoparticle conjugates. The applicability of europium(III)-chelate-doped nanoparticles to homogeneous assay formats was demonstrated in two completely different experimental settings: assays based on immunological recognition and assays based on proteolytic activity. It was shown that, in addition to small-molecule acceptors, particulate acceptors may also be employed, because the high specific activity of the particles promotes proximity-induced reabsorptive energy transfer in addition to non-radiative energy transfer. The proteolytic activity assay relied on a novel dual-step FRET concept, wherein streptavidin-derivatized europium(III)-chelate-doped nanoparticles were used as donors for peptide substrates modified with biotin, a primary acceptor compliant with the terminal europium emission, and a secondary quencher acceptor. The recorded sensitized emission was proportional to the enzyme activity, and the assay response to various inhibitor doses was in agreement with values found in the literature, showing the feasibility of the technique. Experiments regarding the impact of donor particle size on the extent of direct donor fluorescence and reabsorptive excitation interference in a FRET-based application were conducted with differently sized europium(III)-chelate-doped nanoparticles, and the size effect was shown to be minimal.
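
The distance dependence that the FRET-based homogeneous assays above exploit follows the standard Förster relation E = 1 / (1 + (r/R0)^6). The minimal sketch below is a textbook illustration of that relation, not a computation from the thesis; the distances and Förster radius are arbitrary example values.

```python
def fret_efficiency(r, r0):
    """Förster resonance energy transfer efficiency for a
    donor-acceptor distance r and Förster radius r0 (same units):
    E = 1 / (1 + (r / r0)^6). Illustrates the steep distance
    dependence that proximity-based homogeneous assays rely on."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# At the Förster radius transfer is 50 % efficient; beyond it,
# efficiency collapses with the sixth power of distance.
for r_nm in (3, 5, 7, 10):
    print(r_nm, "nm:", round(fret_efficiency(r_nm, r0=5.0), 3))
```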

Relevance:

90.00%

Publisher:

Abstract:

The aim of the work is to study the existing analytical calculation procedures found in the literature for calculating the eddy-current losses in surface-mounted permanent magnets in PMSM applications. The most promising algorithms are implemented in MATLAB using the dimensional data of the LUT prototype machine. In addition, finite element analysis, carried out with the Flux 2D software from Cedrat Ltd, is applied to calculate the eddy-current losses in the permanent magnets. The results obtained from the analytical methods are compared with the numerical results.
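
For orientation, the classical thin-plate estimate of eddy-current loss density is sketched below. This is a first-order textbook relation only, not one of the analytical procedures evaluated in the work, and the material values in the example are assumed.

```python
import math

def eddy_loss_density(b_peak, freq, thickness, sigma):
    """Classical thin-plate eddy-current loss density (W/m^3):
    p = (pi^2 * sigma * d^2 * f^2 * B^2) / 6.
    A first-order estimate only; the analytical models studied in
    the thesis also account for effects such as the reaction field
    of the induced currents and magnet segmentation."""
    return (math.pi ** 2) * sigma * thickness ** 2 * freq ** 2 * b_peak ** 2 / 6.0

# Illustrative, assumed values: 2 mm magnet segment, NdFeB-like
# conductivity ~6.7e5 S/m, 50 mT flux ripple at a 1 kHz harmonic.
print(eddy_loss_density(b_peak=0.05, freq=1000.0, thickness=2e-3, sigma=6.7e5))
```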

Relevance:

90.00%

Publisher:

Abstract:

Validation and verification activities face various challenges in the product development process, and requirements for a faster development cycle place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50 % of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to evaluate and develop validation and verification capability in display module development projects. The definition and background of validation and verification are studied, along with theories of project management, systems, organisational learning, and causality. The framework and the key findings of the research are presented, and a feedback system based on the framework is defined and implemented in the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, conducted as a case study; the constructive and design research methods are used. As a result, a framework for capability evaluation and development was defined and developed. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; in addition, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of the study, the most important being the selected research method and model themselves: the final state can be normative, since the researcher may set the study results before the actual study and, in the initial state, describe expectations for it. Finally, the reliability and validity of the work are discussed.

Relevance:

90.00%

Publisher:

Abstract:

The size and complexity of software development projects are growing very fast, yet according to previous research the proportion of successful projects is still quite low. Although almost every project team knows the main areas of responsibility that would help to finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for evaluating the chances of success that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus); the second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was done as an empirical study, using a survey with structured forms and theme-based interviews as the data collection methods. The information gathering was done in two stages: first, a project manager or someone with similar responsibilities answered the questions over the Internet; second, the participant was interviewed, and his or her answers were discussed and refined. This made it possible to obtain accurate information about each project and to avoid errors. It was found that there are many problems in software development projects; these problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality, and budget. A comparison of two models of success prediction showed that The Standish Group model overestimates problems in a project, while McConnell's model can help to identify problems in time and avoid trouble in the future. A framework for evaluating the chances of success in distributed projects was suggested; it is similar to The Standish Group model but customized for distributed projects.

Relevance:

90.00%

Publisher:

Abstract:

Specific combustion programs (Gaseq, Chemical equilibria in perfect gases, Chris Morley) are used to model dioxin and furan formation in the incineration of urban solid wastes. With these programs, it is possible to establish correlations with the formation mechanisms postulated in the literature on the subject. It was found that minimum oxygen quantities are required to obtain a significant formation of these compounds and that more furans than dioxins are formed. Likewise, dioxin and furan formation is related to the presence of carbon monoxide, and the distribution of dioxins and furans among their different compounds depends on the relative composition of chlorine and hydrogen. This is because an increased chlorine availability leads to the formation of compounds with a higher chlorine content (penta-, hexa-, hepta-, and octachlorides), whereas an increased hydrogen availability leads to the formation of compounds with a lower chlorine number (mono-, di-, tri-, and tetrachlorides).
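
Equilibrium programs like those mentioned above work by minimising the total Gibbs energy of a perfect-gas mixture subject to element balances. A minimal sketch of that principle follows, using a toy CO/CO2/O2 system; the dimensionless chemical potentials and the feed are placeholders, not real thermodynamic data.

```python
# Toy Gibbs-energy minimisation over a perfect-gas mixture, the
# principle behind equilibrium solvers such as Gaseq. Species set,
# mu0 values, and feed composition are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize

species = ["CO", "CO2", "O2"]
mu0 = np.array([-0.2, -0.5, 0.0])     # dimensionless mu0/RT, made up
A = np.array([[1, 1, 0],              # carbon atoms per molecule
              [1, 2, 2]])             # oxygen atoms per molecule
b = A @ np.array([1.0, 0.0, 0.6])     # element totals from an assumed feed

def gibbs(n):
    n = np.clip(n, 1e-12, None)
    # Total G/RT for an ideal-gas mixture at unit pressure.
    return float(n @ (mu0 + np.log(n / n.sum())))

res = minimize(gibbs, x0=np.array([0.4, 0.4, 0.3]),
               constraints={"type": "eq", "fun": lambda n: A @ n - b},
               bounds=[(1e-12, None)] * 3, method="SLSQP")
print(dict(zip(species, res.x.round(4))))
```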

Relevance:

90.00%

Publisher:

Abstract:

Currently, numerous high-throughput technologies are available for the study of human carcinomas, and many variations of these techniques have been described in the literature. The common denominator of these methodologies is the large amount of data obtained in a single experiment, in a short time, and at fairly low cost. However, several problems and limitations of these methods have also been described. The purpose of this study was to test the applicability of two selected high-throughput methods, cDNA and tissue microarrays (TMA), in cancer research, using two common human malignancies, breast and colorectal cancer, as examples. This thesis aims to present some practical considerations that need to be addressed when applying these techniques. cDNA microarrays were applied to screen for aberrant gene expression in breast and colon cancers. Immunohistochemistry was used to validate the results and to evaluate the association of selected novel tumour markers with patient outcome. The type of histological material used in immunohistochemistry was evaluated, especially considering the applicability of whole tissue sections and different types of TMAs. Special attention was paid to the methodological details of the cDNA microarray and TMA experiments. In conclusion, many potential tumour markers were identified in the cDNA microarray analyses. Immunohistochemistry could be applied to validate the observed gene expression changes of selected markers and to associate their expression changes with patient outcome; in the current experiments, both TMAs and whole tissue sections could be used for this purpose. This study showed for the first time that securin and p120 catenin protein expression predict breast cancer outcome and that carbonic anhydrase IX immunopositivity associates with the outcome of rectal cancer. The predictive value of these proteins was statistically evident also in multivariate analyses, with up to a 13.1-fold risk of cancer-specific death in a specific subgroup of patients.
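
Hazard-ratio statements like the 13.1-fold risk above typically come from Cox proportional hazards models. A minimal sketch with the lifelines library follows; the dataframe, marker column, and all values are hypothetical stand-ins, not the study's data.

```python
# Sketch of the kind of multivariate survival analysis behind
# hazard-ratio statements; the data below is invented for illustration.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":       [12, 48, 60, 9, 30, 72, 24, 55],  # follow-up time
    "death":        [1, 0, 0, 1, 1, 0, 1, 0],         # cancer-specific death
    "securin_high": [1, 0, 1, 1, 0, 0, 1, 0],         # marker positivity
    "tumour_grade": [3, 1, 2, 2, 3, 1, 2, 3],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()  # the exp(coef) column is the hazard ratio per covariate
```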

Relevance:

90.00%

Publisher:

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics, and theology. Our basic problem is stated in the question: how is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises the extended character of representation: the human mind is not a passive receiver of external information but actively construes intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being; ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism but the classical Einsteinian version of it; by the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way of expressing knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God; our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties: we cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical, and theological positions is a form of anti-Cartesianism; consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God. The reaction against epistemological Cartesianism, the metaphysics of substance, and the deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance:

90.00%

Publisher:

Abstract:

In this article I deal with time as a notion of epistemological content, associated, though, with the notion of a subjective consciousness co-constitutive of physical reality. In this phenomenologically grounded approach I attempt to establish a 'metaphysical' aspect of time, within a strictly epistemological context, in the sense of an underlying absolute subjectivity which is non-objectifiable within objective temporality and thus not susceptible of any ontological designation. My arguments stem, on the one hand, from a version of quantum-mechanical theory (History Projection Operator theory, HPO theory), in view of its formal treatment of two different aspects of time within a quantum context: the discrete, partial-ordering properties (the notions of before and after) and the dynamical-parameter properties reflected in the wave equations of motion. On the other hand, to strengthen my arguments for a transcendental factor of temporality, I attempt an interpretation of some relevant conclusions in the work of J. Eccles ([5]) and of certain results of the experimental research of S. Dehaene et al. ([2]) and others.

Relevance:

90.00%

Publisher:

Abstract:

Preference relations, and their modeling, have played a crucial role in both the social sciences and applied mathematics. A special category of preference relations is represented by cardinal preference relations, which are relations that can also take into account the degree of preference. Preference relations play a pivotal role in most multi-criteria decision making methods and in operations research. This thesis aims at showing some recent advances in their methodology. There are a number of open issues in this field, and the contributions presented in this thesis can be grouped accordingly. The first issue regards the estimation of a weight vector from a given preference relation. A new and efficient algorithm for estimating the priority vector of a reciprocal relation, i.e. a special type of preference relation, is presented. The same section contains the proof that twenty methods already proposed in the literature lead to unsatisfactory results, as they employ a conflicting constraint in their optimization model. The second area of interest concerns consistency evaluation, and it is possibly the kernel of the thesis. The thesis contains proofs that some indices are equivalent and that, therefore, some seemingly different formulae end up leading to the very same result. Moreover, some numerical simulations are presented. The section ends with some considerations on a new method for fairly evaluating consistency. The third matter regards incomplete relations and how to estimate missing comparisons. This section reports a numerical study of the methods already proposed in the literature and analyzes their behavior in different situations. The fourth and last topic proposes a way to deal with group decision making by connecting preference relations with social network analysis.
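
For context, the sketch below shows the classical eigenvector baseline for deriving a priority vector from a multiplicative pairwise comparison matrix, together with the consistency index CI = (lambda_max - n) / (n - 1). This is the standard reference method, not the new estimation algorithm proposed in the thesis (whose reciprocal relations may be additively rather than multiplicatively reciprocal), and the matrix entries are illustrative.

```python
# Saaty-style eigenvector method: the priority vector is the normalised
# principal eigenvector of the pairwise comparison matrix.
import numpy as np

A = np.array([[1,   3,   5  ],
              [1/3, 1,   2  ],
              [1/5, 1/2, 1  ]])   # reciprocal: a_ji = 1 / a_ij

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalised priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # 0 for a perfectly consistent A
print("priorities:", w.round(3), "CI:", round(ci, 4))
```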

Relevance:

90.00%

Publisher:

Abstract:

The objective of this thesis was to study the role of capabilities in purchasing and supply management. To build a pre-understanding of the research topic, the development of purchasing and supply management and the multidimensional, unstructured, and complex nature of purchasing and supply management performance were studied in a literature review. In addition, a capability-based purchasing and supply management performance framework was researched and structured for the empirical research. Due to the unstructured nature of the research topic, the empirical research in this study is three-pronged, comprising three different research methods: the Delphi method, semi-structured interviews, and case research. As a result, a purchasing and supply management capability assessment tool was structured to measure the current level of capabilities and the impact of capabilities on purchasing and supply management performance. The final results indicate that capabilities are enablers of purchasing and supply management performance and are therefore critical to it.