903 results for Data-driven Methods
Abstract:
In Finland, the aim of the revised Waste Act is to make waste sorting more effective and to reduce waste volumes. The objective of this study was to find out how a customer can participate in the development process of a new service. The empirical part answered the questions of how willing and ready customers are to participate in service development and what feedback customers give on the service's features. The study was carried out as a case study examining the service development process of Aalto University's waste reporting and planning service (Kierrätyspalvelin). Data were collected through individual and group thematic interviews. As a result, a model of the Kierrätyspalvelin service development process was presented. The main research finding was that the interviewees acted mainly as sources of information about their own needs; they participated in testing and specifying the service's features.
Abstract:
Emergency patient care emphasises, among other things, triage, monitoring the patient's condition, making treatment decisions quickly according to the patient's condition, and securing the patient's follow-up care. This two-phase educational study focused on emergency nursing competence. In the first phase, emergency nursing competence was defined; in the second phase, the emergency nursing competence of graduating nursing students and the factors associated with it were assessed. The students assessed their competence themselves, and the emergency nursing competence of practising nurses was used as the point of comparison. Based on the assessment, the aim of the study was to determine the current level of emergency nursing competence and to make the necessary proposals for developing it. In the first phase (2006–2012), data were collected through a literature review and an expert evaluation using the Delphi method. Based on the literature review, main categories, upper categories and subcategories describing emergency nursing competence were formed. The subcategories (n=61) were given to experts (nurses, teachers, nursing directors) for evaluation, and the two-round expert evaluation produced 92 subcategories describing emergency nursing competence. In the second phase (2007–2012), the emergency nursing competence of graduating Finnish nursing students (N=382, n=208, response rate 55%) was assessed with an instrument developed for this study (the Emergency Nursing Competence instrument, Päivystyshoitotyön osaaminen -mittari). The instrument was based on the definition of emergency nursing competence formed in the first phase. Competence was measured on a visual analogue scale (VAS, range 0–100), with 100 representing the optimal target level. The target level for nursing students was set at 80, on the assumption that students' competence would continue to develop with work experience. The self-assessed competence of practising nurses (N=586, n=280, response rate 48%) was used as the point of comparison for the students' competence. The data were analysed using statistical methods. The graduating nursing students' self-assessed emergency nursing competence was below the target level. The students considered their ethical competence and their interaction and collaboration competence the strongest, and their decision-making and clinical competence the weakest. The practising nurses also rated their interaction and collaboration competence the highest, and their guidance and decision-making competence the lowest. The nurses had statistically significantly more emergency nursing competence than the students. A previous degree in health care explained the students' emergency nursing competence most strongly. The proposals for developing emergency nursing competence concern the content and amount of teaching in basic professional and continuing education, teaching and study methods, competence assessment, and career planning. Suggestions for further research concern the further development of the definition of emergency nursing competence and of the assessment instrument, the development of different assessment methods, and further study of the factors associated with competence.
Abstract:
The ongoing global financial crisis has demonstrated the importance of a system-wide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the tasks of early identification and assessment of risks and vulnerabilities that may eventually lead to a systemic financial crisis. Thriving tools are crucial as they allow early policy actions to decrease or prevent the further build-up of risks, or to otherwise enhance the shock absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, the systemic risks are matched by three categories of analytical methods for decision support: i) early-warning, ii) macro stress-testing, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet, the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on a tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to use the display as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, data and methods. The following five questions comprise subsequent steps addressed in the process of this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. This thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
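To make the dimension-reduction idea concrete, the following is a minimal sketch of the kind of Self-Organizing Map training the SOFSM builds upon. It is not the SOFSM itself: the 10×10 grid, the indicator matrix and its dimensions are hypothetical stand-ins for standardized macro-financial indicators, and the decay schedule is a common textbook choice rather than the thesis's parametrization.

```python
# Minimal self-organizing map (SOM) sketch in NumPy -- illustrates the kind of
# dimension reduction the SOFSM builds on, not the SOFSM itself. The indicator
# matrix X (rows = country-quarters, columns = standardized macro-financial
# indicators) is a hypothetical stand-in.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 14))          # hypothetical standardized indicators

rows, cols, dim = 10, 10, X.shape[1]        # 10x10 map
W = rng.standard_normal((rows, cols, dim))  # codebook vectors
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, sigma0, lr0 = 5000, 3.0, 0.5
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    # best-matching unit (BMU): the codebook vector closest to the sample
    d = np.linalg.norm(W - x, axis=-1)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    # neighbourhood radius and learning rate decay linearly over training
    frac = t / n_iter
    sigma = sigma0 * (1 - frac) + 0.5 * frac
    lr = lr0 * (1 - frac) + 0.01 * frac
    h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    W += lr * h[..., None] * (x - W)

# Projecting each observation to its BMU yields the 2-D display onto which
# crisis states, fuzzy memberships or transition probabilities could be overlaid.
bmus = np.array([np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=-1)), (rows, cols)) for x in X])
```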
Abstract:
In this thesis, mainly long quasi-periodic solar oscillations in various solar atmospheric structures are discussed, based on data obtained at several wavelengths, focusing, however, mainly on radio frequencies. Sunspot (Articles II and III) and quiet Sun area (QSA) (Article I) oscillations are investigated along with quasi-periodic pulsations (QPP) in a flaring event with wide-range radio spectra (Article IV). Various oscillation periods are detected: 3–15, 35–70 and 90 minutes (QSA), 10–60 and 80–130 minutes (in sunspots at various radio frequencies), 3–5, 10–23, 220–240, 340 and 470 minutes (in sunspots at the photosphere) and 8–12 and 15–17 seconds (in a solar flare at radio frequencies). Some of the oscillation periods are detected for the first time, while others have been confirmed earlier by other research groups. Solar oscillations can provide more information on the nature of various solar structures. This thesis presents the physical mechanisms of some solar structure oscillations. Two different theoretical approaches are chosen: magnetohydrodynamics (MHD) and the shallow sunspot model. These two theories can explain a wide range of solar oscillations, from a few seconds up to some hours. Various wave modes in loop structures cause solar oscillations (<45 minutes) both in sunspots and quiet Sun areas. Periods longer than 45 minutes in the sunspots (and a fraction of the shorter periods) are related to oscillations of the sunspot as a whole. Sometimes similar oscillation periods are detected both in sunspot area variations and in magnetic field strength changes; this result supports the concept that these oscillations are related to oscillations of the sunspot as a whole. In addition, a theory behind QPPs at radio frequencies in solar flares is presented. The thesis also covers solar instrumentation and data sources, and the data processing methods are presented. As the majority of the investigations in this thesis focus on radio frequencies, the most typical radio emission mechanisms are also presented, as are the main structures of the Sun that are related to solar oscillations. Two separate projects are included in this thesis. Solar cyclicity is studied using the extensive solar radio map archive of Metsähovi Radio Observatory (MRO) at 37 GHz between 1978 and 2011 (Article V), covering two full solar cycles. Some new solar instrumentation (Article VI) was also developed during this thesis.
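As an illustration of the period-detection step mentioned above, the sketch below runs a Lomb-Scargle periodogram over a synthetic, unevenly sampled radio intensity series. The data, the 40-minute test period and the 5–150 minute search range are invented for the example; the articles' actual data processing pipelines are not reproduced here.

```python
# Sketch of one common way to search for quasi-periodic oscillations in an
# unevenly sampled intensity time series: a Lomb-Scargle periodogram.
# The series is synthetic (a 40-minute oscillation plus noise), used only to
# illustrate the detection principle.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 600, 400)) * 60.0      # observation times [s] over 10 h
flux = np.sin(2 * np.pi * t / (40 * 60)) + 0.5 * rng.standard_normal(t.size)

periods = np.linspace(5 * 60, 150 * 60, 2000)     # trial periods: 5-150 minutes
ang_freqs = 2 * np.pi / periods                   # lombscargle expects angular frequencies
power = lombscargle(t, flux - flux.mean(), ang_freqs)

best = periods[np.argmax(power)] / 60.0
print(f"strongest period: {best:.1f} minutes")    # ~40 min for this synthetic series
```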
Abstract:
Few people see both the opportunities and the threats that IT legacy poses in the current business world. On one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as business users require. This thesis offers one perspective on dealing with IT legacy – through effective contract management as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units, and suppliers. It developed a model for assessing the impact of improvements on the contract management process, together with a set of tools and advice regarding analysis and improvement actions. A case study was conducted to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved successful, and this thesis presents and discusses all the necessary steps, and the pitfalls to avoid, to achieve breakthrough improvement in IT contract management process performance. For the IT legacy contract management process, two improvements deserve special attention and can easily be copied to any organization. The first is the issue of diluted contract ownership, which stops all improvements, as people do not know who is responsible for performing the actions. The second is the contract management performance evaluation tool, which can be used for monitoring, identifying outlying contracts, and spotting opportunities for improvement in the process. The study resulted in valuable insight into the benefits of applying Lean Six Sigma to improve IT legacy contract management, as well as into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed. It is concluded that the use of the data-driven Lean Six Sigma methodology for improving existing IT contract management processes is a significant addition to the existing best practices in contract management.
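As a hedged illustration of the kind of screening a contract management performance evaluation tool might run, the sketch below flags contracts whose cycle times fall outside the interquartile-range fences, a routine Lean Six Sigma check for outliers. The contract identifiers and figures are invented; the thesis's actual tool and metrics are not shown here.

```python
# Flag "outlying contracts" by the 1.5*IQR rule on a hypothetical cycle-time KPI.
# All names and numbers are placeholders for illustration only.
import statistics

cycle_time_days = {"C-101": 12, "C-102": 14, "C-103": 13, "C-104": 55, "C-105": 11, "C-106": 16}

q = statistics.quantiles(sorted(cycle_time_days.values()), n=4)  # [Q1, Q2, Q3]
q1, q3 = q[0], q[2]
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = {c: v for c, v in cycle_time_days.items() if v < low or v > high}
print(outliers)   # e.g. {'C-104': 55} -- a candidate for an improvement action
```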
Abstract:
The objective of this Master's thesis was to clarify and harmonise the case company's customer service processes by describing and developing them. The case company was a regional business development company that offers its customers knowledge-intensive expert and advisory services. The intended outcome of the thesis was to create uniform customer service process models for the case company. A literature review was used to answer the questions of how customer service processes can be developed and with which methods processes can be described. It was then examined how the case company can develop its customer service processes by applying these methods. It emerged that developing a service process in practice requires productising the service, which can be divided into three areas: 1) defining and standardising the service, 2) making the service and the expertise tangible and concrete, and 3) systematising and standardising processes and methods. Expert services are difficult or impossible to standardise as such, but they can be productised modularly. The most suitable tool for systematising service processes is describing them with the service blueprinting method. The study is a qualitative, descriptive case study. In the empirical part, data were collected through a semi-structured survey, thematic interviews, participant observation, and document analysis. As a result of analysing these data, a uniform model of the general customer service process was developed for the case company, and the customer service processes of the most central services were described with the service blueprinting method.
Abstract:
Ion mobility spectrometry (IMS) is a straightforward, low-cost method for the fast and sensitive determination of organic and inorganic analytes. Originally this portable technique was applied to the determination of gas-phase compounds in security and military use. Nowadays, IMS has received increasing attention in environmental and biological analysis and in food quality determination. This thesis consists of a literature review of suitable sample preparation and introduction methods for liquid matrices applicable to IMS, from its early development stages to date. Thermal desorption, solid phase microextraction (SPME) and membrane extraction were examined in experimental investigations of hazardous aquatic pollutants and potential pollutants. The effect of different natural waters on extraction efficiency was also studied, and the IMS data processing methods used are discussed. Parameters such as extraction and desorption temperatures, extraction time, SPME fibre depth, SPME fibre type and salt addition were examined for the studied sample preparation and introduction methods. The critical parameters observed were the extraction material and temperature. The extraction methods proved time- and cost-effective because sampling could be performed in single-step procedures and from different natural water matrices within a few minutes. Based on these experimental and theoretical studies, the most suitable method to test in an automated monitoring system is membrane extraction. In the future, an IMS-based early warning system for monitoring water pollutants could help ensure a safe supply of drinking water. IMS can also be utilised for monitoring natural waters in cases of environmental leakage or chemical accidents. When combined with sophisticated sample introduction methods, IMS possesses the potential for both on-line and on-site identification of analytes in different water matrices.
Abstract:
Despite the fact that the literature on Business Intelligence and managerial decision-making is extensive, relatively little effort has been made to research the relationship between them. This field of study has become important because the amount of data in the world is growing every second. Companies require capabilities and resources in order to utilize structured and unstructured data from internal and external data sources. The present Business Intelligence technologies, however, enable managers to utilize data effectively in decision-making. Based on the prior literature, the empirical part of the thesis identifies the enablers and constraints in a computer-aided managerial decision-making process. The theoretical part provides a preliminary understanding of the research area through a literature review: the key concepts, such as Business Intelligence and managerial decision-making, are explored by reviewing the relevant literature, and different data sources as well as data forms are analyzed in further detail. All key concepts are taken into account when the empirical part is carried out. The empirical part develops an understanding of the real-world situation with respect to the themes covered in the theoretical part. Three selected case companies are analyzed through statements that are considered critical prerequisites for successful computer-aided managerial decision-making. The case study analysis, which forms part of the empirical work, enables the researcher to examine the relationship between Business Intelligence and managerial decision-making. Based on the findings of the case study analysis and the case study interviews, the researcher identifies the enablers and constraints. The findings indicate that the constraints have a highly negative influence on the decision-making process. In addition, the managers are aware of the positive implications that Business Intelligence has for decision-making, but not all possibilities are yet utilized. As the main result of this study, a data-driven framework for managerial decision-making is introduced. This framework can be used when managerial decision-making processes are evaluated and analyzed.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this Master's thesis is to find out what kind of process is needed to define a competence mapping carried out from a resourcing perspective. The study is a qualitative case study in a target organisation. The research data were collected from documents and from the meetings and workshops held during the study, and were analysed using data-driven (inductive) content analysis. According to the results, the competence mapping process and its success are significantly influenced by the company's strategy, management's commitment to the competence mapping work, an analysis of the current state, and shared concepts, metrics and goals. The competences required from a resourcing perspective are not necessarily the same as those required from a development perspective. Significant factors for the success of the definition process are the participation of the right people and their willingness to share information.
Abstract:
The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make the original incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
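For illustration, a minimal sketch of the baseline k-NN imputation discussed above is given below, using scikit-learn's KNNImputer on a synthetic expression matrix. It shows only the plain method; the thesis's approach of guiding imputation with curated external biological information is not reproduced, and the matrix dimensions, missingness rate and neighbour count are arbitrary choices for the example.

```python
# Plain k-NN missing value imputation on a toy expression matrix. The data are
# random stand-ins for microarray measurements; this is not the ontology-guided
# variant studied in the thesis.
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(42)
expr = rng.normal(size=(100, 20))                     # 100 genes x 20 arrays (synthetic)
mask = rng.random(expr.shape) < 0.05                  # knock out ~5% of entries
expr_missing = expr.copy()
expr_missing[mask] = np.nan

imputer = KNNImputer(n_neighbors=10, weights="distance")
expr_imputed = imputer.fit_transform(expr_missing)    # rows with similar profiles fill the gaps

rmse = np.sqrt(np.mean((expr_imputed[mask] - expr[mask]) ** 2))
print(f"imputation RMSE on the knocked-out entries: {rmse:.3f}")
```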
Abstract:
As technology has developed, the amount of data produced and collected from the business environment has increased. Over 80% of that data includes some sort of reference to a geographical location. Individuals have used such information through Google Maps or various GPS devices, yet in business it has remained largely unexploited. This thesis studies the use and utilization of geographically referenced data in capital-intensive business. It first provides theoretical insight into how data and data-driven management enable and enhance business and how geographically referenced data in particular adds value to the company; it then examines empirical case evidence of how geographical information can truly be exploited in capital-intensive business and what the value-adding elements of geographical information are. The study uses semi-structured interviews to survey the attitudes and beliefs of an organization towards geographic information and to discover fields of application for a geographic information system within the case company. Additionally, geographical data are tested in order to illustrate how the data could be used in practice. Finally, the outcome of the thesis provides an understanding of which elements the added value of geographical information in business consists of and how such data can be utilized in the case company and in capital-intensive business in general.
Abstract:
The visualization of measurement data is important in the fields of engineering for research analysis and presentation purposes. A suitable scientific visualization method is needed when handling measurement data. Visualization methods and techniques are presented throughout this work; they form the basis of scientific visualization, from the abstract visualization process to the applied techniques suited to each situation. This work also proposes a visualization tool built with the MATLAB® software. The tool was designed to be as general as possible in order to cover most needs in terms of measurement data visualization. It offers possibilities for both static and dynamic visualization of the data.
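The tool proposed in the thesis is MATLAB-based; the short Python/matplotlib sketch below only illustrates the distinction it draws between static and dynamic visualization of a measurement signal, using a synthetic signal and placeholder labels.

```python
# Static vs. dynamic visualization of a synthetic measurement signal.
# Names, labels and the signal itself are placeholders for illustration only.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

t = np.linspace(0, 10, 1000)                            # time [s]
signal = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t) # damped oscillation

fig, (ax_static, ax_dyn) = plt.subplots(1, 2, figsize=(10, 4))

# Static view: the whole measurement at once.
ax_static.plot(t, signal)
ax_static.set(title="Static view", xlabel="time [s]", ylabel="amplitude")

# Dynamic view: the signal is revealed sample by sample, as during acquisition.
line, = ax_dyn.plot([], [])
ax_dyn.set(xlim=(0, 10), ylim=(-1.1, 1.1), title="Dynamic view", xlabel="time [s]")

def update(frame):
    line.set_data(t[:frame], signal[:frame])
    return (line,)

anim = FuncAnimation(fig, update, frames=range(0, len(t), 10), interval=30, blit=True)
plt.show()
```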
Abstract:
The purpose of the thesis is to study how mathematics is experienced and used in preschool children's activities and how preschool teachers frame their teaching of mathematical content. The studies include analyses of children's actions in different activities from a mathematical perspective, as well as of preschool teachers' intentions regarding, and teaching of, mathematics. Preschool teachers' understanding of the knowledge required in this area is also scrutinised. The theoretical points of departure are variation theory and sociocultural theory. With variation theory, the focus is directed towards how mathematical content is dealt with in teaching situations where preschool teachers have chosen the learning objects. The sociocultural perspective has been chosen because children's mathematical learning in play often takes place in interaction with others and in the encounter with culturally mediated concepts. The theoretical framework also includes didactical points of departure. The study is qualitative, with videography and phenomenography as methodological research approaches. Video observations and interviews with preschool teachers were used as data collection methods. The results show that in children's play mathematics consists of volume, geometrical shapes, gravity, quantity and positioning. The situations also include size, patterns, proportions, counting and the creation of pairs. The preschool teachers' intention in planning and staging their goal-oriented work is that all children should be given the opportunity to discern a mathematical content. This also includes making learning objects visible in here-and-now situations. Variation and a clear focus on the mathematical content are important in this context. One of the study's knowledge contributions concerns the didactics of mathematics in the preschool. This relates to the teaching of mathematics and covers the knowledge that preschool teachers regard as essential for their teaching, including theoretical and practical knowledge about children and children's learning as well as didactical issues and strategies. The conclusion is that preschool teachers need a basic knowledge of mathematics and of the didactics of mathematics.
Abstract:
The cDNA microarray is an innovative technology that facilitates the analysis of the expression of thousands of genes simultaneously. The utilization of this rapidly evolving methodology requires a combination of expertise from the biological, mathematical and statistical sciences. In this review, we attempt to provide an overview of the principles of cDNA microarray technology, the practical concerns of the analytical processing of the data obtained, the correlation of this methodology with other data analysis methods such as immunohistochemistry in tissue microarrays, and the application of cDNA microarrays in distinct areas of the basic and clinical sciences.