Abstract:
Over the last several years, lawmakers have been responding to several highly publicized child abduction, assault and murder cases. While such cases remain rare in Iowa, the public debates they have generated are having far-reaching effects, and policy makers bear responsibility for shaping those effects. The challenges they face stem from the need to avoid primarily politically motivated responses and from the desire to make informed decisions that recognize both the strengths and the limitations of the criminal justice system as a vehicle for promoting safe and healthy families and communities. At its first meeting, the Task Force reached consensus that one of its standing goals is to provide nonpartisan guidance to help avoid or fix problematic sex offense policies and practices. This goal was set in response to concern over what can result when elected officials respond to sex offender-related issues that easily become emotionally laden and politically charged, given the universally held abhorrence of sex crimes against children. Despite this ground rule, the meetings of the Task Force and of the work groups it has formed have included some spirited and at times emotionally charged discussions. However, as described in the report, the Task Force's first set of recommendations and plans for further study were approved by consensus. It is hoped that in upcoming legislative deliberations it will be remembered that the non-legislative members of the Task Force all agreed on the recommendations contained in this report. The topics discussed in this first report are limited to the study issues specifically named in H.F. 619, the Task Force's enabling legislation. Other topics of concern were nonetheless discussed because of their immediacy or their possible relationships to one or more of the mandated study issues. For example, some probation/parole officers and others have reported that the 2000 feet rule has had a negative influence on treatment participation and supervision compliance. While such concerns were noted, the Task Force did not take it upon itself to investigate them at this time and thereby broaden the agenda it was given by the General Assembly last session. As a result, the recently reinstated 2000 feet rule, the new cohabitation/child endangerment law, and other issues of interest to Task Force members but outside the scope of their charge are not discussed in the body of this report. The issue of perhaps greatest interest to most Task Force members that was not part of their charge was the belief that Iowa's efforts to protect children from sex crimes would benefit from as comprehensive a view as possible. It has been suggested that much more can be done to prevent child-victim sex crimes than can be accomplished by concentrating only on what to do with offenders after a crime has occurred. To prevent child victimization, the policy provisions of H.F. 619 rely largely on incapacitation and on the future deterrent effects of increased penalties, more restrictive supervision practices, and greater public awareness of the risk presented by a segment of Iowa's known sex offenders.
For some offenders, these policies will no doubt prevent future sex crimes against children, and the Task Force has begun long-term studies to look for the desired results and for ways to improve them through better supervision tools and more effective offender treatment. Unfortunately, much of the effect of the new policies may fall primarily on persons who have already committed sex offenses against minors and have already been caught doing so. Task Force members discussed the need for a range of preventive efforts and the need to think about sex crimes against children from a perspective other than simply reacting to the offender. While this topic is not addressed in the report that follows, it was suggested that some of the Task Force's discussions could be briefly shared through these opening comments. Along with incapacitation and deterrence, a comprehensive approach to the prevention of child-victim sex crimes would also involve making sure parents have the tools they need to detect signs of adults with sexual behavior problems, to teach their children about warning signs, and to find the support they need for healthy parenting. School, faith-based and other community organizations might benefit from stronger supports and better tools to more effectively promote positive youth development and the learning of respect for others, respect for boundaries, and healthy relationships. All of us who have children, or who live in communities where there are children, need to understand the limitations of our justice system and the importance of our own role in preventing sexual abuse and protecting children from sex offenders, who are often members of the child's own family. Over 1,000 incidents of child sexual abuse are confirmed or founded each year in Iowa, and most take place in the home of the child or of the child's caretaker. Efforts to prevent child sexual abuse and to provide early interventions for children and families at risk could be strategically examined and strengthened. The Sex Offender Treatment and Supervision Task Force was established to assist the General Assembly and will respond to legislative direction in adjusting the future plans laid out in this report. Those plans could be broadened or narrowed in scope, different priority levels could be assigned to its current areas of study, and further Task Force consideration of the recommendations it has already submitted could be called for. In the meantime, it is hoped that the information and recommendations submitted through this report prove helpful.
Abstract:
The overarching goal of this project was to identify and evaluate cognitive and behavioral indices that are sensitive to sleep deprivation, may help identify commercial motor vehicle (CMV) drivers at risk of driving in a sleep-deprived state, and may prove useful in field tests administered by officers. To that end, we evaluated indices of driver physiognomy (e.g., yawning, droopy eyelids) and driver behavioral/cognitive state (e.g., distracted driving), and the sensitivity of these indices to objective measures of sleep deprivation. The measures of sleep deprivation were sampled on repeated occasions over a period of 3.5 months in each of 44 drivers diagnosed with Obstructive Sleep Apnea (OSA) and 22 controls (matched for gender, age within 5 years, education within 2 years, and county of residence for rural vs. urban driving). Comprehensive analyses showed that specific dimensions of driver physiognomy associated with sleepiness in previous research, and face-valid composite scores of sleepiness, did not: 1) distinguish participants with OSA from matched controls; 2) distinguish participants before and after PAP treatment, including those who were compliant with their treatment; or 3) predict levels of sleep deprivation measured objectively with actigraphy watches, not even among those chronically sleep deprived. These findings are consistent with large individual differences in driver physiognomy. In other words, when individuals were sleep deprived, as confirmed by actigraphy watch output, they did not show consistently reliable behavioral markers of being sleep deprived. This finding held whether each driver was compared to him/herself with adequate and inadequate sleep, and even among chronically sleep-deprived drivers. The scientific evidence from this study does not support the use of driver physiognomy as a valid measure of sleep deprivation, or as a basis for judging whether a CMV driver is too fatigued to drive, as on the current Fatigued Driving Evaluation Checklist. Fair and accurate determinations of CMV driver sleepiness in the field will likely require further research on alternative strategies that combine information sources besides driver physiognomy, including work logs, actigraphy, in-vehicle data recordings, GPS data on vehicle use, and performance tests.
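As an illustration of the within-subject analysis described above, the sketch below compares each driver's composite sleepiness score on days following adequate versus inadequate sleep. The data layout, field names, and the 6-hour cutoff are hypothetical assumptions for illustration, not the study's actual protocol.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical layout: one row per driver-day, with an actigraphy-derived
# sleep total and a face-valid composite physiognomy score (names assumed).
rng = np.random.default_rng(1)
obs = pd.DataFrame({
    "driver_id": np.repeat(np.arange(10), 30),           # 10 drivers, 30 days each
    "sleep_hours": rng.normal(6.5, 1.2, 300).clip(3, 9),
    "sleepiness_score": rng.normal(2.0, 0.8, 300).clip(0, 5),
})

SLEEP_CUTOFF = 6.0  # assumed threshold separating adequate from inadequate sleep

def within_subject_test(days: pd.DataFrame) -> pd.DataFrame:
    """Compare each driver against him/herself: do composite physiognomy
    scores shift on days that follow inadequate sleep?"""
    rows = []
    for driver, g in days.groupby("driver_id"):
        deprived = g.loc[g.sleep_hours < SLEEP_CUTOFF, "sleepiness_score"]
        rested = g.loc[g.sleep_hours >= SLEEP_CUTOFF, "sleepiness_score"]
        if len(deprived) > 1 and len(rested) > 1:
            # One-sided Mann-Whitney U: higher scores expected when deprived
            u, p = stats.mannwhitneyu(deprived, rested, alternative="greater")
            rows.append({"driver_id": driver, "U": u, "p": p})
    return pd.DataFrame(rows)

print(within_subject_test(obs))
```

A null result in such per-driver tests, repeated across the sample, is the pattern the study reports: sleep deprivation confirmed by actigraphy without a reliable shift in observable markers.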
Abstract:
The genus Artemisia is one of the largest in the family Asteraceae, with more than 500 species. It is widely distributed mainly across the Northern Hemisphere, profusely represented in the Old World with a great centre of diversification in Asia, and also reaching the New World. The evolution of this genus has been studied in depth using different approaches, and polyploidy has been found to play an important role in speciation processes. Karyological, molecular cytogenetic and phylogenetic data are compiled in the present review to provide a genomic characterization of several complexes within the genus.
Abstract:
Statistics occupies a prominent place in science and in citizens' daily life. This article provides a state-of-the-art review of the problems associated with statistics in science and in society, structured along the three paradigms defined by Bauer, Allum and Miller (2007). It explores in more detail medicine and the public understanding of science on the one hand, and risks and surveys on the other. Statistics has received a good deal of attention; however, it is very often handled in terms of deficit, whether of scientists or of citizens. Many tools have been proposed to improve statistical literacy and the image of and trust in statistics, but with little understanding of their roots, little coordination among stakeholders, and few assessments of impact. These deficiencies point to new and promising directions in which the PUS research agenda could be expanded.
Abstract:
PURPOSE: Pharmacovigilance methods have advanced greatly during the last decades, making post-market drug assessment an essential component of drug evaluation. These methods mainly rely on spontaneous reporting systems and health information databases to collect knowledge from huge volumes of real-world reports. The EU-ADR Web Platform was built to further facilitate accessing, monitoring and exploring these data, enabling an in-depth analysis of adverse drug reaction risks.
METHODS: The EU-ADR Web Platform exploits the wealth of data collected within a large-scale European initiative, the EU-ADR project. Millions of electronic health records, provided by national health agencies, are mined for specific drug events, which are correlated with literature, protein and pathway data, resulting in a rich drug-event dataset. Advanced distributed computing methods are then tailored to coordinate the execution of data-mining and statistical analysis tasks. This yields a ranked drug-event list, removing spurious entries and highlighting relationships with high risk potential.
RESULTS: The EU-ADR Web Platform is an open workspace for the integrated analysis of pharmacovigilance datasets. Using this software, researchers can access a variety of tools provided by distinct partners in a single centralized environment. Besides performing standalone drug-event assessments, they can also control the pipeline for an improved batch analysis of custom datasets. Drug-event pairs can be substantiated and statistically analysed within the platform's innovative working environment.
CONCLUSIONS: A pioneering workspace that helps explain the biological path of adverse drug reactions was developed within the EU-ADR project consortium. This tool, targeted at the pharmacovigilance community, is available online at https://bioinformatics.ua.pt/euadr/. Copyright © 2012 John Wiley & Sons, Ltd.
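To make the ranking step concrete: a common way to score mined drug-event pairs is a disproportionality measure such as the proportional reporting ratio (PRR). The abstract does not name the statistic the platform uses, so the sketch below is only an illustrative assumption of how a ranked drug-event list could be produced; the counts are toy numbers.

```python
def prr(a: int, b: int, c: int, d: int) -> float:
    """Proportional reporting ratio from a 2x2 contingency table:
    a = reports with drug and event, b = drug without the event,
    c = other drugs with the event, d = other drugs without it."""
    return (a / (a + b)) / (c / (c + d))

# Toy counts for a single drug-event pair (illustrative numbers only)
score = prr(a=20, b=980, c=100, d=98900)
print(f"PRR = {score:.1f}")  # values well above 1 flag a pair for expert review
```

Ranking pairs by such a score, then filtering out spurious entries, matches the pipeline behaviour the abstract describes, whatever the platform's actual statistic is.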
Abstract:
Since 2004, four antiangiogenic drugs have been approved for clinical use in patients with advanced solid cancers, on the basis of their capacity to improve survival in phase III clinical studies. These achievements validated the concept, introduced by Judah Folkman, that inhibiting tumor angiogenesis can control tumor growth. It has been suggested that biomarkers of angiogenesis would greatly facilitate the clinical development of antiangiogenic therapies. For these four drugs, the pharmacodynamic effects observed in early clinical studies were important to corroborate activity but were not essential for the continuation of clinical development and approval. Furthermore, no validated biomarkers of angiogenesis or antiangiogenesis are available for routine clinical use. Thus, the quest for biomarkers of angiogenesis and their successful use in the development of antiangiogenic therapies remain challenges in clinical oncology and translational cancer research. We review critical points emerging from the successful clinical trials, survey current biomarkers, and discuss their potential impact on improving the clinical use of available antiangiogenic drugs and the development of new ones.
Abstract:
Machine learning for geospatial data: algorithms, software tools and case studies.
This thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of geographical coordinates and additional spatially referenced informative variables ("geo-features"). They are well suited to implementation as predictive engines in decision support systems, for purposes ranging from pattern recognition through modeling and prediction to automatic mapping. Their efficiency is competitive with geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and popular machine learning algorithms and models for geo- and environmental sciences are presented in detail, from theoretical description to software implementation: the multilayer perceptron (MLP), the best-known workhorse of machine learning; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising (Kohonen) maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and probability density estimation.
Exploratory data analysis (EDA) is the first and a very important step of any data analysis. In this thesis, exploratory spatial data analysis (ESDA) is treated both with the traditional geostatistical approach, experimental variography, and following machine learning principles. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps detect spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with the topical problem of automatic mapping of geospatial data, for which the general regression neural network is proposed as an efficient model. The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other methods, particularly under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary, user-friendly and easy-to-use tools for exploratory data analysis and visualisation were developed as well.
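Since the GRNN is central to the automatic-mapping results, a minimal sketch may help: a GRNN is essentially Nadaraya-Watson kernel regression, predicting each query location as a Gaussian-weighted average of the training values. This is a generic illustration, not the Machine Learning Office implementation; the toy coordinates and field are invented, and in practice the kernel width sigma would be tuned by cross-validation.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """General regression neural network (Nadaraya-Watson kernel regression):
    each prediction is a Gaussian-weighted average of the training targets."""
    # Squared Euclidean distances between every query and training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))     # kernel weights
    return (w @ y_train) / w.sum(axis=1)     # weighted average per query point

# Toy spatial interpolation: 200 scattered samples of a smooth field
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))        # (x, y) coordinates
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
grid = np.stack(np.meshgrid(np.linspace(0, 10, 50),
                            np.linspace(0, 10, 50)), axis=-1).reshape(-1, 2)
z = grnn_predict(X, y, grid, sigma=0.8)      # interpolated surface on the grid
print(z.shape)  # (2500,)
```

The appeal of this model for automatic mapping is that sigma is its only free parameter, so the whole fitting procedure can run unattended, which is what emergency conditions such as SIC 2004 demand.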
Abstract:
A table comparing and classifying tools (intelligent tutoring systems) for the e-learning of logic at the college level.
Abstract:
The objective of this work was to develop an experimental kit for assessing repellency, oviposition deterrence, and insecticidal activity on adults of the whitefly Bemisia tabaci biotype B. The kit, consisting of arenas and a nebulizer, was effective for conducting bioassays, and the application of aqueous extracts by inhaler was adequate. The techniques are simple and cheap, and may contribute to research on this insect.
Abstract:
Nanotechnology is becoming part of our daily life in a wide range of products such as computers, bicycles, sunscreens and nanomedicines. While these applications are already a reality, considerable work awaits the scientists, engineers and policy makers who want such nanotechnological products to yield maximum benefit at a minimum of social, environmental, economic and (occupational) health cost. Considerable efforts at coordination and collaboration in research are needed to reach these goals within a reasonable time frame and at an affordable cost. This is recognized in Europe by the European Commission, which funds not only research projects but also the coordination of research efforts. One such coordination effort is NanoImpactNet, a researcher-operated network started in 2008 to promote scientific cross-talk across all disciplines on the health and environmental impact of nanomaterials. Stakeholders contribute to these activities, notably the definition of research and knowledge needs. Initial discussions in this domain focused on reaching agreement on common metrics and on the elements needed for standardized approaches to hazard and exposure identification. Many nanomaterial properties may play a role. Hence, to gain the time needed to study this complex matter full of uncertainties, researchers and stakeholders unanimously called for simple, easy and fast risk assessment tools that can support decision making in this rapidly moving and growing domain. Today, several projects are starting or already running that will develop such assessment tools. At the same time, other projects are investigating in depth which factors and material properties can lead to unwanted toxicity or exposure, what mechanisms are involved, and how such responses can be predicted and modelled. A vision for the future is that once these factors, properties and mechanisms are understood, they can and will be accounted for in the development of new products and production processes, following the idea of "Safety by Design". The promise of all these efforts is a future with nanomaterials in which most of their risks are recognized and addressed before they ever reach the market.
Abstract:
Open innovation and the efficient exploitation of innovations are becoming important parts of companies' R&D processes. The purpose of this Master's thesis is to create a framework for the more effective management, within a research organisation, of technologies that fall outside the company's core business. The constructive framework is built on theories of intangible capital management and portfolio management. In addition, the thesis defines tools and techniques for evaluating surplus technologies. The new surplus-technology portfolio can be used as a search engine, an idea bank, a communication tool, or a marketplace for technologies. Its management consists of documenting the information in the system, evaluating the technologies, and updating and maintaining the portfolio.
Abstract:
A change in paradigm is needed in the prevention of toxic effects on the nervous system, moving from the present reliance solely on data from animal testing to a prediction model based mostly on in vitro toxicity testing and in silico modeling. According to the report published by the National Research Council (NRC) of the US National Academies of Science, high-throughput in vitro tests will provide evidence of alterations in "toxicity pathways" as the best possible method of large-scale toxicity prediction. The challenges of implementing this proposal are enormous, and provide much room for debate. While many efforts address the technical aspects of implementing the vision, many questions around it also need to be addressed. Is the overall strategy the only one to be pursued? How can we move from the current paradigm to the future one? Will we ever be able to reliably model chronic and developmental neurotoxicity in vitro? This paper summarizes four presentations from a symposium held at the International Neurotoxicology Conference in Xi'an, China, in June 2011. A. Li reviewed the current guidelines for neurotoxicity and developmental neurotoxicity testing, and discussed the major challenges to realizing the NRC vision for toxicity testing. J. Llorens reviewed the biology of mammalian toxic avoidance in view of present knowledge on the physiology and molecular biology of the chemical senses, taste and smell. This background supports the hypothesis that relating in vivo toxicity to chemical epitope descriptors that mimic the chemical encoding performed by the olfactory system may provide a path toward the long-term goal of complete in silico toxicity prediction. S. Ceccatelli reviewed the use of rodent and human neural stem cells (NSCs) as models for in vitro toxicity testing measuring parameters such as cell proliferation, differentiation and migration; these appear to be sensitive endpoints that can identify substances with developmental neurotoxic potential. C. Suñol reviewed the use of primary neuronal cultures in testing for neurotoxicity of environmental pollutants, including the study of the effects of persistent exposures and of effects in differentiating cells, which allow recording of effects that can be extrapolated to human developmental neurotoxicity.
Abstract:
AIM: To provide insight into cancer registration coverage, data access and use in Europe. This contributes to data and infrastructure harmonisation and will foster a more prominent role for cancer registries (CRs) within public health, clinical policy and cancer research, whether within or outside the European Research Area.
METHODS: During 2010-12, an extensive survey of cancer registration practices and data use was conducted among 161 population-based CRs across Europe. Responding registries (66%) operated in 33 countries, including 23 with national coverage.
RESULTS: Population-based oncological surveillance started during the 1940s-50s in the north-west of Europe and from the 1970s to the 1990s in other regions. European Union (EU) data protection regulations affected data access, especially in Germany and France, but less so in the Netherlands or Belgium. Regular reports were produced by CRs on incidence rates (95%), survival (60%) and stage for selected tumours (80%). Evaluation of cancer control and quality of care remained modest except in a few dedicated CRs; the activities evaluated included support of clinical audits, monitoring adherence to clinical guidelines, improvement of cancer care, and evaluation of mass cancer screening. Evaluation of diagnostic imaging tools was only occasional.
CONCLUSION: Most population-based CRs are well equipped to strengthen cancer surveillance across Europe. Data quality and intensity of use depend on the role the cancer registry plays in the political, oncomedical and public health setting within its country. Standard registration methodology could therefore not be translated into equivalent advances in cancer prevention and mass screening, quality of care, and translational research on prognosis and survivorship across Europe. Further European collaboration remains essential to ensure access to data and comparability of results.
Abstract:
The aim of the study was to determine how to develop the company's current e-service system, an Internet-based electronic communication and information-sharing system, in the management of the company's business-to-business customer relationships. A further aim was to create proposals for new e-service contract models. In the theoretical part of the study, a framework model was developed based on earlier research, the literature, and expert input. In the empirical part, the objectives were pursued by interviewing the company's customers and personnel and by examining the current state and development of customer contacts. On the basis of this information, the needs, profile and readiness of the e-service users, as well as the current attractiveness of the service, were investigated. The source material for the theoretical part comprised books, articles and statistics on customer relationship management and on the marketing, current state and development of e-services, particularly Internet and web services. In addition, literature on value network analysis, customer value, information technology, service quality and customer satisfaction was reviewed. The empirical part is based on information collected in interviews with the company's personnel and customers, on material previously collected by the company, and on data collected by Taloustutkimus. The study used a case method combining qualitative and quantitative research. The purpose of the case was to test the validity and usability of the model and to determine whether there are further factors that affect the value received by the customer. The qualitative material is based on customer and employee interviews conducted using the thematic interview method. The quantitative research is based on a study by Taloustutkimus and on data collected on the company's customer contacts. On the basis of the interviews, e-services were seen as useful and as very important in the future. E-services are regarded as one important channel, alongside traditional channels, for making the management of business-to-business customer relationships more effective. According to the results, the variation in customers' levels of knowledge, skill, need and interest regarding the service demonstrates a clear need for e-service package solutions at different levels. The proposed solution derived from the results comprises the construction of four different e-service packages tailored to customers' different needs.
Abstract:
The aim of the study was to examine the state of internal communication in the case companies. The companies belong to two case value networks operating in the field of information and communication technology. Internal communication was chosen as the research area because it forms the basis for external, inter-company communication. The focus of the study was on web-based communication and on the properties of the web from the value network perspective. Both qualitative and quantitative methods were used in the research process. The quantitative part of the study was carried out as a web survey, whose results showed that internal communication in the case companies is based mainly on the use of traditional communication tools. In other words, use of the web is limited, and many different factors contribute to this. The web nevertheless has several properties that improve communication in a value network, and these web-based tools should therefore be taken into account when designing an overall communication system. In the theoretical part of the study, a classification of communication tools based on their interactivity was defined; in addition, the concept of the value network was defined. The empirical part consisted of reporting the implementation and results of the web survey, after which a concluding chapter summarised the most significant findings and possible topics for further research.