966 results for Data quality-aware mechanisms


Relevância: 100.00%

Resumo:

Many regions of the world, including inland lakes, present suboptimal conditions for the remotely sensed retrieval of optical signals, thus challenging the limits of available satellite data-processing tools, such as atmospheric correction models (ACMs) and water constituent-retrieval (WCR) algorithms. Working in such regions, however, can improve our understanding of remote-sensing tools and their applicability in new contexts, in addition to potentially offering useful information about aquatic ecology. Here, we assess and compare 32 combinations of two ACMs, two WCRs, and three binary categories of data quality standards to optimize a remotely sensed proxy of plankton biomass in Lake Kivu. Each parameter set is compared against the available ground-truth match-ups using Spearman's right-tailed ρ. Focusing on the best sets from each ACM-WCR combination, their performances are discussed with regard to data distribution, sample size, spatial completeness, and seasonality. The results of this study may be of interest both for ecological studies on Lake Kivu and for epidemiological studies of diseases such as cholera, the dynamics of which have been associated with plankton biomass in other regions of the world.
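The ranking criterion, Spearman's right-tailed ρ between each parameter set's retrievals and the ground-truth match-ups, can be sketched as follows. This is a minimal illustration with synthetic data; `truth` and `proxy` merely stand in for the real match-up pairs:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rho via rank transform (assumes no ties,
    which holds for continuous match-up values)."""
    rx = np.argsort(np.argsort(np.asarray(x)))
    ry = np.argsort(np.argsort(np.asarray(y)))
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical match-ups: in-situ plankton proxy vs. the
# satellite retrieval at the same stations and dates.
rng = np.random.default_rng(0)
truth = rng.uniform(1.0, 10.0, 30)
proxy = truth + rng.normal(0.0, 1.0, 30)   # noisy retrieval

# A parameter set is ranked by how strongly rho exceeds 0
# (the right-tailed alternative used in the study).
rho = spearman_rho(truth, proxy)
print(f"rho = {rho:.3f}")
```

A rank correlation is a sensible choice here because it is insensitive to the unknown, possibly nonlinear calibration between the optical proxy and true biomass.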

Relevância: 100.00%

Resumo:

BACKGROUND: As part of EUROCAT's surveillance of congenital anomalies in Europe, a statistical monitoring system has been developed to detect recent clusters or long-term (10-year) time trends. The purpose of this article is to describe the system for the identification and investigation of 10-year time trends, conceived as a "screening" tool ultimately leading to the identification of trends which may be due to changing teratogenic factors. METHODS: The EUROCAT database consists of all cases of congenital anomalies, including livebirths, fetal deaths from 20 weeks gestational age, and terminations of pregnancy for fetal anomaly. Monitoring of 10-year trends is performed for each registry for each of 96 non-independent EUROCAT congenital anomaly subgroups, while a Pan-Europe analysis combines data from all registries. The monitoring results are reviewed, prioritized according to a prioritization strategy, and communicated to registries for investigation. Twenty-one registries covering over 4 million births from 1999 to 2008 were included in monitoring in 2010. RESULTS: Significant increasing trends were detected for abdominal wall anomalies, gastroschisis, hypospadias, Trisomy 18 and renal dysplasia in the Pan-Europe analysis, while 68 increasing trends were identified in individual registries. A decreasing trend was detected in over one-third of anomaly subgroups in the Pan-Europe analysis and in 16.9% of individual registry tests. CONCLUSIONS: Registry preliminary investigations indicated that many trends are due to changes in data quality, ascertainment, screening, or diagnostic methods. Some trends are inevitably chance phenomena related to multiple testing, while others seem to represent real and continuing change needing further investigation and response by regional/national public health authorities.
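A one-degree-of-freedom test for linear trend in prevalence illustrates the kind of screening such a monitoring system performs. This is a simplified sketch (a Cochran-Armitage trend test with invented counts), not EUROCAT's published algorithm:

```python
import math

def trend_z(cases, births, years):
    """One-degree-of-freedom Cochran-Armitage test for a linear
    trend in malformation prevalence over calendar years.
    Returns the signed z statistic (positive = increasing trend).
    A simplified screening test in the spirit of the monitoring
    system; the actual EUROCAT methodology is more elaborate."""
    N = sum(births)
    p = sum(cases) / N                      # pooled prevalence
    sx = sum(n * x for n, x in zip(births, years))
    sxx = sum(n * x * x for n, x in zip(births, years))
    num = sum(x * (r - n * p) for x, r, n in zip(years, cases, births))
    var = p * (1 - p) * (sxx - sx * sx / N)
    return num / math.sqrt(var)

# Invented example: a subgroup rising steadily over 10 years.
years = list(range(10))
births = [400_000] * 10
cases = [100 + 10 * i for i in range(10)]
print(f"z = {trend_z(cases, births, years):.2f}")
```

A large positive z would flag the subgroup for the review-and-prioritization step described above; a flat series yields z near zero.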

Relevância: 100.00%

Resumo:

Background: Well-conducted behavioural surveillance (BS) is essential for policy planning and evaluation. Data should be comparable across countries. In 2008, the European Centre for Disease Prevention and Control (ECDC) began a programme to support Member States in the implementation of BS for Second Generation Surveillance. Methods: Data from a mapping exercise on current BS activities in EU/EFTA countries led to recommendations for establishing national BS systems and international coordination, and the definition of a set of core and transversal (UNGASS-Dublin compatible) indicators for BS in the general and eight specific populations. A toolkit for establishing BS has been developed and a BS needs-assessment survey has been launched in 30 countries. Tools for BS self-assessment and planning are currently being tested during interactive workshops with country representatives. Results: The mapping exercise revealed extreme diversity between countries. Around half had established a BS system, but this did not always correspond to the epidemiological situation. Challenges to implementation and harmonisation at all levels emerged from survey findings and workshop feedback. These include: absence of synergy between biological and behavioural surveillance and of actors having an overall view of all system elements; lack of awareness of the relevance of BS and of coordination between agencies; insufficient use of available data; financial constraints; poor sustainability, data quality and access to certain key populations; unfavourable legislative environments. Conclusions: There is widespread need in the region not only for technical support but also for BS advocacy: BS remains the neglected partner of second generation surveillance and requires increased political support and capacity-building in order to become effective. Dissemination of validated tools for BS, developed in interaction with country experts, proves feasible and acceptable.

Relevância: 100.00%

Resumo:

OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required due to the low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital, and a hypothetical screen for very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random and VBS, with and without constraints on the size of the population to be screened. RESULTS: Across sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce the sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimal sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the required sample size for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
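The dependence of the required sample size on PSI prevalence can be reproduced with a back-of-the-envelope calculation for simple random sampling. The normal-approximation formula and the default values below are illustrative assumptions, not the paper's exact design:

```python
import math

def events_needed(sens=0.5, half_width=0.1, z=1.96):
    """True events that must be verified so the 95% CI on
    sensitivity has the requested half-width (normal approx.)."""
    return math.ceil(z**2 * sens * (1 - sens) / half_width**2)

def random_sample_size(prevalence, sens=0.5, half_width=0.1):
    """Records to draw under simple random sampling so that, in
    expectation, enough true events are verified. Illustrative
    only: the paper's optimal VBS design verifies far fewer."""
    return math.ceil(events_needed(sens, half_width) / prevalence)

print(random_sample_size(0.02))   # prevalence 2%
print(random_sample_size(0.005))  # prevalence 0.5%
```

With the defaults (sensitivity 0.5, ±0.1 precision), about 97 verified events are needed, so a prevalence below 1% already implies screening many thousands of records, consistent with the abstract's floor of over 5000.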

Relevância: 100.00%

Resumo:

Imatinib mesylate, a selective inhibitor of tyrosine kinases, has excellent efficacy in the treatment of chronic myeloid leukaemia (CML) and gastrointestinal stromal tumour (GIST). Inducing durable responses and achieving prolonged survival, it has become the standard of care for the treatment of these diseases. It has opened the way to the development of additional tyrosine kinase inhibitors (TKIs), including sunitinib, nilotinib, dasatinib and sorafenib, all indicated for the treatment of various haematological malignancies and solid tumours. TKIs are prescribed for prolonged periods and are often taken by patients with comorbidities, notably cardiovascular disease. Hence TKIs are regularly co-administered with cardiovascular drugs, with a considerable risk of potentially harmful drug-drug interactions due to the large number of agents used in combination. However, this aspect has received limited attention so far, and a comprehensive review of the published data on this important topic has been lacking. We review here the available data and the pharmacological mechanisms of interactions between commonly prescribed cardiovascular drugs and the TKIs currently on the market. Regular updating of the literature on this topic will be mandatory, as will the prospective reporting of unexpected clinical observations, given that these drugs have only recently been marketed.

Relevância: 100.00%

Resumo:

The prevalence of infectious diseases at our hospital (Centre hospitalier universitaire vaudois, Lausanne [CHUV], 900 beds) was studied retrospectively over a two-year period (1980-1981). The medical diagnoses of 30,203 patients recorded in the computerized medical archives, representing 93% of the patients admitted during the period of observation, were reviewed. To assess the reliability of the computerized data, quality control was carried out through detailed analysis of all histologically proven cases of appendicitis recorded during 1981: 88% of these cases were registered in the computer, and the diagnosis was specific in 87% of cases. An infectious disease was the primary reason for admission in 12.8% of the patients (3873) during the study period. Altogether, 20.2% of patients presented with an infection during their hospital stay. Because of the retrospective nature of the study, it was not possible to determine whether these additional infections were nosocomially acquired. The organ systems most frequently infected were the respiratory tract (28.5% of all infections), the digestive tract (20.5%), the skin and osteoarticular system (16%) and the urogenital tract (11.6%). An infection was the primary reason for admission for 40.2% of the patients hospitalized in the dermatology service, 19.7% of patients admitted to internal medicine, 15-17% of the patients admitted to pediatrics, ENT and general surgery, and 1-2% of the patients admitted to neurosurgery and radiotherapy. These observations highlight the continuing importance of infectious diseases in a modern hospital, in spite of high socio-economic levels, stringent hygiene and epidemiologic measures, and the availability of modern antibiotics.

Relevância: 100.00%

Resumo:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this test site, covering an area of about 1 km2. In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun in combination with real-time control on navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. 
Whereas the single 48-channel streamer system of Survey I requires extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Observed aliasing in the 3-D migration was due to insufficient lateral sampling combined with the relatively high frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber bubble-canceling air gun for Survey II. A 15/15 Mini G.I. air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines a penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. 
While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array). Otherwise, the resulting stacks are different for the opposite directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and was complemented by two computer programs that format the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra. According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows application of such sophisticated techniques even to high-resolution seismic surveys. 
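The binning step described above, with 1.25 m in-line by 3.75 m cross-line cells, amounts to a simple quantisation of CMP coordinates. A toy sketch follows; the flat, axis-aligned survey geometry is an assumption made for illustration:

```python
def bin_index(inline_m, crossline_m, dx=1.25, dy=3.75):
    """Map a CMP position (metres along / across the sail lines)
    to its (in-line, cross-line) bin indices for the Survey II
    geometry: 1.25 m in-line by 3.75 m cross-line cells.
    Toy illustration of the binning step, not the actual software."""
    return int(inline_m // dx), int(crossline_m // dy)

# A midpoint 10 m along the line and 8 m across it:
print(bin_index(10.0, 8.0))
```

Traces falling in the same cell are stacked together, which is why the positioning accuracy of about 20 cm quoted above matters: errors larger than a fraction of the cell size smear reflections across neighbouring bins.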
In general, the adaptation of the 3-D marine seismic reflection method, which to date has almost exclusively been used by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a method of investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected at geological discontinuities at various depths and then travel back to the surface, where they are recorded. The signals collected in this way not only give information about the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. In the case of sedimentary rocks, for example, seismic reflection profiles make it possible to determine their mode of deposition and any deformation or faulting, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles, which provide a two-dimensional image of the subsurface. Images obtained in this way are only partially accurate, since they do not take the three-dimensional nature of geological structures into account. Over the past few decades, three-dimensional (3-D) seismics has breathed new life into the study of the subsurface. While it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the scale of lakes and rivers has so far been the subject of only a few studies. This thesis work consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, easier to deploy and, above all, delivers final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution of the order of ten metres, the instrument developed in this work can resolve details of the order of one metre. The new system rests on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great accuracy. Software was developed specifically to control the navigation and to trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km2. The seismic recordings were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, in particular with regard to positioning. After processing, the data reveal several main seismic facies corresponding notably to the lacustrine sediments (Holocene), the glacio-lacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible in vertical and horizontal seismic sections. The excellent quality of the data and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in the fields of the environment and civil engineering.

Relevância: 100.00%

Resumo:

Reference collections of multiple Drosophila lines with accumulating collections of "omics" data have proven especially valuable for the study of population genetics and complex trait genetics. Here we present a description of a resource collection of 84 strains of Drosophila melanogaster whose genome sequences were obtained after 12 generations of full-sib inbreeding. The initial rationale for this resource was to foster development of a systems biology platform for modeling metabolic regulation by the use of natural polymorphisms as perturbations. As reference lines, they are amenable to repeated phenotypic measurements, and a large collection of metabolic traits has already been assayed. Another key feature of these strains is their widespread geographic origin, coming from Beijing, Ithaca, the Netherlands, Tasmania, and Zimbabwe. After obtaining 12.5× coverage of paired-end Illumina sequence reads, SNP and indel calls were made with the GATK platform. Thorough quality control was enabled by deep sequencing one line to >100×, and single-nucleotide polymorphisms and indels were validated using ddRAD-sequencing as an orthogonal platform. In addition, a series of preliminary population genetic tests were performed with these single-nucleotide polymorphism data for assessment of data quality. We found 83 segregating inversions among the lines and, as expected, these were especially abundant in the African sample. We anticipate that this collection will make a useful addition to the set of reference D. melanogaster strains, thanks to its geographic structuring and unusually high level of genetic diversity.
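The cross-platform validation step, comparing the Illumina/GATK calls against ddRAD-sequencing at shared sites, boils down to a concordance calculation. A simplified sketch with invented genotype calls; real pipelines compare VCFs site by site with quality filters:

```python
def concordance(calls_a, calls_b):
    """Fraction of sites called in BOTH data sets where the two
    platforms agree. Each argument maps (chrom, pos) -> genotype.
    A simplified stand-in for the GATK vs ddRAD validation."""
    shared = [site for site in calls_a if site in calls_b]
    if not shared:
        return 0.0
    agree = sum(1 for site in shared if calls_a[site] == calls_b[site])
    return agree / len(shared)

# Invented example: two shared sites, one agreeing.
gatk = {("2L", 100): "A/A", ("2L", 200): "A/T", ("3R", 50): "G/G"}
ddrad = {("2L", 100): "A/A", ("2L", 200): "T/T", ("X", 10): "C/C"}
print(concordance(gatk, ddrad))
```

Sites private to one platform are excluded from the denominator, so the metric measures agreement where both technologies produced a call rather than penalising differences in coverage.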

Relevância: 100.00%

Resumo:

AIM: To provide insight into cancer registration coverage, data access and use in Europe. This contributes to data and infrastructure harmonisation and will foster a more prominent role for cancer registries (CRs) within public health, clinical policy and cancer research, whether within or outside the European Research Area. METHODS: During 2010-12 an extensive survey of cancer registration practices and data use was conducted among 161 population-based CRs across Europe. Responding registries (66%) operated in 33 countries, including 23 with national coverage. RESULTS: Population-based oncological surveillance started during the 1940s-50s in the northwest of Europe and from the 1970s to 1990s in other regions. European Union (EU) data protection regulations affected data access, especially in Germany and France, but less so in the Netherlands or Belgium. Regular reports were produced by CRs on incidence rates (95%), survival (60%) and stage for selected tumours (80%). Evaluation of cancer control and quality of care remained modest except in a few dedicated CRs. Activities evaluated included support of clinical audits, monitoring of adherence to clinical guidelines, improvement of cancer care and evaluation of mass cancer screening. Evaluation of diagnostic imaging tools was only occasional. CONCLUSION: Most population-based CRs are well equipped for strengthening cancer surveillance across Europe. Data quality and intensity of use depend on the role the cancer registry plays in the political, oncomedical and public health setting within the country. Standard registration methodology has therefore not translated into equivalent advances in cancer prevention and mass screening, quality of care, and translational research on prognosis and survivorship across Europe. Further European collaboration remains essential to ensure access to data and comparability of the results.

Relevância: 100.00%

Resumo:

This Master's thesis was written for Partek Oyj Abp to give the managers who are responsible for, and who rely on, IT systems an overview of IT application integration, and to create guidelines for integration projects. The first part of the thesis presents, at a general level and based on the literature, problem areas in business processes and the benefits that application integration brings to business. Benefits at the general level come, among other things, from faster processes, better availability of information, and the new ways of working that integration brings to people. The next part describes what application integration means in practice, what different integration options exist, and what the advantages and disadvantages of the different integration approaches are. Among these approaches, message-based integration has become the most popular because of its simplicity, reliability and easy connectivity. Integration applications make it possible to transfer, transform, process and store messages. With these capabilities it is possible to build real-time collaboration networks. This part is based on the literature, articles and interviews. The third part focuses on the characteristics of an integration project, creating a road map for the course of such a project. It presents the technical issues to be considered, the costs and benefits, and templates for documenting an integration. This part is based on the author's own experience, interviews and the literature. The fourth part presents an integration project carried out at Partek. The integration was built between the supplier register used by buyers (PPM) and the ERP system (Baan), using one of the most popular integration tools, IBM WebSphere MQ. This part is based on the project documentation, the author's own experience and the literature. The thesis ends with a summary. 
Three main benefits can be achieved with the integrations and the road map: the reliability of information improves, the road map provides a model for integrations, and dependence on particular key individuals is reduced through precise documentation and the standardisation of working practices.
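The message-based integration pattern described in the thesis can be sketched with an in-memory queue. In the actual project the transport was IBM WebSphere MQ; the field names below are invented for illustration:

```python
from queue import Queue

# Toy sketch of message-based integration between the buyers'
# supplier register (PPM) and the ERP system (Baan). The field
# names and formats are hypothetical.
channel = Queue()

def publish_supplier(record):
    """PPM side: put a supplier record on the channel (transfer)."""
    channel.put(record)

def consume_for_baan():
    """Baan side: take a message off the channel and transform it
    into the format the ERP expects (transform + process)."""
    msg = channel.get()
    return {"SUPPLIER_ID": msg["id"].upper(),
            "SUPPLIER_NAME": msg["name"].strip()}

publish_supplier({"id": "s-001", "name": " Acme Oy "})
print(consume_for_baan())
```

The queue decouples the two systems: PPM can publish even when Baan is down, which is one reason the thesis credits message-based integration with its reliability.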

Relevância: 100.00%

Resumo:

This study concerns the development of product cost accounting at VAASAN Oy, one of the leading bakery companies in Finland and the Baltic countries. The aim of the work is to develop product cost accounting so that it better supports decision-making. The development work took into account preserving the flexibility of the costing model. The development needs were identified through an analysis of the current state of product cost accounting and a survey of the information needs of the user groups. Theories of cost-information quality and a risk-management tool were applied to guide the development. The starting point was a rather decentralised product cost accounting whose reliability problem was recognised. At the conclusion of the work, product cost accounting will be centralised. The opportunity offered by centralisation is to be exploited as effectively as possible in developing cost accounting. Before centralisation, product cost accounting must be made consistent across the different bakeries. This work identified the development targets and presents an operating model or proposal for solving the most important of them. The most important development targets were found by structuring and prioritising the targets. Product cost accounting was developed to better support decision-making and to be more consistent, more reliable and more usable.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

In recent years, chief information officers (CIOs) around the world have identified Business Intelligence (BI) as their top priority and as the best way to enhance their enterprises' competitiveness. Yet many enterprises struggle to realize the business value that BI promises. This discrepancy raises important questions, for example: what are the critical success factors of Business Intelligence, and, more importantly, how can it be ensured that a Business Intelligence program enhances an enterprise's competitiveness? The main objective of the study is to find out how it can be ensured that a BI program meets its goal of providing competitive advantage to an enterprise. The objective is approached with a literature review and a qualitative case study. For the literature review, the main objective gives rise to three research questions (RQs). RQ1: What is Business Intelligence and why is it important for modern enterprises? RQ2: What are the critical success factors of Business Intelligence programs? RQ3: How can it be ensured that these CSFs are met? The qualitative case study covers the BI program of a Finnish global manufacturing company. The research questions for the case study are as follows. RQ4: What is the current state of the case company's BI program and what are the key areas for improvement? RQ5: In what ways could the case company's Business Intelligence program be improved? The case company's BI program is studied using the following methods: action research, semi-structured interviews, a maturity assessment, and benchmarking. The literature review shows that Business Intelligence is a technology-based information process comprising a series of systematic activities driven by the specific information needs of decision-makers. The objective of BI is to provide accurate, timely, fact-based information that enables actions leading to competitive advantage.
There are many reasons why Business Intelligence is important; two of the most significant are that 1) it helps bridge the gap between an enterprise's current and desired performance, and 2) it helps enterprises stay aligned with their key performance indicators, that is, aligned towards their key objectives. The literature review also shows that there are known critical success factors (CSFs) for Business Intelligence programs which must be met if the value described above is to be achieved, for example committed management support and sponsorship, a business-driven development approach, and sustainable data quality. The most common challenges are related to these CSFs and, more importantly, overcoming them requires a more comprehensive form of BI, called Enterprise Performance Management (EPM). EPM links measurement to strategy by focusing on what is measured and why. The case study shows that many of the challenges faced in the case company's BI program are related to the CSFs mentioned above. The main challenges are a lack of support and sponsorship from the business, a lack of visibility into overall business performance, the lack of a well-defined BI development process, the lack of a clear purpose for the BI program, and poor data quality. To overcome these challenges, the case company should define and design an enterprise metrics framework, ensure that BI development requirements are gathered and prioritized by the business, focus on data quality and ownership, and finally define clear goals for the BI program and then support and sponsor those goals.
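Sustainable data quality, named above as a CSF, is typically enforced with automated checks before data reaches BI reports. A minimal sketch of such a quality gate follows; the fields, rows, and completeness rule are hypothetical illustrations, not the case company's actual checks:

```python
# Minimal data-quality gate for a BI load: report per-field completeness
# so that poor data quality is visible before rows reach the reports.
# All field names and sample rows are hypothetical.
def quality_report(rows, required_fields):
    """Return the fraction of non-empty values for each required field."""
    report = {}
    for field in required_fields:
        missing = sum(1 for r in rows if r.get(field) in (None, ""))
        report[field] = 1 - missing / len(rows)
    return report

rows = [
    {"order_id": 1, "region": "North", "revenue": 120.0},
    {"order_id": 2, "region": "", "revenue": 80.0},
    {"order_id": 3, "region": "South", "revenue": None},
]
print(quality_report(rows, ["order_id", "region", "revenue"]))
```

In practice such a report would feed a threshold check (e.g., reject a load whose completeness falls below an agreed level), which is one concrete way a BI program can "focus on data quality and ownership".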

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Genomics is expanding the horizons of epidemiology, providing a new dimension for classical epidemiological studies and inspiring the development of large-scale multicenter studies with the statistical power necessary to assess gene-gene and gene-environment interactions in cancer etiology and prognosis. This paper describes the methodology of the Clinical Genome of Cancer Project (CGCP) in São Paulo, Brazil, which includes patients with nine types of tumors as well as controls. Three major epidemiological designs were used to reach specific objectives: cross-sectional studies to examine gene expression, case-control studies to evaluate etiological factors, and follow-up studies to analyze genetic profiles in prognosis. The clinical groups entered patients' data into the electronic database over the Internet. Two approaches were used for data quality control: continuous data evaluation and data entry consistency checks. A total of 1749 cases and 1509 controls were entered into the CGCP database from the first trimester of 2002 to the end of 2004. Continuous evaluation showed that, for all tumors taken together, only 0.5% of the general form fields still included potential inconsistencies by the end of 2004. Regarding data entry consistency, the highest percentage of errors (11.8%) was observed for the follow-up form, followed by 6.7% for the clinical form, 4.0% for the general form, and only 1.1% for the pathology form. Good data quality is required if the data are to be transformed into information useful for clinical application and for preventive measures. The use of the Internet for communication among researchers and for data entry is perhaps the most innovative feature of the CGCP, and the monitoring of patients' data guaranteed their quality.
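Per-form error percentages like those reported above can be obtained by comparing two independent entries of the same form field by field. This is a generic sketch of such a consistency check, not the CGCP's actual procedure (which the abstract does not detail); all field names and values are hypothetical:

```python
# Sketch of a data-entry consistency check: compare two independent
# entries of the same form and report the share of discordant fields.
# Field names and sample values are hypothetical.
def entry_error_rate(first_entry: dict, second_entry: dict) -> float:
    """Return the percentage of fields that differ between two entries."""
    fields = set(first_entry) | set(second_entry)
    discordant = sum(
        1 for f in fields if first_entry.get(f) != second_entry.get(f)
    )
    return 100.0 * discordant / len(fields)

a = {"age": 54, "sex": "F", "tumor_site": "breast", "stage": "II"}
b = {"age": 54, "sex": "F", "tumor_site": "breast", "stage": "III"}
print(f"{entry_error_rate(a, b):.1f}% of fields discordant")  # 25.0%
```

Aggregating this rate over all double-entered forms of one type yields a per-form error percentage comparable to the 11.8%, 6.7%, 4.0%, and 1.1% figures quoted for the follow-up, clinical, general, and pathology forms.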

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The objective of this study was to determine the effect of eight 5-hydroxy-5-trifluoromethyl-4,5-dihydro-1H-1-carboxyamidepyrazoles (TFDPs) on rat body temperature and on baker's yeast-induced fever. TFDPs or vehicle (5% Tween 80 in 0.9% NaCl, 5 mL/kg) were injected subcutaneously, and rectal temperature was measured as a function of time in 28-day-old male Wistar rats (N = 5-12 per group). Antipyretic activity was determined in feverish animals injected with baker's yeast (Saccharomyces cerevisiae suspension, 0.135 mg/kg, 10 mL/kg, ip). 3-Ethyl- and 3-propyl-TFDP (140 and 200 μmol/kg, respectively, administered 4 h after yeast injection) attenuated baker's yeast-induced fever by 61 and 82%, respectively. These two effective antipyretics were selected for subsequent analysis of putative mechanisms of action. We then determined their effects on cyclooxygenase-1 and -2 (COX-1 and COX-2) activities and on 1,1-diphenyl-2-picrylhydrazyl (DPPH) oxidation in vitro, as well as on tumor necrosis factor-α (TNF-α) and interleukin-1β (IL-1β) levels and on leukocyte counts in peritoneal washes of rats injected with baker's yeast. While 3-ethyl- and 3-propyl-TFDP did not reduce the baker's yeast-induced increases in IL-1β or TNF-α levels, 3-ethyl-TFDP caused a 42% reduction in the peritoneal leukocyte count. Neither 3-ethyl- nor 3-propyl-TFDP altered COX-1 or COX-2 activity in vitro, but both showed antioxidant activity in the DPPH assay, with IC50 values of 39 mM (25-62) and 163 mM (136-196), respectively. The data indicate that the mechanisms of action of these two novel antipyretic pyrazole derivatives involve neither classic inhibition of the COX pathway nor pyrogenic cytokine release.