42 results for "OAIS reference model for an open archival information system"


Relevance: 100.00%

Abstract:

BACKGROUND: The outcome of diffuse large B-cell lymphoma has been substantially improved by the addition of the anti-CD20 monoclonal antibody rituximab to chemotherapy regimens. We aimed to assess, in patients aged 18-59 years, the potential survival benefit provided by a dose-intensive immunochemotherapy regimen plus rituximab compared with standard treatment plus rituximab. METHODS: We did an open-label randomised trial comparing dose-intensive rituximab, doxorubicin, cyclophosphamide, vindesine, bleomycin, and prednisone (R-ACVBP) with subsequent consolidation versus standard rituximab, doxorubicin, cyclophosphamide, vincristine, and prednisone (R-CHOP). Random assignment was done with a computer-assisted randomisation-allocation sequence with a block size of four. Patients were aged 18-59 years with untreated diffuse large B-cell lymphoma and an age-adjusted international prognostic index equal to 1. Our primary endpoint was event-free survival. Our analyses of efficacy and safety were of the intention-to-treat population. This study is registered with ClinicalTrials.gov, number NCT00140595. FINDINGS: One patient withdrew consent before treatment and 54 did not complete treatment. After a median follow-up of 44 months, our 3-year estimate of event-free survival was 81% (95% CI 75-86) in the R-ACVBP group and 67% (59-73) in the R-CHOP group (hazard ratio [HR] 0·56, 95% CI 0·38-0·83; p=0·0035). 3-year estimates of progression-free survival (87% [95% CI 81-91] vs 73% [66-79]; HR 0·48 [0·30-0·76]; p=0·0015) and overall survival (92% [87-95] vs 84% [77-89]; HR 0·44 [0·28-0·81]; p=0·0071) were also higher in the R-ACVBP group. 82 (42%) of 196 patients in the R-ACVBP group experienced a serious adverse event, compared with 28 (15%) of 183 in the R-CHOP group. Grade 3-4 haematological toxic effects were more common in the R-ACVBP group, with a higher proportion of patients experiencing a febrile neutropenic episode (38% [75 of 196] vs 9% [16 of 183]). INTERPRETATION: Compared with standard R-CHOP, intensified immunochemotherapy with R-ACVBP significantly improves the survival of patients aged 18-59 years with diffuse large B-cell lymphoma and low-intermediate risk according to the age-adjusted International Prognostic Index. Haematological toxic effects of the intensive regimen were increased but manageable. FUNDING: Groupe d'Etudes des Lymphomes de l'Adulte and Amgen.
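
For readers who want to see how such figures arise, below is a minimal sketch on synthetic data (not the trial's data; the column names are ours) of computing a 3-year event-free survival estimate and a hazard ratio with the Python lifelines library:

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Synthetic, illustrative data only: follow-up in months, event indicator,
# and a binary treatment-arm covariate.
rng = np.random.default_rng(1)
n = 379
df = pd.DataFrame({
    "months": rng.exponential(120, n).clip(max=60),  # follow-up time, censored at 60
    "event": rng.integers(0, 2, n),                  # 1 = event observed
    "r_acvbp": rng.integers(0, 2, n),                # 1 = intensive arm
})

# Kaplan-Meier estimate of event-free survival at 36 months (the "3-year estimate").
kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["event"])
print(kmf.survival_function_at_times(36))

# Cox proportional-hazards model: the exponentiated coefficient of the arm
# covariate is the hazard ratio of the kind reported above.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
print(cph.hazard_ratios_)
```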

Relevance: 100.00%

Abstract:

In recent years, Business Model Canvas design has evolved from a paper-based activity to one that involves dedicated computer-aided business model design tools. We propose a set of guidelines to help design more coherent business models; when combined with the functionalities offered by CAD tools, they show great potential to improve business model design as an ongoing activity. However, before building more complex solutions, it is necessary to compare how basic business model design tasks are performed with a CAD system versus its paper-based counterpart. To this end, we carried out an experiment measuring user perceptions of both solutions. Performance was evaluated by applying our guidelines to both solutions and then comparing the resulting business model designs. Although CAD did not outperform paper-based design, the results are very encouraging for the future of computer-aided business model design.

Relevance: 100.00%

Abstract:

We consider a spectrally-negative Markov additive process as a model of a risk process in a random environment. Following recent interest in alternative ruin concepts, we assume that ruin occurs when an independent Poissonian observer sees the process as negative, where the observation rate may depend on the state of the environment. Using an approximation argument and spectral theory, we establish an explicit formula for the resulting survival probabilities in this general setting. We also discuss an efficient evaluation of the involved quantities and provide a numerical illustration.
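
A minimal formalization of this observation scheme, in notation we introduce here (the paper's own notation may differ):

```latex
% X = risk process (spectrally-negative Markov additive process),
% J = environmental Markov chain, T_1, T_2, ... = arrival epochs of an
% independent Poisson observer whose rate \omega(J_t) may depend on the
% current environmental state.
\[
  \tau_\omega \;=\; \inf\{\, T_i \;:\; X(T_i) < 0 \,\},
  \qquad
  \phi_\omega(x, j) \;=\; \mathbb{P}\bigl(\tau_\omega = \infty \mid X(0) = x,\; J(0) = j\bigr).
\]
% The paper's explicit formula concerns these survival probabilities \phi_\omega.
```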

Relevance: 100.00%

Abstract:

In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, a fingermark to be compared against a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be used in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger, and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation that may be more detailed than one obtained by applying rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis that is known to be true; moreover, they should support this hypothesis more and more strongly as information is added in the form of additional minutiae. For the modelling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors that influence these distributions were also analysed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of the minutiae. The results show that likelihood ratios derived from AFIS scores can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that two impressions were left by the same finger, when the impressions actually came from different fingers, is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1000, and 10000 for 6, 7 and 8 minutiae when the two impressions come from the same finger. These likelihood ratios can therefore be an important aid to decision making. Both positive evolutions linked to the addition of minutiae (a drop in the rates of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study. Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found and showed satisfactory results.
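
The core computation can be illustrated with a toy sketch: the thesis estimates score distributions per minutiae configuration, whereas this example simply fits kernel densities to synthetic within- and between-source scores (all names and data below are illustrative):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative sketch, not the thesis' code: estimate a likelihood ratio from
# AFIS comparison scores. `within_scores` come from comparisons of impressions
# known to share a source; `between_scores` from comparisons against a large
# database of other fingers.
rng = np.random.default_rng(0)
within_scores = rng.normal(120, 15, size=300)     # placeholder same-finger scores
between_scores = rng.normal(60, 20, size=10_000)  # placeholder different-finger scores

f_within = gaussian_kde(within_scores)    # numerator density  P(s | same finger)
f_between = gaussian_kde(between_scores)  # denominator density P(s | different fingers)

def likelihood_ratio(score: float) -> float:
    """LR = P(score | same source) / P(score | different source)."""
    return float(f_within(score) / f_between(score))

print(likelihood_ratio(110.0))  # LR >> 1 supports the same-source hypothesis
```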

Relevance: 100.00%

Abstract:

Forest fire sequences can be modelled as a stochastic point process where events are characterized by their spatial locations and occurrence in time. Cluster analysis permits the detection of the space/time pattern distribution of forest fires. These analyses are useful to assist fire managers in identifying risk areas, implementing preventive measures and conducting strategies for an efficient distribution of firefighting resources. This paper aims to identify hot spots in forest fire sequences by means of the space-time scan statistic permutation model (STSSP) and a geographical information system (GIS) for data and results visualization. The scan statistical methodology uses a scanning window, which moves across space and time, detecting local excesses of events in specific areas over a certain period of time. Finally, the statistical significance of each cluster is evaluated through Monte Carlo hypothesis testing. The case study comprises the forest fires registered by the Forest Service in Canton Ticino (Switzerland) from 1969 to 2008. This dataset consists of geo-referenced single events including the location of the ignition points and additional information. The data were aggregated into three sub-periods (considering important preventive legal dispositions) and two main ignition causes (lightning and anthropogenic causes). Results revealed that forest fire events in Ticino are mainly clustered in the southern region where most of the population is settled. Our analysis uncovered local hot spots arising from extemporaneous arson activities. Results regarding the naturally caused fires (lightning fires) disclosed two clusters detected in the northern mountainous area.
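
A highly simplified sketch of the underlying idea, scoring one space-time cylinder and assessing it by Monte Carlo permutation of the time labels (illustrative only; the actual STSSP scans over many candidate windows and uses a likelihood-ratio statistic):

```python
import numpy as np

# Events are (x, y) ignition points with occurrence times t. One cylindrical
# window (spatial centre + radius, time interval [t0, t1]) is scored by its
# event count; significance is assessed by permuting the time labels.
def window_count(xy, t, centre, radius, t0, t1):
    inside_space = np.hypot(xy[:, 0] - centre[0], xy[:, 1] - centre[1]) <= radius
    inside_time = (t >= t0) & (t <= t1)
    return int(np.sum(inside_space & inside_time))

def mc_pvalue(xy, t, centre, radius, t0, t1, n_sim=999, seed=0):
    rng = np.random.default_rng(seed)
    observed = window_count(xy, t, centre, radius, t0, t1)
    exceed = sum(
        window_count(xy, rng.permutation(t), centre, radius, t0, t1) >= observed
        for _ in range(n_sim)
    )
    return (exceed + 1) / (n_sim + 1)  # Monte Carlo p-value

# Toy demo: 200 random events in a 10x10 area observed over 40 time units.
xy = np.random.default_rng(1).uniform(0, 10, size=(200, 2))
t = np.random.default_rng(2).uniform(0, 40, size=200)
print(mc_pvalue(xy, t, centre=(5.0, 5.0), radius=2.0, t0=10.0, t1=20.0))
```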

Relevance: 100.00%

Abstract:

14C dating models are limited when considering recent groundwater, for which the carbon isotopic signature of the total dissolved inorganic carbon (TDIC) is mainly acquired in the unsaturated zone. Reducing dating uncertainties thus requires a better identification of the processes controlling the carbon isotopic composition of the TDIC during groundwater recharge. Geochemical interactions between gas, water and carbonates in the unsaturated zone were investigated for two aquifers (the carbonate-free Fontainebleau sands and the carbonate-bearing Astian sands, France) in order to identify the respective roles of CO2 and carbonates in the carbon isotopic signatures of the TDIC; this question is usually approached in terms of open- or closed-system conditions. Under fully open system conditions, the seasonality of the δ13C values of the soil CO2 can lead to important uncertainties regarding the so-called "initial 14C activity" used in 14C correction models. In a carbonate-bearing unsaturated zone such as that of the Astian aquifer, we show that an approach based on fully open or closed system conditions is not appropriate. Although chemical saturation between water and calcite occurs rapidly, within the first metre of the unsaturated zone, the carbon isotopic compositions (δ13C) of the CO2 and the TDIC evolve downward, affected by the dissolution and precipitation of the carbonates. In this study, we propose a numerical approach to describe this evolution. The δ13C and the 14C activity (A14C) of the TDIC at the base of the carbonate-bearing unsaturated zone depend on (i) the δ13C and the A14C of the TDIC in the soil, determined by the soil CO2, (ii) the water's residence time in the unsaturated zone and (iii) the carbonate precipitation-dissolution fluxes. In this type of situation, the carbonate δ13C-A14C evolutions indicate the presence of secondary calcite and permit the calculation of its accretion flux, equal to ~4.5 ± 0.5 × 10^-9 mol g_rock^-1 yr^-1. More generally, for other sites under a temperate climate with properties similar to those of the Astian sands site, this approach allows a reliable determination of the carbon isotopic composition at the base of the unsaturated zone, the indispensable "input function" of the carbon cycle into the aquifer.
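
To make the role of the "initial 14C activity" concrete, here is a minimal sketch of the basic decay-equation age computation that such correction models feed into (illustrative only, not the paper's numerical model):

```python
import math

# A0 is the "initial 14C activity" acquired in the unsaturated zone -- exactly
# the quantity the paper argues must be determined carefully -- and A the
# measured 14C activity of the TDIC, both in percent modern carbon (pmC).
T_HALF = 5730.0                  # 14C half-life in years (Cambridge value)
LAMBDA = math.log(2) / T_HALF    # decay constant

def c14_age(A: float, A0: float = 100.0) -> float:
    """Apparent groundwater age in years from measured activity A and initial A0."""
    return math.log(A0 / A) / LAMBDA

# Carbonate dissolution dilutes the initial activity: a lower A0 yields a
# shorter apparent age for the same measured A.
print(c14_age(A=85.0, A0=100.0))
print(c14_age(A=85.0, A0=95.0))
```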

Relevance: 100.00%

Abstract:

Despite the tremendous amount of data collected in the field of ambulatory care, political authorities still lack synthetic indicators to provide them with a global view of health services utilization and costs related to various types of diseases. Moreover, public health indicators fail to provide useful information for physicians' accountability purposes. The approach is based on the Swiss context, which is characterized by the highest frequency of medical visits in Europe, the highest rate of growth in care expenditure, poor public information, but a large amount of structured data (a new fee system was introduced in 2004). The proposed conceptual framework is universal and based on descriptors of six entities: general population, people with poor health, patients, services, resources and effects. We show that most conceptual shortcomings can be overcome and that the proposed indicators can be achieved without threatening privacy protection, using modern cryptographic techniques. Twelve indicators are suggested for the surveillance of the ambulatory care system, almost all based on routinely available data: morbidity, accessibility, relevancy, adequacy, productivity, efficacy (from the points of view of the population, people with poor health, and patients), effectiveness, efficiency, health services coverage and financing. The additional costs of this surveillance system should not exceed EUR 2 million per year (EUR 0.3 per capita).

Relevance: 100.00%

Abstract:

BACKGROUND: Although intra-retinal tumor has long been staged presurgically according to the Reese-Ellsworth (R-E) system, retinoblastoma differs from other pediatric neoplasms in never having had a widely accepted classification system that encompasses the entire spectrum of the disease. Comparisons among studies that consider disease extension, risk factors for extra-ocular relapse, and response to therapy require a universally accepted staging system for extra-ocular disease. PROCEDURE: A committee of retinoblastoma experts from large centers worldwide has developed a consensus classification that can encompass all retinoblastoma cases and is presented herein. Patients are classified according to extent of disease and the presence of overt extra-ocular extension. In addition, a proposal for substaging considering histopathological features of enucleated specimens is presented to further discriminate between Stage I and II patients. RESULTS: The following is a summary of the classification system developed. Stage 0: Patients treated conservatively (subject to presurgical ophthalmologic classifications); Stage I: Eye enucleated, completely resected histologically; Stage II: Eye enucleated, microscopic residual tumor; Stage III: Regional extension [(a) overt orbital disease, (b) preauricular or cervical lymph node extension]; Stage IV: Metastatic disease [(a) hematogenous metastasis: (1) single lesion, (2) multiple lesions; (b) CNS extension: (1) prechiasmatic lesion, (2) CNS mass, (3) leptomeningeal disease]. A proposal is also presented for substaging of enucleated Stage I and II eyes. CONCLUSIONS: The proposed staging system is the product of an international effort to adopt a uniform staging system for patients with retinoblastoma to cover the whole spectrum of the disease.
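
Purely as an illustration of the scheme's structure, the staging could be encoded as a small lookup data structure (descriptions abbreviated by us; the substage letters and numbers follow the summary above):

```python
from enum import Enum

# Illustrative encoding of the proposed international retinoblastoma staging.
class RBStage(Enum):
    STAGE_0 = "Treated conservatively (presurgical ophthalmologic classification)"
    STAGE_I = "Eye enucleated, completely resected histologically"
    STAGE_II = "Eye enucleated, microscopic residual tumor"
    STAGE_IIIA = "Regional extension: overt orbital disease"
    STAGE_IIIB = "Regional extension: preauricular or cervical lymph node extension"
    STAGE_IVA1 = "Metastatic: hematogenous metastasis, single lesion"
    STAGE_IVA2 = "Metastatic: hematogenous metastasis, multiple lesions"
    STAGE_IVB1 = "Metastatic: CNS extension, prechiasmatic lesion"
    STAGE_IVB2 = "Metastatic: CNS extension, CNS mass"
    STAGE_IVB3 = "Metastatic: CNS extension, leptomeningeal disease"

print(RBStage.STAGE_II.value)
```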

Relevance: 100.00%

Abstract:

The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. The skiing movement of 10 junior to world-cup athletes was measured under four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of the error) of the system were below 6 ms for cycle duration and ski thrust duration, and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of condition and was accurate enough to detect significant differences reported in previous studies. Since the capture volume is not limited and the setup is simple, the system would be well suited for outdoor measurements on snow.
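
A minimal sketch of how the cycle parameters follow from detected events (assumed inputs: thrust-start timestamps and the along-track displacement at those instants, e.g. integrated from speed; this is not the validated system's code):

```python
import numpy as np

# Placeholder detections from a ski-fixed IMU: times (s) at which successive
# ski thrusts start, and cumulative along-track displacement (m) at those times.
thrust_starts = np.array([0.00, 1.42, 2.85, 4.26])
displacement = np.array([0.0, 4.9, 9.9, 14.8])

cycle_duration = np.diff(thrust_starts)       # s, one value per stride cycle
cycle_length = np.diff(displacement)          # m
cycle_speed = cycle_length / cycle_duration   # m/s

print(cycle_duration, cycle_length, cycle_speed)
```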

Relevance: 100.00%

Abstract:

The increase in seafood production, especially in mariculture worldwide, has brought out the need for continued monitoring of shellfish production areas in order to ensure safety for human consumption. The purpose of this research was to evaluate contamination by pathogenic protozoa, viruses and bacteria in oysters before and after a UV depuration procedure, and in brackish waters at all stages of cultivation and treatment, and to enumerate microbiological indicators of fecal contamination from the production site up to the depuration site in an oyster cooperative located in a Southeastern estuarine area of Brazil. Oysters and brackish water were collected monthly from September 2009 to November 2010. Four sampling sites were selected for enteropathogen analysis: site 1, oyster growth; site 2, catchment water (before the UV depuration procedure); site 3, filtration stage of water treatment (protozoa analysis only); and site 4, the oyster depuration tank. Three microbiological indicators were examined at sites 1, 2 and 4. The following pathogenic microorganisms were sought: Giardia cysts, Cryptosporidium oocysts, Human Adenovirus (HAdV), Hepatitis A virus (HAV), Human Norovirus (HNoV) (genogroups I and II), JC strain Polyomavirus (JCPyV) and Salmonella sp. Analysis consisted of molecular detection (qPCR) for viruses (oyster and water samples); immunomagnetic separation followed by direct immunofluorescence assay for Cryptosporidium oocysts and Giardia cysts, with additional molecular detection (PCR) for the latter (oyster and water samples); and a commercial kit (Reveal, Neogen(R)) for Salmonella analysis (oysters). Giardia was the most prevalent pathogen at all sites where it was detected: in 36.3%, 18.1%, 36.3% and 27.2% of water samples from sites 1, 2, 3 and 4 respectively; 36.3% of oysters from site 1 and 54.5% of depurated oysters were harbouring Giardia cysts. The large majority of contaminated samples were classified as Giardia duodenalis. HAdV was detected in water and oysters from the growth site, and HNoV GI in two batches of oysters (site 1) at very high concentrations (2.11 × 10^13 and 3.10 × 10^12 gc/g). At the depuration tank site, Salmonella sp., HAV (4.84 × 10^3) and HNoV GII (7.97 × 10^14) were each detected once in different batches of oysters. Cryptosporidium spp. oocysts were present in 9.0% of water samples from site 4. These results reflect the contamination of oysters even when UV depuration procedures are employed in this shellfish treatment plant. Moreover, a molecular understanding of the sources of contamination is necessary to develop an efficient management strategy, allied to improved shellfish treatment, to prevent foodborne illnesses.

Relevance: 100.00%

Abstract:

This work discusses behavioural results observed in rats in three different experimental paradigms: the Morris Water Maze (Morris, 1984), the Homing Board (Schenk, 1989) and the Radial Arm Maze (Olton and Samuelson, 1976). The first two are spatial tasks that allow place learning in controlled environments; the third is a behavioural task that contrasts two particular skills, elimination (based on working memory) and selection (based on reference memory). The discussion focuses on the navigation strategies used by the animals to solve the tasks, and more precisely on the factors that can bias the choice of strategy.
The environmental factor (controlled environment) and the cognitive factor (aging) are the variables studied here. Several commonly accepted hypotheses were challenged by our results. First, although space is usually assumed to be homogeneous (all spatial positions presenting the same degree of difficulty in open-field learning), this work establishes that a position associated - without being adjacent - with one of the three visual cues located at the periphery of the environment is more difficult to learn than a configurational position (situated between two of the three cues). Secondly, although it is accepted that place learning in a rich environment requires the same type of information in the Morris water maze (a swimming task) as on the homing board (a walking task), we showed that spatial discrimination in the water maze cannot be supported by the three peripheral visual cues alone and requires the presence of at least one supplementary cue. Finally, aging studies have often shown that age reduces the cognitive capacities required for spatial navigation, leading to a general performance deficit in senescent animals, whereas in our work aged animals were more efficient than adults in a particular food-collecting task. These experiments form part of a general study testing the theoretical model proposed by Jacobs and Schenk (2003), according to which the encoding of the cognitive map (Tolman, 1948; O'Keefe and Nadel, 1978) takes place in the hippocampus through the activity of two complementary modules: the CA3 - Dentate Gyrus module, which processes a spatial frame based on directional and/or gradient-distributed cues (the bearing map), and the CA1 - Subiculum module, which processes local representations based on the relative positions of the fixed elements of the environment (the sketch map).

Relevance: 100.00%

Abstract:

BACKGROUND: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species, because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors was predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. RESULTS: Models built with true absences had the best predictive power and the best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. CONCLUSION: If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
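
A minimal sketch of strategy (b), random pseudo-absences combined with information-theoretic model selection, on synthetic predictors (all names and data are illustrative):

```python
import numpy as np
import statsmodels.api as sm

# Presence points plus background points drawn at random are fed to a
# binomial GLM (logistic regression); AIC is then compared across candidate
# predictor sets for model selection.
rng = np.random.default_rng(42)

n_presence, n_pseudo = 200, 1000
X_presence = rng.normal(1.0, 1.0, size=(n_presence, 3))  # env. predictors at presences
X_pseudo = rng.normal(0.0, 1.0, size=(n_pseudo, 3))      # predictors at random background

X = sm.add_constant(np.vstack([X_presence, X_pseudo]))
y = np.concatenate([np.ones(n_presence), np.zeros(n_pseudo)])

model = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(model.aic)  # compare this AIC against models with other predictor subsets
```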

Relevance: 100.00%

Abstract:

In medical imaging, merging automated segmentations obtained from multiple atlases has become a standard practice for improving segmentation accuracy. In this letter, we propose two new fusion methods: "Global Weighted Shape-Based Averaging" (GWSBA) and "Local Weighted Shape-Based Averaging" (LWSBA). These methods extend the well-known Shape-Based Averaging (SBA) by additionally incorporating the similarity information between the reference (i.e., atlas) images and the target image to be segmented. We also propose a new spatially-varying similarity-weighted neighborhood prior model, and an edge-preserving smoothness term that can be used with many of the existing fusion methods. We first present our new Markov Random Field (MRF) based fusion framework that models the above-mentioned information. The proposed methods were evaluated in the context of segmentation of lymph nodes in 3D CT images of the head and neck, and they produced more accurate segmentations than the existing SBA.
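
A toy sketch of weighted Shape-Based Averaging for binary masks, in the spirit of GWSBA (the weights and function names are ours; the letter's actual framework additionally uses the MRF prior described above):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Each atlas segmentation is converted to a signed distance map; the weighted
# mean distance map is thresholded at zero to obtain the fused segmentation.
def signed_distance(mask: np.ndarray) -> np.ndarray:
    inside = distance_transform_edt(mask)
    outside = distance_transform_edt(~mask)
    return outside - inside  # negative inside the structure, positive outside

def weighted_sba(segmentations, weights):
    maps = [w * signed_distance(s.astype(bool)) for s, w in zip(segmentations, weights)]
    fused = np.sum(maps, axis=0) / np.sum(weights)
    return fused < 0  # fused binary segmentation

# Example: three toy atlas segmentations (discs of different radii) with
# global atlas-to-target similarity weights.
segs = [np.zeros((64, 64), bool) for _ in range(3)]
for s, r in zip(segs, (10, 12, 14)):
    yy, xx = np.ogrid[:64, :64]
    s[(yy - 32) ** 2 + (xx - 32) ** 2 <= r ** 2] = True
print(weighted_sba(segs, weights=[0.5, 0.3, 0.2]).sum())  # fused foreground size
```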

Relevance: 100.00%

Abstract:

The competitiveness of businesses is increasingly dependent on their electronic networks with customers, suppliers, and partners. While the strategic and operational impact of external integration and IOS adoption has been extensively studied, much less attention has been paid to the organizational and technical design of electronic relationships. The objective of our longitudinal research project is the development of a framework for understanding and explaining B2B integration. Drawing on existing literature and empirical cases, we present a reference model (a classification scheme for B2B integration). The reference model comprises technical, organizational, and institutional levels to reflect the multiple facets of B2B integration. In this paper we investigate the current state of electronic collaboration in global supply chains, focusing on the technical level. Using an in-depth case analysis we identify five integration scenarios. In the subsequent confirmatory phase of the research, we analyse 112 real-world company cases to validate these five integration scenarios. Our research advances and deepens existing studies by developing a B2B reference model which reflects the current state of practice and is independent of specific implementation technologies. In the next stage of the research, the emerging reference model will be extended into an assessment model for analysing the maturity level of a given company in a specific supply chain.
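
Purely as an illustration, the reference model's dimensions could be encoded as a small classification data structure (the level names come from the abstract; the scenario labels are placeholders, since the five integration scenarios are not named here):

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative encoding of the B2B reference model's dimensions.
class Level(Enum):
    TECHNICAL = "technical"
    ORGANIZATIONAL = "organizational"
    INSTITUTIONAL = "institutional"

@dataclass
class IntegrationCase:
    company: str
    level: Level
    scenario: int  # 1..5, one of the five identified integration scenarios

# Classifying a (hypothetical) company case against the scheme.
cases = [IntegrationCase("ExampleCo", Level.TECHNICAL, scenario=2)]
print(cases[0])
```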