897 results for "Transformation-based semi-parametric estimators"
Abstract:
The technique to generate transgenic mosquitoes requires adaptation for each target species because of aspects related to species biology, sensitivity to manipulation and rearing conditions. Here we tested different parameters of the microinjection procedure in order to obtain a transgenic Neotropical mosquito species. By using a transposon-based strategy we were able to successfully transform Aedes fluviatilis (Lutz), which can be used as an avian malaria model. These results demonstrate the usefulness of the piggyBac transposable element as a transformation vector for Neotropical mosquito species and open up new research frontiers for South American mosquito vectors.
Abstract:
In the closing years of the 20th century, aluminium was the subject of many extravagant and divergent claims endorsed by authoritative scientists and organisations. In 1986, the PECHINEY company decreed it perpetual, like perpetual motion: "Aluminium is eternal. It can be recycled indefinitely without its properties being altered", which irritated us at the time. Shortly afterwards, in 1990, an equally extravagant and irritating statement by a large environmental organisation, the World Wildlife Fund, decreed that "aluminium recycling is the worst threat to the environment. It must be abandoned." Then, from the end of the 1990s, came the explosion of publications on "sustainable development", a poorly named notion. To "development", a synonym for obligatory growth, we prefer "society" or "human organisation"; and to "durable", the poor French rendering of the English "sustainable", we prefer "supportable": ideally we would have liked to speak of a sustainable society but, to be understood by all, we confine ourselves hereafter to speaking of supportable development. For the most part, these publications acknowledge the very serious defects of the extractive metallurgy of aluminium from ore, and also the extraordinary merits of aluminium recycling, since it accounts for less than 10% of the energy consumption of extractive metallurgy from ore (we shall see that it also accounts for less than 10% of the pollution and of the capital). It is precisely on recycling that the promotional campaigns for beverage packaging are based, in Switzerland in particular. However, the data on aluminium recycling published by the aluminium industry reflect these merits only in part. In the 1970s, the growth rates of recycled production became higher than those of electrolytic production.
By contrast, recycling rates, computed with an identical indicator, are unanimously judged mediocre compared with other materials such as copper and iron. As a component of the aluminium industry, recycling enjoys a favourable image with the general public, demonstrating the success of the communication campaigns. Within the aluminium industry, conversely, its image is poor. The opinions expressed by all the actors (traders, technicians, executives), still gathered during this work, are as follows: a ragman's trade, a wretched trade, a trade with little technical content yet very difficult (did not one aluminium recycler say that his trade was a man's job whereas that of the copper recycler was child's play?). In our view these opinions belong to a bygone past, which they nevertheless faithfully reflect, for recycling is today recognised as a major contribution to the supportable development of aluminium. It is precisely for this reason that, in 2000, the world aluminium industry decided to abandon the qualifier "secondary" previously used to designate recycled metal. It is because of all these discordant and sometimes contradictory data that this work began, encouraged by many prominent figures. Our commitment was unquestionably facilitated by our command of the requisite fields (metallurgy, economics, statistics) and above all by the experience acquired over a professional life conducted on a world scale in (research and development, production), for (research, development, marketing, strategy) and around (marketing, strategy for related products, the ferro-alloys, and for competing products, iron) the aluminium industry.
Our objective is to establish the truth about the recycling of aluminium, a material that contributed very largely to making the 20th century, through a critical review embracing every aspect of this little-known activity; indeed, there is no history of aluminium recycling even though the activity is more than a century old. More than a mere compilation, this critical review was conducted as a scientific, technical, economic, historical and socio-ecological inquiry, bringing out the main facts that have marked the evolution of aluminium recycling. It concludes on the actual state of recycling, which proves globally satisfactory, with its strengths and its weaknesses, and, beyond recycling, on the suitability of aluminium for supportable development, a suitability that is largely insufficient. It therefore suggests study topics for all those (scientists, technicians, historians, economists, jurists) concerned with an industry highly representative of our world in the making, a world in which the place of aluminium will depend on its ability to satisfy the criteria of supportable development. ABSTRACT Owing to recycling, the aluminium industry's global energy and environmental footprints are much lower than those of its extractive metallurgy from ore. Likewise, recycling will allow the complete use of the expected avalanche of old scrap resulting from the dramatic explosion of aluminium consumption since the 1950s. The state of recycling is characterized by: i) raw materials split into two groups. One, new scrap (internal and prompt), proportional to the quantities of semi-finished and finished products, exhibits a fairly good and regular quality. The other, old scrap, proportional to the finished products arriving at their end of life, about 22 years later on average, exhibits a variable quality depending on the collection mode. ii) a poor recycling rate, close to that of steel.
The aluminium industry generates too much new internal scrap and does not collect all the available old scrap: about 50% of it is not recycled (whereas the steel industry recycles about 70% of the old scrap flow). iii) recycling techniques, all based on melting, are well mastered in spite of aluminium's affinity for oxygen and the practical impossibility of purifying aluminium of any impurity. Sorting and initial collection are critical issues before melting. iv) products and markets of recycled aluminium: new scrap has always been recycled in the production lines from which it comes (closed loop). Old scrap, mainly mixed old scrap, was first recycled in different production lines (open loop): steel deoxidation products, followed during the 1930s, with the development of the foundry alloys, by foundry castings whose main market is the automotive industry. During the 1980s, the commercial development of the beverage can in North America permitted the first closed-loop recycling of old scrap, a loop which is still developing. v) an economy with low and erratic margins, because the quotation of electrolytic aluminium fixes both the scrap purchase price and the selling price of recycled aluminium. vi) an industrial organisation historically based on the scrap group and the loop mode. New scrap is recycled either by the transformation industry itself or by the remelter, one component of the recycling industry; old scrap is recycled by the refiner, the other component of the recycling industry. The big companies, the "majors", are often involved in closed-loop recycling and very seldom in open-loop recycling. Today, the aluminium industry's global energy and environmental footprints are still unbearable and the sustainable-development criteria are not fully met. The critical issues for the aluminium industry are to produce better, to consume better and to recycle better in order to become a truly sustainable industry.
Specific issues for recycling are a very efficient recycling industry, a "sustainable development" economy, and a complete collection of old scrap favouring the closed loop. Also, indirectly connected to recycling, are a very efficient transformation industry generating much less new scrap and a finished-products industry delivering only products that fulfil sustainable-development criteria.
Abstract:
We analyzed prospectively 326 laboratory-confirmed, uncomplicated malarial infections (46.3% due to Plasmodium vivax, 35.3% due to P. falciparum, and 18.4% mixed-species infections) diagnosed in 162 rural Amazonians aged 5-73 years. Thirteen symptoms (fever, chills, sweating, headache, myalgia, arthralgia, abdominal pain, nausea, vomiting, dizziness, cough, dyspnea, and diarrhea) were scored using a structured questionnaire. Headache (59.8%), fever (57.1%), and myalgia (48.4%) were the most frequent symptoms. Ninety-six (29.4%) episodes, all of them diagnosed during cross-sectional surveys of the whole study population (96.9% by molecular technique only), were asymptomatic. Of 93 symptom-less infections left untreated, only 10 became symptomatic during the two months following diagnosis. Fever was perceived as "intense" in 52.6% of 230 symptomatic malaria episodes, with no fever reported in 19.1% of episodes although other symptoms were present. We found significant differences in the prevalence and perceived intensity of fever and other clinical symptoms in relation to parasite load at the time of diagnosis and patient's age, cumulative exposure to malaria, recent malaria morbidity, and species of malaria parasite. These factors are all likely to affect the effectiveness of malaria control strategies based on active or passive detection of febrile subjects in semi-immune populations.
Abstract:
This paper presents a hybrid behavior-based scheme using reinforcement learning for high-level control of autonomous underwater vehicles (AUVs). Two main features of the presented approach are hybrid behavior coordination and semi-online neural Q-learning (SONQL). Hybrid behavior coordination takes advantage of the robustness and modularity of the competitive approach as well as the efficient trajectories of the cooperative approach. SONQL, a new continuous approach to the Q-learning algorithm based on a multilayer neural network, is used to learn the behavior state/action mapping online. Experimental results show the feasibility of the presented approach for AUVs.
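As a rough illustration of the Q-learning update that SONQL builds on, here is a minimal sketch in which a linear function approximator stands in for the paper's multilayer neural network; all names and parameter values are hypothetical, not taken from the paper:

```python
import numpy as np

def q_update(w, s, a, r, s_next, actions, alpha=0.1, gamma=0.95):
    """One Q-learning step with a linear approximator Q(s, a) = w[a] @ s.
    Simplified sketch: the paper's SONQL uses a multilayer network and a
    database of learning samples instead of this single gradient step."""
    q_sa = w[a] @ s
    # TD target: immediate reward plus discounted best next-state value
    target = r + gamma * max(w[b] @ s_next for b in actions)
    # move the estimate for the taken action toward the TD target
    w[a] = w[a] + alpha * (target - q_sa) * s
    return w

# toy example: 2 actions, 3-dimensional state vector
w = {0: np.zeros(3), 1: np.zeros(3)}
s, s_next = np.array([1.0, 0.0, 0.5]), np.array([0.0, 1.0, 0.5])
w = q_update(w, s=s, a=0, r=1.0, s_next=s_next, actions=[0, 1])
```

With all weights initially zero, the TD target equals the reward, so only the weights of the taken action move, in the direction of the state vector.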
Abstract:
INTRODUCTION No definitive data are available regarding the value of switching to an alternative TNF antagonist in rheumatoid arthritis patients who fail to respond to the first one. The aim of this study was to evaluate treatment response in a clinical setting based on HAQ improvement and EULAR response criteria in RA patients who were switched to a second or a third TNF antagonist due to failure with the first one. METHODS This was an observational, prospective study of a cohort of 417 RA patients treated with TNF antagonists in three university hospitals in Spain between January 1999 and December 2005. A database was created at the participating centres, with well-defined operational instructions. The main outcome variables were analyzed using parametric or non-parametric tests depending on the level of measurement and distribution of each variable. RESULTS Mean (+/- SD) DAS-28 on starting the first, second and third TNF antagonist was 5.9 (+/- 2.0), 5.1 (+/- 1.5) and 6.1 (+/- 1.1). At the end of follow-up, it decreased to 3.3 (+/- 1.6; Delta = -2.6; p < 0.0001), 4.2 (+/- 1.5; Delta = -1.1; p = 0.0001) and 5.4 (+/- 1.7; Delta = -0.7; p = 0.06). For the first TNF antagonist, DAS-28-based EULAR response level was good in 42% and moderate in 33% of patients. The second TNF antagonist yielded a good response in 20% and no response in 53% of patients, while the third one yielded a good response in 28% and no response in 72%. Mean baseline HAQ on starting the first, second and third TNF antagonist was 1.61, 1.52 and 1.87, respectively. At the end of follow-up, it decreased to 1.12 (Delta = -0.49; p < 0.0001), 1.31 (Delta = -0.21; p = 0.004) and 1.75 (Delta = -0.12; p = 0.1), respectively. Sixty-four percent of patients had a clinically important improvement in HAQ (defined as a decrease of at least 0.22) with the first TNF antagonist and 46% with the second. CONCLUSION A clinically significant effect size was seen in less than half of RA patients cycling to a second TNF antagonist.
Abstract:
PURPOSE: The potential of stem cells (SCs) as a source for cell-based therapy on a wide range of degenerative diseases and damaged tissues such as retinal degeneration has been recognized. Generation of a high number of retinal stem cells (RSCs) in vitro would thus be beneficial for transplantation in the retina. However, as cells in prolonged cultivation may be unstable and thus carry a risk of transformation, it is important to assess the stability of these cells. METHODS: Chromosomal aberrations were analyzed in mouse RSC lines isolated from adult and from postnatal day (PN)1 mouse retinas. Moreover, selected cell lines were tested for anchorage-dependent proliferation, and SCs were transplanted into immunocompromised mice to assess the possibility of transformation. RESULTS: Marked aneuploidy occurred in all adult cell lines, albeit to different degrees, whereas neonatal RSCs were the most stable and displayed a normal karyotype until at least passage 9. Of interest, the level of aneuploidy of adult RSCs did not necessarily correlate with cell transformation. Only the adult RSC lines passaged for longer periods and with a higher dilution ratio underwent transformation. Furthermore, we identified several cell cycle proteins that might support the continuous proliferation and transformation of the cells. CONCLUSIONS: Adult RSCs rapidly accumulated severe chromosomal aberrations during cultivation, which led to cell transformation in some cell lines. The culture condition plays an important role in supporting the selection and growth of transformed cells.
Abstract:
Organ transplantation offers a treatment of choice for patients suffering from end-stage illnesses. The aim of this IRB-approved prospective qualitative study was to analyze patients' psychological concerns from their inclusion on the waiting list for a first organ transplantation (TX) (T1; N=71; kidney, K=30; liver, Li=11; lung, Lu=15; heart, H=15) to six months after TX (T2; N=49; K=15; Li=10; Lu=14; H=10). Semi-structured interviews were conducted at home or in a place selected by the patients. Qualitative pattern analysis (QUAPA) was applied to the verbatim transcriptions. T1 (K) Patients maintained an apparent normality (87%), building emotional protection (23%), and developing a fatalist attitude towards life (43%). (Li) Physical limits were set to spare energy until TX (73%). Illness led to a reevaluation of life values (66%). (Lu) Physical and psychological self-protection was prioritized when health declined (67%). Modified life values, fatalism (33%) and spirituality (27%) were mentioned. (H) Patients husbanded physical (80%) and psychological (67%) resources and self-protection. Modified life values and a fatalist attitude towards life were reported (40%). T2 (K) A new perspective on life was described, with an increase of empathy towards others (20%). (Li) Positive identity and life-value modifications (60%), greater openness towards others, closeness to significant ones (30%) and a more self-centered attitude (30%) prioritizing the essential (20%) were reported. Lack of respect of life values generated anger (40%). (Lu) Setting existential priorities and an increase in spirituality (64%), along with the development of new life values, greater openness to others (57%) and closeness to significant ones (21%) were underlined. Lack of respect of human values induced negative feelings (36%). Self-centered attitudes, setting limits to other people, were mentioned (29%).
(H) Change in life values with setting life priorities was reported (70%) with increase in spirituality, and the lack of respect of life values generated anger (50%). Self-centered attitudes were reported (60%). TX not only comes with positive physical benefits, but also with positive existential values and psychological transformation, and the development of a more altruistic attitude and humanistic values.
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel approach, alternative to geostatistics, to modelling the spatial distribution of petrophysical properties in complex reservoirs. The approach is based on semi-supervised learning, which handles both "labelled" observed data and "unlabelled" data, which have no measured value but describe prior knowledge and other relevant data in the form of manifolds in the input space where the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and describe the stochastic variability and non-uniqueness of spatial properties. At the same time, it is able to capture and preserve key spatial dependencies such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. Semi-supervised SVR, as a data-driven algorithm, is designed to integrate various kinds of conditioning information and learn dependencies from them. The semi-supervised SVR model is able to balance signal/noise levels and control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. The uncertainty of the model is described by the posterior probability of the model parameters that represent key geological properties: spatial correlation size, continuity strength, and smoothness/variability of the spatial property distribution.
The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of the models with different combinations of unknown parameters and discusses sensitivity issues.
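The MCMC-based inference step can be sketched generically with a random-walk Metropolis sampler. The toy posterior below is a stand-in for a real production-history misfit, and all names and values are hypothetical:

```python
import numpy as np

def metropolis(log_post, x0, n_steps=5000, step=0.5, seed=1):
    """Random-walk Metropolis sampler over one model parameter.
    Generic sketch of MCMC posterior sampling, not the authors' code."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + step * rng.normal()          # propose a local move
        lp_new = log_post(x_new)
        if np.log(rng.uniform()) < lp_new - lp:  # accept with MH probability
            x, lp = x_new, lp_new
        samples.append(x)
    return np.array(samples)

# toy "history match": posterior of a single parameter (e.g. a correlation
# length) given one noisy observation; obs and sigma are invented numbers
obs, sigma = 2.0, 0.5
log_post = lambda theta: -0.5 * ((theta - obs) / sigma) ** 2
chain = metropolis(log_post, x0=0.0)
```

After burn-in, the chain concentrates around the observation, and quantiles of the retained samples give the kind of uncertainty envelope described above.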
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS We propose a novel combination of feature extraction techniques to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small-sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based metrics). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be a valid solution for the presented problem.
One of the advances is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also, in combination with NMSE and PLS, makes the variation of this rate more stable. Another advance is its generalization ability, since the experiments were performed on two image modalities (SPECT and PET).
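As a loose illustration of the reduce-then-classify pipeline described above, here is a minimal numpy sketch in which PCA stands in for the paper's PLS/LMNN reduction and a nearest-mean rule stands in for the SVM; the data and all names are synthetic:

```python
import numpy as np

def pca_reduce(X, k):
    """Project features onto the top-k principal components
    (a simplified stand-in for the paper's PLS/LMNN reduction)."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k], mean

def nearest_mean_fit(Z, y):
    # one centroid per class in the reduced space
    return {c: Z[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_mean_predict(Z, centroids):
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(Z - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

# toy data: two well-separated classes in a 10-dimensional feature space
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 10)), rng.normal(3, 1, (40, 10))])
y = np.array([0] * 40 + [1] * 40)

Z, _, _ = pca_reduce(X, k=2)
pred = nearest_mean_predict(Z, nearest_mean_fit(Z, y))
accuracy = (pred == y).mean()
```

The point of the sketch is the shape of the pipeline (mask/feature selection, then dimension reduction, then a classifier in the reduced space), not the specific methods, which differ from the paper's.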
Abstract:
The Global Program for the Elimination of Lymphatic Filariasis (GPELF) aims to eliminate this disease by the year 2020. However, the development of more specific and sensitive tests is important for the success of the GPELF. The present study aimed to standardise polymerase chain reaction (PCR)-based systems for the diagnosis of filariasis in serum and urine. Twenty paired biological urine and serum samples from individuals already known to be positive for Wuchereria bancrofti were collected during the day. Conventional PCR and semi-nested PCR assays were optimised. The detection limit of the technique for purified W. bancrofti DNA extracted from adult worms was 10 fg for the internal system (WbF/Wb2) and 0.1 fg using semi-nested PCR. The specificity of the primers was confirmed experimentally by amplification of 1 ng of purified genomic DNA from other species of parasites. Evaluation of the paired urine and serum samples by the semi-nested PCR technique indicated that only two of the 20 tested individuals were positive, whereas the simple internal PCR system (WbF/Wb2), which has highly promising performance, revealed that all the patients were positive using both sample types. This study successfully demonstrated the possibility of using the PCR technique on urine for the diagnosis of W. bancrofti infection.
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
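A minimal sketch of the ilr-plus-Gaussian-kernel idea follows; for simplicity a scalar bandwidth stands in for the full bandwidth matrix of the cited method, and all names are hypothetical:

```python
import numpy as np

def ilr(x):
    """Isometric log-ratio transform of one composition x (parts sum to 1),
    using pivot (Helmert-type) coordinates: D parts -> D-1 real coordinates."""
    x = np.asarray(x, dtype=float)
    D = x.size
    z = np.empty(D - 1)
    for i in range(1, D):
        g = np.exp(np.log(x[:i]).mean())          # geometric mean of first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * np.log(g / x[i])
    return z

def kde(points, grid, h=0.3):
    """Gaussian kernel density estimate in ilr coordinates with a scalar
    bandwidth h (the cited method uses a full bandwidth matrix instead)."""
    d = points.shape[1]
    diffs = grid[:, None, :] - points[None, :, :]
    k = np.exp(-0.5 * (diffs ** 2).sum(-1) / h ** 2)
    return k.sum(1) / (len(points) * (2 * np.pi * h ** 2) ** (d / 2))
```

The estimate is computed in the unconstrained ilr space, where the normal kernel is well defined; densities can then be mapped back to the simplex if needed.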
Abstract:
Summary
Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that if the data follow the model, then asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performances of the WMLs based on each of them are studied. It results that in a great variety of situations the WML substantially improves the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Besides, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviors. Two examples of application of the WML to real data are considered.
In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
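The two-phase idea can be sketched for a Poisson model as follows. This is a deliberately simplified stand-in: the sample median replaces the minimum disparity estimator of phase one, and hard 0/1 weights replace the adaptive weighting; all names and data are invented:

```python
import numpy as np

def robust_initial(x):
    """Phase 1: a crude but highly robust initial estimate of the Poisson
    mean (here simply the median; the thesis uses a minimum disparity
    estimator instead)."""
    return float(np.median(x))

def weighted_mle(x, lam0, cut=3.0):
    """Phase 2: downweight observations implausible under the initial fit,
    then compute a weighted maximum likelihood estimate of the mean."""
    x = np.asarray(x, dtype=float)
    # Pearson-type residual under the initial Poisson fit
    resid = np.abs(x - lam0) / np.sqrt(max(lam0, 1e-9))
    w = (resid <= cut).astype(float)   # hard 0/1 weights for simplicity
    # for the Poisson mean, the weighted MLE is the weighted sample mean
    return float((w * x).sum() / w.sum())

sample = np.array([3, 4, 2, 5, 3, 4, 3, 2, 4, 100])  # one gross outlier
lam_wml = weighted_mle(sample, robust_initial(sample))
```

On this toy sample the plain MLE (the mean, 13.0) is wrecked by the single outlier, while the two-phase estimate stays close to the bulk of the data.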
Abstract:
The "Europeanization" of non-EU countries' laws is predominantly seen as an "export" of the EU acquis, especially in the case of so-called "quasi-member" states such as Switzerland. Based on an examination of the Swiss experience, this paper highlights the flaws of this conceptualization: the Europeanization of Swiss Law is a highly differentiated phenomenon, encompassing several forms of approximation to EU Law. All of these forms fall short of an "export" of norms, and result in the creation of something new: a "Europeanized law" that is similar to, but qualitatively different from, EU Law. Another drawback of the "export" metaphor is the emphasis it places on the isomorphism of positive legislation. Europeanization goes deeper than that. As shown in this paper, it is a process of transformation involving not only positive law, but also legal thinking. The Swiss case demonstrates how significant such deeper transformations can be: the Europeanization of positive law has induced an alteration of the traditional canon of legal interpretation. It also demonstrates how problematic such transformations can be: the above-mentioned alteration has not given rise to a new and universally accepted canon of interpretation. This reflects the tension between the need for clear "rules of reference" for EU legal materials - which are required in order to restore coherence and predictability to an extensively Europeanized legal system - and the reluctance to give a legal value to foreign legal materials - which is rooted in a traditional understanding of the concept of "law". Such tension, in turn, shows what deep and difficult transformations are required in order to establish a viable model of legal integration outside supranational structures.
Abstract:
This is an exploratory and descriptive study with a quantitative approach that aimed to understand the social production and reproduction processes of women working at university restaurants and the occurrence and the magnitude of gender-based violence committed against them by their intimate partners. The data were collected through semi-structured interviews. The analysis categories used were social production and reproduction, gender and gender-based violence. The interviewees held a subordinate social position during the productive and reproductive periods of their lives. Approximately 70% reported having experienced gender-based violence from an intimate partner (66% psychological violence, 36.3% physical violence and 28.6% sexual violence). Most of the health problems resulting from violence were related to mental health. The results indicate that the situation requires immediate interventions, mostly guided by the instrumentalization of these women and the support by the state and the university as appropriate to address violence.
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.