945 results for "mean field independent component analysis"
Abstract:
We extend the relativistic mean field theory model of Sugahara and Toki by adding new couplings suggested by modern effective field theories. An improved set of parameters is developed with the goal of testing the ability of models based on effective field theory to describe the properties of finite nuclei and, at the same time, to be consistent with the trends of Dirac-Brueckner-Hartree-Fock calculations at densities away from the saturation region. We compare our calculations with other relativistic nuclear force parameters for various nuclear phenomena.
Abstract:
With China's rapid economic development over recent decades, the national demand for livestock products has quadrupled within the last 20 years. Most of that increase in demand has been met by subsidized, industrialized production systems, while millions of smallholders, who still provide the larger share of livestock products in the country, have been neglected. Fostering those systems would help China to reduce its strong urban migration streams, enhance the livelihood of the poorer rural population and provide environmentally safe livestock products, which have a good chance of satisfying customers' demand for ecological food. Despite their importance, China's smallholder livestock keepers have not yet gained appropriate attention from governmental authorities and researchers. However, a profound analysis of those systems is required so that adequate support can lead to better resource utilization and productivity in the sector. To this aim, this pilot study analyzes smallholder livestock production systems in Xishuangbanna, located in southern China. The area is bordered by Laos and Myanmar and geographically counts as a tropical region. Its climate is characterized by dry, temperate winters and hot summers with monsoon rains from May to October. While the region is flat, at about 500 m above sea level (asl) in the south, outliers of the Himalaya mountains reach into the north of Xishuangbanna, where the highest peak reaches 2400 m asl. Except for one larger city, Jinghong, Xishuangbanna is mainly covered by tropical rainforest, areas under agricultural cultivation and villages. The major income is generated through inner-Chinese tourism and agricultural production. Intensive rubber plantations are distinctive for the lowland plains, while small-scale traditional farms are scattered in the montane regions.
In order to determine the current state and possible future chances of smallholder livestock production in that region, this study analyzed the current status of the smallholder livestock sector in the Naban River National Nature Reserve (NRNNR), an area which is largely representative of the whole prefecture. It covers an area of about 50 square kilometers and reaches from 470 up to 2400 m asl. About 5500 inhabitants of different ethnic origin live in 24 villages. All data were collected between October 2007 and May 2010. Three major objectives were addressed in the study: 1. Classifying existing pig production systems and exploring respective pathways for development; 2. Quantifying the performance of pig breeding systems to identify bottlenecks for production; 3. Analyzing past and current buffalo utilization to determine the chances and opportunities of buffalo keeping in the future. In order to classify the different pig production systems, a baseline survey (n=204, stratified cluster sampling) was carried out to gain data about livestock species, numbers, management practices, cultivated plant species and field sizes as well as socio-economic characteristics. Sampling included two clusters at village level (altitude, ethnic affiliation), resulting in 13 clusters of which 13-17 farms were interviewed respectively. Categorical Principal Component Analysis (CatPCA) and a two-step clustering algorithm were applied to identify determining farm characteristics and assort the recorded households into classes of livestock production types. The variables keep_sow_yes/no, TLU_pig, TLU_buffalo, size_of_corn_fields, altitude_class, size_of_tea_plantation and size_of_rubber_field were found to be the major determinants for the characterization of the recorded farms.
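CatPCA and two-step clustering are SPSS procedures and are not reproduced here, but the typology idea they implement — standardize mixed farm indicators, reduce to a few components, then cluster households — can be sketched with ordinary PCA and k-means in NumPy. The indicator values and group structure below are invented for illustration:

```python
import numpy as np

def pca_scores(X, k):
    """Standardize columns, then project onto the first k principal components."""
    Z = (X - X.mean(0)) / X.std(0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T

def kmeans(X, k, iters=50):
    """Plain k-means with deterministic, spread-out initial centers."""
    centers = X[:: max(1, len(X) // k)][:k]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels

# Invented indicators per farm: [TLU_pig, TLU_buffalo, corn_field_size, altitude]
rng = np.random.default_rng(1)
farms = np.vstack([
    rng.normal([5.0, 0.0, 1.0, 500.0], 0.5, (30, 4)),   # "pig based"-like
    rng.normal([2.0, 2.0, 4.0, 1500.0], 0.5, (30, 4)),  # "livestock-corn"-like
    rng.normal([1.0, 0.0, 0.5, 600.0], 0.5, (30, 4)),   # "rubber based"-like
])
labels = kmeans(pca_scores(farms, 2), 3)
```

With well-separated synthetic groups, the three planted farm types come back as three clusters; real survey data with categorical variables would need an optimal-scaling step (the "Cat" in CatPCA) first.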
All farms practice extensive or semi-intensive livestock production; pigs and buffaloes are the predominant livestock species, while chickens and aquaculture are present but play subordinate roles for livelihoods. All pig raisers rely on a single local breed, known in the region as the Small Ear Pig (SMEP). Three major production systems were identified: livestock-corn based (LB; 41%), rubber based (RB; 39%) and pig based (PB; 20%) systems. RB farms earn high income from rubber and fatten 1.9 ±1.80 pigs per household (HH), often using pig feed purchased at markets. PB farms own similarly sized rubber plantations and raise 4.7 ±2.77 pigs per HH, with fodder mainly cultivated or collected in the forest. LB farms grow corn, rice and tea and keep 4.6 ±3.32 pigs per HH, also fed with collected and cultivated fodder. Only 29% of all pigs were marketed (LB: 20%; RB: 42%; PB: 25%); average annual mortality was 4.0 ±4.52 pigs per farm (LB: 4.6 ±3.68; RB: 1.9 ±2.14; PB: 7.1 ±10.82). Pig feed mainly consists of banana pseudo stem, corn and rice husks and is prepared in batches about two to three times per week. Such fodder may be sufficient in energy content but lacks an appropriate content of protein. Pigs therefore suffer from malnutrition, which becomes most critical in the time before the harvest season around October. Farmers reported high occurrences of gastrointestinal parasites in carcasses, and pig stables were often wet and filled with manure. Deficits in nutritional and hygienic management are the major limits for development and should be the first issues addressed to improve productivity. SME pork was found to be known and preferred by local customers in town and by richer lowland farmers. However, high prices and lacking availability of SME pork at local wet markets were the reasons limiting purchase.
If major management constraints are overcome, pig breeders (PB and LB farms) could increase the share of pigs marketed in town and provide fatteners to richer RB farmers. RB farmers are interested in fattening pigs for home consumption but do not show any motivation for commercial pig raising. To determine the productivity of input factors in pig production, reproductive performance, feed quality and quantity, as well as the weight development of pigs under current management, were recorded. The data collection included a progeny history survey covering 184 sows and 437 farrows, bi-weekly weighing of 114 pigs during a 16-month time span on 21 farms (10 LB and 11 PB), as well as the daily recording of the feed quality and quantity given to a defined number of pigs on the same 21 farms. Feed samples of all recorded ingredients were analyzed for their respective nutrient content. Since no literature values on the digestibility of banana pseudo stem – which is a major ingredient of traditional pig feed in NRNNR – were found, a cross-sectional digestibility trial with 2x4 pigs was conducted at a station in the research area. With the aid of the PRY Herd Life Model, all data were used to determine the systems' current (Status Quo = SQ) output and the productivity of the input factor "feed" in terms of saleable live weight per kg DM feed intake and monetary value of output per kg DM feed intake. Two improvement scenarios were simulated, assuming 1) that farmers adopt a culling management that generates the highest output per unit input (Scenario 1; SC I) and 2) that, through improved feeding, selected parameters of reproduction are improved by 30% (SC II). Daily weight gain averaged 55 ± 56 g per day between day 200 and 600. The average energy content of the traditional feed mix was 14.92 MJ ME. Age at first farrowing averaged 14.5 ± 4.34 months; the subsequent inter-farrowing interval was 11.4 ± 2.73 months.
Litter size was 5.8 piglets and weaning age was 4.3 ± 0.99 months. 18% of piglets died before weaning. Simulating pig production at its actual status, it was shown that the monetary return on input (ROI) is negative (1:0.67), but improved (1:1.2) when culling management was optimized so that the highest output is gained per unit of feed input. If, in addition, better feeding, controlled mating and better resale prices at fixed dates were simulated, ROI further increased to 1:2.45, 1:2.69, 1:2.7 and 1:3.15 for the four respective grower groups. Those findings show the potential of pork production if basic measures of improvement are applied. Future exploration of the environment, including climate, market season and culture, is required before implementing the recommended measures, to ensure the sustainable development of a more effective and resource-conserving pork production in the future. The two studies have shown that the production of local SME pigs plays an important role on traditional farms in NRNNR but that basic constraints are limiting their productivity. However, relatively simple approaches are sufficient to reach a notable improvement. There is also a demand for more SME pork on local markets and, once basic constraints have been overcome, pig farmers could turn into more commercial producers and provide pork to local markets. By that, environmentally safe meat can be offered to sensitive consumers while farmers increase their income and lower the risk of external shocks through a more diverse income-generating strategy. Buffaloes were found to be the second most important livestock species on NRNNR farms. While they have been a core resource of mixed smallholder farms in the past, the expansion of rubber tree plantations and agricultural mechanization are the reasons for decreased swamp buffalo numbers today.
The third study seeks to predict the future utilization of buffaloes on different farm types in NRNNR by analyzing the dynamics of the buffalo population and land use changes over time and by calculating the labor required for keeping buffaloes in view of the traction power that can be utilized for field preparation. The use of buffaloes for field work and the recent development of the regional buffalo population were analyzed through interviews with 184 farmers in 2007/2008 and discussions with 62 buffalo keepers in 2009. While pig based farms (PB; n=37) have abandoned buffalo keeping, 11% of the rubber based farms (RB; n=71) and 100% of the livestock-corn based farms (LB; n=76) kept buffaloes in 2008. Herd size was 2.5 ±1.80 (n=84) buffaloes in early 2008 and 2.2 ±1.69 (n=62) in 2009. Field work on a farm's own land was the main reason for keeping buffaloes (87.3%), but lending work buffaloes to neighbors (79.0%) was also important. Other purposes were the transport of goods (16.1%), buffalo trade (11.3%) and meat consumption (6.4%). Buffalo care required 6.2 ±3.00 working hours daily, while the annual working time of a buffalo was 294 ±216.6 hours. The area ploughed with buffaloes remained constant during the past 10 years despite an expansion of the land cropped per farm. Further rapid replacement of buffaloes by tractors is expected in the near future. While the work economy is drastically improved by the use of tractors, buffaloes can still provide a cheap work force and serve as a buffer against economic shocks on poorer farms. Especially poor farms, which lack alternative assets that could quickly be liquidated in times of urgent need for cash, should not abandon buffalo keeping. Livestock was found to be a major part of the small mixed farms in NRNNR. The general productivity was low in both analyzed species, buffaloes and pigs.
Productivity of pigs can be improved through basic adjustments in feeding, reproductive and hygienic management, and with external support pig production could be further commercialized to provide pork and weaners to local markets and fattening farms. Buffalo production is relatively time intensive and will in the future only be of importance to very poor farms and to farms that cultivate very small terraces on steep slopes. These should be encouraged to continue keeping buffaloes. With such measures, livestock production in NRNNR has good chances of staying competitive in the future.
Abstract:
This paper presents a new paradigm for signal reconstruction and superresolution, Correlation Kernel Analysis (CKA), that is based on the selection of a sparse set of bases from a large dictionary of class-specific basis functions. The basis functions that we use are the correlation functions of the class of signals we are analyzing. To choose the appropriate features from this large dictionary, we use Support Vector Machine (SVM) regression and compare this to traditional Principal Component Analysis (PCA) for the tasks of signal reconstruction, superresolution, and compression. The testbed we use in this paper is a set of images of pedestrians. This paper also presents results of experiments in which we use a dictionary of multiscale basis functions and then use Basis Pursuit De-Noising to obtain a sparse, multiscale approximation of a signal. The results are analyzed and we conclude that 1) when used with a sparse representation technique, the correlation function is an effective kernel for image reconstruction and superresolution, 2) for image compression, PCA and SVM have different tradeoffs, depending on the particular metric that is used to evaluate the results, 3) in sparse representation techniques, L_1 is not a good proxy for the true measure of sparsity, L_0, and 4) the L_epsilon norm may be a better error metric for image reconstruction and compression than the L_2 norm, though the exact psychophysical metric should take into account high order structure in images.
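The correlation-kernel dictionary and SVM feature selection of CKA are not reproduced here, but the PCA baseline the paper compares against is easy to sketch: reconstruct each signal from a truncated principal-component basis. A minimal NumPy illustration on synthetic smooth signals (all data invented):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 128)
# A signal class spanned by three smooth basis functions, plus small noise
smooth = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), np.sin(4 * np.pi * t)])
X = rng.normal(size=(200, 3)) @ smooth + 0.01 * rng.normal(size=(200, 128))

# PCA reconstruction: keep k components of the centered data
mean = X.mean(0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3
X_hat = mean + (X - mean) @ Vt[:k].T @ Vt[:k]
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

Because the class is intrinsically 3-dimensional, three components recover the signals up to the noise floor; sparse dictionary methods aim at the same effect without a single global subspace.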
Abstract:
At CoDaWork'03 we presented work on the analysis of archaeological glass compositional data. Such data typically consist of geochemical compositions involving 10-12 variables and approximate completely compositional data if the main component, silica, is included. We suggested that what has been termed `crude' principal component analysis (PCA) of standardized data often identified interpretable pattern in the data more readily than analyses based on log-ratio transformed data (LRA). The fundamental problem is that, in LRA, minor oxides with high relative variation, that may not be structure carrying, can dominate an analysis and obscure pattern associated with variables present at higher absolute levels. We investigate this further using subcompositional data relating to archaeological glasses found on Israeli sites. A simple model for glass-making is that it is based on a `recipe' consisting of two `ingredients', sand and a source of soda. Our analysis focuses on the sub-composition of components associated with the sand source. A `crude' PCA of standardized data shows two clear compositional groups that can be interpreted in terms of different recipes being used at different periods, reflected in absolute differences in the composition. LRA analysis can be undertaken either by normalizing the data or defining a `residual'. In either case, after some `tuning', these groups are recovered. The results from the normalized LRA are differently interpreted as showing that the source of sand used to make the glass differed. These results are complementary. One relates to the recipe used. The other relates to the composition (and presumed sources) of one of the ingredients. It seems to be axiomatic in some expositions of LRA that statistical analysis of compositional data should focus on relative variation via the use of ratios. Our analysis suggests that absolute differences can also be informative.
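The contrast drawn above — `crude' PCA of standardized compositions versus PCA after a log-ratio transform (here the centered log-ratio, clr) — can be sketched in a few lines of NumPy; the Dirichlet toy data are invented:

```python
import numpy as np

def pca_scores(Z, k=2):
    """PCA scores of already-transformed data (column-centered internally)."""
    Zc = Z - Z.mean(0)
    _, _, Vt = np.linalg.svd(Zc, full_matrices=False)
    return Zc @ Vt[:k].T

def clr(X):
    """Centered log-ratio transform of compositions (rows sum to 1)."""
    L = np.log(X)
    return L - L.mean(1, keepdims=True)

rng = np.random.default_rng(0)
comp = rng.dirichlet([8.0, 4.0, 2.0, 1.0, 0.5], size=100)  # 5-part compositions

crude_scores = pca_scores((comp - comp.mean(0)) / comp.std(0))  # `crude' PCA
lra_scores = pca_scores(clr(comp))                              # log-ratio PCA
```

The two score sets weight variables differently: standardization gives every oxide equal variance, while the clr emphasizes relative variation, which is exactly the tension the abstract discusses.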
Abstract:
In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and evaluating the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used and the cosine-theta coefficient served as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to open questions. In this paper we follow the lines of a paper by R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transformation of the components and visualization of the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep sea sediments
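The ilr transform mentioned above maps a D-part composition to D−1 orthonormal log-ratio coordinates. A minimal sketch using the standard pivot-balance basis (any orthonormal basis of the clr plane would serve equally well):

```python
import numpy as np

def ilr(X):
    """Isometric log-ratio transform: D-part compositions -> D-1 coordinates,
    using pivot balances sqrt(i/(i+1)) * log(gmean(x_1..x_i) / x_{i+1})."""
    X = np.asarray(X, dtype=float)
    L = np.log(X)
    D = X.shape[1]
    coords = np.empty((X.shape[0], D - 1))
    for i in range(1, D):
        gmean_log = L[:, :i].mean(axis=1)  # log of geometric mean of first i parts
        coords[:, i - 1] = np.sqrt(i / (i + 1)) * (gmean_log - L[:, i])
    return coords

parts = np.array([[0.5, 0.3, 0.1, 0.1],
                  [0.2, 0.2, 0.4, 0.2]])
z = ilr(parts)
```

The transform is scale-invariant, so closing the rows to sum 1 beforehand is optional; factor extraction and plotting versus depth then proceed in ordinary Euclidean coordinates.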
Abstract:
In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from West to East (Garzanti et al., 2004, 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin, longitudinally parallel to the South-Alpine belt, by a trunk river (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river was at this time shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, while glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: detrital modes; modern sands; provenance; principal component analysis; similarity; Canberra distance; palaeodrainage
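The similarity analysis in this key-word list rests on the Canberra distance; since its handling of zero pairs is a common stumbling block, here is a small reference implementation (the convention that 0/0 terms contribute nothing is an assumption matching common practice, e.g. SciPy's):

```python
import numpy as np

def canberra(u, v):
    """Canberra distance: sum_i |u_i - v_i| / (|u_i| + |v_i|), skipping 0/0 terms."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    denom = np.abs(u) + np.abs(v)
    mask = denom > 0  # drop components where both entries are zero
    return float((np.abs(u - v)[mask] / denom[mask]).sum())

d = canberra([1.0, 0.0, 2.0], [1.0, 0.0, 0.0])
```

Each component is normalized by its own magnitude, which is why the metric suits detrital modes where minor minerals would otherwise be swamped by abundant ones.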
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made in which the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
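One concrete instance of the power-transformation parametrization alluded to is the Box-Cox family, which connects analyses of raw values (α = 1) continuously to log-based analyses (α → 0) — the mechanism behind "movies" that morph PCA into logratio analysis. A quick numerical check of that limit:

```python
import numpy as np

def power_transform(x, alpha):
    """Box-Cox family: (x**alpha - 1) / alpha, tending to log(x) as alpha -> 0."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if alpha == 0 else (x ** alpha - 1.0) / alpha

x = np.linspace(0.5, 2.0, 9)
# Gap to log(x) shrinks monotonically as alpha approaches 0
gaps = [float(np.max(np.abs(power_transform(x, a) - np.log(x))))
        for a in (1.0, 0.5, 0.1, 0.01)]
```

Stepping α through a fine grid and re-running the chosen ordination at each step produces exactly the frame-by-frame interpolation described.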
Abstract:
Background: Interest in autoimmune diseases (AD) and their outcome in the intensive care unit (ICU) has increased because of the clinical challenge they pose for diagnosis and management, with ICU mortality fluctuating between 17 and 55%. The following work represents our group's one-year experience at a third-level hospital. Objective: To identify mortality-associated factors particular to patients with autoimmune diseases admitted to the ICU of a third-level hospital in Bogotá, Colombia. Methods: Principal component analysis based on the multivariate descriptive method, together with multiple correspondence analysis, was used to group several related variables with significant association and a common clinical context. Results: Fifty adult patients with AD, with a mean age of 46.7 ± 17.55 years, were evaluated. The two most common diagnoses were systemic lupus erythematosus and systemic sclerosis, with a frequency of 45% and 20% of the patients, respectively. The main cause of ICU admission was infection, followed by acute activity of the AD, at 36% and 24% respectively. Mortality during the ICU stay was 24%. Length of hospitalization before ICU admission, shock, vasopressor support, mechanical ventilation, abdominal sepsis, low Glasgow score and plasmapheresis were factors associated with mortality. Two phenotypes of variables were defined, related to time in the ICU and to ICU support measures, which were associated with survival and mortality. Conclusions: The identification of individual factors and groups of factors by means of principal component analysis will allow the early and aggressive implementation of therapeutic measures in patients with AD in the ICU in order to avoid fatal outcomes.
Abstract:
Ozone and temperature profiles from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) have been assimilated, using three-dimensional variational assimilation, into a stratosphere-troposphere version of the Met Office numerical weather-prediction system. Analyses are made for the month of September 2002, when there was an unprecedented split in the southern hemisphere polar vortex. The analyses are validated against independent ozone observations from sondes, limb-occultation and total column ozone satellite instruments. Through most of the stratosphere, precision varies from 5 to 15%, and biases are 15% or less of the analysed field. Problems remain in the vortex and below the 60 hPa level, especially at the tropopause, where the analyses have too much ozone and poor agreement with independent data. Analysis problems are largely a result of the model rather than the data, giving confidence in the MIPAS ozone retrievals, though there may be a small high bias in MIPAS ozone in the lower stratosphere. Model issues include an excessive Brewer-Dobson circulation, which results both from known problems with the tracer transport scheme and from the data assimilation of dynamical variables. The extreme conditions of the vortex split reveal large differences between existing linear ozone photochemistry schemes. Despite these issues, the ozone analyses are able to successfully describe the ozone hole split and compare well to other studies of this event. Recommendations are made for the further development of the ozone assimilation system.
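For readers unfamiliar with three-dimensional variational assimilation: for a linear observation operator H the analysis has the closed form x_a = x_b + BHᵀ(HBHᵀ + R)⁻¹(y − Hx_b), the minimizer of the 3D-Var cost function. A toy NumPy sketch (all state values and covariances invented; the Met Office system itself minimizes the cost function iteratively at far higher dimension):

```python
import numpy as np

def var3d(xb, B, y, H, R):
    """Minimizer of J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (y - H @ xb)

xb = np.array([1.0, 0.0, -1.0])        # background (model) state
B = 0.5 * np.eye(3)                    # background error covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])        # observe components 0 and 1 only
R = 0.1 * np.eye(2)                    # observation error covariance
y = np.array([2.0, 1.0])               # observations
xa = var3d(xb, B, y, H, R)
```

With a diagonal B the unobserved third component keeps its background value; in a real system the off-diagonal structure of B is what spreads observational information to unobserved variables and levels.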
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there exists little comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1-D space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2-D network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
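The Hilbert-curve index at the heart of this comparison can be computed with the well-known bitwise quadrant-rotation algorithm; a compact sketch for 2-D grids (in the landmark setting each latency coordinate would first be quantized to the grid):

```python
def hilbert_index(order, x, y):
    """Map cell (x, y) of a 2**order x 2**order grid to its Hilbert-curve index,
    so that consecutive indices always correspond to adjacent cells."""
    n = 1 << order
    d = 0
    s = n >> 1
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                     # rotate/flip the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s >>= 1
    return d
```

For a first-order curve the cells are visited in the order (0,0), (0,1), (1,1), (1,0); the adjacency of consecutive indices at every order is exactly the locality-preservation property the paper evaluates.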
Abstract:
The definition and interpretation of the Arctic oscillation (AO) are examined and compared with those of the North Atlantic oscillation (NAO). It is shown that the NAO reflects the correlations between the surface pressure variability at its centers of action, whereas this is not the case for the AO. The NAO pattern can be identified in a physically consistent way in principal component analysis applied to various fields in the Euro-Atlantic region. A similar identification is found in the Pacific region for the Pacific–North American (PNA) pattern, but no such identification is found here for the AO. The AO does reflect the tendency for the zonal winds at 35° and 55°N to anticorrelate in both the Atlantic and Pacific regions associated with the NAO and PNA. Because climatological features in the two ocean basins are at different latitudes, the zonally symmetric nature of the AO does not mean that it represents a simple modulation of the circumpolar flow. An increase in the AO or NAO implies strong, separated tropospheric jets in the Atlantic but a weakened Pacific jet. The PNA has strong related variability in the Pacific jet exit, but elsewhere the zonal wind is similar to that related to the NAO. The NAO-related zonal winds link strongly through to the stratosphere in the Atlantic sector. The PNA-related winds do so in the Pacific, but to a lesser extent. The results suggest that the NAO paradigm may be more physically relevant and robust for Northern Hemisphere variability than is the AO paradigm. However, this does not disqualify many of the physical mechanisms associated with annular modes for explaining the existence of the NAO.
Abstract:
This study clarifies the taxonomic status of Anemone coronaria and segregates the species and A. coronaria infraspecific variants using morphological and morphometric analyses. Principal component analysis of the coronaria group was performed on 25 quantitative and qualitative characters, and morphometric analysis of the A. coronaria infraspecific variants was performed on 21 quantitative and qualitative characters. The results showed that the A. coronaria group clustered into four major groups: A. coronaria L., A. biflora DC, A. bucharica (Regel) Juz. ex Komarov, and a final group including A. eranthioides Regel and A. tschernjaewii Regel. The data on the A. coronaria infraspecific variants clustered into six groups: A. coronaria L. var. coronaria L., var. cyanea Ard., var. albiflora Rouy & Fouc., var. parviflora Regel, var. ventreana Ard., and var. rissoana Ard. © 2007 The Linnean Society of London
Abstract:
Baking and 2-g mixograph analyses were performed for 55 cultivars (19 spring and 36 winter wheat) from various quality classes from the 2002 harvest in Poland. An instrumented 2-g direct-drive mixograph was used to study the mixing characteristics of the wheat cultivars. A number of parameters were extracted automatically from each mixograph trace and correlated with baking volume and flour quality parameters (protein content and high molecular weight glutenin subunit [HMW-GS] composition by SDS-PAGE) using multiple linear regression. Principal component analysis of the mixograph data discriminated between four flour quality classes, and predictions of baking volume were obtained using several selected mixograph parameters, chosen using a best subsets regression routine, giving R^2 values of 0.862-0.866. In particular, three new spring wheat strains (CHD 502a-c) recently registered in Poland were highly discriminated and predicted to give high baking volume on the basis of two mixograph parameters: peak bandwidth and 10-min bandwidth.
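The best-subsets selection behind the baking-volume predictions can be sketched as an exhaustive least-squares search over fixed-size predictor subsets; the six "mixograph parameters" and the response below are synthetic stand-ins, not the paper's data:

```python
import numpy as np
from itertools import combinations

def best_subset(X, y, k):
    """Exhaustive search over k-column subsets; returns (best subset, its R^2)."""
    best_idx, best_r2 = None, -np.inf
    sst = ((y - y.mean()) ** 2).sum()
    for idx in combinations(range(X.shape[1]), k):
        A = np.c_[X[:, idx], np.ones(len(X))]          # design matrix with intercept
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1.0 - ((y - A @ coef) ** 2).sum() / sst
        if r2 > best_r2:
            best_idx, best_r2 = idx, r2
    return best_idx, best_r2

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))                            # six invented mixograph traits
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + 0.1 * rng.normal(size=60)
subset, r2 = best_subset(X, y, 2)
```

On this toy problem the search recovers the two planted predictors, mirroring how the routine singled out peak bandwidth and 10-min bandwidth.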
Abstract:
Deep Brain Stimulation (DBS) has been successfully used throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that its implanted battery power source needs to be replaced every 18–24 months. To prolong the life span of the battery, a technique to accurately recognize and predict the onset of the Parkinson's disease tremors in human subjects, and thus implement an on-demand stimulator, is discussed here. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA) with Local Field Potential (LFP) data recorded via the stimulation electrodes to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN) obtained through deep brain electrodes implanted in a Parkinson patient are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared to the performance of the network. It has been found that detection accuracies of up to 89% are possible. Performance comparisons have also been made between a conventional RBFNN and an RBFNN based on PSO which show a marginal decrease in performance but with notable reduction in computational overhead.
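The PSO-trained RBFNN itself is not reproduced here, but the PCA-plus-RBF-features pipeline can be sketched with a least-squares readout standing in for the PSO weight search; the two classes standing in for tremor/non-tremor LFP feature windows are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for windowed LFP features: non-tremor (0) vs tremor (1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)),
               rng.normal(1.5, 1.0, (100, 10))])
y = np.array([0] * 100 + [1] * 100)

# Step 1: PCA down to 3 components
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

# Step 2: RBF hidden layer with 20 centers sampled from the data
centers = Z[rng.choice(len(Z), 20, replace=False)]
Phi = np.exp(-((Z[:, None, :] - centers) ** 2).sum(-1) / (2 * 2.0 ** 2))

# Step 3: linear readout fitted by least squares, thresholded at 0.5
A = np.c_[Phi, np.ones(len(Phi))]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
acc = float(((A @ w > 0.5).astype(int) == y).mean())
```

An on-demand stimulator would run this forward pass on each incoming window and gate stimulation on the predicted label; the PSO step in the paper replaces the least-squares fit with a swarm search over the network weights.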
Abstract:
Changes in climate variability and, in particular, changes in extreme climate events are likely to be of far more significance for environmentally vulnerable regions than changes in the mean state. It is generally accepted that sea-surface temperatures (SSTs) play an important role in modulating rainfall variability. Consequently, SSTs can be prescribed in global and regional climate modelling in order to study the physical mechanisms behind rainfall and its extremes. Using a satellite-based daily rainfall historical data set, this paper describes the main patterns of rainfall variability over southern Africa, identifies the dates when extreme rainfall occurs within these patterns, and shows the effect of resolution in trying to identify the location and intensity of SST anomalies associated with these extremes in the Atlantic and southwest Indian Ocean. Derived from a Principal Component Analysis (PCA), the results also suggest that, for the spatial pattern accounting for the highest amount of variability, extremes extracted at a higher spatial resolution do give a clearer indication regarding the location and intensity of anomalous SST regions. As the amount of variability explained by each spatial pattern defined by the PCA decreases, it would appear that extremes extracted at a lower resolution give a clearer indication of anomalous SST regions.
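The PCA of a daily rainfall field described here is the classic EOF (empirical orthogonal function) decomposition of a time-by-space anomaly matrix; a compact SVD sketch on a synthetic field with one planted dominant pattern (all values invented):

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 120, 50
# Synthetic anomaly field: one dominant spatial pattern modulated in time, plus noise
pattern = np.sin(np.linspace(0.0, np.pi, nspace))
amplitude = 3.0 * rng.normal(size=ntime)
field = np.outer(amplitude, pattern) + rng.normal(size=(ntime, nspace))

A = field - field.mean(0)              # anomalies about the time mean
U, s, Vt = np.linalg.svd(A, full_matrices=False)
var_frac = s ** 2 / (s ** 2).sum()     # fraction of variance per mode
eof1, pc1 = Vt[0], U[:, 0] * s[0]      # leading spatial pattern and its PC series
```

The dates of extreme rainfall within a pattern correspond to the largest values of the associated PC time series, which is how extremes can be extracted per spatial mode before compositing the SST fields.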