38 results for Risk analysis in organizations
Abstract:
Report for the scientific sojourn at the University of Reading, United Kingdom, from January until May 2008. The main objectives were, first, to infer population structure and parameters in demographic models, using a total of 13 microsatellite loci to genotype approximately 30 individuals per population in 10 Palinurus elephas populations from both Mediterranean and Atlantic waters; and second, to develop statistical methods to identify discrepant loci, possibly under selection, and to implement those methods in the R software environment. It is important to consider that calculating the probability distribution of the demographic and mutational parameters for a full genetic data set is numerically difficult for complex demographic histories (Stephens 2003). Approximate Bayesian Computation (ABC), which uses summary statistics to infer posterior distributions of variable parameters without explicit likelihood calculations, can surmount this difficulty. This makes it possible to gather information on different demographic prior values (i.e. effective population sizes, migration rate, microsatellite mutation rate, mutational processes) and to assay the sensitivity of inferences to demographic priors by assuming different priors.
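Since the abstract describes ABC only at a high level, the following minimal R sketch illustrates the rejection-sampling idea it refers to: draw parameters from a prior, simulate data, and keep the draws whose summary statistic falls close to the observed one. The simulator, the statistic and all numbers here are invented stand-ins, not the authors' coalescent model.

```r
# ABC rejection sampling, schematically (hypothetical toy simulator).
set.seed(42)
obs_stat <- 0.8                 # observed summary statistic (invented)
theta    <- runif(2e4, 0, 10)   # draws from a uniform prior on the parameter
sim_stat <- sapply(theta, function(th) mean(rpois(30, th) > 0))  # simulate, summarise
posterior <- theta[abs(sim_stat - obs_stat) < 0.02]  # keep draws within tolerance
quantile(posterior, c(0.025, 0.5, 0.975))            # approximate posterior summary
```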
Abstract:
Compositional data naturally arise from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis, to methods based upon the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper will demonstrate work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes very powerful statistical facilities available at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little used in archaeometric applications involving compositional data. The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed.
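As a concrete illustration of the log-ratio methods such a library covers, here is a minimal sketch of the centred log-ratio (clr) transformation in base R; the three-part composition is invented, and the function name clr is ours, not necessarily the library's.

```r
# Centred log-ratio: log of each part minus the mean of the logs.
clr  <- function(x) log(x) - mean(log(x))
comp <- c(SiO2 = 0.60, Al2O3 = 0.25, CaO = 0.15)  # hypothetical composition
clr(comp)
# clr-transformed rows can then be passed to standard tools such as prcomp().
```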
Abstract:
We shall call an n × p data matrix fully-compositional if the rows sum to a constant, and sub-compositional if the variables are a subset of a fully-compositional data set. Such data occur widely in archaeometry, where it is common to determine the chemical composition of ceramic, glass, metal or other artefacts using techniques such as neutron activation analysis (NAA), inductively coupled plasma spectroscopy (ICPS), X-ray fluorescence analysis (XRF), etc. Interest often centres on whether there are distinct chemical groups within the data and whether, for example, these can be associated with different origins or manufacturing technologies.
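The distinction can be checked mechanically; a small R sketch with invented oxide data:

```r
# Rows of a fully-compositional matrix sum to a constant (here 100).
X <- rbind(c(55.2, 14.1, 30.7),
           c(60.0, 12.5, 27.5))
rowSums(X)          # constant row sums => fully-compositional
X_sub <- X[, 1:2]   # keeping only a subset of the variables
rowSums(X_sub)      # row sums vary => sub-compositional
```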
Abstract:
There are two principal chemical concepts that are important for studying the natural environment. The first is thermodynamics, which describes whether a system is at equilibrium or can spontaneously change by chemical reactions. The second is how fast chemical reactions take place whenever they start (kinetics, or the rate of chemical change). In this work we examine a natural system in which both thermodynamic and kinetic factors are important in determining the abundance of NH4+, NO2− and NO3− in superficial waters. Samples were collected in the Arno Basin (Tuscany, Italy), a system in which natural and anthropic effects both contribute to strongly modify the chemical composition of the water. Thermodynamic modelling based on the reduction-oxidation reactions involving the passage NH4+ → NO2− → NO3− under equilibrium conditions has allowed us to determine the Eh redox potential values that characterise the state of each sample and, consequently, of the fluid environment from which it was drawn. Just as pH expresses the concentration of H+ in solution, redox potential is used to express the tendency of an environment to receive or supply electrons. In this context, oxic environments, such as those of river systems, are said to have a high redox potential because O2 is available as an electron acceptor. Principles of thermodynamics and chemical kinetics lead to a model that often does not completely describe the reality of natural systems. Chemical reactions may indeed fail to achieve equilibrium because the products escape from the site of the reaction or because the reactions involved in the transformation are very slow, so that non-equilibrium conditions persist for long periods. Moreover, reaction rates can be sensitive to poorly understood catalytic effects or to surface effects, while variables such as concentration (a large number of chemical species can coexist and interact concurrently), temperature and pressure can have large gradients in natural systems. Taking this into account, data from 91 water samples have been modelled using statistical methodologies for compositional data. The application of log-contrast analysis has allowed us to obtain statistical parameters that can be correlated with the calculated Eh values. In this way, natural conditions under which chemical equilibrium is hypothesised, as well as underlying fast reactions, are compared with those described by a stochastic approach.
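For readers unfamiliar with how Eh is tied to concentrations, the standard Nernst relation (a generic statement, not the authors' specific parameterisation of the NH4+/NO2−/NO3− couples) is:

```latex
E_h = E^{0} + \frac{RT}{nF}\,\ln\frac{a_{\mathrm{ox}}}{a_{\mathrm{red}}}
```

where E^0 is the standard potential of the couple, n the number of electrons transferred, and a_ox, a_red the activities of the oxidised and reduced species.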
Abstract:
The disintegration of the USSR brought about the emergence of a new geo-energy space in Central Asia. This space arose in the context of a global energy transition that began in the late 1970s. A new space in a changing energy world therefore requires both new conceptual frameworks of analysis and the creation of new analytical tools. With this in mind, our paper attempts to apply the theoretical framework of the Global Commodity Chain (GCC) to the case of natural resources in Central Asia. The aim of the paper is to check whether a Central Asian geo-energy space could exist, assuming that such a space would exist if natural resources were managed with regional criteria. The paper is divided into four sections. First, an introduction describes the new global energy context within which the natural resources of Central Asia would be integrated. Second, the paper justifies why the GCC methodology is suitable for the study of the value chains of energy products. Third, we build up three case studies (oil and uranium from Kazakhstan and gas from Turkmenistan) which reveal a high degree of uncertainty over the direction these chains will take. Finally, we present the conclusions of the study, which state that the most plausible scenario is the integration of the energy resources of these countries into GCCs in which the core of the decision-making process will be far away from the region of Central Asia. Key words: Energy transition, geo-energy space, Global Commodity Chains, Central Asia
Abstract:
The use of simple and multiple correspondence analysis is well established in social science research for understanding relationships between two or more categorical variables. By contrast, canonical correspondence analysis, which is a correspondence analysis with linear restrictions on the solution, has become one of the most popular multivariate techniques in ecological research. Multivariate ecological data typically consist of frequencies of observed species across a set of sampling locations, as well as a set of observed environmental variables at the same locations. In this context the principal dimensions of the biological variables are sought in a space that is constrained to be related to the environmental variables. This restricted form of correspondence analysis has many uses in social science research as well, as is demonstrated in this paper. We first illustrate the result that canonical correspondence analysis of an indicator matrix, restricted to be related to an external categorical variable, reduces to a simple correspondence analysis of a set of concatenated (or stacked) tables. Then we show how canonical correspondence analysis can be used to focus on, or partial out, a particular set of response categories in sample survey data. For example, the method can be used to partial out the influence of missing responses, which usually dominate the results of a multiple correspondence analysis.
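A minimal sketch of a constrained ordination of this kind, using the cca() function of the vegan R package and its bundled dune data (our illustration; the paper does not specify this software):

```r
library(vegan)                # provides cca() and the dune example data
data(dune); data(dune.env)
m <- cca(dune ~ Management, data = dune.env)  # ordination constrained by a
summary(m)                                    # categorical external variable
```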
Abstract:
We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e. star-like, or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. We also show that, if an auxiliary assumption holds, the transition between these two opposite structures is sharp and they are the only ones to ever qualify as optimal.
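To make the idea of a congestion threshold concrete, here is a deliberately crude R sketch (our toy, not the paper's model): problems arrive as a Poisson stream and a single central node can forward one problem per period, so the stock of floating problems stays bounded for arrival rates below 1 and diverges above it.

```r
set.seed(1)
rate    <- 0.9               # problem arrival rate per period (invented)
periods <- 5000
queue   <- 0                 # problems waiting at the central node
stock   <- numeric(periods)
for (t in seq_len(periods)) {
  queue    <- queue + rpois(1, rate)  # new problems entering the network
  queue    <- max(queue - 1, 0)       # the centre forwards at most one per period
  stock[t] <- queue
}
mean(tail(stock, 1000))  # proxy for average delay; explodes when rate > 1
```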
Abstract:
Spain is one of the main markets for fishery products in Europe and the world. The consumption of fishery products has traditionally been very important in Spain: in 2005, 36.7 kg were consumed per person (MAPA, various years). Despite this, the market, and how the various levels of the marketing chain interact, has received little attention. In this study, using weekly data for the twelve main fishery products, we analyse the elasticity of price transmission along the marketing chain in Spain (auction, central wholesale market and retailer). Finally, we investigate the presence of asymmetry in price transmission between these market levels. The results obtained have important implications for the analysis of demand, market power and margins along the market chain for fishery products.
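As a sketch of what a price-transmission elasticity is, consider a log-log regression of retail on wholesale prices; the weekly series below are simulated, not the data used in the study.

```r
set.seed(7)
wholesale <- exp(1 + cumsum(rnorm(104, 0, 0.05)))        # two years of weekly prices
retail    <- wholesale^0.8 * exp(rnorm(104, 0.5, 0.03))  # true elasticity 0.8
fit <- lm(log(retail) ~ log(wholesale))
coef(fit)["log(wholesale)"]   # estimated price-transmission elasticity
```

Asymmetry would then be tested by allowing price rises and price falls to transmit with different coefficients.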
Abstract:
OBJECTIVE To determine the prevalence and clinical significance of hepatitis G virus (HGV) infection in a large cohort of patients with primary Sjögren's syndrome (SS). PATIENTS AND METHODS The study included 100 consecutive patients (92 female and eight male), with a mean age of 62 years (range 31-80), who were prospectively visited in our unit. All patients fulfilled the European Community criteria for SS and underwent a complete history, physical examination, and biochemical and immunological evaluation for liver disease. Two hundred volunteer blood donors were also studied. The presence of HGV-RNA was investigated in the serum of all patients and donors. Additionally, HBsAg and antibodies to hepatitis C virus were determined. RESULTS Four patients (4%) and six volunteer blood donors (3%) presented HGV-RNA sequences in serum. HGV infection was associated with biochemical signs of liver involvement in two (50%) patients. When compared with primary SS patients without HGV infection, no significant differences were found in terms of clinical or immunological features. HCV coinfection occurred in one (25%) of the four patients with HGV infection. CONCLUSION The prevalence of HGV infection in patients with primary SS is low in the geographical area of the study, and HCV coinfection is very uncommon. HGV infection alone does not seem to be an important cause of chronic liver injury in patients with primary SS in this area.
Abstract:
Visual inspection remains the most frequently applied method for detecting treatment effects in single-case designs. The advantages and limitations of visual inference are discussed here in relation to other procedures for assessing intervention effectiveness. The first part of the paper reviews previous research on visual analysis, paying special attention to the validation of visual analysts' decisions, inter-judge agreement, and false alarm and omission rates. The most relevant factors affecting visual inspection (i.e., effect size, autocorrelation, data variability, and analysts' expertise) are highlighted and incorporated into an empirical simulation study with the aim of providing further evidence about the reliability of visual analysis. Our results concur with previous studies that have reported a relationship between serial dependence and increased Type I error rates. Participants with greater experience appeared to be more conservative and used more consistent criteria when assessing graphed data. Nonetheless, the decisions made by both professionals and students did not sufficiently match the features of the simulated data, and we also found low intra-judge agreement, thus suggesting that visual inspection should be complemented by other methods when assessing treatment effectiveness.
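The kind of graphed series such simulation studies present to judges can be generated in a few lines of R; the AR(1) coefficient and the level shift below are invented values, not those of the study.

```r
set.seed(123)
n_a <- 10; n_b <- 10                       # baseline and treatment phase lengths
phi <- 0.3; effect <- 1.5                  # serial dependence and level change
e <- as.numeric(arima.sim(list(ar = phi), n_a + n_b))
y <- e + c(rep(0, n_a), rep(effect, n_b))  # intervention shifts the level
plot(y, type = "b"); abline(v = n_a + 0.5, lty = 2)  # phase-change line
```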
Abstract:
This master's thesis presents research on film tourism stakeholders in Catalonia using the network analysis approach. The research aims to analyse the relations between local tourism stakeholders and local film offices through their websites. The development of the present work therefore involved a review of the literature on film tourism and network analysis. The main film and tourism stakeholders of Catalonia were then identified and their websites analysed. Network-analysis indicators such as degree, closeness and betweenness centrality were applied to the websites to determine the extent of the relations between film and tourism stakeholders in Catalonia. Results and conclusions are presented in the corresponding sections.
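The three centrality measures named above are available in the igraph R package; a toy sketch with invented stakeholder nodes (not the thesis's actual website network):

```r
library(igraph)
g <- graph_from_literal(FilmOffice - TourismBoard - Hotel,
                        FilmOffice - Producer, TourismBoard - DMO)
degree(g)       # number of direct links per stakeholder
closeness(g)    # inverse of total distance to all other nodes
betweenness(g)  # how often a node sits on shortest paths
```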
Abstract:
What determines risk-bearing capacity and the amount of leverage in financial markets? This paper uses unique micro-data on collateralized lending contracts during a period of financial distress to address this question. An investor syndicate speculating in English stocks went bankrupt in 1772. Using hand-collected information from Dutch notarial archives, we examine changes in lenders' behavior following exposure to potential (but not actual) losses. Before the distress episode, financiers that lent to the ill-fated syndicate were indistinguishable from the rest. Afterwards, they behaved differently: they lent with much higher haircuts. Only lenders exposed to the failed syndicate altered their behavior. The differential change is remarkable since the distress was public knowledge, and because none of the lenders suffered actual losses: all financiers were repaid in full. Interest rates were also unaffected; the market balanced solely through changes in collateral requirements. Our findings are consistent with a heterogeneous-beliefs interpretation of leverage. They also suggest that individual experience can quickly modify the level of leverage in a market.
Abstract:
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous. It is thus possible to compare the hazard assessment based on data prior to those years with the analysis that includes them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The occurrence of events in time is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed to be independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters makes it possible to obtain posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
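The return-period calculation implied by this model can be written down directly; in the R sketch below the Poisson rate and GPD parameters are invented, not the posterior values produced by BGPE.

```r
lambda <- 5                      # storm events per year (invented)
u <- 3; sigma <- 0.8; xi <- 0.1  # threshold, GPD scale and shape (invented)
p_exceed <- function(x) (1 + xi * (x - u) / sigma)^(-1 / xi)  # GPD tail P(X > x)
return_period <- function(x) 1 / (lambda * p_exceed(x))       # in years
return_period(7)  # mean years between events with wave height above 7 m
```

In the Bayesian setting described above, these formulas would be evaluated over the posterior draws of the three parameters, yielding a posterior distribution of the return period itself.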