939 results for STATISTICAL DATA
Abstract:
This work presents software developed to process solar radiation data. The software can be used in meteorological and climatic stations, and also as a support tool for solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations and validation of models, as well as easy interchange of data. (C) 1999 Elsevier B.V. Ltd. All rights reserved.
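As a minimal sketch of the kind of data quality control such a tool applies, the snippet below flags global horizontal irradiance samples that violate simple physical limits; the thresholds, variable names and test values are illustrative assumptions, not taken from the software described above.

```python
import numpy as np

def qc_flags(ghi, extraterrestrial):
    """Flag global horizontal irradiance samples outside simple physical limits.

    ghi and extraterrestrial are arrays in W/m^2 on the same timestamps;
    the limits below are illustrative, not the ones used in the paper's software.
    """
    ghi = np.asarray(ghi, dtype=float)
    etr = np.asarray(extraterrestrial, dtype=float)
    too_low = ghi < -4.0                          # allow small negative sensor offsets
    too_high = ghi > 1.2 * np.maximum(etr, 1.0)   # cannot greatly exceed top-of-atmosphere value
    return too_low | too_high

ghi = np.array([0.0, 450.0, 1500.0, -20.0])
etr = np.array([0.0, 900.0, 1000.0, 0.0])
print(qc_flags(ghi, etr))                         # -> [False False  True  True]
```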
Abstract:
The combined CERN and Brookhaven heavy ion (H.I.) data support a scenario of a hadron gas in chemical and thermal equilibrium at a temperature T of about 140 MeV. Using the Brown-Stachel-Welke model (which gives 150 MeV) we show that in this scenario the hot nucleons have mass 3πT, and the π and ρ mesons have masses close to πT and 2πT, respectively. A simple model with pions and quarks supports the co-existence of two phases in these heavy ion experiments, suggesting a second order phase transition. The masses of the pion, rho and the nucleon are intriguingly close to the lattice screening masses.
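For orientation, evaluating the combinations quoted above at the two temperatures mentioned in the abstract is simple arithmetic:

```latex
% Mass scales quoted as multiples of \pi T, at the two temperatures above
\begin{align*}
T = 140~\text{MeV}: &\quad \pi T \approx 440~\text{MeV}, \quad 2\pi T \approx 880~\text{MeV}, \quad 3\pi T \approx 1320~\text{MeV},\\
T = 150~\text{MeV}: &\quad \pi T \approx 471~\text{MeV}, \quad 2\pi T \approx 942~\text{MeV}, \quad 3\pi T \approx 1414~\text{MeV}.
\end{align*}
```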
Abstract:
This work presents analyses of the atmospheric conditions and the hindcast of the surface wave field when six extratropical cyclones formed and displaced over the South Atlantic Ocean (10°N-60°S; 75°W-15°E) between April and September 1999. These events caused high sea waves associated with hazardous conditions along the south and southeast coast of Brazil. The meteorological composite fields for these cyclones show strong near-surface wind velocities (up to 14 m s⁻¹) during their mature phase. The sea-state wave hindcast was obtained using a third-generation wave model forced by the 10-m above ground level wind field from the National Centers for Environmental Prediction-National Center for Atmospheric Research reanalysis dataset. Closer to the south and southeast Brazilian coast, the hindcast results showed significant wave heights of up to 5 m in some of the events. The wave hindcast results for the significant wave height were compared against satellite altimeter data at 6 h intervals. The statistical indices showed a systematic underestimation of the significant wave height by 0.5 m. The correlation between wave hindcast and altimeter measurements was greater than 90%, showing a good phase reproduction by the wave model.
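A minimal sketch of the kind of statistical comparison described above (bias and Pearson correlation between collocated hindcast and altimeter significant wave heights), assuming the values have already been collocated in space and time; the arrays and numbers are synthetic illustrations.

```python
import numpy as np

def compare_hs(hs_hindcast, hs_altimeter):
    """Bias, RMSE and Pearson correlation between collocated Hs values (metres)."""
    hs_hindcast = np.asarray(hs_hindcast, dtype=float)
    hs_altimeter = np.asarray(hs_altimeter, dtype=float)

    bias = np.mean(hs_hindcast - hs_altimeter)                   # negative => underestimation
    rmse = np.sqrt(np.mean((hs_hindcast - hs_altimeter) ** 2))
    corr = np.corrcoef(hs_hindcast, hs_altimeter)[0, 1]
    return bias, rmse, corr

# Synthetic example: a hindcast biased low by about 0.5 m
altimeter = np.array([2.1, 3.4, 4.8, 5.0, 3.0, 2.6])
hindcast = altimeter - 0.5 + np.random.normal(0.0, 0.1, altimeter.size)
print(compare_hs(hindcast, altimeter))
```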
Abstract:
This article introduces the software program called EthoSeq, which is designed to extract probabilistic behavioral sequences (tree-generated sequences, or TGSs) from observational data and to prepare a TGS-species matrix for phylogenetic analysis. The program uses Graph Theory algorithms to automatically detect behavioral patterns within the observational sessions. It includes filtering tools to adjust the search procedure to user-specified statistical needs. Preliminary analyses of data sets, such as grooming sequences in birds and foraging tactics in spiders, uncover a large number of TGSs which together yield single phylogenetic trees. An example of the use of the program is our analysis of felid grooming sequences, in which we obtained 1,386 felid grooming TGSs for seven species, resulting in a single phylogeny. These results show that behavior is definitely useful in phylogenetic analysis. EthoSeq simplifies and automates such analyses, uncovers many of the hidden patterns in long behavioral sequences, and prepares these data for further analysis with standard phylogenetic programs. We hope it will encourage many empirical studies on the evolution of behavior.
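As a toy illustration of the general idea (not the EthoSeq algorithm itself, which extracts tree-generated sequences with graph-theoretic methods and statistical filters), the sketch below counts behavioral transitions per session and builds a binary sequence-by-species matrix of the kind fed to phylogenetic programs; the session contents and species names are invented.

```python
from collections import Counter

def transition_counts(session, order=2):
    """Count n-gram transitions (pairs by default) in one observational session.

    `session` is a list of behavioural acts in the order they occurred.
    """
    return Counter(tuple(session[i:i + order]) for i in range(len(session) - order + 1))

def sequence_species_matrix(sequences_by_species):
    """Binary sequence-by-species presence/absence matrix for phylogenetic input."""
    all_seqs = sorted({s for seqs in sequences_by_species.values() for s in seqs})
    matrix = {sp: [int(s in seqs) for s in all_seqs]
              for sp, seqs in sequences_by_species.items()}
    return matrix, all_seqs

# Toy grooming sessions for two hypothetical species
session_a = ["lick_paw", "wipe_face", "lick_paw", "lick_flank"]
session_b = ["lick_paw", "lick_flank", "scratch"]
seqs_a = set(transition_counts(session_a))
seqs_b = set(transition_counts(session_b))
matrix, columns = sequence_species_matrix({"species_A": seqs_a, "species_B": seqs_b})
print(columns)
print(matrix)
```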
Abstract:
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures, the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.
Program summary
Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: micro-computer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers, such as g77, f77, ifort and NAG95, are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 Mbytes of RAM and 100 MB of hard disk
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Of the pathways by which radionuclides enter the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Those radionuclides which are able to enter living cells by metabolic or other processes give rise to localized doses which can be very high.
The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and for radiological protection. The time behavior of trace concentrations in organs is the principal input for the prediction of internal doses after acute or chronic exposure. The General Multiple-Compartment Model (GMCM) is a powerful and widely accepted method for biokinetic studies, allowing the calculation of the concentration of trace elements in organs as a function of time when the flow parameters of the model are known. However, few biokinetic data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restrictions on the complexity of the problem: This version of the code works with the constant-volume approximation, which is valid for many situations where the biological half-life of a trace element is shorter than the volume rise time. Another restriction is related to the central-flux model: the model considered in the code assumes that there is one central compartment (e.g., blood) that connects the flow to all compartments, and flow between the other compartments is not included.
Typical running time: Depends on the choice of calculations. Using the Derivative Method the time is very short (a few minutes) for any number of compartments considered. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when about 15 compartments are considered. (C) 2006 Elsevier B.V. All rights reserved.
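The sketch below is not the STATFLUX code; it is a minimal illustration, under the same constant-volume, central-flux assumptions, of fitting the transfer coefficients of a small compartment model to concentration time series by least squares. The two-peripheral-compartment layout, rate values and synthetic data are invented for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def rhs(t, c, k_out, k_back):
    """Central-flux model: compartment 0 (e.g. blood) exchanges with each peripheral
    compartment; peripheral compartments do not exchange among themselves."""
    c0, rest = c[0], c[1:]
    dc0 = -np.sum(k_out) * c0 + np.dot(k_back, rest)
    drest = k_out * c0 - k_back * rest
    return np.concatenate(([dc0], drest))

def simulate(params, t_eval, c_init):
    n = len(params) // 2
    k_out, k_back = params[:n], params[n:]
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), c_init,
                    t_eval=t_eval, args=(k_out, k_back), rtol=1e-8)
    return sol.y

def residuals(params, t_eval, c_init, observed):
    return (simulate(params, t_eval, c_init) - observed).ravel()

# Synthetic "observed" concentrations for a 1-central + 2-peripheral example
t = np.linspace(0.0, 10.0, 25)
true = np.array([0.8, 0.3, 0.4, 0.1])        # k_out (2 values), k_back (2 values)
init = np.array([1.0, 0.0, 0.0])             # all activity starts in the central compartment
obs = simulate(true, t, init) + np.random.normal(0.0, 0.005, (3, t.size))

fit = least_squares(residuals, x0=np.full(4, 0.5), bounds=(0, np.inf),
                    args=(t, init, obs))
print("fitted transfer coefficients:", fit.x)
```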
Abstract:
Statistical methods of multiple regression analysis, trend surface analysis and principal components analysis were applied to seismographic data recorded during production blasting at a diabase quarry in the urban area of Campinas (SP), Brazil. The purpose of these analyses was to determine the influence of the following variables: distance (D), charge weight per delay (W), and scaled distance (SD), together with properties of the rock body (orientation, frequency and angle of geological discontinuities; depth of bedrock; and thickness of the soil overburden), on the variation of the peak particle velocity (PPV). This approach identified the variables with the largest influence (loadings) on the variation of ground vibration, as well as the behavior and spatial trend of this variation. The results showed the best relationship between PPV and D, with D being the most important factor in the attenuation of the ground vibrations. The geological joints and the depth to bedrock have a larger influence than the explosive charges on the variation of the vibration levels, but the frequencies appear to be more influenced by the amount of soil overburden.
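For context, the classical scaled-distance attenuation law, PPV = K · SD^(-β) with SD = D / √W, is the usual starting point behind this kind of regression analysis; the sketch below fits it by ordinary least squares in log space. All data values and the fitted form are illustrative, not results from the study above.

```python
import numpy as np

# Classical attenuation law: PPV = K * SD**(-beta), SD = D / sqrt(W)
# Fitted in log-log space by ordinary least squares; all numbers below are invented.
D = np.array([55.0, 80.0, 120.0, 150.0, 210.0, 300.0])    # distance (m)
W = np.array([20.0, 20.0, 25.0, 25.0, 30.0, 30.0])        # charge weight per delay (kg)
PPV = np.array([18.0, 11.0, 6.5, 4.8, 2.9, 1.7])          # peak particle velocity (mm/s)

SD = D / np.sqrt(W)
slope, log_K = np.polyfit(np.log(SD), np.log(PPV), 1)     # slope is expected to be negative
K = np.exp(log_K)
print(f"PPV ~= {K:.1f} * SD^({slope:.2f})")
```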
Abstract:
The strangeness content of the nucleon is determined from a statistical model using confined quark levels, and is shown to be in good agreement with the corresponding values extracted from experimental data. The quark levels are generated by a Dirac equation with a linear confining potential (scalar plus vector). With the requirement that the result for the Gottfried sum rule violation reported by the New Muon Collaboration (NMC) is well reproduced, we also obtain the difference between the structure functions of the proton and neutron, and the corresponding sea-quark contributions.
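For reference, the Gottfried sum rule constraint used in analyses of this kind relates the proton-neutron structure-function difference to the light sea-quark asymmetry:

```latex
S_G \;=\; \int_0^1 \frac{dx}{x}\,\bigl[F_2^{\,p}(x) - F_2^{\,n}(x)\bigr]
      \;=\; \frac{1}{3} \;-\; \frac{2}{3}\int_0^1 dx\,\bigl[\bar d(x) - \bar u(x)\bigr],
```

with the NMC measurement (about 0.235, well below the naive value 1/3) implying an excess of d̄ over ū in the nucleon sea.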
Abstract:
Interactive visual representations complement traditional statistical and machine learning techniques for data analysis, allowing users to play a more active role in the knowledge discovery process and making the whole process more understandable. Though visual representations are applicable to several stages of the knowledge discovery process, a common use of visualization is in the initial stages, to explore and organize a sometimes unknown and complex data set. In this context, the integrated and coordinated use of multiple graphical representations - that is, where user actions can affect multiple visualizations when desired - allows data to be observed from several perspectives and offers richer information than isolated representations. In this paper we propose an underlying model for an extensible and adaptable environment that allows independently developed visualization components to be gradually integrated into a user-configured knowledge discovery application. Because a major requirement when using multiple visual techniques is the ability to link them, so that user actions executed on one representation propagate to others if desired, the model also allows runtime configuration of coordinated user actions over different visual representations. We illustrate how this environment is being used to assist data exploration and organization in a climate classification problem.
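The sketch below illustrates the coordination idea only, not the paper's model: a minimal mediator with which independently developed views register handlers for named user actions, so that an action executed on one view is propagated to the others at runtime. Class, view and action names are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class Coordinator:
    """Minimal mediator linking visualization components: views register handlers
    for named user actions, and actions on one view are propagated to the others."""

    def __init__(self):
        self._handlers = defaultdict(list)    # action name -> list of (view_id, handler)

    def register(self, view_id: str, action: str, handler: Callable) -> None:
        self._handlers[action].append((view_id, handler))

    def propagate(self, source_view: str, action: str, payload) -> None:
        for view_id, handler in self._handlers[action]:
            if view_id != source_view:        # do not echo back to the originating view
                handler(payload)

# Hypothetical usage: a selection in one view highlights the same items elsewhere
coord = Coordinator()
coord.register("parallel_coords", "select", lambda ids: print("highlight", ids))
coord.register("treemap", "select", lambda ids: print("outline", ids))
coord.propagate("scatter_plot", "select", payload=[3, 7, 12])
```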
Abstract:
Structural health monitoring (SHM) is related to the ability to monitor the state of a structure and to decide the level of damage or deterioration within aerospace, civil and mechanical systems. In this sense, this paper deals with the application of a two-step auto-regressive and auto-regressive with exogenous inputs (AR-ARX) model for linear prediction in damage diagnosis of structural systems. This damage detection algorithm is based on the monitoring of residual errors as damage-sensitive indexes, obtained through vibration response measurements. In complex structures there are many positions under observation and a large amount of data to be handled, which makes visualization of the signals difficult. This paper therefore also investigates data compression by using principal component analysis. In order to establish a threshold value, fuzzy c-means clustering is used to quantify the damage-sensitive index in an unsupervised learning mode. Tests are carried out on a benchmark problem proposed by the IASC-ASCE with different damage patterns. The diagnosis obtained showed high correlation with the actual integrity state of the structure. Copyright © 2007 by ABCM.
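Only the AR residual step of the two-step AR-ARX scheme is sketched below: an AR model fitted by least squares on a baseline (undamaged) response, whose one-step-ahead prediction residuals on a new signal serve as the damage-sensitive index. The synthetic signals, AR order and "damaged" perturbation are illustrative assumptions.

```python
import numpy as np

def fit_ar(signal, order=10):
    """Least-squares AR(order) fit on a baseline (undamaged) vibration response."""
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1] for k in range(order)])
    y = signal[order:]
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

def residual_index(signal, coefs):
    """Damage-sensitive index: std of one-step-ahead AR prediction residuals
    (compare against the baseline value to flag a change)."""
    order = len(coefs)
    X = np.column_stack([signal[order - k - 1:len(signal) - k - 1] for k in range(order)])
    return np.std(signal[order:] - X @ coefs)

# Synthetic AR(2)-like baseline response and a perturbed "damaged" stand-in
rng = np.random.default_rng(0)
baseline = rng.standard_normal(2000)
for i in range(2, 2000):
    baseline[i] += 1.2 * baseline[i - 1] - 0.6 * baseline[i - 2]
damaged = baseline + 0.3 * rng.standard_normal(2000)

a = fit_ar(baseline)
print("residual std (baseline):", residual_index(baseline, a))
print("residual std (damaged): ", residual_index(damaged, a))   # larger value flags a change
```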
Abstract:
Includes bibliography
Abstract:
A statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d̄/ū, as well as the ratio d/u, inside the nucleon. After adjusting a temperature parameter to the Gottfried sum rule violation, and the chemical potentials to the valence up and down quark normalizations, the results are compared with the available experimental data. © 2009 American Institute of Physics.
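In statistical quark models of this kind, the level occupancies are typically Fermi-Dirac-like, with the chemical potentials fixed by the valence normalizations of the proton and the temperature then adjusted to the measured Gottfried sum rule violation. Schematically (a hedged sketch of the quoted fitting conditions, not the paper's exact expressions):

```latex
n_q(E) = \frac{1}{e^{(E-\mu_q)/T}+1},\qquad
n_{\bar q}(E) = \frac{1}{e^{(E+\mu_q)/T}+1},
\qquad
\sum_{\text{levels}} g\,\bigl[n_u - n_{\bar u}\bigr] = 2,\qquad
\sum_{\text{levels}} g\,\bigl[n_d - n_{\bar d}\bigr] = 1,
```

with T chosen so that the resulting d̄/ū asymmetry reproduces the Gottfried sum rule violation.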
Abstract:
An improved statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d̄/ū, as well as the corresponding difference d̄-ū, inside the nucleon. In the model, a temperature parameter is adjusted to recent results for the Gottfried sum rule violation, with two chemical potentials adjusted to the valence up and down quark normalizations. The results are compared with recent available experimental data. © 2010 American Institute of Physics.
Abstract:
Includes bibliography
Abstract:
Includes bibliography
Abstract:
We are investigating the combination of wavelets and decision trees to detect ships and other maritime surveillance targets in medium-resolution SAR images. Wavelets have inherent advantages for extracting image descriptors, while decision trees are able to handle different data sources. In addition, our work aims to consider oceanic features such as ship wakes and ocean spills. In this early-stage work, Haar and Cohen-Daubechies-Feauveau 9/7 wavelets are used to obtain detailed descriptors of targets and ocean features, which are combined with other statistical parameters and fed into an oblique decision tree. © 2011 Springer-Verlag.
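A minimal sketch of the descriptor-plus-tree pipeline described above: Haar and CDF 9/7 decompositions ('bior4.4' in PyWavelets corresponds to the 9/7 wavelet) provide subband statistics for image chips, which feed a decision tree. An ordinary axis-aligned scikit-learn tree is used here as a stand-in for the oblique tree mentioned in the abstract, and the chips and labels are synthetic.

```python
import numpy as np
import pywt
from sklearn.tree import DecisionTreeClassifier

def wavelet_descriptors(chip):
    """Subband statistics from Haar and CDF 9/7 (bior4.4) single-level decompositions."""
    feats = []
    for wavelet in ("haar", "bior4.4"):
        cA, (cH, cV, cD) = pywt.dwt2(chip, wavelet)
        for band in (cA, cH, cV, cD):
            feats += [band.mean(), band.std(), np.abs(band).sum()]
    return np.array(feats)

# Synthetic SAR-like chips: bright blobs ("ships") vs speckle-only ocean background
rng = np.random.default_rng(1)
def make_chip(ship):
    chip = rng.exponential(1.0, (32, 32))     # speckle-like background
    if ship:
        chip[12:20, 12:20] += 8.0             # bright target
    return chip

X = np.array([wavelet_descriptors(make_chip(i % 2 == 0)) for i in range(200)])
y = np.array([int(i % 2 == 0) for i in range(200)])

clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```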